WorldWideScience

Sample records for model organisms series

  1. Time series modeling, computation, and inference

    CERN Document Server

    Prado, Raquel

    2010-01-01

    The authors systematically develop a state-of-the-art analysis and modeling of time series. … this book is well organized and well written. The authors present various statistical models for engineers to solve problems in time series analysis. Readers no doubt will learn state-of-the-art techniques from this book. -Hsun-Hsien Chang, Computing Reviews, March 2012. My favorite chapters were on dynamic linear models and vector AR and vector ARMA models. -William Seaver, Technometrics, August 2011. … a very modern entry to the field of time-series modelling, with a rich reference list of the current lit

  2. Introduction to Time Series Modeling

    CERN Document Server

    Kitagawa, Genshiro

    2010-01-01

    In time series modeling, the behavior of a certain phenomenon is expressed in relation to the past values of itself and other covariates. Since many important phenomena in statistical analysis are actually time series and the identification of conditional distribution of the phenomenon is an essential part of the statistical modeling, it is very important and useful to learn fundamental methods of time series modeling. Illustrating how to build models for time series using basic methods, "Introduction to Time Series Modeling" covers numerous time series models and the various tools f

  3. Computer system organization the B5700/B6700 series

    CERN Document Server

    Organick, Elliott I

    1973-01-01

    Computer System Organization: The B5700/B6700 Series focuses on the organization of the B5700/B6700 Series developed by Burroughs Corp. More specifically, it examines how computer systems can (or should) be organized to support, and hence make more efficient, the running of computer programs that evolve with characteristically similar information structures. Comprised of nine chapters, this book begins with a background on the development of the B5700/B6700 operating systems, paying particular attention to their hardware/software architecture. The discussion then turns to the block-structured p

  4. Stochastic models for time series

    CERN Document Server

    Doukhan, Paul

    2018-01-01

    This book presents essential tools for modelling non-linear time series. The first part of the book describes the main standard tools of probability and statistics that directly apply to the time series context to obtain a wide range of modelling possibilities. Functional estimation and bootstrap are discussed, and stationarity is reviewed. The second part describes a number of tools from Gaussian chaos and proposes a tour of linear time series models. It goes on to address nonlinearity from polynomial or chaotic models for which explicit expansions are available, then turns to Markov and non-Markov linear models and discusses Bernoulli shift time series models. Finally, the volume focuses on the limit theory, starting with the ergodic theorem, which is seen as the first step for statistics of time series. It defines the distributional range to obtain generic tools for limit theory under long- or short-range dependence (LRD/SRD) and explains examples of LRD behaviours. More general techniques (central limit ...

  5. Multiple Indicator Stationary Time Series Models.

    Science.gov (United States)

    Sivo, Stephen A.

    2001-01-01

    Discusses the propriety and practical advantages of specifying multivariate time series models in the context of structural equation modeling for time series and longitudinal panel data. For time series data, the multiple indicator model specification improves on classical time series analysis. For panel data, the multiple indicator model…

  6. Modeling Non-Gaussian Time Series with Nonparametric Bayesian Model.

    Science.gov (United States)

    Xu, Zhiguang; MacEachern, Steven; Xu, Xinyi

    2015-02-01

    We present a class of Bayesian copula models whose major components are the marginal (limiting) distribution of a stationary time series and the internal dynamics of the series. We argue that these are the two features with which an analyst is typically most familiar, and hence that these are natural components with which to work. For the marginal distribution, we use a nonparametric Bayesian prior distribution along with a cdf-inverse cdf transformation to obtain large support. For the internal dynamics, we rely on the traditionally successful techniques of normal-theory time series. Coupling the two components gives us a family of (Gaussian) copula transformed autoregressive models. The models provide coherent adjustments of time scales and are compatible with many extensions, including changes in volatility of the series. We describe basic properties of the models, show their ability to recover non-Gaussian marginal distributions, and use a GARCH modification of the basic model to analyze stock index return series. The models are found to provide better fit and improved short-range and long-range predictions than Gaussian competitors. The models are extensible to a large variety of fields, including continuous time models, spatial models, models for multiple series, models driven by external covariate streams, and non-stationary models.
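
    A rough sketch of the copula-AR idea summarized above, not the authors' code: transform the series to normal scores with a cdf-inverse-cdf map, fit normal-theory AR dynamics to the scores, and map forecasts back through the empirical marginal. The toy data, the AR(1) order and the helper names are illustrative assumptions.

```python
import numpy as np
from scipy import stats
from statsmodels.tsa.ar_model import AutoReg

def to_normal_scores(x):
    # Empirical cdf (rank-based) followed by the standard normal inverse cdf
    u = stats.rankdata(x) / (len(x) + 1.0)
    return stats.norm.ppf(u)

def from_normal_scores(z, x_obs):
    # Map normal scores back to the observed marginal via empirical quantiles
    return np.quantile(x_obs, stats.norm.cdf(z))

rng = np.random.default_rng(0)
x = rng.gamma(shape=2.0, scale=1.0, size=500)        # toy non-Gaussian marginal
x = x + 0.6 * np.roll(x, 1)                          # crude serial dependence

z = to_normal_scores(x)                              # Gaussian-copula transform
ar = AutoReg(z, lags=1).fit()                        # normal-theory internal dynamics
z_path = ar.predict(start=len(z), end=len(z) + 9)    # 10 steps ahead on the normal scale
print(from_normal_scores(z_path, x))                 # back-transformed forecasts
```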

  7. Models for dependent time series

    CERN Document Server

    Tunnicliffe Wilson, Granville; Haywood, John

    2015-01-01

    Models for Dependent Time Series addresses the issues that arise and the methodology that can be applied when the dependence between time series is described and modeled. Whether you work in the economic, physical, or life sciences, the book shows you how to draw meaningful, applicable, and statistically valid conclusions from multivariate (or vector) time series data.The first four chapters discuss the two main pillars of the subject that have been developed over the last 60 years: vector autoregressive modeling and multivariate spectral analysis. These chapters provide the foundational mater

  8. Modeling of Volatility with Non-linear Time Series Model

    OpenAIRE

    Kim Song Yon; Kim Mun Chol

    2013-01-01

    In this paper, non-linear time series models are used to describe volatility in financial time series data. To describe volatility, two non-linear time series models are combined to form a TAR (Threshold Auto-Regressive) model with an AARCH (Asymmetric Auto-Regressive Conditional Heteroskedasticity) error term, and its parameter estimation is studied.
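
    To make the combined specification concrete, here is a minimal simulation sketch of a two-regime threshold AR process driven by an asymmetric ARCH-type error; all parameter values are illustrative assumptions, not estimates from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000
y, h, eps = np.zeros(n), np.ones(n), np.zeros(n)

phi_low, phi_high, threshold = 0.7, -0.3, 0.0   # TAR regimes switching on y[t-1]
omega, alpha, gamma = 0.1, 0.2, 0.15            # asymmetric ARCH(1) coefficients

for t in range(1, n):
    # Asymmetric ARCH: negative shocks increase the conditional variance more
    h[t] = omega + alpha * eps[t-1]**2 + gamma * eps[t-1]**2 * (eps[t-1] < 0)
    eps[t] = np.sqrt(h[t]) * rng.standard_normal()
    # Threshold autoregression: the AR coefficient depends on the previous value
    phi = phi_low if y[t-1] <= threshold else phi_high
    y[t] = phi * y[t-1] + eps[t]

print(y[:5])
```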

  9. Environmental parameters series. 3. Concentration factors of radionuclides in freshwater organisms

    International Nuclear Information System (INIS)

    1994-03-01

    This report outlines recent research activities of the Radioactive Waste Management Center. Aiming to estimate the radiation dose to humans exposed to radioactive materials in the environment, the construction of a calculation model of radionuclide transfer in the environment was attempted. This issue, Environmental Parameter Series No. 3, includes six reports on factors related to environmental concentrations of radionuclides. The titles of the reports are as follows: Factors modifying the concentration factor (CF); Evaluation of accumulation of radionuclides in brackish water organisms; Dose assessment; CF derived from Japanese limnological data; Data table of CF; and Metabolic parameters in relation to bioaccumulation of elements by organisms. In addition to collecting and arranging the existing data, CF was calculated based on the concentrations of stable elements in various lakes and rivers in Japan. (M.N.)

  10. Multi-Step Time Series Forecasting with an Ensemble of Varied Length Mixture Models.

    Science.gov (United States)

    Ouyang, Yicun; Yin, Hujun

    2018-05-01

    Many real-world problems require modeling and forecasting of time series, such as weather temperature, electricity demand, stock prices and foreign exchange (FX) rates. Often, the tasks involve predicting over a long-term period, e.g. several weeks or months. Most existing time series models are inherently one-step predictors, that is, they predict one time point ahead. Multi-step or long-term prediction is difficult and challenging due to the lack of information and the accumulation of uncertainty or error. The main existing approaches, iterative and independent, either use a one-step model recursively or treat each step of the multi-step task as an independent model; they generally perform poorly in practical applications. In this paper, as an extension of the self-organizing mixture autoregressive (AR) model, varied length mixture (VLM) models are proposed to model and forecast time series over multiple steps. The key idea is to preserve the dependencies between the time points within the prediction horizon. Training data are segmented into various lengths corresponding to various forecasting horizons, and the VLM models are trained in a self-organizing fashion on these segments to capture these dependencies in their component AR models of various prediction horizons. The VLM models form a probabilistic mixture of these varied length models. A combination of short and long VLM models and an ensemble of them are proposed to further enhance the prediction performance. The effectiveness of the proposed methods and their marked improvements over existing methods are demonstrated through a number of experiments on synthetic data, real-world FX rates and weather temperatures.
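
    The iterative-versus-direct distinction mentioned in this abstract can be illustrated with a plain linear model on lagged values; the data, lag length and horizon below are toy assumptions, and the sketch does not implement the VLM model itself.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

def make_lagged(y, lag=3, horizon=1):
    # X rows hold y[t], ..., y[t+lag-1]; the target is y[t+lag+horizon-1]
    X = np.column_stack([y[i:len(y) - lag - horizon + 1 + i] for i in range(lag)])
    return X, y[lag + horizon - 1:]

rng = np.random.default_rng(2)
y = np.sin(np.arange(300) / 10.0) + 0.1 * rng.standard_normal(300)
H = 5  # forecast horizon

# Iterative strategy: a one-step model applied recursively
X1, t1 = make_lagged(y, horizon=1)
one_step = LinearRegression().fit(X1, t1)
window, iterative = list(y[-3:]), []
for _ in range(H):
    pred = one_step.predict(np.array(window[-3:])[None, :])[0]
    iterative.append(pred)
    window.append(pred)

# Direct strategy: a separate model per forecasting horizon
direct = []
for h in range(1, H + 1):
    Xh, th = make_lagged(y, horizon=h)
    direct.append(LinearRegression().fit(Xh, th).predict(y[-3:][None, :])[0])

print("iterative:", np.round(iterative, 3))
print("direct:   ", np.round(direct, 3))
```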

  11. The Time Is Right to Focus on Model Organism Metabolomes

    Directory of Open Access Journals (Sweden)

    Arthur S. Edison

    2016-02-01

    Model organisms are an essential component of biological and biomedical research that can be used to study specific biological processes. These organisms are in part selected for facile experimental study. However, just as importantly, intensive study of a small number of model organisms yields important synergies as discoveries in one area of science for a given organism shed light on biological processes in other areas, even for other organisms. Furthermore, the extensive knowledge bases compiled for each model organism enable systems-level understandings of these species, which enhance the overall biological and biomedical knowledge for all organisms, including humans. Building upon extensive genomics research, we argue that the time is now right to focus intensively on model organism metabolomes. We propose a grand challenge for metabolomics studies of model organisms: to identify and map all metabolites onto metabolic pathways, to develop quantitative metabolic models for model organisms, and to relate organism metabolic pathways within the context of evolutionary metabolomics, i.e., phylometabolomics. These efforts should focus on a series of established model organisms in microbial, animal and plant research.

  12. TIME SERIES ANALYSIS USING A UNIQUE MODEL OF TRANSFORMATION

    Directory of Open Access Journals (Sweden)

    Goran Klepac

    2007-12-01

    The REFII model is an authorial mathematical model for time series data mining. The main purpose of the model is to automate time series analysis through a unique transformation model of time series. An advantage of this approach to time series analysis is the linkage of different methods for time series analysis, linking traditional data mining tools for time series, and constructing new algorithms for analyzing time series. It is worth mentioning that the REFII model is not a closed system, which means that we have a finite set of methods. At first, this is a model for transformation of values of time series, which prepares data used by different sets of methods based on the same model of transformation in a domain of problem space. The REFII model gives a new approach to time series analysis based on a unique model of transformation, which is a base for all kinds of time series analysis. The advantage of the REFII model is its possible application in many different areas such as finance, medicine, voice recognition, face recognition and text mining.

  13. Building Chaotic Model From Incomplete Time Series

    Science.gov (United States)

    Siek, Michael; Solomatine, Dimitri

    2010-05-01

    This paper presents a number of novel techniques for building a predictive chaotic model from incomplete time series. A predictive chaotic model is built by reconstructing the time-delayed phase space from the observed time series, and the prediction is made by a global model or by adaptive local models based on the dynamical neighbors found in the reconstructed phase space. In general, the building of any data-driven model depends on the completeness and quality of the data itself. However, complete data availability cannot always be guaranteed, since measurement or data transmission may intermittently fail for various reasons. We propose two main solutions for dealing with incomplete time series: imputing and non-imputing methods. For imputing methods, we utilized interpolation methods (weighted sum of linear interpolations, Bayesian principal component analysis and cubic spline interpolation) and predictive models (neural network, kernel machine, chaotic model) for estimating the missing values. After imputing the missing values, the phase space reconstruction and chaotic model prediction are executed as a standard procedure. For non-imputing methods, we reconstructed the time-delayed phase space from the observed time series with missing values. This reconstruction results in non-continuous trajectories. However, the local model prediction can still be made from the other dynamical neighbors reconstructed from non-missing values. We implemented and tested these methods to construct a chaotic model for predicting storm surges at Hoek van Holland, the entrance of Rotterdam Port. The hourly surge time series is available for the period 1990-1996. For measuring the performance of the proposed methods, a synthetic time series with missing values, generated by applying a particular random variable to the original (complete) time series, is utilized. There exist two main performance measures used in this work: (1) error measures between the actual
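
    A hedged sketch of the two generic building blocks this record relies on, time-delay phase-space reconstruction and local prediction from dynamical neighbours; the embedding dimension, delay and toy signal are illustrative assumptions, and the imputation step is omitted.

```python
import numpy as np

def embed(x, dim=3, tau=2):
    """Time-delay embedding: rows are [x[t], x[t-tau], ..., x[t-(dim-1)*tau]]."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[(dim - 1 - i) * tau:(dim - 1 - i) * tau + n] for i in range(dim)])

def local_predict(x, dim=3, tau=2, k=10):
    """Predict x[t+1] from the k nearest dynamical neighbours of the last state."""
    states = embed(x, dim, tau)
    last, candidates = states[-1], states[:-1]     # candidates have a known successor
    dists = np.linalg.norm(candidates - last, axis=1)
    idx = np.argsort(dists)[:k]
    successors = x[(dim - 1) * tau + 1 + idx]      # value one step after each neighbour
    return successors.mean()

rng = np.random.default_rng(3)
x = np.sin(np.arange(2000) * 0.3) + 0.05 * rng.standard_normal(2000)
print(local_predict(x))
```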

  14. Self-organising mixture autoregressive model for non-stationary time series modelling.

    Science.gov (United States)

    Ni, He; Yin, Hujun

    2008-12-01

    Modelling non-stationary time series has been a difficult task for both parametric and nonparametric methods. One promising solution is to combine the flexibility of nonparametric models with the simplicity of parametric models. In this paper, the self-organising mixture autoregressive (SOMAR) network is adopted as such a mixture model. It breaks time series into underlying segments and at the same time fits local linear regressive models to the clusters of segments. In such a way, a global non-stationary time series is represented by a dynamic set of local linear regressive models. Neural gas is used for a more flexible structure of the mixture model. Furthermore, a new similarity measure has been introduced in the self-organising network to better quantify the similarity of time series segments. The network can be used naturally in modelling and forecasting non-stationary time series. Experiments on artificial, benchmark time series (e.g. Mackey-Glass) and real-world data (e.g. numbers of sunspots and Forex rates) are presented and the results show that the proposed SOMAR network is effective and superior to other similar approaches.

  15. Time series modeling in traffic safety research.

    Science.gov (United States)

    Lavrenz, Steven M; Vlahogianni, Eleni I; Gkritza, Konstantina; Ke, Yue

    2018-08-01

    The use of statistical models for analyzing traffic safety (crash) data has been well-established. However, time series techniques have traditionally been underrepresented in the corresponding literature, due to challenges in data collection, along with a limited knowledge of proper methodology. In recent years, new types of high-resolution traffic safety data, especially in measuring driver behavior, have made time series modeling techniques an increasingly salient topic of study. Yet there remains a dearth of information to guide analysts in their use. This paper provides an overview of the state of the art in using time series models in traffic safety research, and discusses some of the fundamental techniques and considerations in classic time series modeling. It also presents ongoing and future opportunities for expanding the use of time series models, and explores newer modeling techniques, including computational intelligence models, which hold promise in effectively handling ever-larger data sets. The information contained herein is meant to guide safety researchers in understanding this broad area of transportation data analysis, and provide a framework for understanding safety trends that can influence policy-making.

  16. Empirical investigation on modeling solar radiation series with ARMA–GARCH models

    International Nuclear Information System (INIS)

    Sun, Huaiwei; Yan, Dong; Zhao, Na; Zhou, Jianzhong

    2015-01-01

    Highlights: • Apply 6 ARMA–GARCH(-M) models to model and forecast solar radiation. • The ARMA–GARCH(-M) models produce more accurate radiation forecasting than conventional methods. • Show that ARMA–GARCH-M models are more effective for forecasting solar radiation mean and volatility. • The ARMA–EGARCH-M is robust and the ARMA–sGARCH-M is very competitive. - Abstract: Simulation of radiation is one of the most important issues in solar utilization. Time series models are useful tools in the estimation and forecasting of solar radiation series and their changes. In this paper, the effectiveness of autoregressive moving average (ARMA) models with various generalized autoregressive conditional heteroskedasticity (GARCH) processes, namely ARMA–GARCH models, is evaluated for radiation series. Six different GARCH approaches, comprising three different ARMA–GARCH models and the corresponding GARCH-in-mean (ARMA–GARCH-M) models, are applied to radiation data sets from two representative climate stations in China. Multiple evaluation metrics of modeling sufficiency are used for evaluating the performances of the models. The results show that the ARMA–GARCH(-M) models are effective in radiation series estimation. Both in fitting and in prediction of radiation series, the ARMA–GARCH(-M) models show better modeling sufficiency than traditional models, while the ARMA–EGARCH-M model is robust at both sites and the ARMA–sGARCH-M model appears very competitive. Comparisons of statistical diagnostics and model performance clearly show that the ARMA–GARCH-M models make the mean radiation equations more sufficient. The ARMA–GARCH(-M) models are recommended as the preferred method for modeling solar radiation series.
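
    A minimal sketch of an ARMA-type mean with a GARCH(1,1) variance, assuming the third-party Python `arch` package and a simulated stand-in for the radiation series; it is not the authors' code, and the GARCH-in-mean variants discussed above would need a different estimation route.

```python
import numpy as np
from arch import arch_model   # assumed third-party package

rng = np.random.default_rng(4)
n = 1500
y, h, e = np.zeros(n), np.ones(n), np.zeros(n)
for t in range(1, n):                                 # toy AR(1)-GARCH(1,1) data
    h[t] = 0.1 + 0.15 * e[t-1]**2 + 0.8 * h[t-1]
    e[t] = np.sqrt(h[t]) * rng.standard_normal()
    y[t] = 0.5 * y[t-1] + e[t]

# AR(1) mean equation with a GARCH(1,1) conditional variance
res = arch_model(y, mean="ARX", lags=1, vol="GARCH", p=1, q=1).fit(disp="off")
print(res.params)

fc = res.forecast(horizon=1)                          # one-step mean and variance forecasts
print(fc.mean.iloc[-1], fc.variance.iloc[-1])
```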

  17. Highly comparative time-series analysis: the empirical structure of time series and their methods.

    Science.gov (United States)

    Fulcher, Ben D; Little, Max A; Jones, Nick S

    2013-06-06

    The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines.

  18. Human Resource Management in Virtual Organizations. Research in Human Resource Management Series.

    Science.gov (United States)

    Heneman, Robert L., Ed.; Greenberger, David B., Ed.

    This document contains 14 papers on human resources (HR) and human resource management (HRM) in virtual organizations. The following papers are included: "Series Preface" (Rodger Griffeth); "Volume Preface" (Robert L. Heneman, David B. Greenberger); "The Virtual Organization: Definition, Description, and…

  19. Lag space estimation in time series modelling

    DEFF Research Database (Denmark)

    Goutte, Cyril

    1997-01-01

    The purpose of this article is to investigate some techniques for finding the relevant lag-space, i.e. input information, for time series modelling. This is an important aspect of time series modelling, as it conditions the design of the model through the regressor vector a.k.a. the input layer...

  20. Forecasting the Reference Evapotranspiration Using Time Series Model

    Directory of Open Access Journals (Sweden)

    H. Zare Abyaneh

    2016-10-01

    Introduction: Reference evapotranspiration is one of the most important factors in irrigation timing and field management. Moreover, reference evapotranspiration forecasting can play a vital role in future developments. Therefore, in this study the seasonal autoregressive integrated moving average (ARIMA) model was used to forecast the reference evapotranspiration time series at the Esfahan, Semnan, Shiraz, Kerman, and Yazd synoptic stations. Materials and Methods: In the present study, at all stations (characteristics of the synoptic stations are given in Table 1), the meteorological data, including mean, maximum and minimum air temperature, relative humidity, dry- and wet-bulb temperature, dew-point temperature, wind speed, precipitation, air vapor pressure and sunshine hours, were collected from the Islamic Republic of Iran Meteorological Organization (IRIMO) for the 41 years from 1965 to 2005. The FAO Penman-Monteith equation was used to calculate the monthly reference evapotranspiration at the five synoptic stations, and the evapotranspiration time series were formed. The unit root test was used to identify whether the time series was stationary; then, using the Box-Jenkins method, seasonal ARIMA models were applied to the sample data.
    Table 1. Geographical location and climate conditions of the synoptic stations (longitude E; latitude N; altitude, m; mean annual air temperature, °C; min.-max. air temperature, °C; mean precipitation, mm; climate according to the De Martonne index classification):
    Esfahan: 51° 40', 32° 37', 1550.4, 16.36, 9.4-23.3, 122, Arid
    Semnan: 53° 33', 35° 35', 1130.8, 18.0, 12.4-23.8, 140, Arid
    Shiraz: 52° 36', 29° 32', 1484, 18.0, 10.2-25.9, 324, Semi-arid
    Kerman: 56° 58', 30° 15', 1753.8, 15.6, 6.7-24.6, 142, Arid
    Yazd: 54° 17', 31° 54', 1237.2, 19.2, 11.8-26.0, 61, Arid
    Results and Discussion: The monthly meteorological data were used as input for the Ref-ET software and monthly reference
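
    A hedged sketch of the seasonal ARIMA step with statsmodels on a synthetic monthly series standing in for the station data; the (1,0,1)x(1,1,1,12) order is an illustrative assumption, not the order identified in the study.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(5)
# Toy monthly reference-ET-like series (mm/day): annual cycle plus noise
months = pd.date_range("1965-01", periods=12 * 41, freq="MS")
et0 = 4.5 + 3.0 * np.sin(2 * np.pi * (months.month - 4) / 12) + 0.3 * rng.standard_normal(len(months))
y = pd.Series(et0, index=months)

# Seasonal ARIMA(1,0,1)(1,1,1)_12 -- illustrative orders only
model = sm.tsa.statespace.SARIMAX(y, order=(1, 0, 1), seasonal_order=(1, 1, 1, 12))
res = model.fit(disp=False)

# Forecast the next 24 months of reference evapotranspiration
print(res.get_forecast(steps=24).predicted_mean.head())
```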

  1. Forecasting with nonlinear time series models

    DEFF Research Database (Denmark)

    Kock, Anders Bredahl; Teräsvirta, Timo

    In this paper, nonlinear models are restricted to mean nonlinear parametric models. Several such models popular in time series econometrics are presented and some of their properties discussed. This includes two models based on universal approximators: the Kolmogorov-Gabor polynomial model and two versions of a simple artificial neural network model. Techniques for generating multi-period forecasts from nonlinear models recursively are considered, and the direct (non-recursive) method for this purpose is mentioned as well. Forecasting with complex dynamic systems, albeit less frequently applied to economic forecasting problems, is briefly highlighted. A number of large published studies comparing macroeconomic forecasts obtained using different time series models are discussed, and the paper also contains a small simulation study comparing recursive and direct forecasts in a partic…

  2. Modelling conditional heteroscedasticity in nonstationary series

    NARCIS (Netherlands)

    Cizek, P.; Cizek, P.; Härdle, W.K.; Weron, R.

    2011-01-01

    A vast amount of econometrical and statistical research deals with modeling financial time series and their volatility, which measures the dispersion of a series at a point in time (i.e., conditional variance). Although financial markets have been experiencing many shorter and longer periods of

  3. Long Memory Models to Generate Synthetic Hydrological Series

    Directory of Open Access Journals (Sweden)

    Guilherme Armando de Almeida Pereira

    2014-01-01

    In Brazil, much of the energy production comes from hydroelectric plants whose planning is not trivial due to the strong dependence on rainfall regimes. This planning is accomplished through optimization models that use inputs such as synthetic hydrologic series generated from the statistical model PAR(p) (periodic autoregressive). Recently, Brazil began the search for alternative models able to capture effects that the traditional PAR(p) model does not incorporate, such as long memory effects. Long memory in a time series can be defined as a significant dependence between lags separated by a long period of time. Thus, this research develops a study of the effects of long dependence in the series of streamflow natural energy in the South subsystem, in order to estimate a long memory model capable of generating synthetic hydrologic series.

  4. Fetal organ dosimetry for the Techa River and Ozyorsk offspring cohorts. Pt. 1. A Urals-based series of fetal computational phantoms

    Energy Technology Data Exchange (ETDEWEB)

    Maynard, Matthew R.; Bolch, Wesley E. [University of Florida, Advanced Laboratory for Radiation Dosimetry Studies (ALRADS), J. Crayton Pruitt Family Department of Biomedical Engineering, Gainesville, FL (United States); Shagina, Natalia B.; Tolstykh, Evgenia I.; Degteva, Marina O. [Urals Research Center for Radiation Medicine, Chelyabinsk (Russian Federation); Fell, Tim P. [Public Health England, Centre for Radiation, Chemical and Environmental Health, Didcot, Chilton, Oxon (United Kingdom)

    2015-03-15

    The European Union's SOLO (Epidemiological Studies of Exposed Southern Urals Populations) project aims to improve understanding of cancer risks associated with chronic in utero radiation exposure. A comprehensive series of hybrid computational fetal phantoms was previously developed at the University of Florida in order to provide the SOLO project with the capability of computationally simulating and quantifying radiation exposures to individual fetal bones and soft tissue organs. To improve harmonization between the SOLO fetal biokinetic models and the computational phantoms, a subset of those phantoms was systematically modified to create a novel series of phantoms matching anatomical data representing Russian fetal biometry in the Southern Urals. Using previously established modeling techniques, eight computational Urals-based phantoms aged 8, 12, 18, 22, 26, 30, 34, and 38 weeks post-conception were constructed to match appropriate age-dependent femur lengths, biparietal diameters, individual bone masses and whole-body masses. Bone and soft tissue organ mass differences between the common ages of the subset of UF phantom series and the Urals-based phantom series illustrated the need for improved understanding of fetal bone densities as a critical parameter of computational phantom development. In anticipation for SOLO radiation dosimetry studies involving the developing fetus and pregnant female, the completed phantom series was successfully converted to a cuboidal voxel format easily interpreted by radiation transport software. (orig.)

  5. RADON CONCENTRATION TIME SERIES MODELING AND APPLICATION DISCUSSION.

    Science.gov (United States)

    Stránský, V; Thinová, L

    2017-11-01

    In the year 2010 a continual radon measurement was established at Mladeč Caves in the Czech Republic using a continual radon monitor RADIM3A. In order to model the radon time series in the years 2010-15, the Box-Jenkins methodology, often used in econometrics, was applied. Because of the behavior of radon concentrations (RCs), a seasonal integrated autoregressive moving average model with exogenous variables (SARIMAX) was chosen to model the measured time series. This model uses the time series seasonality, previously acquired values and delayed atmospheric parameters to forecast RC. The developed model for the RC time series is called regARIMA(5,1,3). Model residuals could be retrospectively compared with seismic evidence of local or global earthquakes that occurred during the RC measurements. This technique enables us to assess whether continuously measured RC could serve as an earthquake precursor.

  6. Vector bilinear autoregressive time series model and its superiority ...

    African Journals Online (AJOL)

    In this research, a vector bilinear autoregressive time series model was proposed and used to model three revenue series (X1, X2, X3) . The “orders” of the three series were identified on the basis of the distribution of autocorrelation and partial autocorrelation functions and were used to construct the vector bilinear models.

  7. Foundations of Sequence-to-Sequence Modeling for Time Series

    OpenAIRE

    Kuznetsov, Vitaly; Mariet, Zelda

    2018-01-01

    The availability of large amounts of time series data, paired with the performance of deep-learning algorithms on a broad class of problems, has recently led to significant interest in the use of sequence-to-sequence models for time series forecasting. We provide the first theoretical analysis of this time series forecasting framework. We include a comparison of sequence-to-sequence modeling to classical time series models, and as such our theory can serve as a quantitative guide for practiti...

  8. Time domain series system definition and gear set reliability modeling

    International Nuclear Information System (INIS)

    Xie, Liyang; Wu, Ningxiang; Qian, Wenxue

    2016-01-01

    Time-dependent multi-configuration is a typical feature of mechanical systems such as gear trains and chain drives. As a series system, a gear train is distinct from a traditional series system, such as a chain, in its load transmission path, system-component relationship, system functioning manner, as well as its time-dependent system configuration. Firstly, the present paper defines the time-domain series system, for which the traditional series system reliability model is not adequate. Then, a system-specific reliability modeling technique is proposed for gear sets, including component (tooth) and subsystem (tooth-pair) load history description, material prior/posterior strength expression, time-dependent and system-specific load-strength interference analysis, as well as treatment of statistically dependent failure events. Consequently, several system reliability models are developed for gear sets with different tooth numbers in the scenario of tooth root material ultimate tensile strength failure. The application of the models is discussed in the last part, and the differences between the system-specific reliability model and the traditional series system reliability model are illustrated by means of several numerical examples. - Highlights: • A new type of series system, the time-domain multi-configuration series system, is defined, which is of great significance to reliability modeling. • A multi-level statistical analysis based reliability modeling method is presented for gear transmission systems. • Several system-specific reliability models are established for gear set reliability estimation. • The differences between the traditional series system reliability model and the new model are illustrated.

  9. Modeling seasonality in bimonthly time series

    NARCIS (Netherlands)

    Ph.H.B.F. Franses (Philip Hans)

    1992-01-01

    A recurring issue in modeling seasonal time series variables is the choice of the most adequate model for the seasonal movements. One selection method for quarterly data is proposed in Hylleberg et al. (1990). Market response models are often constructed for bimonthly variables, and

  10. Adaptive time-variant models for fuzzy-time-series forecasting.

    Science.gov (United States)

    Wong, Wai-Keung; Bai, Enjian; Chu, Alice Wai-Ching

    2010-12-01

    A fuzzy time series has been applied to the prediction of enrollment, temperature, stock indices, and other domains. Related studies mainly focus on three factors, namely, the partition of discourse, the content of forecasting rules, and the methods of defuzzification, all of which greatly influence the prediction accuracy of forecasting models. These studies use fixed analysis window sizes for forecasting. In this paper, an adaptive time-variant fuzzy-time-series forecasting model (ATVF) is proposed to improve forecasting accuracy. The proposed model automatically adapts the analysis window size of fuzzy time series based on the prediction accuracy in the training phase and uses heuristic rules to generate forecasting values in the testing phase. The performance of the ATVF model is tested using both simulated and actual time series including the enrollments at the University of Alabama, Tuscaloosa, and the Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX). The experiment results show that the proposed ATVF model achieves a significant improvement in forecasting accuracy as compared to other fuzzy-time-series forecasting models.

  11. Modelling road accidents: An approach using structural time series

    Science.gov (United States)

    Junus, Noor Wahida Md; Ismail, Mohd Tahir

    2014-09-01

    In this paper, the trend of road accidents in Malaysia for the years 2001 until 2012 was modelled using a structural time series approach. The structural time series model was identified using a stepwise method, and the residuals for each model were tested. The best-fitted model was chosen based on the smallest Akaike Information Criterion (AIC) and prediction error variance. In order to check the quality of the model, a data validation procedure was performed by predicting the monthly number of road accidents for the year 2012. Results indicate that the best specification of the structural time series model to represent road accidents is the local level with a seasonal model.
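
    A minimal sketch of the selected specification, a local level plus a seasonal component, using statsmodels' UnobservedComponents on a toy monthly series; the data and the stochastic-seasonal choice are illustrative assumptions, not the study's.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(6)
# Toy monthly accident-count-like series (2001-2011) with drift and an annual cycle
idx = pd.date_range("2001-01", periods=132, freq="MS")
level_path = 400 + np.cumsum(rng.normal(0, 5, len(idx)))
season = 40 * np.sin(2 * np.pi * idx.month / 12)
y = pd.Series(level_path + season + rng.normal(0, 10, len(idx)), index=idx)

# Structural model: local level plus a stochastic seasonal component of period 12
mod = sm.tsa.UnobservedComponents(y, level="local level", seasonal=12)
res = mod.fit(disp=False)
print(res.aic)                                        # information criterion for model comparison

# Validation in the spirit of the paper: predict the next 12 months
print(res.get_forecast(steps=12).predicted_mean.round(1))
```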

  12. Trend time-series modeling and forecasting with neural networks.

    Science.gov (United States)

    Qi, Min; Zhang, G Peter

    2008-05-01

    Despite its great importance, there has been no general consensus on how to model the trends in time-series data. Compared to traditional approaches, neural networks (NNs) have shown some promise in time-series forecasting. This paper investigates how to best model trend time series using NNs. Four different strategies (raw data, raw data with time index, detrending, and differencing) are used to model various trend patterns (linear, nonlinear, deterministic, stochastic, and breaking trend). We find that with NNs differencing often gives meritorious results regardless of the underlying data generating processes (DGPs). This finding is also confirmed by the real gross national product (GNP) series.

  13. Body and organ dimensions of the 1945 Japanese population used in dosimetry system DS86 and data available for an expanded series of phantoms

    International Nuclear Information System (INIS)

    Cullings, H. M.; Kawamura, H.; Chen, J.

    2012-01-01

    The computational phantoms used in dosimetry system DS86 and re-used in DS02 were derived from models and methods developed at Oak Ridge National Laboratories (ORNL) in the US, but referred to Japanese anthropometric data for the Japanese population of 1945, from studies conducted at the Japanese National Inst. of Radiological Sciences and other sources. The phantoms developed for DS86 were limited to three hermaphroditic models: infant, child and adult. After comparing data from Japanese and Western populations, phantoms were adapted from the pre-existing ORNL series, adjusting some organs in the adult phantom to reflect differences between Japanese and Western data, but not in the infant and child phantoms. To develop a new and larger series of more age- and sex-specific models, it appears necessary to rely on the original Japanese data and values derived from them, which can directly provide population-average body dimensions for various ages. Those data were re-analysed in conjunction with other Asian data for an Asian Reference Man model, providing a rather complete table of organ weights that could be used to scale organs for growth during childhood and adolescence. Although the resulting organ volumes might have some inaccuracies in relation to true population-average values, this is a minor concern because in the DS02 context organ size per se is less important than the correct body size and correct placement of the organ in the body. (authors)

  14. Analysis of Student Motivation Levels in Science Learning with a Project-Based Advance Organizer Model

    Directory of Open Access Journals (Sweden)

    Tasiwan

    2014-04-01

    This study was conducted to analyze the level of student motivation in science learning through a project-based advance organizer model. Samples were selected at random. The project-based advance organizer learning model was applied in the experimental class, while the control class received direct instruction without an advance organizer. Prior to classroom learning, the experimental students were grouped into 8 groups of 4-5 students, and each group was assigned to realize a project: an electric bell, a series-parallel circuit, and a lever. The project products were used in classroom learning as advance organizers. Data were obtained through participant observation, product assessment, concept maps, experiment reports, and questionnaires; motivation was measured with the ARCS motivation scale. The results show that the experimental class had a better level of motivation in the aspects of attention, relevance, confidence, and satisfaction with learning, with a mean motivation score of 77.20, compared with 71.10 without the project-based advance organizer. It is suggested that students be given full autonomy in their projects.

  15. Koopman Operator Framework for Time Series Modeling and Analysis

    Science.gov (United States)

    Surana, Amit

    2018-01-01

    We propose an interdisciplinary framework for time series classification, forecasting, and anomaly detection by combining concepts from Koopman operator theory, machine learning, and linear systems and control theory. At the core of this framework is nonlinear dynamic generative modeling of time series using the Koopman operator which is an infinite-dimensional but linear operator. Rather than working with the underlying nonlinear model, we propose two simpler linear representations or model forms based on Koopman spectral properties. We show that these model forms are invariants of the generative model and can be readily identified directly from data using techniques for computing Koopman spectral properties without requiring the explicit knowledge of the generative model. We also introduce different notions of distances on the space of such model forms which is essential for model comparison/clustering. We employ the space of Koopman model forms equipped with distance in conjunction with classical machine learning techniques to develop a framework for automatic feature generation for time series classification. The forecasting/anomaly detection framework is based on using Koopman model forms along with classical linear systems and control approaches. We demonstrate the proposed framework for human activity classification, and for time series forecasting/anomaly detection in power grid application.
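
    As a generic illustration of estimating a finite-dimensional linear (Koopman-style) operator from data, the sketch below implements plain dynamic mode decomposition with numpy; it is a standard textbook construction under toy assumptions, not the specific model forms proposed in this record.

```python
import numpy as np

def dmd(X, Y, rank=None):
    """Least-squares linear operator A with Y ≈ A X (columns are snapshots)."""
    U, s, Vh = np.linalg.svd(X, full_matrices=False)
    if rank is not None:
        U, s, Vh = U[:, :rank], s[:rank], Vh[:rank]
    A_tilde = U.conj().T @ Y @ Vh.conj().T @ np.diag(1.0 / s)   # operator in the reduced basis
    eigvals, W = np.linalg.eig(A_tilde)                         # approximate Koopman eigenvalues
    modes = Y @ Vh.conj().T @ np.diag(1.0 / s) @ W              # DMD modes
    return eigvals, modes

rng = np.random.default_rng(7)
t = np.arange(300)
c = np.exp(-0.01 * t) * np.cos(0.2 * t)
s_ = np.exp(-0.01 * t) * np.sin(0.2 * t)
# Toy 5-channel series driven by one damped oscillation, plus small noise
data = np.outer(rng.normal(size=5), c) + np.outer(rng.normal(size=5), s_) + 0.01 * rng.normal(size=(5, 300))

X, Y = data[:, :-1], data[:, 1:]              # snapshot pairs (x_k, x_{k+1})
eigvals, modes = dmd(X, Y, rank=2)
print(np.round(eigvals, 3))                   # expected near exp(-0.01) * exp(±0.2i)
```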

  16. Multiple Time Series Ising Model for Financial Market Simulations

    International Nuclear Information System (INIS)

    Takaishi, Tetsuya

    2015-01-01

    In this paper we propose an Ising model which simulates multiple financial time series. Our model introduces an interaction which couples to the spins of other systems. Simulations from our model show that the time series exhibit the volatility clustering that is often observed in real financial markets. Furthermore, we also find non-zero cross-correlations between the volatilities from our model. Thus our model can simulate stock markets where the volatilities of stocks are mutually correlated.

  17. FOURIER SERIES MODELS THROUGH TRANSFORMATION

    African Journals Online (AJOL)


    monthly temperature data (1996 – 2005) collected from the National Root ... KEY WORDS: Fourier series, square transformation, multiplicative model, ... fluctuations or movements are often periodic (Ekpeyong, 2005). ... significant trend or not; if the trend is not significant, the grand mean may be used as an estimate of trend.

  18. Estimation of pure autoregressive vector models for revenue series ...

    African Journals Online (AJOL)

    This paper aims at applying a multivariate approach to Box-Jenkins univariate time series modeling to three vector series. General autoregressive vector models with time-varying coefficients are estimated. The first vector is a response vector, while the others are predictor vectors. By matrix expansion each vector, whether ...

  19. Investigation on Self-Organization Processes in DC Generators by Synergetic Modeling

    OpenAIRE

    Ion Voncilă; Mădălin Costin; Răzvan Buhosu

    2014-01-01

    This paper suggests a new mathematical model by which the self-excitation of DC generators, with either shunt or series excitation, can be explained through self-organization phenomena that appear once threshold values are exceeded (self-excitation in these generators is an avalanche process, a positive feedback that at first glance appears uncontrollable).

  20. Sensitivity analysis of machine-learning models of hydrologic time series

    Science.gov (United States)

    O'Reilly, A. M.

    2017-12-01

    Sensitivity analysis traditionally has been applied to assessing model response to perturbations in model parameters, where the parameters are those model input variables adjusted during calibration. Unlike physics-based models where parameters represent real phenomena, the equivalent of parameters for machine-learning models are simply mathematical "knobs" that are automatically adjusted during training/testing/verification procedures. Thus the challenge of extracting knowledge of hydrologic system functionality from machine-learning models lies in their very nature, leading to the label "black box." Sensitivity analysis of the forcing-response behavior of machine-learning models, however, can provide understanding of how the physical phenomena represented by model inputs affect the physical phenomena represented by model outputs. As part of a previous study, hybrid spectral-decomposition artificial neural network (ANN) models were developed to simulate the observed behavior of hydrologic response contained in multidecadal datasets of lake water level, groundwater level, and spring flow. Model inputs used moving window averages (MWA) to represent various frequencies and frequency-band components of time series of rainfall and groundwater use. Using these forcing time series, the MWA-ANN models were trained to predict time series of lake water level, groundwater level, and spring flow at 51 sites in central Florida, USA. A time series of sensitivities for each MWA-ANN model was produced by perturbing forcing time-series and computing the change in response time-series per unit change in perturbation. Variations in forcing-response sensitivities are evident between types (lake, groundwater level, or spring), spatially (among sites of the same type), and temporally. Two generally common characteristics among sites are more uniform sensitivities to rainfall over time and notable increases in sensitivities to groundwater usage during significant drought periods.
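
    The forcing-response sensitivity procedure described above can be sketched generically: train any regressor on moving-window-average inputs, perturb one input, and record the output change per unit perturbation. The regressor, window widths and synthetic data below are illustrative assumptions, not the study's MWA-ANN models.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def moving_average(x, window):
    return np.convolve(x, np.ones(window) / window, mode="valid")

rng = np.random.default_rng(8)
n = 2000
rain = rng.gamma(1.5, 2.0, n)                       # toy daily rainfall
pumping = 50 + 10 * np.sin(np.arange(n) / 365)      # toy groundwater use

# Inputs: moving-window averages at two time scales; output: toy water level
w_short, w_long = 30, 180
m = n - w_long + 1
X = np.column_stack([moving_average(rain, w_short)[-m:],
                     moving_average(rain, w_long),
                     moving_average(pumping, w_long)])
level = 10 + 0.5 * X[:, 1] - 0.05 * X[:, 2] + 0.1 * rng.standard_normal(m)

model = MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000, random_state=0).fit(X, level)

# Sensitivity time series: finite-difference response to a unit rainfall perturbation
delta = 1.0
X_pert = X.copy()
X_pert[:, 1] += delta
sensitivity = (model.predict(X_pert) - model.predict(X)) / delta
print(sensitivity[:5])                              # ~0.5 if the net recovered the true effect
```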

  1. Parameterizing unconditional skewness in models for financial time series

    DEFF Research Database (Denmark)

    He, Changli; Silvennoinen, Annastiina; Teräsvirta, Timo

    In this paper we consider the third-moment structure of a class of time series models. It is often argued that the marginal distribution of financial time series such as returns is skewed. Therefore it is of importance to know what properties a model should possess if it is to accommodate...

  2. A four-stage hybrid model for hydrological time series forecasting.

    Science.gov (United States)

    Di, Chongli; Yang, Xiaohua; Wang, Xiaochao

    2014-01-01

    Hydrological time series forecasting remains a difficult task due to its complicated nonlinear, non-stationary and multi-scale characteristics. To solve this difficulty and improve the prediction accuracy, a novel four-stage hybrid model is proposed for hydrological time series forecasting based on the principle of 'denoising, decomposition and ensemble'. The proposed model has four stages, i.e., denoising, decomposition, components prediction and ensemble. In the denoising stage, the empirical mode decomposition (EMD) method is utilized to reduce the noises in the hydrological time series. Then, an improved method of EMD, the ensemble empirical mode decomposition (EEMD), is applied to decompose the denoised series into a number of intrinsic mode function (IMF) components and one residual component. Next, the radial basis function neural network (RBFNN) is adopted to predict the trend of all of the components obtained in the decomposition stage. In the final ensemble prediction stage, the forecasting results of all of the IMF and residual components obtained in the third stage are combined to generate the final prediction results, using a linear neural network (LNN) model. For illustration and verification, six hydrological cases with different characteristics are used to test the effectiveness of the proposed model. The proposed hybrid model performs better than conventional single models, the hybrid models without denoising or decomposition and the hybrid models based on other methods, such as the wavelet analysis (WA)-based hybrid models. In addition, the denoising and decomposition strategies decrease the complexity of the series and reduce the difficulties of the forecasting. With its effective denoising and accurate decomposition ability, high prediction precision and wide applicability, the new model is very promising for complex time series forecasting. This new forecast model is an extension of nonlinear prediction models.
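
    A structural outline of the 'decomposition, components prediction, ensemble' stages (the denoising pass is omitted for brevity). It assumes the third-party PyEMD package for EEMD and substitutes ridge regression for the paper's RBF network and linear-network combiner, so it is a hedged sketch rather than a reproduction.

```python
import numpy as np
from PyEMD import EEMD            # assumed third-party package (pip install EMD-signal)
from sklearn.linear_model import Ridge

def lagged(y, lags=6):
    # Rows are [y[t], ..., y[t+lags-1]]; the target is y[t+lags]
    X = np.column_stack([y[i:len(y) - lags + i] for i in range(lags)])
    return X, y[lags:]

rng = np.random.default_rng(9)
t = np.arange(600)
flow = 50 + 20 * np.sin(2 * np.pi * t / 73) + 5 * np.sin(2 * np.pi * t / 12) + 2 * rng.standard_normal(600)

# Decomposition stage: EEMD splits the series into intrinsic mode functions
imfs = EEMD(trials=50).eemd(flow)

# Component-prediction stage: one simple model per component (ridge stands in for the RBF network)
component_forecasts = []
for comp in imfs:
    X, target = lagged(comp)
    model = Ridge().fit(X, target)
    component_forecasts.append(model.predict(comp[-6:][None, :])[0])

# Ensemble stage: recombine the component forecasts (a plain sum stands in for the LNN combiner)
print("one-step forecast:", sum(component_forecasts))
```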

  4. Prediction of traffic-related nitrogen oxides concentrations using Structural Time-Series models

    Science.gov (United States)

    Lawson, Anneka Ruth; Ghosh, Bidisha; Broderick, Brian

    2011-09-01

    Ambient air quality monitoring, modeling and compliance to the standards set by European Union (EU) directives and World Health Organization (WHO) guidelines are required to ensure the protection of human and environmental health. Congested urban areas are most susceptible to traffic-related air pollution which is the most problematic source of air pollution in Ireland. Long-term continuous real-time monitoring of ambient air quality at such urban centers is essential but often not realistic due to financial and operational constraints. Hence, the development of a resource-conservative ambient air quality monitoring technique is essential to ensure compliance with the threshold values set by the standards. As an intelligent and advanced statistical methodology, a Structural Time Series (STS) based approach has been introduced in this paper to develop a parsimonious and computationally simple air quality model. In STS methodology, the different components of a time-series dataset such as the trend, seasonal, cyclical and calendar variations can be modeled separately. To test the effectiveness of the proposed modeling strategy, average hourly concentrations of nitrogen dioxide and nitrogen oxides from a congested urban arterial in Dublin city center were modeled using STS methodology. The prediction error estimates from the developed air quality model indicate that the STS model can be a useful tool in predicting nitrogen dioxide and nitrogen oxides concentrations in urban areas and will be particularly useful in situations where the information on external variables such as meteorology or traffic volume is not available.

  5. A neuro-fuzzy computing technique for modeling hydrological time series

    Science.gov (United States)

    Nayak, P. C.; Sudheer, K. P.; Rangan, D. M.; Ramasastri, K. S.

    2004-05-01

    Intelligent computing tools such as artificial neural network (ANN) and fuzzy logic approaches are proven to be efficient when applied individually to a variety of problems. Recently there has been a growing interest in combining both these approaches, and as a result, neuro-fuzzy computing techniques have evolved. This approach has been tested and evaluated in the field of signal processing and related areas, but researchers have only begun evaluating the potential of this neuro-fuzzy hybrid approach in hydrologic modeling studies. This paper presents the application of an adaptive neuro fuzzy inference system (ANFIS) to hydrologic time series modeling, and is illustrated by an application to model the river flow of Baitarani River in Orissa state, India. An introduction to the ANFIS modeling approach is also presented. The advantage of the method is that it does not require the model structure to be known a priori, in contrast to most of the time series modeling techniques. The results showed that the ANFIS forecasted flow series preserves the statistical properties of the original flow series. The model showed good performance in terms of various statistical indices. The results are highly promising, and a comparative analysis suggests that the proposed modeling approach outperforms ANNs and other traditional time series models in terms of computational speed, forecast errors, efficiency, peak flow estimation etc. It was observed that the ANFIS model preserves the potential of the ANN approach fully, and eases the model building process.

  6. Comparison of annual maximum series and partial duration series methods for modeling extreme hydrologic events

    DEFF Research Database (Denmark)

    Madsen, Henrik; Rasmussen, Peter F.; Rosbjerg, Dan

    1997-01-01

    Two different models for analyzing extreme hydrologic events, based on, respectively, partial duration series (PDS) and annual maximum series (AMS), are compared. The PDS model assumes a generalized Pareto distribution for modeling threshold exceedances corresponding to a generalized extreme value (…). In the case of ML estimation, the PDS model provides the most efficient T-year event estimator. In the cases of MOM and PWM estimation, the PDS model is generally preferable for negative shape parameters, whereas the AMS model yields the most efficient estimator for positive shape parameters. A comparison of the considered methods reveals that in general, one should use the PDS model with MOM estimation for negative shape parameters, the PDS model with exponentially distributed exceedances if the shape parameter is close to zero, the AMS model with MOM estimation for moderately positive shape parameters, and the PDS …
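
    A hedged numerical illustration of the PDS side of this comparison: fit a generalized Pareto distribution to threshold exceedances by the method of moments and compute a T-year event estimate. The formulas are the standard ones; the synthetic record and the threshold choice are toy assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(10)
years = 50
# Toy daily discharge record and a fixed threshold defining the partial duration series
q = stats.gamma.rvs(2.0, scale=30.0, size=365 * years, random_state=rng)
threshold = np.quantile(q, 0.995)
exceedances = q[q > threshold] - threshold
lam = len(exceedances) / years                 # mean number of exceedances per year

# Method-of-moments estimates of the GPD shape (xi) and scale (sigma)
m, v = exceedances.mean(), exceedances.var(ddof=1)
xi = 0.5 * (1.0 - m**2 / v)                    # close to 0 for this toy, near-exponential tail
sigma = 0.5 * m * (m**2 / v + 1.0)

# T-year event: quantile with annual exceedance probability 1/T under the PDS/GPD model
T = 100.0
x_T = threshold + sigma / xi * ((lam * T) ** xi - 1.0)
print(f"lambda={lam:.2f}/yr, xi={xi:.3f}, sigma={sigma:.1f}, {T:.0f}-yr event={x_T:.1f}")
```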

  7. forecasting with nonlinear time series model: a monte-carlo

    African Journals Online (AJOL)


    …erated recursively up to any step greater than one. For a nonlinear time series model, a point forecast for step one can be made easily, as in the linear case, but a forecast for a step greater than or equal to … London. Franses, P. H. (1998). Time series models for business and economic forecasting, Cambridge University Press.

  8. Parametric, nonparametric and parametric modelling of a chaotic circuit time series

    Science.gov (United States)

    Timmer, J.; Rust, H.; Horbelt, W.; Voss, H. U.

    2000-09-01

    The determination of a differential equation underlying a measured time series is a frequently arising task in nonlinear time series analysis. In the validation of a proposed model one often faces the dilemma that it is hard to decide whether possible discrepancies between the time series and model output are caused by an inappropriate model or by bad estimates of parameters in a correct type of model, or both. We propose a combination of parametric modelling based on Bock's multiple shooting algorithm and nonparametric modelling based on optimal transformations as a strategy to test proposed models and if rejected suggest and test new ones. We exemplify this strategy on an experimental time series from a chaotic circuit where we obtain an extremely accurate reconstruction of the observed attractor.

  9. Stochastic modeling of hourly rainfall times series in Campania (Italy)

    Science.gov (United States)

    Giorgio, M.; Greco, R.

    2009-04-01

    The occurrence of flowslides and floods in small catchments is difficult to predict, since it is affected by a number of variables, such as mechanical and hydraulic soil properties, slope morphology, vegetation coverage, and rainfall spatial and temporal variability. Consequently, landslide risk assessment procedures and early warning systems still rely on simple empirical models based on correlations between recorded rainfall data and observed landslides and/or river discharges. The effectiveness of such systems could be improved by reliable quantitative rainfall prediction, which allows larger lead-times to be gained. Analysis of on-site recorded rainfall height time series represents the most effective approach for a reliable prediction of the local temporal evolution of rainfall. Hydrological time series analysis is a widely studied field in hydrology, often carried out by means of autoregressive models such as AR, ARMA, ARX and ARMAX (e.g. Salas [1992]). Such models give the best results when applied to the analysis of autocorrelated hydrological time series, like river flow or level time series. Conversely, they are not able to model the behaviour of intermittent time series, as point rainfall height series usually are, especially when recorded with short sampling time intervals. More useful for this purpose are the so-called DRIP (Disaggregated Rectangular Intensity Pulse) and NSRP (Neyman-Scott Rectangular Pulse) models [Heneker et al., 2001; Cowpertwait et al., 2002], usually adopted to generate synthetic point rainfall series. In this paper, the DRIP model approach is adopted, in which the sequence of rain storms and dry intervals constituting the structure of the rainfall time series is modeled as an alternating renewal process. The final aim of the study is to provide a useful tool for implementing an early warning system for hydrogeological risk management. Model calibration has been carried out with hourly rainfall height data provided by the rain gauges of Campania Region civil

  10. Modelling fourier regression for time series data- a case study: modelling inflation in foods sector in Indonesia

    Science.gov (United States)

    Prahutama, Alan; Suparti; Wahyu Utami, Tiani

    2018-03-01

    Regression analysis models the relationship between response variables and predictor variables. The parametric approach to regression is very strict in its assumptions, whereas the nonparametric regression model requires no assumption about the form of the model. Time series data are observations of a variable indexed by time, so if a time series is to be modeled by regression, the response and predictor variables must be determined first. The response variable is the value at time t (yt), while the predictor variables are the significant lags. In nonparametric regression modeling, one developing approach is the Fourier series approach; one of its advantages is the ability to handle data with a trigonometric pattern. Modeling with a Fourier series requires the parameter K, and the number of terms K can be determined with the Generalized Cross Validation method. In inflation modeling for the transportation, communication and financial services sector, the Fourier series approach yields an optimal K of 120 parameters with an R-square of 99%, whereas multiple linear regression yields an R-square of 90%.
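    As a rough illustration of the Fourier-series regression with GCV-based selection of K described above (not the authors' code; the monthly series, the period of 12 and the candidate range of K are assumptions made for the example):

```python
# Truncated Fourier-basis regression with K chosen by Generalized Cross Validation.
import numpy as np

def fourier_design(t, K, period):
    """Design matrix with intercept, linear term and K sine/cosine pairs."""
    cols = [np.ones_like(t), t]
    for k in range(1, K + 1):
        cols.append(np.cos(2 * np.pi * k * t / period))
        cols.append(np.sin(2 * np.pi * k * t / period))
    return np.column_stack(cols)

def gcv_score(y, X):
    """GCV = n * RSS / (n - trace(H))^2; for OLS, trace(H) equals the number of columns."""
    n = len(y)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return n * np.sum(resid ** 2) / (n - X.shape[1]) ** 2

rng = np.random.default_rng(1)
t = np.arange(120, dtype=float)             # e.g. ten years of monthly inflation figures
y = 0.02 * t + np.sin(2 * np.pi * t / 12) + rng.normal(0, 0.3, t.size)

scores = {K: gcv_score(y, fourier_design(t, K, period=12)) for K in range(1, 13)}
best_K = min(scores, key=scores.get)
print("K chosen by GCV:", best_K)
```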

  11. Investigation on Self-Organization Processes in DC Generators by Synergetic Modeling

    Directory of Open Access Journals (Sweden)

    Ion Voncilă

    2014-09-01

    Full Text Available In this paper a new mathematical model is suggested, on the basis of which the self-excitation of DC generators, with either shunt or series excitation, can be explained by self-organization phenomena that appear once threshold values are exceeded (self-excitation in these generators is an avalanche process, a positive feedback, considered at first glance uncontrollable).

  12. Modeling vector nonlinear time series using POLYMARS

    NARCIS (Netherlands)

    de Gooijer, J.G.; Ray, B.K.

    2003-01-01

    A modified multivariate adaptive regression splines method for modeling vector nonlinear time series is investigated. The method results in models that can capture certain types of vector self-exciting threshold autoregressive behavior, as well as provide good predictions for more general vector

  13. Forecasting with periodic autoregressive time series models

    NARCIS (Netherlands)

    Ph.H.B.F. Franses (Philip Hans); R. Paap (Richard)

    1999-01-01

    textabstractThis paper is concerned with forecasting univariate seasonal time series data using periodic autoregressive models. We show how one should account for unit roots and deterministic terms when generating out-of-sample forecasts. We illustrate the models for various quarterly UK consumption

  14. Outlier Detection in Structural Time Series Models

    DEFF Research Database (Denmark)

    Marczak, Martyna; Proietti, Tommaso

    investigate via Monte Carlo simulations how this approach performs for detecting additive outliers and level shifts in the analysis of nonstationary seasonal time series. The reference model is the basic structural model, featuring a local linear trend, possibly integrated of order two, stochastic seasonality......Structural change affects the estimation of economic signals, like the underlying growth rate or the seasonally adjusted series. An important issue, which has attracted a great deal of attention also in the seasonal adjustment literature, is its detection by an expert procedure. The general......–to–specific approach to the detection of structural change, currently implemented in Autometrics via indicator saturation, has proven to be both practical and effective in the context of stationary dynamic regression models and unit–root autoregressions. By focusing on impulse– and step–indicator saturation, we...

  15. Time Series Modelling of Syphilis Incidence in China from 2005 to 2012.

    Science.gov (United States)

    Zhang, Xingyu; Zhang, Tao; Pei, Jiao; Liu, Yuanyuan; Li, Xiaosong; Medrano-Gracia, Pau

    2016-01-01

    The infection rate of syphilis in China has increased dramatically in recent decades, becoming a serious public health concern. Early prediction of syphilis is therefore of great importance for health planning and management. In this paper, we analyzed surveillance time series data for primary, secondary, tertiary, congenital and latent syphilis in mainland China from 2005 to 2012. Seasonality and long-term trend were explored with decomposition methods. Autoregressive integrated moving average (ARIMA) was used to fit a univariate time series model of syphilis incidence. A separate multi-variable time series for each syphilis type was also tested using an autoregressive integrated moving average model with exogenous variables (ARIMAX). The syphilis incidence rates have increased three-fold from 2005 to 2012. All syphilis time series showed strong seasonality and increasing long-term trend. Both ARIMA and ARIMAX models fitted and estimated syphilis incidence well. All univariate time series showed highest goodness-of-fit results with the ARIMA(0,0,1)×(0,1,1) model. Time series analysis was an effective tool for modelling the historical and future incidence of syphilis in China. The ARIMAX model showed superior performance to the ARIMA model for the modelling of syphilis incidence. Time series correlations existed between the models for primary, secondary, tertiary, congenital and latent syphilis.
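    A hedged sketch of the ARIMA(0,0,1)×(0,1,1) and ARIMAX fits described above, using statsmodels' SARIMAX; the incidence series and the exogenous regressor here are synthetic placeholders, not the surveillance data:

```python
# Seasonal ARIMA and ARIMAX fits on a placeholder monthly incidence series.
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(2)
idx = pd.date_range("2005-01", periods=96, freq="MS")          # 2005-2012, monthly
incidence = pd.Series(np.abs(rng.normal(10, 2, 96)).cumsum() / 10, index=idx)
other_type = incidence.shift(1).bfill()                        # e.g. another syphilis type as exog

# Univariate seasonal ARIMA(0,0,1)x(0,1,1)_12
arima = SARIMAX(incidence, order=(0, 0, 1), seasonal_order=(0, 1, 1, 12)).fit(disp=False)

# ARIMAX: the same structure plus an exogenous regressor
arimax = SARIMAX(incidence, exog=other_type, order=(0, 0, 1),
                 seasonal_order=(0, 1, 1, 12)).fit(disp=False)
print(arima.aic, arimax.aic)
```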

  16. Modelling bursty time series

    International Nuclear Information System (INIS)

    Vajna, Szabolcs; Kertész, János; Tóth, Bálint

    2013-01-01

    Many human-related activities show power-law decaying interevent time distribution with exponents usually varying between 1 and 2. We study a simple task-queuing model, which produces bursty time series due to the non-trivial dynamics of the task list. The model is characterized by a priority distribution as an input parameter, which describes the choice procedure from the list. We give exact results on the asymptotic behaviour of the model and we show that the interevent time distribution is power-law decaying for any kind of input distributions that remain normalizable in the infinite list limit, with exponents tunable between 1 and 2. The model satisfies a scaling law between the exponents of interevent time distribution (β) and autocorrelation function (α): α + β = 2. This law is general for renewal processes with power-law decaying interevent time distribution. We conclude that slowly decaying autocorrelation function indicates long-range dependence only if the scaling law is violated. (paper)

  17. Models for Pooled Time-Series Cross-Section Data

    Directory of Open Access Journals (Sweden)

    Lawrence E Raffalovich

    2015-07-01

    Full Text Available Several models are available for the analysis of pooled time-series cross-section (TSCS) data, defined as “repeated observations on fixed units” (Beck and Katz 1995). In this paper, we run the following models: (1) a completely pooled model, (2) fixed effects models, and (3) multi-level/hierarchical linear models. To illustrate these models, we use a Generalized Least Squares (GLS) estimator with cross-section weights and panel-corrected standard errors (with EViews 8) on the cross-national homicide trends data of forty countries from 1950 to 2005, which we source from published research (Messner et al. 2011). We describe and discuss the similarities and differences between the models, and what information each can contribute to help answer substantive research questions. We conclude with a discussion of how the models we present may help to mitigate validity threats inherent in pooled time-series cross-section data analysis.
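    The following is a simplified sketch of models (1) and (2) on a toy panel (not the authors' analysis): it uses ordinary least squares with country dummies in place of the GLS estimator with cross-section weights and panel-corrected standard errors used in the paper, and all data are invented.

```python
# Completely pooled model versus country fixed effects on a toy panel.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
rows = []
for c in [f"c{i}" for i in range(5)]:
    base = rng.normal(5, 2)                          # country-specific homicide level
    for year in range(1990, 2006):
        x = rng.normal(0, 1)
        rows.append({"country": c, "year": year, "x": x,
                     "homicide": base + 0.5 * x + rng.normal(0, 0.5)})
panel = pd.DataFrame(rows)

pooled = smf.ols("homicide ~ x", data=panel).fit()               # (1) completely pooled
fixed = smf.ols("homicide ~ x + C(country)", data=panel).fit()   # (2) country fixed effects
print(pooled.params["x"], fixed.params["x"])
```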

  18. time series modeling of daily abandoned calls in a call centre

    African Journals Online (AJOL)

    DJFLEX

    Models for evaluating and predicting the short periodic time series in daily ... Ugwuowo (2006) proposed asymmetric angular-linear multivariate regression models, ..... Using the parameter estimates in Table 3, the fitted Fourier series model is ..... For the SARIMA model with the stochastic component also being white noise, ...

  19. Time Series Modelling of Syphilis Incidence in China from 2005 to 2012

    Science.gov (United States)

    Zhang, Xingyu; Zhang, Tao; Pei, Jiao; Liu, Yuanyuan; Li, Xiaosong; Medrano-Gracia, Pau

    2016-01-01

    Background The infection rate of syphilis in China has increased dramatically in recent decades, becoming a serious public health concern. Early prediction of syphilis is therefore of great importance for health planning and management. Methods In this paper, we analyzed surveillance time series data for primary, secondary, tertiary, congenital and latent syphilis in mainland China from 2005 to 2012. Seasonality and long-term trend were explored with decomposition methods. Autoregressive integrated moving average (ARIMA) was used to fit a univariate time series model of syphilis incidence. A separate multi-variable time series for each syphilis type was also tested using an autoregressive integrated moving average model with exogenous variables (ARIMAX). Results The syphilis incidence rates have increased three-fold from 2005 to 2012. All syphilis time series showed strong seasonality and increasing long-term trend. Both ARIMA and ARIMAX models fitted and estimated syphilis incidence well. All univariate time series showed highest goodness-of-fit results with the ARIMA(0,0,1)×(0,1,1) model. Conclusion Time series analysis was an effective tool for modelling the historical and future incidence of syphilis in China. The ARIMAX model showed superior performance to the ARIMA model for the modelling of syphilis incidence. Time series correlations existed between the models for primary, secondary, tertiary, congenital and latent syphilis. PMID:26901682

  20. Hidden Markov Models for Time Series An Introduction Using R

    CERN Document Server

    Zucchini, Walter

    2009-01-01

    Illustrates the flexibility of HMMs as general-purpose models for time series data. This work presents an overview of HMMs for analyzing time series data, from continuous-valued, circular, and multivariate series to binary data, bounded and unbounded counts and categorical observations.

  1. Fourier series

    CERN Document Server

    Tolstov, Georgi P

    1962-01-01

    Richard A. Silverman's series of translations of outstanding Russian textbooks and monographs is well-known to people in the fields of mathematics, physics, and engineering. The present book is another excellent text from this series, a valuable addition to the English-language literature on Fourier series.This edition is organized into nine well-defined chapters: Trigonometric Fourier Series, Orthogonal Systems, Convergence of Trigonometric Fourier Series, Trigonometric Series with Decreasing Coefficients, Operations on Fourier Series, Summation of Trigonometric Fourier Series, Double Fourie

  2. Reactions of 3d-series metallocenes with organic cadmium compounds

    International Nuclear Information System (INIS)

    Razuvaev, G.A.; Mar'in, V.P.; Vyshinskaya, L.I.; Grinval'd, I.I.; Spiridonova, N.N.

    1987-01-01

    The interaction of organic cadmium compounds with 3d-series metallocenes, Cp2M (M=V, Cr, Mn, Ni, Co), has been studied. It is shown that the direction of these reactions is determined by the nature of the metallocene. Oxidative addition reactions leading to the formation of σ-complexes are characteristic of vanadium and chromium metallocenes. When cobaltocene reacts with R2Cd, introduction of the R group into the cyclopentadienyl ring and elimination of cobalt diene complexes take place. The interaction of manganocene and nickelocene proceeds through the stage of complex formation with a transition metal-cadmium bond

  3. Short-Term Bus Passenger Demand Prediction Based on Time Series Model and Interactive Multiple Model Approach

    Directory of Open Access Journals (Sweden)

    Rui Xue

    2015-01-01

    Full Text Available Although bus passenger demand prediction has attracted increased attention during recent years, limited research has been conducted in the context of short-term passenger demand forecasting. This paper proposes an interactive multiple model (IMM) filter algorithm-based model to predict short-term passenger demand. After being aggregated into 15-min intervals, passenger demand data collected from a busy bus route over four months were used to generate time series. Considering that passenger demand exhibits various characteristics on different time scales, three time series were developed, named the weekly, daily, and 15-min time series. After the correlation, periodicity, and stationarity analyses, time series models were constructed. In particular, the heteroscedasticity of the time series was explored to achieve better prediction performance. Finally, the IMM filter algorithm was applied to combine the individual forecasting models and dynamically predict passenger demand for the next interval. Different error indices were adopted for the analyses of the individual and hybrid models. The performance comparison indicates that hybrid model forecasts are superior to individual ones in accuracy. Findings of this study are of theoretical and practical significance in bus scheduling.

  4. Small Sample Properties of Bayesian Multivariate Autoregressive Time Series Models

    Science.gov (United States)

    Price, Larry R.

    2012-01-01

    The aim of this study was to compare the small sample (N = 1, 3, 5, 10, 15) performance of a Bayesian multivariate vector autoregressive (BVAR-SEM) time series model relative to frequentist power and parameter estimation bias. A multivariate autoregressive model was developed based on correlated autoregressive time series vectors of varying…

  5. Estimating High-Dimensional Time Series Models

    DEFF Research Database (Denmark)

    Medeiros, Marcelo C.; Mendes, Eduardo F.

    We study the asymptotic properties of the Adaptive LASSO (adaLASSO) in sparse, high-dimensional, linear time-series models. We assume both the number of covariates in the model and candidate variables can increase with the number of observations and the number of candidate variables is, possibly......, larger than the number of observations. We show the adaLASSO consistently chooses the relevant variables as the number of observations increases (model selection consistency), and has the oracle property, even when the errors are non-Gaussian and conditionally heteroskedastic. A simulation study shows...

  6. Predicting long-term catchment nutrient export: the use of nonlinear time series models

    Science.gov (United States)

    Valent, Peter; Howden, Nicholas J. K.; Szolgay, Jan; Komornikova, Magda

    2010-05-01

    After the Second World War the nitrate concentrations in European water bodies changed significantly as the result of increased nitrogen fertilizer use and changes in land use. However, in recent decades, as a consequence of the implementation of nitrate-reducing measures in Europe, the nitrate concentrations in water bodies have slowly decreased. As a result, the mean and variance of the observed time series also change with time (nonstationarity and heteroscedasticity). In order to detect changes and properly describe the behaviour of such time series by time series analysis, linear models (such as autoregressive (AR), moving average (MA) and autoregressive moving average (ARMA) models) are no longer suitable. Time series with sudden changes in statistical characteristics can cause various problems in the calibration of traditional water quality models and thus give biased predictions. Proper statistical analysis of these non-stationary and heteroscedastic time series, with the aim of detecting and subsequently explaining the variations in their statistical characteristics, requires the use of nonlinear time series models. This information can then be used to improve the model building and calibration of conceptual water quality models, or to select the right calibration periods in order to produce reliable predictions. The objective of this contribution is to analyze two long time series of nitrate concentrations of the rivers Ouse and Stour with advanced nonlinear statistical modelling techniques and compare their performance with traditional linear models of the ARMA class in order to identify changes in the time series characteristics. The time series were analysed with nonlinear models with multiple regimes represented by self-exciting threshold autoregressive (SETAR) and Markov-switching models (MSW). The analysis showed that, based on the value of residual sum of squares (RSS) in both datasets, SETAR and MSW models described the time-series better than models of the
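    A minimal sketch of a two-regime SETAR fit of the kind referred to above (not the authors' code): the threshold is chosen by a grid search that minimises the combined residual sum of squares, and the nitrate series is replaced by a simulated two-regime process.

```python
# Two-regime SETAR(2; 1, 1) fit by grid search over candidate thresholds.
import numpy as np

def fit_ar1(y_t, y_lag):
    """OLS fit of y_t = a + b * y_lag, returning (params, residual sum of squares)."""
    X = np.column_stack([np.ones_like(y_lag), y_lag])
    beta, *_ = np.linalg.lstsq(X, y_t, rcond=None)
    return beta, np.sum((y_t - X @ beta) ** 2)

def fit_setar(y, candidates):
    y_t, y_lag = y[1:], y[:-1]
    best = None
    for thr in candidates:
        low, high = y_lag <= thr, y_lag > thr
        if low.sum() < 10 or high.sum() < 10:        # require enough points in each regime
            continue
        _, rss_low = fit_ar1(y_t[low], y_lag[low])
        _, rss_high = fit_ar1(y_t[high], y_lag[high])
        rss = rss_low + rss_high
        if best is None or rss < best[1]:
            best = (thr, rss)
    return best

rng = np.random.default_rng(4)
y = np.zeros(500)
for t in range(1, 500):                              # simulate a two-regime AR process
    y[t] = (0.3 if y[t - 1] <= 0 else -0.6) * y[t - 1] + rng.normal()

print("estimated threshold, RSS:",
      fit_setar(y, np.quantile(y, np.linspace(0.15, 0.85, 50))))
```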

  7. The use of synthetic input sequences in time series modeling

    International Nuclear Information System (INIS)

    Oliveira, Dair Jose de; Letellier, Christophe; Gomes, Murilo E.D.; Aguirre, Luis A.

    2008-01-01

    In many situations time series models obtained from noise-like data settle to trivial solutions under iteration. This Letter proposes a way of producing a synthetic (dummy) input, that is included to prevent the model from settling down to a trivial solution, while maintaining features of the original signal. Simulated benchmark models and a real time series of RR intervals from an ECG are used to illustrate the procedure

  8. Time Series Modelling using Proc Varmax

    DEFF Research Database (Denmark)

    Milhøj, Anders

    2007-01-01

    In this paper it is demonstrated how various time series problems can be addressed using Proc Varmax. The procedure is rather new and hence new features like cointegration and testing for Granger causality are included, but it also means that more traditional ARIMA modelling as outlined by Box...

  9. Time-series modeling: applications to long-term finfish monitoring data

    International Nuclear Information System (INIS)

    Bireley, L.E.

    1985-01-01

    The growing concern and awareness that developed during the 1970's over the effects that industry had on the environment caused the electric utility industry in particular to develop monitoring programs. These programs generate long-term series of data that are not very amenable to classical normal-theory statistical analysis. The monitoring data collected from three finfish programs (impingement, trawl and seine) at the Millstone Nuclear Power Station were typical of such series and thus were used to develop methodology that used the full extent of the information in the series. The basis of the methodology was classic Box-Jenkins time-series modeling; however, the models also included deterministic components that involved flow, season and time as predictor variables. Time entered into the models as harmonic regression terms. Of the 32 models fitted to finfish catch data, 19 were found to account for more than 70% of the historical variation. The models were then used to forecast finfish catches a year in advance and comparisons were made to actual data. Usually the confidence intervals associated with the forecasts encompassed most of the observed data. The technique can provide the basis for intervention analysis in future impact assessments

  10. Evaluation of nonlinearity and validity of nonlinear modeling for complex time series.

    Science.gov (United States)

    Suzuki, Tomoya; Ikeguchi, Tohru; Suzuki, Masuo

    2007-10-01

    Even if an original time series exhibits nonlinearity, it is not always effective to approximate the time series by a nonlinear model because such nonlinear models have high complexity from the viewpoint of information criteria. Therefore, we propose two measures to evaluate both the nonlinearity of a time series and validity of nonlinear modeling applied to it by nonlinear predictability and information criteria. Through numerical simulations, we confirm that the proposed measures effectively detect the nonlinearity of an observed time series and evaluate the validity of the nonlinear model. The measures are also robust against observational noises. We also analyze some real time series: the difference of the number of chickenpox and measles patients, the number of sunspots, five Japanese vowels, and the chaotic laser. We can confirm that the nonlinear model is effective for the Japanese vowel /a/, the difference of the number of measles patients, and the chaotic laser.

  11. PSO-MISMO modeling strategy for multistep-ahead time series prediction.

    Science.gov (United States)

    Bao, Yukun; Xiong, Tao; Hu, Zhongyi

    2014-05-01

    Multistep-ahead time series prediction is one of the most challenging research topics in the field of time series modeling and prediction, and is continually under research. Recently, the multiple-input several multiple-outputs (MISMO) modeling strategy has been proposed as a promising alternative for multistep-ahead time series prediction, exhibiting advantages compared with the two currently dominating strategies, the iterated and the direct strategies. Built on the established MISMO strategy, this paper proposes a particle swarm optimization (PSO)-based MISMO modeling strategy, which is capable of determining the number of sub-models in a self-adaptive mode, with varying prediction horizons. Rather than deriving crisp divides with equal-sized prediction horizons from the established MISMO, the proposed PSO-MISMO strategy, implemented with neural networks, employs a heuristic to create flexible divides with varying sizes of prediction horizons and to generate corresponding sub-models, providing considerable flexibility in model construction, which has been validated with simulated and real datasets.

  12. On modeling panels of time series

    NARCIS (Netherlands)

    Ph.H.B.F. Franses (Philip Hans)

    2002-01-01

    textabstractThis paper reviews research issues in modeling panels of time series. Examples of this type of data are annually observed macroeconomic indicators for all countries in the world, daily returns on the individual stocks listed in the S&P500, and the sales records of all items in a

  13. Towards a paradigm shift in the modeling of soil organic carbon decomposition for earth system models

    Science.gov (United States)

    He, Yujie

    Soils are the largest terrestrial carbon pools and contain approximately 2200 Pg of carbon. Thus, the dynamics of soil carbon plays an important role in the global carbon cycle and climate system. Earth System Models are used to project future interactions between terrestrial ecosystem carbon dynamics and climate. However, these models often predict a wide range of soil carbon responses and their formulations have lagged behind recent soil science advances, omitting key biogeochemical mechanisms. In contrast, recent mechanistically-based biogeochemical models that explicitly account for microbial biomass pools and enzyme kinetics that catalyze soil carbon decomposition produce notably different results and provide a closer match to recent observations. However, a systematic evaluation of the advantages and disadvantages of the microbial models and how they differ from empirical, first-order formulations in soil decomposition models for soil organic carbon is still needed. This dissertation consists of a series of model sensitivity and uncertainty analyses and identifies dominant decomposition processes in determining soil organic carbon dynamics. Poorly constrained processes or parameters that require more experimental data integration are also identified. This dissertation also demonstrates the critical role of microbial life-history traits (e.g. microbial dormancy) in the modeling of microbial activity in soil organic matter decomposition models. Finally, this study surveys and synthesizes a number of recently published microbial models and provides suggestions for future microbial model developments.

  14. New insights into soil temperature time series modeling: linear or nonlinear?

    Science.gov (United States)

    Bonakdari, Hossein; Moeeni, Hamid; Ebtehaj, Isa; Zeynoddin, Mohammad; Mahoammadian, Abdolmajid; Gharabaghi, Bahram

    2018-03-01

    Soil temperature (ST) is an important dynamic parameter whose prediction is a major research topic in various fields, including agriculture, because ST plays a critical role in hydrological processes at the soil surface. In this study, a new linear methodology is proposed based on stochastic methods for modeling daily soil temperature (DST). With this approach, the ST series components are determined to carry out modeling and spectral analysis. The results of this process are compared with two linear methods based on seasonal standardization and seasonal differencing in terms of four DST series. The series used in this study were measured at two stations, Champaign and Springfield, at depths of 10 and 20 cm. The results indicate that in all ST series reviewed, the periodic term is the most robust among all components. According to a comparison of the three methods applied to analyze the various series components, it appears that spectral analysis combined with stochastic methods outperformed the seasonal standardization and seasonal differencing methods. In addition to comparing the proposed methodology with linear methods, the ST modeling results were compared with two nonlinear methods in two forms: considering hydrological variables (HV) as input variables, and DST modeling as a time series. In a previous study at the mentioned sites, Kim and Singh (Theor Appl Climatol 118:465-479, 2014) applied the popular Multilayer Perceptron (MLP) neural network and Adaptive Neuro-Fuzzy Inference System (ANFIS) nonlinear methods and considered HV as input variables. The comparison results signify that the relative error projected in estimating DST by the proposed methodology was about 6%, while this value with MLP and ANFIS was over 15%. Moreover, MLP and ANFIS models were employed for DST time series modeling. Due to these models' relatively inferior performance compared to the proposed methodology, two hybrid models were implemented: the weights and membership function of MLP and

  15. High-temperature series expansions for random Potts models

    Directory of Open Access Journals (Sweden)

    M.Hellmund

    2005-01-01

    Full Text Available We discuss recently generated high-temperature series expansions for the free energy and the susceptibility of random-bond q-state Potts models on hypercubic lattices. Using the star-graph expansion technique, quenched disorder averages can be calculated exactly for arbitrary uncorrelated coupling distributions while keeping the disorder strength p as well as the dimension d as symbolic parameters. We present analyses of the new series for the susceptibility of the Ising (q=2) and 4-state Potts models in three dimensions up to order 19 and 18, respectively, and compare our findings with results from field-theoretical renormalization group studies and Monte Carlo simulations.

  16. Dissolved organic nitrogen dynamics in the North Sea: A time series analysis (1995-2005)

    NARCIS (Netherlands)

    Van Engeland, T.; Soetaert, K.E.R.; Knuijt, A.; Laane, R.W.P.M.; Middelburg, J.J.

    2010-01-01

    Dissolved organic nitrogen (DON) dynamics in the North Sea was explored by means of long-term time series of nitrogen parameters from the Dutch national monitoring program. Generally, the data quality was good with few missing data points. Different imputation methods were used to verify the

  17. Time series regression model for infectious disease and weather.

    Science.gov (United States)

    Imai, Chisato; Armstrong, Ben; Chalabi, Zaid; Mangtani, Punam; Hashizume, Masahiro

    2015-10-01

    Time series regression has been developed and long used to evaluate the short-term associations of air pollution and weather with mortality or morbidity of non-infectious diseases. The application of the regression approaches from this tradition to infectious diseases, however, is less well explored and raises some new issues. We discuss and present potential solutions for five issues often arising in such analyses: changes in immune population, strong autocorrelations, a wide range of plausible lag structures and association patterns, seasonality adjustments, and large overdispersion. The potential approaches are illustrated with datasets of cholera cases and rainfall from Bangladesh and influenza and temperature in Tokyo. Though this article focuses on the application of the traditional time series regression to infectious diseases and weather factors, we also briefly introduce alternative approaches, including mathematical modeling, wavelet analysis, and autoregressive integrated moving average (ARIMA) models. Modifications proposed to standard time series regression practice include using sums of past cases as proxies for the immune population, and using the logarithm of lagged disease counts to control autocorrelation due to true contagion, both of which are motivated from "susceptible-infectious-recovered" (SIR) models. The complexity of lag structures and association patterns can often be informed by biological mechanisms and explored by using distributed lag non-linear models. For overdispersed models, alternative distribution models such as quasi-Poisson and negative binomial should be considered. Time series regression can be used to investigate dependence of infectious diseases on weather, but may need modifying to allow for features specific to this context. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
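    As a rough, hedged illustration of the modified time series regression described above (not the authors' code): disease counts are regressed on lagged weather with the logarithm of lagged cases as a covariate, using a negative binomial GLM to allow for overdispersion; the data and lag choices are purely illustrative.

```python
# Count regression with lagged weather and lagged log-cases, using statsmodels.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 200
rain = rng.gamma(2.0, 10.0, n)                                  # weekly rainfall (synthetic)
cases = rng.poisson(20, n).astype(float)                        # weekly case counts (synthetic)

df = pd.DataFrame({
    "cases": cases,
    "rain_lag2": pd.Series(rain).shift(2),                      # weather at a 2-week lag
    "log_cases_lag1": np.log(pd.Series(cases).shift(1) + 1),    # proxy for contagion/immunity
}).dropna()

X = sm.add_constant(df[["rain_lag2", "log_cases_lag1"]])
model = sm.GLM(df["cases"], X, family=sm.families.NegativeBinomial()).fit()
print(model.summary().tables[1])
```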

  18. Regional modeling of carbonaceous aerosols over Europe-focus on secondary organic aerosols

    International Nuclear Information System (INIS)

    Bessagnet, B.; Menut, L.; Curci, G.; Hodzic, A.; Guillaume, B.; Liousse, C.; Moukhtar, S.; Pun, B.; Seigneur, C.; Schulz, M.

    2008-01-01

    In this study, an improved and complete secondary organic aerosols (SOA) chemistry scheme was implemented in the CHIMERE model. The implementation of isoprene chemistry for SOA significantly improves agreement between long series of simulated and observed particulate matter concentrations. While simulated organic carbon concentrations are clearly improved at elevated sites by adding the SOA scheme, time correlations are impaired at low-level sites in Portugal, Italy and Slovakia. At several sites a clear underestimation by the CHIMERE model is noticed in wintertime, possibly due to missing wood burning emissions as shown in previous modeling studies. In Europe, the CHIMERE model gives yearly average SOA concentrations ranging from 0.5 μg m⁻³ in Northern Europe to 4 μg m⁻³ over forested regions in Spain, France, Germany and Italy. In addition, our work suggests that during the highest fire emission periods, fires can be the dominant source of primary organic carbon over the Mediterranean Basin, but the SOA contribution from fire emissions is low. Isoprene chemistry has a strong impact on SOA formation when using currently available kinetic schemes. (authors)

  19. Clustering gene expression time series data using an infinite Gaussian process mixture model.

    Science.gov (United States)

    McDowell, Ian C; Manandhar, Dinesh; Vockley, Christopher M; Schmid, Amy K; Reddy, Timothy E; Engelhardt, Barbara E

    2018-01-01

    Transcriptome-wide time series expression profiling is used to characterize the cellular response to environmental perturbations. The first step to analyzing transcriptional response data is often to cluster genes with similar responses. Here, we present a nonparametric model-based method, Dirichlet process Gaussian process mixture model (DPGP), which jointly models data clusters with a Dirichlet process and temporal dependencies with Gaussian processes. We demonstrate the accuracy of DPGP in comparison to state-of-the-art approaches using hundreds of simulated data sets. To further test our method, we apply DPGP to published microarray data from a microbial model organism exposed to stress and to novel RNA-seq data from a human cell line exposed to the glucocorticoid dexamethasone. We validate our clusters by examining local transcription factor binding and histone modifications. Our results demonstrate that jointly modeling cluster number and temporal dependencies can reveal shared regulatory mechanisms. DPGP software is freely available online at https://github.com/PrincetonUniversity/DP_GP_cluster.

  20. Clustering gene expression time series data using an infinite Gaussian process mixture model.

    Directory of Open Access Journals (Sweden)

    Ian C McDowell

    2018-01-01

    Full Text Available Transcriptome-wide time series expression profiling is used to characterize the cellular response to environmental perturbations. The first step to analyzing transcriptional response data is often to cluster genes with similar responses. Here, we present a nonparametric model-based method, Dirichlet process Gaussian process mixture model (DPGP), which jointly models data clusters with a Dirichlet process and temporal dependencies with Gaussian processes. We demonstrate the accuracy of DPGP in comparison to state-of-the-art approaches using hundreds of simulated data sets. To further test our method, we apply DPGP to published microarray data from a microbial model organism exposed to stress and to novel RNA-seq data from a human cell line exposed to the glucocorticoid dexamethasone. We validate our clusters by examining local transcription factor binding and histone modifications. Our results demonstrate that jointly modeling cluster number and temporal dependencies can reveal shared regulatory mechanisms. DPGP software is freely available online at https://github.com/PrincetonUniversity/DP_GP_cluster.

  1. Rotation in the dynamic factor modeling of multivariate stationary time series.

    NARCIS (Netherlands)

    Molenaar, P.C.M.; Nesselroade, J.R.

    2001-01-01

    A special rotation procedure is proposed for the exploratory dynamic factor model for stationary multivariate time series. The rotation procedure applies separately to each univariate component series of a q-variate latent factor series and transforms such a component, initially represented as white

  2. Time series analysis as input for clinical predictive modeling: modeling cardiac arrest in a pediatric ICU.

    Science.gov (United States)

    Kennedy, Curtis E; Turley, James P

    2011-10-24

    Thousands of children experience cardiac arrest events every year in pediatric intensive care units. Most of these children die. Cardiac arrest prediction tools are used as part of medical emergency team evaluations to identify patients in standard hospital beds that are at high risk for cardiac arrest. There are no models to predict cardiac arrest in pediatric intensive care units though, where the risk of an arrest is 10 times higher than for standard hospital beds. Current tools are based on a multivariable approach that does not characterize deterioration, which often precedes cardiac arrests. Characterizing deterioration requires a time series approach. The purpose of this study is to propose a method that will allow for time series data to be used in clinical prediction models. Successful implementation of these methods has the potential to bring arrest prediction to the pediatric intensive care environment, possibly allowing for interventions that can save lives and prevent disabilities. We reviewed prediction models from nonclinical domains that employ time series data, and identified the steps that are necessary for building predictive models using time series clinical data. We illustrate the method by applying it to the specific case of building a predictive model for cardiac arrest in a pediatric intensive care unit. Time course analysis studies from genomic analysis provided a modeling template that was compatible with the steps required to develop a model from clinical time series data. The steps include: 1) selecting candidate variables; 2) specifying measurement parameters; 3) defining data format; 4) defining time window duration and resolution; 5) calculating latent variables for candidate variables not directly measured; 6) calculating time series features as latent variables; 7) creating data subsets to measure model performance effects attributable to various classes of candidate variables; 8) reducing the number of candidate features; 9

  3. Forecasting daily meteorological time series using ARIMA and regression models

    Science.gov (United States)

    Murat, Małgorzata; Malinowska, Iwona; Gos, Magdalena; Krzyszczak, Jaromir

    2018-04-01

    The daily air temperature and precipitation time series recorded between January 1, 1980 and December 31, 2010 at four European sites (Jokioinen, Dikopshof, Lleida and Lublin) from different climatic zones were modeled and forecasted. In our forecasting we used the Box-Jenkins and Holt-Winters seasonal autoregressive integrated moving-average methods, the autoregressive integrated moving-average with external regressors in the form of Fourier terms, and time series regression including trend and seasonality components, implemented with R software. It was demonstrated that the obtained models are able to capture the dynamics of the time series data and to produce sensible forecasts.
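    A hedged sketch of one of the approaches mentioned above, an ARIMA model with Fourier terms as external regressors (not the authors' code; the daily temperature series, the ARIMA order and the number of harmonics K are assumptions made for the example):

```python
# ARIMA with Fourier-term external regressors for an annual cycle in daily data.
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(6)
n = 3 * 365
day = np.arange(n)
temp = 10 + 8 * np.sin(2 * np.pi * day / 365.25) + rng.normal(0, 2, n)   # synthetic temperatures

def fourier_terms(t, period, K):
    """K sine/cosine pairs at the given period, stacked as regressor columns."""
    return np.column_stack([f(2 * np.pi * k * t / period)
                            for k in range(1, K + 1) for f in (np.sin, np.cos)])

exog = fourier_terms(day, 365.25, K=2)
model = SARIMAX(temp, exog=exog, order=(1, 0, 1)).fit(disp=False)

future = fourier_terms(np.arange(n, n + 30), 365.25, K=2)        # 30-day-ahead forecast
forecast = model.forecast(steps=30, exog=future)
print(np.round(forecast[:5], 2))
```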

  4. Hierarchical Hidden Markov Models for Multivariate Integer-Valued Time-Series

    DEFF Research Database (Denmark)

    Catania, Leopoldo; Di Mari, Roberto

    2018-01-01

    We propose a new flexible dynamic model for multivariate nonnegative integer-valued time-series. Observations are assumed to depend on the realization of two additional unobserved integer-valued stochastic variables which control for the time- and cross-dependence of the data. An Expectation......-Maximization algorithm for maximum likelihood estimation of the model's parameters is derived. We provide conditional and unconditional (cross)-moments implied by the model, as well as the limiting distribution of the series. A Monte Carlo experiment investigates the finite sample properties of our estimation...

  5. Modelling Changes in the Unconditional Variance of Long Stock Return Series

    DEFF Research Database (Denmark)

    Amado, Cristina; Teräsvirta, Timo

    In this paper we develop a testing and modelling procedure for describing the long-term volatility movements over very long return series. For the purpose, we assume that volatility is multiplicatively decomposed into a conditional and an unconditional component as in Amado and Teräsvirta (2011...... show that the long-memory property in volatility may be explained by ignored changes in the unconditional variance of the long series. Finally, based on a formal statistical test we find evidence of the superiority of volatility forecast accuracy of the new model over the GJR-GARCH model at all...... horizons for a subset of the long return series....

  6. Modelling changes in the unconditional variance of long stock return series

    DEFF Research Database (Denmark)

    Amado, Cristina; Teräsvirta, Timo

    2014-01-01

    In this paper we develop a testing and modelling procedure for describing the long-term volatility movements over very long daily return series. For this purpose we assume that volatility is multiplicatively decomposed into a conditional and an unconditional component as in Amado and Teräsvirta...... that the apparent long memory property in volatility may be interpreted as changes in the unconditional variance of the long series. Finally, based on a formal statistical test we find evidence of the superiority of volatility forecasting accuracy of the new model over the GJR-GARCH model at all horizons for eight...... subsets of the long return series....

  7. Small-signal model for the series resonant converter

    Science.gov (United States)

    King, R. J.; Stuart, T. A.

    1985-01-01

    The results of a previous discrete-time model of the series resonant dc-dc converter are reviewed and from these a small signal dynamic model is derived. This model is valid for low frequencies and is based on the modulation of the diode conduction angle for control. The basic converter is modeled separately from its output filter to facilitate the use of these results for design purposes. Experimental results are presented.

  8. Rotation in the Dynamic Factor Modeling of Multivariate Stationary Time Series.

    Science.gov (United States)

    Molenaar, Peter C. M.; Nesselroade, John R.

    2001-01-01

    Proposes a special rotation procedure for the exploratory dynamic factor model for stationary multivariate time series. The rotation procedure applies separately to each univariate component series of a q-variate latent factor series and transforms such a component, initially represented as white noise, into a univariate moving-average.…

  9. Recursive Bayesian recurrent neural networks for time-series modeling.

    Science.gov (United States)

    Mirikitani, Derrick T; Nikolaev, Nikolay

    2010-02-01

    This paper develops a probabilistic approach to recursive second-order training of recurrent neural networks (RNNs) for improved time-series modeling. A general recursive Bayesian Levenberg-Marquardt algorithm is derived to sequentially update the weights and the covariance (Hessian) matrix. The main strengths of the approach are a principled handling of the regularization hyperparameters that leads to better generalization, and stable numerical performance. The framework involves the adaptation of a noise hyperparameter and local weight prior hyperparameters, which represent the noise in the data and the uncertainties in the model parameters. Experimental investigations using artificial and real-world data sets show that RNNs equipped with the proposed approach outperform standard real-time recurrent learning and extended Kalman training algorithms for recurrent networks, as well as other contemporary nonlinear neural models, on time-series modeling.

  10. Model-based Clustering of Categorical Time Series with Multinomial Logit Classification

    Science.gov (United States)

    Frühwirth-Schnatter, Sylvia; Pamminger, Christoph; Winter-Ebmer, Rudolf; Weber, Andrea

    2010-09-01

    A common problem in many areas of applied statistics is to identify groups of similar time series in a panel of time series. However, distance-based clustering methods cannot easily be extended to time series data, where an appropriate distance-measure is rather difficult to define, particularly for discrete-valued time series. Markov chain clustering, proposed by Pamminger and Frühwirth-Schnatter [6], is an approach for clustering discrete-valued time series obtained by observing a categorical variable with several states. This model-based clustering method is based on finite mixtures of first-order time-homogeneous Markov chain models. In order to further explain group membership we present an extension to the approach of Pamminger and Frühwirth-Schnatter [6] by formulating a probabilistic model for the latent group indicators within the Bayesian classification rule by using a multinomial logit model. The parameters are estimated for a fixed number of clusters within a Bayesian framework using a Markov chain Monte Carlo (MCMC) sampling scheme representing a (full) Gibbs-type sampler which involves only draws from standard distributions. Finally, an application to a panel of Austrian wage mobility data is presented which leads to an interesting segmentation of the Austrian labour market.

  11. New Models for Forecasting Enrollments: Fuzzy Time Series and Neural Network Approaches.

    Science.gov (United States)

    Song, Qiang; Chissom, Brad S.

    Since university enrollment forecasting is very important, many different methods and models have been proposed by researchers. Two new methods for enrollment forecasting are introduced: (1) the fuzzy time series model; and (2) the artificial neural networks model. Fuzzy time series has been proposed to deal with forecasting problems within a…

  12. Mathematical Modeling and Dynamic Simulation of Metabolic Reaction Systems Using Metabolome Time Series Data

    Directory of Open Access Journals (Sweden)

    Kansuporn Sriyudthsak

    2016-05-01

    Full Text Available The high-throughput acquisition of metabolome data is greatly anticipated for the complete understanding of cellular metabolism in living organisms. A variety of analytical technologies have been developed to acquire large-scale metabolic profiles under different biological or environmental conditions. Time series data are useful for predicting the most likely metabolic pathways because they provide important information regarding the accumulation of metabolites, which implies causal relationships in the metabolic reaction network. Considerable effort has been undertaken to utilize these data for constructing a mathematical model merging system properties and quantitatively characterizing a whole metabolic system in toto. However, there are technical difficulties between benchmarking the provision and utilization of data. Although hundreds of metabolites can be measured, which provide information on the metabolic reaction system, simultaneous measurement of thousands of metabolites is still challenging. In addition, it is nontrivial to logically predict the dynamic behaviors of unmeasurable metabolite concentrations without sufficient information on the metabolic reaction network. Yet, consolidating the advantages of advancements in both metabolomics and mathematical modeling remain to be accomplished. This review outlines the conceptual basis of and recent advances in technologies in both the research fields. It also highlights the potential for constructing a large-scale mathematical model by estimating model parameters from time series metabolome data in order to comprehensively understand metabolism at the systems level.

  13. Mathematical Modeling and Dynamic Simulation of Metabolic Reaction Systems Using Metabolome Time Series Data.

    Science.gov (United States)

    Sriyudthsak, Kansuporn; Shiraishi, Fumihide; Hirai, Masami Yokota

    2016-01-01

    The high-throughput acquisition of metabolome data is greatly anticipated for the complete understanding of cellular metabolism in living organisms. A variety of analytical technologies have been developed to acquire large-scale metabolic profiles under different biological or environmental conditions. Time series data are useful for predicting the most likely metabolic pathways because they provide important information regarding the accumulation of metabolites, which implies causal relationships in the metabolic reaction network. Considerable effort has been undertaken to utilize these data for constructing a mathematical model merging system properties and quantitatively characterizing a whole metabolic system in toto. However, there are technical difficulties between benchmarking the provision and utilization of data. Although hundreds of metabolites can be measured, which provide information on the metabolic reaction system, simultaneous measurement of thousands of metabolites is still challenging. In addition, it is nontrivial to logically predict the dynamic behaviors of unmeasurable metabolite concentrations without sufficient information on the metabolic reaction network. Yet, consolidating the advantages of advancements in both metabolomics and mathematical modeling remain to be accomplished. This review outlines the conceptual basis of and recent advances in technologies in both the research fields. It also highlights the potential for constructing a large-scale mathematical model by estimating model parameters from time series metabolome data in order to comprehensively understand metabolism at the systems level.

  14. Degeneracy of time series models: The best model is not always the correct model

    International Nuclear Information System (INIS)

    Judd, Kevin; Nakamura, Tomomichi

    2006-01-01

    There are a number of good techniques for finding, in some sense, the best model of a deterministic system given a time series of observations. We examine a problem called model degeneracy, which has the consequence that even when a perfect model of a system exists, one does not find it using the best techniques currently available. The problem is illustrated using global polynomial models and the theory of Groebner bases

  15. A Personalized Predictive Framework for Multivariate Clinical Time Series via Adaptive Model Selection.

    Science.gov (United States)

    Liu, Zitao; Hauskrecht, Milos

    2017-11-01

    Building an accurate predictive model of clinical time series for a patient is critical for understanding the patient's condition, its dynamics, and optimal patient management. Unfortunately, this process is not straightforward. First, patient-specific variations are typically large, and population-based models derived or learned from many different patients are often unable to support accurate predictions for each individual patient. Moreover, time series observed for one patient at any point in time may be too short and insufficient to learn a high-quality patient-specific model just from the patient's own data. To address these problems we propose, develop and experiment with a new adaptive forecasting framework for building multivariate clinical time series models for a patient and for supporting patient-specific predictions. The framework relies on an adaptive model switching approach that at any point in time selects the most promising time series model out of the pool of many possible models, and consequently combines advantages of the population, patient-specific and short-term individualized predictive models. We demonstrate that the adaptive model switching framework is a very promising approach to support personalized time series prediction, and that it is able to outperform predictions based on pure population and patient-specific models, as well as other patient-specific model adaptation strategies.
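    The adaptive switching idea can be sketched roughly as follows (not the authors' framework): keep a pool of forecasters, a population-level, a patient-specific and a short-term model, and at each step issue the prediction of the model with the smallest recent error; all three stand-in models and the patient trajectory below are hypothetical.

```python
# Adaptive model switching: pick the forecaster with the smallest recent error.
import numpy as np

def population_model(history):      # stand-in for a model learned from many patients
    return 100.0

def patient_model(history):         # stand-in for a model fit to this patient's past data
    return np.mean(history) if len(history) else 100.0

def short_term_model(history):      # stand-in for a short-horizon individualized model
    return history[-1] if len(history) else 100.0

models = [population_model, patient_model, short_term_model]

rng = np.random.default_rng(7)
series = 90 + np.cumsum(rng.normal(0, 1, 60))        # one patient's lab-value trajectory
recent_err = np.zeros(len(models))                   # exponentially weighted recent errors

history, predictions = [], []
for y in series:
    preds = np.array([m(history) for m in models])
    choice = int(np.argmin(recent_err)) if history else 0
    predictions.append(preds[choice])
    recent_err = 0.8 * recent_err + 0.2 * np.abs(preds - y)   # update every model's score
    history.append(y)

print("mean absolute error:", np.mean(np.abs(np.array(predictions) - series)))
```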

  16. Dynamic model of organic pollutant degradation in three dimensional packed bed electrode reactor.

    Science.gov (United States)

    Pang, Tianting; Wang, Yan; Yang, Hui; Wang, Tianlei; Cai, Wangfeng

    2018-04-21

    A dynamic model of a semi-batch three-dimensional electrode reactor was established based on the limiting current density, Faraday's law, mass balance and a series of assumptions. Semi-batch experiments on phenol degradation were carried out in a three-dimensional electrode reactor packed with activated carbon under different conditions to verify the model. Factors such as the current density, the electrolyte concentration, the initial pH value, the organic flow rate and the initial organic concentration were examined to characterize pollutant degradation in the three-dimensional electrode reactor. The measured phenol concentrations and their logarithms over time were compared with the dynamic model. It was shown that the calculated data were in good agreement with experimental data in most cases. Copyright © 2018 Elsevier Ltd. All rights reserved.

  17. Neural network versus classical time series forecasting models

    Science.gov (United States)

    Nor, Maria Elena; Safuan, Hamizah Mohd; Shab, Noorzehan Fazahiyah Md; Asrul, Mohd; Abdullah, Affendi; Mohamad, Nurul Asmaa Izzati; Lee, Muhammad Hisyam

    2017-05-01

    The artificial neural network (ANN) has an advantage in time series forecasting as it has the potential to solve complex forecasting problems. This is because the ANN is a data-driven approach which can be trained to map past values of a time series. In this study the forecast performance of a neural network and a classical time series forecasting method, namely seasonal autoregressive integrated moving average models, was compared using gold price data. Moreover, the effect of different data preprocessing on the forecast performance of the neural network was examined. The forecast accuracy was evaluated using mean absolute deviation, root mean square error and mean absolute percentage error. It was found that the ANN produced the most accurate forecast when the Box-Cox transformation was used as data preprocessing.
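    A minimal sketch of the neural-network branch with Box-Cox preprocessing (not the authors' code): lagged values of a synthetic gold-price series feed a small MLP, and the transform is inverted before computing the percentage error.

```python
# MLP forecasting with Box-Cox preprocessing on a synthetic price series.
import numpy as np
from scipy.stats import boxcox
from scipy.special import inv_boxcox
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(8)
price = 1200 * np.exp(np.cumsum(rng.normal(0.0005, 0.01, 500)))   # synthetic gold-price series

y_bc, lam = boxcox(price)                       # variance-stabilizing preprocessing
lags = 5
X = np.column_stack([y_bc[i:len(y_bc) - lags + i] for i in range(lags)])
target = y_bc[lags:]

mlp = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
mlp.fit(X[:-50], target[:-50])                  # hold out the last 50 points for testing

pred_bc = mlp.predict(X[-50:])
pred = inv_boxcox(pred_bc, lam)                 # back-transform to the original scale
mape = np.mean(np.abs((price[-50:] - pred) / price[-50:])) * 100
print(f"MAPE on hold-out: {mape:.2f}%")
```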

  18. Modeling financial time series with S-plus

    CERN Document Server

    Zivot, Eric

    2003-01-01

    The field of financial econometrics has exploded over the last decade This book represents an integration of theory, methods, and examples using the S-PLUS statistical modeling language and the S+FinMetrics module to facilitate the practice of financial econometrics This is the first book to show the power of S-PLUS for the analysis of time series data It is written for researchers and practitioners in the finance industry, academic researchers in economics and finance, and advanced MBA and graduate students in economics and finance Readers are assumed to have a basic knowledge of S-PLUS and a solid grounding in basic statistics and time series concepts Eric Zivot is an associate professor and Gary Waterman Distinguished Scholar in the Economics Department at the University of Washington, and is co-director of the nascent Professional Master's Program in Computational Finance He regularly teaches courses on econometric theory, financial econometrics and time series econometrics, and is the recipient of the He...

  19. Nonlinear Prediction Model for Hydrologic Time Series Based on Wavelet Decomposition

    Science.gov (United States)

    Kwon, H.; Khalil, A.; Brown, C.; Lall, U.; Ahn, H.; Moon, Y.

    2005-12-01

    Traditionally, forecasting and characterization of hydrologic systems are performed using many techniques. Stochastic linear methods such as AR and ARIMA, and nonlinear ones such as tools based on statistical learning theory, have been used extensively. The difficulty common to all methods is determining the sufficient and necessary information and predictors for a successful prediction. Relationships between hydrologic variables are often highly nonlinear and interrelated across temporal scales. A new hybrid approach is proposed for the simulation of hydrologic time series that combines the wavelet transform and a nonlinear model, drawing on the merits of both. The wavelet transform is adopted to decompose a hydrologic nonlinear process into a set of mono-component signals, which are then simulated by the nonlinear model. The hybrid methodology is formulated in a manner that improves the accuracy of long-term forecasting. The proposed hybrid model yields much better results in terms of capturing and reproducing the time-frequency properties of the system at hand, and prediction results are promising when compared to traditional univariate time series models. An application illustrating the plausibility of the proposed methodology is provided, and the results show that a wavelet-based time series model can simulate and forecast hydrologic variables reasonably well. This will ultimately serve the purpose of integrated water resources planning and management.
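    A rough sketch of the hybrid idea described above (not the authors' code): the series is decomposed with a discrete wavelet transform, each reconstructed component is modelled with a simple AR model, and the component forecasts are summed; PyWavelets and statsmodels are assumed available, and the flow series, wavelet and AR order are illustrative choices.

```python
# Wavelet decomposition followed by per-component AR forecasting.
import numpy as np
import pywt
from statsmodels.tsa.ar_model import AutoReg

rng = np.random.default_rng(9)
n = 512
flow = 50 + 20 * np.sin(2 * np.pi * np.arange(n) / 64) + np.cumsum(rng.normal(0, 1, n))

level = 3
coeffs = pywt.wavedec(flow, "db4", level=level)

# Reconstruct one mono-component signal per set of coefficients
components = []
for i in range(len(coeffs)):
    keep = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
    components.append(pywt.waverec(keep, "db4")[:n])

# Forecast each component with an AR(8) model and sum the forecasts
horizon = 16
forecast = np.zeros(horizon)
for comp in components:
    ar = AutoReg(comp, lags=8).fit()
    forecast += ar.predict(start=n, end=n + horizon - 1)

print(np.round(forecast[:5], 2))
```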

  20. Multiband Prediction Model for Financial Time Series with Multivariate Empirical Mode Decomposition

    Directory of Open Access Journals (Sweden)

    Md. Rabiul Islam

    2012-01-01

    Full Text Available This paper presents a subband approach to financial time series prediction. Multivariate empirical mode decomposition (MEMD) is employed here for a joint multiband representation of multichannel financial time series. An autoregressive moving average (ARMA) model is used to predict each individual subband of any time series data. All the predicted subband signals are then summed up to obtain the overall prediction. The ARMA model works better for stationary signals; with the multiband representation, each subband becomes a band-limited (narrow band) signal and hence better prediction is achieved. The performance of the proposed MEMD-ARMA model is compared with classical EMD, the discrete wavelet transform (DWT), and the full band ARMA model in terms of the signal-to-noise ratio (SNR) and mean square error (MSE) between the original and predicted time series. The simulation results show that the MEMD-ARMA-based method performs better than the other methods.

  1. Regional modeling of carbonaceous aerosols over Europe-focus on secondary organic aerosols

    Energy Technology Data Exchange (ETDEWEB)

    Bessagnet, B. [INERIS, Inst Nat Env Indust Risques, F-60550 Verneuil en Halatte, (France); Menut, L. [Ecole Polytechnique, Inst Pierre Simon Laplace, Lab Meteorol Dyn, F-91128 Palaiseau, (France); Curci, G. [Univ degli Studi dell' Aquila, CETEMPS, 67010 Coppito - L' Aquila, (Italy); Hodzic, A. [NCAR, Nat Center for Atmosph Research, Boulder, 80301, CO, (United States); Guillaume, B.; Liousse, C. [LA/OMP, Lab Aerol/Observ Midi-Pyrenees, F-31400 Toulouse, (France); Moukhtar, S. [York Univ, Centre Atmosph Chem, Toronto, (Canada); Pun, B.; Seigneur, C. [Atmosph and Environ Research, San Ramon, CA 94583, (United States); Schulz, M. [CEA-CNRS-UVSQ, IPSL, Lab Sciences Climat et Environm, F-91191 Gif sur Yvette, (France)

    2008-07-01

    In this study, an improved and complete secondary organic aerosols (SOA) chemistry scheme was implemented in the CHIMERE model. The implementation of isoprene chemistry for SOA significantly improves agreement between long series of simulated and observed particulate matter concentrations. While simulated organic carbon concentrations are clearly improved at elevated sites by adding the SOA scheme, time correlations are impaired at low-level sites in Portugal, Italy and Slovakia. At several sites a clear underestimation by the CHIMERE model is noticed in wintertime, possibly due to missing wood burning emissions, as shown in previous modeling studies. In Europe, the CHIMERE model gives yearly average SOA concentrations ranging from 0.5 µg m⁻³ in Northern Europe to 4 µg m⁻³ over forested regions in Spain, France, Germany and Italy. In addition, our work suggests that during the highest fire emission periods, fires can be the dominant source of primary organic carbon over the Mediterranean Basin, but the SOA contribution from fire emissions is low. Isoprene chemistry has a strong impact on SOA formation when using currently available kinetic schemes. (authors)

  2. Quality Quandaries- Time Series Model Selection and Parsimony

    DEFF Research Database (Denmark)

    Bisgaard, Søren; Kulahci, Murat

    2009-01-01

    Some of the issues involved in selecting adequate models for time series data are discussed using an example concerning the number of users of an Internet server. The process of selecting an appropriate model is subjective and requires experience and judgment. The authors believe an important consideration in model selection should be parameter parsimony. They favor the use of parsimonious mixed ARMA models, noting that research has shown that a model building strategy that considers only autoregressive representations will lead to non-parsimonious models and to loss of forecasting accuracy.
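
    To make the parsimony argument concrete, here is a minimal sketch (not the authors' code) of comparing low-order mixed ARMA candidates with an information criterion that penalizes extra parameters; the series is simulated stand-in data.

```python
# Compare candidate ARMA(p, q) orders by AIC; `y` is a simulated placeholder
# series, and the selected order is whichever candidate best balances fit
# against parameter count.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
e = rng.normal(size=300)
y = np.zeros(300)
for t in range(1, 300):
    y[t] = 0.6 * y[t - 1] + e[t] + 0.3 * e[t - 1]   # an ARMA(1,1) process

aic = {}
for p in range(3):
    for q in range(3):
        if p + q == 0:
            continue
        aic[(p, q)] = ARIMA(y, order=(p, 0, q)).fit().aic

best = min(aic, key=aic.get)
print("selected (p, q):", best)
```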

  3. Modeling Periodic Impulsive Effects on Online TV Series Diffusion.

    Science.gov (United States)

    Fu, Peihua; Zhu, Anding; Fang, Qiwen; Wang, Xi

    Online broadcasting substantially affects the production, distribution, and profit of TV series. In addition, online word-of-mouth significantly affects the diffusion of TV series. Because on-demand streaming rates are the most important factor that influences the earnings of online video suppliers, streaming statistics and forecasting trends are valuable. In this paper, we investigate the effects of periodic impulsive stimulation and pre-launch promotion on on-demand streaming dynamics. We consider imbalanced audience feverish distribution using an impulsive susceptible-infected-removed(SIR)-like model. In addition, we perform a correlation analysis of online buzz volume based on Baidu Index data. We propose a PI-SIR model to evolve audience dynamics and translate them into on-demand streaming fluctuations, which can be observed and comprehended by online video suppliers. Six South Korean TV series datasets are used to test the model. We develop a coarse-to-fine two-step fitting scheme to estimate the model parameters, first by fitting inter-period accumulation and then by fitting inner-period feverish distribution. We find that audience members display similar viewing habits. That is, they seek new episodes every update day but fade away. This outcome means that impulsive intensity plays a crucial role in on-demand streaming diffusion. In addition, the initial audience size and online buzz are significant factors. On-demand streaming fluctuation is highly correlated with online buzz fluctuation. To stimulate audience attention and interpersonal diffusion, it is worthwhile to invest in promotion near update days. Strong pre-launch promotion is also a good marketing tool to improve overall performance. It is not advisable for online video providers to promote several popular TV series on the same update day. Inter-period accumulation is a feasible forecasting tool to predict the future trend of the on-demand streaming amount. The buzz in public social communities

  4. Modeling Periodic Impulsive Effects on Online TV Series Diffusion.

    Directory of Open Access Journals (Sweden)

    Peihua Fu

    Full Text Available Online broadcasting substantially affects the production, distribution, and profit of TV series. In addition, online word-of-mouth significantly affects the diffusion of TV series. Because on-demand streaming rates are the most important factor that influences the earnings of online video suppliers, streaming statistics and forecasting trends are valuable. In this paper, we investigate the effects of periodic impulsive stimulation and pre-launch promotion on on-demand streaming dynamics. We consider imbalanced audience feverish distribution using an impulsive susceptible-infected-removed (SIR)-like model. In addition, we perform a correlation analysis of online buzz volume based on Baidu Index data. We propose a PI-SIR model to evolve audience dynamics and translate them into on-demand streaming fluctuations, which can be observed and comprehended by online video suppliers. Six South Korean TV series datasets are used to test the model. We develop a coarse-to-fine two-step fitting scheme to estimate the model parameters, first by fitting inter-period accumulation and then by fitting inner-period feverish distribution. We find that audience members display similar viewing habits. That is, they seek new episodes every update day but fade away. This outcome means that impulsive intensity plays a crucial role in on-demand streaming diffusion. In addition, the initial audience size and online buzz are significant factors. On-demand streaming fluctuation is highly correlated with online buzz fluctuation. To stimulate audience attention and interpersonal diffusion, it is worthwhile to invest in promotion near update days. Strong pre-launch promotion is also a good marketing tool to improve overall performance. It is not advisable for online video providers to promote several popular TV series on the same update day. Inter-period accumulation is a feasible forecasting tool to predict the future trend of the on-demand streaming amount. The buzz in public

  5. Modeling Periodic Impulsive Effects on Online TV Series Diffusion

    Science.gov (United States)

    Fang, Qiwen; Wang, Xi

    2016-01-01

    Background Online broadcasting substantially affects the production, distribution, and profit of TV series. In addition, online word-of-mouth significantly affects the diffusion of TV series. Because on-demand streaming rates are the most important factor that influences the earnings of online video suppliers, streaming statistics and forecasting trends are valuable. In this paper, we investigate the effects of periodic impulsive stimulation and pre-launch promotion on on-demand streaming dynamics. We consider imbalanced audience feverish distribution using an impulsive susceptible-infected-removed(SIR)-like model. In addition, we perform a correlation analysis of online buzz volume based on Baidu Index data. Methods We propose a PI-SIR model to evolve audience dynamics and translate them into on-demand streaming fluctuations, which can be observed and comprehended by online video suppliers. Six South Korean TV series datasets are used to test the model. We develop a coarse-to-fine two-step fitting scheme to estimate the model parameters, first by fitting inter-period accumulation and then by fitting inner-period feverish distribution. Results We find that audience members display similar viewing habits. That is, they seek new episodes every update day but fade away. This outcome means that impulsive intensity plays a crucial role in on-demand streaming diffusion. In addition, the initial audience size and online buzz are significant factors. On-demand streaming fluctuation is highly correlated with online buzz fluctuation. Conclusion To stimulate audience attention and interpersonal diffusion, it is worthwhile to invest in promotion near update days. Strong pre-launch promotion is also a good marketing tool to improve overall performance. It is not advisable for online video providers to promote several popular TV series on the same update day. Inter-period accumulation is a feasible forecasting tool to predict the future trend of the on-demand streaming amount

  6. Hybrid pregnant reference phantom series based on adult female ICRP reference phantom

    Science.gov (United States)

    Rafat-Motavalli, Laleh; Miri-Hakimabad, Hashem; Hoseinian-Azghadi, Elie

    2018-03-01

    This paper presents boundary representation (BREP) models of a pregnant female and her fetus at the end of each trimester. The International Commission on Radiological Protection (ICRP) female reference voxel phantom was used as a base template in the development of the pregnant hybrid phantom series. The differences in shape and location of the maternal organs displaced by the enlarging uterus were also taken into account. CT and MR images of fetus specimens and pregnant patients of various ages were used to replace the maternal abdominal and pelvic organs of the template phantom and to insert the fetus inside the gravid uterus. Each fetal model contains 21 different organs and tissues. The skeletal model of the fetus also includes age-dependent cartilaginous and ossified skeletal components. The replaced maternal organ models were converted to NURBS surfaces and then modified to conform to the reference values of ICRP Publication 89. The particular feature of the current series, compared with previously developed pregnant phantoms, is that it is constructed on the basis of the ICRP reference phantom. Because the replaced maternal organ models are NURBS surfaces, they have the potential to be converted to high-quality polygon mesh phantoms.

  7. Identification of neutral biochemical network models from time series data.

    Science.gov (United States)

    Vilela, Marco; Vinga, Susana; Maia, Marco A Grivet Mattoso; Voit, Eberhard O; Almeida, Jonas S

    2009-05-05

    The major difficulty in modeling biological systems from multivariate time series is the identification of parameter sets that endow a model with dynamical behaviors sufficiently similar to the experimental data. Directly related to this parameter estimation issue is the task of identifying the structure and regulation of ill-characterized systems. Both tasks are simplified if the mathematical model is canonical, i.e., if it is constructed according to strict guidelines. In this report, we propose a method for the identification of admissible parameter sets of canonical S-systems from biological time series. The method is based on a Monte Carlo process that is combined with an improved version of our previous parameter optimization algorithm. The method maps the parameter space into the network space, which characterizes the connectivity among components, by creating an ensemble of decoupled S-system models that imitate the dynamical behavior of the time series with sufficient accuracy. The concept of sloppiness is revisited in the context of these S-system models with an exploration not only of different parameter sets that produce similar dynamical behaviors but also different network topologies that yield dynamical similarity. The proposed parameter estimation methodology was applied to actual time series data from the glycolytic pathway of the bacterium Lactococcus lactis and led to ensembles of models with different network topologies. In parallel, the parameter optimization algorithm was applied to the same dynamical data upon imposing a pre-specified network topology derived from prior biological knowledge, and the results from both strategies were compared. The results suggest that the proposed method may serve as a powerful exploration tool for testing hypotheses and the design of new experiments.

  8. Bayesian dynamic modeling of time series of dengue disease case counts.

    Science.gov (United States)

    Martínez-Bello, Daniel Adyro; López-Quílez, Antonio; Torres-Prieto, Alexander

    2017-07-01

    The aim of this study is to model the association between weekly time series of dengue case counts and meteorological variables, in a high-incidence city of Colombia, applying Bayesian hierarchical dynamic generalized linear models over the period January 2008 to August 2015. Additionally, we evaluate the model's short-term performance for predicting dengue cases. The methodology shows dynamic Poisson log link models including constant or time-varying coefficients for the meteorological variables. Calendar effects were modeled using constant or first- or second-order random walk time-varying coefficients. The meteorological variables were modeled using constant coefficients and first-order random walk time-varying coefficients. We applied Markov Chain Monte Carlo simulations for parameter estimation, and deviance information criterion statistic (DIC) for model selection. We assessed the short-term predictive performance of the selected final model, at several time points within the study period using the mean absolute percentage error. The results showed the best model including first-order random walk time-varying coefficients for calendar trend and first-order random walk time-varying coefficients for the meteorological variables. Besides the computational challenges, interpreting the results implies a complete analysis of the time series of dengue with respect to the parameter estimates of the meteorological effects. We found small values of the mean absolute percentage errors at one or two weeks out-of-sample predictions for most prediction points, associated with low volatility periods in the dengue counts. We discuss the advantages and limitations of the dynamic Poisson models for studying the association between time series of dengue disease and meteorological variables. The key conclusion of the study is that dynamic Poisson models account for the dynamic nature of the variables involved in the modeling of time series of dengue disease, producing useful

  9. TIME SERIES MODELS OF THREE SETS OF RXTE OBSERVATIONS OF 4U 1543–47

    International Nuclear Information System (INIS)

    Koen, C.

    2013-01-01

    The X-ray nova 4U 1543–47 was in a different physical state (low/hard, high/soft, and very high) during the acquisition of each of the three time series analyzed in this paper. Standard time series models of the autoregressive moving average (ARMA) family are fitted to these series. The low/hard data can be adequately modeled by a simple low-order model with fixed coefficients, once the slowly varying mean count rate has been accounted for. The high/soft series requires a higher order model, or an ARMA model with variable coefficients. The very high state is characterized by a succession of 'dips', with roughly equal depths. These seem to appear independently of one another. The underlying stochastic series can again be modeled by an ARMA form, or roughly as the sum of an ARMA series and white noise. The structuring of each model in terms of short-lived aperiodic and 'quasi-periodic' components is discussed.

  10. Road safety forecasts in five European countries using structural time series models.

    Science.gov (United States)

    Antoniou, Constantinos; Papadimitriou, Eleonora; Yannis, George

    2014-01-01

    Modeling road safety development is a complex task and needs to consider both the quantifiable impact of specific parameters as well as the underlying trends that cannot always be measured or observed. The objective of this research is to apply structural time series models for obtaining reliable medium- to long-term forecasts of road traffic fatality risk using data from 5 countries with different characteristics from all over Europe (Cyprus, Greece, Hungary, Norway, and Switzerland). Two structural time series models are considered: (1) the local linear trend model and (2) the latent risk time series model. Furthermore, a structured decision tree for the selection of the applicable model for each situation (developed within the Road Safety Data, Collection, Transfer and Analysis [DaCoTA] research project, cofunded by the European Commission) is outlined. First, the fatality and exposure data that are used for the development of the models are presented and explored. Then, the modeling process is presented, including the model selection process, introduction of intervention variables, and development of mobility scenarios. The forecasts using the developed models appear to be realistic and within acceptable confidence intervals. The proposed methodology proves to be very efficient for handling different cases of data availability and quality, providing an appropriate alternative from the family of structural time series models in each country. A concluding section providing perspectives and directions for future research is presented.
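
    Of the two model classes named above, the local linear trend model is the simpler one; a minimal sketch using statsmodels' structural time series (UnobservedComponents) machinery is given below, with a made-up annual fatality series standing in for real data. The bivariate latent risk model is not sketched.

```python
# Local linear trend structural model fitted to a placeholder fatality series,
# then used to produce medium-term forecasts with prediction intervals.
import numpy as np
from statsmodels.tsa.statespace.structural import UnobservedComponents

rng = np.random.default_rng(1)
fatalities = 500.0 - 8.0 * np.arange(30) + rng.normal(scale=20.0, size=30)

model = UnobservedComponents(fatalities, level="local linear trend")
fit = model.fit(disp=False)
forecast = fit.get_forecast(steps=5)
print(forecast.predicted_mean)           # point forecasts for the next 5 years
print(forecast.conf_int(alpha=0.05))     # 95% prediction intervals
```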

  11. New series of 3D lattice integrable models

    International Nuclear Information System (INIS)

    Mangazeev, V.V.; Sergeev, S.M.; Stroganov, Yu.G.

    1993-01-01

    In this paper we present a new series of 3-dimensional integrable lattice models with N colors. The weight functions of the models satisfy modified tetrahedron equations with N states and give a commuting family of two-layer transfer-matrices. The dependence on the spectral parameters corresponds to the static limit of the modified tetrahedron equations and weights are parameterized in terms of elliptic functions. The models contain two free parameters: elliptic modulus and additional parameter η. 12 refs

  12. Tree-Structured Digital Organisms Model

    Science.gov (United States)

    Suzuki, Teruhiko; Nobesawa, Shiho; Tahara, Ikuo

    Tierra and Avida are well-known models of digital organisms. They describe a life process as a sequence of computation codes. A linear sequence model may not be the only way to describe a digital organism, though it is very simple for a computer-based model. Thus we propose a new digital organism model based on a tree structure, which is rather similar to genetic programming. In our model, a life process is a combination of various functions, much as life in the real world is. This implies that our model can easily describe the hierarchical structure of life, and it can simulate evolutionary computation through the mutual interaction of functions. We verified by simulation that our model can be regarded as a digital organism model according to its definitions. Our model even succeeded in creating species such as viruses and parasites.

  13. A Sandwich-Type Standard Error Estimator of SEM Models with Multivariate Time Series

    Science.gov (United States)

    Zhang, Guangjian; Chow, Sy-Miin; Ong, Anthony D.

    2011-01-01

    Structural equation models are increasingly used as a modeling tool for multivariate time series data in the social and behavioral sciences. Standard error estimators of SEM models, originally developed for independent data, require modifications to accommodate the fact that time series data are inherently dependent. In this article, we extend a…

  14. Modeling dyadic processes using Hidden Markov Models: A time series approach to mother-infant interactions during infant immunization.

    Science.gov (United States)

    Stifter, Cynthia A; Rovine, Michael

    2015-01-01

    The present longitudinal study examined mother-infant interaction during the administration of immunizations at two and six months of age using hidden Markov modeling, a time series approach that produces latent states to describe how mothers and infants work together to bring the infant to a soothed state. Results revealed a 4-state model for the dyadic responses to the two-month inoculation, whereas a 6-state model best described the dyadic process at six months. Two of the states at two months and three of the states at six months suggested a progression from high-intensity crying to no crying, with parents using vestibular and auditory soothing methods. The use of feeding and/or pacifying to soothe the infant characterized one two-month state and two six-month states. These data indicate that, with maturation and experience, the mother-infant dyad becomes more organized around the soothing interaction. The use of hidden Markov modeling to describe individual differences, as well as normative processes, is also presented and discussed.
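
    As a rough sketch of the kind of hidden Markov modeling described above (assuming the third-party hmmlearn package; the two feature columns are hypothetical mother and infant behaviour measures, not the study's coding scheme), a 4-state Gaussian HMM can be fitted as follows.

```python
# Fit a 4-state Gaussian hidden Markov model to one dyad's observation
# sequence and recover the most likely latent state at each time point.
import numpy as np
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(2)
X = rng.normal(size=(120, 2))   # placeholder: [infant distress, maternal soothing]

hmm = GaussianHMM(n_components=4, covariance_type="full",
                  n_iter=200, random_state=0)
hmm.fit(X)                           # EM estimation
states = hmm.predict(X)              # Viterbi state sequence
print(np.round(hmm.transmat_, 2))    # estimated state-to-state transition matrix
```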

  15. Nonlinearity, Breaks, and Long-Range Dependence in Time-Series Models

    DEFF Research Database (Denmark)

    Hillebrand, Eric Tobias; Medeiros, Marcelo C.

    We study the simultaneous occurrence of long memory and nonlinear effects, such as parameter changes and threshold effects, in ARMA time series models and apply our modeling framework to daily realized volatility. Asymptotic theory for parameter estimation is developed and two model building...

  16. Single-Index Additive Vector Autoregressive Time Series Models

    KAUST Repository

    LI, YEHUA; GENTON, MARC G.

    2009-01-01

    We study a new class of nonlinear autoregressive models for vector time series, where the current vector depends on single-indexes defined on the past lags and the effects of different lags have an additive form. A sufficient condition is provided

  17. Identification of neutral biochemical network models from time series data

    Directory of Open Access Journals (Sweden)

    Maia Marco

    2009-05-01

    Full Text Available Abstract Background The major difficulty in modeling biological systems from multivariate time series is the identification of parameter sets that endow a model with dynamical behaviors sufficiently similar to the experimental data. Directly related to this parameter estimation issue is the task of identifying the structure and regulation of ill-characterized systems. Both tasks are simplified if the mathematical model is canonical, i.e., if it is constructed according to strict guidelines. Results In this report, we propose a method for the identification of admissible parameter sets of canonical S-systems from biological time series. The method is based on a Monte Carlo process that is combined with an improved version of our previous parameter optimization algorithm. The method maps the parameter space into the network space, which characterizes the connectivity among components, by creating an ensemble of decoupled S-system models that imitate the dynamical behavior of the time series with sufficient accuracy. The concept of sloppiness is revisited in the context of these S-system models with an exploration not only of different parameter sets that produce similar dynamical behaviors but also different network topologies that yield dynamical similarity. Conclusion The proposed parameter estimation methodology was applied to actual time series data from the glycolytic pathway of the bacterium Lactococcus lactis and led to ensembles of models with different network topologies. In parallel, the parameter optimization algorithm was applied to the same dynamical data upon imposing a pre-specified network topology derived from prior biological knowledge, and the results from both strategies were compared. The results suggest that the proposed method may serve as a powerful exploration tool for testing hypotheses and the design of new experiments.

  18. Time-series modeling of long-term weight self-monitoring data.

    Science.gov (United States)

    Helander, Elina; Pavel, Misha; Jimison, Holly; Korhonen, Ilkka

    2015-08-01

    Long-term self-monitoring of weight is beneficial for weight maintenance, especially after weight loss. Connected weight scales accumulate time series information over the long term and hence enable time series analysis of the data. Such analysis can reveal individual patterns, provide more sensitive detection of significant weight trends, and enable more accurate and timely prediction of weight outcomes. However, long-term self-weighing data have several characteristics that complicate the analysis; in particular, irregular sampling, missing data, and periodic (e.g. diurnal and weekly) patterns are common. In this study, we apply a time series modeling approach to daily weight time series from two individuals and describe the information that can be extracted from this kind of data. We study the properties of the weight time series, missing data and its link to individual behavior, periodic patterns, and weight series segmentation. Being able to understand behavior through weight data and to give relevant feedback is desirable for positive interventions on health behaviors.

  19. Automated Bayesian model development for frequency detection in biological time series

    Directory of Open Access Journals (Sweden)

    Oldroyd Giles ED

    2011-06-01

    Full Text Available Abstract Background A first step in building a mathematical model of a biological system is often the analysis of the temporal behaviour of key quantities. Mathematical relationships between the time and frequency domain, such as Fourier Transforms and wavelets, are commonly used to extract information about the underlying signal from a given time series. This one-to-one mapping from time points to frequencies inherently assumes that both domains contain the complete knowledge of the system. However, for truncated, noisy time series with background trends this unique mapping breaks down and the question reduces to an inference problem of identifying the most probable frequencies. Results In this paper we build on the method of Bayesian Spectrum Analysis and demonstrate its advantages over conventional methods by applying it to a number of test cases, including two types of biological time series. Firstly, oscillations of calcium in plant root cells in response to microbial symbionts are non-stationary and noisy, posing challenges to data analysis. Secondly, circadian rhythms in gene expression measured over only two cycles highlights the problem of time series with limited length. The results show that the Bayesian frequency detection approach can provide useful results in specific areas where Fourier analysis can be uninformative or misleading. We demonstrate further benefits of the Bayesian approach for time series analysis, such as direct comparison of different hypotheses, inherent estimation of noise levels and parameter precision, and a flexible framework for modelling the data without pre-processing. Conclusions Modelling in systems biology often builds on the study of time-dependent phenomena. Fourier Transforms are a convenient tool for analysing the frequency domain of time series. However, there are well-known limitations of this method, such as the introduction of spurious frequencies when handling short and noisy time series, and

  20. Automated Bayesian model development for frequency detection in biological time series.

    Science.gov (United States)

    Granqvist, Emma; Oldroyd, Giles E D; Morris, Richard J

    2011-06-24

    A first step in building a mathematical model of a biological system is often the analysis of the temporal behaviour of key quantities. Mathematical relationships between the time and frequency domain, such as Fourier Transforms and wavelets, are commonly used to extract information about the underlying signal from a given time series. This one-to-one mapping from time points to frequencies inherently assumes that both domains contain the complete knowledge of the system. However, for truncated, noisy time series with background trends this unique mapping breaks down and the question reduces to an inference problem of identifying the most probable frequencies. In this paper we build on the method of Bayesian Spectrum Analysis and demonstrate its advantages over conventional methods by applying it to a number of test cases, including two types of biological time series. Firstly, oscillations of calcium in plant root cells in response to microbial symbionts are non-stationary and noisy, posing challenges to data analysis. Secondly, circadian rhythms in gene expression measured over only two cycles highlights the problem of time series with limited length. The results show that the Bayesian frequency detection approach can provide useful results in specific areas where Fourier analysis can be uninformative or misleading. We demonstrate further benefits of the Bayesian approach for time series analysis, such as direct comparison of different hypotheses, inherent estimation of noise levels and parameter precision, and a flexible framework for modelling the data without pre-processing. Modelling in systems biology often builds on the study of time-dependent phenomena. Fourier Transforms are a convenient tool for analysing the frequency domain of time series. However, there are well-known limitations of this method, such as the introduction of spurious frequencies when handling short and noisy time series, and the requirement for uniformly sampled data. Biological time
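
    A minimal sketch of the idea behind Bayesian frequency detection, for the simplest case only (one stationary sinusoid in white Gaussian noise with known variance): the log-posterior of the frequency is then proportional to the Schuster periodogram, so probable frequencies can be read off directly. The signal below is simulated and is not taken from the paper.

```python
# Posterior over frequency for a single sinusoid with known noise variance;
# peaks of the periodogram map to high-probability frequencies.
import numpy as np

rng = np.random.default_rng(3)
t = np.arange(0.0, 48.0, 0.5)                 # e.g. hours, sampled twice per hour
sigma = 0.5
y = np.sin(2 * np.pi * t / 24.0) + rng.normal(scale=sigma, size=t.size)

freqs = np.linspace(0.005, 0.5, 2000)         # candidate frequencies (cycles/hour)
# Schuster periodogram C(f) = |sum_k y_k exp(-2*pi*i*f*t_k)|^2 / N
C = np.abs(np.exp(-2j * np.pi * np.outer(freqs, t)) @ y) ** 2 / t.size
log_post = C / sigma ** 2                     # up to an additive constant
post = np.exp(log_post - log_post.max())
post /= np.trapz(post, freqs)                 # normalized posterior density
print("most probable frequency:", freqs[np.argmax(post)], "cycles per hour")
```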

  1. Hybrid model for forecasting time series with trend, seasonal and calendar variation patterns

    Science.gov (United States)

    Suhartono; Rahayu, S. P.; Prastyo, D. D.; Wijayanti, D. G. P.; Juliyanto

    2017-09-01

    Most monthly time series data in economics and business in Indonesia and other Moslem countries not only contain trend and seasonal patterns, but are also affected by two types of calendar variation effects, i.e. the effect of the number of working or trading days and holiday effects. The purpose of this research is to develop a hybrid model, or a combination of several forecasting models, to predict time series that contain trend, seasonal and calendar variation patterns. This hybrid model is a combination of classical models (namely time series regression and the ARIMA model) and/or modern methods (an artificial intelligence method, i.e. Artificial Neural Networks). A simulation study was used to show that the proposed procedure for building the hybrid model works well for forecasting time series with trend, seasonal and calendar variation patterns. Furthermore, the proposed hybrid model is applied to forecasting real data, i.e. monthly data on the inflow and outflow of currency at Bank Indonesia. The results show that the hybrid model tends to provide more accurate forecasts than the individual forecasting models. Moreover, this result is in line with the results of the M3 competition, i.e. that hybrid models on average provide more accurate forecasts than individual models.

  2. Tempered fractional time series model for turbulence in geophysical flows

    Science.gov (United States)

    Meerschaert, Mark M.; Sabzikar, Farzad; Phanikumar, Mantha S.; Zeleke, Aklilu

    2014-09-01

    We propose a new time series model for velocity data in turbulent flows. The new model employs tempered fractional calculus to extend the classical 5/3 spectral model of Kolmogorov. Application to wind speed and water velocity in a large lake are presented, to demonstrate the practical utility of the model.

  3. Mathematical modeling of atmospheric fine particle-associated primary organic compound concentrations

    Science.gov (United States)

    Rogge, Wolfgang F.; Hildemann, Lynn M.; Mazurek, Monica A.; Cass, Glen R.; Simoneit, Bernd R. T.

    1996-08-01

    An atmospheric transport model has been used to explore the relationship between source emissions and ambient air quality for individual particle phase organic compounds present in primary aerosol source emissions. An inventory of fine particulate organic compound emissions was assembled for the Los Angeles area in the year 1982. Sources characterized included noncatalyst- and catalyst-equipped autos, diesel trucks, paved road dust, tire wear, brake lining dust, meat cooking operations, industrial oil-fired boilers, roofing tar pots, natural gas combustion in residential homes, cigarette smoke, fireplaces burning oak and pine wood, and plant leaf abrasion products. These primary fine particle source emissions were supplied to a computer-based model that simulates atmospheric transport, dispersion, and dry deposition based on the time series of hourly wind observations and mixing depths. Monthly average fine particle organic compound concentrations that would prevail if the primary organic aerosol were transported without chemical reaction were computed for more than 100 organic compounds within an 80 km × 80 km modeling area centered over Los Angeles. The monthly average compound concentrations predicted by the transport model were compared to atmospheric measurements made at monitoring sites within the study area during 1982. The predicted seasonal variation and absolute values of the concentrations of the more stable compounds are found to be in reasonable agreement with the ambient observations. While model predictions for the higher molecular weight polycyclic aromatic hydrocarbons (PAH) are in agreement with ambient observations, lower molecular weight PAH show much higher predicted than measured atmospheric concentrations in the particle phase, indicating atmospheric decay by chemical reactions or evaporation from the particle phase. The atmospheric concentrations of dicarboxylic acids and aromatic polycarboxylic acids greatly exceed the contributions that

  4. Time series modeling for syndromic surveillance

    Directory of Open Access Journals (Sweden)

    Mandl Kenneth D

    2003-01-01

    Full Text Available Abstract Background Emergency department (ED)-based syndromic surveillance systems identify abnormally high visit rates that may be an early signal of a bioterrorist attack. For example, an anthrax outbreak might first be detectable as an unusual increase in the number of patients reporting to the ED with respiratory symptoms. Reliably identifying these abnormal visit patterns requires a good understanding of the normal patterns of healthcare usage. Unfortunately, systematic methods for determining the expected number of ED visits on a particular day have not yet been well established. We present here a generalized methodology for developing models of expected ED visit rates. Methods Using time-series methods, we developed robust models of ED utilization for the purpose of defining expected visit rates. The models were based on nearly a decade of historical data at a major metropolitan academic, tertiary care pediatric emergency department. The historical data were fit using trimmed-mean seasonal models, and additional models were fit with autoregressive integrated moving average (ARIMA) residuals to account for recent trends in the data. The detection capabilities of the model were tested with simulated outbreaks. Results Models were built both for overall visits and for respiratory-related visits, classified according to the chief complaint recorded at the beginning of each visit. The mean absolute percentage error of the ARIMA models was 9.37% for overall visits and 27.54% for respiratory visits. A simple detection system based on the ARIMA model of overall visits was able to detect 7-day-long simulated outbreaks of 30 visits per day with 100% sensitivity and 97% specificity. Sensitivity decreased with outbreak size, dropping to 94% for outbreaks of 20 visits per day, and 57% for 10 visits per day, all while maintaining a 97% benchmark specificity. Conclusions Time series methods applied to historical ED utilization data are an important tool
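
    A hedged sketch of the detection idea in the record above: model expected daily visit counts with a seasonal ARIMA and flag days whose observed count exceeds the upper prediction bound. The visit counts below are simulated placeholders, and the model orders are illustrative rather than the study's.

```python
# Seasonal ARIMA baseline for expected ED visits plus a simple alarm rule.
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(4)
days = 730
weekly = 10.0 * np.sin(2 * np.pi * np.arange(days) / 7.0)   # day-of-week pattern
visits = 120.0 + weekly + rng.normal(scale=8.0, size=days)

train, test = visits[:700], visits[700:]
fit = SARIMAX(train, order=(1, 0, 1), seasonal_order=(1, 0, 1, 7)).fit(disp=False)
pred = fit.get_forecast(steps=len(test))
upper = pred.conf_int(alpha=0.05)[:, 1]        # upper 97.5% prediction bound

alarm_days = np.where(test > upper)[0]         # days flagged as abnormal
print("alarm days (index into the test window):", alarm_days)
```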

  5. Volterra-series-based nonlinear system modeling and its engineering applications: A state-of-the-art review

    Science.gov (United States)

    Cheng, C. M.; Peng, Z. K.; Zhang, W. M.; Meng, G.

    2017-03-01

    Nonlinear problems have drawn great interest and extensive attention from engineers, physicists, mathematicians and many other scientists because most real systems are inherently nonlinear in nature. To model and analyze nonlinear systems, many mathematical theories and methods have been developed, including the Volterra series. In this paper, the basic definition of the Volterra series is recapitulated, together with some frequency domain concepts derived from it, including the general frequency response function (GFRF), the nonlinear output frequency response function (NOFRF), the output frequency response function (OFRF) and the associated frequency response function (AFRF). The relationship between the Volterra series and other nonlinear system models and nonlinear problem-solving methods is discussed, including the Taylor series, Wiener series, NARMAX model, Hammerstein model, Wiener model, Wiener-Hammerstein model, harmonic balance method, perturbation method and Adomian decomposition. The challenging problems, and their state of the art, in the study of series convergence and kernel identification are comprehensively introduced. In addition, a detailed review is given of the applications of the Volterra series in mechanical engineering, aeroelasticity, control engineering, and electronic and electrical engineering.

  6. Time series modeling of soil moisture dynamics on a steep mountainous hillside

    Science.gov (United States)

    Kim, Sanghyun

    2016-05-01

    The response of soil moisture to rainfall events along hillslope transects is an important hydrologic process and a critical component of interactions between soil vegetation and the atmosphere. In this context, the research described in this article addresses the spatial distribution of soil moisture as a function of topography. In order to characterize the temporal variation in soil moisture on a steep mountainous hillside, a transfer function, including a model for noise, was introduced. Soil moisture time series with similar rainfall amounts, but different wetness gradients were measured in the spring and fall. Water flux near the soil moisture sensors was modeled and mathematical expressions were developed to provide a basis for input-output modeling of rainfall and soil moisture using hydrological processes such as infiltration, exfiltration and downslope lateral flow. The characteristics of soil moisture response can be expressed in terms of model structure. A seasonal comparison of models reveals differences in soil moisture response to rainfall, possibly associated with eco-hydrological process and evapotranspiration. Modeling results along the hillslope indicate that the spatial structure of the soil moisture response patterns mainly appears in deeper layers. Similarities between topographic attributes and stochastic model structures are spatially organized. The impact of temporal and spatial discretization scales on parameter expression is addressed in the context of modeling results that link rainfall events and soil moisture.

  7. Tempered fractional time series model for turbulence in geophysical flows

    International Nuclear Information System (INIS)

    Meerschaert, Mark M; Sabzikar, Farzad; Phanikumar, Mantha S; Zeleke, Aklilu

    2014-01-01

    We propose a new time series model for velocity data in turbulent flows. The new model employs tempered fractional calculus to extend the classical 5/3 spectral model of Kolmogorov. Application to wind speed and water velocity in a large lake are presented, to demonstrate the practical utility of the model. (paper)

  8. A prediction method based on wavelet transform and multiple models fusion for chaotic time series

    International Nuclear Information System (INIS)

    Zhongda, Tian; Shujiang, Li; Yanhong, Wang; Yi, Sha

    2017-01-01

    In order to improve the prediction accuracy of chaotic time series, a prediction method based on the wavelet transform and multiple-model fusion is proposed. The chaotic time series is decomposed and reconstructed by the wavelet transform, and approximation components and detail components are obtained. According to the different characteristics of each component, a least squares support vector machine (LSSVM) is used as the predictive model for the approximation components, with an improved free search algorithm utilized to optimize the predictive model parameters. An autoregressive integrated moving average (ARIMA) model is used as the predictive model for the detail components. The predictions of the multiple models are fused by the Gauss–Markov algorithm; the error variance of the fused prediction is smaller than that of any single model, so the prediction accuracy is improved. The simulation results are compared using two typical chaotic time series, the Lorenz and Mackey–Glass time series, and show that the proposed method achieves better prediction.

  9. A time series model: First-order integer-valued autoregressive (INAR(1))

    Science.gov (United States)

    Simarmata, D. M.; Novkaniza, F.; Widyaningsih, Y.

    2017-07-01

    Nonnegative integer-valued time series arise in many applications. The first-order integer-valued autoregressive model, INAR(1), is constructed with the binomial thinning operator to model nonnegative integer-valued time series; INAR(1) depends on the process one period before. The parameters of the model can be estimated by conditional least squares (CLS). The specification of INAR(1) follows that of AR(1). Forecasting in INAR(1) uses the median or a Bayesian forecasting methodology. The median forecasting methodology takes the smallest integer s for which the cumulative distribution function (CDF) up to s is greater than or equal to 0.5. The Bayesian forecasting methodology forecasts h steps ahead by generating the model parameter and the innovation parameter using Adaptive Rejection Metropolis Sampling within Gibbs sampling (ARMS), and then finding the smallest integer s for which the CDF up to s is greater than or equal to u, where u is a value drawn from the Uniform(0,1) distribution. INAR(1) is applied to monthly pneumonia cases in Penjaringan, Jakarta Utara, from January 2008 to April 2016.
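
    A small sketch of the INAR(1) recursion and its conditional least squares estimates (the parameter values are illustrative, not those fitted to the pneumonia data): X_t = alpha ∘ X_{t-1} + e_t, where ∘ denotes binomial thinning and e_t is a Poisson innovation.

```python
# Simulate an INAR(1) series via binomial thinning plus Poisson innovations,
# then estimate alpha and the innovation mean by conditional least squares
# (an OLS regression of X_t on X_{t-1}).
import numpy as np

rng = np.random.default_rng(5)
alpha, lam, n = 0.6, 2.0, 300

x = np.zeros(n, dtype=int)
x[0] = rng.poisson(lam / (1 - alpha))            # start near the stationary mean
for t in range(1, n):
    survivors = rng.binomial(x[t - 1], alpha)    # binomial thinning of last count
    x[t] = survivors + rng.poisson(lam)          # plus new arrivals

slope, intercept = np.polyfit(x[:-1], x[1:], 1)
print(f"alpha_hat = {slope:.3f}, lambda_hat = {intercept:.3f}")
```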

  10. On the maximum-entropy/autoregressive modeling of time series

    Science.gov (United States)

    Chao, B. F.

    1984-01-01

    The autoregressive (AR) model of a random process is interpreted in the light of Prony's relation, which relates a complex conjugate pair of poles of the AR process in the z-plane (or the z domain) on the one hand, to the complex frequency of one complex harmonic function in the time domain on the other. Thus the AR model of a time series is one that models the time series as a linear combination of complex harmonic functions, which include pure sinusoids and real exponentials as special cases. An AR model is completely determined by its z-domain pole configuration. The maximum-entropy/autoregressive (ME/AR) spectrum, defined on the unit circle of the z-plane (or the frequency domain), is nothing but a convenient, but ambiguous, visual representation. It is asserted that the position and shape of a spectral peak are determined by the corresponding complex frequency, and that the height of the spectral peak contains little information about the complex amplitude of the complex harmonic functions.
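
    An illustrative sketch of the Prony-style reading described above (simulated data, arbitrary model order): fit an AR(p) model, take the roots of its characteristic polynomial as z-plane poles, and convert each pole into a frequency and a decay rate.

```python
# Fit an AR model to a damped oscillation, then turn its z-plane poles into
# complex frequencies (cycles per unit time) and decay rates.
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

rng = np.random.default_rng(6)
dt = 1.0
t = np.arange(400) * dt
y = np.exp(-0.01 * t) * np.cos(2 * np.pi * 0.05 * t) + 0.1 * rng.normal(size=t.size)

order = 6
phi = AutoReg(y, lags=order).fit().params[1:]     # AR coefficients (skip the constant)

poles = np.roots(np.r_[1.0, -phi])                # roots of z^p - phi_1 z^(p-1) - ...
freqs = np.angle(poles) / (2 * np.pi * dt)        # cycles per unit time
decay = np.log(np.abs(poles)) / dt                # negative values mean damping
for z, f, d in zip(poles, freqs, decay):
    print(f"pole {z:.3f}: frequency {f:+.4f}, decay rate {d:+.4f}")
```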

  11. 75 FR 13259 - NOAA Is Hosting a Series of Informational Webinars for Individuals and Organizations To Learn...

    Science.gov (United States)

    2010-03-19

    ... DEPARTMENT OF COMMERCE National Oceanic and Atmospheric Administration NOAA Is Hosting a Series of Informational Webinars for Individuals and Organizations To Learn About the Proposed NOAA Climate Service AGENCY: Office of Oceanic and Atmospheric Research, National Oceanic and Atmospheric Administration (NOAA...

  12. Assimilation of LAI time-series in crop production models

    Science.gov (United States)

    Kooistra, Lammert; Rijk, Bert; Nannes, Louis

    2014-05-01

    Agriculture is worldwide a large consumer of freshwater, nutrients and land. Spatially explicit agricultural management activities (e.g., fertilization, irrigation) could significantly improve efficiency in resource use. In previous studies and operational applications, remote sensing has been shown to be a powerful method for spatio-temporal monitoring of actual crop status. As a next step, yield forecasting by assimilating remote sensing based plant variables in crop production models would improve agricultural decision support both at the farm and field level. In this study we investigated the potential of remote sensing based Leaf Area Index (LAI) time-series assimilated in the crop production model LINTUL to improve yield forecasting at field level. The effect of the assimilation method and the amount of assimilated observations was evaluated. The LINTUL-3 crop production model was calibrated and validated for a potato crop on two experimental fields in the south of the Netherlands. A range of data sources (e.g., in-situ soil moisture and weather sensors, destructive crop measurements) was used for calibration of the model for the experimental field in 2010. LAI from Cropscan field radiometer measurements and actual LAI measured with the LAI-2000 instrument were used as input for the LAI time-series. The LAI time-series were assimilated in the LINTUL model and validated for a second experimental field on which potatoes were grown in 2011. Yield in 2011 was simulated with an R2 of 0.82 when compared with field-measured yield. Furthermore, we analysed the potential of assimilating LAI into the LINTUL-3 model through the 'updating' assimilation technique. The deviation between measured and simulated yield decreased from 9371 kg/ha to 8729 kg/ha when assimilating weekly LAI measurements in the LINTUL model over the season of 2011. LINTUL-3 furthermore shows the main growth-reducing factors, which are useful for farm decision support. The combination of crop models and sensor

  13. Markov Chain Modelling for Short-Term NDVI Time Series Forecasting

    Directory of Open Access Journals (Sweden)

    Stepčenko Artūrs

    2016-12-01

    Full Text Available In this paper, an NDVI time series forecasting model has been developed based on a discrete-time, continuous-state Markov chain of suitable order. The normalised difference vegetation index (NDVI) is an indicator that describes the amount of chlorophyll (the green mass) and shows the relative density and health of vegetation; therefore, it is an important variable for vegetation forecasting. A Markov chain is a stochastic process defined on a state space; the process undergoes transitions from one state to another in the state space with certain probabilities. A Markov chain forecast model is flexible in accommodating various forecast assumptions and structures. The present paper discusses the considerations and techniques in building a Markov chain forecast model at each step. The continuous-state Markov chain model is analytically described. Finally, the application of the proposed Markov chain model is illustrated with reference to a set of NDVI time series data.
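
    The paper uses a continuous-state chain; as a simplified, discrete-state sketch of the same mechanics (placeholder NDVI values, arbitrary binning), one can estimate a transition matrix from observed state changes and forecast the next state distribution.

```python
# First-order discrete-state Markov chain: discretize NDVI into bins, count
# transitions, normalize rows to probabilities, and forecast one step ahead.
import numpy as np

rng = np.random.default_rng(7)
ndvi = np.clip(0.5 + 0.2 * np.sin(np.linspace(0, 12 * np.pi, 240))
               + rng.normal(scale=0.05, size=240), 0.0, 1.0)

n_states = 5
edges = np.linspace(0.0, 1.0, n_states + 1)
states = np.clip(np.digitize(ndvi, edges) - 1, 0, n_states - 1)

P = np.zeros((n_states, n_states))
for i, j in zip(states[:-1], states[1:]):
    P[i, j] += 1
P /= np.maximum(P.sum(axis=1, keepdims=True), 1)    # row-normalize with a guard

print("next-step state probabilities:", np.round(P[states[-1]], 3))
```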

  14. Modeling and Forecasting of Water Demand in Isfahan Using Underlying Trend Concept and Time Series

    Directory of Open Access Journals (Sweden)

    H. Sadeghi

    2016-02-01

    Full Text Available Introduction: Accurate modeling of urban water demand is very important for forecasting and for adopting policies related to water resources management. Thus, for estimating, forecasting and modeling future water requirements, it is important to use models with small errors. Water has a special place among basic human needs because human life cannot continue without it, which makes careful management of its extraction and consumption a necessity. Municipal water applications include a variety of water demands for domestic, public, industrial and commercial uses. Predicting urban water demand is important for better planning of water resources in arid and semiarid regions that face water restrictions. Materials and Methods: Technological change is one of the most important factors affecting production and demand functions, and it must receive special attention in model specification. Technology development is of concern not only technically; other aspects, such as personal and non-economic factors (population, geographical and social factors), can also be analyzed. The model examined in this study is a regression model composed of a series of structural components that are allowed to change stochastically over time. Explanatory variables for technology (both crystalline and amorphous) would improve the model, but because they are not measured over time they cannot be entered into it directly. In this study, structural time series models (STSM) and ARMA time series models have been used to model and estimate water demand in Isfahan. Moreover, in order to find the more efficient procedure, the two models have been compared with each other. The data used in this research include water consumption in Isfahan, the water price and the monthly pay

  15. Bayesian near-boundary analysis in basic macroeconomic time series models

    NARCIS (Netherlands)

    M.D. de Pooter (Michiel); F. Ravazzolo (Francesco); R. Segers (René); H.K. van Dijk (Herman)

    2008-01-01

    Several lessons learnt from a Bayesian analysis of basic macroeconomic time series models are presented for the situation where some model parameters have substantial posterior probability near the boundary of the parameter region. This feature refers to near-instability within dynamic

  16. forecasting with nonlinear time series model: a monte-carlo

    African Journals Online (AJOL)

    PUBLICATIONS1

    Carlo method of forecasting using a special nonlinear time series model, called logistic smooth transition ... We illustrate this new method using some simulation ..... in MATLAB 7.5.0. ... process (DGP) using the logistic smooth transi-.

  17. The application of time series models to cloud field morphology analysis

    Science.gov (United States)

    Chin, Roland T.; Jau, Jack Y. C.; Weinman, James A.

    1987-01-01

    A modeling method for the quantitative description of remotely sensed cloud field images is presented. A two-dimensional texture modeling scheme based on one-dimensional time series procedures is adopted for this purpose. The time series procedure used is the seasonal autoregressive moving average (ARMA) process of Box and Jenkins. Cloud field properties such as directionality, clustering and cloud coverage can be retrieved by this method. It has been demonstrated that a cloud field image can be quantitatively defined by a small set of parameters and that synthesized surrogates can be reconstructed from these model parameters. This method enables cloud climatology to be studied quantitatively.

  18. Modeling Philippine Stock Exchange Composite Index Using Time Series Analysis

    Science.gov (United States)

    Gayo, W. S.; Urrutia, J. D.; Temple, J. M. F.; Sandoval, J. R. D.; Sanglay, J. E. A.

    2015-06-01

    This study was conducted to develop a time series model of the Philippine Stock Exchange Composite Index and its volatility using a finite mixture of ARIMA models with conditional variance equations such as the ARCH, GARCH, EGARCH, TARCH and PARCH models. The study also aimed to find out the reason behind the behavior of the PSEi, that is, which of the economic variables - Consumer Price Index, crude oil price, foreign exchange rate, gold price, interest rate, money supply, price-earnings ratio, Producers’ Price Index and terms of trade - can be used in projecting future values of the PSEi; this was examined using the Granger causality test. The findings showed that the best time series model for the Philippine Stock Exchange Composite Index is ARIMA(1,1,5) - ARCH(1). Also, the Consumer Price Index, crude oil price and foreign exchange rate are concluded to Granger-cause the Philippine Stock Exchange Composite Index.
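
    The Granger causality step can be sketched with statsmodels (the two series below are simulated stand-ins, not PSEi or exchange-rate data): the test asks whether lagged values of the candidate driver improve prediction of the target beyond its own lags.

```python
# Test whether `fx` Granger-causes `psei` at lags 1..4; the second column of
# the input is the candidate cause. grangercausalitytests prints a summary
# per lag; p-values are also collected below.
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(8)
n = 300
fx = rng.normal(size=n)
psei = np.zeros(n)
for t in range(2, n):
    psei[t] = 0.3 * psei[t - 1] + 0.4 * fx[t - 2] + rng.normal()

results = grangercausalitytests(np.column_stack([psei, fx]), maxlag=4)
p_values = {lag: res[0]["ssr_ftest"][1] for lag, res in results.items()}
print(p_values)   # small p-values indicate that fx Granger-causes psei
```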

  19. Modeling Financial Time Series Based on a Market Microstructure Model with Leverage Effect

    OpenAIRE

    Yanhui Xi; Hui Peng; Yemei Qin

    2016-01-01

    The basic market microstructure model specifies that the price/return innovation and the volatility innovation are independent Gaussian white noise processes. However, the financial leverage effect has been found to be statistically significant in many financial time series. In this paper, a novel market microstructure model with leverage effects is proposed. The model specification assumed a negative correlation in the errors between the price/return innovation and the volatility innovation....

  20. The Exponential Model for the Spectrum of a Time Series: Extensions and Applications

    DEFF Research Database (Denmark)

    Proietti, Tommaso; Luati, Alessandra

    The exponential model for the spectrum of a time series and its fractional extensions are based on the Fourier series expansion of the logarithm of the spectral density. The coefficients of the expansion form the cepstrum of the time series. After deriving the cepstrum of important classes of time...
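
    As a minimal sketch of the object this record is built on, the cepstral coefficients of a series can be estimated as the Fourier coefficients of its log spectrum, here via the log-periodogram of a simulated AR(1) series (illustrative only).

```python
# Estimate cepstral coefficients: inverse FFT of the log of the periodogram.
import numpy as np

rng = np.random.default_rng(9)
n = 1024
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.7 * y[t - 1] + rng.normal()           # AR(1) placeholder series

spectrum = np.abs(np.fft.fft(y - y.mean())) ** 2 / n
cepstrum = np.real(np.fft.ifft(np.log(spectrum + 1e-12)))
print("first cepstral coefficients:", np.round(cepstrum[:6], 3))
```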

  1. The Gaussian Graphical Model in Cross-Sectional and Time-Series Data.

    Science.gov (United States)

    Epskamp, Sacha; Waldorp, Lourens J; Mõttus, René; Borsboom, Denny

    2018-04-16

    We discuss the Gaussian graphical model (GGM; an undirected network of partial correlation coefficients) and detail its utility as an exploratory data analysis tool. The GGM shows which variables predict one-another, allows for sparse modeling of covariance structures, and may highlight potential causal relationships between observed variables. We describe the utility in three kinds of psychological data sets: data sets in which consecutive cases are assumed independent (e.g., cross-sectional data), temporally ordered data sets (e.g., n = 1 time series), and a mixture of the 2 (e.g., n > 1 time series). In time-series analysis, the GGM can be used to model the residual structure of a vector-autoregression analysis (VAR), also termed graphical VAR. Two network models can then be obtained: a temporal network and a contemporaneous network. When analyzing data from multiple subjects, a GGM can also be formed on the covariance structure of stationary means-the between-subjects network. We discuss the interpretation of these models and propose estimation methods to obtain these networks, which we implement in the R packages graphicalVAR and mlVAR. The methods are showcased in two empirical examples, and simulation studies on these methods are included in the supplementary materials.
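
    An exploratory sketch of estimating a GGM of the cross-sectional kind (not the graphicalVAR/mlVAR workflow named above, which lives in R): fit a sparse precision matrix with the graphical lasso and convert it to partial correlations, whose nonzero entries are the network's edges. The data matrix is a random placeholder.

```python
# Sparse Gaussian graphical model via graphical lasso, converted to partial
# correlations (edge weights of the GGM).
import numpy as np
from sklearn.covariance import GraphicalLassoCV

rng = np.random.default_rng(10)
X = rng.normal(size=(200, 6))             # 200 cases, 6 observed variables

K = GraphicalLassoCV().fit(X).precision_  # regularized inverse covariance

d = np.sqrt(np.diag(K))
pcor = -K / np.outer(d, d)                # partial correlation: -K_ij / sqrt(K_ii K_jj)
np.fill_diagonal(pcor, 0.0)
print(np.round(pcor, 2))                  # nonzero off-diagonals are network edges
```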

  2. Project-matrix models of marketing organization

    Directory of Open Access Journals (Sweden)

    Gutić Dragutin

    2009-01-01

    Full Text Available Unlike the theory and practice of corporate organization, marketing organization has to this day not developed its many possible forms and contents. It is fair to say that marketing organization in most of our companies, and in almost all of its parts, noticeably lags behind corporate organization. Marketing managers have always been occupied by basic, narrow marketing activities such as sales growth, market analysis, market growth and market share, marketing research, introduction of new products, modification of products, promotion, distribution, etc. They rarely found it necessary to focus more on other aspects of marketing management, for example marketing planning and marketing control, or marketing organization and leadership. This paper deals with aspects of project-matrix marketing organization management. Two-dimensional and multi-dimensional models are presented. Among the two-dimensional models, the following are analyzed: the market management/product management model; the product management/management of product lifecycle phases on the market model; the customer management/marketing functions management model; the demand management/marketing functions management model; and the market positions management/marketing functions management model.

  3. Modeling sports highlights using a time-series clustering framework and model interpretation

    Science.gov (United States)

    Radhakrishnan, Regunathan; Otsuka, Isao; Xiong, Ziyou; Divakaran, Ajay

    2005-01-01

    In our past work on sports highlights extraction, we have shown the utility of detecting audience reaction using an audio classification framework. The audio classes in the framework were chosen based on intuition. In this paper, we present a systematic way of identifying the key audio classes for sports highlights extraction using a time series clustering framework. We treat the low-level audio features as a time series and model the highlight segments as "unusual" events in a background of a "usual" process. The set of audio classes to characterize the sports domain is then identified by analyzing the consistent patterns in each of the clusters output from the time series clustering framework. The distribution of features from the training data so obtained for each of the key audio classes is parameterized by a Minimum Description Length Gaussian Mixture Model (MDL-GMM). We also interpret the meaning of each of the mixture components of the MDL-GMM for the key audio class (the "highlight" class) that is correlated with highlight moments. Our results show that the "highlight" class is a mixture of audience cheering and the commentator's excited speech. Furthermore, we show that the precision-recall performance for highlights extraction based on this "highlight" class is better than that of our previous approach, which uses only audience cheering as the key highlight class.
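    A rough sketch of the mixture-model selection step, assuming scikit-learn and random placeholder feature vectors; the Bayesian information criterion is used here as a stand-in for the MDL criterion in the paper:

```python
# Sketch: choose the number of Gaussian mixture components for a "highlight"
# audio class by minimizing an information criterion (BIC as an MDL stand-in).
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(2)
features = np.vstack([rng.normal(0, 1, (300, 12)),   # e.g. cheering-like frames
                      rng.normal(3, 1, (300, 12))])  # e.g. excited-speech frames

candidates = [GaussianMixture(n_components=k, random_state=0).fit(features)
              for k in range(1, 6)]
best = min(candidates, key=lambda m: m.bic(features))
print("selected number of components:", best.n_components)
```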

  4. Improved time series prediction with a new method for selection of model parameters

    International Nuclear Information System (INIS)

    Jade, A M; Jayaraman, V K; Kulkarni, B D

    2006-01-01

    A new method for model selection in prediction of time series is proposed. Apart from the conventional criterion of minimizing RMS error, the method also minimizes the error on the distribution of singularities, evaluated through the local Hoelder estimates and its probability density spectrum. Predictions of two simulated and one real time series have been done using kernel principal component regression (KPCR) and model parameters of KPCR have been selected employing the proposed as well as the conventional method. Results obtained demonstrate that the proposed method takes into account the sharp changes in a time series and improves the generalization capability of the KPCR model for better prediction of the unseen test data. (letter to the editor)
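    A minimal sketch of kernel principal component regression itself, assuming scikit-learn and a toy series; the Hoelder-spectrum selection criterion proposed in the letter is not reproduced, a plain RMSE comparison on held-out data stands in for it:

```python
# Sketch of KPCR: project lag vectors with kernel PCA, then regress the next
# value on the kernel principal components.
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(3)
t = np.arange(600)
series = np.sin(0.2 * t) + 0.1 * rng.normal(size=t.size)   # toy time series

lags = 5
X = np.column_stack([series[i:i - lags] for i in range(lags)])  # lag matrix
y = series[lags:]
X_train, X_test, y_train, y_test = X[:400], X[400:], y[:400], y[400:]

kpca = KernelPCA(n_components=4, kernel="rbf", gamma=0.5).fit(X_train)
reg = LinearRegression().fit(kpca.transform(X_train), y_train)
pred = reg.predict(kpca.transform(X_test))
print("test RMSE:", mean_squared_error(y_test, pred) ** 0.5)
```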

  5. Pin failure modeling of the A series CABRI tests

    International Nuclear Information System (INIS)

    Young, M.F.; Portugal, J.L.

    1978-01-01

    The EXPAND pin failure model, a research tool designed to model pin failure under prompt burst conditions, has been used to predict failure conditions for several of the A series CABRI tests as part of the United States participation in the CABRI Joint Project. The Project is an international program involving France, Germany, England, Japan, and the United States and has the goal of obtaining experimental data relating to the safety of LMFBRs. The A series, designed to simulate high ramp rate TOP conditions, initially utilizes single, fresh UO2 pins of the PHENIX type in a flowing sodium loop. The pins are preheated at constant power in the CABRI reactor to establish steady state conditions (480 W/cm at the axial peak) and then subjected to a power pulse of 14 ms to 24 ms duration

  6. Time series modeling by a regression approach based on a latent process.

    Science.gov (United States)

    Chamroukhi, Faicel; Samé, Allou; Govaert, Gérard; Aknin, Patrice

    2009-01-01

    Time series are used in many domains, including finance, engineering, economics and bioinformatics, generally to represent the change of a measurement over time. Modeling techniques may then be used to give a synthetic representation of such data. A new approach for time series modeling is proposed in this paper. It consists of a regression model incorporating a discrete hidden logistic process that allows different polynomial regression models to be activated smoothly or abruptly. The model parameters are estimated by the maximum likelihood method performed by a dedicated Expectation Maximization (EM) algorithm. The M step of the EM algorithm uses a multi-class Iterative Reweighted Least-Squares (IRLS) algorithm to estimate the hidden process parameters. To evaluate the proposed approach, an experimental study on simulated data and real world data was performed using two alternative approaches: a heteroskedastic piecewise regression model using a global optimization algorithm based on dynamic programming, and a Hidden Markov Regression Model whose parameters are estimated by the Baum-Welch algorithm. Finally, in the context of the remote monitoring of components of the French railway infrastructure, and more particularly the switch mechanism, the proposed approach has been applied to modeling and classifying time series representing the condition measurements acquired during switch operations.

  7. Organic matter production in an age series of Eucalyptus globulus plantations in Tamil Nadu

    Energy Technology Data Exchange (ETDEWEB)

    Negi, J D.S.; Bora, N K.S.; Tandon, V N; Thapliyal, H D

    1984-08-01

    The distribution of organic matter in an age series of Eucalyptus globulus plantations in Tamil Nadu is discussed. The total biomass ranges from 38 tonnes (5 years) to 220 tonnes (16 years) per ha, with 85 to 88 percent contributed by the aboveground parts and 15 to 12 percent by the roots. The average annual production of non-photosynthetic components peaks (19 tonnes/ha) at the age of 7 years. 17 references, 4 tables.

  8. Molecular simulation of a model of dissolved organic matter.

    Science.gov (United States)

    Sutton, Rebecca; Sposito, Garrison; Diallo, Mamadou S; Schulten, Hans-Rolf

    2005-08-01

    A series of atomistic simulations was performed to assess the ability of the Schulten dissolved organic matter (DOM) molecule, a well-established model humic molecule, to reproduce the physical and chemical behavior of natural humic substances. The unhydrated DOM molecule had a bulk density value appropriate to humic matter, but its Hildebrand solubility parameter was lower than the range of current experimental estimates. Under hydrated conditions, the DOM molecule went through conformational adjustments that resulted in disruption of intramolecular hydrogen bonds (H-bonds), although few water molecules penetrated the organic interior. The radius of gyration of the hydrated DOM molecule was similar to those measured for aquatic humic substances. To simulate humic materials under aqueous conditions with varying pH levels, carboxyl groups were deprotonated, and hydrated Na+ or Ca2+ were added to balance the resulting negative charge. Because of intrusion of the cation hydrates, the model metal-humic structures were more porous, had greater solvent-accessible surface areas, and formed more H-bonds with water than the protonated, hydrated DOM molecule. Relative to Na+, Ca2+ was both more strongly bound to carboxylate groups and more fully hydrated. This difference was attributed to the higher charge of the divalent cation. The Ca-DOM hydrate, however, featured fewer H-bonds than the Na-DOM hydrate, perhaps because of the reduced orientational freedom of organic moieties and water molecules imposed by Ca2+. The present work is, to our knowledge, the first rigorous computational exploration regarding the behavior of a model humic molecule under a range of physical conditions typical of soil and water systems.

  9. LSOT: A Lightweight Self-Organized Trust Model in VANETs

    Directory of Open Access Journals (Sweden)

    Zhiquan Liu

    2016-01-01

    Full Text Available With the advances in the automobile industry and wireless communication technology, Vehicular Ad hoc Networks (VANETs) have attracted the attention of a large number of researchers. Trust management plays an important role in VANETs. However, it is still at a preliminary stage and the existing trust models cannot entirely conform to the characteristics of VANETs. This work proposes a novel Lightweight Self-Organized Trust (LSOT) model which contains trust certificate-based and recommendation-based trust evaluations. Neither supernodes nor trusted third parties are needed in our model. In addition, we comprehensively consider three factor weights to ease the collusion attack in trust certificate-based trust evaluation, and we utilize the testing interaction method to build and maintain the trust network and propose a maximum local trust (MLT) algorithm to identify trustworthy recommenders in recommendation-based trust evaluation. Furthermore, a fully distributed VANET scenario is deployed based on the famous Advogato dataset and a series of simulations and analyses are conducted. The results illustrate that our LSOT model significantly outperforms the excellent experience-based trust (EBT) and Lightweight Cross-domain Trust (LCT) models in terms of evaluation performance and robustness against the collusion attack.

  10. A COMPARATIVE STUDY OF FORECASTING MODELS FOR TREND AND SEASONAL TIME SERIES: DOES A COMPLEX MODEL ALWAYS YIELD A BETTER FORECAST THAN SIMPLE MODELS?

    Directory of Open Access Journals (Sweden)

    Suhartono Suhartono

    2005-01-01

    Full Text Available Many business and economic time series are non-stationary time series that contain trend and seasonal variations. Seasonality is a periodic and recurrent pattern caused by factors such as weather, holidays, or repeating promotions. A stochastic trend often accompanies the seasonal variations and can have a significant impact on various forecasting methods. In this paper, we investigate and compare several forecasting methods for modeling time series with both trend and seasonal patterns. These methods are the Winters', Decomposition, Time Series Regression, ARIMA and Neural Network models. In this empirical research, we study the effectiveness of the forecasting performance, particularly to answer whether a complex method always gives a better forecast than a simpler method. We use real data, namely the airline passenger data. The result shows that the more complex model does not always yield a better result than a simpler one. Additionally, we identify possibilities for further research, especially the use of hybrid models that combine several forecasting methods to obtain better forecasts, for example a combination of decomposition (as data preprocessing) and a neural network model.
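    As a small illustration of the simple-versus-complex comparison, the sketch below (assuming statsmodels and a synthetic monthly series in place of the airline passenger data) pits Holt-Winters exponential smoothing against a seasonal ARIMA on a held-out horizon:

```python
# Sketch: compare a simpler and a more complex method on a trend + seasonal series.
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(4)
idx = pd.date_range("2000-01", periods=144, freq="MS")
y = pd.Series(100 + 0.8 * np.arange(144)
              + 15 * np.sin(2 * np.pi * np.arange(144) / 12)
              + rng.normal(scale=5, size=144), index=idx)
train, test = y[:-24], y[-24:]

hw = ExponentialSmoothing(train, trend="add", seasonal="add",
                          seasonal_periods=12).fit()
sarima = SARIMAX(train, order=(1, 1, 1), seasonal_order=(1, 1, 1, 12)).fit(disp=False)

for name, fc in [("Holt-Winters", hw.forecast(24)), ("SARIMA", sarima.forecast(24))]:
    rmse = np.sqrt(np.mean((test.values - np.asarray(fc)) ** 2))
    print(name, "RMSE:", round(rmse, 2))
```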

  11. Disease management with ARIMA model in time series.

    Science.gov (United States)

    Sato, Renato Cesar

    2013-01-01

    The evaluation of infectious and noninfectious disease management can be done through the use of a time series analysis. In this study, we expect to measure the results and prevent intervention effects on the disease. Clinical studies have benefited from the use of these techniques, particularly for the wide applicability of the ARIMA model. This study briefly presents the process of using the ARIMA model. This analytical tool offers a great contribution for researchers and healthcare managers in the evaluation of healthcare interventions in specific populations.

  12. Modeling the full-bridge series-resonant power converter

    Science.gov (United States)

    King, R. J.; Stuart, T. A.

    1982-01-01

    A steady state model is derived for the full-bridge series-resonant power converter. Normalized parametric curves for various currents and voltages are then plotted versus the triggering angle of the switching devices. The calculations are compared with experimental measurements made on a 50 kHz converter and a discussion of certain operating problems is presented.

  13. Development of Simulink-Based SiC MOSFET Modeling Platform for Series Connected Devices

    DEFF Research Database (Denmark)

    Tsolaridis, Georgios; Ilves, Kalle; Reigosa, Paula Diaz

    2016-01-01

    A new MATLAB/Simulink-based modeling platform has been developed for SiC MOSFET power modules. The modeling platform describes the electrical behavior of a single 1.2 kV/350 A SiC MOSFET power module, as well as the series connection of two of them. A fast parameter initialization is followed by an optimization process to facilitate the extraction of the model's parameters in a more automated way, relying on a small number of experimental waveforms. Through extensive experimental work, it is shown that the model accurately predicts both static and dynamic performances. The series connection of two SiC power modules has been investigated through the validation of the static and dynamic conditions. Thanks to the developed model, a better understanding of the challenges introduced by uneven voltage sharing among series connected devices is possible.

  14. Modelos de gestión de conflictos en serie de ficción televisiva (Conflict management models in television fiction series

    Directory of Open Access Journals (Sweden)

    Yolanda Navarro-Abal

    2012-12-01

    Full Text Available Television fiction series sometimes generate an unreal vision of life, especially among young people, becoming a mirror in which they can see themselves reflected. The series become models of values, attitudes, skills and behaviours that tend to be imitated by some viewers. The aim of this study was to analyze the conflict management behavioural styles presented by the main characters of television fiction series. Thus, we evaluated the association between these styles and the age and sex of the main characters, as well as the nationality and genre of the fiction series. 16 fiction series were assessed by selecting two characters of both sexes from each series. We adapted the Rahim Organizational Conflict Inventory-II for observing and recording the data. The results show that there is no direct association between the conflict management behavioural styles presented in the drama series and the sex of the main characters. However, associations were found between these styles and the age of the characters and the genre of the fiction series.

  15. From Taylor series to Taylor models

    International Nuclear Information System (INIS)

    Berz, Martin

    1997-01-01

    An overview of the background of Taylor series methods and the utilization of the differential algebraic structure is given, and various associated techniques are reviewed. The conventional Taylor methods are extended to allow for a rigorous treatment of bounds for the remainder of the expansion in a similarly universal way. Utilizing differential algebraic and functional analytic arguments on the set of Taylor models, arbitrary order integrators with rigorous remainder treatment are developed. The integrators can meet pre-specified accuracy requirements in a mathematically strict way, and are a stepping stone towards fully rigorous estimates of stability of repetitive systems

  16. Time-series models on somatic cell score improve detection of mastitis

    DEFF Research Database (Denmark)

    Norberg, E; Korsgaard, I R; Sloth, K H M N

    2008-01-01

    In-line detection of mastitis using frequent milk sampling was studied in 241 cows in a Danish research herd. Somatic cell scores obtained on a daily basis were analyzed using a mixture of four time-series models. Probabilities were assigned to each model for the observations to belong to a normal "steady-state" development, a change in "level", a change of "slope" or an "outlier". Mastitis was indicated from the sum of probabilities for the "level" and "slope" models. The time-series models were based on the Kalman filter. Reference data were obtained from veterinary assessment of health status combined with bacteriological findings. At a sensitivity of 90% the corresponding specificity was 68%, which increased to 83% using a one-step back smoothing. It is concluded that mixture models based on Kalman filters are efficient in handling in-line sensor data for detection of mastitis and may be useful for similar...

  17. Time Series Analysis, Modeling and Applications A Computational Intelligence Perspective

    CERN Document Server

    Chen, Shyi-Ming

    2013-01-01

    Temporal and spatiotemporal data form an inherent fabric of the society as we are faced with streams of data coming from numerous sensors, data feeds, recordings associated with numerous areas of application embracing physical and human-generated phenomena (environmental data, financial markets, Internet activities, etc.). A quest for a thorough analysis, interpretation, modeling and prediction of time series comes with an ongoing challenge for developing models that are both accurate and user-friendly (interpretable). The volume is aimed to exploit the conceptual and algorithmic framework of Computational Intelligence (CI) to form a cohesive and comprehensive environment for building models of time series. The contributions covered in the volume are fully reflective of the wealth of the CI technologies by bringing together ideas, algorithms, and numeric studies, which convincingly demonstrate their relevance, maturity and visible usefulness. It reflects upon the truly remarkable diversity of methodological a...

  18. Evaluating an Automated Number Series Item Generator Using Linear Logistic Test Models

    Directory of Open Access Journals (Sweden)

    Bao Sheng Loe

    2018-04-01

    Full Text Available This study investigates the item properties of a newly developed Automatic Number Series Item Generator (ANSIG). The foundation of the ANSIG is based on five hypothesised cognitive operators. Thirteen item models were developed using the numGen R package and eleven were evaluated in this study. The 16-item ICAR (International Cognitive Ability Resource) short-form ability test was used to evaluate construct validity. The Rasch Model and two Linear Logistic Test Models (LLTMs) were employed to estimate and predict the item parameters. Results indicate that a single factor determines the performance on tests composed of items generated by the ANSIG. Under the LLTM approach, all the cognitive operators were significant predictors of item difficulty. Moderate to high correlations were evident between the number series items and the ICAR test scores, with high correlation found for the ICAR Letter-Numeric-Series type items, suggesting adequate nomothetic span. Extended cognitive research is, nevertheless, essential for the automatic generation of an item pool with predictable psychometric properties.

  19. Time Series Modeling of Nano-Gold Immunochromatographic Assay via Expectation Maximization Algorithm.

    Science.gov (United States)

    Zeng, Nianyin; Wang, Zidong; Li, Yurong; Du, Min; Cao, Jie; Liu, Xiaohui

    2013-12-01

    In this paper, the expectation maximization (EM) algorithm is applied to the modeling of the nano-gold immunochromatographic assay (nano-GICA) via available time series of the measured signal intensities of the test and control lines. The model for the nano-GICA is developed as the stochastic dynamic model that consists of a first-order autoregressive stochastic dynamic process and a noisy measurement. By using the EM algorithm, the model parameters, the actual signal intensities of the test and control lines, as well as the noise intensity can be identified simultaneously. Three different time series data sets concerning the target concentrations are employed to demonstrate the effectiveness of the introduced algorithm. Several indices are also proposed to evaluate the inferred models. It is shown that the model fits the data very well.
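    The model structure described (a first-order autoregressive latent signal observed with measurement noise) can be sketched as follows; note that this fit uses Kalman-filter maximum likelihood via statsmodels rather than the EM algorithm of the paper, and the signal-intensity series is simulated:

```python
# Sketch: latent AR(1) signal observed with additive measurement noise,
# fitted by state-space maximum likelihood (not EM).
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(5)
n = 300
latent = np.zeros(n)
for t in range(1, n):                                  # latent AR(1) signal intensity
    latent[t] = 0.9 * latent[t - 1] + rng.normal(scale=0.5)
observed = latent + rng.normal(scale=0.3, size=n)      # noisy measurement

fit = SARIMAX(observed, order=(1, 0, 0), measurement_error=True).fit(disp=False)
print(fit.params)                  # AR coefficient plus state and measurement variances
smoothed = fit.smoothed_state[0]   # estimate of the latent signal intensity
```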

  20. 75 FR 47199 - Airworthiness Directives; McDonnell Douglas Corporation Model DC-9-10 Series Airplanes, DC-9-30...

    Science.gov (United States)

    2010-08-05

    ... Airworthiness Directives; McDonnell Douglas Corporation Model DC- 9-10 Series Airplanes, DC-9-30 Series... existing airworthiness directive (AD), which applies to all McDonnell Douglas Model DC-9-10 series..., 2010). That AD applies to all McDonnell Douglas Corporation Model DC-9-10 series airplanes, DC-9-30...

  1. On determining the prediction limits of mathematical models for time series

    International Nuclear Information System (INIS)

    Peluso, E.; Gelfusa, M.; Lungaroni, M.; Talebzadeh, S.; Gaudio, P.; Murari, A.; Contributors, JET

    2016-01-01

    Prediction is one of the main objectives of scientific analysis and it refers to both modelling and forecasting. The determination of the limits of predictability is an important issue of both theoretical and practical relevance. In the case of modelling time series, once a certain level of performance has been reached in either modelling or prediction, it is often important to assess whether all the information available in the data has been exploited or whether there are still margins for improvement of the tools being developed. In this paper, an information theoretic approach is proposed to address this issue and quantify the quality of the models and/or predictions. The excellent properties of the proposed indicator have been proved with the help of a systematic series of numerical tests and a concrete example of extreme relevance for nuclear fusion.

  2. Big Data impacts on stochastic Forecast Models: Evidence from FX time series

    Directory of Open Access Journals (Sweden)

    Sebastian Dietz

    2013-12-01

    Full Text Available With the rise of the Big Data paradigm, new tasks for prediction models have appeared. In addition to the volume of such data sets, nonlinearity becomes important, as the more detailed data sets also contain more comprehensive information, e.g. about non-regular seasonal or cyclical movements as well as jumps in time series. This essay compares two nonlinear methods for predicting a high frequency time series, the USD/Euro exchange rate. The first method investigated is Autoregressive Neural Network Processes (ARNN), a neural network based nonlinear extension of classical autoregressive process models from time series analysis (see Dietz 2011). Its advantage is its simple but scalable time series process model architecture, which is able to include all kinds of nonlinearities based on the universal approximation theorem of Hornik, Stinchcombe and White (1989) and the extensions of Hornik (1993). However, restrictions related to the numeric estimation procedures limit the flexibility of the model. The alternative is a Support Vector Machine model (SVM; Vapnik 1995). The two methods compared have different approaches to error minimization (empirical error minimization for the ARNN vs. structural error minimization for the SVM). Our new finding is that time series data classified as "Big Data" need new methods for prediction. Estimation and prediction were performed using the statistical programming language R. Besides prediction results, we also discuss the impact of Big Data on the data preparation and model validation steps.
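    A minimal sketch of the SVM branch of such a comparison, assuming scikit-learn and a random-walk placeholder for the exchange-rate series:

```python
# Sketch: one-step-ahead prediction of a high-frequency rate from its own lags
# with support vector regression.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(6)
rate = 1.2 + np.cumsum(rng.normal(scale=1e-3, size=2000))   # placeholder FX series

lags = 10
X = np.column_stack([rate[i:i - lags] for i in range(lags)])  # lagged predictors
y = rate[lags:]
X_train, X_test, y_train, y_test = X[:1500], X[1500:], y[:1500], y[1500:]

svr = SVR(kernel="rbf", C=10.0, epsilon=1e-4).fit(X_train, y_train)
pred = svr.predict(X_test)
print("test RMSE:", np.sqrt(np.mean((pred - y_test) ** 2)))
```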

  3. Modeling multivariate time series on manifolds with skew radial basis functions.

    Science.gov (United States)

    Jamshidi, Arta A; Kirby, Michael J

    2011-01-01

    We present an approach for constructing nonlinear empirical mappings from high-dimensional domains to multivariate ranges. We employ radial basis functions and skew radial basis functions for constructing a model using data that are potentially scattered or sparse. The algorithm progresses iteratively, adding a new function at each step to refine the model. The placement of the functions is driven by a statistical hypothesis test that accounts for correlation in the multivariate range variables. The test is applied on training and validation data and reveals nonstatistical or geometric structure when it fails. At each step, the added function is fit to data contained in a spatiotemporally defined local region to determine the parameters--in particular, the scale of the local model. The scale of the function is determined by the zero crossings of the autocorrelation function of the residuals. The model parameters and the number of basis functions are determined automatically from the given data, and there is no need to initialize any ad hoc parameters save for the selection of the skew radial basis functions. Compactly supported skew radial basis functions are employed to improve model accuracy, order, and convergence properties. The extension of the algorithm to higher-dimensional ranges produces reduced-order models by exploiting the existence of correlation in the range variable data. Structure is tested not just in a single time series but between all pairs of time series. We illustrate the new methodologies using several illustrative problems, including modeling data on manifolds and the prediction of chaotic time series.

  4. Single-Index Additive Vector Autoregressive Time Series Models

    KAUST Repository

    LI, YEHUA

    2009-09-01

    We study a new class of nonlinear autoregressive models for vector time series, where the current vector depends on single-indexes defined on the past lags and the effects of different lags have an additive form. A sufficient condition is provided for stationarity of such models. We also study estimation of the proposed model using P-splines, hypothesis testing, asymptotics, selection of the order of the autoregression and of the smoothing parameters and nonlinear forecasting. We perform simulation experiments to evaluate our model in various settings. We illustrate our methodology on a climate data set and show that our model provides more accurate yearly forecasts of the El Niño phenomenon, the unusual warming of water in the Pacific Ocean. © 2009 Board of the Foundation of the Scandinavian Journal of Statistics.

  5. Incorporating Satellite Time-Series Data into Modeling

    Science.gov (United States)

    Gregg, Watson

    2008-01-01

    In situ time series observations have provided a multi-decadal view of long-term changes in ocean biology. These observations are sufficiently reliable to enable discernment of even relatively small changes, and provide continuous information on a host of variables. Their key drawback is their limited domain. Satellite observations from ocean color sensors do not suffer the drawback of domain, and simultaneously view the global oceans. This attribute lends credence to their use in global and regional model validation and data assimilation. We focus on these applications using the NASA Ocean Biogeochemical Model. The enhancement of the satellite data using data assimilation is featured, and the limitation of long-term satellite data sets is also discussed.

  6. A Parsimonious Bootstrap Method to Model Natural Inflow Energy Series

    Directory of Open Access Journals (Sweden)

    Fernando Luiz Cyrino Oliveira

    2014-01-01

    Full Text Available The Brazilian energy generation and transmission system is quite peculiar in its dimension and characteristics. As such, it can be considered unique in the world. It is a high dimension hydrothermal system with huge participation of hydro plants. Such strong dependency on hydrological regimes implies uncertainties related to the energetic planning, requiring adequate modeling of the hydrological time series. This is carried out via stochastic simulations of monthly inflow series using the family of Periodic Autoregressive models, PAR(p), one for each period (month) of the year. This paper shows the problems in fitting these models in the current system, particularly the identification of the autoregressive order "p" and the corresponding parameter estimation. It is followed by a proposal of a new approach to set both the model order and the parameter estimates of the PAR(p) models, using a nonparametric computational technique known as the bootstrap. This technique allows the estimation of reliable confidence intervals for the model parameters. The obtained results using the Parsimonious Bootstrap Method of Moments (PBMOM) produced not only more parsimonious model orders but also adherent stochastic scenarios and, in the long range, led to a better use of water resources in the energy operation planning.
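    The bootstrap idea can be illustrated on a plain (non-periodic) autoregression; the sketch below, assuming statsmodels, resamples residuals to obtain confidence intervals for the AR parameters, standing in for the periodic PAR(p) machinery of the paper:

```python
# Sketch: residual bootstrap for an AR(2) model - resample residuals, rebuild
# the series, refit, and read confidence intervals off the bootstrap distribution.
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

rng = np.random.default_rng(7)
n, order = 400, 2
y = np.zeros(n)
for t in range(order, n):
    y[t] = 0.6 * y[t - 1] - 0.3 * y[t - 2] + rng.normal()

fit = AutoReg(y, lags=order).fit()
resid, params = fit.resid, fit.params        # params: intercept, lag 1, lag 2

boot_params = []
for _ in range(500):
    e = rng.choice(resid, size=n, replace=True)
    yb = np.zeros(n)
    for t in range(order, n):
        yb[t] = params[0] + params[1] * yb[t - 1] + params[2] * yb[t - 2] + e[t]
    boot_params.append(AutoReg(yb, lags=order).fit().params)

lo, hi = np.percentile(np.array(boot_params), [2.5, 97.5], axis=0)
print("95% bootstrap intervals:", list(zip(lo.round(2), hi.round(2))))
```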

  7. An Illustration of Generalised Arma (garma) Time Series Modeling of Forest Area in Malaysia

    Science.gov (United States)

    Pillai, Thulasyammal Ramiah; Shitan, Mahendran

    Forestry is the art and science of managing forests, tree plantations, and related natural resources. The main goal of forestry is to create and implement systems that allow forests to continue a sustainable provision of environmental supplies and services. Forest area is land under natural or planted stands of trees, whether productive or not. The forest area of Malaysia has been observed over the years and can be modeled using time series models. A new class of GARMA models has been introduced in the time series literature to reveal some hidden features in time series data. For these models to be used widely in practice, we illustrate the fitting of a GARMA(1, 1; 1, δ) model to the annual forest area data of Malaysia observed from 1987 to 2008. The estimation of the model was done using the Hannan-Rissanen algorithm, Whittle's estimation and maximum likelihood estimation.

  8. Clustering Multivariate Time Series Using Hidden Markov Models

    Directory of Open Access Journals (Sweden)

    Shima Ghassempour

    2014-03-01

    Full Text Available In this paper we describe an algorithm for clustering multivariate time series with variables taking both categorical and continuous values. Time series of this type are frequent in health care, where they represent the health trajectories of individuals. The problem is challenging because categorical variables make it difficult to define a meaningful distance between trajectories. We propose an approach based on Hidden Markov Models (HMMs), where we first map each trajectory into an HMM, then define a suitable distance between HMMs and finally proceed to cluster the HMMs with a method based on a distance matrix. We test our approach on a simulated, but realistic, data set of 1,255 trajectories of individuals of age 45 and over, on a synthetic validation set with known clustering structure, and on a smaller set of 268 trajectories extracted from the longitudinal Health and Retirement Survey. The proposed method can be implemented quite simply using standard packages in R and Matlab and may be a good candidate for solving the difficult problem of clustering multivariate time series with categorical variables using tools that do not require advanced statistical knowledge, and therefore are accessible to a wide range of researchers.
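    A compact sketch of the fit-one-HMM-per-trajectory pipeline, assuming the Python hmmlearn package (the paper works with R and Matlab) and Gaussian emissions only; the categorical-variable handling is not reproduced:

```python
# Sketch: fit one HMM per trajectory, build a pairwise distance matrix from
# symmetrised log-likelihoods, then cluster the distance matrix.
import numpy as np
from hmmlearn.hmm import GaussianHMM
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import squareform

rng = np.random.default_rng(8)
# five toy trajectories of length 50 with two continuous variables each
trajectories = [rng.normal(loc=m, size=(50, 2)) for m in (0, 0, 3, 3, 6)]

models = [GaussianHMM(n_components=2, n_iter=50, random_state=0).fit(tr)
          for tr in trajectories]
k = len(trajectories)
score = np.array([[models[i].score(trajectories[j]) for j in range(k)]
                  for i in range(k)])

# symmetrised, per-observation distance between fitted HMMs (clipped at zero)
dist = np.zeros((k, k))
for i in range(k):
    for j in range(i + 1, k):
        d = 0.5 * ((score[i, i] - score[j, i]) + (score[j, j] - score[i, j])) / 50
        dist[i, j] = dist[j, i] = max(d, 0.0)

labels = fcluster(linkage(squareform(dist), method="average"),
                  t=2, criterion="maxclust")
print(labels)   # trajectories generated around the same mean should co-cluster
```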

  9. Empirical intrinsic geometry for nonlinear modeling and time series filtering.

    Science.gov (United States)

    Talmon, Ronen; Coifman, Ronald R

    2013-07-30

    In this paper, we present a method for time series analysis based on empirical intrinsic geometry (EIG). EIG enables one to reveal the low-dimensional parametric manifold as well as to infer the underlying dynamics of high-dimensional time series. By incorporating concepts of information geometry, this method extends existing geometric analysis tools to support stochastic settings and parametrizes the geometry of empirical distributions. However, the statistical models are not required as priors; hence, EIG may be applied to a wide range of real signals without existing definitive models. We show that the inferred model is noise-resilient and invariant under different observation and instrumental modalities. In addition, we show that it can be extended efficiently to newly acquired measurements in a sequential manner. These two advantages enable us to revisit the Bayesian approach and incorporate empirical dynamics and intrinsic geometry into a nonlinear filtering framework. We show applications to nonlinear and non-Gaussian tracking problems as well as to acoustic signal localization.

  10. Series-NonUniform Rational B-Spline (S-NURBS) model: a geometrical interpolation framework for chaotic data.

    Science.gov (United States)

    Shao, Chenxi; Liu, Qingqing; Wang, Tingting; Yin, Peifeng; Wang, Binghong

    2013-09-01

    Time series are widely exploited to study the innate character of complex chaotic systems. Existing chaotic models are weak in modeling accuracy because they adopt either an error minimization strategy or an acceptable error to end the modeling process. Interpolation, by contrast, can be very useful for solving differential equations with a small modeling error, but it is also very difficult to deal with arbitrary-dimensional series. In this paper, geometric theory is considered to reduce the modeling error, and a high-precision framework called the Series-NonUniform Rational B-Spline (S-NURBS) model is developed to deal with arbitrary-dimensional series. The capability of the interpolation framework is proved in the validation part. Besides, we verify its reliability by interpolating the Musa dataset. The main improvement of the proposed framework is that we are able to reduce the interpolation error by properly adjusting the weight series step by step if more information is given. Meanwhile, these experiments also demonstrate that studying a physical system from a geometric perspective is feasible.

  11. Travel Cost Inference from Sparse, Spatio-Temporally Correlated Time Series Using Markov Models

    DEFF Research Database (Denmark)

    Yang, Bin; Guo, Chenjuan; Jensen, Christian S.

    2013-01-01

    of such time series offers insight into the underlying system and enables prediction of system behavior. While the techniques presented in the paper apply more generally, we consider the case of transportation systems and aim to predict travel cost from GPS tracking data from probe vehicles. Specifically, each road segment has an associated travel-cost time series, which is derived from GPS data. We use spatio-temporal hidden Markov models (STHMM) to model correlations among different traffic time series. We provide algorithms that are able to learn the parameters of an STHMM while contending with the sparsity, spatio-temporal correlation, and heterogeneity of the time series. Using the resulting STHMM, near future travel costs in the transportation network, e.g., travel time or greenhouse gas emissions, can be inferred, enabling a variety of routing services, e.g., eco-routing. Empirical studies...

  12. Forecasting electricity spot-prices using linear univariate time-series models

    International Nuclear Information System (INIS)

    Cuaresma, Jesus Crespo; Hlouskova, Jaroslava; Kossmeier, Stephan; Obersteiner, Michael

    2004-01-01

    This paper studies the forecasting abilities of a battery of univariate models on hourly electricity spot prices, using data from the Leipzig Power Exchange. The specifications studied include autoregressive models, autoregressive-moving average models and unobserved component models. The results show that specifications where each hour of the day is modelled separately present uniformly better forecasting properties than specifications for the whole time series, and that the inclusion of simple probabilistic processes for the arrival of extreme price events can lead to improvements in the forecasting abilities of univariate models for electricity spot prices. (Author)

  13. Effect of calibration data series length on performance and optimal parameters of hydrological model

    Directory of Open Access Journals (Sweden)

    Chuan-zhe Li

    2010-12-01

    Full Text Available In order to assess the effects of calibration data series length on the performance and optimal parameter values of a hydrological model in ungauged or data-limited catchments (where data are non-continuous and fragmentary), we used non-continuous calibration periods to obtain more independent streamflow data for calibration of SIMHYD (simple hydrology model). Nash-Sutcliffe efficiency and percentage water balance error were used as performance measures. The particle swarm optimization (PSO) method was used to calibrate the rainfall-runoff models. Different lengths of data series ranging from one year to ten years, randomly sampled, were used to study the impact of calibration data series length. Fifty-five relatively unimpaired catchments located all over Australia with daily precipitation, potential evapotranspiration, and streamflow data were tested to obtain more general conclusions. The results show that longer calibration data series do not necessarily result in better model performance. In general, eight years of data are sufficient to obtain steady estimates of model performance and parameters for the SIMHYD model. It is also shown that most humid catchments require fewer calibration data to obtain a good performance and stable parameter values. The model performs better in humid and semi-humid catchments than in arid catchments. Our results may have useful and interesting implications for the efficiency of using limited observation data for hydrological model calibration in different climates.

  14. Stochastic model stationarization by eliminating the periodic term and its effect on time series prediction

    Science.gov (United States)

    Moeeni, Hamid; Bonakdari, Hossein; Fatemi, Seyed Ehsan

    2017-04-01

    Because time series stationarization has a key role in stochastic modeling results, three methods are analyzed in this study: seasonal differencing, seasonal standardization and spectral analysis, each used to eliminate the periodic effect on time series stationarity. First, six time series, including 4 streamflow series and 2 water temperature series, are stationarized. The stochastic term for these series is subsequently modeled with ARIMA. For the analysis, 9228 models are introduced. It is observed that seasonal standardization and spectral analysis eliminate the periodic term completely, while seasonal differencing maintains seasonal correlation structures. The obtained results indicate that all three methods present acceptable performance overall. However, model accuracy in monthly streamflow prediction is higher with seasonal differencing than with the other two methods. Another advantage of seasonal differencing over the other methods is that the monthly streamflow is never estimated as negative. Standardization is the best method for predicting monthly water temperature although it is quite similar to seasonal differencing, while spectral analysis performed the weakest in all cases. It is concluded that for each monthly seasonal series, seasonal differencing is the best stationarization method in terms of periodic effect elimination. Moreover, the monthly water temperature is predicted with more accuracy than monthly streamflow. The ratio of the average stochastic term to the amplitude of the periodic term for monthly streamflow and monthly water temperature was 0.19 and 0.30, 0.21 and 0.13, and 0.07 and 0.04, respectively.
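    The two best-performing stationarization steps from the study can be written in a few lines of pandas; the monthly streamflow series below is simulated:

```python
# Sketch: seasonal differencing and seasonal standardization of a monthly series.
import numpy as np
import pandas as pd

rng = np.random.default_rng(9)
idx = pd.date_range("1990-01", periods=240, freq="MS")
flow = pd.Series(50 + 30 * np.sin(2 * np.pi * idx.month / 12)
                 + rng.normal(scale=5, size=240), index=idx)

# seasonal differencing: subtract the value observed 12 months earlier
seasonal_diff = flow.diff(12).dropna()

# seasonal standardization: remove each calendar month's mean and variance
monthly = flow.groupby(flow.index.month)
standardized = (flow - monthly.transform("mean")) / monthly.transform("std")
```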

  15. Intuitionistic Fuzzy Time Series Forecasting Model Based on Intuitionistic Fuzzy Reasoning

    Directory of Open Access Journals (Sweden)

    Ya’nan Wang

    2016-01-01

    Full Text Available Fuzzy sets theory cannot describe the data comprehensively, which has greatly limited the objectivity of fuzzy time series in uncertain data forecasting. In this regard, an intuitionistic fuzzy time series forecasting model is built. In the new model, a fuzzy clustering algorithm is used to divide the universe of discourse into unequal intervals, and a more objective technique for ascertaining the membership function and nonmembership function of the intuitionistic fuzzy set is proposed. On these bases, forecast rules based on intuitionistic fuzzy approximate reasoning are established. At last, contrast experiments on the enrollments of the University of Alabama and the Taiwan Stock Exchange Capitalization Weighted Stock Index are carried out. The results show that the new model has a clear advantage of improving the forecast accuracy.

  16. Transfer function modeling of the monthly accumulated rainfall series over the Iberian Peninsula

    Energy Technology Data Exchange (ETDEWEB)

    Mateos, Vidal L.; Garcia, Jose A.; Serrano, Antonio; De la Cruz Gallego, Maria [Departamento de Fisica, Universidad de Extremadura, Badajoz (Spain)

    2002-10-01

    In order to improve the results given by Autoregressive Moving-Average (ARMA) modeling for the monthly accumulated rainfall series taken at 19 observatories of the Iberian Peninsula, a Discrete Linear Transfer Function Noise (DLTFN) model was applied taking the local pressure series (LP), North Atlantic sea level pressure series (SLP) and North Atlantic sea surface temperature (SST) as input variables, and the rainfall series as the output series. In all cases, the performance of the DLTFN models, measured by the explained variance of the rainfall series, is better than the performance given by the ARMA modeling. The best performance is given by the models that take the local pressure as the input variable, followed by the sea level pressure models and the sea surface temperature models. Geographically speaking, the models fitted to those observatories located in the west of the Iberian Peninsula work better than those in the north and east of the Peninsula. It was also found that there is a region located between 0° N and 20° N which shows the highest cross-correlation between SST and the Peninsula rainfall. This region moves to the west and northwest off the Peninsula when the SLP series are used.

  17. From Taylor series to Taylor models

    International Nuclear Information System (INIS)

    Berz, M.

    1997-01-01

    An overview of the background of Taylor series methods and the utilization of the differential algebraic structure is given, and various associated techniques are reviewed. The conventional Taylor methods are extended to allow for a rigorous treatment of bounds for the remainder of the expansion in a similarly universal way. Utilizing differential algebraic and functional analytic arguments on the set of Taylor models, arbitrary order integrators with rigorous remainder treatment are developed. The integrators can meet pre-specified accuracy requirements in a mathematically strict way, and are a stepping stone towards fully rigorous estimates of stability of repetitive systems. copyright 1997 American Institute of Physics

  18. Mesoscopic kinetic Monte Carlo modeling of organic photovoltaic device characteristics

    Science.gov (United States)

    Kimber, Robin G. E.; Wright, Edward N.; O'Kane, Simon E. J.; Walker, Alison B.; Blakesley, James C.

    2012-12-01

    Measured mobility and current-voltage characteristics of single layer and photovoltaic (PV) devices composed of poly{9,9-dioctylfluorene-co-bis[N,N'-(4-butylphenyl)]bis(N,N'-phenyl-1,4-phenylene)diamine} (PFB) and poly(9,9-dioctylfluorene-co-benzothiadiazole) (F8BT) have been reproduced by a mesoscopic model employing the kinetic Monte Carlo (KMC) approach. Our aim is to show how to avoid the uncertainties common in electrical transport models arising from the need to fit a large number of parameters when little information is available, for example, a single current-voltage curve. Here, simulation parameters are derived from a series of measurements using a self-consistent "building-blocks" approach, starting from data on the simplest systems. We found that site energies show disorder and that correlations in the site energies and a distribution of deep traps must be included in order to reproduce measured charge mobility-field curves at low charge densities in bulk PFB and F8BT. The parameter set from the mobility-field curves reproduces the unipolar current in single layers of PFB and F8BT and allows us to deduce charge injection barriers. Finally, by combining these disorder descriptions and injection barriers with an optical model, the external quantum efficiency and current densities of blend and bilayer organic PV devices can be successfully reproduced across a voltage range encompassing reverse and forward bias, with the recombination rate the only parameter to be fitted, found to be 1×10⁷ s⁻¹. These findings demonstrate an approach that removes some of the arbitrariness present in transport models of organic devices, which validates the KMC as an accurate description of organic optoelectronic systems, and provides information on the microscopic origins of the device behavior.

  19. State-space prediction model for chaotic time series

    Science.gov (United States)

    Alparslan, A. K.; Sayar, M.; Atilgan, A. R.

    1998-08-01

    A simple method for predicting the continuation of scalar chaotic time series ahead in time is proposed. The false nearest neighbors technique in connection with the time-delayed embedding is employed so as to reconstruct the state space. A local forecasting model based upon the time evolution of the topological neighboring in the reconstructed phase space is suggested. A moving root-mean-square error is utilized in order to monitor the error along the prediction horizon. The model is tested for the convection amplitude of the Lorenz model. The results indicate that for approximately 100 cycles of the training data, the prediction follows the actual continuation very closely about six cycles. The proposed model, like other state-space forecasting models, captures the long-term behavior of the system due to the use of spatial neighbors in the state space.
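    A minimal sketch of the approach, assuming numpy, scipy and scikit-learn: delay-embed the convection amplitude of a numerically integrated Lorenz system, then forecast one step ahead from how the nearest neighbours in the reconstructed space evolved:

```python
# Sketch: time-delay embedding plus local nearest-neighbour forecasting.
import numpy as np
from scipy.integrate import solve_ivp
from sklearn.neighbors import NearestNeighbors

def lorenz(t, s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s
    return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

sol = solve_ivp(lorenz, (0, 100), [1.0, 1.0, 1.0],
                t_eval=np.linspace(0, 100, 5000))
series = sol.y[0]                               # convection amplitude x(t)

dim, delay = 3, 5                               # embedding dimension and delay
last = len(series) - (dim - 1) * delay - 1
emb = np.column_stack([series[i * delay:i * delay + last] for i in range(dim)])
target = series[(dim - 1) * delay + 1:(dim - 1) * delay + 1 + last]

train = 4000
nn = NearestNeighbors(n_neighbors=5).fit(emb[:train])
_, idx = nn.kneighbors(emb[train:])
pred = target[:train][idx].mean(axis=1)         # average evolution of neighbours
rmse = np.sqrt(np.mean((pred - target[train:]) ** 2))
print("one-step RMSE:", round(rmse, 3))
```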

  20. 3D Bioprinting of Tissue/Organ Models.

    Science.gov (United States)

    Pati, Falguni; Gantelius, Jesper; Svahn, Helene Andersson

    2016-04-04

    In vitro tissue/organ models are useful platforms that can facilitate systematic, repetitive, and quantitative investigations of drugs/chemicals. The primary objective when developing tissue/organ models is to reproduce physiologically relevant functions that typically require complex culture systems. Bioprinting offers exciting prospects for constructing 3D tissue/organ models, as it enables the reproducible, automated production of complex living tissues. Bioprinted tissues/organs may prove useful for screening novel compounds or predicting toxicity, as the spatial and chemical complexity inherent to native tissues/organs can be recreated. In this Review, we highlight the importance of developing 3D in vitro tissue/organ models by 3D bioprinting techniques, characterization of these models for evaluating their resemblance to native tissue, and their application in the prioritization of lead candidates, toxicity testing, and as disease/tumor models. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Multivariate time series modeling of selected childhood diseases in ...

    African Journals Online (AJOL)

    This paper is focused on modeling the five most prevalent childhood diseases in Akwa Ibom State using a multivariate approach to time series. An aggregate of 78,839 reported cases of malaria, upper respiratory tract infection (URTI), Pneumonia, anaemia and tetanus were extracted from five randomly selected hospitals in ...

  2. A Seasonal Time-Series Model Based on Gene Expression Programming for Predicting Financial Distress.

    Science.gov (United States)

    Cheng, Ching-Hsue; Chan, Chia-Pang; Yang, Jun-He

    2018-01-01

    The issue of financial distress prediction is an important and challenging research topic in the financial field. There are currently many methods for predicting firm bankruptcy and financial crisis, including artificial intelligence and traditional statistical methods, and past studies have shown that the prediction results of artificial intelligence methods are better than those of traditional statistical methods. Financial statements are quarterly reports; hence, the financial crisis of companies is seasonal time-series data, and the attribute data affecting the financial distress of companies is nonlinear and nonstationary time-series data with fluctuations. Therefore, this study employed a nonlinear attribute selection method to build a nonlinear financial distress prediction model: that is, this paper proposes a novel seasonal time-series gene expression programming model for predicting the financial distress of companies. The proposed model has several advantages, including the following: (i) the proposed model differs from previous models, which lack the concept of time series; (ii) the proposed integrated attribute selection method can find the core attributes and reduce high dimensional data; and (iii) the proposed model can generate the rules and mathematical formulas of financial distress, providing references to investors and decision makers. The results show that the proposed method is better than the listed classifiers under three criteria; hence, the proposed model has competitive advantages in predicting the financial distress of companies.

  3. Identification of human operator performance models utilizing time series analysis

    Science.gov (United States)

    Holden, F. M.; Shinners, S. M.

    1973-01-01

    The results of an effort performed by Sperry Systems Management Division for AMRL in applying time series analysis as a tool for modeling the human operator are presented. This technique is utilized for determining the variation of the human transfer function under various levels of stress. The human operator's model is determined based on actual input and output data from a tracking experiment.

  4. Cointegration and Error Correction Modelling in Time-Series Analysis: A Brief Introduction

    Directory of Open Access Journals (Sweden)

    Helmut Thome

    2015-07-01

    Full Text Available Criminological research is often based on time-series data showing some type of trend movement. Trending time series may correlate strongly even in cases where no causal relationship exists (spurious causality). To avoid this problem researchers often apply some technique of detrending their data, such as differencing the series. This approach, however, may bring up another problem: that of spurious non-causality. Both problems can, in principle, be avoided if the series under investigation are "difference-stationary" (if the trend movements are stochastic) and "cointegrated" (if the stochastically changing trend movements in different variables correspond to each other). The article gives a brief introduction to key instruments and interpretative tools applied in cointegration modelling.
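    A short illustration of the workflow, assuming statsmodels and simulated trending series in place of criminological data: an Engle-Granger cointegration test followed by a simple error-correction regression:

```python
# Sketch: test two I(1) series for cointegration, then estimate an
# error-correction regression on their differences.
import numpy as np
import statsmodels.api as sm
from statsmodels.tsa.stattools import coint

rng = np.random.default_rng(10)
common = np.cumsum(rng.normal(size=400))          # shared stochastic trend
x = common + rng.normal(scale=0.5, size=400)
y = 2.0 * common + rng.normal(scale=0.5, size=400)

t_stat, p_value, _ = coint(y, x)                  # Engle-Granger test
print("cointegration p-value:", round(p_value, 4))

# error-correction model: change in y explained by change in x and the
# lagged disequilibrium (residual of the long-run regression)
resid = sm.OLS(y, sm.add_constant(x)).fit().resid
dy, dx, ec = np.diff(y), np.diff(x), resid[:-1]
ecm = sm.OLS(dy, sm.add_constant(np.column_stack([dx, ec]))).fit()
print(ecm.params)   # the last coefficient is the error-correction term
```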

  5. Particle Markov Chain Monte Carlo Techniques of Unobserved Component Time Series Models Using Ox

    DEFF Research Database (Denmark)

    Nonejad, Nima

    This paper details Particle Markov chain Monte Carlo techniques for analysis of unobserved component time series models using several economic data sets. PMCMC combines the particle filter with the Metropolis-Hastings algorithm. Overall, PMCMC provides a very compelling, computationally fast and efficient framework for estimation. These advantages are used to, for instance, estimate stochastic volatility models with leverage effect or with Student-t distributed errors. We also model changing time series characteristics of the US inflation rate by considering a heteroskedastic ARFIMA model where...

  6. Applications of soft computing in time series forecasting simulation and modeling techniques

    CERN Document Server

    Singh, Pritpal

    2016-01-01

    This book reports on an in-depth study of fuzzy time series (FTS) modeling. It reviews and summarizes previous research work in FTS modeling and also provides a brief introduction to other soft-computing techniques, such as artificial neural networks (ANNs), rough sets (RS) and evolutionary computing (EC), focusing on how these techniques can be integrated into different phases of the FTS modeling approach. In particular, the book describes novel methods resulting from the hybridization of FTS modeling approaches with neural networks and particle swarm optimization. It also demonstrates how a new ANN-based model can be successfully applied in the context of predicting Indian summer monsoon rainfall. Thanks to its easy-to-read style and the clear explanations of the models, the book can be used as a concise yet comprehensive reference guide to fuzzy time series modeling, and will be valuable not only for graduate students, but also for researchers and professionals working for academic, business and governmen...

  7. Application of the Laplace transform method for computational modelling of radioactive decay series

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Deise L.; Damasceno, Ralf M.; Barros, Ricardo C. [Univ. do Estado do Rio de Janeiro (IME/UERJ) (Brazil). Programa de Pos-graduacao em Ciencias Computacionais

    2012-03-15

    It is well known that when spent fuel is removed from the core, it is still composed of a considerable amount of radioactive elements with significant half-lives. Most actinides, in particular plutonium, fall into this category, and have to be safely disposed of. One solution is to store the long-lived spent fuel as it is, by encasing and burying it deep underground in a stable geological formation. This implies estimating the transmutation of these radioactive elements with time. Therefore, we describe in this paper the application of the Laplace transform technique in matrix formulation to analytically solve initial value problems that mathematically model radioactive decay series. Given the initial amount of each type of radioactive isotope in the decay series, the computer code generates the amount at a given time of interest, or may plot a graph of the evolution in time of the amount of each type of isotope in the series. This computer code, which we refer to as the LTRad_L code, where L is the number of types of isotopes belonging to the series, was developed using the Scilab free platform for numerical computation and can model one segment or the entire chain of any of the three radioactive series existing on Earth today. Numerical results are given for typical model problems to illustrate the computer code's efficiency and accuracy. (orig.)

  8. Application of the Laplace transform method for computational modelling of radioactive decay series

    International Nuclear Information System (INIS)

    Oliveira, Deise L.; Damasceno, Ralf M.; Barros, Ricardo C.

    2012-01-01

    It is well known that when spent fuel is removed from the core, it is still composed of a considerable amount of radioactive elements with significant half-lives. Most actinides, in particular plutonium, fall into this category, and have to be safely disposed of. One solution is to store the long-lived spent fuel as it is, by encasing and burying it deep underground in a stable geological formation. This implies estimating the transmutation of these radioactive elements with time. Therefore, we describe in this paper the application of the Laplace transform technique in matrix formulation to analytically solve initial value problems that mathematically model radioactive decay series. Given the initial amount of each type of radioactive isotope in the decay series, the computer code generates the amount at a given time of interest, or may plot a graph of the evolution in time of the amount of each type of isotope in the series. This computer code, which we refer to as the LTRad_L code, where L is the number of types of isotopes belonging to the series, was developed using the Scilab free platform for numerical computation and can model one segment or the entire chain of any of the three radioactive series existing on Earth today. Numerical results are given for typical model problems to illustrate the computer code efficiency and accuracy. (orig.)

  9. 75 FR 55699 - Series LLCs and Cell Companies

    Science.gov (United States)

    2010-09-14

    ... organization, such as franchise fees or administrative costs) or of any other series of the series organization... the amount of State unemployment tax paid. In light of these issues, the proposed regulations do not...

  10. Time series modeling and forecasting using memetic algorithms for regime-switching models.

    Science.gov (United States)

    Bergmeir, Christoph; Triguero, Isaac; Molina, Daniel; Aznarte, José Luis; Benitez, José Manuel

    2012-11-01

    In this brief, we present a novel model fitting procedure for the neuro-coefficient smooth transition autoregressive model (NCSTAR), as presented by Medeiros and Veiga. The model is endowed with a statistically founded iterative building procedure and can be interpreted in terms of fuzzy rule-based systems. The interpretability of the generated models and a mathematically sound building procedure are two very important properties of forecasting models. The model fitting procedure employed by the original NCSTAR is a combination of initial parameter estimation by a grid search procedure with a traditional local search algorithm. We propose a different fitting procedure, using a memetic algorithm, in order to obtain more accurate models. An empirical evaluation of the method is performed, applying it to various real-world time series originating from three forecasting competitions. The results indicate that we can significantly enhance the accuracy of the models, making them competitive to models commonly used in the field.

  11. Nonlinear Fluctuation Behavior of Financial Time Series Model by Statistical Physics System

    Directory of Open Access Journals (Sweden)

    Wuyang Cheng

    2014-01-01

    Full Text Available We develop a random financial time series model of the stock market using one of the statistical physics systems, the stochastic contact interacting system. The contact process is a continuous-time Markov process; one interpretation of this model is as a model for the spread of an infection, where the epidemic spreading mimics the interplay of local infections and recovery of individuals. From this financial model, we study the statistical behaviors of return time series, and the corresponding behaviors of returns for the Shanghai Stock Exchange Composite Index (SSECI) and the Hang Seng Index (HSI) are also comparatively studied. Further, we investigate the Zipf distribution and multifractal phenomenon of returns and price changes. Zipf analysis and MF-DFA analysis are applied to investigate the nature of fluctuations in the stock market.

  12. A new model for reliability optimization of series-parallel systems with non-homogeneous components

    International Nuclear Information System (INIS)

    Feizabadi, Mohammad; Jahromi, Abdolhamid Eshraghniaye

    2017-01-01

    In discussions related to reliability optimization using redundancy allocation, one of the structures that has attracted the attention of many researchers is the series-parallel structure. In models previously presented for reliability optimization of series-parallel systems, there is a restricting assumption based on which all components of a subsystem must be homogeneous. This constraint limits system designers in selecting components and prevents achieving higher levels of reliability. In this paper, a new model is proposed for reliability optimization of series-parallel systems, which makes possible the use of non-homogeneous components in each subsystem. As a result of this flexibility, the process of supplying system components will be easier. To solve the proposed model, since the redundancy allocation problem (RAP) belongs to the NP-hard class of optimization problems, a genetic algorithm (GA) is developed. The computational results of the designed GA are indicative of high performance of the proposed model in increasing system reliability and decreasing costs. - Highlights: • In this paper, a new model is proposed for reliability optimization of series-parallel systems. • In the previous models, there is a restricting assumption based on which all components of a subsystem must be homogeneous. • The presented model provides a possibility for the subsystems’ components to be non-homogeneous in the required conditions. • The computational results demonstrate the high performance of the proposed model in improving reliability and reducing costs.
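
    The flexibility highlighted above (parallel subsystems whose redundant components need not be identical) can be illustrated with a short reliability calculation. The sketch below is not the paper's genetic algorithm; all component reliabilities are assumed values used only to show how a non-homogeneous series-parallel design is evaluated.

```python
# Minimal sketch (not the paper's genetic algorithm): reliability of a series-parallel
# system whose parallel subsystems may mix non-homogeneous components, i.e. redundant
# components with different reliabilities. All numbers are assumed for illustration.
from math import prod

def subsystem_reliability(component_reliabilities):
    """A parallel subsystem works if at least one of its components works."""
    return 1.0 - prod(1.0 - r for r in component_reliabilities)

def system_reliability(subsystems):
    """A series system works only if every subsystem works."""
    return prod(subsystem_reliability(comps) for comps in subsystems)

# Three subsystems in series; components inside a subsystem need not be identical.
design = [
    [0.90, 0.85],        # subsystem 1: two different component types
    [0.95],              # subsystem 2: a single component
    [0.80, 0.80, 0.70],  # subsystem 3: mixed redundancy
]
print(f"System reliability: {system_reliability(design):.4f}")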

  13. Neural network modeling of nonlinear systems based on Volterra series extension of a linear model

    Science.gov (United States)

    Soloway, Donald I.; Bialasiewicz, Jan T.

    1992-01-01

    A Volterra series approach was applied to the identification of nonlinear systems which are described by a neural network model. A procedure is outlined by which a mathematical model can be developed from experimental data obtained from the network structure. Applications of the results to the control of robotic systems are discussed.
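
    For readers unfamiliar with the expansion used above, the sketch below evaluates a truncated second-order discrete-time Volterra series. The kernels are assumed toy values for illustration, not kernels identified from a neural network as in the record.

```python
# Minimal sketch of a truncated second-order discrete-time Volterra series,
# y[n] = sum_k h1[k] x[n-k] + sum_{k1,k2} h2[k1,k2] x[n-k1] x[n-k2].
# The kernels h1 and h2 are assumed toy values, not kernels identified from a network.
import numpy as np

def volterra2(x, h1, h2):
    M = len(h1)                                   # kernel memory length
    y = np.zeros_like(x, dtype=float)
    for n in range(M - 1, len(x)):
        window = x[n - M + 1:n + 1][::-1]         # x[n], x[n-1], ..., x[n-M+1]
        y[n] = h1 @ window + window @ h2 @ window
    return y

rng = np.random.default_rng(0)
x = rng.standard_normal(200)                      # input signal
h1 = np.array([0.8, -0.3, 0.1])                   # first-order (linear) kernel
h2 = 0.05 * np.outer(h1, h1)                      # second-order kernel (assumed separable)
print(volterra2(x, h1, h2)[:5])
```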

  14. Modelling organic particles in the atmosphere

    International Nuclear Information System (INIS)

    Couvidat, Florian

    2012-01-01

    Organic aerosol formation in the atmosphere is investigated via the development of a new model named H2O (Hydrophilic/Hydrophobic Organics). First, a parameterization is developed to take into account secondary organic aerosol formation from isoprene oxidation. It takes into account the effect of nitrogen oxides on organic aerosol formation and the hydrophilic properties of the aerosols. This parameterization is then implemented in H2O along with some other developments and the results of the model are compared to organic carbon measurements over Europe. Model performance is greatly improved by taking into account emissions of primary semi-volatile compounds, which can form secondary organic aerosols after oxidation or can condense when temperature decreases. If those emissions are not taken into account, a significant underestimation of organic aerosol concentrations occurs in winter. The formation of organic aerosols over an urban area was also studied by simulating organic aerosol concentrations over the Paris area during the summer campaign of Megapoli (July 2009). H2O gives satisfactory results over the Paris area, although a peak of organic aerosol concentrations from traffic, which does not appear in the measurements, appears in the model simulation during rush hours. It could be due to an underestimation of the volatility of organic aerosols. It is also possible that primary and secondary organic compounds do not mix well together and that primary semi-volatile compounds do not condense on an organic aerosol that is mostly secondary and highly oxidized. Finally, the impact of aqueous-phase chemistry was studied. The mechanism for the formation of secondary organic aerosol includes in-cloud oxidation of glyoxal, methylglyoxal, methacrolein and methylvinylketone, formation of methyltetrols in the aqueous phase of particles and cloud droplets, and the in-cloud aging of organic aerosols. The impact of wet deposition is also studied to better estimate the

  15. Adaptive Anchoring Model: How Static and Dynamic Presentations of Time Series Influence Judgments and Predictions.

    Science.gov (United States)

    Kusev, Petko; van Schaik, Paul; Tsaneva-Atanasova, Krasimira; Juliusson, Asgeir; Chater, Nick

    2018-01-01

    When attempting to predict future events, people commonly rely on historical data. One psychological characteristic of judgmental forecasting of time series, established by research, is that when people make forecasts from series, they tend to underestimate future values for upward trends and overestimate them for downward ones, so-called trend-damping (modeled by anchoring on, and insufficient adjustment from, the average of recent time series values). Events in a time series can be experienced sequentially (dynamic mode), or they can also be retrospectively viewed simultaneously (static mode), not experienced individually in real time. In one experiment, we studied the influence of presentation mode (dynamic and static) on two sorts of judgment: (a) predictions of the next event (forecast) and (b) estimation of the average value of all the events in the presented series (average estimation). Participants' responses in dynamic mode were anchored on more recent events than in static mode for all types of judgment but with different consequences; hence, dynamic presentation improved prediction accuracy, but not estimation. These results are not anticipated by existing theoretical accounts; we develop and present an agent-based model-the adaptive anchoring model (ADAM)-to account for the difference between processing sequences of dynamically and statically presented stimuli (visually presented data). ADAM captures how variation in presentation mode produces variation in responses (and the accuracy of these responses) in both forecasting and judgment tasks. ADAM's model predictions for the forecasting and judgment tasks fit better with the response data than a linear-regression time series model. Moreover, ADAM outperformed autoregressive-integrated-moving-average (ARIMA) and exponential-smoothing models, while neither of these models accounts for people's responses on the average estimation task. Copyright © 2017 The Authors. Cognitive Science published by Wiley

  16. Physics constrained nonlinear regression models for time series

    International Nuclear Information System (INIS)

    Majda, Andrew J; Harlim, John

    2013-01-01

    A central issue in contemporary science is the development of data driven statistical nonlinear dynamical models for time series of partial observations of nature or a complex physical model. It has been established recently that ad hoc quadratic multi-level regression (MLR) models can have finite-time blow up of statistical solutions and/or pathological behaviour of their invariant measure. Here a new class of physics constrained multi-level quadratic regression models are introduced, analysed and applied to build reduced stochastic models from data of nonlinear systems. These models have the advantages of incorporating memory effects in time as well as the nonlinear noise from energy conserving nonlinear interactions. The mathematical guidelines for the performance and behaviour of these physics constrained MLR models as well as filtering algorithms for their implementation are developed here. Data driven applications of these new multi-level nonlinear regression models are developed for test models involving a nonlinear oscillator with memory effects and the difficult test case of the truncated Burgers–Hopf model. These new physics constrained quadratic MLR models are proposed here as process models for Bayesian estimation through Markov chain Monte Carlo algorithms of low frequency behaviour in complex physical data. (paper)

  17. A Seasonal Time-Series Model Based on Gene Expression Programming for Predicting Financial Distress

    Science.gov (United States)

    2018-01-01

    Financial distress prediction is an important and challenging research topic in the financial field. Currently, there have been many methods for predicting firm bankruptcy and financial crisis, including artificial intelligence and traditional statistical methods, and past studies have shown that the prediction result of the artificial intelligence method is better than that of the traditional statistical method. Financial statements are quarterly reports; hence, the financial crisis of companies is seasonal time-series data, and the attribute data affecting the financial distress of companies is nonlinear and nonstationary time-series data with fluctuations. Therefore, this study employed a nonlinear attribute selection method to build a nonlinear financial distress prediction model: that is, this paper proposed a novel seasonal time-series gene expression programming model for predicting the financial distress of companies. The proposed model has several advantages including the following: (i) the proposed model is different from previous models lacking the concept of time series; (ii) the proposed integrated attribute selection method can find the core attributes and reduce high-dimensional data; and (iii) the proposed model can generate the rules and mathematical formulas of financial distress for providing references to investors and decision makers. The result shows that the proposed method is better than the listed classifiers under three criteria; hence, the proposed model has competitive advantages in predicting the financial distress of companies. PMID:29765399

  18. A Seasonal Time-Series Model Based on Gene Expression Programming for Predicting Financial Distress

    Directory of Open Access Journals (Sweden)

    Ching-Hsue Cheng

    2018-01-01

    Full Text Available Financial distress prediction is an important and challenging research topic in the financial field. Currently, there have been many methods for predicting firm bankruptcy and financial crisis, including artificial intelligence and traditional statistical methods, and past studies have shown that the prediction result of the artificial intelligence method is better than that of the traditional statistical method. Financial statements are quarterly reports; hence, the financial crisis of companies is seasonal time-series data, and the attribute data affecting the financial distress of companies is nonlinear and nonstationary time-series data with fluctuations. Therefore, this study employed a nonlinear attribute selection method to build a nonlinear financial distress prediction model: that is, this paper proposed a novel seasonal time-series gene expression programming model for predicting the financial distress of companies. The proposed model has several advantages including the following: (i) the proposed model is different from previous models lacking the concept of time series; (ii) the proposed integrated attribute selection method can find the core attributes and reduce high-dimensional data; and (iii) the proposed model can generate the rules and mathematical formulas of financial distress for providing references to investors and decision makers. The result shows that the proposed method is better than the listed classifiers under three criteria; hence, the proposed model has competitive advantages in predicting the financial distress of companies.

  19. Modeling self-organization of novel organic materials

    Science.gov (United States)

    Sayar, Mehmet

    In this thesis, the structural organization of oligomeric multi-block molecules is analyzed by computational analysis of coarse-grained models. These molecules form nanostructures with different dimensionalities, and the nanostructured nature of these materials leads to novel structural properties at different length scales. Previously, a number of oligomeric triblock rodcoil molecules have been shown to self-organize into mushroom shaped noncentrosymmetric nanostructures. Interestingly, thin films of these molecules contain polar domains and a finite macroscopic polarization. However, the fully polarized state is not the equilibrium state. In the first chapter, by solving a model with dipolar and Ising-like short range interactions, we show that polar domains are stable in films composed of aggregates as opposed to isolated molecules. Unlike classical molecular systems, these nanoaggregates have large intralayer spacings (a ≈ 6 nm), leading to a reduction in the repulsive dipolar interactions that oppose polar order within layers. This enables the formation of a striped pattern with polar domains of alternating directions. The energies of the possible structures at zero temperature are computed exactly and results of Monte Carlo simulations are provided at non-zero temperatures. In the second chapter, the macroscopic polarization of such nanostructured films is analyzed in the presence of a short range surface interaction. The surface interaction leads to a periodic domain structure where the balance between the up and down domains is broken, and therefore films of finite thickness have a net macroscopic polarization. The polarization per unit volume is a function of film thickness and strength of the surface interaction. Finally, in chapter three, self-organization of organic molecules into a network of one dimensional objects is analyzed. Multi-block organic dendron rodcoil molecules were found to self-organize into supramolecular nanoribbons (threads) and

  20. Time series models of environmental exposures: Good predictions or good understanding.

    Science.gov (United States)

    Barnett, Adrian G; Stephen, Dimity; Huang, Cunrui; Wolkewitz, Martin

    2017-04-01

    Time series data are popular in environmental epidemiology as they make use of the natural experiment of how changes in exposure over time might impact on disease. Many published time series papers have used parameter-heavy models that fully explained the second order patterns in disease to give residuals that have no short-term autocorrelation or seasonality. This is often achieved by including predictors of past disease counts (autoregression) or seasonal splines with many degrees of freedom. These approaches give great residuals, but add little to our understanding of cause and effect. We argue that modelling approaches should rely more on good epidemiology and less on statistical tests. This includes thinking about causal pathways, making potential confounders explicit, fitting a limited number of models, and not over-fitting at the cost of under-estimating the true association between exposure and disease. Copyright © 2017 Elsevier Inc. All rights reserved.

  1. A regional and nonstationary model for partial duration series of extreme rainfall

    DEFF Research Database (Denmark)

    Gregersen, Ida Bülow; Madsen, Henrik; Rosbjerg, Dan

    2017-01-01

    ... of extreme rainfall. The framework is built on a partial duration series approach with a nonstationary, regional threshold value. The model is based on generalized linear regression solved by generalized estimation equations. It allows a spatial correlation between the stations in the network and accounts furthermore for variable observation periods at each station and in each year. Marginal regional and temporal regression models solved by generalized least squares are used to validate and discuss the results of the full spatiotemporal model. The model is applied on data from a large Danish rain gauge network ... as the explanatory variables in the regional and temporal domain, respectively. Further analysis of partial duration series with nonstationary and regional thresholds shows that the mean exceedances also exhibit a significant variation in space and time for some rainfall durations, while the shape parameter is found...

  2. Book Review: "Hidden Markov Models for Time Series: An ...

    African Journals Online (AJOL)

    Hidden Markov Models for Time Series: An Introduction using R, by Walter Zucchini and Iain L. MacDonald. Chapman & Hall (CRC Press), 2009. http://dx.doi.org/10.4314/saaj.v10i1.61717

  3. Comparison of ARIMA and Random Forest time series models for prediction of avian influenza H5N1 outbreaks.

    Science.gov (United States)

    Kane, Michael J; Price, Natalie; Scotch, Matthew; Rabinowitz, Peter

    2014-08-13

    Time series models can play an important role in disease prediction. Incidence data can be used to predict the future occurrence of disease events. Developments in modeling approaches provide an opportunity to compare different time series models for predictive power. We applied ARIMA and Random Forest time series models to incidence data of outbreaks of highly pathogenic avian influenza (H5N1) in Egypt, available through the online EMPRES-I system. We found that the Random Forest model outperformed the ARIMA model in predictive ability. Furthermore, we found that the Random Forest model is effective for predicting outbreaks of H5N1 in Egypt. Random Forest time series modeling provides enhanced predictive ability over existing time series models for the prediction of infectious disease outbreaks. This result, along with those showing the concordance between bird and human outbreaks (Rabinowitz et al. 2012), provides a new approach to predicting these dangerous outbreaks in bird populations based on existing, freely available data. Our analysis uncovers the time-series structure of outbreak severity for highly pathogenic avian influenza (H5N1) in Egypt.
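
    A minimal sketch of the comparison idea, on synthetic weekly counts rather than the EMPRES-I incidence data: fit an ARIMA model and a Random Forest on lagged values of the same series and compare one-step-ahead forecasts on a hold-out period. The series, model orders, and lag length are assumptions for illustration.

```python
# Sketch of the comparison idea on synthetic weekly counts (not the EMPRES-I data):
# fit an ARIMA model and a Random Forest on lagged values of the same series, then
# compare one-step-ahead forecasts on a hold-out period.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(1)
y = np.maximum(0, 10 + 5 * np.sin(np.arange(200) * 2 * np.pi / 52) + rng.normal(0, 2, 200))
train, test = y[:180], y[180:]

arima_pred = ARIMA(train, order=(2, 0, 1)).fit().forecast(steps=len(test))

def lag_matrix(series, lags=4):
    """Rows of [y[t-lags], ..., y[t-1]] paired with target y[t]."""
    X = np.column_stack([series[i:len(series) - lags + i] for i in range(lags)])
    return X, series[lags:]

X_train, y_train = lag_matrix(train)
rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)

history, rf_pred = list(train), []
for obs in test:                                   # one-step-ahead, rolling forward
    rf_pred.append(rf.predict(np.array(history[-4:]).reshape(1, -1))[0])
    history.append(obs)

print("ARIMA MAE:", mean_absolute_error(test, arima_pred))
print("RF    MAE:", mean_absolute_error(test, rf_pred))
```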

  4. Hybrid perturbation methods based on statistical time series models

    Science.gov (United States)

    San-Juan, Juan Félix; San-Martín, Montserrat; Pérez, Iván; López, Rosario

    2016-04-01

    In this work we present a new methodology for orbit propagation, the hybrid perturbation theory, based on the combination of an integration method and a prediction technique. The former, which can be a numerical, analytical or semianalytical theory, generates an initial approximation that contains some inaccuracies derived from the fact that, in order to simplify the expressions and subsequent computations, not all the involved forces are taken into account and only low-order terms are considered, not to mention the fact that mathematical models of perturbations do not always reproduce physical phenomena with absolute precision. The prediction technique, which can be based on either statistical time series models or computational intelligence methods, is aimed at modelling and reproducing missing dynamics in the previously integrated approximation. This combination results in the precision improvement of conventional numerical, analytical and semianalytical theories for determining the position and velocity of any artificial satellite or space debris object. In order to validate this methodology, we present a family of three hybrid orbit propagators formed by the combination of three different orders of approximation of an analytical theory and a statistical time series model, and analyse their capability to process the effect produced by the flattening of the Earth. The three considered analytical components are the integration of the Kepler problem, and first-order and second-order analytical theories, whereas the prediction technique is the same in the three cases, namely an additive Holt-Winters method.
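
    The additive Holt-Winters method named above as the prediction component can be sketched as follows on a generic trending, periodic error signal; this is not the authors' propagator, and the synthetic signal and seasonal period of 50 samples are assumptions for illustration.

```python
# Sketch of the additive Holt-Winters technique named above as the prediction component,
# applied to a generic trending, periodic error signal (not the authors' propagator);
# the synthetic signal and seasonal period of 50 samples are assumptions.
import numpy as np
from statsmodels.tsa.holtwinters import ExponentialSmoothing

rng = np.random.default_rng(2)
t = np.arange(300)
signal = 0.01 * t + 0.5 * np.sin(2 * np.pi * t / 50) + rng.normal(0, 0.05, t.size)

model = ExponentialSmoothing(
    signal,
    trend="add",            # additive trend component
    seasonal="add",         # additive seasonality (Holt-Winters additive method)
    seasonal_periods=50,
).fit()

print(model.forecast(30)[:5])   # predict the "missing dynamics" 30 steps ahead
```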

  5. Estimating the basic reproduction rate of HFMD using the time series SIR model in Guangdong, China.

    Directory of Open Access Journals (Sweden)

    Zhicheng Du

    Full Text Available Hand, foot, and mouth disease (HFMD) has caused a substantial burden of disease in China, especially in Guangdong Province. Based on notifiable cases, we use the time series Susceptible-Infected-Recovered model to estimate the basic reproduction rate (R0) and the herd immunity threshold, understanding the transmission and persistence of HFMD more completely for efficient intervention in this province. The standardized difference between the reported and fitted time series of HFMD was 0.009 (<0.2). The median basic reproduction rates of total, enterovirus 71, and coxsackievirus 16 cases in Guangdong were 4.621 (IQR: 3.907-5.823), 3.023 (IQR: 2.289-4.292) and 7.767 (IQR: 6.903-10.353), respectively. The heatmap of R0 showed semiannual peaks of activity, including a major peak in spring and early summer (about the 12th week) followed by a smaller peak in autumn (about the 36th week). The county-level model showed that Longchuan (R0 = 33), Gaozhou (R0 = 24), Huazhou (R0 = 23) and Qingxin (R0 = 19) counties have higher basic reproduction rates than other counties in the province. The epidemic of HFMD in Guangdong Province is still grim, and strategies like the World Health Organization's expanded program on immunization need to be implemented. An elimination of HFMD in Guangdong might need a herd immunity threshold of 78%.

  6. High-order fuzzy time-series based on multi-period adaptation model for forecasting stock markets

    Science.gov (United States)

    Chen, Tai-Liang; Cheng, Ching-Hsue; Teoh, Hia-Jong

    2008-02-01

    Stock investors usually make their short-term investment decisions according to recent stock information such as the late market news, technical analysis reports, and price fluctuations. To reflect these short-term factors which impact stock price, this paper proposes a comprehensive fuzzy time-series, which factors linear relationships between recent periods of stock prices and fuzzy logical relationships (nonlinear relationships) mined from time-series into forecasting processes. In empirical analysis, the TAIEX (Taiwan Stock Exchange Capitalization Weighted Stock Index) and HSI (Hang Seng Index) are employed as experimental datasets, and four recent fuzzy time-series models, Chen’s (1996), Yu’s (2005), Cheng’s (2006) and Chen’s (2007), are used as comparison models. Besides, to compare with a conventional statistical method, the method of least squares is utilized to estimate the auto-regressive models of the testing periods within the databases. From analysis results, the performance comparisons indicate that the multi-period adaptation model, proposed in this paper, can effectively improve the forecasting performance of conventional fuzzy time-series models which only factor fuzzy logical relationships in forecasting processes. From the empirical study, the traditional statistical method and the proposed model both reveal that stock price patterns in the Taiwan stock and Hong Kong stock markets are short-term.

  7. Analysis of Data from a Series of Events by a Geometric Process Model

    Institute of Scientific and Technical Information of China (English)

    Yeh Lam; Li-xing Zhu; Jennifer S. K. Chan; Qun Liu

    2004-01-01

    The geometric process was first introduced by Lam [10,11]. A stochastic process {Xi, i = 1, 2, …} is called a geometric process (GP) if, for some a > 0, {a^(i-1) Xi, i = 1, 2, …} forms a renewal process. In this paper, the GP is used to analyze the data from a series of events. A nonparametric method is introduced for the estimation of the three parameters in the GP. The limiting distributions of the three estimators are studied. Through the analysis of some real data sets, the GP model is compared with three other homogeneous and nonhomogeneous Poisson models. It seems that on average the GP model is the best model among these four models in analyzing the data from a series of events.
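
    One simple estimation idea consistent with the GP definition (not necessarily the paper's exact nonparametric procedure): since the terms a^(i-1) Xi are i.i.d., log Xi is linear in (i-1) with slope -log a, so ordinary least squares recovers a. The simulated data and parameter values below are assumptions for illustration.

```python
# One simple estimator consistent with the GP definition (not necessarily the paper's
# exact procedure): since {a**(i-1) * X_i} is i.i.d., log X_i = log Y_i - (i-1) log a,
# so a least-squares fit of log X_i against (i-1) estimates -log a. Data are simulated.
import numpy as np

rng = np.random.default_rng(3)
a_true, n = 1.05, 100
Y = rng.exponential(scale=10.0, size=n)            # underlying renewal (i.i.d.) sequence
i = np.arange(n)                                   # i - 1 = 0, 1, 2, ...
X = Y / a_true**i                                  # observed geometric process data

slope, intercept = np.polyfit(i, np.log(X), 1)
a_hat = np.exp(-slope)
Y_hat = X * a_hat**i                               # back out the approximately i.i.d. terms

print(f"estimated a    : {a_hat:.4f}  (true {a_true})")
print(f"estimated mean : {Y_hat.mean():.3f}  (true 10.0)")
```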

  8. Modeling pollen time series using seasonal-trend decomposition procedure based on LOESS smoothing.

    Science.gov (United States)

    Rojo, Jesús; Rivero, Rosario; Romero-Morte, Jorge; Fernández-González, Federico; Pérez-Badia, Rosa

    2017-02-01

    Analysis of airborne pollen concentrations provides valuable information on plant phenology and is thus a useful tool in agriculture (for predicting harvests in crops such as the olive and for deciding when to apply phytosanitary treatments) as well as in medicine and the environmental sciences. Variations in airborne pollen concentrations, moreover, are indicators of changing plant life cycles. By modeling pollen time series, we can not only identify the variables influencing pollen levels but also predict future pollen concentrations. In this study, airborne pollen time series were modeled using a seasonal-trend decomposition procedure based on locally weighted scatterplot smoothing (LOESS), known as STL. The data series, daily Poaceae pollen concentrations over the period 2006-2014, was broken up into seasonal and residual (stochastic) components. The seasonal component was compared with data on Poaceae flowering phenology obtained by field sampling. Residuals were fitted to a model generated from daily temperature and rainfall values, and daily pollen concentrations, using partial least squares regression (PLSR). This method was then applied to predict daily pollen concentrations for 2014 (independent validation data) using results for the seasonal component of the time series and estimates of the residual component for the period 2006-2013. Correlation between predicted and observed values was r = 0.79 (correlation coefficient) for the pre-peak period (i.e., the period prior to the peak pollen concentration) and r = 0.63 for the post-peak period. Separate analysis of each of the components of the pollen data series enables the sources of variability to be identified more accurately than by analysis of the original non-decomposed data series, and for this reason, this procedure has proved to be a suitable technique for analyzing the main environmental factors influencing airborne pollen concentrations.
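
    A minimal sketch of the decomposition step on a synthetic pollen-like daily series (not the authors' data or their PLSR step): STL splits the series into seasonal, trend, and residual components, and the residual component is what would then be regressed on temperature and rainfall.

```python
# Sketch of the decomposition step on a synthetic pollen-like daily series (not the
# authors' data or their PLSR step): STL splits the series into seasonal, trend and
# residual parts; the residual is what would then be regressed on weather predictors.
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import STL

rng = np.random.default_rng(4)
days = pd.date_range("2006-01-01", "2013-12-31", freq="D")
doy = days.dayofyear.to_numpy()
pollen = np.maximum(0, 80 * np.exp(-((doy - 150) / 25.0) ** 2) + rng.normal(0, 5, days.size))
series = pd.Series(pollen, index=days)

result = STL(series, period=365, robust=True).fit()
print(result.resid.describe())      # residual component, to be modelled with e.g. PLS regression
```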

  9. A stochastic HMM-based forecasting model for fuzzy time series.

    Science.gov (United States)

    Li, Sheng-Tun; Cheng, Yi-Chung

    2010-10-01

    Recently, fuzzy time series have attracted more academic attention than traditional time series due to their capability of dealing with the uncertainty and vagueness inherent in the data collected. The formulation of fuzzy relations is one of the key issues affecting forecasting results. Most of the present works adopt IF-THEN rules for relationship representation, which leads to higher computational overhead and rule redundancy. Sullivan and Woodall proposed a Markov-based formulation and a forecasting model to reduce computational overhead; however, its applicability is limited to handling one-factor problems. In this paper, we propose a novel forecasting model based on the hidden Markov model by enhancing Sullivan and Woodall's work to allow handling of two-factor forecasting problems. Moreover, in order to make the nature of conjecture and randomness of forecasting more realistic, the Monte Carlo method is adopted to estimate the outcome. To test the effectiveness of the resulting stochastic model, we conduct two experiments and compare the results with those from other models. The first experiment consists of forecasting the daily average temperature and cloud density in Taipei, Taiwan, and the second experiment is based on the Taiwan Weighted Stock Index by forecasting the exchange rate of the New Taiwan dollar against the U.S. dollar. In addition to improving forecasting accuracy, the proposed model adheres to the central limit theorem, and thus, the result statistically approximates to the real mean of the target value being forecast.

  10. Mathematical Model of Thyristor Inverter Including a Series-parallel Resonant Circuit

    OpenAIRE

    Miroslaw Luft; Elzbieta Szychta

    2008-01-01

    The article presents a mathematical model of a thyristor inverter including a series-parallel resonant circuit with the aid of the state variable method. Maple procedures are used to compute current and voltage waveforms in the inverter.

  11. Chain length dependence of the critical density of organic homologous series

    DEFF Research Database (Denmark)

    Kontogeorgis, Georgios M.; Fredenslund, Aage; Tassios, Dimitrios P.

    1995-01-01

    Whether the critical density of organic compounds belonging to a certain homologous series increases or decreases with (increasing) molecular weight has been a challenging question over the years. Two sets of experimental data have recently appeared in the literature for the critical density of n-alkanes: Steele's data (up to n-decane) suggest that critical density increases with carbon number and reaches a limiting value. On the other hand, the data of Teja et al., 1990, which cover a broader range of n-alkanes (up to n-octadecane), reveal a decreasing trend of the critical density after a maximum at n-heptane. Teja et al. have also presented critical density measurements for 1-alkenes (up to 1-decene) and 1-alkanols (up to 1-undecanol). These data follow the same decreasing trend with the molecular weight as n-alkanes. This trend is not in agreement with the predictions of most group-contribution methods

  12. 75 FR 6865 - Airworthiness Directives; The Boeing Company Model 737-700 (IGW) Series Airplanes Equipped With...

    Science.gov (United States)

    2010-02-12

    ... replacing aging float level switch conduit assemblies, periodically inspecting the external dry bay system... Model 737-700 (IGW) Series Airplanes Equipped With Auxiliary Fuel Tanks Installed in Accordance With... airworthiness directive (AD) for certain Model 737-700 (IGW) series airplanes. This proposed AD would require...

  13. Selection of organic chemicals for subsurface transport. Subsurface transport program interaction seminar series. Summary

    International Nuclear Information System (INIS)

    Zachara, J.M.; Wobber, F.J.

    1984-11-01

    Model compounds are finding increasing use in environmental research. These individual compounds are selected as surrogates of important contaminants present in energy/defense wastes and their leachates and are used separately or as mixtures in research to define the anticipated or "model" environmental behavior of key waste components and to probe important physicochemical mechanisms involved in transport and fate. A seminar was held in Germantown, Maryland, April 24-25, 1984 to discuss the nature of model organic compounds being used for subsurface transport research. The seminar included participants experienced in the fields of environmental chemistry, microbiology, geohydrology, biology, and analytical chemistry. The objectives of the seminar were two-fold: (1) to review the rationale for the selection of organic compounds adopted by research groups working on the subsurface transport of organics, and (2) to evaluate the use of individual compounds to bracket the behavior of compound classes and compound constructs to approximate the behavior of complex organic mixtures

  14. Seasonal trend analysis and ARIMA modeling of relative humidity and wind speed time series around Yamula Dam

    Science.gov (United States)

    Eymen, Abdurrahman; Köylü, Ümran

    2018-02-01

    Local climate change is determined by analysis of long-term recorded meteorological data. In the statistical analysis of the meteorological data, the Mann-Kendall rank test, which is one of the non-parametric tests, has been used; on the other hand, for determining the magnitude of the trend, the Theil-Sen method has been used on the data obtained from 16 meteorological stations. The stations cover the provinces of Kayseri, Sivas, Yozgat, and Nevşehir in the Central Anatolia region of Turkey. Changes in land-use affect local climate. Dams are structures that cause major changes on the land. Yamula Dam is located 25 km northwest of Kayseri. The dam has a huge water body of approximately 85 km2. The mentioned tests have been used for detecting the presence of any positive or negative trend in meteorological data. The meteorological data in relation to the seasonal average, maximum, and minimum values of the relative humidity and seasonal average wind speed have been organized as time series and the tests have been conducted accordingly. As a result of these tests, the following have been identified: an increase was observed in minimum relative humidity values in the spring, summer, and autumn seasons. As for the seasonal average wind speed, a decrease was detected for nine stations in all seasons, whereas an increase was observed in four stations. After the trend analysis, pre-dam mean relative humidity time series were modeled with an Autoregressive Integrated Moving Average (ARIMA) model, which is a statistical modeling tool. Post-dam relative humidity values were predicted by ARIMA models.
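
    The two trend tools named above can be sketched as follows for a generic seasonal-mean humidity series. The data are assumed for illustration, and corrections for ties and serial correlation are omitted for brevity.

```python
# Sketch of the two trend tools named above on an assumed seasonal-mean humidity series:
# the Mann-Kendall S statistic with its normal approximation (no tie or autocorrelation
# correction), and the Theil-Sen slope estimate.
import numpy as np
from scipy import stats

def mann_kendall(x):
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    p = 2 * (1 - stats.norm.cdf(abs(z)))            # two-sided p-value
    return s, z, p

rng = np.random.default_rng(5)
years = np.arange(1985, 2015)
humidity = 55 + 0.12 * (years - years[0]) + rng.normal(0, 1.5, years.size)

s, z, p = mann_kendall(humidity)
slope, intercept, lo, up = stats.theilslopes(humidity, years)
print(f"Mann-Kendall: S={s:.0f}, Z={z:.2f}, p={p:.4f}")
print(f"Theil-Sen slope: {slope:.3f} per year (95% CI {lo:.3f} to {up:.3f})")
```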

  15. Univariate models of the air temperature series

    International Nuclear Information System (INIS)

    Leon Aristizabal, Gloria Esperanza

    2000-01-01

    The theoretical framework for the study of the air temperature time series is the theory of stochastic processes, particularly those known as ARIMA models, which make it possible to carry out a univariate analysis. ARIMA models are built in order to explain the structure of the monthly temperatures corresponding to the mean, the absolute maximum, absolute minimum, maximum mean and minimum mean temperatures, for four stations in Colombia. By means of these models, the possible evolution of these variables is estimated for predictive purposes. The application and utility of the models is discussed

  16. Mathematical model of thyristor inverter including a series-parallel resonant circuit

    OpenAIRE

    Luft, M.; Szychta, E.

    2008-01-01

    The article presents a mathematical model of thyristor inverter including a series-parallel resonant circuit with the aid of state variable method. Maple procedures are used to compute current and voltage waveforms in the inverter.

  17. A mixing-model approach to quantifying sources of organic matter to salt marsh sediments

    Science.gov (United States)

    Bowles, K. M.; Meile, C. D.

    2010-12-01

    Salt marshes are highly productive ecosystems, where autochthonous production controls an intricate exchange of carbon and energy among organisms. The major sources of organic carbon to these systems include 1) autochthonous production by vascular plant matter, 2) import of allochthonous plant material, and 3) phytoplankton biomass. Quantifying the relative contribution of organic matter sources to a salt marsh is important for understanding the fate and transformation of organic carbon in these systems, which also impacts the timing and magnitude of carbon export to the coastal ocean. A common approach to quantify organic matter source contributions to mixtures is the use of linear mixing models. To estimate the relative contributions of endmember materials to total organic matter in the sediment, the problem is formulated as a constrained linear least-squares problem. However, the type of data that is utilized in such mixing models, the uncertainties in endmember compositions and the temporal dynamics of non-conservative entities can have varying effects on the results. Making use of a comprehensive data set that encompasses several endmember characteristics - including a yearlong degradation experiment - we study the impact of these factors on estimates of the origin of sedimentary organic carbon in a saltmarsh located in the SE United States. We first evaluate the sensitivity of linear mixing models to the type of data employed by analyzing a series of mixing models that utilize various combinations of parameters (i.e. endmember characteristics such as δ13COC, C/N ratios or lignin content). Next, we assess the importance of using more than the minimum number of parameters required to estimate endmember contributions to the total organic matter pool. Then, we quantify the impact of data uncertainty on the outcome of the analysis using Monte Carlo simulations and accounting for the uncertainty in endmember characteristics. Finally, as biogeochemical processes
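
    A minimal sketch of the constrained linear least-squares formulation described above, using assumed (hypothetical) endmember tracer values rather than the study's measurements: endmember fractions are forced to be non-negative and to sum to one.

```python
# Sketch of the constrained linear least-squares mixing model described above, with
# assumed (hypothetical) endmember tracer values rather than the study's measurements:
# fractions are forced to be non-negative and to sum to one.
import numpy as np
from scipy.optimize import lsq_linear

# Columns: vascular plant matter, allochthonous plant material, phytoplankton (assumed values)
endmembers = np.array([
    [-13.0, -28.0, -21.0],   # delta13C of organic carbon (per mil)
    [ 40.0,  25.0,   7.0],   # C/N ratio
])
mixture = np.array([-19.2, 23.8])                  # measured sediment values (assumed)

# Append the sum-to-one constraint as a heavily weighted extra equation.
w = 1.0e3
A = np.vstack([endmembers, w * np.ones(3)])
b = np.append(mixture, w * 1.0)

res = lsq_linear(A, b, bounds=(0.0, 1.0))          # non-negativity (and <= 1) bounds
print("endmember fractions:", np.round(res.x, 3), " sum =", round(res.x.sum(), 3))
```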

  18. Advanced methods for modeling water-levels and estimating drawdowns with SeriesSEE, an Excel add-in

    Science.gov (United States)

    Halford, Keith; Garcia, C. Amanda; Fenelon, Joe; Mirus, Benjamin B.

    2012-12-21

    Water-level modeling is used for multiple-well aquifer tests to reliably differentiate pumping responses from natural water-level changes in wells, or “environmental fluctuations.” Synthetic water levels are created during water-level modeling and represent the summation of multiple component fluctuations, including those caused by environmental forcing and pumping. Pumping signals are modeled by transforming step-wise pumping records into water-level changes by using superimposed Theis functions. Water levels can be modeled robustly with this Theis-transform approach because environmental fluctuations and pumping signals are simulated simultaneously. Water-level modeling with Theis transforms has been implemented in the program SeriesSEE, which is a Microsoft® Excel add-in. Moving average, Theis, pneumatic-lag, and gamma functions transform time series of measured values into water-level model components in SeriesSEE. Earth tides and step transforms are additional computed water-level model components. Water-level models are calibrated by minimizing a sum-of-squares objective function where singular value decomposition and Tikhonov regularization stabilize results. Drawdown estimates from a water-level model are the summation of all Theis transforms minus residual differences between synthetic and measured water levels. The accuracy of drawdown estimates is limited primarily by noise in the data sets, not the Theis-transform approach. Drawdowns much smaller than environmental fluctuations have been detected across major fault structures, at distances of more than 1 mile from the pumping well, and with limited pre-pumping and recovery data at sites across the United States. In addition to water-level modeling, utilities exist in SeriesSEE for viewing, cleaning, manipulating, and analyzing time-series data.
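
    The Theis-transform idea, turning a step-wise pumping record into a drawdown series by superposition, can be sketched as below. This is not the SeriesSEE implementation, and the aquifer properties and pumping schedule are assumed values.

```python
# Sketch of the Theis-transform idea (not the SeriesSEE implementation): superpose one
# Theis well function per pumping-rate change to turn a step-wise pumping record into a
# drawdown series. Transmissivity, storativity, radius and the schedule are assumed.
import numpy as np
from scipy.special import exp1      # exponential integral E1(u) = Theis well function W(u)

T, S, r = 500.0, 1.0e-4, 100.0      # transmissivity (m2/d), storativity (-), radius (m)

def theis_drawdown(t, t_step, dQ):
    """Drawdown at times t from a rate change dQ (m3/d) that starts at t_step (d)."""
    dt = t - t_step
    u = r**2 * S / (4.0 * T * np.where(dt > 0, dt, np.inf))
    return np.where(dt > 0, dQ / (4.0 * np.pi * T) * exp1(u), 0.0)

# 1000 m3/d from day 0, increased to 1500 m3/d at day 10, pump off at day 20.
steps = [(0.0, 1000.0), (10.0, 500.0), (20.0, -1500.0)]
t = np.linspace(0.1, 30.0, 7)
s = sum(theis_drawdown(t, t0, dq) for t0, dq in steps)
print(np.round(s, 3))               # synthetic drawdown (m) at the observation times
```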

  19. Applying ARIMA model for annual volume time series of the Magdalena River

    OpenAIRE

    Gloria Amaris; Humberto Ávila; Thomas Guerrero

    2017-01-01

    Context: Climate change effects, human interventions, and river characteristics are factors that increase the risk to the population and to water resources. However, negative impacts such as flooding and river droughts may be identified in advance using appropriate numerical tools. Objectives: The annual volume (millions of m3/year) time series of the Magdalena River was analyzed by an ARIMA model, using the historical time series of the Calamar station (Instituto de Hidrología, Meteoro...

  20. A Course in Time Series Analysis

    CERN Document Server

    Peña, Daniel; Tsay, Ruey S

    2011-01-01

    New statistical methods and future directions of research in time series A Course in Time Series Analysis demonstrates how to build time series models for univariate and multivariate time series data. It brings together material previously available only in the professional literature and presents a unified view of the most advanced procedures available for time series model building. The authors begin with basic concepts in univariate time series, providing an up-to-date presentation of ARIMA models, including the Kalman filter, outlier analysis, automatic methods for building ARIMA models, a

  1. A linear solvation energy relationship model of organic chemical partitioning to dissolved organic carbon.

    Science.gov (United States)

    Kipka, Undine; Di Toro, Dominic M

    2011-09-01

    Predicting the association of contaminants with both particulate and dissolved organic matter is critical in determining the fate and bioavailability of chemicals in environmental risk assessment. To date, the association of a contaminant to particulate organic matter is considered in many multimedia transport models, but the effect of dissolved organic matter is typically ignored due to a lack of either reliable models or experimental data. The partition coefficient to dissolved organic carbon (KDOC) may be used to estimate the fraction of a contaminant that is associated with dissolved organic matter. Models relating KDOC to the octanol-water partition coefficient (KOW) have not been successful for many types of dissolved organic carbon in the environment. Instead, linear solvation energy relationships are proposed to model the association of chemicals with dissolved organic matter. However, more chemically diverse KDOC data are needed to produce a more robust model. For humic acid dissolved organic carbon, the linear solvation energy relationship predicts log KDOC with a root mean square error of 0.43. Copyright © 2011 SETAC.

  2. Mathematical Model of Thyristor Inverter Including a Series-parallel Resonant Circuit

    Directory of Open Access Journals (Sweden)

    Miroslaw Luft

    2008-01-01

    Full Text Available The article presents a mathematical model of a thyristor inverter including a series-parallel resonant circuit with the aid of the state variable method. Maple procedures are used to compute current and voltage waveforms in the inverter.

  3. Dynamic data-driven integrated flare model based on self-organized criticality

    Science.gov (United States)

    Dimitropoulou, M.; Isliker, H.; Vlahos, L.; Georgoulis, M. K.

    2013-05-01

    Context. We interpret solar flares as events originating in active regions that have reached the self-organized critical state. We describe them with a dynamic integrated flare model whose initial conditions and driving mechanism are derived from observations. Aims: We investigate whether well-known scaling laws observed in the distribution functions of characteristic flare parameters are reproduced after the self-organized critical state has been reached. Methods: To investigate whether the distribution functions of total energy, peak energy, and event duration follow the expected scaling laws, we first applied the previously reported static cellular automaton model to a time series of seven solar vector magnetograms of the NOAA active region 8210 recorded by the Imaging Vector Magnetograph on May 1 1998 between 18:59 UT and 23:16 UT until the self-organized critical state was reached. We then evolved the magnetic field between these processed snapshots through spline interpolation, mimicking a natural driver in our dynamic model. We identified magnetic discontinuities that exceeded a threshold in the Laplacian of the magnetic field after each interpolation step. These discontinuities were relaxed in local diffusion events, implemented in the form of cellular automaton evolution rules. Subsequent interpolation and relaxation steps covered all transitions until the end of the processed magnetograms' sequence. We additionally advanced each magnetic configuration that has reached the self-organized critical state (SOC configuration) by the static model until 50 more flares were triggered, applied the dynamic model again to the new sequence, and repeated the same process sufficiently often to generate adequate statistics. Physical requirements, such as the divergence-free condition for the magnetic field, were approximately imposed. Results: We obtain robust power laws in the distribution functions of the modeled flaring events with scaling indices that agree well

  4. A series connection architecture for large-area organic photovoltaic modules with a 7.5% module efficiency.

    Science.gov (United States)

    Hong, Soonil; Kang, Hongkyu; Kim, Geunjin; Lee, Seongyu; Kim, Seok; Lee, Jong-Hoon; Lee, Jinho; Yi, Minjin; Kim, Junghwan; Back, Hyungcheol; Kim, Jae-Ryoung; Lee, Kwanghee

    2016-01-05

    The fabrication of organic photovoltaic modules via printing techniques has been the greatest challenge for their commercial manufacture. Current module architecture, which is based on a monolithic geometry consisting of serially interconnecting stripe-patterned subcells with finite widths, requires highly sophisticated patterning processes that significantly increase the complexity of printing production lines and cause serious reductions in module efficiency due to so-called aperture loss in series connection regions. Herein we demonstrate an innovative module structure that can simultaneously reduce both patterning processes and aperture loss. By using a charge recombination feature that occurs at contacts between electron- and hole-transport layers, we devise a series connection method that facilitates module fabrication without patterning the charge transport layers. With the successive deposition of component layers using slot-die and doctor-blade printing techniques, we achieve a high module efficiency reaching 7.5% with an area of 4.15 cm2.

  5. Time series ARIMA models for daily price of palm oil

    Science.gov (United States)

    Ariff, Noratiqah Mohd; Zamhawari, Nor Hashimah; Bakar, Mohd Aftar Abu

    2015-02-01

    Palm oil is deemed one of the most important commodities that form the economic backbone of Malaysia. Modeling and forecasting the daily price of palm oil is of great interest for Malaysia's economic growth. In this study, time series ARIMA models are used to fit the daily price of palm oil. The Akaike Information Criterion (AIC), the Akaike Information Criterion with a correction for finite sample sizes (AICc) and the Bayesian Information Criterion (BIC) are used to compare between different ARIMA models being considered. It is found that the ARIMA(1,2,1) model is suitable for the daily price of crude palm oil in Malaysia for the years 2010 to 2012.
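
    A minimal sketch of the order-selection step using the three criteria named above, on a synthetic price series rather than the Malaysian palm oil data; the candidate orders are assumptions, and AICc is computed from AIC with the usual small-sample correction.

```python
# Sketch of the order-selection step on a synthetic price series (not the Malaysian
# palm oil data): fit candidate ARIMA orders and compare AIC, AICc and BIC, with AICc
# computed from AIC via the usual small-sample correction 2k(k+1)/(n-k-1).
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(6)
price = 2500 + np.cumsum(rng.normal(0, 15, 500))        # assumed daily price level

candidates = [(0, 1, 1), (1, 1, 1), (2, 1, 2), (1, 2, 1)]
n = len(price)
for order in candidates:
    fit = ARIMA(price, order=order).fit()
    k = len(fit.params)                                  # number of estimated parameters
    aicc = fit.aic + 2 * k * (k + 1) / (n - k - 1)
    print(f"ARIMA{order}: AIC={fit.aic:.1f}  AICc={aicc:.1f}  BIC={fit.bic:.1f}")
```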

  6. Stochastic series expansion simulation of the t -V model

    Science.gov (United States)

    Wang, Lei; Liu, Ye-Hua; Troyer, Matthias

    2016-04-01

    We present an algorithm for the efficient simulation of the half-filled spinless t -V model on bipartite lattices, which combines the stochastic series expansion method with determinantal quantum Monte Carlo techniques widely used in fermionic simulations. The algorithm scales linearly in the inverse temperature, cubically with the system size, and is free from the time-discretization error. We use it to map out the finite-temperature phase diagram of the spinless t -V model on the honeycomb lattice and observe a suppression of the critical temperature of the charge-density-wave phase in the vicinity of a fermionic quantum critical point.

  7. Optimization of recurrent neural networks for time series modeling

    DEFF Research Database (Denmark)

    Pedersen, Morten With

    1997-01-01

    The present thesis is about optimization of recurrent neural networks applied to time series modeling. In particular, fully recurrent networks are considered, working from only a single external input, with one layer of nonlinear hidden units and a linear output unit applied to prediction of discrete time series. The overall objectives are to improve training by application of second-order methods and to improve generalization ability by architecture optimization accomplished by pruning. The major topics covered in the thesis are: 1. The problem of training recurrent networks is analyzed from a numerical...... of solution obtained as well as computation time required. 3. A theoretical definition of the generalization error for recurrent networks is provided. This definition justifies a commonly adopted approach for estimating generalization ability. 4. The viability of pruning recurrent networks by the Optimal

  8. Evaluation of the autoregression time-series model for analysis of a noisy signal

    International Nuclear Information System (INIS)

    Allen, J.W.

    1977-01-01

    The autoregression (AR) time-series model of a continuous noisy signal was statistically evaluated to determine quantitatively the uncertainties of the model order, the model parameters, and the model's power spectral density (PSD). The result of such a statistical evaluation enables an experimenter to decide whether an AR model can adequately represent a continuous noisy signal and be consistent with the signal's frequency spectrum, and whether it can be used for on-line monitoring. Although evaluations of other types of signals have been reported in the literature, no direct reference has been found to AR model's uncertainties for continuous noisy signals; yet the evaluation is necessary to decide the usefulness of AR models of typical reactor signals (e.g., neutron detector output or thermocouple output) and the potential of AR models for on-line monitoring applications. AR and other time-series models for noisy data representation are being investigated by others since such models require fewer parameters than the traditional PSD model. For this study, the AR model was selected for its simplicity and conduciveness to uncertainty analysis, and controlled laboratory bench signals were used for continuous noisy data. (author)
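
    Two of the quantities evaluated above, the AR parameters and the model-implied PSD, can be sketched for a generic noisy signal as follows; the AR(2) test signal is an assumption, not reactor data, and no uncertainty analysis is attempted here.

```python
# Sketch of fitting an AR(p) model by Yule-Walker and evaluating its implied power
# spectral density S(f) = sigma^2 / |1 - sum_k a_k exp(-i 2 pi f k)|^2. The AR(2) test
# signal is an assumption (a generic noisy signal), not a reactor measurement.
import numpy as np
from statsmodels.regression.linear_model import yule_walker

rng = np.random.default_rng(7)
n = 4096
x = np.zeros(n)
for t in range(2, n):                      # generic AR(2) test signal
    x[t] = 1.5 * x[t - 1] - 0.75 * x[t - 2] + rng.normal()

order = 2
a, sigma = yule_walker(x, order=order, method="mle")   # AR coefficients and innovation std

freqs = np.linspace(0.0, 0.5, 256)                     # cycles per sample (Nyquist = 0.5)
denom = np.abs(1 - sum(a[k] * np.exp(-2j * np.pi * freqs * (k + 1)) for k in range(order))) ** 2
psd = sigma**2 / denom

print("AR coefficients:", np.round(a, 3), " innovation std:", round(float(sigma), 3))
print("PSD peak at f =", round(float(freqs[np.argmax(psd)]), 3), "cycles/sample")
```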

  9. Neural Network Models for Time Series Forecasts

    OpenAIRE

    Tim Hill; Marcus O'Connor; William Remus

    1996-01-01

    Neural networks have been advocated as an alternative to traditional statistical forecasting methods. In the present experiment, time series forecasts produced by neural networks are compared with forecasts from six statistical time series methods generated in a major forecasting competition (Makridakis et al. [Makridakis, S., A. Anderson, R. Carbone, R. Fildes, M. Hibon, R. Lewandowski, J. Newton, E. Parzen, R. Winkler. 1982. The accuracy of extrapolation (time series) methods: Results of a ...

  10. Investigating ecological speciation in non-model organisms

    DEFF Research Database (Denmark)

    Foote, Andrew David

    2012-01-01

    Background: Studies of ecological speciation tend to focus on a few model biological systems. In contrast, few studies on non-model organisms have been able to infer ecological speciation as the underlying mechanism of evolutionary divergence. Questions: What are the pitfalls in studying ecological speciation in non-model organisms that lead to this bias? What alternative approaches might redress the balance? Organism: Genetically differentiated types of the killer whale (Orcinus orca) exhibiting differences in prey preference, habitat use, morphology, and behaviour. Methods: Review of the literature on killer whale evolutionary ecology in search of any difficulty in demonstrating causal links between variation in phenotype, ecology, and reproductive isolation in this non-model organism. Results: At present, we do not have enough evidence to conclude that adaptive phenotype traits linked to ecological

  11. Cardiac Electromechanical Models: From Cell to Organ

    Directory of Open Access Journals (Sweden)

    Natalia A Trayanova

    2011-08-01

    Full Text Available The heart is a multiphysics and multiscale system that has driven the development of the most sophisticated mathematical models at the frontiers of computational physiology and medicine. This review focuses on electromechanical (EM) models of the heart from the molecular level of myofilaments to anatomical models of the organ. Because of the coupling in terms of function and emergent behaviors at each level of biological hierarchy, separation of behaviors at a given scale is difficult. Here, a separation is drawn at the cell level so that the first half addresses subcellular/single cell models and the second half addresses organ models. At the subcellular level, myofilament models represent actin-myosin interaction and Ca-based activation. Myofilament models and their refinements represent an overview of the development in the field. The discussion of specific models emphasizes the roles of cooperative mechanisms and sarcomere length dependence of contraction force, considered the cellular basis of the Frank-Starling law. A model of electrophysiology and Ca handling can be coupled to a myofilament model to produce an EM cell model, and representative examples are summarized to provide an overview of the progression of the field. The second half of the review covers organ-level models that require solution of the electrical component as a reaction-diffusion system and the mechanical component, in which active tension generated by the myocytes produces deformation of the organ as described by the equations of continuum mechanics. As outlined in the review, different organ-level models have chosen to use different ionic and myofilament models depending on the specific application; this choice has been largely dictated by compromises between model complexity and computational tractability. The review also addresses application areas of EM models such as cardiac resynchronization therapy and the role of mechano-electric coupling in arrhythmias and

  12. Modeling of plasma chemistry in a corona streamer pulse series in air

    International Nuclear Information System (INIS)

    Nowakowska, H.; Stanco, J.; Dors, M.; Mizeraczyk, J.

    2002-01-01

    The aim of this study is to analyse the chemistry in air treated by a series of corona discharge streamers. Attention is focused on the conversion of ozone and nitrogen oxides. In the model it is assumed that the streamer head of relatively small geometrical dimensions propagates from the anode to the cathode, leaving the streamer channel behind. Any elemental gas volume in the streamer path is subjected first to the conditions of the streamer head, and next to those of the streamer channel. The kinetics of plasma-chemical processes occurring in the gas is modeled numerically for a single streamer and a series of streamers. The temporal evolution of 25 chemical compounds initially present or produced in air is calculated. (author)

  13. Fisher information framework for time series modeling

    Science.gov (United States)

    Venkatesan, R. C.; Plastino, A.

    2017-08-01

    A robust prediction model invoking the Takens embedding theorem, whose working hypothesis is obtained via an inference procedure based on the minimum Fisher information principle, is presented. The coefficients of the ansatz, central to the working hypothesis, satisfy a time independent Schrödinger-like equation in a vector setting. The inference of (i) the probability density function of the coefficients of the working hypothesis and (ii) the establishment of a constraint-driven pseudo-inverse condition for the modeling phase of the prediction scheme is made, for the case of normal distributions, with the aid of the quantum mechanical virial theorem. The well-known reciprocity relations and the associated Legendre transform structure for the Fisher information measure (FIM, hereafter)-based model in a vector setting (with least square constraints) are self-consistently derived. These relations are demonstrated to yield an intriguing form of the FIM for the modeling phase, which defines the working hypothesis, solely in terms of the observed data. Prediction cases are presented employing time series obtained from: (i) the Mackey-Glass delay-differential equation, (ii) one ECG signal from the MIT-Beth Israel Deaconess Hospital (MIT-BIH) cardiac arrhythmia database, and (iii) one ECG signal from the Creighton University ventricular tachyarrhythmia database. The ECG samples were obtained from the Physionet online repository. These examples demonstrate the efficiency of the prediction model. Numerical examples for exemplary cases are provided.

  14. Time series modelling to forecast prehospital EMS demand for diabetic emergencies.

    Science.gov (United States)

    Villani, Melanie; Earnest, Arul; Nanayakkara, Natalie; Smith, Karen; de Courten, Barbora; Zoungas, Sophia

    2017-05-05

    Acute diabetic emergencies are often managed by prehospital Emergency Medical Services (EMS). The projected growth in prevalence of diabetes is likely to result in rising demand for prehospital EMS that are already under pressure. The aims of this study were to model the temporal trends and provide forecasts of prehospital attendances for diabetic emergencies. A time series analysis on monthly cases of hypoglycemia and hyperglycemia was conducted using data from the Ambulance Victoria (AV) electronic database between 2009 and 2015. Using the seasonal autoregressive integrated moving average (SARIMA) modelling process, different models were evaluated. The most parsimonious model with the highest accuracy was selected. Forty-one thousand four hundred fifty-four prehospital diabetic emergencies were attended over a seven-year period with an increase in the annual median monthly caseload between 2009 (484.5) and 2015 (549.5). Hypoglycemia (70%) and people with type 1 diabetes (48%) accounted for most attendances. The SARIMA (0,1,0,12) model provided the best fit, with a MAPE of 4.2% and predicts a monthly caseload of approximately 740 by the end of 2017. Prehospital EMS demand for diabetic emergencies is increasing. SARIMA time series models are a valuable tool to allow forecasting of future caseload with high accuracy and predict increasing cases of prehospital diabetic emergencies into the future. The model generated by this study may be used by service providers to allow appropriate planning and resource allocation of EMS for diabetic emergencies.
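
    A minimal sketch of the SARIMA forecasting step described above, using Python's statsmodels; the series name, the exact seasonal specification and the forecast horizon are illustrative assumptions, not the authors' code.

```python
# Illustrative SARIMA forecast of a monthly caseload series (assumed data).
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

def forecast_monthly_caseload(cases: pd.Series, horizon: int = 24) -> pd.Series:
    # Seasonal ARIMA with a 12-month season, mirroring the parsimonious model
    # reported in the abstract; in practice candidate orders would be compared
    # on accuracy criteria such as MAPE before one is selected.
    model = SARIMAX(cases, order=(0, 1, 0), seasonal_order=(0, 1, 0, 12))
    fitted = model.fit(disp=False)
    return fitted.forecast(steps=horizon)
```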

  15. Dissolved organic nitrogen dynamics in the North Sea: A time series analysis (1995-2005)

    Science.gov (United States)

    Van Engeland, T.; Soetaert, K.; Knuijt, A.; Laane, R. W. P. M.; Middelburg, J. J.

    2010-09-01

    Dissolved organic nitrogen (DON) dynamics in the North Sea were explored by means of long-term time series of nitrogen parameters from the Dutch national monitoring program. Generally, the data quality was good with few missing data points. Different imputation methods were used to verify the robustness of the patterns against these missing data. No long-term trends in DON concentrations were found over the sampling period (1995-2005). Inter-annual variability in the different time series showed both common and station-specific behavior. The stations could be divided into two regions, based on absolute concentrations and the dominant time scales of variability. Average DON concentrations were 11 μmol l-1 in the coastal region and 5 μmol l-1 in the open sea. Organic fractions of total dissolved nitrogen (TDN) averaged 38 and 71% in the coastal zone and open sea, respectively, but increased over time due to decreasing dissolved inorganic nitrogen (DIN) concentrations. In both regions intra-annual variability dominated over inter-annual variability, but DON variation in the open sea was markedly shifted towards shorter time scales relative to coastal stations. In the coastal zone a consistent seasonal DON cycle existed with high values in spring-summer and low values in autumn-winter. In the open sea seasonality was weak. A marked shift in the seasonality was found at the Dogger Bank, with DON accumulation towards summer and low values in winter prior to 1999, and accumulation in spring and decline throughout summer after 1999. This study clearly shows that DON is a dynamic actor in the North Sea and should be monitored systematically to enable us to understand fully the functioning of this ecosystem.

  16. Modelling of series of types of automated trenchless works tunneling

    Science.gov (United States)

    Gendarz, P.; Rzasinski, R.

    2016-08-01

    Microtunneling is the newest method for making underground installations. The method builds on experience with earlier trenchless techniques. To develop this earthworks method further, it is considered reasonable to elaborate a series of types of tunneling machine constructions. Many machine design solutions exist, but the current goal is to develop a non-excavation robotized machine. Erosion machines for tunnels with main dimensions of 1600, 2000, 2500 and 3150 were designed with computer-aided methods. The creation of the series of types of tunneling machine constructions was preceded by an analysis of the current state of the art. The practical methodology for creating the systematic part series was verified on the basis of the designed series of erosion machines. The following were developed: a method of construction similarity for the erosion machines, algorithmic methods for variant analysis of quantitative construction attributes in the I-DEAS advanced graphical program, and relational and program parameterization. A manufacturing process for the parts will be created, which allows the technological process to be verified on CNC machines. The models of the designs will be modified, and the constructions will be consulted with erosion machine users and manufacturers such as Tauber Rohrbau GmbH & Co.KG from Münster and OHL ZS a.s. from Brno. The companies' acceptance will result in practical verification by the JUMARPOL company.

  17. Self-Organizing Map Models of Language Acquisition

    Directory of Open Access Journals (Sweden)

    Ping Li

    2013-11-01

    Full Text Available Connectionist models have had a profound impact on theories of language. While most early models were inspired by the classic PDP architecture, recent models of language have explored various other types of models, including self-organizing models for language acquisition. In this paper we aim at providing a review of the latter type of models, and highlight a number of simulation experiments that we have conducted based on these models. We show that self-organizing connectionist models can provide significant insights into long-standing debates in both monolingual and bilingual language development.

  18. Time series modeling of live-cell shape dynamics for image-based phenotypic profiling.

    Science.gov (United States)

    Gordonov, Simon; Hwang, Mun Kyung; Wells, Alan; Gertler, Frank B; Lauffenburger, Douglas A; Bathe, Mark

    2016-01-01

    Live-cell imaging can be used to capture spatio-temporal aspects of cellular responses that are not accessible to fixed-cell imaging. As the use of live-cell imaging continues to increase, new computational procedures are needed to characterize and classify the temporal dynamics of individual cells. For this purpose, here we present the general experimental-computational framework SAPHIRE (Stochastic Annotation of Phenotypic Individual-cell Responses) to characterize phenotypic cellular responses from time series imaging datasets. Hidden Markov modeling is used to infer and annotate morphological state and state-switching properties from image-derived cell shape measurements. Time series modeling is performed on each cell individually, making the approach broadly useful for analyzing asynchronous cell populations. Two-color fluorescent cells simultaneously expressing actin and nuclear reporters enabled us to profile temporal changes in cell shape following pharmacological inhibition of cytoskeleton-regulatory signaling pathways. Results are compared with existing approaches conventionally applied to fixed-cell imaging datasets, and indicate that time series modeling captures heterogeneous dynamic cellular responses that can improve drug classification and offer additional important insight into mechanisms of drug action. The software is available at http://saphire-hcs.org.
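
    The hidden Markov modelling step can be sketched as follows; this uses the generic hmmlearn GaussianHMM as a simple stand-in for SAPHIRE's own annotation machinery, and the feature array and state count are assumptions.

```python
# Sketch: annotate one cell's shape time series with a Gaussian HMM (assumed data).
import numpy as np
from hmmlearn.hmm import GaussianHMM

def annotate_cell(shape_features: np.ndarray, n_states: int = 3):
    # shape_features: (n_timepoints, n_features) measurements for a single cell.
    hmm = GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=100)
    hmm.fit(shape_features)
    states = hmm.predict(shape_features)   # per-frame morphological state labels
    return states, hmm.transmat_           # state-switching probabilities
```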

  19. A time series modeling approach in risk appraisal of violent and sexual recidivism.

    Science.gov (United States)

    Bani-Yaghoub, Majid; Fedoroff, J Paul; Curry, Susan; Amundsen, David E

    2010-10-01

    For over half a century, various clinical and actuarial methods have been employed to assess the likelihood of violent recidivism. Yet there is a need for new methods that can improve the accuracy of recidivism predictions. This study proposes a new time series modeling approach that generates high levels of predictive accuracy over short and long periods of time. The proposed approach outperformed two widely used actuarial instruments (i.e., the Violence Risk Appraisal Guide and the Sex Offender Risk Appraisal Guide). Furthermore, analysis of temporal risk variations based on specific time series models can add valuable information into risk assessment and management of violent offenders.

  20. Forecast models for suicide: Time-series analysis with data from Italy.

    Science.gov (United States)

    Preti, Antonio; Lentini, Gianluca

    2016-01-01

    The prediction of suicidal behavior is a complex task. To fine-tune targeted preventative interventions, predictive analytics (i.e. forecasting future risk of suicide) is more important than exploratory data analysis (pattern recognition, e.g. detection of seasonality in suicide time series). This study sets out to investigate the accuracy of forecasting models of suicide for men and women. A total of 101 499 male suicides and 39 681 female suicides, which occurred in Italy from 1969 to 2003, were investigated. In order to apply the forecasting model and test its accuracy, the time series were split into a training set (1969 to 1996; 336 months) and a test set (1997 to 2003; 84 months). The main outcome was the accuracy of forecasting models on the monthly number of suicides. The following measures of accuracy were used: mean absolute error; root mean squared error; mean absolute percentage error; mean absolute scaled error. In both male and female suicides a change in the trend pattern was observed, with an increase from 1969 onwards to reach a maximum around 1990 and decrease thereafter. The variances attributable to the seasonal and trend components were, respectively, 24% and 64% in male suicides, and 28% and 41% in female ones. Both annual and seasonal historical trends of monthly data contributed to forecast future trends of suicide with a margin of error around 10%. The finding is clearer in male than in female time series of suicide. The main conclusion of the study is that models taking seasonality into account seem to be able to derive information on deviation from the mean when this occurs as a zenith, but they fail to reproduce it when it occurs as a nadir. Preventative efforts should concentrate on the factors that influence the occurrence of increases above the main trend in both seasonal and cyclic patterns of suicides.
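
    The four accuracy measures named above have standard definitions; a minimal sketch follows (the array names and the seasonal period m are assumptions).

```python
# Standard forecast accuracy measures: MAE, RMSE, MAPE and MASE.
import numpy as np

def forecast_accuracy(actual, forecast, train, m=12):
    actual, forecast, train = map(np.asarray, (actual, forecast, train))
    err = actual - forecast
    mae = np.mean(np.abs(err))
    rmse = np.sqrt(np.mean(err ** 2))
    mape = 100 * np.mean(np.abs(err / actual))
    # MASE scales MAE by the in-sample seasonal-naive error with period m.
    scale = np.mean(np.abs(train[m:] - train[:-m]))
    return {"MAE": mae, "RMSE": rmse, "MAPE": mape, "MASE": mae / scale}
```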

  1. Time Series with Long Memory

    OpenAIRE

    西埜, 晴久

    2004-01-01

    The paper investigates an application of long-memory processes to economic time series. We show properties of long-memory processes, which are motivated to model a long-memory phenomenon in economic time series. An FARIMA model is described as an example of long-memory model in statistical terms. The paper explains basic limit theorems and estimation methods for long-memory processes in order to apply long-memory models to economic time series.

  2. Generation of Natural Runoff Monthly Series at Ungauged Sites Using a Regional Regressive Model

    Directory of Open Access Journals (Sweden)

    Dario Pumo

    2016-05-01

    Full Text Available Many hydrologic applications require reliable estimates of runoff in river basins to face the widespread lack of data, both in time and in space. A regional method for the reconstruction of monthly runoff series is here developed and applied to Sicily (Italy). A simple modeling structure is adopted, consisting of a regression-based rainfall–runoff model with four model parameters, calibrated through a two-step procedure. Monthly runoff estimates are based on precipitation, temperature, and exploiting the autocorrelation with runoff at the previous month. Model parameters are assessed by specific regional equations as a function of easily measurable physical and climate basin descriptors. The first calibration step is aimed at the identification of a set of parameters optimizing model performances at the level of single basin. Such “optimal” sets are used at the second step, part of a regional regression analysis, to establish the regional equations for model parameters assessment as a function of basin attributes. All the gauged watersheds across the region have been analyzed, selecting 53 basins for model calibration and using the other six basins exclusively for validation. Performances, quantitatively evaluated by different statistical indexes, demonstrate relevant model ability in reproducing the observed hydrological time-series at both the monthly and coarser time resolutions. The methodology, which is easily transferable to other arid and semi-arid areas, provides a reliable tool for filling/reconstructing runoff time series at any gauged or ungauged basin of a region.

  3. Rewinding Frankenstein and the body-machine: organ transplantation in the dystopian young adult fiction series Unwind.

    Science.gov (United States)

    Wohlmann, Anita; Steinberg, Ruth

    2016-12-01

    While the separation of body and mind (and the entailing metaphor of the body as a machine) has been a cornerstone of Western medicine for a long time, reactions to organ transplantation among others challenge this clear-cut dichotomy. The limits of the machine-body have been negotiated in science fiction, most canonically in Mary Shelley's Frankenstein (1818). Since then, Frankenstein's monster itself has become a motif that permeates both medical and fictional discourses. Neal Shusterman's contemporary dystology for young adults, Unwind, draws on traditional concepts of the machine-body and the Frankenstein myth. This article follows one of the young protagonists in the series, who is entirely constructed from donated tissue, and analyses how Shusterman explores the complicated relationship between body and mind and between self and other as the teenager matures into an adult. It will be shown that, by framing the story of a transplanted individual along the lines of a coming-of-age narrative, Shusterman inter-relates the acceptance of a donor organ with the transitional space of adolescence and positions the quest for embodied selfhood at the centre of both developments. By highlighting the interconnections between medical discourse and a literary tradition, the potential contribution of the series to the treatment and understanding of post-transplant patients will be addressed.

  4. Air oxidation and biodegradation of the organic matter from the Boom Clay: comparison between artificial and natural altered series

    International Nuclear Information System (INIS)

    Blanchart, Pascale; Faure, Pierre; Michels, Raymond; Bruggeman, Christophe; De Craen, Mieke

    2010-01-01

    Document available in extended abstract form only. The Boom Clay formation (Belgium) is studied as the reference host rock for methodological studies on the geological disposal of high-level and long-lived radioactive waste. The drilling of galleries in the Boom Clay at Mol lead to the perturbation of the initial physical and chemical conditions. Since organic matter is present in this argillaceous formation, it is important to know its response to these new conditions. The Boom Clay is of Tertiary age (Rupelian) and has a TOC content up to 5%. Its pore water (20% in mass of rock) contains significant quantities of Dissolved Organic Carbon (DOC) with a mean concentration of 115+/-15 mg/L, determined on the basis of piezometer water as well as squeezing and leaching experiments. Yet, in piezometers, the DOC may show considerable and irregular variations through time, with values ranging between 80 and 425 mg/L. The origin and bio-physico-chemical controls of such variations are yet unknown. Perturbation of the in-situ conditions of the clay is a possible reason for such observation. More likely the introduction of air as well as micro-organisms may have an impact that needs to be estimated. Well-preserved, freshly-drilled Boom Clay samples, as well as samples of different degrees of alteration (air exposed clay from the older galleries) were collected with the aim of determining and quantifying different molecular markers representative for the alteration degree. Additionally, increasing artificial air oxidation experiments were carried out on fresh Praclay samples (sampled during the PRACLAY gallery excavation) in order to obtain an altered reference series. Moreover PRACLAY samples previously extracted with an organic solvent (dichloromethane) were also prepared in order to identify the influence of the kerogen during oxidation. In these experiments, powdered clay was heated at 80 deg. C under air flow during 1, 3, 6, 9, 12, 18 months. Organic matter composition

  5. Applying ARIMA model for annual volume time series of the Magdalena River

    Directory of Open Access Journals (Sweden)

    Gloria Amaris

    2017-04-01

    Conclusions: The simulated results obtained with the ARIMA model compared to the observed data showed a fairly good adjustment of the minimum and maximum magnitudes. This allows concluding that it is a good tool for estimating minimum and maximum volumes, even though this model is not capable of simulating the exact behaviour of an annual volume time series.

  6. The partial duration series method in regional index-flood modeling

    DEFF Research Database (Denmark)

    Madsen, Henrik; Rosbjerg, Dan

    1997-01-01

    A regional index-flood method based on the partial duration series model is introduced. The model comprises the assumptions of a Poisson-distributed number of threshold exceedances and generalized Pareto (GP) distributed peak magnitudes. The regional T-year event estimator is based on a regional...... estimator is superior to the at-site estimator even in extremely heterogenous regions, the performance of the regional estimator being relatively better in regions with a negative shape parameter. When the record length increases, the relative performance of the regional estimator decreases, but it is still...
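
    The at-site building block of the partial duration series model (Poisson-distributed exceedance counts plus generalized Pareto peak magnitudes) can be sketched with standard formulas; the variable names and the use of scipy's fitting routine are assumptions, and the regional index-flood step itself is not shown.

```python
# Sketch of an at-site T-year event estimate from a partial duration series.
import numpy as np
from scipy.stats import genpareto

def t_year_event(peaks: np.ndarray, threshold: float, years: float, T: float) -> float:
    exceedances = peaks[peaks > threshold] - threshold
    lam = len(exceedances) / years                     # Poisson exceedance rate
    shape, _, scale = genpareto.fit(exceedances, floc=0.0)
    # Quantile of the GP peak distribution at probability 1 - 1/(lam * T).
    return threshold + genpareto.ppf(1.0 - 1.0 / (lam * T), shape, loc=0.0, scale=scale)
```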

  7. Validation of the inverse pulse wave transit time series as surrogate of systolic blood pressure in MVAR modeling.

    Science.gov (United States)

    Giassi, Pedro; Okida, Sergio; Oliveira, Maurício G; Moraes, Raimes

    2013-11-01

    Short-term cardiovascular regulation mediated by the sympathetic and parasympathetic branches of the autonomic nervous system has been investigated by multivariate autoregressive (MVAR) modeling, providing insightful analysis. MVAR models employ, as inputs, heart rate (HR), systolic blood pressure (SBP) and respiratory waveforms. ECG (from which HR series is obtained) and respiratory flow waveform (RFW) can be easily sampled from the patients. Nevertheless, the available methods for acquisition of beat-to-beat SBP measurements during exams hamper the wider use of MVAR models in clinical research. Recent studies show an inverse correlation between pulse wave transit time (PWTT) series and SBP fluctuations. PWTT is the time interval between the ECG R-wave peak and photoplethysmography waveform (PPG) base point within the same cardiac cycle. This study investigates the feasibility of using inverse PWTT (IPWTT) series as an alternative input to SBP for MVAR modeling of the cardiovascular regulation. For that, HR, RFW, and IPWTT series acquired from volunteers during postural changes and autonomic blockade were used as input of MVAR models. Obtained results show that IPWTT series can be used as input of MVAR models, replacing SBP measurements in order to overcome practical difficulties related to the continuous sampling of the SBP during clinical exams.
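
    A minimal sketch of fitting a multivariate autoregressive model to the HR, respiration and inverse PWTT series with statsmodels' VAR; the variable names and the lag-selection criterion are assumptions, not the authors' implementation.

```python
# Sketch: MVAR (VAR) model on heart rate, respiratory flow and inverse PWTT (assumed data).
import pandas as pd
from statsmodels.tsa.api import VAR

def fit_mvar(hr, rfw, ipwtt, max_lags: int = 10):
    data = pd.DataFrame({"HR": hr, "RFW": rfw, "IPWTT": ipwtt})
    model = VAR(data)
    lag = model.select_order(maxlags=max_lags).aic   # lag order chosen by AIC
    return model.fit(lag)
```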

  8. Modeling Financial Time Series Based on a Market Microstructure Model with Leverage Effect

    Directory of Open Access Journals (Sweden)

    Yanhui Xi

    2016-01-01

    Full Text Available The basic market microstructure model specifies that the price/return innovation and the volatility innovation are independent Gaussian white noise processes. However, the financial leverage effect has been found to be statistically significant in many financial time series. In this paper, a novel market microstructure model with leverage effects is proposed. The model specification assumed a negative correlation in the errors between the price/return innovation and the volatility innovation. With the new representations, a theoretical explanation of leverage effect is provided. Simulated data and daily stock market indices (Shanghai composite index, Shenzhen component index, and Standard and Poor’s 500 Composite index) via Bayesian Markov Chain Monte Carlo (MCMC) method are used to estimate the leverage market microstructure model. The results verify the effectiveness of the model and its estimation approach proposed in the paper and also indicate that the stock markets have strong leverage effects. Compared with the classical leverage stochastic volatility (SV) model in terms of DIC (Deviance Information Criterion), the leverage market microstructure model fits the data better.
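
    The negative correlation between return and volatility innovations can be illustrated with a toy stochastic-volatility simulation; this is not the authors' market microstructure model or their Bayesian MCMC estimation, and all parameter values are assumptions.

```python
# Toy simulation of returns with leverage: negatively correlated return and
# log-volatility innovations (illustrative parameter values only).
import numpy as np

def simulate_leverage(n=1000, rho=-0.5, phi=0.95, sigma_eta=0.2, seed=0):
    rng = np.random.default_rng(seed)
    cov = [[1.0, rho * sigma_eta], [rho * sigma_eta, sigma_eta ** 2]]
    eps, eta = rng.multivariate_normal([0.0, 0.0], cov, size=n).T
    h = np.zeros(n)                          # latent log-volatility
    for t in range(1, n):
        h[t] = phi * h[t - 1] + eta[t - 1]
    returns = np.exp(h / 2.0) * eps          # returns scaled by stochastic volatility
    return returns, h
```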

  9. A model of Fe speciation and biogeochemistry at the Tropical Eastern North Atlantic Time-Series Observatory site

    Science.gov (United States)

    Ye, Y.; Völker, C.; Wolf-Gladrow, D. A.

    2009-10-01

    A one-dimensional model of Fe speciation and biogeochemistry, coupled with the General Ocean Turbulence Model (GOTM) and a NPZD-type ecosystem model, is applied for the Tropical Eastern North Atlantic Time-Series Observatory (TENATSO) site. Among diverse processes affecting Fe speciation, this study is focusing on investigating the role of dust particles in removing dissolved iron (DFe) by a more complex description of particle aggregation and sinking, and explaining the abundance of organic Fe-binding ligands by modelling their origin and fate. The vertical distribution of different particle classes in the model shows high sensitivity to changing aggregation rates. Using the aggregation rates from the sensitivity study in this work, modelled particle fluxes are close to observations, with dust particles dominating near the surface and aggregates deeper in the water column. POC export at 1000 m is a little higher than regional sediment trap measurements, suggesting further improvement of modelling particle aggregation, sinking or remineralisation. Modelled strong ligands have a high abundance near the surface and decline rapidly below the deep chlorophyll maximum, showing qualitative similarity to observations. Without production of strong ligands, phytoplankton concentration falls to 0 within the first 2 years in the model integration, caused by strong Fe-limitation. A nudging of total weak ligands towards a constant value is required for reproducing the observed nutrient-like profiles, assuming a decay time of 7 years for weak ligands. This indicates that weak ligands have a longer decay time and therefore cannot be modelled adequately in a one-dimensional model. The modelled DFe profile is strongly influenced by particle concentration and vertical distribution, because the most important removal of DFe in deeper waters is colloid formation and aggregation. Redissolution of particulate iron is required to reproduce an observed DFe profile at TENATSO site

  10. A model of Fe speciation and biogeochemistry at the Tropical Eastern North Atlantic Time-Series Observatory site

    Directory of Open Access Journals (Sweden)

    Y. Ye

    2009-10-01

    Full Text Available A one-dimensional model of Fe speciation and biogeochemistry, coupled with the General Ocean Turbulence Model (GOTM) and a NPZD-type ecosystem model, is applied for the Tropical Eastern North Atlantic Time-Series Observatory (TENATSO) site. Among diverse processes affecting Fe speciation, this study is focusing on investigating the role of dust particles in removing dissolved iron (DFe) by a more complex description of particle aggregation and sinking, and explaining the abundance of organic Fe-binding ligands by modelling their origin and fate.

    The vertical distribution of different particle classes in the model shows high sensitivity to changing aggregation rates. Using the aggregation rates from the sensitivity study in this work, modelled particle fluxes are close to observations, with dust particles dominating near the surface and aggregates deeper in the water column. POC export at 1000 m is a little higher than regional sediment trap measurements, suggesting further improvement of modelling particle aggregation, sinking or remineralisation.

    Modelled strong ligands have a high abundance near the surface and decline rapidly below the deep chlorophyll maximum, showing qualitative similarity to observations. Without production of strong ligands, phytoplankton concentration falls to 0 within the first 2 years in the model integration, caused by strong Fe-limitation. A nudging of total weak ligands towards a constant value is required for reproducing the observed nutrient-like profiles, assuming a decay time of 7 years for weak ligands. This indicates that weak ligands have a longer decay time and therefore cannot be modelled adequately in a one-dimensional model.

    The modelled DFe profile is strongly influenced by particle concentration and vertical distribution, because the most important removal of DFe in deeper waters is colloid formation and aggregation. Redissolution of particulate iron is required to reproduce an

  11. [School Organization: Theory and Practice; Selected Readings on Grading, Nongrading, Multigrading, Self-Contained Classrooms, Departmentalization, Team Heterogeneous Grouping. Selected Bibliographies.] Rand McNally Education Series.

    Science.gov (United States)

    Franklin, Marian Pope, Comp.

    Over 400 journal articles, case studies, research reports, dissertations, and position papers are briefly described in a series of eight selected bibliographies related to school organization. The eight specific areas treated in the volume and the number of items listed for each include: nongraded elementary school organization, 96; nongraded…

  12. Complex Systems and Self-organization Modelling

    CERN Document Server

    Bertelle, Cyrille; Kadri-Dahmani, Hakima

    2009-01-01

    The concern of this book is the use of emergent computing and self-organization modelling within various applications of complex systems. The authors focus their attention both on the innovative concepts and implementations in order to model self-organizations, but also on the relevant applicative domains in which they can be used efficiently. This book is the outcome of a workshop meeting within ESM 2006 (Eurosis), held in Toulouse, France in October 2006.

  13. A multivariate time series approach to modeling and forecasting demand in the emergency department.

    Science.gov (United States)

    Jones, Spencer S; Evans, R Scott; Allen, Todd L; Thomas, Alun; Haug, Peter J; Welch, Shari J; Snow, Gregory L

    2009-02-01

    The goals of this investigation were to study the temporal relationships between the demands for key resources in the emergency department (ED) and the inpatient hospital, and to develop multivariate forecasting models. Hourly data were collected from three diverse hospitals for the year 2006. Descriptive analysis and model fitting were carried out using graphical and multivariate time series methods. Multivariate models were compared to a univariate benchmark model in terms of their ability to provide out-of-sample forecasts of ED census and the demands for diagnostic resources. Descriptive analyses revealed little temporal interaction between the demand for inpatient resources and the demand for ED resources at the facilities considered. Multivariate models provided more accurate forecasts of ED census and of the demands for diagnostic resources. Our results suggest that multivariate time series models can be used to reliably forecast ED patient census; however, forecasts of the demands for diagnostic resources were not sufficiently reliable to be useful in the clinical setting.
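
    The comparison described above, a multivariate model against a univariate benchmark on out-of-sample forecasts, can be sketched as follows; the column names, model orders and hold-out length are assumptions.

```python
# Sketch: out-of-sample comparison of a VAR model with a univariate ARIMA
# benchmark for ED census forecasting (assumed hourly data).
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR
from statsmodels.tsa.arima.model import ARIMA

def compare_forecasts(df: pd.DataFrame, target: str = "ed_census", holdout: int = 168):
    train, test = df.iloc[:-holdout], df.iloc[-holdout:]
    var_fit = VAR(train).fit(maxlags=24, ic="aic")
    var_pred = var_fit.forecast(train.values[-var_fit.k_ar:], steps=holdout)
    var_pred = var_pred[:, list(df.columns).index(target)]
    uni_pred = ARIMA(train[target], order=(2, 0, 1)).fit().forecast(steps=holdout)
    mae = lambda pred: float(np.mean(np.abs(test[target].values - np.asarray(pred))))
    return {"VAR": mae(var_pred), "ARIMA": mae(uni_pred)}
```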

  14. Modeling Secondary Organic Aerosol Formation From Emissions of Combustion Sources

    Science.gov (United States)

    Jathar, Shantanu Hemant

    Atmospheric aerosols exert a large influence on the Earth's climate and cause adverse public health effects, reduced visibility and material degradation. Secondary organic aerosol (SOA), defined as the aerosol mass arising from the oxidation products of gas-phase organic species, accounts for a significant fraction of the submicron atmospheric aerosol mass. Yet, there are large uncertainties surrounding the sources, atmospheric evolution and properties of SOA. This thesis combines laboratory experiments, extensive data analysis and global modeling to investigate the contribution of semi-volatile and intermediate volatility organic compounds (SVOC and IVOC) from combustion sources to SOA formation. The goals are to quantify the contribution of these emissions to ambient PM and to evaluate and improve models to simulate its formation. To create a database for model development and evaluation, a series of smog chamber experiments were conducted on evaporated fuel, which served as surrogates for real-world combustion emissions. Diesel formed the most SOA followed by conventional jet fuel / jet fuel derived from natural gas, gasoline and jet fuel derived from coal. The variability in SOA formation from actual combustion emissions can be partially explained by the composition of the fuel. Several models were developed and tested along with existing models using SOA data from smog chamber experiments conducted using evaporated fuel (this work, gasoline, Fischer-Tropsch, jet fuel, diesels) and published data on dilute combustion emissions (aircraft, on- and off-road gasoline, on- and off-road diesel, wood burning, biomass burning). For all of the SOA data, existing models under-predicted SOA formation if SVOC/IVOC were not included. For the evaporated fuel experiments, when SVOC/IVOC were included, predictions using the existing SOA model were brought to within a factor of two of measurements with minor adjustments to model parameterizations. Further, a volatility

  15. Characteristics of the LeRC/Hughes J-series 30-cm engineering model thruster

    Science.gov (United States)

    Collett, C. R.; Poeschel, R. L.; Kami, S.

    1981-01-01

    As a consequence of endurance and structural tests performed on 900-series engineering model thrusters (EMT), several modifications in design were found to be necessary for achieving performance goals. The modified thruster is known as the J-series EMT. The most important of the design modifications affect the accelerator grid, gimbal mount, cathode polepiece, and wiring harness. The paper discusses the design modifications incorporated, the condition(s) they corrected, and the characteristics of the modified thruster.

  16. Simulation of organ patterning on the floral meristem using a polar auxin transport model.

    Directory of Open Access Journals (Sweden)

    Simon van Mourik

    Full Text Available An intriguing phenomenon in plant development is the timing and positioning of lateral organ initiation, which is a fundamental aspect of plant architecture. Although important progress has been made in elucidating the role of auxin transport in the vegetative shoot to explain the phyllotaxis of leaf formation in a spiral fashion, a model study of the role of auxin transport in whorled organ patterning in the expanding floral meristem is not available yet. We present an initial simulation approach to study the mechanisms that are expected to play an important role. Starting point is a confocal imaging study of Arabidopsis floral meristems at consecutive time points during flower development. These images reveal auxin accumulation patterns at the positions of the organs, which strongly suggests that the role of auxin in the floral meristem is similar to the role it plays in the shoot apical meristem. This is the basis for a simulation study of auxin transport through a growing floral meristem, which may answer the question whether auxin transport can in itself be responsible for the typical whorled floral pattern. We combined a cellular growth model for the meristem with a polar auxin transport model. The model predicts that sepals are initiated by auxin maxima arising early during meristem outgrowth. These form a pre-pattern relative to which a series of smaller auxin maxima are positioned, which partially overlap with the anlagen of petals, stamens, and carpels. We adjusted the model parameters corresponding to properties of floral mutants and found that the model predictions agree with the observed mutant patterns. The predicted timing of the primordia outgrowth and the timing and positioning of the sepal primordia show remarkable similarities with a developing flower in nature.

  17. Image reconstruction method for electrical capacitance tomography based on the combined series and parallel normalization model

    International Nuclear Information System (INIS)

    Dong, Xiangyuan; Guo, Shuqing

    2008-01-01

    In this paper, a novel image reconstruction method for electrical capacitance tomography (ECT) based on the combined series and parallel model is presented. A regularization technique is used to obtain a stabilized solution of the inverse problem. Also, the adaptive coefficient of the combined model is deduced by numerical optimization. Simulation results indicate that it can produce higher quality images when compared to the algorithm based on the parallel or series models for the cases tested in this paper. It provides a new algorithm for ECT application

  18. Series expansions without diagrams

    International Nuclear Information System (INIS)

    Bhanot, G.; Creutz, M.; Horvath, I.; Lacki, J.; Weckel, J.

    1994-01-01

    We discuss the use of recursive enumeration schemes to obtain low- and high-temperature series expansions for discrete statistical systems. Using linear combinations of generalized helical lattices, the method is competitive with diagrammatic approaches and is easily generalizable. We illustrate the approach using Ising and Potts models. We present low-temperature series results in up to five dimensions and high-temperature series in three dimensions. The method is general and can be applied to any discrete model

  19. Regression and regression analysis time series prediction modeling on climate data of Quetta, Pakistan

    International Nuclear Information System (INIS)

    Jafri, Y.Z.; Kamal, L.

    2007-01-01

    Various statistical techniques were used on five-year data from 1998-2002 of average humidity, rainfall, maximum and minimum temperatures, respectively. The relationships to regression analysis time series (RATS) were developed for determining the overall trend of these climate parameters, on the basis of which forecast models can be corrected and modified. We computed the coefficient of determination as a measure of goodness of fit for our polynomial regression analysis time series (PRATS). The correlations to multiple linear regression (MLR) and multiple linear regression analysis time series (MLRATS) were also developed for deciphering the interdependence of weather parameters. Spearman's rank correlation and the Goldfeld-Quandt test were used to check the uniformity or non-uniformity of variances in our fit to polynomial regression (PR). The Breusch-Pagan test was applied to MLR and MLRATS, respectively, which yielded homoscedasticity. We also employed Bartlett's test for homogeneity of variances on five-year data of rainfall and humidity, respectively, which showed that the variances in rainfall data were not homogeneous while those in the humidity data were homogeneous. Our results on regression and regression analysis time series show the best fit to prediction modeling on climatic data of Quetta, Pakistan. (author)
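
    A minimal sketch of a polynomial trend fit with a goodness-of-fit measure and a Breusch-Pagan heteroscedasticity check, in the spirit of the tests listed above; the pairing of the test with the polynomial fit and the data names are assumptions.

```python
# Sketch: polynomial regression on a time index, R^2 as goodness of fit,
# and a Breusch-Pagan test of the residuals (assumed data).
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_breuschpagan

def poly_trend_fit(y, degree: int = 2):
    t = np.arange(len(y))
    X = sm.add_constant(np.column_stack([t ** d for d in range(1, degree + 1)]))
    fit = sm.OLS(np.asarray(y), X).fit()
    lm_stat, lm_pvalue, _, _ = het_breuschpagan(fit.resid, X)
    return fit.rsquared, lm_pvalue   # coefficient of determination, homoscedasticity p-value
```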

  20. 75 FR 28480 - Airworthiness Directives; Airbus Model A300 Series Airplanes; Model A300 B4-600, B4-600R, F4-600R...

    Science.gov (United States)

    2010-05-21

    ... Airworthiness Directives; Airbus Model A300 Series Airplanes; Model A300 B4-600, B4-600R, F4-600R Series..., B4-622, B4-605R, B4-622R, F4-605R, F4-622R, and C4-605R Variant F airplanes; and Model A310-203...

  1. Saccharomyces cerevisiae as a model organism: a comparative study.

    Directory of Open Access Journals (Sweden)

    Hiren Karathia

    Full Text Available BACKGROUND: Model organisms are used for research because they provide a framework on which to develop and optimize methods that facilitate and standardize analysis. Such organisms should be representative of the living beings for which they are to serve as proxy. However, in practice, a model organism is often selected ad hoc, and without considering its representativeness, because a systematic and rational method to include this consideration in the selection process is still lacking. METHODOLOGY/PRINCIPAL FINDINGS: In this work we propose such a method and apply it in a pilot study of strengths and limitations of Saccharomyces cerevisiae as a model organism. The method relies on the functional classification of proteins into different biological pathways and processes and on full proteome comparisons between the putative model organism and other organisms for which we would like to extrapolate results. Here we compare S. cerevisiae to 704 other organisms from various phyla. For each organism, our results identify the pathways and processes for which S. cerevisiae is predicted to be a good model to extrapolate from. We find that animals in general and Homo sapiens in particular are some of the non-fungal organisms for which S. cerevisiae is likely to be a good model in which to study a significant fraction of common biological processes. We validate our approach by correctly predicting which organisms are phenotypically more distant from S. cerevisiae with respect to several different biological processes. CONCLUSIONS/SIGNIFICANCE: The method we propose could be used to choose appropriate substitute model organisms for the study of biological processes in other species that are harder to study. For example, one could identify appropriate models to study either pathologies in humans or specific biological processes in species with a long development time, such as plants.

  2. Constructing the reduced dynamical models of interannual climate variability from spatial-distributed time series

    Science.gov (United States)

    Mukhin, Dmitry; Gavrilov, Andrey; Loskutov, Evgeny; Feigin, Alexander

    2016-04-01

    We suggest a method for empirical forecast of climate dynamics based on the reconstruction of reduced dynamical models in the form of random dynamical systems [1,2] derived from observational time series. The construction of a proper embedding - the set of variables determining the phase space the model works in - is no doubt the most important step in such modeling, but this task is non-trivial due to the huge dimension of time series of typical climatic fields. Actually, an appropriate expansion of observational time series is needed, yielding the number of principal components considered as phase variables, which are to be efficient for the construction of a low-dimensional evolution operator. We emphasize two main features the reduced models should have for capturing the main dynamical properties of the system: (i) taking into account time-lagged teleconnections in the atmosphere-ocean system and (ii) reflecting the nonlinear nature of these teleconnections. In accordance with these principles, in this report we present a methodology which combines a new way of constructing an embedding by spatio-temporal data expansion with nonlinear model construction on the basis of artificial neural networks. The methodology is applied to NCEP/NCAR reanalysis data including fields of sea level pressure, geopotential height, and wind speed, covering the Northern Hemisphere. Its efficiency for the interannual forecast of various climate phenomena including ENSO, PDO, NAO and strong blocking event conditions over the mid-latitudes is demonstrated. Also, we investigate the ability of the models to reproduce and predict the evolution of qualitative features of the dynamics, such as spectral peaks, critical transitions and statistics of extremes. This research was supported by the Government of the Russian Federation (Agreement No. 14.Z50.31.0033 with the Institute of Applied Physics RAS) [1] Y. I. Molkov, E. M. Loskutov, D. N. Mukhin, and A. M. Feigin, "Random
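
    A toy sketch of the two ingredients emphasised above, a lagged (spatio-temporal) principal-component embedding and a nonlinear evolution operator fitted by a small neural network; all sizes, names and the choice of scikit-learn are illustrative assumptions, not the authors' Bayesian scheme.

```python
# Toy sketch: lagged PCA embedding of a climate field plus a neural-network
# evolution operator (illustrative only).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor

def build_reduced_model(field: np.ndarray, n_pcs: int = 10, lags: int = 3):
    # field: (n_times, n_gridpoints) array of one reanalysis field.
    pcs = PCA(n_components=n_pcs).fit_transform(field)
    T = len(pcs)
    # Stack lagged principal components to capture time-lagged teleconnections.
    X = np.hstack([pcs[lag:T - lags + lag] for lag in range(lags)])
    y = pcs[lags:]
    model = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000).fit(X, y)
    return model, pcs
```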

  3. Construction of the exact Fisher information matrix of Gaussian time series models by means of matrix differential rules

    NARCIS (Netherlands)

    Klein, A.A.B.; Melard, G.; Zahaf, T.

    2000-01-01

    The Fisher information matrix is of fundamental importance for the analysis of parameter estimation of time series models. In this paper the exact information matrix of a multivariate Gaussian time series model expressed in state space form is derived. A computationally efficient procedure is used

  4. Modeling Time Series Data for Supervised Learning

    Science.gov (United States)

    Baydogan, Mustafa Gokce

    2012-01-01

    Temporal data are increasingly prevalent and important in analytics. Time series (TS) data are chronological sequences of observations and an important class of temporal data. Fields such as medicine, finance, learning science and multimedia naturally generate TS data. Each series provides a high-dimensional data vector that challenges the learning…

  5. Statistical models and time series forecasting of sulfur dioxide: a case study Tehran.

    Science.gov (United States)

    Hassanzadeh, S; Hosseinibalam, F; Alizadeh, R

    2009-08-01

    This study performed a time-series analysis, frequency distribution and prediction of SO2 levels for five stations (Pardisan, Vila, Azadi, Gholhak and Bahman) in Tehran for the period of 2000-2005. Most sites show quite similar characteristics, with the highest pollution in autumn-winter time and the least pollution in spring-summer. The frequency distributions show higher peaks at two residential sites. The potential for SO2 problems is high because of high emissions and the close geographical proximity of the major industrial and urban centers. The ACF and PACF are nonzero for several lags, indicating a mixed (ARMA) model; an ARMA model was therefore used at the Bahman station for forecasting SO2. The partial autocorrelations become close to 0 after about 5 lags while the autocorrelations remain strong through all the lags shown. The results proved that the ARMA(2,2) model can provide reliable, satisfactory predictions for the time series.
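
    The identification-and-fit step described above can be sketched with statsmodels; the series name and the lag range shown in the correlograms are assumptions.

```python
# Sketch: inspect ACF/PACF and fit an ARMA(2,2) model to an SO2 series (assumed data).
import pandas as pd
from statsmodels.graphics.tsaplots import plot_acf, plot_pacf
from statsmodels.tsa.arima.model import ARIMA

def fit_so2_arma(so2: pd.Series):
    plot_acf(so2, lags=24)     # non-zero ACF and PACF over several lags suggest a mixed model
    plot_pacf(so2, lags=24)
    arma = ARIMA(so2, order=(2, 0, 2)).fit()   # ARMA(2,2) as reported in the abstract
    return arma
```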

  6. Extracting Knowledge From Time Series: An Introduction to Nonlinear Empirical Modeling

    CERN Document Server

    Bezruchko, Boris P

    2010-01-01

    This book addresses the fundamental question of how to construct mathematical models for the evolution of dynamical systems from experimentally-obtained time series. It places emphasis on chaotic signals and nonlinear modeling and discusses different approaches to the forecast of future system evolution. In particular, it teaches readers how to construct difference and differential model equations depending on the amount of a priori information that is available on the system in addition to the experimental data sets. This book will benefit graduate students and researchers from all natural sciences who seek a self-contained and thorough introduction to this subject.

  7. Self-organizing map models of language acquisition

    Science.gov (United States)

    Li, Ping; Zhao, Xiaowei

    2013-01-01

    Connectionist models have had a profound impact on theories of language. While most early models were inspired by the classic parallel distributed processing architecture, recent models of language have explored various other types of models, including self-organizing models for language acquisition. In this paper, we aim at providing a review of the latter type of models, and highlight a number of simulation experiments that we have conducted based on these models. We show that self-organizing connectionist models can provide significant insights into long-standing debates in both monolingual and bilingual language development. We suggest future directions in which these models can be extended, to better connect with behavioral and neural data, and to make clear predictions in testing relevant psycholinguistic theories. PMID:24312061

  8. Organic production in a dynamic CGE model

    DEFF Research Database (Denmark)

    Jacobsen, Lars Bo

    2004-01-01

    Concerns about the impact of modern agriculture on the environment have in recent years led to an interest in supporting the development of organic farming. In addition to environmental benefits, the aim is to encourage the provision of other “multifunctional” properties of organic farming...... such as rural amenities and rural development that are spillover benefits additional to the supply of food. In this paper we further develop an existing dynamic general equilibrium model of the Danish economy to specifically incorporate organic farming. In the model and input-output data each primary...... When land used for conventional production is converted into land for organic production, a period of two years must pass before the land being transformed can be used for organic production. During that time, the land is counted as land of the organic industry, but it can only produce the conventional product. To handle this rule, we make......

  9. Modeling the impact of forecast-based regime switches on macroeconomic time series

    NARCIS (Netherlands)

    K. Bel (Koen); R. Paap (Richard)

    2013-01-01

    textabstractForecasts of key macroeconomic variables may lead to policy changes of governments, central banks and other economic agents. Policy changes in turn lead to structural changes in macroeconomic time series models. To describe this phenomenon we introduce a logistic smooth transition

  10. Virtual Organizations: Trends and Models

    Science.gov (United States)

    Nami, Mohammad Reza; Malekpour, Abbaas

    The use of ICT in business has changed views about traditional business. With VO, organizations without physical, geographical, or structural constraints can collaborate in a networked environment in order to fulfill customer requests. This idea improves resource utilization, shortens the development process, reduces costs, and saves time. A virtual organization (VO) is always a form of partnership, and managing partners and handling partnerships are crucial. Virtual organizations are defined as a temporary collection of enterprises that cooperate and share resources, knowledge, and competencies to better respond to business opportunities. This paper presents an overview of virtual organizations and the main issues in collaboration such as security and management. It also presents a number of different model approaches according to their purpose and applications.

  11. Clustering of financial time series

    Science.gov (United States)

    D'Urso, Pierpaolo; Cappelli, Carmela; Di Lallo, Dario; Massari, Riccardo

    2013-05-01

    This paper addresses the topic of classifying financial time series in a fuzzy framework proposing two fuzzy clustering models both based on GARCH models. In general clustering of financial time series, due to their peculiar features, needs the definition of suitable distance measures. At this aim, the first fuzzy clustering model exploits the autoregressive representation of GARCH models and employs, in the framework of a partitioning around medoids algorithm, the classical autoregressive metric. The second fuzzy clustering model, also based on partitioning around medoids algorithm, uses the Caiado distance, a Mahalanobis-like distance, based on estimated GARCH parameters and covariances that takes into account the information about the volatility structure of time series. In order to illustrate the merits of the proposed fuzzy approaches an application to the problem of classifying 29 time series of Euro exchange rates against international currencies is presented and discussed, also comparing the fuzzy models with their crisp version.

  12. Trend Estimation and Regression Analysis in Climatological Time Series: An Application of Structural Time Series Models and the Kalman Filter.

    Science.gov (United States)

    Visser, H.; Molenaar, J.

    1995-05-01

    The detection of trends in climatological data has become central to the discussion on climate change due to the enhanced greenhouse effect. To prove detection, a method is needed (i) to make inferences on significant rises or declines in trends, (ii) to take into account natural variability in climate series, and (iii) to compare output from GCMs with the trends in observed climate data. To meet these requirements, flexible mathematical tools are needed. A structural time series model is proposed with which a stochastic trend, a deterministic trend, and regression coefficients can be estimated simultaneously. The stochastic trend component is described using the class of ARIMA models. The regression component is assumed to be linear. However, the regression coefficients corresponding with the explanatory variables may be time dependent to validate this assumption. The mathematical technique used to estimate this trend-regression model is the Kalman filter. The main features of the filter are discussed. Examples of trend estimation are given using annual mean temperatures at a single station in the Netherlands (1706-1990) and annual mean temperatures at Northern Hemisphere land stations (1851-1990). The inclusion of explanatory variables is shown by regressing the latter temperature series on four variables: Southern Oscillation index (SOI), volcanic dust index (VDI), sunspot numbers (SSN), and a simulated temperature signal, induced by increasing greenhouse gases (GHG). In all analyses, the influence of SSN on global temperatures is found to be negligible. The correlations between temperatures and SOI and VDI appear to be negative. For SOI, this correlation is significant, but for VDI it is not, probably because of a lack of volcanic eruptions during the sample period. The relation between temperatures and GHG is positive, which is in agreement with the hypothesis of a warming climate because of increasing levels of greenhouse gases. The prediction performance of
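
    The trend-plus-regression model estimated by the Kalman filter can be sketched with statsmodels' unobserved-components implementation; the covariate names follow the abstract, but the data, the particular trend specification and the use of fixed (rather than time-varying) regression coefficients are assumptions.

```python
# Sketch: structural time series model with a stochastic trend and regression
# on SOI, VDI, SSN and GHG, estimated by the Kalman filter (assumed data).
import pandas as pd
from statsmodels.tsa.statespace.structural import UnobservedComponents

def trend_regression(temperature: pd.Series, covariates: pd.DataFrame):
    model = UnobservedComponents(
        temperature,
        level="local linear trend",                     # stochastic trend component
        exog=covariates[["SOI", "VDI", "SSN", "GHG"]],  # linear regression component
    )
    fitted = model.fit(disp=False)
    return fitted   # fitted.params holds trend variances and regression coefficients
```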

  13. Creation and evaluation of a database of renewable production time series and other data for energy system modelling

    International Nuclear Information System (INIS)

    Janker, Karl Albert

    2015-01-01

    This thesis describes a model which generates renewable power generation time series as input data for energy system models. The focus is on photovoltaic systems and wind turbines. The basis is a high resolution global raster data set of weather data for many years. This data is validated, corrected and preprocessed. The composition of the hourly generation data is done via simulation of the respective technology. The generated time series are aggregated for different regions and are validated against historical production time series.

  14. Linear series of stellar models. Pt. 4. Helium-carbon stars of 3.5 M☉ and 1 M☉

    International Nuclear Information System (INIS)

    Kozlowski, M.; Paczynski, B.; Popova, K.

    1973-01-01

    One linear series of models for a star of 3.5 M☉ and two linear series of models for a star of 1 M☉ are constructed. Models consist of helium rich envelopes (Y = 0.97, Z = 0.03) and pure carbon cores, and they have a rectangular helium profile, Y(M_r). The linear series for a star of 3.5 M☉ begins on the normal branch of the helium main sequence and terminates on the normal branch of the carbon main sequence. This series has eight turning points at which the core mass attains a local extremum. One of the two linear series for a star of 1 M☉ begins on the normal branch of the helium main sequence, terminates on the high density branch of the helium main sequence, and has one turning point. The second linear series for a star of 1 M☉ begins on the normal branch of the carbon main sequence, terminates on the high density branch of the carbon main sequence, and has three turning points. Two such linear series may have a common bifurcation point for a star of about 1.26 M☉. (author)

  15. Series of mixed uranyl-lanthanide (Ce, Nd) organic coordination polymers with aromatic polycarboxylates linkers.

    Science.gov (United States)

    Mihalcea, Ionut; Volkringer, Christophe; Henry, Natacha; Loiseau, Thierry

    2012-09-17

    Three series of mixed uranyl-lanthanide (Ce or Nd) carboxylate coordination polymers have been successfully synthesized by means of a hydrothermal route using either conventional or microwave heating methods. These compounds have been prepared from mixtures of uranyl nitrate and lanthanide nitrate together with phthalic acid (1,2), pyromellitic acid (3,4), or mellitic acid (5,6) in aqueous solution. Single-crystal X-ray diffraction (XRD) revealed that the phthalate complex (UO2)4O2Ln(H2O)7(1,2-bdc)4·NH4·xH2O (Ln = Ce (1), Nd (2); x = 1 for 1, x = 0 for 2) is based on the connection of tetranuclear uranyl-centered building blocks linked to discrete monomeric units LnO2(H2O)7 via the organic species to generate infinite chains, intercalated by free ammonium cations. The pyromellitate phase (UO2)3Ln2(H2O)12(btec)3·5H2O (Ce (3), Nd (4)) contains layers of monomeric uranyl-centered hexagonal and pentagonal bipyramids linked via the carboxylate arms of the organic molecules. The three-dimensionality of the structure is ensured by the connection of the remaining free carboxylate groups with isolated monomeric units LnO2(H2O)7. The network of the third series (UO2)2(OH)Ln(H2O)7(mel)·5H2O (Ce (5), Nd (6)) is built up from dinuclear uranyl units forming layers through connection with the mellitate ligands, which are further linked to each other through discrete monomers LnO3(H2O)6. The thermal decomposition of the various coordination complexes led to the formation of mixed uranium-lanthanide oxides with the fluorite-type structure at 1500 °C (for 1, 2) or 1400 °C for 3-6. The expected U/Ln ratios from the crystal structures were observed for compounds 1-6.

  16. Study of a bio-mechanical model of the movements and deformations of the pelvic organs and integration in the process of radiotherapy treatment for prostate cancer

    International Nuclear Information System (INIS)

    Azad, M.

    2011-01-01

    One of the goals of optimizing treatment planning in prostate cancer radiation therapy is to keep the margins added to the clinical target volume (CTV) as small as possible in order to reduce the volumes of normal tissue irradiated. Several methods have been proposed to define these margins: 1) methods based on the observation of movements obtained with different imaging systems, and 2) predictive methods, in which a margin is calculated from a model representing the motions of the pelvic organs. We have developed and optimized a finite element bio-mechanical model of the prostate, bladder and rectum. This model describes the movement and deformation of the pelvic organs during the filling of certain organs such as the bladder and rectum. An evaluation of this model for predicting the movement of the prostate across the various radiotherapy sessions is presented using a series of CBCT (Cone Beam Computed Tomography) images. (author)

  17. A novel model for Time-Series Data Clustering Based on piecewise SVD and BIRCH for Stock Data Analysis on Hadoop Platform

    Directory of Open Access Journals (Sweden)

    Ibgtc Bowala

    2017-06-01

    Full Text Available With the rapid growth of financial markets, analysts are paying more attention to predictions. Stock data are time series data that arrive in huge amounts. A feasible solution for handling the increasing amount of data is to use a cluster for parallel processing, and the Hadoop parallel computing platform is a typical representative. There are various statistical models for forecasting time series data, but accurate clusters are a prerequisite. Clustering analysis for time series data is one of the main methods for mining time series data for many other analysis processes. However, general clustering algorithms cannot handle time series data directly, because such data have a special structure, high dimensionality, and highly correlated values combined with a high noise level. A novel model for time series clustering is presented using BIRCH, based on piecewise SVD, leading to a novel dimension reduction approach. Highly correlated features are handled using SVD with a novel approach for dimensionality reduction in order to keep the correlated behaviour optimal, and BIRCH is then used for clustering. The algorithm is a novel model that can handle massive time series data. Finally, this new model is successfully applied to real stock time series data from Yahoo Finance with satisfactory results.
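
    The record above outlines the pipeline only in prose; the following minimal sketch illustrates the general idea of piecewise SVD reduction followed by BIRCH clustering. It is not the paper's exact algorithm or its Hadoop implementation: the synthetic series, segment length, number of retained components and cluster count are all illustrative assumptions.

    ```python
    # Minimal sketch (assumptions noted above): reduce each stock series piecewise
    # via SVD, then cluster the reduced representations with BIRCH.
    import numpy as np
    from sklearn.cluster import Birch

    rng = np.random.default_rng(0)
    n_series, length, seg = 50, 240, 20            # 50 synthetic "stocks", 240 days

    series = np.cumsum(rng.normal(size=(n_series, length)), axis=1)

    # Piecewise step: reshape each series into (segments x segment-length) blocks.
    pieces = series.reshape(n_series, length // seg, seg)

    reduced = []
    for x in pieces:
        # SVD of the piecewise matrix; keep the projected scores of the two leading
        # components as a low-dimensional, noise-reduced descriptor of the series.
        u, s, vt = np.linalg.svd(x, full_matrices=False)
        reduced.append((u[:, :2] * s[:2]).ravel())
    reduced = np.asarray(reduced)

    labels = Birch(n_clusters=5).fit_predict(reduced)
    print(labels)
    ```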

  18. A sequential decision framework for increasing college students' support for organ donation and organ donor registration.

    Science.gov (United States)

    Peltier, James W; D'Alessandro, Anthony M; Dahl, Andrew J; Feeley, Thomas Hugh

    2012-09-01

    Despite the fact that college students support social causes, this age group has underparticipated in organ donor registration. Little research attention has been given to understanding deeper, higher-order relationships between the antecedent attitudes toward and perceptions of organ donation and registration behavior. To test a process model useful for understanding the sequential ordering of information necessary for moving college students along a hierarchical decision-making continuum from awareness to support to organ donor registration. The University of Wisconsin organ procurement organization collaborated with the Collegiate American Marketing Association on a 2-year grant funded by the US Health Resources and Services Administration. A total of 981 association members responded to an online questionnaire. The 5 antecedent measures were awareness of organ donation, need acknowledgment, benefits of organ donation, social support, and concerns about organ donation. The 2 consequence variables were support for organ donation and organ donation registration. Structural equation modeling indicated that 5 of 10 direct antecedent pathways led significantly into organ donation support and registration. The impact of the nonsignificant variables was captured via indirect effects through other decision variables. Model fit statistics were good: the goodness of fit index was .998, the adjusted goodness of fit index was .992, and the root mean square error of approximation was .001. This sequential decision-making model provides insight into the need to enhance the acceptance of organ donation and organ donor registration through a series of communications to move people from awareness to behavior.

  19. Extended causal modeling to assess Partial Directed Coherence in multiple time series with significant instantaneous interactions.

    Science.gov (United States)

    Faes, Luca; Nollo, Giandomenico

    2010-11-01

    The Partial Directed Coherence (PDC) and its generalized formulation (gPDC) are popular tools for investigating, in the frequency domain, the concept of Granger causality among multivariate (MV) time series. PDC and gPDC are formalized in terms of the coefficients of an MV autoregressive (MVAR) model which describes only the lagged effects among the time series and forsakes instantaneous effects. However, instantaneous effects are known to affect linear parametric modeling, and are likely to occur in experimental time series. In this study, we investigate the impact on the assessment of frequency domain causality of excluding instantaneous effects from the model underlying PDC evaluation. Moreover, we propose the utilization of an extended MVAR model including both instantaneous and lagged effects. This model is used to assess PDC either in accordance with the definition of Granger causality when considering only lagged effects (iPDC), or with an extended form of causality, when we consider both instantaneous and lagged effects (ePDC). The approach is first evaluated on three theoretical examples of MVAR processes, which show that the presence of instantaneous correlations may produce misleading profiles of PDC and gPDC, while ePDC and iPDC derived from the extended model provide here a correct interpretation of extended and lagged causality. It is then applied to representative examples of cardiorespiratory and EEG MV time series. They suggest that ePDC and iPDC are better interpretable than PDC and gPDC in terms of the known cardiovascular and neural physiologies.
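
    As a point of reference for the definitions above, the sketch below computes the standard (lag-only) PDC from a set of MVAR coefficient matrices; the extended model with instantaneous effects proposed in the record is not reproduced, and the coefficient values are illustrative rather than taken from the study.

    ```python
    # Minimal sketch of the standard PDC computation from MVAR lag coefficients.
    import numpy as np

    A = [np.array([[0.5, 0.0, 0.0],
                   [0.3, 0.4, 0.0],
                   [0.0, 0.3, 0.5]]),           # illustrative lag-1 coefficients
         np.array([[-0.2, 0.0, 0.0],
                   [ 0.0, -0.1, 0.0],
                   [ 0.1,  0.0, -0.2]])]        # illustrative lag-2 coefficients
    n = A[0].shape[0]
    freqs = np.linspace(0, 0.5, 128)             # normalized frequency axis

    pdc = np.zeros((len(freqs), n, n))
    for fi, f in enumerate(freqs):
        # A_bar(f) = I - sum_k A_k * exp(-i 2 pi f k)
        Af = np.eye(n, dtype=complex)
        for k, Ak in enumerate(A, start=1):
            Af -= Ak * np.exp(-2j * np.pi * f * k)
        # PDC_ij(f): column-wise normalization of |A_bar(f)|
        pdc[fi] = np.abs(Af) / np.sqrt((np.abs(Af) ** 2).sum(axis=0, keepdims=True))

    print(pdc.shape)   # (128, 3, 3); pdc[:, i, j] is the coupling j -> i over frequency
    ```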

  20. Normalization of time-series satellite reflectance data to a standard sun-target-sensor geometry using a semi-empirical model

    Science.gov (United States)

    Zhao, Yongguang; Li, Chuanrong; Ma, Lingling; Tang, Lingli; Wang, Ning; Zhou, Chuncheng; Qian, Yonggang

    2017-10-01

    Time series of satellite reflectance data have been widely used to characterize environmental phenomena, describe trends in vegetation dynamics and study climate change. However, several sensors with wide spatial coverage and high observation frequency are designed to have a large field of view (FOV), which causes variations in the sun-target-sensor geometry in time-series reflectance data. In this study, on the basis of the semi-empirical kernel-driven BRDF model, a new semi-empirical model was proposed to normalize the sun-target-sensor geometry of remote sensing images. To evaluate the proposed model, bidirectional reflectances under different canopy growth conditions simulated by the Discrete Anisotropic Radiative Transfer (DART) model were used. The semi-empirical model was first fitted using all simulated bidirectional reflectances. The experimental results showed a good fit between the bidirectional reflectance estimated by the proposed model and the simulated values. Then, MODIS time-series reflectance data were normalized to a common sun-target-sensor geometry by the proposed model. The experimental results showed that the proposed model yielded good fits between the observed and estimated values. The noise-like fluctuations in the time-series reflectance data were also reduced after the sun-target-sensor normalization process.
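
    The record describes the normalization only qualitatively; the sketch below shows the generic linear kernel-driven idea, reflectance = f_iso + f_vol*K_vol + f_geo*K_geo, fitted by least squares and evaluated at a reference geometry. The kernel values stand in for precomputed Ross/Li-type kernels and are illustrative numbers, not the paper's semi-empirical formulation.

    ```python
    # Minimal sketch of normalizing reflectance with a linear kernel-driven model.
    import numpy as np

    refl  = np.array([0.21, 0.25, 0.19, 0.23, 0.22])    # observed reflectances
    k_vol = np.array([0.10, 0.35, -0.05, 0.22, 0.15])   # assumed volumetric kernel values
    k_geo = np.array([-0.8, -0.5, -1.0, -0.6, -0.7])    # assumed geometric kernel values

    # Least-squares fit of the three kernel weights.
    X = np.column_stack([np.ones_like(refl), k_vol, k_geo])
    f_iso, f_vol, f_geo = np.linalg.lstsq(X, refl, rcond=None)[0]

    # Predict reflectance at a standard sun-target-sensor geometry (e.g. nadir view),
    # whose kernel values are again assumed to be precomputed.
    k_vol_ref, k_geo_ref = 0.05, -0.9
    r_normalized = f_iso + f_vol * k_vol_ref + f_geo * k_geo_ref
    print(round(float(r_normalized), 4))
    ```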

  1. Organ dose conversion coefficients for voxel models of the reference male and female from idealized photon exposures

    Science.gov (United States)

    Schlattl, H.; Zankl, M.; Petoussi-Henss, N.

    2007-04-01

    A new series of organ equivalent dose conversion coefficients for whole body external photon exposure is presented for a standardized couple of human voxel models, called Rex and Regina. Irradiations from broad parallel beams in antero-posterior, postero-anterior, left- and right-side lateral directions as well as from a 360° rotational source have been performed numerically by the Monte Carlo transport code EGSnrc. Dose conversion coefficients from an isotropically distributed source were computed, too. The voxel models Rex and Regina originating from real patient CT data comply in body and organ dimensions with the currently valid reference values given by the International Commission on Radiological Protection (ICRP) for the average Caucasian man and woman, respectively. While the equivalent dose conversion coefficients of many organs are in quite good agreement with the reference values of ICRP Publication 74, for some organs and certain geometries the discrepancies amount to 30% or more. Differences between the sexes are of the same order with mostly higher dose conversion coefficients in the smaller female model. However, much smaller deviations from the ICRP values are observed for the resulting effective dose conversion coefficients. With the still valid definition for the effective dose (ICRP Publication 60), the greatest change appears in lateral exposures with a decrease in the new models of at most 9%. However, when the modified definition of the effective dose as suggested by an ICRP draft is applied, the largest deviation from the current reference values is obtained in postero-anterior geometry with a reduction of the effective dose conversion coefficient by at most 12%.

  2. Organ dose conversion coefficients for voxel models of the reference male and female from idealized photon exposures

    International Nuclear Information System (INIS)

    Schlattl, H; Zankl, M; Petoussi-Henss, N

    2007-01-01

    A new series of organ equivalent dose conversion coefficients for whole body external photon exposure is presented for a standardized couple of human voxel models, called Rex and Regina. Irradiations from broad parallel beams in antero-posterior, postero-anterior, left- and right-side lateral directions as well as from a 360 deg. rotational source have been performed numerically by the Monte Carlo transport code EGSnrc. Dose conversion coefficients from an isotropically distributed source were computed, too. The voxel models Rex and Regina originating from real patient CT data comply in body and organ dimensions with the currently valid reference values given by the International Commission on Radiological Protection (ICRP) for the average Caucasian man and woman, respectively. While the equivalent dose conversion coefficients of many organs are in quite good agreement with the reference values of ICRP Publication 74, for some organs and certain geometries the discrepancies amount to 30% or more. Differences between the sexes are of the same order with mostly higher dose conversion coefficients in the smaller female model. However, much smaller deviations from the ICRP values are observed for the resulting effective dose conversion coefficients. With the still valid definition for the effective dose (ICRP Publication 60), the greatest change appears in lateral exposures with a decrease in the new models of at most 9%. However, when the modified definition of the effective dose as suggested by an ICRP draft is applied, the largest deviation from the current reference values is obtained in postero-anterior geometry with a reduction of the effective dose conversion coefficient by at most 12%

  3. International Work-Conference on Time Series

    CERN Document Server

    Pomares, Héctor; Valenzuela, Olga

    2017-01-01

    This volume of selected and peer-reviewed contributions on the latest developments in time series analysis and forecasting updates the reader on topics such as analysis of irregularly sampled time series, multi-scale analysis of univariate and multivariate time series, linear and non-linear time series models, advanced time series forecasting methods, applications in time series analysis and forecasting, advanced methods and online learning in time series and high-dimensional and complex/big data time series. The contributions were originally presented at the International Work-Conference on Time Series, ITISE 2016, held in Granada, Spain, June 27-29, 2016. The series of ITISE conferences provides a forum for scientists, engineers, educators and students to discuss the latest ideas and implementations in the foundations, theory, models and applications in the field of time series analysis and forecasting. It focuses on interdisciplinary and multidisciplinary research encompassing the disciplines of comput...

  4. Refining Markov state models for conformational dynamics using ensemble-averaged data and time-series trajectories

    Science.gov (United States)

    Matsunaga, Y.; Sugita, Y.

    2018-06-01

    A data-driven modeling scheme is proposed for conformational dynamics of biomolecules based on molecular dynamics (MD) simulations and experimental measurements. In this scheme, an initial Markov State Model (MSM) is constructed from MD simulation trajectories, and then, the MSM parameters are refined using experimental measurements through machine learning techniques. The second step can reduce the bias of MD simulation results due to inaccurate force-field parameters. Either time-series trajectories or ensemble-averaged data are available as a training data set in the scheme. Using a coarse-grained model of a dye-labeled polyproline-20, we compare the performance of machine learning estimations from the two types of training data sets. Machine learning from time-series data could provide the equilibrium populations of conformational states as well as their transition probabilities. It estimates hidden conformational states in more robust ways compared to that from ensemble-averaged data although there are limitations in estimating the transition probabilities between minor states. We discuss how to use the machine learning scheme for various experimental measurements including single-molecule time-series trajectories.
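
    To make the first step of the scheme concrete, the following sketch estimates an initial MSM transition matrix by counting lagged transitions in a discretized trajectory; the experimental-refinement (machine-learning) stage described in the record is not reproduced, and the trajectory is synthetic rather than clustered MD frames.

    ```python
    # Minimal sketch: estimate MSM transition probabilities from a state trajectory.
    import numpy as np

    rng = np.random.default_rng(1)
    n_states, lag = 3, 5
    traj = rng.integers(0, n_states, size=2000)   # stand-in for a discretized MD trajectory

    counts = np.zeros((n_states, n_states))
    for a, b in zip(traj[:-lag], traj[lag:]):
        counts[a, b] += 1

    # Row-normalize counts into a transition probability matrix.
    T = counts / counts.sum(axis=1, keepdims=True)
    pi = np.linalg.matrix_power(T, 1000)[0]        # crude stationary-distribution estimate
    print(T.round(3), pi.round(3))
    ```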

  5. Developing a local least-squares support vector machines-based neuro-fuzzy model for nonlinear and chaotic time series prediction.

    Science.gov (United States)

    Miranian, A; Abdollahzade, M

    2013-02-01

    Local modeling approaches, owing to their ability to model different operating regimes of nonlinear systems and processes by independent local models, seem appealing for modeling, identification, and prediction applications. In this paper, we propose a local neuro-fuzzy (LNF) approach based on the least-squares support vector machines (LSSVMs). The proposed LNF approach employs LSSVMs, which are powerful in modeling and predicting time series, as local models and uses hierarchical binary tree (HBT) learning algorithm for fast and efficient estimation of its parameters. The HBT algorithm heuristically partitions the input space into smaller subdomains by axis-orthogonal splits. In each partitioning, the validity functions automatically form a unity partition and therefore normalization side effects, e.g., reactivation, are prevented. Integration of LSSVMs into the LNF network as local models, along with the HBT learning algorithm, yield a high-performance approach for modeling and prediction of complex nonlinear time series. The proposed approach is applied to modeling and predictions of different nonlinear and chaotic real-world and hand-designed systems and time series. Analysis of the prediction results and comparisons with recent and old studies demonstrate the promising performance of the proposed LNF approach with the HBT learning algorithm for modeling and prediction of nonlinear and chaotic systems and time series.
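
    The following sketch shows a single LS-SVM regression local model solved through its dual linear system with an RBF kernel; the hierarchical binary-tree partitioning into multiple local models is not reproduced, and the data and hyper-parameters are illustrative assumptions.

    ```python
    # Minimal sketch of one LS-SVM regression model (dual system with RBF kernel).
    import numpy as np

    def rbf(A, B, sigma=0.5):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * sigma ** 2))

    rng = np.random.default_rng(2)
    x = rng.uniform(-3, 3, size=(60, 1))
    y = np.sin(x[:, 0]) + 0.05 * rng.normal(size=60)     # toy nonlinear mapping

    gamma = 100.0                                        # assumed regularization constant
    K = rbf(x, x)
    n = len(y)
    # LS-SVM dual system: [[0, 1^T], [1, K + I/gamma]] [b, alpha]^T = [0, y]^T
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    b, alpha = sol[0], sol[1:]

    x_test = np.array([[0.5]])
    y_pred = rbf(x_test, x) @ alpha + b
    print(float(y_pred[0]), np.sin(0.5))                 # prediction vs. true value
    ```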

  6. A scalable database model for multiparametric time series: a volcano observatory case study

    Science.gov (United States)

    Montalto, Placido; Aliotta, Marco; Cassisi, Carmelo; Prestifilippo, Michele; Cannata, Andrea

    2014-05-01

    The variables collected by a sensor network constitute a heterogeneous data source that needs to be properly organized in order to be used in research and geophysical monitoring. The term time series refers to a set of observations of a given phenomenon acquired sequentially in time. When the time intervals are equally spaced, one speaks of the period or sampling frequency. Our work describes in detail a possible methodology for the storage and management of time series using a specific data structure. We designed a framework, hereinafter called TSDSystem (Time Series Database System), in order to acquire time series from different data sources and standardize them within a relational database. The standardization operation provides the ability to perform operations, such as querying and visualization, on many measures by synchronizing them using a common time scale. The proposed architecture follows a multiple layer paradigm (Loaders layer, Database layer and Business Logic layer). Each layer is specialized in performing particular operations for the reorganization and archiving of data from different sources such as ASCII, Excel, ODBC (Open DataBase Connectivity), and files accessible from the Internet (web pages, XML). In particular, the Loaders layer performs a security check of the working status of each running software component through a heartbeat system, in order to automate the discovery of acquisition issues and other warning conditions. Although our system has to manage huge amounts of data, performance is guaranteed by using a smart table partitioning strategy that keeps the percentage of data stored in each database table balanced. TSDSystem also contains modules for the visualization of acquired data, which provide the possibility to query different time series over a specified time range, or to follow the real-time signal acquisition, according to a data access policy for the users.
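
    The standardization idea can be illustrated compactly even outside a relational database; the sketch below resamples two heterogeneous channels onto a common time scale so they can be queried together. The channel names, sampling rates and values are illustrative and are not part of TSDSystem.

    ```python
    # Minimal sketch: align two heterogeneous channels on a common time scale.
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(3)
    # Source A: regular 10-second samples; source B: irregular samples.
    a = pd.Series(rng.normal(size=360),
                  index=pd.date_range("2014-05-01", periods=360, freq="10s"),
                  name="seismic_rms")
    b = pd.Series(rng.normal(size=50),
                  index=pd.to_datetime("2014-05-01")
                        + pd.to_timedelta(np.sort(rng.uniform(0, 3600, 50)), unit="s"),
                  name="so2_flux")

    # Standardize both channels to a common 1-minute time scale.
    common = pd.concat([a.resample("1min").mean(),
                        b.resample("1min").mean()], axis=1)
    print(common.head())
    ```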

  7. Data on copula modeling of mixed discrete and continuous neural time series.

    Science.gov (United States)

    Hu, Meng; Li, Mingyao; Li, Wu; Liang, Hualou

    2016-06-01

    Copula is an important tool for modeling neural dependence. Recent work on copula has been expanded to jointly model mixed time series in neuroscience ("Hu et al., 2016, Joint Analysis of Spikes and Local Field Potentials using Copula" [1]). Here we present further data for joint analysis of spike and local field potential (LFP) with copula modeling. In particular, the details of different model orders and the influence of possible spike contamination in LFP data from the same and different electrode recordings are presented. To further facilitate the use of our copula model for the analysis of mixed data, we provide the Matlab codes, together with example data.

  8. Harmonic regression of Landsat time series for modeling attributes from national forest inventory data

    Science.gov (United States)

    Wilson, Barry T.; Knight, Joseph F.; McRoberts, Ronald E.

    2018-03-01

    Imagery from the Landsat Program has been used frequently as a source of auxiliary data for modeling land cover, as well as a variety of attributes associated with tree cover. With ready access to all scenes in the archive since 2008 due to the USGS Landsat Data Policy, new approaches to deriving such auxiliary data from dense Landsat time series are required. Several methods have previously been developed for use with finer temporal resolution imagery (e.g. AVHRR and MODIS), including image compositing and harmonic regression using Fourier series. The manuscript presents a study, using Minnesota, USA during the years 2009-2013 as the study area and timeframe. The study examined the relative predictive power of land cover models, in particular those related to tree cover, using predictor variables based solely on composite imagery versus those using estimated harmonic regression coefficients. The study used two common non-parametric modeling approaches (i.e. k-nearest neighbors and random forests) for fitting classification and regression models of multiple attributes measured on USFS Forest Inventory and Analysis plots using all available Landsat imagery for the study area and timeframe. The estimated Fourier coefficients developed by harmonic regression of tasseled cap transformation time series data were shown to be correlated with land cover, including tree cover. Regression models using estimated Fourier coefficients as predictor variables showed a two- to threefold increase in explained variance for a small set of continuous response variables, relative to comparable models using monthly image composites. Similarly, the overall accuracies of classification models using the estimated Fourier coefficients were approximately 10-20 percentage points higher than the models using the image composites, with corresponding individual class accuracies between six and 45 percentage points higher.
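
    A minimal sketch of the harmonic-regression step for one pixel is given below: a first-order Fourier series is fitted by least squares, and its coefficients (or the derived amplitude and phase) can serve as predictor variables. The observation dates and values are synthetic, not Landsat data.

    ```python
    # Minimal sketch: harmonic (Fourier-series) regression for one pixel time series.
    import numpy as np

    rng = np.random.default_rng(4)
    doy = np.sort(rng.choice(np.arange(1, 366), size=40, replace=False))  # acquisition days
    t = 2 * np.pi * doy / 365.25
    obs = 0.3 + 0.15 * np.cos(t - 1.0) + 0.02 * rng.normal(size=t.size)   # e.g. a tasseled cap index

    # Design matrix: intercept plus one harmonic (add column pairs for higher orders).
    X = np.column_stack([np.ones_like(t), np.cos(t), np.sin(t)])
    coef, *_ = np.linalg.lstsq(X, obs, rcond=None)
    intercept, a1, b1 = coef

    amplitude = np.hypot(a1, b1)      # seasonal amplitude
    phase = np.arctan2(b1, a1)        # timing of the seasonal peak
    print(intercept.round(3), amplitude.round(3), phase.round(3))
    ```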

  9. Performance Evaluation of Linear (ARMA) and Threshold Nonlinear (TAR) Time Series Models in Daily River Flow Modeling (Case Study: Upstream Basin Rivers of Zarrineh Roud Dam)

    Directory of Open Access Journals (Sweden)

    Farshad Fathian

    2017-01-01

    Full Text Available Introduction: Time series models are generally categorized as data-driven or mathematically-based methods. These models are known as one of the most important tools in the modeling and forecasting of hydrological processes, and are used for the design and scientific management of water resources projects. On the other hand, a better understanding of the river flow process is vital for appropriate streamflow modeling and forecasting. One of the main concerns of hydrological time series modeling is whether the hydrologic variable is governed by linear or nonlinear models through time. Although linear time series models have been widely applied in hydrology research, there has recently been increasing interest in the application of nonlinear time series approaches. The threshold autoregressive (TAR) method is frequently applied in modeling the mean (first-order moment) of financial and economic time series. This type of model has not yet received considerable attention from the hydrological community. The main purposes of this paper are to analyze and discuss stochastic modeling of the daily river flow time series of the study area using linear (ARMA: autoregressive moving average) and nonlinear (two- and three-regime TAR) models. Material and Methods: The study area consists of four sub-basins, namely Saghez Chai, Jighato Chai, Khorkhoreh Chai and Sarogh Chai from west to east, respectively, which discharge water into the Zarrineh Roud dam reservoir. River flow time series of 6 hydro-gauge stations located on the upstream basin rivers of Zarrineh Roud dam (located in the southern part of the Urmia Lake basin) were considered for modeling purposes. All the data series used here start on January 1, 1997, and end on December 31, 2011. In this study, the daily river flow data from January 01 1997 to December 31 2009 (13 years) were chosen for calibration and data for January 01 2010 to December 31 2011
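
    A minimal sketch of the threshold idea is shown below: a two-regime TAR(1) model fitted by splitting observations on a lagged-flow threshold and estimating a separate AR(1) in each regime. The synthetic series and the threshold choice are illustrative, not the Zarrineh Roud records.

    ```python
    # Minimal sketch: two-regime TAR(1) fit by regime-wise least squares.
    import numpy as np

    rng = np.random.default_rng(5)
    q = np.empty(1000)
    q[0] = 10.0
    for t in range(1, 1000):                      # synthetic flow-like series
        phi = 0.9 if q[t - 1] > 12 else 0.6
        q[t] = 2.0 + phi * q[t - 1] + rng.normal(scale=0.5)

    lagged, current = q[:-1], q[1:]
    threshold = np.median(q)                      # illustrative threshold choice

    params = {}
    for name, mask in [("low", lagged <= threshold), ("high", lagged > threshold)]:
        X = np.column_stack([np.ones(mask.sum()), lagged[mask]])
        params[name] = np.linalg.lstsq(X, current[mask], rcond=None)[0]

    print({k: v.round(3) for k, v in params.items()})   # intercept and AR(1) slope per regime
    ```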

  10. Heat recovery system series arrangements

    Science.gov (United States)

    Kauffman, Justin P.; Welch, Andrew M.; Dawson, Gregory R.; Minor, Eric N.

    2017-11-14

    The present disclosure is directed to heat recovery systems that employ two or more organic Rankine cycle (ORC) units disposed in series. According to certain embodiments, each ORC unit includes an evaporator that heats an organic working fluid, a turbine generator set that expands the working fluid to generate electricity, a condenser that cools the working fluid, and a pump that returns the working fluid to the evaporator. The heating fluid is directed through each evaporator to heat the working fluid circulating within each ORC unit, and the cooling fluid is directed through each condenser to cool the working fluid circulating within each ORC unit. The heating fluid and the cooling fluid flow through the ORC units in series in the same or opposite directions.

  11. 75 FR 38017 - Airworthiness Directives; McDonnell Douglas Corporation Model DC-9-10 Series Airplanes, DC-9-30...

    Science.gov (United States)

    2010-07-01

    ... Airworthiness Directives; McDonnell Douglas Corporation Model DC- 9-10 Series Airplanes, DC-9-30 Series... previously to all known U.S. owners and operators of the McDonnell Douglas Corporation airplanes identified... INFORMATION: On July 15, 2009, we issued AD 2009-15-16, which applies to all McDonnell Douglas Model DC-9-10...

  12. Modeling the influence of organic acids on soil weathering

    Science.gov (United States)

    Lawrence, Corey R.; Harden, Jennifer W.; Maher, Kate

    2014-01-01

    Biological inputs and organic matter cycling have long been regarded as important factors in the physical and chemical development of soils. In particular, the extent to which low molecular weight organic acids, such as oxalate, influence geochemical reactions has been widely studied. Although the effects of organic acids are diverse, there is strong evidence that organic acids accelerate the dissolution of some minerals. However, the influence of organic acids at the field-scale and over the timescales of soil development has not been evaluated in detail. In this study, a reactive-transport model of soil chemical weathering and pedogenic development was used to quantify the extent to which organic acid cycling controls mineral dissolution rates and long-term patterns of chemical weathering. Specifically, oxalic acid was added to simulations of soil development to investigate a well-studied chronosequence of soils near Santa Cruz, CA. The model formulation includes organic acid input, transport, decomposition, organic-metal aqueous complexation and mineral surface complexation in various combinations. Results suggest that although organic acid reactions accelerate mineral dissolution rates near the soil surface, the net response is an overall decrease in chemical weathering. Model results demonstrate the importance of organic acid input concentrations, fluid flow, decomposition and secondary mineral precipitation rates on the evolution of mineral weathering fronts. In particular, model soil profile evolution is sensitive to kaolinite precipitation and oxalate decomposition rates. The soil profile-scale modeling presented here provides insights into the influence of organic carbon cycling on soil weathering and pedogenesis and supports the need for further field-scale measurements of the flux and speciation of reactive organic compounds.

  13. Putting "Organizations" into an Organization Theory Course: A Hybrid CAO Model for Teaching Organization Theory

    Science.gov (United States)

    Hannah, David R.; Venkatachary, Ranga

    2010-01-01

    In this article, the authors present a retrospective analysis of an instructor's multiyear redesign of a course on organization theory into what is called a hybrid Classroom-as-Organization model. It is suggested that this new course design served to apprentice students to function in quasi-real organizational structures. The authors further argue…

  14. Assessing and improving the quality of modeling : a series of empirical studies about the UML

    NARCIS (Netherlands)

    Lange, C.F.J.

    2007-01-01

    Assessing and Improving the Quality of Modeling A Series of Empirical Studies about the UML This thesis addresses the assessment and improvement of the quality of modeling in software engineering. In particular, we focus on the Unified Modeling Language (UML), which is the de facto standard in

  15. Healing models for organizations: description, measurement, and outcomes.

    Science.gov (United States)

    Malloch, K

    2000-01-01

    Healthcare leaders are continually searching for ways to improve their ability to provide optimal healthcare services, be financially viable, and retain quality caregivers, often feeling like such goals are impossible to achieve in today's intensely competitive environment. Many healthcare leaders intuitively recognize the need for more humanistic models and the probable connection with positive patient outcomes and financial success but are hesitant to make significant changes in their organizations because of the lack of model descriptions or documented recognition of the clinical and financial advantages of humanistic models. This article describes a study that was developed in response to the increasing work in humanistic or healing environment models and the need for validation of the advantages of such models. The healthy organization model, a framework for healthcare organizations that incorporates humanistic healing values within the traditional structure, is presented as a result of the study. This model addresses the importance of optimal clinical services, financial performance, and staff satisfaction. The five research-based organizational components that form the framework are described, and key indicators of organizational effectiveness over a five-year period are presented. The resulting empirical data are strongly supportive of the healing model and reflect positive outcomes for the organization.

  16. A Time-Series Water Level Forecasting Model Based on Imputation and Variable Selection Method

    OpenAIRE

    Jun-He Yang; Ching-Hsue Cheng; Chia-Pan Chan

    2017-01-01

    Reservoirs are important for households and impact the national economy. This paper proposed a time-series forecasting model based on estimating a missing value followed by variable selection to forecast the reservoir's water level. This study collected data from the Taiwan Shimen Reservoir as well as daily atmospheric data from 2008 to 2015. The two datasets are concatenated into an integrated dataset based on ordering of the data as a research dataset. The proposed time-series forecasting m...

  17. Organization model and formalized description of nuclear enterprise information system

    International Nuclear Information System (INIS)

    Yuan Feng; Song Yafeng; Li Xudong

    2012-01-01

    The organization model is one of the most important models of a Nuclear Enterprise Information System (NEIS). A scientific and reasonable organization model is a prerequisite for the robustness and extensibility of an NEIS, and is also the foundation for the integration of heterogeneous systems. Firstly, the paper describes the conceptual model of the NEIS with an ontology chart, which provides a consistent semantic framework for the organization. Then it discusses the relations between the concepts in detail. Finally, it gives a formalized description of the organization model of the NEIS based on a six-tuple array. (authors)

  18. Effects of cement organic additives on the adsorption of uranyl ions on calcium silicate hydrate phases: experimental determination and computational molecular modelling

    International Nuclear Information System (INIS)

    Androniuk, Iuliia

    2017-01-01

    Cementitious materials are extensively used in the design and construction of radioactive waste repositories. One of the ways to enhance their performance is to introduce organic admixtures into the cement structure. However, the presence of organics in the pore water may affect radionuclide mobility: organic molecules can form water-soluble complexes and compete for sorption sites. This work was designed to gain a detailed understanding of the mechanisms of such interactions at the molecular level. The model system has three components. First, pure C-S-H phases with different Ca/Si ratios were chosen as a cement model. Second, gluconate (a simple, well-described molecule) was selected as a good starting organic additive model to probe the interaction mechanisms on the molecular scale. A more complex system involving a poly-carboxylate super-plasticizer (PCE) was also tested. The third component, U(VI), is a representative of the actinide radionuclide series. Developing a description of the effects of organics for radioactive waste disposal applications was the primary objective of this work. The study of binary systems provides reference data for the investigation of more complex ternary systems (C-S-H/organic/U(VI)). The interactions are studied by means of both experimental and computational molecular modelling techniques. Data on sorption and desorption kinetics and isotherms for the additives and for U(VI) on C-S-H are acquired in this work. In parallel, atomistic models are developed for the interfaces of interest. Structural, energetic, and dynamic aspects of the sorption processes on the cement surface are quantitatively modeled by molecular dynamics techniques. (author)

  19. A Landsat Time-Series Stacks Model for Detection of Cropland Change

    Science.gov (United States)

    Chen, J.; Chen, J.; Zhang, J.

    2017-09-01

    Global, timely, accurate and cost-effective cropland monitoring with a fine spatial resolution will dramatically improve our understanding of the effects of agriculture on greenhouse gas emissions, food safety, and human health. Time-series remote sensing imagery has shown particular potential for describing land cover dynamics. Traditional change detection techniques are often not capable of detecting land cover changes within time series that are severely influenced by seasonal differences, and are therefore more likely to generate pseudo changes. Here, we introduced and tested LTSM (Landsat time-series stacks model), an improvement of the previously proposed Continuous Change Detection and Classification (CCDC) approach, to extract spectral trajectories of land surface change using dense Landsat time-series stacks (LTS). The method is expected to eliminate pseudo changes caused by phenology driven by seasonal patterns. The main idea of the method is that, using all available Landsat 8 images within a year, an LTSM consisting of a two-term harmonic function is estimated iteratively for each pixel in each spectral band. The LTSM defines change areas by differencing the predicted and observed Landsat images. The LTSM approach was compared with the change vector analysis (CVA) method. The results indicated that the LTSM method correctly detected the "true changes" without overestimating the "false" ones, while CVA pointed out "true change" pixels along with a large number of "false changes". The detection of change areas achieved an overall accuracy of 92.37 %, with a kappa coefficient of 0.676.
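
    The flagging logic can be illustrated as follows: fit a two-term harmonic function to the stable part of a pixel's series and mark observations whose residuals exceed a multiple of the model RMSE. This is a simplified stand-in for the iterative LTSM estimation, using synthetic data and an assumed threshold factor.

    ```python
    # Minimal sketch: harmonic prediction and residual-based change flagging.
    import numpy as np

    rng = np.random.default_rng(6)
    doy = np.arange(8, 366, 16)                        # 16-day acquisitions
    t = 2 * np.pi * doy / 365.25
    obs = 0.4 + 0.1 * np.sin(t) + 0.01 * rng.normal(size=t.size)
    obs[-3:] += 0.15                                   # simulated abrupt land-cover change

    # Two-term harmonic design matrix (annual and semi-annual cycles).
    X = np.column_stack([np.ones_like(t), np.sin(t), np.cos(t),
                         np.sin(2 * t), np.cos(2 * t)])
    coef, *_ = np.linalg.lstsq(X[:-3], obs[:-3], rcond=None)   # fit on the stable period
    pred = X @ coef
    rmse = np.sqrt(np.mean((obs[:-3] - pred[:-3]) ** 2))

    change_flag = np.abs(obs - pred) > 3 * rmse        # assumed 3-sigma-style threshold
    print(doy[change_flag])                            # dates flagged as change
    ```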

  20. Nonlinear detection of disordered voice productions from short time series based on a Volterra-Wiener-Korenberg model

    Energy Technology Data Exchange (ETDEWEB)

    Zhang Yu, E-mail: yuzhang@xmu.edu.cn [Key Laboratory of Underwater Acoustic Communication and Marine Information Technology of the Ministry of Education, Xiamen University, Xiamen Fujian 361005 (China); Sprecher, Alicia J. [Department of Surgery, Division of Otolaryngology - Head and Neck Surgery, University of Wisconsin School of Medicine and Public Health, Madison, WI 53792-7375 (United States); Zhao Zongxi [Key Laboratory of Underwater Acoustic Communication and Marine Information Technology of the Ministry of Education, Xiamen University, Xiamen Fujian 361005 (China); Jiang, Jack J. [Department of Surgery, Division of Otolaryngology - Head and Neck Surgery, University of Wisconsin School of Medicine and Public Health, Madison, WI 53792-7375 (United States)

    2011-09-15

    Highlights: > The VWK method effectively detects the nonlinearity of a discrete map. > The method describes the chaotic time series of a biomechanical vocal fold model. > Nonlinearity in laryngeal pathology is detected from short and noisy time series. - Abstract: In this paper, we apply the Volterra-Wiener-Korenberg (VWK) model method to detect nonlinearity in disordered voice productions. The VWK method effectively describes the nonlinearity of a third-order nonlinear map. It allows for the analysis of short and noisy data sets. The extracted VWK model parameters show an agreement with the original nonlinear map parameters. Furthermore, the VWK model method is applied to successfully assess the nonlinearity of a biomechanical voice production model simulating irregular vibratory dynamics of vocal folds with a unilateral vocal polyp. Finally, we show the clinical applicability of this nonlinear detection method to analyze the electroglottographic data generated by 14 patients with vocal nodules or polyps. The VWK model method shows potential in describing the nonlinearity inherent in disordered voice productions from short and noisy time series that are common in the clinical setting.

  1. Nonlinear detection of disordered voice productions from short time series based on a Volterra-Wiener-Korenberg model

    International Nuclear Information System (INIS)

    Zhang Yu; Sprecher, Alicia J.; Zhao Zongxi; Jiang, Jack J.

    2011-01-01

    Highlights: → The VWK method effectively detects the nonlinearity of a discrete map. → The method describes the chaotic time series of a biomechanical vocal fold model. → Nonlinearity in laryngeal pathology is detected from short and noisy time series. - Abstract: In this paper, we apply the Volterra-Wiener-Korenberg (VWK) model method to detect nonlinearity in disordered voice productions. The VWK method effectively describes the nonlinearity of a third-order nonlinear map. It allows for the analysis of short and noisy data sets. The extracted VWK model parameters show an agreement with the original nonlinear map parameters. Furthermore, the VWK model method is applied to successfully assess the nonlinearity of a biomechanical voice production model simulating irregular vibratory dynamics of vocal folds with a unilateral vocal polyp. Finally, we show the clinical applicability of this nonlinear detection method to analyze the electroglottographic data generated by 14 patients with vocal nodules or polyps. The VWK model method shows potential in describing the nonlinearity inherent in disordered voice productions from short and noisy time series that are common in the clinical setting.

  2. Nonlinear time series modeling and forecasting the seismic data of the Hindu Kush region

    Science.gov (United States)

    Khan, Muhammad Yousaf; Mittnik, Stefan

    2018-01-01

    In this study, we extended the application of linear and nonlinear time series models in the field of earthquake seismology and examined the out-of-sample forecast accuracy of linear Autoregressive (AR), Autoregressive Conditional Duration (ACD), Self-Exciting Threshold Autoregressive (SETAR), Threshold Autoregressive (TAR), Logistic Smooth Transition Autoregressive (LSTAR), Additive Autoregressive (AAR), and Artificial Neural Network (ANN) models for seismic data of the Hindu Kush region. We also extended the previous studies by using Vector Autoregressive (VAR) and Threshold Vector Autoregressive (TVAR) models and compared their forecasting accuracy with the linear AR model. Unlike previous studies that typically consider threshold model specifications using an internal threshold variable, we specified these models with external transition variables and compared their out-of-sample forecasting performance with the linear benchmark AR model. The modeling results show that the time series models used in the present study are capable of capturing the dynamic structure present in the seismic data. The point forecast results indicate that the AR model generally outperforms the nonlinear models. However, in some cases, threshold models with an external threshold variable specification produce more accurate forecasts, indicating that the specification of threshold time series models is of crucial importance. For the raw seismic data, the ACD model does not show an improved out-of-sample forecasting performance over the linear AR model. The results indicate that the AR model is the best forecasting device for modeling and forecasting the raw seismic data of the Hindu Kush region.
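
    As a reference for the benchmark model used throughout the record, the sketch below fits an AR(p) model by least squares and scores one-step-ahead out-of-sample forecasts with RMSE; the synthetic series, the order and the train/test split are illustrative, not the Hindu Kush catalogue.

    ```python
    # Minimal sketch: AR(p) least-squares fit and one-step out-of-sample forecasting.
    import numpy as np

    def fit_ar(y, p):
        # Columns are lags 1..p; first coefficient is the intercept.
        X = np.column_stack([y[p - k - 1:len(y) - k - 1] for k in range(p)])
        X = np.column_stack([np.ones(len(X)), X])
        coef, *_ = np.linalg.lstsq(X, y[p:], rcond=None)
        return coef

    def forecast_one_step(coef, history, p):
        return coef[0] + coef[1:] @ history[-1:-p - 1:-1]

    rng = np.random.default_rng(7)
    y = np.empty(600)
    y[:2] = 0.0
    for t in range(2, 600):                       # synthetic AR(2)-like series
        y[t] = 0.5 * y[t - 1] - 0.2 * y[t - 2] + rng.normal()

    p, split = 2, 500
    coef = fit_ar(y[:split], p)
    preds = np.array([forecast_one_step(coef, y[:t], p) for t in range(split, 600)])
    rmse = np.sqrt(np.mean((y[split:600] - preds) ** 2))
    print(coef.round(3), round(float(rmse), 3))
    ```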

  3. The System Dynamics Model for Development of Organic Agriculture

    Science.gov (United States)

    Rozman, Črtomir; Škraba, Andrej; Kljajić, Miroljub; Pažek, Karmen; Bavec, Martina; Bavec, Franci

    2008-10-01

    Organic agriculture is the most environmentally valuable agricultural system and has strategic importance at the national level that goes beyond the interests of the agricultural sector. In this paper we address the development of an organic farming simulation model based on the system dynamics (SD) methodology. The system incorporates the relevant variables that affect the development of organic farming. A group decision support system (GDSS) was used in order to identify the most relevant variables for the construction of a causal loop diagram and further model development. The model seeks answers to strategic questions related to the level of organically utilized area, levels of production and crop selection in a long-term dynamic context, and will be used for the simulation of different policy scenarios for organic farming and their impact on the economic and environmental parameters of organic production at an aggregate level.

  4. Isotopic fractionation between organic carbon and carbonate carbon in Precambrian banded ironstone series from Brazil

    International Nuclear Information System (INIS)

    Schidlowski, M.; Eichmann, R.; Fiebiger, W.

    1976-01-01

    37 δ13Csub(org) and 9 δ13Csub(carb) values furnished by argillaceous and carbonate sediments from the Rio das Velhas and Minas Series (Minas Gerais, Brazil) have yielded means of -24.3 ± 3.9 ‰ [PDB] and -0.9 ± 1.4 ‰ [PDB], respectively. These results, obtained from a major sedimentary banded ironstone province with an age between 2 and 3 x 10^9 yr, support previous assumptions that isotopic fractionation between inorganic and organic carbon in Precambrian sediments is about the same as in Phanerozoic rocks. This is consistent with a theoretically expected constancy of the kinetic fractionation factor governing biological carbon fixation and, likewise, with a photosynthetic pedigree of the reduced carbon fraction of Precambrian rocks. (orig.) [de]

  5. Time series modelling and forecasting of emergency department overcrowding.

    Science.gov (United States)

    Kadri, Farid; Harrou, Fouzi; Chaabane, Sondès; Tahon, Christian

    2014-09-01

    Efficient management of patient flow (demand) in emergency departments (EDs) has become an urgent issue for many hospital administrations. Today, more and more attention is being paid to hospital management systems to optimally manage patient flow and to improve management strategies, efficiency and safety in such establishments. To this end, EDs require significant human and material resources, but unfortunately these are limited. Within such a framework, the ability to accurately forecast demand in emergency departments has considerable implications for hospitals to improve resource allocation and strategic planning. The aim of this study was to develop models for forecasting daily attendances at the hospital emergency department in Lille, France. The study demonstrates how time-series analysis can be used to forecast, at least in the short term, demand for emergency services in a hospital emergency department. The forecasts were based on daily patient attendances at the paediatric emergency department in Lille regional hospital centre, France, from January 2012 to December 2012. An autoregressive integrated moving average (ARIMA) method was applied separately to each of the two GEMSA categories and total patient attendances. Time-series analysis was shown to provide a useful, readily available tool for forecasting emergency department demand.
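
    A minimal sketch of the modeling step is given below using the statsmodels ARIMA implementation on a synthetic daily-attendance series; the (p,d,q) order and the data are illustrative and do not reproduce the published Lille specification.

    ```python
    # Minimal sketch: ARIMA fit and short-horizon forecast on a synthetic series.
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.arima.model import ARIMA

    rng = np.random.default_rng(8)
    days = pd.date_range("2012-01-01", "2012-12-31", freq="D")
    weekly = 10 * np.sin(2 * np.pi * np.arange(len(days)) / 7)      # weekly pattern
    attendance = pd.Series(120 + weekly + rng.normal(scale=5, size=len(days)),
                           index=days)

    model = ARIMA(attendance, order=(2, 0, 1)).fit()   # illustrative order, not the paper's
    forecast = model.forecast(steps=7)                 # one-week-ahead daily demand
    print(forecast.round(1))
    ```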

  6. A Time-Series Water Level Forecasting Model Based on Imputation and Variable Selection Method.

    Science.gov (United States)

    Yang, Jun-He; Cheng, Ching-Hsue; Chan, Chia-Pan

    2017-01-01

    Reservoirs are important for households and impact the national economy. This paper proposes a time-series forecasting model based on estimating missing values followed by variable selection to forecast the reservoir's water level. This study collected data from the Taiwan Shimen Reservoir as well as daily atmospheric data from 2008 to 2015. The two datasets are concatenated into an integrated dataset based on the ordering of the data, which serves as the research dataset. The proposed time-series forecasting model has three main steps. First, this study uses five imputation methods to handle the missing values. Second, it identifies the key variables via factor analysis and then deletes the unimportant variables sequentially via the variable selection method. Finally, the proposed model uses a Random Forest to build the forecasting model of the reservoir's water level, which is compared with the listed methods in terms of forecasting error. The experimental results indicate that the Random Forest forecasting model, when applied with variable selection over the full set of variables, has better forecasting performance than the listed models. In addition, the experiment shows that the proposed variable selection can help the five forecasting methods used here improve their forecasting capability.
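
    The final modeling step can be sketched as follows: a Random Forest trained on lagged levels plus one exogenous predictor to forecast the next day's water level. The synthetic data and the choice of predictors are illustrative assumptions, not the Shimen Reservoir dataset or the paper's selected variables.

    ```python
    # Minimal sketch: Random Forest one-step-ahead water-level forecasting.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(9)
    n = 500
    rain = rng.gamma(shape=2.0, scale=3.0, size=n)          # synthetic daily rainfall
    level = np.empty(n)
    level[0] = 240.0
    for t in range(1, n):                                   # synthetic water level dynamics
        level[t] = 0.98 * level[t - 1] + 0.2 * rain[t] + rng.normal(scale=0.3)

    # Features: yesterday's and the day before's level, plus today's rainfall.
    X = np.column_stack([level[1:-1], level[:-2], rain[2:]])
    y = level[2:]

    split = 400
    rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X[:split], y[:split])
    pred = rf.predict(X[split:])
    print(np.sqrt(np.mean((y[split:] - pred) ** 2)).round(3))   # out-of-sample RMSE
    ```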

  7. A dynamic model to explain hydration behaviour along the lanthanide series

    International Nuclear Information System (INIS)

    Duvail, M.; Spezia, R.; Vitorge, P.

    2008-01-01

    An understanding of the hydration structure of heavy atoms, such as transition metals, lanthanides and actinides, in aqueous solution is of fundamental importance in order to address their solvation properties and chemical reactivity. Herein we present a systematic molecular dynamics study of Ln3+ hydration in bulk water that can be used as a reference for experimental and theoretical research in this and related fields. Our study of hydration structure and dynamics along the entire Ln3+ series provides a dynamic picture of the CN behavioural change from light (CN=9 predominating) to heavy (CN=8 predominating) lanthanides consistent with the exchange mechanism proposed by Helm, Merbach and co-workers. This scenario is summarized in this work. The hydrated light lanthanides are stable TTP structures containing two kinds of water molecules: six molecules forming the trigonal prism and three in the centre triangle. Towards the middle of the series both ionic radii and polarizabilities decrease, such that first-shell water-water repulsion increases and water-cation attraction decreases. This mainly applies to molecules of the centre triangle of the nine-fold structure. Thus, one of these molecules stays in the second hydration sphere of the lanthanide for longer average times as one progresses along the lanthanide series. The interchange between predominantly CN=9 and CN=8 is found between Tb and Dy. Therefore, we propose a model that determines the properties governing the change in the first-shell coordination number across the series, confirming the basic hypothesis proposed by Helm and Merbach. We show that it is not a sudden change in behaviour, but rather that it results from a statistical predominance of one first hydration shell structure containing nine water molecules over one containing eight. This is observed progressively across the series. (O.M.)

  8. Mining Gene Regulatory Networks by Neural Modeling of Expression Time-Series.

    Science.gov (United States)

    Rubiolo, Mariano; Milone, Diego H; Stegmayer, Georgina

    2015-01-01

    Discovering gene regulatory networks from data is one of the most studied topics in recent years. Neural networks can be successfully used to infer an underlying gene network by modeling expression profiles as time series. This work proposes a novel method based on a pool of neural networks for obtaining a gene regulatory network from a gene expression dataset. They are used for modeling each possible interaction between pairs of genes in the dataset, and a set of mining rules is applied to accurately detect the underlying relations among genes. The results obtained on artificial and real datasets confirm the method's effectiveness for discovering regulatory networks from a proper modeling of the temporal dynamics of gene expression profiles.

  9. Development of New Loan Payment Models with Piecewise Geometric Gradient Series

    Directory of Open Access Journals (Sweden)

    Erdal Aydemir

    2014-12-01

    Full Text Available Engineering economics plays an important role in decision making. The cash flows, the time value of money and interest rates are also among the most important research fields in mathematical finance. Generalized formulae obtained from a variety of models involving the time value of money and cash flows are inadequate for solving some problems. In this study, a new generalized formula is considered for the first time, derived from a loan payment model in which a certain number of payment amounts are determined by the customer at the beginning of the payment period and the remaining repayments follow a piecewise linear gradient series. As a result, some numerical examples with solutions are given for the developed models.
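
    A minimal numeric sketch of the kind of payment plan named in the title is given below: a few customer-chosen fixed payments followed by repayments growing as a geometric gradient, evaluated with a simple balance recursion. The interest rate, payment amounts and horizon are illustrative assumptions, not the paper's derived closed-form formulae.

    ```python
    # Minimal sketch: balance recursion for a piecewise payment plan.
    principal = 100_000.0
    i = 0.01                                   # assumed monthly interest rate
    fixed_payment, n_fixed = 1_200.0, 6        # phase 1: customer-chosen fixed payments
    first_grad_payment, g, n_grad = 1_400.0, 0.02, 54   # phase 2: geometric growth at rate g

    balance = principal
    for t in range(1, n_fixed + n_grad + 1):
        if t <= n_fixed:
            payment = fixed_payment
        else:
            payment = first_grad_payment * (1 + g) ** (t - n_fixed - 1)
        # Accrue one period of interest, then apply the period's payment.
        balance = balance * (1 + i) - payment
        if t in (n_fixed, n_fixed + n_grad):
            print(f"after period {t}: balance = {balance:,.2f}")
    ```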

  10. Statistical tools for analysis and modeling of cosmic populations and astronomical time series: CUDAHM and TSE

    Science.gov (United States)

    Loredo, Thomas; Budavari, Tamas; Scargle, Jeffrey D.

    2018-01-01

    This presentation provides an overview of open-source software packages addressing two challenging classes of astrostatistics problems. (1) CUDAHM is a C++ framework for hierarchical Bayesian modeling of cosmic populations, leveraging graphics processing units (GPUs) to enable applying this computationally challenging paradigm to large datasets. CUDAHM is motivated by measurement error problems in astronomy, where density estimation and linear and nonlinear regression must be addressed for populations of thousands to millions of objects whose features are measured with possibly complex uncertainties, potentially including selection effects. An example calculation demonstrates accurate GPU-accelerated luminosity function estimation for simulated populations of 10^6 objects in about two hours using a single NVIDIA Tesla K40c GPU. (2) Time Series Explorer (TSE) is a collection of software in Python and MATLAB for exploratory analysis and statistical modeling of astronomical time series. It comprises a library of stand-alone functions and classes, as well as an application environment for interactive exploration of time series data. The presentation will summarize key capabilities of this emerging project, including new algorithms for analysis of irregularly-sampled time series.

  11. Model of organ dose combination

    International Nuclear Information System (INIS)

    Valley, J.-F.; Lerch, P.

    1977-01-01

    The ICRP recommendations are based on the limitation of the dose to each organ. In practice, and for a single source, the critical organ concept makes it possible to limit the calculation and represents the irradiation status of an individual. When several sources of radiation are involved, the derivation of the dose contribution of each source to each organ is necessary. In order to represent the irradiation status, a new parameter has to be defined. Proposals have been made by some authors, in particular by Jacobi, who introduces at this level biological parameters such as the incidence rate of detriment and its severity. The new concept is certainly richer than a simple dose notion. However, given the current state of knowledge about radiation effects, an intermediate parameter using only physical concepts and the maximum permissible doses to the organs seems more appropriate. The model, which is a generalization of the critical organ concept and shall be extended in the future to take biological effects into account, will be presented. [fr]

  12. Patient specific dynamic geometric models from sequential volumetric time series image data.

    Science.gov (United States)

    Cameron, B M; Robb, R A

    2004-01-01

    Generating patient specific dynamic models is complicated by the complexity of the motion intrinsic and extrinsic to the anatomic structures being modeled. Using a physics-based sequentially deforming algorithm, an anatomically accurate dynamic four-dimensional model can be created from a sequence of 3-D volumetric time series data sets. While such algorithms may accurately track the cyclic non-linear motion of the heart, they generally fail to accurately track extrinsic structural and non-cyclic motion. To accurately model these motions, we have modified a physics-based deformation algorithm to use a meta-surface defining the temporal and spatial maxima of the anatomic structure as the base reference surface. A mass-spring physics-based deformable model, which can expand or shrink with the local intrinsic motion, is applied to the metasurface, deforming this base reference surface to the volumetric data at each time point. As the meta-surface encompasses the temporal maxima of the structure, any extrinsic motion is inherently encoded into the base reference surface and allows the computation of the time point surfaces to be performed in parallel. The resultant 4-D model can be interactively transformed and viewed from different angles, showing the spatial and temporal motion of the anatomic structure. Using texture maps and per-vertex coloring, additional data such as physiological and/or biomechanical variables (e.g., mapping electrical activation sequences onto contracting myocardial surfaces) can be associated with the dynamic model, producing a 5-D model. For acquisition systems that may capture only limited time series data (e.g., only images at end-diastole/end-systole or inhalation/exhalation), this algorithm can provide useful interpolated surfaces between the time points. Such models help minimize the number of time points required to usefully depict the motion of anatomic structures for quantitative assessment of regional dynamics.

  13. [Survival strategy of photosynthetic organisms. 1. Variability of the extent of light-harvesting pigment aggregation as a structural factor optimizing the function of oligomeric photosynthetic antenna. Model calculations].

    Science.gov (United States)

    Fetisova, Z G

    2004-01-01

    In accordance with our concept of rigorous optimization of the photosynthetic machinery by a functional criterion, this series of papers continues a purposeful search in natural photosynthetic units (PSU) for the basic principles of their organization that we predicted theoretically for optimal model light-harvesting systems. This approach allowed us to determine the basic principles for the organization of a PSU of any fixed size. This series of papers deals with the problem of structural optimization of a light-harvesting antenna of variable size, controlled in vivo by the light intensity during the growth of organisms; this accentuates the problem of antenna structure optimization, because the optimization requirements become more stringent as the PSU increases in size. In this work, using mathematical modeling of the functioning of natural PSUs, we have shown that the aggregation of pigments of a model light-harvesting antenna, being one of the universal optimizing factors, also makes it possible to control the antenna efficiency if the extent of pigment aggregation is a variable parameter. In this case, the efficiency of the antenna increases with the size of the elementary antenna aggregate, thus ensuring the high efficiency of the PSU irrespective of its size; i.e., variation in the extent of pigment aggregation controlled by the size of the light-harvesting antenna is biologically expedient.

  14. using stereochemistry models in teaching organic compounds

    African Journals Online (AJOL)

    The purpose of the study was to find out the effect of stereochemistry models on students' ... consistent with the names given to organic compounds. Some of ... Considering class level, what is the performance of the students in naming organic.

  15. Forecasting Cryptocurrencies Financial Time Series

    OpenAIRE

    Catania, Leopoldo; Grassi, Stefano; Ravazzolo, Francesco

    2018-01-01

    This paper studies the predictability of cryptocurrencies time series. We compare several alternative univariate and multivariate models in point and density forecasting of four of the most capitalized series: Bitcoin, Litecoin, Ripple and Ethereum. We apply a set of crypto–predictors and rely on Dynamic Model Averaging to combine a large set of univariate Dynamic Linear Models and several multivariate Vector Autoregressive models with different forms of time variation. We find statistical si...

  16. An accurate description of Aspergillus niger organic acid batch fermentation through dynamic metabolic modelling.

    Science.gov (United States)

    Upton, Daniel J; McQueen-Mason, Simon J; Wood, A Jamie

    2017-01-01

    Aspergillus niger fermentation has provided the chief source of industrial citric acid for over 50 years. Traditional strain development of this organism was achieved through random mutagenesis, but advances in genomics have enabled the development of genome-scale metabolic modelling that can be used to make predictive improvements in fermentation performance. The parent citric acid-producing strain of A. niger, ATCC 1015, has been described previously by a genome-scale metabolic model that encapsulates its response to ambient pH. Here, we report the development of a novel double optimisation modelling approach that generates time-dependent citric acid fermentation using dynamic flux balance analysis. The output from this model shows a good match with empirical fermentation data. Our studies suggest that citric acid production commences upon a switch to phosphate-limited growth and this is validated by fitting to empirical data, which confirms the diauxic growth behaviour and the role of phosphate storage as polyphosphate. The calibrated time-course model reflects observed metabolic events and generates reliable in silico data for industrially relevant fermentative time series, and for the behaviour of engineered strains, suggesting that our approach can be used as a powerful tool for predictive metabolic engineering.

  17. A COMPARATIVE STUDY OF SIMULATION AND TIME SERIES MODEL IN QUANTIFYING BULLWHIP EFFECT IN SUPPLY CHAIN

    Directory of Open Access Journals (Sweden)

    T. V. O. Fabson

    2011-11-01

    Full Text Available The bullwhip (or whiplash) effect is an observed phenomenon in forecast-driven distribution channels, and careful management of these effects is of great importance to managers of supply chains. The bullwhip effect refers to situations where orders to the suppliers tend to have larger variance than sales to the buyer (demand distortion), and the distortion increases as we move up the supply chain. Because customer demand for a product is unstable, business managers must forecast in order to properly position inventory and other resources. Forecasts are statistically based and, in most cases, are not very accurate. The existence of forecast errors makes it necessary for organizations to often carry an inventory buffer called "safety stock". Moving up the supply chain from the end-user customers to the raw materials supplier, there is a lot of variation in demand that can be observed, which calls for a greater need for safety stock. This study compares the efficacy of simulation and time series models in quantifying the bullwhip effects in supply chain management.

  18. Detection of perturbation phases and developmental stages in organisms from DNA microarray time series data.

    Directory of Open Access Journals (Sweden)

    Marianne Rooman

    Full Text Available Available DNA microarray time series that record gene expression along the developmental stages of multicellular eukaryotes, or in unicellular organisms subject to external perturbations such as stress and diauxie, are analyzed. By pairwise comparison of the gene expression profiles on the basis of a translation-invariant and scale-invariant distance measure corresponding to least-rectangle regression, it is shown that peaks in the average distance values are noticeable and are localized around specific time points. These points systematically coincide with the transition points between developmental phases or just follow the external perturbations. This approach can thus be used to identify automatically, from microarray time series alone, the presence of external perturbations or the succession of developmental stages in arbitrary cell systems. Moreover, our results show that there is a striking similarity between the gene expression responses to these a priori very different phenomena. In contrast, the cell cycle does not involve a perturbation-like phase, but rather continuous gene expression remodeling. Similar analyses were conducted using three other standard distance measures, showing that the one we introduced was superior. Based on these findings, we set up an adapted clustering method that uses this distance measure and classifies the genes on the basis of their expression profiles within each developmental stage or between perturbation phases.

  19. Decoupling of modeling and measuring interval in groundwater time series analysis based on response characteristics

    NARCIS (Netherlands)

    Berendrecht, W.L.; Heemink, A.W.; Geer, F.C. van; Gehrels, J.C.

    2003-01-01

    A state-space representation of the transfer function-noise (TFN) model allows the choice of a modeling (input) interval that is smaller than the measuring interval of the output variable. Since in geohydrological applications the interval of the available input series (precipitation excess) is

  20. OBJECT ORIENTED MODELLING, A MODELLING METHOD OF AN ECONOMIC ORGANIZATION ACTIVITY

    Directory of Open Access Journals (Sweden)

    TĂNĂSESCU ANA

    2014-05-01

    Full Text Available Today, most economic organizations use different types of information systems to facilitate their activity. There are different methodologies, methods and techniques that can be used to design information systems. In this paper, I present the advantages of using object oriented modelling in the information system design of an economic organization. Thus, I have modelled the activity of a photo studio, using Visual Paradigm for UML as a modelling tool. For this purpose, I have identified the use cases for the analyzed system and presented the use case diagram. I have also carried out the static and dynamic modelling of the system, using the best-known UML diagrams.

  1. Modeling the surface tension of complex, reactive organic-inorganic mixtures

    Science.gov (United States)

    Schwier, A. N.; Viglione, G. A.; Li, Z.; McNeill, V. Faye

    2013-11-01

    Atmospheric aerosols can contain thousands of organic compounds which impact aerosol surface tension, affecting aerosol properties such as heterogeneous reactivity, ice nucleation, and cloud droplet formation. We present new experimental data for the surface tension of complex, reactive organic-inorganic aqueous mixtures mimicking tropospheric aerosols. Each solution contained 2-6 organic compounds, including methylglyoxal, glyoxal, formaldehyde, acetaldehyde, oxalic acid, succinic acid, leucine, alanine, glycine, and serine, with and without ammonium sulfate. We test two semi-empirical surface tension models and find that most reactive, complex, aqueous organic mixtures which do not contain salt are well described by a weighted Szyszkowski-Langmuir (S-L) model which was first presented by Henning et al. (2005). Two approaches for modeling the effects of salt were tested: (1) the Tuckermann approach (an extension of the Henning model with an additional explicit salt term), and (2) a new implicit method proposed here which employs experimental surface tension data obtained for each organic species in the presence of salt used with the Henning model. We recommend the use of method (2) for surface tension modeling of aerosol systems because the Henning model (using data obtained from organic-inorganic systems) and Tuckermann approach provide similar modeling results and goodness-of-fit (χ2) values, yet the Henning model is a simpler and more physical approach to modeling the effects of salt, requiring fewer empirically determined parameters.
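
    As a rough illustration of the kind of calculation involved, the sketch below evaluates a composition-weighted Szyszkowski-Langmuir depression of the water surface tension in Python. The coefficient values, species list, and weighting scheme are hypothetical placeholders, not the fitted parameters or the exact weighted S-L form of Henning et al. (2005) or of this study.

```python
import numpy as np

# Hypothetical Szyszkowski-Langmuir coefficients (a_i, b_i) per organic species;
# real values would come from fits to single-component surface tension data.
COEFFS = {
    "methylglyoxal": (0.0185, 2.1),
    "oxalic_acid":   (0.0021, 5.7),
    "glycine":       (0.0009, 1.3),
}

SIGMA_WATER = 72.0   # mN/m, surface tension of pure water near 298 K
T = 298.15           # K

def weighted_szyszkowski(conc):
    """Surface tension (mN/m) of a mixture from a composition-weighted S-L depression.

    conc: dict mapping species name -> concentration (mol/kg water).
    """
    total = sum(conc.values())
    sigma = SIGMA_WATER
    for name, c in conc.items():
        a, b = COEFFS[name]
        weight = c / total                      # composition weighting
        sigma -= weight * a * T * np.log1p(b * c)
    return sigma

print(weighted_szyszkowski({"methylglyoxal": 0.5, "oxalic_acid": 0.2, "glycine": 0.1}))
```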

  2. Model for the respiratory modulation of the heart beat-to-beat time interval series

    Science.gov (United States)

    Capurro, Alberto; Diambra, Luis; Malta, C. P.

    2005-09-01

    In this study we present a model for the respiratory modulation of the heart beat-to-beat interval series. The model consists of a set of differential equations used to simulate the membrane potential of a single rabbit sinoatrial node cell, excited with a periodic input signal with added correlated noise. This signal, which simulates the input from the autonomic nervous system to the sinoatrial node, was included in the pacemaker equations as a modulation of the iNaK current pump and the potassium current iK. We focus on modeling the heart beat-to-beat time interval series from normal subjects during meditation using the Kundalini Yoga and Chi techniques. The analysis of the experimental data indicates that while the embedding of pre-meditation and control cases has a roughly circular shape, it acquires a polygonal shape during meditation, triangular for the Kundalini Yoga data and quadrangular in the case of Chi data. The model was used to assess the waveshape of the respiratory signals needed to reproduce the trajectory of the experimental data in the phase space. The embedding of the Chi data could be reproduced using a periodic signal obtained by smoothing a square wave. In the case of Kundalini Yoga data, the embedding was reproduced with a periodic signal obtained by smoothing a triangular wave having a rising branch of longer duration than the decreasing branch. Our study provides an estimation of the respiratory signal using only the heart beat-to-beat time interval series.

  3. Model for the heart beat-to-beat time series during meditation

    Science.gov (United States)

    Capurro, A.; Diambra, L.; Malta, C. P.

    2003-09-01

    We present a model for the respiratory modulation of the heart beat-to-beat interval series. The model consists of a pacemaker, that simulates the membrane potential of the sinoatrial node, modulated by a periodic input signal plus correlated noise that simulates the respiratory input. The model was used to assess the waveshape of the respiratory signals needed to reproduce in the phase space the trajectory of experimental heart beat-to-beat interval data. The data sets were recorded during meditation practices of the Chi and Kundalini Yoga techniques. Our study indicates that in the first case the respiratory signal has the shape of a smoothed square wave, and in the second case it has the shape of a smoothed triangular wave.

  4. Estimating and Analyzing Savannah Phenology with a Lagged Time Series Model

    DEFF Research Database (Denmark)

    Boke-Olen, Niklas; Lehsten, Veiko; Ardo, Jonas

    2016-01-01

    cycle due to their areal coverage and can have an effect on the food security in regions that depend on subsistence farming. In this study we investigate how soil moisture, mean annual precipitation, and day length control savannah phenology by developing a lagged time series model. The model uses...... climate data for 15 flux tower sites across four continents, and normalized difference vegetation index from satellite to optimize a statistical phenological model. We show that all three variables can be used to estimate savannah phenology on a global scale. However, it was not possible to create...... a simplified savannah model that works equally well for all sites on the global scale without inclusion of more site specific parameters. The simplified model showed no bias towards tree cover or between continents and resulted in a cross-validated r2 of 0.6 and root mean squared error of 0.1. We therefore...

  5. The string prediction models as an invariants of time series in forex market

    OpenAIRE

    Richard Pincak; Marian Repasan

    2011-01-01

    In this paper we apply a new approach of the string theory to the real financial market. It is direct extension and application of the work [1] into prediction of prices. The models are constructed with an idea of prediction models based on the string invariants (PMBSI). The performance of PMBSI is compared to support vector machines (SVM) and artificial neural networks (ANN) on an artificial and a financial time series. Brief overview of the results and analysis is given. The first model is ...

   6. ShapeSelectForest: a new R package for modeling Landsat time series

    Science.gov (United States)

    Mary Meyer; Xiyue Liao; Gretchen Moisen; Elizabeth Freeman

    2015-01-01

    We present a new R package called ShapeSelectForest recently posted to the Comprehensive R Archive Network. The package was developed to fit nonparametric shape-restricted regression splines to time series of Landsat imagery for the purpose of modeling, mapping, and monitoring annual forest disturbance dynamics over nearly three decades. For each pixel and spectral...

  7. Emergent organization in a model market

    Science.gov (United States)

    Yadav, Avinash Chand; Manchanda, Kaustubh; Ramaswamy, Ramakrishna

    2017-09-01

    We study the collective behaviour of interacting agents in a simple model of market economics that was originally introduced by Nørrelykke and Bak. A general theoretical framework for interacting traders on an arbitrary network is presented, with the interaction consisting of buying (namely consumption) and selling (namely production) of commodities. Extremal dynamics is introduced by having the agent with the least profit in the market readjust prices, causing the market to self-organize. In addition to examining this model market on regular lattices in two dimensions, we also study the cases of random complex networks both with and without community structures. Fluctuations in an activity signal exhibit properties that are characteristic of avalanches observed in models of self-organized criticality, and these can be described by power-law distributions when the system is in the critical state.

  8. Modelling the behaviour of uranium-series radionuclides in soils and plants taking into account seasonal variations in soil hydrology.

    Science.gov (United States)

    Pérez-Sánchez, D; Thorne, M C

    2014-05-01

    In a previous paper, a mathematical model for the behaviour of (79)Se in soils and plants was described. Subsequently, a review has been published relating to the behaviour of (238)U-series radionuclides in soils and plants. Here, we bring together those two strands of work to describe a new mathematical model of the behaviour of (238)U-series radionuclides entering soils in solution and their uptake by plants. Initial studies with the model that are reported here demonstrate that it is a powerful tool for exploring the behaviour of this decay chain or subcomponents of it in soil-plant systems under different hydrological regimes. In particular, it permits studies of the degree to which secular equilibrium assumptions are appropriate when modelling this decay chain. Further studies will be undertaken and reported separately examining sensitivities of model results to input parameter values and also applying the model to sites contaminated with (238)U-series radionuclides. Copyright © 2013 Elsevier Ltd. All rights reserved.

  9. Organic photovoltaic cells utilizing ultrathin sensitizing layer

    Science.gov (United States)

    Rand, Barry P [Princeton, NJ; Forrest, Stephen R [Princeton, NJ

    2011-05-24

    A photosensitive device includes a series of organic photoactive layers disposed between two electrodes. Each layer in the series is in direct contact with a next layer in the series. The series is arranged to form at least one donor-acceptor heterojunction, and includes a first organic photoactive layer comprising a first host material serving as a donor, a thin second organic photoactive layer comprising a second host material disposed between the first and a third organic photoactive layer, and the third organic photoactive layer comprising a third host material serving as an acceptor. The first, second, and third host materials are different. The thin second layer serves as an acceptor relative to the first layer or as a donor relative to the third layer.

  10. π-π Interactions and magnetic properties in a series of hybrid inorganic-organic crystals

    Energy Technology Data Exchange (ETDEWEB)

    Gonzalez, M.; Lemus-Santana, A.A. [Centro de Investigacion en Ciencia Aplicada y Tecnologia Avanzada, Unidad Legaria, Instituto Politecnico Nacional, Mexico, D. F. (Mexico); Rodriguez-Hernandez, J. [Centro de Investigacion en Ciencia Aplicada y Tecnologia Avanzada, Unidad Legaria, Instituto Politecnico Nacional, Mexico, D. F. (Mexico); Instituto de Ciencia y Tecnologia de Materiales, Universidad de La Habana (Cuba); Knobel, M. [Instituto de Fisica 'Gleb Wataghin', Universidade Estadual de Campinas, SP (Brazil); Reguera, E., E-mail: edilso.reguera@gmail.com [Centro de Investigacion en Ciencia Aplicada y Tecnologia Avanzada, Unidad Legaria, Instituto Politecnico Nacional, Mexico, D. F. (Mexico)

    2013-01-15

    The series of hybrid inorganic-organic solids T(Im)₂[Ni(CN)₄] with T = Fe, Co, Ni and Im = imidazole was prepared by soft chemical routes from aqueous solutions of the involved building units: imidazole, the T²⁺ metal and the [Ni(CN)₄]²⁻ anionic block. The obtained samples were characterized by infrared and UV-vis spectroscopies and by thermogravimetric, X-ray diffraction and magnetic measurements. Anhydrous solids were obtained which crystallize with a monoclinic unit cell, in the I2/a space group, with four formula units per cell (Z = 4). Their crystal structure was solved ab initio from the recorded X-ray powder patterns and then refined by the Rietveld method. The metal T is found in octahedral coordination to four N ends of CN groups and two imidazole molecules, while the inner Ni atom preserves its planar coordination. The system of layers remains stacked in an ordered 3D structure through dipole-dipole and π-π interactions between imidazole rings from neighboring layers. In this way, a pillared structure is achieved without requiring the coordination of both nitrogen atoms of the imidazole ring. The recorded magnetic data indicate the occurrence of a predominant ferromagnetic interaction at low temperature for Co and Ni but not for Fe. Such magnetic ordering is more favorable for Ni, with a transition temperature of 14.67 K, which was ascribed to the relatively high polarizing power of this metal. Among the considered T metals, nickel has the highest electron-withdrawing ability, which leads to a greater overlap of the metal-ligand electron clouds and to a stronger π-π attractive interaction, two factors that result in a higher magnetic ordering temperature. Graphical Abstract: Magnetic ordering through the π-π interaction between the imidazole rings. Highlights: Hybrid inorganic-organic solids. Hybrid inorganic-organic molecular based

  11. An Ising model for metal-organic frameworks

    Science.gov (United States)

    Höft, Nicolas; Horbach, Jürgen; Martín-Mayor, Victor; Seoane, Beatriz

    2017-08-01

    We present a three-dimensional Ising model where lines of equal spins are frozen such that they form an ordered framework structure. The frame spins impose an external field on the rest of the spins (active spins). We demonstrate that this "porous Ising model" can be seen as a minimal model for condensation transitions of gas molecules in metal-organic frameworks. Using Monte Carlo simulation techniques, we compare the phase behavior of a porous Ising model with that of a particle-based model for the condensation of methane (CH4) in the isoreticular metal-organic framework IRMOF-16. For both models, we find a line of first-order phase transitions that end in a critical point. We show that the critical behavior in both cases belongs to the 3D Ising universality class, in contrast to other phase transitions in confinement such as capillary condensation.
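
    A toy version of such a "porous Ising model" can be written in a few lines. The Python sketch below freezes lines of +1 spins as the framework and applies Metropolis updates to the remaining active spins; lattice size, framework geometry, coupling (J = 1), field, and temperature are arbitrary illustrative choices, not the parameters used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
L, beta, h, steps = 16, 0.25, 0.0, 200_000

spins = rng.choice([-1, 1], size=(L, L, L))
frozen = np.zeros((L, L, L), dtype=bool)
frozen[::4, ::4, :] = True        # frozen "framework" lines along z, every 4 sites
spins[frozen] = 1                 # framework spins fixed at +1

def local_field(s, x, y, z):
    # Periodic-boundary sum over the six nearest neighbours.
    return (s[(x + 1) % L, y, z] + s[(x - 1) % L, y, z] +
            s[x, (y + 1) % L, z] + s[x, (y - 1) % L, z] +
            s[x, y, (z + 1) % L] + s[x, y, (z - 1) % L])

for _ in range(steps):
    x, y, z = rng.integers(0, L, size=3)
    if frozen[x, y, z]:
        continue                                  # framework spins never flip
    dE = 2 * spins[x, y, z] * (local_field(spins, x, y, z) + h)
    if dE <= 0 or rng.random() < np.exp(-beta * dE):
        spins[x, y, z] *= -1                      # Metropolis acceptance

print("magnetisation of active spins:", spins[~frozen].mean())
```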

  12. A Time-Series Water Level Forecasting Model Based on Imputation and Variable Selection Method

    Directory of Open Access Journals (Sweden)

    Jun-He Yang

    2017-01-01

    Full Text Available Reservoirs are important for households and impact the national economy. This paper proposes a time-series forecasting model, based on estimating missing values followed by variable selection, to forecast the reservoir's water level. This study collected data from the Taiwan Shimen Reservoir as well as daily atmospheric data from 2008 to 2015. The two datasets are concatenated, in time order, into an integrated research dataset. The proposed time-series forecasting model has three main steps. First, this study uses five imputation methods to fill in the missing values instead of directly deleting them. Second, we identified the key variables via factor analysis and then deleted the unimportant variables sequentially via the variable selection method. Finally, the proposed model uses a Random Forest to build the forecasting model of the reservoir's water level, which is compared with the listed benchmark methods in terms of forecasting error. The experimental results indicate that the Random Forest forecasting model, when combined with variable selection over the full set of variables, has better forecasting performance than the benchmark models. In addition, the experiments show that the proposed variable selection can help the five forecast methods used here to improve their forecasting capability.
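
    A schematic of the imputation, variable selection, and Random Forest steps, written with scikit-learn, might look like the sketch below. The synthetic DataFrame, the single median imputer, and the importance-based selector are stand-ins for the paper's five imputation methods and its factor-analysis-based selection; column names and split sizes are invented for illustration.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.feature_selection import SelectFromModel
from sklearn.impute import SimpleImputer
from sklearn.pipeline import make_pipeline

# Hypothetical daily dataset: water level plus atmospheric covariates, with gaps.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "level":    rng.random(1000),
    "rain":     rng.random(1000),
    "temp":     rng.random(1000),
    "humidity": rng.random(1000),
})
df.loc[df.sample(frac=0.05, random_state=0).index, "rain"] = np.nan   # missing values

X = df.drop(columns="level").iloc[:-1]          # today's covariates ...
y = df["level"].shift(-1).iloc[:-1]             # ... predict tomorrow's level

model = make_pipeline(
    SimpleImputer(strategy="median"),                                          # imputation
    SelectFromModel(RandomForestRegressor(n_estimators=200, random_state=0)),  # variable selection
    RandomForestRegressor(n_estimators=500, random_state=0),                   # forecaster
)
model.fit(X[:800], y[:800])
print("held-out R^2:", model.score(X[800:], y[800:]))
```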

  13. An Instructional Development Model for Global Organizations: The GOaL Model.

    Science.gov (United States)

    Hara, Noriko; Schwen, Thomas M.

    1999-01-01

    Presents an instructional development model, GOaL (Global Organization Localization), for use by global organizations. Topics include gaps in language, culture, and needs; decentralized processes; collaborative efforts; predetermined content; multiple perspectives; needs negotiation; learning within context; just-in-time training; and bilingual…

  14. Work-related accidents among the Iranian population: a time series analysis, 2000-2011.

    Science.gov (United States)

    Karimlou, Masoud; Salehi, Masoud; Imani, Mehdi; Hosseini, Agha-Fatemeh; Dehnad, Afsaneh; Vahabi, Nasim; Bakhtiyari, Mahmood

    2015-01-01

    Work-related accidents result in human suffering and economic losses and are considered a major health problem worldwide, especially in the economically developing world. The aim was to introduce seasonal autoregressive integrated moving average (ARIMA) models for time series analysis of work-related accident data for workers insured by the Iranian Social Security Organization (ISSO) between 2000 and 2011. In this retrospective study, all insured people experiencing at least one work-related accident during the 10-year period were included in the analyses. We used Box-Jenkins modeling to develop a time series model of the total number of accidents. There was an average of 1476 accidents per month (1476.05 ± 458.77, mean ± SD). The final ARIMA(p,d,q)(P,D,Q)s model fitted to the data was ARIMA(1,1,1)×(0,1,1)12, consisting of first-order autoregressive, moving average and seasonal moving average parameters, with a mean absolute percentage error (MAPE) of 20.942. The final model showed that time series analysis with ARIMA models is useful for forecasting the number of work-related accidents in Iran. In addition, the forecasted number of work-related accidents for 2011 reflected the stability of occurrence of these accidents in recent years, indicating a need for preventive occupational health and safety policies such as safety inspection.
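
    For reference, the reported ARIMA(1,1,1)×(0,1,1)12 specification can be fitted in Python with statsmodels roughly as follows; the synthetic monthly series is only a placeholder for the ISSO accident counts, which are not reproduced here.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Placeholder monthly accident counts with roughly the reported mean and spread.
rng = np.random.default_rng(0)
idx = pd.date_range("2000-01", periods=132, freq="MS")
counts = pd.Series(1476 + 459 * rng.standard_normal(132), index=idx).clip(lower=0)

# ARIMA(1,1,1)x(0,1,1)_12, the order reported in the abstract.
model = SARIMAX(counts, order=(1, 1, 1), seasonal_order=(0, 1, 1, 12))
res = model.fit(disp=False)

forecast = res.forecast(steps=12)      # one-year-ahead forecast
print(res.summary().tables[1])
print(forecast.round(0))
```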

  15. Predictive time-series modeling using artificial neural networks for Linac beam symmetry: an empirical study.

    Science.gov (United States)

    Li, Qiongge; Chan, Maria F

    2017-01-01

    Over half of cancer patients receive radiotherapy (RT) as partial or full cancer treatment. Daily quality assurance (QA) of RT in cancer treatment closely monitors the performance of the medical linear accelerator (Linac) and is critical for continuous improvement of patient safety and quality of care. Cumulative longitudinal QA measurements are valuable for understanding the behavior of the Linac and allow physicists to identify trends in the output and take preventive actions. In this study, artificial neural networks (ANNs) and autoregressive moving average (ARMA) time-series prediction modeling techniques were both applied to 5-year daily Linac QA data. Verification tests and other evaluations were then performed for all models. Preliminary results showed that ANN time-series predictive modeling has more advantages over ARMA techniques for accurate and effective applicability in the dosimetry and QA field. © 2016 New York Academy of Sciences.

  16. Sensor response monitoring in pressurized water reactors using time series modeling

    International Nuclear Information System (INIS)

    Upadhyaya, B.R.; Kerlin, T.W.

    1978-01-01

    Random data analysis in nuclear power reactors for purposes of process surveillance, pattern recognition and monitoring of temperature, pressure, flow and neutron sensors has gained increasing attention in view of its potential for helping to ensure safe plant operation. In this study, the application of autoregressive moving-average (ARMA) time series modeling for monitoring temperature sensor response characteristics is presented. The ARMA model is used to estimate the step and ramp response of the sensors and the related time constant and ramp delay time. The ARMA parameters are estimated by a two-stage algorithm in the spectral domain. Results of sensor testing for an operating pressurized water reactor are presented. 16 refs
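
    The idea of recovering a sensor's step response and time constant from a fitted ARMA model can be sketched as follows. This fragment uses a plain maximum-likelihood ARMA fit and a cumulative impulse response, not the two-stage spectral-domain algorithm of the study, and the white-noise record is only a placeholder for real sensor fluctuation data.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.arima_process import arma2ma

# Placeholder zero-mean fluctuation record from a temperature sensor.
rng = np.random.default_rng(0)
y = rng.standard_normal(5000)

res = ARIMA(y, order=(2, 0, 1), trend="n").fit()   # ARMA(2,1) fit

# Impulse response from the fitted ARMA polynomials, cumulated into a step response.
ar = np.r_[1, -res.arparams]
ma = np.r_[1, res.maparams]
impulse = arma2ma(ar, ma, lags=200)
step = np.cumsum(impulse)
step /= step[-1]                                   # normalise to a unit final value

# Crude time-constant estimate: first lag where the step response reaches 63.2%.
tau = int(np.argmax(step >= 0.632))
print("estimated time constant (in sampling intervals):", tau)
```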

  17. Application of semi-parametric modelling to time series forecasting: case of the electricity consumption; Modeles semi-parametriques appliques a la prevision des series temporelles. Cas de la consommation d'electricite

    Energy Technology Data Exchange (ETDEWEB)

    Lefieux, V

    2007-10-15

    Reseau de Transport d'Electricite (RTE), in charge of operating the French electric transportation grid, needs an accurate forecast of the power consumption in order to operate it correctly. The forecasts used every day result from a model combining a nonlinear parametric regression and a SARIMA model. In order to obtain an adaptive forecasting model, nonparametric forecasting methods have already been tested without real success. In particular, it is known that a nonparametric predictor behaves badly with a great number of explanatory variables, which is commonly called the curse of dimensionality. Recently, semi-parametric methods, which improve on the pure nonparametric approach, have been proposed to estimate a regression function. Based on the concept of 'dimension reduction', one of those methods (called MAVE: Moving Average -conditional- Variance Estimate) can be applied to time series. We study empirically its effectiveness in predicting the future values of an autoregressive time series. We then adapt this method, from a practical point of view, to forecast power consumption. We propose a partially linear semi-parametric model, based on the MAVE method, which allows the autoregressive aspect of the problem and the exogenous variables to be taken into account simultaneously. The proposed estimation procedure is practically efficient. (author)

  18. Forecasting of time series with trend and seasonal cycle using the airline model and artificial neural networks; Pronóstico de series de tiempo con tendencia y ciclo estacional usando el modelo airline y redes neuronales artificiales

    Directory of Open Access Journals (Sweden)

    J D Velásquez

    2012-06-01

    Full Text Available Many time series with trend and seasonal pattern are successfully modeled and forecasted by the airline model of Box and Jenkins; however, this model neglects the presence of nonlinearity in the data. In this paper, we propose a new nonlinear version of the airline model; for this, we replace the linear moving average component by a multilayer perceptron neural network. The proposed model is used for forecasting two benchmark time series; we found that the proposed model is able to forecast the time series with more accuracy than other traditional approaches.

  19. PENDISC: a simple method for constructing a mathematical model from time-series data of metabolite concentrations.

    Science.gov (United States)

    Sriyudthsak, Kansuporn; Iwata, Michio; Hirai, Masami Yokota; Shiraishi, Fumihide

    2014-06-01

    The availability of large-scale datasets has led to more effort being made to understand characteristics of metabolic reaction networks. However, because the large-scale data are semi-quantitative, and may contain biological variations and/or analytical errors, it remains a challenge to construct a mathematical model with precise parameters using only these data. The present work proposes a simple method, referred to as PENDISC (Parameter Estimation in a Non-DImensionalized S-system with Constraints), to assist the complex process of parameter estimation in the construction of a mathematical model for a given metabolic reaction system. The PENDISC method was evaluated using two simple mathematical models: a linear metabolic pathway model with inhibition and a branched metabolic pathway model with inhibition and activation. The results indicate that a smaller number of data points and rate constant parameters enhances the agreement between calculated values and time-series data of metabolite concentrations, and leads to faster convergence when the same initial estimates are used for the fitting. This method is also shown to be applicable to noisy time-series data and to unmeasurable metabolite concentrations in a network, and to have the potential to handle metabolome data of a relatively large-scale metabolic reaction system. Furthermore, it was applied to aspartate-derived amino acid biosynthesis in the Arabidopsis thaliana plant. The result provides confirmation that the mathematical model constructed satisfactorily agrees with the time-series datasets of seven metabolite concentrations.

  20. Time series segmentation: a new approach based on Genetic Algorithm and Hidden Markov Model

    Science.gov (United States)

    Toreti, A.; Kuglitsch, F. G.; Xoplaki, E.; Luterbacher, J.

    2009-04-01

    The subdivision of a time series into homogeneous segments has been performed using various methods applied in different disciplines. In climatology, for example, it is accompanied by the well-known homogenization problem and the detection of artificial change points. In this context, we present a new method (GAMM) based on a Hidden Markov Model (HMM) and a Genetic Algorithm (GA), applicable to series of independent observations (and easily adaptable to autoregressive processes). A left-to-right hidden Markov model is applied, estimating the parameters and the best-state sequence with the Baum-Welch and Viterbi algorithms, respectively. In order to avoid the well-known dependence of the Baum-Welch algorithm on the initial condition, a Genetic Algorithm was developed. This algorithm is characterized by mutation, elitism and a crossover procedure implemented with some restrictive rules. Moreover, the function to be minimized was derived following the approach of Kehagias (2004), i.e. it is the so-called complete log-likelihood. The number of states was determined by applying a two-fold cross-validation procedure (Celeux and Durand, 2008). Since this last issue is complex and influences the whole analysis, a Multi-Response Permutation Procedure (MRPP; Mielke et al., 1981) was added: it tests the model with K+1 states (where K is the number of states of the best model) whenever its likelihood is close to that of the K-state model. Finally, an evaluation of the GAMM performance, applied as a break detection method in the field of climate time series homogenization, is shown. 1. G. Celeux and J.B. Durand, Comput Stat 2008. 2. A. Kehagias, Stoch Envir Res 2004. 3. P.W. Mielke, K.J. Berry, G.W. Brier, Monthly Wea Rev 1981.
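
    A much simpler stand-in for the segmentation step, using a plain ergodic Gaussian HMM from the hmmlearn package (a single Baum-Welch run plus Viterbi decoding), is sketched below; the left-to-right topology, genetic-algorithm initialisation, complete log-likelihood criterion, and cross-validated choice of the number of states used by GAMM are not reproduced here.

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM

# Placeholder series: three regimes with different means, stacked into one record.
rng = np.random.default_rng(1)
series = np.concatenate([rng.normal(0, 1, 300),
                         rng.normal(3, 1, 300),
                         rng.normal(-2, 1, 300)])
X = series.reshape(-1, 1)                      # hmmlearn expects (n_samples, n_features)

# Baum-Welch estimation of a 3-state Gaussian HMM, then Viterbi decoding.
model = GaussianHMM(n_components=3, covariance_type="diag",
                    n_iter=200, random_state=0)
model.fit(X)
states = model.predict(X)                      # most likely state sequence

# Candidate change points are the indices where the decoded state switches.
change_points = np.flatnonzero(np.diff(states)) + 1
print("detected change points:", change_points)
```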

  1. Authentication in Virtual Organizations: A Reputation Based PKI Interconnection Model

    Science.gov (United States)

    Wazan, Ahmad Samer; Laborde, Romain; Barrere, Francois; Benzekri, Abdelmalek

    The authentication mechanism constitutes a central part of virtual organization work. PKI technology is used to provide authentication in each organization involved in the virtual organization. Different trust models have been proposed to interconnect the different PKIs in order to propagate trust between them. Since the existing trust models have many drawbacks, we propose a new trust model based on the reputation of PKIs.

  2. Binary versus non-binary information in real time series: empirical results and maximum-entropy matrix models

    Science.gov (United States)

    Almog, Assaf; Garlaschelli, Diego

    2014-09-01

    The dynamics of complex systems, from financial markets to the brain, can be monitored in terms of multiple time series of activity of the constituent units, such as stocks or neurons, respectively. While the main focus of time series analysis is on the magnitude of temporal increments, a significant piece of information is encoded into the binary projection (i.e. the sign) of such increments. In this paper we provide further evidence of this by showing strong nonlinear relations between binary and non-binary properties of financial time series. These relations are a novel quantification of the fact that extreme price increments occur more often when most stocks move in the same direction. We then introduce an information-theoretic approach to the analysis of the binary signature of single and multiple time series. Through the definition of maximum-entropy ensembles of binary matrices and their mapping to spin models in statistical physics, we quantify the information encoded into the simplest binary properties of real time series and identify the most informative property given a set of measurements. Our formalism is able to accurately replicate, and mathematically characterize, the observed binary/non-binary relations. We also obtain a phase diagram allowing us to identify, based only on the instantaneous aggregate return of a set of multiple time series, a regime where the so-called ‘market mode’ has an optimal interpretation in terms of collective (endogenous) effects, a regime where it is parsimoniously explained by pure noise, and a regime where it can be regarded as a combination of endogenous and exogenous factors. Our approach allows us to connect spin models, simple stochastic processes, and ensembles of time series inferred from partial information.

  3. Binary versus non-binary information in real time series: empirical results and maximum-entropy matrix models

    International Nuclear Information System (INIS)

    Almog, Assaf; Garlaschelli, Diego

    2014-01-01

    The dynamics of complex systems, from financial markets to the brain, can be monitored in terms of multiple time series of activity of the constituent units, such as stocks or neurons, respectively. While the main focus of time series analysis is on the magnitude of temporal increments, a significant piece of information is encoded into the binary projection (i.e. the sign) of such increments. In this paper we provide further evidence of this by showing strong nonlinear relations between binary and non-binary properties of financial time series. These relations are a novel quantification of the fact that extreme price increments occur more often when most stocks move in the same direction. We then introduce an information-theoretic approach to the analysis of the binary signature of single and multiple time series. Through the definition of maximum-entropy ensembles of binary matrices and their mapping to spin models in statistical physics, we quantify the information encoded into the simplest binary properties of real time series and identify the most informative property given a set of measurements. Our formalism is able to accurately replicate, and mathematically characterize, the observed binary/non-binary relations. We also obtain a phase diagram allowing us to identify, based only on the instantaneous aggregate return of a set of multiple time series, a regime where the so-called ‘market mode’ has an optimal interpretation in terms of collective (endogenous) effects, a regime where it is parsimoniously explained by pure noise, and a regime where it can be regarded as a combination of endogenous and exogenous factors. Our approach allows us to connect spin models, simple stochastic processes, and ensembles of time series inferred from partial information. (paper)

  4. Approaches in highly parameterized inversion: TSPROC, a general time-series processor to assist in model calibration and result summarization

    Science.gov (United States)

    Westenbroek, Stephen M.; Doherty, John; Walker, John F.; Kelson, Victor A.; Hunt, Randall J.; Cera, Timothy B.

    2012-01-01

    The TSPROC (Time Series PROCessor) computer software uses a simple scripting language to process and analyze time series. It was developed primarily to assist in the calibration of environmental models. The software is designed to perform calculations on time-series data commonly associated with surface-water models, including calculation of flow volumes, transformation by means of basic arithmetic operations, and generation of seasonal and annual statistics and hydrologic indices. TSPROC can also be used to generate some of the key input files required to perform parameter optimization by means of the PEST (Parameter ESTimation) computer software. Through the use of TSPROC, the objective function for use in the model-calibration process can be focused on specific components of a hydrograph.

  5. A robust interrupted time series model for analyzing complex health care intervention data

    KAUST Repository

    Cruz, Maricela

    2017-08-29

    Current health policy calls for greater use of evidence-based care delivery services to improve patient quality and safety outcomes. Care delivery is complex, with interacting and interdependent components that challenge traditional statistical analytic techniques, in particular, when modeling a time series of outcomes data that might be

  6. A robust interrupted time series model for analyzing complex health care intervention data

    KAUST Repository

    Cruz, Maricela; Bender, Miriam; Ombao, Hernando

    2017-01-01

    Current health policy calls for greater use of evidence-based care delivery services to improve patient quality and safety outcomes. Care delivery is complex, with interacting and interdependent components that challenge traditional statistical analytic techniques, in particular, when modeling a time series of outcomes data that might be

  7. A Bayesian Approach for Summarizing and Modeling Time-Series Exposure Data with Left Censoring.

    Science.gov (United States)

    Houseman, E Andres; Virji, M Abbas

    2017-08-01

    Direct reading instruments are valuable tools for measuring exposure as they provide real-time measurements for rapid decision making. However, their use is limited to general survey applications in part due to issues related to their performance. Moreover, statistical analysis of real-time data is complicated by autocorrelation among successive measurements, non-stationary time series, and the presence of left-censoring due to limit-of-detection (LOD). A Bayesian framework is proposed that accounts for non-stationary autocorrelation and LOD issues in exposure time-series data in order to model workplace factors that affect exposure and estimate summary statistics for tasks or other covariates of interest. A spline-based approach is used to model non-stationary autocorrelation with relatively few assumptions about autocorrelation structure. Left-censoring is addressed by integrating over the left tail of the distribution. The model is fit using Markov-Chain Monte Carlo within a Bayesian paradigm. The method can flexibly account for hierarchical relationships, random effects and fixed effects of covariates. The method is implemented using the rjags package in R, and is illustrated by applying it to real-time exposure data. Estimates for task means and covariates from the Bayesian model are compared to those from conventional frequentist models including linear regression, mixed-effects, and time-series models with different autocorrelation structures. Simulations studies are also conducted to evaluate method performance. Simulation studies with percent of measurements below the LOD ranging from 0 to 50% showed lowest root mean squared errors for task means and the least biased standard deviations from the Bayesian model compared to the frequentist models across all levels of LOD. In the application, task means from the Bayesian model were similar to means from the frequentist models, while the standard deviations were different. Parameter estimates for covariates

  8. Time series modelling of global mean temperature for managerial decision-making.

    Science.gov (United States)

    Romilly, Peter

    2005-07-01

    Climate change has important implications for business and economic activity. Effective management of climate change impacts will depend on the availability of accurate and cost-effective forecasts. This paper uses univariate time series techniques to model the properties of a global mean temperature dataset in order to develop a parsimonious forecasting model for managerial decision-making over the short-term horizon. Although the model is estimated on global temperature data, the methodology could also be applied to temperature data at more localised levels. The statistical techniques include seasonal and non-seasonal unit root testing with and without structural breaks, as well as ARIMA and GARCH modelling. A forecasting evaluation shows that the chosen model performs well against rival models. The estimation results confirm the findings of a number of previous studies, namely that global mean temperatures increased significantly throughout the 20th century. The use of GARCH modelling also shows the presence of volatility clustering in the temperature data, and a positive association between volatility and global mean temperature.
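
    The ARIMA-plus-GARCH combination described here can be approximated in Python with statsmodels and the arch package, roughly as below; the random-walk series is only a placeholder for the global mean temperature record, and the (1,1,1) and (1,1) orders are illustrative rather than the orders selected in the paper.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from arch import arch_model

# Placeholder "annual temperature anomaly" series: a random walk with a small drift.
rng = np.random.default_rng(0)
anoms = np.cumsum(0.01 + 0.1 * rng.standard_normal(150))

# Mean equation: ARIMA(1,1,1) on the anomaly series.
mean_fit = ARIMA(anoms, order=(1, 1, 1)).fit()
resid = mean_fit.resid

# Volatility equation: GARCH(1,1) on the ARIMA residuals (volatility clustering check).
garch_fit = arch_model(resid, mean="Zero", vol="GARCH", p=1, q=1).fit(disp="off")

print(mean_fit.summary().tables[1])
print(garch_fit.params)
```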

  9. Modelling the fate of oxidisable organic contaminants in groundwater

    DEFF Research Database (Denmark)

    Barry, D.A.; Prommer, H.; Miller, C.T.

    2002-01-01

    Subsurface contamination by organic chemicals is a pervasive environmental problem, susceptible to remediation by natural or enhanced attenuation approaches or more highly engineered methods such as pump-and-treat, amongst others. Such remediation approaches, along with risk assessment ... The modelling framework is illustrated by pertinent examples, showing the degradation of dissolved organics by microbial activity limited by the availability of nutrients or electron acceptors (i.e., changing redox states), as well as concomitant secondary reactions. Two field-scale modelling examples are discussed, the Vejen landfill (Denmark) and an example where metal contamination is remediated by redox changes wrought by injection of a dissolved organic compound. A summary is provided of current and likely future challenges to modelling of oxidisable organics in the subsurface. (C) 2002 Elsevier Science...

  10. Asymptotics for the conditional-sum-of-squares estimator in multivariate fractional time series models

    DEFF Research Database (Denmark)

    Ørregård Nielsen, Morten

    This paper proves consistency and asymptotic normality for the conditional-sum-of-squares estimator, which is equivalent to the conditional maximum likelihood estimator, in multivariate fractional time series models. The model is parametric and quite general, and, in particular, encompasses ... the multivariate non-cointegrated fractional ARIMA model. The novelty of the consistency result, in particular, is that it applies to a multivariate model and to an arbitrarily large set of admissible parameter values, for which the objective function does not converge uniformly in probability, thus making...

  11. A Computational Model of Selection by Consequences

    Science.gov (United States)

    McDowell, J. J.

    2004-01-01

    Darwinian selection by consequences was instantiated in a computational model that consisted of a repertoire of behaviors undergoing selection, reproduction, and mutation over many generations. The model in effect created a digital organism that emitted behavior continuously. The behavior of this digital organism was studied in three series of…

  12. Stochastic modeling of neurobiological time series: Power, coherence, Granger causality, and separation of evoked responses from ongoing activity

    Science.gov (United States)

    Chen, Yonghong; Bressler, Steven L.; Knuth, Kevin H.; Truccolo, Wilson A.; Ding, Mingzhou

    2006-06-01

    In this article we consider the stochastic modeling of neurobiological time series from cognitive experiments. Our starting point is the variable-signal-plus-ongoing-activity model. From this model a differentially variable component analysis strategy is developed from a Bayesian perspective to estimate event-related signals on a single trial basis. After subtracting out the event-related signal from recorded single trial time series, the residual ongoing activity is treated as a piecewise stationary stochastic process and analyzed by an adaptive multivariate autoregressive modeling strategy which yields power, coherence, and Granger causality spectra. Results from applying these methods to local field potential recordings from monkeys performing cognitive tasks are presented.
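
    The adaptive MVAR step applied to the residual ongoing activity can be approximated, for two channels, with the statsmodels VAR class. In the sketch below the simulated traces merely stand in for single-trial residuals, and the paper's power, coherence, and Granger causality spectra would be computed from the fitted coefficient and noise-covariance matrices rather than from the time-domain F-tests shown here.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

# Placeholder two-channel "ongoing activity" after the event-related signal is removed.
rng = np.random.default_rng(2)
n = 2000
x = np.zeros(n)
y = np.zeros(n)
for t in range(1, n):
    x[t] = 0.6 * x[t - 1] + rng.normal()
    y[t] = 0.5 * y[t - 1] + 0.3 * x[t - 1] + rng.normal()   # x drives y

data = pd.DataFrame({"x": x, "y": y})
res = VAR(data).fit(maxlags=10, ic="aic")          # model order chosen adaptively by AIC

print(res.test_causality("y", ["x"], kind="f"))    # Granger causality x -> y
print(res.test_causality("x", ["y"], kind="f"))    # and the reverse direction
```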

  13. Time Series Modeling of Human Operator Dynamics in Manual Control Tasks

    Science.gov (United States)

    Biezad, D. J.; Schmidt, D. K.

    1984-01-01

    A time-series technique is presented for identifying the dynamic characteristics of the human operator in manual control tasks from relatively short records of experimental data. Control of system excitation signals used in the identification is not required. The approach is a multi-channel identification technique for modeling multi-input/multi-output situations. The method presented includes statistical tests for validity, is designed for digital computation, and yields estimates for the frequency response of the human operator. A comprehensive relative power analysis may also be performed for validated models. This method is applied to several sets of experimental data; the results are discussed and shown to compare favorably with previous research findings. New results are also presented for a multi-input task that was previously modeled to demonstrate the strengths of the method.

  14. Hydrological time series modeling: A comparison between adaptive neuro-fuzzy, neural network and autoregressive techniques

    Science.gov (United States)

    Lohani, A. K.; Kumar, Rakesh; Singh, R. D.

    2012-06-01

    Time series modeling is necessary for the planning and management of reservoirs. More recently, soft computing techniques have been used in hydrological modeling and forecasting. In this study, the potential of artificial neural networks and neuro-fuzzy systems in monthly reservoir inflow forecasting is examined by developing and comparing monthly reservoir inflow prediction models based on autoregressive (AR), artificial neural network (ANN) and adaptive neural-based fuzzy inference system (ANFIS) approaches. To account for the effect of monthly periodicity in the flow data, cyclic terms are also included in the ANN and ANFIS models. Working with time series flow data of the Sutlej River at Bhakra Dam, India, several ANN and adaptive neuro-fuzzy models are trained with different input vectors. To evaluate the performance of the selected ANN and ANFIS models, a comparison is made with the autoregressive (AR) models. The ANFIS model trained with an input data vector including previous inflows and cyclic terms of monthly periodicity shows a significant improvement in forecast accuracy in comparison with the ANFIS models trained with input vectors considering only previous inflows. In all cases ANFIS gives more accurate forecasts than the AR and ANN models. The proposed ANFIS model coupled with the cyclic terms is shown to provide a better representation of monthly inflow forecasting for planning and operation of the reservoir.

  15. THE EFFECT OF DECOMPOSITION METHOD AS DATA PREPROCESSING ON NEURAL NETWORKS MODEL FOR FORECASTING TREND AND SEASONAL TIME SERIES

    Directory of Open Access Journals (Sweden)

    Subanar Subanar

    2006-01-01

    Full Text Available Recently, one of the central topics for the neural networks (NN) community has been the issue of data preprocessing for the use of NN. In this paper, we investigate this topic, particularly the effect of the Decomposition method as data preprocessing when NN are used to model time series with both trend and seasonal patterns. The limited empirical studies on seasonal time series forecasting with neural networks are divided: some find that neural networks are able to model seasonality directly and that prior deseasonalization is not necessary, while others conclude just the opposite. In this research, we study the effectiveness of data preprocessing, including detrending and deseasonalization by the Decomposition method, on NN modeling and forecasting performance. We use two kinds of data, simulated and real. The simulated data are examined for multiplicative trend and seasonality patterns. The results are compared to those obtained from the classical time series model. Our results show that a combination of detrending and deseasonalization by the Decomposition method is an effective data preprocessing step for the use of NN in forecasting trend and seasonal time series.
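
    One way to try the detrend-and-deseasonalize preprocessing before a neural network, sketched with statsmodels and scikit-learn, is shown below; the synthetic multiplicative series, the lag order, and the network size are illustrative choices, and the back-transformation of forecasts to the original scale is omitted.

```python
import numpy as np
import pandas as pd
from sklearn.neural_network import MLPRegressor
from statsmodels.tsa.seasonal import seasonal_decompose

# Placeholder monthly series with multiplicative trend and seasonality.
rng = np.random.default_rng(0)
t = np.arange(240)
series = pd.Series((50 + 0.5 * t) * (1 + 0.3 * np.sin(2 * np.pi * t / 12))
                   * np.exp(0.05 * rng.standard_normal(240)),
                   index=pd.date_range("2000-01", periods=240, freq="MS"))

# Preprocessing: remove trend and seasonality with a classical decomposition.
dec = seasonal_decompose(series, model="multiplicative", period=12)
preprocessed = (series / dec.seasonal / dec.trend).dropna()

# Feed lagged values of the preprocessed series to a small neural network.
lags = 12
X = np.column_stack([preprocessed.shift(k).values for k in range(1, lags + 1)])[lags:]
y = preprocessed.values[lags:]
nn = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
nn.fit(X[:-24], y[:-24])
print("held-out R^2 on the preprocessed scale:", nn.score(X[-24:], y[-24:]))
```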

  16. Application of semi-parametric modelling to time series forecasting: case of the electricity consumption

    International Nuclear Information System (INIS)

    Lefieux, V.

    2007-10-01

    Reseau de Transport d'Electricite (RTE), in charge of operating the French electric transportation grid, needs an accurate forecast of the power consumption in order to operate it correctly. The forecasts used every day result from a model combining a nonlinear parametric regression and a SARIMA model. In order to obtain an adaptive forecasting model, nonparametric forecasting methods have already been tested without real success. In particular, it is known that a nonparametric predictor behaves badly with a great number of explanatory variables, which is commonly called the curse of dimensionality. Recently, semi-parametric methods, which improve on the pure nonparametric approach, have been proposed to estimate a regression function. Based on the concept of 'dimension reduction', one of those methods (called MAVE: Moving Average -conditional- Variance Estimate) can be applied to time series. We study empirically its effectiveness in predicting the future values of an autoregressive time series. We then adapt this method, from a practical point of view, to forecast power consumption. We propose a partially linear semi-parametric model, based on the MAVE method, which allows the autoregressive aspect of the problem and the exogenous variables to be taken into account simultaneously. The proposed estimation procedure is practically efficient. (author)

  17. Optimal model-free prediction from multivariate time series

    Science.gov (United States)

    Runge, Jakob; Donner, Reik V.; Kurths, Jürgen

    2015-05-01

    Forecasting a time series from multivariate predictors constitutes a challenging problem, especially using model-free approaches. Most techniques, such as nearest-neighbor prediction, quickly suffer from the curse of dimensionality and overfitting for more than a few predictors which has limited their application mostly to the univariate case. Therefore, selection strategies are needed that harness the available information as efficiently as possible. Since often the right combination of predictors matters, ideally all subsets of possible predictors should be tested for their predictive power, but the exponentially growing number of combinations makes such an approach computationally prohibitive. Here a prediction scheme that overcomes this strong limitation is introduced utilizing a causal preselection step which drastically reduces the number of possible predictors to the most predictive set of causal drivers making a globally optimal search scheme tractable. The information-theoretic optimality is derived and practical selection criteria are discussed. As demonstrated for multivariate nonlinear stochastic delay processes, the optimal scheme can even be less computationally expensive than commonly used suboptimal schemes like forward selection. The method suggests a general framework to apply the optimal model-free approach to select variables and subsequently fit a model to further improve a prediction or learn statistical dependencies. The performance of this framework is illustrated on a climatological index of El Niño Southern Oscillation.

  18. The ab initio model potential method. Second series transition metal elements

    International Nuclear Information System (INIS)

    Barandiaran, Z.; Seijo, L.; Huzinaga, S.

    1990-01-01

    The ab initio model potential (AIMP) method has already been presented in its nonrelativistic version and applied to the main group and first series transition metal elements [J. Chem. Phys. 86, 2132 (1987); 91, 7011 (1989)]. In this paper we extend the AIMP method to include relativistic effects within the Cowan-Griffin approximation, and we present relativistic Zn-like core model potentials and valence basis sets, as well as their nonrelativistic Zn-like core and Kr-like core counterparts. Pilot molecular calculations on YO, TcO, AgO, and AgH reveal that the 4p orbital is indeed a core orbital only at the end of the series, whereas the 4s orbital can be safely frozen from Y to Cd. The all-electron and model potential results agree to within 0.01-0.02 Å in R_e and 25-50 cm⁻¹ in ν̄_e if the same type of valence basis set is used. The comparison of the relativistic results on AgH with those of the all-electron Dirac-Fock calculations by Lee and McLean is satisfactory: the absolute value of R_e is reproduced within the 0.01 Å margin, and the relativistic contraction of 0.077 Å is also very well reproduced (0.075 Å). Finally, the relative magnitudes of the effects of the core orbital change, the mass-velocity potential, and the Darwin potential on the net relativistic effects are analyzed in the four molecules studied

  19. Hidden discriminative features extraction for supervised high-order time series modeling.

    Science.gov (United States)

    Nguyen, Ngoc Anh Thi; Yang, Hyung-Jeong; Kim, Sunhee

    2016-11-01

    In this paper, an orthogonal Tucker-decomposition-based extraction of high-order discriminative subspaces from a tensor-based time series data structure is presented, named as Tensor Discriminative Feature Extraction (TDFE). TDFE relies on the employment of category information for the maximization of the between-class scatter and the minimization of the within-class scatter to extract optimal hidden discriminative feature subspaces that are simultaneously spanned by every modality for supervised tensor modeling. In this context, the proposed tensor-decomposition method provides the following benefits: i) reduces dimensionality while robustly mining the underlying discriminative features, ii) results in effective interpretable features that lead to an improved classification and visualization, and iii) reduces the processing time during the training stage and the filtering of the projection by solving the generalized eigenvalue issue at each alternation step. Two real third-order tensor-structures of time series datasets (an epilepsy electroencephalogram (EEG) that is modeled as channel×frequency bin×time frame and a microarray data that is modeled as gene×sample×time) were used for the evaluation of the TDFE. The experiment results corroborate the advantages of the proposed method with averages of 98.26% and 89.63% for the classification accuracies of the epilepsy dataset and the microarray dataset, respectively. These performance averages represent an improvement on those of the matrix-based algorithms and recent tensor-based, discriminant-decomposition approaches; this is especially the case considering the small number of samples that are used in practice. Copyright © 2016 Elsevier Ltd. All rights reserved.
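
    For orientation, a plain (unsupervised) orthogonal Tucker decomposition of a third-order tensor can be computed with the tensorly package as below; this shows only the decomposition building block, not the supervised, discriminative TDFE procedure of the paper, and the tensor dimensions and ranks are invented.

```python
import numpy as np
import tensorly as tl
from tensorly.decomposition import tucker
from tensorly.tenalg import multi_mode_dot

# Placeholder EEG-like tensor: channel x frequency-bin x time-frame.
rng = np.random.default_rng(3)
tensor = tl.tensor(rng.standard_normal((22, 30, 100)))

# Orthogonal Tucker decomposition: a small core tensor plus one factor matrix per mode.
core, factors = tucker(tensor, rank=[8, 6, 10])

# Projecting the data onto the factor matrices gives a compressed, core-sized feature tensor.
features = multi_mode_dot(tensor, [f.T for f in factors])
print("core shape:", core.shape, "feature shape:", features.shape)
```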

  20. Modelling the fate of organic micropollutants in stormwater ponds

    DEFF Research Database (Denmark)

    Vezzaro, Luca; Eriksson, Eva; Ledin, Anna

    2011-01-01

    Urban water managers need to estimate the potential removal of organic micropollutants (MP) in stormwater treatment systems to support MP pollution control strategies. This study documents how the potential removal of organic MP in stormwater treatment systems can be quantified by using multimedia models. The fate of four different MP in a stormwater retention pond was simulated by applying two steady-state multimedia fate models (EPI Suite and SimpleBox) commonly applied in chemical risk assessment and a dynamic multimedia fate model (Stormwater Treatment Unit Model for Micro Pollutants — STUMP). The four simulated organic stormwater MP (iodopropynyl butylcarbamate — IPBC, benzene, glyphosate and pyrene) were selected according to their different urban sources and environmental fate. This ensures that the results can be extended to other relevant stormwater pollutants. All three models use...

  1. U/Pb dating: brioverian age of the Erquy series (Armorican massif, France)

    International Nuclear Information System (INIS)

    Cocherie, A.; Chantraine, J.; Egal, E.; Fanning, C.M.; Dabard, M.P.; Paris, F.; Le Herisse, A.

    2001-01-01

    New U/Pb analyses obtained with a high-resolution ion microprobe (SHRIMP) fix an age of 608 ± 7 Ma for spilites of the Erquy series, in Cadomian rocks of the Armorican massif, France. This Neoproterozoic age re-integrates this unit into the Brioverian, the age it was initially assigned to. A Rb/Sr whole-rock dating in the 1970s had undermined the regional Cadomian model by suggesting an Ordovician age for these rocks; this was apparently further supported by the discovery of organic remains interpreted as Paleozoic micro-fossils. The reassessment of this paleontologic attribution and the new isotope dating provide a final confirmation of the age of this series. (authors)

  2. A model-independent view of the mature organization

    Energy Technology Data Exchange (ETDEWEB)

    Hanna, M.; Langston, D.

    1996-12-31

    Over the last 10 years, industry has been dealing with the issues of process and organizational maturity. This focus on process is driven by the success that manufacturing organizations have had implementing the management principles of W. Edwards Deming and Joseph M. Juran. The organizational-maturity focus is driven by organizations striving to be ISO 9000 compliant or to achieve a specific level on one of the maturity models. Unfortunately, each of the models takes a specific view into what is a very broad arena; that is to say, each model addresses only a specific subset of the characteristics of maturity. This paper attempts to extend beyond these specific views to answer the general question: what is a mature organization, and how does it relate to quantitative management and statistical process control?

  3. 76 FR 6584 - Airworthiness Directives; Bombardier, Inc. Model DHC-8-400 Series Airplanes

    Science.gov (United States)

    2011-02-07

    .... Model DHC-8-400 Series Airplanes AGENCY: Federal Aviation Administration (FAA), DOT. ACTION: Notice of... area on the rib at Yw-42.000 to ensure adequate electrical bonding, installing spiral wrap on certain cable assemblies where existing spiral wrap does not extend 4 inches past the tie mounts, applying a...

  4. Statistical properties of fluctuations of time series representing appearances of words in nationwide blog data and their applications: An example of modeling fluctuation scalings of nonstationary time series.

    Science.gov (United States)

    Watanabe, Hayafumi; Sano, Yukie; Takayasu, Hideki; Takayasu, Misako

    2016-11-01

    To elucidate the nontrivial empirical statistical properties of fluctuations of a typical nonsteady time series representing the appearance of words in blogs, we investigated approximately 3×10^{9} Japanese blog articles over a period of six years and analyzed some corresponding mathematical models. First, we introduce a solvable nonsteady extension of the random diffusion model, which can be deduced by modeling the behavior of heterogeneous random bloggers. Next, we deduce theoretical expressions for both the temporal and ensemble fluctuation scalings of this model, and demonstrate that these expressions can reproduce all empirical scalings over eight orders of magnitude. Furthermore, we show that the model can reproduce other statistical properties of time series representing the appearance of words in blogs, such as the functional forms of the probability density and the correlations in the total number of blogs. As an application, we quantify the abnormality of special nationwide events by measuring the fluctuation scalings of 1771 basic adjectives.
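    As a minimal illustration of the temporal fluctuation scaling analyzed above, the sketch below fits a Taylor's-law-type exponent (std ~ mean^alpha) across a collection of count time series. The Poisson word counts are synthetic placeholders, not the blog data, and the fitting routine is an assumption of this sketch rather than the authors' estimator.

```python
import numpy as np

def temporal_fluctuation_scaling(counts):
    """Fit log10(std) = alpha * log10(mean) + c across words.

    counts: array of shape (n_words, n_days) of appearance counts."""
    mean = counts.mean(axis=1)
    std = counts.std(axis=1)
    keep = (mean > 0) & (std > 0)
    alpha, intercept = np.polyfit(np.log10(mean[keep]), np.log10(std[keep]), 1)
    return alpha, intercept

# toy data: Poisson word counts with word-specific rates (alpha ~ 0.5 expected)
rng = np.random.default_rng(1)
rates = 10 ** rng.uniform(-1, 3, size=2000)          # rates spanning four decades
counts = rng.poisson(rates[:, None], size=(2000, 365))
print(temporal_fluctuation_scaling(counts))
```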

  5. Stochastic modeling for time series InSAR: with emphasis on atmospheric effects

    Science.gov (United States)

    Cao, Yunmeng; Li, Zhiwei; Wei, Jianchao; Hu, Jun; Duan, Meng; Feng, Guangcai

    2018-02-01

    Despite the many applications of time series interferometric synthetic aperture radar (TS-InSAR) techniques in geophysical problems, error analysis and assessment have been largely overlooked. Tropospheric propagation error is still the dominant error source of InSAR observations. However, the spatiotemporal variation of atmospheric effects is seldom considered in the present standard TS-InSAR techniques, such as persistent scatterer interferometry and small baseline subset interferometry. The failure to consider the stochastic properties of atmospheric effects not only affects the accuracy of the estimators, but also makes it difficult to assess the uncertainty of the final geophysical results. To address this issue, this paper proposes a network-based variance-covariance estimation method to model the spatiotemporal variation of tropospheric signals and to estimate the temporal variance-covariance matrix of TS-InSAR observations. The constructed stochastic model is then incorporated into the TS-InSAR estimators, both for parameter estimation (e.g., deformation velocity, topography residual) and for uncertainty assessment. It is an incremental but positive improvement to the traditional weighted least squares methods used to solve multitemporal InSAR time series. The performance of the proposed method is validated using both simulated and real datasets.
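    A minimal sketch of the core numerical step implied above: once a temporal variance-covariance matrix C of the observations is available, the parameters and their uncertainties follow from generalized least squares. The exponential covariance and the velocity-plus-offset design matrix below are illustrative placeholders, not the paper's network-based estimator.

```python
import numpy as np

def gls_estimate(A, b, C):
    """Generalized least squares: x = (A^T C^-1 A)^-1 A^T C^-1 b,
    with parameter covariance Q = (A^T C^-1 A)^-1."""
    Ci = np.linalg.inv(C)
    Q = np.linalg.inv(A.T @ Ci @ A)
    x = Q @ A.T @ Ci @ b
    return x, Q

# toy example: linear deformation velocity estimated from phase observations
# whose noise is temporally correlated (exponential covariance as a placeholder)
rng = np.random.default_rng(2)
t = np.arange(30, dtype=float)                              # acquisition times (arbitrary units)
A = np.column_stack([t, np.ones_like(t)])                   # velocity + offset model
C = np.exp(-np.abs(t[:, None] - t[None, :]) / 5.0)          # correlated "atmosphere"
b = A @ np.array([0.3, 1.0]) + rng.multivariate_normal(np.zeros(30), C)
x_hat, Q = gls_estimate(A, b, C)
print("velocity estimate:", x_hat[0], "+/-", np.sqrt(Q[0, 0]))
```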

  6. Autoregressive models as a tool to discriminate chaos from randomness in geoelectrical time series: an application to earthquake prediction

    Directory of Open Access Journals (Sweden)

    C. Serio

    1997-06-01

    Full Text Available The time dynamics of geoelectrical precursory time series has been investigated and a method to discriminate chaotic behaviour in geoelectrical precursory time series is proposed. It allows us to detect low-dimensional chaos when the only information about the time series comes from the time series themselves. The short-term predictability of these time series is evaluated using two possible forecasting approaches: global autoregressive approximation and local autoregressive approximation. The first views the data as a realization of a linear stochastic process, whereas the second considers the data points as a realization of a deterministic process, supposedly non-linear. The comparison of the predictive skill of the two techniques is a test to discriminate between low-dimensional chaos and random dynamics. The analyzed time series are geoelectrical measurements recorded by an automatic station located in Tito (Southern Italy), in one of the most seismic areas of the Mediterranean region. Our findings are that the global (linear) approach is superior to the local one and that the physical system governing the phenomena of electrical nature is characterized by a large number of degrees of freedom. Power spectra of the filtered time series follow a P(f) ∝ f^(-a) scaling law: they exhibit the typical behaviour of a broad class of fractal stochastic processes and are a signature of self-organized systems.
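    The comparison of a global linear autoregressive fit with a local (nearest-neighbour) autoregressive approximation can be sketched as below. The data are a synthetic AR(2) series, for which the global fit should win, mirroring the conclusion reported for the geoelectrical series; the embedding dimension, neighbourhood size and all names are assumptions of this sketch.

```python
import numpy as np

def embed(x, m):
    """Delay-embed: rows are [x_t, ..., x_{t+m-1}], targets are x_{t+m}."""
    X = np.column_stack([x[i:len(x) - m + i] for i in range(m)])
    return X, x[m:]

def global_ar_error(x, m, split):
    """Fit one linear AR model on the training part and score it out of sample."""
    X, y = embed(x, m)
    coef, *_ = np.linalg.lstsq(X[:split], y[:split], rcond=None)
    return np.mean((X[split:] @ coef - y[split:]) ** 2)

def local_ar_error(x, m, split, k=10):
    """Fit a separate small AR model on the k nearest neighbours of each query."""
    X, y = embed(x, m)
    errs = []
    for xq, yq in zip(X[split:], y[split:]):
        nbrs = np.argsort(np.linalg.norm(X[:split] - xq, axis=1))[:k]
        coef, *_ = np.linalg.lstsq(X[:split][nbrs], y[:split][nbrs], rcond=None)
        errs.append((xq @ coef - yq) ** 2)
    return np.mean(errs)

# toy comparison on a linear stochastic (AR(2)) series: the global fit should win
rng = np.random.default_rng(3)
x = np.zeros(2000)
for i in range(2, 2000):
    x[i] = 0.6 * x[i - 1] - 0.3 * x[i - 2] + rng.standard_normal()
print("global:", global_ar_error(x, 4, 1500), "local:", local_ar_error(x, 4, 1500))
```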

  7. A new Markov-chain-related statistical approach for modelling synthetic wind power time series

    International Nuclear Information System (INIS)

    Pesch, T; Hake, J F; Schröders, S; Allelein, H J

    2015-01-01

    The integration of rising shares of volatile wind power in the generation mix is a major challenge for the future energy system. To address the uncertainties involved in wind power generation, models analysing and simulating the stochastic nature of this energy source are becoming increasingly important. One statistical approach that has been frequently used in the literature is the Markov chain approach. Recently, the method was identified as being of limited use for generating wind time series with time steps shorter than 15–40 min as it is not capable of reproducing the autocorrelation characteristics accurately. This paper presents a new Markov-chain-related statistical approach that is capable of solving this problem by introducing a variable second lag. Furthermore, additional features are presented that allow for the further adjustment of the generated synthetic time series. The influences of the model parameter settings are examined by meaningful parameter variations. The suitability of the approach is demonstrated by an application analysis with the example of the wind feed-in in Germany. It shows that—in contrast to conventional Markov chain approaches—the generated synthetic time series do not systematically underestimate the required storage capacity to balance wind power fluctuation. (paper)
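    A minimal sketch of the general idea of a second lag: the next discretized wind-power state is conditioned on both the previous state and a state further back, instead of on the previous state alone as in a first-order chain. The number of bins, the lag and the uniformly random placeholder data are assumptions of this sketch and are not taken from the paper.

```python
import numpy as np

def fit_two_lag_markov(states, n_states, lag2):
    """Estimate P(s_t | s_{t-1}, s_{t-lag2}) from a discretized series."""
    counts = np.ones((n_states, n_states, n_states))        # Laplace smoothing
    for t in range(lag2, len(states)):
        counts[states[t - 1], states[t - lag2], states[t]] += 1
    return counts / counts.sum(axis=2, keepdims=True)

def simulate(P, init, lag2, n_steps, rng):
    """Generate a synthetic state sequence from the two-lag transition tensor."""
    s = list(init)                                          # needs at least lag2 seed values
    for _ in range(n_steps):
        p = P[s[-1], s[-lag2]]
        s.append(rng.choice(len(p), p=p))
    return np.array(s)

# toy usage with hypothetical binned wind power (8 bins) and a second lag of 6 steps
rng = np.random.default_rng(4)
observed = rng.integers(0, 8, size=5000)                    # stand-in for binned wind data
P = fit_two_lag_markov(observed, n_states=8, lag2=6)
synthetic = simulate(P, observed[:6], lag2=6, n_steps=1000, rng=rng)
print(synthetic[:20])
```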

  8. Historical Perspectives and Future Needs in the Development of the Soil Series Concept

    Science.gov (United States)

    Beaudette, Dylan E.; Brevik, Eric C.; Indorante, Samuel J.

    2016-04-01

    The soil series concept is an ever-evolving understanding of soil profile observations, their connection to the landscape, and functional limits on the range in characteristics that affect management. Historically, the soil series has played a pivotal role in the development of soil-landscape theory, modern soil survey methods, and concise delivery of soils information to the end user; in other words, the soil series is the palette from which soil survey reports are crafted. Over the last 20 years the soil series has received considerable criticism as a means of soil information organization (soil survey development) and delivery (end-user application of soil survey data), with increasing pressure (internal and external) to retire the soil series. We propose that a modern re-examination of soil series information could help address several of the long-standing critiques of soil survey: consistency across survey vintage and political divisions, and more robust estimates of soil properties and associated uncertainty. A new library of soil series data would include classic narratives describing morphology and management, quantitative descriptions of soil properties and their ranges, graphical depiction of the relationships between associated soil series, block diagrams illustrating soil-landscape models, maps of series distribution, and a probabilistic representation of a "typical" soil profile. These data would be derived from re-correlation of existing morphologic and characterization data, informed by modern statistical methods and regional expertise.

  9. Labour Quality Model for Organic Farming Food Chains

    OpenAIRE

    Gassner, B.; Freyer, B.; Leitner, H.

    2008-01-01

    The debate on labour quality is controversial, in science as well as in the organic agriculture community. We therefore reviewed the literature on different labour quality models and definitions, and conducted key informant interviews on labour quality issues with stakeholders in a regionally oriented organic agriculture bread food chain. We developed a labour quality model with nine quality categories and discussed linkages to labour satisfaction, ethical values and IFOAM principles.

  10. Modeling commodity salam contract between two parties for discrete and continuous time series

    Science.gov (United States)

    Hisham, Azie Farhani Badrol; Jaffar, Maheran Mohd

    2017-08-01

    In order for Islamic finance to remain as competitive as conventional finance, new syariah-compliant products, such as Islamic derivatives that can be used to manage risk, need to be developed. However, under syariah principles and regulations, financial instruments must not conflict with the five syariah elements, which are riba (interest paid), rishwah (corruption), gharar (uncertainty or unnecessary risk), maysir (speculation or gambling) and jahl (taking advantage of the counterparty's ignorance). This study proposes that a traditional Islamic contract, namely salam, can be structured as an Islamic derivative product. Although many studies have discussed and proposed the implementation of the salam contract as an Islamic product, they focus mainly on qualitative and legal issues. Since quantitative studies of the salam contract are lacking, this study introduces mathematical models that can value the appropriate salam price for a commodity salam contract between two parties. In modeling the commodity salam contract, this study modifies the existing conventional derivative model with some adjustments to comply with syariah rules and regulations. The cost of carry model has been chosen as the foundation to develop the commodity salam model between two parties for discrete and continuous time series. However, the conventional time value of money results from the concept of interest, which is prohibited in Islam. Therefore, this study adopts the idea of the Islamic time value of money, known as positive time preference, in modeling the commodity salam contract between two parties for discrete and continuous time series.
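    For reference, the conventional cost-of-carry pricing named above as the foundation can be written, for a delivery price F agreed at time t for maturity T, spot price S_t and carrying charge c, as below; the paper's syariah-compliant replacement of the interest-based time value is not reproduced here.

```latex
% Conventional cost-of-carry baseline (before any syariah-compliant adjustment)
F_{t,T} = S_t \,(1 + c)^{\,T - t} \quad \text{(discrete compounding)},
\qquad
F_{t,T} = S_t \, e^{\,c\,(T - t)} \quad \text{(continuous compounding)}.
```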

  11. Monitoring Farmland Loss Caused by Urbanization in Beijing from Modis Time Series Using Hierarchical Hidden Markov Model

    Science.gov (United States)

    Yuan, Y.; Meng, Y.; Chen, Y. X.; Jiang, C.; Yue, A. Z.

    2018-04-01

    In this study, we proposed a method to map urban encroachment onto farmland using satellite image time series (SITS) based on the hierarchical hidden Markov model (HHMM). In this method, the farmland change process is decomposed into three hierarchical levels, i.e., the land cover level, the vegetation phenology level, and the SITS level. Then a three-level HHMM is constructed to model the multi-level semantic structure of farmland change process. Once the HHMM is established, a change from farmland to built-up could be detected by inferring the underlying state sequence that is most likely to generate the input time series. The performance of the method is evaluated on MODIS time series in Beijing. Results on both simulated and real datasets demonstrate that our method improves the change detection accuracy compared with the HMM-based method.
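    The inference step described above (recovering the most likely underlying state sequence from a pixel's observation sequence) can be sketched with a plain first-order HMM Viterbi decoder, used here as a simplified stand-in for the three-level HHMM. The farmland/built-up states, the coarse NDVI observation levels and all probabilities are illustrative assumptions.

```python
import numpy as np

def viterbi(log_A, log_B, log_pi, obs):
    """Most likely hidden-state path for an observation sequence.

    log_A: (S, S) log transition matrix, log_B: (S, K) log emission matrix,
    log_pi: (S,) log initial distribution, obs: sequence of symbol indices."""
    S, T = log_A.shape[0], len(obs)
    delta = np.zeros((T, S))
    psi = np.zeros((T, S), dtype=int)
    delta[0] = log_pi + log_B[:, obs[0]]
    for t in range(1, T):
        scores = delta[t - 1][:, None] + log_A              # scores[i, j]: from i to j
        psi[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) + log_B[:, obs[t]]
    path = [int(delta[-1].argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(psi[t, path[-1]]))
    return path[::-1]

# toy usage: states 0 = farmland, 1 = built-up; observations are coarse NDVI levels
A = np.array([[0.95, 0.05],      # farmland can convert to built-up ...
              [0.00, 1.00]])     # ... but built-up never reverts (absorbing)
B = np.array([[0.1, 0.3, 0.6],   # farmland: mostly high NDVI
              [0.7, 0.2, 0.1]])  # built-up: mostly low NDVI
pi = np.array([0.9, 0.1])
obs = [2, 2, 1, 2, 0, 0, 1, 0]   # NDVI drops midway through the series
with np.errstate(divide="ignore"):
    print(viterbi(np.log(A), np.log(B), np.log(pi), obs))
```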

  12. Studies on time series applications in environmental sciences

    CERN Document Server

    Bărbulescu, Alina

    2016-01-01

    Time series analysis and modelling represent a large field of study, approached from both the time and frequency perspectives, with applications in different domains. Modelling hydro-meteorological time series is difficult due to the characteristics of these series, such as long-range dependence, spatial dependence and correlation with other series. Continuous spatial data play an important role in planning, risk assessment and decision making in environmental management. In this context, this book presents various statistical tests and modelling techniques used for time series analysis, as well as applications to hydro-meteorological series from Dobrogea, a region situated in the south-eastern part of Romania, less studied until now. Part of the results are accompanied by their R code.

  13. Towards Increased Relevance: Context-Adapted Models of the Learning Organization

    Science.gov (United States)

    Örtenblad, Anders

    2015-01-01

    Purpose: The purposes of this paper are to take a closer look at the relevance of the idea of the learning organization for organizations in different generalized organizational contexts; to open up for the existence of multiple, context-adapted models of the learning organization; and to suggest a number of such models.…

  14. Effective Feature Preprocessing for Time Series Forecasting

    DEFF Research Database (Denmark)

    Zhao, Junhua; Dong, Zhaoyang; Xu, Zhao

    2006-01-01

    Time series forecasting is an important area in data mining research. Feature preprocessing techniques have a significant influence on forecasting accuracy and are therefore essential in a forecasting model. Although several feature preprocessing techniques have been applied in time series forecasting... performance in time series forecasting. It is demonstrated in our experiment that effective feature preprocessing can significantly enhance forecasting accuracy. This research can be useful guidance for researchers on effectively selecting feature preprocessing techniques and integrating them with time series forecasting models.

  15. Early diagenesis of recently deposited organic matter: A 9-yr time-series study of a flood deposit

    Science.gov (United States)

    Tesi, T.; Langone, L.; Goñi, M. A.; Wheatcroft, R. A.; Miserocchi, S.; Bertotti, L.

    2012-04-01

    In Fall 2000, the Po River (Italy) experienced a 100-yr return period flood that resulted in a 1-25 cm-thick deposit in the adjacent prodelta (10-25 m water depth). In the following years, numerous post-depositional perturbations occurred including bioturbation, reworking by waves with heights exceeding 5 m, as well as periods of extremely high and low sediment supply. Cores collected in the central prodelta after the Fall 2000 flood and over the following 9 yr, allowed characterization of the event-strata in their initial state and documentation of their subsequent evolution. Sedimentological characteristics were investigated using X-radiographs and sediment texture analyses, whereas the composition of sedimentary organic matter (OM) was studied via bulk and biomarker analyses, including organic carbon (OC), total nitrogen (TN), carbon stable isotope composition (δ13C), lignin phenols, cutin-products, p-hydroxy benzenes, benzoic acids, dicarboxylic acids, and fatty acids. The 9-yr time-series analysis indicated that roughly the lower half of the original event bed was preserved in the sediment record. Conversely, the upper half of the deposit experienced significant alterations including bioturbation, addition of new material, as well as coarsening. Comparison of the recently deposited material with 9-yr old preserved strata represented a unique natural laboratory to investigate the diagenesis of sedimentary OM in a non-steady system. Bulk data indicated that OC and TN were degraded at similar rates (loss ∼17%) whereas biomarkers exhibited a broad spectrum of reactivities (loss from ∼6% to ∼60%) indicating selective preservation during early diagenesis. Given the relevance of episodic sedimentation in several margins, this study has demonstrated the utility of event-response and time-series sampling of the seabed for understanding the early diagenesis in non-steady conditions.

  16. Power flow control for transmission networks with implicit modeling of static synchronous series compensator

    DEFF Research Database (Denmark)

    Kamel, S.; Jurado, F.; Chen, Zhe

    2015-01-01

    This paper presents an implicit modeling of the Static Synchronous Series Compensator (SSSC) in the Newton-Raphson load flow method. The load flow algorithm is based on the revised current injection formulation, and the developed SSSC model relies on the current injection approach. In this model ... will be in the mismatches vector. Finally, this modeling solves the problem that occurs when the SSSC is only connected between two areas. Numerical examples on the WSCC 9-bus, IEEE 30-bus and IEEE 118-bus systems are used to illustrate the feasibility of the developed SSSC model and the performance of the Newton-Raphson...

  17. Factors Influencing Implementation of OHSAS 18001 in Indian Construction Organizations: Interpretive Structural Modeling Approach.

    Science.gov (United States)

    Rajaprasad, Sunku Venkata Siva; Chalapathi, Pasupulati Venkata

    2015-09-01

    Construction activity has made considerable breakthroughs in the past two decades on the back of increases in development activities, government policies, and public demand. At the same time, occupational health and safety issues have become a major concern to construction organizations. The unsatisfactory safety performance of the construction industry has always been highlighted, since the safety management system is a neglected area and is not implemented systematically in Indian construction organizations. Due to a lack of enforcement of the applicable legislation, most construction organizations are forced to opt for the implementation of Occupational Health and Safety Assessment Series (OHSAS) 18001 to improve safety performance. In order to better understand the factors influencing the implementation of OHSAS 18001, an interpretive structural modeling approach has been applied and the factors have been classified using matrice d'impacts croisés-multiplication appliquée à un classement (MICMAC) analysis. The study proposes an underlying theoretical framework to identify factors and to help the management of Indian construction organizations understand the interaction among factors influencing the implementation of OHSAS 18001. Safety culture, continual improvement, morale of employees, and safety training have been identified as dependent variables. Safety performance, sustainable construction, and a conducive working environment have been identified as linkage variables. Management commitment and safety policy have been identified as the driver variables. Management commitment has the maximum driving power, and the most influential factor is safety policy, which clearly states the commitment of top management towards occupational safety and health.

  18. Visualizing the Geometric Series.

    Science.gov (United States)

    Bennett, Albert B., Jr.

    1989-01-01

    Mathematical proofs often leave students unconvinced or without understanding of what has been proved, because they provide no visual-geometric representation. Presented are geometric models for the finite geometric series when r is a whole number, and the infinite geometric series when r is the reciprocal of a whole number. (MNS)

  19. Work-related accidents among the Iranian population: a time series analysis, 2000–2011

    Science.gov (United States)

    Karimlou, Masoud; Imani, Mehdi; Hosseini, Agha-Fatemeh; Dehnad, Afsaneh; Vahabi, Nasim; Bakhtiyari, Mahmood

    2015-01-01

    Background: Work-related accidents result in human suffering and economic losses and are considered a major health problem worldwide, especially in the economically developing world. Objectives: To introduce seasonal autoregressive integrated moving average (ARIMA) models for time series analysis of work-related accident data for workers insured by the Iranian Social Security Organization (ISSO) between 2000 and 2011. Methods: In this retrospective study, all insured people experiencing at least one work-related accident during a 10-year period were included in the analyses. We used Box-Jenkins modeling to develop a time series model of the total number of accidents. Results: There was an average of 1476 accidents per month (1476.05±458.77, mean±SD). The final ARIMA (p,d,q)(P,D,Q)s model fitted to the data was ARIMA(1,1,1)×(0,1,1)12, consisting of first-order autoregressive, moving average and seasonal moving average parameters, with a mean absolute percentage error (MAPE) of 20.942. Conclusions: The final model showed that time series analysis with ARIMA models is useful for forecasting the number of work-related accidents in Iran. In addition, the forecasted number of work-related accidents for 2011 reflected the stability of the occurrence of these accidents in recent years, indicating a need for preventive occupational health and safety policies such as safety inspection. PMID:26119774
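    A minimal sketch of fitting the reported ARIMA(1,1,1)×(0,1,1)12 specification and computing a MAPE on a hold-out year, using statsmodels' SARIMAX. The monthly accident counts below are simulated placeholders, not the ISSO data.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
# placeholder monthly accident counts with trend + yearly seasonality (120 months)
months = np.arange(120)
y = 1400 + 2 * months + 300 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 80, 120)

train, test = y[:-12], y[-12:]
model = sm.tsa.statespace.SARIMAX(train, order=(1, 1, 1), seasonal_order=(0, 1, 1, 12))
fit = model.fit(disp=False)

forecast = fit.forecast(steps=12)                    # 12-month-ahead prediction
mape = np.mean(np.abs((test - forecast) / test)) * 100
print(fit.summary())
print("12-month-ahead MAPE: %.2f%%" % mape)
```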

  20. A self-organized criticality model for plasma transport

    International Nuclear Information System (INIS)

    Carreras, B.A.; Newman, D.; Lynch, V.E.

    1996-01-01

    Many models of natural phenomena manifest the basic hypothesis of self-organized criticality (SOC). The SOC concept brings together the self-similarity on space and time scales that is common to many of these phenomena. The application of the SOC modelling concept to the plasma dynamics near marginal stability opens new possibilities of understanding issues such as Bohm scaling, profile consistency, broad band fluctuation spectra with universal characteristics and fast time scales. A model realization of self-organized criticality for plasma transport in a magnetic confinement device is presented. The model is based on subcritical resistive pressure-gradient-driven turbulence. Three-dimensional nonlinear calculations based on this model show the existence of transport under subcritical conditions. This model that includes fluctuation dynamics leads to results very similar to the running sandpile paradigm
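    The running sandpile paradigm mentioned above can be illustrated with a one-dimensional toy model: random deposition ("fuelling") plus local toppling wherever the slope exceeds a critical value, with an open boundary. This is only an illustration of SOC dynamics, not the resistive pressure-gradient-driven turbulence model; all parameters are arbitrary.

```python
import numpy as np

def running_sandpile(n_cells=200, n_steps=20000, z_crit=8, n_fall=3, p_rain=0.1, seed=6):
    """1-D running sandpile: random deposition plus local toppling above a critical slope."""
    rng = np.random.default_rng(seed)
    h = np.zeros(n_cells)
    avalanche_sizes = []
    for _ in range(n_steps):
        h[rng.random(n_cells) < p_rain] += 1                # random "fuelling"
        size = 0
        unstable = True
        while unstable:
            slope = h[:-1] - h[1:]
            idx = np.where(slope > z_crit)[0]
            unstable = idx.size > 0
            h[idx] -= n_fall                                # topple downhill
            h[idx + 1] += n_fall
            size += idx.size
        h[-1] = 0                                           # open boundary: sand leaves the system
        if size:
            avalanche_sizes.append(size)
    return np.array(avalanche_sizes)

sizes = running_sandpile()
print("avalanches:", sizes.size, "mean size:", sizes.mean(), "max size:", sizes.max())
```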

  1. Time series modeling for analysis and control advanced autopilot and monitoring systems

    CERN Document Server

    Ohtsu, Kohei; Kitagawa, Genshiro

    2015-01-01

    This book presents multivariate time series methods for the analysis and optimal control of feedback systems. Although ships’ autopilot systems are considered through the entire book, the methods set forth in this book can be applied to many other complicated, large, or noisy feedback control systems for which it is difficult to derive a model of the entire system based on theory in that subject area. The basic models used in this method are the multivariate autoregressive model with exogenous variables (ARX) model and the radial bases function net-type coefficients ARX model. The noise contribution analysis can then be performed through the estimated autoregressive (AR) model and various types of autopilot systems can be designed through the state–space representation of the models. The marine autopilot systems addressed in this book include optimal controllers for course-keeping motion, rolling reduction controllers with rudder motion, engine governor controllers, noise adaptive autopilots, route-tracki...

  2. Modeling Dyadic Processes Using Hidden Markov Models: A Time Series Approach to Mother-Infant Interactions during Infant Immunization

    Science.gov (United States)

    Stifter, Cynthia A.; Rovine, Michael

    2015-01-01

    The focus of the present longitudinal study, to examine mother-infant interaction during the administration of immunizations at 2 and 6 months of age, used hidden Markov modelling, a time series approach that produces latent states to describe how mothers and infants work together to bring the infant to a soothed state. Results revealed a…

  3. Modeling time-dependent toxicity to aquatic organisms from pulsed exposure of PAHs in urban road runoff

    International Nuclear Information System (INIS)

    Zhang Wei; Ye Youbin; Tong Yindong; Ou Langbo; Hu Dan; Wang Xuejun

    2011-01-01

    Understanding the magnitude of urban runoff toxicity to aquatic organisms is important for effective management of runoff quality. In this paper, the aquatic toxicity of polycyclic aromatic hydrocarbons (PAHs) in urban road runoff was evaluated through a damage assessment model. The mortality probability of organisms representative of the aquatic environment was calculated using the monitored PAH concentrations in road runoff. The results showed that the toxicity of runoff in spring was higher than that in summer. Analysis of the time-dependent toxicity of a series of runoff water samples illustrated that the toxicity of runoff water in the final phase of a runoff event may be as high as that in the initial phase. Therefore, storm runoff treatment systems or strategies designed for capture and treatment of the initial portion of runoff may be inappropriate for control of runoff toxicity. - Research highlights: → Toxicity resulting from realistic exposure patterns of urban runoff is evaluated. → Toxicity of runoff water in the final phase is as high as in the initial phase. → Treatment of the initial runoff portion is inappropriate to abate runoff toxicity. - Toxicity to aquatic organisms after sequential pulsed exposure to PAHs in urban road runoff is evaluated.

  4. Predicting Time Series Outputs and Time-to-Failure for an Aircraft Controller Using Bayesian Modeling

    Science.gov (United States)

    He, Yuning

    2015-01-01

    Safety of unmanned aerial systems (UAS) is paramount, but the large number of dynamically changing controller parameters makes it hard to determine if the system is currently stable, and the time before loss of control if not. We propose a hierarchical statistical model using Treed Gaussian Processes to predict (i) whether a flight will be stable (success) or become unstable (failure), (ii) the time-to-failure if unstable, and (iii) time series outputs for flight variables. We first classify the current flight input into success or failure types, and then use separate models for each class to predict the time-to-failure and time series outputs. As different inputs may cause failures at different times, we have to model variable length output curves. We use a basis representation for curves and learn the mappings from input to basis coefficients. We demonstrate the effectiveness of our prediction methods on a NASA neuro-adaptive flight control system.

  5. Measurements and receptor modeling of volatile organic compounds in Southeastern Mexico City, 2000–2007

    Directory of Open Access Journals (Sweden)

    H. Wöhrnschimmel

    2010-09-01

    Full Text Available Ambient samples of volatile organic compounds (VOCs) were measured between 2000 and 2007 in Southeastern Mexico City, quantifying 13 species (ethane, propane, propylene, butane, acetylene, pentane, hexane, heptane, benzene, octane, toluene, nonane, o-xylene). These time series were analyzed for long-term trends, using linear regression models. A main finding was that the concentrations for several VOC species were decreasing during this period. A receptor model was applied to identify possible VOC sources, as well as temporal patterns in their respective contributions. Domestic use of liquefied petroleum gas (LPG) and vehicle exhaust are suggested to be the principal emission sources, contributing together between 70% and 80% to the total of quantified species. Both diurnal and seasonal patterns, as well as a weekend effect, were recognized in the modelled source contributions. Furthermore, decreasing trends over time were found for LPG and hot soak (−7.8% and −12.7% per year, respectively, p < 0.01), whereas for vehicle exhaust no significant trend was found.

  6. Guidance for implementing an environmental, safety, and health-assurance program. Volume 15. A model plan for line organization environmental, safety, and health-assurance programs

    Energy Technology Data Exchange (ETDEWEB)

    Ellingson, A.C.; Trauth, C.A. Jr.

    1982-01-01

    This is 1 of 15 documents designed to illustrate how an Environmental, Safety and Health (ES and H) Assurance Program may be implemented. The generic definition of ES and H Assurance Programs is given in a companion document entitled An Environmental, Safety and Health Assurance Program Standard. This particular document presents a model operational-level ES and H Assurance Program that may be used as a guide by an operational-level organization in developing its own plan. The model presented here reflects the guidance given in the total series of 15 documents.

  7. 78 FR 65155 - Special Conditions: Learjet Model 45 Series Airplanes; Isolation or Security Protection of the...

    Science.gov (United States)

    2013-10-31

    ... an association, business, labor union, etc.). DOT's complete Privacy Act Statement can be found in... supplemental type certificate (STC) change in the digital systems architecture in the Learjet Model 45 series... plus two crew members. The proposed Learjet Model 45 architecture is new and novel for commercial...

  8. The Zebrafish Model Organism Database (ZFIN)

    Data.gov (United States)

    U.S. Department of Health & Human Services — ZFIN serves as the zebrafish model organism database. It aims to: a) be the community database resource for the laboratory use of zebrafish, b) develop and support...

  9. Dynamic factor analysis in the frequency domain: causal modeling of multivariate psychophysiological time series

    NARCIS (Netherlands)

    Molenaar, P.C.M.

    1987-01-01

    Outlines a frequency domain analysis of the dynamic factor model and proposes a solution to the problem of constructing a causal filter of lagged factor loadings. The method is illustrated with applications to simulated and real multivariate time series. The latter applications involve topographic

  10. Effect of continuous addition of an organic substrate to the anoxic phase on biological phosphorus removal

    DEFF Research Database (Denmark)

    Meinhold, Jens; Pedersen, Heinz; Arnold, Eva

    1998-01-01

    The continuous introduction of a biological phosphorus removal (BPR) promoting organic substrate to the denitrifying reactor of a BPR process is examined through a series of batch experiments using acetate as model organic substrate. Several observations are made regarding the influence of substr...

  11. Charge Equalization Controller Algorithm for Series-Connected Lithium-Ion Battery Storage Systems: Modeling and Applications

    Directory of Open Access Journals (Sweden)

    Mahammad A. Hannan

    2017-09-01

    Full Text Available This study aims to develop an accurate model of a charge equalization controller (CEC) that manages individual cell monitoring and equalizing by charging and discharging series-connected lithium-ion (Li-ion) battery cells. In this concept, an intelligent control algorithm is developed to activate bidirectional cell switches and control direct current (DC–DC) converter switches along with pulse width modulation (PWM) generation. Individual models of an electric vehicle (EV)-sustainable Li-ion battery, optimal power rating, a bidirectional flyback DC–DC converter, and charging and discharging controllers are integrated to develop a small-scale CEC model that can be implemented for 10 series-connected Li-ion battery cells. Results show that the charge equalization controller operates at 91% efficiency and performs well in equalizing both overdischarged and overcharged cells on time. Moreover, the outputs of the CEC model show that the desired balancing level occurs at 2% of state of charge difference and that all cells are operated within a normal range. The configuration, execution, control, power loss, cost, size, and efficiency of the developed CEC model are compared with those of existing controllers. The proposed model is proven suitable for high-tech storage systems toward the advancement of sustainable EV technologies and renewable source of applications.

  12. Investigation of Relationship Between Hydrologic Processes of Precipitation, Evaporation and Stream Flow Using Linear Time Series Models (Case study: Western Basins of Lake Urmia

    Directory of Open Access Journals (Sweden)

    M. Moravej

    2016-02-01

    Full Text Available Introduction: Studying the hydrological cycle, especially at large scales such as water catchments, is difficult and complicated despite the fact that the number of hydrological components is limited. This complexity arises from the complex interactions between hydrological components and the environment. Recognition, determination and modeling of all interactive processes are needed to address this issue, but this is not feasible for practical engineering problems. It is therefore more convenient to consider hydrological components as stochastic phenomena and to use stochastic models for modeling them. Stochastic simulation of time series models related to water resources, particularly hydrologic time series, has been widely used in recent decades in order to solve issues pertaining to the planning and management of water resource systems. In this study, time series models are fitted to the precipitation, evaporation and stream flow series separately, and the relationships between stream flow and precipitation processes are investigated. In fact, the three mentioned processes should be modeled in parallel to each other in order to acquire a comprehensive vision of hydrological conditions in the region. Moreover, the relationship between the hydrologic processes has mostly been studied with respect to their trends. It is desirable to investigate the relationship between trends of hydrological processes and climate change, while the relationship of the models has not been taken into consideration. The main objective of this study is to investigate the relationship between hydrological processes, their effects on each other and the selected models. Material and Method: In the current study, four sub-basins of the Lake Urmia Basin, namely Zolachay (A), Nazloochay (B), Shahrchay (C) and Barandoozchay (D), were considered. Precipitation, evaporation and stream flow time series were modeled by linear time series. Fundamental assumptions of time series analysis, namely

  13. The initiative on Model Organism Proteomes (iMOP) Session

    DEFF Research Database (Denmark)

    Schrimpf, Sabine P; Mering, Christian von; Bendixen, Emøke

    2012-01-01

    iMOP – the Initiative on Model Organism Proteomes – was accepted as a new HUPO initiative at the Ninth HUPO meeting in Sydney in 2010. A goal of iMOP is to integrate research groups working on a great diversity of species into a model organism community. At the Tenth HUPO meeting in Geneva...

  14. 76 FR 36390 - Airworthiness Directives; The Boeing Company Model 747SP Series Airplanes

    Science.gov (United States)

    2011-06-22

    ... power control modules (PCM). This proposed AD was prompted by a report of a rudder hard-over event on a... rudder PCM manifold, which could result in a hard-over of the rudder surface leading to an increase in... of a Model 747-400 series airplane of a lower rudder hard-over event caused by a lower rudder PCM...

  15. Downsizer - A Graphical User Interface-Based Application for Browsing, Acquiring, and Formatting Time-Series Data for Hydrologic Modeling

    Science.gov (United States)

    Ward-Garrison, Christian; Markstrom, Steven L.; Hay, Lauren E.

    2009-01-01

    The U.S. Geological Survey Downsizer is a computer application that selects, downloads, verifies, and formats station-based time-series data for environmental-resource models, particularly the Precipitation-Runoff Modeling System. Downsizer implements the client-server software architecture. The client presents a map-based, graphical user interface that is intuitive to modelers; the server provides streamflow and climate time-series data from over 40,000 measurement stations across the United States. This report is the Downsizer user's manual and provides (1) an overview of the software design, (2) installation instructions, (3) a description of the graphical user interface, (4) a description of selected output files, and (5) troubleshooting information.

  16. IT Business Value Model for Information Intensive Organizations

    Directory of Open Access Journals (Sweden)

    Antonio Carlos Gastaud Maçada

    2012-01-01

    Full Text Available Many studies have highlighted the capacity Information Technology (IT) has for generating value for organizations. Investments in IT made by organizations have increased each year. Therefore, the purpose of the present study is to analyze the IT Business Value for Information Intensive Organizations (IIO), e.g. banks, insurance companies and securities brokers. The research method consisted of a survey that used and combined the models from Weill and Broadbent (1998) and Gregor, Martin, Fernandez, Stern and Vitale (2006). Data was gathered using an adapted instrument containing 5 dimensions (Strategic, Informational, Transactional, Transformational and Infra-structure) with 27 items. The instrument was refined by employing statistical techniques such as Exploratory and Confirmatory Factorial Analysis through Structural Equations (first and second order Measurement Model). The final model is composed of four factors related to IT Business Value: Strategic, Informational, Transactional and Transformational, arranged in 15 items. The dimension Infra-structure was excluded during the model refinement process because it was discovered during interviews that managers were unable to perceive it as a distinct dimension of IT Business Value.

  17. Charge carrier relaxation model in disordered organic semiconductors

    International Nuclear Information System (INIS)

    Lu, Nianduan; Li, Ling; Sun, Pengxiao; Liu, Ming

    2013-01-01

    The relaxation phenomena of charge carriers in disordered organic semiconductors have been demonstrated and investigated theoretically. An analytical model describing charge carrier relaxation is proposed based on pure hopping transport theory. The relations between material disorder, electric field, temperature and the relaxation phenomena are discussed in detail. The calculated results reveal that an increase of electric field and temperature can promote the relaxation effect in disordered organic semiconductors, while an increase of material disorder weakens the relaxation. The proposed model can explain the stretched-exponential law well by adopting appropriate parameters. The calculations show good agreement with experimental data for organic semiconductors.
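    For reference, the stretched-exponential law referred to above is commonly written in the Kohlrausch-Williams-Watts form below, with relaxing quantity phi, characteristic relaxation time tau and stretching exponent beta; how beta and tau depend on disorder, field and temperature follows the paper's hopping model and is not reproduced here.

```latex
% Stretched-exponential (KWW) relaxation law
\varphi(t) = \varphi_0 \, \exp\!\left[ -\left( \frac{t}{\tau} \right)^{\beta} \right],
\qquad 0 < \beta \le 1 .
```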

  18. The influence of noise on nonlinear time series detection based on Volterra-Wiener-Korenberg model

    Energy Technology Data Exchange (ETDEWEB)

    Lei Min [State Key Laboratory of Vibration, Shock and Noise, Shanghai Jiao Tong University, Shanghai 200030 (China)], E-mail: leimin@sjtu.edu.cn; Meng Guang [State Key Laboratory of Vibration, Shock and Noise, Shanghai Jiao Tong University, Shanghai 200030 (China)

    2008-04-15

    This paper studies the influence of noise on the Volterra-Wiener-Korenberg (VWK) nonlinear test model. Our numerical results reveal that different types of noise lead to different behaviour of VWK model detection. For dynamic noise, it is difficult to distinguish chaos from nonchaotic but nonlinear determinism. For time series, measurement noise has no impact on the detection of chaotic determinism. This paper also discusses the various behaviours of VWK model detection with surrogate data for different noises.

  19. Drosophila melanogaster as a model organism to study nanotoxicity.

    Science.gov (United States)

    Ong, Cynthia; Yung, Lin-Yue Lanry; Cai, Yu; Bay, Boon-Huat; Baeg, Gyeong-Hun

    2015-05-01

    Drosophila melanogaster has been used as an in vivo model organism for the study of genetics and development for over 100 years. Recently, the fruit fly Drosophila was also developed as an in vivo model organism for toxicology studies, in particular in the field of nanotoxicity. The incorporation of nanomaterials into consumer and biomedical products is a cause for concern, as nanomaterials are often associated with toxicity in many in vitro studies. In vivo animal studies of the toxicity of nanomaterials with rodents and other mammals are, however, limited due to high operational cost and ethical objections. Hence, Drosophila, a genetically tractable organism with distinct developmental stages and a short life cycle, serves as an ideal organism to study nanomaterial-mediated toxicity. This review discusses the basic biology of Drosophila, the toxicity of nanomaterials, as well as how the Drosophila model can be used to study the toxicity of various types of nanomaterials.

  20. Studies in astronomical time series analysis. I - Modeling random processes in the time domain

    Science.gov (United States)

    Scargle, J. D.

    1981-01-01

    Several random process models in the time domain are defined and discussed. Attention is given to the moving average model, the autoregressive model, and relationships between and combinations of these models. Consideration is then given to methods for investigating pulse structure, procedures of model construction, computational methods, and numerical experiments. A FORTRAN algorithm of time series analysis has been developed which is relatively stable numerically. Results of test cases are given to study the effect of adding noise and of different distributions for the pulse amplitudes. A preliminary analysis of the light curve of the quasar 3C 272 is considered as an example.
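    The two basic time-domain processes discussed above can be simulated directly from their defining recursions, as in the sketch below. This is a generic illustration of moving-average and autoregressive models, not the paper's FORTRAN algorithm; the coefficients are arbitrary.

```python
import numpy as np

def simulate_ma(theta, n, rng):
    """MA(q): x_t = e_t + theta_1 e_{t-1} + ... + theta_q e_{t-q}."""
    q = len(theta)
    e = rng.standard_normal(n + q)
    return np.array([e[t + q] + np.dot(theta, e[t:t + q][::-1]) for t in range(n)])

def simulate_ar(phi, n, rng):
    """AR(p): x_t = phi_1 x_{t-1} + ... + phi_p x_{t-p} + e_t."""
    p = len(phi)
    x = np.zeros(n + p)
    for t in range(p, n + p):
        x[t] = np.dot(phi, x[t - p:t][::-1]) + rng.standard_normal()
    return x[p:]

rng = np.random.default_rng(7)
ma = simulate_ma([0.8, 0.4], 1000, rng)       # short memory, correlation cuts off after lag q
ar = simulate_ar([0.9], 1000, rng)            # exponentially decaying autocorrelation
print(np.corrcoef(ar[:-1], ar[1:])[0, 1])     # lag-1 autocorrelation, roughly 0.9
```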

  1. Wavelet transform approach for fitting financial time series data

    Science.gov (United States)

    Ahmed, Amel Abdoullah; Ismail, Mohd Tahir

    2015-10-01

    This study investigates a newly developed technique; a combined wavelet filtering and VEC model, to study the dynamic relationship among financial time series. Wavelet filter has been used to annihilate noise data in daily data set of NASDAQ stock market of US, and three stock markets of Middle East and North Africa (MENA) region, namely, Egypt, Jordan, and Istanbul. The data covered is from 6/29/2001 to 5/5/2009. After that, the returns of generated series by wavelet filter and original series are analyzed by cointegration test and VEC model. The results show that the cointegration test affirms the existence of cointegration between the studied series, and there is a long-term relationship between the US, stock markets and MENA stock markets. A comparison between the proposed model and traditional model demonstrates that, the proposed model (DWT with VEC model) outperforms traditional model (VEC model) to fit the financial stock markets series well, and shows real information about these relationships among the stock markets.

  2. From Networks to Time Series

    Science.gov (United States)

    Shimada, Yutaka; Ikeguchi, Tohru; Shigehara, Takaomi

    2012-10-01

    In this Letter, we propose a framework to transform a complex network to a time series. The transformation from complex networks to time series is realized by the classical multidimensional scaling. Applying the transformation method to a model proposed by Watts and Strogatz [Nature (London) 393, 440 (1998)], we show that ring lattices are transformed to periodic time series, small-world networks to noisy periodic time series, and random networks to random time series. We also show that these relationships are analytically held by using the circulant-matrix theory and the perturbation theory of linear operators. The results are generalized to several high-dimensional lattices.
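    The transformation pipeline can be sketched as: build a network, compute shortest-path distances between nodes, apply classical multidimensional scaling, and read one embedding coordinate, ordered by node index, as the resulting series. The ring-lattice construction and the direct Torgerson MDS implementation below are assumptions of this sketch, not the authors' code.

```python
import numpy as np
from scipy.sparse.csgraph import shortest_path

def ring_lattice_adjacency(n, k):
    """Ring lattice: each node connected to its k nearest neighbours on each side."""
    A = np.zeros((n, n))
    for i in range(n):
        for j in range(1, k + 1):
            A[i, (i + j) % n] = A[(i + j) % n, i] = 1
    return A

def classical_mds(D, dim=2):
    """Classical (Torgerson) multidimensional scaling from a distance matrix D."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (D ** 2) @ J                         # double-centred Gram matrix
    vals, vecs = np.linalg.eigh(B)
    order = np.argsort(vals)[::-1][:dim]
    return vecs[:, order] * np.sqrt(np.maximum(vals[order], 0))

A = ring_lattice_adjacency(100, 2)
D = shortest_path(A, method="D", unweighted=True)       # graph distances
coords = classical_mds(D, dim=2)
series = coords[:, 0]                                   # node index plays the role of "time"
print(series[:10])                                      # roughly sinusoidal for a ring lattice
```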

  3. Correlations in magnitude series to assess nonlinearities: Application to multifractal models and heartbeat fluctuations

    Science.gov (United States)

    Bernaola-Galván, Pedro A.; Gómez-Extremera, Manuel; Romance, A. Ramón; Carpena, Pedro

    2017-09-01

    The correlation properties of the magnitude of a time series are associated with nonlinear and multifractal properties and have been applied in a great variety of fields. Here we have obtained the analytical expression of the autocorrelation of the magnitude series (C_|x|) of a linear Gaussian noise as a function of its autocorrelation (C_x). For both models and natural signals, the deviation of C_|x| from its expectation in linear Gaussian noises can be used as an index of nonlinearity that can be applied to relatively short records and does not require the presence of scaling in the time series under study. In a model of an artificial Gaussian multifractal signal we use this approach to analyze the relation between nonlinearity and multifractality and show that the former implies the latter but the reverse is not true. We also apply this approach to analyze experimental data: heartbeat records during rest and moderate exercise. For each individual subject, we observe higher nonlinearities during rest. This behavior is also observed on average for the analyzed set of 10 semiprofessional soccer players. This result agrees with the fact that other measures of complexity are dramatically reduced during exercise and can shed light on its relationship with the withdrawal of parasympathetic tone and/or the activation of sympathetic activity during physical activity.

  4. What marketing scholars should know about time series analysis : time series applications in marketing

    NARCIS (Netherlands)

    Horváth, Csilla; Kornelis, Marcel; Leeflang, Peter S.H.

    2002-01-01

    In this review, we give a comprehensive summary of time series techniques in marketing, and discuss a variety of time series analysis (TSA) techniques and models. We classify them in the sets (i) univariate TSA, (ii) multivariate TSA, and (iii) multiple TSA. We provide relevant marketing

  5. Forecasting Enrollments with Fuzzy Time Series.

    Science.gov (United States)

    Song, Qiang; Chissom, Brad S.

    The concept of fuzzy time series is introduced and used to forecast the enrollment of a university. Fuzzy time series, an aspect of fuzzy set theory, forecasts enrollment using a first-order time-invariant model. To evaluate the model, the conventional linear regression technique is applied and the predicted values obtained are compared to the…

  6. Modeling error and stability of endothelial cytoskeletal membrane parameters based on modeling transendothelial impedance as resistor and capacitor in series.

    Science.gov (United States)

    Bodmer, James E; English, Anthony; Brady, Megan; Blackwell, Ken; Haxhinasto, Kari; Fotedar, Sunaina; Borgman, Kurt; Bai, Er-Wei; Moy, Alan B

    2005-09-01

    Transendothelial impedance across an endothelial monolayer grown on a microelectrode has previously been modeled as a repeating pattern of disks in which the electrical circuit consists of a resistor and capacitor in series. Although this numerical model breaks down barrier function into measurements of cell-cell adhesion, cell-matrix adhesion, and membrane capacitance, such solution parameters can be inaccurate without understanding model stability and error. In this study, we have evaluated modeling stability and error by using a chi(2) evaluation and Levenberg-Marquardt nonlinear least-squares (LM-NLS) method of the real and/or imaginary data in which the experimental measurement is compared with the calculated measurement derived by the model. Modeling stability and error were dependent on current frequency and the type of experimental data modeled. Solution parameters of cell-matrix adhesion were most susceptible to modeling instability. Furthermore, the LM-NLS method displayed frequency-dependent instability of the solution parameters, regardless of whether the real or imaginary data were analyzed. However, the LM-NLS method identified stable and reproducible solution parameters between all types of experimental data when a defined frequency spectrum of the entire data set was selected on the basis of a criterion of minimizing error. The frequency bandwidth that produced stable solution parameters varied greatly among different data types. Thus a numerical model based on characterizing transendothelial impedance as a resistor and capacitor in series and as a repeating pattern of disks is not sufficient to characterize the entire frequency spectrum of experimental transendothelial impedance.
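    A minimal sketch of fitting the resistor-and-capacitor-in-series impedance description to a (synthetic) frequency sweep with a Levenberg-Marquardt nonlinear least-squares solver. The parameter values, noise level and frequency range are illustrative assumptions; the published model's full repeating-disk parameterization (cell-cell adhesion, cell-matrix adhesion, membrane capacitance) is not reproduced.

```python
import numpy as np
from scipy.optimize import least_squares

def rc_series_impedance(f, R, C):
    """Complex impedance of a resistor and capacitor in series: Z = R + 1/(j*2*pi*f*C)."""
    return R + 1.0 / (1j * 2.0 * np.pi * f * C)

def residuals(params, f, z_meas):
    R, C_nF = params                                   # fit C in nF to keep parameters well scaled
    z = rc_series_impedance(f, R, C_nF * 1e-9)
    return np.concatenate([(z - z_meas).real, (z - z_meas).imag])

# synthetic "measurement": R = 2 kOhm, C = 5 nF, plus additive noise
rng = np.random.default_rng(8)
f = np.logspace(2, 5, 40)                              # 100 Hz - 100 kHz frequency sweep
z_true = rc_series_impedance(f, 2000.0, 5e-9)
z_meas = z_true + rng.normal(0, 20, f.size) + 1j * rng.normal(0, 20, f.size)

fit = least_squares(residuals, x0=[1000.0, 1.0], args=(f, z_meas), method="lm")
print("estimated R (Ohm):", fit.x[0], " C (nF):", fit.x[1])
```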

  7. Estimating soil hydraulic properties from soil moisture time series by inversion of a dual-permeability model

    Science.gov (United States)

    Dalla Valle, Nicolas; Wutzler, Thomas; Meyer, Stefanie; Potthast, Karin; Michalzik, Beate

    2017-04-01

    Dual-permeability type models are widely used to simulate water fluxes and solute transport in structured soils. These models contain two spatially overlapping flow domains with different parameterizations or even entirely different conceptual descriptions of flow processes. They are usually able to capture preferential flow phenomena, but a large set of parameters is needed, which are very laborious to obtain or cannot be measured at all. Therefore, model inversions are often used to derive the necessary parameters. Although these require sufficient input data themselves, they can use measurements of state variables instead, which are often easier to obtain and can be monitored by automated measurement systems. In this work we show a method to estimate soil hydraulic parameters from high frequency soil moisture time series data gathered at two different measurement depths by inversion of a simple one dimensional dual-permeability model. The model uses an advection equation based on the kinematic wave theory to describe the flow in the fracture domain and a Richards equation for the flow in the matrix domain. The soil moisture time series data were measured in mesocosms during sprinkling experiments. The inversion consists of three consecutive steps: First, the parameters of the water retention function were assessed using vertical soil moisture profiles in hydraulic equilibrium. This was done using two different exponential retention functions and the Campbell function. Second, the soil sorptivity and diffusivity functions were estimated from Boltzmann-transformed soil moisture data, which allowed the calculation of the hydraulic conductivity function. Third, the parameters governing flow in the fracture domain were determined using the whole soil moisture time series. The resulting retention functions were within the range of values predicted by pedotransfer functions apart from very dry conditions, where all retention functions predicted lower matrix potentials

  8. Lotka-Volterra competition models for sessile organisms.

    Science.gov (United States)

    Spencer, Matthew; Tanner, Jason E

    2008-04-01

    Markov models are widely used to describe the dynamics of communities of sessile organisms, because they are easily fitted to field data and provide a rich set of analytical tools. In typical ecological applications, at any point in time, each point in space is in one of a finite set of states (e.g., species, empty space). The models aim to describe the probabilities of transitions between states. In most Markov models for communities, these transition probabilities are assumed to be independent of state abundances. This assumption is often suspected to be false and is rarely justified explicitly. Here, we start with simple assumptions about the interactions among sessile organisms and derive a model in which transition probabilities depend on the abundance of destination states. This model is formulated in continuous time and is equivalent to a Lotka-Volterra competition model. We fit this model and a variety of alternatives in which transition probabilities do not depend on state abundances to a long-term coral reef data set. The Lotka-Volterra model describes the data much better than all models we consider other than a saturated model (a model with a separate parameter for each transition at each time interval, which by definition fits the data perfectly). Our approach provides a basis for further development of stochastic models of sessile communities, and many of the methods we use are relevant to other types of community. We discuss possible extensions to spatially explicit models.
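    A minimal sketch of a two-species Lotka-Volterra competition system of the kind referred to above, integrated with scipy. The growth rates, competition coefficients and initial cover fractions are arbitrary illustrations, not values fitted to the coral reef data.

```python
import numpy as np
from scipy.integrate import solve_ivp

def lotka_volterra_competition(t, x, r, alpha):
    """dx_i/dt = r_i * x_i * (1 - sum_j alpha_ij * x_j) for cover fractions x_i."""
    return r * x * (1.0 - alpha @ x)

r = np.array([0.5, 0.3])                       # intrinsic growth rates (per year)
alpha = np.array([[1.0, 0.8],                  # intra- and inter-specific competition
                  [1.2, 1.0]])
x0 = np.array([0.05, 0.10])                    # initial cover fractions

sol = solve_ivp(lotka_volterra_competition, (0.0, 50.0), x0, args=(r, alpha),
                dense_output=True)
print("cover after 50 years:", sol.y[:, -1])
```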

  9. Modeling of the Channel Thickness Influence on Electrical Characteristics and Series Resistance in Gate-Recessed Nanoscale SOI MOSFETs

    Directory of Open Access Journals (Sweden)

    A. Karsenty

    2013-01-01

    Full Text Available Ultrathin body (UTB) and nanoscale body (NSB) SOI-MOSFET devices, sharing a similar W/L but with channel thicknesses of 46 nm and below 5 nm, respectively, were fabricated using a selective “gate-recessed” process on the same silicon wafer. Their current-voltage characteristics measured at room temperature were found to be surprisingly different by several orders of magnitude. We analyzed this result by considering both severe mobility degradation and the influence of a huge series resistance, and found the latter more coherent. The electrical characteristics of the NSB can then be analytically derived by integrating a gate-voltage-dependent drain-source series resistance. In this paper, the influence of the channel thickness on the series resistance is reported for the first time. This influence is integrated into the analytical model in order to describe the trends of the saturation current with the channel thickness. This modeling approach may be useful to interpret anomalous electrical behavior of other nanodevices in which series resistance and/or mobility degradation is of great concern.

  10. Modelling the behaviour of uranium-series radionuclides in soils and plants taking into account seasonal variations in soil hydrology

    International Nuclear Information System (INIS)

    Pérez-Sánchez, D.; Thorne, M.C.

    2014-01-01

    In a previous paper, a mathematical model for the behaviour of Se-79 in soils and plants was described. Subsequently, a review has been published relating to the behaviour of U-238 series radionuclides in soils and plants. Here, we bring together those two strands of work to describe a new mathematical model of the behaviour of U-238 series radionuclides entering soils in solution and their uptake by plants. Initial studies with the model that are reported here demonstrate that it is a powerful tool for exploring the behaviour of this decay chain or subcomponents of it in soil-plant systems under different hydrological regimes. In particular, it permits studies of the degree to which secular equilibrium assumptions are appropriate when modelling this decay chain. Further studies will be undertaken and reported separately examining sensitivities of model results to input parameter values and also applying the model to sites contaminated with U-238 series radionuclides. - Highlights: • Kinetic model of radionuclide transport in soils and uptake by plants. • Takes soil hydrology and redox conditions into account. • Applicable to the whole U-238 chain, including Rn-222, Pb-210 and Po-210. • Demonstrates intra-season and inter-season variability on timescales up to thousands of years
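
    As a toy illustration of the decay-chain bookkeeping such a model must perform (not the soil-plant transport model itself), the sketch below integrates the Pb-210 -> Bi-210 -> Po-210 tail of the U-238 chain with approximate half-lives; soil hydrology, sorption and plant-uptake terms are omitted.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Approximate half-lives converted to decay constants (1/yr)
lam_pb = np.log(2) / 22.3                 # Pb-210, ~22.3 yr
lam_bi = np.log(2) / (5.0 / 365.25)       # Bi-210, ~5 d
lam_po = np.log(2) / (138.4 / 365.25)     # Po-210, ~138 d

def chain(t, n):
    """Sequential decay Pb-210 -> Bi-210 -> Po-210 -> stable Pb-206 (atom numbers)."""
    n_pb, n_bi, n_po = n
    return [-lam_pb * n_pb,
            lam_pb * n_pb - lam_bi * n_bi,
            lam_bi * n_bi - lam_po * n_po]

sol = solve_ivp(chain, (0.0, 100.0), [1.0, 0.0, 0.0], method="LSODA",
                t_eval=np.linspace(0.0, 100.0, 101))

# Po-210/Pb-210 activity ratio: approaches 1 as the subchain reaches equilibrium
activity_ratio = (lam_po * sol.y[2]) / (lam_pb * sol.y[0])
print("Po-210/Pb-210 activity ratio after 5 yr:", round(float(activity_ratio[5]), 3))
```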

  11. Forecasting Cryptocurrencies Financial Time Series

    DEFF Research Database (Denmark)

    Catania, Leopoldo; Grassi, Stefano; Ravazzolo, Francesco

    2018-01-01

    This paper studies the predictability of cryptocurrencies time series. We compare several alternative univariate and multivariate models in point and density forecasting of four of the most capitalized series: Bitcoin, Litecoin, Ripple and Ethereum. We apply a set of crypto–predictors and rely...

  12. (Tropical) soil organic matter modelling: problems and prospects

    NARCIS (Netherlands)

    Keulen, van H.

    2001-01-01

    Soil organic matter plays an important role in many physical, chemical and biological processes. However, the quantitative relations between the mineral and organic components of the soil and the relations with the vegetation are poorly understood. In such situations, the use of models is an

  13. Framework for Design of Traceability System on Organic Rice Certification

    Science.gov (United States)

    Purwandoko, P. B.; Seminar, K. B.; Sutrisno; Sugiyanta

    2018-05-01

    Nowadays, the preference for organic products such as organic rice has increased, because people's awareness of healthy and eco-friendly food consumption has grown. Therefore, it is very important to ensure the organic quality of the product that will be produced. Certification is a series of processes carried out to ensure that the quality of products meets all criteria of organic standards. Currently, a traceability information system for organic rice certification is not available; the process is still conducted manually, which causes loss of information during storage. This paper aims at developing a traceability framework for the organic rice certification process. First, the main issues of the organic certification process are discussed. Second, the unified modeling language (UML) is used to model user requirements in order to develop a traceability system for all actors in the certification process. Furthermore, the information captured along the certification process is modelled, showing the information flow that has to be recorded for each actor. Finally, the challenges in implementing the system are discussed.

  14. SERI Wind Energy Program

    Energy Technology Data Exchange (ETDEWEB)

    Noun, R. J.

    1983-06-01

    The SERI Wind Energy Program manages the areas of innovative research, wind systems analysis, and environmental compatibility for the U.S. Department of Energy. Since 1978, SERI wind program staff have conducted in-house aerodynamic and engineering analyses of novel concepts for wind energy conversion and have managed over 20 subcontracts to determine technical feasibility; the most promising of these concepts is the passive blade cyclic pitch control project. In the area of systems analysis, the SERI program has analyzed the impact of intermittent generation on the reliability of electric utility systems using standard utility planning models. SERI has also conducted methodology assessments. Environmental issues related to television interference and acoustic noise from large wind turbines have been addressed. SERI has identified the causes, effects, and potential control of acoustic noise emissions from large wind turbines.

  15. 78 FR 75511 - Special Conditions: Bombardier Inc., Models BD-500-1A10 and BD-500-1A11 Series Airplanes...

    Science.gov (United States)

    2013-12-12

    ... Inc., Models BD-500-1A10 and BD- 500-1A11 Series Airplanes; Electronic Flight Control System: Control... Inc. Models BD-500-1A10 and BD-500-1A11 series airplanes. These airplanes will have a novel or unusual... comments, data, or views. The most helpful comments reference a specific portion of the special conditions...

  16. Segmentation of Nonstationary Time Series with Geometric Clustering

    DEFF Research Database (Denmark)

    Bocharov, Alexei; Thiesson, Bo

    2013-01-01

    We introduce a non-parametric method for segmentation in regime-switching time-series models. The approach is based on spectral clustering of target-regressor tuples and derives a switching regression tree, where regime switches are modeled by oblique splits. Such models can be learned efficiently...... from data, where clustering is used to propose one single split candidate at each split level. We use the class of ART time series models to serve as illustration, but because of the non-parametric nature of our segmentation approach, it readily generalizes to a wide range of time-series models that go...
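
    The flavour of clustering target-regressor tuples can be sketched as follows; this is plain spectral clustering on (y_t, y_{t-1}) pairs from a synthetic regime-switching series, not the full split-proposal procedure of the paper, and all settings are illustrative.

```python
import numpy as np
from sklearn.cluster import SpectralClustering

rng = np.random.default_rng(1)
n = 400
y = np.zeros(n)
for t in range(1, n):
    slope = 0.9 if t < n // 2 else -0.6            # regime switch halfway through the series
    y[t] = slope * y[t - 1] + rng.normal(0.0, 0.3)

X = np.column_stack([y[1:], y[:-1]])               # target-regressor tuples (y_t, y_{t-1})
labels = SpectralClustering(n_clusters=2, affinity="nearest_neighbors",
                            n_neighbors=10, random_state=0).fit_predict(X)
print("cluster sizes:", np.bincount(labels))
```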

  17. Vector Control Using Series Iron Loss Model of Induction, Motors and Power Loss Minimization

    OpenAIRE

    Kheldoun Aissa; Khodja Djalal Eddine

    2009-01-01

    The iron loss is a source of detuning in vector controlled induction motor drives if the classical rotor vector controller is used for decoupling. In fact, the field orientation will not be satisfied and the output torque will not track the reference torque mostly used by Loss Model Controllers (LMCs). In addition, this component of loss, among others, may be excessive if the vector controlled induction motor is driving light loads. In this paper, the series iron loss model ...

  18. A Model for Mental Health Programming in Schools and Communities: Introduction to the Mini-Series.

    Science.gov (United States)

    Nastasi, Bonnie K.

    1998-01-01

    Describes conceptual framework for mini-series on mental health programming. Model includes five components considered to be critical for comprehensive and effective programming. Components include: action research, ecological perspective of program development, collaborative/participatory approach, prevention to treatment via mental health…

  19. Effect of atmospheric organic complexation on iron-bearing dust solubility

    OpenAIRE

    Paris , R.; Desboeufs , K. V.

    2013-01-01

    Recent studies reported that organic complexation may be a potentially important process to be considered by models estimating the atmospheric iron flux to the ocean. In this study, we investigated the effect of this process through a series of dissolution experiments on iron-bearing dust in the presence or absence of various organic compounds (acetate, formate, oxalate, malonate, succinate, glutarate, glycolate, lactate, tartrate and humic acid as an analogue of hum...

  20. A novel series of isoreticular metal organic frameworks: Realizing metastable structures by liquid phase epitaxy

    KAUST Repository

    Liu, Jinxuan; Lukose, Binit; Shekhah, Osama; Arslan, Hasan Kemal; Weidler, Peter; Gliemann, Hartmut; Bräse, Stefan; Grosjean, Sylvain; Godt, Adelheid; Feng, Xinliang; Müllen, Klaus; Magdau, Ioan-Bogdan; Heine, Thomas; Wöll, Christof

    2012-01-01

    A novel class of metal organic frameworks (MOFs) has been synthesized from Cu-acetate and dicarboxylic acids using liquid phase epitaxy. The SURMOF-2 isoreticular series exhibits P4 symmetry; for the longest linker, a channel size of 3 × 3 nm² is obtained, one of the largest values reported for any MOF so far. High-quality, ab initio electronic structure calculations confirm the stability of a regular packing of (Cu²⁺)₂-carboxylate paddle-wheel planes with P4 symmetry and reveal that the SURMOF-2 structures are in fact metastable, with a fairly large activation barrier for the transition to the bulk MOF-2 structures exhibiting a lower, twofold (P2 or C2) symmetry. The theoretical calculations also allow the mechanism of the low-temperature epitaxial growth process to be identified and explain why a synthesis of this highly interesting new class of high-symmetry, metastable MOFs is not possible using the conventional solvothermal process.

  2. Physiologically Based Pharmacokinetic Modeling in Lead Optimization. 1. Evaluation and Adaptation of GastroPlus To Predict Bioavailability of Medchem Series.

    Science.gov (United States)

    Daga, Pankaj R; Bolger, Michael B; Haworth, Ian S; Clark, Robert D; Martin, Eric J

    2018-03-05

    When medicinal chemists need to improve bioavailability (%F) within a chemical series during lead optimization, they synthesize new series members with systematically modified properties mainly by following experience and general rules of thumb. More quantitative models that predict %F of proposed compounds from chemical structure alone have proven elusive. Global empirical %F quantitative structure-property (QSPR) models perform poorly, and projects have too little data to train local %F QSPR models. Mechanistic oral absorption and physiologically based pharmacokinetic (PBPK) models simulate the dissolution, absorption, systemic distribution, and clearance of a drug in preclinical species and humans. Attempts to build global PBPK models based purely on calculated inputs have not achieved the accuracy needed for lead optimization. In this work, local GastroPlus PBPK models are instead customized for individual medchem series. The key innovation was building a local QSPR for a numerically fitted effective intrinsic clearance (CLloc). All inputs are subsequently computed from structure alone, so the models can be applied in advance of synthesis. Training CLloc on the first 15-18 rat %F measurements gave adequate predictions, with clear improvements up to about 30 measurements, and incremental improvements beyond that.

  3. Bayesian model averaging method for evaluating associations between air pollution and respiratory mortality: a time-series study.

    Science.gov (United States)

    Fang, Xin; Li, Runkui; Kan, Haidong; Bottai, Matteo; Fang, Fang; Cao, Yang

    2016-08-16

    To demonstrate an application of Bayesian model averaging (BMA) with generalised additive mixed models (GAMM) and provide a novel modelling technique to assess the association between inhalable coarse particles (PM10) and respiratory mortality in time-series studies. A time-series study using a regional death registry between 2009 and 2010. 8 districts in a large metropolitan area in Northern China. 9559 permanent residents of the 8 districts who died of respiratory diseases between 2009 and 2010. Per cent increase in daily respiratory mortality rate (MR) per interquartile range (IQR) increase of PM10 concentration and corresponding 95% confidence interval (CI) in single-pollutant and multipollutant (including NOx, CO) models. The Bayesian model averaged GAMM (GAMM+BMA) and the optimal GAMM of PM10, multipollutants and principal components (PCs) of multipollutants showed comparable results for the effect of PM10 on daily respiratory MR, that is, one IQR increase in PM10 concentration corresponded to 1.38% vs 1.39%, 1.81% vs 1.83% and 0.87% vs 0.88% increases, respectively, in daily respiratory MR. However, GAMM+BMA gave slightly but noticeably wider CIs for the single-pollutant model (-1.09 to 4.28 vs -1.08 to 3.93) and the PCs-based model (-2.23 to 4.07 vs -2.03 to 3.88). The CIs of the multiple-pollutant model from the two methods are similar, that is, -1.12 to 4.85 versus -1.11 to 4.83. The BMA method may represent a useful tool for modelling uncertainty in time-series studies when evaluating the effect of air pollution on fatal health outcomes.

  4. Fractality of profit landscapes and validation of time series models for stock prices

    Science.gov (United States)

    Yi, Il Gu; Oh, Gabjin; Kim, Beom Jun

    2013-08-01

    We apply a simple trading strategy for various time series of real and artificial stock prices to understand the origin of fractality observed in the resulting profit landscapes. The strategy contains only two parameters p and q, and the sell (buy) decision is made when the log return is larger (smaller) than p (-q). We discretize the unit square (p,q) ∈ [0,1] × [0,1] into the N × N square grid and the profit Π(p,q) is calculated at the center of each cell. We confirm the previous finding that local maxima in profit landscapes are scattered in a fractal-like fashion: the number M of local maxima follows the power-law form M ~ N^a, but the scaling exponent a is found to differ for different time series. From comparisons of real and artificial stock prices, we find that the fat-tailed return distribution is closely related to the exponent a ≈ 1.6 observed for real stock markets. We suggest that the fractality of profit landscape characterized by a ≈ 1.6 can be a useful measure to validate time series models for stock prices.
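
    A compressed sketch of the procedure on an artificial Gaussian random walk is given below. The (p, q) grid is restricted to a range where the toy strategy actually trades for a 1% per-step volatility, and the local-maxima count only illustrates the bookkeeping; it is not meant to reproduce the reported exponents.

```python
import numpy as np

rng = np.random.default_rng(42)
log_price = np.cumsum(rng.normal(0.0, 0.01, 3000))      # artificial log-price: Gaussian random walk

def profit(p, q, log_price):
    """Long-only toy strategy: buy when the log return < -q, sell when it is > p."""
    ret = np.diff(log_price)
    position, entry, pnl = 0, 0.0, 0.0
    for r, lp in zip(ret, log_price[1:]):
        if position == 0 and r < -q:
            position, entry = 1, lp
        elif position == 1 and r > p:
            pnl += lp - entry
            position = 0
    return pnl

N = 30
grid = (np.arange(N) + 0.5) / N * 0.05                  # p, q values in (0, 0.05)
Pi = np.array([[profit(p, q, log_price) for q in grid] for p in grid])

# Count strict local maxima on the interior of the profit landscape
core = Pi[1:-1, 1:-1]
is_max = np.ones_like(core, dtype=bool)
for dx in (-1, 0, 1):
    for dy in (-1, 0, 1):
        if dx or dy:
            is_max &= core > Pi[1 + dx:N - 1 + dx, 1 + dy:N - 1 + dy]
print("local maxima:", int(is_max.sum()))
```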

  5. Multidimensional k-nearest neighbor model based on EEMD for financial time series forecasting

    Science.gov (United States)

    Zhang, Ningning; Lin, Aijing; Shang, Pengjian

    2017-07-01

    In this paper, we propose a new two-stage methodology that combines the ensemble empirical mode decomposition (EEMD) with multidimensional k-nearest neighbor model (MKNN) in order to forecast the closing price and high price of the stocks simultaneously. The modified algorithm of k-nearest neighbors (KNN) has an increasingly wide application in the prediction of all fields. Empirical mode decomposition (EMD) decomposes a nonlinear and non-stationary signal into a series of intrinsic mode functions (IMFs), however, it cannot reveal characteristic information of the signal with much accuracy as a result of mode mixing. So ensemble empirical mode decomposition (EEMD), an improved method of EMD, is presented to resolve the weaknesses of EMD by adding white noise to the original data. With EEMD, the components with true physical meaning can be extracted from the time series. Utilizing the advantage of EEMD and MKNN, the new proposed ensemble empirical mode decomposition combined with multidimensional k-nearest neighbor model (EEMD-MKNN) has high predictive precision for short-term forecasting. Moreover, we extend this methodology to the case of two-dimensions to forecast the closing price and high price of the four stocks (NAS, S&P500, DJI and STI stock indices) at the same time. The results indicate that the proposed EEMD-MKNN model has a higher forecast precision than EMD-KNN, KNN method and ARIMA.
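
    The k-nearest-neighbour step can be sketched with plain NumPy as below; the EEMD decomposition is assumed to have been done beforehand with a separate EMD library, so the snippet simply forecasts one series, here a synthetic stand-in for an IMF or closing-price series.

```python
import numpy as np

def knn_forecast(series, m=5, k=10):
    """One-step forecast from the k nearest delay vectors of length m (inverse-distance weighted)."""
    x = np.asarray(series, dtype=float)
    lib = np.array([x[i:i + m] for i in range(len(x) - m)])   # historical delay vectors
    nxt = x[m:]                                               # value that followed each vector
    query = x[-m:]                                            # most recent pattern
    d = np.linalg.norm(lib - query, axis=1)
    idx = np.argsort(d)[:k]
    w = 1.0 / (d[idx] + 1e-12)
    return float(np.sum(w * nxt[idx]) / np.sum(w))

rng = np.random.default_rng(0)
t = np.arange(600)
series = np.sin(2 * np.pi * t / 50) + 0.1 * rng.normal(size=t.size)
print("forecast:", round(knn_forecast(series[:-1]), 3), "actual:", round(float(series[-1]), 3))
```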

  6. Development of a statistical shape model of multi-organ and its performance evaluation

    International Nuclear Information System (INIS)

    Nakada, Misaki; Shimizu, Akinobu; Kobatake, Hidefumi; Nawano, Shigeru

    2010-01-01

    Existing statistical shape modeling methods for an organ cannot take into account the correlation between neighboring organs. This study focuses on a level set distribution model and proposes two modeling methods for multiple organs that can take into account the correlation between neighboring organs. The first method combines the level set functions of multiple organs into a vector. Subsequently, it analyses the distribution of the vectors of a training dataset by principal component analysis and builds a statistical shape model of multiple organs. The second method constructs a statistical shape model for each organ independently and assembles the component scores of the different organs in a training dataset to generate a vector. It then analyses the distribution of these vectors to build a statistical shape model of multiple organs. This paper shows the results of applying the proposed methods, trained on 15 abdominal CT volumes, to 8 unknown CT volumes. (author)
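
    The first modelling route (stacking level-set functions and running a principal component analysis) can be illustrated roughly as follows, with toy signed-distance maps of circles standing in for organ level sets; with several organs one would concatenate their level-set vectors per training case before the PCA. All sizes and parameters are illustrative.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
grid = np.linspace(-1.0, 1.0, 32)
xx, yy = np.meshgrid(grid, grid)

def signed_distance(cx, cy, r):
    """Signed distance map of a circle (negative inside, positive outside)."""
    return np.sqrt((xx - cx) ** 2 + (yy - cy) ** 2) - r

# 15 toy "training shapes" with slightly varying centre and radius
train = np.array([signed_distance(rng.normal(0, 0.05), rng.normal(0, 0.05),
                                  0.5 + rng.normal(0, 0.05)).ravel() for _ in range(15)])

pca = PCA(n_components=5).fit(train)
print("explained variance ratios:", np.round(pca.explained_variance_ratio_, 3))

# A shape is encoded by a few component scores and reconstructed from them
scores = pca.transform(train[:1])
recon = pca.inverse_transform(scores)
print("max reconstruction error:", float(np.abs(recon - train[0]).max()))
```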

  7. Learning from environmental data: Methods for analysis of forest nutrition time series

    Energy Technology Data Exchange (ETDEWEB)

    Sulkava, M. (Helsinki Univ. of Technology, Espoo (Finland). Computer and Information Science)

    2008-07-01

    Data analysis methods play an important role in increasing our knowledge of the environment as the amount of data measured from the environment increases. This thesis fits under the scope of environmental informatics and environmental statistics. They are fields, in which data analysis methods are developed and applied for the analysis of environmental data. The environmental data studied in this thesis are time series of nutrient concentration measurements of pine and spruce needles. In addition, there are data of laboratory quality and related environmental factors, such as the weather and atmospheric depositions. The most important methods used for the analysis of the data are based on the self-organizing map and linear regression models. First, a new clustering algorithm of the self-organizing map is proposed. It is found to provide better results than two other methods for clustering of the self-organizing map. The algorithm is used to divide the nutrient concentration data into clusters, and the result is evaluated by environmental scientists. Based on the clustering, the temporal development of the forest nutrition is modeled and the effect of nitrogen and sulfur deposition on the foliar mineral composition is assessed. Second, regression models are used for studying how much environmental factors and properties of the needles affect the changes in the nutrient concentrations of the needles between their first and second year of existence. The aim is to build understandable models with good prediction capabilities. Sparse regression models are found to outperform more traditional regression models in this task. Third, fusion of laboratory quality data from different sources is performed to estimate the precisions of the analytical methods. Weighted regression models are used to quantify how much the precision of observations can affect the time needed to detect a trend in environmental time series. The results of power analysis show that improving the

  8. The conceptual model of organization social responsibility

    OpenAIRE

    LUO, Lan; WEI, Jingfu

    2014-01-01

    With the development of research on CSR, people have become increasingly aware that corporations should take responsibility. Should other organizations besides corporations not also take responsibilities beyond their field? This paper puts forward the concept of organization social responsibility (OSR) on the basis of the concept of corporate social responsibility and other theories. Conceptual models are then built based on this concept, introducing the OSR from three angles: the types of organi...

  9. Resilient organizations: matrix model and service line management.

    Science.gov (United States)

    Westphal, Judith A

    2005-09-01

    Resilient organizations modify structures to meet the demands of the marketplace. The author describes a structure that enables multihospital organizations to innovate and rapidly adapt to changes. Service line management within a matrix model is an evolving organizational structure for complex systems in which nurses are pivotal members.

  10. Intercomparison of Satellite Derived Gravity Time Series with Inferred Gravity Time Series from TOPEX/POSEIDON Sea Surface Heights and Climatological Model Output

    Science.gov (United States)

    Cox, C.; Au, A.; Klosko, S.; Chao, B.; Smith, David E. (Technical Monitor)

    2001-01-01

    The upcoming GRACE mission promises to open a window on details of the global mass budget that will have remarkable clarity, but it will not directly answer the question of what the state of the Earth's mass budget is over the critical last quarter of the 20th century. To address that problem we must draw upon existing technologies such as SLR, DORIS, and GPS, and climate modeling runs in order to improve our understanding. Analysis of long-period geopotential changes based on SLR and DORIS tracking has shown that addition of post 1996 satellite tracking data has a significant impact on the recovered zonal rates and long-period tides. Interannual effects such as those causing the post 1996 anomalies must be better characterized before refined estimates of the decadal period changes in the geopotential can be derived from the historical database of satellite tracking. A possible cause of this anomaly is variations in ocean mass distribution, perhaps associated with the recent large El Nino/La Nina. In this study, a low-degree spherical harmonic gravity time series derived from satellite tracking is compared with a TOPEX/POSEIDON-derived sea surface height time series. Corrections for atmospheric mass effects, continental hydrology, snowfall accumulation, and ocean steric model predictions will be considered.

  11. A perturbative approach for enhancing the performance of time series forecasting.

    Science.gov (United States)

    de Mattos Neto, Paulo S G; Ferreira, Tiago A E; Lima, Aranildo R; Vasconcelos, Germano C; Cavalcanti, George D C

    2017-04-01

    This paper proposes a method to perform time series prediction based on perturbation theory. The approach is based on continuously adjusting an initial forecasting model to asymptotically approximate a desired time series model. First, a predictive model generates an initial forecasting for a time series. Second, a residual time series is calculated as the difference between the original time series and the initial forecasting. If that residual series is not white noise, then it can be used to improve the accuracy of the initial model and a new predictive model is adjusted using residual series. The whole process is repeated until convergence or the residual series becomes white noise. The output of the method is then given by summing up the outputs of all trained predictive models in a perturbative sense. To test the method, an experimental investigation was conducted on six real world time series. A comparison was made with six other methods experimented and ten other results found in the literature. Results show that not only the performance of the initial model is significantly improved but also the proposed method outperforms the other results previously published. Copyright © 2017 Elsevier Ltd. All rights reserved.
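
    A toy realisation of the idea with AR(1) base predictors is sketched below; the paper pairs the scheme with other forecasting models, the whiteness test here is only a crude autocorrelation check, and the data are synthetic.

```python
import numpy as np

def ar1_fit(y):
    """Least-squares AR(1) fit; returns (in-sample one-step predictions, next-value forecast)."""
    X = np.column_stack([y[:-1], np.ones(len(y) - 1)])
    a, b = np.linalg.lstsq(X, y[1:], rcond=None)[0]
    fitted = np.concatenate([[y[0]], a * y[:-1] + b])      # aligned with y; first value kept as-is
    return fitted, a * y[-1] + b

def looks_white(r, n_lags=10, z=2.0):
    """Crude whiteness check: all autocorrelations up to n_lags inside +/- z/sqrt(N)."""
    r = r - r.mean()
    acf = np.array([np.dot(r[:-k], r[k:]) for k in range(1, n_lags + 1)]) / np.dot(r, r)
    return bool(np.all(np.abs(acf) < z / np.sqrt(len(r))))

def perturbative_forecast(y, max_stages=5):
    """Sum one-step forecasts of models fitted to the series and, successively, to its residuals."""
    target, forecast = np.asarray(y, dtype=float), 0.0
    for _ in range(max_stages):
        fitted, step = ar1_fit(target)
        forecast += step
        target = target - fitted                            # residual becomes the next target
        if looks_white(target):
            break
    return forecast

rng = np.random.default_rng(3)
t = np.arange(500)
y = 0.02 * t + np.sin(2 * np.pi * t / 30) + rng.normal(0.0, 0.2, t.size)
print("forecast:", round(perturbative_forecast(y[:-1]), 3), "actual:", round(float(y[-1]), 3))
```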

  12. Scalability of Sustainable Business Models in Hybrid Organizations

    Directory of Open Access Journals (Sweden)

    Adam Jabłoński

    2016-02-01

    The dynamics of change in modern business create new mechanisms for company management to determine their pursuit and the achievement of their high performance. This performance maintained over a long period of time becomes a source of ensuring business continuity by companies. An ontological being enabling the adoption of such assumptions is such a business model that has the ability to generate results in every possible market situation and, moreover, it has the features of permanent adaptability. A feature that describes the adaptability of the business model is its scalability. Being a factor ensuring more work and more efficient work with an increasing number of components, scalability can be applied to the concept of business models as the company’s ability to maintain similar or higher performance through it. Ensuring the company’s performance in the long term helps to build the so-called sustainable business model that often balances the objectives of stakeholders and shareholders, and that is created by the implemented principles of value-based management and corporate social responsibility. This perception of business paves the way for building hybrid organizations that integrate business activities with pro-social ones. The combination of an approach typical of hybrid organizations in designing and implementing sustainable business models pursuant to the scalability criterion seems interesting from the cognitive point of view. Today, hybrid organizations are great spaces for building effective and efficient mechanisms for dialogue between business and society. This requires the appropriate business model. The purpose of the paper is to present the conceptualization and operationalization of scalability of sustainable business models that determine the performance of a hybrid organization in the network environment. The paper presents the original concept of applying scalability in sustainable business models with detailed

  13. Modeling the Explicit Chemistry of Anthropogenic and Biogenic Organic Aerosols

    Energy Technology Data Exchange (ETDEWEB)

    Madronich, Sasha [Univ. Corporation for Atmospheric Research, Boulder, CO (United States)

    2015-12-09

    The atmospheric burden of Secondary Organic Aerosols (SOA) remains one of the most important yet uncertain aspects of the radiative forcing of climate. This grant focused on improving our quantitative understanding of SOA formation and evolution, by developing, applying, and improving a highly detailed model of atmospheric organic chemistry, the Generation of Explicit Chemistry and Kinetics of Organics in the Atmosphere (GECKO-A) model. Eleven (11) publications have resulted from this grant.

  14. Triplet exciton diffusion in organic semiconductors

    Energy Technology Data Exchange (ETDEWEB)

    Koehler, Anna [Department of Physics, University of Bayreuth (Germany)

    2010-07-01

    Efficient triplet exciton emission has allowed improved operation of organic light-emitting diodes (LEDs). To enhance the device performance, it is necessary to understand what governs the motion of triplet excitons through the organic semiconductor. We use a series of poly(p-phenylene)-type conjugated polymers and oligomers of variable degree of molecular distortion (i.e. polaron formation) and energetic disorder as model systems to study the Dexter-type triplet exciton diffusion in thin films. We show that triplet diffusion can be quantitatively described in the framework of a Holstein small polaron model (Marcus theory) that is extended to include contributions from energetic disorder. The model predicts a tunnelling process at low temperatures followed by a thermally activated hopping process above a transition temperature. In contrast to charge transfer, the activation energy required for triplet exciton transfer can be deduced from the optical spectra. We discuss the implications for device architecture.

  15. A computational model of selection by consequences.

    OpenAIRE

    McDowell, J J

    2004-01-01

    Darwinian selection by consequences was instantiated in a computational model that consisted of a repertoire of behaviors undergoing selection, reproduction, and mutation over many generations. The model in effect created a digital organism that emitted behavior continuously. The behavior of this digital organism was studied in three series of computational experiments that arranged reinforcement according to random-interval (RI) schedules. The quantitative features of the model were varied o...

  16. An SVM model with hybrid kernels for hydrological time series

    Science.gov (United States)

    Wang, C.; Wang, H.; Zhao, X.; Xie, Q.

    2017-12-01

    Support Vector Machine (SVM) models have been widely applied to the forecast of climate/weather and its impact on other environmental variables such as hydrologic response to climate/weather. When using SVM, the choice of the kernel function plays the key role. Conventional SVM models mostly use one single type of kernel function, e.g., radial basis kernel function. Provided that there are several featured kernel functions available, each having its own advantages and drawbacks, a combination of these kernel functions may give more flexibility and robustness to SVM approach, making it suitable for a wide range of application scenarios. This paper presents such a linear combination of radial basis kernel and polynomial kernel for the forecast of monthly flowrate in two gaging stations using SVM approach. The results indicate significant improvement in the accuracy of predicted series compared to the approach with either individual kernel function, thus demonstrating the feasibility and advantages of such hybrid kernel approach for SVM applications.
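
    A hedged sketch of such a hybrid kernel with scikit-learn is shown below; the mixing weight, kernel parameters, lag structure, and synthetic monthly flow series are all illustrative assumptions rather than the paper's configuration.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.metrics.pairwise import rbf_kernel, polynomial_kernel

def hybrid_kernel(X, Y, w=0.7, gamma=0.5, degree=2):
    """Linear combination of an RBF and a polynomial kernel (w in [0, 1])."""
    return w * rbf_kernel(X, Y, gamma=gamma) + (1.0 - w) * polynomial_kernel(X, Y, degree=degree)

rng = np.random.default_rng(0)
t = np.arange(240, dtype=float)
flow = 50 + 20 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 3, t.size)   # synthetic monthly flowrate

# Lagged monthly values as predictors for the next month's flow
lags = 3
X = np.column_stack([flow[i:len(flow) - lags + i] for i in range(lags)])
y = flow[lags:]
X_train, X_test, y_train, y_test = X[:-24], X[-24:], y[:-24], y[-24:]

model = SVR(kernel=hybrid_kernel, C=10.0, epsilon=0.5).fit(X_train, y_train)
pred = model.predict(X_test)
print("RMSE:", float(np.sqrt(np.mean((pred - y_test) ** 2))))
```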

  17. Single event time series analysis in a binary karst catchment evaluated using a groundwater model (Lurbach system, Austria).

    Science.gov (United States)

    Mayaud, C; Wagner, T; Benischke, R; Birk, S

    2014-04-16

    The Lurbach karst system (Styria, Austria) is drained by two major springs and replenished by both autogenic recharge from the karst massif itself and a sinking stream that originates in low permeable schists (allogenic recharge). Detailed data from two events recorded during a tracer experiment in 2008 demonstrate that an overflow from one of the sub-catchments to the other is activated if the discharge of the main spring exceeds a certain threshold. Time series analysis (autocorrelation and cross-correlation) was applied to examine to what extent the various available methods support the identification of the transient inter-catchment flow observed in this binary karst system. As inter-catchment flow is found to be intermittent, the evaluation was focused on single events. In order to support the interpretation of the results from the time series analysis a simplified groundwater flow model was built using MODFLOW. The groundwater model is based on the current conceptual understanding of the karst system and represents a synthetic karst aquifer for which the same methods were applied. Using the wetting capability package of MODFLOW, the model simulated an overflow similar to what has been observed during the tracer experiment. Various intensities of allogenic recharge were employed to generate synthetic discharge data for the time series analysis. In addition, geometric and hydraulic properties of the karst system were varied in several model scenarios. This approach helps to identify effects of allogenic recharge and aquifer properties in the results from the time series analysis. Comparing the results from the time series analysis of the observed data with those of the synthetic data a good agreement was found. For instance, the cross-correlograms show similar patterns with respect to time lags and maximum cross-correlation coefficients if appropriate hydraulic parameters are assigned to the groundwater model. The comparable behaviors of the real and the
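
    The single-event cross-correlation idea can be sketched as follows on a synthetic recharge/discharge pair; the lag structure and noise levels are invented for illustration and do not represent the Lurbach data or the MODFLOW model.

```python
import numpy as np

def cross_correlogram(x, y, max_lag=30):
    """Normalised cross-correlation r_xy(k) for k = 0..max_lag (positive k: y lags x)."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    n = len(x)
    return np.array([np.dot(x[:n - k], y[k:]) / n for k in range(max_lag + 1)])

rng = np.random.default_rng(7)
n = 500
recharge = np.clip(rng.normal(0.0, 1.0, n), 0.0, None)       # synthetic allogenic recharge pulses
kernel = np.concatenate([np.zeros(5), np.exp(-np.arange(40) / 8.0)])
discharge = np.convolve(recharge, kernel, mode="full")[:n]   # delayed, smoothed spring response
discharge += rng.normal(0.0, 0.05, n)

ccf = cross_correlogram(recharge, discharge)
print("lag of maximum:", int(np.argmax(ccf)), "max r:", round(float(ccf.max()), 2))
```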

  18. Development of a whole-organism model to screen new compounds for sun protection.

    Science.gov (United States)

    Wang, Yun-Hsin; Wen, Chi-Chung; Yang, Zhi-Shiang; Cheng, Chien-Chung; Tsai, Jen-Ning; Ku, Chia-Chen; Wu, Hsin-Ju; Chen, Yau-Hung

    2009-01-01

    We used zebrafish as a whole-organism model to screen new compounds for sun protection activity. First of all, we designed a series of UVB exposure experiments and recorded the phenotypic changes of zebrafish embryos. Results showed that 100 mJ/cm² of UVB given six times separated by 30 min intervals is the best condition. Fin malformation (reduced and/or absent fin) phenotypes are the most evident consequences after exposure to UVB. Each fin was affected by UVB, including pelvic, ventral, caudal, and dorsal fin, but pelvic fin seemed to be the most sensitive target after UVB exposure. We furthermore carried out "prevention" and "treatment" experiments using green tea extract and/or (-)-epigallocatechin (EGCG) to test this whole-organism model by observing the morphological changes of all fins (especially pelvic fin) after UVB exposure. Effects of UVB, green tea extract and EGCG on fin development were assessed using the Kaplan-Meier analysis, log-rank test and Cox proportional hazards regression. Results showed that a zebrafish pelvic fin in the UVB + green tea (treatment) group is 5.51 (range from 2.39 to 14.90) times, one in the UVB + green tea (prevention) group is 7.04 (range from 3.11 to 18.92) times, and one in the 25 ppm of EGCG (prevention) group is 22.19 (range from 9.40 to 61.50) times more likely to return to normal fin than one in the UVB only group. On the basis of these observations, we believe this model is effective for screening the higher stability and lower toxicity of new compounds, such as small chemicals which are derivatives of EGCG or other dietary agents for sun protection.

  19. Forecasting of particulate matter time series using wavelet analysis and wavelet-ARMA/ARIMA model in Taiyuan, China.

    Science.gov (United States)

    Zhang, Hong; Zhang, Sheng; Wang, Ping; Qin, Yuzhe; Wang, Huifeng

    2017-07-01

    Particulate matter with aerodynamic diameter below 10 μm (PM10) forecasting is difficult because of the uncertainties in describing the emission and meteorological fields. This paper proposed a wavelet-ARMA/ARIMA model to forecast the short-term series of the PM10 concentrations. It was evaluated by experiments using a 10-year data set of daily PM10 concentrations from 4 stations located in Taiyuan, China. The results indicated the following: (1) PM10 concentrations of Taiyuan had a decreasing trend during 2005 to 2012 but increased in 2013. PM10 concentrations had an obvious seasonal fluctuation related to coal-fired heating in winter and early spring. (2) Spatial differences among the four stations showed that the PM10 concentrations in industrial and heavily trafficked areas were higher than those in residential and suburb areas. (3) Wavelet analysis revealed that the trend variation and the changes of the PM10 concentration of Taiyuan were complicated. (4) The proposed wavelet-ARIMA model could be efficiently and successfully applied to the PM10 forecasting field. Compared with the traditional ARMA/ARIMA methods, this wavelet-ARMA/ARIMA method could effectively reduce the forecasting error, improve the prediction accuracy, and realize multiple-time-scale prediction. Wavelet analysis can filter noisy signals and identify the variation trend and the fluctuation of the PM10 time-series data. Wavelet decomposition and reconstruction reduce the nonstationarity of the PM10 time-series data, and thus improve the accuracy of the prediction.
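
    A generic wavelet-ARIMA sketch on synthetic daily data is shown below, assuming PyWavelets and statsmodels are available; the db4 wavelet, decomposition level, ARIMA orders, and data are illustrative and not the settings used for the Taiyuan stations.

```python
import numpy as np
import pywt
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
t = np.arange(730)
pm10 = 120 + 40 * np.sin(2 * np.pi * t / 365) + rng.gamma(2.0, 10.0, t.size)  # synthetic daily PM10

level = 3
coeffs = pywt.wavedec(pm10, "db4", level=level)

# Reconstruct one sub-series per coefficient band (approximation + details)
components = []
for i in range(len(coeffs)):
    masked = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
    components.append(pywt.waverec(masked, "db4")[: len(pm10)])

# Fit a small ARIMA to each component and sum the 7-day-ahead forecasts
horizon = 7
forecast = np.zeros(horizon)
for comp in components:
    fit = ARIMA(comp, order=(1, 0, 1)).fit()
    forecast += fit.forecast(steps=horizon)

print(forecast.round(1))
```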

  20. Clinical time series prediction: towards a hierarchical dynamical system framework

    Science.gov (United States)

    Liu, Zitao; Hauskrecht, Milos

    2014-01-01

    Objective: Developing machine learning and data mining algorithms for building temporal models of clinical time series is important for understanding of the patient condition, the dynamics of a disease, effect of various patient management interventions and clinical decision making. In this work, we propose and develop a novel hierarchical framework for modeling clinical time series data of varied length and with irregularly sampled observations. Materials and methods: Our hierarchical dynamical system framework for modeling clinical time series combines advantages of the two temporal modeling approaches: the linear dynamical system and the Gaussian process. We model the irregularly sampled clinical time series by using multiple Gaussian process sequences in the lower level of our hierarchical framework and capture the transitions between Gaussian processes by utilizing the linear dynamical system. The experiments are conducted on the complete blood count (CBC) panel data of 1000 post-surgical cardiac patients during their hospitalization. Our framework is evaluated and compared to multiple baseline approaches in terms of the mean absolute prediction error and the absolute percentage error. Results: We tested our framework by first learning the time series model from data for the patient in the training set, and then applying the model in order to predict future time series values on the patients in the test set. We show that our model outperforms multiple existing models in terms of its predictive accuracy. Our method achieved a 3.13% average prediction accuracy improvement on ten CBC lab time series when it was compared against the best performing baseline. A 5.25% average accuracy improvement was observed when only short-term predictions were considered. Conclusion: A new hierarchical dynamical system framework that lets us model irregularly sampled time series data is a promising new direction for modeling clinical time series and for improving their predictive

  1. Entropic Analysis of Electromyography Time Series

    Science.gov (United States)

    Kaufman, Miron; Sung, Paul

    2005-03-01

    We are in the process of assessing the effectiveness of fractal and entropic measures for the diagnostic of low back pain from surface electromyography (EMG) time series. Surface electromyography (EMG) is used to assess patients with low back pain. In a typical EMG measurement, the voltage is measured every millisecond. We observed back muscle fatiguing during one minute, which results in a time series with 60,000 entries. We characterize the complexity of time series by computing the Shannon entropy time dependence. The analysis of the time series from different relevant muscles from healthy and low back pain (LBP) individuals provides evidence that the level of variability of back muscle activities is much larger for healthy individuals than for individuals with LBP. In general the time dependence of the entropy shows a crossover from a diffusive regime to a regime characterized by long time correlations (self organization) at about 0.01s.
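
    A minimal version of the windowed entropy computation is sketched below on a toy 60-second record (one sample per millisecond, as described); the bin edges, window length, and fatigue-like amplitude decay are illustrative assumptions.

```python
import numpy as np

def shannon_entropy(window, edges):
    """Shannon entropy (bits) of the amplitude histogram of one EMG window."""
    hist, _ = np.histogram(window, bins=edges)
    p = hist[hist > 0] / hist.sum()
    return float(-np.sum(p * np.log2(p)))

rng = np.random.default_rng(0)
fs = 1000                                             # one sample per millisecond
emg = rng.normal(0.0, 1.0, 60 * fs) * np.linspace(1.0, 0.4, 60 * fs)   # toy 60 s record, amplitude decays

edges = np.linspace(-3.0, 3.0, 33)                    # fixed amplitude bins so windows are comparable
window = 2 * fs                                       # 2-second windows
entropies = [shannon_entropy(emg[i:i + window], edges)
             for i in range(0, len(emg) - window + 1, window)]
print(np.round(entropies, 2))
```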

  2. 78 FR 76254 - Special Conditions: Airbus, Model A350-900 Series Airplane; Control Surface Awareness and Mode...

    Science.gov (United States)

    2013-12-17

    ...-0899; Notice No. 25-13-15-SC] Special Conditions: Airbus, Model A350-900 Series Airplane; Control... of proposed special conditions. SUMMARY: This action proposes special conditions for the Airbus Model..., data, or views. The most helpful comments reference a specific portion of the proposed special...

  3. Xanthusbase: adapting wikipedia principles to a model organism database

    OpenAIRE

    Arshinoff, Bradley I.; Suen, Garret; Just, Eric M.; Merchant, Sohel M.; Kibbe, Warren A.; Chisholm, Rex L.; Welch, Roy D.

    2006-01-01

    xanthusBase is the official model organism database (MOD) for the social bacterium Myxococcus xanthus. In many respects, M. xanthus represents the pioneer model organism (MO) for studying the genetic, biochemical, and mechanistic basis of prokaryotic multicellularity, a topic that has garnered considerable attention due to the significance of biofilms in both basic and applied microbiology research. To facilitate its utility, the design of xanthusBase incorporates open-source software, leve...

  4. Fuzzy time series forecasting model with natural partitioning length approach for predicting the unemployment rate under different degree of confidence

    Science.gov (United States)

    Ramli, Nazirah; Mutalib, Siti Musleha Ab; Mohamad, Daud

    2017-08-01

    Fuzzy time series forecasting models have been proposed since 1993 to cater for data in linguistic values. Many improvements and modifications have been made to the model, such as enhancements to the length of the intervals and the types of fuzzy logical relations. However, most of the improved models represent the linguistic terms in the form of discrete fuzzy sets. In this paper, a fuzzy time series model with data in the form of trapezoidal fuzzy numbers and a natural partitioning length approach is introduced for predicting the unemployment rate. Two types of fuzzy relations are used in this study, namely first-order and second-order fuzzy relations. The proposed model can produce forecasted values under different degrees of confidence.

  5. Mechanical characterization of porcine abdominal organs.

    Science.gov (United States)

    Tamura, Atsutaka; Omori, Kiyoshi; Miki, Kazuo; Lee, Jong B; Yang, King H; King, Albert I

    2002-11-01

    Typical automotive related abdominal injuries occur due to contact with the rim of the steering wheel, seatbelt and armrest, however, the rate is less than in other body regions. When solid abdominal organs, such as the liver, kidneys and spleen are involved, the injury severity tends to be higher. Although sled and pendulum impact tests have been conducted using cadavers and animals, the mechanical properties and the tissue level injury tolerance of abdominal solid organs are not well characterized. These data are needed in the development of computer models, the improvement of current anthropometric test devices and the enhancement of our understanding of abdominal injury mechanisms. In this study, a series of experimental tests on solid abdominal organs was conducted using porcine liver, kidney and spleen specimens. Additionally, the injury tolerance of the solid organs was deduced from the experimental data.

  6. A multi-tiered time-series modelling approach to forecasting respiratory syncytial virus incidence at the local level.

    Science.gov (United States)

    Spaeder, M C; Fackler, J C

    2012-04-01

    Respiratory syncytial virus (RSV) is the most common cause of documented viral respiratory infections, and the leading cause of hospitalization, in young children. We performed a retrospective time-series analysis of paediatric patients. Forecasting models of weekly RSV incidence for the local community, inpatient paediatric hospital and paediatric intensive-care unit (PICU) were created. Ninety-five percent confidence intervals calculated around our models' 2-week forecasts were accurate to ±9·3, ±7·5 and ±1·5 cases/week for the local community, inpatient hospital and PICU, respectively. Our results suggest that time-series models may be useful tools in forecasting the burden of RSV infection at the local and institutional levels, helping communities and institutions to optimize distribution of resources based on the changing burden and severity of illness in their respective communities.

  7. Fuzzy time-series based on Fibonacci sequence for stock price forecasting

    Science.gov (United States)

    Chen, Tai-Liang; Cheng, Ching-Hsue; Jong Teoh, Hia

    2007-07-01

    Time-series models have been utilized to make reasonably accurate predictions in the areas of stock price movements, academic enrollments, weather, etc. For promoting the forecasting performance of fuzzy time-series models, this paper proposes a new model, which incorporates the concept of the Fibonacci sequence, the framework of Song and Chissom's model and the weighted method of Yu's model. This paper employs a 5-year period TSMC (Taiwan Semiconductor Manufacturing Company) stock price data and a 13-year period of TAIEX (Taiwan Stock Exchange Capitalization Weighted Stock Index) stock index data as experimental datasets. By comparing our forecasting performances with Chen's (Forecasting enrollments based on fuzzy time-series. Fuzzy Sets Syst. 81 (1996) 311-319), Yu's (Weighted fuzzy time-series models for TAIEX forecasting. Physica A 349 (2004) 609-624) and Huarng's (The application of neural networks to forecast fuzzy time series. Physica A 336 (2006) 481-491) models, we conclude that the proposed model surpasses in accuracy these conventional fuzzy time-series models.

  8. Pesticide nonextractable residue formation in soil: insights from inverse modeling of degradation time series.

    Science.gov (United States)

    Loos, Martin; Krauss, Martin; Fenner, Kathrin

    2012-09-18

    Formation of soil nonextractable residues (NER) is central to the fate and persistence of pesticides. To investigate pools and extent of NER formation, an established inverse modeling approach for pesticide soil degradation time series was evaluated with a Monte Carlo Markov Chain (MCMC) sampling procedure. It was found that only half of 73 pesticide degradation time series from a homogeneous soil source allowed for well-behaved identification of kinetic parameters with a four-pool model containing a parent compound, a metabolite, a volatile, and a NER pool. A subsequent simulation indeed confirmed distinct parameter combinations of low identifiability. Taking the resulting uncertainties into account, several conclusions regarding NER formation and its impact on persistence assessment could nonetheless be drawn. First, rate constants for transformation of parent compounds to metabolites were correlated to those for transformation of parent compounds to NER, leading to degradation half-lives (DegT50) typically not being larger than disappearance half-lives (DT50) by more than a factor of 2. Second, estimated rate constants were used to evaluate NER formation over time. This showed that NER formation, particularly through the metabolite pool, may be grossly underestimated when using standard incubation periods. It further showed that amounts and uncertainties in (i) total NER, (ii) NER formed from the parent pool, and (iii) NER formed from the metabolite pool vary considerably among data sets at t→∞, with no clear dominance between (ii) and (iii). However, compounds containing aromatic amine moieties were found to form significantly more total NER when extrapolating to t→∞ than the other compounds studied. Overall, our study stresses the general need for assessing uncertainties, identifiability issues, and resulting biases when using inverse modeling of degradation time series for evaluating persistence and NER formation.
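
    A generic four-pool first-order scheme of the kind described (parent, metabolite, NER, volatiles) can be written down as below; the rate constants are illustrative and the inverse/MCMC parameter estimation step is not shown.

```python
import numpy as np
from scipy.integrate import solve_ivp

def four_pool(t, y, k_pm, k_pn, k_pv, k_mn):
    """Parent -> metabolite / NER / volatiles; metabolite -> NER (all first order, illustrative)."""
    parent, metab, ner, vol = y
    return [-(k_pm + k_pn + k_pv) * parent,
            k_pm * parent - k_mn * metab,
            k_pn * parent + k_mn * metab,
            k_pv * parent]

k = dict(k_pm=0.04, k_pn=0.02, k_pv=0.005, k_mn=0.03)     # 1/day, illustrative
sol = solve_ivp(four_pool, (0.0, 120.0), [100.0, 0.0, 0.0, 0.0],
                t_eval=np.linspace(0.0, 120.0, 13), args=tuple(k.values()))

parent = sol.y[0]
dt50 = np.interp(50.0, parent[::-1], sol.t[::-1])          # time to 50% disappearance of the parent
print("DT50 ~", round(float(dt50), 1), "days; NER at day 120:", round(float(sol.y[2, -1]), 1))
```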

  9. Connected to TV series: Quantifying series watching engagement.

    Science.gov (United States)

    Tóth-Király, István; Bőthe, Beáta; Tóth-Fáber, Eszter; Hága, Győző; Orosz, Gábor

    2017-12-01

    Background and aims: Television series watching stepped into a new golden age with the appearance of online series. Being highly involved in series could potentially lead to negative outcomes, but highly engaged and problematic viewers should be distinguished. As no appropriate measure is available for identifying such differences, a short and valid measure was constructed in a multistudy investigation: the Series Watching Engagement Scale (SWES). Methods: In Study 1 (N(Sample 1) = 740 and N(Sample 2) = 740), exploratory structural equation modeling and confirmatory factor analysis were used to identify the most important facets of series watching engagement. In Study 2 (N = 944), measurement invariance of the SWES was investigated between males and females. In Study 3 (N = 1,520), latent profile analysis (LPA) was conducted to identify subgroups of viewers. Results: Five factors of engagement were identified in Study 1 that are of major relevance: persistence, identification, social interaction, overuse, and self-development. Study 2 supported the high levels of equivalence between males and females. In Study 3, three groups of viewers (low-, medium-, and high-engagement viewers) were identified. The highly engaged at-risk group can be differentiated from the other two along key variables of watching time and personality. Discussion: The present findings support the overall validity, reliability, and usefulness of the SWES and the results of the LPA showed that it might be useful to identify at-risk viewers before the development of problematic use.

  10. Studies in astronomical time series analysis: Modeling random processes in the time domain

    Science.gov (United States)

    Scargle, J. D.

    1979-01-01

    Random process models phrased in the time domain are used to analyze astrophysical time series data produced by random processes. A moving average (MA) model represents the data as a sequence of pulses occurring randomly in time, with random amplitudes. An autoregressive (AR) model represents the correlations in the process in terms of a linear function of past values. The best AR model is determined from sampled data and transformed to an MA for interpretation. The randomness of the pulse amplitudes is maximized by a FORTRAN algorithm which is relatively stable numerically. Results of test cases are given to study the effects of adding noise and of different distributions for the pulse amplitudes. A preliminary analysis of the optical light curve of the quasar 3C 273 is given.
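
    The AR-to-MA conversion step can be illustrated as follows: an AR model is fitted to a synthetic series and expanded into its moving-average (pulse) representation. This shows only the mechanics with standard statsmodels tools, not Scargle's full deconvolution procedure, and the AR(2) coefficients are invented.

```python
import numpy as np
from statsmodels.tsa.ar_model import AutoReg
from statsmodels.tsa.arima_process import arma2ma

rng = np.random.default_rng(0)
n = 2000
x = np.zeros(n)
for t in range(2, n):                                   # synthetic AR(2) "light curve"
    x[t] = 0.75 * x[t - 1] - 0.30 * x[t - 2] + rng.normal()

res = AutoReg(x, lags=2).fit()
phi = res.params[1:]                                    # estimated AR coefficients (constant skipped)
psi = arma2ma(np.r_[1.0, -phi], np.array([1.0]), lags=12)
print("AR coefficients:", np.round(phi, 3))
print("MA (pulse) weights:", np.round(psi, 3))
```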

  11. A Virtual Machine Migration Strategy Based on Time Series Workload Prediction Using Cloud Model

    Directory of Open Access Journals (Sweden)

    Yanbing Liu

    2014-01-01

    Aimed at resolving the issues of the imbalance of resources and workloads at data centers and the overhead together with the high cost of virtual machine (VM) migrations, this paper proposes a new VM migration strategy which is based on the cloud model time series workload prediction algorithm. By setting the upper and lower workload bounds for host machines, forecasting the tendency of their subsequent workloads by creating a workload time series using the cloud model, and stipulating a general VM migration criterion workload-aware migration (WAM), the proposed strategy selects a source host machine, a destination host machine, and a VM on the source host machine carrying out the task of the VM migration. Experimental results and analyses show, through comparison with other peer research works, that the proposed method can effectively avoid VM migrations caused by momentary peak workload values, significantly lower the number of VM migrations, and dynamically reach and maintain a resource and workload balance for virtual machines promoting an improved utilization of resources in the entire data center.

  12. MATRIX-VBS Condensing Organic Aerosols in an Aerosol Microphysics Model

    Science.gov (United States)

    Gao, Chloe Y.; Tsigaridis, Konstas; Bauer, Susanne E.

    2015-01-01

    The condensation of organic aerosols is represented in a newly developed box-model scheme, where its effect on the growth and composition of particles is examined. We implemented the volatility-basis set (VBS) framework into the aerosol mixing state resolving microphysical scheme Multiconfiguration Aerosol TRacker of mIXing state (MATRIX). This new scheme is unique and advances the representation of organic aerosols in models in that, contrary to the traditional treatment of organic aerosols as non-volatile in most climate models and in the original version of MATRIX, this new scheme treats them as semi-volatile. Such treatment is important because low-volatility organics contribute significantly to the growth of particles. The new scheme includes several classes of semi-volatile organic compounds from the VBS framework that can partition among aerosol populations in MATRIX, thus representing the growth of particles via condensation of low volatility organic vapors. Results from test cases representing Mexico City and Finnish forest conditions show good representation of the time evolution of concentrations for VBS species in the gas phase and in the condensed particulate phase. Emitted semi-volatile primary organic aerosols evaporate almost completely in the high-volatility range, and they condense more efficiently in the low-volatility range.

  13. Clinical time series prediction: Toward a hierarchical dynamical system framework.

    Science.gov (United States)

    Liu, Zitao; Hauskrecht, Milos

    2015-09-01

    Developing machine learning and data mining algorithms for building temporal models of clinical time series is important for understanding of the patient condition, the dynamics of a disease, effect of various patient management interventions and clinical decision making. In this work, we propose and develop a novel hierarchical framework for modeling clinical time series data of varied length and with irregularly sampled observations. Our hierarchical dynamical system framework for modeling clinical time series combines advantages of the two temporal modeling approaches: the linear dynamical system and the Gaussian process. We model the irregularly sampled clinical time series by using multiple Gaussian process sequences in the lower level of our hierarchical framework and capture the transitions between Gaussian processes by utilizing the linear dynamical system. The experiments are conducted on the complete blood count (CBC) panel data of 1000 post-surgical cardiac patients during their hospitalization. Our framework is evaluated and compared to multiple baseline approaches in terms of the mean absolute prediction error and the absolute percentage error. We tested our framework by first learning the time series model from data for the patients in the training set, and then using it to predict future time series values for the patients in the test set. We show that our model outperforms multiple existing models in terms of its predictive accuracy. Our method achieved a 3.13% average prediction accuracy improvement on ten CBC lab time series when it was compared against the best performing baseline. A 5.25% average accuracy improvement was observed when only short-term predictions were considered. A new hierarchical dynamical system framework that lets us model irregularly sampled time series data is a promising new direction for modeling clinical time series and for improving their predictive performance. Copyright © 2014 Elsevier B.V. All rights reserved.

  14. Democracy versus dictatorship in self-organized models of financial markets

    Science.gov (United States)

    D'Hulst, R.; Rodgers, G. J.

    2000-06-01

    Models to mimic the transmission of information in financial markets are introduced. As an attempt to generate the demand process, we distinguish between dictatorship associations, where groups of agents rely on one of them to make decision, and democratic associations, where each agent takes part in the group decision. In the dictatorship model, agents segregate into two distinct populations, while the democratic model is driven towards a critical state where groups of agents of all sizes exist. Hence, both models display a level of organization, but only the democratic model is self-organized. We show that the dictatorship model generates less-volatile markets than the democratic model.

  15. Table of 3D organ model IDs and organ names (PART-OF Tree) - BodyParts3D | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available Table of 3D organ model IDs and organ names (PART-OF Tree). DOI: 10.18908/lsdba.nbdc00837-002. Description of data contents: a list of downloadable 3D organ models in a tab-delimited text file format, describing the correspondence between 3D organ model IDs and organ names available in the PART-OF Tree.

  16. Finite-element time-domain modeling of electromagnetic data in general dispersive medium using adaptive Padé series

    Science.gov (United States)

    Cai, Hongzhu; Hu, Xiangyun; Xiong, Bin; Zhdanov, Michael S.

    2017-12-01

    The induced polarization (IP) method has been widely used in geophysical exploration to identify chargeable targets such as mineral deposits. The inversion of IP data requires modeling the IP response of 3D dispersive conductive structures. We have developed an edge-based finite-element time-domain (FETD) modeling method to simulate the electromagnetic (EM) fields in 3D dispersive media. We solve the vector Helmholtz equation for the total electric field using the edge-based finite-element method with an unstructured tetrahedral mesh. We adopt the backward Euler method, which is unconditionally stable, with semi-adaptive time stepping for the time-domain discretization. We use a direct solver based on a sparse LU decomposition to solve the system of equations. We consider the Cole-Cole model in order to take into account the frequency-dependent conductivity dispersion. The Cole-Cole conductivity model in the frequency domain is expanded using a truncated Padé series with adaptive selection of the center frequency of the series for early and late times. This approach can significantly increase the accuracy of FETD modeling.
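
    For reference, the Cole-Cole dispersion that such a Padé expansion approximates can be written directly in the frequency domain. The sketch below uses the common Pelton-style parameterization with placeholder parameter values; the paper's exact formulation may differ.

```python
import numpy as np

def cole_cole_resistivity(freq_hz, rho0=100.0, m=0.2, tau=0.1, c=0.5):
    """Complex resistivity of a chargeable medium (Pelton-style Cole-Cole model).

    rho0: DC resistivity (ohm-m), m: chargeability, tau: time constant (s),
    c: frequency exponent. All parameter values here are illustrative only.
    """
    iwt = (1j * 2.0 * np.pi * freq_hz * tau) ** c
    return rho0 * (1.0 - m * (1.0 - 1.0 / (1.0 + iwt)))

freqs = np.logspace(-2, 4, 7)                  # 0.01 Hz to 10 kHz
sigma = 1.0 / cole_cole_resistivity(freqs)     # frequency-dependent conductivity
print(np.round(sigma, 5))
```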

  17. Cooling load calculation by the radiant time series method - effect of solar radiation models

    Energy Technology Data Exchange (ETDEWEB)

    Costa, Alexandre M.S. [Universidade Estadual de Maringa (UEM), PR (Brazil)], E-mail: amscosta@uem.br

    2010-07-01

    In this work, the effect of three different solar radiation models on the cooling load calculated by the radiant time series method was analyzed numerically. The solar radiation models implemented were clear sky, isotropic sky and anisotropic sky. The radiant time series (RTS) method was proposed by ASHRAE (2001) to replace the classical methods of cooling load calculation, such as TETD/TA. The method is based on computing the effect of space thermal energy storage on the instantaneous cooling load. The computation is carried out by splitting the heat gain components into convective and radiant parts. The radiant part is then transformed using time series whose coefficients are a function of the construction type and heat gain (solar or non-solar). The transformed result is added to the convective part, giving the instantaneous cooling load. The method was applied to investigate the influence for an example room. The location used was 23 degrees S and 51 degrees W and the day was 21 January, a typical summer day in the southern hemisphere. The room was composed of two vertical walls with windows exposed to outdoors, with azimuth angles facing the west and east directions. The output of the different solar radiation models for the two walls, in terms of direct and diffuse components as well as heat gains, was investigated. It was verified that the clear sky model exhibited the less conservative (higher) values for the direct component of solar radiation, with the opposite trend for the diffuse component. For the heat gain, the clear sky model gave the highest values, three times higher at the peak hours than the other models. Both isotropic and anisotropic models predicted similar magnitudes for the heat gain. The same behavior was also verified for the cooling load. The effect of room thermal inertia was to decrease the cooling load during the peak hours; on the other hand, the higher the thermal inertia, the greater the load during the non-peak hours. The effect
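
    The core bookkeeping of the RTS method, splitting each hour's gain into convective and radiant parts and convolving the radiant part with a 24-term series of radiant time factors, can be sketched as follows; the radiant/convective split and the radiant time factors below are placeholders rather than ASHRAE tabulated values.

```python
import numpy as np

def rts_cooling_load(heat_gain, rts_coeffs, radiant_fraction=0.6):
    """Hourly cooling load from hourly heat gains via the radiant time series method.

    heat_gain        : array of 24 hourly heat gains (W)
    rts_coeffs       : 24 radiant time factors summing to 1.0 (placeholder values)
    radiant_fraction : share of the gain treated as radiant (illustrative)
    """
    radiant = radiant_fraction * heat_gain
    convective = (1.0 - radiant_fraction) * heat_gain
    load = np.zeros(24)
    for hour in range(24):
        # current and previous 23 hours of radiant gain, wrapped over the day
        past = radiant[(hour - np.arange(24)) % 24]
        load[hour] = convective[hour] + np.dot(rts_coeffs, past)
    return load

gains = 500 + 400 * np.clip(np.sin(np.pi * (np.arange(24) - 6) / 12), 0, None)
rts = np.array([0.5] + [0.5 / 23] * 23)   # placeholder radiant time factors
print(np.round(rts_cooling_load(gains, rts), 1))
```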

  18. Stochastic simulation of time-series models combined with geostatistics to predict water-table scenarios in a Guarani Aquifer System outcrop area, Brazil

    Science.gov (United States)

    Manzione, Rodrigo L.; Wendland, Edson; Tanikawa, Diego H.

    2012-11-01

    Stochastic methods based on time-series modeling combined with geostatistics can be useful tools to describe the variability of water-table levels in time and space and to account for uncertainty. Monitoring water-level networks can give information about the dynamics of the aquifer domain in both dimensions. Time-series modeling is an elegant way to treat monitoring data without the complexity of physical mechanistic models. Time-series model predictions can be interpolated spatially, with the spatial differences in water-table dynamics determined by the spatial variation in the system properties and the temporal variation driven by the dynamics of the inputs into the system. An integration of stochastic methods is presented, based on time-series modeling and geostatistics, as a framework to predict water levels for decision making in groundwater management and land-use planning. The methodology is applied in a case study in a Guarani Aquifer System (GAS) outcrop area located in the southeastern part of Brazil. Communication of results in a clear and understandable form, via simulated scenarios, is discussed as an alternative when translating scientific knowledge into applications of stochastic hydrogeology in large aquifers with limited monitoring network coverage like the GAS.

  19. International Work-Conference on Time Series

    CERN Document Server

    Pomares, Héctor

    2016-01-01

    This volume presents selected peer-reviewed contributions from The International Work-Conference on Time Series, ITISE 2015, held in Granada, Spain, July 1-3, 2015. It discusses topics in time series analysis and forecasting, advanced methods and online learning in time series, high-dimensional and complex/big data time series as well as forecasting in real problems. The International Work-Conferences on Time Series (ITISE) provide a forum for scientists, engineers, educators and students to discuss the latest ideas and implementations in the foundations, theory, models and applications in the field of time series analysis and forecasting. It focuses on interdisciplinary and multidisciplinary research encompassing the disciplines of computer science, mathematics, statistics and econometrics.

  20. Stochastic Models in the DORIS Position Time Series: Estimates from the IDS Contribution to the ITRF2014

    Science.gov (United States)

    Klos, A.; Bogusz, J.; Moreaux, G.

    2017-12-01

    This research focuses on the investigation of the deterministic and stochastic parts of the DORIS (Doppler Orbitography and Radiopositioning Integrated by Satellite) weekly coordinate time series from the IDS contribution to the ITRF2014. A set of 90 stations was divided into three groups depending on when the data was collected at an individual station. To reliably describe the DORIS time series, we employed a mathematical model that included the long-term nonlinear signal, linear trend, seasonal oscillations (these three sum up to produce the Polynomial Trend Model) and a stochastic part, all being resolved with Maximum Likelihood Estimation (MLE). We proved that the values of the parameters delivered for DORIS data are strictly correlated with the time span of the observations, meaning that the most recent data are the most reliable ones. Not only did the seasonal amplitudes decrease over the years, but also, and most importantly, the noise level and its type changed significantly. We examined five different noise models to be applied to the stochastic part of the DORIS time series: a pure white noise (WN), a pure power-law noise (PL), a combination of white and power-law noise (WNPL), an autoregressive process of first order (AR(1)) and a Generalized Gauss Markov model (GGM). From our study it arises that the PL process may be chosen as the preferred one for most of the DORIS data. Moreover, the preferred noise model has changed through the years from AR(1) to pure PL, with a few stations characterized by a positive spectral index.
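
    A power-law noise realization of the kind compared above can be generated by spectrally shaping white noise in the Fourier domain; in the sketch below the spectral index, the series length and the normalization are arbitrary illustrative choices.

```python
import numpy as np

def power_law_noise(n, spectral_index=-1.0, seed=0):
    """Generate noise with power spectrum P(f) ~ f**spectral_index by FFT filtering.

    spectral_index = 0 gives white noise, -1 flicker-like noise, -2 random walk.
    """
    rng = np.random.default_rng(seed)
    white = rng.standard_normal(n)
    f = np.fft.rfftfreq(n, d=1.0)
    shape = np.ones_like(f)
    shape[1:] = f[1:] ** (spectral_index / 2.0)   # amplitude ~ f^(index/2)
    shape[0] = 0.0                                # drop the DC component
    colored = np.fft.irfft(np.fft.rfft(white) * shape, n)
    return colored / colored.std()

series = power_law_noise(520, spectral_index=-1.0)   # ~10 years of weekly data
```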

  1. Stage-structured matrix models for organisms with non-geometric development times

    Science.gov (United States)

    Andrew Birt; Richard M. Feldman; David M. Cairns; Robert N. Coulson; Maria Tchakerian; Weimin Xi; James M. Guldin

    2009-01-01

    Matrix models have been used to model population growth of organisms for many decades. They are popular because of both their conceptual simplicity and their computational efficiency. For some types of organisms they are relatively accurate in predicting population growth; however, for others the matrix approach does not adequately model...
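
    The basic mechanics being discussed, projecting a stage-abundance vector forward with a transition matrix, can be sketched as follows; the three-stage life cycle and all of the rates are invented purely for illustration.

```python
import numpy as np

# Hypothetical 3-stage insect life cycle: egg -> larva -> adult
A = np.array([
    [0.2, 0.0, 40.0],   # eggs: 20% remain eggs; each adult lays 40 eggs per step
    [0.5, 0.3, 0.0],    # 50% of eggs become larvae; 30% of larvae remain larvae
    [0.0, 0.4, 0.6],    # 40% of larvae mature; 60% adult survival
])

n = np.array([100.0, 20.0, 5.0])     # initial numbers in each stage
for step in range(10):
    n = A @ n                        # project the population one time step
print(np.round(n, 1))

# Long-run growth rate = dominant eigenvalue of the projection matrix
print(round(max(abs(np.linalg.eigvals(A))), 3))
```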

  2. Two-parameter double-oscillator model of Mathews-Lakshmanan type: Series solutions and supersymmetric partners

    International Nuclear Information System (INIS)

    Schulze-Halberg, Axel; Wang, Jie

    2015-01-01

    We obtain series solutions, the discrete spectrum, and supersymmetric partners for a quantum double-oscillator system. Its potential features a superposition of the one-parameter Mathews-Lakshmanan interaction and a one-parameter harmonic or inverse harmonic oscillator contribution. Furthermore, our results are transferred to a generalized Pöschl-Teller model that is isospectral to the double-oscillator system

  3. Two-parameter double-oscillator model of Mathews-Lakshmanan type: Series solutions and supersymmetric partners

    Energy Technology Data Exchange (ETDEWEB)

    Schulze-Halberg, Axel, E-mail: axgeschu@iun.edu, E-mail: xbataxel@gmail.com [Department of Mathematics and Actuarial Science and Department of Physics, Indiana University Northwest, 3400 Broadway, Gary, Indiana 46408 (United States); Wang, Jie, E-mail: wangjie@iun.edu [Department of Computer Information Systems, Indiana University Northwest, 3400 Broadway, Gary, Indiana 46408 (United States)

    2015-07-15

    We obtain series solutions, the discrete spectrum, and supersymmetric partners for a quantum double-oscillator system. Its potential features a superposition of the one-parameter Mathews-Lakshmanan interaction and a one-parameter harmonic or inverse harmonic oscillator contribution. Furthermore, our results are transferred to a generalized Pöschl-Teller model that is isospectral to the double-oscillator system.

  4. Time Series Modeling of Army Mission Command Communication Networks: An Event-Driven Analysis

    Science.gov (United States)

    2013-06-01

    Lehmann, D. R. (1984). How advertising affects sales: Meta-analysis of econometric results. Journal of Marketing Research, 21, 65-74. Barabási, A. L...317-357. Leone, R. P. (1983). Modeling sales-advertising relationships: An integrated time series-econometric approach. Journal of Marketing ... Research, 20, 291-295. McGrath, J. E., & Kravitz, D. A. (1982). Group research. Annual Review of Psychology, 33, 195-230. Monge, P. R., & Contractor

  5. Modeling time-series count data: the unique challenges facing political communication studies.

    Science.gov (United States)

    Fogarty, Brian J; Monogan, James E

    2014-05-01

    This paper demonstrates the importance of proper model specification when analyzing time-series count data in political communication studies. It is common for scholars of media and politics to investigate counts of coverage of an issue as it evolves over time. Many scholars rightly consider the issues of time dependence and dynamic causality to be the most important when crafting a model. However, to ignore the count features of the outcome variable overlooks an important feature of the data. This is particularly the case when modeling data with a low number of counts. In this paper, we argue that the Poisson autoregressive model (Brandt and Williams, 2001) accurately meets the needs of many media studies. We replicate the analyses of Flemming et al. (1997), Peake and Eshbaugh-Soha (2008), and Ura (2009) and demonstrate that models missing some of the assumptions of the Poisson autoregressive model often yield invalid inferences. We also demonstrate that the effect of any of these models can be illustrated dynamically with estimates of uncertainty through a simulation procedure. The paper concludes with implications of these findings for the practical researcher. Copyright © 2013 Elsevier Inc. All rights reserved.
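
    The Brandt and Williams estimator itself is not reproduced here, but the flavor of count data with a serially dependent conditional mean can be sketched with a simple observation-driven Poisson recursion; the link function and coefficients below are illustrative and do not implement the cited PAR(p) likelihood.

```python
import numpy as np

def simulate_poisson_ar(n=200, intercept=0.5, ar_coef=0.6, seed=1):
    """Simulate counts whose log conditional mean depends on the previous count.

    A simple observation-driven sketch of serially dependent count data, used
    only to illustrate the idea of autoregression in low-count series.
    """
    rng = np.random.default_rng(seed)
    y = np.zeros(n, dtype=int)
    for t in range(1, n):
        mu = np.exp(intercept + ar_coef * np.log1p(y[t - 1]))
        y[t] = rng.poisson(mu)
    return y

counts = simulate_poisson_ar()
print(counts[:20], counts.mean())
```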

  6. Comparison of extended mean-reversion and time series models for electricity spot price simulation considering negative prices

    International Nuclear Information System (INIS)

    Keles, Dogan; Genoese, Massimo; Möst, Dominik; Fichtner, Wolf

    2012-01-01

    This paper evaluates different financial price and time series models, such as mean reversion, autoregressive moving average (ARMA), integrated ARMA (ARIMA) and general autoregressive conditional heteroscedasticity (GARCH) processes, usually applied for electricity price simulations. However, as these models are developed to describe the stochastic behaviour of electricity prices, they are extended by a separate data treatment for the deterministic components (trend, daily, weekly and annual cycles) of electricity spot prices. Furthermore, price jumps are considered and implemented within a regime-switching model. Since 2008 the market design has allowed negative prices at the European Energy Exchange, which have also occurred for several hours in recent years. Up to now, only a few financial and time series approaches exist which are able to capture negative prices. This paper presents a new approach incorporating negative prices. The evaluation of the different approaches presented points out that the mean reversion and the ARMA models deliver the lowest root mean square error between simulated and historical electricity spot prices gained from the European Energy Exchange. These models also possess lower mean average errors than GARCH models. Hence, they are more suitable for simulating well-fitting price paths. Furthermore, it is shown that the daily structure of historical price curves is better captured by applying ARMA or ARIMA processes instead of mean-reversion or GARCH models. Another important outcome of the paper is that the regime-switching approach and the consideration of negative prices via the newly proposed approach lead to a significant improvement of the electricity price simulation. - Highlights: ► Considering negative prices improves the results of time-series and financial models for electricity prices. ► Regime-switching approach captures the jumps and base prices quite well. ► Removing and separate modelling of deterministic annual, weekly and daily
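
    The mean-reversion building block common to these models can be sketched as a discretized Ornstein-Uhlenbeck residual added to a deterministic daily cycle; all parameters below are illustrative, and prices can go negative simply because the process is not truncated.

```python
import numpy as np

def simulate_spot_prices(hours=7 * 24, mean_level=50.0, speed=0.3,
                         sigma=8.0, seed=2):
    """Hourly spot prices = deterministic daily cycle + mean-reverting residual.

    The residual follows a discretized Ornstein-Uhlenbeck process, so prices can
    dip below zero, loosely mimicking negative electricity prices.
    """
    rng = np.random.default_rng(seed)
    t = np.arange(hours)
    daily_cycle = 15.0 * np.sin(2 * np.pi * (t % 24) / 24)   # illustrative shape
    x = np.zeros(hours)
    for i in range(1, hours):
        x[i] = x[i - 1] + speed * (0.0 - x[i - 1]) + sigma * rng.standard_normal()
    return mean_level + daily_cycle + x

prices = simulate_spot_prices()
print(round(prices.min(), 1), round(prices.max(), 1))
```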

  7. Modeling organic aerosols during MILAGRO: importance of biogenic secondary organic aerosols

    Directory of Open Access Journals (Sweden)

    A. Hodzic

    2009-09-01

    Full Text Available The meso-scale chemistry-transport model CHIMERE is used to assess our understanding of major sources and formation processes leading to a fairly large amount of organic aerosols – OA, including primary OA (POA) and secondary OA (SOA) – observed in Mexico City during the MILAGRO field project (March 2006). Chemical analyses of submicron aerosols from aerosol mass spectrometers (AMS) indicate that organic particles found in the Mexico City basin contain a large fraction of oxygenated organic species (OOA) which have strong correspondence with SOA, and that their production actively continues downwind of the city. The SOA formation is modeled here by the one-step oxidation of anthropogenic (i.e. aromatics, alkanes), biogenic (i.e. monoterpenes and isoprene), and biomass-burning SOA precursors and their partitioning into both organic and aqueous phases. Conservative assumptions are made for uncertain parameters to maximize the amount of SOA produced by the model. The near-surface model evaluation shows that predicted OA correlates reasonably well with measurements during the campaign; however, it remains a factor of 2 lower than the measured total OA. Fairly good agreement is found between predicted and observed POA within the city, suggesting that anthropogenic and biomass burning emissions are reasonably captured. Consistent with previous studies in Mexico City, large discrepancies are encountered for SOA, with a factor of 2–10 model underestimate. When only anthropogenic SOA precursors were considered, the model was able to reproduce within a factor of two the sharp increase in OOA concentrations during the late morning at both urban and near-urban locations, but the discrepancy increases rapidly later in the day, consistent with previous results, and is especially obvious when the column-integrated SOA mass is considered instead of the surface concentration. The increase in the missing SOA mass in the afternoon coincides with the sharp drop in POA

  8. Predicting long-term organic carbon dynamics in organically amended soils using the CQESTR model

    Energy Technology Data Exchange (ETDEWEB)

    Plaza, Cesar; Polo, Alfredo [Consejo Superior de Investigaciones Cientificas, Madrid (Spain). Inst. de Ciencias Agrarias; Gollany, Hero T. [Columbia Plateau Conservation Research Center, Pendleton, OR (United States). USDA-ARS; Baldoni, Guido; Ciavatta, Claudio [Bologna Univ. (Italy). Dept. of Agroenvironmental Sciences and Technologies

    2012-04-15

    Purpose: The CQESTR model is a process-based C model recently developed to simulate soil organic matter (SOM) dynamics and uses readily available or easily measurable input parameters. The current version of CQESTR (v. 2.0) has been validated successfully with a number of datasets from agricultural sites in North America but still needs to be tested in other geographic areas and soil types under diverse organic management systems. Materials and methods: We evaluated the predictive performance of CQESTR to simulate long-term (34 years) soil organic C (SOC) changes in a SOM-depleted European soil either unamended or amended with solid manure, liquid manure, or crop residue. Results and discussion: Measured SOC levels declined over the study period in the unamended soil, remained constant in the soil amended with crop residues, and tended to increase in the soils amended with manure, especially with solid manure. Linear regression analysis of measured SOC contents and CQESTR predictions resulted in a correlation coefficient of 0.626 (P < 0.001) and a slope and an intercept not significantly different from 1 and 0, respectively (95% confidence level). The mean squared deviation and root mean square error were relatively small. Simulated values fell within the 95% confidence interval of the measured SOC, and predicted errors were mainly associated with data scattering. Conclusions: The CQESTR model was shown to predict, with a reasonable degree of accuracy, the organic C dynamics in the soils examined. The CQESTR performance, however, could be improved by adding an additional parameter to differentiate between pre-decomposed organic amendments with varying degrees of stability. (orig.)

  9. A Long-Term Prediction Model of Beijing Haze Episodes Using Time Series Analysis

    Directory of Open Access Journals (Sweden)

    Xiaoping Yang

    2016-01-01

    Full Text Available The rapid industrial development has led to the intermittent outbreak of PM2.5 pollution, or haze, in developing countries, which has brought about great environmental issues, especially in big cities such as Beijing and New Delhi. We investigate the factors and mechanisms of haze change and present a long-term prediction model of Beijing haze episodes using time series analysis. We construct a dynamic structural measurement model of daily haze increment and reduce the model to a vector autoregressive model. Typical case studies on 886 continuous days indicate that our model performs very well on next day's Air Quality Index (AQI) prediction, and in severely polluted cases (AQI ≥ 300) the accuracy rate of AQI prediction even reaches up to 87.8%. The experiment of one-week prediction shows that our model has excellent sensitivity when a sudden haze burst or dissipation happens, which results in good long-term stability in the accuracy of the next 3–7 days' AQI prediction.
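
    A bare-bones version of the reduction described, fitting a first-order vector autoregression by least squares and issuing a next-day forecast, might look like the following; the variables and data are hypothetical, and the published model's measurement structure is considerably richer.

```python
import numpy as np

def fit_var1(Y):
    """Least-squares fit of Y_t = c + A @ Y_{t-1} + e_t for a (T, k) data matrix."""
    X = np.hstack([np.ones((len(Y) - 1, 1)), Y[:-1]])   # intercept + lagged values
    B, *_ = np.linalg.lstsq(X, Y[1:], rcond=None)
    return B[0], B[1:].T                                 # intercept c, matrix A

def predict_next(c, A, y_last):
    return c + A @ y_last

# Hypothetical daily series: [AQI increment, humidity, wind speed]
rng = np.random.default_rng(3)
Y = np.cumsum(rng.standard_normal((300, 3)), axis=0) * 0.1 + [100, 60, 3]
c, A = fit_var1(Y)
print(np.round(predict_next(c, A, Y[-1]), 2))
```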

  10. Biomass and nutrient distribution in an age series of eucalyptus hybrid plantation in Tamil Nadu. I. Distribution of organic matter

    Energy Technology Data Exchange (ETDEWEB)

    Negi, J D.S.; Sharma, S C

    1985-12-01

    The distribution of organic matter in an age series of Eucalyptus hybrid plantation in Tamil Nadu has been discussed. It was observed that (i) the rotation age for E. hybrid can be fixed at 7 years, where the Mean Annual Production (MAP) is at its maximum, (ii) Pollachi seems to be a comparatively better site for E. hybrid planting, presumably due to higher leaf efficiency, and (iii) to increase the productivity of a coppiced crop, thinning is essential, as a lower stand density gives a better chance for high leaf production and consequently higher biomass. 7 references, 1 figure, 5 tables.

  11. Fourier Series, the DFT and Shape Modelling

    DEFF Research Database (Denmark)

    Skoglund, Karl

    2004-01-01

    This report provides an introduction to Fourier series, the discrete Fourier transform, complex geometry and Fourier descriptors for shape analysis. The content is aimed at undergraduate and graduate students who wish to learn about Fourier analysis in general, as well as its application to shape...
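
    As a concrete example of the material covered, Fourier descriptors of a closed contour can be computed from the DFT of its complex coordinates; the normalization below (dropping the DC term and scaling by the first harmonic) is one common convention and not necessarily the report's.

```python
import numpy as np

def fourier_descriptors(x, y, n_keep=8):
    """Translation- and scale-invariant Fourier descriptors of a closed contour."""
    z = x + 1j * y                       # contour points as complex numbers
    coeffs = np.fft.fft(z)
    coeffs[0] = 0.0                      # remove translation (DC term)
    coeffs = coeffs / abs(coeffs[1])     # remove scale using the first harmonic
    return np.abs(coeffs[1:n_keep + 1])  # magnitudes ignore rotation/start point

theta = np.linspace(0, 2 * np.pi, 64, endpoint=False)
x = 3 * np.cos(theta) + 0.3 * np.cos(5 * theta)    # a wavy ellipse-like shape
y = 2 * np.sin(theta)
print(np.round(fourier_descriptors(x, y), 3))
```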

  12. Predicting The Exit Time Of Employees In An Organization Using Statistical Model

    Directory of Open Access Journals (Sweden)

    Ahmed Al Kuwaiti

    2015-08-01

    Full Text Available Employees are considered an asset to any organization, and each organization provides a better and flexible working environment to retain its best and most resourceful workforce. As such, continuous efforts are being taken to avoid or delay the exit/withdrawal of employees from the organization. Human resource managers face a challenge in predicting the exit time of employees, and there is no precise model existing at present in the literature. This study has been conducted to predict the probability of exit of an employee in an organization using an appropriate statistical model. Accordingly, the authors designed a model using the Additive Weibull distribution to predict the expected exit time of an employee in an organization. In addition, a shock model approach is also executed to check how well the Additive Weibull distribution suits an organization. The analytical results showed that when the inter-arrival time increases, the expected time for the employees to exit also increases. This study concluded that the Additive Weibull distribution can be considered as an alternative to the shock model approach to predict the exit time of an employee in an organization.
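
    One common parameterization of the Additive Weibull distribution adds two Weibull hazard terms; under that assumption (the paper's exact parameterization may differ), the survival function and an expected exit time can be sketched numerically as below, with invented parameter values.

```python
import numpy as np

def additive_weibull_survival(t, a=0.05, b=1.5, c=0.01, d=0.8):
    """S(t) = exp(-(a*t**b + c*t**d)); all parameter values here are illustrative."""
    return np.exp(-(a * t**b + c * t**d))

def expected_exit_time(a=0.05, b=1.5, c=0.01, d=0.8, t_max=200.0, n=20000):
    """E[T] = integral of the survival function, evaluated by the trapezoid rule."""
    t = np.linspace(0.0, t_max, n)
    return np.trapz(additive_weibull_survival(t, a, b, c, d), t)

print(round(expected_exit_time(), 2), "years (hypothetical parameter values)")
```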

  13. Time Series Analysis and Forecasting by Example

    CERN Document Server

    Bisgaard, Soren

    2011-01-01

    An intuition-based approach enables you to master time series analysis with ease Time Series Analysis and Forecasting by Example provides the fundamental techniques in time series analysis using various examples. By introducing necessary theory through examples that showcase the discussed topics, the authors successfully help readers develop an intuitive understanding of seemingly complicated time series models and their implications. The book presents methodologies for time series analysis in a simplified, example-based approach. Using graphics, the authors discuss each presented example in

  14. Hierarchical Bayesian modelling of gene expression time series across irregularly sampled replicates and clusters.

    Science.gov (United States)

    Hensman, James; Lawrence, Neil D; Rattray, Magnus

    2013-08-20

    Time course data from microarrays and high-throughput sequencing experiments require simple, computationally efficient and powerful statistical models to extract meaningful biological signal, and for tasks such as data fusion and clustering. Existing methodologies fail to capture either the temporal or replicated nature of the experiments, and often impose constraints on the data collection process, such as regularly spaced samples, or similar sampling schema across replications. We propose hierarchical Gaussian processes as a general model of gene expression time-series, with application to a variety of problems. In particular, we illustrate the method's capacity for missing data imputation, data fusion and clustering. The method can impute data which is missing both systematically and at random: in a hold-out test on real data, performance is significantly better than commonly used imputation methods. The method's ability to model inter- and intra-cluster variance leads to more biologically meaningful clusters. The approach removes the necessity for evenly spaced samples, an advantage illustrated on a developmental Drosophila dataset with irregular replications. The hierarchical Gaussian process model provides an excellent statistical basis for several gene-expression time-series tasks. It has only a few additional parameters over a regular GP, has negligible additional complexity, is easily implemented and can be integrated into several existing algorithms. Our experiments were implemented in Python, and are available from the authors' website: http://staffwww.dcs.shef.ac.uk/people/J.Hensman/.

  15. Online Self-Organizing Network Control with Time Averaged Weighted Throughput Objective

    Directory of Open Access Journals (Sweden)

    Zhicong Zhang

    2018-01-01

    Full Text Available We study an online multisource multisink queueing network control problem characterized by a self-organizing network structure and self-organizing job routing. We decompose the self-organizing queueing network control problem into a series of interrelated Markov Decision Processes and construct a control decision model for them based on the coupled reinforcement learning (RL) architecture. To maximize the mean time-averaged weighted throughput of the jobs through the network, we propose a reinforcement learning algorithm with time-averaged reward to deal with the control decision model and obtain a control policy integrating the job routing selection strategy and the job sequencing strategy. Computational experiments verify the learning ability and the effectiveness of the proposed reinforcement learning algorithm applied to the investigated self-organizing network control problem.

  16. Relevance of the ICRP biokinetic model for dietary organically bound tritium

    International Nuclear Information System (INIS)

    Trivedi, A.

    1999-10-01

    Ingested dietary tritium can participate in metabolic processes, and become synthesized into organically bound tritium in the tissues and organs. The distribution and retention of the organically bound tritium throughout the body are much different than tritium in the body water. The International Commission on Radiological Protection (ICRP) Publication 56 (1989) has a biokinetic model to calculate dose from the ingestion of organically bound dietary tritium. The model predicts that the dose from the ingestion of organically bound dietary tritium is about 2.3 times higher than from the ingestion of the same activity of tritiated water. Under steady-state conditions, the calculated dose rate (using the first principle approach) from the ingestion of dietary organically bound tritium can be twice that from the ingestion of tritiated water. For an adult, the upper-bound dose estimate for the ingestion of dietary organically bound tritium is estimated to be close to 2.3 times higher than that of tritiated water. Therefore, given the uncertainty in the dose calculation with respect to the actual relevant dose, the ICRP biokinetic model for organically bound tritium is sufficient for dosimetry for adults. (author)

  17. Modeling the acid-base chemistry of organic solutes in Adirondack, New York, lakes

    Science.gov (United States)

    Driscoll, Charles T.; Lehtinen, Michael D.; Sullivan, Timothy J.

    1994-02-01

    Data from the large and diverse Adirondack Lake Survey were used to calibrate four simple organic acid analog models in an effort to quantify the influence of naturally occurring organic acids on lake water pH and acid-neutralizing capacity (ANC). The organic acid analog models were calibrated to observations of pH, dissolved organic carbon (DOC), and organic anion (An-) concentrations from a reduced data set representing 1128 individual lake samples, expressed as 41 observations of mean pH, in intervals of 0.1 pH units from pH 3.9 to 7.0. Of the four organic analog approaches examined, including the Oliver et al. (1983) model, as well as monoprotic, diprotic, and triprotic representations, the triprotic analog model yielded the best fit (r2 = 0.92) to the observed data. Moreover, the triprotic model was qualitatively consistent with observed patterns of change in organic solute charge density as a function of pH. A low calibrated value for the first H+ dissociation constant (pKa1 = 2.62) and the observation that organic anion concentrations were significant even at very low pH suggest the presence of strongly acidic functional groups. Inclusion of organic acidity in model calculations resulted in good agreement between measured and predicted values of lake water pH and ANC. Assessments to project the response of surface waters to future changes in atmospheric deposition, through the use of acidification models, will need to include representations of organic acids in model structure to make accurate predictions of pH and ANC.
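
    The triprotic analog idea can be illustrated with a small speciation calculation that converts three successive dissociation constants into an average organic anion charge; only pKa1 = 2.62 comes from the text above, while pKa2, pKa3 and the site density below are arbitrary placeholders.

```python
import numpy as np

def triprotic_anion_charge(pH, pKa=(2.62, 5.7, 8.8), site_density=10.0):
    """Average negative organic charge (ueq per mg C) of a triprotic organic acid.

    site_density is the total acidic site content (ueq/mg C); pKa2, pKa3 and the
    site density are placeholders, and only pKa1 follows the text above.
    """
    h = 10.0 ** (-np.asarray(pH, dtype=float))
    k1, k2, k3 = (10.0 ** (-p) for p in pKa)
    denom = h**3 + k1 * h**2 + k1 * k2 * h + k1 * k2 * k3
    # average protons dissociated per molecule = 1*a1 + 2*a2 + 3*a3
    avg_charge = (k1 * h**2 + 2 * k1 * k2 * h + 3 * k1 * k2 * k3) / denom
    return site_density / 3.0 * avg_charge

for pH in (4.0, 5.0, 6.0, 7.0):
    print(pH, round(float(triprotic_anion_charge(pH)), 2))
```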

  18. Similarity-based search of model organism, disease and drug effect phenotypes

    KAUST Repository

    Hoehndorf, Robert

    2015-02-19

    Background: Semantic similarity measures over phenotype ontologies have been demonstrated to provide a powerful approach for the analysis of model organism phenotypes, the discovery of animal models of human disease, novel pathways, gene functions, druggable therapeutic targets, and determination of pathogenicity. Results: We have developed PhenomeNET 2, a system that enables similarity-based searches over a large repository of phenotypes in real-time. It can be used to identify strains of model organisms that are phenotypically similar to human patients, diseases that are phenotypically similar to model organism phenotypes, or drug effect profiles that are similar to the phenotypes observed in a patient or model organism. PhenomeNET 2 is available at http://aber-owl.net/phenomenet. Conclusions: Phenotype-similarity searches can provide a powerful tool for the discovery and investigation of molecular mechanisms underlying an observed phenotypic manifestation. PhenomeNET 2 facilitates user-defined similarity searches and allows researchers to analyze their data within a large repository of human, mouse and rat phenotypes.

  19. Electrochemical model of the polyaniline based organic memristive device

    International Nuclear Information System (INIS)

    Demin, V. A.; Erokhin, V. V.; Kashkarov, P. K.; Kovalchuk, M. V.

    2014-01-01

    The electrochemical organic memristive device with a polyaniline active layer is a stand-alone device designed and realized for the reproduction of some synapse properties in innovative electronic circuits, including neuromorphic networks capable of learning. In this work, a new theoretical model of the polyaniline memristive device is presented. The developed model of organic memristive functioning was based on a detailed consideration of the possible electrochemical processes occurring in the active zone of this device. Results of the calculation have demonstrated not only a qualitative explanation of the characteristics observed in the experiment but also quantitative similarity of the resultant current values. It is shown how the memristive device could behave at zero potential difference relative to the reference electrode. This improved model can establish a basis for the design and prediction of properties of more complicated circuits and systems (including stochastic ones) based on organic memristive devices.

  20. Accurate Measurement of the Optical Constants n and k for a Series of 57 Inorganic and Organic Liquids for Optical Modeling and Detection.

    Science.gov (United States)

    Myers, Tanya L; Tonkyn, Russell G; Danby, Tyler O; Taubman, Matthew S; Bernacki, Bruce E; Birnbaum, Jerome C; Sharpe, Steven W; Johnson, Timothy J

    2018-04-01

    For optical modeling and other purposes, we have created a library of 57 liquids for which we have measured the complex optical constants n and k. These liquids vary in their nature, ranging in properties that include chemical structure, optical band strength, volatility, and viscosity. By obtaining the optical constants, one can model most optical phenomena in media and at interfaces, including reflection, refraction, and dispersion. Based on the works of others, we have developed improved protocols using multiple path lengths to determine the optical constants n/k for dozens of liquids, including inorganic, organic, and organophosphorus compounds. Detailed descriptions of the measurement and data reduction protocols are discussed; agreement of the derived optical constant n and k values with literature values is presented. We also present results using the n/k values as applied to an optical modeling scenario whereby the derived data are presented and tested for models of 1 µm and 100 µm layers of dimethyl methylphosphonate (DMMP) on both metal (aluminum) and dielectric (soda lime glass) substrates to show substantial differences between the reflected signal from highly reflective substrates and less-reflective substrates.

  1. Quantum trigonometric Calogero-Sutherland model, irreducible characters and Clebsch-Gordan series for the exceptional algebra E7

    International Nuclear Information System (INIS)

    Fernandez Nunez, J.; Garcia Fuertes, W.; Perelomov, A.M.

    2005-01-01

    We reexpress the quantum Calogero-Sutherland model for the Lie algebra E7 and the particular value of the coupling constant κ=1 by using the fundamental irreducible characters of the algebra as dynamical variables. For that, we need to develop a systematic procedure to obtain all the Clebsch-Gordan series required to perform the change of variables. We describe how the resulting quantum Hamiltonian operator can be used to compute more characters and Clebsch-Gordan series for this exceptional algebra

  2. Time-series panel analysis (TSPA): multivariate modeling of temporal associations in psychotherapy process.

    Science.gov (United States)

    Ramseyer, Fabian; Kupper, Zeno; Caspar, Franz; Znoj, Hansjörg; Tschacher, Wolfgang

    2014-10-01

    Processes occurring in the course of psychotherapy are characterized by the simple fact that they unfold in time and that the multiple factors engaged in change processes vary highly between individuals (idiographic phenomena). Previous research, however, has neglected the temporal perspective by its traditional focus on static phenomena, which were mainly assessed at the group level (nomothetic phenomena). To support a temporal approach, the authors introduce time-series panel analysis (TSPA), a statistical methodology explicitly focusing on the quantification of temporal, session-to-session aspects of change in psychotherapy. TSPA-models are initially built at the level of individuals and are subsequently aggregated at the group level, thus allowing the exploration of prototypical models. TSPA is based on vector auto-regression (VAR), an extension of univariate auto-regression models to multivariate time-series data. The application of TSPA is demonstrated in a sample of 87 outpatient psychotherapy patients who were monitored by postsession questionnaires. Prototypical mechanisms of change were derived from the aggregation of individual multivariate models of psychotherapy process. In a 2nd step, the associations between mechanisms of change (TSPA) and pre- to postsymptom change were explored. TSPA allowed a prototypical process pattern to be identified, where patient's alliance and self-efficacy were linked by a temporal feedback-loop. Furthermore, therapist's stability over time in both mastery and clarification interventions was positively associated with better outcomes. TSPA is a statistical tool that sheds new light on temporal mechanisms of change. Through this approach, clinicians may gain insight into prototypical patterns of change in psychotherapy. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  3. Time Series Neural Network Model for Part-of-Speech Tagging Indonesian Language

    Science.gov (United States)

    Tanadi, Theo

    2018-03-01

    Part-of-speech tagging (POS tagging) is an important part of natural language processing. Many methods have been used to do this task, including neural networks. This paper models a neural network that attempts to do POS tagging. A time series neural network is modelled to solve the problems that a basic neural network faces when attempting to do POS tagging. In order to enable the neural network to take text data as input, the text data is first clustered using Brown Clustering, resulting in a binary dictionary that the neural network can use. To further improve the accuracy of the neural network, other features such as the POS tag, suffix, and affix of previous words are also fed to the neural network.

  4. A model to accumulate fractionated dose in a deforming organ

    International Nuclear Information System (INIS)

    Yan Di; Jaffray, D.A.; Wong, J.W.

    1999-01-01

    Purpose: Measurements of internal organ motion have demonstrated that daily organ deformation exists throughout the course of radiation treatment. However, a method of constructing the resultant dose delivered to the organ volume remains a difficult challenge. In this study, a model to quantify internal organ motion and a method to construct a cumulative dose in a deforming organ are introduced. Methods and Materials: A biomechanical model of an elastic body is used to quantify patient organ motion in the process of radiation therapy. Intertreatment displacements of volume elements in an organ of interest are calculated by applying a finite element method with boundary conditions obtained from multiple daily computed tomography (CT) measurements. Therefore, by also incorporating the measurements of daily setup error, the daily dose delivered to a deforming organ can be accumulated by tracking the position of volume elements in the organ. Furthermore, the distribution of patient-specific organ motion is also predicted during the early phase of treatment delivery using the daily measurements, and the cumulative dose distribution in the organ can then be estimated. This dose distribution will be updated whenever a new measurement becomes available, and used to reoptimize the ongoing treatment. Results: An integrated process to accumulate dosage in a daily deforming organ was implemented. In this process, intertreatment organ motion and setup error were systematically quantified and incorporated in the calculation of the cumulative dose. An example of rectal wall motion in a prostate treatment was applied to test the model. The displacements of volume elements in the rectal wall, as well as the resultant doses, were calculated. Conclusion: This study is intended to provide a systematic framework to incorporate daily patient-specific organ motion and setup error in the reconstruction of the cumulative dose distribution in an organ of interest. The realistic dose

  5. A likelihood-based time series modeling approach for application in dendrochronology to examine the growth-climate relations and forest disturbance history

    Science.gov (United States)

    A time series intervention analysis (TSIA) of dendrochronological data to infer the tree growth-climate-disturbance relations and forest disturbance history is described. Maximum likelihood is used to estimate the parameters of a structural time series model with components for ...

  6. Shifting physician prescribing to a preferred histamine-2-receptor antagonist. Effects of a multifactorial intervention in a mixed-model health maintenance organization.

    Science.gov (United States)

    Brufsky, J W; Ross-Degnan, D; Calabrese, D; Gao, X; Soumerai, S B

    1998-03-01

    This study was undertaken to determine whether a program of education, therapeutic reevaluation of eligible patients, and performance feedback could shift prescribing to cimetidine from other histamine-2 receptor antagonists, which commonly are used in the management of ulcers and reflux, and reduce costs without increasing rates of ulcer-related hospital admissions. This study used an interrupted monthly time series with comparison series in a large mixed-model health maintenance organization. Physicians employed in health centers (staff model) and physicians in independent medical groups contracting to provide health maintenance organization services (group model) participated. The comparative percentage prescribed of specific histamine-2 receptor antagonists (market share), total histamine-2 receptor antagonist prescribing, cost per histamine-2 receptor antagonist prescription, and the rate of hospitalization for gastrointestinal illness were assessed. In the staff model, therapeutic reevaluation resulted in a sudden increase in market share of the preferred histamine-2 receptor antagonist cimetidine (+53.8%) and a sudden decrease in ranitidine (-44.7%) and famotidine (-4.8%); subsequently, cimetidine market share grew by 1.1% per month. In the group model, therapeutic reevaluation resulted in increased cimetidine market share (+9.7%) and decreased prescribing of other histamine-2 receptor antagonists (ranitidine -11.6%; famotidine -1.2%). Performance feedback did not result in further changes in prescribing in either setting. Use of omeprazole, an expensive alternative, essentially was unchanged by the interventions, as were overall histamine-2 receptor antagonist prescribing and hospital admissions for gastrointestinal illnesses. This intervention, which cost approximately $60,000 to implement, resulted in estimated annual savings in histamine-2 receptor antagonist expenditures of $1.06 million. Annual savings in histamine-2 receptor antagonist expenditures

  7. Self-organized topology of recurrence-based complex networks

    International Nuclear Information System (INIS)

    Yang, Hui; Liu, Gang

    2013-01-01

    With rapid technological advancement, networks are almost everywhere in our daily life. Network theory leads to a new way to investigate the dynamics of complex systems. As a result, many methods have been proposed to construct a network from a nonlinear time series, including the partition of state space, visibility graphs, nearest neighbors, and recurrence approaches. However, most previous works focus on deriving the adjacency matrix to represent the complex network and extract new network-theoretic measures. Although the adjacency matrix provides connectivity information of nodes and edges, the network geometry can take variable forms. The research objective of this article is to develop a self-organizing approach to derive the steady geometric structure of a network from the adjacency matrix. We simulate the recurrence network as a physical system by treating the edges as springs and the nodes as electrically charged particles. Then, force-directed algorithms are developed to automatically organize the network geometry by minimizing the system energy. Further, a set of experiments was designed to investigate important factors (i.e., dynamical systems, network construction methods, force-model parameter, nonhomogeneous distribution) affecting this self-organizing process. Interestingly, experimental results show that the self-organized geometry recovers the attractor of the dynamical system that produced the adjacency matrix. This research addresses the question, “what is the self-organizing geometry of a recurrence network?” and provides a new way to reproduce the attractor or time series from the recurrence plot. As a result, novel network-theoretic measures (e.g., average path length and proximity ratio) can be achieved based on actual node-to-node distances in the self-organized network topology. The paper brings physical models into recurrence analysis and discloses the spatial geometry of recurrence networks
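
    A compact sketch of the force-directed idea described, springs along edges and Coulomb-like repulsion between all node pairs iterated until the layout relaxes, is given below; the force constants, step size and toy ring network are arbitrary, and this is not the authors' specific algorithm.

```python
import numpy as np

def force_directed_layout(adj, n_iter=500, k_spring=0.05, k_repel=0.01,
                          step=0.1, seed=4):
    """Relax a 2-D layout of a network given a symmetric 0/1 adjacency matrix."""
    rng = np.random.default_rng(seed)
    n = adj.shape[0]
    pos = rng.standard_normal((n, 2))
    for _ in range(n_iter):
        diff = pos[:, None, :] - pos[None, :, :]        # pairwise displacements
        dist = np.linalg.norm(diff, axis=-1) + 1e-9
        repel = k_repel * diff / dist[..., None] ** 3    # Coulomb-like push apart
        attract = -k_spring * adj[..., None] * diff      # springs along edges
        force = (repel + attract).sum(axis=1)
        pos += step * force
    return pos

# Small ring network as a toy adjacency matrix
n = 12
adj = np.zeros((n, n))
for i in range(n):
    adj[i, (i + 1) % n] = adj[(i + 1) % n, i] = 1
layout = force_directed_layout(adj)
print(np.round(layout[:3], 2))
```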

  8. PENGEMBANGAN FOIL NACA SERI 2412 SEBAGAI SISTEM PENYELAMAN MODEL KAPAL SELAM

    Directory of Open Access Journals (Sweden)

    Ali Munazid

    2015-06-01

    Full Text Available A foil shape generates a lift force when a fluid flows over it, owing to the interaction between the fluid flow and the foil surface, which makes the pressure on the upper surface lower than on the lower surface. Foil theory can be applied to the hydroplanes of a submarine as a diving system: by inverting the foil, the lift force becomes a downward force, which allows the submarine to dive, hover and maneuver under water, much as an aircraft flies and glides using its wings. Research and observations were carried out on the diving capability (diving plane) of a NACA series 2412 foil on a submarine model, by determining the lift coefficient Cl in the laboratory, designing the submarine hull form, and analyzing the forces acting on the submarine model; when the sum of the upward forces is lower than the downward forces, the submarine is able to dive. Application of the hydroplane as a diving plane is feasible; the diving capability is influenced by the hydroplane flip angle and the model speed: the greater the speed and flip angle, the greater the diving depth that can be achieved.

  9. Public invited to Appalachian Studies Film Series

    OpenAIRE

    Elliott, Jean

    2004-01-01

    The Appalachian Studies Program at Virginia Tech is hosting a series of notable artistic and documentary films. The films deal with themes or issues covered in Appalachian Studies courses and are organized historically, touching upon issues common to all Appalachians.

  10. Safety Cultural Competency Modeling in Nuclear Organizations

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sa Kil; Oh, Yeon Ju; Luo, Meiling; Lee, Yong Hee [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-05-15

    The nuclear safety cultural competency model should be supplemented through a bottom-up approach such as behavioral event interviews. The developed model, however, is meaningful for determining what should be addressed to enhance the safety cultural competency of nuclear organizations. More details of the development process, results, and applications will be introduced later. Organizational culture includes safety culture in terms of its organizational characteristics.

  11. Harmonic regression of Landsat time series for modeling attributes from national forest inventory data

    Science.gov (United States)

    Barry T. Wilson; Joseph F. Knight; Ronald E. McRoberts

    2018-01-01

    Imagery from the Landsat Program has been used frequently as a source of auxiliary data for modeling land cover, as well as a variety of attributes associated with tree cover. With ready access to all scenes in the archive since 2008 due to the USGS Landsat Data Policy, new approaches to deriving such auxiliary data from dense Landsat time series are required. Several...
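
    Harmonic regression of a reflectance time series amounts to ordinary least squares on sine and cosine terms of the annual cycle; the sketch below uses a placeholder number of harmonics and synthetic NDVI-like observations on scattered acquisition dates.

```python
import numpy as np

def fit_harmonics(doy, values, n_harmonics=2, period=365.25):
    """Fit values ~ intercept + sum of sin/cos terms of the annual cycle."""
    cols = [np.ones_like(doy, dtype=float)]
    for k in range(1, n_harmonics + 1):
        w = 2.0 * np.pi * k * doy / period
        cols += [np.cos(w), np.sin(w)]
    X = np.column_stack(cols)
    beta, *_ = np.linalg.lstsq(X, values, rcond=None)
    return beta, X @ beta            # coefficients and fitted seasonal curve

# Hypothetical NDVI-like observations on scattered day-of-year acquisition dates
rng = np.random.default_rng(5)
doy = np.sort(rng.uniform(0, 365, 40))
ndvi = 0.5 + 0.25 * np.sin(2 * np.pi * (doy - 120) / 365) + 0.03 * rng.standard_normal(40)
beta, fitted = fit_harmonics(doy, ndvi)
print(np.round(beta, 3))
```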

  12. Integrated stratigraphy of an organic matter enriched pelagic series (''black shales''). The Aptian-Albian of the Marches - Umbria basin (central Italy); Stratigraphie integree d'une serie pelagique a horizons enrichis en matiere organique (''black shales''). L'Aptien-Albien du bassin de Marches - Ombrie (Italie centrale)

    Energy Technology Data Exchange (ETDEWEB)

    Fiet, N

    1998-10-23

    The Aptian-Albian series of the Marches-Umbria basin is considered a field analogue of most basin deposits of the same age located in the Atlantic domain. It corresponds to pelagic sedimentation with alternations of marls, black shales, and limestones. The study of the black shale series has been carried out using a combination of petrological, geochemical and palynological data. The integration of these data makes it possible to propose a detailed typology of these beds, to define a deposition mode with respect to the organic matter content and to specify the location of sources and transfer pathways. A close relationship between the deposition of the black shales and the development of delta zones on the North-Gondwana margin is shown. A comparison with sub-recent analogues explains their rhythmical organization within the sedimentation. A cyclo-stratigraphical approach to the overall series has been performed using the analysis of the sedimentary rhythms. A detailed time calibration (< 100 ka) of the Aptian and Albian epochs is proposed according to the planktonic foraminifera, calcareous nannofossil and dinocyst populations. The M-0 magnetic chron has been dated to 116.7 ± 0.7 Ma. The combination of all stratigraphical approaches has permitted a subdivision of the series into deposition sequences. The forcing phenomena that led to the genesis of these sedimentary bodies are probably of astronomical-climatic origin. A relative sea-level curve has then been constructed and compared with the existing reference curves published for the worldwide ocean and the Russian platform. The strong similarities between these curves and the amplitude of the relative variations (up to 80 m) suggest a glacio-eustatic control of the sedimentation. Thus, several glaciation phases are proposed according to the low sea-level deposits identified in the series (upper Gargasian, Clansayesian, upper Albian, middle Vraconian). (J.S.)

  13. Software Engineering Laboratory Series: Collected Software Engineering Papers. Volume 14

    Science.gov (United States)

    1996-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  14. Software Engineering Laboratory Series: Collected Software Engineering Papers. Volume 15

    Science.gov (United States)

    1997-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  15. Software Engineering Laboratory Series: Collected Software Engineering Papers. Volume 13

    Science.gov (United States)

    1995-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  16. iVAR: a program for imputing missing data in multivariate time series using vector autoregressive models.

    Science.gov (United States)

    Liu, Siwei; Molenaar, Peter C M

    2014-12-01

    This article introduces iVAR, an R program for imputing missing data in multivariate time series on the basis of vector autoregressive (VAR) models. We conducted a simulation study to compare iVAR with three methods for handling missing data: listwise deletion, imputation with sample means and variances, and multiple imputation ignoring time dependency. The results showed that iVAR produces better estimates for the cross-lagged coefficients than do the other three methods. We demonstrate the use of iVAR with an empirical example of time series electrodermal activity data and discuss the advantages and limitations of the program.
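
    iVAR itself is an R program; purely as a sketch of the underlying idea, the Python snippet below initializes gaps with column means, fits a VAR(1) by least squares, and iteratively refills only the missing cells from the model's one-step predictions, using invented data.

```python
import numpy as np

def impute_with_var1(Y, n_sweeps=20):
    """Iteratively impute NaNs in a (T, k) multivariate series using a VAR(1) fit."""
    Y = Y.copy()
    mask = np.isnan(Y)
    col_means = np.nanmean(Y, axis=0)
    Y[mask] = np.take(col_means, np.nonzero(mask)[1])     # crude initial fill
    for _ in range(n_sweeps):
        X = np.hstack([np.ones((len(Y) - 1, 1)), Y[:-1]])
        B, *_ = np.linalg.lstsq(X, Y[1:], rcond=None)
        fitted = X @ B                                     # one-step predictions
        Y[1:][mask[1:]] = fitted[mask[1:]]                 # refill only the gaps
    return Y

rng = np.random.default_rng(6)
data = np.cumsum(rng.standard_normal((120, 2)), axis=0)
data[rng.random(data.shape) < 0.1] = np.nan                # knock out ~10% of cells
completed = impute_with_var1(data)
print(np.isnan(completed).sum())
```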

  17. Stages in the development of a model organism as a platform for mechanistic models in developmental biology: Zebrafish, 1970-2000.

    Science.gov (United States)

    Meunier, Robert

    2012-06-01

    Model organisms became an indispensable part of experimental systems in molecular developmental and cell biology, constructed to investigate physiological and pathological processes. They are thought to play a crucial role for the elucidation of gene function, complementing the sequencing of the genomes of humans and other organisms. Accordingly, historians and philosophers paid considerable attention to various issues concerning this aspect of experimental biology. With respect to the representational features of model organisms, that is, their status as models, the main focus was on generalization of phenomena investigated in such experimental systems. Model organisms have been said to be models for other organisms or a higher taxon. This, however, presupposes a representation of the phenomenon in question. I will argue that prior to generalization, model organisms allow researchers to built generative material models of phenomena - structures, processes or the mechanisms that explain them - through their integration in experimental set-ups that carve out the phenomena from the whole organism and thus represent them. I will use the history of zebrafish biology to show how model organism systems, from around 1970 on, were developed to construct material models of molecular mechanisms explaining developmental or physiological processes. Copyright © 2011 Elsevier Ltd. All rights reserved.

  18. A Behavior-Based Circuit Model of How Outcome Expectations Organize Learned Behavior in Larval "Drosophila"

    Science.gov (United States)

    Schleyer, Michael; Saumweber, Timo; Nahrendorf, Wiebke; Fischer, Benjamin; von Alpen, Desiree; Pauls, Dennis; Thum, Andreas; Gerber, Bertram

    2011-01-01

    Drosophila larvae combine a numerically simple brain, a correspondingly moderate behavioral complexity, and the availability of a rich toolbox for transgenic manipulation. This makes them attractive as a study case when trying to achieve a circuit-level understanding of behavior organization. From a series of behavioral experiments, we suggest a…

  19. Modelling the behaviour of organic degradation products

    International Nuclear Information System (INIS)

    Cross, J.E.; Ewart, F.T.; Greenfield, B.F.

    1989-03-01

    Results are presented from recent studies at Harwell which show that the degradation products which are formed when certain organic waste materials are exposed to the alkaline conditions typical of a cementitious environment, can enhance the solubility of plutonium, even at pH values as high as 12, by significant factors. Characterisation of the degradation products has been undertaken but the solubility enhancement does not appear to be related to the concentration of any of the major organic species that have been identified in the solutions. While it has not been possible to identify by analysis the organic ligand responsible for the increased solubility of plutonium, the behaviour of D-Saccharic acid does approach the behaviour of the degradation products. The PHREEQE code has been used to simulate the solubility of plutonium in the presence of D-Saccharic acid and other model degradation products, in order to explain the solubility enhancement. The extrapolation of the experimental conditions to the repository is the major objective, but in this work the ability of a model to predict the behaviour of plutonium over a range of experimental conditions has been tested. (author)

  20. Modification of SWAT model for simulation of organic matter in Korean watersheds.

    Science.gov (United States)

    Jang, Jae-Ho; Jung, Kwang-Wook; Gyeong Yoon, Chun

    2012-01-01

    The focus of water quality modeling of Korean streams needs to be shifted from dissolved oxygen to algae or organic matter. In particular, the structure of water quality models should be modified to simulate the biochemical oxygen demand (BOD), which is a key factor in calculating total maximum daily loads (TMDLs) in Korea, using 5-day BOD determined in the laboratory (Bottle BOD(5)). Considering the limitations in simulating organic matter under domestic conditions, we attempted to model total organic carbon (TOC) as well as BOD by using a watershed model. For this purpose, the Soil and Water Assessment Tool (SWAT) model was modified and extended to achieve better correspondence between the measured and simulated BOD and TOC concentrations. For simulated BOD in the period 2004-2008, the Nash-Sutcliffe model efficiency coefficient increased from a value of -2.54 to 0.61. Another indicator of organic matter, namely, the simulated TOC concentration showed that the modified SWAT adequately reflected the observed values. The improved model can be used to predict organic matter and hence, may be a potential decision-making tool for TMDLs. However, it needs further testing for longer simulation periods and other catchments.
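
    The Nash-Sutcliffe efficiency quoted above is straightforward to compute from paired observed and simulated series; the helper below uses made-up BOD values purely for illustration.

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2).

    1.0 is a perfect fit; values <= 0 mean the model is no better than
    simply predicting the observed mean. Inputs must be paired observations.
    """
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum(
        (observed - observed.mean()) ** 2)

# Hypothetical weekly BOD concentrations (mg/L)
obs = np.array([2.1, 2.4, 3.0, 2.8, 2.2, 1.9, 2.6])
sim = np.array([2.0, 2.5, 2.7, 2.9, 2.3, 2.0, 2.4])
print(round(nash_sutcliffe(obs, sim), 3))
```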