WorldWideScience

Sample records for model technical series

  1. DIY Solar Market Analysis Webinar Series: Solar Resource and Technical

    Science.gov (United States)

    DIY Solar Market Analysis Webinar Series: Solar Resource and Technical Potential. Wednesday, June 11, 2014. Part of NREL's Do-It-Yourself Solar Market Analysis webinar series for State, Local, and Tribal Governments.

  2. Technical Manual: 2002 Series GED Tests

    Science.gov (United States)

    Ezzelle, Carol; Setzer, J. Carl

    2009-01-01

    This manual was written to provide technical information regarding the 2002 Series GED (General Educational Development) Tests. Throughout this manual, documentation is provided regarding the development of the GED Tests, data collection activities, as well as reliability and validity evidence. The purpose of this manual is to provide evidence…

  3. Reviving Graduate Seminar Series through Non-Technical Presentations

    Science.gov (United States)

    Madihally, Sundararajan V.

    2011-01-01

    Most chemical engineering programs that offer M.S. and Ph.D. degrees have a common seminar series for all the graduate students. Many would agree that seminars lack student interest, leading to ineffectiveness. We questioned the possibility of adding value to the seminar series by incorporating non-technical topics that may be more important to…

  4. Technical discussions on Emissions and Atmospheric Modeling (TEAM)

    Science.gov (United States)

    Frost, G. J.; Henderson, B.; Lefer, B. L.

    2017-12-01

    A new informal activity, Technical discussions on Emissions and Atmospheric Modeling (TEAM), aims to improve the scientific understanding of emissions and atmospheric processes by leveraging resources through coordination, communication and collaboration between scientists in the Nation's environmental agencies. TEAM seeks to close information gaps that may be limiting emission inventory development and atmospheric modeling and to help identify related research areas that could benefit from additional coordinated efforts. TEAM is designed around webinars and in-person meetings on particular topics that are intended to facilitate active and sustained informal communications between technical staff at different agencies. The first series of TEAM webinars focuses on emissions of nitrogen oxides, a criteria pollutant impacting human and ecosystem health and a key precursor of ozone and particulate matter. Technical staff at Federal agencies with specific interests in emissions and atmospheric modeling are welcome to participate in TEAM.

  5. Manhattan Project Technical Series: The Chemistry of Uranium (I)

    International Nuclear Information System (INIS)

    Rabinowitch, E. I.; Katz, J. J.

    1947-01-01

    This constitutes Chapters 11 through 16, inclusive, of the Survey Volume on Uranium Chemistry prepared for the Manhattan Project Technical Series. Chapters are titled: Uranium Oxides, Sulfides, Selenides, and Tellurides; The Non-Volatile Fluorides of Uranium; Uranium Hexafluoride; Uranium-Chlorine Compounds; Bromides, Iodides, and Pseudo-Halides of Uranium; and Oxyhalides of Uranium.

  6. Manhattan Project Technical Series: The Chemistry of Uranium (I)

    Energy Technology Data Exchange (ETDEWEB)

    Rabinowitch, E. I. [Argonne National Lab. (ANL), Argonne, IL (United States); Katz, J. J. [Argonne National Lab. (ANL), Argonne, IL (United States)

    1947-03-10

    This constitutes Chapters 11 through 16, inclusive, of the Survey Volume on Uranium Chemistry prepared for the Manhattan Project Technical Series. Chapters are titled: Uranium Oxides, Sulfides, Selenides, and Tellurides; The Non-Volatile Fluorides of Uranium; Uranium Hexafluoride; Uranium-Chlorine Compounds; Bromides, Iodides, and Pseudo-Halides of Uranium; and Oxyhalides of Uranium.

  7. Introduction to Time Series Modeling

    CERN Document Server

    Kitagawa, Genshiro

    2010-01-01

    In time series modeling, the behavior of a certain phenomenon is expressed in relation to the past values of itself and other covariates. Since many important phenomena in statistical analysis are actually time series and the identification of conditional distribution of the phenomenon is an essential part of the statistical modeling, it is very important and useful to learn fundamental methods of time series modeling. Illustrating how to build models for time series using basic methods, "Introduction to Time Series Modeling" covers numerous time series models and the various tools f

  8. Series-produced Helium II Cryostats for the LHC Magnets Technical Choices, Industrialisation, Costs

    CERN Document Server

    Poncet, A

    2008-01-01

    Assembled in 8 continuous segments of approximately 2.7 km length each, the He II cryostats for the 1232 cryodipoles and 474 Short Straight Sections (SSS housing the quadrupoles) must fulfil tight technical requirements. They have been produced by industry in large series according to cost-effective industrial production methods to keep expenditure within the financial constraints of the project and assembled under contract at CERN. The specific technical requirements of the generic systems of the cryostat (vacuum, cryogenic, electrical distribution, magnet alignment) are briefly recalled, as well as the basic design choices leading to the definition of their components (vacuum vessels, thermal shielding, supporting systems). Early in the design process emphasis was placed on the feasibility of manufacturing techniques adequate for large series production of components, optimal tooling for time-effective assembly methods, and reliable quality assurance systems. An analytical review of the costs of the cryosta...

  9. Automotive Mechanics Technical Terms. English-Thai Lexicon. Introduction to Automotive Mechanics. Thai Version. Multi-Cultural Competency-Based Vocational/Technical Curricula Series.

    Science.gov (United States)

    Shin, Masako T.

    This English-Thai lexicon and program introduction for automotive mechanics is one of eight documents in the Multicultural Competency-Based Vocational/Technical Curricula Series. It is intended for use in postsecondary, adult, and preservice teacher and administrator education. The first two sections provide Thai equivalencies of English…

  10. Combination Welding Technical Terms. English-Thai Lexicon. Introduction to Combination Welding. Thai Version. Multi-Cultural Competency-Based Vocational/Technical Curricula Series.

    Science.gov (United States)

    Shin, Masako T.

    This English-Thai lexicon and program introduction for combination welding is one of eight documents in the Multicultural Competency-Based Vocational/Technical Curricula Series. It is intended for use in postsecondary, adult, and preservice teacher and administrator education. The first two sections provide Thai equivalencies of English…

  11. Predicting Jakarta composite index using hybrid of fuzzy time series and support vector regression models

    Science.gov (United States)

    Febrian Umbara, Rian; Tarwidi, Dede; Budi Setiawan, Erwin

    2018-03-01

    The paper discusses the prediction of the Jakarta Composite Index (JCI) in the Indonesia Stock Exchange. The study is based on JCI historical data for 1286 days to predict the value of JCI one day ahead. This paper proposes prediction in two stages: the first stage uses Fuzzy Time Series (FTS) to predict the values of ten technical indicators, and the second stage uses Support Vector Regression (SVR) to predict the value of JCI one day ahead, resulting in a hybrid prediction model, FTS-SVR. The performance of this combined prediction model is compared with that of a single-stage prediction model using SVR only. Ten technical indicators are used as input for each model.
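
    The second, SVR stage of such a hybrid can be sketched as follows. This is a minimal illustration, not the paper's implementation: the data are synthetic, and the ten "indicators" are random stand-ins for the values the fuzzy-time-series stage would supply.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# Synthetic stand-in for the paper's data: 1286 days of an index value
# driven (artificially, for illustration) by ten technical indicators.
n_days, n_indicators = 1286, 10
X = rng.normal(size=(n_days, n_indicators))      # indicator values per day
true_w = rng.normal(size=n_indicators)
y = X @ true_w + 0.1 * rng.normal(size=n_days)   # next-day index value (toy)

# Stage two of the hybrid: regress the next-day value on the indicators.
# (Stage one, the FTS forecast of the indicators, is omitted here; the
# "true" indicator values are fed in instead.)
split = 1000
model = SVR(kernel="rbf", C=10.0, epsilon=0.01)
model.fit(X[:split], y[:split])
pred = model.predict(X[split:])
rmse = float(np.sqrt(np.mean((pred - y[split:]) ** 2)))
```

    In the full FTS-SVR pipeline, the test-time rows of X would be replaced by the stage-one FTS forecasts of the indicators.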

  12. Stochastic models for time series

    CERN Document Server

    Doukhan, Paul

    2018-01-01

    This book presents essential tools for modelling non-linear time series. The first part of the book describes the main standard tools of probability and statistics that directly apply to the time series context to obtain a wide range of modelling possibilities. Functional estimation and bootstrap are discussed, and stationarity is reviewed. The second part describes a number of tools from Gaussian chaos and proposes a tour of linear time series models. It goes on to address nonlinearity from polynomial or chaotic models for which explicit expansions are available, then turns to Markov and non-Markov linear models and discusses Bernoulli shifts time series models. Finally, the volume focuses on the limit theory, starting with the ergodic theorem, which is seen as the first step for statistics of time series. It defines the distributional range to obtain generic tools for limit theory under long or short-range dependences (LRD/SRD) and explains examples of LRD behaviours. More general techniques (central limit ...

  13. Multiple Indicator Stationary Time Series Models.

    Science.gov (United States)

    Sivo, Stephen A.

    2001-01-01

    Discusses the propriety and practical advantages of specifying multivariate time series models in the context of structural equation modeling for time series and longitudinal panel data. For time series data, the multiple indicator model specification improves on classical time series analysis. For panel data, the multiple indicator model…

  14. Manhattan Project Technical Series The Chemistry of Uranium (I) Chapters 1-10

    International Nuclear Information System (INIS)

    Rabinowitch, E. I.; Katz, J. J.

    1946-01-01

    This constitutes Chapters 1 through 10, inclusive, of the Survey Volume on Uranium Chemistry prepared for the Manhattan Project Technical Series. Chapters are titled: Nuclear Properties of Uranium; Properties of the Uranium Atom; Uranium in Nature; Extraction of Uranium from Ores and Preparation of Uranium Metal; Physical Properties of Uranium Metal; Chemical Properties of Uranium Metal; Intermetallic Compounds and Alloy Systems of Uranium; The Uranium-Hydrogen System; Uranium Borides, Carbides, and Silicides; and Uranium Nitrides, Phosphides, Arsenides, and Antimonides.

  15. Engineering report : technical review of the GPSI model G3000 series multi-sensor controller system

    Energy Technology Data Exchange (ETDEWEB)

    Gibson, S.; Campbell, I. [Gas Protection Systems Inc., Maple Ridge, BC (Canada)

    2001-12-01

    The Enviro Sentry 24/7 model G3002 remote computer was developed by Gas Protection Systems Inc. (GPSI). This low-cost universal detector/controller consists of a stand-alone, scalable network connected in a daisy-chain topology. M. Collyer reviewed the capabilities of the GPSI model G3000 series multi-sensor controller system from an engineering perspective and presented an independent opinion on its performance and operation. The evaluation was based on a randomly selected production unit supplied by GPSI. M. Collyer applied widely accepted industry principles and electronic laboratory testing methods, prepared schematic diagrams, and presented operator ratings. A strength of the system's viability in the global market is that its use is not constrained by the regulations or standards of any country, province, state or region. It can be used to provide solutions for air quality, protection and energy management. In particular, the G3000 series provides continuous protection and intelligent management in combustible gas detection; seismic risk mitigation; toxic gas early detection; air quality management; and energy conservation. 1 tab.

  16. Modeling Non-Gaussian Time Series with Nonparametric Bayesian Model.

    Science.gov (United States)

    Xu, Zhiguang; MacEachern, Steven; Xu, Xinyi

    2015-02-01

    We present a class of Bayesian copula models whose major components are the marginal (limiting) distribution of a stationary time series and the internal dynamics of the series. We argue that these are the two features with which an analyst is typically most familiar, and hence that these are natural components with which to work. For the marginal distribution, we use a nonparametric Bayesian prior distribution along with a cdf-inverse cdf transformation to obtain large support. For the internal dynamics, we rely on the traditionally successful techniques of normal-theory time series. Coupling the two components gives us a family of (Gaussian) copula transformed autoregressive models. The models provide coherent adjustments of time scales and are compatible with many extensions, including changes in volatility of the series. We describe basic properties of the models, show their ability to recover non-Gaussian marginal distributions, and use a GARCH modification of the basic model to analyze stock index return series. The models are found to provide better fit and improved short-range and long-range predictions than Gaussian competitors. The models are extensible to a large variety of fields, including continuous time models, spatial models, models for multiple series, models driven by external covariate streams, and non-stationary models.
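
    The cdf / inverse-cdf coupling described above can be sketched in a few lines. This is a toy illustration of the general idea, not the authors' nonparametric Bayesian model: it uses the empirical cdf in place of a Bayesian prior, and a least-squares AR(1) fit for the internal dynamics.

```python
import numpy as np
from scipy.stats import norm, rankdata

rng = np.random.default_rng(1)

# A stationary series with a clearly non-Gaussian (log-normal) marginal,
# generated from a latent Gaussian AR(1).
n = 2000
z = np.zeros(n)
eps = rng.normal(size=n)
for t in range(1, n):
    z[t] = 0.8 * z[t - 1] + eps[t]
y = np.exp(z / 2)                  # observed, skewed series

# cdf / inverse-cdf step: the empirical cdf maps y into (0, 1); the normal
# quantile function then yields a series with Gaussian margins, on which
# normal-theory time series techniques apply.
u = rankdata(y) / (n + 1)          # empirical cdf values in (0, 1)
x = norm.ppf(u)                    # Gaussianised series

# Internal dynamics: least-squares AR(1) fit on the transformed series.
phi_hat = float(np.dot(x[1:], x[:-1]) / np.dot(x[:-1], x[:-1]))
```

    Because the transformation is monotone, the dependence structure survives it: the fitted coefficient recovers the latent AR parameter (0.8 here) despite the skewed observed marginal.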

  17. High-order fuzzy time-series based on multi-period adaptation model for forecasting stock markets

    Science.gov (United States)

    Chen, Tai-Liang; Cheng, Ching-Hsue; Teoh, Hia-Jong

    2008-02-01

    Stock investors usually make their short-term investment decisions according to recent stock information such as late market news, technical analysis reports, and price fluctuations. To reflect these short-term factors which impact stock price, this paper proposes a comprehensive fuzzy time-series model, which factors linear relationships between recent periods of stock prices and fuzzy logical relationships (nonlinear relationships) mined from the time series into its forecasting processes. In the empirical analysis, the TAIEX (Taiwan Stock Exchange Capitalization Weighted Stock Index) and HSI (Hang Seng Index) are employed as experimental datasets, and four recent fuzzy time-series models, Chen's (1996), Yu's (2005), Cheng's (2006) and Chen's (2007), are used as comparison models. In addition, for comparison with a conventional statistical method, the method of least squares is used to estimate auto-regressive models over the testing periods within the databases. The performance comparisons indicate that the multi-period adaptation model proposed in this paper can effectively improve the forecasting performance of conventional fuzzy time-series models, which only factor fuzzy logical relationships into their forecasting processes. From the empirical study, the traditional statistical method and the proposed model both reveal that stock price patterns in the Taiwan and Hong Kong stock markets are short-term.
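
    A first-order fuzzy time-series forecast in the spirit of Chen (1996), one of the comparison models above, can be sketched as follows. The interval count, the toy price series, and all names are illustrative choices, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy "index" series standing in for TAIEX closing prices.
prices = 100 + np.cumsum(rng.normal(0, 1, size=300))

# 1) Partition the universe of discourse into k equal intervals.
k = 7
lo, hi = prices.min() - 1, prices.max() + 1
edges = np.linspace(lo, hi, k + 1)
mids = (edges[:-1] + edges[1:]) / 2

# 2) Fuzzify: map each price to the interval (fuzzy set) it falls in.
states = np.clip(np.digitize(prices, edges) - 1, 0, k - 1)

# 3) First-order fuzzy logical relationship groups: state i -> successors.
groups = {i: [] for i in range(k)}
for a, b in zip(states[:-1], states[1:]):
    groups[int(a)].append(int(b))

# 4) Forecast by defuzzification: average midpoint of the successor group.
def forecast_next(state):
    succ = groups[int(state)]
    if not succ:
        return float(mids[int(state)])
    return float(np.mean(mids[succ]))

preds = np.array([forecast_next(s) for s in states[:-1]])
rmse = float(np.sqrt(np.mean((preds - prices[1:]) ** 2)))
```

    The multi-period adaptation proposed in the paper adds a weighted combination of recent prices on top of this basic fuzzy-relationship forecast.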

  18. Models for dependent time series

    CERN Document Server

    Tunnicliffe Wilson, Granville; Haywood, John

    2015-01-01

    Models for Dependent Time Series addresses the issues that arise and the methodology that can be applied when the dependence between time series is described and modeled. Whether you work in the economic, physical, or life sciences, the book shows you how to draw meaningful, applicable, and statistically valid conclusions from multivariate (or vector) time series data.The first four chapters discuss the two main pillars of the subject that have been developed over the last 60 years: vector autoregressive modeling and multivariate spectral analysis. These chapters provide the foundational mater

  19. Multi-Cultural Competency-Based Vocational Curricula. Food Service. Multi-Cultural Competency-Based Vocational/Technical Curricula Series.

    Science.gov (United States)

    Hepburn, Larry; Shin, Masako

    This document, one of eight in a multi-cultural competency-based vocational/technical curricula series, is on food service. This program is designed to run 24 weeks and cover 15 instructional areas: orientation, sanitation, management/planning, preparing food for cooking, preparing beverages, cooking eggs, cooking meat, cooking vegetables,…

  20. Multi-Cultural Competency-Based Vocational Curricula. Automotive Mechanics. Multi-Cultural Competency-Based Vocational/Technical Curricula Series.

    Science.gov (United States)

    Hepburn, Larry; Shin, Masako

    This document, one of eight in a multi-cultural competency-based vocational/technical curricula series, is on automotive mechanics. This program is designed to run 36 weeks and cover 10 instructional areas: the engine; drive trains--rear ends/drive shafts/manual transmission; carburetor; emission; ignition/tune-up; charging and starting;…

  1. Modeling of Volatility with Non-linear Time Series Model

    OpenAIRE

    Kim Song Yon; Kim Mun Chol

    2013-01-01

    In this paper, non-linear time series models are used to describe volatility in financial time series data. To describe volatility, two non-linear time series models are combined to form a TAR (Threshold Auto-Regressive) model with an AARCH (Asymmetric Auto-Regressive Conditional Heteroskedasticity) error term, and its parameter estimation is studied.
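
    A simulation sketch of this model class: a two-regime TAR(1) whose error follows an ARCH(1) process. The symmetric ARCH term here is a simplified stand-in for the paper's asymmetric AARCH error, and all parameter values are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(3)

# Two-regime TAR(1) with an ARCH(1) error term; the symmetric ARCH term is
# a simplified stand-in for the asymmetric AARCH error.
n = 3000
a1, a2 = 0.6, -0.4        # AR coefficients below / above the threshold 0
omega, alpha = 0.2, 0.5   # ARCH(1) parameters

y = np.zeros(n)
e = np.zeros(n)
h = np.ones(n)            # conditional variance of the error
z = rng.normal(size=n)

for t in range(1, n):
    h[t] = omega + alpha * e[t - 1] ** 2          # ARCH(1) variance
    e[t] = np.sqrt(h[t]) * z[t]
    coef = a1 if y[t - 1] <= 0.0 else a2          # threshold regime switch
    y[t] = coef * y[t - 1] + e[t]

# Volatility clustering shows up as autocorrelation in the squared errors.
vol_ac = float(np.corrcoef(e[1:] ** 2, e[:-1] ** 2)[0, 1])
```

    The TAR part captures regime-dependent mean dynamics; the (A)ARCH part captures the time-varying volatility that motivates the paper.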

  2. Time series modeling, computation, and inference

    CERN Document Server

    Prado, Raquel

    2010-01-01

    The authors systematically develop a state-of-the-art analysis and modeling of time series. … this book is well organized and well written. The authors present various statistical models for engineers to solve problems in time series analysis. Readers no doubt will learn state-of-the-art techniques from this book.-Hsun-Hsien Chang, Computing Reviews, March 2012My favorite chapters were on dynamic linear models and vector AR and vector ARMA models.-William Seaver, Technometrics, August 2011… a very modern entry to the field of time-series modelling, with a rich reference list of the current lit

  3. Hybrid Corporate Performance Prediction Model Considering Technical Capability

    Directory of Open Access Journals (Sweden)

    Joonhyuck Lee

    2016-07-01

    Many studies have tried to predict corporate performance and stock prices to enhance investment profitability using qualitative approaches such as the Delphi method. However, developments in data processing technology and machine-learning algorithms have resulted in efforts to develop quantitative prediction models in various managerial subject areas. We propose a quantitative corporate performance prediction model that applies the support vector regression (SVR) algorithm to solve the problem of the overfitting of training data and can be applied to regression problems. The proposed model optimizes the SVR training parameters based on the training data, using the genetic algorithm to achieve sustainable predictability in changeable markets and managerial environments. Technology-intensive companies represent an increasing share of the total economy. The performance and stock prices of these companies are affected by their financial standing and their technological capabilities. Therefore, we apply both financial indicators and technical indicators to establish the proposed prediction model. Here, we use time series data, including financial, patent, and corporate performance information of 44 electronic and IT companies. Then, we predict the performance of these companies as an empirical verification of the prediction performance of the proposed model.
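
    The parameter-optimisation step can be illustrated with a simple random search over SVR hyper-parameters, standing in for the paper's genetic algorithm. The data, search ranges, and search budget below are all invented for illustration.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVR

rng = np.random.default_rng(4)

# Toy stand-in for the paper's data set (financial and patent indicators
# for 44 companies): a small synthetic regression problem.
X = rng.normal(size=(200, 6))
w = rng.normal(size=6)
y = X @ w + 0.2 * rng.normal(size=200)

# Random search over SVR training parameters, a simple stand-in for the
# paper's genetic-algorithm optimisation.
best_score, best_params = -np.inf, None
for _ in range(20):
    params = {
        "C": float(10 ** rng.uniform(-1, 2)),
        "epsilon": float(10 ** rng.uniform(-3, 0)),
        "gamma": float(10 ** rng.uniform(-2, 1)),
    }
    score = cross_val_score(
        SVR(kernel="rbf", **params), X, y,
        cv=3, scoring="neg_mean_squared_error",
    ).mean()
    if score > best_score:
        best_score, best_params = score, params
```

    A genetic algorithm replaces the independent random draws with selection, crossover, and mutation over the same parameter space, which tends to converge faster on good regions.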

  4. Improved technical efficiency and exogenous factors in transportation demand for energy: An application of structural time series analysis to South Korean data

    International Nuclear Information System (INIS)

    Sa'ad, Suleiman

    2010-01-01

    This paper stresses the importance of incorporating the effects of improved technical efficiency and exogenous factors when estimating energy demand functions. Using annual time series data for the period 1973-2007 in the STSM (structural time series model) developed by Harvey et al., the paper estimates price and income elasticities of demand for energy, as well as the annual growth of the stochastic trend at the end of the estimation period. The results of the study reveal a long-run income elasticity of 1.37 and a price elasticity of -0.19. In addition, the underlying trend is generally stochastic and negatively sloping during the greater part of the estimation period. Finally, the estimated result from the structural time series model is compared with the results from the Johansen cointegration approach. These results suggest that income is the dominant factor in energy consumption. In addition, the coefficient of the linear trend is negative, supporting the results from the STSM.
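
    The stochastic-trend idea behind an STSM can be sketched with a local linear trend model filtered by a small Kalman recursion. The noise variances below are assumed known for illustration, whereas an STSM would estimate them by maximum likelihood; the simulated "demand" series is also invented.

```python
import numpy as np

rng = np.random.default_rng(5)

# Simulate a series with a stochastic (slowly drifting) trend plus noise,
# a toy analogue of energy demand with an underlying efficiency trend.
n = 200
slope = np.cumsum(0.01 * rng.normal(size=n)) - 0.05   # drifting slope
level = np.cumsum(slope)
y = level + 0.5 * rng.normal(size=n)

# Kalman filter for the local linear trend model:
#   y_t = mu_t + eps_t,  mu_t = mu_{t-1} + beta_{t-1} + eta_t,
#   beta_t = beta_{t-1} + zeta_t
T = np.array([[1.0, 1.0], [0.0, 1.0]])    # state transition
Z = np.array([1.0, 0.0])                  # observation vector
Q = np.diag([0.01, 0.0001])               # state noise (assumed known here)
H = 0.25                                  # observation noise variance

a = np.zeros(2)                           # state estimate (level, slope)
P = np.eye(2) * 1e4                       # diffuse-ish initial variance
trend = np.empty(n)
for t in range(n):
    # Predict.
    a = T @ a
    P = T @ P @ T.T + Q
    # Update.
    F = Z @ P @ Z + H                     # innovation variance (scalar)
    K = P @ Z / F                         # Kalman gain
    a = a + K * (y[t] - Z @ a)
    P = P - np.outer(K, Z @ P)
    trend[t] = a[0]                       # filtered stochastic trend

fit_err = float(np.sqrt(np.mean((trend[50:] - level[50:]) ** 2)))
```

    The filtered slope component a[1] is the analogue of the "annual growth of the stochastic trend" reported in the paper.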

  5. Improved technical efficiency and exogenous factors in transportation demand for energy: An application of structural time series analysis to South Korean data

    Energy Technology Data Exchange (ETDEWEB)

    Sa' ad, Suleiman [Surrey Centre for Energy Economics (SEEC), University of Surrey, Guildford (United Kingdom)

    2010-07-15

    This paper stresses the importance of incorporating the effects of improved technical efficiency and exogenous factors when estimating energy demand functions. Using annual time series data for the period 1973-2007 in the STSM (structural time series model) developed by Harvey et al., the paper estimates price and income elasticities of demand for energy, as well as the annual growth of the stochastic trend at the end of the estimation period. The results of the study reveal a long-run income elasticity of 1.37 and a price elasticity of -0.19. In addition, the underlying trend is generally stochastic and negatively sloping during the greater part of the estimation period. Finally, the estimated result from the structural time series model is compared with the results from the Johansen cointegration approach. These results suggest that income is the dominant factor in energy consumption. In addition, the coefficient of the linear trend is negative, supporting the results from the STSM. (author)

  6. TIME SERIES ANALYSIS USING A UNIQUE MODEL OF TRANSFORMATION

    Directory of Open Access Journals (Sweden)

    Goran Klepac

    2007-12-01

    The REFII model is an original mathematical model for time series data mining. Its main purpose is to automate time series analysis through a unique transformation model of time series. An advantage of this approach is that it links different methods for time series analysis, connects traditional data mining tools with time series, and supports the construction of new algorithms for analyzing time series. It is worth mentioning that the REFII model is not a closed system with a finite set of methods. At its core, it is a model for transforming the values of a time series, preparing data for different sets of methods based on the same transformation model within a problem domain. The REFII model offers a new approach to time series analysis based on a unique transformation model, which serves as a basis for all kinds of time series analysis. Its advantages include possible application in many different areas such as finance, medicine, voice recognition, face recognition and text mining.

  7. Building Chaotic Model From Incomplete Time Series

    Science.gov (United States)

    Siek, Michael; Solomatine, Dimitri

    2010-05-01

    This paper presents a number of novel techniques for building a predictive chaotic model from incomplete time series. A predictive chaotic model is built by reconstructing the time-delayed phase space from an observed time series, and predictions are made by a global model or by adaptive local models based on the dynamical neighbours found in the reconstructed phase space. In general, the building of any data-driven model depends on the completeness and quality of the data itself. However, complete data availability cannot always be guaranteed, since measurement or data transmission may intermittently fail. We propose two main solutions for dealing with incomplete time series: imputing and non-imputing methods. For imputing methods, we utilized interpolation methods (weighted sums of linear interpolations, Bayesian principal component analysis and cubic spline interpolation) and predictive models (neural network, kernel machine, chaotic model) for estimating the missing values. After imputing the missing values, phase space reconstruction and chaotic model prediction are executed as a standard procedure. For non-imputing methods, we reconstructed the time-delayed phase space from the observed time series with missing values. This reconstruction results in non-continuous trajectories; however, local model predictions can still be made from the other dynamical neighbours reconstructed from non-missing values. We implemented and tested these methods to construct a chaotic model for predicting storm surges at Hoek van Holland, the entrance of the Port of Rotterdam. The hourly surge time series is available for the period 1990-1996. For measuring the performance of the proposed methods, a synthetic time series with missing values, generated by applying a random missingness pattern to the original (complete) time series, is utilized. There exist two main performance measures used in this work: (1) error measures between the actual
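
    The phase-space reconstruction and local-model prediction described above (for the complete-data case) can be sketched as follows. The embedding dimension, delay, neighbour count, and the synthetic "surge" series are illustrative assumptions, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(6)

# Toy surrogate for the hourly surge series: a quasi-periodic signal
# plus observation noise.
t = np.arange(3000)
y = np.sin(0.1 * t) + 0.5 * np.sin(0.023 * t) + 0.05 * rng.normal(size=3000)

# Time-delayed phase space reconstruction with embedding dimension m and
# delay tau (both illustrative choices).
m, tau, horizon = 4, 8, 1
span = (m - 1) * tau
X = np.column_stack([y[span - i * tau: len(y) - i * tau] for i in range(m)])

train = X[:-500]                   # historical states for neighbour search

def local_predict(state, k=10):
    """Average the successors of the k nearest dynamical neighbours."""
    d = np.linalg.norm(train[:-horizon] - state, axis=1)
    idx = np.argsort(d)[:k]
    return float(np.mean(y[span + idx + horizon]))

test_idx = np.arange(len(X) - 500, len(X) - horizon)
preds = np.array([local_predict(X[j]) for j in test_idx])
actual = y[span + test_idx + horizon]
rmse = float(np.sqrt(np.mean((preds - actual) ** 2)))
```

    The non-imputing variant in the paper builds the same neighbour search over only those delay vectors whose coordinates are all observed, skipping vectors that touch missing values.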

  8. Self-organising mixture autoregressive model for non-stationary time series modelling.

    Science.gov (United States)

    Ni, He; Yin, Hujun

    2008-12-01

    Modelling non-stationary time series has been a difficult task for both parametric and nonparametric methods. One promising solution is to combine the flexibility of nonparametric models with the simplicity of parametric models. In this paper, the self-organising mixture autoregressive (SOMAR) network is adopted as such a mixture model. It breaks a time series into underlying segments and at the same time fits local linear regressive models to the clusters of segments. In this way, a global non-stationary time series is represented by a dynamic set of local linear regressive models. Neural gas is used for a more flexible structure of the mixture model. Furthermore, a new similarity measure has been introduced in the self-organising network to better quantify the similarity of time series segments. The network can be used naturally in modelling and forecasting non-stationary time series. Experiments on artificial, benchmark time series (e.g. Mackey-Glass) and real-world data (e.g. numbers of sunspots and Forex rates) are presented, and the results show that the proposed SOMAR network is effective and superior to other similar approaches.

  9. SERI Wind Energy Program

    Energy Technology Data Exchange (ETDEWEB)

    Noun, R. J.

    1983-06-01

    The SERI Wind Energy Program manages the areas of innovative research, wind systems analysis, and environmental compatibility for the U.S. Department of Energy. Since 1978, SERI wind program staff have conducted in-house aerodynamic and engineering analyses of novel concepts for wind energy conversion and have managed over 20 subcontracts to determine technical feasibility; the most promising of these concepts is the passive blade cyclic pitch control project. In the area of systems analysis, the SERI program has analyzed the impact of intermittent generation on the reliability of electric utility systems using standard utility planning models. SERI has also conducted methodology assessments. Environmental issues related to television interference and acoustic noise from large wind turbines have been addressed. SERI has identified the causes, effects, and potential control of acoustic noise emissions from large wind turbines.

  10. Performance of technical indicators in forecasting high-frequency foreign exchange rates

    Directory of Open Access Journals (Sweden)

    Václav Mastný

    2004-01-01

    This paper deals with technical analysis and its forecasting ability in the intraday foreign exchange market. The objective of this study is to investigate whether technical indicators are able to provide predictions superior to a "buy and hold" strategy. Each indicator is tested with a series of parameters on time series of different frequencies (5, 15, 30 and 60 min). The profitability of each indicator is examined in a simple trading model.
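
    A minimal version of such a test, one indicator against buy-and-hold, might look like this. The moving-average crossover rule, window lengths, and synthetic random-walk prices are illustrative assumptions; on a pure random walk, neither approach should win systematically.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic 5-minute FX series: a pure random walk, on which no indicator
# should systematically beat "buy and hold".
n = 5000
logret_true = 0.0001 * rng.normal(size=n)
price = 100.0 * np.exp(np.cumsum(logret_true))

def sma(x, w):
    """Simple moving average with window w (length len(x) - w + 1)."""
    c = np.cumsum(np.concatenate(([0.0], x)))
    return (c[w:] - c[:-w]) / w

# Crossover rule: long when the fast SMA is above the slow SMA, else flat.
fast, slow = 5, 30
f = sma(price, fast)[slow - fast:]        # align both SMAs on the same bars
s = sma(price, slow)
position = (f > s).astype(float)          # decided at the end of each bar

# Apply each position to the *next* bar's return (no look-ahead).
logret = np.diff(np.log(price))
strat_ret = position[:-1] * logret[slow - 1:]
strat_total = float(np.sum(strat_ret))
buy_hold = float(np.log(price[-1] / price[slow - 1]))
```

    Repeating this over a grid of window lengths and several bar frequencies reproduces the structure of the study: each (indicator, parameter, frequency) cell yields a strategy return to compare with buy-and-hold.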

  11. Time series modeling in traffic safety research.

    Science.gov (United States)

    Lavrenz, Steven M; Vlahogianni, Eleni I; Gkritza, Konstantina; Ke, Yue

    2018-08-01

    The use of statistical models for analyzing traffic safety (crash) data has been well-established. However, time series techniques have traditionally been underrepresented in the corresponding literature, due to challenges in data collection, along with a limited knowledge of proper methodology. In recent years, new types of high-resolution traffic safety data, especially in measuring driver behavior, have made time series modeling techniques an increasingly salient topic of study. Yet there remains a dearth of information to guide analysts in their use. This paper provides an overview of the state of the art in using time series models in traffic safety research, and discusses some of the fundamental techniques and considerations in classic time series modeling. It also presents ongoing and future opportunities for expanding the use of time series models, and explores newer modeling techniques, including computational intelligence models, which hold promise in effectively handling ever-larger data sets. The information contained herein is meant to guide safety researchers in understanding this broad area of transportation data analysis, and provide a framework for understanding safety trends that can influence policy-making. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. Modelling Social-Technical Attacks with Timed Automata

    DEFF Research Database (Denmark)

    David, Nicolas; David, Alexandre; Hansen, Rene Rydhof

    2015-01-01

    In this paper we develop an approach towards modelling socio-technical systems in general and socio-technical attacks in particular, using timed automata, and illustrate its application by a complex case study. Thanks to automated model checking and automata theory, we can automatically generate possible attacks in our model and perform analysis and simulation of both model and attack, revealing details about the specific interaction between attacker and victim. Using timed automata also allows for intuitive modelling of systems, in which quantities like time and cost can be easily added and analysed.

  13. Empirical investigation on modeling solar radiation series with ARMA–GARCH models

    International Nuclear Information System (INIS)

    Sun, Huaiwei; Yan, Dong; Zhao, Na; Zhou, Jianzhong

    2015-01-01

    Highlights: • Apply 6 ARMA–GARCH(-M) models to model and forecast solar radiation. • The ARMA–GARCH(-M) models produce more accurate radiation forecasts than conventional methods. • Show that ARMA–GARCH-M models are more effective for forecasting solar radiation mean and volatility. • The ARMA–EGARCH-M is robust and the ARMA–sGARCH-M is very competitive. - Abstract: Simulation of radiation is one of the most important issues in solar utilization. Time series models are useful tools for the estimation and forecasting of solar radiation series and their changes. In this paper, autoregressive moving average (ARMA) models with various generalized autoregressive conditional heteroskedasticity (GARCH) processes, namely ARMA–GARCH models, are evaluated for their effectiveness on radiation series. Six different GARCH approaches, comprising three different ARMA–GARCH models and the corresponding GARCH-in-mean (ARMA–GARCH-M) models, are applied to radiation data sets from two representative climate stations in China. Multiple evaluation metrics of modeling sufficiency are used to evaluate the performances of the models. The results show that the ARMA–GARCH(-M) models are effective for radiation series estimation. In both fitting and prediction of radiation series, the ARMA–GARCH(-M) models show better modeling sufficiency than traditional models, while the ARMA–EGARCH-M models are robust at both sites and the ARMA–sGARCH-M models appear very competitive. Comparisons of statistical diagnostics and model performance clearly show that the ARMA–GARCH-M models make the mean radiation equations more sufficient. The ARMA–GARCH(-M) models are recommended as the preferred method for modeling solar radiation series.
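
    The GARCH-in-mean structure the highlights refer to can be sketched by simulation: an AR(1) mean equation that also receives the conditional volatility as a regressor. All parameter values below are arbitrary illustrations, not estimates from the paper's radiation data.

```python
import numpy as np

rng = np.random.default_rng(8)

# AR(1)-GARCH(1,1)-in-mean simulation: the conditional standard deviation
# sqrt(h_t) enters the mean equation (the "-M" term).
n = 4000
phi, lam = 0.5, 0.3                 # AR coefficient, in-mean weight
omega, alpha, beta = 0.05, 0.10, 0.85

y = np.zeros(n)
e = np.zeros(n)
h = np.full(n, omega / (1.0 - alpha - beta))   # unconditional variance
z = rng.normal(size=n)

for t in range(1, n):
    h[t] = omega + alpha * e[t - 1] ** 2 + beta * h[t - 1]   # GARCH(1,1)
    e[t] = np.sqrt(h[t]) * z[t]
    y[t] = phi * y[t - 1] + lam * np.sqrt(h[t]) + e[t]       # "-M" term

# Diagnostics of the two model layers: persistence in the mean and
# autocorrelation in the squared innovations (volatility clustering).
mean_ac = float(np.corrcoef(y[1:], y[:-1])[0, 1])
vol_ac = float(np.corrcoef(e[1:] ** 2, e[:-1] ** 2)[0, 1])
```

    An EGARCH variant would replace the h recursion with one on log h, allowing asymmetric responses to positive and negative shocks.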

  14. Lag space estimation in time series modelling

    DEFF Research Database (Denmark)

    Goutte, Cyril

    1997-01-01

    The purpose of this article is to investigate some techniques for finding the relevant lag-space, i.e. input information, for time series modelling. This is an important aspect of time series modelling, as it conditions the design of the model through the regressor vector a.k.a. the input layer...

  15. Forecasting with nonlinear time series models

    DEFF Research Database (Denmark)

    Kock, Anders Bredahl; Teräsvirta, Timo

    In this paper, nonlinear models are restricted to mean nonlinear parametric models. Several such models popular in time series econometrics are presented and some of their properties discussed. This includes two models based on universal approximators: the Kolmogorov-Gabor polynomial model and two versions of a simple artificial neural network model. Techniques for generating multi-period forecasts from nonlinear models recursively are considered, and the direct (non-recursive) method for this purpose is mentioned as well. Forecasting with complex dynamic systems, albeit less frequently applied to economic forecasting problems, is briefly highlighted. A number of large published studies comparing macroeconomic forecasts obtained using different time series models are discussed, and the paper also contains a small simulation study comparing recursive and direct forecasts in a particular...

  16. Gulf Coast Section SPE Production Operations Study Group-technical highlights from a series of frac pack treatment symposiums

    Energy Technology Data Exchange (ETDEWEB)

    McLarty, J.M.; DeBonis, V.

    1995-12-31

    One of the main functions of the SPE is to provide a means for collection, dissemination, and exchange of technical information and to provide technical forums that afford opportunities for members to maintain and upgrade their technical competence. The large chapters (such as Houston SPE) located near many oil company headquarters have the advantage of being able to bring together a cross section of service company and operator personnel representing operations and research from major and independent operators. This paper describes a series of 1-day symposiums on frac pack technology that were organized by the Houston-based Gulf Coast Section SPE Production Operations Study Group. These study sessions provided a means for the local members of the industry to further develop a new technology as a team. Publishing the major focus and contributions of the seminars will allow sharing of the technology with chapters outside of Houston.

  17. Modelling conditional heteroscedasticity in nonstationary series

    NARCIS (Netherlands)

    Cizek, P.; Cizek, P.; Härdle, W.K.; Weron, R.

    2011-01-01

    A vast amount of econometrical and statistical research deals with modeling financial time series and their volatility, which measures the dispersion of a series at a point in time (i.e., conditional variance). Although financial markets have been experiencing many shorter and longer periods of

  18. Long Memory Models to Generate Synthetic Hydrological Series

    Directory of Open Access Journals (Sweden)

    Guilherme Armando de Almeida Pereira

    2014-01-01

    Full Text Available In Brazil, much of the energy production comes from hydroelectric plants whose planning is not trivial due to the strong dependence on rainfall regimes. This planning is accomplished through optimization models that use inputs such as synthetic hydrologic series generated from the statistical model PAR(p) (periodic autoregressive). Recently, Brazil began the search for alternative models able to capture the effects that the traditional PAR(p) model does not incorporate, such as long memory effects. Long memory in a time series can be defined as a significant dependence between lags separated by a long period of time. Thus, this research develops a study of the effects of long dependence in the series of streamflow natural energy in the South subsystem, in order to estimate a long memory model capable of generating synthetic hydrologic series.
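Long memory of the kind described above is commonly captured by fractional differencing, as in ARFIMA-type models. A minimal sketch (illustrative, not the paper's estimated model) of the (1 - B)^d filter, whose weights follow the recursion w_0 = 1, w_k = w_{k-1} * (k - 1 - d) / k:

```python
# Fractional differencing (1 - B)^d expanded as a truncated MA filter.
def fracdiff_weights(d, n_lags):
    """Binomial-expansion weights of (1 - B)^d up to n_lags."""
    w = [1.0]
    for k in range(1, n_lags + 1):
        w.append(w[-1] * (k - 1 - d) / k)
    return w

def fracdiff(series, d, n_lags=50):
    """Apply the truncated fractional-difference filter to a series."""
    w = fracdiff_weights(d, n_lags)
    out = []
    for t in range(len(series)):
        acc = 0.0
        for k, wk in enumerate(w):
            if t - k < 0:
                break
            acc += wk * series[t - k]
        out.append(acc)
    return out

# For d = 0.4 the first weights are 1, -0.4, -0.12, ... decaying hyperbolically,
# which is what lets the filter capture dependence between distant lags.
w = fracdiff_weights(0.4, 5)
```

For 0 < d < 0.5 the weights decay hyperbolically rather than geometrically, which is precisely the slow decay of autocorrelations that defines long memory.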

  19. RADON CONCENTRATION TIME SERIES MODELING AND APPLICATION DISCUSSION.

    Science.gov (United States)

    Stránský, V; Thinová, L

    2017-11-01

    In the year 2010 a continual radon measurement was established at Mladeč Caves in the Czech Republic using a continual radon monitor RADIM3A. In order to model the radon time series in the years 2010-15, the Box-Jenkins methodology, often used in econometrics, was applied. Because of the behavior of radon concentrations (RCs), a seasonal autoregressive integrated moving average model with exogenous variables (SARIMAX) was chosen to model the measured time series. This model uses the time series seasonality, previously acquired values and delayed atmospheric parameters to forecast RC. The developed model for the RC time series is called regARIMA(5,1,3). Model residuals could be retrospectively compared with seismic evidence of local or global earthquakes that occurred during the RC measurements. This technique enables us to assess whether continuously measured RC could serve as an earthquake precursor. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
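The "integrated" and seasonal parts of such a SARIMA-type model amount to differencing the series before ARMA modeling. A sketch on a synthetic radon-like series with a 24-hour cycle (hypothetical values, not the Mladeč data):

```python
import math

def difference(x, lag=1):
    """Return the lag-differenced series x_t - x_{t-lag}."""
    return [x[t] - x[t - lag] for t in range(lag, len(x))]

# Synthetic hourly "radon concentration": daily cycle plus a slow drift.
hours = range(72)
rc = [50 + 10 * math.sin(2 * math.pi * h / 24) + 0.2 * h for h in hours]

seasonal = difference(rc, lag=24)     # removes the 24-hour cycle
stationary = difference(seasonal, 1)  # removes the remaining linear drift
```

After the seasonal difference the synthetic series is a constant (the drift per day), and after the additional first difference it is essentially zero, i.e. stationary and ready for ARMA fitting.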

  20. Vector bilinear autoregressive time series model and its superiority ...

    African Journals Online (AJOL)

    In this research, a vector bilinear autoregressive time series model was proposed and used to model three revenue series (X1, X2, X3). The “orders” of the three series were identified on the basis of the distribution of the autocorrelation and partial autocorrelation functions and were used to construct the vector bilinear models.

  1. MODELS CONCERNING PREVENTIVE VERIFICATION OF TECHNICAL EQUIPMENT

    Directory of Open Access Journals (Sweden)

    CÂRLAN M.

    2016-12-01

    Full Text Available The paper presents three operative models whose purpose is to improve the practice of preventive maintenance for a wide range of technical installations. Although the calculation criteria are different, the goal is the same: to determine the optimum time between two consecutive preventive interventions. The optimum criteria of these models are: - the maximum share of technical entity operating probabilities, in the case of the Ackoff–Sasieni [1] method; - the optimum time interval for preventive verification depending on the preventive-corrective maintenance costs imposed by the deciding factor, for the Asturio–Baldin [2] model; - the minimum number of renewals – preventive and/or corrective maintenance operations [3].

  2. Modeling and Forecasting of Water Demand in Isfahan Using Underlying Trend Concept and Time Series

    Directory of Open Access Journals (Sweden)

    H. Sadeghi

    2016-02-01

    Full Text Available Introduction: Accurate modeling of urban water demand is very important for forecasting and for the adoption of policies related to water resources management. Thus, for estimating, forecasting and modeling future water requirements, it is important to use models with small errors. Water has a special place among basic human needs, because human life cannot continue without it; managing its extraction and consumption is therefore a necessity. Municipal water use includes a variety of demands: domestic, public, industrial and commercial. Predicting urban water demand supports better planning of water resources in arid and semiarid regions faced with water restrictions. Materials and Methods: Technological advance is one of the most important factors affecting changes in production and demand functions, so special attention must be paid to how it is represented. Technology development is not only a technical matter; other, non-economic factors (population, geographical and social factors) can also be analyzed. The model examined in this study is a regression model composed of a series of unobserved structural components that are allowed to change stochastically over time. Explanatory variables for technology would in principle improve the model, but because such variables are not measured over time they cannot be entered into the specification. In this study, structural time series models (STSM) and ARMA time series models have been used to model and estimate the water demand in Isfahan. Moreover, in order to find the more efficient procedure, the two models have been compared to each other. The data used in this research include water consumption in Isfahan, water price and the monthly pay

  3. Foundations of Sequence-to-Sequence Modeling for Time Series

    OpenAIRE

    Kuznetsov, Vitaly; Mariet, Zelda

    2018-01-01

    The availability of large amounts of time series data, paired with the performance of deep-learning algorithms on a broad class of problems, has recently led to significant interest in the use of sequence-to-sequence models for time series forecasting. We provide the first theoretical analysis of this time series forecasting framework. We include a comparison of sequence-to-sequence modeling to classical time series models, and as such our theory can serve as a quantitative guide for practitioners.

  4. Symptomatic thoracic spinal cord herniation: case series and technical report.

    Science.gov (United States)

    Hawasli, Ammar H; Ray, Wilson Z; Wright, Neill M

    2014-09-01

    Idiopathic spinal cord herniation (ISCH) is an uncommon condition located predominantly in the thoracic spine and often associated with a remote history of a major traumatic injury. ISCH has an incompletely described presentation and unknown etiology. There is no consensus on the treatment algorithm and surgical technique, and there are few data on clinical outcomes. In this case series and technical report, we describe the atypical myelopathy presentation, remote history of traumatic injury, radiographic progression, treatment, and outcomes of 5 patients treated at Washington University for symptomatic ISCH. A video showing surgical repair is presented. In contrast to classic compressive myelopathy symptomatology, ISCH patients presented with an atypical myelopathy, characterized by asymmetric motor and sensory deficits and early-onset urinary incontinence. Clinical deterioration correlated with progressive spinal cord displacement and herniation observed on yearly spinal imaging in a patient imaged serially because of multiple sclerosis. Finally, compared with compressive myelopathy in the thoracic spine, surgical treatment of ISCH led to rapid improvement despite a long duration of symptoms. Symptomatic ISCH presents with atypical myelopathy and slow temporal progression and can be successfully managed with surgical repair.

  5. Time domain series system definition and gear set reliability modeling

    International Nuclear Information System (INIS)

    Xie, Liyang; Wu, Ningxiang; Qian, Wenxue

    2016-01-01

    Time-dependent multi-configuration is a typical feature for mechanical systems such as gear trains and chain drives. As a series system, a gear train is distinct from a traditional series system, such as a chain, in load transmission path, system-component relationship, system functioning manner, as well as time-dependent system configuration. Firstly, the present paper defines time-domain series system to which the traditional series system reliability model is not adequate. Then, system specific reliability modeling technique is proposed for gear sets, including component (tooth) and subsystem (tooth-pair) load history description, material priori/posterior strength expression, time-dependent and system specific load-strength interference analysis, as well as statistically dependent failure events treatment. Consequently, several system reliability models are developed for gear sets with different tooth numbers in the scenario of tooth root material ultimate tensile strength failure. The application of the models is discussed in the last part, and the differences between the system specific reliability model and the traditional series system reliability model are illustrated by virtue of several numerical examples. - Highlights: • A new type of series system, i.e. time-domain multi-configuration series system is defined, that is of great significance to reliability modeling. • Multi-level statistical analysis based reliability modeling method is presented for gear transmission system. • Several system specific reliability models are established for gear set reliability estimation. • The differences between the traditional series system reliability model and the new model are illustrated.
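The traditional series-system baseline that the paper contrasts against can be stated in a few lines: with independent components, the system survives only if every component survives. A minimal sketch with hypothetical per-tooth reliabilities (not the paper's models):

```python
# Traditional series-system reliability with independent components:
#   R_sys = R_1 * R_2 * ... * R_n
def series_reliability(component_reliabilities):
    r = 1.0
    for ri in component_reliabilities:
        r *= ri
    return r

# Hypothetical gear set: each of 20 teeth survives a load cycle with
# probability 0.999; the classical model simply multiplies them.
r_sys = series_reliability([0.999] * 20)
```

The paper's point is that this product form is inadequate for a gear train: the system configuration (which tooth pairs are engaged) changes over time and tooth failures are statistically dependent, which is why the time-domain, system-specific models are developed instead.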

  6. Modeling seasonality in bimonthly time series

    NARCIS (Netherlands)

    Ph.H.B.F. Franses (Philip Hans)

    1992-01-01

    textabstractA recurring issue in modeling seasonal time series variables is the choice of the most adequate model for the seasonal movements. One selection method for quarterly data is proposed in Hylleberg et al. (1990). Market response models are often constructed for bimonthly variables, and

  7. Adaptive time-variant models for fuzzy-time-series forecasting.

    Science.gov (United States)

    Wong, Wai-Keung; Bai, Enjian; Chu, Alice Wai-Ching

    2010-12-01

    A fuzzy time series has been applied to the prediction of enrollment, temperature, stock indices, and other domains. Related studies mainly focus on three factors, namely, the partition of discourse, the content of forecasting rules, and the methods of defuzzification, all of which greatly influence the prediction accuracy of forecasting models. These studies use fixed analysis window sizes for forecasting. In this paper, an adaptive time-variant fuzzy-time-series forecasting model (ATVF) is proposed to improve forecasting accuracy. The proposed model automatically adapts the analysis window size of fuzzy time series based on the prediction accuracy in the training phase and uses heuristic rules to generate forecasting values in the testing phase. The performance of the ATVF model is tested using both simulated and actual time series including the enrollments at the University of Alabama, Tuscaloosa, and the Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX). The experiment results show that the proposed ATVF model achieves a significant improvement in forecasting accuracy as compared to other fuzzy-time-series forecasting models.
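The first of the three factors mentioned above, the partition of the universe of discourse, can be sketched in a few lines. The enrollment values below are the first observations of the University of Alabama series commonly used in this literature; the equal-width partition is illustrative, not the ATVF procedure:

```python
# Partition the universe of discourse into equal intervals and map each
# observation to the fuzzy set (interval) it falls in -- the usual first
# step of a fuzzy-time-series forecaster.
def partition(lo, hi, n_sets):
    width = (hi - lo) / n_sets
    return [(lo + i * width, lo + (i + 1) * width) for i in range(n_sets)]

def fuzzify(series, intervals):
    labels = []
    for x in series:
        for i, (a, b) in enumerate(intervals):
            if a <= x <= b:
                labels.append(i)
                break
    return labels

enrollments = [13055, 13563, 13867, 14696, 15460, 15311, 15603, 15861]
intervals = partition(13000, 16000, 6)
states = fuzzify(enrollments, intervals)
```

Forecasting rules are then built from transitions between consecutive fuzzy states; the ATVF model's contribution is to adapt the analysis window over these states rather than fix it in advance.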

  8. Mathematical Modeling and Dynamic Simulation of Metabolic Reaction Systems Using Metabolome Time Series Data

    Directory of Open Access Journals (Sweden)

    Kansuporn eSriyudthsak

    2016-05-01

    Full Text Available The high-throughput acquisition of metabolome data is greatly anticipated for the complete understanding of cellular metabolism in living organisms. A variety of analytical technologies have been developed to acquire large-scale metabolic profiles under different biological or environmental conditions. Time series data are useful for predicting the most likely metabolic pathways because they provide important information regarding the accumulation of metabolites, which implies causal relationships in the metabolic reaction network. Considerable effort has been undertaken to utilize these data for constructing a mathematical model merging system properties and quantitatively characterizing a whole metabolic system in toto. However, there are technical difficulties between benchmarking the provision and utilization of data. Although hundreds of metabolites can be measured, which provide information on the metabolic reaction system, simultaneous measurement of thousands of metabolites is still challenging. In addition, it is nontrivial to logically predict the dynamic behaviors of unmeasurable metabolite concentrations without sufficient information on the metabolic reaction network. Yet, consolidating the advantages of advancements in both metabolomics and mathematical modeling remain to be accomplished. This review outlines the conceptual basis of and recent advances in technologies in both the research fields. It also highlights the potential for constructing a large-scale mathematical model by estimating model parameters from time series metabolome data in order to comprehensively understand metabolism at the systems level.

  9. Mathematical Modeling and Dynamic Simulation of Metabolic Reaction Systems Using Metabolome Time Series Data.

    Science.gov (United States)

    Sriyudthsak, Kansuporn; Shiraishi, Fumihide; Hirai, Masami Yokota

    2016-01-01

    The high-throughput acquisition of metabolome data is greatly anticipated for the complete understanding of cellular metabolism in living organisms. A variety of analytical technologies have been developed to acquire large-scale metabolic profiles under different biological or environmental conditions. Time series data are useful for predicting the most likely metabolic pathways because they provide important information regarding the accumulation of metabolites, which implies causal relationships in the metabolic reaction network. Considerable effort has been undertaken to utilize these data for constructing a mathematical model merging system properties and quantitatively characterizing a whole metabolic system in toto. However, there are technical difficulties between benchmarking the provision and utilization of data. Although hundreds of metabolites can be measured, which provide information on the metabolic reaction system, simultaneous measurement of thousands of metabolites is still challenging. In addition, it is nontrivial to logically predict the dynamic behaviors of unmeasurable metabolite concentrations without sufficient information on the metabolic reaction network. Yet, consolidating the advantages of advancements in both metabolomics and mathematical modeling remain to be accomplished. This review outlines the conceptual basis of and recent advances in technologies in both the research fields. It also highlights the potential for constructing a large-scale mathematical model by estimating model parameters from time series metabolome data in order to comprehensively understand metabolism at the systems level.

  10. Promoting Creative Thinking Ability Using Contextual Learning Model in Technical Drawing Achievement

    Science.gov (United States)

    Mursid, R.

    2018-02-01

    The purpose of this study is to determine whether there are differences in achievement between students taught drawing techniques with the Contextual Innovative Model (CIM) and those taught with the Direct Instructional Model (DIM); whether there are differences in technical drawing achievement between students with High Creative Thinking Ability (HCTA) and those with Low Creative Thinking Ability (LCTA); and whether there is an interaction between the learning model and creative thinking ability with respect to technical drawing achievement. A quasi-experimental research method was used. The results show that: the achievement of students who learned technical drawing using CIM is higher than that of students taught using DIM; the achievement of HCTA students is higher than that of LCTA students; and there are interactions between the learning model and creative thinking ability in influencing student achievement in technical drawing.

  11. Annotated bibliography of structural equation modelling: technical work.

    Science.gov (United States)

    Austin, J T; Wolfle, L M

    1991-05-01

    Researchers must be familiar with a variety of source literature to facilitate the informed use of structural equation modelling. Knowledge can be acquired through the study of an expanding literature found in a diverse set of publishing forums. We propose that structural equation modelling publications can be roughly classified into two groups: (a) technical and (b) substantive applications. Technical materials focus on the procedures rather than substantive conclusions derived from applications. The focus of this article is the former category; included are foundational/major contributions, minor contributions, critical and evaluative reviews, integrations, simulations and computer applications, precursor and historical material, and pedagogical textbooks. After a brief introduction, we annotate 294 articles in the technical category dating back to Sewall Wright (1921).

  12. Developing and Validating the Socio-Technical Model in Ontology Engineering

    Science.gov (United States)

    Silalahi, Mesnan; Indra Sensuse, Dana; Giri Sucahyo, Yudho; Fadhilah Akmaliah, Izzah; Rahayu, Puji; Cahyaningsih, Elin

    2018-03-01

    This paper describes results from an attempt to develop a model in ontology engineering methodology and a way to validate the model. The approach to methodology in ontology engineering is from the point of view of socio-technical system theory. Qualitative research synthesis, using meta-ethnography, is applied to build the model. In order to ensure the objectivity of the measurement, an inter-rater reliability method was applied using multi-rater Fleiss' kappa. The results show that the research output accords with the diamond model in socio-technical system theory, as evidenced by the interdependency of the four socio-technical variables, namely people, technology, structure and task.

  13. Modelling road accidents: An approach using structural time series

    Science.gov (United States)

    Junus, Noor Wahida Md; Ismail, Mohd Tahir

    2014-09-01

    In this paper, the trend of road accidents in Malaysia for the years 2001 until 2012 was modelled using a structural time series approach. The structural time series model was identified using a stepwise method, and the residuals for each model were tested. The best-fitted model was chosen based on the smallest Akaike Information Criterion (AIC) and prediction error variance. In order to check the quality of the model, a data validation procedure was performed by predicting the monthly number of road accidents for the year 2012. Results indicate that the best specification of the structural time series model to represent road accidents is the local level with a seasonal model.
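The best-fitted specification reported above is a local level model with a seasonal component. The level part can be filtered with a minimal Kalman recursion; the sketch below uses hypothetical data and illustrative noise variances, and omits the seasonal component for brevity:

```python
# Local level model:  y_t = mu_t + eps_t,   mu_t = mu_{t-1} + eta_t.
# A minimal Kalman filter tracking the unobserved level mu_t.
def local_level_filter(y, var_eps, var_eta, mu0=0.0, p0=1e7):
    mu, p = mu0, p0          # diffuse prior: large initial variance
    levels = []
    for obs in y:
        p = p + var_eta      # prediction: the level is a random walk
        f = p + var_eps      # innovation variance
        k = p / f            # Kalman gain
        mu = mu + k * (obs - mu)
        p = (1 - k) * p
        levels.append(mu)
    return levels

# Hypothetical monthly observations (not the Malaysian accident counts).
y = [10.2, 10.4, 9.9, 10.1, 10.3, 10.0]
levels = local_level_filter(y, var_eps=1.0, var_eta=0.1)
```

With the diffuse prior the first filtered level essentially equals the first observation; thereafter the gain balances observation noise against level drift, which is how structural time series models separate signal from noise.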

  14. Trend time-series modeling and forecasting with neural networks.

    Science.gov (United States)

    Qi, Min; Zhang, G Peter

    2008-05-01

    Despite its great importance, there has been no general consensus on how to model the trends in time-series data. Compared to traditional approaches, neural networks (NNs) have shown some promise in time-series forecasting. This paper investigates how to best model trend time series using NNs. Four different strategies (raw data, raw data with time index, detrending, and differencing) are used to model various trend patterns (linear, nonlinear, deterministic, stochastic, and breaking trend). We find that with NNs differencing often gives meritorious results regardless of the underlying data generating processes (DGPs). This finding is also confirmed by the real gross national product (GNP) series.
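Two of the four preprocessing strategies compared in the paper, differencing and detrending, can be sketched in pure Python on a synthetic linear-trend series:

```python
# Differencing removes a trend by taking successive changes; detrending
# removes it by subtracting a fitted line (ordinary least squares).
def first_difference(x):
    return [x[t] - x[t - 1] for t in range(1, len(x))]

def linear_detrend(x):
    n = len(x)
    t_mean = (n - 1) / 2
    x_mean = sum(x) / n
    slope = sum((t - t_mean) * (xt - x_mean) for t, xt in enumerate(x)) \
            / sum((t - t_mean) ** 2 for t in range(n))
    intercept = x_mean - slope * t_mean
    return [xt - (intercept + slope * t) for t, xt in enumerate(x)]

series = [2.0 + 0.5 * t for t in range(10)]  # pure linear trend
diffed = first_difference(series)            # constant steps of 0.5
residual = linear_detrend(series)            # residuals near zero
```

On a deterministic trend both transforms remove the trend exactly; the paper's finding is that for NNs, differencing tends to work well regardless of whether the underlying trend is deterministic or stochastic.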

  15. Issues in Biological Shape Modelling

    DEFF Research Database (Denmark)

    Hilger, Klaus Baggesen

    This talk reflects parts of the current research at Informatics and Mathematical Modelling at the Technical University of Denmark within biological shape modelling. We illustrate a series of generalizations, modifications, and applications of the elements of constructing models of shape or appearance.

  16. Modeling technical change in climate analysis: evidence from agricultural crop damages.

    Science.gov (United States)

    Ahmed, Adeel; Devadason, Evelyn S; Al-Amin, Abul Quasem

    2017-05-01

    This study accounts for Hicks-neutral technical change in a calibrated model of climate analysis, to identify the optimum level of technical change for addressing climate change. It demonstrates the reduction in crop damages, the costs of technical change, and the net gains from the adoption of technical change for the climate-sensitive Pakistan economy. The calibrated model assesses the net gains of technical change for the overall economy and at the agriculture-specific level. The study finds that the gains of technical change are overwhelmingly higher than the costs across the agriculture subsectors. The gains and costs following technical change differ substantially for different crops. More importantly, the study finds a cost-effective optimal level of technical change that potentially reduces crop damages to a minimum possible level. The study therefore contends that climate policy for Pakistan should consider the role of technical change in addressing climate impacts on the agriculture sector.

  17. Koopman Operator Framework for Time Series Modeling and Analysis

    Science.gov (United States)

    Surana, Amit

    2018-01-01

    We propose an interdisciplinary framework for time series classification, forecasting, and anomaly detection by combining concepts from Koopman operator theory, machine learning, and linear systems and control theory. At the core of this framework is nonlinear dynamic generative modeling of time series using the Koopman operator which is an infinite-dimensional but linear operator. Rather than working with the underlying nonlinear model, we propose two simpler linear representations or model forms based on Koopman spectral properties. We show that these model forms are invariants of the generative model and can be readily identified directly from data using techniques for computing Koopman spectral properties without requiring the explicit knowledge of the generative model. We also introduce different notions of distances on the space of such model forms which is essential for model comparison/clustering. We employ the space of Koopman model forms equipped with distance in conjunction with classical machine learning techniques to develop a framework for automatic feature generation for time series classification. The forecasting/anomaly detection framework is based on using Koopman model forms along with classical linear systems and control approaches. We demonstrate the proposed framework for human activity classification, and for time series forecasting/anomaly detection in power grid application.
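As a toy illustration of identifying a Koopman-type linear operator from data (a one-dimensional special case with the observable g(x) = x, not the paper's full framework): for a linear system x_{t+1} = a * x_t, the restriction of the Koopman operator to g is just multiplication by a, and a can be recovered by least squares.

```python
# Least-squares estimate of the one-step linear operator acting on the
# observable g(x) = x:  a_hat = sum(x_{t+1} * x_t) / sum(x_t^2).
def estimate_koopman_1d(x):
    num = sum(x[t + 1] * x[t] for t in range(len(x) - 1))
    den = sum(x[t] ** 2 for t in range(len(x) - 1))
    return num / den

# Synthetic trajectory of x_{t+1} = 0.9 * x_t.
a_true = 0.9
x = [1.0]
for _ in range(50):
    x.append(a_true * x[-1])

a_hat = estimate_koopman_1d(x)
```

In the full framework the same regression is performed on a dictionary of nonlinear observables (e.g. via dynamic mode decomposition), yielding a finite-dimensional linear surrogate whose spectral properties serve as the features for classification and forecasting.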

  18. Multiple Time Series Ising Model for Financial Market Simulations

    International Nuclear Information System (INIS)

    Takaishi, Tetsuya

    2015-01-01

    In this paper we propose an Ising model which simulates multiple financial time series. Our model introduces an interaction which couples to the spins of other systems. Simulations from our model show that the time series exhibit the volatility clustering that is often observed in real financial markets. Furthermore, we also find non-zero cross correlations between the volatilities from our model. Thus our model can simulate stock markets where the volatilities of stocks are mutually correlated.
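A minimal sketch of the coupling idea (illustrative parameters and geometry, not the paper's model): two spin chains stand for two markets, and each spin feels its neighbours plus a weak field proportional to the magnetization of the other system, which ties the two time series together.

```python
import math
import random

random.seed(7)

# Two coupled 1-D Ising chains "A" and "B" updated by Metropolis sweeps.
N, J, G, T, STEPS = 100, 1.0, 0.3, 2.0, 200  # sizes/couplings are illustrative
spins = {m: [random.choice((-1, 1)) for _ in range(N)] for m in "AB"}
mags = {m: [] for m in "AB"}

def magnetization(s):
    return sum(s) / len(s)

for _ in range(STEPS):
    for m, other in (("A", "B"), ("B", "A")):
        s = spins[m]
        h = G * magnetization(spins[other])  # cross-system coupling field
        for _ in range(N):                   # one Metropolis sweep
            i = random.randrange(N)
            local = J * (s[i - 1] + s[(i + 1) % N]) + h
            dE = 2 * s[i] * local
            if dE <= 0 or random.random() < math.exp(-dE / T):
                s[i] = -s[i]
        mags[m].append(magnetization(s))
```

Interpreting changes in magnetization as returns, the cross-coupling term G is what induces the non-zero cross correlations between the volatilities of the two simulated series.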

  19. Competence Model and Modern Trends of Development of the Russian Institute of Technical Customer

    Directory of Open Access Journals (Sweden)

    Mishlanova Marina

    2017-01-01

    Full Text Available The article considers the modern role and development of the technical customer as the management actor in investment-construction projects. Urgent problems in the formation of the institute of the technical customer are identified. An elementary competence model is presented: the basic competences of the technical customer, a model of the primary competences, and an example of the operational level of the model. An analysis of the development of the institute of the technical customer was performed: compliance with the current realities of investment-construction activities, improvement of contractual relations, compliance with international standards, state participation, and creation of a single technical customer. The need to develop competence models for the justification of professional standards is assessed. The possibility of modeling the competences and functions of the technical customer following the FIDIC model is revealed, as is the possibility of using the competence model of the technical customer at the construction stage under public-private partnership. The results show the direction for further research.

  20. Modeling interdependent socio-technical networks: The smart grid—an agent-based modeling approach

    NARCIS (Netherlands)

    Worm, D.; Langley, D.J.; Becker, J.

    2014-01-01

    The aim of this paper is to improve scientific modeling of interdependent socio-technical networks. In these networks the interplay between technical or infrastructural elements on the one hand and social and behavioral aspects on the other hand, plays an important role. Examples include electricity

  1. FOURIER SERIES MODELS THROUGH TRANSFORMATION

    African Journals Online (AJOL)

    DEPT

    monthly temperature data (1996 – 2005) collected from the National Root ... KEY WORDS: Fourier series, square transformation, multiplicative model, ... fluctuations or movements are often periodic (Ekpeyong, 2005). ... significant trend or not; if the trend is not significant, the grand mean may be used as an estimate of trend.
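A Fourier series model of the kind referred to above fits harmonic terms to an evenly spaced periodic series. A minimal sketch on synthetic monthly temperatures (illustrative values, not the paper's data):

```python
import math

# Harmonic regression for an evenly spaced periodic series:
#   y_t ~ a0 + sum_k [ a_k cos(2*pi*k*t/N) + b_k sin(2*pi*k*t/N) ]
def fourier_coefficients(y, n_harmonics):
    N = len(y)
    a0 = sum(y) / N
    coeffs = []
    for k in range(1, n_harmonics + 1):
        ak = 2 / N * sum(y[t] * math.cos(2 * math.pi * k * t / N) for t in range(N))
        bk = 2 / N * sum(y[t] * math.sin(2 * math.pi * k * t / N) for t in range(N))
        coeffs.append((ak, bk))
    return a0, coeffs

# Synthetic "monthly temperature": an annual cosine cycle around a mean of 25.
temps = [25 + 3 * math.cos(2 * math.pi * t / 12) for t in range(12)]
a0, coeffs = fourier_coefficients(temps, 2)
```

The estimator recovers the mean (a0 = 25) and the single annual harmonic (a1 = 3, b1 = 0), with higher harmonics near zero; in a multiplicative model the same fit would be applied to the transformed series.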

  2. Estimation of pure autoregressive vector models for revenue series ...

    African Journals Online (AJOL)

    This paper aims at applying multivariate approach to Box and Jenkins univariate time series modeling to three vector series. General Autoregressive Vector Models with time varying coefficients are estimated. The first vector is a response vector, while others are predictor vectors. By matrix expansion each vector, whether ...

  3. Sensitivity analysis of machine-learning models of hydrologic time series

    Science.gov (United States)

    O'Reilly, A. M.

    2017-12-01

    Sensitivity analysis traditionally has been applied to assessing model response to perturbations in model parameters, where the parameters are those model input variables adjusted during calibration. Unlike physics-based models where parameters represent real phenomena, the equivalent of parameters for machine-learning models are simply mathematical "knobs" that are automatically adjusted during training/testing/verification procedures. Thus the challenge of extracting knowledge of hydrologic system functionality from machine-learning models lies in their very nature, leading to the label "black box." Sensitivity analysis of the forcing-response behavior of machine-learning models, however, can provide understanding of how the physical phenomena represented by model inputs affect the physical phenomena represented by model outputs. As part of a previous study, hybrid spectral-decomposition artificial neural network (ANN) models were developed to simulate the observed behavior of hydrologic response contained in multidecadal datasets of lake water level, groundwater level, and spring flow. Model inputs used moving window averages (MWA) to represent various frequencies and frequency-band components of time series of rainfall and groundwater use. Using these forcing time series, the MWA-ANN models were trained to predict time series of lake water level, groundwater level, and spring flow at 51 sites in central Florida, USA. A time series of sensitivities for each MWA-ANN model was produced by perturbing forcing time-series and computing the change in response time-series per unit change in perturbation. Variations in forcing-response sensitivities are evident between types (lake, groundwater level, or spring), spatially (among sites of the same type), and temporally. Two generally common characteristics among sites are more uniform sensitivities to rainfall over time and notable increases in sensitivities to groundwater usage during significant drought periods.
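The perturbation procedure described above, change one forcing input and measure the change in response per unit of perturbation, can be sketched against a stand-in black-box model. The linear `lake_level_model` here is a hypothetical placeholder, not the MWA-ANN:

```python
# One-at-a-time perturbation sensitivity for a black-box model:
# bump one input by delta, re-run, and report d(response)/d(input).
def sensitivity(model, inputs, index, delta=1e-4):
    base = model(inputs)
    bumped = list(inputs)
    bumped[index] += delta
    return (model(bumped) - base) / delta

# Hypothetical stand-in for a trained model: lake level responds positively
# to rainfall (index 0) and negatively to groundwater pumping (index 1).
def lake_level_model(x):
    rain, pumping = x
    return 0.8 * rain - 0.5 * pumping + 12.0

s_rain = sensitivity(lake_level_model, [100.0, 40.0], 0)
s_pump = sensitivity(lake_level_model, [100.0, 40.0], 1)
```

Repeating this at each time step of the forcing series yields the "time series of sensitivities" the abstract describes, without ever opening the black box.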

  4. Technical change in forest sector models: the global forest products model approach

    Science.gov (United States)

    Joseph Buongiorno; Sushuai Zhu

    2015-01-01

    Technical change is developing rapidly in some parts of the forest sector, especially in the pulp and paper industry where wood fiber is being substituted by waste paper. In forest sector models, the processing of wood and other inputs into products is frequently represented by activity analysis (input–output). In this context, technical change translates into changes...

  5. Parameterizing unconditional skewness in models for financial time series

    DEFF Research Database (Denmark)

    He, Changli; Silvennoinen, Annastiina; Teräsvirta, Timo

    In this paper we consider the third-moment structure of a class of time series models. It is often argued that the marginal distribution of financial time series such as returns is skewed. Therefore it is of importance to know what properties a model should possess if it is to accommodate...

  6. Technical know-how of site descriptive modeling for site characterization - 59089

    International Nuclear Information System (INIS)

    Saegusa, Hiromitsu; Onoe, Hironori; Doke, Ryosuke; Niizato, Tadafumi; Yasue, Ken-ichi

    2012-01-01

    The site descriptive model covering the current status of characteristics of the geological environment and the site evolution model for estimation of the long-term evolution of site conditions are used to integrate multi-disciplinary investigation results. It is important to evaluate uncertainties in the models, to specify issues regarding the uncertainties and to prioritize the resolution of specified issues, for the planning of site characterization. There is a large quantity of technical know-how in the modeling process. It is important to record this technical know-how with transparency and traceability, since site characterization projects generally span long durations. The transfer of the technical know-how accumulated in the research and development (R and D) phase to the implementation phase is equally important. The aim of this study is to support the planning of initial surface-based site characterizations based on the technical know-how accumulated from the underground research laboratory projects. These projects are broad scientific studies of the deep geological environment and provide a technical basis for the geological disposal of high-level radioactive wastes. In this study, a comprehensive task flow from acquisition of existing data to planning of field investigations through the modeling has been specified, along with the specific task flows and the decision-making processes needed to perform the tasks. (authors)

  7. Technical reference book for the Energy Economic Data Base Program: EEDB Phase 9 (1987)

    International Nuclear Information System (INIS)

    1988-07-01

    This document provides the current technical design bases for each technical data model (of an electric generating plant) in the eighth update. It contains a set of detailed system design descriptions (supplemented with engineering drawings) for the technical data models. This distribution is the latest in a series published since 1978. The overall program purpose is to provide periodically updated, detailed base construction cost estimates for large nuclear electric operating plants. These data, which are representative of current US power plant construction cost experience, are a useful contribution to program planning by the Office of the Assistant Secretary for Nuclear Energy.

  8. A four-stage hybrid model for hydrological time series forecasting.

    Science.gov (United States)

    Di, Chongli; Yang, Xiaohua; Wang, Xiaochao

    2014-01-01

    Hydrological time series forecasting remains a difficult task due to its complicated nonlinear, non-stationary and multi-scale characteristics. To solve this difficulty and improve the prediction accuracy, a novel four-stage hybrid model is proposed for hydrological time series forecasting based on the principle of 'denoising, decomposition and ensemble'. The proposed model has four stages, i.e., denoising, decomposition, components prediction and ensemble. In the denoising stage, the empirical mode decomposition (EMD) method is utilized to reduce the noises in the hydrological time series. Then, an improved method of EMD, the ensemble empirical mode decomposition (EEMD), is applied to decompose the denoised series into a number of intrinsic mode function (IMF) components and one residual component. Next, the radial basis function neural network (RBFNN) is adopted to predict the trend of all of the components obtained in the decomposition stage. In the final ensemble prediction stage, the forecasting results of all of the IMF and residual components obtained in the third stage are combined to generate the final prediction results, using a linear neural network (LNN) model. For illustration and verification, six hydrological cases with different characteristics are used to test the effectiveness of the proposed model. The proposed hybrid model performs better than conventional single models, the hybrid models without denoising or decomposition and the hybrid models based on other methods, such as the wavelet analysis (WA)-based hybrid models. In addition, the denoising and decomposition strategies decrease the complexity of the series and reduce the difficulties of the forecasting. With its effective denoising and accurate decomposition ability, high prediction precision and wide applicability, the new model is very promising for complex time series forecasting. This new forecast model is an extension of nonlinear prediction models.

  9. A Four-Stage Hybrid Model for Hydrological Time Series Forecasting

    Science.gov (United States)

    Di, Chongli; Yang, Xiaohua; Wang, Xiaochao

    2014-01-01

    Hydrological time series forecasting remains a difficult task due to its complicated nonlinear, non-stationary and multi-scale characteristics. To solve this difficulty and improve the prediction accuracy, a novel four-stage hybrid model is proposed for hydrological time series forecasting based on the principle of ‘denoising, decomposition and ensemble’. The proposed model has four stages, i.e., denoising, decomposition, components prediction and ensemble. In the denoising stage, the empirical mode decomposition (EMD) method is utilized to reduce the noises in the hydrological time series. Then, an improved method of EMD, the ensemble empirical mode decomposition (EEMD), is applied to decompose the denoised series into a number of intrinsic mode function (IMF) components and one residual component. Next, the radial basis function neural network (RBFNN) is adopted to predict the trend of all of the components obtained in the decomposition stage. In the final ensemble prediction stage, the forecasting results of all of the IMF and residual components obtained in the third stage are combined to generate the final prediction results, using a linear neural network (LNN) model. For illustration and verification, six hydrological cases with different characteristics are used to test the effectiveness of the proposed model. The proposed hybrid model performs better than conventional single models, the hybrid models without denoising or decomposition and the hybrid models based on other methods, such as the wavelet analysis (WA)-based hybrid models. In addition, the denoising and decomposition strategies decrease the complexity of the series and reduce the difficulties of the forecasting. With its effective denoising and accurate decomposition ability, high prediction precision and wide applicability, the new model is very promising for complex time series forecasting. This new forecast model is an extension of nonlinear prediction models. PMID:25111782
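The four-stage "denoising, decomposition and ensemble" structure can be illustrated with a toy stand-in: a moving-average band split replaces EMD/EEMD, and least-squares AR fits replace the RBFNN and LNN. This shows only the pipeline shape, not the paper's method:

```python
import numpy as np

def smooth(x, w):
    """Centered moving average: a crude low-pass stand-in for EMD sifting."""
    pad = np.pad(x, (w // 2, w - 1 - w // 2), mode="edge")
    return np.convolve(pad, np.ones(w) / w, mode="valid")

def decompose(x, widths=(5, 25)):
    """Split x into band-limited components plus a trend; components sum to x."""
    comps, rest = [], x
    for w in widths:
        low = smooth(rest, w)
        comps.append(rest - low)
        rest = low
    comps.append(rest)
    return comps

def ar_one_step(c, p=4):
    """Fit AR(p) by least squares and return a one-step-ahead forecast."""
    X = np.column_stack([c[i:len(c) - p + i] for i in range(p)])
    coef, *_ = np.linalg.lstsq(X, c[p:], rcond=None)
    return float(c[-p:] @ coef)

t = np.arange(300)
x = np.sin(2 * np.pi * t / 12) + 0.01 * t      # toy "hydrological" series
comps = decompose(x)
forecast = sum(ar_one_step(c) for c in comps)  # ensemble of component forecasts
```

By construction the components sum exactly to the input series, mimicking the completeness property of an EMD-style decomposition.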

  10. A neuro-fuzzy computing technique for modeling hydrological time series

    Science.gov (United States)

    Nayak, P. C.; Sudheer, K. P.; Rangan, D. M.; Ramasastri, K. S.

    2004-05-01

    Intelligent computing tools such as artificial neural network (ANN) and fuzzy logic approaches are proven to be efficient when applied individually to a variety of problems. Recently there has been a growing interest in combining both these approaches, and as a result, neuro-fuzzy computing techniques have evolved. This approach has been tested and evaluated in the field of signal processing and related areas, but researchers have only begun evaluating the potential of this neuro-fuzzy hybrid approach in hydrologic modeling studies. This paper presents the application of an adaptive neuro-fuzzy inference system (ANFIS) to hydrologic time series modeling, illustrated by an application to modeling the flow of the Baitarani River in Orissa state, India. An introduction to the ANFIS modeling approach is also presented. The advantage of the method is that it does not require the model structure to be known a priori, in contrast to most time series modeling techniques. The results showed that the ANFIS-forecasted flow series preserves the statistical properties of the original flow series. The model showed good performance in terms of various statistical indices. The results are highly promising, and a comparative analysis suggests that the proposed modeling approach outperforms ANNs and other traditional time series models in terms of computational speed, forecast errors, efficiency, peak flow estimation, etc. It was observed that the ANFIS model preserves the potential of the ANN approach fully, and eases the model building process.

  11. Comparison of annual maximum series and partial duration series methods for modeling extreme hydrologic events

    DEFF Research Database (Denmark)

    Madsen, Henrik; Rasmussen, Peter F.; Rosbjerg, Dan

    1997-01-01

    Two different models for analyzing extreme hydrologic events, based on, respectively, partial duration series (PDS) and annual maximum series (AMS), are compared. The PDS model assumes a generalized Pareto distribution for modeling threshold exceedances corresponding to a generalized extreme value......). In the case of ML estimation, the PDS model provides the most efficient T-year event estimator. In the cases of MOM and PWM estimation, the PDS model is generally preferable for negative shape parameters, whereas the AMS model yields the most efficient estimator for positive shape parameters. A comparison...... of the considered methods reveals that in general, one should use the PDS model with MOM estimation for negative shape parameters, the PDS model with exponentially distributed exceedances if the shape parameter is close to zero, the AMS model with MOM estimation for moderately positive shape parameters, and the PDS...
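For the special case of exponentially distributed exceedances mentioned in the abstract, the PDS T-year event estimator has the closed form x_T = u + beta*ln(lambda*T), with lambda the mean annual number of exceedances and beta the mean exceedance. A minimal method-of-moments sketch on synthetic data (the threshold and all parameter values are illustrative):

```python
import numpy as np

def pds_t_year_event(series, threshold, years, T):
    """T-year event from a partial duration series, assuming exponentially
    distributed exceedances: x_T = u + beta * ln(lambda * T)."""
    exceed = series[series > threshold] - threshold
    lam = len(exceed) / years      # mean number of exceedances per year
    beta = float(exceed.mean())    # method-of-moments exponential scale
    return threshold + beta * np.log(lam * T)

rng = np.random.default_rng(1)
years = 50
daily = rng.exponential(5.0, size=365 * years)   # synthetic daily maxima
x100 = pds_t_year_event(daily, threshold=30.0, years=years, T=100)
```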

  12. Forecasting the Reference Evapotranspiration Using Time Series Model

    Directory of Open Access Journals (Sweden)

    H. Zare Abyaneh

    2016-10-01

    Full Text Available Introduction: Reference evapotranspiration is one of the most important factors in irrigation timing and field management. Moreover, reference evapotranspiration forecasting can play a vital role in future developments. Therefore, in this study the seasonal autoregressive integrated moving average (ARIMA) model was used to forecast the reference evapotranspiration time series at the Esfahan, Semnan, Shiraz, Kerman, and Yazd synoptic stations. Materials and Methods: At all stations (characteristics of the synoptic stations are given in Table 1), the meteorological data, including mean, maximum and minimum air temperature, relative humidity, dry- and wet-bulb temperature, dew-point temperature, wind speed, precipitation, air vapor pressure and sunshine hours, were collected from the Islamic Republic of Iran Meteorological Organization (IRIMO) for the 41 years from 1965 to 2005. The FAO Penman-Monteith equation was used to calculate the monthly reference evapotranspiration at the five synoptic stations, and the evapotranspiration time series were formed. The unit root test was used to identify whether each time series was stationary; then, using the Box-Jenkins method, seasonal ARIMA models were applied to the sample data.
    Table 1. The geographical location and climate conditions of the synoptic stations
    Station  Longitude (E)  Latitude (N)  Altitude (m)  Mean annual air temp. (°C)  Min.-Max. air temp. (°C)  Mean precipitation (mm)  Climate (De Martonne index)
    Esfahan  51° 40'       32° 37'       1550.4        16.36                       9.4-23.3                  122                      Arid
    Semnan   53° 33'       35° 35'       1130.8        18.0                        12.4-23.8                 140                      Arid
    Shiraz   52° 36'       29° 32'       1484          18.0                        10.2-25.9                 324                      Semi-arid
    Kerman   56° 58'       30° 15'       1753.8        15.6                        6.7-24.6                  142                      Arid
    Yazd     54° 17'       31° 54'       1237.2        19.2                        11.8-26.0                 61                       Arid
    Results and Discussion: The monthly meteorological data were used as input for the Ref-ET software and monthly reference
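A minimal Box-Jenkins-style sketch of the seasonal approach: seasonally difference the monthly series, fit a low-order AR model by least squares, forecast one step, and invert the differencing. This stands in for a full seasonal ARIMA fit; the toy evapotranspiration series is synthetic:

```python
import numpy as np

def seasonal_ar_forecast(x, season=12, p=1):
    """Seasonally difference, fit AR(p) by least squares, forecast one
    step ahead, then invert the seasonal differencing."""
    d = x[season:] - x[:-season]                 # seasonal differencing
    X = np.column_stack([d[i:len(d) - p + i] for i in range(p)])
    coef, *_ = np.linalg.lstsq(X, d[p:], rcond=None)
    d_next = float(d[-p:] @ coef)                # AR forecast of the difference
    return x[-season] + d_next                   # undo the differencing

rng = np.random.default_rng(7)
months = np.arange(240)
et = 5.0 + 3.0 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 0.1, 240)
pred = seasonal_ar_forecast(et)
```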

  13. Forecasting with nonlinear time series model: a Monte-Carlo

    African Journals Online (AJOL)

    PUBLICATIONS1

    erated recursively up to any step greater than one. For nonlinear time series model, point forecast for step one can be done easily like in the linear case but forecast for a step greater than or equal to ..... London. Franses, P. H. (1998). Time series models for business and Economic forecasting, Cam- bridge University press.

  14. Parametric, nonparametric and parametric modelling of a chaotic circuit time series

    Science.gov (United States)

    Timmer, J.; Rust, H.; Horbelt, W.; Voss, H. U.

    2000-09-01

    The determination of a differential equation underlying a measured time series is a frequently arising task in nonlinear time series analysis. In the validation of a proposed model one often faces the dilemma that it is hard to decide whether possible discrepancies between the time series and model output are caused by an inappropriate model or by bad estimates of parameters in a correct type of model, or both. We propose a combination of parametric modelling based on Bock's multiple shooting algorithm and nonparametric modelling based on optimal transformations as a strategy to test proposed models and, if they are rejected, to suggest and test new ones. We exemplify this strategy on an experimental time series from a chaotic circuit where we obtain an extremely accurate reconstruction of the observed attractor.
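The parametric side of this strategy can be illustrated on a toy ODE. For brevity, Bock's multiple shooting is replaced here by naive single shooting with a grid search over one parameter; the ODE and all values are invented for illustration:

```python
import numpy as np

def integrate(a, x0, dt, n):
    """Euler integration of the toy ODE dx/dt = -a * x."""
    xs = np.empty(n)
    xs[0] = x0
    for i in range(1, n):
        xs[i] = xs[i - 1] - dt * a * xs[i - 1]
    return xs

def fit_parameter(obs, x0, dt, grid):
    """Single-shooting fit: choose the parameter whose simulated trajectory
    best matches the observed series in least squares."""
    errs = [np.sum((integrate(a, x0, dt, len(obs)) - obs) ** 2) for a in grid]
    return float(grid[int(np.argmin(errs))])

rng = np.random.default_rng(2)
true_a, dt, n = 0.7, 0.01, 500
obs = integrate(true_a, 1.0, dt, n) + rng.normal(0, 0.01, n)
a_hat = fit_parameter(obs, 1.0, dt, np.linspace(0.1, 1.5, 141))
```

Multiple shooting differs by splitting the trajectory into segments with their own initial conditions, which greatly improves convergence for chaotic systems; the least-squares matching idea is the same.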

  15. Stochastic modeling of hourly rainfall times series in Campania (Italy)

    Science.gov (United States)

    Giorgio, M.; Greco, R.

    2009-04-01

    The occurrence of flowslides and floods in small catchments is difficult to predict, since it is affected by a number of variables, such as mechanical and hydraulic soil properties, slope morphology, vegetation coverage, and rainfall spatial and temporal variability. Consequently, landslide risk assessment procedures and early warning systems still rely on simple empirical models based on correlation between recorded rainfall data and observed landslides and/or river discharges. The effectiveness of such systems could be improved by reliable quantitative rainfall prediction, which allows larger lead times to be gained. Analysis of on-site recorded rainfall height time series represents the most effective approach for a reliable prediction of the local temporal evolution of rainfall. Hydrological time series analysis is a widely studied field in hydrology, often carried out by means of autoregressive models, such as AR, ARMA, ARX and ARMAX (e.g. Salas [1992]). Such models gave the best results when applied to the analysis of autocorrelated hydrological time series, like river flow or level time series. Conversely, they are not able to model the behaviour of intermittent time series, like point rainfall height series usually are, especially when recorded with short sampling time intervals. More useful for this issue are the so-called DRIP (Disaggregated Rectangular Intensity Pulse) and NSRP (Neyman-Scott Rectangular Pulse) models [Heneker et al., 2001; Cowpertwait et al., 2002], usually adopted to generate synthetic point rainfall series. In this paper, the DRIP model approach is adopted, in which the sequence of rain storms and dry intervals constituting the structure of the rainfall time series is modeled as an alternating renewal process. The final aim of the study is to provide a useful tool to implement an early warning system for hydrogeological risk management. Model calibration has been carried out with hourly rainfall height data provided by the rain gauges of Campania Region civil
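The alternating renewal structure of the DRIP approach can be sketched with a toy generator: dry spells alternate with rectangular rain pulses, with exponentially distributed durations and intensity. All distributions and parameter values are illustrative assumptions, not the calibrated Campania model:

```python
import numpy as np

def simulate_rainfall(hours, rng, mean_dry=60.0, mean_wet=8.0, mean_int=2.0):
    """Alternating renewal sketch: dry spells alternate with rectangular
    rain pulses; durations and pulse intensity are exponential."""
    series = np.zeros(hours)
    t = 0
    while t < hours:
        t += int(rng.exponential(mean_dry)) + 1        # dry interval (h)
        dur = int(rng.exponential(mean_wet)) + 1       # storm duration (h)
        series[t:t + dur] = rng.exponential(mean_int)  # constant pulse (mm/h)
        t += dur
    return series

rng = np.random.default_rng(3)
rain = simulate_rainfall(24 * 365, rng)                # one year of hourly data
wet_fraction = float((rain > 0).mean())
```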

  16. Modelling Fourier regression for time series data - a case study: modelling inflation in foods sector in Indonesia

    Science.gov (United States)

    Prahutama, Alan; Suparti; Wahyu Utami, Tiani

    2018-03-01

    Regression analysis models the relationship between response variables and predictor variables. The parametric approach to regression imposes strict assumptions on the model, whereas the nonparametric approach does not require an assumed model form. Time series data are observations of a variable recorded over time, so if time series data are to be modeled by regression, the response and predictor variables must be determined first. The response variable in a time series is the value at time t (yt), while the predictor variables are significant lags. In nonparametric regression modeling, one developing approach is the Fourier series approach. One advantage of nonparametric regression using Fourier series is its ability to handle data with a trigonometric pattern. Modeling with Fourier series requires the parameter K; the number of terms K can be determined with the Generalized Cross Validation method. In inflation modeling for the transportation, communication and financial services sector, the Fourier series yields an optimal K of 120 parameters with an R-square of 99%, whereas multiple linear regression yields an R-square of 90%.
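A Fourier series regression with K selected by Generalized Cross Validation can be sketched as follows (synthetic data; GCV here is the standard least-squares form n*RSS/(n-p)^2):

```python
import numpy as np

def fourier_design(t, K, period):
    """Design matrix: intercept plus K cosine/sine pairs of the base period."""
    cols = [np.ones_like(t)]
    for k in range(1, K + 1):
        cols.append(np.cos(2 * np.pi * k * t / period))
        cols.append(np.sin(2 * np.pi * k * t / period))
    return np.column_stack(cols)

def gcv_score(y, X):
    """Generalized Cross Validation for a least-squares fit: n*RSS/(n-p)^2."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = float(np.sum((y - X @ beta) ** 2))
    n, p = X.shape
    return n * rss / (n - p) ** 2

rng = np.random.default_rng(6)
t = np.arange(120, dtype=float)                 # 10 years of monthly data
y = 2.0 + np.cos(2 * np.pi * t / 12) + 0.5 * np.sin(2 * np.pi * 3 * t / 12)
y = y + rng.normal(0, 0.2, t.size)              # harmonics 1 and 3 plus noise
best_K = min(range(1, 7), key=lambda K: gcv_score(y, fourier_design(t, K, 12)))
```

Because the signal contains a third harmonic, GCV must select at least K = 3 to drive the residual sum of squares down.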

  17. Modeling vector nonlinear time series using POLYMARS

    NARCIS (Netherlands)

    de Gooijer, J.G.; Ray, B.K.

    2003-01-01

    A modified multivariate adaptive regression splines method for modeling vector nonlinear time series is investigated. The method results in models that can capture certain types of vector self-exciting threshold autoregressive behavior, as well as provide good predictions for more general vector

  18. Forecasting with periodic autoregressive time series models

    NARCIS (Netherlands)

    Ph.H.B.F. Franses (Philip Hans); R. Paap (Richard)

    1999-01-01

    textabstractThis paper is concerned with forecasting univariate seasonal time series data using periodic autoregressive models. We show how one should account for unit roots and deterministic terms when generating out-of-sample forecasts. We illustrate the models for various quarterly UK consumption

  19. Summary of Sandia Laboratories technical capabilities

    International Nuclear Information System (INIS)

    1977-05-01

    The technical capabilities of Sandia Laboratories are detailed in a series of companion reports. In this summary the use of the capabilities in technical programs is outlined and the capabilities are summarized. 25 figures, 3 tables

  20. Outlier Detection in Structural Time Series Models

    DEFF Research Database (Denmark)

    Marczak, Martyna; Proietti, Tommaso

    Structural change affects the estimation of economic signals, like the underlying growth rate or the seasonally adjusted series. An important issue, which has attracted a great deal of attention also in the seasonal adjustment literature, is its detection by an expert procedure. The general–to–specific approach to the detection of structural change, currently implemented in Autometrics via indicator saturation, has proven to be both practical and effective in the context of stationary dynamic regression models and unit–root autoregressions. By focusing on impulse– and step–indicator saturation, we investigate via Monte Carlo simulations how this approach performs for detecting additive outliers and level shifts in the analysis of nonstationary seasonal time series. The reference model is the basic structural model, featuring a local linear trend, possibly integrated of order two, stochastic seasonality...

  1. Time Series Modelling of Syphilis Incidence in China from 2005 to 2012.

    Science.gov (United States)

    Zhang, Xingyu; Zhang, Tao; Pei, Jiao; Liu, Yuanyuan; Li, Xiaosong; Medrano-Gracia, Pau

    2016-01-01

    The infection rate of syphilis in China has increased dramatically in recent decades, becoming a serious public health concern. Early prediction of syphilis is therefore of great importance for health planning and management. In this paper, we analyzed surveillance time series data for primary, secondary, tertiary, congenital and latent syphilis in mainland China from 2005 to 2012. Seasonality and long-term trend were explored with decomposition methods. Autoregressive integrated moving average (ARIMA) was used to fit a univariate time series model of syphilis incidence. A separate multi-variable time series for each syphilis type was also tested using an autoregressive integrated moving average model with exogenous variables (ARIMAX). The syphilis incidence rates increased three-fold from 2005 to 2012. All syphilis time series showed strong seasonality and an increasing long-term trend. Both ARIMA and ARIMAX models fitted and estimated syphilis incidence well. All univariate time series showed the highest goodness-of-fit results with the ARIMA(0,0,1)×(0,1,1) model. Time series analysis was an effective tool for modelling the historical and future incidence of syphilis in China. The ARIMAX model outperformed the ARIMA model for the modelling of syphilis incidence. Time series correlations existed between the models for primary, secondary, tertiary, congenital and latent syphilis.

  2. Modelling of Water Cooled Fuel Including Design Basis and Severe Accidents. Proceedings of a Technical Meeting

    International Nuclear Information System (INIS)

    2015-11-01

    The demands on nuclear fuel have recently been increasing, and include transient regimes, higher discharge burnup and longer fuel cycles. This has resulted in an increase of loads on fuel and core internals. In order to satisfy these demands while ensuring compliance with safety criteria, new national and international programmes have been launched and advanced modelling codes are being developed. The Fukushima Daiichi accident has particularly demonstrated the need for adequate analysis of all aspects of fuel performance to prevent a failure and also to predict fuel behaviour were an accident to occur. This publication presents the Proceedings of the Technical Meeting on Modelling of Water Cooled Fuel Including Design Basis and Severe Accidents, which was hosted by the Nuclear Power Institute of China (NPIC) in Chengdu, China, following the recommendation made in 2013 at the IAEA Technical Working Group on Fuel Performance and Technology. This recommendation was in agreement with IAEA mid-term initiatives, linked to the post-Fukushima IAEA Nuclear Safety Action Plan, as well as the forthcoming Coordinated Research Project (CRP) on Fuel Modelling in Accident Conditions. At the technical meeting in Chengdu, major areas and physical phenomena, as well as types of code and experiment to be studied and used in the CRP, were discussed. The technical meeting provided a forum for international experts to review the state of the art of code development for modelling fuel performance of nuclear fuel for water cooled reactors with regard to steady state and transient conditions, and for design basis and early phases of severe accidents, including experimental support for code validation. A round table discussion focused on the needs and perspectives on fuel modelling in accident conditions. This meeting was the ninth in a series of IAEA meetings, which reflects Member States’ continuing interest in nuclear fuel issues. The previous meetings were held in 1980 (jointly with

  3. Modelling bursty time series

    International Nuclear Information System (INIS)

    Vajna, Szabolcs; Kertész, János; Tóth, Bálint

    2013-01-01

    Many human-related activities show power-law decaying interevent time distribution with exponents usually varying between 1 and 2. We study a simple task-queuing model, which produces bursty time series due to the non-trivial dynamics of the task list. The model is characterized by a priority distribution as an input parameter, which describes the choice procedure from the list. We give exact results on the asymptotic behaviour of the model and we show that the interevent time distribution is power-law decaying for any kind of input distributions that remain normalizable in the infinite list limit, with exponents tunable between 1 and 2. The model satisfies a scaling law between the exponents of interevent time distribution (β) and autocorrelation function (α): α + β = 2. This law is general for renewal processes with power-law decaying interevent time distribution. We conclude that slowly decaying autocorrelation function indicates long-range dependence only if the scaling law is violated. (paper)
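A minimal simulation in the spirit of the task-list model (the simplest highest-priority-first variant, not the paper's exact priority-distribution dynamics) already produces the broad waiting-time distribution that underlies burstiness:

```python
import numpy as np

def simulate_queue(steps, L=10, seed=4):
    """Highest-priority-first task list: execute the top task each step and
    replace it with a fresh task of uniform random priority. Returns the
    waiting times of the executed tasks."""
    rng = np.random.default_rng(seed)
    priorities = rng.random(L)
    birth = np.zeros(L, dtype=int)          # step at which each task entered
    waits = []
    for t in range(steps):
        i = int(np.argmax(priorities))      # always serve the top-priority task
        waits.append(t - birth[i])
        priorities[i] = rng.random()        # a fresh task takes the freed slot
        birth[i] = t
    return np.array(waits)

waits = simulate_queue(20000)
heavy_tail_ratio = waits.max() / (np.median(waits) + 1)
```

Most tasks are served almost immediately while a few low-priority tasks wait very long, so the maximum wait exceeds the median by orders of magnitude.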

  4. Models for Pooled Time-Series Cross-Section Data

    Directory of Open Access Journals (Sweden)

    Lawrence E Raffalovich

    2015-07-01

    Full Text Available Several models are available for the analysis of pooled time-series cross-section (TSCS) data, defined as “repeated observations on fixed units” (Beck and Katz 1995). In this paper, we run the following models: (1) a completely pooled model, (2) fixed effects models, and (3) multi-level/hierarchical linear models. To illustrate these models, we use a Generalized Least Squares (GLS) estimator with cross-section weights and panel-corrected standard errors (with EViews 8) on the cross-national homicide trends data of forty countries from 1950 to 2005, which we source from published research (Messner et al. 2011). We describe and discuss the similarities and differences between the models, and what information each can contribute to help answer substantive research questions. We conclude with a discussion of how the models we present may help to mitigate validity threats inherent in pooled time-series cross-section data analysis.
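The fixed effects model can be estimated with the within transformation: demean each unit's series and pool the deviations. A sketch on synthetic country-year data (all numbers invented):

```python
import numpy as np

def fixed_effects(y, x, unit):
    """Within (fixed-effects) estimator for pooled time-series cross-section
    data: demean y and x by unit, then run OLS on the pooled deviations."""
    yd, xd = y.astype(float).copy(), x.astype(float).copy()
    for u in np.unique(unit):
        m = unit == u
        yd[m] -= yd[m].mean()
        xd[m] -= xd[m].mean()
    return float(np.sum(xd * yd) / np.sum(xd * xd))

rng = np.random.default_rng(5)
units = np.repeat(np.arange(40), 20)            # 40 countries, 20 years each
alpha = np.repeat(rng.normal(0, 5, 40), 20)     # country fixed effects
x = alpha + rng.normal(0, 1, 800)               # regressor correlated with effects
y = 2.0 * x + alpha + rng.normal(0, 1, 800)     # true slope is 2.0
beta_fe = fixed_effects(y, x, units)
```

Because the regressor is correlated with the unit effects, pooled OLS would be biased here, while the within estimator recovers the true slope.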

  5. Alternative Approaches to Technical Efficiency Estimation in the Stochastic Frontier Model

    OpenAIRE

    Acquah, H. de-Graft; Onumah, E. E.

    2014-01-01

    Estimating the stochastic frontier model and calculating technical efficiency of decision making units are of great importance in applied production economic works. This paper estimates technical efficiency from the stochastic frontier model using Jondrow, and Battese and Coelli approaches. In order to compare alternative methods, simulated data with sample sizes of 60 and 200 are generated from stochastic frontier model commonly applied to agricultural firms. Simulated data is employed to co...
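The Jondrow et al. (JLMS) approach extracts a point estimate of inefficiency from the composed error eps = v - u. A sketch with a half-normal inefficiency term and assumed variance parameters (the residual values are invented):

```python
import math

def phi(z):
    """Standard normal pdf."""
    return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)

def Phi(z):
    """Standard normal cdf."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def jlms_efficiency(eps, sigma_u, sigma_v):
    """Jondrow et al. point estimate of technical efficiency from a
    composed-error residual eps = v - u with half-normal inefficiency u."""
    s2 = sigma_u ** 2 + sigma_v ** 2
    mu_star = -eps * sigma_u ** 2 / s2
    sig_star = math.sqrt(sigma_u ** 2 * sigma_v ** 2 / s2)
    z = mu_star / sig_star
    e_u = mu_star + sig_star * phi(z) / Phi(z)   # E[u | eps]
    return math.exp(-e_u)                        # technical efficiency

te_good = jlms_efficiency(0.5, sigma_u=0.3, sigma_v=0.2)    # residual above the fit
te_poor = jlms_efficiency(-0.5, sigma_u=0.3, sigma_v=0.2)   # residual far below
```

A firm with a large negative residual is attributed more inefficiency and hence a lower efficiency score.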

  6. Modeling Interdependent Socio-technical Networks via ABM Smart Grid Case

    NARCIS (Netherlands)

    Worm, D.T.H.; Langley, D.J.; Becker, J.M.

    2013-01-01

    The objective of this paper is to improve scientific modeling of interdependent socio-technical networks. In these networks the interplay between technical or infrastructural elements on the one hand and social and behavioral aspects on the other hand, is of importance. Examples include electricity

  7. time series modeling of daily abandoned calls in a call centre

    African Journals Online (AJOL)

    DJFLEX

    Models for evaluating and predicting the short periodic time series in daily ... Ugwuowo (2006) proposed asymmetric angular-linear multivariate regression models, ..... Using the parameter estimates in Table 3, the fitted Fourier series model is ..... For the SARIMA model with the stochastic component also being white noise, ...

  8. Technical solutions to enable embedded generation growth

    Energy Technology Data Exchange (ETDEWEB)

    Lynch, C.A.; Todd, S.; Millar, W.; Wood, H.S.

    2003-07-01

    This report describes the results of one of a series of studies commissioned by the UK Department of Trade and Industry into various aspects of embedded generation with the aim of supporting the development and deployment of electrical sources (particularly their ease of connection to the network) to deliver power to consumers. The first phase of the project involved a literature review and meetings with embedded generation developers and planning engineers from distribution network operators (DNOs). The second phase investigated embedded generation at different levels of the distribution network and included modelling a representative network. Technologies that could facilitate a significant increase in embedded generation were identified and estimates made of when and where significant development would be needed. Technical problems identified by DNOs were concerned with thermal loading, voltage regulation, fault levels, protection and network operation. A number of non-technical (commercial and regulatory) problems were also identified. The report describes the UK regulatory framework, the present situation, the British power system, the accommodation of embedded generation by established means, the representative model and technical innovations.

  9. Time Series Modelling of Syphilis Incidence in China from 2005 to 2012

    Science.gov (United States)

    Zhang, Xingyu; Zhang, Tao; Pei, Jiao; Liu, Yuanyuan; Li, Xiaosong; Medrano-Gracia, Pau

    2016-01-01

    Background The infection rate of syphilis in China has increased dramatically in recent decades, becoming a serious public health concern. Early prediction of syphilis is therefore of great importance for health planning and management. Methods In this paper, we analyzed surveillance time series data for primary, secondary, tertiary, congenital and latent syphilis in mainland China from 2005 to 2012. Seasonality and long-term trend were explored with decomposition methods. Autoregressive integrated moving average (ARIMA) was used to fit a univariate time series model of syphilis incidence. A separate multi-variable time series for each syphilis type was also tested using an autoregressive integrated moving average model with exogenous variables (ARIMAX). Results The syphilis incidence rates increased three-fold from 2005 to 2012. All syphilis time series showed strong seasonality and an increasing long-term trend. Both ARIMA and ARIMAX models fitted and estimated syphilis incidence well. All univariate time series showed the highest goodness-of-fit results with the ARIMA(0,0,1)×(0,1,1) model. Conclusion Time series analysis was an effective tool for modelling the historical and future incidence of syphilis in China. The ARIMAX model outperformed the ARIMA model for the modelling of syphilis incidence. Time series correlations existed between the models for primary, secondary, tertiary, congenital and latent syphilis. PMID:26901682

  10. Hidden Markov Models for Time Series An Introduction Using R

    CERN Document Server

    Zucchini, Walter

    2009-01-01

    Illustrates the flexibility of HMMs as general-purpose models for time series data. This work presents an overview of HMMs for analyzing time series data, from continuous-valued, circular, and multivariate series to binary data, bounded and unbounded counts and categorical observations.
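The workhorse computation behind such models is the forward algorithm; a scaled version for a toy two-state HMM with discrete observations might look like this:

```python
import numpy as np

def forward_loglik(obs, pi, A, B):
    """Scaled forward algorithm: log-likelihood of a discrete observation
    sequence under an HMM (pi: initial, A: transition, B: emission)."""
    alpha = pi * B[:, obs[0]]
    loglik = np.log(alpha.sum())
    alpha = alpha / alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]       # predict, then weight by emission
        s = alpha.sum()
        loglik += np.log(s)
        alpha = alpha / s                   # rescale to avoid underflow
    return float(loglik)

pi = np.array([0.5, 0.5])
A = np.array([[0.9, 0.1], [0.2, 0.8]])      # state transition probabilities
B = np.array([[0.8, 0.2], [0.3, 0.7]])      # rows: states, columns: symbols
obs = [0, 0, 1, 0, 1, 1]
ll = forward_loglik(obs, pi, A, B)
```

The per-step rescaling is what makes the recursion usable on long series, where the unscaled forward probabilities would underflow.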

  11. Technical Communicator: A New Model for the Electronic Resources Librarian?

    Science.gov (United States)

    Hulseberg, Anna

    2016-01-01

    This article explores whether technical communicator is a useful model for electronic resources (ER) librarians. The fields of ER librarianship and technical communication (TC) originated and continue to develop in relation to evolving technologies. A review of the literature reveals four common themes for ER librarianship and TC. While the…

  12. Short-Term Bus Passenger Demand Prediction Based on Time Series Model and Interactive Multiple Model Approach

    Directory of Open Access Journals (Sweden)

    Rui Xue

    2015-01-01

    Full Text Available Although bus passenger demand prediction has attracted increased attention in recent years, limited research has been conducted on short-term passenger demand forecasting. This paper proposes an interactive multiple model (IMM) filter algorithm-based model to predict short-term passenger demand. After being aggregated into 15-min intervals, passenger demand data collected from a busy bus route over four months were used to generate time series. Considering that passenger demand exhibits different characteristics at different time scales, three time series were developed: weekly, daily, and 15-min. After correlation, periodicity, and stationarity analyses, time series models were constructed. In particular, the heteroscedasticity of the time series was explored to achieve better prediction performance. Finally, the IMM filter algorithm was applied to combine the individual forecasting models and dynamically predict passenger demand for the next interval. Different error indices were adopted to analyse the individual and hybrid models. The performance comparison indicates that the hybrid model's forecasts are more accurate than those of the individual models. The findings of this study are of theoretical and practical significance for bus scheduling.
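    The IMM filter itself maintains mode probabilities through a Markov mixing step; as a much-simplified, hypothetical stand-in for that idea, the sketch below merely reweights two forecasters by their inverse recent mean squared error and blends their predictions:

```python
def inverse_error_weights(recent_errors):
    """Weight each candidate model by the inverse of its recent mean
    squared one-step error (a simplified mixing rule, NOT the full IMM)."""
    inv = [len(e) / sum(x * x for x in e) for e in recent_errors]
    s = sum(inv)
    return [w / s for w in inv]

def combined_forecast(forecasts, weights):
    """Convex combination of the individual model forecasts."""
    return sum(f * w for f, w in zip(forecasts, weights))

# model A has been accurate recently, model B much less so
w = inverse_error_weights([[0.1, -0.2, 0.1], [0.8, 1.0, -0.9]])
yhat = combined_forecast([102.0, 96.0], w)   # pulled strongly toward model A
```

The full IMM adds a transition matrix between "modes" so the weights adapt smoothly rather than being recomputed from scratch each interval.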

  13. Technical training: AXEL-2012 - Introduction to Particle Accelerators

    CERN Multimedia

    HR Department

    2011-01-01

    CERN Technical Training 2012: Learning for the LHC! AXEL-2012 is a course series on particle accelerators, given at CERN within the framework of the Technical Training Program. Part of the BE Department's Operation Group Shutdown Lecture series, the general accelerator physics module has been organized since 2003 as a joint venture between the BE Department and Technical Training, and is open to the wider CERN community. The AXEL-2012 course series is designed for technicians who operate an accelerator or whose work is closely linked to accelerators, but it is open to all technicians, engineers, and physicists interested in this field. The course does not require any prior knowledge of accelerators. However, some basic knowledge of trigonometry, matrices and differential equations, and some basic notions of magnetism would be an advantage. The course series will be composed of 10 one-hour lectures (mornings and afternoons) from the 16th – 20th of January 2012, and given in English with ...

  14. Technical training: AXEL-2011 - Introduction to Particle Accelerators

    CERN Multimedia

    HR Department

    2010-01-01

    CERN Technical Training 2011: Learning for the LHC! AXEL-2011 is a course series on particle accelerators, given at CERN within the framework of the 2011 Technical Training Program. As part of the BE Department’s Operation Group Shutdown Lecture series, the general accelerator physics module has been organized since 2003 as a joint venture between the BE Department and Technical Training, and is open to a wider CERN community. The AXEL-2011 course series is designed for technicians who are operating an accelerator, or whose work is closely linked to accelerators, but it is also open to technicians, engineers, and physicists interested in this field. The course does not require any prior knowledge of accelerators. However, some basic knowledge of trigonometry, matrices and differential equations, and some basic knowledge of magnetism would be an advantage. The series will be composed of 10 one-hour courses (Monday 10.01.2011 – Fri 14.01.2011, from 09:00 to 10:30 and from 14:00 to 15:...

  15. Small Sample Properties of Bayesian Multivariate Autoregressive Time Series Models

    Science.gov (United States)

    Price, Larry R.

    2012-01-01

    The aim of this study was to compare the small sample (N = 1, 3, 5, 10, 15) performance of a Bayesian multivariate vector autoregressive (BVAR-SEM) time series model relative to frequentist power and parameter estimation bias. A multivariate autoregressive model was developed based on correlated autoregressive time series vectors of varying…

  16. Estimating High-Dimensional Time Series Models

    DEFF Research Database (Denmark)

    Medeiros, Marcelo C.; Mendes, Eduardo F.

    We study the asymptotic properties of the Adaptive LASSO (adaLASSO) in sparse, high-dimensional, linear time-series models. We assume that both the number of covariates in the model and the number of candidate variables can increase with the number of observations, and that the number of candidate variables is possibly larger than the number of observations. We show that the adaLASSO consistently chooses the relevant variables as the number of observations increases (model selection consistency) and has the oracle property, even when the errors are non-Gaussian and conditionally heteroskedastic. A simulation study shows...
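    The oracle property mentioned here hinges on the adaptive penalty: each coefficient's penalty is scaled by the inverse of a first-stage estimate, so large coefficients are barely shrunk while small ones are driven exactly to zero. Under an orthonormal design this has a closed form; the numbers below are purely illustrative:

```python
def soft_threshold(b, lam):
    """Per-coefficient lasso solution under an orthonormal design."""
    if b > lam:
        return b - lam
    if b < -lam:
        return b + lam
    return 0.0

def adaptive_lasso(ols_coefs, lam):
    """Adaptive LASSO: each coefficient gets penalty lam / |b_ols|,
    so strong signals are shrunk less than weak ones."""
    return [soft_threshold(b, lam / abs(b)) if b != 0.0 else 0.0
            for b in ols_coefs]

beta = adaptive_lasso([2.0, 0.3, -1.5], lam=0.3)
# the small coefficient 0.3 is zeroed out; the large ones barely shrink
```

In the time-series setting of the paper the design is not orthonormal, so the estimator is computed by penalized least squares rather than this closed form, but the selection mechanism is the same.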

  17. Prospective application of advanced series compensation to improve transmission system performance

    Energy Technology Data Exchange (ETDEWEB)

    Gama, C A; Scavassa, J L; Silva, W.M. da; Silva, J M.M. da; Ponte, J R [ELETRONORTE, Brasilia, DF (Brazil)

    1994-12-31

    This paper describes the main aspects and results of the planning studies undertaken to evaluate the technical benefits of using Thyristor Controlled Series Compensation (TCSC) in a large 500 kV transmission system (the Brazilian North-Northeast system). TCSC controller designs and simplified models for digital non-linear time-domain simulations are discussed. The proposed controllers are tuned and used to support the comparative analysis between controlled and fixed series compensation. The relevant conclusions concerning this comparison are highlighted. (author) 2 refs., 14 figs., 3 tabs.

  18. Predicting long-term catchment nutrient export: the use of nonlinear time series models

    Science.gov (United States)

    Valent, Peter; Howden, Nicholas J. K.; Szolgay, Jan; Komornikova, Magda

    2010-05-01

    After the Second World War, nitrate concentrations in European water bodies changed significantly as a result of increased nitrogen fertilizer use and changes in land use. In recent decades, however, as a consequence of the implementation of nitrate-reducing measures in Europe, nitrate concentrations in water bodies have slowly decreased. As a result, the mean and variance of the observed time series also change with time (nonstationarity and heteroscedasticity). In order to detect changes and properly describe the behaviour of such time series, linear models, such as autoregressive (AR), moving average (MA) and autoregressive moving average (ARMA) models, are no longer suitable. Time series with sudden changes in statistical characteristics can cause various problems in the calibration of traditional water quality models and thus give biased predictions. Proper statistical analysis of these non-stationary and heteroscedastic time series, with the aim of detecting and subsequently explaining the variations in their statistical characteristics, requires the use of nonlinear time series models. This information can then be used to improve the model building and calibration of conceptual water quality models, or to select the right calibration periods in order to produce reliable predictions. The objective of this contribution is to analyze two long time series of nitrate concentrations of the rivers Ouse and Stour with advanced nonlinear statistical modelling techniques and compare their performance with traditional linear models of the ARMA class in order to identify changes in the time series characteristics. The time series were analysed with nonlinear models with multiple regimes represented by self-exciting threshold autoregressive (SETAR) and Markov-switching models (MSW).
The analysis showed that, based on the value of residual sum of squares (RSS) in both datasets, SETAR and MSW models described the time-series better than models of the
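    A SETAR model of the kind used here switches its autoregressive coefficient when the lagged value crosses a threshold. A minimal simulation sketch; the threshold, coefficients and noise level are illustrative and are not estimates from the Ouse or Stour series:

```python
import random

def simulate_setar(n, threshold=0.0, seed=1):
    """Simulate a two-regime SETAR(1) process: the AR coefficient
    depends on which side of the threshold y[t-1] falls."""
    rng = random.Random(seed)
    y = [0.0]
    for _ in range(n - 1):
        phi = 0.8 if y[-1] <= threshold else -0.5   # regime switch on y[t-1]
        y.append(phi * y[-1] + rng.gauss(0.0, 0.1))
    return y

y = simulate_setar(200)
```

Fitting such a model amounts to estimating the threshold together with a separate AR fit per regime, typically by minimizing the pooled residual sum of squares over candidate thresholds.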

  19. Development of knowledge models by linguistic analysis of lexical relationships in technical documents

    International Nuclear Information System (INIS)

    Seguela, Patrick

    2001-01-01

    This research thesis addresses the problem of knowledge acquisition and structuring from technical texts, and the use of this knowledge in the development of models. The author presents the Cameleon method which aims at extracting binary lexical relationships from technical texts by identifying linguistic markers. The relevance of this method is assessed in the case of four different corpuses: a written technical corpus, an oral technical corpus, a corpus of texts of instructions, and a corpus of academic texts. The author reports the development of a model of representation of knowledge of a specific field by using lexical relationships. The method is then applied to develop a model used in document search within a knowledge management system [fr

  20. The use of synthetic input sequences in time series modeling

    International Nuclear Information System (INIS)

    Oliveira, Dair Jose de; Letellier, Christophe; Gomes, Murilo E.D.; Aguirre, Luis A.

    2008-01-01

    In many situations time series models obtained from noise-like data settle to trivial solutions under iteration. This Letter proposes a way of producing a synthetic (dummy) input, that is included to prevent the model from settling down to a trivial solution, while maintaining features of the original signal. Simulated benchmark models and a real time series of RR intervals from an ECG are used to illustrate the procedure

  1. Time Series Modelling using Proc Varmax

    DEFF Research Database (Denmark)

    Milhøj, Anders

    2007-01-01

    In this paper it is demonstrated how various time series problems can be addressed using Proc Varmax. The procedure is rather new, and hence new features like cointegration and testing for Granger causality are included, but it also means that more traditional ARIMA modelling as outlined by Box...

  2. Time-series modeling: applications to long-term finfish monitoring data

    International Nuclear Information System (INIS)

    Bireley, L.E.

    1985-01-01

    The growing concern and awareness that developed during the 1970s over the effects that industry had on the environment caused the electric utility industry in particular to develop monitoring programs. These programs generate long-term series of data that are not very amenable to classical normal-theory statistical analysis. The monitoring data collected from three finfish programs (impingement, trawl and seine) at the Millstone Nuclear Power Station were typical of such series and thus were used to develop methodology that used the full extent of the information in the series. The basis of the methodology was classic Box-Jenkins time-series modeling; however, the models also included deterministic components that involved flow, season and time as predictor variables. Time entered into the models as harmonic regression terms. Of the 32 models fitted to finfish catch data, 19 were found to account for more than 70% of the historical variation. The models were then used to forecast finfish catches a year in advance and comparisons were made to actual data. Usually the confidence intervals associated with the forecasts encompassed most of the observed data. The technique can provide the basis for intervention analysis in future impact assessments
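    The harmonic regression terms mentioned above enter the model as paired sine and cosine covariates at the seasonal frequency. When the record length is a whole number of cycles, the coefficients can be recovered by direct projection; a small sketch with synthetic data standing in for a seasonal catch series:

```python
import math

def harmonic_fit(y, period):
    """Fit y[t] ~ m + a*cos(2*pi*t/period) + b*sin(2*pi*t/period)
    by projection (exact when len(y) spans whole periods, since the
    sine and cosine regressors are then orthogonal)."""
    n = len(y)
    m = sum(y) / n
    a = 2.0 / n * sum((y[t] - m) * math.cos(2 * math.pi * t / period)
                      for t in range(n))
    b = 2.0 / n * sum((y[t] - m) * math.sin(2 * math.pi * t / period)
                      for t in range(n))
    return m, a, b

# synthetic seasonal series with known coefficients (5, 2, 1)
y = [5.0 + 2.0 * math.cos(2 * math.pi * t / 12)
         + 1.0 * math.sin(2 * math.pi * t / 12) for t in range(48)]
m, a, b = harmonic_fit(y, 12)
```

In a full Box-Jenkins fit these terms would sit alongside the ARMA error structure and the flow covariate rather than being fitted alone.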

  3. Evaluation of nonlinearity and validity of nonlinear modeling for complex time series.

    Science.gov (United States)

    Suzuki, Tomoya; Ikeguchi, Tohru; Suzuki, Masuo

    2007-10-01

    Even if an original time series exhibits nonlinearity, it is not always effective to approximate the time series by a nonlinear model because such nonlinear models have high complexity from the viewpoint of information criteria. Therefore, we propose two measures to evaluate both the nonlinearity of a time series and validity of nonlinear modeling applied to it by nonlinear predictability and information criteria. Through numerical simulations, we confirm that the proposed measures effectively detect the nonlinearity of an observed time series and evaluate the validity of the nonlinear model. The measures are also robust against observational noises. We also analyze some real time series: the difference of the number of chickenpox and measles patients, the number of sunspots, five Japanese vowels, and the chaotic laser. We can confirm that the nonlinear model is effective for the Japanese vowel /a/, the difference of the number of measles patients, and the chaotic laser.
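    The information-criterion trade-off the authors invoke (better fit versus higher model complexity) can be sketched with linear models for brevity; the same logic extends to comparing nonlinear candidates. Data, seed, and models below are illustrative:

```python
import math
import random

def aic(rss, n, k):
    """Gaussian AIC up to an additive constant: n*ln(RSS/n) + 2k."""
    return n * math.log(rss / n) + 2 * k

# synthetic data with genuine autocorrelation (AR(1), phi = 0.9)
rng = random.Random(0)
n = 300
y = [0.0]
for _ in range(n - 1):
    y.append(0.9 * y[-1] + rng.gauss(0.0, 0.2))

# candidate 1: constant-mean model
mean = sum(y) / n
rss_mean = sum((v - mean) ** 2 for v in y)
# candidate 2: AR(1) fitted by least squares
phi = (sum(y[t - 1] * y[t] for t in range(1, n))
       / sum(v * v for v in y[:-1]))
rss_ar = sum((y[t] - phi * y[t - 1]) ** 2 for t in range(1, n))

ar_wins = aic(rss_ar, n - 1, 1) < aic(rss_mean, n, 1)
```

A nonlinear model would only be preferred if its fit improvement outweighed its extra parameters under the same criterion, which is precisely the validity question the measures in this paper formalize.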

  4. Simulated lumbar minimally invasive surgery educational model with didactic and technical components.

    Science.gov (United States)

    Chitale, Rohan; Ghobrial, George M; Lobel, Darlene; Harrop, James

    2013-10-01

    The learning and development of technical skills are paramount for neurosurgical trainees. External influences and a need for maximizing efficiency and proficiency have encouraged advancements in simulator-based learning models. The objective was to confirm the importance of establishing an educational curriculum for teaching minimally invasive techniques of pedicle screw placement using a computer-enhanced physical model of percutaneous pedicle screw placement with simultaneous didactic and technical components. A 2-hour educational curriculum was created to educate neurosurgical residents on the anatomy, pathophysiology, and technical aspects associated with image-guided pedicle screw placement. Pre- and post-didactic practical and written scores were analyzed and compared. Scores were calculated for each participant on the basis of the optimal pedicle screw starting point and trajectory for both fluoroscopy and computed tomographic navigation. Eight trainees participated in this module. Mean scores on the written didactic test improved from 78% to 100%. The technical component scores for fluoroscopic guidance improved from 58.8 to 52.9. Technical scores for computed tomography-navigated guidance also improved, from 28.3 to 26.6. Didactic and technical quantitative scores with a simulator-based educational curriculum improved objectively measured resident performance. A minimally invasive spine simulation model and curriculum may serve a valuable function in the education of neurosurgical residents and outcomes for patients.

  5. PSO-MISMO modeling strategy for multistep-ahead time series prediction.

    Science.gov (United States)

    Bao, Yukun; Xiong, Tao; Hu, Zhongyi

    2014-05-01

    Multistep-ahead time series prediction is one of the most challenging research topics in the field of time series modeling and prediction, and is continually under research. Recently, the multiple-input several multiple-outputs (MISMO) modeling strategy has been proposed as a promising alternative for multistep-ahead time series prediction, exhibiting advantages over the two currently dominant strategies, the iterated and the direct strategies. Building on the established MISMO strategy, this paper proposes a particle swarm optimization (PSO)-based MISMO modeling strategy, which is capable of determining the number of sub-models in a self-adaptive mode, with varying prediction horizons. Rather than deriving crisp divides with equal-sized prediction horizons from the established MISMO, the proposed PSO-MISMO strategy, implemented with neural networks, employs a heuristic to create flexible divides with varying sizes of prediction horizons and to generate corresponding sub-models, providing considerable flexibility in model construction, which has been validated with simulated and real datasets.

  6. The DART general equilibrium model: A technical description

    OpenAIRE

    Springer, Katrin

    1998-01-01

    This paper provides a technical description of the Dynamic Applied Regional Trade (DART) General Equilibrium Model. The DART model is a recursive dynamic, multi-region, multi-sector computable general equilibrium model. All regions are fully specified and linked by bilateral trade flows. The DART model can be used to project economic activities, energy use and trade flows for each of the specified regions to simulate various trade policy as well as environmental policy scenarios, and to analy...

  7. On modeling panels of time series

    NARCIS (Netherlands)

    Ph.H.B.F. Franses (Philip Hans)

    2002-01-01

    This paper reviews research issues in modeling panels of time series. Examples of this type of data are annually observed macroeconomic indicators for all countries in the world, daily returns on the individual stocks listed in the S&P500, and the sales records of all items in a

  8. Applicability of Socio-Technical Model (STM) in Working System of Modern Organizations

    Directory of Open Access Journals (Sweden)

    Rosmaini Tasmin

    2011-10-01

    Full Text Available Knowledge has been identified as one of the most important resources in an organization, contributing to competitive advantage. Organizations around the world realize and put into practice approaches based on technological and sociological aspects to fill the gaps in their workplaces. The Socio-Technical Model (STM) is an established organizational model introduced by Trist in the 1960s at the Tavistock Institute, London. It relates the two most common components that exist in all organizations, namely social systems (humans) and technological systems (information technology, machinery and equipment). This paper reviews the socio-technical model from various perspectives on its developmental stages and the ideas written by researchers. Several literature reviews on the socio-technical model have been compiled and discussed to justify whether its basic argument matches the practices required in techno-social environments. Through a socio-technical perspective on Knowledge Management, this paper highlights the interplay between social systems and technological systems. It also suggests that management and leadership play critical roles in establishing the techno-social perspective for the effective assimilation of Knowledge Management practices.

  9. New insights into soil temperature time series modeling: linear or nonlinear?

    Science.gov (United States)

    Bonakdari, Hossein; Moeeni, Hamid; Ebtehaj, Isa; Zeynoddin, Mohammad; Mahoammadian, Abdolmajid; Gharabaghi, Bahram

    2018-03-01

    Soil temperature (ST) is an important dynamic parameter, whose prediction is a major research topic in various fields including agriculture because ST has a critical role in hydrological processes at the soil surface. In this study, a new linear methodology is proposed based on stochastic methods for modeling daily soil temperature (DST). With this approach, the ST series components are determined to carry out modeling and spectral analysis. The results of this process are compared with two linear methods based on seasonal standardization and seasonal differencing in terms of four DST series. The series used in this study were measured at two stations, Champaign and Springfield, at depths of 10 and 20 cm. The results indicate that in all ST series reviewed, the periodic term is the most robust among all components. According to a comparison of the three methods applied to analyze the various series components, it appears that spectral analysis combined with stochastic methods outperformed the seasonal standardization and seasonal differencing methods. In addition to comparing the proposed methodology with linear methods, the ST modeling results were compared with two nonlinear methods in two forms: considering hydrological variables (HV) as input variables, and DST modeling as a time series. In a previous study at the mentioned sites, Kim and Singh (Theor Appl Climatol 118:465-479, 2014) applied the popular Multilayer Perceptron (MLP) neural network and Adaptive Neuro-Fuzzy Inference System (ANFIS) nonlinear methods and considered HV as input variables. The comparison results signify that the relative error projected in estimating DST by the proposed methodology was about 6%, while this value with MLP and ANFIS was over 15%. Moreover, MLP and ANFIS models were employed for DST time series modeling.
Due to these models' relatively inferior performance to the proposed methodology, two hybrid models were implemented: the weights and membership function of MLP and
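    Of the linear pre-processing methods compared in this record, seasonal standardization is the simplest: each observation is centred and scaled by its own season's statistics. A toy sketch (period 4 rather than a daily cycle, with illustrative numbers):

```python
import statistics

def seasonal_standardize(y, period):
    """Remove each season's own mean and standard deviation:
    z[t] = (y[t] - mu[t mod period]) / sd[t mod period]."""
    cols = [[y[t] for t in range(s, len(y), period)] for s in range(period)]
    mu = [statistics.mean(c) for c in cols]
    sd = [statistics.pstdev(c) for c in cols]
    return [(y[t] - mu[t % period]) / sd[t % period] for t in range(len(y))]

# two "years" of a 4-season toy series with a strong seasonal mean
y = [10.0, 20.0, 30.0, 40.0, 12.0, 22.0, 32.0, 42.0]
z = seasonal_standardize(y, period=4)   # -> four -1.0s then four 1.0s
```

The stochastic (e.g. ARMA) model is then fitted to the standardized residual series z rather than to the raw temperatures.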

  10. High-temperature series expansions for random Potts models

    Directory of Open Access Journals (Sweden)

    M.Hellmund

    2005-01-01

    Full Text Available We discuss recently generated high-temperature series expansions for the free energy and the susceptibility of random-bond q-state Potts models on hypercubic lattices. Using the star-graph expansion technique, quenched disorder averages can be calculated exactly for arbitrary uncorrelated coupling distributions while keeping the disorder strength p as well as the dimension d as symbolic parameters. We present analyses of the new series for the susceptibility of the Ising (q=2) and 4-state Potts models in three dimensions, up to order 19 and 18, respectively, and compare our findings with results from field-theoretical renormalization group studies and Monte Carlo simulations.

  11. Time series regression model for infectious disease and weather.

    Science.gov (United States)

    Imai, Chisato; Armstrong, Ben; Chalabi, Zaid; Mangtani, Punam; Hashizume, Masahiro

    2015-10-01

    Time series regression has been developed and long used to evaluate the short-term associations of air pollution and weather with mortality or morbidity of non-infectious diseases. The application of the regression approaches from this tradition to infectious diseases, however, is less well explored and raises some new issues. We discuss and present potential solutions for five issues often arising in such analyses: changes in immune population, strong autocorrelations, a wide range of plausible lag structures and association patterns, seasonality adjustments, and large overdispersion. The potential approaches are illustrated with datasets of cholera cases and rainfall from Bangladesh and influenza and temperature in Tokyo. Though this article focuses on the application of the traditional time series regression to infectious diseases and weather factors, we also briefly introduce alternative approaches, including mathematical modeling, wavelet analysis, and autoregressive integrated moving average (ARIMA) models. Modifications proposed to standard time series regression practice include using sums of past cases as proxies for the immune population, and using the logarithm of lagged disease counts to control autocorrelation due to true contagion, both of which are motivated from "susceptible-infectious-recovered" (SIR) models. The complexity of lag structures and association patterns can often be informed by biological mechanisms and explored by using distributed lag non-linear models. For overdispersed models, alternative distribution models such as quasi-Poisson and negative binomial should be considered. Time series regression can be used to investigate dependence of infectious diseases on weather, but may need modifying to allow for features specific to this context. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
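    Two of the modifications proposed here (sums of past cases as an immune-population proxy, and logged lagged counts to absorb contagion-driven autocorrelation) amount to adding covariates like the ones below; the function name and window length are illustrative, not from the paper:

```python
import math

def infectious_covariates(cases, t, lag=1, immunity_window=26):
    """Covariates for a time series regression on disease counts:
    the log of lagged counts (+1 guards against log(0)) and the sum
    of recent past cases as a crude immune-population proxy."""
    log_lag = math.log(cases[t - lag] + 1)
    immune_proxy = sum(cases[max(0, t - immunity_window):t])
    return log_lag, immune_proxy

cases = [1, 3, 9, 27, 81]
log_lag, immune = infectious_covariates(cases, t=4, lag=1, immunity_window=3)
```

These columns would then enter a quasi-Poisson or negative binomial regression alongside the weather terms and seasonality adjustments.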

  12. Technical resource documents and technical handbooks for hazardous-wastes management

    Energy Technology Data Exchange (ETDEWEB)

    Schomaker, N.B.; Bliss, T.M.

    1986-07-01

    The Environmental Protection Agency is preparing a series of Technical Resource Documents (TRDs) and Technical Handbooks to provide the best engineering control technology to meet the needs of the Resource Conservation and Recovery Act (RCRA) and the Comprehensive Environmental Response, Compensation and Liability Act (CERCLA), respectively. These documents and handbooks are essentially compilations of the research efforts of the Land Pollution Control Division (LPCD) to date. The specific areas of research being conducted under the RCRA land disposal program relate to laboratory, pilot and field validation studies of cover systems, waste leaching and solidification, liner systems and disposal facility evaluation. The technical handbooks provide the EPA Program Offices and Regions, as well as the states and other interested parties, with the latest information relevant to remedial actions.

  13. FRAM Modelling Complex Socio-technical Systems

    CERN Document Server

    Hollnagel, Erik

    2012-01-01

    There has not yet been a comprehensive method that goes behind 'human error' and beyond the failure concept, and various complicated accidents have accentuated the need for it. The Functional Resonance Analysis Method (FRAM) fulfils that need. This book presents a detailed and tested method that can be used to model how complex and dynamic socio-technical systems work, and understand both why things sometimes go wrong but also why they normally succeed.

  14. Rotation in the dynamic factor modeling of multivariate stationary time series.

    NARCIS (Netherlands)

    Molenaar, P.C.M.; Nesselroade, J.R.

    2001-01-01

    A special rotation procedure is proposed for the exploratory dynamic factor model for stationary multivariate time series. The rotation procedure applies separately to each univariate component series of a q-variate latent factor series and transforms such a component, initially represented as white

  15. Exploring the Benefits of Teacher-Modeling Strategies Integrated into Career and Technical Education

    Science.gov (United States)

    Cathers, Thomas J., Sr.

    2013-01-01

    This case study examined how career and technical education classes function using multiple instructional modeling strategies integrated into vocational and technical training environments. Seven New Jersey public school technical teachers received an introductory overview of the investigation and participated by responding to 10 open-end…

  16. CERN Technical Training 2007: IT3T - IT Technical Training Tutorials (Autumn 2007)

    CERN Multimedia

    2007-01-01

    CERN Technical Training and the Internet Services group of the IT department (IT/IS) are jointly organizing a series of free tutorials, addressing some topics of common interest: the IT Technical Training Tutorials (IT3T). The first IT3T series will be offered in October 2007, in French, with the following schedule: IT3T/2007/1 "Introduction to Collaboration Workspaces using SharePoint", October 23rd , 14:30-16:00, Alexandre Lossent IT3T/2007/2 "What is new in Office 2007", October 25th, 14:30-15:30, Emmanuel Ormancey IT3T/2007/3 "Working with Windows Vista at CERN", October 30th, 14:30-15:30, Michal Kwiatek IT3T/2007/4 "Read your mail and more with Outlook 2007", November 1st, 14:30-15:30, Sebastien Dellabella All IT Technical Training Tutorials will take place in the Training Centre Auditorium (building 593, room 11), at 14h30. The tutorials are free of charge, but separate registration to each is required. Participation to any of the tutorials is open: attendance to any ...

  17. Time series analysis as input for clinical predictive modeling: modeling cardiac arrest in a pediatric ICU.

    Science.gov (United States)

    Kennedy, Curtis E; Turley, James P

    2011-10-24

    Thousands of children experience cardiac arrest events every year in pediatric intensive care units. Most of these children die. Cardiac arrest prediction tools are used as part of medical emergency team evaluations to identify patients in standard hospital beds that are at high risk for cardiac arrest. There are no models to predict cardiac arrest in pediatric intensive care units though, where the risk of an arrest is 10 times higher than for standard hospital beds. Current tools are based on a multivariable approach that does not characterize deterioration, which often precedes cardiac arrests. Characterizing deterioration requires a time series approach. The purpose of this study is to propose a method that will allow for time series data to be used in clinical prediction models. Successful implementation of these methods has the potential to bring arrest prediction to the pediatric intensive care environment, possibly allowing for interventions that can save lives and prevent disabilities. We reviewed prediction models from nonclinical domains that employ time series data, and identified the steps that are necessary for building predictive models using time series clinical data. We illustrate the method by applying it to the specific case of building a predictive model for cardiac arrest in a pediatric intensive care unit. Time course analysis studies from genomic analysis provided a modeling template that was compatible with the steps required to develop a model from clinical time series data. The steps include: 1) selecting candidate variables; 2) specifying measurement parameters; 3) defining data format; 4) defining time window duration and resolution; 5) calculating latent variables for candidate variables not directly measured; 6) calculating time series features as latent variables; 7) creating data subsets to measure model performance effects attributable to various classes of candidate variables; 8) reducing the number of candidate features; 9
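    Step 6 of the method above (calculating time series features as latent variables) can be sketched as a window summarizer; the particular features chosen here (mean, slope, range) are illustrative stand-ins, not the study's own feature set:

```python
def window_features(series, start, width):
    """Summarize one time window of a monitored signal as candidate
    model features: mean level, net slope per step, and range."""
    w = series[start:start + width]
    return {
        "mean": sum(w) / len(w),
        "slope": (w[-1] - w[0]) / (len(w) - 1),
        "range": max(w) - min(w),
    }

# toy heart-rate window, sampled once per minute
f = window_features([120, 124, 130, 138, 150], start=0, width=5)
```

Sliding such windows across each vital sign turns the raw time series into a fixed-width feature matrix that standard classifiers can consume.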

  18. Forecasting daily meteorological time series using ARIMA and regression models

    Science.gov (United States)

    Murat, Małgorzata; Malinowska, Iwona; Gos, Magdalena; Krzyszczak, Jaromir

    2018-04-01

    The daily air temperature and precipitation time series recorded between January 1, 1980 and December 31, 2010 at four European sites (Jokioinen, Dikopshof, Lleida and Lublin) from different climatic zones were modeled and forecasted. In our forecasting we used Box-Jenkins and Holt-Winters methods, the seasonal autoregressive integrated moving-average model, the autoregressive integrated moving-average model with external regressors in the form of Fourier terms, and time series regression including trend and seasonality components, implemented with R software. It was demonstrated that the obtained models are able to capture the dynamics of the time series data and to produce sensible forecasts.
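    Of the methods listed, Holt-Winters is the most self-contained to sketch; below is its non-seasonal core (Holt's linear smoothing) in pure Python, with illustrative smoothing constants. On an exactly linear series it reproduces the trend in its forecasts:

```python
def holt_forecast(y, alpha=0.5, beta=0.5, h=3):
    """Holt's linear (double exponential) smoothing: the trend core of
    the Holt-Winters family, with the seasonal component omitted."""
    level, trend = y[1], y[1] - y[0]
    for obs in y[2:]:
        prev_level = level
        level = alpha * obs + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
    return [level + (i + 1) * trend for i in range(h)]

temps = [2.0 + 0.5 * t for t in range(20)]   # exactly linear toy "temperature"
fc = holt_forecast(temps, h=3)               # continues the trend: 12.0, 12.5, 13.0
```

The full Holt-Winters method used in the paper adds a third recursion for an additive or multiplicative seasonal index at the daily-cycle period.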

  19. Hierarchical Hidden Markov Models for Multivariate Integer-Valued Time-Series

    DEFF Research Database (Denmark)

    Catania, Leopoldo; Di Mari, Roberto

    2018-01-01

    We propose a new flexible dynamic model for multivariate nonnegative integer-valued time-series. Observations are assumed to depend on the realization of two additional unobserved integer-valued stochastic variables which control for the time- and cross-dependence of the data. An Expectation-Maximization algorithm for maximum likelihood estimation of the model's parameters is derived. We provide conditional and unconditional (cross)-moments implied by the model, as well as the limiting distribution of the series. A Monte Carlo experiment investigates the finite sample properties of our estimation...

  20. Designing water demand management schemes using a socio-technical modelling approach.

    Science.gov (United States)

    Baki, Sotiria; Rozos, Evangelos; Makropoulos, Christos

    2018-05-01

    Although it is now widely acknowledged that urban water systems (UWSs) are complex socio-technical systems and that a shift towards a socio-technical approach is critical in achieving sustainable urban water management, still, more often than not, UWSs are designed using a segmented modelling approach. As such, either the analysis focuses on the description of the purely technical sub-system, without explicitly taking into account the system's dynamic socio-economic processes, or a more interdisciplinary approach is followed, but delivered through relatively coarse models, which often fail to provide a thorough representation of the urban water cycle and hence cannot deliver accurate estimations of the hydrosystem's responses. In this work we propose an integrated modelling approach for the study of the complete socio-technical UWS that also takes into account socio-economic and climatic variability. We have developed an integrated model, which is used to investigate the diffusion of household water conservation technologies and its effects on the UWS, under different socio-economic and climatic scenarios. The integrated model is formed by coupling a System Dynamics model that simulates the water technology adoption process, and the Urban Water Optioneering Tool (UWOT) for the detailed simulation of the urban water cycle. The model and approach are tested and demonstrated in an urban redevelopment area in Athens, Greece under different socio-economic scenarios and policy interventions. It is suggested that the proposed approach can establish quantifiable links between socio-economic change and UWS responses and therefore assist decision makers in designing more effective and resilient long-term strategies for water conservation. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. Modelling Changes in the Unconditional Variance of Long Stock Return Series

    DEFF Research Database (Denmark)

    Amado, Cristina; Teräsvirta, Timo

    In this paper we develop a testing and modelling procedure for describing the long-term volatility movements over very long return series. For the purpose, we assume that volatility is multiplicatively decomposed into a conditional and an unconditional component as in Amado and Teräsvirta (2011...... show that the long-memory property in volatility may be explained by ignored changes in the unconditional variance of the long series. Finally, based on a formal statistical test we find evidence of the superiority of volatility forecast accuracy of the new model over the GJR-GARCH model at all...... horizons for a subset of the long return series....

  2. Modelling changes in the unconditional variance of long stock return series

    DEFF Research Database (Denmark)

    Amado, Cristina; Teräsvirta, Timo

    2014-01-01

    In this paper we develop a testing and modelling procedure for describing the long-term volatility movements over very long daily return series. For this purpose we assume that volatility is multiplicatively decomposed into a conditional and an unconditional component as in Amado and Teräsvirta...... that the apparent long memory property in volatility may be interpreted as changes in the unconditional variance of the long series. Finally, based on a formal statistical test we find evidence of the superiority of volatility forecasting accuracy of the new model over the GJR-GARCH model at all horizons for eight...... subsets of the long return series....

  3. Small-signal model for the series resonant converter

    Science.gov (United States)

    King, R. J.; Stuart, T. A.

    1985-01-01

    The results of a previous discrete-time model of the series resonant dc-dc converter are reviewed and from these a small signal dynamic model is derived. This model is valid for low frequencies and is based on the modulation of the diode conduction angle for control. The basic converter is modeled separately from its output filter to facilitate the use of these results for design purposes. Experimental results are presented.

  4. Rotation in the Dynamic Factor Modeling of Multivariate Stationary Time Series.

    Science.gov (United States)

    Molenaar, Peter C. M.; Nesselroade, John R.

    2001-01-01

    Proposes a special rotation procedure for the exploratory dynamic factor model for stationary multivariate time series. The rotation procedure applies separately to each univariate component series of a q-variate latent factor series and transforms such a component, initially represented as white noise, into a univariate moving-average.…

  5. Uncertainty and endogenous technical change in climate policy models

    International Nuclear Information System (INIS)

    Baker, Erin; Shittu, Ekundayo

    2008-01-01

    Until recently endogenous technical change and uncertainty have been modeled separately in climate policy models. In this paper, we review the emerging literature that considers both these elements together. Taken as a whole the literature indicates that explicitly including uncertainty has important quantitative and qualitative impacts on optimal climate change technology policy. (author)

  6. Recursive Bayesian recurrent neural networks for time-series modeling.

    Science.gov (United States)

    Mirikitani, Derrick T; Nikolaev, Nikolay

    2010-02-01

    This paper develops a probabilistic approach to recursive second-order training of recurrent neural networks (RNNs) for improved time-series modeling. A general recursive Bayesian Levenberg-Marquardt algorithm is derived to sequentially update the weights and the covariance (Hessian) matrix. The main strengths of the approach are a principled handling of the regularization hyperparameters that leads to better generalization, and stable numerical performance. The framework involves the adaptation of a noise hyperparameter and local weight prior hyperparameters, which represent the noise in the data and the uncertainties in the model parameters. Experimental investigations using artificial and real-world data sets show that RNNs equipped with the proposed approach outperform standard real-time recurrent learning and extended Kalman training algorithms for recurrent networks, as well as other contemporary nonlinear neural models, on time-series modeling.

  7. Model-based Clustering of Categorical Time Series with Multinomial Logit Classification

    Science.gov (United States)

    Frühwirth-Schnatter, Sylvia; Pamminger, Christoph; Winter-Ebmer, Rudolf; Weber, Andrea

    2010-09-01

A common problem in many areas of applied statistics is to identify groups of similar time series in a panel of time series. However, distance-based clustering methods cannot easily be extended to time series data, where an appropriate distance measure is rather difficult to define, particularly for discrete-valued time series. Markov chain clustering, proposed by Pamminger and Frühwirth-Schnatter [6], is an approach for clustering discrete-valued time series obtained by observing a categorical variable with several states. This model-based clustering method is based on finite mixtures of first-order time-homogeneous Markov chain models. In order to further explain group membership we present an extension to the approach of Pamminger and Frühwirth-Schnatter [6] by formulating a probabilistic model for the latent group indicators within the Bayesian classification rule using a multinomial logit model. The parameters are estimated for a fixed number of clusters within a Bayesian framework using a Markov chain Monte Carlo (MCMC) sampling scheme representing a (full) Gibbs-type sampler which involves only draws from standard distributions. Finally, an application to a panel of Austrian wage mobility data is presented which leads to an interesting segmentation of the Austrian labour market.
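The basic building block of Markov chain clustering, a first-order transition matrix estimated from a single categorical series, can be sketched as follows (toy state sequence, not the Austrian wage data; the Laplace smoothing is an added assumption that keeps every row a proper distribution):

```python
import numpy as np

def transition_matrix(seq, n_states):
    """First-order transition matrix estimated from one categorical
    time series, with add-one (Laplace) smoothing."""
    counts = np.ones((n_states, n_states))  # Laplace prior on every cell
    for a, b in zip(seq[:-1], seq[1:]):
        counts[a, b] += 1
    # row-normalize so each row is a probability distribution
    return counts / counts.sum(axis=1, keepdims=True)

# toy wage-category series (states 0, 1, 2)
P = transition_matrix([0, 1, 1, 2, 0, 1, 2, 2, 0, 1], 3)
```

Mixture clustering then groups series whose estimated transition matrices are generated by the same mixture component.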

  8. New Models for Forecasting Enrollments: Fuzzy Time Series and Neural Network Approaches.

    Science.gov (United States)

    Song, Qiang; Chissom, Brad S.

    Since university enrollment forecasting is very important, many different methods and models have been proposed by researchers. Two new methods for enrollment forecasting are introduced: (1) the fuzzy time series model; and (2) the artificial neural networks model. Fuzzy time series has been proposed to deal with forecasting problems within a…

  9. Electronic resource management practical perspectives in a new technical services model

    CERN Document Server

    Elguindi, Anne

    2012-01-01

A significant shift is taking place in libraries, with the purchase of e-resources accounting for the bulk of materials spending. Electronic Resource Management makes the case that technical services workflows need to make a corresponding shift toward e-centric models and highlights the increasing variety of e-formats that are forcing new developments in the field. Six chapters cover key topics, including: technical services models, both past and emerging; staffing and workflow in electronic resource management; implementation and transformation of electronic resource management systems; the ro

  10. Investigation on InSAR Time Series Deformation Model Considering Rheological Parameters for Soft Clay Subgrade Monitoring

    Science.gov (United States)

    Xing, X.; Yuan, Z.; Chen, L. F.; Yu, X. Y.; Xiao, L.

    2018-04-01

Stability control is one of the major technical difficulties in highway subgrade construction engineering. Building a deformation model is a crucial step for InSAR time series deformation monitoring. Most InSAR deformation models used for monitoring are purely empirical mathematical models that do not consider the physical mechanism of the monitored object. In this study, we take rheology into consideration, introducing rheological parameters into traditional InSAR deformation models. To assess the feasibility and accuracy of our new model, both simulated and real deformation data over the Lungui highway (a typical highway built on a soft clay subgrade in Guangdong province, China) are investigated with TerraSAR-X satellite imagery. To solve for the unknowns of the non-linear rheological model, three algorithms, Gauss-Newton (GN), Levenberg-Marquardt (LM), and the Genetic Algorithm (GA), are applied and compared. Considering both calculation efficiency and accuracy, GA is chosen for the new model in our case study. A preliminary real-data experiment is conducted using 17 TerraSAR-X Stripmap images (3-m resolution). With the aforementioned deformation model and GA, the unknown rheological parameters over all high-coherence points are obtained and the LOS deformation (low-pass component) sequences are generated.
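A minimal genetic algorithm of the kind used to estimate the nonlinear model's parameters can be sketched as below; the exponential settlement curve, the parameter bounds, and all GA settings are illustrative assumptions, not the paper's rheological model:

```python
import numpy as np

rng = np.random.default_rng(0)

def creep(t, a, tau):
    # illustrative nonlinear settlement model: a * (1 - exp(-t/tau))
    return a * (1.0 - np.exp(-t / tau))

# synthetic "observed" LOS deformation with noise
t = np.linspace(0.0, 10.0, 40)
obs = creep(t, 30.0, 2.5) + rng.normal(0.0, 0.3, t.size)

def loss(p):
    return np.mean((creep(t, p[0], p[1]) - obs) ** 2)

lo, hi = np.array([0.0, 0.1]), np.array([100.0, 10.0])   # assumed bounds
pop = rng.uniform(lo, hi, (50, 2))
for _ in range(60):
    fitness = np.array([loss(p) for p in pop])
    elite = pop[np.argsort(fitness)[:10]]                   # elitism
    parents = elite[rng.integers(0, 10, (40, 2))]
    w = rng.random((40, 1))
    children = w * parents[:, 0] + (1 - w) * parents[:, 1]  # blend crossover
    children += rng.normal(0.0, 0.5, children.shape)        # Gaussian mutation
    pop = np.vstack([elite, np.clip(children, lo, hi)])

best = pop[np.argmin([loss(p) for p in pop])]  # should approach (30, 2.5)
```

Elitism makes the best loss non-increasing across generations, which is the convergence property one relies on when comparing GA against GN and LM.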

  11. Capturing socio-technical systems with agent-based modelling

    NARCIS (Netherlands)

    Van Dam, K.H.

    2009-01-01

    What is a suitable modelling approach for socio-technical systems? The answer to this question is of great importance to decision makers in large scale interconnected network systems. The behaviour of these systems is determined by many actors, situated in a dynamic, multi-actor, multi-objective and

  12. Degeneracy of time series models: The best model is not always the correct model

    International Nuclear Information System (INIS)

    Judd, Kevin; Nakamura, Tomomichi

    2006-01-01

    There are a number of good techniques for finding, in some sense, the best model of a deterministic system given a time series of observations. We examine a problem called model degeneracy, which has the consequence that even when a perfect model of a system exists, one does not find it using the best techniques currently available. The problem is illustrated using global polynomial models and the theory of Groebner bases

  13. A Personalized Predictive Framework for Multivariate Clinical Time Series via Adaptive Model Selection.

    Science.gov (United States)

    Liu, Zitao; Hauskrecht, Milos

    2017-11-01

Building an accurate predictive model of clinical time series for a patient is critical for understanding the patient's condition, its dynamics, and optimal patient management. Unfortunately, this process is not straightforward. First, patient-specific variations are typically large, and population-based models derived or learned from many different patients are often unable to support accurate predictions for each individual patient. Moreover, the time series observed for one patient at any point in time may be too short and insufficient to learn a high-quality patient-specific model from the patient's own data alone. To address these problems we propose, develop and experiment with a new adaptive forecasting framework for building multivariate clinical time series models for a patient and for supporting patient-specific predictions. The framework relies on an adaptive model switching approach that at any point in time selects the most promising time series model out of a pool of many possible models and consequently combines the advantages of population, patient-specific and short-term individualized predictive models. We demonstrate that the adaptive model switching framework is a very promising approach to support personalized time series prediction, and that it is able to outperform predictions based on pure population and patient-specific models, as well as other patient-specific model adaptation strategies.
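The switching idea, selecting at each time point whichever model in the pool has performed best recently, can be sketched with a two-model pool; the pool members (a running mean standing in for a population model, the last observed value standing in for a short-term individual model) and the window length are illustrative:

```python
import numpy as np

def switching_forecast(y, window=5):
    """One-step forecasts that switch, at every step, to whichever model
    (running mean vs. last observed value) had the smaller mean absolute
    error over the most recent window."""
    y = np.asarray(y, float)
    preds = []
    for t in range(window + 1, len(y)):
        mean_pred = np.mean(y[:t])          # population-style model
        last_pred = y[t - 1]                # short-term individual model
        err_mean = np.mean(np.abs(y[t - window:t] - np.mean(y[:t])))
        err_last = np.mean(np.abs(np.diff(y[t - window - 1:t])))
        preds.append(last_pred if err_last < err_mean else mean_pred)
    return np.array(preds)

y = np.concatenate([np.full(20, 5.0), np.arange(20.0)])   # regime change
preds = switching_forecast(y)  # switches to the last-value model after the change
```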

  14. Technical training: AXEL-2009 - Introduction to Particle Accelerators

    CERN Multimedia

    HR Department

    2009-01-01

CERN Technical Training 2009: Learning for the LHC! AXEL-2009 is a course series on particle accelerators, given at CERN within the framework of the 2009 Technical Training Program. Known in the past as the PS Complex Operation Course (or the ‘PS Shutdown Course’), the general accelerator physics module has been organized since 2003 as a joint venture between the AB Department and Technical Training Service, and is open to a wider CERN community. The AXEL-2009 course series is designed for technicians who are operating an accelerator, or whose work is closely linked to accelerators, but it is also open to technicians, engineers, and physicists interested in this field. The course does not require any prior knowledge of accelerators. However, some basic knowledge of trigonometry, matrices and differential equations, and some basic notions of magnetism would be an advantage. The course series will be composed of 10 one-hour lectures (mornings and afternoons) from 19 to 23 January 2009, and will be given i...

  15. Technical training: AXEL-2006 - Introduction to Particle Accelerators

    CERN Multimedia

    Davide Vitè

    2006-01-01

CERN Technical Training 2006: Learning for the LHC! AXEL-2006 is a course series on particle accelerators, given at CERN within the framework of the 2006 Technical Training Programme. Known in the past as the PS Complex Operation Course (or the 'PS Shutdown Course'), the general accelerator physics module has been organised since 2003 as a joint venture between the AB department and Technical Training, and is open to a wider CERN community. The AXEL-2006 course series is designed for technicians who are operating an accelerator, or whose work is closely linked to accelerators, but it is open to technicians, engineers, and physicists interested in this field. The course does not require any prior knowledge of accelerators. However, some basic knowledge of trigonometry, matrices and differential equations, and some basic notions of magnetism would be an advantage. The course series will be composed of 10 one-hour lectures (mornings and afternoons) during the week 6-10 February 2006, and given in English...

  16. Technical Training: AXEL-2005: Introduction to Particle Accelerators

    CERN Multimedia

    Monique Duval

    2005-01-01

CERN Technical Training 2005: Learning for the LHC! AXEL-2005 is a course series on particle accelerators, given at CERN within the framework of the 2005 Technical Training Programme. Known in the past as the PS Complex Operation Course (or the 'PS Shutdown Course', now AB/OP), the general accelerator physics section has been organised since 2003 as a joint venture between the AB department and Technical Training, and is open to a wider CERN community. The AXEL-2005 course series is designed for technicians who are operating an accelerator, or whose work is closely linked to accelerators, but it is open to all people (technicians, engineers, physicists) interested in this field. The course does not require any prior knowledge of accelerators. However, some basic knowledge of trigonometry, matrices and differential equations, and some basic notions of magnetism would be an advantage. The course series will be composed of 10 one-hour lectures (mornings and afternoons) during the week 14-18 March 2005, given in Fr...

  17. Technical training: AXEL-2009 - Introduction to Particle Accelerators

    CERN Multimedia

    HR Department

    2008-01-01

CERN Technical Training 2009: Learning for the LHC! AXEL-2009 is a course series on particle accelerators, given at CERN within the framework of the 2009 Technical Training Program. Known in the past as the PS Complex Operation Course (or the ‘PS Shutdown Course’), the general accelerator physics module has been organized since 2003 as a joint venture between the AB department and Technical Training, and is open to a wider CERN community. The AXEL-2009 course series is designed for technicians who are operating an accelerator, or whose work is closely linked to accelerators, but it is open to technicians, engineers, and physicists interested in this field. The course does not require any prior knowledge of accelerators. However, some basic knowledge of trigonometry, matrices and differential equations, and some basic notions of magnetism would be an advantage. The course series will be composed of 10 one-hour lectures (mornings and afternoons) from the 19th – 23rd of January 2009, and given in English with...

  18. Technical training: AXEL-2010 - Introduction to particle accelerators

    CERN Multimedia

    HR Department

    2010-01-01

CERN Technical Training 2010: Learning for the LHC! AXEL-2010 is a course series on particle accelerators, given at CERN within the framework of the 2010 Technical Training Program. Known in the past as the PS Complex Operation Course (or the ‘PS Shutdown Course’), the general accelerator physics module has been organized since 2003 as a joint venture between the BE department and Technical Training, and is open to a wider CERN community. The AXEL-2010 course series is designed for technicians who are operating an accelerator, or whose work is closely linked to accelerators, but it is open to technicians, engineers, and physicists interested in this field. The course does not require any prior knowledge of accelerators. However, some basic knowledge of trigonometry, matrices and differential equations, and some basic notions of magnetism would be an advantage. The course series will be composed of 10 one-hour lectures (mornings and afternoons) from the 1st – 5th of February 201...

  19. Energy flow models for the estimation of technical losses in distribution network

    International Nuclear Information System (INIS)

    Au, Mau Teng; Tan, Chin Hooi

    2013-01-01

This paper presents energy flow models developed to estimate technical losses in distribution networks. The energy flow models applied in this paper are based on the input energy and peak demand of the distribution network, feeder length and peak demand, transformer loading capacity, and load factor. Two case studies, an urban distribution network and a rural distribution network, are used to illustrate application of the energy flow models. The results on technical losses obtained for the two distribution networks are consistent and comparable to networks of similar type and characteristics. Hence, the energy flow models are suitable for practical application.
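One common way to turn peak-demand figures and load factor into an annual energy-loss estimate is through a loss load factor; the sketch below uses the widely quoted empirical form LLF = 0.2·LF + 0.8·LF², and both the coefficients and the network figures are illustrative assumptions rather than values from the paper:

```python
def technical_losses_mwh(peak_loss_mw, load_factor, hours=8760):
    """Annual technical losses estimated from losses at peak demand,
    via an empirical loss load factor (LLF)."""
    llf = 0.2 * load_factor + 0.8 * load_factor ** 2  # assumed empirical form
    return peak_loss_mw * llf * hours

# e.g. 1.5 MW of network losses at peak, annual load factor 0.6
print(round(technical_losses_mwh(1.5, 0.6)))  # → 5361
```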

  20. Silverleaf: An Experimental Series in Support of Nightshade

    Energy Technology Data Exchange (ETDEWEB)

    Danielson, Jeremy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Bauer, Amy L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-12-23

    In this series, a prototype package was designed and fielded to address a major technical risk of Nightshade: A comprehensive suite of drives and diagnostics must fit within a small space. In addition, this series satisfies a milestone for the project.

  1. Neural network versus classical time series forecasting models

    Science.gov (United States)

    Nor, Maria Elena; Safuan, Hamizah Mohd; Shab, Noorzehan Fazahiyah Md; Asrul, Mohd; Abdullah, Affendi; Mohamad, Nurul Asmaa Izzati; Lee, Muhammad Hisyam

    2017-05-01

Artificial neural networks (ANNs) have an advantage in time series forecasting because of their potential to solve complex forecasting problems. This is because an ANN is a data-driven approach that can be trained to map past values of a time series. In this study, the forecast performance of a neural network and a classical time series forecasting method, namely the seasonal autoregressive integrated moving average (SARIMA) model, was compared using gold price data. Moreover, the effect of different data preprocessing on the forecast performance of the neural network was examined. Forecast accuracy was evaluated using the mean absolute deviation, root mean square error and mean absolute percentage error. It was found that the ANN produced the most accurate forecast when the Box-Cox transformation was used for data preprocessing.
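The three accuracy measures used in the comparison can be computed directly; a minimal sketch with made-up actual and forecast values:

```python
import numpy as np

def forecast_errors(actual, forecast):
    """Mean absolute deviation, root mean square error, and mean
    absolute percentage error of a forecast."""
    actual = np.asarray(actual, float)
    forecast = np.asarray(forecast, float)
    e = actual - forecast
    return {
        "MAD": np.mean(np.abs(e)),                  # mean absolute deviation
        "RMSE": np.sqrt(np.mean(e ** 2)),           # root mean square error
        "MAPE": 100 * np.mean(np.abs(e / actual)),  # mean absolute % error
    }

m = forecast_errors([100, 102, 105], [98, 103, 109])
```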

  2. Modeling financial time series with S-plus

    CERN Document Server

    Zivot, Eric

    2003-01-01

The field of financial econometrics has exploded over the last decade. This book represents an integration of theory, methods, and examples using the S-PLUS statistical modeling language and the S+FinMetrics module to facilitate the practice of financial econometrics. This is the first book to show the power of S-PLUS for the analysis of time series data. It is written for researchers and practitioners in the finance industry, academic researchers in economics and finance, and advanced MBA and graduate students in economics and finance. Readers are assumed to have a basic knowledge of S-PLUS and a solid grounding in basic statistics and time series concepts. Eric Zivot is an associate professor and Gary Waterman Distinguished Scholar in the Economics Department at the University of Washington, and is co-director of the nascent Professional Master's Program in Computational Finance. He regularly teaches courses on econometric theory, financial econometrics and time series econometrics, and is the recipient of the He...

  3. Nonlinear Prediction Model for Hydrologic Time Series Based on Wavelet Decomposition

    Science.gov (United States)

    Kwon, H.; Khalil, A.; Brown, C.; Lall, U.; Ahn, H.; Moon, Y.

    2005-12-01

Traditionally, forecasting and characterization of hydrologic systems are performed using many techniques. Stochastic linear methods such as AR and ARIMA, and nonlinear ones such as tools based on statistical learning theory, have been used extensively. The common difficulty for all methods is determining sufficient and necessary information and predictors for a successful prediction. Relationships between hydrologic variables are often highly nonlinear and interrelated across temporal scales. A new hybrid approach is proposed for the simulation of hydrologic time series combining the wavelet transform and a nonlinear model. The present model combines the merits of the wavelet transform and nonlinear time series models. The wavelet transform is adopted to decompose a nonlinear hydrologic process into a set of mono-component signals, which are simulated by a nonlinear model. The hybrid methodology is formulated in a manner to improve the accuracy of long-term forecasting. The proposed hybrid model yields much better results in terms of capturing and reproducing the time-frequency properties of the system at hand. Prediction results are promising when compared to traditional univariate time series models. An application demonstrating the plausibility of the proposed methodology is provided, and the results indicate that the wavelet-based time series model can be used to simulate and forecast hydrologic variables reasonably well. This will ultimately serve the purpose of integrated water resources planning and management.
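The decompose, model, and recombine idea can be sketched with a single-level Haar split standing in for the paper's wavelet transform, and a naive persistence forecast standing in for the nonlinear subband models (both stand-ins are illustrative assumptions):

```python
import numpy as np

def haar_split(y):
    """One-level Haar analysis: approximation and detail coefficients."""
    y = np.asarray(y, float)
    a = (y[0::2] + y[1::2]) / np.sqrt(2)   # low-frequency (approximation)
    d = (y[0::2] - y[1::2]) / np.sqrt(2)   # high-frequency (detail)
    return a, d

def haar_merge(a, d):
    """Inverse transform: recover the interleaved even/odd samples."""
    y = np.empty(2 * len(a))
    y[0::2] = (a + d) / np.sqrt(2)
    y[1::2] = (a - d) / np.sqrt(2)
    return y

rng = np.random.default_rng(1)
y = np.sin(np.linspace(0, 6, 64)) + 0.1 * rng.standard_normal(64)

a, d = haar_split(y)
# model each mono-component subband separately (here: trivial persistence),
# then recombine the subband predictions into the full-series prediction
a_hat, d_hat = np.roll(a, 1), np.roll(d, 1)
y_hat = haar_merge(a_hat, d_hat)
```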

  4. Technical Manual for the SAM Biomass Power Generation Model

    Energy Technology Data Exchange (ETDEWEB)

    Jorgenson, J.; Gilman, P.; Dobos, A.

    2011-09-01

    This technical manual provides context for the implementation of the biomass electric power generation performance model in the National Renewable Energy Laboratory's (NREL's) System Advisor Model (SAM). Additionally, the report details the engineering and scientific principles behind the underlying calculations in the model. The framework established in this manual is designed to give users a complete understanding of behind-the-scenes calculations and the results generated.

  5. Multiband Prediction Model for Financial Time Series with Multivariate Empirical Mode Decomposition

    Directory of Open Access Journals (Sweden)

    Md. Rabiul Islam

    2012-01-01

This paper presents a subband approach to financial time series prediction. Multivariate empirical mode decomposition (MEMD) is employed here for multiband representation of multichannel financial time series together. An autoregressive moving average (ARMA) model is used in prediction of each individual subband of any time series data. Then all the predicted subband signals are summed up to obtain the overall prediction. The ARMA model works better for stationary signals. With multiband representation, each subband becomes a band-limited (narrow band) signal and hence better prediction is achieved. The performance of the proposed MEMD-ARMA model is compared with classical EMD, the discrete wavelet transform (DWT), and the full-band ARMA model in terms of the signal-to-noise ratio (SNR) and mean square error (MSE) between the original and predicted time series. The simulation results show that the MEMD-ARMA-based method performs better than the other methods.
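The two evaluation measures, SNR and MSE between the original and predicted series, can be sketched directly:

```python
import numpy as np

def snr_db(original, predicted):
    """Signal-to-noise ratio of a prediction, in decibels."""
    original = np.asarray(original, float)
    predicted = np.asarray(predicted, float)
    noise = original - predicted
    return 10 * np.log10(np.sum(original ** 2) / np.sum(noise ** 2))

def mse(original, predicted):
    """Mean square error between original and predicted series."""
    return np.mean((np.asarray(original, float) - np.asarray(predicted, float)) ** 2)

x = np.array([1.0, 2.0, 3.0, 4.0])
print(snr_db(x, x + 0.1), mse(x, x + 0.1))  # ≈ 28.75 dB, 0.01
```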

  6. Evaluating Technical Efficiency of Nursing Care Using Data Envelopment Analysis and Multilevel Modeling.

    Science.gov (United States)

    Min, Ari; Park, Chang Gi; Scott, Linda D

    2016-05-23

    Data envelopment analysis (DEA) is an advantageous non-parametric technique for evaluating relative efficiency of performance. This article describes use of DEA to estimate technical efficiency of nursing care and demonstrates the benefits of using multilevel modeling to identify characteristics of efficient facilities in the second stage of analysis. Data were drawn from LTCFocUS.org, a secondary database including nursing home data from the Online Survey Certification and Reporting System and Minimum Data Set. In this example, 2,267 non-hospital-based nursing homes were evaluated. Use of DEA with nurse staffing levels as inputs and quality of care as outputs allowed estimation of the relative technical efficiency of nursing care in these facilities. In the second stage, multilevel modeling was applied to identify organizational factors contributing to technical efficiency. Use of multilevel modeling avoided biased estimation of findings for nested data and provided comprehensive information on differences in technical efficiency among counties and states. © The Author(s) 2016.
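The first-stage efficiency estimation can be sketched with the standard input-oriented CCR envelopment program solved as a linear program; the toy staffing/quality numbers below are illustrative, not LTCFocUS data:

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y):
    """Input-oriented CCR efficiency for each DMU.
    X: (n, m) inputs, Y: (n, s) outputs. Returns theta in (0, 1]."""
    n, m = X.shape
    s = Y.shape[1]
    thetas = []
    for o in range(n):
        # variables: [theta, lambda_1..lambda_n]; minimize theta
        c = np.r_[1.0, np.zeros(n)]
        # sum_j lambda_j x_j <= theta * x_o   ->  -x_o*theta + X^T lambda <= 0
        A_in = np.c_[-X[o].reshape(m, 1), X.T]
        # sum_j lambda_j y_j >= y_o           ->  -Y^T lambda <= -y_o
        A_out = np.c_[np.zeros((s, 1)), -Y.T]
        res = linprog(c,
                      A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.r_[np.zeros(m), -Y[o]],
                      bounds=[(0, None)] * (1 + n),
                      method="highs")
        thetas.append(res.x[0])
    return np.array(thetas)

X = np.array([[2.0], [4.0], [8.0]])   # e.g. nurse staffing hours (input)
Y = np.array([[4.0], [4.0], [4.0]])   # e.g. quality score (output)
print(dea_ccr_input(X, Y))  # → [1.0, 0.5, 0.25]
```

A second-stage multilevel model would then regress these theta scores on organizational covariates.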

  7. Quality Quandaries: Time Series Model Selection and Parsimony

    DEFF Research Database (Denmark)

    Bisgaard, Søren; Kulahci, Murat

    2009-01-01

    Some of the issues involved in selecting adequate models for time series data are discussed using an example concerning the number of users of an Internet server. The process of selecting an appropriate model is subjective and requires experience and judgment. The authors believe an important...... consideration in model selection should be parameter parsimony. They favor the use of parsimonious mixed ARMA models, noting that research has shown that a model building strategy that considers only autoregressive representations will lead to non-parsimonious models and to loss of forecasting accuracy....
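Parsimony can be made concrete by fitting nested autoregressive models by least squares and comparing an information criterion; the AR orders, the simulated data, and the AIC form n·log(σ̂²) + 2k below are illustrative choices, not the article's Internet-server example:

```python
import numpy as np

rng = np.random.default_rng(42)

def fit_ar(y, p):
    """Least-squares AR(p) fit; returns coefficients and AIC."""
    Y = y[p:]
    X = np.column_stack([y[p - k:len(y) - k] for k in range(1, p + 1)])
    coef, *_ = np.linalg.lstsq(X, Y, rcond=None)
    sigma2 = np.mean((Y - X @ coef) ** 2)
    aic = len(Y) * np.log(sigma2) + 2 * (p + 1)  # penalize extra parameters
    return coef, aic

# simulate an AR(1) process y_t = 0.6 y_{t-1} + e_t
y = np.zeros(500)
for t in range(1, 500):
    y[t] = 0.6 * y[t - 1] + rng.standard_normal()

coef1, aic1 = fit_ar(y, 1)
coef2, aic2 = fit_ar(y, 2)
# with truly AR(1) data, the extra AR(2) term rarely pays its AIC penalty
```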

  8. Modeling Periodic Impulsive Effects on Online TV Series Diffusion.

    Science.gov (United States)

    Fu, Peihua; Zhu, Anding; Fang, Qiwen; Wang, Xi

    Online broadcasting substantially affects the production, distribution, and profit of TV series. In addition, online word-of-mouth significantly affects the diffusion of TV series. Because on-demand streaming rates are the most important factor that influences the earnings of online video suppliers, streaming statistics and forecasting trends are valuable. In this paper, we investigate the effects of periodic impulsive stimulation and pre-launch promotion on on-demand streaming dynamics. We consider imbalanced audience feverish distribution using an impulsive susceptible-infected-removed(SIR)-like model. In addition, we perform a correlation analysis of online buzz volume based on Baidu Index data. We propose a PI-SIR model to evolve audience dynamics and translate them into on-demand streaming fluctuations, which can be observed and comprehended by online video suppliers. Six South Korean TV series datasets are used to test the model. We develop a coarse-to-fine two-step fitting scheme to estimate the model parameters, first by fitting inter-period accumulation and then by fitting inner-period feverish distribution. We find that audience members display similar viewing habits. That is, they seek new episodes every update day but fade away. This outcome means that impulsive intensity plays a crucial role in on-demand streaming diffusion. In addition, the initial audience size and online buzz are significant factors. On-demand streaming fluctuation is highly correlated with online buzz fluctuation. To stimulate audience attention and interpersonal diffusion, it is worthwhile to invest in promotion near update days. Strong pre-launch promotion is also a good marketing tool to improve overall performance. It is not advisable for online video providers to promote several popular TV series on the same update day. Inter-period accumulation is a feasible forecasting tool to predict the future trend of the on-demand streaming amount. 
The buzz in public social communities

  9. Modeling Periodic Impulsive Effects on Online TV Series Diffusion.

    Directory of Open Access Journals (Sweden)

    Peihua Fu

Online broadcasting substantially affects the production, distribution, and profit of TV series. In addition, online word-of-mouth significantly affects the diffusion of TV series. Because on-demand streaming rates are the most important factor that influences the earnings of online video suppliers, streaming statistics and forecasting trends are valuable. In this paper, we investigate the effects of periodic impulsive stimulation and pre-launch promotion on on-demand streaming dynamics. We consider imbalanced audience feverish distribution using an impulsive susceptible-infected-removed (SIR)-like model. In addition, we perform a correlation analysis of online buzz volume based on Baidu Index data. We propose a PI-SIR model to evolve audience dynamics and translate them into on-demand streaming fluctuations, which can be observed and comprehended by online video suppliers. Six South Korean TV series datasets are used to test the model. We develop a coarse-to-fine two-step fitting scheme to estimate the model parameters, first by fitting inter-period accumulation and then by fitting inner-period feverish distribution. We find that audience members display similar viewing habits. That is, they seek new episodes every update day but fade away. This outcome means that impulsive intensity plays a crucial role in on-demand streaming diffusion. In addition, the initial audience size and online buzz are significant factors. On-demand streaming fluctuation is highly correlated with online buzz fluctuation. To stimulate audience attention and interpersonal diffusion, it is worthwhile to invest in promotion near update days. Strong pre-launch promotion is also a good marketing tool to improve overall performance. It is not advisable for online video providers to promote several popular TV series on the same update day. Inter-period accumulation is a feasible forecasting tool to predict the future trend of the on-demand streaming amount. The buzz in public

  10. Modeling Periodic Impulsive Effects on Online TV Series Diffusion

    Science.gov (United States)

    Fang, Qiwen; Wang, Xi

    2016-01-01

    Background Online broadcasting substantially affects the production, distribution, and profit of TV series. In addition, online word-of-mouth significantly affects the diffusion of TV series. Because on-demand streaming rates are the most important factor that influences the earnings of online video suppliers, streaming statistics and forecasting trends are valuable. In this paper, we investigate the effects of periodic impulsive stimulation and pre-launch promotion on on-demand streaming dynamics. We consider imbalanced audience feverish distribution using an impulsive susceptible-infected-removed(SIR)-like model. In addition, we perform a correlation analysis of online buzz volume based on Baidu Index data. Methods We propose a PI-SIR model to evolve audience dynamics and translate them into on-demand streaming fluctuations, which can be observed and comprehended by online video suppliers. Six South Korean TV series datasets are used to test the model. We develop a coarse-to-fine two-step fitting scheme to estimate the model parameters, first by fitting inter-period accumulation and then by fitting inner-period feverish distribution. Results We find that audience members display similar viewing habits. That is, they seek new episodes every update day but fade away. This outcome means that impulsive intensity plays a crucial role in on-demand streaming diffusion. In addition, the initial audience size and online buzz are significant factors. On-demand streaming fluctuation is highly correlated with online buzz fluctuation. Conclusion To stimulate audience attention and interpersonal diffusion, it is worthwhile to invest in promotion near update days. Strong pre-launch promotion is also a good marketing tool to improve overall performance. It is not advisable for online video providers to promote several popular TV series on the same update day. Inter-period accumulation is a feasible forecasting tool to predict the future trend of the on-demand streaming amount
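The impulsive SIR mechanism, continuous infection and fade-out between update days plus a jump in susceptible attention on each update day, can be sketched as follows; the rates, pulse size, and weekly update period are illustrative assumptions, not fitted values from the paper:

```python
import numpy as np

def pi_sir(days=42, period=7, beta=1.2, gamma=0.5, pulse=300.0, dt=0.01):
    """Euler-integrated SIR audience model with a periodic impulse that
    injects fresh susceptible attention on every episode-update day."""
    S, I, R = 1000.0, 10.0, 0.0
    steps = int(round(days / dt))
    pulse_every = int(round(period / dt))
    infected = np.empty(steps)
    for step in range(steps):
        if step and step % pulse_every == 0:
            S += pulse                      # update-day promotion impulse
        N = S + I + R
        dS = -beta * S * I / N              # susceptibles start watching
        dI = beta * S * I / N - gamma * I   # active viewers fade at rate gamma
        S, I, R = S + dS * dt, I + dI * dt, R + gamma * I * dt
        infected[step] = I
    return infected

I_t = pi_sir()   # sawtooth-like "feverish" viewing trajectory
```

The active-viewer curve I(t) is what would be translated into on-demand streaming volume and compared against observed weekly fluctuations.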

  11. Technical performance of percutaneous and laminectomy leads analyzed by modeling

    NARCIS (Netherlands)

    Manola, L.; Holsheimer, J.

    2004-01-01

    The objective of this study was to compare the technical performance of laminectomy and percutaneous spinal cord stimulation leads with similar contact spacing by computer modeling. Monopolar and tripolar (guarded cathode) stimulation with both lead types in a low-thoracic spine model was simulated

  12. MODELING OF TECHNICAL CHANNELS OF INFORMATION LEAKAGE AT DISTRIBUTED CONTROL OBJECTS

    Directory of Open Access Journals (Sweden)

    Aleksander Vladimirovich Karpov

    2018-05-01

    Full Text Available The significantly increased requirements for the functioning of distributed control objects cannot be met solely by widening and strengthening security control measures. The first step in ensuring information security at such objects is the analysis of their operating conditions and the modeling of technical channels of information leakage. Developing models of such channels is essentially the only method for a complete study of their capabilities, and it is aimed at obtaining quantitative assessments of the safe operation of compound objects. These estimates are needed to decide on the degree of protection of information from leakage according to the current criterion. The existing models were developed for standard concentrated objects and allow the level of information security to be evaluated for each channel separately, which entails a significant increase in the required protective resources and in the time needed to assess the information security of the object as a whole. The article deals with a logical-probabilistic method for the security assessment of structurally compound objects. A model of a security leak at distributed control objects is cited as an example. It is recommended to use a software package for automated structural-logical modeling of compound systems, which makes it possible to evaluate the risk of information leakage in the automated system. The possibility of information leakage through technical channels is evaluated, and differential characteristics of the safe operation of the distributed control objects, such as the positive and negative contributions of the initiating events and conditions that cause a leak, are calculated. Purpose. The aim is a quantitative assessment of data risk, which is necessary for justifying the rational composition of organizational and technical protection measures, as well as a variant of the structure of the information security system from a

  13. Biotrans functional and technical description. Report of VIEWLS WP5, modelling studies

    International Nuclear Information System (INIS)

    Van Tilburg, X.; Egging, R.; Londo, H.M.

    2006-01-01

    The overall objectives of this project are to provide structured and clear data on the availability and performance of biofuels and to identify the possibilities and strategies towards large scale sustainable production, use and trading of biofuels for the transport sector in Europe, including Central and Eastern European Countries (CEEC). The report supplements the two other reports in the work package: 'Biofuel and Bio-energy implementation scenarios - final report of VIEWLS WP5' (2005) and 'VIEWLS modelling and analysis, technical data for biofuel production chains' (2005). This document contains a functional and technical description of the BioTrans model, accompanied by a description of the system. Section 2 contains a conceptual and functional description of the biofuel model. Section 3 describes the optimisation method in technical terms, discussing aspects like the target function and constraints used. Finally, section 4 discusses the input and output requirements for the BioTrans system

  14. MATHEMATICAL MODELS OF PROCESSES AND SYSTEMS OF TECHNICAL OPERATION FOR ONBOARD COMPLEXES AND FUNCTIONAL SYSTEMS OF AVIONICS

    Directory of Open Access Journals (Sweden)

    Sergey Viktorovich Kuznetsov

    2017-01-01

    Full Text Available Modern aircraft are equipped with complicated avionics systems and complexes. The technical operation process of an aircraft and its avionics is observed as a process with changing operation states. Mathematical models of avionics technical operation processes and systems are represented as Markov chains, Markov and semi-Markov processes. The purpose is to develop graph-models of avionics technical operation processes, describing their work in flight as well as during maintenance on the ground in various systems of technical operation. Graph-models of processes and systems of on-board complexes and functional avionics systems in flight are proposed. They are based on the state tables. The models are specified for the various technical operation systems: the system with control of the reliability level, the system with parameter control, and the system with resource control. The events which cause the avionics complexes and functional systems to change their technical state are failures and faults of built-in test equipment. The avionics system of technical operation with reliability-level control is applicable for objects with a constant or slowly time-varying failure rate. The avionics system of technical operation with resource control is mainly used for objects whose failure rate increases over time. The avionics system of technical operation with parameter control is used for objects whose failure rate increases over time and which have generalized parameters that enable forecasting and allow the borders of pre-failure technical states to be assigned. The proposed formal graphical approach to designing models of avionics complexes and systems is the basis for constructing models of complex systems and facilities, both for a single aircraft and for an airline aircraft fleet, or even for the entire aircraft fleet of some specific type. The ultimate graph-models for avionics in various systems of technical operation permit the beginning of
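    A Markov-chain model of operation states, as described above, can be illustrated with a tiny stationary-distribution computation. The three states and all transition probabilities below are invented for the sketch and are not taken from the paper.

```python
# Three aggregated technical-operation states for an avionics complex
# (transition probabilities are illustrative only):
# 0 = in service, 1 = scheduled maintenance, 2 = failed / under repair.
P = [[0.90, 0.08, 0.02],
     [0.70, 0.25, 0.05],
     [0.50, 0.00, 0.50]]

pi = [1.0, 0.0, 0.0]                  # start fully in service
for _ in range(200):                  # power iteration toward the stationary distribution
    pi = [sum(pi[i] * P[i][j] for i in range(3)) for j in range(3)]

availability = pi[0]                  # long-run fraction of time in service
```

    The stationary distribution gives the long-run availability of the complex, which is the kind of quantity such graph-models are built to deliver for a single aircraft or a whole fleet.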

  15. Technical evaluation of bids for nuclear power plants

    International Nuclear Information System (INIS)

    1981-01-01

    In continuation of its efforts to provide comprehensive and impartial guidance to Member States facing the need to introduce nuclear power, the International Atomic Energy Agency is issuing this guidebook as part of a series of guidebooks and codes of practice and, in particular, as a necessary supplement to 'Economic Evaluation of Bids for Nuclear Power Plants: A Guidebook', published by the IAEA in 1976 as Technical Reports Series No.175. The present publication is intended for project managers and senior engineers of electric utilities who are concerned with the evaluation of bids for a nuclear power project. It assumes that the reader has a good knowledge of the technical characteristics of nuclear power plants and of nuclear power project implementation. Its purpose is to provide the information necessary to organize, guide and supervise the technical evaluation of bids for a nuclear power project. It goes without saying that the technical staff carrying out the evaluation must have prior technical experience which cannot be provided by a guidebook

  16. Identification of neutral biochemical network models from time series data.

    Science.gov (United States)

    Vilela, Marco; Vinga, Susana; Maia, Marco A Grivet Mattoso; Voit, Eberhard O; Almeida, Jonas S

    2009-05-05

    The major difficulty in modeling biological systems from multivariate time series is the identification of parameter sets that endow a model with dynamical behaviors sufficiently similar to the experimental data. Directly related to this parameter estimation issue is the task of identifying the structure and regulation of ill-characterized systems. Both tasks are simplified if the mathematical model is canonical, i.e., if it is constructed according to strict guidelines. In this report, we propose a method for the identification of admissible parameter sets of canonical S-systems from biological time series. The method is based on a Monte Carlo process that is combined with an improved version of our previous parameter optimization algorithm. The method maps the parameter space into the network space, which characterizes the connectivity among components, by creating an ensemble of decoupled S-system models that imitate the dynamical behavior of the time series with sufficient accuracy. The concept of sloppiness is revisited in the context of these S-system models with an exploration not only of different parameter sets that produce similar dynamical behaviors but also different network topologies that yield dynamical similarity. The proposed parameter estimation methodology was applied to actual time series data from the glycolytic pathway of the bacterium Lactococcus lactis and led to ensembles of models with different network topologies. In parallel, the parameter optimization algorithm was applied to the same dynamical data upon imposing a pre-specified network topology derived from prior biological knowledge, and the results from both strategies were compared. The results suggest that the proposed method may serve as a powerful exploration tool for testing hypotheses and the design of new experiments.
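    The Monte Carlo search for admissible parameter sets can be sketched for the simplest possible case: a single-variable S-system fitted to a synthetic time series. Everything here (parameter ranges, error threshold, integration scheme) is an illustration of the idea, not the authors' algorithm.

```python
import random

def simulate(alpha, g, beta, h, x0=0.5, dt=0.01, steps=500):
    """Single-variable S-system, dx/dt = alpha*x**g - beta*x**h,
    integrated with forward Euler."""
    xs, x = [x0], x0
    for _ in range(steps):
        # clamp keeps fractional powers real if Euler overshoots below zero
        x = max(x + dt * (alpha * x ** g - beta * x ** h), 1e-9)
        xs.append(x)
    return xs

# Synthetic "observed" time series from a known parameter set
data = simulate(alpha=2.0, g=0.5, beta=1.0, h=1.0)

# Monte Carlo scan: keep parameter sets whose trajectories imitate the data
random.seed(0)
admissible = []
for _ in range(500):
    a, b = random.uniform(0.1, 4.0), random.uniform(0.1, 4.0)
    gg, hh = random.uniform(0.0, 1.0), random.uniform(0.5, 1.5)
    traj = simulate(a, gg, b, hh)
    mse = sum((u - v) ** 2 for u, v in zip(traj, data)) / len(data)
    if mse < 0.05:
        admissible.append((a, gg, b, hh))
```

    Different members of `admissible` can have quite different parameter values yet near-identical dynamics, which is exactly the sloppiness the abstract revisits.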

  17. Using a neural network approach and time series data from an international monitoring station in the Yellow Sea for modeling marine ecosystems.

    Science.gov (United States)

    Zhang, Yingying; Wang, Juncheng; Vorontsov, A M; Hou, Guangli; Nikanorova, M N; Wang, Hongliang

    2014-01-01

    The international marine ecological safety monitoring demonstration station in the Yellow Sea was developed as a collaborative project between China and Russia. It is a nonprofit technical workstation designed as a facility for marine scientific research for public welfare. By undertaking long-term monitoring of the marine environment and automatic data collection, this station will provide valuable information for marine ecological protection and disaster prevention and reduction. The results of some initial research by scientists at the research station into predictive modeling of marine ecological environments and early warning are described in this paper. Marine ecological processes are influenced by many factors, including hydrological and meteorological conditions, biological factors, and human activities. Consequently, it is very difficult to incorporate all these influences and their interactions in a deterministic or analytical model. A prediction model integrating a time series prediction approach with neural network nonlinear modeling is proposed for marine ecological parameters. The model captures the natural fluctuations in marine ecological parameters by learning automatically from the latest observed data and then predicting future values of the parameter. The model is updated in a "rolling" fashion with new observed data from the monitoring station. The results of prediction experiments showed that the neural network prediction model based on time series data is effective for marine ecological prediction and can be used for the development of early warning systems.
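    The "rolling" update scheme described above, refit on all data observed so far, then predict the next value, can be sketched on synthetic data. A linear autoregression stands in for the station's neural network here; the series, lag order, and all values are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(200)
# Synthetic ecological parameter: seasonal cycle plus measurement noise
series = 10 + 2 * np.sin(2 * np.pi * t / 12) + 0.1 * rng.standard_normal(200)

def fit_and_predict(history, p=3):
    """Fit an order-p autoregression by least squares and predict one step ahead."""
    X = np.column_stack([history[i:len(history) - p + i] for i in range(p)])
    X = np.column_stack([X, np.ones(len(history) - p)])   # intercept column
    w, *_ = np.linalg.lstsq(X, history[p:], rcond=None)
    return history[-p:] @ w[:p] + w[p]

# Rolling operation: refit on everything observed so far at every step
preds, actual = [], []
for k in range(150, 200):
    preds.append(fit_and_predict(series[:k]))
    actual.append(series[k])

mae_model = float(np.mean(np.abs(np.array(preds) - np.array(actual))))
mae_naive = float(np.mean(np.abs(series[149:199] - series[150:200])))
```

    The rolling model tracks the seasonal fluctuation and clearly beats the naive "repeat yesterday's value" forecast, which is the point of learning from the latest observed data.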

  18. Bayesian dynamic modeling of time series of dengue disease case counts.

    Science.gov (United States)

    Martínez-Bello, Daniel Adyro; López-Quílez, Antonio; Torres-Prieto, Alexander

    2017-07-01

    The aim of this study is to model the association between weekly time series of dengue case counts and meteorological variables in a high-incidence city of Colombia, applying Bayesian hierarchical dynamic generalized linear models over the period January 2008 to August 2015. Additionally, we evaluate the model's short-term performance for predicting dengue cases. The methodology uses dynamic Poisson log-link models including constant or time-varying coefficients for the meteorological variables. Calendar effects were modeled using constant or first- or second-order random walk time-varying coefficients. The meteorological variables were modeled using constant coefficients and first-order random walk time-varying coefficients. We applied Markov chain Monte Carlo simulations for parameter estimation, and the deviance information criterion (DIC) for model selection. We assessed the short-term predictive performance of the selected final model at several time points within the study period using the mean absolute percentage error. The best model included first-order random walk time-varying coefficients for the calendar trend and for the meteorological variables. Beyond the computational challenges, interpreting the results requires a complete analysis of the dengue time series with respect to the parameter estimates of the meteorological effects. We found small mean absolute percentage errors for one- or two-week out-of-sample predictions at most prediction points, associated with low-volatility periods in the dengue counts. We discuss the advantages and limitations of dynamic Poisson models for studying the association between time series of dengue disease and meteorological variables. The key conclusion of the study is that dynamic Poisson models account for the dynamic nature of the variables involved in the modeling of time series of dengue disease, producing useful
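    The model class itself, a Poisson log link driven by random-walk time-varying coefficients, is easy to write down as a generative sketch. This is not the paper's MCMC estimation; all variances, the temperature covariate, and the baseline are invented to show how the pieces fit together.

```python
import math, random

random.seed(42)
weeks = 120
temp = [25 + 3 * math.sin(2 * math.pi * w / 52) for w in range(weeks)]  # weekly mean temperature

trend = math.log(20)   # calendar trend on the log scale (first-order random walk)
beta = 0.0             # time-varying coefficient of temperature (first-order random walk)
counts = []
for w in range(weeks):
    trend += random.gauss(0, 0.02)
    beta += random.gauss(0, 0.005)
    lam = math.exp(trend + beta * (temp[w] - 25.0))     # Poisson mean via the log link
    # Draw a Poisson variate (Knuth's multiplication method)
    limit, k, p = math.exp(-lam), 0, 1.0
    while p > limit:
        k += 1
        p *= random.random()
    counts.append(k - 1)
```

    Inference in the paper runs the other way, recovering `trend` and `beta` from observed counts by MCMC, but simulating forward like this makes the role of each random-walk component concrete.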

  19. Quality Concerns in Technical Education in India: A Quantifiable Quality Enabled Model

    Science.gov (United States)

    Gambhir, Victor; Wadhwa, N. C.; Grover, Sandeep

    2016-01-01

    Purpose: The paper aims to discuss current Technical Education scenarios in India. It proposes modelling the factors affecting quality in a technical institute and then applying a suitable technique for assessment, comparison and ranking. Design/methodology/approach: The paper chooses a graph-theoretic approach for quantification of quality-enabled…

  20. A 25-kW Series-Resonant Power Converter

    Science.gov (United States)

    Frye, R. J.; Robson, R. R.

    1986-01-01

    Prototype exhibited efficiency of 93.9 percent. 25-kW resonant dc/dc power converter designed, developed, fabricated, and tested, using Westinghouse D7ST transistors as high-power switches. D7ST transistor characterized for use as switch in series-resonant converters, and refined base-drive circuit developed. Technical base includes advanced switching, magnetic, and filter components, mathematical circuit models, control philosophies, and switch-drive strategies. Power-system benefits such as lower losses when used for high-voltage distribution, and reduced magnetics and filter mass realized.

  1. TIME SERIES MODELS OF THREE SETS OF RXTE OBSERVATIONS OF 4U 1543–47

    International Nuclear Information System (INIS)

    Koen, C.

    2013-01-01

    The X-ray nova 4U 1543–47 was in a different physical state (low/hard, high/soft, and very high) during the acquisition of each of the three time series analyzed in this paper. Standard time series models of the autoregressive moving average (ARMA) family are fitted to these series. The low/hard data can be adequately modeled by a simple low-order model with fixed coefficients, once the slowly varying mean count rate has been accounted for. The high/soft series requires a higher order model, or an ARMA model with variable coefficients. The very high state is characterized by a succession of 'dips', with roughly equal depths. These seem to appear independently of one another. The underlying stochastic series can again be modeled by an ARMA form, or roughly as the sum of an ARMA series and white noise. The structuring of each model in terms of short-lived aperiodic and 'quasi-periodic' components is discussed.
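    The low/hard-state recipe above, account for a slowly varying mean count rate, then fit a simple low-order model, can be illustrated on synthetic data. The series, the moving-average detrending, and the Yule-Walker AR(1) estimate below are an illustration, not the RXTE analysis itself.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000
phi_true = 0.6
# AR(1) "flickering" component plus a slowly varying mean count rate
x = np.zeros(n)
eps = rng.standard_normal(n)
for t in range(1, n):
    x[t] = phi_true * x[t - 1] + eps[t]
series = (5.0 + 0.001 * np.arange(n)) + x

# Remove the slowly varying mean with a broad moving average,
# then keep the central part to avoid edge effects
trend = np.convolve(series, np.ones(101) / 101, mode="same")
d = (series - trend)[200:-200]

# Yule-Walker estimate of the AR(1) coefficient
phi_hat = float(np.dot(d[:-1], d[1:]) / np.dot(d, d))
```

    Once the slow trend is accounted for, the lag-1 autocorrelation recovers the short-memory coefficient, mirroring how a fixed-coefficient low-order ARMA model can adequately describe the detrended low/hard series.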

  2. Agent-Based Modeling and Analysis of Socio-Technical Systems

    NARCIS (Netherlands)

    Sharpanskykh, O.

    2011-01-01

    Socio-technical systems are characterized by high structural and behavioral complexities, which impede understanding and modeling of such systems. In particular, reciprocal relations between diverse local system processes that determine global system dynamics are not well understood. In this article

  3. Road safety forecasts in five European countries using structural time series models.

    Science.gov (United States)

    Antoniou, Constantinos; Papadimitriou, Eleonora; Yannis, George

    2014-01-01

    Modeling road safety development is a complex task and needs to consider both the quantifiable impact of specific parameters as well as the underlying trends that cannot always be measured or observed. The objective of this research is to apply structural time series models for obtaining reliable medium- to long-term forecasts of road traffic fatality risk using data from 5 countries with different characteristics from all over Europe (Cyprus, Greece, Hungary, Norway, and Switzerland). Two structural time series models are considered: (1) the local linear trend model and (2) the latent risk time series model. Furthermore, a structured decision tree for the selection of the applicable model for each situation (developed within the Road Safety Data, Collection, Transfer and Analysis [DaCoTA] research project, cofunded by the European Commission) is outlined. First, the fatality and exposure data that are used for the development of the models are presented and explored. Then, the modeling process is presented, including the model selection process, introduction of intervention variables, and development of mobility scenarios. The forecasts using the developed models appear to be realistic and within acceptable confidence intervals. The proposed methodology is proved to be very efficient for handling different cases of data availability and quality, providing an appropriate alternative from the family of structural time series models in each country. A concluding section providing perspectives and directions for future research is presented.
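    The local linear trend model mentioned above is a two-state (level and slope) state-space model and can be filtered with a few lines of Kalman recursions. The synthetic "fatality" series and all noise variances below are illustrative, not the DaCoTA data or calibration.

```python
import numpy as np

def llt_filter(y, q_level=0.01, q_slope=1e-4, r=4.0):
    """Kalman filter for the local linear trend model:
    y_t = level_t + eps_t;  level_{t+1} = level_t + slope_t + xi_t;
    slope_{t+1} = slope_t + zeta_t."""
    F = np.array([[1.0, 1.0], [0.0, 1.0]])   # state transition
    H = np.array([[1.0, 0.0]])               # observation matrix
    Q = np.diag([q_level, q_slope])
    x = np.array([y[0], 0.0])
    P = np.eye(2) * 100.0                    # vague initial state covariance
    for obs in y:
        x = F @ x                            # predict
        P = F @ P @ F.T + Q
        S = H @ P @ H.T + r                  # update
        K = (P @ H.T) / S
        x = x + (K * (obs - H @ x)).ravel()
        P = (np.eye(2) - K @ H) @ P
    return x

rng = np.random.default_rng(7)
t = np.arange(120)
fatalities = 100.0 - 0.8 * t + 2.0 * rng.standard_normal(120)  # synthetic declining risk
state = llt_filter(fatalities)
level, slope = float(state[0]), float(state[1])
forecast = [level + h * slope for h in range(1, 6)]            # medium-term extrapolation
```

    The filtered slope is the unobserved "underlying trend" the abstract refers to; extrapolating level and slope gives the medium-term forecast (the latent risk model additionally links fatalities to an exposure series).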

  4. New series of 3D lattice integrable models

    International Nuclear Information System (INIS)

    Mangazeev, V.V.; Sergeev, S.M.; Stroganov, Yu.G.

    1993-01-01

    In this paper we present a new series of 3-dimensional integrable lattice models with N colors. The weight functions of the models satisfy modified tetrahedron equations with N states and give a commuting family of two-layer transfer-matrices. The dependence on the spectral parameters corresponds to the static limit of the modified tetrahedron equations and weights are parameterized in terms of elliptic functions. The models contain two free parameters: elliptic modulus and additional parameter η. 12 refs

  5. PVUSA model technical specification for a turnkey photovoltaic power system

    Energy Technology Data Exchange (ETDEWEB)

    Dows, R.N.; Gough, E.J.

    1995-11-01

    One of the five objectives of PVUSA is to offer U.S. utilities hands-on experience in designing, procuring, and operating PV systems. The procurement process included the development of a detailed set of technical requirements for a PV system. PVUSA embodied its requirements in a technical specification used as an attachment to its contracts for four utility-scale PV systems in the 200 kW to 500 kW range. The technical specification has also been adapted and used by several utilities. The PVUSA Technical Specification has now been updated and is presented here as a Model Technical Specification (MTS) for utility use. The MTS text is also furnished on a computer disk in Microsoft Word 6.0 so that it may be conveniently adapted by each user. The text includes guidance in the form of comments and by the use of parentheses to indicate where technical information must be developed and inserted. Commercial terms and conditions will reflect the procurement practice of the buyer. The reader is referred to PG&E Report Number 95-3090000.1, PVUSA Procurement, Acceptance and Rating Practices for Photovoltaic Power Plants (1995) for PVUSA experience and practice. The MTS is regarded by PVUSA as a use-proven document, but needs to be adapted with care and attention to detail.

  6. A Sandwich-Type Standard Error Estimator of SEM Models with Multivariate Time Series

    Science.gov (United States)

    Zhang, Guangjian; Chow, Sy-Miin; Ong, Anthony D.

    2011-01-01

    Structural equation models are increasingly used as a modeling tool for multivariate time series data in the social and behavioral sciences. Standard error estimators of SEM models, originally developed for independent data, require modifications to accommodate the fact that time series data are inherently dependent. In this article, we extend a…

  7. Nonlinearity, Breaks, and Long-Range Dependence in Time-Series Models

    DEFF Research Database (Denmark)

    Hillebrand, Eric Tobias; Medeiros, Marcelo C.

    We study the simultaneous occurrence of long memory and nonlinear effects, such as parameter changes and threshold effects, in ARMA time series models and apply our modeling framework to daily realized volatility. Asymptotic theory for parameter estimation is developed and two model building...

  8. Single-Index Additive Vector Autoregressive Time Series Models

    KAUST Repository

    LI, YEHUA; GENTON, MARC G.

    2009-01-01

    We study a new class of nonlinear autoregressive models for vector time series, where the current vector depends on single-indexes defined on the past lags and the effects of different lags have an additive form. A sufficient condition is provided

  9. Computational model for simulation small testing launcher, technical solution

    Energy Technology Data Exchange (ETDEWEB)

    Chelaru, Teodor-Viorel, E-mail: teodor.chelaru@upb.ro [University POLITEHNICA of Bucharest - Research Center for Aeronautics and Space, Str. Ghe Polizu, nr. 1, Bucharest, Sector 1 (Romania); Cristian, Barbu, E-mail: barbucr@mta.ro [Military Technical Academy, Romania, B-dul. George Coşbuc, nr. 81-83, Bucharest, Sector 5 (Romania); Chelaru, Adrian, E-mail: achelaru@incas.ro [INCAS -National Institute for Aerospace Research Elie Carafoli, B-dul Iuliu Maniu 220, 061126, Bucharest, Sector 6 (Romania)

    2014-12-10

    The purpose of this paper is to present some aspects regarding the computational model and technical solutions for a multistage suborbital launcher for testing (SLT) used to test spatial equipment and scientific measurements. The computational model consists in the numerical simulation of SLT evolution for different start conditions. The launcher model presented has six degrees of freedom (6DOF) and variable mass. The results analysed are the flight parameters and ballistic performance. The discussion focuses on the technical possibility of realizing a small multi-stage launcher by recycling military rocket motors. From a technical point of view, the paper is focused on the national project 'Suborbital Launcher for Testing' (SLT), which is based on hybrid propulsion and control systems obtained through an original design. Therefore, while classical suborbital sounding rockets are unguided and use a solid-fuel motor for propulsion in an uncontrolled ballistic flight, the SLT project introduces a different approach by proposing the creation of a guided suborbital launcher, which is basically a satellite launcher at a smaller scale, containing its main subsystems. This is why the project itself can be considered an intermediary step in the development of a wider range of launching systems based on hybrid propulsion technology, which may have a major impact on future European launcher programs. The SLT project, as shown in the title, has two major objectives: first, a short-term objective, which consists in obtaining a suborbital launching system able to go into service in a predictable period of time, and a long-term objective, which consists in the development and testing of some unconventional subsystems to be integrated later in the satellite launcher as part of the European space program. This is why the technical content of the project must be carried out beyond the range of the existing suborbital

  10. Identification of neutral biochemical network models from time series data

    Directory of Open Access Journals (Sweden)

    Maia Marco

    2009-05-01

    Full Text Available Abstract Background The major difficulty in modeling biological systems from multivariate time series is the identification of parameter sets that endow a model with dynamical behaviors sufficiently similar to the experimental data. Directly related to this parameter estimation issue is the task of identifying the structure and regulation of ill-characterized systems. Both tasks are simplified if the mathematical model is canonical, i.e., if it is constructed according to strict guidelines. Results In this report, we propose a method for the identification of admissible parameter sets of canonical S-systems from biological time series. The method is based on a Monte Carlo process that is combined with an improved version of our previous parameter optimization algorithm. The method maps the parameter space into the network space, which characterizes the connectivity among components, by creating an ensemble of decoupled S-system models that imitate the dynamical behavior of the time series with sufficient accuracy. The concept of sloppiness is revisited in the context of these S-system models with an exploration not only of different parameter sets that produce similar dynamical behaviors but also different network topologies that yield dynamical similarity. Conclusion The proposed parameter estimation methodology was applied to actual time series data from the glycolytic pathway of the bacterium Lactococcus lactis and led to ensembles of models with different network topologies. In parallel, the parameter optimization algorithm was applied to the same dynamical data upon imposing a pre-specified network topology derived from prior biological knowledge, and the results from both strategies were compared. The results suggest that the proposed method may serve as a powerful exploration tool for testing hypotheses and the design of new experiments.

  11. Time-series modeling of long-term weight self-monitoring data.

    Science.gov (United States)

    Helander, Elina; Pavel, Misha; Jimison, Holly; Korhonen, Ilkka

    2015-08-01

    Long-term self-monitoring of weight is beneficial for weight maintenance, especially after weight loss. Connected weight scales accumulate time series information over the long term and hence enable time series analysis of the data. Such analysis can reveal individual patterns, provide more sensitive detection of significant weight trends, and enable more accurate and timely prediction of weight outcomes. However, long-term self-weighing data present several challenges which complicate the analysis. In particular, irregular sampling, missing data, and the existence of periodic (e.g. diurnal and weekly) patterns are common. In this study, we apply a time series modeling approach to daily weight time series from two individuals and describe the information that can be extracted from this kind of data. We study the properties of weight time series data, missing data and its link to individuals' behavior, periodic patterns, and weight series segmentation. Understanding behavior through weight data and giving relevant feedback is desirable to enable positive interventions on health behaviors.
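    The two headline challenges, irregular sampling with missing days and an underlying trend, can be sketched with a gap-filling step followed by a trend estimate. The simulated weigh-in pattern, weekly cycle, and interpolation choice below are illustrative assumptions, not the study's data or method.

```python
import math, random

random.seed(9)
days = 60
# True weight: slow loss plus a weekly cycle; roughly 30% of days unmeasured
true_w = [80 - 0.03 * d + 0.4 * math.sin(2 * math.pi * d / 7) for d in range(days)]
obs = {d: true_w[d] + random.gauss(0, 0.2) for d in range(days) if random.random() > 0.3}

# Fill gaps by linear interpolation between the nearest observed days
obs_days = sorted(obs)
filled = []
for d in range(days):
    if d in obs:
        filled.append(obs[d])
        continue
    lo = max((x for x in obs_days if x < d), default=None)
    hi = min((x for x in obs_days if x > d), default=None)
    if lo is None:
        filled.append(obs[hi])
    elif hi is None:
        filled.append(obs[lo])
    else:
        w = (d - lo) / (hi - lo)
        filled.append((1 - w) * obs[lo] + w * obs[hi])

# Least-squares slope: the long-term weight trend (kg/day)
mean_d = sum(range(days)) / days
mean_y = sum(filled) / days
slope = sum((d - mean_d) * (y - mean_y) for d, y in enumerate(filled)) \
        / sum((d - mean_d) ** 2 for d in range(days))
```

    Even with a third of the days missing, the regularized series recovers the slow downward trend; the weekly cycle would then be modeled as a periodic component on top of it.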

  12. Technical Manual for the SAM Physical Trough Model

    Energy Technology Data Exchange (ETDEWEB)

    Wagner, M. J.; Gilman, P.

    2011-06-01

    NREL, in conjunction with Sandia National Laboratories and the U.S. Department of Energy, developed the System Advisor Model (SAM) analysis tool for renewable energy system performance and economic analysis. This paper documents the technical background and engineering formulation for one of the two parabolic trough system models in SAM. The Physical Trough model calculates performance relationships based on physical first principles where possible, allowing the modeler to predict electricity production for a wider range of component geometries than is possible in the Empirical Trough model. This document describes the major parabolic trough plant subsystems in detail, including the solar field, power block, thermal storage, piping, auxiliary heating, and control systems. The model makes use of both existing subsystem performance modeling approaches and new approaches developed specifically for SAM.

  13. Automated Bayesian model development for frequency detection in biological time series

    Directory of Open Access Journals (Sweden)

    Oldroyd Giles ED

    2011-06-01

    Full Text Available Abstract Background A first step in building a mathematical model of a biological system is often the analysis of the temporal behaviour of key quantities. Mathematical relationships between the time and frequency domain, such as Fourier Transforms and wavelets, are commonly used to extract information about the underlying signal from a given time series. This one-to-one mapping from time points to frequencies inherently assumes that both domains contain the complete knowledge of the system. However, for truncated, noisy time series with background trends this unique mapping breaks down and the question reduces to an inference problem of identifying the most probable frequencies. Results In this paper we build on the method of Bayesian Spectrum Analysis and demonstrate its advantages over conventional methods by applying it to a number of test cases, including two types of biological time series. Firstly, oscillations of calcium in plant root cells in response to microbial symbionts are non-stationary and noisy, posing challenges to data analysis. Secondly, circadian rhythms in gene expression measured over only two cycles highlights the problem of time series with limited length. The results show that the Bayesian frequency detection approach can provide useful results in specific areas where Fourier analysis can be uninformative or misleading. We demonstrate further benefits of the Bayesian approach for time series analysis, such as direct comparison of different hypotheses, inherent estimation of noise levels and parameter precision, and a flexible framework for modelling the data without pre-processing. Conclusions Modelling in systems biology often builds on the study of time-dependent phenomena. Fourier Transforms are a convenient tool for analysing the frequency domain of time series. However, there are well-known limitations of this method, such as the introduction of spurious frequencies when handling short and noisy time series, and

  14. Automated Bayesian model development for frequency detection in biological time series.

    Science.gov (United States)

    Granqvist, Emma; Oldroyd, Giles E D; Morris, Richard J

    2011-06-24

    A first step in building a mathematical model of a biological system is often the analysis of the temporal behaviour of key quantities. Mathematical relationships between the time and frequency domain, such as Fourier Transforms and wavelets, are commonly used to extract information about the underlying signal from a given time series. This one-to-one mapping from time points to frequencies inherently assumes that both domains contain the complete knowledge of the system. However, for truncated, noisy time series with background trends this unique mapping breaks down and the question reduces to an inference problem of identifying the most probable frequencies. In this paper we build on the method of Bayesian Spectrum Analysis and demonstrate its advantages over conventional methods by applying it to a number of test cases, including two types of biological time series. Firstly, oscillations of calcium in plant root cells in response to microbial symbionts are non-stationary and noisy, posing challenges to data analysis. Secondly, circadian rhythms in gene expression measured over only two cycles highlights the problem of time series with limited length. The results show that the Bayesian frequency detection approach can provide useful results in specific areas where Fourier analysis can be uninformative or misleading. We demonstrate further benefits of the Bayesian approach for time series analysis, such as direct comparison of different hypotheses, inherent estimation of noise levels and parameter precision, and a flexible framework for modelling the data without pre-processing. Modelling in systems biology often builds on the study of time-dependent phenomena. Fourier Transforms are a convenient tool for analysing the frequency domain of time series. However, there are well-known limitations of this method, such as the introduction of spurious frequencies when handling short and noisy time series, and the requirement for uniformly sampled data. Biological time
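    In its simplest setting (a single stationary sinusoid with known noise variance), Bayesian Spectrum Analysis reduces to exponentiating the Schuster periodogram, so the posterior concentrates sharply on the most probable frequency even for a short, noisy record. The sketch below uses synthetic data and that simplest case only; the full method in the paper handles unknown noise levels and background trends.

```python
import math, random

random.seed(3)
n = 80
f_true = 0.1     # cycles per sample
# Short, noisy, truncated oscillation (e.g. a calcium-like signal)
y = [math.sin(2 * math.pi * f_true * t) + 0.5 * random.gauss(0, 1) for t in range(n)]
mean_y = sum(y) / n
y = [v - mean_y for v in y]           # remove the mean (a crude background correction)

sigma2 = 0.25                         # assumed known noise variance
freqs = [k / 1000 for k in range(1, 500)]
log_post = []
for f in freqs:
    c = sum(v * math.cos(2 * math.pi * f * t) for t, v in enumerate(y))
    s = sum(v * math.sin(2 * math.pi * f * t) for t, v in enumerate(y))
    C = (c * c + s * s) / n           # Schuster periodogram
    log_post.append(C / sigma2)       # log posterior over frequency, up to a constant

m = max(log_post)
post = [math.exp(lp - m) for lp in log_post]
z = sum(post)
post = [p / z for p in post]
f_map = freqs[post.index(max(post))]
```

    Because the posterior exponentiates the periodogram, it is far more peaked than the periodogram itself, which is why the approach can pin down a frequency from only a couple of cycles where a raw Fourier view is ambiguous.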

  15. A Time Series Regime Classification Approach for Short-Term Forecasting; Identificacion de Mecanismos en Series Temporales para la Prediccion a Corto Plazo

    Energy Technology Data Exchange (ETDEWEB)

    Gallego, C. J.

    2010-03-08

    Abstract: This technical report focuses on the analysis of stochastic processes that switch between different dynamics (also called regimes or mechanisms) over time. So-called switching-regime models consider several underlying functions instead of one. In this case, a classification problem arises, as the current regime has to be assessed at each time step. Identifying the regimes makes it possible to use regime-switching models for short-term forecasting. Within this framework, the aim of this work is to identify the different regimes exhibited by time series. The proposed approach is based on a statistical tool called the Gamma test. One of the main advantages of this methodology is that no mathematical definition of the different underlying functions is required. Applications with both simulated and real wind power data have been considered. Results on simulated time series show that regimes can be successfully identified under certain hypotheses. Nevertheless, this work highlights that further research is needed when considering real wind power time series, which usually show different behaviours (e.g. fluctuations or ramps followed by low-variance periods). A better understanding of these events will eventually improve wind power forecasting. (Author) 15 refs.

  16. Hybrid model for forecasting time series with trend, seasonal and calendar variation patterns

    Science.gov (United States)

    Suhartono; Rahayu, S. P.; Prastyo, D. D.; Wijayanti, D. G. P.; Juliyanto

    2017-09-01

    Most monthly time series data in economics and business in Indonesia and other Moslem countries not only contain trend and seasonal patterns, but are also affected by two types of calendar variation effects, i.e. the effect of the number of working or trading days, and holiday effects. The purpose of this research is to develop a hybrid model, a combination of several forecasting models, to predict time series that contain trend, seasonal and calendar variation patterns. This hybrid model is a combination of classical models (namely time series regression and the ARIMA model) and/or modern methods (an artificial intelligence method, i.e. Artificial Neural Networks). A simulation study was used to show that the proposed procedure for building the hybrid model works well for forecasting time series with trend, seasonal and calendar variation patterns. Furthermore, the proposed hybrid model is applied to forecasting real data, i.e. monthly data on the inflow and outflow of currency at Bank Indonesia. The results show that the hybrid model tends to provide more accurate forecasts than the individual forecasting models. Moreover, this result is in line with the third conclusion of the M3 competition, i.e. that the hybrid model on average provides a more accurate forecast than an individual model.
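    The hybrid idea can be sketched as a two-stage pipeline: a deterministic trend-plus-seasonal regression, followed by a simple AR(1) correction fitted to the residuals (standing in here for the ARIMA/ANN stages of the paper; all numbers and helper names are illustrative assumptions):

```python
import math

def hybrid_fit(y, period=12):
    """Stage 1: linear trend + seasonal means; stage 2: AR(1) on residuals."""
    n = len(y)
    tbar = (n - 1) / 2
    ybar = sum(y) / n
    sxx = sum((t - tbar) ** 2 for t in range(n))
    slope = sum((t - tbar) * (v - ybar) for t, v in enumerate(y)) / sxx
    intercept = ybar - slope * tbar
    detr = [v - (intercept + slope * t) for t, v in enumerate(y)]
    seas = [sum(detr[m::period]) / len(detr[m::period]) for m in range(period)]
    resid = [detr[t] - seas[t % period] for t in range(n)]
    den = sum(r * r for r in resid[:-1])
    phi = sum(resid[t] * resid[t - 1] for t in range(1, n)) / den if den > 1e-12 else 0.0
    return intercept, slope, seas, phi, resid[-1]

def hybrid_forecast_one_step(fit, t_next, period=12):
    intercept, slope, seas, phi, last_resid = fit
    return intercept + slope * t_next + seas[t_next % period] + phi * last_resid

# Deterministic toy series: trend + monthly seasonal + a small wiggle.
season = [2, 1, 0, -1, -2, -1, 0, 1, 2, 1, 0, -3]
y = [10 + 0.5 * t + season[t % 12] + 0.05 * math.sin(1.3 * t) for t in range(60)]
pred = hybrid_forecast_one_step(hybrid_fit(y), 60)
actual = 10 + 0.5 * 60 + season[0] + 0.05 * math.sin(1.3 * 60)
```

    Calendar-variation effects would enter stage 1 as extra regressors (working-day counts, holiday dummies) in the same way as the seasonal terms.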

  17. Tempered fractional time series model for turbulence in geophysical flows

    Science.gov (United States)

    Meerschaert, Mark M.; Sabzikar, Farzad; Phanikumar, Mantha S.; Zeleke, Aklilu

    2014-09-01

    We propose a new time series model for velocity data in turbulent flows. The new model employs tempered fractional calculus to extend the classical 5/3 spectral model of Kolmogorov. Application to wind speed and water velocity in a large lake are presented, to demonstrate the practical utility of the model.
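    The tempering idea can be illustrated with a toy spectral density: keep Kolmogorov's k^(-5/3) law in the inertial range but damp it at low wavenumbers. The form below, (k^2 + lambda^2)^(-5/6), is a hedged illustration of the low-frequency tempering effect, not necessarily the paper's exact model:

```python
def kolmogorov_spectrum(k, c=1.0):
    # Classical inertial-range law E(k) = C k^(-5/3); diverges as k -> 0.
    return c * k ** (-5.0 / 3.0)

def tempered_spectrum(k, lam=0.1, c=1.0):
    # Tempered version: agrees with k^(-5/3) for k >> lam,
    # but stays finite as k -> 0, as observed in geophysical data.
    return c * (k * k + lam * lam) ** (-5.0 / 6.0)
```

    For k well above the tempering scale lam the two laws are indistinguishable; below it the tempered spectrum flattens instead of blowing up.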

  18. Extending the Applicability of the Generalized Likelihood Function for Zero-Inflated Data Series

    Science.gov (United States)

    Oliveira, Debora Y.; Chaffe, Pedro L. B.; Sá, João H. M.

    2018-03-01

    Proper uncertainty estimation for data series with a high proportion of zero and near zero observations has been a challenge in hydrologic studies. This technical note proposes a modification to the Generalized Likelihood function that accounts for zero inflation of the error distribution (ZI-GL). We compare the performance of the proposed ZI-GL with the original Generalized Likelihood function using the entire data series (GL) and by simply suppressing zero observations (GLy>0). These approaches were applied to two interception modeling examples characterized by data series with a significant number of zeros. The ZI-GL produced better uncertainty ranges than the GL as measured by the precision, reliability and volumetric bias metrics. The comparison between ZI-GL and GLy>0 highlights the need for further improvement in the treatment of residuals from near zero simulations when a linear heteroscedastic error model is considered. Aside from the interception modeling examples illustrated herein, the proposed ZI-GL may be useful for other hydrologic studies, such as for the modeling of the runoff generation in hillslopes and ephemeral catchments.

  19. Time series modeling for syndromic surveillance

    Directory of Open Access Journals (Sweden)

    Mandl Kenneth D

    2003-01-01

    Full Text Available Abstract Background Emergency department (ED)-based syndromic surveillance systems identify abnormally high visit rates that may be an early signal of a bioterrorist attack. For example, an anthrax outbreak might first be detectable as an unusual increase in the number of patients reporting to the ED with respiratory symptoms. Reliably identifying these abnormal visit patterns requires a good understanding of the normal patterns of healthcare usage. Unfortunately, systematic methods for determining the expected number of ED visits on a particular day have not yet been well established. We present here a generalized methodology for developing models of expected ED visit rates. Methods Using time-series methods, we developed robust models of ED utilization for the purpose of defining expected visit rates. The models were based on nearly a decade of historical data at a major metropolitan academic, tertiary-care pediatric emergency department. The historical data were fit using trimmed-mean seasonal models, and additional models were fit with autoregressive integrated moving average (ARIMA) residuals to account for recent trends in the data. The detection capabilities of the model were tested with simulated outbreaks. Results Models were built both for overall visits and for respiratory-related visits, classified according to the chief complaint recorded at the beginning of each visit. The mean absolute percentage error of the ARIMA models was 9.37% for overall visits and 27.54% for respiratory visits. A simple detection system based on the ARIMA model of overall visits was able to detect 7-day-long simulated outbreaks of 30 visits per day with 100% sensitivity and 97% specificity. Sensitivity decreased with outbreak size, dropping to 94% for outbreaks of 20 visits per day, and 57% for 10 visits per day, all while maintaining a 97% benchmark specificity. Conclusions Time series methods applied to historical ED utilization data are an important tool
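    A minimal detector in the spirit of the trimmed-mean seasonal baseline, stratified by weekday, with an assumed 1.5-standard-deviation alarm threshold (the synthetic counts and the threshold are illustrative, not the paper's fitted model):

```python
def trimmed_mean(values, trim=0.1):
    """Mean after dropping the lowest and highest `trim` fraction."""
    xs = sorted(values)
    k = int(len(xs) * trim)
    core = xs[k:len(xs) - k] if len(xs) > 2 * k else xs
    return sum(core) / len(core)

def alarm(history, today_count, weekday, threshold=1.5):
    """history: list of (weekday, visit_count) pairs.
    Flags a count that exceeds the same-weekday trimmed-mean baseline
    by more than `threshold` standard deviations."""
    same_day = [c for d, c in history if d == weekday]
    base = trimmed_mean(same_day)
    sd = (sum((c - base) ** 2 for c in same_day) / len(same_day)) ** 0.5
    return today_count > base + threshold * sd

# Ten weeks of synthetic ED counts with mild day-to-day variation.
history = [(d % 7, 100 + (d % 11) - 5) for d in range(70)]
```

    In the paper's setup the baseline also carries ARIMA-modelled recent trends; the same comparison against an expected rate plus a tolerance band applies.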

  20. A MODEL FOR INTEGRATED SOFTWARE TO IMPROVE COMMUNICATION POLICY IN DENTAL TECHNICAL LABS

    Directory of Open Access Journals (Sweden)

    Minko M. Milev

    2017-06-01

    Full Text Available Introduction: Integrated marketing communications (IMC are all kinds of communications between organisations and customers, partners, other organisations and society. Aim: To develop and present an integrated software model, which can improve the effectiveness of communications in dental technical services. Material and Methods: The model of integrated software is based on recommendations of a total of 700 respondents (students of dental technology, dental physicians, dental technicians and patients of dental technical laboratories in Northeastern Bulgaria. Results and Discussion: We present the benefits of future integrated software to improve the communication policy in the dental technical laboratory that meets the needs of fast cooperation and well-built communicative network between dental physicians, dental technicians, patients and students. Conclusion: The use of integrated communications could be a powerful unified approach to improving the communication policy between all players at the market of dental technical services.

  1. Volterra-series-based nonlinear system modeling and its engineering applications: A state-of-the-art review

    Science.gov (United States)

    Cheng, C. M.; Peng, Z. K.; Zhang, W. M.; Meng, G.

    2017-03-01

    Nonlinear problems have drawn great interest and extensive attention from engineers, physicists, mathematicians and many other scientists, because most real systems are inherently nonlinear in nature. To model and analyze nonlinear systems, many mathematical theories and methods have been developed, including the Volterra series. In this paper, the basic definition of the Volterra series is recapitulated, together with some frequency-domain concepts derived from it, including the generalized frequency response function (GFRF), the nonlinear output frequency response function (NOFRF), the output frequency response function (OFRF) and the associated frequency response function (AFRF). The relationships between the Volterra series and other nonlinear system models and nonlinear problem-solving methods are discussed, including the Taylor series, Wiener series, NARMAX model, Hammerstein model, Wiener model, Wiener-Hammerstein model, harmonic balance method, perturbation method and Adomian decomposition. The challenging problems in the study of series convergence and of kernel identification, and the state of the art in each, are comprehensively introduced. In addition, a detailed review is given of the applications of the Volterra series in mechanical engineering, aeroelasticity, control engineering, and electronic and electrical engineering.
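    The structure of a truncated Volterra series is easy to make concrete: the output is a linear convolution with a first-order kernel plus a double convolution with a second-order kernel. A small sketch with hand-picked kernels (all values illustrative):

```python
def volterra_response(x, h1, h2):
    """Second-order truncated Volterra series:
    y[n] = sum_k h1[k] x[n-k]
         + sum_{k1,k2} h2[k1][k2] x[n-k1] x[n-k2]."""
    m = len(h1)
    y = []
    for n in range(len(x)):
        def xd(k):
            # Delayed input, zero outside the record.
            return x[n - k] if 0 <= n - k < len(x) else 0.0
        lin = sum(h1[k] * xd(k) for k in range(m))
        quad = sum(h2[k1][k2] * xd(k1) * xd(k2)
                   for k1 in range(m) for k2 in range(m))
        y.append(lin + quad)
    return y

# The memoryless special case: kernels concentrated at zero lag
# reproduce the static nonlinearity y = x + 0.5 x^2.
h1 = [1.0, 0.0]
h2 = [[0.5, 0.0], [0.0, 0.0]]
out = volterra_response([2.0, -1.0, 3.0], h1, h2)
```

    Kernel identification, one of the challenges reviewed in the paper, is the inverse problem: recovering h1 and h2 from measured input-output pairs.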

  2. Tempered fractional time series model for turbulence in geophysical flows

    International Nuclear Information System (INIS)

    Meerschaert, Mark M; Sabzikar, Farzad; Phanikumar, Mantha S; Zeleke, Aklilu

    2014-01-01

    We propose a new time series model for velocity data in turbulent flows. The new model employs tempered fractional calculus to extend the classical 5/3 spectral model of Kolmogorov. Application to wind speed and water velocity in a large lake are presented, to demonstrate the practical utility of the model. (paper)

  3. A prediction method based on wavelet transform and multiple models fusion for chaotic time series

    International Nuclear Information System (INIS)

    Zhongda, Tian; Shujiang, Li; Yanhong, Wang; Yi, Sha

    2017-01-01

    In order to improve the prediction accuracy of chaotic time series, a prediction method based on wavelet transform and multiple-model fusion is proposed. The chaotic time series is decomposed and reconstructed by wavelet transform, and approximation components and detail components are obtained. According to the different characteristics of each component, a least squares support vector machine (LSSVM) is used as the predictive model for the approximation components, with an improved free search algorithm utilized for predictive model parameter optimization. An autoregressive integrated moving average (ARIMA) model is used as the predictive model for the detail components. The predictions of the multiple models are fused by the Gauss–Markov algorithm; the error variance of the fused result is less than that of any single model, so the prediction accuracy is improved. The simulation results are compared on two typical chaotic time series, the Lorenz time series and the Mackey–Glass time series. The simulation results show that the prediction method in this paper achieves better prediction accuracy.
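    The Gauss–Markov fusion step amounts to an inverse-variance weighted average, whose fused variance is necessarily smaller than that of any single predictor. A sketch (the numeric example is illustrative):

```python
def gauss_markov_fuse(estimates, variances):
    """Minimum-variance unbiased combination of independent predictors:
    weights are proportional to the inverse error variances."""
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    fused = sum(w * e for w, e in zip(weights, estimates)) / total
    fused_variance = 1.0 / total  # always below min(variances)
    return fused, fused_variance

# Two predictors of the same quantity, e.g. the LSSVM and ARIMA components.
fused, fused_var = gauss_markov_fuse([10.0, 12.0], [1.0, 4.0])
```

    The more accurate predictor (variance 1.0) dominates the combination, and the fused variance (0.8) is below either input variance, which is the property the paper exploits.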

  4. PETRA - Technical implementation. PETRA working no. 9

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-09-01

    This report describes the technical implementation of PETRA. The report is intended for specialist users and refers to the series of reports describing the project. The PETRA system consists of a number of econometric models, representing the national travel demand in Denmark. Application of these models requires the definition of a scenario, a number of runs with the individual models and extensive data transfer between the models and a database containing base data and results. The system contains three basic scenarios to which changes in various assumptions can be applied. It is possible to construct more basic scenarios but this is outside the scope of using the model as it presently stands. The focus of this report is thus on the specification of changes to basic scenario, on running the model - including description of the data flows, and on the possibilities for analysis of the results. (au) EFP-94. 11 refs.

  5. Wave basin model tests of technical-biological bank protection

    Science.gov (United States)

    Eisenmann, J.

    2012-04-01

    Sloped embankments of inland waterways are usually protected from erosion and other negative impacts of ship-induced hydraulic loads by technical revetments consisting of riprap. Concerning the dimensioning of such bank protection there are several design rules available, e.g. the "Principles for the Design of Bank and Bottom Protection for Inland Waterways" or the Code of Practice "Use of Standard Construction Methods for Bank and Bottom Protection on Waterways" issued by the BAW (Federal Waterways Engineering and Research Institute). Since the European Water Framework Directive was put into action, special emphasis has been placed on natural banks; therefore the application of technical-biological bank protection is favoured. Currently, design principles for technical-biological bank protection on inland waterways are missing. The existing experience mainly refers to flowing waters with no or low ship-induced hydraulic loads on the banks. Since 2004 the Federal Waterways Engineering and Research Institute has been tracking the research and development project "Alternative Technical-Biological Bank Protection on Inland Waterways" in company with the Federal Institute of Hydrology. The investigation to date includes the examination of waterway sections where technical-biological bank protection is applied locally. For the development of design rules for technical-biological bank protection, investigations shall be carried out in a next step, considering the mechanics and resilience of technical-biological bank protection with special attention to ship-induced hydraulic loads. The presentation gives a short introduction to hydraulic loads at inland waterways and their bank protection. In more detail, model tests of a willow brush mattress as a technical-biological bank protection in a wave basin are explained. Within the scope of these tests the brush mattresses were exposed to wave impacts to determine their resilience towards hydraulic loads. Since the

  6. SSM: Inference for time series analysis with State Space Models

    OpenAIRE

    Dureau, Joseph; Ballesteros, Sébastien; Bogich, Tiffany

    2013-01-01

    The main motivation behind the open source library SSM is to reduce the technical friction that prevents modellers from sharing their work, quickly iterating in crisis situations, and making their work directly usable by public authorities to serve decision-making.

  7. Technical know-how for modeling of geological environment. (1) Overview and groundwater flow modeling

    International Nuclear Information System (INIS)

    Saegusa, Hiromitsu; Takeuchi, Shinji; Maekawa, Keisuke; Osawa, Hideaki; Semba, Takeshi

    2011-01-01

    It is important for site characterization projects to manage the decision-making process with transparency and traceability and to transfer the technical know-how accumulated during the research and development to the implementing phase and to future generations. The modeling of a geological environment is used to synthesize investigation results. Evaluation of the impact of uncertainties in the model is important to identify and prioritize key issues for further investigations. Therefore, a plan for site characterization should be made based on the results of the modeling. The aim of this study is to support the planning of initial surface-based site characterization based on the technical know-how accumulated from the Mizunami Underground Research Laboratory Project and the Horonobe Underground Research Laboratory Project. These projects are broad scientific studies of the deep geological environment that are a basis for research and development for the geological disposal of high-level radioactive wastes. In this study, the work-flow of groundwater flow modeling, which is one of the geological environment models and is used for setting the area for geological environment modeling and for groundwater flow characterization, and the related decision-making process using literature data have been summarized. (author)

  8. A time series model: First-order integer-valued autoregressive (INAR(1))

    Science.gov (United States)

    Simarmata, D. M.; Novkaniza, F.; Widyaningsih, Y.

    2017-07-01

    Nonnegative integer-valued time series arise in many applications. A time series model, the first-order integer-valued autoregressive model (INAR(1)), is constructed with the binomial thinning operator to model nonnegative integer-valued time series. INAR(1) depends on one previous period of the process. The parameter of the model can be estimated by Conditional Least Squares (CLS). The specification of INAR(1) follows that of AR(1). Forecasting in INAR(1) uses a median or Bayesian forecasting methodology. The median forecasting methodology finds the least integer s for which the cumulative distribution function (CDF) up to s is greater than or equal to 0.5. The Bayesian forecasting methodology forecasts h steps ahead by generating the model parameter and the parameter of the innovation term using Adaptive Rejection Metropolis Sampling within Gibbs sampling (ARMS), then finding the least integer s for which the CDF up to s is greater than or equal to u, where u is a value drawn from the Uniform(0,1) distribution. INAR(1) is applied to monthly pneumonia cases in Penjaringan, Jakarta Utara, from January 2008 until April 2016.
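    The binomial thinning construction and the CLS estimate can be sketched as follows; Poisson innovations and all parameter values are illustrative assumptions, not the paper's fitted model:

```python
import math
import random

def poisson(lam, rng):
    # Knuth's multiplication method for a Poisson draw.
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def thin(alpha, x, rng):
    # Binomial thinning alpha o x: each of the x counts survives w.p. alpha.
    return sum(1 for _ in range(x) if rng.random() < alpha)

def simulate_inar1(alpha, lam, n, seed=1):
    # X_t = alpha o X_{t-1} + e_t, with Poisson(lam) innovations e_t.
    rng = random.Random(seed)
    x = [poisson(lam, rng)]
    for _ in range(n - 1):
        x.append(thin(alpha, x[-1], rng) + poisson(lam, rng))
    return x

def cls_alpha(x):
    # Conditional least squares: OLS slope of X_t on X_{t-1}.
    prev, curr = x[:-1], x[1:]
    pbar, cbar = sum(prev) / len(prev), sum(curr) / len(curr)
    num = sum((a - pbar) * (b - cbar) for a, b in zip(prev, curr))
    den = sum((a - pbar) ** 2 for a in prev)
    return num / den

series = simulate_inar1(alpha=0.6, lam=2.0, n=5000)
alpha_hat = cls_alpha(series)
```

    Unlike a Gaussian AR(1), every value of the simulated process is a nonnegative integer, which is the point of the thinning operator.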

  9. FY 1985 scientific and technical reports, articles, papers and presentations

    Science.gov (United States)

    Turner, Joyce E. (Compiler)

    1985-01-01

    This document presents formal NASA technical reports, papers published in technical journals, and presentations by Marshall Space Flight Center (MSFC) personnel in FY 85. It also includes papers of MSFC contractors. After being announced in STAR, all of the NASA series reports may be obtained from the National Technical Information Service (NTIS), 5285 Port Royal Road, Springfield, Va. 22161.

  10. Women and Technical Professions. Leonardo da Vinci Series: Good Practices.

    Science.gov (United States)

    Commission of the European Communities, Brussels (Belgium). Directorate-General for Education and Culture.

    This document profiles programs for women in technical professions that are offered through the European Commission's Leonardo da Vinci program. The following programs are profiled: (1) Artemis and Diana (vocational guidance programs to help direct girls toward technology-related careers); (2) CEEWIT (an Internet-based information and…

  11. On the maximum-entropy/autoregressive modeling of time series

    Science.gov (United States)

    Chao, B. F.

    1984-01-01

    The autoregressive (AR) model of a random process is interpreted in the light of Prony's relation, which relates a complex conjugate pair of poles of the AR process in the z-plane (or the z domain) on the one hand, to the complex frequency of one complex harmonic function in the time domain on the other. Thus the AR model of a time series is one that models the time series as a linear combination of complex harmonic functions, which include pure sinusoids and real exponentials as special cases. An AR model is completely determined by its z-domain pole configuration. The maximum-entropy/autoregressive (ME/AR) spectrum, defined on the unit circle of the z-plane (or the frequency domain), is nothing but a convenient, though ambiguous, visual representation. It is asserted that the position and shape of a spectral peak are determined by the corresponding complex frequency, and the height of the spectral peak contains little information about the complex amplitude of the complex harmonic functions.
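    Prony's relation for an AR(2) process can be made concrete: a complex conjugate pole pair r e^(+-i w) corresponds to one damped sinusoid of frequency w and damping radius r. A small sketch (parameter values are illustrative):

```python
import cmath
import math

def ar2_pole(a1, a2):
    """For x_t = a1*x_{t-1} + a2*x_{t-2} + e_t, return (frequency, radius)
    of the upper-half-plane pole of z^2 - a1*z - a2."""
    disc = cmath.sqrt(a1 * a1 + 4 * a2)
    pole = (a1 + disc) / 2
    return abs(cmath.phase(pole)), abs(pole)

# A damped sinusoid x_t = r^t cos(w t) satisfies the AR(2) recursion with
# a1 = 2 r cos(w) and a2 = -r^2; the pole recovers (w, r) exactly.
w, r = 0.7, 0.95
freq, radius = ar2_pole(2 * r * math.cos(w), -r * r)
```

    The pole angle gives the position of the corresponding ME/AR spectral peak, while the radius controls its width: poles closer to the unit circle produce sharper peaks.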

  12. Assimilation of LAI time-series in crop production models

    Science.gov (United States)

    Kooistra, Lammert; Rijk, Bert; Nannes, Louis

    2014-05-01

    Agriculture is worldwide a large consumer of freshwater, nutrients and land. Spatially explicit agricultural management activities (e.g., fertilization, irrigation) could significantly improve efficiency in resource use. In previous studies and operational applications, remote sensing has been shown to be a powerful method for spatio-temporal monitoring of actual crop status. As a next step, yield forecasting by assimilating remote sensing based plant variables in crop production models would improve agricultural decision support both at the farm and field level. In this study we investigated the potential of remote sensing based Leaf Area Index (LAI) time-series assimilated in the crop production model LINTUL to improve yield forecasting at field level. The effect of the assimilation method and the amount of assimilated observations was evaluated. The LINTUL-3 crop production model was calibrated and validated for a potato crop on two experimental fields in the south of the Netherlands. A range of data sources (e.g., in-situ soil moisture and weather sensors, destructive crop measurements) was used for calibration of the model for the experimental field in 2010. LAI from Cropscan field radiometer measurements and actual LAI measured with the LAI-2000 instrument were used as input for the LAI time-series. The LAI time-series were assimilated in the LINTUL model and validated for a second experimental field on which potatoes were grown in 2011. Yield in 2011 was simulated with an R2 of 0.82 when compared with field measured yield. Furthermore, we analysed the potential of assimilation of LAI into the LINTUL-3 model through the 'updating' assimilation technique. The deviation between measured and simulated yield decreased from 9371 kg/ha to 8729 kg/ha when assimilating weekly LAI measurements in the LINTUL model over the season of 2011. LINTUL-3 furthermore shows the main growth reducing factors, which are useful for farm decision support.
The combination of crop models and sensor

  13. Markov Chain Modelling for Short-Term NDVI Time Series Forecasting

    Directory of Open Access Journals (Sweden)

    Stepčenko Artūrs

    2016-12-01

    Full Text Available In this paper, an NDVI time series forecasting model has been developed based on a discrete-time, continuous-state Markov chain of suitable order. The normalised difference vegetation index (NDVI) is an indicator that describes the amount of chlorophyll (the green mass) and shows the relative density and health of vegetation; therefore, it is an important variable for vegetation forecasting. A Markov chain is a stochastic process defined on a state space; the process undergoes transitions from one state to another in the state space with certain probabilities. A Markov chain forecast model is flexible in accommodating various forecast assumptions and structures. The present paper discusses the considerations and techniques in building a Markov chain forecast model at each step. The continuous-state Markov chain model is analytically described. Finally, the application of the proposed Markov chain model is illustrated with reference to a set of NDVI time series data.
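    A minimal discrete-state version of the Markov chain forecast shows the mechanics (the continuous-state model in the paper is richer; equal-width binning and the toy series are illustrative assumptions):

```python
def fit_markov_chain(series, bins=3):
    """Discretise into equal-width bins and estimate the transition matrix."""
    lo, hi = min(series), max(series)
    width = (hi - lo) / bins
    states = [min(int((v - lo) / width), bins - 1) for v in series]
    counts = [[0] * bins for _ in range(bins)]
    for a, b in zip(states, states[1:]):
        counts[a][b] += 1
    matrix = []
    for row in counts:
        s = sum(row)
        matrix.append([c / s if s else 1.0 / bins for c in row])
    return matrix, states[-1], lo, width

def markov_forecast(matrix, state, lo, width):
    """One-step forecast: expected bin centre under the current row."""
    expected = sum(j * p for j, p in enumerate(matrix[state]))
    return lo + (expected + 0.5) * width

# Toy NDVI-like series cycling through three vegetation levels.
ndvi = [0.1, 0.5, 0.9] * 20
matrix, last, lo, width = fit_markov_chain(ndvi)
pred = markov_forecast(matrix, last, lo, width)
```

    For this perfectly periodic series the estimated transitions are deterministic, so the one-step forecast lands in the bin that follows the last observed state.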

  14. Scientific and Technical Reports, Articles, Papers, and Presentations

    Science.gov (United States)

    Waits, J. E. Turner (Compiler)

    2001-01-01

    This document presents formal NASA technical reports, papers published in technical journals, and presentations by MSFC personnel in FY 2000. It also includes papers of MSFC contractors. After being announced in STAR, all the NASA series reports may be obtained from the National Technical Information Service, 5285 Port Royal Road, Springfield, VA 22161. The information in this report may be of value to the scientific and engineering community in determining what information has been published and what is available.

  15. Technical review of the dispersion and dose models used in the MILDOS computer program

    International Nuclear Information System (INIS)

    Horst, T.W.; Soldat, J.K.; Bander, T.J.

    1982-05-01

    The MILDOS computer code is used to estimate impacts of radioactive emissions from uranium milling facilities. This report reviews the technical basis of the models used in the MILDOS computer code. The models were compared with state-of-the-art predictions, taking into account the intended uses of the MILDOS code. Several suggested modifications are presented, and the technical basis for those changes is given.
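    For context, atmospheric dispersion codes of this family are typically built on a sector-averaged Gaussian plume. The plain ground-reflected Gaussian plume below is a generic sketch of that class of model, not the MILDOS formulation itself, and all parameter values are illustrative:

```python
import math

def gaussian_plume(q, u, sigma_y, sigma_z, y, z, h):
    """Ground-reflected Gaussian plume concentration.
    q: emission rate, u: wind speed, sigma_y/sigma_z: dispersion
    parameters at the receptor distance, y: crosswind offset,
    z: receptor height, h: effective release height."""
    lateral = math.exp(-y * y / (2.0 * sigma_y ** 2))
    vertical = (math.exp(-((z - h) ** 2) / (2.0 * sigma_z ** 2)) +
                math.exp(-((z + h) ** 2) / (2.0 * sigma_z ** 2)))
    return q / (2.0 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# Ground-level centreline concentration for a ground release.
c0 = gaussian_plume(q=1.0, u=1.0, sigma_y=10.0, sigma_z=10.0, y=0.0, z=0.0, h=0.0)
```

    Dose estimates are then obtained by multiplying such concentrations by nuclide-specific dose conversion factors, which is the part of the model chain the review examines.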

  16. Bayesian near-boundary analysis in basic macroeconomic time series models

    NARCIS (Netherlands)

    M.D. de Pooter (Michiel); F. Ravazzolo (Francesco); R. Segers (René); H.K. van Dijk (Herman)

    2008-01-01

    textabstractSeveral lessons learnt from a Bayesian analysis of basic macroeconomic time series models are presented for the situation where some model parameters have substantial posterior probability near the boundary of the parameter region. This feature refers to near-instability within dynamic

  17. STRUCTURAL AND FUNCTIONAL MODEL OF FORMING INFORMATIONAL COMPETENCE OF TECHNICAL UNIVERSITY STUDENTS

    Directory of Open Access Journals (Sweden)

    Taras Ostapchuk

    2016-11-01

    Full Text Available The article elaborates and analyses the structural and functional model of the formation of information competence in technical university students. The system and the mutual relationships between its elements are revealed. It is found that the target, process and result-evaluative blocks of the proposed model ensure its functioning and the opportunity to optimize the learning process of information training for technical university students. It is established that the formation of technical university students' information competence is based on components such as the motivational-value, operational-activity, cognitive and reflexive ones. The corresponding criteria (motivational, operational-activity, cognitive, reflective), indexes and levels (reproductive, technologized, constructive) of forming technical university students' information competence are disclosed. The expediency of complex organizational and pedagogical conditions at the stages of forming information competence is justified. The complex of organizational and pedagogical conditions includes: orienting the organization and implementation of class work towards technical university students' positive value attitude; addressing the issue of forming professionalism; informatization of the educational and socio-cultural environment of higher technical educational institutions; orienting technical university students' training to the demands of European and international standards on information competence as a factor in competitiveness at the labor market; and introducing a special course into the curriculum that will ensure competence formation through the use of information technology in professional activities. Forms (lecture, visualization, problem lecture, combined lecture, scientific online conference, recitals, excursions, etc., tools (computer lab, multimedia projector, interactive whiteboard, multimedia technology (audio, video, the Internet technologies; social networks, etc

  18. Forecasting with nonlinear time series model: a Monte-Carlo

    African Journals Online (AJOL)

    PUBLICATIONS1

    Carlo method of forecasting using a special nonlinear time series model, called logistic smooth transition ... We illustrate this new method using some simulation ..... in MATLAB 7.5.0. ... process (DGP) using the logistic smooth transi-.

  19. The application of time series models to cloud field morphology analysis

    Science.gov (United States)

    Chin, Roland T.; Jau, Jack Y. C.; Weinman, James A.

    1987-01-01

    A modeling method for the quantitative description of remotely sensed cloud field images is presented. A two-dimensional texture modeling scheme based on one-dimensional time series procedures is adopted for this purpose. The time series procedure used is the seasonal autoregressive moving average (ARMA) process of Box and Jenkins. Cloud field properties such as directionality, clustering and cloud coverage can be retrieved by this method. It has been demonstrated that a cloud field image can be quantitatively defined by a small set of parameters and that synthesized surrogates can be reconstructed from these model parameters. This method enables cloud climatology to be studied quantitatively.

  20. Modeling Philippine Stock Exchange Composite Index Using Time Series Analysis

    Science.gov (United States)

    Gayo, W. S.; Urrutia, J. D.; Temple, J. M. F.; Sandoval, J. R. D.; Sanglay, J. E. A.

    2015-06-01

    This study was conducted to develop a time series model of the Philippine Stock Exchange Composite Index and its volatility using the finite mixture of ARIMA models with conditional variance equations such as the ARCH, GARCH, EGARCH, TARCH and PARCH models. Also, the study aimed to find out the reason behind the behavior of PSEi, that is, which of the economic variables - Consumer Price Index, crude oil price, foreign exchange rate, gold price, interest rate, money supply, price-earnings ratio, Producers' Price Index and terms of trade - can be used in projecting future values of PSEi; this was examined using the Granger Causality Test. The findings showed that the best time series model for the Philippine Stock Exchange Composite Index is ARIMA(1,1,5) - ARCH(1). Also, Consumer Price Index, crude oil price and foreign exchange rate are the factors found to Granger-cause the Philippine Stock Exchange Composite Index.
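    The ARCH(1) variance equation attached to the ARIMA mean model is a one-line recursion: sigma^2_t = a0 + a1 * eps^2_{t-1}. A simulation sketch (the parameter values are illustrative, not the fitted PSEi coefficients):

```python
import math
import random

def simulate_arch1(a0, a1, n, seed=7):
    """ARCH(1) error series: sigma2_t = a0 + a1 * eps_{t-1}^2,
    eps_t = sigma_t * z_t with standard normal z_t."""
    rng = random.Random(seed)
    eps_prev, out = 0.0, []
    for _ in range(n):
        sigma2 = a0 + a1 * eps_prev * eps_prev
        eps_prev = math.sqrt(sigma2) * rng.gauss(0.0, 1.0)
        out.append(eps_prev)
    return out

returns = simulate_arch1(a0=0.5, a1=0.3, n=20000)
# Unconditional variance of an ARCH(1) process is a0 / (1 - a1) when a1 < 1.
sample_var = sum(r * r for r in returns) / len(returns)
```

    The recursion produces the volatility clustering that motivates using ARCH-type variance equations on top of the ARIMA mean for index returns.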

  1. Groundwater technical procedures of the U.S. Geological Survey

    Science.gov (United States)

    Cunningham, William L.; Schalk, Charles W.

    2011-01-01

    A series of groundwater technical procedures documents (GWPDs) has been released by the U.S. Geological Survey, Water-Resources Discipline, for general use by the public. These technical procedures were written in response to the need for standardized technical procedures of many aspects of groundwater science, including site and measuring-point establishment, measurement of water levels, and measurement of well discharge. The techniques are described in the GWPDs in concise language and are accompanied by necessary figures and tables derived from cited manuals, reports, and other documents. Because a goal of this series of procedures is to remain current with the state of the science, and because procedures change over time, this report is released in an online format only. As new procedures are developed and released, they will be linked to this document.

  2. Modeling Financial Time Series Based on a Market Microstructure Model with Leverage Effect

    OpenAIRE

    Yanhui Xi; Hui Peng; Yemei Qin

    2016-01-01

    The basic market microstructure model specifies that the price/return innovation and the volatility innovation are independent Gaussian white noise processes. However, the financial leverage effect has been found to be statistically significant in many financial time series. In this paper, a novel market microstructure model with leverage effects is proposed. The model specification assumed a negative correlation in the errors between the price/return innovation and the volatility innovation....

  3. Analysis and evaluation of the ASTEC model basis. Relevant experiments. Technical report

    International Nuclear Information System (INIS)

    Koppers, V.; Koch, M.K.

    2015-12-01

    The present report is a Technical Report within the research project 'ASMO', funded by the German Federal Ministry of Economics and Technology (BMWi 1501433) and carried out at the Reactor Simulation and Safety Group, Chair of Energy Systems and Energy Economics (LEE) at the Ruhr-Universitaet Bochum (RUB). The project deals with the analysis of the model basis of the Accident Source Term Evaluation Code (ASTEC). This report focuses on the containment part of ASTEC (CPA) and presents the simulation results of the experiment TH20.7. The experimental series TH20 was performed in the test vessel THAI (Thermal-hydraulics, Aerosols, Iodine) to investigate the erosion of a helium layer by a blower-generated air jet. Helium is used as a substitute for hydrogen. In the experiment TH20.7 a light-gas layer is established and eroded by a momentum-driven jet. The simulation of momentum-driven jets is challenging for CPA because there is no model to simulate the kinetic momentum transfer. The subject of this report is the analysis of the capability of the code, with the current model basis, to model momentum-driven phenomena. The jet is modelled using virtual ventilation systems, so-called FAN-Systems, which are adapted to the erosion velocity. The simulation results are compared to the experimental results and to a basic calculation using FAN-Systems without any adjustments. For further improvement, different variation calculations are performed: first, the vertical nodalization is refined; subsequently, the resistance coefficients are adjusted to support the jet flow pattern and the number of FAN-Systems is reduced. The analysis shows that the simulation of a momentum-driven light-gas layer erosion is possible using adjusted FAN-Systems. A carefully selected vertical nodalization and adaptation of the resistance coefficients improve the simulation results.

  4. The Exponential Model for the Spectrum of a Time Series: Extensions and Applications

    DEFF Research Database (Denmark)

    Proietti, Tommaso; Luati, Alessandra

    The exponential model for the spectrum of a time series and its fractional extensions are based on the Fourier series expansion of the logarithm of the spectral density. The coefficients of the expansion form the cepstrum of the time series. After deriving the cepstrum of important classes of time...

  5. Solar Technical Assistance Team 2013 Webinars | State, Local, and Tribal

    Science.gov (United States)

    Governments | NREL. The Solar Technical Assistance Team (STAT) 2013 webinar series provides an overview of solar technologies and resources. The following sessions are available: Solar Finance for Residential and Commercial Customers and Potential Roles

  6. The Gaussian Graphical Model in Cross-Sectional and Time-Series Data.

    Science.gov (United States)

    Epskamp, Sacha; Waldorp, Lourens J; Mõttus, René; Borsboom, Denny

    2018-04-16

    We discuss the Gaussian graphical model (GGM; an undirected network of partial correlation coefficients) and detail its utility as an exploratory data analysis tool. The GGM shows which variables predict one another, allows for sparse modeling of covariance structures, and may highlight potential causal relationships between observed variables. We describe its utility in three kinds of psychological data sets: data sets in which consecutive cases are assumed independent (e.g., cross-sectional data), temporally ordered data sets (e.g., n = 1 time series), and a mixture of the two (e.g., n > 1 time series). In time-series analysis, the GGM can be used to model the residual structure of a vector-autoregression analysis (VAR), also termed graphical VAR. Two network models can then be obtained: a temporal network and a contemporaneous network. When analyzing data from multiple subjects, a GGM can also be formed on the covariance structure of stationary means: the between-subjects network. We discuss the interpretation of these models and propose estimation methods to obtain these networks, which we implement in the R packages graphicalVAR and mlVAR. The methods are showcased in two empirical examples, and simulation studies on these methods are included in the supplementary materials.
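The GGM's edge weights are partial correlations, obtainable by inverting and standardizing the sample covariance matrix. A minimal unregularized sketch; the R packages named in the abstract add regularization and model selection on top of this idea:

```python
import numpy as np

def partial_correlations(data):
    """Gaussian graphical model edge weights from an (n_obs, n_vars)
    data matrix: invert the sample covariance to get the precision
    matrix, then standardize it into partial correlations."""
    K = np.linalg.inv(np.cov(data, rowvar=False))  # precision matrix
    d = np.sqrt(np.diag(K))
    P = -K / np.outer(d, d)                        # standardize off-diagonals
    np.fill_diagonal(P, 0.0)                       # no self-edges
    return P
```

In a chain x -> y -> z, the partial correlation between x and z is near zero even though their marginal correlation is not, which is how the GGM separates direct from indirect association.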

  7. Tracing the History of Technical Communication from 1850-2000: Plus a Series of Survey Studies.

    Science.gov (United States)

    McDowell, Earl E.

    This research focuses on the history of technical communication since 1850, with a specific focus on the technological changes that occurred between 1900 and 1950. This paper also discusses the development of professional technical communication organizations and the development of technical communication programs at the bachelor, masters, and…

  8. Modelling Socio-Technical Aspects of Organisational Security

    DEFF Research Database (Denmark)

    Ivanova, Marieta Georgieva

    Identification of threats to organisations and risk assessment often take into consideration the pure technical aspects, overlooking the vulnerabilities originating from attacks on a social level, for example social engineering, and abstracting away the physical infrastructure. However, attacks...... would close this gap, however, it would also result in complicating the formal treatment and automatic identification of attacks. This dissertation shows that applying a system modelling approach to sociotechnical systems can be used for identifying attacks on organisations, which exploit various levels...... process calculus, we develop a formal analytical approach that generates attack trees from the model. The overall goal of the framework is to predict, prioritise and minimise the vulnerabilities in organisations by prohibiting the overall attack or at least increasing the difficulty and cost of fulfilling...

  9. Modeling sports highlights using a time-series clustering framework and model interpretation

    Science.gov (United States)

    Radhakrishnan, Regunathan; Otsuka, Isao; Xiong, Ziyou; Divakaran, Ajay

    2005-01-01

    In our past work on sports highlights extraction, we have shown the utility of detecting audience reaction using an audio classification framework. The audio classes in the framework were chosen based on intuition. In this paper, we present a systematic way of identifying the key audio classes for sports highlights extraction using a time series clustering framework. We treat the low-level audio features as a time series and model the highlight segments as "unusual" events against a background "usual" process. The set of audio classes to characterize the sports domain is then identified by analyzing the consistent patterns in each of the clusters output from the time series clustering framework. The distribution of features from the training data so obtained for each of the key audio classes is parameterized by a Minimum Description Length Gaussian Mixture Model (MDL-GMM). We also interpret the meaning of each of the mixture components of the MDL-GMM for the key audio class (the "highlight" class) that is correlated with highlight moments. Our results show that the "highlight" class is a mixture of audience cheering and the commentator's excited speech. Furthermore, we show that the precision-recall performance for highlights extraction based on this "highlight" class is better than that of our previous approach, which uses only audience cheering as the key highlight class.

  10. Improved time series prediction with a new method for selection of model parameters

    International Nuclear Information System (INIS)

    Jade, A M; Jayaraman, V K; Kulkarni, B D

    2006-01-01

    A new method for model selection in prediction of time series is proposed. Apart from the conventional criterion of minimizing RMS error, the method also minimizes the error on the distribution of singularities, evaluated through the local Hoelder estimates and its probability density spectrum. Predictions of two simulated and one real time series have been done using kernel principal component regression (KPCR) and model parameters of KPCR have been selected employing the proposed as well as the conventional method. Results obtained demonstrate that the proposed method takes into account the sharp changes in a time series and improves the generalization capability of the KPCR model for better prediction of the unseen test data. (letter to the editor)

  11. Pin failure modeling of the A series CABRI tests

    International Nuclear Information System (INIS)

    Young, M.F.; Portugal, J.L.

    1978-01-01

    The EXPAND pin failure model, a research tool designed to model pin failure under prompt burst conditions, has been used to predict failure conditions for several of the A series CABRI tests as part of the United States participation in the CABRI Joint Project. The Project is an international program involving France, Germany, England, Japan, and the United States, with the goal of obtaining experimental data relating to the safety of LMFBRs. The A series, designed to simulate high-ramp-rate TOP conditions, initially utilizes single, fresh UO2 pins of the PHENIX type in a flowing sodium loop. The pins are preheated at constant power in the CABRI reactor to establish steady-state conditions (480 W/cm at the axial peak) and then subjected to a power pulse of 14 ms to 24 ms duration

  12. Time series modeling by a regression approach based on a latent process.

    Science.gov (United States)

    Chamroukhi, Faicel; Samé, Allou; Govaert, Gérard; Aknin, Patrice

    2009-01-01

    Time series are used in many domains including finance, engineering, economics and bioinformatics generally to represent the change of a measurement over time. Modeling techniques may then be used to give a synthetic representation of such data. A new approach for time series modeling is proposed in this paper. It consists of a regression model incorporating a discrete hidden logistic process allowing for activating smoothly or abruptly different polynomial regression models. The model parameters are estimated by the maximum likelihood method performed by a dedicated Expectation Maximization (EM) algorithm. The M step of the EM algorithm uses a multi-class Iterative Reweighted Least-Squares (IRLS) algorithm to estimate the hidden process parameters. To evaluate the proposed approach, an experimental study on simulated data and real world data was performed using two alternative approaches: a heteroskedastic piecewise regression model using a global optimization algorithm based on dynamic programming, and a Hidden Markov Regression Model whose parameters are estimated by the Baum-Welch algorithm. Finally, in the context of the remote monitoring of components of the French railway infrastructure, and more particularly the switch mechanism, the proposed approach has been applied to modeling and classifying time series representing the condition measurements acquired during switch operations.

  13. Modelling the International Climate Change Negotiations: A Non-Technical Outline of Model Architecture

    Energy Technology Data Exchange (ETDEWEB)

    Underdal, Arild

    1997-12-31

    This report discusses in non-technical terms the overall architecture of a model that will be designed to enable the user to (1) explore systematically the political feasibility of alternative policy options and (2) to determine the set of politically feasible solutions in the global climate change negotiations. 25 refs., 2 figs., 1 tab.

  14. Simulated spinal cerebrospinal fluid leak repair: an educational model with didactic and technical components.

    Science.gov (United States)

    Ghobrial, George M; Anderson, Paul A; Chitale, Rohan; Campbell, Peter G; Lobel, Darlene A; Harrop, James

    2013-10-01

    In the era of surgical resident work hour restrictions, the traditional apprenticeship model may provide fewer hours for neurosurgical residents to hone technical skills. Spinal dura mater closure or repair is 1 skill that is infrequently encountered, and persistent cerebrospinal fluid leaks are a potential morbidity. To establish an educational curriculum to train residents in spinal dura mater closure with a novel durotomy repair model. The Congress of Neurological Surgeons has developed a simulation-based model for durotomy closure with the ongoing efforts of their simulation educational committee. The core curriculum consists of didactic training materials and a technical simulation model of dural repair for the lumbar spine. Didactic pretest scores ranged from 4/11 (36%) to 10/11 (91%). Posttest scores ranged from 8/11 (73%) to 11/11 (100%). Overall, didactic improvements were demonstrated by all participants, with a mean improvement between pre- and posttest scores of 1.17 (18.5%; P = .02). The technical component consisted of 11 durotomy closures by 6 participants, where 4 participants performed multiple durotomies. Mean time to closure of the durotomy ranged from 490 to 546 seconds in the first and second closures, respectively (P = .66), whereby the median leak rate improved from 14 to 7 (P = .34). There were also demonstrable technical improvements by all. Simulated spinal dura mater repair appears to be a potentially valuable tool in the education of neurosurgery residents. The combination of a didactic and technical assessment appears to be synergistic in terms of educational development.

  15. A COMPARATIVE STUDY OF FORECASTING MODELS FOR TREND AND SEASONAL TIME SERIES: DOES COMPLEX MODEL ALWAYS YIELD BETTER FORECAST THAN SIMPLE MODELS?

    Directory of Open Access Journals (Sweden)

    Suhartono Suhartono

    2005-01-01

    Full Text Available Many business and economic time series are non-stationary time series that contain trend and seasonal variations. Seasonality is a periodic and recurrent pattern caused by factors such as weather, holidays, or repeating promotions. A stochastic trend is often accompanied by seasonal variations and can have a significant impact on various forecasting methods. In this paper, we investigate and compare some forecasting methods for modeling time series with both trend and seasonal patterns. These methods are the Winter's, Decomposition, Time Series Regression, ARIMA and Neural Networks models. In this empirical research, we study the effectiveness of the forecasting performance, particularly to answer whether a complex method always gives a better forecast than a simpler method. We use real data, namely airline passenger data. The result shows that the more complex model does not always yield a better result than a simpler one. Additionally, we identify possibilities for further research, especially the use of hybrid models that combine forecasting methods to obtain better forecasts, for example a combination of decomposition (as data preprocessing) and a neural network model.

  16. FY 2004 Scientific and Technical Reports, Articles, Papers, and Presentations

    Science.gov (United States)

    Fowler, B. A. (Compiler)

    2006-01-01

    This Technical Memorandum (TM) presents formal NASA technical reports, papers published in technical journals, and presentations by Marshall Space Flight Center (MSFC) personnel in FY 2004. It also includes papers of MSFC contractors. After being announced in STAR, all NASA series reports may be obtained from the National Technical Information Service, 5285 Port Royal Road, Springfield, VA 22161. The information in this TM may be of value to the scientific and engineering community in determining what information has been published and what is available.

  17. FY 2001 Scientific and Technical Reports, Articles, Papers, and Presentations

    Science.gov (United States)

    Waits, J. E. Turner (Compiler)

    2002-01-01

    This Technical Memorandum (TM) presents formal NASA technical reports, papers published in technical journals, and presentations by MSFC personnel in FY 2001. It also includes papers of MSFC contractors. After being announced in STAR, all NASA series reports may be obtained from the National Technical Information Service, 5285 Port Royal Road, Springfield, VA 22161. The information in this TM may be of value to the scientific and engineering community in determining what information has been published and what is available.

  18. Disease management with ARIMA model in time series.

    Science.gov (United States)

    Sato, Renato Cesar

    2013-01-01

    The evaluation of infectious and noninfectious disease management can be done through the use of a time series analysis. In this study, we expect to measure the results and prevent intervention effects on the disease. Clinical studies have benefited from the use of these techniques, particularly for the wide applicability of the ARIMA model. This study briefly presents the process of using the ARIMA model. This analytical tool offers a great contribution for researchers and healthcare managers in the evaluation of healthcare interventions in specific populations.
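The simplest member of the ARIMA family used in such evaluations can be written out directly: fit an AR(1) to the first-differenced series, then integrate the forecasts back. A minimal sketch; a real intervention analysis would use a full ARIMA toolbox with order selection and diagnostics:

```python
def arima_110_forecast(y, steps=3):
    """Minimal ARIMA(1,1,0): least-squares AR(1) on the differenced
    series, forecasts re-integrated to the original scale.
    Returns (phi, forecasts)."""
    d = [b - a for a, b in zip(y, y[1:])]           # first difference
    num = sum(d[t - 1] * d[t] for t in range(1, len(d)))
    den = sum(v * v for v in d[:-1])
    phi = num / den                                  # AR(1) coefficient
    level, last_d, out = y[-1], d[-1], []
    for _ in range(steps):
        last_d = phi * last_d                        # AR(1) recursion
        level += last_d                              # undo differencing
        out.append(level)
    return phi, out
```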

  19. DETERMINING THE NEED FOR ZERO SERIES EXECUTION IN MANUFACTURING PROCESSES IN THE TEXTILE GARMENT INDUSTRY

    Directory of Open Access Journals (Sweden)

    OANA Ioan Pave

    2017-05-01

    Full Text Available Because industrial production requires the application of transformation procedures to material resources so that a clothing product achieves optimal use value with maximum economic efficiency, one of the main influential factors is product quality. To make manufacturing processes more efficient, it is necessary to carry out the zero series in order to ensure the quality of the technological processes and to prevent design deficiencies. Among the main operations undertaken to ensure the quality of the zero series are: creating the conditions for launch, tracking and finalizing the accompanying production documents under conditions similar to series production; having the zero series produced, as a rule, by the same workers who make up the series production line; equipping the line with the appropriate equipment and necessary devices in order to create the technical conditions for the execution of the zero series; and providing technical assistance with manufacturing and control documentation to eliminate design deficiencies. This paper presents the architecture of zero series execution in manufacturing processes in the textile garment industry. The information obtained from the zero-series analysis is directed to technical support, for possible corrections of the patterns according to which the products were manufactured.

  20. Modeling the full-bridge series-resonant power converter

    Science.gov (United States)

    King, R. J.; Stuart, T. A.

    1982-01-01

    A steady state model is derived for the full-bridge series-resonant power converter. Normalized parametric curves for various currents and voltages are then plotted versus the triggering angle of the switching devices. The calculations are compared with experimental measurements made on a 50 kHz converter and a discussion of certain operating problems is presented.
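The steady-state behaviour of a series-resonant tank is summarized by a few closed-form quantities that any such converter model builds on. A small helper; the component values in the usage note are hypothetical, not those of the 50 kHz converter measured in the paper:

```python
import math

def series_resonant_characteristics(L, C, R_load):
    """Resonant frequency, characteristic impedance and loaded Q
    of a series L-C tank driving a load resistance."""
    f0 = 1.0 / (2.0 * math.pi * math.sqrt(L * C))  # resonant frequency [Hz]
    Z0 = math.sqrt(L / C)                          # characteristic impedance [ohm]
    Q = Z0 / R_load                                # loaded quality factor
    return f0, Z0, Q
```

For example, L = 100 uH and C = 100 nF give f0 of roughly 50.3 kHz and Z0 of roughly 31.6 ohm, i.e. values in the range of the converter discussed.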

  1. Development of Simulink-Based SiC MOSFET Modeling Platform for Series Connected Devices

    DEFF Research Database (Denmark)

    Tsolaridis, Georgios; Ilves, Kalle; Reigosa, Paula Diaz

    2016-01-01

    A new MATLAB/Simulink-based modeling platform has been developed for SiC MOSFET power modules. The modeling platform describes the electrical behavior of a single 1.2 kV/350 A SiC MOSFET power module, as well as the series connection of two of them. A fast parameter initialization is followed by an optimization process to facilitate the extraction of the model's parameters in a more automated way relying on a small number of experimental waveforms. Through extensive experimental work, it is shown that the model accurately predicts both static and dynamic performances. The series connection of two SiC power modules has been investigated through the validation of the static and dynamic conditions. Thanks to the developed model, a better understanding of the challenges introduced by uneven voltage sharing among series-connected devices is possible....

  2. Technical Training: AXEL-2005 - Introduction to Particle Accelerators

    CERN Multimedia

    Monique Duval

    2005-01-01

    CERN Technical Training 2005: Learning for the LHC! AXEL-2005 is a course series on particle accelerators, given at CERN within the framework of the 2005 Technical Training Programme. Known in the past as the PS Complex Operation Course (or the 'PS Shutdown Course', now AB/OP), the general accelerator physics section has been organised since 2003 as a joint venture between the AB department and Technical Training, and is open to a wider CERN community. The AXEL-2005 course series is designed for technicians who are operating an accelerator, or whose work is closely linked to accelerators, but it is open to everyone (technicians, engineers, physicists) interested in this field. The course does not require any prior knowledge of accelerators. However, some basic knowledge of trigonometry, matrices and differential equations, and some basic notions of magnetism, would be an advantage. The course will be given in French, with course material in English; questions and answers are possible in both languages. AXEL-2005 - I...

  3. Modelos de gestión de conflictos en serie de ficción televisiva (Conflict management models in television fiction series

    Directory of Open Access Journals (Sweden)

    Yolanda Navarro-Abal

    2012-12-01

    Full Text Available Television fiction series sometimes generate an unreal vision of life, especially among young people, becoming a mirror in which they can see themselves reflected. The series become models of values, attitudes, skills and behaviours that tend to be imitated by some viewers. The aim of this study was to analyze the conflict management behavioural styles presented by the main characters of television fiction series. Thus, we evaluated the association between these styles and the age and sex of the main characters, as well as the nationality and genre of the fiction series. 16 fiction series were assessed by selecting two characters of both sexes from each series. We adapted the Rahim Organizational Conflict Inventory-II for observing and recording the data. The results show that there is no direct association between the conflict management behavioural styles presented in the drama series and the sex of the main characters. However, associations were found between these styles and the age of the characters and the genre of the fiction series.

  4. From Taylor series to Taylor models

    International Nuclear Information System (INIS)

    Berz, Martin

    1997-01-01

    An overview of the background of Taylor series methods and the utilization of the differential algebraic structure is given, and various associated techniques are reviewed. The conventional Taylor methods are extended to allow for a rigorous treatment of bounds for the remainder of the expansion in a similarly universal way. Utilizing differential algebraic and functional analytic arguments on the set of Taylor models, arbitrary order integrators with rigorous remainder treatment are developed. The integrators can meet pre-specified accuracy requirements in a mathematically strict way, and are a stepping stone towards fully rigorous estimates of stability of repetitive systems
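The central object here, a Taylor model, pairs a truncated Taylor polynomial with a rigorous interval bound on the remainder. A toy illustration for exp(x) using the Lagrange remainder; the machinery described in the record generalizes this to arbitrary-order arithmetic on whole models:

```python
import math

def exp_taylor_model(order, domain=(-1.0, 1.0)):
    """Taylor polynomial of exp around 0 plus an interval that
    rigorously encloses the truncation error over `domain`."""
    coeffs = [1.0 / math.factorial(k) for k in range(order + 1)]
    a, b = domain
    r = max(abs(a), abs(b))
    # Lagrange bound: |R_n(x)| <= e^r * r^(n+1) / (n+1)!  on [a, b]
    bound = math.exp(r) * r ** (order + 1) / math.factorial(order + 1)
    return coeffs, (-bound, bound)

def eval_poly(coeffs, x):
    """Evaluate the polynomial part of the Taylor model at x."""
    return sum(c * x ** k for k, c in enumerate(coeffs))
```

The guarantee is the point: for every x in the domain, the true exp(x) lies inside polynomial value plus remainder interval, which is what makes such enclosures usable in verified integrators.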

  5. Time-series models on somatic cell score improve detection of mastitis

    DEFF Research Database (Denmark)

    Norberg, E; Korsgaard, I R; Sloth, K H M N

    2008-01-01

    In-line detection of mastitis using frequent milk sampling was studied in 241 cows in a Danish research herd. Somatic cell scores obtained on a daily basis were analyzed using a mixture of four time-series models. Probabilities were assigned to each model for the observations to belong to a normal...... "steady-state" development, change in "level", change of "slope" or "outlier". Mastitis was indicated from the sum of probabilities for the "level" and "slope" models. Time-series models were based on the Kalman filter. Reference data were obtained from veterinary assessment of health status combined...... with bacteriological findings. At a sensitivity of 90% the corresponding specificity was 68%, which increased to 83% using a one-step back smoothing. It is concluded that mixture models based on Kalman filters are efficient in handling in-line sensor data for detection of mastitis and may be useful for similar...
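The "steady-state" component of such a mixture is essentially a local-level Kalman filter whose standardized one-step forecast errors flag "level" and "outlier" type deviations. A minimal univariate sketch; the noise parameters are illustrative choices, not the paper's estimates:

```python
def local_level_kalman(y, q=0.05, r=1.0):
    """Local-level Kalman filter: state = smoothed level of the
    monitored score.  Returns filtered means and standardized
    one-step forecast errors (large |z| suggests a deviation)."""
    m, p = y[0], 1.0
    means, z = [], []
    for obs in y:
        p_pred = p + q                 # predict: state variance grows
        f = p_pred + r                 # one-step forecast error variance
        e = obs - m                    # innovation
        z.append(e / f ** 0.5)         # standardized innovation
        k = p_pred / f                 # Kalman gain
        m = m + k * e                  # update state mean
        p = (1 - k) * p_pred           # update state variance
        means.append(m)
    return means, z
```

On a flat series with a sudden jump, the standardized innovation spikes at the jump and the filtered mean then tracks toward the new level, which is the behaviour the mixture model assigns to its "level" component.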

  6. Time Series Analysis, Modeling and Applications A Computational Intelligence Perspective

    CERN Document Server

    Chen, Shyi-Ming

    2013-01-01

    Temporal and spatiotemporal data form an inherent fabric of the society as we are faced with streams of data coming from numerous sensors, data feeds, recordings associated with numerous areas of application embracing physical and human-generated phenomena (environmental data, financial markets, Internet activities, etc.). A quest for a thorough analysis, interpretation, modeling and prediction of time series comes with an ongoing challenge for developing models that are both accurate and user-friendly (interpretable). The volume is aimed to exploit the conceptual and algorithmic framework of Computational Intelligence (CI) to form a cohesive and comprehensive environment for building models of time series. The contributions covered in the volume are fully reflective of the wealth of the CI technologies by bringing together ideas, algorithms, and numeric studies, which convincingly demonstrate their relevance, maturity and visible usefulness. It reflects upon the truly remarkable diversity of methodological a...

  7. THE MODEL OF LIFELONG EDUCATION IN A TECHNICAL UNIVERSITY AS A MULTILEVEL EDUCATIONAL COMPLEX

    Directory of Open Access Journals (Sweden)

    Svetlana V. Sergeyeva

    2016-06-01

    Full Text Available Introduction: the current leading trend of educational development is its continuity. Institutions of higher education, as multi-level educational complexes, nurture favourable conditions for realisation of the strategy of lifelong education. Today a technical university offering training of future engineers faces the topical issue of creating a multilevel educational complex. Materials and Methods: this paper draws on modern Russian and foreign scientific literature on lifelong education. The authors used theoretical methods of scientific research: system-structural analysis, synthesis, modeling, and analysis and generalisation of concepts. Results: the paper presents a model of lifelong education developed by the authors for a technical university as a multilevel educational complex. It is realised through a set of principles: multi-level structure and continuity, integration, conformity and quality, mobility, anticipation, openness, social partnership and feedback. In accordance with the purpose, objectives and principles, the content part of the model is formed. The syllabi following the described model are run in accordance with the training levels undertaken by a technical university as a multilevel educational complex. All syllabi are implemented gradually; in this regard, the authors highlight three phases: diagnostic, constructive and transformative, and assessing. Discussion and Conclusions: the expected result of the created model of lifelong education in a technical university as a multilevel educational complex is a graduate trained for effective professional activity, competitive, prepared and sought-after in the regional labour market.

  8. Evaluating an Automated Number Series Item Generator Using Linear Logistic Test Models

    Directory of Open Access Journals (Sweden)

    Bao Sheng Loe

    2018-04-01

    Full Text Available This study investigates the item properties of a newly developed Automatic Number Series Item Generator (ANSIG). The foundation of the ANSIG is based on five hypothesised cognitive operators. Thirteen item models were developed using the numGen R package and eleven were evaluated in this study. The 16-item ICAR (International Cognitive Ability Resource) short form ability test was used to evaluate construct validity. The Rasch model and two Linear Logistic Test Models (LLTM) were employed to estimate and predict the item parameters. Results indicate that a single factor determines the performance on tests composed of items generated by the ANSIG. Under the LLTM approach, all the cognitive operators were significant predictors of item difficulty. Moderate to high correlations were evident between the number series items and the ICAR test scores, with high correlation found for the ICAR Letter-Numeric-Series type items, suggesting adequate nomothetic span. Extended cognitive research is, nevertheless, essential for the automatic generation of an item pool with predictable psychometric properties.
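The two models relate compactly: the Rasch model gives the response probability from person ability and item difficulty, and the LLTM decomposes that difficulty into cognitive-operator contributions. A minimal sketch with hypothetical operator loadings and weights (the study estimates these from data):

```python
import math

def lltm_difficulty(q_row, etas):
    """LLTM: item difficulty as the weighted sum of the item's
    cognitive-operator loadings (q_row) and operator weights (etas)."""
    return sum(q * e for q, e in zip(q_row, etas))

def rasch_p(theta, b):
    """Rasch model: probability that a person of ability theta
    answers an item of difficulty b correctly."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))
```

So an item invoking harder operators gets a larger b, and every examinee's success probability on it drops accordingly; that link is what lets the generator predict the psychometric properties of new items.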

  9. Time Series Modeling of Nano-Gold Immunochromatographic Assay via Expectation Maximization Algorithm.

    Science.gov (United States)

    Zeng, Nianyin; Wang, Zidong; Li, Yurong; Du, Min; Cao, Jie; Liu, Xiaohui

    2013-12-01

    In this paper, the expectation maximization (EM) algorithm is applied to the modeling of the nano-gold immunochromatographic assay (nano-GICA) via available time series of the measured signal intensities of the test and control lines. The model for the nano-GICA is developed as the stochastic dynamic model that consists of a first-order autoregressive stochastic dynamic process and a noisy measurement. By using the EM algorithm, the model parameters, the actual signal intensities of the test and control lines, as well as the noise intensity can be identified simultaneously. Three different time series data sets concerning the target concentrations are employed to demonstrate the effectiveness of the introduced algorithm. Several indices are also proposed to evaluate the inferred models. It is shown that the model fits the data very well.

  10. 75 FR 47199 - Airworthiness Directives; McDonnell Douglas Corporation Model DC-9-10 Series Airplanes, DC-9-30...

    Science.gov (United States)

    2010-08-05

    ... Airworthiness Directives; McDonnell Douglas Corporation Model DC- 9-10 Series Airplanes, DC-9-30 Series... existing airworthiness directive (AD), which applies to all McDonnell Douglas Model DC-9-10 series..., 2010). That AD applies to all McDonnell Douglas Corporation Model DC-9-10 series airplanes, DC-9-30...

  11. Some Issues of Biological Shape Modelling with Applications

    DEFF Research Database (Denmark)

    Larsen, Rasmus; Hilger, Klaus Baggesen; Skoglund, Karl

    2003-01-01

    This paper illustrates current research at Informatics and Mathematical Modelling at the Technical University of Denmark within biological shape modelling. We illustrate a series of generalizations to, modifications to, and applications of the elements of constructing models of shape or appearance...

  12. On determining the prediction limits of mathematical models for time series

    International Nuclear Information System (INIS)

    Peluso, E.; Gelfusa, M.; Lungaroni, M.; Talebzadeh, S.; Gaudio, P.; Murari, A.; Contributors, JET

    2016-01-01

    Prediction is one of the main objectives of scientific analysis and it refers to both modelling and forecasting. The determination of the limits of predictability is an important issue of both theoretical and practical relevance. In the case of modelling time series, once a certain level of performance has been reached in either modelling or prediction, it is often important to assess whether all the information available in the data has been exploited or whether there are still margins for improvement of the tools being developed. In this paper, an information theoretic approach is proposed to address this issue and quantify the quality of the models and/or predictions. The excellent properties of the proposed indicator have been proved with the help of a systematic series of numerical tests and a concrete example of extreme relevance for nuclear fusion.

  13. Investigating Some Technical Issues on Cohesive Zone Modeling of Fracture

    Science.gov (United States)

    Wang, John T.

    2011-01-01

    This study investigates some technical issues related to the use of cohesive zone models (CZMs) in modeling fracture processes. These issues include: why cohesive laws of different shapes can produce similar fracture predictions; under what conditions CZM predictions have a high degree of agreement with linear elastic fracture mechanics (LEFM) analysis results; when the shape of cohesive laws becomes important in the fracture predictions; and why the opening profile along the cohesive zone length needs to be accurately predicted. Two cohesive models were used in this study to address these technical issues: the linear softening cohesive model and the Dugdale perfectly plastic cohesive model. Each cohesive model comprises five cohesive laws with different maximum tractions. All cohesive laws have the same cohesive work rate (CWR), which is defined by the area under the traction-separation curve. The effects of the maximum traction on the cohesive zone length and the critical remote applied stress are investigated for both models. For a CZM to predict a fracture load similar to that obtained by an LEFM analysis, the cohesive zone length needs to be much smaller than the crack length, which reflects the small scale yielding condition required for LEFM analysis to be valid. For large-scale cohesive zone cases, the predicted critical remote applied stresses depend on the shape of the cohesive model used and can significantly deviate from LEFM results. Furthermore, this study also reveals the importance of accurately predicting the cohesive zone profile in determining the critical remote applied load.

  14. FY 2003 Scientific and Technical Reports, Articles, Papers, and Presentations

    Science.gov (United States)

    Fowler, B. A. (Compiler)

    2004-01-01

    This Technical Memorandum (TM) presents formal NASA technical reports, papers published in technical journals, and presentations by Marshall Space Flight Center (MSFC) personnel in FY 2003. It also includes papers of MSFC contractors. After being announced in STAR, all NASA series reports may be obtained from the National Technical Information Service, 5285 Port Royal Road, Springfield, VA 22161. The information in this TM may be of value to the scientific and engineering community in determining what information has been published and what is available.

  15. Big Data impacts on stochastic Forecast Models: Evidence from FX time series

    Directory of Open Access Journals (Sweden)

    Sebastian Dietz

    2013-12-01

    With the rise of the Big Data paradigm, new tasks for prediction models have appeared. In addition to the volume problem of such data sets, nonlinearity becomes important, as more detailed data sets also contain more comprehensive information, e.g. about non-regular seasonal or cyclical movements as well as jumps in time series. This essay compares two nonlinear methods for predicting a high-frequency time series, the USD/Euro exchange rate. The first method investigated is the Autoregressive Neural Network Process (ARNN), a neural-network-based nonlinear extension of classical autoregressive process models from time series analysis (see Dietz 2011). Its advantage is its simple but scalable time series process model architecture, which is able to include all kinds of nonlinearities based on the universal approximation theorem of Hornik, Stinchcombe and White (1989) and the extensions of Hornik (1993). However, restrictions related to the numeric estimation procedures limit the flexibility of the model. The alternative is a Support Vector Machine model (SVM; Vapnik 1995). The two methods compared take different approaches to error minimization (empirical error minimization for the ARNN vs. structural error minimization for the SVM). Our new finding is that time series data classified as "Big Data" need new methods for prediction. Estimation and prediction were performed using the statistical programming language R. Besides prediction results, we also discuss the impact of Big Data on data preparation and model validation steps.

  16. Modeling multivariate time series on manifolds with skew radial basis functions.

    Science.gov (United States)

    Jamshidi, Arta A; Kirby, Michael J

    2011-01-01

    We present an approach for constructing nonlinear empirical mappings from high-dimensional domains to multivariate ranges. We employ radial basis functions and skew radial basis functions for constructing a model using data that are potentially scattered or sparse. The algorithm progresses iteratively, adding a new function at each step to refine the model. The placement of the functions is driven by a statistical hypothesis test that accounts for correlation in the multivariate range variables. The test is applied on training and validation data and reveals nonstatistical or geometric structure when it fails. At each step, the added function is fit to data contained in a spatiotemporally defined local region to determine the parameters--in particular, the scale of the local model. The scale of the function is determined by the zero crossings of the autocorrelation function of the residuals. The model parameters and the number of basis functions are determined automatically from the given data, and there is no need to initialize any ad hoc parameters save for the selection of the skew radial basis functions. Compactly supported skew radial basis functions are employed to improve model accuracy, order, and convergence properties. The extension of the algorithm to higher-dimensional ranges produces reduced-order models by exploiting the existence of correlation in the range variable data. Structure is tested not just in a single time series but between all pairs of time series. We illustrate the new methodologies using several illustrative problems, including modeling data on manifolds and the prediction of chaotic time series.
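The iterative refinement loop described above, adding one basis function per step where the model still fails, can be illustrated in miniature. The following Python sketch is not the authors' algorithm: it is a much-simplified stand-in using ordinary Gaussian radial basis functions, a fixed scale, and greedy center placement at the largest residual; all names and numbers are illustrative.

```python
import numpy as np

def gaussian_rbf(x, center, scale):
    """Gaussian radial basis function evaluated at points x."""
    return np.exp(-((x - center) ** 2) / (2.0 * scale ** 2))

def fit_rbf_greedy(x, y, n_basis=5, scale=0.5):
    """Greedily add Gaussian RBFs centered where the residual is largest,
    refitting all weights by least squares after each addition."""
    centers, residual = [], y.copy()
    for _ in range(n_basis):
        # place the next center at the point with the worst residual
        centers.append(x[np.argmax(np.abs(residual))])
        Phi = np.column_stack([gaussian_rbf(x, c, scale) for c in centers])
        weights, *_ = np.linalg.lstsq(Phi, y, rcond=None)
        residual = y - Phi @ weights
    return np.array(centers), weights

# toy 1-D example: recover a smooth two-bump signal
x = np.linspace(-3, 3, 200)
y = np.exp(-x ** 2) + 0.5 * np.exp(-((x - 1.5) ** 2) / 0.5)
centers, weights = fit_rbf_greedy(x, y, n_basis=6)
Phi = np.column_stack([gaussian_rbf(x, c, 0.5) for c in centers])
rms = np.sqrt(np.mean((y - Phi @ weights) ** 2))
```

The paper's method additionally uses a statistical hypothesis test to drive placement and autocorrelation zero crossings to set the scale; the sketch replaces both with the simplest possible rules.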

  17. Technical training: AXEL-2008 - Introduction to Particle Accelerators

    CERN Multimedia

    2008-01-01

    CERN Technical Training 2008: Learning for the LHC! AXEL-2008 is a course series on particle accelerators, given at CERN within the framework of the AB Operation Group Shut-down Lectures. Since 2003, this course has been organized as a joint venture between the AB Department and Technical Training, and is open to the wider CERN community. The AXEL-2008 course series is designed for technicians who are operating an accelerator, or whose work is closely linked to accelerators, but it is also open to technicians, engineers and physicists interested in this field. The course does not require any prior knowledge of accelerators. However, some basic knowledge of trigonometry, matrices and differential equations, and some basic notions of magnetism, would be an advantage. The course series consists of 10 one-hour lectures (mornings and afternoons) from 29 January to 1 February 2008, given in English with questions and answers also possible in French. The lecturer is Rende Steerenberg.

  19. Single-Index Additive Vector Autoregressive Time Series Models

    KAUST Repository

    LI, YEHUA

    2009-09-01

    We study a new class of nonlinear autoregressive models for vector time series, where the current vector depends on single-indexes defined on the past lags and the effects of different lags have an additive form. A sufficient condition is provided for stationarity of such models. We also study estimation of the proposed model using P-splines, hypothesis testing, asymptotics, selection of the order of the autoregression and of the smoothing parameters and nonlinear forecasting. We perform simulation experiments to evaluate our model in various settings. We illustrate our methodology on a climate data set and show that our model provides more accurate yearly forecasts of the El Niño phenomenon, the unusual warming of water in the Pacific Ocean. © 2009 Board of the Foundation of the Scandinavian Journal of Statistics.

  20. Incorporating Satellite Time-Series Data into Modeling

    Science.gov (United States)

    Gregg, Watson

    2008-01-01

    In situ time series observations have provided a multi-decadal view of long-term changes in ocean biology. These observations are sufficiently reliable to enable discernment of even relatively small changes, and provide continuous information on a host of variables. Their key drawback is their limited domain. Satellite observations from ocean color sensors do not suffer the drawback of domain, and simultaneously view the global oceans. This attribute lends credence to their use in global and regional model validation and data assimilation. We focus on these applications using the NASA Ocean Biogeochemical Model. The enhancement of the satellite data using data assimilation is featured, and the limitation of long-term satellite data sets is also discussed.

  1. A Parsimonious Bootstrap Method to Model Natural Inflow Energy Series

    Directory of Open Access Journals (Sweden)

    Fernando Luiz Cyrino Oliveira

    2014-01-01

    The Brazilian energy generation and transmission system is quite peculiar in its dimension and characteristics. As such, it can be considered unique in the world. It is a high-dimension hydrothermal system with a huge share of hydro plants. Such strong dependency on hydrological regimes implies uncertainties in energy planning, requiring adequate modeling of the hydrological time series. This is carried out via stochastic simulations of monthly inflow series using the family of Periodic Autoregressive models, PAR(p), one for each period (month) of the year. In this paper we show the problems in fitting these models with the current system, particularly the identification of the autoregressive order "p" and the corresponding parameter estimation. We then propose a new approach to setting both the model order and the parameter estimates of the PAR(p) models, using a nonparametric computational technique known as the bootstrap. This technique allows the estimation of reliable confidence intervals for the model parameters. The obtained results using the Parsimonious Bootstrap Method of Moments (PBMOM) produced not only more parsimonious model orders but also adherent stochastic scenarios and, in the long range, lead to a better use of water resources in energy operation planning.
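The bootstrap idea behind the approach, resampling residuals to obtain confidence intervals for autoregressive parameters, can be sketched for a plain AR(1) rather than the full PAR(p) setting. Everything below (series length, coefficient, number of replications) is invented for illustration; it is not the PBMOM procedure itself.

```python
import numpy as np

rng = np.random.default_rng(42)

def fit_ar1(x):
    """Least-squares estimate of phi in x[t] = phi * x[t-1] + e[t]."""
    return np.dot(x[:-1], x[1:]) / np.dot(x[:-1], x[:-1])

# simulate an AR(1) inflow-anomaly-like series with phi = 0.6
n, phi_true = 400, 0.6
e = rng.normal(size=n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi_true * x[t - 1] + e[t]

phi_hat = fit_ar1(x)
residuals = x[1:] - phi_hat * x[:-1]

# residual bootstrap: rebuild the series from resampled residuals, refit phi
boot = []
for _ in range(500):
    eb = rng.choice(residuals, size=n, replace=True)
    xb = np.zeros(n)
    for t in range(1, n):
        xb[t] = phi_hat * xb[t - 1] + eb[t]
    boot.append(fit_ar1(xb))
lo, hi = np.percentile(boot, [2.5, 97.5])  # 95% confidence interval for phi
```

In the periodic case the same resampling would be applied month by month, with one coefficient vector per period.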

  2. An Illustration of Generalised Arma (garma) Time Series Modeling of Forest Area in Malaysia

    Science.gov (United States)

    Pillai, Thulasyammal Ramiah; Shitan, Mahendran

    Forestry is the art and science of managing forests, tree plantations, and related natural resources. The main goal of forestry is to create and implement systems that allow forests to continue a sustainable provision of environmental supplies and services. Forest area is land under natural or planted stands of trees, whether productive or not. The forest area of Malaysia has been observed over the years, and it can be modeled using time series models. A new class of GARMA models has been introduced in the time series literature to reveal some hidden features in time series data. For these models to be used widely in practice, we illustrate the fitting of the GARMA(1, 1; 1, δ) model to the annual forest area data of Malaysia, observed from 1987 to 2008. The model was estimated using the Hannan-Rissanen algorithm, Whittle's estimation and maximum likelihood estimation.
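The Hannan-Rissanen algorithm mentioned for estimation can be sketched for a plain ARMA(1,1), a simplification of the GARMA case: stage one fits a long autoregression to approximate the innovations, stage two regresses the series on its own lag and the lagged innovation estimates. The data and all settings below are synthetic and illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def hannan_rissanen_arma11(y, long_ar=10):
    """Two-stage Hannan-Rissanen estimate of (phi, theta) for an ARMA(1,1)."""
    n = len(y)
    # stage 1: long autoregression to approximate the innovations e[t]
    Y = np.column_stack([y[long_ar - k - 1:n - k - 1] for k in range(long_ar)])
    a, *_ = np.linalg.lstsq(Y, y[long_ar:], rcond=None)
    e = np.zeros(n)
    e[long_ar:] = y[long_ar:] - Y @ a
    # stage 2: regress y[t] on y[t-1] and the estimated e[t-1]
    X = np.column_stack([y[long_ar:-1], e[long_ar:-1]])
    (phi, theta), *_ = np.linalg.lstsq(X, y[long_ar + 1:], rcond=None)
    return phi, theta

# simulate ARMA(1,1): y[t] = 0.7*y[t-1] + e[t] + 0.3*e[t-1]
n, phi0, theta0 = 2000, 0.7, 0.3
e = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi0 * y[t - 1] + e[t] + theta0 * e[t - 1]
phi, theta = hannan_rissanen_arma11(y)
```

The appeal of the method is that both stages are ordinary least squares, so it needs no nonlinear optimization and is a common way to initialize maximum likelihood estimation.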

  3. Clustering Multivariate Time Series Using Hidden Markov Models

    Directory of Open Access Journals (Sweden)

    Shima Ghassempour

    2014-03-01

    In this paper we describe an algorithm for clustering multivariate time series with variables taking both categorical and continuous values. Time series of this type are frequent in health care, where they represent the health trajectories of individuals. The problem is challenging because categorical variables make it difficult to define a meaningful distance between trajectories. We propose an approach based on Hidden Markov Models (HMMs), where we first map each trajectory into an HMM, then define a suitable distance between HMMs, and finally proceed to cluster the HMMs with a method based on a distance matrix. We test our approach on a simulated, but realistic, data set of 1,255 trajectories of individuals of age 45 and over, on a synthetic validation set with known clustering structure, and on a smaller set of 268 trajectories extracted from the longitudinal Health and Retirement Survey. The proposed method can be implemented quite simply using standard packages in R and Matlab, and may be a good candidate for solving the difficult problem of clustering multivariate time series with categorical variables using tools that do not require advanced statistical knowledge, and are therefore accessible to a wide range of researchers.

  4. Empirical intrinsic geometry for nonlinear modeling and time series filtering.

    Science.gov (United States)

    Talmon, Ronen; Coifman, Ronald R

    2013-07-30

    In this paper, we present a method for time series analysis based on empirical intrinsic geometry (EIG). EIG enables one to reveal the low-dimensional parametric manifold as well as to infer the underlying dynamics of high-dimensional time series. By incorporating concepts of information geometry, this method extends existing geometric analysis tools to support stochastic settings and parametrizes the geometry of empirical distributions. However, the statistical models are not required as priors; hence, EIG may be applied to a wide range of real signals without existing definitive models. We show that the inferred model is noise-resilient and invariant under different observation and instrumental modalities. In addition, we show that it can be extended efficiently to newly acquired measurements in a sequential manner. These two advantages enable us to revisit the Bayesian approach and incorporate empirical dynamics and intrinsic geometry into a nonlinear filtering framework. We show applications to nonlinear and non-Gaussian tracking problems as well as to acoustic signal localization.

  5. Series-NonUniform Rational B-Spline (S-NURBS) model: a geometrical interpolation framework for chaotic data.

    Science.gov (United States)

    Shao, Chenxi; Liu, Qingqing; Wang, Tingting; Yin, Peifeng; Wang, Binghong

    2013-09-01

    Time series is widely exploited to study the innate character of the complex chaotic system. Existing chaotic models are weak in modeling accuracy because of adopting either error minimization strategy or an acceptable error to end the modeling process. Instead, interpolation can be very useful for solving differential equations with a small modeling error, but it is also very difficult to deal with arbitrary-dimensional series. In this paper, geometric theory is considered to reduce the modeling error, and a high-precision framework called Series-NonUniform Rational B-Spline (S-NURBS) model is developed to deal with arbitrary-dimensional series. The capability of the interpolation framework is proved in the validation part. Besides, we verify its reliability by interpolating Musa dataset. The main improvement of the proposed framework is that we are able to reduce the interpolation error by properly adjusting weights series step by step if more information is given. Meanwhile, these experiments also demonstrate that studying the physical system from a geometric perspective is feasible.

  6. Travel Cost Inference from Sparse, Spatio-Temporally Correlated Time Series Using Markov Models

    DEFF Research Database (Denmark)

    Yang, Bin; Guo, Chenjuan; Jensen, Christian S.

    2013-01-01

    of such time series offers insight into the underlying system and enables prediction of system behavior. While the techniques presented in the paper apply more generally, we consider the case of transportation systems and aim to predict travel cost from GPS tracking data from probe vehicles. Specifically, each...... road segment has an associated travel-cost time series, which is derived from GPS data. We use spatio-temporal hidden Markov models (STHMM) to model correlations among different traffic time series. We provide algorithms that are able to learn the parameters of an STHMM while contending...... with the sparsity, spatio-temporal correlation, and heterogeneity of the time series. Using the resulting STHMM, near future travel costs in the transportation network, e.g., travel time or greenhouse gas emissions, can be inferred, enabling a variety of routing services, e.g., eco-routing. Empirical studies...

  7. Forecasting electricity spot-prices using linear univariate time-series models

    International Nuclear Information System (INIS)

    Cuaresma, Jesus Crespo; Hlouskova, Jaroslava; Kossmeier, Stephan; Obersteiner, Michael

    2004-01-01

    This paper studies the forecasting abilities of a battery of univariate models of hourly electricity spot prices, using data from the Leipzig Power Exchange. The specifications studied include autoregressive models, autoregressive-moving average models and unobserved component models. The results show that specifications where each hour of the day is modelled separately present uniformly better forecasting properties than specifications for the whole time series, and that the inclusion of simple probabilistic processes for the arrival of extreme price events can lead to improvements in the forecasting abilities of univariate models for electricity spot prices. (Author)
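The key finding, that one model per hour of the day beats a single whole-series specification, can be illustrated on synthetic data. The sketch below is deliberately minimal (per-hour historical means against one global mean, not the ARMA or unobserved-component models of the study), with an invented diurnal price profile.

```python
import numpy as np

rng = np.random.default_rng(1)

# synthetic hourly "spot price": diurnal profile plus AR(1) noise
days, hours = 200, 24
profile = 30 + 10 * np.sin(2 * np.pi * np.arange(hours) / 24)
noise = np.zeros(days * hours)
for t in range(1, days * hours):
    noise[t] = 0.5 * noise[t - 1] + rng.normal()
price = np.tile(profile, days) + noise

def mean_forecast_per_hour(series, hours=24):
    """Forecast each hour of the next day with that hour's own historical
    mean -- the simplest 'one model per hour' specification."""
    by_hour = series.reshape(-1, hours)
    return by_hour[:-1].mean(axis=0)          # train on all days but the last

last_day = price.reshape(-1, hours)[-1]
per_hour = mean_forecast_per_hour(price)
global_mean = np.full(hours, price[:-hours].mean())  # single whole-series model

rmse_per_hour = np.sqrt(np.mean((last_day - per_hour) ** 2))
rmse_global = np.sqrt(np.mean((last_day - global_mean) ** 2))
```

The per-hour specification captures the diurnal cycle that any single unconditional model averages away, which is the intuition behind the paper's result.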

  8. Modelling technical snow production for skiing areas in the Austrian Alps with the physically based snow model AMUNDSEN

    Science.gov (United States)

    Hanzer, F.; Marke, T.; Steiger, R.; Strasser, U.

    2012-04-01

    Tourism, and particularly winter tourism, is a key factor for the Austrian economy. Judging from currently available climate simulations, the Austrian Alps show a particularly high vulnerability to climatic changes. To reduce the exposure of ski areas to changes in natural snow conditions, as well as to generally enhance snow conditions at skiing sites, technical snowmaking is widely utilized across Austrian ski areas. While such measures result in better snow conditions at the skiing sites and are important for the local skiing industry, their economic efficiency also has to be taken into account. The current work emerges from the project CC-Snow II, where improved future climate scenario simulations are used to determine future natural and artificial snow conditions and their effects on tourism and the economy in the Austrian Alps. In a first step, a simple technical snowmaking approach is incorporated into the process-based snow model AMUNDSEN, which operates at a spatial resolution of 10-50 m and a temporal resolution of 1-3 hours. Locations of skiing slopes within a ski area in Styria, Austria, were digitized and imported into the model environment. During a predefined time frame at the beginning of the ski season, the model produces a maximum possible amount of technical snow and distributes it on the slopes; afterwards, until the end of the ski season, the model tries to maintain a certain snow depth threshold on the slopes. Because only a few input parameters are required, this approach is easily transferable to other ski areas. In our poster contribution, we present first results of this snowmaking approach and give an overview of the data and methodology applied.
In a further step in CC-Snow, this simple bulk approach will be extended to consider actual snow cannon locations and technical specifications, which will allow a more detailed description of technical snow production as well as cannon-based recordings of water and energy

  9. Effect of calibration data series length on performance and optimal parameters of hydrological model

    Directory of Open Access Journals (Sweden)

    Chuan-zhe Li

    2010-12-01

    In order to assess the effects of calibration data series length on the performance and optimal parameter values of a hydrological model in ungauged or data-limited catchments (where data are non-continuous and fragmental), we used non-continuous calibration periods to obtain more independent streamflow data for calibration of SIMHYD (simple hydrology model). The Nash-Sutcliffe efficiency and the percentage water balance error were used as performance measures. The particle swarm optimization (PSO) method was used to calibrate the rainfall-runoff models. Different lengths of data series, ranging from one year to ten years and randomly sampled, were used to study the impact of calibration data series length. Fifty-five relatively unimpaired catchments located all over Australia, with daily precipitation, potential evapotranspiration, and streamflow data, were tested in order to obtain more general conclusions. The results show that longer calibration data series do not necessarily result in better model performance. In general, eight years of data are sufficient to obtain steady estimates of model performance and parameters for the SIMHYD model. It is also shown that most humid catchments require fewer calibration data to obtain a good performance and stable parameter values. The model performs better in humid and semi-humid catchments than in arid catchments. Our results may have useful and interesting implications for the efficient use of limited observation data for hydrological model calibration in different climates.
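The two performance measures used here have standard definitions that are easy to state precisely. A small Python sketch (the textbook formulas, not code from the study):

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 is no better than
    the observed mean, and negative values are worse than the mean."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum(
        (observed - observed.mean()) ** 2)

def pbias(observed, simulated):
    """Percentage water-balance (bias) error of the simulated series."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    return 100.0 * (np.sum(simulated) - np.sum(observed)) / np.sum(observed)

# tiny worked example with made-up flows
obs = np.array([1.0, 3.0, 2.0, 5.0, 4.0])
sim = np.array([1.2, 2.7, 2.1, 4.6, 4.3])
nse, bias = nash_sutcliffe(obs, sim), pbias(obs, sim)
```

In a calibration loop such as the PSO search described above, the optimizer would maximize the first quantity while keeping the second near zero.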

  10. Stochastic model stationarization by eliminating the periodic term and its effect on time series prediction

    Science.gov (United States)

    Moeeni, Hamid; Bonakdari, Hossein; Fatemi, Seyed Ehsan

    2017-04-01

    Because time series stationarization has a key role in stochastic modeling results, three methods are analyzed in this study: seasonal differencing, seasonal standardization and spectral analysis, each used to eliminate the effect of the periodic term on time series stationarity. First, six time series, comprising 4 streamflow series and 2 water temperature series, are stationarized. The stochastic term of these series is subsequently modeled with ARIMA. For the analysis, 9228 models are introduced. It is observed that seasonal standardization and spectral analysis eliminate the periodic term completely, while seasonal differencing maintains seasonal correlation structures. The obtained results indicate that all three methods present acceptable performance overall. However, model accuracy in monthly streamflow prediction is higher with seasonal differencing than with the other two methods. Another advantage of seasonal differencing over the other methods is that the monthly streamflow is never estimated as negative. Standardization is the best method for predicting monthly water temperature, although it is quite similar to seasonal differencing, while spectral analysis performed the weakest in all cases. It is concluded that for each monthly seasonal series, seasonal differencing is the best stationarization method in terms of periodic effect elimination. Moreover, the monthly water temperature is predicted with more accuracy than the monthly streamflow. The ratios of the average stochastic term to the amplitude of the periodic term obtained for monthly streamflow and monthly water temperature were 0.19 and 0.30, 0.21 and 0.13, and 0.07 and 0.04, respectively.
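Two of the three stationarization methods compared, seasonal differencing and seasonal standardization, can be sketched directly (synthetic monthly data with an invented seasonal cycle; spectral analysis is omitted for brevity):

```python
import numpy as np

def seasonal_difference(x, period=12):
    """Remove the periodic term by differencing at the seasonal lag."""
    return x[period:] - x[:-period]

def seasonal_standardize(x, period=12):
    """Remove the periodic term by subtracting each month's mean and
    dividing by that month's standard deviation."""
    x = x[: len(x) // period * period].reshape(-1, period)
    z = (x - x.mean(axis=0)) / x.std(axis=0)
    return z.ravel()

# synthetic monthly series: seasonal cycle plus noise
rng = np.random.default_rng(7)
months = np.arange(240)
x = 10 + 5 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 1, 240)

d = seasonal_difference(x)
s = seasonal_standardize(x)
```

After either transform, the per-month means are near zero, i.e. the deterministic periodic term is gone and an ARIMA model can be fitted to the remainder; as the abstract notes, differencing still leaves seasonal correlation in the stochastic part.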

  11. Intuitionistic Fuzzy Time Series Forecasting Model Based on Intuitionistic Fuzzy Reasoning

    Directory of Open Access Journals (Sweden)

    Ya’nan Wang

    2016-01-01

    Fuzzy set theory cannot describe data comprehensively, which has greatly limited the objectivity of fuzzy time series in forecasting uncertain data. In this regard, an intuitionistic fuzzy time series forecasting model is built. In the new model, a fuzzy clustering algorithm is used to divide the universe of discourse into unequal intervals, and a more objective technique for ascertaining the membership and non-membership functions of the intuitionistic fuzzy set is proposed. On this basis, forecast rules based on intuitionistic fuzzy approximate reasoning are established. Finally, contrast experiments on the enrollments of the University of Alabama and the Taiwan Stock Exchange Capitalization Weighted Stock Index are carried out. The results show that the new model clearly improves forecast accuracy.

  12. Transfer function modeling of the monthly accumulated rainfall series over the Iberian Peninsula

    Energy Technology Data Exchange (ETDEWEB)

    Mateos, Vidal L.; Garcia, Jose A.; Serrano, Antonio; De la Cruz Gallego, Maria [Departamento de Fisica, Universidad de Extremadura, Badajoz (Spain)

    2002-10-01

    In order to improve on the results given by Autoregressive Moving-Average (ARMA) modeling of the monthly accumulated rainfall series taken at 19 observatories of the Iberian Peninsula, a Discrete Linear Transfer Function Noise (DLTFN) model was applied, taking the local pressure series (LP), North Atlantic sea level pressure series (SLP) and North Atlantic sea surface temperature (SST) as input variables, and the rainfall series as the output series. In all cases, the performance of the DLTFN models, measured by the explained variance of the rainfall series, is better than that of the ARMA models. The best performance is given by the models that take the local pressure as the input variable, followed by the sea level pressure models and the sea surface temperature models. Geographically speaking, the models fitted to observatories located in the west of the Iberian Peninsula work better than those in the north and east of the Peninsula. It was also found that there is a region located between 0°N and 20°N which shows the highest cross-correlation between SST and the Peninsula rainfalls; this region moves to the west and northwest of the Peninsula when the SLP series are used.

  13. SITE-94. The CRYSTAL Geosphere Transport Model: Technical documentation version 2.1

    International Nuclear Information System (INIS)

    Worgan, K.; Robinson, P.

    1995-12-01

    CRYSTAL, a one-dimensional contaminant transport model of a densely fissured geosphere, was originally developed for the SKI Project-90 performance assessment program. It has since been extended to include matrix blocks of alternative basic geometries. CRYSTAL predicts the transport of arbitrary-length decay chains by advection, diffusion and surface sorption in the fissures, and by diffusion into the rock matrix blocks. The model equations are solved in Laplace transform space and inverted numerically to the time domain. This approach avoids time-stepping and is consequently numerically very efficient. The source term for CRYSTAL may be supplied internally, using either simple leaching or band release submodels, or by input of a general time-series output from a near-field model. The time series input is interfaced with the geosphere model using the method of convolution: the response of the geosphere to delta-function inputs for each nuclide is combined with the time series outputs from the near field to obtain the nuclide flux emerging from the far field. 14 refs
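The convolution interface between near-field and far-field models can be illustrated numerically. The sketch below is a toy discretization with an assumed Gaussian unit-impulse response and an exponential leach source; it shows the mechanics of the method, not CRYSTAL's actual kernels.

```python
import numpy as np

t = np.arange(200)                        # time grid in years
tau, sigma = 60.0, 15.0                   # assumed travel time and spread
impulse = np.exp(-((t - tau) ** 2) / (2 * sigma ** 2))
impulse /= impulse.sum()                  # unit response: mass in = mass out

source = np.exp(-t / 40.0)                # simple exponential leach release

# far-field flux = convolution of the near-field source with the
# geosphere's delta-function (impulse) response
full_flux = np.convolve(source, impulse)
flux = full_flux[: len(t)]                # restrict to the simulated window
```

Because the impulse response is normalized, the convolution conserves the released mass while delaying and spreading it, which is exactly the behavior the geosphere model contributes.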

  14. From Taylor series to Taylor models

    International Nuclear Information System (INIS)

    Berz, M.

    1997-01-01

    An overview of the background of Taylor series methods and the utilization of the differential algebraic structure is given, and various associated techniques are reviewed. The conventional Taylor methods are extended to allow for a rigorous treatment of bounds for the remainder of the expansion in a similarly universal way. Utilizing differential algebraic and functional analytic arguments on the set of Taylor models, arbitrary order integrators with rigorous remainder treatment are developed. The integrators can meet pre-specified accuracy requirements in a mathematically strict way, and are a stepping stone towards fully rigorous estimates of stability of repetitive systems. copyright 1997 American Institute of Physics
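The step from a Taylor polynomial to a Taylor model, carrying a rigorous remainder bound along with the polynomial, can be illustrated for the exponential function. This is a minimal sketch using the classical Lagrange remainder, not the differential algebraic machinery described in the paper.

```python
import math

def exp_taylor_model(x, order, domain=(-1.0, 1.0)):
    """Taylor polynomial of exp about 0, plus a remainder bound valid on
    the whole domain: |R| <= e^b * b^(n+1) / (n+1)! for b = max|x|."""
    poly = sum(x ** k / math.factorial(k) for k in range(order + 1))
    b = max(abs(domain[0]), abs(domain[1]))
    bound = math.exp(b) * b ** (order + 1) / math.factorial(order + 1)
    return poly, bound

# evaluate the order-6 Taylor model of exp at x = 0.5
poly, bound = exp_taylor_model(0.5, order=6)
```

The pair (polynomial, bound) is the essence of a Taylor model: any point in the domain is guaranteed to lie within `bound` of the polynomial value, which is what makes rigorous enclosure of solutions possible.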

  15. State-space prediction model for chaotic time series

    Science.gov (United States)

    Alparslan, A. K.; Sayar, M.; Atilgan, A. R.

    1998-08-01

    A simple method for predicting the continuation of scalar chaotic time series ahead in time is proposed. The false nearest neighbors technique, in connection with time-delayed embedding, is employed so as to reconstruct the state space. A local forecasting model based upon the time evolution of the topological neighborhood in the reconstructed phase space is suggested. A moving root-mean-square error is utilized in order to monitor the error along the prediction horizon. The model is tested on the convection amplitude of the Lorenz model. The results indicate that for approximately 100 cycles of training data, the prediction follows the actual continuation very closely for about six cycles. The proposed model, like other state-space forecasting models, captures the long-term behavior of the system due to the use of spatial neighbors in the state space.
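The core of such state-space forecasting, delay embedding followed by prediction from neighbors in the reconstructed space, can be sketched as follows. The embedding dimension and neighbor count are fixed by hand here rather than chosen by the false-nearest-neighbors test, and a noiseless sine stands in for the Lorenz convection amplitude; everything is illustrative.

```python
import numpy as np

def embed(x, dim, delay=1):
    """Time-delay embedding: row i is (x[i+dim-1], ..., x[i+1], x[i])."""
    n = len(x) - (dim - 1) * delay
    return np.column_stack([x[(dim - 1 - k) * delay:(dim - 1 - k) * delay + n]
                            for k in range(dim)])

def nn_forecast(x, dim=3, k=4, horizon=1):
    """Predict x[t+horizon] as the average continuation of the k nearest
    neighbors of the current state in the reconstructed state space."""
    states = embed(x, dim)
    current = states[-1]
    candidates = states[:-horizon]       # neighbors must have a known future
    dists = np.linalg.norm(candidates - current, axis=1)
    idx = np.argsort(dists)[:k]          # current state is outside candidates
    future = idx + (dim - 1) + horizon   # map state index back to series index
    return x[future].mean()

# test on a noiseless sine: the next value is determined by the state
x = np.sin(0.3 * np.arange(500))
pred = nn_forecast(x, dim=3, k=4)
true_next = np.sin(0.3 * 500)
```

As in the abstract, prediction quality degrades with the horizon: iterating the one-step forecast compounds the neighbor-matching error, which is why a moving error measure along the horizon is useful.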

  16. Modeling Technical Change in Energy System Analysis: Analyzing the Introduction of Learning-by-Doing in Bottom-up Energy Models

    Energy Technology Data Exchange (ETDEWEB)

    Berglund, Christer; Soederholm, Patrik [Luleaa Univ. of Technology (Sweden). Div. of Economics

    2005-02-01

    The main objective of this paper is to provide an overview and a critical analysis of the recent literature on incorporating induced technical change in energy systems models. Special emphasis is put on surveying recent studies aiming at integrating learning-by-doing into bottom-up energy systems models through so-called learning curves, and on analyzing the relevance of learning curve analysis for understanding the process of innovation and technology diffusion in the energy sector. The survey indicates that this model work represents a major advance in energy research, and carries important policy implications, not least concerning the cost and the timing of environmental policies (including carbon emission constraints). However, bottom-up energy models with endogenous learning are also limited in their characterization of technology diffusion and innovation. While they provide a detailed account of technical options - which is absent in many top-down models - they also lack important aspects of diffusion behavior that are captured in top-down representations. For instance, they fail to capture strategic technology diffusion behavior in the energy sector, and they neglect important general equilibrium impacts (such as the opportunity cost of redirecting R and D support to the energy sector). For these reasons bottom-up and top-down models with induced technical change should not be viewed as substitutes but rather as complements.
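    The one-factor learning curve that such bottom-up models embed can be sketched as follows; the cost and capacity figures are made-up illustrations, not values from the survey:

```python
import math

def learning_curve_cost(c0, q0, q, progress_ratio):
    """One-factor learning curve: unit cost falls by (1 - progress_ratio)
    for every doubling of cumulative installed capacity q."""
    b = -math.log2(progress_ratio)     # learning index
    return c0 * (q / q0) ** (-b)

# hypothetical numbers: 1000 $/kW at 1 GW cumulative capacity,
# 80 % progress ratio (i.e. a 20 % learning rate)
c = learning_curve_cost(1000.0, 1.0, 8.0, 0.80)
# three doublings -> 1000 * 0.8**3 = 512 $/kW
print(round(c, 1))   # 512.0
```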

  17. Multivariate time series modeling of selected childhood diseases in ...

    African Journals Online (AJOL)

    This paper is focused on modeling the five most prevalent childhood diseases in Akwa Ibom State using a multivariate approach to time series. An aggregate of 78,839 reported cases of malaria, upper respiratory tract infection (URTI), Pneumonia, anaemia and tetanus were extracted from five randomly selected hospitals in ...

  18. A Seasonal Time-Series Model Based on Gene Expression Programming for Predicting Financial Distress.

    Science.gov (United States)

    Cheng, Ching-Hsue; Chan, Chia-Pang; Yang, Jun-He

    2018-01-01

    The issue of financial distress prediction is an important and challenging research topic in the financial field. Currently, there are many methods for predicting firm bankruptcy and financial crisis, including artificial intelligence and traditional statistical methods, and past studies have shown that artificial intelligence methods predict better than traditional statistical methods. Financial statements are quarterly reports; hence, the financial crisis of companies is seasonal time-series data, and the attribute data affecting the financial distress of companies is nonlinear and nonstationary time-series data with fluctuations. Therefore, this study employed a nonlinear attribute selection method to build a nonlinear financial distress prediction model: that is, this paper proposed a novel seasonal time-series gene expression programming model for predicting the financial distress of companies. The proposed model has several advantages, including the following: (i) the proposed model differs from previous models lacking the concept of time series; (ii) the proposed integrated attribute selection method can find the core attributes and reduce high-dimensional data; and (iii) the proposed model can generate rules and mathematical formulas of financial distress to provide references for investors and decision makers. The results show that the proposed method is better than the listed classifiers under three criteria; hence, the proposed model has competitive advantages in predicting the financial distress of companies.

  19. Identification of human operator performance models utilizing time series analysis

    Science.gov (United States)

    Holden, F. M.; Shinners, S. M.

    1973-01-01

    The results of an effort performed by Sperry Systems Management Division for AMRL in applying time series analysis as a tool for modeling the human operator are presented. This technique is utilized for determining the variation of the human transfer function under various levels of stress. The human operator's model is determined based on actual input and output data from a tracking experiment.

  20. Cointegration and Error Correction Modelling in Time-Series Analysis: A Brief Introduction

    Directory of Open Access Journals (Sweden)

    Helmut Thome

    2015-07-01

    Full Text Available Criminological research is often based on time-series data showing some type of trend movement. Trending time-series may correlate strongly even in cases where no causal relationship exists (spurious causality). To avoid this problem researchers often apply some technique of detrending their data, such as differencing the series. This approach, however, may bring up another problem: that of spurious non-causality. Both problems can, in principle, be avoided if the series under investigation are “difference-stationary” (if the trend movements are stochastic) and “cointegrated” (if the stochastically changing trend movements in different variables correspond to each other). The article gives a brief introduction to key instruments and interpretative tools applied in cointegration modelling.
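    The first step of the Engle-Granger procedure commonly used in this kind of cointegration modelling can be sketched on simulated data (the series, coefficients, and seed are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
z = np.cumsum(rng.normal(size=n))    # common stochastic trend (random walk)
y = 2.0 * z + rng.normal(size=n)     # y shares z's trend: y - 2z is stationary

# Engle-Granger step 1: regress y on z and keep the residual
X = np.column_stack([np.ones(n), z])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta

# The residual (the error-correction term) stays bounded while y and z
# wander, which is the signature of cointegration; a unit-root test on
# `resid` would be step 2, followed by an error correction model.
print(resid.std(), y.std())
```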

  1. Wavelet entropy of BOLD time series: An application to Rolandic epilepsy.

    Science.gov (United States)

    Gupta, Lalit; Jansen, Jacobus F A; Hofman, Paul A M; Besseling, René M H; de Louw, Anton J A; Aldenkamp, Albert P; Backes, Walter H

    2017-12-01

    To assess the wavelet entropy for the characterization of intrinsic aberrant temporal irregularities in the time series of resting-state blood-oxygen-level-dependent (BOLD) signal fluctuations. Further, to evaluate the temporal irregularities (disorder/order) on a voxel-by-voxel basis in the brains of children with Rolandic epilepsy. The BOLD time series was decomposed using the discrete wavelet transform and the wavelet entropy was calculated. Using a model time series consisting of multiple harmonics and nonstationary components, the wavelet entropy was compared with Shannon and spectral (Fourier-based) entropy. As an application, the wavelet entropy in 22 children with Rolandic epilepsy was compared to 22 age-matched healthy controls. The images were obtained by performing resting-state functional magnetic resonance imaging (fMRI) using a 3T system, an 8-element receive-only head coil, and an echo planar imaging pulse sequence (T2*-weighted). The wavelet entropy was also compared to spectral entropy, regional homogeneity, and Shannon entropy. Wavelet entropy was found to identify the nonstationary components of the model time series. In Rolandic epilepsy patients, a significantly elevated wavelet entropy was observed relative to controls for the whole cerebrum (P = 0.03). Spectral entropy (P = 0.41), regional homogeneity (P = 0.52), and Shannon entropy (P = 0.32) did not reveal significant differences. The wavelet entropy measure appeared more sensitive to detect abnormalities in cerebral fluctuations represented by nonstationary effects in the BOLD time series than more conventional measures. This effect was observed in the model time series as well as in Rolandic epilepsy. These observations suggest that the brains of children with Rolandic epilepsy exhibit stronger nonstationary temporal signal fluctuations than controls. Level of Evidence: 2. Technical Efficacy: Stage 3. J. Magn. Reson. Imaging 2017;46:1728-1737. © 2017 International Society for Magnetic Resonance in Medicine.
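    The measure can be sketched with a hand-rolled Haar wavelet decomposition (the study's actual wavelet, decomposition depth, and preprocessing may differ): decompose the series, take the energy of the detail coefficients at each level, and compute the Shannon entropy of the normalized energy distribution.

```python
import numpy as np

def haar_detail_energies(x, levels):
    """Energy of the Haar detail coefficients at each decomposition level."""
    x = np.asarray(x, float)
    energies = []
    for _ in range(levels):
        a = (x[0::2] + x[1::2]) / np.sqrt(2)   # approximation
        d = (x[0::2] - x[1::2]) / np.sqrt(2)   # detail
        energies.append(np.sum(d ** 2))
        x = a
    return np.array(energies)

def wavelet_entropy(x, levels=5):
    """Shannon entropy of the relative wavelet energy distribution."""
    e = haar_detail_energies(x, levels)
    p = e / e.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

t = np.arange(1024)
pure = np.sin(2 * np.pi * t / 64)        # one dominant rhythm: low entropy
noisy = np.random.default_rng(1).normal(size=1024)  # broadband: high entropy
print(wavelet_entropy(pure), wavelet_entropy(noisy))
```

    A single rhythm concentrates energy in few levels (low entropy), while broadband, noise-like signals spread it out (high entropy) — the ordering the paper exploits.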

  2. SERI biomass program annual technical report: 1982

    Energy Technology Data Exchange (ETDEWEB)

    Bergeron, P.W.; Corder, R.E.; Hill, A.M.; Lindsey, H.; Lowenstein, M.Z.

    1983-02-01

    The biomass with which this report is concerned includes aquatic plants, which can be converted into liquid fuels and chemicals; organic wastes (crop residues as well as animal and municipal wastes), from which biogas can be produced via anaerobic digestion; and organic or inorganic waste streams, from which hydrogen can be produced by photobiological processes. The Biomass Program Office supports research in three areas which, although distinct, all use living organisms to create the desired products. The Aquatic Species Program (ASP) supports research on organisms that are themselves processed into the final products, while the Anaerobic Digestion Program (ADP) and the Photo/Biological Hydrogen Program (P/BHP) deal with organisms that transform waste streams into energy products. The P/BHP is also investigating systems using water as a feedstock and cell-free systems which do not utilize living organisms. This report summarizes the progress and research accomplishments of the SERI Biomass Program during FY 1982.

  3. Particle Markov Chain Monte Carlo Techniques of Unobserved Component Time Series Models Using Ox

    DEFF Research Database (Denmark)

    Nonejad, Nima

    This paper details Particle Markov chain Monte Carlo techniques for analysis of unobserved component time series models using several economic data sets. PMCMC combines the particle filter with the Metropolis-Hastings algorithm. Overall, PMCMC provides a very compelling, computationally fast...... and efficient framework for estimation. These advantages are used, for instance, to estimate stochastic volatility models with leverage effect or with Student-t distributed errors. We also model changing time series characteristics of the US inflation rate by considering a heteroskedastic ARFIMA model where...

  4. Applications of soft computing in time series forecasting simulation and modeling techniques

    CERN Document Server

    Singh, Pritpal

    2016-01-01

    This book reports on an in-depth study of fuzzy time series (FTS) modeling. It reviews and summarizes previous research work in FTS modeling and also provides a brief introduction to other soft-computing techniques, such as artificial neural networks (ANNs), rough sets (RS) and evolutionary computing (EC), focusing on how these techniques can be integrated into different phases of the FTS modeling approach. In particular, the book describes novel methods resulting from the hybridization of FTS modeling approaches with neural networks and particle swarm optimization. It also demonstrates how a new ANN-based model can be successfully applied in the context of predicting Indian summer monsoon rainfall. Thanks to its easy-to-read style and the clear explanations of the models, the book can be used as a concise yet comprehensive reference guide to fuzzy time series modeling, and will be valuable not only for graduate students, but also for researchers and professionals working for academic, business and governmen...

  5. Application of the Laplace transform method for computational modelling of radioactive decay series

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Deise L.; Damasceno, Ralf M.; Barros, Ricardo C. [Univ. do Estado do Rio de Janeiro (IME/UERJ) (Brazil). Programa de Pos-graduacao em Ciencias Computacionais

    2012-03-15

    It is well known that when spent fuel is removed from the core, it is still composed of considerable amount of radioactive elements with significant half-lives. Most actinides, in particular plutonium, fall into this category, and have to be safely disposed of. One solution is to store the long-lived spent fuel as it is, by encasing and burying it deep underground in a stable geological formation. This implies estimating the transmutation of these radioactive elements with time. Therefore, we describe in this paper the application of the Laplace transform technique in matrix formulation to analytically solve initial value problems that mathematically model radioactive decay series. Given the initial amount of each type of radioactive isotopes in the decay series, the computer code generates the amount at a given time of interest, or may plot a graph of the evolution in time of the amount of each type of isotopes in the series. This computer code, that we refer to as the LTRad{sub L} code, where L is the number of types of isotopes belonging to the series, was developed using the Scilab free platform for numerical computation and can model one segment or the entire chain of any of the three radioactive series existing on Earth today. Numerical results are given to typical model problems to illustrate the computer code efficiency and accuracy. (orig.)

  6. Application of the Laplace transform method for computational modelling of radioactive decay series

    International Nuclear Information System (INIS)

    Oliveira, Deise L.; Damasceno, Ralf M.; Barros, Ricardo C.

    2012-01-01

    It is well known that when spent fuel is removed from the core, it is still composed of considerable amount of radioactive elements with significant half-lives. Most actinides, in particular plutonium, fall into this category, and have to be safely disposed of. One solution is to store the long-lived spent fuel as it is, by encasing and burying it deep underground in a stable geological formation. This implies estimating the transmutation of these radioactive elements with time. Therefore, we describe in this paper the application of the Laplace transform technique in matrix formulation to analytically solve initial value problems that mathematically model radioactive decay series. Given the initial amount of each type of radioactive isotopes in the decay series, the computer code generates the amount at a given time of interest, or may plot a graph of the evolution in time of the amount of each type of isotopes in the series. This computer code, that we refer to as the LTRad L code, where L is the number of types of isotopes belonging to the series, was developed using the Scilab free platform for numerical computation and can model one segment or the entire chain of any of the three radioactive series existing on Earth today. Numerical results are given to typical model problems to illustrate the computer code efficiency and accuracy. (orig.)
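    For the shortest chain, the Laplace-transform solution of dN/dt = AN reduces to the classical Bateman equations; the sketch below uses a hypothetical two-member chain with arbitrary half-lives, not values or code from the paper:

```python
import math

def bateman_two_chain(n1_0, lam1, lam2, t):
    """Analytic solution of the decay chain 1 -> 2 -> stable, as obtained by
    Laplace transforming the initial value problem dN/dt = A N."""
    n1 = n1_0 * math.exp(-lam1 * t)
    n2 = n1_0 * lam1 / (lam2 - lam1) * (math.exp(-lam1 * t) - math.exp(-lam2 * t))
    return n1, n2

# illustrative half-lives of 8 and 2 time units
lam1, lam2 = math.log(2) / 8.0, math.log(2) / 2.0
n1, n2 = bateman_two_chain(1000.0, lam1, lam2, 4.0)
stable = 1000.0 - n1 - n2      # nuclei are conserved along the chain
print(round(n1, 2), round(n2, 2), round(stable, 2))   # 707.11 152.37 140.52
```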

  7. Technical Support Essentials Advice to Succeed in Technical Support

    CERN Document Server

    Sanchez, Andrew

    2010-01-01

    Technical Support Essentials is a book about the many facets of technical support. It attempts to provide a wide array of topics to serve as points of improvement, discussion, or simply topics that you might want to learn. The topics range from good work habits to the way technical support groups establish their own style of work. This book applies theories, models, and concepts synthesized from existing research in other fields-such as management, economics, leadership, and psychology-and connects them to technical support. The goal is to build on the work of others and allow their success to

  8. FY 1998 Scientific and Technical Reports, Articles, Papers, and Presentations

    Science.gov (United States)

    Waits, J. E. Turner (Compiler)

    1999-01-01

    This document presents formal NASA technical reports, papers published in technical journals, and presentations by MSFC (Marshall Space Flight Center) personnel in FY98. It also includes papers of MSFC contractors. After being announced in STAR, all of the NASA series reports may be obtained from the National Technical Information Service. The information in this report may be of value to the scientific and engineering community in determining what information has been published and what is available.

  9. TECHNICAL COORDINATION

    CERN Document Server

    A. Ball and W. Zeuner

    2013-01-01

    For the reporting period, the CMS common systems and infrastructure worked well, without failures that caused significant data losses. One more disconnection of the magnet cold box occurred in the shadow of interruptions in data taking, caused by a series of technical faults. The recognition during 2012 that re-connection can only safely be done at around 2 T implies a minimum magnet recovery time of 12 hours and raises serious concerns about the number of ramping cycles of the magnet these incidents cause. This has triggered studies of how to make the cryo-system of the magnet more robust against failures. The proton-proton run ended just before the end-of-year CERN closure, during which CASTOR was installed on the negative end of CMS and both ZDC calorimeters were installed in TAN absorbers in the LHC tunnel, in preparation for the heavy-ion run. The installation of CASTOR was an excellent “engineering test” of procedures for working in an activated environment. Despite some technical pr...

  10. Time series modeling and forecasting using memetic algorithms for regime-switching models.

    Science.gov (United States)

    Bergmeir, Christoph; Triguero, Isaac; Molina, Daniel; Aznarte, José Luis; Benitez, José Manuel

    2012-11-01

    In this brief, we present a novel model fitting procedure for the neuro-coefficient smooth transition autoregressive model (NCSTAR), as presented by Medeiros and Veiga. The model is endowed with a statistically founded iterative building procedure and can be interpreted in terms of fuzzy rule-based systems. The interpretability of the generated models and a mathematically sound building procedure are two very important properties of forecasting models. The model fitting procedure employed by the original NCSTAR is a combination of initial parameter estimation by a grid search procedure with a traditional local search algorithm. We propose a different fitting procedure, using a memetic algorithm, in order to obtain more accurate models. An empirical evaluation of the method is performed, applying it to various real-world time series originating from three forecasting competitions. The results indicate that we can significantly enhance the accuracy of the models, making them competitive to models commonly used in the field.
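    The genetic-plus-local-search structure of a memetic algorithm can be sketched on a toy one-dimensional fitting problem; this is not the NCSTAR fitting procedure itself, and all parameters here are illustrative:

```python
import random

def memetic_minimize(f, lo, hi, pop_size=20, gens=30, seed=3):
    """Tiny memetic optimizer: elitist genetic selection and mutation, with
    each offspring refined by a short hill-climbing local search."""
    rng = random.Random(seed)
    pop = [rng.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=f)
        parents = pop[: pop_size // 2]               # keep the best half
        children = []
        for p in parents:
            c = max(lo, min(hi, p + rng.gauss(0, 0.1 * (hi - lo))))  # mutation
            for _ in range(5):                        # memetic local-search step
                step = rng.gauss(0, 0.01 * (hi - lo))
                if lo <= c + step <= hi and f(c + step) < f(c):
                    c += step
            children.append(c)
        pop = parents + children
    return min(pop, key=f)

best = memetic_minimize(lambda x: (x - 2.0) ** 2 + 1.0, -10.0, 10.0)
print(round(best, 3))   # close to the minimizer at x = 2
```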

  11. Nonlinear Fluctuation Behavior of Financial Time Series Model by Statistical Physics System

    Directory of Open Access Journals (Sweden)

    Wuyang Cheng

    2014-01-01

    Full Text Available We develop a random financial time series model of stock market by one of statistical physics systems, the stochastic contact interacting system. Contact process is a continuous time Markov process; one interpretation of this model is as a model for the spread of an infection, where the epidemic spreading mimics the interplay of local infections and recovery of individuals. From this financial model, we study the statistical behaviors of return time series, and the corresponding behaviors of returns for Shanghai Stock Exchange Composite Index (SSECI) and Hang Seng Index (HSI) are also comparatively studied. Further, we investigate the Zipf distribution and multifractal phenomenon of returns and price changes. Zipf analysis and MF-DFA analysis are applied to investigate the natures of fluctuations for the stock market.
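    A rough discrete-time sketch of the underlying lattice dynamics (the paper uses continuous-time rates and a specific mapping from infection configurations to prices, both simplified here):

```python
import random

def contact_process(n=200, lam=2.0, steps=400, seed=4):
    """Discrete-time sketch of the 1D contact process: a randomly chosen
    infected site either recovers (rate 1) or infects a random neighbour
    (rate lam). Returns the infected-density time series whose fluctuations
    drive the financial model."""
    rng = random.Random(seed)
    state = [1] * n                         # start fully infected
    density = []
    for _ in range(steps):
        i = rng.randrange(n)
        if state[i] == 1:
            if rng.random() < 1.0 / (1.0 + lam):
                state[i] = 0                # recovery
            else:
                state[(i + rng.choice((-1, 1))) % n] = 1   # infect a neighbour
        density.append(sum(state) / n)
    return density

d = contact_process()
returns = [d[t] - d[t - 1] for t in range(1, len(d))]   # price-change analogue
print(min(d), max(d))
```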

  12. A new model for reliability optimization of series-parallel systems with non-homogeneous components

    International Nuclear Information System (INIS)

    Feizabadi, Mohammad; Jahromi, Abdolhamid Eshraghniaye

    2017-01-01

    In discussions related to reliability optimization using redundancy allocation, one of the structures that has attracted the attention of many researchers, is series-parallel structure. In models previously presented for reliability optimization of series-parallel systems, there is a restricting assumption based on which all components of a subsystem must be homogeneous. This constraint limits system designers in selecting components and prevents achieving higher levels of reliability. In this paper, a new model is proposed for reliability optimization of series-parallel systems, which makes possible the use of non-homogeneous components in each subsystem. As a result of this flexibility, the process of supplying system components will be easier. To solve the proposed model, since the redundancy allocation problem (RAP) belongs to the NP-hard class of optimization problems, a genetic algorithm (GA) is developed. The computational results of the designed GA are indicative of high performance of the proposed model in increasing system reliability and decreasing costs. - Highlights: • In this paper, a new model is proposed for reliability optimization of series-parallel systems. • In the previous models, there is a restricting assumption based on which all components of a subsystem must be homogeneous. • The presented model provides a possibility for the subsystems’ components to be non-homogeneous in the required conditions. • The computational results demonstrate the high performance of the proposed model in improving reliability and reducing costs.
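    The system reliability that such a model optimizes factorizes over the subsystems; a sketch with hypothetical component reliabilities (the optimization itself, via GA, is not shown):

```python
from math import prod

def series_parallel_reliability(subsystems):
    """System reliability for parallel subsystems arranged in series.
    Each inner list holds component reliabilities, which may be
    non-homogeneous (mixed component types within one subsystem)."""
    return prod(1 - prod(1 - rc for rc in comps) for comps in subsystems)

# hypothetical two-subsystem design; subsystem 2 mixes two component types
system_reliability = series_parallel_reliability([[0.9, 0.9], [0.8, 0.95]])
print(round(system_reliability, 4))   # 0.9801
```

    Allowing mixed reliabilities inside a subsystem enlarges the search space a GA explores, which is exactly the flexibility the proposed model adds.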

  13. Neural network modeling of nonlinear systems based on Volterra series extension of a linear model

    Science.gov (United States)

    Soloway, Donald I.; Bialasiewicz, Jan T.

    1992-01-01

    A Volterra series approach was applied to the identification of nonlinear systems which are described by a neural network model. A procedure is outlined by which a mathematical model can be developed from experimental data obtained from the network structure. Applications of the results to the control of robotic systems are discussed.
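    The identification step can be illustrated with a second-order Volterra expansion fitted by least squares; the toy system and data below are my own illustration, not the paper's neural network model:

```python
import numpy as np

def volterra2_features(u, memory):
    """Design matrix with constant, linear, and quadratic (second-order
    Volterra kernel) terms over a finite memory window."""
    rows = []
    for t in range(memory, len(u)):
        lin = u[t - memory:t + 1][::-1]          # u[t], u[t-1], ..., u[t-memory]
        quad = np.outer(lin, lin)[np.triu_indices(memory + 1)]
        rows.append(np.concatenate(([1.0], lin, quad)))
    return np.array(rows)

rng = np.random.default_rng(2)
u = rng.normal(size=400)
# toy nonlinear plant: two linear terms plus one cross-product term
y = 1.5 * u[2:] - 0.5 * u[1:-1] + 0.8 * u[2:] * u[1:-1]

X = volterra2_features(u, memory=2)
theta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.abs(X @ theta - y).max())   # essentially zero: plant is in the model class
```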

  14. Technical Training: ELEC-2005: Electronics in High Energy Physics

    CERN Multimedia

    Monique Duval

    2005-01-01

    CERN Technical Training 2005: Learning for the LHC! ELEC-2005: Electronics in High Energy Physics - Spring Term ELEC-2005 is a new course series on modern electronics, given by CERN physicists and engineers within the framework of the 2005 Technical Training Programme, in an extended format of the successful ELEC-2002 course series. This comprehensive course series is designed for people who are not electronics specialists, for example physicists, engineers and technicians working at or visiting the laboratory, who use or will use electronics in their present or future activities, in particular in the context of the LHC accelerator and experiments. ELEC-2005 is composed of four Terms: the Winter Term, Introduction to electronics in HEP, already took place; the next three Terms will run throughout the year: Spring Term: Integrated circuits and VLSI technology for physics (March, 6 lectures) - now open for registration Summer Term: System electronics for physics: Issues (May, 7 lectures) Autumn Term: Ele...

  15. Adaptive Anchoring Model: How Static and Dynamic Presentations of Time Series Influence Judgments and Predictions.

    Science.gov (United States)

    Kusev, Petko; van Schaik, Paul; Tsaneva-Atanasova, Krasimira; Juliusson, Asgeir; Chater, Nick

    2018-01-01

    When attempting to predict future events, people commonly rely on historical data. One psychological characteristic of judgmental forecasting of time series, established by research, is that when people make forecasts from series, they tend to underestimate future values for upward trends and overestimate them for downward ones, so-called trend-damping (modeled by anchoring on, and insufficient adjustment from, the average of recent time series values). Events in a time series can be experienced sequentially (dynamic mode), or they can also be retrospectively viewed simultaneously (static mode), not experienced individually in real time. In one experiment, we studied the influence of presentation mode (dynamic and static) on two sorts of judgment: (a) predictions of the next event (forecast) and (b) estimation of the average value of all the events in the presented series (average estimation). Participants' responses in dynamic mode were anchored on more recent events than in static mode for all types of judgment but with different consequences; hence, dynamic presentation improved prediction accuracy, but not estimation. These results are not anticipated by existing theoretical accounts; we develop and present an agent-based model-the adaptive anchoring model (ADAM)-to account for the difference between processing sequences of dynamically and statically presented stimuli (visually presented data). ADAM captures how variation in presentation mode produces variation in responses (and the accuracy of these responses) in both forecasting and judgment tasks. ADAM's model predictions for the forecasting and judgment tasks fit better with the response data than a linear-regression time series model. Moreover, ADAM outperformed autoregressive-integrated-moving-average (ARIMA) and exponential-smoothing models, while neither of these models accounts for people's responses on the average estimation task. Copyright © 2017 The Authors. Cognitive Science published by Wiley
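    The anchor-and-adjust mechanism behind trend damping can be sketched as follows; the weights and series are illustrative and are not ADAM's fitted parameters:

```python
def anchored_forecast(series, weight_recent):
    """Anchor on the series mean, then adjust toward the latest observation.
    A weight near 1 mimics the recency focus of the dynamic presentation
    mode; a weight near 0 mimics the static mode."""
    anchor = sum(series) / len(series)
    return anchor + weight_recent * (series[-1] - anchor)

trend = [10, 12, 14, 16, 18]                  # upward trend; true next value is 20
static_pred = anchored_forecast(trend, 0.3)   # ≈ 15.2
dynamic_pred = anchored_forecast(trend, 0.9)  # ≈ 17.6
print(static_pred, dynamic_pred)   # both damp the trend; the dynamic mode less so
```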

  16. Physics constrained nonlinear regression models for time series

    International Nuclear Information System (INIS)

    Majda, Andrew J; Harlim, John

    2013-01-01

    A central issue in contemporary science is the development of data driven statistical nonlinear dynamical models for time series of partial observations of nature or a complex physical model. It has been established recently that ad hoc quadratic multi-level regression (MLR) models can have finite-time blow up of statistical solutions and/or pathological behaviour of their invariant measure. Here a new class of physics constrained multi-level quadratic regression models are introduced, analysed and applied to build reduced stochastic models from data of nonlinear systems. These models have the advantages of incorporating memory effects in time as well as the nonlinear noise from energy conserving nonlinear interactions. The mathematical guidelines for the performance and behaviour of these physics constrained MLR models as well as filtering algorithms for their implementation are developed here. Data driven applications of these new multi-level nonlinear regression models are developed for test models involving a nonlinear oscillator with memory effects and the difficult test case of the truncated Burgers–Hopf model. These new physics constrained quadratic MLR models are proposed here as process models for Bayesian estimation through Markov chain Monte Carlo algorithms of low frequency behaviour in complex physical data. (paper)

  17. FY87 scientific and technical reports, articles, papers, and presentations

    Science.gov (United States)

    Turner, Joyce E. (Compiler)

    1987-01-01

    The document presents formal NASA technical reports, papers published in technical journals, and presentations by MSFC personnel in FY87. It also includes papers of MSFC contractors. After being announced in STAR, all of the NASA series reports may be obtained from the National Technical Information Service, 5285 Port Royal Road, Springfield, Va. 22161. The information in this report may be of value to the scientific and engineering community in determining what information has been published and what is available.

  18. FY 1988 scientific and technical reports, articles, papers and presentations

    Science.gov (United States)

    Turner, Joyce E. (Compiler)

    1988-01-01

    This document presents formal NASA technical reports, papers published in technical journals, and presentations by MSFC personnel in FY 88. It also includes papers of MSFC contractors. After being announced in STAR, all of the NASA series reports may be obtained from the National Technical Information Service, 5285 Port Royal Road, Springfield, VA 22161. The information in this report may be of value to the scientific and engineering community in determining what information has been published and what is available.

  19. A Seasonal Time-Series Model Based on Gene Expression Programming for Predicting Financial Distress

    Science.gov (United States)

    2018-01-01

    The issue of financial distress prediction is an important and challenging research topic in the financial field. Currently, there are many methods for predicting firm bankruptcy and financial crisis, including artificial intelligence and traditional statistical methods, and past studies have shown that artificial intelligence methods predict better than traditional statistical methods. Financial statements are quarterly reports; hence, the financial crisis of companies is seasonal time-series data, and the attribute data affecting the financial distress of companies is nonlinear and nonstationary time-series data with fluctuations. Therefore, this study employed a nonlinear attribute selection method to build a nonlinear financial distress prediction model: that is, this paper proposed a novel seasonal time-series gene expression programming model for predicting the financial distress of companies. The proposed model has several advantages, including the following: (i) the proposed model differs from previous models lacking the concept of time series; (ii) the proposed integrated attribute selection method can find the core attributes and reduce high-dimensional data; and (iii) the proposed model can generate rules and mathematical formulas of financial distress to provide references for investors and decision makers. The results show that the proposed method is better than the listed classifiers under three criteria; hence, the proposed model has competitive advantages in predicting the financial distress of companies. PMID:29765399

  20. A Seasonal Time-Series Model Based on Gene Expression Programming for Predicting Financial Distress

    Directory of Open Access Journals (Sweden)

    Ching-Hsue Cheng

    2018-01-01

    Full Text Available The issue of financial distress prediction is an important and challenging research topic in the financial field. Currently, there are many methods for predicting firm bankruptcy and financial crisis, including artificial intelligence and traditional statistical methods, and past studies have shown that artificial intelligence methods predict better than traditional statistical methods. Financial statements are quarterly reports; hence, the financial crisis of companies is seasonal time-series data, and the attribute data affecting the financial distress of companies is nonlinear and nonstationary time-series data with fluctuations. Therefore, this study employed a nonlinear attribute selection method to build a nonlinear financial distress prediction model: that is, this paper proposed a novel seasonal time-series gene expression programming model for predicting the financial distress of companies. The proposed model has several advantages, including the following: (i) the proposed model differs from previous models lacking the concept of time series; (ii) the proposed integrated attribute selection method can find the core attributes and reduce high-dimensional data; and (iii) the proposed model can generate rules and mathematical formulas of financial distress to provide references for investors and decision makers. The results show that the proposed method is better than the listed classifiers under three criteria; hence, the proposed model has competitive advantages in predicting the financial distress of companies.

  1. Aircraft/Air Traffic Management Functional Analysis Model: Technical Description. 2.0

    Science.gov (United States)

    Etheridge, Melvin; Plugge, Joana; Retina, Nusrat

    1998-01-01

    The Aircraft/Air Traffic Management Functional Analysis Model, Version 2.0 (FAM 2.0), is a discrete event simulation model designed to support analysis of alternative concepts in air traffic management and control. FAM 2.0 was developed by the Logistics Management Institute (LMI) under a National Aeronautics and Space Administration (NASA) contract. This document provides a technical description of FAM 2.0 and its computer files to enable the modeler and programmer to make enhancements or modifications to the model. Those interested in a guide for using the model in analysis should consult the companion document, Aircraft/Air Traffic Management Functional Analysis Model, Version 2.0 Users Manual.

  2. Time series models of environmental exposures: Good predictions or good understanding.

    Science.gov (United States)

    Barnett, Adrian G; Stephen, Dimity; Huang, Cunrui; Wolkewitz, Martin

    2017-04-01

    Time series data are popular in environmental epidemiology as they make use of the natural experiment of how changes in exposure over time might impact on disease. Many published time series papers have used parameter-heavy models that fully explained the second order patterns in disease to give residuals that have no short-term autocorrelation or seasonality. This is often achieved by including predictors of past disease counts (autoregression) or seasonal splines with many degrees of freedom. These approaches give great residuals, but add little to our understanding of cause and effect. We argue that modelling approaches should rely more on good epidemiology and less on statistical tests. This includes thinking about causal pathways, making potential confounders explicit, fitting a limited number of models, and not over-fitting at the cost of under-estimating the true association between exposure and disease. Copyright © 2017 Elsevier Inc. All rights reserved.

  3. Technical Training: EMAG-2005 - Electromagnetic Design and Mathematical Optimization Methods in Magnet Technology

    CERN Multimedia

    Monique Duval

    2005-01-01

    CERN Technical Training 2005: Learning for the LHC! CERN Technical Training, in collaboration with the AT-MEL-EM section, is organising a new course series in the framework of the 2005 CERN Technical Training programme: EMAG-2005 - Electromagnetic Design and Mathematical Optimization Methods in Magnet Technology, composed of three-hour lectures in the morning and topical seminars in the afternoon. The EMAG-2005 course series will run at CERN from Monday April 4 until Thursday April 14 (no lectures on Friday 8). The course series, in English, will focus on the foundations of electromagnetism and the design of accelerator magnets, both normal conducting and superconducting, employing analytical and numerical field computations. Examples of the LHC magnet design using the CERN field computation program ROXIE will be presented. However, EMAG-2005 is not a ROXIE user course: it is rather a course for users or potential users of numerical field computation software, and for magnet designers. The course will be o...

  4. A regional and nonstationary model for partial duration series of extreme rainfall

    DEFF Research Database (Denmark)

    Gregersen, Ida Bülow; Madsen, Henrik; Rosbjerg, Dan

    2017-01-01

... of extreme rainfall. The framework is built on a partial duration series approach with a nonstationary, regional threshold value. The model is based on generalized linear regression solved by generalized estimating equations. It allows a spatial correlation between the stations in the network and furthermore accounts for variable observation periods at each station and in each year. Marginal regional and temporal regression models solved by generalized least squares are used to validate and discuss the results of the full spatiotemporal model. The model is applied on data from a large Danish rain gauge network ... as the explanatory variables in the regional and temporal domain, respectively. Further analysis of partial duration series with nonstationary and regional thresholds shows that the mean exceedances also exhibit a significant variation in space and time for some rainfall durations, while the shape parameter is found ...

  5. Book Review: "Hidden Markov Models for Time Series: An ...

    African Journals Online (AJOL)

    Hidden Markov Models for Time Series: An Introduction using R. by Walter Zucchini and Iain L. MacDonald. Chapman & Hall (CRC Press), 2009. Full Text: EMAIL FULL TEXT EMAIL FULL TEXT · DOWNLOAD FULL TEXT DOWNLOAD FULL TEXT · http://dx.doi.org/10.4314/saaj.v10i1.61717 · AJOL African Journals Online.

  6. Models for waste life cycle assessment: Review of technical assumptions

    DEFF Research Database (Denmark)

    Gentil, Emmanuel; Damgaard, Anders; Hauschild, Michael Zwicky

    2010-01-01

A number of waste life cycle assessment (LCA) models have been gradually developed since the early 1990s, in a number of countries, usually independently from each other. Large discrepancies in results have been observed among different waste LCA models, although it has also been shown that results from different LCA studies can be consistent. This paper is an attempt to identify, review and analyse methodologies and technical assumptions used in various parts of selected waste LCA models. Several criteria were identified which could have significant impacts on the results, such as the functional unit, system boundaries, waste composition and energy modelling. The modelling assumptions of waste management processes, ranging from collection, transportation, intermediate facilities, recycling, thermal treatment, biological treatment, and landfilling, are obviously critical when comparing ...

  7. Comparison of ARIMA and Random Forest time series models for prediction of avian influenza H5N1 outbreaks.

    Science.gov (United States)

    Kane, Michael J; Price, Natalie; Scotch, Matthew; Rabinowitz, Peter

    2014-08-13

Time series models can play an important role in disease prediction. Incidence data can be used to predict the future occurrence of disease events. Developments in modeling approaches provide an opportunity to compare different time series models for predictive power. We applied ARIMA and Random Forest time series models to incidence data of outbreaks of highly pathogenic avian influenza (H5N1) in Egypt, available through the online EMPRES-I system. We found that the Random Forest model outperformed the ARIMA model in predictive ability and that it is effective for predicting outbreaks of H5N1 in Egypt. Random Forest time series modeling thus provides enhanced predictive ability over existing time series models for the prediction of infectious disease outbreaks. This result, along with those showing the concordance between bird and human outbreaks (Rabinowitz et al. 2012), provides a new approach to predicting these dangerous outbreaks in bird populations based on existing, freely available data. Our analysis uncovers the time-series structure of outbreak severity for highly pathogenic avian influenza (H5N1) in Egypt.
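
The pivotal step in applying a Random Forest (or any off-the-shelf regressor) to incidence data is recasting the series as a supervised-learning table of lagged features. The abstract does not give the authors' exact pipeline, so the sketch below illustrates only that generic preprocessing step, with made-up counts:

```python
# Sketch: recasting a univariate incidence series as a supervised
# learning table with lagged features, the standard preprocessing step
# for applying Random Forests (or any regressor) to time series.
# The series values here are illustrative, not EMPRES-I data.

def make_lagged(series, n_lags):
    """Return (X, y): each row of X holds the n_lags previous values."""
    X, y = [], []
    for t in range(n_lags, len(series)):
        X.append(series[t - n_lags:t])
        y.append(series[t])
    return X, y

counts = [0, 2, 5, 9, 4, 1, 0, 3, 8, 12, 6, 2]  # weekly outbreak counts
X, y = make_lagged(counts, n_lags=3)

# First training row: predict counts[3] = 9 from the 3 preceding weeks.
print(X[0], "->", y[0])  # [0, 2, 5] -> 9
```

Any regressor trained on (X, y) then yields one-step-ahead outbreak predictions.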

  8. Advanced Grid Control Technologies Workshop Series | Energy Systems

    Science.gov (United States)

In July 2015, NREL's energy systems integration team hosted workshops at the Energy Systems Integration Facility (ESIF); the series included a technology showcase featuring projects ... Speakers included John McDonald, Director, Technical Strategy and Policy Development, General ..., on "...: Smart Grid and Beyond".

  9. KBS Technical report 1-120 (1977-1978). Summaries

    International Nuclear Information System (INIS)

    1979-05-01

Early in 1977, the Swedish nuclear utilities started the KBS (nuclear fuel safety) project to study the high-level waste problem and report on how and where a safe final storage could be arranged in Sweden. The documentation produced by the project during 1977 and 1978 has been collected in a series of technical reports numbered from 1 to 120. The English summaries of these technical reports have been collected in this separate volume, No. 121. (G.B.)

  10. Multi-Step Time Series Forecasting with an Ensemble of Varied Length Mixture Models.

    Science.gov (United States)

    Ouyang, Yicun; Yin, Hujun

    2018-05-01

Many real-world problems require modeling and forecasting of time series, such as weather temperature, electricity demand, stock prices and foreign exchange (FX) rates. Often, the tasks involve predicting over a long-term period, e.g. several weeks or months. Most existing time series models are inherently one-step predictors, that is, they predict one time point ahead. Multi-step or long-term prediction is difficult and challenging due to the lack of information and the accumulation of uncertainty or error. The main existing approaches, iterative and independent, either use a one-step model recursively or treat each horizon as an independent modeling task. They generally perform poorly in practical applications. In this paper, as an extension of the self-organizing mixture autoregressive (AR) model, the varied length mixture (VLM) models are proposed to model and forecast time series over multiple steps. The key idea is to preserve the dependencies between the time points within the prediction horizon. Training data are segmented to various lengths corresponding to various forecasting horizons, and the VLM models are trained in a self-organizing fashion on these segments to capture these dependencies in component AR models of various prediction horizons. The VLM models form a probabilistic mixture of these varied length models. A combination of short and long VLM models and an ensemble of them are proposed to further enhance the prediction performance. The effectiveness of the proposed methods and their marked improvements over the existing methods are demonstrated through a number of experiments on synthetic data, real-world FX rates and weather temperatures.
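
The iterative and independent ("direct") strategies the abstract contrasts can be sketched on a toy AR(1) series. The data and least-squares rules below are illustrative baselines, not the paper's VLM models:

```python
# Sketch: the two baseline multi-step strategies the abstract contrasts.
# "Iterative" feeds one-step predictions back into the model; "direct"
# fits a separate predictor per horizon. Toy AR(1) dynamics, not the
# paper's VLM models.

def fit_ar1(series):
    """Least-squares slope/intercept of x[t] on x[t-1]."""
    xs, ys = series[:-1], series[1:]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((a - mx) * (b - my) for a, b in zip(xs, ys)) / \
            sum((a - mx) ** 2 for a in xs)
    return slope, my - slope * mx

def iterative_forecast(series, h):
    """Apply the one-step AR(1) rule recursively h times."""
    slope, icept = fit_ar1(series)
    x = series[-1]
    for _ in range(h):
        x = slope * x + icept
    return x

def direct_forecast(series, h):
    """Fit x[t+h] directly on x[t] (one model per horizon h)."""
    xs, ys = series[:-h], series[h:]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((a - mx) * (b - my) for a, b in zip(xs, ys)) / \
            sum((a - mx) ** 2 for a in xs)
    return slope * series[-1] + (my - slope * mx)

series = [0.9 ** t for t in range(30)]  # exact AR(1): x[t+1] = 0.9 x[t]
```

On this noiseless series both strategies agree; on noisy data their errors diverge, which is the gap the VLM mixture is designed to close.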

  11. Hybrid perturbation methods based on statistical time series models

    Science.gov (United States)

    San-Juan, Juan Félix; San-Martín, Montserrat; Pérez, Iván; López, Rosario

    2016-04-01

In this work we present a new methodology for orbit propagation, the hybrid perturbation theory, based on the combination of an integration method and a prediction technique. The former, which can be a numerical, analytical or semianalytical theory, generates an initial approximation that contains some inaccuracies, derived from the fact that, in order to simplify the expressions and subsequent computations, not all the involved forces are taken into account and only low-order terms are considered, and from the fact that mathematical models of perturbations do not always reproduce physical phenomena with absolute precision. The prediction technique, which can be based on either statistical time series models or computational intelligence methods, is aimed at modelling and reproducing the missing dynamics in the previously integrated approximation. This combination improves the precision of conventional numerical, analytical and semianalytical theories for determining the position and velocity of any artificial satellite or space debris object. In order to validate this methodology, we present a family of three hybrid orbit propagators formed by the combination of three different orders of approximation of an analytical theory and a statistical time series model, and analyse their capability to process the effect produced by the flattening of the Earth. The three considered analytical components are the integration of the Kepler problem, a first-order analytical theory and a second-order analytical theory, whereas the prediction technique is the same in all three cases, namely an additive Holt-Winters method.
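
The additive Holt-Winters method named as the prediction technique can be sketched as the standard level/trend/seasonal recursions. The smoothing constants and synthetic series below are illustrative assumptions, not values from the paper:

```python
# Sketch of the additive Holt-Winters recursions used as the
# time-series stage of the hybrid propagators. Smoothing constants
# here are arbitrary illustrative choices.

def holt_winters_additive(x, m, alpha=0.3, beta=0.1, gamma=0.3, h=1):
    """Level/trend/seasonal smoothing with period m; forecast h ahead."""
    level = sum(x[:m]) / m
    trend = (sum(x[m:2 * m]) - sum(x[:m])) / (m * m)
    season = [x[i] - level for i in range(m)]
    for t in range(m, len(x)):
        prev_level = level
        level = alpha * (x[t] - season[t % m]) + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
        season[t % m] = gamma * (x[t] - level) + (1 - gamma) * season[t % m]
    return level + h * trend + season[(len(x) + h - 1) % m]

# Synthetic series: linear trend plus an exact 4-point seasonal cycle.
pattern = [2.0, -1.0, 0.0, -1.0]
x = [0.5 * t + pattern[t % 4] for t in range(60)]
forecast = holt_winters_additive(x, m=4, h=1)  # next true value: 0.5*60 + 2.0
```

In the hybrid scheme, x would be the residual between the analytical approximation and the reference orbit rather than a raw observable.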

  12. The evaluator as technical assistant: A model for systemic reform support

    Science.gov (United States)

    Century, Jeanne Rose

    This study explored evaluation of systemic reform. Specifically, it focused on the evaluation of a systemic effort to improve K-8 science, mathematics and technology education. The evaluation was of particular interest because it used both technical assistance and evaluation strategies. Through studying the combination of these roles, this investigation set out to increase understanding of potentially new evaluator roles, distinguish important characteristics of the evaluator/project participant relationship, and identify how these roles and characteristics contribute to effective evaluation of systemic science education reform. This qualitative study used interview, document analysis, and participant observation as methods of data collection. Interviews were conducted with project leaders, project participants, and evaluators and focused on the evaluation strategies and process, the use of the evaluation, and technical assistance. Documents analyzed included transcripts of evaluation team meetings and reports, memoranda and other print materials generated by the project leaders and the evaluators. Data analysis consisted of analytic and interpretive procedures consistent with the qualitative data collected and entailed a combined process of coding transcripts of interviews and meetings, field notes, and other documents; analyzing and organizing findings; writing of reflective and analytic memos; and designing and diagramming conceptual relationships. The data analysis resulted in the development of the Multi-Function Model for Systemic Reform Support. This model organizes systemic reform support into three functions: evaluation, technical assistance, and a third, named here as "systemic perspective." These functions work together to support the project's educational goals as well as a larger goal--building capacity in project participants. This model can now serve as an informed starting point or "blueprint" for strategically supporting systemic reform.

  13. Technical Training: ELEC-2005

    CERN Multimedia

    Davide Vitè

    2005-01-01

    ELEC-2005 - Electronics in High Energy Physics: Autumn Term (November-December 2005) ELEC-2005 is a new course series on modern electronics, given by CERN physicists and engineers within the framework of the 2005 Technical Training Programme, in an extended format of the ELEC-2002 course series. This new, comprehensive course series is designed for people who are not electronics specialists, for example physicists, engineers and technicians working at or visiting the laboratory, who use or will use electronics in their present or future activities, in particular in the context of the LHC accelerator and experiments. ELEC-2005 is composed of four Terms. The Winter (Introduction to electronics in HEP), Spring (Integrated circuits and VLSI technology for physics), and Summer (System electronics for physics: Issues) Terms already took place. The Autumn Term: Electronics applications in HEP experiments (November-December, 10 lectures) is now open for online registration, and will start on November 8th with the...

  14. Career and Technical Education (CTE) Student Success in Community Colleges: A Conceptual Model

    Science.gov (United States)

    Hirschy, Amy S.; Bremer, Christine D.; Castellano, Marisa

    2011-01-01

    Career and technical education (CTE) students pursuing occupational associate's degrees or certificates differ from students seeking academic majors at 2-year institutions in several ways. This article examines several theoretical models of student persistence and offers a conceptual model of student success focused on CTE students in community…

  15. Analysis of Data from a Series of Events by a Geometric Process Model

    Institute of Scientific and Technical Information of China (English)

    Yeh Lam; Li-xing Zhu; Jennifer S. K. Chan; Qun Liu

    2004-01-01

Geometric process was first introduced by Lam [10, 11]. A stochastic process {Xi, i = 1, 2, …} is called a geometric process (GP) if, for some a > 0, {a^(i-1)Xi, i = 1, 2, …} forms a renewal process. In this paper, the GP is used to analyze the data from a series of events. A nonparametric method is introduced for the estimation of the three parameters in the GP. The limiting distributions of the three estimators are studied. Through the analysis of some real data sets, the GP model is compared with three other homogeneous and nonhomogeneous Poisson models. It seems that on average the GP model is the best model among these four models in analyzing the data from a series of events.
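
The defining property, that {a^(i-1)Xi} is a renewal (i.i.d.) sequence, implies that log Xi drifts linearly in i with slope -log a, which suggests an elementary regression estimator for a. The abstract does not specify the paper's nonparametric estimators, so the sketch below uses that simple approach on simulated data:

```python
import math
import random

# Sketch: in a geometric process, {a**(i) * X[i]} (0-based index) is an
# i.i.d. renewal sequence, so log X[i] has linear drift of slope -log(a).
# Regressing log X[i] on i therefore recovers a. This is a simple
# illustrative estimator, not the one studied in the paper.

random.seed(7)
a_true = 1.05
n = 400
# X[i] = Y[i] / a**i with Y[i] i.i.d. exponential(1).
xs = [random.expovariate(1.0) / a_true ** i for i in range(n)]

logs = [math.log(v) for v in xs]
idx = list(range(n))
mi, ml = sum(idx) / n, sum(logs) / n
slope = sum((i - mi) * (l - ml) for i, l in zip(idx, logs)) / \
        sum((i - mi) ** 2 for i in idx)
a_hat = math.exp(-slope)  # recovered ratio parameter
```

With a > 1 the inter-event times shrink geometrically (an improving or deteriorating system, depending on interpretation), which is what distinguishes the GP from a homogeneous Poisson model.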

  16. Modeling pollen time series using seasonal-trend decomposition procedure based on LOESS smoothing.

    Science.gov (United States)

    Rojo, Jesús; Rivero, Rosario; Romero-Morte, Jorge; Fernández-González, Federico; Pérez-Badia, Rosa

    2017-02-01

Analysis of airborne pollen concentrations provides valuable information on plant phenology and is thus a useful tool in agriculture (for predicting harvests in crops such as the olive and for deciding when to apply phytosanitary treatments) as well as in medicine and the environmental sciences. Variations in airborne pollen concentrations, moreover, are indicators of changing plant life cycles. By modeling pollen time series, we can not only identify the variables influencing pollen levels but also predict future pollen concentrations. In this study, airborne pollen time series were modeled using STL, a seasonal-trend decomposition procedure based on LOESS (LOcally wEighted Scatterplot Smoothing). The data series, daily Poaceae pollen concentrations over the period 2006-2014, was broken up into seasonal and residual (stochastic) components. The seasonal component was compared with data on Poaceae flowering phenology obtained by field sampling. Residuals were fitted to a model generated from daily temperature and rainfall values, and daily pollen concentrations, using partial least squares regression (PLSR). This method was then applied to predict daily pollen concentrations for 2014 (independent validation data) using results for the seasonal component of the time series and estimates of the residual component for the period 2006-2013. The correlation between predicted and observed values was r = 0.79 for the pre-peak period (i.e., the period prior to the peak pollen concentration) and r = 0.63 for the post-peak period. Separate analysis of each of the components of the pollen data series enables the sources of variability to be identified more accurately than analysis of the original non-decomposed data series, and for this reason this procedure has proved to be a suitable technique for analyzing the main environmental factors influencing airborne pollen concentrations.
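
STL proper rests on iterated LOESS smoothing; the deliberately simplified decomposition below (centered moving-average trend, seasonal means, residual) only illustrates the seasonal/residual split the study exploits:

```python
# Deliberately simplified seasonal-trend decomposition illustrating the
# seasonal/residual split applied to the pollen series. STL proper uses
# iterated LOESS smoothing; here a centered moving average stands in
# for the trend, and the period/values are toy choices.

def decompose(x, m):
    """Return (trend, seasonal, residual); trend is None at the edges."""
    half = m // 2
    trend = [None] * len(x)
    for t in range(half, len(x) - half):
        window = x[t - half:t + half + 1]   # assumes odd period m
        trend[t] = sum(window) / len(window)
    detrended = [(x[t] - trend[t], t % m)
                 for t in range(len(x)) if trend[t] is not None]
    seasonal = []
    for k in range(m):
        vals = [d for d, phase in detrended if phase == k]
        seasonal.append(sum(vals) / len(vals))
    resid = [x[t] - trend[t] - seasonal[t % m]
             for t in range(len(x)) if trend[t] is not None]
    return trend, seasonal, resid

pattern = [3.0, 0.0, -3.0]           # period-3 "pollen season" shape
x = [10.0 + pattern[t % 3] for t in range(30)]
trend, seasonal, resid = decompose(x, m=3)
```

On this noiseless series the seasonal pattern is recovered exactly and the residual vanishes; in the study, the (non-trivial) residual is what gets fed into the PLSR step.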

  17. A stochastic HMM-based forecasting model for fuzzy time series.

    Science.gov (United States)

    Li, Sheng-Tun; Cheng, Yi-Chung

    2010-10-01

    Recently, fuzzy time series have attracted more academic attention than traditional time series due to their capability of dealing with the uncertainty and vagueness inherent in the data collected. The formulation of fuzzy relations is one of the key issues affecting forecasting results. Most of the present works adopt IF-THEN rules for relationship representation, which leads to higher computational overhead and rule redundancy. Sullivan and Woodall proposed a Markov-based formulation and a forecasting model to reduce computational overhead; however, its applicability is limited to handling one-factor problems. In this paper, we propose a novel forecasting model based on the hidden Markov model by enhancing Sullivan and Woodall's work to allow handling of two-factor forecasting problems. Moreover, in order to make the nature of conjecture and randomness of forecasting more realistic, the Monte Carlo method is adopted to estimate the outcome. To test the effectiveness of the resulting stochastic model, we conduct two experiments and compare the results with those from other models. The first experiment consists of forecasting the daily average temperature and cloud density in Taipei, Taiwan, and the second experiment is based on the Taiwan Weighted Stock Index by forecasting the exchange rate of the New Taiwan dollar against the U.S. dollar. In addition to improving forecasting accuracy, the proposed model adheres to the central limit theorem, and thus, the result statistically approximates to the real mean of the target value being forecast.
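
A minimal one-factor version of the Markov-based formulation the paper builds on can be sketched as follows: partition the value range into interval states, estimate a transition matrix from counts, and forecast the expected midpoint of the next state, with a Monte Carlo draw standing in for the stochastic treatment of the outcome. This is an illustrative reduction, not the paper's two-factor HMM:

```python
import random

# Simplified one-factor sketch of the Markov-based formulation the
# paper builds on: discretise the series into interval states, estimate
# the transition matrix from counts, and forecast the expected midpoint
# of the next state. The Monte Carlo draw mirrors the paper's
# stochastic outcome estimation (this is not the full two-factor HMM).

def state_of(v, edges):
    for k in range(len(edges) - 1):
        if edges[k] <= v < edges[k + 1]:
            return k
    return len(edges) - 2

def transition_matrix(series, edges):
    n = len(edges) - 1
    counts = [[0] * n for _ in range(n)]
    states = [state_of(v, edges) for v in series]
    for a, b in zip(states, states[1:]):
        counts[a][b] += 1
    return [[c / max(1, sum(row)) for c in row] for row in counts]

edges = [0.0, 10.0, 20.0, 30.0]            # three temperature states
series = [5, 15, 5, 15, 5, 15, 5]          # deterministic alternation
P = transition_matrix(series, edges)

mids = [(edges[k] + edges[k + 1]) / 2 for k in range(len(edges) - 1)]
last = state_of(series[-1], edges)
point_forecast = sum(p * m for p, m in zip(P[last], mids))

random.seed(1)
draw = random.choices(range(len(mids)), weights=P[last])[0]  # Monte Carlo state
```

Because the toy series alternates deterministically, the transition matrix is degenerate and the point forecast lands exactly on the midpoint of the next state.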

  18. Mathematical Model of Thyristor Inverter Including a Series-parallel Resonant Circuit

    OpenAIRE

    Miroslaw Luft; Elzbieta Szychta

    2008-01-01

The article presents a mathematical model of a thyristor inverter including a series-parallel resonant circuit with the aid of the state variable method. Maple procedures are used to compute current and voltage waveforms in the inverter.

  19. 75 FR 6865 - Airworthiness Directives; The Boeing Company Model 737-700 (IGW) Series Airplanes Equipped With...

    Science.gov (United States)

    2010-02-12

    ... replacing aging float level switch conduit assemblies, periodically inspecting the external dry bay system... Model 737-700 (IGW) Series Airplanes Equipped With Auxiliary Fuel Tanks Installed in Accordance With... airworthiness directive (AD) for certain Model 737-700 (IGW) series airplanes. This proposed AD would require...

  20. Technical note: Bayesian calibration of dynamic ruminant nutrition models.

    Science.gov (United States)

    Reed, K F; Arhonditsis, G B; France, J; Kebreab, E

    2016-08-01

    Mechanistic models of ruminant digestion and metabolism have advanced our understanding of the processes underlying ruminant animal physiology. Deterministic modeling practices ignore the inherent variation within and among individual animals and thus have no way to assess how sources of error influence model outputs. We introduce Bayesian calibration of mathematical models to address the need for robust mechanistic modeling tools that can accommodate error analysis by remaining within the bounds of data-based parameter estimation. For the purpose of prediction, the Bayesian approach generates a posterior predictive distribution that represents the current estimate of the value of the response variable, taking into account both the uncertainty about the parameters and model residual variability. Predictions are expressed as probability distributions, thereby conveying significantly more information than point estimates in regard to uncertainty. Our study illustrates some of the technical advantages of Bayesian calibration and discusses the future perspectives in the context of animal nutrition modeling. Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
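
The abstract's central point, that the posterior predictive distribution combines parameter uncertainty with residual variability, can be illustrated with a one-parameter grid approximation. This is a toy normal-mean model, not a ruminant nutrition model:

```python
import math

# Grid-approximation sketch of the Bayesian calibration idea in the
# abstract: the posterior predictive combines parameter uncertainty
# with residual variability. One-parameter toy model
# y ~ Normal(theta, sigma), not an actual nutrition model.

def normal_pdf(x, mu, sd):
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

y = [9.2, 10.1, 9.8, 10.4, 9.9]               # observations
sigma = 0.5                                    # known residual s.d.
grid = [8.0 + 0.01 * k for k in range(401)]    # candidate theta values

# Posterior over the grid: flat prior times the likelihood, normalised.
weights = []
for theta in grid:
    lik = 1.0
    for obs in y:
        lik *= normal_pdf(obs, theta, sigma)
    weights.append(lik)
total = sum(weights)
posterior = [w / total for w in weights]

post_mean = sum(t * p for t, p in zip(grid, posterior))
post_var = sum((t - post_mean) ** 2 * p for t, p in zip(grid, posterior))

# Posterior predictive variance = residual variance + parameter variance,
# so predictions are wider than the residual scatter alone.
pred_var = sigma ** 2 + post_var
```

The same logic carries over to mechanistic models: the likelihood is evaluated by running the model, and the predictive distribution reports both sources of uncertainty instead of a point estimate.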

  1. INMM Physical Protection Technical Working Group Workshops

    International Nuclear Information System (INIS)

    Williams, J.D.

    1982-01-01

    The Institute of Nuclear Materials Management (INMM) established the Physical Protection Technical Working Group to be a focal point for INMM activities related to the physical protection of nuclear materials and facilities. The Technical Working Group has sponsored workshops with major emphasis on intrusion detection systems, entry control systems, and security personnel training. The format for these workshops has consisted of a series of small informal group discussions on specific subject matter which allows direct participation by the attendees and the exchange of ideas, experiences, and insights. This paper will introduce the reader to the activities of the Physical Protection Technical Working Group, to identify the workshops which have been held, and to serve as an introduction to the following three papers of this session

  2. Univariate models for air temperature series

    International Nuclear Information System (INIS)

    Leon Aristizabal, Gloria Esperanza

    2000-01-01

The theoretical framework for the study of air temperature time series is the theory of stochastic processes, particularly the models known as ARIMA, which make it possible to carry out a univariate analysis. ARIMA models are built in order to explain the structure of the monthly temperatures corresponding to the mean, absolute maximum, absolute minimum, maximum mean and minimum mean temperatures for four stations in Colombia. By means of these models, the possible evolution of these variables is estimated for predictive purposes. The application and utility of the models are discussed

  3. Mathematical model of thyristor inverter including a series-parallel resonant circuit

    OpenAIRE

    Luft, M.; Szychta, E.

    2008-01-01

The article presents a mathematical model of a thyristor inverter including a series-parallel resonant circuit with the aid of the state variable method. Maple procedures are used to compute current and voltage waveforms in the inverter.

  4. FY 1990 scientific and technical reports, articles, papers, and presentations

    Science.gov (United States)

    Turner, Joyce E. (Compiler)

    1990-01-01

    Formal NASA technical reports, papers published in technical journals, and presentations by MSFC personnel in FY 90 are presented. Also included are papers of MSFC contractors. After being announced in STAR, all of the NASA series reports may be obtained from NTIS. The information may be of value to the scientific and engineering community in determining what information has been published and what is available.

  5. Advanced methods for modeling water-levels and estimating drawdowns with SeriesSEE, an Excel add-in

    Science.gov (United States)

    Halford, Keith; Garcia, C. Amanda; Fenelon, Joe; Mirus, Benjamin B.

    2012-12-21

    Water-level modeling is used for multiple-well aquifer tests to reliably differentiate pumping responses from natural water-level changes in wells, or “environmental fluctuations.” Synthetic water levels are created during water-level modeling and represent the summation of multiple component fluctuations, including those caused by environmental forcing and pumping. Pumping signals are modeled by transforming step-wise pumping records into water-level changes by using superimposed Theis functions. Water-levels can be modeled robustly with this Theis-transform approach because environmental fluctuations and pumping signals are simulated simultaneously. Water-level modeling with Theis transforms has been implemented in the program SeriesSEE, which is a Microsoft® Excel add-in. Moving average, Theis, pneumatic-lag, and gamma functions transform time series of measured values into water-level model components in SeriesSEE. Earth tides and step transforms are additional computed water-level model components. Water-level models are calibrated by minimizing a sum-of-squares objective function where singular value decomposition and Tikhonov regularization stabilize results. Drawdown estimates from a water-level model are the summation of all Theis transforms minus residual differences between synthetic and measured water levels. The accuracy of drawdown estimates is limited primarily by noise in the data sets, not the Theis-transform approach. Drawdowns much smaller than environmental fluctuations have been detected across major fault structures, at distances of more than 1 mile from the pumping well, and with limited pre-pumping and recovery data at sites across the United States. In addition to water-level modeling, utilities exist in SeriesSEE for viewing, cleaning, manipulating, and analyzing time-series data.
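
The Theis-transform idea can be sketched directly: a stepwise pumping record is converted to drawdown by superposing Theis solutions s = Q·W(u)/(4πT) with u = r²S/(4Tt). The parameter values below are illustrative, not SeriesSEE defaults:

```python
import math

# Sketch of the Theis-transform idea: a stepwise pumping record is
# converted to drawdown by superposing Theis solutions
#   s = Q * W(u) / (4*pi*T),  u = r^2 * S / (4*T*t),
# where W(u) is the exponential-integral well function. Parameter
# values used here are illustrative, not from SeriesSEE.

EULER_GAMMA = 0.5772156649015329

def well_function(u, terms=30):
    """W(u) = -gamma - ln(u) + sum_{k>=1} (-1)^(k+1) u^k / (k * k!)."""
    total = -EULER_GAMMA - math.log(u)
    term = 1.0
    for k in range(1, terms + 1):
        term *= -u / k                      # term = (-u)^k / k!
        total -= term / k
    return total

def theis_drawdown(Q, r, T, S, t):
    """Drawdown at distance r, time t, for constant pumping rate Q."""
    u = r * r * S / (4.0 * T * t)
    return Q * well_function(u) / (4.0 * math.pi * T)

def step_drawdown(steps, r, T, S, t):
    """Superpose Theis responses for a list of (t_on, rate) changes."""
    s = 0.0
    prev_q = 0.0
    for t_on, q in steps:
        if t <= t_on:
            break
        s += theis_drawdown(q - prev_q, r, T, S, t - t_on)
        prev_q = q
    return s
```

Turning the pump off is simply a negative rate increment, so a pump-on/pump-off record at t = 0 and t = 1 reduces to the difference of two constant-rate Theis solutions.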

  6. Applying ARIMA model for annual volume time series of the Magdalena River

    OpenAIRE

    Gloria Amaris; Humberto Ávila; Thomas Guerrero

    2017-01-01

    Context: Climate change effects, human interventions, and river characteristics are factors that increase the risk on the population and the water resources. However, negative impacts such as flooding, and river droughts may be previously identified using appropriate numerical tools. Objectives: The annual volume (Millions of m3/year) time series of the Magdalena River was analyzed by an ARIMA model, using the historical time series of the Calamar station (Instituto de Hidrología, Meteoro...

  7. A Course in Time Series Analysis

    CERN Document Server

    Peña, Daniel; Tsay, Ruey S

    2011-01-01

    New statistical methods and future directions of research in time series A Course in Time Series Analysis demonstrates how to build time series models for univariate and multivariate time series data. It brings together material previously available only in the professional literature and presents a unified view of the most advanced procedures available for time series model building. The authors begin with basic concepts in univariate time series, providing an up-to-date presentation of ARIMA models, including the Kalman filter, outlier analysis, automatic methods for building ARIMA models, a

  8. Fiscal year 1993 scientific and technical reports, articles, papers, and presentations

    Science.gov (United States)

    Turner, Joyce E. (Compiler)

    1993-01-01

    This document presents formal NASA technical reports, papers published in technical journals, and presentations by MSFC personnel in FY93. It also includes papers of MSFC contractors. After being announced in STAR, all of the NASA series reports may be obtained from the National Technical Information Service, 5285 Port Royal Road, Springfield, VA 22161. The information in this report may be of value to the scientific and engineering community in determining what information has been published and what is available.

  9. The FY 1992 scientific and technical reports, articles, papers, and presentations

    Science.gov (United States)

    Turner, Joyce E. (Compiler)

    1992-01-01

    This document presents formal NASA technical reports, papers published in technical journals, and presentations by MSFC personnel in FY92. It also includes papers of MSFC contractors. After being announced in STAR, all of the NASA series reports may be obtained from the National Technical Information Service, 5285 Port Royal Road, Springfield, VA 22161. The information in this report may be of value to the scientific and engineering community in determining what information has been published and what is available.

  10. Mathematical Model of Thyristor Inverter Including a Series-parallel Resonant Circuit

    Directory of Open Access Journals (Sweden)

    Miroslaw Luft

    2008-01-01

The article presents a mathematical model of a thyristor inverter including a series-parallel resonant circuit with the aid of the state variable method. Maple procedures are used to compute current and voltage waveforms in the inverter.

  11. PNNL Technical Support to The Implementation of EMTA and EMTA-NLA Models in Autodesk® Moldflow® Packages

    Energy Technology Data Exchange (ETDEWEB)

    Nguyen, Ba Nghiep; Wang, Jin

    2012-12-01

Under the Predictive Engineering effort, PNNL developed linear and nonlinear property prediction models for long-fiber thermoplastics (LFTs). These models were implemented in PNNL's EMTA and EMTA-NLA codes. While EMTA is standalone software for the computation of the composites' thermoelastic properties, EMTA-NLA presents a series of nonlinear models implemented in ABAQUS® via user subroutines for structural analyses. In all these models, it is assumed that the fibers are linear elastic while the matrix material can exhibit a linear or typical nonlinear behavior depending on the loading prescribed to the composite. The key idea is to model the constitutive behavior of the matrix material and then to use an Eshelby-Mori-Tanaka approach (EMTA) combined with numerical techniques for fiber length and orientation distributions to determine the behavior of the as-formed composite. The basic property prediction models of EMTA and EMTA-NLA have been the subject of implementation in the Autodesk® Moldflow® software packages. These models are the elastic stiffness model accounting for fiber length and orientation distributions, the fiber/matrix interface debonding model, and the elastic-plastic models. The PNNL elastic-plastic models for LFTs describe the composite nonlinear stress-strain response up to failure by an elastic-plastic formulation associated with either a micromechanical criterion to predict failure or a continuum damage mechanics formulation coupling damage to plasticity. All the models account for fiber length and orientation distributions as well as fiber/matrix debonding that can occur at any stage of loading. In an effort to transfer the technologies developed under the Predictive Engineering project to the American automotive and plastics industries, PNNL has obtained the approval of the DOE Office of Vehicle Technologies to provide Autodesk, Inc. with the technical support for the implementation of the basic property prediction models of EMTA and

  12. A parameter network and model pyramid for managing technical information flow

    International Nuclear Information System (INIS)

    Sinnock, S.; Hartman, H.A.

    1994-01-01

    Prototypes of information management tools have been developed that can help communicate the technical basis for nuclear waste disposal to a broad audience of program scientists and engineers, project managers, and informed observers from stakeholder organizations. These tools include system engineering concepts, parameter networks expressed as influence diagrams, associated model hierarchies, and a relational database. These tools are used to express relationships among data-collection parameters, model input parameters, model output parameters, systems requirements, physical elements of a system description, and functional analysis of the contribution of physical elements and their associated parameters in satisfying the system requirements. By organizing parameters, models, physical elements, functions, and requirements in a visually reviewable network and a relational database, the severe communication challenges facing participants in the nuclear waste dialog can be addressed. The network identifies the influences that data collected in the field have on measures of repository suitability, providing a visual, traceable map that clarifies the role of data and models in supporting conclusions about repository suitability. The map allows conclusions to be traced directly to the underlying parameters and models. Uncertainty in these underlying elements can be exposed to open review in the context of the effects uncertainty has on judgements about suitability. A parameter network provides a stage upon which an informed social dialog about the technical merits of a nuclear waste repository can be conducted. Such a stage must be the basis for dialog if decisions about repository suitability are to be based on a repository's ability to meet requirements embodied in laws and regulations governing disposal of nuclear wastes

  13. Formal modelling and analysis of socio-technical systems

    DEFF Research Database (Denmark)

    Probst, Christian W.; Kammüller, Florian; Hansen, Rene Rydhof

    2016-01-01

    Attacks on systems and organisations increasingly exploit human actors, for example through social engineering. This non-technical aspect of attacks complicates their formal treatment and automatic identification. Formalisation of human behaviour is difficult at best, and attacks on socio-technical systems are still mostly identified through brainstorming of experts. In this work we discuss several approaches to formalising socio-technical systems and their analysis. Starting from a flow logic-based analysis of the insider threat, we discuss how to include the socio aspects explicitly, and show a formalisation that proves properties of this formalisation. On the formal side, our work closes the gap between formal and informal approaches to socio-technical systems. On the informal side, we show how to steal a birthday cake from a bakery by social engineering.

  14. Time series ARIMA models for daily price of palm oil

    Science.gov (United States)

    Ariff, Noratiqah Mohd; Zamhawari, Nor Hashimah; Bakar, Mohd Aftar Abu

    2015-02-01

    Palm oil is deemed one of the most important commodities that form the economic backbone of Malaysia. Modeling and forecasting the daily price of palm oil is of great interest for Malaysia's economic growth. In this study, time series ARIMA models are used to fit the daily price of palm oil. The Akaike Information Criterion (AIC), the Akaike Information Criterion with a correction for finite sample sizes (AICc), and the Bayesian Information Criterion (BIC) are used to compare the different ARIMA models being considered. It is found that the ARIMA(1,2,1) model is suitable for the daily price of crude palm oil in Malaysia for the years 2010 to 2012.

  15. Stochastic series expansion simulation of the t-V model

    Science.gov (United States)

    Wang, Lei; Liu, Ye-Hua; Troyer, Matthias

    2016-04-01

    We present an algorithm for the efficient simulation of the half-filled spinless t-V model on bipartite lattices, which combines the stochastic series expansion method with determinantal quantum Monte Carlo techniques widely used in fermionic simulations. The algorithm scales linearly in the inverse temperature, cubically with the system size, and is free from time-discretization error. We use it to map out the finite-temperature phase diagram of the spinless t-V model on the honeycomb lattice and observe a suppression of the critical temperature of the charge-density-wave phase in the vicinity of a fermionic quantum critical point.

  16. Optimization of recurrent neural networks for time series modeling

    DEFF Research Database (Denmark)

    Pedersen, Morten With

    1997-01-01

    The present thesis is about optimization of recurrent neural networks applied to time series modeling. In particular, fully recurrent networks are considered, working from only a single external input, with one layer of nonlinear hidden units and a linear output unit applied to prediction of discrete time series. The overall objectives are to improve training by application of second-order methods and to improve generalization ability by architecture optimization accomplished by pruning. The major topics covered in the thesis are: 1. The problem of training recurrent networks is analyzed from a numerical ... of solution obtained as well as computation time required. 3. A theoretical definition of the generalization error for recurrent networks is provided. This definition justifies a commonly adopted approach for estimating generalization ability. 4. The viability of pruning recurrent networks by the Optimal...

  17. A Technical Review on Biomass Processing: Densification, Preprocessing, Modeling and Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Jaya Shankar Tumuluru; Christopher T. Wright

    2010-06-01

    It is now a well-acclaimed fact that burning fossil fuels and deforestation are major contributors to climate change. Biomass from plants can serve as an alternative renewable and carbon-neutral raw material for the production of bioenergy. Low densities of 40–60 kg/m3 for lignocellulosic and 200–400 kg/m3 for woody biomass limit their application for energy purposes. Prior to use in energy applications, these materials need to be densified. The densified biomass can have bulk densities over 10 times that of the raw material, helping to significantly reduce technical limitations associated with storage, loading and transportation. Pelleting, briquetting, and extrusion processing are commonly used methods for densification. The aim of the present research is to develop a comprehensive review of biomass processing that includes densification, preprocessing, modeling and optimization. The specific objectives include carrying out a technical review on (a) mechanisms of particle bonding during densification; (b) methods of densification including extrusion, briquetting, pelleting, and agglomeration; (c) effects of process and feedstock variables and biomass biochemical composition on densification; (d) effects of preprocessing such as grinding, preheating, steam explosion, and torrefaction on biomass quality and binding characteristics; (e) models for understanding the compression characteristics; and (f) procedures for response surface modeling and optimization.

  18. Inactive Tanks Remediation Program Batch I, Series I tanks 3001-B, 3004-B, 3013, and T-30 technical memorandum. Environmental Restoration Program

    International Nuclear Information System (INIS)

    1995-05-01

    This technical memorandum provides information that can be used to make decisions concerning the disposition of four inactive tank systems that have been designated Batch 1, Series 1, by the Inactive Tanks Remediation Program team. The Batch I, Series 1, tanks are 3001-B, 3004-B, 3013, and T-30. The report offers viable alternatives for tank system disposition. The Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) requires a Federal Facility Agreement (FFA) for federal facilities placed on the National Priorities List. The Oak Ridge Reservation was placed on that list on December 21, 1989, and the agreement was signed in November 1991 by DOE's Oak Ridge Operations Office, the US Environmental Protection Agency-Region IV, and the Tennessee Department of Environment and Conservation. The effective date of the FFA is January 1, 1992. One objective of the FFA is to ensure that inactive liquid low-level radioactive waste tank systems are evaluated and, if appropriate, remediated through the CERCLA process. The Inactive Tanks Remediation Program and the Gunite and Associated Tanks Project (GAAT) are the two efforts that will meet this FFA objective. This memorandum addresses tank systems within the Inactive Tanks Remediation Program. Separate CERCLA documentation addresses the tank systems within the GAAT Project

  19. Moving Mini-Max - a new indicator for technical analysis

    OpenAIRE

    Z. K. Silagadze

    2008-01-01

    We propose a new indicator for technical analysis. The indicator emphasizes maximums and minimums in price series with inherent smoothing and has a potential to be useful in both mechanical trading rules and chart pattern analysis.

  20. Technical Training Seminar

    CERN Multimedia

    2003-01-01

    Wednesday 19 November From 9:30 to 17:30 - IT Amphitheatre bldg.31 3-004 2003 Data Transmission Design Seminar - TEXAS INSTRUMENTS The one-day 2003 Data Transmission Design Seminar is part of a series of technical seminars produced by Texas Instruments. This series of seminars will introduce system solutions based on different standards according to their respective requirements. It will include topics containing a blend of basic principles and hands-on application examples, along with layout-guidelines. • Basics and Practical Examples of Data-Transmission • Low Voltage Differential Signaling (LVDS) • Connectivity and PC-based Links • Industrial Interfaces • Parallel Bus Systems and Clock-Distribution-Circuits (CDC) This Seminar will be of interest to hardware designers and system engineers dealing with the realization of system- or board-level interfaces based on various transmission standards. Language: English Free seminar, no registration Organiser: John ...

  1. French N4 Series Design and Manufacturing

    International Nuclear Information System (INIS)

    Lebreton, Gerard

    1989-01-01

    The construction contract for the first two N4 units, on the Chooz site close to the French-Belgian border (Chooz B1 and B2), was awarded by Electricite de France (EDF) to Framatome in May 1984. At present, project construction is approximately 50% complete for unit B1. The main civil works are practically finished, and the reactor vessel and the steam generator were delivered on site by mid-1988. The connection to the grid by 1991 appears quite feasible. Continuation of the French nuclear power programme in the 1990s, specifically on the Civaux and Penly sites, is based on this model. The N4 thus follows the four-loop, 1,300 MW class series designated P'4, whose first unit is Cattenom 1, which will serve in the following as a reference to appreciate the design evolution of the N4 NSSS. The design evolutions selected to reach these objectives are in technical continuity with the previous series, integrate the vast experience gained within the French programme and abroad, and were supported by a large R and D programme and further by industrial qualification tests

  2. Stochastic Simulation and Forecast of Hydrologic Time Series Based on Probabilistic Chaos Expansion

    Science.gov (United States)

    Li, Z.; Ghaith, M.

    2017-12-01

    Hydrological processes are characterized by many complex features, such as nonlinearity, dynamics and uncertainty. How to quantify and address such complexities and uncertainties has been a challenging task for water engineers and managers for decades. To support robust uncertainty analysis, an innovative approach for the stochastic simulation and forecast of hydrologic time series is developed in this study. Probabilistic Chaos Expansions (PCEs) are established through probabilistic collocation to tackle uncertainties associated with the parameters of traditional hydrological models. The uncertainties are quantified in model outputs as Hermite polynomials with regard to standard normal random variables. Subsequently, multivariate analysis techniques are used to analyze the complex nonlinear relationships between meteorological inputs (e.g., temperature, precipitation, evapotranspiration, etc.) and the coefficients of the Hermite polynomials. With the established relationships between model inputs and PCE coefficients, forecasts of hydrologic time series can be generated and the uncertainties in the future time series can be further tackled. The proposed approach is demonstrated using a case study in China and is compared to a traditional stochastic simulation technique, the Markov-Chain Monte-Carlo (MCMC) method. Results show that the proposed approach can serve as a reliable proxy to complicated hydrological models. It can provide probabilistic forecasting in a more computationally efficient manner, compared to the traditional MCMC method. This work provides technical support for addressing uncertainties associated with hydrological modeling and for enhancing the reliability of hydrological modeling results. Applications of the developed approach can be extended to many other complicated geophysical and environmental modeling systems to support the associated uncertainty quantification and risk analysis.
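The core fitting step of a Hermite chaos expansion can be sketched in a few lines: sample the standard-normal parameter, evaluate the model, and regress the outputs onto probabilists' Hermite polynomials. A minimal sketch with a toy one-parameter model (not the hydrological model of the study):

```python
import numpy as np
from numpy.polynomial.hermite_e import hermevander

rng = np.random.default_rng(1)

def model(xi):
    # Toy stand-in for a model with one standard-normal uncertain parameter.
    return np.exp(0.3 * xi) + 0.1 * xi**2

xi = rng.standard_normal(200)    # collocation samples of the parameter
y = model(xi)                    # model outputs at the samples
V = hermevander(xi, 4)           # probabilists' Hermite basis He_0 .. He_4
coeffs, *_ = np.linalg.lstsq(V, y, rcond=None)

# Because He_k has zero mean for k > 0 under xi ~ N(0, 1), the zeroth
# coefficient approximates the mean of the model output.
print(coeffs[0], y.mean())
```

In the study's setting, the regression of the coefficients on meteorological inputs would sit on top of this step.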

  3. Decision support for life extension of technical systems through virtual age modelling

    International Nuclear Information System (INIS)

    Pérez Ramírez, Pedro A.; Utne, Ingrid Bouwer

    2013-01-01

    This article presents a virtual age model for decision support regarding life extension of ageing repairable systems. The aim of the model is to evaluate different life extension decision alternatives and their impact on the future performance of the system. The model can be applied to systems operated continuously (e.g., process systems) and systems operated on demand (e.g., safety systems). Deterioration and efficiency of imperfect maintenance is assessed when there is limited or no degradation data, and only failure and maintenance data is available. Systems that are in operation can be studied, meaning that the systems may be degraded. The current degradation is represented by a “current virtual age”, which is calculated from recorded maintenance data. The model parameters are estimated with the maximum likelihood method. A case study illustrates the application of the model for life extension of two fire water pumps in an oil and gas facility. The performance of the pump system is assessed with respect to number of failures, safety unavailability and costs during the life extension period. -- Highlights: ► Life extension assessment of technical systems using virtual age model is proposed. ► A virtual age model is generalised for systems in stand-by and continuous operation. ► The concept of current virtual age describes technical condition of the system. ► Different decision alternatives for life extension can be easily analysed. ► The decision process is improved even when only scarce failure data is available
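The "current virtual age" idea can be illustrated with the classic Kijima type-I recursion, in which each repair removes a fraction of the age accumulated since the last failure. This is a generic sketch of that family of models, not the article's exact formulation; the repair-effectiveness value and failure times are hypothetical.

```python
# Kijima type-I virtual age: v_n = v_{n-1} + q * x_n, where x_n is the time
# between failures n-1 and n. q = 1 is minimal repair (as-bad-as-old),
# q = 0 is perfect repair (as-good-as-new).
def virtual_age(inter_failure_times, q):
    v = 0.0
    ages = []
    for x in inter_failure_times:
        v = v + q * x
        ages.append(v)
    return ages

# Hypothetical maintenance record: hours between successive failures.
times = [1000.0, 800.0, 600.0, 500.0]
print(virtual_age(times, q=0.5))   # imperfect repair
print(virtual_age(times, q=1.0))   # minimal repair: virtual age = real age
```

The last entry of the list plays the role of the "current virtual age" used to compare life-extension alternatives.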

  4. Evaluation of the autoregression time-series model for analysis of a noisy signal

    International Nuclear Information System (INIS)

    Allen, J.W.

    1977-01-01

    The autoregression (AR) time-series model of a continuous noisy signal was statistically evaluated to determine quantitatively the uncertainties of the model order, the model parameters, and the model's power spectral density (PSD). The result of such a statistical evaluation enables an experimenter to decide whether an AR model can adequately represent a continuous noisy signal and be consistent with the signal's frequency spectrum, and whether it can be used for on-line monitoring. Although evaluations of other types of signals have been reported in the literature, no direct reference has been found to AR model's uncertainties for continuous noisy signals; yet the evaluation is necessary to decide the usefulness of AR models of typical reactor signals (e.g., neutron detector output or thermocouple output) and the potential of AR models for on-line monitoring applications. AR and other time-series models for noisy data representation are being investigated by others since such models require fewer parameters than the traditional PSD model. For this study, the AR model was selected for its simplicity and conduciveness to uncertainty analysis, and controlled laboratory bench signals were used for continuous noisy data. (author)
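An AR fit of a noisy signal and its model PSD can be sketched with the Yule-Walker equations. This is a generic numpy illustration on a simulated AR(2) signal, not the study's bench data; the model order and coefficients are chosen for the example.

```python
import numpy as np

def yule_walker(x, order):
    """Estimate AR(p) coefficients and noise variance from autocovariances."""
    x = np.asarray(x) - np.mean(x)
    n = len(x)
    r = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(order + 1)])
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    a = np.linalg.solve(R, r[1:])
    sigma2 = r[0] - np.dot(a, r[1:])
    return a, sigma2

def ar_psd(a, sigma2, freqs):
    """PSD of the fitted AR model: sigma2 / |1 - sum_k a_k e^{-i 2 pi f k}|^2."""
    k = np.arange(1, len(a) + 1)
    denom = np.abs(1 - np.exp(-2j * np.pi * np.outer(freqs, k)) @ a) ** 2
    return sigma2 / denom

rng = np.random.default_rng(2)
# Simulate a noisy AR(2) "detector" signal.
x = np.zeros(5000)
for t in range(2, len(x)):
    x[t] = 1.5 * x[t - 1] - 0.75 * x[t - 2] + rng.standard_normal()

a_hat, s2 = yule_walker(x, order=2)
freqs = np.linspace(0.01, 0.5, 200)
peak_freq = freqs[np.argmax(ar_psd(a_hat, s2, freqs))]
print(a_hat, peak_freq)
```

Comparing this parametric PSD against a nonparametric estimate is one way to judge whether the AR model adequately represents the signal's frequency spectrum.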

  5. Scientific-technical reports by the GKSS in 1975

    International Nuclear Information System (INIS)

    1976-01-01

    The scientific-technical reports of the GKSS which are for use within the GKSS (internal reports) are summarized in the I-series, whereas the E-series contains the publications suitable for external circulation and for exchange of information with other research facilities and industrial undertakings. The survey contains data on titles and authors of all GKSS reports published in the year 1975. Of the external reports, abstracts and/or bibliographical data are reproduced in the appendix. (orig/LH)

  6. A reference model and technical framework for mobile social software for learning

    NARCIS (Netherlands)

    De Jong, Tim; Specht, Marcus; Koper, Rob

    2008-01-01

    De Jong,T., Specht, M., & Koper, R. (2008). A reference model and technical framework for mobile social software for learning. In I. A. Sánchez & P. Isaías (Eds.), Proceedings of the IADIS Mobile Learning Conference 2008 (pp. 206-210). April, 11-13, 2008, Carvoeiro, Portugal.

  7. Formal Modelling and Analysis of Socio-Technical Systems

    NARCIS (Netherlands)

    Probst, Christian W.; Kammüller, Florian; Rydhof Hansen, René; Probst, Christian W.; Hankin, Chris; Rydhof Hansen, René

    2015-01-01

    Attacks on systems and organisations increasingly exploit human actors, for example through social engineering. This non-technical aspect of attacks complicates their formal treatment and automatic identification. Formalisation of human behaviour is difficult at best, and attacks on socio-technical

  8. Neural Network Models for Time Series Forecasts

    OpenAIRE

    Tim Hill; Marcus O'Connor; William Remus

    1996-01-01

    Neural networks have been advocated as an alternative to traditional statistical forecasting methods. In the present experiment, time series forecasts produced by neural networks are compared with forecasts from six statistical time series methods generated in a major forecasting competition (Makridakis et al. [Makridakis, S., A. Anderson, R. Carbone, R. Fildes, M. Hibon, R. Lewandowski, J. Newton, E. Parzen, R. Winkler. 1982. The accuracy of extrapolation (time series) methods: Results of a ...

  9. MARKOWITZ' MODEL WITH FUNDAMENTAL AND TECHNICAL ANALYSIS – COMPLEMENTARY METHODS OR NOT

    Directory of Open Access Journals (Sweden)

    Branka Marasović

    2011-02-01

    As is well known, there are a few "starting points" in the portfolio optimization process, i.e. in the stock selection process. The famous Markowitz optimization model is unavoidable in this job. On the other side, someone may say that the indicators of fundamental analysis must be the starting point. Besides that, the suggestions of technical analysis must be taken into consideration. There are numerous studies of each approach separately, but it is almost impossible to find research combining these approaches into a logical and efficient unity. The main task of the paper is to find out whether these approaches are complementary and, if they are, how to apply them as one efficient unit process. The empirical part of the study uses a share sample from the Croatian stock market. Beside the Markowitz MV model and fundamental and technical analysis, an original multi-criterion approach plays a big role in the paper.
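The Markowitz mean-variance step has a simple closed form for the global minimum-variance portfolio, which can serve as a baseline before fundamental or technical screens are applied. A minimal sketch with a hypothetical three-share covariance matrix (not the Croatian market data):

```python
import numpy as np

def min_variance_weights(cov):
    """Closed-form global minimum-variance portfolio (fully invested)."""
    inv = np.linalg.inv(cov)
    ones = np.ones(cov.shape[0])
    w = inv @ ones
    return w / (ones @ w)      # weights sum to 1

# Hypothetical annualized covariance matrix for three shares.
cov = np.array([[0.040, 0.006, 0.004],
                [0.006, 0.090, 0.010],
                [0.004, 0.010, 0.160]])
w = min_variance_weights(cov)
print(w, w @ cov @ w)          # weights and portfolio variance
```

In a combined workflow, fundamental and technical indicators would restrict or re-rank the asset universe before this optimization is run.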

  10. Technical basis for the ITER-FEAT outline design

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-11-01

    This ITER EDA Documentation Series issue summarizes the results of the ITER Engineering Design Activities on the technical basis for the ITER-FEAT outline design. This issue also comprises some physical analysis activities as well as structure and goals of the Physics Expert Group activities.

  11. Technical basis for the ITER-FEAT outline design

    International Nuclear Information System (INIS)

    2000-01-01

    This ITER EDA Documentation Series issue summarizes the results of the ITER Engineering Design Activities on the technical basis for the ITER-FEAT outline design. This issue also comprises some physical analysis activities as well as structure and goals of the Physics Expert Group activities

  12. Modeling of plasma chemistry in a corona streamer pulse series in air

    International Nuclear Information System (INIS)

    Nowakowska, H.; Stanco, J.; Dors, M.; Mizeraczyk, J.

    2002-01-01

    The aim of this study is to analyse the chemistry in air treated by a series of corona discharge streamers. Attention is focused on the conversion of ozone and nitrogen oxides. In the model it is assumed that the streamer head of relatively small geometrical dimensions propagates from the anode to the cathode, leaving the streamer channel behind. Any elemental gas volume in the streamer path is subjected first to the conditions of the streamer head, and next to those of the streamer channel. The kinetics of plasma-chemical processes occurring in the gas is modeled numerically for a single streamer and a series of streamers. The temporal evolution of 25 chemical compounds initially present or produced in air is calculated. (author)
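Kinetics models of this kind reduce to a coupled system of rate equations integrated over the streamer-head and streamer-channel phases. A toy two-reaction ozone scheme, integrated with forward Euler, illustrates the bookkeeping; the rate constants and concentrations are illustrative only, not the 25-species mechanism of the paper.

```python
# Toy ozone kinetics: O + O2 -> O3 (rate k1) and O + O3 -> 2 O2 (rate k2).
# Rate constants and initial concentrations are in arbitrary units.
k1, k2 = 1.0e-3, 5.0e-4
c = {"O": 1.0, "O2": 100.0, "O3": 0.0}

dt, steps = 0.01, 10000
for _ in range(steps):
    r1 = k1 * c["O"] * c["O2"]
    r2 = k2 * c["O"] * c["O3"]
    c["O"]  += (-r1 - r2) * dt
    c["O2"] += (-r1 + 2 * r2) * dt
    c["O3"] += (r1 - r2) * dt

print(c)   # atomic oxygen is consumed, ozone builds up
```

A useful sanity check for any such scheme is conservation of atoms: here O + 2·O2 + 3·O3 stays constant at every step. Production-grade kinetics solvers would use a stiff implicit integrator rather than Euler.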

  13. Fisher information framework for time series modeling

    Science.gov (United States)

    Venkatesan, R. C.; Plastino, A.

    2017-08-01

    A robust prediction model invoking the Takens embedding theorem, whose working hypothesis is obtained via an inference procedure based on the minimum Fisher information principle, is presented. The coefficients of the ansatz, central to the working hypothesis, satisfy a time-independent Schrödinger-like equation in a vector setting. The inference of (i) the probability density function of the coefficients of the working hypothesis and (ii) the establishing of a constraint-driven pseudo-inverse condition for the modeling phase of the prediction scheme is made, for the case of normal distributions, with the aid of the quantum mechanical virial theorem. The well-known reciprocity relations and the associated Legendre transform structure for the Fisher information measure (FIM, hereafter)-based model in a vector setting (with least square constraints) are self-consistently derived. These relations are demonstrated to yield an intriguing form of the FIM for the modeling phase, which defines the working hypothesis solely in terms of the observed data. Prediction cases employ time series obtained from: (i) the Mackey-Glass delay-differential equation, (ii) one ECG signal from the MIT-Beth Israel Deaconess Hospital (MIT-BIH) cardiac arrhythmia database, and (iii) one ECG signal from the Creighton University ventricular tachyarrhythmia database. The ECG samples were obtained from the Physionet online repository. These examples demonstrate the efficiency of the prediction model. Numerical examples for exemplary cases are provided.

  14. Advancements in the behavioral modeling of fuel elements and related structures

    International Nuclear Information System (INIS)

    Billone, M.C.; Montgomery, R.O.; Rashid, Y.R.; Head, J.L.

    1989-01-01

    An important aspect of the design and analysis of nuclear reactors is the ability to predict the behavior of fuel elements in the adverse environment of a reactor system. By understanding the thermomechanical behavior of the different materials which constitute a nuclear fuel element, analysis and predictions can be made regarding the integrity and reliability of fuel element designs. The SMiRT conference series, through the division on fuel elements and the post-conference seminars on fuel element modeling, provided technical forums for the international participation in the exchange of knowledge concerning the thermomechanical modeling of fuel elements. This paper discusses the technical advances in the behavioral modeling of fuel elements presented at the SMiRT conference series since its inception in 1971. Progress in the areas of material properties and constitutive relationships, modeling methodologies, and integral modeling approaches was reviewed and is summarized in light of their impact on the thermomechanical modeling of nuclear fuel elements. 34 refs., 5 tabs

  15. National Profiles in Technical and Vocational Education in Asia and the Pacific: Australia.

    Science.gov (United States)

    United Nations Educational, Scientific and Cultural Organization, Bangkok (Thailand). Principal Regional Office for Asia and the Pacific.

    This technical and vocational education (TVE) profile on Australia is one in a series of profiles of UNESCO member countries. It is intended to be a handy reference on TVE systems, staff development, technical cooperation, and information networking. Chapter 1 describes the demography, government, and economy of Australia. Chapter 2 provides…

  16. Time series modelling to forecast prehospital EMS demand for diabetic emergencies.

    Science.gov (United States)

    Villani, Melanie; Earnest, Arul; Nanayakkara, Natalie; Smith, Karen; de Courten, Barbora; Zoungas, Sophia

    2017-05-05

    Acute diabetic emergencies are often managed by prehospital Emergency Medical Services (EMS). The projected growth in prevalence of diabetes is likely to result in rising demand for prehospital EMS that are already under pressure. The aims of this study were to model the temporal trends and provide forecasts of prehospital attendances for diabetic emergencies. A time series analysis on monthly cases of hypoglycemia and hyperglycemia was conducted using data from the Ambulance Victoria (AV) electronic database between 2009 and 2015. Using the seasonal autoregressive integrated moving average (SARIMA) modelling process, different models were evaluated. The most parsimonious model with the highest accuracy was selected. Forty-one thousand four hundred fifty-four prehospital diabetic emergencies were attended over a seven-year period with an increase in the annual median monthly caseload between 2009 (484.5) and 2015 (549.5). Hypoglycemia (70%) and people with type 1 diabetes (48%) accounted for most attendances. The SARIMA (0,1,0,12) model provided the best fit, with a MAPE of 4.2% and predicts a monthly caseload of approximately 740 by the end of 2017. Prehospital EMS demand for diabetic emergencies is increasing. SARIMA time series models are a valuable tool to allow forecasting of future caseload with high accuracy and predict increasing cases of prehospital diabetic emergencies into the future. The model generated by this study may be used by service providers to allow appropriate planning and resource allocation of EMS for diabetic emergencies.

  17. Technical report on comparative analysis of ASME QA requirements and ISO series

    International Nuclear Information System (INIS)

    Kim, Kwan Hyun

    2000-06-01

    This technical report describes the differences between the ASME and ISO quality assurance (QA) requirements in the nuclear field. It applies to the QA programmes for design covered by the two sets of requirements. The organization having overall responsibility for nuclear design, preservation, and fabrication is described in this report for each stage of the design project

  18. Modelling of series of types of automated trenchless works tunneling

    Science.gov (United States)

    Gendarz, P.; Rzasinski, R.

    2016-08-01

    Microtunneling is the newest method for making underground installations. The method is the result of experience and techniques applied in earlier trenchless underground works. To develop this particular earthworks method, it is considered reasonable to elaborate a series of types of tunneling machine constructions. Many design solutions for such machines exist, but the current goal is to develop a non-excavation robotized machine. Erosion machines for tunnels with main dimensions of 1600, 2000, 2500, and 3150 are designed with the use of computer-aided methods. The creation of the series of types of tunneling machine constructions was preceded by an analysis of the current state. The verification of the practical methodology for creating the systematic part series was based on the designed series of types of erosion machines. The following were developed: a method of construction similarity for the erosion machines, algorithmic methods for variant analyses of quantitative construction attributes in the I-DEAS advanced graphical program, and relational and program parameterization. The manufacturing process of the parts will then be created, which allows the technological process to be verified on CNC machines. The models of the designed machines will be modified and the construction consulted with erosion machine users and manufacturers such as Tauber Rohrbau GmbH & Co. KG from Münster and OHL ZS a.s. from Brno. The companies' acceptance will result in practical verification by the JUMARPOL company.

  19. Time series modeling of live-cell shape dynamics for image-based phenotypic profiling.

    Science.gov (United States)

    Gordonov, Simon; Hwang, Mun Kyung; Wells, Alan; Gertler, Frank B; Lauffenburger, Douglas A; Bathe, Mark

    2016-01-01

    Live-cell imaging can be used to capture spatio-temporal aspects of cellular responses that are not accessible to fixed-cell imaging. As the use of live-cell imaging continues to increase, new computational procedures are needed to characterize and classify the temporal dynamics of individual cells. For this purpose, here we present the general experimental-computational framework SAPHIRE (Stochastic Annotation of Phenotypic Individual-cell Responses) to characterize phenotypic cellular responses from time series imaging datasets. Hidden Markov modeling is used to infer and annotate morphological state and state-switching properties from image-derived cell shape measurements. Time series modeling is performed on each cell individually, making the approach broadly useful for analyzing asynchronous cell populations. Two-color fluorescent cells simultaneously expressing actin and nuclear reporters enabled us to profile temporal changes in cell shape following pharmacological inhibition of cytoskeleton-regulatory signaling pathways. Results are compared with existing approaches conventionally applied to fixed-cell imaging datasets, and indicate that time series modeling captures heterogeneous dynamic cellular responses that can improve drug classification and offer additional important insight into mechanisms of drug action. The software is available at http://saphire-hcs.org.
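The state-annotation step of a hidden Markov model can be illustrated with a small Viterbi decoder for discretized shape measurements. This is a generic sketch, not the SAPHIRE implementation: the two "morphological states" and all parameters below are hypothetical.

```python
import numpy as np

def viterbi(obs, log_start, log_trans, log_emit):
    """Most likely hidden state path for a discrete-emission HMM."""
    n_states = log_start.shape[0]
    T = len(obs)
    delta = np.full((T, n_states), -np.inf)   # best log-prob ending in state j
    back = np.zeros((T, n_states), dtype=int)
    delta[0] = log_start + log_emit[:, obs[0]]
    for t in range(1, T):
        scores = delta[t - 1][:, None] + log_trans        # (from, to)
        back[t] = np.argmax(scores, axis=0)
        delta[t] = scores[back[t], np.arange(n_states)] + log_emit[:, obs[t]]
    path = np.zeros(T, dtype=int)
    path[-1] = int(np.argmax(delta[-1]))
    for t in range(T - 2, -1, -1):            # backtrack
        path[t] = back[t + 1][path[t + 1]]
    return path

# Two hypothetical morphological states ("spread", "round") emitting one of
# two discretized shape readouts; parameters are illustrative.
log_start = np.log([0.5, 0.5])
log_trans = np.log([[0.8, 0.2], [0.2, 0.8]])   # sticky state switching
log_emit = np.log([[0.8, 0.2], [0.2, 0.8]])    # state-biased observations
obs = np.array([0, 0, 0, 1, 1, 1, 0, 0])
print(viterbi(obs, log_start, log_trans, log_emit))
```

Run per cell, the decoded path gives the state annotation and the number of state switches used for phenotypic profiling.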

  20. A Feature Fusion Based Forecasting Model for Financial Time Series

    Science.gov (United States)

    Guo, Zhiqiang; Wang, Huaiqing; Liu, Quan; Yang, Jie

    2014-01-01

    Predicting the stock market has become an increasingly interesting research area for both researchers and investors, and many prediction models have been proposed. In these models, feature selection techniques are used to pre-process the raw data and remove noise. In this paper, a prediction model is constructed to forecast stock market behavior with the aid of independent component analysis, canonical correlation analysis, and a support vector machine. First, two types of features are extracted from the historical closing prices and 39 technical variables by independent component analysis. Second, a canonical correlation analysis method is utilized to combine the two types of features and extract intrinsic features to improve the performance of the prediction model. Finally, a support vector machine is applied to forecast the next day's closing price. The proposed model is applied to the Shanghai stock market index and the Dow Jones index, and experimental results show that the proposed model performs better in prediction than two other similar models. PMID:24971455

  1. A time series modeling approach in risk appraisal of violent and sexual recidivism.

    Science.gov (United States)

    Bani-Yaghoub, Majid; Fedoroff, J Paul; Curry, Susan; Amundsen, David E

    2010-10-01

    For over half a century, various clinical and actuarial methods have been employed to assess the likelihood of violent recidivism. Yet there is a need for new methods that can improve the accuracy of recidivism predictions. This study proposes a new time series modeling approach that generates high levels of predictive accuracy over both short and long periods of time. The proposed approach outperformed two widely used actuarial instruments (i.e., the Violence Risk Appraisal Guide and the Sex Offender Risk Appraisal Guide). Furthermore, analysis of temporal risk variations based on specific time series models can add valuable information to the risk assessment and management of violent offenders.

  2. Forecast models for suicide: Time-series analysis with data from Italy.

    Science.gov (United States)

    Preti, Antonio; Lentini, Gianluca

    2016-01-01

    The prediction of suicidal behavior is a complex task. To fine-tune targeted preventative interventions, predictive analytics (i.e. forecasting future risk of suicide) is more important than exploratory data analysis (pattern recognition, e.g. detection of seasonality in suicide time series). This study sets out to investigate the accuracy of forecasting models of suicide for men and women. A total of 101,499 male suicides and 39,681 female suicides, which occurred in Italy from 1969 to 2003, were investigated. In order to apply the forecasting model and test its accuracy, the time series were split into a training set (1969 to 1996; 336 months) and a test set (1997 to 2003; 84 months). The main outcome was the accuracy of forecasting models on the monthly number of suicides. The following measures of accuracy were used: mean absolute error; root mean squared error; mean absolute percentage error; mean absolute scaled error. In both male and female suicides a change in the trend pattern was observed, with an increase from 1969 onwards to a maximum around 1990 and a decrease thereafter. The variances attributable to the seasonal and trend components were, respectively, 24% and 64% in male suicides, and 28% and 41% in female ones. Both annual and seasonal historical trends of monthly data contributed to forecasting future trends of suicide with a margin of error of around 10%. The finding is clearer in the male than in the female suicide time series. The main conclusion of the study is that models taking seasonality into account seem able to capture deviations from the mean when they occur as a zenith, but fail to reproduce them when they occur as a nadir. Preventative efforts should concentrate on the factors that influence increases above the main trend in both seasonal and cyclic patterns of suicides.
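
    The four accuracy measures named above have standard definitions; a minimal sketch with toy numbers follows. One assumption is flagged: MASE is computed here with the non-seasonal naive scaling (in-sample one-step naive forecast), whereas a seasonal-naive scaling could equally be used for monthly data.

```python
import numpy as np

def forecast_accuracy(y_true, y_pred, y_train):
    """MAE, RMSE, MAPE and MASE, with their usual definitions."""
    e = y_true - y_pred
    mae = np.mean(np.abs(e))
    rmse = np.sqrt(np.mean(e ** 2))
    mape = 100 * np.mean(np.abs(e / y_true))
    # MASE scales MAE by the in-sample MAE of the naive one-step forecast.
    scale = np.mean(np.abs(np.diff(y_train)))
    mase = mae / scale
    return {"MAE": mae, "RMSE": rmse, "MAPE": mape, "MASE": mase}

# Toy monthly counts: a short training history and a 3-month test window.
train = np.array([100.0, 110.0, 105.0, 120.0, 115.0])
actual = np.array([118.0, 122.0, 125.0])
forecast = np.array([120.0, 120.0, 121.0])
metrics = forecast_accuracy(actual, forecast, train)
print({k: round(v, 3) for k, v in metrics.items()})
```

MASE below 1 means the model beats the naive benchmark on average, which is what makes it a convenient scale-free complement to MAE and RMSE.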

  3. Time Series with Long Memory

    OpenAIRE

    西埜, 晴久

    2004-01-01

    The paper investigates an application of long-memory processes to economic time series. We show properties of long-memory processes that motivate modeling long-memory phenomena in economic time series. An FARIMA model is described as an example of a long-memory model in statistical terms. The paper explains basic limit theorems and estimation methods for long-memory processes in order to apply long-memory models to economic time series.
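
    The defining ingredient of an FARIMA(p, d, q) model is the fractional differencing operator (1 - B)^d with non-integer d. A minimal sketch of its binomial-expansion weights, whose slow hyperbolic decay is the source of long memory (the recursion is the standard one, not specific to this paper):

```python
import numpy as np

def frac_diff_weights(d, n):
    """First n coefficients of the binomial expansion of (1 - B)^d:
    w_0 = 1, w_k = w_{k-1} * (k - 1 - d) / k."""
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - d) / k
    return w

# For 0 < d < 0.5 the weights decay hyperbolically (roughly k**(-1-d)),
# so distant past values keep a small but non-negligible influence.
w = frac_diff_weights(0.3, 6)
print(w)
```

Applying these weights to a series (a truncated convolution) gives the fractionally differenced series to which the short-memory ARMA part is then fitted.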

  4. FY 1996 Scientific and Technical Reports, Articles, Papers, and Presentations. Volume 1

    Science.gov (United States)

    Turner-Waits, Joyce E. (Compiler)

    1996-01-01

    This document presents formal NASA technical reports, papers published in technical journals, and presentations by MSFC personnel in FY96. It also includes papers of MSFC contractors. After being announced in STAR, all of the NASA series reports may be obtained from the National Technical Information Service, 5285 Port Royal Road, Springfield, VA 22161. The information in this report may be of value to the scientific and engineering community in determining what information has been published and what is available.

  5. Generation of Natural Runoff Monthly Series at Ungauged Sites Using a Regional Regressive Model

    Directory of Open Access Journals (Sweden)

    Dario Pumo

    2016-05-01

    Full Text Available Many hydrologic applications require reliable estimates of runoff in river basins to face the widespread lack of data, both in time and in space. A regional method for the reconstruction of monthly runoff series is here developed and applied to Sicily (Italy). A simple modeling structure is adopted, consisting of a regression-based rainfall–runoff model with four model parameters, calibrated through a two-step procedure. Monthly runoff estimates are based on precipitation and temperature, exploiting the autocorrelation with runoff at the previous month. Model parameters are assessed by specific regional equations as a function of easily measurable physical and climate basin descriptors. The first calibration step identifies, for each basin, the set of parameters optimizing model performance at the single-basin level. These "optimal" sets are used in the second step, a regional regression analysis, to establish the regional equations for assessing model parameters as a function of basin attributes. All the gauged watersheds across the region have been analyzed, selecting 53 basins for model calibration and using the other six basins exclusively for validation. Performances, quantitatively evaluated by different statistical indexes, demonstrate the model's ability to reproduce the observed hydrological time series at monthly and coarser time resolutions. The methodology, which is easily transferable to other arid and semi-arid areas, provides a reliable tool for filling and reconstructing runoff time series at any gauged or ungauged basin of a region.
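
    The first calibration step can be sketched for a single basin as an ordinary least-squares fit of a four-parameter model of the form Q(t) = a + b*P(t) + c*T(t) + d*Q(t-1). Everything below is synthetic and illustrative; the paper's actual model structure, regional equations, and basin descriptors are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic monthly series standing in for one gauged basin (10 years).
n = 120
P = rng.gamma(2.0, 30.0, n)                           # precipitation [mm]
T = 15 + 10 * np.sin(2 * np.pi * np.arange(n) / 12)   # temperature [deg C]
Q = np.zeros(n)
for t in range(1, n):
    Q[t] = 5 + 0.2 * P[t] - 0.3 * T[t] + 0.5 * Q[t - 1] + rng.normal(0, 2)
Q = np.clip(Q, 0, None)  # runoff cannot be negative

# Single-basin calibration: least-squares fit of the four parameters
# (intercept, precipitation, temperature, lag-1 runoff). The regional
# step would then regress these parameters on basin descriptors.
X = np.column_stack([np.ones(n - 1), P[1:], T[1:], Q[:-1]])
params, *_ = np.linalg.lstsq(X, Q[1:], rcond=None)
Q_hat = X @ params
print(np.round(params, 2))
```

The lag-1 runoff term is what lets the model carry storage memory from month to month, which is why only three climatic inputs suffice at the monthly scale.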

  6. National Profiles in Technical and Vocational Education in Asia and the Pacific: Fiji.

    Science.gov (United States)

    United Nations Educational, Scientific and Cultural Organization, Bangkok (Thailand). Principal Regional Office for Asia and the Pacific.

    This technical and vocational education (TVE) profile on Fiji is one in a series of profiles of UNESCO member countries. It is intended to be a handy reference on TVE systems, staff development, technical cooperation, and information networking. Part I, General Information, covers the following: location, area, and physical features; economic and…

  7. National Profiles in Technical and Vocational Education in Asia and the Pacific: Bangladesh.

    Science.gov (United States)

    United Nations Educational, Scientific and Cultural Organization, Bangkok (Thailand). Principal Regional Office for Asia and the Pacific.

    This technical and vocational education (TVE) profile on Bangladesh is one in a series of profiles of UNESCO member countries. It is intended to be a handy reference on TVE systems, staff development, technical cooperation, and information networking. An overview of the report appears first. Part I provides general information on the physical…

  8. National Profiles in Technical and Vocational Education in Asia and the Pacific: Bhutan.

    Science.gov (United States)

    United Nations Educational, Scientific and Cultural Organization, Bangkok (Thailand). Principal Regional Office for Asia and the Pacific.

    This technical and vocational education (TVE) profile on Bhutan is one in a series of profiles of UNESCO member countries. It is intended to be a handy reference on TVE systems, staff development, technical cooperation, and information networking. Part I, Policy Concern, provides general information on the education delivery system, the role of…

  9. Exploring Electrochromics: A Series of Eye-Catching Experiments to Introduce Students to Multidisciplinary Research

    Science.gov (United States)

    Small, Leo J.; Wolf, Steven; Spoerke, Erik D.

    2014-01-01

    Introducing students to a multidisciplinary research laboratory presents challenges in terms of learning specific technical skills and concepts but also with respect to integrating different technical elements to form a coherent picture of the research. Here we present a multidisciplinary series of experiments we have developed in the Electronic,…

  10. Wet tropospheric delays forecast based on Vienna Mapping Function time series analysis

    Science.gov (United States)

    Rzepecka, Zofia; Kalita, Jakub

    2016-04-01

    It is well known that the dry part of the zenith tropospheric delay (ZTD) is much easier to model than the wet part (ZTW). The aim of the research is to apply stochastic modeling and prediction of ZTW using time series analysis tools. Application of time series analysis enables a closer understanding of ZTW behavior as well as short-term prediction of future ZTW values. The ZTW data used for the studies were obtained from the GGOS service maintained by the Vienna University of Technology. The resolution of the data is six hours. ZTW data for the years 2010 to 2013 were adopted for the study. The International GNSS Service (IGS) permanent stations LAMA and GOPE, located in mid-latitudes, were selected for the investigation. Initially the seasonal part was separated and modeled using periodic signals and frequency analysis. The prominent annual and semi-annual signals were removed using sine and cosine functions. The autocorrelation of the resulting signal is significant for several days (20-30 samples). The residuals of this fitting were further analyzed and modeled with ARIMA processes. For both stations, optimal ARMA processes were obtained based on several criteria. On this basis, ZTW values were predicted one day ahead, leaving white-noise residuals. The accuracy of the prediction can be estimated at about 3 cm.
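
    The two-stage procedure (harmonic removal of annual and semi-annual signals, then autoregressive modeling of the residuals) can be sketched as follows. The synthetic series, the AR(1) choice, and all parameter values are assumptions for illustration; they are not the GGOS data or the ARMA orders selected in the study.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic 6-hourly ZTW-like series (4 samples/day, four years) with
# annual and semi-annual cycles plus strongly autocorrelated noise.
n = 4 * 365 * 4
t = np.arange(n) / 4.0  # time in days
ztw = (15 + 5 * np.sin(2 * np.pi * t / 365.25)
       + 2 * np.cos(4 * np.pi * t / 365.25))
noise = np.zeros(n)
for i in range(1, n):
    noise[i] = 0.9 * noise[i - 1] + rng.normal(0, 0.3)
ztw = ztw + noise

# Step 1: remove annual and semi-annual signals with a sine/cosine fit.
def harmonics(tt):
    return np.column_stack([np.ones(len(tt)),
                            np.sin(2 * np.pi * tt / 365.25),
                            np.cos(2 * np.pi * tt / 365.25),
                            np.sin(4 * np.pi * tt / 365.25),
                            np.cos(4 * np.pi * tt / 365.25)])

H = harmonics(t)
coef, *_ = np.linalg.lstsq(H, ztw, rcond=None)
resid = ztw - H @ coef

# Step 2: fit an AR(1) to the residuals (a stand-in for the ARMA
# selection step) and predict one day ahead: 4 steps at 6 h resolution.
phi = np.dot(resid[1:], resid[:-1]) / np.dot(resid[:-1], resid[:-1])
t_future = (n + np.arange(1, 5)) / 4.0
forecast = harmonics(t_future) @ coef + resid[-1] * phi ** np.arange(1, 5)
print(np.round(forecast, 2))
```

The forecast adds the deterministic seasonal model evaluated at future epochs to the geometrically decaying AR prediction of the stochastic residual.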

  11. Applying ARIMA model for annual volume time series of the Magdalena River

    Directory of Open Access Journals (Sweden)

    Gloria Amaris

    2017-04-01

    Conclusions: The results simulated with the ARIMA model showed fairly good agreement with the observed data for the minimum and maximum magnitudes. This allows concluding that the model is a good tool for estimating minimum and maximum volumes, even though it is not capable of simulating the exact behaviour of an annual volume time series.
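
    A minimal sketch of the differencing-plus-autoregression idea behind ARIMA, here an ARIMA(1,1,0)-style fit on a synthetic annual-volume series (the orders actually fitted in the paper are not reproduced):

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic annual-volume series with a stochastic trend, so one
# differencing pass (the "I" in ARIMA) makes it stationary.
n = 60
y = np.cumsum(2 + rng.normal(0, 1, n))  # random walk with drift
w = np.diff(y)                          # d = 1

# Fit the AR(1) part on the (demeaned) differenced series.
w0 = w - w.mean()
phi = np.dot(w0[1:], w0[:-1]) / np.dot(w0[:-1], w0[:-1])

# One-step forecast: predict the next increment, then undifference.
w_next = w.mean() + phi * w0[-1]
y_next = y[-1] + w_next
print(round(y_next, 2))
```

Undifferencing (adding the predicted increment back to the last observation) is what turns the stationary-model forecast into a forecast of the original volume series.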

  12. Risk-based technical specifications: Development and application of an approach to the generation of a plant specific real-time risk model

    International Nuclear Information System (INIS)

    Puglia, B.; Gallagher, D.; Amico, P.; Atefi, B.

    1992-10-01

    This report describes a process developed to convert an existing PRA into a model amenable to real-time, risk-based technical specification calculations. In earlier studies (culminating in NUREG/CR-5742), several risk-based approaches to technical specifications were evaluated. A real-time approach using a plant-specific PRA capable of modeling plant configurations as they change was identified as the most comprehensive approach to controlling plant risk. A master fault tree logic model representative of all of the core damage sequences was developed. Portions of the system fault trees were modularized, and supercomponents comprising component failures with similar effects were developed to reduce the size of the model and its quantification times. Modifications to the master fault tree logic were made to properly model the effect of maintenance and recovery actions. Fault trees representing several actuation systems not modeled in detail in the existing PRA were added to the master fault tree logic. This process was applied to the Surry NUREG-1150 Level 1 PRA. The master logic model was confirmed. The model was then used to evaluate the frequency associated with several plant configurations using the IRRAS code. For all cases analyzed, computational time was less than three minutes. This document, Volume 2, contains appendices A, B, and C. These provide, respectively: the Surry Technical Specifications Model Database, the Surry Technical Specifications Model, and a list of supercomponents used in the Surry Technical Specifications Model.

  13. On the Statistical Validation of Technical Analysis

    Directory of Open Access Journals (Sweden)

    Rosane Riera Freire

    2007-06-01

    Full Text Available Technical analysis, or charting, aims at visually identifying geometrical patterns in price charts in order to anticipate price "trends". In this paper we revisit the issue of technical analysis validation, which has been tackled in the literature without taking care of (i) the presence of heterogeneity and (ii) statistical dependence in the analyzed data: various agglutinated return time series from distinct financial securities. The main purpose here is to address the first cited problem by suggesting a validation methodology that also "homogenizes" the securities according to the finite-dimensional probability distribution of their return series. The general steps go through the identification of the stochastic processes for the securities' returns, the clustering of similar securities, and, finally, the identification of the presence, or absence, of informational content obtained from those price patterns. We illustrate the proposed methodology with a real-data exercise including several securities of the global market. Our investigation shows that there is statistically significant informational content in two out of three common patterns usually found through technical analysis, namely: triangle, rectangle, and head and shoulders.

  14. The partial duration series method in regional index-flood modeling

    DEFF Research Database (Denmark)

    Madsen, Henrik; Rosbjerg, Dan

    1997-01-01

    A regional index-flood method based on the partial duration series model is introduced. The model comprises the assumptions of a Poisson-distributed number of threshold exceedances and generalized Pareto (GP) distributed peak magnitudes. The regional T-year event estimator is based on a regional...... estimator is superior to the at-site estimator even in extremely heterogenous regions, the performance of the regional estimator being relatively better in regions with a negative shape parameter. When the record length increases, the relative performance of the regional estimator decreases, but it is still...
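
    Under the two stated assumptions (Poisson-distributed exceedances at rate λ per year above a threshold u, and generalized Pareto peak magnitudes with scale σ and shape ξ), the at-site T-year event has the standard closed form z_T = u + (σ/ξ)((λT)^ξ − 1). A sketch with hypothetical parameter values, not taken from the paper:

```python
import numpy as np

def pds_t_year_event(u, sigma, xi, lam, T):
    """T-year event for a partial duration series model: Poisson
    exceedances (rate lam per year) above threshold u, with GP-distributed
    magnitudes (scale sigma, shape xi)."""
    T = np.asarray(T, dtype=float)
    if abs(xi) < 1e-12:  # exponential-magnitude (Gumbel) limit
        return u + sigma * np.log(lam * T)
    return u + (sigma / xi) * ((lam * T) ** xi - 1)

# Hypothetical at-site parameters: 3 exceedances/year above 100 m^3/s,
# scale 20 m^3/s, negative shape (upper-bounded tail).
z = pds_t_year_event(100.0, 20.0, -0.1, 3.0, [10, 50, 100])
print(np.round(z, 1))
```

In the regional index-flood setting, the growth curve implied by this quantile function would be estimated from regional (dimensionless) data and rescaled by an at-site index flood.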

  15. Investigation of prospects for forecasting non-linear time series by example of drilling oil and gas wells

    Science.gov (United States)

    Vlasenko, A. V.; Sizonenko, A. B.; Zhdanov, A. A.

    2018-05-01

    Discrete time series, or mappings, are proposed for describing the dynamics of a nonlinear system. The article considers the problem of forecasting the dynamics of the system from the time series it generates. In particular, the commercial rate of drilling oil and gas wells can be considered as a series in which each next value depends on the previous one. The main parameter here is the technical drilling speed. To eliminate measurement error and estimate the commercial speed of the object with good accuracy at the current, a future, or any elapsed time point, the use of the Kalman filter is suggested. For the transition from a deterministic model to a probabilistic one, ensemble modeling is suggested. Ensemble systems can provide a wide range of visual output, which helps the user to evaluate the measure of confidence in the model. In particular, the availability of information on the estimated calendar duration of the construction of oil and gas wells will allow drilling companies to optimize production planning by rationalizing the approach to loading drilling rigs, which ultimately leads to maximization of profit and an increase in their competitiveness.
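
    A minimal scalar Kalman filter sketch for smoothing noisy drilling-rate measurements, under the assumption that the true rate follows a random walk. All numerical values (noise variances, the 12 m/h rate) are illustrative, not taken from the article.

```python
import numpy as np

def kalman_1d(measurements, q=0.01, r=0.25, x0=0.0, p0=1.0):
    """Scalar Kalman filter for a random-walk state x_t = x_{t-1} + w_t.
    q: process-noise variance, r: measurement-noise variance."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p += q                # predict: variance grows by process noise
        k = p / (p + r)       # Kalman gain
        x += k * (z - x)      # update with the new measurement
        p *= (1 - k)          # posterior variance shrinks
        estimates.append(x)
    return np.array(estimates)

# Noisy commercial drilling-rate measurements (m/h, synthetic values).
rng = np.random.default_rng(4)
true_rate = 12.0
z = true_rate + rng.normal(0, 0.5, 50)
est = kalman_1d(z, x0=z[0])
print(round(est[-1], 2))
```

The same recursion run forward in time without updates (prediction only) gives the future-point estimates the article mentions, with the variance p quantifying the growing uncertainty.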

  16. Comparison of Uncertainty of Two Precipitation Prediction Models at Los Alamos National Lab Technical Area 54

    Energy Technology Data Exchange (ETDEWEB)

    Shield, Stephen Allan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Dai, Zhenxue [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-08-18

    Meteorological inputs are an important part of subsurface flow and transport modeling. The choice of source for the meteorological data used as inputs has significant impacts on the results of subsurface flow and transport studies. One method to obtain the meteorological data required for such studies is the use of weather-generating models. This paper compares the performance of two weather-generating models at Technical Area 54 of Los Alamos National Lab. Technical Area 54 contains several waste pits for low-level radioactive waste and is the site of subsurface flow and transport studies, which makes the comparison of the two weather generators at this site particularly valuable.

  17. National Profiles in Technical and Vocational Education in Asia and the Pacific: Nepal.

    Science.gov (United States)

    United Nations Educational, Scientific and Cultural Organization, Bangkok (Thailand). Principal Regional Office for Asia and the Pacific.

    One of a series of studies on the development of technical and vocational education in the member states of UNESCO, this report profiles the educational system in Nepal. The four parts of the document provide general information about the following: the country; the history, goals, and structure of the educational system; vocational technical and…

  18. National Profiles in Technical and Vocational Education in Asia and the Pacific: Indonesia.

    Science.gov (United States)

    United Nations Educational, Scientific and Cultural Organization, Bangkok (Thailand). Principal Regional Office for Asia and the Pacific.

    This technical and vocational education (TVE) profile on Indonesia is one in a series of profiles of UNESCO member countries. It is intended to be a handy reference on TVE systems, staff development, technical cooperation, and information networking. A two-page overview lists TVE goals, outlines the structure and governance, and lists the six…

  19. Validation of the inverse pulse wave transit time series as surrogate of systolic blood pressure in MVAR modeling.

    Science.gov (United States)

    Giassi, Pedro; Okida, Sergio; Oliveira, Maurício G; Moraes, Raimes

    2013-11-01

    Short-term cardiovascular regulation mediated by the sympathetic and parasympathetic branches of the autonomic nervous system has been investigated by multivariate autoregressive (MVAR) modeling, providing insightful analysis. MVAR models employ, as inputs, heart rate (HR), systolic blood pressure (SBP) and respiratory waveforms. ECG (from which HR series is obtained) and respiratory flow waveform (RFW) can be easily sampled from the patients. Nevertheless, the available methods for acquisition of beat-to-beat SBP measurements during exams hamper the wider use of MVAR models in clinical research. Recent studies show an inverse correlation between pulse wave transit time (PWTT) series and SBP fluctuations. PWTT is the time interval between the ECG R-wave peak and photoplethysmography waveform (PPG) base point within the same cardiac cycle. This study investigates the feasibility of using inverse PWTT (IPWTT) series as an alternative input to SBP for MVAR modeling of the cardiovascular regulation. For that, HR, RFW, and IPWTT series acquired from volunteers during postural changes and autonomic blockade were used as input of MVAR models. Obtained results show that IPWTT series can be used as input of MVAR models, replacing SBP measurements in order to overcome practical difficulties related to the continuous sampling of the SBP during clinical exams.

  20. Modeling Financial Time Series Based on a Market Microstructure Model with Leverage Effect

    Directory of Open Access Journals (Sweden)

    Yanhui Xi

    2016-01-01

    Full Text Available The basic market microstructure model specifies that the price/return innovation and the volatility innovation are independent Gaussian white noise processes. However, the financial leverage effect has been found to be statistically significant in many financial time series. In this paper, a novel market microstructure model with leverage effects is proposed. The model specification assumed a negative correlation in the errors between the price/return innovation and the volatility innovation. With the new representations, a theoretical explanation of leverage effect is provided. Simulated data and daily stock market indices (Shanghai composite index, Shenzhen component index, and Standard and Poor’s 500 Composite index via Bayesian Markov Chain Monte Carlo (MCMC method are used to estimate the leverage market microstructure model. The results verify the effectiveness of the model and its estimation approach proposed in the paper and also indicate that the stock markets have strong leverage effects. Compared with the classical leverage stochastic volatility (SV model in terms of DIC (Deviance Information Criterion, the leverage market microstructure model fits the data better.
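
    The key specification, negatively correlated return and volatility innovations, can be sketched by simulating a stylized stochastic-volatility model with leverage. All parameter values are illustrative assumptions, not the estimates from the paper, and the estimation (MCMC) step is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(5)

# Stylized SV-with-leverage simulation: log-volatility h_t follows an
# AR(1), and its innovation is negatively correlated (rho < 0) with the
# return innovation, which produces the leverage effect.
n, mu, phi, tau, rho = 5000, -0.5, 0.95, 0.2, -0.6
cov = np.array([[1.0, rho * tau],
                [rho * tau, tau ** 2]])      # joint shock covariance
eps = rng.multivariate_normal([0.0, 0.0], cov, size=n)
h = np.zeros(n)  # log-volatility
r = np.zeros(n)  # returns
for t in range(1, n):
    h[t] = mu + phi * (h[t - 1] - mu) + eps[t, 1]
    r[t] = np.exp(h[t] / 2) * eps[t, 0]

# Leverage shows up empirically as negative correlation between today's
# return and tomorrow's squared return (a crude volatility proxy).
lev = np.corrcoef(r[1:-1], r[2:] ** 2)[0, 1]
print(round(lev, 3))
```

Setting rho to zero recovers the basic independent-innovations specification, which is exactly the restriction the paper relaxes.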

  1. Electrical Power and Illumination Systems. Energy Technology Series.

    Science.gov (United States)

    Center for Occupational Research and Development, Inc., Waco, TX.

    This course in electrical power and illumination systems is one of 16 courses in the Energy Technology Series developed for an Energy Conservation-and-Use Technology curriculum. Intended for use in two-year postsecondary technical institutions to prepare technicians for employment, the courses are also useful in industry for updating employees in…

  2. Mathematical model for estimating of technical and technological indicators of railway stations operation

    Directory of Open Access Journals (Sweden)

    D.M. Kozachenko

    2013-06-01

    Full Text Available Purpose. The article aims to create a mathematical model of railway station functioning for solving problems of station technology development on the basis of the plan-schedule. Methodology. The methods of graph theory and object-oriented analysis are used. The model of the station activity plan-schedule includes a model of the technical equipment of the station (the plan-schedule net) and a model of the station functioning, both formalized on the basis of parametric graphs. Findings. The presented model is implemented as an application to the graphics package AutoCAD. The software is developed in Visual LISP and Visual Basic. Since the construction of the plan-schedule is mostly a traditional process of adding, deleting, and modifying icons, the developed interface is intuitive for a technologist and requires practically no additional training. Originality. A mathematical model was created on the basis of graph theory and object-oriented analysis in order to evaluate the technical and process indicators of railway stations; it is focused on solving problems of the technology development of their work. Practical value. The proposed mathematical model is implemented as an application to the graphics package AutoCAD. The availability of a mathematical model allows automatic analysis of the plan-schedule and thereby reduces the time of its creation by more than half.

  3. FY 1999 Scientific and Technical Reports, Articles, Papers, and Presentations

    Science.gov (United States)

Turner-Waits, Joyce E.

    2000-01-01

    This document presents formal NASA technical reports, papers published in technical journals, and presentations by MSFC personnel in FY99. It also includes papers of MSFC contractors. All of the NASA series reports may be obtained from the NASA Center for AeroSpace Information (CASI), 7121 Standard Drive, Hanover, MD 21076-1320. The information in this report may be of value to the scientific and engineering community in determining what information has been published and what is available.

  4. A multivariate time series approach to modeling and forecasting demand in the emergency department.

    Science.gov (United States)

    Jones, Spencer S; Evans, R Scott; Allen, Todd L; Thomas, Alun; Haug, Peter J; Welch, Shari J; Snow, Gregory L

    2009-02-01

    The goals of this investigation were to study the temporal relationships between the demands for key resources in the emergency department (ED) and the inpatient hospital, and to develop multivariate forecasting models. Hourly data were collected from three diverse hospitals for the year 2006. Descriptive analysis and model fitting were carried out using graphical and multivariate time series methods. Multivariate models were compared to a univariate benchmark model in terms of their ability to provide out-of-sample forecasts of ED census and the demands for diagnostic resources. Descriptive analyses revealed little temporal interaction between the demand for inpatient resources and the demand for ED resources at the facilities considered. Multivariate models provided more accurate forecasts of ED census and of the demands for diagnostic resources. Our results suggest that multivariate time series models can be used to reliably forecast ED patient census; however, forecasts of the demands for diagnostic resources were not sufficiently reliable to be useful in the clinical setting.
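
    The multivariate idea can be sketched as a VAR(1) fitted by least squares to two synthetic coupled demand series and then used for a one-step forecast. This is a generic illustration of the technique, not the models fitted in the study.

```python
import numpy as np

rng = np.random.default_rng(6)

# Two coupled hourly demand series (synthetic stand-ins for ED census
# and diagnostic-test demand), each depending on both lagged values.
n = 500
A1 = np.array([[0.6, 0.2],
               [0.3, 0.5]])  # lag-1 coefficient matrix
Y = np.zeros((n, 2))
for t in range(1, n):
    Y[t] = A1 @ Y[t - 1] + rng.normal(0, 1, 2)

# Fit a VAR(1) by multivariate least squares: regress Y_t on [1, Y_{t-1}].
X = np.column_stack([np.ones(n - 1), Y[:-1]])
B, *_ = np.linalg.lstsq(X, Y[1:], rcond=None)  # (3, 2): intercepts + A1^T
y_next = np.array([1.0, *Y[-1]]) @ B           # one-step forecast
print(np.round(y_next, 2))
```

The off-diagonal entries of the fitted coefficient matrix are what a univariate benchmark cannot capture: the extent to which one demand series helps predict the other.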

  5. Characteristics of the LeRC/Hughes J-series 30-cm engineering model thruster

    Science.gov (United States)

    Collett, C. R.; Poeschel, R. L.; Kami, S.

    1981-01-01

    As a consequence of endurance and structural tests performed on 900-series engineering model thrusters (EMT), several modifications in design were found to be necessary for achieving performance goals. The modified thruster is known as the J-series EMT. The most important of the design modifications affect the accelerator grid, gimbal mount, cathode polepiece, and wiring harness. The paper discusses the design modifications incorporated, the condition(s) they corrected, and the characteristics of the modified thruster.

  6. Image reconstruction method for electrical capacitance tomography based on the combined series and parallel normalization model

    International Nuclear Information System (INIS)

    Dong, Xiangyuan; Guo, Shuqing

    2008-01-01

    In this paper, a novel image reconstruction method for electrical capacitance tomography (ECT) based on the combined series and parallel normalization model is presented. A regularization technique is used to obtain a stabilized solution of the inverse problem. Also, the adaptive coefficient of the combined model is deduced by numerical optimization. Simulation results indicate that it can produce higher-quality images than the algorithms based on the parallel or series models alone for the cases tested in this paper. It provides a new algorithm for ECT applications.
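
    A sketch of the normalization step, under the assumption that the combined model is a convex combination of the standard parallel and series capacitance models with coefficient alpha. The paper derives alpha by numerical optimization; here it is a fixed illustrative value, and the calibration capacitances are hypothetical.

```python
def normalize_combined(C, C_low, C_high, alpha=0.5):
    """Normalized capacitance as a convex combination of the parallel
    model (linear in C) and the series model (linear in 1/C).
    C_low/C_high: capacitances for the empty/full calibration states."""
    g_parallel = (C - C_low) / (C_high - C_low)
    g_series = (1 / C - 1 / C_low) / (1 / C_high - 1 / C_low)
    return alpha * g_parallel + (1 - alpha) * g_series

# Hypothetical empty/full calibration capacitances and one measurement.
print(round(normalize_combined(1.5, 1.0, 2.0, alpha=0.4), 3))
```

Both component models map the calibration endpoints to 0 and 1, so any convex combination does too; alpha only reshapes the response between the endpoints.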

  7. Series expansions without diagrams

    International Nuclear Information System (INIS)

    Bhanot, G.; Creutz, M.; Horvath, I.; Lacki, J.; Weckel, J.

    1994-01-01

    We discuss the use of recursive enumeration schemes to obtain low- and high-temperature series expansions for discrete statistical systems. Using linear combinations of generalized helical lattices, the method is competitive with diagrammatic approaches and is easily generalizable. We illustrate the approach using Ising and Potts models. We present low-temperature series results in up to five dimensions and high-temperature series in three dimensions. The method is general and can be applied to any discrete model

  8. Regression and regression analysis time series prediction modeling on climate data of quetta, pakistan

    International Nuclear Information System (INIS)

    Jafri, Y.Z.; Kamal, L.

    2007-01-01

    Various statistical techniques were used on five years of data (1998 to 2002) for average humidity, rainfall, and maximum and minimum temperatures. Regression analysis time series (RATS) relationships were developed to determine the overall trend of these climate parameters, on the basis of which forecast models can be corrected and modified. We computed the coefficient of determination as a measure of goodness of fit for our polynomial regression analysis time series (PRATS). Multiple linear regression (MLR) and multiple linear regression analysis time series (MLRATS) models were also developed for deciphering the interdependence of weather parameters. Spearman's rank correlation and the Goldfeld-Quandt test were used to check the uniformity or non-uniformity of variances in our polynomial regression (PR) fit. The Breusch-Pagan test was applied to the MLR and MLRATS models, which yielded homoscedasticity. We also employed Bartlett's test for homogeneity of variances on the five-year rainfall and humidity data, which showed that the variances in the rainfall data were not homogeneous, while those in the humidity data were. Our results on regression and regression analysis time series show the best fit for prediction modeling on the climatic data of Quetta, Pakistan. (author)
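
    A minimal sketch of polynomial regression on a time index with the coefficient of determination as the goodness-of-fit measure. The synthetic series below merely stands in for the Quetta record; the degree and data are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic monthly climate series (60 months standing in for the
# 1998-2002 record): a smooth trend plus observation noise.
months = np.arange(60, dtype=float)
series = 18 + 0.1 * months - 0.001 * months ** 2 + rng.normal(0, 0.5, 60)

# Polynomial regression on the time index and its R^2.
coeffs = np.polyfit(months, series, deg=3)
fitted = np.polyval(coeffs, months)
ss_res = np.sum((series - fitted) ** 2)
ss_tot = np.sum((series - series.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
print(round(r2, 3))
```

R^2 compares the residual sum of squares of the fit against the variance of the raw series, so it directly quantifies how much of the month-to-month variation the polynomial trend explains.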

  9. Microsurgical Bypass Training Rat Model, Part 1: Technical Nuances of Exposure of the Aorta and Iliac Arteries.

    Science.gov (United States)

    Tayebi Meybodi, Ali; Lawton, Michael T; Mokhtari, Pooneh; Yousef, Sonia; Gandhi, Sirin; Benet, Arnau

    2017-11-01

    Animal models using rodents are frequently used for practicing microvascular anastomosis, an essential technique in cerebrovascular surgery. However, safely and efficiently exposing a rat's target vessels is technically difficult. Such difficulty may lead to excessive hemorrhage and shortened animal survival. This limits the ability to perform multiple anastomoses on a single animal and may increase overall training time and costs. We report our model for microsurgical bypass training in rodents in 2 consecutive articles. In part 1, we describe the technical nuances for safe and efficient exposure of the rat abdominal aorta and common iliac arteries (CIAs) for bypass. Over a 2-year period, 50 Sprague-Dawley rats underwent inhalant anesthesia for practicing microvascular anastomosis on the abdominal aorta and CIAs. Lessons learned regarding the technical nuances of vessel exposure were recorded. Several technical nuances were important for avoiding intraoperative bleeding and preventing animal demise while preparing an adequate length of vessels for bypass. The most relevant technical nuances include (1) generous subcutaneous dissection; (2) use of cotton swabs for the blunt dissection of the retroperitoneal fat; (3) a combination of sharp and blunt dissection to isolate the aorta and iliac arteries from the accompanying veins; (4) proper control of the posterior branches of the aorta; and (5) efficient division and mobilization of the left renal pedicle. Applying the aforementioned technical nuances enables safe and efficient preparation of the rat abdominal aorta and CIAs for microvascular anastomosis. Copyright © 2017 Elsevier Inc. All rights reserved.

  10. 75 FR 28480 - Airworthiness Directives; Airbus Model A300 Series Airplanes; Model A300 B4-600, B4-600R, F4-600R...

    Science.gov (United States)

    2010-05-21

    ... Airworthiness Directives; Airbus Model A300 Series Airplanes; Model A300 B4-600, B4-600R, F4-600R Series..., B4-622, B4- 605R, B4-622R, F4-605R, F4-622R, and C4-605R Variant F airplanes; and Model A310-203...

  11. Constructing the reduced dynamical models of interannual climate variability from spatial-distributed time series

    Science.gov (United States)

    Mukhin, Dmitry; Gavrilov, Andrey; Loskutov, Evgeny; Feigin, Alexander

    2016-04-01

    We suggest a method for the empirical forecast of climate dynamics based on the reconstruction of reduced dynamical models in the form of random dynamical systems [1,2] derived from observational time series. The construction of a proper embedding - the set of variables determining the phase space in which the model works - is without doubt the most important step in such modeling, but the task is non-trivial due to the huge dimension of time series of typical climatic fields. In practice, an appropriate expansion of the observational time series is needed, yielding a number of principal components to be considered as phase variables that are efficient for the construction of a low-dimensional evolution operator. We emphasize two main features the reduced models should have in order to capture the main dynamical properties of the system: (i) accounting for time-lagged teleconnections in the atmosphere-ocean system and (ii) reflecting the nonlinear nature of these teleconnections. In accordance with these principles, in this report we present a methodology that combines a new way of constructing an embedding via spatio-temporal data expansion with nonlinear model construction on the basis of artificial neural networks. The methodology is applied to NCEP/NCAR reanalysis data, including fields of sea level pressure, geopotential height, and wind speed, covering the Northern Hemisphere. Its efficiency for the interannual forecast of various climate phenomena, including ENSO, PDO, NAO and strong blocking conditions over the mid-latitudes, is demonstrated. We also investigate the ability of the models to reproduce and predict the evolution of qualitative features of the dynamics, such as spectral peaks, critical transitions and statistics of extremes. This research was supported by the Government of the Russian Federation (Agreement No. 14.Z50.31.0033 with the Institute of Applied Physics RAS) [1] Y. I. Molkov, E. M. Loskutov, D. N. Mukhin, and A. M. Feigin, "Random
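    The two-step construction described above (an EOF/PCA expansion of the field followed by a time-lagged embedding of the leading principal components) can be sketched in a few lines. This is an illustrative reconstruction on synthetic data, not the authors' code; the grid size, number of retained PCs and lag count are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "climate field": 200 monthly snapshots of a 100-point grid,
# driven by two oscillating spatial patterns plus noise.
t = np.arange(200)
pat1 = rng.standard_normal(100)
pat2 = rng.standard_normal(100)
field = (np.outer(np.sin(2 * np.pi * t / 24), pat1)
         + np.outer(np.cos(2 * np.pi * t / 60), pat2)
         + 0.1 * rng.standard_normal((200, 100)))

# Step 1: EOF/PCA expansion -- keep the leading principal components.
anom = field - field.mean(axis=0)
u, s, vt = np.linalg.svd(anom, full_matrices=False)
pcs = u[:, :2] * s[:2]          # time series of the 2 leading PCs

# Step 2: time-lagged (delay) embedding of the PCs, so that a reduced
# model built on these variables can "see" lagged teleconnections.
lags = 3
embedded = np.hstack([pcs[lags - k: len(pcs) - k] for k in range(lags)])

print(embedded.shape)   # 2 PCs x 3 lags per row
```

A nonlinear evolution operator (e.g. a small neural network) would then be trained to map each row of `embedded` to the next state.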

  12. Construction of the exact Fisher information matrix of Gaussian time series models by means of matrix differential rules

    NARCIS (Netherlands)

    Klein, A.A.B.; Melard, G.; Zahaf, T.

    2000-01-01

    The Fisher information matrix is of fundamental importance for the analysis of parameter estimation of time series models. In this paper the exact information matrix of a multivariate Gaussian time series model expressed in state space form is derived. A computationally efficient procedure is used

  13. Modeling Time Series Data for Supervised Learning

    Science.gov (United States)

    Baydogan, Mustafa Gokce

    2012-01-01

    Temporal data are increasingly prevalent and important in analytics. Time series (TS) data are chronological sequences of observations and an important class of temporal data. Fields such as medicine, finance, learning science and multimedia naturally generate TS data. Each series provides a high-dimensional data vector that challenges the learning…

  14. Analysis Method of Combine Harvesters Technical Level by Functional and Structural Parameters

    Directory of Open Access Journals (Sweden)

    E. V. Zhalnin

    2018-01-01

    Full Text Available The analysis of modern methods for evaluating the technical level of grain harvesters revealed inconsistencies among the various criteria used: comparative parameters, dimensionless series, firm names, engine power, header width, capacity at the manufacturer's plant, and advertising brands. (Purpose of research) This has led to a proliferation of harvester model names, which significantly complicates the assessment of their technical level, complicates the farmer's choice of a suitable model, obscures the continuity between generations of combines, makes it impossible to analyze trends in their development, does not disclose the technological essence of a model and, most importantly, prevents combines from being compared with each other. The figures in the name of a harvester model are not functionally related to its main parameters and performance capabilities. (Materials and methods) A close correlation, in the form of a linear equation, was revealed between the design parameters of combines and their capacity. Verification of this equation during combine operation showed that it is statistically stable and that the estimates always fall within the confidence interval, with an error of 5-8 percent. It was found that, of the many factors affecting harvester performance per hour of net time, the four parameters most closely correlated with it are engine power and the areas of the separation concave, the straw walkers and the cleaning sieves. (Results and discussion) On the basis of the revealed correlation we propose a new method for assessing the technical level of combines, based on the throughput (kg/s) of the wetted material and a size series indicating the nominal productivity of the combine in centners of grain harvested per hour of basic time. The methodological background and mathematical apparatus

  15. Statistical models and time series forecasting of sulfur dioxide: a case study Tehran.

    Science.gov (United States)

    Hassanzadeh, S; Hosseinibalam, F; Alizadeh, R

    2009-08-01

    This study performed a time-series analysis, frequency distribution and prediction of SO2 levels for five stations (Pardisan, Vila, Azadi, Gholhak and Bahman) in Tehran for the period 2000-2005. Most sites show quite similar characteristics, with the highest pollution in autumn-winter and the least pollution in spring-summer. The frequency distributions show higher peaks at the two residential sites. The potential for SO2 problems is high because of high emissions and the close geographical proximity of the major industrial and urban centers. The ACF and PACF are nonzero for several lags, indicating a mixed (ARMA) model, so an ARMA model was used for forecasting SO2 at the Bahman station. The partial autocorrelations become close to 0 after about 5 lags, while the autocorrelations remain strong through all the lags shown. The results proved that an ARMA(2,2) model can provide reliable, satisfactory predictions for the time series.
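    The ACF/PACF-based model identification used in the abstract can be illustrated with a small sketch. The `sample_acf` helper and the AR(1) example below are illustrative only (the study itself fitted an ARMA(2,2) to the SO2 data): a slowly decaying ACF is the "tailing-off" pattern that, combined with the PACF shape, points to a mixed model.

```python
import numpy as np

def sample_acf(x, nlags):
    """Sample autocorrelation function up to nlags."""
    x = np.asarray(x, float) - np.mean(x)
    denom = np.dot(x, x)
    return np.array([1.0] + [np.dot(x[:-k], x[k:]) / denom
                             for k in range(1, nlags + 1)])

# Simulate an AR(1) process with phi = 0.7 for clarity.
rng = np.random.default_rng(1)
n = 5000
phi = 0.7
eps = rng.standard_normal(n)
x = np.zeros(n)
for i in range(1, n):
    x[i] = phi * x[i - 1] + eps[i]

acf = sample_acf(x, 5)
# For an AR(1) the theoretical ACF is phi**k: a slow geometric decay.
print(np.round(acf, 2))
```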

  16. Extracting Knowledge From Time Series An Introduction to Nonlinear Empirical Modeling

    CERN Document Server

    Bezruchko, Boris P

    2010-01-01

    This book addresses the fundamental question of how to construct mathematical models for the evolution of dynamical systems from experimentally-obtained time series. It places emphasis on chaotic signals and nonlinear modeling and discusses different approaches to the forecast of future system evolution. In particular, it teaches readers how to construct difference and differential model equations depending on the amount of a priori information that is available on the system in addition to the experimental data sets. This book will benefit graduate students and researchers from all natural sciences who seek a self-contained and thorough introduction to this subject.

  17. Fiscal Year 1986 Technical Objective Document (TOD).

    Science.gov (United States)

    1986-03-01

    abilities superior to other IR and manual turrets. - START DATE: FY 88 END DATE: FY 90" PROJECT TITLE: COMPOSITE METAL FIRES EE 62:06 JON: 2673XXXX...TECHNOLOGY: FIRE ELEMENT: INTERACTION DESCRIPTION (TECHNICAL OBJECTIVE) Evaluate a new series of agents "BORALONS" capable of extinguishing metal fires and...PROJECT TITLE: COMPOSITE METAL FIRES PE: 63723 JON: 2104XXXX

  18. Technical support non-SLB(GCDF)

    International Nuclear Information System (INIS)

    DePoorter, G.L.; Burton, B.W.

    1982-01-01

    The Los Alamos National Laboratory is providing technical support for the Greater Confinement Demonstration Facility (GCDF) at the Nevada Test Site. This technical support consists of computer modeling of the GCDF, design and emplacement of a Shallow Test Plot at NTS, and instrument testing at Los Alamos. Results to date on the computer modeling and the Shallow Test Plot are described

  19. Open-source Software for Demand Forecasting of Clinical Laboratory Test Volumes Using Time-series Analysis.

    Science.gov (United States)

    Mohammed, Emad A; Naugler, Christopher

    2017-01-01

    Demand forecasting is the area of predictive analytics devoted to predicting future volumes of services or consumables. Fair understanding and estimation of how demand will vary facilitates the optimal utilization of resources. In a medical laboratory, accurate forecasting of future demand, that is, test volumes, can increase efficiency and facilitate long-term laboratory planning. Importantly, in an era of utilization management initiatives, accurately predicted volumes compared to the realized test volumes can form a precise way to evaluate utilization management initiatives. Laboratory test volumes are often highly amenable to forecasting by time-series models; however, the statistical software needed to do this is generally either expensive or highly technical. In this paper, we describe an open-source web-based software tool for time-series forecasting and explain how to use it as a demand forecasting tool in clinical laboratories to estimate test volumes. This tool has three different models, that is, Holt-Winters multiplicative, Holt-Winters additive, and simple linear regression. Moreover, these models are ranked and the best one is highlighted. This tool will allow anyone with historic test volume data to model future demand.

  20. Open-source software for demand forecasting of clinical laboratory test volumes using time-series analysis

    Directory of Open Access Journals (Sweden)

    Emad A Mohammed

    2017-01-01

    Full Text Available Background: Demand forecasting is the area of predictive analytics devoted to predicting future volumes of services or consumables. Fair understanding and estimation of how demand will vary facilitates the optimal utilization of resources. In a medical laboratory, accurate forecasting of future demand, that is, test volumes, can increase efficiency and facilitate long-term laboratory planning. Importantly, in an era of utilization management initiatives, accurately predicted volumes compared to the realized test volumes can form a precise way to evaluate utilization management initiatives. Laboratory test volumes are often highly amenable to forecasting by time-series models; however, the statistical software needed to do this is generally either expensive or highly technical. Method: In this paper, we describe an open-source web-based software tool for time-series forecasting and explain how to use it as a demand forecasting tool in clinical laboratories to estimate test volumes. Results: This tool has three different models, that is, Holt-Winters multiplicative, Holt-Winters additive, and simple linear regression. Moreover, these models are ranked and the best one is highlighted. Conclusion: This tool will allow anyone with historic test volume data to model future demand.
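    As a rough sketch of what the first two of the three models do, here is a minimal additive Holt-Winters smoother in plain Python/NumPy. The smoothing constants and the test signal are arbitrary; the tool described in the paper presumably estimates such parameters from the historical volume data:

```python
import numpy as np

def holt_winters_additive(y, m, alpha=0.3, beta=0.1, gamma=0.2, horizon=1):
    """Minimal additive Holt-Winters smoother; m = season length."""
    y = np.asarray(y, float)
    level = y[:m].mean()
    trend = (y[m:2 * m].mean() - y[:m].mean()) / m
    season = list(y[:m] - level)
    for t in range(m, len(y)):
        s_old = season[t - m]
        new_level = alpha * (y[t] - s_old) + (1 - alpha) * (level + trend)
        trend = beta * (new_level - level) + (1 - beta) * trend
        season.append(gamma * (y[t] - new_level) + (1 - gamma) * s_old)
        level = new_level
    # h-step-ahead forecasts reuse the most recent seasonal estimates
    return np.array([level + (h + 1) * trend + season[len(y) + h - m]
                     for h in range(horizon)])

# Trend plus quarterly seasonality as a noiseless sanity check
t = np.arange(48)
pattern = np.array([2.0, -1.0, -2.0, 1.0])
y = 10 + 0.5 * t + pattern[t % 4]
fc = holt_winters_additive(y, m=4, horizon=4)
print(fc.round(2))
```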

  1. Contract Source Selection: An Analysis of Lowest Price Technically Acceptable and Tradeoff Strategies

    Science.gov (United States)

    2016-06-15

    using- spss - statistics.php Lamoureux, J., Murrow, M., & Walls, C. (2015). Relationship of source selection methods to contract outcomes: an analysis ...Contract Source Selection: an Analysis of Lowest Price Technically Acceptable and Tradeoff Strategies 15 June 2016 LCDR Jamal M. Osman, USN...ACQUISITION RESEARCH PROGRAM SPONSORED REPORT SERIES Contract Source Selection: an Analysis of Lowest Price Technically Acceptable and Tradeoff

  2. The MIDAS Touch: Mixed Data Sampling Regression Models

    OpenAIRE

    Ghysels, Eric; Santa-Clara, Pedro; Valkanov, Rossen

    2004-01-01

    We introduce Mixed Data Sampling (henceforth MIDAS) regression models. The regressions involve time series data sampled at different frequencies. Technically speaking, MIDAS models specify conditional expectations as a distributed lag of regressors recorded at some higher sampling frequencies. We examine the asymptotic properties of MIDAS regression estimation and compare it with traditional distributed lag models. MIDAS regressions have wide applicability in macroeconomics and finance.
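    MIDAS regressions typically parameterize the distributed lag with a small number of parameters. A common choice (not necessarily the one used in this paper) is the exponential Almon weighting, sketched here with hypothetical parameter values:

```python
import numpy as np

def exp_almon_weights(theta1, theta2, K):
    """Exponential Almon lag weights, normalized to sum to one."""
    k = np.arange(1, K + 1)
    w = np.exp(theta1 * k + theta2 * k ** 2)
    return w / w.sum()

# Collapse the 12 most recent monthly observations into one quarterly
# regressor value via the parameterized lag polynomial.
rng = np.random.default_rng(2)
w = exp_almon_weights(0.1, -0.05, 12)
monthly_block = rng.standard_normal(12)   # most recent high-frequency obs
x_midas = w @ monthly_block               # single low-frequency regressor

print(w.round(3), x_midas)
```

In a full MIDAS regression, theta1 and theta2 would be estimated jointly with the regression slope, typically by nonlinear least squares.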

  3. Socio-technical Betwixtness

    DEFF Research Database (Denmark)

    Bossen, Claus

    2017-01-01

    This chapter focusses on two challenges for socio-technical design: having to choose between different rationales for design, and the adequate understanding and depiction of the work to be redesigned. These two challenges betwixt the otherwise strong tenets of socio-technical design of pointing out the intrinsically social and technical interwovenness of design, and the necessity of including affected people and stakeholders in the design process. This betwixtness of socio-technical design is demonstrated by the analysis of two IT systems for healthcare: a foundational model for electronic healthcare records, and an IT system organizing hospital porters' work. The conceptual background for the analysis of the cases is provided by a short introduction to different rationales for organizational design, and by pointing to the differences between a linear, rationalistic versus an interactional depiction of work.

  4. Modeling the impact of forecast-based regime switches on macroeconomic time series

    NARCIS (Netherlands)

    K. Bel (Koen); R. Paap (Richard)

    2013-01-01

    Forecasts of key macroeconomic variables may lead to policy changes of governments, central banks and other economic agents. Policy changes in turn lead to structural changes in macroeconomic time series models. To describe this phenomenon we introduce a logistic smooth transition
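    The abstract breaks off here, but a logistic smooth transition model weights two regimes by a logistic function of a transition variable. A minimal sketch of that transition function, with arbitrary smoothness and threshold parameters:

```python
import numpy as np

def logistic_transition(s, gamma, c):
    """Logistic transition G(s); gamma sets smoothness, c the threshold."""
    return 1.0 / (1.0 + np.exp(-gamma * (s - c)))

# In a two-regime smooth transition AR model, the one-step dynamics are
#   y_t = (1 - G(s_t)) * phi1 * y_{t-1} + G(s_t) * phi2 * y_{t-1} + eps_t,
# where s_t is the (here, forecast-based) transition variable.
s = np.linspace(-2, 2, 5)
G = logistic_transition(s, gamma=5.0, c=0.0)
print(G.round(3))
```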

  5. Modelling and Analysing Socio-Technical Systems

    DEFF Research Database (Denmark)

    Aslanyan, Zaruhi; Ivanova, Marieta Georgieva; Nielson, Flemming

    2015-01-01

    Modern organisations are complex, socio-technical systems consisting of a mixture of physical infrastructure, human actors, policies and processes. An increasing number of attacks on these organisations exploits vulnerabilities on all different levels, for example combining a malware attack...

  6. Modeling and Analysing Socio-Technical Systems

    NARCIS (Netherlands)

    Aslanyan, Zaruhi; Ivanova, Marieta G.; Nielson, Flemming; Probst, Christian W.

    2015-01-01

    Modern organisations are complex, socio-technical systems consisting of a mixture of physical infrastructure, human actors, policies and processes. An increasing number of attacks on these organisations exploits vulnerabilities on all different levels, for example combining a malware attack with

  7. Clustering of financial time series

    Science.gov (United States)

    D'Urso, Pierpaolo; Cappelli, Carmela; Di Lallo, Dario; Massari, Riccardo

    2013-05-01

    This paper addresses the topic of classifying financial time series in a fuzzy framework, proposing two fuzzy clustering models both based on GARCH models. In general, clustering of financial time series, due to their peculiar features, requires the definition of suitable distance measures. To this aim, the first fuzzy clustering model exploits the autoregressive representation of GARCH models and employs, in the framework of a partitioning around medoids algorithm, the classical autoregressive metric. The second fuzzy clustering model, also based on the partitioning around medoids algorithm, uses the Caiado distance, a Mahalanobis-like distance based on estimated GARCH parameters and covariances that takes into account the information about the volatility structure of the time series. In order to illustrate the merits of the proposed fuzzy approaches, an application to the problem of classifying 29 time series of Euro exchange rates against international currencies is presented and discussed, also comparing the fuzzy models with their crisp versions.
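    The classical autoregressive metric used by the first model can be sketched as follows: fit AR coefficients to each series and measure the Euclidean distance between the coefficient vectors. This is an illustration only; the medoid-partitioning and fuzzy-membership steps are omitted, and the AR order is arbitrary:

```python
import numpy as np

def ar_coeffs(x, p):
    """Least-squares AR(p) coefficients of a series (the AR-metric features)."""
    X = np.column_stack([x[p - k - 1: len(x) - k - 1] for k in range(p)])
    y = x[p:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

rng = np.random.default_rng(3)

def simulate_ar1(phi, n=500):
    x = np.zeros(n)
    e = rng.standard_normal(n)
    for i in range(1, n):
        x[i] = phi * x[i - 1] + e[i]
    return x

# Two series from one dynamical regime, one from another
series = [simulate_ar1(0.8), simulate_ar1(0.8), simulate_ar1(-0.5)]
feats = np.array([ar_coeffs(s, p=2) for s in series])

def ar_dist(i, j):
    """AR metric: distance between fitted coefficient vectors."""
    return np.linalg.norm(feats[i] - feats[j])

print(ar_dist(0, 1), ar_dist(0, 2))
```

Series generated by the same process end up close in this metric, which is what the medoid-based clustering then exploits.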

  8. Prediction of traffic-related nitrogen oxides concentrations using Structural Time-Series models

    Science.gov (United States)

    Lawson, Anneka Ruth; Ghosh, Bidisha; Broderick, Brian

    2011-09-01

    Ambient air quality monitoring, modeling and compliance to the standards set by European Union (EU) directives and World Health Organization (WHO) guidelines are required to ensure the protection of human and environmental health. Congested urban areas are most susceptible to traffic-related air pollution which is the most problematic source of air pollution in Ireland. Long-term continuous real-time monitoring of ambient air quality at such urban centers is essential but often not realistic due to financial and operational constraints. Hence, the development of a resource-conservative ambient air quality monitoring technique is essential to ensure compliance with the threshold values set by the standards. As an intelligent and advanced statistical methodology, a Structural Time Series (STS) based approach has been introduced in this paper to develop a parsimonious and computationally simple air quality model. In STS methodology, the different components of a time-series dataset such as the trend, seasonal, cyclical and calendar variations can be modeled separately. To test the effectiveness of the proposed modeling strategy, average hourly concentrations of nitrogen dioxide and nitrogen oxides from a congested urban arterial in Dublin city center were modeled using STS methodology. The prediction error estimates from the developed air quality model indicate that the STS model can be a useful tool in predicting nitrogen dioxide and nitrogen oxides concentrations in urban areas and will be particularly useful in situations where the information on external variables such as meteorology or traffic volume is not available.

  9. Trend Estimation and Regression Analysis in Climatological Time Series: An Application of Structural Time Series Models and the Kalman Filter.

    Science.gov (United States)

    Visser, H.; Molenaar, J.

    1995-05-01

    The detection of trends in climatological data has become central to the discussion on climate change due to the enhanced greenhouse effect. To prove detection, a method is needed (i) to make inferences on significant rises or declines in trends, (ii) to take into account natural variability in climate series, and (iii) to compare output from GCMs with the trends in observed climate data. To meet these requirements, flexible mathematical tools are needed. A structural time series model is proposed with which a stochastic trend, a deterministic trend, and regression coefficients can be estimated simultaneously. The stochastic trend component is described using the class of ARIMA models. The regression component is assumed to be linear; however, the regression coefficients corresponding with the explanatory variables may be made time dependent to validate this assumption. The mathematical technique used to estimate this trend-regression model is the Kalman filter. The main features of the filter are discussed. Examples of trend estimation are given using annual mean temperatures at a single station in the Netherlands (1706-1990) and annual mean temperatures at Northern Hemisphere land stations (1851-1990). The inclusion of explanatory variables is shown by regressing the latter temperature series on four variables: Southern Oscillation index (SOI), volcanic dust index (VDI), sunspot numbers (SSN), and a simulated temperature signal induced by increasing greenhouse gases (GHG). In all analyses, the influence of SSN on global temperatures is found to be negligible. The correlations between temperatures and SOI and VDI appear to be negative. For SOI, this correlation is significant, but for VDI it is not, probably because of a lack of volcanic eruptions during the sample period. The relation between temperatures and GHG is positive, which is in agreement with the hypothesis of a warming climate because of increasing levels of greenhouse gases. The prediction performance of
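    The simplest structural time series model estimated by a Kalman filter is the local level model. The following sketch shows the basic predict/update recursion; it is not the authors' trend-regression model, which additionally carries a deterministic trend and regression terms, and the variance parameters here are chosen by hand rather than estimated:

```python
import numpy as np

def local_level_filter(y, q, r):
    """Kalman filter for the local level model:
       y_t = mu_t + eps_t (obs. variance r),  mu_t = mu_{t-1} + eta_t (var q)."""
    mu, p = y[0], r                      # start at the first observation
    level = []
    for obs in y:
        p = p + q                        # predict: level variance grows
        k = p / (p + r)                  # Kalman gain
        mu = mu + k * (obs - mu)         # update with the new observation
        p = (1 - k) * p
        level.append(mu)
    return np.array(level)

rng = np.random.default_rng(4)
true_trend = np.linspace(0.0, 2.0, 300)          # slowly rising "trend"
y = true_trend + 0.5 * rng.standard_normal(300)  # noisy observations
smoothed = local_level_filter(y, q=0.01, r=0.25)
print(smoothed[-1])
```

The filtered level tracks the underlying trend with much less noise than the raw series, which is the property the paper exploits for trend detection.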

  10. Creation and evaluation of a database of renewable production time series and other data for energy system modelling

    International Nuclear Information System (INIS)

    Janker, Karl Albert

    2015-01-01

    This thesis describes a model which generates renewable power generation time series as input data for energy system models. The focus is on photovoltaic systems and wind turbines. The basis is a high resolution global raster data set of weather data for many years. This data is validated, corrected and preprocessed. The composition of the hourly generation data is done via simulation of the respective technology. The generated time series are aggregated for different regions and are validated against historical production time series.

  11. Linear series of stellar models. Pt. 4. Helium-carbon stars of 3.5 M☉ and 1 M☉

    International Nuclear Information System (INIS)

    Kozlowski, M.; Paczynski, B.; Popova, K.

    1973-01-01

    One linear series of models for a star of 3.5 M☉ and two linear series of models for a star of 1 M☉ are constructed. The models consist of helium-rich envelopes (Y = 0.97, Z = 0.03) and pure carbon cores, and they have a rectangular helium profile Y(M_r). The linear series for a star of 3.5 M☉ begins on the normal branch of the helium main sequence and terminates on the normal branch of the carbon main sequence. This series has eight turning points at which the core mass attains a local extremum. One of the two linear series for a star of 1 M☉ begins on the normal branch of the helium main sequence, terminates on the high density branch of the helium main sequence, and has one turning point. The second linear series for a star of 1 M☉ begins on the normal branch of the carbon main sequence, terminates on the high density branch of the carbon main sequence, and has three turning points. Two such linear series may have a common bifurcation point for a star of about 1.26 M☉. (author)

  12. Technical and tactical action modeling of highly trained athletes specializing in breaststroke swimming at various length distances

    Directory of Open Access Journals (Sweden)

    Olga Pilipko

    2017-08-01

    Full Text Available Purpose: to define model parameters of the technical and tactical actions of highly trained athletes specializing in breaststroke swimming at distances of various lengths. Material & Methods: analysis of literary sources, video recording, timing, and methods of mathematical data processing. The surveyed contingent consisted of athletes who specialized in the 50, 100 and 200 meter breaststroke distances and held the sports qualification of Master of Sports of Ukraine or Master of Sports of International Grade. Results: the authors found that the technical and tactical actions of highly trained athletes during 50, 100 and 200 meter breaststroke swims have their own characteristics; the degree of influence of the speed, tempo and "step" of the stroke cycle on the result over 50, 100 and 200 meters was determined, and model characteristics of these indicators were developed. Conclusion: distance specialization in breaststroke swimming should be determined taking into account how closely an athlete's individual indicators of technical and tactical actions comply with the model parameters.

  13. Modeling technical change in energy system analysis: analyzing the introduction of learning-by-doing in bottom-up energy models

    International Nuclear Information System (INIS)

    Berglund, Christer; Soederholm, Patrik

    2006-01-01

    The main objective of this paper is to provide an overview and a critical analysis of the recent literature on incorporating induced technical change in energy systems models. Special emphasis is put on surveying recent studies aimed at integrating learning-by-doing into bottom-up energy systems models through so-called learning curves, and on analyzing the relevance of learning curve analysis for understanding the process of innovation and technology diffusion in the energy sector. The survey indicates that this model work represents a major advance in energy research and embeds important policy implications, not least concerning the cost and the timing of environmental policies (including carbon emission constraints). However, bottom-up energy models with endogenous learning are also limited in their characterization of technology diffusion and innovation. While they provide a detailed account of technical options, which is absent in many top-down models, they also lack important aspects of diffusion behavior that are captured in top-down representations. For instance, they often fail to capture strategic technology diffusion behavior in the energy sector as well as the energy sector's endogenous responses to policy, and they neglect important general equilibrium impacts (such as the opportunity cost of redirecting R and D support to the energy sector). Some suggestions on how innovation and diffusion modeling in bottom-up analysis can be improved are put forward.
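    The learning curves referred to here relate unit cost to cumulative installed capacity: cost falls by a fixed fraction (the learning rate) for every doubling of cumulative capacity. A minimal numerical sketch, with an assumed 20% learning rate:

```python
import numpy as np

def learning_curve_cost(c0, cum_capacity, learning_rate):
    """One-factor learning curve: unit cost falls by `learning_rate`
       (e.g. 0.20 = 20%) for every doubling of cumulative capacity."""
    b = -np.log2(1.0 - learning_rate)     # experience exponent
    return c0 * cum_capacity ** (-b)

# With a 20% learning rate, each doubling cuts unit cost by 20%:
cost = learning_curve_cost(1000.0, np.array([1.0, 2.0, 4.0, 8.0]), 0.20)
print(cost)   # unit costs: 1000, 800, 640, 512
```

Bottom-up models with endogenous learning embed this relation in the optimization, so early deployment of an expensive technology can pay off through lower future costs.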

  14. Technical writing versus technical writing

    Science.gov (United States)

    Dillingham, J. W.

    1981-01-01

    Two terms, two job categories, 'technical writer' and 'technical author' are discussed in terms of industrial and business requirements and standards. A distinction between 'technical writing' and technical 'writing' is made. The term 'technical editor' is also considered. Problems inherent in the design of programs to prepare and train students for these jobs are discussed. A closer alliance between industry and academia is suggested as a means of preparing students with competent technical communication skills (especially writing and editing skills) and good technical skills.

  15. A novel model for Time-Series Data Clustering Based on piecewise SVD and BIRCH for Stock Data Analysis on Hadoop Platform

    Directory of Open Access Journals (Sweden)

    Ibgtc Bowala

    2017-06-01

    Full Text Available With the rapid growth of financial markets, analysts are paying more attention to predictions. Stock data are time series data of huge volume. A feasible solution for handling the increasing amount of data is to use a cluster for parallel processing, and the Hadoop parallel computing platform is a typical representative. There are various statistical models for forecasting time series data, but accurate clusters are a prerequisite. Clustering analysis of time series data is one of the main methods for mining time series data for many other analysis processes. However, general clustering algorithms cannot cluster time series data, because such data have a special structure and a high dimensionality, with highly correlated values and a high noise level. A novel model for time series clustering is presented using BIRCH, based on piecewise SVD, leading to a novel dimension reduction approach. Highly correlated features are handled using SVD with a novel approach for dimensionality reduction in order to keep the correlated behavior optimal, and BIRCH is then used for clustering. The algorithm is a novel model that can handle massive time series data. Finally, this new model is successfully applied to real stock time series data from Yahoo Finance with satisfactory results.
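    A hedged sketch of the piecewise-SVD idea: segment each series, reduce each segment with a truncated SVD, and concatenate the segment scores as clustering features. The segment count and rank below are arbitrary, and the BIRCH step that would consume these features (e.g. `sklearn.cluster.Birch`) is omitted:

```python
import numpy as np

rng = np.random.default_rng(5)

# 20 noisy series of length 128, built from two shared base shapes
t = np.linspace(0, 1, 128)
base = np.vstack([np.sin(2 * np.pi * t), np.sign(np.sin(6 * np.pi * t))])
series = rng.standard_normal((20, 2)) @ base + 0.1 * rng.standard_normal((20, 128))

def piecewise_svd_features(X, n_segments=4, rank=2):
    """Split each series into segments and keep low-rank SVD scores per segment."""
    segs = np.split(X, n_segments, axis=1)
    feats = []
    for seg in segs:
        u, s, vt = np.linalg.svd(seg - seg.mean(axis=0), full_matrices=False)
        feats.append(u[:, :rank] * s[:rank])      # low-rank segment scores
    return np.hstack(feats)

feats = piecewise_svd_features(series)
print(feats.shape)   # 20 series x (4 segments * rank 2) features
```

The reduced feature matrix keeps the correlated behavior of each segment while shrinking the 128-dimensional series to 8 features per series.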

  16. Analysis of "The Wonderful Desert." Technical Report No. 170.

    Science.gov (United States)

    Green, G. M.; And Others

    This report presents a text analysis of "The Wonderful Desert," a brief selection from the "Reader's Digest Skill Builder" series. (The techniques used in arriving at the analysis are presented in a Reading Center Technical Report, Number 168, "Problems and Techniques of Text Analysis.") Tables are given for a…

  17. Extended causal modeling to assess Partial Directed Coherence in multiple time series with significant instantaneous interactions.

    Science.gov (United States)

    Faes, Luca; Nollo, Giandomenico

    2010-11-01

    The Partial Directed Coherence (PDC) and its generalized formulation (gPDC) are popular tools for investigating, in the frequency domain, the concept of Granger causality among multivariate (MV) time series. PDC and gPDC are formalized in terms of the coefficients of an MV autoregressive (MVAR) model which describes only the lagged effects among the time series and forsakes instantaneous effects. However, instantaneous effects are known to affect linear parametric modeling, and are likely to occur in experimental time series. In this study, we investigate the impact on the assessment of frequency domain causality of excluding instantaneous effects from the model underlying PDC evaluation. Moreover, we propose the utilization of an extended MVAR model including both instantaneous and lagged effects. This model is used to assess PDC either in accordance with the definition of Granger causality when considering only lagged effects (iPDC), or with an extended form of causality, when we consider both instantaneous and lagged effects (ePDC). The approach is first evaluated on three theoretical examples of MVAR processes, which show that the presence of instantaneous correlations may produce misleading profiles of PDC and gPDC, while ePDC and iPDC derived from the extended model provide here a correct interpretation of extended and lagged causality. It is then applied to representative examples of cardiorespiratory and EEG MV time series. They suggest that ePDC and iPDC are better interpretable than PDC and gPDC in terms of the known cardiovascular and neural physiologies.
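    The standard (lagged-effects-only) PDC that the paper extends can be computed directly from MVAR coefficients: with A̅(f) = I − Σ_k A_k e^(−i2πfk), PDC from channel j to i is |A̅_ij(f)| normalized by the column norm. An illustrative two-channel example with a hypothetical coefficient matrix:

```python
import numpy as np

def pdc(A, f, fs=1.0):
    """Partial Directed Coherence at frequency f from MVAR coefficient
       matrices A = [A_1, ..., A_p] (each m x m, lagged effects only)."""
    m = A[0].shape[0]
    Af = np.eye(m, dtype=complex)
    for k, Ak in enumerate(A, start=1):
        Af -= Ak * np.exp(-2j * np.pi * f * k / fs)
    denom = np.sqrt((np.abs(Af) ** 2).sum(axis=0))   # column-wise norm
    return np.abs(Af) / denom

# Two-channel VAR(1): x1 drives x2, no feedback
A1 = np.array([[0.5, 0.0],
               [0.4, 0.3]])
P = pdc([A1], f=0.1)
print(P.round(3))
```

Here `P[1, 0]` (influence of channel 1 on channel 2) is nonzero while `P[0, 1]` vanishes, matching the simulated coupling. The paper's extended model would add an instantaneous-effects matrix to this MVAR description.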

  18. Normalization of time-series satellite reflectance data to a standard sun-target-sensor geometry using a semi-empirical model

    Science.gov (United States)

    Zhao, Yongguang; Li, Chuanrong; Ma, Lingling; Tang, Lingli; Wang, Ning; Zhou, Chuncheng; Qian, Yonggang

    2017-10-01

    Time series of satellite reflectance data have been widely used to characterize environmental phenomena, describe trends in vegetation dynamics and study climate change. However, sensors with wide spatial coverage and high observation frequency are usually designed to have a large field of view (FOV), which causes variations in the sun-target-sensor geometry of time-series reflectance data. In this study, on the basis of the semi-empirical kernel-driven BRDF model, a new semi-empirical model was proposed to normalize the sun-target-sensor geometry of remote sensing images. To evaluate the proposed model, bidirectional reflectance under different canopy growth conditions simulated by the Discrete Anisotropic Radiative Transfer (DART) model was used. The semi-empirical model was first fitted using all simulated bidirectional reflectance. The experimental results showed a good fit between the bidirectional reflectance estimated by the proposed model and the simulated values. Then, MODIS time-series reflectance data were normalized to a common sun-target-sensor geometry by the proposed model. The experimental results showed that the proposed model yielded good fits between the observed and estimated values. The noise-like fluctuations in the time-series reflectance data were also reduced after the sun-target-sensor normalization process.
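    A kernel-driven BRDF model is linear in its kernel weights, so both the fit and the normalization to a standard geometry reduce to least squares. In this sketch the kernel values and the "standard geometry" values are placeholders, not the RossThick/LiSparse kernels actually evaluated at real angles:

```python
import numpy as np

rng = np.random.default_rng(6)

# Kernel-driven BRDF model: R = f_iso + f_vol * K_vol + f_geo * K_geo.
# K_vol and K_geo are placeholder kernel values standing in for volumetric
# and geometric kernels evaluated at each sun-target-sensor geometry.
n = 50
K_vol = rng.uniform(-0.3, 0.6, n)
K_geo = rng.uniform(-1.5, 0.0, n)
true_f = np.array([0.08, 0.05, 0.02])            # f_iso, f_vol, f_geo
X = np.column_stack([np.ones(n), K_vol, K_geo])
R = X @ true_f + 0.002 * rng.standard_normal(n)  # observed reflectance

# Least-squares inversion of the kernel weights
f_hat, *_ = np.linalg.lstsq(X, R, rcond=None)

# Normalization: evaluate the fitted model with the kernels at a single
# standard geometry (hypothetical kernel values below).
K_vol_std, K_geo_std = 0.0, -1.0
R_std = f_hat @ np.array([1.0, K_vol_std, K_geo_std])
print(f_hat.round(3), round(R_std, 4))
```

Applying the same inversion per pixel and per compositing window is what removes geometry-driven fluctuations from the time series.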

  19. Nagra technical report 14-02, geological basics - Dossier III - Long-term geological developments

    International Nuclear Information System (INIS)

    Schnellmann, M.; Madritsch, H.

    2014-01-01

    This dossier is the third of a series of eight reports concerning the safety and technical aspects of locations for the disposal of radioactive wastes in Switzerland. Dossier III examines long-term geological developments, covering the evolution of the topography and river networks of northern Switzerland over the past five million years. Data and information derived from high-resolution models and compilations of gravel deposition, glacier developments and moraines are reviewed. Tectonic developments, seismological aspects and erosion are discussed, together with their consequences for the long-term geological evolution of the proposed repository areas.

  20. International Work-Conference on Time Series

    CERN Document Server

    Pomares, Héctor; Valenzuela, Olga

    2017-01-01

    This volume of selected and peer-reviewed contributions on the latest developments in time series analysis and forecasting updates the reader on topics such as analysis of irregularly sampled time series, multi-scale analysis of univariate and multivariate time series, linear and non-linear time series models, advanced time series forecasting methods, applications in time series analysis and forecasting, advanced methods and online learning in time series and high-dimensional and complex/big data time series. The contributions were originally presented at the International Work-Conference on Time Series, ITISE 2016, held in Granada, Spain, June 27-29, 2016. The series of ITISE conferences provides a forum for scientists, engineers, educators and students to discuss the latest ideas and implementations in the foundations, theory, models and applications in the field of time series analysis and forecasting. It focuses on interdisciplinary and multidisciplinary research encompassing the disciplines of comput...

  1. Refining Markov state models for conformational dynamics using ensemble-averaged data and time-series trajectories

    Science.gov (United States)

    Matsunaga, Y.; Sugita, Y.

    2018-06-01

    A data-driven modeling scheme is proposed for conformational dynamics of biomolecules based on molecular dynamics (MD) simulations and experimental measurements. In this scheme, an initial Markov State Model (MSM) is constructed from MD simulation trajectories, and then, the MSM parameters are refined using experimental measurements through machine learning techniques. The second step can reduce the bias of MD simulation results due to inaccurate force-field parameters. Either time-series trajectories or ensemble-averaged data are available as a training data set in the scheme. Using a coarse-grained model of a dye-labeled polyproline-20, we compare the performance of machine learning estimations from the two types of training data sets. Machine learning from time-series data could provide the equilibrium populations of conformational states as well as their transition probabilities. It estimates hidden conformational states in more robust ways compared to that from ensemble-averaged data although there are limitations in estimating the transition probabilities between minor states. We discuss how to use the machine learning scheme for various experimental measurements including single-molecule time-series trajectories.
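    The initial MSM construction step (before the machine learning refinement from experimental data described in the abstract) amounts to counting transitions in a discretized trajectory and row-normalizing. A toy sketch:

```python
import numpy as np

def estimate_msm(dtraj, n_states, lag=1):
    """Row-stochastic MSM transition matrix from a discrete trajectory."""
    C = np.zeros((n_states, n_states))
    for a, b in zip(dtraj[:-lag], dtraj[lag:]):
        C[a, b] += 1.0                     # count observed transitions
    return C / C.sum(axis=1, keepdims=True)

# Toy two-state trajectory: mostly stays in a state, occasionally switches
dtraj = [0, 0, 0, 1, 1, 0, 0, 0, 0, 1, 1, 1, 0, 0]
T = estimate_msm(dtraj, 2)
print(T.round(2))
```

The refinement step would then adjust the entries of `T` (and the state populations) so that observables computed from the model match the experimental measurements.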

  2. Bridging the Gap Between the Social and the Technical: The Enrolment of Socio-Technical Information Architects to Cope with the Two-Level Model of EPR Systems.

    Science.gov (United States)

    Pedersen, Rune

    2017-01-01

    This is a project proposal driven by the need to redefine the governance of ICT in healthcare towards regional and national standardization of patient pathways. The focus is on a two-level approach to governing EPR systems in which clinicians model structured variables and patient pathways. The overall goal is a patient-centric EPR portfolio. This paper defines and highlights the need to establish the socio-technical architect role necessary to realize the capabilities of a modern structured EPR system, since clinicians alone cannot mediate between the technical and the clinical.

  3. Developing a local least-squares support vector machines-based neuro-fuzzy model for nonlinear and chaotic time series prediction.

    Science.gov (United States)

    Miranian, A; Abdollahzade, M

    2013-02-01

    Local modeling approaches, owing to their ability to model different operating regimes of nonlinear systems and processes by independent local models, seem appealing for modeling, identification, and prediction applications. In this paper, we propose a local neuro-fuzzy (LNF) approach based on least-squares support vector machines (LSSVMs). The proposed LNF approach employs LSSVMs, which are powerful in modeling and predicting time series, as local models and uses a hierarchical binary tree (HBT) learning algorithm for fast and efficient estimation of its parameters. The HBT algorithm heuristically partitions the input space into smaller subdomains by axis-orthogonal splits. In each partitioning, the validity functions automatically form a unity partition and therefore normalization side effects, e.g., reactivation, are prevented. Integration of LSSVMs into the LNF network as local models, along with the HBT learning algorithm, yields a high-performance approach for modeling and prediction of complex nonlinear time series. The proposed approach is applied to modeling and prediction of different nonlinear and chaotic real-world and hand-designed systems and time series. Analysis of the prediction results and comparisons with recent and earlier studies demonstrate the promising performance of the proposed LNF approach with the HBT learning algorithm for modeling and prediction of nonlinear and chaotic systems and time series.
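
A single local model of the kind described can be sketched as an LSSVM regressor with an RBF kernel, solving the standard LSSVM dual system with naive Gaussian elimination. The toy data (samples of sin on one "operating regime") and the hyperparameters `gamma` and `sigma` are illustrative assumptions; the HBT partitioning into local models is not reproduced here.

```python
import math

def rbf(x, z, sigma=1.0):
    return math.exp(-((x - z) ** 2) / (2.0 * sigma ** 2))

def solve(A, b):
    """Gaussian elimination with partial pivoting (small dense systems)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def lssvm_fit(xs, ys, gamma=100.0, sigma=1.0):
    """Solve the LSSVM dual system [[0, 1^T], [1, K + I/gamma]] [b; a] = [0; y]."""
    n = len(xs)
    A = [[0.0] * (n + 1) for _ in range(n + 1)]
    for i in range(n):
        A[0][i + 1] = A[i + 1][0] = 1.0
        for j in range(n):
            A[i + 1][j + 1] = rbf(xs[i], xs[j], sigma) + (1.0 / gamma if i == j else 0.0)
    sol = solve(A, [0.0] + list(ys))
    return sol[0], sol[1:]          # bias b, dual weights alpha

def lssvm_predict(x, xs, b, alpha, sigma=1.0):
    return b + sum(a * rbf(x, xi, sigma) for a, xi in zip(alpha, xs))

# Toy "local model": fit sin(x) on samples from one operating regime.
xs = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0]
ys = [math.sin(v) for v in xs]
b, alpha = lssvm_fit(xs, ys)
pred = lssvm_predict(1.25, xs, b, alpha)
```

In the full LNF network, one such model would be fitted per HBT subdomain and blended by the validity functions.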

  4. Data on copula modeling of mixed discrete and continuous neural time series.

    Science.gov (United States)

    Hu, Meng; Li, Mingyao; Li, Wu; Liang, Hualou

    2016-06-01

    The copula is an important tool for modeling neural dependence. Recent work has extended copulas to jointly model mixed time series in neuroscience ("Hu et al., 2016, Joint Analysis of Spikes and Local Field Potentials using Copula" [1]). Here we present further data for joint analysis of spikes and local field potentials (LFPs) with copula modeling. In particular, the details of different model orders and the influence of possible spike contamination in LFP data from the same and different electrode recordings are presented. To further facilitate the use of our copula model for the analysis of mixed data, we provide the Matlab codes, together with example data.
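
The idea of coupling a discrete and a continuous margin through one latent dependence structure can be sketched with a Gaussian copula: correlated standard normals are pushed through the normal CDF to uniforms, then through each margin's inverse CDF. The Poisson "spike count" margin, the exponential "LFP power" margin, and the parameters below are invented for illustration and are not the model of Hu et al.

```python
import math, random

def norm_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def poisson_ppf(u, lam):
    """Inverse CDF of a Poisson(lam) margin, by walking the cumulative pmf."""
    k, p, cum = 0, math.exp(-lam), 0.0
    while True:
        cum += p
        if u <= cum or k > 1000:
            return k
        k += 1
        p *= lam / k

random.seed(42)
rho, lam = 0.7, 3.0      # latent Gaussian correlation; mean spike count
spikes, lfp = [], []
for _ in range(5000):
    g1 = random.gauss(0.0, 1.0)
    g2 = rho * g1 + math.sqrt(1.0 - rho ** 2) * random.gauss(0.0, 1.0)
    u1, u2 = norm_cdf(g1), norm_cdf(g2)
    spikes.append(poisson_ppf(u1, lam))   # discrete margin: spike count
    lfp.append(-math.log(1.0 - u2))       # continuous margin: "LFP power"

mean_spikes = sum(spikes) / len(spikes)
mx, my = mean_spikes, sum(lfp) / len(lfp)
cov = sum((a - mx) * (b - my) for a, b in zip(spikes, lfp)) / len(spikes)
sx = (sum((a - mx) ** 2 for a in spikes) / len(spikes)) ** 0.5
sy = (sum((b - my) ** 2 for b in lfp) / len(lfp)) ** 0.5
corr = cov / (sx * sy)
```

Each margin keeps its own distribution while the copula alone carries the spike-LFP dependence, which is the appeal for mixed neural data.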

  5. Harmonic regression of Landsat time series for modeling attributes from national forest inventory data

    Science.gov (United States)

    Wilson, Barry T.; Knight, Joseph F.; McRoberts, Ronald E.

    2018-03-01

    Imagery from the Landsat Program has been used frequently as a source of auxiliary data for modeling land cover, as well as a variety of attributes associated with tree cover. With ready access to all scenes in the archive since 2008 due to the USGS Landsat Data Policy, new approaches to deriving such auxiliary data from dense Landsat time series are required. Several methods have previously been developed for use with finer temporal resolution imagery (e.g. AVHRR and MODIS), including image compositing and harmonic regression using Fourier series. The manuscript presents a study, using Minnesota, USA during the years 2009-2013 as the study area and timeframe. The study examined the relative predictive power of land cover models, in particular those related to tree cover, using predictor variables based solely on composite imagery versus those using estimated harmonic regression coefficients. The study used two common non-parametric modeling approaches (i.e. k-nearest neighbors and random forests) for fitting classification and regression models of multiple attributes measured on USFS Forest Inventory and Analysis plots using all available Landsat imagery for the study area and timeframe. The estimated Fourier coefficients developed by harmonic regression of tasseled cap transformation time series data were shown to be correlated with land cover, including tree cover. Regression models using estimated Fourier coefficients as predictor variables showed a two- to threefold increase in explained variance for a small set of continuous response variables, relative to comparable models using monthly image composites. Similarly, the overall accuracies of classification models using the estimated Fourier coefficients were approximately 10-20 percentage points higher than the models using the image composites, with corresponding individual class accuracies between six and 45 percentage points higher.
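
The harmonic-regression idea can be sketched as follows: for an evenly sampled series covering whole periods, the Fourier basis is orthogonal, so the least-squares coefficients reduce to simple projections. The toy "greenness" series with a single annual harmonic is an assumption for illustration; the study's predictors were such coefficients per tasseled cap band.

```python
import math

def harmonic_coefficients(y, n_harmonics=2):
    """Fourier-basis least squares for an evenly sampled series covering
    one full period: mean plus cosine/sine amplitude for each harmonic.
    With even sampling the basis is orthogonal, so projection equals OLS."""
    N = len(y)
    coeffs = {"mean": sum(y) / N}
    for k in range(1, n_harmonics + 1):
        c = 2.0 / N * sum(y[t] * math.cos(2 * math.pi * k * t / N) for t in range(N))
        s = 2.0 / N * sum(y[t] * math.sin(2 * math.pi * k * t / N) for t in range(N))
        coeffs[f"cos{k}"], coeffs[f"sin{k}"] = c, s
    return coeffs

# Toy "tasseled cap greenness" with one annual cycle over 36 observations.
N = 36
series = [0.3 + 0.2 * math.cos(2 * math.pi * t / N - 1.0) for t in range(N)]
fc = harmonic_coefficients(series)
```

The recovered mean, first-harmonic amplitude and phase summarize the seasonal trajectory in a few numbers, which is what makes such coefficients compact model predictors compared to monthly composites.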

  6. Performance Evaluation of Linear (ARMA) and Threshold Nonlinear (TAR) Time Series Models in Daily River Flow Modeling (Case Study: Upstream Basin Rivers of Zarrineh Roud Dam)

    Directory of Open Access Journals (Sweden)

    Farshad Fathian

    2017-01-01

    Introduction: Time series models are generally categorized as data-driven or mathematically based methods. They are among the most important tools for modeling and forecasting hydrological processes and are used in the design and scientific management of water resources projects. On the other hand, a better understanding of the river flow process is vital for appropriate streamflow modeling and forecasting. One of the main concerns of hydrological time series modeling is whether the hydrologic variable is governed by linear or nonlinear models through time. Although linear time series models have been widely applied in hydrology research, there has been some recent increasing interest in the application of nonlinear time series approaches. The threshold autoregressive (TAR) method is frequently applied in modeling the mean (first-order moment) of financial and economic time series. This type of model has not yet received considerable attention from the hydrological community. The main purposes of this paper are to analyze and discuss stochastic modeling of daily river flow time series of the study area using linear (ARMA: autoregressive moving average) and nonlinear (two- and three-regime TAR) models. Material and Methods: The study area comprises four sub-basins, namely Saghez Chai, Jighato Chai, Khorkhoreh Chai and Sarogh Chai from west to east, respectively, which discharge water into the Zarrineh Roud dam reservoir. River flow time series of six hydro-gauge stations located on the upstream rivers of the Zarrineh Roud dam (in the southern part of the Urmia Lake basin) were considered for modeling purposes. All data series start on January 1, 1997 and end on December 31, 2011.
    In this study, the daily river flow data from January 1, 1997 to December 31, 2009 (13 years) were chosen for calibration, and the data from January 1, 2010 to December 31, 2011 were reserved for validation.
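
A minimal sketch of the two-regime TAR idea the paper applies: simulate a series whose AR coefficient switches at a known threshold, then recover each regime's coefficient by ordinary least squares on the regime-sorted lagged pairs. The coefficients and threshold below are invented for illustration, not estimates from the Zarrineh Roud data.

```python
import random

random.seed(0)
# Simulate a two-regime TAR(1): the AR coefficient switches when the
# lagged value crosses the threshold r = 0.
phi_low, phi_high, r = 0.8, 0.2, 0.0
x = [0.0]
for _ in range(4000):
    phi = phi_low if x[-1] <= r else phi_high
    x.append(phi * x[-1] + random.gauss(0.0, 1.0))

def fit_ar1(pairs):
    """OLS slope through the origin for (x_{t-1}, x_t) pairs."""
    return sum(a * b for a, b in pairs) / sum(a * a for a, _ in pairs)

# Split the sample by the regime of the lagged value, then fit each regime.
low = [(x[t - 1], x[t]) for t in range(1, len(x)) if x[t - 1] <= r]
high = [(x[t - 1], x[t]) for t in range(1, len(x)) if x[t - 1] > r]
phi_low_hat, phi_high_hat = fit_ar1(low), fit_ar1(high)
```

In practice the threshold itself is also estimated (e.g. by grid search minimizing residual variance), and a linear ARMA fit serves as the benchmark, as in the study.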

  7. 75 FR 38017 - Airworthiness Directives; McDonnell Douglas Corporation Model DC-9-10 Series Airplanes, DC-9-30...

    Science.gov (United States)

    2010-07-01

    ... Airworthiness Directives; McDonnell Douglas Corporation Model DC- 9-10 Series Airplanes, DC-9-30 Series... previously to all known U.S. owners and operators of the McDonnell Douglas Corporation airplanes identified... INFORMATION: On July 15, 2009, we issued AD 2009-15-16, which applies to all McDonnell Douglas Model DC-9-10...

  8. A meta-analysis of motivational interviewing process: Technical, relational, and conditional process models of change.

    Science.gov (United States)

    Magill, Molly; Apodaca, Timothy R; Borsari, Brian; Gaume, Jacques; Hoadley, Ariel; Gordon, Rebecca E F; Tonigan, J Scott; Moyers, Theresa

    2018-02-01

    In the present meta-analysis, we test the technical and relational hypotheses of Motivational Interviewing (MI) efficacy. We also propose an a priori conditional process model where heterogeneity of technical path effect sizes should be explained by interpersonal/relational (i.e., empathy, MI Spirit) and intrapersonal (i.e., client treatment seeking status) moderators. A systematic review identified k = 58 reports, describing 36 primary studies and 40 effect sizes (N = 3,025 participants). Statistical methods calculated the inverse variance-weighted pooled correlation coefficient for the therapist to client and the client to outcome paths across multiple target behaviors (i.e., alcohol use, other drug use, other behavior change). Therapist MI-consistent skills were correlated with more client change talk (r = .55), and the technical hypothesis was supported. Specifically, proportion MI consistency was related to higher proportion change talk (r = .11, p = .004), and higher proportion change talk was related to reductions in risk behavior at follow-up (r = -.16). Heterogeneity of technical hypothesis path effect sizes was partially explained by inter- and intrapersonal moderators. This meta-analysis provides additional support for the technical hypothesis of MI efficacy; future research on the relational hypothesis should occur in the field rather than in the context of clinical trials. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
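
Inverse-variance-weighted pooling of correlation coefficients, as used in such meta-analyses, is commonly done on Fisher's z scale, where each transformed correlation has approximate variance 1/(n - 3). A sketch with hypothetical study-level effect sizes (not the paper's data):

```python
import math

def pooled_correlation(rs, ns):
    """Inverse-variance-weighted pooling of correlations: transform each r
    to Fisher's z (variance 1/(n-3)), average with weights n-3, then
    back-transform to the correlation scale."""
    zs = [0.5 * math.log((1 + r) / (1 - r)) for r in rs]
    ws = [n - 3 for n in ns]
    zbar = sum(w * z for w, z in zip(ws, zs)) / sum(ws)
    return math.tanh(zbar)   # inverse Fisher transform

# Hypothetical per-study correlations and sample sizes for one path.
rs = [0.55, 0.40, 0.62, 0.48]
ns = [60, 120, 45, 200]
r_pooled = pooled_correlation(rs, ns)
```

Larger studies receive proportionally more weight, so the pooled value sits closer to the r from the n = 200 study than a plain average would.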

  9. Radioactivity and Man Minicourse, Career Oriented Pre-Technical Physics.

    Science.gov (United States)

    Dallas Independent School District, TX.

    This instructional guide, intended for student use, develops the subject of radioactivity and man through a series of sequential activities. A technical development of the subject is pursued with examples stressing practical aspects of the concepts. Included in the minicourse are: (1) the rationale, (2) terminal behavioral objectives, (3) enabling…

  10. Assessing and improving the quality of modeling : a series of empirical studies about the UML

    NARCIS (Netherlands)

    Lange, C.F.J.

    2007-01-01

    Assessing and Improving the Quality of Modeling A Series of Empirical Studies about the UML This thesis addresses the assessment and improvement of the quality of modeling in software engineering. In particular, we focus on the Unified Modeling Language (UML), which is the de facto standard in

  11. Application of stochastic frontier approach model to assess technical ...

    African Journals Online (AJOL)

    Since almost all the arable land is under cultivation, future increase in maize production will heavily depend on technical efficiency and yield improvement rather than expansion in area under production. The main objective of this study was to determine the technical efficiency of smallholder maize production in Kenya.

  12. A Time-Series Water Level Forecasting Model Based on Imputation and Variable Selection Method

    OpenAIRE

    Jun-He Yang; Ching-Hsue Cheng; Chia-Pan Chan

    2017-01-01

    Reservoirs are important for households and impact the national economy. This paper proposed a time-series forecasting model based on estimating a missing value followed by variable selection to forecast the reservoir's water level. This study collected data from the Taiwan Shimen Reservoir as well as daily atmospheric data from 2008 to 2015. The two datasets are concatenated into an integrated dataset based on ordering of the data as a research dataset. The proposed time-series forecasting m...

  13. A Landsat Time-Series Stacks Model for Detection of Cropland Change

    Science.gov (United States)

    Chen, J.; Chen, J.; Zhang, J.

    2017-09-01

    Global, timely, accurate and cost-effective cropland monitoring at fine spatial resolution will dramatically improve our understanding of the effects of agriculture on greenhouse gas emissions, food safety, and human health. Time series of remote sensing imagery are particularly well suited to describing land cover dynamics. Traditional change detection techniques are often not capable of detecting land cover changes within time series that are strongly influenced by seasonal differences, and are therefore likely to generate pseudo changes. Here we introduce and test the LTSM (Landsat time-series stacks model), an improvement of the previously proposed Continuous Change Detection and Classification (CCDC) approach, to extract spectral trajectories of land surface change using dense Landsat time-series stacks (LTS). The method is expected to eliminate pseudo changes caused by seasonally driven phenology. The main idea is that, using all available Landsat 8 images within a year, an LTSM consisting of a two-term harmonic function is estimated iteratively for each pixel in each spectral band. The LTSM then defines change areas by differencing the predicted and observed Landsat images. The LTSM approach was compared with the change vector analysis (CVA) method. The results indicate that the LTSM method correctly detected "true change" without overestimating "false" change, while CVA identified "true change" pixels along with a large number of "false changes". The detection of change areas achieved an overall accuracy of 92.37 %, with a kappa coefficient of 0.676.
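
A simplified sketch in the spirit of LTSM/CCDC: fit a per-pixel harmonic model to a stable reference year, then flag observations in a later year whose residuals against the model prediction exceed three times the training RMSE. The phenology curve, noise level, and abrupt "clearing" event are invented for illustration.

```python
import math, random

random.seed(1)
N = 46   # roughly one year of 8- to 16-day Landsat observations

def fit_harmonic(y):
    """Mean plus first two harmonics, estimated by Fourier projection
    (equivalent to least squares for evenly sampled data)."""
    n = len(y)
    a0 = sum(y) / n
    terms = {}
    for k in (1, 2):
        c = 2.0 / n * sum(y[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        s = 2.0 / n * sum(y[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        terms[k] = (c, s)
    def predict(t):
        return a0 + sum(c * math.cos(2 * math.pi * k * t / n) +
                        s * math.sin(2 * math.pi * k * t / n)
                        for k, (c, s) in terms.items())
    return predict

def phenology(t):
    return 0.5 + 0.3 * math.sin(2 * math.pi * t / N)   # seasonal reflectance

# Train the per-pixel model on a stable reference year.
ref = [phenology(t) + random.gauss(0.0, 0.02) for t in range(N)]
model = fit_harmonic(ref)
rmse = (sum((ref[t] - model(t)) ** 2 for t in range(N)) / N) ** 0.5

# New year: identical phenology until the pixel is cleared at t = 30.
obs = [phenology(t) + random.gauss(0.0, 0.02) - (0.4 if t >= 30 else 0.0)
       for t in range(N)]
flagged = [t for t in range(N) if abs(obs[t] - model(t)) > 3 * rmse]
```

Because the harmonic model absorbs the seasonal cycle, the purely phenological ups and downs stay below the threshold while the abrupt land cover change stands out — the pseudo-change suppression the abstract describes.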

  14. Nonlinear detection of disordered voice productions from short time series based on a Volterra-Wiener-Korenberg model

    Energy Technology Data Exchange (ETDEWEB)

    Zhang Yu, E-mail: yuzhang@xmu.edu.cn [Key Laboratory of Underwater Acoustic Communication and Marine Information Technology of the Ministry of Education, Xiamen University, Xiamen Fujian 361005 (China); Sprecher, Alicia J. [Department of Surgery, Division of Otolaryngology - Head and Neck Surgery, University of Wisconsin School of Medicine and Public Health, Madison, WI 53792-7375 (United States); Zhao Zongxi [Key Laboratory of Underwater Acoustic Communication and Marine Information Technology of the Ministry of Education, Xiamen University, Xiamen Fujian 361005 (China); Jiang, Jack J. [Department of Surgery, Division of Otolaryngology - Head and Neck Surgery, University of Wisconsin School of Medicine and Public Health, Madison, WI 53792-7375 (United States)

    2011-09-15

    Highlights: > The VWK method effectively detects the nonlinearity of a discrete map. > The method describes the chaotic time series of a biomechanical vocal fold model. > Nonlinearity in laryngeal pathology is detected from short and noisy time series. - Abstract: In this paper, we apply the Volterra-Wiener-Korenberg (VWK) model method to detect nonlinearity in disordered voice productions. The VWK method effectively describes the nonlinearity of a third-order nonlinear map. It allows for the analysis of short and noisy data sets. The extracted VWK model parameters show agreement with the original nonlinear map parameters. Furthermore, the VWK model method is applied to successfully assess the nonlinearity of a biomechanical voice production model simulating irregular vibratory dynamics of vocal folds with a unilateral vocal polyp. Finally, we show the clinical applicability of this nonlinear detection method to analyze the electroglottographic data generated by 14 patients with vocal nodules or polyps. The VWK model method shows potential in describing the nonlinearity inherent in disordered voice productions from short and noisy time series that are common in the clinical setting.
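
The core VWK idea of comparing linear and polynomial (Volterra-type) predictors can be illustrated on a short series from a chaotic quadratic map: a second-order model captures the dynamics almost exactly, while the best linear model leaves large residuals. This is a simplified stand-in, not the full VWK procedure with noise titration and information criteria.

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting (small dense systems)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_rmse(X, y):
    """Least squares via normal equations; root-mean-square residual."""
    p = len(X[0])
    A = [[sum(row[i] * row[j] for row in X) for j in range(p)] for i in range(p)]
    b = [sum(row[i] * yi for row, yi in zip(X, y)) for i in range(p)]
    w = solve(A, b)
    res = [yi - sum(wi * xi for wi, xi in zip(w, row)) for row, yi in zip(X, y)]
    return (sum(e * e for e in res) / len(res)) ** 0.5

# Short, noise-free series from a chaotic quadratic map (a stand-in for
# the disordered-voice time series analyzed in the paper).
x = [0.3]
for _ in range(300):
    x.append(3.9 * x[-1] * (1.0 - x[-1]))

y = x[1:]
lin_X = [[1.0, v] for v in x[:-1]]          # linear predictor only
nl_X = [[1.0, v, v * v] for v in x[:-1]]    # adds a 2nd-order Volterra term
rms_lin, rms_nl = fit_rmse(lin_X, y), fit_rmse(nl_X, y)
```

A large gap between `rms_lin` and `rms_nl` is the signature of nonlinearity that the VWK comparison formalizes.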

  15. Nonlinear detection of disordered voice productions from short time series based on a Volterra-Wiener-Korenberg model

    International Nuclear Information System (INIS)

    Zhang Yu; Sprecher, Alicia J.; Zhao Zongxi; Jiang, Jack J.

    2011-01-01

    Highlights: → The VWK method effectively detects the nonlinearity of a discrete map. → The method describes the chaotic time series of a biomechanical vocal fold model. → Nonlinearity in laryngeal pathology is detected from short and noisy time series. - Abstract: In this paper, we apply the Volterra-Wiener-Korenberg (VWK) model method to detect nonlinearity in disordered voice productions. The VWK method effectively describes the nonlinearity of a third-order nonlinear map. It allows for the analysis of short and noisy data sets. The extracted VWK model parameters show agreement with the original nonlinear map parameters. Furthermore, the VWK model method is applied to successfully assess the nonlinearity of a biomechanical voice production model simulating irregular vibratory dynamics of vocal folds with a unilateral vocal polyp. Finally, we show the clinical applicability of this nonlinear detection method to analyze the electroglottographic data generated by 14 patients with vocal nodules or polyps. The VWK model method shows potential in describing the nonlinearity inherent in disordered voice productions from short and noisy time series that are common in the clinical setting.

  16. Nonlinear time series modeling and forecasting the seismic data of the Hindu Kush region

    Science.gov (United States)

    Khan, Muhammad Yousaf; Mittnik, Stefan

    2018-01-01

    In this study, we extended the application of linear and nonlinear time series models in the field of earthquake seismology and examined the out-of-sample forecast accuracy of linear Autoregressive (AR), Autoregressive Conditional Duration (ACD), Self-Exciting Threshold Autoregressive (SETAR), Threshold Autoregressive (TAR), Logistic Smooth Transition Autoregressive (LSTAR), Additive Autoregressive (AAR), and Artificial Neural Network (ANN) models for seismic data of the Hindu Kush region. We also extended previous studies by using Vector Autoregressive (VAR) and Threshold Vector Autoregressive (TVAR) models and compared their forecasting accuracy with the linear AR model. Unlike previous studies that typically specify threshold models using an internal threshold variable, we specified these models with external transition variables and compared their out-of-sample forecasting performance with the linear benchmark AR model. The modeling results show that the time series models used in the present study are capable of capturing the dynamic structure present in the seismic data. The point forecast results indicate that the AR model generally outperforms the nonlinear models. However, in some cases, threshold models with external threshold variable specifications produce more accurate forecasts, indicating that the specification of threshold time series models is of crucial importance. For raw seismic data, the ACD model does not show improved out-of-sample forecasting performance over the linear AR model. The results indicate that the AR model is the best forecasting device to model and forecast the raw seismic data of the Hindu Kush region.

  17. Time series modelling and forecasting of emergency department overcrowding.

    Science.gov (United States)

    Kadri, Farid; Harrou, Fouzi; Chaabane, Sondès; Tahon, Christian

    2014-09-01

    Efficient management of patient flow (demand) in emergency departments (EDs) has become an urgent issue for many hospital administrations. Today, more and more attention is being paid to hospital management systems to optimally manage patient flow and to improve management strategies, efficiency and safety in such establishments. To this end, EDs require significant human and material resources, but unfortunately these are limited. Within such a framework, the ability to accurately forecast demand in emergency departments has considerable implications for hospitals to improve resource allocation and strategic planning. The aim of this study was to develop models for forecasting daily attendances at the hospital emergency department in Lille, France. The study demonstrates how time-series analysis can be used to forecast, at least in the short term, demand for emergency services in a hospital emergency department. The forecasts were based on daily patient attendances at the paediatric emergency department in Lille regional hospital centre, France, from January 2012 to December 2012. An autoregressive integrated moving average (ARIMA) method was applied separately to each of the two GEMSA categories and total patient attendances. Time-series analysis was shown to provide a useful, readily available tool for forecasting emergency department demand.
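
In practice such forecasts would be fitted with a dedicated ARIMA implementation (e.g. statsmodels); the train/holdout evaluation workflow the study relies on can still be sketched library-free with a seasonal-naive baseline on synthetic daily attendances. All numbers below are invented, not Lille data.

```python
import random

random.seed(7)
# Synthetic daily ED attendances: weekday pattern plus noise.
weekday_level = [120, 110, 105, 105, 115, 140, 150]   # Mon..Sun
series = [max(0, round(weekday_level[d % 7] + random.gauss(0.0, 5.0)))
          for d in range(365)]

train, test = series[:337], series[337:]   # hold out the last 4 weeks

def seasonal_naive(history, horizon, period=7):
    """Forecast each future day with the value one period (week) earlier."""
    return [history[-period + (h % period)] for h in range(horizon)]

forecast = seasonal_naive(train, len(test))
mae_sn = sum(abs(f - a) for f, a in zip(forecast, test)) / len(test)

# Benchmark: a constant forecast at the training mean.
mean_level = sum(train) / len(train)
mae_mean = sum(abs(mean_level - a) for a in test) / len(test)
```

Comparing holdout MAE against a naive benchmark is the same out-of-sample logic used to judge the ARIMA models in the study; an ARIMA fit would simply replace `seasonal_naive`.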

  18. A Time-Series Water Level Forecasting Model Based on Imputation and Variable Selection Method.

    Science.gov (United States)

    Yang, Jun-He; Cheng, Ching-Hsue; Chan, Chia-Pan

    2017-01-01

    Reservoirs are important for households and impact the national economy. This paper proposed a time-series forecasting model based on estimating a missing value followed by variable selection to forecast the reservoir's water level. This study collected data from the Taiwan Shimen Reservoir as well as daily atmospheric data from 2008 to 2015. The two datasets are concatenated into an integrated dataset based on ordering of the data as a research dataset. The proposed time-series forecasting model summarily has three foci. First, this study uses five imputation methods to directly delete the missing value. Second, we identified the key variable via factor analysis and then deleted the unimportant variables sequentially via the variable selection method. Finally, the proposed model uses a Random Forest to build the forecasting model of the reservoir's water level. This was done to compare with the listing method under the forecasting error. These experimental results indicate that the Random Forest forecasting model when applied to variable selection with full variables has better forecasting performance than the listing model. In addition, this experiment shows that the proposed variable selection can help determine five forecast methods used here to improve the forecasting capability.

  19. Study on the Technical Efficiency of Creative Human Capital in China by Three-Stage Data Envelopment Analysis Model

    Directory of Open Access Journals (Sweden)

    Jian Ma

    2014-01-01

    Previous research has shown the positive effect of creative human capital and its development on economic development. Yet the technical efficiency of creative human capital and its effects are still under-researched. The authors estimate the technical efficiency value in the Chinese context, adjusted for environmental variables and statistical noise, by establishing a three-stage data envelopment analysis model using data from 2003 to 2010. The results indicate that, in this period, the technical efficiency of creative human capital in China as a whole, as well as in different regions and provinces, is still low and could be improved. Moreover, technical inefficiency derives mostly from scale inefficiency and is rarely driven by pure technical inefficiency. The research also examines the marked effects of environmental variables on technical efficiency and shows that different environmental variables differ in their effects. Expansion of education, development of a healthy environment, growth of GDP, development of skill training, and population migration can reduce the input of creative human capital and promote technical efficiency, while development of trade and institutional change, on the contrary, hinder the input of creative human capital and the promotion of technical efficiency.

  20. A dynamic model to explain hydration behaviour along the lanthanide series

    International Nuclear Information System (INIS)

    Duvail, M.; Spezia, R.; Vitorge, P.

    2008-01-01

    An understanding of the hydration structure of heavy atoms, such as transition metals, lanthanides and actinides, in aqueous solution is of fundamental importance in order to address their solvation properties and chemical reactivity. Herein we present a systematic molecular dynamics study of Ln3+ hydration in bulk water that can be used as a reference for experimental and theoretical research in this and related fields. Our study of hydration structure and dynamics along the entire Ln3+ series provides a dynamic picture of the change in coordination number (CN) from light (CN=9 predominating) to heavy (CN=8 predominating) lanthanides, consistent with the exchange mechanism proposed by Helm, Merbach and co-workers. This scenario is summarized in this work. The hydrated light lanthanides form stable TTP structures containing two kinds of water molecules: six forming the trigonal prism and three in the centre triangle. Towards the middle of the series, both ionic radii and polarizabilities decrease, such that first-shell water-water repulsion increases and water-cation attraction decreases. This mainly applies to molecules of the centre triangle of the nine-fold structure. Thus, one of these molecules stays in the second hydration sphere of the lanthanide for longer average times as one progresses along the lanthanide series. The interchange between predominantly CN=9 and CN=8 is found between Tb and Dy. Therefore, we propose a model that determines the properties governing the change in the first-shell coordination number across the series, confirming the basic hypothesis proposed by Helm and Merbach. We show that the change is not sudden, but rather results from the statistical predominance of a first hydration shell containing nine water molecules over one containing eight, observed progressively across the series. (O.M.)

  1. Mining Gene Regulatory Networks by Neural Modeling of Expression Time-Series.

    Science.gov (United States)

    Rubiolo, Mariano; Milone, Diego H; Stegmayer, Georgina

    2015-01-01

    Discovering gene regulatory networks from data is one of the most studied topics in recent years. Neural networks can be successfully used to infer an underlying gene network by modeling expression profiles as time series. This work proposes a novel method based on a pool of neural networks for obtaining a gene regulatory network from a gene expression dataset. The networks are used to model each possible interaction between pairs of genes in the dataset, and a set of mining rules is applied to accurately detect the underlying relations among genes. The results obtained on artificial and real datasets confirm the method's effectiveness for discovering regulatory networks from a proper modeling of the temporal dynamics of gene expression profiles.
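
A much-simplified stand-in for this pipeline (lagged correlation instead of a pool of neural networks) illustrates the idea of scoring every directed gene pair on time-shifted expression profiles and applying a mining rule to keep the strong interactions. The three synthetic genes and the 0.5 cutoff are assumptions for illustration.

```python
import math, random

random.seed(3)
T = 400
# Synthetic expression profiles: gene B follows gene A with a one-step
# lag; gene C is independent noise.
A = [random.gauss(0.0, 1.0) for _ in range(T)]
B = [0.0] + [0.9 * A[t - 1] + random.gauss(0.0, 0.1) for t in range(1, T)]
C = [random.gauss(0.0, 1.0) for _ in range(T)]
genes = {"A": A, "B": B, "C": C}

def lagged_corr(x, y, lag=1):
    """Pearson correlation between x_{t-lag} and y_t."""
    xs, ys = x[:-lag], y[lag:]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    vx = sum((a - mx) ** 2 for a in xs)
    vy = sum((b - my) ** 2 for b in ys)
    return cov / math.sqrt(vx * vy)

# "Mining rule": keep a directed edge when the lagged dependence is strong.
edges = {(src, tgt): lagged_corr(genes[src], genes[tgt])
         for src in genes for tgt in genes if src != tgt}
network = [pair for pair, score in edges.items() if abs(score) > 0.5]
```

In the paper, each pairwise score would instead come from a trained neural network's ability to predict the target profile from the regulator's, but the per-pair scoring and thresholding structure is the same.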

  2. Development of New Loan Payment Models with Piecewise Geometric Gradient Series

    Directory of Open Access Journals (Sweden)

    Erdal Aydemir

    2014-12-01

    Engineering economics plays an important role in decision making. Cash flows, the time value of money and interest rates are among the most important research topics in mathematical finance. Generalized formulae obtained from a variety of models of the time value of money and cash flows are inadequate for some problems. In this study, new generalized formulae are derived for the first time from a loan payment model in which a certain number of payment amounts are determined by the customer at the beginning of the payment period and the remaining repayments follow a piecewise geometric gradient series. Numerical examples with solutions are given for the developed models.
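
Under the assumptions of the title, a schedule with m customer-chosen fixed payments followed by geometrically growing payments can be solved for the first variable payment by equating present values. All loan figures below are hypothetical, and this is a sketch of the general idea rather than the paper's exact formulae.

```python
def base_payment(principal, i, fixed, m, n, g):
    """Solve for the first variable payment B so that the present value of
    m fixed payments plus (n - m) geometrically growing payments
    (B, B*(1+g), ...) equals the loan principal at rate i per period."""
    v = 1.0 / (1.0 + i)
    pv_fixed = sum(fixed * v ** k for k in range(1, m + 1))
    growth_pv = sum((1.0 + g) ** (k - m - 1) * v ** k for k in range(m + 1, n + 1))
    return (principal - pv_fixed) / growth_pv

# Hypothetical loan: 100,000 at 1% per month; 6 customer-chosen payments
# of 1,000, then 54 payments growing 0.5% per month.
principal, i, g = 100_000.0, 0.01, 0.005
fixed, m, n = 1_000.0, 6, 60
B = base_payment(principal, i, fixed, m, n, g)

# Verify: discounting the whole schedule recovers the principal.
schedule = [fixed] * m + [B * (1.0 + g) ** k for k in range(n - m)]
pv = sum(p / (1.0 + i) ** (k + 1) for k, p in enumerate(schedule))
```

Because the unknown B enters the present-value equation linearly, no iteration is needed; closed-form geometric-series identities would replace the explicit sums in the paper's formulae.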

  3. Virtual expansion of the technical vision system for smart vehicles based on multi-agent cooperation model

    Science.gov (United States)

    Krapukhina, Nina; Senchenko, Roman; Kamenov, Nikolay

    2017-12-01

    Road safety and driving in dense traffic flows pose challenges in receiving information about surrounding moving objects, some of which can be in the vehicle's blind spot. This work suggests an approach to virtual monitoring of the objects in a current road scene via a system with a multitude of cooperating smart vehicles exchanging information. It also describes the intelligent agent model, and provides methods and algorithms for identifying and evaluating various characteristics of moving objects in a video stream. The authors also suggest ways of integrating the information from the technical vision system into the model, further expanding virtual monitoring of the system's objects. Implementation of this approach can help to expand the virtual field of view of a technical vision system.

  4. Statistical tools for analysis and modeling of cosmic populations and astronomical time series: CUDAHM and TSE

    Science.gov (United States)

    Loredo, Thomas; Budavari, Tamas; Scargle, Jeffrey D.

    2018-01-01

    This presentation provides an overview of open-source software packages addressing two challenging classes of astrostatistics problems. (1) CUDAHM is a C++ framework for hierarchical Bayesian modeling of cosmic populations, leveraging graphics processing units (GPUs) to enable applying this computationally challenging paradigm to large datasets. CUDAHM is motivated by measurement error problems in astronomy, where density estimation and linear and nonlinear regression must be addressed for populations of thousands to millions of objects whose features are measured with possibly complex uncertainties, potentially including selection effects. An example calculation demonstrates accurate GPU-accelerated luminosity function estimation for simulated populations of 10^6 objects in about two hours using a single NVIDIA Tesla K40c GPU. (2) Time Series Explorer (TSE) is a collection of software in Python and MATLAB for exploratory analysis and statistical modeling of astronomical time series. It comprises a library of stand-alone functions and classes, as well as an application environment for interactive exploration of time series data. The presentation will summarize key capabilities of this emerging project, including new algorithms for analysis of irregularly sampled time series.

  5. Patient specific dynamic geometric models from sequential volumetric time series image data.

    Science.gov (United States)

    Cameron, B M; Robb, R A

    2004-01-01

    Generating patient specific dynamic models is complicated by the complexity of the motion intrinsic and extrinsic to the anatomic structures being modeled. Using a physics-based sequentially deforming algorithm, an anatomically accurate dynamic four-dimensional model can be created from a sequence of 3-D volumetric time series data sets. While such algorithms may accurately track the cyclic non-linear motion of the heart, they generally fail to accurately track extrinsic structural and non-cyclic motion. To accurately model these motions, we have modified a physics-based deformation algorithm to use a meta-surface defining the temporal and spatial maxima of the anatomic structure as the base reference surface. A mass-spring physics-based deformable model, which can expand or shrink with the local intrinsic motion, is applied to the meta-surface, deforming this base reference surface to the volumetric data at each time point. As the meta-surface encompasses the temporal maxima of the structure, any extrinsic motion is inherently encoded into the base reference surface, allowing the computation of the time point surfaces to be performed in parallel. The resultant 4-D model can be interactively transformed and viewed from different angles, showing the spatial and temporal motion of the anatomic structure. Using texture maps and per-vertex coloring, additional data such as physiological and/or biomechanical variables (e.g., mapping electrical activation sequences onto contracting myocardial surfaces) can be associated with the dynamic model, producing a 5-D model. For acquisition systems that may capture only limited time series data (e.g., only images at end-diastole/end-systole or inhalation/exhalation), this algorithm can provide useful interpolated surfaces between the time points. Such models help minimize the number of time points required to usefully depict the motion of anatomic structures for quantitative assessment of regional dynamics.

  6. The Development of Technical Services Training. Historical Paper 3

    Science.gov (United States)

    Dunkin, Paul S.

    2015-01-01

    In this article the author discusses the evolution of the profession of librarianship and the compromise of educating librarians in schools instead of by apprenticeship. He poses a series of questions, some more rhetorical than others: (1) Is Technical Services an intellectual concept or an administrative device?; (2) Can the routines and rules of…

  7. Forecasting Cryptocurrencies Financial Time Series

    OpenAIRE

    Catania, Leopoldo; Grassi, Stefano; Ravazzolo, Francesco

    2018-01-01

    This paper studies the predictability of cryptocurrencies time series. We compare several alternative univariate and multivariate models in point and density forecasting of four of the most capitalized series: Bitcoin, Litecoin, Ripple and Ethereum. We apply a set of crypto–predictors and rely on Dynamic Model Averaging to combine a large set of univariate Dynamic Linear Models and several multivariate Vector Autoregressive models with different forms of time variation. We find statistical si...

  8. Book Review: New Perspectives on Technical Editing

    Science.gov (United States)

    Murphy, A. J. (Ed.); Sterken, Christiaan

    2012-08-01

    skills, or do not have a very broad knowledge base. The language fluency of every contributor makes this book a pleasure to read, and this particular volume of Baywood's Technical Communications Series is very well edited. The subject index covers almost 8 two-column pages.

  9. Energy Economic Data Base (EEDB) Program. Technical Reference Book

    International Nuclear Information System (INIS)

    Allen, R.E.; Benedict, R.G.; Hodson, J.S.

    1983-09-01

    Purpose of the program is to develop current technical and cost information for nuclear and comparison electric power generating stations. Purpose of this Technical Reference Book is to provide the current technical design bases for each of the technical data models updated in the Sixth Update (1983). It contains a set of detailed system design descriptions for these technical data models, which are supplemented with engineering drawings. The system design descriptions reflect regulatory and industry practice and experience for nuclear and coal-fired power generating stations that are current for January 1, 1983

  10. CauseMap: fast inference of causality from complex time series.

    Science.gov (United States)

    Maher, M Cyrus; Hernandez, Ryan D

    2015-01-01

    Background. Establishing health-related causal relationships is a central pursuit in biomedical research. Yet, the interdependent non-linearity of biological systems renders causal dynamics laborious and at times impractical to disentangle. This pursuit is further impeded by the dearth of time series that are sufficiently long to observe and understand recurrent patterns of flux. However, as data generation costs plummet and technologies like wearable devices democratize data collection, we anticipate a coming surge in the availability of biomedically-relevant time series data. Given the life-saving potential of these burgeoning resources, it is critical to invest in the development of open source software tools that are capable of drawing meaningful insight from vast amounts of time series data. Results. Here we present CauseMap, the first open source implementation of convergent cross mapping (CCM), a method for establishing causality from long time series data (≳25 observations). Compared to existing time series methods, CCM has the advantage of being model-free and robust to unmeasured confounding that could otherwise induce spurious associations. CCM builds on Takens' Theorem, a well-established result from dynamical systems theory that requires only mild assumptions. This theorem allows us to reconstruct high dimensional system dynamics using a time series of only a single variable. These reconstructions can be thought of as shadows of the true causal system. If reconstructed shadows can predict points from opposing time series, we can infer that the corresponding variables are providing views of the same causal system, and so are causally related. Unlike traditional metrics, this test can establish the directionality of causation, even in the presence of feedback loops. Furthermore, since CCM can extract causal relationships from time series of, e.g., a single individual, it may be a valuable tool for personalized medicine. We implement CCM in Julia, a
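
The cross-mapping test this record describes (delay-embed one variable's time series, then predict the other variable from nearest neighbours on that shadow manifold) can be sketched compactly. This is a simplified illustration, not the CauseMap/Julia implementation; the embedding dimension, neighbour weighting, and the coupled logistic test system are invented for demonstration:

```python
import numpy as np

def delay_embed(ts, E, tau):
    """Build the E-dimensional delay-coordinate 'shadow' manifold of ts."""
    n = len(ts) - (E - 1) * tau
    return np.column_stack([ts[i * tau : i * tau + n] for i in range(E)])

def cross_map(source, target, E=3, tau=1, k=None):
    """Estimate `target` from the shadow manifold of `source`.

    Returns the Pearson correlation between true target values and the
    values cross-mapped from the source manifold's nearest neighbours
    (the CCM 'skill').
    """
    M = delay_embed(source, E, tau)
    t = target[(E - 1) * tau :]
    k = k or E + 1                       # simplex convention: E+1 neighbours
    est = np.empty(len(M))
    for i, p in enumerate(M):
        d = np.linalg.norm(M - p, axis=1)
        d[i] = np.inf                    # exclude the query point itself
        nn = np.argsort(d)[:k]
        w = np.exp(-d[nn] / max(d[nn][0], 1e-12))   # exponential weights
        est[i] = np.sum(w * t[nn]) / np.sum(w)
    return np.corrcoef(est, t)[0, 1]

# Unidirectionally coupled logistic maps: x drives y, so x should be
# recoverable from y's shadow manifold.
n = 500
x = np.empty(n); y = np.empty(n)
x[0], y[0] = 0.4, 0.2
for i in range(n - 1):
    x[i + 1] = x[i] * (3.8 - 3.8 * x[i])
    y[i + 1] = y[i] * (3.5 - 3.5 * y[i] - 0.3 * x[i])
skill = cross_map(y, x)   # read the driver x off y's shadow manifold
```

Note the direction: because x forces y, information about x is encoded in y's history, so cross-mapping *from* y's manifold recovers x, which is how CCM assigns causal direction.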

  11. CauseMap: fast inference of causality from complex time series

    Directory of Open Access Journals (Sweden)

    M. Cyrus Maher

    2015-03-01

    Background. Establishing health-related causal relationships is a central pursuit in biomedical research. Yet, the interdependent non-linearity of biological systems renders causal dynamics laborious and at times impractical to disentangle. This pursuit is further impeded by the dearth of time series that are sufficiently long to observe and understand recurrent patterns of flux. However, as data generation costs plummet and technologies like wearable devices democratize data collection, we anticipate a coming surge in the availability of biomedically-relevant time series data. Given the life-saving potential of these burgeoning resources, it is critical to invest in the development of open source software tools that are capable of drawing meaningful insight from vast amounts of time series data. Results. Here we present CauseMap, the first open source implementation of convergent cross mapping (CCM), a method for establishing causality from long time series data (≳25 observations). Compared to existing time series methods, CCM has the advantage of being model-free and robust to unmeasured confounding that could otherwise induce spurious associations. CCM builds on Takens’ Theorem, a well-established result from dynamical systems theory that requires only mild assumptions. This theorem allows us to reconstruct high dimensional system dynamics using a time series of only a single variable. These reconstructions can be thought of as shadows of the true causal system. If reconstructed shadows can predict points from opposing time series, we can infer that the corresponding variables are providing views of the same causal system, and so are causally related. Unlike traditional metrics, this test can establish the directionality of causation, even in the presence of feedback loops. Furthermore, since CCM can extract causal relationships from time series of, e.g., a single individual, it may be a valuable tool for personalized medicine. We implement

  12. Intraprocedural Safety and Technical Success of the MVP Micro Vascular Plug for Embolization of Pulmonary Arteriovenous Malformations.

    Science.gov (United States)

    Conrad, Miles B; Ishaque, Brandon M; Surman, Andrew M; Kerlan, Robert K; Hope, Michael D; Dickey, Melissa A; Hetts, Steven W; Wilson, Mark W

    2015-11-01

    This case series describes early experience, intraprocedural safety, and technical success of the MVP Micro Vascular Plug (MVP; Covidien, Irvine, California) for embolization of 20 pulmonary arteriovenous malformations (PAVMs) using 23 plugs in seven patients with hereditary hemorrhagic telangiectasia. There was no device migration, and all devices were successfully detached electrolytically. Immediate cessation of flow through the feeding artery was achieved in 21 of 23 (91%) deployments. There was one minor complication. This series demonstrates the MVP to be safe and technically successful in the treatment of PAVMs. Copyright © 2015 SIR. Published by Elsevier Inc. All rights reserved.

  13. Decoupling of modeling and measuring interval in groundwater time series analysis based on response characteristics

    NARCIS (Netherlands)

    Berendrecht, W.L.; Heemink, A.W.; Geer, F.C. van; Gehrels, J.C.

    2003-01-01

    A state-space representation of the transfer function-noise (TFN) model allows the choice of a modeling (input) interval that is smaller than the measuring interval of the output variable. Since in geohydrological applications the interval of the available input series (precipitation excess) is

  14. Technical and economic modelling of processes for liquid fuel production in Europe

    International Nuclear Information System (INIS)

    Bridgwater, A.V.; Double, J.M.

    1991-01-01

    The project which is described had the objective of examining the full range of technologies for liquid fuel production from renewable feedstocks in a technical and economic evaluation in order to identify the most promising technologies. The technologies considered are indirect thermochemical liquefaction (i.e. via gasification) to produce methanol, fuel alcohol or hydrocarbon fuels, direct thermochemical liquefaction or pyrolysis to produce hydrocarbon fuels and fermentation to produce ethanol. Feedstocks considered were wood, refuse derived fuel, straw, wheat and sugar beet. In order to carry out the evaluation, a computer model was developed, based on a unit process approach. Each unit operation is modelled as a process step, the model calculating the mass balance, energy balance and operating cost of the unit process. The results from the process step models are then combined to generate the mass balance, energy balance, capital cost and operating cost for the total process. The results show that the lowest production cost (£7/GJ) is obtained for methanol generated from a straw feedstock, but there is a moderate level of technical uncertainty associated with this result. The lowest production cost for hydrocarbon fuel (£8.6/GJ) is given by the pyrolysis process using a wood feedstock. This process has a high level of uncertainty. Fermentation processes showed the highest production costs, ranging from £14.4/GJ for a simple wood feedstock process to £25.2/GJ for a process based on sugar beet. The important conclusions are as follows: - In every case, the product cost is above current liquid fuel prices; - In most cases the feedstock cost dominates the production cost; - The most attractive products are thermochemically produced alcohol fuels.
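
The unit-process structure of the model (each step carries its own efficiency and costs; the chain is then rolled up into a production cost per GJ of product) can be sketched as follows. The step names, efficiencies, and all cost figures below are placeholders invented for illustration, not the paper's data:

```python
from dataclasses import dataclass

@dataclass
class Step:
    name: str
    efficiency: float      # fraction of input energy retained in the product
    capital_gbp: float     # annualised capital charge, GBP/year
    operating_gbp: float   # operating cost, GBP/year

def production_cost(feed_gj_per_year, feed_cost_gbp_per_gj, steps):
    """Roll a chain of unit-process steps up into (product GJ/yr,
    total GBP/yr, cost in GBP per GJ of product)."""
    energy = feed_gj_per_year
    total = feed_gj_per_year * feed_cost_gbp_per_gj   # feedstock cost
    for s in steps:
        energy *= s.efficiency                        # energy balance
        total += s.capital_gbp + s.operating_gbp      # cost rollup
    return energy, total, total / energy

# Hypothetical wood-to-methanol route (all numbers are placeholders).
steps = [
    Step("gasification", 0.75, 2.0e6, 1.0e6),
    Step("gas cleaning", 0.98, 0.5e6, 0.3e6),
    Step("methanol synthesis", 0.80, 3.0e6, 1.2e6),
]
product_gj, total_gbp, gbp_per_gj = production_cost(1.0e6, 2.5, steps)
```

Even in this toy, the feedstock term enters the total linearly and undiminished, which is why the paper finds the feedstock cost dominating the production cost in most cases.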

  15. Technical and economic modelling of processes for liquid fuel production in Europe

    Energy Technology Data Exchange (ETDEWEB)

    Bridgwater, A V; Double, J M [Aston Univ. Birmingham (GB). Dept of Chemical Engineering]

    1992-12-31

    The project which is described had the objective of examining the full range of technologies for liquid fuel production from renewable feedstocks in a technical and economic evaluation in order to identify the most promising technologies. The technologies considered are indirect thermochemical liquefaction (i.e. via gasification) to produce methanol, fuel alcohol or hydrocarbon fuels, direct thermochemical liquefaction or pyrolysis to produce hydrocarbon fuels and fermentation to produce ethanol. Feedstocks considered were wood, refuse derived fuel, straw, wheat and sugar beet. In order to carry out the evaluation, a computer model was developed, based on a unit process approach. Each unit operation is modelled as a process step, the model calculating the mass balance, energy balance and operating cost of the unit process. The results from the process step models are then combined to generate the mass balance, energy balance, capital cost and operating cost for the total process. The results show that the lowest production cost (£7/GJ) is obtained for methanol generated from a straw feedstock, but there is a moderate level of technical uncertainty associated with this result. The lowest production cost for hydrocarbon fuel (£8.6/GJ) is given by the pyrolysis process using a wood feedstock. This process has a high level of uncertainty. Fermentation processes showed the highest production costs, ranging from £14.4/GJ for a simple wood feedstock process to £25.2/GJ for a process based on sugar beet. The important conclusions are as follows: - In every case, the product cost is above current liquid fuel prices; - In most cases the feedstock cost dominates the production cost; - The most attractive products are thermochemically produced alcohol fuels.

  16. Crowd Sourcing for Challenging Technical Problems and Business Model

    Science.gov (United States)

    Davis, Jeffrey R.; Richard, Elizabeth

    2011-01-01

    Crowd sourcing may be defined as the act of outsourcing tasks that are traditionally performed by an employee or contractor to an undefined, generally large group of people or community (a crowd) in the form of an open call. The open call may be issued by an organization wishing to find a solution to a particular problem or complete a task, or by an open innovation service provider on behalf of that organization. In 2008, the Space Life Sciences Directorate (SLSD), with the support of Wyle Integrated Science and Engineering, established and implemented pilot projects in open innovation (crowd sourcing) to determine if these new internet-based platforms could indeed find solutions to difficult technical challenges. These unsolved technical problems were converted to problem statements, also called "Challenges" or "Technical Needs" by the various open innovation service providers, and were then posted externally to seek solutions. In addition, an open call was issued internally to NASA employees Agency-wide (10 Field Centers and NASA HQ) using an open innovation service provider crowd sourcing platform to post NASA challenges from each Center for the others to propose solutions. From 2008 to 2010, the SLSD issued 34 challenges, 14 externally and 20 internally. The 14 external problems or challenges were posted through three different vendors: InnoCentive, Yet2.com and TopCoder. The 20 internal challenges were conducted using the InnoCentive crowd sourcing platform designed for internal use by an organization. This platform was customized for NASA use and promoted as NASA@Work. The results were significant. Of the seven InnoCentive external challenges, two full and five partial awards were made in complex technical areas such as predicting solar flares and long-duration food packaging. Similarly, the TopCoder challenge yielded an optimization algorithm for designing a lunar medical kit.
The Yet2.com challenges yielded many new industry and academic contacts in bone

  17. Report of the Office of Nuclear Reactor Regulation technical assistance task force

    International Nuclear Information System (INIS)

    1981-11-01

    In 1981, the Director of the Office of Nuclear Reactor Regulation (NRR) of the Nuclear Regulatory Commission (NRC) chartered a task force to assess the office program of technical assistance and to recommend improvements. The task force divided the technical assistance program into four areas, and the practices in each area were assessed through a series of surveys of staff, management, and contractor personnel. In its interviews and assessments, the task force concentrated on the problem areas in the technical assistance program; the report therefore gives weight to the faults uncovered by those inquiries. The four major areas of technical assistance contracting studied were program planning, program management and execution, program control and management information systems, and program administration and coordination.

  18. The logic of Technical Standardisation

    DEFF Research Database (Denmark)

    Nedergaard, Peter

    In this paper technical standardisation is understood and explained in a model where economic analysis is coupled with an analysis of the political system as proposed in rational choice theory. The aim is to answer both the question why various countries (e.g. the United States versus European countries) let either the market or public intervention determine the mode of technical standardisation and the possible implications of these two ways of organizing technical standardisation from an economic and a political point of view. Based upon the analysis of the paper a couple of general policy recommendations are made concerning the mode of technical standardisation.

  19. Model for the respiratory modulation of the heart beat-to-beat time interval series

    Science.gov (United States)

    Capurro, Alberto; Diambra, Luis; Malta, C. P.

    2005-09-01

    In this study we present a model for the respiratory modulation of the heart beat-to-beat interval series. The model consists of a set of differential equations used to simulate the membrane potential of a single rabbit sinoatrial node cell, excited with a periodic input signal with added correlated noise. This signal, which simulates the input from the autonomic nervous system to the sinoatrial node, was included in the pacemaker equations as a modulation of the iNaK current pump and the potassium current iK. We focus on modeling the heart beat-to-beat time interval series from normal subjects during meditation using the Kundalini Yoga and Chi techniques. The analysis of the experimental data indicates that while the embeddings of the pre-meditation and control cases have a roughly circular shape, they acquire a polygonal shape during meditation: triangular for the Kundalini Yoga data and quadrangular in the case of the Chi data. The model was used to assess the waveshape of the respiratory signals needed to reproduce the trajectory of the experimental data in the phase space. The embedding of the Chi data could be reproduced using a periodic signal obtained by smoothing a square wave. In the case of the Kundalini Yoga data, the embedding was reproduced with a periodic signal obtained by smoothing a triangular wave having a rising branch of longer duration than the decreasing branch. Our study provides an estimation of the respiratory signal using only the heart beat-to-beat time interval series.
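
The mechanism this record describes, a pacemaker whose firing is modulated by a periodic "respiratory" input plus correlated noise, can be illustrated with a much simpler integrate-and-fire stand-in (not the rabbit sinoatrial cell equations; every constant below is an invented illustration):

```python
import numpy as np

def beat_intervals(t_total=120.0, dt=0.001, base_rate=1.0,
                   resp_freq=0.25, resp_depth=0.3, noise=0.02, seed=1):
    """Integrate-and-fire stand-in for a modulated pacemaker.

    A membrane-like variable ramps toward a threshold; a sinusoidal
    'respiratory' drive plus AR(1) correlated noise modulates the ramp
    speed, so the beat-to-beat intervals are modulated at resp_freq.
    """
    rng = np.random.default_rng(seed)
    v, eta = 0.0, 0.0
    beats = []
    t = 0.0
    while t < t_total:
        eta = 0.99 * eta + noise * rng.standard_normal()   # correlated noise
        drive = base_rate * (1.0 + resp_depth * np.sin(2 * np.pi * resp_freq * t) + eta)
        v += dt * max(drive, 0.0)
        if v >= 1.0:                     # threshold crossing = one beat
            beats.append(t)
            v = 0.0
        t += dt
    return np.diff(np.array(beats))      # beat-to-beat (RR) interval series

rr = beat_intervals()
```

Changing the waveshape of the periodic drive (square-like versus triangular-like instead of the sine used here) changes the geometry of the RR-interval embedding, which is the effect the paper exploits to infer the respiratory signal from the heartbeat series alone.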

  20. Building new meanings in technical English from the perspective of the lexical constellation model

    Directory of Open Access Journals (Sweden)

    Camino Rea Rizzo

    2010-10-01

    The need to name and communicate to others new concepts in specific domains of human activity leads to the formation of new terms. However, many of the technical words in English are not new from the point of view of form. They rather derive from the common stock of general language: new lexical units are built from already existing forms and/or meanings. The original form is used for naming a new concept by adding a distinctive specialized lexical feature while keeping some semantic features of the original concept. In this paper, we aim to explain and visualize the nature of some of the processes that allow for the construction of new senses in technical words through a branching and expanding process, as explained in the lexical constellation model. The analysis is performed on three words widely used in telecommunication English: “bus”, “hub” and “chip”. The understanding of the process may be of great help for learners of ESP in general and technical English in particular.

  1. Model for the heart beat-to-beat time series during meditation

    Science.gov (United States)

    Capurro, A.; Diambra, L.; Malta, C. P.

    2003-09-01

    We present a model for the respiratory modulation of the heart beat-to-beat interval series. The model consists of a pacemaker, that simulates the membrane potential of the sinoatrial node, modulated by a periodic input signal plus correlated noise that simulates the respiratory input. The model was used to assess the waveshape of the respiratory signals needed to reproduce in the phase space the trajectory of experimental heart beat-to-beat interval data. The data sets were recorded during meditation practices of the Chi and Kundalini Yoga techniques. Our study indicates that in the first case the respiratory signal has the shape of a smoothed square wave, and in the second case it has the shape of a smoothed triangular wave.

  2. Estimating and Analyzing Savannah Phenology with a Lagged Time Series Model

    DEFF Research Database (Denmark)

    Boke-Olen, Niklas; Lehsten, Veiko; Ardo, Jonas

    2016-01-01

    cycle due to their areal coverage and can have an effect on the food security in regions that depend on subsistence farming. In this study we investigate how soil moisture, mean annual precipitation, and day length control savannah phenology by developing a lagged time series model. The model uses climate data for 15 flux tower sites across four continents, and normalized difference vegetation index from satellite to optimize a statistical phenological model. We show that all three variables can be used to estimate savannah phenology on a global scale. However, it was not possible to create a simplified savannah model that works equally well for all sites on the global scale without inclusion of more site specific parameters. The simplified model showed no bias towards tree cover or between continents and resulted in a cross-validated r² of 0.6 and root mean squared error of 0.1. We therefore...
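
The core of a lagged time series model is choosing how far the vegetation response trails its climatic driver. A minimal stand-in for that step is to scan candidate lags and keep the one with the highest driver-response correlation; the synthetic "rainfall"/"greenness" series below are invented for illustration, not the flux-tower data:

```python
import numpy as np

def best_lag(driver, response, max_lag=10):
    """Pick the driver->response lag (in time steps) with the highest
    Pearson correlation, as a minimal stand-in for fitting a lagged model."""
    scores = []
    for lag in range(max_lag + 1):
        d = driver[: len(driver) - lag] if lag else driver
        r = response[lag:]
        scores.append(np.corrcoef(d, r)[0, 1])   # correlation at this lag
    return int(np.argmax(scores)), scores

# Synthetic test: 'greenness' follows 'rainfall' three steps later.
rng = np.random.default_rng(42)
rain = rng.random(200)
green = np.roll(rain, 3) + 0.05 * rng.standard_normal(200)
lag, scores = best_lag(rain, green)
```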

  3. The story of technical cooperation

    International Nuclear Information System (INIS)

    Park, Yang Taek

    1989-09-01

    This book gives descriptions of technical cooperation, which is about why does technology transfer?, process of technology transfer with model, decisive cause and cooperation of technology transfer, cost and effect of technology transfer, historical experience of technology transfer, cases of technology transfer by field such as rubber tire, medicine and computer industry and automobile industry, technology transfer process and present condition of technical cooperation, and strategy for rising of technical cooperation : selection of technology for object of cooperation and development of human resources.

  4. Modeling of technical soil-erosion control measures and its impact on soil erosion off-site effects within urban areas

    Science.gov (United States)

    Dostal, Tomas; Devaty, Jan

    2013-04-01

    The paper presents results of surface runoff, soil erosion and sediment transport modeling using the Erosion 3D software, a physically based, event-oriented, fully distributed mathematical simulation model. Various methods to simulate technical soil-erosion conservation measures were tested, using alternative digital elevation models of different precision and resolution. Ditches and baulks were simulated by three different approaches: (i) by change of the land-cover parameters to increase infiltration and decrease flow velocity, (ii) by change of the land-cover parameters to completely infiltrate the surface runoff and (iii) by adjusting the height of the digital elevation model by "burning in" the channels of the ditches. The results show the advantages and disadvantages of each approach and identify suitable methods for particular combinations of digital elevation model and simulation purpose. Further on, a set of simulations was carried out to model situations before and after application of technical soil-erosion conservation measures within a small catchment of 4 km². These simulations were focused on quantitative and qualitative assessment of the impact of technical soil-erosion control measures on soil erosion off-site effects within urban areas located downstream of intensively used agricultural fields. The scenarios were built upon a raster digital elevation model with spatial resolution of 3 meters derived from LiDAR 5G vector point elevation data. Use of this high-resolution elevation model allowed simulating the technical soil-erosion control measures by direct terrain elevation adjustment. The structures within the settlements were also emulated by directly changing the elevation of the terrain model. The buildings were lifted up to simulate the complicated flow behavior of the surface runoff within urban areas, using the approach of Arévalo (2011) but focusing on the use of commonly available data without extensive detailed editing. Application of the technical
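
The two terrain edits the record describes, "burning in" a ditch by lowering cells along its line and lifting building footprints so runoff routes around them, can be shown on a toy raster. The grid, depths, and heights below are invented for illustration:

```python
import numpy as np

def burn_channel(dem, cells, depth):
    """Lower DEM cells along a ditch line ('burning in' the channel)."""
    out = dem.copy()
    for r, c in cells:
        out[r, c] -= depth
    return out

def raise_buildings(dem, footprint, height):
    """Lift building-footprint cells so surface runoff routes around them."""
    out = dem.copy()
    out[footprint] += height
    return out

dem = np.full((5, 5), 100.0)                 # flat 5x5 toy terrain, metres
ditch = [(2, c) for c in range(5)]           # a ditch across row 2
dem_ditched = burn_channel(dem, ditch, 0.5)  # 0.5 m deep channel
mask = np.zeros((5, 5), dtype=bool)
mask[0:2, 0:2] = True                        # a 2x2 building footprint
dem_built = raise_buildings(dem, mask, 8.0)  # lift the building by 8 m
```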

  5. The string prediction models as an invariants of time series in forex market

    OpenAIRE

    Richard Pincak; Marian Repasan

    2011-01-01

    In this paper we apply a new approach of the string theory to the real financial market. It is a direct extension and application of the work [1] to the prediction of prices. The models are constructed with the idea of prediction models based on string invariants (PMBSI). The performance of PMBSI is compared to support vector machines (SVM) and artificial neural networks (ANN) on an artificial and a financial time series. A brief overview of the results and analysis is given. The first model is ...

  6. Testing the performance of technical trading rules in the Chinese markets based on superior predictive test

    Science.gov (United States)

    Wang, Shan; Jiang, Zhi-Qiang; Li, Sai-Ping; Zhou, Wei-Xing

    2015-12-01

    Technical trading rules have a long history of being used by practitioners in financial markets. The profitability and efficiency of technical trading rules remain controversial. In this paper, we test the performance of more than seven thousand traditional technical trading rules on the Shanghai Securities Composite Index (SSCI) from May 21, 1992 through June 30, 2013 and China Securities Index 300 (CSI 300) from April 8, 2005 through June 30, 2013 to check whether an effective trading strategy could be found, using performance measurements based on return and Sharpe ratio. To correct for the influence of the data-snooping effect, we adopt the Superior Predictive Ability test to evaluate whether there exists a trading rule that can significantly outperform the benchmark. The result shows that for SSCI, technical trading rules offer significant profitability, while for CSI 300, this ability is lost. We further partition the SSCI into two sub-series and find that the efficiency of technical trading in the sub-series spanning exactly the same period as CSI 300 is severely weakened. By testing the trading rules on both indexes with a five-year moving window, we find that during the financial bubble from 2005 to 2007, the effectiveness of technical trading rules is greatly improved. This is consistent with the view that the predictive ability of technical trading rules appears when the market is less efficient.
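
A single member of the rule family such studies test, a moving-average crossover evaluated by return and Sharpe ratio, can be sketched as follows. This is one illustrative rule on invented data; the paper tests thousands of rules and corrects for data snooping, which this sketch does not:

```python
import numpy as np

def ma_crossover_returns(prices, fast=5, slow=20):
    """Daily log-returns of a moving-average crossover rule:
    long the index when the fast MA is above the slow MA, else flat."""
    p = np.asarray(prices, dtype=float)
    ma = lambda w: np.convolve(p, np.ones(w) / w, mode="valid")
    s = ma(fast)[slow - fast:]            # align both MAs to end the same day
    l = ma(slow)
    signal = (s > l).astype(float)        # 1 = long, 0 = flat
    rets = np.diff(np.log(p))
    return signal[:-1] * rets[slow - 1:]  # position earns the NEXT day's return

def sharpe_ratio(returns, periods_per_year=252):
    r = np.asarray(returns, dtype=float)
    return np.sqrt(periods_per_year) * r.mean() / r.std()

# Synthetic upward-drifting price series for demonstration.
rng = np.random.default_rng(0)
prices = 100.0 * np.exp(np.cumsum(0.0005 + 0.01 * rng.standard_normal(1000)))
strat = ma_crossover_returns(prices)
sr = sharpe_ratio(strat)
```

The one-day shift (`signal[:-1]` against next-day returns) matters: evaluating a signal on the same day it is computed is look-ahead bias, a cousin of the data-snooping problem the paper corrects for.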

  7. An Overview of the Technical Basis of HENRE 2.0 Models

    Science.gov (United States)

    2015-08-01

    important for estimating health effects, but is also key in developing appropriate guidelines for responders and decontamination protocols.

  8. ShapeSelectForest: a new r package for modeling landsat time series

    Science.gov (United States)

    Mary Meyer; Xiyue Liao; Gretchen Moisen; Elizabeth Freeman

    2015-01-01

    We present a new R package called ShapeSelectForest, recently posted to the Comprehensive R Archive Network. The package was developed to fit nonparametric shape-restricted regression splines to time series of Landsat imagery for the purpose of modeling, mapping, and monitoring annual forest disturbance dynamics over nearly three decades. For each pixel and spectral...

  9. Modelling the behaviour of uranium-series radionuclides in soils and plants taking into account seasonal variations in soil hydrology.

    Science.gov (United States)

    Pérez-Sánchez, D; Thorne, M C

    2014-05-01

    In a previous paper, a mathematical model for the behaviour of (79)Se in soils and plants was described. Subsequently, a review has been published relating to the behaviour of (238)U-series radionuclides in soils and plants. Here, we bring together those two strands of work to describe a new mathematical model of the behaviour of (238)U-series radionuclides entering soils in solution and their uptake by plants. Initial studies with the model that are reported here demonstrate that it is a powerful tool for exploring the behaviour of this decay chain or subcomponents of it in soil-plant systems under different hydrological regimes. In particular, it permits studies of the degree to which secular equilibrium assumptions are appropriate when modelling this decay chain. Further studies will be undertaken and reported separately examining sensitivities of model results to input parameter values and also applying the model to sites contaminated with (238)U-series radionuclides. Copyright © 2013 Elsevier Ltd. All rights reserved.
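
The secular-equilibrium question the record raises has a compact closed form for a parent-daughter pair: the Bateman solution. The sketch below uses standard ²³⁸U and ²³⁴Th half-lives to show the daughter activity converging to the parent's; it is a worked illustration, not the authors' soil-plant model:

```python
import numpy as np

def two_member_activities(lambda1, lambda2, n1_0, t):
    """Analytic Bateman solution for a parent -> daughter chain,
    daughter initially absent. Returns (parent activity, daughter activity)."""
    a1 = lambda1 * n1_0 * np.exp(-lambda1 * t)
    n2 = n1_0 * lambda1 / (lambda2 - lambda1) * (np.exp(-lambda1 * t)
                                                 - np.exp(-lambda2 * t))
    return a1, lambda2 * n2

# 238U (t1/2 = 4.47e9 y) -> 234Th (t1/2 = 24.1 d).
lam_u = np.log(2) / (4.47e9 * 365.25)    # decay constant, per day
lam_th = np.log(2) / 24.1                # decay constant, per day
a_u, a_th = two_member_activities(lam_u, lam_th, 1.0e20, 365.0)  # after 1 year
ratio = a_th / a_u                       # -> 1 at secular equilibrium
```

Because the parent half-life dwarfs the daughter's, the activity ratio settles to ~1 after a few daughter half-lives, which is exactly the secular-equilibrium assumption the paper's model lets you relax when leaching or plant uptake disturbs the chain.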

  10. Modelling and Reasoning about Security Requirements in Socio-Technical Systems

    NARCIS (Netherlands)

    Paja, Elda; Dalpiaz, Fabiano; Giorgini, Paolo

    2015-01-01

    Modern software systems operate within the context of larger socio-technical systems, wherein they interact—by exchanging data and outsourcing tasks—with other technical components, humans, and organisations. When interacting, these components (actors) operate autonomously; as such, they may

  11. Technical reference manual for TIME4. Vol. 2

    International Nuclear Information System (INIS)

    Wilmot, R.D.; Ringrose, P.S.; Larkin, J.P.A.; Kleissen, F.A.T.

    1991-07-01

    This document is the Technical Reference Manual for the TIME4 model. TIME4 is the environmental change model developed for use in the probabilistic risk analysis of deep disposal of radioactive waste. The Technical Reference Manual describes the theoretical background to the model. The modelling method is described, followed by a review of related work and a detailed description for each sub-model. (author)

  12. A Time-Series Water Level Forecasting Model Based on Imputation and Variable Selection Method

    Directory of Open Access Journals (Sweden)

    Jun-He Yang

    2017-01-01

    Reservoirs are important for households and impact the national economy. This paper proposes a time-series forecasting model based on estimating missing values followed by variable selection to forecast a reservoir’s water level. This study collected data from the Taiwan Shimen Reservoir as well as daily atmospheric data from 2008 to 2015. The two datasets are concatenated into an integrated research dataset based on the ordering of the data. The proposed time-series forecasting model has three foci. First, this study applies five imputation methods to handle the missing values rather than directly deleting them. Second, we identify the key variables via factor analysis and then delete the unimportant variables sequentially via the variable selection method. Finally, the proposed model uses a Random Forest to build the forecasting model of the reservoir’s water level, and this is compared with the listed methods in terms of forecasting error. The experimental results indicate that the Random Forest forecasting model, when applied to variable selection with full variables, has better forecasting performance than the listed models. In addition, the experiments show that the proposed variable selection can help the five forecasting methods used here improve their forecasting capability.
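
The pipeline shape described here, impute missing values, select informative variables, then fit a forecaster, can be sketched end to end. This is a simplified stand-in on invented data: mean imputation replaces the paper's five methods, correlation ranking replaces factor analysis, and a linear least-squares fit stands in for the Random Forest:

```python
import numpy as np

def mean_impute(X):
    """Replace NaNs in each column with that column's mean."""
    X = X.astype(float)
    for j in range(X.shape[1]):
        col = X[:, j]
        col[np.isnan(col)] = np.nanmean(col)
    return X

def select_variables(X, y, k=2):
    """Keep the k columns most correlated (in absolute value) with y."""
    corrs = [abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])]
    return np.argsort(corrs)[::-1][:k]

# Synthetic 'reservoir': level driven by rainfall and its own lag-1 value.
rng = np.random.default_rng(7)
n = 300
rain = rng.random(n)
temp = rng.random(n)                     # irrelevant distractor variable
level = np.empty(n); level[0] = 0.5
for i in range(1, n):
    level[i] = 0.7 * level[i - 1] + 0.3 * rain[i] + 0.01 * rng.standard_normal()

X = np.column_stack([rain, temp, np.roll(level, 1)])[1:]   # cols: rain, temp, lag
y = level[1:]
X[rng.integers(0, len(X), 20), 0] = np.nan   # knock out some rainfall values
X = mean_impute(X)                           # step 1: imputation
keep = select_variables(X, y, k=2)           # step 2: variable selection
coef, *_ = np.linalg.lstsq(X[:, keep], y, rcond=None)   # step 3: forecaster
pred = X[:, keep] @ coef
```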

  13. IASI instrument: technical description and measured performances

    Science.gov (United States)

    Hébert, Ph.; Blumstein, D.; Buil, C.; Carlier, T.; Chalon, G.; Astruc, P.; Clauss, A.; Siméoni, D.; Tournier, B.

    2017-11-01

    IASI is an infrared atmospheric sounder. It will provide meteorologists and the scientific community with atmospheric spectra. The IASI system includes 3 instruments that will be mounted on the Metop satellite series, data processing software integrated in the EPS (EUMETSAT Polar System) ground segment and a technical expertise centre implemented at CNES Toulouse. The instrument is composed of a Fourier transform spectrometer and an associated infrared imager. The optical configuration is based on a Michelson interferometer, and the interferograms are processed by an on-board digital processing subsystem, which performs the inverse Fourier transforms and the radiometric calibration. The infrared imager co-registers the IASI soundings with the AVHRR imager (AVHRR is another instrument on the Metop satellite). The presentation will focus on the architecture of the instrument, the description of the implemented technologies and the measured performance of the first flight model. CNES is leading the IASI program in association with EUMETSAT. The instrument prime contractor is ALCATEL SPACE.
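
The principle behind a Fourier transform spectrometer, the interferogram recorded versus optical path difference is Fourier-transformed to recover the spectrum, can be shown with a toy monochromatic source. This is a textbook illustration, not IASI's on-board processing chain; the sample count, step size, and source wavenumber are invented:

```python
import numpy as np

# A Michelson FTS records an interferogram I(x) ~ integral of S(sigma)
# cos(2*pi*sigma*x) over wavenumber sigma; the spectrum is recovered by
# Fourier-transforming I(x) over optical path difference x.
n = 4096
dx = 1e-4                                  # optical path difference step, cm
x = np.arange(n) * dx
sigma0 = 1200.0                            # toy source wavenumber, cm^-1
interferogram = np.cos(2 * np.pi * sigma0 * x)

spectrum = np.abs(np.fft.rfft(interferogram))
sigmas = np.fft.rfftfreq(n, d=dx)          # wavenumber axis, cm^-1
peak = sigmas[np.argmax(spectrum)]         # recovered source wavenumber
```

The spectral resolution of the toy instrument is 1/(n*dx) ≈ 2.4 cm⁻¹, set by the maximum path difference, which is why the recovered peak lands on the bin nearest 1200 cm⁻¹ rather than exactly on it.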

  14. The Development of a Project-Based Collaborative Technical Writing Model Founded on Learner Feedback in a Tertiary Aeronautical Engineering Program

    Science.gov (United States)

    Tatzl, Dietmar; Hassler, Wolfgang; Messnarz, Bernd; Fluhr, Holger

    2012-01-01

    The present article describes and evaluates collaborative interdisciplinary group projects initiated by content lecturers and an English-as-a-Foreign-Language (EFL) instructor for the purpose of teaching technical writing skills in an aeronautical engineering degree program. The proposed technical writing model is assessed against the results of a…

  15. Predictive time-series modeling using artificial neural networks for Linac beam symmetry: an empirical study.

    Science.gov (United States)

    Li, Qiongge; Chan, Maria F

    2017-01-01

    Over half of cancer patients receive radiotherapy (RT) as partial or full cancer treatment. Daily quality assurance (QA) of RT in cancer treatment closely monitors the performance of the medical linear accelerator (Linac) and is critical for continuous improvement of patient safety and quality of care. Cumulative longitudinal QA measurements are valuable for understanding the behavior of the Linac and allow physicists to identify trends in the output and take preventive actions. In this study, artificial neural network (ANN) and autoregressive moving average (ARMA) time-series prediction modeling techniques were both applied to 5 years of daily Linac QA data. Verification tests and other evaluations were then performed for all models. Preliminary results showed that ANN time-series predictive modeling has advantages over ARMA techniques for accurate and effective application in the dosimetry and QA field. © 2016 New York Academy of Sciences.
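The simplest member of the ARMA family referenced above is an AR(1) model, and a one-step-ahead prediction from it fits in a few lines. This is a bare sketch with invented QA-like numbers, not the paper's models, which used full ARMA and neural-network fits on 5 years of data.

```python
# Minimal sketch: fit an AR(1) process to a short daily QA-style series by
# least squares and make a one-step-ahead prediction. Data are invented.

def fit_ar1(x):
    """Least-squares estimate of phi in (x[t]-m) ≈ phi*(x[t-1]-m)."""
    m = sum(x) / len(x)
    d = [v - m for v in x]
    num = sum(d[t] * d[t - 1] for t in range(1, len(d)))
    den = sum(d[t - 1] ** 2 for t in range(1, len(d)))
    return m, num / den

series = [1.00, 1.02, 0.99, 1.01, 1.03, 1.00, 0.98, 1.01, 1.02, 1.00]
mean, phi = fit_ar1(series)
forecast = mean + phi * (series[-1] - mean)
print(round(forecast, 4))
```

A drift in the forecast errors over time would be the kind of trend the QA physicists watch for.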

  16. Urology technical and non-technical skills development: the emerging role of simulation.

    Science.gov (United States)

    Rashid, Prem; Gianduzzo, Troy R J

    2016-04-01

    To review the emerging role of technical and non-technical simulation in urological education and training. A review was conducted to examine the current role of simulation in urology training. A PUBMED search of the terms 'urology training', 'urology simulation' and 'urology education' revealed 11,504 titles. Three hundred and fifty-seven abstracts were identified as English-language, peer-reviewed papers pertaining to the role of simulation in urology and related topics. Key papers were used to explore themes. Some cross-referenced papers were also included. There is an ongoing need to ensure that training time is efficiently utilised while ensuring that optimal technical and non-technical skills are achieved. Changing working conditions and the need to minimise patient harm by inadvertent errors must be taken into account. Simulation models for specific technical aspects have been the mainstay of graduated, step-wise, low- and high-fidelity training. Whole-scenario environments as well as non-technical aspects can be slowly incorporated into the curriculum. Doing so should also help define what have been challenging competencies to teach and evaluate. Dedicated time, resources and trainer up-skilling are important. Concurrent studies are needed to help evaluate the effectiveness of introducing step-wise simulation for technical and non-technical competencies. Simulation-based learning remains the best avenue of progressing surgical education. Technical and non-technical simulation could be used in the selection process. There are good economic, logistic and safety reasons to pursue the process of ongoing development of simulation co-curricula. While the role of simulation is assured, its progress will depend on a structured program that takes advantage of what can be delivered via this medium. Overall, simulation can be developed further for urological training programs to encompass technical and non-technical skill development at all stages, including

  17. MMSNF 2005. Materials models and simulations for nuclear fuels

    Energy Technology Data Exchange (ETDEWEB)

    Freyss, M.; Durinck, J.; Carlot, G.; Sabathier, C.; Martin, P.; Garcia, P.; Ripert, M.; Blanpain, P.; Lippens, M.; Schut, H.; Federov, A.V.; Bakker, K.; Osaka, M.; Miwa, S.; Sato, I.; Tanaka, K.; Kurosaki, K.; Uno, M.; Yamanaka, S.; Govers, K.; Verwerft, M.; Hou, M.; Lemehov, S.E.; Terentyev, D.; Govers, K.; Kotomin, E.A.; Ashley, N.J.; Grimes, R.W.; Van Uffelen, P.; Mastrikov, Y.; Zhukovskii, Y.; Rondinella, V.V.; Kurosaki, K.; Uno, M.; Yamanaka, S.; Minato, K.; Phillpot, S.; Watanabe, T.; Shukla, P.; Sinnott, S.; Nino, J.; Grimes, R.; Staicu, D.; Hiernaut, J.P.; Wiss, T.; Rondinella, V.V.; Ronchi, C.; Yakub, E.; Kaye, M.H.; Morrison, C.; Higgs, J.D.; Akbari, F.; Lewis, B.J.; Thompson, W.T.; Gueneau, C.; Gosse, S.; Chatain, S.; Dumas, J.C.; Sundman, B.; Dupin, N.; Konings, R.; Noel, H.; Veshchunov, M.; Dubourg, R.; Ozrin, C.V.; Veshchunov, M.S.; Welland, M.T.; Blanc, V.; Michel, B.; Ricaud, J.M.; Calabrese, R.; Vettraino, F.; Tverberg, T.; Kissane, M.; Tulenko, J.; Stan, M.; Ramirez, J.C.; Cristea, P.; Rachid, J.; Kotomin, E.; Ciriello, A.; Rondinella, V.V.; Staicu, D.; Wiss, T.; Konings, R.; Somers, J.; Killeen, J

    2006-07-01

    The MMSNF Workshop series aims at stimulating research and discussions on models and simulations of nuclear fuels and coupling the results into fuel performance codes. This edition was focused on materials science and engineering for fuel performance codes. The presentations were grouped in three technical sessions: fundamental modelling of fuel properties; integral fuel performance codes and their validation; collaborations and integration of activities. (A.L.B.)

  18. MMSNF 2005. Materials models and simulations for nuclear fuels

    International Nuclear Information System (INIS)

    Freyss, M.; Durinck, J.; Carlot, G.; Sabathier, C.; Martin, P.; Garcia, P.; Ripert, M.; Blanpain, P.; Lippens, M.; Schut, H.; Federov, A.V.; Bakker, K.; Osaka, M.; Miwa, S.; Sato, I.; Tanaka, K.; Kurosaki, K.; Uno, M.; Yamanaka, S.; Govers, K.; Verwerft, M.; Hou, M.; Lemehov, S.E.; Terentyev, D.; Govers, K.; Kotomin, E.A.; Ashley, N.J.; Grimes, R.W.; Van Uffelen, P.; Mastrikov, Y.; Zhukovskii, Y.; Rondinella, V.V.; Kurosaki, K.; Uno, M.; Yamanaka, S.; Minato, K.; Phillpot, S.; Watanabe, T.; Shukla, P.; Sinnott, S.; Nino, J.; Grimes, R.; Staicu, D.; Hiernaut, J.P.; Wiss, T.; Rondinella, V.V.; Ronchi, C.; Yakub, E.; Kaye, M.H.; Morrison, C.; Higgs, J.D.; Akbari, F.; Lewis, B.J.; Thompson, W.T.; Gueneau, C.; Gosse, S.; Chatain, S.; Dumas, J.C.; Sundman, B.; Dupin, N.; Konings, R.; Noel, H.; Veshchunov, M.; Dubourg, R.; Ozrin, C.V.; Veshchunov, M.S.; Welland, M.T.; Blanc, V.; Michel, B.; Ricaud, J.M.; Calabrese, R.; Vettraino, F.; Tverberg, T.; Kissane, M.; Tulenko, J.; Stan, M.; Ramirez, J.C.; Cristea, P.; Rachid, J.; Kotomin, E.; Ciriello, A.; Rondinella, V.V.; Staicu, D.; Wiss, T.; Konings, R.; Somers, J.; Killeen, J.

    2006-01-01

    The MMSNF Workshop series aims at stimulating research and discussions on models and simulations of nuclear fuels and coupling the results into fuel performance codes. This edition was focused on materials science and engineering for fuel performance codes. The presentations were grouped in three technical sessions: fundamental modelling of fuel properties; integral fuel performance codes and their validation; collaborations and integration of activities. (A.L.B.)

  19. An endogenous growth model with embodied energy-saving technical change

    International Nuclear Information System (INIS)

    Van Zon, A.; Yetkiner, I. H.

    2003-01-01

    In this paper, we extend the Romer [Journal of Political Economy 98 (Part 2) (1990) S271] model in two ways. First, we include energy consumption of intermediates. Second, intermediates become heterogeneous due to endogenous energy-saving technical change. We show that the resulting model can still generate steady-state growth, but the growth rate depends negatively on the growth of real energy prices. The reason is that real energy price rises will lower the profitability of using new intermediate goods, and hence the profitability of doing research, and therefore have a negative impact on growth. We also show that the introduction of an energy tax that is recycled in the form of an R and D subsidy may increase growth. We conclude that in order to have energy efficiency growth and output growth under rising real energy prices, a combination of R and D and energy policy is called for.

  20. Sensor response monitoring in pressurized water reactors using time series modeling

    International Nuclear Information System (INIS)

    Upadhyaya, B.R.; Kerlin, T.W.

    1978-01-01

    Random data analysis in nuclear power reactors for purposes of process surveillance, pattern recognition and monitoring of temperature, pressure, flow and neutron sensors has gained increasing attention in view of its potential for helping to ensure safe plant operation. In this study, the application of autoregressive moving-average (ARMA) time-series modeling for monitoring temperature sensor response characteristics is presented. The ARMA model is used to estimate the step and ramp responses of the sensors and the related time constant and ramp delay time. The ARMA parameters are estimated by a two-stage algorithm in the spectral domain. Results of sensor testing for an operating pressurized water reactor are presented. 16 refs
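For a first-order sensor, the time constant mentioned above fully characterises the step response, y(t) = 1 − exp(−t/τ). The sketch below estimates τ from clean, synthetic step-response samples by a least-squares line through ln(1 − y); the paper instead infers it indirectly from ARMA models fitted to random operating data.

```python
import math

# Illustrative only: recover a first-order sensor time constant tau from a
# noiseless step response y(t) = 1 - exp(-t/tau). Numbers are synthetic.
tau_true = 4.0
times = [0.5 * i for i in range(1, 10)]
response = [1 - math.exp(-t / tau_true) for t in times]

# ln(1 - y) = -t/tau, so a least-squares line through the origin gives tau.
num = sum(t * math.log(1 - y) for t, y in zip(times, response))
den = sum(t * t for t in times)
slope = num / den
tau_est = -1 / slope
print(round(tau_est, 3))  # recovers 4.0 on clean data
```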

  1. Application of semi-parametric modelling to time series forecasting: case of the electricity consumption; Modeles semi-parametriques appliques a la prevision des series temporelles. Cas de la consommation d'electricite

    Energy Technology Data Exchange (ETDEWEB)

    Lefieux, V

    2007-10-15

    Reseau de Transport d'Electricite (RTE), in charge of operating the French electricity transmission grid, needs an accurate forecast of power consumption in order to operate it correctly. The forecasts used every day result from a model combining a nonlinear parametric regression and a SARIMA model. In order to obtain an adaptive forecasting model, nonparametric forecasting methods have already been tested without real success. In particular, it is known that a nonparametric predictor behaves badly with a great number of explanatory variables, which is commonly called the curse of dimensionality. Recently, semiparametric methods, which improve on the pure nonparametric approach, have been proposed to estimate a regression function. Based on the concept of dimension reduction, one of those methods (called MAVE: Moving Average conditional Variance Estimate) can be applied to time series. We study empirically its effectiveness in predicting the future values of an autoregressive time series. We then adapt this method, from a practical point of view, to forecast power consumption. We propose a partially linear semiparametric model, based on the MAVE method, which takes into account simultaneously the autoregressive aspect of the problem and the exogenous variables. The proposed estimation procedure is practically efficient. (author)
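To make the "nonparametric predictor" idea concrete, here is a bare-bones kernel autoregressive forecaster (a Nadaraya-Watson smoother on one lag): past transitions x[t−1] → x[t] are weighted by how close x[t−1] was to the current value. This is only an illustration of the general idea, with invented load numbers; the thesis's MAVE-based semiparametric model is far more elaborate and also handles exogenous variables.

```python
import math

# Sketch of a one-lag nonparametric autoregressive predictor
# (Nadaraya-Watson kernel smoother). Data are invented.

def kernel_forecast(series, query, bandwidth=1.0):
    """Predict the next value given the current one by weighting past
    transitions (x[t-1] -> x[t]) with a Gaussian kernel around `query`."""
    wsum = ysum = 0.0
    for t in range(1, len(series)):
        w = math.exp(-((series[t - 1] - query) ** 2) / (2 * bandwidth ** 2))
        wsum += w
        ysum += w * series[t]
    return ysum / wsum

load = [60.0, 62.0, 61.0, 63.0, 64.0, 62.0, 61.0, 63.0]
print(round(kernel_forecast(load, load[-1]), 2))
```

The curse of dimensionality the abstract mentions appears as soon as `query` becomes a long vector of lags and exogenous variables: the kernel weights then concentrate on very few past observations.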

  2. Technical Work Plan for: Near Field Environment: Engineered System: Radionuclide Transport Abstraction Model Report

    Energy Technology Data Exchange (ETDEWEB)

    J.D. Schreiber

    2006-12-08

    This technical work plan (TWP) describes work activities to be performed by the Near-Field Environment Team. The objective of the work scope covered by this TWP is to generate Revision 03 of EBS Radionuclide Transport Abstraction, referred to herein as the radionuclide transport abstraction (RTA) report. The RTA report is being revised primarily to address condition reports (CRs), to address issues identified by the Independent Validation Review Team (IVRT), to address the potential impact of transport, aging, and disposal (TAD) canister design on transport models, and to ensure integration with other models that are closely associated with the RTA report and being developed or revised in other analysis/model reports in response to IVRT comments. The RTA report will be developed in accordance with the most current version of LP-SIII.10Q-BSC and will reflect current administrative procedures (LP-3.15Q-BSC, ''Managing Technical Product Inputs''; LP-SIII.2Q-BSC, ''Qualification of Unqualified Data''; etc.), and will develop related Document Input Reference System (DIRS) reports and data qualifications as applicable in accordance with prevailing procedures. The RTA report consists of three models: the engineered barrier system (EBS) flow model, the EBS transport model, and the EBS-unsaturated zone (UZ) interface model. The flux-splitting submodel in the EBS flow model will change, so the EBS flow model will be validated again. The EBS transport model and validation of the model will be substantially revised in Revision 03 of the RTA report, which is the main subject of this TWP. The EBS-UZ interface model may be changed in Revision 03 of the RTA report due to changes in the conceptualization of the UZ transport abstraction model (a particle tracker transport model based on the discrete fracture transfer function will be used instead of the dual-continuum transport model previously used). Validation of the EBS-UZ interface model

  3. An expanded system simulation model for solar energy storage (technical report), volume 1

    Science.gov (United States)

    Warren, A. W.

    1979-01-01

    The simulation model for wind energy storage (SIMWEST) program now includes wind and/or photovoltaic systems utilizing any combination of five types of storage (pumped hydro, battery, thermal, flywheel and pneumatic) and is available for the UNIVAC 1100 series and the CDC 6000 series computers. The level of detail is consistent with a role of evaluating the economic feasibility as well as the general performance of wind and/or photovoltaic energy systems. The software package consists of two basic programs and a library of system, environmental, and load components. The first program is a precompiler which generates computer models (in FORTRAN) of complex wind and/or photovoltaic source/storage/application systems from user specifications, using the respective library components. The second program provides the techno-economic system analysis with the respective I/O, the integration of system dynamics, and the iteration for conveyance of variables.
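The kind of source/storage/load simulation such a component library performs can be illustrated with a few lines of code. The sketch below is an invented, greatly simplified battery dispatch loop (hourly surplus charges the battery, deficit discharges it, leftover deficit counts as unmet load); all numbers and names are illustrative, not SIMWEST's.

```python
# Simplified sketch of a source/storage/load time-step simulation:
# greedy battery dispatch with a capacity limit. Numbers are invented.

def simulate_battery(generation, load, capacity, charge=0.0):
    """Surplus charges the battery, deficit discharges it; returns the
    final state of charge and the total unmet load."""
    unmet = 0.0
    for g, d in zip(generation, load):
        surplus = g - d
        if surplus >= 0:
            charge = min(capacity, charge + surplus)
        else:
            draw = min(charge, -surplus)
            charge -= draw
            unmet += -surplus - draw
    return charge, unmet

gen = [5, 6, 2, 0, 0, 4]   # hourly generation
dem = [3, 3, 3, 3, 3, 3]   # hourly load
final_charge, unmet = simulate_battery(gen, dem, capacity=4.0)
print(final_charge, unmet)
```

A techno-economic analysis would then price the unmet load and the storage capacity against each other across a full year of data.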

  4. Forecasting of time series with trend and seasonal cycle using the airline model and artificial neural networks

    Directory of Open Access Journals (Sweden)

    J D Velásquez

    2012-06-01

    Full Text Available Many time series with trend and seasonal pattern are successfully modeled and forecasted by the airline model of Box and Jenkins; however, this model neglects the presence of nonlinearity in the data. In this paper, we propose a new nonlinear version of the airline model; for this, we replace the linear moving-average component by a multilayer perceptron neural network. The proposed model is used for forecasting two benchmark time series; we found that the proposed model is able to forecast the time series with more accuracy than other traditional approaches.
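The airline model operates on a series that has been differenced once regularly and once seasonally, w[t] = (x[t] − x[t−1]) − (x[t−s] − x[t−s−1]), which removes both a linear trend and a fixed seasonal cycle. A quick sketch of that transformation (with a toy period s = 4 for brevity; monthly data would use s = 12):

```python
# Regular + seasonal differencing, the first step of the airline model.

def airline_difference(x, s):
    return [(x[t] - x[t - 1]) - (x[t - s] - x[t - s - 1])
            for t in range(s + 1, len(x))]

# Linear trend plus a period-4 seasonal pattern: differencing removes both.
season = [0, 5, 2, -3]
x = [2.0 * t + season[t % 4] for t in range(12)]
w = airline_difference(x, s=4)
print(w)  # a trend + fixed seasonal cycle differences to all zeros
```

The paper's hybrid keeps this differencing but models the residual structure of w with a multilayer perceptron instead of a linear moving average.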

  5. PENDISC: a simple method for constructing a mathematical model from time-series data of metabolite concentrations.

    Science.gov (United States)

    Sriyudthsak, Kansuporn; Iwata, Michio; Hirai, Masami Yokota; Shiraishi, Fumihide

    2014-06-01

    The availability of large-scale datasets has led to more effort being made to understand the characteristics of metabolic reaction networks. However, because large-scale data are semi-quantitative and may contain biological variations and/or analytical errors, it remains a challenge to construct a mathematical model with precise parameters using only these data. The present work proposes a simple method, referred to as PENDISC (Parameter Estimation in a Non-DImensionalized S-system with Constraints), to assist the complex process of parameter estimation in the construction of a mathematical model for a given metabolic reaction system. The PENDISC method was evaluated using two simple mathematical models: a linear metabolic pathway model with inhibition and a branched metabolic pathway model with inhibition and activation. The results indicate that a smaller number of data points and rate constant parameters enhances the agreement between calculated values and time-series data of metabolite concentrations, and leads to faster convergence when the same initial estimates are used for the fitting. This method is also shown to be applicable to noisy time-series data and to unmeasurable metabolite concentrations in a network, and to have the potential to handle metabolome data of a relatively large-scale metabolic reaction system. Furthermore, it was applied to aspartate-derived amino acid biosynthesis in the plant Arabidopsis thaliana. The results confirm that the constructed mathematical model satisfactorily agrees with the time-series datasets of seven metabolite concentrations.
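A toy version of the underlying task — fitting a rate constant so that a simulated concentration curve matches time-series samples — can be written in a few lines. This is not PENDISC (which handles full S-system models with many parameters and constraints), just a one-parameter grid search on a single decay equation with invented data.

```python
import math

# Toy parameter estimation: find the rate constant k of dX/dt = -k*X that
# best fits sampled concentrations, by grid search with Euler integration.

def simulate(k, x0, times, dt=0.01):
    """Euler-integrate dX/dt = -k*X and sample at the given times."""
    out, x, t = [], x0, 0.0
    for target in times:
        while t < target - 1e-12:
            x += dt * (-k * x)
            t += dt
        out.append(x)
    return out

times = [0.5, 1.0, 1.5, 2.0]
data = [math.exp(-0.8 * t) for t in times]  # "measured" with k = 0.8

def sse(k):
    return sum((m - d) ** 2 for m, d in zip(simulate(k, 1.0, times), data))

best_k = min((round(0.1 * i, 1) for i in range(1, 21)), key=sse)
print(best_k)  # grid search recovers k ≈ 0.8
```

With dozens of coupled equations and noisy, semi-quantitative data, this brute-force approach breaks down, which is the regime PENDISC's non-dimensionalization and constraints are designed for.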

  6. Endogenous induced technical change and the costs of Kyoto

    International Nuclear Information System (INIS)

    Buonanno, Paolo; Carraro, Carlo; Galeotti, M.

    2001-09-01

    Many predictions and conclusions in the climate change literature have been made and drawn on the basis of theoretical analyses and quantitative models that are either static or that allow for simple forms of changes in technology, often along exogenously given time paths. It is therefore not clear a priori whether those conclusions and policy recipes still hold in the more realistic case of endogenously evolving technologies. In this paper, a quantitative tool with the features of an endogenous growth model is presented, which also accounts for the possibility that technical change can be induced by environmental policy measures. Both the output production technology and the emission-output ratio depend upon the stock of knowledge, which accumulates through R and D activities. R and D is thus an additional policy variable that comes into play along with pollution abatement and capital investment. Two versions of this climate model are studied, one with endogenous technical change but exogenous environmental technical change (i.e. no induced technical change) and the other with both endogenous and induced technical change. Hence, in both models technical change evolves endogenously as far as the production technology is concerned, but endogenous environmental (or induced) technical change is only accounted for in the second version. Finally, a third version of the model also captures technological spillover effects. As an application, the three versions of the model are simulated allowing for trade of pollution permits as specified in the Kyoto Protocol and assessing the implications in terms of cost efficiency, economic growth and R and D efforts of the three different specifications of technical change.

  7. Russian State Time and Earth Rotation Service: Observations, Eop Series, Prediction

    Science.gov (United States)

    Kaufman, M.; Pasynok, S.

    2010-01-01

    The Russian State Time, Frequency and Earth Rotation Service provides the official EOP data and time for use in scientific, technical and metrological work in Russia. Observations of GLONASS and GPS from 30 stations in Russia are currently used, together with Russian and worldwide VLBI (35 stations) and SLR (20 stations) observation data. Data calculated at two other Russian analysis centres are added to these three EOP series: IAA (VLBI, GPS and SLR series) and MCC (SLR). Joint processing of these 7 series is carried out every day (the operational EOP data for the last day and the predicted values for 50 days). The EOP values are refined weekly and the systematic errors of every individual series are corrected. The combined results become accessible on the VNIIFTRI server (ftp.imvp.ru) at approximately 6h UT daily.

  8. Teaching Technical Writing and Editing -- In-House Programs That Work. Anthology Series No. 5.

    Science.gov (United States)

    Shaw, James G., Ed.

    The 12 articles in this publication provide in-depth treatment of important aspects of in-house training programs for technical writing and editing. The articles deal with the following topics: the value of an in-house writing course, teaching in industry, developing an in-house writing course for engineers and scientists, a new approach to…

  9. Technical Training: Make the most of Office, Sharepoint and Lync 2013

    CERN Multimedia

    2014-01-01

    The IT Department, in cooperation with the Technical Training team, would like to invite you to IT Technical Training Tutorials 2014: Make the most of Office, Sharepoint and Lync 2013. In this lecture series, we will present: Microsoft Office 2013 Microsoft Lync 2013 (Including IP telephony) Microsoft SharePoint 2013 Sessions in French: 7 October, 9 a.m. - 12 p.m. Sessions in English: 13 October, 9 a.m. - 12 p.m. This training is free of charge, but please create your training request via EDH at: https://edh.cern.ch/Document/Personnel/TRN/new?course=146OEL01. Objectives of the training: General overview of the Microsoft Office 2013, Lync and Sharepoint 2013. Changes in comparison to the 2010 release of the software Discussion of new ways to communicate in the work environment, including audio calls, instant messaging, social newsfeeds and online editing of documents. The exact schedule for both series is available at: http://cern.ch/go/IT3T.

  10. Time series segmentation: a new approach based on Genetic Algorithm and Hidden Markov Model

    Science.gov (United States)

    Toreti, A.; Kuglitsch, F. G.; Xoplaki, E.; Luterbacher, J.

    2009-04-01

    The subdivision of a time series into homogeneous segments has been performed using various methods applied in different disciplines. In climatology, for example, it is accompanied by the well-known homogenization problem and the detection of artificial change points. In this context, we present a new method (GAMM) based on a Hidden Markov Model (HMM) and a Genetic Algorithm (GA), applicable to series of independent observations (and easily adaptable to autoregressive processes). A left-to-right hidden Markov model was applied, estimating the parameters and the best-state sequence with the Baum-Welch and Viterbi algorithms, respectively. In order to avoid the well-known dependence of the Baum-Welch algorithm on the initial condition, a Genetic Algorithm was developed. This algorithm is characterized by mutation, elitism and a crossover procedure implemented with some restrictive rules. Moreover, the function to be minimized was derived following the approach of Kehagias (2004), i.e. the so-called complete log-likelihood. The number of states was determined by applying a two-fold cross-validation procedure (Celeux and Durand, 2008). Because this last issue is complex and influences the whole analysis, a Multi-Response Permutation Procedure (MRPP; Mielke et al., 1981) was added, which tests the model with K+1 states (where K is the number of states of the best model) when its likelihood is close to that of the K-state model. Finally, an evaluation of the GAMM performance, applied as a break-detection method in the field of climate time-series homogenization, is shown. 1. G. Celeux and J.B. Durand, Comput Stat 2008. 2. A. Kehagias, Stoch Envir Res 2004. 3. P.W. Mielke, K.J. Berry, G.W. Brier, Monthly Wea Rev 1981.
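The Viterbi step of the method above can be shown in miniature: given an HMM's parameters, dynamic programming recovers the most likely hidden state sequence. The sketch below uses a 2-state HMM with invented probabilities and state names, nothing drawn from the climate application.

```python
# Viterbi decoding for a small discrete HMM. Probabilities are invented.

def viterbi(obs, states, start_p, trans_p, emit_p):
    V = [{s: start_p[s] * emit_p[s][obs[0]] for s in states}]
    back = []
    for o in obs[1:]:
        col, ptr = {}, {}
        for s in states:
            prev = max(states, key=lambda r: V[-1][r] * trans_p[r][s])
            col[s] = V[-1][prev] * trans_p[prev][s] * emit_p[s][o]
            ptr[s] = prev
        V.append(col)
        back.append(ptr)
    last = max(states, key=lambda s: V[-1][s])
    path = [last]
    for ptr in reversed(back):     # follow back-pointers to the start
        path.append(ptr[path[-1]])
    return list(reversed(path))

states = ("calm", "stormy")
start = {"calm": 0.6, "stormy": 0.4}
trans = {"calm": {"calm": 0.8, "stormy": 0.2},
         "stormy": {"calm": 0.3, "stormy": 0.7}}
emit = {"calm": {"low": 0.7, "high": 0.3},
        "stormy": {"low": 0.2, "high": 0.8}}
print(viterbi(["low", "low", "high", "high"], states, start, trans, emit))
```

In GAMM, the Baum-Welch step estimates the probabilities above from the data (with the GA supplying initial conditions), and the decoded state sequence induces the segmentation.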

  11. [Ageing and work: technical standards].

    Science.gov (United States)

    De Vito, G; Riva, M A; Meroni, R; Cesana, G C

    2010-01-01

    Over the last few years, studies on the relationship between ageing and work have attracted growing interest due to the increased probability among workers of developing major health problems as a consequence of the ageing of the working population. Negative health outcomes are possible when an age-related imbalance appears between physical workload and physical work capacity. Interventions based on workload reduction should help to keep workers on the job for as long as allowed by law. Reference masses by age and sex are suggested by the technical standards of the ISO 11228 series, which are also cited by Italian law D.Lgs. 81/2008, and by the EN 1005 series, which recommend limits for manual material handling and for pushing and pulling. In an ageing population with a high prevalence of back disorders, decreasing the prevalence or recurrence of low back pain could be more effective than many other approaches in enhancing workers' quality of life and, consequently, in maintaining and improving workers' performance.

  12. Endogenous induced technical change and the costs of Kyoto

    International Nuclear Information System (INIS)

    Buonanno, Paolo; Carraro, Carlo; Galeotti, Marzio

    2003-01-01

    We present a model for climate change policy analysis which accounts for the possibility that technology evolves endogenously and that technical change can be induced by environmental policy measures. Both the output production technology and the emission-output ratio depend upon a stock of knowledge, which accumulates through R and D activities. Two versions of this model are studied, one with endogenous technical change but exogenous environmental technical change and the other with both endogenous and induced technical change. A third version also captures technological spillover effects. As an application, the model is simulated allowing for trade of pollution permits as specified in the Kyoto Protocol and assessing the implications in terms of cost efficiency, economic growth and R and D efforts of the three different specifications of technical change.

  13. Co-evolution of intelligent socio-technical systems modelling and applications in large scale emergency and transport domains

    CERN Document Server

    2013-01-01

    As interconnectivity between humans through technical devices becomes ubiquitous, the next step is already in the making: ambient intelligence, i.e. smart (technical) environments, which will eventually play the same active role in communication as the human players, leading to a co-evolution in all domains where real-time communication is essential. This topical volume, based on the findings of the Socionical European research project, gives equal attention to two highly relevant application domains: transport, specifically traffic dynamics from the viewpoint of socio-technical interaction, and evacuation scenarios for large-scale emergency situations. Care was taken to investigate the limits of scalability as far as possible and to combine modeling based on complex systems science approaches with relevant data analysis.

  14. EO Model for Tacit Knowledge Externalization in Socio-Technical Enterprises

    Directory of Open Access Journals (Sweden)

    Shreyas Suresh Rao

    2017-03-01

    Full Text Available Aim/Purpose: A vital business activity within socio-technical enterprises is tacit knowledge externalization, which elicits and explicates the tacit knowledge of enterprise employees as external knowledge. The aim of this paper is to integrate diverse aspects of externalization through the Enterprise Ontology model. Background: Across two decades, researchers have explored various aspects of tacit knowledge externalization. However, the existing work reveals no uniform representation of the externalization process, which has resulted in divergent and contradictory interpretations across the literature. Methodology: The Enterprise Ontology model is constructed step-wise through conceptual and measurement views. While the conceptual view encompasses three patterns that model the externalization process, the measurement view employs the certainty-factor model to empirically measure the outcome of the externalization process. Contribution: The paper contributes to the knowledge management literature in two ways. The first contribution is the Enterprise Ontology model that integrates diverse aspects of externalization. The second contribution is a Web application that validates the model through a case study in banking. Findings: The findings show that the Enterprise Ontology model and the patterns are pragmatic in externalizing the tacit knowledge of experts in a problem-solving scenario within a banking enterprise. Recommendations for Practitioners: Consider the diverse aspects (what, where, when, why, and how) during the tacit knowledge externalization process. Future Research: To extend the Enterprise Ontology model to include externalization from partially automated enterprise systems.
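The certainty-factor calculus used in the measurement view has a compact core: two independent positive pieces of evidence for the same conclusion combine as CF = CF1 + CF2·(1 − CF1), so repeated endorsements raise belief without ever exceeding 1. The sketch below illustrates only that combination rule (the classic MYCIN-style form for same-sign evidence); the scenario and numbers are invented, not taken from the paper's banking case study.

```python
# Parallel combination of two same-sign certainty factors in [0, 1].

def combine_cf(cf1, cf2):
    """Combine two positive certainty factors for the same conclusion."""
    return cf1 + cf2 * (1 - cf1)

# Two experts endorse the same externalized rule with moderate certainty.
print(combine_cf(0.6, 0.5))  # 0.8: combined belief exceeds either alone
```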

  15. Binary versus non-binary information in real time series: empirical results and maximum-entropy matrix models

    Science.gov (United States)

    Almog, Assaf; Garlaschelli, Diego

    2014-09-01

    The dynamics of complex systems, from financial markets to the brain, can be monitored in terms of multiple time series of activity of the constituent units, such as stocks or neurons, respectively. While the main focus of time series analysis is on the magnitude of temporal increments, a significant piece of information is encoded into the binary projection (i.e. the sign) of such increments. In this paper we provide further evidence of this by showing strong nonlinear relations between binary and non-binary properties of financial time series. These relations are a novel quantification of the fact that extreme price increments occur more often when most stocks move in the same direction. We then introduce an information-theoretic approach to the analysis of the binary signature of single and multiple time series. Through the definition of maximum-entropy ensembles of binary matrices and their mapping to spin models in statistical physics, we quantify the information encoded into the simplest binary properties of real time series and identify the most informative property given a set of measurements. Our formalism is able to accurately replicate, and mathematically characterize, the observed binary/non-binary relations. We also obtain a phase diagram allowing us to identify, based only on the instantaneous aggregate return of a set of multiple time series, a regime where the so-called ‘market mode’ has an optimal interpretation in terms of collective (endogenous) effects, a regime where it is parsimoniously explained by pure noise, and a regime where it can be regarded as a combination of endogenous and exogenous factors. Our approach allows us to connect spin models, simple stochastic processes, and ensembles of time series inferred from partial information.
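The binary projection discussed above is simple to compute: reduce each increment to its sign and compare, day by day, how concentrated the signs are with how large the aggregate move is. The tiny sketch below uses invented "stock" increments purely to illustrate the quantities involved, not the paper's maximum-entropy machinery.

```python
# Binary projection of multiple increment series: per time step, the
# fraction moving in the same direction vs the aggregate move. Data invented.

def sign(v):
    return 1 if v > 0 else -1

# rows = time steps, columns = "stocks" (daily increments)
increments = [
    [0.5, 0.2, 0.1, 0.4],     # broad up-move
    [-0.3, -0.1, 0.2, -0.4],  # mixed day
    [-0.6, -0.5, -0.4, -0.7], # broad down-move
]

for day in increments:
    signs = [sign(v) for v in day]
    same_direction = max(signs.count(1), signs.count(-1)) / len(signs)
    aggregate = sum(day)
    print(round(same_direction, 2), round(aggregate, 2))
```

The paper's empirical point is visible even here: the days with the largest aggregate moves are the ones where all signs agree.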

  16. Binary versus non-binary information in real time series: empirical results and maximum-entropy matrix models

    International Nuclear Information System (INIS)

    Almog, Assaf; Garlaschelli, Diego

    2014-01-01

    The dynamics of complex systems, from financial markets to the brain, can be monitored in terms of multiple time series of activity of the constituent units, such as stocks or neurons, respectively. While the main focus of time series analysis is on the magnitude of temporal increments, a significant piece of information is encoded into the binary projection (i.e. the sign) of such increments. In this paper we provide further evidence of this by showing strong nonlinear relations between binary and non-binary properties of financial time series. These relations are a novel quantification of the fact that extreme price increments occur more often when most stocks move in the same direction. We then introduce an information-theoretic approach to the analysis of the binary signature of single and multiple time series. Through the definition of maximum-entropy ensembles of binary matrices and their mapping to spin models in statistical physics, we quantify the information encoded into the simplest binary properties of real time series and identify the most informative property given a set of measurements. Our formalism is able to accurately replicate, and mathematically characterize, the observed binary/non-binary relations. We also obtain a phase diagram allowing us to identify, based only on the instantaneous aggregate return of a set of multiple time series, a regime where the so-called ‘market mode’ has an optimal interpretation in terms of collective (endogenous) effects, a regime where it is parsimoniously explained by pure noise, and a regime where it can be regarded as a combination of endogenous and exogenous factors. Our approach allows us to connect spin models, simple stochastic processes, and ensembles of time series inferred from partial information. (paper)

  17. Approaches in highly parameterized inversion: TSPROC, a general time-series processor to assist in model calibration and result summarization

    Science.gov (United States)

    Westenbroek, Stephen M.; Doherty, John; Walker, John F.; Kelson, Victor A.; Hunt, Randall J.; Cera, Timothy B.

    2012-01-01

    The TSPROC (Time Series PROCessor) computer software uses a simple scripting language to process and analyze time series. It was developed primarily to assist in the calibration of environmental models. The software is designed to perform calculations on time-series data commonly associated with surface-water models, including calculation of flow volumes, transformation by means of basic arithmetic operations, and generation of seasonal and annual statistics and hydrologic indices. TSPROC can also be used to generate some of the key input files required to perform parameter optimization by means of the PEST (Parameter ESTimation) computer software. Through the use of TSPROC, the objective function for use in the model-calibration process can be focused on specific components of a hydrograph.
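
A minimal Python analogue of the kind of calculation TSPROC scripts perform (flow volumes and seasonal statistics from a daily flow series); the function names and signatures here are illustrative, not TSPROC syntax:

```python
def flow_volume(daily_flows_m3s):
    """Total volume (m^3) from mean daily flows in m^3/s (86400 s per day)."""
    return sum(q * 86400 for q in daily_flows_m3s)

def seasonal_means(daily_flows_m3s, months):
    """Mean flow per calendar month, given a parallel list of month numbers."""
    groups = {}
    for q, m in zip(daily_flows_m3s, months):
        groups.setdefault(m, []).append(q)
    return {m: sum(v) / len(v) for m, v in groups.items()}

flows = [10.0, 12.0, 8.0, 20.0]
months = [1, 1, 2, 2]
print(flow_volume(flows))             # 4320000.0
print(seasonal_means(flows, months))  # {1: 11.0, 2: 14.0}
```

Statistics like these, computed on both observed and simulated series, are what feed the PEST objective function when focusing calibration on specific hydrograph components.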

  18. Modelling transport energy demand: A socio-technical approach

    International Nuclear Information System (INIS)

    Anable, Jillian; Brand, Christian; Tran, Martino; Eyre, Nick

    2012-01-01

    Despite an emerging consensus that societal energy consumption and related emissions are not only influenced by technical efficiency but also by lifestyles and socio-cultural factors, few attempts have been made to operationalise these insights in models of energy demand. This paper addresses that gap by presenting a scenario exercise using an integrated suite of sectoral and whole systems models to explore potential energy pathways in the UK transport sector. Techno-economic driven scenarios are contrasted with one in which social change is strongly influenced by concerns about energy use, the environment and well-being. The ‘what if’ Lifestyle scenario reveals a future in which distance travelled by car is reduced by 74% by 2050 and final energy demand from transport is halved compared to the reference case. Despite the more rapid uptake of electric vehicles and the larger share of electricity in final energy demand, it shows a future where electricity decarbonisation could be delayed. The paper illustrates the key trade-off between the more aggressive pursuit of purely technological fixes and demand reduction in the transport sector and concludes there are strong arguments for pursuing both demand and supply side solutions in the pursuit of emissions reduction and energy security.

  19. Some aspects of technical models within bridge management system

    Directory of Open Access Journals (Sweden)

    Grković Slobodan

    2015-01-01

A Bridge Management System (BMS) is a rational and systematic approach to organizing and conducting all activities related to bridge maintenance. The main goal of a BMS is to help the bridge owner make optimal decisions about the maintenance budget, whether for a single bridge or a group of bridges, by ensuring that decisions are based on Life-Cycle Cost (LCC) estimates. The structure of a BMS is based on condition rating and comprises a Bridge Database (BD), a Deterioration Model (DM), a Cost Model for evaluating costs, and an Optimization Model for choosing the most rational maintenance strategy. The relationship of bridge condition rating and the DM to maintenance costs within a BMS is very important. Predictions of the future intensity and rate of bridge deterioration depend on many factors and result from several simultaneous actions and deterioration processes (DP), which need to be included in the DMs. DMs are mostly based on modeling of physical and chemical actions and processes, on statistical analysis of large amounts of data on the condition of existing bridges, or on artificial intelligence models. Stochastic models based on Markov processes are applied within the more advanced contemporary BMSs. Owing to social and economic circumstances and a lack of financial resources over the last two decades in the Republic of Serbia, bridge maintenance was neglected and the creation of a BD was discontinued. The paper discusses some aspects of technical models within a BMS, with emphasis on DMs, reviews the development of a BD for bridge maintenance in the Republic of Serbia, and gives an overview of sophisticated BMSs and current advances in this field.

  20. A robust interrupted time series model for analyzing complex health care intervention data

    KAUST Repository

    Cruz, Maricela

    2017-08-29

    Current health policy calls for greater use of evidence-based care delivery services to improve patient quality and safety outcomes. Care delivery is complex, with interacting and interdependent components that challenge traditional statistical analytic techniques, in particular, when modeling a time series of outcomes data that might be

  1. A robust interrupted time series model for analyzing complex health care intervention data

    KAUST Repository

    Cruz, Maricela; Bender, Miriam; Ombao, Hernando

    2017-01-01

    Current health policy calls for greater use of evidence-based care delivery services to improve patient quality and safety outcomes. Care delivery is complex, with interacting and interdependent components that challenge traditional statistical analytic techniques, in particular, when modeling a time series of outcomes data that might be

  2. A Bayesian Approach for Summarizing and Modeling Time-Series Exposure Data with Left Censoring.

    Science.gov (United States)

    Houseman, E Andres; Virji, M Abbas

    2017-08-01

    Direct reading instruments are valuable tools for measuring exposure as they provide real-time measurements for rapid decision making. However, their use is limited to general survey applications in part due to issues related to their performance. Moreover, statistical analysis of real-time data is complicated by autocorrelation among successive measurements, non-stationary time series, and the presence of left-censoring due to limit-of-detection (LOD). A Bayesian framework is proposed that accounts for non-stationary autocorrelation and LOD issues in exposure time-series data in order to model workplace factors that affect exposure and estimate summary statistics for tasks or other covariates of interest. A spline-based approach is used to model non-stationary autocorrelation with relatively few assumptions about autocorrelation structure. Left-censoring is addressed by integrating over the left tail of the distribution. The model is fit using Markov-Chain Monte Carlo within a Bayesian paradigm. The method can flexibly account for hierarchical relationships, random effects and fixed effects of covariates. The method is implemented using the rjags package in R, and is illustrated by applying it to real-time exposure data. Estimates for task means and covariates from the Bayesian model are compared to those from conventional frequentist models including linear regression, mixed-effects, and time-series models with different autocorrelation structures. Simulations studies are also conducted to evaluate method performance. Simulation studies with percent of measurements below the LOD ranging from 0 to 50% showed lowest root mean squared errors for task means and the least biased standard deviations from the Bayesian model compared to the frequentist models across all levels of LOD. In the application, task means from the Bayesian model were similar to means from the frequentist models, while the standard deviations were different. Parameter estimates for covariates
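
The key device in the abstract, integrating over the left tail of the distribution for measurements below the limit of detection, can be sketched in a few lines. This is a simplified Gaussian log-likelihood, not the paper's spline-based hierarchical Bayesian model; non-detects are represented here as `None`:

```python
import math

def norm_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def norm_cdf(x, mu, sigma):
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

def censored_loglik(data, lod, mu, sigma):
    """Log-likelihood with left-censoring: a non-detect contributes
    P(X < LOD), i.e. the integral over the left tail, instead of a
    point density at an imputed value such as LOD/2."""
    ll = 0.0
    for x in data:
        if x is None:  # below the limit of detection
            ll += math.log(norm_cdf(lod, mu, sigma))
        else:
            ll += math.log(norm_pdf(x, mu, sigma))
    return ll

# One detect at 1.2 and one non-detect with LOD = 0.5, under N(1, 1):
print(censored_loglik([1.2, None], lod=0.5, mu=1.0, sigma=1.0))
```

Maximizing this likelihood (or sampling it with MCMC, as the paper does via rjags) avoids the bias that substitution methods introduce as the censored fraction grows.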

  3. Time series modelling of global mean temperature for managerial decision-making.

    Science.gov (United States)

    Romilly, Peter

    2005-07-01

    Climate change has important implications for business and economic activity. Effective management of climate change impacts will depend on the availability of accurate and cost-effective forecasts. This paper uses univariate time series techniques to model the properties of a global mean temperature dataset in order to develop a parsimonious forecasting model for managerial decision-making over the short-term horizon. Although the model is estimated on global temperature data, the methodology could also be applied to temperature data at more localised levels. The statistical techniques include seasonal and non-seasonal unit root testing with and without structural breaks, as well as ARIMA and GARCH modelling. A forecasting evaluation shows that the chosen model performs well against rival models. The estimation results confirm the findings of a number of previous studies, namely that global mean temperatures increased significantly throughout the 20th century. The use of GARCH modelling also shows the presence of volatility clustering in the temperature data, and a positive association between volatility and global mean temperature.
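
As a toy illustration of the univariate modelling the abstract describes (not the paper's full ARIMA/GARCH specification), the Yule-Walker estimate of an AR(1) coefficient can be computed from the lag-1 autocorrelation:

```python
import random

def fit_ar1(x):
    """Estimate phi in x_t = phi * x_{t-1} + e_t via the lag-1
    autocorrelation (the Yule-Walker estimate for an AR(1) process)."""
    n = len(x)
    mean = sum(x) / n
    num = sum((x[t] - mean) * (x[t - 1] - mean) for t in range(1, n))
    den = sum((v - mean) ** 2 for v in x)
    return num / den

# Simulate an AR(1) series with phi = 0.7 and recover the coefficient.
random.seed(42)
x = [0.0]
for _ in range(5000):
    x.append(0.7 * x[-1] + random.gauss(0, 1))
phi_hat = fit_ar1(x)
print(round(phi_hat, 2))  # a value near 0.7
```

Full ARIMA or GARCH estimation adds differencing, moving-average terms, and a conditional-variance equation on top of this basic autoregressive building block.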

  4. Bridging Scales: A Model-Based Assessment of the Technical Tidal-Stream Energy Resource off Massachusetts, USA

    Science.gov (United States)

    Cowles, G. W.; Hakim, A.; Churchill, J. H.

    2016-02-01

Tidal in-stream energy conversion (TISEC) facilities provide a highly predictable and dependable source of energy, and given the economic and social incentives to migrate towards renewable energy sources there has been tremendous interest in the technology. Key challenges to the design process stem from the wide range of problem scales, extending from device to array. In the present work we apply a multi-model approach to bridge the scales of interest and select optimal device geometries, in order to estimate the technical resource at several realistic sites in the coastal waters of Massachusetts, USA. The approach links two computational models. To establish flow conditions at site scales (~10 m), a barotropic setup of the unstructured-grid ocean model FVCOM is employed; the model is validated using shipboard and fixed ADCP data as well as pressure data. For the device scale, the structured multiblock flow solver SUmb is selected. A large ensemble of simulations of 2D cross-flow tidal turbines is used to construct a surrogate design model, which is then queried using velocity profiles extracted from the tidal model to determine the optimal geometry for the conditions at each site. After device selection, the annual technical yield of the array is evaluated with FVCOM, using a linear momentum actuator disk approach to model the turbines. Results for several key Massachusetts sites, including comparison with theoretical approaches, will be presented.
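
The linear momentum actuator disk model used for the turbines has a simple closed form: the ideal power coefficient is Cp = 4a(1-a)^2 for axial induction factor a, maximized at the Betz limit 16/27. The sketch below evaluates it with illustrative, not site-specific, values for density, disk area, and flow speed:

```python
def power_coefficient(a):
    """Ideal actuator-disk power coefficient: Cp = 4a(1-a)^2."""
    return 4 * a * (1 - a) ** 2

def turbine_power(rho, area, u, a):
    """Extracted power P = 0.5 * rho * A * Cp(a) * U^3, in watts."""
    return 0.5 * rho * area * power_coefficient(a) * u ** 3

# Betz optimum at a = 1/3 gives Cp = 16/27, about 0.593.
print(power_coefficient(1 / 3))
# Seawater (~1025 kg/m^3), a 100 m^2 disk, 2 m/s flow, optimal induction:
print(turbine_power(1025.0, 100.0, 2.0, 1 / 3))
```

The cubic dependence on flow speed is why the velocity profiles extracted from the site-scale tidal model dominate the yield estimate.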

  5. Asymptotics for the conditional-sum-of-squares estimator in multivariate fractional time series models

    DEFF Research Database (Denmark)

    Ørregård Nielsen, Morten

This paper proves consistency and asymptotic normality for the conditional-sum-of-squares estimator, which is equivalent to the conditional maximum likelihood estimator, in multivariate fractional time series models. The model is parametric and quite general and, in particular, encompasses the multivariate non-cointegrated fractional ARIMA model. The novelty of the consistency result, in particular, is that it applies to a multivariate model and to an arbitrarily large set of admissible parameter values, for which the objective function does not converge uniformly in probability, thus making...

  6. Stochastic modeling of neurobiological time series: Power, coherence, Granger causality, and separation of evoked responses from ongoing activity

    Science.gov (United States)

    Chen, Yonghong; Bressler, Steven L.; Knuth, Kevin H.; Truccolo, Wilson A.; Ding, Mingzhou

    2006-06-01

    In this article we consider the stochastic modeling of neurobiological time series from cognitive experiments. Our starting point is the variable-signal-plus-ongoing-activity model. From this model a differentially variable component analysis strategy is developed from a Bayesian perspective to estimate event-related signals on a single trial basis. After subtracting out the event-related signal from recorded single trial time series, the residual ongoing activity is treated as a piecewise stationary stochastic process and analyzed by an adaptive multivariate autoregressive modeling strategy which yields power, coherence, and Granger causality spectra. Results from applying these methods to local field potential recordings from monkeys performing cognitive tasks are presented.

  7. Time Series Modeling of Human Operator Dynamics in Manual Control Tasks

    Science.gov (United States)

    Biezad, D. J.; Schmidt, D. K.

    1984-01-01

    A time-series technique is presented for identifying the dynamic characteristics of the human operator in manual control tasks from relatively short records of experimental data. Control of system excitation signals used in the identification is not required. The approach is a multi-channel identification technique for modeling multi-input/multi-output situations. The method presented includes statistical tests for validity, is designed for digital computation, and yields estimates for the frequency response of the human operator. A comprehensive relative power analysis may also be performed for validated models. This method is applied to several sets of experimental data; the results are discussed and shown to compare favorably with previous research findings. New results are also presented for a multi-input task that was previously modeled to demonstrate the strengths of the method.

  8. Hydrological time series modeling: A comparison between adaptive neuro-fuzzy, neural network and autoregressive techniques

    Science.gov (United States)

    Lohani, A. K.; Kumar, Rakesh; Singh, R. D.

    2012-06-01

Time series modeling is necessary for the planning and management of reservoirs. More recently, soft computing techniques have been used in hydrological modeling and forecasting. In this study, the potential of artificial neural networks and neuro-fuzzy systems for monthly reservoir inflow forecasting is examined by developing and comparing monthly reservoir inflow prediction models based on autoregressive (AR) models, artificial neural networks (ANNs), and an adaptive neural-based fuzzy inference system (ANFIS). To account for the effect of monthly periodicity in the flow data, cyclic terms are also included in the ANN and ANFIS models. Working with time series flow data of the Sutlej River at Bhakra Dam, India, several ANN and adaptive neuro-fuzzy models are trained with different input vectors. To evaluate the performance of the selected ANN and ANFIS models, a comparison is made with the AR models. The ANFIS model trained with an input vector including previous inflows and cyclic terms of monthly periodicity shows a significant improvement in forecast accuracy over the ANFIS models trained with input vectors of previous inflows only. In all cases ANFIS gives more accurate forecasts than the AR and ANN models. The proposed ANFIS model coupled with cyclic terms is shown to provide a better representation of monthly inflows for the planning and operation of the reservoir.

  9. THE EFFECT OF DECOMPOSITION METHOD AS DATA PREPROCESSING ON NEURAL NETWORKS MODEL FOR FORECASTING TREND AND SEASONAL TIME SERIES

    Directory of Open Access Journals (Sweden)

    Subanar Subanar

    2006-01-01

Recently, one of the central topics for the neural networks (NN) community has been data preprocessing for NN models. In this paper we investigate this topic, particularly the effect of the Decomposition method as data preprocessing, and the effective use of NNs for modeling time series with both trend and seasonal patterns. The limited empirical studies on seasonal time series forecasting with neural networks show mixed results: some find that neural networks are able to model seasonality directly and that prior deseasonalization is not necessary, while others conclude just the opposite. In this research, we study the effectiveness of data preprocessing, including detrending and deseasonalization by applying the Decomposition method, on NN modeling and forecasting performance. We use two kinds of data, simulated and real. The simulated data examine multiplicative combinations of trend and seasonality patterns. The results are compared to those obtained from a classical time series model. Our results show that a combination of detrending and deseasonalization via the Decomposition method is an effective data preprocessing step for forecasting trend and seasonal time series with NNs.
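
A minimal sketch of the multiplicative deseasonalization step of the Decomposition method (seasonal indices from within-period means; the paper's exact procedure may differ in detail):

```python
def seasonal_indices(x, period):
    """Mean of each within-period position, normalized so the indices
    average to 1 (multiplicative seasonality)."""
    sums = [0.0] * period
    counts = [0] * period
    for t, v in enumerate(x):
        sums[t % period] += v
        counts[t % period] += 1
    means = [s / c for s, c in zip(sums, counts)]
    grand = sum(means) / period
    return [m / grand for m in means]

def deseasonalize(x, period):
    """Divide each observation by its seasonal index."""
    idx = seasonal_indices(x, period)
    return [v / idx[t % period] for t, v in enumerate(x)]

# A purely seasonal series with period 4: deseasonalizing flattens it.
x = [10, 20, 30, 40] * 3
print(deseasonalize(x, 4))  # every value is approximately 25.0
```

The deseasonalized (and, in the full method, detrended) series is what gets fed to the neural network; the indices are reapplied to the NN output to restore the seasonal pattern in the forecast.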

10. Application of semi parametric modelling to time series forecasting: case of the electricity consumption

    International Nuclear Information System (INIS)

    Lefieux, V.

    2007-10-01

Réseau de Transport d'Électricité (RTE), in charge of operating the French electricity transmission grid, needs accurate forecasts of power consumption in order to operate it correctly. The forecasts used every day result from a model combining a nonlinear parametric regression and a SARIMA model. In order to obtain an adaptive forecasting model, nonparametric forecasting methods have already been tested without real success. In particular, it is known that a nonparametric predictor behaves badly with a large number of explanatory variables, what is commonly called the curse of dimensionality. Recently, semiparametric methods, which improve on the pure nonparametric approach, have been proposed to estimate a regression function. Based on the concept of dimension reduction, one of these methods, MAVE (Moving Average conditional Variance Estimate), can be applied to time series. We study empirically its effectiveness in predicting the future values of an autoregressive time series. We then adapt this method, from a practical point of view, to forecast power consumption. We propose a partially linear semiparametric model, based on the MAVE method, which allows the autoregressive aspect of the problem and the exogenous variables to be taken into account simultaneously. The proposed estimation procedure is practically efficient. (author)

  11. Optimal model-free prediction from multivariate time series

    Science.gov (United States)

    Runge, Jakob; Donner, Reik V.; Kurths, Jürgen

    2015-05-01

    Forecasting a time series from multivariate predictors constitutes a challenging problem, especially using model-free approaches. Most techniques, such as nearest-neighbor prediction, quickly suffer from the curse of dimensionality and overfitting for more than a few predictors which has limited their application mostly to the univariate case. Therefore, selection strategies are needed that harness the available information as efficiently as possible. Since often the right combination of predictors matters, ideally all subsets of possible predictors should be tested for their predictive power, but the exponentially growing number of combinations makes such an approach computationally prohibitive. Here a prediction scheme that overcomes this strong limitation is introduced utilizing a causal preselection step which drastically reduces the number of possible predictors to the most predictive set of causal drivers making a globally optimal search scheme tractable. The information-theoretic optimality is derived and practical selection criteria are discussed. As demonstrated for multivariate nonlinear stochastic delay processes, the optimal scheme can even be less computationally expensive than commonly used suboptimal schemes like forward selection. The method suggests a general framework to apply the optimal model-free approach to select variables and subsequently fit a model to further improve a prediction or learn statistical dependencies. The performance of this framework is illustrated on a climatological index of El Niño Southern Oscillation.
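
Nearest-neighbor prediction, the baseline the abstract starts from, can be sketched with delay embeddings; this toy forecaster is purely illustrative and omits the paper's causal preselection step:

```python
def nn_forecast(x, m, k=1):
    """Model-free one-step forecast: find the k historical delay vectors
    of length m closest to the most recent one, and average the values
    that followed them."""
    target = x[-m:]
    cands = []
    for t in range(len(x) - m):  # window x[t:t+m], successor x[t+m]
        d = sum((a - b) ** 2 for a, b in zip(x[t:t + m], target))
        cands.append((d, x[t + m]))
    cands.sort(key=lambda p: p[0])
    return sum(s for _, s in cands[:k]) / k

# A periodic series: the forecast recovers the next point of the cycle.
x = [0, 1, 2, 3] * 5
print(nn_forecast(x, m=3))  # 0.0
```

With many candidate predictors, the embedding dimension m explodes and distances become uninformative (the curse of dimensionality); the paper's contribution is a causal preselection that shrinks the predictor set before a scheme like this is applied.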

  12. The ab initio model potential method. Second series transition metal elements

    International Nuclear Information System (INIS)

    Barandiaran, Z.; Seijo, L.; Huzinaga, S.

    1990-01-01

The ab initio model potential (AIMP) method has already been presented in its nonrelativistic version and applied to the main-group and first-series transition metal elements [J. Chem. Phys. 86, 2132 (1987); 91, 7011 (1989)]. In this paper we extend the AIMP method to include relativistic effects within the Cowan–Griffin approximation, and we present relativistic Zn-like-core model potentials and valence basis sets, as well as their nonrelativistic Zn-like-core and Kr-like-core counterparts. The pilot molecular calculations on YO, TcO, AgO, and AgH reveal that the 4p orbital is indeed a core orbital only at the end of the series, whereas the 4s orbital can be safely frozen from Y to Cd. The all-electron and model potential results agree to within 0.01–0.02 Å in R_e and 25–50 cm⁻¹ in ν̄_e if the same type of valence basis set is used. The comparison of the relativistic results on AgH with the all-electron Dirac–Fock calculations of Lee and McLean is satisfactory: the absolute value of R_e is reproduced within the 0.01 Å margin, and the relativistic contraction of 0.077 Å is also very well reproduced (0.075 Å). Finally, the relative magnitudes of the effects of the core orbital change, the mass–velocity potential, and the Darwin potential on the net relativistic effects are analyzed for the four molecules studied

  13. Hidden discriminative features extraction for supervised high-order time series modeling.

    Science.gov (United States)

    Nguyen, Ngoc Anh Thi; Yang, Hyung-Jeong; Kim, Sunhee

    2016-11-01

In this paper, an orthogonal Tucker-decomposition-based extraction of high-order discriminative subspaces from a tensor-based time series data structure is presented, named Tensor Discriminative Feature Extraction (TDFE). TDFE relies on the employment of category information for the maximization of the between-class scatter and the minimization of the within-class scatter to extract optimal hidden discriminative feature subspaces that are simultaneously spanned by every modality for supervised tensor modeling. In this context, the proposed tensor-decomposition method provides the following benefits: i) it reduces dimensionality while robustly mining the underlying discriminative features; ii) it results in effective, interpretable features that lead to improved classification and visualization; and iii) it reduces the processing time during the training stage and the filtering of the projection by solving the generalized eigenvalue problem at each alternating step. Two real third-order tensor structures of time series datasets (an epilepsy electroencephalogram (EEG) modeled as channel×frequency bin×time frame, and microarray data modeled as gene×sample×time) were used for the evaluation of TDFE. The experimental results corroborate the advantages of the proposed method, with average classification accuracies of 98.26% and 89.63% for the epilepsy dataset and the microarray dataset, respectively. These averages represent an improvement over those of matrix-based algorithms and recent tensor-based discriminant-decomposition approaches, especially considering the small number of samples used in practice. Copyright © 2016 Elsevier Ltd. All rights reserved.

  14. Technical and economic analysis of hydrogen refuelling

    International Nuclear Information System (INIS)

    Nistor, Silviu; Dave, Saraansh; Fan, Zhong; Sooriyabandara, Mahesh

    2016-01-01

Highlights: • Technical and economic models of a hydrogen station for vehicle refuelling. • Hydrogen demand from fuel cell electric vehicles modelled stochastically. • Case study based on a UK pilot project. • Operation of the H₂ station using combined energy from wind and the power grid is preferred. • A return on investment of 5–10 years is possible for the hydrogen station. - Abstract: This paper focuses on the technical and economic analysis of a hydrogen refuelling station, providing operational insight through tight coupling of technical models of physical processes with economic models. This allows the dynamic relationships of the system to be captured and analysed, providing short/medium-term analytical capability to support system design, planning, and financing. The modelling developed here highlights the need to closely link technical and economic models in technology-led projects where both technical capability and commercial feasibility are important. The results show that hydrogen fuel can be competitive with petrol on a GBP/kg basis if the return-on-investment period is over 10 years for PEM electrolysers and 5 years for alkaline electrolysers. We also show that subsidies on capital costs (as reflected in some R&D funding programmes) make both PEM and alkaline technologies cheaper than the equivalent price of petrol, which suggests more emphasis should be put on commercialising R&D-funded projects, as they have commercial advantages. The paper also shows that a combined wind- and grid-connected station is preferable so that a higher number of customers is served (i.e. minimum shortage of hydrogen).

  15. An Improved Method for Producing High Spatial-Resolution NDVI Time Series Datasets with Multi-Temporal MODIS NDVI Data and Landsat TM/ETM+ Images

    OpenAIRE

    Rao, Yuhan; Zhu, Xiaolin; Chen, Jin; Wang, Jianmin

    2015-01-01

    Due to technical limitations, it is impossible to have high resolution in both spatial and temporal dimensions for current NDVI datasets. Therefore, several methods are developed to produce high resolution (spatial and temporal) NDVI time-series datasets, which face some limitations including high computation loads and unreasonable assumptions. In this study, an unmixing-based method, NDVI Linear Mixing Growth Model (NDVI-LMGM), is proposed to achieve the goal of accurately and efficiently bl...

  16. 76 FR 6584 - Airworthiness Directives; Bombardier, Inc. Model DHC-8-400 Series Airplanes

    Science.gov (United States)

    2011-02-07

    .... Model DHC-8-400 Series Airplanes AGENCY: Federal Aviation Administration (FAA), DOT. ACTION: Notice of... area on the rib at Yw-42.000 to ensure adequate electrical bonding, installing spiral wrap on certain cable assemblies where existing spiral wrap does not extend 4 inches past the tie mounts, applying a...

  17. Statistical properties of fluctuations of time series representing appearances of words in nationwide blog data and their applications: An example of modeling fluctuation scalings of nonstationary time series.

    Science.gov (United States)

    Watanabe, Hayafumi; Sano, Yukie; Takayasu, Hideki; Takayasu, Misako

    2016-11-01

To elucidate the nontrivial empirical statistical properties of fluctuations in a typical nonsteady time series representing the appearance of words in blogs, we investigated approximately 3×10⁹ Japanese blog articles over a period of six years and analyzed some corresponding mathematical models. First, we introduce a solvable nonsteady extension of the random diffusion model, which can be deduced by modeling the behavior of heterogeneous random bloggers. Next, we deduce theoretical expressions for both the temporal and ensemble fluctuation scalings of this model, and demonstrate that these expressions can reproduce all empirical scalings over eight orders of magnitude. Furthermore, we show that the model can reproduce other statistical properties of time series representing the appearance of words in blogs, such as the functional forms of the probability density and correlations in the total number of blogs. As an application, we quantify the abnormality of special nationwide events by measuring the fluctuation scalings of 1771 basic adjectives.
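
Fluctuation scaling (Taylor's law) relates the standard deviation of a word's counts to its mean, sigma ~ mean^alpha. A minimal estimator of alpha as the log-log slope, checked against the two analytic limits (Poisson-like independence gives alpha = 1/2; perfect synchronization gives alpha = 1), purely as an illustration, not the paper's random diffusion model:

```python
import math

def scaling_exponent(means, stds):
    """Least-squares slope of log(std) vs log(mean): the exponent alpha
    in the fluctuation-scaling relation std ~ mean^alpha."""
    xs = [math.log(m) for m in means]
    ys = [math.log(s) for s in stds]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

means = [1.0, 10.0, 100.0, 1000.0]
# Poisson-like words (variance = mean) give alpha = 0.5:
print(scaling_exponent(means, [math.sqrt(m) for m in means]))
# Perfectly synchronized words (std proportional to mean) give alpha = 1:
print(scaling_exponent(means, [0.3 * m for m in means]))
```

Empirical word-count data typically fall between these limits, and the paper's point is that the measured exponents, temporal and ensemble, discriminate between the candidate stochastic models.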

  18. Technical Leadership Development Program-Year 3

    Science.gov (United States)

    2012-08-30

...NASA, Nokia, BAE Systems, the DoD, and the Australian government to develop our initial competency model. These models were discussed in deliverable... solutions to pursue and why they should be built. 2. Help teams solve technical problems holistically, overcoming technical and non

  19. Stochastic modeling for time series InSAR: with emphasis on atmospheric effects

    Science.gov (United States)

    Cao, Yunmeng; Li, Zhiwei; Wei, Jianchao; Hu, Jun; Duan, Meng; Feng, Guangcai

    2018-02-01

Despite the many applications of time series interferometric synthetic aperture radar (TS-InSAR) techniques to geophysical problems, error analysis and assessment have been largely overlooked. Tropospheric propagation error is still the dominant error source in InSAR observations. However, the spatiotemporal variation of atmospheric effects is seldom considered in the present standard TS-InSAR techniques, such as persistent scatterer interferometry and small baseline subset interferometry. The failure to consider the stochastic properties of atmospheric effects not only affects the accuracy of the estimators, but also makes it difficult to assess the uncertainty of the final geophysical results. To address this issue, this paper proposes a network-based variance-covariance estimation method to model the spatiotemporal variation of tropospheric signals and to estimate the temporal variance-covariance matrix of TS-InSAR observations. The constructed stochastic model is then incorporated into the TS-InSAR estimators, both for parameter estimation (e.g., deformation velocity, topography residual) and for uncertainty assessment. It is an incremental and positive improvement over the traditional weighted least squares methods used to solve multitemporal InSAR time series. The performance of the proposed method is validated using both simulated and real datasets.
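
The estimation step the paper builds on, weighted least squares with a variance model, can be sketched for a single pixel's displacement series. This toy uses a diagonal variance-covariance matrix with hypothetical numbers (the paper's contribution is precisely to estimate a full temporal covariance, which would require a general linear solver):

```python
def wls_line(t, y, var):
    """Weighted least-squares fit of y = v*t + c with weights 1/var.
    Closed form for the 2-parameter case with a diagonal covariance."""
    w = [1.0 / s for s in var]
    S = sum(w)
    St = sum(wi * ti for wi, ti in zip(w, t))
    Stt = sum(wi * ti * ti for wi, ti in zip(w, t))
    Sy = sum(wi * yi for wi, yi in zip(w, y))
    Sty = sum(wi * ti * yi for wi, ti, yi in zip(w, t, y))
    det = S * Stt - St * St
    v = (S * Sty - St * Sy) / det   # velocity
    c = (Stt * Sy - St * Sty) / det  # intercept
    return v, c

# Deformation at 2 mm/yr; the noisy epoch (variance 100) is down-weighted.
t = [0.0, 1.0, 2.0, 3.0]
y = [0.0, 2.0, 4.1, 6.0]
v, c = wls_line(t, y, var=[1.0, 1.0, 100.0, 1.0])
print(round(v, 2))  # close to 2.0
```

Mis-specifying `var`, e.g. treating atmospherically noisy epochs as precise, biases both the velocity estimate and its formal uncertainty, which is the failure mode the abstract describes.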

  20. Documentation for Grants Equal to Tax model: Volume 1, Technical description

    International Nuclear Information System (INIS)

    1986-01-01

A computerized model, the Grants Equal to Tax (GETT) model, was developed to assist in evaluating the amount of federal grant monies that would go to state and local jurisdictions under the provisions outlined in the Nuclear Waste Policy Act of 1982. The GETT model is capable of forecasting the amount of tax liability associated with all property owned and all activities undertaken by the US Department of Energy (DOE) in site characterization and repository development. The GETT program is a user-friendly, menu-driven model developed using dBASE III™, a relational data base management system. The data base for GETT consists primarily of eight separate dBASE III™ files corresponding to each of the eight taxes levied by state and local jurisdictions on business property and activity. Additional smaller files help to control model inputs and reporting options. Volume 1 of the GETT model documentation is a technical description of the program and its capabilities providing (1) descriptions of the data management system and its procedures; (2) formulas for calculating taxes (illustrated with flow charts); (3) descriptions of tax data base variables for the Deaf Smith County, Texas, Richton Dome, Mississippi, and Davis Canyon, Utah, salt sites; and (4) data inputs for the GETT model. 10 refs., 18 figs., 3 tabs