WorldWideScience

Sample records for model technical series

  1. Model Selection and Quality Estimation of Time Series Models for Artificial Technical Surface Generation

    Directory of Open Access Journals (Sweden)

    Matthias Eifler

    2017-12-01

    Full Text Available Standard-compliant parameter calculation in surface topography analysis takes the manufacturing process into account. The measurement technician can thus be supported with automated suggestions for preprocessing, filtering and evaluating the measurement data based on the character of the surface topography. Artificial neural networks (ANNs) are one approach for the recognition or classification of technical surfaces. However, the required set of training data for an ANN is often not available, especially when data acquisition is time-consuming or expensive, as it is when measuring surface topography. Generating artificial (simulated) data therefore becomes of interest. An approach from time series analysis is chosen and examined regarding its suitability for describing technical surfaces: the ARMAsel model, a time series modelling approach that chooses the statistical model with the smallest prediction error and the best number of coefficients for a given surface. With a reliable model that captures the relevant stochastic properties of a surface, training data for artificial neural network classifiers can be generated. Based on the ARMA coefficients determined by the ARMAsel approach, many different artificial surfaces can be generated from only a few measured datasets and used to train the classifiers of an artificial neural network. In doing so, an improved calculation of the model input data for generating artificial surfaces is possible, as the training data generation is based on actual measurement data. The trained artificial neural network is tested with actual measurement data of surfaces manufactured with various methods, and a recognition rate of the corresponding manufacturing principle between 60% and 78% is determined. This means that, based on only a few measured datasets, stochastic surface information for various manufacturing principles can be extracted.
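    The generation step described above can be illustrated with a minimal sketch: simulate ARMA realizations from fixed coefficients, varying only the random seed to obtain many different artificial profiles. The function name, coefficients and profile length are illustrative assumptions, not the ARMAsel implementation used in the paper.

```python
from collections import deque
import random

def generate_arma_profile(ar, ma, n, noise_sigma=1.0, seed=0):
    """One artificial profile as a realization of
    x_t = sum_i ar[i]*x_{t-1-i} + eps_t + sum_j ma[j]*eps_{t-1-j}."""
    rng = random.Random(seed)
    past_x = deque([0.0] * len(ar), maxlen=len(ar))  # most recent value first
    past_e = deque([0.0] * len(ma), maxlen=len(ma))
    out = []
    for _ in range(n):
        eps = rng.gauss(0.0, noise_sigma)
        val = (eps + sum(a * v for a, v in zip(ar, past_x))
                   + sum(b * v for b, v in zip(ma, past_e)))
        past_x.appendleft(val)
        past_e.appendleft(eps)
        out.append(val)
    return out

# Many different training profiles from the same (stationary) coefficients:
# vary only the seed.
profiles = [generate_arma_profile([0.8, -0.2], [0.3], 512, seed=s)
            for s in range(10)]
```

    Each profile shares the stochastic character encoded in the coefficients while differing in its concrete realization, which is the property exploited for training-data generation.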

  2. Technical Report Series on Global Modeling and Data Assimilation, Volume 41 : GDIS Workshop Report

    Science.gov (United States)

    Koster, Randal D. (Editor); Schubert, Siegfried; Pozzi, Will; Mo, Kingtse; Wood, Eric F.; Stahl, Kerstin; Hayes, Mike; Vogt, Juergen; Seneviratne, Sonia; Stewart, Ron; et al.

    2015-01-01

    The workshop "An International Global Drought Information System Workshop: Next Steps" was held on 10-13 December 2014 in Pasadena, California. The more than 60 participants from 15 countries spanned the drought research community and included select representatives from applications communities as well as providers of regional and global drought information products. The workshop was sponsored and supported by the US National Integrated Drought Information System (NIDIS) program, the World Climate Research Program (WCRP: GEWEX, CLIVAR), the World Meteorological Organization (WMO), the Group on Earth Observations (GEO), the European Commission Joint Research Centre (JRC), the US Climate Variability and Predictability (CLIVAR) program, and the US National Oceanic and Atmospheric Administration (NOAA) programs on Modeling, Analysis, Predictions and Projections (MAPP) and Climate Variability & Predictability (CVP). NASA/JPL hosted the workshop with logistical support provided by the GEWEX program office. The goal of the workshop was to build on past Global Drought Information System (GDIS) progress toward developing an experimental global drought information system. Specific goals were threefold: (i) to review recent research results focused on understanding drought mechanisms and their predictability on a wide range of time scales and to identify gaps in understanding that could be addressed by coordinated research; (ii) to help ensure that WCRP research priorities mesh with efforts to build capacity to address drought at the regional level; and (iii) to produce an implementation plan for a short duration pilot project to demonstrate current GDIS capabilities. See http://www.wcrp-climate.org/gdis-wkshp-2014-objectives for more information.

  3. Technical Report Series on Global Modeling and Data Assimilation, Volume 43. MERRA-2; Initial Evaluation of the Climate

    Science.gov (United States)

    Koster, Randal D. (Editor); Bosilovich, Michael G.; Akella, Santha; Lawrence, Coy; Cullather, Richard; Draper, Clara; Gelaro, Ronald; Kovach, Robin; Liu, Qing; Molod, Andrea; et al.

    2015-01-01

    The years since the introduction of MERRA have seen numerous advances in the GEOS-5 Data Assimilation System as well as a substantial decrease in the number of observations that can be assimilated into the MERRA system. To allow continued data processing into the future, and to take advantage of several important innovations that could improve system performance, a decision was made to produce MERRA-2, an updated retrospective analysis of the full modern satellite era. One of the many advances in MERRA-2 is a constraint on the global dry mass balance; this allows the global changes in water by the analysis increment to be near zero, thereby minimizing abrupt global interannual variations due to changes in the observing system. In addition, MERRA-2 includes the assimilation of interactive aerosols into the system, a feature of the Earth system absent from previous reanalyses. Also, in an effort to improve land surface hydrology, observations-corrected precipitation forcing is used instead of model-generated precipitation. Overall, MERRA-2 takes advantage of numerous updates to the global modeling and data assimilation system. In this document, we summarize an initial evaluation of the climate in MERRA-2, from the surface to the stratosphere and from the tropics to the poles. Strengths and weaknesses of the MERRA-2 climate are accordingly emphasized.

  4. MCFire model technical description

    Science.gov (United States)

    David R. Conklin; James M. Lenihan; Dominique Bachelet; Ronald P. Neilson; John B. Kim

    2016-01-01

    MCFire is a computer program that simulates the occurrence and effects of wildfire on natural vegetation, as a submodel within the MC1 dynamic global vegetation model. This report is a technical description of the algorithms and parameter values used in MCFire, intended to encapsulate its design and features at a level that is more conceptual than the level...

  5. SERI biomass program annual technical report: 1982

    Energy Technology Data Exchange (ETDEWEB)

    Bergeron, P.W.; Corder, R.E.; Hill, A.M.; Lindsey, H.; Lowenstein, M.Z.

    1983-02-01

    The biomass with which this report is concerned includes aquatic plants, which can be converted into liquid fuels and chemicals; organic wastes (crop residues as well as animal and municipal wastes), from which biogas can be produced via anaerobic digestion; and organic or inorganic waste streams, from which hydrogen can be produced by photobiological processes. The Biomass Program Office supports research in three areas which, although distinct, all use living organisms to create the desired products. The Aquatic Species Program (ASP) supports research on organisms that are themselves processed into the final products, while the Anaerobic Digestion Program (ADP) and the Photo/Biological Hydrogen Program (P/BHP) deal with organisms that transform waste streams into energy products. The P/BHP is also investigating systems using water as a feedstock and cell-free systems which do not utilize living organisms. This report summarizes the progress and research accomplishments of the SERI Biomass Program during FY 1982.

  6. Introduction to Time Series Modeling

    CERN Document Server

    Kitagawa, Genshiro

    2010-01-01

    In time series modeling, the behavior of a certain phenomenon is expressed in relation to the past values of itself and other covariates. Since many important phenomena in statistical analysis are actually time series and the identification of conditional distribution of the phenomenon is an essential part of the statistical modeling, it is very important and useful to learn fundamental methods of time series modeling. Illustrating how to build models for time series using basic methods, "Introduction to Time Series Modeling" covers numerous time series models and the various tools f

  7. Modelling bursty time series

    International Nuclear Information System (INIS)

    Vajna, Szabolcs; Kertész, János; Tóth, Bálint

    2013-01-01

    Many human-related activities show power-law decaying interevent time distribution with exponents usually varying between 1 and 2. We study a simple task-queuing model, which produces bursty time series due to the non-trivial dynamics of the task list. The model is characterized by a priority distribution as an input parameter, which describes the choice procedure from the list. We give exact results on the asymptotic behaviour of the model and we show that the interevent time distribution is power-law decaying for any kind of input distributions that remain normalizable in the infinite list limit, with exponents tunable between 1 and 2. The model satisfies a scaling law between the exponents of interevent time distribution (β) and autocorrelation function (α): α + β = 2. This law is general for renewal processes with power-law decaying interevent time distribution. We conclude that slowly decaying autocorrelation function indicates long-range dependence only if the scaling law is violated. (paper)
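    The task-queuing dynamics described in the abstract can be sketched with a minimal simulation. The version below assumes a deterministic "execute the highest-priority task" rule (one limiting case of the priority distribution) and records how long each task waited before execution; the heavy tail of these waiting times is what makes the resulting event series bursty.

```python
import random

def simulate_queue(list_len=2, steps=20000, seed=1):
    """Task list of fixed length: at every step the highest-priority task is
    executed and replaced by a fresh task with a uniformly random priority.
    Returns the waiting time (steps spent on the list) of each executed task."""
    rng = random.Random(seed)
    tasks = [(rng.random(), 0) for _ in range(list_len)]  # (priority, arrival step)
    waits = []
    for now in range(1, steps + 1):
        i = max(range(list_len), key=lambda k: tasks[k][0])
        waits.append(now - tasks[i][1])
        tasks[i] = (rng.random(), now)
    return waits

waits = simulate_queue()
# Most tasks are executed within a step or two, but a few wait orders of
# magnitude longer: a heavy-tailed waiting-time distribution.
```

    For a task of priority q, each newly arriving competitor beats it with probability 1 - q, so its waiting time is roughly geometric with parameter q; averaging over uniform q yields the power-law tail discussed in the paper.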

  8. Manhattan Project Technical Series: The Chemistry of Uranium (I)

    International Nuclear Information System (INIS)

    Rabinowitch, E. I.; Katz, J. J.

    1947-01-01

    This constitutes Chapters 11 through 16, inclusive, of the Survey Volume on Uranium Chemistry prepared for the Manhattan Project Technical Series. Chapters are titled: Uranium Oxides, Sulfides, Selenides, and Tellurides; The Non-Volatile Fluorides of Uranium; Uranium Hexafluoride; Uranium-Chlorine Compounds; Bromides, Iodides, and Pseudo-Halides of Uranium; and Oxyhalides of Uranium.

  9. Manhattan Project Technical Series: The Chemistry of Uranium (I)

    Energy Technology Data Exchange (ETDEWEB)

    Rabinowitch, E. I. [Argonne National Lab. (ANL), Argonne, IL (United States); Katz, J. J. [Argonne National Lab. (ANL), Argonne, IL (United States)

    1947-03-10

    This constitutes Chapters 11 through 16, inclusive, of the Survey Volume on Uranium Chemistry prepared for the Manhattan Project Technical Series. Chapters are titled: Uranium Oxides, Sulfides, Selenides, and Tellurides; The Non-Volatile Fluorides of Uranium; Uranium Hexafluoride; Uranium-Chlorine Compounds; Bromides, Iodides, and Pseudo-Halides of Uranium; and Oxyhalides of Uranium.

  10. FOURIER SERIES MODELS THROUGH TRANSFORMATION

    African Journals Online (AJOL)

    This study considers the application of Fourier series analysis (FSA) to seasonal time series data. The ultimate objective of the study is to construct an FSA model that can lead to reliable forecast. Specifically, the study evaluates data for the assumptions of time series analysis; applies the necessary transformation to the ...
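    The Fourier series construction referred to above can be sketched in a few lines: over a whole number of seasonal cycles the sine/cosine regressors are orthogonal, so the Fourier coefficients reduce to simple weighted sums rather than a general regression. The period, number of harmonics and the toy monthly data below are illustrative assumptions.

```python
import math

def fourier_fit(y, period, harmonics):
    """Fit a seasonal pattern with a truncated Fourier series. Assumes len(y)
    is a whole number of periods and harmonics < period/2, so the usual
    orthogonality relations give the coefficients directly."""
    n = len(y)
    assert n % period == 0
    a0 = sum(y) / n
    coef = []
    for k in range(1, harmonics + 1):
        ak = 2.0 / n * sum(v * math.cos(2 * math.pi * k * t / period)
                           for t, v in enumerate(y))
        bk = 2.0 / n * sum(v * math.sin(2 * math.pi * k * t / period)
                           for t, v in enumerate(y))
        coef.append((ak, bk))

    def model(t):
        return a0 + sum(a * math.cos(2 * math.pi * (k + 1) * t / period) +
                        b * math.sin(2 * math.pi * (k + 1) * t / period)
                        for k, (a, b) in enumerate(coef))
    return model

# Toy example: 4 years of monthly data with a single annual harmonic.
data = [10 + 3 * math.cos(2 * math.pi * t / 12) for t in range(48)]
fit = fourier_fit(data, period=12, harmonics=1)
```

    Evaluating the fitted model beyond the sample extrapolates the seasonal pattern, which is how such a model is used for forecasting.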

  11. Technical Report Series on Global Modeling and Data Assimilation. Volume 31; Global Surface Ocean Carbon Estimates in a Model Forced by MERRA

    Science.gov (United States)

    Gregg, Watson W.; Casey, Nancy W.; Rousseaux, Cecile S.

    2013-01-01

    MERRA products were used to force an established ocean biogeochemical model to estimate surface carbon inventories and fluxes in the global oceans. The results were compared to public archives of in situ carbon data and estimates. The model exhibited skill for ocean dissolved inorganic carbon (DIC), partial pressure of ocean CO2 (pCO2) and air-sea fluxes (FCO2). The MERRA-forced model produced global mean differences of 0.02% (approximately 0.3 µmol kg^-1) for DIC, -0.3% (about -1.2 µatm; model lower) for pCO2, and -2.3% (-0.003 mol C m^-2 y^-1) for FCO2 compared to in situ estimates. Basin-scale distributions were significantly correlated with observations for all three variables (r=0.97, 0.76, and 0.73, P<0.05, respectively for DIC, pCO2, and FCO2). All major oceanographic basins were represented as sources to the atmosphere or sinks in agreement with in situ estimates. However, there were substantial basin-scale and local departures.

  12. The Effects of Chlorophyll Assimilation on Carbon Fluxes in a Global Biogeochemical Model. [Technical Report Series on Global Modeling and Data Assimilation

    Science.gov (United States)

    Koster, Randal D. (Editor); Rousseaux, Cecile Severine; Gregg, Watson W.

    2014-01-01

    In this paper, we investigated whether the assimilation of remotely-sensed chlorophyll data can improve estimates of air-sea carbon dioxide fluxes (FCO2). Using a global, established biogeochemical model (NASA Ocean Biogeochemical Model, NOBM) for the period 2003-2010, we found that the global FCO2 values produced in the free-run and after assimilation were within 0.6 mol C m^-2 y^-1 of the observations. The effect of satellite chlorophyll assimilation was assessed in 12 major oceanographic regions. The region with the highest bias was the North Atlantic; here the model underestimated the fluxes by 1.4 mol C m^-2 y^-1, whereas all the other regions were within 1 mol C m^-2 y^-1 of the data. The FCO2 values were not strongly impacted by the assimilation, and the uncertainty in FCO2 was not decreased, despite the decrease in the uncertainty in chlorophyll concentration. Chlorophyll concentrations were within approximately 25% of the database in 7 out of the 12 regions, and the assimilation improved the chlorophyll concentration in the regions with the highest bias by 10-20%. These results suggest that the assimilation of chlorophyll data does not considerably improve FCO2 estimates and that other components of the carbon cycle play a role whose representation could further improve our FCO2 estimates.

  13. Models for dependent time series

    CERN Document Server

    Tunnicliffe Wilson, Granville; Haywood, John

    2015-01-01

    Models for Dependent Time Series addresses the issues that arise and the methodology that can be applied when the dependence between time series is described and modeled. Whether you work in the economic, physical, or life sciences, the book shows you how to draw meaningful, applicable, and statistically valid conclusions from multivariate (or vector) time series data.The first four chapters discuss the two main pillars of the subject that have been developed over the last 60 years: vector autoregressive modeling and multivariate spectral analysis. These chapters provide the foundational mater

  14. Stochastic models for time series

    CERN Document Server

    Doukhan, Paul

    2018-01-01

    This book presents essential tools for modelling non-linear time series. The first part of the book describes the main standard tools of probability and statistics that directly apply to the time series context to obtain a wide range of modelling possibilities. Functional estimation and bootstrap are discussed, and stationarity is reviewed. The second part describes a number of tools from Gaussian chaos and proposes a tour of linear time series models. It goes on to address nonlinearity from polynomial or chaotic models for which explicit expansions are available, then turns to Markov and non-Markov linear models and discusses Bernoulli shifts time series models. Finally, the volume focuses on the limit theory, starting with the ergodic theorem, which is seen as the first step for statistics of time series. It defines the distributional range to obtain generic tools for limit theory under long or short-range dependences (LRD/SRD) and explains examples of LRD behaviours. More general techniques (central limit ...

  15. SeaWiFS technical report series. Volume 11: Analysis of selected orbit propagation models for the SeaWiFS mission

    Science.gov (United States)

    Patt, Frederick S.; Hoisington, Charles M.; Gregg, Watson W.; Coronado, Patrick L.; Hooker, Stanford B. (Editor); Firestone, Elaine R. (Editor); Indest, A. W. (Editor)

    1993-01-01

    An analysis of orbit propagation models was performed by the Mission Operations element of the Sea-viewing Wide Field-of-View Sensor (SeaWiFS) Project, which has overall responsibility for the instrument scheduling. The orbit propagators selected for this analysis are widely available general perturbations models. The analysis includes both absolute accuracy determination and comparisons of different versions of the models. The results show that all of the models tested meet accuracy requirements for scheduling and data acquisition purposes. For internal Project use the SGP4 propagator, developed by the North American Aerospace Defense Command (NORAD), has been selected. This model includes atmospheric drag effects and, therefore, provides better accuracy. For High Resolution Picture Transmission (HRPT) ground stations, which have less stringent accuracy requirements, the publicly available Brouwer-Lyddane models are recommended. The SeaWiFS Project will make available portable source code for a version of this model developed by the Data Capture Facility (DCF).

  16. Symptomatic thoracic spinal cord herniation: case series and technical report.

    Science.gov (United States)

    Hawasli, Ammar H; Ray, Wilson Z; Wright, Neill M

    2014-09-01

    Idiopathic spinal cord herniation (ISCH) is an uncommon condition located predominantly in the thoracic spine and often associated with a remote history of a major traumatic injury. ISCH has an incompletely described presentation and unknown etiology. There is no consensus on the treatment algorithm and surgical technique, and there are few data on clinical outcomes. In this case series and technical report, we describe the atypical myelopathy presentation, remote history of traumatic injury, radiographic progression, treatment, and outcomes of 5 patients treated at Washington University for symptomatic ISCH. A video showing surgical repair is presented. In contrast to classic compressive myelopathy symptomatology, ISCH patients presented with an atypical myelopathy, characterized by asymmetric motor and sensory deficits and early-onset urinary incontinence. Clinical deterioration correlated with progressive spinal cord displacement and herniation observed on yearly spinal imaging in a patient imaged serially because of multiple sclerosis. Finally, compared with compressive myelopathy in the thoracic spine, surgical treatment of ISCH led to rapid improvement despite a long duration of symptoms. Symptomatic ISCH presents with atypical myelopathy and slow temporal progression and can be successfully managed with surgical repair.

  17. Technical Report Series on Global Modeling and Data Assimilation. Volume 32; Estimates of AOD Trends (2002 - 2012) Over the World's Major Cities Based on the MERRA Aerosol Reanalysis

    Science.gov (United States)

    Provencal, Simon; Kishcha, Pavel; Elhacham, Emily; daSilva, Arlindo M.; Alpert, Pinhas; Suarez, Max J.

    2014-01-01

    NASA's Global Modeling and Assimilation Office has extended the Modern-Era Retrospective Analysis for Research and Applications (MERRA) tool with five atmospheric aerosol species (sulfates, organic carbon, black carbon, mineral dust and sea salt). This aerosol reanalysis is known as MERRAero. This study analyzes ten years (July 2002 - June 2012) of the MERRAero aerosol reanalysis, applied to the study of aerosol optical depth (AOD) and its trends for the aforementioned aerosol species over the world's major cities (those with over 2 million inhabitants). We found that the proportion of each aerosol species in total AOD exhibited a geographical dependence. Cities in industrialized regions (North America, Europe, central and eastern Asia) are characterized by a strong proportion of sulfate aerosols. Organic carbon aerosols are dominant over cities located in regions where biomass burning frequently occurs (South America and southern Africa). Mineral dust dominates other aerosol species in cities located in proximity to the major deserts (northern Africa and western Asia). Sea salt aerosols are prominent in coastal cities but are the dominant aerosol species in very few of them. AOD trends are declining over cities in North America, Europe and Japan, as a result of effective air quality regulation. By contrast, the economic boom in China and India has led to increasing AOD trends over most cities in these two highly-populated countries. Increasing AOD trends over cities in the Middle East are caused by increasing desert dust.

  18. Women and Technical Professions. Leonardo da Vinci Series: Good Practices.

    Science.gov (United States)

    Commission of the European Communities, Brussels (Belgium). Directorate-General for Education and Culture.

    This document profiles programs for women in technical professions that are offered through the European Commission's Leonardo da Vinci program. The following programs are profiled: (1) Artemis and Diana (vocational guidance programs to help direct girls toward technology-related careers); (2) CEEWIT (an Internet-based information and…

  19. Fourier series models through transformation

    African Journals Online (AJOL)

    This study considers the application of Fourier series analysis (FSA) to seasonal time series data. The ultimate objective of the study is to construct an FSA model that can lead to reliable forecast. Specifically, the study evaluates data for the assumptions of time series analysis; applies the necessary transformation to the ...

  20. Lag space estimation in time series modelling

    DEFF Research Database (Denmark)

    Goutte, Cyril

    1997-01-01

    The purpose of this article is to investigate some techniques for finding the relevant lag-space, i.e. input information, for time series modelling. This is an important aspect of time series modelling, as it conditions the design of the model through the regressor vector a.k.a. the input layer...
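    A common concrete approach to the lag-space problem described above is to fit autoregressive models of increasing order and choose the lag by an information criterion. The sketch below uses the Levinson-Durbin recursion on the sample autocovariances together with AIC; it is a generic illustration under those assumptions, not the technique proposed in the article.

```python
import math
import random

def autocov(x, max_lag):
    """Biased sample autocovariances r_0 .. r_max_lag (assumes x is not constant)."""
    n, m = len(x), sum(x) / len(x)
    return [sum((x[t] - m) * (x[t - k] - m) for t in range(k, n)) / n
            for k in range(max_lag + 1)]

def select_ar_order(x, max_order):
    """Pick the AR lag order minimizing AIC = n*log(innovation variance) + 2p,
    with the innovation variance from the Levinson-Durbin recursion."""
    r = autocov(x, max_order)
    n, sigma2, a = len(x), r[0], []
    best_aic, best_p = n * math.log(sigma2), 0
    for p in range(1, max_order + 1):
        k = (r[p] - sum(a[i] * r[p - 1 - i] for i in range(p - 1))) / sigma2
        a = [a[i] - k * a[p - 2 - i] for i in range(p - 1)] + [k]  # order update
        sigma2 *= (1.0 - k * k)
        aic = n * math.log(sigma2) + 2 * p
        if aic < best_aic:
            best_aic, best_p = aic, p
    return best_p

# Hypothetical example: an AR(1) series should require at least one lag.
rng = random.Random(7)
series = [0.0]
for _ in range(1500):
    series.append(0.9 * series[-1] + rng.gauss(0.0, 1.0))
order = select_ar_order(series, max_order=8)
```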

  1. Time series modeling, computation, and inference

    CERN Document Server

    Prado, Raquel

    2010-01-01

    The authors systematically develop a state-of-the-art analysis and modeling of time series. … this book is well organized and well written. The authors present various statistical models for engineers to solve problems in time series analysis. Readers no doubt will learn state-of-the-art techniques from this book. (Hsun-Hsien Chang, Computing Reviews, March 2012) My favorite chapters were on dynamic linear models and vector AR and vector ARMA models. (William Seaver, Technometrics, August 2011) … a very modern entry to the field of time-series modelling, with a rich reference list of the current lit

  2. Technical discussions on Emissions and Atmospheric Modeling (TEAM)

    Science.gov (United States)

    Frost, G. J.; Henderson, B.; Lefer, B. L.

    2017-12-01

    A new informal activity, Technical discussions on Emissions and Atmospheric Modeling (TEAM), aims to improve the scientific understanding of emissions and atmospheric processes by leveraging resources through coordination, communication and collaboration between scientists in the Nation's environmental agencies. TEAM seeks to close information gaps that may be limiting emission inventory development and atmospheric modeling and to help identify related research areas that could benefit from additional coordinated efforts. TEAM is designed around webinars and in-person meetings on particular topics that are intended to facilitate active and sustained informal communications between technical staff at different agencies. The first series of TEAM webinars focuses on emissions of nitrogen oxides, a criteria pollutant impacting human and ecosystem health and a key precursor of ozone and particulate matter. Technical staff at Federal agencies with specific interests in emissions and atmospheric modeling are welcome to participate in TEAM.

  3. Modelling conditional heteroscedasticity in nonstationary series

    NARCIS (Netherlands)

    Cizek, P.; Cizek, P.; Härdle, W.K.; Weron, R.

    2011-01-01

    A vast amount of econometrical and statistical research deals with modeling financial time series and their volatility, which measures the dispersion of a series at a point in time (i.e., conditional variance). Although financial markets have been experiencing many shorter and longer periods of

  4. Modelling and Analysing Socio-Technical Systems

    DEFF Research Database (Denmark)

    Aslanyan, Zaruhi; Ivanova, Marieta Georgieva; Nielson, Flemming

    2015-01-01

    Modern organisations are complex, socio-technical systems consisting of a mixture of physical infrastructure, human actors, policies and processes. An increasing number of attacks on these organisations exploits vulnerabilities on all different levels, for example combining a malware attack with social engineering. Due to this combination of attack steps on technical and social levels, risk assessment in socio-technical systems is complex. Therefore, established risk assessment methods often abstract away the internal structure of an organisation and ignore human factors when modelling and assessing attacks. In our work we model all relevant levels of socio-technical systems, and propose evaluation techniques for analysing the security properties of the model. Our approach simplifies the identification of possible attacks and provides qualified assessment and ranking of attacks based...

  5. Forecasting with nonlinear time series models

    DEFF Research Database (Denmark)

    Kock, Anders Bredahl; Teräsvirta, Timo

    In this paper, nonlinear models are restricted to mean nonlinear parametric models. Several such models popular in time series econometrics are presented and some of their properties discussed. This includes two models based on universal approximators: the Kolmogorov-Gabor polynomial model… applied to economic forecasting problems, is briefly highlighted. A number of large published studies comparing macroeconomic forecasts obtained using different time series models are discussed, and the paper also contains a small simulation study comparing recursive and direct forecasts in a partic…
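    The recursive-versus-direct distinction mentioned above can be made concrete with a linear toy example: the recursive forecast iterates a fitted one-step model h times, while the direct forecast regresses x[t+h] on x[t] in a single step. The AR(1) data and the regression helper below are illustrative assumptions, not material from the paper.

```python
import random

def fit_lag_regression(x, horizon):
    """OLS fit of x[t+horizon] on x[t]; returns (slope, intercept)."""
    pairs = [(x[t], x[t + horizon]) for t in range(len(x) - horizon)]
    n = len(pairs)
    mx = sum(a for a, _ in pairs) / n
    my = sum(b for _, b in pairs) / n
    slope = (sum((a - mx) * (b - my) for a, b in pairs)
             / sum((a - mx) ** 2 for a, _ in pairs))
    return slope, my - slope * mx

# Simulated AR(1) data: x[t] = 0.7*x[t-1] + noise.
rng = random.Random(42)
x = [0.0]
for _ in range(2000):
    x.append(0.7 * x[-1] + rng.gauss(0.0, 1.0))

h = 3
b1, c1 = fit_lag_regression(x, 1)   # one-step model
rec = x[-1]
for _ in range(h):                  # recursive: iterate it h times
    rec = b1 * rec + c1
bh, ch = fit_lag_regression(x, h)   # direct: one regression straight to lag h
direct = bh * x[-1] + ch
```

    For a correctly specified linear model the two strategies agree in expectation (the direct slope estimates the cube of the one-step slope here); for nonlinear models they can differ substantially, which is what the comparison studies examine.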

  6. SAM Photovoltaic Model Technical Reference

    Energy Technology Data Exchange (ETDEWEB)

    Gilman, P. [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2015-05-27

    This manual describes the photovoltaic performance model in the System Advisor Model (SAM). The U.S. Department of Energy’s National Renewable Energy Laboratory maintains and distributes SAM, which is available as a free download from https://sam.nrel.gov. These descriptions are based on SAM 2015.1.30 (SSC 41).

  7. MODELLING OF ORDINAL TIME SERIES BY PROPORTIONAL ODDS MODEL

    Directory of Open Access Journals (Sweden)

    Serpil AKTAŞ ALTUNAY

    2013-06-01

    Full Text Available Categorical time series data with random, time-dependent covariates often arise when the study variable is measured on a categorical scale. Several models have been proposed in the literature for the analysis of categorical time series; for example, Markov chain models, integer autoregressive processes and discrete ARMA models can be utilized. In general, the choice of model depends on the measurement scale of the study variables: nominal, ordinal or interval. Regression theory, based on generalized linear models and partial likelihood inference, is a successful approach for categorical time series. One regression model for ordinal time series is the proportional odds model. In this study, the proportional odds model approach to ordinal categorical time series is investigated on a real air pollution data set, and the results are discussed.
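    The proportional odds model named above links the ordered categories through cumulative logits with a single linear predictor shared across categories. The sketch below computes the implied category probabilities; the cutpoints and the three-category "pollution index" interpretation are hypothetical, not taken from the study.

```python
import math

def ordinal_probs(eta, cutpoints):
    """Proportional odds model: P(Y <= j | x) = logistic(c_j - eta), where
    eta = x'beta is the linear predictor shared by all categories and the
    cutpoints c_1 < ... < c_{J-1} separate the J ordered categories."""
    logistic = lambda z: 1.0 / (1.0 + math.exp(-z))
    cum = [logistic(c - eta) for c in cutpoints] + [1.0]   # cumulative probs
    return [cum[0]] + [cum[j] - cum[j - 1] for j in range(1, len(cum))]

# Hypothetical 3-category index (low/medium/high) with cutpoints (-1, 1):
# a larger linear predictor shifts probability mass toward 'high'.
p_low_eta = ordinal_probs(-2.0, [-1.0, 1.0])
p_high_eta = ordinal_probs(2.0, [-1.0, 1.0])
```

    The "proportional odds" name reflects that the odds ratio between any two covariate values is the same for every cumulative split of the categories.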

  8. Building Chaotic Model From Incomplete Time Series

    Science.gov (United States)

    Siek, Michael; Solomatine, Dimitri

    2010-05-01

    This paper presents a number of novel techniques for building a predictive chaotic model from incomplete time series. A predictive chaotic model is built by reconstructing the time-delayed phase space from an observed time series, and the prediction is made by a global model or by adaptive local models based on the dynamical neighbors found in the reconstructed phase space. In general, the building of any data-driven model depends on the completeness and quality of the data itself. However, complete data availability cannot always be guaranteed, since measurement or data transmission may intermittently fail. We propose two main families of solutions for dealing with incomplete time series: imputing and non-imputing methods. For imputing methods, we utilized interpolation methods (weighted sums of linear interpolations, Bayesian principal component analysis and cubic spline interpolation) and predictive models (neural network, kernel machine, chaotic model) for estimating the missing values. After imputing the missing values, phase space reconstruction and chaotic model prediction are executed as a standard procedure. For non-imputing methods, we reconstructed the time-delayed phase space from the observed time series with missing values. This reconstruction results in non-continuous trajectories; however, local model predictions can still be made from the other dynamical neighbors reconstructed from non-missing values. We implemented and tested these methods to construct a chaotic model for predicting storm surges at Hoek van Holland, the entrance of Rotterdam Port. The hourly surge time series is available for 1990-1996. For measuring the performance of the proposed methods, a synthetic time series with missing values, generated by applying a particular random variable to the original (complete) time series, is utilized. There are two main performance measures used in this work: (1) error measures between the actual
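    The phase-space reconstruction and local-model prediction that the paper builds on can be sketched as follows. The embedding dimension, delay and neighbour count are illustrative defaults, not the values used for the surge model.

```python
def embed(series, dim, delay):
    """Reconstruct the time-delayed phase space: each point is
    (x[t], x[t-delay], ..., x[t-(dim-1)*delay])."""
    start = (dim - 1) * delay
    return [tuple(series[t - i * delay] for i in range(dim))
            for t in range(start, len(series))]

def predict_next(series, dim=3, delay=1, k=3):
    """Adaptive local model: average the successors of the k nearest
    dynamical neighbours of the current state."""
    pts = embed(series, dim, delay)
    query = pts[-1]
    dist = lambda p: sum((a - b) ** 2 for a, b in zip(p, query))
    # candidate states must have a known successor, so exclude the last point
    idx = sorted(range(len(pts) - 1), key=lambda i: dist(pts[i]))[:k]
    offset = (dim - 1) * delay   # pts[i] describes series index i + offset
    return sum(series[i + offset + 1] for i in idx) / k

# Simplest sanity check: in a perfectly periodic series, every reconstructed
# neighbour of the current state shares the same successor.
nxt = predict_next([0, 1, 2, 3] * 50)
```

    The non-imputing variant described in the abstract would simply drop embedding vectors that contain missing values, leaving fewer but still usable dynamical neighbours.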

  9. LAGRANGIAN PARTICLE DISPERSION MODEL (LPDM) TECHNICAL DESCRIPTION

    International Nuclear Information System (INIS)

    Chen, K

    2006-01-01

    The Savannah River National Laboratory (SRNL) uses the Lagrangian Particle Dispersion Model (LPDM) in conjunction with the Regional Atmospheric Modeling System (RAMS) as an operational tool for emergency response consequence assessments for the Savannah River Site (SRS). The LPDM is an advanced stochastic atmospheric model used to transport and disperse passive tracers, subject to the meteorological field generated by RAMS, from sources of varying number and shape. The Atmospheric Technologies Group (ATG) of the SRNL is undertaking the task of reviewing documentation and code for LPDM Quality Assurance (QA). The LPDM QA task will include a model technical description, computer coding descriptions, model applications, and configuration control. This report provides a comprehensive technical description of the LPDM model.

  10. Series-produced Helium II Cryostats for the LHC Magnets Technical Choices, Industrialisation, Costs

    CERN Document Server

    Poncet, A

    2008-01-01

    Assembled in 8 continuous segments of approximately 2.7 km length each, the He II cryostats for the 1232 cryodipoles and 474 Short Straight Sections (SSS housing the quadrupoles) must fulfil tight technical requirements. They have been produced by industry in large series according to cost-effective industrial production methods to keep expenditure within the financial constraints of the project and assembled under contract at CERN. The specific technical requirements of the generic systems of the cryostat (vacuum, cryogenic, electrical distribution, magnet alignment) are briefly recalled, as well as the basic design choices leading to the definition of their components (vacuum vessels, thermal shielding, supporting systems). Early in the design process emphasis was placed on the feasibility of manufacturing techniques adequate for large series production of components, optimal tooling for time-effective assembly methods, and reliable quality assurance systems. An analytical review of the costs of the cryosta...

  11. Manhattan Project Technical Series The Chemistry of Uranium (I) Chapters 1-10

    Energy Technology Data Exchange (ETDEWEB)

    Rabinowitch, E. I. [Argonne National Laboratory (ANL), Argonne, IL (United States); Katz, J. J. [Argonne National Laboratory (ANL), Argonne, IL (United States)

    1946-09-30

    This constitutes Chapters 1 through 10, inclusive, of The Survey Volume on Uranium Chemistry prepared for the Manhattan Project Technical Series. Chapters are titled: Nuclear Properties of Uranium; Properties of the Uranium Atom; Uranium in Nature; Extraction of Uranium from Ores and Preparation of Uranium Metal; Physical Properties of Uranium Metal; Chemical Properties of Uranium Metal; Intermetallic Compounds and Alloy Systems of Uranium; The Uranium-Hydrogen System; Uranium Borides, Carbides, and Silicides; Uranium Nitrides, Phosphides, Arsenides, and Antimonides.

  12. Manhattan Project Technical Series The Chemistry of Uranium (I) Chapters 1-10

    International Nuclear Information System (INIS)

    Rabinowitch, E. I.; Katz, J. J.

    1946-01-01

    This constitutes Chapters 1 through 10, inclusive, of The Survey Volume on Uranium Chemistry prepared for the Manhattan Project Technical Series. Chapters are titled: Nuclear Properties of Uranium; Properties of the Uranium Atom; Uranium in Nature; Extraction of Uranium from Ores and Preparation of Uranium Metal; Physical Properties of Uranium Metal; Chemical Properties of Uranium Metal; Intermetallic Compounds and Alloy Systems of Uranium; The Uranium-Hydrogen System; Uranium Borides, Carbides, and Silicides; Uranium Nitrides, Phosphides, Arsenides, and Antimonides.

  13. GOES-R Ground Segment Technical Reference Model

    Science.gov (United States)

    Krause, R. G.; Burnett, M.; Khanna, R.

    2012-12-01

    NOAA's Geostationary Operational Environmental Satellite-R Series (GOES-R) Ground Segment Project (GSP) has developed a Technical Reference Model (TRM) to support the documentation of technologies that could form the basis for a set of requirements supporting the evolution towards a NESDIS enterprise ground system. The architecture and technologies in this TRM can be applied or extended to other ground systems for planning and development. The TRM maps GOES-R technologies to the Office of Management and Budget's (OMB) Federal Enterprise Architecture (FEA) Consolidated Reference Model (CRM) V 2.3 Technical Services Standard (TSS). The FEA TRM categories are the framework for the GOES-R TRM. This poster will present the GOES-R TRM.

  14. Hindlimb unloading rodent model: technical aspects

    Science.gov (United States)

    Morey-Holton, Emily R.; Globus, Ruth K.

    2002-01-01

    Since its inception at the National Aeronautics and Space Administration (NASA) Ames Research Center in the mid-1970s, many laboratories around the world have used the rat hindlimb unloading model to simulate weightlessness and to study various aspects of musculoskeletal loading. In this model, the hindlimbs of rodents are elevated to produce a 30 degrees head-down tilt, which results in a cephalad fluid shift and avoids weightbearing by the hindquarters. Although several reviews have described scientific results obtained with this model, this is the first review to focus on the technical aspects of hindlimb unloading. This review includes a history of the technique, a brief comparison with spaceflight data, technical details, extension of the model to mice, and other important technical considerations (e.g., housing, room temperature, unloading angle, the potential need for multiple control groups, age, body weight, the use of the forelimb tissues as internal controls, and when to remove animals from experiments). This paper is intended as a reference for researchers, reviewers of manuscripts, and institutional animal care and use committees. Over 800 references, related to the hindlimb unloading model, can be accessed via the electronic version of this article.

  15. FRAM Modelling Complex Socio-technical Systems

    CERN Document Server

    Hollnagel, Erik

    2012-01-01

    There has not yet been a comprehensive method that goes behind 'human error' and beyond the failure concept, and various complicated accidents have accentuated the need for one. The Functional Resonance Analysis Method (FRAM) fulfils that need. This book presents a detailed and tested method that can be used to model how complex and dynamic socio-technical systems work, and to understand both why things sometimes go wrong and why they normally succeed.

  16. From Taylor series to Taylor models

    International Nuclear Information System (INIS)

    Berz, Martin

    1997-01-01

    An overview of the background of Taylor series methods and the utilization of the differential algebraic structure is given, and various associated techniques are reviewed. The conventional Taylor methods are extended to allow for a rigorous treatment of bounds for the remainder of the expansion in a similarly universal way. Utilizing differential algebraic and functional analytic arguments on the set of Taylor models, arbitrary order integrators with rigorous remainder treatment are developed. The integrators can meet pre-specified accuracy requirements in a mathematically strict way, and are a stepping stone towards fully rigorous estimates of stability of repetitive systems.

  17. From Taylor series to Taylor models

    International Nuclear Information System (INIS)

    Berz, M.

    1997-01-01

    An overview of the background of Taylor series methods and the utilization of the differential algebraic structure is given, and various associated techniques are reviewed. The conventional Taylor methods are extended to allow for a rigorous treatment of bounds for the remainder of the expansion in a similarly universal way. Utilizing differential algebraic and functional analytic arguments on the set of Taylor models, arbitrary order integrators with rigorous remainder treatment are developed. The integrators can meet pre-specified accuracy requirements in a mathematically strict way, and are a stepping stone towards fully rigorous estimates of stability of repetitive systems. copyright 1997 American Institute of Physics
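    The two records above describe Taylor expansions carried along with a rigorous bound on the truncation remainder. A rough one-dimensional illustration of that idea (a sketch of the enclosure concept, not Berz's differential-algebraic implementation; function names and tolerances are our own) is:

```python
import math

def taylor_model_exp(x_lo, x_hi, order):
    """Enclose exp(x) over [x_lo, x_hi]: a Taylor polynomial at the interval
    midpoint plus a rigorous Lagrange remainder bound. Every derivative of
    exp is exp, so the remainder is bounded using exp(x_hi)."""
    c = 0.5 * (x_lo + x_hi)
    h = max(abs(x_lo - c), abs(x_hi - c))  # interval half-width
    coeffs = [math.exp(c) / math.factorial(k) for k in range(order + 1)]
    rem = math.exp(x_hi) * h ** (order + 1) / math.factorial(order + 1)
    return c, coeffs, rem

def enclosure(c, coeffs, rem, x):
    """Guaranteed lower/upper bounds on exp(x) for any x inside the interval."""
    p = sum(a * (x - c) ** k for k, a in enumerate(coeffs))
    return p - rem, p + rem

c, coeffs, rem = taylor_model_exp(0.0, 1.0, order=8)
lo, hi = enclosure(c, coeffs, rem, 0.3)  # rigorous bounds on exp(0.3)
```

    At order 8 the remainder bound is already below 1e-7 on this interval, illustrating why pre-specified accuracy requirements can be met by raising the order.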

  18. Time series modeling for syndromic surveillance

    Directory of Open Access Journals (Sweden)

    Mandl Kenneth D

    2003-01-01

    Abstract Background: Emergency department (ED) based syndromic surveillance systems identify abnormally high visit rates that may be an early signal of a bioterrorist attack. For example, an anthrax outbreak might first be detectable as an unusual increase in the number of patients reporting to the ED with respiratory symptoms. Reliably identifying these abnormal visit patterns requires a good understanding of the normal patterns of healthcare usage. Unfortunately, systematic methods for determining the expected number of ED visits on a particular day have not yet been well established. We present here a generalized methodology for developing models of expected ED visit rates. Methods: Using time-series methods, we developed robust models of ED utilization for the purpose of defining expected visit rates. The models were based on nearly a decade of historical data at a major metropolitan academic, tertiary care pediatric emergency department. The historical data were fit using trimmed-mean seasonal models, and additional models were fit with autoregressive integrated moving average (ARIMA) residuals to account for recent trends in the data. The detection capabilities of the model were tested with simulated outbreaks. Results: Models were built both for overall visits and for respiratory-related visits, classified according to the chief complaint recorded at the beginning of each visit. The mean absolute percentage error of the ARIMA models was 9.37% for overall visits and 27.54% for respiratory visits. A simple detection system based on the ARIMA model of overall visits was able to detect 7-day-long simulated outbreaks of 30 visits per day with 100% sensitivity and 97% specificity. Sensitivity decreased with smaller outbreak sizes, dropping to 94% for outbreaks of 20 visits per day, and 57% for 10 visits per day, all while maintaining a 97% benchmark specificity. Conclusions: Time series methods applied to historical ED utilization data are an important tool
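    The record above combines trimmed-mean seasonal baselines with ARIMA residuals. A heavily simplified sketch of the trimmed-mean part (a day-of-week baseline with a threshold alarm; the data, trim fraction and threshold are illustrative, not those of the study) is:

```python
def trimmed_mean(values, trim=0.25):
    """Mean after discarding the lowest and highest `trim` fraction,
    damping the influence of past outbreak days on the baseline."""
    vs = sorted(values)
    k = int(len(vs) * trim)
    core = vs[k:len(vs) - k] if k else vs
    return sum(core) / len(core)

def expected_visits(history, weekday):
    """Baseline ED visits for a weekday: trimmed mean of past same-weekday counts."""
    return trimmed_mean([v for d, v in history if d == weekday])

def flag_outbreak(history, weekday, observed, threshold=1.3):
    """Alarm when the observed count exceeds the seasonal baseline by `threshold`x."""
    return observed > threshold * expected_visits(history, weekday)

# Hypothetical history of (weekday, visit count); one anomalous Monday included.
history = [(0, 100), (0, 104), (0, 98), (0, 250),
           (6, 60), (6, 62), (6, 58), (6, 61)]
```

    With this history the Monday baseline is 102 visits (the anomalous 250 is trimmed away), so a day of 180 visits raises an alarm while 110 does not.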

  19. Fisher information framework for time series modeling

    Science.gov (United States)

    Venkatesan, R. C.; Plastino, A.

    2017-08-01

    A robust prediction model invoking the Takens embedding theorem, whose working hypothesis is obtained via an inference procedure based on the minimum Fisher information principle, is presented. The coefficients of the ansatz, central to the working hypothesis, satisfy a time independent Schrödinger-like equation in a vector setting. The inference of (i) the probability density function of the coefficients of the working hypothesis and (ii) the constraint-driven pseudo-inverse condition for the modeling phase of the prediction scheme is made, for the case of normal distributions, with the aid of the quantum mechanical virial theorem. The well-known reciprocity relations and the associated Legendre transform structure for the Fisher information measure (FIM, hereafter)-based model in a vector setting (with least square constraints) are self-consistently derived. These relations are demonstrated to yield an intriguing form of the FIM for the modeling phase, which defines the working hypothesis solely in terms of the observed data. Prediction cases employ time series obtained from (i) the Mackey-Glass delay-differential equation, (ii) an ECG signal from the MIT-Beth Israel Deaconess Hospital (MIT-BIH) cardiac arrhythmia database, and (iii) an ECG signal from the Creighton University ventricular tachyarrhythmia database. The ECG samples were obtained from the PhysioNet online repository. These examples demonstrate the efficiency of the prediction model. Numerical examples for exemplary cases are provided.

  20. Vector bilinear autoregressive time series model and its superiority ...

    African Journals Online (AJOL)

    In this research, a vector bilinear autoregressive time series model was proposed and used to model three revenue series (X1, X2, X3) . The “orders” of the three series were identified on the basis of the distribution of autocorrelation and partial autocorrelation functions and were used to construct the vector bilinear models.
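    The bilinear class mentioned above augments a linear autoregression with a product of past values and past shocks. A scalar sketch of such a series (a toy BL(1,0,1,1) simulation with illustrative coefficients, not the fitted revenue models of the paper) is:

```python
import random

def simulate_bilinear(n, a=0.5, b=0.3, seed=42):
    """Simulate a scalar bilinear series x_t = a*x_{t-1} + b*x_{t-1}*e_{t-1} + e_t.
    The cross term x_{t-1}*e_{t-1} is what a purely linear AR model cannot capture."""
    rng = random.Random(seed)
    x, e_prev, xs = 0.0, 0.0, []
    for _ in range(n):
        e = rng.gauss(0.0, 1.0)
        x = a * x + b * x * e_prev + e
        xs.append(x)
        e_prev = e
    return xs

series = simulate_bilinear(500)
```

    With a = 0.5 and b = 0.3 the second-moment stability condition a² + b² < 1 holds, so the simulated path stays bounded.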

  1. Combination Welding Technical Terms. English-Thai Lexicon. Introduction to Combination Welding. Thai Version. Multi-Cultural Competency-Based Vocational/Technical Curricula Series.

    Science.gov (United States)

    Shin, Masako T.

    This English-Thai lexicon and program introduction for combination welding is one of eight documents in the Multicultural Competency-Based Vocational/Technical Curricula Series. It is intended for use in postsecondary, adult, and preservice teacher and administrator education. The first two sections provide Thai equivalencies of English…

  2. Automotive Mechanics Technical Terms. English-Thai Lexicon. Introduction to Automotive Mechanics. Thai Version. Multi-Cultural Competency-Based Vocational/Technical Curricula Series.

    Science.gov (United States)

    Shin, Masako T.

    This English-Thai lexicon and program introduction for automotive mechanics is one of eight documents in the Multicultural Competency-Based Vocational/Technical Curricula Series. It is intended for use in postsecondary, adult, and preservice teacher and administrator education. The first two sections provide Thai equivalencies of English…

  3. Maintenance Mechanics Technical Terms. English-Thai Lexicon. Introduction to Maintenance Mechanics. Thai Version. Multi-Cultural Competency-Based Vocational/Technical Curricula Series.

    Science.gov (United States)

    Shin, Masako T.

    This English-Thai lexicon and program introduction for maintenance mechanics is one of eight documents in the Multicultural Competency-Based Vocational/Technical Curricula Series. It is intended for use in postsecondary, adult, and preservice teacher and administrator education. The first two sections provide Thai equivalencies of English…

  4. Modeling Time Series Data for Supervised Learning

    Science.gov (United States)

    Baydogan, Mustafa Gokce

    2012-01-01

    Temporal data are increasingly prevalent and important in analytics. Time series (TS) data are chronological sequences of observations and an important class of temporal data. Fields such as medicine, finance, learning science and multimedia naturally generate TS data. Each series provides a high-dimensional data vector that challenges the learning…

  5. Stochastic modelling of regional archaeomagnetic series

    Science.gov (United States)

    Hellio, G.; Gillet, N.; Bouligand, C.; Jault, D.

    2014-11-01

    We report a new method to infer continuous time-series of the declination, inclination and intensity of the magnetic field from archaeomagnetic data. Adopting a Bayesian perspective, we need to specify a priori knowledge about the time evolution of the magnetic field. It consists in a time correlation function that we choose to be compatible with present knowledge about the geomagnetic time spectra. The results are presented as distributions of possible values for the declination, inclination or intensity. We find that the methodology can be adapted to account for the age uncertainties of archaeological artefacts and we use Markov chain Monte Carlo to explore the possible dates of observations. We apply the method to intensity data sets from Mari, Syria and to intensity and directional data sets from Paris, France. Our reconstructions display more rapid variations than previous studies and we find that the possible values of geomagnetic field elements are not necessarily normally distributed. Another output of the model is better age estimates of archaeological artefacts.
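    The record above samples artefact dates with Markov chain Monte Carlo. A toy version of that sampling step (random-walk Metropolis over a date, with a uniform age prior and a made-up reference intensity curve; every parameter and curve here is hypothetical) is:

```python
import math
import random

def log_post(age, obs_intensity, lo, hi, sd=2.0):
    """Log-posterior for an artefact date: uniform prior on [lo, hi] and a
    Gaussian intensity likelihood around a hypothetical reference curve."""
    if not lo <= age <= hi:
        return -math.inf
    predicted = 60.0 + 5.0 * math.sin(age / 100.0)  # toy reference curve
    return -((obs_intensity - predicted) ** 2) / (2.0 * sd * sd)

def metropolis_ages(obs_intensity, lo, hi, n=4000, step=20.0, seed=3):
    """Random-walk Metropolis over the artefact's possible dates."""
    rng = random.Random(seed)
    age = 0.5 * (lo + hi)
    cur = log_post(age, obs_intensity, lo, hi)
    samples = []
    for _ in range(n):
        prop = age + rng.gauss(0.0, step)
        lp = log_post(prop, obs_intensity, lo, hi)
        if rng.random() < math.exp(min(0.0, lp - cur)):
            age, cur = prop, lp
        samples.append(age)
    return samples

ages = metropolis_ages(obs_intensity=63.0, lo=-100.0, hi=300.0)
```

    The resulting sample cloud is the "distribution of possible dates"; as the record notes, such posteriors need not be normally distributed.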

  6. Technical Report Series on Global Modeling and Data Assimilation. Volume 42; Soil Moisture Active Passive (SMAP) Project Calibration and Validation for the L4_C Beta-Release Data Product

    Science.gov (United States)

    Koster, Randal D. (Editor); Kimball, John S.; Jones, Lucas A.; Glassy, Joseph; Stavros, E. Natasha; Madani, Nima (Editor); Reichle, Rolf H.; Jackson, Thomas; Colliander, Andreas

    2015-01-01

    During the post-launch Cal/Val Phase of SMAP there are two objectives for each science product team: 1) calibrate, verify, and improve the performance of the science algorithms, and 2) validate accuracies of the science data products as specified in the L1 science requirements according to the Cal/Val timeline. This report provides analysis and assessment of the SMAP Level 4 Carbon (L4_C) product specifically for the beta release. The beta-release version of the SMAP L4_C algorithms utilizes a terrestrial carbon flux model informed by SMAP soil moisture inputs along with optical remote sensing (e.g. MODIS) vegetation indices and other ancillary biophysical data to estimate global daily NEE and component carbon fluxes, particularly vegetation gross primary production (GPP) and ecosystem respiration (Reco). Other L4_C product elements include surface (<10 cm depth) soil organic carbon (SOC) stocks and associated environmental constraints to these processes, including soil moisture and landscape FT controls on GPP and Reco (Kimball et al. 2012). The L4_C product encapsulates SMAP carbon cycle science objectives by: 1) providing a direct link between terrestrial carbon fluxes and underlying freeze/thaw and soil moisture constraints to these processes, 2) documenting primary connections between terrestrial water, energy and carbon cycles, and 3) improving understanding of terrestrial carbon sink activity in northern ecosystems.

  7. TIME SERIES ANALYSIS USING A UNIQUE MODEL OF TRANSFORMATION

    Directory of Open Access Journals (Sweden)

    Goran Klepac

    2007-12-01

    REFII model is an original mathematical model for time series data mining. Its main purpose is to automate time series analysis through a unique transformation model of time series. An advantage of this approach is that it links different methods for time series analysis, connects traditional data mining tools to time series, and supports the construction of new algorithms for analyzing time series. It is worth mentioning that the REFII model is not a closed system with a finite set of methods. First and foremost, it is a model for transforming time series values, preparing data for different sets of methods based on the same transformation model in the domain of the problem space. The REFII model offers a new approach to time series analysis based on a unique transformation model, which serves as a basis for all kinds of time series analysis. Its advantage is its possible application in many different areas such as finance, medicine, voice recognition, face recognition and text mining.

  8. vector bilinear autoregressive time series model and its superiority ...

    African Journals Online (AJOL)

    In this research, a vector bilinear autoregressive time series model was proposed and used to model three revenue series (X1, X2, X3). ... showed that vector bilinear autoregressive (BIVAR) models provide better estimates than the long-embraced linear models. ... order moving average (MA) polynomials on the backward shift operator B ...

  9. Hybrid Corporate Performance Prediction Model Considering Technical Capability

    Directory of Open Access Journals (Sweden)

    Joonhyuck Lee

    2016-07-01

    Many studies have tried to predict corporate performance and stock prices to enhance investment profitability using qualitative approaches such as the Delphi method. However, developments in data processing technology and machine-learning algorithms have resulted in efforts to develop quantitative prediction models in various managerial subject areas. We propose a quantitative corporate performance prediction model that applies the support vector regression (SVR) algorithm, which addresses the problem of overfitting the training data and can be applied to regression problems. The proposed model optimizes the SVR training parameters based on the training data, using a genetic algorithm to achieve sustainable predictability in changeable markets and managerial environments. Technology-intensive companies represent an increasing share of the total economy. The performance and stock prices of these companies are affected by their financial standing and their technological capabilities. Therefore, we apply both financial indicators and technical indicators to establish the proposed prediction model. Here, we use time series data, including financial, patent, and corporate performance information of 44 electronic and IT companies. Then, we predict the performance of these companies as an empirical verification of the prediction performance of the proposed model.
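    The record above tunes SVR hyperparameters with a genetic algorithm. A much-reduced sketch of that tuning loop (a tiny GA over a single regularisation parameter, using a one-dimensional ridge regression as a stand-in for SVR; all data and GA settings are illustrative) is:

```python
import random

def fit_ridge(xs, ys, lam):
    """1-D ridge regression through the origin: w = sum(x*y) / (sum(x*x) + lam)."""
    return sum(x * y for x, y in zip(xs, ys)) / (sum(x * x for x in xs) + lam)

def val_error(lam, train, val):
    """Mean squared error of the lam-regularised fit on held-out data."""
    w = fit_ridge(train[0], train[1], lam)
    xs, ys = val
    return sum((y - w * x) ** 2 for x, y in zip(xs, ys)) / len(xs)

def ga_tune(train, val, pop_size=20, gens=30, seed=5):
    """Tiny genetic algorithm over the regularisation parameter:
    truncation selection of the best half plus Gaussian mutation."""
    rng = random.Random(seed)
    pop = [rng.uniform(0.0, 10.0) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda lam: val_error(lam, train, val))
        elite = pop[:pop_size // 2]
        pop = elite + [max(0.0, rng.choice(elite) + rng.gauss(0.0, 0.5))
                       for _ in range(pop_size - len(elite))]
    return min(pop, key=lambda lam: val_error(lam, train, val))

# Synthetic data: y = 2x plus noise (purely illustrative).
data_rng = random.Random(0)
xs = [i / 10.0 for i in range(40)]
ys = [2.0 * x + data_rng.gauss(0.0, 0.1) for x in xs]
train, val = (xs[:30], ys[:30]), (xs[30:], ys[30:])
best_lam = ga_tune(train, val)
```

    The same select-mutate-reevaluate loop carries over when the fitness function wraps a full SVR fit instead of this toy ridge estimator.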

  10. Hidden Markov Models for Time Series An Introduction Using R

    CERN Document Server

    Zucchini, Walter

    2009-01-01

    Illustrates the flexibility of HMMs as general-purpose models for time series data. This work presents an overview of HMMs for analyzing time series data, from continuous-valued, circular, and multivariate series to binary data, bounded and unbounded counts and categorical observations.
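    The likelihood computation at the heart of the HMMs described above can be sketched with the standard forward algorithm (the two-state model parameters below are purely illustrative):

```python
def forward(obs, pi, A, B):
    """Forward algorithm: total likelihood of an observation sequence under a
    discrete hidden Markov model. pi: initial state probabilities,
    A[i][j]: transition probabilities, B[i][k]: emission probabilities."""
    n = len(pi)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    for o in obs[1:]:
        alpha = [B[j][o] * sum(alpha[i] * A[i][j] for i in range(n))
                 for j in range(n)]
    return sum(alpha)

# A toy two-state, two-symbol HMM (parameters are illustrative only).
pi = [0.6, 0.4]
A = [[0.7, 0.3], [0.4, 0.6]]
B = [[0.9, 0.1], [0.2, 0.8]]
p = forward([0, 1, 0], pi, A, B)  # p is about 0.1089
```

    Summing over hidden state paths in this recursive fashion is what lets HMMs handle the circular, multivariate and count-valued series the book covers.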

  11. Tracing the History of Technical Communication from 1850-2000: Plus a Series of Survey Studies.

    Science.gov (United States)

    McDowell, Earl E.

    This research focuses on the history of technical communication since 1850, with a specific focus on the technological changes that occurred between 1900 and 1950. This paper also discusses the development of professional technical communication organizations and the development of technical communication programs at the bachelor, masters, and…

  12. Fourier Series, the DFT and Shape Modelling

    DEFF Research Database (Denmark)

    Skoglund, Karl

    2004-01-01

    This report provides an introduction to Fourier series, the discrete Fourier transform, complex geometry and Fourier descriptors for shape analysis. The content is aimed at undergraduate and graduate students who wish to learn about Fourier analysis in general, as well as its application to shape...
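    The Fourier descriptors introduced in the report above can be sketched directly from the DFT of a complex-valued boundary (the square outline and the normalisation choices below are our own illustrative example):

```python
import cmath

def fourier_descriptors(boundary, n_keep):
    """DFT of a closed boundary given as complex points; keep the magnitudes of
    the first n_keep non-DC coefficients. Dropping the DC term gives
    translation invariance; dividing by |c1| gives scale invariance."""
    n = len(boundary)
    coeffs = [sum(z * cmath.exp(-2j * cmath.pi * k * t / n)
                  for t, z in enumerate(boundary)) / n
              for k in range(n)]
    scale = abs(coeffs[1]) or 1.0
    return [abs(coeffs[k]) / scale for k in range(1, n_keep + 1)]

# A square outline sampled at 8 points (hypothetical toy shape).
square = [1+1j, 0+1j, -1+1j, -1+0j, -1-1j, 0-1j, 1-1j, 1+0j]
desc = fourier_descriptors(square, 3)
```

    Because only the DC coefficient changes under translation, the descriptors of a shifted copy of the shape match the original to floating-point precision.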

  13. Modelling VLSI circuits using Taylor series

    Science.gov (United States)

    Kocina, Filip; Nečasová, Gabriela; Veigend, Petr; Chaloupka, Jan; Šátek, Václav; Kunovský, Jiří

    2017-07-01

    The paper introduces the capacitor substitution for CMOS logic gates, i.e. NANDs, NORs and inverters. It reveals the necessity of a very accurate and fast method for solving this problem. Therefore the Modern Taylor Series Method (MTSM) is used which provides an automatic choice of a higher order during the computation and a larger integration step size while keeping desired accuracy.
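    The automatic order selection mentioned above can be illustrated on the simplest circuit-like equation, y' = λy (e.g. a discharging RC node). This is a loose sketch of the MTSM idea, not the authors' implementation; step size and tolerance are illustrative:

```python
import math

def taylor_step(y, lam, h, tol=1e-12, max_order=60):
    """One step of y' = lam*y by Taylor series; the order is chosen
    automatically by adding terms until they fall below `tol`, mirroring
    the adaptive-order idea of the Modern Taylor Series Method."""
    term, total = y, y
    for k in range(1, max_order):
        term *= lam * h / k  # k-th Taylor term from the (k-1)-th
        total += term
        if abs(term) < tol:
            break
    return total

def integrate(y0, lam, t_end, h):
    """March y' = lam*y from t=0 to t_end in steps of size h."""
    y, t = y0, 0.0
    while t < t_end - 1e-12:
        y = taylor_step(y, lam, h)
        t += h
    return y

# Discharge with R*C = 1, i.e. y' = -y; the exact solution is exp(-t).
y1 = integrate(1.0, lam=-1.0, t_end=1.0, h=0.25)
```

    Because terms are added until they drop below the tolerance, the step size can stay large while the result matches exp(-1) to roughly the requested accuracy.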

  14. Multiple Time Series Ising Model for Financial Market Simulations

    International Nuclear Information System (INIS)

    Takaishi, Tetsuya

    2015-01-01

    In this paper we propose an Ising model which simulates multiple financial time series. Our model introduces an interaction which couples to the spins of other systems. Simulations from our model show that the time series exhibit the volatility clustering that is often observed in real financial markets. Furthermore, we also find non-zero cross correlations between the volatilities from our model. Thus our model can simulate stock markets where the volatilities of stocks are mutually correlated.
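    The cross-system coupling described above can be sketched with two one-dimensional Ising chains whose spins also feel the other chain's magnetisation (a loose illustration of the idea; the lattice, couplings and update scheme are our own, not the paper's):

```python
import math
import random

def simulate_coupled_ising(n_spins=100, n_steps=200, J=1.0, g=0.5,
                           beta=1.0, seed=1):
    """Two Ising chains; each spin feels its chain neighbours (J) plus the
    other chain's magnetisation (g). Returns both magnetisation series."""
    rng = random.Random(seed)
    spins = [[rng.choice((-1, 1)) for _ in range(n_spins)] for _ in range(2)]
    mags = [[], []]
    for _ in range(n_steps):
        for sys in (0, 1):
            other_m = sum(spins[1 - sys]) / n_spins
            for i in range(n_spins):
                neighbours = spins[sys][i - 1] + spins[sys][(i + 1) % n_spins]
                field = J * neighbours + g * other_m
                d_energy = 2.0 * spins[sys][i] * field
                # Metropolis acceptance for a single spin flip
                if d_energy <= 0 or rng.random() < math.exp(-beta * d_energy):
                    spins[sys][i] *= -1
            mags[sys].append(sum(spins[sys]) / n_spins)
    return mags

m1, m2 = simulate_coupled_ising()
```

    In the paper's spirit, returns are derived from such magnetisation series; the g term is what induces correlation between the two markets' volatilities.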

  15. Modeling and Analysing Socio-Technical Systems

    NARCIS (Netherlands)

    Aslanyan, Zaruhi; Ivanova, Marieta G.; Nielson, Flemming; Probst, Christian W.

    2015-01-01

    Modern organisations are complex, socio-technical systems consisting of a mixture of physical infrastructure, human actors, policies and processes. An in- creasing number of attacks on these organisations exploits vulnerabilities on all different levels, for example combining a malware attack with

  16. Improved technical efficiency and exogenous factors in transportation demand for energy: An application of structural time series analysis to South Korean data

    International Nuclear Information System (INIS)

    Sa'ad, Suleiman

    2010-01-01

    This paper stresses the importance of incorporating the effects of improved technical efficiency and exogenous factors when estimating energy demand functions. Using annual time series data for the period 1973-2007 in the STSM (structural time series model) developed by Harvey et al., the paper estimates price and income elasticities of demand for energy, as well as the annual growth of the stochastic trend at the end of the estimation period. The results of the study reveal a long-run income elasticity of 1.37 and a price elasticity of -0.19. In addition, the underlying trend is generally stochastic and negatively sloping during the greater part of the estimation period. Finally, the estimated result from the structural time series model is compared with the results from Johansen cointegration. These results suggest that income is the dominant factor in energy consumption. In addition, the coefficient of the linear trend is negative, supporting the results from the STSM.

  17. forecasting with nonlinear time series model: a monte-carlo

    African Journals Online (AJOL)


    ...generated recursively up to any step greater than one. For a nonlinear time series model, a point forecast for step one can be made easily, as in the linear case, but a forecast for a step greater than or equal to ... London. Franses, P. H. (1998). Time Series Models for Business and Economic Forecasting, Cambridge University Press.
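    The multi-step difficulty mentioned in the snippet above arises because, for a nonlinear model, future shocks cannot simply be set to zero; a standard remedy is Monte Carlo simulation of future paths. A sketch with a made-up regime-switching map (all model details are hypothetical, not the paper's):

```python
import random

def setar_map(x, e):
    """Hypothetical SETAR-type nonlinear map: regime-dependent AR coefficient."""
    return (0.7 * x if x >= 0 else -0.3 * x) + e

def forecast_mc(x_last, steps, n_paths=5000, sd=1.0, seed=0):
    """Multi-step point forecast for a nonlinear model by Monte Carlo:
    simulate many future paths and average their endpoints."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_paths):
        x = x_last
        for _ in range(steps):
            x = setar_map(x, rng.gauss(0.0, sd))
        total += x
    return total / n_paths

f2 = forecast_mc(1.0, steps=2)  # two-step-ahead point forecast
```

    For step one the same forecast reduces to setar_map(x_last, 0), exactly as in the linear case; the simulation is only needed from step two onward.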

  18. Estimation of pure autoregressive vector models for revenue series ...

    African Journals Online (AJOL)

    This paper aims at applying multivariate approach to Box and Jenkins univariate time series modeling to three vector series. General Autoregressive Vector Models with time varying coefficients are estimated. The first vector is a response vector, while others are predictor vectors. By matrix expansion each vector, whether ...

  19. Technical Communicator: A New Model for the Electronic Resources Librarian?

    Science.gov (United States)

    Hulseberg, Anna

    2016-01-01

    This article explores whether technical communicator is a useful model for electronic resources (ER) librarians. The fields of ER librarianship and technical communication (TC) originated and continue to develop in relation to evolving technologies. A review of the literature reveals four common themes for ER librarianship and TC. While the…

  20. Wireless Emergency Alerts: Trust Model Technical Report

    Science.gov (United States)

    2014-02-01

    responses about big events; received comments on messaging around Sandy and the Derecho (these comments were more about technical glitches with phones) ... We typically do an after-action program on any exercise or big event for which we stand up the emergency operations center (EOC); the Derecho was ... county employee messaging; the OPA may help us craft messages; we are always looking to simplify language and ease understanding; Derecho messaging was

  1. Time Series Modeling for Structural Response Prediction

    Science.gov (United States)

    1988-11-14

    Front matter excerpt (table of contents, notation, and table captions): 3DOF simulated data; experimental data; simulated data; MPEM estimates for MDOF data with closely spaced modes. Notation: 2DOF, two-degree-of-freedom; 2LS, two-stage least squares method; 3DOF, three-degree-of-freedom. Table 5: 3DOF simulated data.

  2. Time series modelling of overflow structures

    DEFF Research Database (Denmark)

    Carstensen, J.; Harremoës, P.

    1997-01-01

    The dynamics of a storage pipe is examined using a grey-box model based on on-line measured data. The grey-box modelling approach uses a combination of physically-based and empirical terms in the model formulation. The model provides an on-line state estimate of the overflows, pumping capacities ... to the overflow structures. The capacity of a pump draining the storage pipe has been estimated for two rain events, revealing that the pump was malfunctioning during the first rain event. The grey-box modelling approach is applicable for automated on-line surveillance and control. (C) 1997 IAWQ. Published ...

  3. Modelling Social-Technical Attacks with Timed Automata

    DEFF Research Database (Denmark)

    David, Nicolas; David, Alexandre; Hansen, Rene Rydhof

    2015-01-01

    Attacks on a system often exploit vulnerabilities that arise from human behaviour or other human activity. Attacks of this type, so-called socio-technical attacks, cover everything from social engineering to insider attacks, and they can have a devastating impact on an unprepared organisation. In this paper we develop an approach towards modelling socio-technical systems in general and socio-technical attacks in particular, using timed automata, and illustrate its application with a complex case study. Thanks to automated model checking and automata theory, we can automatically generate possible attacks ...

  4. Time series sightability modeling of animal populations.

    Directory of Open Access Journals (Sweden)

    Althea A ArchMiller

    Logistic regression models, or "sightability models", fit to detection/non-detection data from marked individuals are often used to adjust for visibility bias in later detection-only surveys, with population abundance estimated using a modified Horvitz-Thompson (mHT) estimator. More recently, a model-based alternative for analyzing combined detection/non-detection and detection-only data was developed. This approach seemed promising, since it resulted in similar estimates as the mHT when applied to data from moose (Alces alces) surveys in Minnesota. More importantly, it provided a framework for developing flexible models for analyzing multiyear detection-only survey data in combination with detection/non-detection data. During initial attempts to extend the model-based approach to multiple years of detection-only data, we found that estimates of detection probabilities and population abundance were sensitive to the amount of detection-only data included in the combined (detection/non-detection and detection-only) analysis. Subsequently, we developed a robust hierarchical modeling approach where sightability model parameters are informed only by the detection/non-detection data, and we used this approach to fit a fixed-effects model (FE model) with year-specific parameters and a temporally-smoothed model (TS model) that shares information across years via random effects and a temporal spline. The abundance estimates from the TS model were more precise, with decreased interannual variability relative to the FE model and mHT abundance estimates, illustrating the potential benefits from model-based approaches that allow information to be shared across years.

  5. SEASONAL AUTOREGRESSIVE INTEGRATED MOVING AVERAGE MODEL FOR PRECIPITATION TIME SERIES

    OpenAIRE

    Yan Wang; Meng Gao; Xinghua Chang; Xiyong Hou

    2012-01-01

    Predicting the trend of precipitation is a difficult task in meteorology and environmental sciences. Statistical approaches from time series analysis provide an alternative way for precipitation prediction. The ARIMA model incorporating seasonal characteristics, referred to as the seasonal ARIMA (SARIMA) model, is presented. The time series data are the monthly precipitation data in Yantai, China for the period from 1961 to 2011. The model was denoted as SARIMA (1, 0, 1) (0, 1, 1)12 in this stu...
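    The seasonal part of the SARIMA model above rests on the seasonal difference (1 - B^12). A minimal sketch of that operation, plus a Yule-Walker estimate of the non-seasonal AR(1) coefficient (the toy data are our own, not the Yantai record):

```python
def seasonal_diff(series, s=12):
    """Apply the seasonal difference (1 - B^s), removing a period-s pattern."""
    return [series[i] - series[i - s] for i in range(s, len(series))]

def ar1_coeff(xs):
    """Yule-Walker AR(1) estimate: the lag-1 autocorrelation of the series."""
    m = sum(xs) / len(xs)
    num = sum((xs[i] - m) * (xs[i + 1] - m) for i in range(len(xs) - 1))
    den = sum((x - m) ** 2 for x in xs)
    return num / den

# A purely periodic toy "monthly" series differences to exactly zero.
periodic = [(i % 12) * 10.0 for i in range(48)]
diffed = seasonal_diff(periodic)
```

    A strictly periodic series differences to all zeros, which is exactly why the D = 1 seasonal difference in SARIMA (1, 0, 1)(0, 1, 1)12 removes the annual cycle before the ARMA terms are fitted.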

  6. Time series sightability modeling of animal populations

    Science.gov (United States)

    ArchMiller, Althea A.; Dorazio, Robert; St. Clair, Katherine; Fieberg, John R.

    2018-01-01

    Logistic regression models—or “sightability models”—fit to detection/non-detection data from marked individuals are often used to adjust for visibility bias in later detection-only surveys, with population abundance estimated using a modified Horvitz-Thompson (mHT) estimator. More recently, a model-based alternative for analyzing combined detection/non-detection and detection-only data was developed. This approach seemed promising, since it resulted in similar estimates as the mHT when applied to data from moose (Alces alces) surveys in Minnesota. More importantly, it provided a framework for developing flexible models for analyzing multiyear detection-only survey data in combination with detection/non-detection data. During initial attempts to extend the model-based approach to multiple years of detection-only data, we found that estimates of detection probabilities and population abundance were sensitive to the amount of detection-only data included in the combined (detection/non-detection and detection-only) analysis. Subsequently, we developed a robust hierarchical modeling approach where sightability model parameters are informed only by the detection/non-detection data, and we used this approach to fit a fixed-effects model (FE model) with year-specific parameters and a temporally-smoothed model (TS model) that shares information across years via random effects and a temporal spline. The abundance estimates from the TS model were more precise, with decreased interannual variability relative to the FE model and mHT abundance estimates, illustrating the potential benefits from model-based approaches that allow information to be shared across years.
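    The modified Horvitz-Thompson step used in both sightability records above can be sketched directly: each detected group is inflated by its modeled detection probability. The logistic coefficients and data below are hypothetical, not fitted to the moose surveys:

```python
import math

def detection_prob(cover, beta0=0.5, beta1=-1.2):
    """Sightability model: logistic regression on a visual-obstruction covariate.
    The coefficients are hypothetical, not fitted to the moose survey data."""
    return 1.0 / (1.0 + math.exp(-(beta0 + beta1 * cover)))

def mht_abundance(groups):
    """Modified Horvitz-Thompson estimate: each detected group of size n with
    detection probability p contributes n / p to estimated abundance."""
    return sum(n / detection_prob(c) for n, c in groups)

# Detected groups as (group size, vegetation-cover covariate), illustrative.
groups = [(3, 0.2), (1, 0.8), (5, 0.1)]
est = mht_abundance(groups)
```

    Since every detection probability is below one, the estimate always exceeds the raw count of animals seen, which is the visibility-bias correction.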

  7. Comparison of annual maximum series and partial duration series methods for modeling extreme hydrologic events

    DEFF Research Database (Denmark)

    Madsen, Henrik; Rasmussen, Peter F.; Rosbjerg, Dan

    1997-01-01

    Two different models for analyzing extreme hydrologic events, based on, respectively, partial duration series (PDS) and annual maximum series (AMS), are compared. The PDS model assumes a generalized Pareto distribution for modeling threshold exceedances corresponding to a generalized extreme value...... distribution for annual maxima. The performance of the two models in terms of the uncertainty of the T-year event estimator is evaluated in the cases of estimation with, respectively, the maximum likelihood (ML) method, the method of moments (MOM), and the method of probability weighted moments (PWM...... of the considered methods reveals that in general, one should use the PDS model with MOM estimation for negative shape parameters, the PDS model with exponentially distributed exceedances if the shape parameter is close to zero, the AMS model with MOM estimation for moderately positive shape parameters, and the PDS...
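
    The PDS estimation recipe described above (generalized Pareto exceedances fitted by the method of moments, then a T-year event quantile) can be sketched on synthetic data; the threshold choice, GP parameterization, and quantile formula below are standard conventions, not values taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "daily flow" record; values above a high threshold form the PDS
flows = rng.gamma(shape=2.0, scale=30.0, size=10_000)
threshold = float(np.quantile(flows, 0.95))
exc = flows[flows > threshold] - threshold

# Method-of-moments (MOM) estimators for the generalized Pareto distribution
m, v = exc.mean(), exc.var(ddof=1)
xi = 0.5 * (1.0 - m ** 2 / v)          # shape
sigma = 0.5 * m * (m ** 2 / v + 1.0)   # scale

# T-year event: quantile of the exceedance distribution, with lambda the
# mean number of exceedances per year (treating each sample as one day)
lam = exc.size / (flows.size / 365.25)
T = 100
p = 1.0 - 1.0 / (lam * T)
x_T = threshold + sigma / xi * ((1.0 - p) ** (-xi) - 1.0)
print(round(float(xi), 2), round(float(x_T), 1))
```

    The paper's comparison concerns which estimator (ML, MOM, PWM) and which model (PDS or AMS) minimize the uncertainty of x_T for a given shape parameter; the sketch shows only the MOM/PDS mechanics.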

  8. Time domain series system definition and gear set reliability modeling

    International Nuclear Information System (INIS)

    Xie, Liyang; Wu, Ningxiang; Qian, Wenxue

    2016-01-01

    Time-dependent multi-configuration is a typical feature for mechanical systems such as gear trains and chain drives. As a series system, a gear train is distinct from a traditional series system, such as a chain, in load transmission path, system-component relationship, system functioning manner, as well as time-dependent system configuration. Firstly, the present paper defines time-domain series system to which the traditional series system reliability model is not adequate. Then, system specific reliability modeling technique is proposed for gear sets, including component (tooth) and subsystem (tooth-pair) load history description, material priori/posterior strength expression, time-dependent and system specific load-strength interference analysis, as well as statistically dependent failure events treatment. Consequently, several system reliability models are developed for gear sets with different tooth numbers in the scenario of tooth root material ultimate tensile strength failure. The application of the models is discussed in the last part, and the differences between the system specific reliability model and the traditional series system reliability model are illustrated by virtue of several numerical examples. - Highlights: • A new type of series system, i.e. time-domain multi-configuration series system is defined, that is of great significance to reliability modeling. • Multi-level statistical analysis based reliability modeling method is presented for gear transmission system. • Several system specific reliability models are established for gear set reliability estimation. • The differences between the traditional series system reliability model and the new model are illustrated.

  9. Long Memory Models to Generate Synthetic Hydrological Series

    Directory of Open Access Journals (Sweden)

    Guilherme Armando de Almeida Pereira

    2014-01-01

    In Brazil, much of the energy production comes from hydroelectric plants whose planning is not trivial due to the strong dependence on rainfall regimes. This planning is accomplished through optimization models that use inputs such as synthetic hydrologic series generated from the statistical model PAR(p) (periodic autoregressive). Recently, Brazil began the search for alternative models able to capture the effects that the traditional PAR(p) model does not incorporate, such as long memory effects. Long memory in a time series can be defined as a significant dependence between lags separated by a long period of time. Thus, this research develops a study of the effects of long dependence in the series of streamflow natural energy in the South subsystem, in order to estimate a long memory model capable of generating synthetic hydrologic series.

  10. Tempered fractional time series model for turbulence in geophysical flows

    Science.gov (United States)

    Meerschaert, Mark M.; Sabzikar, Farzad; Phanikumar, Mantha S.; Zeleke, Aklilu

    2014-09-01

    We propose a new time series model for velocity data in turbulent flows. The new model employs tempered fractional calculus to extend the classical 5/3 spectral model of Kolmogorov. Application to wind speed and water velocity in a large lake are presented, to demonstrate the practical utility of the model.

  11. Tempered fractional time series model for turbulence in geophysical flows

    International Nuclear Information System (INIS)

    Meerschaert, Mark M; Sabzikar, Farzad; Phanikumar, Mantha S; Zeleke, Aklilu

    2014-01-01

    We propose a new time series model for velocity data in turbulent flows. The new model employs tempered fractional calculus to extend the classical 5/3 spectral model of Kolmogorov. Application to wind speed and water velocity in a large lake are presented, to demonstrate the practical utility of the model. (paper)

  12. Forecasting with nonlinear time series model: A Monte-Carlo ...

    African Journals Online (AJOL)

    PUBLICATIONS1

    with nonlinear time series model by comparing the RMSE with the traditional bootstrap and Monte-Carlo method of forecasting. We use the logistic smooth transition autoregressive (LSTAR) model as a case study. We first consider a linear model called the AR(p) model of order p which satisfies the following linear ...

  13. Comparison of annual maximum series and partial duration series methods for modeling extreme hydrologic events

    DEFF Research Database (Denmark)

    Madsen, Henrik; Pearson, Charles P.; Rosbjerg, Dan

    1997-01-01

    Two regional estimation schemes, based on, respectively, partial duration series (PDS) and annual maximum series (AMS), are compared. The PDS model assumes a generalized Pareto (GP) distribution for modeling threshold exceedances corresponding to a generalized extreme value (GEV) distribution......-way grouping based on annual average rainfall is sufficient to attain homogeneity for PDS, whereas a further partitioning is necessary for AMS. In determination of the regional parent distribution using L-moment ratio diagrams, PDS data, in contrast to AMS data, provide an unambiguous interpretation......, supporting a GP distribution....

  14. World Magnetic Model 2015 Technical Report

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The World Magnetic Model is the standard model used by the U.S. Department of Defense, the U.K. Ministry of Defence, the North Atlantic Treaty Organization (NATO)...

  15. With string model to time series forecasting

    Science.gov (United States)

    Pinčák, Richard; Bartoš, Erik

    2015-10-01

    The overwhelming majority of econometric models applied on a long-term basis in the financial forex market do not work sufficiently well. The reason is that transaction costs and arbitrage opportunities are not included, so the models do not simulate the real financial markets. Analyses are conducted not on the non-equidistant data but rather on aggregate data, which is also not a real financial case. In this paper, we would like to show a new way to analyze and, moreover, forecast the financial market. We utilize the projections of the real exchange rate dynamics onto the string-like topology in the OANDA market. The latter approach allows us to build stable prediction models for trading in the financial forex market. A real application of the multi-string structures is provided to demonstrate our ideas for the solution of the problem of robust portfolio selection. A comparison with trend-following strategies was performed, and the stability of the algorithm with respect to transaction costs over long trade periods was confirmed.

  16. Technical Report on Alberta Essay Scales: Models.

    Science.gov (United States)

    Nyberg, Verner R.; Nyberg, Adell M.

    The supplementary information on "Alberta Essay Scales: Models" presented here includes similar models to employ in grading essays, the background and development of the scales, and the rationale for developing two scales of English mechanics and style/content. A standard is presented for evaluating current writing achievement by…

  17. SAM Photovoltaic Model Technical Reference 2016 Update

    Energy Technology Data Exchange (ETDEWEB)

    Gilman, Paul [National Renewable Energy Laboratory (NREL), Golden, CO (United States); DiOrio, Nicholas A [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Freeman, Janine M [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Janzou, Steven [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Dobos, Aron [No longer NREL employee; Ryberg, David [No longer NREL employee

    2018-03-19

    This manual describes the photovoltaic performance model in the System Advisor Model (SAM) software, Version 2016.3.14 Revision 4 (SSC Version 160). It is an update to the 2015 edition of the manual, which describes the photovoltaic model in SAM 2015.1.30 (SSC 41). This new edition includes corrections of errors in the 2015 edition and descriptions of new features introduced in SAM 2016.3.14, including: a 3D shade calculator; a battery storage model; DC power optimizer loss inputs; a snow loss model; a plane-of-array irradiance input option from the weather file; support for sub-hourly simulations; self-shading that works with all four subarrays and uses the same algorithm for fixed arrays and one-axis tracking; a linear self-shading algorithm for thin-film modules; and loss percentages that replace derate factors. The photovoltaic performance model is one of the modules in the SAM Simulation Core (SSC), which is part of both SAM and the SAM SDK. SAM is a user-friendly desktop application for analysis of renewable energy projects. The SAM SDK (Software Development Kit) is for developers writing their own renewable energy analysis software based on SSC. This manual is written for users of both SAM and the SAM SDK wanting to learn more about the details of SAM's photovoltaic model.

  18. Technically Speaking: Columns from the Monthly Magazine, "The Source," 1987-88. Trace Reprint Series.

    Science.gov (United States)

    Borden, Peter A.; And Others

    The "Technically Speaking" columns from several isues of "The Source" magazine are reprinted. The columns were written by Gregg Vanderheiden, Peter Borden, Roger Smith, Jane Berliss, and Charles Lee. Titles of the columns included are: "Technological Advances: A Boon or a Barrier to Persons with Disabilities?";…

  19. Glossary of Mongolian Technical Terms. Program in Oriental Languages. Publications Series B--Aids--Number 13.

    Science.gov (United States)

    Buck, Frederick H.

    This glossary of Mongolian technical terms includes approximately 4,500 entries, covering such areas as political administration, economics, science, railways, stockfarming, agriculture, medicine, foreign affairs, military matters and miscellaneous items. A number of colloquial expressions are included, since they occur quite frequently and appear…

  20. Technical report on comparative analysis of ASME QA requirements and ISO series

    International Nuclear Information System (INIS)

    Kim, Kwan Hyun

    2000-06-01

    This technical report describes the differences between the ASME and ISO QA requirements in the nuclear field. It applies to the quality assurance (QA) programmes for design under the two sets of requirements. The organization having overall responsibility for nuclear design, preservation, and fabrication is described in this report for each stage of the design project.

  1. Technical performance of percutaneous and laminectomy leads analyzed by modeling

    NARCIS (Netherlands)

    Manola, L.; Holsheimer, J.

    2004-01-01

    The objective of this study was to compare the technical performance of laminectomy and percutaneous spinal cord stimulation leads with similar contact spacing by computer modeling. Monopolar and tripolar (guarded cathode) stimulation with both lead types in a low-thoracic spine model was simulated.

  2. Agent-based modelling of socio-technical systems

    CERN Document Server

    van Dam, Koen H; Lukszo, Zofia

    2012-01-01

    Here is a practical introduction to agent-based modelling of socio-technical systems, based on methodology developed at TU Delft, which has been deployed in a number of case studies. Offers theory, methods and practical steps for creating real-world models.

  3. Modeling the chemistries of technical molecular plasmas

    Science.gov (United States)

    Munro, James J.; Tennyson, Jonathan; Brown, Daniel B.; Varambhia, Hemal N.; Doss, Natasha

    2008-10-01

    Plasma chemistries, especially for molecular gases, are complicated. With a limited amount of molecular data available, it is hard to model these plasmas accurately; just a couple of feedstock gases can lead to a minimal model containing perhaps dozens of gas-phase species. The possible gas-phase and surface reactions could number in the tens of thousands; fewer than a hundred are typically used in chemistry models. Understanding the importance of various species and reactions to a chemical model is vital. Here we present progress on constructing a package (Quantemol-P) [1] to simplify and automate the process of building and analyzing plasma chemistries, e.g., SF6/O2, CF4/O2 and O2/He. [1] J.J. Munro, J. Tennyson, J. Vac. Sci. Tech. A, accepted

  4. Koopman Operator Framework for Time Series Modeling and Analysis

    Science.gov (United States)

    Surana, Amit

    2018-01-01

    We propose an interdisciplinary framework for time series classification, forecasting, and anomaly detection by combining concepts from Koopman operator theory, machine learning, and linear systems and control theory. At the core of this framework is nonlinear dynamic generative modeling of time series using the Koopman operator, which is an infinite-dimensional but linear operator. Rather than working with the underlying nonlinear model, we propose two simpler linear representations or model forms based on Koopman spectral properties. We show that these model forms are invariants of the generative model and can be readily identified directly from data using techniques for computing Koopman spectral properties without requiring explicit knowledge of the generative model. We also introduce different notions of distances on the space of such model forms, which is essential for model comparison/clustering. We employ the space of Koopman model forms equipped with distance in conjunction with classical machine learning techniques to develop a framework for automatic feature generation for time series classification. The forecasting/anomaly detection framework is based on using Koopman model forms along with classical linear systems and control approaches. We demonstrate the proposed framework for human activity classification, and for time series forecasting/anomaly detection in a power grid application.
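
    A common finite-dimensional approximation of the Koopman operator, used for exactly this kind of data-driven identification, is dynamic mode decomposition (DMD). The sketch below (numpy only, a synthetic linear system, identity observables) is illustrative and is not the authors' implementation:

```python
import numpy as np

# Snapshots of a linear system x_{k+1} = A x_k (a damped rotation)
theta = 0.1
A_true = 0.99 * np.array([[np.cos(theta), -np.sin(theta)],
                          [np.sin(theta),  np.cos(theta)]])
X = np.empty((2, 201))
X[:, 0] = [1.0, 0.0]
for k in range(200):
    X[:, k + 1] = A_true @ X[:, k]

# DMD: least-squares Koopman approximation on identity observables,
# K = X1 X0^+ (Moore-Penrose pseudo-inverse)
X0, X1 = X[:, :-1], X[:, 1:]
K = X1 @ np.linalg.pinv(X0)

# The Koopman eigenvalues characterize the dynamics; here |eig| = 0.99
eigs = np.linalg.eigvals(K)
print(np.round(np.abs(eigs), 3))
```

    For genuinely nonlinear series the same least-squares step is applied to a richer dictionary of observables; the eigenvalues of K then serve as the model-form features used for comparison and clustering.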

  5. Stochastic modeling of hourly rainfall times series in Campania (Italy)

    Science.gov (United States)

    Giorgio, M.; Greco, R.

    2009-04-01

    Occurrence of flowslides and floods in small catchments is difficult to predict, since it is affected by a number of variables, such as mechanical and hydraulic soil properties, slope morphology, vegetation coverage, and rainfall spatial and temporal variability. Consequently, landslide risk assessment procedures and early warning systems still rely on simple empirical models based on correlation between recorded rainfall data and observed landslides and/or river discharges. The effectiveness of such systems could be improved by reliable quantitative rainfall prediction, which allows gaining larger lead times. Analysis of on-site recorded rainfall height time series represents the most effective approach for a reliable prediction of the local temporal evolution of rainfall. Hydrological time series analysis is a widely studied field in hydrology, often carried out by means of autoregressive models, such as AR, ARMA, ARX, ARMAX (e.g. Salas [1992]). Such models give the best results when applied to the analysis of autocorrelated hydrological time series, like river flow or level time series. Conversely, they are not able to model the behaviour of intermittent time series, like point rainfall height series usually are, especially when recorded with short sampling time intervals. More useful for this issue are the so-called DRIP (Disaggregated Rectangular Intensity Pulse) and NSRP (Neyman-Scott Rectangular Pulse) models [Heneker et al., 2001; Cowpertwait et al., 2002], usually adopted to generate synthetic point rainfall series. In this paper, the DRIP model approach is adopted, in which the sequence of rain storms and dry intervals constituting the structure of the rainfall time series is modeled as an alternating renewal process. The final aim of the study is to provide a useful tool to implement an early warning system for hydrogeological risk management. Model calibration has been carried out with hourly rainfall height data provided by the rain gauges of Campania Region civil

  6. Models for Pooled Time-Series Cross-Section Data

    Directory of Open Access Journals (Sweden)

    Lawrence E Raffalovich

    2015-07-01

    Several models are available for the analysis of pooled time-series cross-section (TSCS) data, defined as “repeated observations on fixed units” (Beck and Katz 1995). In this paper, we run the following models: (1) a completely pooled model, (2) fixed effects models, and (3) multi-level/hierarchical linear models. To illustrate these models, we use a Generalized Least Squares (GLS) estimator with cross-section weights and panel-corrected standard errors (with EViews 8) on the cross-national homicide trends data of forty countries from 1950 to 2005, which we source from published research (Messner et al. 2011). We describe and discuss the similarities and differences between the models, and what information each can contribute to help answer substantive research questions. We conclude with a discussion of how the models we present may help to mitigate validity threats inherent in pooled time-series cross-section data analysis.
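
    The contrast between a completely pooled model and a fixed effects (within) estimator can be sketched on a synthetic panel; the data, dimensions, and the no-intercept pooled fit below are illustrative assumptions, not the homicide data from the paper:

```python
import numpy as np

rng = np.random.default_rng(5)
N, T = 40, 56  # e.g. 40 units observed over 56 periods, as in the cited data

# Unit fixed effects, correlated with the regressor so that pooling
# without unit intercepts is biased
alpha = rng.normal(0.0, 2.0, N)
x = 0.6 * alpha[:, None] + rng.normal(0.0, 1.0, (N, T))
beta = 0.5
y = alpha[:, None] + beta * x + rng.normal(0.0, 0.5, (N, T))

# (1) Completely pooled OLS (no intercepts; illustrative only)
beta_pool = (x * y).sum() / (x ** 2).sum()

# (2) Fixed effects via the within (demeaning) transformation,
# which removes alpha and recovers the common slope
xd = x - x.mean(axis=1, keepdims=True)
yd = y - y.mean(axis=1, keepdims=True)
beta_fe = (xd * yd).sum() / (xd ** 2).sum()

print(round(float(beta_pool), 2), round(float(beta_fe), 2))
```

    When unit effects correlate with the regressor, the pooled slope is badly biased while the within estimator stays close to the true value of 0.5, which is the core validity threat the paper discusses.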

  7. Small Signal Audiosusceptibility Model for Series Resonant Converter

    OpenAIRE

    G., Subhash Joshi T.; John, Vinod

    2018-01-01

    Models that accurately predict the output voltage ripple magnitude are essential for applications with stringent performance targets. The impact of DC input ripple on the output ripple for a Series Resonant Converter (SRC) is analysed in this paper using a discrete-domain exact discretization modelling method. A novel discrete state space model, along with a small signal model for the SRC considering 3 state variables, is presented. The audiosusceptibility (AS) transfer function which relates the i...

  8. Enhanced technical and economic working domains of industrial heat pumps operated in series

    DEFF Research Database (Denmark)

    Ommen, Torben; Jensen, Jonas Kjær; Markussen, Wiebke Brix

    2015-01-01

    and a decrease in net present value (NPV), due to the effects of economy of scale. However, connecting HPs in series may reduce the discharge temperature, which is of particular interest when assessing R717 HPs. Another possibility offered by serial connection is to mix different types of HP units, in order to obtain...

  9. Technical Manual for the SAM Biomass Power Generation Model

    Energy Technology Data Exchange (ETDEWEB)

    Jorgenson, J.; Gilman, P.; Dobos, A.

    2011-09-01

    This technical manual provides context for the implementation of the biomass electric power generation performance model in the National Renewable Energy Laboratory's (NREL's) System Advisor Model (SAM). Additionally, the report details the engineering and scientific principles behind the underlying calculations in the model. The framework established in this manual is designed to give users a complete understanding of behind-the-scenes calculations and the results generated.

  10. RADON CONCENTRATION TIME SERIES MODELING AND APPLICATION DISCUSSION.

    Science.gov (United States)

    Stránský, V; Thinová, L

    2017-11-01

    In the year 2010 a continual radon measurement was established at Mladeč Caves in the Czech Republic using a continual radon monitor RADIM3A. In order to model the radon time series in the years 2010-15, the Box-Jenkins Methodology, often used in econometrics, was applied. Because of the behavior of radon concentrations (RCs), a seasonal integrated autoregressive moving averages model with exogenous variables (SARIMAX) was chosen to model the measured time series. This model uses the time series seasonality, previously acquired values and delayed atmospheric parameters to forecast RC. The developed model for the RC time series is called regARIMA(5,1,3). Model residuals could be retrospectively compared with seismic evidence of local or global earthquakes which occurred during the RC measurements. This technique enables us to assess whether continuously measured RC could serve as an earthquake precursor. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
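
    The residual-screening idea (compare model residuals against external events) can be sketched with a much simpler stand-in than regARIMA(5,1,3): a seasonal regression plus an AR(1) on synthetic data with one injected anomaly. Everything below is an illustrative assumption, not the Mladeč Caves series:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 500

# Synthetic "radon" series: annual cycle + AR(1) noise + one injected event
t = np.arange(n)
eps = np.zeros(n)
for k in range(1, n):
    eps[k] = 0.7 * eps[k - 1] + rng.standard_normal()
y = 100.0 + 10.0 * np.sin(2 * np.pi * t / 365.0) + eps
y[300] += 15.0  # an anomaly the model cannot explain

# Least-squares fit of the seasonal part, then an AR(1) on the remainder
D = np.column_stack([np.ones(n),
                     np.sin(2 * np.pi * t / 365.0),
                     np.cos(2 * np.pi * t / 365.0)])
beta, *_ = np.linalg.lstsq(D, y, rcond=None)
r = y - D @ beta
phi = np.dot(r[:-1], r[1:]) / np.dot(r[:-1], r[:-1])
innov = r[1:] - phi * r[:-1]

# Innovations beyond 3 sigma are candidate "events" to compare against
# external evidence (here, the injected anomaly at index 300)
z = (innov - innov.mean()) / innov.std()
flags = np.flatnonzero(np.abs(z) > 3.0)
print(flags + 1)  # innov[k] corresponds to time index k+1
```

    A seismic precursor study would replace the injected spike with recorded earthquake dates and check whether flagged residuals cluster around them.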

  11. High-temperature series expansions for random Potts models

    Directory of Open Access Journals (Sweden)

    M.Hellmund

    2005-01-01

    We discuss recently generated high-temperature series expansions for the free energy and the susceptibility of random-bond q-state Potts models on hypercubic lattices. Using the star-graph expansion technique, quenched disorder averages can be calculated exactly for arbitrary uncorrelated coupling distributions while keeping the disorder strength p as well as the dimension d as symbolic parameters. We present analyses of the new series for the susceptibility of the Ising (q=2) and 4-state Potts model in three dimensions up to order 19 and 18, respectively, and compare our findings with results from field-theoretical renormalization group studies and Monte Carlo simulations.

  12. Quality Quandaries- Time Series Model Selection and Parsimony

    DEFF Research Database (Denmark)

    Bisgaard, Søren; Kulahci, Murat

    2009-01-01

    Some of the issues involved in selecting adequate models for time series data are discussed using an example concerning the number of users of an Internet server. The process of selecting an appropriate model is subjective and requires experience and judgment. The authors believe an important...... consideration in model selection should be parameter parsimony. They favor the use of parsimonious mixed ARMA models, noting that research has shown that a model building strategy that considers only autoregressive representations will lead to non-parsimonious models and to loss of forecasting accuracy....

  13. Parameterizing unconditional skewness in models for financial time series

    DEFF Research Database (Denmark)

    He, Changli; Silvennoinen, Annastiina; Teräsvirta, Timo

    In this paper we consider the third-moment structure of a class of time series models. It is often argued that the marginal distribution of financial time series such as returns is skewed. Therefore it is of importance to know what properties a model should possess if it is to accommodate...... unconditional skewness. We consider modelling the unconditional mean and variance using models that respond nonlinearly or asymmetrically to shocks. We investigate the implications of these models on the third-moment structure of the marginal distribution as well as conditions under which the unconditional...... distribution exhibits skewness and nonzero third-order autocovariance structure. In this respect, an asymmetric or nonlinear specification of the conditional mean is found to be of greater importance than the properties of the conditional variance. Several examples are discussed and, whenever possible...

  14. Model and Variable Selection Procedures for Semiparametric Time Series Regression

    Directory of Open Access Journals (Sweden)

    Risa Kato

    2009-01-01

    Semiparametric regression models are very useful for time series analysis. They facilitate the detection of features resulting from external interventions. The complexity of semiparametric models poses new challenges for issues of nonparametric and parametric inference and model selection that frequently arise from time series data analysis. In this paper, we propose penalized least squares estimators which can simultaneously select significant variables and estimate unknown parameters. An innovative class of variable selection procedures is proposed to select significant variables and basis functions in a semiparametric model. The asymptotic normality of the resulting estimators is established. Information criteria for model selection are also proposed. We illustrate the effectiveness of the proposed procedures with numerical simulations.

  15. Model to evaluate the technical efficiency of university units

    Directory of Open Access Journals (Sweden)

    Marlon Soliman

    2014-06-01

    In higher education institutions, technical efficiency has been measured by several indicators that, when used separately, do not lead to an effective conclusion about the administrative reality of these institutions. Therefore, this paper proposes a model to evaluate the technical efficiency of the university units of a higher education institution (HEI) from the perspectives of Teaching, Research and Extension. The model was conceived according to the assumptions of Data Envelopment Analysis (DEA), using the product-oriented CCR model, based on the identification of relevant variables for the addressed context. The model was applied to evaluate the efficiency of nine academic units of the Federal University of Santa Maria (UFSM), obtaining as a result the efficiency of each unit as well as recommendations for the units considered inefficient. At the end of this study, it was verified that it is possible to measure the efficiency of the various units and consequently establish goals for improvement based on the methodology used.

  16. Analysis of GARCH modeling in financial markets: an approach based on technical analysis strategies

    Directory of Open Access Journals (Sweden)

    Mircea Cristian Gherman

    2011-08-01

    In this paper we performed an analysis to provide evidence of the effect of GARCH modeling on the performance of trading rules applied to a stock market index. Our study relies on the overlap between econometric modeling, technical analysis and a simulation computing technique. The non-linear structures present in the daily returns of the analyzed index, and also in other financial series, together with the phenomenon of volatility clustering, are premises for applying a GARCH model. In our approach the standardized GARCH innovations are resampled using the bootstrap method. Technical analysis trading strategies are then applied to the simulated data. For all the simulated paths the “p-values” are computed in order to verify that the hypothesis concerning the goodness of fit of the GARCH model on the BET index is accepted. The data processed with trading rules show evidence that the GARCH model is a good choice for econometric modeling of financial time series, including the Romanian exchange trade index.
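
    The GARCH simulation and innovation-resampling (bootstrap) step can be sketched in numpy; the GARCH(1,1) parameters below are typical illustrative values, not estimates for the BET index:

```python
import numpy as np

rng = np.random.default_rng(4)

# Simulate a GARCH(1,1) return series:
# sigma2_t = w + a * r_{t-1}^2 + b * sigma2_{t-1}
w, a, b = 0.05, 0.10, 0.85
n = 2000
r = np.zeros(n)
sigma2 = np.full(n, w / (1 - a - b))  # start at the unconditional variance
for t in range(1, n):
    sigma2[t] = w + a * r[t - 1] ** 2 + b * sigma2[t - 1]
    r[t] = np.sqrt(sigma2[t]) * rng.standard_normal()

# Standardized innovations z_t = r_t / sigma_t are approximately iid;
# resampling them with replacement is the bootstrap step described above
z = r[1:] / np.sqrt(sigma2[1:])
z_star = rng.choice(z, size=z.size, replace=True)

print(round(float(z.std()), 2), round(float(z_star.std()), 2))
```

    Each resampled innovation path z_star is fed back through the fitted GARCH recursion to produce a simulated return path, on which the trading rules are then evaluated.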

  17. Forecasting with nonlinear time series model: A Monte-Carlo ...

    African Journals Online (AJOL)

    In this paper, we propose a new method of forecasting with nonlinear time series model using Monte-Carlo Bootstrap method. This new method gives better result in terms of forecast root mean squared error (RMSE) when compared with the traditional Bootstrap method and Monte-Carlo method of forecasting using a ...

  18. Sparse time series chain graphical models for reconstructing genetic networks

    NARCIS (Netherlands)

    Abegaz, Fentaw; Wit, Ernst

    We propose a sparse high-dimensional time series chain graphical model for reconstructing genetic networks from gene expression data parametrized by a precision matrix and autoregressive coefficient matrix. We consider the time steps as blocks or chains. The proposed approach explores patterns of

  19. Modeling irregularly spaced residual series as a continuous stochastic process

    NARCIS (Netherlands)

    Von Asmuth, J.R.; Bierkens, M.F.P.

    2005-01-01

    In this paper, the background and functioning of a simple but effective continuous time approach for modeling irregularly spaced residual series is presented. The basic equations were published earlier by von Asmuth et al. (2002), who used them as part of a continuous time transfer function noise

  20. Multivariate time series modeling of selected childhood diseases in ...

    African Journals Online (AJOL)

    This paper is focused on modeling the five most prevalent childhood diseases in Akwa Ibom State using a multivariate approach to time series. An aggregate of 78,839 reported cases of malaria, upper respiratory tract infection (URTI), Pneumonia, anaemia and tetanus were extracted from five randomly selected hospitals in ...

  1. Combined Forecasts from Linear and Nonlinear Time Series Models

    NARCIS (Netherlands)

    N. Terui (Nobuhiko); H.K. van Dijk (Herman)

    1999-01-01

    textabstractCombined forecasts from a linear and a nonlinear model are investigated for time series with possibly nonlinear characteristics. The forecasts are combined by a constant coefficient regression method as well as a time varying method. The time varying method allows for a locally

  2. Combined forecasts from linear and nonlinear time series models

    NARCIS (Netherlands)

    N. Terui (Nobuhiko); H.K. van Dijk (Herman)

    1999-01-01

    textabstractCombined forecasts from a linear and a nonlinear model are investigated for time series with possibly nonlinear characteristics. The forecasts are combined by a constant coefficient regression method as well as a time varying method. The time varying method allows for a locally

  3. Modeling Philippine Stock Exchange Composite Index Using Time Series Analysis

    Science.gov (United States)

    Gayo, W. S.; Urrutia, J. D.; Temple, J. M. F.; Sandoval, J. R. D.; Sanglay, J. E. A.

    2015-06-01

    This study was conducted to develop a time series model of the Philippine Stock Exchange Composite Index and its volatility using a finite mixture of ARIMA models with conditional variance equations such as the ARCH, GARCH, EGARCH, TARCH and PARCH models. The study also aimed to find the reason behind the behavior of PSEi, that is, which of the economic variables - Consumer Price Index, crude oil price, foreign exchange rate, gold price, interest rate, money supply, price-earnings ratio, Producers' Price Index and terms of trade - can be used in projecting future values of PSEi; this was examined using the Granger Causality Test. The findings showed that the best time series model for the Philippine Stock Exchange Composite Index is ARIMA(1,1,5) - ARCH(1). Also, Consumer Price Index, crude oil price and foreign exchange rate are the factors concluded to Granger-cause the Philippine Stock Exchange Composite Index.
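
    The Granger causality test used above reduces to an F-test comparing a regression of the index on its own lags against one that also includes lags of a candidate variable. A self-contained sketch on synthetic series (the data and lag order are illustrative assumptions, not the study's):

```python
import numpy as np

rng = np.random.default_rng(6)
n, p = 400, 2  # series length and lag order

# Simulate a pair of series where x Granger-causes y (lagged x drives y)
x = np.zeros(n)
y = np.zeros(n)
for t in range(1, n):
    x[t] = 0.5 * x[t - 1] + rng.standard_normal()
    y[t] = 0.3 * y[t - 1] + 0.4 * x[t - 1] + rng.standard_normal()

def lagmat(s):
    # columns s[t-1], ..., s[t-p] for t = p .. n-1
    return np.column_stack([s[p - k:n - k] for k in range(1, p + 1)])

def rss(X, Y):
    b, *_ = np.linalg.lstsq(X, Y, rcond=None)
    e = Y - X @ b
    return e @ e

Y = y[p:]
restricted = np.column_stack([np.ones(n - p), lagmat(y)])
full = np.column_stack([restricted, lagmat(x)])

# F-test: do lags of x improve the prediction of y beyond lags of y?
rss_r, rss_f = rss(restricted, Y), rss(full, Y)
F = ((rss_r - rss_f) / p) / (rss_f / (n - p - full.shape[1]))
print(round(float(F), 1))
```

    A large F (relative to the F(p, n-p-k) critical value) rejects the null of no Granger causality, which is how variables such as CPI or the oil price are screened in the study.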

  4. Thermomechanical constitutive modeling of polyurethane-series shape memory polymer

    Energy Technology Data Exchange (ETDEWEB)

    Tobushi, H.; Ito, N.; Takata, K. [Aichi Inst. of Technol., Nagoya (Japan). Dept. of Mech. Eng.; Hayashi, S. [Nagoya Research and Development Center, Mitsubishi Heavy Industries, Ltd., Nagoya (Japan)

    2000-07-01

    In order to describe the thermomechanical properties in shape memory polymer of polyurethane series, a thermomechanical constitutive model was developed. In order to describe the variation in mechanical properties due to the glass transition, coefficients in the model were expressed by a single exponential function of temperature. The proposed theory expressed well the thermomechanical properties of the material, such as shape fixity and shape recovery. (orig.)
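
    The key modeling idea here is that each mechanical coefficient is a single exponential function of temperature around the glass transition. A hedged sketch of one such coefficient, assuming a Tobushi-style form E(T) = E_g exp(a(T_g/T - 1)); the parameter values are illustrative, not taken from the paper:

```python
import math

def modulus(T, E_g=1000.0, a=20.0, T_g=328.0):
    """Elastic modulus (MPa) as a single exponential function of absolute
    temperature T (K). E_g, a and T_g are made-up illustrative values."""
    return E_g * math.exp(a * (T_g / T - 1.0))

# Below T_g the material is glassy (stiff); above T_g it is rubbery (soft).
E_glassy = modulus(308.0)   # 20 K below T_g
E_rubbery = modulus(348.0)  # 20 K above T_g
```

    The same exponential form can be applied to the viscosity and other coefficients in the constitutive equation, which is how the model captures shape fixity (stiffening on cooling) and shape recovery (softening on reheating).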

  5. Fundamental State Space Time Series Models for JEPX Electricity Prices

    Science.gov (United States)

    Ofuji, Kenta; Kanemoto, Shigeru

    Time series models are popular in attempts to model and forecast price dynamics in various markets. In this paper, we have formulated two state space models and tested their applicability to power price modeling and forecasting using JEPX (Japan Electric Power eXchange) data. State space models generally have a high degree of flexibility, with their time-dependent state transition matrix and system equation configurations. Based on empirical data analysis and past literature, we used calculation assumptions to a) extract a stochastic trend component to capture non-stationarity, and b) detect structural changes underlying the market. The stepwise calculation algorithm followed that of the Kalman filter. We then evaluated the two models' forecasting capabilities in comparison with ordinary AR (autoregressive) and ARCH (autoregressive conditional heteroskedasticity) models. By choosing proper explanatory variables, the latter state space model yielded as good a forecasting capability as that of the AR and ARCH models for a short forecasting horizon.
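
    The stochastic-trend extraction via the Kalman filter can be illustrated with the simplest state space model, the local level model (random-walk level observed with noise). This is a generic sketch, not the paper's JEPX specification; q and r are assumed noise variances:

```python
import numpy as np

def local_level_filter(y, q=0.1, r=1.0):
    """Kalman filter for the local level model:
         y_t = mu_t + eps_t,  Var(eps) = r   (observation)
         mu_t = mu_{t-1} + eta_t,  Var(eta) = q   (state)
    Returns one-step-ahead forecasts of y (the filtered level before updating)."""
    mu, P = y[0], 1.0
    preds = []
    for obs in y:
        preds.append(mu)            # forecast of y_t from data up to t-1
        P = P + q                   # predict step
        K = P / (P + r)             # Kalman gain
        mu = mu + K * (obs - mu)    # update step
        P = (1 - K) * P
    return np.array(preds)

# Simulated price-like series: a slowly drifting level observed with noise.
rng = np.random.default_rng(3)
n = 500
level = np.cumsum(rng.normal(scale=np.sqrt(0.1), size=n))
y = level + rng.normal(scale=1.0, size=n)
pred = local_level_filter(y)
mse_kf = np.mean((y[1:] - pred[1:]) ** 2)
mse_naive = np.mean((y[1:] - y[:-1]) ** 2)      # "yesterday's price" forecast
```

    Because the filter averages out observation noise, its one-step forecasts beat the naive previous-value forecast whenever the measurement noise r is large relative to the level innovation q.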

  6. GRAM Series of Atmospheric Models for Aeroentry and Aeroassist

    Science.gov (United States)

    Duvall, Aleta; Justus, C. G.; Keller, Vernon W.

    2005-01-01

    The eight destinations in the Solar System with sufficient atmosphere for either aeroentry or aeroassist, including aerocapture, are: Venus, Earth, Mars, Jupiter, Saturn, Uranus, and Neptune, and Saturn's moon Titan. Engineering-level atmospheric models for four of these (Earth, Mars, Titan, and Neptune) have been developed for use in NASA's systems analysis studies of aerocapture applications in potential future missions. Work has recently commenced on development of a similar atmospheric model for Venus. This series of MSFC-sponsored models is identified as the Global Reference Atmosphere Model (GRAM) series. An important capability of all of the models in the GRAM series is their ability to simulate quasi-random perturbations for Monte Carlo analyses in developing guidance, navigation and control algorithms, and for thermal systems design. Example applications for Earth aeroentry and Mars aerocapture systems analysis studies are presented and illustrated. Current and planned updates to the Earth and Mars atmospheric models, in support of NASA's new exploration vision, are also presented.

  7. A Comparative Study of Portmanteau Tests for Univariate Time Series Models

    Directory of Open Access Journals (Sweden)

    Sohail Chand

    2006-07-01

    Time series model diagnostic checking is the most important stage of time series model building. In this paper, a comparison among several suggested diagnostic tests is made using simulated time series data.
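
    The most common portmanteau diagnostic compared in such studies is the Ljung-Box statistic, Q = n(n+2) Σ_{k=1..m} r_k²/(n-k), which is approximately χ²(m) when the residuals are white noise. A minimal sketch on simulated data:

```python
import numpy as np

def ljung_box_q(resid, m=10):
    """Ljung-Box portmanteau statistic on residuals; approximately
    chi-square with m degrees of freedom under the white-noise null."""
    x = np.asarray(resid, dtype=float)
    x = x - x.mean()
    n = len(x)
    denom = np.sum(x * x)
    q = 0.0
    for k in range(1, m + 1):
        r_k = np.sum(x[k:] * x[:-k]) / denom   # lag-k autocorrelation
        q += r_k * r_k / (n - k)
    return n * (n + 2) * q

# White noise should give a small Q; an unmodeled AR(1) series a large one.
rng = np.random.default_rng(4)
wn = rng.normal(size=500)
q_wn = ljung_box_q(wn)
ar = np.zeros(500)
e = rng.normal(size=500)
for t in range(1, 500):
    ar[t] = 0.7 * ar[t - 1] + e[t]
q_ar = ljung_box_q(ar)
```

    In a real diagnostic check, Q is computed on the fitted model's residuals and the χ² degrees of freedom are reduced by the number of estimated ARMA parameters.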

  8. Series

    Indian Academy of Sciences (India)

    Madhu

    molecular era. The second episode is familiar to historians of 20th century biology (Sapp 1987). Recent studies have enriched it, as well as thrown light on the first use of ciliates as models. 2. ... low level of consciousness in these organisms. A fascinating part of ... phenomena are due to self-replicating particles present in.

  9. Series

    Indian Academy of Sciences (India)

    2013-01-22

    Jan 22, 2013 ... (Fax, 33-144-323941; Email, morange@biologie.ens.fr). 1. Introduction. In 1968, Walther Stoeckenius summarized in Science the conclusions of a meeting held the previous year at Frascati near Rome on 'membrane modelling and membrane formation' (Stoeckenius 1968). The diversity of the issues ...

  10. Time Series Analysis, Modeling and Applications A Computational Intelligence Perspective

    CERN Document Server

    Chen, Shyi-Ming

    2013-01-01

    Temporal and spatiotemporal data form an inherent fabric of the society as we are faced with streams of data coming from numerous sensors, data feeds, recordings associated with numerous areas of application embracing physical and human-generated phenomena (environmental data, financial markets, Internet activities, etc.). A quest for a thorough analysis, interpretation, modeling and prediction of time series comes with an ongoing challenge for developing models that are both accurate and user-friendly (interpretable). The volume is aimed to exploit the conceptual and algorithmic framework of Computational Intelligence (CI) to form a cohesive and comprehensive environment for building models of time series. The contributions covered in the volume are fully reflective of the wealth of the CI technologies by bringing together ideas, algorithms, and numeric studies, which convincingly demonstrate their relevance, maturity and visible usefulness. It reflects upon the truly remarkable diversity of methodological a...

  11. Wave basin model tests of technical-biological bank protection

    Science.gov (United States)

    Eisenmann, J.

    2012-04-01

    Sloped embankments of inland waterways are usually protected from erosion and other negative impacts of ship-induced hydraulic loads by technical revetments consisting of riprap. Concerning the dimensioning of such bank protection there are several design rules available, e.g. the "Principles for the Design of Bank and Bottom Protection for Inland Waterways" or the Code of Practice "Use of Standard Construction Methods for Bank and Bottom Protection on Waterways" issued by the BAW (Federal Waterways Engineering and Research Institute). Since the European Water Framework Directive has been put into action, special emphasis has been put on natural banks. Therefore the application of technical-biological bank protection is favoured. Currently, design principles for technical-biological bank protection on inland waterways are missing. The existing experiences mainly refer to flowing waters with no or low ship-induced hydraulic loads on the banks. Since 2004 the Federal Waterways Engineering and Research Institute has been tracking the research and development project "Alternative Technical-Biological Bank Protection on Inland Waterways" together with the Federal Institute of Hydrology. The investigation to date includes the examination of waterway sections where technical-biological bank protection is applied locally. For the development of design rules for technical-biological bank protection, investigations shall be carried out in a next step, considering the mechanics and resilience of technical-biological bank protection with special attention to ship-induced hydraulic loads. The presentation gives a short introduction into hydraulic loads at inland waterways and their bank protection. In more detail, model tests of a willow brush mattress as a technical-biological bank protection in a wave basin are explained. Within the scope of these tests the brush mattresses were exposed to wave impacts to determine their resilience towards hydraulic loads. Since the

  12. Degeneracy of time series models: The best model is not always the correct model

    International Nuclear Information System (INIS)

    Judd, Kevin; Nakamura, Tomomichi

    2006-01-01

    There are a number of good techniques for finding, in some sense, the best model of a deterministic system given a time series of observations. We examine a problem called model degeneracy, which has the consequence that even when a perfect model of a system exists, one does not find it using the best techniques currently available. The problem is illustrated using global polynomial models and the theory of Groebner bases

  13. Identification of neutral biochemical network models from time series data

    Directory of Open Access Journals (Sweden)

    Maia Marco

    2009-05-01

    Background: The major difficulty in modeling biological systems from multivariate time series is the identification of parameter sets that endow a model with dynamical behaviors sufficiently similar to the experimental data. Directly related to this parameter estimation issue is the task of identifying the structure and regulation of ill-characterized systems. Both tasks are simplified if the mathematical model is canonical, i.e., if it is constructed according to strict guidelines. Results: In this report, we propose a method for the identification of admissible parameter sets of canonical S-systems from biological time series. The method is based on a Monte Carlo process that is combined with an improved version of our previous parameter optimization algorithm. The method maps the parameter space into the network space, which characterizes the connectivity among components, by creating an ensemble of decoupled S-system models that imitate the dynamical behavior of the time series with sufficient accuracy. The concept of sloppiness is revisited in the context of these S-system models with an exploration not only of different parameter sets that produce similar dynamical behaviors but also different network topologies that yield dynamical similarity. Conclusion: The proposed parameter estimation methodology was applied to actual time series data from the glycolytic pathway of the bacterium Lactococcus lactis and led to ensembles of models with different network topologies. In parallel, the parameter optimization algorithm was applied to the same dynamical data upon imposing a pre-specified network topology derived from prior biological knowledge, and the results from both strategies were compared. The results suggest that the proposed method may serve as a powerful exploration tool for testing hypotheses and the design of new experiments.

  14. Identification of neutral biochemical network models from time series data.

    Science.gov (United States)

    Vilela, Marco; Vinga, Susana; Maia, Marco A Grivet Mattoso; Voit, Eberhard O; Almeida, Jonas S

    2009-05-05

    The major difficulty in modeling biological systems from multivariate time series is the identification of parameter sets that endow a model with dynamical behaviors sufficiently similar to the experimental data. Directly related to this parameter estimation issue is the task of identifying the structure and regulation of ill-characterized systems. Both tasks are simplified if the mathematical model is canonical, i.e., if it is constructed according to strict guidelines. In this report, we propose a method for the identification of admissible parameter sets of canonical S-systems from biological time series. The method is based on a Monte Carlo process that is combined with an improved version of our previous parameter optimization algorithm. The method maps the parameter space into the network space, which characterizes the connectivity among components, by creating an ensemble of decoupled S-system models that imitate the dynamical behavior of the time series with sufficient accuracy. The concept of sloppiness is revisited in the context of these S-system models with an exploration not only of different parameter sets that produce similar dynamical behaviors but also different network topologies that yield dynamical similarity. The proposed parameter estimation methodology was applied to actual time series data from the glycolytic pathway of the bacterium Lactococcus lactis and led to ensembles of models with different network topologies. In parallel, the parameter optimization algorithm was applied to the same dynamical data upon imposing a pre-specified network topology derived from prior biological knowledge, and the results from both strategies were compared. The results suggest that the proposed method may serve as a powerful exploration tool for testing hypotheses and the design of new experiments.
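
    An S-system is the canonical form referred to here: each state obeys dX_i/dt = α_i Π_j X_j^{g_ij} − β_i Π_j X_j^{h_ij}, with power-law production and degradation terms. A toy two-variable sketch with invented kinetic orders (not the L. lactis parameters), integrated by Euler steps:

```python
import numpy as np

def s_system_step(X, alpha, beta, G, H, dt):
    """One Euler step of an S-system:
       dX_i/dt = alpha_i * prod_j X_j**G[i, j] - beta_i * prod_j X_j**H[i, j]."""
    prod_g = np.prod(X ** G, axis=1)   # production terms
    prod_h = np.prod(X ** H, axis=1)   # degradation terms
    return X + dt * (alpha * prod_g - beta * prod_h)

# Illustrative 2-variable loop: X1 production repressed by X2, X2 driven by X1.
alpha = np.array([2.0, 1.0])
beta = np.array([1.0, 1.0])
G = np.array([[0.0, -0.5],
              [0.5,  0.0]])
H = np.array([[0.5, 0.0],
              [0.0, 0.5]])
X = np.array([1.0, 1.0])
for _ in range(5000):                  # integrate to t = 50 with dt = 0.01
    X = s_system_step(X, alpha, beta, G, H, dt=0.01)
# Analytically, the steady state of this toy system is X1 = X2 = 2.
```

    Parameter estimation as described in the abstract would wrap such a simulator in a Monte Carlo search, scoring each candidate (α, β, G, H) by how closely the simulated trajectories match the measured time series.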

  15. Time series regression model for infectious disease and weather.

    Science.gov (United States)

    Imai, Chisato; Armstrong, Ben; Chalabi, Zaid; Mangtani, Punam; Hashizume, Masahiro

    2015-10-01

    Time series regression has been developed and long used to evaluate the short-term associations of air pollution and weather with mortality or morbidity of non-infectious diseases. The application of the regression approaches from this tradition to infectious diseases, however, is less well explored and raises some new issues. We discuss and present potential solutions for five issues often arising in such analyses: changes in immune population, strong autocorrelations, a wide range of plausible lag structures and association patterns, seasonality adjustments, and large overdispersion. The potential approaches are illustrated with datasets of cholera cases and rainfall from Bangladesh and influenza and temperature in Tokyo. Though this article focuses on the application of the traditional time series regression to infectious diseases and weather factors, we also briefly introduce alternative approaches, including mathematical modeling, wavelet analysis, and autoregressive integrated moving average (ARIMA) models. Modifications proposed to standard time series regression practice include using sums of past cases as proxies for the immune population, and using the logarithm of lagged disease counts to control autocorrelation due to true contagion, both of which are motivated from "susceptible-infectious-recovered" (SIR) models. The complexity of lag structures and association patterns can often be informed by biological mechanisms and explored by using distributed lag non-linear models. For overdispersed models, alternative distribution models such as quasi-Poisson and negative binomial should be considered. Time series regression can be used to investigate dependence of infectious diseases on weather, but may need modifying to allow for features specific to this context. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
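
    One modification proposed above is to regress case counts on the logarithm of lagged counts (a proxy for contagion) plus weather covariates. As a rough illustration, the sketch below fits a log-link Poisson regression by iteratively reweighted least squares on synthetic data; the covariate names, coefficients and data are invented, and a real analysis would prefer quasi-Poisson or negative binomial to handle overdispersion, as the abstract notes:

```python
import numpy as np

def poisson_irls(X, y, n_iter=50):
    """Fit a log-link Poisson regression by iteratively reweighted least squares."""
    beta = np.zeros(X.shape[1])
    beta[0] = np.log(y.mean())          # start at the intercept-only fit
    for _ in range(n_iter):
        eta = np.clip(X @ beta, -30, 30)
        mu = np.exp(eta)
        z = eta + (y - mu) / mu         # working response
        W = mu                          # Poisson working weights
        WX = X * W[:, None]
        beta = np.linalg.solve(X.T @ WX, X.T @ (W * z))
    return beta

# Synthetic counts driven by log of lagged cases and a temperature covariate.
rng = np.random.default_rng(2)
n = 400
temp = rng.normal(size=n)
cases = np.empty(n)
cases[0] = 10
for t in range(1, n):
    lam = np.exp(1.0 + 0.5 * np.log(cases[t - 1] + 1) + 0.3 * temp[t])
    cases[t] = rng.poisson(lam)
X = np.column_stack([np.ones(n - 1), np.log(cases[:-1] + 1), temp[1:]])
beta = poisson_irls(X, cases[1:])
# beta should recover roughly (1.0, 0.5, 0.3)
```

    In practice the lag structure of the weather term would be explored with distributed lag non-linear models rather than the single contemporaneous term used here.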

  16. Forecasting the Reference Evapotranspiration Using Time Series Model

    Directory of Open Access Journals (Sweden)

    H. Zare Abyaneh

    2016-10-01

    Introduction: Reference evapotranspiration is one of the most important factors in irrigation timing and field management. Moreover, reference evapotranspiration forecasting can play a vital role in future developments. Therefore, in this study the seasonal autoregressive integrated moving average (ARIMA) model was used to forecast the reference evapotranspiration time series at the Esfahan, Semnan, Shiraz, Kerman, and Yazd synoptic stations. Materials and Methods: In the present study, at all stations (characteristics of the synoptic stations are given in Table 1) the meteorological data, including mean, maximum and minimum air temperature, relative humidity, dry- and wet-bulb temperature, dew-point temperature, wind speed, precipitation, air vapor pressure and sunshine hours, were collected from the Islamic Republic of Iran Meteorological Organization (IRIMO) for the 41 years from 1965 to 2005. The FAO Penman-Monteith equation was used to calculate the monthly reference evapotranspiration at the five synoptic stations and the evapotranspiration time series were formed. The unit root test was used to identify whether the time series was stationary; then, using the Box-Jenkins method, seasonal ARIMA models were applied to the sample data.

    Table 1. The geographical location and climate conditions of the synoptic stations

    Station   Longitude (E)  Latitude (N)  Altitude (m)  Mean annual temp. (°C)  Min.-Max. temp. (°C)  Mean precipitation (mm)  Climate (De Martonne index)
    Esfahan   51° 40'        32° 37'       1550.4        16.36                   9.4-23.3              122                      Arid
    Semnan    53° 33'        35° 35'       1130.8        18.0                    12.4-23.8             140                      Arid
    Shiraz    52° 36'        29° 32'       1484          18.0                    10.2-25.9             324                      Semi-arid
    Kerman    56° 58'        30° 15'       1753.8        15.6                    6.7-24.6              142                      Arid
    Yazd      54° 17'        31° 54'       1237.2        19.2                    11.8-26.0             61                       Arid

    Results and Discussion: The monthly meteorological data were used as input for the Ref-ET software and monthly reference
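
    The Box-Jenkins identification step above chooses the autoregressive order by an information criterion. A minimal pure-AR sketch using Yule-Walker estimation and AIC on simulated data (the series and orders are illustrative, not the evapotranspiration data):

```python
import numpy as np

def yule_walker(x, p):
    """Yule-Walker AR(p) estimates: coefficients phi and innovation variance."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    r = np.empty(p + 1)
    r[0] = x @ x / n
    for k in range(1, p + 1):
        r[k] = x[k:] @ x[:-k] / n          # biased sample autocovariances
    if p == 0:
        return np.array([]), r[0]
    R = np.array([[r[abs(i - j)] for j in range(p)] for i in range(p)])
    phi = np.linalg.solve(R, r[1:])
    sigma2 = r[0] - phi @ r[1:]
    return phi, sigma2

def select_order(x, max_p=6):
    """Box-Jenkins-style order choice by minimising AIC = n*log(sigma2) + 2p."""
    n = len(x)
    aics = [n * np.log(yule_walker(x, p)[1]) + 2 * p for p in range(max_p + 1)]
    return int(np.argmin(aics))

# Simulated AR(2) series with coefficients (0.5, -0.3).
rng = np.random.default_rng(5)
n = 2000
x = np.zeros(n)
e = rng.normal(size=n)
for t in range(2, n):
    x[t] = 0.5 * x[t - 1] - 0.3 * x[t - 2] + e[t]
best_p = select_order(x)
phi, _ = yule_walker(x, 2)
```

    A seasonal ARIMA fit adds differencing and seasonal terms on top of this, but the identification logic (compare candidate orders by penalized fit) is the same.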

  17. Technical Manual for the SAM Physical Trough Model

    Energy Technology Data Exchange (ETDEWEB)

    Wagner, M. J.; Gilman, P.

    2011-06-01

    NREL, in conjunction with Sandia National Lab and the U.S. Department of Energy, developed the System Advisor Model (SAM) analysis tool for renewable energy system performance and economic analysis. This paper documents the technical background and engineering formulation for one of the two parabolic trough system models in SAM. The Physical Trough model calculates performance relationships based on physical first principles where possible, allowing the modeler to predict electricity production for a wider range of component geometries than is possible in the Empirical Trough model. This document describes the major parabolic trough plant subsystems in detail, including the solar field, power block, thermal storage, piping, auxiliary heating, and control systems. This model makes use of both existing subsystem performance modeling approaches and new approaches developed specifically for SAM.

  18. Computational model for simulation small testing launcher, technical solution

    Science.gov (United States)

    Chelaru, Teodor-Viorel; Cristian, Barbu; Chelaru, Adrian

    2014-12-01

    The purpose of this paper is to present some aspects regarding the computational model and technical solutions for a multistage suborbital launcher for testing (SLT) used to test spatial equipment and scientific measurements. The computational model consists in the numerical simulation of SLT evolution for different start conditions. The launcher model presented has six degrees of freedom (6DOF) and variable mass. The results analysed are the flight parameters and ballistic performances. The discussion will focus on the technical possibility of realizing a small multi-stage launcher by recycling military rocket motors. From a technical point of view, the paper is focused on the national project "Suborbital Launcher for Testing" (SLT), which is based on hybrid propulsion and control systems obtained through an original design. Therefore, while classical suborbital sounding rockets are unguided and use a solid-fuel motor for propulsion in an uncontrolled ballistic flight, the SLT project introduces a different approach by proposing the creation of a guided suborbital launcher, which is basically a satellite launcher at a smaller scale, containing its main subsystems. This is why the project itself can be considered an intermediary step in the development of a wider range of launching systems based on hybrid propulsion technology, which may have a major impact on future European launcher programs. The SLT project, as shown in the title, has two major objectives: first, a short-term objective, which consists in obtaining a suborbital launching system able to go into service in a predictable period of time, and a long-term objective that consists in the development and testing of some unconventional subsystems which will be integrated later in the satellite launcher as part of the European space program. This is why the technical content of the project must be carried out beyond the range of the existing suborbital vehicle

  19. A Parsimonious Bootstrap Method to Model Natural Inflow Energy Series

    Directory of Open Access Journals (Sweden)

    Fernando Luiz Cyrino Oliveira

    2014-01-01

    The Brazilian energy generation and transmission system is quite peculiar in its dimension and characteristics. As such, it can be considered unique in the world. It is a high-dimension hydrothermal system with a huge share of hydro plants. Such strong dependency on hydrological regimes implies uncertainties in energy planning, requiring adequate modeling of the hydrological time series. This is carried out via stochastic simulations of monthly inflow series using the family of Periodic Autoregressive models, PAR(p), one for each period (month) of the year. This paper shows the problems in fitting these models with the current system, particularly the identification of the autoregressive order "p" and the corresponding parameter estimation. A new approach is then proposed to set both the model order and the parameter estimates of the PAR(p) models, using a nonparametric computational technique known as the bootstrap. This technique allows the estimation of reliable confidence intervals for the model parameters. The results obtained using the Parsimonious Bootstrap Method of Moments (PBMOM) produced not only more parsimonious model orders but also adherent stochastic scenarios and, in the long range, lead to better use of water resources in energy operation planning.
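
    A PAR(p) model fits a separate autoregression for each calendar month. The sketch below fits the simplest case, a periodic AR(1), and attaches a percentile bootstrap confidence interval to one month's coefficient; the synthetic "inflow" series and the pair-resampling scheme are illustrative simplifications of the PBMOM procedure:

```python
import numpy as np

def par1_coeffs(x, periods=12):
    """Periodic AR(1): a separate lag-1 coefficient for each month."""
    phi = np.empty(periods)
    t = np.arange(1, len(x))
    for m in range(periods):
        sel = t[t % periods == m]
        prev, cur = x[sel - 1], x[sel]
        phi[m] = (prev @ cur) / (prev @ prev)
    return phi

def bootstrap_ci(prev, cur, n_boot=500, seed=0):
    """Percentile bootstrap CI for the lag-1 coefficient of one month."""
    rng = np.random.default_rng(seed)
    est = np.empty(n_boot)
    for b in range(n_boot):
        i = rng.integers(0, len(cur), len(cur))   # resample (prev, cur) pairs
        est[b] = (prev[i] @ cur[i]) / (prev[i] @ prev[i])
    return np.percentile(est, [2.5, 97.5])

# Synthetic series: 40 years of monthly data, true lag-1 coefficient 0.5.
rng = np.random.default_rng(6)
n = 40 * 12
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.5 * x[t - 1] + rng.normal()
phi = par1_coeffs(x)
t_idx = np.arange(1, n)
sel = t_idx[t_idx % 12 == 5]                      # pick one month
lo, hi = bootstrap_ci(x[sel - 1], x[sel])
```

    The parsimony argument in the abstract enters when such intervals are used to decide the order p per month: lags whose coefficients are not distinguishable from zero are dropped.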

  20. Deriving dynamic marketing effectiveness from econometric time series models

    OpenAIRE

    Horváth, C.; Franses, Ph.H.B.F.

    2003-01-01

    To understand the relevance of marketing efforts, it has become standard practice to estimate the long-run and short-run effects of the marketing-mix, using, say, weekly scanner data. A common vehicle for this purpose is an econometric time series model. Issues that are addressed in the literature are unit roots, cointegration, structural breaks and impulse response functions. In this paper we summarize the most important concepts by reviewing all possible empirical cases that can...

  1. Modelling of series of types of automated trenchless works tunneling

    Science.gov (United States)

    Gendarz, P.; Rzasinski, R.

    2016-08-01

    Microtunneling is the newest method for making underground installations. The method builds on the experience and techniques of earlier trenchless underground working methods. To develop this earthworks method further, it is considered reasonable to elaborate a series of types of construction of tunneling machines. There are many design solutions for such machines, but the current goal is to develop a non-excavation robotized machine. Erosion machines for tunnels with main dimensions of 1600, 2000, 2500 and 3150 are designed with the use of computer-aided methods. The creation of the series of types of construction of tunneling machines was preceded by an analysis of the current state. The verification of the practical methodology for creating the systematic part series was based on the designed series of types of erosion machines. The following were developed: a method of construction similarity of the erosion machines, algorithmic methods for variant analyses of quantitative construction attributes in the I-DEAS advanced graphical program, and relational and program parameterization. The manufacturing process of the parts will then be created, which allows the technological process to be verified on CNC machines. The models of the designed machines will be modified and the construction will be consulted with erosion machine users and manufacturers such as Tauber Rohrbau GmbH & Co. KG from Münster and OHL ZS a.s. from Brno. The companies' acceptance will result in practical verification by the JUMARPOL company.

  2. Unsupervised Classification During Time-Series Model Building.

    Science.gov (United States)

    Gates, Kathleen M; Lane, Stephanie T; Varangis, E; Giovanello, K; Guskiewicz, K

    2017-01-01

    Researchers who collect multivariate time-series data across individuals must decide whether to model the dynamic processes at the individual level or at the group level. A recent innovation, group iterative multiple model estimation (GIMME), offers one solution to this dichotomy by identifying group-level time-series models in a data-driven manner while also reliably recovering individual-level patterns of dynamic effects. GIMME is unique in that it does not assume homogeneity in processes across individuals in terms of the patterns or weights of temporal effects. However, it can be difficult to make inferences from the nuances in varied individual-level patterns. The present article introduces an algorithm that arrives at subgroups of individuals that have similar dynamic models. Importantly, the researcher does not need to decide the number of subgroups. The final models contain reliable group-, subgroup-, and individual-level patterns that enable generalizable inferences, subgroups of individuals with shared model features, and individual-level patterns and estimates. We show that integrating community detection into the GIMME algorithm improves upon current standards in two important ways: (1) providing reliable classification and (2) increasing the reliability in the recovery of individual-level effects. We demonstrate this method on functional MRI from a sample of former American football players.

  3. Single-Index Additive Vector Autoregressive Time Series Models

    KAUST Repository

    LI, YEHUA

    2009-09-01

    We study a new class of nonlinear autoregressive models for vector time series, where the current vector depends on single-indexes defined on the past lags and the effects of different lags have an additive form. A sufficient condition is provided for stationarity of such models. We also study estimation of the proposed model using P-splines, hypothesis testing, asymptotics, selection of the order of the autoregression and of the smoothing parameters and nonlinear forecasting. We perform simulation experiments to evaluate our model in various settings. We illustrate our methodology on a climate data set and show that our model provides more accurate yearly forecasts of the El Niño phenomenon, the unusual warming of water in the Pacific Ocean. © 2009 Board of the Foundation of the Scandinavian Journal of Statistics.

  4. Modeling and Forecasting of Water Demand in Isfahan Using Underlying Trend Concept and Time Series

    Directory of Open Access Journals (Sweden)

    H. Sadeghi

    2016-02-01

    Introduction: Accurate modeling of urban water demand is very important for forecasting and for adopting policies related to water resources management. Thus, to estimate future water requirements, it is important to use models with small errors. Water holds a special place among basic human needs, since life is impossible without it. Given its importance in both extraction and consumption, water management is a necessity. Municipal water demand includes domestic, public, industrial and commercial uses. Forecasting urban water demand supports better planning of water resources in arid and semi-arid regions that face water restrictions. Materials and Methods: Technological change is one of the most important factors affecting production and demand functions, so special attention must be paid to how it enters the model. Technology development concerns not only technical aspects; other, non-economic factors (population, geographical and social factors) can also be analyzed. The model examined in this study is a regression model composed of a series of unobserved structural components that are allowed to change stochastically over time. Explanatory variables for technology cannot be measured over time and therefore cannot be entered directly in the model. In this study, structural time series (STSM) and ARMA time series models have been used to model and estimate water demand in Isfahan. Moreover, in order to find the more efficient procedure, the two models have been compared to each other. The data used in this research include water consumption in Isfahan, water price and the monthly pay

  5. Modeling, design, and optimization of Mindwalker series elastic joint.

    Science.gov (United States)

    Wang, Shiqian; Meijneke, Cor; van der Kooij, Herman

    2013-06-01

    Weight and power autonomy are limiting the daily use of wearable exoskeletons. Lightweight, efficient and powerful actuation systems are not easy to achieve. Choosing the right combination of existing technologies, such as batteries, gears and motors, is not a trivial task. In this paper, we propose an optimization framework by setting up a power-based quasi-static model of the exoskeleton joint drivetrain. The goal is to find the most efficient and lightweight combinations. This framework can be generalized for other similar applications by extending or accommodating the model to their own needs. We also present the Mindwalker exoskeleton joint, for which a novel series elastic actuator, consisting of a ballscrew-driven linear actuator and a double spiral spring, was developed and tested. This linear actuator is capable of outputting 960 W of power, and the exoskeleton joint can output 100 Nm peak torque continuously. The double spiral spring can sense torque between 0.08 Nm and 100 Nm and exhibits linearity of 99.99%, with no backlash or hysteresis. The series elastic joint can track a chirp torque profile with an amplitude of 100 Nm over 6 Hz (large-torque bandwidth), and for small torque (2 Nm peak-to-peak) it has a bandwidth over 38 Hz. The integrated exoskeleton joint, including the ballscrew-driven linear actuator, the series spring, electronics and the metal housing which hosts these components, weighs 2.9 kg.

  6. Modeling Periodic Impulsive Effects on Online TV Series Diffusion.

    Science.gov (United States)

    Fu, Peihua; Zhu, Anding; Fang, Qiwen; Wang, Xi

    Online broadcasting substantially affects the production, distribution, and profit of TV series. In addition, online word-of-mouth significantly affects the diffusion of TV series. Because on-demand streaming rates are the most important factor that influences the earnings of online video suppliers, streaming statistics and forecasting trends are valuable. In this paper, we investigate the effects of periodic impulsive stimulation and pre-launch promotion on on-demand streaming dynamics. We consider imbalanced audience feverish distribution using an impulsive susceptible-infected-removed (SIR)-like model. In addition, we perform a correlation analysis of online buzz volume based on Baidu Index data. We propose a PI-SIR model to evolve audience dynamics and translate them into on-demand streaming fluctuations, which can be observed and comprehended by online video suppliers. Six South Korean TV series datasets are used to test the model. We develop a coarse-to-fine two-step fitting scheme to estimate the model parameters, first by fitting inter-period accumulation and then by fitting inner-period feverish distribution. We find that audience members display similar viewing habits. That is, they seek new episodes every update day but fade away. This outcome means that impulsive intensity plays a crucial role in on-demand streaming diffusion. In addition, the initial audience size and online buzz are significant factors. On-demand streaming fluctuation is highly correlated with online buzz fluctuation. To stimulate audience attention and interpersonal diffusion, it is worthwhile to invest in promotion near update days. Strong pre-launch promotion is also a good marketing tool to improve overall performance. It is not advisable for online video providers to promote several popular TV series on the same update day. Inter-period accumulation is a feasible forecasting tool to predict the future trend of the on-demand streaming amount.
The buzz in public social communities

  7. Modeling Periodic Impulsive Effects on Online TV Series Diffusion.

    Directory of Open Access Journals (Sweden)

    Peihua Fu

    Full Text Available Online broadcasting substantially affects the production, distribution, and profit of TV series. In addition, online word-of-mouth significantly affects the diffusion of TV series. Because on-demand streaming rates are the most important factor that influences the earnings of online video suppliers, streaming statistics and forecasting trends are valuable. In this paper, we investigate the effects of periodic impulsive stimulation and pre-launch promotion on on-demand streaming dynamics. We consider imbalanced audience feverish distribution using an impulsive susceptible-infected-removed (SIR)-like model. In addition, we perform a correlation analysis of online buzz volume based on Baidu Index data. We propose a PI-SIR model to evolve audience dynamics and translate them into on-demand streaming fluctuations, which can be observed and comprehended by online video suppliers. Six South Korean TV series datasets are used to test the model. We develop a coarse-to-fine two-step fitting scheme to estimate the model parameters, first by fitting inter-period accumulation and then by fitting inner-period feverish distribution. We find that audience members display similar viewing habits. That is, they seek new episodes every update day but fade away. This outcome means that impulsive intensity plays a crucial role in on-demand streaming diffusion. In addition, the initial audience size and online buzz are significant factors. On-demand streaming fluctuation is highly correlated with online buzz fluctuation. To stimulate audience attention and interpersonal diffusion, it is worthwhile to invest in promotion near update days. Strong pre-launch promotion is also a good marketing tool to improve overall performance. It is not advisable for online video providers to promote several popular TV series on the same update day. Inter-period accumulation is a feasible forecasting tool to predict the future trend of the on-demand streaming amount. The buzz in public
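    The periodic-impulse mechanism described in the abstract can be sketched with a minimal SIR-like simulation. The code below is an illustrative toy, not the paper's fitted PI-SIR model: all rates, the pulse size and the weekly update period are assumed values.

```python
import numpy as np

def impulsive_sir(S0=0.9, I0=0.01, beta=0.3, gamma=0.2,
                  pulse=0.2, period=7.0, days=70.0, dt=0.1):
    """Euler-integrate an SIR-like audience model with a periodic impulse:
    every `period` days (an episode update day) a fraction `pulse` of the
    susceptible (potential) audience becomes active (infected)."""
    steps = round(days / dt)
    pulse_every = round(period / dt)
    S, I, R = S0, I0, 1.0 - S0 - I0
    traj = np.empty((steps, 3))
    for k in range(steps):
        if k > 0 and k % pulse_every == 0:   # impulsive stimulation on update days
            moved = pulse * S
            S -= moved
            I += moved
        dS = -beta * S * I                   # word-of-mouth activation
        dI = beta * S * I - gamma * I        # activation minus fade-out
        dR = gamma * I                       # audience fading away
        S += dS * dt; I += dI * dt; R += dR * dt
        traj[k] = (S, I, R)
    return traj

traj = impulsive_sir()
```

    The active fraction I(t) jumps on every update day and decays in between, which is the saw-tooth streaming pattern the abstract describes.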

  8. Optimization of recurrent neural networks for time series modeling

    DEFF Research Database (Denmark)

    Pedersen, Morten With

    1997-01-01

    The present thesis is about optimization of recurrent neural networks applied to time series modeling. In particular, fully recurrent networks are considered, working from only a single external input, one layer of nonlinear hidden units and a linear output unit applied to prediction of discrete time...... series. The overall objectives are to improve training by application of second-order methods and to improve generalization ability by architecture optimization accomplished by pruning. The major topics covered in the thesis are: 1. The problem of training recurrent networks is analyzed from a numerical...... of solution obtained as well as computation time required. 3. A theoretical definition of the generalization error for recurrent networks is provided. This definition justifies a commonly adopted approach for estimating generalization ability. 4. The viability of pruning recurrent networks by the Optimal...

  9. A Feature Fusion Based Forecasting Model for Financial Time Series

    Science.gov (United States)

    Guo, Zhiqiang; Wang, Huaiqing; Liu, Quan; Yang, Jie

    2014-01-01

    Predicting the stock market has become an increasingly interesting research area for both researchers and investors, and many prediction models have been proposed. In these models, feature selection techniques are used to pre-process the raw data and remove noise. In this paper, a prediction model is constructed to forecast stock market behavior with the aid of independent component analysis, canonical correlation analysis, and a support vector machine. First, two types of features are extracted from the historical closing prices and 39 technical variables obtained by independent component analysis. Second, a canonical correlation analysis method is utilized to combine the two types of features and extract intrinsic features to improve the performance of the prediction model. Finally, a support vector machine is applied to forecast the next day's closing price. The proposed model is applied to the Shanghai stock market index and the Dow Jones index, and experimental results show that the proposed model performs better in prediction than the other two similar models. PMID:24971455

  10. Multi-Cultural Competency-Based Vocational Curricula. Automotive Mechanics. Multi-Cultural Competency-Based Vocational/Technical Curricula Series.

    Science.gov (United States)

    Hepburn, Larry; Shin, Masako

    This document, one of eight in a multi-cultural competency-based vocational/technical curricula series, is on automotive mechanics. This program is designed to run 36 weeks and cover 10 instructional areas: the engine; drive trains--rear ends/drive shafts/manual transmission; carburetor; emission; ignition/tune-up; charging and starting;…

  11. Multi-Cultural Competency-Based Vocational Curricula. Maintenance Mechanics. Multi-Cultural Competency-Based Vocational/Technical Curricula Series.

    Science.gov (United States)

    Hepburn, Larry; Shin, Masako

    This document, one of eight in a multi-cultural competency-based vocational/technical curricula series, is on maintenance mechanics. This program is designed to run 40 weeks and cover 5 instructional areas: basic electricity (14 weeks); maintenance and repair of heating (4 weeks); maintenance and repair of air conditioning (12 weeks); maintenance…

  12. Multi-Cultural Competency-Based Vocational Curricula. Food Service. Multi-Cultural Competency-Based Vocational/Technical Curricula Series.

    Science.gov (United States)

    Hepburn, Larry; Shin, Masako

    This document, one of eight in a multi-cultural competency-based vocational/technical curricula series, is on food service. This program is designed to run 24 weeks and cover 15 instructional areas: orientation, sanitation, management/planning, preparing food for cooking, preparing beverages, cooking eggs, cooking meat, cooking vegetables,…

  13. Modeling financial time series with S-plus

    CERN Document Server

    Zivot, Eric

    2003-01-01

    The field of financial econometrics has exploded over the last decade. This book represents an integration of theory, methods, and examples using the S-PLUS statistical modeling language and the S+FinMetrics module to facilitate the practice of financial econometrics. This is the first book to show the power of S-PLUS for the analysis of time series data. It is written for researchers and practitioners in the finance industry, academic researchers in economics and finance, and advanced MBA and graduate students in economics and finance. Readers are assumed to have a basic knowledge of S-PLUS and a solid grounding in basic statistics and time series concepts. Eric Zivot is an associate professor and Gary Waterman Distinguished Scholar in the Economics Department at the University of Washington, and is co-director of the nascent Professional Master's Program in Computational Finance. He regularly teaches courses on econometric theory, financial econometrics and time series econometrics, and is the recipient of the He...

  14. Clustering Multivariate Time Series Using Hidden Markov Models

    Directory of Open Access Journals (Sweden)

    Shima Ghassempour

    2014-03-01

    Full Text Available In this paper we describe an algorithm for clustering multivariate time series with variables taking both categorical and continuous values. Time series of this type are frequent in health care, where they represent the health trajectories of individuals. The problem is challenging because categorical variables make it difficult to define a meaningful distance between trajectories. We propose an approach based on Hidden Markov Models (HMMs, where we first map each trajectory into an HMM, then define a suitable distance between HMMs and finally proceed to cluster the HMMs with a method based on a distance matrix. We test our approach on a simulated, but realistic, data set of 1,255 trajectories of individuals of age 45 and over, on a synthetic validation set with known clustering structure, and on a smaller set of 268 trajectories extracted from the longitudinal Health and Retirement Survey. The proposed method can be implemented quite simply using standard packages in R and Matlab and may be a good candidate for solving the difficult problem of clustering multivariate time series with categorical variables using tools that do not require advanced statistic knowledge, and therefore are accessible to a wide range of researchers.
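    The core primitives of the approach (a per-trajectory HMM, a likelihood-based distance between HMMs, and clustering of the resulting distance matrix) can be sketched in NumPy/SciPy. For brevity the EM fitting step is skipped: each trajectory's HMM is taken as given, and a symmetrized, length-normalized likelihood distance (Rabiner-style) is used; all parameters are illustrative assumptions.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

rng = np.random.default_rng(1)

def sample(A, means, T, rng):
    """Sample T observations from a Gaussian HMM (unit variance)."""
    s, obs = 0, np.empty(T)
    for t in range(T):
        obs[t] = rng.normal(means[s], 1.0)
        s = rng.choice(len(A), p=A[s])
    return obs

def loglik(obs, A, means):
    """Log-likelihood via the scaled forward algorithm."""
    n = len(A)
    b = np.exp(-0.5 * (obs[:, None] - means) ** 2) / np.sqrt(2 * np.pi)
    alpha = np.full(n, 1.0 / n) * b[0]
    c = alpha.sum(); ll = np.log(c); alpha = alpha / c
    for t in range(1, len(obs)):
        alpha = (alpha @ A) * b[t]
        c = alpha.sum(); ll += np.log(c); alpha = alpha / c
    return ll

def hmm_dist(h1, h2, T=300):
    """Symmetrized, per-observation likelihood distance between two HMMs."""
    x1, x2 = sample(*h1, T, rng), sample(*h2, T, rng)
    d12 = (loglik(x1, *h1) - loglik(x1, *h2)) / T
    d21 = (loglik(x2, *h2) - loglik(x2, *h1)) / T
    return 0.5 * (d12 + d21)

# four per-trajectory HMMs: two "bimodal" ones, two "near-zero" ones
hmms = [(np.array([[0.95, 0.05], [0.05, 0.95]]), np.array([-3.0, 3.0])),
        (np.array([[0.90, 0.10], [0.10, 0.90]]), np.array([-2.5, 2.8])),
        (np.array([[0.50, 0.50], [0.50, 0.50]]), np.array([-0.3, 0.4])),
        (np.array([[0.60, 0.40], [0.40, 0.60]]), np.array([-0.2, 0.3]))]

D = np.zeros((4, 4))
for i in range(4):
    for j in range(i + 1, 4):
        D[i, j] = D[j, i] = max(hmm_dist(hmms[i], hmms[j]), 0.0)

labels = fcluster(linkage(squareform(D), "average"), 2, "maxclust")
```

    Hierarchical clustering of the distance matrix recovers the two families of HMMs, which is the "distance matrix" clustering step the abstract describes.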

  15. Assimilation of LAI time-series in crop production models

    Science.gov (United States)

    Kooistra, Lammert; Rijk, Bert; Nannes, Louis

    2014-05-01

    Agriculture is worldwide a large consumer of freshwater, nutrients and land. Spatial explicit agricultural management activities (e.g., fertilization, irrigation) could significantly improve efficiency in resource use. In previous studies and operational applications, remote sensing has shown to be a powerful method for spatio-temporal monitoring of actual crop status. As a next step, yield forecasting by assimilating remote sensing based plant variables in crop production models would improve agricultural decision support both at the farm and field level. In this study we investigated the potential of remote sensing based Leaf Area Index (LAI) time-series assimilated in the crop production model LINTUL to improve yield forecasting at field level. The effect of assimilation method and amount of assimilated observations was evaluated. The LINTUL-3 crop production model was calibrated and validated for a potato crop on two experimental fields in the south of the Netherlands. A range of data sources (e.g., in-situ soil moisture and weather sensors, destructive crop measurements) was used for calibration of the model for the experimental field in 2010. LAI from cropscan field radiometer measurements and actual LAI measured with the LAI-2000 instrument were used as input for the LAI time-series. The LAI time-series were assimilated in the LINTUL model and validated for a second experimental field on which potatoes were grown in 2011. Yield in 2011 was simulated with an R2 of 0.82 when compared with field measured yield. Furthermore, we analysed the potential of assimilation of LAI into the LINTUL-3 model through the 'updating' assimilation technique. The deviation between measured and simulated yield decreased from 9371 kg/ha to 8729 kg/ha when assimilating weekly LAI measurements in the LINTUL model over the season of 2011. LINTUL-3 furthermore shows the main growth reducing factors, which are useful for farm decision support. 
The combination of crop models and sensor
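    The 'updating' assimilation technique mentioned in the abstract can be sketched with a toy logistic LAI model standing in for LINTUL-3: whenever an observation is available, the simulated LAI state is simply replaced by it. The growth rates, observation schedule and bias between runs are all assumed for illustration.

```python
import numpy as np

def lintul_like(r, lai0=0.1, lai_max=6.0, days=100, obs=None):
    """Toy daily logistic LAI growth with optional 'updating' assimilation."""
    lai = lai0
    out = np.empty(days)
    for d in range(days):
        if obs is not None and d in obs:
            lai = obs[d]                       # 'updating': replace state by observation
        lai += r * lai * (1 - lai / lai_max)   # daily logistic LAI growth
        out[d] = lai
    return out

truth = lintul_like(r=0.15)                        # "reality": faster growth
weekly = {d: truth[d] for d in range(7, 100, 7)}   # weekly LAI observations
open_loop = lintul_like(r=0.12)                    # biased model, no assimilation
assimilated = lintul_like(r=0.12, obs=weekly)      # weekly state updating

err_open = np.abs(open_loop - truth).mean()
err_assim = np.abs(assimilated - truth).mean()
```

    Weekly updating pulls the biased model run back toward the observed trajectory, which is the mechanism behind the reported reduction in yield error.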

  16. TECHNICAL VISION SYSTEM FOR THE ROBOTIC MODEL OF SURFACE VESSEL

    Directory of Open Access Journals (Sweden)

    V. S. Gromov

    2016-07-01

    Full Text Available The paper presents results of work on the creation of a technical vision system within a training complex for the verification of control systems on a model of a surface vessel. The developed system determines the coordinates and orientation angle of the controlled object by means of an external video camera at one benchmark, without the need to install additional equipment on the controlled object itself. The method was tested on a robotic complex with a surface vessel model 430 mm in length; the coordinates of the controlled object were determined with an accuracy of 2 mm. This method can be applied as a coordinate-acquisition subsystem for automatic control systems of surface vessels during scale-model testing.

  17. Empirical investigation on modeling solar radiation series with ARMA–GARCH models

    International Nuclear Information System (INIS)

    Sun, Huaiwei; Yan, Dong; Zhao, Na; Zhou, Jianzhong

    2015-01-01

    Highlights: • Apply 6 ARMA–GARCH(-M) models to model and forecast solar radiation. • The ARMA–GARCH(-M) models produce more accurate radiation forecasting than conventional methods. • Show that ARMA–GARCH-M models are more effective for forecasting solar radiation mean and volatility. • The ARMA–EGARCH-M is robust and the ARMA–sGARCH-M is very competitive. - Abstract: Simulation of radiation is one of the most important issues in solar utilization. Time series models are useful tools in the estimation and forecasting of solar radiation series and their changes. In this paper, the effectiveness of autoregressive moving average (ARMA) models with various generalized autoregressive conditional heteroskedasticity (GARCH) processes, namely ARMA–GARCH models, is evaluated for radiation series. Six different GARCH approaches, comprising three different ARMA–GARCH models and the corresponding GARCH-in-mean (ARMA–GARCH-M) models, are applied to radiation data sets from two representative climate stations in China. Multiple evaluation metrics of modeling sufficiency are used for evaluating the performances of the models. The results show that the ARMA–GARCH(-M) models are effective in radiation series estimation. Both in fitting and prediction of radiation series, the ARMA–GARCH(-M) models show better modeling sufficiency than traditional models, while the ARMA–EGARCH-M models are robust at both sites and the ARMA–sGARCH-M models appear very competitive. Comparisons of statistical diagnostics and model performance clearly show that the ARMA–GARCH-M models make the mean radiation equations more sufficient. The ARMA–GARCH(-M) models are recommended as the preferred method for modeling solar radiation series
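    The structure of an ARMA–GARCH model is easiest to see in simulation: an ARMA mean equation driven by innovations whose conditional variance follows a GARCH recursion. The sketch below simulates an ARMA(1,1)–GARCH(1,1) process with assumed (not fitted) parameters; in practice such models are estimated with dedicated packages such as Python's arch or R's rugarch.

```python
import numpy as np

def simulate_arma_garch(n=2000, phi=0.6, theta=0.2,
                        omega=0.05, alpha=0.1, beta=0.85, seed=0):
    """Simulate y_t = phi*y_{t-1} + theta*e_{t-1} + e_t with GARCH(1,1) noise:
    e_t = sigma_t*z_t, sigma_t^2 = omega + alpha*e_{t-1}^2 + beta*sigma_{t-1}^2."""
    rng = np.random.default_rng(seed)
    y, e = np.zeros(n), np.zeros(n)
    sig2 = np.full(n, omega / (1 - alpha - beta))   # start at unconditional variance
    for t in range(1, n):
        sig2[t] = omega + alpha * e[t - 1] ** 2 + beta * sig2[t - 1]
        e[t] = np.sqrt(sig2[t]) * rng.standard_normal()
        y[t] = phi * y[t - 1] + theta * e[t - 1] + e[t]
    return y, sig2

y, sig2 = simulate_arma_garch()   # mean dynamics plus volatility clustering
```

    The GARCH-in-mean (-M) variants discussed in the abstract additionally feed sigma_t back into the mean equation as a regressor.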

  18. Modelling of Water Cooled Fuel Including Design Basis and Severe Accidents. Proceedings of a Technical Meeting

    International Nuclear Information System (INIS)

    2015-11-01

    The demands on nuclear fuel have recently been increasing, and include transient regimes, higher discharge burnup and longer fuel cycles. This has resulted in an increase of loads on fuel and core internals. In order to satisfy these demands while ensuring compliance with safety criteria, new national and international programmes have been launched and advanced modelling codes are being developed. The Fukushima Daiichi accident has particularly demonstrated the need for adequate analysis of all aspects of fuel performance to prevent a failure and also to predict fuel behaviour were an accident to occur. This publication presents the Proceedings of the Technical Meeting on Modelling of Water Cooled Fuel Including Design Basis and Severe Accidents, which was hosted by the Nuclear Power Institute of China (NPIC) in Chengdu, China, following the recommendation made in 2013 at the IAEA Technical Working Group on Fuel Performance and Technology. This recommendation was in agreement with IAEA mid-term initiatives, linked to the post-Fukushima IAEA Nuclear Safety Action Plan, as well as the forthcoming Coordinated Research Project (CRP) on Fuel Modelling in Accident Conditions. At the technical meeting in Chengdu, major areas and physical phenomena, as well as types of code and experiment to be studied and used in the CRP, were discussed. The technical meeting provided a forum for international experts to review the state of the art of code development for modelling fuel performance of nuclear fuel for water cooled reactors with regard to steady state and transient conditions, and for design basis and early phases of severe accidents, including experimental support for code validation. A round table discussion focused on the needs and perspectives on fuel modelling in accident conditions. This meeting was the ninth in a series of IAEA meetings, which reflects Member States’ continuing interest in nuclear fuel issues. The previous meetings were held in 1980 (jointly with

  19. Mathematical Modeling and Dynamic Simulation of Metabolic Reaction Systems Using Metabolome Time Series Data.

    Science.gov (United States)

    Sriyudthsak, Kansuporn; Shiraishi, Fumihide; Hirai, Masami Yokota

    2016-01-01

    The high-throughput acquisition of metabolome data is greatly anticipated for the complete understanding of cellular metabolism in living organisms. A variety of analytical technologies have been developed to acquire large-scale metabolic profiles under different biological or environmental conditions. Time series data are useful for predicting the most likely metabolic pathways because they provide important information regarding the accumulation of metabolites, which implies causal relationships in the metabolic reaction network. Considerable effort has been undertaken to utilize these data for constructing a mathematical model merging system properties and quantitatively characterizing a whole metabolic system in toto. However, there remain technical difficulties between the provision and the utilization of data. Although hundreds of metabolites can be measured, providing information on the metabolic reaction system, the simultaneous measurement of thousands of metabolites is still challenging. In addition, it is nontrivial to logically predict the dynamic behaviors of unmeasurable metabolite concentrations without sufficient information on the metabolic reaction network. Yet, consolidating the advantages of advancements in both metabolomics and mathematical modeling remains to be accomplished. This review outlines the conceptual basis of and recent advances in technologies in both research fields. It also highlights the potential for constructing a large-scale mathematical model by estimating model parameters from time series metabolome data in order to comprehensively understand metabolism at the systems level.

  20. Technical note: Bayesian calibration of dynamic ruminant nutrition models.

    Science.gov (United States)

    Reed, K F; Arhonditsis, G B; France, J; Kebreab, E

    2016-08-01

    Mechanistic models of ruminant digestion and metabolism have advanced our understanding of the processes underlying ruminant animal physiology. Deterministic modeling practices ignore the inherent variation within and among individual animals and thus have no way to assess how sources of error influence model outputs. We introduce Bayesian calibration of mathematical models to address the need for robust mechanistic modeling tools that can accommodate error analysis by remaining within the bounds of data-based parameter estimation. For the purpose of prediction, the Bayesian approach generates a posterior predictive distribution that represents the current estimate of the value of the response variable, taking into account both the uncertainty about the parameters and model residual variability. Predictions are expressed as probability distributions, thereby conveying significantly more information than point estimates in regard to uncertainty. Our study illustrates some of the technical advantages of Bayesian calibration and discusses the future perspectives in the context of animal nutrition modeling. Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
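    The mechanics of Bayesian calibration can be sketched on a deliberately simple stand-in for a nutrition model: a first-order degradation curve with one unknown rate, calibrated by random-walk Metropolis, with a posterior predictive distribution carrying both parameter and residual uncertainty. The curve, flat prior and all settings are illustrative assumptions, not the paper's rumen model.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.5, 48, 20)                  # incubation times (h)
k_true, sigma = 0.08, 2.0
y_obs = 80 * (1 - np.exp(-k_true * t)) + rng.normal(0, sigma, t.size)

def loglike(k):
    """Gaussian log-likelihood of the observations under decay rate k."""
    if k <= 0:
        return -np.inf                        # flat prior restricted to k > 0
    mu = 80 * (1 - np.exp(-k * t))
    return -0.5 * np.sum((y_obs - mu) ** 2) / sigma ** 2

# random-walk Metropolis sampler over the single parameter k
k, ll, samples = 0.05, loglike(0.05), []
for _ in range(5000):
    kp = k + rng.normal(0, 0.01)              # propose a nearby rate
    llp = loglike(kp)
    if np.log(rng.uniform()) < llp - ll:      # Metropolis accept/reject
        k, ll = kp, llp
    samples.append(k)
samples = np.array(samples[1000:])            # drop burn-in

# posterior predictive at t = 24 h: parameter + residual uncertainty
pred24 = 80 * (1 - np.exp(-samples * 24)) + rng.normal(0, sigma, samples.size)
```

    The spread of pred24 is the point the abstract makes: predictions come out as distributions, not point estimates.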

  1. Modeling technical change in climate analysis: evidence from agricultural crop damages.

    Science.gov (United States)

    Ahmed, Adeel; Devadason, Evelyn S; Al-Amin, Abul Quasem

    2017-05-01

    This study accounts for Hicks-neutral technical change in a calibrated model of climate analysis, to identify the optimum level of technical change for addressing climate change. It demonstrates the reduction in crop damages, the costs of technical change, and the net gains from the adoption of technical change for the climate-sensitive Pakistani economy. The calibrated model assesses the net gains of technical change for the overall economy and at the agriculture-specific level. The study finds that the gains of technical change are overwhelmingly higher than the costs across the agriculture subsectors. The gains and costs following technical change differ substantially for different crops. More importantly, the study finds a cost-effective optimal level of technical change that potentially reduces crop damages to a minimum possible level. The study therefore contends that the climate policy for Pakistan should consider the role of technical change in addressing climate impacts on the agriculture sector.

  2. Self-organising mixture autoregressive model for non-stationary time series modelling.

    Science.gov (United States)

    Ni, He; Yin, Hujun

    2008-12-01

    Modelling non-stationary time series has been a difficult task for both parametric and nonparametric methods. One promising solution is to combine the flexibility of nonparametric models with the simplicity of parametric models. In this paper, the self-organising mixture autoregressive (SOMAR) network is adopted as such a mixture model. It breaks time series into underlying segments and at the same time fits local linear regressive models to the clusters of segments. In such a way, a global non-stationary time series is represented by a dynamic set of local linear regressive models. Neural gas is used for a more flexible structure of the mixture model. Furthermore, a new similarity measure has been introduced in the self-organising network to better quantify the similarity of time series segments. The network can be used naturally in modelling and forecasting non-stationary time series. Experiments on artificial, benchmark time series (e.g. Mackey-Glass) and real-world data (e.g. numbers of sunspots and Forex rates) are presented and the results show that the proposed SOMAR network is effective and superior to other similar approaches.
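    The "dynamic set of local linear models" idea is easy to demonstrate: split a non-stationary series into segments and fit a local AR model to each, so that segments from different regimes yield clearly different coefficients. The sketch below uses plain least-squares AR(1) fits on an assumed regime-switching series; the full SOMAR network additionally organises these local models on a self-organising map with a specialised similarity measure.

```python
import numpy as np

rng = np.random.default_rng(2)

def ar1_series(phi, n):
    """Generate an AR(1) series x_t = phi*x_{t-1} + noise."""
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.standard_normal()
    return x

# non-stationary series: the AR coefficient switches half-way through
series = np.concatenate([ar1_series(0.9, 500), ar1_series(-0.8, 500)])

def local_ar1(seg):
    """Least-squares AR(1) coefficient for one segment."""
    return seg[:-1] @ seg[1:] / (seg[:-1] @ seg[:-1])

segs = series.reshape(10, 100)                   # fixed-length segments
coeffs = np.array([local_ar1(s) for s in segs])  # one local model per segment
```

    Clustering the coefficient vectors then groups segments by regime, which is the segmentation-plus-local-models representation the abstract describes.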

  3. Mathematical Models for the Education Sector, A Survey. (Les Modeles Mathematiques du Sector Enseignement.) Technical Report.

    Science.gov (United States)

    Organisation for Economic Cooperation and Development, Paris (France).

    The purposes of this volume are to report a survey of current practice in the construction and use of mathematical models for the education sector: to identify the most important technical and substantive problems confronting the model-building effort; and to bridge the gap between the advancing research pursuit of model-building and the lagging…

  4. Crowd Sourcing for Challenging Technical Problems and Business Model

    Science.gov (United States)

    Davis, Jeffrey R.; Richard, Elizabeth

    2011-01-01

    Crowd sourcing may be defined as the act of outsourcing tasks that are traditionally performed by an employee or contractor to an undefined, generally large group of people or community (a crowd) in the form of an open call. The open call may be issued by an organization wishing to find a solution to a particular problem or complete a task, or by an open innovation service provider on behalf of that organization. In 2008, the Space Life Sciences Directorate (SLSD), with the support of Wyle Integrated Science and Engineering, established and implemented pilot projects in open innovation (crowd sourcing) to determine if these new internet-based platforms could indeed find solutions to difficult technical challenges. These unsolved technical problems were converted to problem statements, also called "Challenges" or "Technical Needs" by the various open innovation service providers, and were then posted externally to seek solutions. In addition, an open call was issued internally to NASA employees Agency-wide (10 Field Centers and NASA HQ), using an open innovation service provider crowd sourcing platform, to post NASA challenges from each Center for the others to propose solutions. From 2008 to 2010, the SLSD issued 34 challenges, 14 externally and 20 internally. The 14 external problems or challenges were posted through three different vendors: InnoCentive, Yet2.com and TopCoder. The 20 internal challenges were conducted using the InnoCentive crowd sourcing platform designed for internal use by an organization. This platform was customized for NASA use and promoted as NASA@Work. The results were significant. Of the seven InnoCentive external challenges, two full and five partial awards were made in complex technical areas such as predicting solar flares and long-duration food packaging. Similarly, the TopCoder challenge yielded an optimization algorithm for designing a lunar medical kit. 
The Yet2.com challenges yielded many new industry and academic contacts in bone

  5. Modelling Technical and Economic Parameters in Selection of Manufacturing Devices

    Directory of Open Access Journals (Sweden)

    Naqib Daneshjo

    2017-11-01

    Full Text Available Sustainable science and technology development is also conditioned by the continuous development of the means of production, which play a key role in the structure of each production system. The mechanical nature of the means of production is complemented by control and electronic devices in the context of intelligent industry. The selection of production machines for a technological process or project has so far been resolved in practice often only intuitively. As intelligence increases, so does the number of variable parameters that have to be considered when choosing a production device. During the selection it is necessary to use computing techniques and decision-making methods, ranging from heuristics to more precise methodological procedures. The authors present an innovative model for the optimization of technical and economic parameters in the selection of manufacturing devices for Industry 4.0.

  6. Faults on gamma projector Model 660 series for industrial radiography

    International Nuclear Information System (INIS)

    Sukhri Ahmad; Arshad Yassin; Saidi Rajab; Shaharudin Sayuti; Abd Nassir Ibrahim; Abd Razak Hamzah

    2005-01-01

    The main objective of this paper is to present MINT's experience pertaining to gamma projector maintenance activity. In Malaysia there are more than 100 gamma projectors that need to be maintained annually. Most of these projectors are Tech-Ops Model 660 series portable gamma radiography systems, used primarily for industrial radiography. The portability feature of the system provides both a safe means of transporting the radioactive source and operating flexibility, particularly useful in areas where access is limited. In Malaysia, the Malaysian Institute for Nuclear Technology Research (MINT) has been approved as the National Gamma Projector Maintenance Centre by the Atomic Energy Licensing Board (AELB). This approval entitles MINT to undertake projector maintenance activity for all projectors throughout the country. Within 10 years of operation, MINT has dealt with thousands of projectors. (Author)

  7. Promoting Creative Thinking Ability Using Contextual Learning Model in Technical Drawing Achievement

    Science.gov (United States)

    Mursid, R.

    2018-02-01

    The purpose of this study is to determine whether there are differences in technical drawing achievement between students taught with the Contextual Innovative Model (CIM) and those taught with the Direct Instructional Model (DIM); differences between students with High Creative Thinking Ability (HCTA) and those with Low Creative Thinking Ability (LCTA); and an interaction between the learning model and creative thinking ability on technical drawing achievement. A quasi-experimental research method was used. The results indicate that: the achievement of students who learned technical drawing with CIM is higher than that of students taught with DIM; the achievement of students with HCTA is higher than that of students with LCTA; and there are interactions between the use of learning models and creative thinking abilities in influencing students' technical drawing achievement.

  8. Exploring the Benefits of Teacher-Modeling Strategies Integrated into Career and Technical Education

    Science.gov (United States)

    Cathers, Thomas J., Sr.

    2013-01-01

    This case study examined how career and technical education classes function using multiple instructional modeling strategies integrated into vocational and technical training environments. Seven New Jersey public school technical teachers received an introductory overview of the investigation and participated by responding to 10 open-end…

  9. Microeconomic co-evolution model for financial technical analysis signals

    Science.gov (United States)

    Rotundo, G.; Ausloos, M.

    2007-01-01

    Technical analysis (TA) has been used for a long time before the availability of more sophisticated instruments for financial forecasting in order to suggest decisions on the basis of the occurrence of data patterns. Many mathematical and statistical tools for quantitative analysis of financial markets have experienced a fast and wide growth and have the power for overcoming classical TA methods. This paper aims to give a measure of the reliability of some information used in TA by exploring the probability of their occurrence within a particular microeconomic agent-based model of markets, i.e., the co-evolution Bak-Sneppen model originally invented for describing species population evolutions. After having proved the practical interest of such a model in describing financial index so-called avalanches, in the prebursting bubble time rise, the attention focuses on the occurrence of trend line detection crossing of meaningful barriers, those that give rise to some usual TA strategies. The case of the NASDAQ crash of April 2000 serves as an illustration.
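    The co-evolution Bak–Sneppen model mentioned above is simple to state: agents on a ring each carry a fitness in [0, 1], and at every step the least-fit agent and its two neighbours are assigned fresh random fitnesses. The minimal sketch below shows the standard 1D model (sizes and step count are illustrative), whose punctuated "avalanche" dynamics the paper maps onto market behaviour.

```python
import numpy as np

rng = np.random.default_rng(0)
N, steps = 200, 20000
fitness = rng.uniform(size=N)            # one fitness per species/agent
for _ in range(steps):
    i = np.argmin(fitness)               # the least-fit agent "mutates"...
    for j in (i - 1, i, (i + 1) % N):    # ...with both neighbours; i-1 wraps via
        fitness[j] = rng.uniform()       # negative indexing, giving a ring topology
```

    After equilibration almost all fitnesses sit above a self-organised threshold (about 0.67 in 1D), and the intermittent bursts of low-fitness renewals are the avalanches referred to in the abstract.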

  10. A Human View Model for Socio-Technical Interactions

    Science.gov (United States)

    Handley, Holly A.; Tolk, Andreas

    2012-01-01

    The Human View was developed as an additional architectural viewpoint to focus on the human part of a system. The Human View can be used to collect and organize data in order to understand how human operators interact and impact the other elements of a system. This framework can also be used to develop a model to describe how humans interact with each other in network enabled systems. These socio-technical interactions form the foundation of the emerging area of Human Interoperability. Human Interoperability strives to understand the relationships required between human operators that impact collaboration across networked environments, including the effect of belonging to different organizations. By applying organizational relationship concepts from network theory to the Human View elements, and aligning these relationships with a model developed to identify layers of coalition interoperability, the conditions for different levels for Human Interoperability for network enabled systems can be identified. These requirements can then be captured in the Human View products to improve the overall network enabled system.

  11. Modelling transport energy demand: A socio-technical approach

    International Nuclear Information System (INIS)

    Anable, Jillian; Brand, Christian; Tran, Martino; Eyre, Nick

    2012-01-01

    Despite an emerging consensus that societal energy consumption and related emissions are not only influenced by technical efficiency but also by lifestyles and socio-cultural factors, few attempts have been made to operationalise these insights in models of energy demand. This paper addresses that gap by presenting a scenario exercise using an integrated suite of sectoral and whole systems models to explore potential energy pathways in the UK transport sector. Techno-economic driven scenarios are contrasted with one in which social change is strongly influenced by concerns about energy use, the environment and well-being. The ‘what if’ Lifestyle scenario reveals a future in which distance travelled by car is reduced by 74% by 2050 and final energy demand from transport is halved compared to the reference case. Despite the more rapid uptake of electric vehicles and the larger share of electricity in final energy demand, it shows a future where electricity decarbonisation could be delayed. The paper illustrates the key trade-off between the more aggressive pursuit of purely technological fixes and demand reduction in the transport sector and concludes there are strong arguments for pursuing both demand and supply side solutions in the pursuit of emissions reduction and energy security.

  12. A Series of Molecular Dynamics and Homology Modeling Computer Labs for an Undergraduate Molecular Modeling Course

    Science.gov (United States)

    Elmore, Donald E.; Guayasamin, Ryann C.; Kieffer, Madeleine E.

    2010-01-01

    As computational modeling plays an increasingly central role in biochemical research, it is important to provide students with exposure to common modeling methods in their undergraduate curriculum. This article describes a series of computer labs designed to introduce undergraduate students to energy minimization, molecular dynamics simulations,…

  13. Formal Modelling and Analysis of Socio-Technical Systems

    NARCIS (Netherlands)

    Probst, Christian W.; Kammüller, Florian; Rydhof Hansen, René; Probst, Christian W.; Hankin, Chris; Rydhof Hansen, René

    2015-01-01

    Attacks on systems and organisations increasingly exploit human actors, for example through social engineering. This non-technical aspect of attacks complicates their formal treatment and automatic identification. Formalisation of human behaviour is difficult at best, and attacks on socio-technical…

  14. Modelling the International Climate Change Negotiations: A Non-Technical Outline of Model Architecture

    Energy Technology Data Exchange (ETDEWEB)

    Underdal, Arild

    1997-12-31

    This report discusses in non-technical terms the overall architecture of a model that will be designed to enable the user to (1) explore systematically the political feasibility of alternative policy options and (2) to determine the set of politically feasible solutions in the global climate change negotiations. 25 refs., 2 figs., 1 tab.

  15. GROUP GUIDANCE SERVICES MANAGEMENT OF BEHAVIORAL TECHNIC HOMEWORK MODEL

    Directory of Open Access Journals (Sweden)

    Juhri A M.

    2013-09-01

    Full Text Available Abstract: This paper describes the management of group guidance services using the behavioral homework technique model (behavior technic homework). The ideas outlined here are intended to give counselors insight into managing group counseling services so that they are carried out effectively. The paper is expected to serve as a theoretical reference on the management of group guidance services, both for counselors and for students who need guidance services as they face various problems, learning difficulties, and obstacles to learning achievement. In general, the study aims to provide insight into the development of students' social skills, especially the ability to communicate with the other participants of the service (students). More specifically, it seeks to encourage the development of feelings, thoughts, perceptions, insights, and attitudes that support more creative and effective behaviour, improving students' verbal and non-verbal communication skills. Keyword: counselor, counseling, group, student

  16. Technical note: A noniterative approach to modelling moist thermodynamics

    Science.gov (United States)

    Moisseeva, Nadya; Stull, Roland

    2017-12-01

    Formulation of noniterative mathematical expressions for moist thermodynamics presents a challenge for both numerical and theoretical modellers. This technical note offers a simple and efficient tool for approximating two common thermodynamic relationships: temperature, T, at a given pressure, P, along a saturated adiabat, T(P, θw), as well as its corresponding inverse form θw(P, T), where θw is wet-bulb potential temperature. Our method allows direct calculation of T(P, θw) and θw(P, T) on a thermodynamic domain bounded by θw ≥ -70 °C and T ≥ -100 °C for pressures down to 1 kPa, respectively. The proposed parameterizations offer high accuracy (mean absolute errors of 0.016 and 0.002 °C for T(P, θw) and θw(P, T), respectively) on a notably larger thermodynamic region than previously studied. The paper includes a method summary and a ready-to-use tool to aid atmospheric physicists in their practical applications.

  17. Interpolation techniques used for data quality control and calculation of technical series: an example of a Central European daily time series

    Czech Academy of Sciences Publication Activity Database

    Štěpánek, P.; Zahradníček, P.; Huth, Radan

    2011-01-01

    Roč. 115, 1-2 (2011), s. 87-98 ISSN 0324-6329 R&D Projects: GA ČR GA205/08/1619 Institutional research plan: CEZ:AV0Z30420517 Keywords: data quality control * filling missing values * interpolation techniques * climatological time series Subject RIV: DG - Atmosphere Sciences, Meteorology Impact factor: 0.364, year: 2011 http://www.met.hu/en/ismeret-tar/kiadvanyok/idojaras/index.php?id=34

  18. Final Technical Report Advanced Solar Resource Modeling and Analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, Clifford [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-12-01

    The SunShot Initiative coordinates research, development, demonstration, and deployment activities aimed at dramatically reducing the total installed cost of solar power. The SunShot Initiative focuses on removing critical technical and non-technical barriers to installing and integrating solar energy into the electricity grid. Uncertainty in projected power and energy production from solar power systems contributes to these barriers by increasing financial risks to photovoltaic (PV) deployment and by exacerbating the technical challenges to integration of solar power on the electricity grid.

  19. TECHNICAL PRODUCT RISK ASSESSMENT: STANDARDS, INTEGRATION IN THE ERM MODEL AND UNCERTAINTY MODELING

    Directory of Open Access Journals (Sweden)

    Mirko Djapic

    2016-03-01

    Full Text Available Through the introduction of the New Approach to technical harmonization and standardization, the European Union achieved a breakthrough in the field of technical product safety and conformity assessment by integrating product safety requirements into the product development process. This is achieved by quantifying risk levels in order to determine the scope of the required safety measures and systems. The theory of probability is used as a tool for modeling uncertainty in the assessment of that risk. However, over the last forty years new mathematical theories have been developed that have proven better at modeling uncertainty when insufficient data about uncertain events are available, which is usually the case in product development. Bayesian networks, based on the modeling of subjective probability, and evidence networks, based on the Dempster-Shafer theory of belief functions, have proved to be excellent tools for modeling uncertainty when we do not have enough information about all aspects of events.

  20. Formal modelling and analysis of socio-technical systems

    DEFF Research Database (Denmark)

    Probst, Christian W.; Kammüller, Florian; Hansen, Rene Rydhof

    2016-01-01

    Attacks on systems and organisations increasingly exploit human actors, for example through social engineering. This non-technical aspect of attacks complicates their formal treatment and automatic identification. Formalisation of human behaviour is difficult at best, and attacks on socio-technical systems are still mostly identified through brainstorming of experts. In this work we discuss several approaches to formalising socio-technical systems and their analysis. Starting from a flow logic-based analysis of the insider threat, we discuss how to include the socio aspects explicitly, and show a formalisation that proves properties of this formalisation. On the formal side, our work closes the gap between formal and informal approaches to socio-technical systems. On the informal side, we show how to steal a birthday cake from a bakery by social engineering.

  1. Modelling Socio-Technical Aspects of Organisational Security

    NARCIS (Netherlands)

    Ivanova, Marieta G.

    2016-01-01

    Identification of threats to organisations and risk assessment often take into consideration the pure technical aspects, overlooking the vulnerabilities originating from attacks on a social level, for example social engineering, and abstracting away the physical infrastructure. However, attacks on…

  2. Convergence Properties of a Class of Probabilistic Adaptive Schemes Called Sequential Reproductive Plans. Psychology and Education Series, Technical Report No. 210.

    Science.gov (United States)

    Martin, Nancy

    Presented is a technical report concerning the use of a mathematical model describing certain aspects of the duplication and selection processes in natural genetic adaptation. This reproductive plan/model occurs in artificial genetics (the use of ideas from genetics to develop general problem solving techniques for computers). The reproductive…

  3. A Decision Model for Technical Journal Deselection with an Experiment in Biomedical Communications.

    Science.gov (United States)

    Triolo, Victor A; Bao, Dachun

    1993-01-01

    Proposes a model for technical journal deselection based on the Bradford law of distribution. The model's operational prescriptives, characteristics of the Bradford distribution in the field of pediatrics, and relevance to collection management are discussed. (42 references) (KRN)

  4. A Sandwich-Type Standard Error Estimator of SEM Models with Multivariate Time Series

    Science.gov (United States)

    Zhang, Guangjian; Chow, Sy-Miin; Ong, Anthony D.

    2011-01-01

    Structural equation models are increasingly used as a modeling tool for multivariate time series data in the social and behavioral sciences. Standard error estimators of SEM models, originally developed for independent data, require modifications to accommodate the fact that time series data are inherently dependent. In this article, we extend a…

  5. time series modeling of daily abandoned calls in a call centre

    African Journals Online (AJOL)

    DJFLEX

    Models for evaluating and predicting the short periodic time series in daily abandoned calls in a call center are developed. Abandonment of calls due to impatient is an identified problem among most call centers. The two competing models were derived using Fourier series and the Box and Jenkins modeling approaches.

  6. Time series modeling of daily abandoned calls in a call centre ...

    African Journals Online (AJOL)

    Models for evaluating and predicting the short periodic time series in daily abandoned calls in a call center are developed. Abandonment of calls due to impatient is an identified problem among most call centers. The two competing models were derived using Fourier series and the Box and Jenkins modeling approaches.

  7. A prediction method based on wavelet transform and multiple models fusion for chaotic time series

    International Nuclear Information System (INIS)

    Zhongda, Tian; Shujiang, Li; Yanhong, Wang; Yi, Sha

    2017-01-01

    In order to improve the prediction accuracy of chaotic time series, a prediction method based on wavelet transform and multiple-model fusion is proposed. The chaotic time series is decomposed and reconstructed by wavelet transform, yielding approximation components and detail components. According to the different characteristics of each component, a least squares support vector machine (LSSVM) is used as the predictive model for the approximation components, with an improved free search algorithm utilized to optimize the predictive model parameters, while an autoregressive integrated moving average (ARIMA) model is used as the predictive model for the detail components. The predictions of the multiple models are fused by the Gauss–Markov algorithm; the error variance of the fused predictions is lower than that of any single model, so the prediction accuracy is improved. The method is evaluated on two typical chaotic time series, the Lorenz and Mackey–Glass series, and the simulation results show that it achieves better predictions.
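The decompose-predict-fuse scheme described above can be sketched at its first stage, the wavelet decomposition. Below is a minimal one-level Haar transform with numpy; the paper's LSSVM and ARIMA component models and the Gauss–Markov fusion step are not reproduced here, so this is an illustrative sketch rather than the authors' method:

```python
import numpy as np

def haar_dwt(x):
    """One-level Haar wavelet transform: returns (approximation, detail)."""
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return approx, detail

def haar_idwt(approx, detail):
    """Inverse one-level Haar transform (perfect reconstruction)."""
    x = np.empty(2 * len(approx))
    x[0::2] = (approx + detail) / np.sqrt(2.0)
    x[1::2] = (approx - detail) / np.sqrt(2.0)
    return x

# Decompose a noisy series into a smooth approximation (which the paper
# models with LSSVM) and a rough detail component (modelled with ARIMA),
# then verify the decomposition is lossless.
rng = np.random.default_rng(0)
x = np.sin(np.linspace(0, 20, 64)) + 0.1 * rng.standard_normal(64)
a, d = haar_dwt(x)
print(np.allclose(haar_idwt(a, d), x))  # True
```

Because the transform is orthogonal, component-wise forecasts can be recombined with the inverse transform without introducing reconstruction error.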

  8. a model for nonlinear innovation in time series

    African Journals Online (AJOL)

    DJFLEX

    Heteroscedastic errors are common in financial and econometric time series. The conditional variance may be specified as nonlinear autoregressive conditional heteroscedasticity (ARCH)…
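Conditional heteroscedasticity of the kind mentioned above is easiest to see in a simulated ARCH(1) process, whose unconditional variance is known in closed form. A sketch with invented parameters, not the paper's model:

```python
import math
import random

def simulate_arch1(omega, alpha, n, seed=7):
    """ARCH(1) innovations: x_t = sigma_t * e_t with
    sigma_t^2 = omega + alpha * x_{t-1}^2 and e_t ~ N(0, 1).
    For 0 <= alpha < 1 the unconditional variance is omega / (1 - alpha)."""
    random.seed(seed)
    x, prev = [], 0.0
    for _ in range(n):
        sigma = math.sqrt(omega + alpha * prev * prev)
        prev = sigma * random.gauss(0.0, 1.0)
        x.append(prev)
    return x

# Sample variance should match omega / (1 - alpha) = 0.5 / 0.7 ≈ 0.714.
x = simulate_arch1(0.5, 0.3, 100_000)
var = sum(v * v for v in x) / len(x)
print(abs(var - 0.5 / (1 - 0.3)) < 0.05)  # True
```

The volatility clustering visible in such a series is exactly the "nonlinear innovation" structure that ARCH-type conditional variance specifications capture.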

  9. Carbon footprint estimator, phase II : volume I - GASCAP model & volume II - technical appendices [technical brief].

    Science.gov (United States)

    2014-03-01

    This study resulted in the development of the GASCAP model (the Greenhouse Gas Assessment : Spreadsheet for Transportation Capital Projects). This spreadsheet model provides a user-friendly interface for determining the greenhouse gas (GHG) emissions...

  10. Modelling Socio-Technical Aspects of Organisational Security

    DEFF Research Database (Denmark)

    Ivanova, Marieta Georgieva

    Identification of threats to organisations and risk assessment often take into consideration the pure technical aspects, overlooking the vulnerabilities originating from attacks on a social level, for example social engineering, and abstracting away the physical infrastructure. However, attacks on … We validate our approach using scenarios from IPTV and Cloud Infrastructure case studies.

  11. Memos with Personality: A Model from British Technical Colleges.

    Science.gov (United States)

    Yee, Carole

    1986-01-01

    Notes that while American technical writing texts stress brevity and directness as important characteristics of business correspondence, British texts stress qualities of personality and courtesy, especially in the memo. Shows how to incorporate personality into correspondence, thereby building cooperation among colleagues. (FL)

  12. Competence Model and Modern Trends of Development of the Russian Institute of Technical Customer

    Directory of Open Access Journals (Sweden)

    Mishlanova Marina

    2017-01-01

    Full Text Available The article considers the present-day role and development of the technical customer as the management actor in investment and construction projects. Urgent problems in establishing the institute of the technical customer are identified. An elementary competence model is presented: the base competences of the technical customer, a model of the primary competences, and an example of the operational level of the model. The development of the institute of the technical customer is analysed in terms of compliance with the current realities of investment and construction activities, improvement of contractual relations, compliance with international standards, state participation, and the creation of a single technical customer. The need to develop competence models as justification for professional standards is assessed. The possibility of modelling the competences and functions of the technical customer following the FIDIC model is shown, as is the possibility of using the competence model at the construction stage under public-private partnership. The results indicate directions for further research.

  13. Evolution of Black-Box Models Based on Volterra Series

    Directory of Open Access Journals (Sweden)

    Daniel D. Silveira

    2015-01-01

    Full Text Available This paper presents a historical review of the many behavioral models used to model radio frequency power amplifiers and proposes a new classification of these behavioral models. It also discusses the evolution of these models, from a single polynomial to multirate Volterra models, presenting equations and estimation methods. New trends in RF power amplifier behavioral modeling are suggested.

  14. Modeling Interdependent Socio-technical Networks via ABM Smart Grid Case

    NARCIS (Netherlands)

    Worm, D.T.H.; Langley, D.J.; Becker, J.M.

    2013-01-01

    The objective of this paper is to improve scientific modeling of interdependent socio-technical networks. In these networks the interplay between technical or infrastructural elements on the one hand and social and behavioral aspects on the other hand, is of importance. Examples include electricity

  15. Quality Concerns in Technical Education in India: A Quantifiable Quality Enabled Model

    Science.gov (United States)

    Gambhir, Victor; Wadhwa, N. C.; Grover, Sandeep

    2016-01-01

    Purpose: The paper aims to discuss current Technical Education scenarios in India. It proposes modelling the factors affecting quality in a technical institute and then applying a suitable technique for assessment, comparison and ranking. Design/methodology/approach: The paper chose graph theoretic approach for quantification of quality-enabled…

  16. Learning restricted Boolean network model by time-series data.

    Science.gov (United States)

    Ouyang, Hongjia; Fang, Jie; Shen, Liangzhong; Dougherty, Edward R; Liu, Wenbin

    2014-01-01

    Restricted Boolean networks are simplified Boolean networks that are required for either negative or positive regulations between genes. Higa et al. (BMC Proc 5:S5, 2011) proposed a three-rule algorithm to infer a restricted Boolean network from time-series data. However, the algorithm suffers from a major drawback, namely, it is very sensitive to noise. In this paper, we systematically analyze the regulatory relationships between genes based on the state switch of the target gene and propose an algorithm with which restricted Boolean networks may be inferred from time-series data. We compare the proposed algorithm with the three-rule algorithm and the best-fit algorithm based on both synthetic networks and a well-studied budding yeast cell cycle network. The performance of the algorithms is evaluated by three distance metrics: the normalized-edge Hamming distance [Formula: see text], the normalized Hamming distance of state transition [Formula: see text], and the steady-state distribution distance μ (ssd). Results show that the proposed algorithm outperforms the others according to both [Formula: see text] and [Formula: see text], whereas its performance according to μ (ssd) is intermediate between best-fit and the three-rule algorithms. Thus, our new algorithm is more appropriate for inferring interactions between genes from time-series data.
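The state-transition distance metrics used above can be illustrated with a toy restricted Boolean network. This sketch assumes the common threshold update rule (active activators versus active inhibitors, with ties keeping the current value); the two-gene networks are invented examples, not the budding yeast cell cycle network from the paper:

```python
from itertools import product

def next_state(state, activators, inhibitors):
    """Threshold rule common in restricted Boolean network models: a gene
    switches on if its active activators outnumber its active inhibitors,
    switches off in the opposite case, and keeps its value on a tie."""
    new = []
    for g in range(len(state)):
        act = sum(state[i] for i in activators[g])
        inh = sum(state[i] for i in inhibitors[g])
        new.append(1 if act > inh else 0 if inh > act else state[g])
    return tuple(new)

def hamming_transition_distance(net_a, net_b, n):
    """Normalized Hamming distance of state transitions: the fraction of
    gene updates on which two networks disagree, over all 2^n states."""
    diff = 0
    for s in product((0, 1), repeat=n):
        diff += sum(x != y for x, y in zip(next_state(s, *net_a),
                                           next_state(s, *net_b)))
    return diff / (n * 2 ** n)

# A two-gene mutual-inhibition network compared with itself: distance 0.
toggle = ([[], []], [[1], [0]])   # (activators, inhibitors) per gene
print(hamming_transition_distance(toggle, toggle, 2))  # 0.0
```

An inferred network can be scored against the ground truth in the same way, which is how metrics of this kind are used to compare inference algorithms.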

  17. Technical know-how for modeling of geological environment. (1) Overview and groundwater flow modeling

    International Nuclear Information System (INIS)

    Saegusa, Hiromitsu; Takeuchi, Shinji; Maekawa, Keisuke; Osawa, Hideaki; Semba, Takeshi

    2011-01-01

    It is important for site characterization projects to manage the decision-making process with transparency and traceability and to transfer the technical know-how accumulated during the research and development to the implementing phase and to future generations. The modeling for a geological environment is to be used to synthesize investigation results. Evaluation of the impact of uncertainties in the model is important to identify and prioritize key issues for further investigations. Therefore, a plan for site characterization should be made based on the results of the modeling. The aim of this study is to support for the planning of initial surface-based site characterization based on the technical know-how accumulated from the Mizunami Underground Research Laboratory Project and the Horonobe Underground Research Laboratory Project. These projects are broad scientific studies of the deep geological environment that are a basis for research and development for the geological disposal of high-level radioactive wastes. In this study, the work-flow of the groundwater flow modeling, which is one of the geological environment models, and is to be used for setting the area for the geological environment modeling and for groundwater flow characterization, and the related decision-making process using literature data have been summarized. (author)

  18. Modeling technical efficiency of inshore fishery using data envelopment analysis

    Science.gov (United States)

    Rahman, Rahayu; Zahid, Zalina; Khairi, Siti Shaliza Mohd; Hussin, Siti Aida Sheikh

    2016-10-01

    The fishery industry contributes significantly to the economy of Malaysia. This study applied Data Envelopment Analysis to estimate the technical efficiency of the fishery in Terengganu, a state on the eastern coast of Peninsular Malaysia, based on two outputs, total fish landing and income of fishermen, and six inputs: engine power, vessel size, number of trips, number of workers, cost, and operation distance. The data were collected in a survey conducted between November and December 2014; the decision making units (DMUs) were 100 fishermen from 10 fishery areas. The results showed that the mean technical efficiency in Season I (dry season) and Season II (rainy season) was 90.2% and 66.7%, respectively. About 27% of the fishermen were rated efficient during Season I, while only 13% achieved full (100%) efficiency during Season II. The results also revealed a significant difference in efficiency performance between the fishery areas.
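DEA efficiency scores of the kind reported above come from solving a linear program per decision making unit. In the degenerate single-input, single-output case, however, the input-oriented CCR score reduces to a scaled output/input ratio, which is easy to sketch. The catch and engine-power figures below are invented; the study's actual six-input, two-output model requires a full LP solver:

```python
def ccr_efficiency(outputs, inputs):
    """Input-oriented CCR technical efficiency in the single-input,
    single-output case: each DMU's output/input ratio, scaled so the
    best-practice unit scores 1.0."""
    ratios = [o / i for o, i in zip(outputs, inputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Hypothetical catch (tonnes) and engine power (hp) for four vessels.
catch = [120.0, 90.0, 200.0, 60.0]
power = [100.0, 60.0, 160.0, 80.0]
print(ccr_efficiency(catch, power))  # vessel 2 defines the frontier
```

A score of 1.0 means the unit lies on the efficient frontier; lower scores give the proportional input reduction needed to reach it.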

  19. Task technical plan: DWPF air permit/dispersion modeling

    International Nuclear Information System (INIS)

    Lambert, D.P.

    1993-01-01

    This Task Technical Plan summarizes work required to project the benzene emissions from the Late Wash Facility (LWF) as well as update the benzene, mercury, and NO x emissions from the remainder of the Defense Waste Processing Facility (DWPF). These calculations will reflect (1) the addition of the LWF and (2) the replacement of formic acid with nitric acid in the melter preparation process. The completed calculations will be used to assist DWPF in applying for the LWF Air Quality Permit

  20. Fitting ARMA Time Series by Structural Equation Models.

    Science.gov (United States)

    van Buuren, Stef

    1997-01-01

    This paper outlines how the stationary ARMA (p,q) model (G. Box and G. Jenkins, 1976) can be specified as a structural equation model. Maximum likelihood estimates for the parameters in the ARMA model can be obtained by software for fitting structural equation models. The method is applied to three problem types. (SLD)
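The record above concerns specifying ARMA models as structural equation models; as background, the ARMA(1,1) process itself and its known lag-1 autocorrelation can be checked directly by simulation. This numpy sketch illustrates the process being fitted, not the SEM specification that is the paper's contribution:

```python
import numpy as np

def simulate_arma11(phi, theta, n, seed=0):
    """Simulate x_t = phi*x_{t-1} + e_t + theta*e_{t-1}, e_t ~ N(0, 1)."""
    rng = np.random.default_rng(seed)
    e = rng.standard_normal(n + 1)
    x = np.zeros(n)
    x[0] = e[1] + theta * e[0]
    for t in range(1, n):
        x[t] = phi * x[t - 1] + e[t + 1] + theta * e[t]
    return x

def arma11_rho1(phi, theta):
    """Theoretical lag-1 autocorrelation of a stationary ARMA(1,1)."""
    return (1 + phi * theta) * (phi + theta) / (1 + 2 * phi * theta + theta ** 2)

# The sample lag-1 autocorrelation should match the closed form.
x = simulate_arma11(0.5, 0.3, 200_000)
xc = x - x.mean()
rho1 = (xc[1:] * xc[:-1]).sum() / (xc ** 2).sum()
print(abs(rho1 - arma11_rho1(0.5, 0.3)) < 0.02)  # True
```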

  1. Modeling Large Time Series for Efficient Approximate Query Processing

    DEFF Research Database (Denmark)

    Perera, Kasun S; Hahmann, Martin; Lehner, Wolfgang

    2015-01-01

    -wise aggregation to derive the models. These models are initially created from the original data and are kept in the database along with it. Subsequent queries are answered using the stored models rather than scanning and processing the original datasets. In order to support model query processing, we maintain...

  2. Time Series Modelling of Syphilis Incidence in China from 2005 to 2012

    Science.gov (United States)

    Zhang, Xingyu; Zhang, Tao; Pei, Jiao; Liu, Yuanyuan; Li, Xiaosong; Medrano-Gracia, Pau

    2016-01-01

    Background The infection rate of syphilis in China has increased dramatically in recent decades, becoming a serious public health concern. Early prediction of syphilis is therefore of great importance for heath planning and management. Methods In this paper, we analyzed surveillance time series data for primary, secondary, tertiary, congenital and latent syphilis in mainland China from 2005 to 2012. Seasonality and long-term trend were explored with decomposition methods. Autoregressive integrated moving average (ARIMA) was used to fit a univariate time series model of syphilis incidence. A separate multi-variable time series for each syphilis type was also tested using an autoregressive integrated moving average model with exogenous variables (ARIMAX). Results The syphilis incidence rates have increased three-fold from 2005 to 2012. All syphilis time series showed strong seasonality and increasing long-term trend. Both ARIMA and ARIMAX models fitted and estimated syphilis incidence well. All univariate time series showed highest goodness-of-fit results with the ARIMA(0,0,1)×(0,1,1) model. Conclusion Time series analysis was an effective tool for modelling the historical and future incidence of syphilis in China. The ARIMAX model showed superior performance than the ARIMA model for the modelling of syphilis incidence. Time series correlations existed between the models for primary, secondary, tertiary, congenital and latent syphilis. PMID:26901682
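The seasonal ARIMA notation above can be unpacked with a toy example: ARIMA(0,0,1)×(0,1,1) applies one seasonal difference (presumably at lag 12 for monthly surveillance counts) before the moving-average terms are fitted. A minimal sketch of the differencing step in plain Python, using invented monthly counts rather than the paper's syphilis data:

```python
def difference(series, lag=1):
    """Apply the differencing operator (1 - B^lag): y_t - y_{t-lag}."""
    return [series[t] - series[t - lag] for t in range(lag, len(series))]

# Two years of monthly counts with a stable yearly pattern plus a +3 shift
# in year two: the seasonal difference removes the pattern entirely.
monthly = [10, 12, 15, 11, 10, 12, 15, 11, 10, 12, 15, 11,
           13, 15, 18, 14, 13, 15, 18, 14, 13, 15, 18, 14]
print(difference(monthly, lag=12))  # [3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3]
```

What remains after seasonal differencing (here a constant trend increment) is what the nonseasonal and seasonal MA(1) terms then model.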

  3. Extracting the relevant delays in time series modelling

    DEFF Research Database (Denmark)

    Goutte, Cyril

    1997-01-01

    In this contribution, we suggest a convenient way to use generalisation error to extract the relevant delays from a time-varying process, i.e. the delays that lead to the best prediction performance. We design a generalisation-based algorithm that takes its inspiration from traditional variable selection, and more precisely stepwise forward selection. The method is compared to other forward selection schemes, as well as to a nonparametric test aimed at estimating the embedding dimension of time series. The final application extends these results to the efficient estimation of FIR filters on some…

  4. A reference model and technical framework for mobile social software for learning

    NARCIS (Netherlands)

    De Jong, Tim; Specht, Marcus; Koper, Rob

    2008-01-01

    De Jong, T., Specht, M., & Koper, R. (2008). A reference model and technical framework for mobile social software for learning. Presented at the IADIS m-learning 2008 Conference. April, 11-13, 2008, Carvoeiro, Portugal.

  5. Modeling refractive metasurfaces in series as a single metasurface

    Science.gov (United States)

    Toor, Fatima; Guneratne, Ananda C.

    2016-03-01

    Metasurfaces are boundaries between two media that are engineered to induce an abrupt phase shift in propagating light over a distance comparable to the wavelength of the light. Metasurface applications exploit this rapid phase shift to allow for precise control of wavefronts. The phase gradient is used to compute the angle at which light is refracted using the generalized Snell's Law. [1] In practice, refractive metasurfaces are designed using a relatively small number of phase-shifting elements such that the phase gradient is discrete rather than continuous. Designing such a metasurface requires finding phase-shifting elements that cover a full range of phases (a phase range) from 0 to 360 degrees. We demonstrate an analytical technique to calculate the refraction angle due to multiple metasurfaces arranged in series without needing to account for the effect of each individual metasurface. The phase gradients of refractive metasurfaces in series may be summed to obtain the phase gradient of a single equivalent refractive metasurface. This result is relevant to any application that requires a system with multiple metasurfaces, such as biomedical imaging [2], wavefront correctors [3], and beam shaping [4].
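The claim above, that the phase gradients of metasurfaces in series sum to that of a single equivalent surface, follows directly from the generalized Snell's law. A sketch in which the wavelength and gradient values are illustrative assumptions:

```python
import math

def refracted_angle(theta_i, n_i, n_t, dphi_dx, lam0):
    """Generalized Snell's law for a phase-gradient metasurface:
    n_t*sin(theta_t) = n_i*sin(theta_i) + (lam0 / (2*pi)) * dphi/dx.
    With dphi/dx = 0 this reduces to the ordinary Snell's law."""
    s = (n_i * math.sin(theta_i) + lam0 * dphi_dx / (2 * math.pi)) / n_t
    if abs(s) > 1.0:
        raise ValueError("evanescent: no propagating refracted beam")
    return math.asin(s)

# Two metasurfaces in series act like one surface with the summed gradient.
lam0 = 633e-9            # wavelength in metres (assumed)
g1, g2 = 2.0e6, 1.5e6    # individual phase gradients in rad/m (assumed)
a_series = refracted_angle(0.2, 1.0, 1.0, g1 + g2, lam0)
print(a_series)
```

Since each gradient enters the law additively, stacking surfaces only shifts the tangential wavevector by the sum of the individual contributions.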

  6. On the Practice of Bayesian Inference in Basic Economic Time Series Models using Gibbs Sampling

    NARCIS (Netherlands)

    M.D. de Pooter (Michiel); R. Segers (René); H.K. van Dijk (Herman)

    2006-01-01

    Several lessons learned from a Bayesian analysis of basic economic time series models by means of the Gibbs sampling algorithm are presented. Models include the Cochrane-Orcutt model for serial correlation, the Koyck distributed lag model, the Unit Root model, the Instrumental Variables…

  7. A generalized exponential time series regression model for electricity prices

    DEFF Research Database (Denmark)

    Haldrup, Niels; Knapik, Oskar; Proietti, Tomasso

    …Based on the estimated model, the best linear predictor is constructed. Our modeling approach provides a good fit within sample and outperforms competing benchmark predictors in terms of forecasting accuracy. We also find that building separate models for each hour of the day and averaging the forecasts is a better…

  8. SERI Wind Energy Program

    Energy Technology Data Exchange (ETDEWEB)

    Noun, R. J.

    1983-06-01

    The SERI Wind Energy Program manages the areas of innovative research, wind systems analysis, and environmental compatibility for the U.S. Department of Energy. Since 1978, SERI wind program staff have conducted in-house aerodynamic and engineering analyses of novel concepts for wind energy conversion and have managed over 20 subcontracts to determine technical feasibility; the most promising of these concepts is the passive blade cyclic pitch control project. In the area of systems analysis, the SERI program has analyzed the impact of intermittent generation on the reliability of electric utility systems using standard utility planning models. SERI has also conducted methodology assessments. Environmental issues related to television interference and acoustic noise from large wind turbines have been addressed. SERI has identified the causes, effects, and potential control of acoustic noise emissions from large wind turbines.

  9. Models for waste life cycle assessment: Review of technical assumptions

    DEFF Research Database (Denmark)

    Gentil, Emmanuel; Damgaard, Anders; Hauschild, Michael Zwicky

    2010-01-01

    …waste LCA models. This review infers that some of the differences between waste LCA models are inherent to the time at which they were developed. It is expected that models developed later benefit from past modelling assumptions, knowledge, and issues. Models developed in different countries furthermore rely…

  10. Development of knowledge models by linguistic analysis of lexical relationships in technical documents

    International Nuclear Information System (INIS)

    Seguela, Patrick

    2001-01-01

    This research thesis addresses the problem of knowledge acquisition and structuring from technical texts, and the use of this knowledge in the development of models. The author presents the Cameleon method, which aims at extracting binary lexical relationships from technical texts by identifying linguistic markers. The relevance of this method is assessed on four different corpora: a written technical corpus, an oral technical corpus, a corpus of instructional texts, and a corpus of academic texts. The author reports the development of a model representing the knowledge of a specific field by means of lexical relationships. The method is then applied to develop a model used for document search within a knowledge management system [fr

  11. 78 FR 76248 - Special Conditions: Airbus, Model A350-900 Series Airplane; Side Stick Controller

    Science.gov (United States)

    2013-12-17

    ..., Model A350-900 Series Airplane; Side Stick Controller AGENCY: Federal Aviation Administration (FAA), DOT... associated with side stick controllers which require limited pilot force because they are operated by only... A350-900 series airplane is equipped with two side stick controllers instead of the conventional...

  12. Modeling BAS Dysregulation in Bipolar Disorder : Illustrating the Potential of Time Series Analysis

    NARCIS (Netherlands)

    Hamaker, Ellen L.; Grasman, Raoul P P P; Kamphuis, Jan Henk

    2016-01-01

    Time series analysis is a technique that can be used to analyze the data from a single subject and has great potential to investigate clinically relevant processes like affect regulation. This article uses time series models to investigate the assumed dysregulation of affect that is associated with

  13. Mixed Portmanteau Test for Diagnostic Checking of Time Series Models

    Directory of Open Access Journals (Sweden)

    Sohail Chand

    2014-01-01

    Full Text Available Model criticism is an important stage of model building, and goodness-of-fit tests provide a set of tools for diagnostic checking of the fitted model. Several such tests are suggested in the literature. These tests use the autocorrelations or partial autocorrelations of the residuals to assess the adequacy of the fitted model; the main idea underlying these portmanteau tests is to identify any dependence structure that is still unexplained by the fitted model. In this paper, we suggest mixed portmanteau tests based on the autocorrelation and partial autocorrelation functions of the residuals. We derive the asymptotic distribution of the mixture test and study its size and power using Monte Carlo simulations.
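Portmanteau diagnostics of the kind discussed above test whether the residual autocorrelations are jointly negligible. A minimal sketch of the classical Ljung-Box statistic in plain Python; the paper's mixed test also incorporates partial autocorrelations, which are omitted here:

```python
import random

def acf(x, k):
    """Sample autocorrelation of x at lag k."""
    n = len(x)
    m = sum(x) / n
    denom = sum((v - m) ** 2 for v in x)
    return sum((x[t] - m) * (x[t - k] - m) for t in range(k, n)) / denom

def ljung_box(residuals, h):
    """Ljung-Box portmanteau statistic
    Q = n(n+2) * sum_{k=1..h} r_k^2 / (n-k); under an adequate ARMA(p,q)
    fit, Q is asymptotically chi-square with h - p - q degrees of freedom."""
    n = len(residuals)
    return n * (n + 2) * sum(acf(residuals, k) ** 2 / (n - k)
                             for k in range(1, h + 1))

# White noise should give a small Q; a random walk, whose dependence is
# left entirely unexplained, should give a much larger one.
random.seed(42)
noise = [random.gauss(0, 1) for _ in range(500)]
walk = [sum(noise[:t + 1]) for t in range(500)]
print(ljung_box(noise, 10) < ljung_box(walk, 10))  # True
```

Large values of Q relative to the chi-square reference reject the adequacy of the fitted model, which is the decision rule shared by all portmanteau variants.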

  14. SEM Based CARMA Time Series Modeling for Arbitrary N.

    Science.gov (United States)

    Oud, Johan H L; Voelkle, Manuel C; Driver, Charles C

    2018-01-01

    This article explains in detail the state space specification and estimation of first and higher-order autoregressive moving-average models in continuous time (CARMA) in an extended structural equation modeling (SEM) context for N = 1 as well as N > 1. To illustrate the approach, simulations will be presented in which a single panel model (T = 41 time points) is estimated for a sample of N = 1,000 individuals as well as for samples of N = 100 and N = 50 individuals, followed by estimating 100 separate models for each of the one-hundred N = 1 cases in the N = 100 sample. Furthermore, we will demonstrate how to test the difference between the full panel model and each N = 1 model by means of a subject-group-reproducibility test. Finally, the proposed analyses will be applied in an empirical example, in which the relationships between mood at work and mood at home are studied in a sample of N = 55 women. All analyses are carried out by ctsem, an R-package for continuous time modeling, interfacing to OpenMx.
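    The continuous-discrete link that CARMA modeling exploits can be illustrated for the first-order case. This Python sketch (not ctsem, whose R interface the abstract describes) uses the exact discretization of a CAR(1) process, in which the implied discrete-time AR(1) coefficient is exp(a·dt); the drift and diffusion values are assumptions for the example:

    ```python
    import numpy as np

    # Exact discretization of the CAR(1) process dx = a*x dt + s dW (a < 0):
    # over a step dt, x(t+dt) = exp(a*dt) * x(t) + e,
    # with Var(e) = s**2 * (exp(2*a*dt) - 1) / (2*a).
    def simulate_car1(a, s, dt, n, rng):
        phi = np.exp(a * dt)                   # implied discrete-time AR(1) coefficient
        var_e = s**2 * np.expm1(2.0 * a * dt) / (2.0 * a)
        x = np.zeros(n)
        for t in range(1, n):
            x[t] = phi * x[t - 1] + rng.normal(scale=np.sqrt(var_e))
        return x, phi

    rng = np.random.default_rng(1)
    x, phi = simulate_car1(a=-0.5, s=1.0, dt=0.1, n=20000, rng=rng)
    r1 = np.corrcoef(x[:-1], x[1:])[0, 1]      # recovers exp(a*dt) empirically
    ```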

  15. Danish wind turbines: Technical-economic feasibility of commercial models

    International Nuclear Information System (INIS)

    Falchetta, M.

    1992-01-01

    This feasibility study examines the principal technical and economic (investment, manufacturing, installation and operation unit costs; supply and demand) characteristics of wind turbines being commercialized in Denmark. The general configuration of the 150 to 450 kW range machines currently being manufactured can be described as that of a three-bladed fibreglass rotor, of from 24 to 35 meters in diameter, mounted on a tower of from 29 to 41 meters in height. The electrical system consists of one asynchronous generator or a two-generator system with a power ratio of 1 to 5 between the two generators. The cost analysis reveals that the Danish wind turbines are competitively priced, with per-kWh costs varying from $0.0675 to $0.040 for operating wind speeds ranging from 5 to 7 m/sec, and that their overall design and performance characteristics make them suitable for Italian site conditions.

  16. A case series on the technical use of three-dimensional image guidance in subaxial anterior cervical surgery.

    Science.gov (United States)

    Pirris, Stephen M; Nottmeier, Eric W

    2015-03-01

    Three dimensional (3D) image guidance has been used to improve the safety of complex spine surgeries, but its use has been limited in anterior cervical spine approaches. Twenty-two patients underwent complex anterior cervical spine surgeries in which 3D image guidance provided intraoperative assistance with the dissection, decompression and implant placement. One of two paired systems, the BrainLAB (BrainLAB, Westchester, Illinois) system, or Stealth (Medtronic Inc., Littleton, Massachusetts) system was used for 3D image guidance in this study. Image guidance was able to reliably locate pertinent anatomical structures in complex anterior cervical spine surgery involving re-exploration, dissection around vertebral arteries or deformity correction. No complications occurred, and no patients required a revision anterior surgery. This technical note describes the setup and technique for the use of cone beam computed tomography (cbCT)-based, 3D image guidance in subaxial anterior cervical surgery. The authors have found this technique to be a useful adjunct in revision anterior cervical procedures, as well as anterior cervical procedures involving corpectomy or tumor removal. Copyright © 2014 John Wiley & Sons, Ltd.

  17. Modelling fourier regression for time series data- a case study: modelling inflation in foods sector in Indonesia

    Science.gov (United States)

    Prahutama, Alan; Suparti; Wahyu Utami, Tiani

    2018-03-01

    Regression analysis models the relationship between response variables and predictor variables. The parametric approach to regression imposes strict assumptions on the model, whereas a nonparametric regression model does not require them. Time series data are observations of a variable recorded over time, so to model a time series by regression we must first determine the response and predictor variables: the response variable is the value at time t (yt), while the predictor variables are the significant lags. In nonparametric regression modelling, one developing approach is the Fourier series approach; among its advantages is the ability to handle data with a trigonometric pattern. Modelling with a Fourier series requires a parameter K, and the number of terms K can be determined with the Generalized Cross Validation method. In inflation modelling for the transportation, communication and financial services sector, the Fourier series yields an optimal K of 120 parameters with an R-square of 99%, whereas multiple linear regression yields an R-square of 90%.
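    A minimal sketch of the approach described above: a Fourier basis fit by least squares, with the number of harmonics K chosen by Generalized Cross Validation. The monthly toy series and candidate K values are assumptions, far smaller than the paper's K = 120:

    ```python
    import numpy as np

    def fourier_design(t, K, period):
        """Design matrix [1, cos(k w t), sin(k w t)] for k = 1..K."""
        w = 2 * np.pi / period
        cols = [np.ones_like(t)]
        for k in range(1, K + 1):
            cols += [np.cos(k * w * t), np.sin(k * w * t)]
        return np.column_stack(cols)

    def gcv(y, X):
        """Generalized cross-validation score of a least-squares fit."""
        n, p = X.shape
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        rss = np.sum((y - X @ beta) ** 2)
        return (rss / n) / (1 - p / n) ** 2

    rng = np.random.default_rng(2)
    t = np.arange(240, dtype=float)            # 20 years of monthly data
    y = 2.0 + 1.5 * np.sin(2 * np.pi * t / 12) + 0.3 * rng.standard_normal(240)

    scores = {K: gcv(y, fourier_design(t, K, period=12)) for K in (1, 2, 3, 5, 8)}
    best_K = min(scores, key=scores.get)       # K with the smallest GCV score
    X = fourier_design(t, best_K, period=12)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    r2 = 1 - np.sum((y - X @ beta) ** 2) / np.sum((y - y.mean()) ** 2)
    ```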

  18. Applying Time Series Analysis Model to Temperature Data in Greenhouses

    Directory of Open Access Journals (Sweden)

    Abdelhafid Hasni

    2011-03-01

    The objective of the research is to find an appropriate Seasonal Auto-Regressive Integrated Moving Average (SARIMA) model for fitting the inside air temperature (Tin) of a naturally ventilated greenhouse under Mediterranean conditions by considering the minimum of the Akaike Information Criterion (AIC). The results of fitting were as follows: the best SARIMA model for fitting the air temperature of the greenhouse is SARIMA(1,0,0)(1,0,2)24.
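    Order selection by minimum AIC, as used above, can be sketched with a plain autoregression standing in for the paper's seasonal SARIMA machinery (the simulated series and candidate orders are assumptions):

    ```python
    import numpy as np

    def fit_ar(x, p):
        """Least-squares AR(p) fit with intercept; returns the Gaussian AIC."""
        n = len(x)
        X = np.column_stack([np.ones(n - p)] + [x[p - j: n - j] for j in range(1, p + 1)])
        y = x[p:]
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        sigma2 = np.mean((y - X @ beta) ** 2)
        k = p + 2                      # AR coefficients + intercept + innovation variance
        loglik = -0.5 * (n - p) * (np.log(2 * np.pi * sigma2) + 1.0)
        return 2 * k - 2 * loglik

    # Simulated AR(2) series: minimum AIC should reject the underfitted p = 1
    rng = np.random.default_rng(11)
    m = 2000
    x = np.zeros(m)
    e = rng.standard_normal(m)
    for t in range(2, m):
        x[t] = 0.6 * x[t - 1] - 0.3 * x[t - 2] + e[t]

    aics = {p: fit_ar(x, p) for p in range(1, 7)}
    best_p = min(aics, key=aics.get)
    ```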

  19. MODELLING INTERNATIONAL OILSEED PRICES: AN APPLICATION OF THE STRUCTURAL TIME SERIES MODEL

    Directory of Open Access Journals (Sweden)

    Jaweriah Hazrana

    2017-04-01

    The fundamentals characterizing agricultural commodity prices have often been debated in research and policy circles. Building on limitations in the existing literature, the present study conducts an integrated test and empirically analyses the international price of palm and soybean oil from 1960(1) to 2016(8). For this purpose the univariate Structural Time Series Model based on the state space framework is applied. This approach allows the flexibility to model complex stochastic movements, seasonality and cyclical patterns, and to incorporate intervention analysis. Estimation is based on the Maximum Likelihood method via the Kalman Filter. The results establish that both series exhibit a stochastic long-term trend punctuated by multiple breaks. The findings also uncover the presence of cyclicality, which results in price swings of varying duration and amplitude. The model works well as a description of oilseed prices and improves awareness of their separate structural components. These are fundamental to design country- and commodity-specific policy strategies and respond to volatile market conditions. The results underscore that, contrary to previous price spikes, most of the drivers of the mid-2000s price spikes are structural and on the demand side. These new drivers in oilseed markets suggest the possibility of fundamental change in price behaviour with longer-lasting effects.
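    A minimal sketch of the state space/Kalman filter machinery the study relies on, reduced to the local level model (stochastic trend plus noise); the full structural model with seasonal, cyclical and intervention components is far richer, and all values below are assumptions:

    ```python
    import numpy as np

    def local_level_filter(y, q, r):
        """Kalman filter for the local level model
           y_t = mu_t + eps_t (Var r),  mu_t = mu_{t-1} + eta_t (Var q)."""
        mu, P = y[0], 1e7                 # near-diffuse initialization
        level = np.empty(len(y))
        for t, yt in enumerate(y):
            P += q                        # time update of the state variance
            F = P + r                     # one-step prediction variance
            K = P / F                     # Kalman gain
            mu += K * (yt - mu)           # measurement update
            P *= 1.0 - K
            level[t] = mu
        return level

    rng = np.random.default_rng(8)
    n = 500
    trend = 10.0 + np.cumsum(rng.normal(0.0, 0.1, n))   # stochastic long-term trend
    y = trend + rng.normal(0.0, 1.0, n)                 # noisy observed price

    level = local_level_filter(y, q=0.01, r=1.0)
    rmse_raw = np.sqrt(np.mean((y - trend) ** 2))       # error of the raw series
    rmse_filt = np.sqrt(np.mean((level - trend) ** 2))  # error of the filtered level
    ```

    The filtered level tracks the unobserved stochastic trend much more closely than the raw observations do.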

  20. Energy flow models for the estimation of technical losses in distribution network

    International Nuclear Information System (INIS)

    Au, Mau Teng; Tan, Chin Hooi

    2013-01-01

    This paper presents energy flow models developed to estimate technical losses in a distribution network. The energy flow models applied in this paper are based on the input energy and peak demand of the distribution network, feeder length and peak demand, transformer loading capacity, and load factor. Two case studies, an urban distribution network and a rural distribution network, illustrate the application of the energy flow models. The technical losses obtained for the two distribution networks are consistent and comparable to networks of similar type and characteristics. Hence, the energy flow models are suitable for practical application.

  1. A MODEL FOR INTEGRATED SOFTWARE TO IMPROVE COMMUNICATION POLICY IN DENTAL TECHNICAL LABS

    Directory of Open Access Journals (Sweden)

    Minko M. Milev

    2017-06-01

    Introduction: Integrated marketing communications (IMC) are all kinds of communications between organisations and customers, partners, other organisations and society. Aim: To develop and present an integrated software model which can improve the effectiveness of communications in dental technical services. Material and Methods: The model of integrated software is based on recommendations of a total of 700 respondents (students of dental technology, dental physicians, dental technicians and patients of dental technical laboratories) in Northeastern Bulgaria. Results and Discussion: We present the benefits of future integrated software to improve the communication policy in the dental technical laboratory that meets the needs of fast cooperation and a well-built communicative network between dental physicians, dental technicians, patients and students. Conclusion: The use of integrated communications could be a powerful unified approach to improving the communication policy between all players in the market of dental technical services.

  2. TIME SERIES MODELS OF THREE SETS OF RXTE OBSERVATIONS OF 4U 1543–47

    International Nuclear Information System (INIS)

    Koen, C.

    2013-01-01

    The X-ray nova 4U 1543–47 was in a different physical state (low/hard, high/soft, and very high) during the acquisition of each of the three time series analyzed in this paper. Standard time series models of the autoregressive moving average (ARMA) family are fitted to these series. The low/hard data can be adequately modeled by a simple low-order model with fixed coefficients, once the slowly varying mean count rate has been accounted for. The high/soft series requires a higher order model, or an ARMA model with variable coefficients. The very high state is characterized by a succession of 'dips', with roughly equal depths. These seem to appear independently of one another. The underlying stochastic series can again be modeled by an ARMA form, or roughly as the sum of an ARMA series and white noise. The structuring of each model in terms of short-lived aperiodic and 'quasi-periodic' components is discussed.
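    The treatment described for the low/hard series, removing a slowly varying mean and then fitting a low-order model, can be sketched as follows; the simulated count rate, window width and AR(1) coefficient are assumptions, not values from the paper:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n = 4000
    t = np.arange(n)
    trend = 100.0 + 10.0 * np.sin(2 * np.pi * t / n)   # slowly varying mean count rate
    noise = np.zeros(n)
    for i in range(1, n):
        noise[i] = 0.5 * noise[i - 1] + rng.standard_normal()
    counts = trend + noise

    # Account for the slowly varying mean with a wide moving average,
    # then fit a first-order autoregression to the residual variability.
    w = 201
    mean_est = np.convolve(counts, np.ones(w) / w, mode="same")
    resid = (counts - mean_est)[w:-w]                  # drop window edge effects
    phi_hat = np.dot(resid[:-1], resid[1:]) / np.dot(resid[:-1], resid[:-1])
    ```

    The recovered coefficient sits near the true value once the slow mean has been accounted for.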

  3. Mathematical Model of Thyristor Inverter Including a Series-parallel Resonant Circuit

    Directory of Open Access Journals (Sweden)

    Miroslaw Luft

    2008-01-01

    The article presents a mathematical model of a thyristor inverter including a series-parallel resonant circuit with the aid of the state variable method. Maple procedures are used to compute current and voltage waveforms in the inverter.

  4. Mathematical model of thyristor inverter including a series-parallel resonant circuit

    OpenAIRE

    Luft, M.; Szychta, E.

    2008-01-01

    The article presents a mathematical model of a thyristor inverter including a series-parallel resonant circuit with the aid of the state variable method. Maple procedures are used to compute current and voltage waveforms in the inverter.

  5. Mathematical Model of Thyristor Inverter Including a Series-parallel Resonant Circuit

    OpenAIRE

    Miroslaw Luft; Elzbieta Szychta

    2008-01-01

    The article presents a mathematical model of a thyristor inverter including a series-parallel resonant circuit with the aid of the state variable method. Maple procedures are used to compute current and voltage waveforms in the inverter.
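    The state variable method referred to in these records can be sketched in Python (in place of the authors' Maple procedures) for a plain series RLC branch; the actual inverter's series-parallel topology is not reproduced, and the component values below are assumptions:

    ```python
    import numpy as np

    # State-variable model of a series RLC branch driven by a step voltage u:
    # state x = [i_L, v_C];  L di/dt = u - R*i - v_C;  C dv_C/dt = i.
    R, L, C = 2.0, 1e-3, 1e-6          # assumed component values (ohm, H, F)
    A = np.array([[-R / L, -1.0 / L],
                  [1.0 / C, 0.0]])
    B = np.array([1.0 / L, 0.0])

    def simulate(u, dt, n_steps):
        """Integrate dx/dt = A x + B u with a fixed-step RK4 scheme."""
        f = lambda x: A @ x + B * u
        x = np.zeros(2)
        hist = np.empty((n_steps, 2))
        for n in range(n_steps):
            k1 = f(x)
            k2 = f(x + 0.5 * dt * k1)
            k3 = f(x + 0.5 * dt * k2)
            k4 = f(x + dt * k3)
            x = x + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
            hist[n] = x
        return hist

    hist = simulate(u=1.0, dt=2e-7, n_steps=25000)   # 5 ms of a 1 V step response
    i_L, v_C = hist[:, 0], hist[:, 1]                # underdamped ringing; v_C -> u
    ```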

  6. Fluoroscopy-Guided Percutaneous Vertebral Body Biopsy Using a Novel Drill-Powered Device: Technical Case Series

    Energy Technology Data Exchange (ETDEWEB)

    Wallace, Adam N., E-mail: wallacea@mir.wustl.edu; Pacheco, Rafael A., E-mail: pachecor@mir.wustl.edu; Tomasian, Anderanik, E-mail: tomasiana@mir.wustl.edu [Washington University School of Medicine, Mallinckrodt Institute of Radiology (United States); Hsi, Andy C., E-mail: hsia@path.wustl.edu [Washington University School of Medicine, Division of Anatomic Pathology, Department of Pathology & Immunology (United States); Long, Jeremiah, E-mail: longj@mir.wustl.edu [Washington University School of Medicine, Mallinckrodt Institute of Radiology (United States); Chang, Randy O., E-mail: changr@wusm.wustl.edu [Washington University School of Medicine (United States); Jennings, Jack W., E-mail: jenningsj@mir.wustl.edu [Washington University School of Medicine, Mallinckrodt Institute of Radiology (United States)

    2016-02-15

    Background: A novel coaxial biopsy system powered by a handheld drill has recently been introduced for percutaneous bone biopsy. This technical note describes our initial experience performing fluoroscopy-guided vertebral body biopsies with this system, compares the yield of drill-assisted biopsy specimens with those obtained using a manual technique, and assesses the histologic adequacy of specimens obtained with drill assistance. Methods: Medical records of all single-level, fluoroscopy-guided vertebral body biopsies were reviewed. Procedural complications were documented according to the Society of Interventional Radiology classification. The total length of bone core obtained from drill-assisted biopsies was compared with that of matched manual biopsies. Pathology reports were reviewed to determine the histologic adequacy of specimens obtained with drill assistance. Results: Twenty-eight drill-assisted percutaneous vertebral body biopsies met study inclusion criteria. No acute complications were reported. Of the 86 % (24/28) of patients with clinical follow-up, no delayed complications were reported (median follow-up, 28 weeks; range 5–115 weeks). The median total length of bone core obtained from drill-assisted biopsies was 28 mm (range 8–120 mm). This was longer than that obtained from manual biopsies (median, 20 mm; range 5–45 mm; P = 0.03). Crush artifact was present in 11 % (3/28) of drill-assisted biopsy specimens, which in one case (3.6 %; 1/28) precluded definitive diagnosis. Conclusions: A drill-assisted, coaxial biopsy system can be used to safely obtain vertebral body core specimens under fluoroscopic guidance. The higher bone core yield obtained with drill assistance may be offset by the presence of crush artifact.

  7. Model of a synthetic wind speed time series generator

    DEFF Research Database (Denmark)

    Negra, N.B.; Holmstrøm, O.; Bak-Jensen, B.

    2008-01-01

    One of the main elements to consider for this purpose is the model of the wind speed that is usually required as input. Wind speed measurements may represent a solution for this problem, but, for techniques such as sequential Monte Carlo simulation, they have to be long enough in order to describe a wide range... The model is described and some statistical issues (seasonal characteristics, autocorrelation functions, average values and distribution functions) are used for verification. The output of the model has been designed as input for sequential Monte Carlo simulation; however, it is expected that it can be used for other...
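    A minimal sketch of a synthetic wind speed generator in the spirit described above, combining a seasonal mean with a persistent AR(1) fluctuation and checking the kind of statistics the abstract mentions (average value, autocorrelation). The structure and all parameters are assumptions, not the paper's fitted model:

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    hours = 24 * 365
    t = np.arange(hours)

    # Seasonal mean with diurnal and annual cycles (values assumed for illustration)
    seasonal = 7.0 + 1.5 * np.sin(2 * np.pi * t / 24) + 1.0 * np.sin(2 * np.pi * t / hours)

    # Persistent AR(1) fluctuation around the seasonal mean
    ar = np.zeros(hours)
    for i in range(1, hours):
        ar[i] = 0.9 * ar[i - 1] + rng.normal(scale=1.0)

    wind = np.clip(seasonal + ar, 0.0, None)       # wind speed cannot be negative

    # Verification-style statistics on the generated series
    mean_speed = wind.mean()
    r1 = np.corrcoef(wind[:-1], wind[1:])[0, 1]    # hour-to-hour persistence
    ```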

  8. Modeling Financial Time Series Based on a Market Microstructure Model with Leverage Effect

    Directory of Open Access Journals (Sweden)

    Yanhui Xi

    2016-01-01

    The basic market microstructure model specifies that the price/return innovation and the volatility innovation are independent Gaussian white noise processes. However, the financial leverage effect has been found to be statistically significant in many financial time series. In this paper, a novel market microstructure model with leverage effects is proposed. The model specification assumes a negative correlation in the errors between the price/return innovation and the volatility innovation. With the new representation, a theoretical explanation of the leverage effect is provided. Simulated data and daily stock market indices (the Shanghai composite index, the Shenzhen component index, and the Standard and Poor's 500 Composite index), via the Bayesian Markov Chain Monte Carlo (MCMC) method, are used to estimate the leverage market microstructure model. The results verify the effectiveness of the proposed model and its estimation approach and also indicate that the stock markets have strong leverage effects. Compared with the classical leverage stochastic volatility (SV) model in terms of DIC (Deviance Information Criterion), the leverage market microstructure model fits the data better.
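    The core specification, a negative correlation between the return innovation and the volatility innovation, can be sketched as a simulation; the parameterization below is a hypothetical illustration, and the Bayesian MCMC estimation step is not reproduced:

    ```python
    import numpy as np

    # Negatively correlated [return, log-volatility] innovations
    rng = np.random.default_rng(5)
    n, rho = 20000, -0.6                     # assumed innovation correlation
    cov = [[1.0, rho], [rho, 1.0]]
    eps = rng.multivariate_normal([0.0, 0.0], cov, size=n)
    emp_rho = np.corrcoef(eps[:, 0], eps[:, 1])[0, 1]

    h = np.zeros(n)                          # persistent log-volatility
    for t in range(1, n):
        h[t] = 0.9 * h[t - 1] + 0.25 * eps[t, 1]
    r = np.exp(h / 2) * eps[:, 0]            # returns scaled by stochastic volatility

    # Leverage effect: today's return relates negatively to tomorrow's volatility
    lev = np.corrcoef(r[:-1], h[1:])[0, 1]
    ```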

  9. Bayesian Modelling of fMRI Time Series

    DEFF Research Database (Denmark)

    Højen-Sørensen, Pedro; Hansen, Lars Kai; Rasmussen, Carl Edward

    2000-01-01

    We present a Hidden Markov Model (HMM) for inferring the hidden psychological state (or neural activity) during single trial fMRI activation experiments with blocked task paradigms. Inference is based on Bayesian methodology, using a combination of analytical and a variety of Markov Chain Monte Carlo (MCMC) sampling techniques. The advantage of this method is that detection of short time learning effects between repeated trials is possible since inference is based only on single trial experiments.

  10. Linear models for multivariate, time series, and spatial data

    CERN Document Server

    Christensen, Ronald

    1991-01-01

    This is a companion volume to Plane Answers to Complex Questions: The Theory of Linear Models. It consists of six additional chapters written in the same spirit as the last six chapters of the earlier book. Brief introductions are given to topics related to linear model theory. No attempt is made to give a comprehensive treatment of the topics. Such an effort would be futile. Each chapter is on a topic so broad that an in-depth discussion would require a book-length treatment. People need to impose structure on the world in order to understand it. There is a limit to the number of unrelated facts that anyone can remember. If ideas can be put within a broad, sophisticatedly simple structure, not only are they easier to remember but often new insights become available. In fact, sophisticatedly simple models of the world may be the only ones that work. I have often heard Arnold Zellner say that, to the best of his knowledge, this is true in econometrics. The process of modeling is fundamental to understand...

  11. Spatially adaptive mixture modeling for analysis of FMRI time series.

    Science.gov (United States)

    Vincent, Thomas; Risser, Laurent; Ciuciu, Philippe

    2010-04-01

    Within-subject analysis in fMRI essentially addresses two problems, the detection of brain regions eliciting evoked activity and the estimation of the underlying dynamics. In Makni et al. (2005) and Makni et al. (2008), a detection-estimation framework has been proposed to tackle these problems jointly, since they are connected to one another. In the Bayesian formalism, detection is achieved by modeling activating and nonactivating voxels through independent mixture models (IMM) within each region while hemodynamic response estimation is performed at a regional scale in a nonparametric way. Instead of IMMs, in this paper we take advantage of spatial mixture models (SMM) for their nonlinear spatial regularizing properties. The proposed method is unsupervised and spatially adaptive in the sense that the amount of spatial correlation is automatically tuned from the data and this setting automatically varies across brain regions. In addition, the level of regularization is specific to each experimental condition since both the signal-to-noise ratio and the activation pattern may vary across stimulus types in a given brain region. These aspects require the precise estimation of multiple partition functions of underlying Ising fields. This is addressed efficiently using first path sampling for a small subset of fields and then using a recently developed fast extrapolation technique for the large remaining set. Simulation results emphasize that detection relying on supervised SMM outperforms its IMM counterpart and that unsupervised spatial mixture models achieve similar results without any hand-tuning of the correlation parameter. On real datasets, the gain is illustrated in a localizer fMRI experiment: brain activations appear more spatially resolved using SMM in comparison with classical general linear model (GLM)-based approaches, while estimating a specific parcel-based HRF shape. Our approach therefore validates the treatment of unsmoothed fMRI data without fixed GLM...

  12. Simulated lumbar minimally invasive surgery educational model with didactic and technical components.

    Science.gov (United States)

    Chitale, Rohan; Ghobrial, George M; Lobel, Darlene; Harrop, James

    2013-10-01

    The learning and development of technical skills are paramount for neurosurgical trainees. External influences and a need for maximizing efficiency and proficiency have encouraged advancements in simulator-based learning models. Objective: To confirm the importance of establishing an educational curriculum for teaching minimally invasive techniques of pedicle screw placement using a computer-enhanced physical model of percutaneous pedicle screw placement with simultaneous didactic and technical components. Methods: A 2-hour educational curriculum was created to educate neurosurgical residents on anatomy, pathophysiology, and technical aspects associated with image-guided pedicle screw placement. Pre- and post-didactic practical and written scores were analyzed and compared. Scores were calculated for each participant on the basis of the optimal pedicle screw starting point and trajectory for both fluoroscopy and computed tomographic navigation. Results: Eight trainees participated in this module. Mean scores on the written didactic test improved from 78% to 100%. The technical component scores for fluoroscopic guidance improved from 58.8 to 52.9 (lower is better), and for computed tomography-navigated guidance from 28.3 to 26.6. Conclusion: Didactic and technical quantitative scores with a simulator-based educational curriculum improved objectively measured resident performance. A minimally invasive spine simulation model and curriculum may serve a valuable function in the education of neurosurgical residents and outcomes for patients.

  13. Technical Work Plan for: Thermodynamic Databases for Chemical Modeling

    International Nuclear Information System (INIS)

    C.F. Jovecolon

    2006-01-01

    The objective of the work scope covered by this Technical Work Plan (TWP) is to correct and improve the Yucca Mountain Project (YMP) thermodynamic databases, to update their documentation, and to ensure reasonable consistency among them. In addition, the work scope will continue to generate database revisions, which are organized and named so as to be transparent to internal and external users and reviewers. Regarding consistency among databases, it is noted that aqueous speciation and mineral solubility data for a given system may differ according to how solubility was determined, and the method used for subsequent retrieval of thermodynamic parameter values from measured data. Of particular concern are the details of the determination of "infinite dilution" constants, which involve the use of specific methods for activity coefficient corrections. That is, equilibrium constants developed for a given system for one set of conditions may not be consistent with constants developed for other conditions, depending on the species considered in the chemical reactions and the methods used in the reported studies. Hence, there will be some differences (for example in log K values) between the Pitzer and "B-dot" database parameters for the same reactions or species.

  14. Technical Work Plan for: Thermodynamic Database for Chemical Modeling

    Energy Technology Data Exchange (ETDEWEB)

    C.F. Jovecolon

    2006-09-07

    The objective of the work scope covered by this Technical Work Plan (TWP) is to correct and improve the Yucca Mountain Project (YMP) thermodynamic databases, to update their documentation, and to ensure reasonable consistency among them. In addition, the work scope will continue to generate database revisions, which are organized and named so as to be transparent to internal and external users and reviewers. Regarding consistency among databases, it is noted that aqueous speciation and mineral solubility data for a given system may differ according to how solubility was determined, and the method used for subsequent retrieval of thermodynamic parameter values from measured data. Of particular concern are the details of the determination of "infinite dilution" constants, which involve the use of specific methods for activity coefficient corrections. That is, equilibrium constants developed for a given system for one set of conditions may not be consistent with constants developed for other conditions, depending on the species considered in the chemical reactions and the methods used in the reported studies. Hence, there will be some differences (for example in log K values) between the Pitzer and "B-dot" database parameters for the same reactions or species.

  15. Mesorad dose assessment model. Volume 1. Technical basis

    International Nuclear Information System (INIS)

    Scherpelz, R.I.; Bander, T.J.; Athey, G.F.; Ramsdell, J.V.

    1986-03-01

    MESORAD is a dose assessment model for emergency response applications. Using release data for as many as 50 radionuclides, the model calculates: (1) external doses resulting from exposure to radiation emitted by radionuclides contained in elevated or deposited material; (2) internal dose commitment resulting from inhalation; and (3) total whole-body doses. External doses from airborne material are calculated using semi-infinite and finite cloud approximations. At each stage in model execution, the appropriate approximation is selected after considering the cloud dimensions. Atmospheric processes are represented in MESORAD by a combination of Lagrangian puff and Gaussian plume dispersion models, a source depletion (deposition velocity) dry deposition model, and a wet deposition model using washout coefficients based on precipitation rates
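    One ingredient of the dispersion treatment described above, the Gaussian plume approximation, can be sketched for ground-level concentration from an elevated release; the linear growth of the dispersion widths and all numbers are illustrative assumptions, not MESORAD's parameterization:

    ```python
    import numpy as np

    def gaussian_plume_ground(Q, u, x, y, H, a=0.08, b=0.06):
        """Ground-level air concentration from a continuous point release.
        Q: release rate (e.g. Bq/s); u: wind speed (m/s); x, y: downwind and
        crosswind distances (m); H: effective release height (m). The linear
        growth sigma_y = a*x, sigma_z = b*x stands in for tabulated
        stability-class dispersion curves."""
        sig_y, sig_z = a * x, b * x
        return (Q / (np.pi * u * sig_y * sig_z)
                * np.exp(-y**2 / (2 * sig_y**2))
                * np.exp(-H**2 / (2 * sig_z**2)))

    x = np.linspace(200.0, 5000.0, 200)
    chi = gaussian_plume_ground(Q=1e9, u=5.0, x=x, y=0.0, H=50.0)
    x_peak = x[np.argmax(chi)]        # an elevated release peaks some way downwind
    ```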

  16. Bayesian Modelling of fMRI Time Series

    DEFF Research Database (Denmark)

    Højen-Sørensen, Pedro; Hansen, Lars Kai; Rasmussen, Carl Edward

    2000-01-01

    We present a Hidden Markov Model (HMM) for inferring the hidden psychological state (or neural activity) during single trial fMRI activation experiments with blocked task paradigms. Inference is based on Bayesian methodology, using a combination of analytical and a variety of Markov Chain Monte Carlo (MCMC) sampling techniques. The advantage of this method is that detection of short time learning effects between repeated trials is possible since inference is based only on single trial experiments.

  17. Developing and Validating the Socio-Technical Model in Ontology Engineering

    Science.gov (United States)

    Silalahi, Mesnan; Indra Sensuse, Dana; Giri Sucahyo, Yudho; Fadhilah Akmaliah, Izzah; Rahayu, Puji; Cahyaningsih, Elin

    2018-03-01

    This paper describes results from an attempt to develop a model in ontology engineering methodology and a way to validate the model. The approach to methodology in ontology engineering is taken from the point of view of socio-technical system theory. Qualitative research synthesis, using meta-ethnography, is applied to build the model. In order to ensure the objectivity of the measurement, an inter-rater reliability method was applied using a multi-rater Fleiss kappa. The results show that the research output accords with the diamond model in socio-technical system theory, as evidenced by the interdependency of the four socio-technical variables, namely people, technology, structure and task.
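    The multi-rater Fleiss kappa used above for inter-rater reliability can be computed directly from a table of rating counts; a minimal sketch with invented rating data:

    ```python
    import numpy as np

    def fleiss_kappa(counts):
        """Fleiss' kappa for agreement among a fixed number of raters.
        counts: (n_items, n_categories); counts[i, j] = raters assigning item i
        to category j (each row sums to the same rater count n)."""
        counts = np.asarray(counts, dtype=float)
        N = counts.shape[0]
        n = counts[0].sum()                           # raters per item
        p_j = counts.sum(axis=0) / (N * n)            # overall category proportions
        P_i = (np.sum(counts**2, axis=1) - n) / (n * (n - 1))  # per-item agreement
        P_bar, P_e = P_i.mean(), np.sum(p_j**2)
        return (P_bar - P_e) / (1 - P_e)

    k_perfect = fleiss_kappa([[4, 0], [0, 4], [4, 0]])   # all 4 raters agree -> 1
    k_mixed = fleiss_kappa([[2, 2], [3, 1], [1, 3]])     # heavy disagreement -> < 0
    ```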

  18. Sensitivity analysis of machine-learning models of hydrologic time series

    Science.gov (United States)

    O'Reilly, A. M.

    2017-12-01

    Sensitivity analysis traditionally has been applied to assessing model response to perturbations in model parameters, where the parameters are those model input variables adjusted during calibration. Unlike physics-based models where parameters represent real phenomena, the equivalent of parameters for machine-learning models are simply mathematical "knobs" that are automatically adjusted during training/testing/verification procedures. Thus the challenge of extracting knowledge of hydrologic system functionality from machine-learning models lies in their very nature, leading to the label "black box." Sensitivity analysis of the forcing-response behavior of machine-learning models, however, can provide understanding of how the physical phenomena represented by model inputs affect the physical phenomena represented by model outputs. As part of a previous study, hybrid spectral-decomposition artificial neural network (ANN) models were developed to simulate the observed behavior of hydrologic response contained in multidecadal datasets of lake water level, groundwater level, and spring flow. Model inputs used moving window averages (MWA) to represent various frequencies and frequency-band components of time series of rainfall and groundwater use. Using these forcing time series, the MWA-ANN models were trained to predict time series of lake water level, groundwater level, and spring flow at 51 sites in central Florida, USA. A time series of sensitivities for each MWA-ANN model was produced by perturbing forcing time-series and computing the change in response time-series per unit change in perturbation. Variations in forcing-response sensitivities are evident between types (lake, groundwater level, or spring), spatially (among sites of the same type), and temporally. Two generally common characteristics among sites are more uniform sensitivities to rainfall over time and notable increases in sensitivities to groundwater usage during significant drought periods.
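    The perturbation procedure described above (perturb a forcing series and compute the change in response per unit change in perturbation) can be sketched with a hypothetical toy model standing in for a trained MWA-ANN; all names and values below are assumptions:

    ```python
    import numpy as np

    def moving_window_average(x, w):
        """Trailing moving-window average (shorter windows at the start)."""
        c = np.cumsum(np.concatenate(([0.0], x)))
        out = np.empty(len(x))
        for t in range(len(x)):
            lo = max(0, t - w + 1)
            out[t] = (c[t + 1] - c[lo]) / (t + 1 - lo)
        return out

    def toy_model(rain, pumping):
        """Hypothetical stand-in for a trained MWA-ANN: level rises with slow
        rainfall forcing and falls with slow groundwater use."""
        return (10.0 + 0.8 * moving_window_average(rain, 30)
                - 0.5 * moving_window_average(pumping, 90))

    rng = np.random.default_rng(6)
    rain = rng.gamma(2.0, 2.0, size=1000)
    pumping = 5.0 + rng.normal(0.0, 0.5, size=1000)
    base = toy_model(rain, pumping)

    # Sensitivity: change in response per unit change in a forcing perturbation
    delta = 1.0
    sens_rain = (toy_model(rain + delta, pumping) - base) / delta
    sens_pump = (toy_model(rain, pumping + delta) - base) / delta
    ```

    Because the toy model is linear in the window averages, the recovered sensitivities equal its coefficients exactly; a trained ANN would instead yield time-varying sensitivities.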

  19. Research on power grid loss prediction model based on Granger causality property of time series

    Energy Technology Data Exchange (ETDEWEB)

    Wang, J. [North China Electric Power Univ., Beijing (China); State Grid Corp., Beijing (China); Yan, W.P.; Yuan, J. [North China Electric Power Univ., Beijing (China); Xu, H.M.; Wang, X.L. [State Grid Information and Telecommunications Corp., Beijing (China)

    2009-03-11

    This paper described a method of predicting power transmission line losses using the Granger causality property of time series. The stable property of the time series was investigated using unit root tests. The Granger causality relationship between line losses and other variables was then determined. Granger-caused time series were then used to create the following 3 prediction models: (1) a model based on line loss binomials that used electricity sales to predict variables, (2) a model that considered both power sales and grid capacity, and (3) a model based on autoregressive distributed lag (ARDL) approaches that incorporated both power sales and the square of power sales as variables. A case study of data from China's electric power grid between 1980 and 2008 was used to evaluate model performance. Results of the study showed that the model error rates ranged between 2.7 and 3.9 percent. 6 refs., 3 tabs., 1 fig.
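    The lag-regression form of a Granger causality test, checking whether lags of a candidate series improve prediction beyond a series' own lags, can be sketched as an F-test; the simulated series below stand in for the paper's line-loss and power-sales data:

    ```python
    import numpy as np

    def granger_f(y, x, p):
        """F-statistic: do p lags of x improve an AR(p) prediction of y?"""
        n = len(y)
        Y = y[p:]
        ylags = np.column_stack([y[p - j: n - j] for j in range(1, p + 1)])
        xlags = np.column_stack([x[p - j: n - j] for j in range(1, p + 1)])
        ones = np.ones((n - p, 1))
        Xr = np.hstack([ones, ylags])            # restricted: own lags only
        Xu = np.hstack([ones, ylags, xlags])     # unrestricted: plus lags of x
        rss = lambda X: np.sum((Y - X @ np.linalg.lstsq(X, Y, rcond=None)[0]) ** 2)
        rss_r, rss_u = rss(Xr), rss(Xu)
        df_u = (n - p) - Xu.shape[1]
        return ((rss_r - rss_u) / p) / (rss_u / df_u)

    rng = np.random.default_rng(7)
    n = 1000
    x = rng.standard_normal(n)                   # series that drives y
    z = rng.standard_normal(n)                   # independent control series
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = 0.4 * y[t - 1] + 0.5 * x[t - 1] + rng.standard_normal()

    f_xy = granger_f(y, x, p=2)                  # x Granger-causes y: large F
    f_zy = granger_f(y, z, p=2)                  # z does not: F near 1
    ```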

  20. Nonlinearity, Breaks, and Long-Range Dependence in Time-Series Models

    DEFF Research Database (Denmark)

    Hillebrand, Eric Tobias; Medeiros, Marcelo C.

    We study the simultaneous occurrence of long memory and nonlinear effects, such as parameter changes and threshold effects, in ARMA time series models and apply our modeling framework to daily realized volatility. Asymptotic theory for parameter estimation is developed and two model building...

  1. Power flow control for transmission networks with implicit modeling of static synchronous series compensator

    DEFF Research Database (Denmark)

    Kamel, S.; Jurado, F.; Chen, Zhe

    2015-01-01

    This paper presents an implicit modeling of the Static Synchronous Series Compensator (SSSC) in the Newton–Raphson load flow method. The load flow algorithm is based on the revised current injection formulation, and the developed SSSC model is likewise based on the current injection approach. In this model...

  2. A Four-Stage Hybrid Model for Hydrological Time Series Forecasting

    Science.gov (United States)

    Di, Chongli; Yang, Xiaohua; Wang, Xiaochao

    2014-01-01

    Hydrological time series forecasting remains a difficult task due to its complicated nonlinear, non-stationary and multi-scale characteristics. To address this difficulty and improve the prediction accuracy, a novel four-stage hybrid model is proposed for hydrological time series forecasting based on the principle of ‘denoising, decomposition and ensemble’. The proposed model has four stages, i.e., denoising, decomposition, components prediction and ensemble. In the denoising stage, the empirical mode decomposition (EMD) method is utilized to reduce the noise in the hydrological time series. Then, an improved variant of EMD, the ensemble empirical mode decomposition (EEMD), is applied to decompose the denoised series into a number of intrinsic mode function (IMF) components and one residual component. Next, the radial basis function neural network (RBFNN) is adopted to predict the trend of all of the components obtained in the decomposition stage. In the final ensemble prediction stage, the forecasting results of all of the IMF and residual components obtained in the third stage are combined to generate the final prediction results, using a linear neural network (LNN) model. For illustration and verification, six hydrological cases with different characteristics are used to test the effectiveness of the proposed model. The proposed hybrid model performs better than conventional single models, the hybrid models without denoising or decomposition, and the hybrid models based on other methods, such as the wavelet analysis (WA)-based hybrid models. In addition, the denoising and decomposition strategies decrease the complexity of the series and reduce the difficulty of forecasting. With its effective denoising and accurate decomposition ability, high prediction precision and wide applicability, the new model is very promising for complex time series forecasting. This new forecast model is an extension of nonlinear prediction models. PMID:25111782
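    The final ensemble stage can be sketched as a least-squares combination of component forecasts; a linear neural network with no hidden layer reduces to exactly this fit (the data here are placeholders):

```python
import numpy as np

def linear_ensemble(component_preds, target):
    """Fit the final ensemble stage: combine component forecasts
    (e.g., IMF and residual predictions) linearly, with an intercept,
    to match the target series.  Stand-in for the paper's linear
    neural network (LNN)."""
    X = np.column_stack(component_preds + [np.ones(len(target))])
    weights, *_ = np.linalg.lstsq(X, target, rcond=None)
    combined = X @ weights
    return weights, combined
```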

  3. On Fire regime modelling using satellite TM time series

    Science.gov (United States)

    Oddi, F.; Ghermandi, L.; Lanorte, A.; Lasaponara, R.

    2009-04-01

    Wildfires can cause environmental deterioration by modifying vegetation dynamics, since they are capable of changing vegetation diversity and physiognomy. In semiarid regions such as northwestern Patagonia, fire disturbance is also important because it can affect the potential productivity of the ecosystem: plant biomass is reduced, and with it the animal carrying capacity and/or the forest site quality, with negative economic implications. Knowledge of the fire regime of a region is therefore of great importance for understanding and predicting the responses of vegetation and their possible effects on the regional economy. Studies of this type at the landscape level can be addressed using GIS tools: satellite imagery allows burned areas to be detected, and through temporal analysis the fire regime can be determined and landscape-scale changes identified. The study area lies east of the city of Bariloche, including the San Ramon Ranch (22,000 ha) and its environs, in the ecotone formed by the sub-Antarctic forest and the Patagonian steppe. We worked with multispectral Landsat TM and Landsat ETM+ images of 30 m spatial resolution acquired at different times. For the spatial analysis we used the software Erdas Imagine 9.0 and ArcView 3.3. Vegetation types were discriminated, the areas affected by fires in different years were delineated, and the level of fire-induced change in vegetation was determined. In the future, the use of images of higher spatial resolution combined with higher spectral resolution will allow burned areas in the study area to be distinguished with greater precision. The use of digital terrain models derived from satellite imagery, together with climatic variables, will also allow the relationship between these factors and vegetation dynamics to be modelled.

  4. Communications network design and costing model technical manual

    Science.gov (United States)

    Logan, K. P.; Somes, S. S.; Clark, C. A.

    1983-01-01

    This computer model provides the capability for analyzing long-haul trunking networks comprising a set of user-defined cities, traffic conditions, and tariff rates. Networks may consist of all terrestrial connectivity, all satellite connectivity, or a combination of terrestrial and satellite connectivity. Network solutions provide the least-cost routes between all cities, the least-cost network routing configuration, and terrestrial and satellite service cost totals. The CNDC model allows analyses involving three specific FCC-approved tariffs, which are uniquely structured and representative of most existing service connectivity and pricing philosophies. User-defined tariffs that can be variations of these three tariffs are accepted as input to the model and allow considerable flexibility in network problem specification. The resulting model extends the domain of network analysis from traditional fixed link cost (distance-sensitive) problems to more complex problems involving combinations of distance and traffic-sensitive tariffs.
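    The least-cost routing such a model performs can be illustrated with a standard shortest-path computation over a cost-weighted city graph; the cities and link costs below are invented, and the sketch ignores the CNDC tariff structures:

```python
import heapq

def least_cost_routes(graph, source):
    """Dijkstra's algorithm: cheapest route cost from `source` to every
    other city.  `graph` maps city -> {neighbor: link cost}."""
    costs = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        cost, city = heapq.heappop(heap)
        if cost > costs.get(city, float("inf")):
            continue  # stale heap entry
        for nbr, link in graph[city].items():
            new = cost + link
            if new < costs.get(nbr, float("inf")):
                costs[nbr] = new
                heapq.heappush(heap, (new, nbr))
    return costs

# Hypothetical four-city network with symmetric link costs.
network = {
    "NYC": {"CHI": 8.0, "DC": 3.0},
    "DC":  {"NYC": 3.0, "ATL": 5.0},
    "CHI": {"NYC": 8.0, "ATL": 6.0},
    "ATL": {"DC": 5.0, "CHI": 6.0},
}
```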

  5. Dispersive processes in models of regional radionuclide migration. Technical memorandum

    International Nuclear Information System (INIS)

    Evenson, D.E.; Dettinger, M.D.

    1980-05-01

    Three broad areas of concern in the development of aquifer scale transport models will be local scale diffusion and dispersion processes, regional scale dispersion processes, and numerical problems associated with the advection-dispersion equation. Local scale dispersion processes are fairly well understood and accessible to observation. These processes will generally be dominated in large scale systems by regional processes, or macro-dispersion. Macro-dispersion is primarily the result of large scale heterogeneities in aquifer properties. In addition, the effects of many modeling approximations are often included in the process. Because difficulties arise in parameterization of this large scale phenomenon, parameterization should be based on field measurements made at the same scale as the transport process of interest or else partially circumvented through the application of a probabilistic advection model. Other problems associated with numerical transport models include difficulties with conservation of mass, stability, numerical dissipation, overshoot, flexibility, and efficiency. We recommend the random-walk model formulation for Lawrence Livermore Laboratory's purposes as the most flexible, accurate and relatively efficient modeling approach that overcomes these difficulties
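    The recommended random-walk formulation can be sketched for one-dimensional advection-dispersion; the parameter values are illustrative:

```python
import numpy as np

def random_walk_transport(n_particles, v, D, dt, steps, seed=0):
    """Particle random-walk solution of 1-D advection-dispersion:
    each step adds the deterministic advective displacement v*dt plus a
    random dispersive displacement with variance 2*D*dt, so the particle
    cloud reproduces the advection-dispersion equation statistically."""
    rng = np.random.default_rng(seed)
    x = np.zeros(n_particles)
    for _ in range(steps):
        x += v * dt + np.sqrt(2.0 * D * dt) * rng.normal(size=n_particles)
    return x
```

After time t = steps*dt the cloud mean should approach v*t and its variance 2*D*t, which is a convenient self-check of the formulation.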

  6. Technical note: A linear model for predicting δ13 Cprotein.

    Science.gov (United States)

    Pestle, William J; Hubbe, Mark; Smith, Erin K; Stevenson, Joseph M

    2015-08-01

    Development of a model for the prediction of δ13Cprotein from δ13Ccollagen and Δ13Cap-co. Model-generated values could, in turn, serve as "consumer" inputs for multisource mixture modeling of paleodiet. Linear regression analysis of previously published controlled diet data facilitated the development of a mathematical model for predicting δ13Cprotein (and an experimentally generated error term) from isotopic data routinely generated during the analysis of osseous remains (δ13Cco and Δ13Cap-co). Regression analysis resulted in a two-term linear model, δ13Cprotein (‰) = (0.78 × δ13Cco) − (0.58 × Δ13Cap-co) − 4.7, possessing a high R-value of 0.93 (r² = 0.86, P …) for the analysis of human osseous remains. These predicted values are ideal for use in multisource mixture modeling of dietary protein source contribution. © 2015 Wiley Periodicals, Inc.
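    The quoted two-term model is directly computable; the function name and the input values in the example are ours, not the authors':

```python
def predict_d13c_protein(d13c_collagen, delta13c_ap_co):
    """Two-term linear model from the abstract (values in per mil):
    d13C_protein = 0.78 * d13C_collagen - 0.58 * Delta13C_ap-co - 4.7
    """
    return 0.78 * d13c_collagen - 0.58 * delta13c_ap_co - 4.7
```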

  7. Hybrid model for forecasting time series with trend, seasonal and calendar variation patterns

    Science.gov (United States)

    Suhartono; Rahayu, S. P.; Prastyo, D. D.; Wijayanti, D. G. P.; Juliyanto

    2017-09-01

    Most monthly time series data in economics and business in Indonesia and other Muslim countries contain not only trend and seasonal patterns but are also affected by two types of calendar variation effects, i.e., the effect of the number of working or trading days, and holiday effects. The purpose of this research is to develop a hybrid model, or a combination of several forecasting models, to predict time series that contain trend, seasonal, and calendar variation patterns. This hybrid model is a combination of classical models (namely time series regression and the ARIMA model) and/or modern methods (an artificial intelligence method, i.e., Artificial Neural Networks). A simulation study shows that the proposed procedure for building the hybrid model works well for forecasting time series with trend, seasonal, and calendar variation patterns. Furthermore, the proposed hybrid model is applied to forecasting real data, i.e., monthly data on the inflow and outflow of currency at Bank Indonesia. The results show that the hybrid model tends to provide more accurate forecasts than the individual forecasting models. Moreover, this result is in line with the results of the M3 competition, i.e., a hybrid model on average provides more accurate forecasts than an individual model.
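    The time series regression component of such a hybrid can be sketched with trend, monthly-dummy, and calendar-dummy regressors; the series, holiday indicator, and coefficients below are simulated placeholders:

```python
import numpy as np

def ts_regression_design(n_months, calendar_dummy):
    """Design matrix with a linear trend, 12 monthly dummies, and one
    calendar-variation dummy (e.g., a moving-holiday indicator)."""
    t = np.arange(n_months, dtype=float)
    months = np.eye(12)[np.arange(n_months) % 12]   # seasonal dummies
    return np.column_stack([t, months, calendar_dummy])

# Hypothetical 10-year monthly series: trend + seasonality + a holiday
# effect of +8 units in flagged months, plus small noise.
rng = np.random.default_rng(1)
n = 120
holiday = (rng.random(n) < 0.08).astype(float)
holiday[[5, 50, 95]] = 1.0                          # ensure some flags
X = ts_regression_design(n, holiday)
beta_true = np.concatenate([[0.5], rng.normal(10.0, 2.0, 12), [8.0]])
y = X @ beta_true + rng.normal(0.0, 0.1, n)
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)    # recovers the effects
```

In a full hybrid, the residuals of this regression would then be modelled by ARIMA and/or an ANN.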

  8. Bragg Gratings, Photosensitivity, and Poling in Glass Fibers and Waveguides: Applications and Fundamentals. Technical Digest Series, Volume 17

    Science.gov (United States)

    1998-05-26

    Reimund Gerhard-Multhaupt, Univ. Potsdam, Germany: Chromophore dipoles in amorphous nonlinear optical polymers are oriented with electric fields.

  9. Simulated spinal cerebrospinal fluid leak repair: an educational model with didactic and technical components.

    Science.gov (United States)

    Ghobrial, George M; Anderson, Paul A; Chitale, Rohan; Campbell, Peter G; Lobel, Darlene A; Harrop, James

    2013-10-01

    In the era of surgical resident work hour restrictions, the traditional apprenticeship model may provide fewer hours for neurosurgical residents to hone technical skills. Spinal dura mater closure or repair is one skill that is infrequently encountered, and persistent cerebrospinal fluid leaks are a potential morbidity. The objective was to establish an educational curriculum to train residents in spinal dura mater closure with a novel durotomy repair model. The Congress of Neurological Surgeons has developed a simulation-based model for durotomy closure through the ongoing efforts of its simulation educational committee. The core curriculum consists of didactic training materials and a technical simulation model of dural repair for the lumbar spine. Didactic pretest scores ranged from 4/11 (36%) to 10/11 (91%); posttest scores ranged from 8/11 (73%) to 11/11 (100%). Overall, didactic improvement was demonstrated by all participants, with a mean improvement between pre- and posttest scores of 1.17 (18.5%; P = .02). The technical component consisted of 11 durotomy closures by 6 participants, 4 of whom performed multiple durotomies. Mean time to closure of the durotomy was 490 and 546 seconds for the first and second closures, respectively (P = .66), while the median leak rate improved from 14 to 7 (P = .34). All participants also demonstrated technical improvement. Simulated spinal dura mater repair appears to be a potentially valuable tool in the education of neurosurgery residents. The combination of a didactic and technical assessment appears to be synergistic in terms of educational development.

  10. A probabilistic method for constructing wave time-series at inshore locations using model scenarios

    Science.gov (United States)

    Long, Joseph W.; Plant, Nathaniel G.; Dalyander, P. Soupy; Thompson, David M.

    2014-01-01

    Continuous time-series of wave characteristics (height, period, and direction) are constructed using a base set of model scenarios and simple probabilistic methods. This approach utilizes an archive of computationally intensive, highly spatially resolved numerical wave model output to develop time-series of historical or future wave conditions without performing additional, continuous numerical simulations. The archive of model output contains wave simulations from a set of model scenarios derived from an offshore wave climatology. Time-series of wave height, period, direction, and associated uncertainties are constructed at locations included in the numerical model domain. The confidence limits are derived using statistical variability of oceanographic parameters contained in the wave model scenarios. The method was applied to a region in the northern Gulf of Mexico and assessed using wave observations at 12 m and 30 m water depths. Prediction skill for significant wave height is 0.58 and 0.67 at the 12 m and 30 m locations, respectively, with similar performance for wave period and direction. The skill of this simplified, probabilistic time-series construction method is comparable to existing large-scale, high-fidelity operational wave models but provides higher spatial resolution output at low computational expense. The constructed time-series can be developed to support a variety of applications including climate studies and other situations where a comprehensive survey of wave impacts on the coastal area is of interest.
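    The scenario-lookup idea can be sketched as a nearest-neighbor match in offshore parameter space; the real method is probabilistic and also derives confidence limits, which this toy omits, and all values shown are invented:

```python
import numpy as np

def nearest_scenario_series(offshore_obs, scenario_params, scenario_inshore):
    """For each offshore (height, period) observation, pick the closest
    precomputed model scenario and return its stored inshore wave value.

    Distance is Euclidean in a normalized parameter space; the archive
    of `scenario_inshore` values stands in for the high-resolution
    numerical wave model output."""
    params = np.asarray(scenario_params, dtype=float)
    obs = np.asarray(offshore_obs, dtype=float)
    scale = params.std(axis=0)
    scale[scale == 0] = 1.0
    d = np.linalg.norm((obs[:, None, :] - params[None, :, :]) / scale, axis=2)
    idx = d.argmin(axis=1)
    return np.asarray(scenario_inshore)[idx]
```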

  11. Biotrans functional and technical description. Report of VIEWLS WP5, modelling studies

    International Nuclear Information System (INIS)

    Van Tilburg, X.; Egging, R.; Londo, H.M.

    2006-01-01

    The overall objectives of this project are to provide structured and clear data on the availability and performance of biofuels and to identify the possibilities and strategies towards large scale sustainable production, use and trading of biofuels for the transport sector in Europe, including Central and Eastern European Countries (CEEC). The report supplements the two other reports in the work package: 'Biofuel and Bio-energy implementation scenarios - final report of VIEWLS WP5' (2005) and 'VIEWLS modelling and analysis, technical data for biofuel production chains' (2005). This document contains a functional and technical description of the BioTrans model, accompanied by a description of the system. Section 2 contains a conceptual and functional description of the biofuel model. Section 3 describes the optimisation method in technical terms, discussing aspects like the target function and constraints used. Finally, section 4 discusses the input and output requirements for the BioTrans system

  12. Technical Note: How to use Winbugs to infer animal models

    DEFF Research Database (Denmark)

    Damgaard, Lars Holm

    2007-01-01

    This paper deals with Bayesian inferences of animal models using Gibbs sampling. First, we suggest a general and efficient method for updating additive genetic effects, in which the computational cost is independent of the pedigree depth and increases linearly only with the size of the pedigree. ...... having Student's t distributions. In conclusion, Winbugs can be used to make inferences in small-sized, quantitative, genetic data sets applying a wide range of animal models that are not yet standard in the animal breeding literature...

  13. Blaney-Morin-Nigeria (BMN) Evapotranspiration Model (A Technical ...

    African Journals Online (AJOL)

    Duru [1] presented a modified form of the Blaney-Morin potential evapotranspiration equation, christened the Blaney-Morin-Nigeria (BMN) Evapotranspiration (ET) model, for use in Nigeria. In this work, Duru recognized the very wide variability of relative humidity in Nigeria and consequently the very important role this parameter ...

  14. Technical note: River modelling to infer flood management framework

    African Journals Online (AJOL)

    River hydraulic models have successfully identified the weaknesses and areas for improvement with respect to flooding in the Sarawak River system, and can also be used to support decisions on flood management measures. Often, the big question is 'how'. This paper demonstrates a theoretical flood management ...

  15. Efforts - Final technical report on task 4. Physical modelling validation

    DEFF Research Database (Denmark)

    Andreasen, Jan Lasson; Olsson, David Dam; Christensen, T. W.

    The present report documents the work carried out at DTU in Task 4, Physical modelling - validation, of the Brite/Euram project No. BE96-3340, contract No. BRPR-CT97-0398, entitled Enhanced Framework for forging design using reliable three-dimensional simulation (EFFORTS). The report...

  16. FA3100 series industrial personal computer, the highest speed model 7010 in Japan; Sangyoyo computer FA3100A series kokunai saisoku model 7010

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-03-01

    The FA3100 series of industrial personal computers has been extended with the top-of-the-line model 7010, equipped with a 550 MHz Pentium(R) III processor, the fastest in Japan. With the increase in information processing volume, applications requiring higher CPU processing speed are also increasing in the field of factory automation (FA), and the new model responds to this demand. Eleven expansion slots are provided for industrial use, giving ample expandability. The three-bay disk compartment, in which devices can be mounted simply by removing the front panel, accepts drives selected according to the application, such as a hard disk, duplicated (mirrored) hard disk, silicon disk, CD-ROM, or magneto-optical (MO) disk. (translated by NEDO)

  17. Travel Cost Inference from Sparse, Spatio-Temporally Correlated Time Series Using Markov Models

    DEFF Research Database (Denmark)

    Yang, Bin; Guo, Chenjuan; Jensen, Christian S.

    2013-01-01

    of such time series offers insight into the underlying system and enables prediction of system behavior. While the techniques presented in the paper apply more generally, we consider the case of transportation systems and aim to predict travel cost from GPS tracking data from probe vehicles. Specifically, each...... road segment has an associated travel-cost time series, which is derived from GPS data. We use spatio-temporal hidden Markov models (STHMM) to model correlations among different traffic time series. We provide algorithms that are able to learn the parameters of an STHMM while contending...... with the sparsity, spatio-temporal correlation, and heterogeneity of the time series. Using the resulting STHMM, near future travel costs in the transportation network, e.g., travel time or greenhouse gas emissions, can be inferred, enabling a variety of routing services, e.g., eco-routing. Empirical studies...
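    A building block of such Markov-model inference is estimating a transition matrix from an observed, discretized travel-cost sequence; this sketch ignores the spatio-temporal coupling and sparsity handling of the STHMM, and the state sequence is invented:

```python
import numpy as np

def transition_matrix(states, n_states):
    """Maximum-likelihood transition matrix of a discrete Markov chain
    from an observed state sequence (e.g., discretized travel-cost
    levels on one road segment).  Rows sum to 1."""
    counts = np.zeros((n_states, n_states))
    for a, b in zip(states[:-1], states[1:]):
        counts[a, b] += 1
    rows = counts.sum(axis=1, keepdims=True)
    rows[rows == 0] = 1.0       # leave never-visited states as zero rows
    return counts / rows

# Toy sequence of low/medium/high travel-cost states.
seq = [0, 0, 1, 2, 1, 0, 0, 1, 2, 2, 1, 0]
P = transition_matrix(seq, 3)
```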

  19. The Exponential Model for the Spectrum of a Time Series: Extensions and Applications

    DEFF Research Database (Denmark)

    Proietti, Tommaso; Luati, Alessandra

    The exponential model for the spectrum of a time series and its fractional extensions are based on the Fourier series expansion of the logarithm of the spectral density. The coefficients of the expansion form the cepstrum of the time series. After deriving the cepstrum of important classes of time...... to the log-spectrum. We then propose two extensions. The first deals with replacing the logarithmic link with a more general Box-Cox link, which encompasses also the identity and the inverse links: this enables nesting alternative spectral estimation methods (autoregressive, exponential, etc.) under the same...
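    The cepstrum of an observed series can be computed as the inverse Fourier transform of the log-periodogram; a minimal sketch (demeaning and the log(0) guard are implementation choices, not from the source):

```python
import numpy as np

def cepstrum(x):
    """Empirical cepstrum: inverse Fourier transform of the
    log-periodogram.  The exponential spectral model truncates this
    expansion after a small number of cepstral coefficients."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    periodogram = np.abs(np.fft.fft(x - x.mean())) ** 2 / n
    periodogram[periodogram < 1e-12] = 1e-12   # guard against log(0)
    return np.real(np.fft.ifft(np.log(periodogram)))
```

For white noise the log-periodogram fluctuates around log(sigma^2) minus the Euler-Mascheroni constant, so only the zeroth cepstral coefficient is appreciably nonzero.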

  20. Electronic resource management practical perspectives in a new technical services model

    CERN Document Server

    Elguindi, Anne

    2012-01-01

    A significant shift is taking place in libraries, with the purchase of e-resources accounting for the bulk of materials spending. Electronic Resource Management makes the case that technical services workflows need to make a corresponding shift toward e-centric models and highlights the increasing variety of e-formats that are forcing new developments in the field.Six chapters cover key topics, including: technical services models, both past and emerging; staffing and workflow in electronic resource management; implementation and transformation of electronic resource management systems; the ro

  1. Characteristics of the LeRC/Hughes J-series 30-cm engineering model thruster

    Science.gov (United States)

    Collett, C. R.; Poeschel, R. L.; Kami, S.

    1981-01-01

    As a consequence of endurance and structural tests performed on 900-series engineering model thrusters (EMT), several modifications in design were found to be necessary for achieving performance goals. The modified thruster is known as the J-series EMT. The most important of the design modifications affect the accelerator grid, gimbal mount, cathode polepiece, and wiring harness. The paper discusses the design modifications incorporated, the condition(s) they corrected, and the characteristics of the modified thruster.

  2. Technical documentation of HGSYSTEM/UF{sub 6} model

    Energy Technology Data Exchange (ETDEWEB)

    Hanna, S.R.; Chang, J.C.; Zhang, J.X. [Earth Technology Corp., Concord, MA (United States)

    1996-01-01

    MMES has been directed to upgrade the safety analyses for the gaseous diffusion plants at Paducah KY and Piketon OH. These will require assessment of the consequences of accidental releases of UF{sub 6} to the atmosphere at these plants. The HGSYSTEM model has been chosen as the basis for evaluating UF{sub 6} releases; it includes dispersion algorithms for dense gases and treats the chemistry and thermodynamics of HF, a major product of the reaction of UF{sub 6} with water vapor in air. The objective of this project was to incorporate additional capability into HGSYSTEM: UF{sub 6} chemistry and thermodynamics, plume lift-off algorithms, and wet and dry deposition. The HGSYSTEM modules are discussed. The hybrid HGSYSTEM/UF{sub 6} model has been evaluated in three ways.

  4. The electricity portfolio simulation model (EPSim) technical description.

    Energy Technology Data Exchange (ETDEWEB)

    Drennen, Thomas E.; Klotz, Richard (Hobart and William Smith Colleges, Geneva, NY)

    2005-09-01

    Stakeholders often have competing interests when selecting or planning new power plants. The purpose of developing this preliminary Electricity Portfolio Simulation Model (EPSim) is to provide a first-cut, dynamic methodology and approach to this problem, which can subsequently be refined and validated, and which may help energy planners, policy makers, and energy students better understand the tradeoffs associated with competing electricity portfolios. EPSim allows the user to explore competing electricity portfolios annually from 2002 to 2025 in terms of five different criteria: cost, environmental impacts, energy dependence, health and safety, and sustainability. Four additional criteria (infrastructure vulnerability, service limitations, policy needs, and science and technology needs) may be added in future versions of the model. Using an analytic hierarchy process (AHP) approach, users or groups of users apply weights to each of the criteria. The default energy assumptions of the model mimic the Department of Energy's (DOE) electricity portfolio to 2025 (EIA, 2005). At any time, the user can compare alternative portfolios to this reference case portfolio.
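    The AHP weighting step can be sketched as extracting the principal eigenvector of a pairwise-comparison matrix; the iteration count and the example judgments in the test are illustrative, not from the model:

```python
import numpy as np

def ahp_weights(pairwise):
    """Criteria weights from an AHP pairwise-comparison matrix via its
    principal eigenvector, computed by power iteration and normalized
    to sum to 1.  The 100-iteration cap is an illustrative choice."""
    A = np.asarray(pairwise, dtype=float)
    w = np.ones(A.shape[0]) / A.shape[0]
    for _ in range(100):
        w = A @ w
        w /= w.sum()
    return w
```

For a perfectly consistent judgment matrix (A[i, j] = w[i] / w[j]) the iteration recovers the underlying weights exactly.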

  5. Fluctuation complexity of agent-based financial time series model by stochastic Potts system

    Science.gov (United States)

    Hong, Weijia; Wang, Jun

    2015-03-01

    The financial market is a complex, evolving dynamic system with high volatility and noise, and the modeling and analysis of financial time series are regarded as rather challenging tasks in financial research. In this work, by applying the Potts dynamic system, a random agent-based financial time series model is developed in an attempt to uncover empirical laws in finance, where the Potts model is introduced to imitate the trading interactions among investing agents. Based on computer simulation in conjunction with statistical analysis and nonlinear analysis, we present numerical research to investigate the fluctuation behaviors of the proposed time series model. Furthermore, in order to reach a robust conclusion, we consider the daily returns of the Shanghai Composite Index and the Shenzhen Component Index, and a comparison analysis of return behaviors between the simulated data and the actual data is exhibited.

  6. Modelling Changes in the Unconditional Variance of Long Stock Return Series

    DEFF Research Database (Denmark)

    Amado, Cristina; Teräsvirta, Timo

    show that the long-memory property in volatility may be explained by ignored changes in the unconditional variance of the long series. Finally, based on a formal statistical test we find evidence of the superiority of volatility forecast accuracy of the new model over the GJR-GARCH model at all......In this paper we develop a testing and modelling procedure for describing the long-term volatility movements over very long return series. For the purpose, we assume that volatility is multiplicatively decomposed into a conditional and an unconditional component as in Amado and Teräsvirta (2011...

  7. Modelling changes in the unconditional variance of long stock return series

    DEFF Research Database (Denmark)

    Amado, Cristina; Teräsvirta, Timo

    2014-01-01

    that the apparent long memory property in volatility may be interpreted as changes in the unconditional variance of the long series. Finally, based on a formal statistical test we find evidence of the superiority of volatility forecasting accuracy of the new model over the GJR-GARCH model at all horizons for eight......In this paper we develop a testing and modelling procedure for describing the long-term volatility movements over very long daily return series. For this purpose we assume that volatility is multiplicatively decomposed into a conditional and an unconditional component as in Amado and Teräsvirta...

  8. Image reconstruction method for electrical capacitance tomography based on the combined series and parallel normalization model

    International Nuclear Information System (INIS)

    Dong, Xiangyuan; Guo, Shuqing

    2008-01-01

    In this paper, a novel image reconstruction method for electrical capacitance tomography (ECT) based on the combined series and parallel model is presented. A regularization technique is used to obtain a stabilized solution of the inverse problem. Also, the adaptive coefficient of the combined model is deduced by numerical optimization. Simulation results indicate that it can produce higher quality images when compared to the algorithm based on the parallel or series models for the cases tested in this paper. It provides a new algorithm for ECT application
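    The parallel and series normalization models, and their weighted combination, can be written directly; the fixed weight 0.5 below is a placeholder for the adaptive coefficient the paper obtains by numerical optimization:

```python
def normalized_capacitance(cm, cl, ch, p=0.5):
    """Normalized capacitance for ECT under the parallel model, the
    series model, and their weighted combination.

    cl, ch: capacitances with the sensing region filled entirely with
    low-/high-permittivity material; cm: measured capacitance; p: the
    combination weight (adaptive in the paper, fixed here)."""
    lam_parallel = (cm - cl) / (ch - cl)
    lam_series = (1.0 / cm - 1.0 / cl) / (1.0 / ch - 1.0 / cl)
    return p * lam_parallel + (1.0 - p) * lam_series
```

Both component models map cm = cl to 0 and cm = ch to 1, so any convex combination does too.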

  9. Biomedical time series clustering based on non-negative sparse coding and probabilistic topic model.

    Science.gov (United States)

    Wang, Jin; Liu, Ping; F H She, Mary; Nahavandi, Saeid; Kouzani, Abbas

    2013-09-01

    Biomedical time series clustering that groups a set of unlabelled temporal signals according to their underlying similarity is very useful for biomedical records management and analysis, such as biosignal archiving and diagnosis. In this paper, a new framework for clustering of long-term biomedical time series such as electrocardiography (ECG) and electroencephalography (EEG) signals is proposed. Specifically, local segments extracted from the time series are projected as a combination of a small number of basis elements in a trained dictionary by non-negative sparse coding. A Bag-of-Words (BoW) representation is then constructed by summing up all the sparse coefficients of local segments in a time series. Based on the BoW representation, a probabilistic topic model that was originally developed for text document analysis is extended to discover the underlying similarity of a collection of time series. The underlying similarity of biomedical time series is well captured owing to the statistical nature of the probabilistic topic model. Experiments on three datasets constructed from publicly available EEG and ECG signals demonstrate that the proposed approach achieves better accuracy than existing state-of-the-art methods, and is insensitive to model parameters such as the length of local segments and dictionary size. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
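    The BoW construction step described above is a simple aggregation: the BoW vector is the atom-wise sum of the non-negative sparse codes of a series' segments (the toy coefficient matrix in the test is invented):

```python
import numpy as np

def bow_representation(sparse_codes):
    """Bag-of-Words vector for one time series: sum the non-negative
    sparse coefficients of all its local segments over the dictionary
    atoms.  `sparse_codes` has shape (n_segments, n_atoms)."""
    codes = np.asarray(sparse_codes, dtype=float)
    return codes.sum(axis=0)
```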

  10. Animal model for endoscopic neurosurgical training: technical note.

    Science.gov (United States)

    Fernandez-Miranda, J C; Barges-Coll, J; Prevedello, D M; Engh, J; Snyderman, C; Carrau, R; Gardner, P A; Kassam, A B

    2010-10-01

    The learning curve for endonasal endoscopic and neuroendoscopic port surgery is long and often associated with an increase in complication rates as surgeons gain experience. We present an animal model for laboratory training aimed at encouraging the younger generation of neurosurgeons to pursue proficiency in endoscopic neurosurgical techniques. 20 Wistar rats were used as models. The animals were introduced into a physical trainer with multiple ports to carry out fully endoscopic microsurgical procedures. The vertical and horizontal dimensions of the paired ports (simulated nostrils) were: 35×20 mm, 35×15 mm, 25×15 mm, and 25×10 mm. 2 additional single 11.5 mm endoscopic ports were added. Surgical depth varied as desired between 8 and 15 cm. The cervical and abdominal regions were the focus of the endoscopic microsurgical exercises. The different endoscopic neurosurgical techniques were effectively trained at the millimetric scale. Levels of progressive surgical difficulty, depending on the endoneurosurgical skill set needed for a particular exercise, were distinguished: LEVEL 1 is soft-tissue microdissection (exposure of the cervical muscular plane and retroperitoneal space); LEVEL 2 is soft-tissue-vascular and vascular-capsule microdissection (aorto-caval exposure, carotid sheath opening, external jugular vein isolation); LEVEL 3 is artery-nerve microdissection (carotid-vagal separation); LEVEL 4 is artery-vein microdissection (aorto-caval separation); and LEVEL 5 is vascular repair and microsuturing (aortic rupture), which revealed the lack of proper instrumentation currently available. The animal training model presented here has the potential to shorten the learning curve in endonasal endoscopic and neuroendoscopic port surgery and reduce the incidence of training-related surgical complications. © Georg Thieme Verlag KG Stuttgart · New York.

  11. The partial duration series method in regional index-flood modeling

    DEFF Research Database (Denmark)

    Madsen, Henrik; Rosbjerg, Dan

    1997-01-01

    A regional index-flood method based on the partial duration series model is introduced. The model comprises the assumptions of a Poisson-distributed number of threshold exceedances and generalized Pareto (GP) distributed peak magnitudes. The regional T-year event estimator is based on a regional...
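
The Poisson-exceedance/generalized-Pareto structure described above can be sketched in a few lines. The method-of-moments GPD fit and the return-level formula x_T = u + (sigma/xi)[(lam*T)^xi - 1] are standard for peaks-over-threshold analysis; the threshold, record length, and synthetic data below are illustrative, not taken from the paper.

```python
import math
import random
import statistics

def gpd_mom(exceedances):
    """Method-of-moments estimates of generalized Pareto shape (xi)
    and scale (sigma) from threshold exceedances."""
    m = statistics.mean(exceedances)
    v = statistics.variance(exceedances)
    xi = 0.5 * (1.0 - m * m / v)
    sigma = m * (1.0 - xi)
    return xi, sigma

def t_year_event(u, lam, xi, sigma, T):
    """T-year event for the Poisson/GP partial duration series model:
    x_T = u + (sigma / xi) * ((lam * T) ** xi - 1)."""
    if abs(xi) < 1e-9:  # exponential limit of the GP distribution
        return u + sigma * math.log(lam * T)
    return u + (sigma / xi) * ((lam * T) ** xi - 1.0)

# Synthetic peaks-over-threshold record: 120 exceedances in 40 years.
random.seed(1)
u = 100.0  # threshold (e.g., m^3/s)
exceedances = [random.expovariate(1.0 / 25.0) for _ in range(120)]
lam = len(exceedances) / 40.0  # mean annual number of exceedances
xi, sigma = gpd_mom(exceedances)
x100 = t_year_event(u, lam, xi, sigma, T=100)
```

A regional index-flood version would estimate the dimensionless growth curve from pooled regional data and rescale it by an at-site index flood.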

  12. A new model for reliability optimization of series-parallel systems with non-homogeneous components

    International Nuclear Information System (INIS)

    Feizabadi, Mohammad; Jahromi, Abdolhamid Eshraghniaye

    2017-01-01

In discussions of reliability optimization using redundancy allocation, one structure that has attracted the attention of many researchers is the series-parallel structure. Models previously presented for reliability optimization of series-parallel systems carry a restrictive assumption that all components of a subsystem must be homogeneous. This constraint limits system designers in selecting components and prevents achieving higher levels of reliability. In this paper, a new model is proposed for reliability optimization of series-parallel systems, which makes possible the use of non-homogeneous components in each subsystem. As a result of this flexibility, the process of supplying system components becomes easier. Since the redundancy allocation problem (RAP) belongs to the NP-hard class of optimization problems, a genetic algorithm (GA) is developed to solve the proposed model. The computational results of the designed GA indicate the high performance of the proposed model in increasing system reliability and decreasing costs. - Highlights: • In this paper, a new model is proposed for reliability optimization of series-parallel systems. • Previous models carry a restrictive assumption that all components of a subsystem must be homogeneous. • The presented model allows the subsystems' components to be non-homogeneous where required. • The computational results demonstrate the high performance of the proposed model in improving reliability and reducing costs.
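
The gain from allowing non-homogeneous redundant components follows directly from the product rules for parallel redundancy and series arrangement. A minimal sketch (the component reliabilities are illustrative, not from the paper):

```python
def subsystem_reliability(component_reliabilities):
    """Parallel redundancy: the subsystem fails only if every
    component fails. Components need not be identical."""
    unrel = 1.0
    for r in component_reliabilities:
        unrel *= (1.0 - r)
    return 1.0 - unrel

def system_reliability(subsystems):
    """Series arrangement of parallel subsystems."""
    r = 1.0
    for comps in subsystems:
        r *= subsystem_reliability(comps)
    return r

# Homogeneous vs. mixed (non-homogeneous) second subsystem:
homogeneous = [[0.9, 0.9], [0.8, 0.8]]
mixed = [[0.9, 0.9], [0.8, 0.95]]  # a different component type added
print(round(system_reliability(homogeneous), 4))  # prints 0.9504
print(round(system_reliability(mixed), 4))        # prints 0.9801
```

A GA for the RAP would search over such component-type/redundancy assignments under cost and weight constraints, with this product formula as the fitness evaluation.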

  13. Applying ARIMA model for annual volume time series of the Magdalena River

    Directory of Open Access Journals (Sweden)

    Gloria Amaris

    2017-04-01

Conclusions: The simulated results obtained with the ARIMA model showed a fairly good fit to the minimum and maximum magnitudes of the observed data. This supports the conclusion that the model is a good tool for estimating minimum and maximum volumes, even though it is not capable of simulating the exact behaviour of an annual volume time series.
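
As a hedged illustration of this kind of model, an ARIMA-style fit can be reduced to first differencing plus a least-squares AR(1) on the differences, i.e. an ARIMA(1,1,0) without the full Box-Jenkins identification step; the annual volumes below are synthetic, not the Magdalena River data.

```python
def difference(x):
    """First differences of a series."""
    return [b - a for a, b in zip(x, x[1:])]

def fit_ar1(z):
    """Least-squares AR(1) coefficient through the origin."""
    num = sum(a * b for a, b in zip(z, z[1:]))
    den = sum(a * a for a in z[:-1])
    return num / den

def forecast_next(x):
    """One-step ARIMA(1,1,0)-style forecast:
    difference, fit AR(1), forecast the difference, integrate."""
    z = difference(x)
    phi = fit_ar1(z)
    return x[-1] + phi * z[-1]

volumes = [210.0, 198.0, 230.0, 225.0, 240.0, 236.0, 250.0]
print(round(forecast_next(volumes), 2))  # prints 242.82
```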

  14. Development of Simulink-Based SiC MOSFET Modeling Platform for Series Connected Devices

    DEFF Research Database (Denmark)

    Tsolaridis, Georgios; Ilves, Kalle; Reigosa, Paula Diaz

    2016-01-01

A new MATLAB/Simulink-based modeling platform has been developed for SiC MOSFET power modules. The modeling platform describes the electrical behavior of a single 1.2 kV/350 A SiC MOSFET power module, as well as the series connection of two of them. A fast parameter initialization is followed

  15. Algorithms for global total least squares modelling of finite multivariable time series

    NARCIS (Netherlands)

    Roorda, Berend

    1995-01-01

    In this paper we present several algorithms related to the global total least squares (GTLS) modelling of multivariable time series observed over a finite time interval. A GTLS model is a linear, time-invariant finite-dimensional system with a behaviour that has minimal Frobenius distance to a given

  16. Model technical and tactical training karate «game» manner of conducting a duel

    Directory of Open Access Journals (Sweden)

    Natalya Boychenko

    2015-04-01

Full Text Available Purpose: to optimize the technical and tactical training of karate fighters with a «game» manner of conducting a duel. Material and Methods: analysis and compilation of the scientific and methodological literature, interviews with coaches of striking combat sports, video analysis of techniques, and pedagogical observations. Results: a model of technical and tactical training for karate fighters with a «game» manner of conducting a duel was developed. Conclusion: the developed model of technical and tactical training for fighters with a «game» manner of conducting a duel reveals the characteristic combination techniques of the «Kyokushin» karate style. Sets of tasks and matched techniques were selected to improve athletes' «game» manner of conducting a duel in «Kyokushin» karate, aimed at improving combinations that act on the opponent's response.

  17. Fuel Behaviour and Modelling under Severe Transient and Loss of Coolant Accident (LOCA) Conditions. Proceedings of a Technical Meeting

    International Nuclear Information System (INIS)

    2013-06-01

In recent years the demands on 'fuel duties' have increased, including transient regimes, higher burnups and longer fuel cycles. To satisfy these demands, fuel vendors have developed and introduced new cladding and fuel material designs to provide sufficient margins for safe operation of the fuel components. National and international experimental programmes have been launched, and models have been developed or adapted to take into account the changed conditions. These developments enable water cooled reactors, which contribute about 95% of the nuclear power in the world today, to operate safely under all operating conditions; moreover, even under severe transient or accident conditions, such as reactivity initiated accidents (RIAs) or loss of coolant accidents (LOCAs), the behaviour of the fuel can be adequately predicted and the consequences of such events can be safely contained. In 2010 the IAEA Technical Working Group on Fuel Performance and Technology (TWGFPT) recommended that a technical meeting on 'Fuel Behaviour and Modelling under Severe Transient and LOCA Conditions' be held in Japan. The accident at the Fukushima Daiichi nuclear power plant in March 2011 highlighted the need to address this subject, and despite the difficult situation in Japan at the time, the recommended plan was confirmed, and the Japan Atomic Energy Agency (JAEA) hosted the technical meeting in Mito, Ibaraki Prefecture, Japan, from 18 to 21 October 2011. This meeting was the eighth in a series of IAEA meetings, which reflects Member States' continuing interest in the above issues. The previous meetings were held in 1980 (jointly with the OECD Nuclear Energy Agency, Helsinki, Finland), 1983 (Risø, Denmark), 1986 (Vienna, Austria), 1988 (Preston, United Kingdom), 1992 (Pembroke, Canada), 1995 (Dimitrovgrad, Russian Federation) and 2001 (Halden, Norway).
The purpose of the technical meeting was to provide a forum for international experts to review the current situation and the state of

  18. Dynamic modeling and simulation of a two-stage series-parallel vibration isolation system

    Directory of Open Access Journals (Sweden)

    Rong Guo

    2016-07-01

Full Text Available Two-stage series-parallel vibration isolation systems are widely used in various industrial fields. However, when researchers analyze the vibration characteristics of a mechanical system, the system is usually regarded as a single-stage one composed of two substructures. The dynamic modeling of a two-stage series-parallel vibration isolation system using the frequency response function-based substructuring method has not been studied. Therefore, this article presents the source-path-receiver model and the substructure property identification model of such a system. These two models make up the transfer path model of the system, which is implemented in MATLAB. To verify the proposed transfer path model, a finite element model simulating a vehicle system, which is a typical two-stage series-parallel vibration isolation system, is developed. The substructure frequency response functions and system-level frequency response functions can be obtained with MSC Patran/Nastran and LMS Virtual.lab based on the finite element model. Next, the system-level frequency response functions are substituted into the transfer path model to predict the substructural frequency response functions, and the system response of the coupled structure can then be further calculated. By comparing the predicted results with the exact values, the model proves to be correct. Finally, random noise is introduced into several relevant system-level frequency response functions for error sensitivity analysis. The system-level frequency response functions that are most sensitive to the random error are identified. Since the two-stage series-parallel system has not been well studied, the proposed transfer path model improves the dynamic theory of multi-stage vibration isolation systems. Moreover, the validation process of the model here provides an example of acoustic and vibration transfer path analysis based on the proposed model. And it is worth noting that the

  19. New insights into soil temperature time series modeling: linear or nonlinear?

    Science.gov (United States)

    Bonakdari, Hossein; Moeeni, Hamid; Ebtehaj, Isa; Zeynoddin, Mohammad; Mahoammadian, Abdolmajid; Gharabaghi, Bahram

    2018-03-01

Soil temperature (ST) is an important dynamic parameter, whose prediction is a major research topic in various fields including agriculture because ST has a critical role in hydrological processes at the soil surface. In this study, a new linear methodology is proposed based on stochastic methods for modeling daily soil temperature (DST). With this approach, the ST series components are determined to carry out modeling and spectral analysis. The results of this process are compared with two linear methods based on seasonal standardization and seasonal differencing in terms of four DST series. The series used in this study were measured at two stations, Champaign and Springfield, at depths of 10 and 20 cm. The results indicate that in all ST series reviewed, the periodic term is the most robust among all components. According to a comparison of the three methods applied to analyze the various series components, it appears that spectral analysis combined with stochastic methods outperformed the seasonal standardization and seasonal differencing methods. In addition to comparing the proposed methodology with linear methods, the ST modeling results were compared with the two nonlinear methods in two forms: considering hydrological variables (HV) as input variables and DST modeling as a time series. In a previous study at the mentioned sites, Kim and Singh (Theor Appl Climatol 118:465-479, 2014) applied the popular Multilayer Perceptron (MLP) neural network and Adaptive Neuro-Fuzzy Inference System (ANFIS) nonlinear methods and considered HV as input variables. The comparison results signify that the relative error projected in estimating DST by the proposed methodology was about 6%, while this value with MLP and ANFIS was over 15%. Moreover, MLP and ANFIS models were employed for DST time series modeling.
Due to these models' relatively inferior performance to the proposed methodology, two hybrid models were implemented: the weights and membership function of MLP and
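
The seasonal standardization method compared above removes the periodic mean and standard deviation before a stochastic model is fitted. A minimal sketch with a toy four-season series (the data and period are illustrative, not the Champaign/Springfield records):

```python
import statistics

def seasonal_standardize(x, period):
    """Seasonal standardization: z_t = (x_t - mu_s) / sigma_s,
    where s = t mod period indexes the season."""
    mu, sd = {}, {}
    for s in range(period):
        vals = x[s::period]
        mu[s] = statistics.mean(vals)
        sd[s] = statistics.pstdev(vals)
    return [(x[t] - mu[t % period]) / sd[t % period] for t in range(len(x))]

# Three "years" of a four-season toy temperature series (deg C):
st = [5.0, 15.0, 25.0, 12.0,
      6.0, 16.0, 26.0, 13.0,
      4.0, 14.0, 24.0, 11.0]
z = seasonal_standardize(st, period=4)
```

After standardization each season of the transformed series has zero mean and unit standard deviation, so a single stationary stochastic model can be fitted to z.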

  20. An extension of Box-Jenkins transfer/noise models for spatial interpolation of groundwater head series

    NARCIS (Netherlands)

    Geer, F.C. van; Zuur, A.F.

    1997-01-01

    This paper advocates an approach to extend single-output Box-Jenkins transfer/noise models for several groundwater head series to a multiple-output transfer/noise model. The approach links several groundwater head series and enables a spatial interpolation in terms of time series analysis. Our

  1. Effective low-order models for atmospheric dynamics and time series analysis.

    Science.gov (United States)

    Gluhovsky, Alexander; Grady, Kevin

    2016-02-01

The paper focuses on two interrelated problems: developing physically sound low-order models (LOMs) for atmospheric dynamics and employing them as novel time-series models to overcome deficiencies in current atmospheric time series analysis. The first problem is warranted since arbitrary truncations in the Galerkin method (commonly used to derive LOMs) may result in LOMs that violate fundamental conservation properties of the original equations, causing unphysical behaviors such as unbounded solutions. In contrast, the LOMs we offer (G-models) are energy conserving, and some retain the Hamiltonian structure of the original equations. This work examines LOMs from recent publications to show that all of those that are physically sound can be converted to G-models, while those that cannot be converted lack energy conservation. Further, motivated by recent progress in statistical properties of dynamical systems, we explore G-models for a new role of atmospheric time series models, as their data generating mechanisms are well in line with atmospheric dynamics. Currently used time series models, however, do not specifically utilize the physics of the governing equations and involve strong statistical assumptions rarely met in real data.

  2. A Seasonal Time-Series Model Based on Gene Expression Programming for Predicting Financial Distress

    Directory of Open Access Journals (Sweden)

    Ching-Hsue Cheng

    2018-01-01

Full Text Available Financial distress prediction is an important and challenging research topic in the financial field. Many methods for predicting firm bankruptcy and financial crisis now exist, including artificial intelligence and traditional statistical methods, and past studies have shown that artificial intelligence methods predict better than traditional statistical methods. Financial statements are quarterly reports; hence, the financial crisis of companies is seasonal time-series data, and the attribute data affecting the financial distress of companies is nonlinear and nonstationary time-series data with fluctuations. Therefore, this study employed a nonlinear attribute selection method to build a nonlinear financial distress prediction model: that is, this paper proposed a novel seasonal time-series gene expression programming model for predicting the financial distress of companies. The proposed model has several advantages, including the following: (i) unlike previous models, the proposed model incorporates the concept of time series; (ii) the proposed integrated attribute selection method can find the core attributes and reduce high-dimensional data; and (iii) the proposed model can generate rules and mathematical formulas of financial distress, providing references for investors and decision makers. The results show that the proposed method is better than the listed classifiers under three criteria; hence, the proposed model has competitive advantages in predicting the financial distress of companies.

  3. Nonlinear time series modeling and forecasting the seismic data of the Hindu Kush region

    Science.gov (United States)

    Khan, Muhammad Yousaf; Mittnik, Stefan

    2017-11-01

In this study, we extended the application of linear and nonlinear time series models in the field of earthquake seismology and examined the out-of-sample forecast accuracy of linear Autoregressive (AR), Autoregressive Conditional Duration (ACD), Self-Exciting Threshold Autoregressive (SETAR), Threshold Autoregressive (TAR), Logistic Smooth Transition Autoregressive (LSTAR), Additive Autoregressive (AAR), and Artificial Neural Network (ANN) models for seismic data of the Hindu Kush region. We also extended the previous studies by using Vector Autoregressive (VAR) and Threshold Vector Autoregressive (TVAR) models and compared their forecasting accuracy with the linear AR model. Unlike previous studies that typically consider threshold model specifications with an internal threshold variable, we specified these models with external transition variables and compared their out-of-sample forecasting performance with the linear benchmark AR model. The modeling results show that the time series models used in the present study are capable of capturing the dynamic structure present in the seismic data. The point forecast results indicate that the AR model generally outperforms the nonlinear models. However, in some cases, threshold models with external threshold variable specifications produce more accurate forecasts, indicating that the specification of threshold time series models is of crucial importance. For raw seismic data, the ACD model does not show an improved out-of-sample forecasting performance over the linear AR model. The results indicate that the AR model is the best forecasting device to model and forecast the raw seismic data of the Hindu Kush region.
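
The benchmark linear AR model and the rolling one-step out-of-sample forecasting scheme used in such comparisons can be sketched generically; this is an AR(2) least-squares fit on synthetic data, not the authors' exact specification.

```python
import random

def fit_ar2(x):
    """Least-squares AR(2) coefficients via the 2x2 normal equations
    (zero-mean series, no intercept)."""
    y, a, b = x[2:], x[1:-1], x[:-2]
    saa = sum(v * v for v in a)
    sbb = sum(v * v for v in b)
    sab = sum(u * v for u, v in zip(a, b))
    say = sum(u * v for u, v in zip(a, y))
    sby = sum(u * v for u, v in zip(b, y))
    det = saa * sbb - sab * sab
    return (say * sbb - sby * sab) / det, (sby * saa - say * sab) / det

def one_step_forecasts(train, test):
    """Fit on the training window, then produce rolling one-step
    out-of-sample forecasts over the test window."""
    phi1, phi2 = fit_ar2(train)
    hist, preds = list(train), []
    for obs in test:
        preds.append(phi1 * hist[-1] + phi2 * hist[-2])
        hist.append(obs)  # the observation becomes available next step
    return preds

# Synthetic AR(2) data: x_t = 0.5 x_{t-1} - 0.3 x_{t-2} + e_t
random.seed(42)
x = [0.0, 0.0]
for _ in range(300):
    x.append(0.5 * x[-1] - 0.3 * x[-2] + random.gauss(0.0, 1.0))
train, test = x[:250], x[250:]
preds = one_step_forecasts(train, test)
```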

  4. Nonlinear time series modeling and forecasting the seismic data of the Hindu Kush region

    Science.gov (United States)

    Khan, Muhammad Yousaf; Mittnik, Stefan

    2018-01-01

In this study, we extended the application of linear and nonlinear time series models in the field of earthquake seismology and examined the out-of-sample forecast accuracy of linear Autoregressive (AR), Autoregressive Conditional Duration (ACD), Self-Exciting Threshold Autoregressive (SETAR), Threshold Autoregressive (TAR), Logistic Smooth Transition Autoregressive (LSTAR), Additive Autoregressive (AAR), and Artificial Neural Network (ANN) models for seismic data of the Hindu Kush region. We also extended the previous studies by using Vector Autoregressive (VAR) and Threshold Vector Autoregressive (TVAR) models and compared their forecasting accuracy with the linear AR model. Unlike previous studies that typically consider threshold model specifications with an internal threshold variable, we specified these models with external transition variables and compared their out-of-sample forecasting performance with the linear benchmark AR model. The modeling results show that the time series models used in the present study are capable of capturing the dynamic structure present in the seismic data. The point forecast results indicate that the AR model generally outperforms the nonlinear models. However, in some cases, threshold models with external threshold variable specifications produce more accurate forecasts, indicating that the specification of threshold time series models is of crucial importance. For raw seismic data, the ACD model does not show an improved out-of-sample forecasting performance over the linear AR model. The results indicate that the AR model is the best forecasting device to model and forecast the raw seismic data of the Hindu Kush region.

  5. Rotary ATPases: models, machine elements and technical specifications.

    Science.gov (United States)

    Stewart, Alastair G; Sobti, Meghna; Harvey, Richard P; Stock, Daniela

    2013-01-01

Rotary ATPases are molecular rotary motors involved in biological energy conversion. They either synthesize or hydrolyze the universal biological energy carrier adenosine triphosphate. Recent work has elucidated the general architecture and subunit compositions of all three sub-types of rotary ATPases. Composite models of the intact F-, V- and A-type ATPases have been constructed by fitting high-resolution X-ray structures of individual subunits or sub-complexes into low-resolution electron densities of the intact enzymes derived from electron cryo-microscopy. Electron cryo-tomography has provided new insights into the supra-molecular arrangement of eukaryotic ATP synthases within mitochondria and mass-spectrometry has started to identify specifically bound lipids presumed to be essential for function. Taken together, these molecular snapshots show that nano-scale rotary engines have much in common with the basic design principles of man-made machines, from the function of individual "machine elements" to the requirement of the right "fuel" and "oil" for different types of motors.

  6. Modeling annual Coffee production in Ghana using ARIMA time series Model

    Directory of Open Access Journals (Sweden)

    E. Harris

    2013-07-01

Full Text Available In the international commodity trade, coffee, which represents the world’s most valuable tropical agricultural commodity, comes next to oil. Indeed, it is estimated that about 40 million people in the major producing countries in Africa derive their livelihood from coffee, with Africa accounting for about 12 per cent of global production. The paper applied an Autoregressive Integrated Moving Average (ARIMA) time series model to study the behavior of Ghana’s annual coffee production and to make five-year forecasts. Annual coffee production data from 1990 to 2010 were obtained from the Ghana Cocoa Board and analyzed using ARIMA. The results showed that, in general, the trend of Ghana’s total coffee production follows upward and downward movements. The best model arrived at on the basis of various diagnostics, selection and evaluation criteria was ARIMA (0,3,1). Finally, the forecast figures based on the Box-Jenkins method showed that Ghana’s annual coffee production will decrease continuously in the next five (5) years, all things being equal.

  7. Technical performance of percutaneous leads for spinal cord stimulation: a modeling study

    NARCIS (Netherlands)

    Manola, L.; Holsheimer, J.; Veltink, Petrus H.

    Objective. To compare the technical performance of different percutaneous lead types for spinal cord stimulation. Methods. Using the UT-SCS software (University of Twente's spinal cord stimulation), lead models having similar characteristics such as the 3487A PISCES-Quad (PQ), 3887 PISCES-Quad

  8. Evaluation of the Curriculum of English Preparatory Classes at Yildiz Technical University Using CIPP Model

    Science.gov (United States)

    Akpur, Ugur; Alci, Bülent; Karatas, Hakan

    2016-01-01

    The purpose of this study is to evaluate the instruction program of preparatory classes at Yildiz Technical University using CIPP model. A total of 54 teachers and 753 university students attending preparatory classes in the Academic Year of 2014-2015 formed the study group. The research is based on a questionnaire applied to teachers and…

  9. A reference model and technical framework for mobile social software for learning

    NARCIS (Netherlands)

    De Jong, Tim; Specht, Marcus; Koper, Rob

    2008-01-01

    De Jong,T., Specht, M., & Koper, R. (2008). A reference model and technical framework for mobile social software for learning. In I. A. Sánchez & P. Isaías (Eds.), Proceedings of the IADIS Mobile Learning Conference 2008 (pp. 206-210). April, 11-13, 2008, Carvoeiro, Portugal.

  10. AIDS: An ICT Model for Integrating Teaching, Learning and Research in Technical University Education in Ghana

    Science.gov (United States)

    Asabere, Nana; Togo, Gilbert; Acakpovi, Amevi; Torby, Wisdom; Ampadu, Kwame

    2017-01-01

    Information and Communication Technologies (ICT) has changed the way we communicate and carry out certain daily activities. Globally, ICT has become an essential means for disseminating information. Using Accra Technical University in Ghana as a case study, this paper proposes an ICT model called Awareness Incentives Demand and Support (AIDS). Our…

  11. Bayesian dynamic modeling of time series of dengue disease case counts.

    Science.gov (United States)

    Martínez-Bello, Daniel Adyro; López-Quílez, Antonio; Torres-Prieto, Alexander

    2017-07-01

The aim of this study is to model the association between weekly time series of dengue case counts and meteorological variables in a high-incidence city of Colombia, applying Bayesian hierarchical dynamic generalized linear models over the period January 2008 to August 2015. Additionally, we evaluate the model's short-term performance for predicting dengue cases. The methodology uses dynamic Poisson log-link models including constant or time-varying coefficients for the meteorological variables. Calendar effects were modeled using constant or first- or second-order random walk time-varying coefficients. The meteorological variables were modeled using constant coefficients and first-order random walk time-varying coefficients. We applied Markov chain Monte Carlo simulations for parameter estimation, and the deviance information criterion statistic (DIC) for model selection. We assessed the short-term predictive performance of the selected final model at several time points within the study period using the mean absolute percentage error. The results showed that the best model includes first-order random walk time-varying coefficients for the calendar trend and for the meteorological variables. Besides the computational challenges, interpreting the results implies a complete analysis of the time series of dengue with respect to the parameter estimates of the meteorological effects. We found small values of the mean absolute percentage errors at one- or two-week out-of-sample predictions for most prediction points, associated with low-volatility periods in the dengue counts. We discuss the advantages and limitations of the dynamic Poisson models for studying the association between time series of dengue disease and meteorological variables. The key conclusion of the study is that dynamic Poisson models account for the dynamic nature of the variables involved in the modeling of time series of dengue disease, producing useful
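
Full Bayesian estimation (MCMC over the random-walk coefficients) is beyond a short sketch, but the data-generating process of a dynamic Poisson log-link model with a first-order random-walk coefficient can be simulated directly; all parameter values and the weekly anomaly series below are illustrative.

```python
import math
import random

random.seed(7)

def simulate_dynamic_poisson(anomalies, beta0=0.05, sigma_rw=0.005, intercept=1.0):
    """Data-generating process of a dynamic Poisson log-link model:
    log mu_t = intercept + beta_t * anomaly_t, where beta_t follows a
    first-order random walk (the time-varying coefficient above)."""
    beta, counts, betas = beta0, [], []
    for a in anomalies:
        beta += random.gauss(0.0, sigma_rw)  # random-walk evolution
        mu = math.exp(intercept + beta * a)
        # Poisson draw by inversion (adequate for small mu)
        u, k, p = random.random(), 0, math.exp(-mu)
        c = p
        while u > c:
            k += 1
            p *= mu / k
            c += p
        counts.append(k)
        betas.append(beta)
    return counts, betas

# Two years of weekly temperature anomalies (deg C, illustrative):
anomalies = [4.0 * math.sin(2.0 * math.pi * t / 52.0) for t in range(104)]
counts, betas = simulate_dynamic_poisson(anomalies)
```

Fitting would invert this generative story, e.g. with MCMC as in the paper, to recover the latent beta_t path from the observed counts.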

  12. Bayesian dynamic modeling of time series of dengue disease case counts.

    Directory of Open Access Journals (Sweden)

    Daniel Adyro Martínez-Bello

    2017-07-01

Full Text Available The aim of this study is to model the association between weekly time series of dengue case counts and meteorological variables in a high-incidence city of Colombia, applying Bayesian hierarchical dynamic generalized linear models over the period January 2008 to August 2015. Additionally, we evaluate the model's short-term performance for predicting dengue cases. The methodology uses dynamic Poisson log-link models including constant or time-varying coefficients for the meteorological variables. Calendar effects were modeled using constant or first- or second-order random walk time-varying coefficients. The meteorological variables were modeled using constant coefficients and first-order random walk time-varying coefficients. We applied Markov chain Monte Carlo simulations for parameter estimation, and the deviance information criterion statistic (DIC) for model selection. We assessed the short-term predictive performance of the selected final model at several time points within the study period using the mean absolute percentage error. The results showed that the best model includes first-order random walk time-varying coefficients for the calendar trend and for the meteorological variables. Besides the computational challenges, interpreting the results implies a complete analysis of the time series of dengue with respect to the parameter estimates of the meteorological effects. We found small values of the mean absolute percentage errors at one- or two-week out-of-sample predictions for most prediction points, associated with low-volatility periods in the dengue counts. We discuss the advantages and limitations of the dynamic Poisson models for studying the association between time series of dengue disease and meteorological variables. The key conclusion of the study is that dynamic Poisson models account for the dynamic nature of the variables involved in the modeling of time series of dengue disease

  13. Slow crack growth modeling of a technical ferrite ceramics

    International Nuclear Information System (INIS)

    Romero de la Osa, M.

    2009-04-01

Iron oxide ferrite ceramics are subject to slow crack growth (SCG) and environmentally assisted failure, similarly to what is observed for amorphous silica and alumina polycrystals. The kinetics of fracture are known to depend on the load level, with a crack velocity V that increases with K_I, but also on temperature and relative humidity (RH). In addition, SCG as represented by V-K diagrams is noticeably sensitive to microstructural effects such as variations in grain size, and is also influenced by the presence of some porosity at the triple junctions. The ferrites under consideration exhibit a heterogeneous microstructure with a distribution of grain sizes, with some regions in which pores are present at the triple junctions. Such a microstructure results in noticeable scatter in the measured V-K characteristics from sample to sample, so that lifetime predictions based on these experiments are not reliable. Thus, additional analyses based on numerical simulations of SCG are necessary to gain insight into the material's durability. We have developed a local description of SCG at the length scale of the microstructure, which is explicitly accounted for. Within a cohesive zone methodology, and based on available physics and on recent atomistic results, we propose a viscoplastic cohesive model that mimics the reaction-rupture mechanism underlying time-dependent failure. The description is shown to capture variations in the V-K predictions in agreement with the observations. From the simulations of intergranular failure under static fatigue, we observe a discontinuous crack advance in time, with different crack velocities depending on the local crack path. The crossing of a triple junction slows down crack propagation and ultimately governs the average crack velocity. We show that accounting for the initial stresses originating from cooling from the sintering temperature

  14. Wavelet Network Model Based on Multiple Criteria Decision Making for Forecasting Temperature Time Series

    OpenAIRE

    Zhang, Jian; Yang, Xiao-hua; Chen, Xiao-juan

    2015-01-01

Due to the nonlinear and multiscale characteristics of temperature time series, a new model called the wavelet network model based on multiple criteria decision making (WNMCDM) has been proposed, which combines the advantages of wavelet analysis, multiple criteria decision making, and artificial neural networks. A case study forecasting the extreme monthly maximum temperature of Miyun Reservoir was conducted to examine the performance of the WNMCDM model. Compared with nearest neighbor bootstrapping regr...

  15. Forecasting electricity spot-prices using linear univariate time-series models

    International Nuclear Information System (INIS)

    Cuaresma, Jesus Crespo; Hlouskova, Jaroslava; Kossmeier, Stephan; Obersteiner, Michael

    2004-01-01

    This paper studies the forecasting abilities of a battery of univariate models on hourly electricity spot prices, using data from the Leipzig Power Exchange. The specifications studied include autoregressive models, autoregressive-moving average models and unobserved component models. The results show that specifications, where each hour of the day is modelled separately present uniformly better forecasting properties than specifications for the whole time-series, and that the inclusion of simple probabilistic processes for the arrival of extreme price events can lead to improvements in the forecasting abilities of univariate models for electricity spot prices. (Author)
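
The paper's finding that modelling each hour of the day separately beats a single whole-series specification can be illustrated with the simplest possible per-hour model, a mean forecaster per hour versus one global mean; the noise-free synthetic prices with a daily shape below are purely illustrative.

```python
import math
import statistics

def per_hour_means(prices):
    """Model each hour of the day separately: the simplest such
    specification is one mean forecaster per hour (24 sub-models)."""
    return {h: statistics.mean(prices[h::24]) for h in range(24)}

# Synthetic hourly spot prices with a strong daily shape:
prices = [30.0 + 10.0 * math.sin(2.0 * math.pi * h / 24.0) for h in range(24 * 14)]

hourly = per_hour_means(prices[:24 * 7])        # 24 per-hour sub-models
global_mean = statistics.mean(prices[:24 * 7])  # single whole-series model

test_window = prices[24 * 7:]
mae_hourly = statistics.mean(abs(p - hourly[t % 24]) for t, p in enumerate(test_window))
mae_global = statistics.mean(abs(p - global_mean) for p in test_window)
```

With any pronounced daily profile, the per-hour sub-models capture the intraday shape that a single whole-series model averages away, which is the qualitative effect reported for the Leipzig data.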

  16. Short-Term Bus Passenger Demand Prediction Based on Time Series Model and Interactive Multiple Model Approach

    Directory of Open Access Journals (Sweden)

    Rui Xue

    2015-01-01

    Full Text Available Although bus passenger demand prediction has attracted increased attention during recent years, limited research has been conducted in the context of short-term passenger demand forecasting. This paper proposes an interactive multiple model (IMM) filter algorithm-based model to predict short-term passenger demand. After being aggregated into 15-min intervals, passenger demand data collected from a busy bus route over four months were used to generate time series. Considering that passenger demand exhibits various characteristics at different time scales, three time series were developed, namely weekly, daily, and 15-min time series. After correlation, periodicity, and stationarity analyses, time series models were constructed. In particular, the heteroscedasticity of the time series was explored to achieve better prediction performance. Finally, the IMM filter algorithm was applied to combine the individual forecasting models and dynamically predict passenger demand for the next interval. Different error indices were adopted for the analyses of individual and hybrid models. The performance comparison indicates that hybrid model forecasts are superior to individual ones in accuracy. Findings of this study are of theoretical and practical significance in bus scheduling.
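
    The IMM idea in this record, running several models in parallel and mixing them with dynamically updated model probabilities, can be caricatured in a few lines. In this sketch the toy counts, the two component forecasters, and the Gaussian error likelihood are my own simplifications, not the paper's bus-route models:

```python
import math

def gauss_like(err, sigma=1.0):
    """Gaussian likelihood of a one-step forecast error."""
    return math.exp(-0.5 * (err / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# Two hypothetical component forecasters for 15-min passenger counts:
naive = lambda hist: hist[-1]                          # last observation
drift = lambda hist: hist[-1] + (hist[-1] - hist[-2])  # local trend

demand = [30, 32, 35, 39, 44, 50]   # toy 15-min counts
w = [0.5, 0.5]                      # prior model probabilities

for t in range(2, len(demand)):
    hist, actual = demand[:t], demand[t]
    errs = [naive(hist) - actual, drift(hist) - actual]
    # Bayes-style reweighting by each model's one-step likelihood
    lik = [gauss_like(e, sigma=3.0) for e in errs]
    tot = sum(wi * li for wi, li in zip(w, lik))
    w = [wi * li / tot for wi, li in zip(w, lik)]

# Probability-weighted combined forecast for the next interval
combined = w[0] * naive(demand) + w[1] * drift(demand)
print([round(x, 3) for x in w], round(combined, 1))
```

    On this trending toy series the weight shifts almost entirely to the drift model, which is the mechanism by which an IMM-style hybrid adapts to regime changes in the data.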

  17. Integrating uncertainty in time series population forecasts: An illustration using a simple projection model

    Directory of Open Access Journals (Sweden)

    Guy J. Abel

    2013-12-01

    Full Text Available Background: Population forecasts are widely used for public policy purposes. Methods to quantify the uncertainty in forecasts tend to ignore model uncertainty and to be based on a single model. Objective: In this paper, we use Bayesian time series models to obtain future population estimates with associated measures of uncertainty. The models are compared based on Bayesian posterior model probabilities, which are then used to provide model-averaged forecasts. Methods: The focus is on a simple projection model with the historical data representing population change in England and Wales from 1841 to 2007. Bayesian forecasts to the year 2032 are obtained based on a range of models, including autoregression models, stochastic volatility models and random variance shift models. The computational steps to fit each of these models using the OpenBUGS software via R are illustrated. Results: We show that the Bayesian approach is adept in capturing multiple sources of uncertainty in population projections, including model uncertainty. The inclusion of non-constant variance improves the fit of the models and provides more realistic predictive uncertainty levels. The forecasting methodology is assessed through fitting the models to various truncated data series.

  18. Modelos de gestión de conflictos en serie de ficción televisiva (Conflict management models in television fiction series)

    Directory of Open Access Journals (Sweden)

    Yolanda Navarro-Abal

    2012-12-01

    Full Text Available Television fiction series sometimes generate an unreal vision of life, especially among young people, becoming a mirror in which they can see themselves reflected. The series become models of values, attitudes, skills and behaviours that tend to be imitated by some viewers. The aim of this study was to analyze the conflict management behavioural styles presented by the main characters of television fiction series. Thus, we evaluated the association between these styles and the age and sex of the main characters, as well as the nationality and genre of the fiction series. 16 fiction series were assessed by selecting two characters of both sexes from each series. We adapted the Rahim Organizational Conflict Inventory-II for observing and recording the data. The results show that there is no direct association between the conflict management behavioural styles presented in the drama series and the sex of the main characters. However, associations were found between these styles and the age of the characters and the genre of the fiction series.

  19. A meta-analysis of motivational interviewing process: Technical, relational, and conditional process models of change.

    Science.gov (United States)

    Magill, Molly; Apodaca, Timothy R; Borsari, Brian; Gaume, Jacques; Hoadley, Ariel; Gordon, Rebecca E F; Tonigan, J Scott; Moyers, Theresa

    2018-02-01

    In the present meta-analysis, we test the technical and relational hypotheses of Motivational Interviewing (MI) efficacy. We also propose an a priori conditional process model where heterogeneity of technical path effect sizes should be explained by interpersonal/relational (i.e., empathy, MI Spirit) and intrapersonal (i.e., client treatment seeking status) moderators. A systematic review identified k = 58 reports, describing 36 primary studies and 40 effect sizes (N = 3,025 participants). Statistical methods calculated the inverse variance-weighted pooled correlation coefficient for the therapist to client and the client to outcome paths across multiple target behaviors (i.e., alcohol use, other drug use, other behavior change). Therapist MI-consistent skills were correlated with more client change talk (r = .55, p < .001), and the technical hypothesis was supported. Specifically, proportion MI consistency was related to higher proportion change talk (r = .11, p = .004) and higher proportion change talk was related to reductions in risk behavior at follow-up (r = -.16, p < .001). Heterogeneity of technical hypothesis path effect sizes was partially explained by inter- and intrapersonal moderators. This meta-analysis provides additional support for the technical hypothesis of MI efficacy; future research on the relational hypothesis should occur in the field rather than in the context of clinical trials. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
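
    The pooled correlations in this record are inverse variance-weighted. A common way to do that is on Fisher's z scale, where the sampling variance of a transformed correlation is approximately 1/(n - 3). The study-level values below are hypothetical, not the meta-analysis's data:

```python
import math

# Hypothetical study-level correlations and sample sizes (not the paper's data).
studies = [(0.55, 120), (0.40, 80), (0.62, 200), (0.30, 60)]

# Fisher z-transform each r; inverse-variance weight w = n - 3.
zs = [(math.atanh(r), n - 3) for r, n in studies]
pooled_z = sum(z * w for z, w in zs) / sum(w for _, w in zs)
pooled_r = math.tanh(pooled_z)

# 95% CI on the z scale, back-transformed to the r scale.
se = 1 / math.sqrt(sum(w for _, w in zs))
ci = (math.tanh(pooled_z - 1.96 * se), math.tanh(pooled_z + 1.96 * se))
print(round(pooled_r, 3), tuple(round(c, 3) for c in ci))
```

    Working on the z scale keeps the weighted average well-behaved near |r| = 1 and gives an approximately normal sampling distribution for the interval.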

  20. Evaluating Technical Efficiency of Nursing Care Using Data Envelopment Analysis and Multilevel Modeling.

    Science.gov (United States)

    Min, Ari; Park, Chang Gi; Scott, Linda D

    2016-05-23

    Data envelopment analysis (DEA) is an advantageous non-parametric technique for evaluating relative efficiency of performance. This article describes use of DEA to estimate technical efficiency of nursing care and demonstrates the benefits of using multilevel modeling to identify characteristics of efficient facilities in the second stage of analysis. Data were drawn from LTCFocUS.org, a secondary database including nursing home data from the Online Survey Certification and Reporting System and Minimum Data Set. In this example, 2,267 non-hospital-based nursing homes were evaluated. Use of DEA with nurse staffing levels as inputs and quality of care as outputs allowed estimation of the relative technical efficiency of nursing care in these facilities. In the second stage, multilevel modeling was applied to identify organizational factors contributing to technical efficiency. Use of multilevel modeling avoided biased estimation of findings for nested data and provided comprehensive information on differences in technical efficiency among counties and states. © The Author(s) 2016.
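
    Full DEA solves one linear program per decision-making unit; in the degenerate single-input, single-output case the CCR efficiency score reduces to each unit's output/input ratio divided by the best observed ratio, which is enough to illustrate the idea. The facility numbers below are invented:

```python
# Hypothetical facilities: (nurse staffing hours as input, quality score as output).
facilities = {"A": (100, 80), "B": (120, 90), "C": (90, 81), "D": (150, 90)}

# With a single input and single output, CCR efficiency reduces to the
# output/input ratio normalised by the best ratio (full DEA needs an LP solver).
ratios = {k: out / inp for k, (inp, out) in facilities.items()}
best = max(ratios.values())
efficiency = {k: r / best for k, r in ratios.items()}
for k in sorted(efficiency):
    print(k, round(efficiency[k], 3))
```

    Units with efficiency 1.0 lie on the empirical frontier; the multilevel second stage described in the abstract then regresses these scores on organizational characteristics while respecting the county/state nesting.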

  1. THE MODEL OF TEACHING A FOREIGN LANGUAGE FOR SPECIFIC PURPOSES IN A TECHNICAL UNIVERSITY

    Directory of Open Access Journals (Sweden)

    Cherkashina, E.I.

    2017-06-01

    Full Text Available The article presents a new model of the linguistic educational process that can be implemented in the practice of teaching a foreign language at a technical university. The proposed model takes into account the characteristic mindset of students of technical universities and faculties, and it constitutes a matrix with a binary opposition. Filled-in matrix cells represent the structure of the language knowledge content in a visual form. Knowledge of the system organization of a language helps the students to understand "language in action" in a way that corresponds to their left-hemisphere mindset. Knowledge of the dominant-hemisphere peculiarities of students of technical specializations (engineering physicists) lets us model a linguistic educational process in a non-linguistic university. A complex linking of linguo-didactic components makes teachers of foreign languages take into consideration the results of research in the field of functional interhemispheric asymmetry of the brain. The emphasis on the left-hemisphere abilities dominating among the students has to change the approach of teachers of foreign languages to the organization of the linguistic educational process in a technical university. It is also important to consider that the skills which defined the information age remain necessary, but they alone are no longer sufficient for personal self-realization in the new conceptual age.

  2. A Series Solution of the Cauchy Problem for Turing Reaction-diffusion Model

    Directory of Open Access Journals (Sweden)

    L. Päivärinta

    2011-12-01

    Full Text Available In this paper, the series pattern solution of the Cauchy problem for the Turing reaction-diffusion model is obtained by using the homotopy analysis method (HAM). The Turing reaction-diffusion model is a nonlinear reaction-diffusion system which usually has power-law nonlinearities or may be rewritten in the form of power-law nonlinearities. Using the HAM, it is possible to find the exact solution or an approximate solution of the problem. This technique provides a series of functions which converges rapidly to the exact solution of the problem. The efficiency of the approach is shown by applying the procedure to two problems. Furthermore, the so-called homotopy-Pade technique (HPT) is applied to enlarge the convergence region and rate of the solution series given by the HAM.
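
    For reference, the HAM builds its series solution from Liao's zeroth-order deformation equation; written in standard notation (a sketch of the general scheme, not this paper's specific operators):

```latex
% Zeroth-order deformation equation of the homotopy analysis method:
% the embedding parameter q deforms the initial guess u_0 (at q = 0)
% into the solution u (at q = 1), with convergence-control parameter \hbar.
(1-q)\,\mathcal{L}\!\left[\phi(t;q)-u_0(t)\right]
  = q\,\hbar\,\mathcal{N}\!\left[\phi(t;q)\right],
\qquad q \in [0,1].
```

    Expanding phi(t; q) in powers of q yields the homotopy series u(t) = u_0(t) + sum of u_m(t); the free parameter hbar (and the homotopy-Pade technique mentioned in the abstract) is what is tuned to enlarge the convergence region of that series.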

  3. Markov Chain Modelling for Short-Term NDVI Time Series Forecasting

    Directory of Open Access Journals (Sweden)

    Stepčenko Artūrs

    2016-12-01

    Full Text Available In this paper, an NDVI time series forecasting model has been developed based on the use of a discrete-time, continuous-state Markov chain of suitable order. The normalised difference vegetation index (NDVI) is an indicator that describes the amount of chlorophyll (the green mass) and shows the relative density and health of vegetation; therefore, it is an important variable for vegetation forecasting. A Markov chain is a stochastic process defined on a state space. This stochastic process undergoes transitions from one state to another in the state space with some probabilities. A Markov chain forecast model is flexible in accommodating various forecast assumptions and structures. The present paper discusses the considerations and techniques in building a Markov chain forecast model at each step. The continuous-state Markov chain model is analytically described. Finally, the application of the proposed Markov chain model is illustrated with reference to a set of NDVI time series data.
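
    The transition-probability construction can be illustrated with a discretised (rather than continuous-state) chain: estimate the transition matrix from observed state pairs, then forecast the most probable next state. The NDVI values and state thresholds below are invented for illustration:

```python
from collections import Counter, defaultdict

# Toy NDVI series discretised into states: 0 = low, 1 = medium, 2 = high.
ndvi = [0.2, 0.3, 0.5, 0.6, 0.7, 0.6, 0.4, 0.3, 0.2, 0.3, 0.5, 0.7]
state = lambda v: 0 if v < 0.35 else (1 if v < 0.6 else 2)
states = [state(v) for v in ndvi]

# First-order transition probabilities estimated from observed counts.
counts = defaultdict(Counter)
for s, t in zip(states, states[1:]):
    counts[s][t] += 1
P = {s: {t: c / sum(row.values()) for t, c in row.items()}
     for s, row in counts.items()}

last = states[-1]
forecast = max(P[last], key=P[last].get)   # most probable next state
print(last, P[last], forecast)
```

    A continuous-state chain as in the paper replaces the count matrix with a transition kernel, but the forecasting logic, conditioning on the current state and reading off the most probable successor, is the same.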

  4. The Benefit of Multi-Mission Altimetry Series for the Calibration of Hydraulic Models

    Science.gov (United States)

    Domeneghetti, Alessio; Tarpanelli, Angelica; Tourian, Mohammad J.; Brocca, Luca; Moramarco, Tommaso; Castellarin, Attilio; Sneeuw, Nico

    2016-04-01

    The growing availability of satellite altimetric time series during the last decades has fostered their use in many hydrological and hydraulic applications. However, the use of remotely sensed water level series still remains hampered by the limited temporal resolution that characterizes each sensor (i.e. revisit time varying from 10 to 35 days), as well as by the accuracy of the different instrumentation adopted for monitoring inland water. As a consequence, each sensor is characterized by distinctive potentials and limitations that constrain its use for hydrological applications. In this study we refer to a stretch of about 140 km of the Po River (the longest Italian river) in order to investigate the performance of different altimetry series for the calibration of a quasi-2d model built with detailed topographic information. The usefulness of remotely sensed water surface elevation is tested using data collected by different altimetry missions (i.e., ERS-2, ENVISAT, TOPEX/Poseidon, JASON-2 and SARAL/Altika) by investigating the effect of (i) record length (i.e. number of satellite measurements provided by a given sensor at a specific satellite track) and (ii) data uncertainty (i.e. altimetry measurement errors). Since the relatively poor time resolution of satellites constrains the operational use of altimetric time series, in this study we also investigate the use of multi-mission altimetry series obtained by merging datasets sensed by different sensors over the study area. Benefits of the higher temporal frequency of multi-mission series are tested by calibrating the quasi-2d model referring in turn to the original satellite series and the multi-mission datasets. Jason-2 and ENVISAT outperform the other sensors, ensuring the reliability of the calibration process even for shorter time series. The multi-mission dataset appears particularly reliable and suitable for the calibration of the hydraulic model.
If short time periods are considered, the performance of the multi-mission dataset

  5. Transfer function modeling of the monthly accumulated rainfall series over the Iberian Peninsula

    Energy Technology Data Exchange (ETDEWEB)

    Mateos, Vidal L.; Garcia, Jose A.; Serrano, Antonio; De la Cruz Gallego, Maria [Departamento de Fisica, Universidad de Extremadura, Badajoz (Spain)

    2002-10-01

    In order to improve the results given by Autoregressive Moving-Average (ARMA) modeling for the monthly accumulated rainfall series taken at 19 observatories of the Iberian Peninsula, a Discrete Linear Transfer Function Noise (DLTFN) model was applied taking the local pressure series (LP), North Atlantic sea level pressure series (SLP) and North Atlantic sea surface temperature (SST) as input variables, and the rainfall series as the output series. In all cases, the performance of the DLTFN models, measured by the explained variance of the rainfall series, is better than the performance given by the ARMA modeling. The best performance is given by the models that take the local pressure as the input variable, followed by the sea level pressure models and the sea surface temperature models. Geographically speaking, the models fitted to those observatories located in the west of the Iberian Peninsula work better than those in the north and east of the Peninsula. Also, it was found that there is a region located between 0 N and 20 N, which shows the highest cross-correlation between SST and the peninsula rainfalls. This region moves to the west and northwest off the Peninsula when the SLP series are used.

  6. Using the mean approach in pooling cross-section and time series data for regression modelling

    International Nuclear Information System (INIS)

    Nuamah, N.N.N.N.

    1989-12-01

    The mean approach is one of the methods for pooling cross section and time series data for mathematical-statistical modelling. Though a simple approach, its results are sometimes paradoxical in nature. However, researchers still continue using it for its simplicity. Here, the paper investigates the nature and source of such unwanted phenomena. (author). 7 refs

  7. Particle Markov Chain Monte Carlo Techniques of Unobserved Component Time Series Models Using Ox

    DEFF Research Database (Denmark)

    Nonejad, Nima

    This paper details Particle Markov chain Monte Carlo techniques for analysis of unobserved component time series models using several economic data sets. PMCMC combines the particle filter with the Metropolis-Hastings algorithm. Overall PMCMC provides a very compelling, computationally fast...

  8. 76 FR 36390 - Airworthiness Directives; The Boeing Company Model 747SP Series Airplanes

    Science.gov (United States)

    2011-06-22

    ... power control modules (PCM). This proposed AD was prompted by a report of a rudder hard-over event on a... rudder PCM manifold, which could result in a hard-over of the rudder surface leading to an increase in... of a Model 747-400 series airplane of a lower rudder hard-over event caused by a lower rudder PCM...

  9. 75 FR 38945 - Airworthiness Directives; The Boeing Company Model 777-200 and -300 Series Airplanes

    Science.gov (United States)

    2010-07-07

    ... certain Model 777-200 and -300 series airplanes. This proposed AD would require installing new operational software in the cabin management system, and loading new software into the mass memory card. This proposed..., 2006. The service bulletin describes procedures for installing new operational software in the cabin...

  10. The River Basin Model: Computer Output. Water Pollution Control Research Series.

    Science.gov (United States)

    Envirometrics, Inc., Washington, DC.

    This research report is part of the Water Pollution Control Research Series which describes the results and progress in the control and abatement of pollution in our nation's waters. The River Basin Model described is a computer-assisted decision-making tool in which a number of computer programs simulate major processes related to water use that…

  11. Modeling the impact of forecast-based regime switches on macroeconomic time series

    NARCIS (Netherlands)

    K. Bel (Koen); R. Paap (Richard)

    2013-01-01

    Forecasts of key macroeconomic variables may lead to policy changes of governments, central banks and other economic agents. Policy changes in turn lead to structural changes in macroeconomic time series models. To describe this phenomenon we introduce a logistic smooth transition

  12. Harmonic regression of Landsat time series for modeling attributes from national forest inventory data

    Science.gov (United States)

    Barry T. Wilson; Joseph F. Knight; Ronald E. McRoberts

    2018-01-01

    Imagery from the Landsat Program has been used frequently as a source of auxiliary data for modeling land cover, as well as a variety of attributes associated with tree cover. With ready access to all scenes in the archive since 2008 due to the USGS Landsat Data Policy, new approaches to deriving such auxiliary data from dense Landsat time series are required. Several...
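
    Harmonic regression of a dense annual time series fits sine/cosine terms at seasonal frequencies by least squares. With a full cycle of equally spaced samples the harmonic basis is orthogonal, so the coefficients reduce to discrete Fourier projections; the synthetic "Landsat-like" series below is an assumption for illustration:

```python
import math, random

random.seed(7)
N = 365
# Synthetic one-year reflectance-like signal: annual harmonic plus noise.
y = [0.5 + 0.2 * math.cos(2 * math.pi * d / N)
     + 0.1 * math.sin(2 * math.pi * d / N)
     + random.gauss(0, 0.02) for d in range(N)]

# With a full cycle of daily samples the harmonic basis is orthogonal,
# so least squares reduces to Fourier projections.
a0 = sum(y) / N
a1 = 2 / N * sum(yi * math.cos(2 * math.pi * d / N) for d, yi in enumerate(y))
b1 = 2 / N * sum(yi * math.sin(2 * math.pi * d / N) for d, yi in enumerate(y))

fitted = lambda d: (a0 + a1 * math.cos(2 * math.pi * d / N)
                    + b1 * math.sin(2 * math.pi * d / N))
print(round(a0, 3), round(a1, 3), round(b1, 3))
```

    In practice Landsat observations are irregular (clouds, revisit gaps), so the general least-squares normal equations are used instead of the projections, but the fitted amplitude and phase per pixel serve the same role as auxiliary model covariates.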

  13. A robust interrupted time series model for analyzing complex health care intervention data

    KAUST Repository

    Cruz, Maricela

    2017-08-29

    Current health policy calls for greater use of evidence-based care delivery services to improve patient quality and safety outcomes. Care delivery is complex, with interacting and interdependent components that challenge traditional statistical analytic techniques, in particular, when modeling a time series of outcomes data that might be

  14. Comparison of time series models for predicting campylobacteriosis risk in New Zealand.

    Science.gov (United States)

    Al-Sakkaf, A; Jones, G

    2014-05-01

    Predicting campylobacteriosis cases is a matter of considerable concern in New Zealand, where the number of notified cases was the highest among developed countries in 2006. There is thus a need to develop a model or tool to accurately predict the number of campylobacteriosis cases, as the Microbial Risk Assessment Model previously used for this purpose failed to predict the actual number of cases accurately. We explore the appropriateness of classical time series modelling approaches for predicting campylobacteriosis. Finding the most appropriate time series model for New Zealand data has additional practical considerations given a possible structural change, that is, a specific and sudden change in response to the implemented interventions. A univariate methodological approach was used to predict monthly disease cases using New Zealand surveillance data of campylobacteriosis incidence from 1998 to 2009. The data from the years 1998 to 2008 were used to model the time series, with the year 2009 held out of the data set for model validation. The best two models were then fitted to the full 1998-2009 data and used to predict for each month of 2010. The Holt-Winters (multiplicative) and ARIMA (additive) intervention models were considered the best models for predicting campylobacteriosis in New Zealand. It was noticed that the prediction by the additive ARIMA model with intervention was slightly better than the prediction by the Holt-Winters multiplicative method for the annual total in the year 2010, the former predicting only 23 cases fewer than the actual reported cases. It is confirmed that classical time series techniques such as ARIMA with intervention and Holt-Winters can provide good prediction performance for campylobacteriosis risk in New Zealand. The results reported by this study are useful to the New Zealand Health and Safety Authority's efforts in addressing the problem of the campylobacteriosis epidemic.
© 2013 Blackwell Verlag GmbH.
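
    The Holt-Winters (multiplicative) method selected here maintains level, trend, and multiplicative seasonal components via exponential smoothing. A compact sketch with crude initialisation follows; the smoothing constants and the toy monthly series are arbitrary choices, not the study's fitted values:

```python
import math

def holt_winters_mult(y, m, alpha=0.3, beta=0.05, gamma=0.2, h=1):
    """Multiplicative Holt-Winters; returns h step-ahead forecasts.
    Crude initialisation: level and trend from the first two seasons."""
    level = sum(y[:m]) / m
    trend = (sum(y[m:2 * m]) - sum(y[:m])) / m ** 2
    season = [y[i] / level for i in range(m)]
    for t in range(len(y)):
        s = season[t % m]
        last_level = level
        level = alpha * (y[t] / s) + (1 - alpha) * (level + trend)
        trend = beta * (level - last_level) + (1 - beta) * trend
        season[t % m] = gamma * (y[t] / level) + (1 - gamma) * s
    return [(level + (k + 1) * trend) * season[(len(y) + k) % m]
            for k in range(h)]

# Toy monthly counts with a trend and multiplicative seasonality (period 12).
y = [(100 + 2 * t) * (1 + 0.3 * math.sin(2 * math.pi * t / 12)) for t in range(48)]
fc = holt_winters_mult(y, 12, h=3)
print([round(f, 1) for f in fc])
```

    The multiplicative form lets the seasonal swing grow with the level, which is often appropriate for case counts; the competing ARIMA-with-intervention model instead adds an explicit step/pulse regressor at the intervention date.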

  15. Comparison of Uncertainty of Two Precipitation Prediction Models at Los Alamos National Lab Technical Area 54

    Energy Technology Data Exchange (ETDEWEB)

    Shield, Stephen Allan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Dai, Zhenxue [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-08-18

    Meteorological inputs are an important part of subsurface flow and transport modeling. The choice of source for the meteorological data used as inputs has significant impacts on the results of subsurface flow and transport studies. One method to obtain the meteorological data required for flow and transport studies is the use of weather generating models. This paper compares the performance of two weather generating models at Technical Area 54 of Los Alamos National Lab. Technical Area 54 contains several waste pits for low-level radioactive waste and is the site of subsurface flow and transport studies, which makes the comparison of the performance of the two weather generators at this site particularly valuable.

  16. STRUCTURAL AND FUNCTIONAL MODEL OF FORMING INFORMATIONAL COMPETENCE OF TECHNICAL UNIVERSITY STUDENTS

    Directory of Open Access Journals (Sweden)

    Taras Ostapchuk

    2016-11-01

    Full Text Available The article elaborates and analyses a structural and functional model of the formation of information competence of technical university students. The system and the mutual relationships between its elements are revealed. It is shown that the target, process, and result-evaluative blocks of the proposed model ensure its functioning and the opportunity to optimize the learning process of technical students' information training. The formation of technical university students' information competence is based on motivational-value, operational-activity, cognitive, and reflexive components. The corresponding criteria (motivational, operational-activity, cognitive, reflexive), indexes, and levels (reproductive, technologized, constructive) of forming technical university students' information competence are disclosed. The expediency of complex organizational and pedagogical conditions at the stages of forming information competence is justified. The complex of organizational and pedagogical conditions includes: orientation of the organization and implementation of class work towards technical university students' positive value treatment; the issue of forming professionalism; informatization of the educational and socio-cultural environment of higher technical educational institutions; orientation of technical university students' training to the demands of European and international standards on information competence as a factor in the formation of competitiveness at the labor market; and introduction of a special course curriculum that will provide competence formation due to the use of information technology in professional activities. Forms (lecture visualization, problem lecture, combined lecture, scientific online conference, recitals, excursions, etc.), tools (computer lab, multimedia projector, interactive whiteboard), and multimedia technology (audio, video, the Internet technologies; social networks, etc

  17. Application of Time-series Modeling to Predict Infiltration of Different Soil Textures

    Directory of Open Access Journals (Sweden)

    S. Vazirpour

    2016-10-01

    Full Text Available Introduction: Infiltration is one of the most important parameters affecting irrigation. For this reason, measuring and estimating this parameter is very important, particularly when designing and managing irrigation systems. Infiltration affects water flow and solute transport in the soil surface and subsurface. Due to temporal and spatial variability, many measurements are needed to describe the average soil infiltration characteristics under field conditions. The stochastic character of different natural phenomena has led to the application of random variables and time series in predicting the behavior of these phenomena. Time-series analysis is a simple and efficient method for prediction, which is widely used in various sciences. However, few studies have investigated time-series modeling for predicting soil infiltration characteristics. In this study, the capability of time series in estimating the infiltration rate for different soil textures was evaluated. Materials and methods: For this purpose, the 60- and 120-minute data of double-ring infiltrometer tests in the Lali plain, Khuzestan, Iran, with the proposed time intervals (0, 1, 3, 5, 10, 15, 20, 30, 45, 60, 80, 100, 120, 150, 180, 210, 240 minutes) were used to predict cumulative infiltration until the end of the experiment time for heavy (clay), medium (loam) and light (sand) soil textures. Moreover, using parameters of the Kostiakov-Lewis equation recommended by NRCS, 24-hour cumulative infiltration curves were applied in time-series modeling for six different soil textures (clay, clay loam, silty, silty loam, sandy loam and sand). Different time-series models including Autoregressive (AR), Moving Average (MA), Autoregressive Moving Average (ARMA), Autoregressive Integrated Moving Average (ARIMA), ARMA with eXogenous variables (ARMAX) and AR with eXogenous variables (ARX) were evaluated in predicting cumulative infiltration. Autocorrelation and partial autocorrelation charts for each
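
    Independent of the time-series models compared in this study, the Kostiakov relation behind the Kostiakov-Lewis equation, Z = k t^a, is linear in logs, so its parameters drop out of a simple least-squares fit. The infiltrometer readings below are invented values generated near Z = 6 t^0.6:

```python
import math

# Toy double-ring readings: time (min) and cumulative infiltration (mm),
# hypothetical loam-like values near Z = 6 * t**0.6.
t = [1, 3, 5, 10, 15, 20, 30, 45, 60]
Z = [6.1, 11.5, 15.7, 23.9, 30.5, 36.0, 46.1, 58.8, 69.6]

# Kostiakov Z = k * t**a is linear in logs: ln Z = ln k + a * ln t.
lx = [math.log(v) for v in t]
ly = [math.log(v) for v in Z]
n = len(t)
mx, my = sum(lx) / n, sum(ly) / n
a = (sum((x - mx) * (yv - my) for x, yv in zip(lx, ly))
     / sum((x - mx) ** 2 for x in lx))
k = math.exp(my - a * mx)
print(round(k, 2), round(a, 3))
```

    The recovered exponent and coefficient can then seed or cross-check the stochastic models; the Kostiakov-Lewis variant adds a basic-intake term f0*t for long times.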

  18. Nonlinear Fluctuation Behavior of Financial Time Series Model by Statistical Physics System

    Directory of Open Access Journals (Sweden)

    Wuyang Cheng

    2014-01-01

    Full Text Available We develop a random financial time series model of the stock market using a statistical physics system, the stochastic contact interacting system. The contact process is a continuous-time Markov process; one interpretation of this model is as a model for the spread of an infection, where the epidemic spreading mimics the interplay of local infections and recovery of individuals. From this financial model, we study the statistical behaviors of return time series, and the corresponding behaviors of returns for the Shanghai Stock Exchange Composite Index (SSECI) and Hang Seng Index (HSI) are also comparatively studied. Further, we investigate the Zipf distribution and multifractal phenomenon of returns and price changes. Zipf analysis and MF-DFA analysis are applied to investigate the nature of fluctuations in the stock market.

  19. A spatial time series framework for modeling daily precipitationat regional scales

    Energy Technology Data Exchange (ETDEWEB)

    Kyriakidis, Phaedon C.; Miller, Norman L.; Kim, Jinwon

    2001-11-14

    In this paper, a framework for stochastic spatiotemporal modeling of daily precipitation in a hindcast mode is presented. Observed precipitation levels in space and time are modeled as a joint realization of a collection of space-indexed time series, one for each spatial location. Time series model parameters are spatially varying, thus capturing space-time interactions. Stochastic simulation, i.e., the procedure of generating alternative precipitation realizations (synthetic fields) over the space-time domain of interest (Deutsch and Journel, 1998), is employed for ensemble prediction. The simulated daily precipitation fields reproduce a data-based histogram and spatiotemporal covariance model, and identify the measured precipitation values at the rain gauges (conditional simulation). Such synthetic precipitation fields can be used in a Monte Carlo framework for risk analysis studies in hydrologic impact assessment investigations.
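
    The space-indexed collection of time series in this framework can be caricatured by giving each gauge its own AR(1) parameters and drawing an ensemble of realisations. This unconditional sketch ignores the spatial covariance and rain-gauge conditioning that the actual framework reproduces; all parameters are hypothetical:

```python
import random

random.seed(3)
# Hypothetical site-specific AR(1) parameters (phi, sigma) for three gauges,
# a toy stand-in for the paper's space-indexed time-series collection.
sites = {"gauge_a": (0.6, 2.0), "gauge_b": (0.4, 3.0), "gauge_c": (0.7, 1.5)}

def simulate(phi, sigma, n, x0=0.0):
    """One synthetic anomaly realisation from x_t = phi * x_{t-1} + e_t."""
    x, out = x0, []
    for _ in range(n):
        x = phi * x + random.gauss(0, sigma)
        out.append(x)
    return out

# Ensemble of 50 realisations per site (Monte Carlo prediction).
ensemble = {s: [simulate(p, sg, 30) for _ in range(50)]
            for s, (p, sg) in sites.items()}
mean_a = sum(r[-1] for r in ensemble["gauge_a"]) / 50
print(round(mean_a, 2))
```

    In the full framework the parameters vary smoothly in space and the realisations are conditioned to honor the gauge records, so the ensemble spread quantifies prediction uncertainty away from the gauges.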

  20. On determining the prediction limits of mathematical models for time series

    International Nuclear Information System (INIS)

    Peluso, E.; Gelfusa, M.; Lungaroni, M.; Talebzadeh, S.; Gaudio, P.; Murari, A.; Contributors, JET

    2016-01-01

    Prediction is one of the main objectives of scientific analysis and it refers to both modelling and forecasting. The determination of the limits of predictability is an important issue of both theoretical and practical relevance. In the case of modelling time series, once a certain level of performance has been reached in either modelling or prediction, it is often important to assess whether all the information available in the data has been exploited or whether there are still margins for improvement of the tools being developed. In this paper, an information theoretic approach is proposed to address this issue and quantify the quality of the models and/or predictions. The excellent properties of the proposed indicator have been proved with the help of a systematic series of numerical tests and a concrete example of extreme relevance for nuclear fusion.

  1. The partial duration series method in regional index-flood modeling

    DEFF Research Database (Denmark)

    Madsen, Henrik; Rosbjerg, Dan

    1997-01-01

    A regional index-flood method based on the partial duration series model is introduced. The model comprises the assumptions of a Poisson-distributed number of threshold exceedances and generalized Pareto (GP) distributed peak magnitudes. The regional T-year event estimator is based on a regional ... preferable to at-site estimation in moderately heterogeneous and homogeneous regions for large sample sizes. Modest intersite dependence has only a small effect on the performance of the regional index-flood estimator.
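
    Under the stated Poisson/GP assumptions, the at-site T-year event has a closed form: with exceedance rate lambda and a GP fit to the threshold exceedances, x_T = u + (alpha/kappa) * (1 - (lambda*T)**(-kappa)). The sketch below uses method-of-moments GP estimators (maximum likelihood or probability-weighted moments are more usual in practice) and simulated peak data:

```python
import math, random

def gp_return_level(peaks, threshold, years, T):
    """T-year event from a Poisson-GP partial duration series model,
    with a rough method-of-moments GP fit."""
    exc = [p - threshold for p in peaks if p > threshold]
    lam = len(exc) / years                      # Poisson exceedance rate
    m = sum(exc) / len(exc)
    v = sum((e - m) ** 2 for e in exc) / (len(exc) - 1)
    # Moment estimators for GP shape (kappa) and scale (alpha) in the
    # Hosking-Wallis parametrisation F(x) = 1 - (1 - kappa*x/alpha)**(1/kappa).
    kappa = 0.5 * (m * m / v - 1)
    alpha = 0.5 * m * (m * m / v + 1)
    # Quantile: u + (alpha/kappa) * (1 - (lam*T)**(-kappa))
    return threshold + alpha / kappa * (1 - (lam * T) ** (-kappa))

random.seed(5)
u, years = 50.0, 20
# Simulated peaks over threshold: roughly 2 exceedances/year, mean excess 5.
peaks = [u + random.expovariate(1 / 5.0) for _ in range(40)]
x100 = gp_return_level(peaks, u, years, 100)
print(round(x100, 1))
```

    The regional index-flood version of the abstract replaces the at-site GP fit with a pooled regional growth curve scaled by a site-specific index, which is what makes it preferable for short records.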

  2. Stochastic Modeling of Rainfall Series in Kelantan Using an Advanced Weather Generator

    Directory of Open Access Journals (Sweden)

    A. H. Syafrina

    2018-02-01

    Full Text Available A weather generator is a numerical tool that uses existing meteorological records to generate series of synthetic weather data. The AWE-GEN (Advanced Weather Generator) model has been successful in producing a broad range of temporal-scale weather variables, ranging from high-frequency hourly values to low-frequency inter-annual variability. In Malaysia, AWE-GEN has produced reliable projections of extreme rainfall events for some parts of Peninsular Malaysia. This study focuses on the use of the AWE-GEN model to assess rainfall distribution in Kelantan. Kelantan is situated in the north east of the Peninsular, a region which is highly susceptible to flood. Embedded within the AWE-GEN model is the Neyman-Scott process, which employs parameters to represent physical rainfall characteristics. The use of correct probability distributions to represent the parameters is imperative to allow reliable results to be produced. This study compares the performance of two probability distributions, Weibull and Gamma, to represent rainfall intensity, and the better-performing distribution was subsequently used to simulate hourly scaled rainfall series. Thirty years of hourly scaled meteorological data from two stations in Kelantan were used in model construction. Results indicate that both probability distributions are capable of replicating the rainfall series at both stations very well; however, numerical evaluations suggested that Gamma performs better. Despite Gamma not being a heavy-tailed distribution, it is able to replicate the key characteristics of the rainfall series, particularly extreme values. The overall simulation results showed that the AWE-GEN model is capable of generating tropical rainfall series, which could be beneficial in flood preparedness studies in areas vulnerable to flood.
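
    The Gamma-versus-Weibull comparison in this record needs fitted parameters first; for the Gamma distribution a quick method-of-moments fit follows from mean = k*theta and variance = k*theta**2. The intensities below are synthetic, and the study's actual fitting procedure may differ:

```python
import random, statistics

random.seed(11)
# Synthetic positive rainfall intensities (mm/h); hypothetical values drawn
# from a Gamma(shape=2, scale=3) to play the role of observed data.
intensity = [random.gammavariate(2.0, 3.0) for _ in range(500)]

# Method-of-moments fit of Gamma(shape k, scale theta):
# mean = k*theta and variance = k*theta**2  =>  k = m^2/v, theta = v/m.
m = sum(intensity) / len(intensity)
v = statistics.variance(intensity)
k_hat = m * m / v
theta_hat = v / m
print(round(k_hat, 2), round(theta_hat, 2))
```

    A Weibull candidate would be fitted analogously (e.g. by maximum likelihood), and the two compared by log-likelihood or a goodness-of-fit statistic on the observed series, mirroring the study's numerical evaluation.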

  3. 78 FR 76731 - Special Conditions: Boeing Model 777-200, -300, and -300ER Series Airplanes; Rechargeable Lithium...

    Science.gov (United States)

    2013-12-19

    ... series airplanes have fly-by-wire controls, fully software-configurable avionics, and fiber-optic... Series Airplanes; Rechargeable Lithium Ion Batteries and Battery Systems AGENCY: Federal Aviation... Boeing Model 777- 200, -300, and -300ER series airplanes. These airplanes as modified by the ARINC...

  4. Automated Bayesian model development for frequency detection in biological time series

    Directory of Open Access Journals (Sweden)

    Oldroyd Giles ED

    2011-06-01

    Full Text Available Abstract Background A first step in building a mathematical model of a biological system is often the analysis of the temporal behaviour of key quantities. Mathematical relationships between the time and frequency domains, such as Fourier Transforms and wavelets, are commonly used to extract information about the underlying signal from a given time series. This one-to-one mapping from time points to frequencies inherently assumes that both domains contain the complete knowledge of the system. However, for truncated, noisy time series with background trends, this unique mapping breaks down and the question reduces to an inference problem of identifying the most probable frequencies. Results In this paper we build on the method of Bayesian Spectrum Analysis and demonstrate its advantages over conventional methods by applying it to a number of test cases, including two types of biological time series. Firstly, oscillations of calcium in plant root cells in response to microbial symbionts are non-stationary and noisy, posing challenges to data analysis. Secondly, circadian rhythms in gene expression measured over only two cycles highlight the problem of time series with limited length. The results show that the Bayesian frequency detection approach can provide useful results in specific areas where Fourier analysis can be uninformative or misleading. We demonstrate further benefits of the Bayesian approach for time series analysis, such as direct comparison of different hypotheses, inherent estimation of noise levels and parameter precision, and a flexible framework for modelling the data without pre-processing. Conclusions Modelling in systems biology often builds on the study of time-dependent phenomena. Fourier Transforms are a convenient tool for analysing the frequency domain of time series. However, there are well-known limitations of this method, such as the introduction of spurious frequencies when handling short and noisy time series, and
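    A minimal sketch of the core idea, assuming the classic single-frequency (Bretthorst) form of Bayesian Spectrum Analysis rather than the paper's more general framework: after marginalising out amplitude, phase and noise level, the log posterior over frequency is a simple transform of the Schuster periodogram.

```python
import math

def log_posterior_freq(t, d, freqs):
    """Log posterior (up to a constant) over trial frequencies for a
    single-sinusoid model with unknown amplitude, phase and noise level:
        log p(f | D) ~ ((2 - N) / 2) * log(1 - 2*C(f) / (N * <d^2>))
    where C(f) = (R^2 + I^2)/N is the Schuster periodogram of the
    mean-subtracted data."""
    n = len(d)
    mean = sum(d) / n
    y = [di - mean for di in d]
    msq = sum(yi * yi for yi in y) / n
    out = []
    for f in freqs:
        r = sum(yi * math.cos(2 * math.pi * f * ti) for yi, ti in zip(y, t))
        i = sum(yi * math.sin(2 * math.pi * f * ti) for yi, ti in zip(y, t))
        c = (r * r + i * i) / n
        arg = max(1.0 - 2.0 * c / (n * msq), 1e-12)  # clamp: arg -> 0 for a pure tone
        out.append((2.0 - n) / 2.0 * math.log(arg))
    return out
```

    Because the exponent (2-N)/2 is large and negative, the posterior is far more sharply peaked than the periodogram itself, which is what allows precise frequency estimates from short noisy records.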

  5. Technical Work Plan for: Near Field Environment: Engineered Barrier System: Radionuclide Transport Abstraction Model Report

    International Nuclear Information System (INIS)

    J.D. Schreiber

    2006-01-01

    This technical work plan (TWP) describes work activities to be performed by the Near-Field Environment Team. The objective of the work scope covered by this TWP is to generate Revision 03 of EBS Radionuclide Transport Abstraction, referred to herein as the radionuclide transport abstraction (RTA) report. The RTA report is being revised primarily to address condition reports (CRs), to address issues identified by the Independent Validation Review Team (IVRT), to address the potential impact of transport, aging, and disposal (TAD) canister design on transport models, and to ensure integration with other models that are closely associated with the RTA report and being developed or revised in other analysis/model reports in response to IVRT comments. The RTA report will be developed in accordance with the most current version of LP-SIII.10Q-BSC and will reflect current administrative procedures (LP-3.15Q-BSC, ''Managing Technical Product Inputs''; LP-SIII.2Q-BSC, ''Qualification of Unqualified Data''; etc.), and will develop related Document Input Reference System (DIRS) reports and data qualifications as applicable in accordance with prevailing procedures. The RTA report consists of three models: the engineered barrier system (EBS) flow model, the EBS transport model, and the EBS-unsaturated zone (UZ) interface model. The flux-splitting submodel in the EBS flow model will change, so the EBS flow model will be validated again. The EBS transport model and validation of the model will be substantially revised in Revision 03 of the RTA report, which is the main subject of this TWP. The EBS-UZ interface model may be changed in Revision 03 of the RTA report due to changes in the conceptualization of the UZ transport abstraction model (a particle tracker transport model based on the discrete fracture transfer function will be used instead of the dual-continuum transport model previously used). Validation of the EBS-UZ interface model will be revised to be consistent with

  6. PSO-MISMO modeling strategy for multistep-ahead time series prediction.

    Science.gov (United States)

    Bao, Yukun; Xiong, Tao; Hu, Zhongyi

    2014-05-01

    Multistep-ahead time series prediction is one of the most challenging research topics in the field of time series modeling and prediction, and remains under active research. Recently, the multiple-input several multiple-outputs (MISMO) modeling strategy has been proposed as a promising alternative for multistep-ahead time series prediction, exhibiting advantages over the two currently dominant strategies, the iterated and the direct strategies. Built on the established MISMO strategy, this paper proposes a particle swarm optimization (PSO)-based MISMO modeling strategy, which is capable of determining the number of sub-models in a self-adaptive mode, with varying prediction horizons. Rather than deriving crisp divides with equal-sized prediction horizons as in the established MISMO, the proposed PSO-MISMO strategy, implemented with neural networks, employs a heuristic to create flexible divides with varying sizes of prediction horizons and to generate the corresponding sub-models, providing considerable flexibility in model construction. The approach has been validated with simulated and real datasets.
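    The MISMO idea of splitting an H-step horizon into sub-models that each directly predict one contiguous block can be sketched with NumPy. This is a deliberately simplified illustration with equal-sized blocks and linear sub-models; the paper's contribution is precisely to replace these with PSO-tuned neural sub-models and variable block sizes.

```python
import numpy as np

def build_dataset(series, p, H):
    """Lagged design matrix: each row holds p past values, targets the next H."""
    X, Y = [], []
    for i in range(p, len(series) - H + 1):
        X.append(series[i - p:i])
        Y.append(series[i:i + H])
    return np.array(X), np.array(Y)

def fit_mismo(series, p, H, s):
    """Fit s direct linear sub-models, each mapping the p most recent values
    to one contiguous block of the H-step horizon."""
    X, Y = build_dataset(series, p, H)
    Xb = np.hstack([X, np.ones((len(X), 1))])      # add intercept column
    blocks = np.array_split(np.arange(H), s)
    models = [np.linalg.lstsq(Xb, Y[:, b], rcond=None)[0] for b in blocks]
    return blocks, models

def predict_mismo(last_p, blocks, models, H):
    """Assemble the H-step forecast from the per-block sub-models."""
    x = np.append(last_p, 1.0)
    out = np.empty(H)
    for b, W in zip(blocks, models):
        out[b] = x @ W
    return out
```

    Unlike the iterated strategy, no forecast is fed back as an input, so errors do not accumulate across the horizon.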

  7. A regional and nonstationary model for partial duration series of extreme rainfall

    DEFF Research Database (Denmark)

    Gregersen, Ida Bülow; Madsen, Henrik; Rosbjerg, Dan

    2017-01-01

    …of extreme rainfall. The framework is built on a partial duration series approach with a nonstationary, regional threshold value. The model is based on generalized linear regression solved by generalized estimation equations. It allows a spatial correlation between the stations in the network and accounts furthermore for variable observation periods at each station and in each year. Marginal regional and temporal regression models solved by generalized least squares are used to validate and discuss the results of the full spatiotemporal model. The model is applied on data from a large Danish rain gauge network…

  8. Estimates by bootstrap interval for time series forecasts obtained by theta model

    Directory of Open Access Journals (Sweden)

    Daniel Steffen

    2017-03-01

    Full Text Available In this work, an experimental computer program was developed in Matlab (version 7.1) for the univariate time series forecasting method called Theta, together with an implementation of the computer-intensive resampling technique known as the bootstrap to turn the point forecasts obtained by this method into confidence-interval estimates. To solve this problem, an algorithm was built that uses Monte Carlo simulation to obtain the interval estimates for the forecasts. The Theta model presented in this work was very efficient in the M3 competition of Makridakis, where 3003 series were tested. It is based on the concept of modifying the local curvature of the time series through a coefficient theta (Θ). In its simplest approach, the time series is decomposed into two theta lines representing long-term and short-term components. The prediction is made by combining the forecasts obtained by fitting the lines from the theta decomposition. The MAPE values obtained for the estimates confirm the favorable results of the method in the M3 competition, making it a good alternative for time series forecasting.
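    The simplest-case decomposition described above can be sketched in a few lines, assuming the standard Theta(0)/Theta(2) setup: the theta=0 line is a fitted linear trend (long term), the theta=2 line is 2*y minus that trend (short term), and the forecast averages a trend extrapolation with simple exponential smoothing (SES) of the theta=2 line. The fixed smoothing constant is an assumption; in practice it is optimised.

```python
def theta_forecast(y, h, alpha=0.5):
    """Minimal Theta-method sketch: theta=0 line (linear trend) plus SES of the
    theta=2 line, combined with equal weights."""
    n = len(y)
    # least-squares linear trend a + b*t  (the theta = 0 line)
    tbar = (n - 1) / 2.0
    ybar = sum(y) / n
    b = sum((t - tbar) * (yt - ybar) for t, yt in enumerate(y)) / \
        sum((t - tbar) ** 2 for t in range(n))
    a = ybar - b * tbar
    theta2 = [2.0 * yt - (a + b * t) for t, yt in enumerate(y)]  # theta = 2 line
    # SES level of the theta=2 line; its h-step forecast is flat
    level = theta2[0]
    for v in theta2[1:]:
        level = alpha * v + (1 - alpha) * level
    return [0.5 * ((a + b * (n - 1 + k)) + level) for k in range(1, h + 1)]
```

    The bootstrap intervals of the paper would then be obtained by resampling residuals around these point forecasts and repeating the procedure many times.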

  9. Generation of future high-resolution rainfall time series with a disaggregation model

    Science.gov (United States)

    Müller, Hannes; Haberlandt, Uwe

    2017-04-01

    High-resolution rainfall data are needed in many fields of hydrology and water resources management. For analyses of future rainfall conditions, climate scenarios with hourly rainfall values exist. However, the direct use of these data is associated with uncertainties, which can be revealed by comparing observations with C20 control runs. An alternative is to derive changes of rainfall behavior over time from climate simulations; conclusions about future rainfall conditions can then be drawn by adding these changes to observed time series. A multiplicative cascade model is used in this investigation for the disaggregation of daily rainfall amounts to hourly values. Model parameters are estimated from REMO rainfall time series (UBA, BfG and ENS realizations), based on ECHAM5. Parameter estimation is carried out for the C20 period as well as for the near-term and long-term future (2021-2050 and 2071-2100). Change factors for both future periods are derived by parameter comparison and added to the parameters estimated from observed time series. This enables the generation of hourly rainfall time series from observed daily values with respect to future changes. The investigation is carried out for rain gauges in Lower Saxony. The generated time series are analyzed regarding statistical characteristics, e.g. extreme values, event-based characteristics (wet spell duration and amounts, dry spell duration, …) and continuum characteristics (average intensity, fraction of dry intervals, …). The generation of the time series is validated by comparing the changes in the statistical characteristics from the REMO data and from the disaggregated data.
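    The mechanics of a multiplicative cascade can be shown in a minimal microcanonical sketch: each interval's rainfall is repeatedly split into two sub-intervals with random weights (w, 1-w), so the daily total is conserved exactly. Everything here is illustrative; real disaggregation schemes (including the one described above) use calibrated, position- and volume-dependent weight probabilities and an extra 1-to-3 branching step to reach 24 hourly values.

```python
import random

def cascade_disaggregate(daily_total, levels=3, rng=random):
    """Microcanonical multiplicative cascade with branching number b = 2:
    three levels turn one daily value into 8 sub-daily values."""
    values = [daily_total]
    for _ in range(levels):
        nxt = []
        for v in values:
            if v == 0.0:
                nxt.extend([0.0, 0.0])   # dry intervals stay dry
                continue
            w = rng.random()             # placeholder weight; calibrated in practice
            nxt.extend([w * v, (1.0 - w) * v])
        values = nxt
    return values
```

    Mass conservation at every branching step is what makes the cascade attractive for disaggregation: the observed daily totals are reproduced exactly while the sub-daily structure is stochastic.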

  10. Information Model and Its Element for Displaying Information on Technical Condition of Objects of Integrated Information System

    OpenAIRE

    Kovalenko, Anna; Smirnov, Alexey; Kovalenko, Alexander; Dorensky, Alexander; Коваленко, А. С.; Смірнов, О. А.; Коваленко, О. В.; Доренський, О. П.

    2016-01-01

    The suggested information elements for the system of information display of the technical condition of the integrated information system meet the essential requirements of the information presentation. They correlate with the real object simply and very accurately. The suggested model of information display of the technical condition of the objects of integrated information system improves the efficiency of the operator of technical diagnostics in evaluating the information about the...

  11. Unified Series-Shunt Compensator for PQ Analysis using Dynamic Phasor Modeling and EMT Simulation

    Science.gov (United States)

    Hannan, M. A.; Mohamed, Azah; Hussain, Aini

    2010-06-01

    The modeling of a unified series-shunt compensator (USSC) and its power quality (PQ) analysis on a simple test system are simulated based on a dynamic phasor model and an EMT program. The aim is to investigate the overall efficiency of the USSC for PQ analysis, with results compared against an EMTP-like simulation. The dynamic phasor model is implemented in the Matlab/Simulink toolbox, whereas the EMT model simulation of the USSC uses the PSCAD/EMTDC software. Credible solutions to PQ problems on the distribution network have been analyzed using both the dynamic phasor and EMT simulation techniques. Simulation results of the USSC dynamic phasor model, including the system, agree closely with the detailed time-domain EMTP-like PSCAD/EMTDC simulation. It is found that the dynamic phasor model of the USSC has very good potential for analyzing overall PQ issues, with higher speed and accuracy compared with the PSCAD/EMTDC simulation.
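    The basic quantity behind dynamic phasor modeling can be illustrated independently of the USSC itself: the k-th dynamic phasor of a waveform is its k-th sliding-window Fourier coefficient over one fundamental period. The following stdlib-only sketch (an assumption for illustration, not the paper's Simulink model) extracts it with a rectangular rule.

```python
import math

def dynamic_phasor(x, t, f0, k=1):
    """Sliding-window k-th dynamic phasor of samples x(t):
        X_k(t) = (1/T) * integral over one period T = 1/f0 of
                 x(tau) * exp(-j*2*pi*k*f0*tau) dtau,
    approximated with a rectangular rule. Returns one complex phasor per
    window position."""
    dt = t[1] - t[0]
    n_win = round(1.0 / (f0 * dt))          # samples per fundamental period
    phasors = []
    for start in range(len(x) - n_win + 1):
        acc = 0j
        for i in range(start, start + n_win):
            acc += x[i] * complex(math.cos(2 * math.pi * k * f0 * t[i]),
                                  -math.sin(2 * math.pi * k * f0 * t[i]))
        phasors.append(acc * dt * f0)       # (1/T) * sum * dt
    return phasors
```

    For a steady cosine of amplitude A at the fundamental frequency, the first dynamic phasor has constant magnitude A/2; during transients the phasor varies slowly, which is why dynamic phasor simulations can run much faster than full EMT time stepping.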

  12. The Potential of Experiential Learning Models and Practices in Career and Technical Education and Career and Technical Teacher Education

    Science.gov (United States)

    Clark, Robert W.; Threeton, Mark D.; Ewing, John C.

    2010-01-01

    Since inception, career and technical education programs have embraced experiential learning as a true learning methodology for students to obtain occupational skills valued by employers. Programs have integrated classroom instruction with laboratory experiences to provide students a significant opportunity to learn. However, it is questionable as…

  13. Predicting Time Series Outputs and Time-to-Failure for an Aircraft Controller Using Bayesian Modeling

    Science.gov (United States)

    He, Yuning

    2015-01-01

    Safety of unmanned aerial systems (UAS) is paramount, but the large number of dynamically changing controller parameters makes it hard to determine if the system is currently stable, and the time before loss of control if not. We propose a hierarchical statistical model using Treed Gaussian Processes to predict (i) whether a flight will be stable (success) or become unstable (failure), (ii) the time-to-failure if unstable, and (iii) time series outputs for flight variables. We first classify the current flight input into success or failure types, and then use separate models for each class to predict the time-to-failure and time series outputs. As different inputs may cause failures at different times, we have to model variable length output curves. We use a basis representation for curves and learn the mappings from input to basis coefficients. We demonstrate the effectiveness of our prediction methods on a NASA neuro-adaptive flight control system.

  14. Intuitionistic Fuzzy Time Series Forecasting Model Based on Intuitionistic Fuzzy Reasoning

    Directory of Open Access Journals (Sweden)

    Ya’nan Wang

    2016-01-01

    Full Text Available Fuzzy sets theory cannot describe the data comprehensively, which has greatly limited the objectivity of fuzzy time series in forecasting with uncertain data. In this regard, an intuitionistic fuzzy time series forecasting model is built. In the new model, a fuzzy clustering algorithm is used to divide the universe of discourse into unequal intervals, and a more objective technique for ascertaining the membership and nonmembership functions of the intuitionistic fuzzy set is proposed. On these bases, forecast rules based on intuitionistic fuzzy approximate reasoning are established. Finally, comparative experiments on the enrollments of the University of Alabama and the Taiwan Stock Exchange Capitalization Weighted Stock Index are carried out. The results show that the new model clearly improves forecast accuracy.

  15. [Prediction of epidemic tendency of schistosomiasis with time-series model in Hubei Province].

    Science.gov (United States)

    Chen, Yan-Yan; Cai, Shun-Xiang; Xiao, Ying; Jiang, Yong; Shan, Xiao-Wei; Zhang, Juan; Liu, Jian-Bing

    2014-12-01

    To study the endemic trend of schistosomiasis japonica in Hubei Province, so as to provide a theoretical basis for the surveillance and forecasting of schistosomiasis. The time-series autoregressive integrated moving average (ARIMA) model was applied to fit the infection rate of residents of Hubei Province from 1987 to 2013, and to predict the short-term trend of the infection rate. The actual infection rates were all within the 95% confidence intervals of the values predicted by the ARIMA model. The prediction showed that the infection rate of residents of Hubei Province would continue to decrease slowly. The time-series ARIMA model has good prediction accuracy and could be used for the short-term forecasting of schistosomiasis.
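    In practice an ARIMA(p,d,q) fit like the one above is done with a statistics package (e.g. statsmodels or R). To show the mechanics without those dependencies, here is the simplest special case, an AR(1) fitted by conditional least squares, which is just an ordinary regression of each value on its predecessor; the recursive forecast converges to the process mean, matching the "slowly decreasing" behaviour described above when the mean is below recent values.

```python
def fit_ar1(y):
    """Conditional least squares for y_t = c + phi*y_{t-1} + e_t:
    OLS of y_t on y_{t-1}."""
    x, z = y[:-1], y[1:]
    n = len(x)
    mx = sum(x) / n
    mz = sum(z) / n
    phi = sum((a - mx) * (b - mz) for a, b in zip(x, z)) / \
          sum((a - mx) ** 2 for a in x)
    c = mz - phi * mx
    return c, phi

def forecast_ar1(last, c, phi, h):
    """Recursive h-step-ahead point forecasts."""
    out = []
    for _ in range(h):
        last = c + phi * last
        out.append(last)
    return out
```

    Differencing (the "I" in ARIMA) and moving-average terms extend this same least-squares idea to nonstationary and autocorrelated-error series.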

  16. The Gaussian Graphical Model in Cross-Sectional and Time-Series Data.

    Science.gov (United States)

    Epskamp, Sacha; Waldorp, Lourens J; Mõttus, René; Borsboom, Denny

    2018-04-16

    We discuss the Gaussian graphical model (GGM; an undirected network of partial correlation coefficients) and detail its utility as an exploratory data analysis tool. The GGM shows which variables predict one another, allows for sparse modeling of covariance structures, and may highlight potential causal relationships between observed variables. We describe the utility in three kinds of psychological data sets: data sets in which consecutive cases are assumed independent (e.g., cross-sectional data), temporally ordered data sets (e.g., n = 1 time series), and a mixture of the 2 (e.g., n > 1 time series). In time-series analysis, the GGM can be used to model the residual structure of a vector-autoregression analysis (VAR), also termed graphical VAR. Two network models can then be obtained: a temporal network and a contemporaneous network. When analyzing data from multiple subjects, a GGM can also be formed on the covariance structure of stationary means: the between-subjects network. We discuss the interpretation of these models and propose estimation methods to obtain these networks, which we implement in the R packages graphicalVAR and mlVAR. The methods are showcased in two empirical examples, and simulation studies on these methods are included in the supplementary materials.
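    The GGM's edge weights follow directly from the inverse covariance (precision) matrix K via pcor_ij = -K_ij / sqrt(K_ii * K_jj). A minimal unregularised NumPy version (the paper's R packages use regularised estimators such as the graphical lasso; this plain inverse is only sensible for n much larger than the number of variables):

```python
import numpy as np

def partial_correlations(data):
    """GGM edge weights: partial correlations from the precision matrix of an
    (n_cases x n_vars) data array."""
    K = np.linalg.inv(np.cov(data, rowvar=False))
    d = np.sqrt(np.diag(K))
    pcor = -K / np.outer(d, d)
    np.fill_diagonal(pcor, 1.0)
    return pcor
```

    For data generated from a chain x1 -> x2 -> x3, the estimated partial correlation between x1 and x3 vanishes (they are conditionally independent given x2), while x1-x2 keeps a strong edge, which is exactly the conditional-independence structure a GGM is meant to expose.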

  17. Mathematical model for estimating of technical and technological indicators of railway stations operation

    Directory of Open Access Journals (Sweden)

    D.M. Kozachenko

    2013-06-01

    Full Text Available Purpose. The article aims to create a mathematical model of railway station functioning for solving problems of station technology development on the basis of a plan-schedule. Methodology. The methods of graph theory and object-oriented analysis are used. The model of the station activity plan-schedule includes a model of the technical equipment of the station (the plan-schedule net) and a model of the station functioning, both formalized on the basis of parametric graphs. Findings. The presented model is implemented as an application to the AutoCAD graphics package. The software is developed in Visual LISP and Visual Basic. Since the construction of the plan-schedule is mostly a traditional process of adding, deleting and modifying icons, the developed interface is intuitive for a technologist and requires practically no additional training. Originality. A mathematical model was created on the basis of graph theory and object-oriented analysis in order to evaluate the technical and technological indicators of railway stations, focused on solving problems of their technology development. Practical value. The proposed mathematical model is implemented as an application to the AutoCAD graphics package. The availability of a mathematical model allows automatic analysis of the plan-schedule, thereby reducing the time to create it by more than half.

  18. Technical Review of the CENWP Computational Fluid Dynamics Model of the John Day Dam Forebay

    Energy Technology Data Exchange (ETDEWEB)

    Rakowski, Cynthia L.; Serkowski, John A.; Richmond, Marshall C.

    2010-12-01

    The US Army Corps of Engineers Portland District (CENWP) has developed a computational fluid dynamics (CFD) model of the John Day forebay on the Columbia River to aid in the development and design of alternatives to improve juvenile salmon passage at the John Day Project. At the request of CENWP, the Pacific Northwest National Laboratory (PNNL) Hydrology Group has conducted a technical review of CENWP's CFD model, which runs in the STAR-CD CFD solver software. PNNL has extensive experience developing and applying 3D CFD models run in STAR-CD for Columbia River hydroelectric projects. The John Day forebay model developed by CENWP is adequately configured and validated, and is ready for use simulating forebay hydraulics for structural and operational alternatives. The approach and method are sound; however, CENWP has identified some improvements that need to be made for future models and for modifications to this existing model.

  19. Four simultaneous component models for the analysis of multivariate time series from more than one subject to model intraindividual and interindividual differences

    NARCIS (Netherlands)

    Timmerman, Mariek E.; Kiers, Henk A.L.

    A class of four simultaneous component models for the exploratory analysis of multivariate time series collected from more than one subject simultaneously is discussed. In each of the models, the multivariate time series of each subject is decomposed into a few series of component scores and a

  20. Modelling technical snow production for skiing areas in the Austrian Alps with the physically based snow model AMUNDSEN

    Science.gov (United States)

    Hanzer, F.; Marke, T.; Steiger, R.; Strasser, U.

    2012-04-01

    Tourism, and particularly winter tourism, is a key factor for the Austrian economy. Judging from currently available climate simulations, the Austrian Alps show a particularly high vulnerability to climatic changes. To reduce the exposure of ski areas to changes in natural snow conditions, as well as to generally enhance snow conditions at skiing sites, technical snowmaking is widely utilized across Austrian ski areas. While such measures result in better snow conditions at the skiing sites and are important for the local skiing industry, their economic efficiency must also be taken into account. The current work emerges from the project CC-Snow II, where improved future climate scenario simulations are used to determine future natural and artificial snow conditions and their effects on tourism and the economy in the Austrian Alps. In a first step, a simple technical snowmaking approach is incorporated into the process-based snow model AMUNDSEN, which operates at a spatial resolution of 10-50 m and a temporal resolution of 1-3 hours. Locations of skiing slopes within a ski area in Styria, Austria, were digitized and imported into the model environment. During a predefined time frame at the beginning of the ski season, the model produces the maximum possible amount of technical snow and distributes it on the slopes; afterwards, until the end of the ski season, the model tries to maintain a certain snow depth threshold on the slopes. As only a few input parameters are required, this approach is easily transferable to other ski areas. In our poster contribution, we present first results of this snowmaking approach and give an overview of the data and methodology applied.
In a further step in CC-Snow, this simple bulk approach will be extended to consider actual snow cannon locations and technical specifications, which will allow a more detailed description of technical snow production as well as cannon-based recording of water and energy
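    The two-phase snowmaking rule described above (base-building at maximum rate, then topping up to a threshold) can be sketched as a toy daily-step simulation. All quantities, units and the constant melt term are illustrative assumptions; the real model works on a 10-50 m grid at 1-3 h resolution with a full energy balance.

```python
def simulate_slope_snow(days, base_window, max_rate, threshold, melt):
    """Toy two-phase snowmaking: produce at max_rate during the initial
    base-building window, afterwards only top the pack back up to threshold.
    melt[d] is the (illustrative) daily ablation."""
    depth = 0.0
    history = []
    for day in range(days):
        depth = max(depth - melt[day], 0.0)            # ablation first
        if day < base_window:
            depth += max_rate                          # base-building phase
        elif depth < threshold:
            depth += min(max_rate, threshold - depth)  # maintenance phase
        history.append(depth)
    return history
```

    After the base-building window, production stops entirely while the pack exceeds the threshold, and resumes in small top-ups once it decays to the threshold, which is the behaviour that drives the water and energy demand the extended model is meant to record.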

  1. MATHEMATICAL MODELS OF PROCESSES AND SYSTEMS OF TECHNICAL OPERATION FOR ONBOARD COMPLEXES AND FUNCTIONAL SYSTEMS OF AVIONICS

    Directory of Open Access Journals (Sweden)

    Sergey Viktorovich Kuznetsov

    2017-01-01

    Full Text Available Modern aircraft are equipped with complicated avionics systems and complexes. The technical operation process of an aircraft and its avionics is observed as a process with changing operation states. Mathematical models of avionics processes and systems of technical operation are represented as Markov chains, Markov and semi-Markov processes. The purpose is to develop graph-models of avionics technical operation processes, describing their work in flight as well as during maintenance on the ground in the various systems of technical operation. Graph-models of the processes and systems of onboard complexes and functional avionics systems in flight are proposed, based on state tables. The models are specified for the various technical operation systems: the system with control of the reliability level, the system with parameters control and the system with resource control. The events that cause the avionics complexes and functional systems to change their technical state are failures and faults detected by built-in test equipment. The avionics system of technical operation with reliability level control is applicable for objects with a constant or slowly varying failure rate. The avionics system of technical operation with resource control is mainly used for objects with a failure rate that increases over time. The avionics system of technical operation with parameters control is used for objects with a failure rate that increases over time and with generalized parameters that support forecasting and assigning the borders of before-failure technical states. The proposed formal graphical approach to designing models of avionics complexes and systems is the basis for constructing models of complex systems and facilities, both for a single aircraft and for an airline's aircraft fleet, or even for the entire fleet of some specific aircraft type. The ultimate graph-models for avionics in various systems of technical operation permit the beginning of
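    A Markov-chain model of operation states like those above yields, among other things, the long-run fraction of time spent in each state. A minimal stdlib sketch with an illustrative three-state chain (the states and transition probabilities are assumptions for demonstration, not values from the article):

```python
def stationary_distribution(P, iters=10000):
    """Stationary distribution of a finite Markov chain by power iteration of
    pi = pi * P, starting from the uniform distribution."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

# Illustrative technical-operation states: 0 = operating, 1 = maintenance, 2 = failed.
P = [[0.90, 0.08, 0.02],
     [0.70, 0.25, 0.05],
     [0.50, 0.40, 0.10]]
```

    The resulting vector satisfies pi = pi * P and sums to one; its first component is the steady-state availability of the object under the chosen transition probabilities.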

  2. Regression and regression analysis time series prediction modeling on climate data of quetta, pakistan

    International Nuclear Information System (INIS)

    Jafri, Y.Z.; Kamal, L.

    2007-01-01

    Various statistical techniques were applied to five years of data (1998-2002) on average humidity, rainfall, and maximum and minimum temperatures. Regression analysis time series (RATS) relationships were developed for determining the overall trend of these climate parameters, on the basis of which forecast models can be corrected and modified. We computed the coefficient of determination as a measure of goodness of fit for our polynomial regression analysis time series (PRATS). Multiple linear regression (MLR) and multiple linear regression analysis time series (MLRATS) correlations were also developed for deciphering the interdependence of the weather parameters. Spearman's rank correlation and the Goldfeld-Quandt test were used to check the uniformity or non-uniformity of variances in our polynomial regression (PR) fit. The Breusch-Pagan test was applied to MLR and MLRATS, respectively, which yielded homoscedasticity. We also employed Bartlett's test for homogeneity of variances on five years of rainfall and humidity data, respectively, which showed that the variances in the rainfall data were not homogeneous, while those in the humidity data were. Our results on regression and regression analysis time series show the best fit for prediction modeling of the climatic data of Quetta, Pakistan. (author)
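    The polynomial-regression-with-R² step can be sketched without any statistics package: build the normal equations from the Vandermonde moments, solve them, and report the coefficient of determination. This is a generic illustration, not the authors' code.

```python
def polyfit_r2(t, y, degree):
    """Least-squares polynomial fit via the normal equations (stdlib only),
    returning the coefficients (lowest order first) and R^2."""
    m = degree + 1
    A = [[sum(ti ** (i + j) for ti in t) for j in range(m)] for i in range(m)]
    b = [sum(yi * ti ** i for ti, yi in zip(t, y)) for i in range(m)]
    # Gaussian elimination with partial pivoting
    for col in range(m):
        piv = max(range(col, m), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, m):
            f = A[r][col] / A[col][col]
            for c2 in range(col, m):
                A[r][c2] -= f * A[col][c2]
            b[r] -= f * b[col]
    coef = [0.0] * m
    for r in range(m - 1, -1, -1):
        coef[r] = (b[r] - sum(A[r][c2] * coef[c2] for c2 in range(r + 1, m))) / A[r][r]
    # coefficient of determination
    fit = [sum(c * ti ** i for i, c in enumerate(coef)) for ti in t]
    ybar = sum(y) / len(y)
    ss_res = sum((yi - fi) ** 2 for yi, fi in zip(y, fit))
    ss_tot = sum((yi - ybar) ** 2 for yi in y)
    return coef, 1.0 - ss_res / ss_tot
```

    For high polynomial degrees the normal equations become ill-conditioned; orthogonal-polynomial or QR-based fits are preferred in practice.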

  3. Statistical tools for analysis and modeling of cosmic populations and astronomical time series: CUDAHM and TSE

    Science.gov (United States)

    Loredo, Thomas; Budavari, Tamas; Scargle, Jeffrey D.

    2018-01-01

    This presentation provides an overview of open-source software packages addressing two challenging classes of astrostatistics problems. (1) CUDAHM is a C++ framework for hierarchical Bayesian modeling of cosmic populations, leveraging graphics processing units (GPUs) to enable applying this computationally challenging paradigm to large datasets. CUDAHM is motivated by measurement error problems in astronomy, where density estimation and linear and nonlinear regression must be addressed for populations of thousands to millions of objects whose features are measured with possibly complex uncertainties, potentially including selection effects. An example calculation demonstrates accurate GPU-accelerated luminosity function estimation for simulated populations of 10^6 objects in about two hours using a single NVIDIA Tesla K40c GPU. (2) Time Series Explorer (TSE) is a collection of software in Python and MATLAB for exploratory analysis and statistical modeling of astronomical time series. It comprises a library of stand-alone functions and classes, as well as an application environment for interactive exploration of time series data. The presentation will summarize key capabilities of this emerging project, including new algorithms for analysis of irregularly sampled time series.

  4. Time series modeling for analysis and control advanced autopilot and monitoring systems

    CERN Document Server

    Ohtsu, Kohei; Kitagawa, Genshiro

    2015-01-01

    This book presents multivariate time series methods for the analysis and optimal control of feedback systems. Although ships' autopilot systems are considered throughout the book, the methods set forth can be applied to many other complicated, large, or noisy feedback control systems for which it is difficult to derive a model of the entire system from theory in that subject area. The basic models used in this method are the multivariate autoregressive model with exogenous variables (ARX model) and the radial basis function net-type coefficients ARX model. Noise contribution analysis can then be performed through the estimated autoregressive (AR) model, and various types of autopilot systems can be designed through the state–space representation of the models. The marine autopilot systems addressed in this book include optimal controllers for course-keeping motion, rolling reduction controllers with rudder motion, engine governor controllers, noise adaptive autopilots, route-tracki...
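    The ARX model at the core of this approach is linear in its coefficients, so the first-order scalar case can be fitted by solving a 2x2 normal-equation system. This sketch is illustrative only; the book treats multivariate, higher-order ARX models and the radial-basis-function coefficient variant.

```python
def fit_arx(y, u):
    """Least-squares fit of a first-order ARX model
        y_t = a * y_{t-1} + b * u_{t-1} + e_t
    (y = output, u = exogenous input, e.g. rudder angle)."""
    rows = [(y[t - 1], u[t - 1], y[t]) for t in range(1, len(y))]
    s11 = sum(r[0] * r[0] for r in rows)
    s12 = sum(r[0] * r[1] for r in rows)
    s22 = sum(r[1] * r[1] for r in rows)
    t1 = sum(r[0] * r[2] for r in rows)
    t2 = sum(r[1] * r[2] for r in rows)
    det = s11 * s22 - s12 * s12
    a = (s22 * t1 - s12 * t2) / det
    b = (s11 * t2 - s12 * t1) / det
    return a, b
```

    Once the coefficients are estimated, the model's state-space form is what the controller design (e.g. an LQ autopilot) operates on.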

  5. Comparing and Contrasting Traditional Membrane Bioreactor Models with Novel Ones Based on Time Series Analysis

    Directory of Open Access Journals (Sweden)

    Parneet Paul

    2013-02-01

    Full Text Available The computer modelling and simulation of wastewater treatment plants and their specific technologies, such as membrane bioreactors (MBRs), are becoming increasingly useful to consultant engineers when designing, upgrading, retrofitting, operating and controlling these plants. This research uses traditional phenomenological mechanistic models based on MBR filtration and biochemical processes to measure the effectiveness of alternative and novel time series models based upon input–output system identification methods. Both model types are calibrated and validated using similar plant layouts and data sets derived for this purpose. Results prove that although both approaches have their advantages, they also have specific disadvantages. In conclusion, the MBR plant designer and/or operator who wishes to use good-quality, calibrated models to gain a better understanding of their process should carefully consider which model type to select based upon their initial modelling objectives. Each situation usually proves unique.

  6. Statistical properties of fluctuations of time series representing appearances of words in nationwide blog data and their applications: An example of modeling fluctuation scalings of nonstationary time series

    Science.gov (United States)

    Watanabe, Hayafumi; Sano, Yukie; Takayasu, Hideki; Takayasu, Misako

    2016-11-01

    To elucidate the nontrivial empirical statistical properties of fluctuations of a typical nonsteady time series representing the appearance of words in blogs, we investigated approximately 3 × 10^9 Japanese blog articles over a period of six years and analyzed some corresponding mathematical models. First, we introduce a solvable nonsteady extension of the random diffusion model, which can be deduced by modeling the behavior of heterogeneous random bloggers. Next, we deduce theoretical expressions for both the temporal and ensemble fluctuation scalings of this model, and demonstrate that these expressions can reproduce all empirical scalings over eight orders of magnitude. Furthermore, we show that the model can reproduce other statistical properties of time series representing the appearance of words in blogs, such as the functional forms of the probability density and correlations in the total number of blogs. As an application, we quantify the abnormality of special nationwide events by measuring the fluctuation scalings of 1771 basic adjectives.

  7. A time series model: First-order integer-valued autoregressive (INAR(1))

    Science.gov (United States)

    Simarmata, D. M.; Novkaniza, F.; Widyaningsih, Y.

    2017-07-01

    Nonnegative integer-valued time series arise in many applications. A time series model, the first-order Integer-valued AutoRegressive model (INAR(1)), is constructed with the binomial thinning operator to model nonnegative integer-valued time series. INAR(1) depends on one preceding period of the process. The model parameter can be estimated by Conditional Least Squares (CLS). The specification of INAR(1) follows that of AR(1). Forecasting in INAR(1) uses the median or the Bayesian forecasting methodology. The median forecasting methodology finds the least integer s for which the cumulative distribution function (CDF) up to s is greater than or equal to 0.5. The Bayesian forecasting methodology forecasts h steps ahead by generating the model parameter and the parameter of the innovation term using Adaptive Rejection Metropolis Sampling within Gibbs sampling (ARMS), then finding the least integer s for which the CDF up to s is greater than or equal to u, where u is a value drawn from the Uniform(0,1) distribution. INAR(1) is applied to monthly pneumonia cases in Penjaringan, Jakarta Utara, from January 2008 to April 2016.
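A minimal sketch of the binomial-thinning construction and the median forecast described above. All parameter values are hypothetical, and the one-step CDF is approximated by Monte Carlo rather than computed exactly:

```python
import numpy as np

# Illustrative INAR(1) with binomial thinning (alpha ∘ X):
#   X[t] = alpha ∘ X[t-1] + e[t],  e[t] ~ Poisson(lam)
# Parameter values are hypothetical, not estimates from the pneumonia data.
rng = np.random.default_rng(42)
alpha, lam, n = 0.6, 2.0, 400

x = np.zeros(n, dtype=int)
x[0] = rng.poisson(lam / (1 - alpha))          # start near the stationary mean
for t in range(1, n):
    survivors = rng.binomial(x[t-1], alpha)    # binomial thinning of last count
    x[t] = survivors + rng.poisson(lam)        # plus new (innovation) arrivals

# Moment-based estimate of alpha (the slope of X[t] on X[t-1], as in CLS).
xl, xc = x[:-1], x[1:]
alpha_cls = np.cov(xl, xc)[0, 1] / np.var(xl)

# Median forecast: least integer s with P(X[n] <= s | X[n-1]) >= 0.5,
# approximated here by simulating the one-step-ahead distribution.
sims = rng.binomial(x[-1], alpha, 10000) + rng.poisson(lam, 10000)
s = int(np.median(sims))
print(alpha_cls, s)
```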

  8. Water quality management using statistical analysis and time-series prediction model

    Science.gov (United States)

    Parmar, Kulwinder Singh; Bhardwaj, Rashmi

    2014-12-01

    This paper deals with water quality management using statistical analysis and a time-series prediction model. The monthly variation of water quality standards has been used to compare the statistical mean, median, mode, standard deviation, kurtosis, skewness and coefficient of variation at the Yamuna River. The model was validated using R-squared, root mean square error, mean absolute percentage error, maximum absolute percentage error, mean absolute error, maximum absolute error, normalized Bayesian information criterion, Ljung-Box analysis, predicted values and confidence limits. Using an autoregressive integrated moving average model, future values of the water quality parameters have been estimated. It is observed that the predictive model is useful at the 95% confidence limit, and that the distribution is platykurtic for potential of hydrogen (pH), free ammonia, total Kjeldahl nitrogen, dissolved oxygen and water temperature (WT), and leptokurtic for chemical oxygen demand and biochemical oxygen demand. It is also observed that the predicted series is close to the original series, providing a very good fit. All parameters except pH and WT cross the limits prescribed by the World Health Organization/United States Environmental Protection Agency, and thus the water is not fit for drinking, agricultural or industrial use.
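The platykurtic/leptokurtic classification mentioned above rests on the sign of the excess (Fisher) kurtosis. A dependency-free sketch on synthetic stand-in series, not the Yamuna River data:

```python
import numpy as np

# Sketch: classify a series as platykurtic or leptokurtic from its excess
# (Fisher) kurtosis, which is 0 for a normal distribution. The two series
# below are synthetic stand-ins, not the paper's measurements.
def excess_kurtosis(x):
    x = np.asarray(x, dtype=float)
    z = (x - x.mean()) / x.std()
    return np.mean(z**4) - 3.0

def shape(x):
    return "leptokurtic" if excess_kurtosis(x) > 0 else "platykurtic"

rng = np.random.default_rng(1)
uniform_like = rng.uniform(6.5, 8.5, 5000)   # bounded, flat series (pH-like stand-in)
heavy_tailed = rng.laplace(0.0, 1.0, 5000)   # spiky, heavy-tailed stand-in
print(shape(uniform_like), shape(heavy_tailed))
```

A uniform distribution has excess kurtosis of -1.2 (platykurtic), a Laplace distribution +3 (leptokurtic), which is the distinction the abstract draws between the two groups of parameters.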

  9. Extracting Knowledge From Time Series An Introduction to Nonlinear Empirical Modeling

    CERN Document Server

    Bezruchko, Boris P

    2010-01-01

    This book addresses the fundamental question of how to construct mathematical models for the evolution of dynamical systems from experimentally-obtained time series. It places emphasis on chaotic signals and nonlinear modeling and discusses different approaches to the forecast of future system evolution. In particular, it teaches readers how to construct difference and differential model equations depending on the amount of a priori information that is available on the system in addition to the experimental data sets. This book will benefit graduate students and researchers from all natural sciences who seek a self-contained and thorough introduction to this subject.

  10. Modeling Dyadic Processes Using Hidden Markov Models: A Time Series Approach to Mother-Infant Interactions during Infant Immunization

    Science.gov (United States)

    Stifter, Cynthia A.; Rovine, Michael

    2015-01-01

    The present longitudinal study examined mother-infant interaction during the administration of immunizations at 2 and 6 months of age using hidden Markov modelling, a time series approach that produces latent states to describe how mothers and infants work together to bring the infant to a soothed state. Results revealed a…

  11. Aircraft/Air Traffic Management Functional Analysis Model: Technical Description. 2.0

    Science.gov (United States)

    Etheridge, Melvin; Plugge, Joana; Retina, Nusrat

    1998-01-01

    The Aircraft/Air Traffic Management Functional Analysis Model, Version 2.0 (FAM 2.0), is a discrete event simulation model designed to support analysis of alternative concepts in air traffic management and control. FAM 2.0 was developed by the Logistics Management Institute (LMI) under a National Aeronautics and Space Administration (NASA) contract. This document provides a technical description of FAM 2.0 and its computer files to enable the modeler and programmer to make enhancements or modifications to the model. Those interested in a guide for using the model in analysis should consult the companion document, Aircraft/Air Traffic Management Functional Analysis Model, Version 2.0 Users Manual.

  12. Energy Supply Planning Model documentation. Volume II. Technical manual. Final report

    Energy Technology Data Exchange (ETDEWEB)

    1981-09-01

    The Energy Supply Planning Model (ESPM) provides a systematic means of calculating, for any candidate energy development strategy, the total direct resources (capital, labor, materials, equipment, land, water, and energy) required to build and operate the energy-related supply facilities needed for the strategy. The model is used to analyze the feasibility and impacts of proposed strategies. This report provides a technical description of the model's computation methods and file structure to guide model set-up and allow program modification. It documents the model's primary data base. The ESPM consists of a number of separate programs which are generally run in sequence as submodels. Section 2 of this report provides an overview of these programs - their functions, application sequence, and the interconnecting file structure. The remaining sections describe each program and the model data base. The source code on the computer tape provides a complete definition of the algorithms used. (MCW)

  13. Harmonic regression of Landsat time series for modeling attributes from national forest inventory data

    Science.gov (United States)

    Wilson, Barry T.; Knight, Joseph F.; McRoberts, Ronald E.

    2018-03-01

    Imagery from the Landsat Program has been used frequently as a source of auxiliary data for modeling land cover, as well as a variety of attributes associated with tree cover. With ready access to all scenes in the archive since 2008 due to the USGS Landsat Data Policy, new approaches to deriving such auxiliary data from dense Landsat time series are required. Several methods have previously been developed for use with finer temporal resolution imagery (e.g. AVHRR and MODIS), including image compositing and harmonic regression using Fourier series. This manuscript presents a study using Minnesota, USA, during the years 2009-2013 as the study area and timeframe. The study examined the relative predictive power of land cover models, in particular those related to tree cover, using predictor variables based solely on composite imagery versus those using estimated harmonic regression coefficients. The study used two common non-parametric modeling approaches (i.e. k-nearest neighbors and random forests) for fitting classification and regression models of multiple attributes measured on USFS Forest Inventory and Analysis plots using all available Landsat imagery for the study area and timeframe. The estimated Fourier coefficients developed by harmonic regression of tasseled cap transformation time series data were shown to be correlated with land cover, including tree cover. Regression models using estimated Fourier coefficients as predictor variables showed a two- to threefold increase in explained variance for a small set of continuous response variables, relative to comparable models using monthly image composites. Similarly, the overall accuracies of classification models using the estimated Fourier coefficients were approximately 10-20 percentage points higher than those of the models using the image composites, with corresponding individual class accuracies between 6 and 45 percentage points higher.
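Harmonic regression of the kind described above can be sketched as an ordinary least-squares fit of first-order Fourier terms to irregularly timed observations, with the fitted coefficients then serving as predictor variables. The signal below is synthetic, not a tasseled cap series:

```python
import numpy as np

# Sketch of harmonic regression on an irregularly sampled annual signal:
# fit y(t) = c0 + a1*cos(2*pi*t) + b1*sin(2*pi*t) by least squares, with t
# in years, and use (c0, a1, b1) as model features.
rng = np.random.default_rng(7)
t = np.sort(rng.uniform(0, 5, 120))            # irregular acquisition dates (years)
y = (0.3 + 0.2 * np.cos(2*np.pi*t) + 0.1 * np.sin(2*np.pi*t)
     + 0.02 * rng.normal(size=t.size))         # synthetic seasonal reflectance

X = np.column_stack([np.ones_like(t), np.cos(2*np.pi*t), np.sin(2*np.pi*t)])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)   # estimated Fourier coefficients
c0, a1, b1 = coef
amplitude = np.hypot(a1, b1)                   # seasonal amplitude
phase = np.arctan2(b1, a1)                     # seasonal phase
print(c0, amplitude, phase)
```

Unlike monthly composites, this fit needs no regular sampling grid, which is one reason the coefficients work well as features for dense but irregular Landsat stacks.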

  14. Creation and evaluation of a database of renewable production time series and other data for energy system modelling

    International Nuclear Information System (INIS)

    Janker, Karl Albert

    2015-01-01

    This thesis describes a model which generates renewable power generation time series as input data for energy system models. The focus is on photovoltaic systems and wind turbines. The basis is a high resolution global raster data set of weather data for many years. This data is validated, corrected and preprocessed. The composition of the hourly generation data is done via simulation of the respective technology. The generated time series are aggregated for different regions and are validated against historical production time series.

  15. Time-series regression models to study the short-term effects of environmental factors on health

    OpenAIRE

    Tobías, Aureli; Saez, Marc

    2004-01-01

    Time series regression models are especially suitable in epidemiology for evaluating short-term effects of time-varying exposures on health. The problem is that the potential for confounding in time series regression is very high. Thus, it is important that trend and seasonality are properly accounted for. Our paper reviews the statistical models commonly used in time-series regression methods, especially those allowing for serial correlation, which makes them potentially useful for selected epidemiological pu...

  16. A model of cardiopulmonary bypass staged training integrating technical and non-technical skills dedicated to cardiac trainees.

    Science.gov (United States)

    Fouilloux, V; Doguet, F; Kotsakis, A; Dubrowski, A; Berdah, S

    2015-03-01

    To develop a standardized simulation-based curriculum that teaches junior cardiac trainees (CTs) in France the medical knowledge and the technical, communication and critical thinking skills necessary to initiate and wean from cardiopulmonary bypass (CPB). Performance on post-curricular tests was compared between CTs who participated in the new curriculum and those who did not. The simulation-based curriculum was developed by content and education experts. Simulations sequentially taught the skills necessary for initiating and weaning from CPB, as well as managing crises, by adding fidelity and complexity to scenarios. Nine CTs were randomly assigned to the new curriculum (n=5) or the traditional curriculum (n=4). Skills were assessed using tests of medical knowledge and technical, communication (GRS) and critical thinking (SCT) skills. A two-sample Wilcoxon rank-sum test compared average scores between the two groups. An alpha of 0.05 was set to indicate statistically significant differences. The results revealed that CTs in the new curriculum significantly outperformed CTs in the traditional curriculum on technical (18.2 vs 14.8, p=0.05) and communication (3.5 vs 2.2, p=0.013) skills. There was no significant difference between CTs in the new curriculum and CTs in the traditional curriculum on the Script Concordance Test (16.5 vs 14.8, p=0.141) or the knowledge tests (26.9 vs 24.6, p=0.14). Our new curriculum teaches the communication and technical skills necessary for CPB. The results of this pilot study are encouraging and relevant, and give grounds for future research with a larger panel of trainees. Based on the current distribution of scores, a sample size of 12 CTs per group should yield significant results for all tests. © The Author(s) 2014.

  17. Prediction of traffic-related nitrogen oxides concentrations using Structural Time-Series models

    Science.gov (United States)

    Lawson, Anneka Ruth; Ghosh, Bidisha; Broderick, Brian

    2011-09-01

    Ambient air quality monitoring, modeling and compliance to the standards set by European Union (EU) directives and World Health Organization (WHO) guidelines are required to ensure the protection of human and environmental health. Congested urban areas are most susceptible to traffic-related air pollution which is the most problematic source of air pollution in Ireland. Long-term continuous real-time monitoring of ambient air quality at such urban centers is essential but often not realistic due to financial and operational constraints. Hence, the development of a resource-conservative ambient air quality monitoring technique is essential to ensure compliance with the threshold values set by the standards. As an intelligent and advanced statistical methodology, a Structural Time Series (STS) based approach has been introduced in this paper to develop a parsimonious and computationally simple air quality model. In STS methodology, the different components of a time-series dataset such as the trend, seasonal, cyclical and calendar variations can be modeled separately. To test the effectiveness of the proposed modeling strategy, average hourly concentrations of nitrogen dioxide and nitrogen oxides from a congested urban arterial in Dublin city center were modeled using STS methodology. The prediction error estimates from the developed air quality model indicate that the STS model can be a useful tool in predicting nitrogen dioxide and nitrogen oxides concentrations in urban areas and will be particularly useful in situations where the information on external variables such as meteorology or traffic volume is not available.
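A structural time series model decomposes a series into separately modelled components; the simplest member of the family is the local level model, which can be filtered with a scalar Kalman filter. The sketch below is a minimal, dependency-free illustration with assumed noise variances, not the nitrogen oxides model from the paper:

```python
import numpy as np

# Minimal structural time series sketch: a local level model
#   y[t] = mu[t] + eps[t],   mu[t] = mu[t-1] + eta[t]
# filtered with a scalar Kalman filter. The variances are assumed known here;
# in practice they would be estimated, e.g. by maximum likelihood.
def local_level_filter(y, var_eps, var_eta, mu0=0.0, p0=1e6):
    mu, p = mu0, p0
    level = []
    for obs in y:
        p = p + var_eta                 # predict step
        k = p / (p + var_eps)           # Kalman gain
        mu = mu + k * (obs - mu)        # update level with new observation
        p = (1 - k) * p
        level.append(mu)
    return np.array(level)

rng = np.random.default_rng(3)
true_level = np.cumsum(0.05 * rng.normal(size=300))  # slowly drifting trend
y = true_level + 0.5 * rng.normal(size=300)          # noisy observations

level = local_level_filter(y, var_eps=0.25, var_eta=0.0025)
print(np.mean((level[50:] - true_level[50:])**2),
      np.mean((y[50:] - true_level[50:])**2))
```

A full STS model of pollutant concentrations would add seasonal, cyclical and calendar components in the same state-space framework, as the abstract describes.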

  18. Applications of soft computing in time series forecasting simulation and modeling techniques

    CERN Document Server

    Singh, Pritpal

    2016-01-01

    This book reports on an in-depth study of fuzzy time series (FTS) modeling. It reviews and summarizes previous research work in FTS modeling and also provides a brief introduction to other soft-computing techniques, such as artificial neural networks (ANNs), rough sets (RS) and evolutionary computing (EC), focusing on how these techniques can be integrated into different phases of the FTS modeling approach. In particular, the book describes novel methods resulting from the hybridization of FTS modeling approaches with neural networks and particle swarm optimization. It also demonstrates how a new ANN-based model can be successfully applied in the context of predicting Indian summer monsoon rainfall. Thanks to its easy-to-read style and the clear explanations of the models, the book can be used as a concise yet comprehensive reference guide to fuzzy time series modeling, and will be valuable not only for graduate students, but also for researchers and professionals working for academic, business and governmen...

  19. Anomalous Advection-Dispersion Equations within General Fractional-Order Derivatives: Models and Series Solutions

    Directory of Open Access Journals (Sweden)

    Xin Liang

    2018-01-01

    Full Text Available In this paper, an anomalous advection-dispersion model involving a new general Liouville–Caputo fractional-order derivative is addressed for the first time. The series solutions of the general fractional advection-dispersion equations are obtained with the aid of the Laplace transform. The results are given to demonstrate the efficiency of the proposed formulations to describe the anomalous advection dispersion processes.

  20. Technical-economic modelling of an aluminium high pressure die casting system for automotive parts fabrication

    International Nuclear Information System (INIS)

    Faura, F.

    1997-01-01

    In the present paper a technical-economic model of an aluminium high pressure die casting system has been developed. In order to obtain the data needed for the correlations used by the model, the production systems of companies that use these processes have been analyzed. This has made it possible to determine the most important technological variables affecting the economics of the process. A computer application has been developed which allows the influence of different system parameters to be explored easily. (Author) 12 refs

  1. The Use of Function/Means Trees for Modelling Technical, Semantic and Business Functions

    DEFF Research Database (Denmark)

    Robotham, Antony John

    2000-01-01

    This paper considers the feasibility of using the function/means tree to create a single tree for a complete motor vehicle. It is argued that function/means trees can be used for modelling technical and semantic functions, but it is an inappropriate method for business functions when one tree...... of the vehicle is required. Life cycle modelling provides an effective means for determining all the required purpose functions and is considered a more effective method than the function/means tree for this task when the structure and mode of operation of the vehicle is well defined and understood....

  2. Electrostatic micro-actuator with a pre-charged series capacitor: modeling, design, and demonstration

    International Nuclear Information System (INIS)

    Yang, Hyun-Ho; Han, Chang-Hoon; Lee, Jeong Oen; Yoon, Jun-Bo

    2014-01-01

    As a powerful method to reduce the actuation voltage of an electrostatic micro-actuator, we propose and investigate an electrostatic micro-actuator with a pre-charged series capacitor. In contrast to a conventional electrostatic actuator, the pre-charge injected into the series capacitor can freely modulate the pull-in voltage of the proposed actuator even after the completion of fabrication. The static characteristics of the proposed actuator were investigated by first developing analytical models based on a parallel-plate capacitor model. We then successfully designed and demonstrated a micro-switch with a pre-charged series capacitor. The pull-in voltage of the fabricated micro-switch was reduced from 65.4 to 0.6 V when pre-charged with 46.3 V. The on-resistance of the fabricated micro-switch remained almost unchanged even when the device was pre-charged, which was demonstrated for the first time. All results from the analytical models, finite element method simulations, and measurements were in good agreement, with deviations of less than 10%. This work can be favorably adapted to electrostatic micro-switches that need a low actuation voltage without noticeable degradation of performance. (paper)
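The analytical starting point for a parallel-plate electrostatic actuator is the classic pull-in voltage formula V_PI = sqrt(8 k g0^3 / (27 eps0 A)). The sketch below evaluates it for hypothetical geometry and treats the series-capacitor pre-charge very crudely, as a bias that offsets the required source voltage; the paper's actual model is more detailed:

```python
import math

# Classic parallel-plate pull-in estimate: V_PI = sqrt(8*k*g0**3 / (27*eps0*A)).
# All geometry values are hypothetical, and the pre-charge effect is modeled
# only as a simple voltage offset, a crude stand-in for the paper's analysis.
eps0 = 8.854e-12          # vacuum permittivity (F/m)
k = 10.0                  # spring constant (N/m), assumed
g0 = 2e-6                 # initial electrode gap (m), assumed
A = (100e-6) ** 2         # electrode area (m^2), assumed

v_pi = math.sqrt(8 * k * g0**3 / (27 * eps0 * A))

v_precharge = 0.9 * v_pi            # bias stored on the series capacitor (assumed)
v_source = max(v_pi - v_precharge, 0.0)  # crude estimate of remaining source voltage
print(v_pi, v_source)
```

This captures the qualitative effect reported in the abstract: a pre-charge close to the pull-in voltage leaves only a small source voltage to be supplied.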

  3. Posterior-Predictive Evidence on US Inflation using Phillips Curve Models with Non-Filtered Time Series

    NARCIS (Netherlands)

    N. Basturk (Nalan); C. Cakmakli (Cem); S.P. Ceyhan (Pinar); H.K. van Dijk (Herman)

    2012-01-01

    textabstractChanging time series properties of US inflation and economic activity are analyzed within a class of extended Phillips Curve (PC) models. First, the misspecification effects of mechanical removal of low frequency movements of these series on posterior inference of a basic PC model are

  4. Posterior-Predictive Evidence on US Inflation using Phillips Curve Models with Non-Filtered Time Series

    NARCIS (Netherlands)

    Basturk, N.; Cakmakli, C.; Ceyhan, P.; van Dijk, H.K.

    2013-01-01

    Changing time series properties of US inflation and economic activity are analyzed within a class of extended Phillips Curve (PC) models. First, the misspecification effects of mechanical removal of low frequency movements of these series on posterior inference of a basic PC model are analyzed using

  5. 78 FR 76249 - Special Conditions: Airbus, Model A350-900 Series Airplane; Flight Envelope Protection: Normal...

    Science.gov (United States)

    2013-12-17

    .... As with the previous fly-by-wire airplanes, the FAA has no regulatory or safety reason to inhibit the...-0905; Notice No. 25-13-28-SC] Special Conditions: Airbus, Model A350-900 Series Airplane; Flight... Airbus Model A350- 900 series airplanes. These airplanes will have a novel or unusual design feature(s...

  6. Construction of the exact Fisher information matrix of Gaussian time series models by means of matrix differential rules

    NARCIS (Netherlands)

    Klein, A.A.B.; Melard, G.; Zahaf, T.

    2000-01-01

    The Fisher information matrix is of fundamental importance for the analysis of parameter estimation of time series models. In this paper the exact information matrix of a multivariate Gaussian time series model expressed in state space form is derived. A computationally efficient procedure is used

  7. A Landsat Time-Series Stacks Model for Detection of Cropland Change

    Science.gov (United States)

    Chen, J.; Chen, J.; Zhang, J.

    2017-09-01

    Global, timely, accurate and cost-effective cropland monitoring at a fine spatial resolution will dramatically improve our understanding of the effects of agriculture on greenhouse gas emissions, food safety, and human health. Time-series remote sensing imagery has shown particular potential to describe land cover dynamics. Traditional change detection techniques are often not capable of detecting land cover changes within time series that are severely influenced by seasonal differences, and are therefore more likely to generate pseudo changes. Here, we introduced and tested LTSM (Landsat time-series stacks model), an improvement on the previously proposed Continuous Change Detection and Classification (CCDC) approach, to extract spectral trajectories of land surface change using dense Landsat time-series stacks (LTS). The method is expected to eliminate pseudo changes caused by phenology driven by seasonal patterns. The main idea of the method is that, using all available Landsat 8 images within a year, an LTSM consisting of a two-term harmonic function is estimated iteratively for each pixel in each spectral band. The LTSM defines change areas by differencing the predicted and observed Landsat images. The LTSM approach was compared with the change vector analysis (CVA) method. The results indicated that the LTSM method correctly detected the "true change" without overestimating the "false" one, while CVA pointed out "true change" pixels along with a large number of "false changes". The detection of change areas achieved an overall accuracy of 92.37%, with a kappa coefficient of 0.676.
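The core predict-and-difference idea can be sketched for a single pixel: fit a harmonic model over a stable period, then flag observations whose residual from the prediction exceeds a multiple of the training RMSE. Data, threshold and break point below are illustrative, not the paper's:

```python
import numpy as np

# Per-pixel sketch of the LTSM/CCDC idea: fit a harmonic model to a stable
# year of (synthetic) observations, predict forward, and flag residuals
# larger than 3x the training RMSE as candidate land-cover change.
rng = np.random.default_rng(11)
t = np.linspace(0, 2, 92)                       # ~2 years of acquisitions (years)
seasonal = 0.25 + 0.1 * np.cos(2*np.pi*t) + 0.05 * np.sin(2*np.pi*t)
y = seasonal + 0.01 * rng.normal(size=t.size)
y[70:] += 0.2                                   # abrupt reflectance shift ("change")

train = slice(0, 46)                            # fit on the first (stable) year
X = np.column_stack([np.ones_like(t), np.cos(2*np.pi*t), np.sin(2*np.pi*t)])
coef, *_ = np.linalg.lstsq(X[train], y[train], rcond=None)
pred = X @ coef
rmse = np.sqrt(np.mean((y[train] - pred[train])**2))
change = np.abs(y - pred) > 3 * rmse            # phenology is explained away
print(np.flatnonzero(change)[:5])
```

Because the harmonic term absorbs the seasonal cycle, purely phenological variation stays below the threshold, which is how the method avoids the pseudo changes that plague simple differencing.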

  8. Big Data impacts on stochastic Forecast Models: Evidence from FX time series

    Directory of Open Access Journals (Sweden)

    Sebastian Dietz

    2013-12-01

    Full Text Available With the rise of the Big Data paradigm, new tasks for prediction models have appeared. In addition to the volume problem of such data sets, nonlinearity becomes important, as more detailed data sets also contain more comprehensive information, e.g. about non-regular seasonal or cyclical movements as well as jumps in time series. This essay compares two nonlinear methods for predicting a high-frequency time series, the USD/Euro exchange rate. The first method investigated is the Autoregressive Neural Network Process (ARNN), a neural-network-based nonlinear extension of classical autoregressive process models from time series analysis (see Dietz 2011). Its advantage is its simple but scalable time series process model architecture, which is able to include all kinds of nonlinearities based on the universal approximation theorem of Hornik, Stinchcombe and White (1989) and the extensions of Hornik (1993). However, restrictions related to the numeric estimation procedures limit the flexibility of the model. The alternative is a Support Vector Machine model (SVM; Vapnik 1995). The two methods compared have different approaches to error minimization (empirical error minimization for the ARNN vs. structural error minimization for the SVM). Our new finding is that time series data classified as "Big Data" need new methods for prediction. Estimation and prediction were performed using the statistical programming language R. Besides the prediction results, we also discuss the impact of Big Data on the data preparation and model validation steps.

  9. A Time-Series Water Level Forecasting Model Based on Imputation and Variable Selection Method

    Directory of Open Access Journals (Sweden)

    Jun-He Yang

    2017-01-01

    Full Text Available Reservoirs are important for households and impact the national economy. This paper proposes a time-series forecasting model based on estimating missing values followed by variable selection to forecast a reservoir's water level. This study collected data from the Taiwan Shimen Reservoir as well as daily atmospheric data from 2008 to 2015. The two datasets were concatenated by date into an integrated research dataset. The proposed time-series forecasting model has three foci. First, this study uses five imputation methods to estimate missing values rather than simply deleting them. Second, we identified the key variables via factor analysis and then deleted the unimportant variables sequentially via the variable selection method. Finally, the proposed model uses a Random Forest to build the forecasting model of the reservoir's water level, which is compared with the listed methods in terms of forecasting error. The experimental results indicate that the Random Forest forecasting model, when combined with variable selection, has better forecasting performance than the model using the full set of variables. In addition, the experiments show that the proposed variable selection can help the five forecast methods used here to improve their forecasting capability.
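Two of the pipeline's stages, imputation of missing values and variable selection, can be sketched with numpy alone. The final Random Forest stage is replaced here by a plain linear fit to keep the sketch dependency-free; all variable names and values are hypothetical, not the Shimen Reservoir features:

```python
import numpy as np

# Sketch of an imputation + variable-selection pipeline. Mean imputation is
# one of several possible imputation methods; correlation screening stands in
# for the paper's factor-analysis-based selection; OLS stands in for the
# Random Forest. Data are synthetic.
rng = np.random.default_rng(8)
n = 300
rain = rng.gamma(2.0, 5.0, n)                    # daily rainfall (made up)
temp = 20 + 5 * rng.normal(size=n)               # air temperature (made up)
noise_var = rng.normal(size=n)                   # an irrelevant predictor
level = 240 + 0.4 * rain + 0.3 * temp + rng.normal(size=n)   # water level

X = np.column_stack([rain, temp, noise_var])
X[rng.choice(n, 30, replace=False), 0] = np.nan  # inject missing rainfall values

# 1) Imputation: replace NaNs with the column mean.
col_mean = np.nanmean(X, axis=0)
X_imp = np.where(np.isnan(X), col_mean, X)

# 2) Variable selection: keep predictors correlated with the target.
corr = np.array([abs(np.corrcoef(X_imp[:, j], level)[0, 1])
                 for j in range(X.shape[1])])
keep = corr > 0.2
X_sel = X_imp[:, keep]

# 3) Forecast model (stand-in for the Random Forest): ordinary least squares.
A = np.column_stack([np.ones(n), X_sel])
beta, *_ = np.linalg.lstsq(A, level, rcond=None)
print(keep, beta)
```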

  10. Forecasting electric vehicles sales with univariate and multivariate time series models: The case of China

    Science.gov (United States)

    Zhang, Yong; Zhong, Miner; Geng, Nana; Jiang, Yunjian

    2017-01-01

    The market demand for electric vehicles (EVs) has increased in recent years. Suitable models are necessary to understand and forecast EV sales. This study presents a singular spectrum analysis (SSA) as a univariate time-series model and vector autoregressive model (VAR) as a multivariate model. Empirical results suggest that SSA satisfactorily indicates the evolving trend and provides reasonable results. The VAR model, which comprised exogenous parameters related to the market on a monthly basis, can significantly improve the prediction accuracy. The EV sales in China, which are categorized into battery and plug-in EVs, are predicted in both short term (up to December 2017) and long term (up to 2020), as statistical proofs of the growth of the Chinese EV industry. PMID:28459872

  11. Optimizing Availability of a Framework in Series Configuration Utilizing Markov Model and Monte Carlo Simulation Techniques

    Directory of Open Access Journals (Sweden)

    Mansoor Ahmed Siddiqui

    2017-06-01

    Full Text Available This research work is aimed at optimizing the availability of a framework comprising two units linked together in a series configuration, utilizing Markov model and Monte Carlo (MC) simulation techniques. In this article, an effort has been made to develop a maintenance model that incorporates three distinct states for each unit, while taking into account their different levels of deterioration. Calculations are carried out using the proposed model for two distinct cases of corrective repair, namely perfect and imperfect repairs, with as well as without opportunistic maintenance. Initially, results are obtained using an analytical technique, i.e., the Markov model. Validation of the results is later carried out with the help of MC simulation. In addition, MC-simulation-based codes also work well for frameworks that follow non-exponential failure and repair rates, and thus overcome the limitations of the Markov model.
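The series-configuration availability setup can be sketched with a two-state-per-unit Monte Carlo simulation (the paper's model uses three states per unit) and checked against the analytic product-form steady-state availability; the failure and repair rates below are assumed:

```python
import numpy as np

# Monte Carlo sketch: steady-state availability of two repairable units in
# series with exponential failure/repair, checked against the analytic
# result A = prod(mu_i / (lambda_i + mu_i)). Rates are hypothetical.
rng = np.random.default_rng(9)
lam = np.array([0.01, 0.02])    # failure rates (per hour), assumed
mu = np.array([0.5, 0.4])       # repair rates (per hour), assumed

def simulate(horizon=2e5):
    t, up_time = 0.0, 0.0
    state = np.array([True, True])            # both units initially working
    while t < horizon:
        rates = np.where(state, lam, mu)      # next-event rate for each unit
        total = rates.sum()
        dt = rng.exponential(1.0 / total)
        if state.all():
            up_time += dt                     # series system up only if both up
        t += dt
        i = rng.choice(2, p=rates / total)    # which unit fails / finishes repair
        state[i] = ~state[i]
    return up_time / t

a_mc = simulate()
a_exact = np.prod(mu / (lam + mu))
print(a_mc, a_exact)
```

As the abstract notes, the value of the MC code is that the exponential assumption can be dropped by swapping in other holding-time distributions, where the Markov model no longer applies.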

  12. Forecasting electric vehicles sales with univariate and multivariate time series models: The case of China.

    Science.gov (United States)

    Zhang, Yong; Zhong, Miner; Geng, Nana; Jiang, Yunjian

    2017-01-01

    The market demand for electric vehicles (EVs) has increased in recent years. Suitable models are necessary to understand and forecast EV sales. This study presents a singular spectrum analysis (SSA) as a univariate time-series model and vector autoregressive model (VAR) as a multivariate model. Empirical results suggest that SSA satisfactorily indicates the evolving trend and provides reasonable results. The VAR model, which comprised exogenous parameters related to the market on a monthly basis, can significantly improve the prediction accuracy. The EV sales in China, which are categorized into battery and plug-in EVs, are predicted in both short term (up to December 2017) and long term (up to 2020), as statistical proofs of the growth of the Chinese EV industry.

  13. Time Series Model of Wind Speed for Multi Wind Turbines based on Mixed Copula

    Directory of Open Access Journals (Sweden)

    Nie Dan

    2016-01-01

    Full Text Available Because wind power is intermittent and random, large-scale grid integration will directly affect the safe and stable operation of the power grid. In order to study the wind speed characteristics of wind turbines quantitatively, a wind speed time series model for multiple wind turbine generators is constructed using a mixed Copula-ARMA function in this paper, and a numerical example is given. The research results show that the model can effectively predict the wind speed, ensure the efficient operation of the wind turbines, and provide a theoretical basis for the stability of grid-connected wind power operation.
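A single Gaussian copula driving a Weibull wind-speed marginal gives the flavor of the approach; the paper uses a mixed copula (a weighted combination of several copulas) with ARMA dynamics, and all parameters below are illustrative only:

```python
import math
import numpy as np

# Sketch: an ARMA(1,1) Gaussian driver supplies temporal dependence, and a
# Gaussian-copula transform maps it to a Weibull marginal typical for wind
# speed. This is a single-copula simplification of the paper's mixed copula.
rng = np.random.default_rng(6)
phi, theta, n = 0.8, 0.3, 5000
e = rng.normal(size=n)
z = np.zeros(n)
for t in range(1, n):
    z[t] = phi * z[t-1] + e[t] + theta * e[t-1]     # ARMA(1,1) driver

# Normal CDF of the standardized driver gives (approximately) uniform ranks.
u = np.array([0.5 * (1 + math.erf(v)) for v in z / (z.std() * math.sqrt(2))])
u = np.clip(u, 1e-6, 1 - 1e-6)                      # avoid the CDF endpoints

k_shape, lam = 2.0, 8.0                             # Weibull shape/scale (m/s), assumed
wind = lam * (-np.log(1 - u)) ** (1 / k_shape)      # inverse Weibull CDF
print(wind[:5])
```

The transform preserves the rank correlation of the driver, so the synthetic wind speeds keep an ARMA-like autocorrelation while following the chosen marginal distribution.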

  14. SITE-94. The CRYSTAL Geosphere Transport Model: Technical documentation version 2.1

    International Nuclear Information System (INIS)

    Worgan, K.; Robinson, P.

    1995-12-01

    CRYSTAL, a one-dimensional contaminant transport model of a densely fissured geosphere, was originally developed for the SKI Project-90 performance assessment program. It has since been extended to include matrix blocks of alternative basic geometries. CRYSTAL predicts the transport of arbitrary-length decay chains by advection, diffusion and surface sorption in the fissures and by diffusion into the rock matrix blocks. The model equations are solved in Laplace transform space and inverted numerically to the time domain. This approach avoids time-stepping and is consequently numerically very efficient. The source term for CRYSTAL may be supplied internally, using either the simple leaching or the band release submodel, or by input of a general time-series output from a near-field model. The time series input is interfaced with the geosphere model using the method of convolution. The response of the geosphere to delta-function inputs from each nuclide is combined with the time series outputs from the near-field to obtain the nuclide flux emerging from the far-field. 14 refs
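The convolution interface described above can be sketched in a few lines: convolve a band-release source term with the geosphere's unit (delta-function) response. The response curve below is a toy shape, not CRYSTAL's transfer function, and the time units are assumed:

```python
import numpy as np

# Sketch of the convolution interface: far-field flux = near-field release
# time series convolved with the geosphere's unit response. The response
# used here is a toy breakthrough-like curve, not CRYSTAL's.
dt = 1.0                                            # time step (years, assumed)
t = np.arange(0, 200, dt)
response = np.exp(-((np.log(t + 1) - 2.5) ** 2) / 0.1)  # toy unit response
response /= response.sum() * dt                     # normalize to unit mass

source = np.zeros_like(t)
source[:50] = 1.0 / 50.0                            # band release over 50 years

far_field = np.convolve(source, response)[:t.size] * dt
print(far_field.sum() * dt)                         # mass conservation check
```

Because convolution is linear, one impulse response per nuclide suffices to map any near-field time series to the far-field flux, which is exactly what makes the interface efficient.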

  15. Correlations in magnitude series to assess nonlinearities: Application to multifractal models and heartbeat fluctuations

    Science.gov (United States)

    Bernaola-Galván, Pedro A.; Gómez-Extremera, Manuel; Romance, A. Ramón; Carpena, Pedro

    2017-09-01

    The correlation properties of the magnitude of a time series are associated with nonlinear and multifractal properties and have been applied in a great variety of fields. Here we have obtained the analytical expression of the autocorrelation of the magnitude series (C_{|x|}) of a linear Gaussian noise as a function of its autocorrelation (C_{x}). For both models and natural signals, the deviation of C_{|x|} from its expectation in linear Gaussian noises can be used as an index of nonlinearity that can be applied to relatively short records and does not require the presence of scaling in the time series under study. In a model of an artificial Gaussian multifractal signal we use this approach to analyze the relation between nonlinearity and multifractality and show that the former implies the latter but the reverse is not true. We also apply this approach to analyze experimental data: heart-beat records during rest and moderate exercise. For each individual subject, we observe higher nonlinearities during rest. This behavior is also achieved on average for the analyzed set of 10 semiprofessional soccer players. This result agrees with the fact that other measures of complexity are dramatically reduced during exercise and can shed light on its relationship with the withdrawal of parasympathetic tone and/or the activation of sympathetic activity during physical activity.

  17. A new Markov-chain-related statistical approach for modelling synthetic wind power time series

    International Nuclear Information System (INIS)

    Pesch, T; Hake, J F; Schröders, S; Allelein, H J

    2015-01-01

    The integration of rising shares of volatile wind power in the generation mix is a major challenge for the future energy system. To address the uncertainties involved in wind power generation, models analysing and simulating the stochastic nature of this energy source are becoming increasingly important. One statistical approach that has been frequently used in the literature is the Markov chain approach. Recently, the method was identified as being of limited use for generating wind time series with time steps shorter than 15–40 min, as it is not capable of reproducing the autocorrelation characteristics accurately. This paper presents a new Markov-chain-related statistical approach that is capable of solving this problem by introducing a variable second lag. Furthermore, additional features are presented that allow for the further adjustment of the generated synthetic time series. The influence of the model parameter settings is examined through systematic parameter variations. The suitability of the approach is demonstrated by an application analysis using the example of the wind feed-in in Germany. It shows that, in contrast to conventional Markov chain approaches, the generated synthetic time series do not systematically underestimate the storage capacity required to balance wind power fluctuations. (paper)
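
A conventional first-order Markov chain generator, the baseline that the paper improves on with its variable second lag, can be sketched as follows; the binned "wind power" states here are invented stand-in data, not a real feed-in series:

```python
import random

def fit_transition_matrix(states, n_states):
    """Estimate a first-order Markov transition matrix from a discretized series."""
    counts = [[0] * n_states for _ in range(n_states)]
    for a, b in zip(states, states[1:]):
        counts[a][b] += 1
    matrix = []
    for row in counts:
        total = sum(row)
        matrix.append([c / total if total else 1.0 / n_states for c in row])
    return matrix

def simulate(matrix, start, length, rng):
    """Generate a synthetic state series by sampling the fitted chain."""
    out = [start]
    for _ in range(length - 1):
        r, acc = rng.random(), 0.0
        row = matrix[out[-1]]
        for s, p in enumerate(row):
            acc += p
            if r < acc:
                out.append(s)
                break
        else:
            out.append(len(row) - 1)
    return out

rng = random.Random(7)
observed = [rng.choice([0, 1, 1, 2]) for _ in range(500)]  # stand-in for binned wind power
matrix = fit_transition_matrix(observed, 3)
synthetic = simulate(matrix, observed[0], 200, rng)
```

It is exactly this first-order construction whose short-time-step autocorrelation is too weak, motivating the paper's variable second lag.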

  18. Environmental monitoring model for a drainage basin obtained through spectral analysis of time series.

    Science.gov (United States)

    Faht, Guilherme; da Silva, Marcos Rivail; Pinheiro, Adilson; Kaufmann, Vander; de Aguida, Leandro Mazzuco

    2012-08-01

    The quality of results of an environmental monitoring plan is limited by the weakest component, which could be the analytical approach or the sampling method. Considering both the possibilities and the fragility that sampling methods offer, this environmental monitoring study focused on the uncertainties caused by the time component. Four time series of nutrient concentration at two sampling points (PB1 and PB2) in the Ribeirão Garcia basin in Blumenau, Brazil, which were significantly correlated to the spatial component, were considered with a 2-hour resolution to develop efficient sampling models. These models were based on the time at which there was the highest tendency toward adverse environmental effects. Fourier spectral analysis was used to evaluate the time series and resulted in two sampling models: (1) the SMCP (sampling model for critical period), which operated with 100% efficiency for registering the highest concentration of nutrients and was valid for 83% of the studied parameters; and (2) the SMGCP (sampling model for global critical period), which operated with 83% and 50% efficiency for PB1 and PB2, respectively.
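
As a rough illustration of how spectral analysis can pick out a critical sampling period, the sketch below finds the dominant period of a series with a naive discrete Fourier transform; the 2-hourly "nutrient" series is synthetic, not the Ribeirão Garcia data:

```python
import math

def dominant_period(x):
    """Return the period (in samples) with the largest periodogram power,
    using a naive discrete Fourier transform (no dependencies)."""
    n = len(x)
    mean = sum(x) / n
    best_k, best_power = 1, -1.0
    for k in range(1, n // 2 + 1):          # skip the zero frequency
        re = sum((x[t] - mean) * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = sum((x[t] - mean) * math.sin(2 * math.pi * k * t / n) for t in range(n))
        power = re * re + im * im
        if power > best_power:
            best_k, best_power = k, power
    return n / best_k

# Hypothetical nutrient series sampled every 2 h with a 24 h (12-sample) cycle.
series = [10 + 3 * math.sin(2 * math.pi * t / 12) for t in range(120)]
print(dominant_period(series))  # -> 12.0
```

A sampling model would then concentrate measurements around the phase of this dominant cycle where concentrations peak.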

  19. Structural Time Series Model for El Niño Prediction

    Science.gov (United States)

    Petrova, Desislava; Koopman, Siem Jan; Ballester, Joan; Rodo, Xavier

    2015-04-01

    ENSO is a dominant feature of climate variability on inter-annual time scales, destabilizing weather patterns throughout the globe and having far-reaching socio-economic consequences. It not only leads to extensive rainfall and flooding in some regions of the world and anomalous droughts in others, ruining local agriculture, but also substantially affects marine ecosystems and the sustained exploitation of marine resources in particular coastal zones, especially along the Pacific South American coast. As a result, forecasting of ENSO, and especially of the warm phase of the oscillation (El Niño/EN), has long been a subject of intense research and improvement. The present study explores a novel method for the prediction of the Niño 3.4 index. The advantageous statistical modeling approach of structural time series analysis has not previously been applied to this problem. We have therefore developed such a model using a state space approach for the unobserved components of the time series. Its distinguishing feature is that observations are decomposed into various components: level, seasonality, cycle, disturbance, and regression variables incorporated as explanatory covariates. These components are aimed at capturing the various modes of variability of the N3.4 time series. They are modeled separately, then combined in a single model for analysis and forecasting. Customary statistical ENSO prediction models essentially use SST, SLP and wind stress in the equatorial Pacific. We introduce new regression variables: subsurface ocean temperature in the western equatorial Pacific, motivated by recent (Ramesh and Murtugudde, 2012) and classical research (Jin, 1997; Wyrtki, 1985) showing that subsurface processes and heat accumulation there are fundamental for the initiation of an El Niño event; and a southern Pacific temperature-difference tracer, the Rossbell dipole, leading EN by about nine months (Ballester, 2011).
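
A full structural time series model combines level, seasonal, cycle and regression components estimated jointly. As a minimal sketch of the underlying state-space machinery, the following Kalman filter estimates just a local level; all numbers are illustrative, and this is not the authors' Niño 3.4 model:

```python
def local_level_filter(y, q, r, a0=0.0, p0=1e6):
    """Kalman filter for the local level model:
       y_t = a_t + e_t (obs. variance r),  a_t = a_{t-1} + w_t (state variance q).
    Returns the filtered level estimates (diffuse initialization via large p0)."""
    a, p, levels = a0, p0, []
    for obs in y:
        p = p + q                      # predict the state variance
        k = p / (p + r)                # Kalman gain
        a = a + k * (obs - a)          # update the level with the new observation
        p = (1 - k) * p
        levels.append(a)
    return levels

levels = local_level_filter([5.1, 4.9, 5.0, 5.2, 4.8, 5.0], q=0.01, r=0.25)
```

Seasonal, cycle and regression terms extend this by enlarging the state vector; the filtering recursion keeps the same predict-update shape.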

  20. Technical Work Plan for: Near Field Environment: Engineered System: Radionuclide Transport Abstraction Model Report

    Energy Technology Data Exchange (ETDEWEB)

    J.D. Schreiber

    2006-12-08

    This technical work plan (TWP) describes work activities to be performed by the Near-Field Environment Team. The objective of the work scope covered by this TWP is to generate Revision 03 of EBS Radionuclide Transport Abstraction, referred to herein as the radionuclide transport abstraction (RTA) report. The RTA report is being revised primarily to address condition reports (CRs), to address issues identified by the Independent Validation Review Team (IVRT), to address the potential impact of transport, aging, and disposal (TAD) canister design on transport models, and to ensure integration with other models that are closely associated with the RTA report and being developed or revised in other analysis/model reports in response to IVRT comments. The RTA report will be developed in accordance with the most current version of LP-SIII.10Q-BSC and will reflect current administrative procedures (LP-3.15Q-BSC, ''Managing Technical Product Inputs''; LP-SIII.2Q-BSC, ''Qualification of Unqualified Data''; etc.), and will develop related Document Input Reference System (DIRS) reports and data qualifications as applicable in accordance with prevailing procedures. The RTA report consists of three models: the engineered barrier system (EBS) flow model, the EBS transport model, and the EBS-unsaturated zone (UZ) interface model. The flux-splitting submodel in the EBS flow model will change, so the EBS flow model will be validated again. The EBS transport model and validation of the model will be substantially revised in Revision 03 of the RTA report, which is the main subject of this TWP. The EBS-UZ interface model may be changed in Revision 03 of the RTA report due to changes in the conceptualization of the UZ transport abstraction model (a particle tracker transport model based on the discrete fracture transfer function will be used instead of the dual-continuum transport model previously used). Validation of the EBS-UZ interface model

  1. Pencilbeam irradiation technique for whole brain radiotherapy: technical and biological challenges in a small animal model.

    Science.gov (United States)

    Schültke, Elisabeth; Trippel, Michael; Bräuer-Krisch, Elke; Renier, Michel; Bartzsch, Stefan; Requardt, Herwig; Döbrössy, Máté D; Nikkhah, Guido

    2013-01-01

    We have conducted the first in-vivo experiments in pencilbeam irradiation, a new synchrotron radiation technique based on the principle of microbeam irradiation, a concept of spatially fractionated high-dose irradiation. In an animal model of adult C57BL/6J mice we determined the technical and physiological limitations of the present setup of the technique. Fifty-eight animals were distributed over eleven experimental groups, ten groups receiving whole brain radiotherapy with arrays of 50 µm wide beams. We tested peak doses ranging between 172 Gy and 2,298 Gy at 3 mm depth. Animals in five groups received whole brain radiotherapy with a center-to-center (ctc) distance of 200 µm and a peak-to-valley dose ratio (PVDR) of ∼100; in the other five groups the ctc was 400 µm (PVDR ∼400). Motor and memory abilities were assessed during a six-month observation period following irradiation. The lower dose limit, determined by the technical equipment, was 172 Gy. The LD50 was about 1,164 Gy for a ctc of 200 µm and higher than 2,298 Gy for a ctc of 400 µm. Age-dependent loss in motor and memory performance was seen in all groups. Better overall performance (close to that of healthy controls) was seen in the groups irradiated with a ctc of 400 µm.

  2. Building new meanings in technical English from the perspective of the lexical constellation model

    Directory of Open Access Journals (Sweden)

    Camino Rea Rizzo

    2010-10-01

    The need to name and communicate to others new concepts in specific domains of human activity leads to the formation of new terms. However, many of the technical words in English are not new from the point of view of form. They rather derive from the common stock of general language: new lexical units are built from already existing forms and/or meanings. The original form is used for naming a new concept by adding a distinctive specialized lexical feature while keeping some semantic features of the original concept. In this paper, we aim at explaining and visualizing the nature of some of the processes that allow for the construction of new senses in technical words through a branching and expanding process, as explained in the lexical constellation model. The analysis is performed on three words widely used in telecommunication English: “bus”, “hub” and “chip”. The understanding of the process may be of great help for learners of ESP in general and technical English in particular.

  3. Technical note: The Lagrangian particle dispersion model FLEXPART version 6.2

    Directory of Open Access Journals (Sweden)

    A. Stohl

    2005-01-01

    The Lagrangian particle dispersion model FLEXPART was originally designed (about 8 years ago) for calculating the long-range and mesoscale dispersion of air pollutants from point sources, such as after an accident in a nuclear power plant. In the meantime FLEXPART has evolved into a comprehensive tool for atmospheric transport modeling and analysis. Its application fields were extended from air pollution studies to other topics where atmospheric transport plays a role (e.g., exchange between the stratosphere and troposphere, or the global water cycle). It has evolved into a true community model that is now being used by at least 25 groups from 14 different countries and is seeing both operational and research applications. A user manual has been kept up to date over the years and was distributed via an internet page along with the model's source code. In this note we provide a citeable technical description of FLEXPART's latest version (6.2).
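
The core of a Lagrangian particle dispersion model can be caricatured as mean advection plus a stochastic turbulent displacement applied to each particle at each time step. The sketch below is a toy 2-D random walk with invented parameters, not FLEXPART's actual turbulence parameterizations:

```python
import random

def advect_particles(n, steps, u, v, sigma, rng):
    """Minimal 2-D Lagrangian dispersion: each particle is advected by a
    mean wind (u, v) plus a Gaussian turbulent displacement per step."""
    particles = [(0.0, 0.0)] * n          # point-source release at the origin
    for _ in range(steps):
        particles = [(x + u + rng.gauss(0, sigma),
                      y + v + rng.gauss(0, sigma)) for x, y in particles]
    return particles

rng = random.Random(42)
cloud = advect_particles(n=1000, steps=50, u=1.0, v=0.0, sigma=0.5, rng=rng)
mean_x = sum(x for x, _ in cloud) / len(cloud)  # plume centre drifts downwind
```

Concentration fields are then obtained by counting particles per grid cell, which is also how a full model produces its gridded output.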

  4. Using Model-Based System Engineering to Provide Artifacts for NASA Project Life-Cycle and Technical Reviews Presentation

    Science.gov (United States)

    Parrott, Edith L.; Weiland, Karen J.

    2017-01-01

    This is the presentation for the AIAA Space conference in September 2017. It highlights key information from the paper 'Using Model-Based Systems Engineering to Provide Artifacts for NASA Project Life-Cycle and Technical Reviews'.

  5. Series solution for continuous population models for single and interacting species by the homotopy analysis method

    Directory of Open Access Journals (Sweden)

    Magdy A. El-Tawil

    2012-07-01

    The homotopy analysis method (HAM) is used to find approximate analytical solutions of continuous population models for single and interacting species. The homotopy analysis method contains the auxiliary parameter $\hbar$, which provides a simple way to adjust and control the convergence region of the series solution. The solutions are compared with the numerical results obtained using NDSolve, an ordinary differential equation solver found in the Mathematica package, and good agreement is found. The solutions are also compared with the available analytic results obtained by other methods, and a more accurate and rapidly convergent series solution is found. The convergence region is also computed, which shows the validity of the HAM solution. This method is reliable and manageable.
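
HAM itself is more involved (the auxiliary parameter $\hbar$ deforms the series to control convergence), but the flavor of a series solution for a single-species model can be shown with a plain power series for the logistic equation dN/dt = N(1 - N), whose Taylor coefficients follow from a simple recursion; this sketch is an illustration of series solutions in general, not of HAM:

```python
import math

def logistic_series_coeffs(n0, order):
    """Taylor coefficients of dN/dt = N(1 - N), N(0) = n0, from the
    recursion (k+1) a_{k+1} = a_k - sum_j a_j a_{k-j} (Cauchy product)."""
    a = [n0]
    for k in range(order):
        conv = sum(a[j] * a[k - j] for j in range(k + 1))
        a.append((a[k] - conv) / (k + 1))
    return a

def eval_series(coeffs, t):
    return sum(c * t ** k for k, c in enumerate(coeffs))

n0, t = 0.1, 0.5
exact = n0 * math.exp(t) / (1 - n0 + n0 * math.exp(t))   # closed-form solution
approx = eval_series(logistic_series_coeffs(n0, 12), t)  # 12th-order series
print(approx, exact)
```

Where a plain series converges only in a limited region, HAM's $\hbar$ provides the extra freedom to enlarge that region.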

  6. Sample correlations of infinite variance time series models: an empirical and theoretical study

    Directory of Open Access Journals (Sweden)

    Jason Cohen

    1998-01-01

    When the elements of a stationary ergodic time series have finite variance, the sample correlation function converges (with probability 1) to the theoretical correlation function. What happens in the case where the variance is infinite? In certain cases, the sample correlation function converges in probability to a constant, but not always. If within a class of heavy-tailed time series the sample correlation functions do not converge to a constant, then more care must be taken in making inferences and in model selection on the basis of sample autocorrelations. We experimented with simulating various heavy-tailed stationary sequences in an attempt to understand what causes the sample correlation function to converge or not to converge to a constant. In two new cases, namely the sum of two independent moving averages and a random permutation scheme, we are able to provide theoretical explanations for a random limit of the sample autocorrelation function as the sample grows.
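
The infinite-variance case is easy to simulate. In the sketch below the innovations are Pareto with tail index 1.5 (infinite variance); for an MA(1) built from them, the sample lag-1 autocorrelation is still expected to settle near θ/(1+θ²) = 0.4, one of the cases where a constant limit exists (the parameters and series are illustrative):

```python
import random

def sample_acf(x, lag):
    """Sample autocorrelation at the given lag."""
    n = len(x)
    m = sum(x) / n
    var = sum((v - m) ** 2 for v in x)
    return sum((x[i] - m) * (x[i + lag] - m) for i in range(n - lag)) / var

rng = random.Random(3)
# iid noise with infinite variance: Pareto tail index alpha = 1.5.
heavy = [rng.paretovariate(1.5) for _ in range(5000)]
# MA(1) driven by it: x_t = z_t + 0.5 z_{t-1}, so theta/(1+theta^2) = 0.4.
ma1 = [heavy[i] + 0.5 * heavy[i - 1] for i in range(1, len(heavy))]
print(sample_acf(ma1, 1))
```

Intuitively, when one innovation dominates the sample, both numerator and denominator are governed by it, which is why a deterministic limit can survive infinite variance here but fails for the paper's two new cases.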

  7. Volterra-series-based nonlinear system modeling and its engineering applications: A state-of-the-art review

    Science.gov (United States)

    Cheng, C. M.; Peng, Z. K.; Zhang, W. M.; Meng, G.

    2017-03-01

    Nonlinear problems have drawn great interest and extensive attention from engineers, physicists, mathematicians and many other scientists because most real systems are inherently nonlinear in nature. To model and analyze nonlinear systems, many mathematical theories and methods have been developed, including the Volterra series. In this paper, the basic definition of the Volterra series is recapitulated, together with some frequency domain concepts derived from it, including the general frequency response function (GFRF), the nonlinear output frequency response function (NOFRF), the output frequency response function (OFRF) and the associated frequency response function (AFRF). The relationships between the Volterra series and other nonlinear system models and nonlinear problem-solving methods are discussed, including the Taylor series, Wiener series, NARMAX model, Hammerstein model, Wiener model, Wiener-Hammerstein model, harmonic balance method, perturbation method and Adomian decomposition. The challenging problems in the study of series convergence and kernel identification, and their state of the art, are comprehensively introduced. In addition, a detailed review is given of the applications of the Volterra series in mechanical engineering, aeroelasticity, control engineering, and electronic and electrical engineering.
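
A truncated second-order Volterra model is straightforward to evaluate directly from its kernels; the kernel values below are arbitrary illustrative numbers, not identified from any system:

```python
def volterra_2nd_order(x, h1, h2):
    """Output of a truncated (second-order) Volterra model:
       y(n) = sum_i h1[i] x(n-i) + sum_{i,j} h2[i][j] x(n-i) x(n-j)."""
    m = len(h1)
    y = []
    for n in range(len(x)):
        window = [x[n - i] if n - i >= 0 else 0.0 for i in range(m)]
        linear = sum(h1[i] * window[i] for i in range(m))
        quad = sum(h2[i][j] * window[i] * window[j]
                   for i in range(m) for j in range(m))
        y.append(linear + quad)
    return y

h1 = [1.0, 0.5]                    # first-order (linear) kernel
h2 = [[0.1, 0.0], [0.0, 0.0]]      # second-order kernel: adds 0.1 * x(n)^2
print(volterra_2nd_order([1.0, 2.0, 0.0], h1, h2))
```

The kernel identification problem reviewed in the paper is the inverse task: recovering h1 and h2 from measured input-output data.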

  8. Evaluation model of commercial geological exploration and mining development project and analysis of some technical problems in commercial negotiation

    International Nuclear Information System (INIS)

    Yao Zhenkai

    2012-01-01

    A composite evaluation model of commercial geological exploration and mining development projects was discussed; this new model consists of a polity-economy-technique (PET) synthetic evaluation sub-model and a geology-mining-metallurgy (GMM) technique evaluation sub-model. In addition, some key technical problems in commercial negotiation, such as information screening, quoted price and analysis of deadlines, were briefly analyzed. (author)

  9. Teachers' modeling advantage and their modeling effects on college students' learning styles and occupational stereotypes: a case of collaborative teaching in technical courses.

    Science.gov (United States)

    Chiou, Wen-Bin; Yang, Chao-Chin

    2006-01-01

    In this study, modeling advantage that depicts the likelihood of a teacher model being imitated by students over other competing models in a particular class was developed to differentiate the rival modeling of two kinds of teachers (the technical teachers vs. the lecturing teachers) between college students' learning styles and occupational stereotypes in the collaborative teaching of technical courses. Results of a one-semester longitudinal study indicated that the students perceived a greater modeling advantage of the technical teachers than that of the lecturing teachers. Both the students' learning styles and occupational stereotypes were in accordance with those teachers as their role models. In general, the impact of the teachers' learning styles and occupational stereotypes on students appeared to be mediated by the teachers' modeling advantage. Administrators and curriculum designers should pay attention to the fact that the technical teachers appeared to exhibit greater modeling effects than the lecturing teachers in collaborative teaching.

  10. Analysis of hohlraum energetics of the SG series and the NIF experiments with energy balance model

    Directory of Open Access Journals (Sweden)

    Guoli Ren

    2017-01-01

    The basic energy balance model is applied to analyze the hohlraum energetics data from the Shenguang (SG) series laser facilities and the National Ignition Facility (NIF) experiments published in the past few years. The analysis shows that the overall hohlraum energetics data are in agreement with the energy balance model within 20% deviation. The 20% deviation might be caused by the diversity in hohlraum parameters, such as material, laser pulse, gas filling density, etc. In addition, the NIF's ignition target designs and our ignition target designs given by simulations are also in accordance with the energy balance model. This work confirms the value of the energy balance model for ignition target design and experimental data assessment, and demonstrates that the NIF energy is enough to achieve ignition if a 1D spherical radiation drive could be created, provided that both the laser-plasma instabilities and hydrodynamic instabilities can be suppressed.

  11. The evaluator as technical assistant: A model for systemic reform support

    Science.gov (United States)

    Century, Jeanne Rose

    This study explored evaluation of systemic reform. Specifically, it focused on the evaluation of a systemic effort to improve K-8 science, mathematics and technology education. The evaluation was of particular interest because it used both technical assistance and evaluation strategies. Through studying the combination of these roles, this investigation set out to increase understanding of potentially new evaluator roles, distinguish important characteristics of the evaluator/project participant relationship, and identify how these roles and characteristics contribute to effective evaluation of systemic science education reform. This qualitative study used interview, document analysis, and participant observation as methods of data collection. Interviews were conducted with project leaders, project participants, and evaluators and focused on the evaluation strategies and process, the use of the evaluation, and technical assistance. Documents analyzed included transcripts of evaluation team meetings and reports, memoranda and other print materials generated by the project leaders and the evaluators. Data analysis consisted of analytic and interpretive procedures consistent with the qualitative data collected and entailed a combined process of coding transcripts of interviews and meetings, field notes, and other documents; analyzing and organizing findings; writing of reflective and analytic memos; and designing and diagramming conceptual relationships. The data analysis resulted in the development of the Multi-Function Model for Systemic Reform Support. This model organizes systemic reform support into three functions: evaluation, technical assistance, and a third, named here as "systemic perspective." These functions work together to support the project's educational goals as well as a larger goal--building capacity in project participants. This model can now serve as an informed starting point or "blueprint" for strategically supporting systemic reform.

  12. SERIES ARTICLES

    Indian Academy of Sciences (India)

    Evolution of the Atmosphere and Oceans: Evidence from Geological Records. Evolution of the Early Atmosphere, by P V Sukumaran. Electrostatics in Chemistry: Electrostatic Models for Weak Molecular ...

  13. Visibility Modeling and Forecasting for Abu Dhabi using Time Series Analysis Method

    Science.gov (United States)

    Eibedingil, I. G.; Abula, B.; Afshari, A.; Temimi, M.

    2015-12-01

    Land-atmosphere interactions, their strength, directionality and evolution, are one of the main sources of uncertainty in contemporary climate modeling. Aerosols and dust play a particularly crucial role in sustaining and modulating land-atmosphere interaction. Aerosols are tiny particles suspended in the air, ranging from a few nanometers to a few hundred micrometers in diameter. Furthermore, the amount of dust and fog in the atmosphere is an important determinant of visibility, which is another dimension of land-atmosphere interactions. Visibility affects all forms of traffic: aviation, land and sailing. Being able to predict changes in visibility in advance enables the relevant authorities to take necessary actions before disaster strikes. Time series analysis (TSA) is an emerging technique for modeling and forecasting the behavior of land-atmosphere interactions, including visibility. This research assesses the dynamics and evolution of visibility around Abu Dhabi International Airport (+24.4320 latitude, +54.6510 longitude, 27 m elevation) using mean daily visibility and mean daily wind speed. TSA was first used to model and forecast the visibility, and then a transfer function model was applied, considering the wind speed as an exogenous variable. Using the Akaike Information Criterion (AIC) and the Mean Absolute Percentage Error (MAPE) as statistical criteria, two forecasting models, namely a univariate time series model and a transfer function model, were developed to forecast the visibility around Abu Dhabi International Airport for three weeks. The transfer function model improved the MAPE of the forecast significantly.
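
MAPE, one of the two selection criteria named above, is simple to compute; the visibility values and forecasts below are invented for illustration, not Abu Dhabi data:

```python
def mape(actual, forecast):
    """Mean Absolute Percentage Error, in percent."""
    return 100.0 * sum(abs((a - f) / a) for a, f in zip(actual, forecast)) / len(actual)

visibility = [9.0, 8.0, 10.0, 7.5]      # hypothetical mean daily visibility, km
univariate = [8.5, 8.4, 9.0, 8.0]       # univariate time series forecast
transfer_fn = [8.9, 8.1, 9.8, 7.4]      # transfer function forecast (wind as exogenous input)
print(mape(visibility, univariate), mape(visibility, transfer_fn))
```

The comparison in the abstract amounts to the second number being markedly smaller than the first; note that MAPE is undefined when an actual value is zero.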

  14. Application of semi parametric modelling to times series forecasting: case of the electricity consumption

    International Nuclear Information System (INIS)

    Lefieux, V.

    2007-10-01

    Reseau de Transport d'Electricite (RTE), in charge of operating the French electricity transmission grid, needs an accurate forecast of power consumption in order to operate it correctly. The forecasts used every day result from a model combining a nonlinear parametric regression and a SARIMA model. In order to obtain an adaptive forecasting model, nonparametric forecasting methods have already been tested without real success. In particular, it is known that a nonparametric predictor behaves badly with a great number of explanatory variables, what is commonly called the curse of dimensionality. Recently, semiparametric methods, which improve on the pure nonparametric approach, have been proposed to estimate a regression function. Based on the concept of dimension reduction, one of those methods (called MAVE, Moving Average conditional Variance Estimate) can be applied to time series. We study empirically its effectiveness in predicting the future values of an autoregressive time series. We then adapt this method, from a practical point of view, to forecast power consumption. We propose a partially linear semiparametric model, based on the MAVE method, which allows us to take into account simultaneously the autoregressive aspect of the problem and the exogenous variables. The proposed estimation procedure is practically efficient. (author)
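
A minimal example of a nonparametric time series predictor of the kind discussed above is Nadaraya-Watson kernel regression of x_{t+1} on x_t (a single explanatory variable, so the curse of dimensionality does not bite); this illustrates the general approach, not RTE's model or MAVE itself:

```python
import math

def nw_forecast(series, bandwidth):
    """One-step-ahead Nadaraya-Watson prediction: regress x_{t+1} on x_t
    with a Gaussian kernel over all past (x_t, x_{t+1}) pairs."""
    pairs = list(zip(series[:-1], series[1:]))
    x0 = series[-1]
    weights = [math.exp(-0.5 * ((x - x0) / bandwidth) ** 2) for x, _ in pairs]
    return sum(w * y for w, (_, y) in zip(weights, pairs)) / sum(weights)

# Deterministic AR(1)-like example: x_{t+1} = 0.8 x_t.
series = [1.0]
for _ in range(60):
    series.append(0.8 * series[-1])
print(nw_forecast(series, bandwidth=0.1))
```

With several exogenous variables the kernel would act in many dimensions and data would become locally sparse, which is the dimensionality problem that dimension-reduction methods such as MAVE address.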

  15. A Technical Review on Biomass Processing: Densification, Preprocessing, Modeling and Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Jaya Shankar Tumuluru; Christopher T. Wright

    2010-06-01

    It is now a well-acclaimed fact that burning fossil fuels and deforestation are major contributors to climate change. Biomass from plants can serve as an alternative renewable and carbon-neutral raw material for the production of bioenergy. Low densities of 40–60 kg/m3 for lignocellulosic biomass and 200–400 kg/m3 for woody biomass limit their application for energy purposes. Prior to use in energy applications, these materials need to be densified. The densified biomass can have bulk densities over 10 times that of the raw material, helping to significantly reduce the technical limitations associated with storage, loading and transportation. Pelleting, briquetting and extrusion processing are commonly used methods for densification. The aim of the present research is to develop a comprehensive review of biomass processing that includes densification, preprocessing, modeling and optimization. The specific objectives include carrying out a technical review of (a) mechanisms of particle bonding during densification; (b) methods of densification, including extrusion, briquetting, pelleting and agglomeration; (c) effects of process and feedstock variables and biomass biochemical composition on densification; (d) effects of preprocessing such as grinding, preheating, steam explosion and torrefaction on biomass quality and binding characteristics; (e) models for understanding the compression characteristics; and (f) procedures for response surface modeling and optimization.
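
As an example of objective (e), one commonly used compression model is the Kawakita equation C = abP/(1 + bP), which can be fitted through its linearized form P/C = P/a + 1/(ab); the pressure-strain data below are synthetic, generated from known parameters rather than measured:

```python
def fit_kawakita(pressures, strains):
    """Fit the Kawakita compression model C = a*b*P / (1 + b*P) via its
    linearized form P/C = P/a + 1/(a*b), using ordinary least squares."""
    xs = pressures
    ys = [p / c for p, c in zip(pressures, strains)]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    intercept = my - slope * mx
    a = 1.0 / slope                 # a: limiting volume reduction
    b = 1.0 / (a * intercept)       # 1/b: pressure at half the limiting reduction
    return a, b

# Synthetic data from a = 0.4, b = 0.05; the fit should recover them exactly.
true_a, true_b = 0.4, 0.05
P = [10.0, 20.0, 40.0, 80.0, 160.0]
C = [true_a * true_b * p / (1 + true_b * p) for p in P]
a, b = fit_kawakita(P, C)
```

On real pellet data the linearized fit gives only an approximation, and the recovered parameters are then compared across feedstocks and process settings.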

  16. Fractality of profit landscapes and validation of time series models for stock prices

    Science.gov (United States)

    Yi, Il Gu; Oh, Gabjin; Kim, Beom Jun

    2013-08-01

    We apply a simple trading strategy for various time series of real and artificial stock prices to understand the origin of fractality observed in the resulting profit landscapes. The strategy contains only two parameters p and q, and the sell (buy) decision is made when the log return is larger (smaller) than p (-q). We discretize the unit square (p,q) ∈ [0,1] × [0,1] into the N × N square grid and the profit Π(p,q) is calculated at the center of each cell. We confirm the previous finding that local maxima in profit landscapes are scattered in a fractal-like fashion: the number M of local maxima follows the power-law form M ~ N^a, but the scaling exponent a is found to differ for different time series. From comparisons of real and artificial stock prices, we find that the fat-tailed return distribution is closely related to the exponent a ≈ 1.6 observed for real stock markets. We suggest that the fractality of profit landscapes characterized by a ≈ 1.6 can be a useful measure to validate time series models for stock prices.
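
The construction of the profit landscape and the count of its local maxima can be sketched as follows. The trading rule here is a simple long-only reading of the (p, q) strategy, the returns are artificial Gaussian log returns, and the (p, q) ranges are scaled to the return magnitude rather than [0, 1], so this is an illustration rather than a reproduction of the paper's setup:

```python
import random

def profit(log_returns, p, q):
    """Profit of the two-parameter strategy: buy when the log return drops
    below -q, sell when it rises above p (long-only, in log-price units)."""
    holding, entry, log_price, total = False, 0.0, 0.0, 0.0
    for r in log_returns:
        log_price += r
        if not holding and r < -q:
            holding, entry = True, log_price
        elif holding and r > p:
            holding, total = False, total + (log_price - entry)
    return total

def count_local_maxima(grid):
    """Strict local maxima over the 4-neighbourhood of an N x N profit grid."""
    n, count = len(grid), 0
    for i in range(n):
        for j in range(n):
            neighbours = [grid[i + di][j + dj]
                          for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))
                          if 0 <= i + di < n and 0 <= j + dj < n]
            if all(grid[i][j] > v for v in neighbours):
                count += 1
    return count

rng = random.Random(0)
returns = [rng.gauss(0, 0.02) for _ in range(2000)]   # artificial log returns
N = 20
grid = [[profit(returns, (i + 0.5) / N * 0.05, (j + 0.5) / N * 0.05)
         for j in range(N)] for i in range(N)]
maxima = count_local_maxima(grid)
```

Repeating the count for several N and fitting log(M) against log(N) yields the scaling exponent a discussed in the abstract.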

  17. FY 2016 Status Report on the Modeling of the M8 Calibration Series using MAMMOTH

    Energy Technology Data Exchange (ETDEWEB)

    Baker, Benjamin Allen [Idaho National Lab. (INL), Idaho Falls, ID (United States); Ortensi, Javier [Idaho National Lab. (INL), Idaho Falls, ID (United States); DeHart, Mark David [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-09-01

    This report provides a summary of the progress made towards validating the multi-physics reactor analysis application MAMMOTH using data from measurements performed at the Transient Reactor Test facility, TREAT. The work completed consists of a series of comparisons of TREAT element types (standard and control rod assemblies) in small geometries, as well as slotted mini-cores, to reference Monte Carlo simulations to ascertain the accuracy of cross section preparation techniques. After the successful completion of these smaller problems, a full core model of the half slotted core used in the M8 Calibration series was assembled. Full core MAMMOTH simulations were compared to Serpent reference calculations to assess the cross section preparation process for this larger configuration. As part of the validation process, the M8 Calibration series included a steady state wire irradiation experiment and coupling factors for the experiment region. The shape of the power distribution obtained from the MAMMOTH simulation shows excellent agreement with the experiment. Larger differences were encountered in the calculation of the coupling factors, but there is also great uncertainty about how the experimental values were obtained. Future work will focus on resolving some of these differences.

  18. Technical know-how for the investigation and modelling of topographic evolution for site characterisation - 59171

    International Nuclear Information System (INIS)

    Doke, Ryosuke; Yasue, Ken-ichi; Niizato, Tadafumi; Nakayasu, Akio

    2012-01-01

    Geological hazard assessments are being used to make important decisions relevant to nuclear facilities, such as a repository for deep geological disposal of high-level radioactive waste. With respect to such repositories, topographic evolution is a key issue in describing the long-term evolution of groundwater flow characteristics over time spans of tens to hundreds of thousands of years. The construction of topographic evolution models is complex, involving tacit knowledge and working processes. It is therefore important to externalise, that is, to present explicitly and unambiguously, with thorough documentation, the tacit knowledge and decision-making processes used by experts in building the model, and to provide key knowledge to support the planning and implementation of investigations. In this study, documentation of the technical know-how used for the construction of a topographic evolution model is demonstrated. The process followed in the construction of the model is illustrated using task-flow logic diagrams; the process involves four main tasks with several sub-tasks. The task-flow followed in an investigation to estimate uplift rates, linked to the task-flow for the modelling of topographic evolution, is also illustrated. In addition, the decision-making processes in the investigation are expressed in logical IF-THEN format for each task. Based on the documented technical know-how, an IT-based expert system was constructed. In future work, it will be necessary to analyse the knowledge further, including the management of uncertainties in the modelling and investigations, and to integrate fundamental ideas for managing uncertainties into the expert system. (authors)
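
The IF-THEN representation of expert decision-making maps naturally onto a small forward-chaining rule engine; the rules below are invented examples for illustration, not the documented know-how itself:

```python
def run_rules(facts, rules):
    """Forward-chain IF-THEN rules: fire any rule whose conditions are all
    in the fact set, add its conclusion, and repeat until nothing new fires."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conclusion not in facts and all(c in facts for c in conditions):
                facts.add(conclusion)
                changed = True
    return facts

# Hypothetical rules for one task in a topographic-evolution investigation.
rules = [
    (("marine terrace present", "terrace age known"), "uplift rate can be estimated"),
    (("uplift rate can be estimated",), "construct topographic evolution model"),
]
facts = run_rules({"marine terrace present", "terrace age known"}, rules)
```

An expert system of this kind makes each inference step traceable back to the rule and facts that produced it, which is the documentation goal described above.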

  19. A parameter network and model pyramid for managing technical information flow

    International Nuclear Information System (INIS)

    Sinnock, S.; Hartman, H.A.

    1994-01-01

    Prototypes of information management tools have been developed that can help communicate the technical basis for nuclear waste disposal to a broad audience of program scientists and engineers, project managers, and informed observers from stakeholder organizations. These tools include system engineering concepts, parameter networks expressed as influence diagrams, associated model hierarchies, and a relational database. These tools are used to express relationships among data-collection parameters, model input parameters, model output parameters, systems requirements, physical elements of a system description, and functional analysis of the contribution of physical elements and their associated parameters in satisfying the system requirements. By organizing parameters, models, physical elements, functions, and requirements in a visually reviewable network and a relational database the severe communication challenges facing participants in the nuclear waste dialog can be addressed. The network identifies the influences that data collected in the field have on measures of repository suitability, providing a visual, traceable map that clarifies the role of data and models in supporting conclusions about repository suitability. The map allows conclusions to be traced directly to the underlying parameters and models. Uncertainty in these underlying elements can be exposed to open review in the context of the effects uncertainty has on judgements about suitability. A parameter network provides a stage upon which an informed social dialog about the technical merits of a nuclear waste repository can be conducted. The basis for such dialog must be that stage, if decisions about repository suitability are to be based on a repository's ability to meet requirements embodied in laws and regulations governing disposal of nuclear wastes

  20. EO Model for Tacit Knowledge Externalization in Socio-Technical Enterprises

    Directory of Open Access Journals (Sweden)

    Shreyas Suresh Rao

    2017-03-01

    Full Text Available Aim/Purpose: A vital business activity within socio-technical enterprises is tacit knowledge externalization, which elicits and explicates the tacit knowledge of enterprise employees as external knowledge. The aim of this paper is to integrate diverse aspects of externalization through the Enterprise Ontology model. Background: Across two decades, researchers have explored various aspects of tacit knowledge externalization. However, the existing works reveal that there is no uniform representation of the externalization process, which has resulted in divergent and contradictory interpretations across the literature. Methodology: The Enterprise Ontology model is constructed step-wise through conceptual and measurement views. While the conceptual view encompasses three patterns that model the externalization process, the measurement view employs the certainty-factor model to empirically measure the outcome of the externalization process. Contribution: The paper contributes to the knowledge management literature in two ways. The first contribution is the Enterprise Ontology model that integrates diverse aspects of externalization. The second contribution is a Web application that validates the model through a case study in banking. Findings: The findings show that the Enterprise Ontology model and the patterns are pragmatic in externalizing the tacit knowledge of experts in a problem-solving scenario within a banking enterprise. Recommendations for Practitioners: Consider the diverse aspects (what, where, when, why, and how) during the tacit knowledge externalization process. Future Research: To extend the Enterprise Ontology model to include externalization from partially automated enterprise systems.

  1. Time series modeling of soil moisture dynamics on a steep mountainous hillside

    Science.gov (United States)

    Kim, Sanghyun

    2016-05-01

    The response of soil moisture to rainfall events along hillslope transects is an important hydrologic process and a critical component of interactions between soil, vegetation, and the atmosphere. In this context, the research described in this article addresses the spatial distribution of soil moisture as a function of topography. In order to characterize the temporal variation in soil moisture on a steep mountainous hillside, a transfer function, including a model for noise, was introduced. Soil moisture time series with similar rainfall amounts but different wetness gradients were measured in the spring and fall. Water flux near the soil moisture sensors was modeled, and mathematical expressions were developed to provide a basis for input-output modeling of rainfall and soil moisture using hydrological processes such as infiltration, exfiltration and downslope lateral flow. The characteristics of the soil moisture response can be expressed in terms of model structure. A seasonal comparison of models reveals differences in soil moisture response to rainfall, possibly associated with eco-hydrological processes and evapotranspiration. Modeling results along the hillslope indicate that the spatial structure of the soil moisture response patterns appears mainly in deeper layers. Similarities between topographic attributes and stochastic model structures are spatially organized. The impact of temporal and spatial discretization scales on parameter expression is addressed in the context of modeling results that link rainfall events and soil moisture.

  2. Dynamics modeling for sugar cane sucrose estimation using time series satellite imagery

    Science.gov (United States)

    Zhao, Yu; Justina, Diego Della; Kazama, Yoriko; Rocha, Jansle Vieira; Graziano, Paulo Sergio; Lamparelli, Rubens Augusto Camargo

    2016-10-01

    Sugarcane, one of the mainstay crops in Brazil, plays an essential role in ethanol production. Remote sensing technology is essential for monitoring sugarcane crop growth and predicting sucrose content, since accurate and timely crop growth information is significant, particularly for large-scale farming. We focused on estimating sugarcane sucrose content using time-series satellite imagery. Firstly, we calculated spectral features and vegetation indices chosen to correspond to the biological mechanism of sucrose accumulation. Secondly, we improved the statistical regression model by considering additional factors. In the evaluation, we obtained a precision of 90%, which is about 20% higher than the conventional method. The validation results showed that the prediction accuracy using our sugarcane growth modeling and improved mixed model is satisfactory.

  3. Non-stationary time series modeling on caterpillars pest of palm oil for early warning system

    Science.gov (United States)

    Setiyowati, Susi; Nugraha, Rida F.; Mukhaiyar, Utriweni

    2015-12-01

    Oil palm production plays an important role in the plantation and economic sectors in Indonesia. One of the important problems in the cultivation of oil palm plantations is pests, which damage the quality of the fruit. The caterpillar pest, which feeds on the palm tree's leaves, causes a decline in the quality of palm oil production. An early warning system is needed to minimize losses due to this pest. Here, we applied non-stationary time series modeling, especially the family of autoregressive models, to predict the number of pests based on historical data. We observed some unique features of these pest data, i.e., spike values that occur almost periodically. Through simulations and a case study, we found that the selection of the constant factor has a significant influence on the model, so that it can capture the spike values precisely.

  4. Technical Manual for the Geospatial Stream Flow Model (GeoSFM)

    Science.gov (United States)

    Asante, Kwabena O.; Artan, Guleid A.; Pervez, Md Shahriar; Bandaragoda, Christina; Verdin, James P.

    2008-01-01

    The monitoring of wide-area hydrologic events requires the use of geospatial and time series data available in near-real time. These data sets must be manipulated into information products that speak to the location and magnitude of the event. Scientists at the U.S. Geological Survey Earth Resources Observation and Science (USGS EROS) Center have implemented a hydrologic modeling system which consists of an operational data processing system and the Geospatial Stream Flow Model (GeoSFM). The data processing system generates daily forcing evapotranspiration and precipitation data from various remotely sensed and ground-based data sources. To allow for rapid implementation in data scarce environments, widely available terrain, soil, and land cover data sets are used for model setup and initial parameter estimation. GeoSFM performs geospatial preprocessing and postprocessing tasks as well as hydrologic modeling tasks within an ArcView GIS environment. The integration of GIS routines and time series processing routines is achieved seamlessly through the use of dynamically linked libraries (DLLs) embedded within Avenue scripts. GeoSFM is run operationally to identify and map wide-area streamflow anomalies. Daily model results including daily streamflow and soil water maps are disseminated through Internet map servers, flood hazard bulletins and other media.

  5. Study on the Technical Efficiency of Creative Human Capital in China by Three-Stage Data Envelopment Analysis Model

    Directory of Open Access Journals (Sweden)

    Jian Ma

    2014-01-01

    Full Text Available Previous research has demonstrated the positive effect of creative human capital and its development on the development of the economy. Yet the technical efficiency of creative human capital and its effects are still under-researched. The authors estimate the technical efficiency value in the Chinese context, adjusted for environmental variables and statistical noise, by establishing a three-stage data envelopment analysis model using data from 2003 to 2010. The results indicate that, in this period, the technical efficiency of creative human capital in China as a whole, and across different regions and provinces, is still low and could be improved. Moreover, technical inefficiency mostly derives from scale inefficiency and is rarely affected by pure technical efficiency. The research also examines the marked effects of environmental variables on technical efficiency and shows that different environmental variables differ in their effects. The expansion of the scale of education, development of a healthy environment, growth of GDP, development of skill training, and population migration could reduce the input of creative human capital and promote technical efficiency, while development of trade and institutional change, on the contrary, would block the input of creative human capital and hinder the promotion of technical efficiency.

  6. Comparison of ARIMA and Random Forest time series models for prediction of avian influenza H5N1 outbreaks.

    Science.gov (United States)

    Kane, Michael J; Price, Natalie; Scotch, Matthew; Rabinowitz, Peter

    2014-08-13

    Time series models can play an important role in disease prediction. Incidence data can be used to predict the future occurrence of disease events. Developments in modeling approaches provide an opportunity to compare different time series models for predictive power. We applied ARIMA and Random Forest time series models to incidence data of outbreaks of highly pathogenic avian influenza (H5N1) in Egypt, available through the online EMPRES-I system. We found that the Random Forest model outperformed the ARIMA model in predictive ability. Furthermore, we found that the Random Forest model is effective for predicting outbreaks of H5N1 in Egypt. Random Forest time series modeling provides enhanced predictive ability over existing time series models for the prediction of infectious disease outbreaks. This result, along with those showing the concordance between bird and human outbreaks (Rabinowitz et al. 2012), provides a new approach to predicting these dangerous outbreaks in bird populations based on existing, freely available data. Our analysis uncovers the time-series structure of outbreak severity for highly pathogenic avian influenza (H5N1) in Egypt.
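    The ARIMA-versus-Random-Forest comparison above rests on framing incidence counts as one-step-ahead prediction from lagged values. The following is a minimal sketch of that framing (illustrative synthetic data and AR order, not the paper's EMPRES-I pipeline): an autoregressive baseline fit by ordinary least squares, whose lag matrix could equally be handed to a tree-based regressor.

```python
import numpy as np

def make_lagged(series, p):
    """Design matrix of p lagged values for one-step-ahead prediction."""
    series = np.asarray(series, float)
    X = np.column_stack([series[i:len(series) - p + i] for i in range(p)])
    return X, series[p:]

def fit_ar(series, p):
    """Least-squares AR(p) fit with an intercept; returns [c, a_1, ..., a_p]."""
    X, y = make_lagged(series, p)
    A = np.column_stack([np.ones(len(y)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

# Noiseless AR(1) series y[t] = 0.8 * y[t-1]; the fit recovers the coefficient.
series = 0.8 ** np.arange(30)
coef = fit_ar(series, p=1)
```

The same (X, y) pairs are what a Random Forest regressor (e.g. scikit-learn's RandomForestRegressor) would consume on the tree-based side of such a comparison.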

  7. A closed-loop power controller model of series-resonant-inverter-fitted induction heating system

    Directory of Open Access Journals (Sweden)

    Pal Palash

    2016-12-01

    Full Text Available This paper presents a mathematical model of a power controller for a high-frequency induction heating system based on a modified half-bridge series resonant inverter. The real output power delivered to the heating coil is measured precisely, and this power is processed as a feedback signal in a closed-loop topology with a proportional-integral-derivative controller. This technique enables both closed-loop power control and assessment of the stability of the high-frequency inverter. Unlike existing power controller topologies, the proposed topology enables direct control of the real power of the high-frequency inverter.
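    The feedback loop described here, measured real power compared against a setpoint and driven through a proportional-integral-derivative controller, can be sketched numerically. The plant below is a toy first-order lag standing in for the inverter's power response; all gains and time constants are illustrative assumptions, not values from the paper:

```python
def pid_step(error, state, kp, ki, kd, dt):
    """One update of a discrete PID controller; state carries (integral, previous error)."""
    integral, prev = state
    integral += error * dt
    derivative = (error - prev) / dt
    control = kp * error + ki * integral + kd * derivative
    return control, (integral, error)

def simulate_power_loop(setpoint=1000.0, steps=2000, dt=0.01):
    """Close the loop around a toy first-order plant standing in for the real output power."""
    power, state = 0.0, (0.0, 0.0)
    for _ in range(steps):
        u, state = pid_step(setpoint - power, state, kp=0.5, ki=5.0, kd=0.0, dt=dt)
        power += dt * (u - power)  # first-order power response
    return power
```

With these gains the loop settles on the setpoint; in the paper the feedback signal is the measured real power across the heating coil rather than a simulated plant.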

  8. Final Technical Report: "Representing Endogenous Technological Change in Climate Policy Models: General Equilibrium Approaches"

    Energy Technology Data Exchange (ETDEWEB)

    Ian Sue Wing

    2006-04-18

    The research supported by this award pursued three lines of inquiry: (1) The construction of dynamic general equilibrium models to simulate the accumulation and substitution of knowledge, which has resulted in the preparation and submission of several papers: (a) A submitted pedagogic paper which clarifies the structure and operation of computable general equilibrium (CGE) models (C.2), and a review article in press which develops a taxonomy for understanding the representation of technical change in economic and engineering models for climate policy analysis (B.3). (b) A paper which models knowledge directly as a homogeneous factor, and demonstrates that inter-sectoral reallocation of knowledge is the key margin of adjustment which enables induced technical change to lower the costs of climate policy (C.1). (c) An empirical paper which estimates the contribution of embodied knowledge to aggregate energy intensity in the U.S. (C.3), followed by a companion article which embeds these results within a CGE model to understand the degree to which autonomous energy efficiency improvement (AEEI) is attributable to technical change as opposed to sub-sectoral shifts in industrial composition (C.4) (d) Finally, ongoing theoretical work to characterize the precursors and implications of the response of innovation to emission limits (E.2). (2) Data development and simulation modeling to understand how the characteristics of discrete energy supply technologies determine their succession in response to emission limits when they are embedded within a general equilibrium framework. This work has produced two peer-reviewed articles which are currently in press (B.1 and B.2). (3) Empirical investigation of trade as an avenue for the transmission of technological change to developing countries, and its implications for leakage, which has resulted in an econometric study which is being revised for submission to a journal (E.1). As work commenced on this topic, the U.S. withdrawal

  9. Multivariate time series modeling of short-term system scale irrigation demand

    Science.gov (United States)

    Perera, Kushan C.; Western, Andrew W.; George, Biju; Nawarathna, Bandara

    2015-12-01

    Travel time limits the ability of irrigation system operators to react to short-term irrigation demand fluctuations that result from variations in weather, including very hot periods and rainfall events, as well as the various other pressures and opportunities that farmers face. Short-term system-wide irrigation demand forecasts can assist in system operation. Here we developed a multivariate time series (ARMAX) model to forecast irrigation demand with respect to aggregated service point flows (IDCGi, ASP) and off-take regulator flows (IDCGi, OTR) across 5 command areas, covering the areas served by four irrigation channels and the study area. These command-area-specific ARMAX models forecast daily IDCGi, ASP and IDCGi, OTR 1-5 days ahead using the real-time flow data recorded at the service points and the uppermost regulators and observed meteorological data collected from automatic weather stations. The model efficiency and predictive performance were quantified using the root mean squared error (RMSE), Nash-Sutcliffe model efficiency coefficient (NSE), anomaly correlation coefficient (ACC) and mean square skill score (MSSS). During the evaluation period, NSE for IDCGi, ASP and IDCGi, OTR across the 5 command areas ranged from 0.78 to 0.98. These models were capable of generating skillful forecasts (MSSS ⩾ 0.5 and ACC ⩾ 0.6) of IDCGi, ASP and IDCGi, OTR for all 5 lead days, and the forecasts were better than using the long-term monthly mean irrigation demand. Overall, the predictive performance of these ARMAX time series models was higher than in almost all previous studies we are aware of. Further, the IDCGi, ASP and IDCGi, OTR forecasts have improved the operators' ability to react to near-future irrigation demand fluctuations, as the developed ARMAX time series models are self-adaptive and reflect short-term changes in irrigation demand with respect to the various pressures and opportunities that farmers face, such as
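    The skill metrics quoted above (NSE, MSSS) have compact closed forms. A sketch using their standard definitions (not the authors' code), where the MSSS reference forecast would be the long-term monthly mean demand:

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - SSE(sim) / SSE of the mean-of-obs benchmark."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def msss(obs, sim, ref):
    """Mean square skill score of sim relative to a reference forecast ref."""
    obs, sim, ref = (np.asarray(a, float) for a in (obs, sim, ref))
    return 1.0 - np.mean((obs - sim) ** 2) / np.mean((obs - ref) ** 2)
```

A perfect forecast scores NSE = MSSS = 1; a forecast no better than the benchmark scores 0.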

  10. Modeling commodity salam contract between two parties for discrete and continuous time series

    Science.gov (United States)

    Hisham, Azie Farhani Badrol; Jaffar, Maheran Mohd

    2017-08-01

    In order for Islamic finance to remain competitive with conventional finance, new syariah-compliant products need to be developed, such as Islamic derivatives that can be used to manage risk. However, under syariah principles and regulations, financial instruments must not conflict with five prohibited elements: riba (interest paid), rishwah (corruption), gharar (uncertainty or unnecessary risk), maysir (speculation or gambling) and jahl (taking advantage of the counterparty's ignorance). This study proposes that a traditional Islamic contract, namely salam, can be built into an Islamic derivative product. Although many studies have discussed and proposed the implementation of the salam contract as an Islamic product, most are qualitative and focus on legal issues. Since quantitative studies of the salam contract are lacking, this study introduces mathematical models that can determine the appropriate price for a commodity salam contract between two parties. In modeling the commodity salam contract, this study modifies the existing conventional derivative model with adjustments to comply with syariah rules and regulations. The cost of carry model is chosen as the foundation for developing the commodity salam model between two parties for discrete and continuous time series. However, the conventional time value of money derives from the concept of interest, which is prohibited in Islam. Therefore, this study adopts the Islamic notion of the time value of money, known as positive time preference, in modeling the commodity salam contract between two parties for discrete and continuous time series.
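    The cost-of-carry foundation mentioned above differs between the discrete and continuous settings only in the compounding convention. The sketch below shows just those conventional baseline formulas; r here is a generic carrying-cost rate, and the syariah-compliant positive-time-preference adjustment developed in the paper is not reproduced:

```python
import math

def forward_discrete(spot, r, t_years):
    """Cost-of-carry forward price under discrete annual compounding: S * (1 + r)^T."""
    return spot * (1.0 + r) ** t_years

def forward_continuous(spot, r, t_years):
    """Cost-of-carry forward price under continuous compounding: S * exp(r * T)."""
    return spot * math.exp(r * t_years)
```

For the same rate and horizon, continuous compounding gives a slightly higher forward price than discrete annual compounding.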

  11. Financial Time Series Modelling with Hybrid Model Based on Customized RBF Neural Network Combined With Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Lukas Falat

    2014-01-01

    Full Text Available In this paper, the authors apply a feed-forward artificial neural network (ANN) of RBF type to the process of modelling and forecasting the future value of the USD/CAD time series. The authors test a customized version of the RBF network and add an evolutionary approach to it. They also combine the standard algorithm for adapting weights in the neural network with an unsupervised clustering algorithm called K-means. Finally, the authors suggest a new hybrid model as a combination of a standard ANN and a moving average for error modeling that is used to enhance the outputs of the network using the error part of the original RBF. Using high-frequency data, they examine the ability to forecast exchange rate values for a horizon of one day. To determine forecasting efficiency, the authors perform a comparative out-of-sample analysis of the suggested hybrid model against statistical models and the standard neural network.
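    A minimal numeric sketch of two of the ingredients: a Gaussian RBF layer fit by least squares, plus a moving-average correction built from its residuals. The centers, width, and window below are illustrative assumptions; the K-means initialisation and genetic tuning described above are not reproduced:

```python
import numpy as np

def rbf_design(X, centers, width):
    """Gaussian RBF activations exp(-||x - c||^2 / (2 * width^2))."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))

def hybrid_forecast(X, y, centers, width, ma_window=3):
    """RBF prediction plus a moving-average smoothing of its residuals."""
    Phi = rbf_design(X, centers, width)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    pred = Phi @ w
    residual = y - pred
    correction = np.convolve(residual, np.ones(ma_window) / ma_window, mode="same")
    return pred + correction

# In-sample check: with one center per sample the RBF layer interpolates the data.
X = np.linspace(0.0, 1.0, 5).reshape(-1, 1)
y = np.sin(2.0 * np.pi * X[:, 0])
fitted = hybrid_forecast(X, y, centers=X, width=0.3)
```

In a forecasting setting the centers would come from K-means over the training inputs, and the moving-average term would be estimated on past residuals only.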

  12. Updating the CHAOS series of field models using Swarm data and resulting candidate models for IGRF-12

    DEFF Research Database (Denmark)

    Finlay, Chris; Olsen, Nils; Tøffner-Clausen, Lars

    Ten months of data from ESA's Swarm mission, together with recent ground observatory monthly means, are used to update the CHAOS series of geomagnetic field models with a focus on time-changes of the core field. As for previous CHAOS field models quiet-time, night-side, data selection criteria......th order spline representation with knot points spaced at 0.5 year intervals. The resulting field model is able to consistently fit data from six independent low Earth orbit satellites: Oersted, CHAMP, SAC-C and the three Swarm satellites. As an example, we present comparisons of the excellent model...... fit obtained to both the Swarm data and the CHAMP data. The new model also provides a good description of observatory secular variation, capturing rapid field evolution events during the past decade. Maps of the core surface field and its secular variation can already be extracted in the Swarm-era. We...

  13. Streamflow characteristics from modelled runoff time series: Importance of calibration criteria selection

    Science.gov (United States)

    Poole, Sandra; Vis, Marc; Knight, Rodney; Seibert, Jan

    2017-01-01

    Ecologically relevant streamflow characteristics (SFCs) of ungauged catchments are often estimated from simulated runoff of hydrologic models that were originally calibrated on gauged catchments. However, SFC estimates of the gauged donor catchments and subsequently the ungauged catchments can be substantially uncertain when models are calibrated using traditional approaches based on optimization of statistical performance metrics (e.g., Nash–Sutcliffe model efficiency). An improved calibration strategy for gauged catchments is therefore crucial to help reduce the uncertainties of estimated SFCs for ungauged catchments. The aim of this study was to improve SFC estimates from modeled runoff time series in gauged catchments by explicitly including one or several SFCs in the calibration process. Different types of objective functions were defined consisting of the Nash–Sutcliffe model efficiency, single SFCs, or combinations thereof. We calibrated a bucket-type runoff model (HBV – Hydrologiska Byråns Vattenavdelning – model) for 25 catchments in the Tennessee River basin and evaluated the proposed calibration approach on 13 ecologically relevant SFCs representing major flow regime components and different flow conditions. While the model generally tended to underestimate the tested SFCs related to mean and high-flow conditions, SFCs related to low flow were generally overestimated. The highest estimation accuracies were achieved by a SFC-specific model calibration. Estimates of SFCs not included in the calibration process were of similar quality when comparing a multi-SFC calibration approach to a traditional model efficiency calibration. For practical applications, this implies that SFCs should preferably be estimated from targeted runoff model calibration, and modeled estimates need to be carefully interpreted.

  14. Streamflow characteristics from modeled runoff time series - importance of calibration criteria selection

    Science.gov (United States)

    Pool, Sandra; Vis, Marc J. P.; Knight, Rodney R.; Seibert, Jan

    2017-11-01

    Ecologically relevant streamflow characteristics (SFCs) of ungauged catchments are often estimated from simulated runoff of hydrologic models that were originally calibrated on gauged catchments. However, SFC estimates of the gauged donor catchments and subsequently the ungauged catchments can be substantially uncertain when models are calibrated using traditional approaches based on optimization of statistical performance metrics (e.g., Nash-Sutcliffe model efficiency). An improved calibration strategy for gauged catchments is therefore crucial to help reduce the uncertainties of estimated SFCs for ungauged catchments. The aim of this study was to improve SFC estimates from modeled runoff time series in gauged catchments by explicitly including one or several SFCs in the calibration process. Different types of objective functions were defined consisting of the Nash-Sutcliffe model efficiency, single SFCs, or combinations thereof. We calibrated a bucket-type runoff model (HBV - Hydrologiska Byråns Vattenavdelning - model) for 25 catchments in the Tennessee River basin and evaluated the proposed calibration approach on 13 ecologically relevant SFCs representing major flow regime components and different flow conditions. While the model generally tended to underestimate the tested SFCs related to mean and high-flow conditions, SFCs related to low flow were generally overestimated. The highest estimation accuracies were achieved by a SFC-specific model calibration. Estimates of SFCs not included in the calibration process were of similar quality when comparing a multi-SFC calibration approach to a traditional model efficiency calibration. For practical applications, this implies that SFCs should preferably be estimated from targeted runoff model calibration, and modeled estimates need to be carefully interpreted.
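    The calibration strategy above, folding one or several SFCs into the objective alongside the Nash-Sutcliffe efficiency, can be sketched generically. The equal weighting and the normalised-absolute-error form of the SFC term below are assumptions for illustration; the paper does not prescribe this exact formula:

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe model efficiency."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def sfc_score(obs, sim, sfc):
    """1 minus the normalised absolute error of one streamflow characteristic."""
    o, s = sfc(obs), sfc(sim)
    return 1.0 - abs(s - o) / abs(o)

def combined_objective(obs, sim, sfcs, w_nse=0.5):
    """Weighted mix of NSE and the mean SFC score (one plausible form)."""
    scores = [sfc_score(obs, sim, f) for f in sfcs]
    return w_nse * nse(obs, sim) + (1.0 - w_nse) * float(np.mean(scores))

# Two example SFCs: mean flow and a low-flow percentile.
example_sfcs = [np.mean, lambda q: np.percentile(q, 5)]
```

A calibration routine would maximise this objective over the runoff model's parameters; a perfect simulation scores 1.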

  15. Documentation for Grants Equal to Tax model: Volume 1, Technical description

    International Nuclear Information System (INIS)

    1986-01-01

    A computerized model, the Grants Equal to Tax (GETT) model, was developed to assist in evaluating the amount of federal grant monies that would go to state and local jurisdictions under the provisions outlined in the Nuclear Waste Policy Act of 1982. The GETT model is capable of forecasting the amount of tax liability associated with all property owned and all activities undertaken by the US Department of Energy (DOE) in site characterization and repository development. The GETT program is a user-friendly, menu-driven model developed using dBASE III™, a relational data base management system. The data base for GETT consists primarily of eight separate dBASE III™ files corresponding to each of the eight taxes levied by state and local jurisdictions on business property and activity. Additional smaller files help to control model inputs and reporting options. Volume 1 of the GETT model documentation is a technical description of the program and its capabilities providing (1) descriptions of the data management system and its procedures; (2) formulas for calculating taxes (illustrated with flow charts); (3) descriptions of tax data base variables for the Deaf Smith County, Texas, Richton Dome, Mississippi, and Davis Canyon, Utah, salt sites; and (4) data inputs for the GETT model. 10 refs., 18 figs., 3 tabs

  16. Multivariate autoregressive modelling of sea level time series from TOPEX/Poseidon satellite altimetry

    Directory of Open Access Journals (Sweden)

    S. M. Barbosa

    2006-01-01

    Full Text Available This work addresses the autoregressive modelling of sea level time series from the TOPEX/Poseidon satellite altimetry mission. Datasets from remote sensing applications are typically very large and correlated both in time and space. Multivariate analysis methods are useful tools to summarise and extract information from such large space-time datasets. Multivariate autoregressive analysis is a generalisation of Principal Oscillation Pattern (POP) analysis, widely used in the geosciences for the extraction of dynamical modes by eigen-decomposition of a first order autoregressive model fitted to the multivariate dataset of observations. The extension of the POP methodology to autoregressions of higher order, although increasing the difficulties in estimation, allows one to model a larger class of complex systems. Here, sea level variability in the North Atlantic is modelled by a third order multivariate autoregressive model estimated by stepwise least squares. Eigen-decomposition of the fitted model yields physically-interpretable seasonal modes. The leading autoregressive mode is an annual oscillation and exhibits a very homogeneous spatial structure in terms of amplitude, reflecting the large scale coherent behaviour of the annual pattern in the Northern hemisphere. The phase structure reflects the seesaw pattern between the western and eastern regions in the tropical North Atlantic associated with the trade winds regime. The second mode is close to a semi-annual oscillation. Multivariate autoregressive models provide a useful framework for the description of time-varying fields while also offering predictive potential.
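    The POP machinery summarised above, fitting a first-order autoregressive model to the multivariate series and eigen-decomposing its coefficient matrix into damped oscillatory modes, fits in a few lines. A sketch on synthetic data (a noiseless damped rotation, not altimetry; the third-order model and stepwise least squares used in the paper are not reproduced):

```python
import numpy as np

def fit_var1(X):
    """Least-squares fit of a first-order VAR model X[t+1] = A @ X[t]."""
    past, future = X[:-1], X[1:]
    At, *_ = np.linalg.lstsq(past, future, rcond=None)
    return At.T

def pop_modes(A, dt=1.0):
    """Eigen-decomposition of the VAR(1) matrix into damping factor and oscillation period."""
    lam, _ = np.linalg.eig(A)
    damping = np.abs(lam)
    with np.errstate(divide="ignore"):
        period = 2.0 * np.pi * dt / np.abs(np.angle(lam))
    return lam, damping, period

# Synthetic mode: a damped rotation with a 12-step period and damping factor 0.95.
theta, r = np.pi / 6.0, 0.95
A_true = r * np.array([[np.cos(theta), -np.sin(theta)],
                       [np.sin(theta), np.cos(theta)]])
X = [np.array([1.0, 0.0])]
for _ in range(29):
    X.append(A_true @ X[-1])
A_hat = fit_var1(np.array(X))
lam, damping, period = pop_modes(A_hat)
```

The complex eigenvalue pair's modulus gives the mode's damping per step and its argument gives the oscillation period, here 12 steps, which is how an annual cycle emerges from monthly data.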

  17. Assessing multiscale complexity of short heart rate variability series through a model-based linear approach

    Science.gov (United States)

    Porta, Alberto; Bari, Vlasta; Ranuzzi, Giovanni; De Maria, Beatrice; Baselli, Giuseppe

    2017-09-01

    We propose a multiscale complexity (MSC) method that assesses irregularity in assigned frequency bands and is appropriate for analyzing short time series. It is grounded on the identification of the coefficients of an autoregressive model, on the computation of the mean position of the poles generating the components of the power spectral density in an assigned frequency band, and on the assessment of its distance from the unit circle in the complex plane. The MSC method was tested on simulations and applied to the short heart period (HP) variability series recorded during graded head-up tilt in 17 subjects (age from 21 to 54 years, median = 28 years, 7 females) and during paced breathing protocols in 19 subjects (age from 27 to 35 years, median = 31 years, 11 females) to assess the contribution of time scales typical of the cardiac autonomic control, namely the low frequency (LF, from 0.04 to 0.15 Hz) and high frequency (HF, from 0.15 to 0.5 Hz) bands, to the complexity of the cardiac regulation. The proposed MSC technique was compared to a traditional model-free multiscale method grounded on information theory, i.e., multiscale entropy (MSE). The approach suggests that the reduction of HP variability complexity observed during graded head-up tilt is due to a regularization of the HP fluctuations in the LF band via a possible intervention of sympathetic control, and that the decrement of HP variability complexity observed during slow breathing is the result of the regularization of the HP variations in both LF and HF bands, thus implying the action of physiological mechanisms working at time scales even different from that of respiration. MSE did not distinguish experimental conditions at time scales larger than 1. Over short time series, MSC allows a more insightful association between cardiac control complexity and physiological mechanisms modulating cardiac rhythm compared to a more traditional tool such as MSE.
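    The pole-based index at the heart of the MSC method can be sketched directly: find the autoregressive poles, keep those whose frequency falls in the assigned band, and measure their mean distance from the unit circle. The AR(2) coefficients below are illustrative, not fitted HP variability data; a smaller distance indicates a more regular, narrowband oscillation:

```python
import numpy as np

def ar_poles(a):
    """Poles of the AR model y[t] = a[0]*y[t-1] + ... + a[p-1]*y[t-p]."""
    return np.roots(np.concatenate(([1.0], -np.asarray(a, float))))

def band_distance(a, f_lo, f_hi, fs=1.0):
    """Mean distance from the unit circle of the poles whose frequency lies in [f_lo, f_hi]."""
    poles = ar_poles(a)
    freqs = np.abs(np.angle(poles)) * fs / (2.0 * np.pi)
    in_band = poles[(freqs >= f_lo) & (freqs <= f_hi)]
    return np.nan if in_band.size == 0 else 1.0 - float(np.mean(np.abs(in_band)))

# AR(2) with a complex pole pair at radius 0.9 and normalised frequency 0.1 (in the LF band).
rho, f0 = 0.9, 0.1
a = np.array([2.0 * rho * np.cos(2.0 * np.pi * f0), -rho ** 2])
```

Here the LF-band distance is 1 - 0.9 = 0.1; as the pole radius approaches 1 the band's oscillation becomes more regular and the index shrinks toward zero.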

  18. Evaluation of a Global Vegetation Model using time series of satellite vegetation indices

    Directory of Open Access Journals (Sweden)

    F. Maignan

    2011-12-01

    Full Text Available Atmospheric CO2 drives most of the greenhouse effect increase. One major uncertainty on the future rate of increase of CO2 in the atmosphere is the impact of the anticipated climate change on vegetation. Dynamic Global Vegetation Models (DGVM) are used to address this question. ORCHIDEE is one such DGVM and has proven useful for climate change studies. However, there is no objective and methodological way to accurately assess each new available version on the global scale. In this paper, we present a methodological evaluation of ORCHIDEE by correlating satellite-derived Vegetation Index time series against those of the modeled Fraction of absorbed Photosynthetically Active Radiation (FPAR). A perfect correlation between the two is not expected; however, an improvement of the model should lead to an increase in overall performance.

    We detail two case studies in which model improvements are demonstrated, using our methodology. In the first one, a new phenology version in ORCHIDEE is shown to bring a significant impact on the simulated annual cycles, in particular for C3 Grasses and C3 Crops. In the second case study, we compare the simulations when using two different weather fields to drive ORCHIDEE. The ERA-Interim forcing leads to a better description of the FPAR interannual anomalies than the simulation forced by a mixed CRU-NCEP dataset. This work shows that long time series of satellite observations, despite their uncertainties, can identify weaknesses in global vegetation models, a necessary first step to improving them.
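The evaluation metric described here is essentially a per-series correlation between an observed vegetation-index series and the modeled FPAR series. A minimal sketch, with synthetic stand-ins for the satellite and model series:

```python
import numpy as np

def pearson_r(a, b):
    """Pearson correlation between two time series, e.g. a satellite
    vegetation-index series and a modeled FPAR series."""
    a = np.asarray(a, float)
    b = np.asarray(b, float)
    a = a - a.mean()
    b = b - b.mean()
    return float(a @ b / np.sqrt((a @ a) * (b @ b)))

# Hypothetical annual cycles (120 months): the "model" tracks the
# "observation" with some noise, so correlation is high but below 1.
t = np.arange(120)
obs = 0.5 + 0.3 * np.sin(2 * np.pi * t / 12)
rng = np.random.default_rng(1)
mod = obs + 0.05 * rng.standard_normal(120)
r = pearson_r(obs, mod)
```

A model improvement would show up as an increase of such correlations aggregated over pixels, which is exactly the "overall performance" criterion the abstract proposes.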

  19. Modeling error and stability of endothelial cytoskeletal membrane parameters based on modeling transendothelial impedance as resistor and capacitor in series.

    Science.gov (United States)

    Bodmer, James E; English, Anthony; Brady, Megan; Blackwell, Ken; Haxhinasto, Kari; Fotedar, Sunaina; Borgman, Kurt; Bai, Er-Wei; Moy, Alan B

    2005-09-01

    Transendothelial impedance across an endothelial monolayer grown on a microelectrode has previously been modeled as a repeating pattern of disks in which the electrical circuit consists of a resistor and capacitor in series. Although this numerical model breaks down barrier function into measurements of cell-cell adhesion, cell-matrix adhesion, and membrane capacitance, such solution parameters can be inaccurate without understanding model stability and error. In this study, we have evaluated modeling stability and error by using a chi-square (χ²) evaluation and the Levenberg-Marquardt nonlinear least-squares (LM-NLS) method on the real and/or imaginary data, in which the experimental measurement is compared with the calculated measurement derived by the model. Modeling stability and error were dependent on current frequency and the type of experimental data modeled. Solution parameters of cell-matrix adhesion were most susceptible to modeling instability. Furthermore, the LM-NLS method displayed frequency-dependent instability of the solution parameters, regardless of whether the real or imaginary data were analyzed. However, the LM-NLS method identified stable and reproducible solution parameters between all types of experimental data when a defined frequency spectrum of the entire data set was selected on the basis of a criterion of minimizing error. The frequency bandwidth that produced stable solution parameters varied greatly among different data types. Thus a numerical model based on characterizing transendothelial impedance as a resistor and capacitor in series and as a repeating pattern of disks is not sufficient to characterize the entire frequency spectrum of experimental transendothelial impedance.
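The underlying circuit element is a resistor and capacitor in series, Z(ω) = R + 1/(jωC). The paper fits the full cell-covered electrode model with Levenberg-Marquardt; for the bare series-RC element the problem is linear in R and 1/C, so a toy sketch can recover the parameters without an iterative solver (the component values below are invented):

```python
import numpy as np

def fit_series_rc(omega, Z):
    """Fit Z(w) = R + 1/(j*w*C) by linear least squares. The paper uses
    LM-NLS on a richer monolayer model; for a pure series RC the model
    is linear in R and 1/C, so no iterative solver is needed:
    Re Z = R and Im Z = -1/(w*C)."""
    R = float(np.mean(Z.real))
    inv_C = float(np.mean(-Z.imag * omega))  # each term equals 1/C
    return R, 1.0 / inv_C

omega = 2 * np.pi * np.logspace(1, 4, 50)   # 10 Hz .. 10 kHz
R_true, C_true = 1500.0, 2e-9               # hypothetical values
Z = R_true + 1.0 / (1j * omega * C_true)
R_hat, C_hat = fit_series_rc(omega, Z)
```

With noisy data, restricting the frequency band used in the fit changes the solution stability, which is the effect the abstract quantifies.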

  20. Contributing to Tourism Industry Vitality of a Natural Resource Based Region through Educational/Technical Assistance. Staff Paper Series P83-20.

    Science.gov (United States)

    Blank, Uel; And Others

    From 1979 to 1982 an extension education program provided assistance to the tourism industry in rural communities adjoining northeastern Minnesota's Boundary Waters Canoe Area (BWCA). Program activities involved needs assessment, educational and technical assistance to communities and tourism-related firms, marketing programs, grants management…

  1. New Unesco Project To Improve Primary School Performance through Improved Nutrition and Health. First Technical Report. Nutrition Education Series, Issue 18.

    Science.gov (United States)

    United Nations Educational, Scientific, and Cultural Organization, Paris (France). Div. of Science, Technical and Environmental Education.

    A new Unesco project seeks to increase the capacity of developing countries to strengthen primary school academic performance by improving children's nutrition and health status. The first technical meeting of the new project took place in Stockholm, Sweden, in April 1989. Three working groups were formed which focused on assessment, intervention,…

  2. Technical and economic modelling of processes for liquid fuel production in Europe

    International Nuclear Information System (INIS)

    Bridgwater, A.V.; Double, J.M.

    1991-01-01

    The project which is described had the objective of examining the full range of technologies for liquid fuel production from renewable feedstocks in a technical and economic evaluation in order to identify the most promising technologies. The technologies considered are indirect thermochemical liquefaction (i.e. via gasification) to produce methanol, fuel alcohol or hydrocarbon fuels, direct thermochemical liquefaction or pyrolysis to produce hydrocarbon fuels and fermentation to produce ethanol. Feedstocks considered were wood, refuse derived fuel, straw, wheat and sugar beet. In order to carry out the evaluation, a computer model was developed, based on a unit process approach. Each unit operation is modelled as a process step, the model calculating the mass balance, energy balance and operating cost of the unit process. The results from the process step models are then combined to generate the mass balance, energy balance, capital cost and operating cost for the total process. The results show that the lowest production cost (£7/GJ) is obtained for methanol generated from a straw feedstock, but there is a moderate level of technical uncertainty associated with this result. The lowest production cost for hydrocarbon fuel (£8.6/GJ) is given by the pyrolysis process using a wood feedstock. This process has a high level of uncertainty. Fermentation processes showed the highest production costs, ranging from £14.4/GJ for a simple wood feedstock process to £25.2/GJ for a process based on sugar beet. The important conclusions are as follows: - In every case, the product cost is above current liquid fuel prices; - In most cases the feedstock cost dominates the production cost; - The most attractive products are thermochemically produced alcohol fuels

  3. Stochastic modeling for time series InSAR: with emphasis on atmospheric effects

    Science.gov (United States)

    Cao, Yunmeng; Li, Zhiwei; Wei, Jianchao; Hu, Jun; Duan, Meng; Feng, Guangcai

    2018-02-01

    Despite the many applications of time series interferometric synthetic aperture radar (TS-InSAR) techniques in geophysical problems, error analysis and assessment have been largely overlooked. Tropospheric propagation error is still the dominant error source of InSAR observations. However, the spatiotemporal variation of atmospheric effects is seldom considered in the present standard TS-InSAR techniques, such as persistent scatterer interferometry and small baseline subset interferometry. The failure to consider the stochastic properties of atmospheric effects not only affects the accuracy of the estimators, but also makes it difficult to assess the uncertainty of the final geophysical results. To address this issue, this paper proposes a network-based variance-covariance estimation method to model the spatiotemporal variation of tropospheric signals, and to estimate the temporal variance-covariance matrix of TS-InSAR observations. The constructed stochastic model is then incorporated into the TS-InSAR estimators both for parameter (e.g., deformation velocity, topography residual) estimation and for uncertainty assessment. It is an incremental and positive improvement over the traditional weighted least squares methods used to solve the multitemporal InSAR time series. The performance of the proposed method is validated by using both simulated and real datasets.
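The estimator the paper improves is generalized least squares with a full temporal variance-covariance matrix. A minimal sketch, with a hypothetical linear-rate model and an exponential covariance standing in for the correlated tropospheric signal:

```python
import numpy as np

def gls(A, y, Sigma):
    """Generalized least squares: parameter estimate and its covariance,
    given the observation covariance Sigma (here standing in for the
    temporal variance-covariance matrix of the InSAR observations)."""
    Si = np.linalg.inv(Sigma)
    N = A.T @ Si @ A
    x = np.linalg.solve(N, A.T @ Si @ y)
    return x, np.linalg.inv(N)   # second output: parameter uncertainty

# Hypothetical model: phase = velocity * t + intercept
t = np.linspace(0.0, 5.0, 30)
A = np.column_stack([t, np.ones_like(t)])
# Temporally correlated "atmospheric" noise via an exponential covariance
Sigma = 4.0 * np.exp(-np.abs(t[:, None] - t[None, :]) / 0.5)
rng = np.random.default_rng(2)
L = np.linalg.cholesky(Sigma + 1e-9 * np.eye(30))
y = 10.0 * t + 3.0 + L @ rng.standard_normal(30)
x_hat, cov = gls(A, y, Sigma)   # x_hat[0] ~ velocity, cov gives its error
```

The point of the abstract is precisely the second return value: with a realistic Sigma, the velocity estimate comes with a defensible uncertainty instead of an optimistic one.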

  4. A Virtual Machine Migration Strategy Based on Time Series Workload Prediction Using Cloud Model

    Directory of Open Access Journals (Sweden)

    Yanbing Liu

    2014-01-01

    Full Text Available Aimed at resolving the issues of resource and workload imbalance at data centers and the overhead and high cost of virtual machine (VM) migrations, this paper proposes a new VM migration strategy based on the cloud model time series workload prediction algorithm. By setting upper and lower workload bounds for host machines, forecasting the tendency of their subsequent workloads by creating a workload time series using the cloud model, and stipulating a general VM migration criterion, workload-aware migration (WAM), the proposed strategy selects a source host machine, a destination host machine, and a VM on the source host machine, and carries out the VM migration. Experimental results and analyses show, through comparison with other peer research works, that the proposed method can effectively avoid VM migrations caused by momentary peak workload values, significantly lower the number of VM migrations, and dynamically reach and maintain a resource and workload balance for virtual machines, thereby promoting improved utilization of resources in the entire data center.
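The migration criterion can be sketched as a pair of threshold tests on both the current and the forecast workload, so that a momentary peak alone does not trigger a migration. The moving-average forecast below is a stand-in for the paper's cloud-model prediction, and the bounds are arbitrary:

```python
def should_migrate(history, upper, lower, window=3):
    """Hedged sketch of a workload-aware migration criterion: act only
    when the current load AND a simple forecast both cross a bound, so
    a single spike does not trigger a costly VM migration. A moving
    average stands in for the paper's cloud-model forecast."""
    current = history[-1]
    forecast = sum(history[-window:]) / min(window, len(history))
    if current > upper and forecast > upper:
        return "migrate-from"          # host overloaded: pick a VM to move away
    if current < lower and forecast < lower:
        return "migrate-to-candidate"  # host underloaded: candidate destination
    return "stay"

spiky = [0.5, 0.5, 0.95, 0.5]   # single spike: no migration
hot = [0.8, 0.9, 0.95, 0.97]    # sustained overload: migrate
a = should_migrate(spiky, upper=0.9, lower=0.2)
b = should_migrate(hot, upper=0.9, lower=0.2)
```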

  5. Estimating the basic reproduction rate of HFMD using the time series SIR model in Guangdong, China.

    Directory of Open Access Journals (Sweden)

    Zhicheng Du

    Full Text Available Hand, foot, and mouth disease (HFMD) has caused a substantial burden of disease in China, especially in Guangdong Province. Based on notifiable cases, we use the time series Susceptible-Infected-Recovered model to estimate the basic reproduction rate (R0) and the herd immunity threshold, to more completely understand the transmission and persistence of HFMD for efficient intervention in this province. The standardized difference between the reported and fitted time series of HFMD was 0.009 (<0.2). The median basic reproduction rates of total, enterovirus 71, and coxsackievirus 16 cases in Guangdong were 4.621 (IQR: 3.907-5.823), 3.023 (IQR: 2.289-4.292) and 7.767 (IQR: 6.903-10.353), respectively. The heatmap of R0 showed semiannual peaks of activity, including a major peak in spring and early summer (about the 12th week) followed by a smaller peak in autumn (about the 36th week). The county-level model showed that Longchuan (R0 = 33), Gaozhou (R0 = 24), Huazhou (R0 = 23) and Qingxin (R0 = 19) counties have higher basic reproduction rates than other counties in the province. The epidemic of HFMD in Guangdong Province is still grim, and strategies like the World Health Organization's expanded program on immunization need to be implemented. Elimination of HFMD in Guangdong might require a herd immunity threshold of 78%.
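Two quantities from this abstract are easy to reproduce: the herd immunity threshold implied by R0, and the epidemic curve of a discrete-time (time series) SIR model. The SIR rates below are hypothetical illustrations, not the fitted HFMD values:

```python
def herd_immunity_threshold(r0):
    """Classic SIR-type relation: the fraction that must be immune so
    that each case infects fewer than one other person on average."""
    return 1.0 - 1.0 / r0

# The abstract's median R0 for all HFMD cases:
hit = herd_immunity_threshold(4.621)   # close to the ~78% quoted

def sir_step(S, I, beta, gamma, N):
    """One step of a discrete-time (time series) SIR model."""
    new_inf = beta * S * I / N
    new_rec = gamma * I
    return S - new_inf, I + new_inf - new_rec

# Hypothetical rates (R0 = beta/gamma = 3), not the paper's estimates
S, I, N = 9990.0, 10.0, 10000.0
beta, gamma = 0.9, 0.3
peak = 0.0
for _ in range(200):
    S, I = sir_step(S, I, beta, gamma, N)
    peak = max(peak, I)   # epidemic rises, peaks, then burns out
```

With R0 = 4.621, the threshold evaluates to about 0.784, which is where the abstract's 78% figure comes from.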

  6. Technical integration of hippocampus, Basal Ganglia and physical models for spatial navigation.

    Science.gov (United States)

    Fox, Charles; Humphries, Mark; Mitchinson, Ben; Kiss, Tamas; Somogyvari, Zoltan; Prescott, Tony

    2009-01-01

    Computational neuroscience is increasingly moving beyond modeling individual neurons or neural systems to consider the integration of multiple models, often constructed by different research groups. We report on our preliminary technical integration of recent hippocampal formation, basal ganglia and physical environment models, together with visualisation tools, as a case study in the use of Python across the modelling tool-chain. We do not present new modeling results here. The architecture incorporates leaky-integrator and rate-coded neurons, a 3D environment with collision detection and tactile sensors, 3D graphics and 2D plots. We found Python to be a flexible platform, offering a significant reduction in development time, without a corresponding significant increase in execution time. We illustrate this by implementing a part of the model in various alternative languages and coding styles, and comparing their execution times. For very large-scale system integration, communication with other languages and parallel execution may be required, which we demonstrate using the BRAHMS framework's Python bindings.

  7. Technical integration of hippocampus, basal ganglia and physical models for spatial navigation

    Directory of Open Access Journals (Sweden)

    Charles W Fox

    2009-03-01

    Full Text Available Computational neuroscience is increasingly moving beyond modeling individual neurons or neural systems to consider the integration of multiple models, often constructed by different research groups. We report on our preliminary technical integration of recent hippocampal formation, basal ganglia and physical environment models, together with visualisation tools, as a case study in the use of Python across the modelling tool-chain. We do not present new modeling results here. The architecture incorporates leaky-integrator and rate-coded neurons, a 3D environment with collision detection and tactile sensors, 3D graphics and 2D plots. We found Python to be a flexible platform, offering a significant reduction in development time, without a corresponding significant increase in execution time. We illustrate this by implementing a part of the model in various alternative languages and coding styles, and comparing their execution times. For very large scale system integration, communication with other languages and parallel execution may be required, which we demonstrate using the BRAHMS framework's Python bindings.

  8. Cooling load calculation by the radiant time series method - effect of solar radiation models

    Energy Technology Data Exchange (ETDEWEB)

    Costa, Alexandre M.S. [Universidade Estadual de Maringa (UEM), PR (Brazil)], E-mail: amscosta@uem.br

    2010-07-01

    In this work, the effect of three different solar radiation models on the cooling load calculated by the radiant time series (RTS) method was analyzed numerically. The solar radiation models implemented were clear sky, isotropic sky and anisotropic sky. The RTS method was proposed by ASHRAE (2001) to replace classical cooling load calculation methods such as TETD/TA. The method is based on computing the effect of space thermal energy storage on the instantaneous cooling load. The computation is carried out by splitting the heat gain components into convective and radiant parts. The radiant part is then transformed using time series whose coefficients are a function of the construction type and of the heat gain type (solar or non-solar). The transformed result is added to the convective part, giving the instantaneous cooling load. The method was applied to an example room. The location used was 23° S, 51° W and the day was 21 January, a typical summer day in the southern hemisphere. The room was composed of two vertical walls with windows exposed to outdoors, with azimuth angles facing west and east. The output of the different solar radiation models for the two walls, in terms of direct and diffuse components as well as heat gains, was investigated. It was verified that the clear sky model was the least conservative (higher values) for the direct component of solar radiation, with the opposite trend for the diffuse component. For the heat gain, the clear sky model gives the highest values, three times higher during the peak hours than the other models. Both the isotropic and anisotropic models predicted heat gains of similar magnitude. The same behavior was also verified for the cooling load. The effect of room thermal inertia was to decrease the cooling load during the peak hours; conversely, the higher the thermal inertia, the greater the load during the non-peak hours. The effect
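The convective/radiant split described above can be sketched directly: the radiant part of each hourly heat gain is redistributed over later hours by the radiant time factors, then added back to the convective part. The coefficients and gain profile below are illustrative, not ASHRAE-tabulated values:

```python
import numpy as np

def rts_cooling_load(heat_gain, radiative_fraction, rts_coeffs):
    """Sketch of the radiant time series idea: split each hourly heat
    gain into convective and radiant parts, convert the radiant part
    to load with the radiant time factors (coefficients summing to 1),
    and add the convective part back, hour by hour. Assumes a periodic
    (steady daily) cycle, as the RTS method does."""
    q = np.asarray(heat_gain, float)
    conv = (1.0 - radiative_fraction) * q
    rad = radiative_fraction * q
    n = len(q)
    load = conv.copy()
    for t in range(n):
        for j, r in enumerate(rts_coeffs):
            load[t] += r * rad[(t - j) % n]
    return load

# Illustrative decaying radiant time factors (sum to 1), not ASHRAE values
rts = np.array([0.5, 0.2, 0.1, 0.08, 0.06, 0.04, 0.02])
hour = np.arange(24)
# Hypothetical daytime solar heat gain profile, in W
gain = np.maximum(0.0, np.sin(np.pi * (hour - 6) / 12)) * 1000.0
load = rts_cooling_load(gain, radiative_fraction=0.7, rts_coeffs=rts)
```

Because the factors sum to one, the daily total load equals the daily total gain, but the peak is lowered and shifted later, which is the thermal-storage effect the abstract describes.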

  9. Deterministic decomposition and seasonal ARIMA time series models applied to airport noise forecasting

    Science.gov (United States)

    Guarnaccia, Claudio; Quartieri, Joseph; Tepedino, Carmine

    2017-06-01

    One of the most hazardous physical polluting agents, considering their effects on human health, is acoustical noise. Airports are a strong source of acoustical noise, due to the airplanes' turbines, to the aerodynamic noise of transits, to the acceleration or braking during the take-off and landing phases of aircraft, to the road traffic around the airport, etc. The monitoring and prediction of the acoustical level emitted by airports can be very useful to assess the impact on human health and activities. In the airport noise scenario, thanks to flight scheduling, the predominant sources may have a periodic behaviour. Thus, a Time Series Analysis approach can be adopted, considering that a general trend and a seasonal behaviour can be highlighted and used to build a predictive model. In this paper, two different approaches are adopted, and thus two predictive models are constructed and tested. The first model is based on deterministic decomposition and is built by composing the trend, that is the long term behaviour, the seasonality, that is the periodic component, and the random variations. The second model is based on seasonal autoregressive moving average, and it belongs to the stochastic class of models. The two models are fitted on an acoustical level dataset collected close to the Nice (France) international airport. Results are encouraging and show good prediction performance for both of the adopted strategies. A residual analysis is performed, in order to quantify the forecasting error features.
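The deterministic-decomposition model is the simpler of the two: trend plus periodic seasonality plus residual random variation. A minimal sketch on a synthetic daily noise-level series (the period, trend, and amplitudes are invented):

```python
import numpy as np

def decompose(x, period):
    """Deterministic decomposition sketch: linear trend (least squares),
    seasonal component (mean of the detrended series at each position
    in the period), and the residual random variation."""
    x = np.asarray(x, float)
    t = np.arange(len(x))
    slope, intercept = np.polyfit(t, x, 1)
    trend = slope * t + intercept
    detr = x - trend
    seasonal = np.array([detr[i::period].mean() for i in range(period)])
    seasonal = np.tile(seasonal, len(x) // period + 1)[:len(x)]
    resid = x - trend - seasonal
    return trend, seasonal, resid

# Hypothetical weekly-periodic noise level (dB) with a slow upward trend
rng = np.random.default_rng(3)
t = np.arange(7 * 20)
x = 60.0 + 0.01 * t + 3.0 * np.sin(2 * np.pi * t / 7) + 0.5 * rng.standard_normal(len(t))
trend, seasonal, resid = decompose(x, period=7)
```

Forecasts are then made by extrapolating `trend` and repeating `seasonal`; the residual analysis mentioned in the abstract amounts to checking that `resid` looks like uncorrelated noise.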

  10. Scaling symmetry, renormalization, and time series modeling: the case of financial assets dynamics.

    Science.gov (United States)

    Zamparo, Marco; Baldovin, Fulvio; Caraglio, Michele; Stella, Attilio L

    2013-12-01

    We present and discuss a stochastic model of financial assets dynamics based on the idea of an inverse renormalization group strategy. With this strategy we construct the multivariate distributions of elementary returns based on the scaling with time of the probability density of their aggregates. In its simplest version the model is the product of an endogenous autoregressive component and a random rescaling factor designed to embody also exogenous influences. Mathematical properties like increments' stationarity and ergodicity can be proven. Thanks to the relatively low number of parameters, model calibration can be conveniently based on a method of moments, as exemplified in the case of historical data of the S&P500 index. The calibrated model accounts very well for many stylized facts, like volatility clustering, power-law decay of the volatility autocorrelation function, and multiscaling with time of the aggregated return distribution. In agreement with empirical evidence in finance, the dynamics is not invariant under time reversal, and, with suitable generalizations, skewness of the return distribution and leverage effects can be included. The analytical tractability of the model opens interesting perspectives for applications, for instance, in terms of obtaining closed formulas for derivative pricing. Further important features are the possibility of making contact, in certain limits, with autoregressive models widely used in finance and the possibility of partially resolving the long- and short-memory components of the volatility, with consistent results when applied to historical series.

  11. A Long-Term Prediction Model of Beijing Haze Episodes Using Time Series Analysis

    Directory of Open Access Journals (Sweden)

    Xiaoping Yang

    2016-01-01

    Full Text Available Rapid industrial development has led to the intermittent outbreak of PM2.5 or haze in developing countries, which has brought about great environmental issues, especially in big cities such as Beijing and New Delhi. We investigated the factors and mechanisms of haze change and present a long-term prediction model of Beijing haze episodes using time series analysis. We construct a dynamic structural measurement model of daily haze increment and reduce the model to a vector autoregressive model. Typical case studies on 886 continuous days indicate that our model performs very well on next day's Air Quality Index (AQI) prediction, and in severely polluted cases (AQI ≥ 300) the accuracy rate of AQI prediction even reaches up to 87.8%. The experiment of one-week prediction shows that our model has excellent sensitivity when a sudden haze burst or dissipation happens, which results in good long-term stability in the accuracy of the next 3-7 days' AQI prediction.
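The reduction to a vector autoregressive model can be illustrated with the simplest case, a VAR(1) fitted by least squares. The two-variable system below is a hypothetical toy, much smaller than the paper's model:

```python
import numpy as np

def fit_var1(X):
    """Fit a first-order vector autoregression x_t = A x_{t-1} + c by
    least squares. The abstract reduces a structural haze model to a
    VAR; this is the simplest such fit, not the paper's exact model."""
    Z = np.column_stack([X[:-1], np.ones(len(X) - 1)])  # lagged values + bias
    B, *_ = np.linalg.lstsq(Z, X[1:], rcond=None)
    return B[:-1].T, B[-1]          # A matrix, intercept c

def predict_next(A, c, x_last):
    return A @ x_last + c

# Hypothetical 2-variable system: an "AQI-like" series coupled to a driver
rng = np.random.default_rng(4)
A_true = np.array([[0.8, 0.1], [0.0, 0.7]])
c_true = np.array([20.0, 5.0])
x = np.zeros(2)
data = []
for _ in range(2000):
    x = A_true @ x + c_true + rng.standard_normal(2)
    data.append(x.copy())
X = np.array(data)
A_hat, c_hat = fit_var1(X)
```

Multi-day forecasts, like the 3-7 day predictions in the abstract, come from iterating `predict_next` on its own output.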

  12. Generalized least squares and empirical Bayes estimation in regional partial duration series index-flood modeling

    DEFF Research Database (Denmark)

    Madsen, Henrik; Rosbjerg, Dan

    1997-01-01

    A regional estimation procedure that combines the index-flood concept with an empirical Bayes method for inferring regional information is introduced. The model is based on the partial duration series approach with generalized Pareto (GP) distributed exceedances. The prior information of the model parameters is inferred from regional data using generalized least squares (GLS) regression. Two different Bayesian T-year event estimators are introduced: a linear estimator that requires only some moments of the prior distributions to be specified and a parametric estimator that is based on specified families of prior distributions. The regional method is applied to flood records from 48 New Zealand catchments. In the case of a strongly heterogeneous intersite correlation structure, the GLS procedure provides a more efficient estimate of the regional GP shape parameter as compared to the usually
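Once the threshold, GP parameters, and mean annual number of exceedances are known, the partial duration series model yields a closed-form T-year event. A sketch using the Hosking-style sign convention for the shape parameter (all numbers invented):

```python
import math

def gp_t_year_event(u, alpha, kappa, lam, T):
    """T-year event for a partial duration series with generalized
    Pareto exceedances: threshold u, scale alpha, shape kappa (Hosking
    sign convention, assumed here), and lam exceedances per year on
    average. For kappa -> 0 this reduces to the exponential case
    u + alpha * log(lam * T)."""
    if abs(kappa) < 1e-9:
        return u + alpha * math.log(lam * T)
    return u + (alpha / kappa) * (1.0 - (lam * T) ** (-kappa))

# Hypothetical flood parameters: threshold 100 m3/s, scale 40, 3 events/yr
q100 = gp_t_year_event(u=100.0, alpha=40.0, kappa=0.1, lam=3.0, T=100.0)
q_exp = gp_t_year_event(u=100.0, alpha=40.0, kappa=0.0, lam=3.0, T=100.0)
```

A positive shape parameter bounds the tail, so `q100` sits below the exponential-tail estimate `q_exp`; this sensitivity is why an efficient regional estimate of the GP shape matters so much.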

  13. An Application of the Coherent Noise Model for the Prediction of Aftershock Magnitude Time Series

    Directory of Open Access Journals (Sweden)

    Stavros-Richard G. Christopoulos

    2017-01-01

    Full Text Available Recently, the study of the coherent noise model has led to a simple (binary) prediction algorithm for the forthcoming earthquake magnitude in aftershock sequences. This algorithm is based on the concept of natural time and exploits the complexity exhibited by the coherent noise model. Here, using the relocated catalogue from the Southern California Seismic Network for 1981 to June 2011, we evaluate the application of this algorithm for the aftershocks of strong earthquakes of magnitude M≥6. The study is also extended, using the Global Centroid Moment Tensor Project catalogue, to the case of the six strongest earthquakes on Earth during almost the last forty years. The predictor time series exhibits the ubiquitous 1/f noise behavior.

  14. Time Series Neural Network Model for Part-of-Speech Tagging Indonesian Language

    Science.gov (United States)

    Tanadi, Theo

    2018-03-01

    Part-of-speech tagging (POS tagging) is an important part of natural language processing. Many methods have been used to do this task, including neural networks. This paper models a neural network that attempts to do POS tagging. A time series neural network is modelled to solve the problems that a basic neural network faces when attempting to do POS tagging. In order to enable the neural network to take text data as input, the text data is first clustered using Brown Clustering, resulting in a binary dictionary that the neural network can use. To further improve the accuracy of the neural network, other features such as the POS tag, suffix, and affix of previous words are also fed to the neural network.

  15. Stochastic models in the DORIS position time series: estimates for IDS contribution to ITRF2014

    Science.gov (United States)

    Klos, Anna; Bogusz, Janusz; Moreaux, Guilhem

    2017-11-01

    This paper focuses on the investigation of the deterministic and stochastic parts of the Doppler Orbitography and Radiopositioning Integrated by Satellite (DORIS) weekly time series aligned to the newest release of ITRF2014. A set of 90 stations was divided into three groups depending on when the data were collected at an individual station. To reliably describe the DORIS time series, we employed a mathematical model that included the long-term nonlinear signal, linear trend, seasonal oscillations and a stochastic part, all being estimated with maximum likelihood estimation. We proved that the values of the parameters delivered for DORIS data are strictly correlated with the time span of the observations. The quality of the most recent data has significantly improved. Not only did the seasonal amplitudes decrease over the years, but also, and most importantly, the noise level and its type changed significantly. Among several tested models, the power-law process may be chosen as the preferred one for most of the DORIS data. Moreover, the preferred noise model has changed through the years from an autoregressive process to pure power-law noise with few stations characterised by a positive spectral index. For the latest observations, the medians of the velocity errors were equal to 0.3, 0.3 and 0.4 mm/year, respectively, for the North, East and Up components. In the best cases, a velocity uncertainty of DORIS sites of 0.1 mm/year is achievable when the appropriate coloured noise model is taken into consideration.

  16. Inference of quantitative models of bacterial promoters from time-series reporter gene data.

    Science.gov (United States)

    Stefan, Diana; Pinel, Corinne; Pinhal, Stéphane; Cinquemani, Eugenio; Geiselmann, Johannes; de Jong, Hidde

    2015-01-01

    The inference of regulatory interactions and quantitative models of gene regulation from time-series transcriptomics data has been extensively studied and applied to a range of problems in drug discovery, cancer research, and biotechnology. The application of existing methods is commonly based on implicit assumptions on the biological processes under study. First, the measurements of mRNA abundance obtained in transcriptomics experiments are taken to be representative of protein concentrations. Second, the observed changes in gene expression are assumed to be solely due to transcription factors and other specific regulators, while changes in the activity of the gene expression machinery and other global physiological effects are neglected. While convenient in practice, these assumptions are often not valid and bias the reverse engineering process. Here we systematically investigate, using a combination of models and experiments, the importance of this bias and possible corrections. We measure in real time and in vivo the activity of genes involved in the FliA-FlgM module of the E. coli motility network. From these data, we estimate protein concentrations and global physiological effects by means of kinetic models of gene expression. Our results indicate that correcting for the bias of commonly-made assumptions improves the quality of the models inferred from the data. Moreover, we show by simulation that these improvements are expected to be even stronger for systems in which protein concentrations have longer half-lives and the activity of the gene expression machinery varies more strongly across conditions than in the FliA-FlgM module. The approach proposed in this study is broadly applicable when using time-series transcriptome data to learn about the structure and dynamics of regulatory networks. 
In the case of the FliA-FlgM module, our results demonstrate the importance of global physiological effects and the active regulation of FliA and FlgM half-lives for
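The correction the authors argue for, that mRNA abundance can be a biased proxy for protein concentration, follows from the basic kinetic model of gene expression. A sketch with forward-Euler integration (the rates, half-lives, and mRNA pulse are invented) shows how a longer protein half-life decouples the protein from the mRNA signal:

```python
import math

def protein_from_mrna(m, k_tl, half_life, dt=1.0):
    """Kinetic sketch of estimating protein concentration from an
    mRNA/reporter time series: dp/dt = k_tl * m(t) - gamma * p,
    integrated with forward Euler. Longer protein half-lives make p a
    smoothed, delayed version of m, which is why mRNA abundance alone
    can misrepresent protein concentration."""
    gamma = math.log(2.0) / half_life
    p, out = 0.0, []
    for mt in m:
        p += dt * (k_tl * mt - gamma * p)
        out.append(p)
    return out

# Hypothetical mRNA pulse: on for 60 min, then off for 120 min
m = [1.0] * 60 + [0.0] * 120
p_short = protein_from_mrna(m, k_tl=1.0, half_life=10.0)
p_long = protein_from_mrna(m, k_tl=1.0, half_life=120.0)
```

With a 10-minute half-life the protein tracks the mRNA closely and vanishes soon after the pulse; with a 2-hour half-life the protein is still near its peak long after the mRNA is gone, exactly the regime where the bias discussed above is strongest.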

  17. A simulation model for wind energy storage systems. Volume 1: Technical report

    Science.gov (United States)

    Warren, A. W.; Edsinger, R. W.; Chan, Y. K.

    1977-01-01

    A comprehensive computer program for the modeling of wind energy and storage systems utilizing any combination of five types of storage (pumped hydro, battery, thermal, flywheel and pneumatic) was developed. The level of detail of the Simulation Model for Wind Energy Storage (SIMWEST) is consistent with its role of evaluating the economic feasibility as well as the general performance of wind energy systems. The software package consists of two basic programs and a library of system, environmental, and load components. The first program is a precompiler which generates computer models (in FORTRAN) of complex wind source storage application systems from user specifications, using the respective library components. The second program provides the techno-economic system analysis with the respective I/O, the integration of system dynamics, and the iteration for conveyance of variables. The SIMWEST program, as described, runs on the UNIVAC 1100 series computers.

  18. Bio-logic builder: a non-technical tool for building dynamical, qualitative models.

    Science.gov (United States)

    Helikar, Tomáš; Kowal, Bryan; Madrahimov, Alex; Shrestha, Manish; Pedersen, Jay; Limbu, Kahani; Thapa, Ishwor; Rowley, Thaine; Satalkar, Rahul; Kochi, Naomi; Konvalina, John; Rogers, Jim A

    2012-01-01

    Computational modeling of biological processes is a promising tool in biomedical research. While a large part of its potential lies in the ability to integrate it with laboratory research, modeling currently generally requires a high degree of training in mathematics and/or computer science. To help address this issue, we have developed a web-based tool, Bio-Logic Builder, that enables laboratory scientists to define mathematical representations (based on a discrete formalism) of biological regulatory mechanisms in a modular and non-technical fashion. As part of the user interface, generalized "bio-logic" modules have been defined to provide users with the building blocks for many biological processes. To build/modify computational models, experimentalists provide purely qualitative information about particular regulatory mechanisms, as is generally found in the laboratory. The Bio-Logic Builder subsequently converts the provided information into a mathematical representation described with Boolean expressions/rules. We used this tool to build a number of dynamical models, including a 130-protein large-scale model of signal transduction with over 800 interactions, the influenza A replication cycle with 127 species and 200+ interactions, and the mammalian and budding yeast cell cycles. We also show that any and all qualitative regulatory mechanisms can be built using this tool.

  19. Clustering gene expression time series data using an infinite Gaussian process mixture model.

    Science.gov (United States)

    McDowell, Ian C; Manandhar, Dinesh; Vockley, Christopher M; Schmid, Amy K; Reddy, Timothy E; Engelhardt, Barbara E

    2018-01-01

    Transcriptome-wide time series expression profiling is used to characterize the cellular response to environmental perturbations. The first step to analyzing transcriptional response data is often to cluster genes with similar responses. Here, we present a nonparametric model-based method, Dirichlet process Gaussian process mixture model (DPGP), which jointly models data clusters with a Dirichlet process and temporal dependencies with Gaussian processes. We demonstrate the accuracy of DPGP in comparison to state-of-the-art approaches using hundreds of simulated data sets. To further test our method, we apply DPGP to published microarray data from a microbial model organism exposed to stress and to novel RNA-seq data from a human cell line exposed to the glucocorticoid dexamethasone. We validate our clusters by examining local transcription factor binding and histone modifications. Our results demonstrate that jointly modeling cluster number and temporal dependencies can reveal shared regulatory mechanisms. DPGP software is freely available online at https://github.com/PrincetonUniversity/DP_GP_cluster.

  20. Clustering gene expression time series data using an infinite Gaussian process mixture model.

    Directory of Open Access Journals (Sweden)

    Ian C McDowell

    2018-01-01

    Full Text Available Transcriptome-wide time series expression profiling is used to characterize the cellular response to environmental perturbations. The first step to analyzing transcriptional response data is often to cluster genes with similar responses. Here, we present a nonparametric model-based method, Dirichlet process Gaussian process mixture model (DPGP), which jointly models data clusters with a Dirichlet process and temporal dependencies with Gaussian processes. We demonstrate the accuracy of DPGP in comparison to state-of-the-art approaches using hundreds of simulated data sets. To further test our method, we apply DPGP to published microarray data from a microbial model organism exposed to stress and to novel RNA-seq data from a human cell line exposed to the glucocorticoid dexamethasone. We validate our clusters by examining local transcription factor binding and histone modifications. Our results demonstrate that jointly modeling cluster number and temporal dependencies can reveal shared regulatory mechanisms. DPGP software is freely available online at https://github.com/PrincetonUniversity/DP_GP_cluster.

  1. The string prediction models as invariants of time series in the forex market

    Science.gov (United States)

    Pincak, R.

    2013-12-01

    In this paper we apply a new approach based on string theory to real financial markets. The models are constructed following the idea of prediction models based on string invariants (PMBSI). The performance of PMBSI is compared to support vector machines (SVM) and artificial neural networks (ANN) on an artificial and a financial time series, and a brief overview of the results and analysis is given. The first model is based on the correlation function as the invariant, and the second is an application based on deviations from the closed string/pattern form (PMBCS). We describe the differences between the two approaches: the first model cannot predict the behavior of the forex market with good efficiency, whereas the second can and is, in addition, able to generate a relevant profit per year. The presented string models could be useful for portfolio creation and financial risk management in the banking sector, as well as for a nonlinear statistical approach to data optimization.

  2. Historical Streamflow Series Analysis Applied to Furnas HPP Reservoir Watershed Using the SWAT Model

    Directory of Open Access Journals (Sweden)

    Viviane de Souza Dias

    2018-04-01

    Full Text Available Over the last few years, the operation of the Furnas Hydropower Plant (HPP) reservoir, located in the Grande River Basin, has been threatened by a significant reduction in inflow. In the region, hydrological modelling tools are being used and tested to support decision making and water sustainability. In this study, the streamflow was modelled in the area of direct influence of the Furnas HPP reservoir, and the performance of the Soil and Water Assessment Tool (SWAT) model was verified for studies in the region. Sensitivity and uncertainty analyses were undertaken using the Sequential Uncertainty Fitting algorithm (SUFI-2) with the SWAT Calibration and Uncertainty Program (SWAT-CUP). The hydrological modelling, at a monthly scale, presented good results in calibration (NS 0.86), with a slight reduction of the coefficient in the validation period (NS 0.64). The results suggest that this tool can be applied in future hydrological studies in the region, with the caveat that special attention should be given to the historical series used in the calibration and validation of the models. It is important to note that this region has high demands for water resources, primarily for agricultural use, so water demands must also be taken into account in future hydrological simulations. The validation of this methodology led to important contributions to the management of water resources in regions with tropical climates whose climatological and geological reality resembles the one studied here.
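The NS (Nash-Sutcliffe) coefficients cited above can be reproduced from observed and simulated flows in a few lines; the streamflow values below are made-up placeholders, not Furnas data.

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    # Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 means the model
    # is no better than predicting the observed mean
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

obs = [10.0, 12.0, 9.0, 14.0, 11.0]        # hypothetical monthly flows
nse = nash_sutcliffe(obs, [10.5, 11.5, 9.5, 13.0, 11.0])
```

Values above roughly 0.75 are commonly read as "good" monthly calibrations, which is how the NS 0.86/0.64 pair in the abstract is interpreted.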

  3. Tracer kinetic model-driven registration for dynamic contrast-enhanced MRI time-series data.

    Science.gov (United States)

    Buonaccorsi, Giovanni A; O'Connor, James P B; Caunce, Angela; Roberts, Caleb; Cheung, Sue; Watson, Yvonne; Davies, Karen; Hope, Lynn; Jackson, Alan; Jayson, Gordon C; Parker, Geoffrey J M

    2007-11-01

    Dynamic contrast-enhanced MRI (DCE-MRI) time series data are subject to unavoidable physiological motion during acquisition (e.g., due to breathing) and this motion causes significant errors when fitting tracer kinetic models to the data, particularly with voxel-by-voxel fitting approaches. Motion correction is problematic, as contrast enhancement introduces new features into postcontrast images and conventional registration similarity measures cannot fully account for the increased image information content. A methodology is presented for tracer kinetic model-driven registration that addresses these problems by explicitly including a model of contrast enhancement in the registration process. The iterative registration procedure is focused on a tumor volume of interest (VOI), employing a three-dimensional (3D) translational transformation that follows only tumor motion. The implementation accurately removes motion corruption in a DCE-MRI software phantom and it is able to reduce model fitting errors and improve localization in 3D parameter maps in patient data sets that were selected for significant motion problems. Sufficient improvement was observed in the modeling results to salvage clinical trial DCE-MRI data sets that would otherwise have to be rejected due to motion corruption. Copyright 2007 Wiley-Liss, Inc.
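The voxel-wise kinetic fitting that motivates the registration can be sketched as follows; note this uses a simplified exponential-uptake curve as a hypothetical stand-in for a full tracer kinetic (e.g., Tofts-type) model, and a coarse grid search instead of a production optimizer.

```python
import numpy as np

def uptake(t, a, k):
    # simplified contrast-enhancement model: C(t) = a * (1 - exp(-k t))
    return a * (1.0 - np.exp(-k * t))

def fit_uptake(t, c):
    # coarse grid search for the best-fitting (a, k); a real pipeline
    # would use nonlinear least squares per voxel
    best = None
    for a in np.linspace(0.5, 2.0, 31):
        for k in np.linspace(0.1, 2.0, 39):
            sse = np.sum((c - uptake(t, a, k)) ** 2)
            if best is None or sse < best[0]:
                best = (sse, a, k)
    return best[1], best[2]

t = np.linspace(0, 5, 20)             # acquisition time points
c = uptake(t, 1.25, 0.6)              # noise-free synthetic enhancement curve
a_hat, k_hat = fit_uptake(t, c)
```

Motion corrupts `c` at individual time points, which is why model-driven registration re-aligns the series before (and during) such fits.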

  4. Optimizing the De-Noise Neural Network Model for GPS Time-Series Monitoring of Structures

    Directory of Open Access Journals (Sweden)

    Mosbeh R. Kaloop

    2015-09-01

    Full Text Available The Global Positioning System (GPS) is now widely used for monitoring structures, among other applications. Nevertheless, GPS accuracy still suffers from errors afflicting the measurements, particularly in the short-period displacement of structural components. Previously, multi-filter methods have been utilized to remove displacement errors. This paper presents a novel application of neural network prediction models for improving GPS monitoring time series data. Four prediction models based on different learning algorithms are applied with neural network solutions: back-propagation, cascade-forward back-propagation, adaptive filtering, and extended Kalman filtering, to determine which model can be recommended. Noise simulations and a bridge's short-period GPS displacement monitoring component, sampled at 1 Hz, are used to validate the four models and the previous method. The results show that the adaptive neural network filter is the recommended choice for de-noising the observations, specifically for the GPS displacement components of structures. This model is also expected to have significant influence on the design of structures whose responses and measurement contents lie at low frequencies.
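As a rough illustration of the adaptive-filtering idea (not the paper's adaptive neural network filter), a least-mean-squares canceller that learns the noise from a correlated reference channel might look like this; signal amplitudes and noise levels are invented.

```python
import numpy as np

def lms_denoise(noisy, reference, taps=4, mu=0.01):
    # LMS adaptive noise canceller: learn to predict the noise from a
    # correlated reference channel and subtract it from the measurement
    w = np.zeros(taps)
    out = np.zeros_like(noisy)
    for n in range(taps - 1, len(noisy)):
        x = reference[n - taps + 1:n + 1][::-1]  # current + recent reference
        e = noisy[n] - w @ x                     # error = cleaned sample
        w += 2 * mu * e * x                      # LMS weight update
        out[n] = e
    return out

rng = np.random.default_rng(0)
t = np.arange(2000)
signal = 0.01 * np.sin(2 * np.pi * t / 200)      # slow structural motion
ref = rng.standard_normal(2000)                  # measurable noise source
noisy = signal + 0.5 * ref                       # 1 Hz GPS series: signal + noise
clean = lms_denoise(noisy, ref)
```

After convergence the residual noise is far below the raw 0.5-unit noise level, which is the behavior the paper's adaptive filter exploits on GPS displacement components.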

  5. An advection-based model to increase the temporal resolution of PIV time series.

    Science.gov (United States)

    Scarano, Fulvio; Moore, Peter

    A numerical implementation of the advection equation is proposed to increase the temporal resolution of PIV time series. The method is based on the principle that velocity fluctuations are transported passively, similar to Taylor's hypothesis of frozen turbulence. In the present work, the advection model is extended to unsteady three-dimensional flows. The main objective of the method is to lower the requirement on the PIV repetition rate from the Eulerian frequency toward the Lagrangian one. The local trajectory of the fluid parcel is obtained by forward projection of the instantaneous velocity at the preceding time instant and backward projection from the subsequent time step. The trajectories are approximated by the instantaneous streamlines, which yields accurate results when the amplitude of the velocity fluctuations is small with respect to the convective motion. The verification is performed with two experiments conducted at temporal resolutions significantly higher than that dictated by the Nyquist criterion. The flow past the trailing edge of a NACA0012 airfoil closely approximates frozen turbulence, where the largest ratio between the Lagrangian and Eulerian temporal scales is expected. An order-of-magnitude reduction of the needed acquisition frequency is demonstrated by the velocity spectra of the super-sampled series. The application to three-dimensional data is made with time-resolved tomographic PIV measurements of a transitional jet. Here, the 3D advection equation is implemented to estimate the fluid trajectories. The reduction in the minimum sampling rate achieved by super-sampling is smaller in this case, because the vortices occurring in the jet shear layer are not well approximated by advection alone at large time separations. Both cases reveal that the current requirements for time-resolved PIV experiments can be revised when information is poured from space to time. An additional favorable effect is observed by the analysis in the
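The super-sampling principle can be demonstrated in one dimension under Taylor's hypothesis: a frozen pattern advecting at speed c lets two coarse snapshots reconstruct an intermediate time step (all values synthetic).

```python
import numpy as np

# 1D illustration of advection-based super-sampling: a frozen fluctuation
# pattern translating at constant speed c on a periodic domain
c, L, n = 1.0, 2 * np.pi, 256
x = np.linspace(0, L, n, endpoint=False)
pattern = lambda z: np.sin(3 * z) + 0.3 * np.sin(7 * z)

dt = 0.5                       # coarse PIV sampling interval
u0 = pattern(x)                # snapshot at t = 0
u1 = pattern(x - c * dt)       # snapshot at t = dt (pattern advected)

# super-sample at t = dt/2: project u0 forward and u1 backward along the
# (here uniform) trajectories, then average the two estimates
fwd = np.interp(x - c * dt / 2, x, u0, period=L)
bwd = np.interp(x + c * dt / 2, x, u1, period=L)
u_mid = 0.5 * (fwd + bwd)

err = np.max(np.abs(u_mid - pattern(x - c * dt / 2)))
```

The reconstructed intermediate field matches the true advected pattern to interpolation accuracy, so the effective sampling rate doubles without new acquisitions; in real flows the accuracy degrades as the flow departs from pure advection.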

  6. Technical Note: Probabilistically constraining proxy age–depth models within a Bayesian hierarchical reconstruction model

    Directory of Open Access Journals (Sweden)

    J. P. Werner

    2015-03-01

    Full Text Available Reconstructions of the late-Holocene climate rely heavily upon proxies that are assumed to be accurately dated by layer counting, such as measurements of tree rings, ice cores, and varved lake sediments. Considerable advances could be achieved if time-uncertain proxies could be included within these multiproxy reconstructions, and if time uncertainties were recognized and correctly modeled for proxies commonly treated as free of age-model errors. Current approaches to accounting for time uncertainty are generally limited to repeating the reconstruction using each member of an ensemble of age models, thereby inflating the final estimated uncertainty – in effect, each possible age model is given equal weighting. Uncertainties can be reduced by exploiting the inferred space–time covariance structure of the climate to re-weight the possible age models. Here, we demonstrate how Bayesian hierarchical climate reconstruction models can be augmented to account for time-uncertain proxies. Critically, although a priori all age models are given equal probability of being correct, the probabilities associated with the age models are formally updated within the Bayesian framework, thereby reducing uncertainties. Numerical experiments show that updating the age-model probabilities decreases uncertainty in the resulting reconstructions, as compared with the current de facto standard of sampling over all age models, provided there is sufficient information from other data sources in the spatial region of the time-uncertain proxy. This approach can readily be generalized to non-layer-counted proxies, such as those derived from marine sediments.
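The re-weighting idea can be sketched with a toy ensemble of age models, here reduced to candidate time shifts of a single proxy scored against a well-dated regional signal (all data synthetic).

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(100)
climate = np.sin(2 * np.pi * t / 50)          # signal inferred from other proxies

true_shift = 3                                 # the proxy's actual dating error
proxy = np.roll(climate, true_shift) + 0.1 * rng.standard_normal(100)

shifts = np.arange(-10, 11)                    # ensemble of candidate age models
# log-likelihood of each candidate re-dating against the regional signal
logw = np.array([-0.5 * np.sum((np.roll(proxy, -s) - climate) ** 2) / 0.1 ** 2
                 for s in shifts])
w = np.exp(logw - logw.max())
w /= w.sum()                                   # a priori equal, now updated

best = shifts[np.argmax(w)]                    # posterior-favored age model
```

Instead of averaging over all 21 equally weighted age models, almost all posterior mass lands on the correct one, which is the uncertainty reduction the abstract describes.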

  7. ANN-QSAR model for selection of anticancer leads from structurally heterogeneous series of compounds.

    Science.gov (United States)

    González-Díaz, Humberto; Bonet, Isis; Terán, Carmen; De Clercq, Erik; Bello, Rafael; García, Maria M; Santana, Lourdes; Uriarte, Eugenio

    2007-05-01

    Developing a model for predicting the anticancer activity of any class of organic compounds based on molecular structure is a very important goal for medicinal chemists. Different molecular descriptors can be used to solve this problem. Stochastic molecular descriptors, the so-called MARCH-INSIDE approach, have been shown to be very successful in drug design. Nevertheless, the structural diversity of compounds is so vast that we may need non-linear models such as artificial neural networks (ANN) instead of linear ones. SmartMLP-ANN analysis used to model the anticancer activity of organic compounds has shown a high average accuracy of 93.79% (train performance) and predictability of 90.88% (validation performance) for the 8:3-MLP topology with different training and prediction series. This ANN model compares favourably with a previous linear discriminant analysis (LDA) model [H. González-Díaz et al., J. Mol. Model 9 (2003) 395], which showed only 80.49% accuracy and 79.34% predictability. The present SmartMLP approach required training times of only 10 h, while previous models reached accuracies of 70-89% only after 25-46 h of training. In order to illustrate the practical use of the model in bioorganic medicinal chemistry, we report the in silico prediction and in vitro evaluation of six new synthetic tegafur analogues having IC50 values in a broad range between 37.1 and 138 μg mL-1 for leukemia (L1210/0) and human T-lymphocyte (Molt4/C8, CEM/0) cells. Theoretical predictions coincide very well with experimental results.
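The need for a non-linear model can be illustrated with the classic XOR problem, which no linear classifier separates; the sketch below trains a small numpy MLP by gradient descent (the architecture and data are illustrative, not the 8:3 SmartMLP model).

```python
import numpy as np

# XOR: linearly inseparable toy classification that a small MLP solves
rng = np.random.default_rng(7)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
y = np.array([[0.], [1.], [1.], [0.]])

W1 = 0.5 * rng.standard_normal((2, 8)); b1 = np.zeros(8)
W2 = 0.5 * rng.standard_normal((8, 1)); b2 = np.zeros(1)

lr = 0.3
for _ in range(5000):
    h = np.tanh(X @ W1 + b1)                   # hidden layer
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))   # sigmoid output
    d2 = p - y                                 # logistic-loss gradient
    d1 = (d2 @ W2.T) * (1 - h ** 2)            # backprop through tanh
    W2 -= lr * h.T @ d2; b2 -= lr * d2.sum(0)
    W1 -= lr * X.T @ d1; b1 -= lr * d1.sum(0)

h = np.tanh(X @ W1 + b1)
p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))
preds = (p > 0.5).astype(int).ravel()
```

An LDA-style linear model cannot reach 100% on this problem, mirroring the accuracy gap between the LDA and MLP models reported above.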

  8. Impact of different troposphere modelling methods on ZTD time series: case study of mountainous GPS stations

    Science.gov (United States)

    Stepniak, Katarzyna; Klos, Anna; Bock, Olivier; Bogusz, Janusz

    2016-04-01

    GNSS Zenith Total Delay (ZTD) data are useful for numerical weather forecasting and climate analysis. Since tropospheric delays over mountainous areas are the most difficult to model, we explored the influence of different troposphere models in Precise Point Positioning (PPP) mode. We used GPS data from 2008 to 2014 at 28 permanent EUPOS (European Position Determination System) stations, including 9 EPN (EUREF Permanent Network) ones, located in the Sudeten and Carpathians. The GPS data were processed in PPP mode using Bernese 5.2 GNSS software with the final IGS (International GNSS Service) orbits and clocks. Different processing variants were tested, employing the newest mapping functions (Global Mapping Function, GMF, and Vienna Mapping Function, VMF1) as well as different time resolutions and constraints on the estimated parameters (ZTD and gradients). Median trends and amplitudes of annual/semi-annual oscillations for the ZTD series were determined with Weighted Least Squares Estimation (WLSE), obtaining 0.1±0.5 mm/year and 44.7 / 7.2 ± 5 mm, respectively. Power Spectral Densities (PSDs) were estimated using the Lomb-Scargle method for each individual variant. Besides the annual and semi-annual oscillations, the PSDs showed many other significant peaks in the ZTD time series at higher frequencies, at about 60, 30, 24, 20, 15, 12, 10, 8, 7, 6, 5, 4 and 3 cpy. Proper subtraction of these periodicities is crucial, because they make the stochastic part appear artificially autocorrelated. In order to recognize the periodicities in the ZTD signal, we analyzed the differences between GPS-derived delays and the ERA-Interim reanalysis. The results of the analysis showed significant changes from station to station and between variants. Based on these results, the authors will indicate an optimal processing strategy concerning troposphere modelling.
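A Lomb-Scargle periodogram of the kind used here can be computed directly from the classic formula, which tolerates the gaps typical of GNSS series; the sketch below recovers a known annual-type oscillation from irregularly sampled synthetic data.

```python
import numpy as np

def lomb_scargle(t, y, freqs):
    # classic Lomb-Scargle periodogram; freqs in cycles per unit of t
    y = y - y.mean()
    power = np.empty(len(freqs))
    for i, f in enumerate(freqs):
        w = 2 * np.pi * f
        tau = np.arctan2(np.sum(np.sin(2 * w * t)),
                         np.sum(np.cos(2 * w * t))) / (2 * w)
        c, s = np.cos(w * (t - tau)), np.sin(w * (t - tau))
        power[i] = 0.5 * ((y @ c) ** 2 / (c @ c) + (y @ s) ** 2 / (s @ s))
    return power

rng = np.random.default_rng(2)
t = np.sort(rng.uniform(0, 10, 300))      # 10 "years", irregular sampling
y = 5 * np.sin(2 * np.pi * 1.0 * t) + rng.standard_normal(300)  # 1 cpy term

freqs = np.linspace(0.2, 5, 200)
p = lomb_scargle(t, y, freqs)
peak = freqs[np.argmax(p)]                # dominant frequency, ≈ 1 cpy
```

Applied to ZTD residuals, peaks like those at 60, 30, 24, … cpy would show up the same way, and the corresponding sinusoids can then be subtracted before stochastic analysis.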

  9. Knowledge fusion: Time series modeling followed by pattern recognition applied to unusual sections of background data

    International Nuclear Information System (INIS)

    Burr, T.; Doak, J.; Howell, J.A.; Martinez, D.; Strittmatter, R.

    1996-03-01

    This report describes work performed during FY 95 for the Knowledge Fusion Project, which was sponsored by the Department of Energy, Office of Nonproliferation and National Security. The project team selected satellite sensor data as the main example to which its analysis algorithms would be applied. The specific sensor-fusion problem has many generic features that make it a worthwhile problem to attempt to solve in a general way. The generic problem is to recognize events of interest from multiple time series in a possibly noisy background. By implementing a suite of time series modeling and forecasting methods and using well-chosen alarm criteria, we reduce the number of false alarms. We then further reduce the number of false alarms by analyzing all suspicious sections of data, as judged by the alarm criteria, with pattern recognition methods. This report describes the implementation and application of this two-step process for separating events from an unusual background. As a fortunate by-product of this activity, it is possible to gain a better understanding of the natural background.
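The first step of the two-step process can be sketched as an AR(1) forecaster whose large one-step residuals raise alarms; the injected event, model order, and thresholds below are synthetic stand-ins for the report's suite of methods.

```python
import numpy as np

# step 1 of the two-step process: fit a simple AR(1) forecaster to the
# background and alarm on large one-step forecast residuals; the flagged
# sections would then go to a second-stage pattern-recognition screen
rng = np.random.default_rng(3)
n = 500
x = np.zeros(n)
for i in range(1, n):
    x[i] = 0.8 * x[i - 1] + rng.standard_normal()
x[300:305] += 8.0                     # injected "event" on the background

phi = np.sum(x[1:] * x[:-1]) / np.sum(x[:-1] ** 2)   # AR(1) estimate
resid = x[1:] - phi * x[:-1]                          # one-step residuals
thresh = 4.0 * resid.std()
alarms = np.nonzero(np.abs(resid) > thresh)[0] + 1    # indices into x
```

Only the short alarmed sections, not the whole stream, are passed to the pattern-recognition stage, which is what drives the false-alarm reduction.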

  10. Technical innovation: Intragastric Single Port Sleeve Gastrectomy (IGSG). A feasibility survival study on porcine model.

    Science.gov (United States)

    Estupinam, Oscar; Oliveira, André Lacerda de Abreu; Antunes, Fernanda; Galvão, Manoel; Phillips, Henrique; Scheffer, Jussara Peters; Rios, Marcelo; Zorron, Ricardo

    2018-01-01

    To technically perform laparoscopic sleeve gastrectomy (LSG) using a unique Intragastric Single Port (IGSG) in a swine model, demonstrating an effective and safe procedure that optimizes the conventional technique. IGSG was performed in 4 minipigs, using a percutaneous intragastric single port located in the pre-pyloric region. The gastric stapling of the greater curvature started from the pre-pyloric region towards the angle of His using the Endo GIA™ system, and the specimen was removed through the single port. On postoperative day 30, the animals were sacrificed and submitted to autopsy. All procedures were performed without conversion, and all animals survived 30 days. The mean operative time was 42 min. No complications were observed during invagination and stapling in the perioperative period, and no postoperative complications occurred. Post-mortem examination showed no leaks or infectious complications. Intragastric Single Port sleeve gastrectomy is a feasible procedure and may be a suitable alternative technique for the treatment of morbid obesity.

  11. Technical Note: Approximate Bayesian parameterization of a process-based tropical forest model

    Science.gov (United States)

    Hartig, F.; Dislich, C.; Wiegand, T.; Huth, A.

    2014-02-01

    Inverse parameter estimation of process-based models is a long-standing problem in many scientific disciplines. A key question for inverse parameter estimation is how to define the metric that quantifies how well model predictions fit to the data. This metric can be expressed by general cost or objective functions, but statistical inversion methods require a particular metric, the probability of observing the data given the model parameters, known as the likelihood. For technical and computational reasons, likelihoods for process-based stochastic models are usually based on general assumptions about variability in the observed data, and not on the stochasticity generated by the model. Only in recent years have new methods become available that allow the generation of likelihoods directly from stochastic simulations. Previous applications of these approximate Bayesian methods have concentrated on relatively simple models. Here, we report on the application of a simulation-based likelihood approximation for FORMIND, a parameter-rich individual-based model of tropical forest dynamics. We show that approximate Bayesian inference, based on a parametric likelihood approximation placed in a conventional Markov chain Monte Carlo (MCMC) sampler, performs well in retrieving known parameter values from virtual inventory data generated by the forest model. We analyze the results of the parameter estimation, examine its sensitivity to the choice and aggregation of model outputs and observed data (summary statistics), and demonstrate the application of this method by fitting the FORMIND model to field data from an Ecuadorian tropical forest. Finally, we discuss how this approach differs from approximate Bayesian computation (ABC), another method commonly used to generate simulation-based likelihood approximations. 
Our results demonstrate that simulation-based inference, which offers considerable conceptual advantages over more traditional methods for inverse parameter estimation
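The simulation-based likelihood idea can be sketched with a toy stochastic simulator in place of FORMIND: at each MCMC proposal, a normal ("synthetic") likelihood is fitted to repeated simulated summary statistics (all numbers illustrative).

```python
import numpy as np

rng = np.random.default_rng(4)

def simulate(theta, n=50):
    # toy stochastic simulator standing in for the forest model
    return rng.normal(theta, 1.0, n)

obs_stat = 2.0                        # observed summary statistic (a mean)

def synthetic_loglik(theta, reps=20):
    # fit a normal to repeated simulated summaries and evaluate its density
    stats = np.array([simulate(theta).mean() for _ in range(reps)])
    mu, sd = stats.mean(), stats.std() + 1e-6
    return -0.5 * ((obs_stat - mu) / sd) ** 2 - np.log(sd)

# random-walk Metropolis over the single parameter theta
theta, ll = 0.0, synthetic_loglik(0.0)
chain = []
for _ in range(400):
    prop = theta + 0.3 * rng.standard_normal()
    llp = synthetic_loglik(prop)
    if np.log(rng.uniform()) < llp - ll:
        theta, ll = prop, llp
    chain.append(theta)

estimate = np.mean(chain[200:])       # posterior mean, near the true 2.0
```

The likelihood is thus generated from the model's own stochasticity rather than assumed a priori, which is the core contrast the abstract draws with conventional likelihood choices.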

  12. Technical Note: Approximate Bayesian parameterization of a complex tropical forest model

    Science.gov (United States)

    Hartig, F.; Dislich, C.; Wiegand, T.; Huth, A.

    2013-08-01

    Inverse parameter estimation of process-based models is a long-standing problem in ecology and evolution. A key problem of inverse parameter estimation is to define a metric that quantifies how well model predictions fit to the data. Such a metric can be expressed by general cost or objective functions, but statistical inversion approaches are based on a particular metric, the probability of observing the data given the model, known as the likelihood. Deriving likelihoods for dynamic models requires making assumptions about the probability for observations to deviate from mean model predictions. For technical reasons, these assumptions are usually derived without explicit consideration of the processes in the simulation. Only in recent years have new methods become available that allow generating likelihoods directly from stochastic simulations. Previous applications of these approximate Bayesian methods have concentrated on relatively simple models. Here, we report on the application of a simulation-based likelihood approximation for FORMIND, a parameter-rich individual-based model of tropical forest dynamics. We show that approximate Bayesian inference, based on a parametric likelihood approximation placed in a conventional MCMC, performs well in retrieving known parameter values from virtual field data generated by the forest model. We analyze the results of the parameter estimation, examine the sensitivity towards the choice and aggregation of model outputs and observed data (summary statistics), and show results from using this method to fit the FORMIND model to field data from an Ecuadorian tropical forest. Finally, we discuss differences of this approach to Approximate Bayesian Computing (ABC), another commonly used method to generate simulation-based likelihood approximations. Our results demonstrate that simulation-based inference, which offers considerable conceptual advantages over more traditional methods for inverse parameter estimation, can

  13. Systems Modelling of the Socio-Technical Aspects of Residential Electricity Use and Network Peak Demand

    Science.gov (United States)

    Lewis, Jim; Mengersen, Kerrie; Buys, Laurie; Vine, Desley; Bell, John; Morris, Peter; Ledwich, Gerard

    2015-01-01

    Provision of network infrastructure to meet rising network peak demand is increasing the cost of electricity. Addressing this demand is a major imperative for Australian electricity agencies. The network peak demand model reported in this paper provides a quantified decision support tool and a means of understanding the key influences and impacts on network peak demand. An investigation of the system factors impacting residential consumers’ peak demand for electricity was undertaken in Queensland, Australia. Technical factors, such as the customers’ location, housing construction and appliances, were combined with social factors, such as household demographics, culture, trust and knowledge, and Change Management Options (CMOs) such as tariffs, price, managed supply, etc., in a conceptual ‘map’ of the system. A Bayesian network was used to quantify the model and provide insights into the major influential factors and their interactions. The model was also used to examine the reduction in network peak demand with different market-based and government interventions in various customer locations of interest and investigate the relative importance of instituting programs that build trust and knowledge through well designed customer-industry engagement activities. The Bayesian network was implemented via a spreadsheet with a tickbox interface. The model combined available data from industry-specific and public sources with relevant expert opinion. The results revealed that the most effective intervention strategies involve combining particular CMOs with associated education and engagement activities. The model demonstrated the importance of designing interventions that take into account the interactions of the various elements of the socio-technical system. The options that provided the greatest impact on peak demand were Off-Peak Tariffs and Managed Supply and increases in the price of electricity. The impact in peak demand reduction differed for each of the

  14. Systems Modelling of the Socio-Technical Aspects of Residential Electricity Use and Network Peak Demand.

    Science.gov (United States)

    Lewis, Jim; Mengersen, Kerrie; Buys, Laurie; Vine, Desley; Bell, John; Morris, Peter; Ledwich, Gerard

    2015-01-01

    Provision of network infrastructure to meet rising network peak demand is increasing the cost of electricity. Addressing this demand is a major imperative for Australian electricity agencies. The network peak demand model reported in this paper provides a quantified decision support tool and a means of understanding the key influences and impacts on network peak demand. An investigation of the system factors impacting residential consumers' peak demand for electricity was undertaken in Queensland, Australia. Technical factors, such as the customers' location, housing construction and appliances, were combined with social factors, such as household demographics, culture, trust and knowledge, and Change Management Options (CMOs) such as tariffs, price, managed supply, etc., in a conceptual 'map' of the system. A Bayesian network was used to quantify the model and provide insights into the major influential factors and their interactions. The model was also used to examine the reduction in network peak demand with different market-based and government interventions in various customer locations of interest and investigate the relative importance of instituting programs that build trust and knowledge through well designed customer-industry engagement activities. The Bayesian network was implemented via a spreadsheet with a tickbox interface. The model combined available data from industry-specific and public sources with relevant expert opinion. The results revealed that the most effective intervention strategies involve combining particular CMOs with associated education and engagement activities. The model demonstrated the importance of designing interventions that take into account the interactions of the various elements of the socio-technical system. The options that provided the greatest impact on peak demand were Off-Peak Tariffs and Managed Supply and increases in the price of electricity. 
The impact in peak demand reduction differed for each of the locations.
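The kind of inference a quantified Bayesian network supports can be sketched with a hand-enumerated two-node example; all probabilities below are hypothetical, not values from the Queensland model.

```python
# two-node Bayesian network: Tariff -> HighPeak, with hypothetical CPTs
p_tariff = {"flat": 0.6, "off_peak": 0.4}        # P(Tariff)
p_peak_given = {                                  # P(HighPeak | Tariff)
    "flat": 0.30,
    "off_peak": 0.18,                             # tariff CMO shifts load off-peak
}

# marginal probability of a high-peak evening, by total probability
p_high = sum(p_tariff[t] * p_peak_given[t] for t in p_tariff)

# posterior over tariff given an observed high-peak evening (Bayes' rule)
post = {t: p_tariff[t] * p_peak_given[t] / p_high for t in p_tariff}
```

The full model does the same enumeration over many technical, social, and CMO nodes, which is what lets it compare intervention strategies quantitatively.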

  15. Model series in industrial robot systems. Sangyoyo robot system no kishu keiretsu

    Energy Technology Data Exchange (ETDEWEB)

    Hosono, T.; Ueno, T.; Izawa, T. (Meidensha Corp., Tokyo (Japan))

    1993-06-11

    Higher speed and rigidity are required in robots whose applications are expanding beyond material processing. This paper describes the systematization of model series according to applications and the development of industrial manipulators. As a result of the expansion in workpieces requiring deburring, new applications such as cutting aluminum casting gates, and the expanded application to handling work, requirements have emerged for improved robot functions and performance and for the development of large-capacity models that can handle loads of 100 kg or more. Therefore, five models have been developed to meet these torque requirements. It was learned that a robot that can carry a weight of up to 150 kg can handle almost all the applications. The improvement items included the expansion of available models and function and performance improvements in the mechanisms and controls. General industrial manipulators have also been developed applying the master-slave manipulator techniques used for remotely controlled maintenance work in nuclear-energy-related facilities. These manipulators are made so compact that they can be installed in small spaces in a factory, and they facilitate maintenance work as a result of adopting an electric drive system. 4 refs., 5 figs., 7 tabs.

  16. Bayesian models of thermal and pluviometric time series in the Fucino plateau

    Directory of Open Access Journals (Sweden)

    Adriana Trabucco

    2011-09-01

    Full Text Available This work was developed within the project 'Metodologie e sistemi integrati per la qualificazione di produzioni orticole del Fucino' (Methodologies and integrated systems for the classification of horticultural products in the Fucino plateau), sponsored by the Italian Ministry of Education, University and Research (Strategic Projects, Law 448/97). Agro-system management, especially when high quality must be achieved in speciality crops, requires knowledge of the main features and intrinsic variability of the climate. Statistical models may properly summarize the structure underlying the observed variability; furthermore, they may support the agronomic manager by providing the probability that meteorological events happen in a time window of interest. More than 30 years of daily values collected at four sites located on the Fucino plateau, Abruzzo region, Italy, were studied by fitting Bayesian generalized linear models to maximum/minimum air temperature and rainfall time series. Bayesian predictive distributions of climate variables supporting decision-making processes were calculated at different timescales (5 days for temperatures and 10 days for rainfall), both to reduce computational effort and to simplify the statistical model assumptions. Technicians and field operators, even with limited statistical training, may exploit the model output by inspecting graphs and climatic profiles of the cultivated areas during decision-making processes. Realizations taken from the predictive distributions may also be used as input for agro-ecological models (e.g., models of crop growth or water balance). The fitted models may be exploited to monitor climatic changes and to periodically revise the climatic profiles of areas of interest by updating the probability distributions of target climatic variables. For the sake of brevity, the description of results is limited to just one of the four sites; results for all other sites are available as supplementary information.
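The conjugate normal-normal update behind such Bayesian predictive distributions can be written out directly; the prior, variances, and 5-day window below are hypothetical, not fitted Fucino values.

```python
import numpy as np

# normal-normal conjugate update for a 5-day mean temperature window
prior_mu, prior_var = 12.0, 4.0        # prior for the window's mean (deg C)
obs_var = 9.0                           # assumed known observation variance

data = np.array([14.2, 13.1, 15.0, 13.7, 14.4])   # 30+ years would go here
n = len(data)

post_var = 1.0 / (1.0 / prior_var + n / obs_var)
post_mu = post_var * (prior_mu / prior_var + data.sum() / obs_var)

# predictive distribution for a new 5-day mean: N(post_mu, post_var + obs_var)
pred_mu, pred_var = post_mu, post_var + obs_var
```

The resulting predictive density is exactly the kind of object a field operator can read off as "probability the 5-day mean exceeds a threshold", and it is updated as new seasons of data arrive.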

  17. Solid Waste Projection Model: Database (Version 1.4). Technical reference manual

    Energy Technology Data Exchange (ETDEWEB)

    Blackburn, C.; Cillan, T.

    1993-09-01

    The Solid Waste Projection Model (SWPM) system is an analytical tool developed by Pacific Northwest Laboratory (PNL) for Westinghouse Hanford Company (WHC). The SWPM system provides a modeling and analysis environment that supports decisions in the process of evaluating various solid waste management alternatives. This document, one of a series describing the SWPM system, contains detailed information regarding the software and data structures utilized in developing the SWPM Version 1.4 Database. This document is intended for use by experienced database specialists and supports database maintenance, utility development, and database enhancement. Those interested in using the SWPM database should refer to the SWPM Database User's Guide. This document is available from the PNL Task M Project Manager (D. L. Stiles, 509-372-4358), the PNL Task L Project Manager (L. L. Armacost, 509-372-4304), the WHC Restoration Projects Section Manager (509-372-1443), or the WHC Waste Characterization Manager (509-372-1193).

  18. Identification of two-phase flow regimes by time-series modeling

    International Nuclear Information System (INIS)

    King, C.H.; Ouyang, M.S.; Pei, B.S.

    1987-01-01

    The identification of two-phase flow patterns in pipes or ducts is important to the design and operation of thermal-hydraulic systems, especially in the nuclear reactor cores of boiling water reactors or in the steam generators of pressurized water reactors. Basically, two-phase flow shows fluctuating characteristics even at steady-state conditions. These fluctuating characteristics can be analyzed by statistical methods to obtain flow signatures. A number of experimental studies have been conducted concerning the statistical properties of void fraction or pressure pulsation in two-phase flow. In this study, the authors propose a new technique for identifying the patterns of air-water two-phase flow in a vertical pipe. This technique is based on analyzing the statistical characteristics of the pressure signals of the test loop by time-series modeling.
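One concrete flow signature of this kind is the lag-1 autoregressive coefficient of the pressure signal, which separates weakly correlated from strongly persistent fluctuation regimes; the two regimes below are synthetic AR(1) stand-ins, not measured flow data.

```python
import numpy as np

def ar1_coeff(x):
    # lag-1 AR coefficient of a fluctuating signal: a simple time-series
    # signature that differs between flow regimes
    x = x - x.mean()
    return np.sum(x[1:] * x[:-1]) / np.sum(x[:-1] ** 2)

rng = np.random.default_rng(5)

def make_signal(phi, n=2000):
    # synthetic "pressure fluctuation" with regime-dependent persistence
    x = np.zeros(n)
    for i in range(1, n):
        x[i] = phi * x[i - 1] + rng.standard_normal()
    return x

bubbly = make_signal(0.2)     # weakly correlated fluctuations
slug = make_signal(0.9)       # long, coherent pressure oscillations
```

A classifier trained on such fitted coefficients (in practice, higher-order model parameters) can then label an unseen pressure record with its flow regime.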

  19. A comparative study of shallow groundwater level simulation with three time series models in a coastal aquifer of South China

    Science.gov (United States)

    Yang, Q.; Wang, Y.; Zhang, J.; Delgado, J.

    2017-05-01

    Accurate and reliable groundwater level forecasting models can help ensure the sustainable use of a watershed's aquifers for urban and rural water supply. In this paper, three time series analysis methods, Holt-Winters (HW), integrated time series (ITS), and seasonal autoregressive integrated moving average (SARIMA), are explored to simulate the groundwater level in a coastal aquifer of South China. The monthly groundwater table depth data, collected in a long time series from 2000 to 2011, are simulated with the three time series models and the results compared. The error criteria are estimated using the coefficient of determination (R2), the Nash-Sutcliffe model efficiency coefficient (E), and the root-mean-squared error. The results indicate that all three models are accurate in reproducing the historical time series of groundwater levels. The comparisons show that the HW model is more accurate in predicting groundwater levels than the SARIMA and ITS models. It is recommended that additional studies explore this proposed method, which can in turn be used to facilitate the development and implementation of more effective and sustainable groundwater management strategies.
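Of the three methods, Holt-Winters is the simplest to write out; a minimal additive implementation (smoothing constants chosen arbitrarily, data synthetic rather than the coastal-aquifer series) is shown below.

```python
import numpy as np

def holt_winters_additive(y, m, alpha=0.4, beta=0.05, gamma=0.3):
    # minimal additive Holt-Winters: level + trend + seasonal components,
    # returning the one-step-ahead in-sample fit; m is the season length
    level = y[:m].mean()
    trend = (y[m:2 * m].mean() - y[:m].mean()) / m
    season = list(y[:m] - level)
    fit = []
    for i, obs in enumerate(y):
        s = season[i % m]
        fit.append(level + trend + s)            # forecast before seeing obs
        new_level = alpha * (obs - s) + (1 - alpha) * (level + trend)
        trend = beta * (new_level - level) + (1 - beta) * trend
        season[i % m] = gamma * (obs - new_level) + (1 - gamma) * s
        level = new_level
    return np.array(fit)

t = np.arange(144)                               # 12 "years" of monthly data
y = 5 + 0.01 * t + np.sin(2 * np.pi * t / 12)    # depth: trend + seasonal cycle
fit = holt_winters_additive(y, 12)
err = np.sqrt(np.mean((y[24:] - fit[24:]) ** 2)) # RMSE after warm-up
```

Error criteria such as RMSE, R2, and E computed on `fit` against held-out data are exactly how the paper ranks HW against SARIMA and ITS.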

  20. Time series models of decadal trends in the harmful algal species Karlodinium veneficum in Chesapeake Bay.

    Science.gov (United States)

    Lin, Chih-Hsien Michelle; Lyubchich, Vyacheslav; Glibert, Patricia M

    2018-03-01

    The harmful dinoflagellate Karlodinium veneficum has been implicated in fish kills and other toxic, harmful algal bloom (HAB) events in waters worldwide. Blooms of K. veneficum are known to be related to coastal nutrient enrichment, but the relationship is complex because this HAB taxon relies not only on dissolved nutrients but also on particulate prey, both of which have changed over time. Here, by applying cross-correlations of climate-related physical factors, nutrients and prey with the abundance of K. veneficum over a 10-year (2002-2011) period, a synthesis of the interactive effects of multiple factors on this species was developed for Chesapeake Bay, where blooms of this HAB have been increasing. Significant upward trends in the time series of K. veneficum were observed at the mesohaline stations of the Bay, but not at the oligohaline tributary stations. For the mesohaline regions, riverine sources of nutrients with seasonal lags, together with particulate prey with zero lag, explained 15%-46% of the variation in the K. veneficum time series. For the oligohaline regions, nutrients and particulate prey generally showed significant decreasing trends with time, likely a reflection of nutrient reduction efforts. A conceptual model of mid-Bay blooms is presented, in which K. veneficum, derived from the oceanic end member of the Bay, may experience enhanced growth if it encounters prey originating from tributaries that have different patterns of nutrient loading and are enriched in nitrogen. For all correlation models developed herein, prey abundance was a primary factor in predicting K. veneficum abundance. Copyright © 2018 Elsevier B.V. All rights reserved.

  1. A COMPARATIVE STUDY OF FORECASTING MODELS FOR TREND AND SEASONAL TIME SERIES: DOES COMPLEX MODEL ALWAYS YIELD BETTER FORECAST THAN SIMPLE MODELS?

    Directory of Open Access Journals (Sweden)

    Suhartono Suhartono

    2005-01-01

    Full Text Available Many business and economic time series are non-stationary, containing trend and seasonal variations. Seasonality is a periodic and recurrent pattern caused by factors such as weather, holidays, or repeating promotions. A stochastic trend often accompanies the seasonal variations and can have a significant impact on various forecasting methods. In this paper, we investigate and compare some forecasting methods for modeling time series with both trend and seasonal patterns: Winter's, Decomposition, Time Series Regression, ARIMA and Neural Networks models. In this empirical research, we study forecasting performance, particularly to answer whether a complex method always gives a better forecast than a simpler one. We use real data, namely the airline passenger series. The results show that the more complex model does not always yield a better result than a simpler one. Additionally, we identify directions for further research, especially the use of hybrid models that combine forecasting methods to obtain better forecasts, for example a combination of decomposition (as data preprocessing) and a neural network model.
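
The paper's central question, whether a more complex model beats a simpler one, comes down to an out-of-sample comparison. A hedged sketch on a synthetic monthly series (a stand-in for the airline data; the two methods and the MAPE holdout below are illustrative choices, not the paper's exact setup):

```python
import numpy as np

def mape(actual, forecast):
    """Mean absolute percentage error (%)."""
    return 100.0 * np.mean(np.abs((actual - forecast) / actual))

# Synthetic monthly series with trend and seasonality (stand-in for the airline data)
rng = np.random.default_rng(0)
n, season = 96, 12
t = np.arange(n)
y = 100.0 + 0.5 * t + 10.0 * np.sin(2 * np.pi * t / season) + rng.normal(0, 2, n)

train, test = y[:-season], y[-season:]
tt, tf = t[:-season], t[-season:]

# Simple method: seasonal naive (repeat the last observed year)
simple_fc = train[-season:]

# More complex method: linear trend plus monthly dummies, fit by least squares
def design(idx):
    return np.column_stack([np.ones(idx.size), idx]
                           + [(idx % season == m).astype(float) for m in range(1, season)])

beta, *_ = np.linalg.lstsq(design(tt), train, rcond=None)
complex_fc = design(tf) @ beta

err_simple, err_complex = mape(test, simple_fc), mape(test, complex_fc)
```

On this particular series the trend is expected to favor the regression model; on noisier or structurally unstable series the naive method can win, which is exactly the comparison the paper is making.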

  2. Balloon dilation of the eustachian tube in a cadaver model: technical considerations, learning curve, and potential barriers.

    Science.gov (United States)

    McCoul, Edward D; Singh, Ameet; Anand, Vijay K; Tabaee, Abtin

    2012-04-01

    The surgical management options for eustachian tube dysfunction have historically been limited. The goal of the current study was to evaluate the technical considerations, learning curve, and potential barriers for balloon dilation of the eustachian tube (BDET) as an alternative treatment modality. Prospective preclinical trial of BDET in a cadaver model. A novel balloon catheter device was used for eustachian tube dilation. Twenty-four BDET procedures were performed by three independent rhinologists with no prior experience with the procedure (eight procedures per surgeon). The duration and number of attempts for the individual steps and the overall procedure were recorded. Endoscopic examination of the eustachian tube was performed after each procedure, and the surgeon was asked to rate the subjective difficulty on a five-point scale. Successful completion of the procedure occurred in each case. The overall mean duration of the procedure was 284 seconds, and a mean of 1.15 attempts was needed to perform the individual steps. The mean subjective procedure difficulty was rated as somewhat easy. A statistically shorter duration and a subjectively easier procedure were noted in the second half of the series compared to the first, indicating a favorable learning curve. Linear fissuring within the eustachian tube lumen without submucosal disruption (nine procedures, 37%) and with submucosal disruption (five procedures, 21%) was noted. The significance of these physical findings is unclear. Preclinical testing of BDET is associated with favorable duration, learning curve, and overall ease of completion. Clinical trials are necessary to evaluate safety and efficacy. Copyright © 2012 The American Laryngological, Rhinological, and Otological Society, Inc.

  3. Technical note: Improving modeling of coagulation, curd firming, and syneresis of sheep milk.

    Science.gov (United States)

    Cipolat-Gotet, Claudio; Pazzola, Michele; Ferragina, Alessandro; Cecchinato, Alessio; Dettori, Maria L; Vacca, Giuseppe M

    2018-04-18

    The importance of milk coagulation properties for milk processing, cheese yield, and quality is widely recognized. The use of traditional coagulation traits presents several limitations for testing bovine milk, and even more for sheep milk, due to its rapid coagulation and curd firming and the early syneresis of the coagulum. The aim of this technical note is to test and improve model fitting for assessing coagulation, curd firming, and syneresis of sheep milk. Using milk samples from 87 Sarda ewes, we performed lactodynamographic testing in duplicate. On each of the 174 analyzed milk aliquots, using 180 observations per aliquot (one every 15 s for 45 min after rennet addition), we compared 4 different curd firming models as a function of time (CF_t, mm) using a nonlinear procedure. The most accurate and informative results were observed using a modified 4-parameter model, structured as follows: [Formula: see text], where t is time, RCT_eq (min) is the gelation time, CF_P (mm) is the potential asymptotic CF at infinite time, k_CF (%/min) is the curd firming rate constant, and k_SR (%/min) is the curd syneresis rate constant. To avoid nonconvergence and computational problems due to interrelations among the equation parameters, CF_P was preliminarily defined as a function of the maximum observed curd firmness (CF_max, mm) recorded during the analysis. For this model, all the modeling equations of individual sheep milk aliquots converged, with a negligible standard error of the estimates (coefficient of determination >0.99 for all individual sample equations). Repeatability of the modeled parameters was acceptable, even in the presence of curd syneresis during the lactodynamographic analysis. Copyright © 2018 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
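
The record replaces the equation itself with "[Formula: see text]". A functional form consistent with the four parameters described (and commonly used in the curd-firming literature) is, as an assumption, CF_t = CF_P (1 - exp(-k_CF (t - RCT_eq))) exp(-k_SR (t - RCT_eq)) for t > RCT_eq, and 0 before gelation. A NumPy sketch of its behavior over the 45-min test window, with illustrative parameter values:

```python
import numpy as np

def curd_firming(t, rct_eq, cf_p, k_cf, k_sr):
    """Assumed 4-parameter curd-firming model: exponential firming toward the
    asymptotic potential CF_P multiplied by first-order syneresis decay;
    CF = 0 before the gelation time RCT_eq."""
    dt = np.maximum(t - rct_eq, 0.0)
    return cf_p * (1.0 - np.exp(-k_cf * dt)) * np.exp(-k_sr * dt)

# One reading every 15 s for 45 min after rennet addition (180 observations)
t = np.arange(1, 181) * 0.25   # minutes
cf = curd_firming(t, rct_eq=5.0, cf_p=60.0, k_cf=0.30, k_sr=0.01)  # illustrative values

peak_time = t[np.argmax(cf)]   # firmness rises, peaks, then declines with syneresis
```

The interior peak (here analytically at RCT_eq + ln((k_CF + k_SR)/k_SR)/k_CF minutes) is what distinguishes this model from monotone firming curves: it captures the syneresis-driven decline that makes sheep milk hard to fit with traditional traits.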

  4. PNNL Technical Support to The Implementation of EMTA and EMTA-NLA Models in Autodesk® Moldflow® Packages

    Energy Technology Data Exchange (ETDEWEB)

    Nguyen, Ba Nghiep; Wang, Jin

    2012-12-01

    Under the Predictive Engineering effort, PNNL developed linear and nonlinear property prediction models for long-fiber thermoplastics (LFTs). These models were implemented in PNNL's EMTA and EMTA-NLA codes. While EMTA is standalone software for computing the thermoelastic properties of composites, EMTA-NLA provides a series of nonlinear models implemented in ABAQUS® via user subroutines for structural analyses. In all these models, the fibers are assumed to be linear elastic, while the matrix material can exhibit linear or typical nonlinear behavior depending on the loading prescribed to the composite. The key idea is to model the constitutive behavior of the matrix material and then to use an Eshelby-Mori-Tanaka approach (EMTA), combined with numerical techniques for fiber length and orientation distributions, to determine the behavior of the as-formed composite. The basic property prediction models of EMTA and EMTA-NLA have been the subject of implementation in the Autodesk® Moldflow® software packages. These models are the elastic stiffness model accounting for fiber length and orientation distributions, the fiber/matrix interface debonding model, and the elastic-plastic models. The PNNL elastic-plastic models for LFTs describe the composite's nonlinear stress-strain response up to failure by an elastic-plastic formulation associated with either a micromechanical criterion to predict failure or a continuum damage mechanics formulation coupling damage to plasticity. All the models account for fiber length and orientation distributions as well as fiber/matrix debonding, which can occur at any stage of loading. In an effort to transfer the technologies developed under the Predictive Engineering project to the American automotive and plastics industries, PNNL has obtained the approval of the DOE Office of Vehicle Technologies to provide Autodesk, Inc. with the technical support for the implementation of the basic property prediction models of EMTA and

  5. DEVELOPMENT OF THE NACA 2412 SERIES FOIL AS A DIVING SYSTEM FOR A SUBMARINE MODEL

    Directory of Open Access Journals (Sweden)

    Ali Munazid

    2015-06-01

    Full Text Available A foil generates lift force when fluid flows past it, owing to the interaction between the flow and the foil surface, which makes the pressure on the upper surface lower than that on the lower surface. Foil theory can be applied to a submarine hydroplane as a diving system: with the foil inverted, the lift force becomes a downward force, allowing the submarine to dive, hover and maneuver underwater, much as an aircraft flies and glides on its wings. Research and observations were carried out on the diving capability (diving plan) of the NACA 2412 foil on a submarine model, by determining the lift coefficient (Cl) in the laboratory, designing the submarine hull form, and analyzing the forces acting on the model; when the sum of the upward forces is lower than that of the downward forces, the submarine is able to dive. The hydroplane can indeed be applied as a diving plane; diving capability is influenced by the hydroplane flip angle and the model speed: the greater the speed and flip angle, the greater the attainable diving depth.

  6. Fast and Scalable Gaussian Process Modeling with Applications to Astronomical Time Series

    Science.gov (United States)

    Foreman-Mackey, Daniel; Agol, Eric; Ambikasaran, Sivaram; Angus, Ruth

    2017-12-01

    The growing field of large-scale time domain astronomy requires methods for probabilistic data analysis that are computationally tractable, even with large data sets. Gaussian processes (GPs) are a popular class of models used for this purpose, but since the computational cost scales, in general, as the cube of the number of data points, their application has been limited to small data sets. In this paper, we present a novel method for GP modeling in one dimension where the computational requirements scale linearly with the size of the data set. We demonstrate the method by applying it to simulated and real astronomical time series data sets. These demonstrations are examples of probabilistic inference of stellar rotation periods, asteroseismic oscillation spectra, and transiting planet parameters. The method exploits structure in the problem when the covariance function is expressed as a mixture of complex exponentials, without requiring evenly spaced observations or uniform noise. This form of covariance arises naturally when the process is a mixture of stochastically driven damped harmonic oscillators—providing a physical motivation for and interpretation of this choice—but we also demonstrate that it can be a useful effective model in some other cases. We present a mathematical description of the method and compare it to existing scalable GP methods. The method is fast and interpretable, with a range of potential applications within astronomical data analysis and beyond. We provide well-tested and documented open-source implementations of this method in C++, Python, and Julia.
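
The damped-oscillator covariance described above can be written down directly. A NumPy sketch of the stochastically driven SHO kernel in its underdamped regime (parameter values here are illustrative; the linear-scaling solver itself lives in the authors' open-source packages and is not reproduced):

```python
import numpy as np

def sho_kernel(tau, s0, w0, q):
    """Covariance of a stochastically driven, damped simple harmonic oscillator
    (underdamped case Q > 1/2), one member of the family of covariances that
    are mixtures of complex exponentials."""
    tau = np.abs(tau)
    eta = np.sqrt(1.0 - 1.0 / (4.0 * q ** 2))
    return (s0 * w0 * q * np.exp(-w0 * tau / (2.0 * q))
            * (np.cos(eta * w0 * tau) + np.sin(eta * w0 * tau) / (2.0 * eta * q)))

# Unevenly spaced times: this kernel does not require gridded observations
rng = np.random.default_rng(42)
t = np.sort(rng.uniform(0.0, 10.0, 50))
K = sho_kernel(t[:, None] - t[None, :], s0=1.0, w0=2.0, q=3.0)
K += 1e-6 * np.eye(t.size)        # small jitter for numerical stability
L = np.linalg.cholesky(K)         # succeeds: a valid (positive-definite) covariance
```

A direct Cholesky factorization like this costs O(N³); the paper's contribution is evaluating the same GP likelihood in O(N) by exploiting the semiseparable structure this kernel family induces.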

  7. A Skill Score of Trajectory Model Evaluation Using Reinitialized Series of Normalized Cumulative Lagrangian Separation

    Science.gov (United States)

    Liu, Y.; Weisberg, R. H.

    2017-12-01

    The Lagrangian separation distance between the endpoints of simulated and observed drifter trajectories is often used to assess the performance of numerical particle trajectory models. However, the separation distance fails to indicate relative model performance in weak and strong current regions, such as a continental shelf and its adjacent deep ocean. A skill score is proposed based on the cumulative Lagrangian separation distances normalized by the associated cumulative trajectory lengths. The new metric correctly indicates the relative performance of the Global HYCOM in simulating the strong currents of the Gulf of Mexico Loop Current and the weaker currents of the West Florida Shelf in the eastern Gulf of Mexico. In contrast, the Lagrangian separation distance alone gives a misleading result. Also, the observed drifter position series can be used to reinitialize the trajectory model and evaluate its performance along the observed trajectory, not just at the drifter end position. The proposed dimensionless skill score is particularly useful when the number of drifter trajectories is limited and neither a conventional Eulerian-based velocity nor a Lagrangian-based probability density function can be estimated.
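
A sketch of the proposed metric, assuming the usual form ss = max(1 - c/n, 0), where c is the sum of separation distances divided by the sum of cumulative observed trajectory lengths and n is a tolerance threshold (n = 1 here; the drifter tracks are toy data, not HYCOM output):

```python
import numpy as np

def skill_score(obs, sim, n=1.0):
    """Trajectory skill score: sum of separation distances d_i normalized by the
    sum of cumulative observed trajectory lengths l_i; ss = 1 - c/n, floored at 0.
    obs, sim: (T, 2) position arrays at matching times."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    sep = np.linalg.norm(obs - sim, axis=1)             # d_i at each fix
    legs = np.linalg.norm(np.diff(obs, axis=0), axis=1)
    cum_len = np.cumsum(legs)                           # l_i up to each fix
    c = np.sum(sep[1:]) / np.sum(cum_len)               # normalized cumulative separation
    return max(1.0 - c / n, 0.0)

# Toy drifter moving east one unit per step
obs = np.column_stack([np.arange(6.0), np.zeros(6)])
sim_perfect = obs.copy()
sim_biased = obs + np.array([0.0, 0.5])  # constant 0.5-unit cross-track error
```

Because the separation is scaled by the distance the observed drifter actually traveled, the same absolute error is penalized more heavily in weak-current regions than in strong ones, which is the behavior the raw separation distance lacks.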

  8. Real-time GPS Satellite Clock Error Prediction Based On a Non-stationary Time Series Model

    Science.gov (United States)

    Wang, Q.; Xu, G.; Wang, F.

    2009-04-01

    Analysis Centers of the IGS provide precise satellite ephemerides for GPS data post-processing. The accuracy of the orbit products is better than 5 cm, and that of the satellite clock errors (SCE) approaches 0.1 ns (igscb.jpl.nasa.gov), which meets the requirements of precise point positioning (PPP). Due to the 13-day latency of the IGS final products, only the broadcast ephemeris and the IGS ultra-rapid products (predicted) are applicable for real-time PPP (RT-PPP). Therefore, developing an approach to estimate highly precise GPS SCE in real time is of particular importance for RT-PPP. Many studies have been carried out on forecasting the corrections using models such as the Linear Model (LM), Quadratic Polynomial Model (QPM), Quadratic Polynomial Model with Cyclic corrected Terms (QPM+CT), Grey Model (GM) and Kalman Filter Model (KFM). However, the precision of these models is generally at the nanosecond level. The purpose of this study is to develop a method with which SCE forecasting for RT-PPP can reach sub-nanosecond precision. Analysis of the last 8 years of IGS SCE data showed that prediction precision depends on the stability of the individual satellite clock. The clocks of the most recent GPS satellites (BLOCK IIR and BLOCK IIR-M) are more stable than those of the former GPS satellites (BLOCK IIA). For a stable satellite clock, the SCE of the next 6 hours can easily be predicted with the LM. The residuals of unstable satellite clocks are periodic with noise components. Dominant periods of the residuals are found using the Fourier transform and spectrum analysis. For the remaining part of the residuals, an auto-regression model is used to determine their systematic trends. Summarizing this study, a non-stationary time series model can be proposed to predict GPS SCE in real time. This prediction model includes a linear term, cyclic correction terms and an auto-regression term, which represent the SCE trend, the cyclic parts and the rest of the errors, respectively.
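
The proposed three-term structure (linear trend, cyclic corrections, auto-regressive residual) can be sketched as follows. The synthetic clock series, the assumed dominant period and the AR(1) simplification are all illustrative, not the authors' exact implementation:

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(0.0, 48.0, 0.25)           # 48 h of 15-min clock-error samples
period = 11.967                          # assumed dominant period (h), ~GPS orbital half-day
truth = 5.0 + 0.8 * t + 0.6 * np.sin(2 * np.pi * t / period)
sce = truth + 0.05 * rng.standard_normal(t.size)   # synthetic SCE (ns)

# Linear term + cyclic correction terms, fit by least squares
X = np.column_stack([np.ones_like(t), t,
                     np.sin(2 * np.pi * t / period),
                     np.cos(2 * np.pi * t / period)])
beta, *_ = np.linalg.lstsq(X, sce, rcond=None)
resid = sce - X @ beta

# Auto-regression term for what the deterministic part misses (AR(1) here)
phi = resid[1:] @ resid[:-1] / (resid[:-1] @ resid[:-1])

# One-step-ahead prediction at the next 15-min epoch
t_next = t[-1] + 0.25
x_next = np.array([1.0, t_next,
                   np.sin(2 * np.pi * t_next / period),
                   np.cos(2 * np.pi * t_next / period)])
pred = x_next @ beta + phi * resid[-1]
```

In the study the cyclic periods come from a spectral analysis of each clock's residuals rather than being fixed in advance as they are here.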

  9. Hydrodynamic models for slurry bubble column reactors. Seventh technical progress report, January--March 1996

    Energy Technology Data Exchange (ETDEWEB)

    Gidaspow, D.

    1996-04-01

    The objective of this investigation is to convert our "learning" gas-solid-liquid fluidization model into a predictive design model. The IIT hydrodynamic model computes the phase velocities and the volume fractions of the gas, liquid and particulate phases. Model verification involves a comparison of these computed velocities and volume fractions to experimental values. A hydrodynamic model for multiphase flows, based on the principles of mass, momentum and energy conservation for each phase, was developed and applied to model gas-liquid and gas-liquid-solid fluidization and gas-solid-solid separation. To simulate industrial slurry bubble column reactors, a computer program based on the hydrodynamic model was written with modules for chemical reactions (e.g., the synthesis of methanol), phase changes and heat exchangers. In simulations of gas-liquid two-phase flow systems, the gas hold-ups, computed for a variety of operating conditions such as temperature, pressure, and gas and liquid velocities, agree well with the measurements obtained at Air Products' pilot plant. The hydrodynamic model is more flexible than previous empirical correlations in predicting the gas hold-up of gas-liquid two-phase flow systems. In simulations of gas-liquid-solid bubble column reactors with and without slurry circulation, the code computes volume fractions, temperatures and velocity distributions for the gas, liquid and solid phases, as well as concentration distributions for the species (CO, H2, CH3OH, ...), after startup from a given initial state. A kinetic theory approach is used to compute a solids viscosity due to particle collisions. Solid motion and gas-liquid-solid mixing are observed in a color PCSHOW movie made from computed time series data. The steady-state and time-averaged catalyst concentration profiles, the slurry height and the rates of methanol production agree well with the measurements obtained at an Air Products' pilot plant.

  10. Pan-Arctic TV Series on Inuit wellness: a northern model of communication for social change?

    Science.gov (United States)

    Johnson, Rhonda; Morales, Robin; Leavitt, Doreen; Carry, Catherine; Kinnon, Dianne; Rideout, Denise; Clarida, Kath

    2011-06-01

    This paper provides highlights of a utilization-focused evaluation of a collaborative Pan-Arctic Inuit Wellness TV Series that was broadcast live in Alaska and Canada in May 2009. This International Polar Year (IPY) communication and outreach project intended to (1) share information on International Polar Year research progress, disseminate findings and explore questions with Inuit in Alaska, Canada and Greenland; (2) provide a forum for Inuit in Alaska, Canada and Greenland to showcase innovative health and wellness projects; (3) ensure Inuit youth and adult engagement throughout; and (4) document and reflect on the overall experience for the purposes of developing and "testing" a participatory communication model. Utilization-focused formative evaluation of the project, with a focus on overall objectives, key messages and lessons learned to facilitate program improvement. Participant observation, surveys, key informant interviews, document review and website tracking. Promising community programs related to 3 themes - men's wellness, maternity care and youth resilience - in diverse circumpolar regions were highlighted, as were current and still-evolving findings from ongoing Arctic research. Multiple media methods were used to effectively deliver and receive key messages determined by both community and academic experts. Local capacity and new regional networks were strengthened. Evidence-based resources for health education and community action were archived in digital formats (websites and DVDs), increasing accessibility for otherwise isolated individuals and remote communities. The Pan-Arctic Inuit Wellness TV Series was an innovative, multi-dimensional communication project that raised both interest and awareness about complex health conditions in the North and stimulated community dialogue and potential for increased collaborative action.
Consistent with a communication for social change approach, the project created new networks, increased motivation to act

  11. Preliminary Study of Soil Available Nutrient Simulation Using a Modified WOFOST Model and Time-Series Remote Sensing Observations

    OpenAIRE

    Zhiqiang Cheng; Jihua Meng; Yanyou Qiao; Yiming Wang; Wenquan Dong; Yanxin Han

    2018-01-01

    The approach of using multispectral remote sensing (RS) to estimate soil available nutrients (SANs) has been recently developed and shows promising results. This method overcomes the limitations of commonly used methods by building a statistical model that connects RS-based crop growth and nutrient content. However, the stability and accuracy of this model require improvement. In this article, we replaced the statistical model by integrating the World Food Studies (WOFOST) model and time seri...

  12. Linear time series modeling of GPS-derived TEC observations over the Indo-Thailand region

    Science.gov (United States)

    Suraj, Puram Sai; Kumar Dabbakuti, J. R. K.; Chowdhary, V. Rajesh; Tripathi, Nitin K.; Ratnam, D. Venkata

    2017-12-01

    This paper proposes a linear time series model to represent the climatology of the ionosphere and to investigate the characteristics of hourly averaged total electron content (TEC). The GPS-TEC observation data at the Bengaluru international global navigation satellite system (GNSS) service (IGS) station (geographic 13.02°N, 77.57°E; geomagnetic latitude 4.4°N) have been utilized for processing the TEC data over an extended period (2009-2016) in the 24th solar cycle. The solar flux F10.7p index, the geomagnetic Ap index, and periodic oscillation factors have been considered to construct the linear TEC model. It is evident from the results that the effect of solar activity on TEC is strong. TEC reaches its maximum value (~40 TECU) during the high solar activity (HSA) year (2014) and its minimum value (~15 TECU) during the low solar activity (LSA) year (2009). Larger magnitudes of the semiannual variations are observed during HSA periods. The geomagnetic effect on TEC is relatively low, the highest being ~4 TECU (March 2015). The magnitude of the periodic variations is more significant during HSA periods (2013-2015) and less so during LSA periods (2009-2011). A correlation coefficient of 0.89 between the observations and the model-based estimations has been found. The RMSE between the observed TEC and model TEC values is 4.0 TECU (linear model) and 4.21 TECU (IRI2016 model). Further, the linear TEC model has been validated at different latitudes over the northern low-latitude region. The solar component (F10.7p index) value decreases with increasing latitude. The magnitudes of the periodic component become less significant with increasing latitude. The influence of the geomagnetic component becomes less significant at the Lucknow GNSS station (26.76°N, 80.88°E) compared to other GNSS stations. Hourly averaged TEC values have been considered, and the ionospheric features are well recovered with the linear TEC model.
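
A hedged sketch of such a linear TEC model: least-squares regression of TEC on a solar proxy (F10.7p), a geomagnetic index (Ap) and annual/semiannual harmonics. The drivers and coefficients below are invented for illustration, not the station's data:

```python
import numpy as np

rng = np.random.default_rng(7)
days = np.arange(365.0)
doy = 2 * np.pi * days / 365.25

# Synthetic drivers: 27-day solar-rotation modulation of F10.7p, noisy Ap
f107p = 100.0 + 30.0 * np.sin(2 * np.pi * days / 27.0) + rng.normal(0, 5, days.size)
ap = np.abs(rng.normal(8.0, 6.0, days.size))

# Synthetic "observed" TEC (TECU): solar + geomagnetic + annual/semiannual terms
tec = (0.20 * f107p + 0.15 * ap + 1.0 * np.cos(doy) + 3.0 * np.cos(2 * doy)
       + rng.normal(0, 1.0, days.size))

# Linear TEC model: intercept, solar, geomagnetic and periodic regressors
X = np.column_stack([np.ones_like(days), f107p, ap,
                     np.cos(doy), np.sin(doy), np.cos(2 * doy), np.sin(2 * doy)])
coef, *_ = np.linalg.lstsq(X, tec, rcond=None)
fitted = X @ coef
rmse = np.sqrt(np.mean((tec - fitted) ** 2))
r = np.corrcoef(tec, fitted)[0, 1]
```

The latitude dependence reported in the paper would appear here as smaller solar and periodic coefficients when the same regression is refit station by station.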

  13. Beyond Rating Curves: Time Series Models for in-Stream Turbidity Prediction

    Science.gov (United States)

    Wang, L.; Mukundan, R.; Zion, M.; Pierson, D. C.

    2012-12-01

    The New York City Department of Environmental Protection (DEP) manages New York City's water supply, which comprises over 20 reservoirs and supplies over 1 billion gallons of water per day to more than 9 million customers. DEP's "West of Hudson" reservoirs located in the Catskill Mountains are unfiltered per a renewable filtration avoidance determination granted by the EPA. While water quality is usually pristine, high-volume storm events occasionally cause the reservoirs to become highly turbid. A logical strategy for turbidity control is to temporarily remove the turbid reservoirs from service. While effective in limiting delivery of turbid water and reducing the need for in-reservoir alum flocculation, this strategy runs the risk of negatively impacting water supply reliability. Thus, it is advantageous for DEP to understand how long a particular turbidity event will affect their system. In order to understand the duration, intensity and total load of a turbidity event, predictions of future in-stream turbidity values are important. Traditionally, turbidity predictions have been carried out by applying streamflow observations/forecasts to a flow-turbidity rating curve. However, predictions from rating curves are often inaccurate due to inter- and intra-event variability in flow-turbidity relationships. Predictions can be improved by applying an autoregressive moving average (ARMA) time series model in combination with a traditional rating curve. Since 2003, DEP and the Upstate Freshwater Institute have compiled a relatively consistent set of 15-minute turbidity observations at various locations on Esopus Creek above Ashokan Reservoir. Using daily averages of these data and streamflow observations at nearby USGS gauges, flow-turbidity rating curves were developed via linear regression. Time series analysis revealed that the linear regression residuals may be represented using an ARMA(1,2) process. Based on this information, flow-turbidity regressions with
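
The rating-curve-plus-residual-model idea can be sketched as follows. For brevity an AR(1) correction stands in for the ARMA(1,2) process identified in the study, and the flow and turbidity series are synthetic:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 400

# Synthetic daily log-streamflow with a seasonal swing
log_q = (np.log(50.0) + 0.8 * np.sin(2 * np.pi * np.arange(n) / 90.0)
         + 0.5 * rng.standard_normal(n))

# Rating-curve residuals with persistence: AR(1) stand-in for the study's ARMA(1,2)
eps = np.zeros(n)
for i in range(1, n):
    eps[i] = 0.7 * eps[i - 1] + 0.2 * rng.standard_normal()
log_turb = -2.0 + 1.5 * log_q + eps   # log-log rating curve plus correlated residual

# Step 1: fit the rating curve by linear regression in log space
A = np.column_stack([np.ones(n), log_q])
(b0, b1), *_ = np.linalg.lstsq(A, log_turb, rcond=None)
resid = log_turb - (b0 + b1 * log_q)

# Step 2: AR(1) coefficient of the residuals from the lag-1 autocovariance
phi = resid[1:] @ resid[:-1] / (resid[:-1] @ resid[:-1])

# One-step-ahead turbidity prediction: rating curve plus persisted residual
pred_next = b0 + b1 * log_q[-1] + phi * resid[-1]
```

The residual model is what captures the inter-event persistence that a static rating curve misses; a full ARMA(1,2) fit would typically be done with a time series library rather than the closed-form estimate used here.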

  14. An Algorithm for Modified Times Series Analysis Method for Modeling and Prognosis of the River Water Quality

    Directory of Open Access Journals (Sweden)

    Petrov M.

    2007-12-01

    Full Text Available An algorithm and programs for the modeling, analysis, and prognosis of river water quality have been developed, based on a modified method of time series analysis (TSA). The algorithm and programs are used for modeling and prognosis of the water quality of Bulgarian river ecosystems.

  15. Assessment and prediction of road accident injuries trend using time-series models in Kurdistan.

    Science.gov (United States)

    Parvareh, Maryam; Karimi, Asrin; Rezaei, Satar; Woldemichael, Abraha; Nili, Sairan; Nouri, Bijan; Nasab, Nader Esmail

    2018-01-01

    Road traffic accidents are commonly encountered incidents that can cause high-intensity injuries to the victims and have direct impacts on the members of the society. Iran has one of the highest incidence rates of road traffic accidents. The objective of this study was to model the patterns of road traffic accidents leading to injury in Kurdistan province, Iran. A time-series analysis was conducted to characterize and predict the frequency of road traffic accidents that lead to injury in Kurdistan province. The injuries were categorized into three separate groups: car occupant, motorcyclist and pedestrian road traffic accident injuries. The Box-Jenkins time-series approach was used to model the injury observations, applying autoregressive integrated moving average (ARIMA) and seasonal autoregressive integrated moving average (SARIMA) models from March 2009 to February 2015, and to predict the accidents up to 24 months later (February 2017). The analysis was carried out using the R-3.4.2 statistical software package. A total of 5199 pedestrian, 9015 motorcyclist, and 28,906 car occupant accidents were observed. The mean (SD) numbers of car occupant, motorcyclist and pedestrian accident injuries observed were 401.01 (SD 32.78), 123.70 (SD 30.18) and 71.19 (SD 17.92) per year, respectively. The best models for the patterns of car occupant, motorcyclist, and pedestrian injuries were ARIMA(1,0,0), SARIMA(1,0,2)(1,0,0)_12, and SARIMA(1,1,1)(0,0,1)_12, respectively. The motorcyclist and pedestrian injuries showed a seasonal pattern, with a peak during summer (August). The minimum frequency of motorcyclist and pedestrian injuries was observed during late autumn and early winter (December and January). Our findings revealed that the observed motorcyclist and pedestrian injuries had a seasonal pattern that was explained by air temperature changes over time. These findings call for close monitoring of the

  16. Multiscale cartilage biomechanics: technical challenges in realizing a high-throughput modelling and simulation workflow.

    Science.gov (United States)

    Erdemir, Ahmet; Bennetts, Craig; Davis, Sean; Reddy, Akhil; Sibole, Scott

    2015-04-06

    Understanding the mechanical environment of articular cartilage and chondrocytes is of the utmost importance in evaluating tissue damage which is often related to failure of the fibre architecture and mechanical injury to the cells. This knowledge also has significant implications for understanding the mechanobiological response in healthy and diseased cartilage and can drive the development of intervention strategies, ranging from the design of tissue-engineered constructs to the establishment of rehabilitation protocols. Spanning multiple spatial scales, a wide range of biomechanical factors dictate this mechanical environment. Computational modelling and simulation provide descriptive and predictive tools to identify multiscale interactions, and can lead towards a greater comprehension of healthy and diseased cartilage function, possibly in an individualized manner. Cartilage and chondrocyte mechanics can be examined in silico, through post-processing or feed-forward approaches. First, joint-tissue level simulations, typically using the finite-element method, solve boundary value problems representing the joint articulation and underlying tissue, which can differentiate the role of compartmental joint loading in cartilage contact mechanics and macroscale cartilage field mechanics. Subsequently, tissue-cell scale simulations, driven by the macroscale cartilage mechanical field information, can predict chondrocyte deformation metrics along with the mechanics of the surrounding pericellular and extracellular matrices. A high-throughput modelling and simulation framework is necessary to develop models representative of regional and population-wide variations in cartilage and chondrocyte anatomy and mechanical properties, and to conduct large-scale analysis accommodating a multitude of loading scenarios. 
However, realization of such a framework is a daunting task, with technical difficulties hindering the processes of model development, scale coupling, simulation and

  17. Computer models of dipole magnets of a series 'VULCAN' for the ALICE experiment

    International Nuclear Information System (INIS)

    Vodop'yanov, A.S.; Shishov, Yu.A.; Yuldasheva, M.B.; Yuldashev, O.I.

    1998-01-01

    The paper is devoted to the construction of computer models for three magnets of the 'VULCAN' series within the framework of a differential approach with two scalar potentials. The distinctive property of these magnets is that they are 'warm' and their coils have a conic saddle shape. An algorithm for creating a computer model of the coils is suggested. The coil field is computed by the Biot-Savart law, and some of the integrals are calculated with the help of analytical formulas. To compute three-dimensional magnetic fields by the finite element method with local accuracy control, two new algorithms are suggested. The former is based on a comparison of the fields computed by means of linear and quadratic shape functions. The latter is based on a comparison of the field computed with the help of linear shape functions and a local classical solution. The distributions of the local accuracy control characteristics within the working part of the third magnet, together with the other results of the computations, are presented.
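
The Biot-Savart evaluation over a discretized coil can be sketched generically. A circular loop (not the paper's conic saddle coils) is a convenient test case because the field at its centre has the closed form B = mu0*I/(2R) for validation:

```python
import numpy as np

MU0 = 4e-7 * np.pi   # vacuum permeability (T*m/A)

def biot_savart(points, current, path):
    """Magnetic field of a piecewise-linear current path: sum the Biot-Savart
    contributions dB = mu0*I/(4*pi) * dl x r / |r|^3 over straight segments,
    taking r from each segment midpoint (midpoint rule)."""
    starts, ends = path[:-1], path[1:]
    mids = 0.5 * (starts + ends)
    dl = ends - starts
    b = np.zeros((len(points), 3))
    for k, p in enumerate(points):
        r = p - mids
        norm3 = np.linalg.norm(r, axis=1) ** 3
        b[k] = MU0 * current / (4 * np.pi) * np.sum(np.cross(dl, r) / norm3[:, None], axis=0)
    return b

# Circular loop of radius R in the xy-plane, discretized into straight segments
R, I, nseg = 0.5, 1000.0, 2000
theta = np.linspace(0.0, 2.0 * np.pi, nseg + 1)
loop = np.column_stack([R * np.cos(theta), R * np.sin(theta), np.zeros(nseg + 1)])

b_center = biot_savart(np.array([[0.0, 0.0, 0.0]]), I, loop)[0]
b_exact = MU0 * I / (2.0 * R)   # analytic field at the centre of a circular loop
```

Replacing some segment integrals with analytical formulas, as the paper does, improves accuracy near the conductor, where the midpoint rule used here degrades.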

  18. Technical and Non-Technical Measures for air pollution emission reduction: The integrated assessment of the regional Air Quality Management Plans through the Italian national model

    Science.gov (United States)

    D'Elia, I.; Bencardino, M.; Ciancarella, L.; Contaldi, M.; Vialetto, G.

    2009-12-01

    The Italian Air Quality legislation underwent sweeping changes with the implementation of the 1996 European Air Quality Framework Directive, when the Italian administrative Regions were entrusted with air quality management tasks. The most recent Regional Air Quality Management Plans (AQMPs) highlighted the importance of Non-Technical Measures (NTMs), in addition to Technical Measures (TMs), in meeting environmental targets. The aim of the present work is to compile a list of all the TMs and NTMs taken into account in the Italian Regional AQMPs and to give, in the target year 2010, an estimation of SO2, NOx and PM10 emission reductions, of PM10 concentrations, and of the health impact of PM2.5 concentrations in terms of Life Expectancy Reduction. In order to do that, RAINS-Italy, as part of the National Integrated Modeling system for International Negotiation on atmospheric pollution (MINNI), has been applied. The management of TMs and NTMs inside RAINS often required both the introduction of exogenous driving-force scenarios and the modification of the control strategy. This inspired a revision of the many NTM definitions and a clear choice of the definition adopted. It was finally highlighted that only a few TMs and NTMs implemented in the AQMPs represent effective measures for reaching the environmental targets.

  19. Validation of detailed thermal hydraulic models used for LMR safety and for improvement of technical specifications

    Energy Technology Data Exchange (ETDEWEB)

    Dunn, F.E.

    1995-12-31

    Detailed steady-state and transient coolant temperatures and flow rates from an operating reactor have been used to validate the multiple-pin model in the SASSYS-1 liquid metal reactor systems analysis code. This multiple-pin capability can be used for explicit calculations of axial and lateral temperature distributions within individual subassemblies. Thermocouples at a number of axial locations and in a number of different coolant sub-channels in the XX09 instrumented subassembly in the EBR-II reactor provided temperature data from the Shutdown Heat Removal Test (SHRT) series. Flow meter data for XX09 and for the overall system are also available from these tests. Results of consistent SASSYS-1 multiple-pin analyses for both the SHRT-45 loss-of-flow-without-scram test and the SHRT-17 protected loss-of-flow test agree well with the experimental data, providing validation of the SASSYS-1 code over a wide range of conditions.

  20. Air/superfund national technical guidance study series. Volume 4. Guidance for ambient air monitoring at superfund sites (revised). Final report

    International Nuclear Information System (INIS)

    Roffman, A.; Stoner, R.

    1993-05-01

    The report presents the results of an EPA-sponsored study to develop guidance for designing and conducting ambient air monitoring at Superfund sites. By law, all exposure pathways - including the air pathway - must be evaluated for every Superfund site; therefore, some level of ambient air monitoring usually is necessary at each site. The document offers technical guidance for use by a diverse audience, including EPA Air and Superfund Regional and Headquarters staff, State Air and Superfund staff, federal and state remedial and removal contractors, and potentially responsible parties. The manual is written to serve the needs of individuals with various levels of scientific training and experience in selecting and using ambient air monitoring methods in support of air pathway assessments

  1. Structuring Socio-technical Complexity: Modelling Agent Systems using Institutional Analysis

    NARCIS (Netherlands)

    Ghorbani, A.

    2013-01-01

    Socio-technical systems consist of many heterogeneous decision making entities and technological artefacts. These systems are governed through public policy that unravels in a multi-scale institutional context, which ranges from norms and values to technical standards. Simulation, agent-based

  2. Career and Technical Education Teacher Shortage: A Successful Model for Recruitment and Retention

    Science.gov (United States)

    Wilkin, Thomas; Nwoke, Godfrey I.

    2011-01-01

    The role of Career and Technical Education (CTE) as a major source of skilled workers for the American economy and a vital component of American education is well established. Several recent studies show that when CTE programs combine rigorous academic standards and industry-based technical content, the result is higher academic achievement and…

  3. Improving Service Quality in Technical Education: Use of Interpretive Structural Modeling

    Science.gov (United States)

    Debnath, Roma Mitra; Shankar, Ravi

    2012-01-01

    Purpose: The purpose of this paper is to identify the relevant enablers and barriers related to technical education. It seeks to critically analyze the relationship amongst them so that policy makers can focus on relevant parameters to improve the service quality of technical education. Design/methodology/approach: The present study employs the…

  4. Economic modeling of directed technical change: the case of CO2 emission reduction

    NARCIS (Netherlands)

    Otto, V.M.

    2006-01-01

    The potential of technical change for cost-effective pollution abatement typically differs from technology to technology. It therefore is the aim of this thesis to study how policy instruments can direct technical change to those technologies with the greatest potential for cost-effective pollution

  5. Advanced methods for modeling water-levels and estimating drawdowns with SeriesSEE, an Excel add-in

    Science.gov (United States)

    Halford, Keith; Garcia, C. Amanda; Fenelon, Joe; Mirus, Benjamin B.

    2012-12-21

    Water-level modeling is used for multiple-well aquifer tests to reliably differentiate pumping responses from natural water-level changes in wells, or “environmental fluctuations.” Synthetic water levels are created during water-level modeling and represent the summation of multiple component fluctuations, including those caused by environmental forcing and pumping. Pumping signals are modeled by transforming step-wise pumping records into water-level changes by using superimposed Theis functions. Water-levels can be modeled robustly with this Theis-transform approach because environmental fluctuations and pumping signals are simulated simultaneously. Water-level modeling with Theis transforms has been implemented in the program SeriesSEE, which is a Microsoft® Excel add-in. Moving average, Theis, pneumatic-lag, and gamma functions transform time series of measured values into water-level model components in SeriesSEE. Earth tides and step transforms are additional computed water-level model components. Water-level models are calibrated by minimizing a sum-of-squares objective function where singular value decomposition and Tikhonov regularization stabilize results. Drawdown estimates from a water-level model are the summation of all Theis transforms minus residual differences between synthetic and measured water levels. The accuracy of drawdown estimates is limited primarily by noise in the data sets, not the Theis-transform approach. Drawdowns much smaller than environmental fluctuations have been detected across major fault structures, at distances of more than 1 mile from the pumping well, and with limited pre-pumping and recovery data at sites across the United States. In addition to water-level modeling, utilities exist in SeriesSEE for viewing, cleaning, manipulating, and analyzing time-series data.
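    The Theis-transform idea above can be illustrated with a small sketch. The following Python code (an illustration, not SeriesSEE itself) evaluates the Theis well function from its convergent series expansion and superimposes drawdown responses for a step-wise pumping record; all aquifer parameters and rates below are hypothetical.

```python
import math

def well_function(u, terms=60):
    """Theis well function W(u) from its series expansion (valid for u > 0)."""
    s = -0.5772156649015329 - math.log(u)  # Euler-Mascheroni constant
    sign, fact = 1.0, 1.0
    for n in range(1, terms):
        fact *= n
        s += sign * u ** n / (n * fact)
        sign = -sign
    return s

def theis_drawdown(Q, T, S, r, t):
    """Drawdown at radius r and time t for a constant pumping rate Q."""
    u = r * r * S / (4.0 * T * t)
    return Q / (4.0 * math.pi * T) * well_function(u)

def step_drawdown(steps, T, S, r, t):
    """Superimpose Theis responses for a step-wise pumping record.

    steps -- list of (start_time, pumping_rate); each step replaces the
             previous rate, so only the rate increments are superimposed.
    """
    s, prev_q = 0.0, 0.0
    for t0, q in steps:
        if t <= t0:
            break
        s += theis_drawdown(q - prev_q, T, S, r, t - t0)
        prev_q = q
    return s
```

A single step reduces to the plain Theis solution, and drawdown decays with distance from the pumping well, as expected.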

  6. Approaches in highly parameterized inversion: TSPROC, a general time-series processor to assist in model calibration and result summarization

    Science.gov (United States)

    Westenbroek, Stephen M.; Doherty, John; Walker, John F.; Kelson, Victor A.; Hunt, Randall J.; Cera, Timothy B.

    2012-01-01

    The TSPROC (Time Series PROCessor) computer software uses a simple scripting language to process and analyze time series. It was developed primarily to assist in the calibration of environmental models. The software is designed to perform calculations on time-series data commonly associated with surface-water models, including calculation of flow volumes, transformation by means of basic arithmetic operations, and generation of seasonal and annual statistics and hydrologic indices. TSPROC can also be used to generate some of the key input files required to perform parameter optimization by means of the PEST (Parameter ESTimation) computer software. Through the use of TSPROC, the objective function for use in the model-calibration process can be focused on specific components of a hydrograph.
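    As a loose illustration of the kind of calculation TSPROC performs (not its actual scripting language), the sketch below computes seasonal mean flows and a total flow volume from a daily series; the dates and flow values are made up.

```python
from datetime import date
from statistics import mean

# hypothetical daily mean flows in m^3/s keyed by date
flows = {
    date(2020, 1, 1): 12.0, date(2020, 1, 2): 14.0,
    date(2020, 7, 1): 3.0,  date(2020, 7, 2): 5.0,
}

# map calendar month to climatological season
SEASONS = {12: "DJF", 1: "DJF", 2: "DJF", 3: "MAM", 4: "MAM", 5: "MAM",
           6: "JJA", 7: "JJA", 8: "JJA", 9: "SON", 10: "SON", 11: "SON"}

def seasonal_means(series):
    """Group daily flows by season and average each group."""
    buckets = {}
    for d, q in series.items():
        buckets.setdefault(SEASONS[d.month], []).append(q)
    return {s: mean(v) for s, v in buckets.items()}

def flow_volume(series):
    """Total volume in m^3, assuming each value is a daily mean flow."""
    return sum(q * 86400.0 for q in series.values())
```

Statistics like these, computed on both observed and simulated series, are what feed the PEST objective function.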

  7. Adapting an Agent-Based Model of Socio-Technical Systems to Analyze System and Security Failures

    Science.gov (United States)

    2016-05-09

    We implemented the model using JADE, the Java Agent DEvelopment framework [1]. JADE is a Java-based, open-source agent-oriented middleware.

  8. Issues in Biological Shape Modelling

    DEFF Research Database (Denmark)

    Hilger, Klaus Baggesen

    This talk reflects parts of the current research at Informatics and Mathematical Modelling at the Technical University of Denmark within biological shape modelling. We illustrate a series of generalizations, modifications, and applications of the elements of constructing models of shape or appearance.

  9. Digital Aquifer - Integrating modeling, technical, software and policy aspects to develop a groundwater management tool

    Science.gov (United States)

    Tirupathi, S.; McKenna, S. A.; Fleming, K.; Wambua, M.; Waweru, P.; Ondula, E.

    2016-12-01

    Groundwater management has traditionally been approached as a matter of long-term policy to ensure that the water resource is sustainable. IBM Research, in association with the World Bank, extended this traditional analysis to include real-time groundwater management by building a context-aware water-rights management and permitting system. As part of this effort, one of the primary objectives was to develop a groundwater flow model that can provide policy makers with a visual overview of the current groundwater distribution. In addition, the system helps the policy makers simulate a range of scenarios and check the sustainability of the groundwater resource in a given region. The system also enables a license provider to check the effect of the introduction of a new well on the existing wells in the domain, as well as on the groundwater resource in general. This process simplifies how an engineer determines whether a new well should be approved. Distances to the nearest well neighbours and the maximum decreases in water levels of nearby wells are continually assessed and presented as evidence for an engineer to make the final judgment on approving the permit. The system also facilitates updated insights on the amount of groundwater left in an area and provides advice on how water fees should be structured to balance conservation and economic development goals. In this talk, we will discuss the concept of Digital Aquifer and the challenges in integrating modelling, technical and software aspects to develop a management system that helps policy makers and license providers with a robust decision-making tool. We will concentrate on the groundwater model, developed using the analytic element method, which plays a very important role in the decision-making aspects.
Finally, the efficiency of this system and methodology is shown through a case study in Laguna Province, Philippines, which was done in collaboration with the National Water Resource Board, Philippines and World
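    The analytic element method mentioned above builds solutions by superimposing closed-form elements. A minimal sketch of the principle, assuming steady Thiem-type wells and entirely hypothetical parameter values (this is an illustration, not the described system):

```python
import math

def steady_drawdown(wells, T, R, x, y):
    """Superimpose Thiem solutions for steady drawdown at point (x, y).

    wells -- list of (xw, yw, Q) with Q the pumping rate (m^3/d)
    T     -- transmissivity (m^2/d); R -- radius of influence (m)
    """
    s = 0.0
    for xw, yw, q in wells:
        r = math.hypot(x - xw, y - yw)
        if r < R:  # no effect beyond the radius of influence
            s += q / (2.0 * math.pi * T) * math.log(R / r)
    return s

# existing wells plus a proposed new well (hypothetical data)
existing = [(0.0, 0.0, 500.0), (800.0, 0.0, 300.0)]
proposed = (400.0, 0.0, 400.0)

# extra drawdown the new well would impose at an existing well head,
# the kind of evidence presented to the permitting engineer
impact = steady_drawdown([proposed], T=250.0, R=1000.0, x=0.0, y=0.0)
```

Because the governing equation is linear, drawdowns from individual elements add, which is what makes this approach attractive for quickly screening new-well permits.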

  10. Technical Report Series on Global Modeling and Data Assimilation. Volume 12; Comparison of Satellite Global Rainfall Algorithms

    Science.gov (United States)

    Suarez, Max J. (Editor); Chang, Alfred T. C.; Chiu, Long S.

    1997-01-01

    Seventeen months of rainfall data (August 1987-December 1988) from nine satellite rainfall algorithms (Adler, Chang, Kummerow, Prabhakara, Huffman, Spencer, Susskind, and Wu) were analyzed to examine the uncertainty of satellite-derived rainfall estimates. The variability among algorithms, measured as the standard deviation computed from the ensemble of algorithms, shows regions of high algorithm variability tend to coincide with regions of high rain rates. Histograms of pattern correlation (PC) between algorithms suggest a bimodal distribution, with separation at a PC-value of about 0.85. Applying this threshold as a criteria for similarity, our analyses show that algorithms using the same sensor or satellite input tend to be similar, suggesting the dominance of sampling errors in these satellite estimates.
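    The pattern-correlation similarity criterion can be sketched directly. The code below (an illustration, not the study's software) computes the Pearson correlation between two flattened rain-rate fields and applies the 0.85 threshold used in the analysis:

```python
from math import sqrt
from statistics import mean

def pattern_correlation(a, b):
    """Pearson correlation between two equally sized gridded fields,
    each flattened to a flat list of grid-point values."""
    ma, mb = mean(a), mean(b)
    da = [x - ma for x in a]
    db = [y - mb for y in b]
    num = sum(x * y for x, y in zip(da, db))
    den = sqrt(sum(x * x for x in da)) * sqrt(sum(y * y for y in db))
    return num / den

def similar(a, b, threshold=0.85):
    """Apply the similarity criterion to two rain-rate fields."""
    return pattern_correlation(a, b) >= threshold
```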

  11. Review of the Technical Status on the Debris Bed Cooling Model

    International Nuclear Information System (INIS)

    Kim, Eui Kwang; Cho, Chung Ho; Lee, Yong Bum

    2007-09-01

    Preliminary safety analyses of the KALIMER-600 design have shown that the design has inherent safety characteristics and is capable of accommodating double-fault initiators such as ATWS events without coolant boiling or fuel melting. However, for future designs of sodium-cooled fast reactors, the evaluation of the safety performance and the determination of containment requirements may warrant due consideration of triple-fault accident sequences of extremely low probability of occurrence that lead to core melting. For any postulated accident sequence which leads to core melting, in-vessel retention of the core debris will be required as a design requirement for future sodium-cooled fast reactor designs. Also, proof of the capability of debris bed cooling is an essential condition for solving the problem of in-vessel retention of the core debris. In this study, a review of the technical status of debris bed cooling models was carried out for in-vessel retention of the core debris

  12. Uncertainty Analysis of Resistance Tests in Ata Nutku Ship Model Testing Laboratory of Istanbul Technical University

    Directory of Open Access Journals (Sweden)

    Cihad DELEN

    2015-12-01

    Full Text Available In this study, a set of systematic resistance tests performed in the Ata Nutku Ship Model Testing Laboratory of Istanbul Technical University (ITU) have been analysed in order to determine their uncertainties. Experiments conducted in the framework of mathematical and physical rules for the solution of engineering problems, together with the associated measurements and calculations, include uncertainty. To assess the reliability of the obtained values, the existing uncertainties should be expressed as quantities. If the uncertainty of a measurement system is not known, its results do not carry a universal value. On the other hand, resistance is one of the most important parameters that should be considered in the process of ship design. Ship resistance cannot be determined precisely and reliably during the design phase because of the uncertainty sources involved in determining the resistance value. This may make it harder to meet the required specifications in later design stages. The uncertainty arising from the resistance test has been estimated and compared for a displacement-type ship and for high-speed marine vehicles according to the ITTC 2002 and ITTC 2014 procedures for uncertainty analysis. The advantages and disadvantages of both ITTC uncertainty analysis methods are also discussed.
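    Uncertainty analyses of this kind typically combine elemental bias limits and a precision limit by root-sum-square. A simplified sketch in that spirit (the component values are hypothetical and the procedure is reduced to its RSS core, not the full ITTC method):

```python
import math

def rss(components):
    """Root-sum-square combination of independent error components."""
    return math.sqrt(sum(c * c for c in components))

def total_uncertainty(bias_limits, precision_limit):
    """Combine elemental bias limits with a precision limit by RSS."""
    return rss([rss(bias_limits), precision_limit])

# hypothetical elemental bias limits for a resistance test (in % of R_T):
# dynamometer calibration, carriage-speed measurement, temperature correction
U = total_uncertainty([0.6, 0.8, 0.0], precision_limit=2.4)
```

With these numbers the bias limit is 1.0 and the total uncertainty 2.6, illustrating how a few dominant components drive the result.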

  13. Final Technical Report - SciDAC Cooperative Agreement: Center for Extended Magnetohydrodynamic Modeling/ Transport and Dynamics in Torodial Fusion System

    International Nuclear Information System (INIS)

    Schnack, Dalton D.

    2010-01-01

    Final technical report for research performed by Professor Dalton D. Schnack on SciDAC Cooperative Agreement: Center for Extended MHD Modeling, DE-FC02-06ER54870, for the period 7/1/06 to 2/15/08. Principal results for this period are: 1. Development of a model for computational modeling of the primitive form of the extended MHD equations. This was reported in Phys. Plasmas 13, 058103 (2006). 2. Comparison between the NIMROD and M3D codes for simulation of the nonlinear sawtooth crash in the CDXU tokamak. This was reported in Phys. Plasmas 14, 056105 (2006). 3. Demonstration of 2-fluid and gyroviscous stabilization of interchange modes using computational extended MHD models. This was reported in Phys. Rev. Letters 101, 085005 (2008). Each of these publications is attached as an Appendix of this report. They should be consulted for technical details.

  14. A COMPARATIVE STUDY OF SIMULATION AND TIME SERIES MODEL IN QUANTIFYING BULLWHIP EFFECT IN SUPPLY CHAIN

    Directory of Open Access Journals (Sweden)

    T. V. O. Fabson

    2011-11-01

    Full Text Available Bullwhip (or whiplash) effect is an observed phenomenon in forecast-driven distribution channels, and careful management of these effects is of great importance to managers of supply chains. The bullwhip effect refers to situations where orders to the suppliers tend to have larger variance than sales to the buyer (demand distortion), and the distortion increases as we move up the supply chain. Because customer demand for products is unstable, business managers must forecast in order to properly position inventory and other resources. Forecasts are statistically based and, in most cases, are not very accurate. The existence of forecast errors makes it necessary for organizations to often carry an inventory buffer called "safety stock". Moving up the supply chain from the end-user customers to the raw materials supplier, a lot of variation in demand can be observed, which calls for a greater need for safety stock. This study compares the efficacy of simulation and Time Series models in quantifying the bullwhip effect in supply chain management.
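    The bullwhip effect is commonly quantified as the ratio of the variance of orders to the variance of demand; a ratio above 1 indicates amplification up the chain. A minimal sketch with made-up demand and order series:

```python
from statistics import variance

def bullwhip_ratio(orders, demand):
    """Variance amplification of upstream orders relative to demand.

    A ratio greater than 1 indicates the bullwhip effect.
    """
    return variance(orders) / variance(demand)

demand = [9, 11, 9, 11, 9, 11]     # end-customer sales
orders = [6, 14, 6, 14, 6, 14]     # upstream orders overreact to the swings
ratio = bullwhip_ratio(orders, demand)
```

Here the orders swing four times as widely as demand, so the variance ratio is 16.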

  15. Numerical modelling of series-parallel cooling systems in power plant

    Science.gov (United States)

    Regucki, Paweł; Lewkowicz, Marek; Kucięba, Małgorzata

    2017-11-01

    The paper presents a mathematical model allowing one to study series-parallel hydraulic systems such as the cooling system of a power boiler's auxiliary devices or a closed cooling system including condensers and cooling towers. The analytical approach is based on a set of non-linear algebraic equations solved using numerical techniques. As a result of the iterative process, a set of volumetric flow rates of water through all the branches of the investigated hydraulic system is obtained. The calculations indicate the influence of changes in the pipeline's geometrical parameters on the total cooling water flow rate in the analysed installation. Such an approach makes it possible to analyse different variants of the modernization of the studied systems, as well as allowing for the indication of their critical elements. Based on these results, an investor can choose the variant of reconstructing the installation that is optimal from the economic point of view. As examples of such calculations, two hydraulic installations are described. One is a boiler auxiliary cooling installation including two screw ash coolers. The other is a closed cooling system consisting of cooling towers and condensers.
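    The iterative solution of such a network can be sketched for the simplest case: parallel branches obeying a quadratic resistance law dh = k*Q^2 and sharing a common head loss. The code below (an illustration of the principle, not the paper's model) brackets and bisects on the head loss until the branch flows sum to the required total:

```python
import math

def branch_flow(dh, k):
    """Flow through a branch with resistance law dh = k * Q**2."""
    return math.sqrt(dh / k)

def solve_parallel(q_total, ks, tol=1e-10):
    """Find the common head loss across parallel branches such that
    the branch flows sum to the required total flow (bisection)."""
    lo, hi = 0.0, 1.0
    # expand the bracket until it contains the solution
    while sum(branch_flow(hi, k) for k in ks) < q_total:
        hi *= 2.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if sum(branch_flow(mid, k) for k in ks) < q_total:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

ks = [4.0, 1.0]               # hypothetical branch resistances
dh = solve_parallel(3.0, ks)  # total flow of 3 units
flows = [branch_flow(dh, k) for k in ks]
```

For these resistances the flow splits 1:2 between the branches, and changing a branch's k (e.g. after a pipeline modification) immediately shifts the split, which is the kind of what-if analysis the paper describes.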

  16. Testing Homeopathy in Mouse Emotional Response Models: Pooled Data Analysis of Two Series of Studies

    Directory of Open Access Journals (Sweden)

    Paolo Bellavite

    2012-01-01

    Full Text Available Two previous investigations were performed to assess the activity of Gelsemium sempervirens (Gelsemium s.) in mice, using emotional response models. These two series are pooled and analysed here. Gelsemium s. in various homeopathic centesimal dilutions/dynamizations (4C, 5C, 7C, 9C, and 30C), a placebo (solvent vehicle), and the reference drugs diazepam (1 mg/kg body weight) or buspirone (5 mg/kg body weight) were delivered intraperitoneally to groups of albino CD1 mice, and their effects on animal behaviour were assessed by the light-dark (LD) choice test and the open-field (OF) exploration test. Up to 14 separate replications were carried out in fully blind and randomised conditions. Pooled analysis demonstrated highly significant effects of Gelsemium s. 5C, 7C, and 30C on the OF parameter “time spent in central area” and of Gelsemium s. 5C, 9C, and 30C on the LD parameters “time spent in lit area” and “number of light-dark transitions,” without any sedative action or adverse effects on locomotion. This pooled data analysis confirms and reinforces the evidence that Gelsemium s. regulates emotional responses and behaviour of laboratory mice in a nonlinear fashion with dilution/dynamization.

  17. Co-evolution of intelligent socio-technical systems modelling and applications in large scale emergency and transport domains

    CERN Document Server

    2013-01-01

    As the interconnectivity between humans through technical devices is becoming ubiquitous, the next step is already in the making: ambient intelligence, i.e. smart (technical) environments, which will eventually play the same active role in communication as the human players, leading to a co-evolution in all domains where real-time communication is essential. This topical volume, based on the findings of the Socionical European research project, gives equal attention to two highly relevant domains of applications: transport, specifically traffic, dynamics from the viewpoint of a socio-technical interaction and evacuation scenarios for large-scale emergency situations. Care was taken to investigate as much as possible the limits of scalability and to combine the modeling using complex systems science approaches with relevant data analysis.

  18. The effectiveness of agrobusiness technical training and education model for the field agricultural extension officers

    Directory of Open Access Journals (Sweden)

    Kristiyo Sumarwono

    2017-07-01

    Full Text Available The study aimed to: (1) find the most effective agrobusiness technical training and education model for the Field Agricultural Extension Officers to be implemented; and (2) identify the knowledge level, the highest agrobusiness skills and the strongest self-confidence that might be achieved by the participants through the implemented training and education patterns. The study was conducted by means of the experiment method, with the regular pattern of the training and education program as the control and the mentoring pattern of the training and education program as the treatment. The three patterns of training and education programs served as the independent variables, while the knowledge, the skills and the self-confidence served as the dependent variables. The study was conducted in three locations, namely: the Institution of Agricultural Human Resources Development in the Province of Yogyakarta Special Region (Balai Pengembangan Sumber Daya Manusia Pertanian Daerah Istimewa Yogyakarta – BPSMP DIY); the Institution of Agricultural Human Resources Empowerment (Balai Pemberdayaan Sumber Daya Manusia Pertanian – BPSDMTAN) in Soropadan, Temanggung, the Province of Central Java; and the Institution of Training and Education in Semarang, the Province of Central Java (Badan Pendidikan dan Pelatihan Semarang Provinsi Jawa Tengah). The study involved all of the participants who attended the agrobusiness technical training and education program and, therefore, all of the participants became the subjects of the study. The study was conducted from October 2013 until March 2014. The results of the study showed that: (1) there was no significant difference in the knowledge and the skills between the participants who attended the regular pattern of training and education programs and those who attended the mentoring pattern of training and education programs; (2) the regular pattern in training and education programs

  19. Developing and Improving Student Non-Technical Skills in IT Education: A Literature Review and Model

    Directory of Open Access Journals (Sweden)

    Marcia Hagen

    2016-06-01

    Full Text Available The purpose of this paper is to identify portions of the literature in the areas of Information Technology (IT) management, skills development, and curriculum development that support the design of a holistic conceptual framework for instruction in non-technical skills within the IT higher education context. This literature review provides a framework for understanding how the critical success factors related to IT and Information Systems (IS) professional success are impacted by developing students' non-technical skills. The article culminates in a holistic conceptual framework for developing non-technical skills within the IT higher education context. Implications for theory and research are provided.

  20. Random Regression Forest Model using Technical Analysis Variables: An application on Turkish Banking Sector in Borsa Istanbul (BIST)

    OpenAIRE

    Senol Emir; Hasan Dincer; Umit Hacioglu; Serhat Yuksel

    2016-01-01

    The purpose of this study is to explore the importance and ranking of technical analysis variables in Turkish banking sector. Random Forest method is used for determining importance scores of inputs for eight banks in Borsa Istanbul. Then two predictive models utilizing Random Forest (RF) and Artificial Neural Networks (ANN) are built for predicting BIST-100 index and bank closing prices. Results of the models are compared by three metrics namely Mean Absolute Error (MAE), Mean Square Error (...
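    Model comparison by MAE and MSE, as mentioned above, is straightforward to sketch; the price series and model predictions below are hypothetical:

```python
def mae(actual, predicted):
    """Mean absolute error."""
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

def mse(actual, predicted):
    """Mean squared error."""
    return sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual)

# hypothetical closing prices vs. predictions from two competing models
actual  = [10.0, 12.0, 11.0, 13.0]
model_a = [10.5, 11.5, 11.5, 12.5]
model_b = [9.0, 13.0, 10.0, 14.0]

# lower is better on both metrics
scores = {"A": (mae(actual, model_a), mse(actual, model_a)),
          "B": (mae(actual, model_b), mse(actual, model_b))}
```

Note that MSE penalizes occasional large misses more heavily than MAE, so the two metrics can rank models differently.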

  1. Models for the control of electric actuators (EGEM, electrical engineering series); Modeles pour la commande des actionneurs electriques (Traite EGEM, serie Genie electrique)

    Energy Technology Data Exchange (ETDEWEB)

    Louis, J.P.

    2004-07-01

    The modeling of a system to be automatized is a key step in the determination of its control laws, because these laws are based on inverse models deduced from direct models. The ideal example is the DC actuator, whose simplicity allows one to move directly from the model to the control law. For AC actuators, the modeling tools are based on the classical hypotheses: linearity, first harmonics, symmetry. They lead to very efficient models which allow one to study the properties, in dynamic and steady-state regimes, of the most important actuators: synchronous motors, asynchronous motors, voltage inverters. Some extensions to other kinds of machines which do not fulfill the classical hypotheses are also proposed: synchronous machines with non-sinusoidal field distribution and asynchronous machines in the saturated regime. (J.S.)
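    The remark that control laws follow from inverse models is easiest to see for the DC actuator. A minimal steady-state sketch (idealized motor equations with hypothetical parameter values, not the book's material):

```python
def dc_motor_voltage(omega_ref, load_torque, R, K):
    """Steady-state inverse model of a DC motor: the voltage needed to
    hold speed omega_ref (rad/s) against a given load torque (N*m).

    Electrical:  V = R*i + K*omega     Mechanical (steady state):  K*i = T_load
    """
    i = load_torque / K              # current imposed by the load
    return R * i + K * omega_ref     # resistive drop plus back-EMF

def dc_motor_speed(V, load_torque, R, K):
    """Direct model: steady-state speed reached for a given voltage."""
    i = load_torque / K
    return (V - R * i) / K

# inverse model gives the command, direct model confirms the response
V = dc_motor_voltage(100.0, 0.5, R=1.0, K=0.1)
omega = dc_motor_speed(V, 0.5, R=1.0, K=0.1)
```

Inverting the direct model in this way is exactly the step that becomes non-trivial for AC machines, which is why the classical hypotheses are needed there.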

  2. Nonlinear Time Series and Neural-Network Models of Exchange Rates between the US Dollar and Major Currencies

    Directory of Open Access Journals (Sweden)

    David E. Allen

    2016-03-01

    Full Text Available This paper features an analysis of major currency exchange rate movements in relation to the US dollar, as constituted in US dollar terms. The euro, British pound, Chinese yuan, and Japanese yen are modelled using a variety of non-linear models, including smooth transition regression models, logistic smooth transition regression models, threshold autoregressive models, nonlinear autoregressive models, and additive nonlinear autoregressive models, plus neural network models. The models are evaluated on the basis of error metrics for twenty-day out-of-sample forecasts using the mean absolute percentage error (MAPE). The results suggest that there is no dominating class of time series models, and the different currency pairs' relationships with the US dollar are captured best by neural net regression models over the ten-year sample of daily exchange rate returns data, from August 2005 to August 2015.
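    The MAPE criterion used to score the forecasts can be sketched as follows; the rate levels and forecasts are made up:

```python
def mape(actual, forecast):
    """Mean absolute percentage error over an out-of-sample window."""
    return 100.0 / len(actual) * sum(
        abs((a - f) / a) for a, f in zip(actual, forecast))

# hypothetical daily exchange-rate levels and one-step-ahead forecasts
rates    = [1.10, 1.12, 1.08, 1.10]
forecast = [1.11, 1.10, 1.09, 1.08]
err = mape(rates, forecast)
```

Because MAPE is scale-free, it allows forecasts for currency pairs of very different magnitudes to be compared on one footing.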

  3. Fuzzy time series forecasting model with natural partitioning length approach for predicting the unemployment rate under different degree of confidence

    Science.gov (United States)

    Ramli, Nazirah; Mutalib, Siti Musleha Ab; Mohamad, Daud

    2017-08-01

    Fuzzy time series forecasting model has been proposed since 1993 to cater for data in linguistic values. Many improvement and modification have been made to the model such as enhancement on the length of interval and types of fuzzy logical relation. However, most of the improvement models represent the linguistic term in the form of discrete fuzzy sets. In this paper, fuzzy time series model with data in the form of trapezoidal fuzzy numbers and natural partitioning length approach is introduced for predicting the unemployment rate. Two types of fuzzy relations are used in this study which are first order and second order fuzzy relation. This proposed model can produce the forecasted values under different degree of confidence.
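    A trapezoidal fuzzy number, as used in the proposed model, is defined by four break points (a, b, c, d). A minimal membership-function sketch in Python (the linguistic term and its spread below are hypothetical):

```python
def trapezoidal_membership(x, a, b, c, d):
    """Membership degree of x in the trapezoidal fuzzy number (a, b, c, d):
    rises linearly on [a, b], equals 1 on [b, c], falls linearly on [c, d]."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (d - x) / (d - c)

# a linguistic term such as "unemployment around 4-5%" (hypothetical spread)
def around_4_to_5(x):
    return trapezoidal_membership(x, 3.5, 4.0, 5.0, 5.5)
```

Cutting such a membership function at a chosen level is one way to express the "different degrees of confidence" under which the forecasts are produced.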

  4. Prototype design for a predictive model to improve evacuation operations : technical report.

    Science.gov (United States)

    2011-08-01

    Mass evacuations of the Texas Gulf Coast remain a difficult challenge. These events are massive in scale, highly complex, and entail an intricate, ever-changing conglomeration of technical and jurisdictional issues. This project focused primarily...

  5. Test performance of the QSE series of 5 cm aperture quadrupole model magnets

    International Nuclear Information System (INIS)

    Archer, B.; Bein, D.; Cunningham, G.; DiMarco, J.; Gathright, T.; Jayakumar, J.; LaBarge, A.; Li, W.; Lambert, D.; Scott, M.

    1994-01-01

    A 5 cm aperture quadrupole design, the QSE series of magnets were the first to be tested in the Short Magnet and Cable Test Laboratory (SMCTL) at the SSCL. Test performance of the first two magnets of the series are presented, including quench performance, quench localization, strain gage readings, and magnetic measurements. Both magnets behaved reasonably well with no quenches below the collider operating current, four training quenches to plateau, and good training memory between thermal cycles. Future magnets in the QSE series will be used to reduce the initial training and to tune out unwanted magnetic harmonics

  6. Test performance of the QSE series of 5 cm aperture quadrupole model magnets

    International Nuclear Information System (INIS)

    Archer, B.; Bein, D.; Cunningham, G.; DiMarco, J.; Gathright, T.; Jayakumar, J.; Labarge, A.; Li, W.; Lambert, D.; Scott, M.; Snitchler, G.; Zeigler, R.

    1993-04-01

    A 5 cm aperture quadrupole design, the QSE series of magnets were the first to be tested in the Short Magnet and Cable Test Laboratory (SMCTL) at the SSCL. Test performance of the first two magnets of the series are presented, including quench performance, quench localization, strain gage readings, and magnetic measurements. Both magnets behaved reasonably well with no quenches below the collider operating current, four training quenches to plateau, and good training memory between thermal cycles. Future magnets in the QSE series will be used to reduce the initial training and to tune out unwanted magnetic harmonics

  7. DESIGNING AND BUILDING EXERCISE MODEL OF TECHNICAL ENGLISH VOCABULARIES USING CALL (COMPUTER ASSISTED LANGUAGE LEARNING

    Directory of Open Access Journals (Sweden)

    Yogi Widiawati

    2017-11-01

    Full Text Available The research aims to assist students of the Electrical and Electronics Department of Politeknik Negeri Jakarta (State Polytechnic of Jakarta), Indonesia, in learning technical English vocabulary. As technical students they study ESP (English for Specific Purposes) and face obstacles in memorizing the technical vocabulary, such as "generate", "pile", and "bench", that is essential for reading and understanding the manuals used in laboratories and workshops. The research outcome is software for practicing technical vocabulary skills, designed with Rapid E-Learning Suite Version 5.2 and Flash CS3 and intended especially for electrical and electronics students. The software consists of 3 (three) sections of exercises, each containing reading texts and multiple-choice comprehension questions, and it is handy and portable: saved on a CD (compact disc), it can be taken anywhere and studied anytime, giving students the flexibility to learn and practice technical English vocabulary wherever they are. At the end of each exercise the students are evaluated automatically by the scoring system, which encourages them to repeat the exercises until they achieve a good score, so that technical words are no longer a problem for them. Furthermore, the students can practice technical English vocabulary both at home and

  8. Time-dependent deformation source model of Kilauea volcano obtained via InSAR time series and inversion modeling

    Science.gov (United States)

    Zhai, G.; Shirzaei, M.

    2014-12-01

    The Kilauea volcano, Hawaii Island, is one of the most active volcanoes worldwide. Its complex system, including magma reservoirs and rift zones, provides a unique opportunity to investigate the dynamics of magma transport and supply. The relatively shallow magma reservoir beneath the caldera stores magma prior to eruption at the caldera or migration to the rift zones. Additionally, the temporally variable pressure in the magma reservoir causes changes in the stress field, driving dike propagation and occasional intrusions at the eastern rift zone. Thus, constraining the time-dependent evolution of the magma reservoir plays an important role in understanding magma processes such as supply, storage, transport and eruption. The recent development of space-based monitoring technology, InSAR (interferometric synthetic aperture radar), allows the detection of subtle surface deformation at high spatial resolution and accuracy. In order to understand the dynamics of the magma chamber at the Kilauea summit area and the associated stress field, we explored SAR data sets acquired in two overlapping tracks of Envisat SAR data during the period 2003-2010. The combined InSAR time series includes 100 samples measuring summit deformation at unprecedented spatiotemporal resolution. To investigate the source of the summit deformation field, we propose a novel time-dependent inverse modelling approach to constrain the dynamics of the reservoir volume change within the summit magma reservoir in three dimensions. In conjunction with seismic and gas data sets, the obtained time-dependent model can resolve the temporally variable relation between shallow and deep reservoirs, as well as their connection to the rift zone via stress changes. The data and model improve the understanding of the Kilauea plumbing system, the physics of eruptions, and the mechanics of rift intrusions, and enhance eruption forecast models.
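    Because surface displacement is linear in the reservoir volume change for a point-pressure (Mogi) source, each epoch of an InSAR time series can be inverted for the volume change by linear least squares. The sketch below illustrates this with a made-up 1-D profile and synthetic geometry; it is not the authors' model setup or data.

```python
import numpy as np

# Mogi point source: uplift u_z = (1 - nu)/pi * dV * d / (d^2 + r^2)^(3/2),
# so u_z is linear in the volume change dV (illustrative geometry below).
nu, depth = 0.25, 3000.0                    # Poisson ratio, source depth [m]
x = np.linspace(-8000.0, 8000.0, 41)        # pixel offsets from the summit [m]
g = (1 - nu) / np.pi * depth / (depth**2 + x**2) ** 1.5  # uplift per m^3 of dV

dv_true = np.array([1e6, 2.5e6, -0.5e6])    # synthetic volume-change history [m^3]
rng = np.random.default_rng(2)
obs = np.outer(dv_true, g) + rng.normal(0, 1e-4, (dv_true.size, g.size))  # noisy uplift [m]

# Epoch-by-epoch linear least squares: dV = (g . d) / (g . g)
dv_est = obs @ g / (g @ g)
```

    The same one-line inversion applies per epoch however long the time series is, which is why the problem stays tractable even with 100 InSAR samples.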

  9. A Technical Analysis Information Fusion Approach for Stock Price Analysis and Modeling

    Science.gov (United States)

    Lahmiri, Salim

    In this paper, we address the problem of technical analysis information fusion for improving stock market index-level prediction. We present an approach for analyzing stock market price behavior based on different categories of technical analysis metrics and a multiple predictive system. Each category of technical analysis measures is used to characterize stock market price movements. The predictive system is based on an ensemble of neural networks (NN) coupled with particle swarm intelligence for parameter optimization, where each single neural network is trained with a specific category of technical analysis measures. The experimental evaluation on three international stock market indices and three individual stocks shows that the presented ensemble-based technical indicator fusion system significantly improves forecasting accuracy in comparison with a single NN. It also outperforms the classical neural network trained with index-level lagged values and the NN trained with stationary wavelet transform details and approximation coefficients. As a result, technical information fusion in an NN ensemble architecture helps improve prediction accuracy.
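    The category-wise fusion idea can be sketched as follows. Simple ridge regressors stand in for the paper's PSO-tuned neural networks, the indicator "categories" are reduced to one toy feature each, and the prices are synthetic; only the architecture (one predictor per category, fused by averaging) mirrors the abstract.

```python
import numpy as np

rng = np.random.default_rng(1)
price = 100 + np.cumsum(rng.normal(0.1, 1.0, 400))   # synthetic price path

def ridge(X, y, X_new, lam=1e-2):
    """Ridge regression: the stand-in for one trained predictor."""
    w = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)
    return X_new @ w

def indicators(p, i):
    """One toy feature per technical-analysis category."""
    window = p[i - 10:i]
    return (p[i - 1] - p[i - 10],   # momentum
            window.mean(),          # trend / moving average
            window.std())           # volatility

rows = np.arange(10, price.size - 1)
target = price[rows + 1] - price[rows]               # next-day return
feats = np.array([indicators(price, i) for i in rows])

split = 300
preds = []
for k in range(3):                                   # one predictor per category
    X = np.column_stack([feats[:, k], np.ones(rows.size)])
    preds.append(ridge(X[:split], target[:split], X[split:]))
fused = np.mean(preds, axis=0)                       # fusion by simple averaging
mae = np.abs(fused - target[split:]).mean()
```

    In the paper the averaging step and the per-category feature sets are of course far richer; the point here is only that each category trains its own model and the fusion happens at the prediction level.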

  10. High temperature series expansions for the susceptibility of the Ising model on the Kagome lattice with nearest neighbor interactions

    Directory of Open Access Journals (Sweden)

    Z Jalali mola

    2011-12-01

    Full Text Available The Ising model is one of the simplest models describing interacting particles. In this work, we calculate the high-temperature series expansions of the zero-field susceptibility of the Ising model with ferromagnetic, antiferromagnetic, and one-antiferromagnetic interactions on the two-dimensional Kagome lattice. Using the Padé approximation, we obtain the critical exponent of the susceptibility of the ferromagnetic Ising model, γ ≈ 1.75, which is consistent with the universality hypothesis. However, the antiferromagnetic and one-antiferromagnetic-interaction Ising models do not show any transition at finite temperature because of magnetic frustration.
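    The Dlog-Padé technique behind such exponent estimates can be illustrated in a few lines. The sketch below recovers a known exponent from series coefficients alone; the toy function chi(x) = (1 - x)^(-1.75) stands in for the actual Kagome-lattice series, which is not reproduced here.

```python
import numpy as np
from scipy.interpolate import pade

# Toy susceptibility with a known singularity: chi(x) = (1 - x)**(-gamma),
# gamma = 1.75 and critical point x_c = 1.
gamma_true, n_terms = 1.75, 12
chi = np.array([np.prod([(gamma_true + k) / (k + 1.0) for k in range(n)])
                for n in range(n_terms)])        # binomial series coefficients

# Series of the logarithmic derivative D(x) = chi'(x)/chi(x), by series division.
dchi = np.array([(n + 1) * chi[n + 1] for n in range(n_terms - 1)])
dlog = np.zeros(n_terms - 1)
for n in range(n_terms - 1):
    dlog[n] = (dchi[n] - sum(chi[k] * dlog[n - k] for k in range(1, n + 1))) / chi[0]

# Dlog-Pade: near the singularity D(x) ~ gamma/(x_c - x), so an [L/1] Pade
# approximant of D has a pole at x_c and gamma = p(x_c) / (-q'(x_c)).
p, q = pade(dlog, 1)                          # numerator p, denominator q (poly1d)
x_c = float(np.real(q.r[0]))                  # pole of the approximant
gamma_est = float(p(x_c) / -q.deriv()(x_c))   # minus the residue at the pole
```

    With real lattice series the coefficients are noisier and one compares a table of [L/M] approximants rather than trusting a single one, but the mechanics are the same.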

  11. Technical snow production in skiing areas: conditions, practice, monitoring and modelling. A case study in Mayrhofen/Austria

    Science.gov (United States)

    Strasser, Ulrich; Hanzer, Florian; Marke, Thomas; Rothleitner, Michael

    2017-04-01

    The production of technical snow is today a self-evident feature of modern alpine skiing resort management. In almost every larger destination in the Alps, millions of Euros are invested every year in the technical infrastructure and its operation to produce a homogeneous and continuous snow cover on the skiing slopes for the winter season. In Austria, skiing tourism is a significant factor in the national economic structure. We present the framing conditions of technical snow production in the mid-size skiing resort of Mayrhofen (Zillertal Alps/Austria, 136 km of slopes, elevation range 630-2,500 m a.s.l.). Production conditions are defined by the availability of water, the planned date for the season opening, and the climatic conditions in the weeks before. An adapted snow production strategy attempts to optimize the use of water and energy resources both ecologically and economically. Monitoring of the snow cover is supported by a network of low-cost sensors and mobile snow depth recordings. Finally, technical snow production is simulated with the spatially distributed, physically based hydroclimatological model AMUNDSEN. The model explicitly considers individual snow guns and distributes the produced snow along the slopes. The amount of simulated snow produced by each device is a function of its type, of the actual wet-bulb temperature at its location, of the ski area infrastructure (in terms of water supply and pumping capacity), and of snow demand.
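    The wet-bulb dependence mentioned above can be illustrated with Stull's (2011) empirical wet-bulb formula. The production rule and all its numbers below (threshold, maximum rate, scaling) are purely hypothetical stand-ins, not AMUNDSEN's actual snow-gun parameterization.

```python
import math

def wet_bulb_stull(t_c: float, rh_pct: float) -> float:
    """Empirical wet-bulb temperature (Stull, 2011); T in deg C, RH in %."""
    return (t_c * math.atan(0.151977 * math.sqrt(rh_pct + 8.313659))
            + math.atan(t_c + rh_pct)
            - math.atan(rh_pct - 1.676331)
            + 0.00391838 * rh_pct ** 1.5 * math.atan(0.023101 * rh_pct)
            - 4.686035)

def snow_production_m3h(t_c, rh_pct, threshold_c=-2.0, max_rate=30.0, scale_c=8.0):
    """Hypothetical rule: a gun runs only below a wet-bulb threshold, with
    output scaled linearly by the threshold deficit, capped at max_rate."""
    tw = wet_bulb_stull(t_c, rh_pct)
    if tw >= threshold_c:
        return 0.0
    return min(max_rate, max_rate * (threshold_c - tw) / scale_c)
```

    The key property this captures is that production depends jointly on air temperature and humidity: dry air at 0 deg C can still yield a wet-bulb temperature cold enough for snowmaking, while humid air at the same temperature cannot.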

  12. Time Series Analysis of a Principal-Agent Model to Assess Risk Shifting and Bargaining Power in Commodity Marketing Channels

    NARCIS (Netherlands)

    Kuiper, W.E.; Kuwornu, J.K.M.; Pennings, J.M.E.

    2003-01-01

    We apply the classic agency model to investigate risk shifting in an agricultural marketing channel, using time series analysis. We show that if the principal is risk-neutral and the agent is risk-averse instead of risk-neutral, then a linear contract can still be optimal if the fixed payment is

  13. THE EFFECT OF DECOMPOSITION METHOD AS DATA PREPROCESSING ON NEURAL NETWORKS MODEL FOR FORECASTING TREND AND SEASONAL TIME SERIES

    Directory of Open Access Journals (Sweden)

    Subanar Subanar

    2006-01-01

    Full Text Available Recently, one of the central topics for the neural network (NN) community has been the issue of data preprocessing for NN. In this paper, we investigate this topic, particularly the effect of the Decomposition method as data preprocessing on the use of NN for effectively modeling time series with both trend and seasonal patterns. The limited empirical studies on seasonal time series forecasting with neural networks disagree: some find that neural networks are able to model seasonality directly and that prior deseasonalization is not necessary, while others conclude just the opposite. In this research, we study the effectiveness of data preprocessing, including detrending and deseasonalization by applying the Decomposition method, on NN modeling and forecasting performance. We use two kinds of data, simulated and real. The simulated data cover multiplicative trend and seasonality patterns. The results are compared to those obtained from the classical time series model. Our results show that a combination of detrending and deseasonalization by applying the Decomposition method is an effective data preprocessing step for the use of NN in forecasting trend and seasonal time series.
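    The classical decomposition preprocessing the abstract refers to can be sketched in a few lines. The toy multiplicative series and the simple moving-average trend estimate below are illustrative choices, not the paper's actual data or network.

```python
import numpy as np

# Toy multiplicative series: trend * seasonality * noise, period 12.
rng = np.random.default_rng(0)
t = np.arange(120)
period = 12
y = ((10 + 0.5 * t)
     * (1 + 0.3 * np.sin(2 * np.pi * t / period))
     * rng.normal(1.0, 0.01, t.size))

# Step 1: estimate the trend with a one-period moving average.
trend = np.convolve(y, np.ones(period) / period, mode="same")

# Step 2: seasonal indices = per-phase means of the detrended ratios
# (edges are skipped because the moving average is unreliable there).
core = slice(period, y.size - period)
ratio = y[core] / trend[core]
phase = t[core] % period
seasonal = np.array([ratio[phase == ph].mean() for ph in range(period)])
seasonal /= seasonal.mean()        # normalize multiplicative indices to mean 1

# Step 3: the detrended, deseasonalized remainder is what the NN would be
# trained on; forecasts are re-trended and re-seasonalized afterwards.
remainder = y[core] / (trend[core] * seasonal[phase])
```

    If the decomposition works, the remainder hovers around 1 with no visible trend or seasonality, which is exactly the kind of input the cited studies found easier for a network to model.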

  14. Assimilating a synthetic Kalman filter leaf area index series into the WOFOST model to improve regional winter wheat yield estimation

    Science.gov (United States)

    The scale mismatch between remotely sensed observations and the state variables simulated by crop growth models decreases the reliability of crop yield estimates. To overcome this problem, we used a two-step data assimilation scheme: first, we generated a complete leaf area index (LAI) time series by combin...

  15. 75 FR 55455 - Airworthiness Directives; The Boeing Company Model 737-700 (IGW) Series Airplanes Equipped With...

    Science.gov (United States)

    2010-09-13

    ... should address the following actions. (1) Permanently drain auxiliary fuel tanks, and clear them of fuel... them at the pneumatic source, and secure them. (4) Disconnect all fuel feed and fuel vent plumbing... Airworthiness Directives; The Boeing Company Model 737-700 (IGW) Series Airplanes Equipped With Auxiliary Fuel...

  16. Two-parameter double-oscillator model of Mathews-Lakshmanan type: Series solutions and supersymmetric partners

    Energy Technology Data Exchange (ETDEWEB)

    Schulze-Halberg, Axel, E-mail: axgeschu@iun.edu, E-mail: xbataxel@gmail.com [Department of Mathematics and Actuarial Science and Department of Physics, Indiana University Northwest, 3400 Broadway, Gary, Indiana 46408 (United States); Wang, Jie, E-mail: wangjie@iun.edu [Department of Computer Information Systems, Indiana University Northwest, 3400 Broadway, Gary, Indiana 46408 (United States)

    2015-07-15

    We obtain series solutions, the discrete spectrum, and supersymmetric partners for a quantum double-oscillator system. Its potential features a superposition of the one-parameter Mathews-Lakshmanan interaction and a one-parameter harmonic or inverse harmonic oscillator contribution. Furthermore, our results are transferred to a generalized Pöschl-Teller model that is isospectral to the double-oscillator system.

  17. 76 FR 5061 - Special Conditions: TTF Aerospace, LLC, Modification to Boeing Model 767-300 Series Airplanes...

    Science.gov (United States)

    2011-01-28

    ..., will have a novel or unusual design features associated with the pilot lower lobe crew rest module (CRM... people to take part in this rulemaking by sending written comments, data, or views. The most helpful...) for installation of a lower lobe pilot crew rest module (CRM) in Boeing Model 767-300 series airplanes...

  18. Two-parameter double-oscillator model of Mathews-Lakshmanan type: Series solutions and supersymmetric partners

    International Nuclear Information System (INIS)

    Schulze-Halberg, Axel; Wang, Jie

    2015-01-01

    We obtain series solutions, the discrete spectrum, and supersymmetric partners for a quantum double-oscillator system. Its potential features a superposition of the one-parameter Mathews-Lakshmanan interaction and a one-parameter harmonic or inverse harmonic oscillator contribution. Furthermore, our results are transferred to a generalized Pöschl-Teller model that is isospectral to the double-oscillator system.

  19. Detecting geothermal anomalies and evaluating LST geothermal component by combining thermal remote sensing time series and land surface model data

    NARCIS (Netherlands)

    Romaguera, M.; Vaughan, R. G.; Ettema, J.; Izquierdo-Verdiguier, E.; Hecker, C. A.; van der Meer, F. D.

    2017-01-01

    This paper explores for the first time the possibilities of using two land surface temperature (LST) time series of different origins (geostationary Meteosat Second Generation satellite data and Noah land surface modelling, LSM) to detect geothermal anomalies and extract the geothermal component of

  20. 78 FR 76252 - Special Conditions: Airbus, Model A350-900 Series Airplane; Isolation or Protection of the...

    Science.gov (United States)

    2013-12-17

    ....dot.gov/ . Docket: Background documents or comments received may be read at http://www.regulations.gov.... Background On August 25, 2008, Airbus applied for a type certificate for their new Model A350-900 series... corruption of data and systems critical to the safety and maintenance of the airplane. The existing...