WorldWideScience

Sample records for stochastic models calibrated

  1. Calibration of a stochastic health evolution model using NHIS data

    Science.gov (United States)

    Gupta, Aparna; Li, Zhisheng

    2011-10-01

    This paper presents and calibrates an individual's stochastic health evolution model. In this health evolution model, the uncertainty of health incidents is described by a stochastic process with a finite number of possible outcomes. We construct a comprehensive health status index (HSI) to describe an individual's health status, as well as a health risk factor system (RFS) to classify individuals into different risk groups. Based on the maximum likelihood estimation (MLE) method and the method of nonlinear least squares fitting, model calibration is formulated in terms of two mixed-integer nonlinear optimization problems. Using the National Health Interview Survey (NHIS) data, the model is calibrated for specific risk groups. Longitudinal data from the Health and Retirement Study (HRS) is used to validate the calibrated model, which displays good validation properties. The end goal of this paper is to provide a model and methodology, whose output can serve as a crucial component of decision support for strategic planning of health related financing and risk management.
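
    The calibration above couples MLE and nonlinear least-squares fitting inside mixed-integer programs; as a much smaller illustration of the MLE ingredient alone, the sketch below estimates transition probabilities of a finite-outcome health process from simulated state sequences. The states, counts, and probability values are invented for the example and are not from the paper.

```python
# Illustrative sketch only: maximum-likelihood estimation of transition
# probabilities for a finite-outcome health process from observed sequences.
# This is not the paper's mixed-integer formulation.
import numpy as np

def mle_transition_matrix(sequences, n_states):
    """Count observed transitions across all sequences and normalize rows."""
    counts = np.zeros((n_states, n_states))
    for seq in sequences:
        for s, s_next in zip(seq[:-1], seq[1:]):
            counts[s, s_next] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    row_sums[row_sums == 0] = 1.0          # avoid division by zero
    return counts / row_sums

# Hypothetical states: 0 = healthy, 1 = minor incident, 2 = major incident.
rng = np.random.default_rng(0)
true_P = np.array([[0.90, 0.08, 0.02],
                   [0.60, 0.30, 0.10],
                   [0.30, 0.30, 0.40]])
sequences = []
for _ in range(200):
    state, seq = 0, [0]
    for _ in range(50):
        state = rng.choice(3, p=true_P[state])
        seq.append(state)
    sequences.append(seq)

print(np.round(mle_transition_matrix(sequences, 3), 2))  # close to true_P
```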

  2. Stochastic isotropic hyperelastic materials: constitutive calibration and model selection

    Science.gov (United States)

    Mihai, L. Angela; Woolley, Thomas E.; Goriely, Alain

    2018-03-01

    Biological and synthetic materials often exhibit intrinsic variability in their elastic responses under large strains, owing to microstructural inhomogeneity or when elastic data are extracted from viscoelastic mechanical tests. For these materials, although hyperelastic models calibrated to mean data are useful, stochastic representations accounting also for data dispersion carry extra information about the variability of material properties found in practical applications. We combine finite elasticity and information theories to construct homogeneous isotropic hyperelastic models with random field parameters calibrated to discrete mean values and standard deviations of either the stress-strain function or the nonlinear shear modulus, which is a function of the deformation, estimated from experimental tests. These quantities can take on different values, corresponding to possible outcomes of the experiments. As multiple models can be derived that adequately represent the observed phenomena, we apply Occam's razor by providing an explicit criterion for model selection based on Bayesian statistics. We then employ this criterion to select a model among competing models calibrated to experimental data for rubber and brain tissue under single or multiaxial loads.

  3. Stochastic Modeling of Overtime Occupancy and Its Application in Building Energy Simulation and Calibration

    Energy Technology Data Exchange (ETDEWEB)

    Sun, Kaiyu; Yan, Da; Hong, Tianzhen; Guo, Siyue

    2014-02-28

    Overtime is a common phenomenon around the world. Overtime drives both internal heat gains from occupants, lighting and plug-loads, and HVAC operation during overtime periods. Overtime leads to longer occupancy hours and extended operation of building services systems beyond normal working hours, thus overtime impacts total building energy use. Current literature lacks methods to model overtime occupancy because overtime is stochastic in nature and varies by individual occupants and by time. To address this gap in the literature, this study aims to develop a new stochastic model based on the statistical analysis of measured overtime occupancy data from an office building. A binomial distribution is used to represent the total number of occupants working overtime, while an exponential distribution is used to represent the duration of overtime periods. The overtime model is used to generate overtime occupancy schedules as an input to the energy model of a second office building. The measured and simulated cooling energy use during the overtime period is compared in order to validate the overtime model. A hybrid approach to energy model calibration is proposed and tested, which combines ASHRAE Guideline 14 for the calibration of the energy model during normal working hours, and a proposed KS test for the calibration of the energy model during overtime. The developed stochastic overtime model and the hybrid calibration approach can be used in building energy simulations to improve the accuracy of results, and better understand the characteristics of overtime in office buildings.
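
    A minimal sketch of the two building blocks named above, with made-up parameter values: a binomial draw for how many occupants work overtime, exponential draws for how long they stay, and a two-sample Kolmogorov-Smirnov test against a stand-in "measured" sample as a calibration check. The office size, probability, and mean duration are assumptions for illustration only.

```python
# Sketch of the binomial/exponential overtime model plus a KS comparison;
# all parameter values and the "measured" data below are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

n_occupants = 80          # assumed office population
p_overtime = 0.15         # assumed probability an occupant works overtime
mean_duration_h = 1.8     # assumed mean overtime duration in hours

def simulate_overtime_day(rng):
    n_stay = rng.binomial(n_occupants, p_overtime)       # how many stay late
    return rng.exponential(mean_duration_h, size=n_stay)  # how long they stay

simulated = np.concatenate([simulate_overtime_day(rng) for _ in range(250)])
measured = rng.exponential(1.7, size=1000)   # stand-in for measured durations

stat, p_value = stats.ks_2samp(simulated, measured)
print(f"KS statistic = {stat:.3f}, p-value = {p_value:.3f}")
```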

  4. Hydrological model calibration for derived flood frequency analysis using stochastic rainfall and probability distributions of peak flows

    Science.gov (United States)

    Haberlandt, U.; Radtke, I.

    2014-01-01

Derived flood frequency analysis allows the estimation of design floods with hydrological modeling for poorly observed basins considering change and taking into account flood protection measures. There are several possible choices regarding precipitation input, discharge output and consequently the calibration of the model. The objective of this study is to compare different calibration strategies for a hydrological model considering various types of rainfall input and runoff output data sets and to propose the most suitable approach. Event-based and continuous observed hourly rainfall data as well as disaggregated daily rainfall and stochastically generated hourly rainfall data are used as input for the model. As output, short hourly and longer daily continuous flow time series as well as probability distributions of annual maximum peak flow series are employed. The performance of the strategies is evaluated using the different model parameter sets obtained for continuous simulation of discharge in an independent validation period and by comparing the model-derived flood frequency distributions with the observed one. The investigations are carried out for three mesoscale catchments in northern Germany with the hydrological model HEC-HMS (Hydrologic Engineering Center's Hydrologic Modeling System). The results show that (I) the same type of precipitation input data should be used for calibration and application of the hydrological model, (II) a model calibrated using a small sample of extreme values works quite well for the simulation of continuous time series with moderate length but not vice versa, and (III) the best performance with small uncertainty is obtained when stochastic precipitation data and the observed probability distribution of peak flows are used for model calibration. This outcome suggests calibrating a hydrological model directly on probability distributions of observed peak flows using stochastic rainfall as input if its purpose is the application for derived flood frequency analysis.

  5. Electricity price modeling with stochastic time change

    International Nuclear Information System (INIS)

    Borovkova, Svetlana; Schmeck, Maren Diane

    2017-01-01

In this paper, we develop a novel approach to electricity price modeling, based on the powerful technique of stochastic time change. This technique allows us to incorporate the characteristic features of electricity prices (such as seasonal volatility, time-varying mean reversion and seasonally occurring price spikes) into the model in an elegant and economically justifiable way. The stochastic time change introduces stochastic as well as deterministic (e.g., seasonal) features in the price process' volatility and in the jump component. We specify the base process as a mean-reverting jump diffusion and the time change as an absolutely continuous stochastic process with a seasonal component. The activity rate of the stochastic time change can be related to the factors that influence supply and demand. Here we use temperature as a proxy for demand and hence as the driving factor of the stochastic time change, and show that this choice leads to realistic price paths. We derive properties of the resulting price process and develop the model calibration procedure. We calibrate the model to the historical EEX power prices and apply it to generating realistic price paths by Monte Carlo simulations. We show that the simulated price process matches the distributional characteristics of the observed electricity prices in periods of both high and low demand.
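
    As a schematic illustration of the idea, the sketch below runs an Euler scheme for a mean-reverting jump diffusion whose "business time" advances at a seasonal, temperature-like activity rate, so that volatility and jump intensity concentrate in high-demand periods. It is not the calibrated EEX model of the paper; every parameter value is invented.

```python
# Schematic simulation of a mean-reverting jump diffusion under a seasonal
# activity rate (a crude stand-in for the paper's stochastic time change).
# All parameter values are made up for illustration.
import numpy as np

rng = np.random.default_rng(1)
n_days, dt = 365, 1.0
kappa, mu, sigma = 0.3, 40.0, 2.0          # mean-reversion speed/level, base vol
jump_intensity, jump_mean = 0.02, 15.0     # base spike frequency and size

t = np.arange(n_days)
activity = 1.0 + 0.6 * np.cos(2 * np.pi * (t - 20) / 365) ** 2   # winter peak

price = np.empty(n_days)
price[0] = mu
for i in range(1, n_days):
    a = activity[i]                          # time-change rate for this step
    drift = kappa * (mu - price[i - 1]) * dt
    diffusion = sigma * np.sqrt(a * dt) * rng.standard_normal()
    jump = rng.exponential(jump_mean) if rng.random() < jump_intensity * a * dt else 0.0
    price[i] = price[i - 1] + drift + diffusion + jump

print(f"mean {price.mean():.1f}, max spike {price.max():.1f}")
```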

  6. Sparse calibration of subsurface flow models using nonlinear orthogonal matching pursuit and an iterative stochastic ensemble method

    KAUST Repository

    Elsheikh, Ahmed H.

    2013-06-01

We introduce a nonlinear orthogonal matching pursuit (NOMP) for sparse calibration of subsurface flow models. Sparse calibration is a challenging problem as the unknowns are both the non-zero components of the solution and their associated weights. NOMP is a greedy algorithm that discovers, at each iteration, the basis function most correlated with the residual from a large pool of basis functions. The discovered basis (aka support) is augmented across the nonlinear iterations. Once a set of basis functions is selected, the solution is obtained by applying Tikhonov regularization. The proposed algorithm relies on a stochastically approximated gradient using an iterative stochastic ensemble method (ISEM). In the current study, the search space is parameterized using an overcomplete dictionary of basis functions built using the K-SVD algorithm. The proposed algorithm is the first ensemble-based algorithm that tackles the sparse nonlinear parameter estimation problem. © 2013 Elsevier Ltd.
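
    The sketch below shows a linear analogue of the greedy selection plus Tikhonov step described above: pick the dictionary atom most correlated with the residual, then solve a Tikhonov-regularized least-squares problem on the selected support. The paper's NOMP works with a nonlinear forward model and ISEM-approximated gradients, which are not reproduced here; dictionary, signal, and regularization weight are synthetic.

```python
# Linear analogue of greedy atom selection followed by a Tikhonov solve;
# the nonlinear/ensemble machinery of NOMP is omitted on purpose.
import numpy as np

def omp_tikhonov(D, y, n_atoms, alpha=1e-2):
    """Greedy atom selection, then Tikhonov-regularized solve on the support."""
    residual, support = y.copy(), []
    for _ in range(n_atoms):
        correlations = np.abs(D.T @ residual)
        correlations[support] = -np.inf            # never pick an atom twice
        support.append(int(np.argmax(correlations)))
        Ds = D[:, support]
        w = np.linalg.solve(Ds.T @ Ds + alpha * np.eye(len(support)), Ds.T @ y)
        residual = y - Ds @ w
    coeffs = np.zeros(D.shape[1])
    coeffs[support] = w
    return coeffs, support

rng = np.random.default_rng(3)
D = rng.standard_normal((100, 400))                # overcomplete dictionary
D /= np.linalg.norm(D, axis=0)
x_true = np.zeros(400)
x_true[[5, 42, 300]] = [1.5, -2.0, 0.7]            # sparse ground truth
y = D @ x_true + 0.01 * rng.standard_normal(100)

coeffs, support = omp_tikhonov(D, y, n_atoms=3)
print(sorted(support))                             # expected: [5, 42, 300]
```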

  7. Calibration and simulation of Heston model

    Directory of Open Access Journals (Sweden)

    Mrázek Milan

    2017-05-01

Full Text Available We calibrate the Heston stochastic volatility model to real market data using several optimization techniques. We compare both global and local optimizers for different weights, showing remarkable differences even for data (DAX options) from two consecutive days. We provide a novel calibration procedure that incorporates the use of an approximation formula and significantly outperforms other existing calibration methods.
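
    To make the structure of such a calibration concrete, the sketch below minimizes a weighted squared error between market quotes and model prices with both a local and a global SciPy optimizer. The `model_price` function is a deliberately simple stand-in, not a Heston pricer (which would require a Fourier integral or an approximation formula), and all data and weights are synthetic.

```python
# Structure-only calibration sketch: weighted least squares over strikes,
# compared across a local (L-BFGS-B) and a global (differential evolution)
# optimizer. model_price() is a placeholder, not the Heston model.
import numpy as np
from scipy.optimize import minimize, differential_evolution

rng = np.random.default_rng(7)
strikes = np.linspace(80, 120, 15)

def model_price(params, strikes):
    a, b, c = params                      # stand-in parameters, not Heston's
    return a * np.exp(-b * (strikes - 100.0) ** 2 / 1e3) + c

true_params = (12.0, 3.0, 1.5)
market = model_price(true_params, strikes) + 0.05 * rng.standard_normal(strikes.size)
weights = 1.0 / np.maximum(market, 1e-6)  # one possible weighting choice

def objective(params):
    return np.sum(weights * (model_price(params, strikes) - market) ** 2)

bounds = [(0.1, 50.0), (0.1, 10.0), (-5.0, 5.0)]
local = minimize(objective, x0=[5.0, 1.0, 0.0], bounds=bounds, method="L-BFGS-B")
global_ = differential_evolution(objective, bounds, seed=7)
print(np.round(local.x, 2), np.round(global_.x, 2))
```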

  8. Stochastic forward and inverse groundwater flow and solute transport modeling

    NARCIS (Netherlands)

    Janssen, G.M.C.M.

    2008-01-01

    Keywords: calibration, inverse modeling, stochastic modeling, nonlinear biodegradation, stochastic-convective, advective-dispersive, travel time, network design, non-Gaussian distribution, multimodal distribution, representers

    This thesis offers three new approaches that contribute

  9. Stochastic calibration and learning in nonstationary hydroeconomic models

    Science.gov (United States)

    Maneta, M. P.; Howitt, R.

    2014-05-01

Concern about water scarcity and adverse climate events over agricultural regions has motivated a number of efforts to develop operational integrated hydroeconomic models to guide adaptation and optimal use of water. Once calibrated, these models are used for water management and analysis assuming they remain valid under future conditions. In this paper, we present and demonstrate a methodology that permits the recursive calibration of economic models of agricultural production from noisy but frequently available data. We use a standard economic calibration approach, namely positive mathematical programming (PMP), integrated in a data assimilation algorithm based on the ensemble Kalman filter (EnKF) equations to identify the economic model parameters. A moving average kernel ensures that new and past information on agricultural activity is blended during the calibration process, avoiding loss of information and overcalibration for the conditions of a single year. A regularization constraint akin to the standard Tikhonov regularization is included in the filter to ensure its stability even in the presence of parameters with low sensitivity to observations. The results show that the implementation of the PMP methodology within a data assimilation framework based on the EnKF equations is an effective method to calibrate models of agricultural production even with noisy information. The recursive nature of the method incorporates new information as an added value to the known previous observations of agricultural activity without the need to store historical information. The robustness of the method opens the door to the use of new remote sensing algorithms for operational water management.
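
    A bare-bones sketch of the EnKF parameter update that sits at the core of such a recursive calibration is given below; the PMP step, the moving-average kernel, and the Tikhonov-like constraint of the paper are omitted. The toy forward model and all numbers are hypothetical.

```python
# Minimal ensemble Kalman filter parameter update on a toy forward model.
# This illustrates the recursive-calibration idea only, not the paper's
# full PMP-plus-EnKF scheme.
import numpy as np

def enkf_update(params_ens, predicted_obs_ens, obs, obs_cov, rng):
    """params_ens: (n_ens, n_par); predicted_obs_ens: (n_ens, n_obs)."""
    P = params_ens - params_ens.mean(axis=0)
    Y = predicted_obs_ens - predicted_obs_ens.mean(axis=0)
    n_ens = params_ens.shape[0]
    cov_py = P.T @ Y / (n_ens - 1)                 # parameter-observation cov
    cov_yy = Y.T @ Y / (n_ens - 1) + obs_cov       # observation covariance
    K = cov_py @ np.linalg.inv(cov_yy)             # Kalman gain
    perturbed = obs + rng.multivariate_normal(np.zeros(len(obs)), obs_cov, n_ens)
    return params_ens + (perturbed - predicted_obs_ens) @ K.T

# Toy example: nudge an ensemble toward a scalar "yield response" parameter.
rng = np.random.default_rng(11)
true_theta = 2.5
forward = lambda theta: np.column_stack([3.0 * theta, theta ** 2])  # toy model
ens = rng.normal(1.0, 1.0, size=(100, 1))
obs = forward(np.array([true_theta]))[0] + rng.normal(0, 0.1, 2)
for _ in range(5):
    ens = enkf_update(ens, forward(ens[:, 0]), obs, 0.01 * np.eye(2), rng)
print(float(ens.mean()))   # roughly 2.5
```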

  10. Model Calibration in Option Pricing

    Directory of Open Access Journals (Sweden)

    Andre Loerx

    2012-04-01

    Full Text Available We consider calibration problems for models of pricing derivatives which occur in mathematical finance. We discuss various approaches such as using stochastic differential equations or partial differential equations for the modeling process. We discuss the development in the past literature and give an outlook into modern approaches of modelling. Furthermore, we address important numerical issues in the valuation of options and likewise the calibration of these models. This leads to interesting problems in optimization, where, e.g., the use of adjoint equations or the choice of the parametrization for the model parameters play an important role.

  11. Clustered iterative stochastic ensemble method for multi-modal calibration of subsurface flow models

    KAUST Repository

    Elsheikh, Ahmed H.

    2013-05-01

A novel multi-modal parameter estimation algorithm is introduced. Parameter estimation is an ill-posed inverse problem that might admit many different solutions. This is attributed to the limited amount of measured data used to constrain the inverse problem. The proposed multi-modal model calibration algorithm uses an iterative stochastic ensemble method (ISEM) for parameter estimation. ISEM employs an ensemble of directional derivatives within a Gauss-Newton iteration for nonlinear parameter estimation. ISEM is augmented with a clustering step based on the k-means algorithm to form sub-ensembles. These sub-ensembles are used to explore different parts of the search space. Clusters are updated at regular intervals of the algorithm to allow merging of close clusters approaching the same local minimum. Numerical testing demonstrates the potential of the proposed algorithm in dealing with multi-modal nonlinear parameter estimation for subsurface flow models. © 2013 Elsevier B.V.
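
    The sketch below isolates only the clustering step: splitting an ensemble of parameter vectors into sub-ensembles with k-means (here SciPy's `kmeans2`), so that each cluster could be updated toward a different local minimum. The Gauss-Newton/ISEM update itself is not reproduced, and the two-mode ensemble is synthetic.

```python
# Clustering an ensemble of parameter vectors into sub-ensembles with
# k-means; a sketch of the clustering step only, not of ISEM.
import numpy as np
from scipy.cluster.vq import kmeans2

rng = np.random.default_rng(5)
# Hypothetical ensemble drawn around two competing parameter modes.
ensemble = np.vstack([rng.normal([1.0, 2.0], 0.2, (50, 2)),
                      rng.normal([4.0, -1.0], 0.2, (50, 2))])

centroids, labels = kmeans2(ensemble, k=2, minit="++", seed=5)
sub_ensembles = [ensemble[labels == c] for c in range(2)]
print([len(s) for s in sub_ensembles], np.round(centroids, 2))
```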

  12. AUTOMATIC CALIBRATION OF A STOCHASTIC-LAGRANGIAN TRANSPORT MODEL (SLAM)

    Science.gov (United States)

    Numerical models are a useful tool in evaluating and designing NAPL remediation systems. Traditional constitutive finite difference and finite element models are complex and expensive to apply. For this reason, this paper presents the application of a simplified stochastic-Lagran...

  13. Dynamic-stochastic modeling of snow cover formation on the European territory of Russia

    OpenAIRE

    A. N. Gelfan; V. M. Moreido

    2014-01-01

A dynamic-stochastic model, which combines a deterministic model of snow cover formation with a stochastic weather generator, has been developed. The deterministic snow model describes temporal change of the snow depth, content of ice and liquid water, snow density, snowmelt, sublimation, re-freezing of melt water, and snow metamorphism. The model has been calibrated and validated against the long-term data of snow measurements over the territory of European Russia. The model showed good performance in simulating time series of the snow water equivalent and snow depth ...

  14. Simultaneous perturbation stochastic approximation for tidal models

    KAUST Repository

    Altaf, M.U.

    2011-05-12

The Dutch continental shelf model (DCSM) is a shallow sea model of the entire continental shelf, which is used operationally in the Netherlands to forecast storm surges in the North Sea. The forecasts are necessary to support decisions on the timely closure of the moveable storm surge barriers to protect the land. In this study, an automated model calibration method, simultaneous perturbation stochastic approximation (SPSA), is implemented for tidal calibration of the DCSM. The method uses objective function evaluations to obtain the gradient approximations. The gradient approximation for the central difference method uses only two objective function evaluations, independent of the number of parameters being optimized. The calibration parameter in this study is the model bathymetry. A number of calibration experiments are performed. The effectiveness of the algorithm is evaluated in terms of the accuracy of the final results as well as the computational costs required to produce these results. In doing so, a comparison is made with a traditional steepest descent method and also with a newly developed proper orthogonal decomposition-based calibration method. The main findings are: (1) the SPSA method gives results comparable to the steepest descent method at little computational cost; (2) the SPSA method can be used to estimate a large number of parameters at little computational cost.
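
    A generic SPSA iteration with the standard gain sequences is sketched below; it minimizes a toy quadratic rather than the DCSM tidal objective, but it shows the key property stated above: each iteration needs only two objective evaluations regardless of the number of parameters. Gain constants and the objective are illustrative assumptions.

```python
# Generic SPSA minimizer (sketch): two objective evaluations per iteration,
# simultaneous Bernoulli perturbation of all parameters.
import numpy as np

def spsa_minimize(f, x0, n_iter=200, a=0.1, c=0.1, rng=None):
    rng = rng or np.random.default_rng()
    x = np.asarray(x0, dtype=float)
    for k in range(1, n_iter + 1):
        ak = a / k ** 0.602                      # standard gain sequences
        ck = c / k ** 0.101
        delta = rng.choice([-1.0, 1.0], size=x.size)   # Bernoulli perturbation
        grad_hat = (f(x + ck * delta) - f(x - ck * delta)) / (2 * ck * delta)
        x = x - ak * grad_hat
    return x

f = lambda x: np.sum((x - np.arange(1, 6)) ** 2)       # toy objective
print(np.round(spsa_minimize(f, np.zeros(5), rng=np.random.default_rng(0)), 2))
```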

  15. Simultaneous perturbation stochastic approximation for tidal models

    KAUST Repository

    Altaf, M.U.; Heemink, A.W.; Verlaan, M.; Hoteit, Ibrahim

    2011-01-01

The Dutch continental shelf model (DCSM) is a shallow sea model of the entire continental shelf, which is used operationally in the Netherlands to forecast storm surges in the North Sea. The forecasts are necessary to support decisions on the timely closure of the moveable storm surge barriers to protect the land. In this study, an automated model calibration method, simultaneous perturbation stochastic approximation (SPSA), is implemented for tidal calibration of the DCSM. The method uses objective function evaluations to obtain the gradient approximations. The gradient approximation for the central difference method uses only two objective function evaluations, independent of the number of parameters being optimized. The calibration parameter in this study is the model bathymetry. A number of calibration experiments are performed. The effectiveness of the algorithm is evaluated in terms of the accuracy of the final results as well as the computational costs required to produce these results. In doing so, a comparison is made with a traditional steepest descent method and also with a newly developed proper orthogonal decomposition-based calibration method. The main findings are: (1) the SPSA method gives results comparable to the steepest descent method at little computational cost; (2) the SPSA method can be used to estimate a large number of parameters at little computational cost.

  16. Utilization of advanced calibration techniques in stochastic rock fall analysis of quarry slopes

    Science.gov (United States)

    Preh, Alexander; Ahmadabadi, Morteza; Kolenprat, Bernd

    2016-04-01

In order to study rock fall dynamics, a research project was conducted by the Vienna University of Technology and the Austrian Central Labour Inspectorate (Federal Ministry of Labour, Social Affairs and Consumer Protection). A part of this project included 277 full-scale drop tests at three different quarries in Austria, with key parameters of the rock fall trajectories recorded. The tests involved a total of 277 boulders ranging from 0.18 to 1.8 m in diameter and from 0.009 to 8.1 Mg in mass. The geology of these sites included strong rock belonging to igneous, metamorphic and volcanic types. In this paper the results of the tests are used for calibration and validation of a new stochastic computer model. It is demonstrated that the error of the model (i.e. the difference between observed and simulated results) has a lognormal distribution. Selecting two parameters, advanced calibration techniques, including Markov chain Monte Carlo, maximum likelihood and root mean square error (RMSE) minimization, are utilized to minimize the error. Validation of the model based on the cross-validation technique reveals that, in general, reasonable stochastic approximations of the rock fall trajectories are obtained in all dimensions, including runout, bounce heights and velocities. The approximations are compared to the measured data in terms of median, 95% and maximum values. The results of the comparisons indicate that approximate first-order predictions, using a single set of input parameters, are possible and can be used to aid practical hazard and risk assessment.

  17. Evaluation of Stochastic Rainfall Models in Capturing Climate Variability for Future Drought and Flood Risk Assessment

    Science.gov (United States)

    Chowdhury, A. F. M. K.; Lockart, N.; Willgoose, G. R.; Kuczera, G. A.; Kiem, A.; Nadeeka, P. M.

    2016-12-01

One of the key objectives of stochastic rainfall modelling is to capture the full variability of the climate system for future drought and flood risk assessment. However, it is not clear how well these models can capture the future climate variability when they are calibrated to Global/Regional Climate Model (GCM/RCM) data, as these datasets are usually available for very short future periods (e.g. 20 years). This study has assessed the ability of two stochastic daily rainfall models to capture climate variability by calibrating them to a dynamically downscaled RCM dataset in an east Australian catchment for the 1990-2010, 2020-2040, and 2060-2080 epochs. The two stochastic models are: (1) a hierarchical Markov chain (MC) model, which we developed in a previous study, and (2) a semi-parametric MC model developed by Mehrotra and Sharma (2007). Our hierarchical model uses stochastic parameters of the MC and a Gamma distribution, while the semi-parametric model uses a modified MC process with memory of past periods and kernel density estimation. This study has generated multiple realizations of rainfall series by using parameters of each model calibrated to the RCM dataset for each epoch. The generated rainfall series are used to generate synthetic streamflow by using a SimHyd hydrology model. Assessing the synthetic rainfall and streamflow series, this study has found that both stochastic models can incorporate a range of variability in rainfall as well as streamflow generation for both current and future periods. However, the hierarchical model tends to overestimate the multiyear variability of wet spell lengths (and therefore is less likely to simulate long periods of drought and flood), while the semi-parametric model tends to overestimate the mean annual rainfall depths and streamflow volumes (hence, simulated droughts are likely to be less severe). The sensitivity of future drought and flood risk assessment to these limitations of both stochastic models will be discussed.
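
    As a greatly simplified illustration of the model family discussed above, the sketch below generates daily rainfall with a first-order two-state (wet/dry) Markov chain for occurrence and gamma-distributed wet-day amounts. The paper's models are higher-order, hierarchical, or semi-parametric; all transition probabilities and gamma parameters here are invented.

```python
# First-order wet/dry Markov chain with gamma wet-day depths; a toy
# stand-in for the hierarchical and semi-parametric models in the study.
import numpy as np

rng = np.random.default_rng(2)
p_wet_given_dry, p_wet_given_wet = 0.25, 0.60   # assumed transition probabilities
gamma_shape, gamma_scale = 0.8, 12.0            # assumed wet-day depth (mm)

def generate_daily_rainfall(n_days, rng):
    rain, wet = np.zeros(n_days), False
    for d in range(n_days):
        p_wet = p_wet_given_wet if wet else p_wet_given_dry
        wet = rng.random() < p_wet
        if wet:
            rain[d] = rng.gamma(gamma_shape, gamma_scale)
    return rain

series = generate_daily_rainfall(20 * 365, rng)
print(f"wet-day fraction {np.mean(series > 0):.2f}, "
      f"mean annual total {series.sum() / 20:.0f} mm")
```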

  18. Study on individual stochastic model of GNSS observations for precise kinematic applications

    Science.gov (United States)

    Próchniewicz, Dominik; Szpunar, Ryszard

    2015-04-01

The proper definition of the mathematical positioning model, which consists of functional and stochastic models, is a prerequisite for obtaining the optimal estimation of unknown parameters. Especially important in this definition is realistic modelling of the stochastic properties of the observations, which are more receiver-dependent and time-varying than the deterministic relationships. This is particularly true for precise kinematic applications, which are characterized by weakened model strength. In this case, an incorrect or simplified definition of the stochastic model limits the performance of ambiguity resolution and the accuracy of position estimation. In this study we investigate methods of describing the measurement noise of GNSS observations and their impact on deriving a precise kinematic positioning model. In particular, stochastic modelling of individual components of the variance-covariance matrix of observation noise, performed using observations from a very short baseline and a laboratory GNSS signal generator, is analyzed. Experimental test results indicate that utilizing an individual stochastic model of observations, including elevation dependency and cross-correlation, instead of assuming that raw measurements are independent with the same variance, improves the performance of ambiguity resolution as well as rover positioning accuracy. This shows that the proposed stochastic assessment method could be an important part of a complex calibration procedure for GNSS equipment.

  19. Hydrological model calibration for flood prediction in current and future climates using probability distributions of observed peak flows and model based rainfall

    Science.gov (United States)

    Haberlandt, Uwe; Wallner, Markus; Radtke, Imke

    2013-04-01

    Derived flood frequency analysis based on continuous hydrological modelling is very demanding regarding the required length and temporal resolution of precipitation input data. Often such flood predictions are obtained using long precipitation time series from stochastic approaches or from regional climate models as input. However, the calibration of the hydrological model is usually done using short time series of observed data. This inconsistent employment of different data types for calibration and application of a hydrological model increases its uncertainty. Here, it is proposed to calibrate a hydrological model directly on probability distributions of observed peak flows using model based rainfall in line with its later application. Two examples are given to illustrate the idea. The first one deals with classical derived flood frequency analysis using input data from an hourly stochastic rainfall model. The second one concerns a climate impact analysis using hourly precipitation from a regional climate model. The results show that: (I) the same type of precipitation input data should be used for calibration and application of the hydrological model, (II) a model calibrated on extreme conditions works quite well for average conditions but not vice versa, (III) the calibration of the hydrological model using regional climate model data works as an implicit bias correction method and (IV) the best performance for flood estimation is usually obtained when model based precipitation and observed probability distribution of peak flows are used for model calibration.

  20. Predicting the Stochastic Properties of the Shallow Subsurface for Improved Geophysical Modeling

    Science.gov (United States)

    Stroujkova, A.; Vynne, J.; Bonner, J.; Lewkowicz, J.

    2005-12-01

    Strong ground motion data from numerous explosive field experiments and from moderate to large earthquakes show significant variations in amplitude and waveform shape with respect to both azimuth and range. Attempts to model these variations using deterministic models have often been unsuccessful. It has been hypothesized that a stochastic description of the geological medium is a more realistic approach. To estimate the stochastic properties of the shallow subsurface, we use Measurement While Drilling (MWD) data, which are routinely collected by mines in order to facilitate design of blast patterns. The parameters, such as rotation speed of the drill, torque, and penetration rate, are used to compute the rock's Specific Energy (SE), which is then related to a blastability index. We use values of SE measured at two different mines and calibrated to laboratory measurements of rock properties to determine correlation lengths of the subsurface rocks in 2D, needed to obtain 2D and 3D stochastic models. The stochastic models are then combined with the deterministic models and used to compute synthetic seismic waveforms.
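
    To show how an estimated correlation length can be turned into a stochastic model of the subsurface, the sketch below samples a 1D Gaussian random field of a rock property with an exponential covariance function via Cholesky factorization. The correlation length, variance, and depth grid are illustrative values, not the MWD-derived estimates of the study.

```python
# 1D Gaussian random field with exponential covariance, sampled by
# Cholesky factorization; parameter values are illustrative only.
import numpy as np

rng = np.random.default_rng(8)
depth = np.linspace(0.0, 100.0, 201)        # m
corr_length = 12.0                          # m, e.g. inferred from SE logs
sigma = 0.15                                # relative property variability

lags = np.abs(depth[:, None] - depth[None, :])
cov = sigma ** 2 * np.exp(-lags / corr_length)
L = np.linalg.cholesky(cov + 1e-10 * np.eye(depth.size))   # jitter for stability
field = 1.0 + L @ rng.standard_normal(depth.size)          # mean 1, correlated noise
print(f"min {field.min():.2f}, max {field.max():.2f}")
```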

  1. A two-factor, stochastic programming model of Danish mortgage-backed securities

    DEFF Research Database (Denmark)

    Nielsen, Søren S.; Poulsen, Rolf

    2004-01-01

    -trivial, both in terms of deciding on an initial mortgage, and in terms of managing (rebalancing) it optimally.We propose a two-factor, arbitrage-free interest-rate model, calibrated to observable security prices, and implement on top of it a multi-stage, stochastic optimization program with the purpose...

  2. Regionalization of the Modified Bartlett-Lewis Rectangular Pulse Stochastic Rainfall Model

    OpenAIRE

    Dongkyun Kim; Francisco Olivera; Huidae Cho; Scott A. Socolofsky

    2013-01-01

Parameters of the Modified Bartlett-Lewis Rectangular Pulse (MBLRP) stochastic rainfall simulation model were regionalized across the contiguous United States. Three thousand four hundred forty-four National Climate Data Center (NCDC) rain gauges were used to obtain spatial and seasonal patterns of the model parameters. The MBLRP model was calibrated to minimize the discrepancy between the precipitation depth statistics of the observed and MBLRP-generated precipitation time series. These...

  3. Stochastic neuron models

    CERN Document Server

    Greenwood, Priscilla E

    2016-01-01

    This book describes a large number of open problems in the theory of stochastic neural systems, with the aim of enticing probabilists to work on them. This includes problems arising from stochastic models of individual neurons as well as those arising from stochastic models of the activities of small and large networks of interconnected neurons. The necessary neuroscience background to these problems is outlined within the text, so readers can grasp the context in which they arise. This book will be useful for graduate students and instructors providing material and references for applying probability to stochastic neuron modeling. Methods and results are presented, but the emphasis is on questions where additional stochastic analysis may contribute neuroscience insight. An extensive bibliography is included. Dr. Priscilla E. Greenwood is a Professor Emerita in the Department of Mathematics at the University of British Columbia. Dr. Lawrence M. Ward is a Professor in the Department of Psychology and the Brain...

  4. On combination of strict Bayesian principles with model reduction technique or how stochastic model calibration can become feasible for large-scale applications

    Science.gov (United States)

    Oladyshkin, S.; Schroeder, P.; Class, H.; Nowak, W.

    2013-12-01

Predicting underground carbon dioxide (CO2) storage represents a challenging problem in a complex dynamic system. Due to a lack of information about reservoir parameters, quantification of uncertainties may become the dominant question in risk assessment. Calibration on past observed data from pilot-scale test injection can improve the predictive power of the involved geological, flow, and transport models. The current work performs history matching to pressure time series from a pilot storage site operated in Europe, maintained during an injection period. Simulation of compressible two-phase flow and transport (CO2/brine) in the considered site is computationally very demanding, requiring about 12 days of CPU time for an individual model run. For that reason, brute-force approaches for calibration are not feasible. In the current work, we explore an advanced framework for history matching based on the arbitrary polynomial chaos expansion (aPC) and strict Bayesian principles. The aPC [1] offers a drastic but accurate stochastic model reduction. Unlike many previous chaos expansions, it can handle arbitrary probability distribution shapes of uncertain parameters, and can therefore directly handle the statistical information appearing during the matching procedure. In our study we keep the spatial heterogeneity suggested by geophysical methods, but consider uncertainty in the magnitude of permeability through zone-wise permeability multipliers. We capture the dependence of model output on these multipliers with the expansion-based reduced model. Next, we combine the aPC with bootstrap filtering (a brute-force but fully accurate Bayesian updating mechanism) in order to perform the matching. In comparison to (Ensemble) Kalman Filters, our method accounts for higher-order statistical moments and for the non-linearity of both the forward model and the inversion, and thus allows a rigorous quantification of calibrated model uncertainty. The usually high computational costs of

  5. Derived flood frequency analysis using different model calibration strategies based on various types of rainfall-runoff data - a comparison

    Science.gov (United States)

    Haberlandt, U.; Radtke, I.

    2013-08-01

Derived flood frequency analysis allows the estimation of design floods with hydrological modelling for poorly observed basins considering change and taking into account flood protection measures. There are several possible choices regarding precipitation input, discharge output and consequently the calibration of the model. The objective of this study is to compare different calibration strategies for a hydrological model considering various types of rainfall input and runoff output data sets. Event-based and continuous observed hourly rainfall data as well as disaggregated daily rainfall and stochastically generated hourly rainfall data are used as input for the model. As output, short hourly and longer daily continuous flow time series as well as probability distributions of annual maximum peak flow series are employed. The performance of the strategies is evaluated using the different model parameter sets obtained for continuous simulation of discharge in an independent validation period and by comparing the model-derived flood frequency distributions with the observed one. The investigations are carried out for three mesoscale catchments in Northern Germany with the hydrological model HEC-HMS. The results show that: (i) the same type of precipitation input data should be used for calibration and application of the hydrological model, (ii) a model calibrated using a small sample of extreme values works quite well for the simulation of continuous time series with moderate length but not vice versa, (iii) the best performance with small uncertainty is obtained when stochastic precipitation data and the observed probability distribution of peak flows are used for model calibration. This outcome suggests calibrating a hydrological model directly on probability distributions of observed peak flows using stochastic rainfall as input if its purpose is the application for derived flood frequency analysis.

  6. Essay on Option Pricing, Hedging and Calibration

    DEFF Research Database (Denmark)

    da Silva Ribeiro, André Manuel

Quantitative finance is concerned with applying mathematics to financial markets. This thesis is a collection of essays that study different problems in this field: How efficient are option price approximations for calibrating a stochastic volatility model? (Chapter 2) How different is the discretely ... of dynamics? (Chapter 5) How can we formulate a simple arbitrage-free model to price correlation swaps? (Chapter 6) A summary of the work presented in this thesis: Approximation Behooves Calibration. In this paper we show that calibration based on an expansion approximation for option prices in the Heston stochastic volatility model gives stable, accurate, and fast results for S&P500-index option data over the period 2005 to 2009. Discretely Sampled Variance Options: A Stochastic Approximation Approach. In this paper, we expand the Drimus and Farkas (2012) framework to price variance options on discretely sampled...

  7. Stochastic biomathematical models with applications to neuronal modeling

    CERN Document Server

    Batzel, Jerry; Ditlevsen, Susanne

    2013-01-01

    Stochastic biomathematical models are becoming increasingly important as new light is shed on the role of noise in living systems. In certain biological systems, stochastic effects may even enhance a signal, thus providing a biological motivation for the noise observed in living systems. Recent advances in stochastic analysis and increasing computing power facilitate the analysis of more biophysically realistic models, and this book provides researchers in computational neuroscience and stochastic systems with an overview of recent developments. Key concepts are developed in chapters written by experts in their respective fields. Topics include: one-dimensional homogeneous diffusions and their boundary behavior, large deviation theory and its application in stochastic neurobiological models, a review of mathematical methods for stochastic neuronal integrate-and-fire models, stochastic partial differential equation models in neurobiology, and stochastic modeling of spreading cortical depression.

  8. COMPARISON OF EIGENMODE-BASED AND RANDOM FIELD-BASED IMPERFECTION MODELING FOR THE STOCHASTIC BUCKLING ANALYSIS OF I-SECTION BEAM–COLUMNS

    KAUST Repository

    STAVREV, A.

    2013-03-01

    The uncertainty of geometric imperfections in a series of nominally equal I-beams leads to a variability of corresponding buckling loads. Its analysis requires a stochastic imperfection model, which can be derived either by the simple variation of the critical eigenmode with a scalar random variable, or with the help of the more advanced theory of random fields. The present paper first provides a concise review of the two different modeling approaches, covering theoretical background, assumptions and calibration, and illustrates their integration into commercial finite element software to conduct stochastic buckling analyses with the Monte-Carlo method. The stochastic buckling behavior of an example beam is then simulated with both stochastic models, calibrated from corresponding imperfection measurements. The simulation results show that for different load cases, the response statistics of the buckling load obtained with the eigenmode-based and the random field-based models agree very well. A comparison of our simulation results with corresponding Eurocode 3 limit loads indicates that the design standard is very conservative for compression dominated load cases. © 2013 World Scientific Publishing Company.

  9. Forecasting financial asset processes: stochastic dynamics via learning neural networks.

    Science.gov (United States)

    Giebel, S; Rainer, M

    2010-01-01

Models for financial asset dynamics usually take into account their inherent unpredictable nature by including a suitable stochastic component in their process. Unknown (forward) values of financial assets (at a given time in the future) are usually estimated as expectations of the stochastic asset under a suitable risk-neutral measure. This estimation requires the stochastic model to be calibrated to some history of sufficient length in the past. Apart from inherent limitations, due to the stochastic nature of the process, the predictive power is also limited by the simplifying assumptions of the common calibration methods, such as maximum likelihood estimation and regression methods, often performed without weights on the historic time series, or with static weights only. Here we propose a novel method of "intelligent" calibration, using learning neural networks in order to dynamically adapt the parameters of the stochastic model. Hence we have a stochastic process with time-dependent parameters, the dynamics of the parameters being themselves learned continuously by a neural network. The backpropagation used in training the weights is limited to a certain memory length (in the examples we consider 10 previous business days), which is similar to the maximal time lag of autoregressive processes. We demonstrate the learning efficiency of the new algorithm by tracking the next-day forecasts for the EUR-TRY and EUR-HUF exchange rates.

  10. Stochastic Still Water Response Model

    DEFF Research Database (Denmark)

    Friis-Hansen, Peter; Ditlevsen, Ove Dalager

    2002-01-01

In this study a stochastic field model for the still water loading is formulated, where the statistics (mean value, standard deviation, and correlation) of the sectional forces are obtained by integration of the load field over the relevant part of the ship structure. The objective of the model is to establish the stochastic load field conditional on a given draft and trim of the vessel. The model contributes to a realistic modelling of the stochastic load processes to be used in a reliability evaluation of the ship hull. Emphasis is given to container vessels, and it turns out that an important parameter of the stochastic cargo field model is the mean number of containers delivered by each customer. The formulation of the model for obtaining ...

  11. Rainfall Stochastic models

    Science.gov (United States)

    Campo, M. A.; Lopez, J. J.; Rebole, J. P.

    2012-04-01

This work was carried out in the north of Spain. The San Sebastián meteorological station, where precipitation records are available every ten minutes, was selected. The precipitation data cover the period from October 1927 to September 1997. Pulse models describe the temporal process of rainfall as a succession of rainy cells: a main storm process, whose origins are distributed in time according to a Poisson process, and a secondary process that generates a random number of rain cells within each storm. Among the different pulse models, the Bartlett-Lewis model was used. On the other hand, alternative renewal processes and Markov chains describe the way in which the process will evolve in the future depending only on the current state, and are therefore not dependent on past events. Two basic processes are considered when describing the occurrence of rain: the alternation of wet and dry periods, and the temporal distribution of rainfall within each rain event, which determines the rainwater collected in each of the intervals that make up the event. This allows the introduction of alternative renewal processes and three-state Markov chains, where the interstorm time is given by either of the two dry states, short or long. Thus, the Markov chain stochastic model tries to reproduce the basis of pulse models, namely the succession of storms, each one composed of a series of rain cells separated by short intervals of time, without their theoretical complexity. In a first step, we analyzed all variables involved in the sequential process of the rain: rain event duration, duration of non-rain periods, average rainfall intensity in rain events, and finally the temporal distribution of rainfall within the rain event. Additionally, for Bartlett-Lewis pulse model calibration, the main descriptive statistics were calculated for each month, considering the seasonality of rainfall. In a second step, both models were calibrated. Finally, synthetic series were simulated with the calibration parameters; series
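
    A heavily simplified Poisson-cluster sketch in the spirit of such pulse models is given below: storm origins arrive as a Poisson process and each storm spawns a random number of rectangular rain cells with exponential durations and intensities. It is not a faithful Bartlett-Lewis implementation, and all rates and means are invented.

```python
# Simplified Poisson-cluster rainfall generator (rectangular pulses);
# illustrative only, not a calibrated Bartlett-Lewis model.
import numpy as np

rng = np.random.default_rng(4)
horizon_h = 24 * 30                 # one month at hourly resolution
storm_rate = 1.0 / 60.0             # storms per hour
mean_cells, mean_dur_h, mean_int = 4, 3.0, 1.5   # cells/storm, hours, mm/h

rain = np.zeros(horizon_h)
t = rng.exponential(1.0 / storm_rate)
while t < horizon_h:
    for _ in range(rng.poisson(mean_cells)):
        start = t + rng.exponential(2.0)            # cell offset within storm
        dur = rng.exponential(mean_dur_h)
        intensity = rng.exponential(mean_int)
        i0, i1 = int(start), min(int(start + dur) + 1, horizon_h)
        if i0 >= horizon_h:
            continue
        rain[i0:i1] += intensity                    # rectangular pulse
    t += rng.exponential(1.0 / storm_rate)

print(f"wet-hour fraction {np.mean(rain > 0):.2f}, total {rain.sum():.0f} mm")
```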

  12. Stochastic-field cavitation model

    International Nuclear Information System (INIS)

    Dumond, J.; Magagnato, F.; Class, A.

    2013-01-01

    Nonlinear phenomena can often be well described using probability density functions (pdf) and pdf transport models. Traditionally, the simulation of pdf transport requires Monte-Carlo codes based on Lagrangian “particles” or prescribed pdf assumptions including binning techniques. Recently, in the field of combustion, a novel formulation called the stochastic-field method solving pdf transport based on Eulerian fields has been proposed which eliminates the necessity to mix Eulerian and Lagrangian techniques or prescribed pdf assumptions. In the present work, for the first time the stochastic-field method is applied to multi-phase flow and, in particular, to cavitating flow. To validate the proposed stochastic-field cavitation model, two applications are considered. First, sheet cavitation is simulated in a Venturi-type nozzle. The second application is an innovative fluidic diode which exhibits coolant flashing. Agreement with experimental results is obtained for both applications with a fixed set of model constants. The stochastic-field cavitation model captures the wide range of pdf shapes present at different locations

  13. Stochastic-field cavitation model

    Science.gov (United States)

    Dumond, J.; Magagnato, F.; Class, A.

    2013-07-01

    Nonlinear phenomena can often be well described using probability density functions (pdf) and pdf transport models. Traditionally, the simulation of pdf transport requires Monte-Carlo codes based on Lagrangian "particles" or prescribed pdf assumptions including binning techniques. Recently, in the field of combustion, a novel formulation called the stochastic-field method solving pdf transport based on Eulerian fields has been proposed which eliminates the necessity to mix Eulerian and Lagrangian techniques or prescribed pdf assumptions. In the present work, for the first time the stochastic-field method is applied to multi-phase flow and, in particular, to cavitating flow. To validate the proposed stochastic-field cavitation model, two applications are considered. First, sheet cavitation is simulated in a Venturi-type nozzle. The second application is an innovative fluidic diode which exhibits coolant flashing. Agreement with experimental results is obtained for both applications with a fixed set of model constants. The stochastic-field cavitation model captures the wide range of pdf shapes present at different locations.

  14. On the deterministic and stochastic use of hydrologic models

    Science.gov (United States)

    Farmer, William H.; Vogel, Richard M.

    2016-01-01

    Environmental simulation models, such as precipitation-runoff watershed models, are increasingly used in a deterministic manner for environmental and water resources design, planning, and management. In operational hydrology, simulated responses are now routinely used to plan, design, and manage a very wide class of water resource systems. However, all such models are calibrated to existing data sets and retain some residual error. This residual, typically unknown in practice, is often ignored, implicitly trusting simulated responses as if they are deterministic quantities. In general, ignoring the residuals will result in simulated responses with distributional properties that do not mimic those of the observed responses. This discrepancy has major implications for the operational use of environmental simulation models as is shown here. Both a simple linear model and a distributed-parameter precipitation-runoff model are used to document the expected bias in the distributional properties of simulated responses when the residuals are ignored. The systematic reintroduction of residuals into simulated responses in a manner that produces stochastic output is shown to improve the distributional properties of the simulated responses. Every effort should be made to understand the distributional behavior of simulation residuals and to use environmental simulation models in a stochastic manner.
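
    The sketch below is a minimal illustration of that point: resampling calibration residuals and adding them back onto deterministic model output moves the distributional properties of the simulated series closer to those of the observations. The "observed" and "simulated" series are synthetic stand-ins, not data from the study.

```python
# Reintroducing resampled residuals into deterministic simulation output;
# toy data throughout, for illustration only.
import numpy as np

rng = np.random.default_rng(9)
observed = rng.gamma(2.0, 5.0, size=1000)          # stand-in streamflow record
simulated = 0.5 * observed + 5.0                   # "model" that damps variability
residuals = observed - simulated                   # calibration residuals

stochastic_sim = simulated + rng.choice(residuals, size=simulated.size, replace=True)

for name, x in [("observed", observed), ("deterministic", simulated),
                ("stochastic", stochastic_sim)]:
    print(f"{name:13s} mean {x.mean():6.2f}  std {x.std():6.2f}")
```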

  15. Research on nonlinear stochastic dynamical price model

    International Nuclear Information System (INIS)

    Li Jiaorui; Xu Wei; Xie Wenxian; Ren Zhengzheng

    2008-01-01

In consideration of the many uncertain factors existing in economic systems, a nonlinear stochastic dynamical price model subjected to Gaussian white noise excitation is proposed based on a deterministic model. The one-dimensional averaged Ito stochastic differential equation for the model is derived by using the stochastic averaging method, and applied to investigate the stability of the trivial solution and the first-passage failure of the stochastic price model. The stochastic price model and the methods presented in this paper are verified by numerical studies

  16. A Simple, Realistic Stochastic Model of Gastric Emptying.

    Directory of Open Access Journals (Sweden)

    Jiraphat Yokrattanasak

    Full Text Available Several models of Gastric Emptying (GE have been employed in the past to represent the rate of delivery of stomach contents to the duodenum and jejunum. These models have all used a deterministic form (algebraic equations or ordinary differential equations, considering GE as a continuous, smooth process in time. However, GE is known to occur as a sequence of spurts, irregular both in size and in timing. Hence, we formulate a simple stochastic process model, able to represent the irregular decrements of gastric contents after a meal. The model is calibrated on existing literature data and provides consistent predictions of the observed variability in the emptying trajectories. This approach may be useful in metabolic modeling, since it describes well and explains the apparently heterogeneous GE experimental results in situations where common gastric mechanics across subjects would be expected.
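
    A toy spurt model in the spirit described above is sketched below: gastric content decreases by random spurts arriving at exponentially distributed intervals until the stomach is empty. The initial volume, mean inter-spurt gap, and mean spurt size are invented values, not the paper's calibrated parameterization.

```python
# Toy stochastic-spurt gastric emptying model; illustrative parameters only.
import numpy as np

rng = np.random.default_rng(6)

def empty_stomach(volume_ml=500.0, mean_gap_min=3.0, mean_spurt_ml=12.0, rng=rng):
    t, times, volumes = 0.0, [0.0], [volume_ml]
    while volumes[-1] > 0.0:
        t += rng.exponential(mean_gap_min)        # waiting time to next spurt
        spurt = rng.exponential(mean_spurt_ml)    # random spurt size
        times.append(t)
        volumes.append(max(volumes[-1] - spurt, 0.0))
    return np.array(times), np.array(volumes)

times, volumes = empty_stomach()
print(f"emptied after {times[-1]:.0f} min in {len(times) - 1} spurts")
```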

  17. Alternative Asymmetric Stochastic Volatility Models

    NARCIS (Netherlands)

    M. Asai (Manabu); M.J. McAleer (Michael)

    2010-01-01

The stochastic volatility model usually incorporates asymmetric effects by introducing the negative correlation between the innovations in returns and volatility. In this paper, we propose a new asymmetric stochastic volatility model, based on the leverage and size effects. The model is

  18. Stochasticity Modeling in Memristors

    KAUST Repository

    Naous, Rawan

    2015-10-26

Diverse models have been proposed over the past years to explain the behavior exhibited by memristors, the fourth fundamental circuit element. The models vary in complexity, ranging from descriptions of physical mechanisms to more generalized mathematical modeling. Nonetheless, stochasticity, a widely observed phenomenon, has been largely overlooked from the modeling perspective. This inherent variability within the operation of the memristor is a vital feature for the integration of this nonlinear device into the stochastic electronics realm of study. In this paper, experimentally observed innate stochasticity is modeled in a circuit-compatible format. The proposed model is generic and could be incorporated into variants of threshold-based memristor models in which apparent variations in the output hysteresis convey the switching threshold shift. Further application as a noise injection alternative paves the way for novel approaches in the field of neuromorphic engineering circuit design. On the other hand, extra caution needs to be paid to variability-intolerant digital designs based on non-deterministic memristor logic.

  19. Stochasticity Modeling in Memristors

    KAUST Repository

    Naous, Rawan; Al-Shedivat, Maruan; Salama, Khaled N.

    2015-01-01

Diverse models have been proposed over the past years to explain the behavior exhibited by memristors, the fourth fundamental circuit element. The models vary in complexity, ranging from descriptions of physical mechanisms to more generalized mathematical modeling. Nonetheless, stochasticity, a widely observed phenomenon, has been largely overlooked from the modeling perspective. This inherent variability within the operation of the memristor is a vital feature for the integration of this nonlinear device into the stochastic electronics realm of study. In this paper, experimentally observed innate stochasticity is modeled in a circuit-compatible format. The proposed model is generic and could be incorporated into variants of threshold-based memristor models in which apparent variations in the output hysteresis convey the switching threshold shift. Further application as a noise injection alternative paves the way for novel approaches in the field of neuromorphic engineering circuit design. On the other hand, extra caution needs to be paid to variability-intolerant digital designs based on non-deterministic memristor logic.

  20. Stochastic models: theory and simulation.

    Energy Technology Data Exchange (ETDEWEB)

    Field, Richard V., Jr.

    2008-03-01

    Many problems in applied science and engineering involve physical phenomena that behave randomly in time and/or space. Examples are diverse and include turbulent flow over an aircraft wing, Earth climatology, material microstructure, and the financial markets. Mathematical models for these random phenomena are referred to as stochastic processes and/or random fields, and Monte Carlo simulation is the only general-purpose tool for solving problems of this type. The use of Monte Carlo simulation requires methods and algorithms to generate samples of the appropriate stochastic model; these samples then become inputs and/or boundary conditions to established deterministic simulation codes. While numerous algorithms and tools currently exist to generate samples of simple random variables and vectors, no cohesive simulation tool yet exists for generating samples of stochastic processes and/or random fields. There are two objectives of this report. First, we provide some theoretical background on stochastic processes and random fields that can be used to model phenomena that are random in space and/or time. Second, we provide simple algorithms that can be used to generate independent samples of general stochastic models. The theory and simulation of random variables and vectors is also reviewed for completeness.

  1. Generation of a stochastic precipitation model for the tropical climate

    Science.gov (United States)

    Ng, Jing Lin; Abd Aziz, Samsuzana; Huang, Yuk Feng; Wayayok, Aimrun; Rowshon, MK

    2017-06-01

A tropical country like Malaysia is characterized by intense localized precipitation with temperatures remaining relatively constant throughout the year. Stochastic modeling of precipitation in the flood-prone Kelantan River Basin is particularly challenging due to the high intermittency of precipitation events of the northeast monsoons. There is an urgent need for long precipitation series for modeling the hydrological responses. A single-site stochastic precipitation model that includes a precipitation occurrence and an intensity model was developed, calibrated, and validated for the Kelantan River Basin. The simulation process was carried out separately for each station without considering the spatial correlation of precipitation. Markov chains up to fifth order and six distributions were considered. The daily precipitation data of 17 rainfall stations for the study period of 1954-2013 were selected. The results suggested that second- and third-order Markov chains were suitable for simulating monthly and yearly precipitation occurrences, respectively. The fifth-order Markov chain resulted in overestimation of precipitation occurrences. For the mean, distribution, and standard deviation of precipitation amounts, the exponential, gamma, log-normal, skew normal, mixed exponential, and generalized Pareto distributions performed best. However, for the extremes of precipitation, the exponential and log-normal distributions were better, while the skew normal and generalized Pareto distributions tended to show underestimation. The log-normal distribution was chosen as the best distribution to simulate precipitation amounts. Overall, the stochastic precipitation model developed is considered a convenient tool to simulate the characteristics of precipitation in the Kelantan River Basin.

  2. Stochastic Optimization of Wind Turbine Power Factor Using Stochastic Model of Wind Power

    DEFF Research Database (Denmark)

    Chen, Peiyuan; Siano, Pierluigi; Bak-Jensen, Birgitte

    2010-01-01

This paper proposes a stochastic optimization algorithm that aims to minimize the expectation of the system power losses by controlling wind turbine (WT) power factors. The objective of the optimization is subject to the probability constraints of bus voltage and line current requirements. The optimization algorithm utilizes the stochastic models of wind power generation (WPG) and load demand to take into account their stochastic variation. The stochastic model of WPG is developed on the basis of a limited autoregressive integrated moving average (LARIMA) model by introducing a cross-correlation structure to the LARIMA model. The proposed stochastic optimization is carried out on a 69-bus distribution system. Simulation results confirm that, under various combinations of WPG and load demand, the system power losses are considerably reduced with the optimal setting of WT power factor as compared...

  3. Stochastic diffusion models for substitutable technological innovations

    NARCIS (Netherlands)

    Wang, L.; Hu, B.; Yu, X.

    2004-01-01

    Based on an analysis of firms' stochastic adoption behaviour, this paper first points out the necessity of building more practical stochastic models. Stochastic evolutionary models are then built for a substitutable innovation diffusion system. Finally, through computer simulation of the

  4. Dynamic-stochastic modeling of snow cover formation on the European territory of Russia

    Directory of Open Access Journals (Sweden)

    A. N. Gelfan

    2014-01-01

    A dynamic-stochastic model, which combines a deterministic model of snow cover formation with a stochastic weather generator, has been developed. The deterministic snow model describes temporal changes in snow depth, ice and liquid water content, snow density, snowmelt, sublimation, re-freezing of melt water, and snow metamorphism. The model has been calibrated and validated against long-term snow measurement data over the territory of European Russia and showed good performance in simulating time series of snow water equivalent and snow depth. The developed weather generator (NEsted Weather Generator, NewGen) includes nested generators of annual, monthly and daily time series of weather variables (namely, precipitation, air temperature, and air humidity). The parameters of NewGen have been adjusted through calibration against long-term meteorological data in European Russia. A disaggregation procedure has been proposed for transforming parameters of the annual weather generator into the parameters of the monthly one and, subsequently, into the parameters of the daily generator. Multi-year time series of the simulated daily weather variables have been used as input to the snow model. Probability properties of the snow cover, such as the snow water equivalent and snow depth for return periods of 25 and 100 years, have been estimated and compared against the observed data, showing good correlation. The model has been applied to different landscapes of European Russia, from steppe to taiga regions, to demonstrate the robustness of the proposed technique.
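    As an illustration of how a dynamic-stochastic chain of this kind can be wired together, the hedged sketch below drives a minimal degree-day snow model with a toy daily weather generator. The AR(1) temperature model, the precipitation occurrence probability, and the degree-day factor are illustrative assumptions, not the calibrated NewGen or snow-model parameters.

```python
import numpy as np

def generate_winter_weather(n_days, seed=2):
    """Toy daily weather generator: AR(1) air temperature around -5 degC and
    intermittent exponential precipitation (illustrative parameters only)."""
    rng = np.random.default_rng(seed)
    temp = np.empty(n_days)
    temp[0] = -5.0
    for d in range(1, n_days):
        temp[d] = -5.0 + 0.8 * (temp[d - 1] + 5.0) + rng.normal(0.0, 3.0)
    precip = np.where(rng.random(n_days) < 0.3, rng.exponential(4.0, n_days), 0.0)
    return temp, precip

def degree_day_snow(temp, precip, ddf=3.0):
    """Minimal deterministic snow model: precipitation accumulates as snow when
    the air temperature is below 0 degC and melts at ddf mm per positive degree-day."""
    swe = np.zeros(len(temp) + 1)          # snow water equivalent, mm
    for d in range(len(temp)):
        accumulation = precip[d] if temp[d] < 0.0 else 0.0
        melt = min(swe[d], ddf * max(temp[d], 0.0))
        swe[d + 1] = swe[d] + accumulation - melt
    return swe[1:]

temp, precip = generate_winter_weather(180)
swe = degree_day_snow(temp, precip)
print("peak SWE (mm):", swe.max())
```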

  5. Sequential neural models with stochastic layers

    DEFF Research Database (Denmark)

    Fraccaro, Marco; Sønderby, Søren Kaae; Paquet, Ulrich

    2016-01-01

    How can we efficiently propagate uncertainty in a latent state representation with recurrent neural networks? This paper introduces stochastic recurrent neural networks which glue a deterministic recurrent neural network and a state space model together to form a stochastic and sequential neural...... generative model. The clear separation of deterministic and stochastic layers allows a structured variational inference network to track the factorization of the model's posterior distribution. By retaining both the nonlinear recursive structure of a recurrent neural network and averaging over...

  6. Calibrated Properties Model

    International Nuclear Information System (INIS)

    Ahlers, C.; Liu, H.

    2000-01-01

    The purpose of this Analysis/Model Report (AMR) is to document the Calibrated Properties Model that provides calibrated parameter sets for unsaturated zone (UZ) flow and transport process models for the Yucca Mountain Site Characterization Project (YMP). This work was performed in accordance with the ''AMR Development Plan for U0035 Calibrated Properties Model REV00''. These calibrated property sets include matrix and fracture parameters for the UZ Flow and Transport Model (UZ Model), drift seepage models, drift-scale and mountain-scale coupled-processes models, and Total System Performance Assessment (TSPA) models, as well as for Performance Assessment (PA) and other participating national laboratories and government agencies. These process models provide the necessary framework to test conceptual hypotheses of flow and transport at different scales and to predict flow and transport behavior under a variety of climatic and thermal-loading conditions.

  7. Numerical Simulation of the Heston Model under Stochastic Correlation

    Directory of Open Access Journals (Sweden)

    Long Teng

    2017-12-01

    Stochastic correlation models have become increasingly important in financial markets. In order to price vanilla options in stochastic volatility and correlation models, we study in this work the extension of the Heston model obtained by imposing stochastic correlations driven by a stochastic differential equation. We discuss efficient algorithms for the extended Heston model incorporating stochastic correlations. Our numerical experiments show that the proposed algorithms can efficiently provide highly accurate results for the extended Heston model with stochastic correlations. By investigating the effect of stochastic correlations on the implied volatility, we find that the performance of the Heston model can be improved by including stochastic correlations.
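    To make the construction concrete, here is a hedged Euler-Maruyama sketch of a Heston-type model in which the price/variance correlation itself follows a mean-reverting diffusion. The discretization, the clipping of the correlation to [-0.99, 0.99], and all parameter values are illustrative assumptions rather than the algorithms proposed in the paper.

```python
import numpy as np

def heston_stochastic_corr(s0=100.0, v0=0.04, mu=0.02, kappa=2.0, theta=0.04,
                           xi=0.3, rho0=-0.5, kappa_r=1.0, theta_r=-0.5, sigma_r=0.3,
                           T=1.0, n_steps=252, n_paths=10_000, seed=3):
    """Euler-Maruyama simulation of a Heston-type model whose price/variance
    correlation follows a mean-reverting SDE, clipped to stay in [-0.99, 0.99].
    An illustrative discretization, not the scheme of the cited paper."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    s = np.full(n_paths, s0)
    v = np.full(n_paths, v0)
    rho = np.full(n_paths, rho0)
    for _ in range(n_steps):
        z1, z2, z3 = rng.standard_normal((3, n_paths))
        zs = rho * z1 + np.sqrt(1.0 - rho**2) * z2        # shock correlated with the variance shock
        vp = np.maximum(v, 0.0)                            # full truncation of negative variance
        s *= np.exp((mu - 0.5 * vp) * dt + np.sqrt(vp * dt) * zs)
        v += kappa * (theta - vp) * dt + xi * np.sqrt(vp * dt) * z1
        rho += kappa_r * (theta_r - rho) * dt + sigma_r * np.sqrt(dt) * z3
        rho = np.clip(rho, -0.99, 0.99)
    return s

paths = heston_stochastic_corr()
print("simulated mean terminal price:", paths.mean())
```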

  8. Calibrated Properties Model

    International Nuclear Information System (INIS)

    Ahlers, C.F.; Liu, H.H.

    2001-01-01

    The purpose of this Analysis/Model Report (AMR) is to document the Calibrated Properties Model that provides calibrated parameter sets for unsaturated zone (UZ) flow and transport process models for the Yucca Mountain Site Characterization Project (YMP). This work was performed in accordance with the AMR Development Plan for U0035 Calibrated Properties Model REV00 (CRWMS M and O 1999c). These calibrated property sets include matrix and fracture parameters for the UZ Flow and Transport Model (UZ Model), drift seepage models, drift-scale and mountain-scale coupled-processes models, and Total System Performance Assessment (TSPA) models, as well as for Performance Assessment (PA) and other participating national laboratories and government agencies. These process models provide the necessary framework to test conceptual hypotheses of flow and transport at different scales and to predict flow and transport behavior under a variety of climatic and thermal-loading conditions.

  9. Transport properties of stochastic Lorentz models

    NARCIS (Netherlands)

    Beijeren, H. van

    Diffusion processes are considered for one-dimensional stochastic Lorentz models, consisting of randomly distributed fixed scatterers and one moving light particle. In waiting time Lorentz models the light particle makes instantaneous jumps between scatterers after a stochastically distributed

  10. Modeling and analysis of stochastic systems

    CERN Document Server

    Kulkarni, Vidyadhar G

    2011-01-01

    Based on the author's more than 25 years of teaching experience, Modeling and Analysis of Stochastic Systems, Second Edition covers the most important classes of stochastic processes used in the modeling of diverse systems, from supply chains and inventory systems to genetics and biological systems. For each class of stochastic process, the text includes its definition, characterization, applications, transient and limiting behavior, first passage times, and cost/reward models. Along with reorganizing the material, this edition revises and adds new exercises and examples. New to the second edi

  11. Revisiting the cape cod bacteria injection experiment using a stochastic modeling approach

    Science.gov (United States)

    Maxwell, R.M.; Welty, C.; Harvey, R.W.

    2007-01-01

    Bromide and resting-cell bacteria tracer tests conducted in a sandy aquifer at the U.S. Geological Survey Cape Cod site in 1987 were reinterpreted using a three-dimensional stochastic approach. Bacteria transport was coupled to colloid filtration theory through functional dependence of local-scale colloid transport parameters upon hydraulic conductivity and seepage velocity in a stochastic advection-dispersion/attachment-detachment model. Geostatistical information on the hydraulic conductivity (K) field that was unavailable at the time of the original test was utilized as input. Using geostatistical parameters, a groundwater flow and particle-tracking model of conservative solute transport was calibrated to the bromide-tracer breakthrough data. An optimization routine was employed over 100 realizations to adjust the mean and variance of the natural logarithm of hydraulic conductivity (ln K) field to achieve best fit of a simulated, average bromide breakthrough curve. A stochastic particle-tracking model for the bacteria was run without adjustments to the local-scale colloid transport parameters. Good predictions of mean bacteria breakthrough were achieved using several approaches for modeling components of the system. Simulations incorporating the recent Tufenkji and Elimelech (Environ. Sci. Technol. 2004, 38, 529-536) correlation equation for estimating single collector efficiency were compared to those using the older Rajagopalan and Tien (AIChE J. 1976, 22, 523-533) model. Both appeared to work equally well at predicting mean bacteria breakthrough using a constant mean bacteria diameter for this set of field conditions. Simulations using a distribution of bacterial cell diameters available from original field notes yielded a slight improvement in the model and data agreement compared to simulations using an average bacterial diameter. The stochastic approach based on estimates of local-scale parameters for the bacteria-transport process reasonably captured

  12. Modelling Cow Behaviour Using Stochastic Automata

    DEFF Research Database (Denmark)

    Jónsson, Ragnar Ingi

    This report covers an initial study on the modelling of cow behaviour using stochastic automata with the aim of detecting lameness. Lameness in cows is a serious problem that needs to be dealt with because it results in less profitable production units and in reduced quality of life...... for the affected livestock. By featuring training data consisting of measurements of cow activity, three different models are obtained, namely an autonomous stochastic automaton, a stochastic automaton with coinciding state and output and an autonomous stochastic automaton with coinciding state and output, all...... of which describe the cows' activity in the two regarded behavioural scenarios, non-lame and lame. Using the experimental measurement data the different behavioural relations for the two regarded behavioural scenarios are assessed. The three models comprise activity within last hour, activity within last...

  13. Stochastic Modeling of Wind Derivatives in Energy Markets

    Directory of Open Access Journals (Sweden)

    Fred Espen Benth

    2018-05-01

    We model the logarithm of the spot price of electricity with a normal inverse Gaussian (NIG) process and the wind speed and wind power production with two Ornstein–Uhlenbeck processes. In order to reproduce the correlation between the spot price and the wind power production, namely between a pure jump process and a continuous path process, respectively, we replace the small jumps of the NIG process by a Brownian term. We then apply our models to two different problems: first, to study from the stochastic point of view the income from a wind power plant, as the expected value of the product between the electricity spot price and the amount of energy produced; then, to construct and price a European put-type quanto option in the wind energy markets that allows the buyer to hedge against low prices and low wind power production in the plant. Calibration of the proposed models and related price formulas is also provided, according to specific datasets.
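    For illustration, the snippet below simulates an Ornstein-Uhlenbeck factor of the kind used here for the wind-speed component, using its exact Gaussian transition density, and maps it through a crude power curve. The parameter values and the power-curve formula are assumptions for demonstration only.

```python
import numpy as np

def simulate_ou(x0, mu, theta, sigma, dt, n_steps, seed=4):
    """Exact discretization of an Ornstein-Uhlenbeck process
    dX_t = theta * (mu - X_t) dt + sigma dW_t, used as a stand-in for a
    (deseasonalized) wind-speed factor. Parameter values are illustrative."""
    rng = np.random.default_rng(seed)
    x = np.empty(n_steps + 1)
    x[0] = x0
    a = np.exp(-theta * dt)
    std = sigma * np.sqrt((1.0 - a**2) / (2.0 * theta))
    for k in range(n_steps):
        x[k + 1] = mu + (x[k] - mu) * a + std * rng.standard_normal()
    return x

wind = simulate_ou(x0=8.0, mu=8.0, theta=0.5, sigma=2.0, dt=1.0 / 24.0, n_steps=24 * 30)
# Crude illustrative power curve (rotor radius 50 m, Cp = 0.4, capped at 5 MW), in watts
power = np.clip(0.5 * 1.225 * np.pi * 50**2 * wind**3 * 0.4, 0.0, 5e6)
```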

  14. Stochastic volatility and stochastic leverage

    DEFF Research Database (Denmark)

    Veraart, Almut; Veraart, Luitgard A. M.

    This paper proposes the new concept of stochastic leverage in stochastic volatility models. Stochastic leverage refers to a stochastic process which replaces the classical constant correlation parameter between the asset return and the stochastic volatility process. We provide a systematic...... treatment of stochastic leverage and propose to model the stochastic leverage effect explicitly, e.g. by means of a linear transformation of a Jacobi process. Such models are both analytically tractable and allow for a direct economic interpretation. In particular, we propose two new stochastic volatility...... models which allow for a stochastic leverage effect: the generalised Heston model and the generalised Barndorff-Nielsen & Shephard model. We investigate the impact of a stochastic leverage effect in the risk neutral world by focusing on implied volatilities generated by option prices derived from our new...

  15. Stochastic models of cell motility

    DEFF Research Database (Denmark)

    Gradinaru, Cristian

    2012-01-01

    Cell motility and migration are central to the development and maintenance of multicellular organisms, and errors during this process can lead to major diseases. Consequently, the mechanisms and phenomenology of cell motility are currently under intense study. In recent years, a new...... interdisciplinary field focusing on the study of biological processes at the nanoscale level, with a range of technological applications in medicine and biological research, has emerged. The work presented in this thesis is at the interface of cell biology, image processing, and stochastic modeling. The stochastic...... models introduced here are based on persistent random motion, which I apply to real-life studies of cell motility on flat and nanostructured surfaces. These models aim to predict the time-dependent position of cell centroids in a stochastic manner, and conversely determine directly from experimental...
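    A hedged sketch of the persistent-random-motion idea follows: the cell velocity is modeled as an Ornstein-Uhlenbeck process with a persistence time, and the centroid position is obtained by integrating it. The parameter values are illustrative, not fitted to any experiment.

```python
import numpy as np

def persistent_random_walk(n_steps=1000, dt=0.1, speed=1.0, persistence=5.0, seed=5):
    """2-D persistent random motion: the velocity relaxes toward zero on the
    persistence time scale while being kicked by white noise, and the centroid
    position is the time integral of the velocity (illustrative parameters)."""
    rng = np.random.default_rng(seed)
    vel = np.zeros((n_steps + 1, 2))
    pos = np.zeros((n_steps + 1, 2))
    noise = speed * np.sqrt(2.0 * dt / persistence)
    for k in range(n_steps):
        vel[k + 1] = vel[k] - vel[k] * dt / persistence + noise * rng.standard_normal(2)
        pos[k + 1] = pos[k] + vel[k + 1] * dt
    return pos

track = persistent_random_walk()
msd = np.mean(np.sum((track - track[0])**2, axis=1))  # crude time-averaged squared displacement
```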

  16. Stochastic Modelling of Hydrologic Systems

    DEFF Research Database (Denmark)

    Jonsdottir, Harpa

    2007-01-01

    In this PhD project several stochastic modelling methods are studied and applied on various subjects in hydrology. The research was prepared at Informatics and Mathematical Modelling at the Technical University of Denmark. The thesis is divided into two parts. The first part contains...... an introduction and an overview of the papers published. Then an introduction to basic concepts in hydrology along with a description of hydrological data is given. Finally an introduction to stochastic modelling is given. The second part contains the research papers. In the research papers the stochastic methods...... are described, as at the time of publication these methods represent new contribution to hydrology. The second part also contains additional description of software used and a brief introduction to stiff systems. The system in one of the papers is stiff....

  17. Field Trial Measurements to Validate a Stochastic Aircraft Boarding Model

    Directory of Open Access Journals (Sweden)

    Michael Schultz

    2018-03-01

    Efficient boarding procedures have to consider both operational constraints and the individual passenger behavior. In contrast to the aircraft handling processes of fueling, catering and cleaning, the boarding process is more driven by passengers than by airport or airline operators. This paper delivers a comprehensive set of operational data including classification of boarding times, passenger arrival times, times to store hand luggage, and passenger interactions in the aircraft cabin as a reliable basis for calibrating models for aircraft boarding. In this paper, a microscopic approach is used to model the passenger behavior, where the passenger movement is defined as a one-dimensional, stochastic, and time/space discrete transition process. This model is used to compare measurements from field trials of boarding procedures with simulation results and demonstrates a deviation smaller than 5%.
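    The following is a minimal sketch of such a one-dimensional, stochastic, time/space-discrete boarding model: the aisle is a line of cells, each passenger advances one cell per step when the cell ahead is free, and then blocks the aisle for a random hand-luggage storage time at the assigned row. The exponential storage time and all parameters are illustrative assumptions, not the calibrated field-trial values.

```python
import numpy as np

def board(seat_order, mean_store_steps=6.0, seed=6):
    """Time/space-discrete aisle model: one cell per seat row, one passenger per cell;
    passengers step forward when possible and block the aisle while storing luggage."""
    rng = np.random.default_rng(seed)
    queue = list(seat_order)     # passengers waiting at the door (target row for each)
    aisle = {}                   # aisle cell -> remaining luggage-storage steps of its occupant
    target = {}                  # aisle cell -> target row of its occupant
    steps, seated = 0, 0
    while seated < len(seat_order):
        steps += 1
        for cell in sorted(aisle, reverse=True):         # front-most passengers act first
            if cell == target[cell]:                     # at own row: store luggage, then sit
                aisle[cell] -= 1
                if aisle[cell] <= 0:
                    del aisle[cell], target[cell]
                    seated += 1
            elif cell + 1 not in aisle:                  # cell ahead is free: step forward
                aisle[cell + 1] = aisle.pop(cell)
                target[cell + 1] = target.pop(cell)
        if queue and 0 not in aisle:                     # next passenger enters the aisle
            target[0] = queue.pop(0)
            aisle[0] = rng.exponential(mean_store_steps)
    return steps

rows = list(np.random.default_rng(0).permutation(30))   # random boarding order, 30 rows
print("boarding time (steps):", board(rows))
```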

  18. Stochasticity and determinism in models of hematopoiesis.

    Science.gov (United States)

    Kimmel, Marek

    2014-01-01

    This chapter represents a novel view of modeling in hematopoiesis, synthesizing both deterministic and stochastic approaches. Whereas the stochastic models work in situations where chance dominates, for example when the number of cells is small, or under random mutations, the deterministic models are more important for large-scale, normal hematopoiesis. New types of models are on the horizon. These models attempt to account for distributed environments such as hematopoietic niches and their impact on dynamics. Mixed effects of such structures and chance events are largely unknown and constitute both a challenge and promise for modeling. Our discussion is presented under the separate headings of deterministic and stochastic modeling; however, the connections between both are frequently mentioned. Four case studies are included to elucidate important examples. We also include a primer of deterministic and stochastic dynamics for the reader's use.

  19. Stochastic Models of Polymer Systems

    Science.gov (United States)

    2016-01-01

    Final report (distribution unlimited) from Princeton University, dated 14 March 2014. Only the distribution statement and report-form metadata (institutional address, publication counts) are available for this record; no substantive abstract is provided.

  20. CAM Stochastic Volatility Model for Option Pricing

    Directory of Open Access Journals (Sweden)

    Wanwan Huang

    2016-01-01

    The coupled additive and multiplicative (CAM) noises model is a stochastic volatility model for derivative pricing. Unlike the other stochastic volatility models in the literature, the CAM model uses two Brownian motions, one multiplicative and one additive, to model the volatility process. We provide empirical evidence that suggests a nontrivial relationship between the kurtosis and skewness of asset prices and that the CAM model is able to capture this relationship, whereas the traditional stochastic volatility models cannot. We introduce a control variate method and Monte Carlo estimators for some of the sensitivities (Greeks) of the model. We also derive an approximation for the characteristic function of the model.

  1. Modeling stochasticity and robustness in gene regulatory networks.

    Science.gov (United States)

    Garg, Abhishek; Mohanram, Kartik; Di Cara, Alessandro; De Micheli, Giovanni; Xenarios, Ioannis

    2009-06-15

    Understanding gene regulation in biological processes and modeling the robustness of underlying regulatory networks is an important problem that is currently being addressed by computational systems biologists. Lately, there has been a renewed interest in Boolean modeling techniques for gene regulatory networks (GRNs). However, due to their deterministic nature, it is often difficult to identify whether these modeling approaches are robust to the addition of stochastic noise that is widespread in gene regulatory processes. Stochasticity in Boolean models of GRNs has been addressed relatively sparingly in the past, mainly by flipping the expression of genes between different expression levels with a predefined probability. This stochasticity in nodes (SIN) model leads to overrepresentation of noise in GRNs and hence to a lack of correspondence with biological observations. In this article, we introduce the stochasticity in functions (SIF) model for simulating stochasticity in Boolean models of GRNs. By providing biological motivation behind the use of the SIF model and applying it to the T-helper and T-cell activation networks, we show that the SIF model provides more biologically robust results than the existing SIN model of stochasticity in GRNs. Algorithms are made available in our Boolean modeling toolbox, GenYsis. The software binaries can be downloaded from http://si2.epfl.ch/~garg/genysis.html.

  2. Stochastic models in reliability and maintenance

    CERN Document Server

    2002-01-01

    Our daily lives are sustained by high-technology systems, of which computer systems are typical examples. We enjoy our modern lives by using many computer systems; more importantly, we have to maintain such systems without failure, yet we cannot predict when they will fail or how to fix them without delay. A stochastic process is a set of outcomes of a random experiment indexed by time, and is one of the key tools needed to analyze future behavior quantitatively. Reliability and maintainability technologies are of great interest and importance to the maintenance of such systems. Many mathematical models have been and will be proposed to describe reliability and maintainability systems using stochastic processes. The theme of this book is "Stochastic Models in Reliability and Maintainability." This book consists of 12 chapters on this theme from different viewpoints of stochastic modeling. Chapter 1 is devoted to "Renewal Processes," under which cla...

  3. The multivariate supOU stochastic volatility model

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole; Stelzer, Robert

    Using positive semidefinite supOU (superposition of Ornstein-Uhlenbeck type) processes to describe the volatility, we introduce a multivariate stochastic volatility model for financial data which is capable of modelling long range dependence effects. The finiteness of moments and the second order...... structure of the volatility, the log returns, as well as their "squares" are discussed in detail. Moreover, we give several examples in which long memory effects occur and study how the model as well as the simple Ornstein-Uhlenbeck type stochastic volatility model behave under linear transformations....... In particular, the models are shown to be preserved under invertible linear transformations. Finally, we discuss how (sup)OU stochastic volatility models can be combined with a factor modelling approach....

  4. A Fractionally Integrated Wishart Stochastic Volatility Model

    NARCIS (Netherlands)

    M. Asai (Manabu); M.J. McAleer (Michael)

    2013-01-01

    textabstractThere has recently been growing interest in modeling and estimating alternative continuous time multivariate stochastic volatility models. We propose a continuous time fractionally integrated Wishart stochastic volatility (FIWSV) process. We derive the conditional Laplace transform of

  5. A stochastic SIS epidemic model with vaccination

    Science.gov (United States)

    Cao, Boqiang; Shan, Meijing; Zhang, Qimin; Wang, Weiming

    2017-11-01

    In this paper, we investigate the basic features of an SIS-type infectious disease model with varying population size and vaccination in the presence of environmental noise. By applying the Markov semigroup theory, we propose a stochastic reproduction number R0s which can be seen as a threshold parameter for identifying stochastic extinction and persistence: if R0s < 1, the stochastic epidemic model has a disease-free absorbing set, which implies that the disease dies out with probability one; while if R0s > 1, under some mild extra conditions, the SDE model has an endemic stationary distribution which results in the stochastic persistence of the infectious disease. The most interesting finding is that large environmental noise can suppress the outbreak of the disease.
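    As a generic illustration of an SDE epidemic model of this type, the sketch below integrates a noisy SIS system with the Euler-Maruyama scheme. The noise structure and parameter values are assumptions for demonstration; the paper's vaccination terms and its specific R0s are not reproduced here.

```python
import numpy as np

def stochastic_sis(i0=10, n_pop=10_000, beta=0.4, gamma=0.2, sigma=0.05,
                   T=200.0, dt=0.01, seed=7):
    """Euler-Maruyama simulation of a stochastic SIS model in which the
    transmission term is perturbed by white noise:
        dI = [beta * S * I / N - gamma * I] dt + sigma * (S * I / N) dW.
    A generic illustration, not the exact vaccination model of the paper."""
    rng = np.random.default_rng(seed)
    n_steps = int(T / dt)
    I = np.empty(n_steps + 1)
    I[0] = i0
    for k in range(n_steps):
        S = n_pop - I[k]
        drift = beta * S * I[k] / n_pop - gamma * I[k]
        diffusion = sigma * S * I[k] / n_pop
        I[k + 1] = I[k] + drift * dt + diffusion * np.sqrt(dt) * rng.standard_normal()
        I[k + 1] = min(max(I[k + 1], 0.0), n_pop)   # keep the state in [0, N]
    return I

trajectory = stochastic_sis()
print("final number infected:", trajectory[-1])
```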

  6. Population stochastic modelling (PSM)--an R package for mixed-effects models based on stochastic differential equations.

    Science.gov (United States)

    Klim, Søren; Mortensen, Stig Bousgaard; Kristensen, Niels Rode; Overgaard, Rune Viig; Madsen, Henrik

    2009-06-01

    The extension from ordinary to stochastic differential equations (SDEs) in pharmacokinetic and pharmacodynamic (PK/PD) modelling is an emerging field and has been motivated in a number of articles [N.R. Kristensen, H. Madsen, S.H. Ingwersen, Using stochastic differential equations for PK/PD model development, J. Pharmacokinet. Pharmacodyn. 32 (February(1)) (2005) 109-141; C.W. Tornøe, R.V. Overgaard, H. Agersø, H.A. Nielsen, H. Madsen, E.N. Jonsson, Stochastic differential equations in NONMEM: implementation, application, and comparison with ordinary differential equations, Pharm. Res. 22 (August(8)) (2005) 1247-1258; R.V. Overgaard, N. Jonsson, C.W. Tornøe, H. Madsen, Non-linear mixed-effects models with stochastic differential equations: implementation of an estimation algorithm, J. Pharmacokinet. Pharmacodyn. 32 (February(1)) (2005) 85-107; U. Picchini, S. Ditlevsen, A. De Gaetano, Maximum likelihood estimation of a time-inhomogeneous stochastic differential model of glucose dynamics, Math. Med. Biol. 25 (June(2)) (2008) 141-155]. PK/PD models are traditionally based on ordinary differential equations (ODEs) with an observation link that incorporates noise. This state-space formulation only allows for observation noise and not for system noise. Extending to SDEs allows for a Wiener noise component in the system equations. This additional noise component enables handling of autocorrelated residuals originating from natural variation or systematic model error. Autocorrelated residuals are often partly ignored in PK/PD modelling even though they violate the assumptions of many standard statistical tests. This article presents a package for the statistical program R that is able to handle SDEs in a mixed-effects setting. The estimation method implemented is the FOCE(1) approximation to the population likelihood, which is generated from the individual likelihoods that are approximated using the Extended Kalman Filter's one-step predictions.

  7. Calibration process of highly parameterized semi-distributed hydrological model

    Science.gov (United States)

    Vidmar, Andrej; Brilly, Mitja

    2017-04-01

    Hydrological phenomena take place in the hydrological system, which is governed by nature, and are essentially stochastic. These phenomena are unique, non-recurring, and changeable across space and time. Since any river basin, with its own natural characteristics, and any hydrological event therein are unique, calibration is a complex process that is not researched enough. Calibration is a procedure for determining the parameters of a model that are not known well enough. Input and output variables and the mathematical model expressions are known, while only some parameters are unknown; these are determined by calibrating the model. The software used for hydrological modelling nowadays is equipped with sophisticated calibration algorithms that leave the modeler little possibility to manage the process, and the results are often not the best. We therefore developed a procedure for an expert-driven calibration process. We use the HBV-light-CLI hydrological model, which has a command-line interface, and couple it with PEST. PEST is a parameter estimation tool that is widely used in groundwater modeling and can also be applied to surface waters. A calibration process managed directly by the expert affects the outcome of the inversion procedure in proportion to the expert's knowledge and achieves better results than if the procedure had been left entirely to the selected optimization algorithm. The first step is to properly define the spatial characteristics and structural design of the semi-distributed model, including all morphological and hydrological features such as karstic, alluvial, and forest areas. This step requires the modeler's geological, meteorological, hydraulic, and hydrological knowledge. The second step is to set initial parameter values to their preferred values based on expert knowledge; in this step we also define all parameter and observation groups. Peak data are essential in the calibration process if we are mainly interested in flood events. Each sub-catchment in the model has its own observation group
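    As a minimal illustration of the PEST-style workflow described above (weighted observation groups plus an inverse-modeling optimizer), the sketch below calibrates a toy linear-reservoir runoff model with scipy. The reservoir model, its parameters, and the peak-flow weighting are hypothetical stand-ins for HBV-light and the expert-defined groups, not the authors' actual setup.

```python
import numpy as np
from scipy.optimize import least_squares

def linear_reservoir(precip, k, c):
    """Toy rainfall-runoff model (stand-in for HBV): a single linear reservoir
    with recession coefficient k and runoff coefficient c."""
    q = np.zeros(len(precip))
    storage = 0.0
    for t, p in enumerate(precip):
        storage += c * p
        q[t] = storage / k
        storage -= q[t]
    return q

rng = np.random.default_rng(8)
precip = np.where(rng.random(365) < 0.3, rng.exponential(8.0, 365), 0.0)
observed = linear_reservoir(precip, k=12.0, c=0.6) + rng.normal(0.0, 0.05, 365)

# Inverse-modeling step: minimize weighted residuals between simulated and observed
# flows; a peak-flow observation group gets a higher weight, mimicking the
# expert-defined observation groups described above (weights are illustrative).
weights = np.where(observed > np.quantile(observed, 0.95), 3.0, 1.0)

def residuals(params):
    k, c = params
    return weights * (linear_reservoir(precip, k, c) - observed)

fit = least_squares(residuals, x0=[5.0, 0.3], bounds=([1.0, 0.01], [50.0, 1.0]))
print("estimated k, c:", fit.x)
```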

  8. Consistent Stochastic Modelling of Meteocean Design Parameters

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Sterndorff, M. J.

    2000-01-01

    Consistent stochastic models of metocean design parameters and their directional dependencies are essential for reliability assessment of offshore structures. In this paper a stochastic model for the annual maximum values of the significant wave height, and the associated wind velocity, current...

  9. Developing a stochastic conflict resolution model for urban runoff quality management: Application of info-gap and bargaining theories

    Science.gov (United States)

    Ghodsi, Seyed Hamed; Kerachian, Reza; Estalaki, Siamak Malakpour; Nikoo, Mohammad Reza; Zahmatkesh, Zahra

    2016-02-01

    In this paper, two deterministic and stochastic multilateral, multi-issue, non-cooperative bargaining methodologies are proposed for urban runoff quality management. In the proposed methodologies, a calibrated Storm Water Management Model (SWMM) is used to simulate stormwater runoff quantity and quality for different urban stormwater runoff management scenarios, which have been defined considering several Low Impact Development (LID) techniques. In the deterministic methodology, the best management scenario, representing location and area of LID controls, is identified using the bargaining model. In the stochastic methodology, uncertainties of some key parameters of SWMM are analyzed using the info-gap theory. For each water quality management scenario, robustness and opportuneness criteria are determined based on utility functions of different stakeholders. Then, to find the best solution, the bargaining model is performed considering a combination of robustness and opportuneness criteria for each scenario based on utility function of each stakeholder. The results of applying the proposed methodology in the Velenjak urban watershed located in the northeastern part of Tehran, the capital city of Iran, illustrate its practical utility for conflict resolution in urban water quantity and quality management. It is shown that the solution obtained using the deterministic model cannot outperform the result of the stochastic model considering the robustness and opportuneness criteria. Therefore, it can be concluded that the stochastic model, which incorporates the main uncertainties, could provide more reliable results.

  10. Stochastic linear programming models, theory, and computation

    CERN Document Server

    Kall, Peter

    2011-01-01

    This new edition of Stochastic Linear Programming: Models, Theory and Computation has been brought completely up to date, either dealing with or at least referring to new material on models and methods, including DEA with stochastic outputs modeled via constraints on special risk functions (generalizing chance constraints, ICC’s and CVaR constraints), material on Sharpe-ratio, and Asset Liability Management models involving CVaR in a multi-stage setup. To facilitate use as a text, exercises are included throughout the book, and web access is provided to a student version of the authors’ SLP-IOR software. Additionally, the authors have updated the Guide to Available Software, and they have included newer algorithms and modeling systems for SLP. The book is thus suitable as a text for advanced courses in stochastic optimization, and as a reference to the field. From Reviews of the First Edition: "The book presents a comprehensive study of stochastic linear optimization problems and their applications. … T...

  11. Stochastic Wake Modelling Based on POD Analysis

    Directory of Open Access Journals (Sweden)

    David Bastine

    2018-03-01

    In this work, large eddy simulation data is analysed to investigate a new stochastic modeling approach for the wake of a wind turbine. The data is generated by the large eddy simulation (LES) model PALM combined with an actuator disk with rotation representing the turbine. After applying a proper orthogonal decomposition (POD), three different stochastic models for the weighting coefficients of the POD modes are deduced, resulting in three different wake models. Their performance is investigated mainly on the basis of aeroelastic simulations of a wind turbine in the wake. Three different load cases and their statistical characteristics are compared for the original LES, truncated PODs and the stochastic wake models including different numbers of POD modes. It is shown that approximately six POD modes are enough to capture the load dynamics on large temporal scales. Modeling the weighting coefficients as independent stochastic processes leads to similar load characteristics as in the case of the truncated POD. To complete this simplified wake description, we show evidence that the small-scale dynamics can be captured by adding a homogeneous turbulent field to our model. In this way, we present a procedure to derive stochastic wake models from costly computational fluid dynamics (CFD) calculations or elaborate experimental investigations. These numerically efficient models provide the added value of possible long-term studies. Depending on the aspects of interest, different minimal models may be obtained.
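    A simplified sketch of the POD-plus-stochastic-coefficients idea follows: the snapshot matrix is decomposed with an SVD, independent AR(1) processes are fitted to the leading mode coefficients, and new coefficient series are resampled to synthesize wake fields. The random snapshot data, the AR(1) choice, and the mode count are illustrative assumptions, not the PALM/LES workflow of the paper.

```python
import numpy as np

def pod_stochastic_wake(snapshots, n_modes=6, n_new=500, seed=9):
    """POD of a snapshot matrix (rows = grid points, columns = time samples)
    followed by independent AR(1) models fitted to the mode coefficients, which
    are then resampled to synthesize new wake fields."""
    rng = np.random.default_rng(seed)
    mean_field = snapshots.mean(axis=1, keepdims=True)
    fluct = snapshots - mean_field
    U, s, Vt = np.linalg.svd(fluct, full_matrices=False)
    modes = U[:, :n_modes]                       # spatial POD modes
    coeffs = s[:n_modes, None] * Vt[:n_modes]    # temporal weighting coefficients
    new_coeffs = np.empty((n_modes, n_new))
    for m in range(n_modes):
        a = coeffs[m]
        phi = np.corrcoef(a[:-1], a[1:])[0, 1]           # lag-1 autocorrelation
        noise_std = a.std() * np.sqrt(1.0 - phi**2)
        new_coeffs[m, 0] = a[0]
        for t in range(1, n_new):
            new_coeffs[m, t] = phi * new_coeffs[m, t - 1] + noise_std * rng.standard_normal()
    return mean_field + modes @ new_coeffs       # synthetic wake snapshots

demo = pod_stochastic_wake(np.random.default_rng(0).normal(size=(400, 200)))
```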

  12. Stochastic stability and bifurcation in a macroeconomic model

    International Nuclear Information System (INIS)

    Li Wei; Xu Wei; Zhao Junfeng; Jin Yanfei

    2007-01-01

    On the basis of the work of Goodwin and Puu, a new business cycle model subject to stochastic parametric excitation is derived in this paper. First, we reduce the model to a one-dimensional diffusion process by applying the stochastic averaging method for quasi-non-integrable Hamiltonian systems. Second, we use the Lyapunov exponent method and the boundary classification of diffusion processes, respectively, to analyze the stochastic stability of the trivial solution of the system. The numerical results illustrate that the trivial solution must be globally stable if it is locally stable in the state space. Third, we explore the stochastic Hopf bifurcation of the business cycle model according to qualitative changes in the stationary probability density of the system response. It is concluded that the stochastic Hopf bifurcation occurs at two critical parameter values. Finally, some brief explanations are given of the potential applications of stochastic stability and bifurcation analysis.

  13. Population stochastic modelling (PSM)-An R package for mixed-effects models based on stochastic differential equations

    DEFF Research Database (Denmark)

    Klim, Søren; Mortensen, Stig Bousgaard; Kristensen, Niels Rode

    2009-01-01

    are often partly ignored in PK/PD modelling although violating the hypothesis for many standard statistical tests. This article presents a package for the statistical program R that is able to handle SDEs in a mixed-effects setting. The estimation method implemented is the FOCE1 approximation......The extension from ordinary to stochastic differential equations (SDEs) in pharmacokinetic and pharmacodynamic (PK/PD) modelling is an emerging field and has been motivated in a number of articles [N.R. Kristensen, H. Madsen, S.H. Ingwersen, Using stochastic differential equations for PK/PD model...... development, J. Pharmacokinet. Pharmacodyn. 32 (February(l)) (2005) 109-141; C.W. Tornoe, R.V Overgaard, H. Agerso, H.A. Nielsen, H. Madsen, E.N. Jonsson, Stochastic differential equations in NONMEM: implementation, application, and comparison with ordinary differential equations, Pharm. Res. 22 (August(8...

  14. Stochastic volatility models and Kelvin waves

    Science.gov (United States)

    Lipton, Alex; Sepp, Artur

    2008-08-01

    We use stochastic volatility models to describe the evolution of an asset price, its instantaneous volatility and its realized volatility. In particular, we concentrate on the Stein and Stein model (SSM) (1991) for the stochastic asset volatility and the Heston model (HM) (1993) for the stochastic asset variance. By construction, the volatility is not sign definite in SSM and is non-negative in HM. It is well known that both models produce closed-form expressions for the prices of vanilla option via the Lewis-Lipton formula. However, the numerical pricing of exotic options by means of the finite difference and Monte Carlo methods is much more complex for HM than for SSM. Until now, this complexity was considered to be an acceptable price to pay for ensuring that the asset volatility is non-negative. We argue that having negative stochastic volatility is a psychological rather than financial or mathematical problem, and advocate using SSM rather than HM in most applications. We extend SSM by adding volatility jumps and obtain a closed-form expression for the density of the asset price and its realized volatility. We also show that the current method of choice for solving pricing problems with stochastic volatility (via the affine ansatz for the Fourier-transformed density function) can be traced back to the Kelvin method designed in the 19th century for studying wave motion problems arising in fluid dynamics.
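    For a concrete feel of the SSM dynamics described here, the hedged sketch below runs a plain Euler simulation in which the volatility follows an Ornstein-Uhlenbeck process and may change sign, and estimates an expected call payoff by Monte Carlo. The discretization and all parameter values are illustrative, not the closed-form Lewis-Lipton machinery discussed in the abstract.

```python
import numpy as np

def stein_stein_paths(s0=100.0, sig0=0.2, mu=0.02, kappa=4.0, theta=0.2, eta=0.1,
                      rho=-0.5, T=1.0, n_steps=252, n_paths=10_000, seed=10):
    """Euler simulation of the Stein-Stein model: the volatility itself (not the
    variance) follows an Ornstein-Uhlenbeck process and is allowed to change sign.
    Parameter values are illustrative."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    s = np.full(n_paths, s0)
    sig = np.full(n_paths, sig0)
    for _ in range(n_steps):
        z1, z2 = rng.standard_normal((2, n_paths))
        zs = rho * z1 + np.sqrt(1.0 - rho**2) * z2       # price shock correlated with volatility shock
        s *= np.exp((mu - 0.5 * sig**2) * dt + sig * np.sqrt(dt) * zs)
        sig += kappa * (theta - sig) * dt + eta * np.sqrt(dt) * z1
    return s

terminal = stein_stein_paths()
strike = 100.0
print("Monte Carlo estimate of E[(S_T - K)^+]:", np.mean(np.maximum(terminal - strike, 0.0)))
```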

  15. Stochastic volatility models and Kelvin waves

    International Nuclear Information System (INIS)

    Lipton, Alex; Sepp, Artur

    2008-01-01

    We use stochastic volatility models to describe the evolution of an asset price, its instantaneous volatility and its realized volatility. In particular, we concentrate on the Stein and Stein model (SSM) (1991) for the stochastic asset volatility and the Heston model (HM) (1993) for the stochastic asset variance. By construction, the volatility is not sign definite in SSM and is non-negative in HM. It is well known that both models produce closed-form expressions for the prices of vanilla option via the Lewis-Lipton formula. However, the numerical pricing of exotic options by means of the finite difference and Monte Carlo methods is much more complex for HM than for SSM. Until now, this complexity was considered to be an acceptable price to pay for ensuring that the asset volatility is non-negative. We argue that having negative stochastic volatility is a psychological rather than financial or mathematical problem, and advocate using SSM rather than HM in most applications. We extend SSM by adding volatility jumps and obtain a closed-form expression for the density of the asset price and its realized volatility. We also show that the current method of choice for solving pricing problems with stochastic volatility (via the affine ansatz for the Fourier-transformed density function) can be traced back to the Kelvin method designed in the 19th century for studying wave motion problems arising in fluid dynamics

  16. Stochastic volatility models and Kelvin waves

    Energy Technology Data Exchange (ETDEWEB)

    Lipton, Alex [Merrill Lynch, Mlfc Main, 2 King Edward Street, London EC1A 1HQ (United Kingdom); Sepp, Artur [Merrill Lynch, 4 World Financial Center, New York, NY 10080 (United States)], E-mail: Alex_Lipton@ml.com, E-mail: Artur_Sepp@ml.com

    2008-08-29

    We use stochastic volatility models to describe the evolution of an asset price, its instantaneous volatility and its realized volatility. In particular, we concentrate on the Stein and Stein model (SSM) (1991) for the stochastic asset volatility and the Heston model (HM) (1993) for the stochastic asset variance. By construction, the volatility is not sign definite in SSM and is non-negative in HM. It is well known that both models produce closed-form expressions for the prices of vanilla option via the Lewis-Lipton formula. However, the numerical pricing of exotic options by means of the finite difference and Monte Carlo methods is much more complex for HM than for SSM. Until now, this complexity was considered to be an acceptable price to pay for ensuring that the asset volatility is non-negative. We argue that having negative stochastic volatility is a psychological rather than financial or mathematical problem, and advocate using SSM rather than HM in most applications. We extend SSM by adding volatility jumps and obtain a closed-form expression for the density of the asset price and its realized volatility. We also show that the current method of choice for solving pricing problems with stochastic volatility (via the affine ansatz for the Fourier-transformed density function) can be traced back to the Kelvin method designed in the 19th century for studying wave motion problems arising in fluid dynamics.

  17. Stochastic Modelling Of The Repairable System

    Directory of Open Access Journals (Sweden)

    Andrzejczak Karol

    2015-11-01

    All reliability models consisting of random time factors form stochastic processes. In this paper we recall the definitions of the most common point processes used for modelling repairable systems. In particular, this paper presents stochastic processes as examples of reliability systems that support maintenance-related decisions. We consider the simplest one-unit system with negligible repair or replacement time, i.e., the unit is operating and is repaired or replaced at failure, where the time required for repair and replacement is negligible. When the repair or replacement is completed, the unit becomes as good as new and resumes operation. The stochastic modelling of recoverable systems constitutes an excellent method of supporting maintenance-related decision-making processes and enables their more rational use.

  18. Hybrid approaches for multiple-species stochastic reaction–diffusion models

    International Nuclear Information System (INIS)

    Spill, Fabian; Guerrero, Pilar; Alarcon, Tomas; Maini, Philip K.; Byrne, Helen

    2015-01-01

    Reaction–diffusion models are used to describe systems in fields as diverse as physics, chemistry, ecology and biology. The fundamental quantities in such models are individual entities such as atoms and molecules, bacteria, cells or animals, which move and/or react in a stochastic manner. If the number of entities is large, accounting for each individual is inefficient, and often partial differential equation (PDE) models are used in which the stochastic behaviour of individuals is replaced by a description of the averaged, or mean behaviour of the system. In some situations the number of individuals is large in certain regions and small in others. In such cases, a stochastic model may be inefficient in one region, and a PDE model inaccurate in another. To overcome this problem, we develop a scheme which couples a stochastic reaction–diffusion system in one part of the domain with its mean field analogue, i.e. a discretised PDE model, in the other part of the domain. The interface in between the two domains occupies exactly one lattice site and is chosen such that the mean field description is still accurate there. In this way errors due to the flux between the domains are small. Our scheme can account for multiple dynamic interfaces separating multiple stochastic and deterministic domains, and the coupling between the domains conserves the total number of particles. The method preserves stochastic features such as extinction not observable in the mean field description, and is significantly faster to simulate on a computer than the pure stochastic model. - Highlights: • A novel hybrid stochastic/deterministic reaction–diffusion simulation method is given. • Can massively speed up stochastic simulations while preserving stochastic effects. • Can handle multiple reacting species. • Can handle moving boundaries

  19. Hybrid approaches for multiple-species stochastic reaction–diffusion models

    Energy Technology Data Exchange (ETDEWEB)

    Spill, Fabian, E-mail: fspill@bu.edu [Department of Biomedical Engineering, Boston University, 44 Cummington Street, Boston, MA 02215 (United States); Department of Mechanical Engineering, Massachusetts Institute of Technology, 77 Massachusetts Avenue, Cambridge, MA 02139 (United States); Guerrero, Pilar [Department of Mathematics, University College London, Gower Street, London WC1E 6BT (United Kingdom); Alarcon, Tomas [Centre de Recerca Matematica, Campus de Bellaterra, Edifici C, 08193 Bellaterra (Barcelona) (Spain); Departament de Matemàtiques, Universitat Atonòma de Barcelona, 08193 Bellaterra (Barcelona) (Spain); Maini, Philip K. [Wolfson Centre for Mathematical Biology, Mathematical Institute, University of Oxford, Oxford OX2 6GG (United Kingdom); Byrne, Helen [Wolfson Centre for Mathematical Biology, Mathematical Institute, University of Oxford, Oxford OX2 6GG (United Kingdom); Computational Biology Group, Department of Computer Science, University of Oxford, Oxford OX1 3QD (United Kingdom)

    2015-10-15

    Reaction–diffusion models are used to describe systems in fields as diverse as physics, chemistry, ecology and biology. The fundamental quantities in such models are individual entities such as atoms and molecules, bacteria, cells or animals, which move and/or react in a stochastic manner. If the number of entities is large, accounting for each individual is inefficient, and often partial differential equation (PDE) models are used in which the stochastic behaviour of individuals is replaced by a description of the averaged, or mean behaviour of the system. In some situations the number of individuals is large in certain regions and small in others. In such cases, a stochastic model may be inefficient in one region, and a PDE model inaccurate in another. To overcome this problem, we develop a scheme which couples a stochastic reaction–diffusion system in one part of the domain with its mean field analogue, i.e. a discretised PDE model, in the other part of the domain. The interface in between the two domains occupies exactly one lattice site and is chosen such that the mean field description is still accurate there. In this way errors due to the flux between the domains are small. Our scheme can account for multiple dynamic interfaces separating multiple stochastic and deterministic domains, and the coupling between the domains conserves the total number of particles. The method preserves stochastic features such as extinction not observable in the mean field description, and is significantly faster to simulate on a computer than the pure stochastic model. - Highlights: • A novel hybrid stochastic/deterministic reaction–diffusion simulation method is given. • Can massively speed up stochastic simulations while preserving stochastic effects. • Can handle multiple reacting species. • Can handle moving boundaries.

  20. Model selection for integrated pest management with stochasticity.

    Science.gov (United States)

    Akman, Olcay; Comar, Timothy D; Hrozencik, Daniel

    2018-04-07

    In Song and Xiang (2006), an integrated pest management model with periodically varying climatic conditions was introduced. In order to address a wider range of environmental effects, the authors here have embarked upon a series of studies resulting in a more flexible modeling approach. In Akman et al. (2013), the impact of randomly changing environmental conditions is examined by incorporating stochasticity into the birth pulse of the prey species. In Akman et al. (2014), the authors introduce a class of models via a mixture of two birth-pulse terms and determined conditions for the global and local asymptotic stability of the pest eradication solution. With this work, the authors unify the stochastic and mixture model components to create further flexibility in modeling the impacts of random environmental changes on an integrated pest management system. In particular, we first determine the conditions under which solutions of our deterministic mixture model are permanent. We then analyze the stochastic model to find the optimal value of the mixing parameter that minimizes the variance in the efficacy of the pesticide. Additionally, we perform a sensitivity analysis to show that the corresponding pesticide efficacy determined by this optimization technique is indeed robust. Through numerical simulations we show that permanence can be preserved in our stochastic model. Our study of the stochastic version of the model indicates that our results on the deterministic model provide informative conclusions about the behavior of the stochastic model. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. Infinite-degree-corrected stochastic block model

    DEFF Research Database (Denmark)

    Herlau, Tue; Schmidt, Mikkel Nørgaard; Mørup, Morten

    2014-01-01

    In stochastic block models, which are among the most prominent statistical models for cluster analysis of complex networks, clusters are defined as groups of nodes with statistically similar link probabilities within and between groups. A recent extension by Karrer and Newman [Karrer and Newman...... corrected stochastic block model as a nonparametric Bayesian model, incorporating a parameter to control the amount of degree correction that can then be inferred from data. Additionally, our formulation yields principled ways of inferring the number of groups as well as predicting missing links...
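    As an illustration of the degree-corrected construction underlying this family of models, the snippet below samples a network in the Karrer-Newman parameterization, where the expected edge count between two nodes is the product of their degree propensities and a block-level rate. The group sizes, propensities, and block matrix are illustrative assumptions, not the nonparametric Bayesian model of the paper.

```python
import numpy as np

def sample_dc_sbm(groups, degree_prop, omega, seed=11):
    """Sample an undirected multigraph from a degree-corrected stochastic block
    model: the expected number of edges between nodes i and j is
    theta_i * theta_j * omega[g_i, g_j]; edge counts are Poisson distributed."""
    rng = np.random.default_rng(seed)
    theta = np.asarray(degree_prop, dtype=float)
    g = np.asarray(groups)
    n = len(g)
    A = np.zeros((n, n), dtype=int)
    for i in range(n):
        for j in range(i + 1, n):
            A[i, j] = A[j, i] = rng.poisson(theta[i] * theta[j] * omega[g[i], g[j]])
    return A

groups = np.repeat([0, 1], 50)                       # two groups of 50 nodes
theta = np.random.default_rng(0).uniform(0.5, 1.5, 100)
omega = np.array([[0.08, 0.01], [0.01, 0.08]])       # assortative block rates
A = sample_dc_sbm(groups, theta, omega)
print("mean degree:", A.sum(axis=1).mean())
```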

  2. Calibrated Properties Model

    International Nuclear Information System (INIS)

    Ghezzehej, T.

    2004-01-01

    The purpose of this model report is to document the calibrated properties model that provides calibrated property sets for unsaturated zone (UZ) flow and transport process models (UZ models). The calibration of the property sets is performed through inverse modeling. This work followed, and was planned in, ''Technical Work Plan (TWP) for: Unsaturated Zone Flow Analysis and Model Report Integration'' (BSC 2004 [DIRS 169654], Sections 1.2.6 and 2.1.1.6). Direct inputs to this model report were derived from the following upstream analysis and model reports: ''Analysis of Hydrologic Properties Data'' (BSC 2004 [DIRS 170038]); ''Development of Numerical Grids for UZ Flow and Transport Modeling'' (BSC 2004 [DIRS 169855]); ''Simulation of Net Infiltration for Present-Day and Potential Future Climates'' (BSC 2004 [DIRS 170007]); ''Geologic Framework Model'' (GFM2000) (BSC 2004 [DIRS 170029]). Additionally, this model report incorporates errata of the previous version and closure of the Key Technical Issue agreement TSPAI 3.26 (Section 6.2.2 and Appendix B), and it is revised for improved transparency

  3. A Stochastic Model for Malaria Transmission Dynamics

    Directory of Open Access Journals (Sweden)

    Rachel Waema Mbogo

    2018-01-01

    Malaria is one of the three most dangerous infectious diseases worldwide (along with HIV/AIDS and tuberculosis). In this paper we compare the disease dynamics of the deterministic and stochastic models in order to determine the effect of randomness in malaria transmission dynamics. Relationships between the basic reproduction number for malaria transmission dynamics between humans and mosquitoes and the extinction thresholds of corresponding continuous-time Markov chain models are derived under certain assumptions. The stochastic model is formulated using the continuous-time discrete state Galton-Watson branching process (CTDSGWbp). The reproduction number of deterministic models is an essential quantity to predict whether an epidemic will spread or die out. Thresholds for disease extinction from stochastic models contribute crucial knowledge on disease control and elimination and mitigation of infectious diseases. Analytical and numerical results show some significant differences in model predictions between the stochastic and deterministic models. In particular, we find that a malaria outbreak is more likely if the disease is introduced by infected mosquitoes as opposed to infected humans. These insights demonstrate the importance of a policy or intervention focusing on controlling the infected mosquito population if the control of malaria is to be realized.
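    To illustrate the branching-process threshold idea, the hedged sketch below estimates the extinction probability of a simple Galton-Watson process with Poisson offspring by Monte Carlo. This single-type, discrete-generation process is a stand-in for intuition only; the paper's continuous-time, multi-type human-mosquito formulation is not reproduced here.

```python
import numpy as np

def extinction_probability(r0, n_initial=1, n_generations=60, n_runs=20_000, seed=12):
    """Monte Carlo estimate of the extinction probability of a Galton-Watson
    branching process with Poisson(r0) offspring per individual, a common
    stochastic-threshold analogue of the deterministic reproduction number."""
    rng = np.random.default_rng(seed)
    extinct = 0
    for _ in range(n_runs):
        z = n_initial
        for _ in range(n_generations):
            if z == 0:
                break
            z = rng.poisson(r0 * z)        # total offspring of the current generation
            if z > 10_000:                 # treat a large outbreak as non-extinct
                break
        extinct += (z == 0)
    return extinct / n_runs

for r0 in (0.8, 1.2, 2.0):
    print(f"R0 = {r0}: extinction probability ~ {extinction_probability(r0):.3f}")
```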

  4. Stochastic modeling and analysis of telecoms networks

    CERN Document Server

    Decreusefond, Laurent

    2012-01-01

    This book addresses the stochastic modeling of telecommunication networks, introducing the main mathematical tools for that purpose, such as Markov processes, real and spatial point processes and stochastic recursions, and presenting a wide list of results on stability, performance and comparison of systems. The authors propose a comprehensive mathematical construction of the foundations of stochastic network theory: Markov chains and continuous-time Markov chains are extensively studied using an original martingale-based approach. A complete presentation of stochastic recursions from an

  5. Joint Pricing of VIX and SPX Options with Stochastic Volatility and Jump models

    DEFF Research Database (Denmark)

    Kokholm, Thomas; Stisen, Martin

    2015-01-01

    to existing literature, we derive numerically simpler VIX option and futures pricing formulas in the case of the SVJ model. Moreover, the paper is the first to study the pricing performance of three widely used models to SPX options and VIX derivatives.......With the existence of active markets for volatility derivatives and options on the underlying instrument, the need for models that are able to price these markets consistently has increased. Although pricing formulas for VIX and vanilla options are now available for commonly employed models...... and variance (SVJJ) are jointly calibrated to market quotes on SPX and VIX options together with VIX futures. The full flexibility of having jumps in both returns and volatility added to a stochastic volatility model is essential. Moreover, we find that the SVJJ model with the Feller condition imposed...

  6. Stochastic line motion and stochastic flux conservation for nonideal hydromagnetic models

    International Nuclear Information System (INIS)

    Eyink, Gregory L.

    2009-01-01

    We prove that smooth solutions of nonideal (viscous and resistive) incompressible magnetohydrodynamic (MHD) equations satisfy a stochastic law of flux conservation. This property implies that the magnetic flux through a surface is equal to the average of the magnetic fluxes through an ensemble of surfaces advected backward in time by the plasma velocity perturbed with a random white noise. Our result is an analog of the well-known Alfven theorem of ideal MHD and is valid for any value of the magnetic Prandtl number. A second stochastic conservation law is shown to hold at unit Prandtl number, a random version of the generalized Kelvin theorem derived by Bekenstein and Oron for ideal MHD. These stochastic conservation laws are not only shown to be consequences of the nonideal MHD equations but are proved in fact to be equivalent to those equations. We derive similar results for two more refined hydromagnetic models, Hall MHD and the two-fluid plasma model, still assuming incompressible velocities and isotropic transport coefficients. Finally, we use these results to discuss briefly the infinite-Reynolds-number limit of hydromagnetic turbulence and to support the conjecture that flux conservation remains stochastic in that limit.

  7. A stochastic model of enzyme kinetics

    Science.gov (United States)

    Stefanini, Marianne; Newman, Timothy; McKane, Alan

    2003-10-01

    Enzyme kinetics is generally modeled by deterministic rate equations, which in the simplest case lead to the well-known Michaelis-Menten equation. It is plausible that stochastic effects will play an important role at low enzyme concentrations. We have addressed this by constructing a simple stochastic model which can be solved exactly in the steady state. Throughout a wide range of parameter values, Michaelis-Menten dynamics is replaced by a new and simple theoretical result.
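    A standard way to make such a stochastic enzyme-kinetics model concrete is the Gillespie algorithm; the sketch below simulates the E + S <-> ES -> E + P scheme exactly at the level of molecule counts, which is the regime where the low-copy-number effects mentioned above matter. Rate constants and copy numbers are illustrative assumptions, not those of the cited model.

```python
import numpy as np

def gillespie_mm(e0=10, s0=100, k1=0.01, k2=0.1, k3=0.05, t_max=500.0, seed=13):
    """Exact stochastic (Gillespie) simulation of the enzyme-kinetics scheme
    E + S <-> ES -> E + P, the stochastic counterpart of Michaelis-Menten dynamics."""
    rng = np.random.default_rng(seed)
    E, S, ES, P = e0, s0, 0, 0
    t, times, product = 0.0, [0.0], [0]
    while t < t_max:
        rates = np.array([k1 * E * S,   # binding:    E + S -> ES
                          k2 * ES,      # unbinding:  ES -> E + S
                          k3 * ES])     # catalysis:  ES -> E + P
        total = rates.sum()
        if total == 0.0:                # no reactions possible: substrate exhausted
            break
        t += rng.exponential(1.0 / total)
        reaction = rng.choice(3, p=rates / total)
        if reaction == 0:
            E, S, ES = E - 1, S - 1, ES + 1
        elif reaction == 1:
            E, S, ES = E + 1, S + 1, ES - 1
        else:
            E, ES, P = E + 1, ES - 1, P + 1
        times.append(t)
        product.append(P)
    return np.array(times), np.array(product)

t, p = gillespie_mm()
print("product molecules formed:", p[-1])
```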

  8. Boosting Bayesian parameter inference of nonlinear stochastic differential equation models by Hamiltonian scale separation.

    Science.gov (United States)

    Albert, Carlo; Ulzega, Simone; Stoop, Ruedi

    2016-04-01

    Parameter inference is a fundamental problem in data-driven modeling. Given observed data that is believed to be a realization of some parameterized model, the aim is to find parameter values that are able to explain the observed data. In many situations, the dominant sources of uncertainty must be included into the model for making reliable predictions. This naturally leads to stochastic models. Stochastic models render parameter inference much harder, as the aim then is to find a distribution of likely parameter values. In Bayesian statistics, which is a consistent framework for data-driven learning, this so-called posterior distribution can be used to make probabilistic predictions. We propose a novel, exact, and very efficient approach for generating posterior parameter distributions for stochastic differential equation models calibrated to measured time series. The algorithm is inspired by reinterpreting the posterior distribution as a statistical mechanics partition function of an object akin to a polymer, where the measurements are mapped on heavier beads compared to those of the simulated data. To arrive at distribution samples, we employ a Hamiltonian Monte Carlo approach combined with a multiple time-scale integration. A separation of time scales naturally arises if either the number of measurement points or the number of simulation points becomes large. Furthermore, at least for one-dimensional problems, we can decouple the harmonic modes between measurement points and solve the fastest part of their dynamics analytically. Our approach is applicable to a wide range of inference problems and is highly parallelizable.

  9. FluTE, a publicly available stochastic influenza epidemic simulation model.

    Directory of Open Access Journals (Sweden)

    Dennis L Chao

    2010-01-01

    Full Text Available Mathematical and computer models of epidemics have contributed to our understanding of the spread of infectious disease and the measures needed to contain or mitigate them. To help prepare for future influenza seasonal epidemics or pandemics, we developed a new stochastic model of the spread of influenza across a large population. Individuals in this model have realistic social contact networks, and transmission and infections are based on the current state of knowledge of the natural history of influenza. The model has been calibrated so that outcomes are consistent with the 1957/1958 Asian A(H2N2) and 2009 pandemic A(H1N1) influenza viruses. We present examples of how this model can be used to study the dynamics of influenza epidemics in the United States and simulate how to mitigate or delay them using pharmaceutical interventions and social distancing measures. Computer simulation models play an essential role in informing public policy and evaluating pandemic preparedness plans. We have made the source code of this model publicly available to encourage its use and further development.

  10. FluTE, a publicly available stochastic influenza epidemic simulation model.

    Science.gov (United States)

    Chao, Dennis L; Halloran, M Elizabeth; Obenchain, Valerie J; Longini, Ira M

    2010-01-29

    Mathematical and computer models of epidemics have contributed to our understanding of the spread of infectious disease and the measures needed to contain or mitigate them. To help prepare for future influenza seasonal epidemics or pandemics, we developed a new stochastic model of the spread of influenza across a large population. Individuals in this model have realistic social contact networks, and transmission and infections are based on the current state of knowledge of the natural history of influenza. The model has been calibrated so that outcomes are consistent with the 1957/1958 Asian A(H2N2) and 2009 pandemic A(H1N1) influenza viruses. We present examples of how this model can be used to study the dynamics of influenza epidemics in the United States and simulate how to mitigate or delay them using pharmaceutical interventions and social distancing measures. Computer simulation models play an essential role in informing public policy and evaluating pandemic preparedness plans. We have made the source code of this model publicly available to encourage its use and further development.

  11. Stochastic Control - External Models

    DEFF Research Database (Denmark)

    Poulsen, Niels Kjølstad

    2005-01-01

    This note is devoted to control of stochastic systems described in discrete time. We are concerned with external descriptions or transfer function models, where we have a dynamic model for the input-output relation only (i.e., no direct internal information). The methods are based on LTI systems...

  12. Stochastic dynamic modeling of regular and slow earthquakes

    Science.gov (United States)

    Aso, N.; Ando, R.; Ide, S.

    2017-12-01

    Both regular and slow earthquakes are slip phenomena on plate boundaries and are simulated by (quasi-)dynamic modeling [Liu and Rice, 2005]. In these numerical simulations, spatial heterogeneity is usually considered not only for explaining real physical properties but also for evaluating the stability of the calculations or the sensitivity of the results to the conditions. However, even though we discretize the model space with small grids, heterogeneity at scales smaller than the grid size is not considered in models with deterministic governing equations. To evaluate the effect of heterogeneity at these smaller scales, we need to consider stochastic interactions between slip and stress in dynamic modeling. Tidal stress is known to trigger or affect both regular and slow earthquakes [Yabe et al., 2015; Ide et al., 2016], and such an external force with fluctuation can also be considered as a stochastic external force. A healing process of faults may also be stochastic, so we introduce a stochastic friction law. In the present study, we propose a stochastic dynamic model to explain both regular and slow earthquakes. We solve a mode III problem, which corresponds to rupture propagation along the strike direction. We use a BIEM (boundary integral equation method) scheme to simulate slip evolution, but we add stochastic perturbations to the governing equations, which are usually written in a deterministic manner. As the simplest type of perturbation, we adopt Gaussian deviations in the formulation of the slip-stress kernel, the external force, and the friction. By increasing the amplitude of perturbations of the slip-stress kernel, we reproduce the complicated rupture processes of regular earthquakes, including unilateral and bilateral ruptures. By perturbing the external force, we reproduce slow rupture propagation at a scale of km/day. The slow propagation generated by a combination of fast interaction at S-wave velocity is analogous to the kinetic theory of gases: thermal

  13. Stochastic Watershed Models for Risk Based Decision Making

    Science.gov (United States)

    Vogel, R. M.

    2017-12-01

    Over half a century ago, the Harvard Water Program introduced the field of operational or synthetic hydrology providing stochastic streamflow models (SSMs), which could generate ensembles of synthetic streamflow traces useful for hydrologic risk management. The application of SSMs, based on streamflow observations alone, revolutionized water resources planning activities, yet has fallen out of favor due, in part, to their inability to account for the now nearly ubiquitous anthropogenic influences on streamflow. This commentary advances the modern equivalent of SSMs, termed `stochastic watershed models' (SWMs) useful as input to nearly all modern risk based water resource decision making approaches. SWMs are deterministic watershed models implemented using stochastic meteorological series, model parameters and model errors, to generate ensembles of streamflow traces that represent the variability in possible future streamflows. SWMs combine deterministic watershed models, which are ideally suited to accounting for anthropogenic influences, with recent developments in uncertainty analysis and principles of stochastic simulation

  14. Observation models in radiocarbon calibration

    International Nuclear Information System (INIS)

    Jones, M.D.; Nicholls, G.K.

    2001-01-01

    The observation model underlying any calibration process dictates the precise mathematical details of the calibration calculations. Accordingly it is important that an appropriate observation model is used. Here this is illustrated with reference to the use of reservoir offsets where the standard calibration approach is based on a different model to that which the practitioners clearly believe is being applied. This sort of error can give rise to significantly erroneous calibration results. (author). 12 refs., 1 fig

  15. Dynamics of a Stochastic Intraguild Predation Model

    Directory of Open Access Journals (Sweden)

    Zejing Xing

    2016-04-01

    Full Text Available Intraguild predation (IGP) is a widespread ecological phenomenon which occurs when one predator species attacks another predator species with which it competes for a shared prey species. The objective of this paper is to study the dynamical properties of a stochastic intraguild predation model. We analyze stochastic persistence and extinction of the stochastic IGP model containing five cases and establish the sufficient criteria for global asymptotic stability of the positive solutions. This study shows that the three species can coexist under the influence of environmental noise, and that the noise may have a positive effect for IGP species. A stationary distribution of the stochastic IGP model is established and it has the ergodic property, suggesting that, as time evolves, the time average of the population size equals the average under the stationary distribution. Finally, we show that our results may be extended to two well-known biological systems: food chains and exploitative competition.

  16. Calibrating a multi-model approach to defect production in high energy collision cascades

    International Nuclear Information System (INIS)

    Heinisch, H.L.; Singh, B.N.; Diaz de la Rubia, T.

    1994-01-01

    A multi-model approach to simulating defect production processes at the atomic scale is described that incorporates molecular dynamics (MD), binary collision approximation (BCA) calculations and stochastic annealing simulations. The central hypothesis is that the simple, fast computer codes capable of simulating large numbers of high energy cascades (e.g., BCA codes) can be made to yield the correct defect configurations when their parameters are calibrated using the results of the more physically realistic MD simulations. The calibration procedure is investigated using results of MD simulations of 25 keV cascades in copper. The configurations of point defects are extracted from the MD cascade simulations at the end of the collisional phase, thus providing information similar to that obtained with a binary collision model. The MD collisional phase defect configurations are used as input to the ALSOME annealing simulation code, and values of the ALSOME quenching parameters are determined that yield the best fit to the post-quenching defect configurations of the MD simulations. ((orig.))

  17. Stochastic Modeling of Traffic Air Pollution

    DEFF Research Database (Denmark)

    Thoft-Christensen, Palle

    2014-01-01

    In this paper, modeling of traffic air pollution is discussed with special reference to infrastructures. A number of subjects related to health effects of air pollution and the different types of pollutants are briefly presented. A simple model for estimating the social cost of traffic-related air pollution is derived. Several authors have published papers on this very complicated subject, but no stochastic modelling procedure has obtained general acceptance. The subject is discussed on the basis of a deterministic model. However, it is straightforward to modify this model to include uncertain parameters and use simple Monte Carlo techniques to obtain a stochastic estimate of the costs of traffic air pollution for infrastructures.
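
    A minimal sketch of the kind of Monte Carlo procedure described: uncertain inputs are sampled and propagated through a simple cost model to yield a distribution of social costs. All distributions and numbers below are hypothetical placeholders, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000                                    # Monte Carlo samples

# Hypothetical uncertain inputs for a single road link (illustrative only).
traffic = rng.normal(20_000, 2_000, n)         # vehicles per day
emission = rng.lognormal(np.log(0.2), 0.3, n)  # kg pollutant per vehicle-km
length_km = 5.0                                # link length
cost_per_kg = rng.triangular(10, 25, 60, n)    # social cost per kg of pollutant

annual_cost = traffic * 365 * length_km * emission * cost_per_kg
print(f"mean annual social cost: {annual_cost.mean():.3e}")
print(f"95% interval: {np.percentile(annual_cost, [2.5, 97.5])}")
```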

  18. Stochastic modelling of turbulence

    DEFF Research Database (Denmark)

    Sørensen, Emil Hedevang Lohse

    previously been shown to be closely connected to the energy dissipation. The incorporation of the small scale dynamics into the spatial model opens the door to a fully fledged stochastic model of turbulence. Concerning the interaction of wind and wind turbine, a new method is proposed to extract wind turbine...

  19. Approximate models for broken clouds in stochastic radiative transfer theory

    International Nuclear Information System (INIS)

    Doicu, Adrian; Efremenko, Dmitry S.; Loyola, Diego; Trautmann, Thomas

    2014-01-01

    This paper presents approximate models in stochastic radiative transfer theory. The independent column approximation and its modified version with a solar source computed in a full three-dimensional atmosphere are formulated in a stochastic framework and for arbitrary cloud statistics. The nth-order stochastic models describing the independent column approximations are equivalent to the nth-order stochastic models for the original radiance fields in which the gradient vectors are neglected. Fast approximate models are further derived on the basis of zeroth-order stochastic models and the independent column approximation. The so-called “internal mixing” models assume a combination of the optical properties of the cloud and the clear sky, while the “external mixing” models assume a combination of the radiances corresponding to completely overcast and clear skies. A consistent treatment of internal and external mixing models is provided, and a new parameterization of the closure coefficient in the effective thickness approximation is given. An efficient computation of the closure coefficient for internal mixing models, using a previously derived vector stochastic model as a reference, is also presented. Equipped with appropriate look-up tables for the closure coefficient, these models can easily be integrated into operational trace gas retrieval systems that exploit absorption features in the near-IR solar spectrum. - Highlights: • Independent column approximation in a stochastic setting. • Fast internal and external mixing models for total and diffuse radiances. • Efficient optimization of internal mixing models to match reference models

  20. Parameter estimation in stochastic rainfall-runoff models

    DEFF Research Database (Denmark)

    Jonsdottir, Harpa; Madsen, Henrik; Palsson, Olafur Petur

    2006-01-01

    A parameter estimation method for stochastic rainfall-runoff models is presented. The model considered in the paper is a conceptual stochastic model, formulated in continuous-discrete state space form. The model is small and a fully automatic optimization is, therefore, possible for estimating all...... the parameter values are optimal for simulation or prediction. The data originates from Iceland and the model is designed for Icelandic conditions, including a snow routine for mountainous areas. The model demands only two input data series, precipitation and temperature and one output data series...

  1. Test models for improving filtering with model errors through stochastic parameter estimation

    International Nuclear Information System (INIS)

    Gershgorin, B.; Harlim, J.; Majda, A.J.

    2010-01-01

    The filtering skill for turbulent signals from nature is often limited by model errors created by utilizing an imperfect model for filtering. Updating the parameters in the imperfect model through stochastic parameter estimation is one way to increase filtering skill and model performance. Here a suite of stringent test models for filtering with stochastic parameter estimation is developed based on the Stochastic Parameterization Extended Kalman Filter (SPEKF). These new SPEKF-algorithms systematically correct both multiplicative and additive biases and involve exact formulas for propagating the mean and covariance including the parameters in the test model. A comprehensive study is presented of robust parameter regimes for increasing filtering skill through stochastic parameter estimation for turbulent signals as the observation time and observation noise are varied and even when the forcing is incorrectly specified. The results here provide useful guidelines for filtering turbulent signals in more complex systems with significant model errors.
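
    The SPEKF algorithms themselves use exact statistics for the test model; as a generic, hedged illustration of the underlying idea (augmenting the filter state with an uncertain model parameter so the filter estimates it alongside the signal), the following toy Kalman filter tracks a scalar signal together with an unknown additive bias. The model and all values are arbitrary assumptions, not the SPEKF formulas.

```python
import numpy as np

rng = np.random.default_rng(3)

# True scalar signal with an unknown additive model bias b.
F, b_true, q, r = 0.9, 0.4, 0.05, 0.1
n = 200
u = np.zeros(n)
y = np.zeros(n)
for k in range(1, n):
    u[k] = F * u[k - 1] + b_true + np.sqrt(q) * rng.normal()
    y[k] = u[k] + np.sqrt(r) * rng.normal()

# Kalman filter on the augmented state x = [u, b]; b modelled as a slow random walk.
A = np.array([[F, 1.0], [0.0, 1.0]])
Q = np.diag([q, 1e-4])
H = np.array([[1.0, 0.0]])
x = np.zeros(2)
P = np.eye(2)
for k in range(1, n):
    x = A @ x                          # predict
    P = A @ P @ A.T + Q
    S = H @ P @ H.T + r                # innovation variance
    K = P @ H.T / S                    # Kalman gain
    x = x + (K * (y[k] - H @ x)).ravel()
    P = (np.eye(2) - K @ H) @ P
print("estimated bias:", round(x[1], 3), " true bias:", b_true)
```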

  2. Modelling and application of stochastic processes

    CERN Document Server

    1986-01-01

    The subject of modelling and application of stochastic processes is too vast to be exhausted in a single volume. In this book, attention is focused on a small subset of this vast subject. The primary emphasis is on realization and approximation of stochastic systems. Recently there has been considerable interest in the stochastic realization problem, and hence, an attempt has been made here to collect in one place some of the more recent approaches and algorithms for solving the stochastic realization problem. Various different approaches for realizing linear minimum-phase systems, linear nonminimum-phase systems, and bilinear systems are presented. These approaches range from time-domain methods to spectral-domain methods. An overview of the chapter contents briefly describes these approaches. Also, in most of these chapters special attention is given to the problem of developing numerically efficient algorithms for obtaining reduced-order (approximate) stochastic realizations. On the application side,...

  3. 12th Workshop on Stochastic Models, Statistics and Their Applications

    CERN Document Server

    Rafajłowicz, Ewaryst; Szajowski, Krzysztof

    2015-01-01

    This volume presents the latest advances and trends in stochastic models and related statistical procedures. Selected peer-reviewed contributions focus on statistical inference, quality control, change-point analysis and detection, empirical processes, time series analysis, survival analysis and reliability, statistics for stochastic processes, big data in technology and the sciences, statistical genetics, experiment design, and stochastic models in engineering. Stochastic models and related statistical procedures play an important part in furthering our understanding of the challenging problems currently arising in areas of application such as the natural sciences, information technology, engineering, image analysis, genetics, energy and finance, to name but a few. This collection arises from the 12th Workshop on Stochastic Models, Statistics and Their Applications, Wroclaw, Poland.

  4. Constraining Stochastic Parametrisation Schemes Using High-Resolution Model Simulations

    Science.gov (United States)

    Christensen, H. M.; Dawson, A.; Palmer, T.

    2017-12-01

    Stochastic parametrisations are used in weather and climate models as a physically motivated way to represent model error due to unresolved processes. Designing new stochastic schemes has been the target of much innovative research over the last decade. While a focus has been on developing physically motivated approaches, many successful stochastic parametrisation schemes are very simple, such as the European Centre for Medium-Range Weather Forecasts (ECMWF) multiplicative scheme `Stochastically Perturbed Parametrisation Tendencies' (SPPT). The SPPT scheme improves the skill of probabilistic weather and seasonal forecasts, and so is widely used. However, little work has focused on assessing the physical basis of the SPPT scheme. We address this matter by using high-resolution model simulations to explicitly measure the `error' in the parametrised tendency that SPPT seeks to represent. The high resolution simulations are first coarse-grained to the desired forecast model resolution before they are used to produce initial conditions and forcing data needed to drive the ECMWF Single Column Model (SCM). By comparing SCM forecast tendencies with the evolution of the high resolution model, we can measure the `error' in the forecast tendencies. In this way, we provide justification for the multiplicative nature of SPPT, and for the temporal and spatial scales of the stochastic perturbations. However, we also identify issues with the SPPT scheme. It is therefore hoped these measurements will improve both holistic and process based approaches to stochastic parametrisation. Figure caption: Instantaneous snapshot of the optimal SPPT stochastic perturbation, derived by comparing high-resolution simulations with a low resolution forecast model.
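
    A schematic of the multiplicative perturbation idea behind SPPT, written as a hedged sketch rather than ECMWF's implementation: the net parametrised tendency is scaled by (1 + r), where r is a zero-mean random pattern correlated in space and time. Here a simple AR(1)-in-time, spatially smoothed field stands in for the operational spectral pattern generator; all coefficients are arbitrary.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(7)

nlat, nlon = 90, 180
phi = 0.95      # temporal AR(1) coefficient (stands in for the decorrelation time)
sigma = 0.5     # target standard deviation of the perturbation pattern r
r = np.zeros((nlat, nlon))

def evolve_pattern(r):
    # Spatially smoothed white noise gives a crude spatially correlated innovation.
    eps = gaussian_filter(rng.normal(size=r.shape), sigma=5)
    eps *= sigma * np.sqrt(1 - phi ** 2) / eps.std()
    return phi * r + eps

def perturb_tendency(tendency, r, clip=0.9):
    # Multiplicative perturbation of the net physics tendency,
    # with the pattern clipped so that (1 + r) stays positive.
    return (1.0 + np.clip(r, -clip, clip)) * tendency

tendency = rng.normal(size=(nlat, nlon))      # placeholder parametrised tendency
for _ in range(10):                           # ten model time steps
    r = evolve_pattern(r)
    perturbed = perturb_tendency(tendency, r)
print("perturbation pattern std:", round(float(r.std()), 3))
```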

  5. Hybrid approaches for multiple-species stochastic reaction-diffusion models

    Science.gov (United States)

    Spill, Fabian; Guerrero, Pilar; Alarcon, Tomas; Maini, Philip K.; Byrne, Helen

    2015-10-01

    Reaction-diffusion models are used to describe systems in fields as diverse as physics, chemistry, ecology and biology. The fundamental quantities in such models are individual entities such as atoms and molecules, bacteria, cells or animals, which move and/or react in a stochastic manner. If the number of entities is large, accounting for each individual is inefficient, and often partial differential equation (PDE) models are used in which the stochastic behaviour of individuals is replaced by a description of the averaged, or mean behaviour of the system. In some situations the number of individuals is large in certain regions and small in others. In such cases, a stochastic model may be inefficient in one region, and a PDE model inaccurate in another. To overcome this problem, we develop a scheme which couples a stochastic reaction-diffusion system in one part of the domain with its mean field analogue, i.e. a discretised PDE model, in the other part of the domain. The interface in between the two domains occupies exactly one lattice site and is chosen such that the mean field description is still accurate there. In this way errors due to the flux between the domains are small. Our scheme can account for multiple dynamic interfaces separating multiple stochastic and deterministic domains, and the coupling between the domains conserves the total number of particles. The method preserves stochastic features such as extinction not observable in the mean field description, and is significantly faster to simulate on a computer than the pure stochastic model.
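
    A much-simplified sketch of the coupling idea for pure diffusion on a 1D lattice: integer particle numbers evolve stochastically on the left half, a mean-field (discretised PDE) concentration evolves deterministically on the right half, and exchanges across a single interface are realised so that particle numbers stay integers and total mass is conserved. This is only an illustration of the flavour, not the authors' scheme; all rates and sizes are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(11)

M = 20                       # sites per sub-domain
d, dt, steps = 1.0, 0.01, 2000
n = np.full(M, 50)           # stochastic (integer) domain, left half
c = np.zeros(M)              # deterministic (continuous) domain, right half

for _ in range(steps):
    # --- stochastic hops on the left half (binomial thinning keeps n >= 0) ---
    hop_r = rng.binomial(n, d * dt)
    hop_l = rng.binomial(n - hop_r, d * dt)
    new_n = n - hop_r - hop_l
    new_n[1:] += hop_r[:-1]          # particles arriving from the left neighbour
    new_n[:-1] += hop_l[1:]          # particles arriving from the right neighbour
    new_n[0] += hop_l[0]             # reflecting wall at the far left
    into_pde = hop_r[-1]             # particles crossing the interface rightwards

    # --- deterministic diffusion on the right half (explicit Euler) ---
    lap = np.zeros(M)
    lap[1:-1] = c[2:] - 2 * c[1:-1] + c[:-2]
    lap[0] = c[1] - c[0]             # interface flux handled separately below
    lap[-1] = c[-2] - c[-1]          # reflecting wall at the far right
    new_c = c + d * dt * lap
    new_c[0] += into_pde             # mass arriving from the stochastic side

    # flux back across the interface, realised as an integer (Poisson) draw
    back = rng.poisson(d * dt * max(c[0], 0.0))
    back = min(back, int(new_c[0]))  # never remove more mass than is present
    new_c[0] -= back
    new_n[-1] += back

    n, c = new_n, new_c

print("total mass (conserved up to rounding):", n.sum() + c.sum())
print("mean occupancy, stochastic side:", n.mean(), " PDE side:", round(c.mean(), 2))
```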

  6. Hybrid approaches for multiple-species stochastic reaction-diffusion models.

    KAUST Repository

    Spill, Fabian; Guerrero, Pilar; Alarcon, Tomas; Maini, Philip K; Byrne, Helen

    2015-01-01

    Reaction-diffusion models are used to describe systems in fields as diverse as physics, chemistry, ecology and biology. The fundamental quantities in such models are individual entities such as atoms and molecules, bacteria, cells or animals, which move and/or react in a stochastic manner. If the number of entities is large, accounting for each individual is inefficient, and often partial differential equation (PDE) models are used in which the stochastic behaviour of individuals is replaced by a description of the averaged, or mean behaviour of the system. In some situations the number of individuals is large in certain regions and small in others. In such cases, a stochastic model may be inefficient in one region, and a PDE model inaccurate in another. To overcome this problem, we develop a scheme which couples a stochastic reaction-diffusion system in one part of the domain with its mean field analogue, i.e. a discretised PDE model, in the other part of the domain. The interface in between the two domains occupies exactly one lattice site and is chosen such that the mean field description is still accurate there. In this way errors due to the flux between the domains are small. Our scheme can account for multiple dynamic interfaces separating multiple stochastic and deterministic domains, and the coupling between the domains conserves the total number of particles. The method preserves stochastic features such as extinction not observable in the mean field description, and is significantly faster to simulate on a computer than the pure stochastic model.

  7. Hybrid approaches for multiple-species stochastic reaction-diffusion models.

    KAUST Repository

    Spill, Fabian

    2015-10-01

    Reaction-diffusion models are used to describe systems in fields as diverse as physics, chemistry, ecology and biology. The fundamental quantities in such models are individual entities such as atoms and molecules, bacteria, cells or animals, which move and/or react in a stochastic manner. If the number of entities is large, accounting for each individual is inefficient, and often partial differential equation (PDE) models are used in which the stochastic behaviour of individuals is replaced by a description of the averaged, or mean behaviour of the system. In some situations the number of individuals is large in certain regions and small in others. In such cases, a stochastic model may be inefficient in one region, and a PDE model inaccurate in another. To overcome this problem, we develop a scheme which couples a stochastic reaction-diffusion system in one part of the domain with its mean field analogue, i.e. a discretised PDE model, in the other part of the domain. The interface in between the two domains occupies exactly one lattice site and is chosen such that the mean field description is still accurate there. In this way errors due to the flux between the domains are small. Our scheme can account for multiple dynamic interfaces separating multiple stochastic and deterministic domains, and the coupling between the domains conserves the total number of particles. The method preserves stochastic features such as extinction not observable in the mean field description, and is significantly faster to simulate on a computer than the pure stochastic model.

  8. Gradient-based model calibration with proxy-model assistance

    Science.gov (United States)

    Burrows, Wesley; Doherty, John

    2016-02-01

    Use of a proxy model in gradient-based calibration and uncertainty analysis of a complex groundwater model with large run times and problematic numerical behaviour is described. The methodology is general, and can be used with models of all types. The proxy model is based on a series of analytical functions that link all model outputs used in the calibration process to all parameters requiring estimation. In enforcing history-matching constraints during the calibration and post-calibration uncertainty analysis processes, the proxy model is run for the purposes of populating the Jacobian matrix, while the original model is run when testing parameter upgrades; the latter process is readily parallelized. Use of a proxy model in this fashion dramatically reduces the computational burden of complex model calibration and uncertainty analysis. At the same time, the effect of model numerical misbehaviour on calculation of local gradients is mitigated, this allowing access to the benefits of gradient-based analysis where lack of integrity in finite-difference derivatives calculation would otherwise have impeded such access. Construction of a proxy model, and its subsequent use in calibration of a complex model, and in analysing the uncertainties of predictions made by that model, is implemented in the PEST suite.
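
    A schematic of the division of labour described, written as a hedged sketch rather than PEST itself: a cheap proxy supplies the Jacobian used to compute a damped Gauss-Newton (Levenberg-Marquardt style) upgrade, while the expensive model is only run to evaluate residuals and to test whether the upgrade actually lowers the objective. The two "models" below are toy stand-ins.

```python
import numpy as np

def expensive_model(p):
    # Stand-in for the complex simulator (imagine long run times here).
    x = np.linspace(0, 1, 50)
    return p[0] * np.exp(-p[1] * x) + 0.05 * np.sin(30 * x * p[1])

def proxy_model(p):
    # Smooth analytical surrogate linking outputs to parameters.
    x = np.linspace(0, 1, 50)
    return p[0] * np.exp(-p[1] * x)

def proxy_jacobian(p, eps=1e-6):
    # Finite differences on the proxy only: cheap and numerically clean.
    J = np.zeros((50, p.size))
    f0 = proxy_model(p)
    for j in range(p.size):
        dp = p.copy(); dp[j] += eps
        J[:, j] = (proxy_model(dp) - f0) / eps
    return J

obs = expensive_model(np.array([2.0, 3.0]))        # synthetic "field" data
p = np.array([1.0, 1.0])
lam = 1e-2                                          # Marquardt damping
for it in range(20):
    r = obs - expensive_model(p)                    # expensive run (residuals)
    J = proxy_jacobian(p)                           # proxy runs only
    step = np.linalg.solve(J.T @ J + lam * np.eye(2), J.T @ r)
    if np.sum((obs - expensive_model(p + step)) ** 2) < np.sum(r ** 2):
        p, lam = p + step, lam * 0.5                # accept the upgrade
    else:
        lam *= 10.0                                 # reject, increase damping
print("calibrated parameters:", np.round(p, 3))
```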

  9. A stochastic modeling of recurrent measles epidemic | Kassem ...

    African Journals Online (AJOL)

    A simple stochastic mathematical model is developed and investigated for the dynamics of measles epidemic. The model, which is a multi-dimensional diffusion process, includes susceptible individuals, latent (exposed), infected and removed individuals. Stochastic effects are assumed to arise in the process of infection of ...

  10. A cavitation model based on Eulerian stochastic fields

    Science.gov (United States)

    Magagnato, F.; Dumond, J.

    2013-12-01

    Non-linear phenomena can often be described using probability density functions (pdf) and pdf transport models. Traditionally the simulation of pdf transport requires Monte-Carlo codes based on Lagrangian "particles" or prescribed pdf assumptions including binning techniques. Recently, in the field of combustion, a novel formulation called the stochastic-field method solving pdf transport based on Eulerian fields has been proposed which eliminates the necessity to mix Eulerian and Lagrangian techniques or prescribed pdf assumptions. In the present work, for the first time the stochastic-field method is applied to multi-phase flow and in particular to cavitating flow. To validate the proposed stochastic-field cavitation model, two applications are considered. Firstly, sheet cavitation is simulated in a Venturi-type nozzle. The second application is an innovative fluidic diode which exhibits coolant flashing. Agreement with experimental results is obtained for both applications with a fixed set of model constants. The stochastic-field cavitation model captures the wide range of pdf shapes present at different locations.

  11. Stochastic quantization for the axial model

    International Nuclear Information System (INIS)

    Farina, C.; Montani, H.; Albuquerque, L.C.

    1991-01-01

    We use bosonization ideas to solve the axial model in the stochastic quantization framework. We obtain the fermion propagator of the theory decoupling directly the Langevin equation, instead of the Fokker-Planck equation. In the Appendix we calculate explicitly the anomalous divergence of the axial-vector current by using a regularization that does not break the Markovian character of the stochastic process

  12. Stochastic model simulation using Kronecker product analysis and Zassenhaus formula approximation.

    Science.gov (United States)

    Caglar, Mehmet Umut; Pal, Ranadip

    2013-01-01

    Probabilistic Models are regularly applied in Genetic Regulatory Network modeling to capture the stochastic behavior observed in the generation of biological entities such as mRNA or proteins. Several approaches including Stochastic Master Equations and Probabilistic Boolean Networks have been proposed to model the stochastic behavior in genetic regulatory networks. It is generally accepted that Stochastic Master Equation is a fundamental model that can describe the system being investigated in fine detail, but the application of this model is computationally enormously expensive. On the other hand, Probabilistic Boolean Network captures only the coarse-scale stochastic properties of the system without modeling the detailed interactions. We propose a new approximation of the stochastic master equation model that is able to capture the finer details of the modeled system including bistabilities and oscillatory behavior, and yet has a significantly lower computational complexity. In this new method, we represent the system using tensors and derive an identity to exploit the sparse connectivity of regulatory targets for complexity reduction. The algorithm involves an approximation based on Zassenhaus formula to represent the exponential of a sum of matrices as product of matrices. We derive upper bounds on the expected error of the proposed model distribution as compared to the stochastic master equation model distribution. Simulation results of the application of the model to four different biological benchmark systems illustrate performance comparable to detailed stochastic master equation models but with considerably lower computational complexity. The results also demonstrate the reduced complexity of the new approach as compared to commonly used Stochastic Simulation Algorithm for equivalent accuracy.
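
    The Zassenhaus formula underpinning this kind of approximation factorises the exponential of a matrix sum into a product of exponentials; the truncated product and its leading commutator correction can be checked numerically, as in the sketch below (random small matrices, purely illustrative).

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(5)
A = 0.1 * rng.normal(size=(4, 4))
B = 0.1 * rng.normal(size=(4, 4))

exact = expm(A + B)
first_order = expm(A) @ expm(B)                          # Zassenhaus, truncated
with_c2 = first_order @ expm(-0.5 * (A @ B - B @ A))     # include the [A,B]/2 term

print("error, exp(A)exp(B):              ", np.linalg.norm(exact - first_order))
print("error, with commutator correction:", np.linalg.norm(exact - with_c2))
```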

  13. Analysis of stochastic effects in Kaldor-type business cycle discrete model

    Science.gov (United States)

    Bashkirtseva, Irina; Ryashko, Lev; Sysolyatina, Anna

    2016-07-01

    We study nonlinear stochastic phenomena in the discrete Kaldor model of business cycles. A numerical parametric analysis of stochastically forced attractors (equilibria, closed invariant curves, discrete cycles) of this model is performed using the stochastic sensitivity functions technique. A spatial arrangement of random states in stochastic attractors is modeled by confidence domains. The phenomenon of noise-induced "chaos-order" transitions is discussed.

  14. Stochastic Spectral Descent for Discrete Graphical Models

    International Nuclear Information System (INIS)

    Carlson, David; Hsieh, Ya-Ping; Collins, Edo; Carin, Lawrence; Cevher, Volkan

    2015-01-01

    Interest in deep probabilistic graphical models has increased in recent years, due to their state-of-the-art performance on many machine learning applications. Such models are typically trained with the stochastic gradient method, which can take a significant number of iterations to converge. Since the computational cost of gradient estimation is prohibitive even for modestly sized models, training becomes slow and practically usable models are kept small. In this paper we propose a new, largely tuning-free algorithm to address this problem. Our approach derives novel majorization bounds based on the Schatten-∞ norm. Intriguingly, the minimizers of these bounds can be interpreted as gradient methods in a non-Euclidean space. We thus propose using a stochastic gradient method in non-Euclidean space. We both provide simple conditions under which our algorithm is guaranteed to converge, and demonstrate empirically that our algorithm leads to dramatically faster training and improved predictive ability compared to stochastic gradient descent for both directed and undirected graphical models.

  15. Stochastic Modeling and Generation of Partially Polarized or Partially Coherent Electromagnetic Waves

    Science.gov (United States)

    Davis, Brynmor; Kim, Edward; Piepmeier, Jeffrey; Hildebrand, Peter H. (Technical Monitor)

    2001-01-01

    Many new Earth remote-sensing instruments are embracing both the advantages and added complexity that result from interferometric or fully polarimetric operation. To increase instrument understanding and functionality a model of the signals these instruments measure is presented. A stochastic model is used as it recognizes the non-deterministic nature of any real-world measurements while also providing a tractable mathematical framework. A stationary, Gaussian-distributed model structure is proposed. Temporal and spectral correlation measures provide a statistical description of the physical properties of coherence and polarization-state. From this relationship the model is mathematically defined. The model is shown to be unique for any set of physical parameters. A method of realizing the model (necessary for applications such as synthetic calibration-signal generation) is given and computer simulation results are presented. The signals are constructed using the output of a multi-input multi-output linear filter system, driven with white noise.
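
    A minimal sketch of the construction described: two partially correlated, complex Gaussian (partially polarized/partially coherent) channels are produced by mixing independent white-noise streams through a linear filter, here reduced to a single mixing matrix given by the Cholesky factor of a target coherency matrix. The target correlation value is an arbitrary assumption.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200_000
rho = 0.6 * np.exp(1j * np.pi / 4)     # target complex correlation between channels

# Two independent circular complex white-noise streams (unit variance each).
w = (rng.normal(size=(2, n)) + 1j * rng.normal(size=(2, n))) / np.sqrt(2)

# Mixing matrix = Cholesky factor of the desired coherency matrix.
coherency = np.array([[1.0, rho], [np.conj(rho), 1.0]])
L = np.linalg.cholesky(coherency)
v, h = L @ w                           # partially correlated channel pair

est = np.mean(v * np.conj(h)) / np.sqrt(np.mean(np.abs(v) ** 2) * np.mean(np.abs(h) ** 2))
print("target correlation:", rho, " estimated:", np.round(est, 3))
```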

  16. From complex to simple: interdisciplinary stochastic models

    International Nuclear Information System (INIS)

    Mazilu, D A; Zamora, G; Mazilu, I

    2012-01-01

    We present two simple, one-dimensional, stochastic models that lead to a qualitative understanding of very complex systems from biology, nanoscience and social sciences. The first model explains the complicated dynamics of microtubules, stochastic cellular highways. Using the theory of random walks in one dimension, we find analytical expressions for certain physical quantities, such as the time dependence of the length of the microtubules, and diffusion coefficients. The second one is a stochastic adsorption model with applications in surface deposition, epidemics and voter systems. We introduce the ‘empty interval method’ and show sample calculations for the time-dependent particle density. These models can serve as an introduction to the field of non-equilibrium statistical physics, and can also be used as a pedagogical tool to exemplify standard statistical physics concepts, such as random walks or the kinetic approach of the master equation. (paper)

  17. Stochastic fractional differential equations: Modeling, method and analysis

    International Nuclear Information System (INIS)

    Pedjeu, Jean-C.; Ladde, Gangaram S.

    2012-01-01

    By introducing a concept of dynamic process operating under multi-time scales in sciences and engineering, a mathematical model described by a system of multi-time scale stochastic differential equations is formulated. The classical Picard–Lindelöf successive approximations scheme is applied to the model validation problem, namely, existence and uniqueness of solution process. Naturally, this leads to the problem of finding closed form solutions of both linear and nonlinear multi-time scale stochastic differential equations of Itô–Doob type. Finally, to illustrate the scope of ideas and presented results, multi-time scale stochastic models for ecological and epidemiological processes in population dynamic are outlined.

  18. Tsunamis: stochastic models of occurrence and generation mechanisms

    Science.gov (United States)

    Geist, Eric L.; Oglesby, David D.

    2014-01-01

    The devastating consequences of the 2004 Indian Ocean and 2011 Japan tsunamis have led to increased research into many different aspects of the tsunami phenomenon. In this entry, we review research related to the observed complexity and uncertainty associated with tsunami generation, propagation, and occurrence described and analyzed using a variety of stochastic methods. In each case, seismogenic tsunamis are primarily considered. Stochastic models are developed from the physical theories that govern tsunami evolution combined with empirical models fitted to seismic and tsunami observations, as well as tsunami catalogs. These stochastic methods are key to providing probabilistic forecasts and hazard assessments for tsunamis. The stochastic methods described here are similar to those described for earthquakes (Vere-Jones 2013) and volcanoes (Bebbington 2013) in this encyclopedia.

  19. Addressing model error through atmospheric stochastic physical parametrizations: impact on the coupled ECMWF seasonal forecasting system

    Science.gov (United States)

    Weisheimer, Antje; Corti, Susanna; Palmer, Tim; Vitart, Frederic

    2014-01-01

    The finite resolution of general circulation models of the coupled atmosphere–ocean system and the effects of sub-grid-scale variability present a major source of uncertainty in model simulations on all time scales. The European Centre for Medium-Range Weather Forecasts has been at the forefront of developing new approaches to account for these uncertainties. In particular, the stochastically perturbed physical tendency scheme and the stochastically perturbed backscatter algorithm for the atmosphere are now used routinely for global numerical weather prediction. The European Centre also performs long-range predictions of the coupled atmosphere–ocean climate system in operational forecast mode, and the latest seasonal forecasting system—System 4—has the stochastically perturbed tendency and backscatter schemes implemented in a similar way to that for the medium-range weather forecasts. Here, we present results of the impact of these schemes in System 4 by contrasting the operational performance on seasonal time scales during the retrospective forecast period 1981–2010 with comparable simulations that do not account for the representation of model uncertainty. We find that the stochastic tendency perturbation schemes helped to reduce excessively strong convective activity especially over the Maritime Continent and the tropical Western Pacific, leading to reduced biases of the outgoing longwave radiation (OLR), cloud cover, precipitation and near-surface winds. Positive impact was also found for the statistics of the Madden–Julian oscillation (MJO), showing an increase in the frequencies and amplitudes of MJO events. Further, the errors of El Niño southern oscillation forecasts become smaller, whereas increases in ensemble spread lead to a better calibrated system if the stochastic tendency is activated. The backscatter scheme has overall neutral impact. Finally, evidence for noise-activated regime transitions has been found in a cluster analysis of mid

  20. ARIMA-Based Time Series Model of Stochastic Wind Power Generation

    DEFF Research Database (Denmark)

    Chen, Peiyuan; Pedersen, Troels; Bak-Jensen, Birgitte

    2010-01-01

    This paper proposes a stochastic wind power model based on an autoregressive integrated moving average (ARIMA) process. The model takes into account the nonstationarity and physical limits of stochastic wind power generation. The model is constructed based on wind power measurement of one year from...... the Nysted offshore wind farm in Denmark. The proposed limited-ARIMA (LARIMA) model introduces a limiter and characterizes the stochastic wind power generation by mean level, temporal correlation and driving noise. The model is validated against the measurement in terms of temporal correlation...... and probability distribution. The LARIMA model outperforms a first-order transition matrix based discrete Markov model in terms of temporal correlation, probability distribution and model parameter number. The proposed LARIMA model is further extended to include the monthly variation of the stochastic wind power...
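
    A toy version of the idea, not the fitted LARIMA model itself: an ARMA process drives the fluctuations around a mean production level, and a limiter clips the output to the physical range between zero and rated power. All coefficients, the rated capacity and the mean level are arbitrary illustrative values.

```python
import numpy as np

rng = np.random.default_rng(8)

rated = 165.0                         # placeholder rated capacity
mean_level = 80.0                     # placeholder mean production
phi, theta, sigma = 0.98, -0.5, 15.0  # arbitrary ARMA(1,1) coefficients
n = 24 * 365                          # hourly values for one year

x = np.zeros(n)                       # unconstrained ARMA deviations from the mean
power = np.zeros(n)
power[0] = mean_level
e_prev = 0.0
for t in range(1, n):
    e = sigma * rng.normal()
    x[t] = phi * x[t - 1] + e + theta * e_prev
    e_prev = e
    power[t] = np.clip(mean_level + x[t], 0.0, rated)   # the "limiter"

print("fraction of hours at zero: ", np.mean(power == 0.0).round(3))
print("fraction of hours at rated:", np.mean(power == rated).round(3))
print("lag-1 autocorrelation:     ", np.round(np.corrcoef(power[1:], power[:-1])[0, 1], 3))
```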

  1. Modeling stochasticity in biochemical reaction networks

    International Nuclear Information System (INIS)

    Constantino, P H; Vlysidis, M; Smadbeck, P; Kaznessis, Y N

    2016-01-01

    Small biomolecular systems are inherently stochastic. Indeed, fluctuations of molecular species are substantial in living organisms and may result in significant variation in cellular phenotypes. The chemical master equation (CME) is the most detailed mathematical model that can describe stochastic behaviors. However, because of its complexity the CME has been solved for only a few, very small reaction networks. As a result, the contribution of CME-based approaches to biology has been very limited. In this review we discuss the approach of solving the CME by a set of differential equations of probability moments, called moment equations. We present different approaches to produce and to solve these equations, emphasizing the use of factorial moments and the zero information entropy closure scheme. We also provide information on the stability analysis of stochastic systems. Finally, we speculate on the utility of CME-based modeling formalisms, especially in the context of synthetic biology efforts. (topical review)
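
    For reference, the chemical master equation referred to here has the standard form below, written in the usual notation as a sketch; moment equations then follow by multiplying by powers of the copy numbers and summing over states.

```latex
% Chemical master equation for a network of R reactions, with state vector x
% (molecular copy numbers), propensities a_j and stoichiometric change
% vectors \nu_j:
\frac{\partial P(\mathbf{x},t)}{\partial t}
  = \sum_{j=1}^{R}\Big[ a_j(\mathbf{x}-\boldsymbol{\nu}_j)\,P(\mathbf{x}-\boldsymbol{\nu}_j,t)
  - a_j(\mathbf{x})\,P(\mathbf{x},t) \Big]
```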

  2. Distributed parallel computing in stochastic modeling of groundwater systems.

    Science.gov (United States)

    Dong, Yanhui; Li, Guomin; Xu, Haizhen

    2013-03-01

    Stochastic modeling is a rapidly evolving, popular approach to the study of the uncertainty and heterogeneity of groundwater systems. However, the use of Monte Carlo-type simulations to solve practical groundwater problems often encounters computational bottlenecks that hinder the acquisition of meaningful results. To improve the computational efficiency, a system that combines stochastic model generation with MODFLOW-related programs and distributed parallel processing is investigated. The distributed computing framework, called the Java Parallel Processing Framework, is integrated into the system to allow the batch processing of stochastic models in distributed and parallel systems. As an example, the system is applied to the stochastic delineation of well capture zones in the Pinggu Basin in Beijing. Through the use of 50 processing threads on a cluster with 10 multicore nodes, the execution times of 500 realizations are reduced to 3% compared with those of a serial execution. Through this application, the system demonstrates its potential in solving difficult computational problems in practical stochastic modeling. © 2012, The Author(s). Groundwater © 2012, National Ground Water Association.
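
    A generic sketch of batch-processing stochastic realizations in parallel. Python's multiprocessing stands in for the Java Parallel Processing Framework and the toy function stands in for a MODFLOW run; neither is the system described in the paper.

```python
import numpy as np
from multiprocessing import Pool

def run_realization(seed):
    """Stand-in for one stochastic model run (e.g. one random conductivity
    field fed to a groundwater model); here just a toy computation."""
    rng = np.random.default_rng(seed)
    field = rng.lognormal(mean=0.0, sigma=1.0, size=(100, 100))
    return field.mean()              # placeholder output statistic

if __name__ == "__main__":
    seeds = range(500)               # 500 realizations, as in the application
    with Pool(processes=8) as pool:  # worker count is an arbitrary choice
        results = pool.map(run_realization, seeds)
    print("ensemble mean of outputs:", float(np.mean(results)))
```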

  3. Stochastic lattice model of synaptic membrane protein domains.

    Science.gov (United States)

    Li, Yiwei; Kahraman, Osman; Haselwandter, Christoph A

    2017-05-01

    Neurotransmitter receptor molecules, concentrated in synaptic membrane domains along with scaffolds and other kinds of proteins, are crucial for signal transmission across chemical synapses. In common with other membrane protein domains, synaptic domains are characterized by low protein copy numbers and protein crowding, with rapid stochastic turnover of individual molecules. We study here in detail a stochastic lattice model of the receptor-scaffold reaction-diffusion dynamics at synaptic domains that was found previously to capture, at the mean-field level, the self-assembly, stability, and characteristic size of synaptic domains observed in experiments. We show that our stochastic lattice model yields quantitative agreement with mean-field models of nonlinear diffusion in crowded membranes. Through a combination of analytic and numerical solutions of the master equation governing the reaction dynamics at synaptic domains, together with kinetic Monte Carlo simulations, we find substantial discrepancies between mean-field and stochastic models for the reaction dynamics at synaptic domains. Based on the reaction and diffusion properties of synaptic receptors and scaffolds suggested by previous experiments and mean-field calculations, we show that the stochastic reaction-diffusion dynamics of synaptic receptors and scaffolds provide a simple physical mechanism for collective fluctuations in synaptic domains, the molecular turnover observed at synaptic domains, key features of the observed single-molecule trajectories, and spatial heterogeneity in the effective rates at which receptors and scaffolds are recycled at the cell membrane. Our work sheds light on the physical mechanisms and principles linking the collective properties of membrane protein domains to the stochastic dynamics that rule their molecular components.

  4. STOCHASTIC CHARACTERISTICS AND MODELING OF RELATIVE ...

    African Journals Online (AJOL)

    Test

    Results are highly accurate and promising for all models based on Lewis' criteria. ... hydrological cycle. Future increases in ...

  5. A probabilistic graphical model based stochastic input model construction

    International Nuclear Information System (INIS)

    Wan, Jiang; Zabaras, Nicholas

    2014-01-01

    Model reduction techniques have been widely used in modeling of high-dimensional stochastic input in uncertainty quantification tasks. However, the probabilistic modeling of random variables projected into reduced-order spaces presents a number of computational challenges. Due to the curse of dimensionality, the underlying dependence relationships between these random variables are difficult to capture. In this work, a probabilistic graphical model based approach is employed to learn the dependence by running a number of conditional independence tests using observation data. Thus a probabilistic model of the joint PDF is obtained and the PDF is factorized into a set of conditional distributions based on the dependence structure of the variables. The estimation of the joint PDF from data is then transformed to estimating conditional distributions under reduced dimensions. To improve the computational efficiency, a polynomial chaos expansion is further applied to represent the random field in terms of a set of standard random variables. This technique is combined with both linear and nonlinear model reduction methods. Numerical examples are presented to demonstrate the accuracy and efficiency of the probabilistic graphical model based stochastic input models. - Highlights: • Data-driven stochastic input models without the assumption of independence of the reduced random variables. • The problem is transformed to a Bayesian network structure learning problem. • Examples are given in flows in random media

  6. Deterministic and stochastic CTMC models from Zika disease transmission

    Science.gov (United States)

    Zevika, Mona; Soewono, Edy

    2018-03-01

    Zika infection is one of the most important mosquito-borne diseases in the world. Zika virus (ZIKV) is transmitted by many Aedes-type mosquitoes including Aedes aegypti. Pregnant women with the Zika virus are at risk of having a fetus or infant with a congenital defect and suffering from microcephaly. Here, we formulate a Zika disease transmission model using two approaches, a deterministic model and a continuous-time Markov chain stochastic model. The basic reproduction ratio is constructed from a deterministic model. Meanwhile, the CTMC stochastic model yields an estimate of the probability of extinction and outbreaks of Zika disease. Dynamical simulations and analysis of the disease transmission are shown for the deterministic and stochastic models.

  7. Weather Derivatives and Stochastic Modelling of Temperature

    Directory of Open Access Journals (Sweden)

    Fred Espen Benth

    2011-01-01

    Full Text Available We propose a continuous-time autoregressive model for the temperature dynamics with volatility being the product of a seasonal function and a stochastic process. We use the Barndorff-Nielsen and Shephard model for the stochastic volatility. The proposed temperature dynamics is flexible enough to model temperature data accurately while remaining analytically tractable. Futures prices for commonly traded contracts at the Chicago Mercantile Exchange on indices like cooling- and heating-degree days and cumulative average temperatures are computed, as well as option prices on them.

  8. Stochastic Modeling Of Wind Turbine Drivetrain Components

    DEFF Research Database (Denmark)

    Rafsanjani, Hesam Mirzaei; Sørensen, John Dalsgaard

    2014-01-01

    reliable components are needed for wind turbines. In this paper, focus is on the reliability of critical drivetrain components such as bearings and shafts. High failure rates of these components imply a need for more reliable components. To estimate the reliability of these components, stochastic models... are needed for initial defects and damage accumulation. In this paper, stochastic models are formulated considering some of the failure modes observed in these components. The models are based on theoretical considerations, manufacturing uncertainties, and size effects at different scales. It is illustrated how...

  9. Applied stochastic modelling

    CERN Document Server

    Morgan, Byron JT; Tanner, Martin Abba; Carlin, Bradley P

    2008-01-01

    Introduction and Examples Introduction Examples of data sets Basic Model Fitting Introduction Maximum-likelihood estimation for a geometric model Maximum-likelihood for the beta-geometric model Modelling polyspermy Which model? What is a model for? Mechanistic models Function Optimisation Introduction MATLAB: graphs and finite differences Deterministic search methods Stochastic search methods Accuracy and a hybrid approach Basic Likelihood ToolsIntroduction Estimating standard errors and correlations Looking at surfaces: profile log-likelihoods Confidence regions from profiles Hypothesis testing in model selectionScore and Wald tests Classical goodness of fit Model selection biasGeneral Principles Introduction Parameterisation Parameter redundancy Boundary estimates Regression and influence The EM algorithm Alternative methods of model fitting Non-regular problemsSimulation Techniques Introduction Simulating random variables Integral estimation Verification Monte Carlo inference Estimating sampling distributi...

  10. On changes of measure in stochastic volatility models

    Directory of Open Access Journals (Sweden)

    Bernard Wong

    2006-01-01

    models. This had led many researchers to “assume the condition away,” even though the condition is not innocuous, and nonsensical results can occur if it is in fact not satisfied. We provide an applicable theorem to check the conditions for a general class of Markovian stochastic volatility models. As an example we will also provide a detailed analysis of the Stein and Stein and Heston stochastic volatility models.

  11. Another look at volume self-calibration: calibration and self-calibration within a pinhole model of Scheimpflug cameras

    International Nuclear Information System (INIS)

    Cornic, Philippe; Le Besnerais, Guy; Champagnat, Frédéric; Illoul, Cédric; Cheminet, Adam; Le Sant, Yves; Leclaire, Benjamin

    2016-01-01

    We address calibration and self-calibration of tomographic PIV experiments within a pinhole model of cameras. A complete and explicit pinhole model of a camera equipped with a 2-tilt angles Scheimpflug adapter is presented. It is then used in a calibration procedure based on a freely moving calibration plate. While the resulting calibrations are accurate enough for Tomo-PIV, we confirm, through a simple experiment, that they are not stable in time, and illustrate how the pinhole framework can be used to provide a quantitative evaluation of geometrical drifts in the setup. We propose an original self-calibration method based on global optimization of the extrinsic parameters of the pinhole model. These methods are successfully applied to the tomographic PIV of an air jet experiment. An unexpected by-product of our work is to show that volume self-calibration induces a change in the world frame coordinates. Provided the calibration drift is small, as generally observed in PIV, the bias on the estimated velocity field is negligible but the absolute location cannot be accurately recovered using standard calibration data. (paper)

  12. Gompertzian stochastic model with delay effect to cervical cancer growth

    International Nuclear Information System (INIS)

    Mazlan, Mazma Syahidatul Ayuni binti; Rosli, Norhayati binti; Bahar, Arifah

    2015-01-01

    In this paper, a Gompertzian stochastic model with time delay is introduced to describe cervical cancer growth. The parameter values of the mathematical model are estimated via the Levenberg-Marquardt optimization method of non-linear least squares. We apply the Milstein scheme for solving the stochastic model numerically. The efficiency of the mathematical model is measured by comparing the simulated result and the clinical data of cervical cancer growth. Low values of the Mean-Square Error (MSE) of the Gompertzian stochastic model with delay effect indicate good fits
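
    A sketch of a Milstein discretisation for a Gompertzian stochastic differential equation with a delayed growth term of the generic form dX = X(a - b ln X(t - τ)) dt + σ X dW. The equation form and all parameter values are illustrative assumptions, not the fitted model from the paper.

```python
import numpy as np

rng = np.random.default_rng(4)

a, b, sigma = 0.8, 0.2, 0.1      # illustrative growth, decay and noise parameters
tau, dt, T = 1.0, 0.01, 30.0     # delay, step size, horizon
n = int(T / dt)
lag = int(tau / dt)

x = np.empty(n + 1)
x[:lag + 1] = 0.5                # constant history on the initial delay interval
for k in range(lag, n):
    xk = x[k]
    x_del = x[k - lag]           # delayed state X(t - tau)
    dW = np.sqrt(dt) * rng.normal()
    drift = xk * (a - b * np.log(x_del))
    diff = sigma * xk
    # Milstein scheme: Euler terms plus the 0.5*g*g'*(dW^2 - dt) correction,
    # with g(x) = sigma*x so g(x)*g'(x) = sigma^2 * x.
    x[k + 1] = xk + drift * dt + diff * dW + 0.5 * sigma ** 2 * xk * (dW ** 2 - dt)
    x[k + 1] = max(x[k + 1], 1e-8)   # keep the state positive for the logarithm

print("terminal value:", round(x[-1], 3),
      " deterministic fixed point exp(a/b):", round(np.exp(a / b), 3))
```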

  13. Gompertzian stochastic model with delay effect to cervical cancer growth

    Energy Technology Data Exchange (ETDEWEB)

    Mazlan, Mazma Syahidatul Ayuni binti; Rosli, Norhayati binti [Faculty of Industrial Sciences and Technology, Universiti Malaysia Pahang, Lebuhraya Tun Razak, 26300 Gambang, Pahang (Malaysia); Bahar, Arifah [Department of Mathematical Sciences, Faculty of Science, Universiti Teknologi Malaysia, 81310 Johor Bahru, Johor and UTM Centre for Industrial and Applied Mathematics (UTM-CIAM), Universiti Teknologi Malaysia, 81310 Johor Bahru, Johor (Malaysia)

    2015-02-03

    In this paper, a Gompertzian stochastic model with time delay is introduced to describe cervical cancer growth. The parameter values of the mathematical model are estimated via the Levenberg-Marquardt optimization method of non-linear least squares. We apply the Milstein scheme for solving the stochastic model numerically. The efficiency of the mathematical model is measured by comparing the simulated result and the clinical data of cervical cancer growth. Low values of the Mean-Square Error (MSE) of the Gompertzian stochastic model with delay effect indicate good fits.

  14. Stochastic Modelling, Analysis, and Simulations of the Solar Cycle Dynamic Process

    Science.gov (United States)

    Turner, Douglas C.; Ladde, Gangaram S.

    2018-03-01

    Analytical solutions, discretization schemes and simulation results are presented for the time delay deterministic differential equation model of the solar dynamo presented by Wilmot-Smith et al. In addition, this model is extended under stochastic Gaussian white noise parametric fluctuations. The introduction of stochastic fluctuations incorporates variables affecting the dynamo process in the solar interior, estimation error of parameters, and uncertainty of the α-effect mechanism. Simulation results are presented and analyzed to exhibit the effects of stochastic parametric volatility-dependent perturbations. The results generalize and extend the work of Hazra et al. In fact, some of these results exhibit the oscillatory dynamic behavior generated by the stochastic parametric additive perturbations in the absence of time delay. In addition, the simulation results of the modified stochastic models influence the change in behavior of the very recently developed stochastic model of Hazra et al.

  15. Stochastic models of the Social Security trust funds.

    Science.gov (United States)

    Burdick, Clark; Manchester, Joyce

    Each year in March, the Board of Trustees of the Social Security trust funds reports on the current and projected financial condition of the Social Security programs. Those programs, which pay monthly benefits to retired workers and their families, to the survivors of deceased workers, and to disabled workers and their families, are financed through the Old-Age, Survivors, and Disability Insurance (OASDI) Trust Funds. In their 2003 report, the Trustees present, for the first time, results from a stochastic model of the combined OASDI trust funds. Stochastic modeling is an important new tool for Social Security policy analysis and offers the promise of valuable new insights into the financial status of the OASDI trust funds and the effects of policy changes. The results presented in this article demonstrate that several stochastic models deliver broadly consistent results even though they use very different approaches and assumptions. However, they also show that the variation in trust fund outcomes differs as the approach and assumptions are varied. Which approach and assumptions are best suited for Social Security policy analysis remains an open question. Further research is needed before the promise of stochastic modeling is fully realized. For example, neither parameter uncertainty nor variability in ultimate assumption values is recognized explicitly in the analyses. Despite this caveat, stochastic modeling results are already shedding new light on the range and distribution of trust fund outcomes that might occur in the future.

  16. Biochemical Network Stochastic Simulator (BioNetS): software for stochastic modeling of biochemical networks

    Directory of Open Access Journals (Sweden)

    Elston Timothy C

    2004-03-01

    Full Text Available Abstract Background Intrinsic fluctuations due to the stochastic nature of biochemical reactions can have large effects on the response of biochemical networks. This is particularly true for pathways that involve transcriptional regulation, where generally there are two copies of each gene and the number of messenger RNA (mRNA) molecules can be small. Therefore, there is a need for computational tools for developing and investigating stochastic models of biochemical networks. Results We have developed the software package Biochemical Network Stochastic Simulator (BioNetS) for efficiently and accurately simulating stochastic models of biochemical networks. BioNetS has a graphical user interface that allows models to be entered in a straightforward manner, and allows the user to specify the type of random variable (discrete or continuous) for each chemical species in the network. The discrete variables are simulated using an efficient implementation of the Gillespie algorithm. For the continuous random variables, BioNetS constructs and numerically solves the appropriate chemical Langevin equations. The software package has been developed to scale efficiently with network size, thereby allowing large systems to be studied. BioNetS runs as a BioSpice agent and can be downloaded from http://www.biospice.org. BioNetS also can be run as a stand-alone package. All the required files are accessible from http://x.amath.unc.edu/BioNetS. Conclusions We have developed BioNetS to be a reliable tool for studying the stochastic dynamics of large biochemical networks. Important features of BioNetS are its ability to handle hybrid models that consist of both continuous and discrete random variables and its ability to model cell growth and division. We have verified the accuracy and efficiency of the numerical methods by considering several test systems.
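
    For readers unfamiliar with the discrete-variable part of such simulators, the sketch below implements the Gillespie direct method for a one-species birth-death (mRNA) model. It is a generic illustration of the algorithm, not code from BioNetS, and the rates are arbitrary.

```python
import numpy as np

# Gillespie direct-method sketch for a one-species birth-death model of mRNA:
# production at constant rate k_tx, degradation at rate gamma per molecule.
# This is a generic illustration of the algorithm used for the discrete
# species, not code from BioNetS; the rates are arbitrary.
k_tx, gamma = 2.0, 0.1
t, t_end, m = 0.0, 500.0, 0
rng = np.random.default_rng(42)
samples = []

while t < t_end:
    a_prod, a_deg = k_tx, gamma * m      # reaction propensities
    a_total = a_prod + a_deg
    t += rng.exponential(1.0 / a_total)  # waiting time to the next reaction
    if rng.random() < a_prod / a_total:  # pick which reaction fires
        m += 1
    else:
        m -= 1
    samples.append(m)

print("mean copy number (theory k_tx/gamma = 20):", round(np.mean(samples), 2))
```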

  17. Electricity Market Stochastic Dynamic Model and Its Mean Stability Analysis

    Directory of Open Access Journals (Sweden)

    Zhanhui Lu

    2014-01-01

    Full Text Available Based on the deterministic dynamic model of the electricity market proposed by Alvarado, a stochastic electricity market model, considering the random nature of the demand side, is presented in this paper on the assumption that the generator cost function and the consumer utility function are quadratic functions. The stochastic electricity market model is a generalization of the deterministic dynamic model. Using the theory of stochastic differential equations, stochastic process theory, and eigenvalue techniques, conditions determining the mean stability of this electricity market model under small Gauss-type random excitation are provided and verified theoretically. That is, if the demand elasticity of suppliers is nonnegative and the demand elasticity of consumers is negative, then the stochastic electricity market model is mean stable. This implies that stability can be judged directly from the initial data without any computation. Numerical examples that combine deterministic electricity market data with small Gauss-type random excitation are used to interpret the random phenomena from a statistical perspective; the results indicate that the conclusions above are correct, valid, and practical.

  18. Aspects of stochastic models for short-term hydropower scheduling and bidding

    Energy Technology Data Exchange (ETDEWEB)

    Belsnes, Michael Martin [Sintef Energy, Trondheim (Norway); Follestad, Turid [Sintef Energy, Trondheim (Norway); Wolfgang, Ove [Sintef Energy, Trondheim (Norway); Fosso, Olav B. [Dep. of electric power engineering NTNU, Trondheim (Norway)

    2012-07-01

    This report discusses challenges met when turning from deterministic to stochastic decision support models for short-term hydropower scheduling and bidding. The report describes characteristics of the short-term scheduling and bidding problem, different market and bidding strategies, and how a stochastic optimization model can be formulated. A review of approaches for stochastic short-term modelling and stochastic modelling for the input variables inflow and market prices is given. The report discusses methods for approximating the predictive distribution of uncertain variables by scenario trees. Benefits of using a stochastic over a deterministic model are illustrated by a case study, where increased profit is obtained to a varying degree depending on the reservoir filling and price structure. Finally, an approach for assessing the effect of using a size-restricted scenario tree to approximate the predictive distribution for stochastic input variables is described. The report is a summary of the findings of Work package 1 of the research project "Optimal short-term scheduling of wind and hydro resources". The project aims at developing a prototype for an operational stochastic short-term scheduling model. Based on the investigations summarized in the report, it is concluded that using a deterministic equivalent formulation of the stochastic optimization problem is convenient and sufficient for obtaining a working prototype. (author)

  19. Characterizing economic trends by Bayesian stochastic model specification search

    OpenAIRE

    Grassi, Stefano; Proietti, Tommaso

    2010-01-01

    We apply a recently proposed Bayesian model selection technique, known as stochastic model specification search, for characterising the nature of the trend in macroeconomic time series. We illustrate that the methodology can be quite successfully applied to discriminate between stochastic and deterministic trends. In particular, we formulate autoregressive models with stochastic trends components and decide on whether a specific feature of the series, i.e. the underlying level and/or the rate...

  20. Index Option Pricing Models with Stochastic Volatility and Stochastic Interest Rates

    NARCIS (Netherlands)

    Jiang, G.J.; van der Sluis, P.J.

    2000-01-01

    This paper specifies a multivariate stochastic volatility (SV) model for the S&P500 index and spot interest rate processes. We first estimate the multivariate SV model via the efficient method of moments (EMM) technique based on observations of underlying state variables, and then investigate the

  1. Recent topographic evolution and erosion of the deglaciated Washington Cascades inferred from a stochastic landscape evolution model

    Science.gov (United States)

    Moon, Seulgi; Shelef, Eitan; Hilley, George E.

    2015-05-01

    In this study, we model postglacial surface processes and examine the evolution of the topography and denudation rates within the deglaciated Washington Cascades to understand the controls on and time scales of landscape response to changes in the surface process regime after deglaciation. The postglacial adjustment of this landscape is modeled using a geomorphic-transport-law-based numerical model that includes processes of river incision, hillslope diffusion, and stochastic landslides. The surface lowering due to landslides is parameterized using a physically based slope stability model coupled to a stochastic model of the generation of landslides. The model parameters of river incision and stochastic landslides are calibrated based on the rates and distribution of thousand-year time scale denudation rates measured from cosmogenic 10Be isotopes. The probability distributions of these model parameters, calculated using a Bayesian inversion scheme, show ranges comparable to those reported in previous studies for similar rock types and climatic conditions. The magnitude of landslide denudation rates is determined by failure density (similar to landslide frequency), whereas precipitation and slopes affect the spatial variation in landslide denudation rates. Simulation results show that postglacial denudation rates decay over time and take longer than 100 kyr to reach time-invariant rates. Over time, the landslides in the model consume the steep slopes characteristic of deglaciated landscapes. This response time scale is on the order of or longer than glacial/interglacial cycles, suggesting that frequent climatic perturbations during the Quaternary may produce a significant and prolonged impact on denudation and topography.

  2. Dynamic optimization deterministic and stochastic models

    CERN Document Server

    Hinderer, Karl; Stieglitz, Michael

    2016-01-01

    This book explores discrete-time dynamic optimization and provides a detailed introduction to both deterministic and stochastic models. Covering problems with finite and infinite horizon, as well as Markov renewal programs, Bayesian control models and partially observable processes, the book focuses on the precise modelling of applications in a variety of areas, including operations research, computer science, mathematics, statistics, engineering, economics and finance. Dynamic Optimization is a carefully presented textbook which starts with discrete-time deterministic dynamic optimization problems, providing readers with the tools for sequential decision-making, before proceeding to the more complicated stochastic models. The authors present complete and simple proofs and illustrate the main results with numerous examples and exercises (without solutions). With relevant material covered in four appendices, this book is completely self-contained.

  3. Stochastic models for atmospheric dispersion

    DEFF Research Database (Denmark)

    Ditlevsen, Ove Dalager

    2003-01-01

    Simple stochastic differential equation models have been applied by several researchers to describe the dispersion of tracer particles in the planetary atmospheric boundary layer and to form the basis for computer simulations of particle paths. To obtain the drift coefficient, empirical vertical...... positions close to the boundaries. Different rules have been suggested in the literature with justifications based on simulation studies. Herein the relevant stochastic differential equation model is formulated in a particular way. The formulation is based on the marginal transformation of the position...... velocity distributions that depend on height above the ground both with respect to standard deviation and skewness are substituted into the stationary Fokker-Planck equation. The particle position distribution is taken to be uniform (the well-mixed condition) and also a given dispersion coefficient

  4. Hopf bifurcation of the stochastic model on business cycle

    International Nuclear Information System (INIS)

    Xu, J; Wang, H; Ge, G

    2008-01-01

    A stochastic model of the business cycle is presented in this paper. Simplifying the model through quasi-Hamiltonian theory, the Itô diffusion process was obtained. According to Oseledec multiplicative ergodic theory and singular boundary theory, the conditions for local and global stability were obtained. Solving the stationary FPK equation and analyzing the stationary probability density, the stochastic Hopf bifurcation was explained. The result indicated that the change of the parameter a was the key factor in the appearance of the stochastic Hopf bifurcation

  5. High-resolution stochastic generation of extreme rainfall intensity for urban drainage modelling applications

    Science.gov (United States)

    Peleg, Nadav; Blumensaat, Frank; Molnar, Peter; Fatichi, Simone; Burlando, Paolo

    2016-04-01

    Urban drainage response is highly dependent on the spatial and temporal structure of rainfall. Therefore, measuring and simulating rainfall at a high spatial and temporal resolution is a fundamental step to fully assess urban drainage system reliability and related uncertainties. This is even more relevant when considering extreme rainfall events. However, the current space-time rainfall models have limitations in capturing extreme rainfall intensity statistics for short durations. Here, we use the STREAP (Space-Time Realizations of Areal Precipitation) model, which is a novel stochastic rainfall generator for simulating high-resolution rainfall fields that preserve the spatio-temporal structure of rainfall and its statistical characteristics. The model enables the generation of rain fields at 10² m and minute scales in a fast and computer-efficient way matching the requirements for hydrological analysis of urban drainage systems. The STREAP model was applied successfully in the past to generate high-resolution extreme rainfall intensities over a small domain. A sub-catchment in the city of Luzern (Switzerland) was chosen as a case study to: (i) evaluate the ability of STREAP to disaggregate extreme rainfall intensities for urban drainage applications; (ii) assess the role of stochastic climate variability of rainfall in the flow response; and (iii) evaluate the degree of non-linearity between extreme rainfall intensity and system response (i.e. flow) for a small urban catchment. The channel flow at the catchment outlet is simulated by means of a calibrated hydrodynamic sewer model.

  6. Stochastic models for predicting pitting corrosion damage of HLRW containers

    International Nuclear Information System (INIS)

    Henshall, G.A.

    1991-10-01

    Stochastic models for predicting aqueous pitting corrosion damage of high-level radioactive-waste containers are described. These models could be used to predict the time required for the first pit to penetrate a container and the increase in the number of breaches at later times, both of which would be useful in the repository system performance analysis. Monte Carlo implementations of the stochastic models are described, and predictions of induction time, survival probability and pit depth distributions are presented. These results suggest that the pit nucleation probability decreases with exposure time and that pit growth may be a stochastic process. The advantages and disadvantages of the stochastic approach, methods for modeling the effects of environment, and plans for future work are discussed
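
    A minimal Monte Carlo sketch of the kind of stochastic pitting model described above is given below: pits nucleate as a Poisson process with a rate that decays with exposure time, and each pit grows at a random rate until it penetrates the wall. The distributions and parameter values are illustrative assumptions, not those of the report.

```python
import numpy as np

# Illustrative Monte Carlo sketch of a stochastic pitting model: pits nucleate
# on each container as a Poisson process whose rate decays with exposure time,
# and every pit then grows at a random (lognormal) rate until it penetrates the
# wall. Distributions and parameter values are assumptions for illustration,
# not values from the report.
rng = np.random.default_rng(7)
wall = 50.0                  # wall thickness, mm
lam0, decay = 0.1, 0.01      # initial nucleation rate (pits/yr) and decay constant
mu_g, sig_g = -3.0, 0.8      # lognormal parameters of the pit growth rate (mm/yr)
horizon, n_containers = 1000.0, 2000

first_breach = np.full(n_containers, np.inf)
for i in range(n_containers):
    # thinning of a rate-lam0 Poisson process gives the time-decaying rate
    t = 0.0
    while True:
        t += rng.exponential(1.0 / lam0)
        if t >= horizon:
            break
        if rng.random() < np.exp(-decay * t):      # accept nucleation at time t
            growth = rng.lognormal(mu_g, sig_g)    # mm per year
            first_breach[i] = min(first_breach[i], t + wall / growth)

print("estimated survival probability at 1000 yr:",
      np.mean(first_breach > horizon))
```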

  7. Brain-inspired Stochastic Models and Implementations

    KAUST Repository

    Al-Shedivat, Maruan

    2015-05-12

    One of the approaches to building artificial intelligence (AI) is to decipher the principles of the brain function and to employ similar mechanisms for solving cognitive tasks, such as visual perception or natural language understanding, using machines. The recent breakthrough, named deep learning, demonstrated that large multi-layer networks of artificial neural-like computing units attain remarkable performance on some of these tasks. Nevertheless, such artificial networks remain very loosely inspired by the brain, whose rich structures and mechanisms may further suggest new algorithms or even new paradigms of computation. In this thesis, we explore brain-inspired probabilistic mechanisms, such as neural and synaptic stochasticity, in the context of generative models. The two questions we ask here are: (i) what kind of models can describe a neural learning system built of stochastic components? and (ii) how can we implement such systems efficiently? To give specific answers, we consider two well known models and the corresponding neural architectures: the Naive Bayes model implemented with a winner-take-all spiking neural network and the Boltzmann machine implemented in a spiking or non-spiking fashion. We propose and analyze an efficient neuromorphic implementation of the stochastic neural firing mechanism and study the effects of synaptic unreliability on learning generative energy-based models implemented with neural networks.

  8. Stochastic models to simulate paratuberculosis in dairy herds

    DEFF Research Database (Denmark)

    Nielsen, Søren Saxmose; Weber, M.F.; Kudahl, Anne Margrethe Braad

    2011-01-01

    Stochastic simulation models are widely accepted as a means of assessing the impact of changes in daily management and the control of different diseases, such as paratuberculosis, in dairy herds. This paper summarises and discusses the assumptions of four stochastic simulation models and their use...... the models are somewhat different in their underlying principles and do put slightly different values on the different strategies, their overall findings are similar. Therefore, simulation models may be useful in planning paratuberculosis strategies in dairy herds, although as with all models caution...

  9. Stochastic population oscillations in spatial predator-prey models

    International Nuclear Information System (INIS)

    Taeuber, Uwe C

    2011-01-01

    It is well-established that including spatial structure and stochastic noise in models for predator-prey interactions invalidates the classical deterministic Lotka-Volterra picture of neutral population cycles. In contrast, stochastic models yield long-lived, but ultimately decaying erratic population oscillations, which can be understood through a resonant amplification mechanism for density fluctuations. In Monte Carlo simulations of spatial stochastic predator-prey systems, one observes striking complex spatio-temporal structures. These spreading activity fronts induce persistent correlations between predators and prey. In the presence of local particle density restrictions (finite prey carrying capacity), there exists an extinction threshold for the predator population. The accompanying continuous non-equilibrium phase transition is governed by the directed-percolation universality class. We employ field-theoretic methods based on the Doi-Peliti representation of the master equation for stochastic particle interaction models to (i) map the ensuing action in the vicinity of the absorbing state phase transition to Reggeon field theory, and (ii) to quantitatively address fluctuation-induced renormalizations of the population oscillation frequency, damping, and diffusion coefficients in the species coexistence phase.
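
    As a small companion to the abstract above, the sketch below simulates the non-spatial (well-mixed) stochastic Lotka-Volterra system with the Gillespie algorithm; it reproduces the noisy, eventually decaying population cycles, though not the spatio-temporal fronts of the full lattice model. Rates and initial populations are illustrative.

```python
import numpy as np

# Sketch of the non-spatial (well-mixed) stochastic Lotka-Volterra system,
# simulated with the Gillespie algorithm. It shows the noisy, eventually
# decaying population cycles discussed above, but not the spatio-temporal
# fronts of the full spatial model. All rates are illustrative.
sigma, lam, mu = 1.0, 0.005, 1.0     # prey birth, predation, predator death rates
a, b = 100, 100                      # initial prey / predator numbers
t, t_end = 0.0, 20.0
rng = np.random.default_rng(3)

while t < t_end and a > 0 and b > 0:
    rates = np.array([sigma * a, lam * a * b, mu * b])
    total = rates.sum()
    t += rng.exponential(1.0 / total)
    event = rng.choice(3, p=rates / total)
    if event == 0:
        a += 1          # prey reproduction:  A -> 2A
    elif event == 1:
        a -= 1; b += 1  # predation:          A + B -> 2B
    else:
        b -= 1          # predator death:     B -> 0

print("time, prey, predators at stop:", round(t, 2), a, b)
```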

  10. Linking agent-based models and stochastic models of financial markets.

    Science.gov (United States)

    Feng, Ling; Li, Baowen; Podobnik, Boris; Preis, Tobias; Stanley, H Eugene

    2012-05-29

    It is well-known that financial asset returns exhibit fat-tailed distributions and long-term memory. These empirical features are the main objectives of modeling efforts using (i) stochastic processes to quantitatively reproduce these features and (ii) agent-based simulations to understand the underlying microscopic interactions. After reviewing selected empirical and theoretical evidence documenting the behavior of traders, we construct an agent-based model to quantitatively demonstrate that "fat" tails in return distributions arise when traders share similar technical trading strategies and decisions. Extending our behavioral model to a stochastic model, we derive and explain a set of quantitative scaling relations of long-term memory from the empirical behavior of individual market participants. Our analysis provides a behavioral interpretation of the long-term memory of absolute and squared price returns: They are directly linked to the way investors evaluate their investments by applying technical strategies at different investment horizons, and this quantitative relationship is in agreement with empirical findings. Our approach provides a possible behavioral explanation for stochastic models for financial systems in general and provides a method to parameterize such models from market data rather than from statistical fitting.

  11. Stochastic differential equation model to Prendiville processes

    International Nuclear Information System (INIS)

    Granita; Bahar, Arifah

    2015-01-01

    The Prendiville process is another variation of the logistic model which assumes a linearly decreasing population growth rate. It is a continuous-time Markov chain (CTMC) taking integer values in a finite interval. The continuous-time Markov chain can be approximated by a stochastic differential equation (SDE). This paper discusses the stochastic differential equation of the Prendiville process. The work starts from the forward Kolmogorov equation of the continuous-time Markov chain for the Prendiville process, which is then written as a central-difference approximation. The approximation is then used in the Fokker-Planck equation to obtain the stochastic differential equation of the Prendiville process. The explicit solution of the Prendiville process is obtained from the stochastic differential equation. Therefore, the mean and variance functions of the Prendiville process can easily be found from the explicit solution

  12. Stochastic differential equation model to Prendiville processes

    Energy Technology Data Exchange (ETDEWEB)

    Granita, E-mail: granitafc@gmail.com [Dept. of Mathematical Science, Universiti Teknologi Malaysia, 81310, Johor Malaysia (Malaysia); Bahar, Arifah [Dept. of Mathematical Science, Universiti Teknologi Malaysia, 81310, Johor Malaysia (Malaysia); UTM Center for Industrial & Applied Mathematics (UTM-CIAM) (Malaysia)

    2015-10-22

    The Prendiville process is another variation of the logistic model which assumes a linearly decreasing population growth rate. It is a continuous-time Markov chain (CTMC) taking integer values in a finite interval. The continuous-time Markov chain can be approximated by a stochastic differential equation (SDE). This paper discusses the stochastic differential equation of the Prendiville process. The work starts from the forward Kolmogorov equation of the continuous-time Markov chain for the Prendiville process, which is then written as a central-difference approximation. The approximation is then used in the Fokker-Planck equation to obtain the stochastic differential equation of the Prendiville process. The explicit solution of the Prendiville process is obtained from the stochastic differential equation. Therefore, the mean and variance functions of the Prendiville process can easily be found from the explicit solution.
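
    A hedged sketch of the diffusion (SDE) approximation for a Prendiville-type birth-death chain is shown below, integrated with Euler-Maruyama. The specific birth and death rates, b(N-n) and d·n, are an illustrative choice consistent with a linearly decreasing growth rate, not a transcription of the paper's derivation.

```python
import numpy as np

# Euler-Maruyama integration of a diffusion approximation to a Prendiville-type
# birth-death chain on {0, ..., N}, with birth rate b*(N - n) and death rate
# d*n (a linearly decreasing net growth rate, as described above):
#   dX = (b*(N - X) - d*X) dt + sqrt(b*(N - X) + d*X) dW.
# The rate forms and values are illustrative, not taken from the paper.
b, d, N = 0.4, 0.2, 100
h, T, x0 = 0.01, 50.0, 10.0
steps = int(T / h)
rng = np.random.default_rng(11)

x = np.empty(steps + 1)
x[0] = x0
for k in range(steps):
    drift = b * (N - x[k]) - d * x[k]
    diffusion = np.sqrt(max(b * (N - x[k]) + d * x[k], 0.0))
    x[k + 1] = x[k] + drift * h + diffusion * rng.normal(0.0, np.sqrt(h))
    x[k + 1] = min(max(x[k + 1], 0.0), N)   # keep the path inside the finite interval

print("theoretical mean b*N/(b+d) =", b * N / (b + d),
      " sample long-run mean =", round(x[steps // 2:].mean(), 2))
```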

  13. Error-in-variables models in calibration

    Science.gov (United States)

    Lira, I.; Grientschnig, D.

    2017-12-01

    In many calibration operations, the stimuli applied to the measuring system or instrument under test are derived from measurement standards whose values may be considered to be perfectly known. In that case, it is assumed that calibration uncertainty arises solely from inexact measurement of the responses, from imperfect control of the calibration process and from the possible inaccuracy of the calibration model. However, the premise that the stimuli are completely known is never strictly fulfilled and in some instances it may be grossly inadequate. Then, error-in-variables (EIV) regression models have to be employed. In metrology, these models have been approached mostly from the frequentist perspective. In contrast, not much guidance is available on their Bayesian analysis. In this paper, we first present a brief summary of the conventional statistical techniques that have been developed to deal with EIV models in calibration. We then proceed to discuss the alternative Bayesian framework under some simplifying assumptions. Through a detailed example about the calibration of an instrument for measuring flow rates, we provide advice on how the user of the calibration function should employ the latter framework for inferring the stimulus acting on the calibrated device when, in use, a certain response is measured.
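
    The toy example below (not the paper's Bayesian procedure) illustrates why errors in the stimuli matter: ordinary least squares applied to noisy stimulus values attenuates the slope of a straight-line calibration function, while a simple method-of-moments correction using the known stimulus error variance largely recovers it. All numbers are synthetic.

```python
import numpy as np

# Small numerical illustration (not the paper's Bayesian procedure) of why
# errors in the stimuli matter: ordinary least squares on noisy stimulus
# values attenuates the slope of a straight-line calibration function, while
# a method-of-moments correction using the known stimulus error variance
# largely recovers it. All numbers are synthetic.
rng = np.random.default_rng(5)
true_slope, true_intercept = 2.5, 1.0
sigma_x, sigma_y = 1.5, 0.2            # stimulus and response standard uncertainties
x_true = np.linspace(0.0, 10.0, 200)
x_obs = x_true + rng.normal(0.0, sigma_x, x_true.size)   # imperfectly known stimuli
y_obs = true_intercept + true_slope * x_true + rng.normal(0.0, sigma_y, x_true.size)

slope_ols = np.cov(x_obs, y_obs, bias=True)[0, 1] / np.var(x_obs)
# reliability ratio: var(true stimulus) / var(observed stimulus)
reliability = (np.var(x_obs) - sigma_x**2) / np.var(x_obs)
slope_eiv = slope_ols / reliability

print("OLS slope (attenuated):", round(slope_ols, 3))
print("EIV-corrected slope   :", round(slope_eiv, 3), "  true slope:", true_slope)
```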

  14. Stochastic growth logistic model with aftereffect for batch fermentation process

    Energy Technology Data Exchange (ETDEWEB)

    Rosli, Norhayati; Ayoubi, Tawfiqullah [Faculty of Industrial Sciences and Technology, Universiti Malaysia Pahang, Lebuhraya Tun Razak, 26300 Gambang, Pahang (Malaysia); Bahar, Arifah; Rahman, Haliza Abdul [Department of Mathematical Sciences, Faculty of Science, Universiti Teknologi Malaysia, 81310 Johor Bahru, Johor (Malaysia); Salleh, Madihah Md [Department of Biotechnology Industry, Faculty of Biosciences and Bioengineering, Universiti Teknologi Malaysia, 81310 Johor Bahru, Johor (Malaysia)

    2014-06-19

    In this paper, the stochastic growth logistic model with aftereffect for the cell growth of C. acetobutylicum P262 and the Luedeking-Piret equations for solvent production in a batch fermentation system are introduced. The parameter values of the mathematical models are estimated via the Levenberg-Marquardt method of non-linear least squares. We apply the Milstein scheme to solve the stochastic models numerically. The efficiency of the mathematical models is measured by comparing the simulated results and the experimental data of the microbial growth and solvent production in the batch system. Low values of the Root Mean-Square Error (RMSE) of the stochastic models with aftereffect indicate good fits.

  15. Stochastic growth logistic model with aftereffect for batch fermentation process

    Science.gov (United States)

    Rosli, Norhayati; Ayoubi, Tawfiqullah; Bahar, Arifah; Rahman, Haliza Abdul; Salleh, Madihah Md

    2014-06-01

    In this paper, the stochastic growth logistic model with aftereffect for the cell growth of C. acetobutylicum P262 and the Luedeking-Piret equations for solvent production in a batch fermentation system are introduced. The parameter values of the mathematical models are estimated via the Levenberg-Marquardt method of non-linear least squares. We apply the Milstein scheme to solve the stochastic models numerically. The efficiency of the mathematical models is measured by comparing the simulated results and the experimental data of the microbial growth and solvent production in the batch system. Low values of the Root Mean-Square Error (RMSE) of the stochastic models with aftereffect indicate good fits.

  16. Stochastic growth logistic model with aftereffect for batch fermentation process

    International Nuclear Information System (INIS)

    Rosli, Norhayati; Ayoubi, Tawfiqullah; Bahar, Arifah; Rahman, Haliza Abdul; Salleh, Madihah Md

    2014-01-01

    In this paper, the stochastic growth logistic model with aftereffect for the cell growth of C. acetobutylicum P262 and the Luedeking-Piret equations for solvent production in a batch fermentation system are introduced. The parameter values of the mathematical models are estimated via the Levenberg-Marquardt method of non-linear least squares. We apply the Milstein scheme to solve the stochastic models numerically. The efficiency of the mathematical models is measured by comparing the simulated results and the experimental data of the microbial growth and solvent production in the batch system. Low values of the Root Mean-Square Error (RMSE) of the stochastic models with aftereffect indicate good fits

  17. Towards Model Checking Stochastic Process Algebra

    NARCIS (Netherlands)

    Hermanns, H.; Grieskamp, W.; Santen, T.; Katoen, Joost P.; Stoddart, B.; Meyer-Kayser, J.; Siegle, M.

    2000-01-01

    Stochastic process algebras have been proven useful because they allow behaviour-oriented performance and reliability modelling. As opposed to traditional performance modelling techniques, the behaviour-oriented style supports composition and abstraction in a natural way. However, analysis of

  18. Stochastic models for predicting environmental impact in aquatic ecosystems

    International Nuclear Information System (INIS)

    Stewart-Oaten, A.

    1986-01-01

    The purposes of stochastic predictions are discussed in relation to the environmental impacts of nuclear power plants on aquatic ecosystems. One purpose is to aid in making rational decisions about whether a power plant should be built, where, and how it should be designed. The other purpose is to check on the models themselves in the light of what eventually happens. The author discusses the role of statistical decision theory in the decision-making problem. Various types of stochastic models and their problems are presented. In addition, some suggestions are made for generating usable stochastic models, and for checking and improving them. 12 references

  19. Stochastic model for the 3D microstructure of pristine and cyclically aged cathodes in Li-ion batteries

    Science.gov (United States)

    Kuchler, Klaus; Westhoff, Daniel; Feinauer, Julian; Mitsch, Tim; Manke, Ingo; Schmidt, Volker

    2018-04-01

    It is well-known that the microstructure of electrodes in lithium-ion batteries strongly affects their performance. Conversely, the microstructure can exhibit strong changes during the usage of the battery due to aging effects. For a better understanding of these effects, mathematical analysis and modeling have turned out to be of great help. In particular, stochastic 3D microstructure models have proven to be a powerful and very flexible tool to generate various kinds of particle-based structures. Recently, such models have been proposed for the microstructure of anodes in lithium-ion energy and power cells. In the present paper, we describe a stochastic modeling approach for the 3D microstructure of cathodes in a lithium-ion energy cell, which differs significantly from the one observed in anodes. The model for the cathode data enhances the ideas of the anode models, which have been developed so far. It is calibrated using 3D tomographic image data from pristine as well as two aged cathodes. A validation based on morphological image characteristics shows that the model is able to realistically describe both the microstructure of pristine and aged cathodes. Thus, we conclude that the model is suitable to generate virtual, but realistic microstructures of lithium-ion cathodes.

  20. Heston Model Calibration in the Brazilian Foreign Exchange (FX) Options Market

    Directory of Open Access Journals (Sweden)

    Joe Akira Yoshino

    2004-06-01

    Full Text Available Despite the relatively recent advances in the derivatives industry, the European FX option market uses simple models such as Black (1976) or Garman and Kohlhagen (1983). This widespread practice hides very important quantitative effects that could be better explored by using alternative pricing models, such as one that incorporates stochastic volatility features. Understanding and calibrating this type of pricing model represents a challenge in the current state of the art in financial engineering, especially in emerging markets, which are characterized by strong volatilities, periodic regime changes and, in most cases, a lack of liquidity, especially during crises. In this sense, this paper shows how to implement the Heston model for the Brazilian FX option market. This approach uses the volatility matrix provided by a pool of domestic market players. Although the Heston model presents a formal analytical solution (it does not require simulation), the closed-form solutions show some mathematical complexity. Thus, the main objective of this work is to implement this model in the Brazilian FX market.
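
    As a rough illustration of the pricing side of such an exercise, the sketch below values a European FX call under the Heston model by Monte Carlo with a full-truncation Euler scheme and a Garman-Kohlhagen-style drift. All parameters (spot, strike, rates, Heston coefficients) are illustrative placeholders, not values calibrated to the Brazilian market.

```python
import numpy as np

# Monte Carlo valuation of a European FX call under the Heston stochastic
# volatility model, using a full-truncation Euler scheme and a
# Garman-Kohlhagen-style drift (domestic minus foreign rate). All parameter
# values are illustrative placeholders, not calibrated market data.
S0, K, T = 3.0, 3.1, 0.5                     # spot, strike, maturity in years
r_d, r_f = 0.10, 0.02                        # domestic and foreign rates
kappa, theta, xi, rho, v0 = 2.0, 0.04, 0.5, -0.6, 0.04
n_paths, n_steps = 100_000, 100
dt = T / n_steps
rng = np.random.default_rng(2024)

S = np.full(n_paths, S0)
v = np.full(n_paths, v0)
for _ in range(n_steps):
    z1 = rng.standard_normal(n_paths)
    z2 = rho * z1 + np.sqrt(1.0 - rho**2) * rng.standard_normal(n_paths)
    v_pos = np.maximum(v, 0.0)               # full truncation of the variance
    S *= np.exp((r_d - r_f - 0.5 * v_pos) * dt + np.sqrt(v_pos * dt) * z1)
    v += kappa * (theta - v_pos) * dt + xi * np.sqrt(v_pos * dt) * z2

price = np.exp(-r_d * T) * np.maximum(S - K, 0.0).mean()
print("Heston Monte Carlo call price:", round(price, 4))
```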

  1. Introduction to modeling and analysis of stochastic systems

    CERN Document Server

    Kulkarni, V G

    2011-01-01

    This is an introductory-level text on stochastic modeling. It is suited for undergraduate students in engineering, operations research, statistics, mathematics, actuarial science, business management, computer science, and public policy. It employs a large number of examples to teach the students to use stochastic models of real-life systems to predict their performance, and use this analysis to design better systems. The book is devoted to the study of important classes of stochastic processes: discrete and continuous time Markov processes, Poisson processes, renewal and regenerative processes, semi-Markov processes, queueing models, and diffusion processes. The book systematically studies the short-term and the long-term behavior, cost/reward models, and first passage times. All the material is illustrated with many examples, and case studies. The book provides a concise review of probability in the appendix. The book emphasizes numerical answers to the problems. A collection of MATLAB programs to accompany...

  2. Stochastic Averaging and Stochastic Extremum Seeking

    CERN Document Server

    Liu, Shu-Jun

    2012-01-01

    Stochastic Averaging and Stochastic Extremum Seeking develops methods of mathematical analysis inspired by the interest in reverse engineering  and analysis of bacterial  convergence by chemotaxis and to apply similar stochastic optimization techniques in other environments. The first half of the text presents significant advances in stochastic averaging theory, necessitated by the fact that existing theorems are restricted to systems with linear growth, globally exponentially stable average models, vanishing stochastic perturbations, and prevent analysis over infinite time horizon. The second half of the text introduces stochastic extremum seeking algorithms for model-free optimization of systems in real time using stochastic perturbations for estimation of their gradients. Both gradient- and Newton-based algorithms are presented, offering the user the choice between the simplicity of implementation (gradient) and the ability to achieve a known, arbitrary convergence rate (Newton). The design of algorithms...

  3. Explicit calibration and simulation of stochastic fields by low-order ARMA processes

    DEFF Research Database (Denmark)

    Krenk, Steen

    2011-01-01

    A simple framework for autoregressive simulation of stochastic fields is presented. The autoregressive format leads to a simple exponential correlation structure in the time-dimension. In the case of scalar processes a more detailed correlation structure can be obtained by adding memory...... to the process via an extension to autoregressive moving average (ARMA) processes. The ARMA format incorporates a more detailed correlation structure by including previous values of the simulated process. Alternatively, a more detailed correlation structure can be obtained by including additional 'state......-space' variables in the simulation. For a scalar process this would imply an increase of the dimension of the process to be simulated. In the case of a stochastic field the correlation in the time-dimension is represented, although indirectly, in the simultaneous spatial correlation. The model with the shortest...
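
    The scalar autoregressive idea described above can be illustrated with a few lines of code: an AR(1) recursion whose autocorrelation is exponential in the time step, exp(-dt/tau). Extending the recursion with moving-average (ARMA) terms or extra state variables would refine the correlation structure; the parameters below are illustrative.

```python
import numpy as np

# AR(1) recursion with exponential autocorrelation exp(-dt/tau) in the time
# dimension, the simplest case of the autoregressive framework above. Adding
# moving-average (ARMA) terms or extra state variables refines the short-lag
# correlation structure. Parameters are illustrative.
tau, sigma = 5.0, 1.0            # correlation time and target standard deviation
dt, n = 0.1, 20000
phi = np.exp(-dt / tau)          # AR(1) coefficient
noise_std = sigma * np.sqrt(1.0 - phi**2)

rng = np.random.default_rng(8)
x = np.empty(n)
x[0] = rng.normal(0.0, sigma)
for k in range(1, n):
    x[k] = phi * x[k - 1] + noise_std * rng.normal()

m = 10                           # check: lag-m correlation should be ~ phi**m
empirical = np.corrcoef(x[:-m], x[m:])[0, 1]
print("lag-10 correlation, empirical vs exp(-m*dt/tau):",
      round(empirical, 3), round(phi**m, 3))
```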

  4. Predicting Footbridge Response using Stochastic Load Models

    DEFF Research Database (Denmark)

    Pedersen, Lars; Frier, Christian

    2013-01-01

    Walking parameters such as step frequency, pedestrian mass, dynamic load factor, etc. are basically stochastic, although it is quite common to adapt deterministic models for these parameters. The present paper considers a stochastic approach to modeling the action of pedestrians, but when doing so...... decisions need to be made in terms of statistical distributions of walking parameters and in terms of the parameters describing the statistical distributions. The paper explores how sensitive computations of bridge response are to some of the decisions to be made in this respect. This is useful...

  5. Threshold Dynamics of a Stochastic Chemostat Model with Two Nutrients and One Microorganism

    Directory of Open Access Journals (Sweden)

    Jian Zhang

    2017-01-01

    Full Text Available A new stochastic chemostat model with two substitutable nutrients and one microorganism is proposed and investigated. Firstly, for the corresponding deterministic model, the threshold for extinction and permanence of the microorganism is obtained by analyzing the stability of the equilibria. Then, for the stochastic model, the threshold of the stochastic chemostat for extinction and permanence of the microorganism is explored. The difference between the thresholds of the deterministic model and the stochastic model shows that a large stochastic disturbance can affect the persistence of the microorganism and is harmful to its cultivation. To illustrate this phenomenon, we give some computer simulations with different intensities of stochastic noise disturbance.

  6. The Asymptotic Behaviour of a Stochastic 3D LANS-α Model

    International Nuclear Information System (INIS)

    Caraballo, Tomas; Marquez-Duran, Antonio M.; Real, Jose

    2006-01-01

    The long-time behaviour of a stochastic 3D LANS-α model on a bounded domain is analysed. First, we reformulate the model as an abstract problem. Next, we establish sufficient conditions ensuring the existence of stationary (steady state) solutions of this abstract nonlinear stochastic evolution equation, and study the stability properties of the model. Finally, we analyse the effects produced by stochastic perturbations in the deterministic version of the system (persistence of exponential stability as well as possible stabilisation effects produced by the noise). The general results are applied to our stochastic LANS-α system throughout the paper

  7. Fitting PAC spectra with stochastic models: PolyPacFit

    Energy Technology Data Exchange (ETDEWEB)

    Zacate, M. O., E-mail: zacatem1@nku.edu [Northern Kentucky University, Department of Physics and Geology (United States); Evenson, W. E. [Utah Valley University, College of Science and Health (United States); Newhouse, R.; Collins, G. S. [Washington State University, Department of Physics and Astronomy (United States)

    2010-04-15

    PolyPacFit is an advanced fitting program for time-differential perturbed angular correlation (PAC) spectroscopy. It incorporates stochastic models and provides robust options for customization of fits. Notable features of the program include platform independence and support for (1) fits to stochastic models of hyperfine interactions, (2) user-defined constraints among model parameters, (3) fits to multiple spectra simultaneously, and (4) nuclear probes of any spin.

  8. Stochastic modelling of two-phase flows including phase change

    International Nuclear Information System (INIS)

    Hurisse, O.; Minier, J.P.

    2011-01-01

    Stochastic modelling has already been developed and applied for single-phase flows and incompressible two-phase flows. In this article, we propose an extension of this modelling approach to two-phase flows including phase change (e.g. for steam-water flows). Two aspects are emphasised: a stochastic model accounting for phase transition and a modelling constraint which arises from volume conservation. To illustrate the whole approach, some remarks are eventually proposed for two-fluid models. (authors)

  9. A stochastic model for quantum measurement

    International Nuclear Information System (INIS)

    Budiyono, Agung

    2013-01-01

    We develop a statistical model of microscopic stochastic deviation from classical mechanics based on a stochastic process with a transition probability that is assumed to be given by an exponential distribution of infinitesimal stationary action. We apply the statistical model to stochastically modify a classical mechanical model for the measurement of physical quantities reproducing the prediction of quantum mechanics. The system+apparatus always has a definite configuration at all times, as in classical mechanics, fluctuating randomly following a continuous trajectory. On the other hand, the wavefunction and quantum mechanical Hermitian operator corresponding to the physical quantity arise formally as artificial mathematical constructs. During a single measurement, the wavefunction of the whole system+apparatus evolves according to a Schrödinger equation and the configuration of the apparatus acts as the pointer of the measurement so that there is no wavefunction collapse. We will also show that while the outcome of each single measurement event does not reveal the actual value of the physical quantity prior to measurement, its average in an ensemble of identical measurements is equal to the average of the actual value of the physical quantity prior to measurement over the distribution of the configuration of the system. (paper)

  10. The critical domain size of stochastic population models.

    Science.gov (United States)

    Reimer, Jody R; Bonsall, Michael B; Maini, Philip K

    2017-02-01

    Identifying the critical domain size necessary for a population to persist is an important question in ecology. Both demographic and environmental stochasticity impact a population's ability to persist. Here we explore ways of including this variability. We study populations with distinct dispersal and sedentary stages, which have traditionally been modelled using a deterministic integrodifference equation (IDE) framework. Individual-based models (IBMs) are the most intuitive stochastic analogues to IDEs but yield few analytic insights. We explore two alternate approaches; one is a scaling up to the population level using the Central Limit Theorem, and the other a variation on both Galton-Watson branching processes and branching processes in random environments. These branching process models closely approximate the IBM and yield insight into the factors determining the critical domain size for a given population subject to stochasticity.
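
    A minimal sketch of the branching-process approach mentioned above is given below: a Galton-Watson process with Poisson offspring, whose extinction probability solves q = exp(R(q-1)), compared against direct simulation. The offspring law and the value of R are assumptions for illustration only.

```python
import numpy as np

# Galton-Watson branching process with Poisson(R) offspring, a caricature of
# the branching-process approximation above: R plays the role of the expected
# number of recruits per individual retained inside the domain. The extinction
# probability q solves q = exp(R*(q - 1)); both R and the offspring law are
# assumptions for illustration.
R = 1.5
rng = np.random.default_rng(9)

def goes_extinct(max_gen=200, cap=10_000):
    n = 1
    for _ in range(max_gen):
        if n == 0:
            return True
        if n > cap:              # population has essentially escaped extinction
            return False
        n = rng.poisson(R, size=n).sum()
    return n == 0

simulated_q = np.mean([goes_extinct() for _ in range(5000)])

q = 0.5                          # fixed-point iteration for q = exp(R*(q - 1))
for _ in range(200):
    q = np.exp(R * (q - 1.0))

print("extinction probability: simulated", round(simulated_q, 3),
      "fixed point", round(q, 3))
```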

  11. Alternative Approaches to Technical Efficiency Estimation in the Stochastic Frontier Model

    OpenAIRE

    Acquah, H. de-Graft; Onumah, E. E.

    2014-01-01

    Estimating the stochastic frontier model and calculating the technical efficiency of decision-making units are of great importance in applied production economics. This paper estimates technical efficiency from the stochastic frontier model using the Jondrow et al. and the Battese and Coelli approaches. In order to compare alternative methods, simulated data with sample sizes of 60 and 200 are generated from a stochastic frontier model commonly applied to agricultural firms. Simulated data is employed to co...

  12. Stochastic Modelling of Shiroro River Stream flow Process

    OpenAIRE

    Musa, J. J

    2013-01-01

    Economists, social scientists and engineers provide insights into the drivers of anthropogenic climate change and the options for adaptation and mitigation, and yet other scientists, including geographers and biologists, study the impacts of climate change. This project concentrates mainly on the discharge from the Shiroro River. A stochastic approach is presented for modeling a time series by an Autoregressive Moving Average model (ARMA). The development and use of a stochastic stream flow m...

  13. TIME-DEPENDENT STOCHASTIC ACCELERATION MODEL FOR FERMI BUBBLES

    Energy Technology Data Exchange (ETDEWEB)

    Sasaki, Kento; Asano, Katsuaki; Terasawa, Toshio, E-mail: kentos@icrr.u-tokyo.ac.jp, E-mail: asanok@icrr.u-tokyo.ac.jp, E-mail: terasawa@icrr.u-tokyo.ac.jp [Institute for Cosmic Ray Research, The University of Tokyo, 5-1-5 Kashiwanoha, Kashiwa, Chiba 277-8582 (Japan)

    2015-12-01

    We study stochastic acceleration models for the Fermi bubbles. Turbulence is excited just behind the shock front via Kelvin–Helmholtz, Rayleigh–Taylor, or Richtmyer–Meshkov instabilities, and plasma particles are continuously accelerated by the interaction with the turbulence. The turbulence gradually decays as it goes away from the shock fronts. Adopting a phenomenological model for the stochastic acceleration, we explicitly solve the temporal evolution of the particle energy distribution in the turbulence. Our results show that the spatial distribution of high-energy particles is different from that for a steady solution. We also show that the contribution of electrons that escaped from the acceleration regions significantly softens the photon spectrum. The photon spectrum and surface brightness profile are reproduced by our models. If the escape efficiency is very high, the radio flux from the escaped low-energy electrons can be comparable to that of the WMAP haze. We also demonstrate hadronic models with stochastic acceleration, but they are unlikely from the viewpoint of the energy budget.

  14. Computer Aided Continuous Time Stochastic Process Modelling

    DEFF Research Database (Denmark)

    Kristensen, N.R.; Madsen, Henrik; Jørgensen, Sten Bay

    2001-01-01

    A grey-box approach to process modelling that combines deterministic and stochastic modelling is advocated for identification of models for model-based control of batch and semi-batch processes. A computer-aided tool designed for supporting decision-making within the corresponding modelling cycle...

  15. Review of "Stochastic Modelling for Systems Biology" by Darren Wilkinson

    Directory of Open Access Journals (Sweden)

    Bullinger Eric

    2006-12-01

    Full Text Available Abstract "Stochastic Modelling for Systems Biology" by Darren Wilkinson introduces the peculiarities of stochastic modelling in biology. This book is particularly suited to as a textbook or for self-study, and for readers with a theoretical background.

  16. Numerical Approach to Spatial Deterministic-Stochastic Models Arising in Cell Biology.

    Science.gov (United States)

    Schaff, James C; Gao, Fei; Li, Ye; Novak, Igor L; Slepchenko, Boris M

    2016-12-01

    Hybrid deterministic-stochastic methods provide an efficient alternative to a fully stochastic treatment of models which include components with disparate levels of stochasticity. However, general-purpose hybrid solvers for spatially resolved simulations of reaction-diffusion systems are not widely available. Here we describe fundamentals of a general-purpose spatial hybrid method. The method generates realizations of a spatially inhomogeneous hybrid system by appropriately integrating capabilities of a deterministic partial differential equation solver with a popular particle-based stochastic simulator, Smoldyn. Rigorous validation of the algorithm is detailed, using a simple model of calcium 'sparks' as a testbed. The solver is then applied to a deterministic-stochastic model of spontaneous emergence of cell polarity. The approach is general enough to be implemented within biologist-friendly software frameworks such as Virtual Cell.

  17. Stochastic Modelling of River Geometry

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Schaarup-Jensen, K.

    1996-01-01

    Numerical hydrodynamic river models are used in a large number of applications to estimate critical events for rivers. These estimates are subject to a number of uncertainties. In this paper, the problem to evaluate these estimates using probabilistic methods is considered. Stochastic models for ...... for river geometries are formulated and a coupling between hydraulic computational methods and numerical reliability methods is presented....

  18. Stochastic Volatility and DSGE Models

    DEFF Research Database (Denmark)

    Andreasen, Martin Møller

    This paper argues that a specification of stochastic volatility commonly used to analyze the Great Moderation in DSGE models may not be appropriate, because the level of a process with this specification does not have conditional or unconditional moments. This is unfortunate because agents may...

  19. Stability analysis of multi-group deterministic and stochastic epidemic models with vaccination rate

    International Nuclear Information System (INIS)

    Wang Zhi-Gang; Gao Rui-Mei; Fan Xiao-Ming; Han Qi-Xing

    2014-01-01

    We discuss in this paper a deterministic multi-group MSIR epidemic model with a vaccination rate. The basic reproduction number ℛ0, a key parameter in epidemiology, is a threshold which determines the persistence or extinction of the disease. By using Lyapunov function techniques, we show that if ℛ0 is greater than 1 and the deterministic model obeys some conditions, then the disease will prevail, the infective individuals persist, and the endemic state is asymptotically stable in a feasible region. If ℛ0 is less than or equal to 1, then the infective individuals disappear, so the disease dies out. In addition, stochastic noise around the endemic equilibrium is added to the deterministic MSIR model so that the deterministic model is extended to a system of stochastic ordinary differential equations. In the stochastic version, we carry out a detailed analysis of the asymptotic behavior of the stochastic model. In addition, regarding the value of ℛ0, when the stochastic system obeys some conditions and ℛ0 is greater than 1, we deduce that the stochastic system is stochastically asymptotically stable. Finally, the deterministic and stochastic model dynamics are illustrated through computer simulations. (general)

  20. Stochastic differential equations used to model conjugation

    DEFF Research Database (Denmark)

    Philipsen, Kirsten Riber; Christiansen, Lasse Engbo

    Stochastic differential equations (SDEs) are used to model horizontal transfer of antibiotic resistance by conjugation. The model describes the concentration of donor, recipient, transconjugants and substrate. The strength of the SDE model over the traditional ODE models is that the noise can

  1. Stochastic Differential Equation-Based Flexible Software Reliability Growth Model

    Directory of Open Access Journals (Sweden)

    P. K. Kapur

    2009-01-01

    Full Text Available Several software reliability growth models (SRGMs) have been developed by software developers for tracking and measuring the growth of reliability. As the size of the software system becomes large and the number of faults detected during the testing phase grows, the change in the number of faults that are detected and removed through each debugging becomes sufficiently small compared with the initial fault content at the beginning of the testing phase. In such a situation, we can model the software fault detection process as a stochastic process with a continuous state space. In this paper, we propose a new software reliability growth model based on an Itô-type stochastic differential equation. We consider an SDE-based generalized Erlang model with a logistic error detection function. The model is estimated and validated on real-life data sets cited in the literature to show its flexibility. The proposed model, integrated with the concept of the stochastic differential equation, performs comparatively better than the existing NHPP-based models.

  2. SR 97. Alternative models project. Stochastic continuum modelling of Aberg

    International Nuclear Information System (INIS)

    Widen, H.; Walker, D.

    1999-08-01

    As part of studies into the siting of a deep repository for nuclear waste, Swedish Nuclear Fuel and Waste Management Company (SKB) has commissioned the Alternative Models Project (AMP). The AMP is a comparison of three alternative modelling approaches to bedrock performance assessment for a single hypothetical repository, arbitrarily named Aberg. The Aberg repository will adopt input parameters from the Aespoe Hard Rock Laboratory in southern Sweden. The models are restricted to an explicit domain, boundary conditions and canister location to facilitate the comparison. The boundary conditions are based on the regional groundwater model provided in digital format. This study is the application of HYDRASTAR, a stochastic continuum groundwater flow and transport-modelling program. The study uses 34 realisations of 945 canister locations in the hypothetical repository to evaluate the uncertainty of the advective travel time, canister flux (Darcy velocity at a canister) and F-ratio. Several comparisons of variability are constructed between individual canister locations and individual realisations. For the ensemble of all realisations with all canister locations, the study found a median travel time of 27 years, a median canister flux of 7.1 × 10⁻⁴ m/yr and a median F-ratio of 3.3 × 10⁵ yr/m. The overall pattern of regional flow is preserved in the site-scale model, as is reflected in flow paths and exit locations. The site-scale model slightly over-predicts the boundary fluxes from the single realisation of the regional model. The explicitly prescribed domain was seen to be slightly restrictive, with 6% of the stream tubes failing to exit the upper surface of the model. Sensitivity analysis and calibration are suggested as possible extensions of the modelling study

  3. A primer on stochastic epidemic models: Formulation, numerical simulation, and analysis

    Directory of Open Access Journals (Sweden)

    Linda J.S. Allen

    2017-05-01

    Full Text Available Some mathematical methods for formulation and numerical simulation of stochastic epidemic models are presented. Specifically, models are formulated for continuous-time Markov chains and stochastic differential equations. Some well-known examples are used for illustration such as an SIR epidemic model and a host-vector malaria model. Analytical methods for approximating the probability of a disease outbreak are also discussed. Keywords: Branching process, Continuous-time Markov chain, Minor outbreak, Stochastic differential equation, 2000 MSC: 60H10, 60J28, 92D30
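
    To complement the abstract above, the sketch below integrates one common Itô-diffusion (SDE) formulation of the SIR epidemic with the Euler-Maruyama method, using one Wiener process per event type (infection, recovery). Parameter values are illustrative and the formulation is one of several possible stochastic versions.

```python
import numpy as np

# Euler-Maruyama integration of one common Ito-diffusion (SDE) formulation of
# the SIR epidemic, with one Wiener increment per event type (infection and
# recovery). Parameter values are illustrative only.
beta, gamma, N = 0.3, 0.1, 1000.0
S, I = N - 10.0, 10.0
h, T = 0.01, 200.0
rng = np.random.default_rng(12)

for _ in range(int(T / h)):
    inf_rate = beta * S * I / N              # infection event rate
    rec_rate = gamma * I                     # recovery event rate
    dW1, dW2 = rng.normal(0.0, np.sqrt(h), 2)
    S += -inf_rate * h - np.sqrt(max(inf_rate, 0.0)) * dW1
    I += (inf_rate - rec_rate) * h + np.sqrt(max(inf_rate, 0.0)) * dW1 \
         - np.sqrt(max(rec_rate, 0.0)) * dW2
    S, I = max(S, 0.0), max(I, 0.0)

print("susceptibles remaining on one sample path:", round(S, 1))
```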

  4. Some Remarks on Stochastic Versions of the Ramsey Growth Model

    Czech Academy of Sciences Publication Activity Database

    Sladký, Karel

    2012-01-01

    Roč. 19, č. 29 (2012), s. 139-152 ISSN 1212-074X R&D Projects: GA ČR GAP402/10/1610; GA ČR GAP402/10/0956; GA ČR GAP402/11/0150 Institutional support: RVO:67985556 Keywords : Economic dynamics * Ramsey growth model with disturbance * stochastic dynamic programming * multistage stochastic programs Subject RIV: BB - Applied Statistics, Operational Research http://library.utia.cas.cz/separaty/2013/E/sladky-some remarks on stochastic versions of the ramsey growth model.pdf

  5. Stochastic modeling of consumer preferences for health care institutions.

    Science.gov (United States)

    Malhotra, N K

    1983-01-01

    This paper proposes a stochastic procedure for modeling consumer preferences via LOGIT analysis. First, a simple, non-technical exposition of the use of a stochastic approach in health care marketing is presented. Second, a study illustrating the application of the LOGIT model in assessing consumer preferences for hospitals is given. The paper concludes with several implications of the proposed approach.

  6. Stochastic ontogenetic growth model

    Science.gov (United States)

    West, B. J.; West, D.

    2012-02-01

    An ontogenetic growth model (OGM) for a thermodynamically closed system is generalized to satisfy both the first and second laws of thermodynamics. The hypothesized stochastic ontogenetic growth model (SOGM) is shown to entail the interspecies allometry relation by explicitly averaging the basal metabolic rate and the total body mass over the steady-state probability density for the total body mass (TBM). This is the first derivation of the interspecies metabolic allometric relation from a dynamical model; the asymptotic steady-state distribution of the TBM is fit to data and shown to be an inverse power law.

  7. Multivariate moment closure techniques for stochastic kinetic models

    International Nuclear Information System (INIS)

    Lakatos, Eszter; Ale, Angelique; Kirk, Paul D. W.; Stumpf, Michael P. H.

    2015-01-01

    Stochastic effects dominate many chemical and biochemical processes. Their analysis, however, can be computationally prohibitively expensive and a range of approximation schemes have been proposed to lighten the computational burden. These, notably the increasingly popular linear noise approximation and the more general moment expansion methods, perform well for many dynamical regimes, especially linear systems. At higher levels of nonlinearity, it comes to an interplay between the nonlinearities and the stochastic dynamics, which is much harder to capture correctly by such approximations to the true stochastic processes. Moment-closure approaches promise to address this problem by capturing higher-order terms of the temporally evolving probability distribution. Here, we develop a set of multivariate moment-closures that allows us to describe the stochastic dynamics of nonlinear systems. Multivariate closure captures the way that correlations between different molecular species, induced by the reaction dynamics, interact with stochastic effects. We use multivariate Gaussian, gamma, and lognormal closure and illustrate their use in the context of two models that have proved challenging to the previous attempts at approximating stochastic dynamics: oscillations in p53 and Hes1. In addition, we consider a larger system, Erk-mediated mitogen-activated protein kinases signalling, where conventional stochastic simulation approaches incur unacceptably high computational costs

  8. Multivariate moment closure techniques for stochastic kinetic models

    Energy Technology Data Exchange (ETDEWEB)

    Lakatos, Eszter, E-mail: e.lakatos13@imperial.ac.uk; Ale, Angelique; Kirk, Paul D. W.; Stumpf, Michael P. H., E-mail: m.stumpf@imperial.ac.uk [Department of Life Sciences, Centre for Integrative Systems Biology and Bioinformatics, Imperial College London, London SW7 2AZ (United Kingdom)

    2015-09-07

    Stochastic effects dominate many chemical and biochemical processes. Their analysis, however, can be computationally prohibitively expensive and a range of approximation schemes have been proposed to lighten the computational burden. These, notably the increasingly popular linear noise approximation and the more general moment expansion methods, perform well for many dynamical regimes, especially linear systems. At higher levels of nonlinearity, it comes to an interplay between the nonlinearities and the stochastic dynamics, which is much harder to capture correctly by such approximations to the true stochastic processes. Moment-closure approaches promise to address this problem by capturing higher-order terms of the temporally evolving probability distribution. Here, we develop a set of multivariate moment-closures that allows us to describe the stochastic dynamics of nonlinear systems. Multivariate closure captures the way that correlations between different molecular species, induced by the reaction dynamics, interact with stochastic effects. We use multivariate Gaussian, gamma, and lognormal closure and illustrate their use in the context of two models that have proved challenging to the previous attempts at approximating stochastic dynamics: oscillations in p53 and Hes1. In addition, we consider a larger system, Erk-mediated mitogen-activated protein kinases signalling, where conventional stochastic simulation approaches incur unacceptably high computational costs.
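
    To illustrate the idea of moment closure in a minimal setting, the sketch below applies a Gaussian (normal) closure to a toy birth / pair-annihilation system (0 -> X at rate b; X + X -> 0 with propensity (k/2)x(x-1)) rather than to the p53, Hes1 or Erk models of the paper; the reactions, rates and closure order are assumptions chosen only to keep the example short.

    import numpy as np
    from scipy.integrate import solve_ivp

    b, k = 10.0, 0.05          # birth rate, pair-annihilation rate (assumed values)

    def closed_moments(t, m):
        m1, m2 = m                               # first two raw moments of X
        m3 = 3 * m1 * m2 - 2 * m1**3             # Gaussian closure for E[X^3]
        dm1 = b - k * (m2 - m1)
        dm2 = 2 * b * m1 + b - 2 * k * m3 + 4 * k * m2 - 2 * k * m1
        return [dm1, dm2]

    sol = solve_ivp(closed_moments, (0.0, 20.0), [0.0, 0.0])
    m1, m2 = sol.y[:, -1]
    print("closure estimate: mean ~", round(m1, 2), " variance ~", round(m2 - m1**2, 2))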

  9. Comparison of stochastic models in Monte Carlo simulation of coated particle fuels

    International Nuclear Information System (INIS)

    Yu Hui; Nam Zin Cho

    2013-01-01

    There is growing interest worldwide in very high temperature gas cooled reactors as candidates for next generation reactor systems. For the design and analysis of such reactors, with the double heterogeneity introduced by coated particle fuels randomly distributed in graphite pebbles, stochastic transport models are becoming essential. Several models have been reported in the literature, such as coarse lattice models, fine lattice stochastic (FLS) models, random sequential addition (RSA) models and Metropolis models. The principles and performance of these stochastic models are described and compared in this paper. Compared with the usual fixed lattice methods, sub-FLS modeling allows a more realistic stochastic distribution of fuel particles and thus results in more accurate criticality calculations. Compared with the basic RSA method, sub-FLS modeling requires a simpler and more efficient overlap-checking procedure. (authors)
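
    A minimal sketch of the random sequential addition (RSA) idea mentioned above is given below: candidate particle centres are drawn uniformly inside a cubic cell and accepted only if they do not overlap any previously placed sphere. The cell size, particle radius and target count are arbitrary illustrative values, and no periodic boundaries or pebble geometry are modelled.

    import numpy as np

    rng = np.random.default_rng(1)
    L, r, target_n = 1.0, 0.05, 200     # cell edge, particle radius, particle count
    centres, attempts = [], 0
    while len(centres) < target_n and attempts < 100_000:
        attempts += 1
        c = rng.uniform(r, L - r, size=3)                      # sphere fully inside the cell
        if all(np.linalg.norm(c - p) >= 2 * r for p in centres):
            centres.append(c)                                  # accept: no overlap with earlier spheres
    packing = len(centres) * (4.0 / 3.0) * np.pi * r**3 / L**3
    print(len(centres), "particles placed, packing fraction", round(packing, 4))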

  10. Compositional Modelling of Stochastic Hybrid Systems

    NARCIS (Netherlands)

    Strubbe, S.N.

    2005-01-01

    In this thesis we present a modelling framework for compositional modelling of stochastic hybrid systems. Hybrid systems consist of a combination of continuous and discrete dynamics. The state space of a hybrid system is hybrid in the sense that it consists of a continuous component and a discrete component.

  11. Stochastic models for turbulent reacting flows

    Energy Technology Data Exchange (ETDEWEB)

    Kerstein, A. [Sandia National Laboratories, Livermore, CA (United States)

    1993-12-01

    The goal of this program is to develop and apply stochastic models of various processes occurring within turbulent reacting flows in order to identify the fundamental mechanisms governing these flows, to support experimental studies of these flows, and to further the development of comprehensive turbulent reacting flow models.

  12. Multi-scenario modelling of uncertainty in stochastic chemical systems

    International Nuclear Information System (INIS)

    Evans, R. David; Ricardez-Sandoval, Luis A.

    2014-01-01

    Uncertainty analysis has not been well studied at the molecular scale, despite extensive knowledge of uncertainty in macroscale systems. The ability to predict the effect of uncertainty allows for robust control of small scale systems such as nanoreactors, surface reactions, and gene toggle switches. However, it is difficult to model uncertainty in such chemical systems as they are stochastic in nature, and require a large computational cost. To address this issue, a new model of uncertainty propagation in stochastic chemical systems, based on the Chemical Master Equation, is proposed in the present study. The uncertain solution is approximated by a composite state comprised of the averaged effect of samples from the uncertain parameter distributions. This model is then used to study the effect of uncertainty on an isomerization system and a two gene regulation network called a repressilator. The results of this model show that uncertainty in stochastic systems is dependent on both the uncertain distribution, and the system under investigation. -- Highlights: •A method to model uncertainty on stochastic systems was developed. •The method is based on the Chemical Master Equation. •Uncertainty in an isomerization reaction and a gene regulation network was modelled. •Effects were significant and dependent on the uncertain input and reaction system. •The model was computationally more efficient than Kinetic Monte Carlo
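
    As a simplified illustration of propagating parameter uncertainty through a stochastic chemical system, the sketch below averages Gillespie (SSA) runs of a reversible isomerization A <-> B over rate constants sampled from assumed lognormal distributions; this is a brute-force stand-in for the composite-state approach of the paper, and all numbers are invented.

    import numpy as np

    rng = np.random.default_rng(2)

    def ssa_isomerization(kf, kr, a0=100, t_end=5.0):
        """Gillespie simulation of A <-> B; returns the count of A at t_end."""
        a, t = a0, 0.0
        while True:
            rates = np.array([kf * a, kr * (a0 - a)])   # propensities of A->B and B->A
            total = rates.sum()
            if total == 0.0:
                return a
            tau = rng.exponential(1.0 / total)
            if t + tau > t_end:
                return a
            t += tau
            a += -1 if rng.random() < rates[0] / total else 1

    finals = []
    for _ in range(200):                                # sample the uncertain rate constants
        kf = rng.lognormal(mean=np.log(1.0), sigma=0.3)
        kr = rng.lognormal(mean=np.log(0.5), sigma=0.3)
        finals.append(ssa_isomerization(kf, kr))
    print("A(t_end): mean", np.mean(finals), "std", round(np.std(finals), 1))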

  13. Identifiability in stochastic models

    CERN Document Server

    1992-01-01

    The problem of identifiability is basic to all statistical methods and data analysis, occurring in such diverse areas as Reliability Theory, Survival Analysis, and Econometrics, where stochastic modeling is widely used. Mathematics dealing with identifiability per se is closely related to the so-called branch of ""characterization problems"" in Probability Theory. This book brings together relevant material on identifiability as it occurs in these diverse fields.

  14. Skew redundant MEMS IMU calibration using a Kalman filter

    International Nuclear Information System (INIS)

    Jafari, M; Sahebjameyan, M; Moshiri, B; Najafabadi, T A

    2015-01-01

    In this paper, a novel calibration procedure for skew redundant inertial measurement units (SRIMUs) based on micro-electro mechanical systems (MEMS) is proposed. A general model of the SRIMU measurements is derived which contains the effects of bias, scale factor error and misalignments. For more accuracy, the effect of lever arms of the accelerometers to the center of the table are modeled and compensated in the calibration procedure. Two separate Kalman filters (KFs) are proposed to perform the estimation of error parameters for gyroscopes and accelerometers. The predictive error minimization (PEM) stochastic modeling method is used to simultaneously model the effect of bias instability and random walk noise on the calibration Kalman filters to diminish the biased estimations. The proposed procedure is simulated numerically and has expected experimental results. The calibration maneuvers are applied using a two-axis angle turntable in a way that the persistency of excitation (PE) condition for parameter estimation is met. For this purpose, a trapezoidal calibration profile is utilized to excite different deterministic error parameters of the accelerometers and a pulse profile is used for the gyroscopes. Furthermore, to evaluate the performance of the proposed KF calibration method, a conventional least squares (LS) calibration procedure is derived for the SRIMUs and the simulation and experimental results compare the functionality of the two proposed methods with each other. (paper)
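
    The following sketch shows the core of a Kalman-filter calibration step for a single sensor axis, estimating a constant bias and scale-factor error against a known turntable reference profile. It omits the misalignments, lever arms and PEM stochastic modelling discussed above, and the trapezoidal profile and noise levels are assumed values.

    import numpy as np

    rng = np.random.default_rng(3)
    true_bias, true_scale = 0.05, 0.002                 # hypothetical bias and scale-factor error
    u = np.concatenate([np.linspace(0, 9.81, 50),
                        np.full(100, 9.81),
                        np.linspace(9.81, 0, 50)])      # trapezoidal reference profile
    z = (1 + true_scale) * u + true_bias + rng.normal(0, 0.02, u.size)

    x = np.zeros(2)                                     # state: [bias, scale-factor error]
    P = np.eye(2)
    Q = 1e-10 * np.eye(2)                               # parameters treated as (nearly) constant
    R = 0.02**2
    for uk, zk in zip(u, z):
        P = P + Q                                       # predict step (identity dynamics)
        H = np.array([1.0, uk])                         # z - u = bias + scale_error * u
        S = H @ P @ H + R
        K = P @ H / S                                   # Kalman gain
        x = x + K * (zk - uk - H @ x)                   # update with the innovation
        P = (np.eye(2) - np.outer(K, H)) @ P
    print("estimated [bias, scale-factor error]:", x.round(4))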

  15. Transient Inverse Calibration of the Site-Wide Groundwater Flow Model (ACM-2): FY03 Progress Report

    International Nuclear Information System (INIS)

    Vermeul, Vince R.; Bergeron, Marcel P.; Cole, C R.; Murray, Christopher J.; Nichols, William E.; Scheibe, Timothy D.; Thorne, Paul D.; Waichler, Scott R.; Xie, YuLong

    2003-01-01

    this task is to assess uncertainty based on the most current model (ACM-2), this preliminary work provided an effective basis for developing the approach and implementation methodology. A strategy was developed to facilitate inverse calibration analysis of the large number of stochastic ACMs generated. These stochastic ACMs are random selections from a range of possible model structures, all of which are consistent with available observations. However, a single inverse run requires many forward flow model runs, and full inverse analysis of the large number of combinations of stochastic alternative models is not now computationally feasible. Thus, a two-part approach was developed: (1) full inverse modeling of selected realizations combined with limited forward modeling and (2) implementation of the UCODE/CFEST inverse modeling framework to enhance computational capabilities

  16. Calibration of semi-stochastic procedure for simulating high-frequency ground motions

    Science.gov (United States)

    Seyhan, Emel; Stewart, Jonathan P.; Graves, Robert

    2013-01-01

    Broadband ground motion simulation procedures typically utilize physics-based modeling at low frequencies, coupled with semi-stochastic procedures at high frequencies. The high-frequency procedure considered here combines deterministic Fourier amplitude spectra (dependent on source, path, and site models) with random phase. Previous work showed that high-frequency intensity measures from this simulation methodology attenuate faster with distance and have lower intra-event dispersion than in empirical equations. We address these issues by increasing crustal damping (Q) to reduce distance attenuation bias and by introducing random site-to-site variations to Fourier amplitudes using a lognormal standard deviation ranging from 0.45 for Mw  100 km).

  17. Estimation of Stochastic Volatility Models by Nonparametric Filtering

    DEFF Research Database (Denmark)

    Kanaya, Shin; Kristensen, Dennis

    2016-01-01

    /estimated volatility process replacing the latent process. Our estimation strategy is applicable to both parametric and nonparametric stochastic volatility models, and can handle both jumps and market microstructure noise. The resulting estimators of the stochastic volatility model will carry additional biases...... and variances due to the first-step estimation, but under regularity conditions we show that these vanish asymptotically and our estimators inherit the asymptotic properties of the infeasible estimators based on observations of the volatility process. A simulation study examines the finite-sample properties...

  18. Pan-European stochastic flood event set

    Science.gov (United States)

    Kadlec, Martin; Pinto, Joaquim G.; He, Yi; Punčochář, Petr; Kelemen, Fanni D.; Manful, Desmond; Palán, Ladislav

    2017-04-01

    predictands. Finally, the transfer functions are applied to a large ensemble of GCM simulations with forcing corresponding to present day climate conditions to generate highly resolved stochastic time series of precipitation and temperature for several thousand years. These time series form the input for the rainfall-runoff model developed by the UEA team. It is a spatially distributed model adapted from the HBV model and will be calibrated for individual basins using historical discharge data. The calibrated model will be driven by the precipitation time series generated by the KIT team to simulate discharges at a daily time step. The uncertainties in the simulated discharges will be analysed using multiple model parameter sets. A number of statistical methods will be used to assess return periods, changes in the magnitudes, changes in the characteristics of floods such as time base and time to peak, and spatial correlations of large flood events. The Pan-European flood stochastic event set will permit a better view of flood risk for market applications.

  19. Advances in stochastic and deterministic global optimization

    CERN Document Server

    Zhigljavsky, Anatoly; Žilinskas, Julius

    2016-01-01

    Current research results in stochastic and deterministic global optimization including single and multiple objectives are explored and presented in this book by leading specialists from various fields. Contributions include applications to multidimensional data visualization, regression, survey calibration, inventory management, timetabling, chemical engineering, energy systems, and competitive facility location. Graduate students, researchers, and scientists in computer science, numerical analysis, optimization, and applied mathematics will be fascinated by the theoretical, computational, and application-oriented aspects of stochastic and deterministic global optimization explored in this book. This volume is dedicated to the 70th birthday of Antanas Žilinskas who is a leading world expert in global optimization. Professor Žilinskas's research has concentrated on studying models for the objective function, the development and implementation of efficient algorithms for global optimization with single and mu...

  20. A Simulation-Based Dynamic Stochastic Route Choice Model for Evacuation

    Directory of Open Access Journals (Sweden)

    Xing Zhao

    2012-01-01

    This paper establishes a dynamic stochastic route choice model for evacuation to simulate the propagation process of traffic flow and estimate the stochastic route choice under evacuation situations. The model contains a lane-group-based cell transmission model (CTM) which sets different traffic capacities for links with different turning movements to flow out in an evacuation situation, an actual impedance model which obtains the impedance of each route in time units at each time interval, and a stochastic route choice model according to the probit-based stochastic user equilibrium. In this model, vehicles loading at each origin at each time interval are assumed to choose an evacuation route under a determinate road network, signal design, and OD demand. As a case study, the proposed model is validated on the network nearby Nanjing Olympic Center after the opening ceremony of the 10th National Games of the People's Republic of China. The traffic volumes and clearing time at five exit points of the evacuation zone are calculated by the model to compare with survey data. The results show that this model can appropriately simulate the dynamic route choice and evolution process of the traffic flow on the network in an evacuation situation.

  1. On the small-time behavior of stochastic logistic models

    Directory of Open Access Journals (Sweden)

    Dung Tien Nguyen

    2017-09-01

    In this paper we investigate the small-time behaviors of the solution to a stochastic logistic model. The obtained results allow us to estimate the number of individuals in the population and can be used to study stochastic prey-predator systems.

  2. Spatial stochastic regression modelling of urban land use

    International Nuclear Information System (INIS)

    Arshad, S H M; Jaafar, J; Abiden, M Z Z; Latif, Z A; Rasam, A R A

    2014-01-01

    Urbanization is very closely linked to industrialization, commercialization and overall economic growth and development. This brings innumerable benefits for the quantity and quality of the urban environment and lifestyle, but on the other hand it contributes to unbounded development, urban sprawl, overcrowding and a decreasing standard of living. Regulation and observation of urban development activities are therefore crucial. An understanding of the urban systems that promote urban growth is also essential for policy making, formulating development strategies and preparing development plans. This study compares two different stochastic regression modelling techniques for spatial structure models of urban growth in the same study area. Both techniques utilize the same datasets and their results are analyzed. The work starts by producing urban growth models using two stochastic regression modelling techniques, namely Ordinary Least Squares (OLS) and Geographically Weighted Regression (GWR). The two techniques are compared, and GWR is found to be the more significant stochastic regression model: it gives a smaller AICc (corrected Akaike Information Criterion) value and its output is more spatially explainable

  3. Derivation of flood frequency curves in poorly gauged Mediterranean catchments using a simple stochastic hydrological rainfall-runoff model

    Science.gov (United States)

    Aronica, G. T.; Candela, A.

    2007-12-01

    Summary: In this paper a Monte Carlo procedure for deriving frequency distributions of peak flows using a semi-distributed stochastic rainfall-runoff model is presented. The rainfall-runoff model used here is a very simple one with a limited number of parameters that practically requires no calibration, resulting in a robust tool for catchments that are partially or poorly gauged. The procedure is based on three modules: a stochastic rainfall generator module, a hydrologic loss module and a flood routing module. In the rainfall generator module the rainfall storm, i.e. the maximum rainfall depth for a fixed duration, is assumed to follow the two-component extreme value (TCEV) distribution, whose parameters have been estimated at regional scale for Sicily. The catchment response is modelled using the Soil Conservation Service Curve Number (SCS-CN) method, in a semi-distributed form, for the transformation of total rainfall into effective rainfall, and a simple form of IUH for the flood routing. Here, the SCS-CN method is implemented in probabilistic form with respect to prior-to-storm conditions, allowing the classical iso-frequency assumption between rainfall and peak flow to be relaxed. The procedure is tested on six practical case studies where synthetic FFCs (flood frequency curves) were obtained from the distributions of the model variables by simulating 5000 flood events, combining 5000 values of total rainfall depth for the storm duration with AMC (antecedent moisture condition) values. The application of this procedure shows how the Monte Carlo simulation technique can reproduce observed flood frequency curves with reasonable accuracy over a wide range of return periods, using a simple and parsimonious approach, limited input data and no calibration of the rainfall-runoff model.
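
    A condensed sketch of the Monte Carlo chain described above is given below: storm rainfall depths are sampled from a probability distribution, an antecedent-moisture-dependent curve number is drawn per event, SCS-CN runoff is computed and an empirical flood frequency curve is read off. A Gumbel rainfall distribution and three discrete AMC classes stand in for the regional TCEV model and the probabilistic AMC treatment of the paper, and the routing step is omitted.

    import numpy as np

    rng = np.random.default_rng(4)
    n_events = 5000
    rain = rng.gumbel(loc=60.0, scale=20.0, size=n_events)        # storm depth [mm], assumed Gumbel
    cn = rng.choice([60.0, 75.0, 88.0], p=[0.3, 0.5, 0.2], size=n_events)  # assumed AMC classes
    s = 25400.0 / cn - 254.0                                      # potential retention S [mm]
    ia = 0.2 * s                                                  # initial abstraction
    runoff = np.where(rain > ia, (rain - ia) ** 2 / (rain + 0.8 * s), 0.0)

    sorted_q = np.sort(runoff)[::-1]                              # empirical frequency curve
    return_period = (n_events + 1.0) / (np.arange(n_events) + 1.0)
    for T in (10, 50, 100):
        q_T = np.interp(T, return_period[::-1], sorted_q[::-1])
        print(f"T = {T:3d}: runoff depth ~ {q_T:.1f} mm")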

  4. Stochastic Geometric Models with Non-stationary Spatial Correlations in Lagrangian Fluid Flows

    Science.gov (United States)

    Gay-Balmaz, François; Holm, Darryl D.

    2018-01-01

    Inspired by spatiotemporal observations from satellites of the trajectories of objects drifting near the surface of the ocean in the National Oceanic and Atmospheric Administration's "Global Drifter Program", this paper develops data-driven stochastic models of geophysical fluid dynamics (GFD) with non-stationary spatial correlations representing the dynamical behaviour of oceanic currents. Three models are considered. Model 1 from Holm (Proc R Soc A 471:20140963, 2015) is reviewed, in which the spatial correlations are time independent. Two new models, called Model 2 and Model 3, introduce two different symmetry breaking mechanisms by which the spatial correlations may be advected by the flow. These models are derived using reduction by symmetry of stochastic variational principles, leading to stochastic Hamiltonian systems, whose momentum maps, conservation laws and Lie-Poisson bracket structures are used in developing the new stochastic Hamiltonian models of GFD.

  5. Stochastic Geometric Models with Non-stationary Spatial Correlations in Lagrangian Fluid Flows

    Science.gov (United States)

    Gay-Balmaz, François; Holm, Darryl D.

    2018-06-01

    Inspired by spatiotemporal observations from satellites of the trajectories of objects drifting near the surface of the ocean in the National Oceanic and Atmospheric Administration's "Global Drifter Program", this paper develops data-driven stochastic models of geophysical fluid dynamics (GFD) with non-stationary spatial correlations representing the dynamical behaviour of oceanic currents. Three models are considered. Model 1 from Holm (Proc R Soc A 471:20140963, 2015) is reviewed, in which the spatial correlations are time independent. Two new models, called Model 2 and Model 3, introduce two different symmetry breaking mechanisms by which the spatial correlations may be advected by the flow. These models are derived using reduction by symmetry of stochastic variational principles, leading to stochastic Hamiltonian systems, whose momentum maps, conservation laws and Lie-Poisson bracket structures are used in developing the new stochastic Hamiltonian models of GFD.

  6. Markov Chain Models for the Stochastic Modeling of Pitting Corrosion

    Directory of Open Access Journals (Sweden)

    A. Valor

    2013-01-01

    The stochastic nature of pitting corrosion of metallic structures has been widely recognized. It is assumed that this kind of deterioration retains no memory of the past, so only the current state of the damage influences its future development. This characteristic allows pitting corrosion to be categorized as a Markov process. In this paper, two different models of pitting corrosion, developed using Markov chains, are presented. Firstly, a continuous-time, nonhomogeneous linear growth (pure birth) Markov process is used to model external pitting corrosion in underground pipelines. A closed-form solution of the system of Kolmogorov's forward equations is used to describe the transition probability function in a discrete pit depth space. The transition probability function is identified by correlating the stochastic pit depth mean with the empirical deterministic mean. In the second model, the distribution of maximum pit depths in a pitting experiment is successfully modeled as the combination of two stochastic processes: pit initiation and pit growth. Pit generation is modeled as a nonhomogeneous Poisson process, in which induction time is simulated as the realization of a Weibull process. Pit growth is simulated using a nonhomogeneous Markov process. An analytical solution of Kolmogorov's system of equations is also found for the transition probabilities from the first Markov state. Extreme value statistics is employed to find the distribution of maximum pit depths.
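
    The sketch below mimics the second model's structure in a heavily simplified form: pit initiation times follow a Weibull (power-law-intensity) Poisson process, each pit then deepens as a power law of its age, and the maximum depth over all pits is recorded so that its distribution can be examined with extreme value statistics. All parameter values are illustrative, not fitted to the experiment in the paper.

    import numpy as np

    rng = np.random.default_rng(5)
    t_obs, beta_w, eta = 10.0, 1.5, 1.0        # exposure [yr]; Weibull-process shape and scale
    alpha, growth = 0.4, 0.8                   # power-law growth exponent and rate [mm / yr^alpha]

    def max_pit_depth():
        n_pits = rng.poisson((t_obs / eta) ** beta_w)            # pits initiated by t_obs
        if n_pits == 0:
            return 0.0
        t_init = t_obs * rng.random(n_pits) ** (1.0 / beta_w)    # initiation time of each pit
        return (growth * (t_obs - t_init) ** alpha).max()        # deepest pit at t_obs

    maxima = np.array([max_pit_depth() for _ in range(2000)])
    print("mean / 99th-percentile maximum pit depth [mm]:",
          round(maxima.mean(), 2), round(np.percentile(maxima, 99), 2))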

  7. A stochastic model for the financial market with discontinuous prices

    Directory of Open Access Journals (Sweden)

    Leda D. Minkova

    1996-01-01

    This paper models some situations occurring in the financial market. The asset prices evolve according to a stochastic integral equation driven by a Gaussian martingale. A portfolio process is constrained in such a way that the wealth process covers some obligation. A solution to a linear stochastic integral equation is obtained in a class of cadlag stochastic processes.

  8. Stochastic higher spin six vertex model and Macdonald measures

    Science.gov (United States)

    Borodin, Alexei

    2018-02-01

    We prove an identity that relates the q-Laplace transform of the height function of a (higher spin inhomogeneous) stochastic six vertex model in a quadrant on one side and a multiplicative functional of a Macdonald measure on the other. The identity is used to prove the GUE Tracy-Widom asymptotics for two instances of the stochastic six vertex model via asymptotic analysis of the corresponding Schur measures.

  9. Improved ensemble-mean forecast skills of ENSO events by a zero-mean stochastic model-error model of an intermediate coupled model

    Science.gov (United States)

    Zheng, F.; Zhu, J.

    2015-12-01

    To perform an ensemble-based ENSO probabilistic forecast, the crucial issue is to design a reliable ensemble prediction strategy that should include the major uncertainties of a forecast system. In this study, we developed a new general ensemble perturbation technique to improve the ensemble-mean predictive skill of forecasting ENSO using an intermediate coupled model (ICM). The model uncertainties are first estimated and analyzed from EnKF analysis results through assimilating observed SST. Then, based on the pre-analyzed properties of the model errors, a zero-mean stochastic model-error model is developed to mainly represent the model uncertainties induced by some important physical processes missed in the coupled model (i.e., stochastic atmospheric forcing/MJO, extra-tropical cooling and warming, Indian Ocean Dipole mode, etc.). Each member of an ensemble forecast is perturbed by the stochastic model-error model at each step during the 12-month forecast process, and the stochastical perturbations are added into the modeled physical fields to mimic the presence of these high-frequency stochastic noises and model biases and their effect on the predictability of the coupled system. The impacts of stochastic model-error perturbations on ENSO deterministic predictions are examined by performing two sets of 21-yr retrospective forecast experiments. The two forecast schemes are differentiated by whether they considered the model stochastic perturbations, with both initialized by the ensemble-mean analysis states from EnKF. The comparison results suggest that the stochastic model-error perturbations have significant and positive impacts on improving the ensemble-mean prediction skills during the entire 12-month forecast process. Because the nonlinear feature of the coupled model can induce the nonlinear growth of the added stochastic model errors with model integration, especially through the nonlinear heating mechanism with the vertical advection term of the model, the

  10. Stochastic hyperfine interactions modeling library

    Science.gov (United States)

    Zacate, Matthew O.; Evenson, William E.

    2011-04-01

    The stochastic hyperfine interactions modeling library (SHIML) provides a set of routines to assist in the development and application of stochastic models of hyperfine interactions. The library provides routines written in the C programming language that (1) read a text description of a model for fluctuating hyperfine fields, (2) set up the Blume matrix, upon which the evolution operator of the system depends, and (3) find the eigenvalues and eigenvectors of the Blume matrix so that theoretical spectra of experimental techniques that measure hyperfine interactions can be calculated. The optimized vector and matrix operations of the BLAS and LAPACK libraries are utilized; however, there was a need to develop supplementary code to find an orthonormal set of (left and right) eigenvectors of complex, non-Hermitian matrices. In addition, example code is provided to illustrate the use of SHIML to generate perturbed angular correlation spectra for the special case of polycrystalline samples when anisotropy terms of higher order than A can be neglected. Program summary: Program title: SHIML; Catalogue identifier: AEIF_v1_0; Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEIF_v1_0.html; Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland; Licensing provisions: GNU GPL 3; No. of lines in distributed program, including test data, etc.: 8224; No. of bytes in distributed program, including test data, etc.: 312 348; Distribution format: tar.gz; Programming language: C; Computer: Any; Operating system: LINUX, OS X; RAM: Varies; Classification: 7.4; External routines: TAPP [1], BLAS [2], a C-interface to BLAS [3], and LAPACK [4]; Nature of problem: In condensed matter systems, hyperfine methods such as nuclear magnetic resonance (NMR), Mössbauer effect (ME), muon spin rotation (μSR), and perturbed angular correlation spectroscopy (PAC) measure electronic and magnetic structure within Angstroms of nuclear probes through the hyperfine interaction. When

  11. Scalable inference for stochastic block models

    KAUST Repository

    Peng, Chengbin; Zhang, Zhihua; Wong, Ka-Chun; Zhang, Xiangliang; Keyes, David E.

    2017-01-01

    Community detection in graphs is widely used in social and biological networks, and the stochastic block model is a powerful probabilistic tool for describing graphs with community structures. However, in the era of "big data," traditional inference

  12. Calibration with respect to hydraulic head measurements in stochastic simulation of groundwater flow - a numerical experiment using MATLAB

    International Nuclear Information System (INIS)

    Eriksson, L.O.; Oppelstrup, J.

    1994-12-01

    A simulator for 2D stochastic continuum simulation and inverse modelling of groundwater flow has been developed. The simulator is well suited for method evaluation and what-if simulation and written in MATLAB. Conductivity fields are generated by unconditional simulation, conditional simulation on measured conductivities and calibration on both steady-state head measurements and transient head histories. The fields can also include fracture zones and zones with different mean conductivities. Statistics of conductivity fields and particle travel times are recorded in Monte-Carlo simulations. The calibration uses the pilot point technique, an inverse technique proposed by RamaRao and LaVenue. Several Kriging procedures are implemented, among others Kriging neighborhoods. In cases where the expectation of the log-conductivity in the truth field is known the nonbias conditions can be omitted, which will make the variance in the conditionally simulated conductivity fields smaller. A simulation experiment, resembling the initial stages of a site investigation and devised in collaboration with SKB, is performed and interpreted. The results obtained in the present study show less uncertainty than in our preceding study. This is mainly due to the modification of the Kriging procedure but also to the use of more data. Still the large uncertainty in cases of sparse data is apparent. The variogram represents essential characteristics of the conductivity field. Thus, even unconditional simulations take account of important information. Significant improvements in variance by further conditioning will be obtained only as the number of data becomes much larger. 16 refs, 26 figs

  13. Calibration with respect to hydraulic head measurements in stochastic simulation of groundwater flow - a numerical experiment using MATLAB

    Energy Technology Data Exchange (ETDEWEB)

    Eriksson, L O; Oppelstrup, J [Starprog AB (Sweden)

    1994-12-01

    A simulator for 2D stochastic continuum simulation and inverse modelling of groundwater flow has been developed. The simulator is well suited for method evaluation and what-if simulation and written in MATLAB. Conductivity fields are generated by unconditional simulation, conditional simulation on measured conductivities and calibration on both steady-state head measurements and transient head histories. The fields can also include fracture zones and zones with different mean conductivities. Statistics of conductivity fields and particle travel times are recorded in Monte-Carlo simulations. The calibration uses the pilot point technique, an inverse technique proposed by RamaRao and LaVenue. Several Kriging procedures are implemented, among others Kriging neighborhoods. In cases where the expectation of the log-conductivity in the truth field is known the nonbias conditions can be omitted, which will make the variance in the conditionally simulated conductivity fields smaller. A simulation experiment, resembling the initial stages of a site investigation and devised in collaboration with SKB, is performed and interpreted. The results obtained in the present study show less uncertainty than in our preceding study. This is mainly due to the modification of the Kriging procedure but also to the use of more data. Still the large uncertainty in cases of sparse data is apparent. The variogram represents essential characteristics of the conductivity field. Thus, even unconditional simulations take account of important information. Significant improvements in variance by further conditioning will be obtained only as the number of data becomes much larger. 16 refs, 26 figs.
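
    To illustrate the remark about omitting the nonbias conditions when the expected log-conductivity is known, the sketch below performs simple kriging (known mean, no unbiasedness constraint) on a 1D transect with an exponential covariance; the data locations, values and covariance parameters are invented, and the pilot-point calibration itself is not shown.

    import numpy as np

    mean_logk = -5.0                                 # known expectation of log10 K
    sill, corr_len = 1.0, 50.0                       # variance and correlation length [m]

    def cov(h):
        return sill * np.exp(-np.abs(h) / corr_len)  # exponential covariance model

    x_data = np.array([10.0, 40.0, 90.0, 150.0])     # borehole positions [m] (invented)
    z_data = np.array([-4.2, -5.6, -5.1, -4.8])      # measured log10 K (invented)
    C = cov(x_data[:, None] - x_data[None, :])       # data-data covariance matrix

    for x0 in np.linspace(0.0, 200.0, 5):
        c0 = cov(x_data - x0)                        # data-target covariances
        lam = np.linalg.solve(C, c0)                 # simple kriging weights (no unbiasedness row)
        est = mean_logk + lam @ (z_data - mean_logk)
        std = np.sqrt(sill - lam @ c0)               # simple kriging standard deviation
        print(f"x = {x0:6.1f} m   log10 K ~ {est:6.2f}   kriging std ~ {std:.2f}")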

  14. Stochastic Parametrisations and Regime Behaviour of Atmospheric Models

    Science.gov (United States)

    Arnold, Hannah; Moroz, Irene; Palmer, Tim

    2013-04-01

    The presence of regimes is a characteristic of non-linear, chaotic systems (Lorenz, 2006). In the atmosphere, regimes emerge as familiar circulation patterns such as the El-Nino Southern Oscillation (ENSO), the North Atlantic Oscillation (NAO) and Scandinavian Blocking events. In recent years there has been much interest in the problem of identifying and studying atmospheric regimes (Solomon et al, 2007). In particular, how do these regimes respond to an external forcing such as anthropogenic greenhouse gas emissions? The importance of regimes in observed trends over the past 50-100 years indicates that in order to predict anthropogenic climate change, our climate models must be able to represent accurately natural circulation regimes, their statistics and variability. It is well established that representing model uncertainty as well as initial condition uncertainty is important for reliable weather forecasts (Palmer, 2001). In particular, stochastic parametrisation schemes have been shown to improve the skill of weather forecast models (e.g. Berner et al., 2009; Frenkel et al., 2012; Palmer et al., 2009). It is possible that including stochastic physics as a representation of model uncertainty could also be beneficial in climate modelling, enabling the simulator to explore larger regions of the climate attractor including other flow regimes. An alternative representation of model uncertainty is a perturbed parameter scheme, whereby physical parameters in subgrid parametrisation schemes are perturbed about their optimal value. Perturbing parameters gives a greater control over the ensemble than multi-model or multiparametrisation ensembles, and has been used as a representation of model uncertainty in climate prediction (Stainforth et al., 2005; Rougier et al., 2009). We investigate the effect of including representations of model uncertainty on the regime behaviour of a simulator. A simple chaotic model of the atmosphere, the Lorenz '96 system, is used to study
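
    A minimal version of the kind of experiment described above can be set up with the Lorenz '96 system and an additive stochastic parametrisation term integrated by Euler-Maruyama, as sketched below; the forcing, noise amplitude and time step are assumptions and no perturbed-parameter comparison is included.

    import numpy as np

    rng = np.random.default_rng(7)
    K, F = 40, 8.0                            # number of sites, deterministic forcing
    sigma, dt, n_steps = 0.5, 0.001, 50_000   # noise amplitude, time step, number of steps

    def l96_tendency(x):
        # dx_k/dt = (x_{k+1} - x_{k-2}) * x_{k-1} - x_k + F, with cyclic indices
        return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

    x = F + 0.01 * rng.standard_normal(K)     # slightly perturbed rest state
    for _ in range(n_steps):
        # Euler-Maruyama step with an additive stochastic parametrisation term
        x = x + dt * l96_tendency(x) + sigma * np.sqrt(dt) * rng.standard_normal(K)
    print("final-state mean and variance over sites:", round(x.mean(), 2), round(x.var(), 2))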

  15. Model Calibration in Watershed Hydrology

    Science.gov (United States)

    Yilmaz, Koray K.; Vrugt, Jasper A.; Gupta, Hoshin V.; Sorooshian, Soroosh

    2009-01-01

    Hydrologic models use relatively simple mathematical equations to conceptualize and aggregate the complex, spatially distributed, and highly interrelated water, energy, and vegetation processes in a watershed. A consequence of process aggregation is that the model parameters often do not represent directly measurable entities and must, therefore, be estimated using measurements of the system inputs and outputs. During this process, known as model calibration, the parameters are adjusted so that the behavior of the model approximates, as closely and consistently as possible, the observed response of the hydrologic system over some historical period of time. This Chapter reviews the current state-of-the-art of model calibration in watershed hydrology with special emphasis on our own contributions in the last few decades. We discuss the historical background that has led to current perspectives, and review different approaches for manual and automatic single- and multi-objective parameter estimation. In particular, we highlight the recent developments in the calibration of distributed hydrologic models using parameter dimensionality reduction sampling, parameter regularization and parallel computing.

  16. Model tracking dual stochastic controller design under irregular internal noises

    International Nuclear Information System (INIS)

    Lee, Jong Bok; Heo, Hoon; Cho, Yun Hyun; Ji, Tae Young

    2006-01-01

    Although many methods for the control of irregular external noise have been introduced and implemented, more effective and efficient methods for excluding various noises are still needed. Accumulation of errors due to model tracking, internal noises (thermal noise, shot noise and 1/f noise) originating from circuit elements such as resistors, diodes and transistors, and numerical errors due to digital processing often destabilize the system and reduce its performance. A new stochastic controller is adopted to remove these noises while operating together with a conventional controller. A design method for a model tracking dual controller is proposed to improve the stability of the system while removing external and internal noises. The study introduces the design process of the model tracking dual stochastic controller, which improves system performance and guarantees robustness under irregular internal noises that can arise internally. The model tracking dual stochastic controller, utilizing the F-P-K stochastic control technique developed earlier, is implemented and its performance is revealed via simulation.

  17. Simulation of nuclear plant operation into a stochastic energy production model

    International Nuclear Information System (INIS)

    Pacheco, R.L.

    1983-04-01

    A simulation model of nuclear plant operation is developed to fit into a stochastic energy production model. In order to improve the stochastic model used, and to reduce the computational time burdened by the aggregation of the nuclear plant operation model, a study of the tail truncation of the unsupplied demand distribution function has been performed. (E.G.) [pt

  18. A complementarity model for solving stochastic natural gas market equilibria

    International Nuclear Information System (INIS)

    Jifang Zhuang; Gabriel, S.A.

    2008-01-01

    This paper presents a stochastic equilibrium model for deregulated natural gas markets. Each market participant (pipeline operators, producers, etc.) solves a stochastic optimization problem whose optimality conditions, when combined with market-clearing conditions give rise to a certain mixed complementarity problem (MiCP). The stochastic aspects are depicted by a recourse problem for each player in which the first-stage decisions relate to long-term contracts and the second-stage decisions relate to spot market activities for three seasons. Besides showing that such a market model is an instance of a MiCP, we provide theoretical results concerning long-term and spot market prices and solve the resulting MiCP for a small yet representative market. We also note an interesting observation for the value of the stochastic solution for non-optimization problems. (author)

  19. A complementarity model for solving stochastic natural gas market equilibria

    International Nuclear Information System (INIS)

    Zhuang Jifang; Gabriel, Steven A.

    2008-01-01

    This paper presents a stochastic equilibrium model for deregulated natural gas markets. Each market participant (pipeline operators, producers, etc.) solves a stochastic optimization problem whose optimality conditions, when combined with market-clearing conditions give rise to a certain mixed complementarity problem (MiCP). The stochastic aspects are depicted by a recourse problem for each player in which the first-stage decisions relate to long-term contracts and the second-stage decisions relate to spot market activities for three seasons. Besides showing that such a market model is an instance of a MiCP, we provide theoretical results concerning long-term and spot market prices and solve the resulting MiCP for a small yet representative market. We also note an interesting observation for the value of the stochastic solution for non-optimization problems

  20. Deterministic and stochastic models for middle east respiratory syndrome (MERS)

    Science.gov (United States)

    Suryani, Dessy Rizki; Zevika, Mona; Nuraini, Nuning

    2018-03-01

    World Health Organization (WHO) data state that since September 2012 there have been 1,733 cases of Middle East Respiratory Syndrome (MERS), with 628 deaths, occurring in 27 countries. MERS was first identified in Saudi Arabia in 2012, and the largest MERS outbreak outside Saudi Arabia occurred in South Korea in 2015. MERS is a disease that attacks the respiratory system and is caused by infection with MERS-CoV. MERS-CoV transmission occurs either directly, through contact between infected and non-infected individuals, or indirectly, through objects contaminated by the free virus. It is suspected that MERS can spread quickly because of free virus in the environment. Mathematical modeling is used to describe the transmission of MERS using a deterministic model and a stochastic model. The deterministic model is used to investigate the temporal dynamics of the system and to analyze the steady-state conditions. The stochastic model, based on a Continuous Time Markov Chain (CTMC) approach, is used to predict future states using random variables. From the models that were built, the threshold values for the deterministic and stochastic models are obtained in the same form, and the probability of disease extinction can be computed from the stochastic model. Simulations of both models using several different parameters are shown, and the probability of disease extinction is compared for several initial conditions.
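
    As a simplified illustration of the CTMC approach, the sketch below simulates the embedded jump chain of a stochastic SIR-type outbreak with the Gillespie logic and estimates the probability of early extinction, which can be compared with the branching-process value 1/R0. The SIR structure and the rates are stand-ins, not the MERS-CoV model of the paper (which also includes environmental free virus).

    import numpy as np

    rng = np.random.default_rng(8)
    beta, gamma, N = 0.4, 0.25, 10_000     # transmission rate, recovery rate, population size
    R0 = beta / gamma

    def dies_out_early(i0=1, threshold=50):
        """Follow the embedded jump chain until extinction or a 'major outbreak' threshold."""
        s, i = N - i0, i0
        while 0 < i < threshold:
            infect, recover = beta * s * i / N, gamma * i
            if rng.random() < infect / (infect + recover):
                s, i = s - 1, i + 1        # infection event
            else:
                i -= 1                     # recovery event
        return i == 0

    p_ext = np.mean([dies_out_early() for _ in range(2000)])
    print(f"R0 = {R0:.2f}: simulated extinction probability = {p_ext:.3f}"
          f" (branching-process approximation 1/R0 = {1/R0:.3f})")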

  1. Considerations when ranking stochastically modeled oil sands resource models for mining applications

    Energy Technology Data Exchange (ETDEWEB)

    Etris, E.L. [Society of Petroleum Engineers, Canadian Section, Calgary, AB (Canada)]|[Petro-Canada, Calgary, AB (Canada); Idris, Y.; Hunter, A.C. [Petro-Canada, Calgary, AB (Canada)

    2008-10-15

    Alberta's Athabasca oil sands deposit has been targeted as a major resource for development. Bitumen recovery operations fall into 2 categories, namely mining and in situ operations. Mining recovery is done above ground level and consists of open pit digging, disaggregation of the bitumen-saturated sediment through crushing followed by pipeline transport in a water-based slurry and then separation of oil, water and sediment. In situ recovery consists of drilling wells and stimulating the oil sands in the subsurface with a thermal treatment to reduce the viscosity of the bitumen and allow it to come to the surface. Steam assisted gravity drainage (SAGD) is the most popular thermal treatment currently in use. Resource models that simulate the recovery process are needed for both mining and in situ recovery operations. Both types can benefit from the advantages of a stochastic modeling process for resource model building and uncertainty evaluation. Stochastic modeling provides a realistic geology and allows for multiple realizations, which mining operations can use to evaluate the variability of recoverable bitumen volumes and develop mine plans accordingly. This paper described the processes of stochastic modelling and of determining the appropriate single realization for mine planning as applied to the Fort Hills oil sands mine which is currently in the early planning stage. The modeling exercise was used to estimate the in-place resource and quantify the uncertainty in resource volumes. The stochastic models were checked against those generated from conventional methods to identify any differences and to make the appropriate adaptations. 13 refs., 3 tabs., 16 figs.

  2. A computer model of the biosphere, to estimate stochastic and non-stochastic effects of radionuclides on humans

    International Nuclear Information System (INIS)

    Laurens, J.M.

    1985-01-01

    A computer code was written to model food chains in order to estimate the internal and external doses, for stochastic and non-stochastic effects, on humans (adults and infants). Results are given for 67 radionuclides, for unit concentration in water (1 Bq/L) and in atmosphere (1 Bq/m³).

  3. A coupled stochastic rainfall-evapotranspiration model for hydrological impact analysis

    Science.gov (United States)

    Pham, Minh Tu; Vernieuwe, Hilde; De Baets, Bernard; Verhoest, Niko E. C.

    2018-02-01

    A hydrological impact analysis concerns the study of the consequences of certain scenarios on one or more variables or fluxes in the hydrological cycle. In such an exercise, discharge is often considered, as floods originating from extremely high discharges often cause damage. Investigating the impact of extreme discharges generally requires long time series of precipitation and evapotranspiration to be used to force a rainfall-runoff model. However, such kinds of data may not be available and one should resort to stochastically generated time series, even though the impact of using such data on the overall discharge, and especially on the extreme discharge events, is not well studied. In this paper, stochastically generated rainfall and corresponding evapotranspiration time series, generated by means of vine copulas, are used to force a simple conceptual hydrological model. The results obtained are comparable to the modelled discharge using observed forcing data. Yet, uncertainties in the modelled discharge increase with an increasing number of stochastically generated time series used. Notwithstanding this finding, it can be concluded that using a coupled stochastic rainfall-evapotranspiration model has great potential for hydrological impact analysis.
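
    The sketch below generates jointly dependent daily rainfall and reference evapotranspiration with a bivariate Gaussian copula, as a much-simplified stand-in for the vine copulas used in the study; the marginal distributions, wet-day probability and (negative) dependence parameter are all assumed values.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(9)
    n_days, rho, p_wet = 3650, -0.4, 0.45            # assumed negative rain-ET dependence

    z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n_days)
    u = stats.norm.cdf(z)                            # correlated uniforms (Gaussian copula)

    rain = np.zeros(n_days)
    wet = u[:, 0] < p_wet                            # wet-day occurrence from the first uniform
    rain[wet] = stats.expon(scale=8.0).ppf(u[wet, 0] / p_wet)      # wet-day depth [mm]
    et0 = stats.gamma(a=4.0, scale=0.8).ppf(u[:, 1])               # reference ET [mm/day]

    print("mean rain, mean ET0 [mm/day]:", round(rain.mean(), 2), round(et0.mean(), 2))
    print("wet-day rain-ET0 correlation:", round(np.corrcoef(rain[wet], et0[wet])[0, 1], 2))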

  4. Stochastic Modeling and Analysis of Power System with Renewable Generation

    DEFF Research Database (Denmark)

    Chen, Peiyuan

    Unlike traditional fossil-fuel based power generation, renewable generation such as wind power relies on uncontrollable prime sources such as wind speed. Wind speed varies stochastically, which to a large extent determines the stochastic behavior of power generation from wind farms...... that such a stochastic model can be used to simulate the effect of load management on the load duration curve. As CHP units are turned on and off by regulating power, CHP generation has discrete output and thus can be modeled by a transition matrix based discrete Markov chain. As the CHP generation has a strong diurnal...

  5. Extinction in neutrally stable stochastic Lotka-Volterra models

    Science.gov (United States)

    Dobrinevski, Alexander; Frey, Erwin

    2012-05-01

    Populations of competing biological species exhibit a fascinating interplay between the nonlinear dynamics of evolutionary selection forces and random fluctuations arising from the stochastic nature of the interactions. The processes leading to extinction of species, whose understanding is a key component in the study of evolution and biodiversity, are influenced by both of these factors. Here, we investigate a class of stochastic population dynamics models based on generalized Lotka-Volterra systems. In the case of neutral stability of the underlying deterministic model, the impact of intrinsic noise on the survival of species is dramatic: It destroys coexistence of interacting species on a time scale proportional to the population size. We introduce a new method based on stochastic averaging which allows one to understand this extinction process quantitatively by reduction to a lower-dimensional effective dynamics. This is performed analytically for two highly symmetrical models and can be generalized numerically to more complex situations. The extinction probability distributions and other quantities of interest we obtain show excellent agreement with simulations.
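
    The central point that demographic noise destroys neutral coexistence on a time scale proportional to the population size can be reproduced with an even simpler toy than the generalized Lotka-Volterra systems of the paper: a neutral two-species Moran-type chain, sketched below with arbitrary population sizes.

    import numpy as np

    rng = np.random.default_rng(10)

    def extinction_time(N):
        """Neutral two-species chain: +/-1 with equal probability until one species is gone."""
        n, steps = N // 2, 0
        while 0 < n < N:
            n += 1 if rng.random() < 0.5 else -1
            steps += 1
        return steps / N                   # rescale: one generation ~ N update events

    for N in (50, 100, 200):
        times = [extinction_time(N) for _ in range(200)]
        print(f"N = {N:3d}: mean extinction time ~ {np.mean(times):.1f} generations")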

  6. Model predictive control classical, robust and stochastic

    CERN Document Server

    Kouvaritakis, Basil

    2016-01-01

    For the first time, a textbook that brings together classical predictive control with treatment of up-to-date robust and stochastic techniques. Model Predictive Control describes the development of tractable algorithms for uncertain, stochastic, constrained systems. The starting point is classical predictive control and the appropriate formulation of performance objectives and constraints to provide guarantees of closed-loop stability and performance. Moving on to robust predictive control, the text explains how similar guarantees may be obtained for cases in which the model describing the system dynamics is subject to additive disturbances and parametric uncertainties. Open- and closed-loop optimization are considered and the state of the art in computationally tractable methods based on uncertainty tubes presented for systems with additive model uncertainty. Finally, the tube framework is also applied to model predictive control problems involving hard or probabilistic constraints for the cases of multiplic...

  7. A low-bias simulation scheme for the SABR stochastic volatility model

    NARCIS (Netherlands)

    B. Chen (Bin); C.W. Oosterlee (Cornelis); J.A.M. van der Weide

    2012-01-01

    The Stochastic Alpha Beta Rho Stochastic Volatility (SABR-SV) model is widely used in the financial industry for the pricing of fixed income instruments. In this paper we develop a low-bias simulation scheme for the SABR-SV model, which deals efficiently with (undesired)
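
    For orientation, the sketch below prices a call under the SABR-SV dynamics with a plain (log-)Euler Monte Carlo scheme; this is the naive baseline that low-bias schemes such as the one in the paper improve upon, and the parameter values are purely illustrative.

    import numpy as np

    rng = np.random.default_rng(11)
    F0, sigma0, alpha, beta, rho = 0.05, 0.2, 0.4, 0.7, -0.3   # illustrative SABR parameters
    T, n_steps, n_paths = 1.0, 200, 20_000
    dt = T / n_steps

    F = np.full(n_paths, F0)
    sig = np.full(n_paths, sigma0)
    for _ in range(n_steps):
        z1 = rng.standard_normal(n_paths)
        z2 = rho * z1 + np.sqrt(1.0 - rho**2) * rng.standard_normal(n_paths)
        F = np.maximum(F + sig * F**beta * np.sqrt(dt) * z1, 0.0)   # Euler step, absorbed at zero
        sig = sig * np.exp(alpha * np.sqrt(dt) * z2 - 0.5 * alpha**2 * dt)  # exact lognormal step

    K = 0.05
    print("Monte Carlo ATM call price:", round(np.mean(np.maximum(F - K, 0.0)), 5))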

  8. Stochastic mixed-mode oscillations in a three-species predator-prey model

    Science.gov (United States)

    Sadhu, Susmita; Kuehn, Christian

    2018-03-01

    The effect of demographic stochasticity, in the form of Gaussian white noise, in a predator-prey model with one fast and two slow variables is studied. We derive the stochastic differential equations (SDEs) from a discrete model. For suitable parameter values, the deterministic drift part of the model admits a folded node singularity and exhibits a singular Hopf bifurcation. We focus on the parameter regime near the Hopf bifurcation, where small amplitude oscillations exist as stable dynamics in the absence of noise. In this regime, the stochastic model admits noise-driven mixed-mode oscillations (MMOs), which capture the intermediate dynamics between two cycles of population outbreaks. We perform numerical simulations to calculate the distribution of the random number of small oscillations between successive spikes for varying noise intensities and distance to the Hopf bifurcation. We also study the effect of noise on a suitable Poincaré map. Finally, we prove that the stochastic model can be transformed into a normal form near the folded node, which can be linked to recent results on the interplay between deterministic and stochastic small amplitude oscillations. The normal form can also be used to study the parameter influence on the noise level near folded singularities.

  9. Stochastic model for simulating Souris River Basin precipitation, evapotranspiration, and natural streamflow

    Science.gov (United States)

    Kolars, Kelsey A.; Vecchia, Aldo V.; Ryberg, Karen R.

    2016-02-24

    A water-balance model was developed for simulating monthly natural (unregulated) mean streamflow based on precipitation, temperature, and potential evapotranspiration at select streamflow-gaging stations. The model was calibrated using streamflow data from the U.S. Geological Survey and Environment Canada, along with natural (unregulated) streamflow data from the U.S. Army Corps of Engineers. Correlation coefficients between simulated and natural (unregulated) flows generally were high (greater than 0.8), and the seasonal means and standard deviations of the simulated flows closely matched the means and standard deviations of the natural (unregulated) flows. After calibrating the model for a monthly time step, monthly streamflow for each subbasin was disaggregated into three values per month, or an approximately 10-day time step, and a separate routing model was developed for simulating 10-day streamflow for downstream gages. The stochastic climate simulation model for precipitation, temperature, and potential evapotranspiration was combined with the water-balance model to simulate potential future sequences of 10-day mean streamflow for each of the streamflow-gaging station locations. Flood risk, as determined by equilibrium flow-frequency distributions for the dry (1912–69) and wet (1970–2011) climate states, was considerably higher for the wet state compared to the dry state. Future flood risk will remain high until the wet climate state ends, and for several years after that, because there may be a long lag-time between the return of drier conditions and the onset of a lower soil-moisture storage equilibrium.
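
    A minimal monthly bucket model conveys the flavour of the precipitation/PET-driven water-balance component described above; the storage capacity, runoff coefficient and synthetic forcing below are invented and bear no relation to the calibrated Souris River Basin values.

    import numpy as np

    rng = np.random.default_rng(13)
    months = 120
    precip = rng.gamma(shape=2.0, scale=25.0, size=months)              # synthetic precipitation [mm]
    pet = 40.0 + 30.0 * np.sin(2 * np.pi * np.arange(months) / 12.0)    # seasonal PET [mm]

    capacity, k_runoff = 150.0, 0.3        # soil store capacity [mm], runoff coefficient
    storage = 50.0
    runoff = np.zeros(months)
    for m in range(months):
        storage += precip[m]
        aet = min(pet[m], storage)                     # actual ET limited by stored water
        storage -= aet
        excess = max(storage - capacity, 0.0)          # saturation excess spills immediately
        runoff[m] = excess + k_runoff * min(storage, capacity)
        storage = min(storage, capacity) * (1.0 - k_runoff)
    print("mean monthly runoff [mm]:", round(runoff.mean(), 1))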

  10. Stochastic resonance in models of neuronal ensembles

    International Nuclear Information System (INIS)

    Chialvo, D.R.; Longtin, A.; Mueller-Gerkin, J.

    1997-01-01

    Two recently suggested mechanisms for the neuronal encoding of sensory information involving the effect of stochastic resonance with aperiodic time-varying inputs are considered. It is shown, using theoretical arguments and numerical simulations, that the nonmonotonic behavior with increasing noise of the correlation measures used for the so-called aperiodic stochastic resonance (ASR) scenario does not rely on the cooperative effect typical of stochastic resonance in bistable and excitable systems. Rather, ASR with slowly varying signals is more properly interpreted as linearization by noise. Consequently, the broadening of the "resonance curve" in the multineuron stochastic resonance without tuning scenario can also be explained by this linearization. Computation of the input-output correlation as a function of both signal frequency and noise for the model system further reveals conditions where noise-induced firing with aperiodic inputs will benefit from stochastic resonance rather than linearization by noise. Thus, our study clarifies the tuning requirements for the optimal transduction of subthreshold aperiodic signals. It also shows that a single deterministic neuron can perform as well as a network when biased into a suprathreshold regime. Finally, we show that the inclusion of a refractory period in the spike-detection scheme produces a better correlation between instantaneous firing rate and input signal. copyright 1997 The American Physical Society

  11. A Rigorous Temperature-Dependent Stochastic Modelling and Testing for MEMS-Based Inertial Sensor Errors

    Directory of Open Access Journals (Sweden)

    Spiros Pagiatakis

    2009-10-01

    In this paper, we examine the effect of changing the temperature points on MEMS-based inertial sensor random error. We collect static data under different temperature points using a MEMS-based inertial sensor mounted inside a thermal chamber. Rigorous stochastic models, namely Autoregressive-based Gauss-Markov (AR-based GM) models, are developed to describe the random error behaviour. The proposed AR-based GM model is initially applied to short stationary inertial data to develop the stochastic model parameters (correlation times). It is shown that the stochastic model parameters of a MEMS-based inertial unit, namely the ADIS16364, are temperature dependent. In addition, field kinematic test data collected at about 17 °C are used to test the performance of the stochastic models at different temperature points in the filtering stage using the Unscented Kalman Filter (UKF). It is shown that the stochastic model developed at 20 °C provides a more accurate inertial navigation solution than the ones obtained from the stochastic models developed at −40 °C, −20 °C, 0 °C, +40 °C, and +60 °C. The temperature dependence of the stochastic model is significant and should be considered at all times to obtain optimal navigation solution for MEMS-based INS/GPS integration.

  12. A Rigorous Temperature-Dependent Stochastic Modelling and Testing for MEMS-Based Inertial Sensor Errors.

    Science.gov (United States)

    El-Diasty, Mohammed; Pagiatakis, Spiros

    2009-01-01

    In this paper, we examine the effect of changing the temperature points on MEMS-based inertial sensor random error. We collect static data under different temperature points using a MEMS-based inertial sensor mounted inside a thermal chamber. Rigorous stochastic models, namely Autoregressive-based Gauss-Markov (AR-based GM) models are developed to describe the random error behaviour. The proposed AR-based GM model is initially applied to short stationary inertial data to develop the stochastic model parameters (correlation times). It is shown that the stochastic model parameters of a MEMS-based inertial unit, namely the ADIS16364, are temperature dependent. In addition, field kinematic test data collected at about 17 °C are used to test the performance of the stochastic models at different temperature points in the filtering stage using Unscented Kalman Filter (UKF). It is shown that the stochastic model developed at 20 °C provides a more accurate inertial navigation solution than the ones obtained from the stochastic models developed at -40 °C, -20 °C, 0 °C, +40 °C, and +60 °C. The temperature dependence of the stochastic model is significant and should be considered at all times to obtain optimal navigation solution for MEMS-based INS/GPS integration.
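
    The first-order Gauss-Markov (AR(1)) error processes referred to above can be simulated directly once a correlation time and steady-state standard deviation are chosen; in the sketch below the correlation times assigned to each temperature are hypothetical placeholders, not the values identified for the ADIS16364.

    import numpy as np

    rng = np.random.default_rng(12)
    dt, n = 0.01, 60_000                     # 100 Hz samples, 10 minutes of data
    sigma_gm = 0.02                          # steady-state noise level (e.g. deg/s)
    corr_time = {-40: 30.0, 0: 60.0, 20: 90.0, 60: 45.0}   # hypothetical tau(T) in seconds

    for temp, tau in corr_time.items():
        phi = np.exp(-dt / tau)                          # AR(1) coefficient for this temperature
        drive_std = sigma_gm * np.sqrt(1.0 - phi**2)     # keeps the process variance at sigma_gm^2
        x = np.zeros(n)
        for k in range(1, n):
            x[k] = phi * x[k - 1] + drive_std * rng.standard_normal()
        print(f"T = {temp:+3d} C, tau = {tau:5.1f} s, sample std = {x.std():.4f}")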

  13. Stochastic volatility and multi-dimensional modeling in the European energy market

    Energy Technology Data Exchange (ETDEWEB)

    Vos, Linda

    2012-07-01

    In energy prices there is evidence for stochastic volatility. Stochastic volatility affects the price of path-dependent options and therefore has to be modeled properly. We introduce a multi-dimensional non-Gaussian stochastic volatility model with leverage which can be used in energy pricing. It captures special features of energy prices such as price spikes, mean reversion, stochastic volatility and inverse leverage. Moreover, it allows modeling dependencies between different commodities. The forward price dynamics derived from this multivariate spot price model provide a very flexible structure, including contango, backwardation and hump-shaped forward curves. Alternatively, energy prices can be modeled by a 2-factor model consisting of a non-Gaussian stable CARMA process and a non-stationary trend modeled by a Levy process. This model is also able to capture special features such as price spikes, mean reversion and the low-frequency dynamics in the market. A robust L1-filter is introduced to filter out the states of the CARMA process. When applied to German EEX electricity exchange data, an overall negative risk premium is found; however, close to delivery a positive risk premium is observed. (Author)

  14. Stochastic Load Models and Footbridge Response

    DEFF Research Database (Denmark)

    Pedersen, Lars; Frier, Christian

    2015-01-01

    Pedestrians may cause vibrations in footbridges and these vibrations may potentially be annoying. This calls for predictions of footbridge vibration levels and the paper considers a stochastic approach to modeling the action of pedestrians assuming walking parameters such as step frequency, pedes...

  15. On cross-currency models with stochastic volatility and correlated interest rates

    NARCIS (Netherlands)

    Grzelak, L.A.; Oosterlee, C.W.

    2010-01-01

    We construct multi-currency models with stochastic volatility and correlated stochastic interest rates with a full matrix of correlations. We first deal with a foreign exchange (FX) model of Heston-type, in which the domestic and foreign interest rates are generated by the short-rate process of

  16. Reliability-Based Calibration of Load Duration Factors for Timber Structures

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Svensson, Staffan; Stang, Birgitte Friis Dela

    2005-01-01

    The load-bearing capacity of timber structures decreases with time, depending on the type of load and timber. Based on representative limit states and stochastic models for timber structures, load duration factors are calibrated using probabilistic methods. Load duration effects are estimated on the basis of simulated realizations of wind, snow and imposed loads in accordance with the load models in the Danish structural codes. Three damage accumulation models are considered, namely Gerhards' model, Barrett and Foschi's model and Foschi and Yao's model. The parameters in these models are fitted by the Maximum Likelihood Method using data relevant for Danish structural timber, and the statistical uncertainty is quantified. The reliability...

  17. On the Realistic Stochastic Model of GPS Observables: Implementation and Performance

    Science.gov (United States)

    Zangeneh-Nejad, F.; Amiri-Simkooei, A. R.; Sharifi, M. A.; Asgari, J.

    2015-12-01

    High-precision GPS positioning requires a realistic stochastic model of the observables. A realistic GPS stochastic model should take into account different variances for different observation types, correlations among different observables, the satellite-elevation dependence of the observables' precision, and the temporal correlation of the observables. Least-squares variance component estimation (LS-VCE) is applied to GPS observables using the geometry-based observation model (GBOM). To model the satellite-elevation dependence of the GPS observables' precision, an exponential model depending on the elevation angles of the satellites is employed. Temporal correlation of the GPS observables is modelled using a first-order autoregressive noise model. An important step in high-precision GPS positioning is double-difference integer ambiguity resolution (IAR); the fraction of successfully fixed integer ambiguities is called the success rate. A realistic estimate of the GNSS observables' covariance matrix plays an important role in IAR. We consider the ambiguity resolution success rate for two cases, namely a nominal and a realistic stochastic model of the GPS observables, using two GPS data sets collected with a Trimble R8 receiver. The results confirm that applying a more realistic stochastic model can significantly improve the IAR success rate on individual frequencies, either on L1 or on L2; an improvement of about 20% in the empirical success rate was achieved. The results also indicate that introducing the realistic stochastic model leads to a larger standard deviation for the baseline components, by a factor of about 2.6, on the data sets considered.
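
    A minimal sketch of an exponential, elevation-dependent variance model of the kind referred to above is given below; the parametrization and coefficient values are assumptions for illustration, not the LS-VCE estimates reported by the authors, and correlations between observables are ignored.

```python
import numpy as np

def elevation_variance(elev_deg, sigma0, a, e0_deg):
    """Exponential elevation-dependent variance model (one common parametrization):
    sigma(E) = sigma0 * (1 + a * exp(-E / e0)), returned as a variance."""
    return (sigma0 * (1.0 + a * np.exp(-np.asarray(elev_deg, float) / e0_deg)))**2

def observation_weight_matrix(elev_deg, sigma0=0.003, a=4.0, e0_deg=10.0):
    """Diagonal weight matrix for one epoch of phase observations,
    down-weighting low-elevation satellites."""
    return np.diag(1.0 / elevation_variance(elev_deg, sigma0, a, e0_deg))

W = observation_weight_matrix([15.0, 32.0, 48.0, 75.0])
```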

  18. Inexact Multistage Stochastic Chance Constrained Programming Model for Water Resources Management under Uncertainties

    Directory of Open Access Journals (Sweden)

    Hong Zhang

    2017-01-01

    Full Text Available In order to formulate water allocation schemes under uncertainties in the water resources management systems, an inexact multistage stochastic chance constrained programming (IMSCCP model is proposed. The model integrates stochastic chance constrained programming, multistage stochastic programming, and inexact stochastic programming within a general optimization framework to handle the uncertainties occurring in both constraints and objective. These uncertainties are expressed as probability distributions, interval with multiply distributed stochastic boundaries, dynamic features of the long-term water allocation plans, and so on. Compared with the existing inexact multistage stochastic programming, the IMSCCP can be used to assess more system risks and handle more complicated uncertainties in water resources management systems. The IMSCCP model is applied to a hypothetical case study of water resources management. In order to construct an approximate solution for the model, a hybrid algorithm, which incorporates stochastic simulation, back propagation neural network, and genetic algorithm, is proposed. The results show that the optimal value represents the maximal net system benefit achieved with a given confidence level under chance constraints, and the solutions provide optimal water allocation schemes to multiple users over a multiperiod planning horizon.

  19. The Role of Stochastic Models in Interpreting the Origins of Biological Chirality

    Directory of Open Access Journals (Sweden)

    Gábor Lente

    2010-04-01

    Full Text Available This review summarizes recent stochastic modeling efforts in the theoretical research aimed at interpreting the origins of biological chirality. Stochastic kinetic models, especially those based on the continuous time discrete state approach, have great potential in modeling absolute asymmetric reactions, experimental examples of which have been reported in the past decade. An overview of the relevant mathematical background is given and several examples are presented to show how the significant numerical problems characteristic of the use of stochastic models can be overcome by non-trivial, but elementary algebra. In these stochastic models, a particulate view of matter is used rather than the concentration-based view of traditional chemical kinetics, which uses continuous functions to describe the properties of the system. This has the advantage of giving an adequate description of single-molecule events, which were probably important in the origin of biological chirality. The presented models can interpret and predict the random distribution of enantiomeric excess among repetitive experiments, which is the most striking feature of absolute asymmetric reactions. It is argued that the use of the stochastic kinetic approach should be much more widespread in the relevant literature.
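
    As an illustration of the continuous-time discrete-state approach, the sketch below runs a direct-method stochastic simulation of a minimal Frank-type scheme (spontaneous formation, autocatalysis, mutual antagonism); the reaction set and rate constants are assumptions chosen for illustration, not taken from the review. Repeated runs reproduce the random, typically bimodal distribution of enantiomeric excess mentioned above.

```python
import numpy as np

rng = np.random.default_rng(1)

def frank_ssa(A0=1000, k0=1e-6, k1=1e-4, k2=1e-3, t_end=1e6):
    """Direct-method SSA for a minimal Frank-type scheme:
        A -> R        (k0)   spontaneous formation of either enantiomer
        A -> S        (k0)
        A + R -> 2R   (k1)   autocatalysis
        A + S -> 2S   (k1)
        R + S -> P    (k2)   mutual antagonism
    Returns the enantiomeric excess (R - S)/(R + S) at the final time."""
    A, R, S = A0, 0, 0
    t = 0.0
    while t < t_end:
        rates = np.array([k0*A, k0*A, k1*A*R, k1*A*S, k2*R*S], dtype=float)
        total = rates.sum()
        if total == 0.0:
            break
        t += rng.exponential(1.0 / total)
        j = rng.choice(5, p=rates / total)
        if j == 0:   A, R = A - 1, R + 1
        elif j == 1: A, S = A - 1, S + 1
        elif j == 2: A, R = A - 1, R + 1
        elif j == 3: A, S = A - 1, S + 1
        else:        R, S = R - 1, S - 1
    return (R - S) / (R + S) if (R + S) > 0 else 0.0

# Repeating the experiment exposes the stochastic distribution of enantiomeric excess.
ee = [frank_ssa() for _ in range(50)]
```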

  20. Moment Closure for the Stochastic Logistic Model

    National Research Council Canada - National Science Library

    Singh, Abhyudai; Hespanha, Joao P

    2006-01-01

    ..., which we refer to as the moment closure function. In this paper, a systematic procedure for constructing moment closure functions of arbitrary order is presented for the stochastic logistic model...
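
    A minimal sketch of the idea is given below, assuming one common formulation of the stochastic logistic birth-death process (birth rate a1*n, death rate a2*n + a3*n**2) and closing the moment hierarchy with the simple normal closure <n^3> = 3<n^2><n> - 2<n>^3; the derivative-matching closures of arbitrary order constructed in the paper are more general than this.

```python
import numpy as np
from scipy.integrate import solve_ivp

def closed_moment_odes(t, m, a1, a2, a3):
    """Second-order moment equations for a birth-death logistic process with
    birth rate a1*n and death rate a2*n + a3*n**2, closed with the normal
    (Gaussian) closure  <n^3> = 3*<n^2>*<n> - 2*<n>**3.   m = [<n>, <n^2>]"""
    m1, m2 = m
    m3 = 3.0 * m2 * m1 - 2.0 * m1**3
    dm1 = a1 * m1 - a2 * m1 - a3 * m2
    dm2 = (a1 + a2) * m1 + (2.0*a1 - 2.0*a2 + a3) * m2 - 2.0 * a3 * m3
    return [dm1, dm2]

n0 = 10.0   # deterministic initial population, so the initial variance is zero
sol = solve_ivp(closed_moment_odes, (0.0, 50.0), [n0, n0**2], args=(0.3, 0.1, 0.002))
mean = sol.y[0]
variance = sol.y[1] - sol.y[0]**2
```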

  1. Stochastic persistence and stationary distribution in an SIS epidemic model with media coverage

    Science.gov (United States)

    Guo, Wenjuan; Cai, Yongli; Zhang, Qimin; Wang, Weiming

    2018-02-01

    This paper studies an SIS epidemic model with media coverage, extended from a general deterministic model to a stochastic differential equation with environmental fluctuation. Mathematically, we use the Markov semigroup theory to prove that the basic reproduction number R0s can be used to control the dynamics of the stochastic system. Epidemiologically, we show that environmental fluctuation can inhibit the occurrence of the disease: namely, in the case of disease persistence for the deterministic model, the disease still dies out with probability one for the stochastic model. So, to a great extent, the stochastic perturbation under media coverage affects the outbreak of the disease.
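
    For illustration, the sketch below integrates one commonly studied stochastic SIS formulation with media-dependent transmission, beta(I) = beta1 - beta2*I/(m+I), and multiplicative noise on the transmission term; this is an assumed illustrative form and not necessarily the exact system analysed in the paper.

```python
import numpy as np

def simulate_sis_media(S0, I0, Lam, mu, gamma, beta1, beta2, m, sigma, T, dt, rng=None):
    """Euler-Maruyama scheme for a stochastic SIS model with media coverage:
        beta(I) = beta1 - beta2 * I / (m + I)
        dS = [Lam - beta(I)*S*I - mu*S + gamma*I] dt - sigma*S*I dW
        dI = [beta(I)*S*I - (mu + gamma)*I] dt + sigma*S*I dW
    """
    rng = np.random.default_rng() if rng is None else rng
    n = int(T / dt)
    S, I = np.empty(n + 1), np.empty(n + 1)
    S[0], I[0] = S0, I0
    for k in range(n):
        beta = beta1 - beta2 * I[k] / (m + I[k])
        inc = beta * S[k] * I[k]                       # incidence term
        dW = rng.normal(0.0, np.sqrt(dt))
        noise = sigma * S[k] * I[k] * dW
        S[k+1] = max(S[k] + (Lam - inc - mu*S[k] + gamma*I[k]) * dt - noise, 0.0)
        I[k+1] = max(I[k] + (inc - (mu + gamma)*I[k]) * dt + noise, 0.0)
    return S, I

S, I = simulate_sis_media(S0=90, I0=10, Lam=2.0, mu=0.02, gamma=0.1,
                          beta1=0.002, beta2=0.001, m=20.0, sigma=0.0005, T=500.0, dt=0.01)
```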

  2. Persistence and extinction for a stochastic logistic model with infinite delay

    Directory of Open Access Journals (Sweden)

    Chun Lu

    2013-11-01

    Full Text Available This article studies a stochastic logistic model with infinite delay. Using a phase space, we establish sufficient conditions for extinction, nonpersistence in the mean, weak persistence, and stochastic permanence. A threshold between weak persistence and extinction is obtained. Our results show that different types of environmental noise have different effects on persistence and extinction, and that the delay has no impact on persistence and extinction for the stochastic model in the autonomous case. Numerical simulations illustrate the theoretical results.

  3. Stochastic Channel Modeling for Railway Tunnel Scenarios at 25 GHz

    Directory of Open Access Journals (Sweden)

    Danping He

    2018-02-01

    Full Text Available More people prefer using rail traffic for travel or for commuting owing to its convenience and flexibility. The railway scenario has become an important communication scenario in the fifth generation era. The communication system should be designed to support high‐data‐rate demands with seamless connectivity at a high mobility. In this paper, the channel characteristics are studied and modeled for the railway tunnel scenario with straight and curved route shapes. On the basis of measurements using the “Mobile Hotspot Network” system, a three‐dimensional ray tracer (RT) is calibrated and validated for the target scenarios. More channel characteristics are explored via RT simulations at 25.25 GHz with a 500‐MHz bandwidth. The key channel parameters are extracted, provided, and incorporated into a 3rd‐Generation‐Partnership‐Project‐like stochastic channel generator. The necessary channel information can be practically realized, which can support the link‐level and system‐level design of the communication system in similar scenarios.

  4. Stochastic modeling for reliability shocks, burn-in and heterogeneous populations

    CERN Document Server

    Finkelstein, Maxim

    2013-01-01

    Focusing on shocks modeling, burn-in and heterogeneous populations, Stochastic Modeling for Reliability naturally combines these three topics in the unified stochastic framework and presents numerous practical examples that illustrate recent theoretical findings of the authors.  The populations of manufactured items in industry are usually heterogeneous. However, the conventional reliability analysis is performed under the implicit assumption of homogeneity, which can result in distortion of the corresponding reliability indices and various misconceptions. Stochastic Modeling for Reliability fills this gap and presents the basics and further developments of reliability theory for heterogeneous populations. Specifically, the authors consider burn-in as a method of elimination of ‘weak’ items from heterogeneous populations. The real life objects are operating in a changing environment. One of the ways to model an impact of this environment is via the external shocks occurring in accordance with some stocha...

  5. Some recent developments in stochastic volatility modelling

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler; Nicolato, Elisa; Shephard, N.

    2002-01-01

    This paper reviews and puts in context some of our recent work on stochastic volatility (SV) modelling for financial economics. Here our main focus is on: (i) the relationship between subordination and SV, (ii) OU based volatility models, (iii) exact option pricing, (iv) realized power variation...

  6. Scalable inference for stochastic block models

    KAUST Repository

    Peng, Chengbin

    2017-12-08

    Community detection in graphs is widely used in social and biological networks, and the stochastic block model is a powerful probabilistic tool for describing graphs with community structures. However, in the era of "big data," traditional inference algorithms for such a model are increasingly limited due to their high time complexity and poor scalability. In this paper, we propose a multi-stage maximum likelihood approach to recover the latent parameters of the stochastic block model, in time linear with respect to the number of edges. We also propose a parallel algorithm based on message passing. Our algorithm can overlap communication and computation, providing speedup without compromising accuracy as the number of processors grows. For example, to process a real-world graph with about 1.3 million nodes and 10 million edges, our algorithm requires about 6 seconds on 64 cores of a contemporary commodity Linux cluster. Experiments demonstrate that the algorithm can produce high quality results on both benchmark and real-world graphs. An example of finding more meaningful communities is illustrated consequently in comparison with a popular modularity maximization algorithm.
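
    For context, the generative side of the stochastic block model and its Bernoulli log-likelihood are sketched below; the multi-stage maximum-likelihood and message-passing machinery of the paper is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_sbm(labels, B):
    """Sample an undirected adjacency matrix from a stochastic block model.
    labels : community index of each node
    B      : symmetric matrix of between-block edge probabilities"""
    p = B[np.ix_(labels, labels)]                  # edge probability for every node pair
    upper = np.triu(rng.random(p.shape) < p, k=1)  # sample the upper triangle only
    return (upper | upper.T).astype(int)

def sbm_log_likelihood(A, labels, B):
    """Bernoulli log-likelihood of adjacency A under the block assignment `labels`."""
    p = np.clip(B[np.ix_(labels, labels)], 1e-12, 1 - 1e-12)
    iu = np.triu_indices_from(A, k=1)
    a, q = A[iu], p[iu]
    return np.sum(a * np.log(q) + (1 - a) * np.log(1 - q))

labels = np.repeat([0, 1], 50)                     # two communities of 50 nodes each
B = np.array([[0.15, 0.02], [0.02, 0.15]])
A = sample_sbm(labels, B)
ll = sbm_log_likelihood(A, labels, B)
```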

  7. A stochastic analysis for a phytoplankton-zooplankton model

    International Nuclear Information System (INIS)

    Ge, G; Wang, H-L; Xu, J

    2008-01-01

    A simple phytoplankton-zooplankton nonlinear dynamical model was proposed to study the coexistence of all the species, and a Hopf bifurcation was observed. In order to study the effect of environmental robustness on this system, we have stochastically perturbed the system with white noise around its positive interior equilibrium. We have observed that the system remains stochastically stable around the positive equilibrium for the same parametric values as in the deterministic situation.

  8. ARMA modeling of stochastic processes in nuclear reactor with significant detection noise

    International Nuclear Information System (INIS)

    Zavaljevski, N.

    1992-01-01

    The theoretical basis of ARMA modelling of stochastic processes in a nuclear reactor was presented in a previous paper, neglecting observational noise. The identification of real reactor data indicated that in some experiments the detection noise is significant. Thus a more rigorous theoretical modelling of stochastic processes in the nuclear reactor is performed. Starting from the fundamental stochastic differential equations of the Langevin type for the interaction of the detector with the neutron field, a new theoretical ARMA model is developed. Preliminary identification results confirm the theoretical expectations. (author)
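
    A small, hedged illustration of ARMA identification in the presence of detection noise: an AR(p) signal observed with additive white noise is an ARMA(p, p) process, which can be fitted, for example, with statsmodels. The synthetic series and model orders below are illustrative only and are not the reactor model developed in the paper.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(42)

# Synthetic "reactor noise" signal: an AR(2) process plus white detection noise.
n = 5000
x = np.zeros(n)
for k in range(2, n):
    x[k] = 1.5 * x[k-1] - 0.7 * x[k-2] + rng.normal()
y = x + 0.5 * rng.normal(size=n)   # observational (detection) noise

# Since AR(2) + white noise is ARMA(2, 2), that structure is a natural candidate.
result = ARIMA(y, order=(2, 0, 2)).fit()
print(result.params)
```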

  9. INFERENCE AND SENSITIVITY IN STOCHASTIC WIND POWER FORECAST MODELS.

    KAUST Repository

    Elkantassi, Soumaya

    2017-10-03

    Reliable forecasting of wind power generation is crucial to optimal control of costs in generation of electricity with respect to the electricity demand. Here, we propose and analyze stochastic wind power forecast models described by parametrized stochastic differential equations, which introduce appropriate fluctuations in numerical forecast outputs. We use an approximate maximum likelihood method to infer the model parameters taking into account the time correlated sets of data. Furthermore, we study the validity and sensitivity of the parameters for each model. We applied our models to Uruguayan wind power production as determined by historical data and corresponding numerical forecasts for the period of March 1 to May 31, 2016.

  10. INFERENCE AND SENSITIVITY IN STOCHASTIC WIND POWER FORECAST MODELS.

    KAUST Repository

    Elkantassi, Soumaya; Kalligiannaki, Evangelia; Tempone, Raul

    2017-01-01

    Reliable forecasting of wind power generation is crucial to optimal control of costs in generation of electricity with respect to the electricity demand. Here, we propose and analyze stochastic wind power forecast models described by parametrized stochastic differential equations, which introduce appropriate fluctuations in numerical forecast outputs. We use an approximate maximum likelihood method to infer the model parameters taking into account the time correlated sets of data. Furthermore, we study the validity and sensitivity of the parameters for each model. We applied our models to Uruguayan wind power production as determined by historical data and corresponding numerical forecasts for the period of March 1 to May 31, 2016.

  11. Parameter discovery in stochastic biological models using simulated annealing and statistical model checking.

    Science.gov (United States)

    Hussain, Faraz; Jha, Sumit K; Jha, Susmit; Langmead, Christopher J

    2014-01-01

    Stochastic models are increasingly used to study the behaviour of biochemical systems. While the structure of such models is often readily available from first principles, unknown quantitative features of the model are incorporated into the model as parameters. Algorithmic discovery of parameter values from experimentally observed facts remains a challenge for the computational systems biology community. We present a new parameter discovery algorithm that uses simulated annealing, sequential hypothesis testing, and statistical model checking to learn the parameters in a stochastic model. We apply our technique to a model of glucose and insulin metabolism used for in-silico validation of artificial pancreata and demonstrate its effectiveness by developing parallel CUDA-based implementation for parameter synthesis in this model.
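
    A bare-bones sketch of the simulated-annealing part of such a parameter search is shown below, with a stand-in stochastic model and a fixed-sample Monte Carlo objective in place of the sequential hypothesis testing and statistical model checking used in the paper; the model, property and schedule are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

def property_violation(theta, n_runs=50):
    """Monte Carlo estimate of how often a (hypothetical) stochastic model with
    parameter theta violates the desired behavioural property.  The model here
    is a stand-in: a Gaussian output whose value should stay below a threshold."""
    samples = rng.normal(loc=theta, scale=1.0, size=n_runs)
    return np.mean(samples > 2.0)

def simulated_annealing(theta0, n_iter=200, step=0.5, T0=1.0):
    theta, cost = theta0, property_violation(theta0)
    best_theta, best_cost = theta, cost
    for i in range(n_iter):
        T = T0 * (1.0 - i / n_iter) + 1e-3          # simple linear cooling schedule
        cand = theta + rng.normal(0.0, step)        # random perturbation of the parameter
        cand_cost = property_violation(cand)
        if cand_cost < cost or rng.random() < np.exp(-(cand_cost - cost) / T):
            theta, cost = cand, cand_cost           # accept better (or occasionally worse) moves
        if cost < best_cost:
            best_theta, best_cost = theta, cost
    return best_theta, best_cost

theta_hat, violation_rate = simulated_annealing(theta0=3.0)
```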

  12. Stochastic modelling in design of mechanical properties of nanometals

    International Nuclear Information System (INIS)

    Tengen, T.B.; Wejrzanowski, T.; Iwankiewicz, R.; Kurzydlowski, K.J.

    2010-01-01

    Polycrystalline nanometals are fabricated through different processing routes and conditions. The consequence is that nanometals having the same mean grain size may have different grain size dispersions and, hence, may have different material properties. This has often led to conflicting reports from both theoretical and experimental findings about the evolution of the mechanical properties of nanomaterials. The present paper employs a stochastic model to study the impact of microstructure evolution during grain growth on the mechanical properties of polycrystalline nanometals. A stochastic model for grain growth and a stochastic model for changes in the mechanical properties of nanomaterials are proposed. The model developed for the mechanical properties is tested on aluminium samples. Many salient features of the mechanical properties of the aluminium samples are revealed. The results show that the different mechanisms of grain growth impart a different nature of response to the material mechanical properties. The conventional, homologous and anomalous temperature dependences of the yield stress are also revealed to be due to the different natures of interaction of the microstructures during evolution.

  13. Bayesian inference for hybrid discrete-continuous stochastic kinetic models

    International Nuclear Information System (INIS)

    Sherlock, Chris; Golightly, Andrew; Gillespie, Colin S

    2014-01-01

    We consider the problem of efficiently performing simulation and inference for stochastic kinetic models. Whilst it is possible to work directly with the resulting Markov jump process (MJP), computational cost can be prohibitive for networks of realistic size and complexity. In this paper, we consider an inference scheme based on a novel hybrid simulator that classifies reactions as either ‘fast’ or ‘slow’ with fast reactions evolving as a continuous Markov process whilst the remaining slow reaction occurrences are modelled through a MJP with time-dependent hazards. A linear noise approximation (LNA) of fast reaction dynamics is employed and slow reaction events are captured by exploiting the ability to solve the stochastic differential equation driving the LNA. This simulation procedure is used as a proposal mechanism inside a particle MCMC scheme, thus allowing Bayesian inference for the model parameters. We apply the scheme to a simple application and compare the output with an existing hybrid approach and also a scheme for performing inference for the underlying discrete stochastic model. (paper)

  14. Fitting Social Network Models Using Varying Truncation Stochastic Approximation MCMC Algorithm

    KAUST Repository

    Jin, Ick Hoon

    2013-10-01

    The exponential random graph model (ERGM) plays a major role in social network analysis. However, parameter estimation for the ERGM is a hard problem due to the intractability of its normalizing constant and the model degeneracy. The existing algorithms, such as Monte Carlo maximum likelihood estimation (MCMLE) and stochastic approximation, often fail for this problem in the presence of model degeneracy. In this article, we introduce the varying truncation stochastic approximation Markov chain Monte Carlo (SAMCMC) algorithm to tackle this problem. The varying truncation mechanism enables the algorithm to choose an appropriate starting point and an appropriate gain factor sequence, and thus to produce a reasonable parameter estimate for the ERGM even in the presence of model degeneracy. The numerical results indicate that the varying truncation SAMCMC algorithm can significantly outperform the MCMLE and stochastic approximation algorithms: for degenerate ERGMs, MCMLE and stochastic approximation often fail to produce any reasonable parameter estimates, while SAMCMC can do; for nondegenerate ERGMs, SAMCMC can work as well as or better than MCMLE and stochastic approximation. The data and source codes used for this article are available online as supplementary materials. © 2013 American Statistical Association, Institute of Mathematical Statistics, and Interface Foundation of North America.

  15. A Method to Test Model Calibration Techniques: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Judkoff, Ron; Polly, Ben; Neymark, Joel

    2016-09-01

    This paper describes a method for testing model calibration techniques. Calibration is commonly used in conjunction with energy retrofit audit models. An audit is conducted to gather information about the building needed to assemble an input file for a building energy modeling tool. A calibration technique is used to reconcile model predictions with utility data, and then the 'calibrated model' is used to predict energy savings from a variety of retrofit measures and combinations thereof. Current standards and guidelines such as BPI-2400 and ASHRAE-14 set criteria for 'goodness of fit' and assume that if the criteria are met, then the calibration technique is acceptable. While it is logical to use the actual performance data of the building to tune the model, it is not certain that a good fit will result in a model that better predicts post-retrofit energy savings. Therefore, the basic idea here is that the simulation program (intended for use with the calibration technique) is used to generate surrogate utility bill data and retrofit energy savings data against which the calibration technique can be tested. This provides three figures of merit for testing a calibration technique, 1) accuracy of the post-retrofit energy savings prediction, 2) closure on the 'true' input parameter values, and 3) goodness of fit to the utility bill data. The paper will also discuss the pros and cons of using this synthetic surrogate data approach versus trying to use real data sets of actual buildings.
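
    The goodness-of-fit criteria referenced above (BPI-2400, ASHRAE Guideline 14) are usually expressed as the normalized mean bias error and the coefficient of variation of the RMSE between measured and simulated utility data; a sketch of those two statistics is given below, with a degrees-of-freedom adjustment that is assumed here (check the guideline for the exact convention). The billing data are synthetic.

```python
import numpy as np

def nmbe(measured, simulated, p=1):
    """Normalized mean bias error (%), ASHRAE Guideline 14 style."""
    m, s = np.asarray(measured, float), np.asarray(simulated, float)
    return 100.0 * np.sum(m - s) / ((m.size - p) * m.mean())

def cv_rmse(measured, simulated, p=1):
    """Coefficient of variation of the RMSE (%), ASHRAE Guideline 14 style."""
    m, s = np.asarray(measured, float), np.asarray(simulated, float)
    return 100.0 * np.sqrt(np.sum((m - s) ** 2) / (m.size - p)) / m.mean()

# Twelve monthly utility bills (measured) vs. calibrated-model predictions (simulated).
measured  = [910, 840, 790, 700, 650, 720, 830, 860, 760, 700, 780, 880]
simulated = [930, 820, 800, 690, 640, 750, 810, 880, 750, 710, 770, 900]
print(nmbe(measured, simulated), cv_rmse(measured, simulated))
# Monthly criteria are commonly cited as |NMBE| <= 5% and CV(RMSE) <= 15%.
```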

  16. Double diffusivity model under stochastic forcing

    Science.gov (United States)

    Chattopadhyay, Amit K.; Aifantis, Elias C.

    2017-05-01

    The "double diffusivity" model was proposed in the late 1970s, and reworked in the early 1980s, as a continuum counterpart to existing discrete models of diffusion corresponding to high diffusivity paths, such as grain boundaries and dislocation lines. It was later rejuvenated in the 1990s to interpret experimental results on diffusion in polycrystalline and nanocrystalline specimens where grain boundaries and triple grain boundary junctions act as high diffusivity paths. Technically, the model pans out as a system of coupled Fick-type diffusion equations to represent "regular" and "high" diffusivity paths with "source terms" accounting for the mass exchange between the two paths. The model remit was extended by analogy to describe flow in porous media with double porosity, as well as to model heat conduction in media with two nonequilibrium local temperature baths, e.g., ion and electron baths. Uncoupling of the two partial differential equations leads to a higher-ordered diffusion equation, solutions of which could be obtained in terms of classical diffusion equation solutions. Similar equations could also be derived within an "internal length" gradient (ILG) mechanics formulation applied to diffusion problems, i.e., by introducing nonlocal effects, together with inertia and viscosity, in a mechanics based formulation of diffusion theory. While being remarkably successful in studies related to various aspects of transport in inhomogeneous media with deterministic microstructures and nanostructures, its implications in the presence of stochasticity have not yet been considered. This issue becomes particularly important in the case of diffusion in nanopolycrystals whose deterministic ILG-based theoretical calculations predict a relaxation time that is only about one-tenth of the actual experimentally verified time scale. This article provides the "missing link" in this estimation by adding a vital element in the ILG structure, that of stochasticity, that takes into
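
    For reference, the deterministic backbone described above (two coupled Fick-type equations with mass-exchange source terms) is commonly written in the form below; the notation is generic and is not taken from the article itself:

```latex
\begin{aligned}
\frac{\partial \rho_1}{\partial t} &= D_1 \nabla^2 \rho_1 - \kappa_1 \rho_1 + \kappa_2 \rho_2, \\
\frac{\partial \rho_2}{\partial t} &= D_2 \nabla^2 \rho_2 + \kappa_1 \rho_1 - \kappa_2 \rho_2,
\end{aligned}
```

    where rho_1 and rho_2 are the concentrations along the "regular" and "high" diffusivity paths, D_1 and D_2 the corresponding diffusivities, and kappa_1, kappa_2 the mass-exchange rates; uncoupling the pair yields the higher-order diffusion equation mentioned in the abstract.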

  17. Stochastic modeling concepts in groundwater and risk assessment: potential application to marine problems

    International Nuclear Information System (INIS)

    Hamed, Maged M.

    2000-01-01

    Parameter uncertainty is ubiquitous in marine environmental processes. Failure to account for this uncertainty may lead to erroneous results, and may have significant environmental and economic ramifications. Stochastic modeling of oil spill transport and fate is, therefore, central in the development of an oil spill contingency plan for new oil and gas projects. Over the past twenty years, several stochastic modeling tools have been developed for modeling parameter uncertainty, including the spectral, perturbation, and simulation methods. In this work we explore the application of a new stochastic methodology, the first-order reliability method (FORM), in oil spill modeling. FORM was originally developed in the structural reliability field and has been recently applied to various environmental problems. The method has many appealing features that makes it a powerful tool for modeling complex environmental systems. The theory of FORM is presented, identifying the features that distinguish the method from other stochastic tools. Different formulations to the reliability-based stochastic oil spill modeling are presented in a decision-analytic context. (Author)

  18. Stochastic modeling analysis and simulation

    CERN Document Server

    Nelson, Barry L

    1995-01-01

    A coherent introduction to the techniques for modeling dynamic stochastic systems, this volume also offers a guide to the mathematical, numerical, and simulation tools of systems analysis. Suitable for advanced undergraduates and graduate-level industrial engineers and management science majors, it proposes modeling systems in terms of their simulation, regardless of whether simulation is employed for analysis. Beginning with a view of the conditions that permit a mathematical-numerical analysis, the text explores Poisson and renewal processes, Markov chains in discrete and continuous time, se

  19. Digital hardware implementation of a stochastic two-dimensional neuron model.

    Science.gov (United States)

    Grassia, F; Kohno, T; Levi, T

    2016-11-01

    This study explores the feasibility of stochastic neuron simulation in digital systems (FPGA), which realizes an implementation of a two-dimensional neuron model. The stochasticity is added by a source of current noise in the silicon neuron using an Ornstein-Uhlenbeck process. This approach uses digital computation to emulate individual neuron behavior using fixed point arithmetic operation. The neuron model's computations are performed in arithmetic pipelines. It was designed in VHDL language and simulated prior to mapping in the FPGA. The experimental results confirmed the validity of the developed stochastic FPGA implementation, which makes the implementation of the silicon neuron more biologically plausible for future hybrid experiments. Copyright © 2017 Elsevier Ltd. All rights reserved.
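
    A sketch of the exact one-step update for an Ornstein-Uhlenbeck noise current of the kind used as the stochastic input is given below; in the paper this is realized in fixed-point arithmetic pipelines on the FPGA, whereas the sketch uses ordinary floating point and illustrative parameter values.

```python
import numpy as np

def ou_noise_current(n, dt, tau, mu, sigma, i0=0.0, rng=None):
    """Exact discretization of the Ornstein-Uhlenbeck process
        dI = -(I - mu)/tau dt + sigma*sqrt(2/tau) dW,
    whose stationary distribution is N(mu, sigma^2)."""
    rng = np.random.default_rng() if rng is None else rng
    a = np.exp(-dt / tau)
    s = sigma * np.sqrt(1.0 - a * a)   # std of the exact one-step innovation
    i = np.empty(n)
    i[0] = i0
    for k in range(1, n):
        i[k] = mu + (i[k - 1] - mu) * a + s * rng.normal()
    return i

noise = ou_noise_current(n=100_000, dt=1e-4, tau=5e-3, mu=0.0, sigma=0.1)
```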

  20. Stochastic modelling of conjugate heat transfer in near-wall turbulence

    International Nuclear Information System (INIS)

    Pozorski, Jacek; Minier, Jean-Pierre

    2006-01-01

    The paper addresses the conjugate heat transfer in turbulent flows with temperature assumed to be a passive scalar. The Lagrangian approach is applied and the heat transfer is modelled with the use of stochastic particles. The intensity of thermal fluctuations in near-wall turbulence is determined from the scalar probability density function (PDF) with externally provided dynamical statistics. A stochastic model for the temperature field in the wall material is proposed and boundary conditions for stochastic particles at the solid-fluid interface are formulated. The heated channel flow with finite-thickness walls is considered as a validation case. Computation results for the mean temperature profiles and the variance of thermal fluctuations are presented and compared with available DNS data

  1. Stochastic modelling of conjugate heat transfer in near-wall turbulence

    Energy Technology Data Exchange (ETDEWEB)

    Pozorski, Jacek [Institute of Fluid-Flow Machinery, Polish Academy of Sciences, Fiszera 14, 80952 Gdansk (Poland)]. E-mail: jp@imp.gda.pl; Minier, Jean-Pierre [Research and Development Division, Electricite de France, 6 quai Watier, 78400 Chatou (France)

    2006-10-15

    The paper addresses the conjugate heat transfer in turbulent flows with temperature assumed to be a passive scalar. The Lagrangian approach is applied and the heat transfer is modelled with the use of stochastic particles. The intensity of thermal fluctuations in near-wall turbulence is determined from the scalar probability density function (PDF) with externally provided dynamical statistics. A stochastic model for the temperature field in the wall material is proposed and boundary conditions for stochastic particles at the solid-fluid interface are formulated. The heated channel flow with finite-thickness walls is considered as a validation case. Computation results for the mean temperature profiles and the variance of thermal fluctuations are presented and compared with available DNS data.

  2. Analysis and reconstruction of stochastic coupled map lattice models

    International Nuclear Information System (INIS)

    Coca, Daniel; Billings, Stephen A.

    2003-01-01

    The Letter introduces a general stochastic coupled lattice map model together with an algorithm to estimate the nodal equations involved based only on a small set of observable variables and in the presence of stochastic perturbations. More general forms of the Frobenius-Perron and the transfer operators, which describe the evolution of densities under the action of the CML transformation, are derived
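
    A minimal stochastic coupled map lattice of the general type considered is sketched below, with logistic local dynamics, diffusive nearest-neighbour coupling and additive noise; the specific nodal equations and the estimation algorithm of the Letter are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(7)

def stochastic_cml(n_sites=64, n_steps=500, eps=0.3, r=3.9, noise=0.01):
    """x_i(t+1) = (1-eps)*f(x_i) + (eps/2)*(f(x_{i-1}) + f(x_{i+1})) + noise,
    with logistic local map f(x) = r*x*(1-x) and periodic boundaries."""
    f = lambda x: r * x * (1.0 - x)
    x = rng.random(n_sites)
    history = np.empty((n_steps, n_sites))
    for t in range(n_steps):
        fx = f(x)
        x = (1.0 - eps) * fx + 0.5 * eps * (np.roll(fx, 1) + np.roll(fx, -1))
        x = np.clip(x + noise * rng.normal(size=n_sites), 0.0, 1.0)  # keep states in [0, 1]
        history[t] = x
    return history

states = stochastic_cml()
```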

  3. Stochastic bifurcation in a model of love with colored noise

    Science.gov (United States)

    Yue, Xiaokui; Dai, Honghua; Yuan, Jianping

    2015-07-01

    In this paper, we wish to examine the stochastic bifurcation induced by multiplicative Gaussian colored noise in a dynamical model of love where the random factor is used to describe the complexity and unpredictability of psychological systems. First, the dynamics in deterministic love-triangle model are considered briefly including equilibrium points and their stability, chaotic behaviors and chaotic attractors. Then, the influences of Gaussian colored noise with different parameters are explored such as the phase plots, top Lyapunov exponents, stationary probability density function (PDF) and stochastic bifurcation. The stochastic P-bifurcation through a qualitative change of the stationary PDF will be observed and bifurcation diagram on parameter plane of correlation time and noise intensity is presented to find the bifurcation behaviors in detail. Finally, the top Lyapunov exponent is computed to determine the D-bifurcation when the noise intensity achieves to a critical value. By comparison, we find there is no connection between two kinds of stochastic bifurcation.

  4. Analysis of novel stochastic switched SILI epidemic models with continuous and impulsive control

    Science.gov (United States)

    Gao, Shujing; Zhong, Deming; Zhang, Yan

    2018-04-01

    In this paper, we establish two new stochastic switched epidemic models with continuous and impulsive control. Stochastic perturbations are considered for the natural death rate in each equation of the models. Firstly, a stochastic switched SILI model with continuous control schemes is investigated. By using the Lyapunov-Razumikhin method, sufficient conditions for extinction in mean are established. Our result shows that the disease could die out theoretically if the threshold value R is less than one, regardless of whether the disease-free solutions of the corresponding subsystems are stable or unstable. Then, a stochastic switched SILI model with continuous control schemes and pulse vaccination is studied. The threshold value R is derived and the global attractivity of the model is also obtained. Finally, numerical simulations are carried out to support our results.

  5. Stochastic Subspace Modelling of Turbulence

    DEFF Research Database (Denmark)

    Sichani, Mahdi Teimouri; Pedersen, B. J.; Nielsen, Søren R.K.

    2009-01-01

    Turbulence of the incoming wind field is of paramount importance to the dynamic response of civil engineering structures. Hence reliable stochastic models of the turbulence should be available from which time series can be generated for dynamic response and structural safety analysis. In the paper, from the positive definite cross-spectral density matrix a frequency response matrix is constructed which determines the turbulence vector as a linear filtration of Gaussian white noise. Finally, an accurate state space modelling method is proposed which allows selection of an appropriate model order, and estimation of a state space model for the vector turbulence process incorporating its phase spectrum in one stage, and its results are compared with a conventional ARMA modelling method.

  6. Reflected stochastic differential equation models for constrained animal movement

    Science.gov (United States)

    Hanks, Ephraim M.; Johnson, Devin S.; Hooten, Mevin B.

    2017-01-01

    Movement for many animal species is constrained in space by barriers such as rivers, shorelines, or impassable cliffs. We develop an approach for modeling animal movement constrained in space by considering a class of constrained stochastic processes, reflected stochastic differential equations. Our approach generalizes existing methods for modeling unconstrained animal movement. We present methods for simulation and inference based on augmenting the constrained movement path with a latent unconstrained path and illustrate this augmentation with a simulation example and an analysis of telemetry data from a Steller sea lion (Eumatopias jubatus) in southeast Alaska.
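
    The simplest numerical treatment of a reflected SDE is an Euler-Maruyama step followed by a reflection back into the allowed region; the sketch below does this for a 2-D process with drift toward a central site inside a rectangular domain, which is a simplification of the augmentation-based approach developed in the paper.

```python
import numpy as np

def reflect_into_box(p, lo, hi):
    """Mirror-reflect a point into the box [lo, hi] (component-wise)."""
    p = np.where(p < lo, 2 * lo - p, p)
    p = np.where(p > hi, 2 * hi - p, p)
    return np.clip(p, lo, hi)   # guard against steps larger than the box itself

def reflected_em(x0, drift, sigma, dt, n, lo, hi, rng=None):
    """Euler-Maruyama for dX = drift(X) dt + sigma dW, reflected into a box."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.empty((n + 1, 2))
    x[0] = x0
    for k in range(n):
        step = drift(x[k]) * dt + sigma * np.sqrt(dt) * rng.normal(size=2)
        x[k + 1] = reflect_into_box(x[k] + step, lo, hi)
    return x

# Movement attracted to a central "haul-out" site at (0, 0), constrained to a coastal box.
drift = lambda p: -0.5 * p
path = reflected_em(x0=np.array([2.0, 1.0]), drift=drift, sigma=0.8, dt=0.1, n=5000,
                    lo=np.array([-5.0, -3.0]), hi=np.array([5.0, 3.0]))
```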

  7. Financial model calibration using consistency hints.

    Science.gov (United States)

    Abu-Mostafa, Y S

    2001-01-01

    We introduce a technique for forcing the calibration of a financial model to produce valid parameters. The technique is based on learning from hints. It converts simple curve fitting into genuine calibration, where broad conclusions can be inferred from parameter values. The technique augments the error function of curve fitting with consistency hint error functions based on the Kullback-Leibler distance. We introduce an efficient EM-type optimization algorithm tailored to this technique. We also introduce other consistency hints, and balance their weights using canonical errors. We calibrate the correlated multifactor Vasicek model of interest rates, and apply it successfully to Japanese Yen swaps market and US dollar yield market.
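
    For context, a plain single-factor Vasicek model dr = kappa*(theta - r) dt + sigma dW can be calibrated from a sampled short-rate series through its exact AR(1) discretization, as sketched below; the consistency hints and EM-type optimization of the paper augment this kind of curve fit with additional error terms and are not shown here.

```python
import numpy as np

def calibrate_vasicek(r, dt):
    """Least-squares calibration of dr = kappa*(theta - r) dt + sigma dW
    from a regularly sampled short-rate series, via its exact AR(1) form
        r_{t+1} = a + b*r_t + eps,  with b = exp(-kappa*dt)."""
    x, y = r[:-1], r[1:]
    b, a = np.polyfit(x, y, 1)                     # slope, intercept
    resid_var = np.var(y - (a + b * x), ddof=2)
    kappa = -np.log(b) / dt
    theta = a / (1.0 - b)
    sigma = np.sqrt(resid_var * 2.0 * kappa / (1.0 - b**2))
    return kappa, theta, sigma

# Synthetic check: simulate a Vasicek path and recover its parameters.
rng = np.random.default_rng(0)
kappa, theta, sigma, dt, n = 0.8, 0.03, 0.01, 1/252, 5000
r = np.empty(n); r[0] = 0.02
for k in range(1, n):
    r[k] = r[k-1] + kappa*(theta - r[k-1])*dt + sigma*np.sqrt(dt)*rng.normal()
print(calibrate_vasicek(r, dt))
```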

  8. Stochastic interest rates model in compounding | Galadima ...

    African Journals Online (AJOL)

    Stochastic interest rates model in compounding. ... in finance, real estate, insurance, accounting and other areas of business administration. The assumption that future rates are fixed and known with certainty at the beginning of an investment, ...

  9. A stochastic cloud model for cloud and ozone retrievals from UV measurements

    International Nuclear Information System (INIS)

    Efremenko, Dmitry S.; Schüssler, Olena; Doicu, Adrian; Loyola, Diego

    2016-01-01

    The new generation of satellite instruments provides measurements in and around the Oxygen A-band on a global basis and with a relatively high spatial resolution. These data are commonly used for the determination of cloud properties. A stochastic model and radiative transfer model, previously developed by the authors, is used as the forward model component in retrievals of cloud parameters and ozone total and partial columns. The cloud retrieval algorithm combines local and global optimization routines, and yields a retrieval accuracy of about 1% and a fast computational time. Retrieved parameters are the cloud optical thickness and the cloud-top height. It was found that the use of the independent pixel approximation instead of the stochastic cloud model leads to large errors in the retrieved cloud parameters, as well as, in the retrieved ozone height resolved partial columns. The latter can be reduced by using the stochastic cloud model to compute the optimal value of the regularization parameter in the framework of Tikhonov regularization. - Highlights: • A stochastic radiative transfer model for retrieving clouds/ozone is designed. • Errors of independent pixel approximation (IPA) for O3 total column are small. • The error of IPA for ozone profile retrieval may become large. • The use of stochastic model reduces the error of ozone profile retrieval.

  10. Threshold Dynamics in Stochastic SIRS Epidemic Models with Nonlinear Incidence and Vaccination

    Directory of Open Access Journals (Sweden)

    Lei Wang

    2017-01-01

    Full Text Available In this paper, the dynamical behaviors for a stochastic SIRS epidemic model with nonlinear incidence and vaccination are investigated. In the models, the disease transmission coefficient and the removal rates are all affected by noise. Some new basic properties of the models are found. Applying these properties, we establish a series of new threshold conditions on the stochastically exponential extinction, stochastic persistence, and permanence in the mean of the disease with probability one for the models. Furthermore, we obtain a sufficient condition on the existence of unique stationary distribution for the model. Finally, a series of numerical examples are introduced to illustrate our main theoretical results and some conjectures are further proposed.

  11. Model validation and calibration based on component functions of model output

    International Nuclear Information System (INIS)

    Wu, Danqing; Lu, Zhenzhou; Wang, Yanping; Cheng, Lei

    2015-01-01

    The target in this work is to validate the component functions of model output between physical observation and computational model with the area metric. Based on the theory of high dimensional model representations (HDMR) of independent input variables, conditional expectations are component functions of model output, and the conditional expectations reflect partial information of model output. Therefore, the model validation of conditional expectations reveals the discrepancy between the partial information of the computational model output and that of the observations. Then a calibration of the conditional expectations is carried out to reduce the value of the model validation metric. After that, a recalculation of the model validation metric of model output is performed with the calibrated model parameters, and the result shows that a reduction of the discrepancy in the conditional expectations can help decrease the difference in model output. Finally, several examples are employed to demonstrate the rationality and necessity of the methodology for both a single validation site and multiple validation sites. - Highlights: • A validation metric of conditional expectations of model output is proposed. • HDMR explains the relationship of conditional expectations and model output. • An improved approach of parameter calibration updates the computational models. • Validation and calibration are applied at a single site and at multiple sites. • Validation and calibration show superiority over existing methods
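
    In sample form, the area metric referred to above is the area between the empirical CDF of the observations and the empirical CDF of the model output; a sketch of that computation for a scalar quantity is given below, assuming both are available as samples.

```python
import numpy as np

def area_metric(model_samples, observations):
    """Area between the empirical CDFs of model output and observations
    (the area validation metric for a scalar quantity)."""
    m = np.sort(np.asarray(model_samples, float))
    o = np.sort(np.asarray(observations, float))
    grid = np.union1d(m, o)
    F_m = np.searchsorted(m, grid, side='right') / m.size
    F_o = np.searchsorted(o, grid, side='right') / o.size
    # integrate |F_m - F_o| over the grid (both CDFs are step functions)
    return np.sum(np.abs(F_m - F_o)[:-1] * np.diff(grid))

d = area_metric(model_samples=np.random.default_rng(1).normal(1.0, 0.2, 2000),
                observations=[0.8, 0.95, 1.1, 1.2, 0.9])
```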

  12. CALIBRATED HYDRODYNAMIC MODEL

    Directory of Open Access Journals (Sweden)

    Sezar Gülbaz

    2015-01-01

    Full Text Available The land development and increase in urbanization in a watershed affect water quantity and water quality. On the one hand, urbanization provokes the adjustment of the geomorphic structure of the streams and ultimately raises the peak flow rate, which causes floods; on the other hand, it diminishes water quality, which results in an increase in Total Suspended Solids (TSS). Consequently, sediment accumulation downstream of urban areas is observed, which is not preferred for a longer life of dams. In order to overcome the sediment accumulation problem in dams, the amount of TSS in streams and in watersheds should be taken under control. Low Impact Development (LID) is a Best Management Practice (BMP) which may be used for this purpose. It is a land planning and engineering design method which is applied in managing storm water runoff in order to reduce flooding as well as simultaneously improve water quality. LID includes techniques to predict suspended solid loads in surface runoff generated over impervious urban surfaces. In this study, the impact of LID-BMPs on surface runoff and TSS is investigated by employing a calibrated hydrodynamic model for Sazlidere Watershed, which is located in Istanbul, Turkey. For this purpose, a calibrated hydrodynamic model was developed by using the Environmental Protection Agency Storm Water Management Model (EPA SWMM). For model calibration and validation, we set up a rain gauge and a flow meter in the field and obtain rainfall and flow rate data. We then select several LID types such as retention basins, vegetative swales and permeable pavement and obtain their influence on peak flow rate and on pollutant buildup and washoff for TSS. Consequently, we observe the possible effects of LID on surface runoff and TSS in Sazlidere Watershed.

  13. Models of the stochastic activity of neurones

    CERN Document Server

    Holden, Arun Vivian

    1976-01-01

    These notes have grown from a series of seminars given at Leeds between 1972 and 1975. They represent an attempt to gather together the different kinds of model which have been proposed to account for the stochastic activity of neurones, and to provide an introduction to this area of mathematical biology. A striking feature of the electrical activity of the nervous system is that it appears stochastic: this is apparent at all levels of recording, ranging from intracellular recordings to the electroencephalogram. The chapters start with fluctuations in membrane potential, proceed through single unit and synaptic activity and end with the behaviour of large aggregates of neurones: I have chosen this sequence to suggest that the interesting behaviour of the nervous system - its individuality, variability and dynamic forms - may in part result from the stochastic behaviour of its components. I would like to thank Dr. Julio Rubio for reading and commenting on the drafts, Mrs. Doris Beighton for producing the fin...

  14. Stochastic modelling of avascular tumour growth and therapy

    International Nuclear Information System (INIS)

    Sahoo, S; Sahoo, A; Shearer, S F C

    2011-01-01

    In this paper, a generalized stochastic model for the growth of avascular tumours is presented. This model captures the dynamical evolution of avascular tumour cell subpopulations by incorporating Gaussian white noise into the growth rate of the mitotic function. This work generalizes the deterministic model proposed by Sherratt and Chaplain (2001 J. Math. Biol. 43 291) where they formulated a tumour model in an in vivo setting, in terms of continuum densities of proliferating, quiescent and necrotic cells. Detailed simulations of our model show that the inclusion of Gaussian noise in the original model of Sherratt and Chaplain substantially distorts the overall structure of the density profiles in addition to reducing the speed of tumour growth. Within this stochastic carcinogenesis framework the action of therapy is also investigated by replacing Gaussian white noise with a therapy term. We compare a constant therapy protocol with a logarithmic time-dependent protocol. Our results predict that a logarithmic therapy is more effective than the constant therapy protocol.

  15. Stochastic models of solute transport in highly heterogeneous geologic media

    Energy Technology Data Exchange (ETDEWEB)

    Semenov, V.N.; Korotkin, I.A.; Pruess, K.; Goloviznin, V.M.; Sorokovikova, O.S.

    2009-09-15

    A stochastic model of anomalous diffusion was developed in which transport occurs by random motion of Brownian particles, described by distribution functions of random displacements with heavy (power-law) tails. One variant of an effective algorithm for random function generation with a power-law asymptotic and arbitrary factor of asymmetry is proposed that is based on the Gnedenko-Levy limit theorem and makes it possible to reproduce all known Levy α-stable fractal processes. A two-dimensional stochastic random walk algorithm has been developed that approximates anomalous diffusion with streamline-dependent and space-dependent parameters. The motivation for introducing such a type of dispersion model is the observed fact that tracers in natural aquifers spread at different super-Fickian rates in different directions. For this and other important cases, stochastic random walk models are the only known way to solve the so-called multiscaling fractional order diffusion equation with space-dependent parameters. Some comparisons of model results and field experiments are presented.
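
    A minimal two-dimensional random-walk sketch in the spirit of this model is given below, with particle displacements drawn from a (possibly skewed) alpha-stable distribution via scipy; the streamline- and space-dependent parameters of the actual algorithm are not reproduced, and the parameter values are illustrative.

```python
import numpy as np
from scipy.stats import levy_stable

def stable_random_walk(n_particles=500, n_steps=200, alpha=1.6, beta=0.3,
                       scale=0.05, seed=0):
    """2-D random walk with i.i.d. alpha-stable displacements (heavy tails),
    giving super-Fickian spreading for alpha < 2."""
    dx = levy_stable.rvs(alpha, beta, scale=scale,
                         size=(n_steps, n_particles), random_state=seed)
    dy = levy_stable.rvs(alpha, 0.0, scale=scale,
                         size=(n_steps, n_particles), random_state=seed + 1)
    return np.cumsum(dx, axis=0), np.cumsum(dy, axis=0)

x, y = stable_random_walk()
plume_width = x[-1].std()   # grows faster than sqrt(t) for alpha < 2
```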

  16. Stochastic dynamics: modeling solute transport in porous media

    CERN Document Server

    Kulasiri, Don

    2002-01-01

    Most of the natural and biological phenomena such as solute transport in porous media exhibit variability which cannot be modeled by using deterministic approaches. There is evidence in natural phenomena to suggest that some of the observations cannot be explained by using the models which give deterministic solutions. Stochastic processes have a rich repository of objects which can be used to express the randomness inherent in the system and the evolution of the system over time. The attractiveness of stochastic differential equations (SDE) and stochastic partial differential equations (SPDE) comes from the fact that we can integrate the variability of the system along with the scientific knowledge pertaining to the system. One of the aims of this book is to explain some useful concepts in stochastic dynamics so that scientists and engineers with a background in undergraduate differential calculus could appreciate the applicability and appropriateness of these developments in mathematics. The ideas ...

  17. Methods and models in mathematical biology deterministic and stochastic approaches

    CERN Document Server

    Müller, Johannes

    2015-01-01

    This book developed from classes in mathematical biology taught by the authors over several years at the Technische Universität München. The main themes are modeling principles, mathematical principles for the analysis of these models, and model-based analysis of data. The key topics of modern biomathematics are covered: ecology, epidemiology, biochemistry, regulatory networks, neuronal networks, and population genetics. A variety of mathematical methods are introduced, ranging from ordinary and partial differential equations to stochastic graph theory and  branching processes. A special emphasis is placed on the interplay between stochastic and deterministic models.

  18. Stochastic description of heterogeneities of permeability within groundwater flow models

    International Nuclear Information System (INIS)

    Cacas, M.C.; Lachassagne, P.; Ledoux, E.; Marsily, G. de

    1991-01-01

    In order to model radionuclide migration in the geosphere realistically at the field scale, the hydrogeologist needs to be able to simulate groundwater flow in heterogeneous media. Heterogeneity of the medium can be described using a stochastic approach, that affects the way in which a flow model is formulated. In this paper, we discuss the problems that we have encountered in modelling both continuous and fractured media. The stochastic approach leads to a methodology that enables local measurements of permeability to be integrated into a model which gives a good prediction of groundwater flow on a regional scale. 5 Figs.; 8 Refs

  19. Doubly stochastic Poisson process models for precipitation at fine time-scales

    Science.gov (United States)

    Ramesh, Nadarajah I.; Onof, Christian; Xie, Dichao

    2012-09-01

    This paper considers a class of stochastic point process models, based on doubly stochastic Poisson processes, in the modelling of rainfall. We examine the application of this class of models, a neglected alternative to the widely-known Poisson cluster models, in the analysis of fine time-scale rainfall intensity. These models are mainly used to analyse tipping-bucket raingauge data from a single site but an extension to multiple sites is illustrated which reveals the potential of this class of models to study the temporal and spatial variability of precipitation at fine time-scales.
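
    A doubly stochastic Poisson process can be simulated by first generating a random intensity path and then thinning a homogeneous Poisson process against it; the sketch below uses a log-Gaussian (exponentiated Ornstein-Uhlenbeck) intensity, which is only one of several intensity models this class covers and is assumed here purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)

def simulate_cox_process(T=24.0, dt=0.01, lam0=2.0, tau=3.0, sigma=0.8):
    """Simulate event times of a doubly stochastic Poisson process on [0, T]
    whose intensity is lam0 * exp(X_t), with X_t an Ornstein-Uhlenbeck process."""
    t_grid = np.arange(0.0, T, dt)
    x = np.empty(t_grid.size)
    x[0] = 0.0
    a = np.exp(-dt / tau)
    s = sigma * np.sqrt(1.0 - a * a)
    for k in range(1, t_grid.size):
        x[k] = a * x[k - 1] + s * rng.normal()
    lam = lam0 * np.exp(x)

    # Thinning: propose homogeneous Poisson points at the max rate, keep with prob lam/lam_max.
    lam_max = lam.max()
    n_prop = rng.poisson(lam_max * T)
    proposals = np.sort(rng.uniform(0.0, T, n_prop))
    keep = rng.random(n_prop) < np.interp(proposals, t_grid, lam) / lam_max
    return proposals[keep], t_grid, lam

events, t_grid, intensity = simulate_cox_process()
```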

  20. A stochastic multiscale framework for modeling flow through random heterogeneous porous media

    International Nuclear Information System (INIS)

    Ganapathysubramanian, B.; Zabaras, N.

    2009-01-01

    Flow through porous media is ubiquitous, occurring from large geological scales down to the microscopic scales. Several critical engineering phenomena like contaminant spread, nuclear waste disposal and oil recovery rely on accurate analysis and prediction of these multiscale phenomena. Such analysis is complicated by inherent uncertainties as well as the limited information available to characterize the system. Any realistic modeling of these transport phenomena has to resolve two key issues: (i) the multi-length scale variations in permeability that these systems exhibit, and (ii) the inherently limited information available to quantify these property variations that necessitates posing these phenomena as stochastic processes. A stochastic variational multiscale formulation is developed to incorporate uncertain multiscale features. A stochastic analogue to a mixed multiscale finite element framework is used to formulate the physical stochastic multiscale process. Recent developments in linear and non-linear model reduction techniques are used to convert the limited information available about the permeability variation into a viable stochastic input model. An adaptive sparse grid collocation strategy is used to efficiently solve the resulting stochastic partial differential equations (SPDEs). The framework is applied to analyze flow through random heterogeneous media when only limited statistics about the permeability variation are given

  1. Stochastic models, estimation, and control

    CERN Document Server

    Maybeck, Peter S

    1982-01-01

    This volume builds upon the foundations set in Volumes 1 and 2. Chapter 13 introduces the basic concepts of stochastic control and dynamic programming as the fundamental means of synthesizing optimal stochastic control laws.

  2. GillesPy: A Python Package for Stochastic Model Building and Simulation

    OpenAIRE

    Abel, John H.; Drawert, Brian; Hellander, Andreas; Petzold, Linda R.

    2016-01-01

    GillesPy is an open-source Python package for model construction and simulation of stochastic biochemical systems. GillesPy consists of a Python framework for model building and an interface to the StochKit2 suite of efficient simulation algorithms based on the Gillespie stochastic simulation algorithms (SSA). To enable intuitive model construction and seamless integration into the scientific Python stack, we present an easy to understand, action-oriented programming interface. Here, we descr...

  3. Stochastic Analysis 2010

    CERN Document Server

    Crisan, Dan

    2011-01-01

    "Stochastic Analysis" aims to provide mathematical tools to describe and model high dimensional random systems. Such tools arise in the study of Stochastic Differential Equations and Stochastic Partial Differential Equations, Infinite Dimensional Stochastic Geometry, Random Media and Interacting Particle Systems, Super-processes, Stochastic Filtering, Mathematical Finance, etc. Stochastic Analysis has emerged as a core area of late 20th century Mathematics and is currently undergoing a rapid scientific development. The special volume "Stochastic Analysis 2010" provides a sa

  4. Predicting population extinction or disease outbreaks with stochastic models

    Directory of Open Access Journals (Sweden)

    Linda J. S. Allen

    2017-01-01

    Full Text Available Models of exponential growth, logistic growth and epidemics are common applications in undergraduate differential equation courses. The corresponding stochastic models are not part of these courses, although when population sizes are small their behaviour is often more realistic and distinctly different from deterministic models. For example, the randomness associated with births and deaths may lead to population extinction even in an exponentially growing population. Some background in continuous-time Markov chains and applications to populations, epidemics and cancer are presented with a goal to introduce this topic into the undergraduate mathematics curriculum that will encourage further investigation into problems on conservation, infectious diseases and cancer therapy. MATLAB programs for graphing sample paths of stochastic models are provided in the Appendix.

  5. Cloud-Based Model Calibration Using OpenStudio: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Hale, E.; Lisell, L.; Goldwasser, D.; Macumber, D.; Dean, J.; Metzger, I.; Parker, A.; Long, N.; Ball, B.; Schott, M.; Weaver, E.; Brackney, L.

    2014-03-01

    OpenStudio is a free, open source Software Development Kit (SDK) and application suite for performing building energy modeling and analysis. The OpenStudio Parametric Analysis Tool has been extended to allow cloud-based simulation of multiple OpenStudio models parametrically related to a baseline model. This paper describes the new cloud-based simulation functionality and presents a model calibration case study. Calibration is initiated by entering actual monthly utility bill data into the baseline model. Multiple parameters are then varied over multiple iterations to reduce the difference between actual energy consumption and model simulation results, as calculated and visualized by billing period and by fuel type. Simulations are performed in parallel using the Amazon Elastic Compute Cloud (EC2) service. This paper highlights the model parameterizations (measures) used for calibration, but the same multi-node computing architecture is available for other purposes, for example, recommending combinations of retrofit energy-saving measures using the calibrated model as the new baseline.
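
    The bill-matching step described above amounts to scoring simulated monthly consumption against utility data. A common way to do that, sketched below with made-up numbers rather than OpenStudio output, is via the normalized mean bias error and the coefficient of variation of the root-mean-square error.

```python
import numpy as np

def calibration_metrics(measured, simulated):
    """NMBE and CV(RMSE), in percent, for monthly energy use (12 values each)."""
    measured = np.asarray(measured, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    resid = measured - simulated
    mean_measured = measured.mean()
    nmbe = 100.0 * resid.sum() / (len(measured) * mean_measured)
    cvrmse = 100.0 * np.sqrt((resid ** 2).mean()) / mean_measured
    return nmbe, cvrmse

# Hypothetical monthly electricity use (kWh): utility bills vs. model output.
bills = [820, 760, 700, 650, 630, 720, 810, 830, 700, 640, 690, 800]
model = [790, 780, 690, 660, 610, 750, 840, 800, 720, 630, 700, 780]
nmbe, cvrmse = calibration_metrics(bills, model)
print(f"NMBE = {nmbe:+.1f}%, CV(RMSE) = {cvrmse:.1f}%")
```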

  6. Stochastic Growth Models with No Discounting

    Czech Academy of Sciences Publication Activity Database

    Sladký, Karel

    2007-01-01

    Vol. 15, No. 4 (2007), pp. 88-98. ISSN 0572-3043. R&D Projects: GA ČR(CZ) GA402/06/0990; GA ČR GA402/05/0115. Institutional research plan: CEZ:AV0Z10750506. Keywords: economic dynamics * stochastic version of the Ramsey growth model * Markov decision processes. Subject RIV: AH - Economics

  7. Role of calibration, validation, and relevance in multi-level uncertainty integration

    International Nuclear Information System (INIS)

    Li, Chenzhao; Mahadevan, Sankaran

    2016-01-01

    Calibration of model parameters is an essential step in predicting the response of a complicated system, but the lack of data at the system level makes it impossible to conduct this quantification directly. In such a situation, system model parameters are estimated using tests at lower levels of complexity which share the same model parameters with the system. For such a multi-level problem, this paper proposes a methodology to quantify the uncertainty in the system-level prediction by integrating calibration, validation and sensitivity analysis at different levels. The proposed approach considers the validity of the models used for parameter estimation at lower levels, as well as the relevance of the lower level to the prediction at the system level. The model validity is evaluated using a model reliability metric, and models with multivariate output are considered. The relevance is quantified by comparing Sobol indices at the lower level and the system level, thus measuring the extent to which a lower-level test represents the characteristics of the system, so that the calibration results can be reliably used at the system level. Finally, the results of calibration, validation and relevance analysis are integrated in a roll-up method to predict the system output. - Highlights: • Relevance analysis to quantify the closeness of two models. • Stochastic model reliability metric to integrate multiple validation experiments. • Extend the model reliability metric to deal with multivariate output. • Roll-up formula to integrate calibration, validation, and relevance.

  8. Application of Stochastic Partial Differential Equations to Reservoir Property Modelling

    KAUST Repository

    Potsepaev, R.

    2010-09-06

    Existing algorithms of geostatistics for stochastic modelling of reservoir parameters require a mapping (the 'uvt-transform') into the parametric space and reconstruction of a stratigraphic co-ordinate system. The parametric space can be considered to represent a pre-deformed and pre-faulted depositional environment. Existing approximations of this mapping in many cases cause significant distortions to the correlation distances. In this work we propose a coordinate-free approach for modelling stochastic textures through the application of stochastic partial differential equations. By avoiding the construction of a uvt-transform and stratigraphic coordinates, one can generate realizations directly in the physical space in the presence of deformations and faults. In particular, the solution of the modified Helmholtz equation driven by Gaussian white noise is a zero-mean Gaussian stationary random field with exponential correlation function (in 3-D). This equation can be used to generate realizations in parametric space. In order to sample in physical space, we introduce a stochastic elliptic PDE with tensor coefficients, where the tensor is related to the correlation anisotropy and its variation in physical space.
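
    A minimal sketch of the idea of generating a correlated Gaussian field by solving a modified Helmholtz equation driven by white noise is given below, on a 1-D periodic grid via the FFT. The paper itself works in 3-D with tensor coefficients; the grid size, kappa and the spectral shortcut used here are assumptions made for brevity.

```python
import numpy as np

def helmholtz_white_noise_field(n=512, length=1.0, kappa=20.0, seed=0):
    """Sample u solving (kappa^2 - d^2/dx^2) u = W on a periodic 1-D grid,
    where W is spatial Gaussian white noise, using a spectral (FFT) solver."""
    rng = np.random.default_rng(seed)
    dx = length / n
    # Discrete white noise with variance 1/dx per cell, approximating
    # continuous white noise as the grid is refined.
    w = rng.normal(0.0, 1.0 / np.sqrt(dx), size=n)
    k = 2.0 * np.pi * np.fft.fftfreq(n, d=dx)     # Fourier wavenumbers
    u_hat = np.fft.fft(w) / (kappa**2 + k**2)     # invert (kappa^2 - Laplacian)
    return np.real(np.fft.ifft(u_hat))

field = helmholtz_white_noise_field()
print("sample standard deviation of the generated field:", field.std())
```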

  9. Fermentation process tracking through enhanced spectral calibration modeling.

    Science.gov (United States)

    Triadaphillou, Sophia; Martin, Elaine; Montague, Gary; Norden, Alison; Jeffkins, Paul; Stimpson, Sarah

    2007-06-15

    The FDA process analytical technology (PAT) initiative will materialize in a significant increase in the number of installations of spectroscopic instrumentation. However, to attain the greatest benefit from the data generated, there is a need for calibration procedures that extract the maximum information content. For example, in fermentation processes, the interpretation of the resulting spectra is challenging as a consequence of the large number of wavelengths recorded, the underlying correlation structure that is evident between the wavelengths and the impact of the measurement environment. Approaches to the development of calibration models have been based on the application of partial least squares (PLS) either to the full spectral signature or to a subset of wavelengths. This paper presents a new approach to calibration modeling that combines a wavelength selection procedure, spectral window selection (SWS), where windows of wavelengths are automatically selected which are subsequently used as the basis of the calibration model. However, due to the non-uniqueness of the windows selected when the algorithm is executed repeatedly, multiple models are constructed and these are then combined using stacking thereby increasing the robustness of the final calibration model. The methodology is applied to data generated during the monitoring of broth concentrations in an industrial fermentation process from on-line near-infrared (NIR) and mid-infrared (MIR) spectrometers. It is shown that the proposed calibration modeling procedure outperforms traditional calibration procedures, as well as enabling the identification of the critical regions of the spectra with regard to the fermentation process.
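
    The window-selection-plus-stacking strategy can be sketched roughly as follows: random contiguous wavelength windows are scored by cross-validated PLS error, the best windows are kept, and their individual PLS models are combined by error-weighted averaging. This is a schematic re-implementation under assumed data shapes and synthetic data, not the authors' SWS algorithm.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 400))            # 60 spectra, 400 wavelengths (synthetic)
y = X[:, 120:140].mean(axis=1) + 0.05 * rng.normal(size=60)   # synthetic analyte

def score_window(start, width=40, n_comp=3):
    """Cross-validated MSE of a PLS model restricted to one wavelength window."""
    Xw = X[:, start:start + width]
    return -cross_val_score(PLSRegression(n_components=n_comp), Xw, y,
                            scoring="neg_mean_squared_error", cv=5).mean()

# Evaluate candidate windows and keep the three best.
starts = range(0, 360, 20)
best_starts = sorted(starts, key=score_window)[:3]

# Stack the window models by inverse-error weighting of their predictions.
models, weights = [], []
for s in best_starts:
    models.append((s, PLSRegression(n_components=3).fit(X[:, s:s + 40], y)))
    weights.append(1.0 / score_window(s))
weights = np.array(weights) / np.sum(weights)

def predict(X_new):
    preds = np.column_stack([m.predict(X_new[:, s:s + 40]).ravel() for s, m in models])
    return preds @ weights

print("stacked in-sample RMSE:", np.sqrt(np.mean((predict(X) - y) ** 2)))
```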

  10. Cumulative error models for the tank calibration problem

    International Nuclear Information System (INIS)

    Goldman, A.; Anderson, L.G.; Weber, J.

    1983-01-01

    The purpose of a tank calibration equation is to obtain an estimate of the liquid volume that corresponds to a liquid level measurement. Calibration experimental errors occur in both liquid level and liquid volume measurements. If one of the errors is relatively small, the calibration equation can be determined from well-known regression and calibration methods. If both variables are assumed to be in error, then for linear cases a prototype model should be considered. Many investigators are not familiar with this model or do not have computing facilities capable of obtaining numerical solutions. This paper discusses and compares three linear models that approximate the prototype model and have the advantage of much simpler computations. Comparisons among the four models and recommendations of suitability are made from simulations and from analyses of six sets of experimental data.
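
    When both the level and the volume measurements carry error, a simple errors-in-variables fit such as Deming regression can stand in for the kind of "prototype model" the abstract alludes to. The closed-form estimator below is standard, but the data and the error-variance ratio are invented for illustration; this is not the specific model compared in the paper.

```python
import numpy as np

def deming_fit(x, y, delta=1.0):
    """Deming (errors-in-variables) regression of y on x.
    delta = var(error in y) / var(error in x). Returns (slope, intercept)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    sxx = np.var(x, ddof=1)
    syy = np.var(y, ddof=1)
    sxy = np.cov(x, y, ddof=1)[0, 1]
    slope = (syy - delta * sxx + np.sqrt((syy - delta * sxx) ** 2
             + 4.0 * delta * sxy ** 2)) / (2.0 * sxy)
    intercept = y.mean() - slope * x.mean()
    return slope, intercept

# Hypothetical calibration run: liquid level (cm) vs. volume (litres),
# both measured with error.
level = [10.1, 20.3, 29.8, 40.2, 50.1, 59.7, 70.4]
volume = [151, 302, 448, 601, 752, 896, 1056]
slope, intercept = deming_fit(level, volume, delta=4.0)
print(f"volume ~ {slope:.2f} * level + {intercept:.2f}")
```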

  11. Root zone water quality model (RZWQM2): Model use, calibration and validation

    Science.gov (United States)

    Ma, Liwang; Ahuja, Lajpat; Nolan, B.T.; Malone, Robert; Trout, Thomas; Qi, Z.

    2012-01-01

    The Root Zone Water Quality Model (RZWQM2) has been used widely for simulating agricultural management effects on crop production and soil and water quality. Although it is a one-dimensional model, it has many desirable features for the modeling community. This article outlines the principles of calibrating the model component by component with one or more datasets and validating the model with independent datasets. Users should consult the RZWQM2 user manual distributed along with the model and a more detailed protocol on how to calibrate RZWQM2 provided in a book chapter. Two case studies (or examples) are included in this article. One is from an irrigated maize study in Colorado to illustrate the use of field and laboratory measured soil hydraulic properties on simulated soil water and crop production. It also demonstrates the interaction between soil and plant parameters in simulated plant responses to water stresses. The other is from a maize-soybean rotation study in Iowa to show a manual calibration of the model for crop yield, soil water, and N leaching in tile-drained soils. Although the commonly used trial-and-error calibration method works well for experienced users, as shown in the second example, an automated calibration procedure is more objective, as shown in the first example. Furthermore, the incorporation of the Parameter Estimation Software (PEST) into RZWQM2 made the calibration of the model more efficient than a grid (ordered) search of model parameters. In addition, PEST provides sensitivity and uncertainty analyses that should help users in selecting the right parameters to calibrate.

  12. Survival Analysis of a Nonautonomous Logistic Model with Stochastic Perturbation

    Directory of Open Access Journals (Sweden)

    Chun Lu

    2012-01-01

    Full Text Available Taking white noise into account, a stochastic nonautonomous logistic model is proposed and investigated. Sufficient conditions for extinction, nonpersistence in the mean, weak persistence, stochastic permanence, and global asymptotic stability are established. Moreover, the threshold between weak persistence and extinction is obtained. Finally, we present some numerical simulations to illustrate our main results.
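
    A minimal Euler-Maruyama sketch of a time-homogeneous special case of the stochastic logistic model dN = N(a - bN) dt + sigma N dB is shown below; the coefficients are arbitrary and the nonautonomous (time-varying) structure studied in the paper is not reproduced.

```python
import numpy as np

def euler_maruyama_logistic(a=1.0, b=0.02, sigma=0.3, n0=5.0,
                            t_end=50.0, dt=0.01, seed=42):
    """Simulate dN = N(a - b*N) dt + sigma*N dB by the Euler-Maruyama scheme."""
    rng = np.random.default_rng(seed)
    steps = int(t_end / dt)
    n = np.empty(steps + 1)
    n[0] = n0
    for i in range(steps):
        dB = rng.normal(0.0, np.sqrt(dt))
        n[i + 1] = max(n[i] + n[i] * (a - b * n[i]) * dt + sigma * n[i] * dB, 0.0)
    return n

path = euler_maruyama_logistic()
print(f"time-average of the path: {path.mean():.2f} (deterministic equilibrium a/b = 50)")
```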

  13. Exercise effects in a virtual type 1 diabetes patient: Using stochastic differential equations for model extension

    DEFF Research Database (Denmark)

    Duun-Henriksen, Anne Katrine; Schmidt, S.; Nørgaard, K.

    2013-01-01

    extension incorporating exercise effects on insulin and glucose dynamics. Our model is constructed as a stochastic state space model consisting of a set of stochastic differential equations (SDEs). In a stochastic state space model, the residual error is split into random measurement error...

  14. Expansion or extinction: deterministic and stochastic two-patch models with Allee effects.

    Science.gov (United States)

    Kang, Yun; Lanchier, Nicolas

    2011-06-01

    We investigate the impact of the Allee effect and dispersal on the long-term evolution of a population in a patchy environment. Our main focus is on whether a population already established in one patch either successfully invades an adjacent empty patch or undergoes a global extinction. Our study is based on the combination of analytical and numerical results for both a deterministic two-patch model and a stochastic counterpart. The deterministic model has either two, three or four attractors. The existence of a regime with exactly three attractors only appears when patches have distinct Allee thresholds. In the presence of weak dispersal, the analysis of the deterministic model shows that a high-density and a low-density population can coexist at equilibrium in nearby patches, whereas the analysis of the stochastic model indicates that this equilibrium is metastable, thus leading after a large random time to either a global expansion or a global extinction. Up to some critical dispersal, increasing the intensity of the interactions leads to an increase of both the basin of attraction of the global extinction and the basin of attraction of the global expansion. Above this threshold, for both the deterministic and the stochastic models, the patches tend to synchronize as the intensity of the dispersal increases. This results in either a global expansion or a global extinction. For the deterministic model, there are only two attractors, while the stochastic model no longer exhibits a metastable behavior. In the presence of strong dispersal, the limiting behavior is entirely determined by the value of the Allee thresholds as the global population size in the deterministic and the stochastic models evolves as dictated by their single-patch counterparts. For all values of the dispersal parameter, Allee effects promote global extinction in terms of an expansion of the basin of attraction of the extinction equilibrium for the deterministic model and an increase of the

  15. Dynamic stochastic accumulation model with application to pension savings management

    Directory of Open Access Journals (Sweden)

    Melicherčik Igor

    2010-01-01

    Full Text Available We propose a dynamic stochastic accumulation model for determining optimal decision between stock and bond investments during accumulation of pension savings. Stock prices are assumed to be driven by the geometric Brownian motion. Interest rates are modeled by means of the Cox-Ingersoll-Ross model. The optimal decision as a solution to the corresponding dynamic stochastic program is a function of the duration of saving, the level of savings and the short rate. Qualitative and quantitative properties of the optimal solution are analyzed. The model is tested on the funded pillar of the Slovak pension system. The results are calculated for various risk preferences of a saver.
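
    The two driving processes named in the abstract can be sketched directly: geometric Brownian motion for the stock index and a Cox-Ingersoll-Ross short rate, both discretized with a simple Euler step. The parameter values below are placeholders, not the Slovak-pillar calibration, and the two Brownian drivers are taken as independent for simplicity.

```python
import numpy as np

def simulate_gbm_cir(mu=0.07, sigma_s=0.18,               # stock: GBM drift / volatility
                     kappa=0.5, theta=0.03, sigma_r=0.05,  # short rate: CIR parameters
                     s0=1.0, r0=0.02, years=40, dt=1/12, seed=7):
    """Euler discretization of a GBM stock index and a CIR short rate."""
    rng = np.random.default_rng(seed)
    steps = int(years / dt)
    s, r = np.empty(steps + 1), np.empty(steps + 1)
    s[0], r[0] = s0, r0
    for i in range(steps):
        dw_s, dw_r = rng.normal(0.0, np.sqrt(dt), size=2)
        s[i + 1] = s[i] * np.exp((mu - 0.5 * sigma_s**2) * dt + sigma_s * dw_s)
        r[i + 1] = abs(r[i] + kappa * (theta - r[i]) * dt
                       + sigma_r * np.sqrt(max(r[i], 0.0)) * dw_r)   # keep rate nonnegative
    return s, r

stock, rate = simulate_gbm_cir()
print(f"terminal stock index {stock[-1]:.2f}, terminal short rate {rate[-1]:.3f}")
```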

  16. Stochastic modeling of financial electricity contracts

    International Nuclear Information System (INIS)

    Benth, Fred Espen; Koekebakker, Steen

    2008-01-01

    We discuss the modeling of electricity contracts traded in many deregulated power markets. These forward/futures type contracts deliver (either physically or financially) electricity over a specified time period, and are frequently referred to as swaps since they in effect represent an exchange of fixed for floating electricity price. We propose to use the Heath-Jarrow-Morton approach to model swap prices since the notion of a spot price is not easily defined in these markets. For general stochastic dynamical models, we connect the spot price, the instantaneous-delivery forward price and the swap price, and analyze two different ways to apply the Heath-Jarrow-Morton approach to swap pricing: Either one specifies a dynamics for the non-existing instantaneous-delivery forwards and derives the implied swap dynamics, or one models directly on the swaps. The former is shown to lead to quite complicated stochastic models for the swap price, even when the forward dynamics is simple. The latter has some theoretical problems due to a no-arbitrage condition that has to be satisfied for swaps with overlapping delivery periods. To overcome this problem, a practical modeling approach is analyzed. The market is assumed to consist only of non-overlapping swaps, and these are modelled directly. A thorough empirical study is performed using data collected from Nord Pool. Our investigations demonstrate that it is possible to state reasonable models for the swap price dynamics which are analytically tractable for risk management and option pricing purposes; however, this is an area of further research. (author)

  17. Backward-stochastic-differential-equation approach to modeling of gene expression.

    Science.gov (United States)

    Shamarova, Evelina; Chertovskih, Roman; Ramos, Alexandre F; Aguiar, Paulo

    2017-03-01

    In this article, we introduce a backward method to model stochastic gene expression and protein-level dynamics. The protein amount is regarded as a diffusion process and is described by a backward stochastic differential equation (BSDE). Unlike many other SDE techniques proposed in the literature, the BSDE method is backward in time; that is, instead of initial conditions it requires the specification of end-point ("final") conditions, in addition to the model parametrization. To validate our approach we employ Gillespie's stochastic simulation algorithm (SSA) to generate (forward) benchmark data, according to predefined gene network models. Numerical simulations show that the BSDE method is able to correctly infer the protein-level distributions that preceded a known final condition, obtained originally from the forward SSA. This makes the BSDE method a powerful systems biology tool for time-reversed simulations, allowing, for example, the assessment of the biological conditions (e.g., protein concentrations) that preceded an experimentally measured event of interest (e.g., mitosis, apoptosis, etc.).

  18. Stochastic inequalities and applications to dynamics analysis of a novel SIVS epidemic model with jumps

    Directory of Open Access Journals (Sweden)

    Xiaona Leng

    2017-06-01

    Full Text Available Abstract This paper proposes a new nonlinear stochastic SIVS epidemic model with double epidemic hypothesis and Lévy jumps. The main purpose of this paper is to investigate the threshold dynamics of the stochastic SIVS epidemic model. By using the technique of a series of stochastic inequalities, we obtain sufficient conditions for the persistence in mean and extinction of the stochastic system and the threshold which governs the extinction and the spread of the epidemic diseases. Finally, this paper describes the results of numerical simulations investigating the dynamical effects of stochastic disturbance. Our results significantly improve and generalize the corresponding results in the recent literature. The developed theoretical methods and stochastic inequalities technique can be used to investigate high-dimensional nonlinear stochastic differential systems.

  19. Pricing long-dated insurance contracts with stochastic interest rates and stochastic volatility

    NARCIS (Netherlands)

    van Haastrecht, A.; Lord, R.; Pelsser, A.; Schrager, D.

    2009-01-01

    We consider the pricing of long-dated insurance contracts under stochastic interest rates and stochastic volatility. In particular, we focus on the valuation of insurance options with long-term equity or foreign exchange exposures. Our modeling framework extends the stochastic volatility model of

  20. Stochastic Modelling of the Diffusion Coefficient for Concrete

    DEFF Research Database (Denmark)

    Thoft-Christensen, Palle

    In the paper, a new stochastic modelling of the diffusion coefficient D is presented. The modelling is based on physical understanding of the diffusion process and on some recent experimental results. The diffusion coefficient D is strongly dependent on the w/c ratio and the temperature....

  1. Stochastic models for transport in a fluidized bed

    NARCIS (Netherlands)

    Dehling, H.G; Hoffmann, A.C; Stuut, H.W.

    1999-01-01

    In this paper we study stochastic models for the transport of particles in a fluidized bed reactor and compute the associated residence time distribution (RTD). Our main model is basically a diffusion process in [0;A] with reflecting/absorbing boundary conditions, modified by allowing jumps to the

  2. Calibration of the Site-Scale Saturated Zone Flow Model

    International Nuclear Information System (INIS)

    Zyvoloski, G. A.

    2001-01-01

    The purpose of the flow calibration analysis work is to provide Performance Assessment (PA) with the calibrated site-scale saturated zone (SZ) flow model that will be used to make radionuclide transport calculations. As such, it is one of the most important models developed in the Yucca Mountain project. This model will be a culmination of much of our knowledge of the SZ flow system. The objective of this study is to provide a defensible site-scale SZ flow and transport model that can be used for assessing total system performance. A defensible model would include geologic and hydrologic data that are used to form the hydrogeologic framework model; also, it would include hydrochemical information to infer transport pathways, in-situ permeability measurements, and water level and head measurements. In addition, the model should include information on major model sensitivities. Especially important are those that affect calibration, the direction of transport pathways, and travel times. Finally, if warranted, alternative calibrations representing different conceptual models should be included. To obtain a defensible model, all available data should be used (or at least considered) to obtain a calibrated model. The site-scale SZ model was calibrated using measured and model-generated water levels and hydraulic head data, specific discharge calculations, and flux comparisons along several of the boundaries. Model validity was established by comparing model-generated permeabilities with the permeability data from field and laboratory tests; by comparing fluid pathlines obtained from the SZ flow model with those inferred from hydrochemical data; and by comparing the upward gradient generated with the model with that observed in the field. This analysis is governed by the Office of Civilian Radioactive Waste Management (OCRWM) Analysis and Modeling Report (AMR) Development Plan 'Calibration of the Site-Scale Saturated Zone Flow Model' (CRWMS M and O 1999a).

  3. Individualism in plant populations: using stochastic differential equations to model individual neighbourhood-dependent plant growth.

    Science.gov (United States)

    Lv, Qiming; Schneider, Manuel K; Pitchford, Jonathan W

    2008-08-01

    We study individual plant growth and size hierarchy formation in an experimental population of Arabidopsis thaliana, within an integrated analysis that explicitly accounts for size-dependent growth, size- and space-dependent competition, and environmental stochasticity. It is shown that a Gompertz-type stochastic differential equation (SDE) model, involving asymmetric competition kernels and a stochastic term which decreases with the logarithm of plant weight, efficiently describes individual plant growth, competition, and variability in the studied population. The model is evaluated within a Bayesian framework and compared to its deterministic counterpart, and to several simplified stochastic models, using distributional validation. We show that stochasticity is an important determinant of size hierarchy and that SDE models outperform the deterministic model if and only if structural components of competition (asymmetry; size- and space-dependence) are accounted for. Implications of these results are discussed in the context of plant ecology and in more general modelling situations.

  4. Stochastic dynamical models for ecological regime shifts

    DEFF Research Database (Denmark)

    Møller, Jan Kloppenborg; Carstensen, Jacob; Madsen, Henrik

    the physical and biological knowledge of the system, and nonlinearities introduced here can generate regime shifts or enhance the probability of regime shifts in the case of stochastic models, typically characterized by a threshold value for the known driver. A simple model for light competition between...... definition and stability of regimes become less subtle. Ecological regime shifts and their modeling must be viewed in a probabilistic manner, particularly if such model results are to be used in ecosystem management....

  5. MT3DMS: Model use, calibration, and validation

    Science.gov (United States)

    Zheng, C.; Hill, Mary C.; Cao, G.; Ma, R.

    2012-01-01

    MT3DMS is a three-dimensional multi-species solute transport model for solving advection, dispersion, and chemical reactions of contaminants in saturated groundwater flow systems. MT3DMS interfaces directly with the U.S. Geological Survey finite-difference groundwater flow model MODFLOW for the flow solution and supports the hydrologic and discretization features of MODFLOW. MT3DMS contains multiple transport solution techniques in one code, which can often be important, including in model calibration. Since its first release in 1990 as MT3D for single-species mass transport modeling, MT3DMS has been widely used in research projects and practical field applications. This article provides a brief introduction to MT3DMS and presents recommendations about calibration and validation procedures for field applications of MT3DMS. The examples presented suggest the need to consider alternative processes as models are calibrated and suggest opportunities and difficulties associated with using groundwater age in transport model calibration.

  6. Stochastic processes, multiscale modeling, and numerical methods for computational cellular biology

    CERN Document Server

    2017-01-01

    This book focuses on the modeling and mathematical analysis of stochastic dynamical systems along with their simulations. The collected chapters will review fundamental and current topics and approaches to dynamical systems in cellular biology. This text aims to develop improved mathematical and computational methods with which to study biological processes. At the scale of a single cell, stochasticity becomes important due to low copy numbers of biological molecules, such as mRNA and proteins that take part in biochemical reactions driving cellular processes. When trying to describe such biological processes, the traditional deterministic models are often inadequate, precisely because of these low copy numbers. This book presents stochastic models, which are necessary to account for small particle numbers and extrinsic noise sources. The complexity of these models depend upon whether the biochemical reactions are diffusion-limited or reaction-limited. In the former case, one needs to adopt the framework of s...

  7. Stochastic Modeling of Past Volcanic Crises

    Science.gov (United States)

    Woo, Gordon

    2018-01-01

    The statistical foundation of disaster risk analysis is past experience. From a scientific perspective, history is just one realization of what might have happened, given the randomness and chaotic dynamics of Nature. Stochastic analysis of the past is an exploratory exercise in counterfactual history, considering alternative possible scenarios. In particular, the dynamic perturbations that might have transitioned a volcano from an unrest to an eruptive state need to be considered. The stochastic modeling of past volcanic crises leads to estimates of eruption probability that can illuminate historical volcanic crisis decisions. It can also inform future economic risk management decisions in regions where there has been some volcanic unrest, but no actual eruption for at least hundreds of years. Furthermore, the availability of a library of past eruption probabilities would provide benchmark support for estimates of eruption probability in future volcanic crises.

  8. A stochastic SIRS epidemic model with infectious force under intervention strategies

    Science.gov (United States)

    Cai, Yongli; Kang, Yun; Banerjee, Malay; Wang, Weiming

    2015-12-01

    In this paper, we extend a classical SIRS epidemic model with the infectious forces under intervention strategies from a deterministic framework to a stochastic differential equation (SDE) one through introducing random fluctuations. The value of our study lies in two aspects. Mathematically, by using the Markov semigroups theory, we prove that the reproduction number R0S can be used to govern the stochastic dynamics of the SDE model. If R0S > 1, under mild extra conditions, the model has an endemic stationary distribution which leads to the stochastic persistence of the disease. Epidemiologically, we find that random fluctuations can suppress disease outbreak, which can provide us with some useful control strategies to regulate disease dynamics.

  9. A Stochastic Operational Planning Model for Smart Power Systems

    Directory of Open Access Journals (Sweden)

    Sh. Jadid

    2014-12-01

    Full Text Available Smart grids are the result of utilizing novel technologies such as distributed energy resources and communication technologies in the power system to compensate for some of its shortcomings. Various power resources provide benefits for the operation domain; however, the power system operator needs a powerful methodology to manage them. Renewable resources and load add uncertainty to the problem, so the independent system operator should use a stochastic method to manage them. A stochastic unit commitment is presented in this paper to schedule various power resources such as distributed generation units, conventional thermal generation units, wind and PV farms, and demand response resources. Demand response resources, interruptible loads, distributed generation units, and conventional thermal generation units are used to provide the reserve required to compensate for the stochastic nature of the various resources and loads. In the presented model, resources connected to the distribution network can participate in the wholesale market through aggregators. Moreover, a novel three-program model which can be used by aggregators is presented in this article. Loads and distributed generation can contract with aggregators through these programs. A three-bus test system and the IEEE RTS are used to illustrate the usefulness of the presented model. The results show that the ISO can manage the system effectively by using this model.

  10. Mapping of the stochastic Lotka-Volterra model to models of population genetics and game theory

    Science.gov (United States)

    Constable, George W. A.; McKane, Alan J.

    2017-08-01

    The relationship between the M -species stochastic Lotka-Volterra competition (SLVC) model and the M -allele Moran model of population genetics is explored via timescale separation arguments. When selection for species is weak and the population size is large but finite, precise conditions are determined for the stochastic dynamics of the SLVC model to be mappable to the neutral Moran model, the Moran model with frequency-independent selection, and the Moran model with frequency-dependent selection (equivalently a game-theoretic formulation of the Moran model). We demonstrate how these mappings can be used to calculate extinction probabilities and the times until a species' extinction in the SLVC model.
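
    For intuition about the Moran-model end of the mapping, the sketch below simulates a neutral two-allele Moran model and estimates the fixation probability of an allele starting from i0 copies out of N, which should be close to the classical value i0/N. The population size and replicate count are arbitrary choices, and this is not the SLVC mapping itself.

```python
import numpy as np

def moran_fixation_probability(N=50, i0=10, replicates=2000, seed=3):
    """Estimate the fixation probability of allele A in a neutral Moran model."""
    rng = np.random.default_rng(seed)
    fixed = 0
    for _ in range(replicates):
        i = i0                                  # current number of A alleles
        while 0 < i < N:
            # One Moran step: a random individual reproduces, another dies.
            birth_is_A = rng.random() < i / N
            death_is_A = rng.random() < i / N
            i += int(birth_is_A) - int(death_is_A)
        fixed += (i == N)
    return fixed / replicates

print("estimated fixation probability:", moran_fixation_probability(),
      "(neutral theory: i0/N = 0.2)")
```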

  11. Characterizing economic trends by Bayesian stochastic model specification search

    DEFF Research Database (Denmark)

    Grassi, Stefano; Proietti, Tommaso

    We extend a recently proposed Bayesian model selection technique, known as stochastic model specification search, for characterising the nature of the trend in macroeconomic time series. In particular, we focus on autoregressive models with possibly time-varying intercept and slope and decide on ...

  12. Intimate Partner Violence: A Stochastic Model.

    Science.gov (United States)

    Guidi, Elisa; Meringolo, Patrizia; Guazzini, Andrea; Bagnoli, Franco

    2017-01-01

    Intimate partner violence (IPV) has been a well-studied problem in the past psychological literature, especially through its classical methodology such as qualitative, quantitative and mixed methods. This article introduces two basic stochastic models as an alternative approach to simulate the short- and long-term dynamics of a couple at risk of IPV. In both models, the members of the couple may assume a finite number of states, updating them in a probabilistic way at discrete time steps. After defining the transition probabilities, we first analyze the evolution of the couple in isolation and then we consider the case in which the individuals modify their behavior depending on the perceived violence from other couples in their environment or based on the perceived informal social support. While high perceived violence in other couples may converge toward the presence of IPV in the couple itself by means of a gender-specific transmission, the gender differences fade out in the case of received informal social support. Despite the simplicity of the two stochastic models, they generate results which compare well with past experimental studies about IPV and they have important practical implications for prevention interventions in this field.

  13. A stochastic-bayesian model for the fracture probability of PWR pressure vessels

    Energy Technology Data Exchange (ETDEWEB)

    Francisco, Alexandre S.; Duran, Jorge Alberto R., E-mail: afrancisco@metal.eeimvr.uff.br, E-mail: duran@metal.eeimvr.uff.br [Universidade Federal Fluminense (UFF), Volta Redonda, RJ (Brazil). Dept. de Engenharia Mecanica

    2013-07-01

    Fracture probability of pressure vessels containing cracks can be obtained by easily understood methodologies that require a deterministic treatment complemented by statistical methods. However, when more accurate results are required, the methodologies need to be better formulated. This paper presents a new methodology to address this problem. First, a more rigorous methodology is obtained by means of the relationship of the probability distributions that model crack incidence and nondestructive inspection efficiency, using Bayes' theorem. The result is an updated crack incidence distribution. Further, the accuracy of the methodology is improved by using a stochastic model for the crack growth. The stochastic model incorporates the statistical variability of the crack growth process, combining stochastic theory with experimental data. Stochastic differential equations are derived by the randomization of empirical equations. From the solution of this equation, a distribution function related to the crack growth is derived. The fracture probability obtained using both probability distribution functions is in agreement with theory, and presents realistic values for pressure vessels. (author)

  14. A stochastic-bayesian model for the fracture probability of PWR pressure vessels

    International Nuclear Information System (INIS)

    Francisco, Alexandre S.; Duran, Jorge Alberto R.

    2013-01-01

    Fracture probability of pressure vessels containing cracks can be obtained by easily understood methodologies that require a deterministic treatment complemented by statistical methods. However, when more accurate results are required, the methodologies need to be better formulated. This paper presents a new methodology to address this problem. First, a more rigorous methodology is obtained by means of the relationship of the probability distributions that model crack incidence and nondestructive inspection efficiency, using Bayes' theorem. The result is an updated crack incidence distribution. Further, the accuracy of the methodology is improved by using a stochastic model for the crack growth. The stochastic model incorporates the statistical variability of the crack growth process, combining stochastic theory with experimental data. Stochastic differential equations are derived by the randomization of empirical equations. From the solution of this equation, a distribution function related to the crack growth is derived. The fracture probability obtained using both probability distribution functions is in agreement with theory, and presents realistic values for pressure vessels. (author)

  15. Estimation of Dynamic Panel Data Models with Stochastic Volatility Using Particle Filters

    Directory of Open Access Journals (Sweden)

    Wen Xu

    2016-10-01

    Full Text Available Time-varying volatility is common in macroeconomic data and has been incorporated into macroeconomic models in recent work. Dynamic panel data models have become increasingly popular in macroeconomics to study common relationships across countries or regions. This paper estimates dynamic panel data models with stochastic volatility by maximizing an approximate likelihood obtained via Rao-Blackwellized particle filters. Monte Carlo studies reveal the good and stable performance of our particle filter-based estimator. When the volatility of volatility is high, or when regressors are absent but stochastic volatility exists, our approach can be better than the maximum likelihood estimator which neglects stochastic volatility and generalized method of moments (GMM estimators.
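
    A bare-bones bootstrap particle filter for a canonical stochastic-volatility model (AR(1) log-volatility, returns with variance exp(h_t)) is sketched below to show the kind of likelihood approximation this class of methods relies on; it is not the Rao-Blackwellized filter of the paper, and all parameter values are assumptions.

```python
import numpy as np

def sv_loglik_particle_filter(y, phi=0.95, sigma_eta=0.2, n_particles=500, seed=0):
    """Bootstrap particle filter log-likelihood for
    h_t = phi*h_{t-1} + sigma_eta*eta_t,  y_t = exp(h_t/2)*eps_t."""
    rng = np.random.default_rng(seed)
    stat_sd = sigma_eta / np.sqrt(1.0 - phi**2)
    h = rng.normal(0.0, stat_sd, size=n_particles)     # initial particles
    loglik = 0.0
    for yt in y:
        h = phi * h + sigma_eta * rng.normal(size=n_particles)   # propagate
        # Observation weights: y_t | h ~ N(0, exp(h))
        logw = -0.5 * (np.log(2 * np.pi) + h + yt**2 * np.exp(-h))
        m = logw.max()
        w = np.exp(logw - m)
        loglik += m + np.log(w.mean())
        h = rng.choice(h, size=n_particles, p=w / w.sum())        # resample
    return loglik

# Simulate data from the same model and evaluate the approximate log-likelihood.
rng = np.random.default_rng(1)
T, phi_true, sig_true = 200, 0.95, 0.2
h = np.zeros(T)
for t in range(1, T):
    h[t] = phi_true * h[t - 1] + sig_true * rng.normal()
y = np.exp(h / 2) * rng.normal(size=T)
print("approximate log-likelihood:", sv_loglik_particle_filter(y))
```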

  16. Stochastic processes

    CERN Document Server

    Parzen, Emanuel

    1962-01-01

    Well-written and accessible, this classic introduction to stochastic processes and related mathematics is appropriate for advanced undergraduate students of mathematics with a knowledge of calculus and continuous probability theory. The treatment offers examples of the wide variety of empirical phenomena for which stochastic processes provide mathematical models, and it develops the methods of probability model-building.Chapter 1 presents precise definitions of the notions of a random variable and a stochastic process and introduces the Wiener and Poisson processes. Subsequent chapters examine

  17. Development of stochastic indicator models of lithology, Yucca Mountain, Nevada

    International Nuclear Information System (INIS)

    Rautman, C.A.; Robey, T.H.

    1994-01-01

    Indicator geostatistical techniques have been used to produce a number of fully three-dimensional stochastic simulations of large-scale lithologic categories at the Yucca Mountain site. Each realization reproduces the available drill hole data used to condition the simulation. Information is propagated away from each point of observation in accordance with a mathematical model of spatial continuity inferred through soft data taken from published geologic cross sections. Variations among the simulated models collectively represent uncertainty in the lithology at unsampled locations. These stochastic models succeed in capturing many major features of the welded-nonwelded lithologic framework of Yucca Mountain. However, contacts between welded and nonwelded rock types for individual simulations appear more complex than suggested by field observation, and a number of probable numerical artifacts exist in these models. Many of the apparent discrepancies between the simulated models and the general geology of Yucca Mountain represent characterization uncertainty, and can be traced to the sparse site data used to condition the simulations. Several vertical stratigraphic columns have been extracted from the three-dimensional stochastic models for use in simplified total-system performance assessment exercises. Simple, manual adjustments are required to eliminate the more obvious simulation artifacts and to impose a secondary set of deterministic geologic features on the overall stratigraphic framework provided by the indicator models.

  18. Modelling the stochastic behaviour of primary nucleation.

    Science.gov (United States)

    Maggioni, Giovanni Maria; Mazzotti, Marco

    2015-01-01

    We study the stochastic nature of primary nucleation and how it manifests itself in a crystallisation process at different scales and under different operating conditions. Such characteristics of nucleation are evident in many experiments where detection times of crystals are not identical, despite identical experimental conditions, but instead are distributed around an average value. While abundant experimental evidence has been reported in the literature, a clear theoretical understanding and an appropriate modelling of this feature is still missing. In this contribution, we present two models describing a batch cooling crystallisation, where the interplay between stochastic nucleation and deterministic crystal growth is described differently in each. The nucleation and growth rates of the two models are estimated by a comprehensive set of measurements of paracetamol crystallisation from aqueous solution in a 1 mL vessel [Kadam et al., Chemical Engineering Science, 2012, 72, 10-19]. Both models are applied to the cooling crystallisation process above under different operating conditions, i.e. different volumes, initial concentrations, cooling rates. The advantages and disadvantages of the two approaches are illustrated and discussed, with particular reference to their use across scales of nucleation rate measured in very small crystallisers.

  19. Persistence and extinction for stochastic logistic model with Levy noise and impulsive perturbation

    Directory of Open Access Journals (Sweden)

    Chun Lu

    2015-09-01

    Full Text Available This article investigates a stochastic logistic model with Levy noise and impulsive perturbation. In the model, the impulsive perturbation and Levy noise are taken into account simultaneously. This model is new, more feasible, and more in accordance with reality. The definition of a solution to a stochastic differential equation with Levy noise and impulsive perturbation is established. Based on this definition, we show that our model has a unique global positive solution and obtain its explicit expression. Sufficient conditions for extinction are established, as well as for nonpersistence in the mean, weak persistence and stochastic permanence. The threshold between weak persistence and extinction is obtained.

  20. A single model procedure for estimating tank calibration equations

    International Nuclear Information System (INIS)

    Liebetrau, A.M.

    1997-10-01

    A fundamental component of any accountability system for nuclear materials is a tank calibration equation that relates the height of liquid in a tank to its volume. Tank volume calibration equations are typically determined from pairs of height and volume measurements taken in a series of calibration runs. After raw calibration data are standardized to a fixed set of reference conditions, the calibration equation is typically fit by dividing the data into several segments--corresponding to regions in the tank--and independently fitting the data for each segment. The estimates obtained for individual segments must then be combined to obtain an estimate of the entire calibration function. This process is tedious and time-consuming. Moreover, uncertainty estimates may be misleading because it is difficult to properly model run-to-run variability and between-segment correlation. In this paper, the authors describe a model whose parameters can be estimated simultaneously for all segments of the calibration data, thereby eliminating the need for segment-by-segment estimation. The essence of the proposed model is to define a suitable polynomial to fit to each segment and then extend its definition to the domain of the entire calibration function, so that it (the entire calibration function) can be expressed as the sum of these extended polynomials. The model provides defensible estimates of between-run variability and yields a proper treatment of between-segment correlations. A portable software package, called TANCS, has been developed to facilitate the acquisition, standardization, and analysis of tank calibration data. The TANCS package was used for the calculations in an example presented to illustrate the unified modeling approach described in this paper. With TANCS, a trial calibration function can be estimated and evaluated in a matter of minutes

  1. Stochastic modeling of central apnea events in preterm infants

    International Nuclear Information System (INIS)

    Clark, Matthew T; Lake, Douglas E; Randall Moorman, J; Delos, John B; Lee, Hoshik; Fairchild, Karen D; Kattwinkel, John

    2016-01-01

    A near-ubiquitous pathology in very low birth weight infants is neonatal apnea, breathing pauses with slowing of the heart and falling blood oxygen. Events of substantial duration occasionally occur after an infant is discharged from the neonatal intensive care unit (NICU). It is not known whether apneas result from a predictable process or from a stochastic process, but the observation that they occur in seemingly random clusters justifies the use of stochastic models. We use a hidden-Markov model to analyze the distribution of durations of apneas and the distribution of times between apneas. The model suggests the presence of four breathing states, ranging from very stable (with an average lifetime of 12 h) to very unstable (with an average lifetime of 10 s). Although the states themselves are not visible, the mathematical analysis gives estimates of the transition rates among these states. We have obtained these transition rates, and shown how they change with post-menstrual age; as expected, the residence time in the more stable breathing states increases with age. We also extrapolated the model to predict the frequency of very prolonged apnea during the first year of life. This paradigm—stochastic modeling of cardiorespiratory control in neonatal infants to estimate risk for severe clinical events—may be a first step toward personalized risk assessment for life threatening apnea events after NICU discharge. (paper)

  2. Stochastic modeling of central apnea events in preterm infants.

    Science.gov (United States)

    Clark, Matthew T; Delos, John B; Lake, Douglas E; Lee, Hoshik; Fairchild, Karen D; Kattwinkel, John; Moorman, J Randall

    2016-04-01

    A near-ubiquitous pathology in very low birth weight infants is neonatal apnea, breathing pauses with slowing of the heart and falling blood oxygen. Events of substantial duration occasionally occur after an infant is discharged from the neonatal intensive care unit (NICU). It is not known whether apneas result from a predictable process or from a stochastic process, but the observation that they occur in seemingly random clusters justifies the use of stochastic models. We use a hidden-Markov model to analyze the distribution of durations of apneas and the distribution of times between apneas. The model suggests the presence of four breathing states, ranging from very stable (with an average lifetime of 12 h) to very unstable (with an average lifetime of 10 s). Although the states themselves are not visible, the mathematical analysis gives estimates of the transition rates among these states. We have obtained these transition rates, and shown how they change with post-menstrual age; as expected, the residence time in the more stable breathing states increases with age. We also extrapolated the model to predict the frequency of very prolonged apnea during the first year of life. This paradigm-stochastic modeling of cardiorespiratory control in neonatal infants to estimate risk for severe clinical events-may be a first step toward personalized risk assessment for life threatening apnea events after NICU discharge.

  3. Analysis of dynamic regimes in stochastically forced Kaldor model

    International Nuclear Information System (INIS)

    Bashkirtseva, Irina; Ryazanova, Tatyana; Ryashko, Lev

    2015-01-01

    We consider the business cycle Kaldor model forced by random noise. Detailed parametric analysis of deterministic system is carried out and zones of coexisting stable equilibrium and stable limit cycle are found. Noise-induced transitions between these attractors are studied using stochastic sensitivity function technique and confidence domains method. Critical values of noise intensity corresponding to noise-induced transitions “equilibrium → cycle” and “cycle → equilibrium” are estimated. Dominants in combined stochastic regimes are discussed.

  4. Calibration of the ALLEGRO resonant detector

    Energy Technology Data Exchange (ETDEWEB)

    McHugh, Martin P [Department of Physics, Loyola University, New Orleans, Louisiana 70118 (United States); Johnson, Warren W [Department of Physics and Astronomy, Louisiana State University, Baton Rouge, Louisiana 70803 (United States); Hamilton, William O [Department of Physics and Astronomy, Louisiana State University, Baton Rouge, Louisiana 70803 (United States); Hanson, Jonathan [Department of Physics and Astronomy, Louisiana State University, Baton Rouge, Louisiana 70803 (United States); Heng, Ik Siong [University of Glasgow, Glasgow G12 8QQ (United Kingdom); McNeese, Daniel [Department of Physics, Loyola University, New Orleans, Louisiana 70118 (United States); Miller, Phillip [Department of Physics and Astronomy, Louisiana State University, Baton Rouge, Louisiana 70803 (United States); Nettles, Damon [Department of Physics and Astronomy, Louisiana State University, Baton Rouge, Louisiana 70803 (United States); Weaver, Jordan [Department of Physics and Astronomy, Louisiana State University, Baton Rouge, Louisiana 70803 (United States); Zhang Ping [Department of Physics and Astronomy, Louisiana State University, Baton Rouge, Louisiana 70803 (United States)

    2005-09-21

    We describe a method for calibrating the ALLEGRO resonant detector. The resulting response function can be used to transform the observed data backwards to gravitational strain data. These data are the input to a cross-correlation analysis to search for stochastic gravitational waves.

  5. Calibration of the ALLEGRO resonant detector

    International Nuclear Information System (INIS)

    McHugh, Martin P; Johnson, Warren W; Hamilton, William O; Hanson, Jonathan; Heng, Ik Siong; McNeese, Daniel; Miller, Phillip; Nettles, Damon; Weaver, Jordan; Zhang Ping

    2005-01-01

    We describe a method for calibrating the ALLEGRO resonant detector. The resulting response function can be used to transform the observed data backwards to gravitational strain data. These data are the input to a cross-correlation analysis to search for stochastic gravitational waves

  6. Modeling animal movements using stochastic differential equations

    Science.gov (United States)

    Haiganoush K. Preisler; Alan A. Ager; Bruce K. Johnson; John G. Kie

    2004-01-01

    We describe the use of bivariate stochastic differential equations (SDE) for modeling movements of 216 radiocollared female Rocky Mountain elk at the Starkey Experimental Forest and Range in northeastern Oregon. Spatially and temporally explicit vector fields were estimated using approximating difference equations and nonparametric regression techniques. Estimated...

  7. Modeling reliability of power systems substations by using stochastic automata networks

    International Nuclear Information System (INIS)

    Šnipas, Mindaugas; Radziukynas, Virginijus; Valakevičius, Eimutis

    2017-01-01

    In this paper, the stochastic automata networks (SAN) formalism is applied to model the reliability of power system substations. The proposed strategy allows reducing the size of the state space of the Markov chain model and simplifying system specification. Two case studies of standard substation configurations are considered in detail. SAN models with different assumptions were created. The SAN approach is compared with an exact reliability calculation using a minimal path set method. Modeling results showed that total independence of automata can be assumed for relatively small power system substations with reliable equipment. In this case, the implementation of a Markov chain model using the SAN method is a relatively easy task. - Highlights: • We present a methodology for applying the stochastic automata network formalism to create Markov chain models of power systems. • The stochastic automata network approach is combined with minimal path sets and structural functions. • Two models of substation configurations with different model assumptions are presented to illustrate the proposed methodology. • Modeling results of the system with independent automata and functional transition rates are similar. • The conditions under which total independence of automata can be assumed are addressed.
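
    The exact reliability calculation from minimal path sets mentioned above can be illustrated with a short inclusion-exclusion sketch for independent components. The component reliabilities and path sets below describe a made-up two-path configuration, not either of the substation case studies in the paper.

```python
from itertools import combinations

def system_reliability(path_sets, component_reliability):
    """Exact reliability of a system of independent components from its minimal
    path sets, via inclusion-exclusion over unions of path sets."""
    total = 0.0
    for k in range(1, len(path_sets) + 1):
        for subset in combinations(path_sets, k):
            union = set().union(*subset)
            prob = 1.0
            for comp in union:
                prob *= component_reliability[comp]
            total += (-1) ** (k + 1) * prob
    return total

# Hypothetical substation-like layout: two redundant supply paths sharing a busbar.
paths = [{"line1", "breaker1", "busbar"}, {"line2", "breaker2", "busbar"}]
rel = {"line1": 0.98, "line2": 0.98, "breaker1": 0.995,
       "breaker2": 0.995, "busbar": 0.999}
print(f"system reliability: {system_reliability(paths, rel):.6f}")
```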

  8. Development of a Stochastically-driven, Forward Predictive Performance Model for PEMFCs

    Science.gov (United States)

    Harvey, David Benjamin Paul

    A one-dimensional multi-scale coupled, transient, and mechanistic performance model for a PEMFC membrane electrode assembly has been developed. The model explicitly includes each of the 5 layers within a membrane electrode assembly and solves for the transport of charge, heat, mass, species, dissolved water, and liquid water. Key features of the model include the use of a multi-step implementation of the HOR reaction on the anode, agglomerate catalyst sub-models for both the anode and cathode catalyst layers, a unique approach that links the composition of the catalyst layer to key properties within the agglomerate model and the implementation of a stochastic input-based approach for component material properties. The model employs a new methodology for validation using statistically varying input parameters and statistically-based experimental performance data; this model represents the first stochastic input driven unit cell performance model. The stochastic input driven performance model was used to identify optimal ionomer content within the cathode catalyst layer, demonstrate the role of material variation in potential low performing MEA materials, provide explanation for the performance of low-Pt loaded MEAs, and investigate the validity of transient-sweep experimental diagnostic methods.

  9. Information-theoretic model selection for optimal prediction of stochastic dynamical systems from data

    Science.gov (United States)

    Darmon, David

    2018-03-01

    In the absence of mechanistic or phenomenological models of real-world systems, data-driven models become necessary. The discovery of various embedding theorems in the 1980s and 1990s motivated a powerful set of tools for analyzing deterministic dynamical systems via delay-coordinate embeddings of observations of their component states. However, in many branches of science, the condition of operational determinism is not satisfied, and stochastic models must be brought to bear. For such stochastic models, the tool set developed for delay-coordinate embedding is no longer appropriate, and a new toolkit must be developed. We present an information-theoretic criterion, the negative log-predictive likelihood, for selecting the embedding dimension for a predictively optimal data-driven model of a stochastic dynamical system. We develop a nonparametric estimator for the negative log-predictive likelihood and compare its performance to a recently proposed criterion based on active information storage. Finally, we show how the output of the model selection procedure can be used to compare candidate predictors for a stochastic system to an information-theoretic lower bound.

  10. Toward Development of a Stochastic Wake Model: Validation Using LES and Turbine Loads

    Directory of Open Access Journals (Sweden)

    Jae Sang Moon

    2017-12-01

    Full Text Available Wind turbines within an array do not experience free-stream undisturbed flow fields. Rather, the flow fields on internal turbines are influenced by wakes generated by upwind units and exhibit different dynamic characteristics relative to the free stream. The International Electrotechnical Commission (IEC) standard 61400-1 for the design of wind turbines only considers a deterministic wake model for the design of a wind plant. This study is focused on the development of a stochastic model for waked wind fields. First, high-fidelity physics-based waked wind velocity fields are generated using Large-Eddy Simulation (LES). Stochastic characteristics of these LES waked wind velocity fields, including mean and turbulence components, are analyzed. Wake-related mean and turbulence field parameters are then estimated for use with a stochastic model, using Multivariate Multiple Linear Regression (MMLR) with the LES data. To validate the simulated wind fields based on the stochastic model, wind turbine tower and blade loads are generated using aeroelastic simulation for utility-scale wind turbine models and compared with those based directly on the LES inflow. The study's overall objective is to offer efficient and validated stochastic approaches that are computationally tractable for assessing the performance and loads of turbines operating in wakes.

  11. Influence of rainfall observation network on model calibration and application

    Directory of Open Access Journals (Sweden)

    A. Bárdossy

    2008-01-01

    Full Text Available The objective of this study is to investigate the influence of the spatial resolution of the rainfall input on model calibration and application. The analysis is carried out by varying the distribution of the raingauge network. A meso-scale catchment located in southwest Germany has been selected for this study. First, the semi-distributed HBV model is calibrated with the precipitation interpolated from the available observed rainfall of the different raingauge networks. An automatic calibration method based on the combinatorial optimization algorithm simulated annealing is applied. The performance of the hydrological model is analyzed as a function of the raingauge density. Secondly, the calibrated model is validated using interpolated precipitation from the same raingauge density used for the calibration as well as interpolated precipitation based on networks of reduced and increased raingauge density. Lastly, the effect of missing rainfall data is investigated by using a multiple linear regression approach for filling in the missing measurements. The model, calibrated with the complete set of observed data, is then run in the validation period using the precipitation fields described above. The simulated hydrographs obtained in these three sets of experiments are analyzed through comparisons of the computed Nash-Sutcliffe coefficient and several goodness-of-fit indexes. The results show that a model using different raingauge networks might need re-calibration of the model parameters: specifically, a model calibrated on relatively sparse precipitation information might perform well with dense precipitation information, while a model calibrated on dense precipitation information may fail with sparse precipitation information. Also, the model calibrated with the complete set of observed precipitation and run with incomplete observed data associated with the data estimated using multiple linear regressions, at the locations treated as
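
    Since model performance in this study is summarized by the Nash-Sutcliffe coefficient, a short implementation is included here for reference; the observed and simulated discharge values are invented for illustration.

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 - sum of squared errors / total variance of observations."""
    obs, sim = np.asarray(observed, float), np.asarray(simulated, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

observed_q = [12.0, 15.5, 30.2, 55.1, 40.3, 22.4, 18.0, 14.2]   # discharge, m^3/s (hypothetical)
simulated_q = [11.0, 17.0, 28.5, 50.0, 44.0, 25.0, 17.5, 13.0]
print(f"NSE = {nash_sutcliffe(observed_q, simulated_q):.3f}")
```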

  12. Stochastic modeling of thermal fatigue crack growth

    CERN Document Server

    Radu, Vasile

    2015-01-01

    The book describes a systematic stochastic modeling approach for assessing thermal-fatigue crack-growth in mixing tees, based on the power spectral density of temperature fluctuation at the inner pipe surface. It shows the development of a frequency-temperature response function in the framework of single-input, single-output (SISO) methodology from random noise/signal theory under sinusoidal input. The frequency response of stress intensity factor (SIF) is obtained by a polynomial fitting procedure of thermal stress profiles at various instants of time. The method, which takes into account the variability of material properties, and has been implemented in a real-world application, estimates the probabilities of failure by considering a limit state function and Monte Carlo analysis, which are based on the proposed stochastic model. Written in a comprehensive and accessible style, this book presents a new and effective method for assessing thermal fatigue crack, and it is intended as a concise and practice-or...

  13. Fuzzy Stochastic Optimization Theory, Models and Applications

    CERN Document Server

    Wang, Shuming

    2012-01-01

    Covering in detail both theoretical and practical perspectives, this book is a self-contained and systematic depiction of current fuzzy stochastic optimization that deploys the fuzzy random variable as a core mathematical tool to model the integrated fuzzy random uncertainty. It proceeds in an orderly fashion from the requisite theoretical aspects of the fuzzy random variable to fuzzy stochastic optimization models and their real-life case studies.   The volume reflects the fact that randomness and fuzziness (or vagueness) are two major sources of uncertainty in the real world, with significant implications in a number of settings. In industrial engineering, management and economics, the chances are high that decision makers will be confronted with information that is simultaneously probabilistically uncertain and fuzzily imprecise, and optimization in the form of a decision must be made in an environment that is doubly uncertain, characterized by a co-occurrence of randomness and fuzziness. This book begins...

  14. Powering stochastic reliability models by discrete event simulation

    DEFF Research Database (Denmark)

    Kozine, Igor; Wang, Xiaoyun

    2012-01-01

    it difficult to find a solution to the problem. The power of modern computers and recent developments in discrete-event simulation (DES) software make it possible to diminish some of the drawbacks of stochastic models. In this paper we describe the insights we have gained based on using both Markov and DES models...

  15. Portfolio Optimization with Stochastic Dividends and Stochastic Volatility

    Science.gov (United States)

    Varga, Katherine Yvonne

    2015-01-01

    We consider an optimal investment-consumption portfolio optimization model in which an investor receives stochastic dividends. As a first problem, we allow the drift of stock price to be a bounded function. Next, we consider a stochastic volatility model. In each problem, we use the dynamic programming method to derive the Hamilton-Jacobi-Bellman…

  16. Response spectrum analysis of a stochastic seismic model

    International Nuclear Information System (INIS)

    Kimura, Koji; Sakata, Masaru; Takemoto, Shinichiro.

    1990-01-01

    The stochastic response spectrum approach is presented for predicting the dynamic behavior of structures under earthquake excitation expressed by a random process, one of whose sample functions can be regarded as a recorded strong-motion earthquake accelerogram. The approach consists of modeling recorded ground motion by a random process and the root-mean-square (rms) response analysis of a single-degree-of-freedom system using the moment equations method. The stochastic response spectrum is obtained as a plot of the maximum rms response versus the natural period of the system and is compared with the conventional response spectrum. (author)

  17. Discrete stochastic analogs of Erlang epidemic models.

    Science.gov (United States)

    Getz, Wayne M; Dougherty, Eric R

    2018-12-01

    Erlang differential equation models of epidemic processes provide more realistic disease-class transition dynamics from susceptible (S) to exposed (E) to infectious (I) and removed (R) categories than the ubiquitous SEIR model. The latter is itself at one end of the spectrum of Erlang SE^m I^n R models with m concatenated E compartments and n concatenated I compartments. Discrete-time models, however, are computationally much simpler to simulate and fit to epidemic outbreak data than continuous-time differential equations, and are also much more readily extended to include demographic and other types of stochasticity. Here we formulate discrete-time deterministic analogs of the Erlang models, and their stochastic extension, based on a time-to-go distributional principle. Depending on which distributions are used (e.g. discretized Erlang, Gamma, Beta, or Uniform distributions), we demonstrate that our formulation represents both a discretization of Erlang epidemic models and generalizations thereof. We consider the challenges of fitting SE^m I^n R models and our discrete-time analog to data (the recent outbreak of Ebola in Liberia). We demonstrate that the latter performs much better than the former; confining fits to strict SEIR formulations reduces the numerical challenges, but sacrifices best-fit likelihood scores by at least 7%.
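
    The general idea of a discrete-time stochastic Erlang-type epidemic model can be pictured with a short sketch in which Erlang dwell times arise from chained sub-compartments and transitions are binomial draws. This is an illustrative discretization under assumed parameter values, not the authors' time-to-go formulation.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Minimal discrete-time stochastic SE^m I^n R sketch (illustrative only):
    # Erlang dwell times arise from m (resp. n) chained E (resp. I) sub-stages,
    # each left with a fixed per-step probability; transitions are binomial draws.
    m, n = 3, 2                            # numbers of E and I sub-stages (assumed)
    beta, sigma, gamma = 0.4, 0.25, 0.2    # transmission, E-exit, I-exit rates per day (assumed)
    dt, T, N = 0.5, 200, 10_000

    S, E, I, R = N - 10, np.zeros(m, int), np.zeros(n, int), 0
    I[0] = 10
    for _ in range(T):
        lam = beta * I.sum() / N                                 # force of infection
        new_inf = rng.binomial(S, 1 - np.exp(-lam * dt))
        out_E = rng.binomial(E, 1 - np.exp(-m * sigma * dt))     # exits from each E sub-stage
        out_I = rng.binomial(I, 1 - np.exp(-n * gamma * dt))     # exits from each I sub-stage
        S -= new_inf
        E += np.concatenate(([new_inf], out_E[:-1])) - out_E     # shift along the E chain
        I += np.concatenate(([out_E[-1]], out_I[:-1])) - out_I   # shift along the I chain
        R += out_I[-1]

    print("final size:", R)
    ```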

  18. Stochastic modeling of oligodendrocyte generation in cell culture: model validation with time-lapse data

    Directory of Open Access Journals (Sweden)

    Noble Mark

    2006-05-01

    Full Text Available Abstract Background The purpose of this paper is two-fold. The first objective is to validate the assumptions behind a stochastic model developed earlier by these authors to describe oligodendrocyte generation in cell culture. The second is to generate time-lapse data that may help biomathematicians to build stochastic models of cell proliferation and differentiation under other experimental scenarios. Results Using time-lapse video recording it is possible to follow the individual evolutions of different cells within each clone. This experimental technique is very laborious and cannot replace model-based quantitative inference from clonal data. However, it is unrivalled in validating the structure of a stochastic model intended to describe cell proliferation and differentiation at the clonal level. In this paper, such data are reported and analyzed for oligodendrocyte precursor cells cultured in vitro. Conclusion The results strongly support the validity of the most basic assumptions underpinning the previously proposed model of oligodendrocyte development in cell culture. However, there are some discrepancies; the most important is that the contribution of progenitor cell death to cell kinetics in this experimental system has been underestimated.

  19. Bond and CDS Pricing via the Stochastic Recovery Black-Cox Model

    Directory of Open Access Journals (Sweden)

    Albert Cohen

    2017-04-01

    Full Text Available Building on recent work incorporating recovery risk into structural models by Cohen & Costanzino (2015), we consider the Black-Cox model with an added recovery risk driver. The recovery risk driver arises naturally in the context of imperfect information implicit in the structural framework. This leads to a two-factor structural model we call the Stochastic Recovery Black-Cox model, whereby the asset risk driver At defines the default trigger and the recovery risk driver Rt defines the amount recovered in the event of default. We then price zero-coupon bonds and credit default swaps under the Stochastic Recovery Black-Cox model. Finally, we compare our results with the classic Black-Cox model, give explicit expressions for the recovery risk premium in the Stochastic Recovery Black-Cox model, and detail how the introduction of separate but correlated risk drivers leads to a decoupling of the default and recovery risk premiums in the credit spread. We conclude this work by computing the effect of adding coupons that are paid continuously until default, and price perpetual (consol) bonds in our two-factor firm value model, extending calculations in the seminal paper by Leland (1994).

  20. Stochastic modeling of pitting corrosion: A new model for initiation and growth of multiple corrosion pits

    International Nuclear Information System (INIS)

    Valor, A.; Caleyo, F.; Alfonso, L.; Rivas, D.; Hallen, J.M.

    2007-01-01

    In this work, a new stochastic model capable of simulating pitting corrosion is developed and validated. Pitting corrosion is modeled as the combination of two stochastic processes: pit initiation and pit growth. Pit generation is modeled as a nonhomogeneous Poisson process, in which induction time for pit initiation is simulated as the realization of a Weibull process. In this way, the exponential and Weibull distributions can be considered as the possible distributions for pit initiation time. Pit growth is simulated using a nonhomogeneous Markov process. Extreme value statistics is used to find the distribution of maximum pit depths resulting from the combination of the initiation and growth processes for multiple pits. The proposed model is validated using several published experiments on pitting corrosion. It is capable of reproducing the experimental observations with higher quality than the stochastic models available in the literature for pitting corrosion
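
    The initiation-growth-extremes pipeline described above can be illustrated with a small Monte Carlo sketch. Here pit initiation times are Weibull distributed, and a simple deterministic power-law growth curve stands in for the paper's nonhomogeneous Markov growth process; all parameter values are assumptions chosen only for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Illustrative Monte Carlo sketch of the initiation + growth + extremes pipeline.
    n_specimens, pits_per_specimen = 1000, 20
    t_obs = 5.0                              # exposure time, years (assumed)
    shape, scale = 1.5, 2.0                  # Weibull parameters for initiation time (assumed)
    k, nu = 0.8, 0.4                         # growth law d(t) = k * (t - t_init)**nu (assumed)

    t_init = scale * rng.weibull(shape, size=(n_specimens, pits_per_specimen))
    active = t_init < t_obs                  # only pits initiated before t_obs have grown
    depth = np.where(active, k * np.clip(t_obs - t_init, 0, None) ** nu, 0.0)
    max_depth = depth.max(axis=1)            # deepest pit per specimen

    # The distribution of max_depth is what extreme-value statistics (e.g. a Gumbel
    # fit) would be applied to in the validation step.
    print("mean / 99th percentile of max pit depth:",
          max_depth.mean().round(3), np.quantile(max_depth, 0.99).round(3))
    ```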

  1. Stochastic modeling of pitting corrosion: A new model for initiation and growth of multiple corrosion pits

    Energy Technology Data Exchange (ETDEWEB)

    Valor, A. [Facultad de Fisica, Universidad de La Habana, San Lazaro y L, Vedado, 10400 Havana (Cuba); Caleyo, F. [Departamento de Ingenieria, Metalurgica, IPN-ESIQIE, UPALM Edif. 7, Zacatenco, Mexico DF 07738 (Mexico)]. E-mail: fcaleyo@gmail.com; Alfonso, L. [Departamento de Ingenieria, Metalurgica, IPN-ESIQIE, UPALM Edif. 7, Zacatenco, Mexico DF 07738 (Mexico); Rivas, D. [Departamento de Ingenieria, Metalurgica, IPN-ESIQIE, UPALM Edif. 7, Zacatenco, Mexico DF 07738 (Mexico); Hallen, J.M. [Departamento de Ingenieria, Metalurgica, IPN-ESIQIE, UPALM Edif. 7, Zacatenco, Mexico DF 07738 (Mexico)

    2007-02-15

    In this work, a new stochastic model capable of simulating pitting corrosion is developed and validated. Pitting corrosion is modeled as the combination of two stochastic processes: pit initiation and pit growth. Pit generation is modeled as a nonhomogeneous Poisson process, in which induction time for pit initiation is simulated as the realization of a Weibull process. In this way, the exponential and Weibull distributions can be considered as the possible distributions for pit initiation time. Pit growth is simulated using a nonhomogeneous Markov process. Extreme value statistics is used to find the distribution of maximum pit depths resulting from the combination of the initiation and growth processes for multiple pits. The proposed model is validated using several published experiments on pitting corrosion. It is capable of reproducing the experimental observations with higher quality than the stochastic models available in the literature for pitting corrosion.

  2. GillesPy: A Python Package for Stochastic Model Building and Simulation.

    Science.gov (United States)

    Abel, John H; Drawert, Brian; Hellander, Andreas; Petzold, Linda R

    2016-09-01

    GillesPy is an open-source Python package for model construction and simulation of stochastic biochemical systems. GillesPy consists of a Python framework for model building and an interface to the StochKit2 suite of efficient simulation algorithms based on the Gillespie stochastic simulation algorithms (SSA). To enable intuitive model construction and seamless integration into the scientific Python stack, we present an easy to understand, action-oriented programming interface. Here, we describe the components of this package and provide a detailed example relevant to the computational biology community.
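
    Rather than reproducing the GillesPy API itself, the self-contained sketch below implements the Gillespie direct method (SSA) for a simple birth-death process, i.e. the family of algorithms that StochKit2 and GillesPy expose.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Gillespie direct-method (SSA) sketch for a birth-death process:
    #   0 --k_birth--> X,   X --k_death--> 0
    k_birth, k_death = 10.0, 0.1
    x, t, t_end = 0, 0.0, 100.0
    times, states = [t], [x]

    while t < t_end:
        a = np.array([k_birth, k_death * x])   # reaction propensities
        a0 = a.sum()
        if a0 == 0:
            break
        t += rng.exponential(1.0 / a0)         # time to next reaction
        r = rng.choice(2, p=a / a0)            # which reaction fires
        x += 1 if r == 0 else -1
        times.append(t); states.append(x)

    print("final time, copy number:", round(times[-1], 2), states[-1])
    ```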

  3. Modeling and Prediction Using Stochastic Differential Equations

    DEFF Research Database (Denmark)

    Juhl, Rune; Møller, Jan Kloppenborg; Jørgensen, John Bagterp

    2016-01-01

    Pharmacokinetic/pharmacodynamic (PK/PD) modeling for a single subject is most often performed using nonlinear models based on deterministic ordinary differential equations (ODEs), and the variation between subjects in a population of subjects is described using a population (mixed effects) setup...... deterministic and can predict the future perfectly. A more realistic approach would be to allow for randomness in the model due to, e.g., the model being too simple or errors in input. We describe a modeling and prediction setup which better reflects reality and suggests stochastic differential equations (SDEs...

  4. Levy-Student processes for a stochastic model of beam halos

    Energy Technology Data Exchange (ETDEWEB)

    Petroni, N. Cufaro [Department of Mathematics, University of Bari, and INFN Sezione di Bari, via E. Orabona 4, 70125 Bari (Italy)]. E-mail: cufaro@ba.infn.it; De Martino, S. [Department of Physics, University of Salerno, and INFN Sezione di Napoli (gruppo di Salerno), Via S. Allende, I-84081 Baronissi (SA) (Italy); De Siena, S. [Department of Physics, University of Salerno, and INFN Sezione di Napoli (gruppo di Salerno), Via S. Allende, I-84081 Baronissi (SA) (Italy); Illuminati, F. [Department of Physics, University of Salerno, and INFN Sezione di Napoli (gruppo di Salerno), Via S. Allende, I-84081 Baronissi (SA) (Italy)

    2006-06-01

    We describe the transverse beam distribution in particle accelerators within the controlled, stochastic dynamical scheme of the stochastic mechanics which produces time reversal invariant diffusion processes. In this paper we analyze the consequences of introducing the generalized Student laws, namely non-Gaussian, Levy infinitely divisible (but not stable) distributions. We will analyze this idea from two different standpoints: (a) first by supposing that the stationary distribution of our (Wiener powered) stochastic model is a Student distribution; (b) by supposing that our model is based on a (non-Gaussian) Levy process whose increments are Student distributed. In the case (a) the longer tails of the power decay of the Student laws, and in the case (b) the discontinuities of the Levy-Student process can well account for the rare escape of particles from the beam core, and hence for the formation of a halo in intense beams.

  5. Levy-Student processes for a stochastic model of beam halos

    International Nuclear Information System (INIS)

    Petroni, N. Cufaro; De Martino, S.; De Siena, S.; Illuminati, F.

    2006-01-01

    We describe the transverse beam distribution in particle accelerators within the controlled, stochastic dynamical scheme of the stochastic mechanics which produces time reversal invariant diffusion processes. In this paper we analyze the consequences of introducing the generalized Student laws, namely non-Gaussian, Levy infinitely divisible (but not stable) distributions. We will analyze this idea from two different standpoints: (a) first by supposing that the stationary distribution of our (Wiener powered) stochastic model is a Student distribution; (b) by supposing that our model is based on a (non-Gaussian) Levy process whose increments are Student distributed. In the case (a) the longer tails of the power decay of the Student laws, and in the case (b) the discontinuities of the Levy-Student process can well account for the rare escape of particles from the beam core, and hence for the formation of a halo in intense beams

  6. Assessing Exhaustiveness of Stochastic Sampling for Integrative Modeling of Macromolecular Structures.

    Science.gov (United States)

    Viswanath, Shruthi; Chemmama, Ilan E; Cimermancic, Peter; Sali, Andrej

    2017-12-05

    Modeling of macromolecular structures involves structural sampling guided by a scoring function, resulting in an ensemble of good-scoring models. By necessity, the sampling is often stochastic, and must be exhaustive at a precision sufficient for accurate modeling and assessment of model uncertainty. Therefore, the very first step in analyzing the ensemble is an estimation of the highest precision at which the sampling is exhaustive. Here, we present an objective and automated method for this task. As a proxy for sampling exhaustiveness, we evaluate whether two independently and stochastically generated sets of models are sufficiently similar. The protocol includes testing 1) convergence of the model score, 2) whether model scores for the two samples were drawn from the same parent distribution, 3) whether each structural cluster includes models from each sample proportionally to its size, and 4) whether there is sufficient structural similarity between the two model samples in each cluster. The evaluation also provides the sampling precision, defined as the smallest clustering threshold that satisfies the third, most stringent test. We validate the protocol with the aid of enumerated good-scoring models for five illustrative cases of binary protein complexes. Passing the proposed four tests is necessary, but not sufficient for thorough sampling. The protocol is general in nature and can be applied to the stochastic sampling of any set of models, not just structural models. In addition, the tests can be used to stop stochastic sampling as soon as exhaustiveness at desired precision is reached, thereby improving sampling efficiency; they may also help in selecting a model representation that is sufficiently detailed to be informative, yet also sufficiently coarse for sampling to be exhaustive. Copyright © 2017 Biophysical Society. Published by Elsevier Inc. All rights reserved.
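
    One of the four tests, whether the scores of the two independently generated samples come from the same parent distribution, is commonly checked with a two-sample Kolmogorov-Smirnov test; a minimal sketch follows, with synthetic placeholder scores standing in for real model scores.

    ```python
    import numpy as np
    from scipy.stats import ks_2samp

    rng = np.random.default_rng(4)

    # Sketch of the "same parent distribution" test: compare model scores of two
    # independently generated samples (placeholder values below).
    scores_sample_1 = rng.normal(loc=-120.0, scale=5.0, size=500)
    scores_sample_2 = rng.normal(loc=-120.5, scale=5.2, size=500)

    stat, p_value = ks_2samp(scores_sample_1, scores_sample_2)
    print(f"KS statistic = {stat:.3f}, p-value = {p_value:.3f}")
    # A small KS statistic / non-significant p-value is consistent with (though it
    # does not prove) the two samples sharing a parent distribution.
    ```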

  7. Stochastic programming framework for Lithuanian pension payout modelling

    Directory of Open Access Journals (Sweden)

    Audrius Kabašinskas

    2014-12-01

    Full Text Available The paper provides a scientific approach to the problem of selecting a pension fund by taking into account some specific characteristics of the Lithuanian Republic (LR) pension accumulation system. The decision-making model, which can be used to plan the long-term pension accrual of Lithuanian Republic (LR) citizens in an optimal way, is presented. This model focuses on factors that influence the sustainability of the pension system selection under macroeconomic, social and demographic uncertainty. The model is formalized as a single-stage stochastic optimization problem where the long-term optimal strategy can be obtained based on the possible scenarios generated for a particular participant. Stochastic programming methods allow including the pension fund rebalancing moment and direction of investment, and taking into account possible changes of personal income, changes of society and the global financial market. The collection of methods used to generate scenario trees was found useful to solve strategic planning problems.

  8. CALIBRATION OF SEMI-ANALYTIC MODELS OF GALAXY FORMATION USING PARTICLE SWARM OPTIMIZATION

    International Nuclear Information System (INIS)

    Ruiz, Andrés N.; Domínguez, Mariano J.; Yaryura, Yamila; Lambas, Diego García; Cora, Sofía A.; Martínez, Cristian A. Vega-; Gargiulo, Ignacio D.; Padilla, Nelson D.; Tecce, Tomás E.; Orsi, Álvaro; Arancibia, Alejandra M. Muñoz

    2015-01-01

    We present a fast and accurate method to select an optimal set of parameters in semi-analytic models of galaxy formation and evolution (SAMs). Our approach compares the results of a model against a set of observables applying a stochastic technique called Particle Swarm Optimization (PSO), a self-learning algorithm for localizing regions of maximum likelihood in multidimensional spaces that outperforms traditional sampling methods in terms of computational cost. We apply the PSO technique to the SAG semi-analytic model combined with merger trees extracted from a standard Lambda Cold Dark Matter N-body simulation. The calibration is performed using a combination of observed galaxy properties as constraints, including the local stellar mass function and the black hole to bulge mass relation. We test the ability of the PSO algorithm to find the best set of free parameters of the model by comparing the results with those obtained using a MCMC exploration. Both methods find the same maximum likelihood region, however, the PSO method requires one order of magnitude fewer evaluations. This new approach allows a fast estimation of the best-fitting parameter set in multidimensional spaces, providing a practical tool to test the consequences of including other astrophysical processes in SAMs
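
    A minimal particle swarm optimizer is sketched below with a placeholder quadratic objective; in the calibration described above, the objective would instead compare SAM predictions against the observed stellar mass function and the black hole to bulge mass relation. The inertia and acceleration coefficients are typical textbook defaults, not the values used in the study.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    def objective(theta):
        # Placeholder objective: in the calibration above this would be a likelihood
        # comparing model outputs against observed galaxy properties.
        return np.sum((theta - np.array([0.3, -1.2, 2.0])) ** 2, axis=-1)

    # Minimal particle swarm optimization (PSO) sketch.
    n_particles, n_dims, n_iter = 30, 3, 200
    w, c1, c2 = 0.7, 1.5, 1.5                    # inertia and acceleration coefficients
    x = rng.uniform(-5, 5, size=(n_particles, n_dims))
    v = np.zeros_like(x)
    pbest, pbest_val = x.copy(), objective(x)
    gbest = pbest[np.argmin(pbest_val)]

    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, n_dims))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        val = objective(x)
        improved = val < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], val[improved]
        gbest = pbest[np.argmin(pbest_val)]

    print("best parameters found:", gbest.round(3))
    ```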

  9. CALIBRATION OF SEMI-ANALYTIC MODELS OF GALAXY FORMATION USING PARTICLE SWARM OPTIMIZATION

    Energy Technology Data Exchange (ETDEWEB)

    Ruiz, Andrés N.; Domínguez, Mariano J.; Yaryura, Yamila; Lambas, Diego García [Instituto de Astronomía Teórica y Experimental, CONICET-UNC, Laprida 854, X5000BGR, Córdoba (Argentina); Cora, Sofía A.; Martínez, Cristian A. Vega-; Gargiulo, Ignacio D. [Consejo Nacional de Investigaciones Científicas y Técnicas, Rivadavia 1917, C1033AAJ Buenos Aires (Argentina); Padilla, Nelson D.; Tecce, Tomás E.; Orsi, Álvaro; Arancibia, Alejandra M. Muñoz, E-mail: andresnicolas@oac.uncor.edu [Instituto de Astrofísica, Pontificia Universidad Católica de Chile, Av. Vicuña Mackenna 4860, Santiago (Chile)

    2015-03-10

    We present a fast and accurate method to select an optimal set of parameters in semi-analytic models of galaxy formation and evolution (SAMs). Our approach compares the results of a model against a set of observables applying a stochastic technique called Particle Swarm Optimization (PSO), a self-learning algorithm for localizing regions of maximum likelihood in multidimensional spaces that outperforms traditional sampling methods in terms of computational cost. We apply the PSO technique to the SAG semi-analytic model combined with merger trees extracted from a standard Lambda Cold Dark Matter N-body simulation. The calibration is performed using a combination of observed galaxy properties as constraints, including the local stellar mass function and the black hole to bulge mass relation. We test the ability of the PSO algorithm to find the best set of free parameters of the model by comparing the results with those obtained using a MCMC exploration. Both methods find the same maximum likelihood region, however, the PSO method requires one order of magnitude fewer evaluations. This new approach allows a fast estimation of the best-fitting parameter set in multidimensional spaces, providing a practical tool to test the consequences of including other astrophysical processes in SAMs.

  10. The threshold of a stochastic avian-human influenza epidemic model with psychological effect

    Science.gov (United States)

    Zhang, Fengrong; Zhang, Xinhong

    2018-02-01

    In this paper, a stochastic avian-human influenza epidemic model with a psychological effect in the human population and a saturation effect within the avian population is investigated. This model describes the transmission of avian influenza among the avian population and the human population in random environments. For the stochastic avian-only system, persistence in the mean and extinction of the infected avian population are studied. For the avian-human influenza epidemic system, sufficient conditions for the existence of an ergodic stationary distribution are obtained. Furthermore, a threshold of this stochastic model which determines the outcome of the disease is obtained. Finally, numerical simulations are given to support the theoretical results.

  11. Deterministic and stochastic trends in the Lee-Carter mortality model

    DEFF Research Database (Denmark)

    Callot, Laurent; Haldrup, Niels; Kallestrup-Lamb, Malene

    The Lee and Carter (1992) model assumes that the deterministic and stochastic time series dynamics load with identical weights when describing the development of age-specific mortality rates. Effectively this means that the main characteristics of the model simplify to a random walk model...... that characterizes mortality data. We find empirical evidence that this feature of the Lee-Carter model overly restricts the system dynamics and we suggest separating the deterministic and stochastic time series components to the benefit of improved fit and forecasting performance. In fact, we find...... that the classical Lee-Carter model will otherwise overestimate the reduction of mortality for the younger age groups and will underestimate the reduction of mortality for the older age groups. In practice, our recommendation means that the Lee-Carter model instead of a one-factor model should be formulated...
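
    For reference, the Lee-Carter specification discussed here has the standard form below (this is the textbook formulation, not notation taken from the abstract):

    ```latex
    \ln m_{x,t} = a_x + b_x \, k_t + \varepsilon_{x,t},
    \qquad
    k_t = k_{t-1} + \mu + \eta_t ,
    ```

    where a_x is the age profile, b_x the age-specific loading on the period index k_t, and k_t is conventionally forecast as a random walk with drift mu. The abstract's point is that tying the deterministic and stochastic dynamics to the same loadings b_x is overly restrictive.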

  12. Modeling stochastic frontier based on vine copulas

    Science.gov (United States)

    Constantino, Michel; Candido, Osvaldo; Tabak, Benjamin M.; da Costa, Reginaldo Brito

    2017-11-01

    This article models a production function and analyzes the technical efficiency of listed companies in the United States, Germany and England between 2005 and 2012 based on the vine copula approach. Traditional estimates of the stochastic frontier assume that the data are multivariate normally distributed and that there is no source of asymmetry. The proposed method based on vine copulas allows us to explore different types of asymmetry and multivariate distributions. Using data on product, capital and labor, we measure the relative efficiency of the vine production function and estimate the coefficient used in the stochastic frontier literature for comparison purposes. This production vine copula predicts the value added by firms with given capital and labor in a probabilistic way. It thereby stands in sharp contrast to the production function, where the output of firms is completely deterministic. The results show that, on average, S&P500 companies are more efficient than companies listed in England and Germany, which presented similar average efficiency coefficients. For comparative purposes, the traditional stochastic frontier was estimated and the results showed discrepancies between the coefficients obtained by the application of the two methods, traditional and frontier-vine, opening new paths for non-linear research.

  13. Calibration of a Land Subsidence Model Using InSAR Data via the Ensemble Kalman Filter.

    Science.gov (United States)

    Li, Liangping; Zhang, Meijing; Katzenstein, Kurt

    2017-11-01

    The application of interferometric synthetic aperture radar (InSAR) has been increasingly used to improve capabilities to model land subsidence in hydrogeologic studies. A number of investigations over the last decade show how spatially detailed time-lapse images of ground displacements could be utilized to advance our understanding for better predictions. In this work, we use simulated land subsidences as observed measurements, mimicking InSAR data to inversely infer inelastic specific storage in a stochastic framework. The inelastic specific storage is assumed as a random variable and modeled using a geostatistical method such that the detailed variations in space could be represented and also that the uncertainties of both characterization of specific storage and prediction of land subsidence can be assessed. The ensemble Kalman filter (EnKF), a real-time data assimilation algorithm, is used to inversely calibrate a land subsidence model by matching simulated subsidences with InSAR data. The performance of the EnKF is demonstrated in a synthetic example in which simulated surface deformations using a reference field are assumed as InSAR data for inverse modeling. The results indicate: (1) the EnKF can be used successfully to calibrate a land subsidence model with InSAR data; the estimation of inelastic specific storage is improved, and uncertainty of prediction is reduced, when all the data are accounted for; and (2) if the same ensemble is used to estimate Kalman gain, the analysis errors could cause filter divergence; thus, it is essential to include localization in the EnKF for InSAR data assimilation. © 2017, National Ground Water Association.
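
    A minimal stochastic ensemble Kalman filter analysis step is sketched below: an ensemble of parameter fields (here standing in for log specific storage) is updated from observations (here standing in for InSAR displacements). A random linear map replaces the land-subsidence forward model, so this is a sketch of the EnKF update itself, not of the paper's implementation.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)

    # Stochastic EnKF analysis step (sketch). H stands in for the forward model.
    n_param, n_obs, n_ens = 50, 20, 100
    H = rng.normal(size=(n_obs, n_param)) / np.sqrt(n_param)
    truth = rng.normal(size=n_param)                       # synthetic "true" parameter field
    obs_err_sd = 0.05
    d_obs = H @ truth + obs_err_sd * rng.normal(size=n_obs)

    X = rng.normal(size=(n_param, n_ens))                  # prior ensemble
    D = d_obs[:, None] + obs_err_sd * rng.normal(size=(n_obs, n_ens))  # perturbed observations
    Y = H @ X                                              # predicted observations per member

    Xa = X - X.mean(axis=1, keepdims=True)                 # parameter anomalies
    Ya = Y - Y.mean(axis=1, keepdims=True)                 # predicted-data anomalies
    C_xy = Xa @ Ya.T / (n_ens - 1)
    C_yy = Ya @ Ya.T / (n_ens - 1) + obs_err_sd**2 * np.eye(n_obs)
    K = C_xy @ np.linalg.inv(C_yy)                         # Kalman gain
    X_post = X + K @ (D - Y)                               # updated ensemble

    print("prior / posterior RMSE vs truth:",
          np.sqrt(np.mean((X.mean(1) - truth)**2)).round(3),
          np.sqrt(np.mean((X_post.mean(1) - truth)**2)).round(3))
    ```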

  14. Evapotranspiration Estimates for a Stochastic Soil-Moisture Model

    Science.gov (United States)

    Chaleeraktrakoon, Chavalit; Somsakun, Somrit

    2009-03-01

    Potential evapotranspiration is information that is necessary for applying a widely used stochastic model of soil moisture (I. Rodriguez Iturbe, A. Porporato, L. Ridolfi, V. Isham and D. R. Cox, Probabilistic modelling of water balance at a point: The role of climate, soil and vegetation, Proc. Roy. Soc. London A455 (1999) 3789-3805). An objective of the present paper is thus to find a proper estimate of the evapotranspiration for the stochastic model. This estimate is obtained by comparing the calculated soil-moisture distribution resulting from various techniques, such as Thornthwaite, Makkink, Jensen-Haise, FAO Modified Penman, and Blaney-Criddle, with an observed one. The comparison results using five sequences of daily soil-moisture for a dry season from November 2003 to April 2004 (Udornthani Province, Thailand) have indicated that all methods can be used if the weather information required is available. This is because their soil-moisture distributions are alike. In addition, the model is shown to be able to describe the phenomenon approximately at a weekly or biweekly time scale, which is desirable for agricultural engineering applications.

  15. Qualitative and Quantitative Integrated Modeling for Stochastic Simulation and Optimization

    Directory of Open Access Journals (Sweden)

    Xuefeng Yan

    2013-01-01

    Full Text Available The simulation and optimization of an actual physical system are usually built on stochastic models, which inherently have both qualitative and quantitative characteristics. Most modeling specifications and frameworks find it difficult to describe the qualitative model directly. In order to deal with expert knowledge, uncertain reasoning, and other qualitative information, a combined qualitative and quantitative modeling specification was proposed based on a hierarchical model structure framework. The new modeling approach is based on a hierarchical model structure which includes the meta-meta model, the meta-model and the high-level model. A description logic system is defined for formal definition and verification of the new modeling specification. A stochastic defense simulation was developed to illustrate how to model the system and optimize the result. The results show that the proposed method can describe the complex system more comprehensively, and that introducing qualitative models into the quantitative simulation yields a higher survival probability of the target.

  16. News Impact Curve for Stochastic Volatility Models

    OpenAIRE

    Makoto Takahashi; Yasuhiro Omori; Toshiaki Watanabe

    2012-01-01

    This paper proposes a new method to compute the news impact curve for stochastic volatility (SV) models. The new method incorporates the joint movement of return and volatility, which has been ignored by the extant literature, by simply adding a couple of steps to the Bayesian MCMC estimation procedures for SV models. This simple procedure is versatile and applicable to various SV type models. Contrary to the monotonic news impact functions in the extant literature, the new method gives a U-s...

  17. Stochastic model of energetic nuclear reactor

    International Nuclear Information System (INIS)

    Bojko, R.V.; Ryazanov, V.V.

    2002-01-01

    The behaviour of a nuclear reactor is treated using the theory of branching processes. A Markov random process is proposed as the mathematical model describing the neutron number over time. Applying branching random processes with a variable regime to the description of neutron behaviour in the reactor makes a rigorous description of the critical operation regime possible and demonstrates the severity of the process. Three regimes of critical behaviour, depending on the sign of the manipulated variables and feedbacks, were identified. The probabilistic regularities peculiar to the behaviour of the reactor are embodied in the suggested stochastic model [ru

  18. The cost of uniqueness in groundwater model calibration

    Science.gov (United States)

    Moore, Catherine; Doherty, John

    2006-04-01

    Calibration of a groundwater model requires that hydraulic properties be estimated throughout a model domain. This generally constitutes an underdetermined inverse problem, for which a solution can only be found when some kind of regularization device is included in the inversion process. Inclusion of regularization in the calibration process can be implicit, for example through the use of zones of constant parameter value, or explicit, for example through solution of a constrained minimization problem in which parameters are made to respect preferred values, or preferred relationships, to the degree necessary for a unique solution to be obtained. The "cost of uniqueness" is this: no matter which regularization methodology is employed, the inevitable consequence of its use is a loss of detail in the calibrated field. This, in turn, can lead to erroneous predictions made by a model that is ostensibly "well calibrated". Information made available as a by-product of the regularized inversion process allows the reasons for this loss of detail to be better understood. In particular, it is easily demonstrated that the estimated value for an hydraulic property at any point within a model domain is, in fact, a weighted average of the true hydraulic property over a much larger area. This averaging process causes loss of resolution in the estimated field. Where hydraulic conductivity is the hydraulic property being estimated, high averaging weights exist in areas that are strategically disposed with respect to measurement wells, while other areas may contribute very little to the estimated hydraulic conductivity at any point within the model domain, this possibly making the detection of hydraulic conductivity anomalies in these latter areas almost impossible. A study of the post-calibration parameter field covariance matrix allows further insights into the loss of system detail incurred through the calibration process to be gained. A comparison of pre- and post-calibration

  19. Stochastic time-dependent vehicle routing problem: Mathematical models and ant colony algorithm

    Directory of Open Access Journals (Sweden)

    Zhengyu Duan

    2015-11-01

    Full Text Available This article addresses the stochastic time-dependent vehicle routing problem. Two mathematical models, named the robust optimal schedule time model and the minimum expected schedule time model, are proposed for the stochastic time-dependent vehicle routing problem; both can guarantee delivery within the time windows of customers. The robust optimal schedule time model only requires the variation range of link travel time, which can be conveniently derived from historical traffic data. In addition, the robust optimal schedule time model, based on the robust optimization method, can be converted into a time-dependent vehicle routing problem. Moreover, an ant colony optimization algorithm is designed to solve the stochastic time-dependent vehicle routing problem. Owing to improvements in the initial solution and the transition probability, the ant colony optimization algorithm shows good convergence performance. Through computational instances and Monte Carlo simulation tests, the robust optimal schedule time model is shown to be better than the minimum expected schedule time model in computational efficiency and in coping with travel time fluctuations. Therefore, the robust optimal schedule time model is applicable in real road networks.

  20. The global dynamics for a stochastic SIS epidemic model with isolation

    Science.gov (United States)

    Chen, Yiliang; Wen, Buyu; Teng, Zhidong

    2018-02-01

    In this paper, we investigate the dynamical behavior of a stochastic SIS epidemic model with isolation, which is an important strategy for the elimination of infectious diseases. It is assumed that the stochastic effects manifest themselves mainly as fluctuations in the transmission coefficient, the death rate and the proportional coefficient of the isolation of infectives. It is shown that the extinction and persistence in the mean of the disease are determined by a threshold value R0S: if R0S < 1 the disease goes extinct with probability one, whereas if R0S > 1 the disease is stochastically persistent in the mean with probability one. Furthermore, the existence of a unique stationary distribution is discussed, and sufficient conditions are established by using the Lyapunov function method. Finally, some numerical examples are carried out to confirm the analytical results.

  1. Reconstructing the hidden states in time course data of stochastic models.

    Science.gov (United States)

    Zimmer, Christoph

    2015-11-01

    Parameter estimation is central for analyzing models in Systems Biology. The relevance of stochastic modeling in the field is increasing. Therefore, the need for tailored parameter estimation techniques is increasing as well. Challenges for parameter estimation are partial observability, measurement noise, and the computational complexity arising from the dimension of the parameter space. This article extends the multiple shooting for stochastic systems method, developed for inference in intrinsically stochastic systems. The treatment of extrinsic noise and the estimation of the unobserved states are improved by taking into account the correlation between unobserved and observed species. This article demonstrates the power of the method on different scenarios of a Lotka-Volterra model, including cases in which the prey population dies out or explodes, and a Calcium oscillation system. Besides showing how the new extension improves the accuracy of the parameter estimates, this article analyzes the accuracy of the state estimates. In contrast to previous approaches, the new approach is well able to estimate states and parameters for all the scenarios. As it does not need stochastic simulations, it is of the same order of speed as conventional least squares parameter estimation methods with respect to computational time. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  2. Hand-eye calibration using a target registration error model.

    Science.gov (United States)

    Chen, Elvis C S; Morgan, Isabella; Jayarathne, Uditha; Ma, Burton; Peters, Terry M

    2017-10-01

    Surgical cameras are prevalent in modern operating theatres and are often used as a surrogate for direct vision. Visualisation techniques (e.g. image fusion) made possible by tracking the camera require accurate hand-eye calibration between the camera and the tracking system. The authors introduce the concept of 'guided hand-eye calibration', where calibration measurements are facilitated by a target registration error (TRE) model. They formulate hand-eye calibration as a registration problem between homologous point-line pairs. For each measurement, the position of a monochromatic ball-tip stylus (a point) and its projection onto the image (a line) is recorded, and the TRE of the resulting calibration is predicted using a TRE model. The TRE model is then used to guide the placement of the calibration tool, so that the subsequent measurement minimises the predicted TRE. Assessing TRE after each measurement produces accurate calibration using a minimal number of measurements. As a proof of principle, they evaluated guided calibration using a webcam and an endoscopic camera. Their endoscopic camera results suggest that millimetre TRE is achievable when at least 15 measurements are acquired with the tracker sensor ∼80 cm away on the laparoscope handle for a target ∼20 cm away from the camera.

  3. Probabilistic Forecast of Wind Power Generation by Stochastic Differential Equation Models

    KAUST Repository

    Elkantassi, Soumaya

    2017-04-01

    Reliable forecasting of wind power generation is crucial to optimal control of costs in generation of electricity with respect to the electricity demand. Here, we propose and analyze stochastic wind power forecast models described by parametrized stochastic differential equations, which introduce appropriate fluctuations in numerical forecast outputs. We use an approximate maximum likelihood method to infer the model parameters taking into account the time correlated sets of data. Furthermore, we study the validity and sensitivity of the parameters for each model. We applied our models to Uruguayan wind power production as determined by historical data and corresponding numerical forecasts for the period of March 1 to May 31, 2016.

  4. Calibration of CORSIM models under saturated traffic flow conditions.

    Science.gov (United States)

    2013-09-01

    This study proposes a methodology to calibrate microscopic traffic flow simulation models. The proposed methodology has the capability to calibrate simultaneously all the calibration parameters as well as demand patterns for any network topology....

  5. A stochastic large deformation model for computational anatomy

    DEFF Research Database (Denmark)

    Arnaudon, Alexis; Holm, Darryl D.; Pai, Akshay Sadananda Uppinakudru

    2017-01-01

    In the study of shapes of human organs using computational anatomy, variations are found to arise from inter-subject anatomical differences, disease-specific effects, and measurement noise. This paper introduces a stochastic model for incorporating random variations into the Large Deformation...

  6. Calibrating cellular automaton models for pedestrians walking through corners

    Science.gov (United States)

    Dias, Charitha; Lovreglio, Ruggiero

    2018-05-01

    Cellular Automata (CA) based pedestrian simulation models have gained remarkable popularity as they are simpler and easier to implement compared to other microscopic modeling approaches. However, incorporating traditional floor field representations in CA models to simulate pedestrian corner navigation behavior could result in unrealistic behaviors. Even though several previous studies have attempted to enhance CA models to realistically simulate pedestrian maneuvers around bends, such modifications have not been calibrated or validated against empirical data. In this study, two static floor field (SFF) representations, namely 'discrete representation' and 'continuous representation', are calibrated for CA-models to represent pedestrians' walking behavior around 90° bends. Trajectory data collected through a controlled experiment are used to calibrate these model representations. Calibration results indicate that although both floor field representations can represent pedestrians' corner navigation behavior, the 'continuous' representation fits the data better. Output of this study could be beneficial for enhancing the reliability of existing CA-based models by representing pedestrians' corner navigation behaviors more realistically.

  7. Investigation of the transferability of hydrological models and a method to improve model calibration

    Directory of Open Access Journals (Sweden)

    G. Hartmann

    2005-01-01

    Full Text Available In order to find a model parameterization such that the hydrological model performs well even under different conditions, appropriate model performance measures have to be determined. A common performance measure is the Nash-Sutcliffe efficiency. Usually it is calculated by comparing observed and modelled daily values. In this paper a modified version is suggested in order to calibrate a model on different time scales simultaneously (days up to years). A spatially distributed hydrological model based on the HBV concept was used. The modelling was applied to the Upper Neckar catchment, a mesoscale catchment in southwestern Germany with a basin size of about 4000 km2. The observation period 1961-1990 was divided into four different climatic periods, referred to as "warm", "cold", "wet" and "dry". These sub-periods were used to assess the transferability of the model calibration and of the measure of performance. In a first step, the hydrological model was calibrated on a certain period and afterwards applied to the same period. Then, a validation was performed on the climatologically opposite period to the calibration period, e.g. the model calibrated on the cold period was applied to the warm period. Optimal parameter sets were identified by an automatic calibration procedure based on Simulated Annealing. The results show that calibrating a hydrological model that is supposed to handle short-term as well as long-term signals is a demanding task. In particular, the objective function has to be chosen very carefully.
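
    The modified measure is described only qualitatively here. One plausible sketch, under the assumption that the multi-scale objective is a simple average of Nash-Sutcliffe efficiencies over several aggregation windows (not necessarily the authors' exact weighting), is the following:

    ```python
    import numpy as np

    def nse(obs, sim):
        """Nash-Sutcliffe efficiency."""
        return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

    def multiscale_nse(obs, sim, windows=(1, 7, 30, 365)):
        """Average NSE over several aggregation windows (days); an assumed combination rule."""
        scores = []
        for w in windows:
            n = (len(obs) // w) * w
            o = obs[:n].reshape(-1, w).mean(axis=1)   # aggregate to w-day means
            s = sim[:n].reshape(-1, w).mean(axis=1)
            scores.append(nse(o, s))
        return float(np.mean(scores))

    # Synthetic example with an imperfect simulation of a seasonal signal.
    rng = np.random.default_rng(7)
    t = np.arange(10 * 365)
    obs = 5 + 3.0 * np.sin(2 * np.pi * t / 365) + rng.normal(0, 1, t.size)
    sim = 5 + 2.7 * np.sin(2 * np.pi * t / 365) + rng.normal(0, 1, t.size)
    print("multi-scale NSE:", round(multiscale_nse(obs, sim), 3))
    ```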

  8. Two new algorithms to combine kriging with stochastic modelling

    Science.gov (United States)

    Venema, Victor; Lindau, Ralf; Varnai, Tamas; Simmer, Clemens

    2010-05-01

    Two main groups of statistical methods used in the Earth sciences are geostatistics and stochastic modelling. Geostatistical methods, such as various kriging algorithms, aim at estimating the mean value for every point as well as possible. In the case of sparse measurements, such fields have less variability at small scales and a narrower distribution than the true field. This can lead to biases if a nonlinear process driven by such a kriged field is simulated. Stochastic modelling aims at reproducing the statistical structure of the data in space and time. One of the stochastic modelling methods, the so-called surrogate data approach, replicates the value distribution and power spectrum of a certain data set. While stochastic methods reproduce the statistical properties of the data, the location of the measurement is not considered. This requires the use of so-called constrained stochastic models. Because radiative transfer through clouds is a highly nonlinear process, it is essential to model the distribution (e.g. of optical depth, extinction, liquid water content or liquid water path) accurately. In addition, the correlations within the cloud field are important, especially because of horizontal photon transport. This explains the success of surrogate cloud fields for use in 3D radiative transfer studies. Up to now, however, we could only achieve good results for the radiative properties averaged over the field, but not for a radiation measurement located at a certain position. Therefore we have developed a new algorithm that combines the accuracy of stochastic (surrogate) modelling with the positioning capabilities of kriging. In this way, we can automatically profit from the large geostatistical literature and software. This algorithm is similar to the standard iterative amplitude adjusted Fourier transform (IAAFT) algorithm, but has an additional iterative step in which the surrogate field is nudged towards the kriged field. The nudging strength is gradually
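
    A compact sketch of the standard IAAFT loop with an extra nudging step toward a kriged field is given below. The nudging weight and its decay schedule are assumptions for illustration, not the algorithm's published settings.

    ```python
    import numpy as np

    def iaaft_nudged(target, kriged, n_iter=100, nudge=0.5):
        """IAAFT surrogate of `target` (1-D) with an extra step nudging toward `kriged`.
        Sketch only: the nudging weight and its schedule are assumed."""
        rng = np.random.default_rng(8)
        sorted_vals = np.sort(target)                 # amplitude distribution to preserve
        target_amp = np.abs(np.fft.rfft(target))      # Fourier amplitudes to preserve
        x = rng.permutation(target)                   # random initial surrogate
        for i in range(n_iter):
            spec = np.fft.rfft(x)                     # 1) impose the power spectrum
            spec = target_amp * np.exp(1j * np.angle(spec))
            x = np.fft.irfft(spec, n=target.size)
            w = nudge * (1.0 - i / n_iter)            # 2) nudge toward the kriged field
            x = (1 - w) * x + w * kriged
            ranks = np.argsort(np.argsort(x))         # 3) impose the value distribution
            x = sorted_vals[ranks]
        return x

    # Toy example: a correlated "true" series and a smooth stand-in for a kriged field.
    rng = np.random.default_rng(9)
    true = np.cumsum(rng.normal(size=512)); true -= true.mean()
    kriged = np.convolve(true, np.ones(32) / 32, mode="same")
    surrogate = iaaft_nudged(true, kriged)
    print("correlation with kriged field:", round(np.corrcoef(surrogate, kriged)[0, 1], 3))
    ```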

  9. Robust nonlinear autoregressive moving average model parameter estimation using stochastic recurrent artificial neural networks

    DEFF Research Database (Denmark)

    Chon, K H; Hoyer, D; Armoundas, A A

    1999-01-01

    In this study, we introduce a new approach for estimating linear and nonlinear stochastic autoregressive moving average (ARMA) model parameters, given a corrupt signal, using artificial recurrent neural networks. This new approach is a two-step approach in which the parameters of the deterministic...... part of the stochastic ARMA model are first estimated via a three-layer artificial neural network (deterministic estimation step) and then reestimated using the prediction error as one of the inputs to the artificial neural networks in an iterative algorithm (stochastic estimation step). The prediction...... error is obtained by subtracting the corrupt signal of the estimated ARMA model obtained via the deterministic estimation step from the system output response. We present computer simulation examples to show the efficacy of the proposed stochastic recurrent neural network approach in obtaining accurate...

  10. An introduction to continuous-time stochastic processes theory, models, and applications to finance, biology, and medicine

    CERN Document Server

    Capasso, Vincenzo

    2015-01-01

    This textbook, now in its third edition, offers a rigorous and self-contained introduction to the theory of continuous-time stochastic processes, stochastic integrals, and stochastic differential equations. Expertly balancing theory and applications, the work features concrete examples of modeling real-world problems from biology, medicine, industrial applications, finance, and insurance using stochastic methods. No previous knowledge of stochastic processes is required. Key topics include: * Markov processes * Stochastic differential equations * Arbitrage-free markets and financial derivatives * Insurance risk * Population dynamics, and epidemics * Agent-based models New to the Third Edition: * Infinitely divisible distributions * Random measures * Levy processes * Fractional Brownian motion * Ergodic theory * Karhunen-Loeve expansion * Additional applications * Additional  exercises * Smoluchowski  approximation of  Langevin systems An Introduction to Continuous-Time Stochastic Processes, Third Editio...

  11. Stochastic Fractional Programming Approach to a Mean and Variance Model of a Transportation Problem

    Directory of Open Access Journals (Sweden)

    V. Charles

    2011-01-01

    Full Text Available In this paper, we propose a stochastic programming model which considers a ratio of two nonlinear functions and probabilistic constraints. Earlier work proposed only an expected-value model, without accounting for variability; the variance model, on the other hand, gave variability a vital role without considering its counterpart, the expected-value model. Further, the expected-value model optimizes the ratio of two linear cost functions, whereas the variance model optimizes the ratio of two non-linear functions; the stochastic nature of the numerator and denominator, together with the treatment of both expectation and variability, leads to a non-linear fractional program. In this paper, a transportation model with a stochastic fractional programming (SFP) approach is proposed, which strikes a balance between the previous models available in the literature.

  12. Persistence and extinction for stochastic logistic model with Levy noise and impulsive perturbation

    OpenAIRE

    Chun Lu; Qiang Ma; Xiaohua Ding

    2015-01-01

    This article investigates a stochastic logistic model with Levy noise and impulsive perturbation. In the model, the impulsive perturbation and Levy noise are taken into account simultaneously. This model is new, more feasible, and more in accordance with reality. The definition of a solution to a stochastic differential equation with Levy noise and impulsive perturbation is established. Based on this definition, we show that our model has a unique global positive solut...

  13. Stochastic hyperelastic constitutive laws and identification procedure for soft biological tissues with intrinsic variability.

    Science.gov (United States)

    Staber, B; Guilleminot, J

    2017-01-01

    In this work, we address the constitutive modeling, in a probabilistic framework, of the hyperelastic response of soft biological tissues. The aim is on the one hand to mimic the mean behavior and variability that are typically encountered in the experimental characterization of such materials, and on the other hand to derive mathematical models that are almost surely consistent with the theory of nonlinear elasticity. Towards this goal, we invoke information theory and discuss a stochastic model relying on a low-dimensional parametrization. We subsequently propose a two-step methodology allowing for the calibration of the model using standard data, such as mean and standard deviation values along a given loading path. The framework is finally applied and benchmarked on three experimental databases proposed elsewhere in the literature. It is shown that the stochastic model allows experiments to be accurately reproduced, regardless of the tissue under consideration. Copyright © 2016 Elsevier Ltd. All rights reserved.

  14. Testing for Volatility Co-movement in Bivariate Stochastic Volatility Models

    OpenAIRE

    Chen, Jinghui; Kobayashi, Masahito; McAleer, Michael

    2017-01-01

    The paper considers the problem of volatility co-movement, namely whether two financial returns have a perfectly correlated common volatility process, in the framework of multivariate stochastic volatility models, and proposes a test which checks for volatility co-movement. The proposed test is a stochastic volatility version of the co-movement test proposed by Engle and Susmel (1993), who investigated whether international equity markets have volatility co-movement using t...

  15. Estimation of stochastic volatility by using Ornstein-Uhlenbeck type models

    Science.gov (United States)

    Mariani, Maria C.; Bhuiyan, Md Al Masum; Tweneboah, Osei K.

    2018-02-01

    In this study, we develop a technique for estimating the stochastic volatility (SV) of a financial time series by using Ornstein-Uhlenbeck type models. Using the daily closing prices from developed and emergent stock markets, we conclude that the incorporation of stochastic volatility into the time varying parameter estimation significantly improves the forecasting performance via Maximum Likelihood Estimation. Furthermore, our estimation algorithm is feasible with large data sets and have good convergence properties.
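
    As a minimal illustration of OU-type estimation, the sketch below performs exact maximum likelihood for a plain Ornstein-Uhlenbeck process via its Gaussian AR(1) transition density. The OU-type models in the abstract are richer than this sketch; the parameter values below are synthetic.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(10)

    # Exact transition density of dX = theta*(mu - X)dt + sigma dW:
    # X_{t+dt} | X_t ~ Normal(mu + (X_t - mu)*exp(-theta*dt),
    #                         sigma^2 * (1 - exp(-2*theta*dt)) / (2*theta))
    def neg_log_lik(params, x, dt):
        theta, mu, sigma = params
        if theta <= 0 or sigma <= 0:
            return np.inf
        a = np.exp(-theta * dt)
        mean = mu + (x[:-1] - mu) * a
        var = sigma**2 * (1 - a**2) / (2 * theta)
        resid = x[1:] - mean
        return 0.5 * np.sum(np.log(2 * np.pi * var) + resid**2 / var)

    # Simulate a synthetic OU path and recover its parameters by MLE.
    theta_true, mu_true, sigma_true, dt, n = 2.0, 0.1, 0.3, 1 / 252, 5000
    x = np.empty(n); x[0] = mu_true
    for t in range(1, n):
        a = np.exp(-theta_true * dt)
        sd = sigma_true * np.sqrt((1 - a**2) / (2 * theta_true))
        x[t] = mu_true + (x[t - 1] - mu_true) * a + sd * rng.normal()

    res = minimize(neg_log_lik, x0=[1.0, 0.0, 0.2], args=(x, dt), method="Nelder-Mead")
    print("estimated (theta, mu, sigma):", np.round(res.x, 3))
    ```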

  16. Deterministic and stochastic trends in the Lee-Carter mortality model

    DEFF Research Database (Denmark)

    Callot, Laurent; Haldrup, Niels; Kallestrup-Lamb, Malene

    2015-01-01

    The Lee and Carter (1992) model assumes that the deterministic and stochastic time series dynamics load with identical weights when describing the development of age-specific mortality rates. Effectively this means that the main characteristics of the model simplify to a random walk model with age...... mortality data. We find empirical evidence that this feature of the Lee–Carter model overly restricts the system dynamics and we suggest to separate the deterministic and stochastic time series components at the benefit of improved fit and forecasting performance. In fact, we find that the classical Lee......–Carter model will otherwise overestimate the reduction of mortality for the younger age groups and will underestimate the reduction of mortality for the older age groups. In practice, our recommendation means that the Lee–Carter model instead of a one-factor model should be formulated as a two- (or several...

  17. A stochastic model of hormesis

    International Nuclear Information System (INIS)

    Yakovlev, A.Yu.; Tsodikov, A.D.; Bass, L.

    1993-01-01

    In order to describe the life-prolonging effect of some agents that are harmful at higher doses, ionizing radiations in particular, a stochastic model is developed in terms of accumulation and progression of intracellular lesions caused by the environment and by the agent itself. The processes of lesion repair, operating at the molecular and cellular level, are assumed to be responsible for this hormesis effect within the framework of the proposed model. Properties of lifetime distributions, derived for analysis of animal experiments with prolonged and acute irradiation, are given special attention. The model provides efficient means of interpreting experimental findings, as evidenced by its application to analysis of some published data on the hormetic effects of prolonged irradiation and of procaine on animal longevity. 51 refs., 2 figs., 1 tabs

  18. Solvable stochastic dealer models for financial markets

    Science.gov (United States)

    Yamada, Kenta; Takayasu, Hideki; Ito, Takatoshi; Takayasu, Misako

    2009-05-01

    We introduce solvable stochastic dealer models, which can reproduce basic empirical laws of financial markets such as the power law of price change. Starting from the simplest model that is almost equivalent to a Poisson random noise generator, the model becomes fairly realistic by adding only two effects: the self-modulation of transaction intervals and a forecasting tendency, which uses a moving average of the latest market price changes. Based on the present microscopic model of markets, we find a quantitative relation with market potential forces, which have recently been discovered in the study of market price modeling based on random walks.
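
    A caricature of a two-dealer market with a trend-following term is sketched below. The spread, step size, re-quoting rule and parameter values are assumptions chosen for illustration; this is not the paper's specification of the dealer model.

    ```python
    import numpy as np

    rng = np.random.default_rng(11)

    # Two-dealer caricature: each dealer's mid price takes random steps plus a common
    # trend term proportional to the moving average of past market price changes.
    L = 1.0            # spread: dealer i bids p_i - L/2 and asks p_i + L/2 (assumed)
    a = 0.05           # size of each dealer's random mid-price step (assumed)
    d = 1.2            # strength of trend following (assumed)
    n_ma = 10          # moving-average window of past market price changes (assumed)
    p = np.array([100.0, 100.0 + 0.3 * L])   # dealers' mid prices
    market, dP_hist = [100.0], [0.0] * n_ma

    for _ in range(20000):
        trend = d * np.mean(dP_hist)
        p += a * rng.choice([-1.0, 1.0], size=2) + trend   # random step + common trend term
        if p.max() - p.min() >= L:                         # best bid crosses best ask -> trade
            new_price = p.mean()
            dP_hist = dP_hist[1:] + [new_price - market[-1]]
            market.append(new_price)
            p[:] = new_price + np.array([-0.3, 0.3]) * L   # dealers re-quote around the trade

    returns = np.diff(market)
    kurt = ((returns - returns.mean()) ** 4).mean() / returns.var() ** 2
    print("trades:", len(returns), " kurtosis of price changes:", round(float(kurt), 2))
    ```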

  19. Empirical Analysis of Stochastic Volatility Model by Hybrid Monte Carlo Algorithm

    International Nuclear Information System (INIS)

    Takaishi, Tetsuya

    2013-01-01

    The stochastic volatility (SV) model is one of the volatility models that infer the latent volatility of asset returns. The Bayesian inference of the SV model is performed by the hybrid Monte Carlo (HMC) algorithm, which is superior to other Markov chain Monte Carlo methods in sampling volatility variables. We perform HMC simulations of the SV model for two liquid stock returns traded on the Tokyo Stock Exchange and measure the volatilities of those stock returns. Then we calculate the accuracy of the volatility measurement using the realized volatility as a proxy of the true volatility and compare the SV model with the GARCH model, another widely used volatility model. Using the accuracy calculated with the realized volatility, we find that empirically the SV model performs better than the GARCH model.

  20. Modelling conjugation with stochastic differential equations

    DEFF Research Database (Denmark)

    Philipsen, Kirsten Riber; Christiansen, Lasse Engbo; Hasman, Henrik

    2010-01-01

    Conjugation is an important mechanism involved in the transfer of resistance between bacteria. In this article a stochastic differential equation based model consisting of a continuous time state equation and a discrete time measurement equation is introduced to model growth and conjugation of two Enterococcus faecium strains in a rich exhaustible medium. The model contains a new expression for a substrate-dependent conjugation rate. A maximum likelihood based method is used to estimate the model parameters. Different models, including different noise structures for the system and observations, are compared using a likelihood-ratio test and Akaike's information criterion. Experiments indicating conjugation on the agar plates selecting for transconjugants motivate the introduction of an extended model, for which conjugation on the agar plate is described in the measurement equation. This model is compared…

  1. Modelling of diesel spray flames under engine-like conditions using an accelerated Eulerian Stochastic Field method

    DEFF Research Database (Denmark)

    Pang, Kar Mun; Jangi, Mehdi; Bai, Xue-Song

    2018-01-01

    This paper aims to simulate diesel spray flames across a wide range of engine-like conditions using the Eulerian Stochastic Field probability density function (ESF-PDF) model. The ESF model is coupled with the Chemistry Coordinate Mapping approach to expedite the calculation. A convergence study is carried out for a number of stochastic fields at five different conditions, covering both conventional diesel combustion and low-temperature combustion regimes. Ignition delay time, flame lift-off length as well as distributions of temperature and various combustion products are used to evaluate the performance of the model. The peak values of these properties generated using thirty-two stochastic fields are found to converge, with a maximum relative difference of 27% as compared to those from a greater number of stochastic fields. The ESF-PDF model with thirty-two stochastic fields performs reasonably…

  2. The threshold of a stochastic SIQS epidemic model

    Science.gov (United States)

    Zhang, Xiao-Bing; Huo, Hai-Feng; Xiang, Hong; Shi, Qihong; Li, Dungang

    2017-09-01

    In this paper, we present the threshold of a stochastic SIQS epidemic model which determines the extinction and persistence of the disease. Furthermore, we find that noise can suppress the disease outbreak. Numerical simulations are also carried out to confirm the analytical results.
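
    The record does not reproduce the SIQS equations, so the sketch below uses a simpler stochastic SIS-type model with multiplicative white noise on transmission, simulated by Euler–Maruyama, merely to illustrate the qualitative point that sufficiently strong noise can suppress an outbreak. All parameter values are assumptions.

```python
import numpy as np

def simulate_sis(beta, gamma, sigma, S0, I0, T=200.0, dt=0.01, seed=0):
    """Euler-Maruyama for a stochastic SIS model with noise on transmission:
    dI = [beta*S*I - gamma*I] dt + sigma*S*I dW, with S = N - I."""
    rng = np.random.default_rng(seed)
    N = S0 + I0
    steps = int(T / dt)
    I = np.empty(steps + 1)
    I[0] = I0
    for k in range(steps):
        S = N - I[k]
        drift = beta * S * I[k] - gamma * I[k]
        diffusion = sigma * S * I[k]
        step = drift * dt + diffusion * np.sqrt(dt) * rng.standard_normal()
        I[k + 1] = min(max(I[k] + step, 0.0), N)   # keep the state in [0, N]
    return I

# Without noise the infection settles at an endemic level; with noise it dies out.
for sigma in (0.0, 0.05):
    print(sigma, simulate_sis(beta=0.5 / 100, gamma=0.25, sigma=sigma, S0=99, I0=1)[-1])
```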

  3. Stochastic Robust Mathematical Programming Model for Power System Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Cong; Changhyeok, Lee; Haoyong, Chen; Mehrotra, Sanjay

    2016-01-01

    This paper presents a stochastic robust framework for two-stage power system optimization problems with uncertainty. The model optimizes the probabilistic expectation of different worst-case scenarios with different uncertainty sets. A case study of unit commitment shows the effectiveness of the proposed model and algorithms.

  4. Exact protein distributions for stochastic models of gene expression using partitioning of Poisson processes.

    Science.gov (United States)

    Pendar, Hodjat; Platini, Thierry; Kulkarni, Rahul V

    2013-04-01

    Stochasticity in gene expression gives rise to fluctuations in protein levels across a population of genetically identical cells. Such fluctuations can lead to phenotypic variation in clonal populations; hence, there is considerable interest in quantifying noise in gene expression using stochastic models. However, obtaining exact analytical results for protein distributions has been an intractable task for all but the simplest models. Here, we invoke the partitioning property of Poisson processes to develop a mapping that significantly simplifies the analysis of stochastic models of gene expression. The mapping leads to exact protein distributions using results for mRNA distributions in models with promoter-based regulation. Using this approach, we derive exact analytical results for steady-state and time-dependent distributions for the basic two-stage model of gene expression. Furthermore, we show how the mapping leads to exact protein distributions for extensions of the basic model that include the effects of posttranscriptional and posttranslational regulation. The approach developed in this work is widely applicable and can contribute to a quantitative understanding of stochasticity in gene expression and its regulation.

  5. Exact protein distributions for stochastic models of gene expression using partitioning of Poisson processes

    Science.gov (United States)

    Pendar, Hodjat; Platini, Thierry; Kulkarni, Rahul V.

    2013-04-01

    Stochasticity in gene expression gives rise to fluctuations in protein levels across a population of genetically identical cells. Such fluctuations can lead to phenotypic variation in clonal populations; hence, there is considerable interest in quantifying noise in gene expression using stochastic models. However, obtaining exact analytical results for protein distributions has been an intractable task for all but the simplest models. Here, we invoke the partitioning property of Poisson processes to develop a mapping that significantly simplifies the analysis of stochastic models of gene expression. The mapping leads to exact protein distributions using results for mRNA distributions in models with promoter-based regulation. Using this approach, we derive exact analytical results for steady-state and time-dependent distributions for the basic two-stage model of gene expression. Furthermore, we show how the mapping leads to exact protein distributions for extensions of the basic model that include the effects of posttranscriptional and posttranslational regulation. The approach developed in this work is widely applicable and can contribute to a quantitative understanding of stochasticity in gene expression and its regulation.
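
    The two papers above derive exact distributions analytically; for comparison, the following is a small Gillespie (SSA) simulation of the basic two-stage model (transcription, translation, and first-order degradation of mRNA and protein). The rate constants are illustrative, not taken from the papers.

```python
import numpy as np

def two_stage_ssa(km, gm, kp, gp, T, seed=0):
    """Gillespie SSA for: 0 -> mRNA (km), mRNA -> 0 (gm),
    mRNA -> mRNA + protein (kp), protein -> 0 (gp)."""
    rng = np.random.default_rng(seed)
    t, m, p = 0.0, 0, 0
    while t < T:
        rates = np.array([km, gm * m, kp * m, gp * p])
        total = rates.sum()
        t += rng.exponential(1.0 / total)
        r = rng.choice(4, p=rates / total)
        if r == 0:
            m += 1
        elif r == 1:
            m -= 1
        elif r == 2:
            p += 1
        else:
            p -= 1
    return m, p

proteins = [two_stage_ssa(km=2.0, gm=1.0, kp=10.0, gp=0.1, T=50.0, seed=s)[1]
            for s in range(100)]
print(np.mean(proteins))   # compare with the analytical mean (km/gm)*(kp/gp) = 200
```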

  6. A non-linear dimension reduction methodology for generating data-driven stochastic input models

    Science.gov (United States)

    Ganapathysubramanian, Baskar; Zabaras, Nicholas

    2008-06-01

    Stochastic analysis of random heterogeneous media (polycrystalline materials, porous media, functionally graded materials) provides information of significance only if realistic input models of the topology and property variations are used. This paper proposes a framework to construct such input stochastic models for the topology and thermal diffusivity variations in heterogeneous media using a data-driven strategy. Given a set of microstructure realizations (input samples) generated from given statistical information about the medium topology, the framework constructs a reduced-order stochastic representation of the thermal diffusivity. This problem of constructing a low-dimensional stochastic representation of property variations is analogous to the problem of manifold learning and parametric fitting of hyper-surfaces encountered in image processing and psychology. Denote by M the set of microstructures that satisfy the given experimental statistics. A non-linear dimension reduction strategy is utilized to map M to a low-dimensional region, A. We first show that M is a compact manifold embedded in a high-dimensional input space R^n. An isometric mapping F from M to a low-dimensional, compact, connected set A ⊂ R^d (d ≪ n) is constructed. Given only a finite set of samples of the data, the methodology uses arguments from graph theory and differential geometry to construct the isometric transformation F:M→A. Asymptotic convergence of the representation of M by A is shown. This mapping F serves as an accurate, low-dimensional, data-driven representation of the property variations. The reduced-order model of the material topology and thermal diffusivity variations is subsequently used as an input in the solution of stochastic partial differential equations that describe the evolution of dependent variables. A sparse grid collocation strategy (Smolyak algorithm) is utilized to solve these stochastic equations efficiently. We showcase the methodology by constructing low

  7. A non-linear dimension reduction methodology for generating data-driven stochastic input models

    International Nuclear Information System (INIS)

    Ganapathysubramanian, Baskar; Zabaras, Nicholas

    2008-01-01

    Stochastic analysis of random heterogeneous media (polycrystalline materials, porous media, functionally graded materials) provides information of significance only if realistic input models of the topology and property variations are used. This paper proposes a framework to construct such input stochastic models for the topology and thermal diffusivity variations in heterogeneous media using a data-driven strategy. Given a set of microstructure realizations (input samples) generated from given statistical information about the medium topology, the framework constructs a reduced-order stochastic representation of the thermal diffusivity. This problem of constructing a low-dimensional stochastic representation of property variations is analogous to the problem of manifold learning and parametric fitting of hyper-surfaces encountered in image processing and psychology. Denote by M the set of microstructures that satisfy the given experimental statistics. A non-linear dimension reduction strategy is utilized to map M to a low-dimensional region, A. We first show that M is a compact manifold embedded in a high-dimensional input space R^n. An isometric mapping F from M to a low-dimensional, compact, connected set A ⊂ R^d (d ≪ n) is constructed. Given only a finite set of samples of the data, the methodology uses arguments from graph theory and differential geometry to construct the isometric transformation F:M→A. Asymptotic convergence of the representation of M by A is shown. This mapping F serves as an accurate, low-dimensional, data-driven representation of the property variations. The reduced-order model of the material topology and thermal diffusivity variations is subsequently used as an input in the solution of stochastic partial differential equations that describe the evolution of dependent variables. A sparse grid collocation strategy (Smolyak algorithm) is utilized to solve these stochastic equations efficiently. We showcase the methodology
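
    The mapping F described above is built with graph-theoretic and differential-geometric arguments that are in the same spirit as Isomap-style manifold learning. The sketch below is not the authors' algorithm; it simply applies scikit-learn's Isomap to synthetic samples lying near a low-dimensional manifold embedded in a high-dimensional space, as a stand-in for microstructure samples.

```python
import numpy as np
from sklearn.manifold import Isomap

# Synthetic stand-in for microstructure samples: points near a 2-D manifold
# embedded in R^50 (all numbers below are illustrative).
rng = np.random.default_rng(0)
u, v = rng.uniform(0.0, 1.0, (2, 500))
basis = rng.standard_normal((3, 50))
X = np.column_stack([u, v, np.sin(2 * np.pi * u)]) @ basis
X += 0.01 * rng.standard_normal(X.shape)

# Map the samples to low-dimensional coordinates (the set A in the abstract)
embedding = Isomap(n_neighbors=10, n_components=2).fit_transform(X)
print(embedding.shape)   # (500, 2)
```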

  8. Stochastic population and epidemic models persistence and extinction

    CERN Document Server

    Allen, Linda J S

    2015-01-01

    This monograph provides a summary of the basic theory of branching processes for single-type and multi-type processes. Classic examples of population and epidemic models illustrate the probability of population or epidemic extinction obtained from the theory of branching processes. The first chapter develops the branching process theory, while in the second chapter two applications to population and epidemic processes of single-type branching process theory are explored. The last two chapters present multi-type branching process applications to epidemic models, and then continuous-time and continuous-state branching processes with applications. In addition, several MATLAB programs for simulating stochastic sample paths  are provided in an Appendix. These notes originated as part of a lecture series on Stochastics in Biological Systems at the Mathematical Biosciences Institute in Ohio, USA. Professor Linda Allen is a Paul Whitfield Horn Professor of Mathematics in the Department of Mathematics and Statistics ...
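
    A central quantity in the branching-process theory summarised above is the extinction probability, the smallest fixed point of the offspring probability generating function G. A minimal sketch (with an invented offspring distribution, not one from the book) computes it by iterating q <- G(q) from q = 0:

```python
def extinction_probability(offspring_pmf, tol=1e-12, max_iter=10_000):
    """Smallest root q of q = G(q), where G is the offspring pgf and
    offspring_pmf[k] = P(k offspring). Iterating q <- G(q) from 0 converges to it."""
    q = 0.0
    for _ in range(max_iter):
        q_new = sum(p * q**k for k, p in enumerate(offspring_pmf))
        if abs(q_new - q) < tol:
            break
        q = q_new
    return q_new

# Offspring distribution with mean 1.25 (supercritical): extinction prob < 1
print(extinction_probability([0.25, 0.25, 0.5]))   # roughly 0.5
```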

  9. Stochastic quantization of a topological quantum mechanical model

    International Nuclear Information System (INIS)

    Antunes, Sergio; Krein, Gastao; Menezes, Gabriel; Svaiter, Nami Fux

    2011-01-01

    Full text: Stochastic quantization of complex actions has been extensively studied in the literature. In these models, a Markovian Langevin equation is used in order to study the quantization of such systems. In such papers, the advantages of the Markovian stochastic quantization method were explored and exposed. However, many drawbacks of the method were also pointed out, such as instability of the simulations with absence of convergence and sometimes convergence to the wrong limit. Indeed, although several alternative methods have been proposed to deal with interesting physical systems where the action is complex, these approaches do not suggest any general way of solving the particular difficulties that arise in each situation. Here, we wish to make contributions to the program of stochastic quantization of theories with imaginary action by investigating the consequences of a non-Markovian stochastic quantization in a particular situation, namely a quantum mechanical topological action. We analyze the Markovian stochastic quantization for a topological quantum mechanical action which is analogous to a Maxwell-Chern-Simons action in the Weyl gauge. Afterwards we consider a Langevin equation with a memory kernel and Einstein's relations with colored noise. We show that convergence towards equilibrium is achieved in both regimes. We also sketch a simple numerical analysis to investigate the possible advantages of the non-Markovian procedure over the usual Markovian quantization. Both retarded Green's functions for the diffusion problem are considered in this analysis. We show that, although the results indicated that the effect of the memory kernel, as usually expected, is to delay the convergence to equilibrium, non-Markovian systems imply a faster decay compared to Markovian ones as well as smoother convergence to equilibrium. (author)

  10. hydroPSO: A Versatile Particle Swarm Optimisation R Package for Calibration of Environmental Models

    Science.gov (United States)

    Zambrano-Bigiarini, M.; Rojas, R.

    2012-04-01

    Particle Swarm Optimisation (PSO) is a recent and powerful population-based stochastic optimisation technique inspired by social behaviour of bird flocking, which shares similarities with other evolutionary techniques such as Genetic Algorithms (GA). In PSO, however, each individual of the population, known as particle in PSO terminology, adjusts its flying trajectory on the multi-dimensional search-space according to its own experience (best-known personal position) and the one of its neighbours in the swarm (best-known local position). PSO has recently received a surge of attention given its flexibility, ease of programming, low memory and CPU requirements, and efficiency. Despite these advantages, PSO may still get trapped into sub-optimal solutions, suffer from swarm explosion or premature convergence. Thus, the development of enhancements to the "canonical" PSO is an active area of research. To date, several modifications to the canonical PSO have been proposed in the literature, resulting into a large and dispersed collection of codes and algorithms which might well be used for similar if not identical purposes. In this work we present hydroPSO, a platform-independent R package implementing several enhancements to the canonical PSO that we consider of utmost importance to bring this technique to the attention of a broader community of scientists and practitioners. hydroPSO is model-independent, allowing the user to interface any model code with the calibration engine without having to invest considerable effort in customizing PSO to a new calibration problem. Some of the controlling options to fine-tune hydroPSO are: four alternative topologies, several types of inertia weight, time-variant acceleration coefficients, time-variant maximum velocity, regrouping of particles when premature convergence is detected, different types of boundary conditions and many others. Additionally, hydroPSO implements recent PSO variants such as: Improved Particle Swarm
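
    hydroPSO itself is an R package and its enhancements are not reproduced here; the sketch below is only the "canonical" PSO velocity/position update described in the abstract (inertia plus cognitive and social terms), written in Python for a toy two-parameter calibration problem with invented settings.

```python
import numpy as np

def pso(objective, bounds, n_particles=30, n_iter=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal canonical PSO minimiser: velocity update from inertia,
    personal-best (cognitive) and swarm-best (social) terms."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T
    x = rng.uniform(lo, hi, (n_particles, len(lo)))
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.apply_along_axis(objective, 1, x)
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(n_iter):
        r1, r2 = rng.uniform(size=(2, n_particles, len(lo)))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        f = np.apply_along_axis(objective, 1, x)
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()

# Toy calibration problem: recover the minimiser of a 2-D quadratic
print(pso(lambda p: (p[0] - 1) ** 2 + (p[1] + 2) ** 2, bounds=[(-5, 5), (-5, 5)]))
```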

  11. Streamflow characteristics from modelled runoff time series: Importance of calibration criteria selection

    Science.gov (United States)

    Poole, Sandra; Vis, Marc; Knight, Rodney; Seibert, Jan

    2017-01-01

    Ecologically relevant streamflow characteristics (SFCs) of ungauged catchments are often estimated from simulated runoff of hydrologic models that were originally calibrated on gauged catchments. However, SFC estimates of the gauged donor catchments and subsequently the ungauged catchments can be substantially uncertain when models are calibrated using traditional approaches based on optimization of statistical performance metrics (e.g., Nash–Sutcliffe model efficiency). An improved calibration strategy for gauged catchments is therefore crucial to help reduce the uncertainties of estimated SFCs for ungauged catchments. The aim of this study was to improve SFC estimates from modeled runoff time series in gauged catchments by explicitly including one or several SFCs in the calibration process. Different types of objective functions were defined consisting of the Nash–Sutcliffe model efficiency, single SFCs, or combinations thereof. We calibrated a bucket-type runoff model (HBV – Hydrologiska Byråns Vattenavdelning – model) for 25 catchments in the Tennessee River basin and evaluated the proposed calibration approach on 13 ecologically relevant SFCs representing major flow regime components and different flow conditions. While the model generally tended to underestimate the tested SFCs related to mean and high-flow conditions, SFCs related to low flow were generally overestimated. The highest estimation accuracies were achieved by a SFC-specific model calibration. Estimates of SFCs not included in the calibration process were of similar quality when comparing a multi-SFC calibration approach to a traditional model efficiency calibration. For practical applications, this implies that SFCs should preferably be estimated from targeted runoff model calibration, and modeled estimates need to be carefully interpreted.
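
    The exact objective functions are defined in the paper; the sketch below shows one plausible form of the idea, blending the Nash–Sutcliffe efficiency with the relative error of a single low-flow characteristic (Q90). The synthetic flow series and the weighting are assumptions for illustration only.

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency (1 is a perfect fit)."""
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def composite_objective(obs, sim, weight=0.5):
    """Example multi-criteria score: blend NSE with the relative error of a
    low-flow characteristic (Q90, the flow exceeded 90% of the time)."""
    q90_obs = np.percentile(obs, 10)
    q90_sim = np.percentile(sim, 10)
    sfc_score = 1.0 - abs(q90_sim - q90_obs) / q90_obs
    return weight * nse(obs, sim) + (1.0 - weight) * sfc_score

rng = np.random.default_rng(0)
obs = np.exp(rng.normal(1.0, 0.8, 3650))          # synthetic daily flows
sim = obs * rng.lognormal(0.0, 0.1, obs.size)     # imperfect "model" output
print(composite_objective(obs, sim))
```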

  12. Boosting iterative stochastic ensemble method for nonlinear calibration of subsurface flow models

    KAUST Repository

    Elsheikh, Ahmed H.

    2013-06-01

    A novel parameter estimation algorithm is proposed. The inverse problem is formulated as a sequential data integration problem in which Gaussian process regression (GPR) is used to integrate the prior knowledge (static data). The search space is further parameterized using Karhunen-Loève expansion to build a set of basis functions that spans the search space. Optimal weights of the reduced basis functions are estimated by an iterative stochastic ensemble method (ISEM). ISEM employs directional derivatives within a Gauss-Newton iteration for efficient gradient estimation. The resulting update equation relies on the inverse of the output covariance matrix, which is rank deficient. In the proposed algorithm we use an iterative regularization based on the ℓ2 Boosting algorithm. ℓ2 Boosting iteratively fits the residual, and the amount of regularization is controlled by the number of iterations. A termination criterion based on the Akaike information criterion (AIC) is utilized. This regularization method is very attractive in terms of performance and simplicity of implementation. The proposed algorithm combining ISEM and ℓ2 Boosting is evaluated on several nonlinear subsurface flow parameter estimation problems. The efficiency of the proposed algorithm is demonstrated by the small size of utilized ensembles and in terms of error convergence rates. © 2013 Elsevier B.V.
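
    The coupling of ISEM with ℓ2 Boosting is specific to the paper; as background, the following is a generic componentwise ℓ2 Boosting sketch, in which the residual is refitted iteratively with a shrunken least-squares step and the iteration count acts as the regularisation parameter. The data are synthetic.

```python
import numpy as np

def l2_boosting(X, y, n_iter=200, nu=0.1):
    """Componentwise L2 Boosting: at each step, fit the current residual with the
    single predictor that reduces it most and take a small step (shrinkage nu).
    The number of iterations controls the amount of regularisation."""
    n, p = X.shape
    beta = np.zeros(p)
    resid = y.astype(float).copy()
    for _ in range(n_iter):
        # least-squares coefficient of the residual on each column
        coefs = X.T @ resid / np.einsum('ij,ij->j', X, X)
        errors = [np.sum((resid - c * X[:, j]) ** 2) for j, c in enumerate(coefs)]
        j = int(np.argmin(errors))
        beta[j] += nu * coefs[j]
        resid -= nu * coefs[j] * X[:, j]
    return beta

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))
y = X[:, 0] * 3.0 - X[:, 5] * 2.0 + 0.1 * rng.standard_normal(100)
print(np.round(l2_boosting(X, y), 2)[:8])   # large weights on columns 0 and 5
```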

  13. A stochastic model of nanoparticle self-assembly on Cayley trees

    International Nuclear Information System (INIS)

    Mazilu, I; Schwen, E M; Banks, W E; Pope, B K; Mazilu, D A

    2015-01-01

    Nanomedicine is an emerging area of medical research that uses innovative nanotechnologies to improve the delivery of therapeutic and diagnostic agents with maximum clinical benefit. We present a versatile stochastic model that can be used to capture the basic features of drug encapsulation of nanoparticles on tree-like synthetic polymers called dendrimers. The geometry of a dendrimer is described mathematically as a Cayley tree. We use our stochastic model to study the dynamics of deposition and release of monomers (simulating the drug molecules) on Cayley trees (simulating dendrimers). We present analytical and Monte Carlo simulation results for the particle density on Cayley trees of coordination number three and four

  14. Low-frequency scaling applied to stochastic finite-fault modeling

    Science.gov (United States)

    Crane, Stephen; Motazedian, Dariush

    2014-01-01

    Stochastic finite-fault modeling is an important tool for simulating moderate to large earthquakes. It has proven to be useful in applications that require a reliable estimation of ground motions, mostly in the spectral frequency range of 1 to 10 Hz, which is the range of most interest to engineers. However, since there can be little resemblance between the low-frequency spectra of large and small earthquakes, this portion can be difficult to simulate using stochastic finite-fault techniques. This paper introduces two different methods to scale low-frequency spectra for stochastic finite-fault modeling. One method multiplies the subfault source spectrum by an empirical function. This function has three parameters to scale the low-frequency spectra: the level of scaling and the start and end frequencies of the taper. This empirical function adjusts the earthquake spectra only between the desired frequencies, conserving seismic moment in the simulated spectra. The other method is an empirical low-frequency coefficient that is added to the subfault corner frequency. This new parameter changes the ratio between high and low frequencies. For each simulation, the entire earthquake spectra is adjusted, which may result in the seismic moment not being conserved for a simulated earthquake. These low-frequency scaling methods were used to reproduce recorded earthquake spectra from several earthquakes recorded in the Pacific Earthquake Engineering Research Center (PEER) Next Generation Attenuation Models (NGA) database. There were two methods of determining the stochastic parameters of best fit for each earthquake: a general residual analysis and an earthquake-specific residual analysis. Both methods resulted in comparable values for stress drop and the low-frequency scaling parameters; however, the earthquake-specific residual analysis obtained a more accurate distribution of the averaged residuals.

  15. A data driven nonlinear stochastic model for blood glucose dynamics.

    Science.gov (United States)

    Zhang, Yan; Holt, Tim A; Khovanova, Natalia

    2016-03-01

    The development of adequate mathematical models for blood glucose dynamics may improve early diagnosis and control of diabetes mellitus (DM). We have developed a stochastic nonlinear second order differential equation to describe the response of blood glucose concentration to food intake using continuous glucose monitoring (CGM) data. A variational Bayesian learning scheme was applied to define the number and values of the system's parameters by iterative optimisation of free energy. The model has the minimal order and number of parameters to successfully describe blood glucose dynamics in people with and without DM. The model accounts for the nonlinearity and stochasticity of the underlying glucose-insulin dynamic process. Being data-driven, it takes full advantage of available CGM data and, at the same time, reflects the intrinsic characteristics of the glucose-insulin system without detailed knowledge of the physiological mechanisms. We have shown that the dynamics of some postprandial blood glucose excursions can be described by a reduced (linear) model, previously seen in the literature. A comprehensive analysis demonstrates that deterministic system parameters belong to different ranges for diabetes and controls. Implications for clinical practice are discussed. This is the first study introducing a continuous data-driven nonlinear stochastic model capable of describing both DM and non-DM profiles. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd.. All rights reserved.

  16. Stochastic modeling for river pollution of Sungai Perlis

    International Nuclear Information System (INIS)

    Yunus, Nurul Izzaty Mohd.; Rahman, Haliza Abd.; Bahar, Arifah

    2015-01-01

    River pollution has been recognized as a contributor to a wide range of health problems and disorders in humans. It can pose health dangers to humans who come into contact with it, either directly or indirectly. Therefore, it is important to measure the concentration of Biochemical Oxygen Demand (BOD) as a water quality parameter, since this parameter has long been the basic means for determining the degree of water pollution in rivers. In this study, BOD is used as a parameter to estimate the water quality at Sungai Perlis. It has been observed that Sungai Perlis is polluted due to lack of management and improper use of resources. Therefore, it is of importance to model the Sungai Perlis water quality in order to describe and predict the water quality system. A secondary data set of BOD concentrations is used, extracted from the Drainage and Irrigation Department Perlis State website. The first-order differential equation from the Streeter–Phelps model was utilized as a deterministic model. Then, the model was developed into a stochastic model. Results from this study show that the stochastic model is more adequate for describing and predicting the BOD concentration and the water quality system in Sungai Perlis, as it yields a smaller mean squared error (MSE)

  17. Stochastic modeling for river pollution of Sungai Perlis

    Energy Technology Data Exchange (ETDEWEB)

    Yunus, Nurul Izzaty Mohd.; Rahman, Haliza Abd. [Department of Mathematical Sciences, Faculty of Science, Universiti Teknologi Malaysia, 81310 Johor Bahru, Johor (Malaysia); Bahar, Arifah [UTM-Centre of Industrial and Applied Mathematics (UTM-CIAM), Universiti Teknologi Malaysia, 81310 Johor Bahru, Johor (Malaysia)

    2015-02-03

    River pollution has been recognized as a contributor to a wide range of health problems and disorders in humans. It can pose health dangers to humans who come into contact with it, either directly or indirectly. Therefore, it is important to measure the concentration of Biochemical Oxygen Demand (BOD) as a water quality parameter, since this parameter has long been the basic means for determining the degree of water pollution in rivers. In this study, BOD is used as a parameter to estimate the water quality at Sungai Perlis. It has been observed that Sungai Perlis is polluted due to lack of management and improper use of resources. Therefore, it is of importance to model the Sungai Perlis water quality in order to describe and predict the water quality system. A secondary data set of BOD concentrations is used, extracted from the Drainage and Irrigation Department Perlis State website. The first-order differential equation from the Streeter–Phelps model was utilized as a deterministic model. Then, the model was developed into a stochastic model. Results from this study show that the stochastic model is more adequate for describing and predicting the BOD concentration and the water quality system in Sungai Perlis, as it yields a smaller mean squared error (MSE)
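
    The records above do not give the stochastic formulation explicitly; a minimal sketch, assuming the first-order Streeter–Phelps BOD decay equation with multiplicative white noise added (dL = -k1 L dt + sigma L dW) and illustrative parameter values, simulated by Euler–Maruyama:

```python
import numpy as np

def bod_paths(L0=20.0, k1=0.23, sigma=0.05, T=30.0, dt=0.1, n_paths=5, seed=0):
    """Euler-Maruyama paths for a stochastic BOD decay equation
    dL = -k1 * L dt + sigma * L dW (the drift is the Streeter-Phelps decay term)."""
    rng = np.random.default_rng(seed)
    steps = int(T / dt)
    L = np.full(n_paths, L0)
    out = [L.copy()]
    for _ in range(steps):
        dW = np.sqrt(dt) * rng.standard_normal(n_paths)
        L = L + (-k1 * L) * dt + sigma * L * dW
        out.append(L.copy())
    return np.array(out)   # shape (steps+1, n_paths), BOD in mg/L

paths = bod_paths()
print(paths[-1])   # BOD after 30 days; the deterministic value is 20*exp(-0.23*30)
```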

  18. The Long Time Behavior of a Stochastic Logistic Model with Infinite Delay and Impulsive Perturbation

    OpenAIRE

    Lu, Chun; Wu, Kaining

    2016-01-01

    This paper considers a stochastic logistic model with infinite delay and impulsive perturbation. Firstly, with the space $C_{g}$ as phase space, the definition of a solution to a stochastic functional differential equation with infinite delay and impulsive perturbation is established. According to this definition, we show that our model has a unique global positive solution. Then we establish sufficient and necessary conditions for extinction and stochastic permanence of the…

  19. Stochastic inverse problems: Models and metrics

    International Nuclear Information System (INIS)

    Sabbagh, Elias H.; Sabbagh, Harold A.; Murphy, R. Kim; Aldrin, John C.; Annis, Charles; Knopp, Jeremy S.

    2015-01-01

    In past work, we introduced model-based inverse methods, and applied them to problems in which the anomaly could be reasonably modeled by simple canonical shapes, such as rectangular solids. In these cases the parameters to be inverted would be length, width and height, as well as the occasional probe lift-off or rotation. We are now developing a formulation that allows more flexibility in modeling complex flaws. The idea consists of expanding the flaw in a sequence of basis functions, and then solving for the expansion coefficients of this sequence, which are modeled as independent random variables, uniformly distributed over their range of values. There are a number of applications of such modeling: 1. Connected cracks and multiple half-moons, which we have noted in a POD set. Ideally we would like to distinguish connected cracks from one long shallow crack. 2. Cracks of irregular profile and shape which have appeared in cold work holes during bolt-hole eddy-current inspection. One side of such cracks is much deeper than the other. 3. L or C shaped crack profiles at the surface, examples of which have been seen in bolt-hole cracks. By formulating problems in a stochastic sense, we are able to leverage the stochastic global optimization algorithms in NLSE, which is resident in VIC-3D®, to answer questions of global minimization and to compute confidence bounds using the sensitivity coefficient that we get from NLSE. We will also address the issue of surrogate functions which are used during the inversion process, and how they contribute to the quality of the estimation of the bounds.

  20. Stochastic inverse problems: Models and metrics

    Science.gov (United States)

    Sabbagh, Elias H.; Sabbagh, Harold A.; Murphy, R. Kim; Aldrin, John C.; Annis, Charles; Knopp, Jeremy S.

    2015-03-01

    In past work, we introduced model-based inverse methods, and applied them to problems in which the anomaly could be reasonably modeled by simple canonical shapes, such as rectangular solids. In these cases the parameters to be inverted would be length, width and height, as well as the occasional probe lift-off or rotation. We are now developing a formulation that allows more flexibility in modeling complex flaws. The idea consists of expanding the flaw in a sequence of basis functions, and then solving for the expansion coefficients of this sequence, which are modeled as independent random variables, uniformly distributed over their range of values. There are a number of applications of such modeling: 1. Connected cracks and multiple half-moons, which we have noted in a POD set. Ideally we would like to distinguish connected cracks from one long shallow crack. 2. Cracks of irregular profile and shape which have appeared in cold work holes during bolt-hole eddy-current inspection. One side of such cracks is much deeper than the other. 3. L or C shaped crack profiles at the surface, examples of which have been seen in bolt-hole cracks. By formulating problems in a stochastic sense, we are able to leverage the stochastic global optimization algorithms in NLSE, which is resident in VIC-3D®, to answer questions of global minimization and to compute confidence bounds using the sensitivity coefficient that we get from NLSE. We will also address the issue of surrogate functions which are used during the inversion process, and how they contribute to the quality of the estimation of the bounds.

  1. Stochastic Modeling and Deterministic Limit of Catalytic Surface Processes

    DEFF Research Database (Denmark)

    Starke, Jens; Reichert, Christian; Eiswirth, Markus

    2007-01-01

    … of stochastic origin can be observed in experiments. The models include a new approach to the platinum phase transition, which allows for a unification of existing models for Pt(100) and Pt(110). The rich nonlinear dynamical behavior of the macroscopic reaction kinetics is investigated and shows good agreement…

  2. Developing stochastic model of thrust and flight dynamics for small UAVs

    Science.gov (United States)

    Tjhai, Chandra

    This thesis presents a stochastic thrust model and aerodynamic model for small propeller-driven UAVs whose power plant is a small electric motor. First, a model is developed which expresses the thrust generated by a small propeller driven by an electric motor as a function of throttle setting and commanded engine RPM. A perturbation of this model is then used to relate the uncertainty in the commanded throttle and engine RPM to the error in the predicted thrust. Such a stochastic model is indispensable in the design of state estimation and control systems for UAVs, where the performance requirements of the systems are specified in stochastic terms. It is shown that thrust prediction models for small UAVs are not simple, explicit functions relating throttle input and RPM command to thrust generated. Rather, they are non-linear, iterative procedures which depend on a geometric description of the propeller and a mathematical model of the motor. A detailed derivation of the iterative procedure is presented, and the impact of errors which arise from inaccurate propeller and motor descriptions is discussed. Validation results from a series of wind tunnel tests are presented. The results show a favorable statistical agreement between the thrust uncertainty predicted by the model and the errors measured in the wind tunnel. The uncertainty model of aircraft aerodynamic coefficients developed based on wind tunnel experiments will be discussed at the end of this thesis.

  3. The threshold of a stochastic delayed SIR epidemic model with vaccination

    Science.gov (United States)

    Liu, Qun; Jiang, Daqing

    2016-11-01

    In this paper, we study the threshold dynamics of a stochastic delayed SIR epidemic model with vaccination. We obtain sufficient conditions for extinction and persistence in the mean of the epidemic. The threshold between persistence in the mean and extinction of the stochastic system is also obtained. Compared with the corresponding deterministic model, the threshold affected by the white noise is smaller than the basic reproduction number $\bar{R}_0$ of the deterministic system. Results show that time delay has important effects on the persistence and extinction of the epidemic.

  4. Calibration of hydrological models using flow-duration curves

    Directory of Open Access Journals (Sweden)

    I. K. Westerberg

    2011-07-01

    Full text available: The degree of belief we have in predictions from hydrologic models will normally depend on how well they can reproduce observations. Calibrations with traditional performance measures, such as the Nash-Sutcliffe model efficiency, are challenged by problems including: (1) uncertain discharge data, (2) variable sensitivity of different performance measures to different flow magnitudes, (3) influence of unknown input/output errors and (4) inability to evaluate model performance when observation time periods for discharge and model input data do not overlap. This paper explores a calibration method using flow-duration curves (FDCs) to address these problems. The method focuses on reproducing the observed discharge frequency distribution rather than the exact hydrograph. It consists of applying limits of acceptability for selected evaluation points (EPs) on the observed uncertain FDC in the extended GLUE approach. Two ways of selecting the EPs were tested – based on equal intervals of discharge and of volume of water. The method was tested and compared to a calibration using the traditional model efficiency for the daily four-parameter WASMOD model in the Paso La Ceiba catchment in Honduras and for Dynamic TOPMODEL evaluated at an hourly time scale for the Brue catchment in Great Britain. The volume method of selecting EPs gave the best results in both catchments, with better calibrated slow flow, recession and evaporation than the other criteria. Observed and simulated time series of uncertain discharges agreed better for this method both in calibration and prediction in both catchments. An advantage with the method is that the rejection criterion is based on an estimation of the uncertainty in discharge data and that the EPs of the FDC can be chosen to reflect the aims of the modelling application, e.g. using more/less EPs at high/low flows. While the method appears less sensitive to epistemic input/output errors than previous use of limits of…
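
    As a small illustration of the ingredients described above (not the paper's GLUE implementation), the sketch below computes a flow-duration curve from a synthetic daily discharge series and selects evaluation points so that each carries roughly an equal share of the total flow volume, which is one plausible reading of the volume-based EP selection.

```python
import numpy as np

def flow_duration_curve(q):
    """Return exceedance probabilities and the corresponding sorted flows."""
    q_sorted = np.sort(q)[::-1]
    exceedance = np.arange(1, len(q) + 1) / (len(q) + 1.0)
    return exceedance, q_sorted

def volume_based_eps(q, n_points=10):
    """Pick evaluation points on the FDC so that each carries an equal share of
    the total flow volume (an assumed, illustrative selection rule)."""
    exceedance, q_sorted = flow_duration_curve(q)
    cum_volume = np.cumsum(q_sorted) / q_sorted.sum()
    targets = (np.arange(n_points) + 0.5) / n_points
    idx = np.searchsorted(cum_volume, targets)
    return exceedance[idx], q_sorted[idx]

rng = np.random.default_rng(0)
q_obs = np.exp(rng.normal(1.0, 1.0, 3650))   # synthetic daily discharge
print(volume_based_eps(q_obs))
```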

  5. Stochastic modelling of turbulent combustion for design optimization of gas turbine combustors

    Science.gov (United States)

    Mehanna Ismail, Mohammed Ali

    The present work covers the development and the implementation of an efficient algorithm for the design optimization of gas turbine combustors. The purpose is to explore the possibilities and indicate constructive suggestions for optimization techniques as alternative methods for designing gas turbine combustors. The algorithm is general to the extent that no constraints are imposed on the combustion phenomena or on the combustor configuration. The optimization problem is broken down into two elementary problems: the first is the optimum search algorithm, and the second is the turbulent combustion model used to determine the combustor performance parameters. These performance parameters constitute the objective and physical constraints in the optimization problem formulation. The examination of both turbulent combustion phenomena and the gas turbine design process suggests that the turbulent combustion model represents a crucial part of the optimization algorithm. The basic requirements needed for a turbulent combustion model to be successfully used in a practical optimization algorithm are discussed. In principle, the combustion model should comply with the conflicting requirements of high fidelity, robustness and computational efficiency. To that end, the problem of turbulent combustion is discussed and the current state of the art of turbulent combustion modelling is reviewed. According to this review, turbulent combustion models based on the composition PDF transport equation are found to be good candidates for application in the present context. However, these models are computationally expensive. To overcome this difficulty, two different models based on the composition PDF transport equation were developed: an improved Lagrangian Monte Carlo composition PDF algorithm and the generalized stochastic reactor model. Improvements in the Lagrangian Monte Carlo composition PDF model performance and its computational efficiency were achieved through the

  6. Mixed Effects Modeling Using Stochastic Differential Equations: Illustrated by Pharmacokinetic Data of Nicotinic Acid in Obese Zucker Rats.

    Science.gov (United States)

    Leander, Jacob; Almquist, Joachim; Ahlström, Christine; Gabrielsson, Johan; Jirstrand, Mats

    2015-05-01

    Inclusion of stochastic differential equations in mixed effects models provides means to quantify and distinguish three sources of variability in data. In addition to the two commonly encountered sources, measurement error and interindividual variability, we also consider uncertainty in the dynamical model itself. To this end, we extend the ordinary differential equation setting used in nonlinear mixed effects models to include stochastic differential equations. The approximate population likelihood is derived using the first-order conditional estimation with interaction method and extended Kalman filtering. To illustrate the application of the stochastic differential mixed effects model, two pharmacokinetic models are considered. First, we use a stochastic one-compartmental model with first-order input and nonlinear elimination to generate synthetic data in a simulated study. We show that by using the proposed method, the three sources of variability can be successfully separated. If the stochastic part is neglected, the parameter estimates become biased, and the measurement error variance is significantly overestimated. Second, we consider an extension to a stochastic pharmacokinetic model in a preclinical study of nicotinic acid kinetics in obese Zucker rats. The parameter estimates are compared between a deterministic and a stochastic NiAc disposition model, respectively. Discrepancies between model predictions and observations, previously described as measurement noise only, are now separated into a comparatively lower level of measurement noise and a significant uncertainty in model dynamics. These examples demonstrate that stochastic differential mixed effects models are useful tools for identifying incomplete or inaccurate model dynamics and for reducing potential bias in parameter estimates due to such model deficiencies.
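
    The simulated study above uses a stochastic one-compartment model with first-order input and nonlinear elimination; its exact parameterisation is not given in the abstract, so the following Euler–Maruyama sketch assumes first-order absorption, Michaelis–Menten elimination and multiplicative noise on the concentration, with invented parameter values.

```python
import numpy as np

def simulate_pk(dose=10.0, ka=1.0, vmax=2.0, km=1.0, V=5.0, sigma=0.1,
                T=24.0, dt=0.01, seed=0):
    """Euler-Maruyama sketch of a stochastic one-compartment PK model:
      dA_gut = -ka*A_gut dt
      dC     = [ka*A_gut/V - vmax*C/(km + C)] dt + sigma*C dW
    All parameter values are illustrative, not from the cited study."""
    rng = np.random.default_rng(seed)
    steps = int(T / dt)
    a_gut, c = dose, 0.0
    conc = [c]
    for _ in range(steps):
        dW = np.sqrt(dt) * rng.standard_normal()
        a_gut += -ka * a_gut * dt
        c += (ka * a_gut / V - vmax * c / (km + c)) * dt + sigma * c * dW
        c = max(c, 0.0)
        conc.append(c)
    return np.array(conc)

print(simulate_pk().max())   # peak concentration of one noisy profile
```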

  7. Persistence and extinction of a stochastic single-species model under regime switching in a polluted environment II.

    Science.gov (United States)

    Liu, Meng; Wang, Ke

    2010-12-07

    This is a continuation of our paper [Liu, M., Wang, K., 2010. Persistence and extinction of a stochastic single-species model under regime switching in a polluted environment, J. Theor. Biol. 264, 934-944]. Taking both white noise and colored noise into account, a stochastic single-species model under regime switching in a polluted environment is studied. Sufficient conditions for extinction, stochastic nonpersistence in the mean, stochastic weak persistence and stochastic permanence are established. The threshold between stochastic weak persistence and extinction is obtained. The results show that a different type of noise has a different effect on the survival results. Copyright © 2010 Elsevier Ltd. All rights reserved.

  8. Forecasting total natural-gas consumption in Spain by using the stochastic Gompertz innovation diffusion model

    International Nuclear Information System (INIS)

    Gutierrez, R.; Nafidi, A.; Gutierrez Sanchez, R.

    2005-01-01

    The principal objective of the present study is to examine the possibilities of using a Gompertz-type innovation diffusion process as a stochastic growth model of natural-gas consumption in Spain, and to compare our results with those obtained, on the one hand, by stochastic logistic innovation modelling and, on the other, by using a stochastic lognormal growth model based on a non-innovation diffusion process. Such a comparison is carried out taking into account the macroeconomic characteristics and natural-gas consumption patterns in Spain, both of which reflect the current expansive situation characterizing the Spanish economy. From the technical standpoint a contribution is also made to the theory of the stochastic Gompertz Innovation diffusion process (SGIDP), as applied to the case in question. (author)

  9. On the Radio-emitting Particles of the Crab Nebula: Stochastic Acceleration Model

    Energy Technology Data Exchange (ETDEWEB)

    Tanaka, Shuta J. [Department of Physics, Faculty of Science and Engineering, Konan University, 8-9-1 Okamoto, Kobe, Hyogo 658-8501 (Japan); Asano, Katsuaki, E-mail: sjtanaka@center.konan-u.ac.jp [Institute for Cosmic Ray Research, The University of Tokyo, 5-1-5 Kashiwa-no-ha, Kashiwa City, Chiba, 277-8582 (Japan)

    2017-06-01

    The broadband emission of pulsar wind nebulae (PWNe) is well described by non-thermal emissions from accelerated electrons and positrons. However, the standard shock acceleration model of PWNe does not account for the hard spectrum in radio wavelengths. The origin of the radio-emitting particles is also important to determine the pair production efficiency in the pulsar magnetosphere. Here, we propose a possible resolution for the particle energy distribution in PWNe; the radio-emitting particles are not accelerated at the pulsar wind termination shock but are stochastically accelerated by turbulence inside PWNe. We upgrade our past one-zone spectral evolution model to include the energy diffusion, i.e., the stochastic acceleration, and apply the model to the Crab Nebula. A fairly simple form of the energy diffusion coefficient is assumed for this demonstrative study. For a particle injection to the stochastic acceleration process, we consider the continuous injection from the supernova ejecta or the impulsive injection associated with supernova explosion. The observed broadband spectrum and the decay of the radio flux are reproduced by tuning the amount of the particle injected to the stochastic acceleration process. The acceleration timescale and the duration of the acceleration are required to be a few decades and a few hundred years, respectively. Our results imply that some unveiled mechanisms, such as back reaction to the turbulence, are required to make the energies of stochastically and shock-accelerated particles comparable.

  10. An Empirical Application of a Two-Factor Model of Stochastic Volatility

    Czech Academy of Sciences Publication Activity Database

    Kuchyňka, Alexandr

    2008-01-01

    Roč. 17, č. 3 (2008), s. 243-253 ISSN 1210-0455 R&D Projects: GA ČR GA402/07/1113; GA MŠk(CZ) LC06075 Institutional research plan: CEZ:AV0Z10750506 Keywords : stochastic volatility * Kalman filter Subject RIV: AH - Economics http://library.utia.cas.cz/separaty/2008/E/kuchynka-an empirical application of a two-factor model of stochastic volatility.pdf

  11. Optically levitated nanoparticle as a model system for stochastic bistable dynamics.

    Science.gov (United States)

    Ricci, F; Rica, R A; Spasenović, M; Gieseler, J; Rondin, L; Novotny, L; Quidant, R

    2017-05-09

    Nano-mechanical resonators have gained an increasing importance in nanotechnology owing to their contributions to both fundamental and applied science. Yet, their small dimensions and mass raise some challenges, as their dynamics becomes dominated by nonlinearities that degrade their performance, for instance in sensing applications. Here, we report on the precise control of the nonlinear and stochastic bistable dynamics of a levitated nanoparticle in high vacuum. We demonstrate how it can lead to efficient signal amplification schemes, including stochastic resonance. This work contributes to showing the use of levitated nanoparticles as a model system for stochastic bistable dynamics, with applications to a wide variety of fields.

  12. Stochastic Model for Population Exposed to Low Level Risk

    International Nuclear Information System (INIS)

    Merkle, J.M.

    1996-01-01

    In this paper, a stochastic model for population size, i.e. calculation of the number of deaths due to lethal stochastic health effects caused by exposure to low-level ionising radiation, is presented. The model is defined for a subpopulation with the parameters (a, b) fixed. Using the corresponding density function, it is possible to find all the quantities of interest by averaging over all possible values of (a, b). All processes are first defined for one radionuclide, exposure pathway and health effect under consideration. The results obtained in this paper are the basic quantities in risk assessment, loss of life expectancy, etc. The results presented in this paper are also applicable to other sources of low-level risk, not only radiation risk

  13. Quantitative sociodynamics stochastic methods and models of social interaction processes

    CERN Document Server

    Helbing, Dirk

    1995-01-01

    Quantitative Sociodynamics presents a general strategy for interdisciplinary model building and its application to a quantitative description of behavioural changes based on social interaction processes. Originally, the crucial methods for the modeling of complex systems (stochastic methods and nonlinear dynamics) were developed in physics but they have very often proved their explanatory power in chemistry, biology, economics and the social sciences. Quantitative Sociodynamics provides a unified and comprehensive overview of the different stochastic methods, their interrelations and properties. In addition, it introduces the most important concepts from nonlinear dynamics (synergetics, chaos theory). The applicability of these fascinating concepts to social phenomena is carefully discussed. By incorporating decision-theoretical approaches a very fundamental dynamic model is obtained which seems to open new perspectives in the social sciences. It includes many established models as special cases, e.g. the log...

  14. Quantitative Sociodynamics Stochastic Methods and Models of Social Interaction Processes

    CERN Document Server

    Helbing, Dirk

    2010-01-01

    This new edition of Quantitative Sociodynamics presents a general strategy for interdisciplinary model building and its application to a quantitative description of behavioral changes based on social interaction processes. Originally, the crucial methods for the modeling of complex systems (stochastic methods and nonlinear dynamics) were developed in physics and mathematics, but they have very often proven their explanatory power in chemistry, biology, economics and the social sciences as well. Quantitative Sociodynamics provides a unified and comprehensive overview of the different stochastic methods, their interrelations and properties. In addition, it introduces important concepts from nonlinear dynamics (e.g. synergetics, chaos theory). The applicability of these fascinating concepts to social phenomena is carefully discussed. By incorporating decision-theoretical approaches, a fundamental dynamic model is obtained, which opens new perspectives in the social sciences. It includes many established models a...

  15. Stochastic climate theory

    NARCIS (Netherlands)

    Gottwald, G.A.; Crommelin, D.T.; Franzke, C.L.E.; Franzke, C.L.E.; O'Kane, T.J.

    2017-01-01

    In this chapter we review stochastic modelling methods in climate science. First we provide a conceptual framework for stochastic modelling of deterministic dynamical systems based on the Mori-Zwanzig formalism. The Mori-Zwanzig equations contain a Markov term, a memory term and a term suggestive of

  16. Stochastic models in risk theory and management accounting

    NARCIS (Netherlands)

    Brekelmans, R.C.M.

    2000-01-01

    This thesis deals with stochastic models in two fields: risk theory and management accounting. Firstly, two extensions of the classical risk process are analyzed. A method is developed that computes bounds of the probability of ruin for the classical risk process extended with a constant interest

  17. On stochastic modeling of the modernized global positioning system (GPS) L2C signal

    International Nuclear Information System (INIS)

    Elsobeiey, Mohamed; El-Rabbany, Ahmed

    2010-01-01

    In order to take full advantage of the modernized GPS L2C signal, it is essential that its stochastic characteristics and code bias be rigorously determined. In this paper, long sessions of GPS measurements are used to study the stochastic characteristics of the modernized GPS L2C signal. As a byproduct, the stochastic characteristics of the legacy GPS signals, namely the C/A and P2 codes, are also determined, which are used to verify the developed stochastic model of the modernized signal. The differential code biases between P2 and C2, DCB(P2-C2), are also estimated using the Bernese GPS software. It is shown that the developed models improved the precise point positioning (PPP) solution and convergence time

  18. Stochastic modeling of the hypothalamic pulse generator activity.

    Science.gov (United States)

    Camproux, A C; Thalabard, J C; Thomas, G

    1994-11-01

    Luteinizing hormone (LH) is released by the pituitary in discrete pulses. In the monkey, the appearance of LH pulses in the plasma is invariably associated with sharp increases (i.e, volleys) in the frequency of the hypothalamic pulse generator electrical activity, so that continuous monitoring of this activity by telemetry provides a unique means to study the temporal structure of the mechanism generating the pulses. To assess whether the times of occurrence and durations of previous volleys exert significant influence on the timing of the next volley, we used a class of periodic counting process models that specify the stochastic intensity of the process as the product of two factors: 1) a periodic baseline intensity and 2) a stochastic regression function with covariates representing the influence of the past. This approach allows the characterization of circadian modulation and memory range of the process underlying hypothalamic pulse generator activity, as illustrated by fitting the model to experimental data from two ovariectomized rhesus monkeys.

  19. Maximum likelihood approach for several stochastic volatility models

    International Nuclear Information System (INIS)

    Camprodon, Jordi; Perelló, Josep

    2012-01-01

    Volatility measures the amplitude of price fluctuations. Despite it being one of the most important quantities in finance, volatility is not directly observable. Here we apply a maximum likelihood method which assumes that price and volatility follow a two-dimensional diffusion process where volatility is the stochastic diffusion coefficient of the log-price dynamics. We apply this method to the simplest versions of the expOU, the OU and the Heston stochastic volatility models and we study their performance in terms of the log-price probability, the volatility probability, and its Mean First-Passage Time. The approach has some predictive power on the amplitude of future returns from knowledge of the current volatility alone. The assumed models do not consider long-range volatility autocorrelation or the asymmetric return-volatility cross-correlation, but the method still yields these two important stylized facts very naturally. We apply the method to different market indices, with good performance in all cases. (paper)

  20. A termination criterion for parameter estimation in stochastic models in systems biology.

    Science.gov (United States)

    Zimmer, Christoph; Sahle, Sven

    2015-11-01

    Parameter estimation procedures are a central aspect of modeling approaches in systems biology. They are often computationally expensive, especially when the models take stochasticity into account. Typically parameter estimation involves the iterative optimization of an objective function that describes how well the model fits some measured data with a certain set of parameter values. In order to limit the computational expenses it is therefore important to apply an adequate stopping criterion for the optimization process, so that the optimization continues at least until a reasonable fit is obtained, but not much longer. In the case of stochastic modeling, at least some parameter estimation schemes involve an objective function that is itself a random variable. This means that plain convergence tests are not a priori suitable as stopping criteria. This article suggests a termination criterion suited to optimization problems in parameter estimation arising from stochastic models in systems biology. The termination criterion is developed for optimization algorithms that involve populations of parameter sets, such as particle swarm or evolutionary algorithms. It is based on comparing the variance of the objective function over the whole population of parameter sets with the variance of repeated evaluations of the objective function at the best parameter set. The performance is demonstrated for several different algorithms. To test the termination criterion we choose polynomial test functions as well as systems biology models such as an Immigration-Death model and a bistable genetic toggle switch. The genetic toggle switch is an especially challenging test case as it shows a stochastic switching between two steady states which is qualitatively different from the model behavior in a deterministic model. Copyright © 2015. Published by Elsevier Ireland Ltd.
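
    A minimal sketch of the comparison described above, assuming a noisy objective function: stop when the spread of objective values across the population is no longer clearly larger than the spread of repeated evaluations at the best parameter set. The threshold factor and the test problem are illustrative, not the paper's.

```python
import numpy as np

def should_terminate(objective, population, rng, n_repeats=20, factor=2.0):
    """Compare the variance of the (stochastic) objective over the population
    with the variance of repeated evaluations at the current best parameter set."""
    pop_values = np.array([objective(p, rng) for p in population])
    best = population[int(np.argmin(pop_values))]
    repeats = np.array([objective(best, rng) for _ in range(n_repeats)])
    return pop_values.var() <= factor * repeats.var()

# Toy stochastic objective: a quadratic plus simulation noise
obj = lambda p, rng: (p[0] - 1.0) ** 2 + (p[1] + 2.0) ** 2 + 0.1 * rng.standard_normal()
rng = np.random.default_rng(0)
spread_out = [np.array([3.0, 3.0]), np.array([-3.0, 0.0]), np.array([1.5, -2.5])]
converged = [np.array([1.0, -2.0]) + 0.01 * rng.standard_normal(2) for _ in range(10)]
print(should_terminate(obj, spread_out, rng))   # False: keep optimising
print(should_terminate(obj, converged, rng))    # usually True: limited by objective noise
```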

  1. Drift Scale Modeling: Study of Unsaturated Flow into a Drift Using a Stochastic Continuum Model

    International Nuclear Information System (INIS)

    Birkholzer, J.T.; Tsang, C.F.; Tsang, Y.W.; Wang, J.S

    1996-01-01

    Unsaturated flow in heterogeneous fractured porous rock was simulated using a stochastic continuum model (SCM). In this model, both the more conductive fractures and the less permeable matrix are generated within the framework of a single continuum stochastic approach, based on non-parametric indicator statistics. High-permeability fracture zones are distinguished from low-permeability matrix zones in that they are assigned a long-range correlation structure in prescribed directions. The SCM was applied to study small-scale flow in the vicinity of an access tunnel, which is currently being drilled in the unsaturated fractured tuff formations at Yucca Mountain, Nevada. Extensive underground testing is underway in this tunnel to investigate the suitability of Yucca Mountain as an underground nuclear waste repository. Different flow scenarios were studied in the present paper, considering the flow conditions before and after the tunnel emplacement, and assuming steady-state net infiltration as well as episodic pulse infiltration. Although the capability of the stochastic continuum model has not yet been fully explored, it has been demonstrated that the SCM is a good alternative model capable of describing heterogeneous flow processes in unsaturated fractured tuff at Yucca Mountain

  2. Handbook of EOQ inventory problems stochastic and deterministic models and applications

    CERN Document Server

    Choi, Tsan-Ming

    2013-01-01

    This book explores deterministic and stochastic EOQ-model based problems and applications, presenting technical analyses of single-echelon EOQ model based inventory problems, and applications of the EOQ model for multi-echelon supply chain inventory analysis.
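    For reference, the classical deterministic EOQ formula underlying the single-echelon analyses collected in the book is Q* = sqrt(2DK/h), where D is annual demand, K the fixed ordering cost and h the per-unit annual holding cost; the numbers in the small example below are made up for illustration.

    ```python
    # Classical economic order quantity (EOQ) formula.
    import math

    def eoq(demand_per_year: float, order_cost: float, holding_cost: float) -> float:
        """Optimal order quantity Q* = sqrt(2*D*K/h)."""
        return math.sqrt(2.0 * demand_per_year * order_cost / holding_cost)

    print(eoq(demand_per_year=12_000, order_cost=50.0, holding_cost=2.5))  # ~ 693 units
    ```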

  3. Logarithmic transformed statistical models in calibration

    International Nuclear Information System (INIS)

    Zeis, C.D.

    1975-01-01

    A general type of statistical model used for the calibration of instruments having the property that the standard deviations of the observed values increase as a function of the mean value is described. The application to the Helix Counter at the Rocky Flats Plant is treated primarily from a theoretical point of view. The Helix Counter measures the amount of plutonium in certain types of chemicals. The method described can also be used for other calibrations. (U.S.)

  4. Model reduction for slow–fast stochastic systems with metastable behaviour

    International Nuclear Information System (INIS)

    Bruna, Maria; Chapman, S. Jonathan; Smith, Matthew J.

    2014-01-01

    The quasi-steady-state approximation (or stochastic averaging principle) is a useful tool in the study of multiscale stochastic systems, giving a practical method by which to reduce the number of degrees of freedom in a model. The method is extended here to slow–fast systems in which the fast variables exhibit metastable behaviour. The key parameter that determines the form of the reduced model is the ratio of the timescale for the switching of the fast variables between metastable states to the timescale for the evolution of the slow variables. The method is illustrated with two examples: one from biochemistry (a fast-species-mediated chemical switch coupled to a slower varying species), and one from ecology (a predator–prey system). Numerical simulations of each model reduction are compared with those of the full system

  5. Using genetic algorithms to calibrate a water quality model.

    Science.gov (United States)

    Liu, Shuming; Butler, David; Brazier, Richard; Heathwaite, Louise; Khu, Soon-Thiam

    2007-03-15

    With the increasing concern over the impact of diffuse pollution on water bodies, many diffuse pollution models have been developed in the last two decades. A common obstacle in using such models is how to determine the values of the model parameters. This is especially true when a model has a large number of parameters, which makes a full-range calibration expensive in terms of computing time. Compared with conventional optimisation approaches, soft computing techniques often have a faster convergence speed and are more efficient for global optimum searches. This paper presents an attempt to calibrate a diffuse pollution model using a genetic algorithm (GA). Designed to simulate the export of phosphorus from diffuse sources (agricultural land) and point sources (human), the Phosphorus Indicators Tool (PIT) version 1.1, on which this paper is based, consisted of 78 parameters. Previous studies have indicated the difficulty of full-range model calibration due to the number of parameters involved. In this paper, a GA was employed to carry out the model calibration in which all parameters were involved. A sensitivity analysis was also performed to investigate the impact of operators in the GA on its effectiveness in optimum searching. The calibration yielded satisfactory results and required reasonable computing time. The application of the PIT model to the Windrush catchment with optimum parameter values was demonstrated. The annual P loss was predicted as 4.4 kg P/ha/yr, which showed a good fit to the observed value.
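    A minimal sketch of a GA-based calibration loop of the kind described above; the `simulate` function stands in for the PIT model (which is not reproduced here), and the population size, operators and settings are illustrative assumptions rather than the study's configuration.

    ```python
    # Minimal genetic-algorithm calibration loop (illustrative only).
    import numpy as np

    def calibrate_ga(simulate, observed, bounds, pop_size=50, generations=200,
                     mutation_rate=0.1, seed=0):
        rng = np.random.default_rng(seed)
        lo, hi = bounds[:, 0], bounds[:, 1]
        pop = rng.uniform(lo, hi, size=(pop_size, len(lo)))

        def fitness(theta):
            # Sum of squared errors between simulated and observed outputs.
            return np.sum((simulate(theta) - observed) ** 2)

        for _ in range(generations):
            scores = np.array([fitness(ind) for ind in pop])
            parents = pop[np.argsort(scores)[: pop_size // 2]]        # truncation selection
            children = []
            while len(children) < pop_size - len(parents):
                a, b = parents[rng.integers(len(parents), size=2)]
                w = rng.random(len(lo))
                child = w * a + (1 - w) * b                            # arithmetic crossover
                mutate = rng.random(len(lo)) < mutation_rate
                child[mutate] = rng.uniform(lo[mutate], hi[mutate])    # uniform mutation
                children.append(child)
            pop = np.vstack([parents, children])

        scores = np.array([fitness(ind) for ind in pop])
        return pop[int(np.argmin(scores))]                             # best parameter set
    ```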

  6. Cosmic CARNage I: on the calibration of galaxy formation models

    Science.gov (United States)

    Knebe, Alexander; Pearce, Frazer R.; Gonzalez-Perez, Violeta; Thomas, Peter A.; Benson, Andrew; Asquith, Rachel; Blaizot, Jeremy; Bower, Richard; Carretero, Jorge; Castander, Francisco J.; Cattaneo, Andrea; Cora, Sofía A.; Croton, Darren J.; Cui, Weiguang; Cunnama, Daniel; Devriendt, Julien E.; Elahi, Pascal J.; Font, Andreea; Fontanot, Fabio; Gargiulo, Ignacio D.; Helly, John; Henriques, Bruno; Lee, Jaehyun; Mamon, Gary A.; Onions, Julian; Padilla, Nelson D.; Power, Chris; Pujol, Arnau; Ruiz, Andrés N.; Srisawat, Chaichalit; Stevens, Adam R. H.; Tollet, Edouard; Vega-Martínez, Cristian A.; Yi, Sukyoung K.

    2018-04-01

    We present a comparison of nine galaxy formation models, eight semi-analytical, and one halo occupation distribution model, run on the same underlying cold dark matter simulation (cosmological box of comoving width 125 h^-1 Mpc, with a dark-matter particle mass of 1.24 × 10^9 h^-1 M⊙) and the same merger trees. While their free parameters have been calibrated to the same observational data sets using two approaches, they nevertheless retain some `memory' of any previous calibration that served as the starting point (especially for the manually tuned models). For the first calibration, models reproduce the observed z = 0 galaxy stellar mass function (SMF) within 3σ. The second calibration extended the observational data to include the z = 2 SMF alongside the z ~ 0 star formation rate function, cold gas mass, and the black hole-bulge mass relation. Encapsulating the observed evolution of the SMF from z = 2 to 0 is found to be very hard within the context of the physics currently included in the models. We finally use our calibrated models to study the evolution of the stellar-to-halo mass (SHM) ratio. For all models, we find that the peak value of the SHM relation decreases with redshift. However, the trends seen for the evolution of the peak position as well as the mean scatter in the SHM relation are rather weak and strongly model dependent. Both the calibration data sets and model results are publicly available.

  7. Stochastic modeling of reinforced concrete structures exposed to chloride attack

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Frier, Christian

    2004-01-01

    For many reinforced concrete structures corrosion of reinforcement is an important problem since it can result in expensive maintenance and repair actions. Further, a significant reduction of the load-bearing capacity can occur. One mode of corrosion initiation is that the chloride content around...... concentration and reinforcement cover depth are modeled by stochastic fields. The paper contains a description of the parameters to be included in a stochastic model and a proposal for the information needed to obtain values for the parameters in order to be able to perform reliability investigations....... The distribution of the time to initiation of corrosion is estimated by simulation. As an example a bridge pier in a marine environment is considered....

  8. Stochastic Modeling of Reinforced Concrete Structures Exposed to Chloride Attack

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Frier, Christian

    2003-01-01

    For many reinforced concrete structures corrosion of reinforcement is an important problem since it can result in expensive maintenance and repair actions. Further, a significant reduction of the load-bearing capacity can occur. One mode of corrosion initiation is that the chloride content around...... concentration and reinforcement cover depth are modeled by stochastic fields. The paper contains a description of the parameters to be included in a stochastic model and a proposal for the information needed to obtain values for the parameters in order to be able to perform reliability investigations....... The distribution of the time to initiation of corrosion is estimated by simulation. As an example a bridge pier in a marine environment is considered....

  9. Using Cutting-Edge Tree-Based Stochastic Models to Predict Credit Risk

    Directory of Open Access Journals (Sweden)

    Khaled Halteh

    2018-05-01

    Full Text Available Credit risk is a critical issue that affects banks and companies on a global scale. Possessing the ability to accurately predict the level of credit risk has the potential to help the lender and borrower. This is achieved by limiting the number of loans provided to borrowers with poor financial health, thereby reducing the number of failed businesses, and, in effect, preventing economies from collapsing. This paper uses state-of-the-art stochastic models, namely decision trees, random forests, and stochastic gradient boosting, to add to the current literature on credit-risk modelling. The Australian mining industry has been selected to test our methodology. Mining in Australia generates around $138 billion annually, making up more than half of the total goods and services. This paper uses publicly available financial data from 750 risky and non-risky Australian mining companies as variables in our models. Our results indicate that stochastic gradient boosting was the superior model at correctly classifying the good and bad credit-rated companies within the mining sector. Our model showed that ‘Property, Plant, & Equipment (PPE) Turnover’, ‘Invested Capital Turnover’, and ‘Price over Earnings Ratio (PER)’ were the variables with the best explanatory power pertaining to predicting credit risk in the Australian mining sector.
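    An illustrative sketch of fitting a gradient-boosting classifier on financial ratios in the spirit of this study; the CSV file, column names and hyperparameters are hypothetical, not the paper's data set or settings.

    ```python
    # Gradient boosting for a binary credit-risk classification task (illustrative).
    import pandas as pd
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import classification_report

    data = pd.read_csv("mining_financials.csv")           # hypothetical data set
    X = data[["ppe_turnover", "invested_capital_turnover", "per"]]
    y = data["distressed"]                                 # 1 = risky, 0 = not risky

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, stratify=y, random_state=0)

    model = GradientBoostingClassifier(n_estimators=300, learning_rate=0.05,
                                       max_depth=3, random_state=0)
    model.fit(X_train, y_train)
    print(classification_report(y_test, model.predict(X_test)))
    ```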

  10. Price-Dynamics of Shares and Bohmian Mechanics: Deterministic or Stochastic Model?

    Science.gov (United States)

    Choustova, Olga

    2007-02-01

    We apply the mathematical formalism of Bohmian mechanics to describe the dynamics of shares. The main distinguishing feature of the financial Bohmian model is the possibility to take into account market psychology by describing the expectations of traders by the pilot wave. We also discuss some objections (coming from the conventional financial mathematics of stochastic processes) against the deterministic Bohmian model. In particular, we address the objection that such a model contradicts the efficient market hypothesis, which is the cornerstone of modern market ideology. Another objection is of a purely mathematical nature: it is related to the quadratic variation of price trajectories. One possibility to reply to this critique is to consider the stochastic Bohm-Vigier model instead of the deterministic one. We do this in the present note.

  11. Persistence and extinction for a stochastic logistic model with infinite delay

    OpenAIRE

    Chun Lu; Xiaohua Ding

    2013-01-01

    This article studies a stochastic logistic model with infinite delay. Using a phase space, we establish sufficient conditions for the extinction, nonpersistence in the mean, weak persistence, and stochastic permanence. A threshold between weak persistence and extinction is obtained. Our results state that different types of environmental noises have different effects on the persistence and extinction, and that the delay has no impact on the persistence and ext...

  12. High Accuracy Transistor Compact Model Calibrations

    Energy Technology Data Exchange (ETDEWEB)

    Hembree, Charles E. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Mar, Alan [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Robertson, Perry J. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

    Typically, transistors are modeled by the application of calibrated nominal and range models. These models consist of differing parameter values that describe the location and the upper and lower limits of a distribution of some transistor characteristic such as current capacity. Correspondingly, when using this approach, high degrees of accuracy of the transistor models are not expected since the set of models is a surrogate for a statistical description of the devices. The use of these types of models describes expected performances considering the extremes of process or transistor deviations. In contrast, circuits that have very stringent accuracy requirements require modeling techniques with higher accuracy. Since these accurate models have low error in transistor descriptions, they can be used to describe part-to-part variations as well as to give an accurate description of a single circuit instance. Thus, models that meet these stipulations also enable the quantification of margins with respect to a functional threshold and of the uncertainties in these margins. Given this need, new high-accuracy model calibration techniques for bipolar junction transistors have been developed and are described in this report.

  13. Uncertainty Aware Structural Topology Optimization Via a Stochastic Reduced Order Model Approach

    Science.gov (United States)

    Aguilo, Miguel A.; Warner, James E.

    2017-01-01

    This work presents a stochastic reduced order modeling strategy for the quantification and propagation of uncertainties in topology optimization. Uncertainty aware optimization problems can be computationally complex due to the substantial number of model evaluations that are necessary to accurately quantify and propagate uncertainties. This computational complexity is greatly magnified if a high-fidelity, physics-based numerical model is used for the topology optimization calculations. Stochastic reduced order model (SROM) methods are applied here to effectively 1) alleviate the prohibitive computational cost associated with an uncertainty aware topology optimization problem; and 2) quantify and propagate the inherent uncertainties due to design imperfections. A generic SROM framework that transforms the uncertainty aware, stochastic topology optimization problem into a deterministic optimization problem that relies only on independent calls to a deterministic numerical model is presented. This approach facilitates the use of existing optimization and modeling tools to accurately solve the uncertainty aware topology optimization problems in a fraction of the computational demand required by Monte Carlo methods. Finally, an example in structural topology optimization is presented to demonstrate the effectiveness of the proposed uncertainty aware structural topology optimization approach.

  14. A calibration hierarchy for risk models was defined: from utopia to empirical data.

    Science.gov (United States)

    Van Calster, Ben; Nieboer, Daan; Vergouwe, Yvonne; De Cock, Bavo; Pencina, Michael J; Steyerberg, Ewout W

    2016-06-01

    Calibrated risk models are vital for valid decision support. We define four levels of calibration and describe implications for model development and external validation of predictions. We present results based on simulated data sets. A common definition of calibration is "having an event rate of R% among patients with a predicted risk of R%," which we refer to as "moderate calibration." Weaker forms of calibration only require the average predicted risk (mean calibration) or the average prediction effects (weak calibration) to be correct. "Strong calibration" requires that the event rate equals the predicted risk for every covariate pattern. This implies that the model is fully correct for the validation setting. We argue that this is unrealistic: the model type may be incorrect, the linear predictor is only asymptotically unbiased, and all nonlinear and interaction effects should be correctly modeled. In addition, we prove that moderate calibration guarantees nonharmful decision making. Finally, results indicate that a flexible assessment of calibration in small validation data sets is problematic. Strong calibration is desirable for individualized decision support but unrealistic and counterproductive by stimulating the development of overly complex models. Model development and external validation should focus on moderate calibration. Copyright © 2016 Elsevier Inc. All rights reserved.
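    A minimal sketch of checking "moderate calibration" as defined above: group observations by predicted risk and compare the observed event rate with the mean prediction in each group. The quantile binning scheme and function name are illustrative choices, not the authors' procedure.

    ```python
    # Grouped comparison of predicted risks and observed event rates.
    import numpy as np

    def calibration_table(y_true, y_prob, n_bins=10):
        """Return per-bin (mean predicted risk, observed event rate, n)."""
        y_true, y_prob = np.asarray(y_true), np.asarray(y_prob)
        edges = np.quantile(y_prob, np.linspace(0, 1, n_bins + 1))
        idx = np.clip(np.digitize(y_prob, edges[1:-1]), 0, n_bins - 1)
        rows = []
        for b in range(n_bins):
            mask = idx == b
            if mask.any():
                rows.append((y_prob[mask].mean(), y_true[mask].mean(), mask.sum()))
        return np.array(rows)   # moderate calibration: columns 1 and 2 nearly equal
    ```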

  15. Stochastic Boolean networks: An efficient approach to modeling gene regulatory networks

    Directory of Open Access Journals (Sweden)

    Liang Jinghang

    2012-08-01

    Full Text Available Abstract Background Various computational models have been of interest due to their use in the modelling of gene regulatory networks (GRNs. As a logical model, probabilistic Boolean networks (PBNs consider molecular and genetic noise, so the study of PBNs provides significant insights into the understanding of the dynamics of GRNs. This will ultimately lead to advances in developing therapeutic methods that intervene in the process of disease development and progression. The applications of PBNs, however, are hindered by the complexities involved in the computation of the state transition matrix and the steady-state distribution of a PBN. For a PBN with n genes and N Boolean networks, the complexity to compute the state transition matrix is O(nN22n or O(nN2n for a sparse matrix. Results This paper presents a novel implementation of PBNs based on the notions of stochastic logic and stochastic computation. This stochastic implementation of a PBN is referred to as a stochastic Boolean network (SBN. An SBN provides an accurate and efficient simulation of a PBN without and with random gene perturbation. The state transition matrix is computed in an SBN with a complexity of O(nL2n, where L is a factor related to the stochastic sequence length. Since the minimum sequence length required for obtaining an evaluation accuracy approximately increases in a polynomial order with the number of genes, n, and the number of Boolean networks, N, usually increases exponentially with n, L is typically smaller than N, especially in a network with a large number of genes. Hence, the computational efficiency of an SBN is primarily limited by the number of genes, but not directly by the total possible number of Boolean networks. Furthermore, a time-frame expanded SBN enables an efficient analysis of the steady-state distribution of a PBN. These findings are supported by the simulation results of a simplified p53 network, several randomly generated networks and a

  16. Modeling Stochastic Energy and Water Consumption to Manage Residential Water Uses

    Science.gov (United States)

    Abdallah, A. M.; Rosenberg, D. E.; Water; Energy Conservation

    2011-12-01

    Water-energy linkages have received growing attention from water and energy utilities as they recognize that collaborative efforts can implement more effective conservation and efficiency improvement programs at lower cost and with less effort. To date, limited energy-water household data has allowed only deterministic analysis for average, representative households and has required coarse assumptions, such as treating the water heater (the primary energy use in a home apart from heating and cooling) as a single end use. Here, we use recently available disaggregated hot and cold water household end-use data to estimate water and energy consumption for toilet, shower, faucet, dishwasher, laundry machine, leaks, and other household uses, as well as the savings from appliance retrofits. The disaggregated hot water and bulk water end-use data was previously collected by the USEPA for 96 single-family households in Seattle, WA, Oakland, CA, and Tampa, FL between 2000 and 2003, for two weeks before and four weeks after each household was retrofitted with water-efficient appliances. Using the disaggregated data, we developed a stochastic model that represents the factors that influence water use for each appliance: behavioral (use frequency and duration), demographic (household size), and technological (use volume or flow rate). We also include stochastic factors that govern the energy to heat hot water: hot water fraction (percentage of hot water volume to total water volume used in a certain end-use event), heater water intake and dispense temperatures, and energy source for the heater (gas, electric, etc). From the empirical household end-use data, we derive stochastic probability distributions for each water and energy factor, where each distribution represents the range and likelihood of values that the factor may take. The uncertainty of the stochastic water and energy factors is propagated using Monte Carlo simulations to calculate the composite probability distribution for water
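    The Monte Carlo propagation idea can be sketched as follows: each behavioral, technological and hot-water factor is drawn from a fitted distribution and combined into per-event water and water-heating energy use. The distribution families and every numerical value below are illustrative assumptions, not the values fitted to the USEPA data.

    ```python
    # Monte Carlo propagation of stochastic end-use factors (shower events, illustrative).
    import numpy as np

    rng = np.random.default_rng(1)
    n = 100_000                                                  # simulated shower events
    duration_min = rng.lognormal(mean=2.0, sigma=0.4, size=n)    # behavioral: minutes per event
    flow_lpm = rng.normal(7.0, 1.0, size=n).clip(min=3.0)        # technological: litres per minute
    hot_fraction = rng.beta(6, 4, size=n)                        # share of hot water in the event
    t_in, t_out = 15.0, 55.0                                     # degC intake / heater setpoint

    water_l = duration_min * flow_lpm
    # Energy to heat the hot-water share: m * c_p * dT, with c_p of water ~ 4186 J/(kg K).
    energy_kwh = water_l * hot_fraction * 4186 * (t_out - t_in) / 3.6e6

    print("median water use per event [L]:", np.median(water_l))
    print("95th percentile heating energy [kWh]:", np.quantile(energy_kwh, 0.95))
    ```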

  17. Phenomenology of stochastic exponential growth

    Science.gov (United States)

    Pirjol, Dan; Jafarpour, Farshid; Iyer-Biswas, Srividya

    2017-06-01

    Stochastic exponential growth is observed in a variety of contexts, including molecular autocatalysis, nuclear fission, population growth, inflation of the universe, viral social media posts, and financial markets. Yet the literature on modeling the phenomenology of these stochastic dynamics has predominantly focused on one model, geometric Brownian motion (GBM), which can be described as the solution of a Langevin equation with linear drift and linear multiplicative noise. Using recent experimental results on stochastic exponential growth of individual bacterial cell sizes, we motivate the need for a more general class of phenomenological models of stochastic exponential growth, which are consistent with the observation that the mean-rescaled distributions are approximately stationary at long times. We show that this behavior is not consistent with GBM; instead it is consistent with power-law multiplicative noise with positive fractional powers. Therefore, we consider this general class of phenomenological models for stochastic exponential growth, provide analytical solutions, and identify the important dimensionless combination of model parameters, which determines the shape of the mean-rescaled distribution. We also provide a prescription for robustly inferring model parameters from experimentally observed stochastic growth trajectories.
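    A hedged simulation sketch contrasting GBM with a power-law multiplicative-noise growth model of the general form dX = kX dt + σX^α dW (α = 1 recovers GBM); the parameter values are arbitrary and this specific form is only one member of the model class discussed above.

    ```python
    # Euler-Maruyama simulation of exponential growth with power-law multiplicative noise.
    import numpy as np

    def simulate(alpha, k=1.0, sigma=0.3, x0=1.0, T=5.0, dt=1e-3, n_paths=2000, seed=0):
        rng = np.random.default_rng(seed)
        x = np.full(n_paths, x0)
        for _ in range(int(T / dt)):
            dw = rng.normal(0.0, np.sqrt(dt), size=n_paths)
            # abs() keeps trajectories positive in this crude discretization.
            x = np.abs(x + k * x * dt + sigma * x**alpha * dw)
        return x

    for alpha in (1.0, 0.5):                     # GBM vs. fractional power-law noise
        x = simulate(alpha)
        rescaled = x / x.mean()                  # mean-rescaled final-time distribution
        print(f"alpha={alpha}: CV of mean-rescaled X(T) = {rescaled.std():.3f}")
    ```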

  18. Stochastic models for tumoral growth

    Science.gov (United States)

    Escudero, Carlos

    2006-02-01

    Strong experimental evidence has indicated that tumor growth belongs to the molecular beam epitaxy universality class. This type of growth is characterized by the constraint of cell proliferation to the tumor border and the surface diffusion of cells at the growing edge. Tumor growth is thus conceived as a competition for space between the tumor and the host, and cell diffusion at the tumor border is an optimal strategy adopted for minimizing the pressure and helping tumor development. Two stochastic partial differential equations are reported in this paper in order to correctly model the physical properties of tumoral growth in (1+1) and (2+1) dimensions. The advantage of these models is that they reproduce the correct geometry of the tumor and are defined in terms of polar variables. An analysis of these models allows us to quantitatively estimate the response of the tumor to an unfavorable perturbation during growth.

  19. Modelling Evolutionary Algorithms with Stochastic Differential Equations.

    Science.gov (United States)

    Heredia, Jorge Pérez

    2017-11-20

    There has been renewed interest in modelling the behaviour of evolutionary algorithms (EAs) by more traditional mathematical objects, such as ordinary differential equations or Markov chains. The advantage is that the analysis becomes greatly facilitated due to the existence of well established methods. However, this typically comes at the cost of disregarding information about the process. Here, we introduce the use of stochastic differential equations (SDEs) for the study of EAs. SDEs can produce simple analytical results for the dynamics of stochastic processes, unlike Markov chains which can produce rigorous but unwieldy expressions about the dynamics. On the other hand, unlike ordinary differential equations (ODEs), they do not discard information about the stochasticity of the process. We show that these are especially suitable for the analysis of fixed budget scenarios and present analogues of the additive and multiplicative drift theorems from runtime analysis. In addition, we derive a new more general multiplicative drift theorem that also covers non-elitist EAs. This theorem simultaneously allows for positive and negative results, providing information on the algorithm's progress even when the problem cannot be optimised efficiently. Finally, we provide results for some well-known heuristics namely Random Walk (RW), Random Local Search (RLS), the (1+1) EA, the Metropolis Algorithm (MA), and the Strong Selection Weak Mutation (SSWM) algorithm.

  20. Bayesian calibration of the Community Land Model using surrogates

    Energy Technology Data Exchange (ETDEWEB)

    Ray, Jaideep; Hou, Zhangshuan; Huang, Maoyi; Swiler, Laura Painton

    2014-02-01

    We present results from the Bayesian calibration of hydrological parameters of the Community Land Model (CLM), which is often used in climate simulations and Earth system models. A statistical inverse problem is formulated for three hydrological parameters, conditional on observations of latent heat surface fluxes over 48 months. Our calibration method uses polynomial and Gaussian process surrogates of the CLM, and solves the parameter estimation problem using a Markov chain Monte Carlo sampler. Posterior probability densities for the parameters are developed for two sites with different soil and vegetation covers. Our method also allows us to examine the structural error in CLM under two error models. We find that surrogate models can be created for CLM in most cases. The posterior distributions are more predictive than the default parameter values in CLM. Climatologically averaging the observations does not modify the parameters' distributions significantly. The structural error model reveals a correlation time-scale which can be used to identify the physical process that could be contributing to it. While the calibrated CLM has a higher predictive skill, the calibration is under-dispersive.
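    A minimal sketch of surrogate-based Bayesian calibration in the spirit of this work: a Gaussian-process surrogate is trained on a small design of expensive model runs, and a random-walk Metropolis sampler is run against the surrogate. The `run_model` placeholder, the uniform prior on a unit cube and the Gaussian likelihood are assumptions for illustration, not the CLM setup.

    ```python
    # Surrogate (GP) + Markov chain Monte Carlo calibration sketch.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor

    def calibrate(run_model, observed, theta_design, noise_sd=1.0,
                  n_mcmc=20_000, step=0.05, seed=0):
        """theta_design: (n_design, d) array of parameter sets, scaled to [0, 1]^d."""
        rng = np.random.default_rng(seed)
        # 1) Train the surrogate on design runs of the expensive model.
        y_design = np.array([run_model(t) for t in theta_design])
        gp = GaussianProcessRegressor(normalize_y=True).fit(theta_design, y_design)

        def log_post(theta):
            if np.any(theta < 0) or np.any(theta > 1):      # uniform prior on [0, 1]^d
                return -np.inf
            pred = gp.predict(theta.reshape(1, -1))[0]
            return -0.5 * np.sum((observed - pred) ** 2) / noise_sd**2

        # 2) Random-walk Metropolis sampling against the cheap surrogate.
        theta, lp, chain = theta_design.mean(axis=0), -np.inf, []
        for _ in range(n_mcmc):
            prop = theta + rng.normal(0, step, size=theta.shape)
            lp_prop = log_post(prop)
            if np.log(rng.random()) < lp_prop - lp:
                theta, lp = prop, lp_prop
            chain.append(theta.copy())
        return np.array(chain)      # posterior samples of the calibrated parameters
    ```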

  1. Optimal Tax Reduction by Depreciation : A Stochastic Model

    NARCIS (Netherlands)

    Berg, M.; De Waegenaere, A.M.B.; Wielhouwer, J.L.

    1996-01-01

    This paper focuses on the choice of a depreciation method, when trying to minimize the expected value of the present value of future tax payments. In a quite general model that allows for stochastic future cash-flows and a tax structure with tax brackets, we determine the optimal choice between the

  2. A Stochastic Hybrid Systems framework for analysis of Markov reward models

    International Nuclear Information System (INIS)

    Dhople, S.V.; DeVille, L.; Domínguez-García, A.D.

    2014-01-01

    In this paper, we propose a framework to analyze Markov reward models, which are commonly used in system performability analysis. The framework builds on a set of analytical tools developed for a class of stochastic processes referred to as Stochastic Hybrid Systems (SHS). The state space of an SHS is comprised of: (i) a discrete state that describes the possible configurations/modes that a system can adopt, which includes the nominal (non-faulty) operational mode, but also those operational modes that arise due to component faults, and (ii) a continuous state that describes the reward. Discrete state transitions are stochastic, and governed by transition rates that are (in general) a function of time and the value of the continuous state. The evolution of the continuous state is described by a stochastic differential equation and reward measures are defined as functions of the continuous state. Additionally, each transition is associated with a reset map that defines the mapping between the pre- and post-transition values of the discrete and continuous states; these mappings enable the definition of impulses and losses in the reward. The proposed SHS-based framework unifies the analysis of a variety of previously studied reward models. We illustrate the application of the framework to performability analysis via analytical and numerical examples

  3. Stochastic Geometric Network Models for Groups of Functional and Structural Connectomes

    Science.gov (United States)

    Friedman, Eric J.; Landsberg, Adam S.; Owen, Julia P.; Li, Yi-Ou; Mukherjee, Pratik

    2014-01-01

    Structural and functional connectomes are emerging as important instruments in the study of normal brain function and in the development of new biomarkers for a variety of brain disorders. In contrast to single-network studies that presently dominate the (non-connectome) network literature, connectome analyses typically examine groups of empirical networks and then compare these against standard (stochastic) network models. Current practice in connectome studies is to employ stochastic network models derived from social science and engineering contexts as the basis for the comparison. However, these are not necessarily best suited for the analysis of connectomes, which often contain groups of very closely related networks, such as occurs with a set of controls or a set of patients with a specific disorder. This paper studies important extensions of standard stochastic models that make them better adapted for analysis of connectomes, and develops new statistical fitting methodologies that account for inter-subject variations. The extensions explicitly incorporate geometric information about a network based on distances and inter/intra hemispherical asymmetries (to supplement ordinary degree-distribution information), and utilize a stochastic choice of networks' density levels (for fixed threshold networks) to better capture the variance in average connectivity among subjects. The new statistical tools introduced here allow one to compare groups of networks by matching both their average characteristics and the variations among them. A notable finding is that connectomes have high “smallworldness” beyond that arising from geometric and degree considerations alone. PMID:25067815

  4. Application of heuristic and machine-learning approach to engine model calibration

    Science.gov (United States)

    Cheng, Jie; Ryu, Kwang R.; Newman, C. E.; Davis, George C.

    1993-03-01

    Automation of engine model calibration procedures is a very challenging task because (1) the calibration process searches for a goal state in a huge, continuous state space, (2) calibration is often a lengthy and frustrating task because of complicated mutual interference among the target parameters, and (3) the calibration problem is heuristic by nature, and often heuristic knowledge for constraining a search cannot be easily acquired from domain experts. A combined heuristic and machine learning approach has, therefore, been adopted to improve the efficiency of model calibration. We developed an intelligent calibration program called ICALIB. It has been used on a daily basis for engine model applications, and has reduced the time required for model calibrations from many hours to a few minutes on average. In this paper, we describe the heuristic control strategies employed in ICALIB such as a hill-climbing search based on a state distance estimation function, incremental problem solution refinement by using a dynamic tolerance window, and calibration target parameter ordering for guiding the search. In addition, we present the application of a machine learning program called GID3* for automatic acquisition of heuristic rules for ordering target parameters.

  5. Stochastic quantization of field theories on the lattice and supersymmetrical models

    International Nuclear Information System (INIS)

    Aldazabal, Gerardo.

    1984-01-01

    Several aspects of the stochastic quantization method are considered. Specifically, field theories on the lattice and supersymmetrical models are studied. A non-linear sigma model is studied firstly, and it is shown that it is possible to obtain evolution equations written directly for invariant quantities. These ideas are generalized to obtain Langevin equations for the Wilson loops of non-abelian lattice gauge theories U (N) and SU (N). In order to write these equations, some different ways of introducing the constraints which the fields must satisfy are discussed. It is natural to have a strong coupling expansion in these equations. The correspondence with quantum field theory is established, and it is noticed that at all orders in the perturbation theory, Langevin equations reduce to Schwinger-Dyson equations. From another point of view, stochastic quantization is applied to large N matrix models on the lattice. As a result, a simple and systematic way of building reduced models is found. Referring to stochastic quantization in supersymmetric theories, a simple supersymmetric model is studied. It is shown that it is possible to write an evolution equation for the superfield which leads to quantum field theory results in equilibrium. As the Langevin equation preserves supersymmetry, the property of dimensional reduction known for the quantum model is shown to be valid at all times. (M.E.L.) [es

  6. Model Calibration of Exciter and PSS Using Extended Kalman Filter

    Energy Technology Data Exchange (ETDEWEB)

    Kalsi, Karanjit; Du, Pengwei; Huang, Zhenyu

    2012-07-26

    Power system modeling and controls continue to become more complex with the advent of smart grid technologies and large-scale deployment of renewable energy resources. As demonstrated in recent studies, inaccurate system models could lead to large-scale blackouts, thereby motivating the need for model calibration. Current methods of model calibration rely on manual tuning based on engineering experience, are time consuming and could yield inaccurate parameter estimates. In this paper, the Extended Kalman Filter (EKF) is used as a tool to calibrate exciter and Power System Stabilizer (PSS) models of a particular type of machine in the Western Electricity Coordinating Council (WECC). The EKF-based parameter estimation is a recursive prediction-correction process which uses the mismatch between simulation and measurement to adjust the model parameters at every time step. Numerical simulations using actual field test data demonstrate the effectiveness of the proposed approach in calibrating the parameters.
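    A generic sketch of EKF-based parameter calibration as described above, with the unknown parameters appended to the state vector and corrected at every time step from the measurement mismatch; `f`, `h` and their Jacobians stand in for the exciter/PSS dynamics and measurement map and are not the WECC models.

    ```python
    # Generic extended Kalman filter recursion for joint state/parameter estimation.
    import numpy as np

    def ekf_calibrate(f, h, F_jac, H_jac, z_seq, x0, P0, Q, R):
        """f(x): state transition, h(x): measurement map, *_jac: their Jacobians.
        x contains the dynamic states with the unknown parameters appended."""
        x, P = x0.copy(), P0.copy()
        history = []
        for z in z_seq:
            # Prediction step
            F = F_jac(x)
            x = f(x)
            P = F @ P @ F.T + Q
            # Correction step using the simulation/measurement mismatch
            H = H_jac(x)
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)
            x = x + K @ (z - h(x))
            P = (np.eye(len(x)) - K @ H) @ P
            history.append(x.copy())
        return np.array(history)   # trailing entries of x hold the calibrated parameters
    ```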

  7. Model identification using stochastic differential equation grey-box models in diabetes.

    Science.gov (United States)

    Duun-Henriksen, Anne Katrine; Schmidt, Signe; Røge, Rikke Meldgaard; Møller, Jonas Bech; Nørgaard, Kirsten; Jørgensen, John Bagterp; Madsen, Henrik

    2013-03-01

    The acceptance of virtual preclinical testing of control algorithms is growing and thus also the need for robust and reliable models. Models based on ordinary differential equations (ODEs) can rarely be validated with standard statistical tools. Stochastic differential equations (SDEs) offer the possibility of building models that can be validated statistically and that are capable of predicting not only a realistic trajectory, but also the uncertainty of the prediction. In an SDE, the prediction error is split into two noise terms. This separation ensures that the errors are uncorrelated and provides the possibility to pinpoint model deficiencies. An identifiable model of the glucoregulatory system in a type 1 diabetes mellitus (T1DM) patient is used as the basis for development of a stochastic-differential-equation-based grey-box model (SDE-GB). The parameters are estimated on clinical data from four T1DM patients. The optimal SDE-GB is determined from likelihood-ratio tests. Finally, parameter tracking is used to track the variation in the "time to peak of meal response" parameter. We found that the transformation of the ODE model into an SDE-GB resulted in a significant improvement in the prediction and uncorrelated errors. Tracking of the "peak time of meal absorption" parameter showed that the absorption rate varied according to meal type. This study shows the potential of using SDE-GBs in diabetes modeling. Improved model predictions were obtained due to the separation of the prediction error. SDE-GBs offer a solid framework for using statistical tools for model validation and model development. © 2013 Diabetes Technology Society.

  8. Multi-Site Calibration of Linear Reservoir Based Geomorphologic Rainfall-Runoff Models

    Directory of Open Access Journals (Sweden)

    Bahram Saeidifarzad

    2014-09-01

    Full Text Available Multi-site optimization of two adapted event-based geomorphologic rainfall-runoff models was presented using the Non-dominated Sorting Genetic Algorithm (NSGA-II) method for the South Fork Eel River watershed, California. The first model was developed based on an Unequal Cascade of Reservoirs (UECR) and the second model was presented as a modified version of the Geomorphological Unit Hydrograph based on Nash’s model (GUHN). Two calibration strategies, semi-lumped and semi-distributed, were considered for imposing (or not imposing) the geomorphology relations in the models. The results of the models were compared with Nash’s model. The results obtained using the observed data of two stations in the multi-site optimization framework showed reasonable efficiency values in both the calibration and the verification steps. The outcomes also showed that semi-distributed calibration of the modified GUHN model slightly outperformed the other models in both upstream and downstream stations during calibration. Both calibration strategies for the developed UECR model during the verification phase showed slightly better performance in the downstream station, but in the upstream station, the modified GUHN model in the semi-lumped strategy slightly outperformed the other models. The semi-lumped calibration strategy could lead to logical lag time parameters related to the basin geomorphology and may be more suitable for data-based statistical analyses of the rainfall-runoff process.

  9. Modelling the heat dynamics of a building using stochastic differential equations

    DEFF Research Database (Denmark)

    Andersen, Klaus Kaae; Madsen, Henrik; Hansen, Lars Henrik

    2000-01-01

    estimation and model validation, while physical knowledge is used in forming the model structure. The suggested lumped parameter model is thus based on thermodynamics and formulated as a system of stochastic differential equations. Due to the continuous time formulation the parameters of the model...

  10. Stochastic processes analysis in nuclear reactor using ARMA models

    International Nuclear Information System (INIS)

    Zavaljevski, N.

    1990-01-01

    The analysis of an ARMA model derived from the general stochastic state equations of a nuclear reactor is given. The dependence of the ARMA model parameters on the main physical characteristics of the RB nuclear reactor in Vinca is presented. Preliminary identification results are presented, observed discrepancies between theory and experiment are explained, and possibilities for improving the identification are anticipated. (author)

  11. SURFplus Model Calibration for PBX 9502

    Energy Technology Data Exchange (ETDEWEB)

    Menikoff, Ralph [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-12-06

    The SURFplus reactive burn model is calibrated for the TATB based explosive PBX 9502 at three initial temperatures: hot (75 C), ambient (23 C) and cold (-55 C). The CJ state depends on the initial temperature due to the variation in the initial density and initial specific energy of the PBX reactants. For the reactants, a porosity model for full density TATB is used. This allows the initial PBX density to be set to its measured value even though the coefficients of thermal expansion for the TATB and the PBX differ. The PBX products EOS is taken as independent of the initial PBX state. The initial temperature also affects the sensitivity to shock initiation. The model rate parameters are calibrated to Pop plot data, the failure diameter, the limiting detonation speed just above the failure diameter, and curvature effect data for small curvature.

  12. Proximity, social capital and the Simon-model of stochastic growth

    NARCIS (Netherlands)

    Frenken, K.; Reggiani, A.; Nijkamp, P.

    2009-01-01

    It is customary to define economic geography as a discipline that deals with the uneven distribution of economic activity across space. From a historical perspective, stochastic growth models are of particular use (Simon 1955). Such models explain the current distribution of activities from the

  13. Development of the Stochastic Lung Model for Asthma

    International Nuclear Information System (INIS)

    Dobos, E.; Borbely-Kiss, I.; Kertesz, Zs.; Balashazy, I.

    2005-01-01

    Complete text of publication follows. The Stochastic Lung Model is a state-of-the-art tool for the investigation of the health impact of atmospheric aerosols. This model has already been tested and applied to calculate the deposition fractions of aerosols in different regions of the human respiratory tract. The health effects of inhaled aerosols may strongly depend on the distribution of deposition within the respiratory tract. In the current study three Asthma Models have been incorporated into the Stochastic Lung Deposition Code. A common new feature of these models is that the breathing cycle may be asymmetric, meaning that the inspiration time, the expiration time and the two breath-hold times are independent; the code can also simulate mucus blockage. The main characteristics of the models are the following: a) ASTHMA MODEL I: a single input bronchial asthma factor is applied to the whole tracheobronchial region, and the code multiplies all tracheobronchial diameters by this single value. b) ASTHMA MODEL II: bronchial asthma factors have to be given for each bronchial generation as input data (21 values), and the program multiplies the bronchial diameters by these factors. c) ASTHMA MODEL III: here, only the range of bronchial asthma factors is given as input data, and the code randomly selects the exact factors in prescribed airway generations; in this case the stochastic character appears in the Asthma Model as well. As an example, Figure 1 shows the deposition fractions in the tracheobronchial and acinar regions of the human lung for healthy and asthmatic adults at sitting breathing conditions as a function of particle size, computed by Asthma Model I with a bronchial asthma factor of 30%. These models have been tested and compared for different types of asthma at various breathing conditions and over a wide range of particle sizes. The distribution of deposition in the characteristic regions of the respiratory tract has been computed

  14. A stochastic model for magnetic dynamics in single-molecule magnets

    Energy Technology Data Exchange (ETDEWEB)

    López-Ruiz, R., E-mail: rlruiz@ifi.unicamp.br [Instituto de Física Gleb Wataghin - Universidade Estadual de Campinas, 13083-859 Campinas (SP) (Brazil); Almeida, P.T. [Instituto de Física Gleb Wataghin - Universidade Estadual de Campinas, 13083-859 Campinas (SP) (Brazil); Vaz, M.G.F. [Instituto de Química, Universidade Federal Fluminense, 24020-150 Niterói (RJ) (Brazil); Novak, M.A. [Instituto de Física - Universidade Federal do Rio de Janeiro, 21941-972 Rio de Janeiro (RJ) (Brazil); Béron, F.; Pirota, K.R. [Instituto de Física Gleb Wataghin - Universidade Estadual de Campinas, 13083-859 Campinas (SP) (Brazil)

    2016-04-01

    Hysteresis and magnetic relaxation curves were obtained via stochastic simulations for double-well potential systems with the possibility of quantum tunneling. The simulation results are compared with experimental ones for the Mn12 single-molecule magnet, allowing us to introduce time dependence into the model. Despite being a simple simulation model, it adequately reproduces the phenomenology of thermally activated quantum tunneling and can be extended to other systems with different parameters. Assuming competition between the reversal modes, thermal (over the anisotropy barrier) and tunneling (across it), a separation of the classical and quantum contributions to the relaxation time can be obtained. - Highlights: • Single-molecule magnets are modeled using a simple stochastic approach. • The simulation reproduces thermally activated tunneling magnetization reversal features. • Time is introduced in the hysteresis and relaxation simulations. • We can separate the quantum and classical contributions to the decay time.

  15. Bayesian model calibration of computational models in velocimetry diagnosed dynamic compression experiments.

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Justin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hund, Lauren [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-02-01

    Dynamic compression experiments are being performed on complicated materials using increasingly complex drivers. The data produced in these experiments are beginning to reach a regime where traditional analysis techniques break down, requiring the solution of an inverse problem. A common measurement in dynamic experiments is an interface velocity as a function of time, and often this functional output can be simulated using a hydrodynamics code. Bayesian model calibration is a statistical framework to estimate inputs into a computational model in the presence of multiple uncertainties, making it well suited to measurements of this type. In this article, we apply Bayesian model calibration to high pressure (250 GPa) ramp compression measurements in tantalum. We address several issues specific to this calibration including the functional nature of the output as well as parameter and model discrepancy identifiability. Specifically, we propose scaling the likelihood function by an effective sample size rather than modeling the autocorrelation function to accommodate the functional output and propose sensitivity analyses using the notion of `modularization' to assess the impact of experiment-specific nuisance input parameters on estimates of material properties. We conclude that the proposed Bayesian model calibration procedure results in simple, fast, and valid inferences on the equation of state parameters for tantalum.

  16. ParPor: Particles in Pores. Stochastic Modeling of Polydisperse Transport

    DEFF Research Database (Denmark)

    Yuan, Hao

    2010-01-01

    Liquid flow containing particles in the different types of porous media appear in a large variety of practically important industrial and natural processes. The project aims at developing a stochastic model for the deep bed filtration process in which the polydisperse suspension flow...... in the polydisperse porous media. Instead of the traditional parabolic Advection-Dispersion Equation (ADE) the novel elliptic PDE based on the Continuous Time Random Walk is adopted for the particle size kinetics. The pore kinetics is either described by the stochastic size exclusion mechanism or the incomplete pore...

  17. Using Active Learning for Speeding up Calibration in Simulation Models.

    Science.gov (United States)

    Cevik, Mucahit; Ergun, Mehmet Ali; Stout, Natasha K; Trentham-Dietz, Amy; Craven, Mark; Alagoz, Oguzhan

    2016-07-01

    Most cancer simulation models include unobservable parameters that determine disease onset and tumor growth. These parameters play an important role in matching key outcomes such as cancer incidence and mortality, and their values are typically estimated via a lengthy calibration procedure, which involves evaluating a large number of combinations of parameter values via simulation. The objective of this study is to demonstrate how machine learning approaches can be used to accelerate the calibration process by reducing the number of parameter combinations that are actually evaluated. Active learning is a popular machine learning method that enables a learning algorithm such as artificial neural networks to interactively choose which parameter combinations to evaluate. We developed an active learning algorithm to expedite the calibration process. Our algorithm determines the parameter combinations that are more likely to produce desired outputs and therefore reduces the number of simulation runs performed during calibration. We demonstrate our method using the previously developed University of Wisconsin breast cancer simulation model (UWBCS). In a recent study, calibration of the UWBCS required the evaluation of 378 000 input parameter combinations to build a race-specific model, and only 69 of these combinations produced results that closely matched observed data. By using the active learning algorithm in conjunction with standard calibration methods, we identify all 69 parameter combinations by evaluating only 5620 of the 378 000 combinations. Machine learning methods hold potential in guiding model developers in the selection of more promising parameter combinations and hence speeding up the calibration process. Applying our machine learning algorithm to one model shows that evaluating only 1.49% of all parameter combinations would be sufficient for the calibration. © The Author(s) 2015.
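    A hedged sketch of the active-learning loop described above: a classifier is trained on the parameter combinations evaluated so far (labelled by whether they produced outputs close to the observed data) and is used to choose the next batch to simulate. The `evaluate` placeholder, the network size and the batch sizes are illustrative assumptions, not the UWBCS configuration.

    ```python
    # Active-learning loop that prioritizes promising parameter combinations.
    import numpy as np
    from sklearn.neural_network import MLPClassifier

    def active_calibration(evaluate, candidates, n_init=500, n_batch=200,
                           n_rounds=25, seed=0):
        rng = np.random.default_rng(seed)
        tried = rng.choice(len(candidates), size=n_init, replace=False)
        labels = np.array([evaluate(candidates[i]) for i in tried], dtype=bool)
        untried = np.setdiff1d(np.arange(len(candidates)), tried)

        for _ in range(n_rounds):
            if labels.all() or not labels.any():
                # Not enough class diversity yet: fall back to random sampling.
                pick = rng.choice(untried, size=n_batch, replace=False)
            else:
                clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000,
                                    random_state=0).fit(candidates[tried], labels)
                proba = clf.predict_proba(candidates[untried])
                p_good = proba[:, list(clf.classes_).index(True)]
                pick = untried[np.argsort(-p_good)[:n_batch]]   # most promising first
            new_labels = np.array([evaluate(candidates[i]) for i in pick], dtype=bool)
            tried = np.concatenate([tried, pick])
            labels = np.concatenate([labels, new_labels])
            untried = np.setdiff1d(untried, pick)
        return candidates[tried[labels]]     # the accepted parameter combinations
    ```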

  18. Grid based calibration of SWAT hydrological models

    Directory of Open Access Journals (Sweden)

    D. Gorgan

    2012-07-01

    Full Text Available The calibration and execution of large hydrological models, such as SWAT (soil and water assessment tool), developed for large areas, high resolution, and huge input data, require not only quite a long execution time but also high computation resources. The SWAT hydrological model supports studies and predictions of the impact of land management practices on water, sediment, and agricultural chemical yields in complex watersheds. The paper presents the gSWAT application as a practical web solution for environmental specialists to calibrate extensive hydrological models and to run scenarios, by hiding the complex control of processes and heterogeneous resources across the grid-based high computation infrastructure. The paper highlights the basic functionalities of the gSWAT platform and the features of the graphical user interface. The presentation is concerned with the development of working sessions, interactive control of calibration, direct and basic editing of parameters, process monitoring, and graphical and interactive visualization of the results. The experiments performed on different SWAT models and the obtained results demonstrate the benefits brought by the grid-based parallel and distributed environment as a solution for the processing platform. All the instances of SWAT models used in the reported experiments have been developed through the enviroGRIDS project, targeting the Black Sea catchment area.

  19. Stochastic Mixed-Effects Parameters Bertalanffy Process, with Applications to Tree Crown Width Modeling

    Directory of Open Access Journals (Sweden)

    Petras Rupšys

    2015-01-01

    Full Text Available A stochastic modeling approach based on the Bertalanffy law has gained interest due to its ability to produce more accurate results than deterministic approaches. We examine tree crown width dynamics with a Bertalanffy-type stochastic differential equation (SDE) and mixed-effects parameters. In this study, we demonstrate how this simple model can be used to calculate predictions of crown width. We propose a parameter estimation method and computational guidelines. The primary goal of the study was to estimate the parameters by considering discrete sampling of the diameter at breast height and crown width and by using a maximum likelihood procedure. Performance statistics for the crown width equation include statistical indexes and an analysis of residuals. We use data on Scots pine trees provided by the Lithuanian National Forest Inventory to illustrate issues of our modeling technique. Comparison of the predicted crown width values from the mixed-effects parameters model with those obtained using the fixed-effects parameters model demonstrates the predictive power of the stochastic differential equation model with mixed-effects parameters. All results were implemented in the symbolic algebra system MAPLE.
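    A minimal Euler-Maruyama sketch of a Bertalanffy-type SDE for crown width as a function of diameter, with a tree-level random effect playing the role of the mixed-effects parameter; the functional form and all numbers are illustrative assumptions, not the fitted Scots pine model of the paper.

    ```python
    # Simulation of crown width w(d) from a Bertalanffy-type SDE with random effects:
    # dw = beta * (alpha_i - w) dd + sigma dW, with alpha_i drawn per tree.
    import numpy as np

    def simulate_crown_width(n_trees=200, d_max=40.0, dd=0.1,
                             alpha_mean=6.0, alpha_sd=1.0, beta=0.08, sigma=0.15,
                             seed=0):
        rng = np.random.default_rng(seed)
        alpha_i = rng.normal(alpha_mean, alpha_sd, size=n_trees)   # mixed-effects parameters
        w = np.zeros(n_trees)                                      # crown width at d = 0
        for _ in range(int(d_max / dd)):
            dW = rng.normal(0.0, np.sqrt(dd), size=n_trees)
            w = w + beta * (alpha_i - w) * dd + sigma * dW         # Euler-Maruyama step
        return w                                                   # crown widths at d_max

    print(simulate_crown_width().mean())
    ```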

  20. Mathematical models of information and stochastic systems

    CERN Document Server

    Kornreich, Philipp

    2008-01-01

    From ancient soothsayers and astrologists to today's pollsters and economists, probability theory has long been used to predict the future on the basis of past and present knowledge. Mathematical Models of Information and Stochastic Systems shows that the amount of knowledge about a system plays an important role in the mathematical models used to foretell the future of the system. It explains how this known quantity of information is used to derive a system's probabilistic properties. After an introduction, the book presents several basic principles that are employed in the remainder of the t

  1. Sampling from stochastic reservoir models constrained by production data

    Energy Technology Data Exchange (ETDEWEB)

    Hegstad, Bjoern Kaare

    1997-12-31

    When a petroleum reservoir is evaluated, it is important to forecast future production of oil and gas and to assess forecast uncertainty. This is done by defining a stochastic model for the reservoir characteristics, generating realizations from this model and applying a fluid flow simulator to the realizations. The reservoir characteristics define the geometry of the reservoir, initial saturation, petrophysical properties etc. This thesis discusses how to generate realizations constrained by production data, that is to say, the realizations should reproduce the observed production history of the petroleum reservoir within the uncertainty of these data. The topics discussed are: (1) Theoretical framework, (2) History matching, forecasting and forecasting uncertainty, (3) A three-dimensional test case, (4) Modelling transmissibility multipliers by Markov random fields, (5) Up scaling, (6) The link between model parameters, well observations and production history in a simple test case, (7) Sampling the posterior using optimization in a hierarchical model, (8) A comparison of Rejection Sampling and Metropolis-Hastings algorithm, (9) Stochastic simulation and conditioning by annealing in reservoir description, and (10) Uncertainty assessment in history matching and forecasting. 139 refs., 85 figs., 1 tab.

  2. Stochastic light-cone CTMRG: a new DMRG approach to stochastic models

    CERN Document Server

    Kemper, A; Nishino, T; Schadschneider, A; Zittartz, J

    2003-01-01

    We develop a new variant of the recently introduced stochastic transfer matrix DMRG which we call stochastic light-cone corner-transfer-matrix DMRG (LCTMRG). It is a numerical method to compute dynamic properties of one-dimensional stochastic processes. As suggested by its name, the LCTMRG is a modification of the corner-transfer-matrix DMRG, adjusted by an additional causality argument. As an example, two reaction-diffusion models, the diffusion-annihilation process and the branch-fusion process, are studied and compared with exact data and Monte Carlo simulations to estimate the capability and accuracy of the new method. The number of possible Trotter steps of more than 10^5 shows a considerable improvement on the old stochastic TMRG algorithm.

  3. Smooth Solutions to Optimal Investment Models with Stochastic Volatilities and Portfolio Constraints

    International Nuclear Information System (INIS)

    Pham, H.

    2002-01-01

    This paper deals with an extension of Merton's optimal investment problem to a multidimensional model with stochastic volatility and portfolio constraints. The classical dynamic programming approach leads to a characterization of the value function as a viscosity solution of the highly nonlinear associated Bellman equation. A logarithmic transformation expresses the value function in terms of the solution to a semilinear parabolic equation with quadratic growth on the derivative term. Using a stochastic control representation and some approximations, we prove the existence of a smooth solution to this semilinear equation. An optimal portfolio is shown to exist, and is expressed in terms of the classical solution to this semilinear equation. This reduction is useful for studying numerical schemes for both the value function and the optimal portfolio. We illustrate our results with several examples of stochastic volatility models popular in the financial literature

  4. Modeling and stochastic analysis of dynamic mechanisms of the perception

    Science.gov (United States)

    Pisarchik, A.; Bashkirtseva, I.; Ryashko, L.

    2017-10-01

    Modern studies in physiology and cognitive neuroscience consider noise as an important constructive factor of brain functionality. Under adequate noise, the brain can rapidly access different ordered states and provide decision-making by preventing deadlocks. Bistable dynamic models are often used to study the underlying mechanisms of visual perception. In the present paper, we consider a bistable energy model subject to both additive and parametric noise. Using the catastrophe theory formalism and the stochastic sensitivity functions technique, we analyze the response of the equilibria to noise and study noise-induced transitions between equilibria. We demonstrate and analyze the effect of hysteresis squeezing as the noise intensity is increased. Stochastic bifurcations connected with the suppression of oscillations by parametric noise are discussed.

  5. Hybrid Stochastic Forecasting Model for Management of Large Open Water Reservoir with Storage Function

    Science.gov (United States)

    Kozel, Tomas; Stary, Milos

    2017-12-01

    The main advantage of stochastic forecasting is the fan of possible values that a deterministic forecast cannot provide; the future development of a random process is described better by stochastic than by deterministic forecasting, and the discharge in a measurement profile can be treated as a random process. The content of this article is the construction and application of a forecasting model for a managed large open water reservoir with a supply function. The model is based on neural networks (NS) and zone models; it forecasts values of average monthly flow from input values of average monthly flow, the trained neural network and random numbers. Part of the data was assigned to one moving zone, created around the last measured average monthly flow, and the correlation matrix was assembled only from data belonging to that zone. The model was compiled for forecasts of 1 to 12 months ahead, using from 2 to 11 preceding monthly flows (NS inputs) for model construction. The data were freed of asymmetry with the Box-Cox transformation (Box, Cox, 1964), with the parameter r found by optimization, and were then transformed to a standard normal distribution. The data have a monthly step and the forecast is not recurring. A 90-year-long real flow series was used to compile the model: the first 75 years were used for calibration of the model (the input-output relationship matrix) and the last 15 years only for validation. Model outputs were compared with the real flow series. For the comparison between the real flow series (a 100% successful forecast) and the forecasts, both were applied to the management of an artificially made reservoir. The course of water reservoir management using a genetic algorithm (GE) with the real flow series was compared with a fuzzy model (Fuzzy) driven by forecasts from the moving zone model. The best zone size was sought during the evaluation process. Results show that the highest number of inputs did not give the best results and that the ideal zone size lies in the interval from 25 to 35, when the course of management was almost the same for
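
    A minimal sketch of the pre-processing step described above: the skewness of the flow series is removed with a Box-Cox transform and the result is mapped to a standard normal scale. The flow series below is synthetic, and scipy's maximum-likelihood choice of the Box-Cox parameter stands in for the optimization mentioned in the abstract.

        # Box-Cox transform followed by standardization of a synthetic monthly flow series.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)
        monthly_flow = rng.lognormal(mean=3.0, sigma=0.6, size=900)   # synthetic flows, m3/s

        transformed, lam = stats.boxcox(monthly_flow)                 # Box-Cox, MLE lambda
        standardized = (transformed - transformed.mean()) / transformed.std()

        print(f"Box-Cox lambda: {lam:.3f}")
        print(f"skewness before: {stats.skew(monthly_flow):.2f}, after: {stats.skew(standardized):.2f}")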

  6. Calibration of PMIS pavement performance prediction models.

    Science.gov (United States)

    2012-02-01

    Improve the accuracy of TxDOT's existing pavement performance prediction models by calibrating these models using actual field data obtained from the Pavement Management Information System (PMIS). Ensure logical performance superiority patte...

  7. A simple topography-driven, calibration-free runoff generation model

    Science.gov (United States)

    Gao, H.; Birkel, C.; Hrachowitz, M.; Tetzlaff, D.; Soulsby, C.; Savenije, H. H. G.

    2017-12-01

    Determining the amount of runoff generation from rainfall occupies a central place in rainfall-runoff modelling. Moreover, reading landscapes and developing calibration-free runoff generation models that adequately reflect land surface heterogeneities remains the focus of much hydrological research. In this study, we created a new method to estimate runoff generation, the HAND-based Storage Capacity curve (HSC), which uses a topographic index (HAND, Height Above the Nearest Drainage) to identify hydrological similarity and, in part, the saturated areas of catchments. We then coupled the HSC model with the Mass Curve Technique (MCT) method to estimate root zone storage capacity (SuMax), obtaining the calibration-free runoff generation model HSC-MCT. Both models (HSC and HSC-MCT) allow us to estimate runoff generation and simultaneously visualize the spatial dynamics of the saturated area. We tested the two models in the data-rich Bruntland Burn (BB) experimental catchment in Scotland, which has an unusual time series of field-mapped saturation area extent. The models were subsequently tested in 323 MOPEX (Model Parameter Estimation Experiment) catchments in the United States. HBV and TOPMODEL were used as benchmarks. We found that the HSC performed better in reproducing the spatio-temporal pattern of the observed saturated areas in the BB catchment than TOPMODEL, which is based on the topographic wetness index (TWI). The HSC also outperformed HBV and TOPMODEL in the MOPEX catchments for both calibration and validation. Despite having no calibrated parameters, the HSC-MCT model also performed comparably well with the calibrated HBV and TOPMODEL, highlighting both the robustness of the HSC model in describing the spatial distribution of the root zone storage capacity and the efficiency of the MCT method in estimating SuMax. Moreover, the HSC-MCT model facilitated effective visualization of the saturated area, which has the potential to be used for broader
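
    A rough, hedged sketch of the intuition behind a HAND-based saturated-area estimate: cells with low Height Above the Nearest Drainage saturate first, so the empirical distribution of HAND behaves like a storage capacity curve. The HAND field below is synthetic and the mapping is a conceptual illustration only, not the published HSC formulation.

        # Conceptual illustration: fraction of a catchment saturated at a given "water level",
        # read off the empirical distribution of synthetic HAND values.
        import numpy as np

        rng = np.random.default_rng(3)
        hand = rng.gamma(shape=2.0, scale=5.0, size=10_000)   # synthetic HAND values (m)

        def saturated_fraction(water_level_m):
            """Fraction of cells whose HAND lies below the current water level."""
            return np.mean(hand <= water_level_m)

        for level in (1.0, 2.5, 5.0):
            print(f"water level {level:4.1f} m -> saturated area fraction {saturated_fraction(level):.2f}")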

  8. Stochastic and simulation models of maritime intercept operations capabilities

    OpenAIRE

    Sato, Hiroyuki

    2005-01-01

    The research formulates and exercises stochastic and simulation models to assess the Maritime Intercept Operations (MIO) capabilities. The models focus on the surveillance operations of the Maritime Patrol Aircraft (MPA). The analysis using the models estimates the probability with which a terrorist vessel (Red) is detected, correctly classified, and escorted for intensive investigation and neutralization before it leaves an area of interest (AOI). The difficulty of obtaining adequate int...

  9. On the abelianity of the stochastic sandpile model

    OpenAIRE

    Nunzi, François

    2016-01-01

    We consider a stochastic variant of the Abelian Sandpile Model (ASM) on a finite graph, introduced by Chan, Marckert and Selig. Even though it is a more general model, some nice properties still hold. We show that on a certain probability space, even if we lose the group structure due to topplings not being deterministic, some operators still commute. As a corollary, we show that the stationary distribution still does not depend on how sand grains are added onto the graph in our model, answer...

  10. CSL model checking of deterministic and stochastic Petri nets

    NARCIS (Netherlands)

    Martinez Verdugo, J.M.; Haverkort, Boudewijn R.H.M.; German, R.; Heindl, A.

    2006-01-01

    Deterministic and Stochastic Petri Nets (DSPNs) are a widely used high-level formalism for modeling discrete-event systems where events may occur either without consuming time, after a deterministic time, or after an exponentially distributed time. The underlying process defined by DSPNs, under

  11. Hydrological modeling using a multi-site stochastic weather generator

    Science.gov (United States)

    Weather data is usually required at several locations over a large watershed, especially when using distributed models for hydrological simulations. In many applications, spatially correlated weather data can be provided by a multi-site stochastic weather generator which considers the spatial correl...

  12. A single model procedure for tank calibration function estimation

    International Nuclear Information System (INIS)

    York, J.C.; Liebetrau, A.M.

    1995-01-01

    Reliable tank calibrations are a vital component of any measurement control and accountability program for bulk materials in a nuclear reprocessing facility. Tank volume calibration functions used in nuclear materials safeguards and accountability programs are typically constructed from several segments, each of which is estimated independently. Ideally, the segments correspond to structural features in the tank. In this paper the authors use an extension of the Thomas-Liebetrau model to estimate the entire calibration function in a single step. This procedure automatically takes significant run-to-run differences into account and yields an estimate of the entire calibration function in one operation. As with other procedures, the first step is to define suitable calibration segments. Next, a polynomial of low degree is specified for each segment. In contrast with the conventional practice of constructing a separate model for each segment, this information is used to set up the design matrix for a single model that encompasses all of the calibration data. Estimation of the model parameters is then done using conventional statistical methods. The method described here has several advantages over traditional methods. First, modeled run-to-run differences can be taken into account automatically at the estimation step. Second, no interpolation is required between successive segments. Third, variance estimates are based on all the data, rather than that from a single segment, with the result that discontinuities in confidence intervals at segment boundaries are eliminated. Fourth, the restrictive assumption of the Thomas-Liebetrau method, that the measured volumes be the same for all runs, is not required. Finally, the proposed methods are readily implemented using standard statistical procedures and widely-used software packages
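
    The single-model idea above can be illustrated with a small least-squares sketch in which one design matrix stacks per-segment polynomial columns, so all calibration segments are estimated in a single step. The segment boundaries, polynomial degrees and data are hypothetical, continuity constraints between segments are omitted for brevity, and the code is not the Thomas-Liebetrau model itself.

        # One design matrix, one least-squares fit for a segmented tank calibration curve.
        import numpy as np

        level = np.linspace(0.0, 300.0, 61)                 # liquid level (cm)
        rng = np.random.default_rng(4)
        volume = 2.0 * level + 0.004 * level**2 + rng.normal(0.0, 1.0, level.size)  # volume (L)

        breaks = [0.0, 120.0, 300.0]                        # two segments, knot at 120 cm
        degree = 2

        cols = []
        for i, (lo, hi) in enumerate(zip(breaks[:-1], breaks[1:])):
            if i == len(breaks) - 2:                        # last segment is closed on the right
                in_seg = ((level >= lo) & (level <= hi)).astype(float)
            else:
                in_seg = ((level >= lo) & (level < hi)).astype(float)
            for d in range(degree + 1):
                cols.append(in_seg * level**d)              # per-segment polynomial terms
        X = np.column_stack(cols)

        coef, *_ = np.linalg.lstsq(X, volume, rcond=None)   # all segments estimated at once
        print("estimated coefficients:", np.round(coef, 4))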

  13. Coarse-graining and hybrid methods for efficient simulation of stochastic multi-scale models of tumour growth

    International Nuclear Information System (INIS)

    Cruz, Roberto de la; Guerrero, Pilar; Calvo, Juan; Alarcón, Tomás

    2017-01-01

    The development of hybrid methodologies is of current interest in both multi-scale modelling and stochastic reaction–diffusion systems regarding their applications to biology. We formulate a hybrid method for stochastic multi-scale models of cell populations that extends the remit of existing hybrid methods for reaction–diffusion systems. Such a method is developed for a stochastic multi-scale model of tumour growth, i.e. population-dynamical models which account for the effects of intrinsic noise affecting both the number of cells and the intracellular dynamics. In order to formulate this method, we develop a coarse-grained approximation for both the full stochastic model and its mean-field limit. Such an approximation involves averaging out the age-structure (which accounts for the multi-scale nature of the model) by assuming that the age distribution of the population settles onto equilibrium very fast. We then couple the coarse-grained mean-field model to the full stochastic multi-scale model. By doing so, within the mean-field region, we are neglecting noise in both cell numbers (population) and their birth rates (structure). This implies that, in addition to the issues that arise in stochastic reaction-diffusion systems, we need to account for the age-structure of the population when attempting to couple both descriptions. We exploit our coarse-graining model so that, within the mean-field region, the age-distribution is in equilibrium and we know its explicit form. This allows us to couple both domains consistently, as upon transference of cells from the mean-field to the stochastic region, we sample the equilibrium age distribution. Furthermore, our method allows us to investigate the effects of intracellular noise, i.e. fluctuations of the birth rate, on collective properties such as travelling wave velocity. We show that the combination of population and birth-rate noise gives rise to large fluctuations of the birth rate in the region at the leading edge

  14. A global model for residential energy use: Uncertainty in calibration to regional data

    International Nuclear Information System (INIS)

    van Ruijven, Bas; van Vuuren, Detlef P.; de Vries, Bert; van der Sluijs, Jeroen P.

    2010-01-01

    Uncertainties in energy demand modelling allow for the development of different models, but also leave room for different calibrations of a single model. We apply an automated model calibration procedure to analyse calibration uncertainty of residential sector energy use modelling in the TIMER 2.0 global energy model. This model simulates energy use on the basis of changes in useful energy intensity, technology development (AEEI) and price responses (PIEEI). We find that different implementations of these factors yield behavioural model results. Model calibration uncertainty is identified as an influential source of variation in future projections, amounting to 30% to 100% around the best estimate. Energy modellers should systematically account for this and communicate calibration uncertainty ranges. (author)

  15. Bond-based linear indices of the non-stochastic and stochastic edge-adjacency matrix. 1. Theory and modeling of ChemPhys properties of organic molecules.

    Science.gov (United States)

    Marrero-Ponce, Yovani; Martínez-Albelo, Eugenio R; Casañola-Martín, Gerardo M; Castillo-Garit, Juan A; Echevería-Díaz, Yunaimy; Zaldivar, Vicente Romero; Tygat, Jan; Borges, José E Rodriguez; García-Domenech, Ramón; Torrens, Francisco; Pérez-Giménez, Facundo

    2010-11-01

    Novel bond-level molecular descriptors are proposed, based on linear maps similar to the ones defined in algebra theory. The kth edge-adjacency matrix (E(k)) denotes the matrix of bond linear indices (non-stochastic) with regard to the canonical basis set. The kth stochastic edge-adjacency matrix, ES(k), is here proposed as a new molecular representation easily calculated from E(k). Then, the kth stochastic bond linear indices are calculated using ES(k) as operators of linear transformations. In both cases, the bond-type formalism is developed. The kth non-stochastic and stochastic total linear indices are calculated by adding the kth non-stochastic and stochastic bond linear indices, respectively, of all bonds in the molecule. First, the new bond-based molecular descriptors (MDs) are tested for suitability, for the QSPRs, by analyzing regressions of the novel indices for selected physicochemical properties of octane isomers (first round). General performance of the new descriptors in these QSPR studies is evaluated with regard to the well-known sets of 2D/3D MDs. From the analysis, we can conclude that the non-stochastic and stochastic bond-based linear indices have an overall good modeling capability, proving their usefulness in QSPR studies. Later, the novel bond-level MDs are also used for the description and prediction of the boiling point of 28 alkyl-alcohols (second round), and for the modeling of the specific rate constant (log k), partition coefficient (log P), as well as the antibacterial activity of 34 derivatives of 2-furylethylenes (third round). The comparison with other approaches (edge- and vertex-based connectivity indices, total and local spectral moments, and quantum chemical descriptors as well as E-state/biomolecular encounter parameters) shows the good behavior of our method in these QSPR studies. Finally, the approach described in this study appears to be a very promising structural invariant, useful not only for QSPR studies but also for similarity
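
    As a hedged sketch of the matrix machinery these descriptors build on, the snippet below forms powers of a toy edge-adjacency matrix E, a row-normalised (stochastic) counterpart, and total indices obtained by summing over bonds. The 4-bond matrix and the unit weighting vector are arbitrary; the published descriptors use additional bond-type and physicochemical weightings that are not reproduced here.

        # Toy edge-adjacency matrix, its row-stochastic counterpart, and summed "total" indices.
        import numpy as np

        E = np.array([[0, 1, 0, 0],        # bonds sharing an atom in a 4-bond chain
                      [1, 0, 1, 0],
                      [0, 1, 0, 1],
                      [0, 0, 1, 0]], dtype=float)

        def stochastic(matrix):
            """Row-normalise so each row sums to one."""
            row_sums = matrix.sum(axis=1, keepdims=True)
            return np.divide(matrix, row_sums, out=np.zeros_like(matrix), where=row_sums > 0)

        w = np.ones(E.shape[0])            # placeholder bond-property weights

        for k in range(1, 4):
            Ek = np.linalg.matrix_power(E, k)
            ESk = np.linalg.matrix_power(stochastic(E), k)
            print(f"k={k}: total linear index {Ek.dot(w).sum():.2f}, "
                  f"stochastic total {ESk.dot(w).sum():.2f}")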

  16. Optimisation of timetable-based, stochastic transit assignment models based on MSA

    DEFF Research Database (Denmark)

    Nielsen, Otto Anker; Frederiksen, Rasmus Dyhr

    2006-01-01

    (CRM), such a large-scale transit assignment model was developed and estimated. The Stochastic User Equilibrium problem was solved by the Method of Successive Averages (MSA). However, the model suffered from very large calculation times. The paper focuses on how to optimise transit assignment models...

  17. Predictive sensor based x-ray calibration using a physical model

    International Nuclear Information System (INIS)

    Fuente, Matias de la; Lutz, Peter; Wirtz, Dieter C.; Radermacher, Klaus

    2007-01-01

    Many computer assisted surgery systems are based on intraoperative x-ray images. To achieve reliable and accurate results these images have to be calibrated concerning geometric distortions, which can be distinguished between constant distortions and distortions caused by magnetic fields. Instead of using an intraoperative calibration phantom that has to be visible within each image resulting in overlaying markers, the presented approach directly takes advantage of the physical background of the distortions. Based on a computed physical model of an image intensifier and a magnetic field sensor, an online compensation of distortions can be achieved without the need of an intraoperative calibration phantom. The model has to be adapted once to each specific image intensifier through calibration, which is based on an optimization algorithm systematically altering the physical model parameters, until a minimal error is reached. Once calibrated, the model is able to predict the distortions caused by the measured magnetic field vector and build an appropriate dewarping function. The time needed for model calibration is not yet optimized and takes up to 4 h on a 3 GHz CPU. In contrast, the time needed for distortion correction is less than 1 s and therefore absolutely acceptable for intraoperative use. First evaluations showed that by using the model based dewarping algorithm the distortions of an XRII with a 21 cm FOV could be significantly reduced. The model was able to predict and compensate distortions by approximately 80% to a remaining error of 0.45 mm (max) (0.19 mm rms)

  18. A stochastic aerodynamic model for stationary blades in unsteady 3D wind fields

    International Nuclear Information System (INIS)

    Fluck, Manuel; Crawford, Curran

    2016-01-01

    Dynamic loads play an important role in the design of wind turbines, but establishing the life-time aerodynamic loads (e.g. extreme and fatigue loads) is a computationally expensive task. Conventional (deterministic) methods to analyze long term loads, which rely on the repeated analysis of multiple different wind samples, are usually too expensive to be included in optimization routines. We present a new stochastic approach, which solves the aerodynamic system equations (Lagrangian vortex model) in the stochastic space, and thus arrives directly at a stochastic description of the coupled loads along a turbine blade. This new approach removes the requirement of analyzing multiple different realizations. Instead, long term loads can be extracted from a single stochastic solution, a procedure that is obviously significantly faster. Despite the reduced analysis time, results obtained from the stochastic approach match deterministic results well for a simple test case (a stationary blade). In future work, the stochastic method will be extended to rotating blades, thus opening up new avenues to include long term loads into turbine optimization. (paper)

  19. Stochastic Greybox Modeling for Control of an Alternating Activated Sludge Process

    DEFF Research Database (Denmark)

    Halvgaard, Rasmus Fogtmann; Vezzaro, Luca; Grum, M.

    We present a stochastic greybox model of a BioDenitro WWTP that can be used for short time horizon Model Predictive Control. The model is based on a simplified ASM1 model and takes model uncertainty into account. It estimates unmeasured state variables in the system, e.g. the inlet concentration...

  20. Calibration models for high enthalpy calorimetric probes.

    Science.gov (United States)

    Kannel, A

    1978-07-01

    The accuracy of gas-aspirated liquid-cooled calorimetric probes used for measuring the enthalpy of high-temperature gas streams is studied. The error in the differential temperature measurements caused by internal and external heat transfer interactions is considered and quantified by mathematical models. The analysis suggests calibration methods for the evaluation of dimensionless heat transfer parameters in the models, which then can give a more accurate value for the enthalpy of the sample. Calibration models for four types of calorimeters are applied to results from the literature and from our own experiments: a circular slit calorimeter developed by the author, single-cooling jacket probe, double-cooling jacket probe, and split-flow cooling jacket probe. The results show that the models are useful for describing and correcting the temperature measurements.

  1. A stochastic phase-field model determined from molecular dynamics

    KAUST Repository

    von Schwerin, Erik

    2010-03-17

    The dynamics of dendritic growth of a crystal in an undercooled melt is determined by macroscopic diffusion-convection of heat and by capillary forces acting on the nanometer scale of the solid-liquid interface width. Its modelling is useful for instance in processing techniques based on casting. The phase-field method is widely used to study evolution of such microstructural phase transformations on a continuum level; it couples the energy equation to a phenomenological Allen-Cahn/Ginzburg-Landau equation modelling the dynamics of an order parameter determining the solid and liquid phases, including also stochastic fluctuations to obtain the qualitatively correct result of dendritic side branching. This work presents a method to determine stochastic phase-field models from atomistic formulations by coarse-graining molecular dynamics. It has three steps: (1) a precise quantitative atomistic definition of the phase-field variable, based on the local potential energy; (2) derivation of its coarse-grained dynamics model, from microscopic Smoluchowski molecular dynamics (that is Brownian or over damped Langevin dynamics); and (3) numerical computation of the coarse-grained model functions. The coarse-grained model approximates Gibbs ensemble averages of the atomistic phase-field, by choosing coarse-grained drift and diffusion functions that minimize the approximation error of observables in this ensemble average. © EDP Sciences, SMAI, 2010.

  2. A stochastic phase-field model determined from molecular dynamics

    KAUST Repository

    von Schwerin, Erik; Szepessy, Anders

    2010-01-01

    The dynamics of dendritic growth of a crystal in an undercooled melt is determined by macroscopic diffusion-convection of heat and by capillary forces acting on the nanometer scale of the solid-liquid interface width. Its modelling is useful for instance in processing techniques based on casting. The phase-field method is widely used to study evolution of such microstructural phase transformations on a continuum level; it couples the energy equation to a phenomenological Allen-Cahn/Ginzburg-Landau equation modelling the dynamics of an order parameter determining the solid and liquid phases, including also stochastic fluctuations to obtain the qualitatively correct result of dendritic side branching. This work presents a method to determine stochastic phase-field models from atomistic formulations by coarse-graining molecular dynamics. It has three steps: (1) a precise quantitative atomistic definition of the phase-field variable, based on the local potential energy; (2) derivation of its coarse-grained dynamics model, from microscopic Smoluchowski molecular dynamics (that is Brownian or over damped Langevin dynamics); and (3) numerical computation of the coarse-grained model functions. The coarse-grained model approximates Gibbs ensemble averages of the atomistic phase-field, by choosing coarse-grained drift and diffusion functions that minimize the approximation error of observables in this ensemble average. © EDP Sciences, SMAI, 2010.

  3. Using genetic algorithm to solve a new multi-period stochastic optimization model

    Science.gov (United States)

    Zhang, Xin-Li; Zhang, Ke-Cun

    2009-09-01

    This paper presents a new asset allocation model based on the CVaR risk measure and transaction costs. Institutional investors manage their strategic asset mix over time to achieve favorable returns subject to various uncertainties, policy and legal constraints, and other requirements. One may use a multi-period portfolio optimization model in order to determine an optimal asset mix. Recently, an alternative stochastic programming model with simulated paths was proposed by Hibiki [N. Hibiki, A hybrid simulation/tree multi-period stochastic programming model for optimal asset allocation, in: H. Takahashi, (Ed.) The Japanese Association of Financial Econometrics and Engineering, JAFFE Journal (2001) 89-119 (in Japanese); N. Hibiki A hybrid simulation/tree stochastic optimization model for dynamic asset allocation, in: B. Scherer (Ed.), Asset and Liability Management Tools: A Handbook for Best Practice, Risk Books, 2003, pp. 269-294], which was called a hybrid model. However, the transaction costs weren't considered in that paper. In this paper, we improve Hibiki's model in the following aspects: (1) The risk measure CVaR is introduced to control the wealth loss risk while maximizing the expected utility; (2) Typical market imperfections such as short sale constraints, proportional transaction costs are considered simultaneously. (3) Applying a genetic algorithm to solve the resulting model is discussed in detail. Numerical results show the suitability and feasibility of our methodology.
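
    The CVaR ingredient of such a model can be illustrated in a few lines: given simulated portfolio loss scenarios, CVaR at level beta is the mean loss in the worst (1 - beta) tail. Scenario generation, transaction costs and the genetic algorithm itself are left out, and the numbers below are synthetic.

        # Empirical VaR and CVaR from simulated portfolio loss scenarios.
        import numpy as np

        rng = np.random.default_rng(5)
        losses = rng.normal(loc=0.0, scale=0.04, size=10_000)   # simulated portfolio losses

        def cvar(loss_scenarios, beta=0.95):
            var = np.quantile(loss_scenarios, beta)              # Value-at-Risk threshold
            tail = loss_scenarios[loss_scenarios >= var]         # worst (1 - beta) tail
            return tail.mean()

        print(f"VaR(95%):  {np.quantile(losses, 0.95):.4f}")
        print(f"CVaR(95%): {cvar(losses):.4f}")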

  4. Stochastic resonance in a generalized Von Foerster population growth model

    Energy Technology Data Exchange (ETDEWEB)

    Lumi, N.; Mankin, R. [Institute of Mathematics and Natural Sciences, Tallinn University, 25 Narva Road, 10120 Tallinn (Estonia)

    2014-11-12

    The stochastic dynamics of a population growth model, similar to the Von Foerster model for human population, is studied. The influence of fluctuating environment on the carrying capacity is modeled as a multiplicative dichotomous noise. It is established that an interplay between nonlinearity and environmental fluctuations can cause single unidirectional discontinuous transitions of the mean population size versus the noise amplitude, i.e., an increase of noise amplitude can induce a jump from a state with a moderate number of individuals to that with a very large number, while by decreasing the noise amplitude an opposite transition cannot be effected. An analytical expression of the mean escape time for such transitions is found. Particularly, it is shown that the mean transition time exhibits a strong minimum at intermediate values of noise correlation time, i.e., the phenomenon of stochastic resonance occurs. Applications of the results in ecology are also discussed.

  5. Efficient Stochastic Inversion Using Adjoint Models and Kernel-PCA

    Energy Technology Data Exchange (ETDEWEB)

    Thimmisetty, Charanraj A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). Center for Applied Scientific Computing; Zhao, Wenju [Florida State Univ., Tallahassee, FL (United States). Dept. of Scientific Computing; Chen, Xiao [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). Center for Applied Scientific Computing; Tong, Charles H. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). Center for Applied Scientific Computing; White, Joshua A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). Atmospheric, Earth and Energy Division

    2017-10-18

    Performing stochastic inversion on a computationally expensive forward simulation model with a high-dimensional uncertain parameter space (e.g. a spatial random field) is computationally prohibitive even when gradient information can be computed efficiently. Moreover, the ‘nonlinear’ mapping from parameters to observables generally gives rise to non-Gaussian posteriors even with Gaussian priors, thus hampering the use of efficient inversion algorithms designed for models with Gaussian assumptions. In this paper, we propose a novel Bayesian stochastic inversion methodology, which is characterized by a tight coupling between the gradient-based Langevin Markov Chain Monte Carlo (LMCMC) method and a kernel principal component analysis (KPCA). This approach addresses the ‘curse-of-dimensionality’ via KPCA to identify a low-dimensional feature space within the high-dimensional and nonlinearly correlated parameter space. In addition, non-Gaussian posterior distributions are estimated via an efficient LMCMC method on the projected low-dimensional feature space. We will demonstrate this computational framework by integrating and adapting our recent data-driven statistics-on-manifolds constructions and reduction-through-projection techniques to a linear elasticity model.
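
    The dimension-reduction step of this kind of approach can be sketched with scikit-learn's KernelPCA: a high-dimensional, nonlinearly correlated parameter ensemble is mapped to a few feature-space coordinates in which an MCMC sampler (not shown) could operate. The synthetic "random field" ensemble, the kernel choice and its parameters are assumptions; the paper's Langevin MCMC and adjoint gradients are not reproduced.

        # Kernel PCA reduction of a synthetic, nonlinearly correlated parameter ensemble.
        import numpy as np
        from sklearn.decomposition import KernelPCA

        rng = np.random.default_rng(6)
        n_samples, n_grid = 300, 400
        base = rng.normal(size=(n_samples, 5))
        mixing = rng.normal(size=(5, n_grid))
        ensemble = np.tanh(base @ mixing)            # nonlinearly correlated "random fields"

        kpca = KernelPCA(n_components=5, kernel="rbf", gamma=1e-3, fit_inverse_transform=True)
        features = kpca.fit_transform(ensemble)      # low-dimensional feature-space coordinates
        reconstructed = kpca.inverse_transform(features)

        print("feature space shape:", features.shape)
        print("reconstruction RMSE:", np.sqrt(np.mean((ensemble - reconstructed) ** 2)))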

  6. Combining a popularity-productivity stochastic block model with a discriminative-content model for general structure detection.

    Science.gov (United States)

    Chai, Bian-fang; Yu, Jian; Jia, Cai-Yan; Yang, Tian-bao; Jiang, Ya-wen

    2013-07-01

    Latent community discovery that combines links and contents of a text-associated network has drawn more attention with the advance of social media. Most of the previous studies aim at detecting densely connected communities and are not able to identify general structures, e.g., bipartite structure. Several variants based on the stochastic block model are more flexible for exploring general structures by introducing link probabilities between communities. However, these variants cannot identify the degree distributions of real networks due to a lack of modeling of the differences among nodes, and they are not suitable for discovering communities in text-associated networks because they ignore the contents of nodes. In this paper, we propose a popularity-productivity stochastic block (PPSB) model by introducing two random variables, popularity and productivity, to model the differences among nodes in receiving links and producing links, respectively. This model has the flexibility of existing stochastic block models in discovering general community structures and inherits the richness of previous models that also exploit popularity and productivity in modeling the real scale-free networks with power law degree distributions. To incorporate the contents in text-associated networks, we propose a combined model which combines the PPSB model with a discriminative model that models the community memberships of nodes by their contents. We then develop expectation-maximization (EM) algorithms to infer the parameters in the two models. Experiments on synthetic and real networks have demonstrated that the proposed models can yield better performances than previous models, especially on networks with general structures.

  7. Persistence and extinction of a stochastic single-specie model under regime switching in a polluted environment.

    Science.gov (United States)

    Liu, Meng; Wang, Ke

    2010-06-07

    A new single-species model disturbed by both white noise and colored noise in a polluted environment is developed and analyzed. Sufficient criteria for extinction, stochastic nonpersistence in the mean, stochastic weak persistence in the mean, stochastic strong persistence in the mean and stochastic permanence of the species are established. The threshold between stochastic weak persistence in the mean and extinction is obtained. The results show that both white and colored environmental noises have significant effects on the survival results. Copyright (c) 2010 Elsevier Ltd. All rights reserved.

  8. Data-driven remaining useful life prognosis techniques stochastic models, methods and applications

    CERN Document Server

    Si, Xiao-Sheng; Hu, Chang-Hua

    2017-01-01

    This book introduces data-driven remaining useful life prognosis techniques, and shows how to utilize the condition monitoring data to predict the remaining useful life of stochastic degrading systems and to schedule maintenance and logistics plans. It is also the first book that describes the basic data-driven remaining useful life prognosis theory systematically and in detail. The emphasis of the book is on the stochastic models, methods and applications employed in remaining useful life prognosis. It includes a wealth of degradation monitoring experiment data, practical prognosis methods for remaining useful life in various cases, and a series of applications incorporated into prognostic information in decision-making, such as maintenance-related decisions and ordering spare parts. It also highlights the latest advances in data-driven remaining useful life prognosis techniques, especially in the contexts of adaptive prognosis for linear stochastic degrading systems, nonlinear degradation modeling based pro...

  9. An extension of Clarke's model with stochastic amplitude flip processes

    KAUST Repository

    Hoel, Hakon

    2014-07-01

    Stochastic modeling is an essential tool for studying statistical properties of wireless channels. In multipath fading channel (MFC) models, the signal reception is modeled by a sum of wave path contributions, and Clarke's model is an important example that has been widely accepted in many wireless applications. However, since Clarke's model is temporally deterministic, Feng and Field noted that it does not model real wireless channels with time-varying randomness well. Here, we extend Clarke's model to a novel time-varying stochastic MFC model with scatterers randomly flipping on and off. Statistical properties of the MFC model are analyzed and shown to fit well with real signal measurements, and a limit Gaussian process is derived from the model when the number of active wave paths tends to infinity. A second focus of this work is a comparison study of the error and computational cost of generating signal realizations from the MFC model and from its limit Gaussian process. By rigorous analysis and numerical studies, we show that in many settings, signal realizations are generated more efficiently by Gaussian process algorithms than by the MFC model's algorithm. Numerical examples that strengthen these observations are also presented. © 2014 IEEE.
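
    For orientation, the sketch below implements the baseline Clarke-type sum-of-sinusoids channel h(t) = N^{-1/2} * sum_n exp(j(2*pi*f_D*cos(alpha_n)*t + phi_n)); the record above extends this with random on/off flipping of the scatterers, which is not implemented here. The number of paths, the Doppler frequency and the time grid are arbitrary choices.

        # Baseline Clarke-type sum-of-sinusoids fading channel (no scatterer flipping).
        import numpy as np

        rng = np.random.default_rng(7)
        N = 64                                   # number of wave paths (scatterers)
        f_d = 100.0                              # maximum Doppler frequency (Hz)
        t = np.arange(0.0, 0.1, 1e-4)            # time grid (s)

        alpha = rng.uniform(0.0, 2 * np.pi, N)   # angles of arrival
        phi = rng.uniform(0.0, 2 * np.pi, N)     # initial phases

        # h(t) = (1/sqrt(N)) * sum_n exp(j(2*pi*f_d*cos(alpha_n)*t + phi_n))
        h = np.exp(1j * (2 * np.pi * f_d * np.cos(alpha)[:, None] * t + phi[:, None])).sum(0) / np.sqrt(N)

        print("mean power E|h|^2 ~", np.mean(np.abs(h) ** 2))   # should be close to 1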

  10. SURF Model Calibration Strategy

    Energy Technology Data Exchange (ETDEWEB)

    Menikoff, Ralph [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-03-10

    SURF and SURFplus are high explosive reactive burn models for shock initiation and propagation of detonation waves. They are engineering models motivated by the ignition & growth concept of hot spots and, for SURFplus, a second slow reaction for the energy release from carbon clustering. A key feature of the SURF model is that there is a partial decoupling between model parameters and detonation properties. This enables reduced sets of independent parameters to be calibrated sequentially for the initiation and propagation regimes. Here we focus on a methodology for fitting the initiation parameters to Pop plot data, based on 1-D simulations to compute a numerical Pop plot. In addition, the strategy for fitting the remaining parameters for the propagation regime and the failure diameter is discussed.

  11. Modeling and Calibration of a Novel One-Mirror Galvanometric Laser Scanner

    Directory of Open Access Journals (Sweden)

    Chengyi Yu

    2017-01-01

    Full Text Available A laser stripe sensor has limited application when a point cloud of geometric samples on the surface of the object needs to be collected, so a galvanometric laser scanner is designed by using a one-mirror galvanometer element as its mechanical device to drive the laser stripe to sweep along the object. A novel mathematical model is derived for the proposed galvanometer laser scanner without any position assumptions and then a model-driven calibration procedure is proposed. Compared with available model-driven approaches, the influence of machining and assembly errors is considered in the proposed model. Meanwhile, a plane-constraint-based approach is proposed to extract a large number of calibration points effectively and accurately to calibrate the galvanometric laser scanner. Repeatability and accuracy of the galvanometric laser scanner are evaluated on the automobile production line to verify the efficiency and accuracy of the proposed calibration method. Experimental results show that the proposed calibration approach yields similar measurement performance compared with a look-up table calibration method.

  12. Three-dimensional DFN Model Development and Calibration: A Case Study for Pahute Mesa, Nevada National Security Site

    Science.gov (United States)

    Pham, H. V.; Parashar, R.; Sund, N. L.; Pohlmann, K.

    2017-12-01

    Pahute Mesa, located in the north-western region of the Nevada National Security Site, is an area where numerous underground nuclear tests were conducted. The mesa contains several fractured aquifers that can potentially provide high permeability pathways for migration of radionuclides away from testing locations. The BULLION Forced-Gradient Experiment (FGE) conducted on Pahute Mesa injected and pumped solute and colloid tracers from a system of three wells for obtaining site-specific information about the transport of radionuclides in fractured rock aquifers. This study aims to develop reliable three-dimensional discrete fracture network (DFN) models to simulate the BULLION FGE as a means for computing realistic ranges of important parameters describing fractured rock. Multiple conceptual DFN models were developed using dfnWorks, a parallelized computational suite developed by Los Alamos National Laboratory, to simulate flow and conservative particle movement in subsurface fractured rocks downgradient from the BULLION test. The model domain is 100x200x100 m and includes the three tracer-test wells of the BULLION FGE and the Pahute Mesa Lava-flow aquifer. The model scenarios considered differ from each other in terms of boundary conditions and fracture density. For each conceptual model, a number of statistically equivalent fracture network realizations were generated using data from fracture characterization studies. We adopt the covariance matrix adaptation-evolution strategy (CMA-ES) as a global local stochastic derivative-free optimization method to calibrate the DFN models using groundwater levels and tracer breakthrough data obtained from the three wells. Models of fracture apertures based on fracture type and size are proposed and the values of apertures in each model are estimated during model calibration. The ranges of fracture aperture values resulting from this study are expected to enhance understanding of radionuclide transport in fractured rocks and

  13. Stochastic geometry, spatial statistics and random fields models and algorithms

    CERN Document Server

    2015-01-01

    Providing a graduate level introduction to various aspects of stochastic geometry, spatial statistics and random fields, this volume places a special emphasis on fundamental classes of models and algorithms as well as on their applications, for example in materials science, biology and genetics. This book has a strong focus on simulations and includes extensive codes in Matlab and R, which are widely used in the mathematical community. It can be regarded as a continuation of the recent volume 2068 of Lecture Notes in Mathematics, where other issues of stochastic geometry, spatial statistics and random fields were considered, with a focus on asymptotic methods.

  14. A stochastic formulation of the Bass model of new-product diffusion

    Directory of Open Access Journals (Sweden)

    Niu Shun-Chen

    2002-01-01

    Full Text Available For a large variety of new products, the Bass Model (BM) describes the empirical cumulative-adoptions curve extremely well. The BM postulates that the trajectory of cumulative adoptions of a new product follows a deterministic function whose instantaneous growth rate depends on two parameters, one of which captures an individual's intrinsic tendency to purchase, independent of the number of previous adopters, and the other captures a positive force of influence on an individual by previous adopters. In this paper, we formulate a stochastic version of the BM, which we call the Stochastic Bass Model (SBM), where the trajectory of the cumulative number of adoptions is governed by a pure birth process. We show that with an appropriately-chosen set of birth rates, the fractions of individuals who have adopted the product by time t in a family of SBMs indexed by the size of the target population converge in probability to the deterministic fraction in a corresponding BM, when the population size approaches infinity. The formulation therefore supports and expands the BM by allowing stochastic trajectories.
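
    A pure-birth simulation in the spirit of the SBM can be sketched as follows: with n adopters out of a target population of m, the next adoption occurs at exponential rate (p + q*n/m)*(m - n). This standard Bass-style rate is an assumption for illustration; the paper derives its own family of birth rates.

        # Pure birth process with a Bass-style adoption rate.
        import numpy as np

        def simulate_adoptions(m=1000, p=0.03, q=0.38, t_end=15.0, rng=None):
            rng = rng or np.random.default_rng(8)
            t, n, times = 0.0, 0, [0.0]
            while n < m:
                rate = (p + q * n / m) * (m - n)        # birth rate with n current adopters
                t += rng.exponential(1.0 / rate)        # waiting time to the next adoption
                if t > t_end:
                    break
                n += 1
                times.append(t)
            return np.array(times), n

        times, adopters = simulate_adoptions()
        print(f"adopters by t=15: {adopters} out of 1000")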

  15. Stochastic linear hybrid systems: Modeling, estimation, and application

    Science.gov (United States)

    Seah, Chze Eng

    Hybrid systems are dynamical systems which have interacting continuous state and discrete state (or mode). Accurate modeling and state estimation of hybrid systems are important in many applications. We propose a hybrid system model, known as the Stochastic Linear Hybrid System (SLHS), to describe hybrid systems with stochastic linear system dynamics in each mode and stochastic continuous-state-dependent mode transitions. We then develop a hybrid estimation algorithm, called the State-Dependent-Transition Hybrid Estimation (SDTHE) algorithm, to estimate the continuous state and discrete state of the SLHS from noisy measurements. It is shown that the SDTHE algorithm is more accurate or more computationally efficient than existing hybrid estimation algorithms. Next, we develop a performance analysis algorithm to evaluate the performance of the SDTHE algorithm in a given operating scenario. We also investigate sufficient conditions for the stability of the SDTHE algorithm. The proposed SLHS model and SDTHE algorithm are illustrated to be useful in several applications. In Air Traffic Control (ATC), to facilitate implementations of new efficient operational concepts, accurate modeling and estimation of aircraft trajectories are needed. In ATC, an aircraft's trajectory can be divided into a number of flight modes. Furthermore, as the aircraft is required to follow a given flight plan or clearance, its flight mode transitions are dependent of its continuous state. However, the flight mode transitions are also stochastic due to navigation uncertainties or unknown pilot intents. Thus, we develop an aircraft dynamics model in ATC based on the SLHS. The SDTHE algorithm is then used in aircraft tracking applications to estimate the positions/velocities of aircraft and their flight modes accurately. Next, we develop an aircraft conformance monitoring algorithm to detect any deviations of aircraft trajectories in ATC that might compromise safety. In this application, the SLHS

  16. Expected utility and catastrophic risk in a stochastic economy-climate model

    Energy Technology Data Exchange (ETDEWEB)

    Ikefuji, M. [Institute of Social and Economic Research, Osaka University, Osaka (Japan); Laeven, R.J.A.; Magnus, J.R. [Department of Econometrics and Operations Research, Tilburg University, Tilburg (Netherlands); Muris, C. [CentER, Tilburg University, Tilburg (Netherlands)

    2010-11-15

    In the context of extreme climate change, we ask how to conduct expected utility analysis in the presence of catastrophic risks. Economists typically model decision making under risk and uncertainty by expected utility with constant relative risk aversion (power utility); statisticians typically model economic catastrophes by probability distributions with heavy tails. Unfortunately, the expected utility framework is fragile with respect to heavy-tailed distributional assumptions. We specify a stochastic economy-climate model with power utility and explicitly demonstrate this fragility. We derive necessary and sufficient compatibility conditions on the utility function to avoid fragility and solve our stochastic economy-climate model for two examples of such compatible utility functions. We further develop and implement a procedure to learn the input parameters of our model and show that the model thus specified produces quite robust optimal policies. The numerical results indicate that higher levels of uncertainty (heavier tails) lead to less abatement and consumption, and to more investment, but this effect is not unlimited.

  17. Regionalization of the Modified Bartlett-Lewis Rectangular Pulse Stochastic Rainfall Model

    Directory of Open Access Journals (Sweden)

    Dongkyun Kim

    2013-01-01

    Full Text Available Parameters of the Modified Bartlett-Lewis Rectangular Pulse (MBLRP) stochastic rainfall simulation model were regionalized across the contiguous United States. Three thousand four hundred forty-four National Climate Data Center (NCDC) rain gauges were used to obtain spatial and seasonal patterns of the model parameters. The MBLRP model was calibrated to minimize the discrepancy between the precipitation depth statistics of the observed and MBLRP-generated precipitation time series. These statistics included the mean, variance, probability of zero rainfall and autocorrelation at 1-, 3-, 12- and 24-hour accumulation intervals. The Ordinary Kriging interpolation technique was used to generate maps of the six MBLRP model parameters for each of the 12 months of the year. All parameters had clear to discernible regional tendencies, except for one related to the rain cell duration distribution. Parameter seasonality was not obvious, and it was more apparent in some locations than in others, depending on the seasonality of the rainfall statistics. Cross-validation was used to assess the validity of the parameter maps. The results indicate that the suggested maps reproduce well the observed rainfall statistics for different accumulation intervals, except for the lag-1 autocorrelation coefficient. The boundaries of the expected residual, with 95% confidence, between the observed rainfall statistics and the simulated rainfall statistics based on the map parameters were approximately ±0.064 mm hr^-1, ±1.63 mm^2 hr^-2, ±0.16, and ±0.030 for the mean, variance, lag-1 autocorrelation and probability of zero rainfall at hourly accumulation levels, respectively. The estimated parameter values were also used to estimate the storm and rain cell characteristics.

  18. Testing of a one dimensional model for Field II calibration

    DEFF Research Database (Denmark)

    Bæk, David; Jensen, Jørgen Arendt; Willatzen, Morten

    2008-01-01

    Field II is a program for simulating ultrasound transducer fields. It is capable of calculating the emitted and pulse-echoed fields for both pulsed and continuous wave transducers. To make it fully calibrated a model of the transducer’s electro-mechanical impulse response must be included. We...... examine an adapted one dimensional transducer model originally proposed by Willatzen [9] to calibrate Field II. This model is modified to calculate the required impulse responses needed by Field II for a calibrated field pressure and external circuit current calculation. The testing has been performed...... to the calibrated Field II program for 1, 4, and 10 cycle excitations. Two parameter sets were applied for modeling, one real valued Pz27 parameter set, manufacturer supplied, and one complex valued parameter set found in literature, Alguer´o et al. [11]. The latter implicitly accounts for attenuation. Results show...

  19. Stochastic model of Rayleigh-Taylor turbulent mixing

    International Nuclear Information System (INIS)

    Abarzhi, S.I.; Cadjan, M.; Fedotov, S.

    2007-01-01

    We propose a stochastic model to describe the random character of the dissipation process in Rayleigh-Taylor turbulent mixing. The parameter alpha, used conventionally to characterize the mixing growth-rate, is not a universal constant and is very sensitive to the statistical properties of the dissipation. The ratio between the rates of momentum loss and momentum gain is the statistic invariant and a robust parameter to diagnose with or without turbulent diffusion accounted for

  20. Stochastic E2F activation and reconciliation of phenomenological cell-cycle models.

    Science.gov (United States)

    Lee, Tae J; Yao, Guang; Bennett, Dorothy C; Nevins, Joseph R; You, Lingchong

    2010-09-21

    The transition of the mammalian cell from quiescence to proliferation is a highly variable process. Over the last four decades, two lines of apparently contradictory, phenomenological models have been proposed to account for such temporal variability. These include various forms of the transition probability (TP) model and the growth control (GC) model, which lack mechanistic details. The GC model was further proposed as an alternative explanation for the concept of the restriction point, which we recently demonstrated as being controlled by a bistable Rb-E2F switch. Here, through a combination of modeling and experiments, we show that these different lines of models in essence reflect different aspects of stochastic dynamics in cell cycle entry. In particular, we show that the variable activation of E2F can be described by stochastic activation of the bistable Rb-E2F switch, which in turn may account for the temporal variability in cell cycle entry. Moreover, we show that temporal dynamics of E2F activation can be recast into the frameworks of both the TP model and the GC model via parameter mapping. This mapping suggests that the two lines of phenomenological models can be reconciled through the stochastic dynamics of the Rb-E2F switch. It also suggests a potential utility of the TP or GC models in defining concise, quantitative phenotypes of cell physiology. This may have implications in classifying cell types or states.