WorldWideScience

Sample records for assimilation sequential monte

  1. Joint Data Assimilation and Parameter Calibration in on-line groundwater modelling using Sequential Monte Carlo techniques

    Science.gov (United States)

    Ramgraber, M.; Schirmer, M.

    2017-12-01

    As computational power grows and wireless sensor networks find their way into common practice, it becomes increasingly feasible to pursue on-line numerical groundwater modelling. The reconciliation of model predictions with sensor measurements often necessitates the application of Sequential Monte Carlo (SMC) techniques, most prominently represented by the Ensemble Kalman Filter. In the pursuit of on-line predictions it seems advantageous to transcend the scope of pure data assimilation and incorporate on-line parameter calibration as well. Unfortunately, the interplay between shifting model parameters and transient states is non-trivial. Several recent publications (e.g. Chopin et al., 2013, Kantas et al., 2015) in the field of statistics discuss potential algorithms addressing this issue. However, most of these are computationally intractable for on-line application. In this study, we investigate to what extent compromises between mathematical rigour and computational restrictions can be made within the framework of on-line numerical modelling of groundwater. Preliminary studies are conducted in a synthetic setting, with the goal of transferring the conclusions drawn into application in a real-world setting. To this end, a wireless sensor network has been established in the valley aquifer around Fehraltorf, characterized by a highly dynamic groundwater system and located about 20 km to the east of Zürich, Switzerland. By providing continuous probabilistic estimates of the state and parameter distribution, a steady base for branched-off predictive scenario modelling could be established, providing water authorities with advanced tools for assessing the impact of groundwater management practices. Chopin, N., Jacob, P.E. and Papaspiliopoulos, O. (2013): SMC2: an efficient algorithm for sequential analysis of state space models. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 75 (3), p. 397-426. Kantas, N., Doucet, A., Singh, S.S., Maciejowski, J. and Chopin, N. (2015): On particle methods for parameter estimation in state-space models. Statistical Science, 30 (3), p. 328-351.
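
    To make the kind of compromise discussed above concrete, the sketch below carries a one-dimensional aquifer-head state and an uncertain recession parameter jointly through a bootstrap particle filter, keeping the parameter ensemble diverse with a small artificial random walk. The toy model, noise levels and jitter are illustrative assumptions, not the study's configuration.

```python
import numpy as np

rng = np.random.default_rng(42)

def step(h, k):
    # Toy linear-reservoir model: head h decays at uncertain rate k, constant recharge.
    return h + 0.1 * (1.0 - k * h)

# Synthetic truth and noisy head observations (entirely assumed, for illustration)
k_true, h_true = 0.5, 2.0
obs = []
for _ in range(50):
    h_true = step(h_true, k_true)
    obs.append(h_true + 0.05 * rng.standard_normal())

# Joint state-parameter bootstrap particle filter
n = 1000
h_p = 2.0 + 0.5 * rng.standard_normal(n)       # state particles
k_p = rng.uniform(0.1, 1.0, n)                 # parameter particles
for y in obs:
    k_p = k_p + 0.01 * rng.standard_normal(n)  # artificial parameter jitter
    h_p = step(h_p, k_p)                       # forecast every particle
    w = np.exp(-0.5 * ((y - h_p) / 0.05) ** 2) # Gaussian observation weights
    w /= w.sum()
    idx = rng.choice(n, size=n, p=w)           # multinomial resampling
    h_p, k_p = h_p[idx], k_p[idx]

print(f"posterior mean k = {k_p.mean():.2f} (truth {k_true})")
```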

  2. Multilevel sequential Monte Carlo samplers

    KAUST Repository

    Beskos, Alexandros; Jasra, Ajay; Law, Kody; Tempone, Raul; Zhou, Yan

    2016-01-01

    In this article we consider the approximation of expectations w.r.t. probability distributions associated with the solution of partial differential equations (PDEs); this scenario appears routinely in Bayesian inverse problems. In practice, one often has to solve the associated PDE numerically, using, for instance, finite element methods which depend on the step-size level h_L. In addition, the expectation cannot be computed analytically and one often resorts to Monte Carlo methods. In the context of this problem, it is known that the introduction of the multilevel Monte Carlo (MLMC) method can reduce the amount of computational effort to estimate expectations, for a given level of error. This is achieved via a telescoping identity associated to a Monte Carlo approximation of a sequence of probability distributions with discretization levels ∞ > h_0 > h_1 > ⋯ > h_L. In many practical problems of interest, one cannot achieve an i.i.d. sampling of the associated sequence and a sequential Monte Carlo (SMC) version of the MLMC method is introduced to deal with this problem. It is shown that under appropriate assumptions, the attractive property of a reduction of the amount of computational effort to estimate expectations, for a given level of error, can be maintained within the SMC context, relative to exact sampling and Monte Carlo for the distribution at the finest level h_L. The approach is numerically illustrated on a Bayesian inverse problem. © 2016 Elsevier B.V.
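
    The telescoping identity referred to here is the standard MLMC decomposition; in the abstract's notation (a textbook statement, not a quotation from the paper):

$$
\mathbb{E}_{h_L}[f] \;=\; \mathbb{E}_{h_0}[f] \;+\; \sum_{l=1}^{L}\Big(\mathbb{E}_{h_l}[f]-\mathbb{E}_{h_{l-1}}[f]\Big).
$$

    Each difference is estimated from samples coupled across the neighbouring levels h_l and h_{l-1}; because the differences shrink as the discretization is refined, most samples can be drawn at the coarse, cheap levels. The SMC version replaces the unavailable i.i.d. samples at each level with weighted particle approximations.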

  3. Comparison of Sequential and Variational Data Assimilation

    Science.gov (United States)

    Alvarado Montero, Rodolfo; Schwanenberg, Dirk; Weerts, Albrecht

    2017-04-01

    Data assimilation is a valuable tool to improve model state estimates by combining measured observations with model simulations. It has recently gained significant attention due to its potential in using remote sensing products to improve operational hydrological forecasts and for reanalysis purposes. This has been supported by the application of sequential techniques such as the Ensemble Kalman Filter, which require no additional features within the modeling process, i.e. they can use arbitrary black-box models. Alternatively, variational techniques rely on optimization algorithms to minimize a pre-defined objective function. This function describes the trade-off between the amount of noise introduced into the system and the mismatch between simulated and observed variables. While sequential techniques have been commonly applied to hydrological processes, variational techniques are seldom used. We believe this is mainly attributed to the required computation of first-order sensitivities by algorithmic differentiation techniques and related model enhancements, but also to a lack of comparison between both techniques. We contribute to filling this gap and present the results from the assimilation of streamflow data in two basins located in Germany and Canada. The assimilation introduces noise to precipitation and temperature to produce better initial estimates of an HBV model. The results are computed for a hindcast period and assessed using lead-time performance metrics. The study concludes with a discussion of the main features of each technique and their advantages/disadvantages in hydrological applications.
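
    A generic form of the objective function described here, written in our notation rather than the paper's, penalizes the forcing noise against the output mismatch:

$$
J(\eta_{1:T}) = \tfrac{1}{2}\sum_{t=1}^{T}\eta_t^{\top}B^{-1}\eta_t
\;+\; \tfrac{1}{2}\sum_{t=1}^{T}\big(y_t - H(x_t)\big)^{\top}R^{-1}\big(y_t - H(x_t)\big),
\qquad x_t = M(x_{t-1}, u_t + \eta_t),
$$

    where η_t is the noise added to the forcing u_t (here precipitation and temperature), B and R are noise and observation-error covariances, M is the hydrological model (here HBV) and H maps model states to the observed streamflow.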

  4. Sequential Monte Carlo with Highly Informative Observations

    OpenAIRE

    Del Moral, Pierre; Murray, Lawrence M.

    2014-01-01

    We propose sequential Monte Carlo (SMC) methods for sampling the posterior distribution of state-space models under highly informative observation regimes, a situation in which standard SMC methods can perform poorly. A special case is simulating bridges between given initial and final values. The basic idea is to introduce a schedule of intermediate weighting and resampling times between observation times, which guide particles towards the final state. This can always be done for continuous-...

  5. Multilevel sequential Monte-Carlo samplers

    KAUST Repository

    Jasra, Ajay

    2016-01-01

    Multilevel Monte-Carlo methods provide a powerful computational technique for reducing the cost of estimating expectations to a given accuracy. They are particularly relevant for computational problems where approximate distributions are determined via a resolution parameter h, with h=0 giving the theoretical exact distribution (e.g. SDEs or inverse problems with PDEs). The method provides a benefit by coupling samples from successive resolutions and estimating differences of successive expectations. We develop a methodology that brings Sequential Monte-Carlo (SMC) algorithms within the framework of the multilevel idea, as SMC provides a natural set-up for coupling samples over different resolutions. We prove that the new algorithm indeed preserves the benefits of the multilevel principle, even though samples at all resolutions are now correlated.

  6. Fast sequential Monte Carlo methods for counting and optimization

    CERN Document Server

    Rubinstein, Reuven Y; Vaisman, Radislav

    2013-01-01

    A comprehensive account of the theory and application of Monte Carlo methods. Based on years of research in efficient Monte Carlo methods for estimation of rare-event probabilities, counting problems, and combinatorial optimization, Fast Sequential Monte Carlo Methods for Counting and Optimization is a complete illustration of fast sequential Monte Carlo techniques. The book provides an accessible overview of current work in the field of Monte Carlo methods, specifically sequential Monte Carlo techniques, for solving abstract counting and optimization problems. Written by authorities in the

  7. Dual state-parameter updating scheme on a conceptual hydrologic model using sequential Monte Carlo filters

    Science.gov (United States)

    Noh, Seong Jin; Tachikawa, Yasuto; Shiiba, Michiharu; Kim, Sunmin

    Applications of data assimilation techniques have been widely used to improve the predictability of hydrologic modeling. Among various data assimilation techniques, sequential Monte Carlo (SMC) filters, known as "particle filters", provide the capability to handle non-linear and non-Gaussian state-space models. This paper proposes a dual state-parameter updating scheme (DUS) based on SMC methods to estimate both state and parameter variables of a hydrologic model. We introduce a kernel smoothing method for the robust estimation of uncertain model parameters in the DUS. The applicability of the dual updating scheme is illustrated using the implementation of the storage function model on a medium-sized Japanese catchment. We also compare performance results of the DUS combined with various SMC methods, such as sampling importance resampling (SIR), auxiliary SIR (ASIR) and the regularized particle filter (RPF).
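
    A common way to realize the kernel smoothing step mentioned here is West's shrinkage kernel (as used in the Liu-West filter); the sketch below is written under that assumption, since the paper's exact variant is not spelled out in the abstract.

```python
import numpy as np

def kernel_smooth(theta, weights, a=0.98, rng=None):
    """West's shrinkage-kernel move for parameter particles (illustrative).

    theta:   (n_particles, n_params) parameter particles
    weights: normalized importance weights
    a:       shrinkage factor; using h^2 = 1 - a^2 preserves the weighted mean
             and variance, avoiding the steady variance inflation of a plain
             random-walk jitter.
    """
    rng = rng or np.random.default_rng()
    mean = np.average(theta, axis=0, weights=weights)
    cov = np.atleast_2d(np.cov(theta.T, aweights=weights))
    shrunk = a * theta + (1.0 - a) * mean          # pull particles toward the mean
    noise = rng.multivariate_normal(np.zeros(theta.shape[1]),
                                    (1.0 - a**2) * cov, size=theta.shape[0])
    return shrunk + noise
```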

  8. Multivariate Error Covariance Estimates by Monte-Carlo Simulation for Assimilation Studies in the Pacific Ocean

    Science.gov (United States)

    Borovikov, Anna; Rienecker, Michele M.; Keppenne, Christian; Johnson, Gregory C.

    2004-01-01

    One of the most difficult aspects of ocean state estimation is the prescription of the model forecast error covariances. The paucity of ocean observations limits our ability to estimate the covariance structures from model-observation differences. In most practical applications, simple covariances are usually prescribed. Rarely are cross-covariances between different model variables used. Here a comparison is made between a univariate Optimal Interpolation (UOI) scheme and a multivariate OI algorithm (MvOI) in the assimilation of ocean temperature. In the UOI case only temperature is updated using a Gaussian covariance function and in the MvOI salinity, zonal and meridional velocities as well as temperature, are updated using an empirically estimated multivariate covariance matrix. Earlier studies have shown that a univariate OI has a detrimental effect on the salinity and velocity fields of the model. Apparently, in a sequential framework it is important to analyze temperature and salinity together. For the MvOI an estimation of the model error statistics is made by Monte-Carlo techniques from an ensemble of model integrations. An important advantage of using an ensemble of ocean states is that it provides a natural way to estimate cross-covariances between the fields of different physical variables constituting the model state vector, at the same time incorporating the model's dynamical and thermodynamical constraints as well as the effects of physical boundaries. Only temperature observations from the Tropical Atmosphere-Ocean array have been assimilated in this study. In order to investigate the efficacy of the multivariate scheme two data assimilation experiments are validated with a large independent set of recently published subsurface observations of salinity, zonal velocity and temperature. For reference, a third control run with no data assimilation is used to check how the data assimilation affects systematic model errors. While the performance of the
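
    The key ingredient of the multivariate scheme, the ensemble estimate of cross-covariances between fields, is the sample covariance of ensemble anomalies; a minimal sketch (the variable names are ours, not the study's code):

```python
import numpy as np

def ensemble_cross_cov(temp, salt):
    """Temperature-salinity cross-covariance from an ensemble of model states.

    temp, salt: (n_members, n_points) arrays from an ensemble of model
    integrations. The returned (n_points, n_points) matrix lets a temperature
    innovation update the salinity field in a dynamically consistent way.
    """
    t_anom = temp - temp.mean(axis=0)    # ensemble anomalies
    s_anom = salt - salt.mean(axis=0)
    return t_anom.T @ s_anom / (temp.shape[0] - 1)
```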

  9. Probabilistic learning of nonlinear dynamical systems using sequential Monte Carlo

    Science.gov (United States)

    Schön, Thomas B.; Svensson, Andreas; Murray, Lawrence; Lindsten, Fredrik

    2018-05-01

    Probabilistic modeling provides the capability to represent and manipulate uncertainty in data, models, predictions and decisions. We are concerned with the problem of learning probabilistic models of dynamical systems from measured data. Specifically, we consider learning of probabilistic nonlinear state-space models. There is no closed-form solution available for this problem, implying that we are forced to use approximations. In this tutorial we will provide a self-contained introduction to one of the state-of-the-art methods, the particle Metropolis-Hastings algorithm, which has proven to offer a practical approximation. This is a Monte Carlo based method, where the particle filter is used to guide a Markov chain Monte Carlo method through the parameter space. One of the key merits of the particle Metropolis-Hastings algorithm is that it is guaranteed to converge to the "true solution" under mild assumptions, despite being based on a particle filter with only a finite number of particles. We will also provide a motivating numerical example illustrating the method using a modeling language tailored for sequential Monte Carlo methods. The intention of modeling languages of this kind is to open up the power of sophisticated Monte Carlo methods, including particle Metropolis-Hastings, to a large group of users without requiring them to know all the underlying mathematical details.
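
    A minimal particle Metropolis-Hastings loop for a toy scalar state-space model is sketched below under stated assumptions (flat prior, random-walk proposal, unit noise variances); it is not the tutorial's own code.

```python
import numpy as np

def pf_loglik(theta, y, n_part, rng):
    # Bootstrap particle filter estimate of log p(y | theta) for the toy model
    # x_t = theta * x_{t-1} + v_t,  y_t = x_t + e_t,  v_t, e_t ~ N(0, 1).
    x = rng.standard_normal(n_part)
    ll = 0.0
    for yt in y:
        x = theta * x + rng.standard_normal(n_part)      # propagate particles
        logw = -0.5 * (yt - x) ** 2 - 0.5 * np.log(2 * np.pi)
        m = logw.max()
        ll += m + np.log(np.mean(np.exp(logw - m)))      # likelihood factor
        w = np.exp(logw - m)
        x = x[rng.choice(n_part, n_part, p=w / w.sum())] # resample
    return ll

def particle_mh(y, n_iter=2000, n_part=200, seed=0):
    # MCMC over theta driven by the particle filter's unbiased likelihood
    # estimate; keeping the old noisy estimate for the current theta makes
    # this a valid pseudo-marginal scheme. A flat prior is assumed.
    rng = np.random.default_rng(seed)
    theta, ll = 0.5, -np.inf
    chain = []
    for _ in range(n_iter):
        prop = theta + 0.1 * rng.standard_normal()       # random-walk proposal
        ll_prop = pf_loglik(prop, y, n_part, rng)
        if np.log(rng.uniform()) < ll_prop - ll:         # MH accept/reject
            theta, ll = prop, ll_prop
        chain.append(theta)
    return np.array(chain)
```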

  10. Sequential assimilation of multi-mission dynamical topography into a global finite-element ocean model

    Directory of Open Access Journals (Sweden)

    S. Skachko

    2008-12-01

    This study focuses on an accurate estimation of ocean circulation via assimilation of satellite measurements of ocean dynamical topography into the global finite-element ocean model (FEOM). The dynamical topography data are derived from a complex analysis of multi-mission altimetry data combined with a reference Earth geoid. The assimilation is split into two parts. First, the mean dynamic topography is adjusted. To this end an adiabatic pressure correction method is used which reduces model divergence from the real evolution. Second, a sequential assimilation technique is applied to improve the representation of thermodynamical processes by assimilating the time-varying dynamic topography. A method is used according to which the temperature and salinity are updated following the vertical structure of the first baroclinic mode. It is shown that the method leads to a partially successful assimilation approach, reducing the rms difference between the model and data from 16 cm to 2 cm. This improvement of the mean state is accompanied by significant improvement of temporal variability in our analysis. However, it remains suboptimal, showing a tendency in the forecast phase of returning toward a free run without data assimilation. Both the mean difference and standard deviation of the difference between the forecast and observation data are reduced as a result of assimilation.

  11. Assessing sequential data assimilation techniques for integrating GRACE data into a hydrological model

    KAUST Repository

    Khaki, M.

    2017-07-06

    The time-variable terrestrial water storage (TWS) products from the Gravity Recovery And Climate Experiment (GRACE) have been increasingly used in recent years to improve the simulation of hydrological models by applying data assimilation techniques. In this study, for the first time, we assess the performance of the most popular sequential data assimilation techniques for integrating GRACE TWS into the World-Wide Water Resources Assessment (W3RA) model. We implement and test stochastic and deterministic ensemble-based Kalman filters (EnKF), as well as particle filters (PF) using two different resampling approaches, multinomial resampling and systematic resampling. These choices provide various opportunities for weighting observations and model simulations during the assimilation and also for accounting for error distributions. In particular, the deterministic EnKF is tested to avoid perturbing observations before assimilation (as is the case in an ordinary EnKF). Gaussian-based random updates in the EnKF approaches likely do not fully represent the statistical properties of the model simulations and TWS observations. Therefore, the fully non-Gaussian PF is also applied to estimate more realistic updates. Monthly GRACE TWS are assimilated into W3RA covering the entire Australian continent. To evaluate the filters' performance and analyze their impact on model simulations, their estimates are validated against independent in-situ measurements. Our results indicate that all implemented filters improve the estimation of water storage simulations of W3RA. The best results are obtained using two versions of the deterministic EnKF, the Square Root Analysis (SQRA) scheme and the Ensemble Square Root Filter (EnSRF), which improve the model groundwater estimation errors by 34% and 31%, respectively, compared to a model run without assimilation. Applying the PF along with systematic resampling successfully decreases the model estimation error by 23%.
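
    The two resampling schemes compared in the study differ only in how ancestor indices are drawn from the particle weights; minimal versions of both are sketched below (our implementations, not the paper's code).

```python
import numpy as np

def multinomial_resample(weights, rng):
    # i.i.d. draws from the categorical distribution defined by the weights
    return rng.choice(len(weights), size=len(weights), p=weights)

def systematic_resample(weights, rng):
    # One shared uniform offset on a regular grid of positions: the same
    # expectation as multinomial resampling but noticeably lower variance.
    n = len(weights)
    positions = (rng.uniform() + np.arange(n)) / n
    return np.searchsorted(np.cumsum(weights), positions)
```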

  12. Parameter sampling capabilities of sequential and simultaneous data assimilation: I. Analytical comparison

    International Nuclear Information System (INIS)

    Fossum, Kristian; Mannseth, Trond

    2014-01-01

    We assess the parameter sampling capabilities of some Bayesian, ensemble-based, joint state-parameter (JS) estimation methods. The forward model is assumed to be non-chaotic and have nonlinear components, and the emphasis is on results obtained for the parameters in the state-parameter vector. A variety of approximate sampling methods exist, and a number of numerical comparisons between such methods have been performed. Often, more than one of the defining characteristics varies from one method to another, so it can be difficult to point out which characteristic of the more successful method in such a comparison was decisive. In this study, we single out one defining characteristic for comparison: whether data are assimilated sequentially or simultaneously. The current paper is concerned with analytical investigations into this issue. We carefully select one sequential and one simultaneous JS method for the comparison. We also design a corresponding pair of pure parameter estimation methods, and we show how the JS methods and the parameter estimation methods are pairwise related. It is shown that the sequential and the simultaneous parameter estimation methods are equivalent for one particular combination of observations with different degrees of nonlinearity. Strong indications are presented for why one may expect the sequential parameter estimation method to outperform the simultaneous parameter estimation method for all other combinations of observations. Finally, the conditions for when similar relations can be expected to hold between the corresponding JS methods are discussed. A companion paper, part II (Fossum and Mannseth 2014 Inverse Problems 30 114003), is concerned with statistical analysis of results from a range of numerical experiments involving sequential and simultaneous JS estimation, where the design of the numerical investigation is motivated by our findings in the current paper. (paper)

  13. Data assimilation using a GPU accelerated path integral Monte Carlo approach

    Science.gov (United States)

    Quinn, John C.; Abarbanel, Henry D. I.

    2011-09-01

    The answers to data assimilation questions can be expressed as path integrals over all possible state and parameter histories. We show how these path integrals can be evaluated numerically using a Markov chain Monte Carlo method designed to run in parallel on a graphics processing unit (GPU). We demonstrate the method on an example that takes the transmembrane voltage time series of a simulated neuron as input and uses a Hodgkin-Huxley neuron model. By taking advantage of GPU computing, we gain a parallel speedup factor of up to about 300 compared to an equivalent serial computation on a CPU, with performance increasing as the length of the observation time used for data assimilation increases.

  14. Optimization of sequential decisions by least squares Monte Carlo method

    DEFF Research Database (Denmark)

    Nishijima, Kazuyoshi; Anders, Annett

    change adaptation measures, and evacuation of people and assets in the face of an emerging natural hazard event. Focusing on the last example, an efficient solution scheme is proposed by Anders and Nishijima (2011). The proposed solution scheme is based on the least squares Monte Carlo method, which ... is proposed by Longstaff and Schwartz (2001) for pricing of American options. The present paper formulates the decision problem in a more general manner and explains how the solution scheme proposed by Anders and Nishijima (2011) is implemented for the optimization of the formulated decision problem
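
    The least squares Monte Carlo idea of Longstaff and Schwartz (2001) referenced here replaces the intractable expected value-to-go with a cross-sectional regression over simulated paths; a compact sketch for a generic optimal stopping problem follows (the polynomial basis, discount factor and payoff are our assumptions).

```python
import numpy as np

def lsmc_value(paths, payoff, discount=0.95, degree=2):
    """Least squares Monte Carlo for optimal stopping (Longstaff-Schwartz style).

    paths:  (n_paths, n_steps) simulated state trajectories
    payoff: function mapping state values to the immediate reward of stopping
    """
    value = payoff(paths[:, -1])                    # must stop at the horizon
    for t in range(paths.shape[1] - 2, -1, -1):
        x = paths[:, t]
        cont = discount * value                     # realized continuation values
        coef = np.polyfit(x, cont, degree)          # regress value-to-go on state
        stop_now = payoff(x) > np.polyval(coef, x)  # compare against the estimate
        value = np.where(stop_now, payoff(x), cont)
    return value.mean()                             # Monte Carlo value estimate
```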

  15. Parameter sampling capabilities of sequential and simultaneous data assimilation: II. Statistical analysis of numerical results

    International Nuclear Information System (INIS)

    Fossum, Kristian; Mannseth, Trond

    2014-01-01

    We assess and compare parameter sampling capabilities of one sequential and one simultaneous Bayesian, ensemble-based, joint state-parameter (JS) estimation method. In the companion paper, part I (Fossum and Mannseth 2014 Inverse Problems 30 114002), analytical investigations lead us to propose three claims, essentially stating that the sequential method can be expected to outperform the simultaneous method for weakly nonlinear forward models. Here, we assess the reliability and robustness of these claims through statistical analysis of results from a range of numerical experiments. Samples generated by the two approximate JS methods are compared to samples from the posterior distribution generated by a Markov chain Monte Carlo method, using four approximate measures of distance between probability distributions. Forward-model nonlinearity is assessed from a stochastic nonlinearity measure allowing for sufficiently large model dimensions. Both toy models (with low computational complexity, and where the nonlinearity is fairly easy to control) and two-phase porous-media flow models (corresponding to down-scaled versions of problems to which the JS methods have been frequently applied recently) are considered in the numerical experiments. Results from the statistical analysis show strong support of all three claims stated in part I. (paper)

  16. Sequential assimilation of volcanic monitoring data to quantify eruption potential: Application to Kerinci volcano

    Science.gov (United States)

    Zhan, Yan; Gregg, Patricia M.; Chaussard, Estelle; Aoki, Yosuke

    2017-12-01

    Quantifying the eruption potential of a restless volcano requires the ability to model parameters such as overpressure and calculate the host rock stress state as the system evolves. A critical challenge is developing a model-data fusion framework to take advantage of observational data and provide updates of the volcanic system through time. The Ensemble Kalman Filter (EnKF) uses a Monte Carlo approach to assimilate volcanic monitoring data and update models of volcanic unrest, providing time-varying estimates of overpressure and stress. Although the EnKF has been proven effective for forecasting volcanic deformation using synthetic InSAR and GPS data, until now, it has not been applied to assimilate data from an active volcanic system. In this investigation, the EnKF is used to provide a "hindcast" of the 2009 explosive eruption of Kerinci volcano, Indonesia. A two-source analytical model is used to simulate the surface deformation of Kerinci volcano observed by InSAR time-series data and to predict the system evolution. A deep, deflating dike-like source reproduces the subsiding signal on the flanks of the volcano, and a shallow spherical McTigue source reproduces the central uplift. EnKF-predicted parameters are used in finite element models to calculate the host-rock stress state prior to the 2009 eruption. Mohr-Coulomb failure models reveal that the host rock around the shallow magma reservoir is trending towards tensile failure prior to 2009, which may be the catalyst for the 2009 eruption. Our results illustrate that the EnKF shows significant promise for future applications to forecasting the eruption potential of restless volcanoes and hindcasting the triggering mechanisms of observed eruptions.

  17. Sequential Assimilation of Volcanic Monitoring Data to Quantify Eruption Potential: Application to Kerinci Volcano, Sumatra

    Directory of Open Access Journals (Sweden)

    Yan Zhan

    2017-12-01

    Quantifying the eruption potential of a restless volcano requires the ability to model parameters such as overpressure and calculate the host rock stress state as the system evolves. A critical challenge is developing a model-data fusion framework to take advantage of observational data and provide updates of the volcanic system through time. The Ensemble Kalman Filter (EnKF) uses a Monte Carlo approach to assimilate volcanic monitoring data and update models of volcanic unrest, providing time-varying estimates of overpressure and stress. Although the EnKF has been proven effective to forecast volcanic deformation using synthetic InSAR and GPS data, until now, it has not been applied to assimilate data from an active volcanic system. In this investigation, the EnKF is used to provide a "hindcast" of the 2009 explosive eruption of Kerinci volcano, Indonesia. A two-source analytical model is used to simulate the surface deformation of Kerinci volcano observed by InSAR time-series data and to predict the system evolution. A deep, deflating dike-like source reproduces the subsiding signal on the flanks of the volcano, and a shallow spherical McTigue source reproduces the central uplift. EnKF-predicted parameters are used in finite element models to calculate the host-rock stress state prior to the 2009 eruption. Mohr-Coulomb failure models reveal that the host rock around the shallow magma reservoir is trending toward tensile failure prior to 2009, which may be the catalyst for the 2009 eruption. Our results illustrate that the EnKF shows significant promise for future applications to forecasting the eruption potential of restless volcanoes and hindcasting the triggering mechanisms of observed eruptions.

  18. Monte Carlo Tree Search for Continuous and Stochastic Sequential Decision Making Problems

    International Nuclear Information System (INIS)

    Couetoux, Adrien

    2013-01-01

    In this thesis, I studied sequential decision making problems, with a focus on the unit commitment problem. Traditionally solved by dynamic programming methods, this problem is still a challenge, due to its high dimension and to the sacrifices made on the accuracy of the model to apply state-of-the-art methods. I investigated the applicability of Monte Carlo Tree Search methods for this problem, and for other single-player, stochastic, continuous sequential decision making problems. In doing so, I obtained a consistent and anytime algorithm that can easily be combined with existing strong heuristic solvers. (author)

  1. Simulation based sequential Monte Carlo methods for discretely observed Markov processes

    OpenAIRE

    Neal, Peter

    2014-01-01

    Parameter estimation for discretely observed Markov processes is a challenging problem. However, simulation of Markov processes is straightforward using the Gillespie algorithm. We exploit this ease of simulation to develop an effective sequential Monte Carlo (SMC) algorithm for obtaining samples from the posterior distribution of the parameters. In particular, we introduce two key innovations, coupled simulations, which allow us to study multiple parameter values on the basis of a single sim...
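
    The ease of simulation exploited here comes from the Gillespie algorithm, which draws exact trajectories of a continuous-time Markov jump process; a minimal version for an SIR-type epidemic is sketched below (the SIR reactions are our illustrative choice, not necessarily the paper's example).

```python
import numpy as np

def gillespie_sir(beta, gamma, s0, i0, t_max, rng=None):
    # Exact stochastic simulation of S -> I (rate beta*S*I) and I -> R (rate gamma*I)
    rng = rng or np.random.default_rng(0)
    t, s, i = 0.0, s0, i0
    times, infected = [t], [i]
    while t < t_max and i > 0:
        rate_inf, rate_rec = beta * s * i, gamma * i
        total = rate_inf + rate_rec
        t += rng.exponential(1.0 / total)        # waiting time to the next event
        if rng.uniform() < rate_inf / total:
            s, i = s - 1, i + 1                  # infection event
        else:
            i -= 1                               # recovery event
        times.append(t)
        infected.append(i)
    return np.array(times), np.array(infected)
```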

  2. A computationally efficient Bayesian sequential simulation approach for the assimilation of vast and diverse hydrogeophysical datasets

    Science.gov (United States)

    Nussbaumer, Raphaël; Gloaguen, Erwan; Mariéthoz, Grégoire; Holliger, Klaus

    2016-04-01

    Bayesian sequential simulation (BSS) is a powerful geostatistical technique, which notably has shown significant potential for the assimilation of datasets that are diverse with regard to the spatial resolution and their relationship. However, these types of applications of BSS require a large number of realizations to adequately explore the solution space and to assess the corresponding uncertainties. Moreover, such simulations generally need to be performed on very fine grids in order to adequately exploit the technique's potential for characterizing heterogeneous environments. Correspondingly, the computational cost of BSS algorithms in their classical form is very high, which so far has limited an effective application of this method to large models and/or vast datasets. In this context, it is also important to note that the inherent assumption regarding the independence of the considered datasets is generally regarded as being too strong in the context of sequential simulation. To alleviate these problems, we have revisited the classical implementation of BSS and incorporated two key features to increase the computational efficiency. The first feature is a combined quadrant spiral - superblock search, which targets run-time savings on large grids and adds flexibility with regard to the selection of neighboring points using equal directional sampling and treating hard data and previously simulated points separately. The second feature is a constant path of simulation, which enhances the efficiency for multiple realizations. We have also modified the aggregation operator to be more flexible with regard to the assumption of independence of the considered datasets. This is achieved through log-linear pooling, which essentially allows for attributing weights to the various data components. Finally, a multi-grid simulating path was created to enforce large-scale variance and to allow for adapting parameters, such as, for example, the log-linear weights or the type

  3. Learning of state-space models with highly informative observations: A tempered sequential Monte Carlo solution

    Science.gov (United States)

    Svensson, Andreas; Schön, Thomas B.; Lindsten, Fredrik

    2018-05-01

    Probabilistic (or Bayesian) modeling and learning offer interesting possibilities for systematic representation of uncertainty using probability theory. However, probabilistic learning often leads to computationally challenging problems. Some problems of this type that were previously intractable can now be solved on standard personal computers thanks to recent advances in Monte Carlo methods. In particular, for learning of unknown parameters in nonlinear state-space models, methods based on the particle filter (a Monte Carlo method) have proven very useful. A notoriously challenging problem, however, still occurs when the observations in the state-space model are highly informative, i.e. when there is very little or no measurement noise present, relative to the amount of process noise. The particle filter will then struggle in estimating one of the basic components for probabilistic learning, namely the likelihood p(data | parameters). To this end we suggest an algorithm which initially assumes that there is a substantial amount of artificial measurement noise present. The variance of this noise is sequentially decreased in an adaptive fashion such that we, in the end, recover the original problem or possibly a very close approximation of it. The main component in our algorithm is a sequential Monte Carlo (SMC) sampler, which gives our proposed method a clear resemblance to the SMC2 method. Another natural link is also made to the ideas underlying approximate Bayesian computation (ABC). We illustrate the method with numerical examples, and in particular show promising results for a challenging Wiener-Hammerstein benchmark problem.

  4. A new moving strategy for the sequential Monte Carlo approach in optimizing the hydrological model parameters

    Science.gov (United States)

    Zhu, Gaofeng; Li, Xin; Ma, Jinzhu; Wang, Yunquan; Liu, Shaomin; Huang, Chunlin; Zhang, Kun; Hu, Xiaoli

    2018-04-01

    Sequential Monte Carlo (SMC) samplers have become increasingly popular for estimating the posterior parameter distribution with the non-linear dependency structures and multiple modes often present in hydrological models. However, the explorative capabilities and efficiency of the sampler depend strongly on the efficiency of the move step of the SMC sampler. In this paper we present a new SMC sampler, the Particle Evolution Metropolis Sequential Monte Carlo (PEM-SMC) algorithm, which is well suited to handle unknown static parameters of hydrologic models. The PEM-SMC sampler is inspired by the works of Liang and Wong (2001) and operates by incorporating the strengths of the genetic algorithm, the differential evolution algorithm and the Metropolis-Hastings algorithm into the framework of SMC. We also prove that the sampler admits the target distribution as a stationary distribution. Two case studies, a multi-dimensional bimodal normal distribution and a conceptual rainfall-runoff hydrologic model, first considering only parameter uncertainty and then considering parameter and input uncertainty simultaneously, show that the PEM-SMC sampler is generally superior to other popular SMC algorithms in handling high-dimensional problems. The study also indicates that it may be important to account for model structural uncertainty by using multiple different hydrological models in the SMC framework in future studies.

  5. Development of the Nonstationary Incremental Analysis Update Algorithm for Sequential Data Assimilation System

    Directory of Open Access Journals (Sweden)

    Yoo-Geun Ham

    2016-01-01

    This study introduces a modified version of the incremental analysis updates (IAU), called the nonstationary IAU (NIAU) method, to improve the assimilation accuracy of the IAU while keeping the continuity of the analysis. Similar to the IAU, the NIAU is designed to add analysis increments at every model time step to improve the continuity in intermittent data assimilation. However, unlike the IAU, the NIAU procedure uses time-evolved forcing, obtained with the forward operator, as corrections to the model. In terms of the accuracy of the analysis field, the solution of the NIAU is superior to that of the forward IAU, in which the analysis is performed at the beginning of the time window over which the IAU forcing is added. This is because, in linear systems, the NIAU solution at the end of the assimilation interval equals that of an intermittent data assimilation method. To retain the filtering property in the NIAU, the forward operator used to propagate the increment is reconstructed with only the dominant singular vectors. An illustration of these advantages of the NIAU is given using the simple 40-variable Lorenz model.
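
    For orientation, a standard discrete form of the IAU (our notation, consistent with the description above) spreads the analysis increment Δx_a uniformly over the N steps of the assimilation window:

$$
x_{k+1} = \mathcal{M}_k(x_k) + \frac{\Delta x_a}{N}, \qquad k = 0, \dots, N-1,
$$

    whereas the NIAU replaces the constant forcing with a time-evolved one, propagating the increment to step k with (a reduced-rank, dominant-singular-vector approximation of) the forward operator.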

  6. astroABC: An Approximate Bayesian Computation Sequential Monte Carlo sampler for cosmological parameter estimation

    Energy Technology Data Exchange (ETDEWEB)

    Jennings, E.; Madigan, M.

    2017-04-01

    Given the complexity of modern cosmological parameter inference where we are faced with non-Gaussian data and noise, correlated systematics and multi-probe correlated data sets, the Approximate Bayesian Computation (ABC) method is a promising alternative to traditional Markov Chain Monte Carlo approaches in the case where the Likelihood is intractable or unknown. The ABC method is called "Likelihood free" as it avoids explicit evaluation of the Likelihood by using a forward model simulation of the data which can include systematics. We introduce astroABC, an open source ABC Sequential Monte Carlo (SMC) sampler for parameter estimation. A key challenge in astrophysics is the efficient use of large multi-probe datasets to constrain high dimensional, possibly correlated parameter spaces. With this in mind astroABC allows for massive parallelization using MPI, a framework that handles spawning of jobs across multiple nodes. A key new feature of astroABC is the ability to create MPI groups with different communicators, one for the sampler and several others for the forward model simulation, which speeds up sampling time considerably. For smaller jobs the Python multiprocessing option is also available. Other key features include: a Sequential Monte Carlo sampler, a method for iteratively adapting tolerance levels, local covariance estimate using scikit-learn's KDTree, modules for specifying optimal covariance matrix for a component-wise or multivariate normal perturbation kernel, output and restart files are backed up every iteration, user defined metric and simulation methods, a module for specifying heterogeneous parameter priors including non-standard prior PDFs, a module for specifying a constant, linear, log or exponential tolerance level, well-documented examples and sample scripts. This code is hosted online at https://github.com/EliseJ/astroABC
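
    One iteration of a generic ABC-SMC sampler of the kind astroABC implements, heavily simplified (the function names, the Gaussian perturbation kernel and the uniform weight update are our assumptions, not the astroABC API):

```python
import numpy as np

def abc_smc_step(particles, weights, tol, simulate, distance, y_obs,
                 kernel_std, rng):
    # Move a weighted particle population to the next (smaller) tolerance:
    # resample an ancestor, perturb it, and keep the proposal only if its
    # simulated data set lies within `tol` of the observations.
    n = len(particles)
    new = np.empty(n)
    for j in range(n):
        while True:
            theta = rng.choice(particles, p=weights)            # sample ancestor
            theta = theta + kernel_std * rng.standard_normal()  # perturbation kernel
            if distance(simulate(theta, rng), y_obs) <= tol:
                new[j] = theta
                break
    # A full sampler reweights by prior/kernel ratios; uniform here for brevity.
    return new, np.full(n, 1.0 / n)
```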

  7. A sequential Monte Carlo model of the combined GB gas and electricity network

    International Nuclear Information System (INIS)

    Chaudry, Modassar; Wu, Jianzhong; Jenkins, Nick

    2013-01-01

    A Monte Carlo model of the combined GB gas and electricity network was developed to determine the reliability of the energy infrastructure. The model integrates the gas and electricity network into a single sequential Monte Carlo simulation. The model minimises the combined costs of the gas and electricity network; these include gas supplies, gas storage operation and electricity generation. The Monte Carlo model calculates reliability indices such as loss of load probability and expected energy unserved for the combined gas and electricity network. The intention of this tool is to facilitate reliability analysis of integrated energy systems. Applications of this tool are demonstrated through a case study that quantifies the impact on the reliability of the GB gas and electricity network given uncertainties such as wind variability, gas supply availability and outages to energy infrastructure assets. Analysis is performed over a typical midwinter week on a hypothesised GB gas and electricity network in 2020 that meets European renewable energy targets. The efficacy of doubling GB gas storage capacity on the reliability of the energy system is assessed. The results highlight the value of greater gas storage facilities in enhancing the reliability of the GB energy system given various energy uncertainties. -- Highlights: •A Monte Carlo model of the combined GB gas and electricity network was developed. •Reliability indices are calculated for the combined GB gas and electricity system. •The efficacy of doubling GB gas storage capacity on reliability of the energy system is assessed. •Integrated reliability indices could be used to assess the impact of investment in energy assets
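
    The reliability indices named here are straightforward to extract from sequential Monte Carlo output; a minimal sketch (the array layout and names are ours):

```python
import numpy as np

def reliability_indices(shortfall_mw, step_hours=1.0):
    """Loss-of-load probability (LOLP) and expected energy unserved (EEU)
    from sequential Monte Carlo samples of supply shortfall.

    shortfall_mw: (n_sampled_years, n_time_steps) array of unmet demand in MW,
    zero whenever supply covers demand (illustrative layout).
    """
    loss_of_load = shortfall_mw > 0.0
    lolp = loss_of_load.mean()                            # fraction of steps short
    eeu = (shortfall_mw * step_hours).sum(axis=1).mean()  # MWh unserved per year
    return lolp, eeu
```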

  8. Particle rejuvenation of Rao-Blackwellized sequential Monte Carlo smoothers for conditionally linear and Gaussian models

    Science.gov (United States)

    Nguyen, Ngoc Minh; Corff, Sylvain Le; Moulines, Éric

    2017-12-01

    This paper focuses on sequential Monte Carlo approximations of smoothing distributions in conditionally linear and Gaussian state spaces. To reduce the Monte Carlo variance of smoothers, it is typical in these models to use Rao-Blackwellization: a particle approximation is used to sample sequences of hidden regimes while the Gaussian states are explicitly integrated conditional on the sequence of regimes and observations, using variants of the Kalman filter/smoother. The first successful attempt to use Rao-Blackwellization for smoothing extends the Bryson-Frazier smoother for Gaussian linear state space models using the generalized two-filter formula together with Kalman filters/smoothers. More recently, a forward-backward decomposition of smoothing distributions mimicking the Rauch-Tung-Striebel smoother for the regimes, combined with backward Kalman updates, has been introduced. This paper investigates the benefit of introducing additional rejuvenation steps in all these algorithms to sample, at each time instant, new regimes conditional on the forward and backward particles. This defines particle-based approximations of the smoothing distributions whose support is not restricted to the set of particles sampled in the forward or backward filter. These procedures are applied to commodity markets, which are described using a two-factor model based on the spot price and a convenience yield for crude oil data.

  9. Estimation of parameters and basic reproduction ratio for Japanese encephalitis transmission in the Philippines using sequential Monte Carlo filter

    Science.gov (United States)

    We developed a sequential Monte Carlo filter to estimate the states and the parameters in a stochastic model of Japanese Encephalitis (JE) spread in the Philippines. This method is particularly important for its adaptability to the availability of new incidence data. This method can also capture the...

  10. Parameter optimisation for a better representation of drought by LSMs: inverse modelling vs. sequential data assimilation

    Science.gov (United States)

    Dewaele, Hélène; Munier, Simon; Albergel, Clément; Planque, Carole; Laanaia, Nabil; Carrer, Dominique; Calvet, Jean-Christophe

    2017-09-01

    Soil maximum available water content (MaxAWC) is a key parameter in land surface models (LSMs). However, being difficult to measure, this parameter is usually uncertain. This study assesses the feasibility of using a 15-year (1999-2013) time series of satellite-derived low-resolution observations of leaf area index (LAI) to estimate MaxAWC for rainfed croplands over France. LAI interannual variability is simulated using the CO2-responsive version of the Interactions between Soil, Biosphere and Atmosphere (ISBA) LSM for various values of MaxAWC. The optimal value is then selected using (1) a simple inverse modelling technique, comparing simulated and observed LAI, and (2) a more complex method consisting of integrating observed LAI in ISBA through a land data assimilation system (LDAS) and minimising LAI analysis increments. The evaluation of the MaxAWC estimates from both methods is done using simulated annual maximum above-ground biomass (Bag) and straw cereal grain yield (GY) values from the Agreste French agricultural statistics portal, for 45 administrative units presenting a high proportion of straw cereals. Significant correlations (p value < 0.01) between simulated and observed Bag and GY are found for up to 36 and 53 % of the administrative units for the inverse modelling and LDAS tuning methods, respectively. It is found that the LDAS tuning experiment gives more realistic values of MaxAWC and maximum Bag than the inverse modelling experiment. Using undisaggregated LAI observations leads to an underestimation of MaxAWC and maximum Bag in both experiments. Median annual maximum values of disaggregated LAI observations are found to correlate very well with MaxAWC.

  11. Monte Carlo simulation of the sequential probability ratio test for radiation monitoring

    International Nuclear Information System (INIS)

    Coop, K.L.

    1984-01-01

    A computer program simulates the Sequential Probability Ratio Test (SPRT) using Monte Carlo techniques. The program, SEQTEST, performs random-number sampling of either a Poisson or normal distribution to simulate radiation monitoring data. The results are in terms of the detection probabilities and the average time required for a trial. The computed SPRT results can be compared with tabulated single interval test (SIT) values to determine the better statistical test for particular monitoring applications. Use of the SPRT in a hand-and-foot alpha monitor shows that the SPRT provides better detection probabilities while generally requiring less counting time. Calculations are also performed for a monitor where the SPRT is not permitted to take longer than the single interval test. Although the performance of the SPRT is degraded by this restriction, the detection probabilities are still similar to the SIT values, and average counting times are always less than 75% of the SIT time. Some optimal conditions for use of the SPRT are described. The SPRT should be the test of choice in many radiation monitoring situations. 6 references, 8 figures, 1 table
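
    For orientation, a minimal Poisson SPRT of the kind SEQTEST simulates is sketched below; the decision boundaries follow Wald's classical approximations, and the rates and error levels are illustrative assumptions, not values from the paper.

```python
import numpy as np

def sprt_poisson(counts, bg_rate, src_rate, alpha=0.05, beta=0.05):
    """Sequential probability ratio test on Poisson count data, deciding
    between background-only (rate bg_rate per interval) and background-plus-
    source (rate src_rate per interval) hypotheses."""
    lower = np.log(beta / (1 - alpha))       # accept H0 (clean) boundary
    upper = np.log((1 - beta) / alpha)       # accept H1 (contaminated) boundary
    llr = 0.0
    for k, c in enumerate(counts, start=1):
        # Log-likelihood ratio increment for one Poisson counting interval
        llr += c * np.log(src_rate / bg_rate) - (src_rate - bg_rate)
        if llr <= lower:
            return "clean", k                # decision after k intervals
        if llr >= upper:
            return "contaminated", k
    return "undecided", len(counts)
```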

  12. Application of Multi-Hypothesis Sequential Monte Carlo for Breakup Analysis

    Science.gov (United States)

    Faber, W. R.; Zaidi, W.; Hussein, I. I.; Roscoe, C. W. T.; Wilkins, M. P.; Schumacher, P. W., Jr.

    As more objects are launched into space, the potential for breakup events and space object collisions is ever increasing. These events create large clouds of debris that are extremely hazardous to space operations. Providing timely, accurate, and statistically meaningful Space Situational Awareness (SSA) data is crucial in order to protect assets and operations in space. The space object tracking problem, in general, is nonlinear in both state dynamics and observations, making it ill-suited to linear filtering techniques such as the Kalman filter. Additionally, given the multi-object, multi-scenario nature of the problem, space situational awareness requires multi-hypothesis tracking and management that is combinatorially challenging in nature. In practice, it is often seen that assumptions of underlying linearity and/or Gaussianity are used to provide tractable solutions to the multiple space object tracking problem. However, these assumptions are, at times, detrimental to tracking data and provide statistically inconsistent solutions. This paper details a tractable solution to the multiple space object tracking problem applicable to space object breakup events. Within this solution, simplifying assumptions of the underlying probability density function are relaxed and heuristic methods for hypothesis management are avoided. This is done by implementing Sequential Monte Carlo (SMC) methods for both nonlinear filtering and hypothesis management. The goal of this paper is to detail the solution and use it as a platform to discuss computational limitations that hinder proper analysis of large breakup events.

  13. Sequential Markov chain Monte Carlo filter with simultaneous model selection for electrocardiogram signal modeling.

    Science.gov (United States)

    Edla, Shwetha; Kovvali, Narayan; Papandreou-Suppappola, Antonia

    2012-01-01

    Constructing statistical models of electrocardiogram (ECG) signals, whose parameters can be used for automated disease classification, is of great importance in precluding manual annotation and providing prompt diagnosis of cardiac diseases. ECG signals consist of several segments with different morphologies (namely the P wave, QRS complex and the T wave) in a single heart beat, which can vary across individuals and diseases. Also, existing statistical ECG models exhibit a reliance upon obtaining a priori information from the ECG data by using preprocessing algorithms to initialize the filter parameters, or to define the user-specified model parameters. In this paper, we propose an ECG modeling technique using the sequential Markov chain Monte Carlo (SMCMC) filter that can perform simultaneous model selection, by adaptively choosing from different representations depending upon the nature of the data. Our results demonstrate the ability of the algorithm to track various types of ECG morphologies, including intermittently occurring ECG beats. In addition, we use the estimated model parameters as the feature set to classify between ECG signals with normal sinus rhythm and four different types of arrhythmia.

  14. Improving Robustness of Hydrologic Ensemble Predictions Through Probabilistic Pre- and Post-Processing in Sequential Data Assimilation

    Science.gov (United States)

    Wang, S.; Ancell, B. C.; Huang, G. H.; Baetz, B. W.

    2018-03-01

    Data assimilation using the ensemble Kalman filter (EnKF) has been increasingly recognized as a promising tool for probabilistic hydrologic predictions. However, little effort has been made to conduct the pre- and post-processing of assimilation experiments, posing a significant challenge in achieving the best performance of hydrologic predictions. This paper presents a unified data assimilation framework for improving the robustness of hydrologic ensemble predictions. Statistical pre-processing of assimilation experiments is conducted through the factorial design and analysis to identify the best EnKF settings with maximized performance. After the data assimilation operation, statistical post-processing analysis is also performed through the factorial polynomial chaos expansion to efficiently address uncertainties in hydrologic predictions, as well as to explicitly reveal potential interactions among model parameters and their contributions to the predictive accuracy. In addition, the Gaussian anamorphosis is used to establish a seamless bridge between data assimilation and uncertainty quantification of hydrologic predictions. Both synthetic and real data assimilation experiments are carried out to demonstrate feasibility and applicability of the proposed methodology in the Guadalupe River basin, Texas. Results suggest that statistical pre- and post-processing of data assimilation experiments provide meaningful insights into the dynamic behavior of hydrologic systems and enhance robustness of hydrologic ensemble predictions.

  15. Sequential Monte Carlo simulation of collision risk in free flight air traffic

    NARCIS (Netherlands)

    Blom, H.A.P.; Bakker, G.; Krystul, J.; Everdij, M.H.C.; Klein Obbink, B.; Klompstra, M.B.

    2005-01-01

    Within HYBRIDGE a novel approach in speeding up Monte Carlo simulation of rare events has been developed. In the current report this method is extended for application to simulating collisions with a stochastic dynamical model of an air traffic operational concept. Subsequently this extended Monte

  16. Enhancing hydrologic data assimilation by evolutionary Particle Filter and Markov Chain Monte Carlo

    Science.gov (United States)

    Abbaszadeh, Peyman; Moradkhani, Hamid; Yan, Hongxiang

    2018-01-01

    Particle Filters (PFs) have received increasing attention from researchers in different disciplines, including the hydro-geosciences, as an effective tool to improve model predictions in nonlinear and non-Gaussian dynamical systems. The application of dual state and parameter estimation using PFs in hydrology has evolved since 2005 from the PF-SIR (sampling importance resampling) to PF-MCMC (Markov chain Monte Carlo), and now to the most effective and robust framework through an evolutionary PF approach based on the Genetic Algorithm (GA) and MCMC, the so-called EPFM. In this framework, the prior distribution undergoes an evolutionary process based on the designed mutation and crossover operators of the GA. The merit of this approach is that the particles move to an appropriate position by using the GA optimization and then the number of effective particles is increased by means of MCMC, whereby particle degeneracy is avoided and particle diversity is improved. In this study, the usefulness and effectiveness of the proposed EPFM is investigated by applying the technique to a conceptual and highly nonlinear hydrologic model over four river basins located in different climate and geographical regions of the United States. Both synthetic and real case studies demonstrate that the EPFM improves both the state and parameter estimation more effectively and reliably than the PF-MCMC.

  17. Registration of 3D FMT and CT Images of Mouse via Affine Transformation using Sequential Monte Carlo

    International Nuclear Information System (INIS)

    Xia Zheng; Zhou Xiaobo; Wong, Stephen T. C.; Sun Youxian

    2007-01-01

    It is difficult to directly co-register the 3D FMT (Fluorescence Molecular Tomography) image of a small tumor in a mouse, whose maximal diameter is only a few mm, with a larger CT image of the entire animal that spans about ten cm. This paper proposes a new method that first registers 2D flat and 3D CT images to facilitate the registration between small 3D FMT images and large CT images. A novel algorithm based on SMC (Sequential Monte Carlo) incorporating a least squares operation for the registration between the 2D flat and 3D CT images is introduced and validated with simulated images and real images of mice. The visualization of the preliminary alignment of the 3D FMT and CT images through 2D registration shows promising results.

  18. Exploiting neurovascular coupling: a Bayesian sequential Monte Carlo approach applied to simulated EEG fNIRS data

    Science.gov (United States)

    Croce, Pierpaolo; Zappasodi, Filippo; Merla, Arcangelo; Chiarelli, Antonio Maria

    2017-08-01

    Objective. Electrical and hemodynamic brain activity are linked through the neurovascular coupling process and they can be simultaneously measured through integration of electroencephalography (EEG) and functional near-infrared spectroscopy (fNIRS). Thanks to the lack of electro-optical interference, the two procedures can be easily combined and, whereas EEG provides electrophysiological information, fNIRS can provide measurements of two hemodynamic variables, namely oxygenated and deoxygenated hemoglobin. A Bayesian sequential Monte Carlo approach (particle filter, PF) was applied to simulated recordings of electrical and neurovascular-mediated hemodynamic activity, and the advantages of a unified framework were shown. Approach. Multiple neural activities and hemodynamic responses were simulated in the primary motor cortex of a subject brain. EEG and fNIRS recordings were obtained by means of forward models of volume conduction and light propagation through the head. A state-space model of combined EEG and fNIRS data was built and its dynamic evolution was estimated through a Bayesian sequential Monte Carlo approach (PF). Main results. We showed the feasibility of the procedure and the improvements in both electrical and hemodynamic brain activity reconstruction when using the PF on combined EEG and fNIRS measurements. Significance. The investigated procedure allows one to combine the information provided by the two methodologies, and, by taking advantage of a physical model of the coupling between electrical and hemodynamic response, to obtain a better estimate of brain activity evolution. Despite the high computational demand, application of such an approach to in vivo recordings could fully exploit the advantages of this combined brain imaging technology.

  20. Probabilistic forecasting and Bayesian data assimilation

    CERN Document Server

    Reich, Sebastian

    2015-01-01

    In this book the authors describe the principles and methods behind probabilistic forecasting and Bayesian data assimilation. Instead of focusing on particular application areas, the authors adopt a general dynamical systems approach, with a profusion of low-dimensional, discrete-time numerical examples designed to build intuition about the subject. Part I explains the mathematical framework of ensemble-based probabilistic forecasting and uncertainty quantification. Part II is devoted to Bayesian filtering algorithms, from classical data assimilation algorithms such as the Kalman filter, variational techniques, and sequential Monte Carlo methods, through to more recent developments such as the ensemble Kalman filter and ensemble transform filters. The McKean approach to sequential filtering in combination with coupling of measures serves as a unifying mathematical framework throughout Part II. Assuming only some basic familiarity with probability, this book is an ideal introduction for graduate students in ap...

  1. Sequential Monte Carlo filter for state estimation of LiFePO4 batteries based on an online updated model

    Science.gov (United States)

    Li, Jiahao; Klee Barillas, Joaquin; Guenther, Clemens; Danzer, Michael A.

    2014-02-01

    Battery state monitoring is one of the key techniques in battery management systems, e.g., in electric vehicles. An accurate estimation can help to improve system performance and to prolong the battery's remaining useful life. The main challenges for state estimation in LiFePO4 batteries are the flat open-circuit-voltage characteristic over battery state of charge (SOC) and the existence of hysteresis phenomena. Classical estimation approaches like Kalman filtering show limitations in handling nonlinear and non-Gaussian error distribution problems. In addition, uncertainties in the battery model parameters must be taken into account to describe battery degradation. In this paper, a novel model-based method combining a Sequential Monte Carlo filter with adaptive control to determine the cell SOC and its electric impedance is presented. The applicability of this dual estimator is verified using measurement data acquired from a commercial LiFePO4 cell. Due to better handling of the hysteresis problem, the results show the benefits of the proposed method over estimation with an extended Kalman filter.
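
    The abstract does not give the model equations; the toy sketch below (Python) shows what one step of such a dual state/parameter SMC filter can look like, with an invented flat OCV curve, coulomb-counting dynamics, and an internal resistance evolved by an artificial random walk. All constants and names are illustrative assumptions, not the paper's model.

        import numpy as np

        rng = np.random.default_rng(1)
        N = 500                                # number of particles
        soc = rng.uniform(0.4, 0.6, N)         # state of charge, one value per particle
        r_int = rng.uniform(0.01, 0.03, N)     # internal resistance [ohm], the "parameter"
        Q = 8500.0                             # hypothetical cell capacity [coulomb]

        def ocv(soc):
            # Deliberately flat OCV(SOC) curve, mimicking LiFePO4 behaviour.
            return 3.2 + 0.1 * soc

        def smc_step(soc, r_int, current, v_meas, dt=1.0, sigma_v=0.01):
            # Propagate: coulomb counting plus process noise; the parameter
            # follows an artificial random walk (dual estimation).
            soc = np.clip(soc - current * dt / Q + 1e-4 * rng.standard_normal(N), 0.0, 1.0)
            r_int = r_int + 1e-5 * rng.standard_normal(N)
            # Weight by the terminal-voltage likelihood and resample.
            v_pred = ocv(soc) - r_int * current
            w = np.exp(-0.5 * ((v_meas - v_pred) / sigma_v) ** 2)
            w /= w.sum()
            idx = rng.choice(N, size=N, p=w)   # multinomial resampling
            return soc[idx], r_int[idx]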

  2. Non-Pilot-Aided Sequential Monte Carlo Method to Joint Signal, Phase Noise, and Frequency Offset Estimation in Multicarrier Systems

    Directory of Open Access Journals (Sweden)

    Christelle Garnier

    2008-05-01

    We address the problem of phase noise (PHN) and carrier frequency offset (CFO) mitigation in multicarrier receivers. In multicarrier systems, phase distortions cause two effects: the common phase error (CPE) and intercarrier interference (ICI), which severely degrade the accuracy of the symbol detection stage. Here, we propose a non-pilot-aided scheme to jointly estimate the PHN, the CFO, and the multicarrier signal in the time domain. Unlike existing methods, this non-pilot-based estimation is performed without any decision-directed scheme. Our approach to the problem is based on Bayesian estimation using sequential Monte Carlo filtering, commonly referred to as particle filtering. The particle filter is efficiently implemented by combining the principles of the Rao-Blackwellization technique and an approximate optimal importance function for phase-distortion sampling. Moreover, in order to fully benefit from time-domain processing, we propose a multicarrier signal model which includes the redundancy information induced by the cyclic prefix, leading to a significant performance improvement. Simulation results are provided in terms of bit error rate (BER) and mean square error (MSE) to illustrate the efficiency and robustness of the proposed algorithm.

  3. A coherent structure approach for parameter estimation in Lagrangian Data Assimilation

    Science.gov (United States)

    Maclean, John; Santitissadeekorn, Naratip; Jones, Christopher K. R. T.

    2017-12-01

    We introduce a data assimilation method for estimating model parameters with observations of passive tracers by directly assimilating Lagrangian coherent structures. Our approach differs from the usual Lagrangian data assimilation approach, where parameters are estimated from tracer trajectories. We employ the Approximate Bayesian Computation (ABC) framework to avoid computing the likelihood function of the coherent structure, which is usually unavailable. We solve the ABC problem with a Sequential Monte Carlo (SMC) method, and use Principal Component Analysis (PCA) to identify the coherent patterns in the tracer trajectory data. Our new method shows remarkably improved results compared with the bootstrap particle filter when the physical model exhibits chaotic advection.
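
    The core of an ABC scheme of this kind is easy to state; the sketch below (Python) shows the rejection form with a plain Euclidean distance on summary statistics, rather than the full SMC with a decreasing tolerance schedule. `simulate`, `summary` (e.g. leading principal components of the tracer trajectories) and `prior_sampler` are placeholders, not the authors' code.

        import numpy as np

        def abc_rejection(simulate, summary, obs, prior_sampler, eps, n_draws, rng):
            # Keep parameter draws whose simulated coherent-structure summary
            # lies within tolerance eps of the observed summary.
            s_obs = summary(obs)
            accepted = []
            for _ in range(n_draws):
                theta = prior_sampler(rng)
                s_sim = summary(simulate(theta, rng))
                if np.linalg.norm(s_sim - s_obs) < eps:
                    accepted.append(theta)
            return np.array(accepted)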

  4. Application and Evaluation of a Snowmelt Runoff Model in the Tamor River Basin, Eastern Himalaya Using a Markov Chain Monte Carlo (MCMC) Data Assimilation Approach

    Science.gov (United States)

    Panday, Prajjwal K.; Williams, Christopher A.; Frey, Karen E.; Brown, Molly E.

    2013-01-01

    Previous studies have drawn attention to substantial hydrological changes taking place in mountainous watersheds where hydrology is dominated by cryospheric processes. Modelling is an important tool for understanding these changes but is particularly challenging in mountainous terrain owing to the scarcity of ground observations and the uncertainty of model parameters across space and time. This study utilizes a Markov chain Monte Carlo data assimilation approach to examine and evaluate the performance of a conceptual, degree-day snowmelt runoff model applied in the Tamor River basin in the eastern Nepalese Himalaya. The snowmelt runoff model is calibrated using daily streamflow from 2002 to 2006 with fairly high accuracy (average Nash-Sutcliffe metric approx. 0.84, low annual volume bias). The snowmelt contribution to runoff in the Tamor River basin for the 2002-2006 period is estimated to be 29.7+/-2.9% (which includes 4.2+/-0.9% from snowfall that promptly melts), whereas 70.3+/-2.6% is attributed to contributions from rainfall. On average, the elevation zone in the 4000-5500 m range contributes the most to basin runoff, averaging 56.9+/-3.6% of all snowmelt input and 28.9+/-1.1% of all rainfall input to runoff. Model-simulated streamflow using an interpolated precipitation data set decreases the fractional contribution from rainfall versus snowmelt compared with simulations using observed station precipitation. Model experiments indicate that the hydrograph itself does not constrain estimates of snowmelt versus rainfall contributions to total outflow, but that this derives from the degree-day melting model. Lastly, we demonstrate that the data assimilation approach is useful for quantifying and reducing uncertainty related to model parameters and thus provides uncertainty bounds on snowmelt and rainfall contributions in such mountainous watersheds.
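
    The degree-day model referred to here melts snow in proportion to the positive departure of air temperature from a threshold, M = DDF * max(T - T_crit, 0); a one-line Python sketch, where the factor value is illustrative rather than the paper's calibrated one:

        def degree_day_melt(t_mean_c, ddf=6.0, t_crit_c=0.0):
            # Daily melt depth [mm w.e.]; ddf is the degree-day factor
            # [mm w.e. per degC per day], here an illustrative value.
            return ddf * max(t_mean_c - t_crit_c, 0.0)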

  5. Sequential assimilation of satellite-derived vegetation and soil moisture products using SURFEX_v8.0: LDAS-Monde assessment over the Euro-Mediterranean area

    Science.gov (United States)

    Albergel, Clément; Munier, Simon; Leroux, Delphine Jennifer; Dewaele, Hélène; Fairbairn, David; Lavinia Barbu, Alina; Gelati, Emiliano; Dorigo, Wouter; Faroux, Stéphanie; Meurey, Catherine; Le Moigne, Patrick; Decharme, Bertrand; Mahfouf, Jean-Francois; Calvet, Jean-Christophe

    2017-10-01

    In this study, a global land data assimilation system (LDAS-Monde) is applied over Europe and the Mediterranean basin to increase monitoring accuracy for land surface variables. LDAS-Monde is able to ingest information from satellite-derived surface soil moisture (SSM) and leaf area index (LAI) observations to constrain the ISBA (Interactions between Soil, Biosphere and Atmosphere) land surface model (LSM) coupled with the CNRM (Centre National de Recherches Météorologiques) version of the Total Runoff Integrating Pathways (ISBA-CTRIP) continental hydrological system. It makes use of the CO2-responsive version of ISBA, which models leaf-scale physiological processes and plant growth. Transfer of water and heat in the soil relies on a multilayer diffusion scheme. SSM and LAI observations are assimilated using a simplified extended Kalman filter (SEKF), which uses finite differences from perturbed simulations to generate flow dependence between the observations and the model control variables. The latter include LAI and seven layers of soil (from 1 to 100 cm depth). A sensitivity test of the Jacobians over 2000-2012 exhibits effects related to both depth and season. It also suggests that observations of both LAI and SSM have an impact on the different control variables. From the assimilation of SSM, the LDAS is more effective in modifying soil moisture (SM) in the top layers of soil, as model sensitivity to SSM decreases with depth and has almost no impact from 60 cm downwards. From the assimilation of LAI, a strong impact on LAI itself is found. The LAI assimilation impact is more pronounced in SM layers that contain the highest fraction of roots (from 10 to 60 cm). The assimilation is more efficient in summer and autumn than in winter and spring. Results show that the LDAS works well in constraining the model to the observations and that stronger corrections are applied to LAI than to SM. A comprehensive evaluation of
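
    The finite-difference Jacobians mentioned here (one perturbed model run per control variable) can be sketched as follows in Python; `run_model` maps the control vector (LAI and layered soil moisture) to the observation equivalents (simulated SSM and LAI) and the perturbation size is an assumption. The SEKF then uses such a Jacobian H in the usual gain K = B H^T (H B H^T + R)^-1 to map innovations back onto the control variables.

        import numpy as np

        def fd_jacobian(run_model, x0, delta=1e-3):
            # Observation-operator Jacobian H = d(obs equivalent)/d(control),
            # built column by column from perturbed simulations.
            y0 = run_model(x0)
            H = np.empty((y0.size, x0.size))
            for j in range(x0.size):
                xp = x0.copy()
                xp[j] += delta
                H[:, j] = (run_model(xp) - y0) / delta
            return H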

  6. The metabolic network of Clostridium acetobutylicum: Comparison of the approximate Bayesian computation via sequential Monte Carlo (ABC-SMC) and profile likelihood estimation (PLE) methods for determinability analysis.

    Science.gov (United States)

    Thorn, Graeme J; King, John R

    2016-01-01

    The Gram-positive bacterium Clostridium acetobutylicum is an anaerobic endospore-forming species which produces acetone, butanol and ethanol via the acetone-butanol (AB) fermentation process, leading to biofuels including butanol. In previous work we sought to estimate the parameters in an ordinary differential equation model of the glucose metabolism network using data from pH-controlled continuous culture experiments. Here we combine two approaches: approximate Bayesian computation via an existing sequential Monte Carlo method (ABC-SMC), to compute credible intervals for the parameters, and profile likelihood estimation (PLE), to improve the calculation of confidence intervals for the same parameters; in both cases the parameters are derived from experimental data from forward-shift experiments. We also apply the ABC-SMC method to investigate which of the models introduced previously (one non-sporulation and four sporulation models) has the greatest strength of evidence. We find that the joint approximate posterior distribution of the parameters determines the same parameters as previously, including all of the basal and increased enzyme production rates and enzyme reaction activity parameters, as well as the Michaelis-Menten kinetic parameters for glucose ingestion, while other parameters are not as well determined, particularly those connected with the internal metabolites acetyl-CoA, acetoacetyl-CoA and butyryl-CoA. We also find that the approximate posterior is strongly non-Gaussian, indicating that our previous assumption of elliptical contours of the distribution is not valid, which has the effect of reducing the number of pairs of parameters that are (linearly) correlated with each other. Calculations of confidence intervals using the PLE method back this up. Finally, we find that all five of our models are equally likely, given the data available at present.

  7. Data Assimilation in Marine Models

    DEFF Research Database (Denmark)

    Frydendall, Jan

    This thesis consists of six research papers published or submitted for publication in the period 2006-2009 together with a summary report. The main topics of this thesis are nonlinear data assimilation techniques and estimation in dynamical models. The focus has been on nonlinear filtering techniques for large-scale geophysical numerical models and on making them feasible to work with in the data assimilation framework. The filtering techniques investigated are all Monte Carlo simulation based. Some very nice features that can be exploited in the Monte Carlo based data assimilation framework from... maximum likelihood framework. These issues are discussed in paper B. The third part of the thesis, which falls a bit outside the above context, is the work published in papers C and F. In the first paper, a simple data assimilation scheme was investigated to examine the potential benefits of incorporating a data...

  8. Sequential Monte Carlo Instant Radiosity.

    Science.gov (United States)

    Hedman, Peter; Karras, Tero; Lehtinen, Jaakko

    2017-05-01

    Instant Radiosity and its derivatives are interactive methods for efficiently estimating global (indirect) illumination. They represent the last indirect bounce of illumination before the camera as the composite radiance field emitted by a set of virtual point light sources (VPLs). In complex scenes, current algorithms suffer from a difficult combination of two issues: it remains a challenge to distribute VPLs in a manner that simultaneously gives a high-quality indirect illumination solution for each frame and to do so in a temporally coherent manner. We address both issues by building, and maintaining over time, an adaptive and temporally coherent distribution of VPLs in locations where they bring indirect light to the image. We introduce a novel heuristic sampling method that strives to move as few of the VPLs between frames as possible. The result is, to the best of our knowledge, the first interactive global illumination algorithm that works in complex, highly occluded scenes, suffers little from temporal flickering, supports moving cameras and light sources, and is output-sensitive in the sense that it places VPLs in the locations that matter most to the final result.

  9. SU-F-SPS-02: Accuracy of the Small Field Dosimetry Using the Monte Carlo and Sequential Dose Calculation Algorithms of Multiplan Treatment Planning System Within and Beyond Heterogeneous Media for Cyberknife M6 Unit

    Energy Technology Data Exchange (ETDEWEB)

    Serin, E.; Codel, G.; Mabhouti, H.; Cebe, M.; Sanli, E.; Pacaci, P.; Kucuk, N.; Kucukmorkoc, E.; Doyuran, M.; Canoglu, D.; Altinok, A.; Acar, H.; Caglar Ozkok, H. [Medipol University, Istanbul, Istanbul (Turkey)

    2016-06-15

    Purpose: In small field geometries, electronic equilibrium can be lost, making it challenging for the dose-calculation algorithm to accurately predict the dose, especially in the presence of tissue heterogeneities. In this study, the dosimetric accuracy of the Monte Carlo (MC) advanced dose calculation and sequential algorithms of the Multiplan treatment planning system was investigated for small radiation fields incident on homogeneous and heterogeneous geometries. Methods: Small open fields of the fixed cones of a CyberKnife M6 unit (100 to 500 mm2) were used for this study. The fields were incident on an in-house phantom containing lung, air, and bone inhomogeneities, as well as on a homogeneous phantom. Using the same film batch, the net OD-to-dose calibration curve was obtained using the CyberKnife with the 60 mm fixed cone by delivering 0-800 cGy. Films were scanned 48 hours after irradiation using an Epson 1000XL flatbed scanner. The dosimetric accuracy of the MC and sequential algorithms in the presence of the inhomogeneities was compared against EBT3 film dosimetry. Results: Open field tests in a homogeneous phantom showed good agreement between the two algorithms and film measurement. For the MC algorithm, the minimum gamma analysis passing rates between measured and calculated dose distributions were 99.7% and 98.3% for homogeneous and inhomogeneous fields, respectively, in the case of lung and bone. For the sequential algorithm, the minimum gamma analysis passing rates were 98.9% and 92.5% for homogeneous and inhomogeneous fields, respectively, for all cone sizes used. In the case of the air heterogeneity, the differences were larger for both calculation algorithms. Overall, the MC algorithm agreed better with measurement than the sequential algorithm. Conclusion: The Monte Carlo calculation algorithm in the Multiplan treatment planning system is an improvement over the existing sequential algorithm. Dose discrepancies were observed in the presence of air inhomogeneities.

  10. SU-F-SPS-02: Accuracy of the Small Field Dosimetry Using the Monte Carlo and Sequential Dose Calculation Algorithms of Multiplan Treatment Planning System Within and Beyond Heterogeneous Media for Cyberknife M6 Unit

    International Nuclear Information System (INIS)

    Serin, E.; Codel, G.; Mabhouti, H.; Cebe, M.; Sanli, E.; Pacaci, P.; Kucuk, N.; Kucukmorkoc, E.; Doyuran, M.; Canoglu, D.; Altinok, A.; Acar, H.; Caglar Ozkok, H.

    2016-01-01

    Purpose: In small field geometries, electronic equilibrium can be lost, making it challenging for the dose-calculation algorithm to accurately predict the dose, especially in the presence of tissue heterogeneities. In this study, the dosimetric accuracy of the Monte Carlo (MC) advanced dose calculation and sequential algorithms of the Multiplan treatment planning system was investigated for small radiation fields incident on homogeneous and heterogeneous geometries. Methods: Small open fields of the fixed cones of a CyberKnife M6 unit (100 to 500 mm2) were used for this study. The fields were incident on an in-house phantom containing lung, air, and bone inhomogeneities, as well as on a homogeneous phantom. Using the same film batch, the net OD-to-dose calibration curve was obtained using the CyberKnife with the 60 mm fixed cone by delivering 0-800 cGy. Films were scanned 48 hours after irradiation using an Epson 1000XL flatbed scanner. The dosimetric accuracy of the MC and sequential algorithms in the presence of the inhomogeneities was compared against EBT3 film dosimetry. Results: Open field tests in a homogeneous phantom showed good agreement between the two algorithms and film measurement. For the MC algorithm, the minimum gamma analysis passing rates between measured and calculated dose distributions were 99.7% and 98.3% for homogeneous and inhomogeneous fields, respectively, in the case of lung and bone. For the sequential algorithm, the minimum gamma analysis passing rates were 98.9% and 92.5% for homogeneous and inhomogeneous fields, respectively, for all cone sizes used. In the case of the air heterogeneity, the differences were larger for both calculation algorithms. Overall, the MC algorithm agreed better with measurement than the sequential algorithm. Conclusion: The Monte Carlo calculation algorithm in the Multiplan treatment planning system is an improvement over the existing sequential algorithm. Dose discrepancies were observed in the presence of air inhomogeneities.

  11. Advances in sequential data assimilation and numerical weather forecasting: An Ensemble Transform Kalman-Bucy Filter, a study on clustering in deterministic ensemble square root filters, and a test of a new time stepping scheme in an atmospheric model

    Science.gov (United States)

    Amezcua, Javier

    This dissertation deals with aspects of sequential data assimilation (in particular ensemble Kalman filtering) and numerical weather forecasting. In the first part, the recently formulated Ensemble Kalman-Bucy filter (EnKBF) is revisited. It is shown that the previously used numerical integration scheme fails when the magnitude of the background error covariance grows beyond that of the observational error covariance in the forecast window. Therefore, we present a suitable integration scheme that handles the stiffening of the differential equations involved and does not add further computational expense. Moreover, a transform-based alternative to the EnKBF is developed: under this scheme, the operations are performed in the ensemble space instead of in the state space. Advantages of this formulation are explained. For the first time, the EnKBF is implemented in an atmospheric model. The second part of this work deals with ensemble clustering, a phenomenon that arises when performing data assimilation with deterministic ensemble square root filters (EnSRFs) in highly nonlinear forecast models. Namely, an M-member ensemble detaches into an outlier and a cluster of M-1 members. Previous works may suggest that this issue represents a failure of EnSRFs; this work dispels that notion. It is shown that ensemble clustering can also be reverted by nonlinear processes, in particular the alternation between nonlinear expansion and compression of the ensemble in different regions of the attractor. Some EnSRFs that use random rotations have been developed to overcome this issue; these formulations are analyzed and their advantages and disadvantages with respect to common EnSRFs are discussed. The third and last part contains the implementation of the Robert-Asselin-Williams (RAW) filter in an atmospheric model. The RAW filter is an improvement to the widely popular Robert-Asselin filter that successfully suppresses spurious computational waves while avoiding any distortion

  12. Regional Ocean Data Assimilation

    KAUST Repository

    Edwards, Christopher A.

    2015-01-03

    This article reviews the past 15 years of developments in regional ocean data assimilation. A variety of scientific, management, and safety-related objectives motivate marine scientists to characterize many ocean environments, including coastal regions. As in weather prediction, the accurate representation of physical, chemical, and/or biological properties in the ocean is challenging. Models and observations alone provide imperfect representations of the ocean state, but together they can offer improved estimates. Variational and sequential methods are among the most widely used in regional ocean systems, and there have been exciting recent advances in ensemble and four-dimensional variational approaches. These techniques are increasingly being tested and adapted for biogeochemical applications.

  13. Sequential assimilation of satellite-derived vegetation and soil moisture products using SURFEX_v8.0: LDAS-Monde assessment over the Euro-Mediterranean area

    Directory of Open Access Journals (Sweden)

    C. Albergel

    2017-10-01

    In this study, a global land data assimilation system (LDAS-Monde) is applied over Europe and the Mediterranean basin to increase monitoring accuracy for land surface variables. LDAS-Monde is able to ingest information from satellite-derived surface soil moisture (SSM) and leaf area index (LAI) observations to constrain the ISBA (Interactions between Soil, Biosphere and Atmosphere) land surface model (LSM) coupled with the CNRM (Centre National de Recherches Météorologiques) version of the Total Runoff Integrating Pathways (ISBA-CTRIP) continental hydrological system. It makes use of the CO2-responsive version of ISBA, which models leaf-scale physiological processes and plant growth. Transfer of water and heat in the soil relies on a multilayer diffusion scheme. SSM and LAI observations are assimilated using a simplified extended Kalman filter (SEKF), which uses finite differences from perturbed simulations to generate flow dependence between the observations and the model control variables. The latter include LAI and seven layers of soil (from 1 to 100 cm depth). A sensitivity test of the Jacobians over 2000-2012 exhibits effects related to both depth and season. It also suggests that observations of both LAI and SSM have an impact on the different control variables. From the assimilation of SSM, the LDAS is more effective in modifying soil moisture (SM) in the top layers of soil, as model sensitivity to SSM decreases with depth and has almost no impact from 60 cm downwards. From the assimilation of LAI, a strong impact on LAI itself is found. The LAI assimilation impact is more pronounced in SM layers that contain the highest fraction of roots (from 10 to 60 cm). The assimilation is more efficient in summer and autumn than in winter and spring. Results show that the LDAS works well in constraining the model to the observations and that stronger corrections are applied to LAI than to SM. A comprehensive evaluation of

  14. Sequential Banking.

    OpenAIRE

    Bizer, David S; DeMarzo, Peter M

    1992-01-01

    The authors study environments in which agents may borrow sequentially from more than one lender. Although debt is prioritized, additional lending imposes an externality on prior debt because, with moral hazard, the probability of repayment of prior loans decreases. Equilibrium interest rates are higher than they would be if borrowers could commit to borrow from at most one bank. Even though the loan terms are less favorable than they would be under commitment, the indebtedness of borrowers i...

  15. A Particle Smoother with Sequential Importance Resampling for soil hydraulic parameter estimation: A lysimeter experiment

    Science.gov (United States)

    Montzka, Carsten; Hendricks Franssen, Harrie-Jan; Moradkhani, Hamid; Pütz, Thomas; Han, Xujun; Vereecken, Harry

    2013-04-01

    An adequate description of soil hydraulic properties is essential for good performance of hydrological forecasts. Several studies have shown that data assimilation can reduce parameter uncertainty by considering soil moisture observations. However, both these observations and the model forcings carry specific measurement errors. It therefore seems a logical step to base state updating and parameter estimation on observations made at multiple time steps, in order to reduce the influence of outliers at single time steps given measurement errors and unknown model forcings. Such outliers could result in erroneous state estimation as well as inadequate parameters. This has been one of the reasons to use a smoothing technique, as implemented for Bayesian data assimilation methods such as the Ensemble Kalman Filter (i.e. the Ensemble Kalman Smoother). Recently, an ensemble-based smoother has been developed for state updating with a SIR particle filter. However, this method has not been used for dual state-parameter estimation. In this contribution we present a Particle Smoother with sequential smoothing of particle weights for state and parameter resampling within a time window, as opposed to the single-time-step data assimilation used in filtering techniques. This can be seen as an intermediate variant between a parameter estimation technique using global optimization, with single parameter sets valid for the whole period, and sequential Monte Carlo techniques, with parameter sets evolving from one time step to another. The aims are (i) to improve the forecast of evaporation and groundwater recharge by estimating hydraulic parameters, and (ii) to reduce the impact of single erroneous model inputs/observations by a smoothing method. In order to validate the performance of the proposed method in a real-world application, the experiment is conducted in a lysimeter environment.
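
    The difference from a filter is that each particle is weighted by all observations in the window rather than a single one; a minimal Python sketch under the assumption of conditionally independent observations, with `log_lik` a placeholder likelihood:

        import numpy as np

        def window_log_weights(log_lik, particle_traj, obs_window):
            # particle_traj: (T, N, d) particle states over the window;
            # obs_window: the T observations. The smoother weight of a
            # trajectory is the product of its per-step likelihoods,
            # accumulated here as a sum of log-likelihoods.
            T, N = particle_traj.shape[0], particle_traj.shape[1]
            logw = np.zeros(N)
            for t in range(T):
                logw += log_lik(particle_traj[t], obs_window[t])
            return logw - np.logaddexp.reduce(logw)   # normalised in log space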

  16. Data Assimilation - Advances and Applications

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Brian J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-07-30

    This presentation provides an overview of data assimilation (model calibration) for complex computer experiments. Calibration refers to the process of probabilistically constraining uncertain physics/engineering model inputs to be consistent with observed experimental data. An initial probability distribution for these parameters is updated using the experimental information. Utilization of surrogate models and empirical adjustment for model form error in code calibration form the basis for the statistical methodology considered. The role of probabilistic code calibration in supporting code validation is discussed. Incorporation of model form uncertainty in rigorous uncertainty quantification (UQ) analyses is also addressed. Design criteria used within a batch sequential design algorithm are introduced for efficiently achieving predictive maturity and improved code calibration. Predictive maturity refers to obtaining stable predictive inference with calibrated computer codes. These approaches allow for augmentation of initial experiment designs for collecting new physical data. A standard framework for data assimilation is presented and techniques for updating the posterior distribution of the state variables based on particle filtering and the ensemble Kalman filter are introduced.
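
    For reference, the ensemble Kalman filter update mentioned at the end can be sketched in a few lines of Python (the stochastic, perturbed-observation variant, with a linear observation operator assumed; names are ours, not the presentation's):

        import numpy as np

        def enkf_analysis(X, y, H, R, rng):
            # X: (d, N) state ensemble; y: (m,) observation;
            # H: (m, d) observation operator; R: (m, m) obs-error covariance.
            d, N = X.shape
            A = X - X.mean(axis=1, keepdims=True)
            P = A @ A.T / (N - 1)                          # sample covariance
            K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)   # Kalman gain
            Y = y[:, None] + rng.multivariate_normal(np.zeros(y.size), R, N).T
            return X + K @ (Y - H @ X)                     # updated ensemble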

  17. Implicit particle filtering for models with partial noise, and an application to geomagnetic data assimilation

    Directory of Open Access Journals (Sweden)

    M. Morzfeld

    2012-06-01

    Implicit particle filtering is a sequential Monte Carlo method for data assimilation, designed to keep the number of particles manageable by focussing attention on regions of large probability. These regions are found by minimizing, for each particle, a scalar function F of the state variables. Some previous implementations of the implicit filter rely on finding the Hessians of these functions. The calculation of the Hessians can be cumbersome if the state dimension is large or if the underlying physics are such that derivatives of F are difficult to calculate, as happens in many geophysical applications, in particular in models with partial noise, i.e. with a singular state covariance matrix. Examples of models with partial noise include models where uncertain dynamic equations are supplemented by conservation laws with zero uncertainty, models with higher-order (in time) stochastic partial differential equations (PDEs), or models with PDEs driven by spatially smooth noise processes. We make the implicit particle filter applicable to such situations by combining gradient descent minimization with random maps, and show that the filter is efficient, accurate and reliable because it operates in a subspace of the state space. As an example, we consider a system of nonlinear stochastic PDEs that is of importance in geomagnetic data assimilation.
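
    The per-particle minimisation of F can indeed be done without Hessians; a bare-bones gradient-descent sketch in Python (the fixed step size and iteration count are illustrative, and `grad_F` stands for whatever gradient the application provides):

        import numpy as np

        def minimize_F(grad_F, x0, lr=0.1, n_iter=200):
            # Plain gradient descent toward the minimiser of the scalar
            # function F that locates each particle's high-probability region.
            x = np.asarray(x0, dtype=float)
            for _ in range(n_iter):
                x = x - lr * grad_F(x)
            return x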

  18. Hindcasting and Forecasting of Surface Flow Fields through Assimilating High Frequency Remotely Sensing Radar Data

    Directory of Open Access Journals (Sweden)

    Lei Ren

    2017-09-01

    In order to improve the forecasting ability of numerical models, a sequential data assimilation scheme, nudging, was applied to blend remotely sensed high-frequency (HF) radar surface currents with results from a three-dimensional numerical EFDC (Environmental Fluid Dynamics Code) model. For the first time, this research presents the most appropriate nudging parameters, determined from sensitivity experiments; such nudging parameters have not been analyzed previously. To examine the influence of data assimilation cycle lengths on forecasts and to extend forecasting improvements, the duration of the data assimilation cycles was studied by assimilating linearly interpolated temporal radar data. Assimilation of HF radar measurements at each model computational timestep outperformed the assimilation models using longer data assimilation cycle lengths; root-mean-square error (RMSE) values of both surface velocity components during a 12 h model forecasting period indicated that surface flow fields were significantly improved when implementing nudging assimilation at each model computational timestep. The Data Assimilation Skill Score (DASS) technique was used to quantitatively evaluate forecast improvements. The averaged values of DASS over the data assimilation domain were 26% and 33% for the east-west and north-south velocity components, respectively, over the half-day forecasting period. The correlation of Averaged Kinetic Energy (AKE) was improved by more than 10% in the best data assimilation model. Time series of velocity components and surface flow fields are presented to illustrate the improvement resulting from the application of data assimilation over time.
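
    Nudging (Newtonian relaxation) simply adds, at each timestep, an increment proportional to the observation-minus-model mismatch; a schematic Python sketch where the relaxation factor and the masking are illustrative (in a real setup the nudging coefficient carries units and the radar currents are interpolated to the model grid):

        import numpy as np

        def nudge(model_state, radar_obs, obs_mask, g=0.5):
            # Pull simulated surface currents toward HF radar values
            # wherever an observation exists; g is the relaxation strength.
            increment = np.where(obs_mask, radar_obs - model_state, 0.0)
            return model_state + g * increment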

  19. Mean field simulation for Monte Carlo integration

    CERN Document Server

    Del Moral, Pierre

    2013-01-01

    In the last three decades, there has been a dramatic increase in the use of interacting particle methods as a powerful tool in real-world applications of Monte Carlo simulation in computational physics, population biology, computer sciences, and statistical machine learning. Ideally suited to parallel and distributed computation, these advanced particle algorithms include nonlinear interacting jump diffusions; quantum, diffusion, and resampled Monte Carlo methods; Feynman-Kac particle models; genetic and evolutionary algorithms; sequential Monte Carlo methods; adaptive and interacting Markov...

  20. Conditions for successful data assimilation

    Science.gov (United States)

    Morzfeld, M.; Chorin, A. J.

    2013-12-01

    Many applications in science and engineering require that the predictions of uncertain models be updated by information from a stream of noisy data. The model and the data jointly define a conditional probability density function (pdf), which contains all the information one has about the process of interest and various numerical methods can be used to study and approximate this pdf, e.g. the Kalman filter, variational methods or particle filters. Given a model and data, each of these algorithms will produce a result. We are interested in the conditions under which this result is reasonable, i.e. consistent with the real-life situation one is modeling. In particular, we show, using idealized models, that numerical data assimilation is feasible in principle only if a suitably defined effective dimension of the problem is not excessive. This effective dimension depends on the noise in the model and the data, and in physically reasonable problems it can be moderate even when the number of variables is huge. In particular, we find that the effective dimension being moderate induces a balance condition between the noises in the model and the data; this balance condition is often satisfied in realistic applications or else the noise levels are excessive and drown the underlying signal. We also study the effects of the effective dimension on particle filters in two instances, one in which the importance function is based on the model alone, and one in which it is based on both the model and the data. We have three main conclusions: (1) the stability (i.e., non-collapse of weights) in particle filtering depends on the effective dimension of the problem. Particle filters can work well if the effective dimension is moderate even if the true dimension is large (which we expect to happen often in practice). (2) A suitable choice of importance function is essential, or else particle filtering fails even when data assimilation is feasible in principle with a sequential algorithm
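
    The weight collapse referred to in conclusion (1) is usually monitored with the effective sample size N_eff = 1 / sum_i w_i^2 of the normalised weights; a small Python helper (name and interface are ours, not the paper's):

        import numpy as np

        def effective_sample_size(weights):
            # For normalised weights, N_eff ranges from 1 (total collapse:
            # one particle carries all the weight) to N (uniform weights).
            w = np.asarray(weights, dtype=float)
            w = w / w.sum()
            return 1.0 / np.sum(w ** 2)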

  1. Regional Ocean Data Assimilation

    KAUST Repository

    Edwards, Christopher A.; Moore, Andrew M.; Hoteit, Ibrahim; Cornuelle, Bruce D.

    2015-01-01

    This article reviews the past 15 years of developments in regional ocean data assimilation. A variety of scientific, management, and safety-related objectives motivate marine scientists to characterize many ocean environments, including coastal

  2. Assimilation of stratospheric ozone in the chemical transport model STRATAQ

    Directory of Open Access Journals (Sweden)

    B. Grassi

    2004-09-01

    We describe a sequential assimilation approach useful for assimilating tracer measurements into a three-dimensional chemical transport model (CTM) of the stratosphere. The numerical code, developed largely according to Kha00, uses parameterizations and simplifications allowing the assimilation of sparse observations and the simultaneous evaluation of analysis errors, with reasonable computational requirements. Assimilation parameters are set by using χ2 and OmF (Observation minus Forecast) statistics. The CTM used here is a high-resolution three-dimensional model. It includes a detailed chemical package and is driven by UKMO (United Kingdom Meteorological Office) analyses. We illustrate the method using the assimilation of Upper Atmosphere Research Satellite/Microwave Limb Sounder (UARS/MLS) ozone observations for three weeks during the 1996 Antarctic spring. The comparison of results from the simulations with TOMS (Total Ozone Mapping Spectrometer) measurements shows improved total ozone fields due to the assimilation of MLS observations. Moreover, the assimilation gives indications of a possible model weakness in reproducing polar ozone values during springtime.

  3. Assimilation of stratospheric ozone in the chemical transport model STRATAQ

    Directory of Open Access Journals (Sweden)

    B. Grassi

    2004-09-01

    We describe a sequential assimilation approach useful for assimilating tracer measurements into a three-dimensional chemical transport model (CTM) of the stratosphere. The numerical code, developed largely according to Kha00, uses parameterizations and simplifications allowing the assimilation of sparse observations and the simultaneous evaluation of analysis errors, with reasonable computational requirements. Assimilation parameters are set by using χ2 and OmF (Observation minus Forecast) statistics. The CTM used here is a high-resolution three-dimensional model. It includes a detailed chemical package and is driven by UKMO (United Kingdom Meteorological Office) analyses. We illustrate the method using the assimilation of Upper Atmosphere Research Satellite/Microwave Limb Sounder (UARS/MLS) ozone observations for three weeks during the 1996 Antarctic spring. The comparison of results from the simulations with TOMS (Total Ozone Mapping Spectrometer) measurements shows improved total ozone fields due to the assimilation of MLS observations. Moreover, the assimilation gives indications of a possible model weakness in reproducing polar ozone values during springtime.

  4. Data Assimilation with Optimal Maps

    Science.gov (United States)

    El Moselhy, T.; Marzouk, Y.

    2012-12-01

    We present a new approach to Bayesian inference that entirely avoids Markov chain simulation and sequential importance resampling, by constructing a map that pushes forward the prior measure to the posterior measure. Existence and uniqueness of a suitable measure-preserving map is established by formulating the problem in the context of optimal transport theory. The map is written as a multivariate polynomial expansion and computed efficiently through the solution of a stochastic optimization problem. While our previous work [1] focused on static Bayesian inference problems, we now extend the map-based approach to sequential data assimilation, i.e., nonlinear filtering and smoothing. One scheme involves pushing forward a fixed reference measure to each filtered state distribution, while an alternative scheme computes maps that push forward the filtering distribution from one stage to the other. We compare the performance of these schemes and extend the former to problems of smoothing, using a map implementation of the forward-backward smoothing formula. Advantages of a map-based representation of the filtering and smoothing distributions include analytical expressions for posterior moments and the ability to generate arbitrary numbers of independent uniformly-weighted posterior samples without additional evaluations of the dynamical model. Perhaps the main advantage, however, is that the map approach inherently avoids issues of sample impoverishment, since it explicitly represents the posterior as the pushforward of a reference measure, rather than with a particular set of samples. The computational complexity of our algorithm is comparable to state-of-the-art particle filters. Moreover, the accuracy of the approach is controlled via the convergence criterion of the underlying optimization problem. We demonstrate the efficiency and accuracy of the map approach via data assimilation in

  5. Land Surface Data Assimilation

    Science.gov (United States)

    Houser, P. R.

    2012-12-01

    Information about land surface water, energy and carbon conditions is of critical importance to real-world applications such as agricultural production, water resource management, flood prediction, water supply, weather and climate forecasting, and environmental preservation. While ground-based observational networks are improving, the only practical way to observe these land surface states on continental to global scales is via satellites. Remote sensing can make spatially comprehensive measurements of various components of the terrestrial system, but it cannot provide information on the entire system (e.g. evaporation), and the observations represent only an instant in time. Land surface process models may be used to predict temporal and spatial terrestrial dynamics, but these predictions are often poor, due to errors in model initialization, parameters, forcing, and physics. Therefore, an attractive prospect is to combine the strengths of land surface models and observations (and minimize their weaknesses) to provide a superior terrestrial state estimate. This is the goal of land surface data assimilation. Data assimilation combines observations into a dynamical model, using the model's equations to provide time continuity and coupling between the estimated fields. Land surface data assimilation aims to utilize both our land surface process knowledge, as embodied in a land surface model, and the information that can be gained from observations. Both model predictions and observations are imperfect, and we wish to use both synergistically to obtain a more accurate result. Moreover, both contain different kinds of information that, when used together, provide an accuracy level that cannot be obtained by either individually. Model biases can be mitigated using a complementary calibration and parameterization process. Limited point measurements are often used to calibrate the model(s) and validate the assimilation results. This presentation will provide a brief background on land

  6. Advanced Multilevel Monte Carlo Methods

    KAUST Repository

    Jasra, Ajay; Law, Kody; Suciu, Carina

    2017-01-01

    This article reviews the application of advanced Monte Carlo techniques in the context of Multilevel Monte Carlo (MLMC). MLMC is a strategy employed to compute expectations which can be biased in some sense, for instance, by using the discretization of an associated probability law. The MLMC approach works with a hierarchy of biased approximations which become progressively more accurate and more expensive. Using a telescoping representation of the most accurate approximation, the method is able to reduce the computational cost for a given level of error versus i.i.d. sampling from this latter approximation. All of these ideas originated for cases where exact sampling from couples in the hierarchy is possible. This article considers the case where such exact sampling is not currently possible. We consider Markov chain Monte Carlo and sequential Monte Carlo methods which have been introduced in the literature, and we describe different strategies which facilitate the application of MLMC within these methods.
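
    The telescoping representation referred to here can be written as

        \mathbb{E}[f(X_L)] = \mathbb{E}[f(X_0)] + \sum_{l=1}^{L} \mathbb{E}\left[ f(X_l) - f(X_{l-1}) \right],

    where X_l denotes the biased approximation at level l of the hierarchy; when consecutive levels are well coupled, each difference term has small variance and therefore needs few samples, which is where the cost reduction comes from.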

  7. Advanced Multilevel Monte Carlo Methods

    KAUST Repository

    Jasra, Ajay

    2017-04-24

    This article reviews the application of advanced Monte Carlo techniques in the context of Multilevel Monte Carlo (MLMC). MLMC is a strategy employed to compute expectations which can be biased in some sense, for instance, by using the discretization of an associated probability law. The MLMC approach works with a hierarchy of biased approximations which become progressively more accurate and more expensive. Using a telescoping representation of the most accurate approximation, the method is able to reduce the computational cost for a given level of error versus i.i.d. sampling from this latter approximation. All of these ideas originated for cases where exact sampling from couples in the hierarchy is possible. This article considers the case where such exact sampling is not currently possible. We consider Markov chain Monte Carlo and sequential Monte Carlo methods which have been introduced in the literature, and we describe different strategies which facilitate the application of MLMC within these methods.

  8. Data Assimilation in Forest Inventory: First Empirical Results

    Directory of Open Access Journals (Sweden)

    Mattias Nyström

    2015-12-01

    Data assimilation techniques were used to estimate forest stand data in 2011 by sequentially combining remote-sensing-based estimates of forest variables with predictions from growth models. Estimates of stand data, based on canopy height models obtained from image matching of digital aerial images at six different time points between 2003 and 2011, served as input to the data assimilation. The assimilation routines were built on the extended Kalman filter. The study was conducted in hemi-boreal forest at the Remningstorp test site in southern Sweden (lat. 58°28′ N; long. 13°37′ E). The assimilation results were compared with two other methods used in practice for the estimation of forest variables: the first was to use only the most recent estimate obtained from remotely sensed data (2011), and the second was to forecast the first estimate (2003) to the endpoint (2011). All three approaches were validated using nine 40 m radius validation plots, which were carefully measured in the field. The results showed that the data assimilation approach provided better results than the two alternative methods. Data assimilation of remote sensing time series has been used previously for calibrating forest ecosystem models but, to our knowledge, this is the first study with real data in which data assimilation has been used for estimating forest inventory data. The study constitutes a starting point for the development of a framework useful for sequentially utilizing all types of remote sensing data in order to provide precise and up-to-date estimates of forest stand parameters.

  9. Displacement data assimilation

    Energy Technology Data Exchange (ETDEWEB)

    Rosenthal, W. Steven [Pacific Northwest Laboratory, Richland, WA 99354 (United States); Venkataramani, Shankar [Department of Mathematics and Program in Applied Mathematics, University of Arizona, Tucson, AZ 85721 (United States); Mariano, Arthur J. [Rosenstiel School of Marine & Atmospheric Science, University of Miami, Miami, FL 33149 (United States); Restrepo, Juan M., E-mail: restrepo@math.oregonstate.edu [Department of Mathematics, Oregon State University, Corvallis, OR 97331 (United States)

    2017-02-01

    We show that modifying a Bayesian data assimilation scheme by incorporating kinematically-consistent displacement corrections produces a scheme that is demonstrably better at estimating partially observed state vectors in a setting where feature information is important. While the displacement transformation is generic, here we implement it within an ensemble Kalman Filter framework and demonstrate its effectiveness in tracking stochastically perturbed vortices.

  10. Improving operational flood forecasting through data assimilation

    Science.gov (United States)

    Rakovec, Oldrich; Weerts, Albrecht; Uijlenhoet, Remko; Hazenberg, Pieter; Torfs, Paul

    2010-05-01

    Accurate flood forecasting has been a challenging topic in hydrology for decades. Uncertainty in hydrological forecasts is due to errors in the initial state (e.g. forcing errors in historical mode), errors in model structure and parameters, and, last but not least, errors in the model forcings (weather forecasts) during the forecast mode. More accurate flood forecasts can be obtained through data assimilation by merging observations with model simulations. This makes it possible to identify the sources of uncertainty in the flood forecasting system. Our aim is to assess the different sources of error that affect the initial state and to investigate how they propagate through hydrological models with different levels of spatial variation, starting from lumped models. The knowledge thus obtained can then be used in a data assimilation scheme to improve the flood forecasts. This study presents the first results of this framework and focuses on quantifying precipitation errors and their effect on discharge simulations within the Ourthe catchment (1600 km2), which is situated in the Belgian Ardennes and is one of the larger subbasins of the Meuse River. Inside the catchment, hourly rain gauge information from 10 different locations is available over a period of 15 years. Based on these time series, the bootstrap method has been applied to generate precipitation ensembles. These were then used to simulate the catchment's discharges at the outlet. The corresponding streamflow ensembles were further assimilated with observed river discharges to update the model states of lumped hydrological models (R-PDM, HBV) through residual resampling. This particle filtering technique is a sequential data assimilation method and makes no prior assumption about the probability density function of the model states, which, in contrast to the ensemble Kalman filter, does not have to be Gaussian. Our further research will be aimed at quantifying and reducing the sources of uncertainty that affect the initial

  11. Multilevel Monte Carlo in Approximate Bayesian Computation

    KAUST Repository

    Jasra, Ajay

    2017-02-13

    In the following article we consider approximate Bayesian computation (ABC) inference. We introduce a method for numerically approximating ABC posteriors using the multilevel Monte Carlo (MLMC). A sequential Monte Carlo version of the approach is developed and it is shown under some assumptions that for a given level of mean square error, this method for ABC has a lower cost than i.i.d. sampling from the most accurate ABC approximation. Several numerical examples are given.

  12. Real-time projections of cholera outbreaks through data assimilation and rainfall forecasting

    Science.gov (United States)

    Pasetto, Damiano; Finger, Flavio; Rinaldo, Andrea; Bertuzzo, Enrico

    2017-10-01

    Although treatment for cholera is well-known and cheap, outbreaks in epidemic regions still exact high death tolls mostly due to the unpreparedness of health care infrastructures to face unforeseen emergencies. In this context, mathematical models for the prediction of the evolution of an ongoing outbreak are of paramount importance. Here, we test a real-time forecasting framework that readily integrates new information as soon as available and periodically issues an updated forecast. The spread of cholera is modeled by a spatially-explicit scheme that accounts for the dynamics of susceptible, infected and recovered individuals hosted in different local communities connected through hydrologic and human mobility networks. The framework presents two major innovations for cholera modeling: the use of a data assimilation technique, specifically an ensemble Kalman filter, to update both state variables and parameters based on the observations, and the use of rainfall forecasts to force the model. The exercise of simulating the state of the system and the predictive capabilities of the novel tools, set at the initial phase of the 2010 Haitian cholera outbreak using only information that was available at that time, serves as a benchmark. Our results suggest that the assimilation procedure with the sequential update of the parameters outperforms calibration schemes based on Markov chain Monte Carlo. Moreover, in a forecasting mode the model usefully predicts the spatial incidence of cholera at least one month ahead. The performance decreases for longer time horizons yet allowing sufficient time to plan for deployment of medical supplies and staff, and to evaluate alternative strategies of emergency management.
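
    One common way to realise the joint update of state variables and parameters mentioned here is state augmentation; a schematic Python sketch in which `analysis_step` stands for any ensemble Kalman analysis (e.g. a routine like the EnKF sketch given earlier) and is an assumption, not the authors' code:

        import numpy as np

        def augmented_update(states, params, analysis_step, y):
            # Stack the epidemic state ensemble (e.g. S, I, R per community)
            # with the parameter ensemble, run one Kalman analysis on the
            # augmented ensemble, then split the result again.
            d_state = states.shape[0]
            Z = np.vstack([states, params])    # (d_state + d_param, N)
            Za = analysis_step(Z, y)
            return Za[:d_state], Za[d_state:]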

  13. Data assimilation strategies for volcano geodesy

    Science.gov (United States)

    Zhan, Yan; Gregg, Patricia M.

    2017-09-01

    Ground deformation observed using near-real time geodetic methods, such as InSAR and GPS, can provide critical information about the evolution of a magma chamber prior to volcanic eruption. Rapid advancement in numerical modeling capabilities has resulted in a number of finite element models targeted at better understanding the connection between surface uplift associated with magma chamber pressurization and the potential for volcanic eruption. Robust model-data fusion techniques are necessary to take full advantage of the numerical models and the volcano monitoring observations currently available. In this study, we develop a 3D data assimilation framework using the Ensemble Kalman Filter (EnKF) approach in order to combine geodetic observations of surface deformation with geodynamic models to investigate volcanic unrest. The EnKF sequential assimilation method utilizes disparate data sets as they become available to update geodynamic models of magma reservoir evolution. While the EnKF has been widely applied in hydrologic and climate modeling, the adaptation for volcano monitoring is in its initial stages. As such, our investigation focuses on conducting a series of sensitivity tests to optimize the EnKF for volcano applications and on developing specific strategies for assimilation of geodetic data. Our numerical experiments illustrate that the EnKF is able to adapt well to the spatial limitations posed by GPS data and the temporal limitations of InSAR, and that specific strategies can be adopted to enhance EnKF performance to improve model forecasts. Specifically, our numerical experiments indicate that: (1) incorporating additional iterations of the EnKF analysis step is more efficient than increasing the number of ensemble members; (2) the accuracy of the EnKF results are not affected by initial parameter assumptions; (3) GPS observations near the center of uplift improve the quality of model forecasts; (4) occasionally shifting continuous GPS stations to

  14. Impact of real-time measurements for data assimilation in reservoir simulation

    Energy Technology Data Exchange (ETDEWEB)

    Schulze-Riegert, R; Krosche, M [Scandpower Petroleum Technology GmbH, Hamburg (Germany)]; Pajonk, O [TU Braunschweig (Germany). Inst. fuer Wissenschaftliches Rechnen]; Myrland, T [Norges Teknisk-Naturvitenskapelige Univ. (NTNU), Trondheim (Norway)]

    2008-10-23

    This paper gives an overview of the conceptual background of data assimilation techniques. The framework of sequential data assimilation, as described for the ensemble Kalman filter implementation, allows a continuous integration of new measurement data. The initial diversity of the ensemble members is critical for the assimilation process and for the ability to successfully assimilate measurement data. At the same time, the initial ensemble will impact the propagation of uncertainties, with crucial consequences for production forecasts. Data assimilation techniques have complementary features compared with other optimization techniques built on selection or regression schemes. Specifically, the EnKF is applicable to real field cases and defines an important perspective for facilitating continuous reservoir simulation model updates over a reservoir life cycle. (orig.)

  15. Spatial Assimilation in Denmark?

    DEFF Research Database (Denmark)

    Andersen, Hans Skifter

    2010-01-01

    ... market and discrimination, which limit the housing possibilities of ethnic minorities. Another explanation could be that immigrants, for different reasons, choose to settle in so-called ethnic enclaves where they can find an ethnic social network that can support them in their new country... In the traditional research literature about immigration it has been shown that, for many immigrants, living in enclaves has been a temporary situation. The 'spatial assimilation theory' says that this situation ends when the family has become more integrated into the new society and then moves to other parts...

  16. Variational data assimilation using targetted random walks

    KAUST Repository

    Cotter, S. L.

    2011-02-15

    The variational approach to data assimilation is a widely used methodology for both online prediction and reanalysis. In either of these scenarios, it can be important to assess uncertainties in the assimilated state. Ideally, it is desirable to have complete information concerning the Bayesian posterior distribution for the unknown state given data. We show that complete computational probing of this posterior distribution is now within reach in the offline situation. We introduce a Markov chain Monte Carlo (MCMC) method which enables us to directly sample from the Bayesian posterior distribution on the unknown functions of interest given observations. Since these methods are currently too computationally expensive to use in an online filtering scenario, we frame this in the context of offline reanalysis. Using a simple random-walk-type MCMC method, we are able to characterize the posterior distribution using only evaluations of the forward model of the problem, and of the model and data mismatch. No adjoint model is required for the method we use; however, more sophisticated MCMC methods are available which exploit derivative information. For simplicity of exposition, we consider the problem of assimilating data, either Eulerian or Lagrangian, into a low Reynolds number flow in a two-dimensional periodic geometry. We show that in many cases it is possible to recover the initial condition and model error (which we describe as unknown forcing to the model) from data, and that with increasing amounts of informative data, the uncertainty in our estimates reduces. © 2011 John Wiley & Sons, Ltd.
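
    The random-walk idea is simple enough to sketch: a toy Python version that, as in the record above, needs only evaluations of a model-data misfit and no adjoint. The quadratic misfit below is an illustrative stand-in, not the paper's flow problem:

        import numpy as np

        def rw_metropolis(misfit, u0, step, n_iter, rng):
            """Random-walk Metropolis targeting the density exp(-misfit(u)).

            Only forward evaluations of the misfit are needed; no derivative
            (adjoint) information is required."""
            u, phi = u0.copy(), misfit(u0)
            samples = []
            for _ in range(n_iter):
                v = u + step * rng.standard_normal(u.shape)   # random-walk proposal
                phi_v = misfit(v)
                if np.log(rng.uniform()) < phi - phi_v:       # accept w.p. min(1, e^{phi-phi_v})
                    u, phi = v, phi_v
                samples.append(u.copy())
            return np.array(samples)

        # Toy misfit: Gaussian likelihood ||G u - y||^2 / (2 sigma^2) plus a Gaussian prior
        G = np.array([[1.0, 0.5], [0.0, 1.0]])
        y = np.array([1.0, -0.5])
        sigma = 0.3
        misfit = lambda u: np.sum((G @ u - y) ** 2) / (2 * sigma ** 2) + 0.5 * np.sum(u ** 2)
        chain = rw_metropolis(misfit, np.zeros(2), 0.2, 5000, np.random.default_rng(1))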

  17. Data assimilation and PWR primary measurement

    International Nuclear Information System (INIS)

    Mercier, Thibaud

    2015-01-01

    A Pressurized Water Reactor (PWR) Reactor Coolant System (RCS) is a highly complex physical process: heterogeneous power, flow, and temperature distributions are difficult to measure accurately, since instrumentation is limited in number, which leads to correspondingly large safety and protection margins. EDF R&D is seeking to assess the potential benefits of applying data assimilation to a PWR's RCS measurements, in order to improve the estimators for parameters of a reactor's operating setpoint, i.e., improving accuracy and reducing uncertainties and biases of measured RCS parameters. In this thesis, we define a 0D semi-empirical model for the RCS, satisfying the description level usually chosen by plant operators, and construct a Monte Carlo method (inspired by ensemble methods) in order to use this model with data assimilation tools. We apply this method to simulated data in order to assess the reduction of uncertainties on key parameters: results exceed expectations; however, strong hypotheses are required, implying careful preprocessing of input data. (author)

  18. Nonlinear data assimilation

    CERN Document Server

    Van Leeuwen, Peter Jan; Reich, Sebastian

    2015-01-01

    This book contains two review articles on nonlinear data assimilation that deal with closely related topics but were written and can be read independently. Both contributions focus on so-called particle filters. The first contribution, by Peter Jan van Leeuwen, focuses on the potential of proposal densities. It discusses the issues with present-day particle filters and explores new ideas for proposal densities to solve them, converging to particle filters that work well in systems of any dimension, and closes with a high-dimensional example. The second contribution, by Cheng and Reich, discusses a unified framework for ensemble-transform particle filters. This allows one to bridge successful ensemble Kalman filters with fully nonlinear particle filters, and allows a proper introduction of localization in particle filters, which has been lacking up to now.
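
    For orientation, the following is a minimal bootstrap particle filter for a scalar state-space model. It is the baseline scheme, not the proposal-density or ensemble-transform variants the two contributions develop; the model and noise levels are toy assumptions:

        import numpy as np

        def bootstrap_pf(y_obs, n_part, f, h, q_std, r_std, rng):
            """Bootstrap particle filter for x_t = f(x_{t-1}) + q_t, y_t = h(x_t) + r_t."""
            x = rng.standard_normal(n_part)                      # initial particle cloud
            means = []
            for y in y_obs:
                x = f(x) + q_std * rng.standard_normal(n_part)   # propagate with the model
                logw = -0.5 * ((y - h(x)) / r_std) ** 2          # Gaussian likelihood weights
                w = np.exp(logw - logw.max())
                w /= w.sum()
                means.append(np.sum(w * x))                      # filtered posterior mean
                x = rng.choice(x, size=n_part, p=w)              # multinomial resampling
            return np.array(means)

        rng = np.random.default_rng(2)
        x_true = np.cumsum(0.3 * rng.standard_normal(50))        # toy random-walk truth
        y_obs = x_true + 0.5 * rng.standard_normal(50)
        est = bootstrap_pf(y_obs, 1000, lambda x: x, lambda x: x, 0.3, 0.5, rng)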

  19. Priming in concert: Assimilation and contrast with multiple affective and gender primes.

    NARCIS (Netherlands)

    Fockenberg, D.A.; Koole, S.L.; Semin, G.R.

    2008-01-01

    The present research investigated the influence of multiple sequential primes on social categorization processes. Study 1 examined an evaluative decision task in which targets were preceded and succeeded by two primes. As expected, the temporally closest forward primes had assimilative effects on

  20. Improving carbon model phenology using data assimilation

    Science.gov (United States)

    Exbrayat, Jean-François; Smallman, T. Luke; Bloom, A. Anthony; Williams, Mathew

    2015-04-01

    Carbon cycle dynamics are significantly impacted by ecosystem phenology, leading to substantial seasonal and inter-annual variation in the global carbon balance. Representing inter-annual variability is key for predicting the response of the terrestrial ecosystem to climate change and disturbance. Existing terrestrial ecosystem models (TEMs) often struggle to accurately simulate observed inter-annual variability. TEMs often use different phenological models based on plant functional type (PFT) assumptions. Moreover, due to their high computational overhead, TEMs are unable to take advantage of globally available datasets to calibrate their parameters. Here we describe the novel CARbon DAta MOdel fraMework (CARDAMOM) for data assimilation. CARDAMOM is used to calibrate the Data Assimilation Linked Ecosystem Carbon version 2 (DALEC2) model using Bayes' theorem within a Metropolis-Hastings Markov chain Monte Carlo (MH-MCMC) scheme. CARDAMOM provides a framework which combines knowledge from observations, such as remotely sensed LAI, with heuristic information in the form of Ecological and Dynamical Constraints (EDCs). The EDCs are representative of real-world processes and constrain parameter interdependencies and carbon dynamics. We used CARDAMOM to bring together globally spanning datasets of LAI and the DALEC2 and DALEC2-GSI models. These analyses allow us to investigate the sensitivity of ecosystem processes to the representation of phenology. DALEC2 uses an analytically solved model of phenology which is invariant between years. In contrast, DALEC2-GSI uses a growing season index (GSI), calculated as a function of temperature, vapour pressure deficit (VPD), and photoperiod, to determine bud-burst and leaf senescence, allowing the model to simulate inter-annual variability in response to climate. Neither model makes any PFT assumptions about the phenological controls of a given ecosystem, allowing the data alone to determine the impact of the meteorological
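
    The interplay of MH-MCMC and EDC-style constraints can be caricatured in a few lines: parameter proposals violating an ecological constraint are rejected before the likelihood is evaluated. The constraint and likelihood below are illustrative stand-ins, not the actual CARDAMOM/DALEC2 formulations:

        import numpy as np

        def mh_with_constraints(loglik, constraints_ok, theta0, step, n_iter, rng):
            """Metropolis-Hastings over model parameters; proposals violating
            the (illustrative) ecological constraints are rejected outright."""
            theta, ll = theta0.copy(), loglik(theta0)
            chain = []
            for _ in range(n_iter):
                prop = theta + step * rng.standard_normal(theta.shape)
                if constraints_ok(prop):                    # EDC-style screening
                    ll_p = loglik(prop)
                    if np.log(rng.uniform()) < ll_p - ll:
                        theta, ll = prop, ll_p
                chain.append(theta.copy())
            return np.array(chain)

        # Illustrative constraint: two allocation fractions must be positive and sum below 1
        ok = lambda th: th.min() > 0 and th.sum() < 1
        loglik = lambda th: -0.5 * np.sum((th - np.array([0.3, 0.4])) ** 2) / 0.05
        chain = mh_with_constraints(loglik, ok, np.array([0.2, 0.2]), 0.05, 5000,
                                    np.random.default_rng(3))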

  1. Data Assimilation by Conditioning of Driving Noise on Future Observations

    KAUST Repository

    Lee, Wonjung

    2014-08-01

    Conventional recursive filtering approaches, designed for quantifying the state of an evolving stochastic dynamical system with intermittent observations, use a sequence of i) an uncertainty propagation step followed by ii) a step where the associated data are assimilated using Bayes' rule. Alternatively, the order of the steps can be switched to i) one-step-ahead data assimilation followed by ii) uncertainty propagation. In this paper, we apply this smoothing-based sequential filter to systems driven by random noise, with the conditioning on future observations applied not only to the system variable but also to the driving noise. Our research reveals that, for the nonlinear filtering problem, the conditioned driving noise is biased by a nonzero mean and in turn pushes the filtering solution forward in time, closer to the true state, when it drives the system. As a result, our proposed method can yield a more accurate approximate solution to the state estimation problem. © 1991-2012 IEEE.

  2. Sequential charged particle reaction

    International Nuclear Information System (INIS)

    Hori, Jun-ichi; Ochiai, Kentaro; Sato, Satoshi; Yamauchi, Michinori; Nishitani, Takeo

    2004-01-01

    The effective cross sections for producing the sequential reaction products in F82H, pure vanadium, and LiF with respect to 14.9-MeV neutrons were obtained and compared with estimated values. Since the sequential reactions depend on the behavior of the secondary charged particles, the effective cross sections depend on the target nuclei and the material composition. The effective cross sections were also estimated using the EAF libraries and compared with the experimental ones. There were large discrepancies between estimated and experimental values. Additionally, we show the contribution of the sequential reactions to the induced activity and dose rate in the boundary region with water. The present study clarifies that the sequential reactions are of great importance for evaluating the dose rates around the surface of cooling pipes and the activated corrosion products. (author)

  3. Global Data Assimilation System (GDAS)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Global Data Assimilation System (GDAS) is the system used by the Global Forecast System (GFS) model to place observations into a gridded model space for the...

  4. Data assimilation in hydrological modelling

    DEFF Research Database (Denmark)

    Drecourt, Jean-Philippe

    Data assimilation is an invaluable tool in hydrological modelling as it allows scarce data to be combined efficiently with a numerical model to obtain improved model predictions. In addition, data assimilation also provides an uncertainty analysis of the predictions made by the hydrological model....... In this thesis, the Kalman filter is used for data assimilation with a focus on groundwater modelling. However, the developed techniques are general and can also be applied in other modelling domains. Modelling involves conceptualization of the processes of Nature. Data assimilation provides a way to deal with model non-linearities and biased errors. A literature review analyzes the most popular techniques and their application in hydrological modelling. Since bias is an important problem in groundwater modelling, two bias-aware Kalman filters have been implemented and compared using an artificial test case...
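
    One standard way to make a Kalman filter "bias aware", of the kind this thesis compares, is to augment the state with a persistent bias term. A minimal linear sketch (the exact filter formulations implemented in the thesis may differ):

        import numpy as np
        from scipy.linalg import block_diag

        def bias_aware_kf_step(z, P, y, M, H, Q, R):
            """One predict/update step of a Kalman filter whose state z = [x, b]
            is augmented with a persistent observation bias: y = H x + b + noise."""
            n, m = M.shape[0], H.shape[0]
            Ma = block_diag(M, np.eye(m))           # bias follows a random-constant model
            Ha = np.hstack([H, np.eye(m)])          # bias adds directly to the observation
            Qa = block_diag(Q, 1e-6 * np.eye(m))    # tiny noise keeps the bias adaptable
            z = Ma @ z                              # predict
            P = Ma @ P @ Ma.T + Qa
            S = Ha @ P @ Ha.T + R                   # update
            K = P @ Ha.T @ np.linalg.inv(S)
            z = z + K @ (y - Ha @ z)
            P = (np.eye(len(z)) - K @ Ha) @ P
            return z, P

        # Toy scalar example: one state variable plus one observation bias
        M = np.array([[0.95]]); H = np.array([[1.0]])
        z, P = np.zeros(2), np.eye(2)
        z, P = bias_aware_kf_step(z, P, np.array([1.2]), M, H,
                                  0.01 * np.eye(1), 0.1 * np.eye(1))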

  5. Multi-Scale Three-Dimensional Variational Data Assimilation System for Coastal Ocean Prediction

    Science.gov (United States)

    Li, Zhijin; Chao, Yi; Li, P. Peggy

    2012-01-01

    A multi-scale three-dimensional variational data assimilation system (MS-3DVAR) has been formulated, and the associated software system developed, for improving high-resolution coastal ocean prediction. This system helps improve coastal ocean prediction skill, and has been used in support of operational coastal ocean forecasting systems and field experiments. The system has been developed to improve the capability of data assimilation for assimilating, simultaneously and effectively, sparse vertical profiles and high-resolution remote sensing surface measurements into coastal ocean models, as well as constraining model biases. In this system, the cost function is decomposed into two separate units for the large- and small-scale components, respectively. As such, data assimilation is implemented sequentially from large to small scales, the background error covariance is constructed to be scale-dependent, and a scale-dependent dynamic balance is incorporated. This scheme allows large scales and model bias to be constrained effectively by assimilating sparse vertical profiles, and small scales by assimilating high-resolution surface measurements. MS-3DVAR thus enhances the capability of traditional 3DVAR for assimilating highly heterogeneously distributed observations, such as along-track satellite altimetry data, and in particular maximizes the extraction of information from limited numbers of vertical profile observations.
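
    The large-to-small sequential idea can be illustrated with two successive optimal-interpolation analyses (algebraically equivalent to 3DVAR with linear operators): a broad background covariance for sparse profile data, then a short one for dense surface data, with the first analysis serving as the second background. The covariance forms and data are placeholders, not the MS-3DVAR formulation:

        import numpy as np

        def oi_analysis(xb, B, y, H, R):
            """Single optimal-interpolation analysis step."""
            K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
            return xb + K @ (y - H @ xb)

        def gaussian_B(n, length_scale, variance):
            """Placeholder scale-dependent background covariance on a 1-D grid."""
            i = np.arange(n)
            return variance * np.exp(-0.5 * ((i[:, None] - i[None, :]) / length_scale) ** 2)

        n = 100
        xb = np.zeros(n)
        H_prof = np.zeros((1, n)); H_prof[0, 20] = 1.0        # one sparse "profile" point
        H_surf = np.eye(n)                                    # dense "surface" measurements
        # Large scales first (broad covariance, sparse profile data) ...
        xa_large = oi_analysis(xb, gaussian_B(n, 30.0, 1.0),
                               np.array([1.0]), H_prof, np.array([[0.1]]))
        # ... then small scales, using the large-scale analysis as the new background.
        y_surf = np.sin(np.arange(n) / 5.0)
        xa = oi_analysis(xa_large, gaussian_B(n, 3.0, 0.2), y_surf, H_surf, 0.2 * np.eye(n))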

  6. ASSIMILATION OF COARSE-SCALE DATA USING THE ENSEMBLE KALMAN FILTER

    KAUST Repository

    Efendiev, Yalchin

    2011-01-01

    Reservoir data is usually scale dependent and exhibits multiscale features. In this paper we use the ensemble Kalman filter (EnKF) to integrate data at different spatial scales for estimating reservoir fine-scale characteristics. Relationships between the various scales are modeled via upscaling techniques. We propose two versions of the EnKF to assimilate the multiscale data: (i) one in which all the data are assimilated together, and (ii) one in which the data are assimilated sequentially in batches. Ensemble members obtained after assimilating one set of data are used as a prior to assimilate the next set of data. Both versions are easily implementable with any upscaling method that links the fine and coarse scales. The numerical results with the different methods are presented in a twin experiment setup using a two-dimensional, two-phase (oil and water) flow model. Results are shown with coarse-scale permeability and coarse-scale saturation data. They indicate that additional data provide better fine-scale estimates and fractional flow predictions. We observed that the two versions of the EnKF differed in their estimates when coarse-scale permeability is provided, whereas their results are similar when coarse-scale saturation is used. This behavior is thought to be due to the nonlinearity of the upscaling operator in the case of the former data type. We also tested our procedures with various precisions of the coarse-scale data to account for the inexact relationship between the fine- and coarse-scale data. As expected, the results show that higher precision in the coarse-scale data yields improved estimates. With better coarse-scale modeling and inversion techniques, as more data at multiple coarse scales become available, the proposed modifications to the EnKF could be relevant in future studies.
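
    Version (ii) is easy to express given any EnKF analysis routine: each batch's analysis ensemble becomes the prior for the next batch. A self-contained sketch with toy dimensions; the update is the generic perturbed-observation form, not necessarily the paper's exact implementation:

        import numpy as np

        def enkf_update(X, y, H, R, rng):
            """Generic perturbed-observation EnKF analysis."""
            N = X.shape[1]
            A = X - X.mean(axis=1, keepdims=True)
            Pf = A @ A.T / (N - 1)
            K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)
            Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, size=N).T
            return X + K @ (Y - H @ X)

        def assimilate_in_batches(X, batches, rng):
            """Version (ii): the analysis ensemble from one batch is the prior
            for the next (e.g., coarse permeability first, then saturation)."""
            for y, H, R in batches:
                X = enkf_update(X, y, H, R, rng)
            return X

        rng = np.random.default_rng(4)
        X = rng.normal(size=(10, 40))
        batches = [(np.zeros(3), np.eye(3, 10), 0.1 * np.eye(3)),
                   (np.ones(2), np.eye(2, 10), 0.2 * np.eye(2))]
        Xa = assimilate_in_batches(X, batches, rng)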

  7. Effective Assimilation of Global Precipitation

    Science.gov (United States)

    Lien, G.; Kalnay, E.; Miyoshi, T.; Huffman, G. J.

    2012-12-01

    Assimilating precipitation observations by modifying the moisture and sometimes temperature profiles has been shown to be successful in forcing the model precipitation to be close to the observed precipitation, but only while the assimilation is taking place. After the forecast starts, the model tends to "forget" the assimilation changes and loses the extra skill after a few forecast hours. This suggests that this approach is not an efficient way to modify the potential vorticity field, since that is the variable that the model would remember. In this study, the ensemble Kalman filter (EnKF) method is used to effectively change the potential vorticity field by allowing ensemble members with better precipitation to receive higher weights. In addition to using an EnKF, two other changes in the precipitation assimilation process are proposed to address the problems related to the highly non-Gaussian nature of the precipitation variable: a) transform precipitation into a Gaussian distribution based on its climatological distribution, and b) only assimilate precipitation at locations where some ensemble members have positive precipitation. The idea is first tested by observing system simulation experiments (OSSEs) using SPEEDY, a simplified but realistic general circulation model. When global precipitation is assimilated in addition to conventional rawinsonde observations, both the analyses and the medium-range forecasts are significantly improved compared to having only rawinsonde observations. The improvement is much reduced when only the moisture field is modified with the same approach, which shows the importance of the error covariance between precipitation and all other model variables. The effect of precipitation assimilation is larger in the Southern Hemisphere than in the Northern Hemisphere because the Northern Hemisphere analyses are already accurate as a result of denser rawinsonde stations. Assimilation of precipitation using a more comprehensive
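
    Change (a), the Gaussian transformation, can be sketched as a rank-based (empirical-CDF) anamorphosis; the transform actually used in the study may differ in detail, and the skewed "climatology" below is a synthetic placeholder:

        import numpy as np
        from scipy.stats import norm

        def gaussian_transform(precip, climatology):
            """Map precipitation to a standard normal variable via its empirical
            climatological CDF (rank-based Gaussian anamorphosis sketch)."""
            clim = np.sort(climatology)
            # Empirical CDF value of each observation within the climatology
            cdf = np.searchsorted(clim, precip, side="right") / (len(clim) + 1)
            cdf = np.clip(cdf, 1e-6, 1 - 1e-6)   # keep the probit finite
            return norm.ppf(cdf)

        clim = np.random.default_rng(5).gamma(0.5, 2.0, size=10000)  # skewed climatology
        obs = np.array([0.0, 0.1, 1.0, 5.0])
        z = gaussian_transform(obs, clim)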

  8. Sequential stochastic optimization

    CERN Document Server

    Cairoli, Renzo

    1996-01-01

    Sequential Stochastic Optimization provides mathematicians and applied researchers with a well-developed framework in which stochastic optimization problems can be formulated and solved. Offering much material that is either new or has never before appeared in book form, it lucidly presents a unified theory of optimal stopping and optimal sequential control of stochastic processes. This book has been carefully organized so that little prior knowledge of the subject is assumed; its only prerequisites are a standard graduate course in probability theory and some familiarity with discrete-paramet

  9. Diffusion Filters for Variational Data Assimilation of Sea Surface Temperature in an Intermediate Climate Model

    Directory of Open Access Journals (Sweden)

    Xuefeng Zhang

    2015-01-01

    Full Text Available Sequential, adaptive, and gradient diffusion filters are implemented into spatial multiscale three-dimensional variational data assimilation (3DVAR) as alternative schemes for modeling the background error covariance matrix, compared with the commonly used correction scale method, recursive filter method, and sequential 3DVAR. The gradient diffusion filter (GDF) is verified by a two-dimensional sea surface temperature (SST) assimilation experiment. Compared to the existing DF, the new GDF scheme shows a superior performance in the assimilation experiment due to its success in extracting the spatial multiscale information. The GDF can successfully retrieve the longwave information over the whole analysis domain and the shortwave information over data-dense regions. After that, a perfect twin data assimilation experiment framework is designed to study the effect of the GDF on the state estimation based on an intermediate coupled model. In this framework, the assimilation model is subject to "biased" initial fields from the "truth" model. While the GDF reduces the model bias in general, it can enhance the accuracy of the state estimation in regions where the observations are removed, especially in the Southern Ocean. In addition, higher forecast skill can be obtained through the better initial state fields produced by the GDF.

  10. Sequential memory: Binding dynamics

    Science.gov (United States)

    Afraimovich, Valentin; Gong, Xue; Rabinovich, Mikhail

    2015-10-01

    Temporal order memories are critical for everyday animal and human functioning. Experiments and our own experience show that the binding or association of various features of an event together, and the maintaining of multimodality events in sequential order, are the key components of any sequential memories—episodic, semantic, working, etc. We study the robustness of binding sequential dynamics based on our previously introduced model in the form of generalized Lotka-Volterra equations. In the phase space of the model, there exists a multi-dimensional binding heteroclinic network consisting of saddle equilibrium points and heteroclinic trajectories joining them. We prove here the robustness of the binding sequential dynamics, i.e., the feasibility phenomenon for coupled heteroclinic networks: for each collection of successive heteroclinic trajectories inside the unified network, there is an open set of initial points such that the trajectory going through each of them follows the prescribed collection, staying in a small neighborhood of it. We also show that the symbolic complexity function of the system restricted to this neighborhood is a polynomial of degree L - 1, where L is the number of modalities.
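
    For concreteness, a generalized Lotka-Volterra system of the kind underlying such sequential dynamics can be integrated in a few lines. The three-mode asymmetric connectivity below is a small illustrative choice assumed to produce sequential switching between saddles; it is not the paper's multi-modality binding network:

        import numpy as np

        def glv_step(a, rho, sigma, dt):
            """One Euler step of generalized Lotka-Volterra dynamics
            da_i/dt = a_i * (sigma_i - sum_j rho_ij a_j)."""
            return a + dt * a * (sigma - rho @ a)

        sigma = np.ones(3)
        rho = np.array([[1.0, 0.5, 2.0],     # asymmetric competition matrix
                        [2.0, 1.0, 0.5],
                        [0.5, 2.0, 1.0]])
        a = np.array([0.9, 0.05, 0.05])
        traj = []
        for _ in range(20000):
            a = glv_step(a, rho, sigma, 1e-3)
            traj.append(a.copy())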

  11. Sequential Dependencies in Driving

    Science.gov (United States)

    Doshi, Anup; Tran, Cuong; Wilder, Matthew H.; Mozer, Michael C.; Trivedi, Mohan M.

    2012-01-01

    The effect of recent experience on current behavior has been studied extensively in simple laboratory tasks. We explore the nature of sequential effects in the more naturalistic setting of automobile driving. Driving is a safety-critical task in which delayed response times may have severe consequences. Using a realistic driving simulator, we find…

  12. Mining compressing sequential problems

    NARCIS (Netherlands)

    Hoang, T.L.; Mörchen, F.; Fradkin, D.; Calders, T.G.K.

    2012-01-01

    Compression based pattern mining has been successfully applied to many data mining tasks. We propose an approach based on the minimum description length principle to extract sequential patterns that compress a database of sequences well. We show that mining compressing patterns is NP-Hard and

  13. Data Assimilation for Applied Meteorology

    Science.gov (United States)

    Haupt, S. E.

    2012-12-01

    Although atmospheric models provide a best estimate of the future state of the atmosphere, sensitivity to initial conditions makes it difficult to predict the precise future state. For applied problems, however, users often depend on having accurate knowledge of that future state. Improving the prediction of a particular realization of an evolving flow field requires knowledge of the current state of that field and assimilation of local observations into the model. This talk considers how dynamic assimilation can help address the concerns of users of atmospheric forecasts. First, we look at the value of assimilation for the renewable energy industry. If industry decision makers can have confidence in the wind and solar power forecasts, they can build their power allocations around the expected renewable resource, saving money for the ratepayers as well as reducing carbon emissions. We assess the value to that industry of assimilating local real-time observations into the model forecasts. The value of forecasts with assimilation is important at both short (several-hour) and medium (within two days) ranges. A second application is atmospheric transport and dispersion problems. In particular, we look at assimilation of concentration data into a prediction model. An interesting aspect of this problem is that the dynamics form a one-way coupled system, with the fluid dynamic equations affecting the concentration equation, but not vice versa. So when the observations are of the concentration, one must infer the fluid dynamics. This one-way coupled system presents a challenge: one must first infer the changes in the flow field from observations of the contaminant, then assimilate that to recover both the advecting flow and information on the subgrid processes that provide the mixing. To accomplish such assimilation requires a robust method to match the observed contaminant field to that modeled. One approach is

  14. Forced Sequence Sequential Decoding

    DEFF Research Database (Denmark)

    Jensen, Ole Riis

    In this thesis we describe a new concatenated decoding scheme based on iterations between an inner sequentially decoded convolutional code of rate R=1/4 and memory M=23, and block interleaved outer Reed-Solomon codes with non-uniform profile. With this scheme decoding with good performance is possible as low as Eb/N0=0.6 dB, which is about 1.7 dB below the signal-to-noise ratio that marks the cut-off rate for the convolutional code. This is possible since the iteration process provides the sequential decoders with side information that allows a smaller average load and minimizes the probability of computational overflow. Analytical results for the probability that the first Reed-Solomon word is decoded after C computations are presented. This is supported by simulation results that are also extended to other parameters...

  15. Sequential Power-Dependence Theory

    NARCIS (Netherlands)

    Buskens, Vincent; Rijt, Arnout van de

    2008-01-01

    Existing methods for predicting resource divisions in laboratory exchange networks do not take into account the sequential nature of the experimental setting. We extend network exchange theory by considering sequential exchange. We prove that Sequential Power-Dependence Theory—unlike

  16. Modelling sequentially scored item responses

    NARCIS (Netherlands)

    Akkermans, W.

    2000-01-01

    The sequential model can be used to describe the variable resulting from a sequential scoring process. In this paper two more item response models are investigated with respect to their suitability for sequential scoring: the partial credit model and the graded response model. The investigation is

  17. Forced Sequence Sequential Decoding

    DEFF Research Database (Denmark)

    Jensen, Ole Riis; Paaske, Erik

    1998-01-01

    We describe a new concatenated decoding scheme based on iterations between an inner sequentially decoded convolutional code of rate R=1/4 and memory M=23, and block interleaved outer Reed-Solomon (RS) codes with nonuniform profile. With this scheme decoding with good performance is possible as low as Eb/N0=0.6 dB, which is about 1.25 dB below the signal-to-noise ratio (SNR) that marks the cutoff rate for the full system. Accounting for about 0.45 dB due to the outer codes, sequential decoding takes place at about 1.7 dB below the SNR cutoff rate for the convolutional code. This is possible since the iteration process provides the sequential decoders with side information that allows a smaller average load and minimizes the probability of computational overflow. Analytical results for the probability that the first RS word is decoded after C computations are presented. These results are supported...

  18. Monte Carlo Solutions for Blind Phase Noise Estimation

    Directory of Open Access Journals (Sweden)

    Çırpan Hakan

    2009-01-01

    Full Text Available This paper investigates the use of Monte Carlo sampling methods for phase noise estimation on additive white Gaussian noise (AWGN) channels. The main contributions of the paper are (i) the development of a Monte Carlo framework for phase noise estimation, with special attention to sequential importance sampling and Rao-Blackwellization, (ii) the interpretation of existing Monte Carlo solutions within this generic framework, and (iii) the derivation of a novel phase noise estimator. Contrary to the ad hoc phase noise estimators that have been proposed in the past, the estimators considered in this paper are derived from solid probabilistic and performance-determining arguments. Computer simulations demonstrate that, on one hand, the Monte Carlo phase noise estimators outperform the existing estimators and, on the other hand, our newly proposed solution exhibits a lower complexity than the existing Monte Carlo solutions.

  19. Development of a software framework for data assimilation and its applications for streamflow forecasting in Japan

    Science.gov (United States)

    Noh, S. J.; Tachikawa, Y.; Shiiba, M.; Yorozu, K.; Kim, S.

    2012-04-01

    Data assimilation methods have received increased attention as a means to accomplish uncertainty assessment and to enhance forecasting capability in various areas. Despite their potential, software frameworks applicable to probabilistic approaches and data assimilation are still limited, because most hydrologic modeling software is based on a deterministic approach. In this study, we developed a hydrological modeling framework for sequential data assimilation, called MPI-OHyMoS. MPI-OHyMoS allows users to develop their own element models and to easily build a total simulation system model for hydrological simulations. Unlike process-based modeling frameworks, this software framework benefits from its object-oriented design to flexibly represent hydrological processes without any change to the main library. Sequential data assimilation based on particle filters is available for any hydrologic model built on MPI-OHyMoS, considering various sources of uncertainty originating from input forcing, parameters, and observations. The particle filters are a Bayesian learning process in which the propagation of all uncertainties is carried out by a suitable selection of randomly generated particles, without any assumptions about the nature of the distributions. In MPI-OHyMoS, ensemble simulations are parallelized, which can take advantage of high-performance computing (HPC) systems. We applied this software framework to short-term streamflow forecasting of several catchments in Japan using a distributed hydrologic model. Uncertainty in model parameters and in remotely sensed rainfall data such as X-band or C-band radar is estimated and mitigated in the sequential data assimilation.

  20. Assimilate partitioning during reproductive growth

    International Nuclear Information System (INIS)

    Finazzo, S.F.; Davenport, T.L.

    1987-01-01

    Leaves having various phyllotactic relationships to fruitlets were labeled for 1 hour with 10 μCi of ¹⁴CO₂. Fruitlets were also labeled. Fruitlets did fix ¹⁴CO₂. Translocation of radioactivity from the peel into the fruit occurred slowly and to a limited extent. No evidence of translocation out of the fruitlets was observed. Assimilate partitioning in avocado was strongly influenced by phyllotaxy. If a fruit and the labeled leaf had the same phyllotaxy, then greater than 95% of the radiolabel was present in this fruit. When the fruit did not have the same phyllotaxy as the labeled leaf, the radiolabel distribution was skewed, with 70% of the label going to a single adjacent position. Avocado fruitlets exhibit uniform labeling throughout a particular tissue. In avocado, assimilates preferentially move from leaves to fruits with the same phyllotaxy

  1. Ensemble Kalman Filter Assimilation of ERT Data for Numerical Modeling of Seawater Intrusion in a Laboratory Experiment

    Directory of Open Access Journals (Sweden)

    Véronique Bouzaglou

    2018-03-01

    Full Text Available Seawater intrusion in coastal aquifers is a worldwide problem exacerbated by aquifer overexploitation and climate change. To limit the deterioration of water quality caused by saline intrusion, research studies are needed to identify and assess the performance of possible countermeasures, e.g., underground barriers. Within this context, numerical models are fundamental to fully understand the process and to evaluate the effectiveness of the proposed solutions to contain the saltwater wedge; on the other hand, they are typically affected by uncertainty in hydrogeological parameters, as well as in initial and boundary conditions. Data assimilation methods such as the ensemble Kalman filter (EnKF) represent promising tools that can reduce such uncertainties. Here, we present an application of the EnKF to the numerical modeling of a laboratory experiment where seawater intrusion was reproduced in a specifically designed sandbox and continuously monitored with electrical resistivity tomography (ERT). Combining the EnKF and the SUTRA model for the simulation of density-dependent flow and transport in porous media, we assimilated the collected ERT data by means of joint and sequential assimilation approaches. In the joint approach, raw ERT data (electrical resistances) are assimilated to update both salt concentration and soil parameters, without the need for an electrical inversion. In the sequential approach, we assimilated electrical conductivities computed from a previously performed electrical inversion. Within both approaches, we suggest dual-step update strategies to minimize the effects of spurious correlations in parameter estimation. The results show that, in both cases, ERT data assimilation can reduce the uncertainty not only on the system state in terms of salt concentration, but also on the most relevant soil parameters, i.e., saturated hydraulic conductivity and longitudinal dispersivity. However, the sequential approach is more prone to

  2. Sequential decay of Reggeons

    International Nuclear Information System (INIS)

    Yoshida, Toshihiro

    1981-01-01

    Probabilities of meson production in the sequential decay of Reggeons, which are formed from the projectile and the target in hadron-hadron to Reggeon-Reggeon processes, are investigated. It is assumed that pair creation of heavy quarks and simultaneous creation of two antiquark-quark pairs are negligible. The leading-order terms with respect to the ratio of the creation probability of anti-s s pairs to that of anti-u u (anti-d d) pairs are calculated. The production cross sections in the target fragmentation region are given in terms of probabilities in the initial decay of the Reggeons and an effect of many-particle production. (author)

  3. Sequential Ensembles Tolerant to Synthetic Aperture Radar (SAR) Soil Moisture Retrieval Errors

    Directory of Open Access Journals (Sweden)

    Ju Hyoung Lee

    2016-04-01

    Full Text Available Due to complicated and poorly characterized systematic errors in satellite observations, data assimilation integrating model states with satellite observations is more complicated than data assimilation based on field measurements at a local scale. In the case of Synthetic Aperture Radar (SAR) soil moisture, the systematic errors arising from uncertainties in roughness conditions are significant and unavoidable, but current satellite bias correction methods do not resolve these problems very well. Thus, apart from the bias correction process for satellite observations, it is important to assess the inherent capability of satellite data assimilation under such sub-optimal but more realistic observational error conditions. To this end, the time-evolving sequential ensembles of the Ensemble Kalman Filter (EnKF) are compared with the stationary ensemble of the Ensemble Optimal Interpolation (EnOI) scheme, which does not evolve the ensembles over time. As a sensitivity analysis demonstrated that the SAR retrievals are more sensitive to surface roughness than to measurement errors, one aim of this study is to monitor how data assimilation alters the effects of roughness on SAR soil moisture retrievals. In the results, both data assimilation schemes provided intermediate values between SAR overestimation and model underestimation. However, under the same SAR observational error conditions, the sequential ensembles approached a calibrated model, showing the lowest Root Mean Square Error (RMSE), while the stationary ensemble converged towards the SAR observations, exhibiting the highest RMSE. Compared to stationary ensembles, sequential ensembles have a better tolerance to SAR retrieval errors. This inherent property of the EnKF suggests an operational merit as a satellite data assimilation system, given the limitations of currently available bias correction methods.
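
    The contrast between the two schemes is visible in code: an EnOI update builds its gain from a fixed set of anomalies that never evolves, whereas an EnKF recomputes anomalies from the propagated ensemble each cycle. A minimal EnOI sketch, with alpha as the scalar covariance scaling commonly used in EnOI (variable names are illustrative):

        import numpy as np

        def enoi_update(x, A_static, y, H, R, alpha=1.0):
            """EnOI analysis: a single state x is updated with a gain built from
            a stationary ensemble of anomalies A_static (n x N) that is reused,
            unchanged, at every assimilation time."""
            N = A_static.shape[1]
            Pf = alpha * (A_static @ A_static.T) / (N - 1)
            K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)
            return x + K @ (y - H @ x)

        rng = np.random.default_rng(6)
        A_static = rng.normal(size=(5, 30))           # fixed anomaly library
        x = np.zeros(5)
        x = enoi_update(x, A_static, np.array([0.4]), np.eye(1, 5), 0.1 * np.eye(1))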

  4. Sequential effects in judgements of attractiveness: the influences of face race and sex.

    Directory of Open Access Journals (Sweden)

    Robin S S Kramer

    Full Text Available In perceptual decision-making, a person's response on a given trial is influenced by their response on the immediately preceding trial. This sequential effect was initially demonstrated in psychophysical tasks, but has now been found in more complex, real-world judgements. The similarity of the current and previous stimuli determines the nature of the effect, with more similar items producing assimilation in judgements, while less similarity can cause a contrast effect. Previous research found assimilation in ratings of facial attractiveness, and here, we investigated whether this effect is influenced by the social categories of the faces presented. Over three experiments, participants rated the attractiveness of own-race (White) and other-race (Chinese) faces of both sexes that appeared successively. Through blocking trials by race (Experiment 1), sex (Experiment 2), or both dimensions (Experiment 3), we could examine how sequential judgements were altered by the salience of different social categories in face sequences. For sequences that varied in sex alone, own-race faces showed significantly less opposite-sex assimilation (male and female faces perceived as dissimilar), while other-race faces showed equal assimilation for opposite- and same-sex sequences (male and female faces were not differentiated). For sequences that varied in race alone, categorisation by race resulted in no opposite-race assimilation for either sex of face (White and Chinese faces perceived as dissimilar). For sequences that varied in both race and sex, same-category assimilation was significantly greater than opposite-category assimilation. Our results suggest that the race of a face represents a superordinate category relative to sex. These findings demonstrate the importance of social categories when considering sequential judgements of faces, and also highlight a novel approach for investigating how multiple social dimensions interact during decision-making.

  5. Synthetic Aperture Sequential Beamforming

    DEFF Research Database (Denmark)

    Kortbek, Jacob; Jensen, Jørgen Arendt; Gammelmark, Kim Løkke

    2008-01-01

    A synthetic aperture focusing (SAF) technique denoted Synthetic Aperture Sequential Beamforming (SASB), suitable for 2D and 3D imaging, is presented. The technique differs from prior art in SAF in the sense that SAF is performed on pre-beamformed data rather than on channel data. The objective is to improve and obtain a more range-independent lateral resolution compared to conventional dynamic receive focusing (DRF) without compromising frame rate. SASB is a two-stage procedure using two separate beamformers. First, a set of B-mode image lines using a single focal point in both transmit and receive is stored. The second stage applies the focused image lines from the first stage as input data. The SASB method has been investigated using simulations in Field II and by off-line processing of data acquired with a commercial scanner. The performance of SASB with a static image object is compared with DRF...

  6. A data assimilation tool for the Pagasitikos Gulf ecosystem dynamics: Methods and benefits

    KAUST Repository

    Korres, Gerasimos

    2012-06-01

    Within the framework of the European INSEA project, an advanced assimilation system has been implemented for the Pagasitikos Gulf ecosystem. The system is based on a multivariate sequential data assimilation scheme that combines satellite ocean color (chlorophyll-a) data with the predictions of a three-dimensional coupled physical-biochemical model of the Pagasitikos Gulf ecosystem presented in a companion paper. The hydrodynamics are solved with a very high resolution (1/100°) implementation of the Princeton Ocean Model (POM). This model is nested within a coarser-resolution model of the Aegean Sea which is part of the Greek POSEIDON forecasting system. The forecast of the Aegean Sea model, itself nested in and initialized from a Mediterranean implementation of POM, is also used to periodically re-initialize the Pagasitikos hydrodynamics model using variational initialization techniques. The ecosystem dynamics of Pagasitikos are tackled with a stand-alone implementation of the European Regional Seas Ecosystem Model (ERSEM). The assimilation scheme is based on the Singular Evolutive Extended Kalman (SEEK) filter, in which the error statistics are parameterized by means of a suitable set of Empirical Orthogonal Functions (EOFs). The assimilation experiments were performed for the year 2003 and additionally for a 9-month period over 2006, during which the physical model was forced with the POSEIDON-ETA 6-hour atmospheric fields. The assimilation system is validated by assessing the relevance of the system in fitting the data, the impact of the assimilation on non-observed biochemical processes, and the overall quality of the forecasts. Assimilation of either GlobColour (in 2003) or SeaWiFS (in 2006) chlorophyll-a data enhances the identification of the ecological state of the Pagasitikos Gulf. Results, however, suggest that subsurface ecological observations are needed to improve the controllability of the ecosystem in the deep layers. © 2011 Elsevier B.V.
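
    The EOF basis that a SEEK-type filter uses to parameterize its error statistics can be computed from a history of model states by SVD. A generic sketch with a synthetic state history, not the INSEA system's actual code:

        import numpy as np

        def leading_eofs(history, k):
            """Leading EOFs of a state history (n variables x T snapshots),
            the low-rank basis used to parameterize SEEK-type error covariance."""
            anomalies = history - history.mean(axis=1, keepdims=True)
            U, s, _ = np.linalg.svd(anomalies, full_matrices=False)
            explained = s[:k] ** 2 / np.sum(s ** 2)
            return U[:, :k], explained       # modes and their variance fractions

        rng = np.random.default_rng(7)
        history = rng.standard_normal((200, 60))   # toy 200-variable, 60-snapshot history
        L, frac = leading_eofs(history, 10)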

  7. Exploring Monte Carlo methods

    CERN Document Server

    Dunn, William L

    2012-01-01

    Exploring Monte Carlo Methods is a basic text that describes the numerical methods that have come to be known as "Monte Carlo." The book treats the subject generically through the first eight chapters and, thus, should be of use to anyone who wants to learn to use Monte Carlo. The next two chapters focus on applications in nuclear engineering, which are illustrative of uses in other fields. Five appendices are included, which provide useful information on probability distributions, general-purpose Monte Carlo codes for radiation transport, and other matters. The famous "Buffon's needle proble

  8. Monte Carlo methods

    Directory of Open Access Journals (Sweden)

    Bardenet Rémi

    2013-07-01

    Full Text Available Bayesian inference often requires integrating some function with respect to a posterior distribution. Monte Carlo methods are sampling algorithms that allow these integrals to be computed numerically when they are not analytically tractable. We review here the basic principles and the most common Monte Carlo algorithms, among which are rejection sampling, importance sampling, and Markov chain Monte Carlo (MCMC) methods. We give intuition on the theoretical justification of the algorithms as well as practical advice, trying to relate both. We discuss the application of Monte Carlo in experimental physics, and point to landmarks in the literature for the curious reader.
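
    As a small worked instance of one of the algorithms reviewed, self-normalized importance sampling for a posterior expectation; the target and proposal below are chosen purely for illustration:

        import numpy as np

        rng = np.random.default_rng(8)

        # Estimate E[x] under the unnormalized target p(x) ∝ exp(-x^4)
        # using a Gaussian proposal q = N(0, 1.5^2).
        x = 1.5 * rng.standard_normal(100000)     # draws from the proposal
        log_p = -x ** 4                           # unnormalized log target
        log_q = -0.5 * (x / 1.5) ** 2             # proposal log density (up to a constant)
        w = np.exp(log_p - log_q)
        w /= w.sum()                              # self-normalized importance weights
        est = np.sum(w * x)                       # ≈ 0 by symmetry of the target
        print(est)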

  9. Data assimilation in the hydrological dispersion module of Rodos

    International Nuclear Information System (INIS)

    Madsen, H.

    2003-01-01

    become available for assimilation into the DeMM, more precise estimates of the deposition are obtained, which will reduce the uncertainty in the wash-off modelling. The prediction uncertainty of radionuclide contamination will be further reduced when radionuclide concentration measurements in downstream water bodies become available for updating the hydrological models. For the data assimilation in the HDM, measurements of concentrations of different radionuclides in solute and on suspended sediments will be available. Based on these measurements, the hydrological modelling components can be updated. This includes updating of the three different phases of radionuclides (i) in solute, (ii) on suspended sediments, and (iii) in bottom depositions in all computational grid points of the modelled system. Since the three radionuclide phases are linked together via the sorption/desorption process descriptions in the model, the data assimilation system is able to update all three phases when only one of the phases is being measured. The data assimilation system is based on the Kalman filter. In this respect, different cost-effective Kalman filter procedures that are feasible for real-time applications are being developed and implemented. These include the reduced-rank square-root filter, in which the error covariance matrix is approximated by a matrix of lower rank using a square-root factorization; an ensemble Kalman filter based on a Monte Carlo simulation approach for propagation of errors; and a steady Kalman filter based on a fixed-error assumption. This paper provides a description of the data assimilation system that is being developed and implemented in the RODOS HDM. Test examples are presented that illustrate the use of the data assimilation procedures to improve the predictive capabilities of the one-dimensional and two-dimensional models of the RODOS HDM for prediction of radionuclide contamination of rivers and reservoirs. (author)

  10. Data assimilation with inequality constraints

    Science.gov (United States)

    Thacker, W. C.

    If values of variables in a numerical model are limited to specified ranges, these restrictions should be enforced when data are assimilated. The simplest option is to assimilate without regard for constraints and then to correct any violations without worrying about additional corrections implied by correlated errors. This paper addresses the incorporation of inequality constraints into the standard variational framework of optimal interpolation with emphasis on our limited knowledge of the underlying probability distributions. Simple examples involving only two or three variables are used to illustrate graphically how active constraints can be treated as error-free data when background errors obey a truncated multi-normal distribution. Using Lagrange multipliers, the formalism is expanded to encompass the active constraints. Two algorithms are presented, both relying on a solution ignoring the inequality constraints to discover violations to be enforced. While explicitly enforcing a subset can, via correlations, correct the others, pragmatism based on our poor knowledge of the underlying probability distributions suggests the expedient of enforcing them all explicitly to avoid the computationally expensive task of determining the minimum active set. If additional violations are encountered with these solutions, the process can be repeated. Simple examples are used to illustrate the algorithms and to examine the nature of the corrections implied by correlated errors.
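
    The algorithm described, solving first without constraints and then enforcing discovered violations as (near) error-free data so that correlations correct the remaining variables, can be sketched for a toy two-variable case. The covariances and bound are illustrative, not the paper's examples:

        import numpy as np
        from scipy.linalg import block_diag

        def oi(xb, B, y, H, R):
            K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
            return xb + K @ (y - H @ xb)

        def constrained_analysis(xb, B, y, H, R, lower):
            """Analyse, then re-analyse with every violated lower bound appended
            as nearly error-free data, so correlations adjust the other variables."""
            xa = oi(xb, B, y, H, R)
            for _ in range(len(xb)):                  # at most one pass per variable
                active = np.where(xa < lower - 1e-12)[0]
                if active.size == 0:
                    break
                Hc = np.eye(len(xb))[active]          # rows selecting violated variables
                Ha = np.vstack([H, Hc])
                ya = np.concatenate([y, lower[active]])
                Ra = block_diag(R, 1e-10 * np.eye(active.size))  # ~error-free constraints
                xa = oi(xb, B, ya, Ha, Ra)
            return xa

        # Correlated background errors; x2 must stay non-negative
        B = np.array([[1.0, 0.8], [0.8, 1.0]])
        xb = np.array([0.0, 0.1])
        H = np.array([[1.0, 0.0]]); R = np.array([[0.05]])
        xa = constrained_analysis(xb, B, np.array([-2.0]), H, R,
                                  np.array([-np.inf, 0.0]))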

  11. Hydrologic Remote Sensing and Land Surface Data Assimilation

    Directory of Open Access Journals (Sweden)

    Hamid Moradkhani

    2008-05-01

    Full Text Available Accurate, reliable and skillful forecasting of key environmental variables such as soil moisture and snow is of paramount importance due to their strong influence on many water resources applications, including flood control, agricultural production, and effective water resources management, which collectively control the behavior of the climate system. Soil moisture is a key state variable in land surface-atmosphere interactions, affecting surface energy fluxes, runoff, and the radiation balance. Snow processes also have a large influence on land-atmosphere energy exchanges due to snow's high albedo, low thermal conductivity, and considerable spatial and temporal variability, resulting in dramatic changes in surface and ground temperature. Measurement of these two variables is possible through a variety of ground-based and remote sensing procedures. Remote sensing, however, holds great promise for soil moisture and snow measurements, which have considerable spatial and temporal variability. Merging these measurements with hydrologic model outputs in a systematic and effective way results in an improvement of land surface model prediction. Data assimilation provides a mechanism to combine these two sources of estimation. Much success has been attained in recent years in using data from passive microwave sensors and assimilating them into models. This paper provides an overview of remote sensing measurement techniques for soil moisture and snow data and describes advances in data assimilation techniques through ensemble filtering, mainly the Ensemble Kalman filter (EnKF) and the particle filter (PF), for improving model prediction and reducing the uncertainties involved in the prediction process. It is believed that the PF provides a complete representation of the probability distribution of the state variables of interest (according to the sequential Bayes law) and could be a strong alternative to the EnKF, which is subject to some

  12. Hydrologic Remote Sensing and Land Surface Data Assimilation.

    Science.gov (United States)

    Moradkhani, Hamid

    2008-05-06

    Accurate, reliable and skillful forecasting of key environmental variables such as soil moisture and snow is of paramount importance due to their strong influence on many water resources applications, including flood control, agricultural production, and effective water resources management, which collectively control the behavior of the climate system. Soil moisture is a key state variable in land surface-atmosphere interactions, affecting surface energy fluxes, runoff, and the radiation balance. Snow processes also have a large influence on land-atmosphere energy exchanges due to snow's high albedo, low thermal conductivity, and considerable spatial and temporal variability, resulting in dramatic changes in surface and ground temperature. Measurement of these two variables is possible through a variety of ground-based and remote sensing procedures. Remote sensing, however, holds great promise for soil moisture and snow measurements, which have considerable spatial and temporal variability. Merging these measurements with hydrologic model outputs in a systematic and effective way results in an improvement of land surface model prediction. Data assimilation provides a mechanism to combine these two sources of estimation. Much success has been attained in recent years in using data from passive microwave sensors and assimilating them into models. This paper provides an overview of remote sensing measurement techniques for soil moisture and snow data and describes advances in data assimilation techniques through ensemble filtering, mainly the Ensemble Kalman filter (EnKF) and the particle filter (PF), for improving model prediction and reducing the uncertainties involved in the prediction process. It is believed that the PF provides a complete representation of the probability distribution of state variables of interest (according to the sequential Bayes law) and could be a strong alternative to the EnKF, which is subject to some limitations, including the linear

  13. Monte Carlo: Basics

    OpenAIRE

    Murthy, K. P. N.

    2001-01-01

    An introduction to the basics of Monte Carlo is given. The topics covered include sample space, events, probabilities, random variables, mean, variance, covariance, characteristic functions, the Chebyshev inequality, the law of large numbers, the central limit theorem (stable distributions, the Lévy distribution), random numbers (generation and testing), random sampling techniques (inversion, rejection, sampling from a Gaussian, Metropolis sampling), analogue Monte Carlo and importance sampling (exponential b...
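
    One of the random sampling techniques listed, inversion, takes only a couple of lines: if U is uniform on (0,1) and F is a CDF, then F⁻¹(U) is distributed according to F. A worked example for an exponential distribution:

        import numpy as np

        rng = np.random.default_rng(9)

        # Inversion sampling: for an exponential with rate lam,
        # F^{-1}(u) = -ln(1 - u) / lam.
        lam = 2.0
        u = rng.uniform(size=100000)
        x = -np.log1p(-u) / lam     # exponential(rate=2) samples
        print(x.mean())              # ≈ 1 / lam = 0.5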

  14. Covariance Function for Nearshore Wave Assimilation Systems

    Science.gov (United States)

    2018-01-30

    which is applicable for any spectral wave model. The four-dimensional variational (4DVar) assimilation methods are based on the mathematical ... covariance can be modeled by a parameterized Gaussian function; for nearshore wave assimilation applications, the covariance function depends primarily on ...

  15. Adaptive sequential controller

    Energy Technology Data Exchange (ETDEWEB)

    El-Sharkawi, Mohamed A. (Renton, WA); Xing, Jian (Seattle, WA); Butler, Nicholas G. (Newberg, OR); Rodriguez, Alonso (Pasadena, CA)

    1994-01-01

    An adaptive sequential controller (50/50') for controlling a circuit breaker (52) or other switching device to substantially eliminate transients on a distribution line caused by closing and opening the circuit breaker. The device adaptively compensates for changes in the response time of the circuit breaker due to aging and environmental effects. A potential transformer (70) provides a reference signal corresponding to the zero crossing of the voltage waveform, and a phase shift comparator circuit (96) compares the reference signal to the time at which any transient was produced when the circuit breaker closed, producing a signal indicative of the adaptive adjustment that should be made. Similarly, in controlling the opening of the circuit breaker, a current transformer (88) provides a reference signal that is compared against the time at which any transient is detected when the circuit breaker last opened. An adaptive adjustment circuit (102) produces a compensation time that is appropriately modified to account for changes in the circuit breaker response, including the effect of ambient conditions and aging. When next opened or closed, the circuit breaker is activated at an appropriately compensated time, so that it closes when the voltage crosses zero and opens when the current crosses zero, minimizing any transients on the distribution line. Phase angle can be used to control the opening of the circuit breaker relative to the reference signal provided by the potential transformer.

  16. Adaptive sequential controller

    Science.gov (United States)

    El-Sharkawi, Mohamed A.; Xing, Jian; Butler, Nicholas G.; Rodriguez, Alonso

    1994-01-01

    An adaptive sequential controller (50/50') for controlling a circuit breaker (52) or other switching device to substantially eliminate transients on a distribution line caused by closing and opening the circuit breaker. The device adaptively compensates for changes in the response time of the circuit breaker due to aging and environmental effects. A potential transformer (70) provides a reference signal corresponding to the zero crossing of the voltage waveform, and a phase shift comparator circuit (96) compares the reference signal to the time at which any transient was produced when the circuit breaker closed, producing a signal indicative of the adaptive adjustment that should be made. Similarly, in controlling the opening of the circuit breaker, a current transformer (88) provides a reference signal that is compared against the time at which any transient is detected when the circuit breaker last opened. An adaptive adjustment circuit (102) produces a compensation time that is appropriately modified to account for changes in the circuit breaker response, including the effect of ambient conditions and aging. When next opened or closed, the circuit breaker is activated at an appropriately compensated time, so that it closes when the voltage crosses zero and opens when the current crosses zero, minimizing any transients on the distribution line. Phase angle can be used to control the opening of the circuit breaker relative to the reference signal provided by the potential transformer.

  17. Data assimilation a mathematical introduction

    CERN Document Server

    Law, Kody; Zygalakis, Konstantinos

    2015-01-01

    This book provides a systematic treatment of the mathematical underpinnings of work in data assimilation, covering both theoretical and computational approaches. Specifically the authors develop a unified mathematical framework in which a Bayesian formulation of the problem provides the bedrock for the derivation, development and analysis of algorithms; the many examples used in the text, together with the algorithms which are introduced and discussed, are all illustrated by the MATLAB software detailed in the book and made freely available online. The book is organized into nine chapters: the first contains a brief introduction to the mathematical tools around which the material is organized; the next four are concerned with discrete time dynamical systems and discrete time data; the last four are concerned with continuous time dynamical systems and continuous time data and are organized analogously to the corresponding discrete time chapters. This book is aimed at mathematical researchers interested in a sy...

  18. Non-Saccharomyces Yeasts Nitrogen Source Preferences: Impact on Sequential Fermentation and Wine Volatile Compounds Profile

    Science.gov (United States)

    Gobert, Antoine; Tourdot-Maréchal, Raphaëlle; Morge, Christophe; Sparrow, Céline; Liu, Youzhong; Quintanilla-Casas, Beatriz; Vichi, Stefania; Alexandre, Hervé

    2017-01-01

    Nitrogen sources in the must are important for yeast metabolism, growth, and performance, and wine volatile compounds profile. Yeast assimilable nitrogen (YAN) deficiencies in grape must are one of the main causes of stuck and sluggish fermentation. The nitrogen requirement of Saccharomyces cerevisiae metabolism has been described in detail. However, the YAN preferences of non-Saccharomyces yeasts remain unknown despite their increasingly widespread use in winemaking. Furthermore, the impact of nitrogen consumption by non-Saccharomyces yeasts on YAN availability, alcoholic performance and volatile compounds production by S. cerevisiae in sequential fermentation has been little studied. With a view to improving the use of non-Saccharomyces yeasts in winemaking, we studied the use of amino acids and ammonium by three strains of non-Saccharomyces yeasts (Starmerella bacillaris, Metschnikowia pulcherrima, and Pichia membranifaciens) in grape juice. We first determined which nitrogen sources were preferentially used by these yeasts in pure cultures at 28 and 20°C (because few data are available). We then carried out sequential fermentations at 20°C with S. cerevisiae, to assess the impact of the non-Saccharomyces yeasts on the availability of assimilable nitrogen for S. cerevisiae. Finally, 22 volatile compounds were quantified in sequential fermentation and their levels compared with those in pure cultures of S. cerevisiae. We report here, for the first time, that non-Saccharomyces yeasts have specific amino-acid consumption profiles. Histidine, methionine, threonine, and tyrosine were not consumed by S. bacillaris, aspartic acid was assimilated very slowly by M. pulcherrima, and glutamine was not assimilated by P. membranifaciens. By contrast, cysteine appeared to be a preferred nitrogen source for all non-Saccharomyces yeasts. In sequential fermentation, these specific profiles of amino-acid consumption by non-Saccharomyces yeasts may account for some of the

  19. Non-Saccharomyces Yeasts Nitrogen Source Preferences: Impact on Sequential Fermentation and Wine Volatile Compounds Profile

    Directory of Open Access Journals (Sweden)

    Antoine Gobert

    2017-11-01

    Full Text Available Nitrogen sources in the must are important for yeast metabolism, growth, and performance, and wine volatile compounds profile. Yeast assimilable nitrogen (YAN) deficiencies in grape must are one of the main causes of stuck and sluggish fermentation. The nitrogen requirement of Saccharomyces cerevisiae metabolism has been described in detail. However, the YAN preferences of non-Saccharomyces yeasts remain unknown despite their increasingly widespread use in winemaking. Furthermore, the impact of nitrogen consumption by non-Saccharomyces yeasts on YAN availability, alcoholic performance and volatile compounds production by S. cerevisiae in sequential fermentation has been little studied. With a view to improving the use of non-Saccharomyces yeasts in winemaking, we studied the use of amino acids and ammonium by three strains of non-Saccharomyces yeasts (Starmerella bacillaris, Metschnikowia pulcherrima, and Pichia membranifaciens) in grape juice. We first determined which nitrogen sources were preferentially used by these yeasts in pure cultures at 28 and 20°C (because few data are available). We then carried out sequential fermentations at 20°C with S. cerevisiae, to assess the impact of the non-Saccharomyces yeasts on the availability of assimilable nitrogen for S. cerevisiae. Finally, 22 volatile compounds were quantified in sequential fermentation and their levels compared with those in pure cultures of S. cerevisiae. We report here, for the first time, that non-Saccharomyces yeasts have specific amino-acid consumption profiles. Histidine, methionine, threonine, and tyrosine were not consumed by S. bacillaris, aspartic acid was assimilated very slowly by M. pulcherrima, and glutamine was not assimilated by P. membranifaciens. By contrast, cysteine appeared to be a preferred nitrogen source for all non-Saccharomyces yeasts. In sequential fermentation, these specific profiles of amino-acid consumption by non-Saccharomyces yeasts may account for

  20. Effective assimilation of global precipitation: simulation experiments

    Directory of Open Access Journals (Sweden)

    Guo-Yuan Lien

    2013-07-01

    Full Text Available Past attempts to assimilate precipitation by nudging or variational methods have succeeded in forcing the model precipitation to be close to the observed values. However, the model forecasts tend to lose their additional skill after a few forecast hours. In this study, a local ensemble transform Kalman filter (LETKF) is used to effectively assimilate precipitation by allowing ensemble members with better precipitation to receive higher weights in the analysis. In addition, two other changes in the precipitation assimilation process are found to alleviate the problems related to the non-Gaussianity of the precipitation variable: (a) transform the precipitation variable into a Gaussian distribution based on its climatological distribution (an approach that could also be used in the assimilation of other non-Gaussian observations) and (b) only assimilate precipitation at the location where at least some ensemble members have precipitation. Unlike many current approaches, both positive and zero rain observations are assimilated effectively. Observing system simulation experiments (OSSEs) are conducted using the Simplified Parametrisations, primitivE-Equation DYnamics (SPEEDY) model, a simplified but realistic general circulation model. When uniformly and globally distributed observations of precipitation are assimilated in addition to rawinsonde observations, both the analyses and the medium-range forecasts of all model variables, including precipitation, are significantly improved as compared to only assimilating rawinsonde observations. The effect of precipitation assimilation on the analyses is retained on the medium-range forecasts and is larger in the Southern Hemisphere (SH) than that in the Northern Hemisphere (NH) because the NH analyses are already made more accurate by the denser rawinsonde stations. These improvements are much reduced when only the moisture field is modified by the precipitation observations. Both the Gaussian transformation and
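
    The climatology-based transform in (a) is a Gaussian anamorphosis: each precipitation value is passed through its empirical climatological CDF and then through the inverse of the standard normal CDF. A minimal sketch follows; the function names and the clipping convention are assumptions for illustration, not taken from the paper.

      import numpy as np
      from scipy.stats import norm

      def gaussian_transform(values, climatology):
          """Map precipitation to standard-normal space via the empirical
          CDF of a climatological sample (illustrative sketch)."""
          clim = np.sort(np.asarray(climatology))
          n = clim.size
          # Empirical CDF, kept strictly inside (0, 1) so norm.ppf is finite.
          ranks = np.searchsorted(clim, values, side="right")
          cdf = np.clip(ranks / (n + 1.0), 1.0 / (n + 1.0), n / (n + 1.0))
          return norm.ppf(cdf)

      # Example with a synthetic gamma-distributed climatology.
      rng = np.random.default_rng(0)
      climatology = rng.gamma(shape=0.5, scale=5.0, size=10000)
      print(gaussian_transform(np.array([0.0, 1.0, 20.0]), climatology))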

  1. Quantum Inequalities and Sequential Measurements

    International Nuclear Information System (INIS)

    Candelpergher, B.; Grandouz, T.; Rubinx, J.L.

    2011-01-01

    In this article, the peculiar context of sequential measurements is chosen in order to analyze the quantum specificity in the two most famous examples of Heisenberg and Bell inequalities: Results are found at some interesting variance with customary textbook materials, where the context of initial state re-initialization is described. A key-point of the analysis is the possibility of defining Joint Probability Distributions for sequential random variables associated to quantum operators. Within the sequential context, it is shown that Joint Probability Distributions can be defined in situations where not all of the quantum operators (corresponding to random variables) do commute two by two. (authors)

  2. MORSE Monte Carlo code

    International Nuclear Information System (INIS)

    Cramer, S.N.

    1984-01-01

    The MORSE code is a large general-use multigroup Monte Carlo code system. Although no claims can be made regarding its superiority in either theoretical details or Monte Carlo techniques, MORSE has been, since its inception at ORNL in the late 1960s, the most widely used Monte Carlo radiation transport code. The principal reason for this popularity is that MORSE is relatively easy to use, independent of any installation or distribution center, and it can be easily customized to fit almost any specific need. Features of the MORSE code are described

  3. Data assimilation of citizen collected information for real-time flood hazard mapping

    Science.gov (United States)

    Sayama, T.; Takara, K. T.

    2017-12-01

    Many studies in data assimilation in hydrology have focused on the integration of satellite remote sensing and in-situ monitoring data into hydrologic or land surface models. For flood predictions also, recent studies have demonstrated the assimilation of remotely sensed inundation information with flood inundation models. In actual flood disaster situations, citizen-collected information, including local reports by residents and rescue teams and, more recently, tweets via social media, also contains valuable information. The main interest of this study is how to effectively use such citizen-collected information for real-time flood hazard mapping. Here we propose a new data assimilation technique based on pre-conducted ensemble inundation simulations that updates inundation depth distributions sequentially as local data become available. The proposed method consists of two steps. The first step is a weighted average of preliminary ensemble simulations, whose weights are updated by a Bayesian approach. The second step is an optimal interpolation, where the covariance matrix is calculated from the ensemble simulations. The proposed method was applied to case studies including an actual flood event. Two situations are considered: a more idealized one, which assumes that continuous flood inundation depth information is available at multiple locations, and a more realistic one for such a severe flood disaster, which assumes that only uncertain and non-continuous information is available for assimilation. The results show that, in the idealized situation, the large-scale inundation during the flood was estimated reasonably well, with an effective reduction in RMSE. The applications of the proposed data assimilation method demonstrate its high potential for assimilating citizen-collected information for real-time flood hazard mapping in the future.
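
    The first of the two steps lends itself to a compact illustration: ensemble weights are updated by Bayes' rule from the likelihood of a reported depth, and the analysis is the weighted average of the pre-computed inundation maps. The toy sketch below assumes a Gaussian likelihood; all names and values are illustrative.

      import numpy as np

      def bayes_update_weights(weights, ens_depths_at_site, reported_depth, sigma=0.3):
          """Update ensemble weights with one citizen report (sketch).
          ens_depths_at_site: simulated depth at the report site, one per member."""
          lik = np.exp(-0.5 * ((ens_depths_at_site - reported_depth) / sigma) ** 2)
          w = weights * lik
          return w / w.sum()

      # Weighted-average inundation map from pre-conducted ensemble runs.
      rng = np.random.default_rng(0)
      n_ens, ny, nx = 50, 100, 100
      ensemble_maps = rng.random((n_ens, ny, nx))       # stand-in simulations
      weights = np.full(n_ens, 1.0 / n_ens)
      weights = bayes_update_weights(weights, ensemble_maps[:, 42, 17],
                                     reported_depth=0.8)
      analysis_map = np.tensordot(weights, ensemble_maps, axes=1)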

  4. Skill Assessment in Ocean Biological Data Assimilation

    Science.gov (United States)

    Gregg, Watson W.; Friedrichs, Marjorie A. M.; Robinson, Allan R.; Rose, Kenneth A.; Schlitzer, Reiner; Thompson, Keith R.; Doney, Scott C.

    2008-01-01

    There is growing recognition that rigorous skill assessment is required to understand the ability of ocean biological models to represent ocean processes and distributions. Statistical analysis of model results with observations represents the most quantitative form of skill assessment, and this principle serves as well for data assimilation models. However, skill assessment for data assimilation requires special consideration. This is because there are three sets of information: the free-run model, the data, and the assimilation model, which uses information from both the free-run model and the data. Intercomparison of results among the three sets of information is important and useful for assessment, but is not conclusive, since the three information sets are intertwined. An independent data set is necessary for an objective determination. Other useful measures of ocean biological data assimilation assessment include responses of unassimilated variables to the data assimilation, performance outside the prescribed region/time of interest, forecasting, and trend analysis. Examples of each approach from the literature are provided. A comprehensive list of ocean biological data assimilation efforts and their applications of skill assessment, in both ecosystem/biogeochemical and fisheries contexts, is provided.

  5. Framework for sequential approximate optimization

    NARCIS (Netherlands)

    Jacobs, J.H.; Etman, L.F.P.; Keulen, van F.; Rooda, J.E.

    2004-01-01

    An object-oriented framework for Sequential Approximate Optimization (SAO) is proposed. The framework aims to provide an open environment for the specification and implementation of SAO strategies. The framework is based on the Python programming language and contains a toolbox of Python

  6. Methodological Developments in Geophysical Assimilation Modeling

    Science.gov (United States)

    Christakos, George

    2005-06-01

    This work presents recent methodological developments in geophysical assimilation research. We revisit the meaning of the term "solution" of a mathematical model representing a geophysical system, and we examine its operational formulations. We argue that an assimilation solution based on epistemic cognition (which assumes that the model describes incomplete knowledge about nature and focuses on conceptual mechanisms of scientific thinking) could lead to more realistic representations of the geophysical situation than a conventional ontologic assimilation solution (which assumes that the model describes nature as is and focuses on form manipulations). Conceptually, the two approaches are fundamentally different. Unlike the reasoning structure of conventional assimilation modeling that is based mainly on ad hoc technical schemes, the epistemic cognition approach is based on teleologic criteria and stochastic adaptation principles. In this way some key ideas are introduced that could open new areas of geophysical assimilation to detailed understanding in an integrated manner. A knowledge synthesis framework can provide the rational means for assimilating a variety of knowledge bases (general and site specific) that are relevant to the geophysical system of interest. Epistemic cognition-based assimilation techniques can produce a realistic representation of the geophysical system, provide a rigorous assessment of the uncertainty sources, and generate informative predictions across space-time. The mathematics of epistemic assimilation involves a powerful and versatile spatiotemporal random field theory that imposes no restriction on the shape of the probability distributions or the form of the predictors (non-Gaussian distributions, multiple-point statistics, and nonlinear models are automatically incorporated) and accounts rigorously for the uncertainty features of the geophysical system. In the epistemic cognition context the assimilation concept may be used to

  7. Sequentially pulsed traveling wave accelerator

    Science.gov (United States)

    Caporaso, George J [Livermore, CA; Nelson, Scott D [Patterson, CA; Poole, Brian R [Tracy, CA

    2009-08-18

    A sequentially pulsed traveling wave compact accelerator having two or more pulse forming lines each with a switch for producing a short acceleration pulse along a short length of a beam tube, and a trigger mechanism for sequentially triggering the switches so that a traveling axial electric field is produced along the beam tube in synchronism with an axially traversing pulsed beam of charged particles to serially impart energy to the particle beam.

  8. Variational Monte Carlo Technique

    Indian Academy of Sciences (India)

    Variational Monte Carlo Technique: Ground State Energies of Quantum Mechanical Systems. Sukanta Deb. General Article, Volume 19, Issue 8, August 2014, pp. 713-739.

  9. Implicit particle filtering for equations with partial noise and application to geomagnetic data assimilation

    Science.gov (United States)

    Morzfeld, M.; Atkins, E.; Chorin, A. J.

    2011-12-01

    The task in data assimilation is to identify the state of a system from an uncertain model supplemented by a stream of incomplete and noisy data. The model is typically given in the form of a discretization of an Ito stochastic differential equation (SDE), x(n+1) = R(x(n)) + G W(n), where x is an m-dimensional vector and n=0,1,2,.... The m-dimensional vector function R and the m x m matrix G depend on the SDE as well as on the discretization scheme, and W is an m-dimensional vector whose elements are independent standard normal variates. The data are y(n) = h(x(n)) + Q V(n), where h is a k-dimensional vector function, Q is a k x k matrix and V is a vector whose components are independent standard normal variates. One can use statistics of the conditional probability density (pdf) of the state given the observations, p(n+1) = p(x(n+1)|y(1), ... , y(n+1)), to identify the state x(n+1). Particle filters approximate p(n+1) by sequential Monte Carlo and rely on the recursive formulation of the target pdf, p(n+1) ∝ p(x(n+1)|x(n)) p(y(n+1)|x(n+1)). The pdf p(x(n+1)|x(n)) can be read off the model equations to be a Gaussian with mean R(x(n)) and covariance matrix Σ = GG^T, where T denotes the transpose; the pdf p(y(n+1)|x(n+1)) is a Gaussian with mean h(x(n+1)) and covariance QQ^T. In a sampling-importance-resampling (SIR) filter, one samples new values for the particles from a prior pdf and then weights these samples with weights determined by the observations, to yield an approximation to p(n+1). Such weighting schemes often yield small weights for many of the particles. Implicit particle filtering overcomes this problem by using the observations to generate the particles, thus focusing attention on regions of large probability. A suitable algebraic equation that depends on the model and the observations is constructed for each particle, and its solution yields high-probability samples of p(n+1). In the current formulation of the implicit particle filter, the state
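
    The SIR baseline that implicit particle filtering improves upon can be written compactly for the model above. In the sketch below, R, h, G and Q are simple stand-ins rather than the geomagnetic model of the paper.

      import numpy as np

      rng = np.random.default_rng(0)

      def sir_step(particles, y, R, h, G, Q):
          """One sampling-importance-resampling step (illustrative sketch).
          particles: m x N array, one column per particle."""
          m, N = particles.shape
          # Sample from the prior p(x(n+1)|x(n)): propagate and add G W noise.
          particles = R(particles) + G @ rng.standard_normal((m, N))
          # Weight by the likelihood p(y|x), Gaussian with covariance QQ^T.
          innov = y[:, None] - h(particles)
          Qinv = np.linalg.inv(Q @ Q.T)
          logw = -0.5 * np.einsum('ik,ij,jk->k', innov, Qinv, innov)
          w = np.exp(logw - logw.max())
          w /= w.sum()
          # Resample to combat the small-weights degeneracy noted above.
          idx = rng.choice(N, size=N, p=w)
          return particles[:, idx]

      # One-dimensional stand-in example.
      R = lambda x: 0.9 * x
      h = lambda x: x
      G = np.array([[0.5]])
      Q = np.array([[0.2]])
      p = sir_step(rng.standard_normal((1, 500)), np.array([1.0]), R, h, G, Q)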

  10. Impact of an observational time window on coupled data assimilation: simulation with a simple climate model

    Directory of Open Access Journals (Sweden)

    Y. Zhao

    2017-11-01

    Full Text Available Climate signals are the results of interactions of multiple timescale media such as the atmosphere and ocean in the coupled earth system. Coupled data assimilation (CDA) pursues balanced and coherent climate analysis and prediction initialization by incorporating observations from multiple media into a coupled model. In practice, an observational time window (OTW) is usually used to collect measured data for an assimilation cycle to increase observational samples that are sequentially assimilated with their original error scales. Given different timescales of characteristic variability in different media, what are the optimal OTWs for the coupled media so that climate signals can be most accurately recovered by CDA? With a simple coupled model that simulates typical scale interactions in the climate system and twin CDA experiments, we address this issue here. Results show that in each coupled medium, an optimal OTW can provide maximal observational information that best fits the characteristic variability of the medium during the data blending process. Maintaining correct scale interactions, the resulting CDA improves the analysis of climate signals greatly. These simple model results provide a guideline for when the real observations are assimilated into a coupled general circulation model for improving climate analysis and prediction initialization by accurately recovering important characteristic variability such as sub-diurnal in the atmosphere and diurnal in the ocean.

  11. The importance of peers: assimilation patterns among second-generation Turkish immigrants in Western Europe

    NARCIS (Netherlands)

    Ali, S.; Fokkema, C.M.

    2011-01-01

    The two dominant approaches to immigrant assimilation, segmented assimilation and "new" assimilation theories, have been successful at reporting and analyzing between-group differences in assimilation patterns. However, studies of assimilation generally do not address differences at the individual

  12. On Non-Asymptotic Optimal Stopping Criteria in Monte Carlo Simulations

    KAUST Repository

    Bayer, Christian; Hoel, Hakon; von Schwerin, Erik; Tempone, Raul

    2014-01-01

    We consider the setting of estimating the mean of a random variable by a sequential stopping rule Monte Carlo (MC) method. The performance of a typical second-moment-based sequential stopping rule MC method is shown to be unreliable in such settings, both by numerical examples and through analysis. By analysis and approximations, we construct a higher-moment-based stopping rule, which is shown in numerical examples to perform more reliably and only slightly less efficiently than the second-moment-based stopping rule.
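
    The second-moment-based rule under scrutiny can be sketched as follows: sampling continues until the confidence-interval half-width estimated from the sample standard deviation drops below a tolerance. Constants and names are illustrative.

      import numpy as np

      def sequential_mc_mean(sample, tol=0.01, z=1.96, batch=1000, max_n=10**7):
          """Second-moment-based sequential stopping rule MC (sketch).
          sample(n) returns n i.i.d. draws of the random variable."""
          xs = sample(batch)
          while xs.size < max_n:
              half_width = z * xs.std(ddof=1) / np.sqrt(xs.size)
              if half_width < tol:      # stop once the estimated CI is tight
                  break
              xs = np.concatenate([xs, sample(batch)])
          return xs.mean(), xs.size

      rng = np.random.default_rng(1)
      print(sequential_mc_mean(lambda n: rng.exponential(size=n)))

    As the abstract notes, this kind of rule can stop too early precisely when the sample variance itself is badly estimated, which is what motivates the higher-moment variant.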

  13. Assimilative and non-assimilative color spreading in the watercolor configuration

    Directory of Open Access Journals (Sweden)

    Eiji Kimura

    2014-09-01

    Full Text Available A colored line flanking a darker contour will appear to spread its color onto an area enclosed by the line (watercolor effect). The watercolor effect has been characterized as an assimilative effect, but non-assimilative color spreading has also been demonstrated in the same spatial configuration; e.g., when a black inner contour (IC) is paired with a blue outer contour (OC), yellow color spreading can be observed. To elucidate visual mechanisms underlying these different color spreading effects, this study investigated the effects of luminance ratio between the double contours on the induced color by systematically manipulating the IC and OC luminances (Experiment 1) as well as the background luminance (Experiment 2). The results showed that the luminance conditions suitable for assimilative and non-assimilative color spreading were nearly opposite. When the Weber contrast of the IC to the background luminances (IC contrast) was smaller than that of the OC (OC contrast), the induced color became similar to the IC color (assimilative spreading). In contrast, when the OC contrast was smaller than or equal to the IC contrast, the induced color became yellow (non-assimilative spreading). Extending these findings, Experiment 3 showed that bilateral color spreading, e.g., assimilative spreading on one side and non-assimilative spreading on the other side, can also be observed in the watercolor configuration. These results suggest that the assimilative and non-assimilative spreading were mediated by different visual mechanisms. The properties of the assimilative spreading are consistent with the model proposed to account for neon color spreading [Grossberg, S. & Mingolla, E. (1985) Percept. Psychophys., 38, 141-171] and extended for the watercolor effect [Pinna, B., & Grossberg, S. (2005) J. Opt. Soc. Am. A, 22, 2207-2221]. However, the present results suggest that additional mechanisms are needed to account for the non-assimilative color spreading.

  14. Assimilative and non-assimilative color spreading in the watercolor configuration.

    Science.gov (United States)

    Kimura, Eiji; Kuroki, Mikako

    2014-01-01

    A colored line flanking a darker contour will appear to spread its color onto an area enclosed by the line (watercolor effect). The watercolor effect has been characterized as an assimilative effect, but non-assimilative color spreading has also been demonstrated in the same spatial configuration; e.g., when a black inner contour (IC) is paired with a blue outer contour (OC), yellow color spreading can be observed. To elucidate visual mechanisms underlying these different color spreading effects, this study investigated the effects of luminance ratio between the double contours on the induced color by systematically manipulating the IC and the OC luminance (Experiment 1) as well as the background luminance (Experiment 2). The results showed that the luminance conditions suitable for assimilative and non-assimilative color spreading were nearly opposite. When the Weber contrast of the IC to the background luminance (IC contrast) was smaller in size than that of the OC (OC contrast), the induced color became similar to the IC color (assimilative spreading). In contrast, when the OC contrast was smaller than or equal to the IC contrast, the induced color became yellow (non-assimilative spreading). Extending these findings, Experiment 3 showed that bilateral color spreading, i.e., assimilative spreading on one side and non-assimilative spreading on the other side, can also be observed in the watercolor configuration. These results suggest that the assimilative and the non-assimilative spreading were mediated by different visual mechanisms. The properties of the assimilative spreading are consistent with the model proposed to account for neon color spreading (Grossberg and Mingolla, 1985) and extended for the watercolor effect (Pinna and Grossberg, 2005). However, the present results suggest that additional mechanisms are needed to account for the non-assimilative color spreading.

  15. Error Covariance Estimation of Mesoscale Data Assimilation

    National Research Council Canada - National Science Library

    Xu, Qin

    2005-01-01

    The goal of this project is to explore and develop new methods of error covariance estimation that will provide necessary statistical descriptions of prediction and observation errors for mesoscale data assimilation...

  16. ERP ASSIMILATION: AN END-USER APPROACH

    Directory of Open Access Journals (Sweden)

    Hurbean Luminita

    2013-07-01

    The paper discusses ERP adoption based on IT assimilation theory. The ERP lifecycle is associated with the IT assimilation steps, and we propose a distribution of these steps along the lifecycle. Derived from the findings in the reviewed literature, we focus on the cultural factors, in particular those related to the end-users (determined as a major impact factor in our previous study: Negovan et al., 2011). Our empirical study is centred on the end-users' perspective and tries to determine if and how their behaviour affects the achievement of the ERP assimilation steps. The paper reasons that organizations that understand the IT assimilation steps correlated with the ERP implementation critical factors are more likely to implement and use ERP successfully.

  17. Comparison between assimilated and non-assimilated experiments of the MACCii global reanalysis near surface ozone

    Science.gov (United States)

    Tsikerdekis, Athanasios; Katragou, Eleni; Zanis, Prodromos; Melas, Dimitrios; Eskes, Henk; Flemming, Johannes; Huijnen, Vincent; Inness, Antje; Kapsomenakis, Ioannis; Schultz, Martin; Stein, Olaf; Zerefos, Christos

    2014-05-01

    In this work we evaluate near-surface ozone concentrations of the MACCii global reanalysis using measurements from the EMEP and AIRBASE databases. The eight-year-long reanalysis of atmospheric composition data covering the period 2003-2010 was constructed as part of the FP7-funded Monitoring Atmospheric Composition and Climate project by assimilating satellite data into a global model and data assimilation system (Inness et al., 2013). The study mainly focuses on the differences between the assimilated and the non-assimilated experiments and aims to identify and quantify any improvements achieved by adding data assimilation to the system. Results are analyzed in eight European sub-regions, and region-specific Taylor plots illustrate the evaluation and the overall predictive skill of each experiment. The diurnal and annual cycles of near-surface ozone are evaluated for both experiments. Furthermore, ozone exposure indices for crop growth (AOT40), human health (SOMO35) and the number of days on which 8-hour ozone averages exceeded 60 ppb and 90 ppb have been calculated for each station based on both observed and simulated data. Results indicate mostly improvement of the assimilated experiment with respect to the high near-surface ozone concentrations, the diurnal cycle and range, and the bias in comparison to the non-assimilated experiment. The limitations of the comparison between assimilated and non-assimilated experiments for near-surface ozone are also discussed.
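
    The exposure indices named here can be computed directly from an hourly ozone series. The sketch below uses the common conventions (AOT40 accumulates hourly excesses over 40 ppb during daylight hours; SOMO35 sums the positive excesses of the daily maximum 8-hour running mean over 35 ppb); treat the details as assumptions, since the record does not spell the formulas out.

      import numpy as np

      def aot40(hourly_ppb, daylight_mask):
          """AOT40: accumulated hourly ozone above 40 ppb during daylight (sketch)."""
          excess = np.clip(hourly_ppb - 40.0, 0.0, None)
          return excess[daylight_mask].sum()

      def somo35(hourly_ppb):
          """SOMO35: sum over days of the max 8-h running mean above 35 ppb
          (sketch; assumes whole days and windows that do not cross midnight)."""
          days = hourly_ppb.reshape(-1, 24)
          total = 0.0
          for day in days:
              m8 = np.convolve(day, np.ones(8) / 8.0, mode='valid').max()
              total += max(m8 - 35.0, 0.0)
          return total

      # Synthetic hourly series for one year, purely for illustration.
      hours = np.arange(24 * 365)
      o3 = 30 + 15 * np.abs(np.sin(hours * np.pi / 24))
      daylight = (hours % 24 >= 8) & (hours % 24 < 20)
      print(aot40(o3, daylight), somo35(o3))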

  18. Assimilation of Baba and Nyonya in Malaysia

    OpenAIRE

    Razaleigh Muhamat Kawangit

    2015-01-01

    This research sets out to explore the exact level of the social aspect of assimilation between the Baba and Nyonya and their Malay counterparts in Malaysia. Assimilation in the social aspect is a dilemma that the Baba and Nyonya face when they interact with Malays as the dominant ethnic group. The research suggests that during this interaction their behavior changes in line with the identity of the Malays, because the majority influences the minority in the Malaysian context. Whils...

  19. Computational methods for data evaluation and assimilation

    CERN Document Server

    Cacuci, Dan Gabriel

    2013-01-01

    Data evaluation and data combination require the use of a wide range of probability theory concepts and tools, from deductive statistics mainly concerning frequencies and sample tallies to inductive inference for assimilating non-frequency data and a priori knowledge. Computational Methods for Data Evaluation and Assimilation presents interdisciplinary methods for integrating experimental and computational information. This self-contained book shows how the methods can be applied in many scientific and engineering areas. After presenting the fundamentals underlying the evaluation of experiment

  20. Temporal Reference, Attentional Modulation, and Crossmodal Assimilation

    Directory of Open Access Journals (Sweden)

    Yingqi Wan

    2018-06-01

    Full Text Available Crossmodal assimilation effect refers to the prominent phenomenon by which ensemble mean extracted from a sequence of task-irrelevant distractor events, such as auditory intervals, assimilates/biases the perception (such as a visual interval) of the subsequent task-relevant target events in another sensory modality. In current experiments, using visual Ternus display, we examined the roles of temporal reference, materialized as the time information accumulated before the onset of target event, as well as the attentional modulation in crossmodal temporal interaction. Specifically, we examined how the global time interval, the mean auditory inter-intervals and the last interval in the auditory sequence assimilate and bias the subsequent percept of visual Ternus motion (element motion vs. group motion). We demonstrated that both the ensemble (geometric) mean and the last interval in the auditory sequence contribute to bias the percept of visual motion. Longer mean (or last) interval elicited more reports of group motion, whereas the shorter mean (or last) auditory intervals gave rise to more dominant percept of element motion. Importantly, observers have shown dynamic adaptation to the temporal reference of crossmodal assimilation: when the target visual Ternus stimuli were separated by a long gap interval after the preceding sound sequence, the assimilation effect by ensemble mean was reduced. Our findings suggested that crossmodal assimilation relies on a suitable temporal reference on adaptation level, and revealed a general temporal perceptual grouping principle underlying complex audio-visual interactions in everyday dynamic situations.

  1. A simple lightning assimilation technique for improving ...

    Science.gov (United States)

    Convective rainfall is often a large source of error in retrospective modeling applications. In particular, positive rainfall biases commonly exist during summer months due to overactive convective parameterizations. In this study, lightning assimilation was applied in the Kain-Fritsch (KF) convective scheme to improve retrospective simulations using the Weather Research and Forecasting (WRF) model. The assimilation method has a straightforward approach: force KF deep convection where lightning is observed and, optionally, suppress deep convection where lightning is absent. WRF simulations were made with and without lightning assimilation over the continental United States for July 2012, July 2013, and January 2013. The simulations were evaluated against NCEP stage-IV precipitation data and MADIS near-surface meteorological observations. In general, the use of lightning assimilation considerably improves the simulation of summertime rainfall. For example, the July 2012 monthly averaged bias of 6 h accumulated rainfall is reduced from 0.54 to 0.07 mm and the spatial correlation is increased from 0.21 to 0.43 when lightning assimilation is used. Statistical measures of near-surface meteorological variables also are improved. Consistent improvements also are seen for the July 2013 case. These results suggest that this lightning assimilation technique has the potential to substantially improve simulation of warm-season rainfall in retrospective WRF applications. The

  2. Data Assimilation: Making Sense of Earth Observation

    Directory of Open Access Journals (Sweden)

    William Albert Lahoz

    2014-05-01

    Full Text Available Climate change, air quality and environmental degradation are important societal challenges for the 21st Century. These challenges require an intelligent response from society, which in turn requires access to information about the Earth System. This information comes from observations and prior knowledge, the latter typically embodied in a model describing relationships between variables of the Earth System. Data assimilation provides an objective methodology to combine observational and model information to provide an estimate of the most likely state and its uncertainty for the whole Earth System. This approach adds value to the observations – by filling in the spatio-temporal gaps in observations; and to the model – by constraining it with the observations. In this review paper we motivate data assimilation as a methodology to fill in the gaps in observational information; illustrate the data assimilation approach with examples that span a broad range of features of the Earth System (atmosphere, including chemistry; ocean; land surface); and discuss the outlook for data assimilation, including the novel application of data assimilation ideas to observational information obtained using Citizen Science. Ultimately, a strong motivation of data assimilation is the many benefits it provides to users. These include: providing the initial state for weather and air quality forecasts; providing analyses and reanalyses for studying the Earth System; evaluating observations, instruments and models; assessing the relative value of elements of the Global Observing System (GOS); and assessing the added value of future additions to the GOS.

  3. Monte Carlo codes and Monte Carlo simulator program

    International Nuclear Information System (INIS)

    Higuchi, Kenji; Asai, Kiyoshi; Suganuma, Masayuki.

    1990-03-01

    Four typical Monte Carlo codes KENO-IV, MORSE, MCNP and VIM have been vectorized on the VP-100 at the Computing Center, JAERI. The problems in vector processing of Monte Carlo codes on vector processors have become clear through this work. As a result, it is recognized that there are difficulties in obtaining good performance in vector processing of Monte Carlo codes. A Monte Carlo computing machine, which processes Monte Carlo codes with high performance, has been under development at our Computing Center since 1987. The concept of the Monte Carlo computing machine and its performance have been investigated and estimated by using a software simulator. In this report the problems in vectorization of Monte Carlo codes, Monte Carlo pipelines proposed to mitigate these difficulties, and the results of the performance estimation of the Monte Carlo computing machine by the simulator are described. (author)

  4. Vectorized Monte Carlo

    International Nuclear Information System (INIS)

    Brown, F.B.

    1981-01-01

    Examination of the global algorithms and local kernels of conventional general-purpose Monte Carlo codes shows that multigroup Monte Carlo methods have sufficient structure to permit efficient vectorization. A structured multigroup Monte Carlo algorithm for vector computers is developed in which many particle events are treated at once on a cell-by-cell basis. Vectorization of kernels for tracking and variance reduction is described, and a new method for discrete sampling is developed to facilitate the vectorization of collision analysis. To demonstrate the potential of the new method, a vectorized Monte Carlo code for multigroup radiation transport analysis was developed. This code incorporates many features of conventional general-purpose production codes, including general geometry, splitting and Russian roulette, survival biasing, variance estimation via batching, a number of cutoffs, and generalized tallies of collision, tracklength, and surface crossing estimators with response functions. Predictions of vectorized performance characteristics for the CYBER-205 were made using emulated coding and a dynamic model of vector instruction timing. Computation rates were examined for a variety of test problems to determine sensitivities to batch size and vector lengths. Significant speedups are predicted for even a few hundred particles per batch, and asymptotic speedups by about 40 over equivalent Amdahl 470V/8 scalar codes are predicted for a few thousand particles per batch. The principal conclusion is that vectorization of a general-purpose multigroup Monte Carlo code is well worth the significant effort required for stylized coding and major algorithmic changes
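
    The event-based idea, treating a whole batch of particles in single vector operations instead of following one history at a time, is easy to illustrate with NumPy on a stand-in problem (uncollided transmission through a slab, not the multigroup code described above).

      import numpy as np

      rng = np.random.default_rng(2)

      def transmission_vectorized(n, sigma_t=1.0, thickness=2.0, batch=100_000):
          """Estimate uncollided slab transmission, one batch at a time."""
          hits = 0
          for _ in range(n // batch):
              # One vector operation samples the free paths of a whole batch.
              paths = rng.exponential(1.0 / sigma_t, size=batch)
              hits += np.count_nonzero(paths > thickness)
          return hits / n

      print(transmission_vectorized(10**6))   # compare with exp(-2) ≈ 0.1353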

  5. Remarks on sequential designs in risk assessment

    International Nuclear Information System (INIS)

    Seidenfeld, T.

    1982-01-01

    The special merits of sequential designs are reviewed in light of particular challenges that attend risk assessment for human populations. The kinds of ''statistical inference'' are distinguished, and the design problem pursued is the clash between the Neyman-Pearson and Bayesian programs of sequential design. The value of sequential designs is discussed, and Neyman-Pearson versus Bayesian sequential designs are probed in particular. Finally, warnings associated with sequential designs are considered, especially in relation to utilitarianism

  6. Sequential versus simultaneous market delineation

    DEFF Research Database (Denmark)

    Haldrup, Niels; Møllgaard, Peter; Kastberg Nielsen, Claus

    2005-01-01

    Delineation of the relevant market forms a pivotal part of most antitrust cases. The standard approach is sequential: first the product market is delineated, then the geographical market is defined, with demand and supply substitution considered in both the product dimension and the geographical dimension. Using a unique data set for prices of Norwegian and Scottish salmon, we propose a methodology for simultaneous market delineation, and we demonstrate that compared to a sequential approach conclusions will be reversed. JEL: C3, K21, L41, Q22. Keywords: relevant market, econometric delineation...

  7. Sequential logic analysis and synthesis

    CERN Document Server

    Cavanagh, Joseph

    2007-01-01

    Until now, there was no single resource for actual digital system design. Using both basic and advanced concepts, Sequential Logic: Analysis and Synthesis offers a thorough exposition of the analysis and synthesis of both synchronous and asynchronous sequential machines. With 25 years of experience in designing computing equipment, the author stresses the practical design of state machines. He clearly delineates each step of the structured and rigorous design principles that can be applied to practical applications. The book begins by reviewing the analysis of combinatorial logic and Boolean a

  8. Evaluation of Gaussian approximations for data assimilation in reservoir models

    KAUST Repository

    Iglesias, Marco A.

    2013-07-14

    The Bayesian framework is the standard approach for data assimilation in reservoir modeling. This framework involves characterizing the posterior distribution of geological parameters in terms of a given prior distribution and data from the reservoir dynamics, together with a forward model connecting the space of geological parameters to the data space. Since the posterior distribution quantifies the uncertainty in the geologic parameters of the reservoir, the characterization of the posterior is fundamental for the optimal management of reservoirs. Unfortunately, due to the large-scale highly nonlinear properties of standard reservoir models, characterizing the posterior is computationally prohibitive. Instead, more affordable ad hoc techniques, based on Gaussian approximations, are often used for characterizing the posterior distribution. Evaluating the performance of those Gaussian approximations is typically conducted by assessing their ability at reproducing the truth within the confidence interval provided by the ad hoc technique under consideration. This has the disadvantage of mixing up the approximation properties of the history matching algorithm employed with the information content of the particular observations used, making it hard to evaluate the effect of the ad hoc approximations alone. In this paper, we avoid this disadvantage by comparing the ad hoc techniques with a fully resolved state-of-the-art probing of the Bayesian posterior distribution. The ad hoc techniques whose performance we assess are based on (1) linearization around the maximum a posteriori estimate, (2) randomized maximum likelihood, and (3) ensemble Kalman filter-type methods. In order to fully resolve the posterior distribution, we implement a state-of-the art Markov chain Monte Carlo (MCMC) method that scales well with respect to the dimension of the parameter space, enabling us to study realistic forward models, in two space dimensions, at a high level of grid refinement. Our
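
    Of the three ad hoc techniques, the ensemble Kalman filter-type update is the simplest to sketch. Below is a minimal stochastic EnKF analysis step with a generic linear observation operator standing in for the reservoir forward model; it is a sketch of the technique, not the paper's implementation.

      import numpy as np

      rng = np.random.default_rng(3)

      def enkf_analysis(ens, H, y, R):
          """Stochastic EnKF update of an ensemble (columns = members); sketch."""
          m, N = ens.shape
          A = ens - ens.mean(axis=1, keepdims=True)      # forecast anomalies
          HA = H @ A
          Pf_Ht = A @ HA.T / (N - 1)                     # cross covariance
          S = HA @ HA.T / (N - 1) + R                    # innovation covariance
          K = Pf_Ht @ np.linalg.inv(S)                   # Kalman gain
          # Perturbed observations keep the analysis spread consistent.
          Y = y[:, None] + np.linalg.cholesky(R) @ rng.standard_normal((len(y), N))
          return ens + K @ (Y - H @ ens)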

  9. Evaluation Using Sequential Trials Methods.

    Science.gov (United States)

    Cohen, Mark E.; Ralls, Stephen A.

    1986-01-01

    Although dental school faculty as well as practitioners are interested in evaluating products and procedures used in clinical practice, research design and statistical analysis can sometimes pose problems. Sequential trials methods provide an analytical structure that is both easy to use and statistically valid. (Author/MLW)

  10. Attack Trees with Sequential Conjunction

    NARCIS (Netherlands)

    Jhawar, Ravi; Kordy, Barbara; Mauw, Sjouke; Radomirović, Sasa; Trujillo-Rasua, Rolando

    2015-01-01

    We provide the first formal foundation of SAND attack trees, which are a popular extension of the well-known attack trees. The SAND attack tree formalism increases the expressivity of attack trees by introducing the sequential conjunctive operator SAND. This operator enables the modeling of

  11. Markov Chain Monte Carlo

    Indian Academy of Sciences (India)

    Markov Chain Monte Carlo – Examples. Arnab Chakraborty. General Article, Volume 7, Issue 3, March 2002, pp. 25-34. Permanent link: https://www.ias.ac.in/article/fulltext/reso/007/03/0025-0034

  12. Monte Carlo and Quasi-Monte Carlo Sampling

    CERN Document Server

    Lemieux, Christiane

    2009-01-01

    Presents essential tools for using quasi-Monte Carlo sampling in practice. This book focuses on issues related to Monte Carlo methods - uniform and non-uniform random number generation, variance reduction techniques. It covers several aspects of quasi-Monte Carlo methods.

  13. Bootstrap Sequential Determination of the Co-integration Rank in VAR Models

    DEFF Research Database (Denmark)

    Guiseppe, Cavaliere; Rahbæk, Anders; Taylor, A.M. Robert

    with empirical rejection frequencies often very much in excess of the nominal level. As a consequence, bootstrap versions of these tests have been developed. To be useful, however, sequential procedures for determining the co-integrating rank based on these bootstrap tests need to be consistent, in the sense...... in the literature by proposing a bootstrap sequential algorithm which we demonstrate delivers consistent cointegration rank estimation for general I(1) processes. Finite sample Monte Carlo simulations show the proposed procedure performs well in practice....

  14. Coupled assimilation for an intermediated coupled ENSO prediction model

    Science.gov (United States)

    Zheng, Fei; Zhu, Jiang

    2010-10-01

    The value of coupled assimilation is discussed using an intermediate coupled model in which the wind stress is the only atmospheric state variable, slaved to the model sea surface temperature (SST). In the coupled assimilation analysis, based on the coupled wind-ocean state covariance calculated from the coupled state ensemble, the ocean state is adjusted by assimilating wind data using the ensemble Kalman filter. As revealed by a series of assimilation experiments using simulated observations, the coupled assimilation of wind observations yields better results than the assimilation of SST observations. Specifically, the coupled assimilation of wind observations can help to improve the accuracy of the surface and subsurface currents because the correlation between the wind and ocean currents is stronger than that between SST and ocean currents in the equatorial Pacific. Thus, the coupled assimilation of wind data can decrease the initial condition errors in the surface/subsurface currents that can significantly contribute to SST forecast errors. The value of the coupled assimilation of wind observations is further demonstrated by comparing the prediction skills of three 12-year (1997-2008) hindcast experiments initialized by the ocean-only assimilation scheme that assimilates SST observations, the coupled assimilation scheme that assimilates wind observations, and a nudging scheme that nudges the observed wind stress data, respectively. The prediction skills of the two assimilation schemes are significantly better than those of the nudging scheme. The prediction skills of assimilating wind observations are better than those of assimilating SST observations. Assimilating wind observations for the 2007/2008 La Niña event triggers better predictions, while assimilating SST observations fails to provide an early warning for that event.

  15. Model Uncertainty Quantification Methods In Data Assimilation

    Science.gov (United States)

    Pathiraja, S. D.; Marshall, L. A.; Sharma, A.; Moradkhani, H.

    2017-12-01

    Data Assimilation involves utilising observations to improve model predictions in a seamless and statistically optimal fashion. Its applications are wide-ranging, from improving weather forecasts to tracking targets such as in the Apollo 11 mission. The use of Data Assimilation methods in high-dimensional complex geophysical systems is an active area of research, where there exist many opportunities to enhance existing methodologies. One of the central challenges is model uncertainty quantification; the outcome of any Data Assimilation study is strongly dependent on the uncertainties assigned to both observations and models. I focus on developing improved model uncertainty quantification methods that are applicable to challenging real-world scenarios. These include developing methods for cases where the system states are only partially observed, where there is little prior knowledge of the model errors, and where the model error statistics are likely to be highly non-Gaussian.

  16. Data assimilation: the ensemble Kalman filter

    CERN Document Server

    Evensen, Geir

    2007-01-01

    Data Assimilation comprehensively covers data assimilation and inverse methods, including both traditional state estimation and parameter estimation. This text and reference focuses on various popular data assimilation methods, such as weak and strong constraint variational methods and ensemble filters and smoothers. It is demonstrated how the different methods can be derived from a common theoretical basis, as well as how they differ and/or are related to each other, and which properties characterize them, using several examples. Rather than emphasize a particular discipline such as oceanography or meteorology, it presents the mathematical framework and derivations in a way which is common for any discipline where dynamics is merged with measurements. The mathematics level is modest, although it requires knowledge of basic spatial statistics, Bayesian statistics, and calculus of variations. Readers will also appreciate the introduction to the mathematical methods used and detailed derivations, which should b...

  17. Scalable and balanced dynamic hybrid data assimilation

    Science.gov (United States)

    Kauranne, Tuomo; Amour, Idrissa; Gunia, Martin; Kallio, Kari; Lepistö, Ahti; Koponen, Sampsa

    2017-04-01

    Scalability of complex weather forecasting suites is dependent on the technical tools available for implementing highly parallel computational kernels, but to an equally large extent also on the dependence patterns between various components of the suite, such as observation processing, data assimilation and the forecast model. Scalability is a particular challenge for 4D variational assimilation methods that necessarily couple the forecast model into the assimilation process and subject this combination to an inherently serial quasi-Newton minimization process. Ensemble based assimilation methods are naturally more parallel, but large models force ensemble sizes to be small and that results in poor assimilation accuracy, somewhat akin to shooting with a shotgun in a million-dimensional space. The Variational Ensemble Kalman Filter (VEnKF) is an ensemble method that can attain the accuracy of 4D variational data assimilation with a small ensemble size. It achieves this by processing a Gaussian approximation of the current error covariance distribution, instead of a set of ensemble members, analogously to the Extended Kalman Filter EKF. Ensemble members are re-sampled every time a new set of observations is processed from a new approximation of that Gaussian distribution which makes VEnKF a dynamic assimilation method. After this a smoothing step is applied that turns VEnKF into a dynamic Variational Ensemble Kalman Smoother VEnKS. In this smoothing step, the same process is iterated with frequent re-sampling of the ensemble but now using past iterations as surrogate observations until the end result is a smooth and balanced model trajectory. In principle, VEnKF could suffer from similar scalability issues as 4D-Var. However, this can be avoided by isolating the forecast model completely from the minimization process by implementing the latter as a wrapper code whose only link to the model is calling for many parallel and totally independent model runs, all of them

  18. A nested sampling particle filter for nonlinear data assimilation

    KAUST Repository

    Elsheikh, Ahmed H.

    2014-04-15

    We present an efficient nonlinear data assimilation filter that combines particle filtering with the nested sampling algorithm. Particle filters (PF) utilize a set of weighted particles as a discrete representation of probability distribution functions (PDF). These particles are propagated through the system dynamics and their weights are sequentially updated based on the likelihood of the observed data. Nested sampling (NS) is an efficient sampling algorithm that iteratively builds a discrete representation of the posterior distributions by focusing a set of particles to high-likelihood regions. This would allow the representation of the posterior PDF with a smaller number of particles and reduce the effects of the curse of dimensionality. The proposed nested sampling particle filter (NSPF) iteratively builds the posterior distribution by applying a constrained sampling from the prior distribution to obtain particles in high-likelihood regions of the search space, resulting in a reduction of the number of particles required for an efficient behaviour of particle filters. Numerical experiments with the 3-dimensional Lorenz63 and the 40-dimensional Lorenz96 models show that NSPF outperforms PF in accuracy with a relatively smaller number of particles. © 2013 Royal Meteorological Society.
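
    The nested sampling ingredient of NSPF can be caricatured in a few lines: the lowest-likelihood particle is repeatedly replaced by a fresh prior draw constrained to exceed the current likelihood floor. The sketch below uses plain rejection sampling and a toy model; it illustrates the constrained-sampling idea only, not the NSPF algorithm itself.

      import numpy as np

      rng = np.random.default_rng(4)

      def nested_sampling_particles(loglik, sample_prior, n_particles=100, n_iter=200):
          """Focus particles into high-likelihood regions (illustrative sketch)."""
          xs = sample_prior(n_particles)
          ll = np.array([loglik(x) for x in xs])
          for _ in range(n_iter):
              worst = np.argmin(ll)
              floor = ll[worst]
              # Rejection-sample a prior draw above the likelihood floor.
              while True:
                  x_new = sample_prior(1)[0]
                  ll_new = loglik(x_new)
                  if ll_new > floor:
                      break
              xs[worst], ll[worst] = x_new, ll_new
          return xs

      # Toy target: Gaussian log likelihood centred at 2 under a wide prior.
      xs = nested_sampling_particles(lambda x: -0.5 * (x - 2.0) ** 2,
                                     lambda n: rng.uniform(-10, 10, size=n))
      print(xs.mean())   # drifts towards 2 as iterations accumulate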

  19. Simultaneous state-parameter estimation supports the evaluation of data assimilation performance and measurement design for soil-water-atmosphere-plant system

    Science.gov (United States)

    Hu, Shun; Shi, Liangsheng; Zha, Yuanyuan; Williams, Mathew; Lin, Lin

    2017-12-01

    Improvements to agricultural water and crop management require detailed information on crop and soil states and their evolution. Data assimilation provides an attractive way of obtaining this information by integrating measurements with a model in a sequential manner. However, data assimilation for the soil-water-atmosphere-plant (SWAP) system still lacks comprehensive exploration due to the large number of variables and parameters in the system. In this study, simultaneous state-parameter estimation using the ensemble Kalman filter (EnKF) was employed to evaluate data assimilation performance and provide advice on measurement design for the SWAP system. The results demonstrated that a proper selection of the state vector is critical to effective data assimilation. In particular, updating the development stage avoided the negative effect of the "phenological shift" caused by contrasting phenological stages in different ensemble members. The simultaneous state-parameter estimation (SSPE) assimilation strategy outperformed the updating-state-only (USO) assimilation strategy because of its ability to alleviate the inconsistency between model variables and parameters. However, the performance of the SSPE assimilation strategy could deteriorate with an increasing number of uncertain parameters as a result of soil stratification and limited knowledge of crop parameters. In addition to the most easily available surface soil moisture (SSM) and leaf area index (LAI) measurements, deep soil moisture, grain yield or other auxiliary data were required to provide sufficient constraints on parameter estimation and to assure data assimilation performance. This study provides an insight into the response of soil moisture and grain yield to data assimilation in the SWAP system and is helpful for soil moisture movement and crop growth modeling and measurement design in practice.
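
    The core of simultaneous state-parameter estimation is state augmentation: parameters are appended to the state vector so that a single ensemble update adjusts both through their cross covariance. A minimal sketch with stand-in dimensions and a single observed state component (not the SWAP model itself):

      import numpy as np

      rng = np.random.default_rng(5)

      n_ens, n_state, n_par = 40, 10, 2
      states = rng.standard_normal((n_state, n_ens))
      params = rng.standard_normal((n_par, n_ens))

      # Augment: one joint vector per member, so observing a state component
      # also updates the parameters via the ensemble cross covariance.
      Z = np.vstack([states, params])

      H = np.zeros((1, n_state + n_par)); H[0, 0] = 1.0   # observe first state
      R = np.array([[0.1]])
      y = np.array([0.5])

      A = Z - Z.mean(axis=1, keepdims=True)
      S = (H @ A) @ (H @ A).T / (n_ens - 1) + R
      K = A @ (H @ A).T / (n_ens - 1) @ np.linalg.inv(S)
      Y = y[:, None] + np.sqrt(R[0, 0]) * rng.standard_normal((1, n_ens))
      Z = Z + K @ (Y - H @ Z)
      states, params = Z[:n_state], Z[n_state:]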

  20. Multi-agent sequential hypothesis testing

    KAUST Repository

    Kim, Kwang-Ki K.; Shamma, Jeff S.

    2014-01-01

    incorporate costs of taking private/public measurements, costs of time-difference and disagreement in actions of agents, and costs of false declaration/choices in the sequential hypothesis testing. The corresponding sequential decision processes have well

  1. Discharge data assimilation in a distributed hydrologic model for flood forecasting purposes

    Science.gov (United States)

    Ercolani, G.; Castelli, F.

    2017-12-01

    Flood early warning systems benefit from accurate river flow forecasts, and data assimilation may improve their reliability. However, the actual enhancement that can be obtained in the operational practice should be investigated in detail and quantified. In this work we assess the benefits that the simultaneous assimilation of discharge observations at multiple locations can bring to flow forecasting through a distributed hydrologic model. The distributed model, MOBIDIC, is part of the operational flood forecasting chain of Tuscany Region in Central Italy. The assimilation system adopts a mixed variational-Monte Carlo approach to update efficiently initial river flow, soil moisture, and a parameter related to runoff production. The evaluation of the system is based on numerous hindcast experiments of real events. The events are characterized by significant rainfall that resulted in both high and relatively low flow in the river network. The area of study is the main basin of Tuscany Region, i.e. the Arno river basin, which extends over about 8300 km2 and whose mean annual precipitation is around 800 mm. Arno's mainstream, with its nearly 240 km length, passes through major Tuscan cities, such as Florence and Pisa, that are vulnerable to floods (e.g. the flood of November 1966). The assimilation tests follow the usage of the model in the forecasting chain, employing the operational resolution in both space and time (500 m and 15 minutes, respectively) and releasing new flow forecasts every 6 hours. The assimilation strategy is evaluated with respect to open loop simulations, i.e. runs that do not exploit discharge observations through data assimilation. We compare hydrographs in their entirety, as well as classical performance indexes such as error on peak flow and Nash-Sutcliffe efficiency. The dependence of performance on lead time and location is assessed. Results indicate that the operational forecasting chain can benefit from the developed assimilation system, although with a

  2. Monte Carlo principles and applications

    Energy Technology Data Exchange (ETDEWEB)

    Raeside, D E [Oklahoma Univ., Oklahoma City (USA). Health Sciences Center

    1976-03-01

    The principles underlying the use of Monte Carlo methods are explained, for readers who may not be familiar with the approach. The generation of random numbers is discussed, and the connection between Monte Carlo methods and random numbers is indicated. Outlines of two well established Monte Carlo sampling techniques are given, together with examples illustrating their use. The general techniques for improving the efficiency of Monte Carlo calculations are considered. The literature relevant to the applications of Monte Carlo calculations in medical physics is reviewed.
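
    The connection between random numbers and Monte Carlo sampling described here is easy to make concrete with inverse-transform sampling, one of the standard techniques such reviews outline: a uniform variate pushed through the inverse CDF of the target distribution.

      import numpy as np

      rng = np.random.default_rng(6)

      def sample_exponential(n, rate=1.0):
          """Inverse-transform sampling: U ~ Uniform(0,1) -> -ln(1-U)/rate."""
          u = rng.random(n)
          return -np.log1p(-u) / rate

      draws = sample_exponential(100_000, rate=2.0)
      print(draws.mean())   # should be close to 1/rate = 0.5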

  3. Bayesian Monte Carlo method

    International Nuclear Information System (INIS)

    Rajabalinejad, M.

    2010-01-01

    To reduce the cost of Monte Carlo (MC) simulations for time-consuming processes, Bayesian Monte Carlo (BMC) is introduced in this paper. The BMC method reduces the number of realizations in MC according to the desired accuracy level. BMC also provides the possibility of considering more priors; in other words, different priors can be integrated into one model by using BMC to further reduce the cost of simulations. This study suggests speeding up the simulation process by considering the logical dependence of neighboring points as prior information. This information is used in the BMC method to produce a predictive tool through the simulation process. The general methodology and algorithm of the BMC method are presented in this paper. The BMC method is applied to a simplified breakwater model as well as the finite element model of the 17th Street Canal in New Orleans, and the results are compared with those of the MC and Dynamic Bounds methods.

  4. A Multigrid NLS-4DVar Data Assimilation Scheme with Advanced Research WRF (ARW)

    Science.gov (United States)

    Zhang, H.; Tian, X.

    2017-12-01

    The motions of the atmosphere have multiscale properties in space and/or time, and the background error covariance matrix (Β) should thus contain error information at different correlation scales. To obtain an optimal analysis, the multigrid three-dimensional variational data assimilation scheme is used widely when sequentially correcting errors from large to small scales. However, introduction of the multigrid technique into four-dimensional variational data assimilation is not easy, due to its strong dependence on the adjoint model, which has extremely high computational costs in data coding, maintenance, and updating. In this study, the multigrid technique was introduced into the nonlinear least-squares four-dimensional variational assimilation (NLS-4DVar) method, which is an advanced four-dimensional ensemble-variational method that can be applied without invoking the adjoint models. The multigrid NLS-4DVar (MG-NLS-4DVar) scheme uses the number of grid points to control the scale, with doubling of this number when moving from a coarse to a finer grid. Furthermore, the MG-NLS-4DVar scheme not only retains the advantages of NLS-4DVar, but also sufficiently corrects multiscale errors to achieve a highly accurate analysis. The effectiveness and efficiency of the proposed MG-NLS-4DVar scheme were evaluated by several groups of observing system simulation experiments using the Advanced Research Weather Research and Forecasting Model. MG-NLS-4DVar outperformed NLS-4DVar, with a lower computational cost.

  5. Robustness of the Sequential Lineup Advantage

    Science.gov (United States)

    Gronlund, Scott D.; Carlson, Curt A.; Dailey, Sarah B.; Goodsell, Charles A.

    2009-01-01

    A growing movement in the United States and around the world involves promoting the advantages of conducting an eyewitness lineup in a sequential manner. We conducted a large study (N = 2,529) that included 24 comparisons of sequential versus simultaneous lineups. A liberal statistical criterion revealed only 2 significant sequential lineup…

  6. Sequential Probability Ratio Tests : Conservative and Robust

    NARCIS (Netherlands)

    Kleijnen, J.P.C.; Shi, Wen

    2017-01-01

    In practice, most computers generate simulation outputs sequentially, so it is attractive to analyze these outputs through sequential statistical methods such as sequential probability ratio tests (SPRTs). We investigate several SPRTs for choosing between two hypothesized values for the mean output…
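
    For readers unfamiliar with SPRTs, a minimal Python sketch of Wald's classical test for two hypothesized means of a normal output is given below; the thresholds follow the standard approximate Wald bounds, and all numbers are illustrative rather than taken from the paper.

    ```python
    import math
    import random

    def sprt_normal_mean(outputs, mu0, mu1, sigma, alpha=0.05, beta=0.05):
        """Wald's SPRT for H0: mean = mu0 vs. H1: mean = mu1, for normally
        distributed outputs with known sigma. Outputs are consumed one at a
        time, and the test stops as soon as a boundary is crossed."""
        lower = math.log(beta / (1.0 - alpha))   # accept H0 at or below this
        upper = math.log((1.0 - beta) / alpha)   # accept H1 at or above this
        llr, n = 0.0, 0
        for x in outputs:
            n += 1
            # log-likelihood-ratio increment for one normal observation
            llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2.0 * sigma ** 2)
            if llr <= lower:
                return "accept H0", n
            if llr >= upper:
                return "accept H1", n
        return "undecided", n

    rng = random.Random(42)
    stream = (rng.gauss(0.4, 1.0) for _ in range(10_000))
    print(sprt_normal_mean(stream, mu0=0.0, mu1=0.5, sigma=1.0))
    ```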

  7. Contributon Monte Carlo

    International Nuclear Information System (INIS)

    Dubi, A.; Gerstl, S.A.W.

    1979-05-01

    The contributon Monte Carlo method is based on a new recipe that calculates target responses by means of a volume integral of the contributon current in a region between the source and the detector. A comprehensive description of the method, its implementation in the general-purpose MCNP code, and results of the method for realistic nonhomogeneous, energy-dependent problems are presented. 23 figures, 10 tables

  8. Fundamentals of Monte Carlo

    International Nuclear Information System (INIS)

    Wollaber, Allan Benton

    2016-01-01

    This is a powerpoint presentation which serves as lecture material for the Parallel Computing summer school. It goes over the fundamentals of the Monte Carlo calculation method. The material is presented according to the following outline: Introduction (background, a simple example: estimating π), Why does this even work? (The Law of Large Numbers, The Central Limit Theorem), How to sample (inverse transform sampling, rejection), and An example from particle transport.
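
    The "simple example: estimating π" from the outline is easy to reproduce; here is one common version (a quarter-circle hit-or-miss estimate, not necessarily the one used in the lecture).

    ```python
    import random

    def estimate_pi(n, seed=0):
        """Hit-or-miss Monte Carlo: the fraction of uniform points in the unit
        square falling inside the quarter circle of radius 1 estimates pi/4."""
        rng = random.Random(seed)
        hits = sum(rng.random() ** 2 + rng.random() ** 2 <= 1.0 for _ in range(n))
        return 4.0 * hits / n

    # The Law of Large Numbers drives the estimate toward pi; per the Central
    # Limit Theorem the error shrinks roughly like 1/sqrt(n).
    for n in (100, 10_000, 1_000_000):
        print(n, estimate_pi(n))
    ```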

  9. Microcanonical Monte Carlo

    International Nuclear Information System (INIS)

    Creutz, M.

    1986-01-01

    The author discusses a recently developed algorithm for simulating statistical systems. The procedure interpolates between molecular dynamics methods and canonical Monte Carlo. The primary advantages are extremely fast simulations of discrete systems such as the Ising model and a relative insensitivity to random number quality. A variation of the algorithm gives rise to a deterministic dynamics for Ising spins. This model may be useful for high speed simulation of non-equilibrium phenomena.
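
    A minimal sketch of the demon idea for a 1D Ising chain is given below; the chain length, initial demon energy, and update count are arbitrary illustrative choices, and the published algorithm concerns general statistical systems rather than this toy case.

    ```python
    import random

    def creutz_demon_1d(n_spins=1_000, demon_start=8, updates=200_000, seed=0):
        """Creutz's microcanonical ('demon') update for a 1D Ising chain with
        J = 1: a spin flip is accepted only if the demon can pay for (or
        absorb) the energy change, so system-plus-demon energy is conserved."""
        rng = random.Random(seed)
        spins = [1] * n_spins                   # cold start, E = -n_spins
        demon = demon_start
        demon_sum = 0
        for _ in range(updates):
            i = rng.randrange(n_spins)
            left, right = spins[i - 1], spins[(i + 1) % n_spins]
            dE = 2 * spins[i] * (left + right)  # cost of flipping spin i
            if dE <= demon:                     # demon energy stays >= 0
                spins[i] = -spins[i]
                demon -= dE
            demon_sum += demon
        # The demon's energy is Boltzmann distributed, so its running mean
        # serves as a thermometer for the simulated system.
        return demon_sum / updates

    print(creutz_demon_1d())
    ```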

  10. Fundamentals of Monte Carlo

    Energy Technology Data Exchange (ETDEWEB)

    Wollaber, Allan Benton [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-06-16

    This is a powerpoint presentation which serves as lecture material for the Parallel Computing summer school. It goes over the fundamentals of the Monte Carlo calculation method. The material is presented according to the following outline: Introduction (background, a simple example: estimating π), Why does this even work? (The Law of Large Numbers, The Central Limit Theorem), How to sample (inverse transform sampling, rejection), and An example from particle transport.

  11. Random sequential adsorption of cubes

    Science.gov (United States)

    Cieśla, Michał; Kubala, Piotr

    2018-01-01

    Random packings built of cubes are studied numerically using a random sequential adsorption algorithm. To compare the obtained results with previous reports, three different models of cube orientation sampling were used. Also, three different cube-cube intersection algorithms were tested to find the most efficient one. The study focuses on the mean saturated packing fraction as well as kinetics of packing growth. Microstructural properties of packings were analyzed using density autocorrelation function.
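
    As a simplified, lower-dimensional analogue of such simulations, the following Python sketch runs random sequential adsorption of aligned unit squares in a periodic box. The box size and attempt budget are arbitrary; saturated packings (literature values near 0.56 for aligned squares) need vastly more attempts and smarter bookkeeping than this brute-force overlap check.

    ```python
    import random

    def rsa_aligned_squares(box=20.0, side=1.0, attempts=50_000, seed=0):
        """Random sequential adsorption: drop squares at uniformly random
        positions and keep only those that overlap no earlier square; placed
        squares are never moved or removed (irreversible adsorption)."""
        rng = random.Random(seed)
        placed = []
        for _ in range(attempts):
            x, y = rng.uniform(0.0, box), rng.uniform(0.0, box)
            ok = True
            for px, py in placed:
                dx = abs(x - px); dx = min(dx, box - dx)  # periodic wrapping
                dy = abs(y - py); dy = min(dy, box - dy)
                if dx < side and dy < side:               # axis-aligned overlap
                    ok = False
                    break
            if ok:
                placed.append((x, y))
        return len(placed) * side * side / (box * box)    # packing fraction

    print(rsa_aligned_squares())   # creeps toward the jamming limit
    ```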

  12. Monte Carlo alpha calculation

    Energy Technology Data Exchange (ETDEWEB)

    Brockway, D.; Soran, P.; Whalen, P.

    1985-01-01

    A Monte Carlo algorithm to efficiently calculate static alpha eigenvalues, $N = n\,e^{\alpha t}$, for supercritical systems has been developed and tested. A direct Monte Carlo approach to calculating a static alpha is to simply follow the buildup in time of neutrons in a supercritical system and evaluate the logarithmic derivative of the neutron population with respect to time. This procedure is expensive, and the solution is very noisy and almost useless for a system near critical. The modified approach is to convert the time-dependent problem to a static $\alpha$-eigenvalue problem and regress $\alpha$ on solutions of a $k$-eigenvalue problem. In practice, this procedure is much more efficient than the direct calculation, and produces much more accurate results. Because the Monte Carlo codes are intrinsically three-dimensional and use elaborate continuous-energy cross sections, this technique is now used as a standard for evaluating other calculational techniques in odd geometries or with group cross sections.

  13. Differentiation between Trichophyton mentagrophytes and T. rubrum by sorbitol assimilation.

    OpenAIRE

    Rezusta, A; Rubio, M C; Alejandre, M C

    1991-01-01

    Trichophyton rubrum was easily differentiated from T. mentagrophytes by its ability to assimilate sorbitol with an API 20C AUX strip. One hundred percent of 36 T. rubrum strains and none of 147 T. mentagrophytes strains assimilated sorbitol.

  14. Assimilation of Doppler weather radar observations in a mesoscale ...

    Indian Academy of Sciences (India)

    …Research (PSU–NCAR) mesoscale model (MM5) version 3.5.6. The variational data assimilation… investigation of the direct assimilation of radar reflectivity data in the 3DVAR system. … Results presented in this paper are based on…

  15. Data assimilation of CALIPSO aerosol observations

    Directory of Open Access Journals (Sweden)

    T. T. Sekiyama

    2010-01-01

    We have developed an advanced data assimilation system for a global aerosol model with a four-dimensional ensemble Kalman filter in which the Level 1B data from the Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO) were successfully assimilated for the first time, to the best of the authors' knowledge. A one-month data assimilation cycle experiment for dust, sulfate, and sea-salt aerosols was performed in May 2007. The results were validated via two independent observations: (1) the ground-based lidar network in East Asia, managed by the National Institute for Environmental Studies of Japan, and (2) weather reports of aeolian dust events in Japan. Detailed four-dimensional structures of aerosol outflows from source regions over oceans and continents for various particle types and sizes were well reproduced. The intensity of dust emission at each grid point was also corrected by this data assimilation system. These results are valuable for the comprehensive analysis of aerosol behavior as well as aerosol forecasting.

  16. Development of a data assimilation algorithm

    DEFF Research Database (Denmark)

    Thomsen, Per Grove; Zlatev, Zahari

    2008-01-01

    It is important to incorporate all available observations when large-scale mathematical models arising in different fields of science and engineering are used to study various physical and chemical processes. Variational data assimilation techniques can be used in the attempts to utilize these observations efficiently… when a variational data assimilation technique is applied. Therefore, it is important to study the interplay between the three components of the variational data assimilation techniques as well as to apply powerful parallel computers in the computations. Some results obtained in the search for a good combination of numerical methods… [see Mathematics and Computers in Simulation, 65 (2004) 557–577; Z. Zlatev, Computer Treatment of Large Air Pollution Models, Kluwer Academic Publishers, Dordrecht, Boston, London, 1995]. The ideas are rather general and can easily be applied in connection with other mathematical models.

  17. Nitrogen assimilation in soybean nodules, 1

    International Nuclear Information System (INIS)

    Ohyama, Takuji; Kumazawa, Kikuo

    1980-01-01

    In order to elucidate the pathways by which the ammonia produced by N₂ fixation is assimilated in soybean nodules, ¹⁵N-labeled compounds were administered to intact nodules or nodule slices pretreated with various inhibitors of nitrogen assimilation. After exposure to ¹⁵N₂, ¹⁵N incorporation into various nitrogenous compounds was investigated in attached nodules injected with methionine sulfoximine (MSX) or azaserine (AS). MSX treatment increased the ¹⁵N content of ammonia more than 6 times, but depressed the ¹⁵N content of most amides and amino acids. AS treatment enhanced the ¹⁵N content of the amido-N of glutamine as well as ammonia, but decreased the amino-N of glutamine and most amino acids. Experiments with nodule slices pretreated with MSX or AS solution and then fed with ¹⁵N-labeled ammonia or the amido-¹⁵N of glutamine showed the same trends. Aminooxyacetate inhibited nitrogen flow from glutamic acid to other amino acids. These results strongly indicate that the ammonia produced by N₂ fixation is assimilated by the GS/GOGAT system to glutamic acid and then transaminated to various amino acids in situ. ¹⁵N incorporation patterns in nodule slices fed with ¹⁵N-labeled ammonia, hydroxylamine, or nitrite were similar, but nitrate seemed to be reduced in a definite compartment and assimilated similarly as in intact nodules fed with ¹⁵N₂. (author)

  18. Data assimilation for air quality models

    DEFF Research Database (Denmark)

    Silver, Jeremy David

    2014-01-01

    …-dimensional optimal interpolation procedure (OI), an Ensemble Kalman Filter (EnKF), and a three-dimensional variational scheme (3D-var). The three assimilation procedures are described and tested. A multi-faceted approach is taken for the verification, using independent measurements from surface air-quality…

  19. Data ingestion and assimilation in ionospheric models

    Czech Academy of Sciences Publication Activity Database

    Burešová, Dalia; Nava, B.; Galkin, I.; Angling, M.; Stankov, S. M.; Coisson, P.

    2009-01-01

    Roč. 52, 3/4 (2009), s. 235-253 ISSN 1593-5213 R&D Projects: GA ČR GA205/08/1356; GA MŠk OC 091 Institutional research plan: CEZ:AV0Z30420517 Keywords : ionosphere * models * data assimilation * data ingestion Subject RIV: DG - Athmosphere Sciences, Meteorology Impact factor: 0.548, year: 2009

  20. A study on assimilating potential vorticity data

    Science.gov (United States)

    Li, Yong; Ménard, Richard; Riishøjgaard, Lars Peter; Cohn, Stephen E.; Rood, Richard B.

    1998-08-01

    The correlation that exists between the potential vorticity (PV) field and the distribution of chemical tracers such as ozone suggests the possibility of using tracer observations as proxy PV data in atmospheric data assimilation systems. Especially in the stratosphere, there are plentiful tracer observations but a general lack of reliable wind observations, and the correlation is most pronounced. The issue investigated in this study is how model dynamics would respond to the assimilation of PV data. First, numerical experiments of identical-twin type were conducted with a simple univariate nudging algorithm and a global shallow water model based on PV and divergence (PV-D model). All model fields are successfully reconstructed through the insertion of complete PV data alone if an appropriate value for the nudging coefficient is used. A simple linear analysis suggests that slow modes are recovered rapidly, at a rate nearly independent of spatial scale. In a more realistic experiment, appropriately scaled total ozone data from the NIMBUS-7 TOMS instrument were assimilated as proxy PV data into the PV-D model over a 10-day period. The resulting model PV field matches the observed total ozone field relatively well on large spatial scales, and the PV, geopotential and divergence fields are dynamically consistent. These results indicate the potential usefulness that tracer observations, as proxy PV data, may offer in a data assimilation system.
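
    The nudging idea is easy to demonstrate on a scalar toy problem: a relaxation term proportional to the observation-minus-model difference is added to the tendency. The dynamics, noise level, and gains below are invented for illustration and have nothing to do with the PV-D model itself.

    ```python
    import random

    def run(gain, steps=5_000, dt=0.01, seed=1):
        """Univariate nudging: a relaxation term G*(obs - model) is added to
        the model tendency, drawing the state toward the inserted data. The
        model here has a biased forcing, so without nudging it drifts off."""
        rng = random.Random(seed)
        truth, model = 0.0, 0.0
        for _ in range(steps):
            truth += dt * (-truth + 1.0)                        # true forcing = 1.0
            obs = truth + rng.gauss(0.0, 0.05)                  # noisy observation
            model += dt * (-model + 0.5 + gain * (obs - model)) # biased forcing = 0.5
        return model, truth

    for gain in (0.0, 1.0, 10.0, 100.0):
        m, t = run(gain)
        print(f"gain={gain:6.1f}  model={m:.3f}  truth={t:.3f}")
    ```

    With gain 0 the biased model settles near 0.5; as the nudging coefficient grows, the model state is pulled ever closer to the observed value, mirroring the role of the nudging coefficient discussed in the record.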

  1. Empowering Geoscience with Improved Data Assimilation Using the Data Assimilation Research Testbed "Manhattan" Release.

    Science.gov (United States)

    Raeder, K.; Hoar, T. J.; Anderson, J. L.; Collins, N.; Hendricks, J.; Kershaw, H.; Ha, S.; Snyder, C.; Skamarock, W. C.; Mizzi, A. P.; Liu, H.; Liu, J.; Pedatella, N. M.; Karspeck, A. R.; Karol, S. I.; Bitz, C. M.; Zhang, Y.

    2017-12-01

    The capabilities of the Data Assimilation Research Testbed (DART) at NCAR have been significantly expanded with the recent "Manhattan" release. DART is an ensemble Kalman filter based suite of tools, which enables researchers to use data assimilation (DA) without first becoming DA experts. Highlights: significant improvement in efficient ensemble DA for very large models on thousands of processors, direct read and write of model state files in parallel, more control of the DA output for finer-grained analysis, new model interfaces which are useful to a variety of geophysical researchers, new observation forward operators, and the ability to use precomputed forward operators from the forecast model. The new model interfaces and example applications include the following. MPAS-A: the Model for Prediction Across Scales - Atmosphere is a global, nonhydrostatic, variable-resolution mesh atmospheric model, which facilitates multi-scale analysis and forecasting; the absence of distinct subdomains eliminates problems associated with subdomain boundaries, and it demonstrates the ability to consistently produce higher-quality analyses than coarse, uniform meshes do. WRF-Chem: the Weather Research and Forecasting + (MOZART) Chemistry model assimilates observations from FRAPPÉ (Front Range Air Pollution and Photochemistry Experiment). WACCM-X: the Whole Atmosphere Community Climate Model with thermosphere and ionosphere eXtension assimilates observations of electron density to investigate sudden stratospheric warming. CESM (weakly) coupled assimilation: NCAR's Community Earth System Model is used for assimilation of atmospheric and oceanic observations into their respective components using coupled atmosphere+land+ocean+sea-ice forecasts. CESM2.0: assimilation in the atmospheric component (CAM, WACCM) of the newly released version is supported; this version contains new and extensively updated components and software environment. CICE: the Los Alamos sea ice model (in CESM) is used to assimilate…
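
    DART is built around ensemble Kalman filtering. The self-contained Python sketch below shows a stochastic (perturbed-observation) EnKF analysis step on a toy two-variable state; it is only a schematic of the filter family, not DART's actual implementation, and every number in it is invented.

    ```python
    import numpy as np

    def enkf_update(X, y, r_std, H, rng):
        """Stochastic EnKF analysis step. X: (m, n) state ensemble, y: obs
        vector (p,), r_std: obs-error std, H: (p, n) linear obs operator.
        The gain uses ensemble sample covariances; each member assimilates
        its own perturbed copy of the observation."""
        m = X.shape[0]
        A = X - X.mean(axis=0)                       # state anomalies
        Yp = X @ H.T                                 # predicted obs, (m, p)
        Ya = Yp - Yp.mean(axis=0)
        Pxy = A.T @ Ya / (m - 1)                     # state-obs covariance
        Pyy = Ya.T @ Ya / (m - 1) + r_std**2 * np.eye(H.shape[0])
        K = Pxy @ np.linalg.inv(Pyy)                 # Kalman gain, (n, p)
        y_pert = y + rng.normal(0.0, r_std, size=(m, H.shape[0]))
        return X + (y_pert - Yp) @ K.T

    rng = np.random.default_rng(0)
    # Toy correlated two-variable state; only the first variable is observed,
    # yet the ensemble covariance also corrects the unobserved second one.
    x1 = rng.normal(0.0, 2.0, size=(50, 1))
    X = np.hstack([x1, x1 + rng.normal(0.0, 0.5, size=(50, 1))])
    H = np.array([[1.0, 0.0]])
    Xa = enkf_update(X, y=np.array([1.5]), r_std=0.5, H=H, rng=rng)
    print(X.mean(axis=0), "->", Xa.mean(axis=0))
    ```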

  2. Variational data assimilation using targetted random walks

    KAUST Repository

    Cotter, S. L.; Dashti, M.; Stuart, A. M.

    2011-01-01

    …Markov chain Monte Carlo (MCMC) method which enables us to directly sample from the Bayesian posterior distribution on the unknown functions of interest given observations. Since we are aware that these methods are currently too computationally expensive…
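
    A bare-bones random-walk Metropolis sampler is conceptually the starting point for such MCMC approaches (the paper's samplers are more sophisticated function-space variants); the 1D Gaussian posterior below is a made-up example.

    ```python
    import math
    import random

    def metropolis(log_post, x0, step, n, seed=0):
        """Random-walk Metropolis: propose x' = x + step * N(0, 1), accept with
        probability min(1, post(x') / post(x)). The resulting chain samples
        the posterior directly, with no Gaussian assumption on the target."""
        rng = random.Random(seed)
        x, lp = x0, log_post(x0)
        chain = []
        for _ in range(n):
            xp = x + step * rng.gauss(0.0, 1.0)
            lpp = log_post(xp)
            if rng.random() < math.exp(min(0.0, lpp - lp)):
                x, lp = xp, lpp
            chain.append(x)
        return chain

    # Made-up 1D problem: N(0, 1) prior on u, one observation y = 1.2 of u
    # with N(0, 0.5^2) noise; the exact posterior mean is 1.2 * 4 / 5 = 0.96.
    log_post = lambda u: -0.5 * u**2 - 0.5 * ((1.2 - u) / 0.5) ** 2
    chain = metropolis(log_post, x0=0.0, step=0.5, n=50_000)
    print(sum(chain) / len(chain))
    ```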

  3. A variational ensemble scheme for noisy image data assimilation

    Science.gov (United States)

    Yang, Yin; Robinson, Cordelia; Heitz, Dominique; Mémin, Etienne

    2014-05-01

    Data assimilation techniques aim at recovering the trajectory of a system's state variables, denoted X, over time from partially observed noisy measurements of the system, denoted Y. These procedures, which couple dynamics and noisy measurements of the system, fulfill a twofold objective. On one hand, they provide a denoising (or reconstruction) procedure for the data through a given model framework, and on the other hand, they provide estimation procedures for unknown parameters of the dynamics. A standard variational data assimilation problem can be formulated as the minimization of the following objective function with respect to the initial discrepancy, η, from the background initial guess: $J(\eta(x)) = \tfrac{1}{2}\,\|X_b(x) - X(t_0,x)\|_B^2 + \tfrac{1}{2}\int_{t_0}^{t_f}\|H(X(t,x)) - Y(t,x)\|_R^2\,dt$ (1), where the observation operator H links the state variable and the measurements. The cost function can be interpreted as the log-likelihood function associated with the a posteriori distribution of the state given the past history of measurements and the background. In this work, we aim at studying ensemble-based optimal control strategies for data assimilation. Such a formulation nicely combines the ingredients of ensemble Kalman filters and variational data assimilation (4DVar). It is also formulated as the minimization of the objective function (1), but, similarly to ensemble filters, it introduces in its objective function an empirical ensemble-based background-error covariance defined as $B \equiv \langle (X_b - \langle X_b \rangle)(X_b - \langle X_b \rangle)^{T} \rangle$ (2). Thus, it works in an off-line smoothing mode rather than on the fly like sequential filters. Such a resulting ensemble variational data assimilation technique corresponds to a relatively new family of methods [1,2,3]. It presents two main advantages: first, it no longer requires constructing the adjoint of the dynamics tangent linear operator, which is a considerable advantage with respect to the method's implementation; and second, it enables the handling of a flow…

  4. Reliability assessment of restructured power systems using reliability network equivalent and pseudo-sequential simulation techniques

    International Nuclear Information System (INIS)

    Ding, Yi; Wang, Peng; Goel, Lalit; Billinton, Roy; Karki, Rajesh

    2007-01-01

    This paper presents a technique to evaluate the reliability of a restructured power system with a bilateral market. The proposed technique is based on the combination of the reliability network equivalent and pseudo-sequential simulation approaches. The reliability network equivalent techniques have been implemented in the Monte Carlo simulation procedure to reduce the computational burden of the analysis. Pseudo-sequential simulation has been used to increase the computational efficiency of the non-sequential simulation method and to model the chronological aspects of market trading and system operation. Multi-state Markov models for generation and transmission systems are proposed and implemented in the simulation. A new load shedding scheme is proposed during generation inadequacy and network congestion to minimize the load curtailment. The IEEE reliability test system (RTS) is used to illustrate the technique. (author)
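
    To make the chronological (sequential) simulation idea concrete, here is a toy Python sketch that estimates the availability of a single repairable two-state Markov component; the rates, horizon, and history count are invented, and the pseudo-sequential technique of the paper is considerably more sophisticated.

    ```python
    import random

    def availability_mc(lambda_fail=0.01, mu_repair=0.1, horizon=10_000.0,
                        n_histories=200, seed=0):
        """Sequential Monte Carlo reliability estimate for a two-state
        (up/down) component: sample alternating exponential up-times and
        repair-times chronologically, then average the fraction of time up."""
        rng = random.Random(seed)
        total_up = 0.0
        for _ in range(n_histories):
            t, up_time, up = 0.0, 0.0, True
            while t < horizon:
                rate = lambda_fail if up else mu_repair
                dwell = min(rng.expovariate(rate), horizon - t)
                if up:
                    up_time += dwell
                t += dwell
                up = not up
            total_up += up_time / horizon
        return total_up / n_histories

    # Analytic steady-state availability: mu/(mu + lambda) = 0.1/0.11 ~ 0.909
    print(availability_mc())
    ```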

  5. Monts Jura Jazz Festival

    CERN Multimedia

    Jazz Club

    2012-01-01

    The 5th edition of the "Monts Jura Jazz Festival" will take place on September 21st and 22nd 2012 at the Esplanade du Lac in Divonne-les-Bains. This festival is organized by the "CERN Jazz Club" with the support of the "CERN Staff Association". It is a major musical event in the French/Swiss area and proposes a world-class program with jazz artists such as D. Lockwood and D. Reinhardt. More information on http://www.jurajazz.com.

  6. Monts Jura Jazz Festival

    CERN Document Server

    2012-01-01

    The 5th edition of the "Monts Jura Jazz Festival" will take place at the Esplanade du Lac in Divonne-les-Bains, France on September 21 and 22. This festival, organized by the CERN Jazz Club and supported by the CERN Staff Association, is becoming a major musical event in the Geneva region. International jazz artists like Didier Lockwood and David Reinhardt are part of this year's outstanding program. The full program and e-tickets are available on the festival website. Don't miss this great festival!

  7. Improved Assimilation of Streamflow and Satellite Soil Moisture with the Evolutionary Particle Filter and Geostatistical Modeling

    Science.gov (United States)

    Yan, Hongxiang; Moradkhani, Hamid; Abbaszadeh, Peyman

    2017-04-01

    Assimilation of satellite soil moisture and streamflow data into hydrologic models has received increasing attention over the past few years. Currently, these observations are increasingly used to improve model streamflow and soil moisture predictions. However, the performance of this land data assimilation (DA) system still suffers from two limitations: (1) satellite data scarcity and quality; and (2) particle weight degeneration. In order to overcome these two limitations, we propose two possible solutions in this study. First, a general Gaussian geostatistical approach is proposed to overcome the limitation in the space/time resolution of satellite soil moisture products, thus improving their accuracy at uncovered/biased grid cells. Second, an evolutionary PF approach based on a Genetic Algorithm (GA) and Markov Chain Monte Carlo (MCMC), the so-called EPF-MCMC, is developed to further reduce weight degeneration and improve the robustness of the land DA system. This study provides a detailed analysis of the joint and separate assimilation of streamflow and satellite soil moisture into a distributed Sacramento Soil Moisture Accounting (SAC-SMA) model, with the use of the recently developed EPF-MCMC and the general Gaussian geostatistical approach. Performance is assessed over several basins in the USA selected from the Model Parameter Estimation Experiment (MOPEX) and located in different climate regions. The results indicate that: (1) the general Gaussian approach can predict the soil moisture at uncovered grid cells within the expected satellite data quality threshold; (2) assimilation of satellite soil moisture inferred from the general Gaussian model can significantly improve the soil moisture predictions; and (3) in terms of both deterministic and probabilistic measures, the EPF-MCMC can achieve better streamflow predictions. These results suggest that the geostatistical model is a helpful tool to aid the remote sensing technique and that the EPF-MCMC is a…
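
    The weight degeneration mentioned above is the classic failure mode of particle filters. The sketch below is a plain bootstrap filter with systematic resampling triggered by a low effective sample size, i.e. the baseline that EPF-MCMC improves upon, not the EPF-MCMC algorithm itself; the random-walk state model and all numbers are illustrative.

    ```python
    import numpy as np

    def systematic_resample(w, rng):
        """Systematic resampling: one uniform offset, N evenly spaced points
        mapped through the cumulative weights (low-variance survival draw)."""
        n = len(w)
        positions = (rng.random() + np.arange(n)) / n
        return np.searchsorted(np.cumsum(w), positions)

    def bootstrap_pf(obs, n_particles=500, q_std=0.3, r_std=0.5, seed=0):
        """Bootstrap particle filter for a Gaussian random-walk state observed
        with Gaussian noise; resampling when the effective sample size drops
        below half the ensemble combats weight degeneration."""
        rng = np.random.default_rng(seed)
        x = rng.normal(0.0, 1.0, n_particles)
        w = np.full(n_particles, 1.0 / n_particles)
        means = []
        for y in obs:
            x = x + rng.normal(0.0, q_std, n_particles)    # propagate particles
            w = w * np.exp(-0.5 * ((y - x) / r_std) ** 2)  # reweight by likelihood
            w = w / w.sum()
            if 1.0 / np.sum(w**2) < n_particles / 2:       # effective sample size
                x = x[systematic_resample(w, rng)]
                w = np.full(n_particles, 1.0 / n_particles)
            means.append(float(np.sum(w * x)))
        return means

    rng = np.random.default_rng(1)
    truth = np.cumsum(rng.normal(0.0, 0.3, 50))
    obs = truth + rng.normal(0.0, 0.5, 50)
    print(bootstrap_pf(obs)[-1], truth[-1])                # filtered vs. true state
    ```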

  8. Assimilation of Gridded GRACE Terrestrial Water Storage Estimates in the North American Land Data Assimilation System

    Science.gov (United States)

    Kumar, Sujay V.; Zaitchik, Benjamin F.; Peters-Lidard, Christa D.; Rodell, Matthew; Reichle, Rolf; Li, Bailing; Jasinski, Michael; Mocko, David; Getirana, Augusto; De Lannoy, Gabrielle

    2016-01-01

    The objective of the North American Land Data Assimilation System (NLDAS) is to provide best available estimates of near-surface meteorological conditions and soil hydrological status for the continental United States. To support the ongoing efforts to develop data assimilation (DA) capabilities for NLDAS, the results of Gravity Recovery and Climate Experiment (GRACE) DA implemented in a manner consistent with NLDAS development are presented. Following previous work, GRACE terrestrial water storage (TWS) anomaly estimates are assimilated into the NASA Catchment land surface model using an ensemble smoother. In contrast to many earlier GRACE DA studies, a gridded GRACE TWS product is assimilated, spatially distributed GRACE error estimates are accounted for, and the impact that GRACE scaling factors have on assimilation is evaluated. Comparisons with quality-controlled in situ observations indicate that GRACE DA has a positive impact on the simulation of unconfined groundwater variability across the majority of the eastern United States and on the simulation of surface and root zone soil moisture across the country. Smaller improvements are seen in the simulation of snow depth, and the impact of GRACE DA on simulated river discharge and evapotranspiration is regionally variable. The use of GRACE scaling factors during assimilation improved DA results in the western United States but led to small degradations in the eastern United States. The study also found comparable performance between the use of gridded and basin averaged GRACE observations in assimilation. Finally, the evaluations presented in the paper indicate that GRACE DA can be helpful in improving the representation of droughts.

  9. Data Assimilation in Integrated and Distributed Hydrological Models

    DEFF Research Database (Denmark)

    Zhang, Donghua

    …processes and provide simulations in refined temporal and spatial resolutions. Recent developments in measurement and sensor technologies have significantly improved the coverage, quality, frequency and diversity of hydrological observations. Data assimilation provides a great potential in relation… point of view, different assimilation methodologies and techniques have been developed or customized to better serve hydrological assimilation. From the application point of view, real data and real-world complex catchments are used with the focus of investigating the models' improvements with data… a variety of model uncertainty sources and scales. Next, the groundwater head assimilation experiment was tested in a much more complex catchment with assimilation of biased real observations. In such cases, the bias-aware assimilation method significantly outperforms the standard assimilation method…

  10. Dark matter assimilation into the baryon asymmetry

    International Nuclear Information System (INIS)

    D'Eramo, Francesco; Fei, Lin; Thaler, Jesse

    2012-01-01

    Pure singlets are typically disfavored as dark matter candidates, since they generically have a thermal relic abundance larger than the observed value. In this paper, we propose a new dark matter mechanism called assimilation, which takes advantage of the baryon asymmetry of the universe to generate the correct relic abundance of singlet dark matter. Through assimilation, dark matter itself is efficiently destroyed, but dark matter number is stored in new quasi-stable heavy states which carry the baryon asymmetry. The subsequent annihilation and late-time decay of these heavy states yields (symmetric) dark matter as well as (asymmetric) standard model baryons. We study in detail the case of pure bino dark matter by augmenting the minimal supersymmetric standard model with vector-like chiral multiplets. In the parameter range where this mechanism is effective, the LHC can discover long-lived charged particles which were responsible for assimilating dark matter.

  11. Data assimilation in integrated hydrological modelling

    DEFF Research Database (Denmark)

    Rasmussen, Jørn

    Integrated hydrological models are useful tools for water resource management and research, and advances in computational power and the advent of new observation types have resulted in the models generally becoming more complex and distributed. However, the models are often characterized by a high degree of parameterization, which results in significant model uncertainty that cannot be reduced much, because observations are often scarce and often take the form of point measurements. Data assimilation shows great promise for use in integrated hydrological models, as it allows observations to be efficiently combined with models to improve model predictions, reduce uncertainty and estimate model parameters. In this thesis, a framework for assimilating multiple observation types and updating multiple components and parameters of a catchment-scale integrated hydrological model is developed and tested…

  12. A Monte Carlo study of radiation trapping effects

    International Nuclear Information System (INIS)

    Wang, J.B.; Williams, J.F.; Carter, C.J.

    1997-01-01

    A Monte Carlo simulation of radiative transfer in an atomic beam is carried out to investigate the effects of radiation trapping on electron-atom collision experiments. The collisionally excited atom is represented by a simple electric dipole, for which the emission intensity distribution is well known. The spatial distribution, frequency and free path of this and the sequential dipoles were determined by a computer random generator according to the probabilities given by quantum theory. By altering the atomic number density at the target site, the pressure dependence of the observed atomic lifetime, the angular intensity distribution and the polarisation of the radiation field are studied. 7 refs., 5 figs
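
    One ingredient of such photon-tracking simulations is sampling the free path between emission and reabsorption. A standard inverse-transform sketch (with an arbitrary mean free path, not a value from the study) looks like this:

    ```python
    import math
    import random

    def free_paths(mean_free_path, n, seed=0):
        """Photon survival over distance s follows exp(-s / l); inverting the
        CDF gives s = -l * ln(1 - U) for U ~ Uniform(0, 1), the standard way
        to draw the flight distance of each (re)emitted photon."""
        rng = random.Random(seed)
        return [-mean_free_path * math.log(1.0 - rng.random()) for _ in range(n)]

    paths = free_paths(mean_free_path=2.0, n=100_000)
    print(sum(paths) / len(paths))   # sample mean should be close to 2.0
    ```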

  13. Sequential series for nuclear reactions

    International Nuclear Information System (INIS)

    Izumo, Ko

    1975-01-01

    A new time-dependent treatment of nuclear reactions is given, in which the wave function of the compound nucleus is expanded in a sequential series of the reaction processes. The wave functions of the sequential series form another complete set of the compound nucleus in the limit Δt → 0. It is pointed out that the wave function is characterized by the quantities: the number of degrees of freedom of motion n, the period of the motion (Poincaré cycle) $t_n$, the delay time $t_{n\mu}$ and the relaxation time $\tau_n$ to the equilibrium of the compound nucleus, instead of the usual quantum number λ, the energy eigenvalue $E_\lambda$ and the total width $\Gamma_\lambda$ of resonance levels, respectively. The transition matrix elements and the yields of nuclear reactions also become functions of time, given by the Fourier transform of the usual ones. The Poincaré cycles of compound nuclei are compared with the observed correlations among resonance levels, which are about $10^{-17}$–$10^{-16}$ s for medium and heavy nuclei and about $10^{-20}$ s for the intermediate resonances. (auth.)

  14. Data assimilation techniques in modeling ocean processes

    Digital Repository Service at National Institute of Oceanography (India)

    Mahadevan, R.; Fernandes, A.A.; Naqvi, S.W.A.

    …are usually called data analysis or assimilation. These homogeneous fields are prerequisites for various practical applications and theoretical studies. The fields produced by an analysis must, on the one hand, be close to the observations… The practical usefulness of variational methods for meteorological problems was pointed out very early by Sasaki (1955, 1958), but in spite of that these methods have not been fully utilized. Probably, the complex mathematical technicality of these methods…

  15. Data assimilation and model evaluation experiment datasets

    Science.gov (United States)

    Lai, Chung-Cheng A.; Qian, Wen; Glenn, Scott M.

    1994-01-01

    The Institute for Naval Oceanography, in cooperation with Naval Research Laboratories and universities, executed the Data Assimilation and Model Evaluation Experiment (DAMEE) for the Gulf Stream region during fiscal years 1991-1993. Enormous effort has gone into the preparation of several high-quality and consistent datasets for model initialization and verification. This paper describes the preparation process, the temporal and spatial scopes, the contents, the structure, etc., of these datasets. The goal of DAMEE and the need of data for the four phases of experiment are briefly stated. The preparation of DAMEE datasets consisted of a series of processes: (1) collection of observational data; (2) analysis and interpretation; (3) interpolation using the Optimum Thermal Interpolation System package; (4) quality control and re-analysis; and (5) data archiving and software documentation. The data products from these processes included a time series of 3D fields of temperature and salinity, 2D fields of surface dynamic height and mixed-layer depth, analysis of the Gulf Stream and rings system, and bathythermograph profiles. To date, these are the most detailed and high-quality data for mesoscale ocean modeling, data assimilation, and forecasting research. Feedback from ocean modeling groups who tested this data was incorporated into its refinement. Suggestions for DAMEE data usages include (1) ocean modeling and data assimilation studies, (2) diagnosis and theoretical studies, and (3) comparisons with locally detailed observations.

  16. Exploring the sequential lineup advantage using WITNESS.

    Science.gov (United States)

    Goodsell, Charles A; Gronlund, Scott D; Carlson, Curt A

    2010-12-01

    Advocates claim that the sequential lineup is an improvement over simultaneous lineup procedures, but no formal (quantitatively specified) explanation exists for why it is better. The computational model WITNESS (Clark, Appl Cogn Psychol 17:629-654, 2003) was used to develop theoretical explanations for the sequential lineup advantage. In its current form, WITNESS produced a sequential advantage only by pairing conservative sequential choosing with liberal simultaneous choosing. However, this combination failed to approximate four extant experiments that exhibited large sequential advantages. Two of these experiments became the focus of our efforts because the data were uncontaminated by likely suspect position effects. Decision-based and memory-based modifications to WITNESS approximated the data and produced a sequential advantage. The next step is to evaluate the proposed explanations and modify public policy recommendations accordingly.

  17. Multi-Scale Hydrometeorological Modeling, Land Data Assimilation and Parameter Estimation with the Land Information System

    Science.gov (United States)

    Peters-Lidard, Christa D.; Kumar, Sujay V.; Santanello, Joseph A., Jr.; Reichle, Rolf H.

    2009-01-01

    …NOAA's National Centers for Environmental Prediction (NCEP)/Environmental Modeling Center (EMC) for their land data assimilation systems to support weather and climate modeling. LIS not only consolidates the capabilities of these two systems, but also enables a much larger variety of configurations with respect to horizontal spatial resolution, input datasets and choice of land surface model through "plugins". As described in Kumar et al., 2007, and demonstrated in Case et al., 2008, and Santanello et al., 2009, LIS has been coupled to the Weather Research and Forecasting (WRF) model to support studies of land-atmosphere coupling by enabling ensembles of land surface states to be tested against multiple representations of the atmospheric boundary layer. LIS has also been demonstrated for parameter estimation as described in Peters-Lidard et al. (2008) and Santanello et al. (2007), who showed that sequential remotely sensed soil moisture products can be used to derive soil hydraulic and texture properties given a sufficient dynamic range in the soil moisture retrievals and accurate precipitation inputs. LIS has also recently been demonstrated for multi-model data assimilation (Kumar et al., 2008) using an Ensemble Kalman Filter for sequential assimilation of soil moisture, snow, and temperature. Ongoing work has demonstrated the value of bias correction as part of the filter, and also that of joint calibration and assimilation. Examples and case studies demonstrating the capabilities and impacts of LIS for hydrometeorological modeling, assimilation and parameter estimation will be presented as advancements towards the next generation of integrated observation and modeling systems.

  18. Multi-Scale Hydrometeorological Modeling, Land Data Assimilation and Parameter Estimation with the Land Information System

    Science.gov (United States)

    Peters-Lidard, Christa D.

    2011-01-01

    …Center (EMC) for their land data assimilation systems to support weather and climate modeling. LIS not only consolidates the capabilities of these two systems, but also enables a much larger variety of configurations with respect to horizontal spatial resolution, input datasets and choice of land surface model through "plugins". LIS has been coupled to the Weather Research and Forecasting (WRF) model to support studies of land-atmosphere coupling by enabling ensembles of land surface states to be tested against multiple representations of the atmospheric boundary layer. LIS has also been demonstrated for parameter estimation: sequential remotely sensed soil moisture products can be used to derive soil hydraulic and texture properties given a sufficient dynamic range in the soil moisture retrievals and accurate precipitation inputs. LIS has also recently been demonstrated for multi-model data assimilation using an Ensemble Kalman Filter for sequential assimilation of soil moisture, snow, and temperature. Ongoing work has demonstrated the value of bias correction as part of the filter, and also that of joint calibration and assimilation. Examples and case studies demonstrating the capabilities and impacts of LIS for hydrometeorological modeling, assimilation and parameter estimation will be presented as advancements towards the next generation of integrated observation and modeling systems.

  19. Sequential lineup presentation: Patterns and policy

    OpenAIRE

    Lindsay, R C L; Mansour, Jamal K; Beaudry, J L; Leach, A-M; Bertrand, M I

    2009-01-01

    Sequential lineups were offered as an alternative to the traditional simultaneous lineup. Sequential lineups reduce incorrect lineup selections; however, the accompanying loss of correct identifications has resulted in controversy regarding adoption of the technique. We discuss the procedure and research relevant to (1) the pattern of results found using sequential versus simultaneous lineups; (2) reasons (theory) for differences in witness responses; (3) two methodological issues; and (4) im...

  20. The Bacterial Sequential Markov Coalescent.

    Science.gov (United States)

    De Maio, Nicola; Wilson, Daniel J

    2017-05-01

    Bacteria can exchange and acquire new genetic material from other organisms directly and via the environment. This process, known as bacterial recombination, has a strong impact on the evolution of bacteria, for example, leading to the spread of antibiotic resistance across clades and species, and to the avoidance of clonal interference. Recombination hinders phylogenetic and transmission inference because it creates patterns of substitutions (homoplasies) inconsistent with the hypothesis of a single evolutionary tree. Bacterial recombination is typically modeled as statistically akin to gene conversion in eukaryotes, i.e., using the coalescent with gene conversion (CGC). However, this model can be very computationally demanding, as it needs to account for the correlations of evolutionary histories of even distant loci. So, with the increasing popularity of whole genome sequencing, the need has emerged for a faster approach to model and simulate bacterial genome evolution. We present a new model that approximates the coalescent with gene conversion: the bacterial sequential Markov coalescent (BSMC). Our approach is based on a similar idea to the sequential Markov coalescent (SMC), an approximation of the coalescent with crossover recombination. However, bacterial recombination poses hurdles to a sequential Markov approximation, as it leads to strong correlations and linkage disequilibrium across very distant sites in the genome. Our BSMC overcomes these difficulties, and shows a considerable reduction in computational demand compared to the exact CGC, and very similar patterns in simulated data. We implemented our BSMC model within the new simulation software FastSimBac. In addition to the decreased computational demand compared to previous bacterial genome evolution simulators, FastSimBac provides more general options for evolutionary scenarios, allowing population structure with migration, speciation, population size changes, and recombination hotspots. FastSimBac is…

  1. Biased lineups: sequential presentation reduces the problem.

    Science.gov (United States)

    Lindsay, R C; Lea, J A; Nosworthy, G J; Fulford, J A; Hector, J; LeVan, V; Seabrook, C

    1991-12-01

    Biased lineups have been shown to increase significantly false, but not correct, identification rates (Lindsay, Wallbridge, & Drennan, 1987; Lindsay & Wells, 1980; Malpass & Devine, 1981). Lindsay and Wells (1985) found that sequential lineup presentation reduced false identification rates, presumably by reducing reliance on relative judgment processes. Five staged-crime experiments were conducted to examine the effect of lineup biases and sequential presentation on eyewitness recognition accuracy. Sequential lineup presentation significantly reduced false identification rates from fair lineups as well as from lineups biased with regard to foil similarity, instructions, or witness attire, and from lineups biased in all of these ways. The results support recommendations that police present lineups sequentially.

  2. MONTE and ANAL1

    International Nuclear Information System (INIS)

    Lupton, L.R.; Keller, N.A.

    1982-09-01

    The design of a positron emission tomography (PET) ring camera involves trade-offs between such things as sensitivity, resolution and cost. As a design aid, a Monte Carlo simulation of a single-ring camera system has been developed. The model includes a source-filled phantom, collimators, detectors, and optional shadow shields and inter-crystal septa. Individual gamma rays are tracked within the system materials until they escape, are absorbed, or are detected. Compton and photoelectric interactions are modelled. All system dimensions are variable within the computation. Coincidence and singles data are recorded according to type (true or scattered), annihilation origin, and detected energy. Photon fluxes at various points of interest, such as the edge of the phantom and the collimator, are available. This report reviews the basics of PET, describes the physics involved in the simulation, and provides detailed outlines of the routines.

  3. Frost in Charitum Montes

    Science.gov (United States)

    2003-01-01

    MGS MOC Release No. MOC2-387, 10 June 2003. This is a Mars Global Surveyor (MGS) Mars Orbiter Camera (MOC) wide angle view of the Charitum Montes, south of Argyre Planitia, in early June 2003. The seasonal south polar frost cap, composed of carbon dioxide, has been retreating southward through this area since spring began a month ago. The bright features toward the bottom of this picture are surfaces covered by frost. The picture is located near 57°S, 43°W. North is at the top, south is at the bottom. Sunlight illuminates the scene from the upper left. The area shown is about 217 km (135 miles) wide.

  4. Immediate Sequential Bilateral Cataract Surgery

    DEFF Research Database (Denmark)

    Kessel, Line; Andresen, Jens; Erngaard, Ditte

    2015-01-01

    The aim of the present systematic review was to examine the benefits and harms associated with immediate sequential bilateral cataract surgery (ISBCS) with specific emphasis on the rate of complications, postoperative anisometropia, and subjective visual function in order to formulate evidence-based national Danish guidelines for cataract surgery. A systematic literature review in PubMed, Embase, and Cochrane central databases identified three randomized controlled trials that compared outcome in patients randomized to ISBCS or bilateral cataract surgery on two different dates. Meta-analyses were performed using the Cochrane Review Manager software. The quality of the evidence was assessed using the GRADE method (Grading of Recommendation, Assessment, Development, and Evaluation). We did not find any difference in the risk of complications or visual outcome in patients randomized to ISBCS or surgery…

  5. Random and cooperative sequential adsorption

    Science.gov (United States)

    Evans, J. W.

    1993-10-01

    Irreversible random sequential adsorption (RSA) on lattices, and continuum "car parking" analogues, have long received attention as models for reactions on polymer chains, chemisorption on single-crystal surfaces, adsorption in colloidal systems, and solid state transformations. Cooperative generalizations of these models (CSA) are sometimes more appropriate, and can exhibit richer kinetics and spatial structure, e.g., autocatalysis and clustering. The distribution of filled or transformed sites in RSA and CSA is not described by an equilibrium Gibbs measure. This is the case even for the saturation "jammed" state of models where the lattice or space cannot fill completely. However exact analysis is often possible in one dimension, and a variety of powerful analytic methods have been developed for higher dimensional models. Here we review the detailed understanding of asymptotic kinetics, spatial correlations, percolative structure, etc., which is emerging for these far-from-equilibrium processes.

  6. Monte Carlo Methods in Physics

    International Nuclear Information System (INIS)

    Santoso, B.

    1997-01-01

    The method of Monte Carlo integration is reviewed briefly and some of its applications in physics are explained. A numerical experiment on the random generators used in the Monte Carlo techniques is carried out to show the behavior of the randomness of the various methods in generating them. To account for the weight function involved in the Monte Carlo method, the Metropolis method is used. From the results of the experiment, one can see that there are no regular patterns in the numbers generated, showing that the program generators are reasonably good, while the experimental results show a statistical distribution obeying the statistical distribution law. Further, some applications of the Monte Carlo methods in physics are given. The physical problems are chosen such that the models have available solutions, either exact or approximate, so that comparisons can be made with the calculations using the Monte Carlo method. Comparisons show that for the models considered, good agreement has been obtained.

  7. Advanced data assimilation in strongly nonlinear dynamical systems

    Science.gov (United States)

    Miller, Robert N.; Ghil, Michael; Gauthiez, Francois

    1994-01-01

    Advanced data assimilation methods are applied to simple but highly nonlinear problems. The dynamical systems studied here are the stochastically forced double well and the Lorenz model. In both systems, linear approximation of the dynamics about the critical points near which regime transitions occur is not always sufficient to track their occurrence or nonoccurrence. Straightforward application of the extended Kalman filter yields mixed results. The ability of the extended Kalman filter to track transitions of the double-well system from one stable critical point to the other depends on the frequency and accuracy of the observations relative to the mean-square amplitude of the stochastic forcing. The ability of the filter to track the chaotic trajectories of the Lorenz model is limited to short times, as is the ability of strong-constraint variational methods. Examples are given to illustrate the difficulties involved, and qualitative explanations for these difficulties are provided. Three generalizations of the extended Kalman filter are described. The first is based on inspection of the innovation sequence, that is, the successive differences between observations and forecasts; it works very well for the double-well problem. The second, an extension to fourth-order moments, yields excellent results for the Lorenz model but will be unwieldy when applied to models with high-dimensional state spaces. A third, more practical method, based on an empirical statistical model derived from a Monte Carlo simulation, is formulated and shown to work very well. Weak-constraint methods can be made to perform satisfactorily in the context of these simple models, but such methods do not seem to generalize easily to practical models of the atmosphere and ocean. In particular, it is shown that the equations derived in the weak variational formulation are difficult to solve conveniently for large systems.
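
    The stochastically forced double well mentioned above is simple enough to simulate directly. This Euler-Maruyama sketch (with an arbitrary noise amplitude and step size, not the paper's settings) counts the regime transitions between the wells at x = ±1 that make the filtering problem hard.

    ```python
    import math
    import random

    def double_well_transitions(sigma=0.35, dt=0.01, steps=200_000, seed=3):
        """Euler-Maruyama integration of dx = (x - x^3) dt + sigma dW. The
        drift is minus the gradient of the double-well potential
        V(x) = x^4/4 - x^2/2, whose minima sit at x = -1 and x = +1."""
        rng = random.Random(seed)
        x, side, transitions = 1.0, 1, 0
        for _ in range(steps):
            x += (x - x**3) * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
            if x * side < -0.5:          # settled into the opposite well
                transitions += 1
                side = -side
        return transitions

    print(double_well_transitions())     # occasional, abrupt regime transitions
    ```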

  8. Particle filters, a quasi-Monte-Carlo-solution for segmentation of coronaries.

    Science.gov (United States)

    Florin, Charles; Paragios, Nikos; Williams, Jim

    2005-01-01

    In this paper we propose a Particle Filter-based approach for the segmentation of coronary arteries. To this end, successive planes of the vessel are modeled as unknown states of a sequential process. Such states consist of the orientation, position, shape model and appearance (in statistical terms) of the vessel that are recovered in an incremental fashion, using a sequential Bayesian filter (Particle Filter). In order to account for bifurcations and branchings, we consider a Monte Carlo sampling rule that propagates in parallel multiple hypotheses. Promising results on the segmentation of coronary arteries demonstrate the potential of the proposed approach.

  9. Spatial dependence of color assimilation by the watercolor effect.

    Science.gov (United States)

    Devinck, Frédéric; Delahunt, Peter B; Hardy, Joseph L; Spillmann, Lothar; Werner, John S

    2006-01-01

    Color assimilation with bichromatic contours was quantified for spatial extents ranging from von Bezold-type color assimilation to the watercolor effect. The magnitude and direction of the assimilative hue change were measured as a function of the width of a rectangular stimulus. Assimilation was quantified by hue cancellation. Large hue shifts were required to null the color of stimuli ≤9.3 min of arc in width, with an exponential decrease for stimuli increasing up to 7.4 deg. When stimuli were viewed through an achromatizing lens, the magnitude of the assimilation effect was reduced for narrow stimuli, but not for wide ones. These results demonstrate that chromatic aberration may account, in part, for color assimilation over small, but not large, surface areas.

  10. Accelerating assimilation development for new observing systems using EFSO

    Science.gov (United States)

    Lien, Guo-Yuan; Hotta, Daisuke; Kalnay, Eugenia; Miyoshi, Takemasa; Chen, Tse-Chun

    2018-03-01

    To successfully assimilate data from a new observing system, it is necessary to develop appropriate data selection strategies, assimilating only the generally useful data. This development work is usually done by trial and error using observing system experiments (OSEs), which are very time and resource consuming. This study proposes a new, efficient methodology to accelerate the development using ensemble forecast sensitivity to observations (EFSO). First, non-cycled assimilation of the new observation data is conducted to compute EFSO diagnostics for each observation within a large sample. Second, the average EFSO conditionally sampled in terms of various factors is computed. Third, potential data selection criteria are designed based on the non-cycled EFSO statistics, and tested in cycled OSEs to verify the actual assimilation impact. The usefulness of this method is demonstrated with the assimilation of satellite precipitation data. It is shown that the EFSO-based method can efficiently suggest data selection criteria that significantly improve the assimilation results.

  11. Benefits and Pitfalls of GRACE Terrestrial Water Storage Data Assimilation

    Science.gov (United States)

    Girotto, Manuela

    2018-01-01

    Satellite observations of terrestrial water storage (TWS) from the Gravity Recovery and Climate Experiment (GRACE) mission have a coarse resolution in time (monthly) and space (roughly 150,000 sq km at midlatitudes) and vertically integrate all water storage components over land, including soil moisture and groundwater. Nonetheless, data assimilation can be used to horizontally downscale and vertically partition GRACE-TWS observations. This presentation illustrates some of the benefits and drawbacks of assimilating TWS observations from GRACE into a land surface model over the continental United States and India. The assimilation scheme yields improved skill metrics for groundwater compared to the no-assimilation simulations. A smaller impact is seen for surface and root-zone soil moisture. Further, GRACE observes TWS depletion associated with anthropogenic groundwater extraction. Results from the assimilation emphasize the importance of representing anthropogenic processes in land surface modeling and data assimilation systems.

  12. Initializing carbon cycle predictions from the Community Land Model by assimilating global biomass observations

    Science.gov (United States)

    Fox, A. M.; Hoar, T. J.; Smith, W. K.; Moore, D. J.

    2017-12-01

    …assumptions and inputs in the algorithms that are incompatible with those encoded within CLM. It is probable that VOD describes changes in biomass more accurately than absolute values, so in addition to sequential assimilation of observations, we have tested alternative filter algorithms and assimilating VOD anomalies.

  13. Ensemble perturbation smoother for optimizing tidal boundary conditions by assimilation of High-Frequency radar surface currents – application to the German Bight

    Directory of Open Access Journals (Sweden)

    A. Barth

    2010-02-01

    High-Frequency (HF) radars measure the ocean surface currents at various spatial and temporal scales. These include tidal currents, wind-driven circulation, density-driven circulation and Stokes drift. Sequential assimilation methods updating the model state have proven successful in correcting the density-driven currents by assimilation of observations such as sea surface height, sea surface temperature and in-situ profiles. However, the situation is different for tides in coastal models, since these are not generated within the domain, but are rather propagated inside the domain through the boundary conditions. For improving the modeled tidal variability it is therefore not sufficient to update the model state via data assimilation without updating the boundary conditions. The optimization of boundary conditions to match observations inside the domain is traditionally achieved through variational assimilation methods. In this work we present an ensemble smoother to improve the tidal boundary values so that the model represents more closely the observed currents. To create an ensemble of dynamically realistic boundary conditions, a cost function is formulated which is directly related to the probability of each boundary condition perturbation. This cost function ensures that the boundary condition perturbations are spatially smooth and that the structure of the perturbations satisfies approximately the harmonic linearized shallow water equations. Based on those perturbations an ensemble simulation is carried out using the full three-dimensional General Estuarine Ocean Model (GETM). Optimized boundary values are obtained by assimilating all observations using the covariances of the ensemble simulation.

  14. Testbed model and data assimilation for ARM

    International Nuclear Information System (INIS)

    Louis, J.F.

    1992-01-01

    The objectives of this contract are to further develop and test the ALFA (AER Local Forecast and Assimilation) model, originally designed at AER for local weather prediction, and to apply it to three distinct but related purposes in connection with the Atmospheric Radiation Measurement (ARM) program: (a) to provide a testbed that simulates a global climate model in order to facilitate the development and testing of new cloud parametrizations and radiation models; (b) to assimilate the ARM data continuously at the scale of a climate model, using the adjoint method, thus providing the initial conditions and verification data for testing parametrizations; (c) to study the sensitivity of a radiation scheme to cloud parameters, again using the adjoint method, thus demonstrating the usefulness of the testbed model. The data assimilation will use a variational technique that minimizes the difference between the model results and the observations during the analysis period. The adjoint model is used to compute the gradient of a measure of the model errors with respect to nudging terms that are added to the equations to force the model output closer to the data. The radiation scheme that will be included in the basic ALFA model makes use of a general two-stream approximation, and is designed for vertically inhomogeneous, multiple-scattering atmospheres. The sensitivity of this model to the definition of cloud parameters will be studied. The adjoint technique will also be used to compute the sensitivities. This project is designed to provide the Science Team members with the appropriate tools and modeling environment for proper testing and tuning of new radiation models and cloud parametrization schemes.

  15. Nitrogen uptake and assimilation by corn roots

    International Nuclear Information System (INIS)

    Yoneyama, Tadakatsu; Akiyama, Yoko; Kumazawa, Kikuo

    1977-01-01

    The site of nitrogen uptake in the apical root zone of corn was investigated in two experiments. The first examined the assimilation of nitrate and ammonium and the effect of low temperature on it: 4-day-old roots were treated with ¹⁵N-labelled inorganic nitrogen at 20 ppm N in 5 × 10⁻⁴ M CaSO₄ solution at 30 °C and 0 °C. The second examined nitrogen uptake in the apical root zone and the utilization of newly absorbed nitrogen at the root top: 4-day-old roots were transferred into 5 × 10⁻⁴ M CaSO₄ solution containing ¹⁵N-labelled ammonium nitrate at 40 ppm N. As a result, the effect of low temperature on nitrogen uptake was more drastic for nitrate than for ammonium. The ¹⁵N content of amino acids indicates that ammonium is assimilated into amino acids even at 0 °C, but nitrate is not. Ammonium nitrogen appeared to be absorbed at both the cell-dividing and cell-elongating zones, whereas nitrate nitrogen appeared to be absorbed mainly at the cell-elongating zone. The nitrogen in the apical part may be supplied not only by direct absorption but also by translocation from the basal part. A clear difference was found in the utilization of nitrate and ammonium nitrogen at the root top while the root was elongating, which may be due to the difference in the assimilation products of the two forms of inorganic nitrogen. Newly absorbed ammonium nitrogen is more readily utilized for the growth of the root top than nitrate nitrogen. (Iwakiri, K.)

  16. Trial Sequential Methods for Meta-Analysis

    Science.gov (United States)

    Kulinskaya, Elena; Wood, John

    2014-01-01

    Statistical methods for sequential meta-analysis have applications also for the design of new trials. Existing methods are based on group sequential methods developed for single trials and start with the calculation of a required information size. This works satisfactorily within the framework of fixed effects meta-analysis, but conceptual…

  17. Data assimilation approaches in the EURANOS project

    DEFF Research Database (Denmark)

    Kaiser, J.C.; Gering, F.; Astrup, Poul

    2010-01-01

    -nuclides in urban areas the results of demonstration exercises are presented here. With the data assimilation module of the RIMPUFF dispersion code, predictions of the gamma dose rate are corrected with simulated readings of fixed detector stations. Using the DA capabilities of the IAMM package for mapping...... the radioactive contamination in inhabited areas, predictions of a large scale deposition model have been combined with hypothetical measurements on a local scale. In both examples the accuracy of the model predictions has been improved and the uncertainties have been reduced. © EDP Sciences, 2010...

  18. Sequential lineup laps and eyewitness accuracy.

    Science.gov (United States)

    Steblay, Nancy K; Dietrich, Hannah L; Ryan, Shannon L; Raczynski, Jeanette L; James, Kali A

    2011-08-01

    Police practice of double-blind sequential lineups prompts a question about the efficacy of repeated viewings (laps) of the sequential lineup. Two laboratory experiments confirmed the presence of a sequential lap effect: an increase in witness lineup picks from first to second lap, when the culprit was a stranger. The second lap produced more errors than correct identifications. In Experiment 2, lineup diagnosticity was significantly higher for sequential lineup procedures that employed a single lap rather than double laps. Witnesses who elected to view a second lap made significantly more errors than witnesses who chose to stop after one lap or those who were required to view two laps. Witnesses with prior exposure to the culprit did not exhibit a sequential lap effect.

  19. Multi-agent sequential hypothesis testing

    KAUST Repository

    Kim, Kwang-Ki K.

    2014-12-15

    This paper considers multi-agent sequential hypothesis testing and presents a framework for strategic learning in sequential games with explicit consideration of both temporal and spatial coordination. The associated Bayes risk functions explicitly incorporate costs of taking private/public measurements, costs of time-difference and disagreement in actions of agents, and costs of false declaration/choices in the sequential hypothesis testing. The corresponding sequential decision processes have well-defined value functions with respect to (a) the belief states for the case of conditional independent private noisy measurements that are also assumed to be independent identically distributed over time, and (b) the information states for the case of correlated private noisy measurements. A sequential investment game of strategic coordination and delay is also discussed as an application of the proposed strategic learning rules.

  20. Sequential Product of Quantum Effects: An Overview

    Science.gov (United States)

    Gudder, Stan

    2010-12-01

    This article presents an overview for the theory of sequential products of quantum effects. We first summarize some of the highlights of this relatively recent field of investigation and then provide some new results. We begin by discussing sequential effect algebras which are effect algebras endowed with a sequential product satisfying certain basic conditions. We then consider sequential products of (discrete) quantum measurements. We next treat transition effect matrices (TEMs) and their associated sequential product. A TEM is a matrix whose entries are effects and whose rows form quantum measurements. We show that TEMs can be employed for the study of quantum Markov chains. Finally, we prove some new results concerning TEMs and vector densities.
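
    For Hilbert-space effects, the sequential product in this literature is standardly taken to be A ∘ B = A^{1/2} B A^{1/2}. A small numerical illustration (using NumPy; the 2×2 effects below are arbitrary examples) shows that the product is non-commutative in general:

        import numpy as np

        def seq_product(A, B):
            """Sequential product of effects: A o B = sqrt(A) B sqrt(A)."""
            w, V = np.linalg.eigh(A)           # A is Hermitian with 0 <= A <= I
            sqrtA = (V * np.sqrt(np.clip(w, 0.0, None))) @ V.conj().T
            return sqrtA @ B @ sqrtA

        # Two arbitrary 2x2 effects (Hermitian matrices between 0 and I)
        A = np.array([[0.7, 0.2], [0.2, 0.3]])
        B = np.array([[0.5, 0.0], [0.0, 0.9]])

        print(np.allclose(seq_product(A, B), seq_product(B, A)))  # False: non-commutative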

  1. First assimilations of COSMIC radio occultation data into the Electron Density Assimilative Model (EDAM)

    Directory of Open Access Journals (Sweden)

    M. J. Angling

    2008-02-01

    Full Text Available Ground based measurements of slant total electron content (TEC) can be assimilated into ionospheric models to produce 3-D representations of ionospheric electron density. The Electron Density Assimilative Model (EDAM) has been developed for this purpose. Previous tests using EDAM and ground based data have demonstrated that the information on the vertical structure of the ionosphere is limited in this type of data. The launch of the COSMIC satellite constellation provides the opportunity to use radio occultation data, which have more vertical information. EDAM assimilations have been run for three time periods representing quiet, moderate and disturbed geomagnetic conditions. For each run, three data sets have been ingested – only ground based data, only COSMIC data, and both ground based and COSMIC data. The results from this preliminary study show that both ground and space based data are capable of improving the representation of the vertical structure of the ionosphere. However, the analysis is limited by the incomplete deployment of the COSMIC constellation and the use of auto-scaled ionosonde data. The first of these can be addressed by repeating this type of study once full deployment has been achieved. The latter requires the manual scaling of ionosonde data; ideally an agreed data set would be scaled and made available to the community to facilitate comparative testing of assimilative models.

  2. Development of airborne remote sensing data assimilation system

    International Nuclear Information System (INIS)

    Gudu, B R; Bi, H Y; Wang, H Y; Qin, S X; Ma, J W

    2014-01-01

    In this paper, an airborne remote sensing data assimilation system for the China Airborne Remote Sensing System is introduced. This data assimilation system is composed of a land surface model, data assimilation algorithms, observation data and fundamental parameters forcing the land surface model. The Variable Infiltration Capacity hydrologic model is selected as the land surface model, which also serves as the main framework of the system. Three-dimensional variational, four-dimensional variational, ensemble Kalman filter and particle filter algorithms are integrated in the system. Observation data include ground observations and remotely sensed data. The fundamental forcing parameters include soil parameters, vegetation parameters and meteorological data.

  3. Boundary Conditions, Data Assimilation, and Predictability in Coastal Ocean Models

    National Research Council Canada - National Science Library

    Samelson, Roger M; Allen, John S; Egbert, Gary D; Kindle, John C; Snyder, Chris

    2007-01-01

    ...: The specific objectives of this research are to determine the impact on coastal ocean circulation models of open ocean boundary conditions from Global Ocean Data Assimilation Experiment (GODAE...

  4. Assimilation of enterprise technology upgrades: a factor-based study

    Science.gov (United States)

    Claybaugh, Craig C.; Ramamurthy, Keshavamurthy; Haseman, William D.

    2017-02-01

    The purpose of this study is to gain a better understanding of the differences in the propensity of firms to initiate and commit to the assimilation of an enterprise technology upgrade. A research model is proposed that examines the influences that four technological and four organisational factors have on predicting assimilation of a technology upgrade. Results show that firms with a greater propensity to assimilate the new enterprise resource planning (ERP) version have a higher assessment of relative advantage, IS technical competence, and the strategic role of IS relative to those firms with a lower propensity to assimilate a new ERP version.

  5. Lectures on Monte Carlo methods

    CERN Document Server

    Madras, Neal

    2001-01-01

    Monte Carlo methods form an experimental branch of mathematics that employs simulations driven by random number generators. These methods are often used when others fail, since they are much less sensitive to the "curse of dimensionality", which plagues deterministic methods in problems with a large number of variables. Monte Carlo methods are used in many fields: mathematics, statistics, physics, chemistry, finance, computer science, and biology, for instance. This book is an introduction to Monte Carlo methods for anyone who would like to use these methods to study various kinds of mathemati
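
    The insensitivity to the curse of dimensionality comes from the n^(-1/2) convergence rate of Monte Carlo estimates, which holds regardless of the number of variables. A minimal sketch (the 10-dimensional ball volume is an arbitrary example chosen for illustration):

        import numpy as np
        from math import pi, gamma

        rng = np.random.default_rng(1)
        d, n = 10, 100_000                     # dimension and sample count

        # Volume of the d-ball: (fraction of uniform cube samples inside) * cube volume.
        x = rng.uniform(-1.0, 1.0, size=(n, d))
        vol_mc = ((x**2).sum(axis=1) <= 1.0).mean() * 2**d

        vol_exact = pi**(d / 2) / gamma(d / 2 + 1)
        print(vol_mc, vol_exact)               # statistical error shrinks like n**-0.5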

  6. Iterative ensemble variational methods for nonlinear data assimilation: Application to transport and atmospheric chemistry

    International Nuclear Information System (INIS)

    Haussaire, Jean-Matthieu

    2017-01-01

    assimilation of real tropospheric ozone concentrations mitigates these results and shows how hard atmospheric chemistry data assimilation is. A strong model error is indeed attached to these models, stemming from multiple uncertainty sources. Two steps must be taken to tackle this issue. First of all, the data assimilation method used must be able to efficiently take into account the model error. However, most methods are developed under the assumption of a perfect model. To avoid this hypothesis, a new method has then been developed. Called IEnKF-Q, it expands the IEnKS to the model error framework. It has been validated on a low-order model, proving its superiority over data assimilation methods naively adapted to take into account model error. Nevertheless, such methods need to know the exact nature and amplitude of the model error which needs to be accounted for. Therefore, the second step is to use statistical tools to quantify this model error. The expectation-maximization algorithm, the naive and unbiased randomize-then-optimize algorithms, an importance sampling based on a Laplace proposal, and a Markov chain Monte Carlo simulation, potentially trans-dimensional, have been assessed, expanded, and compared to estimate the uncertainty on the retrieval of the source term of the Chernobyl and Fukushima-Daiichi nuclear power plant accidents. This thesis therefore improves the domain of 4D EnVar data assimilation by its methodological input and by paving the way to applying these methods on atmospheric chemistry models. (author) [fr

  7. Simultaneous perceptual and response biases on sequential face attractiveness judgments

    Science.gov (United States)

    Pegors, Teresa K.; Mattar, Marcelo G.; Bryan, Peter B.; Epstein, Russell A.

    2015-01-01

    Face attractiveness is a social characteristic that we often use to make first-pass judgments about the people around us. However, these judgments are highly influenced by our surrounding social world, and researchers still understand little about the mechanisms underlying these influences. In a series of three experiments, we used a novel sequential rating paradigm that enabled us to measure biases on attractiveness judgments from the previous face and the previous rating. Our results revealed two simultaneous and opposing influences on face attractiveness judgments that arise from our past experience of faces: a response bias in which attractiveness ratings shift towards a previously given rating, and a stimulus bias in which attractiveness ratings shift away from the mean attractiveness of the previous face. Furthermore, we provide evidence that the contrastive stimulus bias (but not the assimilative response bias) is strengthened by increasing the duration of the previous stimulus, suggesting an underlying perceptual mechanism. These results demonstrate that judgments of face attractiveness are influenced by information from our evaluative and perceptual history and that these influences have measurable behavioral effects over the course of just a few seconds. PMID:25867223

  8. Sequential Scintigraphy in Renal Transplantation

    Energy Technology Data Exchange (ETDEWEB)

    Winkel, K. zum; Harbst, H.; Schenck, P.; Franz, H. E.; Ritz, E.; Roehl, L.; Ziegler, M.; Ammann, W.; Maier-Borst, W. [Institut Fuer Nuklearmedizin, Deutsches Krebsforschungszentrum, Heidelberg, Federal Republic of Germany (Germany)

    1969-05-15

    Based on experience gained from more than 1600 patients with proved or suspected kidney diseases and on results of extended studies with dogs, sequential scintigraphy was performed after renal transplantation in dogs. After intravenous injection of 500 μCi of ¹³¹I-Hippuran, scintiphotos were taken during the first minute with an exposure time of 15 sec each, and thereafter with an exposure of 2 min up to at least 16 min. Several examinations were evaluated digitally. 26 examinations were performed on 11 dogs with homotransplanted kidneys. Immediately after transplantation the renal function was almost normal and the bladder was filled in due time. At the beginning of rejection the initial uptake of radioactive Hippuran was reduced. The intrarenal transport became delayed; probably the renal extraction rate decreased. Corresponding to the development of an oedema in the transplant, the uptake area increased in size. In cases of thrombosis of the main artery there was no evidence of any uptake of radioactivity in the transplant. Similar results were obtained in 41 examinations on 15 persons. Patients with postoperative anuria due to acute tubular necrosis still showed some uptake of radioactivity, contrary to those with thrombosis of the renal artery, where no uptake was found. In cases of rejection the most frequent signs were a reduced initial uptake and a delayed intrarenal transport of radioactive Hippuran. Infarction could be detected by a reduced uptake in distinct areas of the transplant. (author)

  9. Sequential provisional implant prosthodontics therapy.

    Science.gov (United States)

    Zinner, Ira D; Markovits, Stanley; Jansen, Curtis E; Reid, Patrick E; Schnader, Yale E; Shapiro, Herbert J

    2012-01-01

    The fabrication and long-term use of first- and second-stage provisional implant prostheses is critical to create a favorable prognosis for function and esthetics of a fixed implant-supported prosthesis. The fixed metal and acrylic resin cemented first-stage prosthesis, as reviewed in Part I, is needed to prevent movement of adjacent and opposing teeth and pressure on the implant site, as well as to protect against micromovement of the freshly placed implant body. The second-stage prosthesis, reviewed in Part II, should be used following implant uncovering and abutment installation. The patient wears this provisional prosthesis until maturation of the bone and healing of soft tissues. The second-stage provisional prosthesis is also a fail-safe mechanism for possible early implant failures, and can be used with late failures and/or when the definitive prosthesis must be repaired. In addition, the screw-retained provisional prosthesis is used if and when an implant requires removal or other implants are to be placed, as in a sequential approach. The creation and use of both first- and second-stage provisional prostheses involve a restorative dentist, dental technician, surgeon, and patient working as a team. If the dentist alone cannot do the diagnosis and treatment planning, surgery, and laboratory techniques, he or she needs help by employing the expertise of a surgeon and a laboratory technician. This team approach is essential for optimum results.

  10. Data Assimilation by delay-coordinate nudging

    Science.gov (United States)

    Pazo, Diego; Lopez, Juan Manuel; Carrassi, Alberto

    2016-04-01

    A new nudging method for data assimilation, delay-coordinate nudging, is presented. Delay-coordinate nudging makes explicit use of present and past observations in the formulation of the forcing that drives the model evolution at each time step. Numerical experiments with a low-order chaotic system show that the new method systematically outperforms standard nudging in different model and observational scenarios, also when using an un-optimized formulation of the delay-nudging coefficients. A connection between the optimal delay and the dominant Lyapunov exponent of the dynamics is found based on heuristic arguments and is confirmed by the numerical results, providing a guideline for the practical implementation of the algorithm. Delay-coordinate nudging preserves the ease of implementation, the intuitive functioning and the reduced computational cost of standard nudging, making it a potential alternative especially in the field of seasonal-to-decadal predictions with large Earth system models that limit the use of more sophisticated data assimilation procedures.
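
    A rough sketch of the idea on the Lorenz-63 system (a stand-in for the paper's low-order chaotic system; the gains, delay, forward-Euler integration and the exact form of the delayed term are illustrative assumptions, not the paper's formulation). The forcing combines the present innovation with a delayed one:

        import numpy as np

        def lorenz63(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
            x, y, z = s
            return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

        rng = np.random.default_rng(2)
        dt, n_steps, delay = 0.01, 5000, 10    # delay measured in steps (illustrative)
        g_now, g_past = 5.0, 2.0               # un-optimized nudging coefficients

        truth = np.array([1.0, 1.0, 20.0])
        model = truth + rng.normal(0.0, 5.0, 3)    # wrong initial condition
        obs_hist, mod_hist = [], []

        for k in range(n_steps):
            truth = truth + dt * lorenz63(truth)
            y_k = truth[0] + rng.normal(0.0, 0.1)  # observe the x-component only
            obs_hist.append(y_k)
            mod_hist.append(model[0])
            force = np.zeros(3)
            force[0] = g_now * (y_k - model[0])    # standard nudging term
            if k >= delay:                         # delay-coordinate term: past innovation
                force[0] += g_past * (obs_hist[k - delay] - mod_hist[k - delay])
            model = model + dt * (lorenz63(model) + force)

        print("final absolute error:", np.abs(model - truth))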

  11. Turbulent viscosity optimized by data assimilation

    Directory of Open Access Journals (Sweden)

    Y. Leredde

    Full Text Available As an alternative approach to classical turbulence modelling using a first or second order closure, the data assimilation method of optimal control is applied to estimate a time and space-dependent turbulent viscosity in a three-dimensional oceanic circulation model. The optimal control method, described for a 3-D primitive equation model, involves the minimization of a cost function that quantifies the discrepancies between the simulations and the observations. An iterative algorithm is obtained via the adjoint model resolution. In a first experiment, a k + L model is used to simulate the one-dimensional development of inertial oscillations resulting from a wind stress at the sea surface and with the presence of a halocline. These results are used as synthetic observations to be assimilated. The turbulent viscosity is then recovered without the k + L closure, even with sparse and noisy observations. The problems of controllability and of the dimensions of the control are then discussed. A second experiment consists of a two-dimensional schematic simulation. A 2-D turbulent viscosity field is estimated from data on the initial and final states of a coastal upwelling event.

    Key words. Oceanography: general (numerical modelling) · Oceanography: physical (turbulence, diffusion, and mixing processes)

  12. Benchmarking a Soil Moisture Data Assimilation System for Agricultural Drought Monitoring

    Science.gov (United States)

    Hun, Eunjin; Crow, Wade T.; Holmes, Thomas; Bolten, John

    2014-01-01

    Despite considerable interest in the application of land surface data assimilation systems (LDAS) for agricultural drought applications, relatively little is known about the large-scale performance of such systems and, thus, the optimal methodological approach for implementing them. To address this need, this paper evaluates an LDAS for agricultural drought monitoring by benchmarking individual components of the system (i.e., a satellite soil moisture retrieval algorithm, a soil water balance model and a sequential data assimilation filter) against a series of linear models which perform the same function (i.e., have the same basic input-output structure) as the full system component. Benchmarking is based on the calculation of the lagged rank cross-correlation between the normalized difference vegetation index (NDVI) and soil moisture estimates acquired for various components of the system. Lagged soil moisture-NDVI correlations obtained using individual LDAS components versus their linear analogs reveal the degree to which non-linearities and/or complexities contained within each component actually contribute to the performance of the LDAS system as a whole. Here, a particular system based on surface soil moisture retrievals from the Land Parameter Retrieval Model (LPRM), a two-layer Palmer soil water balance model and an Ensemble Kalman filter (EnKF) is benchmarked. Results suggest significant room for improvement in each component of the system.
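
    The benchmarking metric itself is easy to sketch: a lagged rank (Spearman) cross-correlation between a soil moisture series and a delayed NDVI-like series. Both series below are synthetic surrogates (not LPRM or Palmer output), used only to show the computation:

        import numpy as np
        from scipy.stats import spearmanr

        rng = np.random.default_rng(3)
        n, true_lag = 500, 10
        sm = rng.normal(size=n).cumsum()                        # surrogate soil moisture series
        ndvi = np.roll(sm, true_lag) + rng.normal(0.0, 2.0, n)  # delayed, noisy response

        def lagged_rank_corr(x, y, max_lag=30):
            """Spearman correlation of x(t) with y(t + lag) for each lag."""
            out = {}
            for lag in range(max_lag + 1):
                xs, ys = (x[:-lag], y[lag:]) if lag else (x, y)
                rho, _ = spearmanr(xs, ys)
                out[lag] = rho
            return out

        corrs = lagged_rank_corr(sm, ndvi)
        best = max(corrs, key=corrs.get)
        print("best lag:", best, " correlation:", round(corrs[best], 2))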

  13. Impact of Assimilation on Heavy Rainfall Simulations Using WRF Model: Sensitivity of Assimilation Results to Background Error Statistics

    Science.gov (United States)

    Rakesh, V.; Kantharao, B.

    2017-03-01

    Data assimilation is considered an effective tool for improving the forecast skill of mesoscale models. However, for optimum utilization and effective assimilation of observations, many factors need to be taken into account while designing a data assimilation methodology. One of the critical components that determines the amount and propagation of observation information into the analysis is the model background error statistics (BES). The objective of this study is to quantify how the BES used in data assimilation impact the simulation of heavy rainfall events over a southern state in India, Karnataka. Simulations of 40 heavy rainfall events were carried out using the Weather Research and Forecasting Model with and without data assimilation. The assimilation experiments were conducted using global and regional BES, while the experiment with no assimilation was used as the baseline for assessing the impact of data assimilation. The simulated rainfall is verified against high-resolution rain-gage observations over Karnataka. Statistical evaluation using several accuracy and skill measures shows that data assimilation has improved the heavy rainfall simulation. Our results showed that the experiment using regional BES outperformed the one which used global BES. Critical thermodynamic variables conducive to heavy rainfall, such as the convective available potential energy, are simulated more realistically using regional BES than using global BES. These results have important practical implications for the design of forecast platforms used in decision-making during extreme weather events.

  14. Ozone data assimilation with GEOS-Chem: a comparison between 3-D-Var, 4-D-Var, and suboptimal Kalman filter approaches

    Science.gov (United States)

    Singh, K.; Sandu, A.; Bowman, K. W.; Parrington, M.; Jones, D. B. A.; Lee, M.

    2011-08-01

    Chemistry transport models determine the evolving chemical state of the atmosphere by solving the fundamental equations that govern physical and chemical transformations, subject to initial conditions of the atmospheric state and surface boundary conditions, e.g., surface emissions. Data assimilation techniques synthesize model predictions with measurements in a rigorous mathematical framework that provides observational constraints on these conditions. Two families of data assimilation methods are currently widely used: variational and Kalman filter (KF). The variational approach is based on control theory and formulates data assimilation as a minimization problem for a cost functional that measures the model-observations mismatch. The Kalman filter approach is rooted in statistical estimation theory and provides the analysis covariance together with the best state estimate. Suboptimal Kalman filters employ different approximations of the covariances in order to make the computations feasible with large models. Each family of methods has both merits and drawbacks. This paper compares several data assimilation methods used for global chemical data assimilation. Specifically, we evaluate data assimilation approaches for improving estimates of the summertime global tropospheric ozone distribution in August 2006, based on ozone observations from the NASA Tropospheric Emission Spectrometer and the GEOS-Chem chemistry transport model. The resulting analyses are compared against independent ozonesonde measurements to assess the effectiveness of each assimilation method. All assimilation methods provide notable improvements over the free model simulations, which differ from the ozonesonde measurements by about 20 % (below 200 hPa). Four-dimensional variational data assimilation with window lengths between five days and two weeks is the most accurate method, with mean differences between analysis profiles and ozonesonde measurements of 1-5 %. Two sequential
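
    For reference, the variational approach minimizes the standard cost functional J(x) = 1/2 (x - x_b)' B^-1 (x - x_b) + 1/2 (y - Hx)' R^-1 (y - Hx). A minimal 3-D-Var sketch with a toy linear observation operator (all dimensions and covariances below are illustrative assumptions, not the GEOS-Chem setup):

        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(4)
        n, m = 20, 8                            # state and observation dimensions
        xb = rng.normal(size=n)                 # background state
        B = np.eye(n)                           # background-error covariance
        H = rng.normal(size=(m, n))             # linear observation operator
        R = 0.25 * np.eye(m)                    # observation-error covariance
        x_true = xb + rng.normal(0.0, 1.0, n)
        y = H @ x_true + rng.multivariate_normal(np.zeros(m), R)

        Binv, Rinv = np.linalg.inv(B), np.linalg.inv(R)

        def cost(x):                            # background term + observation term
            db, dy = x - xb, y - H @ x
            return 0.5 * db @ Binv @ db + 0.5 * dy @ Rinv @ dy

        def grad(x):
            return Binv @ (x - xb) - H.T @ Rinv @ (y - H @ x)

        xa = minimize(cost, xb, jac=grad, method="L-BFGS-B").x
        print("background error:", np.linalg.norm(xb - x_true))
        print("analysis error:  ", np.linalg.norm(xa - x_true))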

  15. A Robust Non-Gaussian Data Assimilation Method for Highly Non-Linear Models

    Directory of Open Access Journals (Sweden)

    Elias D. Nino-Ruiz

    2018-03-01

    Full Text Available In this paper, we propose an efficient EnKF implementation for non-Gaussian data assimilation based on Gaussian Mixture Models and Markov-Chain-Monte-Carlo (MCMC) methods. The proposed method works as follows: based on an ensemble of model realizations, prior errors are estimated via a Gaussian Mixture density whose parameters are approximated by means of an Expectation Maximization method. Then, by using an iterative method, observation operators are linearized about current solutions and posterior modes are estimated via an MCMC implementation. The acceptance/rejection criterion is similar to that of the Metropolis-Hastings rule. Experimental tests are performed on the Lorenz 96 model. The results show that the proposed method can decrease prior errors by several orders of magnitude in a root-mean-square-error sense for nearly sparse or dense observational networks.
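
    The acceptance/rejection step mentioned above follows the usual Metropolis-Hastings rule: accept a proposal x' with probability min(1, p(x')/p(x)) for a symmetric proposal. A minimal random-walk sketch on a toy one-dimensional posterior (the quadratic observation operator is an illustrative assumption, not the Lorenz 96 setup):

        import numpy as np

        rng = np.random.default_rng(5)

        def log_post(x):
            """Toy log-posterior: Gaussian prior times a nonlinear observation of x**2."""
            y, obs_var, prior_var = 4.0, 0.5, 1.0
            return -0.5 * (y - x**2)**2 / obs_var - 0.5 * x**2 / prior_var

        x, step, chain = 1.0, 0.5, []
        for _ in range(20000):
            xp = x + rng.normal(0.0, step)               # random-walk proposal
            if np.log(rng.uniform()) < log_post(xp) - log_post(x):
                x = xp                                   # Metropolis-Hastings acceptance
            chain.append(x)

        samples = np.array(chain[5000:])                 # discard burn-in
        print("posterior mode magnitude (about 1.9):", np.abs(samples).mean())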

  16. Monte Carlo simulation for IRRMA

    International Nuclear Information System (INIS)

    Gardner, R.P.; Liu Lianyan

    2000-01-01

    Monte Carlo simulation is fast becoming a standard approach for many radiation applications that were previously treated almost entirely by experimental techniques. This is certainly true for Industrial Radiation and Radioisotope Measurement Applications - IRRMA. The reasons for this include: (1) the increased cost and inadequacy of experimentation for design and interpretation purposes; (2) the availability of low cost, large memory, and fast personal computers; and (3) the general availability of general purpose Monte Carlo codes that are increasingly user-friendly, efficient, and accurate. This paper discusses the history and present status of Monte Carlo simulation for IRRMA including the general purpose (GP) and specific purpose (SP) Monte Carlo codes and future needs - primarily from the experience of the authors

  17. Geology of Maxwell Montes, Venus

    Science.gov (United States)

    Head, J. W.; Campbell, D. B.; Peterfreund, A. R.; Zisk, S. A.

    1984-01-01

    Maxwell Montes represent the most distinctive topography on the surface of Venus, rising some 11 km above mean planetary radius. The multiple data sets of the Pioneer mission and Earth-based radar observations are analyzed to characterize Maxwell Montes. Maxwell Montes is a porkchop-shaped feature located at the eastern end of Lakshmi Planum. The main massif trends about North 20 deg West for approximately 1000 km, and the narrow handle extends several hundred km west-southwest (WSW) from the north end of the main massif, descending down toward Lakshmi Planum. The main massif is rectilinear and approximately 500 km wide. The southern and northern edges of Maxwell Montes coincide with major topographic boundaries defining the edge of Ishtar Terra.

  18. Aerosol Observability and Predictability: From Research to Operations for Chemical Weather Forecasting. Lagrangian Displacement Ensembles for Aerosol Data Assimilation

    Science.gov (United States)

    da Silva, Arlindo

    2010-01-01

    A challenge common to many constituent data assimilation applications is the fact that one observes a much smaller fraction of the phase space than one wishes to estimate. For example, remotely sensed estimates of the column average concentrations are available, while one is faced with the problem of estimating 3D concentrations for initializing a prognostic model. This problem is exacerbated in the case of aerosols because the observable Aerosol Optical Depth (AOD) is not only a column integrated quantity, but it also sums over a large number of species (dust, sea-salt, carbonaceous and sulfate aerosols). An aerosol transport model driven by high-resolution, state-of-the-art analysis of meteorological fields and realistic emissions can produce skillful forecasts even when no aerosol data is assimilated. The main task of aerosol data assimilation is to address the bias arising from inaccurate emissions, and the Lagrangian misplacement of plumes induced by errors in the driving meteorological fields. As long as one decouples the meteorological and aerosol assimilation as we do here, the classic baroclinic growth of error is no longer the main order of business. We will describe an aerosol data assimilation scheme in which the analysis update step is conducted in observation space, using an adaptive maximum-likelihood scheme for estimating background errors in AOD space. This scheme includes explicit sequential bias estimation as in Dee and da Silva. Unlike existing aerosol data assimilation schemes, we do not obtain analysis increments of the 3D concentrations by scaling the background profiles. Instead we explore the Lagrangian characteristics of the problem for generating local displacement ensembles. These high-resolution, state-dependent ensembles are then used to parameterize the background errors and generate 3D aerosol increments. The algorithm is computationally efficient, running at a resolution of 1/4 degree, globally. We will present the result of

  19. Adjoint electron Monte Carlo calculations

    International Nuclear Information System (INIS)

    Jordan, T.M.

    1986-01-01

    Adjoint Monte Carlo is the most efficient method for accurate analysis of space systems exposed to natural and artificially enhanced electron environments. Recent adjoint calculations for isotropic electron environments include: comparative data for experimental measurements on electronics boxes; benchmark problem solutions for comparing total dose prediction methodologies; preliminary assessment of sectoring methods used during space system design; and total dose predictions on an electronics package. Adjoint Monte Carlo, forward Monte Carlo, and experiment are in excellent agreement for electron sources that simulate space environments. For electron space environments, adjoint Monte Carlo is clearly superior to forward Monte Carlo, requiring one to two orders of magnitude less computer time for relatively simple geometries. The solid-angle sectoring approximations used for routine design calculations can err by more than a factor of 2 on dose in simple shield geometries. For critical space systems exposed to severe electron environments, these potential sectoring errors demand the establishment of large design margins and/or verification of shield design by adjoint Monte Carlo/experiment

  20. Monte Carlo theory and practice

    International Nuclear Information System (INIS)

    James, F.

    1987-01-01

    Historically, the first large-scale calculations to make use of the Monte Carlo method were studies of neutron scattering and absorption, random processes for which it is quite natural to employ random numbers. Such calculations, a subset of Monte Carlo calculations, are known as direct simulation, since the 'hypothetical population' of the narrower definition above corresponds directly to the real population being studied. The Monte Carlo method may be applied wherever it is possible to establish equivalence between the desired result and the expected behaviour of a stochastic system. The problem to be solved may already be of a probabilistic or statistical nature, in which case its Monte Carlo formulation will usually be a straightforward simulation, or it may be of a deterministic or analytic nature, in which case an appropriate Monte Carlo formulation may require some imagination and may appear contrived or artificial. In any case, the suitability of the method chosen will depend on its mathematical properties and not on its superficial resemblance to the problem to be solved. The authors show how Monte Carlo techniques may be compared with other methods of solution of the same physical problem

  1. Blind Decoding of Multiple Description Codes over OFDM Systems via Sequential Monte Carlo

    Directory of Open Access Journals (Sweden)

    Guo Dong

    2005-01-01

    Full Text Available We consider the problem of transmitting a continuous source through an OFDM system. Multiple description scalar quantization (MDSQ) is applied to the source signal, resulting in two correlated source descriptions. The two descriptions are then OFDM modulated and transmitted through two parallel frequency-selective fading channels. At the receiver, a blind turbo receiver is developed for joint OFDM demodulation and MDSQ decoding. Extrinsic information on the two descriptions is exchanged between the component decoders to improve system performance. A blind soft-input soft-output OFDM detector is developed, based on the techniques of importance sampling and resampling. Such a detector is capable of exchanging the so-called extrinsic information with the other component in the above turbo receiver, and successively improving the overall receiver performance. Finally, we also treat channel-coded systems, for which a novel blind turbo receiver is developed for joint demodulation, channel decoding, and MDSQ source decoding.
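
    Importance sampling with resampling is the core of such sequential Monte Carlo detectors. A minimal sketch on a scalar linear-Gaussian state-space model (a stand-in for the OFDM fading-channel model, which is not reproduced here; all model constants are illustrative assumptions):

        import numpy as np

        rng = np.random.default_rng(6)
        n_p, n_t = 500, 50                      # particles and time steps

        # Toy model: x_t = 0.9 x_{t-1} + N(0, 0.5^2),  y_t = x_t + N(0, 0.3^2)
        x, xs, ys = 0.0, [], []
        for _ in range(n_t):
            x = 0.9 * x + rng.normal(0.0, 0.5)
            xs.append(x)
            ys.append(x + rng.normal(0.0, 0.3))

        particles = rng.normal(0.0, 1.0, n_p)
        w = np.full(n_p, 1.0 / n_p)
        est = []
        for y in ys:
            particles = 0.9 * particles + rng.normal(0.0, 0.5, n_p)  # propose (importance sampling)
            w = w * np.exp(-0.5 * (y - particles)**2 / 0.3**2)       # reweight by likelihood
            w = w / w.sum()
            if 1.0 / (w**2).sum() < n_p / 2:                         # low effective sample size
                idx = rng.choice(n_p, size=n_p, p=w)                 # resampling step
                particles, w = particles[idx], np.full(n_p, 1.0 / n_p)
            est.append(np.sum(w * particles))

        print("filter RMSE:", np.sqrt(np.mean((np.array(est) - np.array(xs))**2)))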

  2. Efficient Sequential Monte Carlo Sampling for Continuous Monitoring of a Radiation Situation

    Czech Academy of Sciences Publication Activity Database

    Šmídl, Václav; Hofman, Radek

    2014-01-01

    Roč. 56, č. 4 (2014), s. 514-527 ISSN 0040-1706 R&D Projects: GA MV VG20102013018 Institutional support: RVO:67985556 Keywords : radiation protection * atmospheric dispersion model * importance sampling Subject RIV: BD - Theory of Information Impact factor: 1.814, year: 2014 http://library.utia.cas.cz/separaty/2014/AS/smidl-0433631.pdf

  3. Parallel Sequential Monte Carlo for Efficient Density Combination: The Deco Matlab Toolbox

    DEFF Research Database (Denmark)

    Casarin, Roberto; Grassi, Stefano; Ravazzolo, Francesco

    This paper presents the Matlab package DeCo (Density Combination) which is based on the paper by Billio et al. (2013) where a constructive Bayesian approach is presented for combining predictive densities originating from different models or other sources of information. The combination weights...... for standard CPU computing and for Graphical Process Unit (GPU) parallel computing. For the GPU implementation we use the Matlab parallel computing toolbox and show how to use General Purpose GPU computing almost effortlessly. This GPU implementation comes with a speed up of the execution time up to seventy...... times compared to a standard CPU Matlab implementation on a multicore CPU. We show the use of the package and the computational gain of the GPU version, through some simulation experiments and empirical applications....

  4. Tradable permit allocations and sequential choice

    Energy Technology Data Exchange (ETDEWEB)

    MacKenzie, Ian A. [Centre for Economic Research, ETH Zuerich, Zurichbergstrasse 18, 8092 Zuerich (Switzerland)

    2011-01-15

    This paper investigates initial allocation choices in an international tradable pollution permit market. For two sovereign governments, we compare allocation choices that are either simultaneously or sequentially announced. We show sequential allocation announcements result in higher (lower) aggregate emissions when announcements are strategic substitutes (complements). Whether allocation announcements are strategic substitutes or complements depends on the relationship between the follower's damage function and governments' abatement costs. When the marginal damage function is relatively steep (flat), allocation announcements are strategic substitutes (complements). For quadratic abatement costs and damages, sequential announcements provide a higher level of aggregate emissions. (author)

  5. Sequential Generalized Transforms on Function Space

    Directory of Open Access Journals (Sweden)

    Jae Gil Choi

    2013-01-01

    Full Text Available We define two sequential transforms on a function space C_{a,b}[0,T] induced by a generalized Brownian motion process. We then establish the existence of the sequential transforms for functionals in a Banach algebra of functionals on C_{a,b}[0,T]. We also establish that any one of these transforms acts like an inverse transform of the other transform. Finally, we give some remarks about certain relations between our sequential transforms and other well-known transforms on C_{a,b}[0,T].

  6. Data assimilation in the decision support system RODOS

    DEFF Research Database (Denmark)

    Rojas-Palma, C.; Madsen, H.; Gering, F.

    2003-01-01

    The process of combining model predictions and observations, usually referred to as data assimilation, is described in this article within the framework of the real time on-line decision support system (RODOS) for off-site nuclear emergency management in Europe. Data assimilation capabilities, based on Kalman...

  7. Assimilation as Attraction: Computing Distance, Similarity, and Locality in Phonology

    Science.gov (United States)

    Wayment, Adam

    2009-01-01

    This dissertation explores similarity effects in assimilation, proposing an Attraction Framework to analyze cases of parasitic harmony where a trigger-target pair only results in harmony if the trigger and target agree on other features. Attraction provides a natural model of these effects by relating the pressure for assimilation to the…

  8. Assimilating Remote Sensing Observations of Leaf Area Index and Soil Moisture for Wheat Yield Estimates: An Observing System Simulation Experiment

    Science.gov (United States)

    Nearing, Grey S.; Crow, Wade T.; Thorp, Kelly R.; Moran, Mary S.; Reichle, Rolf H.; Gupta, Hoshin V.

    2012-01-01

    Observing system simulation experiments were used to investigate ensemble Bayesian state updating data assimilation of observations of leaf area index (LAI) and soil moisture (theta) for the purpose of improving single-season wheat yield estimates with the Decision Support System for Agrotechnology Transfer (DSSAT) CropSim-Ceres model. Assimilation was conducted in an energy-limited environment and a water-limited environment. Modeling uncertainty was prescribed to weather inputs, soil parameters and initial conditions, and cultivar parameters and through perturbations to model state transition equations. The ensemble Kalman filter and the sequential importance resampling filter were tested for the ability to attenuate effects of these types of uncertainty on yield estimates. LAI and theta observations were synthesized according to characteristics of existing remote sensing data, and effects of observation error were tested. Results indicate that the potential for assimilation to improve end-of-season yield estimates is low. Limitations are due to a lack of root zone soil moisture information, error in LAI observations, and a lack of correlation between leaf and grain growth.

  9. The dynamic radiation environment assimilation model (DREAM)

    International Nuclear Information System (INIS)

    Reeves, Geoffrey D.; Koller, Josef; Tokar, Robert L.; Chen, Yue; Henderson, Michael G.; Friedel, Reiner H.

    2010-01-01

    The Dynamic Radiation Environment Assimilation Model (DREAM) is a 3-year effort sponsored by the US Department of Energy to provide global, retrospective, or real-time specification of the natural and potential nuclear radiation environments. The DREAM model uses Kalman filtering techniques that combine the strengths of new physical models of the radiation belts with electron observations from long-term satellite systems such as GPS and geosynchronous systems. DREAM includes a physics model for the production and long-term evolution of artificial radiation belts from high altitude nuclear explosions. DREAM has been validated against satellites in arbitrary orbits and consistently produces more accurate results than existing models. Tools for user-specific applications and graphical displays are in beta testing and a real-time version of DREAM has been in continuous operation since November 2009.

  10. Transgenic plants that exhibit enhanced nitrogen assimilation

    Science.gov (United States)

    Coruzzi, Gloria M.; Brears, Timothy

    1999-01-01

    The present invention relates to a method for producing plants with improved agronomic and nutritional traits. Such traits include enhanced nitrogen assimilatory and utilization capacities, faster and more vigorous growth, greater vegetative and reproductive yields, and enriched or altered nitrogen content in vegetative and reproductive parts. More particularly, the invention relates to the engineering of plants modified to have altered expression of key enzymes in the nitrogen assimilation and utilization pathways. In one embodiment of the present invention, the desired altered expression is accomplished by engineering the plant for ectopic overexpression of one or more of the native or modified nitrogen assimilatory enzymes. The invention also has a number of other embodiments, all of which are disclosed herein.

  11. Assimilating data into open ocean tidal models

    Science.gov (United States)

    Kivman, Gennady A.

    Because every practically available data set is incomplete and imperfect, the problem of deriving tidal fields from observations has an infinitely large number of allowable solutions that fit the data within measurement errors, and hence can be treated as ill-posed. Therefore, interpolating the data always relies on some a priori assumptions concerning the tides, which provide a rule of sampling or, in other words, a regularization of the ill-posed problem. Data assimilation procedures used in large scale tide modeling are viewed in a common mathematical framework as such regularizations. It is shown that they all (basis functions expansion, parameter estimation, nudging, objective analysis, general inversion, and extended general inversion), including those (objective analysis and general inversion) originally formulated in stochastic terms, may be considered as utilizations of one of the three general methods suggested by the theory of ill-posed problems. The problem of grid refinement, critical for inverse methods and nudging, is discussed.
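
    The common structure of these regularizations can be sketched with the simplest case, Tikhonov regularization: among the infinitely many fields fitting sparse observations, pick the smoothest one. The operators, weights, and the sinusoidal "tidal field" below are illustrative assumptions:

        import numpy as np

        rng = np.random.default_rng(7)
        n, m = 40, 10                           # unknown field values, sparse observations
        idx = rng.choice(n, size=m, replace=False)
        H = np.zeros((m, n))
        H[np.arange(m), idx] = 1.0              # sampling (observation) operator
        x_true = np.sin(np.linspace(0.0, 2.0 * np.pi, n))
        y = H @ x_true + rng.normal(0.0, 0.05, m)

        # Underdetermined: infinitely many fields fit y. Add an a priori smoothness
        # penalty and solve  min ||Hx - y||^2 + alpha * ||Dx||^2.
        D = np.diff(np.eye(n), axis=0)          # first-difference (roughness) operator
        alpha = 1.0                             # regularization weight
        x_reg = np.linalg.solve(H.T @ H + alpha * D.T @ D, H.T @ y)

        print("data misfit:", round(np.linalg.norm(H @ x_reg - y), 3))
        print("field error:", round(np.linalg.norm(x_reg - x_true), 3))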

  12. Efficacy of premixed versus sequential administration of ...

    African Journals Online (AJOL)

    sequential administration in separate syringes on block characteristics, haemodynamic parameters, side effect profile and postoperative analgesic requirement. Trial design: This was a prospective, randomised clinical study. Method: Sixty orthopaedic patients scheduled for elective lower limb surgery under spinal ...

  13. PARTICLE FILTERING WITH SEQUENTIAL PARAMETER LEARNING FOR NONLINEAR BOLD fMRI SIGNALS.

    Science.gov (United States)

    Xia, Jing; Wang, Michelle Yongmei

    Analyzing the blood oxygenation level dependent (BOLD) effect in the functional magnetic resonance imaging (fMRI) is typically based on recent ground-breaking time series analysis techniques. This work represents a significant improvement over existing approaches to system identification using nonlinear hemodynamic models. It is important for three reasons. First, instead of using linearized approximations of the dynamics, we present a nonlinear filtering based on the sequential Monte Carlo method to capture the inherent nonlinearities in the physiological system. Second, we simultaneously estimate the hidden physiological states and the system parameters through particle filtering with sequential parameter learning to fully take advantage of the dynamic information of the BOLD signals. Third, during the unknown static parameter learning, we employ the low-dimensional sufficient statistics for efficiency and avoiding potential degeneration of the parameters. The performance of the proposed method is validated using both the simulated data and real BOLD fMRI data.
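
    A much-simplified sketch of joint state and parameter estimation by particle filtering, with the hemodynamic model replaced by an AR(1) surrogate and the paper's sufficient-statistics parameter learning replaced by a small jitter kernel (a cruder device aimed at the same degeneracy problem; all constants are illustrative assumptions):

        import numpy as np

        rng = np.random.default_rng(8)
        n_p, n_t, a_true = 1000, 200, 0.8

        # AR(1) surrogate for the nonlinear state equation, with unknown static a.
        x, ys = 0.0, []
        for _ in range(n_t):
            x = a_true * x + rng.normal(0.0, 0.3)
            ys.append(x + rng.normal(0.0, 0.2))

        xp = rng.normal(0.0, 1.0, n_p)          # state particles
        ap = rng.uniform(0.0, 1.0, n_p)         # parameter particles
        for y in ys:
            ap = ap + rng.normal(0.0, 0.01, n_p)      # jitter against degeneracy
            xp = ap * xp + rng.normal(0.0, 0.3, n_p)  # propagate states
            w = np.exp(-0.5 * (y - xp)**2 / 0.2**2)   # likelihood weights
            w = w / w.sum()
            idx = rng.choice(n_p, size=n_p, p=w)      # resample states and parameters jointly
            xp, ap = xp[idx], ap[idx]

        print("estimated a:", round(ap.mean(), 2), " true a:", a_true)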

  14. Structural Consistency, Consistency, and Sequential Rationality.

    OpenAIRE

    Kreps, David M; Ramey, Garey

    1987-01-01

    Sequential equilibria comprise consistent beliefs and a sequentially rational strategy profile. Consistent beliefs are limits of Bayes rational beliefs for sequences of strategies that approach the equilibrium strategy. Beliefs are structurally consistent if they are rationalized by some single conjecture concerning opponents' strategies. Consistent beliefs are not necessarily structurally consistent, notwithstanding a claim by Kreps and Robert Wilson (1982). Moreover, the spirit of stru...

  15. Reliability Evaluation of Distribution System Considering Sequential Characteristics of Distributed Generation

    Directory of Open Access Journals (Sweden)

    Sheng Wanxing

    2016-01-01

    Full Text Available To address the randomness of the output power of distributed generation (DG), a reliability evaluation model based on sequential Monte Carlo simulation (SMCS) for distribution systems with DG is proposed. Operating states of the distribution system can be sampled by SMCS in chronological order, and the corresponding output power of DG can be generated accordingly. The proposed method has been tested on feeder F4 of IEEE-RBTS Bus 6. The results show that reliability evaluation of a distribution system considering the uncertainty of the output power of DG can be effectively implemented by SMCS.
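
    The chronological sampling at the heart of sequential MCS can be sketched for a single repairable component (exponential up/down durations are an illustrative assumption; the DG output-power model and network topology are omitted):

        import numpy as np

        rng = np.random.default_rng(9)
        lam, mu, years = 0.5, 50.0, 10000.0     # failure and repair rates [1/yr]

        # Sample alternating up/down durations in chronological order.
        t, down_time, failures = 0.0, 0.0, 0
        while True:
            t += rng.exponential(1.0 / lam)     # time to next failure
            if t >= years:
                break
            failures += 1
            repair = rng.exponential(1.0 / mu)  # repair duration
            down_time += min(repair, years - t)
            t += repair

        print("failure frequency [1/yr]:", failures / years)   # close to lam for short repairs
        print("unavailability:", down_time / years)            # close to lam / (lam + mu)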

  16. Assessing the benefit of snow data assimilation for runoff modeling in Alpine catchments

    Science.gov (United States)

    Griessinger, Nena; Seibert, Jan; Magnusson, Jan; Jonas, Tobias

    2016-09-01

    In Alpine catchments, snowmelt is often a major contribution to runoff. Therefore, modeling snow processes is important when concerned with flood or drought forecasting, reservoir operation and inland waterway management. In this study, we address the question of how sensitive hydrological models are to the representation of snow cover dynamics and whether the performance of a hydrological model can be enhanced by integrating data from a dedicated external snow monitoring system. As a framework for our tests we have used the hydrological model HBV (Hydrologiska Byråns Vattenbalansavdelning) in the version HBV-light, which has been applied in many hydrological studies and is also in use for operational purposes. While HBV originally follows a temperature-index approach with time-invariant calibrated degree-day factors to represent snowmelt, in this study the HBV model was modified to use snowmelt time series from an external and spatially distributed snow model as model input. The external snow model integrates three-dimensional sequential assimilation of snow monitoring data with a snowmelt model, which is also based on the temperature-index approach but uses a time-variant degree-day factor. The following three variations of this external snow model were applied: (a) the full model with assimilation of observational snow data from a dense monitoring network, (b) the same snow model but with data assimilation switched off and (c) a downgraded version of the same snow model representing snowmelt with a time-invariant degree-day factor. Model runs were conducted for 20 catchments at different elevations within Switzerland for 15 years. Our results show that at low and mid-elevations the performance of the runoff simulations did not vary considerably with the snow model version chosen. At higher elevations, however, best performance in terms of simulated runoff was obtained when using the snowmelt time series from the snow model, which utilized data assimilation
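
    A minimal sketch of the temperature-index (degree-day) snowmelt approach underlying both HBV and the external snow model, here with a time-invariant degree-day factor and synthetic forcing (all parameter values are illustrative assumptions):

        import numpy as np

        def degree_day_melt(temps, precip, ddf=3.0, t0=0.0):
            """Temperature-index snow model: accumulate snow below t0, melt above it
            at ddf [mm / day / deg C] (time-invariant factor, as in standard HBV)."""
            swe, melt_series = 0.0, []
            for t, p in zip(temps, precip):
                if t <= t0:
                    swe += p                            # precipitation falls as snow
                    melt = 0.0
                else:
                    melt = min(swe, ddf * (t - t0))     # potential melt, capped by storage
                    swe -= melt
                melt_series.append(melt)
            return np.array(melt_series)

        # One synthetic winter-to-spring season (illustrative forcing)
        rng = np.random.default_rng(10)
        days = np.arange(180)
        temps = -5.0 + 15.0 * days / 180 + rng.normal(0.0, 2.0, 180)   # deg C
        precip = np.full(180, 2.0)                                     # mm/day
        print("total melt [mm]:", round(degree_day_melt(temps, precip).sum(), 1))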

  17. Data Assimilation for Management of Industrial Groundwater Contamination at a Regional Scale

    KAUST Repository

    El Gharamti, Mohamad

    2014-12-01

    Groundwater is one of the main sources for drinking water and agricultural activities. Various activities of both humans and nature may lead to groundwater pollution. Very often, pollution, or contamination, of groundwater goes undetected for long periods of time until it begins to affect human health and/or the environment. Cleanup technologies used to remediate pollution can be costly and remediation processes are often protracted. A more practical and feasible way to manage groundwater contamination is to monitor and predict contamination and act as soon as there is risk to the population and the environment. Predicting groundwater contamination requires advanced numerical models of groundwater flow and solute transport. Such numerical modeling is increasingly becoming a reference criterion for water resources assessment and environmental protection. Subsurface numerical models are, however, subject to many sources of uncertainties from unknown parameters and approximate dynamics. This dissertation considers the sequential data assimilation approach and tackles the groundwater contamination problem at the port of Rotterdam in the Netherlands. Industrial concentration data are used to monitor and predict the fate of organic contaminants using a three-dimensional coupled flow and reactive transport model. We propose a number of novel assimilation techniques that address different challenges, including prohibitive computational burden, the nonlinearity and coupling of the subsurface dynamics, and the structural and parametric uncertainties. We also investigate the problem of optimal observational designs to optimize the location and the number of wells. The proposed new methods are based on the ensemble Kalman Filter (EnKF), which provides an efficient numerical solution to the Bayesian filtering problem. The dissertation first investigates in depth the popular joint and dual filtering formulations of the state-parameters estimation problem. New methodologies, algorithmically

  18. Bayesian Modelling, Monte Carlo Sampling and Capital Allocation of Insurance Risks

    Directory of Open Access Journals (Sweden)

    Gareth W. Peters

    2017-09-01

    Full Text Available The main objective of this work is to develop a detailed step-by-step guide to the development and application of a new class of efficient Monte Carlo methods to solve practically important problems faced by insurers under the new solvency regulations. In particular, a novel Monte Carlo method to calculate capital allocations for a general insurance company is developed, with a focus on coherent capital allocation that is compliant with the Swiss Solvency Test. The data used is based on the balance sheet of a representative stylized company. For each line of business in that company, allocations are calculated for the one-year risk with dependencies based on correlations given by the Swiss Solvency Test. Two different approaches for dealing with parameter uncertainty are discussed, and simulation algorithms based on (pseudo-marginal) Sequential Monte Carlo algorithms are described and their efficiency is analysed.

  19. Monte Carlo simulation of experiments

    International Nuclear Information System (INIS)

    Opat, G.I.

    1977-07-01

    An outline of the technique of computer simulation of particle physics experiments by the Monte Carlo method is presented. Useful special purpose subprograms are listed and described. At each stage the discussion is made concrete by direct reference to the programs SIMUL8 and its variant MONTE-PION, written to assist in the analysis of the radiative decay experiments μ⁺ → e⁺νₑν̄γ and π⁺ → e⁺νₑγ, respectively. These experiments were based on the use of two large sodium iodide crystals, TINA and MINA, as e and γ detectors. Instructions for the use of SIMUL8 and MONTE-PION are given. (author)

  20. Statistical techniques to extract information during SMAP soil moisture assimilation

    Science.gov (United States)

    Kolassa, J.; Reichle, R. H.; Liu, Q.; Alemohammad, S. H.; Gentine, P.

    2017-12-01

    Statistical techniques permit the retrieval of soil moisture estimates in a model climatology while retaining the spatial and temporal signatures of the satellite observations. As a consequence, the need for bias correction prior to an assimilation of these estimates is reduced, which could result in a more effective use of the independent information provided by the satellite observations. In this study, a statistical neural network (NN) retrieval algorithm is calibrated using SMAP brightness temperature observations and modeled soil moisture estimates (similar to those used to calibrate the SMAP Level 4 DA system). Daily values of surface soil moisture are estimated using the NN and then assimilated into the NASA Catchment model. The skill of the assimilation estimates is assessed based on a comprehensive comparison to in situ measurements from the SMAP core and sparse network sites as well as the International Soil Moisture Network. The NN retrieval assimilation is found to significantly improve the model skill, particularly in areas where the model does not represent processes related to agricultural practices. Additionally, the NN method is compared to assimilation experiments using traditional bias correction techniques. The NN retrieval assimilation is found to more effectively use the independent information provided by SMAP resulting in larger model skill improvements than assimilation experiments using traditional bias correction techniques.

  1. On the role of perception in shaping phonological assimilation rules.

    Science.gov (United States)

    Hura, S L; Lindblom, B; Diehl, R L

    1992-01-01

    Assimilation of nasals to the place of articulation of following consonants is a common and natural process among the world's languages. Recent phonological theory attributes this naturalness to the postulated geometry of articulatory features and the notion of spreading (McCarthy, 1988). Others view assimilation as a result of perception (Ohala, 1990), or as perceptually tolerated articulatory simplification (Kohler, 1990). Kohler notes that certain consonant classes (such as nasals and stops) are more likely than other classes (such as fricatives) to undergo place assimilation to a following consonant. To explain this pattern, he proposes that assimilation tends not to occur when the members of a consonant class are relatively distinctive perceptually, such that their articulatory reduction would be particularly salient. This explanation, of course, presupposes that the stops and nasals which undergo place assimilation are less distinctive than fricatives, which tend not to assimilate. We report experimental results that confirm Kohler's perceptual assumption: In the context of a following word initial stop, fricatives were less confusable than nasals or unreleased stops. We conclude, in agreement with Ohala and Kohler, that perceptual factors are likely to shape phonological assimilation rules.

  2. SMOS brightness temperature assimilation into the Community Land Model

    Directory of Open Access Journals (Sweden)

    D. Rains

    2017-11-01

    Full Text Available SMOS (Soil Moisture and Ocean Salinity) mission brightness temperatures at a single incidence angle are assimilated into the Community Land Model (CLM) across Australia to improve soil moisture simulations. To this end, the data assimilation system DasPy is coupled to the local ensemble transform Kalman filter (LETKF) as well as to the Community Microwave Emission Model (CMEM). Brightness temperature climatologies are precomputed to enable the assimilation of brightness temperature anomalies, making use of 6 years of SMOS data (2010–2015). Mean correlation R with in situ measurements increases moderately from 0.61 to 0.68 (11 %) for upper soil layers if the root zone is included in the updates. A reduced improvement of 5 % is achieved if the assimilation is restricted to the upper soil layers. Root-zone simulations improve by 7 % when updating both the top layers and root zone, and by 4 % when only updating the top layers. Mean increments and increment standard deviations are compared for the experiments. The long-term assimilation impact is analysed by looking at a set of quantiles computed for soil moisture at each grid cell. Within hydrological monitoring systems, extreme dry or wet conditions are often defined via their relative occurrence, adding great importance to assimilation-induced quantile changes. Although the record is still limited at present, longer L-band radiometer time series will become available, making model output improved by assimilating such data more usable for extreme event statistics.
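
    The anomaly approach mentioned above, precomputing a brightness temperature climatology so that only departures from it are assimilated, can be sketched as follows (the smoothing window and the day-of-year climatology construction are illustrative assumptions):

```python
# Sketch: build a day-of-year brightness temperature (Tb) climatology from a
# multi-year record, then form the anomalies that would be assimilated.
import numpy as np

years, doy_len = 6, 365
tb = 250.0 + 10.0 * np.sin(2 * np.pi * np.arange(years * doy_len) / doy_len) \
     + np.random.default_rng(1).standard_normal(years * doy_len)
tb = tb.reshape(years, doy_len)                 # one row per year (2010-2015)

clim = tb.mean(axis=0)                          # raw day-of-year mean
# Smooth the climatology with a +/-15-day moving window (assumed width).
kernel = np.ones(31) / 31.0
clim_smooth = np.convolve(np.pad(clim, 15, mode="wrap"), kernel, mode="valid")

anomaly = tb - clim_smooth                      # departures to be assimilated
print(anomaly.shape, anomaly.std())
```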

  3. Chromatic assimilation unaffected by perceived depth of inducing light.

    Science.gov (United States)

    Shevell, Steven K; Cao, Dingcai

    2004-01-01

    Chromatic assimilation is a shift toward the color of nearby light. Several studies conclude that a neural process contributes to assimilation but the neural locus remains in question. Some studies posit a peripheral process, such as retinal receptive-field organization, while others claim the neural mechanism follows depth perception, figure/ground segregation, or perceptual grouping. The experiments here tested whether assimilation depends on a neural process that follows stereoscopic depth perception. By introducing binocular disparity, the test field judged in color was made to appear in a different depth plane than the light that induced assimilation. The chromaticity and spatial frequency of the inducing light, and the chromaticity of the test light, were varied. Chromatic assimilation was found with all inducing-light sizes and chromaticities, but the magnitude of assimilation did not depend on the perceived relative depth planes of the test and inducing fields. We found no evidence to support the view that chromatic assimilation depends on a neural process that follows binocular combination of the two eyes' signals.

  4. Sensitivity analysis of a data assimilation technique for hindcasting and forecasting hydrodynamics of a complex coastal water body

    Science.gov (United States)

    Ren, Lei; Hartnett, Michael

    2017-02-01

    Accurate forecasting of coastal surface currents has become of great economic importance over the past twenty years due to marine activities such as marine renewable energy and fish farms in coastal regions. Advanced oceanographic observation systems such as satellites and radars can provide many parameters of interest, such as surface currents and waves, with fine spatial resolution in near real time. To enhance modelling capability, data assimilation (DA) techniques, which combine the available measurements with the hydrodynamic models, have been used in oceanography since the 1990s. Assimilating measurements into hydrodynamic models pulls the model background states toward the observation trajectory, which is then used to provide more accurate forecasting information. Galway Bay is an open, wind-dominated water body on which two coastal radars are deployed. An efficient and easy-to-implement sequential DA algorithm named Optimal Interpolation (OI) was used to blend radar surface current data into a three-dimensional Environmental Fluid Dynamics Code (EFDC) model. Two empirical parameters, horizontal correlation length and DA cycle length (CL), are inherent within OI. No guidance has previously been published regarding selection of appropriate values of these parameters or how sensitive OI DA is to variations in their values. A detailed sensitivity analysis was therefore performed on both of these parameters and the results are presented. The appropriate value of the DA CL was determined by minimizing the Root-Mean-Square Error (RMSE) between radar data and model background states. An assimilation index (AI) was evaluated for the OI DA algorithm in the model; the AI of the half-day forecast mean vector directions was over 50 % in the best assimilation model. The ability of OI to improve model forecasts was also assessed and is reported upon.
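
    A minimal sketch of the OI analysis step described above, with a Gaussian background covariance controlled by the horizontal correlation length, one of the two empirical parameters under study (all dimensions and error statistics here are illustrative assumptions):

```python
# Sketch of an Optimal Interpolation (OI) update: x_a = x_b + K (y - H x_b),
# with K = B H^T (H B H^T + R)^(-1) and a Gaussian background covariance B
# controlled by a horizontal correlation length L (an assumed value here).
import numpy as np

n_grid, n_obs = 50, 10
rng = np.random.default_rng(2)
x_grid = np.linspace(0.0, 50.0, n_grid)            # model grid (km)
obs_idx = rng.choice(n_grid, n_obs, replace=False) # radar-sampled points

L = 5.0                                            # correlation length (km)
sigma_b2, sigma_o2 = 0.04, 0.01                    # error variances (m/s)^2
dist = np.abs(x_grid[:, None] - x_grid[None, :])
B = sigma_b2 * np.exp(-0.5 * (dist / L) ** 2)      # background covariance
H = np.zeros((n_obs, n_grid)); H[np.arange(n_obs), obs_idx] = 1.0
R = sigma_o2 * np.eye(n_obs)

x_b = np.sin(x_grid / 8.0)                          # background surface current
y = x_b[obs_idx] + 0.1 * rng.standard_normal(n_obs) # radar observations

K = B @ H.T @ np.linalg.solve(H @ B @ H.T + R, np.eye(n_obs))
x_a = x_b + K @ (y - H @ x_b)                       # analysis state
```

    The sensitivity analysis would then repeat such updates over many cycles while varying the correlation length and the cycle length.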

  5. Monte Carlo Tree Search Strategies

    OpenAIRE

    VODOPIVEC, TOM

    2018-01-01

    After the breakthrough at the game of Go, Monte Carlo tree search (MCTS) methods triggered rapid progress in game-playing agents: the research community has since developed many variants and improvements of the MCTS algorithm, thereby advancing artificial intelligence not only in games but also in numerous other domains. Although MCTS methods combine the generality of random sampling with the precision of tree search, in practice they can suffer from slow conv...
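
    As a concrete reference point for the family of methods surveyed above, a bare-bones UCT-style selection-expansion-simulation-backpropagation loop, the kernel of most MCTS variants, is sketched below; the game interface and the exploration constant are assumed placeholders:

```python
# Minimal UCT-style Monte Carlo tree search sketch. The `game` object
# (legal_moves, play, is_terminal, result) is an assumed placeholder interface.
import math, random

class Node:
    def __init__(self, state, game, parent=None):
        self.state, self.parent = state, parent
        self.children, self.visits, self.value = [], 0, 0.0
        self.untried = list(game.legal_moves(state))  # unexpanded moves

def uct_select(node, c=1.4):
    # UCB1: mean reward plus an exploration bonus.
    return max(node.children,
               key=lambda ch: ch.value / ch.visits
                              + c * math.sqrt(math.log(node.visits) / ch.visits))

def mcts(root_state, game, n_iter=1000):
    root = Node(root_state, game)
    for _ in range(n_iter):
        node = root
        # 1. Selection: descend through fully expanded nodes.
        while not node.untried and node.children:
            node = uct_select(node)
        # 2. Expansion: try one previously untried move.
        if node.untried and not game.is_terminal(node.state):
            move = node.untried.pop(random.randrange(len(node.untried)))
            child = Node(game.play(node.state, move), game, parent=node)
            node.children.append(child)
            node = child
        # 3. Simulation: random playout to the end of the game.
        state = node.state
        while not game.is_terminal(state):
            state = game.play(state, random.choice(game.legal_moves(state)))
        reward = game.result(state)
        # 4. Backpropagation: update statistics along the visited path.
        while node is not None:
            node.visits, node.value = node.visits + 1, node.value + reward
            node = node.parent
    return max(root.children, key=lambda ch: ch.visits)
```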

  6. Application of Bred Vectors To Data Assimilation

    Science.gov (United States)

    Corazza, M.; Kalnay, E.; Patil, Dj

    We introduced a statistic, the BV-dimension, to measure the effective local finite-time dimensionality of the atmosphere. We show that this dimension is often quite low, and suggest that this finding has important implications for data assimilation and the accuracy of weather forecasting (Patil et al., 2001). The original database for this study was the forecasts of the NCEP global ensemble forecasting system. The initial differences between the control forecast and the perturbed forecasts are called bred vectors. The control and perturbed initial conditions valid at time t = nΔt are evolved using the forecast model until time t = (n+1)Δt. The differences between the perturbed and the control forecasts are scaled down to their initial amplitude, and constitute the bred vectors valid at (n+1)Δt. Their growth rate is typically about 1.5/day. The bred vectors are similar by construction to leading Lyapunov vectors except that they have small but finite amplitude, and they are valid at finite times. The original NCEP ensemble data set has 5 independent bred vectors. We define a local bred vector at each grid point by choosing the 5 by 5 grid points centered at the grid point (a region of about 1100 km by 1100 km), and using the north-south and east-west velocity components at the 500 mb pressure level to form a 50-dimensional column vector. Since we have k=5 global bred vectors, we also have k local bred vectors at each grid point. We estimate the effective dimensionality of the subspace spanned by the local bred vectors by performing a singular value decomposition (EOF analysis). The k local bred vector columns form a 50xk matrix M. The singular values s(i) of M measure the extent to which the k column unit vectors making up the matrix M point in the direction of v(i). We define the bred vector dimension as BVDIM = {Sum[s(i)]}^2 / {Sum[s(i)^2]}. For example, if 4 out of the 5 vectors lie along v(1) and one lies along v(2), the BV-dimension would be BVDIM[sqrt(4), 1, 0, 0, 0] = (2+1)^2/(4+1) = 1.8
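
    The BV-dimension defined above is straightforward to compute from the singular values of the local bred vector matrix; a small sketch (with a random stand-in matrix) is given below:

```python
# Sketch: compute the BV-dimension of k local bred vectors at one grid point.
# BVDIM = (sum_i s_i)^2 / sum_i s_i^2, where s_i are the singular values of
# the 50 x k matrix whose columns are the local bred vectors.
import numpy as np

rng = np.random.default_rng(3)
k = 5
M = rng.standard_normal((50, k))          # stand-in for 5 local bred vectors

s = np.linalg.svd(M, compute_uv=False)    # singular values s_1 >= ... >= s_k
bvdim = s.sum() ** 2 / (s ** 2).sum()     # 1 (all aligned) up to k (orthogonal)
print(f"BV-dimension: {bvdim:.2f}")

# Consistency check with the example in the abstract: singular values
# (2, 1, 0, 0, 0) give (2 + 1)^2 / (4 + 1) = 1.8.
s_ex = np.array([2.0, 1.0, 0.0, 0.0, 0.0])
assert abs(s_ex.sum() ** 2 / (s_ex ** 2).sum() - 1.8) < 1e-12
```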

  7. Towards a Comprehensive Dynamic-chemistry Assimilation for Eos-Chem: Plans and Status in NASA's Data Assimilation Office

    Science.gov (United States)

    Pawson, Steven; Lin, Shian-Jiann; Rood, Richard B.; Stajner, Ivanka; Nebuda, Sharon; Nielsen, J. Eric; Douglass, Anne R.

    2000-01-01

    In order to support the EOS-Chem project, a comprehensive assimilation package for the coupled chemical-dynamical system is being developed by the Data Assimilation Office at NASA GSFC. This involves development of a coupled chemistry/meteorology model and of data assimilation techniques for trace species and meteorology. The model is being developed using the flux-form semi-Lagrangian dynamical core of Lin and Rood, the physical parameterizations from the NCAR Community Climate Model, and atmospheric chemistry modules from the Atmospheric Chemistry and Dynamics branch at NASA GSFC. To date the following results have been obtained: (i) multi-annual simulations with the dynamics-radiation model show the credibility of the package for atmospheric simulations; (ii) initial simulations including a limited number of middle atmospheric trace gases reveal the realistic nature of transport mechanisms, although there is still a need for some improvements. Samples of these results will be shown. A meteorological assimilation system is currently being constructed using the model; this will form the basis for the proposed meteorological/chemical assimilation package. The latter part of the presentation will focus on areas targeted for development in the near and far terms, with the objective of providing a comprehensive assimilation package for the EOS-Chem science experiment. The first stage will target ozone assimilation. The plans also encompass a reanalysis (ReSTS) for the 1991-1995 period, which includes the Mt. Pinatubo eruption and the time when a large number of UARS observations were available. One of the most challenging aspects of future developments will be to couple theoretical advances in tracer assimilation with the practical considerations of a real environment and eventually a near-real-time assimilation system.

  8. Assimilate partitioning in avocado, Persea americana

    Energy Technology Data Exchange (ETDEWEB)

    Finazzo, S.; Davenport, T.L.

    1986-04-01

    Assimilate partitioning is being studied in avocado, Persea americana cv. Millborrow, in relation to fruit set. Single leaves on girdled branches of 10-year-old trees were radiolabeled for 1 hr with 13 μCi of ¹⁴CO₂. The source leaves were sampled during the experiment to measure translocation rates. At harvest the sink tissues were dissected and the incorporated radioactivity was measured. The translocation of ¹⁴C-labelled compounds to other leaves was minimal. Incorporation of label into fruitlets varied with the tissue and the stage of development. Sinks (fruitlets) nearest to the labelled leaf and sharing the same phyllotaxy incorporated the most ¹⁴C. Source leaves for single non-abscising fruitlets retained 3X more ¹⁴C-labelled compounds than did source leaves for 2 or more fruitlets at 31 hrs. post-labelling. Export of label decreased appreciably when fruitlets abscised. If fruitlets abscised within 4 days of labeling then the translocation pattern was similar to the pattern for single fruitlets. If the fruitlet abscised later, the translocation pattern was intermediate between the single and double fruitlet patterns.

  9. Efficient data assimilation algorithm for bathymetry application

    Science.gov (United States)

    Ghorbanidehno, H.; Lee, J. H.; Farthing, M.; Hesser, T.; Kitanidis, P. K.; Darve, E. F.

    2017-12-01

    Information on the evolving state of the nearshore zone bathymetry is crucial to shoreline management, recreational safety, and naval operations. The high cost and complex logistics of using ship-based surveys for bathymetry estimation have encouraged the use of remote sensing techniques. Data assimilation methods combine the remote sensing data and nearshore hydrodynamic models to estimate the unknown bathymetry and the corresponding uncertainties. In particular, several recent efforts have combined Kalman filter-based techniques, such as ensemble-based Kalman filters, with indirect video-based observations to address the bathymetry inversion problem. However, these methods often suffer from ensemble collapse and uncertainty underestimation. Here, the Compressed State Kalman Filter (CSKF) method is used to estimate the bathymetry based on observed wave celerity. In order to demonstrate the accuracy and robustness of the CSKF method, we consider twin tests with synthetic observations of wave celerity, while the bathymetry profiles are chosen based on surveys taken by the U.S. Army Corps of Engineers Field Research Facility (FRF) in Duck, NC. The first test case is a bathymetry estimation problem for a spatially smooth and temporally constant bathymetry profile. The second test case is a bathymetry estimation problem for a bathymetry evolving in time from a smooth to a non-smooth profile. For both problems, we compare the results of CSKF with those obtained by the local ensemble transform Kalman filter (LETKF), which is a popular ensemble-based Kalman filter method.

  10. Efficient Data Assimilation Algorithms for Bathymetry Applications

    Science.gov (United States)

    Ghorbanidehno, H.; Kokkinaki, A.; Lee, J. H.; Farthing, M.; Hesser, T.; Kitanidis, P. K.; Darve, E. F.

    2016-12-01

    Information on the evolving state of the nearshore zone bathymetry is crucial to shoreline management, recreational safety, and naval operations. The high cost and complex logistics of using ship-based surveys for bathymetry estimation have encouraged the use of remote sensing monitoring. Data assimilation methods combine monitoring data and models of nearshore dynamics to estimate the unknown bathymetry and the corresponding uncertainties. Existing applications have been limited to the basic Kalman Filter (KF) and the Ensemble Kalman Filter (EnKF). The former can only be applied to low-dimensional problems due to its computational cost; the latter often suffers from ensemble collapse and uncertainty underestimation. This work explores the use of different variants of the Kalman Filter for bathymetry applications. In particular, we compare the performance of the EnKF to the Unscented Kalman Filter and the Hierarchical Kalman Filter, both of which are KF variants for non-linear problems. The objective is to identify which method can better handle the nonlinearities of nearshore physics, while also having a reasonable computational cost. We present two applications; first, the bathymetry of a synthetic one-dimensional cross section normal to the shore is estimated from wave speed measurements. Second, real remote measurements with unknown error statistics are used and compared to in situ bathymetric survey data collected at the USACE Field Research Facility in Duck, NC. We evaluate the information content of different data sets and explore the impact of measurement error and nonlinearities.
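
    For reference, the basic stochastic (perturbed-observation) EnKF analysis step that these variants build upon can be sketched as follows; the state dimension, observation operator, and error statistics are illustrative assumptions:

```python
# Sketch of a perturbed-observation ensemble Kalman filter (EnKF) update for
# a bathymetry-like state observed through a (here linear, assumed) operator H.
import numpy as np

rng = np.random.default_rng(4)
n_state, n_obs, n_ens = 40, 8, 30

H = np.zeros((n_obs, n_state))
H[np.arange(n_obs), np.linspace(0, n_state - 1, n_obs, dtype=int)] = 1.0
R = 0.05 * np.eye(n_obs)                       # observation error covariance

truth = 5.0 + np.sin(np.linspace(0, 3 * np.pi, n_state))   # synthetic depth
y = H @ truth + rng.multivariate_normal(np.zeros(n_obs), R)

ens = truth[:, None] + rng.standard_normal((n_state, n_ens))  # prior ensemble
A = ens - ens.mean(axis=1, keepdims=True)      # ensemble anomalies
Pf_Ht = A @ (H @ A).T / (n_ens - 1)            # sample estimate of B H^T
S = H @ Pf_Ht + R                              # innovation covariance
K = Pf_Ht @ np.linalg.inv(S)                   # Kalman gain

# Update each member with independently perturbed observations.
y_pert = y[:, None] + rng.multivariate_normal(np.zeros(n_obs), R, n_ens).T
ens_a = ens + K @ (y_pert - H @ ens)           # analysis ensemble
```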

  11. Assimilate partitioning in avocado, Persea americana

    International Nuclear Information System (INIS)

    Finazzo, S.; Davenport, T.L.

    1986-01-01

    Assimilate partitioning is being studied in avocado, Persea americana cv. Millborrow, in relation to fruit set. Single leaves on girdled branches of 10-year-old trees were radiolabeled for 1 hr with 13 μCi of ¹⁴CO₂. The source leaves were sampled during the experiment to measure translocation rates. At harvest the sink tissues were dissected and the incorporated radioactivity was measured. The translocation of ¹⁴C-labelled compounds to other leaves was minimal. Incorporation of label into fruitlets varied with the tissue and the stage of development. Sinks (fruitlets) nearest to the labelled leaf and sharing the same phyllotaxy incorporated the most ¹⁴C. Source leaves for single non-abscising fruitlets retained 3X more ¹⁴C-labelled compounds than did source leaves for 2 or more fruitlets at 31 hrs. post-labelling. Export of label decreased appreciably when fruitlets abscised. If fruitlets abscised within 4 days of labeling then the translocation pattern was similar to the pattern for single fruitlets. If the fruitlet abscised later, the translocation pattern was intermediate between the single and double fruitlet patterns.

  12. Is Monte Carlo embarrassingly parallel?

    Energy Technology Data Exchange (ETDEWEB)

    Hoogenboom, J. E. [Delft Univ. of Technology, Mekelweg 15, 2629 JB Delft (Netherlands); Delft Nuclear Consultancy, IJsselzoom 2, 2902 LB Capelle aan den IJssel (Netherlands)

    2012-07-01

    Monte Carlo is often stated to be embarrassingly parallel. However, running a Monte Carlo calculation, especially a reactor criticality calculation, in parallel using tens of processors shows a serious limitation in speedup, and the execution time may even increase beyond a certain number of processors. In this paper the main causes of the loss of efficiency when using many processors are analyzed using a simple Monte Carlo program for criticality. The basic mechanism for parallel execution is MPI. One of the bottlenecks turns out to be the rendezvous points in the parallel calculation used for synchronization and exchange of data between processors. This happens at least at the end of each cycle of fission source generation, in order to collect the full fission source distribution for the next cycle and to estimate the effective multiplication factor, which is not only part of the requested results but also input to the next cycle for population control. Basic improvements to overcome this limitation are suggested and tested. Other time losses in the parallel calculation are also identified. Moreover, the threading mechanism, which allows the parallel execution of tasks based on shared memory using OpenMP, is analyzed in detail. Recommendations are given to get the maximum efficiency out of a parallel Monte Carlo calculation. (authors)
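
    The per-cycle rendezvous described above, where the full fission source and the multiplication factor estimate must be collected before the next cycle can start, is the main synchronization bottleneck. A toy mpi4py sketch of that communication pattern (the transport physics is stubbed out with random numbers; only the structure is the point):

```python
# Toy sketch of the per-cycle synchronization in a parallel Monte Carlo
# criticality run using mpi4py. Physics is stubbed; only the rendezvous
# (allreduce/allgather) structure is illustrated.
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()
rng = np.random.default_rng(rank)

n_local = 10_000            # histories per process per cycle (assumed)
k_eff = 1.0
for cycle in range(50):
    # Each process tracks its share of histories independently...
    local_weight = rng.uniform(0.9, 1.1, n_local).sum()
    local_sites = rng.uniform(0.0, 1.0, (n_local // 10, 3))  # fission sites
    # ...then all processes must rendezvous: k_eff needs the global weight,
    # and the next cycle's source needs the full fission site bank.
    global_weight = comm.allreduce(local_weight, op=MPI.SUM)
    bank = np.concatenate(comm.allgather(local_sites))
    k_eff = global_weight / (size * n_local)
    # Population control (resampling the bank) would also happen here.
if rank == 0:
    print(f"final k_eff estimate: {k_eff:.4f}")
```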

  13. Is Monte Carlo embarrassingly parallel?

    International Nuclear Information System (INIS)

    Hoogenboom, J. E.

    2012-01-01

    Monte Carlo is often stated to be embarrassingly parallel. However, running a Monte Carlo calculation, especially a reactor criticality calculation, in parallel using tens of processors shows a serious limitation in speedup, and the execution time may even increase beyond a certain number of processors. In this paper the main causes of the loss of efficiency when using many processors are analyzed using a simple Monte Carlo program for criticality. The basic mechanism for parallel execution is MPI. One of the bottlenecks turns out to be the rendezvous points in the parallel calculation used for synchronization and exchange of data between processors. This happens at least at the end of each cycle of fission source generation, in order to collect the full fission source distribution for the next cycle and to estimate the effective multiplication factor, which is not only part of the requested results but also input to the next cycle for population control. Basic improvements to overcome this limitation are suggested and tested. Other time losses in the parallel calculation are also identified. Moreover, the threading mechanism, which allows the parallel execution of tasks based on shared memory using OpenMP, is analyzed in detail. Recommendations are given to get the maximum efficiency out of a parallel Monte Carlo calculation. (authors)

  14. Exact Monte Carlo for molecules

    International Nuclear Information System (INIS)

    Lester, W.A. Jr.; Reynolds, P.J.

    1985-03-01

    A brief summary of the fixed-node quantum Monte Carlo method is presented. Results obtained for binding energies, the classical barrier height for H + H₂, and the singlet-triplet splitting in methylene are presented and discussed. 17 refs.

  15. Monte Carlo - Advances and Challenges

    International Nuclear Information System (INIS)

    Brown, Forrest B.; Mosteller, Russell D.; Martin, William R.

    2008-01-01

    Abstract only, full text follows: With ever-faster computers and mature Monte Carlo production codes, there has been tremendous growth in the application of Monte Carlo methods to the analysis of reactor physics and reactor systems. In the past, Monte Carlo methods were used primarily for calculating k_eff of a critical system. More recently, Monte Carlo methods have been increasingly used for determining reactor power distributions and many design parameters, such as β_eff, l_eff, τ, reactivity coefficients, Doppler defect, dominance ratio, etc. These advanced applications of Monte Carlo methods are now becoming common, not just feasible, but bring new challenges to both developers and users: Convergence of 3D power distributions must be assured; confidence interval bias must be eliminated; iterated fission probabilities are required, rather than single-generation probabilities; temperature effects including Doppler and feedback must be represented; isotopic depletion and fission product buildup must be modeled. This workshop focuses on recent advances in Monte Carlo methods and their application to reactor physics problems, and on the resulting challenges faced by code developers and users. The workshop is partly tutorial, partly a review of the current state-of-the-art, and partly a discussion of future work that is needed. It should benefit both novice and expert Monte Carlo developers and users. In each of the topic areas, we provide an overview of needs, perspective on past and current methods, a review of recent work, and discussion of further research and capabilities that are required. Electronic copies of all workshop presentations and material will be available. The workshop is structured as 2 morning and 2 afternoon segments: - Criticality Calculations I - convergence diagnostics, acceleration methods, confidence intervals, and the iterated fission probability, - Criticality Calculations II - reactor kinetics parameters, dominance ratio, temperature

  16. Sequential dependencies in magnitude scaling of loudness

    DEFF Research Database (Denmark)

    Joshi, Suyash Narendra; Jesteadt, Walt

    2013-01-01

    Ten normally hearing listeners used a programmable sone-potentiometer knob to adjust the level of a 1000-Hz sinusoid to match the loudness of numbers presented to them in a magnitude production task. Three different power-law exponents (0.15, 0.30, and 0.60) and a log-law with equal steps in dB were used to program the sone-potentiometer. The knob settings systematically influenced the form of the loudness function. Time series analysis was used to assess the sequential dependencies in the data, which increased with increasing exponent and were greatest for the log-law. It would be possible, therefore, to choose knob properties that minimized these dependencies. When the sequential dependencies were removed from the data, the slope of the loudness functions did not change, but the variability decreased. Sequential dependencies were only present when the level of the tone on the previous trial...

  17. Assimilate unloading from maize (Zea mays L.) pedicel tissues

    International Nuclear Information System (INIS)

    Porter, G.A.; Knievel, D.P.; Shannon, J.C.

    1987-01-01

    Sugar and ¹⁴C-assimilate release from the pedicel tissue of attached maize (Zea mays L.) kernels was studied following treatment with solute concentrations of up to 800 millimolal. Exposure and collection times ranged from 3 to 6 hours. Sugar and ¹⁴C-assimilate unloading and collection in agar traps was reduced by 25 and 43%, respectively, following exposure to 800 millimolal mannitol. Inhibition of unloading was not specific to mannitol, since similar concentrations of glucose, fructose, or equimolar glucose plus fructose resulted in comparable inhibition. Ethylene glycol, a rapidly permeating solute which should not greatly influence cell turgor, did not inhibit ¹⁴C-assimilate unloading. Based on these results, they suggest that inhibition of unloading by high concentrations of sugar or mannitol was due to reduced pedicel cell turgor. Changes in pedicel cell turgor may play a role in the regulation of assimilate transfer within the maize kernel.

  18. Develop a Hybrid Coordinate Ocean Model with Data Assimilation Capabilities

    National Research Council Canada - National Science Library

    Thacker, W. C

    2003-01-01

    The objectives of the research are as follows: (1) to develop a methodology for assimilating temperature and salinity profiles from XBT, CTD, and ARGO float data that accommodates the peculiarities of HYCOM's hybrid vertical coordinates, allowing...

  19. Regional Ocean Modeling System (ROMS): CNMI: Data Assimilating

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Regional Ocean Modeling System (ROMS) 3-day, 3-hourly data assimilating hindcast for the region surrounding the Commonwealth of the Northern Mariana Islands (CNMI)...

  20. A simple lightning assimilation technique for improving retrospective WRF simulations.

    Science.gov (United States)

    Convective rainfall is often a large source of error in retrospective modeling applications. In particular, positive rainfall biases commonly exist during summer months due to overactive convective parameterizations. In this study, lightning assimilation was applied in the Kain-F...

  1. Data assimilation in modeling ocean processes: A bibliographic study

    Digital Repository Service at National Institute of Oceanography (India)

    Mahadevan, R.; Fernandes, A.A.; Saran, A.K.

    An annotated bibliography on studies related to data assimilation in modeling ocean processes has been prepared. The bibliography listed here is not comprehensive and is not prepared from the original references. Information obtainable from...

  2. Air Quality Activities in the Global Modeling and Assimilation Office

    Science.gov (United States)

    Pawson, Steven

    2016-01-01

    GMAO's mission is to enhance the use of NASA's satellite observations in weather and climate modeling. This presentation will discuss GMAO's mission, the value of data assimilation, and some relevant (available) GMAO data products.

  3. UARS Correlative UKMO Daily Gridded Stratospheric Assimilated Data V001

    Data.gov (United States)

    National Aeronautics and Space Administration — The UARS Correlative assimilation data from the U.K. Meteorological Office (UKMO) consists of daily model runs at 12:00 GMT as a means of providing an independent...

  4. Regional Ocean Modeling System (ROMS): Samoa: Data Assimilating

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Regional Ocean Modeling System (ROMS) 3-day, 3-hourly data assimilating hindcast for the region surrounding the islands of Samoa at approximately 3-km resolution....

  5. Regional Ocean Modeling System (ROMS): Main Hawaiian Islands: Data Assimilating

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Regional Ocean Modeling System (ROMS) 3-day, 3-hourly data assimilating hindcast for the region surrounding the main Hawaiian islands at approximately 4-km...

  6. The Culture Assimilator: An Approach to Cross-Cultural Training

    Science.gov (United States)

    Fiedler, Fred E.; And Others

    1971-01-01

    Evaluates the cultural assimilator, a kind of training manual to help members of one culture understand and adjust to another culture. Describes those constructed for the Arab countries, Iran, Thailand, Central America, and Greece. (MB)

  7. Regional Ocean Modeling System (ROMS): Oahu: Data Assimilating

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Regional Ocean Modeling System (ROMS) 2-day, 3-hourly data assimilating hindcast for the region surrounding the island of Oahu at approximately 1-km resolution....

  8. Assimilation of Aircraft Observations in High-Resolution Mesoscale Modeling

    Directory of Open Access Journals (Sweden)

    Brian P. Reen

    2018-01-01

    Full Text Available Aircraft-based observations are a promising source of above-surface observations for assimilation into mesoscale model simulations. The Tropospheric Airborne Meteorological Data Reporting (TAMDAR) observations have potential advantages over some other aircraft observations, including the presence of water vapor observations. The impact of assimilating TAMDAR observations via observation nudging in Weather Research and Forecasting model simulations with 1-km horizontal grid spacing is evaluated using five cases centered over California. Overall, the impact of assimilating the observations is mixed, with the layer showing the greatest benefit being above the surface in the lowest 1000 m above ground level, and the variable showing the most consistent benefit being temperature. Varying the nudging configuration demonstrates the sensitivity of the results to details of the assimilation, but does not clearly demonstrate the superiority of a specific configuration.
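
    Observation nudging, as used above, adds a relaxation term that pulls the model state toward each observation during integration, weighted in time (and, in a full system, in space) around the observation. A minimal single-variable sketch, with an assumed toy model and weighting function:

```python
# Sketch of observation nudging: integrate dx/dt = f(x) + G * w(t) * (y - x),
# where G is the nudging strength and w(t) a time window around the obs time.
import numpy as np

def f(x):
    return -0.5 * x + 1.0          # toy model dynamics (assumed)

G = 1.0 / 1800.0                   # nudging strength, 1/(30 min) (assumed)
dt, n_steps = 60.0, 240            # 60 s steps, 4 h of integration
obs_time, obs_val, half_window = 7200.0, 2.5, 1800.0  # one aircraft obs

x, t = 0.0, 0.0
for _ in range(n_steps):
    # Triangular time weight: 1 at the obs time, 0 outside the window.
    w = max(0.0, 1.0 - abs(t - obs_time) / half_window)
    x += dt * (f(x) + G * w * (obs_val - x))
    t += dt
print(f"state after nudging toward obs: {x:.3f}")
```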

  9. Disconfirmed hedonic expectations produce perceptual contrast, not assimilation.

    Science.gov (United States)

    Zellner, Debra A; Strickhouser, Dinah; Tornow, Carina E

    2004-01-01

    In studies of hedonic ratings, contrast is the usual result when expectations about test stimuli are produced through the presentation of context stimuli, whereas assimilation is the usual result when expectations about test stimuli are produced through labeling, advertising, or the relaying of information to the subject about the test stimuli. Both procedures produce expectations that are subsequently violated, but the outcomes are different. The present studies demonstrate that both assimilation and contrast can occur even when expectations are produced by verbal labels and the degree of violation of the expectation is held constant. One factor determining whether assimilation or contrast occurs appears to be the certainty of the expectation. Expectations that convey certainty are produced by methods that lead to social influence on subjects' ratings, producing assimilation. When social influence is not a factor and subjects give judgments influenced only by the perceived hedonic value of the stimulus, contrast is the result.

  10. Satellite Sounder Data Assimilation for Improving Alaska Region Weather Forecast

    Science.gov (United States)

    Zhu, Jiang; Stevens, E.; Zavodsky, B. T.; Zhang, X.; Heinrichs, T.; Broderson, D.

    2014-01-01

    Data assimilation has been demonstrated to be very useful in improving both global and regional numerical weather prediction. Alaska has a very sparse network of surface observation sites. On the other hand, it receives many more satellite overpasses than the lower 48 states. How to utilize satellite data to improve numerical prediction is one of the hot topics in the weather forecasting community in Alaska. The Geographic Information Network of Alaska (GINA) at the University of Alaska is conducting a study on satellite data assimilation for the WRF model. AIRS/CrIS sounder profile data are used to improve the initial conditions for the customized regional WRF model (GINA-WRF model). Normalized standard deviation, RMSE, and correlation statistical analyses are applied to one case of 48-hour forecasts and one month of 24-hour forecasts in order to evaluate the improvement of the regional numerical model due to data assimilation. The final goal of the research is to provide improved real-time short-term forecasts for the Alaska region.

  11. Comparison between 3D-Var and 4D-Var data assimilation methods for the simulation of a heavy rainfall case in central Italy

    Science.gov (United States)

    Mazzarella, Vincenzo; Maiello, Ida; Capozzi, Vincenzo; Budillon, Giorgio; Ferretti, Rossella

    2017-08-01

    This work aims to provide a comparison between three-dimensional and four-dimensional variational data assimilation methods (3D-Var and 4D-Var) for a heavy rainfall case in central Italy. To evaluate the impact of the assimilation of reflectivity and radial velocity acquired from the Monte Midia Doppler radar into the Weather Research and Forecasting (WRF) model, the quantitative precipitation forecast (QPF) is used. The two methods are compared for a heavy rainfall event that occurred in central Italy on 14 September 2012 during the first Special Observation Period (SOP1) of the HyMeX (HYdrological cycle in Mediterranean EXperiment) campaign. This event, characterized by a deep low pressure system over the Tyrrhenian Sea, produced flash floods over the Marche and Abruzzo regions, where rainfall maxima reached more than 150 mm 24 h⁻¹. To identify the best QPF, nine experiments are performed using 3D-Var and 4D-Var data assimilation techniques. All simulations are compared in terms of rainfall forecast and precipitation measured by the gauges through three statistical indicators: probability of detection (POD), critical success index (CSI) and false alarm ratio (FAR). The assimilation of conventional observations with the 4D-Var method improves the QPF compared to 3D-Var. In addition, the use of radar measurements in 4D-Var simulations enhances the performance of the statistical scores for higher rainfall thresholds.
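
    In outline, 3D-Var minimizes a cost function at a single analysis time, whereas 4D-Var sums the observation term over all observation times in the assimilation window, with the model propagating the state to each time. A toy 3D-Var sketch (dimensions, operators, and error statistics are assumed):

```python
# Sketch of 3D-Var: minimize
#   J(x) = 1/2 (x - x_b)^T B^-1 (x - x_b) + 1/2 (y - H x)^T R^-1 (y - H x).
# In 4D-Var the second term is summed over all observation times in the
# window, with H composed with the model forecast to each time.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(5)
n, m = 20, 6
B = 0.5 * np.eye(n)                         # background error covariance
R = 0.1 * np.eye(m)                         # observation error covariance
H = rng.standard_normal((m, n)) * 0.3       # (assumed linear) observation op.

x_b = rng.standard_normal(n)                # background state
y = H @ x_b + 0.3 * rng.standard_normal(m)  # observations

Bi, Ri = np.linalg.inv(B), np.linalg.inv(R)

def J(x):
    db, do = x - x_b, y - H @ x
    return 0.5 * db @ Bi @ db + 0.5 * do @ Ri @ do

def grad_J(x):
    return Bi @ (x - x_b) - H.T @ Ri @ (y - H @ x)

x_a = minimize(J, x_b, jac=grad_J, method="L-BFGS-B").x   # analysis state
print(f"cost reduced from {J(x_b):.2f} to {J(x_a):.2f}")
```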

  12. Efficient sampling algorithms for Monte Carlo based treatment planning

    International Nuclear Information System (INIS)

    DeMarco, J.J.; Solberg, T.D.; Chetty, I.; Smathers, J.B.

    1998-01-01

    Efficient sampling algorithms are necessary for producing a fast Monte Carlo based treatment planning code. This study evaluates several aspects of a photon-based tracking scheme and the effect of optimal sampling algorithms on the efficiency of the code. Four areas were tested: pseudo-random number generation, generalized sampling of a discrete distribution, sampling from the exponential distribution, and delta scattering as applied to photon transport through a heterogeneous simulation geometry. Generalized sampling of a discrete distribution using the cutpoint method can produce speedup gains of one order of magnitude versus conventional sequential sampling. Photon transport modifications based upon the delta scattering method were implemented and compared with a conventional boundary and collision checking algorithm. The delta scattering algorithm is faster by a factor of six versus the conventional algorithm for a boundary size of 5 mm within a heterogeneous geometry. A comparison of portable pseudo-random number algorithms and exponential sampling techniques is also discussed
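
    The cutpoint method mentioned above speeds up inverse-CDF sampling of a discrete distribution by precomputing pointers into the cumulative table, so each sample needs only a short local search instead of a full sequential scan. A sketch (the number of cutpoints is an assumed tuning parameter):

```python
# Sketch of the cutpoint method for sampling a discrete distribution:
# precompute m cutpoints into the cumulative table, then each draw needs
# only a short sequential search starting from the nearest cutpoint.
import numpy as np

rng = np.random.default_rng(6)
p = rng.random(100); p /= p.sum()          # discrete distribution p_0..p_99
F = np.cumsum(p)                           # cumulative table, F[-1] == 1

m = 256                                    # number of cutpoints (assumed)
# cut[j] = smallest index i with F[i] > j/m; a search started there never
# has to move backwards for any u in [j/m, (j+1)/m).
cut = np.searchsorted(F, np.arange(m) / m, side="right")

def sample(u):
    i = cut[int(u * m)]                    # jump close to the answer
    while F[i] < u and i < len(F) - 1:     # short local sequential search
        i += 1
    return i

draws = np.array([sample(u) for u in rng.random(100_000)])
# Sanity check: empirical frequencies should approximate p.
assert np.abs(np.bincount(draws, minlength=100) / 1e5 - p).max() < 0.01
```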

  13. Multi-parametric variational data assimilation for hydrological forecasting

    Science.gov (United States)

    Alvarado-Montero, R.; Schwanenberg, D.; Krahe, P.; Helmke, P.; Klein, B.

    2017-12-01

    Ensemble forecasting is increasingly applied in flow forecasting systems to provide users with a better understanding of forecast uncertainty and consequently to take better-informed decisions. A common practice in probabilistic streamflow forecasting is to force deterministic hydrological model with an ensemble of numerical weather predictions. This approach aims at the representation of meteorological uncertainty but neglects uncertainty of the hydrological model as well as its initial conditions. Complementary approaches use probabilistic data assimilation techniques to receive a variety of initial states or represent model uncertainty by model pools instead of single deterministic models. This paper introduces a novel approach that extends a variational data assimilation based on Moving Horizon Estimation to enable the assimilation of observations into multi-parametric model pools. It results in a probabilistic estimate of initial model states that takes into account the parametric model uncertainty in the data assimilation. The assimilation technique is applied to the uppermost area of River Main in Germany. We use different parametric pools, each of them with five parameter sets, to assimilate streamflow data, as well as remotely sensed data from the H-SAF project. We assess the impact of the assimilation in the lead time performance of perfect forecasts (i.e. observed data as forcing variables) as well as deterministic and probabilistic forecasts from ECMWF. The multi-parametric assimilation shows an improvement of up to 23% for CRPS performance and approximately 20% in Brier Skill Scores with respect to the deterministic approach. It also improves the skill of the forecast in terms of rank histogram and produces a narrower ensemble spread.
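
    A toy illustration of the multi-parametric idea, assimilating the same observations separately into each member of a small parameter pool so that the resulting set of initial states carries the parametric uncertainty (the model, pool values, and horizon are assumed placeholders, not the paper's setup):

```python
# Sketch: multi-parametric moving-horizon-style estimation. For each
# parameter set in the pool, the initial state over a short horizon is
# fitted to the observations; the pool of fitted states then gives a
# probabilistic initial condition reflecting parametric uncertainty.
import numpy as np
from scipy.optimize import minimize_scalar

def run(x0, k, n):               # toy linear-reservoir model (assumed)
    x, out = x0, []
    for _ in range(n):
        x = x - k * x + 1.0      # recession plus constant inflow
        out.append(x)
    return np.array(out)

rng = np.random.default_rng(7)
obs = run(10.0, 0.3, 8) + 0.2 * rng.standard_normal(8)   # synthetic flows

pool = [0.25, 0.28, 0.30, 0.32, 0.35]    # five parameter sets (assumed)
states = []
for k in pool:
    res = minimize_scalar(lambda x0: ((run(x0, k, 8) - obs) ** 2).sum(),
                          bounds=(0.0, 50.0), method="bounded")
    states.append(res.x)
print("pool of fitted initial states:", np.round(states, 2))
```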

  14. Ethnicity, assimilation and harassment in the labor market

    OpenAIRE

    Epstein, Gil S.; Gang, Ira N.

    2008-01-01

    We often observe minority ethnic groups at a disadvantage relative to the majority. Why is this and what can be done about it? Efforts made to assimilate, and time, are two elements working to bring the minority into line with the majority. A third element, the degree to which the majority welcomes the minority, also plays a role. We develop a simple theoretical model useful for examining the consequences for assimilation and harassment of growth in the minority population, time, and the role...

  15. Joint Center for Satellite Data Assimilation Overview and Research Activities

    Science.gov (United States)

    Auligne, T.

    2017-12-01

    In 2001 NOAA/NESDIS, NOAA/NWS, NOAA/OAR, and NASA, subsequently joined by the US Navy and Air Force, came together to form the Joint Center for Satellite Data Assimilation (JCSDA) for the common purpose of accelerating the use of satellite data in environmental numerical prediction modeling by developing, using, and anticipating advances in numerical modeling, satellite-based remote sensing, and data assimilation methods. The primary focus was to bring these advances together to improve operational numerical model-based forecasting, under the premise that these partners have common technical and logistical challenges in assimilating satellite observations into their modeling enterprises that could be better addressed through cooperative action and/or common solutions. Over the last 15 years, the JCSDA has made and continues to make major contributions to operational assimilation of satellite data. The JCSDA is a multi-agency U.S. government-owned-and-operated organization that was conceived as a venue for several agencies (NOAA, NASA, USAF, and USN) to collaborate on advancing the development and operational use of satellite observations in numerical model-based environmental analysis and forecasting. The primary mission of the JCSDA is to "accelerate and improve the quantitative use of research and operational satellite data in weather, ocean, climate and environmental analysis and prediction systems." This mission is fulfilled through directed research targeting the following key science objectives: improved radiative transfer modeling; new instrument assimilation; assimilation of humidity, clouds, and precipitation observations; assimilation of land surface observations; assimilation of ocean surface observations; atmospheric composition; and chemistry and aerosols. The goal of this presentation is to briefly introduce the JCSDA's mission and vision, and to describe recent research activities across the various JCSDA partners.

  16. Dihydroazulene photoswitch operating in sequential tunneling regime

    DEFF Research Database (Denmark)

    Broman, Søren Lindbæk; Lara-Avila, Samuel; Thisted, Christine Lindbjerg

    2012-01-01

    to electrodes so that the electron transport goes by sequential tunneling. To assure weak coupling, the DHA switching kernel is modified by incorporating p-MeSC6H4 end-groups. Molecules are prepared by Suzuki cross-couplings on suitable halogenated derivatives of DHA. The synthesis presents an expansion of our..., incorporating a p-MeSC6H4 anchoring group in one end, has been placed in a silver nanogap. Conductance measurements justify that transport through both DHA (high resistivity) and VHF (low resistivity) forms goes by sequential tunneling. The switching is fairly reversible and reenterable; after more than 20 ON...

  17. Asynchronous Operators of Sequential Logic Venjunction & Sequention

    CERN Document Server

    Vasyukevich, Vadim

    2011-01-01

    This book is dedicated to new mathematical instruments assigned for logical modeling of the memory of digital devices. The case in point is logic-dynamical operation named venjunction and venjunctive function as well as sequention and sequentional function. Venjunction and sequention operate within the framework of sequential logic. In a form of the corresponding equations, they organically fit analytical expressions of Boolean algebra. Thus, a sort of symbiosis is formed using elements of asynchronous sequential logic on the one hand and combinational logic on the other hand. So, asynchronous

  18. Yeast identification: reassessment of assimilation tests as sole universal identifiers.

    Science.gov (United States)

    Spencer, J; Rawling, S; Stratford, M; Steels, H; Novodvorska, M; Archer, D B; Chandra, S

    2011-11-01

    To assess whether assimilation tests in isolation remain a valid method of identification of yeasts, when applied to a wide range of environmental and spoilage isolates. Seventy-one yeast strains were isolated from a soft drinks factory. These were identified using assimilation tests and by D1/D2 rDNA sequencing. When compared to sequencing, assimilation test identifications (MicroLog™) were 18·3% correct, a further 14·1% correct within the genus and 67·6% were incorrectly identified. The majority of the latter could be attributed to the rise in newly reported yeast species. Assimilation tests alone are unreliable as a universal means of yeast identification, because of numerous new species, variability of strains and increasing coincidence of assimilation profiles. Assimilation tests still have a useful role in the identification of common species, such as the majority of clinical isolates. It is probable, based on these results, that many yeast identifications reported in older literature are incorrect. This emphasizes the crucial need for accurate identification in present and future publications. © 2011 The Authors. Letters in Applied Microbiology © 2011 The Society for Applied Microbiology.

  19. IASI Radiance Data Assimilation in Local Ensemble Transform Kalman Filter

    Science.gov (United States)

    Cho, K.; Hyoung-Wook, C.; Jo, Y.

    2016-12-01

    The Korea Institute of Atmospheric Prediction Systems (KIAPS) is developing an NWP model with data assimilation systems. The Local Ensemble Transform Kalman Filter (LETKF) system, one of these data assimilation systems, has been developed for the KIAPS Integrated Model (KIM), based on a cubed-sphere grid, and has successfully assimilated real data. The LETKF data assimilation system has been extended to 4D-LETKF, which considers time-evolving error covariance within the assimilation window, and to IASI radiance data assimilation using KPOP (KIAPS package for observation processing) with RTTOV (Radiative Transfer for TOVS). The LETKF system has been running semi-operational predictions including conventional (sonde, aircraft) observations and AMSU-A (Advanced Microwave Sounding Unit-A) radiance data since April. Recently, in July, the semi-operational prediction system added GPS-RO, AMV, and IASI (Infrared Atmospheric Sounding Interferometer) radiance observations. A set of simulations of KIM with ne30np4 resolution and 50 vertical levels (model top at 0.3 hPa) was carried out for short-range forecasts (10 days) within the semi-operational LETKF prediction system with 50 ensemble members. In order to isolate the IASI impact, our experiments used only conventional and IASI radiance data in the same semi-operational prediction setup. We carried out sensitivity tests for the IASI thinning method (3D and 4D). The number of IASI observations was increased by temporal (4D) thinning, and an improved impact of IASI radiance data on the forecast skill of the model is expected.
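
    For reference, the core LETKF analysis for a single local region (following the standard Hunt et al. 2007 formulation, computed in the k-dimensional ensemble space) can be sketched as below; dimensions, the observation operator, and error statistics are illustrative assumptions:

```python
# Sketch of the core LETKF analysis step for one local region; the analysis
# is computed with k x k matrices in ensemble space, which is what makes the
# scheme efficient and easy to localize.
import numpy as np

rng = np.random.default_rng(8)
n, m, k = 30, 10, 20                       # state dim, obs dim, ensemble size

Xf = rng.standard_normal((n, k))           # forecast ensemble
H = rng.standard_normal((m, n)) * 0.2      # (assumed linear) obs operator
R = 0.1 * np.eye(m)
y = H @ Xf.mean(axis=1) + 0.3 * rng.standard_normal(m)

xbar = Xf.mean(axis=1, keepdims=True)
Xp = Xf - xbar                             # state perturbations
Yp = H @ Xp                                # obs-space perturbations
ybar = (H @ xbar).ravel()

Ri = np.linalg.inv(R)
Pa_tilde = np.linalg.inv((k - 1) * np.eye(k) + Yp.T @ Ri @ Yp)
w_mean = Pa_tilde @ Yp.T @ Ri @ (y - ybar)         # mean update weights

# Symmetric square root for the perturbation weights.
evals, evecs = np.linalg.eigh((k - 1) * Pa_tilde)
W = evecs @ np.diag(np.sqrt(evals)) @ evecs.T

Xa = xbar + Xp @ (w_mean[:, None] + W)             # analysis ensemble
```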

  20. Cholesterol Assimilation by Lactobacillus Probiotic Bacteria: An In Vitro Investigation

    Directory of Open Access Journals (Sweden)

    Catherine Tomaro-Duchesneau

    2014-01-01

    Full Text Available Excess cholesterol is associated with cardiovascular diseases (CVD), an important cause of mortality worldwide. Current CVD therapeutic measures, lifestyle and dietary interventions, and pharmaceutical agents for regulating cholesterol levels are inadequate. Probiotic bacteria have demonstrated potential to lower cholesterol levels by different mechanisms, including bile salt hydrolase activity, production of compounds that inhibit enzymes such as 3-hydroxy-3-methylglutaryl coenzyme A, and cholesterol assimilation. This work investigates 11 Lactobacillus strains for cholesterol assimilation. Probiotic strains for investigation were selected from the literature: Lactobacillus reuteri NCIMB 11951, L. reuteri NCIMB 701359, L. reuteri NCIMB 702655, L. reuteri NCIMB 701089, L. reuteri NCIMB 702656, Lactobacillus fermentum NCIMB 5221, L. fermentum NCIMB 8829, L. fermentum NCIMB 2797, Lactobacillus rhamnosus ATCC 53103 GG, Lactobacillus acidophilus ATCC 314, and Lactobacillus plantarum ATCC 14917. Cholesterol assimilation was investigated in culture media and under simulated intestinal conditions. The best cholesterol assimilator was L. plantarum ATCC 14917 (15.18 ± 0.55 mg/10¹⁰ cfu) in MRS broth. L. reuteri NCIMB 701089 assimilated over 67% (2254.70 ± 63.33 mg/10¹⁰ cfu) of cholesterol, the most of all the strains, under intestinal conditions. This work demonstrates that probiotic bacteria can assimilate cholesterol under intestinal conditions, with L. reuteri NCIMB 701089 showing great potential as a CVD therapeutic.

  1. (U) Introduction to Monte Carlo Methods

    Energy Technology Data Exchange (ETDEWEB)

    Hungerford, Aimee L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-03-20

    Monte Carlo methods are very valuable for representing solutions to particle transport problems. Here we describe a “cook book” approach to handling the terms in a transport equation using Monte Carlo methods. Focus is on the mechanics of a numerical Monte Carlo code, rather than the mathematical foundations of the method.

  2. Exploring synchronisation in nonlinear data assimilation

    Science.gov (United States)

    Rodrigues-Pinheiro, Flavia; van Leeuwen, Peter Jan

    2016-04-01

    Present-day data assimilation methods are based on linearizations and face serious problems in strongly nonlinear cases such as convection. A promising solution to this problem is a particle filter, which provides a representation of the model probability density function (pdf) by a discrete set of model states, or particles. The basic particle filter uses Bayes's theorem directly, but does not work in high-dimensional cases. The performance can be improved by exploiting the proposal density freedom. This allows one to change the model equations to bring the particles closer to the observations, resulting in very efficient update schemes at observation times, but extending these schemes between observation times is computationally expensive. Simple solutions like nudging have been shown to be not powerful enough. A potential solution might be synchronization, in which one tries to synchronize the model of a system with the true evolution of the system via the observations. In practice this means that an extra term is added to the model equations that hampers growth of instabilities on the synchronization manifold. Especially the delayed versions, where observations are allowed to influence the state in the past, have shown some remarkable successes. Unfortunately, all efforts ignore errors in the observations, and as soon as these are introduced the performance degrades considerably. There is a close connection between time-delayed synchronization and a Kalman smoother, which does allow for observational (and other) errors. In this presentation we will explore this connection in full, with a view to extending synchronization to more realistic settings. Specifically, the performance of the spread of information from observed to unobserved variables is studied in detail. The results indicate that this extended synchronization is a promising tool to steer the model states towards the observations efficiently. If time permits, we will show initial results of embedding the

  3. Eyewitness confidence in simultaneous and sequential lineups: a criterion shift account for sequential mistaken identification overconfidence.

    Science.gov (United States)

    Dobolyi, David G; Dodson, Chad S

    2013-12-01

    Confidence judgments for eyewitness identifications play an integral role in determining guilt during legal proceedings. Past research has shown that confidence in positive identifications is strongly associated with accuracy. Using a standard lineup recognition paradigm, we investigated accuracy using signal detection and ROC analyses, along with the tendency to choose a face with both simultaneous and sequential lineups. We replicated past findings of reduced rates of choosing with sequential as compared to simultaneous lineups, but notably found an accuracy advantage in favor of simultaneous lineups. Moreover, our analysis of the confidence-accuracy relationship revealed two key findings. First, we observed a sequential mistaken identification overconfidence effect: despite an overall reduction in false alarms, confidence for false alarms that did occur was higher with sequential lineups than with simultaneous lineups, with no differences in confidence for correct identifications. This sequential mistaken identification overconfidence effect is an expected byproduct of the use of a more conservative identification criterion with sequential than with simultaneous lineups. Second, we found a steady drop in confidence for mistaken identifications (i.e., foil identifications and false alarms) from the first to the last face in sequential lineups, whereas confidence in and accuracy of correct identifications remained relatively stable. Overall, we observed that sequential lineups are both less accurate and produce higher confidence false identifications than do simultaneous lineups. Given the increasing prominence of sequential lineups in our legal system, our data argue for increased scrutiny and possibly a wholesale reevaluation of this lineup format. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  4. Geotechnical parameter spatial distribution stochastic analysis based on multi-precision information assimilation

    Science.gov (United States)

    Wang, C.; Rubin, Y.

    2014-12-01

    The spatial distribution of an important geotechnical parameter, the compression modulus Es, contributes considerably to the understanding of the underlying geological processes and to the adequate assessment of the mechanical effects of Es on the differential settlement of large continuous structure foundations. These analyses should be derived using an assimilating approach that combines in-situ static cone penetration tests (CPT) with borehole experiments. To achieve such a task, the Es distribution of a stratum of silty clay in region A of the China Expo Center (Shanghai) is studied using the Bayesian maximum entropy method. This method rigorously and efficiently integrates geotechnical investigations of different precisions and sources of uncertainty. Single CPT samplings were modeled as a rational probability density curve by maximum entropy theory. The spatial prior multivariate probability density function (PDF) and the likelihood PDF of the CPT positions were built from borehole experiments and the potential value of the prediction point; then, by numerical integration over the CPT probability density curves, the posterior probability density curve of the prediction point is calculated within the Bayesian reverse interpolation framework. The results were compared between Gaussian Sequential Stochastic Simulation and Bayesian methods. The differences were also discussed between single CPT samplings of normal distribution and simulated probability density curves based on maximum entropy theory. It is shown that the study of Es spatial distributions can be improved by properly incorporating CPT sampling variation into the interpolation process, whereas more informative estimations are generated by considering CPT uncertainty for the estimation points. The calculation illustrates the significance of stochastic Es characterization in a stratum, and identifies limitations associated with inadequate geostatistical interpolation techniques. This characterization result will provide a multi

  5. Interpretability degrees of finitely axiomatized sequential theories

    NARCIS (Netherlands)

    Visser, Albert

    In this paper we show that the degrees of interpretability of finitely axiomatized extensions-in-the-same-language of a finitely axiomatized sequential theory (like Elementary Arithmetic EA, IΣ1, or the Gödel-Bernays theory of sets and classes GB) have suprema. This partially answers a question posed

  6. Interpretability Degrees of Finitely Axiomatized Sequential Theories

    NARCIS (Netherlands)

    Visser, Albert

    2012-01-01

    In this paper we show that the degrees of interpretability of finitely axiomatized extensions-in-the-same-language of a finitely axiomatized sequential theory (like Elementary Arithmetic EA, IΣ1, or the Gödel-Bernays theory of sets and classes GB) have suprema. This partially answers a question

  7. S.M.P. SEQUENTIAL MATHEMATICS PROGRAM.

    Science.gov (United States)

    CICIARELLI, V; LEONARD, JOSEPH

    A sequential mathematics program beginning with the basic fundamentals on the fourth grade level is presented. Included are an understanding of our number system, and the basic operations of working with whole numbers: addition, subtraction, multiplication, and division. Common fractions are taught in the fifth, sixth, and seventh grades. A…

  8. Sequential and Simultaneous Logit: A Nested Model.

    NARCIS (Netherlands)

    van Ophem, J.C.M.; Schram, A.J.H.C.

    1997-01-01

    A nested model is presented which has both the sequential and the multinomial logit model as special cases. This model provides a simple test to investigate the validity of these specifications. Some theoretical properties of the model are discussed. In the analysis a distribution function is

  9. Sensitivity Analysis in Sequential Decision Models.

    Science.gov (United States)

    Chen, Qiushi; Ayer, Turgay; Chhatwal, Jagpreet

    2017-02-01

    Sequential decision problems are frequently encountered in medical decision making, which are commonly solved using Markov decision processes (MDPs). Modeling guidelines recommend conducting sensitivity analyses in decision-analytic models to assess the robustness of the model results against the uncertainty in model parameters. However, standard methods of conducting sensitivity analyses cannot be directly applied to sequential decision problems because this would require evaluating all possible decision sequences, typically in the order of trillions, which is not practically feasible. As a result, most MDP-based modeling studies do not examine confidence in their recommended policies. In this study, we provide an approach to estimate uncertainty and confidence in the results of sequential decision models. First, we provide a probabilistic univariate method to identify the most sensitive parameters in MDPs. Second, we present a probabilistic multivariate approach to estimate the overall confidence in the recommended optimal policy considering joint uncertainty in the model parameters. We provide a graphical representation, which we call a policy acceptability curve, to summarize the confidence in the optimal policy by incorporating stakeholders' willingness to accept the base case policy. For a cost-effectiveness analysis, we provide an approach to construct a cost-effectiveness acceptability frontier, which shows the most cost-effective policy as well as the confidence in that for a given willingness to pay threshold. We demonstrate our approach using a simple MDP case study. We developed a method to conduct sensitivity analysis in sequential decision models, which could increase the credibility of these models among stakeholders.
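
    The probabilistic idea described above, re-solving the MDP under sampled parameter values and recording how often the base-case policy remains optimal, can be sketched with a toy MDP as follows (the transition and reward distributions are assumed placeholders, not the paper's model):

```python
# Sketch of probabilistic sensitivity analysis for an MDP: sample uncertain
# parameters, re-solve the MDP by value iteration, and estimate the
# probability that the base-case optimal policy remains optimal.
import numpy as np

rng = np.random.default_rng(9)
n_states, n_actions, gamma = 3, 2, 0.95

def solve(P, R, tol=1e-8):
    # Value iteration; returns the optimal policy (one action per state).
    V = np.zeros(n_states)
    while True:
        Q = R + gamma * P @ V          # Q[a, s]
        V_new = Q.max(axis=0)
        if np.abs(V_new - V).max() < tol:
            return Q.argmax(axis=0)
        V = V_new

R_base = np.array([[1.0, 0.0, 2.0], [0.5, 1.5, 0.0]])             # R[a, s]
P = rng.dirichlet(np.ones(n_states), size=(n_actions, n_states))  # P[a, s, s']
base_policy = solve(P, R_base)

# Sample reward uncertainty (assumed ~20% multiplicative noise) and count
# how often the recommended policy survives.
n_samples, agree = 1000, 0
for _ in range(n_samples):
    R = R_base * rng.normal(1.0, 0.2, size=R_base.shape)
    agree += np.array_equal(solve(P, R), base_policy)
print(f"P(base policy optimal) ~ {agree / n_samples:.2f}")
```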

  10. Sequential models for coarsening and missingness

    NARCIS (Netherlands)

    Gill, R.D.; Robins, J.M.

    1997-01-01

    In a companion paper we described what intuitively would seem to be the most general possible way to generate Coarsening at Random mechanisms: a sequential procedure called randomized monotone coarsening. Counterexamples showed that CAR mechanisms exist which cannot be represented in this way. Here we

  11. Sequential motor skill: cognition, perception and action

    NARCIS (Netherlands)

    Ruitenberg, M.F.L.

    2013-01-01

    Discrete movement sequences are assumed to be the building blocks of more complex sequential actions that are present in our everyday behavior. The studies presented in this dissertation address the (neuro)cognitive underpinnings of such movement sequences, in particular in relationship to the role

  12. Sequential decoders for large MIMO systems

    KAUST Repository

    Ali, Konpal S.; Abediseid, Walid; Alouini, Mohamed-Slim

    2014-01-01

    the Sequential Decoder using the Fano Algorithm for large MIMO systems. A parameter called the bias is varied to attain different performance-complexity trade-offs. Low values of the bias result in excellent performance but at the expense of high complexity

  13. A framework for sequential multiblock component methods

    NARCIS (Netherlands)

    Smilde, A.K.; Westerhuis, J.A.; Jong, S.de

    2003-01-01

    Multiblock or multiset methods are starting to be used in chemistry and biology to study complex data sets. In chemometrics, sequential multiblock methods are popular; that is, methods that calculate one component at a time and use deflation for finding the next component. In this paper a framework

  14. Classical and sequential limit analysis revisited

    Science.gov (United States)

    Leblond, Jean-Baptiste; Kondo, Djimédo; Morin, Léo; Remmal, Almahdi

    2018-04-01

    Classical limit analysis applies to ideal plastic materials, and within a linearized geometrical framework implying small displacements and strains. Sequential limit analysis was proposed as a heuristic extension to materials exhibiting strain hardening, and within a fully general geometrical framework involving large displacements and strains. The purpose of this paper is to study and clearly state the precise conditions permitting such an extension. This is done by comparing the evolution equations of the full elastic-plastic problem, the equations of classical limit analysis, and those of sequential limit analysis. The main conclusion is that, whereas classical limit analysis applies to materials exhibiting elasticity - in the absence of hardening and within a linearized geometrical framework -, sequential limit analysis, to be applicable, strictly prohibits the presence of elasticity - although it tolerates strain hardening and large displacements and strains. For a given mechanical situation, the relevance of sequential limit analysis therefore essentially depends upon the importance of the elastic-plastic coupling in the specific case considered.

  15. Sequential spatial processes for image analysis

    NARCIS (Netherlands)

    M.N.M. van Lieshout (Marie-Colette); V. Capasso

    2009-01-01

    We give a brief introduction to sequential spatial processes. We discuss their definition, formulate a Markov property, and indicate why such processes are natural tools in tackling high level vision problems. We focus on the problem of tracking a variable number of moving objects

  16. Sequential spatial processes for image analysis

    NARCIS (Netherlands)

    Lieshout, van M.N.M.; Capasso, V.

    2009-01-01

    We give a brief introduction to sequential spatial processes. We discuss their definition, formulate a Markov property, and indicate why such processes are natural tools in tackling high level vision problems. We focus on the problem of tracking a variable number of moving objects through a video

  17. Sequential Analysis: Hypothesis Testing and Changepoint Detection

    Science.gov (United States)

    2014-07-11

    maintains the flexibility of deciding sooner than the fixed sample size procedure at the price of some lower power [13, 514]. The sequential probability... markets, detection of signals with unknown arrival time in seismology, navigation, radar and sonar signal processing, speech segmentation, and the... skimming cruise missile can yield a significant increase in the probability of raid annihilation. Furthermore, usually detection systems are
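
    A minimal sketch of the classical sequential probability ratio test this record concerns, for testing the mean of Gaussian observations (thresholds from the usual Wald approximations; all numeric values are illustrative):

        import numpy as np

        def sprt_gaussian_mean(xs, mu0=0.0, mu1=1.0, sigma=1.0, alpha=0.05, beta=0.05):
            """Wald's SPRT for H0: mean = mu0 vs H1: mean = mu1, known sigma."""
            upper = np.log((1 - beta) / alpha)        # accept H1 at or above this
            lower = np.log(beta / (1 - alpha))        # accept H0 at or below this
            llr = 0.0
            for n, x in enumerate(xs, start=1):
                # log-likelihood-ratio increment for one Gaussian observation
                llr += (mu1 - mu0) * (x - (mu0 + mu1) / 2) / sigma**2
                if llr >= upper:
                    return "accept H1", n
                if llr <= lower:
                    return "accept H0", n
            return "continue sampling", len(xs)

        rng = np.random.default_rng(1)
        print(sprt_gaussian_mean(rng.normal(1.0, 1.0, size=200)))   # data drawn under H1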

  18. STABILIZED SEQUENTIAL QUADRATIC PROGRAMMING: A SURVEY

    Directory of Open Access Journals (Sweden)

    Damián Fernández

    2014-12-01

    Full Text Available We review the motivation for, the current state-of-the-art in convergence results, and some open questions concerning the stabilized version of the sequential quadratic programming algorithm for constrained optimization. We also discuss the tools required for its local convergence analysis, globalization challenges, and extensions of the method to the more general variational problems.

  19. Truly costly sequential search and oligopolistic pricing

    NARCIS (Netherlands)

    Janssen, Maarten C W; Moraga-González, José Luis; Wildenbeest, Matthijs R.

    We modify the paper of Stahl (1989) [Stahl, D.O., 1989. Oligopolistic pricing with sequential consumer search. American Economic Review 79, 700-12] by relaxing the assumption that consumers obtain the first price quotation for free. When all price quotations are costly to obtain, the unique

  20. Zips : mining compressing sequential patterns in streams

    NARCIS (Netherlands)

    Hoang, T.L.; Calders, T.G.K.; Yang, J.; Mörchen, F.; Fradkin, D.; Chau, D.H.; Vreeken, J.; Leeuwen, van M.; Faloutsos, C.

    2013-01-01

    We propose a streaming algorithm, based on the minimal description length (MDL) principle, for extracting non-redundant sequential patterns. For static databases, the MDL-based approach that selects patterns based on their capacity to compress data rather than their frequency, was shown to be

  1. How to Read the Tractatus Sequentially

    Directory of Open Access Journals (Sweden)

    Tim Kraft

    2016-11-01

    Full Text Available One of the unconventional features of Wittgenstein's Tractatus Logico-Philosophicus is its use of an elaborate and detailed numbering system. Recently, Bazzocchi, Hacker and Kuusela have argued that the numbering system means that the Tractatus must be read and interpreted not as a sequentially ordered book, but as a text with a two-dimensional, tree-like structure. Apart from being able to explain how the Tractatus was composed, the tree reading allegedly solves exegetical issues both on the local level (e. g. how 4.02 fits into the series of remarks surrounding it) and the global level (e. g. the relation between ontology and picture theory, solipsism and the eye analogy, resolute and irresolute readings). This paper defends the sequential reading against the tree reading. After presenting the challenges generated by the numbering system and the two accounts as attempts to solve them, it is argued that Wittgenstein's own explanation of the numbering system, anaphoric references within the Tractatus and the exegetical issues mentioned above do not favour the tree reading, but a version of the sequential reading. This reading maintains that the remarks of the Tractatus form a sequential chain: the role of the numbers is to indicate how remarks on different levels are interconnected to form a concise, surveyable and unified whole.

  2. Adult Word Recognition and Visual Sequential Memory

    Science.gov (United States)

    Holmes, V. M.

    2012-01-01

    Two experiments were conducted investigating the role of visual sequential memory skill in the word recognition efficiency of undergraduate university students. Word recognition was assessed in a lexical decision task using regularly and strangely spelt words, and nonwords that were either standard orthographically legal strings or items made from…

  3. Terminating Sequential Delphi Survey Data Collection

    Science.gov (United States)

    Kalaian, Sema A.; Kasim, Rafa M.

    2012-01-01

    The Delphi survey technique is an iterative mail or electronic (e-mail or web-based) survey method used to obtain agreement or consensus among a group of experts in a specific field on a particular issue through a well-designed and systematic multiple sequential rounds of survey administrations. Each of the multiple rounds of the Delphi survey…

  4. Isotopic depletion with Monte Carlo

    International Nuclear Information System (INIS)

    Martin, W.R.; Rathkopf, J.A.

    1996-06-01

    This work considers a method to deplete isotopes during a time-dependent Monte Carlo simulation of an evolving system. The method is based on explicitly combining a conventional estimator for the scalar flux with the analytical solutions to the isotopic depletion equations. There are no auxiliary calculations; the method is an integral part of the Monte Carlo calculation. The method eliminates negative densities and reduces the variance in the estimates for the isotope densities, compared to existing methods. Moreover, existing methods are shown to be special cases of the general method described in this work, as they can be derived by combining a high variance estimator for the scalar flux with a low-order approximation to the analytical solution to the depletion equation
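
    A minimal sketch of the idea under strong simplifying assumptions (one isotope, one energy group, constant cross-section; the flux estimator is a stand-in and all numbers are illustrative): a Monte Carlo flux estimate is inserted into the analytical solution N(t + dt) = N(t) exp(-sigma_a * phi * dt), so the updated density can never go negative.

        import numpy as np

        rng = np.random.default_rng(2)
        sigma_a = 0.05        # illustrative one-group absorption cross-section
        N = 1.0               # initial isotope density (arbitrary units)
        dt = 10.0             # depletion time step (arbitrary units)

        for step in range(5):
            # stand-in Monte Carlo flux estimator (e.g. a track-length average over histories)
            flux = np.mean(rng.exponential(scale=1.0, size=10_000))
            # analytical depletion over the step: non-negative by construction
            N *= np.exp(-sigma_a * flux * dt)
            print(f"step {step}: flux ~ {flux:.3f}, N = {N:.4f}")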

  5. Monte Carlo Methods in ICF

    Science.gov (United States)

    Zimmerman, George B.

    Monte Carlo methods appropriate to simulate the transport of x-rays, neutrons, ions and electrons in Inertial Confinement Fusion targets are described and analyzed. The Implicit Monte Carlo method of x-ray transport handles symmetry within indirect drive ICF hohlraums well, but can be improved 50X in efficiency by angular biasing the x-rays towards the fuel capsule. Accurate simulation of thermonuclear burn and burn diagnostics involves detailed particle source spectra, charged particle ranges, inflight reaction kinematics, corrections for bulk and thermal Doppler effects and variance reduction to obtain adequate statistics for rare events. It is found that the effects of angular Coulomb scattering must be included in models of charged particle transport through heterogeneous materials.

  6. Monte Carlo methods in ICF

    International Nuclear Information System (INIS)

    Zimmerman, G.B.

    1997-01-01

    Monte Carlo methods appropriate to simulate the transport of x-rays, neutrons, ions and electrons in Inertial Confinement Fusion targets are described and analyzed. The Implicit Monte Carlo method of x-ray transport handles symmetry within indirect drive ICF hohlraums well, but can be improved 50X in efficiency by angular biasing the x-rays towards the fuel capsule. Accurate simulation of thermonuclear burn and burn diagnostics involves detailed particle source spectra, charged particle ranges, inflight reaction kinematics, corrections for bulk and thermal Doppler effects and variance reduction to obtain adequate statistics for rare events. It is found that the effects of angular Coulomb scattering must be included in models of charged particle transport through heterogeneous materials. copyright 1997 American Institute of Physics

  7. Monte Carlo methods in ICF

    International Nuclear Information System (INIS)

    Zimmerman, George B.

    1997-01-01

    Monte Carlo methods appropriate to simulate the transport of x-rays, neutrons, ions and electrons in Inertial Confinement Fusion targets are described and analyzed. The Implicit Monte Carlo method of x-ray transport handles symmetry within indirect drive ICF hohlraums well, but can be improved 50X in efficiency by angular biasing the x-rays towards the fuel capsule. Accurate simulation of thermonuclear burn and burn diagnostics involves detailed particle source spectra, charged particle ranges, inflight reaction kinematics, corrections for bulk and thermal Doppler effects and variance reduction to obtain adequate statistics for rare events. It is found that the effects of angular Coulomb scattering must be included in models of charged particle transport through heterogeneous materials

  8. Shell model Monte Carlo methods

    International Nuclear Information System (INIS)

    Koonin, S.E.; Dean, D.J.; Langanke, K.

    1997-01-01

    We review quantum Monte Carlo methods for dealing with large shell model problems. These methods reduce the imaginary-time many-body evolution operator to a coherent superposition of one-body evolutions in fluctuating one-body fields; the resultant path integral is evaluated stochastically. We first discuss the motivation, formalism, and implementation of such Shell Model Monte Carlo (SMMC) methods. There then follows a sampler of results and insights obtained from a number of applications. These include the ground state and thermal properties of pf-shell nuclei, the thermal and rotational behavior of rare-earth and γ-soft nuclei, and the calculation of double beta-decay matrix elements. Finally, prospects for further progress in such calculations are discussed. (orig.)

  9. A contribution Monte Carlo method

    International Nuclear Information System (INIS)

    Aboughantous, C.H.

    1994-01-01

    A Contribution Monte Carlo method is developed and successfully applied to a sample deep-penetration shielding problem. The random walk is simulated in most of its parts as in conventional Monte Carlo methods. The probability density functions (pdf's) are expressed in terms of spherical harmonics and are continuous functions in direction cosine and azimuthal angle variables as well as in position coordinates; the energy is discretized in the multigroup approximation. The transport pdf is an unusual exponential kernel strongly dependent on the incident and emergent directions and energies and on the position of the collision site. The method produces the same results obtained with the deterministic method with a very small standard deviation, with as little as 1,000 Contribution particles in both analog and nonabsorption biasing modes and with only a few minutes CPU time

  10. Shell model Monte Carlo methods

    International Nuclear Information System (INIS)

    Koonin, S.E.

    1996-01-01

    We review quantum Monte Carlo methods for dealing with large shell model problems. These methods reduce the imaginary-time many-body evolution operator to a coherent superposition of one-body evolutions in fluctuating one-body fields; the resultant path integral is evaluated stochastically. We first discuss the motivation, formalism, and implementation of such Shell Model Monte Carlo methods. There then follows a sampler of results and insights obtained from a number of applications. These include the ground state and thermal properties of pf-shell nuclei, the thermal behavior of γ-soft nuclei, and the calculation of double beta-decay matrix elements. Finally, prospects for further progress in such calculations are discussed. 87 refs

  11. Parallel Monte Carlo reactor neutronics

    International Nuclear Information System (INIS)

    Blomquist, R.N.; Brown, F.B.

    1994-01-01

    The issues affecting implementation of parallel algorithms for large-scale engineering Monte Carlo neutron transport simulations are discussed. For nuclear reactor calculations, these include load balancing, recoding effort, reproducibility, domain decomposition techniques, I/O minimization, and strategies for different parallel architectures. Two codes were parallelized and tested for performance. The architectures employed include SIMD, MIMD-distributed memory, and workstation network with uneven interactive load. Speedups linear with the number of nodes were achieved

  12. Elements of Monte Carlo techniques

    International Nuclear Information System (INIS)

    Nagarajan, P.S.

    2000-01-01

    The Monte Carlo method essentially mimics real-world physical processes at the microscopic level. With the incredible increase in computing speeds and ever-decreasing computing costs, there is widespread use of the method for practical problems. Topics covered include algorithm-generated sequences known as pseudo-random sequences (prs), probability density functions (pdf), tests for randomness, extension to multidimensional integration, etc.
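
    A minimal sketch of the most basic of these elements, Monte Carlo integration driven by a pseudo-random sequence (the integrand, dimension and sample size are illustrative):

        import numpy as np

        rng = np.random.default_rng(3)      # the pseudo-random sequence (prs)
        d, n = 5, 100_000                   # dimension and sample size
        u = rng.random(size=(n, d))         # uniform points in the unit hypercube
        fx = np.exp(-np.sum(u**2, axis=1))  # illustrative integrand f(u) = exp(-|u|^2)
        estimate = fx.mean()                # MC estimate of the d-dimensional integral
        stderr = fx.std(ddof=1) / np.sqrt(n)
        print(f"integral ~ {estimate:.5f} +/- {stderr:.5f}")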

  13. Adaptive Multilevel Monte Carlo Simulation

    KAUST Repository

    Hoel, H

    2011-08-23

    This work generalizes a multilevel forward Euler Monte Carlo method introduced by Michael B. Giles (Oper. Res. 56(3):607–617, 2008) for the approximation of expected values depending on the solution to an Itô stochastic differential equation. That work proposed and analyzed a forward Euler multilevel Monte Carlo method based on a hierarchy of uniform time discretizations and control variates to reduce the computational effort required by a standard, single-level, forward Euler Monte Carlo method. This work introduces an adaptive hierarchy of non-uniform time discretizations, generated by an adaptive algorithm introduced in (Anna Dzougoutov et al. Adaptive Monte Carlo algorithms for stopped diffusion. In Multiscale Methods in Science and Engineering, volume 44 of Lect. Notes Comput. Sci. Eng., pages 59–88. Springer, Berlin, 2005; Kyoung-Sook Moon et al. Stoch. Anal. Appl. 23(3):511–558, 2005; Kyoung-Sook Moon et al. An adaptive algorithm for ordinary, stochastic and partial differential equations. In Recent Advances in Adaptive Computation, volume 383 of Contemp. Math., pages 325–343. Amer. Math. Soc., Providence, RI, 2005). This form of the adaptive algorithm generates stochastic, path-dependent time steps and is based on a posteriori error expansions first developed in (Anders Szepessy et al. Comm. Pure Appl. Math. 54(10):1169–1214, 2001). Our numerical results for a stopped diffusion problem exhibit savings in the computational cost to achieve an accuracy of O(TOL): from O(TOL^(-3)) for a single-level version of the adaptive algorithm to O((TOL^(-1) log(TOL))^2).
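
    A minimal sketch of the uniform-time-step multilevel estimator that the adaptive method generalizes (geometric Brownian motion under forward Euler; drift, volatility, payoff and level count are illustrative assumptions): level corrections E[g(X^l) - g(X^(l-1))] are estimated from coupled fine/coarse paths sharing the same Brownian increments and summed telescopically.

        import numpy as np

        rng = np.random.default_rng(4)
        T, x0, mu, sig = 1.0, 1.0, 0.05, 0.2            # illustrative GBM parameters
        g = lambda x: np.maximum(x - 1.0, 0.0)          # illustrative payoff g(X_T)

        def euler_level_pair(level, n_samples, M=2):
            """Coupled fine/coarse forward Euler paths sharing the same Brownian increments."""
            n_fine = M ** level
            dt = T / n_fine
            dW = rng.normal(0.0, np.sqrt(dt), size=(n_samples, n_fine))
            xf = np.full(n_samples, x0)
            for i in range(n_fine):                      # fine path, step size dt
                xf = xf + mu * xf * dt + sig * xf * dW[:, i]
            if level == 0:
                return g(xf), np.zeros(n_samples)        # no coarser level to subtract
            xc = np.full(n_samples, x0)
            dW_c = dW.reshape(n_samples, n_fine // M, M).sum(axis=2)
            for i in range(n_fine // M):                 # coarse path, step size M*dt
                xc = xc + mu * xc * (M * dt) + sig * xc * dW_c[:, i]
            return g(xf), g(xc)

        L, N = 5, 20_000
        estimate = 0.0
        for level in range(L + 1):
            fine, coarse = euler_level_pair(level, N)
            estimate += np.mean(fine - coarse)           # telescoping sum of level corrections
        print(f"MLMC estimate of E[g(X_T)]: {estimate:.4f}")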

  14. Geometrical splitting in Monte Carlo

    International Nuclear Information System (INIS)

    Dubi, A.; Elperin, T.; Dudziak, D.J.

    1982-01-01

    A statistical model is presented by which a direct statistical approach yielded an analytic expression for the second moment, the variance ratio, and the benefit function in a model of an n surface-splitting Monte Carlo game. In addition to the insight into the dependence of the second moment on the splitting parameters the main importance of the expressions developed lies in their potential to become a basis for in-code optimization of splitting through a general algorithm. Refs

  15. Extending canonical Monte Carlo methods

    International Nuclear Information System (INIS)

    Velazquez, L; Curilef, S

    2010-01-01

    In this paper, we discuss the implications of a recently obtained equilibrium fluctuation-dissipation relation for the extension of the available Monte Carlo methods on the basis of the consideration of the Gibbs canonical ensemble to account for the existence of an anomalous regime with negative heat capacities C α with α≈0.2 for the particular case of the 2D ten-state Potts model

  16. Monte Carlo Bayesian inference on a statistical model of sub-gridcolumn moisture variability using high-resolution cloud observations. Part 2: Sensitivity tests and results

    Science.gov (United States)

    Norris, Peter M.; da Silva, Arlindo M.

    2018-01-01

    Part 1 of this series presented a Monte Carlo Bayesian method for constraining a complex statistical model of global circulation model (GCM) sub-gridcolumn moisture variability using high-resolution Moderate Resolution Imaging Spectroradiometer (MODIS) cloud data, thereby permitting parameter estimation and cloud data assimilation for large-scale models. This article performs some basic testing of this new approach, verifying that it does indeed reduce mean and standard deviation biases significantly with respect to the assimilated MODIS cloud optical depth, brightness temperature and cloud-top pressure and that it also improves the simulated rotational–Raman scattering cloud optical centroid pressure (OCP) against independent (non-assimilated) retrievals from the Ozone Monitoring Instrument (OMI). Of particular interest, the Monte Carlo method does show skill in the especially difficult case where the background state is clear but cloudy observations exist. In traditional linearized data assimilation methods, a subsaturated background cannot produce clouds via any infinitesimal equilibrium perturbation, but the Monte Carlo approach allows non-gradient-based jumps into regions of non-zero cloud probability. In the example provided, the method is able to restore marine stratocumulus near the Californian coast, where the background state has a clear swath. This article also examines a number of algorithmic and physical sensitivities of the new method and provides guidance for its cost-effective implementation. One obvious difficulty for the method, and other cloud data assimilation methods as well, is the lack of information content in passive-radiometer-retrieved cloud observables on cloud vertical structure, beyond cloud-top pressure and optical thickness, thus necessitating strong dependence on the background vertical moisture structure. It is found that a simple flow-dependent correlation modification from Riishojgaard provides some help in this respect, by

  17. Monte Carlo Bayesian Inference on a Statistical Model of Sub-gridcolumn Moisture Variability Using High-resolution Cloud Observations . Part II; Sensitivity Tests and Results

    Science.gov (United States)

    da Silva, Arlindo M.; Norris, Peter M.

    2013-01-01

    Part I presented a Monte Carlo Bayesian method for constraining a complex statistical model of GCM sub-gridcolumn moisture variability using high-resolution MODIS cloud data, thereby permitting large-scale model parameter estimation and cloud data assimilation. This part performs some basic testing of this new approach, verifying that it does indeed significantly reduce mean and standard deviation biases with respect to the assimilated MODIS cloud optical depth, brightness temperature and cloud top pressure, and that it also improves the simulated rotational-Raman scattering cloud optical centroid pressure (OCP) against independent (non-assimilated) retrievals from the OMI instrument. Of particular interest, the Monte Carlo method does show skill in the especially difficult case where the background state is clear but cloudy observations exist. In traditional linearized data assimilation methods, a subsaturated background cannot produce clouds via any infinitesimal equilibrium perturbation, but the Monte Carlo approach allows finite jumps into regions of non-zero cloud probability. In the example provided, the method is able to restore marine stratocumulus near the Californian coast where the background state has a clear swath. This paper also examines a number of algorithmic and physical sensitivities of the new method and provides guidance for its cost-effective implementation. One obvious difficulty for the method, and other cloud data assimilation methods as well, is the lack of information content in the cloud observables on cloud vertical structure, beyond cloud top pressure and optical thickness, thus necessitating strong dependence on the background vertical moisture structure. It is found that a simple flow-dependent correlation modification due to Riishojgaard (1998) provides some help in this respect, by better honoring inversion structures in the background state.

  18. Monte Carlo Bayesian Inference on a Statistical Model of Sub-Gridcolumn Moisture Variability Using High-Resolution Cloud Observations. Part 2: Sensitivity Tests and Results

    Science.gov (United States)

    Norris, Peter M.; da Silva, Arlindo M.

    2016-01-01

    Part 1 of this series presented a Monte Carlo Bayesian method for constraining a complex statistical model of global circulation model (GCM) sub-gridcolumn moisture variability using high-resolution Moderate Resolution Imaging Spectroradiometer (MODIS) cloud data, thereby permitting parameter estimation and cloud data assimilation for large-scale models. This article performs some basic testing of this new approach, verifying that it does indeed reduce mean and standard deviation biases significantly with respect to the assimilated MODIS cloud optical depth, brightness temperature and cloud-top pressure and that it also improves the simulated rotational-Raman scattering cloud optical centroid pressure (OCP) against independent (non-assimilated) retrievals from the Ozone Monitoring Instrument (OMI). Of particular interest, the Monte Carlo method does show skill in the especially difficult case where the background state is clear but cloudy observations exist. In traditional linearized data assimilation methods, a subsaturated background cannot produce clouds via any infinitesimal equilibrium perturbation, but the Monte Carlo approach allows non-gradient-based jumps into regions of non-zero cloud probability. In the example provided, the method is able to restore marine stratocumulus near the Californian coast, where the background state has a clear swath. This article also examines a number of algorithmic and physical sensitivities of the new method and provides guidance for its cost-effective implementation. One obvious difficulty for the method, and other cloud data assimilation methods as well, is the lack of information content in passive-radiometer-retrieved cloud observables on cloud vertical structure, beyond cloud-top pressure and optical thickness, thus necessitating strong dependence on the background vertical moisture structure. It is found that a simple flow-dependent correlation modification from Riishojgaard provides some help in this respect, by

  19. Non statistical Monte-Carlo

    International Nuclear Information System (INIS)

    Mercier, B.

    1985-04-01

    We have shown that the transport equation can be solved with particles, like the Monte Carlo method, but without random numbers. In the Monte Carlo method, particles are created from the source and are followed from collision to collision until either they are absorbed or they leave the spatial domain. In our method, particles are created from the original source with a variable weight taking into account both collision and absorption. These particles are followed until they leave the spatial domain, and we use them to determine a first collision source. Another set of particles is then created from this first collision source and tracked to determine a second collision source, and so on. This process introduces an approximation which does not exist in the Monte Carlo method. However, we have analyzed the effect of this approximation and shown that it can be limited. Our method is deterministic and gives reproducible results. Furthermore, when extra accuracy is needed in some region, it is easier to get more particles to go there. It has the same kinds of applications: problems where streaming is dominant rather than collision-dominated problems

  20. BREM5 electroweak Monte Carlo

    International Nuclear Information System (INIS)

    Kennedy, D.C. II.

    1987-01-01

    This is an update on the progress of the BREMMUS Monte Carlo simulator, particularly in its current incarnation, BREM5. The present report is intended only as a follow-up to the Mark II/Granlibakken proceedings, and those proceedings should be consulted for a complete description of the capabilities and goals of the BREMMUS program. The new BREM5 program improves on the previous version of BREMMUS, BREM2, in a number of important ways. In BREM2, the internal loop (oblique) corrections were not treated in consistent fashion, a deficiency that led to renormalization scheme-dependence; i.e., physical results, such as cross sections, were dependent on the method used to eliminate infinities from the theory. Of course, this problem cannot be tolerated in a Monte Carlo designed for experimental use. BREM5 incorporates a new way of treating the oblique corrections, as explained in the Granlibakken proceedings, that guarantees renormalization scheme-independence and dramatically simplifies the organization and calculation of radiative corrections. This technique is to be presented in full detail in a forthcoming paper. BREM5 is, at this point, the only Monte Carlo to contain the entire set of one-loop corrections to electroweak four-fermion processes and renormalization scheme-independence. 3 figures

  1. Statistical implications in Monte Carlo depletions - 051

    International Nuclear Information System (INIS)

    Zhiwen, Xu; Rhodes, J.; Smith, K.

    2010-01-01

    As a result of steady advances in computer power, continuous-energy Monte Carlo depletion analysis is attracting considerable attention for reactor burnup calculations. The typical Monte Carlo analysis is set up as a combination of a Monte Carlo neutron transport solver and a fuel burnup solver; note that the burnup solver is a deterministic module. The statistical errors in Monte Carlo solutions are introduced into nuclide number densities and propagated along fuel burnup. This paper works towards an understanding of the statistical implications in Monte Carlo depletions, including both statistical bias and statistical variations in depleted fuel number densities. The deterministic Studsvik lattice physics code, CASMO-5, is modified to model the Monte Carlo depletion. The statistical bias in depleted number densities is found to be negligible compared to the statistical variations, which, in turn, demonstrates the correctness of the Monte Carlo depletion method. Meanwhile, the statistical variation in number densities generally increases with burnup. Several possible ways of reducing the statistical errors are discussed: 1) increasing the number of individual Monte Carlo histories; 2) increasing the number of time steps; 3) running additional independent Monte Carlo depletion cases. Finally, a new Monte Carlo depletion methodology, called the batch depletion method, is proposed, which consists of performing a set of independent Monte Carlo depletions and is thus capable of estimating the overall statistical errors, including both the local statistical error and the propagated statistical error. (authors)
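
    A minimal sketch of the batch depletion idea in the abstract (the depletion model below is a stand-in exponential burnout with a noisy flux estimate, purely illustrative): independent Monte Carlo depletion replicates are run with different seeds, and the spread of the final number densities estimates the overall statistical error, local plus propagated.

        import numpy as np

        def deplete(seed, n_steps=20, sigma_a=0.05, dt=5.0):
            """One independent Monte Carlo depletion history (stand-in noisy flux estimator)."""
            rng = np.random.default_rng(seed)
            N = 1.0
            for _ in range(n_steps):
                flux = 1.0 + rng.normal(0.0, 0.05)   # statistical error in the MC flux estimate
                N *= np.exp(-sigma_a * flux * dt)    # deterministic burnup update
            return N

        # the batch: independent depletion cases with different random number seeds
        batch = np.array([deplete(seed) for seed in range(50)])
        # spread across the batch estimates the overall (local + propagated) statistical error
        print(f"final density: {batch.mean():.5f} +/- {batch.std(ddof=1):.5f}")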

  2. Bayesian data assimilation in shape registration

    KAUST Repository

    Cotter, C J

    2013-03-28

    In this paper we apply a Bayesian framework to the problem of geodesic curve matching. Given a template curve, the geodesic equations provide a mapping from initial conditions for the conjugate momentum onto topologically equivalent shapes. Here, we aim to recover the well-defined posterior distribution on the initial momentum which gives rise to observed points on the target curve; this is achieved by explicitly including a reparameterization in the formulation. Appropriate priors are chosen for the functions which together determine this field and the positions of the observation points, the initial momentum p0 and the reparameterization vector field ν, informed by regularity results about the forward model. Having done this, we illustrate how maximum likelihood estimators can be used to find regions of high posterior density, but also how we can apply recently developed Markov chain Monte Carlo methods on function spaces to characterize the whole of the posterior density. These illustrative examples also include scenarios where the posterior distribution is multimodal and irregular, leading us to the conclusion that knowledge of a state of global maximal posterior density does not always give us the whole picture, and full posterior sampling can give better quantification of likely states and the overall uncertainty inherent in the problem. © 2013 IOP Publishing Ltd.

  3. Geochemical influences on assimilation of sediment-bound metals in clams and mussels

    Science.gov (United States)

    Griscom, S.B.; Fisher, N.S.; Luoma, S.N.

    2000-01-01

    A series of experiments was performed to evaluate the extent to which Cd, Co, Ag, Se, Cr, and Zn bound to sediments with different geochemical properties could be assimilated by the mussel Mytilus edulis and the clam Macoma balthica. Oxidized and reduced radiolabeled sediments were fed to suspension-feeding animals, the depuration patterns of the individuals were followed by γ-spectrometry, and the assimilation efficiencies (AEs) of ingested metals were determined. AEs from geochemically diverse sediments typically varied less than 2-fold and ranged from 1% for Cr to 42% for Zn. Metals were assimilated from anoxic sediment by both animals; Ag, Cd, and Co AEs in M. balthica were 9-16%, 2-fold lower than from oxic sediment, but in M. edulis AEs were about two times greater from anoxic sediment for all metals but Ag. For oxic sediment, Cd and Co AEs in M. edulis decreased 3-4-fold with increased sediment exposure time to the metals with smaller but significant effects also noted for Zn and Se but not Ag. A less pronounced decrease in AE for M. balthica was evident only after 6 months exposure time. Sequential extractions of the oxidized sediments showed a transfer of metals into more resistant sediment components over time, but the rate did not correlate with a decrease in metal AEs. Comparing the two bivalves, TOC concentrations had an inconsistent effect on metal AEs. AEs of metals from bacteria-coated glass beads were slightly higher than from humic acid-coated beads, which were comparable with whole-sediment AEs. There was correspondence of AE with desorption of Ag, Cd, Co, and Se (but not Zn) from sediments into pH 5 seawater, measured to simulate the gut pH of these bivalves. The results imply that metals associated with sulfides and anoxic sediments are bioavailable, that the bioavailability of metals from sediments decreases over exposure time, that organic carbon content generally has a small effect on AEs, and that AEs of sediment-bound metals differ among

  4. Carbon cycling of European croplands: A framework for the assimilation of optical and microwave Earth observation data

    Science.gov (United States)

    Revill, Andrew; Sus, Oliver; Williams, Mathew

    2013-04-01

    Croplands are traditionally managed to maximise the production of food, feed, fibre and bioenergy. Advancements in agricultural technologies, together with land-use change, have approximately doubled world grain harvests over the past 50 years. Cropland ecosystems also play a significant role in the global carbon (C) cycle and, through changes to C storage in response to management activities, they can provide opportunities for climate change mitigation. However, quantifying and understanding the cropland C cycle is complex, due to variable environmental drivers, varied management practices and often highly heterogeneous landscapes. Efforts to upscale processes using simulation models must resolve these challenges. Here we show how data assimilation (DA) approaches can link C cycle modelling to Earth observation (EO) and reduce uncertainty in upscaling. We evaluate a framework for the assimilation of leaf area index (LAI) time series, empirically derived from EO optical and radar sensors, for state-updating a model of crop development and C fluxes. Sensors are selected with fine spatial resolutions (20-50 m) to resolve variability across field sizes typically used in European agriculture. Sequential DA is used to improve the canopy development simulation, which is validated by comparing time-series LAI and net ecosystem exchange (NEE) predictions to independent ground measurements and eddy covariance observations at multiple European cereal crop sites. Significant empirical relationships were established between the LAI ground measurements and the optical reflectance and radar backscatter, which allowed for single LAI calibrations being valid for all the cropland sites for each sensor. The DA of all EO LAI estimates indicated clear adjustments in LAI and an enhanced representation of daily CO2 exchanges, particularly around the time of peak C uptake. Compared to the simulation without DA, the assimilation of all EO LAI estimates improved the predicted at

  5. DART: New Research Using Ensemble Data Assimilation in Geophysical Models

    Science.gov (United States)

    Hoar, T. J.; Raeder, K.

    2015-12-01

    The Data Assimilation Research Testbed (DART) is a community facility for ensemble data assimilation developed and supported by the National Center for Atmospheric Research. DART provides a comprehensive suite of software, documentation, and tutorials that can be used for ensemble data assimilation research, operations, and education. Scientists and software engineers at NCAR are available to support DART users who want to use existing DART products or develop their own applications. Current DART users range from university professors teaching data assimilation, to individual graduate students working with simple models, through national laboratories doing operational prediction with large state-of-the-art models. DART runs efficiently on many computational platforms ranging from laptops through thousands of cores on the newest supercomputers. This poster focuses on several recent research activities using DART with geophysical models: using CAM/DART to understand whether OCO-2 total precipitable water observations can be useful in numerical weather prediction; impacts of the synergistic use of infrared CO retrievals (MOPITT, IASI) in CAM-CHEM/DART assimilations; assimilation and analysis of observations of Amazonian biomass burning emissions by MOPITT (carbon monoxide), MODIS (aerosol optical depth) and MISR (plume height); long-term evaluation of the chemical response of MOPITT-CO assimilation in CAM-CHEM/DART OSSEs for satellite planning and emission inversion capabilities; improved forward observation operators for land models that have multiple land use/land cover segments in a single grid cell; simulating mesoscale convective systems (MCSs) using a variable-resolution, unstructured grid in the Model for Prediction Across Scales (MPAS) and DART; a mesoscale WRF+DART system that generated an ensemble of year-long, real-time initializations of a convection-allowing model over the United States; and constraining WACCM with observations in the tropical band (30S-30N) using DART
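
    A minimal sketch of the stochastic ensemble Kalman filter update that underlies such ensemble systems (a toy three-variable state with one observed component; ensemble size, covariances and the observed value are illustrative assumptions, and this is not DART code):

        import numpy as np

        rng = np.random.default_rng(5)
        n_state, n_ens = 3, 40
        H = np.array([[1.0, 0.0, 0.0]])                  # observe the first state variable
        R = np.array([[0.1]])                            # observation error covariance
        X = rng.normal(0.0, 1.0, size=(n_state, n_ens))  # forecast ensemble (one member per column)
        y = np.array([0.5])                              # the observation

        A = X - X.mean(axis=1, keepdims=True)            # ensemble anomalies
        Pf_Ht = A @ (H @ A).T / (n_ens - 1)              # Pf H^T estimated from the ensemble
        K = Pf_Ht @ np.linalg.inv(H @ Pf_Ht + R)         # Kalman gain

        # perturbed-observation update applied member by member
        Y = y[:, None] + rng.normal(0.0, np.sqrt(R[0, 0]), size=(1, n_ens))
        X_analysis = X + K @ (Y - H @ X)
        print("analysis mean:", X_analysis.mean(axis=1))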

  6. Assimilation of Altimeter Data into a Quasigeostrophic Model of the Gulf Stream System. Part 2: Assimilation Results

    Science.gov (United States)

    Capotondi, Antonietta; Holland, William R.; Malanotte-Rizzoli, Paola

    1995-01-01

    The improvement in the climatological behavior of a numerical model as a consequence of the assimilation of surface data is investigated. The model used for this study is a quasigeostrophic (QG) model of the Gulf Stream region. The data that have been assimilated are maps of sea surface height that have been obtained as the superposition of sea surface height variability deduced from the Geosat altimeter measurements and a mean field constructed from historical hydrographic data. The method used for assimilating the data is the nudging technique. Nudging has been implemented in such a way as to achieve a high degree of convergence of the surface model fields toward the observations. Comparisons of the assimilation results with available in situ observations show a significant improvement in the degree of realism of the climatological model behavior, with respect to the model in which no data are assimilated. The remaining discrepancies in the model mean circulation seem to be mainly associated with deficiencies in the mean component of the surface data that are assimilated. On the other hand, the possibility of building into the model more realistic eddy characteristics through the assimilation of the surface eddy field proves very successful in driving components of the mean model circulation that are in relatively good agreement with the available observations. Comparisons with current meter time series during a time period partially overlapping the Geosat mission show that the model is able to 'correctly' extrapolate the instantaneous surface eddy signals to depths of approximately 1500 m. The correlation coefficient between current meter and model time series varies from values close to 0.7 in the top 1500 m to values as low as 0.1-0.2 in the deep ocean.
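
    A minimal sketch of the nudging technique in its generic form (scalar stand-in dynamics; the gain 1/tau and the observation pattern are illustrative assumptions): a relaxation term proportional to the observation-model misfit is added to the model tendency wherever surface data are available.

        import numpy as np

        def nudged_run(obs, dt=0.1, tau=0.5):
            """Euler integration of dx/dt = f(x) + (x_obs - x)/tau, nudging toward data."""
            f = lambda x: -0.3 * x + np.sin(x)    # stand-in model dynamics
            x, traj = 0.0, []
            for x_obs in obs:
                tendency = f(x)
                if not np.isnan(x_obs):           # relax toward data only where they exist
                    tendency += (x_obs - x) / tau
                x += dt * tendency
                traj.append(x)
            return np.array(traj)

        obs = np.full(100, np.nan)
        obs[::10] = 1.0                           # intermittent "surface" observations
        print(nudged_run(obs)[-5:])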

  7. An integrated GIS application system for soil moisture data assimilation

    Science.gov (United States)

    Wang, Di; Shen, Runping; Huang, Xiaolong; Shi, Chunxiang

    2014-11-01

    The gaps in knowledge and existing challenges in precisely describing the land surface process make it critical to represent the massive soil moisture data visually and mine the data for further research. This article introduces a comprehensive soil moisture assimilation data analysis system, built with C#, IDL, ArcSDE, Visual Studio 2008 and SQL Server 2005. The system provides integrated services for the management, efficient graphical visualization and analysis of land surface data assimilation. The system not only improves the efficiency of data assimilation management, but also comprehensively integrates the data processing and analysis tools into a GIS development environment, so that analyzing the soil moisture assimilation data and performing GIS spatial analysis can be realized in the same system. The system provides basic GIS map functions, massive data processing and soil moisture product analysis, etc. Besides, it takes full advantage of a spatial data engine called ArcSDE to efficiently manage, retrieve and store all kinds of data. In the system, characteristics of the temporal and spatial patterns of soil moisture are plotted. By analyzing the soil moisture impact factors, it is possible to acquire the correlation coefficients between the soil moisture value and each single impact factor. Daily and monthly comparative analyses of soil moisture products among observations, simulation results and assimilations can be made in the system to display the different trends of these products. Furthermore, a soil moisture map production function is implemented for business application.

  8. Development Of A Data Assimilation Capability For RAPID

    Science.gov (United States)

    Emery, C. M.; David, C. H.; Turmon, M.; Hobbs, J.; Allen, G. H.; Famiglietti, J. S.

    2017-12-01

    The global decline of in situ observations associated with the increasing ability to monitor surface water from space motivates the creation of data assimilation algorithms that merge computer models and space-based observations to produce consistent estimates of terrestrial hydrology that fill the spatiotemporal gaps in observations. RAPID is a routing model based on the Muskingum method that is capable of estimating river streamflow over large scales with a relatively short computing time. This model only requires limited inputs: a reach-based river network, and lateral surface and subsurface flow into the rivers. The relatively simple model physics imply that RAPID simulations could be significantly improved by including a data assimilation capability. Here we present the early developments of such data assimilation approach into RAPID. Given the linear and matrix-based structure of the model, we chose to apply a direct Kalman filter, hence allowing for the preservation of high computational speed. We correct the simulated streamflows by assimilating streamflow observations and our early results demonstrate the feasibility of the approach. Additionally, the use of in situ gauges at continental scales motivates the application of our new data assimilation scheme to altimetry measurements from existing (e.g. EnviSat, Jason 2) and upcoming satellite missions (e.g. SWOT), and ultimately apply the scheme globally.
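
    A minimal sketch of a direct Kalman filter wrapped around a linear, matrix-form routing model (the propagation matrix, noise levels and gauge placement are illustrative stand-ins, not RAPID's Muskingum operator): because the model is linear, both the forecast and the update are plain matrix algebra and remain cheap.

        import numpy as np

        rng = np.random.default_rng(6)
        n = 4                                            # river reaches in the state
        M = 0.8 * np.eye(n) + 0.15 * np.eye(n, k=-1)     # stand-in linear routing operator
        Q = 0.01 * np.eye(n)                             # model error covariance
        H = np.array([[0.0, 0.0, 1.0, 0.0]])             # a single gauge on reach 3
        R = np.array([[0.05]])                           # gauge error covariance

        x, P = np.ones(n), np.eye(n)
        for t in range(20):
            x, P = M @ x, M @ P @ M.T + Q                # forecast step
            y = np.array([x[2] + rng.normal(0.0, 0.2)])  # synthetic gauge reading
            S = H @ P @ H.T + R                          # innovation covariance
            K = P @ H.T @ np.linalg.inv(S)               # Kalman gain
            x = x + (K @ (y - H @ x)).ravel()            # analysis update
            P = (np.eye(n) - K @ H) @ P
        print("analysed streamflow state:", np.round(x, 3))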

  9. Kinetics of 15NH4+ assimilation in Zea mays

    International Nuclear Information System (INIS)

    Magalhaes, J.R.; Ju, G.C.; Rich, P.J.; Rhodes, D.

    1990-01-01

    Comparative studies of 15NH4+ assimilation were undertaken with a GDH1-null mutant of Zea mays and a related (but not strictly isogenic) GDH1-positive wild type from which this mutant was derived. The kinetics of 15NH4+ assimilation into free amino acids and total reduced nitrogen were monitored in both roots and shoots of 2-week-old seedlings supplied with 5 millimolar 99% (15NH4)2SO4 via the aerated root medium in hydroponic culture over a 24-h period. The GDH1-null mutant, with a 10- to 15-fold lower total root GDH activity in comparison to the wild type, was found to exhibit a 40 to 50% lower rate of 15NH4+ assimilation into total reduced nitrogen. The lower rate of 15NH4+ assimilation in the mutant was associated with lower rates of labeling of several free amino acids (including glutamate, glutamine-amino N, aspartate, asparagine-amino N, and alanine) in both roots and shoots of the mutant in comparison to the wild type. Qualitatively, these labeling kinetics appear consistent with a reduced flux of 15N via glutamate in the GDH1-null mutant. However, the responses of the two genotypes to the potent inhibitor of glutamine synthetase, methionine sulfoximine, and differences in morphology of the two genotypes (particularly a lower shoot:root ratio in the GDH1-null mutant) urge caution in concluding that GDH1 is solely responsible for these differences in ammonia assimilation rate.

  10. Nitrogen assimilation in denitrifier Bacillus azotoformans LMG 9581T.

    Science.gov (United States)

    Sun, Yihua; De Vos, Paul; Willems, Anne

    2017-12-01

    Until recently, it has not been generally known that some bacteria can contain the gene inventory for both denitrification and dissimilatory nitrate (NO3-)/nitrite (NO2-) reduction to ammonium (NH4+) (DNRA). Detailed studies of these microorganisms could shed light on the differentiating environmental drivers of both processes without interference of organism-specific variation. Genome analysis of Bacillus azotoformans LMG 9581T shows a remarkable redundancy of dissimilatory nitrogen reduction, with multiple copies of each denitrification gene as well as the DNRA genes nrfAH, but a reduced capacity for nitrogen assimilation, with no nas operon nor amtB gene. Here, we explored nitrogen assimilation in detail using growth experiments in media with different organic and inorganic nitrogen sources at different concentrations. Monitoring of growth, NO3-, NO2- and NH4+ concentrations and N2O production revealed that B. azotoformans LMG 9581T could not grow with NH4+ as sole nitrogen source and confirmed the hypothesis of reduced nitrogen assimilation pathways. However, NH4+ could be assimilated and contributed up to 50% of biomass if yeast extract was also provided. NH4+ also had a significant but concentration-dependent influence on growth rate. The mechanisms behind these observations remain to be resolved but hypotheses for this deficiency in nitrogen assimilation are discussed. In addition, in all growth conditions tested a denitrification phenotype was observed, with all supplied NO3- converted to nitrous oxide (N2O).

  11. Multiscale Data Assimilation for Large-Eddy Simulations

    Science.gov (United States)

    Li, Z.; Cheng, X.; Gustafson, W. I., Jr.; Xiao, H.; Vogelmann, A. M.; Endo, S.; Toto, T.

    2017-12-01

    Large-eddy simulation (LES) is a powerful tool for understanding atmospheric turbulence, boundary layer physics and cloud development, and there is a great need for developing data assimilation methodologies that can constrain LES models. The U.S. Department of Energy Atmospheric Radiation Measurement (ARM) User Facility has been developing the capability to routinely generate ensembles of LES. The LES ARM Symbiotic Simulation and Observation (LASSO) project (https://www.arm.gov/capabilities/modeling/lasso) is generating simulations for shallow convection days at the ARM Southern Great Plains site in Oklahoma. One of the major objectives of LASSO is to develop the capability to observationally constrain LES using a hierarchy of ARM observations. We have implemented a multiscale data assimilation (MSDA) scheme, which allows data assimilation to be implemented separately for distinct spatial scales, so that localized observations can be effectively assimilated to constrain the mesoscale fields in the LES area of about 15 km in width. The MSDA analysis is used to produce forcing data that drive LES. With this LES workflow we have examined 13 days with shallow convection selected from the period May-August 2016. We will describe the implementation of MSDA, present LES results, and address challenges and opportunities for applying data assimilation to LES studies.

  12. Assimilating uncertain, dynamic and intermittent streamflow observations in hydrological models

    Science.gov (United States)

    Mazzoleni, Maurizio; Alfonso, Leonardo; Chacon-Hurtado, Juan; Solomatine, Dimitri

    2015-09-01

    Catastrophic floods cause significant socio-economic losses. Non-structural measures, such as real-time flood forecasting, can potentially reduce flood risk. To this end, data assimilation methods have been used to improve flood forecasts by integrating static ground observations, and in some cases also remote sensing observations, within water models. Current hydrologic and hydraulic research works consider assimilation of observations coming from traditional, static sensors. At the same time, low-cost mobile sensors and mobile communication devices are becoming increasingly available. The main goal and innovation of this study is to demonstrate the usefulness of assimilating uncertain streamflow observations that are dynamic in space and intermittent in time in the context of two different semi-distributed hydrological model structures. The developed method is applied to the Brue basin, where the dynamic observations are imitated by synthetic observations of discharge. The results of this study show how model structures and sensor locations affect in different ways the assimilation of streamflow observations. In addition, it proves how assimilation of such uncertain observations from dynamic sensors can provide model improvements similar to those from streamflow observations coming from a non-optimal network of static physical sensors. This can be a potential application of recent efforts to build citizen observatories of water, which can make citizens an active part of information capturing, evaluation and communication, simultaneously helping to improve model-based flood forecasting.
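
    A minimal sketch of assimilating observations that move in space and arrive intermittently in time (a toy linear reach-to-reach flow model; all matrices, error levels and the synthetic observation stream are illustrative assumptions): the observation operator is rebuilt at each update to follow the mobile sensor, and the filter simply skips the update when no reading arrives.

        import numpy as np

        n = 6                                            # river reaches
        M = 0.85 * np.eye(n) + 0.1 * np.eye(n, k=-1)     # stand-in flow propagation
        Q, r = 0.02 * np.eye(n), 0.1                     # model / observation error levels

        # time step -> (reach index, value): mobile, intermittent readings
        obs = {2: (1, 1.1), 5: (4, 0.7), 6: (0, 0.9), 11: (3, 0.6)}

        x, P = np.ones(n), np.eye(n)
        for t in range(15):
            x, P = M @ x, M @ P @ M.T + Q                # forecast at every step
            if t in obs:                                 # update only when a reading arrives
                i, v = obs[t]
                H = np.zeros((1, n)); H[0, i] = 1.0      # operator follows the moving sensor
                S = H @ P @ H.T + r                      # innovation covariance (1x1)
                K = (P @ H.T) / S                        # Kalman gain
                x = x + (K * (v - x[i])).ravel()         # analysis update
                P = (np.eye(n) - K @ H) @ P
        print(np.round(x, 3))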

  13. Assimilation of GNSS radio occultation observations in GRAPES

    Science.gov (United States)

    Liu, Y.; Xue, J.

    2014-07-01

    This paper reviews the development of global navigation satellite system (GNSS) radio occultation (RO) observation assimilation in the Global/Regional Assimilation and PrEdiction System (GRAPES) of the China Meteorological Administration, including the choice of data to assimilate, the data quality control, the observation operator, the tuning of observation error, and the results of observation impact experiments. The results indicate that RO data have a significantly positive effect on analyses and forecasts at all ranges in GRAPES, not only in the Southern Hemisphere where conventional observations are lacking but also in the Northern Hemisphere where data are rich. It is noted that a relatively simple assimilation and forecast system in which only conventional and RO observations are assimilated still has analysis and forecast skill even after nine months of integration, and the analysis difference between the two hemispheres is gradually reduced with height when compared with NCEP (National Centers for Environmental Prediction) analyses. Finally, as a result of the new onboard payload of the Chinese FengYun-3 (FY-3) satellites, the research status of the RO of FY-3 satellites is also presented.

  14. Assimilating the Future for Better Forecasts and Earlier Warnings

    Science.gov (United States)

    Du, H.; Wheatcroft, E.; Smith, L. A.

    2016-12-01

    Multi-model ensembles have become popular tools to account for some of the uncertainty due to model inadequacy in weather and climate simulation-based predictions. Current multi-model forecasts focus on combining single-model ensemble forecasts by means of statistical post-processing. Assuming each model is developed independently or with different primary target variables, each is likely to contain different dynamical strengths and weaknesses. Under statistical post-processing, such information is only carried by the simulations under a single model ensemble: no advantage is taken to influence simulations under the other models. A novel methodology, named Multi-model Cross Pollination in Time, is proposed as a multi-model ensemble scheme with the aim of integrating the dynamical information about the future from each individual model operationally. The proposed approach generates model states in time by applying data assimilation scheme(s) to yield truly "multi-model trajectories". It is demonstrated to outperform traditional statistical post-processing in the 40-dimensional Lorenz96 flow. Data assimilation approaches are originally designed to improve state estimation from the past up to the current time. The aim of this talk is to introduce a framework that uses data assimilation to improve model forecasts at future times (not to argue for any one particular data assimilation scheme). An illustration of applying data assimilation "in the future" to provide early warning of future high-impact events is also presented.

  15. Impact of Diagrams on Recalling Sequential Elements in Expository Texts.

    Science.gov (United States)

    Guri-Rozenblit, Sarah

    1988-01-01

    Examines the instructional effectiveness of abstract diagrams on recall of sequential relations in social science textbooks. Concludes that diagrams assist significantly the recall of sequential relations in a text and decrease significantly the rate of order mistakes. (RS)

  16. A one-sided sequential test

    Energy Technology Data Exchange (ETDEWEB)

    Racz, A.; Lux, I. [Hungarian Academy of Sciences, Budapest (Hungary). Atomic Energy Research Inst.]

    1996-04-16

    The applicability of the classical sequential probability ratio testing (SPRT) for early failure detection problems is limited by the fact that there is an extra time delay between the occurrence of the failure and its first recognition. Chien and Adams developed a method to minimize this time for the case when the problem can be formulated as testing the mean value of a Gaussian signal. In our paper we propose a procedure that can be applied for both mean and variance testing and that minimizes the time delay. The method is based on a special parametrization of the classical SPRT. The one-sided sequential tests (OSST) can reproduce the results of the Chien-Adams test when applied for mean values. (author).

  17. Documentscape: Intertextuality, Sequentiality & Autonomy at Work

    DEFF Research Database (Denmark)

    Christensen, Lars Rune; Bjørn, Pernille

    2014-01-01

    On the basis of an ethnographic field study, this article introduces the concept of documentscape to the analysis of document-centric work practices. The concept of documentscape refers to the entire ensemble of documents in their mutual intertextual interlocking. Providing empirical data from a global software development case, we show how hierarchical structures and sequentiality across the interlocked documents are critical to how actors make sense of the work of others and what to do next in a geographically distributed setting. Furthermore, we found that while each document is created as part of a quasi-sequential order, this characteristic does not make the document, as a single entity, into a stable object. Instead, we found that the documents were malleable and dynamic while suspended in intertextual structures. Our concept of documentscape points to how the hierarchical structure

  18. A minimax procedure in the context of sequential mastery testing

    NARCIS (Netherlands)

    Vos, Hendrik J.

    1999-01-01

    The purpose of this paper is to derive optimal rules for sequential mastery tests. In a sequential mastery test, the decision is to classify a subject as a master or a nonmaster, or to continue sampling and administering another random test item. The framework of minimax sequential decision theory

  19. Applying the minimax principle to sequential mastery testing

    NARCIS (Netherlands)

    Vos, Hendrik J.

    2002-01-01

    The purpose of this paper is to derive optimal rules for sequential mastery tests. In a sequential mastery test, the decision is to classify a subject as a master, a nonmaster, or to continue sampling and administering another random item. The framework of minimax sequential decision theory (minimum

  20. Optimal Sequential Rules for Computer-Based Instruction.

    Science.gov (United States)

    Vos, Hans J.

    1998-01-01

    Formulates sequential rules for adapting the appropriate amount of instruction to learning needs in the context of computer-based instruction. Topics include Bayesian decision theory, threshold and linear-utility structure, psychometric model, optimal sequential number of test questions, and an empirical example of sequential instructional…

  1. Storm surge model based on variational data assimilation method

    Directory of Open Access Journals (Sweden)

    Shi-li Huang

    2010-06-01

    By combining computational and observational information, the variational data assimilation method has the ability to eliminate errors caused by the uncertainty of parameters in practical forecasting. It was applied to a storm surge model based on unstructured grids with high spatial resolution, intended to improve the forecasting accuracy of storm surges. With the wind stress drag coefficient as the control variable, the variational model was developed and validated through data assimilation tests on an actual storm surge induced by a typhoon. In the data assimilation tests, the model accurately identified the wind stress drag coefficient and obtained results close to the true state. The actual storm surge induced by Typhoon 0515 was then forecast by the developed model, and the results demonstrate its efficiency in practical application.

  2. A Three-Dimensional Variational Assimilation Scheme for Satellite AOD

    Science.gov (United States)

    Liang, Y.; Zang, Z.; You, W.

    2018-04-01

    A three-dimensional variational data assimilation scheme is designed for satellite AOD based on the IMPROVE (Interagency Monitoring of Protected Visual Environments) equation. The observation operator that simulates AOD from the control variables is established by the IMPROVE equation. All 16 control variables in the assimilation scheme are the mass concentrations of aerosol species from the Model for Simulating Aerosol Interactions and Chemistry scheme, so as to take advantage of this scheme's comprehensive analyses of species concentrations and size distributions while remaining computationally efficient. The assimilation scheme also saves computational resources because the IMPROVE equation is a quadratic equation. A single-point observation experiment shows that the information from the single-point AOD is effectively spread horizontally and vertically.
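
    For orientation, the scheme minimizes the standard 3D-Var cost function, with the control vector x collecting the 16 species mass concentrations, H the observation operator built from the IMPROVE equation, and B and R the background- and observation-error covariances:

        J(\mathbf{x}) = \tfrac{1}{2}\,(\mathbf{x}-\mathbf{x}_b)^{\mathsf{T}}\mathbf{B}^{-1}(\mathbf{x}-\mathbf{x}_b)
                      + \tfrac{1}{2}\,\big(H(\mathbf{x})-\mathbf{y}\big)^{\mathsf{T}}\mathbf{R}^{-1}\big(H(\mathbf{x})-\mathbf{y}\big)

    Because the IMPROVE equation is quadratic in the concentrations, evaluating H and its gradient is cheap, which is the source of the computational saving noted above.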

  3. Data Assimilation in Hydrodynamic Models of Continental Shelf Seas

    DEFF Research Database (Denmark)

    Sørensen, Jacob Viborg Tornfeldt

    2004-01-01

    Assimilation of sea surface temperature and parameter estimation in hydrodynamic models are also considered. The main focus has been on the development of robust and efficient techniques applicable in real operational settings. The applied assimilation techniques all use a Kalman filter approach. They consist … The assimilation schemes used in this work are primarily based on two ensemble-based schemes, the Ensemble Kalman Filter and the Reduced Rank Square Root Kalman Filter. In order to investigate the applicability of these and derived schemes, the sensitivity to filter parameters, nonlinearity and bias is examined in artificial tests. Approximate schemes, theoretically presented as using regularised Kalman gains, are introduced and successfully applied in artificial as well as real case scenarios. In particular, distance-dependent and slowly time-varying or constant Kalman gains are shown to possess good hindcast…
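
    As an illustration of the first of these schemes, here is a minimal analysis step of a stochastic (perturbed-observation) Ensemble Kalman Filter; the linear observation operator and Gaussian observation errors are simplifying assumptions, not the thesis' exact configuration.

        import numpy as np

        def enkf_analysis(E, y, H, R, rng):
            """E: (n, m) state ensemble, y: (p,) observations,
            H: (p, n) linear observation operator, R: (p, p) obs-error covariance."""
            m = E.shape[1]
            Y = H @ E                                    # ensemble mapped to obs space
            A = E - E.mean(axis=1, keepdims=True)        # state anomalies
            S = Y - Y.mean(axis=1, keepdims=True)        # obs-space anomalies
            Pyy = S @ S.T / (m - 1) + R                  # innovation covariance
            Pxy = A @ S.T / (m - 1)                      # state-obs cross covariance
            K = Pxy @ np.linalg.inv(Pyy)                 # Kalman gain
            D = rng.multivariate_normal(y, R, size=m).T  # perturbed observations
            return E + K @ (D - Y)                       # updated (analysis) ensemble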

  4. On Locally Most Powerful Sequential Rank Tests

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan

    2017-01-01

    Roč. 36, č. 1 (2017), s. 111-125 ISSN 0747-4946 R&D Projects: GA ČR GA17-07384S Grant - others: Nadační fond na podporu vědy (CZ) Neuron Institutional support: RVO:67985807 Keywords: nonparametric tests * sequential ranks * stopping variable Subject RIV: BA - General Mathematics OBOR OECD: Pure mathematics Impact factor: 0.339, year: 2016

  5. Sequential pattern recognition by maximum conditional informativity

    Czech Academy of Sciences Publication Activity Database

    Grim, Jiří

    2014-01-01

    Roč. 45, č. 1 (2014), s. 39-45 ISSN 0167-8655 R&D Projects: GA ČR (CZ) GA14-02652S; GA ČR (CZ) GA14-10911S Keywords: Multivariate statistics * Statistical pattern recognition * Sequential decision making * Product mixtures * EM algorithm * Shannon information Subject RIV: IN - Informatics, Computer Science Impact factor: 1.551, year: 2014 http://library.utia.cas.cz/separaty/2014/RO/grim-0428565.pdf

  6. Comparing two Poisson populations sequentially: an application

    International Nuclear Information System (INIS)

    Halteman, E.J.

    1986-01-01

    Rocky Flats Plant in Golden, Colorado, monitors each of its employees for radiation exposure. Excess exposure is detected by comparing the means of two Poisson populations. A sequential probability ratio test (SPRT) is proposed as a replacement for the fixed-sample normal approximation test. A uniformly most efficient SPRT exists; however, logistics suggest using a truncated SPRT. The truncated SPRT is evaluated in detail and shown to offer large potential savings in the average time spent by employees in the monitoring process.

  7. Heat accumulation during sequential cortical bone drilling.

    Science.gov (United States)

    Palmisano, Andrew C; Tai, Bruce L; Belmont, Barry; Irwin, Todd A; Shih, Albert; Holmes, James R

    2016-03-01

    Significant research exists regarding heat production during single-hole bone drilling, but no published data exist regarding repetitive sequential drilling. This study elucidates the phenomenon of heat accumulation in sequential drilling with both Kirschner wires (K wires) and standard two-flute twist drills. It was hypothesized that cumulative heat would result in a higher temperature with each subsequent drill pass. Nine holes in a 3 × 3 array were drilled sequentially in moistened cadaveric tibia bone kept at body temperature (about 37 °C). Four thermocouples were placed at the centers of four adjacent holes, 2 mm below the surface. A battery-driven hand drill guided by a servo-controlled motion system was used. Six samples were drilled with each tool (2.0 mm K wire and 2.0 and 2.5 mm standard drills). K wire drilling increased the temperature rise from 5 °C at the first hole to 20 °C at holes 6 through 9. A similar trend was found for the standard drills, with smaller increments. The maximum temperatures of both tools increased … the difference between drill sizes was found to be insignificant (P > 0.05). In conclusion, heat accumulated during sequential drilling, with the size difference being insignificant. The K wire produced more heat than its twist-drill counterparts. This study has demonstrated the heat accumulation phenomenon and its significant effect on temperature. Maximizing the drilling field and reducing the number of drill passes may decrease bone injury. © 2015 Orthopaedic Research Society. Published by Wiley Periodicals, Inc.

  8. Sequential test procedures for inventory differences

    International Nuclear Information System (INIS)

    Goldman, A.S.; Kern, E.A.; Emeigh, C.W.

    1985-01-01

    By means of a simulation study, we investigated the appropriateness of Page's and power-one sequential tests on sequences of inventory differences obtained from an example materials control unit, a sub-area of a hypothetical UF6-to-U3O8 conversion process. The study examined detection probability and run length curves obtained from different loss scenarios. 12 refs., 10 figs., 2 tabs

  9. Sequential neural models with stochastic layers

    DEFF Research Database (Denmark)

    Fraccaro, Marco; Sønderby, Søren Kaae; Paquet, Ulrich

    2016-01-01

    How can we efficiently propagate uncertainty in a latent state representation with recurrent neural networks? This paper introduces stochastic recurrent neural networks, which glue a deterministic recurrent neural network and a state space model together to form a stochastic and sequential neural generative model. The clear separation of deterministic and stochastic layers allows a structured variational inference network to track the factorization of the model's posterior distribution. By retaining both the nonlinear recursive structure of a recurrent neural network and averaging over…

  10. The Impact of the Assimilation of Aquarius Sea Surface Salinity Data in the GEOS Ocean Data Assimilation System

    Science.gov (United States)

    Vernieres, Guillaume Rene Jean; Kovach, Robin M.; Keppenne, Christian L.; Akella, Santharam; Brucker, Ludovic; Dinnat, Emmanuel Phillippe

    2014-01-01

    Ocean salinity and temperature differences drive thermohaline circulations. These properties also play a key role in ocean-atmosphere coupling. With the availability of L-band space-borne observations, it becomes possible to retrieve the global-scale sea surface salinity (SSS) distribution. This study analyzes globally the along-track (Level 2) Aquarius SSS retrievals obtained using both passive and active L-band observations. Aquarius along-track retrieved SSS are assimilated into the ocean data assimilation component of Version 5 of the Goddard Earth Observing System (GEOS-5) assimilation and forecast model. We present a methodology to correct the large biases and errors apparent in Version 2.0 of the Aquarius SSS retrieval algorithm and to map the observed Aquarius SSS retrievals into the ocean model's bulk salinity in the topmost layer. The impact of the assimilation of the corrected SSS on the salinity analysis is evaluated by comparisons with in-situ salinity observations from Argo. The results show a significant reduction of the global biases and RMS of observation-minus-forecast differences at in-situ locations. The most striking results are found in the tropics and southern latitudes. Our results highlight the complementary role of, and the problems that arise during, the assimilation of salinity information from in-situ (Argo) and space-borne surface (SSS) observations.

  11. Assimilation of satellite color observations in a coupled ocean GCM-ecosystem model

    Science.gov (United States)

    Sarmiento, Jorge L.

    1992-01-01

    Monthly average coastal zone color scanner (CZCS) estimates of chlorophyll concentration were assimilated into an ocean general circulation model (GCM) containing a simple model of the pelagic ecosystem. The assimilation was performed in the simplest possible manner, to allow an assessment of whether there were major problems with the ecosystem model or with the assimilation procedure. The current ecosystem model performed well in some regions, but in others failed to assimilate chlorophyll estimates without disrupting important ecosystem properties. This experiment gave insight into those properties of the ecosystem model that must be changed for data assimilation to be generally successful, while raising other important issues about the assimilation procedure.

  12. Biomass assimilation in coupled ecohydrodynamical model of the Mediterranean Sea

    Science.gov (United States)

    Crispi, G.; Bournaski, E.; Crise, A.

    2003-04-01

    Data assimilation has attracted renewed interest in recent years in the environmental sciences. The rapid growth of attention paid to it in oceanography is due to the coming of age of operational services for the marine environment, which will dramatically increase the demand for accurate, timely and reliable estimates of the space and time distribution of physical and, in the near future, biogeochemical fields. Data assimilation combines information derived from measurements with knowledge of the rules that govern the evolution of the system of interest, through formalization and implementation in numerical models. The importance of ocean data assimilation has been recognized by several international programmes, such as JGOFS, GOOS and CLIVAR. This work presents an eco-hydrodynamic model of the Mediterranean Sea developed at the Istituto Nazionale di Oceanografia e di Geofisica Sperimentale - OGS, Trieste, Italy. It includes 3-D MOM-based hydrodynamics of the Mediterranean Sea, coupled with a biochemical model of Nitrogen, Phytoplankton, Zooplankton, and Detritus (NPZD). Monthly mean wind forcings are adopted to force this MOM-NPZD model. For better prediction and analysis of N, P, Z and D distributions in the sea, the model needs assimilation of biomass observations at the sea surface. The chosen approach for evaluating the performance of data assimilation techniques in the coupled model is the definition of a twin-experiment testbed, where a reference run is carried out and its result is taken as the truth. We define a sampling strategy to obtain different datasets to be incorporated in another ecological model in successive runs, in order to appraise the potential of the data assimilation and sampling strategy. Runs carried out with different techniques and different spatio-temporal coverages are compared in order to evaluate the sensitivity to dataset coverage. The alternative discussed is to assume the ecosystem to be at steady state and…

  13. Ensemble streamflow assimilation with the National Water Model.

    Science.gov (United States)

    Rafieeinasab, A.; McCreight, J. L.; Noh, S.; Seo, D. J.; Gochis, D.

    2017-12-01

    Through case studies of flooding across the US, we compare the performance of the National Water Model (NWM) data assimilation (DA) scheme to that of a newly implemented ensemble Kalman filter approach. The NOAA National Water Model (NWM) is an operational implementation of the community WRF-Hydro modeling system. As of August 2016, the NWM forecasts of distributed hydrologic states and fluxes (including soil moisture, snowpack, ET, and ponded water) over the contiguous United States have been publicly disseminated by the National Centers for Environmental Prediction (NCEP). It also provides streamflow forecasts at more than 2.7 million river reaches up to 30 days in advance. The NWM employs a nudging scheme to assimilate more than 6,000 USGS streamflow observations and provide initial conditions for its forecasts. A problem with nudging is that the forecasts relax quickly back to the open-loop bias. This has been partially addressed by an experimental bias correction approach, which was found to have issues with phase errors during flooding events. In this work, we present an ensemble streamflow data assimilation approach combining new channel-only capabilities of the NWM and HydroDART (a coupling of the offline WRF-Hydro model and NCAR's Data Assimilation Research Testbed; DART). Our approach focuses on the single model state of discharge and incorporates error distributions on channel influxes (overland and groundwater) in the assimilation via an ensemble Kalman filter (EnKF). In order to avoid filter degeneracy associated with a limited ensemble size at large scale, DART's covariance inflation (Anderson, 2009) and localization capabilities are implemented and evaluated. The current NWM data assimilation scheme is compared to preliminary results from the EnKF application for several flooding case studies across the US.
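
    DART's inflation is adaptive (Anderson, 2009); the fixed multiplicative form that it generalizes fits in a few lines and shows what inflation does to the ensemble (a sketch, with an assumed inflation factor):

        import numpy as np

        def inflate(ensemble, factor=1.02):
            # Multiplicative covariance inflation: widen the anomalies about the
            # ensemble mean so the filter does not become overconfident.
            mean = ensemble.mean(axis=1, keepdims=True)
            return mean + factor * (ensemble - mean)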

  14. Assimilation of satellite altimeter data into an open ocean model

    Science.gov (United States)

    Vogeler, Armin; Schröter, Jens

    1995-08-01

    Geosat sea surface height data are assimilated into an eddy-resolving quasi-geostrophic open ocean model using the adjoint technique. The method adjusts the initial conditions for all layers and is successful on the timescale of a few weeks. Time-varying values for the open boundaries are prescribed by a much larger quasi-geostrophic model of the Antarctic Circumpolar Current (ACC). Both models have the same resolution of approximately 20×20 km (1/3°×1/6°), have three layers, and include realistic bottom topography and coastlines. The open model box is embedded in the African sector of the ACC. For continuous assimilation of satellite data into the larger model the nudging technique is applied. These results are used for the adjoint optimization procedure as boundary conditions and as a first guess for the initial condition. For the open model box the difference between model and satellite sea surface height that remains after the nudging experiment amounts to a 19-cm root-mean-square error (rmse). By assimilation into the regional model this value can be reduced to a 6-cm rmse for an assimilation period of 20 days. Several experiments which attempt to improve the convergence of the iterative optimization method are reported. Scaling and regularization by smoothing have to be applied carefully. Especially during the first 10 iterations, the convergence can be improved considerably by low-pass filtering of the cost function gradient. The result of a perturbation experiment shows that for longer assimilation periods the influence of the boundary values becomes dominant and they should be determined inversely by data assimilation into the open ocean model.

  15. Assimilation scheme of the Mediterranean Forecasting System: operational implementation

    Directory of Open Access Journals (Sweden)

    E. Demirov

    This paper describes the operational implementation of the data assimilation scheme for the Mediterranean Forecasting System Pilot Project (MFSPP). The assimilation scheme, System for Ocean Forecast and Analysis (SOFA), is a reduced-order Optimal Interpolation (OI) scheme. The order reduction is achieved by projection of the state vector onto vertical Empirical Orthogonal Functions (EOF). The data assimilated are Sea Level Anomaly (SLA) and temperature profiles from Expendable Bathythermographs (XBT). The data collection, quality control, assimilation and forecast procedures are all done in Near Real Time (NRT). The OI is used intermittently with an assimilation cycle of one week, so that an analysis is produced once a week. The forecast is then done for the ten days following the analysis day. The root mean square (RMS) difference between the model forecast and the analysis (the forecast RMS) is below 0.7°C in the surface layers and below 0.2°C in the layers deeper than 200 m for all ten forecast days. The RMS between forecast and initial condition (persistence RMS) is higher than the forecast RMS after the first day, which means that the model forecast improves upon persistence. The calculation of the misfit between the forecast and the satellite data suggests that the model solution represents well the main space and time variability of the SLA, except for a relatively short period of three to four weeks during the summer, when the data show a fast transition between the cyclonic winter and anticyclonic summer regimes. This occurs in the surface layers, which are not corrected by our assimilation scheme hypotheses. On the basis of the forecast skill score analysis, conclusions are drawn about future improvements.

    Key words. Oceanography; general (marginal and semi-enclosed seas); numerical modeling; ocean prediction

  17. Snow water equivalent monitoring retrieved by assimilating passive microwave observations in a coupled snowpack evolution and microwave emission models over North-Eastern Canada

    Science.gov (United States)

    Royer, A.; Larue, F.; De Sève, D.; Roy, A.; Vionnet, V.; Picard, G.; Cosme, E.

    2017-12-01

    Over northern snow-dominated basins, the snow water equivalent (SWE) is of primary interest for spring streamflow forecasting. SWE retrievals from satellite data are still not well resolved, in particular from microwave (MW) measurements, the only type of data sensitive to snow mass. Also, the use of snowpack models is challenging due to the large uncertainties in the meteorological input forcings. This project aims to improve SWE prediction by assimilation of satellite brightness temperatures (TB), without any ground-based observations. The proposed approach couples a detailed multilayer snowpack model (Crocus) with a MW snow emission model (DMRT-ML). The assimilation scheme is a Sequential Importance Resampling Particle filter, applied to ensembles of meteorological forcings perturbed according to their respective uncertainties. Crocus simulations driven by operational meteorological forecasts from the Canadian Global Environmental Multiscale model at 10 km spatial resolution were compared to continuous daily SWE measurements over Québec, North-Eastern Canada (56°-45°N). The results show that the maximum SWE is overestimated by 16% on average, with variations up to +32%. This large variability could have dramatic consequences for spring flood forecasts. Results of the Crocus-DMRT-ML coupling compared to surface-based TB measurements (at 11, 19 and 37 GHz) show that the Crocus snowpack microstructure described by sticky hard spheres within DMRT has to be scaled by a snow stickiness of 0.18, significantly reducing the overall RMSE of simulated TBs. The ability of daily TB assimilation to correct the simulated SWE is first presented through twin experiments with synthetic data, and then with AMSR-2 satellite time series of TBs over the winter, taking into account atmospheric and forest canopy interferences (absorption and emission). The differences between TBs at 19-37 GHz and at 11-19 GHz, in vertical polarization, were assimilated. This assimilation
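
    A minimal sketch of one Sequential Importance Resampling step of the kind described above: particles are weighted by the likelihood of the observed brightness temperature and resampled when the effective sample size degenerates. The Gaussian likelihood and the obs_operator placeholder (standing in for the coupled Crocus/DMRT-ML simulation) are assumptions.

        import numpy as np

        def sir_step(particles, weights, tb_obs, obs_operator, obs_err_std, rng):
            """particles: (n, d) array of states, weights: (n,) importance weights."""
            sim = np.array([obs_operator(p) for p in particles])  # simulated TBs
            loglik = -0.5 * ((tb_obs - sim) / obs_err_std) ** 2   # Gaussian likelihood
            w = weights * np.exp(loglik - loglik.max())           # importance update
            w /= w.sum()
            n = len(w)
            if 1.0 / np.sum(w**2) < n / 2:            # effective sample size too low:
                idx = rng.choice(n, size=n, p=w)      # resample with replacement
                particles, w = particles[idx], np.full(n, 1.0 / n)
            return particles, w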

  18. Monte Carlo Particle Lists: MCPL

    DEFF Research Database (Denmark)

    Kittelmann, Thomas; Klinkby, Esben Bryndt; Bergbäck Knudsen, Erik

    2017-01-01

    A binary format with lists of particle state information, for interchanging particles between various Monte Carlo simulation applications, is presented. Portable C code for file manipulation is made available to the scientific community, along with converters and plugins for several popular simulation packages. Program summary: Program Title: MCPL. Program Files doi: http://dx.doi.org/10.17632/cby92vsv5g.1 Licensing provisions: CC0 for core MCPL, see LICENSE file for details. Programming language: C and C++ External routines/libraries: Geant4, MCNP, McStas, McXtrace Nature of problem: Saving…

  19. Assimilation of LAI time-series in crop production models

    Science.gov (United States)

    Kooistra, Lammert; Rijk, Bert; Nannes, Louis

    2014-05-01

    Agriculture is a large consumer of freshwater, nutrients and land worldwide. Spatially explicit agricultural management activities (e.g., fertilization, irrigation) could significantly improve efficiency in resource use. In previous studies and operational applications, remote sensing has been shown to be a powerful method for spatio-temporal monitoring of actual crop status. As a next step, yield forecasting by assimilating remote sensing based plant variables in crop production models would improve agricultural decision support at both the farm and field level. In this study we investigated the potential of remote sensing based Leaf Area Index (LAI) time-series assimilated into the crop production model LINTUL to improve yield forecasting at field level. The effects of the assimilation method and the amount of assimilated observations were evaluated. The LINTUL-3 crop production model was calibrated and validated for a potato crop on two experimental fields in the south of the Netherlands. A range of data sources (e.g., in-situ soil moisture and weather sensors, destructive crop measurements) was used to calibrate the model for the experimental field in 2010. LAI from Cropscan field radiometer measurements and actual LAI measured with the LAI-2000 instrument were used as input for the LAI time-series. The LAI time-series were assimilated into the LINTUL model and validated for a second experimental field on which potatoes were grown in 2011. Yield in 2011 was simulated with an R2 of 0.82 when compared with field-measured yield. Furthermore, we analysed the potential of assimilating LAI into the LINTUL-3 model through the 'updating' assimilation technique. The deviation between measured and simulated yield decreased from 9371 kg/ha to 8729 kg/ha when assimilating weekly LAI measurements in the LINTUL model over the 2011 season. LINTUL-3 furthermore shows the main growth-reducing factors, which are useful for farm decision support. The combination of crop models and sensor…

  20. Multivariate and multiscale data assimilation in terrestrial systems: a review.

    Science.gov (United States)

    Montzka, Carsten; Pauwels, Valentijn R N; Franssen, Harrie-Jan Hendricks; Han, Xujun; Vereecken, Harry

    2012-11-26

    More and more terrestrial observational networks are being established to monitor climatic, hydrological and land-use changes in different regions of the World. In these networks, time series of states and fluxes are recorded in an automated manner, often with a high temporal resolution. These data are important for the understanding of water, energy, and/or matter fluxes, as well as their biological and physical drivers and interactions with and within the terrestrial system. Similarly, the number and accuracy of variables, which can be observed by spaceborne sensors, are increasing. Data assimilation (DA) methods utilize these observations in terrestrial models in order to increase process knowledge as well as to improve forecasts for the system being studied. The widely implemented automation in observing environmental states and fluxes makes an operational computation more and more feasible, and it opens the perspective of short-time forecasts of the state of terrestrial systems. In this paper, we review the state of the art with respect to DA focusing on the joint assimilation of observational data precedents from different spatial scales and different data types. An introduction is given to different DA methods, such as the Ensemble Kalman Filter (EnKF), Particle Filter (PF) and variational methods (3/4D-VAR). In this review, we distinguish between four major DA approaches: (1) univariate single-scale DA (UVSS), which is the approach used in the majority of published DA applications, (2) univariate multiscale DA (UVMS) referring to a methodology which acknowledges that at least some of the assimilated data are measured at a different scale than the computational grid scale, (3) multivariate single-scale DA (MVSS) dealing with the assimilation of at least two different data types, and (4) combined multivariate multiscale DA (MVMS). Finally, we conclude with a discussion on the advantages and disadvantages of the assimilation of multiple data types in a

  2. Reconstruction of Historical Weather by Assimilating Old Weather Diary Data

    Science.gov (United States)

    Neluwala, P.; Yoshimura, K.; Toride, K.; Hirano, J.; Ichino, M.; Okazaki, A.

    2017-12-01

    Climate constrains not only human lifestyles but also those of other living beings. It is important to investigate historical climate to understand the current and future climates. Information about daily weather can give a better understanding of past life on Earth. Long-term weather influences crop calendars as well as the development of civilizations. Unfortunately, existing reconstructed daily weather data are limited to the period after the 1850s, owing to the availability of instrumental data. The climate data prior to that are derived from proxy materials (e.g., tree-ring width, ice core isotopes, etc.), which are either at annual or decadal scale. However, there are many historical documents which contain information about weather, such as personal diaries. In Japan, around 20 diaries on average from the 16th-19th centuries have been collected and converted into digitized form. Such diary data exist in many other countries. This study aims to reconstruct historical daily weather during the 18th and 19th centuries using personal daily diaries, which contain analogue weather descriptions such as 'cloudy' or 'sunny'. A recent study has shown the possibility of assimilating coarse weather data using idealized experiments. We further extend this study by assimilating modern weather descriptions similar to diary data in recent periods. The Global Spectral Model (GSM) of the National Centers for Environmental Prediction (NCEP) is used to reconstruct weather with the local ensemble transform Kalman filter (LETKF). Descriptive data are first converted to model variables such as total cloud cover (TCC), solar radiation and precipitation using empirical relationships. Those variables are then assimilated on a daily basis after adding random errors to account for the uncertainty of actual diary data. The assimilation of downward shortwave solar radiation using weather descriptions improves the RMSE from 64.3 W/m2 to 33.0 W/m2 and the correlation coefficient (R) from 0.5 to 0.8 compared with the case without any

  3. SMAP Data Assimilation at NASA SPoRT

    Science.gov (United States)

    Blankenship, Clay B.; Case, Jonathan L.; Zavodsky, Bradley T.

    2016-01-01

    The NASA Short-Term Prediction Research and Transition (SPoRT) Center maintains a near-real-time run of the Noah Land Surface Model within the Land Information System (LIS) at 3-km resolution. Soil moisture products from this model are used by several NOAA/National Weather Service Weather Forecast Offices for flood and drought situational awareness. We have implemented assimilation of soil moisture retrievals from the Soil Moisture and Ocean Salinity (SMOS) and Soil Moisture Active Passive (SMAP) satellites, and are now evaluating the SMAP assimilation. The SMAP-enhanced LIS product is planned for public release by October 2016.

  4. Assimilation of radar-based nowcast into HIRLAM NWP model

    DEFF Research Database (Denmark)

    Jensen, David Getreuer; Petersen, Claus; Rasmussen, Michael R.

    2015-01-01

    The present study introduces a nowcast scheme that assimilates radar extrapolation data (RED) into a nowcasting version of the high resolution limited area model (HIRLAM) numerical weather prediction (NWP) model covering the area of Denmark. The RED are based on the Co-TREC (tracking radar echoes by correlation) methodology and are generated from cleaned radar mosaics from the Danish weather radar network. The assimilation technique is a newly developed method that increases model precipitation by increasing low-level convergence and decreasing convergence aloft in order to increase the vertical velocity. The level of improved predictability relies on the RED quality, which in turn depends on the type of event.

  5. Data assimilation in integrated hydrological modeling using ensemble Kalman filtering

    DEFF Research Database (Denmark)

    Rasmussen, Jørn; Madsen, H.; Jensen, Karsten Høgh

    2015-01-01

    Groundwater head and stream discharge are assimilated using the ensemble transform Kalman filter in an integrated hydrological model, with the aim of studying the relationship between filter performance and ensemble size. In an attempt to reduce the required number of ensemble members … and estimating parameters requires a much larger ensemble size than just assimilating groundwater head observations. However, the required ensemble size can be greatly reduced with the use of adaptive localization, which by far outperforms distance-based localization. The study is conducted using synthetic data…
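
    For context, distance-based localization of the kind compared above typically tapers the ensemble covariance with the Gaspari-Cohn compactly supported correlation function; the sketch below gives that standard function (the adaptive scheme favoured by the record is more involved and is not shown).

        import numpy as np

        def gaspari_cohn(dist, c):
            """Gaspari-Cohn taper: 1 at distance 0, exactly 0 beyond 2*c."""
            r = np.atleast_1d(np.abs(dist) / c).astype(float)
            out = np.zeros_like(r)
            m1 = r <= 1.0
            m2 = (r > 1.0) & (r < 2.0)
            out[m1] = (-0.25 * r[m1]**5 + 0.5 * r[m1]**4 + 0.625 * r[m1]**3
                       - 5.0 / 3.0 * r[m1]**2 + 1.0)
            out[m2] = (r[m2]**5 / 12.0 - 0.5 * r[m2]**4 + 0.625 * r[m2]**3
                       + 5.0 / 3.0 * r[m2]**2 - 5.0 * r[m2] + 4.0 - 2.0 / (3.0 * r[m2]))
            return out

        # The localized covariance is the element-wise (Schur) product
        # P_loc = gaspari_cohn(pairwise_distances, c) * P_ensemble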

  6. Monte Carlo techniques in radiation therapy

    CERN Document Server

    Verhaegen, Frank

    2013-01-01

    Modern cancer treatment relies on Monte Carlo simulations to help radiotherapists and clinical physicists better understand and compute radiation dose from imaging devices, as well as to exploit four-dimensional imaging data. With Monte Carlo-based treatment planning tools now available from commercial vendors, a complete transition to Monte Carlo-based dose calculation methods in radiotherapy could likely take place in the next decade. Monte Carlo Techniques in Radiation Therapy explores the use of Monte Carlo methods for modeling various features of internal and external radiation sources, including light ion beams. The book, the first of its kind, addresses applications of the Monte Carlo particle transport simulation technique in radiation therapy, mainly focusing on external beam radiotherapy and brachytherapy. It presents the mathematical and technical aspects of the methods in particle transport simulations. The book also discusses the modeling of medical linacs and other irradiation devices; issues specific…

  7. Coupled atmosphere and land-surface assimilation of surface observations with a single column model and ensemble data assimilation

    Science.gov (United States)

    Rostkier-Edelstein, Dorita; Hacker, Joshua P.; Snyder, Chris

    2014-05-01

    Numerical weather prediction and data assimilation models are composed of coupled atmosphere and land-surface (LS) components. If possible, the assimilation procedure should be coupled, so that observed information in one module is used to correct fields in the coupled module. There have been some attempts in this direction using optimal interpolation, nudging and 2/3DVAR data assimilation techniques. Aside from satellite remotely sensed observations, reference-height in-situ observations of temperature and moisture have been used in these studies. Among other problems, difficulties in coupled atmosphere-LS assimilation arise as a result of the different time scales characteristic of each component and the unsteady correlation between these components under varying flow conditions. Ensemble data-assimilation techniques rely on flow-dependent observation-model covariances. Provided that correlations and covariances between land and atmosphere can be adequately simulated and sampled, ensemble data assimilation should enable appropriate assimilation of observations simultaneously into the atmospheric and LS states. Our aim is to explore assimilation of reference-height in-situ temperature and moisture observations into the coupled atmosphere-LS modules (simultaneously) in NCAR's WRF-ARW model using NCAR's DART ensemble data-assimilation system. Observing system simulation experiments (OSSEs) are performed using the single column model (SCM) version of WRF. Numerical experiments during a warm season are centered on an atmospheric and soil column in the Southern Great Plains. Synthetic observations are derived from "truth" WRF-SCM runs for a given date, initialized and forced using North American Regional Reanalyses (NARR). WRF-SCM atmospheric and LS ensembles are created by mixing the atmospheric and soil NARR profile centered on a given date with that from another day (randomly chosen from the same season), with weights drawn from a logit-normal distribution. Three

  8. Monte Carlo simulations of neutron scattering instruments

    International Nuclear Information System (INIS)

    Åstrand, Per-Olof; Copenhagen Univ.; Lefmann, K.; Nielsen, K.

    2001-01-01

    A Monte Carlo simulation is an important computational tool used in many areas of science and engineering. The use of Monte Carlo techniques for simulating neutron scattering instruments is discussed. The basic ideas, techniques and approximations are presented. Since the construction of a neutron scattering instrument is very expensive, Monte Carlo software used for the design of instruments has to be validated and tested extensively. The McStas software was designed with these aspects in mind, and some of its basic principles will be discussed. Finally, some future prospects are discussed for using Monte Carlo simulations in optimizing neutron scattering experiments. (R.P.)

  9. Monte Carlo surface flux tallies

    International Nuclear Information System (INIS)

    Favorite, Jeffrey A.

    2010-01-01

    Particle fluxes on surfaces are difficult to calculate with Monte Carlo codes because the score requires a division by the surface-crossing angle cosine, and grazing angles lead to inaccuracies. We revisit the standard practice of dividing by half of a cosine 'cutoff' for particles whose surface-crossing cosines are below the cutoff. The theory behind this approximation is sound, but the application of the theory to all possible situations does not account for two implicit assumptions: (1) the grazing band must be symmetric about 0, and (2) a single linear expansion for the angular flux must be applied in the entire grazing band. These assumptions are violated in common circumstances; for example, for separate in-going and out-going flux tallies on internal surfaces, and for out-going flux tallies on external surfaces. In some situations, dividing by two-thirds of the cosine cutoff is more appropriate. If users were able to control both the cosine cutoff and the substitute value, they could use these parameters to make accurate surface flux tallies. The procedure is demonstrated in a test problem in which Monte Carlo surface fluxes in cosine bins are converted to angular fluxes and compared with the results of a discrete ordinates calculation.
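
    A toy version of the estimator under discussion, with hypothetical names and values: crossings with |mu| above the cutoff score w/|mu|, while grazing crossings score w divided by a fraction of the cutoff (one-half in standard practice; the abstract argues two-thirds is more appropriate in some one-sided situations).

        import numpy as np

        def surface_flux_tally(weights, mus, cutoff=0.1, denom_fraction=0.5):
            """weights: particle weights at the crossings; mus: crossing cosines.
            denom_fraction = 0.5 reproduces standard practice; 2/3 gives the variant."""
            mus = np.abs(np.asarray(mus, dtype=float))
            scores = np.where(mus >= cutoff,
                              weights / np.maximum(mus, 1e-12),    # regular 1/|mu| score
                              weights / (denom_fraction * cutoff))  # grazing band
            return scores.sum()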

  10. Power distribution system reliability evaluation using dagger-sampling Monte Carlo simulation

    Energy Technology Data Exchange (ETDEWEB)

    Hu, Y.; Zhao, S.; Ma, Y. [North China Electric Power Univ., Hebei (China). Dept. of Electrical Engineering

    2009-03-11

    A dagger-sampling Monte Carlo simulation method was used to evaluate power distribution system reliability. The dagger-sampling technique was used to record the failure of a component as an incident and to determine its occurrence probability by generating incident samples using random numbers. The dagger sampling technique was combined with the direct sequential Monte Carlo method to calculate average values of load point indices and system indices. Results of the 2 methods with simulation times of up to 100,000 years were then compared. The comparative evaluation showed that less computing time was required using the dagger-sampling technique due to its higher convergence speed. When simulation times were 1000 years, the dagger-sampling method required 0.05 seconds to accomplish an evaluation, while the direct method required 0.27 seconds. 12 refs., 3 tabs., 4 figs.

  11. MCOR - Monte Carlo depletion code for reference LWR calculations

    Energy Technology Data Exchange (ETDEWEB)

    Puente Espel, Federico, E-mail: fup104@psu.edu [Department of Mechanical and Nuclear Engineering, Pennsylvania State University (United States); Tippayakul, Chanatip, E-mail: cut110@psu.edu [Department of Mechanical and Nuclear Engineering, Pennsylvania State University (United States); Ivanov, Kostadin, E-mail: kni1@psu.edu [Department of Mechanical and Nuclear Engineering, Pennsylvania State University (United States); Misu, Stefan, E-mail: Stefan.Misu@areva.com [AREVA, AREVA NP GmbH, Erlangen (Germany)

    2011-04-15

    Research highlights: > Introduction of a reference Monte Carlo based depletion code with extended capabilities. > Verification and validation results for MCOR. > Utilization of MCOR for benchmarking deterministic lattice physics (spectral) codes. - Abstract: The MCOR (MCnp-kORigen) code system is a Monte Carlo based depletion system for reference fuel assembly and core calculations. The MCOR code is designed as an interfacing code that provides depletion capability to the LANL Monte Carlo code by coupling two codes: MCNP5 with the AREVA NP depletion code, KORIGEN. The physical quality of both codes is unchanged. The MCOR code system has been maintained and continuously enhanced since it was initially developed and validated. The verification of the coupling was made by evaluating the MCOR code against similar sophisticated code systems like MONTEBURNS, OCTOPUS and TRIPOLI-PEPIN. After its validation, the MCOR code has been further improved with important features. The MCOR code presents several valuable capabilities such as: (a) a predictor-corrector depletion algorithm, (b) utilization of KORIGEN as the depletion module, (c) individual depletion calculation of each burnup zone (no burnup zone grouping is required, which is particularly important for the modeling of gadolinium rings), and (d) on-line burnup cross-section generation by the Monte Carlo calculation for 88 isotopes and usage of the KORIGEN libraries for PWR and BWR typical spectra for the remaining isotopes. Besides the just mentioned capabilities, the MCOR code newest enhancements focus on the possibility of executing the MCNP5 calculation in sequential or parallel mode, a user-friendly automatic re-start capability, a modification of the burnup step size evaluation, and a post-processor and test-matrix, just to name the most important. The article describes the capabilities of the MCOR code system; from its design and development to its latest improvements and further ameliorations. Additionally

  13. On the use of stochastic approximation Monte Carlo for Monte Carlo integration

    KAUST Repository

    Liang, Faming

    2009-01-01

    The stochastic approximation Monte Carlo (SAMC) algorithm has recently been proposed as a dynamic optimization algorithm in the literature. In this paper, we show in theory that the samples generated by SAMC can be used for Monte Carlo integration

  14. Monte Carlo simulation based reliability evaluation in a multi-bilateral contracts market

    International Nuclear Information System (INIS)

    Goel, L.; Viswanath, P.A.; Wang, P.

    2004-01-01

    This paper presents a time-sequential Monte Carlo simulation technique to evaluate customer load-point reliability in a multi-bilateral contracts market. The effects of bilateral transactions, reserve agreements, and the priority commitments of generating companies on customer load-point reliability have been investigated. A generating company with bilateral contracts is modelled as an equivalent time-varying multi-state generation (ETMG). A procedure to determine load-point reliability based on the ETMG has been developed. The developed procedure is applied to a reliability test system to illustrate the technique. Representing each bilateral contract by an ETMG provides flexibility in determining the reliability at various customer load points. (authors)
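
    The core of such a time-sequential simulation is a clock that alternates exponentially distributed up and down durations for each component and accumulates load-point statistics; the sketch below does this for a single two-state component, with all parameter values illustrative.

        import numpy as np

        def simulate_load_point(mtbf_hours, mttr_hours, years, rng):
            """Returns (failures per year, outage hours per year) for one component."""
            t, horizon = 0.0, years * 8760.0
            n_fail, down = 0, 0.0
            while True:
                t += rng.exponential(mtbf_hours)      # operating (up) duration
                if t >= horizon:
                    break
                repair = rng.exponential(mttr_hours)  # repair (down) duration
                n_fail += 1
                down += min(repair, horizon - t)
                t += repair
            return n_fail / years, down / years

        rng = np.random.default_rng(2)
        freq, outage = simulate_load_point(4380.0, 8.0, years=1000, rng=rng)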

  15. Remarks on a financial inverse problem by means of Monte Carlo Methods

    Science.gov (United States)

    Cuomo, Salvatore; Di Somma, Vittorio; Sica, Federica

    2017-10-01

    Estimating the price of a barrier option is a typical inverse problem. In this paper we present a numerical and statistical framework for a market with a risk-free interest rate and a risky asset described by a Geometric Brownian Motion (GBM). After approximating the risky asset with a numerical method, we find the final option price by following an approach based on sequential Monte Carlo methods. All theoretical results are applied to the case of an option whose underlying is a real stock.
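
    A minimal sketch of the forward problem: pricing an up-and-out barrier call under GBM by plain Monte Carlo (the paper layers a sequential Monte Carlo approach on top of such simulations; all parameter values here are illustrative).

        import numpy as np

        def barrier_call_price(s0, strike, barrier, r, sigma, T, n_steps, n_paths, rng):
            dt = T / n_steps
            s = np.full(n_paths, s0)
            alive = np.ones(n_paths, dtype=bool)       # paths not yet knocked out
            for _ in range(n_steps):
                z = rng.standard_normal(n_paths)
                # Exact GBM update over one monitoring interval
                s *= np.exp((r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z)
                alive &= s < barrier                   # knock out on barrier breach
            payoff = np.where(alive, np.maximum(s - strike, 0.0), 0.0)
            return np.exp(-r * T) * payoff.mean()      # discounted expectation

        rng = np.random.default_rng(3)
        price = barrier_call_price(100.0, 100.0, 130.0, 0.03, 0.2, 1.0, 252, 100_000, rng)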

  16. Bayesian modeling of the assimilative capacity component of nutrient total maximum daily loads

    Science.gov (United States)

    Faulkner, B. R.

    2008-08-01

    Implementing stream restoration techniques and best management practices to reduce nonpoint source nutrients implies enhancement of the assimilative capacity for the stream system. In this paper, a Bayesian method for evaluating this component of a total maximum daily load (TMDL) load capacity is developed and applied. The joint distribution of nutrient retention metrics from a literature review of 495 measurements was used for Monte Carlo sampling with a process transfer function for nutrient attenuation. Using the resulting histograms of nutrient retention, reference prior distributions were developed for sites in which some of the metrics contributing to the transfer function were measured. Contributing metrics for the prior include stream discharge, cross-sectional area, fraction of storage volume to free stream volume, denitrification rate constant, storage zone mass transfer rate, dispersion coefficient, and others. Confidence of compliance (CC) that any given level of nutrient retention has been achieved is also determined using this approach. The shape of the CC curve is dependent on the metrics measured and serves in part as a measure of the information provided by the metrics to predict nutrient retention. It is also a direct measurement, with a margin of safety, of the fraction of export load that can be reduced through changing retention metrics. For an impaired stream in western Oklahoma, a combination of prior information and measurement of nutrient attenuation was used to illustrate the proposed approach. This method may be considered for TMDL implementation.

  17. A comparison of sequential and information-based methods for determining the co-integration rank in heteroskedastic VAR models

    DEFF Research Database (Denmark)

    Cavaliere, Giuseppe; Angelis, Luca De; Rahbek, Anders

    2015-01-01

    In this article, we investigate the behaviour of a number of methods for estimating the co-integration rank in VAR systems characterized by heteroskedastic innovation processes. In particular, we compare the efficacy of the most widely used information criteria, such as the Akaike Information Criterion … The relative finite-sample properties of the different methods are investigated by means of a Monte Carlo simulation study. For the simulation DGPs considered in the analysis, we find that the BIC-based procedure and the bootstrap sequential test procedure deliver the best overall performance in terms … -based method to over-estimate the co-integration rank in relatively small sample sizes…

  18. Monte Carlo simulation of VHTR particle fuel with chord length sampling

    International Nuclear Information System (INIS)

    Ji, W.; Martin, W. R.

    2007-01-01

    The Very High Temperature Gas-Cooled Reactor (VHTR) poses a problem for neutronic analysis due to the double heterogeneity posed by the particle fuel and either the fuel compacts, in the case of the prismatic block reactor, or the fuel pebbles, in the case of the pebble bed reactor. Direct Monte Carlo simulation has been used in recent years to analyze these VHTR configurations but is computationally challenged when space-dependent phenomena such as depletion or temperature feedback are considered. As an alternative approach, we have considered chord length sampling to reduce the computational burden of the Monte Carlo simulation. We have improved on an existing method called 'limited chord length sampling' and have used it to analyze stochastic media representative of either pebble bed or prismatic VHTR fuel geometries. Based on the assumption that the PDF has an exponential form, a theoretical chord length distribution is derived and shown to be an excellent model for a wide range of packing fractions. This chord length PDF was then used to analyze a stochastic medium constructed using the RSA (Random Sequential Addition) algorithm, and the results were compared to a benchmark Monte Carlo simulation of the actual stochastic geometry. The results are promising and suggest that the theoretical chord length PDF can be used instead of a full Monte Carlo random walk simulation in the stochastic medium, saving orders of magnitude in computational time (and memory demand) to perform the simulation. (authors)
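
    A toy sketch of the idea: rather than tracking through an explicit sphere geometry, the flight distance to the next fuel kernel is drawn from an assumed exponential matrix-chord PDF, and the chord through the kernel from the analytic sphere-chord distribution. The matrix mean chord 4r(1-f)/(3f) is a commonly quoted approximation, used here as an assumption rather than the paper's fitted PDF.

        import numpy as np

        def fuel_path_fraction(total_path, radius, packing_fraction, rng):
            """Fraction of a track length spent inside fuel kernels."""
            lam_matrix = 4.0 * radius * (1.0 - packing_fraction) / (3.0 * packing_fraction)
            s, in_fuel = 0.0, 0.0
            while s < total_path:
                gap = rng.exponential(lam_matrix)             # matrix flight to next kernel
                chord = 2.0 * radius * np.sqrt(rng.random())  # sphere chord, mean 4r/3
                in_fuel += min(chord, max(total_path - s - gap, 0.0))
                s += gap + chord
            return in_fuel / total_path

        rng = np.random.default_rng(4)
        frac = fuel_path_fraction(1e4, radius=0.025, packing_fraction=0.3, rng=rng)
        # frac should approach the packing fraction for long tracks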

  19. Effects of Model Chemistry and Data Biases on Stratospheric Ozone Assimilation

    National Research Council Canada - National Science Library

    Coy, L; Allen, D. R; Eckermann, S. D; McCormack, J. P; Stajner, I; Hogan, T. F

    2007-01-01

    … In this study, O-F statistics from the Global Ozone Assimilation Testing System (GOATS) are used to examine how ozone assimilation products and their associated O-F statistics depend on input data biases and ozone photochemistry parameterizations (OPP…

  20. Sensitivity of Satellite Altimetry Data Assimilation on a Weapon Acoustic Preset Using MODAS

    National Research Council Canada - National Science Library

    Chu, Peter; Mancini, Steven; Gottshall, Eric; Cwalina, David; Barron, Charlie N

    2007-01-01

    …) is analyzed with SSP derived from the Modular Ocean Data Assimilation System (MODAS). The MODAS fields differ in that one uses altimeter data assimilated from three satellites while the other uses no altimeter data…

  1. Sucrose assimilation and the role of sucrose transporters in plant ...

    African Journals Online (AJOL)


    2008-12-29

    African Journal of Biotechnology Vol. 7 (25). Review: Sucrose assimilation and the role of sucrose transporters in plant wound response. Omodele … Key words: Sucrose transporters, Plasma membrane, carbohydrate, sieve element, source-sink. … pathogens (Paul et al., 2000) and results in a severe…

  2. Economic Assimilation and Outmigration of Immigrants in West-Germany

    NARCIS (Netherlands)

    Bellemare, C.

    2003-01-01

    By analyzing the earnings of observed immigrant workers, the literature on the economic assimilation of immigrants has generally overlooked two potentially important selectivity issues. First, earnings of immigrant workers may differ substantially from those of non-workers. Second, earnings of immigrants

  3. Opinion Dynamics with Heterogeneous Interactions and Information Assimilation

    Science.gov (United States)

    Mir Tabatabaei, Seydeh Anahita

    2013-01-01

    In any modern society, individuals interact to form opinions on various topics, including economic, political, and social aspects. Opinions evolve as the result of the continuous exchange of information among individuals and of the assimilation of information distributed by media. The impact of individuals' opinions on each other forms a network,…

  4. Anterior Cingulate Cortex in Schema Assimilation and Expression

    Science.gov (United States)

    Wang, Szu-Han; Tse, Dorothy; Morris, Richard G. M.

    2012-01-01

    In humans and in animals, mental schemas can store information within an associative framework that enables rapid and efficient assimilation of new information. Using a hippocampal-dependent paired-associate task, we now report that the anterior cingulate cortex is part of a neocortical network of schema storage with NMDA receptor-mediated…

  5. Naming game with biased assimilation over adaptive networks

    Science.gov (United States)

    Fu, Guiyuan; Zhang, Weidong

    2018-01-01

    The dynamics of a two-word naming game incorporating the influence of biased assimilation over an adaptive network is investigated in this paper. First, an extended naming game with biased assimilation (NGBA) is proposed. The hearer in NGBA accepts the received information in a biased manner: if the conveyed word differs from his current memory, he may refuse to accept it with a predefined probability. Second, the adaptive network is formulated by rewiring links. Theoretical analysis shows that the population in NGBA will eventually reach global consensus on either A or B. Numerical simulation results show that the larger the strength of biased assimilation on both words, the slower the convergence, while a larger strength of biased assimilation on only one word can slightly accelerate convergence; a larger population size slows convergence considerably as the population grows from a relatively small size, while the effect becomes minor once the population is large; and adaptively reconnecting the existing links can greatly accelerate convergence, especially on a sparsely connected network.
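
    A minimal sketch of the biased-acceptance rule described above, assuming the standard minimal naming game as the baseline and omitting the paper's adaptive rewiring; the bias strengths, population size and interaction count are made-up values.

        import random

        BETA = {"A": 0.3, "B": 0.3}   # assumed biased-assimilation strength per word

        def interact(speaker, hearer):
            # One NGBA interaction; memories are sets such as {"A"} or {"A", "B"}
            word = random.choice(sorted(speaker))
            if word in hearer:
                return {word}, {word}          # success: both collapse to the word
            if random.random() > BETA[word]:   # biased acceptance of a new word
                hearer = hearer | {word}
            return speaker, hearer

        agents = [{"A"} if random.random() < 0.5 else {"B"} for _ in range(100)]
        for _ in range(10000):
            i, j = random.sample(range(len(agents)), 2)
            agents[i], agents[j] = interact(agents[i], agents[j])
        print(sum(a == {"A"} for a in agents), "agents hold only word A")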

  6. Volcanic Ash Data Assimilation System for Atmospheric Transport Model

    Science.gov (United States)

    Ishii, K.; Shimbori, T.; Sato, E.; Tokumoto, T.; Hayashi, Y.; Hashimoto, A.

    2017-12-01

    The Japan Meteorological Agency (JMA) has two operations for volcanic ash forecasts: the Volcanic Ash Fall Forecast (VAFF) and the Volcanic Ash Advisory (VAA). In these operations, the forecasts are calculated by atmospheric transport models including the advection process, the turbulent diffusion process, the gravitational fall process and the deposition process (wet/dry). The initial distribution of volcanic ash in the models is the most important but most uncertain factor. In operations, the model of Suzuki (1983), with many empirical assumptions, is adopted for the initial distribution. This adversely affects the reconstruction of actual eruption plumes. We are developing a volcanic ash data assimilation system using weather radar and meteorological satellite observations in order to improve the initial distribution of the atmospheric transport models. Our data assimilation system is based on the three-dimensional variational data assimilation method (3D-Var). Analysis variables are ash concentration and size distribution parameters, which are mutually independent. The radar observation is expected to provide three-dimensional parameters such as ash concentration and parameters of the ash particle size distribution. The satellite observation, on the other hand, is anticipated to provide two-dimensional parameters of ash clouds such as mass loading, top height and particle effective radius. In this study, we estimate the thickness of ash clouds using the vertical wind shear of the JMA numerical weather prediction and apply it to the volcanic ash data assimilation system.

  7. A reduced adjoint approach to variational data assimilation

    KAUST Repository

    Altaf, Muhammad; El Gharamti, Mohamad; Heemink, Arnold W.; Hoteit, Ibrahim

    2013-01-01

    The adjoint method has been used very often for variational data assimilation. The computational cost to run the adjoint model often exceeds several original model runs and the method needs significant programming efforts to implement the adjoint model code. The work proposed here is variational data assimilation based on proper orthogonal decomposition (POD) which avoids the implementation of the adjoint of the tangent linear approximation of the original nonlinear model. An ensemble of the forward model simulations is used to determine the approximation of the covariance matrix and only the dominant eigenvectors of this matrix are used to define a model subspace. The adjoint of the tangent linear model is replaced by the reduced adjoint based on this reduced space. Thus the adjoint model is run in reduced space with negligible computational cost. Once the gradient is obtained in reduced space it is projected back in full space and the minimization process is carried in full space. In the paper the reduced adjoint approach to variational data assimilation is introduced. The characteristics and performance of the method are illustrated with a number of data assimilation experiments in a ground water subsurface contaminant model. © 2012 Elsevier B.V.
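
    The reduced adjoint construction can be sketched as follows, assuming a snapshot ensemble is available: an SVD of the centered snapshot matrix yields the dominant eigenvectors of the covariance that define the model subspace, and a gradient obtained in that subspace is projected back to full space for the minimization. This is a schematic illustration, not the authors' implementation.

        import numpy as np

        def pod_basis(snapshots, n_modes):
            # snapshots: (state_dim, n_ensemble) matrix of forward-model states
            mean = snapshots.mean(axis=1, keepdims=True)
            U, s, _ = np.linalg.svd(snapshots - mean, full_matrices=False)
            return U[:, :n_modes], mean   # dominant eigenvectors of the covariance

        def gradient_to_full_space(grad_reduced, basis):
            # Project a gradient computed in the reduced (POD) space back to
            # full space, where the minimization itself is carried out
            return basis @ grad_reduced

        rng = np.random.default_rng(0)
        X = rng.standard_normal((10_000, 50))   # synthetic ensemble of 50 states
        basis, mean = pod_basis(X, n_modes=10)
        print(gradient_to_full_space(np.ones(10), basis).shape)   # (10000,)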

  9. Probability Maps for the Visualization of Assimilation Ensemble Flow Data

    KAUST Repository

    Hollt, Thomas

    2015-05-25

    Ocean forecasts nowadays are created by running ensemble simulations in combination with data assimilation techniques. Most of these techniques resample the ensemble members after each assimilation cycle. This means that in a time series, after resampling, every member can follow up on any of the members before resampling. Tracking behavior over time, such as all possible paths of a particle in an ensemble vector field, becomes very difficult, as the number of combinations rises exponentially with the number of assimilation cycles. In general a single possible path is not of interest, but only the probabilities that any point in space might be reached by a particle at some point in time. In this work we present an approach using probability-weighted piecewise particle trajectories to allow such a mapping interactively, instead of tracing quadrillions of individual particles. We achieve interactive rates by binning the domain and splitting up the tracing process into the individual assimilation cycles, so that particles that fall into the same bin after a cycle can be treated as a single particle with a larger probability as input for the next time step. As a result we lose the ability to track individual particles, but can create probability maps for any desired seed at interactive rates.
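
    A toy sketch of the binning idea under stated assumptions (1-D integer bins and two equally weighted ensemble members acting as advection operators): probability mass landing in the same bin is merged into a single weighted particle before the next cycle.

        from collections import defaultdict

        def propagate_cycle(mass, members, weights, advect):
            # mass: dict bin -> probability that the particle is in that bin
            new_mass = defaultdict(float)
            for b, p in mass.items():
                for field, w in zip(members, weights):
                    # Particles falling into the same bin are merged into one
                    # "particle" carrying the accumulated probability
                    new_mass[advect(b, field)] += p * w
            return dict(new_mass)

        members = [+1, -1]            # toy members: shift the particle by +/- 1
        mass = {0: 1.0}               # seed point
        for _ in range(3):            # three assimilation cycles
            mass = propagate_cycle(mass, members, [0.5, 0.5], lambda b, v: b + v)
        print(mass)                   # binomial spread of probability over bins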

  10. Music playlist generation by assimilating GMMs into SOMs

    NARCIS (Netherlands)

    Balkema, Wietse; van der Heijden, Ferdinand

    A method for music playlist generation using assimilated Gaussian mixture models (GMMs) in self-organizing maps (SOMs) is presented. Traditionally, the neurons in a SOM are represented by vectors, but in this paper we propose to use GMMs instead. To this end, we introduce a method to adapt a GMM

  11. Growth, assimilate partitioning and grain yield response of soybean ...

    African Journals Online (AJOL)

    This investigation tested variation in the growth components, assimilate partitioning and grain yield of soybean (Glycine max L. Merrrill) varieties established in CO2 enriched atmosphere when inoculated with mixtures of Arbuscular mycorrhizal fungi (AMF) species in the humid rainforest of Nigeria. A pot and a field ...

  12. Homophily and assimilation among sportactive adolescent substance users

    NARCIS (Netherlands)

    Pearson, M; Steglich, Ch.; Snijders, T.A.B.

    2006-01-01

    We analyse the co-evolution of social networks and substance use behaviour of adolescents and address the problem of separating the effects of homophily and assimilation. Adolescents who prefer friends with the same substance-use behaviour exhibit the homophily principle. Adolescents who adapt their

  13. Modelling Effluent Assimilative Capacity of Ikpoba River, Benin City ...

    African Journals Online (AJOL)

    The sheer display of reprehensible propensity on the part of public hospitals, abattoirs, breweries and city dwellers at large to discharge untreated waste, debris, scum and, in particular, municipal and industrial effluents into Ikpoba River has morphed into a situation whereby the assimilative capacity of the river has reached ...

  14. Satellite Data Assimilation within KIAPS-LETKF system

    Science.gov (United States)

    Jo, Y.; Lee, S., Sr.; Cho, K.

    2016-12-01

    The Korea Institute of Atmospheric Prediction Systems (KIAPS) has been developing an ensemble data assimilation system using a four-dimensional local ensemble transform Kalman filter (LETKF; Hunt et al., 2007) within the KIAPS Integrated Model (KIM), referred to as "KIAPS-LETKF". The KIAPS-LETKF system was successfully evaluated with various Observing System Simulation Experiments (OSSEs) using the NCAR Community Atmosphere Model - Spectral Element (Kang et al., 2013), which has fully unstructured quadrilateral meshes based on the cubed-sphere grid, the same grid system as KIM. Recently, assimilation of real observations has been conducted within the KIAPS-LETKF system with four-dimensional covariance functions over a 6-hr assimilation window. Conventional (e.g., sonde, aircraft, and surface) and satellite (e.g., AMSU-A, IASI, GPS-RO, and AMV) observations have been provided by the KIAPS Package for Observation Processing (KPOP). Wind speed prediction benefited most from ingestion of AMV, and the improvement in temperature prediction is mostly due to ingestion of AMSU-A and IASI. However, some degradation is present in the upper stratosphere for GPS-RO, even though GPS-RO has positive impacts on the analysis and forecasts overall. We plan to test the bias correction method and several vertical localization strategies for radiance observations to improve analysis and forecast impacts.

  15. Assimilation of Long-Range Lightning Data over the Pacific

    Science.gov (United States)

    2011-09-30

    convective rainfall analyses over the Pacific, and (iii) to improve marine prediction of cyclogenesis of both tropical and extratropical cyclones through...data over the North Pacific Ocean, refine the relationships between lightning and storm hydrometeor characteristics, and assimilate lightning...unresolved storm-scale areas of deep convection over the data-sparse open oceans. Diabatic heating sources, especially latent heat release in deep

  16. Abscisic acid and assimilate partitioning during seed development

    NARCIS (Netherlands)

    Bruijn, de S.M.

    1993-01-01

    This thesis describes the influence of abscisic acid (ABA) on the transport of assimilates to seeds and the deposition of reserves in seeds. It is well-known from literature that ABA accumulates in seeds during development, and that ABA concentrations in seeds correlate rather well with

  17. A Generic Software Framework for Data Assimilation and Model Calibration

    NARCIS (Netherlands)

    Van Velzen, N.

    2010-01-01

    The accuracy of dynamic simulation models can be increased by using observations in conjunction with a data assimilation or model calibration algorithm. However, implementing such algorithms usually increases the complexity of the model software significantly. By using concepts from object oriented

  18. Educational Attainments of Immigrant Offspring: Success or Segmented Assimilation?

    Science.gov (United States)

    Boyd, Monica

    2002-01-01

    Examined the educational attainments of adult offspring of immigrants age 20-64 years, analyzing data from Canada's 1996 Survey of Labour and Income Dynamics. Contrary to second generation decline and segmented underclass assimilation found in the United States, Canadian adult visible-minority immigrant offspring did not have lower educational…

  19. Global assimilation of X Project Loon stratospheric balloon observations

    Science.gov (United States)

    Coy, L.; Schoeberl, M. R.; Pawson, S.; Candido, S.; Carver, R. W.

    2017-12-01

    Project Loon has the overall goal of providing worldwide internet coverage using a network of long-duration super-pressure balloons. Beginning in 2013, Loon has launched over 1600 balloons from multiple tropical and middle-latitude locations. These GPS-tracked balloon trajectories provide lower stratospheric wind information over the oceans and remote land areas where traditional radiosonde soundings are sparse, thus providing unique coverage of lower stratospheric winds. To fully investigate these Loon winds we: 1) compare the Loon winds to winds produced by a global data assimilation system (DAS: NASA GEOS) and 2) assimilate the Loon winds into the same comprehensive DAS. Results show that in middle latitudes the Loon winds and DAS winds agree well, and assimilating the Loon winds has only a small impact on short-term forecasting of them; in the tropics, however, the Loon winds and DAS winds often disagree substantially (8 m/s or more in magnitude), and in these cases assimilating the Loon winds significantly improves their forecast. By highlighting cases where the Loon and DAS winds differ, these results can lead to improved understanding of stratospheric winds, especially in the tropics.

  20. Lidar data assimilation for improved analyses of volcanic aerosol events

    Science.gov (United States)

    Lange, Anne Caroline; Elbern, Hendrik

    2014-05-01

    Observations of hazardous events that release aerosols are hardly analyzable by today's data assimilation algorithms without producing an attenuating bias. Skillful forecasts of unexpected aerosol events are essential for human health, to prevent exposure of infirm persons, and for aircraft, where the outcome can be catastrophic. Typical cases include mineral dust outbreaks, mostly from large desert regions, wild fires, and sea salt uplifts, while the focus here is on volcanic eruptions. In general, numerical chemistry and aerosol transport models cannot simulate such events without manual adjustments. The concept of data assimilation is able to correct the analysis, as long as it is operationally implemented in the model system. However, the tangent-linear approximation, a substantial precondition for today's cutting-edge data assimilation algorithms, is not valid during unexpected aerosol events. As part of the European COPERNICUS (earth observation) project MACC II and the national ESKP (Earth System Knowledge Platform) initiative, we developed a module that enables the assimilation of aerosol lidar observations even during unforeseeable incidences of extreme emissions of particulate matter. Thereby, the influence of the background information has to be reduced adequately. Advanced lidar instruments address on the one hand the radiative transfer within the atmosphere, and on the other hand they can deliver a detailed quantification of the detected aerosols. For the assimilation of maximally exploited lidar data, an appropriate lidar observation operator is constructed, compatible with the EURAD-IM (European Air Pollution and Dispersion - Inverse Model) system. The observation operator is able to map the modeled chemical and physical state onto lidar attenuated backscatter, transmission, aerosol optical depth, as well as onto the extinction and backscatter coefficients. Further, it has the ability to process the observed discrepancies with lidar

  1. Data Assimilation to Extract Soil Moisture Information from SMAP Observations

    Directory of Open Access Journals (Sweden)

    Jana Kolassa

    2017-11-01

    This study compares different methods to extract soil moisture information through the assimilation of Soil Moisture Active Passive (SMAP) observations. Neural network (NN) and physically-based SMAP soil moisture retrievals were assimilated into the National Aeronautics and Space Administration (NASA) Catchment model over the contiguous United States for April 2015 to March 2017. By construction, the NN retrievals are consistent with the global climatology of the Catchment model soil moisture. Assimilating the NN retrievals without further bias correction improved the surface and root zone correlations against in situ measurements from 14 SMAP core validation sites (CVS) by 0.12 and 0.16, respectively, over the model-only skill, and reduced the surface and root zone unbiased root-mean-square error (ubRMSE) by 0.005 m^3 m^-3 and 0.001 m^3 m^-3, respectively. The assimilation reduced the average absolute surface bias against the CVS measurements by 0.009 m^3 m^-3, but increased the root zone bias by 0.014 m^3 m^-3. Assimilating the NN retrievals after a localized bias correction yielded slightly lower surface correlation and ubRMSE improvements, but generally the skill differences were small. The assimilation of the physically-based SMAP Level-2 passive soil moisture retrievals using a global bias correction yielded similar skill improvements, as did the direct assimilation of locally bias-corrected SMAP brightness temperatures within the SMAP Level-4 soil moisture algorithm. The results show that global bias correction methods may be able to extract more independent information from SMAP observations compared to local bias correction methods, but without accurate quality control and observation error characterization they are also more vulnerable to adverse effects from retrieval errors related to uncertainties in the retrieval inputs and algorithm. Furthermore, the results show that using global bias correction approaches without a
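
    A minimal sketch of the CDF matching step described above (quantile mapping of retrievals into the model climatology); the climatologies here are synthetic stand-ins, and a real system would build the CDFs per land-cover class or soil type, as in the study.

        import numpy as np

        def cdf_match(retrievals, model_clim, obs_clim):
            # Empirical CDF value of each retrieval in the observation climatology,
            # then inversion of the model CDF at those quantiles
            obs_sorted = np.sort(obs_clim)
            model_sorted = np.sort(model_clim)
            q = np.searchsorted(obs_sorted, retrievals) / len(obs_sorted)
            return np.quantile(model_sorted, np.clip(q, 0.0, 1.0))

        rng = np.random.default_rng(1)
        obs = 0.05 + 0.30 * rng.random(1000)   # biased-wet retrieval climatology
        mod = 0.02 + 0.20 * rng.random(1000)   # model soil-moisture climatology
        print(cdf_match(np.array([0.25]), mod, obs))   # rescaled toward the model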

  2. Improving Forecast Skill by Assimilation of AIRS Temperature Soundings

    Science.gov (United States)

    Susskind, Joel; Reale, Oreste

    2010-01-01

    AIRS was launched on EOS Aqua on May 4, 2002, together with AMSU-A and HSB, to form a next-generation polar-orbiting infrared and microwave atmospheric sounding system. The primary products of AIRS/AMSU-A are twice-daily global fields of atmospheric temperature-humidity profiles, ozone profiles, sea/land surface skin temperature, and cloud-related parameters including OLR. The AIRS Version 5 retrieval algorithm is now being used operationally at the Goddard DISC in the routine generation of geophysical parameters derived from AIRS/AMSU data. A major innovation in Version 5 is the ability to generate case-by-case, level-by-level error estimates delta T(p) for retrieved quantities and the use of these error estimates for Quality Control. We conducted a number of data assimilation experiments using the NASA GEOS-5 Data Assimilation System as a step toward finding an optimum balance of spatial coverage and sounding accuracy with regard to improving forecast skill. The model was run at a horizontal resolution of 0.5 deg latitude x 0.67 deg longitude with 72 vertical levels. These experiments were run during four different seasons, each using a different year. The AIRS temperature profiles were presented to the GEOS-5 analysis as rawinsonde profiles, and the profile error estimates delta T(p) were used as the uncertainty for each measurement in the data assimilation process. We compared forecasts generated from analyses that assimilated AIRS temperature profiles with three different sets of Quality Control thresholds: Standard, Medium, and Tight. Assimilation of Quality Controlled AIRS temperature profiles significantly improved 5-7 day forecast skill compared to that obtained without the benefit of AIRS data in all of the cases studied. In addition, assimilation of Quality Controlled AIRS temperature soundings performed better than assimilation of AIRS observed radiances. Based on the experiments shown, Tight Quality Control of AIRS temperature profiles performs best

  3. Snow multivariable data assimilation for hydrological predictions in mountain areas

    Science.gov (United States)

    Piazzi, Gaia; Campo, Lorenzo; Gabellani, Simone; Rudari, Roberto; Castelli, Fabio; Cremonese, Edoardo; Morra di Cella, Umberto; Stevenin, Hervé; Ratto, Sara Maria

    2016-04-01

    The seasonal presence of snow on alpine catchments strongly impacts both the surface energy balance and water resources. Thus, knowledge of the snowpack dynamics is of critical importance for several applications, such as water resource management, flood prediction and hydroelectric power production. Several independent data sources provide information about the snowpack state: ground-based measurements, satellite data and physical models. Although all these data types are reliable, each of them is affected by specific flaws and errors (respectively, dependency on local conditions, sensor biases and limitations, and initialization and poor-quality forcing data). Moreover, there are physical factors that make an exhaustive reconstruction of snow dynamics complicated: snow intermittence in space and time, stratification and slow phenomena like metamorphism processes, uncertainty in snowfall evaluation, wind transportation, etc. Data Assimilation (DA) techniques provide an objective methodology to combine observational and modeled information to obtain the most likely estimate of the snowpack state. Indeed, by combining all the available sources of information, the implementation of DA schemes can quantify and reduce the uncertainties of the estimates. This study presents SMASH (Snow Multidata Assimilation System for Hydrology), a multi-layer snow dynamic model strengthened by a robust multivariable data assimilation algorithm. The model is physically based on mass and energy balances and can be used to reproduce the main physical processes occurring within the snowpack: accumulation, density dynamics, melting, sublimation, radiative balance, and heat and mass exchanges. The model is driven by observed meteorological forcing data (air temperature, wind velocity, relative air humidity, precipitation and incident solar radiation) to provide a complete estimate of the snowpack state. The implementation of an Ensemble Kalman Filter (EnKF) scheme enables the simultaneous assimilation of ground
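
    The EnKF analysis step at the core of such a scheme can be sketched as follows (the stochastic, perturbed-observation variant with a linear observation operator); the snow-state dimensions and error variances below are invented for illustration.

        import numpy as np

        def enkf_update(X, y, H, obs_err_var, rng):
            # X: (n_state, n_ens) forecast ensemble; y: (n_obs,) observations
            n_obs, n_ens = len(y), X.shape[1]
            A = X - X.mean(axis=1, keepdims=True)               # anomalies
            HA = H @ A
            P_yy = HA @ HA.T / (n_ens - 1) + obs_err_var * np.eye(n_obs)
            K = (A @ HA.T / (n_ens - 1)) @ np.linalg.inv(P_yy)  # Kalman gain
            # Perturbed observations keep the analysis spread consistent
            Y = y[:, None] + rng.normal(0.0, np.sqrt(obs_err_var), (n_obs, n_ens))
            return X + K @ (Y - H @ X)

        rng = np.random.default_rng(2)
        X = rng.normal(100.0, 20.0, (3, 30))    # e.g. SWE in three snow layers
        H = np.array([[1.0, 1.0, 1.0]])         # observe total snow water equivalent
        Xa = enkf_update(X, np.array([250.0]), H, obs_err_var=25.0, rng=rng)
        print(Xa.mean(axis=1))                  # analysis mean per layer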

  4. Quantum Monte-Carlo programming for atoms, molecules, clusters, and solids

    International Nuclear Information System (INIS)

    Schattke, Wolfgang; Diez Muino, Ricardo

    2013-01-01

    This book initiates the reader into the basic concepts and practical applications of Quantum Monte Carlo. Because of the simplicity of its theoretical concept, the authors focus on the variational Quantum Monte Carlo scheme. The reader is enabled to proceed from simple examples such as the hydrogen atom to advanced ones such as the lithium solid. In between, several intermediate steps are introduced, including the hydrogen molecule (2 electrons), the lithium atom (3 electrons) and an expansion to an arbitrary number of electrons, to finally treat the three-dimensional periodic array of lithium atoms in a crystal. The book is unique because it provides both theory and numerical programs. It pedagogically explains how to transfer into computational tools what is usually described in a theoretical textbook. It also includes the detailed physical understanding of the methodology that cannot be found in a code manual. The combination of both aspects allows the reader to assimilate the fundamentals of Quantum Monte Carlo not only by reading but also by practice.
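
    In the spirit of the book's variational scheme, a minimal variational Monte Carlo example for the hydrogen atom (trial wavefunction psi = exp(-alpha*r), Metropolis sampling of |psi|^2, atomic units) might look like the following; this is an independent sketch, not code from the book.

        import numpy as np

        def vmc_hydrogen(alpha=0.9, n_steps=200_000, step=0.5, seed=0):
            # Local energy for psi = exp(-alpha*r): E_L = -alpha^2/2 + (alpha - 1)/r
            rng = np.random.default_rng(seed)
            pos = np.array([1.0, 0.0, 0.0])
            r = np.linalg.norm(pos)
            energies = []
            for _ in range(n_steps):
                trial = pos + rng.uniform(-step, step, 3)
                r_t = np.linalg.norm(trial)
                # Metropolis acceptance with probability |psi(trial)/psi(pos)|^2
                if rng.random() < np.exp(-2.0 * alpha * (r_t - r)):
                    pos, r = trial, r_t
                energies.append(-0.5 * alpha ** 2 + (alpha - 1.0) / r)
            return np.mean(energies)

        print(vmc_hydrogen())   # close to the exact -0.5 Hartree near alpha = 1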

  5. On Locally Most Powerful Sequential Rank Tests

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan

    2017-01-01

    Roč. 36, č. 1 (2017), s. 111-125 ISSN 0747-4946 R&D Projects: GA ČR GA17-07384S Grant - others: Nadační fond na podporu vědy (CZ) Neuron Institutional support: RVO:67985556 Keywords: nonparametric tests * sequential ranks * stopping variable Subject RIV: BA - General Mathematics OBOR OECD: Pure mathematics Impact factor: 0.339, year: 2016 http://library.utia.cas.cz/separaty/2017/SI/kalina-0474065.pdf

  6. Decoding restricted participation in sequential electricity markets

    Energy Technology Data Exchange (ETDEWEB)

    Knaut, Andreas; Paschmann, Martin

    2017-06-15

    Restricted participation in sequential markets may cause high price volatility and welfare losses. In this paper we therefore analyze the drivers of restricted participation in the German intraday auction, a short-term electricity market with quarter-hourly products. Applying a fundamental electricity market model with 15-minute temporal resolution, we identify the lack of sub-hourly market coupling as the most relevant driver of restricted participation. We derive a proxy for price volatility and find that full market coupling may reduce quarter-hourly price volatility by a factor of close to four.

  7. THE DEVELOPMENT OF SPECIAL SEQUENTIALLY-TIMED

    Directory of Open Access Journals (Sweden)

    Stanislav LICHOROBIEC

    2016-06-01

    This article documents the development of the noninvasive use of explosives for the destruction of ice masses in river flows. The system of special sequentially-timed charges increases the efficiency of the cutting charges by covering them with bags filled with water, while simultaneously increasing the effect of the entire system of timed charges. Timing, spatial combinations during placement, and the linking of these charges result in the loosening of ice barriers on a frozen waterway, while at the same time regulating the size of the ice fragments. The developed charges will increase the operability and safety of IRS units.

  8. Pass-transistor asynchronous sequential circuits

    Science.gov (United States)

    Whitaker, Sterling R.; Maki, Gary K.

    1989-01-01

    Design methods for asynchronous sequential pass-transistor circuits, which result in circuits that are hazard- and critical-race-free and which have added degrees of freedom for the input signals, are discussed. The design procedures are straightforward and easy to implement. Two single-transition-time state assignment methods are presented, and hardware bounds for each are established. A surprising result is that the hardware realizations for each next-state variable and output variable are identical for a given flow table. Thus, a state machine with N states and M outputs can be constructed using a single layout replicated N + M times.

  9. Estimation After a Group Sequential Trial.

    Science.gov (United States)

    Milanzi, Elasma; Molenberghs, Geert; Alonso, Ariel; Kenward, Michael G; Tsiatis, Anastasios A; Davidian, Marie; Verbeke, Geert

    2015-10-01

    Group sequential trials are one important instance of studies for which the sample size is not fixed a priori but rather takes one of a finite set of pre-specified values, dependent on the observed data. Much work has been devoted to the inferential consequences of this design feature. Molenberghs et al (2012) and Milanzi et al (2012) reviewed and extended the existing literature, focusing on a collection of seemingly disparate, but related, settings, namely completely random sample sizes, group sequential studies with deterministic and random stopping rules, incomplete data, and random cluster sizes. They showed that the ordinary sample average is a viable option for estimation following a group sequential trial, for a wide class of stopping rules and for random outcomes with a distribution in the exponential family. Their results are somewhat surprising in the sense that the sample average is not optimal, and further, there does not exist an optimal, or even unbiased, linear estimator. However, the sample average is asymptotically unbiased, both conditionally upon the observed sample size and marginalized over it. By exploiting ignorability they showed that the sample average is the conventional maximum likelihood estimator. They also showed that a conditional maximum likelihood estimator is finite-sample unbiased, but is less efficient than the sample average and has a larger mean squared error. Asymptotically, the sample average and the conditional maximum likelihood estimator are equivalent. This previous work is restricted, however, to the situation in which the random sample size can take only two values, N = n or N = 2n. In this paper, we consider the more practically useful setting of sample sizes in the finite set {n1, n2, …, nL}. It is shown that the sample average is then a justifiable estimator, in the sense that it follows from joint likelihood estimation, and it is consistent and asymptotically unbiased. We also show why

  10. A sequential/parallel track selector

    CERN Document Server

    Bertolino, F; Bressani, Tullio; Chiavassa, E; Costa, S; Dellacasa, G; Gallio, M; Musso, A

    1980-01-01

    A medium-speed (approximately 1 μs) hardware pre-analyzer for the selection of events detected in four planes of drift chambers in the magnetic field of the Omicron Spectrometer at the CERN SC is described. Specific geometrical criteria determine patterns of hits in the four planes of vertical wires that have to be recognized and that are stored as patterns of '1's in random access memories. Pairs of good hits are found sequentially, then the RAMs are used as look-up tables. (6 refs).

  11. Boundary conditions in random sequential adsorption

    Science.gov (United States)

    Cieśla, Michał; Ziff, Robert M.

    2018-04-01

    The influence of different boundary conditions on the density of random packings of disks is studied. Packings are generated using the random sequential adsorption algorithm with three different types of boundary conditions: periodic, open, and wall. It is found that the finite size effects are smallest for periodic boundary conditions, as expected. On the other hand, in the case of open and wall boundaries it is possible to introduce an effective packing size and a constant correction term to significantly improve the packing densities.
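
    A compact sketch of RSA of equal disks with periodic boundary conditions (minimum-image overlap test); terminating after a fixed number of consecutive failed attempts is a simplification of a careful jamming criterion.

        import numpy as np

        def rsa_disks(radius, box=1.0, max_failures=10_000, seed=0):
            rng = np.random.default_rng(seed)
            centers, failures = [], 0
            while failures < max_failures:
                p = rng.random(2) * box
                d = np.abs(np.asarray(centers) - p) if centers else np.empty((0, 2))
                d = np.minimum(d, box - d)          # minimum-image convention
                if centers and np.any(np.hypot(d[:, 0], d[:, 1]) < 2 * radius):
                    failures += 1                   # overlap: attempt rejected
                else:
                    centers.append(p)
                    failures = 0
            return np.asarray(centers)

        c = rsa_disks(radius=0.03)
        print(len(c), "disks, packing fraction",
              round(len(c) * np.pi * 0.03 ** 2, 3))   # approaches ~0.547 at jamming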

  12. Automatic synthesis of sequential control schemes

    International Nuclear Information System (INIS)

    Klein, I.

    1993-01-01

    Of all hard- and software developed for industrial control purposes, the majority is devoted to sequential, or binary-valued, control and only a minor part to classical linear control. Typically, the sequential parts of the controller are invoked during startup and shut-down to bring the system into its normal operating region and into some safe standby region, respectively. Despite its importance, fairly little theoretical research has been devoted to this area, and sequential control programs are therefore still created manually without much theoretical support for a systematic approach. We propose a method to create sequential control programs automatically. The main idea is to spend some effort off-line modelling the plant, and from this model generate the control strategy, that is, the plan. The plant is modelled using action structures, thereby concentrating on the actions instead of the states of the plant. In general the planning problem shows exponential complexity in the number of state variables. However, by focusing on the actions, we can identify problem classes as well as algorithms such that the planning complexity is reduced to polynomial complexity. We prove that these algorithms are sound, i.e., the generated solution will solve the stated problem, and complete, i.e., if the algorithms fail, then no solution exists. The algorithms generate a plan as a set of actions and a partial order on this set specifying the execution order. The generated plan is proven to be minimal and maximally parallel. For a larger class of problems we propose a method to split the original problem into a number of simple problems that can each be solved using one of the presented algorithms. It is also shown how a plan can be translated into a GRAFCET chart, and to illustrate these ideas we have implemented a planning tool, i.e., a system that is able to automatically create control schemes. Such a tool can of course also be used on-line if it is fast enough. This

  13. From sequential to parallel programming with patterns

    CERN Document Server

    CERN. Geneva

    2018-01-01

    To increase both performance and efficiency, our programming models need to adapt to better exploit modern processors. The classic idioms and patterns for programming, such as loops, branches or recursion, are the pillars of almost every code and are well known among all programmers. These patterns all have in common that they are sequential in nature. Embracing parallel programming patterns, which allow us to program for multi- and many-core hardware in a natural way, greatly simplifies the task of designing a program that scales and performs on modern hardware, independently of the programming language used, and in a generic way.

  14. Sequential extraction of uranium metal contamination

    International Nuclear Information System (INIS)

    Murry, M.M.; Spitz, H.B.; Connick, W.B.

    2016-01-01

    Samples of uranium-contaminated dirt collected from the dirt floor of an abandoned metal rolling mill were analyzed for uranium using a sequential extraction protocol involving a series of five increasingly aggressive solvents. The quantity of uranium extracted from the contaminated dirt by each reagent can aid in predicting the fate and transport of the uranium contamination in the environment. Uranium was separated from each fraction using anion exchange and electrodeposition, and analyzed by alpha spectroscopy. Results demonstrate that approximately 77% of the uranium was extracted using NH4Ac in 25% acetic acid. (author)

  15. Simultaneous optimization of sequential IMRT plans

    International Nuclear Information System (INIS)

    Popple, Richard A.; Prellop, Perri B.; Spencer, Sharon A.; Santos, Jennifer F. de los; Duan, Jun; Fiveash, John B.; Brezovich, Ivan A.

    2005-01-01

    Radiotherapy often comprises two phases, in which irradiation of a volume at risk for microscopic disease is followed by a sequential dose escalation to a smaller volume either at a higher risk for microscopic disease or containing only gross disease. This technique is difficult to implement with intensity modulated radiotherapy, as the tolerance doses of critical structures must be respected over the sum of the two plans. Techniques that include an integrated boost have been proposed to address this problem. However, clinical experience with such techniques is limited, and many clinicians are uncomfortable prescribing nonconventional fractionation schemes. To solve this problem, we developed an optimization technique that simultaneously generates sequential initial and boost IMRT plans. We have developed an optimization tool that uses a commercial treatment planning system (TPS) and a high-level programming language for technical computing. The tool uses the TPS to calculate the dose deposition coefficients (DDCs) for optimization. The DDCs were imported into external software and the treatment ports duplicated to create the boost plan. The initial, boost, and tolerance doses were specified and used to construct cost functions. The initial and boost plans were optimized simultaneously using a gradient search technique. Following optimization, the fluence maps were exported to the TPS for dose calculation. Seven patients treated using sequential techniques were selected from our clinical database. The initial and boost plans used to treat these patients were developed independently of each other by dividing the tolerance doses proportionally between the initial and boost plans and then iteratively optimizing the plans until a summation that met the treatment goals was obtained. We used the simultaneous optimization technique to generate plans that met the original planning goals. The coverage of the initial and boost target volumes in the simultaneously optimized
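
    A schematic of the simultaneous gradient search on both fluence vectors, with the organ-at-risk tolerance applied to the summed dose, might look like this; the dose deposition matrix, voxel sets, prescriptions and step size are all hypothetical, and the penalties are simplified quadratics rather than the authors' cost functions.

        import numpy as np

        rng = np.random.default_rng(3)
        D = rng.random((40, 20))          # hypothetical DDCs: 40 voxels, 20 beamlets
        target = np.arange(0, 25)         # initial (elective) volume voxels
        boost = np.arange(0, 10)          # boost volume (subset of the target)
        oar = np.arange(25, 40)           # organ-at-risk voxels
        d_init, d_boost, d_tol = 50.0, 20.0, 45.0   # prescriptions / tolerance (Gy)

        def cost_and_grads(x1, x2):
            e1 = D[target] @ x1 - d_init
            e2 = D[boost] @ x2 - d_boost
            e3 = np.maximum(D[oar] @ (x1 + x2) - d_tol, 0.0)   # one-sided overdose
            g1 = 2 * D[target].T @ e1 + 2 * D[oar].T @ e3
            g2 = 2 * D[boost].T @ e2 + 2 * D[oar].T @ e3
            return e1 @ e1 + e2 @ e2 + e3 @ e3, g1, g2

        x1, x2 = np.zeros(20), np.zeros(20)
        for _ in range(2000):             # plain gradient search, both plans at once
            c, g1, g2 = cost_and_grads(x1, x2)
            x1 = np.maximum(x1 - 1e-4 * g1, 0.0)   # fluences stay non-negative
            x2 = np.maximum(x2 - 1e-4 * g2, 0.0)
        print(round(c, 1))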

  16. Assimilation of diazotrophic nitrogen into pelagic food webs.

    Directory of Open Access Journals (Sweden)

    Ryan J Woodland

    The fate of diazotrophic nitrogen (N(D)) fixed by planktonic cyanobacteria in pelagic food webs remains unresolved, particularly for toxic cyanophytes that are selectively avoided by most herbivorous zooplankton. Current theory suggests that N(D) fixed during cyanobacterial blooms can enter planktonic food webs contemporaneously with peak bloom biomass via direct grazing of zooplankton on cyanobacteria or via the uptake of bioavailable N(D) (exuded from viable cyanobacterial cells) by palatable phytoplankton or microbial consortia. Alternatively, N(D) can enter planktonic food webs post-bloom following the remineralization of bloom detritus. Although the relative contribution of these processes to planktonic nutrient cycles is unknown, we hypothesized that assimilation of bioavailable N(D) (e.g., nitrate, ammonium) by palatable phytoplankton and subsequent grazing by zooplankton (either during or after the cyanobacterial bloom) would be the primary pathway by which N(D) was incorporated into the planktonic food web. Instead, in situ stable isotope measurements and grazing experiments clearly documented that the assimilation of N(D) by zooplankton outpaced assimilation by palatable phytoplankton during a bloom of toxic Nodularia spumigena Mertens. We identified two distinct temporal phases in the trophic transfer of N(D) from N. spumigena to the plankton community. The first phase was a highly dynamic transfer of N(D) to zooplankton with rates that covaried with bloom biomass while bypassing other phytoplankton taxa; a trophic transfer that we infer was routed through bloom-associated bacteria. The second phase was a slowly accelerating assimilation of the dissolved-N(D) pool by phytoplankton that was decoupled from contemporaneous variability in N. spumigena concentrations. These findings provide empirical evidence that N(D) can be assimilated and transferred rapidly throughout natural plankton communities and yield insights into the specific processes

  17. Assimilation of SMOS Retrievals in the Land Information System

    Science.gov (United States)

    Blankenship, Clay B.; Case, Jonathan L.; Zavodsky, Bradley T.; Crosson, William L.

    2016-01-01

    The Soil Moisture and Ocean Salinity (SMOS) satellite provides retrievals of soil moisture in the upper 5 cm with a 30-50 km resolution and a mission accuracy requirement of 0.04 cm^3 cm^-3. These observations can be used to improve land surface model soil moisture states through data assimilation. In this paper, SMOS soil moisture retrievals are assimilated into the Noah land surface model via an Ensemble Kalman Filter within the NASA Land Information System. Bias correction is implemented using Cumulative Distribution Function (CDF) matching, with points aggregated by either land cover or soil type to reduce sampling error in generating the CDFs. An experiment was run for the warm season of 2011 to test SMOS data assimilation and to compare assimilation methods. Verification of soil moisture analyses in the 0-10 cm upper layer and root zone (0-1 m) was conducted using in situ measurements from several observing networks in the central and southeastern United States. This experiment showed that SMOS data assimilation significantly increased the anomaly correlation of Noah soil moisture with station measurements from 0.45 to 0.57 in the 0-10 cm layer. Time series at specific stations demonstrate the ability of SMOS DA to increase the dynamic range of soil moisture in a manner consistent with station measurements. Among the bias correction methods, the correction based on soil type performed best at bias reduction but also reduced correlations. The vegetation-based correction did not produce any significant differences compared to using a simple uniform correction curve.

  18. Assimilation of diazotrophic nitrogen into pelagic food webs.

    Science.gov (United States)

    Woodland, Ryan J; Holland, Daryl P; Beardall, John; Smith, Jonathan; Scicluna, Todd; Cook, Perran L M

    2013-01-01

    The fate of diazotrophic nitrogen (N(D)) fixed by planktonic cyanobacteria in pelagic food webs remains unresolved, particularly for toxic cyanophytes that are selectively avoided by most herbivorous zooplankton. Current theory suggests that N(D) fixed during cyanobacterial blooms can enter planktonic food webs contemporaneously with peak bloom biomass via direct grazing of zooplankton on cyanobacteria or via the uptake of bioavailable N(D) (exuded from viable cyanobacterial cells) by palatable phytoplankton or microbial consortia. Alternatively, N(D) can enter planktonic food webs post-bloom following the remineralization of bloom detritus. Although the relative contribution of these processes to planktonic nutrient cycles is unknown, we hypothesized that assimilation of bioavailable N(D) (e.g., nitrate, ammonium) by palatable phytoplankton and subsequent grazing by zooplankton (either during or after the cyanobacterial bloom) would be the primary pathway by which N(D) was incorporated into the planktonic food web. Instead, in situ stable isotope measurements and grazing experiments clearly documented that the assimilation of N(D) by zooplankton outpaced assimilation by palatable phytoplankton during a bloom of toxic Nodularia spumigena Mertens. We identified two distinct temporal phases in the trophic transfer of N(D) from N. spumigena to the plankton community. The first phase was a highly dynamic transfer of N(D) to zooplankton with rates that covaried with bloom biomass while bypassing other phytoplankton taxa; a trophic transfer that we infer was routed through bloom-associated bacteria. The second phase was a slowly accelerating assimilation of the dissolved-N(D) pool by phytoplankton that was decoupled from contemporaneous variability in N. spumigena concentrations. These findings provide empirical evidence that N(D) can be assimilated and transferred rapidly throughout natural plankton communities and yield insights into the specific processes underlying

  19. A virtual reality catchment for data assimilation experiments

    Science.gov (United States)

    Schalge, Bernd; Rihani, Jehan; Haese, Barbara; Baroni, Gabriele; Erdal, Daniel; Neuweiler, Insa; Hendricks-Franssen, Harrie-Jan; Geppert, Gernot; Ament, Felix; Kollet, Stefan; Cirpka, Olaf; Saavedra, Pablo; Han, Xujun; Attinger, Sabine; Kunstmann, Harald; Vereecken, Harry; Simmer, Clemens

    2016-04-01

    Current data assimilation (DA) systems often lack the possibility to assimilate measurements across compartments to accurately estimate states and fluxes in subsurface-land surface-atmosphere systems (SLAS). In order to develop a new DA framework able to realize this cross-compartmental assimilation, a comprehensive testing environment is needed. Therefore a virtual reality (VR) catchment is constructed with the Terrestrial System Modeling Platform (TerrSysMP). This catchment mimics the Neckar catchment in Germany. TerrSysMP employs the atmospheric model COSMO, the land surface model CLM and the hydrological model ParFlow, coupled with the external coupler OASIS. We will show statistical tests to prove the plausibility of the VR. The VR is run in a fully-coupled mode (subsurface - land surface - atmosphere) which includes the interactions of subsurface dynamics with the atmosphere, such as the effects of soil moisture on near-surface temperatures, convection patterns and the surface heat fluxes. A reference high-resolution run serves as the "truth" from which virtual observations are extracted with observation operators such as virtual rain gauges, synoptic stations and satellite observations (amongst others). This effectively solves the data scarcity issues otherwise often encountered in DA. Furthermore, an ensemble of model runs at a reduced resolution is performed. This ensemble also serves for open-loop runs to be compared with data assimilation experiments. Runs with this ensemble served to identify sets of parameters that are especially sensitive to changes and have the largest impact on the system. These parameters were the focus of subsequent ensemble simulations and DA experiments. We will show to what extent the VR states can be reconstructed using data assimilation methods with only a limited number of virtual observations available.

  20. Development of KIAPS Observation Processing Package for Data Assimilation System

    Science.gov (United States)

    Kang, Jeon-Ho; Chun, Hyoung-Wook; Lee, Sihye; Han, Hyun-Jun; Ha, Su-Jin

    2015-04-01

    The Korea Institute of Atmospheric Prediction Systems (KIAPS) was founded in 2011 by the Korea Meteorological Administration (KMA) to develop Korea's own global Numerical Weather Prediction (NWP) system as a nine-year (2011-2019) project. The data assimilation team at KIAPS has been developing the observation processing system (KIAPS Package for Observation Processing: KPOP) to provide optimal observations to the data assimilation system for the KIAPS global model (KIAPS Integrated Model - Spectral Element method based on HOMME: KIM-SH). Currently, KPOP is capable of processing satellite radiance data (AMSU-A, IASI), GPS Radio Occultation (GPS-RO), aircraft (AMDAR, AIREP, etc.) and synoptic observations (SONDE and SURFACE). KPOP adopted Radiative Transfer for TOVS version 10 (RTTOV_v10) to obtain the brightness temperature (TB) for each channel at the top of the atmosphere (TOA), and the Radio Occultation Processing Package (ROPP) 1-dimensional forward module to obtain the bending angle (BA) at each tangent point. The observation data are obtained from the KMA, composited in BUFR format and converted to ODB, as used for operational data assimilation and monitoring at the KMA. The Unified Model (UM), Community Atmosphere - Spectral Element (CAM-SE) and KIM-SH model outputs are used for the bias correction (BC) and quality control (QC) of the observations, respectively. KPOP provides radiance and RO data for the Local Ensemble Transform Kalman Filter (LETKF) and also provides SONDE, SURFACE and AIRCRAFT data for Three-Dimensional Variational Assimilation (3DVAR). We expect that all of the observation types processed in KPOP will soon be usable with both data assimilation methods. Preliminary results from each observation type will be introduced along with the current development status of KPOP.

  1. General Monte Carlo code MONK

    International Nuclear Information System (INIS)

    Moore, J.G.

    1974-01-01

    The Monte Carlo code MONK is a general program written to provide a high degree of flexibility to the user. MONK is distinguished by its detailed representation of nuclear data in point form, i.e., the cross-section is tabulated at specific energies instead of the more usual group representation. The nuclear data are unadjusted in the point form, but recently the code has been modified to accept adjusted group data as used in fast and thermal reactor applications. The various geometrical handling capabilities and importance sampling techniques are described. In addition to the nuclear data aspects, the following features are also described: geometrical handling routines, tracking cycles, neutron source and output facilities. 12 references. (U.S.)

  2. Monte Carlo lattice program KIM

    International Nuclear Information System (INIS)

    Cupini, E.; De Matteis, A.; Simonini, R.

    1980-01-01

    The Monte Carlo program KIM solves the steady-state linear neutron transport equation for a fixed-source problem or, by successive fixed-source runs, for the eigenvalue problem, in a two-dimensional thermal reactor lattice. Fluxes and reaction rates are the main quantities computed by the program, from which power distribution and few-group averaged cross sections are derived. The simulation ranges from 10 MeV to zero and includes anisotropic and inelastic scattering in the fast energy region, the epithermal Doppler broadening of the resonances of some nuclides, and the thermalization phenomenon by taking into account the thermal velocity distribution of some molecules. Besides the well known combinatorial geometry, the program allows complex configurations to be represented by a discrete set of points, an approach greatly improving calculation speed

  3. A problem-solving environment for data assimilation in air quality modelling

    NARCIS (Netherlands)

    Velzen, N. van; Segers, A.J.

    2010-01-01

    A generic toolbox for data assimilation called COSTA (COmmon Set of Tools for the Assimilation of data) makes it possible to simplify the application of data assimilation to models and to try out various methods for a particular model. Concepts of object oriented programming are used to define

  4. Advanced Computational Methods for Monte Carlo Calculations

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Forrest B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2018-01-12

    This course is intended for graduate students who already have a basic understanding of Monte Carlo methods. It focuses on advanced topics that may be needed for thesis research, for developing new state-of-the-art methods, or for working with modern production Monte Carlo codes.

  5. Nested Sampling with Constrained Hamiltonian Monte Carlo

    OpenAIRE

    Betancourt, M. J.

    2010-01-01

    Nested sampling is a powerful approach to Bayesian inference ultimately limited by the computationally demanding task of sampling from a heavily constrained probability distribution. An effective algorithm in its own right, Hamiltonian Monte Carlo is readily adapted to efficiently sample from any smooth, constrained distribution. Utilizing this constrained Hamiltonian Monte Carlo, I introduce a general implementation of the nested sampling algorithm.

  6. Inverse problems with non-trivial priors: efficient solution through sequential Gibbs sampling

    DEFF Research Database (Denmark)

    Hansen, Thomas Mejer; Cordua, Knud Skou; Mosegaard, Klaus

    2012-01-01

    Markov chain Monte Carlo methods such as the Gibbs sampler and the Metropolis algorithm can be used to sample solutions to non-linear inverse problems. In principle, these methods allow incorporation of prior information of arbitrary complexity. If an analytical closed-form description of the prior is available, which is the case when the prior can be described by a multidimensional Gaussian distribution, such prior information can easily be considered. In reality, prior information is often more complex than can be described by the Gaussian model, and no closed-form expression of the prior can be given. We propose an algorithm, called sequential Gibbs sampling, allowing the Metropolis algorithm to efficiently incorporate complex priors into the solution of an inverse problem, also for the case where no closed-form description of the prior exists. First, we lay out the theoretical background
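
    The key mechanism, proposing a new model by conditionally resimulating a random subset of parameters from the prior sampler so that the prior ratio cancels and only the likelihood ratio enters the Metropolis acceptance, can be sketched as follows; the independent Gaussian "prior sampler" and linear forward model are toy stand-ins for a geostatistical simulation algorithm.

        import numpy as np

        rng = np.random.default_rng(4)

        def conditional_resim(m, idx):
            # Stand-in for one sequential-simulation step: redraw m[idx] from the
            # prior conditional on the rest. With the toy independent N(0,1) prior
            # the conditional is again N(0,1); in practice this call would be a
            # geostatistical sampler needing no closed-form prior density.
            m = m.copy()
            m[idx] = rng.standard_normal(len(idx))
            return m

        def log_likelihood(m, d_obs, G, sigma):
            r = G @ m - d_obs
            return -0.5 * (r @ r) / sigma ** 2

        G = rng.standard_normal((30, 100))            # toy linear forward model
        d_obs = G @ rng.standard_normal(100) + 0.1 * rng.standard_normal(30)

        m = rng.standard_normal(100)
        ll = log_likelihood(m, d_obs, G, 0.1)
        for _ in range(5000):
            idx = rng.choice(100, size=10, replace=False)   # resimulation subset
            prop = conditional_resim(m, idx)
            ll_prop = log_likelihood(prop, d_obs, G, 0.1)
            # Prior ratio cancels (the proposal is a conditional prior draw),
            # so acceptance uses the likelihood ratio alone
            if np.log(rng.random()) < ll_prop - ll:
                m, ll = prop, ll_prop
        print(round(ll, 1))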

  7. Monte Carlo Treatment Planning for Advanced Radiotherapy

    DEFF Research Database (Denmark)

    Cronholm, Rickard

    This Ph.D. project describes the development of a workflow for Monte Carlo Treatment Planning for clinical radiotherapy plans. The workflow may be utilized to perform an independent dose verification of treatment plans. Modern radiotherapy treatment delivery is often conducted by dynamically modulating the intensity of the field during the irradiation. The workflow described has the potential to fully model the dynamic delivery, including gantry rotation during irradiation, of modern radiotherapy. Three cornerstones of Monte Carlo Treatment Planning are identified: building, commissioning and validation of a Monte Carlo model of a medical linear accelerator (i), converting a CT scan of a patient to a Monte Carlo compliant phantom (ii) and translating the treatment plan parameters (including beam energy, angles of incidence, collimator settings etc.) to a Monte Carlo input file (iii). A protocol

  8. The MC21 Monte Carlo Transport Code

    International Nuclear Information System (INIS)

    Sutton TM; Donovan TJ; Trumbull TH; Dobreff PS; Caro E; Griesheimer DP; Tyburski LJ; Carpenter DC; Joo H

    2007-01-01

    MC21 is a new Monte Carlo neutron and photon transport code currently under joint development at the Knolls Atomic Power Laboratory and the Bettis Atomic Power Laboratory. MC21 is the Monte Carlo transport kernel of the broader Common Monte Carlo Design Tool (CMCDT), which is also currently under development. The vision for CMCDT is to provide an automated, computer-aided modeling and post-processing environment integrated with a Monte Carlo solver that is optimized for reactor analysis. CMCDT represents a strategy to push the Monte Carlo method beyond its traditional role as a benchmarking tool or 'tool of last resort' and into a dominant design role. This paper describes various aspects of the code, including the neutron physics and nuclear data treatments, the geometry representation, and the tally and depletion capabilities

  9. Monte Carlo simulation in nuclear medicine

    International Nuclear Information System (INIS)

    Morel, Ch.

    2007-01-01

    The Monte Carlo method allows for simulating random processes by using series of pseudo-random numbers. It became an important tool in nuclear medicine to assist in the design of new medical imaging devices, optimise their use and analyse their data. Presently, the sophistication of the simulation tools allows the introduction of Monte Carlo predictions into data correction and image reconstruction processes. The ability to simulate time-dependent processes opens up new horizons for Monte Carlo simulation in nuclear medicine. In the near future, these developments will allow imaging and dosimetry issues to be tackled simultaneously, and Monte Carlo simulations may soon become part of the nuclear medicine diagnostic process. This paper describes some Monte Carlo method basics and the sampling methods that were developed for it. It gives a referenced list of different simulation software used in nuclear medicine and enumerates some of their present and prospective applications. (author)

  10. Application of Observed Precipitation in NCEP Global and Regional Data Assimilation Systems, Including Reanalysis and Land Data Assimilation

    Science.gov (United States)

    Mitchell, K. E.

    2006-12-01

    The Environmental Modeling Center (EMC) of the National Centers for Environmental Prediction (NCEP) applies several different analyses of observed precipitation in both the data assimilation and validation components of NCEP's global and regional numerical weather and climate prediction/analysis systems (including in NCEP global and regional reanalysis). This invited talk will survey these data assimilation and validation applications and methodologies, as well as the temporal frequency, spatial domains, spatial resolution, data sources, data density and data quality control in the precipitation analyses that are applied. Some of the precipitation analyses applied by EMC are produced by NCEP's Climate Prediction Center (CPC), while others are produced by the River Forecast Centers (RFCs) of the National Weather Service (NWS), or by automated algorithms of the NWS WSR-88D Radar Product Generator (RPG). Depending on the specific type of application in data assimilation or model forecast validation, the temporal resolution of the precipitation analyses may be hourly, daily, or pentad (5-day) and the domain may be global, continental U.S. (CONUS), or Mexico. The data sources for precipitation include ground-based gauge observations, radar-based estimates, and satellite-based estimates. The precipitation analyses over the CONUS are analyses of either hourly, daily or monthly totals of precipitation, and they are of two distinct types: gauge-only or primarily radar-estimated. The gauge-only CONUS analysis of daily precipitation utilizes an orographic-adjustment technique (based on the well-known PRISM precipitation climatology of Oregon State University) developed by the NWS Office of Hydrologic Development (OHD). The primary NCEP global precipitation analysis is the pentad CPC Merged Analysis of Precipitation (CMAP), which blends both gauge observations and satellite estimates. The presentation will include a brief comparison between the CMAP analysis and other global

  11. Time scale of random sequential adsorption.

    Science.gov (United States)

    Erban, Radek; Chapman, S Jonathan

    2007-04-01

    A simple multiscale approach to the diffusion-driven adsorption from a solution to a solid surface is presented. The model combines two important features of the adsorption process: (i) the kinetics of the chemical reaction between adsorbing molecules and the surface and (ii) geometrical constraints on the surface made by molecules which are already adsorbed. Process (i) is modeled in a diffusion-driven context, i.e., the conditional probability of adsorbing a molecule, provided that the molecule hits the surface, is related to the macroscopic surface reaction rate. The geometrical constraint (ii) is modeled using random sequential adsorption (RSA), which is the sequential addition of molecules at random positions on a surface; one attachment attempt is made per RSA simulation time step. By coupling RSA with the diffusion of molecules in the solution above the surface, the RSA simulation time step is related to real physical time. The method is illustrated on a model of chemisorption of reactive polymers to a virus surface.
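
    The RSA step described above is easy to state in code. The following is a minimal sketch of the geometric constraint (ii) only, assuming hard disks on a unit square with one attachment attempt per time step; it omits the diffusion coupling that maps simulation steps to physical time, and all names are illustrative:

        import math
        import random

        def rsa_disks(radius, n_steps, seed=0):
            """Random sequential adsorption of hard disks on the unit square:
            one attachment attempt per RSA simulation time step."""
            rng = random.Random(seed)
            centers = []
            for _ in range(n_steps):
                x, y = rng.random(), rng.random()  # candidate landing position
                # reject the attempt if the new disk overlaps an adsorbed one
                if all(math.hypot(x - cx, y - cy) >= 2 * radius
                       for cx, cy in centers):
                    centers.append((x, y))
            coverage = len(centers) * math.pi * radius ** 2
            return centers, coverage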

  12. Hybrid Computerized Adaptive Testing: From Group Sequential Design to Fully Sequential Design

    Science.gov (United States)

    Wang, Shiyu; Lin, Haiyan; Chang, Hua-Hua; Douglas, Jeff

    2016-01-01

    Computerized adaptive testing (CAT) and multistage testing (MST) have become two of the most popular modes in large-scale computer-based sequential testing. Though most designs of CAT and MST exhibit strength and weakness in recent large-scale implementations, there is no simple answer to the question of which design is better because different…

  13. Sequential and simultaneous choices: testing the diet selection and sequential choice models.

    Science.gov (United States)

    Freidin, Esteban; Aw, Justine; Kacelnik, Alex

    2009-03-01

    We investigate simultaneous and sequential choices in starlings, using Charnov's Diet Choice Model (DCM) and Shapiro, Siller and Kacelnik's Sequential Choice Model (SCM) to integrate function and mechanism. During a training phase, starlings encountered one food-related option per trial (A, B or R) in random sequence and with equal probability. A and B delivered food rewards after programmed delays (shorter for A), while R ('rejection') moved directly to the next trial without reward. In this phase we measured latencies to respond. In a later choice phase, birds encountered the pairs A-B, A-R and B-R, the first implementing a simultaneous choice and the latter two sequential choices. The DCM predicts when R should be chosen to maximize intake rate, and the SCM uses latencies from the training phase to predict choices between any pair of options in the choice phase. The predictions of both models coincided, and both successfully predicted the birds' preferences. The DCM does not deal with partial preferences, while the SCM does, and experimental results were strongly correlated with this model's predictions. We believe that the SCM may expose a very general mechanism of animal choice, and that its wider domain of success reflects the greater ecological significance of sequential over simultaneous choices.

  14. Spatial updating grand canonical Monte Carlo algorithms for fluid simulation: generalization to continuous potentials and parallel implementation.

    Science.gov (United States)

    O'Keeffe, C J; Ren, Ruichao; Orkoulas, G

    2007-11-21

    Spatial updating grand canonical Monte Carlo algorithms are generalizations of random and sequential updating algorithms for lattice systems to continuum fluid models. The elementary steps, insertions or removals, are constructed by generating points in space either at random (random updating) or in a prescribed order (sequential updating). These algorithms have previously been developed only for systems of impenetrable spheres for which no particle overlap occurs. In this work, spatial updating grand canonical algorithms are generalized to continuous, soft-core potentials to account for overlapping configurations. Results on two- and three-dimensional Lennard-Jones fluids indicate that spatial updating grand canonical algorithms, both random and sequential, converge faster than standard grand canonical algorithms. Spatial algorithms based on sequential updating not only exhibit the fastest convergence but also are ideal for parallel implementation due to the absence of strict detailed balance and the nature of the updating that minimizes interprocessor communication. Parallel simulation results for three-dimensional Lennard-Jones fluids show a substantial reduction of simulation time for systems of moderate and large size. The efficiency improvement by parallel processing through domain decomposition is always in addition to the efficiency improvement by sequential updating.
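
    For orientation, the elementary insertion/removal moves share the standard grand canonical Metropolis acceptance ratios; the spatial-updating algorithms change how trial points are generated (at random versus in a prescribed order), not the ratios themselves. The following is a hedged sketch with an assumed pair-energy helper, not the authors' implementation:

        import math

        def gcmc_move(positions, beta, mu, volume, energy_of, rng):
            """One grand canonical insertion/removal attempt for a soft-core fluid.
            energy_of(pos, others) is an assumed helper returning the interaction
            energy of a particle at pos with all other particles."""
            n = len(positions)
            z = math.exp(beta * mu)  # activity (thermal wavelength set to 1)
            side = volume ** (1.0 / 3.0)
            if rng.random() < 0.5:   # insertion at a trial point in the box
                trial = tuple(rng.random() * side for _ in range(3))
                du = energy_of(trial, positions)
                if rng.random() < min(1.0, z * volume / (n + 1) * math.exp(-beta * du)):
                    positions.append(trial)
            elif n > 0:              # removal of a randomly chosen particle
                i = rng.randrange(n)
                du = -energy_of(positions[i], positions[:i] + positions[i + 1:])
                if rng.random() < min(1.0, n / (z * volume) * math.exp(-beta * du)):
                    positions.pop(i)
            return positions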

  15. Sequential probability ratio controllers for safeguards radiation monitors

    International Nuclear Information System (INIS)

    Fehlau, P.E.; Coop, K.L.; Nixon, K.V.

    1984-01-01

    Sequential hypothesis tests applied to nuclear safeguards accounting methods make the methods more sensitive to detecting diversion. The sequential tests also improve transient signal detection in safeguards radiation monitors. This paper describes three microprocessor control units with sequential probability-ratio tests for detecting transient increases in radiation intensity. The control units are designed for three specific applications: low-intensity monitoring with Poisson probability ratios, higher-intensity gamma-ray monitoring where fixed counting intervals are shortened by sequential testing, and monitoring moving traffic where the sequential technique responds to variable-duration signals. The fixed-interval controller shortens a customary 50-s monitoring time to an average of 18 s, making the monitoring delay less bothersome. The controller for monitoring moving vehicles benefits from the sequential technique by maintaining more than half its sensitivity when the normal passage speed doubles.
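
    The Poisson variant of the sequential probability-ratio test described here is compact enough to sketch directly; the rates and error probabilities below are illustrative assumptions, not the calibrated values from the paper:

        import math

        def sprt_poisson(counts, bg_rate, source_rate, alpha=0.01, beta=0.1):
            """Wald sequential probability-ratio test on a stream of counts.
            H0: Poisson mean = bg_rate; H1: Poisson mean = source_rate.
            Returns 'alarm', 'clear', or 'continue' as soon as a boundary is hit."""
            upper = math.log((1 - beta) / alpha)  # alarm boundary
            lower = math.log(beta / (1 - alpha))  # clear boundary
            llr = 0.0
            for n in counts:
                # log-likelihood-ratio increment for one Poisson observation
                llr += n * math.log(source_rate / bg_rate) - (source_rate - bg_rate)
                if llr >= upper:
                    return "alarm"
                if llr <= lower:
                    return "clear"
            return "continue"

    Because the test stops as soon as either boundary is crossed, the average number of counting intervals is much smaller than in a fixed-interval test with equal error rates, which is exactly the 50-s-to-18-s reduction reported above.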

  16. Open source data assimilation framework for hydrological modeling

    Science.gov (United States)

    Ridler, Marc; Hummel, Stef; van Velzen, Nils; Katrine Falk, Anne; Madsen, Henrik

    2013-04-01

    An open-source data assimilation framework is proposed for hydrological modeling. Data assimilation (DA) in hydrodynamic and hydrological forecasting systems has great potential to improve predictions and model results. The basic principle is to incorporate measurement information into a model with the aim of improving model results by error minimization. Great strides have been made to assimilate traditional in-situ measurements such as discharge, soil moisture, hydraulic head and snowpack into hydrologic models. More recently, remotely sensed data retrievals of soil moisture, snow water equivalent or snow cover area, surface water elevation, terrestrial water storage and land surface temperature have been successfully assimilated in hydrological models. The assimilation algorithms have become increasingly sophisticated to manage measurement and model bias, non-linear systems, data sparsity (time & space) and undetermined system uncertainty. It is therefore useful to use a pre-existing DA toolbox such as OpenDA. OpenDA is an open interface standard for (and free implementation of) a set of tools to quickly implement DA and calibration for arbitrary numerical models. The basic design philosophy of OpenDA is to break DA down into a set of building blocks programmed in object-oriented languages. To implement DA, a model must interact with OpenDA to create model instances, propagate the model, get/set variables (or parameters) and free the model once DA is completed. An open-source interface for hydrological models capable of all these tasks already exists: OpenMI. OpenMI is an open-source standard interface already adopted by key hydrological model providers. It defines a universal approach for interacting with hydrological models during simulation to exchange data at runtime, thus facilitating the interactions between models and data sources. The interface is flexible enough that models can interact even if coded in different languages, represent…
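
    The create/propagate/get-set/free interaction pattern described above can be caricatured in a few lines; every name below is hypothetical and merely mimics the pattern, and none of it is actual OpenDA or OpenMI API:

        # Hypothetical wrapper illustrating the building-block pattern.
        class ModelInstance:
            def __init__(self, config): ...   # create a model instance
            def propagate(self, t): ...       # run the model forward to time t
            def get_state(self): ...          # expose variables/parameters
            def set_state(self, state): ...   # overwrite them after analysis
            def free(self): ...               # release the model when DA is done

        def run_da(configs, obs_times, analyze):
            ensemble = [ModelInstance(c) for c in configs]
            for t in obs_times:
                for m in ensemble:
                    m.propagate(t)                        # forecast step
                states = [m.get_state() for m in ensemble]
                for m, s in zip(ensemble, analyze(states, t)):
                    m.set_state(s)                        # analysis step
            for m in ensemble:
                m.free()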

  17. Towards Data-Driven Simulations of Wildfire Spread using Ensemble-based Data Assimilation

    Science.gov (United States)

    Rochoux, M. C.; Bart, J.; Ricci, S. M.; Cuenot, B.; Trouvé, A.; Duchaine, F.; Morel, T.

    2012-12-01

    Real-time prediction of a propagating wildfire remains a challenging task because the problem involves both multiple physics and multiple scales. The propagation speed of wildfires, also called the rate of spread (ROS), is determined by complex interactions between pyrolysis, combustion, flow dynamics and atmospheric dynamics occurring at vegetation, topographical and meteorological scales. Current operational fire spread models are mainly based on a semi-empirical parameterization of the ROS in terms of vegetation, topographical and meteorological properties. For a fire spread simulation to be predictive and compatible with operational applications, the uncertainty in the ROS model should be reduced. As recent progress in remote sensing technology provides new ways to monitor the fire front position, a promising approach to overcome the difficulties found in wildfire spread simulations is to integrate fire modeling and fire sensing technologies using data assimilation (DA). For this purpose we have developed a prototype data-driven wildfire spread simulator that provides optimal estimates of poorly known model parameters [*]. The data-driven simulation capability is adapted for more realistic wildfire spread: it considers a regional-scale fire spread model that is informed by observations of the fire front location. An Ensemble Kalman Filter (EnKF) algorithm based on a parallel computing platform (OpenPALM) was implemented in order to perform a multi-parameter sequential estimation in which wind magnitude and direction are estimated in addition to vegetation properties (see attached figure). The EnKF algorithm shows a good ability to track a small-scale grassland fire experiment and properly accounts for the sensitivity of the simulation outcomes to the control parameters. In conclusion, data assimilation is shown to be a promising approach for more accurately forecasting time-varying wildfire spread conditions as new airborne-like observations of…
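
    The analysis step of the EnKF used for this kind of joint parameter estimation is compact in matrix form. The numpy sketch below is the generic stochastic (perturbed-observation) update, assuming a linearized observation operator H; it is not the OpenPALM implementation:

        import numpy as np

        def enkf_update(X, y, H, R, rng=np.random.default_rng(0)):
            """Stochastic EnKF analysis step.
            X: (n_state, n_ens) forecast ensemble, e.g. wind magnitude/direction
               and vegetation parameters augmented with the model state;
            y: observed fire-front positions (array); H: obs operator matrix;
            R: observation error covariance."""
            n_obs, n_ens = len(y), X.shape[1]
            A = X - X.mean(axis=1, keepdims=True)   # ensemble anomalies
            PHt = A @ (H @ A).T / (n_ens - 1)       # P H^T from the ensemble
            S = H @ PHt + R                         # innovation covariance
            K = PHt @ np.linalg.inv(S)              # Kalman gain
            Y = y[:, None] + rng.multivariate_normal(
                np.zeros(n_obs), R, size=n_ens).T   # perturbed observations
            return X + K @ (Y - H @ X)              # updated ensemble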

  18. A data assimilation framework for constraining upscaled cropland carbon flux seasonality and biometry with MODIS

    Directory of Open Access Journals (Sweden)

    O. Sus

    2013-04-01

    Agroecosystem models are strongly dependent on information on land management patterns for regional applications. Land management practices play a major role in determining global yield variability, and add an anthropogenic signal to the observed seasonality of atmospheric CO2 concentrations. However, there is still little knowledge of the spatial and temporal variability of important farmland activities such as crop sowing dates, and thus these remain rather crudely approximated within carbon cycle studies. In this study, we present a framework allowing for spatio-temporally resolved simulation of cropland carbon fluxes under observational constraints on land management and canopy greenness. We apply data assimilation methodology in order to explicitly account for information on sowing dates and model leaf area index. MODIS 250 m vegetation index data were assimilated, both in batch calibration for sowing date estimation and sequentially for improved model state estimation using the ensemble Kalman filter (EnKF), into a crop carbon mass balance model (SPAc). In doing so, we are able to quantify the multiannual (2000–2006) regional carbon flux and biometry seasonality of maize–soybean crop rotations surrounding the Bondville Ameriflux eddy covariance site, averaged over 104 pixel locations within the wider area. (1) Validation at the Bondville site shows that growing season C cycling is simulated accurately with MODIS-derived sowing dates, and we expect that this framework allows for accurate simulations of C cycling at locations for which ground-truth data are not available. Thus, this framework enables modellers to simulate current (i.e. last 10 yr) carbon cycling of major agricultural regions. Averaged over the 104 field patches analysed, relative spatial variability for biometry and net ecosystem exchange ranges from ∼7% to ∼18%. The annual sign of net biome productivity is not significantly different from carbon neutrality. (2) Moreover…

  19. Equivalence between quantum simultaneous games and quantum sequential games

    OpenAIRE

    Kobayashi, Naoki

    2007-01-01

    A framework for discussing relationships between different types of games is proposed. Within the framework, quantum simultaneous games, finite quantum simultaneous games, quantum sequential games, and finite quantum sequential games are defined. In addition, a notion of equivalence between two games is defined. Finally, the following three theorems are shown: (1) For any quantum simultaneous game G, there exists a quantum sequential game equivalent to G. (2) For any finite quantum simultaneo...

  20. Discrimination between sequential and simultaneous virtual channels with electrical hearing

    OpenAIRE

    Landsberger, David; Galvin, John J.

    2011-01-01

    In cochlear implants (CIs), simultaneous or sequential stimulation of adjacent electrodes can produce intermediate pitch percepts between those of the component electrodes. However, it is unclear whether simultaneous and sequential virtual channels (VCs) can be discriminated. In this study, CI users were asked to discriminate simultaneous and sequential VCs; discrimination was measured for monopolar (MP) and bipolar + 1 stimulation (BP + 1), i.e., relatively broad and focused stimulation mode...

  1. C-quence: a tool for analyzing qualitative sequential data.

    Science.gov (United States)

    Duncan, Starkey; Collier, Nicholson T

    2002-02-01

    C-quence is a software application that matches sequential patterns of qualitative data specified by the user and calculates the rate of occurrence of these patterns in a data set. Although it was designed to facilitate analyses of face-to-face interaction, it is applicable to any data set involving categorical data and sequential information. C-quence queries are constructed using a graphical user interface. The program does not limit the complexity of the sequential patterns specified by the user.
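
    At its core, the rate calculation C-quence performs reduces to sliding-window matching over a coded event sequence; this generic sketch illustrates the idea and is not C-quence's query engine:

        def pattern_rate(events, pattern):
            """Rate of occurrence of a categorical pattern in an event sequence."""
            k = len(pattern)
            windows = max(len(events) - k + 1, 1)
            hits = sum(1 for i in range(len(events) - k + 1)
                       if events[i:i + k] == pattern)
            return hits / windows

        # e.g. pattern_rate(['gaze', 'nod', 'gaze', 'nod'], ['gaze', 'nod']) -> 2/3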

  2. Discrimination between sequential and simultaneous virtual channels with electrical hearing.

    Science.gov (United States)

    Landsberger, David; Galvin, John J

    2011-09-01

    In cochlear implants (CIs), simultaneous or sequential stimulation of adjacent electrodes can produce intermediate pitch percepts between those of the component electrodes. However, it is unclear whether simultaneous and sequential virtual channels (VCs) can be discriminated. In this study, CI users were asked to discriminate simultaneous and sequential VCs; discrimination was measured for monopolar (MP) and bipolar + 1 stimulation (BP + 1), i.e., relatively broad and focused stimulation modes. For sequential VCs, the interpulse interval (IPI) varied between 0.0 and 1.8 ms. All stimuli were presented at comfortably loud, loudness-balanced levels at a 250 pulse per second per electrode (ppse) stimulation rate. On average, CI subjects were able to reliably discriminate between sequential and simultaneous VCs. While there was no significant effect of IPI or stimulation mode on VC discrimination, some subjects exhibited better VC discrimination with BP + 1 stimulation. Subjects' discrimination between sequential and simultaneous VCs was correlated with electrode discrimination, suggesting that spatial selectivity may influence perception of sequential VCs. To maintain equal loudness, sequential VC amplitudes were nearly double those of simultaneous VCs, presumably resulting in a broader spread of excitation. These results suggest that perceptual differences between simultaneous and sequential VCs might be explained by differences in the spread of excitation. © 2011 Acoustical Society of America

  3. Lineup composition, suspect position, and the sequential lineup advantage.

    Science.gov (United States)

    Carlson, Curt A; Gronlund, Scott D; Clark, Steven E

    2008-06-01

    N. M. Steblay, J. Dysart, S. Fulero, and R. C. L. Lindsay (2001) argued that sequential lineups reduce the likelihood of mistaken eyewitness identification. Experiment 1 replicated the design of R. C. L. Lindsay and G. L. Wells (1985), the first study to show the sequential lineup advantage. However, the innocent suspect was chosen at a lower rate in the simultaneous lineup, and no sequential lineup advantage was found. This led the authors to hypothesize that protection from a sequential lineup might emerge only when an innocent suspect stands out from the other lineup members. In Experiment 2, participants viewed a simultaneous or sequential lineup with either the guilty suspect or 1 of 3 innocent suspects. Lineup fairness was varied to influence the degree to which a suspect stood out. A sequential lineup advantage was found only for the unfair lineups. Additional analyses of suspect position in the sequential lineups showed an increase in the diagnosticity of suspect identifications as the suspect was placed later in the sequential lineup. These results suggest that the sequential lineup advantage is dependent on lineup composition and suspect position. (c) 2008 APA, all rights reserved

  4. Group-sequential analysis may allow for early trial termination

    DEFF Research Database (Denmark)

    Gerke, Oke; Vilstrup, Mie H; Halekoh, Ulrich

    2017-01-01

    BACKGROUND: Group-sequential testing is widely used in pivotal therapeutic, but rarely in diagnostic research, although it may save studies, time, and costs. The purpose of this paper was to demonstrate a group-sequential analysis strategy in an intra-observer study on quantitative FDG-PET/CT measurements…

  5. Investigation of glycerol assimilation and cofactor metabolism in Lactococcus lactis

    DEFF Research Database (Denmark)

    Holm, Anders Koefoed

    …of glycerol kinase from L. lactis, introduction of a heterologous glycerol assimilation pathway and construction of a library of NADH oxidase activity. Based on a preliminary analysis of transcription level data, an attempt was made to stimulate glycerol assimilation by overexpressing the glycerol kinase already present in L. lactis. The construction and verification of a strain with increased glycerol kinase activity was not fully completed and is still ongoing. Similarly, the construction of mutants expressing a heterologous pathway for glycerol dissimilation is also an ongoing task. An artificial… effects and improve the growth rate, though not completely to the level of the reference strain. The fact that this effect was predominantly observed while utilizing xylose implicates the involvement of the pentose phosphate pathway. A possible mechanism underlying the observed growth characteristics…

  6. Modulation of intestinal sulfur assimilation metabolism regulates iron homeostasis

    Science.gov (United States)

    Hudson, Benjamin H.; Hale, Andrew T.; Irving, Ryan P.; Li, Shenglan; York, John D.

    2018-01-01

    Sulfur assimilation is an evolutionarily conserved pathway that plays an essential role in cellular and metabolic processes, including sulfation, amino acid biosynthesis, and organismal development. We report that loss of a key enzymatic component of the pathway, bisphosphate 3′-nucleotidase (Bpnt1), in mice, both whole animal and intestine-specific, leads to iron-deficiency anemia. Analysis of mutant enterocytes demonstrates that modulation of their substrate 3′-phosphoadenosine 5′-phosphate (PAP) influences levels of key iron homeostasis factors involved in dietary iron reduction, import and transport, that in part mimic those reported for the loss of hypoxic-induced transcription factor, HIF-2α. Our studies define a genetic basis for iron-deficiency anemia, a molecular approach for rescuing loss of nucleotidase function, and an unanticipated link between nucleotide hydrolysis in the sulfur assimilation pathway and iron homeostasis. PMID:29507250

  7. Assimilation and transformation of benzene by higher plants

    Energy Technology Data Exchange (ETDEWEB)

    Durmishidze, S V; Ugrekhelidze, D Sh; Dzhikiya, A N

    1974-01-01

    Higher plants are capable of assimilating benzene, whose molecules are subjected to deep chemical transformations; the products of its metabolism move throughout the plant. Taking part in total metabolism, carbon atoms of benzene molecules are incorporated into low-molecular compounds of the plant cell. The bulk of benzene carbon is incorporated into organic acids and a comparatively small part into amino acids. In the course of metabolism, benzene carbon localizes mainly in the chloroplasts. Phenol, muconic acid and CO2 were isolated and identified among the products of enzymatic benzene oxidation. The range of benzene assimilation by higher plants is extremely wide. 9 references, 5 tables.

  8. Ensemble-Based Data Assimilation in Reservoir Characterization: A Review

    Directory of Open Access Journals (Sweden)

    Seungpil Jung

    2018-02-01

    This paper presents a review of ensemble-based data assimilation for strongly nonlinear problems on the characterization of heterogeneous reservoirs with different production histories. It concentrates on the ensemble Kalman filter (EnKF) and ensemble smoother (ES) as representative frameworks, discusses their pros and cons, and investigates recent progress to overcome their drawbacks. The typical weaknesses of ensemble-based methods are non-Gaussian parameters, improper prior ensembles and finite population size. Three categories of approaches to mitigate these limitations are reviewed with recent accomplishments: improvement of Kalman gains, add-on of transformation functions, and independent evaluation of observed data. Data assimilation in heterogeneous reservoirs, applying the improved ensemble methods, is discussed with respect to predicting unknown dynamic data in reservoir characterization.

  9. Herbicides effect on the nitrogen fertilizer assimilation by sensitive plants

    International Nuclear Information System (INIS)

    Ladonin, V.F.; Samojlov, L.N.

    1976-01-01

    In studying the effect of herbicides on pea plants, it has been established that penetration of the preparations into the tissues of leaves and stems results in a slight increase in the rate of dry matter formation in the leaves of treated plants within 24 hours after treatment as compared with the control, whereas in the last period of the analysis the herbicides strongly inhibit dry matter formation in leaves. The applied herbicide doses resulted in drastic changes in the distribution of plant-assimilated nitrogen between the protein and non-protein fractions in the leaves and stems of pea. Under the influence of the studied herbicides, the fertilizer nitrogen supply to the pea plants changes and the rate of fertilizer nitrogen assimilation by the plants varies noticeably. The regularities of fertilizer nitrogen incorporation into the protein and non-protein nitrogen compounds of the above-ground pea organs have been studied.

  10. Mathematical foundations of hybrid data assimilation from a synchronization perspective

    Science.gov (United States)

    Penny, Stephen G.

    2017-12-01

    The state-of-the-art data assimilation methods used today in operational weather prediction centers around the world can be classified as generalized one-way coupled impulsive synchronization. This classification permits the investigation of hybrid data assimilation methods, which combine dynamic error estimates of the system state with long time-averaged (climatological) error estimates, from a synchronization perspective. Illustrative results show how dynamically informed formulations of the coupling matrix (via an Ensemble Kalman Filter, EnKF) can lead to synchronization when observing networks are sparse and how hybrid methods can lead to synchronization when those dynamic formulations are inadequate (due to small ensemble sizes). A large-scale application with a global ocean general circulation model is also presented. Results indicate that the hybrid methods also have useful applications in generalized synchronization, in particular, for correcting systematic model errors.
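
    The hybrid blending discussed here is most often written as a convex combination of the static (climatological) and ensemble-derived background error covariances. Schematically, in the generic form (with α a tunable weight; this is the textbook hybrid covariance, not necessarily Penny's specific formulation):

        \mathbf{B}_{\mathrm{hybrid}} = \alpha\,\mathbf{B}_{\mathrm{static}} + (1-\alpha)\,\mathbf{P}^{f}_{\mathrm{ens}}

    With α = 1 one recovers the purely climatological (3DVAR-like) gain, and with α = 0 the purely dynamic (EnKF) gain; intermediate values cover the cases where small ensembles make the dynamic estimate inadequate.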

  11. Regulation of assimilate partitioning by daylength and spectral quality

    Energy Technology Data Exchange (ETDEWEB)

    Britz, S.J. [USDA-Climate Stress Lab., Beltsville, MD (United States)

    1994-12-31

    Photosynthesis is the process by which plants utilize light energy to assimilate and transform carbon dioxide into products that support growth and development. The preceding review provides an excellent summary of photosynthetic mechanisms and diurnal patterns of carbon metabolism with emphasis on the importance of gradual changes in photosynthetically-active radiation at dawn and dusk. In addition to these direct effects of irradiance, there are indirect effects of light period duration and spectral quality on carbohydrate metabolism and assimilate partitioning. Both daylength and spectral quality trigger developmental phenomena such as flowering (e.g., photoperiodism) and shade avoidance responses, but their effects on partitioning of photoassimilates in leaves are less well known. Moreover, the adaptive significance to the plants of such effects is sometimes not clear.

  12. Leaching of assimilable silicon species from fly ash

    International Nuclear Information System (INIS)

    Piekos, R.; Paslawska, S.

    1998-01-01

    The objective of this study was to investigate the leaching of assimilable silicon species from coal fly ash with distilled water, sea water and synthetic sea water at various fly ash/water ratios, pHs and temperatures. At the 1 g/100 ml fly ash/water ratio, less than 1 mg Si was found in 1 l of aqueous slurry over the pH range 4-8 after 2 h at ambient temperature. The leaching was most effective at pH 10.5. At the fly ash/water ratio indicated, the pH of the suspensions decreased from 10.4 to 8.4 after 5 days. The pH of fly ash slurries in sea water varied only slightly over time as compared with that in distilled water. Generally, the leaching of assimilable silicon species with distilled water was more intense than that with sea water. 27 refs., 6 figs., 3 tabs

  13. Importance iteration in MORSE Monte Carlo calculations

    International Nuclear Information System (INIS)

    Kloosterman, J.L.; Hoogenboom, J.E.

    1994-01-01

    An expression to calculate point values (the expected detector response of a particle emerging from a collision or the source) is derived and implemented in the MORSE-SGC/S Monte Carlo code. It is outlined how these point values can be smoothed as a function of energy and as a function of the optical thickness between the detector and the source. The smoothed point values are subsequently used to calculate the biasing parameters of the Monte Carlo runs to follow. The method is illustrated by an example that shows that the obtained biasing parameters lead to a more efficient Monte Carlo calculation

  14. Monte Carlo approaches to light nuclei

    International Nuclear Information System (INIS)

    Carlson, J.

    1990-01-01

    Significant progress has been made recently in the application of Monte Carlo methods to the study of light nuclei. We review new Green's function Monte Carlo results for the alpha particle, Variational Monte Carlo studies of 16O, and methods for low-energy scattering and transitions. Through these calculations, a coherent picture of the structure and electromagnetic properties of light nuclei has arisen. In particular, we examine the effect of the three-nucleon interaction and the importance of exchange currents in a variety of experimentally measured properties, including form factors and capture cross sections. 29 refs., 7 figs

  15. Monte Carlo approaches to light nuclei

    Energy Technology Data Exchange (ETDEWEB)

    Carlson, J.

    1990-01-01

    Significant progress has been made recently in the application of Monte Carlo methods to the study of light nuclei. We review new Green's function Monte Carlo results for the alpha particle, Variational Monte Carlo studies of 16O, and methods for low-energy scattering and transitions. Through these calculations, a coherent picture of the structure and electromagnetic properties of light nuclei has arisen. In particular, we examine the effect of the three-nucleon interaction and the importance of exchange currents in a variety of experimentally measured properties, including form factors and capture cross sections. 29 refs., 7 figs.

  16. Importance iteration in MORSE Monte Carlo calculations

    International Nuclear Information System (INIS)

    Kloosterman, J.L.; Hoogenboom, J.E.

    1994-02-01

    An expression to calculate point values (the expected detector response of a particle emerging from a collision or the source) is derived and implemented in the MORSE-SGC/S Monte Carlo code. It is outlined how these point values can be smoothed as a function of energy and as a function of the optical thickness between the detector and the source. The smoothed point values are subsequently used to calculate the biasing parameters of the Monte Carlo runs to follow. The method is illustrated by an example, which shows that the obtained biasing parameters lead to a more efficient Monte Carlo calculation. (orig.)

  17. Monte carlo simulation for soot dynamics

    KAUST Repository

    Zhou, Kun

    2012-01-01

    A new Monte Carlo method termed Comb-like frame Monte Carlo is developed to simulate soot dynamics. A detailed stochastic error analysis is provided. Comb-like frame Monte Carlo is coupled with the gas phase solver Chemkin II to simulate soot formation in a 1-D premixed burner-stabilized flame. The simulated soot number density, volume fraction, and particle size distribution all agree well with the measurements available in the literature. The origin of the bimodal particle size distribution is revealed with quantitative proof.

  18. Sequential infiltration synthesis for advanced lithography

    Energy Technology Data Exchange (ETDEWEB)

    Darling, Seth B.; Elam, Jeffrey W.; Tseng, Yu-Chih; Peng, Qing

    2017-10-10

    A plasma etch resist material modified by an inorganic protective component via sequential infiltration synthesis (SIS) and methods of preparing the modified resist material. The modified resist material is characterized by an improved resistance to a plasma etching or related process relative to the unmodified resist material, thereby allowing formation of patterned features into a substrate material, which may be high-aspect ratio features. The SIS process forms the protective component within the bulk resist material through a plurality of alternating exposures to gas phase precursors which infiltrate the resist material. The plasma etch resist material may be initially patterned using photolithography, electron-beam lithography or a block copolymer self-assembly process.

  19. Clinical evaluation of synthetic aperture sequential beamforming

    DEFF Research Database (Denmark)

    Hansen, Peter Møller; Hemmsen, Martin Christian; Lange, Theis

    2012-01-01

    In this study clinically relevant ultrasound images generated with synthetic aperture sequential beamforming (SASB) are compared to images generated with a conventional technique. The advantage of SASB is the ability to produce high resolution ultrasound images with a high frame rate and at the same time massively reduce the amount of generated data. SASB was implemented in a system consisting of a conventional ultrasound scanner connected to a PC via a research interface. This setup enables simultaneous recording with both SASB and the conventional technique. Eighteen volunteers were ultrasound scanned abdominally, and 84 sequence pairs were recorded. Each sequence pair consists of two simultaneous recordings of the same anatomical location with SASB and conventional B-mode imaging. The images were evaluated in terms of spatial resolution, contrast, unwanted artifacts, and penetration depth…

  20. Sequential cooling insert for turbine stator vane

    Science.gov (United States)

    Jones, Russel B

    2017-04-04

    A sequential flow cooling insert for a turbine stator vane of a small gas turbine engine, where the impingement cooling insert is formed as a single piece from a metal additive manufacturing process such as 3D metal printing, and where the insert includes a plurality of rows of radial extending impingement cooling air holes alternating with rows of radial extending return air holes on a pressure side wall, and where the insert includes a plurality of rows of chordwise extending second impingement cooling air holes on a suction side wall. The insert includes alternating rows of radial extending cooling air supply channels and return air channels that form a series of impingement cooling on the pressure side followed by the suction side of the insert.

  1. Gleason-Busch theorem for sequential measurements

    Science.gov (United States)

    Flatt, Kieran; Barnett, Stephen M.; Croke, Sarah

    2017-12-01

    Gleason's theorem is a statement that, given some reasonable assumptions, the Born rule used to calculate probabilities in quantum mechanics is essentially unique [A. M. Gleason, Indiana Univ. Math. J. 6, 885 (1957), 10.1512/iumj.1957.6.56050]. We show that Gleason's theorem contains within it also the structure of sequential measurements, and along with this the state update rule. We give a small set of axioms, which are physically motivated and analogous to those in Busch's proof of Gleason's theorem [P. Busch, Phys. Rev. Lett. 91, 120403 (2003), 10.1103/PhysRevLett.91.120403], from which the familiar Kraus operator form follows. An axiomatic approach has practical relevance as well as fundamental interest, in making clear those assumptions which underlie the security of quantum communication protocols. Interestingly, the two-time formalism is seen to arise naturally in this approach.
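
    The Kraus-operator update rule referred to above has a direct numerical form, which also makes the sequential structure explicit (measure, update the state, measure again). A minimal numpy sketch; the projective z-measurement at the end is an illustrative assumption:

        import numpy as np

        def measure(rho, kraus_ops, rng=np.random.default_rng(0)):
            """Outcome i occurs with p_i = tr(K_i rho K_i†); the state update
            rule is rho -> K_i rho K_i† / p_i."""
            probs = np.array([np.trace(K @ rho @ K.conj().T).real
                              for K in kraus_ops])
            i = rng.choice(len(kraus_ops), p=probs / probs.sum())
            K = kraus_ops[i]
            return i, (K @ rho @ K.conj().T) / probs[i]

        rho = np.array([[0.5, 0.5], [0.5, 0.5]])           # the state |+><+|
        Kz = [np.diag([1.0, 0.0]), np.diag([0.0, 1.0])]    # projective z basis
        out1, rho1 = measure(rho, Kz)    # first measurement updates the state
        out2, rho2 = measure(rho1, Kz)   # sequential measurement on the update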

  2. Sequential Stereotype Priming: A Meta-Analysis.

    Science.gov (United States)

    Kidder, Ciara K; White, Katherine R; Hinojos, Michelle R; Sandoval, Mayra; Crites, Stephen L

    2017-08-01

    Psychological interest in stereotype measurement has spanned nearly a century, with researchers adopting implicit measures in the 1980s to complement explicit measures. One of the most frequently used implicit measures of stereotypes is the sequential priming paradigm. The current meta-analysis examines stereotype priming, focusing specifically on this paradigm. To contribute to ongoing discussions regarding methodological rigor in social psychology, one primary goal was to identify methodological moderators of the stereotype priming effect: whether priming is due to a relation between the prime and target stimuli, the prime and target response, participant task, stereotype dimension, stimulus onset asynchrony (SOA), or stimulus type. Data from 39 studies yielded 87 individual effect sizes from 5,497 participants. Analyses revealed that stereotype priming is significantly moderated by the presence of prime-response relations, participant task, stereotype dimension, target stimulus type, SOA, and prime repetition. These results carry both practical and theoretical implications for future research on stereotype priming.

  3. Sequential Acral Lentiginous Melanomas of the Foot

    Directory of Open Access Journals (Sweden)

    Jiro Uehara

    2010-12-01

    A 64-year-old Japanese woman had a lightly brown-blackish pigmented macule (1.2 cm in diameter) on the left sole of her foot. She underwent surgical excision following a diagnosis of acral lentiginous melanoma (ALM), which was confirmed histopathologically. One month after the operation, a second melanoma lesion was noticed adjacent to the grafted site. Histopathologically, the two lesions had no continuity, but HMB-45 and cyclin D1 double-positive cells were detected not only in aggregates of atypical melanocytes but also in single cells near the cutting edge of the first lesion. The unique occurrence of a sequential lesion of a primary melanoma might be caused by stimulated subclinical field cells during the wound healing process following the initial operation. This case warrants further investigation to establish the appropriate surgical margin for ALM lesions.

  4. Dancing Twins: Stellar Hierarchies That Formed Sequentially?

    Science.gov (United States)

    Tokovinin, Andrei

    2018-04-01

    This paper draws attention to the class of resolved triple stars with moderate ratios of inner and outer periods (possibly in a mean motion resonance) and nearly circular, mutually aligned orbits. Moreover, stars in the inner pair are twins with almost identical masses, while the mass sum of the inner pair is comparable to the mass of the outer component. Such systems could be formed either sequentially (inside-out) by disk fragmentation with subsequent accretion and migration, or by a cascade hierarchical fragmentation of a rotating cloud. Orbits of the outer and inner subsystems are computed or updated in four such hierarchies: LHS 1070 (GJ 2005, periods 77.6 and 17.25 years), HIP 9497 (80 and 14.4 years), HIP 25240 (1200 and 47.0 years), and HIP 78842 (131 and 10.5 years).

  5. Sequential Therapy in Metastatic Renal Cell Carcinoma

    Directory of Open Access Journals (Sweden)

    Bradford R Hirsch

    2016-04-01

    The treatment of metastatic renal cell carcinoma (mRCC) has changed dramatically in the past decade. As the number of available agents, and the related volume of research, has grown, it is increasingly complex to know how to optimally treat patients. The authors are practicing medical oncologists at the US Oncology Network, the largest community-based network of oncology providers in the country, and represent the leadership of the Network's Genitourinary Research Committee. We outline our thought process in approaching the sequential therapy of mRCC and the use of real-world data to inform our approach. We also highlight the evolving literature that will impact practicing oncologists in the near future.

  6. Microstructure history effect during sequential thermomechanical processing

    International Nuclear Information System (INIS)

    Yassar, Reza S.; Murphy, John; Burton, Christina; Horstemeyer, Mark F.; El kadiri, Haitham; Shokuhfar, Tolou

    2008-01-01

    The key to modeling material processing behavior is linking the microstructure evolution to its processing history. This paper quantifies various microstructural features of an aluminum automotive alloy that undergoes sequential thermomechanical processing comprising hot rolling of a 150-mm billet to a 75-mm billet, rolling to 3 mm, annealing, and then cold rolling to a 0.8-mm thick sheet. The microstructural content was characterized by means of electron backscatter diffraction, scanning electron microscopy, and transmission electron microscopy. The results clearly demonstrate the evolution of precipitate morphologies, dislocation structures, and grain orientation distributions. These data can be used to improve material models that claim to capture the history effects of materials processing.

  7. Prosody and alignment: a sequential perspective

    Science.gov (United States)

    Szczepek Reed, Beatrice

    2010-12-01

    In their analysis of a corpus of classroom interactions in an inner-city high school, Roth and Tobin describe how teachers and students accomplish interactional alignment by prosodically matching each other's turns. Prosodic matching and specific prosodic patterns are interpreted as signs of, and contributions to, successful interactional outcomes and positive emotions. Lack of prosodic matching and other specific prosodic patterns are interpreted as features of unsuccessful interactions and negative emotions. This forum focuses on the article's analysis of the relation between interpersonal alignment, emotion and prosody. It argues that prosodic matching, and other prosodic linking practices, play a primarily sequential role, i.e. one that displays the way in which participants place and design their turns in relation to other participants' turns. Prosodic matching, rather than being a conversational action in itself, is argued to be an interactional practice (Schegloff 1997), which is not always employed for the accomplishment of 'positive', or aligning, actions.

  8. DARLA: Data Assimilation and Remote Sensing for Littoral Applications

    Science.gov (United States)

    2017-03-01

    …at reasonable logistical or financial costs. Remote sensing provides an attractive alternative. We discuss the range of different sensors that are… (DARLA: Data Assimilation and Remote Sensing for Littoral Applications, Final Report, Award Number N000141010932; Andrew T. Jessup, Chris Chickadel)

  9. Nitrogen and sulfur assimilation in plants and algae

    Czech Academy of Sciences Publication Activity Database

    Giordano, Mario; Raven, John A.

    2014-01-01

    Roč. 118, č. 2 (2014), s. 45-61 ISSN 0304-3770 Grant - others:University of Dundee(GB) SC 015096; Italian Ministry for Agriculture(IT) MIPAF, Bioforme project; Italian Ministry of Foreign Affairs(IT) MAE. Joint Italian-Israel Cooperation Program Institutional support: RVO:61388971 Keywords : nitrogen * sulfur * assimilation * algae Subject RIV: EE - Microbiology, Virology Impact factor: 1.608, year: 2014

  10. Bayesian Nonlinear Assimilation of Eulerian and Lagrangian Coastal Flow Data

    Science.gov (United States)

    2015-09-30

    …Lagrangian Coastal Flow Data. Dr. Pierre F.J. Lermusiaux, Department of Mechanical Engineering, Center for Ocean Science and Engineering, Massachusetts… Develop and apply theory, schemes and computational systems for rigorous Bayesian nonlinear assimilation of Eulerian and Lagrangian coastal flow data… coastal ocean fields, both in Eulerian and Lagrangian forms. - Further develop and implement our GMM-DO schemes for robust Bayesian nonlinear estimation

  11. Carbon and nitrogen assimilation in deep subseafloor microbial cells

    OpenAIRE

    Morono, Yuki; Terada, Takeshi; Nishizawa, Manabu; Ito, Motoo; Hillion, François; Takahata, Naoto; Sano, Yuji; Inagaki, Fumio

    2011-01-01

    Remarkable numbers of microbial cells have been observed in global shallow to deep subseafloor sediments. Accumulating evidence indicates that deep and ancient sediments harbor living microbial life, where the flux of nutrients and energy are extremely low. However, their physiology and energy requirements remain largely unknown. We used stable isotope tracer incubation and nanometer-scale secondary ion MS to investigate the dynamics of carbon and nitrogen assimilation activities in individua...

  12. Monte Carlo Codes Invited Session

    International Nuclear Information System (INIS)

    Trama, J.C.; Malvagi, F.; Brown, F.

    2013-01-01

    This document lists 22 Monte Carlo codes used in radiation transport applications throughout the world. For each code the names of the organization and country and/or place are given. We have the following computer codes. 1) ARCHER, USA, RPI; 2) COG11, USA, LLNL; 3) DIANE, France, CEA/DAM Bruyeres; 4) FLUKA, Italy and CERN, INFN and CERN; 5) GEANT4, International GEANT4 collaboration; 6) KENO and MONACO (SCALE), USA, ORNL; 7) MC21, USA, KAPL and Bettis; 8) MCATK, USA, LANL; 9) MCCARD, South Korea, Seoul National University; 10) MCNP6, USA, LANL; 11) MCU, Russia, Kurchatov Institute; 12) MONK and MCBEND, United Kingdom, AMEC; 13) MORET5, France, IRSN Fontenay-aux-Roses; 14) MVP2, Japan, JAEA; 15) OPENMC, USA, MIT; 16) PENELOPE, Spain, Barcelona University; 17) PHITS, Japan, JAEA; 18) PRIZMA, Russia, VNIITF; 19) RMC, China, Tsinghua University; 20) SERPENT, Finland, VTT; 21) SUPERMONTECARLO, China, CAS INEST FDS Team Hefei; and 22) TRIPOLI-4, France, CEA Saclay

  13. Advanced computers and Monte Carlo

    International Nuclear Information System (INIS)

    Jordan, T.L.

    1979-01-01

    High-performance parallelism that is currently available is synchronous in nature. It is manifested in such architectures as the Burroughs ILLIAC-IV, CDC STAR-100, TI ASC, CRI CRAY-1, ICL DAP, and many special-purpose array processors designed for signal processing. This form of parallelism has apparently not been of significant value to many important Monte Carlo calculations. Nevertheless, there is much asynchronous parallelism in many of these calculations. A model of a production code that requires up to 20 hours per problem on a CDC 7600 is studied for suitability on some asynchronous architectures that are on the drawing board. The code is described, and some of its properties and resource requirements are identified for comparison with corresponding properties and resources of some asynchronous multiprocessor architectures. Arguments are made for programmer aids and special syntax to identify and support important asynchronous parallelism. 2 figures, 5 tables

  14. Adaptive Markov Chain Monte Carlo

    KAUST Repository

    Jadoon, Khan

    2016-08-08

    A substantial interpretation of electromagnetic induction (EMI) measurements requires quantifying optimal model parameters and uncertainty of a nonlinear inverse problem. For this purpose, an adaptive Bayesian Markov chain Monte Carlo (MCMC) algorithm is used to assess multi-orientation and multi-offset EMI measurements in an agricultural field with non-saline and saline soil. In the MCMC simulations, the posterior distribution was computed using Bayes' rule. The electromagnetic forward model, based on the full solution of Maxwell's equations, was used to simulate the apparent electrical conductivity measured with the configurations of the EMI instrument, the CMD mini-Explorer. The model parameters and uncertainty for the three-layered earth model are investigated by using synthetic data. Our results show that in the scenario of non-saline soil, the parameters of layer thickness are not as well estimated as the layers' electrical conductivity, because layer thickness in the model exhibits a low sensitivity to the EMI measurements and is hence difficult to resolve. Application of the proposed MCMC-based inversion to the field measurements in a drip irrigation system demonstrates that the parameters of the model can be well estimated for the saline soil as compared to the non-saline soil, and provides useful insight about parameter uncertainty for the assessment of the model outputs.
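
    The adaptive ingredient is that the random-walk proposal covariance is tuned from the chain's own history. The sketch below follows the widely used Haario-type adaptive Metropolis recipe as a generic illustration; it is not the study's EMI inversion code:

        import numpy as np

        def adaptive_metropolis(log_post, x0, n_iter, rng=np.random.default_rng(0)):
            """Random-walk Metropolis with proposal covariance periodically
            re-estimated from the accumulated samples (scaled by 2.38^2/d)."""
            d = len(x0)
            chain = [np.asarray(x0, dtype=float)]
            lp = log_post(chain[0])
            cov = 0.1 * np.eye(d)                     # initial proposal covariance
            for i in range(1, n_iter):
                if i > 100 and i % 50 == 0:           # adaptation step
                    cov = (np.cov(np.array(chain).T) * 2.38 ** 2 / d
                           + 1e-8 * np.eye(d))        # regularized empirical cov
                prop = rng.multivariate_normal(chain[-1], cov)
                lp_prop = log_post(prop)
                if np.log(rng.random()) < lp_prop - lp:   # Metropolis accept
                    chain.append(prop)
                    lp = lp_prop
                else:
                    chain.append(chain[-1].copy())
            return np.array(chain)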

  15. AMSR2 all-sky radiance assimilation and its impact on the analysis and forecast of Hurricane Sandy with a limited-area data assimilation system

    Directory of Open Access Journals (Sweden)

    Chun Yang

    2016-06-01

    Full Text Available A method to assimilate all-sky radiances from the Advanced Microwave Scanning Radiometer 2 (AMSR2 was developed within the Weather Research and Forecasting (WRF model's data assimilation (WRFDA system. The four essential elements are: (1 extending the community radiative transform model's (CRTM interface to include hydrometeor profiles; (2 using total water Qt as the moisture control variable; (3 using a warm-rain physics scheme for partitioning the Qt increment into individual increments of water vapour, cloud liquid water and rain; and (4 adopting a symmetric observation error model for all-sky radiance assimilation.Compared to a benchmark experiment with no AMSR2 data, the impact of assimilating clear-sky or all-sky AMSR2 radiances on the analysis and forecast of Hurricane Sandy (2012 was assessed through analysis/forecast cycling experiments using WRF and WRFDA's three-dimensional variational (3DVAR data assimilation scheme. With more cloud/precipitation-affected data being assimilated around tropical cyclone (TC core areas in the all-sky AMSR2 assimilation experiment, better analyses were obtained in terms of the TC's central sea level pressure (CSLP, warm-core structure and cloud distribution. Substantial (>20 % error reduction in track and CSLP forecasts was achieved from both clear-sky and all-sky AMSR2 assimilation experiments, and this improvement was consistent from the analysis time to 72-h forecasts. Moreover, the all-sky assimilation experiment consistently yielded better track and CSLP forecasts than the clear-sky did for all forecast lead times, due to a better analysis in the TC core areas. Positive forecast impact from assimilating AMSR2 radiances is also seen when verified against the European Center for Medium-Range Weather Forecasts (ECMWF analysis and the Stage IV precipitation analysis, with an overall larger positive impact from the all-sky assimilation experiment.

  16. Monitoring sequential electron transfer with EPR

    International Nuclear Information System (INIS)

    Thurnauer, M.C.; Feezel, L.L.; Snyder, S.W.; Tang, J.; Norris, J.R.; Morris, A.L.; Rustandi, R.R.

    1989-01-01

    A completely general model which treats electron spin polarization (ESP) found in a system in which radical pairs with different magnetic interactions are formed sequentially has been described. This treatment has been applied specifically to the ESP found in the bacterial reaction center. Test cases show clearly how parameters such as structure, lifetime, and magnetic interactions within the successive radical pairs affect the ESP, and demonstrate that previous treatments of this problem have been incomplete. The photosynthetic bacterial reaction center protein is an ideal system for testing the general model of ESP. The radical pair which exhibits ESP, P870+ Q- (P870+ is the oxidized primary electron donor, a bacteriochlorophyll special pair, and Q- is the reduced primary quinone acceptor), is formed via sequential electron transport through the intermediary radical pair P870+ I- (I- is the reduced intermediary electron acceptor, a bacteriopheophytin). In addition, it is possible to experimentally vary most of the important parameters, such as the lifetime of the intermediary radical pair and the magnetic interactions in each pair. It has been shown how selective isotopic substitution (1H or 2H) on P870, I and Q affects the ESP of the EPR spectrum of P870+ Q-, observed at two different microwave frequencies, in Fe2+-depleted bacterial reaction centers of Rhodobacter sphaeroides R26. Thus, the relative magnitudes of the magnetic properties (nuclear hyperfine and g-factor differences) which influence ESP development were varied. The results support the general model of ESP in that they suggest that the P870+ Q- radical pair interactions are the dominant source of ESP production in 2H bacterial reaction centers.

  17. Data assimilation in the early phase: Kalman filtering RIMPUFF

    International Nuclear Information System (INIS)

    Astrup, P.; Turcanu, C.; Puch, R.O.; Palma, C.R.; Mikkelsen, T.

    2004-09-01

    In the framework of the DAONEM project (Data Assimilation for Off-site Nuclear Emergency Management), a data assimilation module, ADUM (Atmospheric Dispersion Updating Module), has been developed for the mesoscale atmospheric dispersion program RIMPUFF (Risoe Mesoscale Puff model), part of the early-phase programs of RODOS (Realtime Online DecisiOn Support system for nuclear emergencies). It is built on the Kalman filtering algorithm and assimilates 10-minute averaged gamma dose rates measured at ground-level stations. Since the gamma dose rates are non-linear functions of the state vector variables, the applied Kalman filter is the so-called extended Kalman filter. The implementation is non-standard in several ways: 1) the number of state vector variables varies with time, and 2) the state vector variables are prediction-updated with 1-minute time steps but only Kalman filtered every 10 minutes, based on time-averaged measurements. Given reasonable conditions, i.e. a spatially dense distribution of gamma monitors and a realistic wind field, the developed ADUM module is found to be able to enhance the prediction of the gamma dose field. Based on some of the Kalman filtering parameters, another module, ToDeMM, has been developed for providing the late-phase DeMM (Deposition Monitoring Module) of RODOS with an ensemble of fields of ground-level air concentrations and wet-deposited material. This accounts for the uncertainty estimation of these kinds of quantities as calculated by RIMPUFF for use by DeMM. (au)
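
    The non-standard cadence described, prediction updates every minute but a Kalman update only every 10th step against 10-minute-averaged dose rates, amounts to the following extended Kalman filter loop; f/h and their Jacobians are assumed helper functions, and this is a schematic, not the ADUM code:

        import numpy as np

        def adum_like_ekf(x, P, minutes, f, F_jac, h, H_jac, Q, R, get_obs):
            """EKF with 1-minute prediction steps and a measurement update
            every 10th step (10-minute averaged gamma dose rates)."""
            for minute in range(1, minutes + 1):
                F = F_jac(x)
                x = f(x)                          # 1-minute prediction update
                P = F @ P @ F.T + Q
                if minute % 10 == 0:              # filter only every 10 minutes
                    y = get_obs(minute)           # 10-min averaged dose rates
                    H = H_jac(x)                  # linearized gamma-rate operator
                    S = H @ P @ H.T + R
                    K = P @ H.T @ np.linalg.inv(S)
                    x = x + K @ (y - h(x))
                    P = (np.eye(len(x)) - K @ H) @ P
            return x, P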

  18. Oxidation and Assimilation of Carbohydrates by Micrococcus sodonensis1

    Science.gov (United States)

    Perry, Jerome J.; Evans, James B.

    1966-01-01

    Perry, Jerome J. (North Carolina State University, Raleigh), and James B. Evans. Oxidation and assimilation of carbohydrates by Micrococcus sodonensis. J. Bacteriol. 91:33–38. 1966.—Micrococcus sodonensis is a biotin-requiring strict aerobe that cannot utilize carbohydrates as sole sources of carbon and energy. However, addition of mannose, glucose, sucrose, or maltose to a medium on which the organism can grow resulted in an increase in total growth. M. sodonensis oxidized these sugars without induction, thus indicating the presence of constitutive enzymes for their transport, activation, and metabolism. Under appropriate nonproliferating cell conditions, glucose was readily incorporated into essential constituents of the cell. When glucose-1-C14 and glucose-6-C14 were oxidized by nonproliferating cells, the label was found in both the protein and nucleic acid fractions of the cell. The respiratory quotients of cells oxidizing glucose in saline and in phosphate buffer indicated assimilation of sugar carbon in buffer and virtually no assimilation in saline. The ability of M. sodonensis to completely oxidize glucose and to grow on intermediates of glucose oxidation but not on glucose suggests that glucose may suppress or repress some reaction(s) necessary for growth, and that growth substrates either derepress or circumvent this block. PMID:5903100

  19. Nonlinear problems in data-assimilation : Can synchronization help?

    Science.gov (United States)

    Tribbia, J. J.; Duane, G. S.

    2009-12-01

    Over the past several years, operational weather centers have initiated ensemble prediction and assimilation techniques to estimate the error covariance of forecasts in the short and medium range. The ensemble techniques used are based on linear methods. This technique has been shown to be a useful indicator of skill in the linear range where forecast errors are small relative to climatological variance. While this advance has been impressive, there are still ad hoc aspects of its use in practice, such as the need for covariance inflation, which are troubling. Furthermore, to be of utility in the nonlinear range, an ensemble assimilation and prediction method must be capable of giving probabilistic information for the situation where a probability density forecast becomes multi-modal. A prototypical, simplest example of such a situation is the planetary-wave regime transition where the pdf is bimodal. Our recent research shows how the inconsistencies and extensions of linear methodology can be consistently treated using the paradigm of synchronization, which views the problems of assimilation and forecasting as that of optimizing the forecast model state with respect to the future evolution of the atmosphere.

  20. Data assimilation in the decision support system RODOS

    International Nuclear Information System (INIS)

    Rojas-Palma, C.; Madsen, H.; Gering, F.; Puch, R.; Turcanu, C.; Astrup, P.; Mueller, H.; Richter, K.; Zheleznyak, M.; Treebushny, D.; Kolomeev, M.; Kamaev, D.; Wynn, H.

    2003-01-01

    Model predictions for a rapid assessment and prognosis of possible radiological consequences after an accidental release of radionuclides play an important role in nuclear emergency management. Radiological observations, e.g. dose rate measurements, can be used to improve such model predictions. The process of combining model predictions and observations, usually referred to as data assimilation, is described in this article within the framework of the real-time on-line decision support system (RODOS) for off-site nuclear emergency management in Europe. Data assimilation capabilities, based on Kalman filters, are under development for several modules of the RODOS system, including the atmospheric dispersion, deposition, food chain and hydrological models. The use of such a generic data assimilation methodology enables the propagation of uncertainties throughout the various modules of the system. This would in turn provide decision makers with uncertainty estimates taking into account both model and observation errors. This paper describes the methodology employed as well as results of some preliminary studies based on simulated data. (author)