Directory of Open Access Journals (Sweden)
M. J. Werner
2011-02-01
Full Text Available Data assimilation is routinely employed in meteorology, engineering and computer sciences to optimally combine noisy observations with prior model information for obtaining better estimates of a state, and thus better forecasts, than achieved by ignoring data uncertainties. Earthquake forecasting, too, suffers from measurement errors and partial model information and may thus gain significantly from data assimilation. We present perhaps the first fully implementable data assimilation method for earthquake forecasts generated by a point-process model of seismicity. We test the method on a synthetic and pedagogical example of a renewal process observed in noise, which is relevant for the seismic gap hypothesis, models of characteristic earthquakes and recurrence statistics of large quakes inferred from paleoseismic data records. To address the non-Gaussian statistics of earthquakes, we use sequential Monte Carlo methods, a set of flexible simulation-based methods for recursively estimating arbitrary posterior distributions. We perform extensive numerical simulations to demonstrate the feasibility and benefits of forecasting earthquakes based on data assimilation.
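As an aside for readers new to the machinery behind abstracts like the one above: the core of sequential Monte Carlo filtering fits in a few lines. The sketch below is a generic bootstrap particle filter for a toy 1-D random-walk state observed in Gaussian noise; the model, noise levels and particle count are illustrative assumptions, not the authors' implementation.

```python
import math
import random

def bootstrap_particle_filter(observations, n_particles=500,
                              process_std=1.0, obs_std=1.0, seed=0):
    """Minimal bootstrap particle filter for a 1-D random-walk state
    observed in Gaussian noise:
        x_t = x_{t-1} + N(0, process_std^2),  y_t = x_t + N(0, obs_std^2).
    Returns the sequence of posterior-mean estimates of x_t."""
    rng = random.Random(seed)
    particles = [rng.gauss(0.0, 1.0) for _ in range(n_particles)]
    means = []
    for y in observations:
        # Propagate each particle through the transition model.
        particles = [x + rng.gauss(0.0, process_std) for x in particles]
        # Weight particles by the Gaussian observation likelihood.
        weights = [math.exp(-0.5 * ((y - x) / obs_std) ** 2) for x in particles]
        total = sum(weights)
        weights = [w / total for w in weights]
        means.append(sum(w * x for w, x in zip(weights, particles)))
        # Multinomial resampling to fight weight degeneracy.
        particles = rng.choices(particles, weights=weights, k=n_particles)
    return means
```

Fed a constant observation, the filtered mean converges toward it within a few steps, which is the behavior data assimilation exploits.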
Ramgraber, M.; Schirmer, M.
2017-12-01
As computational power grows and wireless sensor networks find their way into common practice, it becomes increasingly feasible to pursue on-line numerical groundwater modelling. The reconciliation of model predictions with sensor measurements often necessitates the application of Sequential Monte Carlo (SMC) techniques, most prominently represented by the Ensemble Kalman Filter. In the pursuit of on-line predictions it seems advantageous to transcend the scope of pure data assimilation and incorporate on-line parameter calibration as well. Unfortunately, the interplay between shifting model parameters and transient states is non-trivial. Several recent publications (e.g. Chopin et al., 2013, Kantas et al., 2015) in the field of statistics discuss potential algorithms addressing this issue. However, most of these are computationally intractable for on-line application. In this study, we investigate to what extent compromises between mathematical rigour and computational restrictions can be made within the framework of on-line numerical modelling of groundwater. Preliminary studies are conducted in a synthetic setting, with the goal of transferring the conclusions drawn into application in a real-world setting. To this end, a wireless sensor network has been established in the valley aquifer around Fehraltorf, characterized by a highly dynamic groundwater system and located about 20 km to the East of Zürich, Switzerland. By providing continuous probabilistic estimates of the state and parameter distribution, a steady base for branched-off predictive scenario modelling could be established, providing water authorities with advanced tools for assessing the impact of groundwater management practices. Chopin, N., Jacob, P.E. and Papaspiliopoulos, O. (2013): SMC2: an efficient algorithm for sequential analysis of state space models. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 75 (3), p. 397-426. Kantas, N., Doucet, A., Singh, S
Multilevel sequential Monte Carlo samplers
Beskos, Alexandros
2016-08-29
In this article we consider the approximation of expectations w.r.t. probability distributions associated to the solution of partial differential equations (PDEs); this scenario appears routinely in Bayesian inverse problems. In practice, one often has to solve the associated PDE numerically, using, for instance, finite element methods which depend on the step-size level h_L. In addition, the expectation cannot be computed analytically and one often resorts to Monte Carlo methods. In the context of this problem, it is known that the introduction of the multilevel Monte Carlo (MLMC) method can reduce the amount of computational effort to estimate expectations, for a given level of error. This is achieved via a telescoping identity associated to a Monte Carlo approximation of a sequence of probability distributions with discretization levels ∞ > h_0 > h_1 > ⋯ > h_L. In many practical problems of interest, one cannot achieve an i.i.d. sampling of the associated sequence and a sequential Monte Carlo (SMC) version of the MLMC method is introduced to deal with this problem. It is shown that, under appropriate assumptions, the attractive property of a reduction of the amount of computational effort to estimate expectations, for a given level of error, can be maintained within the SMC context, relative to exact sampling and Monte Carlo for the distribution at the finest level h_L. The approach is numerically illustrated on a Bayesian inverse problem. © 2016 Elsevier B.V.
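The telescoping identity E[g_L] = E[g_0] + Σ_{l=1..L} E[g_l − g_{l−1}] can be illustrated with a deliberately simple stand-in for PDE discretization levels: below, level l approximates E[exp(Z)], Z ~ N(0,1), by a truncated Taylor series, and each correction couples adjacent levels through a shared sample. This is an illustrative sketch under those toy assumptions, not the paper's estimator.

```python
import math
import random

def mlmc_estimate(max_level=5, n0=4000, seed=1):
    """Toy multilevel Monte Carlo estimator of E[g(Z)], Z ~ N(0,1), where
    the level-l approximation keeps the first level+2 Taylor terms of
    exp(z). Uses the telescoping identity
        E[g_L] = E[g_0] + sum_{l=1..L} E[g_l - g_{l-1}],
    with the *same* draw of Z coupling the two levels in each correction."""
    rng = random.Random(seed)

    def g(z, level):
        # Truncated Taylor series of exp(z): sum_{k=0..level+1} z^k / k!
        return sum(z ** k / math.factorial(k) for k in range(level + 2))

    estimate = 0.0
    for level in range(max_level + 1):
        # Fewer samples at higher (more accurate, more expensive) levels.
        n = max(n0 // 2 ** level, 100)
        if level == 0:
            estimate += sum(g(rng.gauss(0, 1), 0) for _ in range(n)) / n
        else:
            corr = 0.0
            for _ in range(n):
                z = rng.gauss(0, 1)  # shared sample couples the levels
                corr += g(z, level) - g(z, level - 1)
            estimate += corr / n
    return estimate
```

Because the level differences have rapidly shrinking variance, most samples are spent on the cheap coarse level, which is the source of the cost reduction the abstract refers to.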
Merging particle filter for sequential data assimilation
Directory of Open Access Journals (Sweden)
S. Nakano
2007-07-01
Full Text Available A new filtering technique for sequential data assimilation, the merging particle filter (MPF), is proposed. The MPF is devised to avoid the degeneration problem, which is inevitable in the particle filter (PF), without prohibitive computational cost. In addition, it is applicable to cases in which a nonlinear relationship exists between a state and observed data, where the application of the ensemble Kalman filter (EnKF) is not effectual. In the MPF, the filtering procedure is performed based on sampling of a forecast ensemble, as in the PF. However, unlike the PF, each member of a filtered ensemble is generated by merging multiple samples from the forecast ensemble such that the mean and covariance of the filtered distribution are approximately preserved. This merging of multiple samples allows the degeneration problem to be avoided. In the present study, the newly proposed MPF technique is introduced, and its performance is demonstrated experimentally.
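The merging step described above admits a compact sketch. For three merged samples, fixed weights a_1, a_2, a_3 satisfying Σa_i = 1 and Σa_i² = 1 preserve the weighted mean and covariance in expectation; the 1-D implementation below uses the standard three-sample weights and is illustrative rather than the authors' code.

```python
import math
import random

# Merging weights for n = 3: they satisfy
#   a1 + a2 + a3 = 1  and  a1^2 + a2^2 + a3^2 = 1,
# which preserves the weighted mean and covariance in expectation.
A = (3.0 / 4.0, (math.sqrt(13) + 1) / 8.0, -(math.sqrt(13) - 1) / 8.0)

def merge_step(particles, weights, seed=0):
    """One MPF-style update: build each filtered particle as a fixed
    linear combination of three draws from the weighted forecast
    ensemble (1-D sketch of the merging particle filter's resampling)."""
    rng = random.Random(seed)
    merged = []
    for _ in range(len(particles)):
        draws = rng.choices(particles, weights=weights, k=3)
        merged.append(sum(a * x for a, x in zip(A, draws)))
    return merged
```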
Multilevel sequential Monte-Carlo samplers
Jasra, Ajay
2016-01-05
Multilevel Monte-Carlo methods provide a powerful technique for reducing the computational cost of estimating expectations to a given accuracy. They are particularly relevant for computational problems in which approximate distributions are determined via a resolution parameter h, with h=0 giving the theoretical exact distribution (e.g. SDEs or inverse problems with PDEs). The method provides a benefit by coupling samples from successive resolutions and estimating differences of successive expectations. We develop a methodology that brings Sequential Monte-Carlo (SMC) algorithms within the framework of the Multilevel idea, as SMC provides a natural set-up for coupling samples over different resolutions. We prove that the new algorithm indeed preserves the benefits of the multilevel principle, even though samples at all resolutions are now correlated.
Fast sequential Monte Carlo methods for counting and optimization
Rubinstein, Reuven Y; Vaisman, Radislav
2013-01-01
A comprehensive account of the theory and application of Monte Carlo methods Based on years of research in efficient Monte Carlo methods for estimation of rare-event probabilities, counting problems, and combinatorial optimization, Fast Sequential Monte Carlo Methods for Counting and Optimization is a complete illustration of fast sequential Monte Carlo techniques. The book provides an accessible overview of current work in the field of Monte Carlo methods, specifically sequential Monte Carlo techniques, for solving abstract counting and optimization problems. Written by authorities in the
Sequential effects in preference decision: Prior preference assimilates current preference.
Directory of Open Access Journals (Sweden)
Seah Chang
Full Text Available An important factor affecting preference formation is the context in which that preference decision takes place. The current research examined whether one's preference formed for a previously presented stimulus influences the processing of a subsequent preference decision, henceforth referred to as the preference sequence effect. Using a novel sequential rating/judgment paradigm, the present study demonstrated the presence of a preference sequence effect using artistic photographs and face stimuli: A neutral stimulus was preferred more following a preferable stimulus than a less preferable stimulus. Furthermore, a similar trend was found even when the potential influence of response bias was controlled. These results suggest that an assimilative sequential effect exists even when sequential judgments are made solely based on one's subjective feeling; preference formed for a preceding stimulus modulates preference for a subsequent stimulus. This implies the need for a consideration of trial sequence as a factor creating a psychological context affecting the subsequent preference decisions.
Directory of Open Access Journals (Sweden)
S. J. Noh
2011-10-01
Full Text Available Data assimilation techniques have received growing attention due to their capability to improve prediction. Among various data assimilation techniques, sequential Monte Carlo (SMC) methods, known as "particle filters", constitute a Bayesian learning process with the capability to handle non-linear and non-Gaussian state-space models. In this paper, we propose an improved particle filtering approach to consider different response times of internal state variables in a hydrologic model. The proposed method adopts a lagged filtering approach to aggregate model response until the uncertainty of each hydrologic process is propagated. Regularization with an additional move step based on Markov chain Monte Carlo (MCMC) methods is also implemented to preserve sample diversity under the lagged filtering approach. A distributed hydrologic model, water and energy transfer processes (WEP), is implemented for the sequential data assimilation through the updating of state variables. The lagged regularized particle filter (LRPF) and the sequential importance resampling (SIR) particle filter are implemented for hindcasting of streamflow at the Katsura catchment, Japan. Control state variables for filtering are soil moisture content and overland flow. Streamflow measurements are used for data assimilation. LRPF shows consistent forecasts regardless of the process noise assumption, while SIR has different values of optimal process noise and shows sensitive variation of confidence intervals depending on the process noise. Improvement of LRPF forecasts compared to SIR is particularly found for rapidly varied high flows, due to preservation of sample diversity from the kernel, even if particle impoverishment takes place.
Optimization of sequential decisions by least squares Monte Carlo method
DEFF Research Database (Denmark)
Nishijima, Kazuyoshi; Anders, Annett
The present paper considers the sequential decision optimization problem. This is an important class of decision problems in engineering. Important examples include decision problems on the quality control of manufactured products and engineering components, the timing of the implementation of climate change adaptation measures, and the evacuation of people and assets in the face of an emerging natural hazard event. Focusing on the last example, an efficient solution scheme is proposed by Anders and Nishijima (2011). The proposed solution scheme takes basis in the least squares Monte Carlo method, which ... To demonstrate its use and advantages, two numerical examples are provided, one of which concerns the quality control of manufactured products.
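The least squares Monte Carlo idea (regress realized future value on the current state to approximate the continuation value, then stop when the immediate reward beats it) can be sketched on a toy two-stage stopping problem. The model and the deliberately crude linear regression basis below are illustrative assumptions, not the scheme of Anders and Nishijima.

```python
import random

def lsmc_two_stage(n_paths=20000, seed=3):
    """Minimal least-squares Monte Carlo sketch for a two-stage stopping
    problem: after observing x ~ N(0,1) we either stop for reward
    max(x, 0) or continue for reward max(x + eps, 0), eps ~ N(0,1).
    The continuation value is approximated by regressing the realized
    stage-2 reward on x (a deliberately crude linear basis).
    Returns the estimated value of the resulting stopping policy."""
    rng = random.Random(seed)
    xs = [rng.gauss(0.0, 1.0) for _ in range(n_paths)]
    stage2 = [max(x + rng.gauss(0.0, 1.0), 0.0) for x in xs]

    # Least-squares fit: continuation(x) ~= b0 + b1 * x.
    mx = sum(xs) / n_paths
    my = sum(stage2) / n_paths
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, stage2)) / n_paths
    var = sum((x - mx) ** 2 for x in xs) / n_paths
    b1 = cov / var
    b0 = my - b1 * mx

    # Policy: stop when the immediate reward beats the fitted continuation.
    value = 0.0
    for x, y in zip(xs, stage2):
        immediate = max(x, 0.0)
        value += immediate if immediate >= b0 + b1 * x else y
    return value / n_paths
```

The same regression trick scales to many decision stages by working backwards in time, which is what makes the approach attractive for the sequential problems listed above.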
Evolutionary Sequential Monte Carlo Samplers for Change-Point Models
Directory of Open Access Journals (Sweden)
Arnaud Dufays
2016-03-01
Full Text Available Sequential Monte Carlo (SMC) methods are widely used for non-linear filtering purposes. However, the SMC scope encompasses wider applications, such as estimating static model parameters, to the extent that it is becoming a serious alternative to Markov-Chain Monte-Carlo (MCMC) methods. Not only do SMC algorithms draw posterior distributions of static or dynamic parameters, but additionally they provide an estimate of the marginal likelihood. The tempered and time (TNT) algorithm, developed in this paper, combines (off-line) tempered SMC inference with on-line SMC inference for drawing realizations from many sequential posterior distributions without experiencing a particle degeneracy problem. Furthermore, it introduces a new MCMC rejuvenation step that is generic, automated and well-suited for multi-modal distributions. As this update relies on the wide heuristic optimization literature, numerous extensions are readily available. The algorithm is notably appropriate for estimating change-point models. As an example, we compare several change-point GARCH models through their marginal log-likelihoods over time.
Vrugt, Jasper A.; ter Braak, Cajo J. F.; Diks, Cees G. H.; Schoups, Gerrit
2013-01-01
During the past decades, much progress has been made in the development of computer-based methods for parameter and predictive uncertainty estimation of hydrologic models. The goal of this paper is twofold. As part of this special anniversary issue, we first briefly review the most important historical developments in hydrologic model calibration and uncertainty analysis that have led to current perspectives. Then, we introduce the theory, concepts and simulation results of a novel data assimilation scheme for joint inference of model parameters and state variables. This Particle-DREAM method combines the strengths of sequential Monte Carlo sampling and Markov chain Monte Carlo simulation and is especially designed for treatment of forcing, parameter, model structural and calibration data error. Two different variants of Particle-DREAM are presented to satisfy assumptions regarding the temporal behavior of the model parameters. Simulation results using a 40-dimensional atmospheric “toy” model, the Lorenz attractor and a rainfall-runoff model show that the two variants, P-DREAM(VP) and P-DREAM(IP), require far fewer particles than current state-of-the-art filters to closely track the evolving target distribution of interest, and provide important insights into the information content of discharge data and the non-stationarity of model parameters. Our development follows formal Bayes, yet Particle-DREAM and its variants readily accommodate hydrologic signatures, informal likelihood functions or other (in)sufficient statistics if those better represent the salient features of the calibration data and simulation model used.
Multiparameter estimation along quantum trajectories with sequential Monte Carlo methods
Ralph, Jason F.; Maskell, Simon; Jacobs, Kurt
2017-11-01
This paper proposes an efficient method for the simultaneous estimation of the state of a quantum system and the classical parameters that govern its evolution. This hybrid approach benefits from efficient numerical methods for the integration of stochastic master equations for the quantum system, and efficient parameter estimation methods from classical signal processing. The classical techniques use sequential Monte Carlo (SMC) methods, which aim to optimize the selection of points within the parameter space, conditioned by the measurement data obtained. We illustrate these methods using a specific example, an SMC sampler applied to a nonlinear system, the Duffing oscillator, where the evolution of the quantum state of the oscillator and three Hamiltonian parameters are estimated simultaneously.
Directory of Open Access Journals (Sweden)
S. Skachko
2008-12-01
Full Text Available This study focuses on an accurate estimation of ocean circulation via assimilation of satellite measurements of ocean dynamical topography into the global finite-element ocean model (FEOM). The dynamical topography data are derived from a complex analysis of multi-mission altimetry data combined with a referenced earth geoid. The assimilation is split into two parts. First, the mean dynamic topography is adjusted. To this end an adiabatic pressure correction method is used which reduces model divergence from the real evolution. Second, a sequential assimilation technique is applied to improve the representation of thermodynamical processes by assimilating the time varying dynamic topography. A method is used according to which the temperature and salinity are updated following the vertical structure of the first baroclinic mode. It is shown that the method leads to a partially successful assimilation approach reducing the rms difference between the model and data from 16 cm to 2 cm. This improvement of the mean state is accompanied by significant improvement of temporal variability in our analysis. However, it remains suboptimal, showing a tendency in the forecast phase of returning toward a free run without data assimilation. Both the mean difference and standard deviation of the difference between the forecast and observation data are reduced as the result of assimilation.
Online Bayesian phylogenetic inference: theoretical foundations via Sequential Monte Carlo.
Dinh, Vu; Darling, Aaron E; Matsen Iv, Frederick A
2017-12-13
Phylogenetics, the inference of evolutionary trees from molecular sequence data such as DNA, is an enterprise that yields valuable evolutionary understanding of many biological systems. Bayesian phylogenetic algorithms, which approximate a posterior distribution on trees, have become a popular if computationally expensive means of doing phylogenetics. Modern data collection technologies are quickly adding new sequences to already substantial databases. With all current techniques for Bayesian phylogenetics, computation must start anew each time a sequence becomes available, making it costly to maintain an up-to-date estimate of a phylogenetic posterior. These considerations highlight the need for an online Bayesian phylogenetic method which can update an existing posterior with new sequences. Here we provide theoretical results on the consistency and stability of methods for online Bayesian phylogenetic inference based on Sequential Monte Carlo (SMC) and Markov chain Monte Carlo (MCMC). We first show a consistency result, demonstrating that the method samples from the correct distribution in the limit of a large number of particles. Next we derive the first reported set of bounds on how phylogenetic likelihood surfaces change when new sequences are added. These bounds enable us to characterize the theoretical performance of sampling algorithms by bounding the effective sample size (ESS) with a given number of particles from below. We show that the ESS is guaranteed to grow linearly as the number of particles in an SMC sampler grows. Surprisingly, this result holds even though the dimensions of the phylogenetic model grow with each new added sequence. © The Author(s) 2017. Published by Oxford University Press, on behalf of the Society of Systematic Biologists.
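The effective sample size (ESS) bounded in the abstract above has a standard estimator for weighted particle sets, shown below as a short generic helper (not code from the paper):

```python
def effective_sample_size(weights):
    """Effective sample size of a weighted particle set:
        ESS = (sum w)^2 / sum(w^2).
    Equals N for uniform weights and 1 when a single particle
    carries all of the mass."""
    s = sum(weights)
    return s * s / sum(w * w for w in weights)
```

In SMC practice, a drop of the ESS below some fraction of N (say N/2) is the usual trigger for a resampling step.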
Khaki, M.
2017-07-06
The time-variable terrestrial water storage (TWS) products from the Gravity Recovery And Climate Experiment (GRACE) have been increasingly used in recent years to improve the simulation of hydrological models by applying data assimilation techniques. In this study, for the first time, we assess the performance of the most popular sequential data assimilation techniques for integrating GRACE TWS into the World-Wide Water Resources Assessment (W3RA) model. We implement and test stochastic and deterministic ensemble-based Kalman filters (EnKF), as well as particle filters (PF) using two different resampling approaches, multinomial resampling and systematic resampling. These choices provide various opportunities for weighting observations and model simulations during the assimilation and also for accounting for error distributions. In particular, the deterministic EnKF is tested to avoid perturbing observations before assimilation (as is the case in an ordinary EnKF). Gaussian-based random updates in the EnKF approaches likely do not fully represent the statistical properties of the model simulations and TWS observations. Therefore, the fully non-Gaussian PF is also applied to estimate more realistic updates. Monthly GRACE TWS data are assimilated into W3RA over the entire Australian continent. To evaluate the filters' performance and analyze their impact on model simulations, their estimates are validated against independent in-situ measurements. Our results indicate that all implemented filters improve the estimation of the water storage simulations of W3RA. The best results are obtained using two versions of the deterministic EnKF, the Square Root Analysis (SQRA) scheme and the Ensemble Square Root Filter (EnSRF), which reduce the model's groundwater estimation errors by 34% and 31%, respectively, compared to a model run without assimilation. Applying the PF with systematic resampling successfully decreases the model estimation error by 23%.
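Systematic resampling, one of the two resampling schemes compared above, can be sketched in a few lines. This is a generic textbook version, not the study's code:

```python
import random

def systematic_resampling(weights, seed=0):
    """Systematic resampling: one uniform draw u ~ U(0, 1/N) generates
    the evenly spaced points u + i/N, i = 0..N-1, which are mapped
    through the cumulative normalized weights. Lower variance than
    multinomial resampling: every particle with weight >= 1/N is
    selected at least once."""
    n = len(weights)
    total = sum(weights)
    u = random.Random(seed).uniform(0.0, 1.0 / n)
    indices, cumulative, j = [], weights[0] / total, 0
    for i in range(n):
        target = u + i / n
        while cumulative < target:
            j += 1
            cumulative += weights[j] / total
        indices.append(j)
    return indices
```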
Probabilistic In Situ Stress Estimation and Forecasting using Sequential Data Assimilation
Fichtner, A.; van Dinther, Y.; Kuensch, H. R.
2017-12-01
Our physical understanding and forecasting ability of earthquakes, and other solid Earth dynamic processes, are significantly hampered by limited indications of the evolving state of stress and strength on faults. Integrating observations and physics-based numerical modeling to quantitatively estimate the evolution of a fault's state is crucial. However, systematic attempts are limited and tenuous, especially in light of the scarcity and uncertainty of natural data and the difficulty of modelling the physics governing earthquakes. We adopt the statistical framework of sequential data assimilation, extensively developed for weather forecasting, to efficiently integrate observations and prior knowledge in a forward model, while acknowledging errors in both. To prove this concept we perform a perfect model test in a simplified subduction zone setup, in which we assimilate synthetic noisy data on velocities and stresses from a single location. Using an Ensemble Kalman Filter, these data and their errors are assimilated to update 150 ensemble members from a Partial Differential Equation-driven seismic cycle model. Probabilistic estimates of fault stress and dynamic strength evolution capture the truth exceptionally well. This is possible because the sampled error covariance matrix contains prior information from the physics that relates velocities, stresses and pressure at the surface to those at the fault. During the analysis step, stress and strength distributions are thus reconstructed such that fault coupling can be updated to either inhibit or trigger events. In the subsequent forecast step, the physical equations are solved to propagate the updated states forward in time and thus provide probabilistic information on the occurrence of the next event. At subsequent assimilation steps, the system's forecasting ability turns out to be significantly better than that of a periodic recurrence model (requiring an alarm 17% vs. 68% of the time). This thus provides distinct
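The Ensemble Kalman Filter analysis step used above can be illustrated in its simplest scalar, stochastic form. This is a textbook sketch, not the seismic-cycle implementation, and the scalar direct-observation model is an assumption made for brevity:

```python
import random
import statistics

def enkf_update(ensemble, observation, obs_std, seed=0):
    """Stochastic Ensemble Kalman Filter analysis step for a scalar
    state observed directly. The gain K = P / (P + R) uses the forecast
    covariance P sampled from the ensemble; each member assimilates an
    independently perturbed copy of the observation."""
    rng = random.Random(seed)
    p = statistics.variance(ensemble)  # ensemble estimate of forecast variance
    r = obs_std ** 2
    k = p / (p + r)                    # Kalman gain
    return [x + k * (observation + rng.gauss(0.0, obs_std) - x)
            for x in ensemble]
```

In the multivariate case the same formula, applied with the sampled cross-covariances, is what lets surface observations update unobserved fault variables, as the abstract describes.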
A Hybrid Monte Carlo Sampling Filter for Non-Gaussian Data Assimilation
Directory of Open Access Journals (Sweden)
Adrian Sandu
2015-12-01
Full Text Available Data assimilation combines information from models, measurements, and priors to obtain improved estimates of the state of a dynamical system such as the atmosphere. Ensemble-based data assimilation approaches such as the Ensemble Kalman filter (EnKF) have gained wide popularity due to their simple formulation, ease of implementation, and good practical results. Many of these methods are derived under the assumption that the underlying probability distributions are Gaussian. It is well accepted, however, that the Gaussianity assumption is too restrictive when applied to large nonlinear models, nonlinear observation operators, and large levels of uncertainty. When the Gaussianity assumptions are severely violated, the performance of EnKF variants degrades. This paper proposes a new ensemble-based data assimilation method, named the sampling filter, which obtains the analysis by sampling directly from the posterior distribution. The sampling strategy is based on a Hybrid Monte Carlo (HMC) approach that can handle non-Gaussian probability distributions. Numerical experiments are carried out using the Lorenz-96 model and observation operators with different levels of non-linearity and differentiability. The proposed filter is also tested with a shallow water model on a sphere with a linear observation operator. Numerical results show that the sampling filter performs well even in highly nonlinear situations where the traditional filters diverge.
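The Hybrid Monte Carlo machinery at the core of the proposed sampling filter can be sketched for a 1-D target. This is a generic HMC sampler, not the paper's filter; the step size and trajectory length are arbitrary choices:

```python
import math
import random

def hmc_sample(log_prob_grad, x0, n_samples=2000,
               step=0.2, n_leapfrog=10, seed=0):
    """Minimal 1-D Hybrid (Hamiltonian) Monte Carlo sampler.
    `log_prob_grad(x)` returns (log p(x), d/dx log p(x)). Each proposal
    simulates Hamiltonian dynamics with the leapfrog integrator; a
    Metropolis accept/reject step corrects the integration error."""
    rng = random.Random(seed)
    x = x0
    logp, grad = log_prob_grad(x)
    samples = []
    for _ in range(n_samples):
        p = rng.gauss(0.0, 1.0)             # fresh momentum
        h_old = -logp + 0.5 * p * p         # total energy (Hamiltonian)
        xn, logpn, gn = x, logp, grad
        p += 0.5 * step * gn                # half momentum kick
        for i in range(n_leapfrog):
            xn += step * p                  # full position drift
            logpn, gn = log_prob_grad(xn)
            if i < n_leapfrog - 1:
                p += step * gn              # full momentum kick
        p += 0.5 * step * gn                # final half kick
        h_new = -logpn + 0.5 * p * p
        if rng.random() < math.exp(min(0.0, h_old - h_new)):
            x, logp, grad = xn, logpn, gn   # accept the proposal
        samples.append(x)
    return samples
```

Because only the log-density and its gradient are needed, nothing in this scheme assumes Gaussianity, which is exactly the property the sampling filter exploits.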
Directory of Open Access Journals (Sweden)
Yan Zhan
2017-12-01
Full Text Available Quantifying the eruption potential of a restless volcano requires the ability to model parameters such as overpressure and to calculate the host rock stress state as the system evolves. A critical challenge is developing a model-data fusion framework to take advantage of observational data and provide updates of the volcanic system through time. The Ensemble Kalman Filter (EnKF) uses a Monte Carlo approach to assimilate volcanic monitoring data and update models of volcanic unrest, providing time-varying estimates of overpressure and stress. Although the EnKF has proven effective for forecasting volcanic deformation using synthetic InSAR and GPS data, until now it has not been applied to assimilate data from an active volcanic system. In this investigation, the EnKF is used to provide a "hindcast" of the 2009 explosive eruption of Kerinci volcano, Indonesia. A two-source analytical model is used to simulate the surface deformation of Kerinci volcano observed in InSAR time-series data and to predict the system's evolution. A deep, deflating dike-like source reproduces the subsiding signal on the flanks of the volcano, and a shallow spherical McTigue source reproduces the central uplift. EnKF-predicted parameters are used in finite element models to calculate the host-rock stress state prior to the 2009 eruption. Mohr-Coulomb failure models reveal that the host rock around the shallow magma reservoir was trending toward tensile failure prior to 2009, which may have been the catalyst for the 2009 eruption. Our results illustrate that the EnKF shows significant promise for future applications to forecasting the eruption potential of restless volcanoes and hindcasting the triggering mechanisms of observed eruptions.
Monte Carlo Tree Search for Continuous and Stochastic Sequential Decision Making Problems
International Nuclear Information System (INIS)
Couetoux, Adrien
2013-01-01
In this thesis, I studied sequential decision making problems, with a focus on the unit commitment problem. Traditionally solved by dynamic programming methods, this problem is still a challenge, due to its high dimension and to the sacrifices made in the accuracy of the model in order to apply state-of-the-art methods. I investigated the applicability of Monte Carlo Tree Search methods to this problem and to other single-player, stochastic, continuous sequential decision making problems. In doing so, I obtained a consistent and anytime algorithm that can easily be combined with existing strong heuristic solvers. (author)
Simulation based sequential Monte Carlo methods for discretely observed Markov processes
Neal, Peter
2014-01-01
Parameter estimation for discretely observed Markov processes is a challenging problem. However, simulation of Markov processes is straightforward using the Gillespie algorithm. We exploit this ease of simulation to develop an effective sequential Monte Carlo (SMC) algorithm for obtaining samples from the posterior distribution of the parameters. In particular, we introduce two key innovations, coupled simulations, which allow us to study multiple parameter values on the basis of a single sim...
A reduced order model based on Kalman filtering for sequential data assimilation of turbulent flows
Meldi, M.; Poux, A.
2017-10-01
A Kalman filter based sequential estimator is presented in this work. The estimator is integrated into the structure of segregated solvers for the analysis of incompressible flows. This technique provides an augmented flow state integrating available observations into the CFD model, naturally preserving a zero-divergence condition for the velocity field. Because of the prohibitive costs associated with a complete Kalman filter application, two model reduction strategies have been proposed and assessed. These strategies dramatically reduce the increase in computational costs of the model, which can be quantified as an increase of 10-15% with respect to the classical numerical simulation. In addition, an extended analysis of the behavior of the numerical model covariance Q has been performed. Optimized values are strongly linked to the truncation error of the discretization procedure. The estimator has been applied to the analysis of a number of test cases exhibiting increasing complexity, including turbulent flow configurations. The results show that the augmented flow successfully improves the prediction of the physical quantities investigated, even when the observation is provided in a limited region of the physical domain. In addition, the present work suggests that these data assimilation techniques, which are at an embryonic stage of development in CFD, may have the potential to be pushed even further, using the augmented prediction as a powerful tool for the optimization of the free parameters in the numerical simulation.
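The predict/update cycle that such reduced-order estimators approximate is the textbook Kalman filter. A minimal scalar random-walk version (purely illustrative, not the CFD estimator) is:

```python
def kalman_filter_1d(observations, q, r, x0=0.0, p0=1.0):
    """Textbook scalar Kalman filter for a random-walk state:
    predict with process-noise variance q, then update with
    observation-noise variance r. Returns the filtered estimates."""
    x, p = x0, p0
    estimates = []
    for y in observations:
        p = p + q               # predict (state is a random walk)
        k = p / (p + r)         # Kalman gain
        x = x + k * (y - x)     # update with the innovation y - x
        p = (1.0 - k) * p       # posterior variance
        estimates.append(x)
    return estimates
```

The model covariance q plays the role of the matrix Q analyzed in the abstract: it controls how strongly the estimator trusts the observations relative to the model.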
Sequential Monte Carlo for inference of latent ARMA time-series with innovations correlated in time
Urteaga, Iñigo; Bugallo, Mónica F.; Djurić, Petar M.
2017-12-01
We consider the problem of sequential inference of latent time-series with innovations correlated in time and observed via nonlinear functions. We accommodate time-varying phenomena with diverse properties by means of a flexible mathematical representation of the data. We characterize statistically such time-series by a Bayesian analysis of their densities. The density that describes the transition of the state from time t to the next time instant t+1 is used for implementation of novel sequential Monte Carlo (SMC) methods. We present a set of SMC methods for inference of latent ARMA time-series with innovations correlated in time for different assumptions in knowledge of parameters. The methods operate in a unified and consistent manner for data with diverse memory properties. We show the validity of the proposed approach by comprehensive simulations of the challenging stochastic volatility model.
Nussbaumer, Raphaël; Gloaguen, Erwan; Mariéthoz, Grégoire; Holliger, Klaus
2016-04-01
Bayesian sequential simulation (BSS) is a powerful geostatistical technique, which notably has shown significant potential for the assimilation of datasets that are diverse with regard to the spatial resolution and their relationship. However, these types of applications of BSS require a large number of realizations to adequately explore the solution space and to assess the corresponding uncertainties. Moreover, such simulations generally need to be performed on very fine grids in order to adequately exploit the technique's potential for characterizing heterogeneous environments. Correspondingly, the computational cost of BSS algorithms in their classical form is very high, which so far has limited an effective application of this method to large models and/or vast datasets. In this context, it is also important to note that the inherent assumption regarding the independence of the considered datasets is generally regarded as being too strong in the context of sequential simulation. To alleviate these problems, we have revisited the classical implementation of BSS and incorporated two key features to increase the computational efficiency. The first feature is a combined quadrant spiral - superblock search, which targets run-time savings on large grids and adds flexibility with regard to the selection of neighboring points using equal directional sampling and treating hard data and previously simulated points separately. The second feature is a constant path of simulation, which enhances the efficiency for multiple realizations. We have also modified the aggregation operator to be more flexible with regard to the assumption of independence of the considered datasets. This is achieved through log-linear pooling, which essentially allows for attributing weights to the various data components. Finally, a multi-grid simulating path was created to enforce large-scale variance and to allow for adapting parameters, such as, for example, the log-linear weights or the type
Zhu, Gaofeng; Li, Xin; Ma, Jinzhu; Wang, Yunquan; Liu, Shaomin; Huang, Chunlin; Zhang, Kun; Hu, Xiaoli
2018-04-01
Sequential Monte Carlo (SMC) samplers have become increasingly popular for estimating posterior parameter distributions with the non-linear dependency structures and multiple modes often present in hydrological models. However, the explorative capabilities and efficiency of the sampler depend strongly on the efficiency of the move step of the SMC sampler. In this paper we present a new SMC sampler, the Particle Evolution Metropolis Sequential Monte Carlo (PEM-SMC) algorithm, which is well suited to handle unknown static parameters of a hydrologic model. The PEM-SMC sampler is inspired by the work of Liang and Wong (2001) and operates by incorporating the strengths of the genetic algorithm, the differential evolution algorithm and the Metropolis-Hastings algorithm into the framework of SMC. We also prove that the sampler admits the target distribution as a stationary distribution. Two case studies, a multi-dimensional bimodal normal distribution and a conceptual rainfall-runoff hydrologic model (first considering only parameter uncertainty and then considering parameter and input uncertainty simultaneously), show that the PEM-SMC sampler is generally superior to other popular SMC algorithms in handling high-dimensional problems. The study also indicates that it may be important in future work to account for model structural uncertainty by using multiple different hydrological models within the SMC framework.
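A sketch of the kind of move step PEM-SMC builds on: a differential-evolution proposal corrected by a Metropolis-Hastings acceptance, so the target remains stationary. This is an illustrative reconstruction on a toy bimodal target, not the published algorithm; the target, `gamma`, and the noise scale are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_target(x):
    # Toy bimodal target: equal mixture of N(-2, 0.5^2) and N(2, 0.5^2)
    return np.logaddexp(-0.5 * ((x + 2) / 0.5) ** 2,
                        -0.5 * ((x - 2) / 0.5) ** 2)

def de_metropolis_move(particles, gamma=0.8):
    """One differential-evolution Metropolis move applied to each particle.

    Proposal: x' = x + gamma * (x_a - x_b) + small noise, with a, b two
    other randomly chosen particles; accepted with the usual MH ratio.
    """
    n = len(particles)
    new = particles.copy()
    for i in range(n):
        a, b = rng.choice([j for j in range(n) if j != i], size=2, replace=False)
        prop = particles[i] + gamma * (particles[a] - particles[b]) \
               + 1e-3 * rng.standard_normal()
        if np.log(rng.random()) < log_target(prop) - log_target(particles[i]):
            new[i] = prop
    return new

particles = rng.standard_normal(200) * 3.0   # over-dispersed initial ensemble
for _ in range(50):
    particles = de_metropolis_move(particles)
# The ensemble should now populate both modes near -2 and +2
```

The ensemble-difference proposal lets particles jump between distant modes, which is exactly where plain random-walk Metropolis moves struggle.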
Energy Technology Data Exchange (ETDEWEB)
Jennings, E.; Madigan, M.
2017-04-01
Given the complexity of modern cosmological parameter inference where we are faced with non-Gaussian data and noise, correlated systematics and multi-probe correlated data sets, the Approximate Bayesian Computation (ABC) method is a promising alternative to traditional Markov Chain Monte Carlo approaches in the case where the Likelihood is intractable or unknown. The ABC method is called "Likelihood free" as it avoids explicit evaluation of the Likelihood by using a forward model simulation of the data which can include systematics. We introduce astroABC, an open source ABC Sequential Monte Carlo (SMC) sampler for parameter estimation. A key challenge in astrophysics is the efficient use of large multi-probe datasets to constrain high dimensional, possibly correlated parameter spaces. With this in mind astroABC allows for massive parallelization using MPI, a framework that handles spawning of jobs across multiple nodes. A key new feature of astroABC is the ability to create MPI groups with different communicators, one for the sampler and several others for the forward model simulation, which speeds up sampling time considerably. For smaller jobs the Python multiprocessing option is also available. Other key features include: a Sequential Monte Carlo sampler, a method for iteratively adapting tolerance levels, local covariance estimate using scikit-learn's KDTree, modules for specifying optimal covariance matrix for a component-wise or multivariate normal perturbation kernel, output and restart files are backed up every iteration, user defined metric and simulation methods, a module for specifying heterogeneous parameter priors including non-standard prior PDFs, a module for specifying a constant, linear, log or exponential tolerance level, well-documented examples and sample scripts. This code is hosted online at https://github.com/EliseJ/astroABC
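The ABC-SMC idea can be sketched minimally: sample parameters, simulate data forward, and keep only draws whose simulated summary lies within a shrinking tolerance of the observed one. This toy (inferring a Gaussian mean, with an equal-weight simplification in place of full importance reweighting) is an assumption-laden sketch, not astroABC's API.

```python
import numpy as np

rng = np.random.default_rng(1)
obs = rng.normal(3.0, 1.0, size=100)           # "observed" data, true mean = 3

def simulate(mu):                               # forward model; no likelihood needed
    return rng.normal(mu, 1.0, size=100)

def distance(sim):                              # summary-statistic distance
    return abs(sim.mean() - obs.mean())

n, tolerances = 300, [1.0, 0.5, 0.2, 0.1]       # shrinking tolerance schedule
particles = rng.uniform(-10, 10, size=n)        # draws from a flat prior
weights = np.full(n, 1.0 / n)

for eps in tolerances:
    new_particles = np.empty(n)
    for i in range(n):
        while True:
            theta = rng.choice(particles, p=weights)     # resample a particle
            theta = theta + rng.normal(0.0, 0.5)         # Gaussian perturbation kernel
            if -10 <= theta <= 10 and distance(simulate(theta)) < eps:
                new_particles[i] = theta
                break
    # Simplification: with a flat prior and symmetric kernel we keep
    # equal weights; a full ABC-SMC recomputes importance weights here.
    particles, weights = new_particles, np.full(n, 1.0 / n)

posterior_mean = particles.mean()               # should be close to 3
```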
A sequential Monte Carlo model of the combined GB gas and electricity network
International Nuclear Information System (INIS)
Chaudry, Modassar; Wu, Jianzhong; Jenkins, Nick
2013-01-01
A Monte Carlo model of the combined GB gas and electricity network was developed to determine the reliability of the energy infrastructure. The model integrates the gas and electricity networks into a single sequential Monte Carlo simulation. The model minimises the combined costs of the gas and electricity network; these include gas supplies, gas storage operation and electricity generation. The Monte Carlo model calculates reliability indices such as loss of load probability and expected energy unserved for the combined gas and electricity network. The intention of this tool is to facilitate reliability analysis of integrated energy systems. Applications of this tool are demonstrated through a case study that quantifies the impact on the reliability of the GB gas and electricity network of uncertainties such as wind variability, gas supply availability and outages to energy infrastructure assets. Analysis is performed over a typical midwinter week on a hypothesised GB gas and electricity network in 2020 that meets European renewable energy targets. The efficacy of doubling GB gas storage capacity on the reliability of the energy system is assessed. The results highlight the value of greater gas storage facilities in enhancing the reliability of the GB energy system under various energy uncertainties. -- Highlights: •A Monte Carlo model of the combined GB gas and electricity network was developed. •Reliability indices are calculated for the combined GB gas and electricity system. •The efficacy of doubling GB gas storage capacity on reliability of the energy system is assessed. •Integrated reliability indices could be used to assess the impact of investment in energy assets.
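The reliability indices named above (loss-of-load probability, expected energy unserved) can be estimated with a toy Monte Carlo sketch. The unit capacities, outage rates, and demand profile are invented for illustration, and outages are redrawn each hour for brevity; a full sequential model, like the paper's, would sample chronological up/down durations and include the gas side.

```python
import numpy as np

rng = np.random.default_rng(2)

capacity = np.array([500.0, 500.0, 300.0, 200.0])        # MW per unit (illustrative)
forced_outage_rate = np.array([0.05, 0.05, 0.08, 0.10])  # per-hour unavailability
hours = np.linspace(0, 2 * np.pi, 168)
demand = 1100.0 + 200.0 * np.sin(hours)                  # one synthetic week, MW

n_weeks, loss_hours, energy_unserved = 5000, 0, 0.0
for _ in range(n_weeks):
    up = rng.random((168, capacity.size)) > forced_outage_rate  # unit availability
    available = up @ capacity                                   # MW available each hour
    shortfall = np.maximum(demand - available, 0.0)
    loss_hours += int((shortfall > 0).sum())
    energy_unserved += shortfall.sum()                          # MWh with 1 h steps

lolp = loss_hours / (n_weeks * 168)     # loss-of-load probability
eens = energy_unserved / n_weeks        # expected energy unserved per week, MWh
```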
Sequential Monte Carlo Localization Methods in Mobile Wireless Sensor Networks: A Review
Directory of Open Access Journals (Sweden)
Ammar M. A. Abu Znaid
2017-01-01
Full Text Available The advancement of digital technology has increased the deployment of wireless sensor networks (WSNs) in our daily life. However, locating sensor nodes is a challenging task in WSNs. Sensed data without an accurate location are worthless, especially in critical applications. The pioneering technique among range-free localization schemes is the sequential Monte Carlo (SMC) method, which utilizes network connectivity to estimate sensor location without additional hardware. This study presents a comprehensive survey of state-of-the-art SMC localization schemes. We present the schemes as a thematic taxonomy of localization operation in SMC. Moreover, the critical characteristics of each existing scheme are analyzed to identify its advantages and disadvantages. The similarities and differences of the schemes are investigated on the basis of significant parameters, namely localization accuracy, computational cost, communication cost, and number of samples. Finally, we discuss the challenges and directions of future research for each parameter.
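A minimal sketch of the range-free SMC localization idea the survey covers, in the spirit of the original Monte Carlo Localization scheme: predict samples within a maximum-velocity disc, then filter out samples inconsistent with which anchors are heard. Anchor positions, radio range, and speed are invented, and a single connectivity observation only localizes coarsely.

```python
import numpy as np

rng = np.random.default_rng(3)

R, V_MAX = 6.0, 2.0                         # radio range, max speed (illustrative)
anchors = np.array([[0.0, 0.0], [8.0, 0.0], [0.0, 8.0]])
true_pos = np.array([3.0, 2.0])

# Range-free observation: which anchors the node currently hears
heard = np.linalg.norm(anchors - true_pos, axis=1) <= R

def smc_step(samples):
    """Predict within the max-velocity disc, keep connectivity-consistent samples."""
    kept = []
    while len(kept) < len(samples):
        angle = rng.uniform(0, 2 * np.pi, len(samples))
        step = rng.uniform(0, V_MAX, len(samples))
        cand = samples + np.c_[step * np.cos(angle), step * np.sin(angle)]
        d = np.linalg.norm(cand[:, None, :] - anchors[None, :, :], axis=2)
        ok = ((d <= R) == heard).all(axis=1)     # filtering: match the observation
        kept.extend(cand[ok])
    return np.array(kept[: len(samples)])

samples = rng.uniform(-6.0, 10.0, size=(400, 2))   # initial guess: anywhere in area
for _ in range(5):
    samples = smc_step(samples)
estimate = samples.mean(axis=0)                     # coarse location estimate
```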
Ham, Yoo-Geun; Song, Hyo-Jong; Jung, Jaehee; Lim, Gyu-Ho
2017-04-01
This study introduces an altered version of the incremental analysis updates (IAU), called the nonstationary IAU (NIAU) method, to enhance the assimilation accuracy of the IAU while retaining the continuity of the analysis. Analogous to the IAU, the NIAU is designed to add analysis increments at every model time step to improve continuity in intermittent data assimilation. Unlike the IAU, however, the NIAU method applies time-evolved forcing, computed with the forward operator, as corrections to the model. The solution of the NIAU is better, in terms of the accuracy of the analysis field, than that of the IAU, in which the analysis is performed at the start of the time window for adding the IAU forcing. This is because, in linear systems, the NIAU solution equals that of an intermittent data assimilation method at the end of the assimilation interval. To obtain the filtering property in the NIAU, a forward operator to propagate the increment is reconstructed with only the dominant singular vectors. An illustration of these advantages of the NIAU is given using the simple 40-variable Lorenz model.
Directory of Open Access Journals (Sweden)
Yoo-Geun Ham
2016-01-01
Full Text Available This study introduces a modified version of the incremental analysis updates (IAU), called the nonstationary IAU (NIAU) method, to improve the assimilation accuracy of the IAU while keeping the continuity of the analysis. Similar to the IAU, the NIAU is designed to add analysis increments at every model time step to improve continuity in intermittent data assimilation. However, unlike the IAU, the NIAU procedure uses time-evolved forcing, computed with the forward operator, as corrections to the model. The solution of the NIAU is superior, in terms of the accuracy of the analysis field, to that of the forward IAU, in which the analysis is performed at the beginning of the time window for adding the IAU forcing. This is because, in linear systems, the NIAU solution equals that of an intermittent data assimilation method at the end of the assimilation interval. To obtain the filtering property in the NIAU, a forward operator to propagate the increment is reconstructed with only the dominant singular vectors. An illustration of these advantages of the NIAU is given using the simple 40-variable Lorenz model.
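The difference between intermittent updating, the IAU, and NIAU-style time-evolved forcing can be seen in a toy scalar linear model. The model, the numbers, and the explicit forcing form below are illustrative assumptions, but they reproduce the stated linear-system property: the time-evolved forcing makes the window-end state match the intermittent analysis exactly.

```python
M, n_steps = 0.95, 10                 # toy scalar forward model x' = M * x
x_b, inc = 1.0, 0.4                   # background state, analysis increment

# (1) Intermittent update: whole increment inserted at the window start
x = x_b + inc
for _ in range(n_steps):
    x = M * x
x_intermittent = x                    # equals M**n_steps * (x_b + inc)

# (2) IAU: a constant fraction of the increment added at every step;
# the trajectory stays continuous, but the increment itself is never
# propagated by the model dynamics
x = x_b
for _ in range(n_steps):
    x = M * x + inc / n_steps
x_iau = x

# (3) NIAU-style forcing (assumed form): evolve each fraction with the
# forward operator before adding it, so the end state matches (1) exactly
x = x_b
for k in range(n_steps):
    x = M * x + (M ** (k + 1)) * inc / n_steps
x_niau = x
```

Each forcing term in (3) is later multiplied by the remaining model steps, so its total contribution sums to exactly `M**n_steps * inc`, the same as inserting the full increment up front.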
Nguyen, Ngoc Minh; Corff, Sylvain Le; Moulines, Éric
2017-12-01
This paper focuses on sequential Monte Carlo approximations of smoothing distributions in conditionally linear and Gaussian state-space models. To reduce the Monte Carlo variance of smoothers, it is typical in these models to use Rao-Blackwellization: a particle approximation is used to sample sequences of hidden regimes while the Gaussian states are explicitly integrated conditional on the sequence of regimes and observations, using variants of the Kalman filter/smoother. The first successful attempt to use Rao-Blackwellization for smoothing extended the Bryson-Frazier smoother for Gaussian linear state-space models, using the generalized two-filter formula together with Kalman filters/smoothers. More recently, a forward-backward decomposition of smoothing distributions mimicking the Rauch-Tung-Striebel smoother for the regimes, combined with backward Kalman updates, has been introduced. This paper investigates the benefit of introducing additional rejuvenation steps in all these algorithms to sample, at each time instant, new regimes conditional on the forward and backward particles. This defines particle-based approximations of the smoothing distributions whose support is not restricted to the set of particles sampled in the forward or backward filter. These procedures are applied to commodity markets, which are described using a two-factor model based on the spot price and a convenience yield for crude oil data.
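The Rao-Blackwellization the paper builds on can be sketched for the simpler filtering case (the smoothing extensions are more involved): particles carry only the discrete regime, while the conditionally Gaussian state is integrated analytically with one Kalman recursion per particle. The two-regime drift model and all numbers are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(6)

# Conditionally linear-Gaussian model, two regimes switching the drift:
#   x_t = x_{t-1} + mu[r_t] + w_t,   y_t = x_t + v_t
mu = np.array([-0.5, 0.5])
q2, r2, T, p_stay = 0.1, 0.1, 100, 0.95

# Simulate a trajectory
regime, x, xs, ys = 0, 0.0, [], []
for _ in range(T):
    if rng.random() > p_stay:
        regime = 1 - regime
    x = x + mu[regime] + rng.normal(0, np.sqrt(q2))
    xs.append(x)
    ys.append(x + rng.normal(0, np.sqrt(r2)))

# Rao-Blackwellized particle filter: sample regimes with particles, and
# handle the Gaussian state with a per-particle Kalman step
n = 500
r_p = rng.integers(0, 2, n)     # regime particles
m = np.zeros(n)                 # Kalman means, one per particle
P = np.ones(n)                  # Kalman variances
for y in ys:
    flip = rng.random(n) > p_stay
    r_p = np.where(flip, 1 - r_p, r_p)                 # propagate regimes
    m_pred, P_pred = m + mu[r_p], P + q2               # Kalman predict
    S = P_pred + r2                                    # innovation variance
    w = np.exp(-0.5 * (y - m_pred) ** 2 / S) / np.sqrt(S)
    w /= w.sum()                                       # marginal likelihood weights
    K = P_pred / S                                     # Kalman gain and update
    m, P = m_pred + K * (y - m_pred), P_pred * (1 - K)
    idx = rng.choice(n, size=n, p=w)                   # multinomial resampling
    r_p, m, P = r_p[idx], m[idx], P[idx]

x_filtered = m.mean()           # posterior mean of the final state
```

Because the Gaussian state is marginalized out, the particles only have to explore the discrete regime sequence, which is what reduces Monte Carlo variance.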
We developed a sequential Monte Carlo filter to estimate the states and the parameters in a stochastic model of Japanese Encephalitis (JE) spread in the Philippines. This method is particularly important for its adaptability to the availability of new incidence data. This method can also capture the...
Monte Carlo simulation of the sequential probability ratio test for radiation monitoring
International Nuclear Information System (INIS)
Coop, K.L.
1984-01-01
A computer program simulates the Sequential Probability Ratio Test (SPRT) using Monte Carlo techniques. The program, SEQTEST, performs random-number sampling of either a Poisson or a normal distribution to simulate radiation monitoring data. The results are expressed in terms of the detection probabilities and the average time required for a trial. The computed SPRT results can be compared with tabulated single interval test (SIT) values to determine the better statistical test for particular monitoring applications. Use of the SPRT in a hand-and-foot alpha monitor shows that the SPRT provides better detection probabilities while generally requiring less counting time. Calculations are also performed for a monitor where the SPRT is not permitted to take longer than the single interval test. Although the performance of the SPRT is degraded by this restriction, the detection probabilities are still similar to the SIT values, and average counting times are always less than 75% of the SIT time. Some optimal conditions for use of the SPRT are described. The SPRT should be the test of choice in many radiation monitoring situations. 6 references, 8 figures, 1 table
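A Monte Carlo simulation of the SPRT in the spirit of SEQTEST can be sketched as below. The background and source count rates, error rates, and truncation length are invented, not the report's values.

```python
import numpy as np

rng = np.random.default_rng(4)

def sprt_trial(rate, bg=5.0, src=10.0, alpha=0.05, beta=0.05, max_n=100):
    """One SPRT trial on Poisson counts: H0 rate = bg vs H1 rate = src.

    Returns (decision, n_intervals); decision True means 'source detected'.
    """
    a = np.log(beta / (1 - alpha))            # lower (accept H0) threshold
    b = np.log((1 - beta) / alpha)            # upper (accept H1) threshold
    llr = 0.0
    for n in range(1, max_n + 1):
        k = rng.poisson(rate)                 # counts in one interval
        # Poisson log-likelihood-ratio increment for this interval
        llr += k * np.log(src / bg) - (src - bg)
        if llr >= b:
            return True, n
        if llr <= a:
            return False, n
    return llr > 0, max_n                     # truncate overly long trials

# Monte Carlo estimate of detection probability and average trial length
# when a source is actually present (true rate = src)
trials = [sprt_trial(10.0) for _ in range(2000)]
p_detect = np.mean([d for d, _ in trials])
avg_len = np.mean([n for _, n in trials])
```

Running the same loop with `sprt_trial(5.0)` instead estimates the false-alarm rate, which Wald's thresholds keep near `alpha`.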
Adluru, Nagesh; Yang, Xingwei; Latecki, Longin Jan
2015-05-01
We consider a problem of finding maximum weight subgraphs (MWS) that satisfy hard constraints in a weighted graph. The constraints specify the graph nodes that must belong to the solution as well as mutual exclusions of graph nodes, i.e., pairs of nodes that cannot belong to the same solution. Our main contribution is a novel inference approach for solving this problem in a sequential Monte Carlo (SMC) sampling framework. Usually in an SMC framework there is a natural ordering of the states of the samples. The order typically depends on observations about the states or on the annealing setup used. In many applications (e.g., image jigsaw puzzle problems), all observations (e.g., puzzle pieces) are given at once and it is hard to define a natural ordering. Therefore, we relax the assumption of having ordered observations about states and propose a novel SMC algorithm for obtaining the maximum a posteriori estimate of a high-dimensional posterior distribution. This is achieved by exploring different orders of states and selecting the most informative permutations in each step of the sampling. Our experimental results demonstrate that the proposed inference framework significantly outperforms loopy belief propagation in solving the image jigsaw puzzle problem. In particular, our inference quadruples the accuracy of the puzzle assembly compared to that of loopy belief propagation.
Dewaele, Hélène; Munier, Simon; Albergel, Clément; Planque, Carole; Laanaia, Nabil; Carrer, Dominique; Calvet, Jean-Christophe
2017-09-01
Soil maximum available water content (MaxAWC) is a key parameter in land surface models (LSMs). However, being difficult to measure, this parameter is usually uncertain. This study assesses the feasibility of using a 15-year (1999-2013) time series of satellite-derived low-resolution observations of leaf area index (LAI) to estimate MaxAWC for rainfed croplands over France. LAI interannual variability is simulated using the CO2-responsive version of the Interactions between Soil, Biosphere and Atmosphere (ISBA) LSM for various values of MaxAWC. The optimal value is then selected by using (1) a simple inverse modelling technique, comparing simulated and observed LAI, and (2) a more complex method consisting of integrating observed LAI in ISBA through a land data assimilation system (LDAS) and minimising LAI analysis increments. The evaluation of the MaxAWC estimates from both methods is done using simulated annual maximum above-ground biomass (Bag) and straw cereal grain yield (GY) values from the Agreste French agricultural statistics portal, for 45 administrative units presenting a high proportion of straw cereals. Significant correlations with Bag and GY are found for up to 36 and 53 % of the administrative units for the inverse modelling and LDAS tuning methods, respectively. It is found that the LDAS tuning experiment gives more realistic values of MaxAWC and maximum Bag than the inverse modelling experiment. Using undisaggregated LAI observations leads to an underestimation of MaxAWC and maximum Bag in both experiments. Median annual maximum values of disaggregated LAI observations are found to correlate very well with MaxAWC.
Directory of Open Access Journals (Sweden)
Ke Tang
2014-04-01
Full Text Available Loops in proteins are flexible regions connecting regular secondary structures. They are often involved in protein functions through interacting with other molecules. The irregularity and flexibility of loops make their structures difficult to determine experimentally and challenging to model computationally. Conformation sampling and energy evaluation are the two key components in loop modeling. We have developed a new method for loop conformation sampling and prediction based on a chain-growth sequential Monte Carlo sampling strategy, called Distance-guided Sequential chain-Growth Monte Carlo (DISGRO). With an energy function designed specifically for loops, our method can efficiently generate high-quality loop conformations with low energy that are enriched with near-native loop structures. The average minimum global backbone RMSD for 1,000 conformations of 12-residue loops is 1.53 Å, with a lowest-energy RMSD of 2.99 Å and an average ensemble RMSD of 5.23 Å. A novel geometric criterion is applied to speed up calculations. The computational cost of generating 1,000 conformations for each of the x loops in a benchmark dataset is only about 10 cpu minutes for 12-residue loops, compared to ca. 180 cpu minutes using the FALCm method. Test results on benchmark datasets show that DISGRO performs comparably or better than previous successful methods, while requiring far less computing time. DISGRO is especially effective in modeling longer loops (10-17 residues).
Enhancing hydrologic data assimilation by evolutionary Particle Filter and Markov Chain Monte Carlo
Abbaszadeh, Peyman; Moradkhani, Hamid; Yan, Hongxiang
2018-01-01
Particle Filters (PFs) have received increasing attention from researchers in different disciplines, including the hydro-geosciences, as an effective tool to improve model predictions in nonlinear and non-Gaussian dynamical systems. The application of dual state and parameter estimation using PFs in hydrology has evolved since 2005 from the PF-SIR (sampling importance resampling) to the PF-MCMC (Markov Chain Monte Carlo), and now to the most effective and robust framework through an evolutionary PF approach based on the Genetic Algorithm (GA) and MCMC, the so-called EPFM. In this framework, the prior distribution undergoes an evolutionary process based on the designed mutation and crossover operators of the GA. The merit of this approach is that the particles move to an appropriate position by using the GA optimization and then the number of effective particles is increased by means of MCMC, whereby particle degeneracy is avoided and particle diversity is improved. In this study, the usefulness and effectiveness of the proposed EPFM is investigated by applying the technique to a conceptual and highly nonlinear hydrologic model over four river basins located in different climate and geographical regions of the United States. Both synthetic and real case studies demonstrate that the EPFM improves both the state and parameter estimation more effectively and reliably as compared with the PF-MCMC.
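The dual state-parameter filtering idea can be sketched with a plain SIR particle filter plus a small parameter jitter standing in for the EPFM's GA/MCMC particle evolution; this is not the authors' algorithm, and the linear toy model and all numbers are assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic data from x_t = a * x_{t-1} + w,  y_t = x_t + v, with a = 0.8
a_true, q, r, T = 0.8, 0.5, 0.5, 200
x, ys = 0.0, []
for _ in range(T):
    x = a_true * x + rng.normal(0, q)
    ys.append(x + rng.normal(0, r))

# Dual state-parameter particle filter (SIR): each particle carries a
# state AND a parameter; a jitter step keeps parameter diversity alive
# (the role played by GA/MCMC moves in the EPFM framework)
n = 1000
a_p = rng.uniform(0.0, 1.0, n)                 # parameter particles
x_p = np.zeros(n)                              # state particles
for y in ys:
    x_p = a_p * x_p + rng.normal(0, q, n)      # propagate states
    w = np.exp(-0.5 * ((y - x_p) / r) ** 2)    # Gaussian likelihood weights
    w /= w.sum()
    idx = rng.choice(n, size=n, p=w)           # importance resampling
    x_p, a_p = x_p[idx], a_p[idx]
    a_p = a_p + rng.normal(0, 0.01, n)         # jitter: diversify parameters

a_est = a_p.mean()                             # should approach a_true = 0.8
```

Without the jitter (or a proper move step), resampling collapses the parameter particles onto a few early values, which is exactly the degeneracy the EPFM is designed to avoid.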
Croce, Pierpaolo; Zappasodi, Filippo; Merla, Arcangelo; Chiarelli, Antonio Maria
2017-08-01
Objective. Electrical and hemodynamic brain activity are linked through the neurovascular coupling process and they can be simultaneously measured through integration of electroencephalography (EEG) and functional near-infrared spectroscopy (fNIRS). Thanks to the lack of electro-optical interference, the two procedures can be easily combined and, whereas EEG provides electrophysiological information, fNIRS can provide measurements of two hemodynamic variables, such as oxygenated and deoxygenated hemoglobin. A Bayesian sequential Monte Carlo approach (particle filter, PF) was applied to simulated recordings of electrical and neurovascular mediated hemodynamic activity, and the advantages of a unified framework were shown. Approach. Multiple neural activities and hemodynamic responses were simulated in the primary motor cortex of a subject brain. EEG and fNIRS recordings were obtained by means of forward models of volume conduction and light propagation through the head. A state space model of combined EEG and fNIRS data was built and its dynamic evolution was estimated through a Bayesian sequential Monte Carlo approach (PF). Main results. We showed the feasibility of the procedure and the improvements in both electrical and hemodynamic brain activity reconstruction when using the PF on combined EEG and fNIRS measurements. Significance. The investigated procedure allows one to combine the information provided by the two methodologies, and, by taking advantage of a physical model of the coupling between electrical and hemodynamic response, to obtain a better estimate of brain activity evolution. Despite the high computational demand, application of such an approach to in vivo recordings could fully exploit the advantages of this combined brain imaging technology.
Li, Jiahao; Klee Barillas, Joaquin; Guenther, Clemens; Danzer, Michael A.
2014-02-01
Battery state monitoring is one of the key techniques in battery management systems, e.g. in electric vehicles. An accurate estimation can help to improve system performance and to prolong the battery's remaining useful life. The main challenges for state estimation in LiFePO4 batteries are the flat characteristic of open-circuit voltage over battery state of charge (SOC) and the existence of hysteresis phenomena. Classical estimation approaches like Kalman filtering show limitations in handling nonlinear and non-Gaussian error distribution problems. In addition, uncertainties in the battery model parameters must be taken into account to describe battery degradation. In this paper, a novel model-based method combining a Sequential Monte Carlo filter with adaptive control to determine the cell SOC and its electric impedance is presented. The applicability of this dual estimator is verified using measurement data acquired from a commercial LiFePO4 cell. Due to a better handling of the hysteresis problem, the results show the benefits of the proposed method over estimation with an Extended Kalman filter.
International Nuclear Information System (INIS)
Sorrentino, Alberto; Luria, Gianvittorio; Aramini, Riccardo
2014-01-01
In this paper, we develop a novel Bayesian approach to the problem of estimating neural currents in the brain from a fixed distribution of magnetic field (called topography), measured by magnetoencephalography. Differently from recent studies that describe inversion techniques, such as spatio-temporal regularization/filtering, in which neural dynamics always plays a role, we face here a purely static inverse problem. Neural currents are modelled as an unknown number of current dipoles, whose state space is described in terms of a variable-dimension model. Within the resulting Bayesian framework, we set up a sequential Monte Carlo sampler to explore the posterior distribution. An adaptation technique is employed in order to effectively balance the computational cost and the quality of the sample approximation. Then, both the number and the parameters of the unknown current dipoles are simultaneously estimated. The performance of the method is assessed by means of synthetic data, generated by source configurations containing up to four dipoles. Eventually, we describe the results obtained by analysing data from a real experiment, involving somatosensory evoked fields, and compare them to those provided by three other methods. (paper)
Probabilistic forecasting and Bayesian data assimilation
Reich, Sebastian
2015-01-01
In this book the authors describe the principles and methods behind probabilistic forecasting and Bayesian data assimilation. Instead of focusing on particular application areas, the authors adopt a general dynamical systems approach, with a profusion of low-dimensional, discrete-time numerical examples designed to build intuition about the subject. Part I explains the mathematical framework of ensemble-based probabilistic forecasting and uncertainty quantification. Part II is devoted to Bayesian filtering algorithms, from classical data assimilation algorithms such as the Kalman filter, variational techniques, and sequential Monte Carlo methods, through to more recent developments such as the ensemble Kalman filter and ensemble transform filters. The McKean approach to sequential filtering in combination with coupling of measures serves as a unifying mathematical framework throughout Part II. Assuming only some basic familiarity with probability, this book is an ideal introduction for graduate students in ap...
A coherent structure approach for parameter estimation in Lagrangian Data Assimilation
Maclean, John; Santitissadeekorn, Naratip; Jones, Christopher K. R. T.
2017-12-01
We introduce a data assimilation method to estimate model parameters with observations of passive tracers by directly assimilating Lagrangian Coherent Structures. Our approach differs from the usual Lagrangian Data Assimilation approach, where parameters are estimated based on tracer trajectories. We employ the Approximate Bayesian Computation (ABC) framework to avoid computing the likelihood function of the coherent structure, which is usually unavailable. We solve the ABC by a Sequential Monte Carlo (SMC) method, and use Principal Component Analysis (PCA) to identify the coherent patterns from tracer trajectory data. Our new method shows remarkably improved results compared to the bootstrap particle filter when the physical model exhibits chaotic advection.
Panday, Prajjwal K.; Williams, Christopher A.; Frey, Karen E.; Brown, Molly E.
2013-01-01
Previous studies have drawn attention to substantial hydrological changes taking place in mountainous watersheds where hydrology is dominated by cryospheric processes. Modelling is an important tool for understanding these changes but is particularly challenging in mountainous terrain owing to the scarcity of ground observations and the uncertainty of model parameters across space and time. This study utilizes a Markov Chain Monte Carlo data assimilation approach to examine and evaluate the performance of a conceptual, degree-day snowmelt runoff model applied in the Tamor River basin in the eastern Nepalese Himalaya. The snowmelt runoff model is calibrated using daily streamflow from 2002 to 2006 with fairly high accuracy (average Nash-Sutcliffe metric approx. 0.84, annual volume bias <3%). The Markov Chain Monte Carlo approach constrains the parameters to which the model is most sensitive (e.g. lapse rate and recession coefficient) and maximizes model fit and performance. The average snowmelt contribution to total runoff in the Tamor River basin for the 2002-2006 period is estimated to be 29.7+/-2.9% (which includes 4.2+/-0.9% from snowfall that promptly melts), whereas 70.3+/-2.6% is attributed to contributions from rainfall. On average, the elevation zone in the 4000-5500 m range contributes the most to basin runoff, averaging 56.9+/-3.6% of all snowmelt input and 28.9+/-1.1% of all rainfall input to runoff. Model simulated streamflow using an interpolated precipitation data set decreases the fractional contribution from rainfall versus snowmelt compared with simulations using observed station precipitation. Model experiments indicate that the hydrograph itself does not constrain estimates of snowmelt versus rainfall contributions to total outflow but that this derives from the degree
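The conceptual degree-day relation at the heart of the snowmelt runoff model can be sketched directly; the degree-day factor, lapse rate, and station values below are illustrative placeholders, not the calibrated Tamor basin parameters.

```python
DDF = 4.0            # degree-day factor, mm of melt per degC per day (assumed)
LAPSE = -6.5e-3      # temperature lapse rate, degC per m (assumed)
T_STATION = 10.0     # daily mean temperature at the station, degC
Z_STATION = 1500.0   # station elevation, m

def zone_melt(z, swe):
    """Daily melt (mm) in an elevation zone holding `swe` mm of snow water."""
    t = T_STATION + LAPSE * (z - Z_STATION)   # lapse temperature to zone elevation
    return min(max(DDF * t, 0.0), swe)        # no melt below 0 degC; capped at SWE

melt_low = zone_melt(3000.0, 100.0)   # mild zone: some melt expected
melt_high = zone_melt(5000.0, 100.0)  # cold zone: temperature below freezing
```

Summing `zone_melt` over elevation bands, weighted by each band's snow-covered area, yields the basin-scale melt input whose parameters (lapse rate among them) the Markov Chain Monte Carlo calibration constrains.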
Energy Technology Data Exchange (ETDEWEB)
Aguilar, Charles M.; De Almeida, Wagner B. [Laboratorio de Quimica Computacional e Modelagem Molecular (LQC-MM), Departamento de Quimica - ICEX, Universidade Federal de Minas Gerais 31270-901, Belo Horizonte, MG (Brazil); Rocha, Willian R. [Laboratorio de Quimica Computacional e Modelagem Molecular (LQC-MM), Departamento de Quimica - ICEX, Universidade Federal de Minas Gerais 31270-901, Belo Horizonte, MG (Brazil)], E-mail: wrocha@ufmg.br
2008-11-03
A sequential Monte Carlo/Quantum Mechanics approach was used to investigate the solvent effects on the d → d transition of the Ni²⁺ ion in aqueous and ammonia solutions. A set of Lennard-Jones parameters was generated by modification of the UFF force field. The structural results obtained for the liquid structure around the Ni²⁺ ion are in very good agreement with the experimental findings. The water molecules in the second coordination shell interact strongly with the first shell, with hydrogen bonds of -14.6 ± 3.3 kcal mol⁻¹, which is 30% stronger than in the ammonia complex. The electronic spectrum was evaluated within the TD-DFT approach on the gas-phase geometry and also on the Monte Carlo generated clusters, including the long-range solvent effects by means of the PCM continuum model. We show that the computed electronic transitions are all red-shifted compared with the experimental results and that the agreement with the experimental values is only qualitative.
International Nuclear Information System (INIS)
Aguilar, Charles M.; De Almeida, Wagner B.; Rocha, Willian R.
2008-01-01
A sequential Monte Carlo/Quantum Mechanics approach was used to investigate the solvent effects on the d → d transition of the Ni²⁺ ion in aqueous and ammonia solutions. A set of Lennard-Jones parameters was generated by modification of the UFF force field. The structural results obtained for the liquid structure around the Ni²⁺ ion are in very good agreement with the experimental findings. The water molecules in the second coordination shell interact strongly with the first shell, with hydrogen bonds of -14.6 ± 3.3 kcal mol⁻¹, which is 30% stronger than in the ammonia complex. The electronic spectrum was evaluated within the TD-DFT approach on the gas-phase geometry and also on the Monte Carlo generated clusters, including the long-range solvent effects by means of the PCM continuum model. We show that the computed electronic transitions are all red-shifted compared with the experimental results and that the agreement with the experimental values is only qualitative.
Directory of Open Access Journals (Sweden)
J. D. Rösevall
2007-01-01
Full Text Available The objective of this study is to demonstrate how polar ozone depletion can be mapped and quantified by assimilating ozone data from satellites into the wind-driven transport model DIAMOND (Dynamical Isentropic Assimilation Model for OdiN Data). By assimilating a large set of satellite data into a transport model, ozone fields can be built up that are less noisy than the individual satellite ozone profiles. The transported fields can subsequently be compared to later sets of incoming satellite data so that the rates and geographical distribution of ozone depletion can be determined. By tracing the amounts of solar irradiation received by different air parcels in a transport model, it is furthermore possible to study the photolytic reactions that destroy ozone. In this study, destruction of ozone that took place in the Antarctic winter of 2003 and in the Arctic winter of 2002/2003 has been examined by assimilating ozone data from the ENVISAT/MIPAS and Odin/SMR satellite instruments. Large-scale depletion of ozone was observed in the Antarctic polar vortex of 2003 when sunlight returned after the polar night. By mid October, ENVISAT/MIPAS data indicate vortex ozone depletion in the ranges 80–100% and 70–90% on the 425 and 475 K potential temperature levels respectively, while the Odin/SMR data indicate depletion in the ranges 70–90% and 50–70%. The discrepancy between the two instruments has been attributed to systematic errors in the Odin/SMR data. Assimilated fields of ENVISAT/MIPAS data indicate ozone depletion in the range 10–20% on the 475 K potential temperature level (~19 km altitude) in the central regions of the 2002/2003 Arctic polar vortex. Assimilated fields of Odin/SMR data, on the other hand, indicate ozone depletion in the range 20–30%.
Liu, Qinming; Dong, Ming; Peng, Ying
2012-10-01
Health prognosis of equipment is considered a key process of the condition-based maintenance strategy. It contributes to reducing the related risks and maintenance costs of equipment and to improving its availability, reliability and security. However, equipment often operates under dynamic operational and environmental conditions, and its lifetime is generally described by monitored nonlinear time-series data. Equipment is subject to high levels of uncertainty and unpredictability, so effective methods for its online health prognosis are still needed. This paper addresses prognostic methods based on a hidden semi-Markov model (HSMM) using the sequential Monte Carlo (SMC) method. The HSMM is applied to obtain the transition probabilities among health states and the state durations. The SMC method is adopted to describe the probability relationships between health states and the monitored observations of equipment. This paper proposes a novel multi-step-ahead health recognition algorithm based on a joint probability distribution to recognize the health states of equipment and its health-state change point. A new online health prognostic method is also developed to estimate the residual useful lifetime (RUL) values of equipment. At the end of the paper, a real case study is used to demonstrate the performance and potential applications of the proposed methods for online health prognosis of equipment.
Thorn, Graeme J; King, John R
2016-01-01
The Gram-positive bacterium Clostridium acetobutylicum is an anaerobic endospore-forming species which produces acetone, butanol and ethanol via the acetone-butanol (AB) fermentation process, leading to biofuels including butanol. In previous work we estimated the parameters in an ordinary differential equation model of the glucose metabolism network using data from pH-controlled continuous culture experiments. Here we combine two approaches: approximate Bayesian computation via an existing sequential Monte Carlo method (ABC-SMC), to compute credible intervals for the parameters, and profile likelihood estimation (PLE), to improve the calculation of confidence intervals for the same parameters, with the parameters in both cases derived from experimental data from forward-shift experiments. We also apply the ABC-SMC method to investigate which of the models introduced previously (one non-sporulation and four sporulation models) has the greatest strength of evidence. We find that the joint approximate posterior distribution of the parameters determines the same parameters as previously, including all of the basal and increased enzyme production rates and enzyme reaction activity parameters, as well as the Michaelis-Menten kinetic parameters for glucose ingestion, while other parameters are not as well determined, particularly those connected with the internal metabolites acetyl-CoA, acetoacetyl-CoA and butyryl-CoA. We also find that the approximate posterior is strongly non-Gaussian, indicating that our previous assumption of elliptical contours of the distribution is not valid, which has the effect of reducing the number of pairs of parameters that are (linearly) correlated with each other. Calculations of confidence intervals using the PLE method back this up. Finally, we find that all five of our models are equally likely, given the data available at present.
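A minimal ABC-SMC-style sketch of the inference idea (sample parameters, simulate, accept when the simulation lies within a shrinking tolerance of the data) is below. The exponential-decay toy model, prior, tolerance schedule and perturbation kernel are all illustrative assumptions, and the importance-weight update of full ABC-SMC (e.g. Toni et al.) is simplified to uniform weights:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy inference problem standing in for the metabolic-network setting:
# infer the rate k of exponential decay y = exp(-k t) from noisy data.
t_obs = np.linspace(0, 2, 20)
k_true = 1.5
y_obs = np.exp(-k_true * t_obs) + rng.normal(0, 0.02, t_obs.size)

def distance(k):
    # root-mean-square misfit between simulation and data
    return np.sqrt(np.mean((np.exp(-k * t_obs) - y_obs) ** 2))

# ABC-SMC: decreasing tolerances with perturbed resampling of the population.
N = 500
eps_schedule = [0.5, 0.2, 0.1, 0.05]
pop = rng.uniform(0, 5, N)            # generation 0: uniform prior on [0, 5]
w = np.full(N, 1.0 / N)
for eps in eps_schedule:
    new_pop = np.empty(N)
    for i in range(N):
        while True:
            # propose by perturbing a particle from the previous population
            k = rng.choice(pop, p=w) + rng.normal(0, 0.2)
            if 0 < k < 5 and distance(k) < eps:
                new_pop[i] = k
                break
    pop, w = new_pop, np.full(N, 1.0 / N)  # uniform-weight simplification

print(pop.mean(), pop.std())   # approximate posterior mean and spread
```

With the final tolerance close to the observation noise level, the population concentrates around the data-generating rate.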
Altimeter data assimilation in the tropical Indian Ocean using water ...
Indian Academy of Sciences (India)
Altimeter data have been assimilated in an ocean general circulation model using the water property conserving scheme. Two runs of the model have been conducted for the year 2004. In one of the runs, altimeter data have been assimilated sequentially, while in another run, assimilation has been suppressed. Assimilation ...
Data Assimilation in Marine Models
DEFF Research Database (Denmark)
Frydendall, Jan
This thesis consists of six research papers published or submitted for publication in the period 2006-2009, together with a summary report. The main topics of the thesis are nonlinear data assimilation techniques and estimation in dynamical models. The focus has been on nonlinear filtering techniques for large-scale geophysical numerical models and on making them feasible to work with in the data assimilation framework. The filtering techniques investigated are all Monte Carlo simulation based. Some very nice features that can be exploited in the Monte Carlo based data assimilation framework come from the maximum likelihood framework; these issues are discussed in paper B. The third part of the thesis, which falls a bit outside the above context, is the work published in papers C and F. In the first of these, a simple data assimilation scheme was investigated to examine the potential benefits of incorporating a data ...
Amezcua, Javier
This dissertation deals with aspects of sequential data assimilation (in particular ensemble Kalman filtering) and numerical weather forecasting. In the first part, the recently formulated Ensemble Kalman-Bucy filter (EnKBF) is revisited. It is shown that the previously used numerical integration scheme fails when the magnitude of the background-error covariance grows beyond that of the observational-error covariance in the forecast window. Therefore, we present a suitable integration scheme that handles the stiffening of the differential equations involved and does not incur further computational expense. Moreover, a transform-based alternative to the EnKBF is developed: under this scheme, the operations are performed in the ensemble space instead of in the state space. Advantages of this formulation are explained. For the first time, the EnKBF is implemented in an atmospheric model. The second part of this work deals with ensemble clustering, a phenomenon that arises when performing data assimilation using deterministic ensemble square-root filters (EnSRFs) in highly nonlinear forecast models. Namely, an M-member ensemble detaches into an outlier and a cluster of M-1 members. Previous works may suggest that this issue represents a failure of EnSRFs; this work dispels that notion. It is shown that ensemble clustering can also be reverted by nonlinear processes, in particular the alternation between nonlinear expansion and compression of the ensemble in different regions of the attractor. Some EnSRFs that use random rotations have been developed to overcome this issue; these formulations are analyzed and their advantages and disadvantages with respect to common EnSRFs are discussed. The third and last part contains the implementation of the Robert-Asselin-Williams (RAW) filter in an atmospheric model. The RAW filter is an improvement to the widely popular Robert-Asselin filter that successfully suppresses spurious computational waves while avoiding any distortion
Regional Ocean Data Assimilation
Edwards, Christopher A.
2015-01-03
This article reviews the past 15 years of developments in regional ocean data assimilation. A variety of scientific, management, and safety-related objectives motivate marine scientists to characterize many ocean environments, including coastal regions. As in weather prediction, the accurate representation of physical, chemical, and/or biological properties in the ocean is challenging. Models and observations alone provide imperfect representations of the ocean state, but together they can offer improved estimates. Variational and sequential methods are among the most widely used in regional ocean systems, and there have been exciting recent advances in ensemble and four-dimensional variational approaches. These techniques are increasingly being tested and adapted for biogeochemical applications.
Monte Carlo simulation and analysis of the spectrum of the p + ¹¹B three-body sequential decay
Li Chen; Meng Qiu Ying; Zhang Pei Hua; Lin Er Kang
2002-01-01
The new experimental data on the ¹¹B(p, α₁)⁸Be*⁽¹⁾(2α) three-body decay show that the continuous α spectrum of the two α particles produced by the intermediate nucleus ⁸Be*⁽¹⁾ looks like a saddle-type distribution. To explain these experimental facts, the authors have written a Monte Carlo simulation program for the p + ¹¹B reaction. The calculation results of the program indicate that anisotropic emission of the decay α particles produced by ⁸Be*⁽¹⁾ gives a satisfactory explanation of the experimental spectrum
Directory of Open Access Journals (Sweden)
C. Albergel
2017-10-01
Full Text Available In this study, a global land data assimilation system (LDAS-Monde) is applied over Europe and the Mediterranean basin to increase monitoring accuracy for land surface variables. LDAS-Monde is able to ingest information from satellite-derived surface soil moisture (SSM) and leaf area index (LAI) observations to constrain the ISBA (Interactions between Soil, Biosphere and Atmosphere) land surface model (LSM) coupled with the CNRM (Centre National de Recherches Météorologiques) version of the Total Runoff Integrating Pathways (ISBA-CTRIP) continental hydrological system. It makes use of the CO2-responsive version of ISBA, which models leaf-scale physiological processes and plant growth. Transfer of water and heat in the soil relies on a multilayer diffusion scheme. SSM and LAI observations are assimilated using a simplified extended Kalman filter (SEKF), which uses finite differences from perturbed simulations to generate flow dependence between the observations and the model control variables. The latter include LAI and seven layers of soil (from 1 to 100 cm depth). A sensitivity test of the Jacobians over 2000–2012 exhibits effects related to both depth and season. It also suggests that observations of both LAI and SSM have an impact on the different control variables. From the assimilation of SSM, the LDAS is more effective in modifying soil moisture (SM) in the top layers of soil, as model sensitivity to SSM decreases with depth and has almost no impact from 60 cm downwards. From the assimilation of LAI, a strong impact on LAI itself is found. The LAI assimilation impact is more pronounced in SM layers that contain the highest fraction of roots (from 10 to 60 cm). The assimilation is more efficient in summer and autumn than in winter and spring. Results show that the LDAS works well in constraining the model to the observations and that stronger corrections are applied to LAI than to SM.
Wald, Abraham
2013-01-01
In 1943, while in charge of Columbia University's Statistical Research Group, Abraham Wald devised Sequential Design, an innovative statistical inference system. Because the decision to terminate an experiment is not predetermined, sequential analysis can arrive at a decision much sooner and with substantially fewer observations than equally reliable test procedures based on a predetermined number of observations. The system's immense value was immediately recognized, and its use was restricted to wartime research and procedures. In 1945, it was released to the public and has since revolutio
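The early-stopping idea can be made concrete with Wald's sequential probability ratio test (SPRT): the log-likelihood ratio is accumulated observation by observation, and sampling stops as soon as it leaves a pair of thresholds set by the desired error rates. The Gaussian hypotheses and error rates below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(9)

# Wald's SPRT for the mean of a Gaussian with known sigma:
# H0: mu = 0 versus H1: mu = 1.
def sprt(data_stream, mu0=0.0, mu1=1.0, sigma=1.0, alpha=0.05, beta=0.05):
    A = np.log((1 - beta) / alpha)      # upper stopping boundary
    B = np.log(beta / (1 - alpha))      # lower stopping boundary
    llr = 0.0
    for n, x in enumerate(data_stream, start=1):
        # incremental Gaussian log-likelihood ratio log f1(x)/f0(x)
        llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2 * sigma ** 2)
        if llr >= A:
            return "accept H1", n
        if llr <= B:
            return "accept H0", n
    return "undecided", n

decision, n_used = sprt(rng.normal(1.0, 1.0, size=200))   # data drawn under H1
print(decision, n_used)
```

The number of observations at stopping is typically far below what a fixed-sample test with the same error rates would need, which is exactly the property the abstract highlights.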
Mean field simulation for Monte Carlo integration
Del Moral, Pierre
2013-01-01
In the last three decades, there has been a dramatic increase in the use of interacting particle methods as a powerful tool in real-world applications of Monte Carlo simulation in computational physics, population biology, computer sciences, and statistical machine learning. Ideally suited to parallel and distributed computation, these advanced particle algorithms include nonlinear interacting jump diffusions; quantum, diffusion, and resampled Monte Carlo methods; Feynman-Kac particle models; genetic and evolutionary algorithms; sequential Monte Carlo methods; adaptive and interacting Marko
Directory of Open Access Journals (Sweden)
M. Morzfeld
2012-06-01
Full Text Available Implicit particle filtering is a sequential Monte Carlo method for data assimilation, designed to keep the number of particles manageable by focusing attention on regions of large probability. These regions are found by minimizing, for each particle, a scalar function F of the state variables. Some previous implementations of the implicit filter rely on finding the Hessians of these functions. The calculation of the Hessians can be cumbersome if the state dimension is large or if the underlying physics are such that derivatives of F are difficult to calculate, as happens in many geophysical applications, in particular in models with partial noise, i.e. with a singular state covariance matrix. Examples of models with partial noise include models where uncertain dynamic equations are supplemented by conservation laws with zero uncertainty, models governed by higher-order (in time) stochastic partial differential equations (PDEs), or models with PDEs driven by spatially smooth noise processes. We make the implicit particle filter applicable to such situations by combining gradient-descent minimization with random maps, and show that the filter is efficient, accurate and reliable because it operates in a subspace of the state space. As an example, we consider a system of nonlinear stochastic PDEs that is of importance in geomagnetic data assimilation.
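A one-dimensional sketch of the minimization-plus-random-map idea is below, under illustrative assumptions: a convex scalar F stands in for the model-data misfit, and the importance weights (the Jacobian of the map) are omitted for brevity, so this shows only how samples are steered into the high-probability region:

```python
import numpy as np

rng = np.random.default_rng(2)

# Target density p(x) proportional to exp(-F(x)), with an illustrative F.
def F(x):
    return 0.5 * x ** 2 + 0.25 * (x - 1.0) ** 4

def dF(x):
    return x + (x - 1.0) ** 3

# 1. Find the minimizer of F by gradient descent.
x_star = 0.0
for _ in range(200):
    x_star -= 0.05 * dF(x_star)
phi = F(x_star)

# 2. Random map: for each reference sample xi ~ N(0,1), solve
#    F(x_star + lam * xi) = phi + xi**2 / 2   for lam > 0 by bisection
#    (valid here because F is convex, so F increases along any ray
#    from the minimizer).
def solve_lam(xi, lo=0.0, hi=50.0):
    target = phi + 0.5 * xi ** 2
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if F(x_star + mid * xi) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

xis = rng.normal(size=2000)
xs = np.array([x_star + solve_lam(xi) * xi for xi in xis])

# Every sample lands in the high-probability region around the minimizer.
print(xs.mean(), xs.std())
```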
Implicit sampling for data assimilation
Tu, X.; Chorin, A. J.; Morzfeld, M.
2014-12-01
Applications of filtering and data assimilation arise in engineering, geosciences, weather forecasting, and many other areas where one has to make estimates or predictions based on uncertain models supplemented by a stream of noisy data. For nonlinear problems, filtering can be very expensive because the number of particles required can be catastrophically large. We will present a nonlinear filtering scheme that is based on implicit sampling, a new sampling technique related to a chainless Monte Carlo method. This sampling strategy generates a particle (sample) beam which is focused on the high-probability region of the target probability density function, and the focusing makes the number of particles required manageable even if the state dimension is large. Several examples will be given.
Data Assimilation - Advances and Applications
Energy Technology Data Exchange (ETDEWEB)
Williams, Brian J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2014-07-30
This presentation provides an overview of data assimilation (model calibration) for complex computer experiments. Calibration refers to the process of probabilistically constraining uncertain physics/engineering model inputs to be consistent with observed experimental data. An initial probability distribution for these parameters is updated using the experimental information. Utilization of surrogate models and empirical adjustment for model form error in code calibration form the basis for the statistical methodology considered. The role of probabilistic code calibration in supporting code validation is discussed. Incorporation of model form uncertainty in rigorous uncertainty quantification (UQ) analyses is also addressed. Design criteria used within a batch sequential design algorithm are introduced for efficiently achieving predictive maturity and improved code calibration. Predictive maturity refers to obtaining stable predictive inference with calibrated computer codes. These approaches allow for augmentation of initial experiment designs for collecting new physical data. A standard framework for data assimilation is presented and techniques for updating the posterior distribution of the state variables based on particle filtering and the ensemble Kalman filter are introduced.
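As a sketch of the ensemble Kalman filter update mentioned at the end, here is a stochastic (perturbed-observation) EnKF analysis step for a linear observation operator. The dimensions, covariances and true state are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)

# Stochastic EnKF analysis step for a linear observation y = Hx + noise.
n, m, N = 4, 2, 100          # state dim, obs dim, ensemble size
H = np.array([[1., 0., 0., 0.],
              [0., 0., 1., 0.]])
R = 0.1 * np.eye(m)           # observation-error covariance
x_true = np.array([1.0, -0.5, 2.0, 0.3])
y = H @ x_true + rng.multivariate_normal(np.zeros(m), R)

# Forecast ensemble: a vague prior scattered around zero
X = rng.normal(0.0, 1.0, size=(n, N))

# Sample covariance of the forecast ensemble
A = X - X.mean(axis=1, keepdims=True)
Pf = A @ A.T / (N - 1)

# Kalman gain and perturbed-observation update
K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)
Y = y[:, None] + rng.multivariate_normal(np.zeros(m), R, size=N).T
Xa = X + K @ (Y - H @ X)

print(Xa.mean(axis=1))   # analysis mean moves toward the observed components
```

The analysis ensemble mean is pulled toward the data in the observed components (1 and 3) while the unobserved components are adjusted only through the sample correlations in Pf.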
Advanced Multilevel Monte Carlo Methods
Jasra, Ajay
2017-04-24
This article reviews the application of advanced Monte Carlo techniques in the context of Multilevel Monte Carlo (MLMC). MLMC is a strategy employed to compute expectations which can be biased in some sense, for instance, by using the discretization of an associated probability law. The MLMC approach works with a hierarchy of biased approximations which become progressively more accurate and more expensive. Using a telescoping representation of the most accurate approximation, the method is able to reduce the computational cost for a given level of error versus i.i.d. sampling from this latter approximation. All of these ideas originated for cases where exact sampling from couplings in the hierarchy is possible. This article considers the case where such exact sampling is not currently possible. We consider Markov chain Monte Carlo and sequential Monte Carlo methods which have been introduced in the literature, and we describe different strategies which facilitate the application of MLMC within these methods.
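The telescoping identity at the heart of MLMC can be sketched for the classical case where exact coupling is possible: estimating E[S_T] for geometric Brownian motion with Euler discretizations, coupling level l to level l-1 by summing pairs of fine Brownian increments. The parameters and sample allocations below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(4)

# MLMC sketch: estimate E[S_T] for geometric Brownian motion
#   dS = mu*S dt + sigma*S dW,  S_0 = 1,
# with Euler steps of size T / 2^l on level l.
mu, sigma, T = 0.05, 0.2, 1.0

def euler(dW):
    s, dt = 1.0, T / dW.shape[-1]
    for i in range(dW.shape[-1]):
        s = s + mu * s * dt + sigma * s * dW[..., i]
    return s

def level_estimator(l, M):
    n_fine = 2 ** l
    dW = rng.normal(0.0, np.sqrt(T / n_fine), size=(M, n_fine))
    fine = euler(dW)
    if l == 0:
        return fine.mean()
    # couple to the coarse level by summing pairs of fine increments
    coarse = euler(dW.reshape(M, n_fine // 2, 2).sum(axis=2))
    return (fine - coarse).mean()   # telescoping correction term

# Telescoping sum E[P_L] = E[P_0] + sum_l E[P_l - P_{l-1}]:
# many samples on cheap coarse levels, few on expensive fine ones.
samples = [40000, 20000, 10000, 5000, 2500]
estimate = sum(level_estimator(l, M) for l, M in enumerate(samples))
print(estimate)   # analytic value is exp(mu*T)
```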
Conditions for successful data assimilation
Morzfeld, M.; Chorin, A. J.
2013-12-01
Many applications in science and engineering require that the predictions of uncertain models be updated by information from a stream of noisy data. The model and the data jointly define a conditional probability density function (pdf), which contains all the information one has about the process of interest and various numerical methods can be used to study and approximate this pdf, e.g. the Kalman filter, variational methods or particle filters. Given a model and data, each of these algorithms will produce a result. We are interested in the conditions under which this result is reasonable, i.e. consistent with the real-life situation one is modeling. In particular, we show, using idealized models, that numerical data assimilation is feasible in principle only if a suitably defined effective dimension of the problem is not excessive. This effective dimension depends on the noise in the model and the data, and in physically reasonable problems it can be moderate even when the number of variables is huge. In particular, we find that the effective dimension being moderate induces a balance condition between the noises in the model and the data; this balance condition is often satisfied in realistic applications or else the noise levels are excessive and drown the underlying signal. We also study the effects of the effective dimension on particle filters in two instances, one in which the importance function is based on the model alone, and one in which it is based on both the model and the data. We have three main conclusions: (1) the stability (i.e., non-collapse of weights) in particle filtering depends on the effective dimension of the problem. Particle filters can work well if the effective dimension is moderate even if the true dimension is large (which we expect to happen often in practice). (2) A suitable choice of importance function is essential, or else particle filtering fails even when data assimilation is feasible in principle with a sequential algorithm
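The weight-collapse phenomenon behind conclusion (1) can be demonstrated in a toy linear-Gaussian setting where the importance function is based on the model alone; the dimensions and particle count are illustrative:

```python
import numpy as np

rng = np.random.default_rng(5)

# Model: x ~ N(0, I_d); data: y = x + noise, noise ~ N(0, I_d).
# Particles are drawn from the prior and weighted by the likelihood,
# i.e. the importance function uses the model alone.
def max_weight(d, N=1000):
    x = rng.normal(size=(N, d))                    # particles from the prior
    y = rng.normal(size=d) + rng.normal(size=d)    # one synthetic data vector
    logw = -0.5 * np.sum((y - x) ** 2, axis=1)     # Gaussian log-likelihood
    w = np.exp(logw - logw.max())
    return (w / w.sum()).max()                     # largest normalized weight

collapse = {d: max_weight(d) for d in (1, 10, 100)}
print(collapse)
```

As the dimension grows the largest normalized weight grows toward 1, i.e. the filter collapses onto essentially a single particle even though the model is trivially simple — the effect the effective-dimension analysis quantifies.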
Data Assimilation with Optimal Maps
El Moselhy, T.; Marzouk, Y.
2012-12-01
Tarek El Moselhy and Youssef Marzouk, Massachusetts Institute of Technology. We present a new approach to Bayesian inference that entirely avoids Markov chain simulation and sequential importance resampling, by constructing a map that pushes forward the prior measure to the posterior measure. Existence and uniqueness of a suitable measure-preserving map is established by formulating the problem in the context of optimal transport theory. The map is written as a multivariate polynomial expansion and computed efficiently through the solution of a stochastic optimization problem. While our previous work [1] focused on static Bayesian inference problems, we now extend the map-based approach to sequential data assimilation, i.e., nonlinear filtering and smoothing. One scheme involves pushing forward a fixed reference measure to each filtered state distribution, while an alternative scheme computes maps that push forward the filtering distribution from one stage to the next. We compare the performance of these schemes and extend the former to problems of smoothing, using a map implementation of the forward-backward smoothing formula. Advantages of a map-based representation of the filtering and smoothing distributions include analytical expressions for posterior moments and the ability to generate arbitrary numbers of independent uniformly-weighted posterior samples without additional evaluations of the dynamical model. Perhaps the main advantage, however, is that the map approach inherently avoids issues of sample impoverishment, since it explicitly represents the posterior as the pushforward of a reference measure, rather than with a particular set of samples. The computational complexity of our algorithm is comparable to state-of-the-art particle filters. Moreover, the accuracy of the approach is controlled via the convergence criterion of the underlying optimization problem. We demonstrate the efficiency and accuracy of the map approach via data assimilation in
Multilevel Monte Carlo in Approximate Bayesian Computation
Jasra, Ajay
2017-02-13
In the following article we consider approximate Bayesian computation (ABC) inference. We introduce a method for numerically approximating ABC posteriors using multilevel Monte Carlo (MLMC). A sequential Monte Carlo version of the approach is developed, and it is shown under some assumptions that, for a given level of mean square error, this method for ABC has a lower cost than i.i.d. sampling from the most accurate ABC approximation. Several numerical examples are given.
Minimal sequential Hausdorff spaces
Directory of Open Access Journals (Sweden)
Bhamini M. P. Nayar
2004-01-01
Full Text Available A sequential space (X, T) is called minimal sequential if no sequential topology on X is strictly weaker than T. This paper begins the study of minimal sequential Hausdorff spaces. Characterizations of minimal sequential Hausdorff spaces are obtained using filter bases, sequences, and functions satisfying certain graph conditions. Relationships between this class of spaces and other classes of spaces, for example, minimal Hausdorff spaces, countably compact spaces, H-closed spaces, SQ-closed spaces, and subspaces of minimal sequential spaces, are investigated. While the property of being sequential is not (in general) preserved by products, some information is provided on the question of when the product of minimal sequential spaces is minimal sequential.
Displacement data assimilation
Energy Technology Data Exchange (ETDEWEB)
Rosenthal, W. Steven [Pacific Northwest Laboratory, Richland, WA 99354 (United States); Venkataramani, Shankar [Department of Mathematics and Program in Applied Mathematics, University of Arizona, Tucson, AZ 85721 (United States); Mariano, Arthur J. [Rosenstiel School of Marine & Atmospheric Science, University of Miami, Miami, FL 33149 (United States); Restrepo, Juan M., E-mail: restrepo@math.oregonstate.edu [Department of Mathematics, Oregon State University, Corvallis, OR 97331 (United States)
2017-02-01
We show that modifying a Bayesian data assimilation scheme by incorporating kinematically-consistent displacement corrections produces a scheme that is demonstrably better at estimating partially observed state vectors in a setting where feature information is important. While the displacement transformation is generic, here we implement it within an ensemble Kalman Filter framework and demonstrate its effectiveness in tracking stochastically perturbed vortices.
Improving operational flood forecasting through data assimilation
Rakovec, Oldrich; Weerts, Albrecht; Uijlenhoet, Remko; Hazenberg, Pieter; Torfs, Paul
2010-05-01
Accurate flood forecasts have been a challenging topic in hydrology for decades. Uncertainty in hydrological forecasts is due to errors in the initial state (e.g. forcing errors in historical mode), errors in model structure and parameters and, last but not least, errors in the model forcings (weather forecasts) during the forecast mode. More accurate flood forecasts can be obtained through data assimilation by merging observations with model simulations. This makes it possible to identify the sources of uncertainty in the flood forecasting system. Our aim is to assess the different sources of error that affect the initial state and to investigate how they propagate through hydrological models with different levels of spatial variation, starting from lumped models. The knowledge thus obtained can then be used in a data assimilation scheme to improve the flood forecasts. This study presents the first results of this framework and focuses on quantifying precipitation errors and their effect on discharge simulations within the Ourthe catchment (1600 km2), which is situated in the Belgian Ardennes and is one of the larger subbasins of the Meuse River. Inside the catchment, hourly rain gauge information from 10 different locations is available over a period of 15 years. Based on these time series, the bootstrap method has been applied to generate precipitation ensembles. These were then used to simulate the catchment's discharges at the outlet. The corresponding streamflow ensembles were further assimilated with observed river discharges to update the model states of lumped hydrological models (R-PDM, HBV) through residual resampling. This particle filtering technique is a sequential data assimilation method and makes no prior assumption about the probability density function of the model states, which, in contrast to the ensemble Kalman filter, does not have to be Gaussian. Our further research will be aimed at quantifying and reducing the sources of uncertainty that affect the initial
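The residual resampling step used above can be written in a few lines; this is a generic sketch of the technique, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(6)

# Residual resampling: keep floor(N*w_i) deterministic copies of each
# particle, then fill the remainder by multinomial sampling on the
# fractional parts.  This reduces resampling variance compared with
# plain multinomial resampling.
def residual_resample(weights):
    N = len(weights)
    counts = np.floor(N * weights).astype(int)       # deterministic part
    residual = N * weights - counts                  # fractional parts
    n_left = N - counts.sum()
    if n_left > 0:
        residual /= residual.sum()
        counts += np.bincount(rng.choice(N, size=n_left, p=residual),
                              minlength=N)
    return np.repeat(np.arange(N), counts)           # particle indices to keep

w = np.array([0.5, 0.25, 0.125, 0.125])
idx = residual_resample(w)
print(idx)   # particle 0 appears at least twice, particle 1 at least once
```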
Real-time projections of cholera outbreaks through data assimilation and rainfall forecasting
Pasetto, Damiano; Finger, Flavio; Rinaldo, Andrea; Bertuzzo, Enrico
2017-10-01
Although treatment for cholera is well-known and cheap, outbreaks in epidemic regions still exact high death tolls mostly due to the unpreparedness of health care infrastructures to face unforeseen emergencies. In this context, mathematical models for the prediction of the evolution of an ongoing outbreak are of paramount importance. Here, we test a real-time forecasting framework that readily integrates new information as soon as available and periodically issues an updated forecast. The spread of cholera is modeled by a spatially-explicit scheme that accounts for the dynamics of susceptible, infected and recovered individuals hosted in different local communities connected through hydrologic and human mobility networks. The framework presents two major innovations for cholera modeling: the use of a data assimilation technique, specifically an ensemble Kalman filter, to update both state variables and parameters based on the observations, and the use of rainfall forecasts to force the model. The exercise of simulating the state of the system and the predictive capabilities of the novel tools, set at the initial phase of the 2010 Haitian cholera outbreak using only information that was available at that time, serves as a benchmark. Our results suggest that the assimilation procedure with the sequential update of the parameters outperforms calibration schemes based on Markov chain Monte Carlo. Moreover, in a forecasting mode the model usefully predicts the spatial incidence of cholera at least one month ahead. The performance decreases for longer time horizons yet allowing sufficient time to plan for deployment of medical supplies and staff, and to evaluate alternative strategies of emergency management.
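The sequential update of both states and parameters can be sketched with an augmented-state ensemble Kalman filter on a toy outbreak model: each ensemble member carries the uncertain transmission parameter alongside the state, so the analysis step updates both at once. The logistic model, noise levels and ensemble size below are illustrative assumptions, not the paper's spatially-explicit cholera model:

```python
import numpy as np

rng = np.random.default_rng(8)

N, T = 200, 30
beta_true, dt = 0.4, 1.0

def step(I, beta):
    # logistic growth of the infected fraction (illustrative dynamics)
    return I + dt * beta * I * (1.0 - I)

# Synthetic truth and noisy incidence observations
I_true, obs = 0.01, []
for _ in range(T):
    I_true = step(I_true, beta_true)
    obs.append(I_true + rng.normal(0, 0.02))

# Ensemble of augmented vectors [I, beta]
ens = np.column_stack([np.full(N, 0.01), rng.uniform(0.1, 0.8, N)])
R = 0.02 ** 2
for y in obs:
    ens[:, 0] = step(ens[:, 0], ens[:, 1])           # forecast each member
    A = ens - ens.mean(axis=0)
    P = A.T @ A / (N - 1)                            # 2x2 sample covariance
    K = P[:, 0] / (P[0, 0] + R)                      # gain for the scalar obs of I
    innov = y + rng.normal(0, 0.02, N) - ens[:, 0]   # perturbed observations
    ens += np.outer(innov, K)                        # joint state-parameter update

print(ens[:, 1].mean())   # the ensemble beta concentrates near the true value
```

The sample cross-covariance between the parameter and the observed state is what drives the parameter update, which is the mechanism behind the sequential parameter estimation described in the abstract.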
Data assimilation strategies for volcano geodesy
Zhan, Yan; Gregg, Patricia M.
2017-09-01
Ground deformation observed using near-real time geodetic methods, such as InSAR and GPS, can provide critical information about the evolution of a magma chamber prior to volcanic eruption. Rapid advancement in numerical modeling capabilities has resulted in a number of finite element models targeted at better understanding the connection between surface uplift associated with magma chamber pressurization and the potential for volcanic eruption. Robust model-data fusion techniques are necessary to take full advantage of the numerical models and the volcano monitoring observations currently available. In this study, we develop a 3D data assimilation framework using the Ensemble Kalman Filter (EnKF) approach in order to combine geodetic observations of surface deformation with geodynamic models to investigate volcanic unrest. The EnKF sequential assimilation method utilizes disparate data sets as they become available to update geodynamic models of magma reservoir evolution. While the EnKF has been widely applied in hydrologic and climate modeling, the adaptation for volcano monitoring is in its initial stages. As such, our investigation focuses on conducting a series of sensitivity tests to optimize the EnKF for volcano applications and on developing specific strategies for assimilation of geodetic data. Our numerical experiments illustrate that the EnKF is able to adapt well to the spatial limitations posed by GPS data and the temporal limitations of InSAR, and that specific strategies can be adopted to enhance EnKF performance to improve model forecasts. Specifically, our numerical experiments indicate that: (1) incorporating additional iterations of the EnKF analysis step is more efficient than increasing the number of ensemble members; (2) the accuracy of the EnKF results are not affected by initial parameter assumptions; (3) GPS observations near the center of uplift improve the quality of model forecasts; (4) occasionally shifting continuous GPS stations to
Impact of real-time measurements for data assimilation in reservoir simulation
Energy Technology Data Exchange (ETDEWEB)
Schulze-Riegert, R.; Krosche, M. [Scandpower Petroleum Technology GmbH, Hamburg (Germany); Pajonk, O. [TU Braunschweig (Germany). Inst. fuer Wissenschaftliches Rechnen; Myrland, T. [Norges Teknisk-Naturvitenskapelige Univ. (NTNU), Trondheim (Norway)
2008-10-23
This paper gives an overview of the conceptual background of data assimilation techniques. The framework of sequential data assimilation, as described for the ensemble Kalman filter implementation, allows a continuous integration of new measurement data. The initial diversity of ensemble members is critical for the assimilation process and the ability to successfully assimilate measurement data. At the same time, the initial ensemble will impact the propagation of uncertainties, with crucial consequences for production forecasts. Data assimilation techniques have complementary features compared to other optimization techniques built on selection or regression schemes. Specifically, EnKF is applicable to real field cases and defines an important perspective for facilitating continuous reservoir simulation model updates in a reservoir life cycle. (orig.)
Integrated Data Assimilation Architecture Project
National Aeronautics and Space Administration — The Integrated Data Assimilation Architecture (IDAA) addresses the fundamental problem of command, control, and communications systems interoperability....
Handling the unknown soil hydraulic parameters in data assimilation for unsaturated flow problems
Lange, Natascha; Erdal, Daniel; Neuweiler, Insa
2017-04-01
… Considering heterogeneous soils, we discuss the representativeness of different observation types to be used for the assimilation. G. Evensen. Sequential data assimilation with a nonlinear quasi-geostrophic model using Monte Carlo methods to forecast error statistics. Journal of Geophysical Research: Oceans, 99(C5):10143-10162, 1994
Variational data assimilation using targetted random walks
Cotter, S. L.
2011-02-15
The variational approach to data assimilation is a widely used methodology for both online prediction and reanalysis. In either of these scenarios, it can be important to assess uncertainties in the assimilated state. Ideally, it is desirable to have complete information concerning the Bayesian posterior distribution for the unknown state given data. We show that complete computational probing of this posterior distribution is now within reach in the offline situation. We introduce a Markov chain Monte Carlo (MCMC) method which enables us to directly sample from the Bayesian posterior distribution on the unknown functions of interest given observations. Since these methods are currently too computationally expensive to use in an online filtering scenario, we frame this in the context of offline reanalysis. Using a simple random-walk MCMC method, we are able to characterize the posterior distribution using only evaluations of the forward model of the problem and of the model and data mismatch. No adjoint model is required for the method we use; however, more sophisticated MCMC methods are available which exploit derivative information. For simplicity of exposition, we consider the problem of assimilating data, either Eulerian or Lagrangian, into a low Reynolds number flow in a two-dimensional periodic geometry. We show that in many cases it is possible to recover the initial condition and model error (which we describe as unknown forcing to the model) from data, and that with increasing amounts of informative data, the uncertainty in our estimates reduces. © 2011 John Wiley & Sons, Ltd.
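A minimal random-walk Metropolis sketch of the approach is below: only forward-model evaluations and the model-data misfit are needed, and no adjoint. The scalar forward model, prior and noise level are illustrative assumptions standing in for the flow problem:

```python
import numpy as np

rng = np.random.default_rng(7)

# Sample the Bayesian posterior of an unknown forcing parameter u given
# noisy observations of a forward model G(u).
def G(u):
    return np.array([u, u ** 2])        # illustrative forward model

u_true = 1.2
y = G(u_true) + rng.normal(0, 0.1, 2)   # data with observational noise

def log_post(u):
    # Gaussian prior N(0, 2^2) plus Gaussian model-data misfit
    return -u ** 2 / 8.0 - np.sum((y - G(u)) ** 2) / (2 * 0.1 ** 2)

# Random-walk Metropolis: accept/reject using only forward evaluations.
u, chain = 0.0, []
lp = log_post(u)
for _ in range(20000):
    prop = u + rng.normal(0, 0.3)       # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        u, lp = prop, lp_prop
    chain.append(u)

chain = np.array(chain[5000:])          # discard burn-in
print(chain.mean(), chain.std())
```

The retained chain characterizes the posterior: its mean recovers the unknown forcing and its spread quantifies the remaining uncertainty, which shrinks as more informative data are added.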
Assimilation: central and peripheral effects
Weert, C.M.M. de; Kruysbergen, N.A.W.H. van
1997-01-01
Assimilation and contrast have opposite effects: Contrast leads to an increase of perceived differences between neighbouring fields, whereas assimilation leads to a reduction. It is relatively easy to demonstrate these effects, but the precise localisation of these effects in the perceptual system
Van Leeuwen, Peter Jan; Reich, Sebastian
2015-01-01
This book contains two review articles on nonlinear data assimilation that deal with closely related topics but were written and can be read independently. Both contributions focus on so-called particle filters. The first contribution, by Jan van Leeuwen, focuses on the potential of proposal densities. It discusses the issues with present-day particle filters and explores new ideas for proposal densities to solve them, converging to particle filters that work well in systems of any dimension, and closes with a high-dimensional example. The second contribution, by Cheng and Reich, discusses a unified framework for ensemble-transform particle filters. This allows one to bridge successful ensemble Kalman filters with fully nonlinear particle filters, and allows a proper introduction of localization in particle filters, which has been lacking up to now.
Priming in concert: Assimilation and contrast with multiple affective and gender primes.
Fockenberg, D.A.; Koole, S.L.; Semin, G.R.
2008-01-01
The present research investigated the influence of multiple sequential primes on social categorization processes. Study 1 examined an evaluative decision task in which targets were preceded and succeeded by two primes. As expected, the temporally closest forward primes had assimilative effects on
Identifying a land use change cellular automaton by Bayesian data assimilation
Verstegen, J.A.; Karssenberg, D.J.; Hilst, F. van der; Faaij, A.
2014-01-01
We present a Bayesian method that simultaneously identifies the model structure and calibrates the parameters of a cellular automaton (CA). The method entails sequential assimilation of observations, using a particle filter. It employs prior knowledge of experts to define which processes might
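The sequential assimilation by particle filter mentioned here can be sketched in its simplest bootstrap form. The model below is a hypothetical linear-Gaussian toy (a stand-in, not the paper's cellular automaton): propagate particles, weight them by the observation likelihood, then resample.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical toy state-space model (illustrative only):
# x_t = 0.9 x_{t-1} + process noise,  y_t = x_t + observation noise.
T, N = 50, 1000            # time steps, particles
q, r = 0.5, 0.5            # process / observation noise std
x_true = np.zeros(T)
for t in range(1, T):
    x_true[t] = 0.9 * x_true[t - 1] + q * rng.standard_normal()
y = x_true + r * rng.standard_normal(T)

particles = rng.standard_normal(N)   # prior ensemble of states
est = np.zeros(T)
for t in range(T):
    particles = 0.9 * particles + q * rng.standard_normal(N)  # propagate
    w = np.exp(-0.5 * ((y[t] - particles) / r) ** 2)          # likelihood weights
    w /= w.sum()
    est[t] = np.dot(w, particles)                             # posterior mean
    particles = rng.choice(particles, size=N, p=w)            # resample
```

The filtered estimate tracks the true state more closely than the raw observations do, which is the point of sequentially assimilating each observation as it arrives.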
Noh, S.J.; Rakovec, O.; Weerts, A.H.; Tachikawa, Y.
2014-01-01
We investigate the effects of noise specification on the quality of hydrological forecasts via an advanced data assimilation (DA) procedure using a distributed hydrological model driven by numerical weather predictions. The sequential DA procedure is based on (1) a multivariate rainfall ensemble
Sequential charged particle reaction
International Nuclear Information System (INIS)
Hori, Jun-ichi; Ochiai, Kentaro; Sato, Satoshi; Yamauchi, Michinori; Nishitani, Takeo
2004-01-01
The effective cross sections for producing the sequential reaction products in F82H, pure vanadium and LiF with respect to 14.9-MeV neutrons were obtained and compared with estimated ones. Since the sequential reactions depend on the behavior of the secondary charged particles, the effective cross sections depend on the target nuclei and the material composition. The effective cross sections were also estimated using the EAF libraries and compared with the experimental values; there were large discrepancies between the estimated and experimental values. Additionally, we show the contribution of the sequential reactions to the induced activity and dose rate in the boundary region with water. The present study clarifies that sequential reactions are of great importance for evaluating the dose rates around the surface of cooling pipes and the activated corrosion products. (author)
Zapata Norberto, B.; Morales-Casique, E.; Herrera, G. S.
2017-12-01
Severe land subsidence due to groundwater extraction may occur in multiaquifer systems where highly compressible aquitards are present. The highly compressible nature of the aquitards leads to nonlinear consolidation where the groundwater flow parameters are stress-dependent. The case is further complicated by the heterogeneity of the hydrogeologic and geotechnical properties of the aquitards. We explore the effect of realistic vertical heterogeneity of hydrogeologic and geotechnical parameters on the consolidation of highly compressible aquitards by means of 1-D Monte Carlo numerical simulations. 2000 realizations are generated for each of the following parameters: hydraulic conductivity (K), compression index (Cc) and void ratio (e). The correlation structure, the mean and the variance for each parameter were obtained from a literature review about field studies in the lacustrine sediments of Mexico City. The results indicate that among the parameters considered, random K has the largest effect on the ensemble average behavior of the system. Random K leads to the largest variance (and therefore largest uncertainty) of total settlement, groundwater flux and time to reach steady state conditions. We further propose a data assimilation scheme by means of ensemble Kalman filter to estimate the ensemble mean distribution of K, pore-pressure and total settlement. We consider the case where pore-pressure measurements are available at given time intervals. We test our approach by generating a 1-D realization of K with exponential spatial correlation, and solving the nonlinear flow and consolidation problem. These results are taken as our "true" solution. We take pore-pressure "measurements" at different times from this "true" solution. The ensemble Kalman filter method is then employed to estimate ensemble mean distribution of K, pore-pressure and total settlement based on the sequential assimilation of these pore-pressure measurements. The ensemble-mean estimates from
Data Assimilation by Conditioning of Driving Noise on Future Observations
Lee, Wonjung
2014-08-01
Conventional recursive filtering approaches, designed for quantifying the state of an evolving stochastic dynamical system with intermittent observations, use a sequence of (i) an uncertainty propagation step followed by (ii) a step where the associated data is assimilated using Bayes' rule. Alternatively, the order of the steps can be switched to (i) one-step-ahead data assimilation followed by (ii) uncertainty propagation. In this paper, we apply this smoothing-based sequential filter to systems driven by random noise, with the conditioning on future observations applied not only to the system variable but also to the driving noise. Our research reveals that, for the nonlinear filtering problem, the conditioned driving noise is biased by a nonzero mean and in turn pushes the filtering solution forward in time, closer to the true state, when it drives the system. As a result, our proposed method can yield a more accurate approximate solution for the state estimation problem. © 1991-2012 IEEE.
Global Data Assimilation System (GDAS)
National Oceanic and Atmospheric Administration, Department of Commerce — The Global Data Assimilation System (GDAS) is the system used by the Global Forecast System (GFS) model to place observations into a gridded model space for the...
Energy Technology Data Exchange (ETDEWEB)
Gratch, J. [Univ. of Southern California, Marina del Rey, CA (United States)
1996-12-31
This article advocates a new model for inductive learning. Called sequential induction, it helps bridge classical fixed-sample learning techniques (which are efficient but difficult to formally characterize) and worst-case approaches (which provide strong statistical guarantees but are too inefficient for practical use). Learning proceeds as a sequence of decisions which are informed by training data. By analyzing induction at the level of these decisions, and by utilizing only enough data to make each decision, sequential induction provides statistical guarantees with substantially less data than worst-case methods require. The sequential induction model is also useful as a method for determining a sufficient sample size for inductive learning and, as such, is relevant to learning problems where the preponderance of data or the cost of gathering data precludes the use of traditional methods.
Khaki, M.; Schumacher, M.; Forootan, E.; Kuhn, M.; Awange, J. L.; van Dijk, A. I. J. M.
2017-10-01
Assimilation of terrestrial water storage (TWS) information from the Gravity Recovery And Climate Experiment (GRACE) satellite mission can provide significant improvements in hydrological modelling. However, the rather coarse spatial resolution of GRACE TWS and its spatially correlated errors pose considerable challenges for achieving realistic assimilation results. Consequently, successful data assimilation depends on rigorous modelling of the full error covariance matrix of the GRACE TWS estimates, as well as realistic error behavior for hydrological model simulations. In this study, we assess the application of local analysis (LA) to maximize the contribution of GRACE TWS in hydrological data assimilation. For this, we assimilate GRACE TWS into the World-Wide Water Resources Assessment system (W3RA) over the Australian continent while applying LA and accounting for existing spatial correlations using the full error covariance matrix. GRACE TWS data is applied with different spatial resolutions including 1° to 5° grids, as well as basin averages. The ensemble-based sequential filtering technique of the Square Root Analysis (SQRA) is applied to assimilate TWS data into W3RA. For each spatial scale, the performance of the data assimilation is assessed through comparison with independent in-situ ground water and soil moisture observations. Overall, the results demonstrate that LA is able to stabilize the inversion process (within the implementation of the SQRA filter) leading to less errors for all spatial scales considered with an average RMSE improvement of 54% (e.g., 52.23 mm down to 26.80 mm) for all the cases with respect to groundwater in-situ measurements. Validating the assimilated results with groundwater observations indicates that LA leads to 13% better (in terms of RMSE) assimilation results compared to the cases with Gaussian errors assumptions. This highlights the great potential of LA and the use of the full error covariance matrix of GRACE TWS
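The local analysis discussed in this abstract works by suppressing spurious long-range sample correlations before the ensemble update. A minimal sketch of the idea (a stochastic EnKF with a Gaussian covariance taper on a synthetic 1-D field; the model, grid, and numbers are illustrative, not the W3RA/SQRA setup of the paper):

```python
import numpy as np

def enkf_localized(ens, y, H, r_var, loc_radius, rng):
    """Stochastic EnKF update with a Gaussian covariance taper: a simple
    stand-in for localization (damp spurious long-range sample
    correlations before computing the Kalman gain)."""
    n, N = ens.shape
    m = H.shape[0]
    Pf = np.cov(ens)                                   # (n, n) sample covariance
    d = np.abs(np.arange(n)[:, None] - np.arange(n)[None, :])
    taper = np.exp(-0.5 * (d / loc_radius) ** 2)       # Gaussian localization
    Pf_loc = taper * Pf                                # Schur (element-wise) product
    K = Pf_loc @ H.T @ np.linalg.inv(H @ Pf_loc @ H.T + r_var * np.eye(m))
    pert = np.sqrt(r_var) * rng.standard_normal((m, N))  # perturbed observations
    return ens + K @ (y[:, None] + pert - H @ ens)

rng = np.random.default_rng(3)
n, N = 40, 25                                 # grid cells, ensemble members
truth = np.sin(2 * np.pi * np.arange(n) / n)  # synthetic "true" TWS-like field
H = np.eye(n)[::4]                            # observe every 4th grid cell
y = H @ truth + 0.1 * rng.standard_normal(H.shape[0])
ens0 = rng.standard_normal((n, N))            # uninformative prior ensemble
ens1 = enkf_localized(ens0, y, H, 0.01, 5.0, rng)
```

After a single update the ensemble mean moves toward the truth at observed cells, while the taper prevents the noisy sample covariance from corrupting distant, unobserved cells.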
Sequential stochastic optimization
Cairoli, Renzo
1996-01-01
Sequential Stochastic Optimization provides mathematicians and applied researchers with a well-developed framework in which stochastic optimization problems can be formulated and solved. Offering much material that is either new or has never before appeared in book form, it lucidly presents a unified theory of optimal stopping and optimal sequential control of stochastic processes. This book has been carefully organized so that little prior knowledge of the subject is assumed; its only prerequisites are a standard graduate course in probability theory and some familiarity with discrete-paramet
Plowshare sequential device test
Energy Technology Data Exchange (ETDEWEB)
Ballou, L. B.
1971-08-02
For over a year we have been advocating the development of a hardened or ruggedized version of Diamond which will be suitable for sequential detonation of multiple explosives in one emplacement hole. A Plowshare-sponsored device development test, named 'Yacht', is proposed for execution in Area 15 at the Nevada Test Site [NTS] in late September 1972. The test is designed to evaluate the ability of a ruggedized Diamond-type explosive assembly to withstand the effects of an adjacent nuclear detonation in the same emplacement hole and then be sequentially fired. The objectives and experimental plan for this concept are provided.
Variational Monte Carlo Technique
Indian Academy of Sciences (India)
ias
Resonance, August 2014, General Article. Variational Monte Carlo Technique: Ground State Energies of Quantum Mechanical Systems. Sukanta Deb. Keywords: variational methods, Monte Carlo techniques, harmonic oscillators, quantum mechanical systems. Sukanta Deb is an Assistant Professor in the
Indian Academy of Sciences (India)
Keywords: Gibbs sampling, Markov chain Monte Carlo, Bayesian inference, stationary distribution, convergence, image restoration. Arnab Chakraborty. We describe the mathematics behind the Markov chain Monte Carlo method of ...
Assimilation of Coarse-Scale Data Using the Ensemble Kalman Filter
Efendiev, Yalchin
2011-01-01
Reservoir data is usually scale dependent and exhibits multiscale features. In this paper we use the ensemble Kalman filter (EnKF) to integrate data at different spatial scales for estimating reservoir fine-scale characteristics. Relationships between the various scales are modeled via upscaling techniques. We propose two versions of the EnKF to assimilate the multiscale data: (i) all the data are assimilated together, and (ii) the data are assimilated sequentially in batches, where ensemble members obtained after assimilating one set of data are used as a prior to assimilate the next set. Both of these versions are easily implementable with any upscaling technique that links the fine and coarse scales. The numerical results with different methods are presented in a twin-experiment setup using a two-dimensional, two-phase (oil and water) flow model. Results are shown with coarse-scale permeability and coarse-scale saturation data. They indicate that additional data provide better fine-scale estimates and fractional-flow predictions. We observed that the two versions of the EnKF differed in their estimates when coarse-scale permeability is provided, whereas their results are similar when coarse-scale saturation is used. This behavior is thought to be due to the nonlinearity of the upscaling operator in the former case. We also tested our procedures with various precisions of the coarse-scale data to account for the inexact relationship between the fine- and coarse-scale data. As expected, the results show that higher precision in the coarse-scale data yielded improved estimates. With better coarse-scale modeling and inversion techniques, as more data at multiple coarse scales become available, the proposed modification to the EnKF could be relevant in future studies.
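A useful fact behind the batch-sequential variant described above is that, for linear-Gaussian problems, assimilating independent data sets one after the other (each posterior becoming the next prior) gives exactly the same result as assimilating them jointly. A scalar sketch (toy numbers, not the reservoir model):

```python
import numpy as np

def kalman_update(mean, var, y, r):
    """Scalar Bayes/Kalman update for a direct observation y with variance r."""
    k = var / (var + r)
    return mean + k * (y - mean), (1 - k) * var

# Prior on a fine-scale quantity, two independent "coarse" observations.
m0, v0 = 0.0, 1.0
y1, r1 = 1.2, 0.5
y2, r2 = 0.8, 0.25

# (ii) Sequential: the posterior after y1 becomes the prior for y2.
m, v = kalman_update(m0, v0, y1, r1)
m_seq, v_seq = kalman_update(m, v, y2, r2)

# (i) Joint: assimilate both at once via precision-weighted averaging.
prec = 1 / v0 + 1 / r1 + 1 / r2
m_joint = (m0 / v0 + y1 / r1 + y2 / r2) / prec
v_joint = 1 / prec
# m_seq == m_joint and v_seq == v_joint (up to rounding)
```

The nonlinearity of the upscaling operator is exactly what breaks this equivalence in the permeability case the abstract reports, which is why the two EnKF versions can then differ.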
Directory of Open Access Journals (Sweden)
Jingyi Jiang
2014-09-01
Full Text Available High-quality leaf area index (LAI products retrieved from satellite observations are urgently needed for crop growth monitoring and yield estimation, land-surface process simulation and global change studies. In recent years, sequential assimilation methods have been increasingly used to retrieve LAI from time series remote-sensing data. However, the inherent characteristics of these sequential assimilation methods result in temporal discontinuities in the retrieved LAI profiles. In this study, a sequential assimilation method with incremental analysis update (IAU was developed to jointly update model states and parameters and to retrieve temporally continuous LAI profiles from time series Moderate Resolution Imaging Spectroradiometer (MODIS reflectance data. Based on the existing multi-year Global Land Surface Satellite (GLASS LAI product, a dynamic model was constructed to evolve LAI anomalies over time. The sequential assimilation method with an IAU technique takes advantage of the Kalman filter (KF technique to update model parameters, uses the ensemble Kalman filter (EnKF technique to update LAI anomalies recursively from time series MODIS reflectance data and then calculates the temporally continuous LAI values by combining the LAI climatology data. The method was tested over eight Committee on Earth Observing Satellites-Benchmark Land Multisite Analysis and Intercomparison of Products (CEOS-BELMANIP sites with different vegetation types. The results indicate that the sequential method with IAU can precisely reconstruct the seasonal variation patterns of LAI and that the LAI profiles derived from the sequential method with IAU are smooth and continuous.
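The incremental analysis update (IAU) used above can be illustrated on a toy damped model: instead of adding the full analysis increment at once, the increment is fed in evenly over the window, avoiding discontinuities in the trajectory. All numbers below are illustrative, not from the LAI system.

```python
import numpy as np

def iau_run(x0, increment, n_steps):
    """Run a toy damped model while feeding the analysis increment in
    evenly over the window (incremental analysis update)."""
    x, traj = x0, []
    for _ in range(n_steps):
        x = 0.95 * x + increment / n_steps   # model step + IAU forcing
        traj.append(x)
    return np.array(traj)

x0, inc, n = 0.0, 1.0, 10
iau_traj = iau_run(x0, inc, n)

# Direct insertion for comparison: add the whole increment before stepping.
x = x0 + inc
direct_traj = []
for _ in range(n):
    x = 0.95 * x
    direct_traj.append(x)
direct_traj = np.array(direct_traj)
# The largest step-to-step jump is far smaller under IAU.
```

This smoothness is what yields the temporally continuous LAI profiles the abstract emphasizes, in contrast to the jumps produced by plain sequential updates.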
Sequential memory: Binding dynamics
Afraimovich, Valentin; Gong, Xue; Rabinovich, Mikhail
2015-10-01
Temporal order memories are critical for everyday animal and human functioning. Experiments and our own experience show that the binding or association of various features of an event together and the maintaining of multimodality events in sequential order are the key components of any sequential memories—episodic, semantic, working, etc. We study the robustness of binding sequential dynamics based on our previously introduced model in the form of generalized Lotka-Volterra equations. In the phase space of the model, there exists a multi-dimensional binding heteroclinic network consisting of saddle equilibrium points and heteroclinic trajectories joining them. We prove here the robustness of the binding sequential dynamics, i.e., the feasibility phenomenon for coupled heteroclinic networks: for each collection of successive heteroclinic trajectories inside the unified networks, there is an open set of initial points such that the trajectory going through each of them follows the prescribed collection, staying in a small neighborhood of it. We show also that the symbolic complexity function of the system restricted to this neighborhood is a polynomial of degree L - 1, where L is the number of modalities.
Toward the assimilation of images
Directory of Open Access Journals (Sweden)
F.-X. Le Dimet
2015-01-01
Full Text Available The equations that govern geophysical fluids (namely atmosphere, ocean and rivers are well known but their use for prediction requires the knowledge of the initial condition. In many practical cases, this initial condition is poorly known and the use of an imprecise initial guess is not sufficient to perform accurate forecasts because of the high sensitivity of these systems to small perturbations. As every situation is unique, the only additional information that can help to retrieve the initial condition are observations and statistics. The set of methods that combine these sources of heterogeneous information to construct such an initial condition are referred to as data assimilation. More and more images and sequences of images, of increasing resolution, are produced for scientific or technical studies. This is particularly true in the case of geophysical fluids that are permanently observed by remote sensors. However, the structured information contained in images or image sequences is not assimilated as regular observations: images are still (under-)utilized to produce qualitative analysis by experts. This paper deals with the quantitative assimilation of information provided in image form into a numerical model of a dynamical system. We describe several possibilities for such assimilation and identify the associated difficulties. Results from our ongoing research are used to illustrate the methods. The assimilation of images is a very general framework that can be transposed to several scientific domains.
Directory of Open Access Journals (Sweden)
Xuefeng Zhang
2015-01-01
Full Text Available Sequential, adaptive, and gradient diffusion filters are implemented into spatial multiscale three-dimensional variational data assimilation (3DVAR as alternative schemes to model the background error covariance matrix for the commonly used correction scale method, recursive filter method, and sequential 3DVAR. The gradient diffusion filter (GDF is verified by a two-dimensional sea surface temperature (SST assimilation experiment. Compared to the existing DF, the new GDF scheme shows superior performance in the assimilation experiment due to its success in extracting the spatial multiscale information. The GDF can successfully retrieve the longwave information over the whole analysis domain and the shortwave information over data-dense regions. After that, a perfect twin data assimilation experiment framework is designed to study the effect of the GDF on the state estimation based on an intermediate coupled model. In this framework, the assimilation model is subject to “biased” initial fields from the “truth” model. While the GDF reduces the model bias in general, it can enhance the accuracy of the state estimation in regions where observations are removed, especially in the Southern Ocean. In addition, higher forecast skill can be obtained through the better initial state fields produced by the GDF.
A sequential tree approach for incremental sequential pattern mining
Indian Academy of Sciences (India)
Data mining; STISPM; sequential tree; incremental mining; backward tracking. Abstract. 'Sequential pattern mining' is a prominent and significant method to explore knowledge and innovation from large databases. Common sequential pattern mining algorithms handle static databases. Pragmatically, looking into the ...
Data assimilation in hydrological modelling
DEFF Research Database (Denmark)
Drecourt, Jean-Philippe
Data assimilation is an invaluable tool in hydrological modelling, as it allows one to efficiently combine scarce data with a numerical model to obtain improved model predictions. In addition, data assimilation also provides an uncertainty analysis of the predictions made by the hydrological model...... with model non-linearities and biased errors. A literature review analyzes the most popular techniques and their application in hydrological modelling. Since bias is an important problem in groundwater modelling, two bias-aware Kalman filters have been implemented and compared using an artificial test case...
Data Assimilation for Applied Meteorology
Haupt, S. E.
2012-12-01
Although atmospheric models provide a best estimate of the future state of the atmosphere, due to sensitivity to initial conditions it is difficult to predict the precise future state. For applied problems, however, users often depend on having accurate knowledge of that future state. Improving the prediction of a particular realization of an evolving flow field requires knowledge of the current state of that field and assimilation of local observations into the model. This talk will consider how dynamic assimilation can help address the concerns of users of atmospheric forecasts. First, we will look at the value of assimilation for the renewable energy industry. If industry decision makers can have confidence in the wind and solar power forecasts, they can build their power allocations around the expected renewable resource, saving money for the ratepayers as well as reducing carbon emissions. We will assess the value to that industry of assimilating local real-time observations into the model forecasts. The value of the forecasts with assimilation is important on both short (several hours) and medium (within two days) ranges. A second application will be atmospheric transport and dispersion problems. In particular, we will look at assimilation of concentration data into a prediction model. An interesting aspect of this problem is that the dynamics are a one-way coupled system, with the fluid dynamic equations affecting the concentration equation, but not vice versa. So when the observations are of the concentration, one must infer the fluid dynamics. This one-way coupled system presents a challenge: one must first infer the changes in the flow field from observations of the contaminant, then assimilate that information to recover both the advecting flow and information on the subgrid processes that provide the mixing. To accomplish such assimilation requires a robust method to match the observed contaminant field to the modeled one. One approach is
Sequential measurements of conjugate observables
Energy Technology Data Exchange (ETDEWEB)
Carmeli, Claudio [Dipartimento di Fisica, Universita di Genova, Via Dodecaneso 33, 16146 Genova (Italy); Heinosaari, Teiko [Department of Physics and Astronomy, Turku Centre for Quantum Physics, University of Turku, 20014 Turku (Finland); Toigo, Alessandro, E-mail: claudio.carmeli@gmail.com, E-mail: teiko.heinosaari@utu.fi, E-mail: alessandro.toigo@polimi.it [Dipartimento di Matematica 'Francesco Brioschi', Politecnico di Milano, Piazza Leonardo da Vinci 32, 20133 Milano (Italy)]
2011-07-15
We present a unified treatment of sequential measurements of two conjugate observables. Our approach is to derive a mathematical structure theorem for all the relevant covariant instruments. As a consequence of this result, we show that every Weyl-Heisenberg covariant observable can be implemented as a sequential measurement of two conjugate observables. This method is applicable both in finite- and infinite-dimensional Hilbert spaces, therefore covering sequential spin component measurements as well as position-momentum sequential measurements.
Forced Sequence Sequential Decoding
DEFF Research Database (Denmark)
Jensen, Ole Riis
is possible as low as Eb/No=0.6 dB, which is about 1.7 dB below the signal-to-noise ratio that marks the cut-off rate for the convolutional code. This is possible since the iteration process provides the sequential decoders with side information that allows a smaller average load and minimizes the probability...
Data assimilation in reservoir management
Rommelse, J.R.
2009-01-01
The research presented in this thesis aims at improving computer models that allow simulations of water, oil and gas flows in subsurface petroleum reservoirs. This is done by integrating, or assimilating, measurements into physics-based models. In recent years petroleum technology has developed
Data assimilation making sense of observations
Lahoz, William; Menard, Richard
2010-01-01
In recent years data assimilation methods have been applied to an increasing range of earth science disciplines. This book sets out the theoretical basis of data assimilation with contributions by top international experts in the field.
Assimilation of ocean colour data into a Biogeochemical Flux Model of the Eastern Mediterranean Sea
Directory of Open Access Journals (Sweden)
G. Triantafyllou
2007-08-01
Full Text Available An advanced multivariate sequential data assimilation system has been implemented within the framework of the European MFSTEP project to fit a three-dimensional biogeochemical model of the Eastern Mediterranean to satellite chlorophyll data from the Sea-viewing Wide Field-of-view Sensor (SeaWiFS. The physics are described by the Princeton Ocean Model (POM while the biochemistry of the ecosystem is tackled with the Biogeochemical Flux Model (BFM. The assimilation scheme is based on the Singular Evolutive Extended Kalman (SEEK filter, in which the error statistics were parameterized by means of a suitable set of Empirical Orthogonal Functions (EOFs. To avoid spurious long-range correlations associated with the limited number of EOFs, the filter covariance matrix was given compact support through a radius of influence around every data point location. Hindcast experiments were performed for one year over 1999 and forced with ECMWF 6 h atmospheric fields. The solution of the assimilation system was evaluated against the assimilated data and the MedAtlas climatology, and by assessing the impact of the assimilation on non-observed biogeochemical processes. It is found that the assimilation of SeaWiFS data improves the overall behavior of the BFM model and efficiently removes long term biases from the model despite some difficulties during the spring bloom period. Results, however, suggest the need of subsurface data to enhance the estimation of the ecosystem variables in the deep layers.
Dunn, William L
2012-01-01
Exploring Monte Carlo Methods is a basic text that describes the numerical methods that have come to be known as "Monte Carlo." The book treats the subject generically through the first eight chapters and, thus, should be of use to anyone who wants to learn to use Monte Carlo. The next two chapters focus on applications in nuclear engineering, which are illustrative of uses in other fields. Five appendices are included, which provide useful information on probability distributions, general-purpose Monte Carlo codes for radiation transport, and other matters. The famous "Buffon's needle proble
Directory of Open Access Journals (Sweden)
Bardenet Rémi
2013-07-01
Full Text Available Bayesian inference often requires integrating some function with respect to a posterior distribution. Monte Carlo methods are sampling algorithms that allow one to compute these integrals numerically when they are not analytically tractable. We review here the basic principles and the most common Monte Carlo algorithms, among which are rejection sampling, importance sampling and Markov chain Monte Carlo (MCMC) methods. We give intuition on the theoretical justification of the algorithms as well as practical advice, trying to relate the two. We discuss the application of Monte Carlo in experimental physics, and point to landmarks in the literature for the curious reader.
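Of the algorithms this review covers, self-normalized importance sampling is the shortest to sketch: draw from a tractable proposal, weight by the ratio of the un-normalized target to the proposal density, and normalize the weights. The target below is an arbitrary illustrative density with a known second moment, Γ(3/4)/Γ(1/4) ≈ 0.338, to check against.

```python
import numpy as np

rng = np.random.default_rng(0)

# Un-normalized target density (a posterior known only up to a constant).
def unnorm_post(x):
    return np.exp(-x ** 4)

x = rng.standard_normal(200_000)                      # draws from N(0,1) proposal
proposal_pdf = np.exp(-0.5 * x ** 2) / np.sqrt(2 * np.pi)
w = unnorm_post(x) / proposal_pdf                     # importance weights
w /= w.sum()                                          # self-normalize
est_mean = np.dot(w, x)        # E[x]   under the target (0 by symmetry)
est_var = np.dot(w, x ** 2)    # E[x^2] under the target (= Gamma(3/4)/Gamma(1/4))
```

Because the normal proposal has heavier tails than exp(-x^4), the weights stay bounded and the estimator is well behaved; choosing such a proposal is the practical advice the review emphasizes.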
Lagrangian Displacement Ensembles for Aerosol Data Assimilation (Invited)
da Silva, A.; Colarco, P. R.; Govindaraju, R. C.
2010-12-01
A challenge common to many constituent data assimilation applications is the fact that one observes a much smaller fraction of the phase space than one wishes to estimate. For example, remotely sensed estimates of column-average concentrations are available, while one is faced with the problem of estimating 3D concentrations for initializing a prognostic model. This problem is exacerbated in the case of aerosols because the observable Aerosol Optical Depth (AOD) is not only a column-integrated quantity, but it also sums over a large number of species (dust, sea salt, carbonaceous and sulfate aerosols). An aerosol transport model, when driven by a high-resolution, state-of-the-art analysis of meteorological fields and realistic emissions, can produce skillful forecasts even when no aerosol data are assimilated. The main task of aerosol data assimilation is to address the bias arising from inaccurate emissions, and the Lagrangian misplacement of plumes induced by errors in the driving meteorological fields. As long as one decouples the meteorological and aerosol assimilation, as we do here, the classic baroclinic growth of errors is no longer the main order of business. We will describe an aerosol data assimilation scheme in which the analysis update step is conducted in observation space, using an adaptive maximum-likelihood scheme for estimating background errors in AOD space. This scheme includes explicit sequential bias estimation as in Dee and da Silva (1998). Unlike existing aerosol data assimilation schemes, we do not obtain analysis increments of the 3D concentrations by scaling the background profiles. Instead, we exploit the Lagrangian characteristics of the problem to generate local displacement ensembles. These high-resolution, state-dependent ensembles are then used to parameterize the background errors and generate 3D aerosol increments. The algorithm has computational complexity comparable to the forecasting step by the aerosol transport model
Synthetic Aperture Sequential Beamforming
DEFF Research Database (Denmark)
Kortbek, Jacob; Jensen, Jørgen Arendt; Gammelmark, Kim Løkke
2008-01-01
A synthetic aperture focusing (SAF) technique denoted Synthetic Aperture Sequential Beamforming (SASB), suitable for 2D and 3D imaging, is presented. The technique differs from prior art of SAF in the sense that SAF is performed on pre-beamformed data rather than channel data. The objective...... is stored. The second stage applies the focused image lines from the first stage as input data. The SASB method has been investigated using simulations in Field II and by off-line processing of data acquired with a commercial scanner. The performance of SASB with a static image object is compared with DRF...
News for assimilation or integration
Alencar, Amanda; Deuze, Mark
2017-04-01
This study investigates the functions of news media in shaping acculturation experiences of new economic and refugee immigrants in the Netherlands and Spain. Focus group data revealed that consumption of host country news media was mainly connected to immigrants' deliberate strategies to assimilate the culture, politics and language of the host society, while exposure to transnational news was viewed in terms of strategies of integration in both countries. We also observed that participants' educational background and language skills combined with their perceptions of the host country's news have an impact on the use they make of news for assimilating and/or integrating into the host society. Finally, important sociopolitical conditions of the context influenced the ways participants use the news media in their process of acculturation.
Wavelet Approximation in Data Assimilation
Tangborn, Andrew; Atlas, Robert (Technical Monitor)
2002-01-01
Estimation of the state of the atmosphere with the Kalman filter remains a distant goal because of the high computational cost of evolving the error covariance for both linear and nonlinear systems. Wavelet approximation is presented here as a possible solution that efficiently compresses both global and local covariance information. We demonstrate the compression characteristics on the error correlation field from a global two-dimensional chemical constituent assimilation, and implement an adaptive wavelet approximation scheme on the assimilation of the one-dimensional Burgers' equation. In the former problem, we show that 99% of the error correlation can be represented by just 3% of the wavelet coefficients, with good representation of localized features. In the Burgers' equation assimilation, the discrete linearized equations (tangent linear model) and analysis covariance are projected onto a wavelet basis and truncated to just 6% of the coefficients. A nearly optimal forecast is achieved, and we show that errors due to truncation of the dynamics are no greater than the errors due to covariance truncation.
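The compression idea can be illustrated with an unnormalized Haar transform of a single correlation row: transform, keep only the largest-magnitude coefficients, and invert. This is a pure-Python sketch of the principle, not the wavelet basis or truncation rule used in the paper.

```python
def haar(v):
    """Forward (unnormalized) Haar transform; len(v) must be a power of two."""
    out = list(v)
    n = len(out)
    while n > 1:
        half = n // 2
        tmp = out[:n]
        for i in range(half):
            out[i] = (tmp[2 * i] + tmp[2 * i + 1]) / 2          # averages
            out[half + i] = (tmp[2 * i] - tmp[2 * i + 1]) / 2   # details
        n = half
    return out

def ihaar(c):
    """Inverse of haar()."""
    out = list(c)
    n = 1
    while n < len(out):
        tmp = out[:2 * n]
        for i in range(n):
            out[2 * i] = tmp[i] + tmp[n + i]
            out[2 * i + 1] = tmp[i] - tmp[n + i]
        n *= 2
    return out

def truncate(c, keep):
    """Zero all but the `keep` largest-magnitude coefficients."""
    idx = set(sorted(range(len(c)), key=lambda i: abs(c[i]), reverse=True)[:keep])
    return [ci if i in idx else 0.0 for i, ci in enumerate(c)]
```

For a localized correlation row such as `[1.0, 0.8, 0.5, 0.2, 0.1, 0.0, 0.0, 0.0]`, keeping 3 of the 8 coefficients already reproduces the row to within a modest error, which is the behavior the abstract reports at much higher compression ratios.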
Dating phylogenies with sequentially sampled tips.
Stadler, Tanja; Yang, Ziheng
2013-09-01
We develop a Bayesian Markov chain Monte Carlo (MCMC) algorithm for estimating divergence times using sequentially sampled molecular sequences. This type of data is commonly collected during viral epidemics and is sometimes available from different species in ancient DNA studies. We derive the distribution of ages of nodes in the tree under a birth-death-sequential-sampling (BDSS) model and use it as the prior for divergence times in the dating analysis. We implement the prior in the MCMCtree program in the PAML package for divergence dating. The BDSS prior is very flexible and, with different parameters, can generate trees of very different shapes, suitable for examining the sensitivity of posterior time estimates. We apply the method to a data set of SIV/HIV-2 genes in comparison with a likelihood-based dating method, and to a data set of influenza H1 genes from different hosts in comparison with the Bayesian program BEAST. We examined the impact of tree topology on time estimates and suggest that multifurcating consensus trees should be avoided in dating analysis. We found posterior time estimates for old nodes to be sensitive to the priors on times and rates and suggest that previous Bayesian dating studies may have produced overconfident estimates.
Sequential Effects in Judgements of Attractiveness: The Influences of Face Race and Sex
Kramer, Robin S. S.; Jones, Alex L.; Sharma, Dinkar
2013-01-01
In perceptual decision-making, a person’s response on a given trial is influenced by their response on the immediately preceding trial. This sequential effect was initially demonstrated in psychophysical tasks, but has now been found in more complex, real-world judgements. The similarity of the current and previous stimuli determines the nature of the effect, with more similar items producing assimilation in judgements, while less similarity can cause a contrast effect. Previous research found assimilation in ratings of facial attractiveness, and here, we investigated whether this effect is influenced by the social categories of the faces presented. Over three experiments, participants rated the attractiveness of own- (White) and other-race (Chinese) faces of both sexes that appeared successively. Through blocking trials by race (Experiment 1), sex (Experiment 2), or both dimensions (Experiment 3), we could examine how sequential judgements were altered by the salience of different social categories in face sequences. For sequences that varied in sex alone, own-race faces showed significantly less opposite-sex assimilation (male and female faces perceived as dissimilar), while other-race faces showed equal assimilation for opposite- and same-sex sequences (male and female faces were not differentiated). For sequences that varied in race alone, categorisation by race resulted in no opposite-race assimilation for either sex of face (White and Chinese faces perceived as dissimilar). For sequences that varied in both race and sex, same-category assimilation was significantly greater than opposite-category. Our results suggest that the race of a face represents a superordinate category relative to sex. These findings demonstrate the importance of social categories when considering sequential judgements of faces, and also highlight a novel approach for investigating how multiple social dimensions interact during decision-making. PMID:24349226
Sequential Ensembles Tolerant to Synthetic Aperture Radar (SAR) Soil Moisture Retrieval Errors
Directory of Open Access Journals (Sweden)
Ju Hyoung Lee
2016-04-01
Full Text Available Due to complicated and undefined systematic errors in satellite observation, data assimilation integrating model states with satellite observations is more complicated than field-measurement-based data assimilation at a local scale. In the case of Synthetic Aperture Radar (SAR) soil moisture, the systematic errors arising from uncertainties in roughness conditions are significant and unavoidable, and current satellite bias correction methods do not resolve these problems very well. Thus, apart from the bias correction process of satellite observation, it is important to assess the inherent capability of satellite data assimilation in such sub-optimal but more realistic observational error conditions. To this end, the time-evolving sequential ensembles of the Ensemble Kalman Filter (EnKF) are compared with the stationary ensemble of the Ensemble Optimal Interpolation (EnOI) scheme, which does not evolve the ensembles over time. As the sensitivity analysis demonstrated that SAR retrievals are more sensitive to surface roughness than to measurement errors, one aim of this study is to monitor how data assimilation alters the effects of roughness on SAR soil moisture retrievals. In the results, both data assimilation schemes provided intermediate values between the SAR overestimation and the model underestimation. However, under the same SAR observational error conditions, the sequential ensembles approached a calibrated model, showing the lowest Root Mean Square Error (RMSE), while the stationary ensemble converged towards the SAR observations, exhibiting the highest RMSE. Compared to stationary ensembles, sequential ensembles have a better tolerance to SAR retrieval errors. Such an inherent nature of the EnKF suggests an operational merit as a satellite data assimilation system, given the limitations of currently available bias correction methods.
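The contrast between the two schemes can be sketched for a single scalar soil-moisture state (toy code with assumed names; the study itself uses full SAR retrievals and a land-surface model). The EnKF updates every member of a time-evolving ensemble with perturbed observations, whereas EnOI shifts one deterministic state using a gain built from a fixed, stationary ensemble.

```python
import random

def ens_var(ens):
    """Sample variance of a scalar ensemble."""
    m = sum(ens) / len(ens)
    return sum((x - m) ** 2 for x in ens) / (len(ens) - 1)

def enkf_update(ens, y, var_o, rng):
    """Stochastic EnKF: every member assimilates a perturbed observation."""
    k = ens_var(ens) / (ens_var(ens) + var_o)
    return [x + k * (y + rng.gauss(0.0, var_o ** 0.5) - x) for x in ens]

def enoi_update(x, static_ens, y, var_o):
    """EnOI: one deterministic state, gain from a stationary ensemble."""
    k = ens_var(static_ens) / (ens_var(static_ens) + var_o)
    return x + k * (y - x)
```

Because the EnKF ensemble is re-propagated between updates, its spread (and hence its gain) adapts to the flow, which is the property the abstract credits for the better tolerance to retrieval errors.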
An ensemble Kalman filter for atmospheric data assimilation: Application to wind tunnel data
Zheng, D. Q.; Leung, J. K. C.; Lee, B. Y.
2010-05-01
In previous work (Zheng et al., 2007, 2009), a data assimilation method based on the ensemble Kalman filter was applied to a Monte Carlo Dispersion Model (MCDM). The results were encouraging when the method was tested in a twin experiment and a short-range field experiment. In this technical note, measured data collected in a wind tunnel experiment have been assimilated into the Monte Carlo dispersion model. The uncertain parameters in the dispersion model, including the source term, release height, turbulence intensity and wind direction, have been considered. The 3D parameters, i.e. the turbulence intensity and wind direction, have been perturbed by 3D random fields. In order to identify the factors that may influence the assimilation results, eight tests with different specifications were carried out. Two strategies for constructing the 3D perturbation field of the wind direction were proposed, and the results show that the two-level strategy performs better than the one-level strategy. It is also found that proper standard deviation and correlation radius of the perturbation field play an important role in the data assimilation results.
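Estimating uncertain model parameters (source term, release height, ...) alongside the state is commonly done by state augmentation. The toy sketch below (a deterministic Kalman-style shift without observation perturbation, chosen for clarity; names are mine) shows the key mechanism: a parameter that is never observed directly is corrected through its sample cross-covariance with the observed state.

```python
def augmented_update(states, params, y, var_o):
    """Joint state-parameter update via state augmentation (scalar toy)."""
    n = len(states)
    ms = sum(states) / n
    mp = sum(params) / n
    var_s = sum((s - ms) ** 2 for s in states) / (n - 1)
    cov_ps = sum((p - mp) * (s - ms) for p, s in zip(params, states)) / (n - 1)
    k_s = var_s / (var_s + var_o)     # gain for the observed state
    k_p = cov_ps / (var_s + var_o)    # gain for the (unobserved) parameter
    new_states = [s + k_s * (y - s) for s in states]
    new_params = [p + k_p * (y - s) for p, s in zip(params, states)]
    return new_states, new_params
```

When parameter and state are positively correlated across the ensemble, an observation above the state mean pulls the parameter estimate upward as well, which is how source-term and wind-direction errors can be corrected from concentration measurements alone.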
Sequential auctions and price anomalies
Directory of Open Access Journals (Sweden)
Trifunović Dejan
2014-01-01
Full Text Available In sequential auctions objects are sold one by one in separate auctions. These sequential auctions might be organized as sequential first-price, second-price, or English auctions. We will derive equilibrium bidding strategies for these auctions. Theoretical models suggest that prices in sequential auctions with private values or with randomly assigned heterogeneous objects should have no trend. However, empirical research contradicts this result and prices exhibit a declining or increasing trend, which is called declining and increasing price anomaly. We will present a review of these empirical results, as well as different theoretical explanations for these anomalies.
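A small simulation makes the benchmark concrete: if unit-demand bidders naively bid their values in each round (a myopic benchmark of my own, not an equilibrium strategy), the round-k price is simply an order statistic of the value draws and declines mechanically. Equilibrium theory instead predicts that forward-looking bidders shade early bids so the expected price path is flat, which is the prediction the empirical anomalies contradict.

```python
import random

def myopic_price_path(n_bidders, n_rounds, n_trials, seed=7):
    """Average price per round when unit-demand bidders myopically bid
    their private values in a sequence of second-price auctions.

    In round k the highest remaining bidder wins and pays the next-highest
    remaining value, i.e. the (k+2)-th highest value overall.
    """
    rng = random.Random(seed)
    totals = [0.0] * n_rounds
    for _ in range(n_trials):
        values = sorted((rng.random() for _ in range(n_bidders)), reverse=True)
        for k in range(n_rounds):
            totals[k] += values[k + 1]
    return [t / n_trials for t in totals]
```

With uniform values the expected round-k price is (n - 1 - k)/(n + 1), a strictly declining sequence; the declining-price anomaly is only an anomaly relative to the flat equilibrium path, not relative to this naive benchmark.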
A sequential tree approach for incremental sequential pattern mining
Indian Academy of Sciences (India)
"Sequential pattern mining" is a prominent and significant method to explore knowledge and innovation from large databases. Common sequential pattern mining algorithms handle static databases. Pragmatically, looking into functional and actual execution, the database grows exponentially, thereby leading to ...
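The core primitive such algorithms build on is subsequence support counting, sketched below in plain Python. This is illustrative only; real miners (e.g. PrefixSpan-style algorithms, or the incremental tree approach of the paper) add candidate generation and tree maintenance on top of it.

```python
def is_subsequence(pattern, sequence):
    """True if `pattern` occurs in `sequence` in order, not necessarily contiguously."""
    it = iter(sequence)
    # `item in it` advances the iterator past the match, preserving order
    return all(item in it for item in pattern)

def support(pattern, database):
    """Number of sequences in the database that contain the pattern."""
    return sum(is_subsequence(pattern, seq) for seq in database)
```

A pattern is "frequent" when its support meets a user-set threshold; incremental miners aim to update these counts as new sequences arrive without rescanning the whole database.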
Energy Technology Data Exchange (ETDEWEB)
Cramer, S.N.
1984-01-01
The MORSE code is a large general-use multigroup Monte Carlo code system. Although no claims can be made regarding its superiority in either theoretical details or Monte Carlo techniques, MORSE has been, since its inception at ORNL in the late 1960s, the most widely used Monte Carlo radiation transport code. The principal reason for this popularity is that MORSE is relatively easy to use, independent of any installation or distribution center, and it can be easily customized to fit almost any specific need. Features of the MORSE code are described.
Sequential Methods and Their Applications
Mukhopadhyay, Nitis
2008-01-01
Illustrates the efficiency of sequential methodologies when dealing with contemporary statistical challenges in many areas. This book explores fixed sample size, sequential probability ratio, and nonparametric tests. It also presents multistage estimation methods for fixed-width confidence interval as well as minimum and bounded risk problems.
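The sequential probability ratio test mentioned here is easy to sketch for Bernoulli data. The code below uses Wald's classical stopping thresholds; the hypothesis values p0 and p1 in the usage example are hypothetical.

```python
import math

def sprt(samples, p0, p1, alpha=0.05, beta=0.05):
    """Wald's sequential probability ratio test for Bernoulli observations.

    Accumulates the log-likelihood ratio of H1 (p = p1) vs H0 (p = p0)
    and stops as soon as it crosses either Wald threshold.
    """
    a = math.log((1 - beta) / alpha)   # upper threshold -> accept H1
    b = math.log(beta / (1 - alpha))   # lower threshold -> accept H0
    llr = 0.0
    for n, x in enumerate(samples, 1):
        llr += math.log((p1 if x else 1 - p1) / (p0 if x else 1 - p0))
        if llr >= a:
            return "accept H1", n
        if llr <= b:
            return "accept H0", n
    return "continue", len(samples)
```

The efficiency claim of sequential methods is visible here: for clearly separated hypotheses the test typically stops after far fewer observations than a fixed-sample-size test of the same error rates.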
Variational Monte Carlo Technique
Indian Academy of Sciences (India)
Variational Monte Carlo Technique: Ground State Energies of Quantum Mechanical Systems. Sukanta Deb. General Article, Resonance – Journal of Science Education, Volume 19, Issue 8, August 2014, pp. 713-739.
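The technique estimates the variational energy as the Metropolis-sampled average of the local energy E_L = (H psi_T)/psi_T over |psi_T|^2. A minimal sketch for the 1D harmonic oscillator (hbar = m = omega = 1) with trial wavefunction psi_T(x) = exp(-a x^2) follows; the step size and sample count are my own toy choices, not the article's.

```python
import math
import random

def vmc_energy(a, steps=2000, delta=1.0, seed=1):
    """Variational Monte Carlo estimate of <E> for psi_T = exp(-a x^2)
    in the 1D harmonic oscillator. For this trial function the local
    energy is E_L(x) = a + x^2 * (1/2 - 2 a^2)."""
    rng = random.Random(seed)
    x, e_sum = 0.0, 0.0
    for _ in range(steps):
        xt = x + rng.uniform(-delta, delta)
        # Metropolis acceptance with probability min(1, |psi(xt)/psi(x)|^2)
        if rng.random() < math.exp(-2 * a * (xt * xt - x * x)):
            x = xt
        e_sum += a + x * x * (0.5 - 2 * a * a)
    return e_sum / steps
```

At the exact variational minimum a = 1/2 the local energy is constant (zero-variance property), so the estimator returns the exact ground-state energy 1/2 regardless of the sampling.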
Data assimilation in the hydrological dispersion module of Rodos
International Nuclear Information System (INIS)
Madsen, H.
2003-01-01
become available for assimilation into the DeMM, more precise estimates of the deposition are obtained, which will reduce the uncertainty in the wash-off modelling. The prediction uncertainty of radionuclide contamination will be further reduced when radionuclide concentration measurements in downstream water bodies become available for updating the hydrological models. For the data assimilation in the HDM, measurements of concentrations of different radionuclides in solute and on suspended sediments will be available. Based on these measurements the hydrological modelling components can be updated. This includes updating of the three different phases of radionuclides (i) in solute, (ii) on suspended sediments, and (iii) in bottom depositions in all computational grid points of the modelled system. Since the three radionuclide phases are linked together via the sorption/desorption process descriptions in the model, the data assimilation system is able to update all three phases when only one of the phases is being measured. The data assimilation system is based on the Kalman filter. In this respect, different cost-effective Kalman filter procedures that are feasible for real-time applications are being developed and implemented. These include the reduced rank square-root filter, in which the error covariance matrix is approximated by a matrix of lower rank using a square-root factorization; an ensemble Kalman filter based on a Monte Carlo simulation approach for propagation of errors; and a steady Kalman filter based on a fixed error assumption. This paper provides a description of the data assimilation system that is being developed and implemented in the RODOS HDM. Test examples are presented that illustrate the use of the data assimilation procedures to improve the predictive capabilities of the one-dimensional and two-dimensional models of the RODOS HDM for prediction of radionuclide contamination of rivers and reservoirs. (author)
A data assimilation tool for the Pagasitikos Gulf ecosystem dynamics: Methods and benefits
Korres, Gerasimos
2012-06-01
Within the framework of the European INSEA project, an advanced assimilation system has been implemented for the Pagasitikos Gulf ecosystem. The system is based on a multivariate sequential data assimilation scheme that combines satellite ocean color (chlorophyll-a) data with the predictions of a three-dimensional coupled physical-biochemical model of the Pagasitikos Gulf ecosystem presented in a companion paper. The hydrodynamics are solved with a very high resolution (1/100°) implementation of the Princeton Ocean Model (POM). This model is nested within a coarser resolution model of the Aegean Sea, which is part of the Greek POSEIDON forecasting system. The forecast of the Aegean Sea model, itself nested and initialized from a Mediterranean implementation of POM, is also used to periodically re-initialize the Pagasitikos hydrodynamics model using variational initialization techniques. The ecosystem dynamics of Pagasitikos are tackled with a stand-alone implementation of the European Regional Seas Ecosystem Model (ERSEM). The assimilation scheme is based on the Singular Evolutive Extended Kalman (SEEK) filter, in which the error statistics are parameterized by means of a suitable set of Empirical Orthogonal Functions (EOFs). The assimilation experiments were performed for year 2003 and additionally for a 9-month period over 2006, during which the physical model was forced with the POSEIDON-ETA 6-hour atmospheric fields. The assimilation system is validated by assessing the relevance of the system in fitting the data, the impact of the assimilation on non-observed biochemical processes and the overall quality of the forecasts. Assimilation of either GlobColour in 2003 or SeaWiFS in 2006 chlorophyll-a data enhances the identification of the ecological state of the Pagasitikos Gulf. Results, however, suggest that subsurface ecological observations are needed to improve the controllability of the ecosystem in the deep layers. © 2011 Elsevier B.V.
Integrated Data Assimilation Architecture, Phase I
National Aeronautics and Space Administration — The Integrated Data Assimilation Architecture (IDAA) is a middleware architecture that facilitates the incorporation of heterogeneous sensing and control devices...
Assimilation Dynamic Network (ADN), Phase II
National Aeronautics and Space Administration — The Assimilation Dynamic Network (ADN) is a dynamic inter-processor communication network that spans heterogeneous processor architectures, unifying components,...
The Onsager–Machlup functional for data assimilation
Directory of Open Access Journals (Sweden)
N. Sugiura
2017-12-01
Full Text Available When taking the model error into account in data assimilation, one needs to evaluate the prior distribution represented by the Onsager–Machlup functional. Through numerical experiments, this study clarifies how the prior distribution should be incorporated into cost functions for discrete-time estimation problems. Consistent with previous theoretical studies, the divergence of the drift term is essential in weak-constraint 4D-Var (w4D-Var), but it is not necessary in Markov chain Monte Carlo with the Euler scheme. Although the former property may cause difficulties when implementing w4D-Var in large systems, this paper proposes a new technique for estimating the divergence term and its derivative.
The Onsager-Machlup functional for data assimilation
Sugiura, Nozomi
2017-12-01
When taking the model error into account in data assimilation, one needs to evaluate the prior distribution represented by the Onsager-Machlup functional. Through numerical experiments, this study clarifies how the prior distribution should be incorporated into cost functions for discrete-time estimation problems. Consistent with previous theoretical studies, the divergence of the drift term is essential in weak-constraint 4D-Var (w4D-Var), but it is not necessary in Markov chain Monte Carlo with the Euler scheme. Although the former property may cause difficulties when implementing w4D-Var in large systems, this paper proposes a new technique for estimating the divergence term and its derivative.
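For concreteness, a standard discrete-time form of the prior at issue, written here from the general theory for an Euler-discretized SDE $\mathrm{d}x = f(x)\,\mathrm{d}t + \sigma\,\mathrm{d}W$ (not copied from the paper), is the negative log-density

```latex
-\log p(x_{0:K}) \simeq \sum_{k=0}^{K-1}\left[
  \frac{\lVert x_{k+1} - x_k - f(x_k)\,\Delta t \rVert^{2}}{2\sigma^{2}\,\Delta t}
  + \frac{\Delta t}{2}\,\nabla\cdot f(x_k)\right] + \text{const.}
```

The term $\tfrac{\Delta t}{2}\,\nabla\cdot f$ is the Onsager–Machlup divergence correction that must appear in a w4D-Var cost function, whereas Euler-scheme MCMC samples the discrete Gaussian transition densities directly and therefore does not need it.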
Data assimilation with inequality constraints
Thacker, W. C.
If values of variables in a numerical model are limited to specified ranges, these restrictions should be enforced when data are assimilated. The simplest option is to assimilate without regard for constraints and then to correct any violations without worrying about additional corrections implied by correlated errors. This paper addresses the incorporation of inequality constraints into the standard variational framework of optimal interpolation with emphasis on our limited knowledge of the underlying probability distributions. Simple examples involving only two or three variables are used to illustrate graphically how active constraints can be treated as error-free data when background errors obey a truncated multi-normal distribution. Using Lagrange multipliers, the formalism is expanded to encompass the active constraints. Two algorithms are presented, both relying on a solution ignoring the inequality constraints to discover violations to be enforced. While explicitly enforcing a subset can, via correlations, correct the others, pragmatism based on our poor knowledge of the underlying probability distributions suggests the expedient of enforcing them all explicitly to avoid the computationally expensive task of determining the minimum active set. If additional violations are encountered with these solutions, the process can be repeated. Simple examples are used to illustrate the algorithms and to examine the nature of the corrections implied by correlated errors.
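The "active constraint as error-free datum" idea can be sketched in a few lines for non-negativity constraints (a pure-Python toy with an assumed background covariance; the paper's Lagrange-multiplier formalism reduces to this for a single active bound): pin each violating variable to its bound as an exact observation, and let the background correlations drag the correlated variables along.

```python
def enforce_nonneg(x, cov, max_sweeps=100):
    """Enforce x[i] >= 0 by assimilating each active bound as an error-free
    datum; correlated variables are corrected via the background covariance.

    Repeated sweeps handle corrections that create new violations, as the
    paper suggests; max_sweeps guards against pathological correlations.
    """
    x = list(x)
    for _ in range(max_sweeps):
        i = next((j for j, v in enumerate(x) if v < 0.0), None)
        if i is None:
            return x
        # Gain for an exact datum x[i] = 0: ratio of covariances
        ratio = [cov[j][i] / cov[i][i] for j in range(len(x))]
        xi = x[i]
        x = [xj - r * xi for xj, r in zip(x, ratio)]
    return x
```

With a positive correlation, fixing a negative variable to zero raises its correlated neighbour, exactly the kind of implied correction the paper contrasts with naive clipping.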
Hepatobiliary sequential scintiscanning
International Nuclear Information System (INIS)
Eissner, D.
1985-01-01
The main criteria for interpreting hepatobiliary sequential scintiscanning (HBSS) data are the following: (1) In young infants without previous parenteral feeding, normal to slightly increased activity uptake in the liver, accompanied by a lack of activity excretion into the intestine (which requires a 24-hour scan for detection), is a clear indication of bile duct atresia. However, the same findings can be obtained in very young newborns (up to one week of age) in cases of hepatitis with definite cholestasis. (2) In cases of comparably high activity uptake in the liver together with activity excretion into the intestine, detectable either promptly or after a delay (24-hour scan required), bile duct atresia can be excluded, the diagnosis being hepatitis. In general, hepatitis causes stronger liver cell damage, which precludes use of the intestinal-excretion criterion. Similar findings can be obtained in infants with bile duct atresia after previous parenteral feeding. This is why the interpretation of HBSS data can only be carried out effectively in close cooperation with the pediatrician, and on the basis of profound knowledge of the infant's overall clinical state. (orig.) [de
Sequential Design of Experiments
Energy Technology Data Exchange (ETDEWEB)
Anderson-Cook, Christine Michaela [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2017-06-30
A sequential design of experiments strategy is being developed and implemented that allows for adaptive learning based on incoming results as the experiment is being run. The plan is to incorporate these strategies for the NCCC and TCM experimental campaigns to be run in the coming months. This strategy for experimentation has the advantage of allowing new data collected during the experiment to inform future experimental runs based on their projected utility for a particular goal. For example, the current effort for the MEA capture system at NCCC plans to focus on maximally improving the quality of prediction of CO2 capture efficiency, as measured by the width of the confidence interval for the underlying response surface, which is modeled as a function of 1) flue gas flowrate [1000-3000] kg/hr; 2) CO2 weight fraction [0.125-0.175]; 3) lean solvent loading [0.1-0.3]; and 4) lean solvent flowrate [3000-12000] kg/hr.
Adaptive sequential controller
Energy Technology Data Exchange (ETDEWEB)
El-Sharkawi, Mohamed A. (Renton, WA); Xing, Jian (Seattle, WA); Butler, Nicholas G. (Newberg, OR); Rodriguez, Alonso (Pasadena, CA)
1994-01-01
An adaptive sequential controller (50/50') for controlling a circuit breaker (52) or other switching device to substantially eliminate transients on a distribution line caused by closing and opening the circuit breaker. The device adaptively compensates for changes in the response time of the circuit breaker due to aging and environmental effects. A potential transformer (70) provides a reference signal corresponding to the zero crossing of the voltage waveform, and a phase shift comparator circuit (96) compares the reference signal to the time at which any transient was produced when the circuit breaker closed, producing a signal indicative of the adaptive adjustment that should be made. Similarly, in controlling the opening of the circuit breaker, a current transformer (88) provides a reference signal that is compared against the time at which any transient is detected when the circuit breaker last opened. An adaptive adjustment circuit (102) produces a compensation time that is appropriately modified to account for changes in the circuit breaker response, including the effect of ambient conditions and aging. When next opened or closed, the circuit breaker is activated at an appropriately compensated time, so that it closes when the voltage crosses zero and opens when the current crosses zero, minimizing any transients on the distribution line. Phase angle can be used to control the opening of the circuit breaker relative to the reference signal provided by the potential transformer.
Hydrologic Remote Sensing and Land Surface Data Assimilation
Directory of Open Access Journals (Sweden)
Hamid Moradkhani
2008-05-01
Full Text Available Accurate, reliable and skillful forecasting of key environmental variables such as soil moisture and snow is of paramount importance due to their strong influence on many water resources applications, including flood control, agricultural production and effective water resources management, which collectively control the behavior of the climate system. Soil moisture is a key state variable in land surface–atmosphere interactions, affecting surface energy fluxes, runoff and the radiation balance. Snow processes also have a large influence on land-atmosphere energy exchanges due to snow's high albedo, low thermal conductivity and considerable spatial and temporal variability, resulting in dramatic changes in surface and ground temperature. Measurement of these two variables is possible through a variety of ground-based and remote sensing procedures. Remote sensing, however, holds great promise for soil moisture and snow measurements, which have considerable spatial and temporal variability. Merging these measurements with hydrologic model outputs in a systematic and effective way results in an improvement of land surface model prediction. Data assimilation provides a mechanism to combine these two sources of estimation. Much success has been attained in recent years in using data from passive microwave sensors and assimilating them into the models. This paper provides an overview of the remote sensing measurement techniques for soil moisture and snow data and describes the advances in data assimilation techniques through ensemble filtering, mainly the Ensemble Kalman filter (EnKF) and the Particle filter (PF), for improving the model prediction and reducing the uncertainties involved in the prediction process. It is believed that the PF provides a complete representation of the probability distribution of the state variables of interest (according to the sequential Bayes law) and could be a strong alternative to the EnKF, which is subject to some
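The particle filter cycle the review describes, weighting particles by the observation likelihood and resampling per sequential Bayes, can be sketched as follows. This is a scalar toy with systematic resampling; the names are mine, not those of any operational land-surface assimilation system.

```python
import math
import random

def pf_assimilate(particles, y, obs_std, rng):
    """One particle-filter cycle: Gaussian likelihood weights + systematic resampling."""
    # Weight each particle by its likelihood under the observation
    w = [math.exp(-0.5 * ((y - p) / obs_std) ** 2) for p in particles]
    total = sum(w)
    w = [wi / total for wi in w]
    # Cumulative weights
    cum, c = [], 0.0
    for wi in w:
        c += wi
        cum.append(c)
    # Systematic resampling: one random offset, evenly spaced pointers
    n = len(particles)
    u0 = rng.random() / n
    out, j = [], 0
    for i in range(n):
        u = u0 + i / n
        while cum[j] < u:
            j += 1
        out.append(particles[j])
    return out
```

Unlike the EnKF, no Gaussian assumption is made about the state distribution: the resampled cloud itself represents the posterior, which is the "complete representation" property the abstract attributes to the PF.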
Assimilation of MLS and OMI Ozone Data
Stajner, I.; Wargan, K.; Chang, L.-P.; Hayashi, H.; Pawson, S.; Froidevaux, L.; Livesey, N.
2005-01-01
Ozone data from Aura Microwave Limb Sounder (MLS) and Ozone Monitoring Instrument (OMI) were assimilated into the ozone model at NASA's Global Modeling and Assimilation Office (GMAO). This assimilation produces ozone fields that are superior to those from the operational GMAO assimilation of Solar Backscatter Ultraviolet (SBUV/2) instrument data. Assimilation of Aura data improves the representation of the "ozone hole" and the agreement with independent Stratospheric Aerosol and Gas Experiment (SAGE) III and ozone sonde data. Ozone in the lower stratosphere is captured better: mean state, vertical gradients, spatial and temporal variability are all improved. Inclusion of OMI and MLS data together, or separately, in the assimilation system provides a way of checking how consistent OMI and MLS data are with each other, and with the ozone model. We found that differences between OMI total ozone column data and model forecasts decrease after MLS data are assimilated. This indicates that MLS stratospheric ozone profiles are consistent with OMI total ozone columns. The evaluation of error characteristics of OMI and MLS ozone will continue as data from newer versions of retrievals becomes available. We report on the initial step in obtaining global assimilated ozone fields that combine measurements from different Aura instruments, the ozone model at the GMAO, and their respective error characteristics. We plan to use assimilated ozone fields in estimation of tropospheric ozone. We also plan to investigate impacts of assimilated ozone fields on numerical weather prediction through their use in radiative models and in the assimilation of infrared nadir radiance data from NASA's Advanced Infrared Sounder (AIRS).
Monte Carlo codes and Monte Carlo simulator program
International Nuclear Information System (INIS)
Higuchi, Kenji; Asai, Kiyoshi; Suganuma, Masayuki.
1990-03-01
Four typical Monte Carlo codes, KENO-IV, MORSE, MCNP and VIM, have been vectorized on the VP-100 at the Computing Center, JAERI. The problems in vector processing of Monte Carlo codes on vector processors have become clear through this work. As a result, it is recognized that there are difficulties in obtaining good performance in vector processing of Monte Carlo codes. A Monte Carlo computing machine, which processes Monte Carlo codes with high performance, has been under development at our Computing Center since 1987. The concept of the Monte Carlo computing machine and its performance have been investigated and estimated by using a software simulator. In this report, the problems in vectorization of Monte Carlo codes, the Monte Carlo pipelines proposed to mitigate these difficulties, and the results of the performance estimation of the Monte Carlo computing machine by the simulator are described. (author)
Gobert, Antoine; Tourdot-Maréchal, Raphaëlle; Morge, Christophe; Sparrow, Céline; Liu, Youzhong; Quintanilla-Casas, Beatriz; Vichi, Stefania; Alexandre, Hervé
2017-01-01
Nitrogen sources in the must are important for yeast metabolism, growth, and performance, and wine volatile compounds profile. Yeast assimilable nitrogen (YAN) deficiencies in grape must are one of the main causes of stuck and sluggish fermentation. The nitrogen requirement of Saccharomyces cerevisiae metabolism has been described in detail. However, the YAN preferences of non-Saccharomyces yeasts remain unknown despite their increasingly widespread use in winemaking. Furthermore, the impact of nitrogen consumption by non-Saccharomyces yeasts on YAN availability, alcoholic performance and volatile compounds production by S. cerevisiae in sequential fermentation has been little studied. With a view to improving the use of non-Saccharomyces yeasts in winemaking, we studied the use of amino acids and ammonium by three strains of non-Saccharomyces yeasts (Starmerella bacillaris, Metschnikowia pulcherrima, and Pichia membranifaciens) in grape juice. We first determined which nitrogen sources were preferentially used by these yeasts in pure cultures at 28 and 20°C (because few data are available). We then carried out sequential fermentations at 20°C with S. cerevisiae, to assess the impact of the non-Saccharomyces yeasts on the availability of assimilable nitrogen for S. cerevisiae. Finally, 22 volatile compounds were quantified in sequential fermentation and their levels compared with those in pure cultures of S. cerevisiae. We report here, for the first time, that non-Saccharomyces yeasts have specific amino-acid consumption profiles. Histidine, methionine, threonine, and tyrosine were not consumed by S. bacillaris, aspartic acid was assimilated very slowly by M. pulcherrima, and glutamine was not assimilated by P. membranifaciens. By contrast, cysteine appeared to be a preferred nitrogen source for all non-Saccharomyces yeasts. In sequential fermentation, these specific profiles of amino-acid consumption by non-Saccharomyces yeasts may account for some of the
Gobert, Antoine; Tourdot-Maréchal, Raphaëlle; Morge, Christophe; Sparrow, Céline; Liu, Youzhong; Quintanilla-Casas, Beatriz; Vichi, Stefania; Alexandre, Hervé
2017-01-01
Nitrogen sources in the must are important for yeast metabolism, growth, and performance, and wine volatile compounds profile. Yeast assimilable nitrogen (YAN) deficiencies in grape must are one of the main causes of stuck and sluggish fermentation. The nitrogen requirement of Saccharomyces cerevisiae metabolism has been described in detail. However, the YAN preferences of non-Saccharomyces yeasts remain unknown despite their increasingly widespread use in winemaking. Furthermore, the impact of nitrogen consumption by non-Saccharomyces yeasts on YAN availability, alcoholic performance and volatile compounds production by S. cerevisiae in sequential fermentation has been little studied. With a view to improving the use of non-Saccharomyces yeasts in winemaking, we studied the use of amino acids and ammonium by three strains of non-Saccharomyces yeasts (Starmerella bacillaris, Metschnikowia pulcherrima, and Pichia membranifaciens) in grape juice. We first determined which nitrogen sources were preferentially used by these yeasts in pure cultures at 28 and 20°C (because few data are available). We then carried out sequential fermentations at 20°C with S. cerevisiae, to assess the impact of the non-Saccharomyces yeasts on the availability of assimilable nitrogen for S. cerevisiae. Finally, 22 volatile compounds were quantified in sequential fermentation and their levels compared with those in pure cultures of S. cerevisiae. We report here, for the first time, that non-Saccharomyces yeasts have specific amino-acid consumption profiles. Histidine, methionine, threonine, and tyrosine were not consumed by S. bacillaris, aspartic acid was assimilated very slowly by M. pulcherrima, and glutamine was not assimilated by P. membranifaciens. By contrast, cysteine appeared to be a preferred nitrogen source for all non-Saccharomyces yeasts. In sequential fermentation, these specific profiles of amino-acid consumption by non-Saccharomyces yeasts may account for
Mining Frequent Max and Closed Sequential Patterns
Afshar, Ramin
2002-01-01
Although frequent sequential pattern mining plays an important role in many data mining tasks, it often generates a large number of sequential patterns, which reduces its efficiency and effectiveness. For many applications, mining all the frequent sequential patterns is not necessary, and mining frequent Max, or Closed, sequential patterns will provide the same amount of information. Compared to frequent sequential pattern mining, frequent Max, or Closed, sequential pattern mining g...
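The distinction between all, closed, and maximal frequent sequences can be made concrete with a short sketch. This is only an illustrative post-filter over an already-mined pattern set, not the mining algorithm itself; the toy patterns and support counts are invented.

```python
def is_subseq(a, b):
    """True if sequence a is a (not necessarily contiguous) subsequence of b."""
    it = iter(b)
    return all(x in it for x in a)

def closed_and_max(patterns):
    """patterns: dict mapping a sequence (tuple) to its support count."""
    closed, maximal = {}, {}
    for p, sup in patterns.items():
        supers = [q for q in patterns if q != p and is_subseq(p, q)]
        if not any(patterns[q] == sup for q in supers):
            closed[p] = sup          # no proper super-pattern with equal support
        if not supers:
            maximal[p] = sup         # no frequent super-pattern at all
    return closed, maximal

# Toy mined result: ('a',) and ('b',) are absorbed by ('a', 'b'), while
# ('c',) is closed but not maximal because ('a', 'c') is also frequent.
freq = {('a',): 4, ('b',): 4, ('a', 'b'): 4, ('c',): 3, ('a', 'c'): 2}
closed_pats, max_pats = closed_and_max(freq)
```

Here the closed set keeps the support information of all five patterns in three entries, and the maximal set is smaller still.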
Development of a data assimilation algorithm
DEFF Research Database (Denmark)
Thomsen, Per Grove; Zlatev, Zahari
2008-01-01
assimilation technique is applied. Therefore, it is important to study the interplay between the three components of the variational data assimilation techniques as well as to apply powerful parallel computers in the computations. Some results obtained in the search for a good combination of numerical methods...
On Non-Asymptotic Optimal Stopping Criteria in Monte Carlo Simulations
Bayer, Christian
2014-01-01
We consider the setting of estimating the mean of a random variable by a sequential stopping rule Monte Carlo (MC) method. The performance of a typical second-moment-based sequential stopping rule MC method is shown to be unreliable in such settings, both by numerical examples and through analysis. By analysis and approximations, we construct a higher-moment-based stopping rule, which is shown in numerical examples to perform more reliably and only slightly less efficiently than the second-moment-based stopping rule.
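A minimal sketch of the kind of second-moment-based stopping rule the abstract analyses: draw samples in batches and stop once the CLT-based confidence half-width falls below a tolerance. The tolerance, batch size, and toy integrand are our own choices, not the paper's.

```python
import math, random

def mc_mean_with_stopping(sample, tol=0.01, batch=1000, max_n=10**6):
    """Estimate E[sample()] until the 95% CI half-width drops below tol."""
    data = []
    while len(data) < max_n:
        data.extend(sample() for _ in range(batch))
        n = len(data)
        mean = sum(data) / n
        var = sum((x - mean) ** 2 for x in data) / (n - 1)
        half_width = 1.96 * math.sqrt(var / n)   # CLT-based half-width
        if half_width < tol:
            break
    return mean, half_width, n

rng = random.Random(0)
est, hw, n = mc_mean_with_stopping(lambda: rng.random() ** 2)  # E[U^2] = 1/3
```

The paper's point is precisely that such a rule can stop too early when the sample variance underestimates the true variance (e.g. for heavy-tailed integrands), motivating a higher-moment check.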
Data assimilation: a mathematical introduction
Law, Kody; Zygalakis, Konstantinos
2015-01-01
This book provides a systematic treatment of the mathematical underpinnings of work in data assimilation, covering both theoretical and computational approaches. Specifically the authors develop a unified mathematical framework in which a Bayesian formulation of the problem provides the bedrock for the derivation, development and analysis of algorithms; the many examples used in the text, together with the algorithms which are introduced and discussed, are all illustrated by the MATLAB software detailed in the book and made freely available online. The book is organized into nine chapters: the first contains a brief introduction to the mathematical tools around which the material is organized; the next four are concerned with discrete time dynamical systems and discrete time data; the last four are concerned with continuous time dynamical systems and continuous time data and are organized analogously to the corresponding discrete time chapters. This book is aimed at mathematical researchers interested in a sy...
Effective assimilation of global precipitation: simulation experiments
Directory of Open Access Journals (Sweden)
Guo-Yuan Lien
2013-07-01
Full Text Available Past attempts to assimilate precipitation by nudging or variational methods have succeeded in forcing the model precipitation to be close to the observed values. However, the model forecasts tend to lose their additional skill after a few forecast hours. In this study, a local ensemble transform Kalman filter (LETKF) is used to effectively assimilate precipitation by allowing ensemble members with better precipitation to receive higher weights in the analysis. In addition, two other changes in the precipitation assimilation process are found to alleviate the problems related to the non-Gaussianity of the precipitation variable: (a) transform the precipitation variable into a Gaussian distribution based on its climatological distribution (an approach that could also be used in the assimilation of other non-Gaussian observations) and (b) only assimilate precipitation at the location where at least some ensemble members have precipitation. Unlike many current approaches, both positive and zero rain observations are assimilated effectively. Observing system simulation experiments (OSSEs) are conducted using the Simplified Parametrisations, primitivE-Equation DYnamics (SPEEDY) model, a simplified but realistic general circulation model. When uniformly and globally distributed observations of precipitation are assimilated in addition to rawinsonde observations, both the analyses and the medium-range forecasts of all model variables, including precipitation, are significantly improved as compared to only assimilating rawinsonde observations. The effect of precipitation assimilation on the analyses is retained in the medium-range forecasts and is larger in the Southern Hemisphere (SH) than in the Northern Hemisphere (NH) because the NH analyses are already made more accurate by the denser rawinsonde stations. These improvements are much reduced when only the moisture field is modified by the precipitation observations. Both the Gaussian transformation and
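The Gaussian transform in (a) can be sketched as an empirical-CDF (anamorphosis) mapping: push each value through its climatological CDF, then through the inverse standard-normal CDF. The synthetic "climatology" and the plotting-position formula below are our own choices, not necessarily the paper's.

```python
import random, statistics

def gaussian_transform(value, climatology):
    """Map a value to a standard-normal variate via the empirical CDF."""
    n = len(climatology)
    rank = sum(1 for c in climatology if c <= value)
    p = (rank + 0.5) / (n + 1)       # keep p strictly inside (0, 1)
    return statistics.NormalDist().inv_cdf(p)

# Synthetic precipitation climatology: skewed, with many small values.
rng = random.Random(42)
clim = [rng.expovariate(1.0) for _ in range(999)]
z_median = gaussian_transform(statistics.median(clim), clim)  # near 0
z_wet = gaussian_transform(max(clim), clim)                   # far in the tail
```

The median of the climatology maps to roughly zero and extreme rainfall maps deep into the Gaussian tail, which is what lets a Kalman-type filter treat the transformed variable as approximately Gaussian.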
Hrivnacova, I; Berejnov, V V; Brun, R; Carminati, F; Fassò, A; Futo, E; Gheata, A; Caballero, I G; Morsch, Andreas
2003-01-01
The concept of Virtual Monte Carlo (VMC) has been developed by the ALICE Software Project to allow different Monte Carlo simulation programs to run without changing the user code, such as the geometry definition, the detector response simulation or input and output formats. Recently, the VMC classes have been integrated into the ROOT framework, and the other relevant packages have been separated from the AliRoot framework and can be used individually by any other HEP project. The general concept of the VMC and its set of base classes provided in ROOT will be presented. Existing implementations for Geant3, Geant4 and FLUKA and simple examples of usage will be described.
Efficient Mean Field Variational Algorithm for Data Assimilation (Invited)
Vrettas, M. D.; Cornford, D.; Opper, M.
2013-12-01
Data assimilation algorithms combine available observations of physical systems with the assumed model dynamics in a systematic manner, to produce better estimates of initial conditions for prediction. Broadly they can be categorized in three main approaches: (a) sequential algorithms, (b) sampling methods and (c) variational algorithms which transform the density estimation problem to an optimization problem. However, given finite computational resources, only a handful of ensemble Kalman filters and 4DVar algorithms have been applied operationally to very high dimensional geophysical applications, such as weather forecasting. In this paper we present a recent extension to our variational Bayesian algorithm which seeks the 'optimal' posterior distribution over the continuous time states, within a family of non-stationary Gaussian processes. Our initial work on variational Bayesian approaches to data assimilation, unlike the well-known 4DVar method which seeks only the most probable solution, computes the best time varying Gaussian process approximation to the posterior smoothing distribution for dynamical systems that can be represented by stochastic differential equations. This approach was based on minimising the Kullback-Leibler divergence, over paths, between the true posterior and our Gaussian process approximation. Whilst the observations were informative enough to keep the posterior smoothing density close to Gaussian, the algorithm proved very effective on low dimensional systems (e.g. O(10)D). However for higher dimensional systems, the high computational demands make the algorithm prohibitively expensive. To overcome the difficulties presented in the original framework and make our approach more efficient in higher dimensional systems, we have been developing a new mean field version of the algorithm which treats the state variables at any given time as being independent in the posterior approximation, while still accounting for their relationships in the
Data assimilation of citizen collected information for real-time flood hazard mapping
Sayama, T.; Takara, K. T.
2017-12-01
Many studies in data assimilation in hydrology have focused on the integration of satellite remote sensing and in-situ monitoring data into hydrologic or land surface models. For flood prediction too, recent studies have demonstrated how to assimilate remotely sensed inundation information with flood inundation models. In actual flood disaster situations, citizen-collected information, including local reports by residents and rescue teams and, more recently, tweets via social media, also contains valuable information. The main interest of this study is how to effectively use such citizen-collected information for real-time flood hazard mapping. Here we propose a new data assimilation technique based on pre-conducted ensemble inundation simulations that updates inundation depth distributions sequentially as local data become available. The proposed method is composed of the following two steps. The first step is a weighted average of the preliminary ensemble simulations, whose weights are updated by a Bayesian approach. The second step is an optimal interpolation, where the covariance matrix is calculated from the ensemble simulations. The proposed method was applied to case studies including an actual flood event. Two situations are considered: a more idealized one assuming that continuous flood inundation depth information is available at multiple locations, and a more realistic one, typical of a severe flood disaster, assuming that only uncertain and non-continuous information is available to be assimilated. The results show that, in the first, idealized situation, the large-scale inundation during the flood was estimated reasonably, with RMSE < 0.4 m on average. For the second, more realistic situation, the error becomes larger (RMSE 0.5 m) and the impact of the optimal interpolation becomes comparatively less effective. Nevertheless, the applications of the proposed data assimilation method demonstrated a high potential of this method for
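The first step, as we read it, can be sketched as a Bayesian reweighting of pre-computed ensemble simulations whenever a local depth report arrives. The Gaussian observation-error model and all numbers below are our own assumptions, not the authors' configuration.

```python
import math

def update_weights(weights, ensemble_depths, obs_depth, obs_sigma=0.3):
    """Multiply prior weights by a Gaussian likelihood and renormalise."""
    lik = [math.exp(-0.5 * ((d - obs_depth) / obs_sigma) ** 2)
           for d in ensemble_depths]
    posterior = [w * l for w, l in zip(weights, lik)]
    total = sum(posterior)
    return [p / total for p in posterior]

# Three pre-run simulations predicting 0.2 m, 0.9 m and 1.6 m at a gauge;
# a citizen report of 1.0 m sharply favours the second member.
w = update_weights([1/3, 1/3, 1/3], [0.2, 0.9, 1.6], 1.0)
```

Repeating this update as reports trickle in concentrates weight on the ensemble members most consistent with what citizens observe, before the optimal-interpolation step refines the depth field spatially.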
Variational Monte Carlo Technique
Indian Academy of Sciences (India)
ias
nonprobabilistic) problem [5]. ... In quantum mechanics, the MC methods are used to simulate many-particle systems using random ... D Ceperley, G V Chester and M H Kalos, Monte Carlo simulation of a many-fermion study, Physical Review Vol.
Indian Academy of Sciences (India)
Markov Chain Monte Carlo – Examples. Arnab Chakraborty. General Article, Resonance – Journal of Science Education, Volume 7, Issue 3, March 2002, pp. 25-34. Permanent link: https://www.ias.ac.in/article/fulltext/reso/007/03/0025-0034
Directory of Open Access Journals (Sweden)
Y. Zhao
2017-11-01
Full Text Available Climate signals are the results of interactions of multiple timescale media such as the atmosphere and ocean in the coupled earth system. Coupled data assimilation (CDA) pursues balanced and coherent climate analysis and prediction initialization by incorporating observations from multiple media into a coupled model. In practice, an observational time window (OTW) is usually used to collect measured data for an assimilation cycle to increase observational samples that are sequentially assimilated with their original error scales. Given different timescales of characteristic variability in different media, what are the optimal OTWs for the coupled media so that climate signals can be most accurately recovered by CDA? With a simple coupled model that simulates typical scale interactions in the climate system and twin CDA experiments, we address this issue here. Results show that in each coupled medium, an optimal OTW can provide maximal observational information that best fits the characteristic variability of the medium during the data blending process. Maintaining correct scale interactions, the resulting CDA improves the analysis of climate signals greatly. These simple model results provide a guideline for when the real observations are assimilated into a coupled general circulation model for improving climate analysis and prediction initialization by accurately recovering important characteristic variability such as sub-diurnal in the atmosphere and diurnal in the ocean.
Directory of Open Access Journals (Sweden)
Zhiwei Jiang
2014-03-01
Full Text Available To improve crop model performance for regional crop yield estimates, a new four-dimensional variational algorithm (POD4DVar) merging the Monte Carlo and proper orthogonal decomposition techniques was introduced to develop a data assimilation strategy using the Crop Environment Resource Synthesis (CERES)-Wheat model. Two winter wheat yield estimation procedures were conducted on a field plot and regional scale to test the feasibility and potential of the POD4DVar-based strategy. Winter wheat yield forecasts for the field plots showed a coefficient of determination (R2) of 0.73, a root mean square error (RMSE) of 319 kg/ha, and a relative error (RE) of 3.49%. An acceptable yield at the regional scale was estimated with an R2 of 0.997, RMSE of 7346 tons, and RE of 3.81%. The POD4DVar-based strategy was more accurate and efficient than the EnKF-based strategy. In addition to crop yield, other critical crop variables such as the biomass, harvest index, evapotranspiration, and soil organic carbon may also be estimated. The present study thus introduces a promising approach for operationally monitoring regional crop growth and predicting yield. Successful application of this assimilation model at regional scales must focus on uncertainties derived from the crop model, model inputs, data assimilation algorithm, and assimilated observations.
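The three reported scores can be reproduced for any observed/estimated pair. The yield values below are invented for illustration, and RE is computed as the relative difference of totals, which is one common convention rather than necessarily the paper's exact definition.

```python
import math

def fit_stats(obs, est):
    """Return (R2, RMSE, RE%) for paired observed and estimated values."""
    n = len(obs)
    mean_obs = sum(obs) / n
    ss_res = sum((o - e) ** 2 for o, e in zip(obs, est))
    ss_tot = sum((o - mean_obs) ** 2 for o in obs)
    r2 = 1.0 - ss_res / ss_tot                        # coefficient of determination
    rmse = math.sqrt(ss_res / n)                      # root mean square error
    re = 100.0 * abs(sum(est) - sum(obs)) / sum(obs)  # relative error of totals, %
    return r2, rmse, re

# Hypothetical winter wheat yields (kg/ha): observed vs. model estimate.
r2, rmse, re = fit_stats([9000.0, 8500.0, 9200.0], [8800.0, 8700.0, 9100.0])
```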
Berline, L.; Brankart, J.-M.; Brasseur, P.
The general objective of this work is to examine how the assimilation of data in a circulation model can improve the biological response simulated by a coupled physical-ecosystem model. In this work, the focus will be on the impact of altimetric, SST and SSS data assimilation in an eddy-permitting coupled model of the North Atlantic. The physical model is a z-coordinate, rigid lid, primitive-equation model based on the OPA code [Madec et al., 1998]. The horizontal resolution is 1/3° and there are 43 vertical levels with refinement near the surface. The biogeochemical model is the P3ZD biogeochemical model [Aumont et al., 1998] that describes the cycling of carbon, silica and calcium. The simulations are performed using realistic forcings during 1998. The assimilation method is based on a Kalman filter with reduced order error covariance matrix, known as the SEEK filter [Pham et al., 1998]. The sequential scheme has been modified recently using the concept of "incremental analysis update" to enforce temporal continuity of the assimilation run. In order to evaluate how the assimilation can improve the representation of the biological fields, comparisons are made between free runs and simulations with assimilation. A first comparison with the assimilation run obtained using the scheme developed by Testut et al. [2003] indicates the excessive supply of nutrients in the euphotic zone through spurious mixing and advection mechanisms. This can be partly attributed to several factors, e.g. the statistical method which is unable to maintain the model constraint of hydrostatic stability, the discontinuous nature of the sequential algorithm, or the lack of consistent corrections between the physical and biological components of the state vector. Several variants of the assimilation algorithm are implemented in order to improve the representation of the model dynamics and its subsequent impact on the biological variables. A comparison between the assimilation runs obtained
Assimilative and non-assimilative color spreading in the watercolor configuration
Directory of Open Access Journals (Sweden)
Eiji Kimura
2014-09-01
Full Text Available A colored line flanking a darker contour will appear to spread its color onto an area enclosed by the line (watercolor effect). The watercolor effect has been characterized as an assimilative effect, but non-assimilative color spreading has also been demonstrated in the same spatial configuration; e.g., when a black inner contour (IC) is paired with a blue outer contour (OC), yellow color spreading can be observed. To elucidate visual mechanisms underlying these different color spreading effects, this study investigated the effects of luminance ratio between the double contours on the induced color by systematically manipulating the IC and OC luminances (Experiment 1) as well as the background luminance (Experiment 2). The results showed that the luminance conditions suitable for assimilative and non-assimilative color spreading were nearly opposite. When the Weber contrast of the IC to the background luminance (IC contrast) was smaller than that of the OC (OC contrast), the induced color became similar to the IC color (assimilative spreading). In contrast, when the OC contrast was smaller than or equal to the IC contrast, the induced color became yellow (non-assimilative spreading). Extending these findings, Experiment 3 showed that bilateral color spreading, e.g., assimilative spreading on one side and non-assimilative spreading on the other side, can also be observed in the watercolor configuration. These results suggest that the assimilative and non-assimilative spreading were mediated by different visual mechanisms. The properties of the assimilative spreading are consistent with the model proposed to account for neon color spreading [Grossberg, S. & Mingolla, E. (1985) Percept. Psychophys., 38, 141-171] and extended for the watercolor effect [Pinna, B., & Grossberg, S. (2005) J. Opt. Soc. Am. A, 22, 2207-2221]. However, the present results suggest that additional mechanisms are needed to account for the non-assimilative color spreading.
Assimilative and non-assimilative color spreading in the watercolor configuration.
Kimura, Eiji; Kuroki, Mikako
2014-01-01
A colored line flanking a darker contour will appear to spread its color onto an area enclosed by the line (watercolor effect). The watercolor effect has been characterized as an assimilative effect, but non-assimilative color spreading has also been demonstrated in the same spatial configuration; e.g., when a black inner contour (IC) is paired with a blue outer contour (OC), yellow color spreading can be observed. To elucidate visual mechanisms underlying these different color spreading effects, this study investigated the effects of luminance ratio between the double contours on the induced color by systematically manipulating the IC and the OC luminance (Experiment 1) as well as the background luminance (Experiment 2). The results showed that the luminance conditions suitable for assimilative and non-assimilative color spreading were nearly opposite. When the Weber contrast of the IC to the background luminance (IC contrast) was smaller in size than that of the OC (OC contrast), the induced color became similar to the IC color (assimilative spreading). In contrast, when the OC contrast was smaller than or equal to the IC contrast, the induced color became yellow (non-assimilative spreading). Extending these findings, Experiment 3 showed that bilateral color spreading, i.e., assimilative spreading on one side and non-assimilative spreading on the other side, can also be observed in the watercolor configuration. These results suggest that the assimilative and the non-assimilative spreading were mediated by different visual mechanisms. The properties of the assimilative spreading are consistent with the model proposed to account for neon color spreading (Grossberg and Mingolla, 1985) and extended for the watercolor effect (Pinna and Grossberg, 2005). However, the present results suggest that additional mechanisms are needed to account for the non-assimilative color spreading.
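The luminance rule reported above can be stated as a tiny predicate. Comparing contrast magnitudes is our simplification of the paper's finding, and the luminance values in the example are purely illustrative.

```python
def weber(lum, lum_bg):
    """Weber contrast of a contour against the background."""
    return (lum - lum_bg) / lum_bg

def predicted_spreading(lum_ic, lum_oc, lum_bg):
    """Predict the spreading type from the reported contrast rule
    (assimilative when |IC contrast| < |OC contrast|, else non-assimilative)."""
    ic = abs(weber(lum_ic, lum_bg))
    oc = abs(weber(lum_oc, lum_bg))
    return "assimilative" if ic < oc else "non-assimilative"

# Light inner contour and dark outer contour on a bright background
# (arbitrary luminance units): IC contrast is the smaller of the two.
kind = predicted_spreading(60.0, 10.0, 80.0)
```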
ERP ASSIMILATION: AN END-USER APPROACH
Directory of Open Access Journals (Sweden)
Hurbean Luminita
2013-07-01
The paper discusses ERP adoption based on IT assimilation theory. The ERP lifecycle is associated with the IT assimilation steps, and we propose a distribution of these steps along the lifecycle. Drawing on the findings in the reviewed literature, we focus on the cultural factors, in particular those related to the end-users (determined as a major impact factor in our previous study: Negovan et al., 2011). Our empirical study is centred on the end-users' perspective and tries to determine if and how their behaviour affects the achievement of the ERP assimilation steps. The paper reasons that organizations that understand the IT assimilation steps correlated to the ERP implementation critical factors are more likely to implement and use ERP successfully.
Blocking for Sequential Political Experiments.
Moore, Ryan T; Moore, Sally A
2013-10-01
In typical political experiments, researchers randomize a set of households, precincts, or individuals to treatments all at once, and characteristics of all units are known at the time of randomization. However, in many other experiments, subjects "trickle in" to be randomized to treatment conditions, usually via complete randomization. To take advantage of the rich background data that researchers often have (but underutilize) in these experiments, we develop methods that use continuous covariates to assign treatments sequentially. We build on biased coin and minimization procedures for discrete covariates and demonstrate that our methods outperform complete randomization, producing better covariate balance in simulated data. We then describe how we selected and deployed a sequential blocking method in a clinical trial and demonstrate the advantages of our having done so. Further, we show how that method would have performed in two larger sequential political trials. Finally, we compare causal effect estimates from differences in means, augmented inverse propensity weighted estimators, and randomization test inversion.
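In the spirit of the biased-coin and minimization procedures the abstract builds on (not the authors' exact algorithm), each arriving unit can be assigned, with high probability, to the arm that most reduces imbalance in a continuous covariate's mean. The favouring probability and the covariate model below are our own choices.

```python
import random

def assign(unit_cov, arms, rng, p_favour=0.75):
    """arms: dict arm -> covariate values of units already assigned."""
    def imbalance_if(candidate):
        # Spread of arm means if this unit joined `candidate`.
        means = []
        for a, xs in arms.items():
            vals = xs + ([unit_cov] if a == candidate else [])
            means.append(sum(vals) / len(vals) if vals else 0.0)
        return max(means) - min(means)
    best = min(arms, key=imbalance_if)
    others = [a for a in arms if a != best]
    choice = best if rng.random() < p_favour else rng.choice(others)
    arms[choice].append(unit_cov)
    return choice

rng = random.Random(7)
arms = {"treatment": [], "control": []}
for _ in range(400):              # subjects "trickle in" one at a time
    assign(rng.gauss(50.0, 10.0), arms, rng)
```

Keeping the coin biased rather than deterministic preserves some randomness in each assignment, which is what makes randomization-based inference possible afterwards.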
Computational methods for data evaluation and assimilation
Cacuci, Dan Gabriel
2013-01-01
Data evaluation and data combination require the use of a wide range of probability theory concepts and tools, from deductive statistics mainly concerning frequencies and sample tallies to inductive inference for assimilating non-frequency data and a priori knowledge. Computational Methods for Data Evaluation and Assimilation presents interdisciplinary methods for integrating experimental and computational information. This self-contained book shows how the methods can be applied in many scientific and engineering areas. After presenting the fundamentals underlying the evaluation of experiment
The Acceleration of Immigrant Unhealthy Assimilation
Giuntella, Osea; Stella, Luca
2016-01-01
It is well-known that immigrants tend to be healthier than US natives and that this advantage erodes with time spent in the US. However, we know less about the heterogeneity of these trajectories among arrival cohorts. Recent studies have shown that later arrival cohorts of immigrants have lower entry wages and experience less economic assimilation. In this paper, we investigate whether similar cohort effects can be observed in the weight assimilation of immigrants in the US. Focusing on obes...
Sequential logic analysis and synthesis
Cavanagh, Joseph
2007-01-01
Until now, there was no single resource for actual digital system design. Using both basic and advanced concepts, Sequential Logic: Analysis and Synthesis offers a thorough exposition of the analysis and synthesis of both synchronous and asynchronous sequential machines. With 25 years of experience in designing computing equipment, the author stresses the practical design of state machines. He clearly delineates each step of the structured and rigorous design principles that can be applied to practical applications. The book begins by reviewing the analysis of combinatorial logic and Boolean a
A simple lightning assimilation technique for improving ...
Convective rainfall is often a large source of error in retrospective modeling applications. In particular, positive rainfall biases commonly exist during summer months due to overactive convective parameterizations. In this study, lightning assimilation was applied in the Kain-Fritsch (KF) convective scheme to improve retrospective simulations using the Weather Research and Forecasting (WRF) model. The assimilation method has a straightforward approach: force KF deep convection where lightning is observed and, optionally, suppress deep convection where lightning is absent. WRF simulations were made with and without lightning assimilation over the continental United States for July 2012, July 2013, and January 2013. The simulations were evaluated against NCEP stage-IV precipitation data and MADIS near-surface meteorological observations. In general, the use of lightning assimilation considerably improves the simulation of summertime rainfall. For example, the July 2012 monthly averaged bias of 6 h accumulated rainfall is reduced from 0.54 to 0.07 mm and the spatial correlation is increased from 0.21 to 0.43 when lightning assimilation is used. Statistical measures of near-surface meteorological variables also are improved. Consistent improvements also are seen for the July 2013 case. These results suggest that this lightning assimilation technique has the potential to substantially improve simulation of warm-season rainfall in retrospective WRF applications. The
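The two verification scores quoted (mean bias of accumulated rainfall and spatial correlation) are standard; they can be sketched for a toy pair of simulated and observed rainfall fields. All numbers below are invented.

```python
import math

def bias_and_correlation(sim, obs):
    """Mean bias and Pearson spatial correlation of paired rainfall fields."""
    n = len(sim)
    bias = sum(s - o for s, o in zip(sim, obs)) / n
    ms, mo = sum(sim) / n, sum(obs) / n
    cov = sum((s - ms) * (o - mo) for s, o in zip(sim, obs))
    var_s = sum((s - ms) ** 2 for s in sim)
    var_o = sum((o - mo) ** 2 for o in obs)
    return bias, cov / math.sqrt(var_s * var_o)

# 6-h accumulated rainfall (mm) at five grid cells: model vs. analysis.
bias, corr = bias_and_correlation([2.0, 5.0, 1.0, 0.0, 4.0],
                                  [1.5, 4.0, 1.5, 0.5, 3.5])
```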
Kalos, Melvin H
2008-01-01
This introduction to Monte Carlo methods seeks to identify and study the unifying elements that underlie their effective application. Initial chapters provide a short treatment of the probability and statistics needed as background, enabling those without experience in Monte Carlo techniques to apply these ideas to their research.The book focuses on two basic themes: The first is the importance of random walks as they occur both in natural stochastic systems and in their relationship to integral and differential equations. The second theme is that of variance reduction in general and importance sampling in particular as a technique for efficient use of the methods. Random walks are introduced with an elementary example in which the modeling of radiation transport arises directly from a schematic probabilistic description of the interaction of radiation with matter. Building on this example, the relationship between random walks and integral equations is outlined
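The book's variance-reduction theme can be illustrated with a tiny importance-sampling example: estimating the standard-normal tail probability P(Z > 3) by sampling from a proposal shifted into the tail. The proposal and sample size are our choices, not the book's.

```python
import math, random

def tail_prob_estimates(n=50_000, seed=0):
    rng = random.Random(seed)
    # Naive Monte Carlo: indicator of a rare event, hence large relative error.
    naive = sum(rng.gauss(0.0, 1.0) > 3.0 for _ in range(n)) / n
    # Importance sampling with proposal N(3, 1); the likelihood ratio of the
    # standard normal density to the proposal density is exp(4.5 - 3x).
    total = 0.0
    for _ in range(n):
        x = rng.gauss(3.0, 1.0)
        if x > 3.0:
            total += math.exp(4.5 - 3.0 * x)
    return naive, total / n

naive_est, is_est = tail_prob_estimates()
```

The importance-sampling estimate concentrates every sample near the region that matters, so its variance is orders of magnitude smaller than the naive indicator average at the same cost.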
Energy Technology Data Exchange (ETDEWEB)
Wollaber, Allan Benton [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2016-06-16
This is a PowerPoint presentation which serves as lecture material for the Parallel Computing summer school. It goes over the fundamentals of the Monte Carlo calculation method. The material is presented according to the following outline: Introduction (background, a simple example: estimating π), Why does this even work? (The Law of Large Numbers, The Central Limit Theorem), How to sample (inverse transform sampling, rejection), and An example from particle transport.
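The lecture's opening example, estimating π, takes only a few lines: sample points uniformly in the unit square and count the fraction inside the quarter circle. Sample count and seed here are arbitrary.

```python
import random

def estimate_pi(n, seed=0):
    """Monte Carlo estimate of pi from n uniform points in the unit square."""
    rng = random.Random(seed)
    hits = sum(rng.random() ** 2 + rng.random() ** 2 <= 1.0
               for _ in range(n))
    return 4.0 * hits / n

pi_hat = estimate_pi(100_000)
```

The error shrinks like 1/sqrt(n), the Central Limit Theorem behaviour the outline's second section explains.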
Wormhole Hamiltonian Monte Carlo
Lan, S; Streets, J; Shahbaba, B
2014-01-01
Copyright © 2014, Association for the Advancement of Artificial Intelligence. In machine learning and statistics, probabilistic inference involving multimodal distributions is quite difficult. This is especially true in high dimensional problems, where most existing algorithms cannot easily move from one mode to another. To address this issue, we propose a novel Bayesian inference approach based on Markov Chain Monte Carlo. Our method can effectively sample from multimodal distributions, espe...
International Nuclear Information System (INIS)
Creutz, M.
1986-01-01
The author discusses a recently developed algorithm for simulating statistical systems. The procedure interpolates between molecular dynamics methods and canonical Monte Carlo. The primary advantages are extremely fast simulations of discrete systems such as the Ising model and a relative insensitivity to random number quality. A variation of the algorithm gives rise to a deterministic dynamics for Ising spins. This model may be useful for high speed simulation of non-equilibrium phenomena
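A sketch of the microcanonical "demon" idea the abstract describes, on a 1-D Ising chain: a demon with a bounded energy reservoir pays for or absorbs each spin flip, so no random number is needed for the acceptance decision itself. This is our simplified reading; lattice size, demon cap, and units (J = 1) are invented.

```python
import random

def ising_energy(spins):
    """H = -sum_i s_i * s_{i+1} with periodic boundary (J = 1)."""
    n = len(spins)
    return -sum(spins[i] * spins[(i + 1) % n] for i in range(n))

def demon_sweeps(n=1000, sweeps=50, demon_cap=20, seed=0):
    rng = random.Random(seed)
    spins = [rng.choice((-1, 1)) for _ in range(n)]
    e0, demon = ising_energy(spins), 0
    for _ in range(sweeps):
        for i in range(n):
            # Energy cost of flipping spin i.
            de = 2 * spins[i] * (spins[i - 1] + spins[(i + 1) % n])
            # Flip only if the demon can pay (or absorb, up to its cap).
            if 0 <= demon - de <= demon_cap:
                spins[i] = -spins[i]
                demon -= de
    return e0, ising_energy(spins), demon

e0, e1, demon = demon_sweeps()
```

Because every accepted flip transfers energy between the lattice and the demon, the combined energy is exactly conserved, which is the microcanonical character the abstract contrasts with canonical Monte Carlo.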
Energy Technology Data Exchange (ETDEWEB)
Brockway, D.; Soran, P.; Whalen, P.
1985-01-01
A Monte Carlo algorithm to efficiently calculate static alpha eigenvalues, N = n·e^(αt), for supercritical systems has been developed and tested. A direct Monte Carlo approach to calculating a static alpha is to simply follow the buildup in time of neutrons in a supercritical system and evaluate the logarithmic derivative of the neutron population with respect to time. This procedure is expensive, and the solution is very noisy and almost useless for a system near critical. The modified approach is to convert the time-dependent problem to a static α-eigenvalue problem and regress α on solutions of a k-eigenvalue problem. In practice, this procedure is much more efficient than the direct calculation, and produces much more accurate results. Because the Monte Carlo codes are intrinsically three-dimensional and use elaborate continuous-energy cross sections, this technique is now used as a standard for evaluating other calculational techniques in odd geometries or with group cross sections.
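The "direct approach" the abstract describes (and criticises as noisy near critical) can be sketched with a toy branching process: follow the population in time and take the slope of log N against t. The growth model, rates, and population sizes are invented, not the paper's physics.

```python
import math, random

def poisson(rng, lam):
    """Knuth's method; adequate for the small means used here."""
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while p > limit:
        k += 1
        p *= rng.random()
    return k - 1

def direct_alpha_estimate(alpha_true=0.5, n0=5000, steps=20, dt=0.1, seed=1):
    rng = random.Random(seed)
    m = math.exp(alpha_true * dt)      # mean multiplication per time step
    n, times, logs = n0, [], []
    for k in range(steps):
        # Each neutron independently yields a Poisson(m) number of successors.
        n = sum(poisson(rng, m) for _ in range(n))
        times.append((k + 1) * dt)
        logs.append(math.log(n))
    # Least-squares slope of log N versus t is the logarithmic derivative.
    tb, lb = sum(times) / steps, sum(logs) / steps
    num = sum((t - tb) * (l - lb) for t, l in zip(times, logs))
    return num / sum((t - tb) ** 2 for t in times)

alpha_hat = direct_alpha_estimate()
```

With a comfortably supercritical toy system the slope recovers α well; the abstract's point is that near critical the growth signal vanishes into the noise, which is what motivates the k-eigenvalue regression instead.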
Bootstrap Sequential Determination of the Co-integration Rank in VAR Models
DEFF Research Database (Denmark)
Cavaliere, Giuseppe; Rahbek, Anders; Taylor, A. M. Robert
with empirical rejection frequencies often very much in excess of the nominal level. As a consequence, bootstrap versions of these tests have been developed. To be useful, however, sequential procedures for determining the co-integrating rank based on these bootstrap tests need to be consistent, in the sense...... in the literature by proposing a bootstrap sequential algorithm which we demonstrate delivers consistent cointegration rank estimation for general I(1) processes. Finite sample Monte Carlo simulations show the proposed procedure performs well in practice....
Attack Trees with Sequential Conjunction
Jhawar, Ravi; Kordy, Barbara; Mauw, Sjouke; Radomirović, Sasa; Trujillo-Rasua, Rolando
2015-01-01
We provide the first formal foundation of SAND attack trees which are a popular extension of the well-known attack trees. The SAND attack tree formalism increases the expressivity of attack trees by introducing the sequential conjunctive operator SAND. This operator enables the modeling of
Korabel, Vasily; She, Jun; Huess, Vibeke; Woge Nielsen, Jacob; Murawsky, Jens; Nerger, Lars
2017-04-01
The potential of an efficient data assimilation (DA) scheme to improve model forecast skill has been successfully demonstrated by many operational centres around the world. The Baltic-North Sea region is one of the most heavily monitored seas. Ferryboxes, buoys, ADCP moorings, shallow-water Argo floats, and research vessels are providing more and more near-real-time observations. Coastal altimetry is now providing an increasing amount of high-resolution sea level observations, which will be significantly expanded by the launch of the SWOT satellite in the coming years. This will turn operational DA into a valuable tool for improving forecast quality in the region. This motivated us to focus on advancing DA for the Baltic Monitoring and Forecasting Centre (BAL MFC) in order to create a common framework for operational data assimilation in the Baltic Sea. We have implemented the HBM-PDAF system, based on the Parallel Data Assimilation Framework (PDAF), a highly versatile and optimised parallel suite offering a choice of sequential schemes originally developed at AWI, and the hydrodynamic HIROMB-BOOS Model (HBM). In the initial phase, only satellite Sea Surface Temperature (SST) Level 3 data have been assimilated. Several related aspects are discussed, including improvements of the forecast quality for both surface and subsurface fields, the estimation of ensemble-based forecast error covariance, as well as possibilities of assimilating new types of observations, such as in-situ salinity and temperature profiles, coastal altimetry, and ice concentration.
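A generic stochastic ensemble Kalman filter analysis step — one of the family of sequential schemes frameworks like PDAF provide — can be sketched as follows. This is an illustrative sketch under stated assumptions, not the HBM-PDAF code; all names are my own.

```python
import numpy as np

def enkf_analysis(ens, y, H, r_var, rng):
    """Stochastic EnKF analysis step with perturbed observations.
    ens: (n_state, n_ens) forecast ensemble, y: (n_obs,) observations,
    H: (n_obs, n_state) linear observation operator, r_var: obs error variance."""
    n_obs, n_ens = H.shape[0], ens.shape[1]
    X = ens - ens.mean(axis=1, keepdims=True)        # state anomalies
    HX = H @ ens
    Y = HX - HX.mean(axis=1, keepdims=True)          # observed anomalies
    Pyy = Y @ Y.T / (n_ens - 1) + r_var * np.eye(n_obs)
    Pxy = X @ Y.T / (n_ens - 1)
    K = Pxy @ np.linalg.inv(Pyy)                     # Kalman gain
    # perturbed observations keep the analysis ensemble spread consistent
    D = y[:, None] + rng.normal(0.0, np.sqrt(r_var), (n_obs, n_ens))
    return ens + K @ (D - HX)
```

The analysis ensemble mean is pulled toward the observations in proportion to the ensemble-estimated forecast error covariance, which is the mechanism behind the ensemble-based covariance estimation mentioned above.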
Evaluation of Gaussian approximations for data assimilation in reservoir models
Iglesias, Marco A.
2013-07-14
The Bayesian framework is the standard approach for data assimilation in reservoir modeling. This framework involves characterizing the posterior distribution of geological parameters in terms of a given prior distribution and data from the reservoir dynamics, together with a forward model connecting the space of geological parameters to the data space. Since the posterior distribution quantifies the uncertainty in the geologic parameters of the reservoir, the characterization of the posterior is fundamental for the optimal management of reservoirs. Unfortunately, due to the large-scale, highly nonlinear properties of standard reservoir models, characterizing the posterior is computationally prohibitive. Instead, more affordable ad hoc techniques, based on Gaussian approximations, are often used for characterizing the posterior distribution. Evaluating the performance of those Gaussian approximations is typically conducted by assessing their ability to reproduce the truth within the confidence interval provided by the ad hoc technique under consideration. This has the disadvantage of mixing up the approximation properties of the history matching algorithm employed with the information content of the particular observations used, making it hard to evaluate the effect of the ad hoc approximations alone. In this paper, we avoid this disadvantage by comparing the ad hoc techniques with a fully resolved state-of-the-art probing of the Bayesian posterior distribution. The ad hoc techniques whose performance we assess are based on (1) linearization around the maximum a posteriori estimate, (2) randomized maximum likelihood, and (3) ensemble Kalman filter-type methods. In order to fully resolve the posterior distribution, we implement a state-of-the-art Markov chain Monte Carlo (MCMC) method that scales well with respect to the dimension of the parameter space, enabling us to study realistic forward models, in two space dimensions, at a high level of grid refinement. Our
Nitrogen assimilation in soybean nodules, 2
International Nuclear Information System (INIS)
Ohyama, Takuji; Kumazawa, Kikuo
1980-01-01
15N assimilation was studied in the bacteroid and cytosol fractions of soybean nodules. In the first experiment, after exposing intact nodules to 15N2 for 5 min and 10 min, most of the fixed 15N was detected in the cytosol fraction. In the cytosol fraction, the 15N content of glutamine was the highest, followed by glutamic acid, alanine, and allantoin, in that sequence, whereas in the bacteroid fraction glutamic acid showed the highest 15N content, followed by alanine and glutamine. In the second experiment, 15N assimilation of various 15N-labeled compounds in the separated bacteroid and cytosol fractions was investigated. In the separated bacteroid fraction fed with 15NH4, 15N was incorporated very rapidly into glutamic acid, alanine, and aspartic acid, but very slowly into glutamine. From these results, it was suggested that most of the fixed ammonia was exported to the cytosol and assimilated via glutamine synthetase to glutamine, then via glutamate synthase to glutamic acid, and that from these compounds various nitrogenous compounds were formed; in bacteroids, glutamate dehydrogenase and alanine dehydrogenase played an important role in the assimilation of fixed ammonia, although quantitatively their contribution to ammonia assimilation in nodules was much smaller than that of the cytosol. (author)
Jazz Club
2012-01-01
The 5th edition of the "Monts Jura Jazz Festival" will take place on September 21st and 22nd 2012 at the Esplanade du Lac in Divonne-les-Bains. This festival is organized by the "CERN Jazz Club" with the support of the "CERN Staff Association". It is a major musical event in the French-Swiss area and proposes a world-class program with jazz artists such as D. Lockwood and D. Reinhardt. More information on http://www.jurajazz.com.
International Nuclear Information System (INIS)
Talley, T.L.; Evans, F.
1988-01-01
Prior work demonstrated the importance of nuclear scattering to fusion product energy deposition in hot plasmas. This suggests careful examination of nuclear physics details in burning plasma simulations. An existing Monte Carlo fast ion transport code is being expanded to be a test bed for this examination. An initial extension, the energy deposition of fast alpha particles in a hot deuterium plasma, is reported. The deposition times and deposition ranges are modified by allowing nuclear scattering. Up to 10% of the initial alpha particle energy is carried to greater ranges and times by the more mobile recoil deuterons. 4 refs., 5 figs., 2 tabs
2012-01-01
The 5th edition of the "Monts Jura Jazz Festival" will take place at the Esplanade du Lac in Divonne-les-Bains, France on September 21 and 22. This festival, organized by the CERN Jazz Club and supported by the CERN Staff Association, is becoming a major musical event in the Geneva region. International jazz artists like Didier Lockwood and David Reinhardt are part of this year's outstanding program. The full program and e-tickets are available on the festival website. Don't miss this great festival!
Data assimilation: the ensemble Kalman filter
Evensen, Geir
2007-01-01
Data Assimilation comprehensively covers data assimilation and inverse methods, including both traditional state estimation and parameter estimation. This text and reference focuses on various popular data assimilation methods, such as weak and strong constraint variational methods and ensemble filters and smoothers. It is demonstrated how the different methods can be derived from a common theoretical basis, as well as how they differ and/or are related to each other, and which properties characterize them, using several examples. Rather than emphasize a particular discipline such as oceanography or meteorology, it presents the mathematical framework and derivations in a way which is common for any discipline where dynamics is merged with measurements. The mathematics level is modest, although it requires knowledge of basic spatial statistics, Bayesian statistics, and calculus of variations. Readers will also appreciate the introduction to the mathematical methods used and detailed derivations, which should b...
Model Uncertainty Quantification Methods In Data Assimilation
Pathiraja, S. D.; Marshall, L. A.; Sharma, A.; Moradkhani, H.
2017-12-01
Data assimilation involves utilising observations to improve model predictions in a seamless and statistically optimal fashion. Its applications are wide-ranging, from improving weather forecasts to tracking targets such as in the Apollo 11 mission. The use of data assimilation methods in high-dimensional complex geophysical systems is an active area of research, where there exist many opportunities to enhance existing methodologies. One of the central challenges is model uncertainty quantification; the outcome of any data assimilation study is strongly dependent on the uncertainties assigned to both observations and models. I focus on developing improved model uncertainty quantification methods that are applicable to challenging real-world scenarios. These include developing methods for cases where the system states are only partially observed, where there is little prior knowledge of the model errors, and where the model error statistics are likely to be highly non-Gaussian.
The Acceleration of Immigrant Unhealthy Assimilation.
Giuntella, Osea; Stella, Luca
2017-04-01
It is well known that immigrants tend to be healthier than US natives and that this advantage erodes with time spent in the USA. However, we know less about the heterogeneity of these trajectories among arrival cohorts. Recent studies have shown that later arrival cohorts of immigrants have lower entry wages and experience less economic assimilation. In this paper, we investigate whether similar cohort effects can be observed in the weight assimilation of immigrants in the USA. Focusing on obesity, we show that more recent immigrant cohorts arrive with higher obesity rates and experience a faster 'unhealthy assimilation' in terms of weight gain. Copyright © 2016 John Wiley & Sons, Ltd.
Random sequential adsorption of cubes.
Cieśla, Michał; Kubala, Piotr
2018-01-14
Random packings built of cubes are studied numerically using a random sequential adsorption algorithm. To compare the obtained results with previous reports, three different models of cube orientation sampling were used. Also, three different cube-cube intersection algorithms were tested to find the most efficient one. The study focuses on the mean saturated packing fraction as well as kinetics of packing growth. Microstructural properties of packings were analyzed using density autocorrelation function.
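The RSA algorithm described above can be sketched for the simpler case of equal disks on the unit square; the cube packings of the paper additionally require orientation sampling and a cube-cube intersection test, which are not reproduced in this illustrative sketch (names are my own).

```python
import random

def rsa_disks(n_attempts, radius, seed=0):
    """Random sequential adsorption of equal disks on the unit square:
    each trial position is accepted only if the new disk overlaps none of
    the previously adsorbed ones (a 2-D disk analogue of the cube packings
    studied in the paper)."""
    rng = random.Random(seed)
    placed = []
    min_d2 = (2.0 * radius) ** 2     # squared minimum centre-to-centre distance
    for _ in range(n_attempts):
        x, y = rng.random(), rng.random()
        if all((x - px) ** 2 + (y - py) ** 2 >= min_d2 for px, py in placed):
            placed.append((x, y))
    return placed
```

As attempts accumulate, acceptances become rare and the coverage approaches its saturated packing fraction, which is the quantity the paper measures for cubes.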
A nested sampling particle filter for nonlinear data assimilation
Elsheikh, Ahmed H.
2014-04-15
We present an efficient nonlinear data assimilation filter that combines particle filtering with the nested sampling algorithm. Particle filters (PF) utilize a set of weighted particles as a discrete representation of probability distribution functions (PDF). These particles are propagated through the system dynamics and their weights are sequentially updated based on the likelihood of the observed data. Nested sampling (NS) is an efficient sampling algorithm that iteratively builds a discrete representation of the posterior distributions by focusing a set of particles to high-likelihood regions. This would allow the representation of the posterior PDF with a smaller number of particles and reduce the effects of the curse of dimensionality. The proposed nested sampling particle filter (NSPF) iteratively builds the posterior distribution by applying a constrained sampling from the prior distribution to obtain particles in high-likelihood regions of the search space, resulting in a reduction of the number of particles required for an efficient behaviour of particle filters. Numerical experiments with the 3-dimensional Lorenz63 and the 40-dimensional Lorenz96 models show that NSPF outperforms PF in accuracy with a relatively smaller number of particles. © 2013 Royal Meteorological Society.
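The plain particle filter that NSPF is compared against follows the propagate-weight-resample cycle described above. A minimal bootstrap PF sketch (illustrative only; the paper's nested-sampling replacement for the resampling stage is not reproduced, and all names are my own):

```python
import math
import random

def pf_step(particles, obs, propagate, loglik, rng):
    """One bootstrap particle filter cycle: propagate each particle through
    the system dynamics, weight by the observation likelihood, then
    multinomially resample.  The NSPF of the paper replaces the resampling
    stage with nested-sampling-style constrained draws."""
    particles = [propagate(p, rng) for p in particles]
    logw = [loglik(obs, p) for p in particles]
    m = max(logw)
    w = [math.exp(lw - m) for lw in logw]            # numerically stable weights
    total = sum(w)
    cdf, acc = [], 0.0
    for wi in w:
        acc += wi / total
        cdf.append(acc)
    cdf[-1] = 1.0                                    # guard against rounding
    resampled = []
    for _ in particles:
        u = rng.random()
        resampled.append(particles[next(i for i, c in enumerate(cdf) if c >= u)])
    return resampled
```

After resampling, particles concentrate in high-likelihood regions, which is exactly the behaviour NSPF aims to achieve with fewer particles.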
Winiarek, Victor; Vira, Julius; Bocquet, Marc; Sofiev, Mikhail; Saunier, Olivier
2011-06-01
In the event of an accidental atmospheric release of radionuclides from a nuclear power plant, accurate real-time forecasting of the activity concentrations of radionuclides is required by decision makers for the preparation of adequate countermeasures. The accuracy of the forecast plume is highly dependent on the source term estimation. On several academic test cases, including real data, inverse modelling and data assimilation techniques have been shown to help in the assessment of the source term. In this paper, a semi-automatic method is proposed for the sequential reconstruction of the plume, by implementing a sequential data assimilation algorithm based on inverse modelling, with care taken to develop realistic methods for operational risk agencies. The performance of the assimilation scheme has been assessed through an intercomparison between French and Finnish frameworks. Two dispersion models have been used: Polair3D and Silam, developed in two different research centres. Different release locations, as well as different meteorological situations, are tested. The existing and newly planned surveillance networks are used, and realistically large multiplicative observational errors are assumed. The inverse modelling scheme accounts for the strong error bias encountered with such errors. The efficiency of the data assimilation system is tested via statistical indicators. For France and Finland, the average performance of the data assimilation system is strong. However, there are outlying situations where the inversion fails because the observability is too poor. In addition, for the case where the power plant responsible for the accidental release is not known, robust statistical tools are developed and tested to discriminate between candidate release sites.
Hu, Shun; Shi, Liangsheng; Zha, Yuanyuan; Williams, Mathew; Lin, Lin
2017-12-01
Improvements to agricultural water and crop management require detailed information on crop and soil states and their evolution. Data assimilation provides an attractive way of obtaining this information by integrating measurements with a model in a sequential manner. However, data assimilation for the soil-water-atmosphere-plant (SWAP) system still lacks comprehensive exploration due to the large number of variables and parameters in the system. In this study, simultaneous state-parameter estimation using the ensemble Kalman filter (EnKF) was employed to evaluate the data assimilation performance and provide advice on measurement design for the SWAP system. The results demonstrated that a proper selection of the state vector is critical to effective data assimilation. In particular, updating the development stage was able to avoid the negative effect of "phenological shift", which was caused by contrasting phenological stages in different ensemble members. The simultaneous state-parameter estimation (SSPE) assimilation strategy outperformed the updating-state-only (USO) assimilation strategy because of its ability to alleviate the inconsistency between model variables and parameters. However, the performance of the SSPE assimilation strategy could deteriorate with an increasing number of uncertain parameters as a result of soil stratification and limited knowledge of crop parameters. In addition to the most easily available surface soil moisture (SSM) and leaf area index (LAI) measurements, deep soil moisture, grain yield or other auxiliary data were required to provide sufficient constraints on parameter estimation and to assure the data assimilation performance. This study provides insight into the response of soil moisture and grain yield to data assimilation in the SWAP system and is helpful for soil moisture movement and crop growth modeling and for measurement design in practice.
International Nuclear Information System (INIS)
Ding, Yi; Wang, Peng; Goel, Lalit; Billinton, Roy; Karki, Rajesh
2007-01-01
This paper presents a technique to evaluate the reliability of a restructured power system with a bilateral market. The proposed technique is based on a combination of the reliability network equivalent and pseudo-sequential simulation approaches. The reliability network equivalent techniques have been implemented in the Monte Carlo simulation procedure to reduce the computational burden of the analysis. Pseudo-sequential simulation has been used to increase the computational efficiency of the non-sequential simulation method and to model the chronological aspects of market trading and system operation. Multi-state Markov models for generation and transmission systems are proposed and implemented in the simulation. A new load shedding scheme is proposed for conditions of generation inadequacy and network congestion to minimize the load curtailment. The IEEE reliability test system (RTS) is used to illustrate the technique. (author)
Vrugt, J.A.; ter Braak, C.J.F.; Diks, C.G.H.; Schoups, G.
2013-01-01
During the past decades much progress has been made in the development of computer based methods for parameter and predictive uncertainty estimation of hydrologic models. The goal of this paper is twofold. As part of this special anniversary issue we first shortly review the most important
Application of Wald's sequential probability ratio test to nuclear materials control
International Nuclear Information System (INIS)
Fehlau, P.E.; Coop, K.L.; Markin, J.T.
1984-01-01
We have replaced traditional analysis methods for nuclear material control monitoring with hypothesis testing, specifically with Wald's sequential probability ratio test. Our evaluation of Wald's method, applied in both vehicle and pedestrian SNM monitors, is by Monte Carlo calculation to determine the alarm probability and average monitoring times of the monitors. The vehicle monitor with Wald's test has a much shorter monitoring delay than with traditional methods, without serious compensating changes in operating characteristics. The pedestrian monitor with Wald's method also has advantages over the traditional single-interval test, in that the Wald method duplicates the advantages of a moving-average technique. We verified the Monte Carlo calculations for the pedestrian monitor by means of a special program for the monitor's microprocessor controller. The observations of false-alarm probability and average monitoring time for over 500,000 tests verified the Monte Carlo results.
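Wald's sequential probability ratio test accumulates the log-likelihood ratio of successive observations until one of two thresholds is crossed. A generic sketch of the test (not the monitors' actual firmware; thresholds use Wald's standard approximations, and the names are my own):

```python
import math

def sprt(llr_increments, alpha=0.01, beta=0.01):
    """Wald's sequential probability ratio test: accumulate per-interval
    log-likelihood ratios until a decision boundary is crossed.
    alpha: false-alarm probability, beta: missed-detection probability."""
    upper = math.log((1.0 - beta) / alpha)    # cross above -> accept H1 (alarm)
    lower = math.log(beta / (1.0 - alpha))    # cross below -> accept H0
    s = 0.0
    for n, inc in enumerate(llr_increments, start=1):
        s += inc
        if s >= upper:
            return "alarm", n
        if s <= lower:
            return "no alarm", n
    return "undecided", len(llr_increments)
```

The appeal for portal monitors is visible in the return value: strong evidence terminates the test after only a few intervals, which is why monitoring delays shrink relative to fixed-length tests.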
Markov Chain Monte Carlo Methods
Indian Academy of Sciences (India)
time Technical Consultant to Systat Software Asia-Pacific (P) Ltd., Bangalore, where the technical work for the development of the statistical software Systat takes place. His research interests have been in statistical pattern recognition and biostatistics. Keywords: Markov chain, Monte Carlo sampling, Markov chain Monte Carlo.
Markov Chain Monte Carlo Methods
Indian Academy of Sciences (India)
ter of the 20th century, due to rapid developments in computing technology ... early part of this development saw a host of Monte ... These iterative Monte Carlo procedures typically generate a random sequence with the Markov property such that the Markov chain is ergodic with a limiting distribution coinciding with the ...
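The iterative procedure described in the fragment above — a Markov chain whose limiting distribution coincides with the target — is most simply realized by random-walk Metropolis. A minimal sketch (generic illustration; names are my own):

```python
import math
import random

def metropolis(logpi, x0, steps, scale, seed=0):
    """Random-walk Metropolis sampler: the generated Markov chain is
    ergodic with limiting distribution proportional to exp(logpi(x))."""
    rng = random.Random(seed)
    x, lp = x0, logpi(x0)
    chain = []
    for _ in range(steps):
        prop = x + rng.gauss(0.0, scale)
        lp_prop = logpi(prop)
        # accept with probability min(1, pi(prop) / pi(x))
        if lp_prop - lp >= 0 or rng.random() < math.exp(lp_prop - lp):
            x, lp = prop, lp_prop
        chain.append(x)
    return chain
```

Averages over the chain then converge to expectations under the target distribution, which is the property MCMC methods exploit.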
Data ingestion and assimilation in ionospheric models
Czech Academy of Sciences Publication Activity Database
Burešová, Dalia; Nava, B.; Galkin, I.; Angling, M.; Stankov, S. M.; Coisson, P.
2009-01-01
Vol. 52, 3/4 (2009), pp. 235-253 ISSN 1593-5213 R&D Projects: GA ČR GA205/08/1356; GA MŠk OC 091 Institutional research plan: CEZ:AV0Z30420517 Keywords: ionosphere * models * data assimilation * data ingestion Subject RIV: DG - Atmosphere Sciences, Meteorology Impact factor: 0.548, year: 2009
DIVERSE APPROACHES TO MODELLING THE ASSIMILATIVE ...
African Journals Online (AJOL)
This study evaluated the assimilative capacity of Ikpoba River using different approaches, namely: homogeneous differential equations, ANOVA/Duncan multiple range test, first- and second-order differential equations, correlation analysis, eigenvalues and eigenvectors, multiple linear regression, bootstrapping and far-field ...
Nitrogen assimilation in soybean nodules, 1
International Nuclear Information System (INIS)
Ohyama, Takuji; Kumazawa, Kikuo
1980-01-01
In order to elucidate the pathways by which the ammonia produced by N2 fixation is assimilated in soybean nodules, 15N-labeled compounds were administered to intact nodules or nodule slices pretreated with various inhibitors of nitrogen assimilation. After exposure to 15N2, 15N incorporation into various nitrogenous compounds was investigated in attached nodules injected with methionine sulfoximine (MSX) or azaserine (AS). MSX treatment increased the 15N content of ammonia more than 6-fold, but depressed the 15N content of most amides and amino acids. AS treatment enhanced the 15N content of the amido-N of glutamine as well as of ammonia, but decreased the amino-N of glutamine and of most amino acids. Experiments with nodule slices pretreated with MSX or AS solution and then fed with 15N-labeled ammonia or the amido-15N of glutamine showed the same trends. Aminooxyacetate inhibited nitrogen flow from glutamic acid to other amino acids. These results strongly indicate that the ammonia produced by N2 fixation is assimilated by the GS/GOGAT system to glutamic acid and then transaminated to various amino acids in situ. The 15N incorporation patterns in nodule slices fed with 15N-labeled ammonia, hydroxylamine, and nitrite were similar, but nitrate seemed to be reduced in a definite compartment and assimilated similarly as in intact nodules fed with 15N2. (author)
Data Assimilation in Discrete Event Simulations
Xie, X.
2018-01-01
Enabled by the increased availability of data, the data assimilation technique, which incorporates measured observations into a dynamical system model to produce a time sequence of estimated system states, has gained popularity. The main reason is that it can produce more accurate estimation results than
Data assimilation for air quality models
DEFF Research Database (Denmark)
Silver, Jeremy David
2014-01-01
-dimensional optimal interpolation procedure (OI), an Ensemble Kalman Filter (EnKF), and a three-dimensional variational scheme (3D-var). The three assimilation procedures are described and tested. A multi-faceted approach is taken for the verification, using independent measurements from surface air-quality...
Significance of Assimilation and Fractional Crystallization (AFC ...
Indian Academy of Sciences (India)
andesites of the Chhotaudepur area plot close to a consistent mixing trend between typical mantle composition ... Deccan tholeiites and alkaline felsic rocks also exhibit a significant trend of crustal contamination. ... Keeping the rate of assimilation to fractional crystallization (r) at 0.3, the binary plotting was carried out and ...
Monte Carlo Methods in Physics
International Nuclear Information System (INIS)
Santoso, B.
1997-01-01
The method of Monte Carlo integration is reviewed briefly and some of its applications in physics are explained. A numerical experiment on the random generators used in Monte Carlo techniques is carried out to show the behavior of the various methods of generating random numbers. To account for the weight function involved in Monte Carlo, the Metropolis method is used. From the results of the experiment, one can see that there are no regular patterns in the numbers generated, showing that the program generators are reasonably good, while the experimental results show a distribution obeying the expected statistical law. Further, some applications of Monte Carlo methods in physics are given. The physical problems are chosen such that the models have solutions available, either exact or approximate, against which the Monte Carlo calculations can be compared. The comparisons show that good agreement has been obtained for the models considered.
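The basic Monte Carlo integration reviewed above can be sketched in a few lines: the integral of f over [a, b] is estimated as (b - a) times the sample mean of f at uniform random points (a generic sketch; names are my own).

```python
import random

def mc_integrate(f, a, b, n, seed=0):
    """Plain Monte Carlo estimate of the integral of f over [a, b]:
    (b - a) times the average of f at n uniform random points.
    The statistical error decreases like 1/sqrt(n)."""
    rng = random.Random(seed)
    total = sum(f(a + (b - a) * rng.random()) for _ in range(n))
    return (b - a) * total / n
```

For a target with a non-uniform weight function, the uniform sampling here is replaced by Metropolis sampling from the weight, as the abstract describes.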
A flexible Open Data Assimilation framework
van Velzen, Nils; Ridler, Marc E.; Altaf, Umer; Madsen, Henrik; Heemink, Arnold; Dijkzeul, Johan
2015-04-01
Accurate and reliable real-time hydrological forecasts are essential for protection against water-related hazards, operation of infrastructure, and water resources management. Recent advances in radar rainfall estimation and forecasting, numerical weather prediction, satellite and in-situ monitoring, and faster computing facilities are opening up new opportunities in real-time hydrological forecasting. More effective use of the different information sources via data assimilation will provide the basis for producing more accurate and more reliable forecasts. In this regard, the development and implementation of robust and computationally efficient data assimilation algorithms that are feasible for real-time applications remains one of the key challenges. Data assimilation techniques are traditionally implemented in a model-specific form. The disadvantage of this approach is the need for in-depth knowledge of the numerical core computations, and it does not allow one to freely experiment with data assimilation algorithms and measurement sources without additional programming. We present a more flexible approach to setting up a forecasting system. The OpenDA data assimilation framework contains many state-of-the-art data assimilation algorithms with which a forecasting system can easily be set up, and its design allows users to select and experiment with various algorithms. OpenDA defines an interface between model and data assimilation algorithms; this interface only needs to be implemented once for a particular model, and it is already implemented for various models. Besides these models, it is very easy to couple models that already implement the Open Model Interface (OpenMI) to OpenDA using the generic OpenMI-OpenDA coupler. Using a synthetic test case we demonstrate the capabilities of the proposed approach using OpenMI and OpenDA. We use the MIKE SHE distributed and integrated hydrological modeling system to demonstrate how
Raeder, K.; Hoar, T. J.; Anderson, J. L.; Collins, N.; Hendricks, J.; Kershaw, H.; Ha, S.; Snyder, C.; Skamarock, W. C.; Mizzi, A. P.; Liu, H.; Liu, J.; Pedatella, N. M.; Karspeck, A. R.; Karol, S. I.; Bitz, C. M.; Zhang, Y.
2017-12-01
The capabilities of the Data Assimilation Research Testbed (DART) at NCAR have been significantly expanded with the recent "Manhattan" release. DART is an ensemble Kalman filter based suite of tools, which enables researchers to use data assimilation (DA) without first becoming DA experts. Highlights: significant improvement in efficient ensemble DA for very large models on thousands of processors, direct read and write of model state files in parallel, more control of the DA output for finer-grained analysis, new model interfaces which are useful to a variety of geophysical researchers, new observation forward operators and the ability to use precomputed forward operators from the forecast model. The new model interfaces and example applications include the following: MPAS-A; Model for Prediction Across Scales - Atmosphere is a global, nonhydrostatic, variable-resolution mesh atmospheric model, which facilitates multi-scale analysis and forecasting. The absence of distinct subdomains eliminates problems associated with subdomain boundaries. It demonstrates the ability to consistently produce higher-quality analyses than coarse, uniform meshes do. WRF-Chem; Weather Research and Forecasting + (MOZART) Chemistry model assimilates observations from FRAPPÉ (Front Range Air Pollution and Photochemistry Experiment). WACCM-X; Whole Atmosphere Community Climate Model with thermosphere and ionosphere eXtension assimilates observations of electron density to investigate sudden stratospheric warming. CESM (weakly) coupled assimilation; NCAR's Community Earth System Model is used for assimilation of atmospheric and oceanic observations into their respective components using coupled atmosphere+land+ocean+sea+ice forecasts. CESM2.0; Assimilation in the atmospheric component (CAM, WACCM) of the newly released version is supported. This version contains new and extensively updated components and software environment. CICE; Los Alamos sea ice model (in CESM) is used to assimilate
Random sequential adsorption on fractals.
Ciesla, Michal; Barbasz, Jakub
2012-07-28
Irreversible adsorption of spheres on flat collectors of dimension d is studied: on fractals (1 < d < 2) and on a general Cantor set (d < 1). The adsorption process is modeled numerically using the random sequential adsorption (RSA) algorithm. The paper concentrates on the measurement of fundamental properties of the coverages, i.e., the maximal random coverage ratio and the density autocorrelation function, as well as the RSA kinetics. The obtained results allow us to improve the phenomenological relation between the maximal random coverage ratio and the collector dimension. Moreover, the simulations show that, in general, most of the known dimensional properties of adsorbed monolayers remain valid for non-integer dimensions.
Assimilation of Doppler weather radar observations in a mesoscale ...
Indian Academy of Sciences (India)
The variational data assimilation approach is one of the most promising tools available for directly assimilating mesoscale observations in order to improve the initial state. The horizontal wind derived from the DWR has been used along with other conventional and non-conventional data in the assimilation system.
Metropolis Methods for Quantum Monte Carlo Simulations
Ceperley, D. M.
2003-01-01
Since its first description fifty years ago, the Metropolis Monte Carlo method has been used in a variety of different ways for the simulation of continuum quantum many-body systems. This paper will consider some of the generalizations of the Metropolis algorithm employed in quantum Monte Carlo: variational Monte Carlo, dynamical methods for projector Monte Carlo (i.e., diffusion Monte Carlo with rejection), multilevel sampling in path integral Monte Carlo, the sampling of permutations, ...
Parallelizing Monte Carlo with PMC
International Nuclear Information System (INIS)
Rathkopf, J.A.; Jones, T.R.; Nessett, D.M.; Stanberry, L.C.
1994-11-01
PMC (Parallel Monte Carlo) is a system of generic interface routines that allows easy porting of Monte Carlo packages of large-scale physics simulation codes to Massively Parallel Processor (MPP) computers. By loading various versions of PMC, simulation code developers can configure their codes to run in several modes: serial, Monte Carlo runs on the same processor as the rest of the code; parallel, Monte Carlo runs in parallel across many processors of the MPP with the rest of the code running on other MPP processor(s); distributed, Monte Carlo runs in parallel across many processors of the MPP with the rest of the code running on a different machine. This multi-mode approach allows maintenance of a single simulation code source regardless of the target machine. PMC handles passing of messages between nodes on the MPP, passing of messages between a different machine and the MPP, distributing work between nodes, and providing independent, reproducible sequences of random numbers. Several production codes have been parallelized under the PMC system. Excellent parallel efficiency in both the distributed and parallel modes results if sufficient workload is available per processor. Experiences with a Monte Carlo photonics demonstration code and a Monte Carlo neutronics package are described
Lectures on Monte Carlo methods
Madras, Neal
2001-01-01
Monte Carlo methods form an experimental branch of mathematics that employs simulations driven by random number generators. These methods are often used when others fail, since they are much less sensitive to the "curse of dimensionality", which plagues deterministic methods in problems with a large number of variables. Monte Carlo methods are used in many fields: mathematics, statistics, physics, chemistry, finance, computer science, and biology, for instance. This book is an introduction to Monte Carlo methods for anyone who would like to use these methods to study various kinds of mathemati
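The insensitivity to the "curse of dimensionality" noted above can be illustrated with a plain Monte Carlo integral in ten dimensions (a sketch of ours, not taken from the book):

```python
import random

def mc_integrate(f, dim, n=100_000):
    """Estimate the integral of f over the unit hypercube [0, 1]^dim
    as the sample mean of f at n uniformly drawn points."""
    total = 0.0
    for _ in range(n):
        point = [random.random() for _ in range(dim)]
        total += f(point)
    return total / n

# In 10 dimensions, the integral of sum(x_i) over [0,1]^10 is exactly 5;
# the estimator's error still shrinks as 1/sqrt(n), independent of dim.
estimate = mc_integrate(sum, dim=10)
```

A deterministic grid with even 10 points per axis would need 10^10 evaluations here; the Monte Carlo estimate uses the same n regardless of dimension.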
Nonstationary source separation using sequential and variational Bayesian learning.
Chien, Jen-Tzung; Hsieh, Hsin-Lung
2013-05-01
Independent component analysis (ICA) is a popular approach for blind source separation where the mixing process is assumed to be unchanged with a fixed set of stationary source signals. However, the mixing system and source signals are nonstationary in real-world applications; e.g., the source signals may abruptly appear or disappear, be replaced by new ones, or even move over time. This paper presents an online learning algorithm for the Gaussian process (GP) and establishes a separation procedure in the presence of nonstationary and temporally correlated mixing coefficients and source signals. In this procedure, we capture the evolved statistics from sequential signals according to online Bayesian learning. The activity of nonstationary sources is reflected by an automatic relevance determination, which is incrementally estimated at each frame and continuously propagated to the next frame. We employ the GP to characterize the temporal structures of time-varying mixing coefficients and source signals. A variational Bayesian inference is developed to approximate the true posterior for estimating the nonstationary ICA parameters and for characterizing the activity of latent sources. The differences between this ICA method and the sequential Monte Carlo ICA are illustrated. In the experiments, the proposed algorithm outperforms the other ICA methods for the separation of audio signals in the presence of different nonstationary scenarios.
The COsmic-ray Soil Moisture Interaction Code (COSMIC) for use in data assimilation
Directory of Open Access Journals (Sweden)
J. Shuttleworth
2013-08-01
Full Text Available Soil moisture status in land surface models (LSMs) can be updated by assimilating cosmic-ray neutron intensity measured in air above the surface. This requires a fast and accurate model to calculate the neutron intensity from the profiles of soil moisture modeled by the LSM. The existing Monte Carlo N-Particle eXtended (MCNPX) model is sufficiently accurate but too slow to be practical in the context of data assimilation. Consequently an alternative and efficient model is needed which can be calibrated accurately to reproduce the calculations made by MCNPX and used to substitute for MCNPX during data assimilation. This paper describes the construction and calibration of such a model, the COsmic-ray Soil Moisture Interaction Code (COSMIC), which is simple, physically based and analytic, and which, because it runs at least 50 000 times faster than MCNPX, is appropriate in data assimilation applications. The model includes simple descriptions of (a) degradation of the incoming high-energy neutron flux with soil depth, (b) creation of fast neutrons at each depth in the soil, and (c) scattering of the resulting fast neutrons before they reach the soil surface, all of which processes may have parameterized dependency on the chemistry and moisture content of the soil. The site-to-site variability in the parameters used in COSMIC is explored for 42 sample sites in the COsmic-ray Soil Moisture Observing System (COSMOS), and the comparative performance of COSMIC relative to MCNPX when applied to represent interactions between cosmic-ray neutrons and moist soil is explored. At an example site in Arizona, fast-neutron counts calculated by COSMIC from the average soil moisture profile given by an independent network of point measurements in the COSMOS probe footprint are similar to the fast-neutron intensity measured by the COSMOS probe. It was demonstrated that, when used within a data assimilation framework to assimilate COSMOS probe counts into the Noah land surface
Wormhole Hamiltonian Monte Carlo
Lan, Shiwei; Streets, Jeffrey; Shahbaba, Babak
2015-01-01
In machine learning and statistics, probabilistic inference involving multimodal distributions is quite difficult. This is especially true in high dimensional problems, where most existing algorithms cannot easily move from one mode to another. To address this issue, we propose a novel Bayesian inference approach based on Markov Chain Monte Carlo. Our method can effectively sample from multimodal distributions, especially when the dimension is high and the modes are isolated. To this end, it exploits and modifies the Riemannian geometric properties of the target distribution to create wormholes connecting modes in order to facilitate moving between them. Further, our proposed method uses the regeneration technique in order to adapt the algorithm by identifying new modes and updating the network of wormholes without affecting the stationary distribution. To find new modes, as opposed to rediscovering those previously identified, we employ a novel mode searching algorithm that explores a residual energy function obtained by subtracting an approximate Gaussian mixture density (based on previously discovered modes) from the target density function. PMID:25861551
Kumar, Sujay V.; Zaitchik, Benjamin F.; Peters-Lidard, Christa D.; Rodell, Matthew; Reichle, Rolf; Li, Bailing; Jasinski, Michael; Mocko, David; Getirana, Augusto; De Lannoy, Gabrielle;
2016-01-01
The objective of the North American Land Data Assimilation System (NLDAS) is to provide best available estimates of near-surface meteorological conditions and soil hydrological status for the continental United States. To support the ongoing efforts to develop data assimilation (DA) capabilities for NLDAS, the results of Gravity Recovery and Climate Experiment (GRACE) DA implemented in a manner consistent with NLDAS development are presented. Following previous work, GRACE terrestrial water storage (TWS) anomaly estimates are assimilated into the NASA Catchment land surface model using an ensemble smoother. In contrast to many earlier GRACE DA studies, a gridded GRACE TWS product is assimilated, spatially distributed GRACE error estimates are accounted for, and the impact that GRACE scaling factors have on assimilation is evaluated. Comparisons with quality-controlled in situ observations indicate that GRACE DA has a positive impact on the simulation of unconfined groundwater variability across the majority of the eastern United States and on the simulation of surface and root zone soil moisture across the country. Smaller improvements are seen in the simulation of snow depth, and the impact of GRACE DA on simulated river discharge and evapotranspiration is regionally variable. The use of GRACE scaling factors during assimilation improved DA results in the western United States but led to small degradations in the eastern United States. The study also found comparable performance between the use of gridded and basin averaged GRACE observations in assimilation. Finally, the evaluations presented in the paper indicate that GRACE DA can be helpful in improving the representation of droughts.
The Bacterial Sequential Markov Coalescent.
De Maio, Nicola; Wilson, Daniel J
2017-05-01
Bacteria can exchange and acquire new genetic material from other organisms directly and via the environment. This process, known as bacterial recombination, has a strong impact on the evolution of bacteria, for example, leading to the spread of antibiotic resistance across clades and species, and to the avoidance of clonal interference. Recombination hinders phylogenetic and transmission inference because it creates patterns of substitutions (homoplasies) inconsistent with the hypothesis of a single evolutionary tree. Bacterial recombination is typically modeled as statistically akin to gene conversion in eukaryotes, i.e. , using the coalescent with gene conversion (CGC). However, this model can be very computationally demanding as it needs to account for the correlations of evolutionary histories of even distant loci. So, with the increasing popularity of whole genome sequencing, the need has emerged for a faster approach to model and simulate bacterial genome evolution. We present a new model that approximates the coalescent with gene conversion: the bacterial sequential Markov coalescent (BSMC). Our approach is based on a similar idea to the sequential Markov coalescent (SMC)-an approximation of the coalescent with crossover recombination. However, bacterial recombination poses hurdles to a sequential Markov approximation, as it leads to strong correlations and linkage disequilibrium across very distant sites in the genome. Our BSMC overcomes these difficulties, and shows a considerable reduction in computational demand compared to the exact CGC, and very similar patterns in simulated data. We implemented our BSMC model within new simulation software FastSimBac. In addition to the decreased computational demand compared to previous bacterial genome evolution simulators, FastSimBac provides more general options for evolutionary scenarios, allowing population structure with migration, speciation, population size changes, and recombination hotspots. FastSimBac is
Data assimilation approaches in the EURANOS project
DEFF Research Database (Denmark)
Kaiser, J.C.; Gering, F.; Astrup, Poul
2010-01-01
Within the EURANOS project data assimilation (DA) approaches have been successfully applied in two areas to improve the predictive power of simulation models used in the RODOS and ARGOS decision support systems. For atmospheric dispersion modelling and for modelling the fate of radio-nuclides in urban areas, the results of demonstration exercises are presented here. With the data assimilation module of the RIMPUFF dispersion code, predictions of the gamma dose rate are corrected with simulated readings of fixed detector stations. Using the DA capabilities of the IAMM package for mapping the radioactive contamination in inhabited areas, predictions of a large-scale deposition model have been combined with hypothetical measurements on a local scale. In both examples the accuracy of the model predictions has been improved and the uncertainties have been reduced. © EDP Sciences, 2010
Data assimilation in integrated hydrological modelling
DEFF Research Database (Denmark)
Rasmussen, Jørn
Integrated hydrological models are useful tools for water resource management and research, and advances in computational power and the advent of new observation types have resulted in the models generally becoming more complex and distributed. However, the models are often characterized by a high degree of parameterization, which results in significant model uncertainty that cannot be reduced much because observations are often scarce and often take the form of point measurements. Data assimilation shows great promise for use in integrated hydrological models, as it allows observations to be efficiently combined with models to improve model predictions, reduce uncertainty and estimate model parameters. In this thesis, a framework for assimilating multiple observation types and updating multiple components and parameters of a catchment-scale integrated hydrological model is developed and tested...
Directory of Open Access Journals (Sweden)
Suyanto Suyanto
2015-01-01
Full Text Available This paper discusses the use of the short-term energy contour of speech, smoothed by a fuzzy-based method, to automatically segment speech into syllabic units. Two additional procedures, local normalization and postprocessing, are proposed to improve the method. Testing on an Indonesian speech dataset shows that local normalization significantly improves the accuracy of fuzzy smoothing. In the postprocessing step, the procedure of splitting missed short syllables reduces the deletion errors but unfortunately increases the insertion ones. On the other hand, assimilation of a single consonant segment into its previous or next segment reduces the insertion errors but increases the deletion ones. The sequential combination of splitting and then assimilation gives a quite significant improvement in accuracy as well as a reduction in deletion errors, though it slightly increases the insertion ones.
Variational Data Assimilation for the Global Ocean
2013-01-01
James A. Cummings and Ole... [remainder of author list truncated]; program element number 0602435N. Surviving abstract fragments: the region has relatively uniform density at depths of 200-400 m. 13.3.3 Multivariate Correlations: the horizontal and vertical correlation functions described... The primary application of the analysis error covariance program is as a constraint in the Ensemble Transform technique (Sect. 13.5.3).
Wage Assimilation of Immigrants in Spain
Zenón Jiménez-Ridruejo; Carlos Borondo Arribas
2011-01-01
In this study we quantify the effect of years of residence in Spain on the earnings of immigrants, taking sex, origin, education and age into account. The results are clearly positive: the longer the residence, the higher the earnings, confirming the hypothesis of wage assimilation of immigrants as their human capital adapts to the Spanish labor market. The information used comes from the Social Security's Continuous Sample of Working Lives 2007. Additionally, we merge the earning...
Covariance Function for Nearshore Wave Assimilation Systems
2018-01-30
optimization of the assimilation systems has to be driven by the data taking into account the physics of the wave field. Whereas the temporal...known and fairly straightforward, but technically challenging (Bennett, 2002; Menke, 2012). However, determination of the covariances is an open...question with multiple scientific challenges due to the number of possible errors from all the sources (instrument error and error of
Geology of Maxwell Montes, Venus
Head, J. W.; Campbell, D. B.; Peterfreund, A. R.; Zisk, S. A.
1984-01-01
Maxwell Montes represents the most distinctive topography on the surface of Venus, rising some 11 km above mean planetary radius. The multiple data sets of the Pioneer mission and Earth-based radar observations are analyzed to characterize Maxwell Montes. Maxwell Montes is a porkchop-shaped feature located at the eastern end of Lakshmi Planum. The main massif trends about North 20 deg West for approximately 1000 km, and the narrow handle extends several hundred km west-southwest (WSW) from the north end of the main massif, descending toward Lakshmi Planum. The main massif is rectilinear and approximately 500 km wide. The southern and northern edges of Maxwell Montes coincide with major topographic boundaries defining the edge of Ishtar Terra.
Handbook of Monte Carlo methods
National Research Council Canada - National Science Library
Kroese, Dirk P; Taimre, Thomas; Botev, Zdravko I
2011-01-01
... in rapid succession, the staggering number of related techniques, ideas, concepts and algorithms makes it difficult to maintain an overall picture of the Monte Carlo approach. This book attempts to encapsulate the emerging dynamics of this field of study"--
Monte Carlo simulation for IRRMA
International Nuclear Information System (INIS)
Gardner, R.P.; Liu Lianyan
2000-01-01
Monte Carlo simulation is fast becoming a standard approach for many radiation applications that were previously treated almost entirely by experimental techniques. This is certainly true for Industrial Radiation and Radioisotope Measurement Applications - IRRMA. The reasons for this include: (1) the increased cost and inadequacy of experimentation for design and interpretation purposes; (2) the availability of low cost, large memory, and fast personal computers; and (3) the general availability of general purpose Monte Carlo codes that are increasingly user-friendly, efficient, and accurate. This paper discusses the history and present status of Monte Carlo simulation for IRRMA including the general purpose (GP) and specific purpose (SP) Monte Carlo codes and future needs - primarily from the experience of the authors
Design of Experiments, Model Calibration and Data Assimilation
Energy Technology Data Exchange (ETDEWEB)
Williams, Brian J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2014-07-30
This presentation provides an overview of emulation, calibration and experiment design for computer experiments. Emulation refers to building a statistical surrogate from a carefully selected and limited set of model runs to predict unsampled outputs. The standard kriging approach to emulation of complex computer models is presented. Calibration refers to the process of probabilistically constraining uncertain physics/engineering model inputs to be consistent with observed experimental data. An initial probability distribution for these parameters is updated using the experimental information. Markov chain Monte Carlo (MCMC) algorithms are often used to sample the calibrated parameter distribution. Several MCMC algorithms commonly employed in practice are presented, along with a popular diagnostic for evaluating chain behavior. Space-filling approaches to experiment design for selecting model runs to build effective emulators are discussed, including Latin Hypercube Design and extensions based on orthogonal array skeleton designs and imposed symmetry requirements. Optimization criteria that further enforce space-filling, possibly in projections of the input space, are mentioned. Designs to screen for important input variations are summarized and used for variable selection in a nuclear fuels performance application. This is followed by illustration of sequential experiment design strategies for optimization, global prediction, and rare event inference.
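The Latin Hypercube Design mentioned above can be sketched in a few lines (our illustration of the basic construction, not the presentation's extended orthogonal-array variants):

```python
import random

def latin_hypercube(n, dim):
    """n points in [0, 1]^dim such that each coordinate axis, split into
    n equal strata, receives exactly one point per stratum."""
    columns = []
    for _ in range(dim):
        strata = list(range(n))
        random.shuffle(strata)  # random stratum order per dimension
        # Jitter each point uniformly within its assigned stratum
        columns.append([(s + random.random()) / n for s in strata])
    return [[columns[d][i] for d in range(dim)] for i in range(n)]
```

Each one-dimensional projection of the design is perfectly stratified, which is why such designs are favored for building emulators from a limited budget of model runs.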
Effect of S-07 on 14CO2 assimilation and distribution of assimilates during ripening stage of wheat
International Nuclear Information System (INIS)
Yu Meiyu; Wang Xi; Tao Longxing; Huang Xiaolin
1995-01-01
The effect of S-07 (S-3307, uniconazole) on 14CO2 assimilation and distribution of assimilates during the ripening stage of wheat was studied. The experimental results showed that the amount of 14CO2 assimilates in leaves of wheat plants sprayed with 10~40 ppm S-07 at the heading stage was higher than that of the control. The distribution of 14CO2 assimilates to the ear and root of the wheat plant increased. It was also found that S-07 treatment caused more assimilates to be transferred from the primary stem to the tillers.
Immediate Sequential Bilateral Cataract Surgery
DEFF Research Database (Denmark)
Kessel, Line; Andresen, Jens; Erngaard, Ditte
2015-01-01
The aim of the present systematic review was to examine the benefits and harms associated with immediate sequential bilateral cataract surgery (ISBCS), with specific emphasis on the rate of complications, postoperative anisometropia, and subjective visual function, in order to formulate evidence-based national Danish guidelines for cataract surgery. A systematic literature review in PubMed, Embase, and Cochrane central databases identified three randomized controlled trials that compared outcomes in patients randomized to ISBCS or bilateral cataract surgery on two different dates. Meta-analyses were performed using the Cochrane Review Manager software. The quality of the evidence was assessed using the GRADE method (Grading of Recommendation, Assessment, Development, and Evaluation). We did not find any difference in the risk of complications or visual outcome in patients randomized to ISBCS or surgery...
SMAP Data Assimilation at the GMAO
Reichle, R.; De Lannoy, G.; Liu, Q.; Ardizzone, J.
2016-01-01
The NASA Soil Moisture Active Passive (SMAP) mission has been providing L-band (1.4 GHz) passive microwave brightness temperature (Tb) observations since April 2015. These observations are sensitive to surface (0-5 cm) soil moisture. Several of the key applications targeted by SMAP, however, require knowledge of deeper-layer, root zone (0-100 cm) soil moisture, which is not directly measured by SMAP. The NASA Global Modeling and Assimilation Office (GMAO) contributes to SMAP by providing Level 4 data, including the Level 4 Surface and Root Zone Soil Moisture (L4_SM) product, which is based on the assimilation of SMAP Tb observations in the ensemble-based NASA GEOS-5 land surface data assimilation system. The L4_SM product offers global data every three hours at 9 km resolution, thereby interpolating and extrapolating the coarser-scale (40 km) SMAP observations in time and in space (both horizontally and vertically). Since October 31, 2015, beta-version L4_SM data have been available to the public from the National Snow and Ice Data Center for the period March 31, 2015, to near present, with a mean latency of approx. 2.5 days.
Adjoint electron Monte Carlo calculations
International Nuclear Information System (INIS)
Jordan, T.M.
1986-01-01
Adjoint Monte Carlo is the most efficient method for accurate analysis of space systems exposed to natural and artificially enhanced electron environments. Recent adjoint calculations for isotropic electron environments include: comparative data for experimental measurements on electronics boxes; benchmark problem solutions for comparing total dose prediction methodologies; preliminary assessment of sectoring methods used during space system design; and total dose predictions on an electronics package. Adjoint Monte Carlo, forward Monte Carlo, and experiment are in excellent agreement for electron sources that simulate space environments. For electron space environments, adjoint Monte Carlo is clearly superior to forward Monte Carlo, requiring one to two orders of magnitude less computer time for relatively simple geometries. The solid-angle sectoring approximations used for routine design calculations can err by more than a factor of 2 on dose in simple shield geometries. For critical space systems exposed to severe electron environments, these potential sectoring errors demand the establishment of large design margins and/or verification of shield design by adjoint Monte Carlo/experiment
Peters-Lidard, Christa D.
2011-01-01
Center (EMC) for their land data assimilation systems to support weather and climate modeling. LIS not only consolidates the capabilities of these two systems, but also enables a much larger variety of configurations with respect to horizontal spatial resolution, input datasets and choice of land surface model through "plugins". LIS has been coupled to the Weather Research and Forecasting (WRF) model to support studies of land-atmosphere coupling by enabling ensembles of land surface states to be tested against multiple representations of the atmospheric boundary layer. LIS has also been demonstrated for parameter estimation; these studies showed that sequential remotely sensed soil moisture products can be used to derive soil hydraulic and texture properties, given a sufficient dynamic range in the soil moisture retrievals and accurate precipitation inputs. LIS has also recently been demonstrated for multi-model data assimilation using an Ensemble Kalman Filter for sequential assimilation of soil moisture, snow, and temperature. Ongoing work has demonstrated the value of bias correction as part of the filter, and also that of joint calibration and assimilation. Examples and case studies demonstrating the capabilities and impacts of LIS for hydrometeorological modeling, assimilation and parameter estimation will be presented as advancements towards the next generation of integrated observation and modeling systems.
Pattern Recognition for a Flight Dynamics Monte Carlo Simulation
Restrepo, Carolina; Hurtado, John E.
2011-01-01
The design, analysis, and verification and validation of a spacecraft rely heavily on Monte Carlo simulations. Modern computational techniques are able to generate large amounts of Monte Carlo data but flight dynamics engineers lack the time and resources to analyze it all. The growing amounts of data combined with the diminished available time of engineers motivate the need to automate the analysis process. Pattern recognition algorithms are an innovative way of analyzing flight dynamics data efficiently. They can search large data sets for specific patterns and highlight critical variables so analysts can focus their analysis efforts. This work combines a few tractable pattern recognition algorithms with basic flight dynamics concepts to build a practical analysis tool for Monte Carlo simulations. Current results show that this tool can quickly and automatically identify individual design parameters, and most importantly, specific combinations of parameters that should be avoided in order to prevent specific system failures. The current version uses a kernel density estimation algorithm and a sequential feature selection algorithm combined with a k-nearest neighbor classifier to find and rank important design parameters. This provides an increased level of confidence in the analysis and saves a significant amount of time.
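The k-nearest-neighbor classification step in such a pipeline can be sketched on toy data (our illustration only; the kernel density estimation and feature selection stages, and all names here, are not from the paper):

```python
import math
from collections import Counter

def knn_classify(train, labels, query, k=3):
    """Label a query point by majority vote among its k nearest training points."""
    nearest = sorted(range(len(train)), key=lambda i: math.dist(train[i], query))[:k]
    return Counter(labels[i] for i in nearest).most_common(1)[0][0]

# Toy Monte Carlo outcomes: runs near the origin passed, distant runs failed
train = [(0.1, 0.2), (0.2, 0.1), (0.9, 0.8), (1.0, 1.1)]
labels = ["pass", "pass", "fail", "fail"]
```

Classifying each dispersed-parameter vector by the outcomes of its nearest neighbors is what lets the tool flag regions of parameter space associated with failures.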
Trial Sequential Methods for Meta-Analysis
Kulinskaya, Elena; Wood, John
2014-01-01
Statistical methods for sequential meta-analysis have applications also for the design of new trials. Existing methods are based on group sequential methods developed for single trials and start with the calculation of a required information size. This works satisfactorily within the framework of fixed effects meta-analysis, but conceptual…
Sequential association rules in atonal music
Honingh, A.; Weyde, T.; Conklin, D.; Chew, E.; Childs, A.; Chuan, C.-H.
2009-01-01
This paper describes a preliminary study on the structure of atonal music. In the same way as sequential association rules of chords can be found in tonal music, sequential association rules of pitch class set categories can be found in atonal music. It has been noted before that certain pitch class
Multi-agent sequential hypothesis testing
Kim, Kwang-Ki K.
2014-12-15
This paper considers multi-agent sequential hypothesis testing and presents a framework for strategic learning in sequential games with explicit consideration of both temporal and spatial coordination. The associated Bayes risk functions explicitly incorporate costs of taking private/public measurements, costs of time-difference and disagreement in actions of agents, and costs of false declaration/choices in the sequential hypothesis testing. The corresponding sequential decision processes have well-defined value functions with respect to (a) the belief states for the case of conditional independent private noisy measurements that are also assumed to be independent identically distributed over time, and (b) the information states for the case of correlated private noisy measurements. A sequential investment game of strategic coordination and delay is also discussed as an application of the proposed strategic learning rules.
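The classic single-agent building block behind such schemes is Wald's sequential probability ratio test; a minimal sketch (our illustration, which does not model the paper's multi-agent coordination or measurement costs):

```python
import math

def sprt(observations, log_lr, alpha=0.05, beta=0.05):
    """Wald's sequential probability ratio test: accumulate log-likelihood
    ratios and stop as soon as either error-rate threshold is crossed."""
    upper = math.log((1 - beta) / alpha)   # crossing -> declare H1
    lower = math.log(beta / (1 - alpha))   # crossing -> declare H0
    s = 0.0
    for n, x in enumerate(observations, start=1):
        s += log_lr(x)
        if s >= upper:
            return "H1", n
        if s <= lower:
            return "H0", n
    return "undecided", len(observations)

# Coin example: H1 says heads probability 0.7, H0 says 0.3
log_lr = lambda x: math.log(0.7 / 0.3) if x else math.log(0.3 / 0.7)
```

The returned sample count n is the data-dependent stopping time; the multi-agent versions above additionally charge for measurements, delay, and disagreement between agents.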
Blind Decoding of Multiple Description Codes over OFDM Systems via Sequential Monte Carlo
Directory of Open Access Journals (Sweden)
Guo Dong
2005-01-01
Full Text Available We consider the problem of transmitting a continuous source through an OFDM system. Multiple description scalar quantization (MDSQ) is applied to the source signal, resulting in two correlated source descriptions. The two descriptions are then OFDM modulated and transmitted through two parallel frequency-selective fading channels. At the receiver, a blind turbo receiver is developed for joint OFDM demodulation and MDSQ decoding. The extrinsic information of the two descriptions is exchanged between the component decoders to improve system performance. A blind soft-input soft-output OFDM detector is developed, based on the techniques of importance sampling and resampling. Such a detector is capable of exchanging the so-called extrinsic information with the other component in the above turbo receiver, successively improving the overall receiver performance. Finally, we also treat channel-coded systems, and a novel blind turbo receiver is developed for joint demodulation, channel decoding, and MDSQ source decoding.
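The importance sampling and resampling loop underlying such a detector can be sketched generically (a toy scalar state-space model of our own, not the paper's OFDM detector):

```python
import math
import random

def sir_step(particles, transition, likelihood, observation):
    """One sequential importance sampling / resampling step: propagate each
    particle, weight it by the observation likelihood, then resample."""
    proposed = [transition(p) for p in particles]
    weights = [likelihood(observation, p) for p in proposed]
    return random.choices(proposed, weights=weights, k=len(proposed))

# Toy scalar model: latent random walk observed in Gaussian noise
transition = lambda x: x + random.gauss(0.0, 0.1)
likelihood = lambda y, x: math.exp(-0.5 * ((y - x) / 0.5) ** 2)
particles = [random.uniform(0.0, 10.0) for _ in range(500)]
for _ in range(20):
    particles = sir_step(particles, transition, likelihood, observation=5.0)
```

After a few steps the particle cloud concentrates around the latent state; in the blind detector, the particles instead represent hypotheses about channel state and transmitted symbols.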
Parallel Sequential Monte Carlo for Efficient Density Combination: The Deco Matlab Toolbox
DEFF Research Database (Denmark)
Casarin, Roberto; Grassi, Stefano; Ravazzolo, Francesco
This paper presents the Matlab package DeCo (Density Combination), which is based on the paper by Billio et al. (2013), where a constructive Bayesian approach is presented for combining predictive densities originating from different models or other sources of information. The combination weights ... for standard CPU computing and for Graphical Process Unit (GPU) parallel computing. For the GPU implementation we use the Matlab parallel computing toolbox and show how to use General Purpose GPU computing almost effortlessly. This GPU implementation comes with a speed-up of the execution time of up to seventy times compared to a standard CPU Matlab implementation on a multicore CPU. We show the use of the package and the computational gain of the GPU version through some simulation experiments and empirical applications.
Efficient Sequential Monte Carlo Sampling for Continuous Monitoring of a Radiation Situation
Czech Academy of Sciences Publication Activity Database
Šmídl, Václav; Hofman, Radek
2014-01-01
Roč. 56, č. 4 (2014), s. 514-527 ISSN 0040-1706 R&D Projects: GA MV VG20102013018 Institutional support: RVO:67985556 Keywords: radiation protection * atmospheric dispersion model * importance sampling Subject RIV: BD - Theory of Information Impact factor: 1.814, year: 2014 http://library.utia.cas.cz/separaty/2014/AS/smidl-0433631.pdf
Simulation and sequential dynamical systems
Energy Technology Data Exchange (ETDEWEB)
Mortveit, H.S.; Reidys, C.M.
1999-06-01
Computer simulations have a generic structure. Motivated by this, the authors present a new class of discrete dynamical systems that captures this structure in a mathematically precise way. This class of systems consists of (1) a loop-free graph Υ with vertex set {1,2,...,n} where each vertex has a binary state, (2) a vertex-labeled set of functions (F_{i,Υ}: F_2^n → F_2^n)_i and (3) a permutation π ∈ S_n. The function F_{i,Υ} updates the state of vertex i as a function of the states of vertex i and its Υ-neighbors and leaves the states of all other vertices fixed. The permutation π represents the update ordering, i.e., the order in which the functions F_{i,Υ} are applied. By composing the functions F_{i,Υ} in the order given by π one obtains the dynamical system (equation given in paper), which the authors refer to as a sequential dynamical system, or SDS for short. The authors present bounds for the number of functionally different systems and for the number of nonisomorphic digraphs Γ[F_Υ, π] that can be obtained by varying the update order, and applications of these to specific graphs and graph classes.
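One sequential update pass of such a system can be sketched directly (our toy illustration with binary states and a NOR local function; the paper treats general local functions and graphs):

```python
def sds_step(states, neighbors, local_fn, order):
    """One pass of a sequential dynamical system: visit vertices in the
    given order, updating each from its own state and its neighbors'."""
    states = list(states)  # copy; updates are applied in place, sequentially
    for i in order:
        values = [states[i]] + [states[j] for j in neighbors[i]]
        states[i] = local_fn(i, values)
    return states

# Line graph 0 - 1 - 2 with a NOR local function at every vertex
neighbors = [[1], [0, 2], [1]]
nor = lambda i, values: 0 if any(values) else 1
```

Running the same local functions under two different update orders can yield different results, which is exactly the order dependence whose extent the paper bounds.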
Sequential provisional implant prosthodontics therapy.
Zinner, Ira D; Markovits, Stanley; Jansen, Curtis E; Reid, Patrick E; Schnader, Yale E; Shapiro, Herbert J
2012-01-01
The fabrication and long-term use of first- and second-stage provisional implant prostheses are critical to create a favorable prognosis for function and esthetics of a fixed implant-supported prosthesis. The fixed metal and acrylic resin cemented first-stage prosthesis, as reviewed in Part I, is needed to prevent adjacent and opposing tooth movement and pressure on the implant site, as well as to protect against micromovement of the freshly placed implant body. The second-stage prosthesis, reviewed in Part II, should be used following implant uncovering and abutment installation. The patient wears this provisional prosthesis until maturation of the bone and healing of the soft tissues. The second-stage provisional prosthesis also serves as a fail-safe mechanism for possible early implant failures and can be used with late failures and/or when the definitive prosthesis must be repaired. In addition, the screw-retained provisional prosthesis is used if and when an implant requires removal or other implants are to be placed, as in a sequential approach. The creation and use of both first- and second-stage provisional prostheses require the restorative dentist, dental technician, surgeon, and patient to work as a team. If the dentist cannot perform the diagnosis and treatment planning, surgery, and laboratory techniques alone, he or she should employ the expertise of a surgeon and a laboratory technician. This team approach is essential for optimum results.
Accelerating assimilation development for new observing systems using EFSO
Lien, Guo-Yuan; Hotta, Daisuke; Kalnay, Eugenia; Miyoshi, Takemasa; Chen, Tse-Chun
2018-03-01
To successfully assimilate data from a new observing system, it is necessary to develop appropriate data selection strategies, assimilating only the generally useful data. This development work is usually done by trial and error using observing system experiments (OSEs), which are very time and resource consuming. This study proposes a new, efficient methodology to accelerate the development using ensemble forecast sensitivity to observations (EFSO). First, non-cycled assimilation of the new observation data is conducted to compute EFSO diagnostics for each observation within a large sample. Second, the average EFSO conditionally sampled in terms of various factors is computed. Third, potential data selection criteria are designed based on the non-cycled EFSO statistics, and tested in cycled OSEs to verify the actual assimilation impact. The usefulness of this method is demonstrated with the assimilation of satellite precipitation data. It is shown that the EFSO-based method can efficiently suggest data selection criteria that significantly improve the assimilation results.
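The second and third steps of the proposed procedure amount to conditional averaging of per-observation impacts and thresholding. The sketch below illustrates that logic with synthetic data; the categories and values are invented, and by the usual EFSO sign convention a negative impact means the observation reduced forecast error:

```python
# Hedged sketch of EFSO-based data selection (steps 2-3 above): group
# per-observation EFSO impacts by some factor, compute conditional
# means, and keep only the categories whose average impact is
# beneficial (negative = reduces forecast error). Data are synthetic.

from collections import defaultdict

# (factor category, EFSO impact) pairs from a non-cycled assimilation.
obs = [("ocean", -0.8), ("ocean", -0.5), ("land", 0.4),
       ("land", 0.1), ("ice", -0.2), ("ice", 0.3)]

def conditional_mean_efso(observations):
    groups = defaultdict(list)
    for category, impact in observations:
        groups[category].append(impact)
    return {c: sum(v) / len(v) for c, v in groups.items()}

def selection_criteria(observations):
    """Keep only categories whose average EFSO impact is beneficial."""
    means = conditional_mean_efso(observations)
    return sorted(c for c, m in means.items() if m < 0.0)

print(selection_criteria(obs))  # candidate criteria to verify in cycled OSEs
```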
Benefits and Pitfalls of GRACE Terrestrial Water Storage Data Assimilation
Girotto, Manuela
2018-01-01
Satellite observations of terrestrial water storage (TWS) from the Gravity Recovery and Climate Experiment (GRACE) mission have a coarse resolution in time (monthly) and space (roughly 150,000 sq km at midlatitudes) and vertically integrate all water storage components over land, including soil moisture and groundwater. Nonetheless, data assimilation can be used to horizontally downscale and vertically partition GRACE-TWS observations. This presentation illustrates some of the benefits and drawbacks of assimilating TWS observations from GRACE into a land surface model over the continental United States and India. The assimilation scheme yields improved skill metrics for groundwater compared to the no-assimilation simulations. A smaller impact is seen for surface and root-zone soil moisture. Further, GRACE observes TWS depletion associated with anthropogenic groundwater extraction. Results from the assimilation emphasize the importance of representing anthropogenic processes in land surface modeling and data assimilation systems.
Spatial dependence of color assimilation by the watercolor effect.
Devinck, Frédéric; Delahunt, Peter B; Hardy, Joseph L; Spillmann, Lothar; Werner, John S
2006-01-01
Color assimilation with bichromatic contours was quantified for spatial extents ranging from von Bezold-type color assimilation to the watercolor effect. The magnitude and direction of assimilative hue change was measured as a function of the width of a rectangular stimulus. Assimilation was quantified by hue cancellation. Large hue shifts were required to null the color of stimuli ≤ 9.3 min of arc in width, with an exponential decrease for stimuli increasing up to 7.4 deg. When stimuli were viewed through an achromatizing lens, the magnitude of the assimilation effect was reduced for narrow stimuli, but not for wide ones. These results demonstrate that chromatic aberration may account, in part, for color assimilation over small, but not large, surface areas.
Fox, A. M.; Hoar, T. J.; Smith, W. K.; Moore, D. J.
2017-12-01
assumptions and inputs in the algorithms that are incompatible with those encoded within CLM. It is probable that VOD describes changes in biomass more accurately than absolute values, so in addition to sequential assimilation of observations, we have tested alternative filter algorithms and the assimilation of VOD anomalies.
Testbed model and data assimilation for ARM
International Nuclear Information System (INIS)
Louis, J.F.
1992-01-01
The objectives of this contract are to further develop and test the ALFA (AER Local Forecast and Assimilation) model, originally designed at AER for local weather prediction, and apply it to three distinct but related purposes in connection with the Atmospheric Radiation Measurement (ARM) program: (a) to provide a testbed that simulates a global climate model in order to facilitate the development and testing of new cloud parametrizations and radiation models; (b) to assimilate the ARM data continuously at the scale of a climate model, using the adjoint method, thus providing the initial conditions and verification data for testing parametrizations; (c) to study the sensitivity of a radiation scheme to cloud parameters, again using the adjoint method, thus demonstrating the usefulness of the testbed model. The data assimilation will use a variational technique that minimizes the difference between the model results and the observations during the analysis period. The adjoint model is used to compute the gradient of a measure of the model errors with respect to nudging terms that are added to the equations to force the model output closer to the data. The radiation scheme that will be included in the basic ALFA model makes use of a general two-stream approximation, and is designed for vertically inhomogeneous, multiple-scattering atmospheres. The sensitivity of this model to the definition of cloud parameters will be studied. The adjoint technique will also be used to compute the sensitivities. This project is designed to provide the Science Team members with the appropriate tools and modeling environment for proper testing and tuning of new radiation models and cloud parametrization schemes.
Nitrogen uptake and assimilation by corn roots
International Nuclear Information System (INIS)
Yoneyama, Tadakatsu; Akiyama, Yoko; Kumazawa, Kikuo
1977-01-01
The site of nitrogen uptake in the apical root zone of corn was experimentally investigated. Two experiments were performed. The first examined the assimilation of nitrate and ammonium and the effects of low temperature on it. The 4-day-old roots were treated with ¹⁵N-labelled inorganic nitrogen of 20 ppm N in 5 × 10⁻⁴ M CaSO₄ solution at 30 °C and 0 °C. The second examined the nitrogen uptake at the apical root zone and the utilization of newly absorbed nitrogen at the root top. The 4-day-old roots were transferred into 5 × 10⁻⁴ M CaSO₄ solution containing ¹⁵N-labelled ammonium nitrate of 40 ppm N. As a result, the effect of low temperature on nitrogen uptake appeared to be more drastic for nitrate than for ammonium. The ¹⁵N content of amino acids indicates that ammonium is assimilated into amino acids even at 0 °C, but nitrate is not. The ammonium nitrogen seemed to be absorbed at both the cell-dividing and elongating zones. On the other hand, nitrate nitrogen seemed to be strongly absorbed at the cell-elongating zone. The nitrogen in the apical part may be supplied not only by direct absorption but also by translocation from the basal part. A clear difference was found in the utilization of nitrate and ammonium nitrogen at the root top when the root was elongating. This may be due to the difference in the assimilation products of inorganic nitrogen. Newly absorbed ammonium nitrogen is more utilizable for the growth of the root top than nitrate nitrogen. (Iwakiri, K.)
Ercolani, Giulia; Castelli, Fabio
2016-04-01
restrictive hypothesis than Kalman and Monte Carlo filters and smoothers, although it requires the non-trivial derivation of an adjoint model. The developed assimilation system is tested through hindcast experiments on selected events in the period 2010-2014 that actually resulted in false alarms in the Arno river basin (about 8230 km2). The hydrologic model is run with the spatial and temporal resolutions that are employed operationally, i.e. 500 m and 15 minutes. The improvement in discharge forecasts is evaluated through classical performance indexes such as the error on peak flow and the Nash-Sutcliffe efficiency. In addition, the performance of LST assimilation is compared with that obtained with the assimilation of discharge data at multiple point locations for the same events.
Monte Carlo simulation of experiments
International Nuclear Information System (INIS)
Opat, G.I.
1977-07-01
An outline of the technique of computer simulation of particle physics experiments by the Monte Carlo method is presented. Useful special-purpose subprograms are listed and described. At each stage the discussion is made concrete by direct reference to the program SIMUL8 and its variant MONTE-PION, written to assist in the analysis of the radiative decay experiments μ⁺ → e⁺νₑν̄γ and π⁺ → e⁺νₑγ, respectively. These experiments were based on the use of two large sodium iodide crystals, TINA and MINA, as e and γ detectors. Instructions for the use of SIMUL8 and MONTE-PION are given. (author)
International Nuclear Information System (INIS)
Haussaire, Jean-Matthieu
2017-01-01
assimilation of real tropospheric ozone concentrations mitigates these results and shows how hard atmospheric chemistry data assimilation is. A strong model error is indeed attached to these models, stemming from multiple uncertainty sources. Two steps must be taken to tackle this issue. First of all, the data assimilation method used must be able to efficiently take the model error into account. However, most methods are developed under the assumption of a perfect model. To avoid this hypothesis, a new method has been developed. Called IEnKF-Q, it extends the IEnKS to the model-error framework. It has been validated on a low-order model, proving its superiority over data assimilation methods naively adapted to take model error into account. Nevertheless, such methods need to know the exact nature and amplitude of the model error which needs to be accounted for. Therefore, the second step is to use statistical tools to quantify this model error. The expectation-maximization algorithm, the naive and unbiased randomize-then-optimize algorithms, an importance sampling based on a Laplace proposal, and a Markov chain Monte Carlo simulation, potentially trans-dimensional, have been assessed, expanded, and compared to estimate the uncertainty in the retrieval of the source term of the Chernobyl and Fukushima-Daiichi nuclear power plant accidents. This thesis therefore advances the domain of 4D EnVar data assimilation through its methodological contributions and by paving the way to applying these methods on atmospheric chemistry models. (author)
Energy Technology Data Exchange (ETDEWEB)
Chorin, Alexandre J. [Univ. of California, Berkeley, CA (United States); Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Morzfeld, Matthias [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Tu, Xuemin [Univ. of Kansas, Lawrence, KS (United States)
2013-07-05
Implicit particle filters for data assimilation update the particles by first choosing probabilities and then looking for particle locations that assume them, guiding the particles one by one to the high probability domain. We provide a detailed description of these filters, with illustrative examples, together with new, more general, methods for solving the algebraic equations and with a new algorithm for parameter identification.
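For contrast with the implicit filters described above, the sketch below shows the standard *bootstrap* particle filter (sequential importance resampling) on a scalar random walk observed in Gaussian noise; implicit filters differ by steering each particle toward the high-probability region before weighting, whereas this baseline just propagates and reweights. The model and all parameter values are ours, not the paper's:

```python
# Minimal bootstrap particle filter on a scalar random walk observed
# in Gaussian noise. Propagate particles with the prior dynamics,
# weight by the observation likelihood, estimate by the weighted mean,
# then resample back to equal weights.

import math, random

random.seed(11)
q, r, steps, n_part = 0.1, 0.1, 50, 2000   # process var, obs var

# Simulate a truth trajectory and noisy observations of it.
truth, obs = [0.0], []
for _ in range(steps):
    truth.append(truth[-1] + random.gauss(0, q ** 0.5))
    obs.append(truth[-1] + random.gauss(0, r ** 0.5))

parts = [0.0] * n_part
est = []
for y in obs:
    # Propagate with the prior dynamics (the bootstrap proposal).
    parts = [x + random.gauss(0, q ** 0.5) for x in parts]
    w = [math.exp(-0.5 * (y - x) ** 2 / r) for x in parts]
    s = sum(w)
    est.append(sum(wi * xi for wi, xi in zip(w, parts)) / s)
    # Multinomial resampling back to equal weights.
    parts = random.choices(parts, weights=w, k=n_part)

rmse = (sum((e - t) ** 2 for e, t in zip(est, truth[1:])) / steps) ** 0.5
print(rmse)  # well below the raw observation noise of sqrt(0.1) ≈ 0.32
```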
Directory of Open Access Journals (Sweden)
M. J. Angling
2008-02-01
Full Text Available Ground based measurements of slant total electron content (TEC can be assimilated into ionospheric models to produce 3-D representations of ionospheric electron density. The Electron Density Assimilative Model (EDAM has been developed for this purpose. Previous tests using EDAM and ground based data have demonstrated that the information on the vertical structure of the ionosphere is limited in this type of data. The launch of the COSMIC satellite constellation provides the opportunity to use radio occultation data which has more vertical information. EDAM assimilations have been run for three time periods representing quiet, moderate and disturbed geomagnetic conditions. For each run, three data sets have been ingested – only ground based data, only COSMIC data and both ground based and COSMIC data. The results from this preliminary study show that both ground and space based data are capable of improving the representation of the vertical structure of the ionosphere. However, the analysis is limited by the incomplete deployment of the COSMIC constellation and the use of auto-scaled ionosonde data. The first of these can be addressed by repeating this type of study once full deployment has been achieved. The latter requires the manual scaling of ionosonde data; ideally an agreed data set would be scaled and made available to the community to facilitate comparative testing of assimilative models.
Bayesian Modelling, Monte Carlo Sampling and Capital Allocation of Insurance Risks
Directory of Open Access Journals (Sweden)
Gareth W. Peters
2017-09-01
Full Text Available The main objective of this work is to develop a detailed step-by-step guide to the development and application of a new class of efficient Monte Carlo methods to solve practically important problems faced by insurers under the new solvency regulations. In particular, a novel Monte Carlo method to calculate capital allocations for a general insurance company is developed, with a focus on coherent capital allocation that is compliant with the Swiss Solvency Test. The data used is based on the balance sheet of a representative stylized company. For each line of business in that company, allocations are calculated for the one-year risk with dependencies based on correlations given by the Swiss Solvency Test. Two different approaches for dealing with parameter uncertainty are discussed, and simulation algorithms based on (pseudo-marginal) Sequential Monte Carlo algorithms are described and their efficiency is analysed.
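A coherent allocation of the kind discussed above can be illustrated with plain Monte Carlo and the Euler principle under Expected Shortfall: each line of business receives its average loss over the scenarios where the total loss is in the tail. The two-line Gaussian loss model below is synthetic, not the paper's balance-sheet data or its SMC algorithms:

```python
# Sketch of Monte Carlo capital allocation by the Euler principle
# under Expected Shortfall (ES): allocation of line i = mean of L_i
# over the worst (1 - alpha) fraction of total-loss scenarios.
# Synthetic two-line model with dependence via a common factor.

import random

random.seed(1)
N, alpha = 100_000, 0.99

scenarios = []
for _ in range(N):
    z = random.gauss(0, 1)                       # common risk factor
    l1 = 10 + 2 * (0.8 * z + 0.6 * random.gauss(0, 1))
    l2 = 5 + 1 * (0.8 * z + 0.6 * random.gauss(0, 1))
    scenarios.append((l1, l2))

order = sorted(range(N), key=lambda i: scenarios[i][0] + scenarios[i][1])
tail = order[int(alpha * N):]                    # worst 1% of scenarios

es_total = sum(scenarios[i][0] + scenarios[i][1] for i in tail) / len(tail)
alloc1 = sum(scenarios[i][0] for i in tail) / len(tail)
alloc2 = sum(scenarios[i][1] for i in tail) / len(tail)

# Euler allocations are "full": they sum exactly to the total ES.
print(es_total, alloc1, alloc2)
```

The full-allocation property (the allocations summing to the total capital) is what makes this scheme coherent in the sense the abstract refers to.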
Boundary Conditions, Data Assimilation, and Predictability in Coastal Ocean Models
National Research Council Canada - National Science Library
Samelson, Roger M; Allen, John S; Egbert, Gary D; Kindle, John C; Snyder, Chris
2007-01-01
...: The specific objectives of this research are to determine the impact on coastal ocean circulation models of open ocean boundary conditions from Global Ocean Data Assimilation Experiment (GODAE...
A Robust Non-Gaussian Data Assimilation Method for Highly Non-Linear Models
Directory of Open Access Journals (Sweden)
Elias D. Nino-Ruiz
2018-03-01
Full Text Available In this paper, we propose an efficient EnKF implementation for non-Gaussian data assimilation based on Gaussian mixture models and Markov chain Monte Carlo (MCMC) methods. The proposed method works as follows: based on an ensemble of model realizations, prior errors are estimated via a Gaussian mixture density whose parameters are approximated by means of an expectation-maximization method. Then, by using an iterative method, observation operators are linearized about current solutions and posterior modes are estimated via an MCMC implementation. The acceptance/rejection criterion is similar to that of the Metropolis-Hastings rule. Experimental tests are performed on the Lorenz 96 model. The results show that the proposed method can decrease prior errors by several orders of magnitude in a root-mean-square-error sense for nearly sparse or dense observational networks.
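The Metropolis-Hastings acceptance rule the abstract invokes can be shown in isolation. The sketch below samples a one-dimensional Gaussian target with a symmetric random-walk proposal; it is the generic sampler, not the paper's EnKF/Gaussian-mixture implementation:

```python
# The Metropolis-Hastings rule on a 1-D standard-normal target with a
# symmetric Gaussian proposal: accept a move with probability
# min(1, pi(proposal) / pi(current)), computed here in log space.

import math, random

random.seed(0)

def log_target(x):
    return -0.5 * x * x          # standard normal, up to a constant

def metropolis(n_steps, step=1.0, x0=0.0):
    x, chain = x0, []
    for _ in range(n_steps):
        prop = x + random.gauss(0, step)         # symmetric proposal
        if math.log(random.random()) < log_target(prop) - log_target(x):
            x = prop                             # accept
        chain.append(x)                          # else keep current x
    return chain

chain = metropolis(50_000)
mean = sum(chain) / len(chain)
var = sum((v - mean) ** 2 for v in chain) / len(chain)
print(mean, var)   # should be near the target's mean 0 and variance 1
```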
Markov Chain Monte Carlo Methods-Simple Monte Carlo
Indian Academy of Sciences (India)
Resonance – Journal of Science Education, Volume 8, Issue 4.
Continuous data assimilation with stochastically noisy data
Bessaih, Hakima; Olson, Eric; Titi, Edriss S.
2015-03-01
We analyse the performance of a data-assimilation algorithm based on a linear feedback control when used with observational data that contain measurement errors. Our model problem consists of dynamics governed by the two-dimensional incompressible Navier-Stokes equations, observational measurements given by finite volume elements or nodal points of the velocity field, and measurement errors which are represented by stochastic noise. Under these assumptions, the data-assimilation algorithm consists of a system of stochastically forced Navier-Stokes equations. The main result of this paper provides explicit conditions on the observation density (resolution) which guarantee explicit asymptotic bounds, as time tends to infinity, on the error between the approximate solution and the actual solution corresponding to these measurements, in terms of the variance of the noise in the measurements. Specifically, such bounds are given for the limit supremum, as time tends to infinity, of the expected value of the L2-norm and of the H1 Sobolev norm of the difference between the approximating solution and the actual solution. Moreover, results on the time-averaged error in mean are stated.
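The linear-feedback (nudging) mechanism analysed above can be demonstrated on a scalar toy problem instead of the 2-D Navier-Stokes equations. In the sketch below the truth obeys dx/dt = x (an unstable mode), and the estimator adds a feedback term -mu*(xe - y) toward noisy observations; every parameter value is an illustrative assumption:

```python
# Nudging data assimilation on an unstable scalar ODE: without
# feedback the estimator's error would grow like the solution itself;
# with feedback gain mu > 1 the error settles at the observation-noise
# level, mirroring the asymptotic bounds described in the abstract.

import random

random.seed(42)
dt, mu, sigma, steps = 0.01, 5.0, 0.1, 2000

x, xe = 1.0, 0.0          # truth and estimator start apart
errs = []
for _ in range(steps):
    y = x + random.gauss(0, sigma)           # noisy observation of truth
    x += dt * x                              # truth dynamics (unstable)
    xe += dt * (xe - mu * (xe - y))          # nudged estimator
    errs.append(abs(xe - x))

# After the transient, the error sits at the noise level even though
# the truth itself has grown by many orders of magnitude.
asymptotic_err = sum(errs[-500:]) / 500
print(asymptotic_err, abs(x))
```

The error recursion here is e_{n+1} = (1 + dt(1 - mu)) e_n + dt mu η_n, which is contracting for mu > 1, the scalar analogue of the resolution condition in the paper.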
Dynamic Radiation Environment Assimilation Model: DREAM
Reeves, G. D.; Chen, Y.; Cunningham, G. S.; Friedel, R. W. H.; Henderson, M. G.; Jordanova, V. K.; Koller, J.; Morley, S. K.; Thomsen, M. F.; Zaharia, S.
2012-03-01
The Dynamic Radiation Environment Assimilation Model (DREAM) was developed to provide accurate, global specification of the Earth's radiation belts and to better understand the physical processes that control radiation belt structure and dynamics. DREAM is designed using a modular software approach in order to provide a computational framework that makes it easy to change components such as the global magnetic field model, radiation belt dynamics model, boundary conditions, etc. This paper provides a broad overview of the DREAM model and a summary of some of the principal results to date. We describe the structure of the DREAM model, describe the five major components, and illustrate the various options that are available for each component. We discuss how the data assimilation is performed and the data preprocessing and postprocessing that are required for producing the final DREAM outputs. We describe how we apply global magnetic field models for conversion between flux and phase space density and, in particular, the benefits of using a self-consistent, coupled ring current-magnetic field model. We discuss some of the results from DREAM including testing of boundary condition assumptions and effects of adding a source term to radial diffusion models. We also describe some of the testing and validation of DREAM and prospects for future development.
Deciphering Intrinsic Inter-subunit Couplings that Lead to Sequential Hydrolysis of F1-ATPase Ring.
Dai, Liqiang; Flechsig, Holger; Yu, Jin
2017-10-03
Rotary sequential hydrolysis of the metabolic machine F₁-ATPase is a prominent manifestation of high coordination among multiple chemical sites in ring-shaped molecular machines, and it is also functionally essential for F₁ to tightly couple chemical reactions and central γ-shaft rotation. High-speed AFM experiments have identified that sequential hydrolysis is maintained in the F₁ stator ring even in the absence of the γ-rotor. To explore the origins of intrinsic sequential performance, we computationally investigated essential inter-subunit couplings on the hexameric ring of mitochondrial and bacterial F₁. We first reproduced in stochastic Monte Carlo simulations the experimentally determined sequential hydrolysis schemes by kinetically imposing inter-subunit couplings and following subsequent tri-site ATP hydrolysis cycles on the F₁ ring. We found that the key couplings to support the sequential hydrolysis are those that accelerate neighbor-site ADP and Pi release upon a certain ATP binding or hydrolysis reaction. The kinetically identified couplings were then examined in atomistic molecular dynamics simulations at a coarse-grained level to reveal the underlying structural mechanisms. To do that, we enforced targeted conformational changes of ATP binding or hydrolysis at one chemical site on the F₁ ring and monitored the ensuing conformational responses of the neighboring sites using structure-based simulations. Notably, we found asymmetrical neighbor-site opening that facilitates ADP release upon enforced ATP binding. We also captured a complete charge-hopping process of the Pi release subsequent to enforced ATP hydrolysis in the neighbor site, confirming recent single-molecule analyses with regard to the role of ATP hydrolysis in F₁. Our studies therefore elucidate both the coordinated chemical kinetics and structural dynamics mechanisms underpinning the sequential operation of the F₁ ring. Copyright © 2017 Biophysical Society.
Benchmarking a Soil Moisture Data Assimilation System for Agricultural Drought Monitoring
Hun, Eunjin; Crow, Wade T.; Holmes, Thomas; Bolten, John
2014-01-01
Despite considerable interest in the application of land surface data assimilation systems (LDAS) for agricultural drought applications, relatively little is known about the large-scale performance of such systems and, thus, the optimal methodological approach for implementing them. To address this need, this paper evaluates an LDAS for agricultural drought monitoring by benchmarking individual components of the system (i.e., a satellite soil moisture retrieval algorithm, a soil water balance model and a sequential data assimilation filter) against a series of linear models which perform the same function (i.e., have the same basic input/output structure) as the full system component. Benchmarking is based on the calculation of the lagged rank cross-correlation between the normalized difference vegetation index (NDVI) and soil moisture estimates acquired for various components of the system. Lagged soil moisture/NDVI correlations obtained using individual LDAS components versus their linear analogs reveal the degree to which non-linearities and/or complexities contained within each component actually contribute to the performance of the LDAS system as a whole. Here, a particular system based on surface soil moisture retrievals from the Land Parameter Retrieval Model (LPRM), a two-layer Palmer soil water balance model and an ensemble Kalman filter (EnKF) is benchmarked. Results suggest significant room for improvement in each component of the system.
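The benchmarking metric named above, a lagged rank (Spearman) cross-correlation, is simple to compute from scratch. The two short series below are synthetic stand-ins for the soil moisture estimates and NDVI, constructed so NDVI responds to soil moisture with a one-step delay:

```python
# Lagged rank (Spearman) cross-correlation: correlate the ranks of
# soil moisture at time t with the ranks of NDVI at time t + lag.

def ranks(xs):
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    for rank, i in enumerate(order):
        r[i] = float(rank)
    return r

def pearson(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a) ** 0.5
    vb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (va * vb)

def lagged_rank_corr(soil, ndvi, lag):
    """Spearman correlation of soil[t] against ndvi[t + lag]."""
    a, b = soil[:len(soil) - lag] if lag else soil, ndvi[lag:]
    return pearson(ranks(a), ranks(b))

# Synthetic series: NDVI follows soil moisture with a one-step delay.
soil = [0.30, 0.10, 0.40, 0.20, 0.50, 0.15, 0.35, 0.25]
ndvi = [0.60, 0.62, 0.41, 0.71, 0.52, 0.80, 0.45, 0.66]
print(lagged_rank_corr(soil, ndvi, lag=1))  # → 1.0 (perfect at lag 1)
print(lagged_rank_corr(soil, ndvi, lag=0))  # negative at lag 0
```

Scanning the lag at which this correlation peaks is one way a benchmark like the paper's can compare an LDAS component against its linear analog.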
DEFF Research Database (Denmark)
Hansen, Thomas Mejer; Mosegaard, Klaus; Cordua, Knud Skou
2010-01-01
Markov chain Monte Carlo methods such as the Gibbs sampler and the Metropolis algorithm can be used to sample the solutions to non-linear inverse problems. In principle these methods allow incorporation of arbitrarily complex a priori information, but current methods allow only relatively simple...... priors to be used. We demonstrate how sequential simulation can be seen as an application of the Gibbs sampler, and how such a Gibbs sampler assisted by sequential simulation can be used to perform a random walk generating realizations of a relatively complex random function. We propose to combine...... this algorithm with the Metropolis algorithm to obtain an efficient method for sampling posterior probability densities for nonlinear inverse problems....
Rakesh, V.; Kantharao, B.
2017-03-01
Data assimilation is considered one of the effective tools for improving the forecast skill of mesoscale models. However, for optimum utilization and effective assimilation of observations, many factors need to be taken into account while designing a data assimilation methodology. One of the critical components that determines the amount and propagation of observation information into the analysis is the model background error statistics (BES). The objective of this study is to quantify how BES in data assimilation impacts the simulation of heavy rainfall events over Karnataka, a southern state in India. Simulations of 40 heavy rainfall events were carried out using the Weather Research and Forecasting Model with and without data assimilation. The assimilation experiments were conducted using global and regional BES, while the experiment with no assimilation was used as the baseline for assessing the impact of data assimilation. The simulated rainfall is verified against high-resolution rain-gage observations over Karnataka. Statistical evaluation using several accuracy and skill measures shows that data assimilation has improved the heavy rainfall simulation. Our results showed that the experiment using regional BES outperformed the one which used global BES. Critical thermodynamic variables conducive to heavy rainfall, such as convective available potential energy, are simulated more realistically using regional BES than global BES. It is pointed out that these results have important practical implications for the design of forecast platforms and decision-making during extreme weather events.
4DVar Data Assimilation for Dust Emission Parameter Estimation over East Asia Area
Jin, Jianbing; Lin, Hai Xiang; Heemink, Arnold; Segers, Arjo
2017-04-01
Severe dust storms (SDS) have long had a widespread negative impact on the atmospheric environment and climate system. To reduce the social and economic damage caused by SDS, various model-based dust forecasting and early warning systems have been developed. However, the dust concentrations simulated by existing models sometimes show a discrepancy of more than two orders of magnitude from the observations. The most important reason for such large differences is the difficulty in accurately identifying the dust emission source region and emission rate. In our study, LOTOS-EUROS/Dust is used to simulate SDS over East Asia. A geographically dependent friction velocity threshold (FVT), instead of a spatially constant one, is introduced in the dust emission equation. A trajectory-based 4DVar data assimilation scheme is designed to estimate the spatially varying FVTs. By using the trajectories (the ensemble model realization perturbations with Monte Carlo sampled FVTs), an accurate approximation of the expected FVTs can be obtained with high efficiency. Twin experiments have been implemented, in which 2D Aerosol Optical Depth (AOD) observations transformed from the expected model realization are assimilated, and both the estimated FVTs and the forecast dust concentrations are evaluated. In addition, an improved FVT sampling scheme (for the trajectories) and a model-based FVT reduction are also implemented, which can further improve the forecast accuracy without increasing the number of trajectories.
Energy Technology Data Exchange (ETDEWEB)
Man, Jun; Li, Weixuan; Zeng, Lingzao; Wu, Laosheng
2016-06-01
The ensemble Kalman filter (EnKF) has gained popularity in hydrological data assimilation problems. As a Monte Carlo based method, a relatively large ensemble size is usually required to guarantee the accuracy. As an alternative approach, the probabilistic collocation based Kalman filter (PCKF) employs polynomial chaos to approximate the original system. In this way, the sampling error can be reduced. However, PCKF suffers from the so-called "curse of dimensionality". When the system nonlinearity is strong and the number of parameters is large, PCKF can be even more computationally expensive than EnKF. Motivated by recent developments in uncertainty quantification, we propose a restart adaptive probabilistic collocation based Kalman filter (RAPCKF) for data assimilation in unsaturated flow problems. During the implementation of RAPCKF, the important parameters are identified and active PCE basis functions are adaptively selected. The "restart" technique is used to eliminate the inconsistency between model parameters and states. The performance of RAPCKF is tested with numerical cases of unsaturated flow models. It is shown that RAPCKF is more efficient than EnKF at the same computational cost. Compared with the traditional PCKF, RAPCKF is more applicable to strongly nonlinear and high-dimensional problems.
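The Monte Carlo character of the EnKF that motivates the abstract's alternatives can be seen in the textbook stochastic analysis step below, written for a scalar state with perturbed observations; the numbers are synthetic, and this is the baseline EnKF, not the RAPCKF:

```python
# Textbook stochastic EnKF analysis step for a scalar state: estimate
# the prior variance from the ensemble, form the Kalman gain, and
# update each member against a perturbed observation. The sampling
# error in the ensemble statistics is exactly what PCKF-type methods
# try to reduce.

import random

random.seed(7)
n_ens, r_obs, y_obs = 500, 0.25, 1.0     # ensemble size, obs var, obs

prior = [random.gauss(0.0, 1.0) for _ in range(n_ens)]   # N(0, 1) prior

pm = sum(prior) / n_ens
pv = sum((x - pm) ** 2 for x in prior) / (n_ens - 1)     # ensemble var
gain = pv / (pv + r_obs)                                 # Kalman gain

post = [x + gain * (y_obs + random.gauss(0, r_obs ** 0.5) - x)
        for x in prior]

post_mean = sum(post) / n_ens
# Exact Bayesian posterior for N(0,1) prior and obs y=1, R=0.25:
# mean = 0.8, variance = 0.2; the ensemble values scatter around these.
print(post_mean, gain)
```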
Directory of Open Access Journals (Sweden)
Sheng Wanxing
2016-01-01
Full Text Available To address the randomness of the output power of distributed generation (DG), a reliability evaluation model based on sequential Monte Carlo simulation (SMCS) for distribution systems with DG is proposed. Operating states of the distribution system can be sampled by SMCS in chronological order, and the corresponding output power of DG can thus be generated. The proposed method has been tested on feeder F4 of IEEE-RBTS Bus 6. The results show that reliability evaluation of a distribution system considering the uncertainty of the output power of DG can be effectively implemented by SMCS.
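The chronological sampling at the heart of SMCS can be sketched for a single component with exponential time-to-failure and time-to-repair; availability is then estimated from the accumulated up-time. The rates below are illustrative, not the IEEE-RBTS feeder data:

```python
# Sequential Monte Carlo simulation (SMCS) sketch for reliability:
# sample one component's up/down history in chronological order and
# estimate availability as up-time over the simulated horizon.

import random

random.seed(3)
mttf, mttr = 1000.0, 10.0          # mean time to failure / repair (h)
horizon = 1_000_000.0              # simulated hours

t, up_time = 0.0, 0.0
while t < horizon:
    ttf = random.expovariate(1.0 / mttf)      # time until next failure
    up_time += min(ttf, horizon - t)          # clip at the horizon
    t += ttf
    if t >= horizon:
        break
    t += random.expovariate(1.0 / mttr)       # repair outage

availability = up_time / horizon
print(availability)   # analytic value: mttf / (mttf + mttr) ≈ 0.9901
```

In a full distribution-system study the same chronological loop would also draw the DG output power for each sampled state, as the abstract describes.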
da Silva, Arlindo
2010-01-01
A challenge common to many constituent data assimilation applications is the fact that one observes a much smaller fraction of the phase space than one wishes to estimate. For example, remotely sensed estimates of the column average concentrations are available, while one is faced with the problem of estimating 3D concentrations for initializing a prognostic model. This problem is exacerbated in the case of aerosols because the observable Aerosol Optical Depth (AOD) is not only a column integrated quantity, but it also sums over a large number of species (dust, sea-salt, carbonaceous and sulfate aerosols). An aerosol transport model, when driven by high-resolution, state-of-the-art analysis of meteorological fields and realistic emissions, can produce skillful forecasts even when no aerosol data is assimilated. The main task of aerosol data assimilation is to address the bias arising from inaccurate emissions, and the Lagrangian misplacement of plumes induced by errors in the driving meteorological fields. As long as one decouples the meteorological and aerosol assimilation, as we do here, the classic baroclinic growth of error is no longer the main order of business. We will describe an aerosol data assimilation scheme in which the analysis update step is conducted in observation space, using an adaptive maximum-likelihood scheme for estimating background errors in AOD space. This scheme includes explicit sequential bias estimation as in Dee and da Silva. Unlike existing aerosol data assimilation schemes, we do not obtain analysis increments of the 3D concentrations by scaling the background profiles. Instead, we explore the Lagrangian characteristics of the problem for generating local displacement ensembles. These high-resolution, state-dependent ensembles are then used to parameterize the background errors and generate 3D aerosol increments. The algorithm runs at a resolution of 1/4 degree, globally. We will present the result of
Exact Monte Carlo for molecules
Energy Technology Data Exchange (ETDEWEB)
Lester, W.A. Jr.; Reynolds, P.J.
1985-03-01
A brief summary of the fixed-node quantum Monte Carlo method is presented. Results obtained for binding energies, the classical barrier height for H + H2, and the singlet-triplet splitting in methylene are presented and discussed. 17 refs.
Markov Chain Monte Carlo Methods
Indian Academy of Sciences (India)
Markov Chain Monte Carlo Methods. 2. The Markov Chain Case. K B Athreya, Mohan Delampady and T Krishnan. K B Athreya is a Professor at. Cornell University. His research interests include mathematical analysis, probability theory and its application and statistics. He enjoys writing for Resonance. His spare time is ...
Markov Chain Monte Carlo Methods
Indian Academy of Sciences (India)
Markov Chain Monte Carlo Methods. 3. Statistical Concepts. K B Athreya, Mohan Delampady and T Krishnan. K B Athreya is a Professor at. Cornell University. His research interests include mathematical analysis, probability theory and its application and statistics. He enjoys writing for Resonance.
Monte Carlo calculations of nuclei
Energy Technology Data Exchange (ETDEWEB)
Pieper, S.C. [Argonne National Lab., IL (United States). Physics Div.
1997-10-01
Nuclear many-body calculations have the complication of strong spin- and isospin-dependent potentials. In these lectures the author discusses the variational and Green's function Monte Carlo techniques that have been developed to address this complication, and presents a few results.
Is Monte Carlo embarrassingly parallel?
International Nuclear Information System (INIS)
Hoogenboom, J. E.
2012-01-01
Monte Carlo is often stated as being embarrassingly parallel. However, running a Monte Carlo calculation, especially a reactor criticality calculation, in parallel using tens of processors shows a serious limitation in speedup, and the execution time may even increase beyond a certain number of processors. In this paper the main causes of the loss of efficiency when using many processors are analyzed using a simple Monte Carlo program for criticality. The basic mechanism for parallel execution is MPI. One of the bottlenecks turns out to be the rendezvous points in the parallel calculation used for synchronization and exchange of data between processors. This happens at least at the end of each cycle for fission source generation, in order to collect the full fission source distribution for the next cycle and to estimate the effective multiplication factor, which is not only part of the requested results but also input to the next cycle for population control. Basic improvements to overcome this limitation are suggested and tested. Other time losses in the parallel calculation are also identified. Moreover, the threading mechanism, which allows the parallel execution of tasks based on shared memory using OpenMP, is analyzed in detail. Recommendations are given to get the maximum efficiency out of a parallel Monte Carlo calculation. (authors)
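The rendezvous penalty described in the abstract can be illustrated with a toy cost model (the coefficients are invented for illustration, not measurements from the paper): tracking work shrinks as 1/p while the end-of-cycle gather grows with p, so speedup peaks and then declines.

```python
def cycle_time(p, work=1.0, sync=0.002):
    """Toy cost model for one criticality cycle on p processors: particle
    tracking divides among processors, but the end-of-cycle rendezvous
    (collecting the fission bank and the k-eff tally) costs more as more
    processors must be gathered. Coefficients are illustrative only."""
    return work / p + sync * p

def speedup(p, **kw):
    """Parallel speedup relative to a single processor."""
    return cycle_time(1, **kw) / cycle_time(p, **kw)
```

In this model the optimum processor count is near sqrt(work/sync); beyond it, adding processors makes each cycle slower.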
Space to ground sequential lobe tracking of aircraft
Shannon, P. D.; Kwon, D. W.; Polites, M.
Growing demand for satellite communications capability, coupled with shrinking government budgets, has spurred acquisition and repurposing of commercial satellite systems for government missions. One subset of these satellites provides high-bandwidth communication with aerial vehicles from geosynchronous orbit. Automated tracking of aerial vehicles by these satellites improves link margin, but is not a typical function of commercial product lines. Additional tracking hardware and flight software development are required to give these commercial products tracking capability. This leads to an inefficient design, from a cost and mass standpoint, for a large number of slow-flying aerial vehicles. Therefore, a need was identified to design a low-cost tracking system that minimizes tracking-specific spacecraft hardware and flight software development. This paper outlines a sequential lobe tracking system to auto-track aerial vehicles and analyzes the algorithm's accuracy and sensitivity in tracking aerial vehicles using their pre-existing uplink signal. The tracking scheme consists of a satellite-based RF power meter, automated ground-based control of antenna pointing, and ground-based processing of the tracking telemetry. The aerial vehicle was modeled as a high-altitude, relatively slow-moving Ka-band aircraft. To identify and evaluate a feasible design, a MATLAB model was developed to simulate an aerial vehicle, the vehicle's primary uplink signal and its variance, communication and processing latency in the design, and tracking telemetry processing. In addition, the effect on the spacecraft antenna actuators was modeled. The primary output of the model is tracking accuracy, and Monte Carlo simulations were used to determine 1, 2, and 3 sigma results. Overall, this paper demonstrates the viability of a sequential lobe scheme with ground-based processing as a low-cost alternative for space-to-ground tracking of slow-flying aerial vehicles.
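A one-axis sequential-lobing step of the kind analyzed above can be sketched as follows; the Gaussian beam shape, beamwidth, squint angle, and slope calibration are assumed values for illustration, not the paper's parameters:

```python
import math

def beam_gain(offset_deg, beamwidth_deg=0.5):
    """Gaussian main-lobe approximation, normalized to unity at boresight."""
    return math.exp(-2.77 * (offset_deg / beamwidth_deg) ** 2)

def lobe_error(target_deg, squint_deg=0.1, beamwidth_deg=0.5):
    """One-axis sequential-lobing estimate: sample uplink power with the
    beam squinted to either side of boresight, then scale the normalized
    power difference into a pointing-error estimate."""
    p_plus = beam_gain(target_deg - squint_deg, beamwidth_deg)   # beam offset toward +
    p_minus = beam_gain(target_deg + squint_deg, beamwidth_deg)  # beam offset toward -
    slope = beamwidth_deg ** 2 / (2 * 2.77 * squint_deg)         # small-angle calibration
    return slope * (p_plus - p_minus) / (p_plus + p_minus)
```

For small offsets the estimate is nearly linear in the true pointing error; uplink power variance (modeled in the paper's MATLAB simulation) would add noise to `p_plus` and `p_minus`.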
Stationary Anonymous Sequential Games with Undiscounted Rewards.
Więcek, Piotr; Altman, Eitan
Stationary anonymous sequential games with undiscounted rewards are a special class of games that combine features of population games (infinitely many players) and stochastic games. We extend the theory for these games to the cases of total expected reward and expected average reward. We show that in the anonymous sequential game, equilibria correspond to the limits of those of related finite population games as the number of players grows to infinity. We provide examples to illustrate our results.
Monte Carlo - Advances and Challenges
International Nuclear Information System (INIS)
Brown, Forrest B.; Mosteller, Russell D.; Martin, William R.
2008-01-01
Abstract only, full text follows: With ever-faster computers and mature Monte Carlo production codes, there has been tremendous growth in the application of Monte Carlo methods to the analysis of reactor physics and reactor systems. In the past, Monte Carlo methods were used primarily for calculating k-eff of a critical system. More recently, Monte Carlo methods have been increasingly used for determining reactor power distributions and many design parameters, such as β-eff, l-eff, τ, reactivity coefficients, Doppler defect, dominance ratio, etc. These advanced applications of Monte Carlo methods are now becoming common, not just feasible, but they bring new challenges to both developers and users: convergence of 3D power distributions must be assured; confidence interval bias must be eliminated; iterated fission probabilities are required, rather than single-generation probabilities; temperature effects including Doppler and feedback must be represented; isotopic depletion and fission product buildup must be modeled. This workshop focuses on recent advances in Monte Carlo methods and their application to reactor physics problems, and on the resulting challenges faced by code developers and users. The workshop is partly tutorial, partly a review of the current state of the art, and partly a discussion of future work that is needed. It should benefit both novice and expert Monte Carlo developers and users. In each of the topic areas, we provide an overview of needs, perspective on past and current methods, a review of recent work, and discussion of further research and capabilities that are required. Electronic copies of all workshop presentations and material will be available. The workshop is structured as 2 morning and 2 afternoon segments: - Criticality Calculations I - convergence diagnostics, acceleration methods, confidence intervals, and the iterated fission probability, - Criticality Calculations II - reactor kinetics parameters, dominance ratio, temperature
Altimeter data assimilation in the tropical Indian Ocean using water ...
Indian Academy of Sciences (India)
It has been found that the assimilation exhibits a significant positive impact on the simulation of SST. The subsurface effect of the assimilation could be judged by comparing the model simulated depth of the 20°C isotherm (hereafter referred to as D20), as a proxy of the thermocline depth, with the same quantity estimated ...
The effects of drought stress on assimilate availability and ...
African Journals Online (AJOL)
Changes in carbohydrate status and metabolism in the source and sink organs determine the rate of growth and yield of plants subjected to drought stress. The objective of this study was to assess the effect of post-flowering drought stress on assimilate synthesis at source level and availability of the assimilates for metabolism in ...
Nearing, Grey S.; Crow, Wade T.; Thorp, Kelly R.; Moran, Mary S.; Reichle, Rolf H.; Gupta, Hoshin V.
2012-01-01
Observing system simulation experiments were used to investigate ensemble Bayesian state updating data assimilation of observations of leaf area index (LAI) and soil moisture (theta) for the purpose of improving single-season wheat yield estimates with the Decision Support System for Agrotechnology Transfer (DSSAT) CropSim-Ceres model. Assimilation was conducted in an energy-limited environment and a water-limited environment. Modeling uncertainty was prescribed to weather inputs, soil parameters and initial conditions, and cultivar parameters and through perturbations to model state transition equations. The ensemble Kalman filter and the sequential importance resampling filter were tested for the ability to attenuate effects of these types of uncertainty on yield estimates. LAI and theta observations were synthesized according to characteristics of existing remote sensing data, and effects of observation error were tested. Results indicate that the potential for assimilation to improve end-of-season yield estimates is low. Limitations are due to a lack of root zone soil moisture information, error in LAI observations, and a lack of correlation between leaf and grain growth.
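The sequential importance resampling filter tested above can be sketched for a scalar state; the Gaussian observation likelihood and the settings in the test are illustrative, not the study's configuration:

```python
import math, random

def sir_step(particles, obs, obs_err, propagate):
    """One cycle of the sequential importance resampling (SIR) filter for a
    scalar state: propagate each particle, weight by a Gaussian observation
    likelihood, then resample with replacement (sketch only)."""
    forecast = [propagate(x) for x in particles]
    weights = [math.exp(-0.5 * ((obs - x) / obs_err) ** 2) for x in forecast]
    total = sum(weights)
    weights = [w / total for w in weights]
    return random.choices(forecast, weights=weights, k=len(forecast))
```

In a crop-model setting, `propagate` would be the state transition (e.g. daily LAI growth with perturbations) and `obs` a synthesized remote-sensing retrieval.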
Efficient sampling algorithms for Monte Carlo based treatment planning
International Nuclear Information System (INIS)
DeMarco, J.J.; Solberg, T.D.; Chetty, I.; Smathers, J.B.
1998-01-01
Efficient sampling algorithms are necessary for producing a fast Monte Carlo based treatment planning code. This study evaluates several aspects of a photon-based tracking scheme and the effect of optimal sampling algorithms on the efficiency of the code. Four areas were tested: pseudo-random number generation, generalized sampling of a discrete distribution, sampling from the exponential distribution, and delta scattering as applied to photon transport through a heterogeneous simulation geometry. Generalized sampling of a discrete distribution using the cutpoint method can produce speedup gains of one order of magnitude versus conventional sequential sampling. Photon transport modifications based upon the delta scattering method were implemented and compared with a conventional boundary and collision checking algorithm. The delta scattering algorithm is faster by a factor of six versus the conventional algorithm for a boundary size of 5 mm within a heterogeneous geometry. A comparison of portable pseudo-random number algorithms and exponential sampling techniques is also discussed.
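The cutpoint method mentioned above replaces a full sequential scan of the CDF with a table lookup plus a short residual scan; a minimal sketch:

```python
import random

def build_tables(probs, m):
    """Cutpoint-method sketch: precompute, for each of m equal slices of
    [0, 1), the first CDF index that slice can select, so each draw starts
    its scan near the answer instead of at index 0."""
    cdf, running = [], 0.0
    for p in probs:
        running += p
        cdf.append(running)
    cut, j = [0] * m, 0
    for i in range(m):
        while cdf[j] < i / m:
            j += 1
        cut[i] = j
    return cdf, cut

def draw(cdf, cut, m):
    """Sample an index from the discrete distribution."""
    u = random.random()
    j = cut[int(u * m)]                      # jump close to the target index
    while j < len(cdf) - 1 and cdf[j] < u:   # short residual scan
        j += 1
    return j
```

With m comparable to the number of outcomes, the expected residual scan length is O(1), which is the source of the order-of-magnitude speedup over sequential search.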
Explorations into Data Assimilation of Short-lived Chemicals
Cohen, R. C.
2016-12-01
Until recently, data assimilation of atmospheric constituents has focused on features that are at synoptic or larger spatial scales and used constituents that are conserved on time scales of days (e.g. aerosol) to weeks (CO). The chemical lifetime of NO2 in the boundary layer is of order 5 hours. This fact demands we take a different approach to thinking about the information in an NO2 assimilation, and results in distinctly different aspects of an assimilation being sensitive to NO2 than, for example, in a CO assimilation. Here we discuss aspects of assimilation of short-lived gases. Two data sources are likely to become widely available: geostationary satellites and dense surface networks. Imagining that these data sources exist, we describe the potential for constraints on emissions at the scale of 10 km and also examine the potential for constraints on meteorological fields including PBL winds, soil moisture and PBL height.
(U) Introduction to Monte Carlo Methods
Energy Technology Data Exchange (ETDEWEB)
Hungerford, Aimee L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2017-03-20
Monte Carlo methods are very valuable for representing solutions to particle transport problems. Here we describe a “cook book” approach to handling the terms in a transport equation using Monte Carlo methods. Focus is on the mechanics of a numerical Monte Carlo code, rather than the mathematical foundations of the method.
Nitrogen mineralization and assimilation at millimeter scales.
Myrold, David D; Pett-Ridge, Jennifer; Bottomley, Peter J
2011-01-01
The assimilation (uptake or immobilization) of inorganic nitrogen (N) and the production of ammonium (NH4+) from organic N compounds are universal functions of microorganisms, and the balance between these two processes is tightly regulated by the relative demands of microbes for N and carbon (C). In a heterogeneous environment, such as soils, bulk measurements of N mineralization or immobilization do not reflect the variation of these two processes in different microhabitats (1 μm-1 mm). Our purpose is to review the approaches that can be applied to measure N mineralization and immobilization within soil microhabitats, at scales of millimeters (using adaptations of 15N isotope pool dilution and IRMS, isotope ratio mass spectrometry) to micrometers (using SIMS, secondary ion mass spectrometry). Copyright © 2011 Elsevier Inc. All rights reserved.
Transgenic plants that exhibit enhanced nitrogen assimilation
Coruzzi, Gloria M.; Brears, Timothy
1999-01-01
The present invention relates to a method for producing plants with improved agronomic and nutritional traits. Such traits include enhanced nitrogen assimilatory and utilization capacities, faster and more vigorous growth, greater vegetative and reproductive yields, and enriched or altered nitrogen content in vegetative and reproductive parts. More particularly, the invention relates to the engineering of plants modified to have altered expression of key enzymes in the nitrogen assimilation and utilization pathways. In one embodiment of the present invention, the desired altered expression is accomplished by engineering the plant for ectopic overexpression of one or more of the native or modified nitrogen assimilatory enzymes. The invention also has a number of other embodiments, all of which are disclosed herein.
The dynamic radiation environment assimilation model (DREAM)
Energy Technology Data Exchange (ETDEWEB)
Reeves, Geoffrey D [Los Alamos National Laboratory; Koller, Josef [Los Alamos National Laboratory; Tokar, Robert L [Los Alamos National Laboratory; Chen, Yue [Los Alamos National Laboratory; Henderson, Michael G [Los Alamos National Laboratory; Friedel, Reiner H [Los Alamos National Laboratory
2010-01-01
The Dynamic Radiation Environment Assimilation Model (DREAM) is a 3-year effort sponsored by the US Department of Energy to provide global, retrospective, or real-time specification of the natural and potential nuclear radiation environments. The DREAM model uses Kalman filtering techniques that combine the strengths of new physical models of the radiation belts with electron observations from long-term satellite systems such as GPS and geosynchronous systems. DREAM includes a physics model for the production and long-term evolution of artificial radiation belts from high altitude nuclear explosions. DREAM has been validated against satellites in arbitrary orbits and consistently produces more accurate results than existing models. Tools for user-specific applications and graphical displays are in beta testing and a real-time version of DREAM has been in continuous operation since November 2009.
Norris, P. M.; da Silva, A. M., Jr.
2016-12-01
Norris and da Silva recently published a method to constrain a statistical model of sub-gridcolumn moisture variability using high-resolution satellite cloud data. The method can be used for large-scale model parameter estimation or cloud data assimilation (CDA). The gridcolumn model includes assumed-PDF intra-layer horizontal variability and a copula-based inter-layer correlation model. The observables used are MODIS cloud-top pressure, brightness temperature and cloud optical thickness, but the method should be extensible to direct cloudy radiance assimilation for a small number of channels. The algorithm is a form of Bayesian inference with a Markov chain Monte Carlo (MCMC) approach to characterizing the posterior distribution. This approach is especially useful in cases where the background state is clear but cloudy observations exist. In traditional linearized data assimilation methods, a subsaturated background cannot produce clouds via any infinitesimal equilibrium perturbation, but the Monte Carlo approach is not gradient-based and allows jumps into regions of non-zero cloud probability. In the example provided, the method is able to restore marine stratocumulus near the Californian coast where the background state has a clear swath. The new approach not only significantly reduces mean and standard deviation biases with respect to the assimilated observables, but also improves the simulated rotational-Raman scattering cloud optical centroid pressure against independent (non-assimilated) retrievals from the OMI instrument. One obvious difficulty for the method, and other CDA methods, is the lack of information content in passive cloud observables on cloud vertical structure, beyond cloud-top and thickness, thus necessitating strong dependence on the background vertical moisture structure. It is found that a simple flow-dependent correlation modification due to Riishojgaard is helpful, better honoring inversion structures in the background state.
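The gradient-free jumping behavior described above is inherent to Metropolis-type MCMC: finite proposals can land in regions (e.g. cloudy states) that no infinitesimal perturbation of a clear background reaches. A minimal random-walk Metropolis sketch, with an illustrative target and settings:

```python
import math, random

def metropolis(log_post, x0, n, step=1.0):
    """Random-walk Metropolis sampler (sketch). log_post is the
    unnormalized log-posterior; proposals are Gaussian jumps."""
    x, lp, samples = x0, log_post(x0), []
    for _ in range(n):
        xp = x + random.gauss(0.0, step)
        lpp = log_post(xp)
        if math.log(random.random()) < lpp - lp:  # Metropolis accept rule
            x, lp = xp, lpp
        samples.append(x)
    return samples
```

In the CDA setting, the state would be the moisture-PDF parameters and `log_post` would combine the background term with the MODIS observation likelihood.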
Data Assimilation for Management of Industrial Groundwater Contamination at a Regional Scale
El Gharamti, Mohamad
2014-12-01
Groundwater is one of the main sources for drinking water and agricultural activities. Various activities of both humans and nature may lead to groundwater pollution. Very often, pollution, or contamination, of groundwater goes undetected for long periods of time until it begins to affect human health and/or the environment. Cleanup technologies used to remediate pollution can be costly and remediation processes are often protracted. A more practical and feasible way to manage groundwater contamination is to monitor and predict contamination and act as soon as there is risk to the population and the environment. Predicting groundwater contamination requires advanced numerical models of groundwater flow and solute transport. Such numerical modeling is increasingly becoming a reference criterion for water resources assessment and environmental protection. Subsurface numerical models are, however, subject to many sources of uncertainties from unknown parameters and approximate dynamics. This dissertation considers the sequential data assimilation approach and tackles the groundwater contamination problem at the port of Rotterdam in the Netherlands. Industrial concentration data are used to monitor and predict the fate of organic contaminants using a three-dimensional coupled flow and reactive transport model. We propose a number of novel assimilation techniques that address different challenges, including prohibitive computational burden, the nonlinearity and coupling of the subsurface dynamics, and the structural and parametric uncertainties. We also investigate the problem of optimal observational designs to optimize the location and the number of wells. The proposed new methods are based on the ensemble Kalman filter (EnKF), which provides an efficient numerical solution to the Bayesian filtering problem. The dissertation first investigates in depth the popular joint and dual filtering formulations of the state-parameters estimation problem. New methodologies, algorithmically
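The EnKF analysis step underlying the proposed methods can be sketched for a directly observed scalar state. This is a stochastic, perturbed-observations variant with illustrative settings, not the dissertation's formulation:

```python
import random

def enkf_update(ensemble, obs, obs_err):
    """Stochastic EnKF analysis step for a directly observed scalar state:
    the Kalman gain comes from the ensemble variance, and each member
    assimilates a perturbed copy of the observation."""
    n = len(ensemble)
    mean = sum(ensemble) / n
    var = sum((x - mean) ** 2 for x in ensemble) / (n - 1)
    gain = var / (var + obs_err ** 2)  # Kalman gain from ensemble statistics
    return [x + gain * (obs + random.gauss(0.0, obs_err) - x) for x in ensemble]
```

In the real problem, the state vector holds gridded heads, concentrations and parameters, and the gain uses cross-covariances between observed and unobserved variables.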
Shell model Monte Carlo methods
International Nuclear Information System (INIS)
Koonin, S.E.
1996-01-01
We review quantum Monte Carlo methods for dealing with large shell model problems. These methods reduce the imaginary-time many-body evolution operator to a coherent superposition of one-body evolutions in fluctuating one-body fields; the resultant path integral is evaluated stochastically. We first discuss the motivation, formalism, and implementation of such Shell Model Monte Carlo methods. There then follows a sampler of results and insights obtained from a number of applications. These include the ground state and thermal properties of pf-shell nuclei, the thermal behavior of γ-soft nuclei, and the calculation of double beta-decay matrix elements. Finally, prospects for further progress in such calculations are discussed. 87 refs
Zimmerman, George B.
Monte Carlo methods appropriate to simulate the transport of x-rays, neutrons, ions and electrons in Inertial Confinement Fusion targets are described and analyzed. The Implicit Monte Carlo method of x-ray transport handles symmetry within indirect drive ICF hohlraums well, but can be improved 50X in efficiency by angular biasing the x-rays towards the fuel capsule. Accurate simulation of thermonuclear burn and burn diagnostics involves detailed particle source spectra, charged particle ranges, inflight reaction kinematics, corrections for bulk and thermal Doppler effects and variance reduction to obtain adequate statistics for rare events. It is found that the effects of angular Coulomb scattering must be included in models of charged particle transport through heterogeneous materials.
International Nuclear Information System (INIS)
Zimmerman, G.B.
1997-01-01
Monte Carlo methods appropriate to simulate the transport of x-rays, neutrons, ions and electrons in Inertial Confinement Fusion targets are described and analyzed. The Implicit Monte Carlo method of x-ray transport handles symmetry within indirect drive ICF hohlraums well, but can be improved 50X in efficiency by angular biasing the x-rays towards the fuel capsule. Accurate simulation of thermonuclear burn and burn diagnostics involves detailed particle source spectra, charged particle ranges, inflight reaction kinematics, corrections for bulk and thermal Doppler effects and variance reduction to obtain adequate statistics for rare events. It is found that the effects of angular Coulomb scattering must be included in models of charged particle transport through heterogeneous materials. copyright 1997 American Institute of Physics
Yang, Z. L.; Zhang, Y.; Kwon, Y.; Lin, P.; Zhao, L.; Hoar, T. J.; Anderson, J. L.; Toure, A. M.; Rodell, M.
2015-12-01
Land plays an important role in shaping regional and global climate and the water cycle. However, many of these processes are not well understood, largely due to the lack of high-quality datasets. Over the past 5 years, we have developed a global-scale multi-sensor snow data assimilation system based on NCAR's Data Assimilation Research Testbed (DART) coupled to the Community Land Model version 4 (CLM4); CLM4 can be replaced by CLM4.5 or the latest versions as they become available. This data assimilation system can be applied to all land areas to take advantage of high-resolution, region-specific observations. The DART data assimilation system has an unprecedentedly large ensemble (80-member) atmospheric forcing (temperature, precipitation, winds, humidity, radiation) with a quality typical of reanalysis products, which not only facilitates ensemble land data assimilation, but also allows a comprehensive study of many feedback processes (e.g. the snow albedo feedback and the soil moisture-precipitation feedback). While initial findings were reported at past AGU, AMS and GEWEX meetings, this paper will present comprehensive results from CLM/DART assimilating MODIS (Moderate Resolution Imaging Spectroradiometer) snow cover fraction and GRACE (Gravity Recovery and Climate Experiment) terrestrial water storage. Besides our prototype snow data assimilation, the coupled CLM4/DART framework is useful for data assimilation involving other variables, such as soil moisture, skin temperature, and leaf area index from various satellite sources and ground observations. Such a truly multi-mission, multi-platform, multi-sensor, and multi-scale data assimilation system with DART will, ultimately, help constrain earth system models using all kinds of observations to improve their prediction skills from intraseasonal to interannual time scales. Some preliminary results from using our snow data assimilation output in seasonal climate prediction will be presented as well.
Adaptive Multilevel Monte Carlo Simulation
Hoel, H
2011-08-23
This work generalizes a multilevel forward Euler Monte Carlo method introduced by Michael B. Giles (Oper. Res. 56(3):607–617, 2008) for the approximation of expected values depending on the solution to an Itô stochastic differential equation. That work proposed and analyzed a forward Euler multilevel Monte Carlo method based on a hierarchy of uniform time discretizations and control variates to reduce the computational effort required by a standard, single-level, forward Euler Monte Carlo method. This work introduces an adaptive hierarchy of non-uniform time discretizations, generated by an adaptive algorithm introduced in (Anna Dzougoutov et al. Adaptive Monte Carlo algorithms for stopped diffusion. In Multiscale Methods in Science and Engineering, volume 44 of Lect. Notes Comput. Sci. Eng., pages 59–88. Springer, Berlin, 2005; Kyoung-Sook Moon et al. Stoch. Anal. Appl. 23(3):511–558, 2005; Kyoung-Sook Moon et al. An adaptive algorithm for ordinary, stochastic and partial differential equations. In Recent Advances in Adaptive Computation, volume 383 of Contemp. Math., pages 325–343. Amer. Math. Soc., Providence, RI, 2005). This form of the adaptive algorithm generates stochastic, path-dependent time steps and is based on a posteriori error expansions first developed in (Anders Szepessy et al. Comm. Pure Appl. Math. 54(10):1169–1214, 2001). Our numerical results for a stopped diffusion problem exhibit savings in the computational cost to achieve an accuracy of O(TOL): from O(TOL^-3) using a single-level version of the adaptive algorithm to O((TOL^-1 log(TOL))^2).
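The telescoping structure of a multilevel Monte Carlo estimator can be sketched as follows; the `payoff(l, u)` interface and the toy example in the usage are assumptions for illustration, not the paper's adaptive scheme:

```python
import random

def mlmc_estimate(L, N, payoff):
    """Multilevel Monte Carlo sketch: write E[P_L] as E[P_0] plus the
    telescoping corrections E[P_l - P_{l-1}], estimating each term with its
    own sample size N[l] (many cheap coarse samples, few expensive fine
    ones). payoff(l, u) returns the level-l approximation driven by the
    common random input u, so each correction stays strongly correlated."""
    estimate = 0.0
    for l in range(L + 1):
        acc = 0.0
        for _ in range(N[l]):
            u = random.random()
            acc += payoff(l, u) - (payoff(l - 1, u) if l > 0 else 0.0)
        estimate += acc / N[l]
    return estimate
```

As a toy example, `payoff(l, u) = u + u / 2**l` mimics a discretization whose bias halves per level; the corrections shrink geometrically, so most samples can be spent at level 0.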
Extending canonical Monte Carlo methods
International Nuclear Information System (INIS)
Velazquez, L; Curilef, S
2010-01-01
In this paper, we discuss the implications of a recently obtained equilibrium fluctuation-dissipation relation for the extension of the available Monte Carlo methods based on the consideration of the Gibbs canonical ensemble, to account for the existence of an anomalous regime with negative heat capacities, C < 0, for the particular case of the 2D ten-state Potts model
Parallel Monte Carlo reactor neutronics
International Nuclear Information System (INIS)
Blomquist, R.N.; Brown, F.B.
1994-01-01
The issues affecting implementation of parallel algorithms for large-scale engineering Monte Carlo neutron transport simulations are discussed. For nuclear reactor calculations, these include load balancing, recoding effort, reproducibility, domain decomposition techniques, I/O minimization, and strategies for different parallel architectures. Two codes were parallelized and tested for performance. The architectures employed include SIMD, MIMD-distributed memory, and workstation network with uneven interactive load. Speedups linear with the number of nodes were achieved
Statistical techniques to extract information during SMAP soil moisture assimilation
Kolassa, J.; Reichle, R. H.; Liu, Q.; Alemohammad, S. H.; Gentine, P.
2017-12-01
Statistical techniques permit the retrieval of soil moisture estimates in a model climatology while retaining the spatial and temporal signatures of the satellite observations. As a consequence, the need for bias correction prior to an assimilation of these estimates is reduced, which could result in a more effective use of the independent information provided by the satellite observations. In this study, a statistical neural network (NN) retrieval algorithm is calibrated using SMAP brightness temperature observations and modeled soil moisture estimates (similar to those used to calibrate the SMAP Level 4 DA system). Daily values of surface soil moisture are estimated using the NN and then assimilated into the NASA Catchment model. The skill of the assimilation estimates is assessed based on a comprehensive comparison to in situ measurements from the SMAP core and sparse network sites as well as the International Soil Moisture Network. The NN retrieval assimilation is found to significantly improve the model skill, particularly in areas where the model does not represent processes related to agricultural practices. Additionally, the NN method is compared to assimilation experiments using traditional bias correction techniques. The NN retrieval assimilation is found to more effectively use the independent information provided by SMAP resulting in larger model skill improvements than assimilation experiments using traditional bias correction techniques.
SMOS brightness temperature assimilation into the Community Land Model
Rains, Dominik; Han, Xujun; Lievens, Hans; Montzka, Carsten; Verhoest, Niko E. C.
2017-11-01
SMOS (Soil Moisture and Ocean Salinity mission) brightness temperatures at a single incident angle are assimilated into the Community Land Model (CLM) across Australia to improve soil moisture simulations. Therefore, the data assimilation system DasPy is coupled to the local ensemble transform Kalman filter (LETKF) as well as to the Community Microwave Emission Model (CMEM). Brightness temperature climatologies are precomputed to enable the assimilation of brightness temperature anomalies, making use of 6 years of SMOS data (2010-2015). Mean correlation R with in situ measurements increases moderately from 0.61 to 0.68 (11 %) for upper soil layers if the root zone is included in the updates. A reduced improvement of 5 % is achieved if the assimilation is restricted to the upper soil layers. Root-zone simulations improve by 7 % when updating both the top layers and root zone, and by 4 % when only updating the top layers. Mean increments and increment standard deviations are compared for the experiments. The long-term assimilation impact is analysed by looking at a set of quantiles computed for soil moisture at each grid cell. Within hydrological monitoring systems, extreme dry or wet conditions are often defined via their relative occurrence, adding great importance to assimilation-induced quantile changes. Although record lengths are still limited now, longer L-band radiometer time series will become available, making model output improved by assimilating such data more usable for extreme-event statistics.
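The precomputed-climatology step described above amounts to assimilating only the anomaly information: subtract the observation climatology for that time and location and re-add the model's own climatology. A minimal sketch (function name and arguments are illustrative):

```python
def anomaly_to_model_space(tb_obs, obs_clim, model_clim):
    """Climatology-based anomaly rescaling (sketch): keep the observed
    brightness temperature anomaly but express it in the model's
    climatology, so systematic obs-model biases are not assimilated."""
    return model_clim + (tb_obs - obs_clim)
```

The rescaled value, rather than the raw brightness temperature, is what the filter compares against the CMEM-simulated observation.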
PATTERNS OF OXIDATIVE ASSIMILATION IN STRAINS OF ACETOBACTER AND AZOTOBACTER
Tomlinson, Geraldine A.; Campbell, J. J. R.
1963-01-01
Tomlinson, Geraldine A. (The University of British Columbia, Vancouver, B.C., Canada), and J. J. R. Campbell. Patterns of oxidative assimilation in strains of Acetobacter and Azotobacter. J. Bacteriol. 86:1165–1172. 1963.—Oxidative assimilation of glucose-U-C14 was studied with washed-cell suspensions of Acetobacter aceti, A. xylinum, Azotobacter vinelandii, and A. agilis. The suggestion that oxidative assimilation is largely the incorporation of endogenously produced ammonia is tenable. A. aceti did not exhibit oxidative assimilation and it did not incorporate ammonia in the presence of glucose, α-ketoglutarate, or pyruvate. A. xylinum, A. vinelandii, and A. agilis incorporated C14 into the nitrogenous fractions of the cell. The level of assimilation into A. xylinum was low due to the accumulation of extracellular cellulose, and the level of assimilation into the Azotobacter was low presumably because of the requirement of energy for nitrogen fixation. The Azotobacter were characterized by the presence of a high level of radioactivity in the cold trichloroacetic acid-soluble pool. None of the organisms accumulated compounds in the supernatant fluid that might be considered pacemakers in glucose oxidation, and this could be a contributing factor in the low level of assimilation. PMID:14086085
Ren, Lei; Hartnett, Michael
2017-02-01
Accurate forecasting of coastal surface currents has become of great economic importance over the past twenty years due to marine activities such as renewable energy and fish farming in coastal regions. Advanced oceanographic observation systems such as satellites and radars can provide many parameters of interest, such as surface currents and waves, at fine spatial resolution in near real time. To enhance modelling capability, data assimilation (DA) techniques, which combine the available measurements with hydrodynamic models, have been used in oceanography since the 1990s. Assimilating measurements into a hydrodynamic model pulls the model background states towards the observation trajectory, which is then used to provide more accurate forecasting information. Galway Bay is an open, wind-dominated water body on which two coastal radars are deployed. An efficient and easy-to-implement sequential DA algorithm named Optimal Interpolation (OI) was used to blend radar surface current data into a three-dimensional Environmental Fluid Dynamics Code (EFDC) model. Two empirical parameters, horizontal correlation length and DA cycle length (CL), are inherent in OI. No guidance has previously been published on selecting appropriate values for these parameters or on how sensitive OI DA is to variations in their values. A detailed sensitivity analysis was performed on both parameters and the results are presented. An appropriate DA CL value was determined by minimizing the root-mean-square error (RMSE) between radar data and model background states. An analysis was performed to evaluate the assimilation index (AI) of using the OI DA algorithm in the model. The AI of the half-day forecast mean vector directions was over 50% in the best assimilation model. The ability of OI to improve model forecasts was also assessed and is reported upon.
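The OI analysis step has a standard closed form, with the background covariance shaped by the horizontal correlation length discussed above. A minimal 1-D sketch (illustrative only, not the EFDC implementation; the Gaussian correlation shape and all names are assumptions):

```python
import numpy as np

def oi_analysis(xb, y, obs_idx, coords, L, sigma_b, sigma_o):
    """One Optimal Interpolation update: xa = xb + K (y - H xb),
    with gain K = B H^T (H B H^T + R)^(-1).
    B is a Gaussian covariance with horizontal correlation length L."""
    d = np.abs(coords[:, None] - coords[None, :])      # pairwise distances
    B = sigma_b**2 * np.exp(-0.5 * (d / L)**2)         # background covariance
    H = np.zeros((len(obs_idx), len(xb)))
    H[np.arange(len(obs_idx)), obs_idx] = 1.0          # point observations
    R = sigma_o**2 * np.eye(len(obs_idx))              # observation-error covariance
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
    return xb + K @ (y - H @ xb)
```

The correlation length L controls how far each radar observation's influence spreads into unobserved grid cells, which is why the sensitivity analysis above focuses on it.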
Satellite Altimetry, Ocean Circulation, and Data Assimilation
Fu, Lee-Lueng
1999-01-01
The objective of my research is to investigate how much information about the deep ocean can be inferred from altimetric observations. The approach is data assimilation using ocean models. The objective is to develop a modeling/data assimilation system that will produce estimates of the three-dimensional state of the ocean for the entire duration of the TOPEX/POSEIDON Mission.
Assimilative Learning with the Aid of Cognitive Maps
Directory of Open Access Journals (Sweden)
D. Läge
2008-06-01
Full Text Available Assimilative learning is understood as integrating new information into existing knowledge or cognitive structures without restructuring the current schema. If new information causes inconsistencies, cognitive efforts are necessary to reorganize or to accommodate the old knowledge. Thus, assimilative learning is more efficient and economic. Nonetheless, a stable and, most notably, a correct memory representation which "spans" the knowledge space is essential. The current article highlights the logic of assimilative learning and shows how the elaborate building of a basic structure, as well as the assimilative integration of new information, can be eased with the aid of cognitive maps. Such a didactical scenario can be easily implemented in the field of eLearning and thus adaptively and automatically supports the learning process.
Finding False Paths in Sequential Circuits
Matrosova, A. Yu.; Andreeva, V. V.; Chernyshov, S. V.; Rozhkova, S. V.; Kudin, D. V.
2018-02-01
A method for finding false paths in sequential circuits is developed. In contrast with the heuristic approaches currently in use elsewhere, a precise method is suggested, based on applying operations to Reduced Ordered Binary Decision Diagrams (ROBDDs) extracted from the combinational part of a sequential controlling logic circuit. The method finds false paths when the transfer sequence length is not more than a given value, and obviates the need to investigate combinational circuit equivalents of the given lengths. The possibilities of applying the developed method to more complicated circuits are discussed.
Passive Baited Sequential Filth Fly Trap.
Aldridge, Robert L; Britch, Seth C; Snelling, Melissa; Gutierez, Arturo; White, Gregory; Linthicum, Kenneth J
2015-09-01
Filth fly control measures may be optimized with a better understanding of fly population dynamics measured throughout the day. We describe the modification of a commercial motorized sequential mosquito trap to accept liquid odorous bait and leverage a classic inverted-cone design to passively confine flies in 8 modified collection bottles corresponding to 8 intervals. Efficacy trials in a hot-arid desert environment indicate no significant difference (P = 0.896) between the modified sequential trap and a Rid-Max® fly trap.
Asynchronous Operators of Sequential Logic Venjunction & Sequention
Vasyukevich, Vadim
2011-01-01
This book is dedicated to new mathematical instruments designed for logical modeling of the memory of digital devices. The case in point is the logic-dynamical operation named venjunction and the venjunctive function, as well as the sequention and the sequentional function. Venjunction and sequention operate within the framework of sequential logic. In the form of the corresponding equations, they fit organically into the analytical expressions of Boolean algebra. Thus, a sort of symbiosis is formed, using elements of asynchronous sequential logic on the one hand and combinational logic on the other hand. So, asynchronous
Double sequential defibrillation for refractory ventricular fibrillation.
El Tawil, Chady; Mrad, Sandra; Khishfe, Basem F
2017-12-01
A 54-year-old suffered an out-of-hospital cardiac arrest. Compressions were started within minutes, and the patient remained in refractory ventricular fibrillation despite multiple asynchronized shocks and maximal doses of antiarrhythmic agents. Double sequential defibrillation was attempted, with successful Return Of Spontaneous Circulation (ROSC) after a total of 61 min of cardiac arrest. The patient was discharged home neurologically intact. Double sequential defibrillation could be a simple, effective approach to patients with refractory ventricular fibrillation.
Conversion from Excel into Aleph sequential
Renaville, François; Thirion, Paul
2009-01-01
Libraries must sometimes load records that are not available to them in a standard bibliographic format (MARC21, UNIMARC...): integration of the book database of an academic research center, a list of new e-journals bought by the library... This can make converting the data to the Aleph sequential format quite hard. Sometimes the records are only available in Excel. This poster explains how to easily convert an Excel file into Aleph sequential in a few steps in order to load re...
Pawson, Steven; Lin, Shian-Jiann; Rood, Richard B.; Stajner, Ivanka; Nebuda, Sharon; Nielsen, J. Eric; Douglass, Anne R.
2000-01-01
In order to support the EOS-Chem project, a comprehensive assimilation package for the coupled chemical-dynamical system is being developed by the Data Assimilation Office at NASA GSFC. This involves development of a coupled chemistry/meteorology model and of data assimilation techniques for trace species and meteorology. The model is being developed using the flux-form semi-Lagrangian dynamical core of Lin and Rood, the physical parameterizations from the NCAR Community Climate Model, and atmospheric chemistry modules from the Atmospheric Chemistry and Dynamics branch at NASA GSFC. To date the following results have been obtained: (i) multi-annual simulations with the dynamics-radiation model show the credibility of the package for atmospheric simulations; (ii) initial simulations including a limited number of middle atmospheric trace gases reveal the realistic nature of transport mechanisms, although there is still a need for some improvements. Samples of these results will be shown. A meteorological assimilation system is currently being constructed using the model; this will form the basis for the proposed meteorological/chemical assimilation package. The latter part of the presentation will focus on areas targeted for development in the near and far terms, with the objective of providing a comprehensive assimilation package for the EOS-Chem science experiment. The first stage will target ozone assimilation. The plans also encompass a reanalysis (ReSTS) for the 1991-1995 period, which includes the Mt. Pinatubo eruption and the time when a large number of UARS observations were available. One of the most challenging aspects of future developments will be to couple theoretical advances in tracer assimilation with the practical considerations of a real environment and eventually a near-real-time assimilation system.
Mazzarella, Vincenzo; Maiello, Ida; Capozzi, Vincenzo; Budillon, Giorgio; Ferretti, Rossella
2017-08-01
This work aims to provide a comparison between three-dimensional and four-dimensional variational data assimilation methods (3D-Var and 4D-Var) for a heavy rainfall case in central Italy. To evaluate the impact of the assimilation of reflectivity and radial velocity acquired from the Monte Midia Doppler radar into the Weather Research and Forecasting (WRF) model, the quantitative precipitation forecast (QPF) is used. The two methods are compared for a heavy rainfall event that occurred in central Italy on 14 September 2012 during the first Special Observation Period (SOP1) of the HyMeX (HYdrological cycle in Mediterranean EXperiment) campaign. This event, characterized by a deep low pressure system over the Tyrrhenian Sea, produced flash floods over the Marche and Abruzzo regions, where rainfall maxima reached more than 150 mm 24 h⁻¹. To identify the best QPF, nine experiments are performed using 3D-Var and 4D-Var data assimilation techniques. All simulations are compared in terms of rainfall forecast and precipitation measured by the gauges through three statistical indicators: probability of detection (POD), critical success index (CSI) and false alarm ratio (FAR). The assimilation of conventional observations with the 4D-Var method improves the QPF compared to 3D-Var. In addition, the use of radar measurements in 4D-Var simulations enhances the performance of the statistical scores for higher rainfall thresholds.
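The three categorical scores named above follow directly from a 2x2 contingency table built per rainfall threshold. A small helper (standard definitions; illustrative, not the authors' code):

```python
def pod_csi_far(hits, misses, false_alarms):
    """Categorical verification scores from a 2x2 contingency table.
    hits: forecast yes / observed yes; misses: forecast no / observed yes;
    false_alarms: forecast yes / observed no."""
    pod = hits / (hits + misses)                 # probability of detection
    far = false_alarms / (hits + false_alarms)   # false alarm ratio
    csi = hits / (hits + misses + false_alarms)  # critical success index
    return pod, csi, far
```

POD and FAR reward detection and penalize over-forecasting separately, while CSI combines both, which is why all three are reported per threshold.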
Application of Bred Vectors To Data Assimilation
Corazza, M.; Kalnay, E.; Patil, Dj
We introduced a statistic, the BV-dimension, to measure the effective local finite-time dimensionality of the atmosphere. We show that this dimension is often quite low, and suggest that this finding has important implications for data assimilation and the accuracy of weather forecasting (Patil et al, 2001). The original database for this study was the forecasts of the NCEP global ensemble forecasting system. The initial differences between the control forecast and the perturbed forecasts are called bred vectors. The control and perturbed initial conditions valid at time t = nΔt are evolved using the forecast model until time t = (n+1)Δt. The differences between the perturbed and the control forecasts are scaled down to their initial amplitude, and constitute the bred vectors valid at (n+1)Δt. Their growth rate is typically about 1.5/day. The bred vectors are similar by construction to leading Lyapunov vectors except that they have small but finite amplitude, and they are valid at finite times. The original NCEP ensemble data set has 5 independent bred vectors. We define a local bred vector at each grid point by choosing the 5 by 5 grid points centered at the grid point (a region of about 1100 km by 1100 km), and using the north-south and east-west velocity components at the 500 mb pressure level to form a 50-dimensional column vector. Since we have k=5 global bred vectors, we also have k local bred vectors at each grid point. We estimate the effective dimensionality of the subspace spanned by the local bred vectors by performing a singular value decomposition (EOF analysis). The k local bred vector columns form a 50xk matrix M. The singular values s(i) of M measure the extent to which the k column unit vectors making up the matrix M point in the direction of v(i). We define the bred vector dimension as BVDIM = {Sum[s(i)]}^2 / Sum[s(i)^2]. For example, if 4 out of the 5 vectors lie along v(1) and one lies along v(2), the singular values would be (sqrt(4), 1, 0, 0, 0), giving BVDIM = (sqrt(4) + 1)^2 / (4 + 1) = 1.8.
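The BV-dimension computation described above reduces to a singular value decomposition of the local bred-vector matrix. A minimal version (illustrative sketch of the formula, not the authors' code):

```python
import numpy as np

def bv_dimension(local_bvs):
    """Bred-vector dimension of Patil et al. (2001):
    BVDIM = (sum_i s_i)^2 / sum_i s_i^2,
    where s_i are the singular values of the matrix whose k columns are
    the local bred vectors."""
    s = np.linalg.svd(local_bvs, compute_uv=False)
    return s.sum()**2 / (s**2).sum()
```

For the worked example in the abstract (four columns along one direction, one along another), the singular values are (2, 1, 0, 0, 0) and the function returns 1.8: well below the nominal dimension 5, signalling a locally low-dimensional subspace.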
Kimberlite ascent by assimilation-fuelled buoyancy.
Russell, James K; Porritt, Lucy A; Lavallée, Yan; Dingwell, Donald B
2012-01-18
Kimberlite magmas have the deepest origin of all terrestrial magmas and are exclusively associated with cratons. During ascent, they travel through about 150 kilometres of cratonic mantle lithosphere and entrain seemingly prohibitive loads (more than 25 per cent by volume) of mantle-derived xenoliths and xenocrysts (including diamond). Kimberlite magmas also reputedly have higher ascent rates than other xenolith-bearing magmas. Exsolution of dissolved volatiles (carbon dioxide and water) is thought to be essential to provide sufficient buoyancy for the rapid ascent of these dense, crystal-rich magmas. The cause and nature of such exsolution, however, remains elusive and is rarely specified. Here we use a series of high-temperature experiments to demonstrate a mechanism for the spontaneous, efficient and continuous production of this volatile phase. This mechanism requires parental melts of kimberlite to originate as carbonatite-like melts. In transit through the mantle lithosphere, these silica-undersaturated melts assimilate mantle minerals, especially orthopyroxene, driving the melt to more silicic compositions, and causing a marked drop in carbon dioxide solubility. The solubility drop manifests itself immediately in a continuous and vigorous exsolution of a fluid phase, thereby reducing magma density, increasing buoyancy, and driving the rapid and accelerating ascent of the increasingly kimberlitic magma. Our model provides an explanation for continuous ascent of magmas laden with high volumes of dense mantle cargo, an explanation for the chemical diversity of kimberlite, and a connection between kimberlites and cratons.
Efficient data assimilation algorithm for bathymetry application
Ghorbanidehno, H.; Lee, J. H.; Farthing, M.; Hesser, T.; Kitanidis, P. K.; Darve, E. F.
2017-12-01
Information on the evolving state of the nearshore zone bathymetry is crucial to shoreline management, recreational safety, and naval operations. The high cost and complex logistics of using ship-based surveys for bathymetry estimation have encouraged the use of remote sensing techniques. Data assimilation methods combine the remote sensing data and nearshore hydrodynamic models to estimate the unknown bathymetry and the corresponding uncertainties. In particular, several recent efforts have combined Kalman filter-based techniques such as ensemble-based Kalman filters with indirect video-based observations to address the bathymetry inversion problem. However, these methods often suffer from ensemble collapse and uncertainty underestimation. Here, the Compressed State Kalman Filter (CSKF) method is used to estimate the bathymetry based on observed wave celerity. In order to demonstrate the accuracy and robustness of the CSKF method, we consider twin tests with synthetic observations of wave celerity, while the bathymetry profiles are chosen based on surveys taken by the U.S. Army Corps of Engineers Field Research Facility (FRF) in Duck, NC. The first test case is a bathymetry estimation problem for a spatially smooth and temporally constant bathymetry profile. The second test case is a bathymetry estimation problem for a temporally evolving bathymetry from a smooth to a non-smooth profile. For both problems, we compare the results of CSKF with those obtained by the local ensemble transform Kalman filter (LETKF), which is a popular ensemble-based Kalman filter method.
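For context, the ensemble-based baseline that CSKF is compared against updates an ensemble of bathymetry states with a Kalman gain estimated from ensemble statistics. Below is a generic stochastic EnKF analysis step (an illustrative sketch, not the LETKF or CSKF implementation; CSKF instead uses a compressed low-rank covariance, which is not shown here):

```python
import numpy as np

rng = np.random.default_rng(0)

def enkf_update(X, y, H, sigma_o):
    """Stochastic ensemble Kalman filter analysis step.
    X: (n_state, n_ens) forecast ensemble; y: observation vector;
    H: linear observation operator; sigma_o: observation-error std."""
    n, m = X.shape
    Xm = X.mean(axis=1, keepdims=True)
    A = X - Xm                                  # ensemble anomalies
    PHt = A @ (H @ A).T / (m - 1)               # B H^T estimated from the ensemble
    S = H @ PHt + sigma_o**2 * np.eye(len(y))   # innovation covariance H B H^T + R
    K = PHt @ np.linalg.inv(S)                  # Kalman gain
    Y = y[:, None] + sigma_o * rng.standard_normal((len(y), m))  # perturbed obs
    return X + K @ (Y - H @ X)
```

With small ensembles, the sampled gain is noisy and repeated updates can shrink the spread too far, which is the ensemble collapse and uncertainty underestimation the abstract refers to.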
Assimilate partitioning in avocado, Persea americana
Energy Technology Data Exchange (ETDEWEB)
Finazzo, S.; Davenport, T.L.
1986-04-01
Assimilate partitioning is being studied in avocado, Persea americana cv. Millborrow, in relation to fruit set. Single leaves on girdled branches of 10-year-old trees were radiolabeled for 1 hr with 13 μCi of ¹⁴CO₂. The source leaves were sampled during the experiment to measure translocation rates. At harvest the sink tissues were dissected and the incorporated radioactivity was measured. The translocation of ¹⁴C-labelled compounds to other leaves was minimal. Incorporation of label into fruitlets varied with the tissue and the stage of development. Sinks (fruitlets) nearest to the labelled leaf and sharing the same phyllotaxy incorporated the most ¹⁴C. Source leaves for single non-abscising fruitlets retained 3× more ¹⁴C-labelled compounds than did source leaves for 2 or more fruitlets at 31 hrs post-labelling. Export of label decreased appreciably when fruitlets abscised. If fruitlets abscised within 4 days of labeling, then the translocation pattern was similar to the pattern for single fruitlets. If the fruitlet abscised later, the translocation pattern was intermediate between the single and double fruitlet patterns.
Parker, James A. D.; Eleri Pryse, S.; Jackson-Booth, Natasha; Buckland, Rachel A.
2018-01-01
The main ionospheric trough is a large-scale spatial depletion in the electron density distribution at the interface between the high- and mid-latitude ionosphere. In western Europe it appears in early evening, progresses equatorward during the night, and retreats rapidly poleward at dawn. It exhibits substantial day-to-day variability and under conditions of increased geomagnetic activity it moves progressively to lower latitudes. Steep gradients on the trough-walls on either side of the trough minimum, and their variability, can cause problems for radio applications. Numerous studies have sought to characterize and quantify the trough behaviour. The Electron Density Assimilative Model (EDAM) models the ionosphere on a global scale. It assimilates observations into a background ionosphere, the International Reference Ionosphere 2007 (IRI2007), to provide a full 3-D representation of the ionospheric plasma distribution at specified times and days. This current investigation studied the capability of EDAM to model the ionosphere in the region of the main trough. Total electron content (TEC) measurements from 46 GPS stations in western Europe from September to December 2002 were assimilated into EDAM to provide a model of the ionosphere in the trough region. Vertical electron content profiles through the model revealed the trough and the detail of its structure. Statistical results are presented of the latitude of the trough minimum, TEC at the minimum and of other defined parameters that characterize the trough structure. The results are compared with previous observations made with the Navy Ionospheric Monitoring System (NIMS), and reveal the potential of EDAM to model the large-scale structure of the ionosphere.
da Silva, Arlindo M.; Norris, Peter M.
2013-01-01
Part I presented a Monte Carlo Bayesian method for constraining a complex statistical model of GCM sub-gridcolumn moisture variability using high-resolution MODIS cloud data, thereby permitting large-scale model parameter estimation and cloud data assimilation. This part performs some basic testing of this new approach, verifying that it does indeed significantly reduce mean and standard deviation biases with respect to the assimilated MODIS cloud optical depth, brightness temperature and cloud top pressure, and that it also improves the simulated rotational-Raman scattering cloud optical centroid pressure (OCP) against independent (non-assimilated) retrievals from the OMI instrument. Of particular interest, the Monte Carlo method does show skill in the especially difficult case where the background state is clear but cloudy observations exist. In traditional linearized data assimilation methods, a subsaturated background cannot produce clouds via any infinitesimal equilibrium perturbation, but the Monte Carlo approach allows finite jumps into regions of non-zero cloud probability. In the example provided, the method is able to restore marine stratocumulus near the Californian coast where the background state has a clear swath. This paper also examines a number of algorithmic and physical sensitivities of the new method and provides guidance for its cost-effective implementation. One obvious difficulty for the method, and other cloud data assimilation methods as well, is the lack of information content in the cloud observables on cloud vertical structure, beyond cloud top pressure and optical thickness, thus necessitating strong dependence on the background vertical moisture structure. It is found that a simple flow-dependent correlation modification due to Riishojgaard (1998) provides some help in this respect, by better honoring inversion structures in the background state.
Norris, Peter M.; da Silva, Arlindo M.
2016-01-01
Part 1 of this series presented a Monte Carlo Bayesian method for constraining a complex statistical model of global circulation model (GCM) sub-gridcolumn moisture variability using high-resolution Moderate Resolution Imaging Spectroradiometer (MODIS) cloud data, thereby permitting parameter estimation and cloud data assimilation for large-scale models. This article performs some basic testing of this new approach, verifying that it does indeed reduce mean and standard deviation biases significantly with respect to the assimilated MODIS cloud optical depth, brightness temperature and cloud-top pressure and that it also improves the simulated rotational-Raman scattering cloud optical centroid pressure (OCP) against independent (non-assimilated) retrievals from the Ozone Monitoring Instrument (OMI). Of particular interest, the Monte Carlo method does show skill in the especially difficult case where the background state is clear but cloudy observations exist. In traditional linearized data assimilation methods, a subsaturated background cannot produce clouds via any infinitesimal equilibrium perturbation, but the Monte Carlo approach allows non-gradient-based jumps into regions of non-zero cloud probability. In the example provided, the method is able to restore marine stratocumulus near the Californian coast, where the background state has a clear swath. This article also examines a number of algorithmic and physical sensitivities of the new method and provides guidance for its cost-effective implementation. One obvious difficulty for the method, and other cloud data assimilation methods as well, is the lack of information content in passive-radiometer-retrieved cloud observables on cloud vertical structure, beyond cloud-top pressure and optical thickness, thus necessitating strong dependence on the background vertical moisture structure. It is found that a simple flow-dependent correlation modification from Riishojgaard provides some help in this respect, by
Dobolyi, David G; Dodson, Chad S
2013-12-01
Confidence judgments for eyewitness identifications play an integral role in determining guilt during legal proceedings. Past research has shown that confidence in positive identifications is strongly associated with accuracy. Using a standard lineup recognition paradigm, we investigated accuracy using signal detection and ROC analyses, along with the tendency to choose a face with both simultaneous and sequential lineups. We replicated past findings of reduced rates of choosing with sequential as compared to simultaneous lineups, but notably found an accuracy advantage in favor of simultaneous lineups. Moreover, our analysis of the confidence-accuracy relationship revealed two key findings. First, we observed a sequential mistaken identification overconfidence effect: despite an overall reduction in false alarms, confidence for false alarms that did occur was higher with sequential lineups than with simultaneous lineups, with no differences in confidence for correct identifications. This sequential mistaken identification overconfidence effect is an expected byproduct of the use of a more conservative identification criterion with sequential than with simultaneous lineups. Second, we found a steady drop in confidence for mistaken identifications (i.e., foil identifications and false alarms) from the first to the last face in sequential lineups, whereas confidence in and accuracy of correct identifications remained relatively stable. Overall, we observed that sequential lineups are both less accurate and produce higher confidence false identifications than do simultaneous lineups. Given the increasing prominence of sequential lineups in our legal system, our data argue for increased scrutiny and possibly a wholesale reevaluation of this lineup format.
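The signal-detection quantities behind these analyses, discriminability (d') and response criterion (c), follow from hit and false-alarm rates via the inverse normal CDF. A minimal sketch using the standard formulas (illustrative, not the authors' analysis code):

```python
from statistics import NormalDist

def d_prime(hit_rate, false_alarm_rate):
    """Discriminability: d' = z(H) - z(F), the separation between the
    signal and noise distributions in standard-deviation units."""
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(false_alarm_rate)

def criterion(hit_rate, false_alarm_rate):
    """Response criterion: c = -(z(H) + z(F)) / 2. Larger (more positive)
    values indicate more conservative choosing, as reported for
    sequential lineups."""
    z = NormalDist().inv_cdf
    return -(z(hit_rate) + z(false_alarm_rate)) / 2
```

A shift toward a conservative criterion lowers both hits and false alarms without changing d', which is how a reduced choosing rate can coexist with no accuracy gain.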
UARS Correlative UKMO Daily Gridded Stratospheric Assimilated Data V001
National Aeronautics and Space Administration — The UARS Correlative assimilation data from the U.K. Meteorological Office (UKMO) consists of daily model runs at 12:00 GMT as a means of providing an independent...
Air Quality Activities in the Global Modeling and Assimilation Office
Pawson, Steven
2016-01-01
GMAO's mission is to enhance the use of NASA's satellite observations in weather and climate modeling. This presentation will discuss GMAO's mission, the value of data assimilation, and some relevant (available) GMAO data products.
Phyto-agglutinin, total proteins and amino assimilating enzymatic ...
African Journals Online (AJOL)
Jane
2011-09-28
Extracts of the 2K cultivars showed high phyto-agglutination of human erythrocytes with reproductive organs and other tissues, which indicates the presence of potent lectins (phyto-agglutinins). The amino assimilating enzymatic.
Satellite Sounder Data Assimilation for Improving Alaska Region Weather Forecast
Zhu, Jiang; Stevens, E.; Zavodsky, B. T.; Zhang, X.; Heinrichs, T.; Broderson, D.
2014-01-01
Data assimilation has been demonstrated to be very useful in improving both global and regional numerical weather prediction. Alaska has very sparse surface observation sites. On the other hand, it gets many more satellite overpasses than the lower 48 states. How to utilize satellite data to improve numerical prediction is one of the hot topics in the weather forecast community in Alaska. The Geographic Information Network of Alaska (GINA) at the University of Alaska is conducting a study on satellite data assimilation for the WRF model. AIRS/CrIS sounder profile data are used to assimilate the initial condition for the customized regional WRF model (GINA-WRF model). Normalized standard deviation, RMSE, and correlation statistical analysis methods are applied to one case of 48-hour forecasts and one month of 24-hour forecasts in order to evaluate the improvement of the regional numerical model from data assimilation. The final goal of the research is to provide improved real-time short-term forecasts for Alaska regions.
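The three verification statistics named above can be computed with a small helper (standard definitions; an illustrative sketch, not the study's code):

```python
import numpy as np

def verification_stats(forecast, observed):
    """Normalized standard deviation, RMSE, and Pearson correlation of a
    forecast series against matching observations."""
    f = np.asarray(forecast, dtype=float)
    o = np.asarray(observed, dtype=float)
    nsd = f.std() / o.std()                  # normalized standard deviation
    rmse = np.sqrt(np.mean((f - o) ** 2))    # root-mean-square error
    corr = np.corrcoef(f, o)[0, 1]           # Pearson correlation
    return nsd, rmse, corr
```

An improved assimilation run should move nsd toward 1, reduce RMSE, and raise the correlation relative to the control forecast.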
The Culture Assimilator: An Approach to Cross-Cultural Training
Fiedler, Fred E.; And Others
1971-01-01
Evaluates the cultural assimilator, a kind of training manual to help members of one culture understand and adjust to another culture. Describes those constructed for the Arab countries, Iran, Thailand, Central America, and Greece. (MB)
Develop a Hybrid Coordinate Ocean Model with Data Assimilation Capabilities
National Research Council Canada - National Science Library
Thacker, W. C
2003-01-01
.... The objectives of the research are as follows: (1) to develop a methodology for assimilating temperature and salinity profiles from XBT, CTD, and ARGO float data that accommodates the peculiarities of HYCOM's hybrid vertical coordinates, allowing...
Assimilation potential of water column biota: Mesocosm-based evaluations
Digital Repository Service at National Institute of Oceanography (India)
Ramaiah, N.; Ansari, Z.A.; Sadhasivan, A.; Naik, S.; Sawkar, K.
-toxic. It reveals the findings of mesocosm experiments, conducted to evaluate the assimilation potential of water column biota (bacteria, phytoplankton, and zooplankton). Bulk water quantities from coastal locations, characterized by intense tourist activity, were...
A simple lightning assimilation technique for improving retrospective WRF simulations.
Convective rainfall is often a large source of error in retrospective modeling applications. In particular, positive rainfall biases commonly exist during summer months due to overactive convective parameterizations. In this study, lightning assimilation was applied in the Kain-F...
Assimilate unloading from maize (Zea mays L.) pedicel tissues
International Nuclear Information System (INIS)
Porter, G.A.; Knievel, D.P.; Shannon, J.C.
1987-01-01
Sugar and ¹⁴C-assimilate release from the pedicel tissue of attached maize (Zea mays L.) kernels was studied following treatment with solute concentrations of up to 800 millimolal. Exposure and collection times ranged from 3 to 6 hours. Sugar and ¹⁴C-assimilate unloading and collection in agar traps were reduced by 25 and 43%, respectively, following exposure to 800 millimolal mannitol. Inhibition of unloading was not specific to mannitol, since similar concentrations of glucose, fructose, or equimolar glucose plus fructose resulted in comparable inhibition. Ethylene glycol, a rapidly permeating solute which should not greatly influence cell turgor, did not inhibit ¹⁴C-assimilate unloading. Based on these results, the authors suggest that the inhibition of unloading by high concentrations of sugar or mannitol was due to reduced pedicel cell turgor. Changes in pedicel cell turgor may play a role in the regulation of assimilate transfer within the maize kernel.
Regional Ocean Modeling System (ROMS): Main Hawaiian Islands: Data Assimilating
National Oceanic and Atmospheric Administration, Department of Commerce — Regional Ocean Modeling System (ROMS) 3-day, 3-hourly data assimilating hindcast for the region surrounding the main Hawaiian islands at approximately 4-km...
Weak Sequential Composition in Process Algebras
Rensink, Arend; Jonsson, B.; Parrow, J.; Wehrheim, H.
1994-01-01
In this paper we study a special operator for sequential composition, which is defined relative to a dependency relation over the actions of a given system. The idea is that actions which are not dependent (intuitively because they share no common resources) do not have to wait for one another to
Sequential Bayesian technique: An alternative approach for ...
Indian Academy of Sciences (India)
This paper proposes a sequential Bayesian approach, similar to the Kalman filter, for estimating the reliability growth or decay of software. The main advantage of the proposed method is that it shows the variation of the parameter over time, as new failure data become available. The usefulness of the method is demonstrated with ...
Sequential Bayesian technique: An alternative approach for ...
Indian Academy of Sciences (India)
MS received 8 October 2007; revised 15 July 2008. Abstract. This paper proposes a sequential Bayesian approach, similar to the Kalman filter, for estimating the reliability growth or decay of software. The main advantage of the proposed method is that it shows the variation of the parameter over time, as new failure data become ...
Fareed Zakaria's Democratic Sequentialism and Nigeria's ...
African Journals Online (AJOL)
This essay attempts to analyse the prospects for political and economic liberalisation, and ultimately democracy, in Nigeria by examining Fareed Zakaria's prescription of democratic sequentialism, i.e., liberalism before democracy. Zakaria's argument was that, due to cultural variation, different societies will require different ...
Early Astronomical Sequential Photography, 1873-1923
Bonifácio, Vitor
2011-11-01
In 1873 Jules Janssen conceived the first automatic sequential photographic apparatus to observe the eagerly anticipated 1874 transit of Venus. This device, the 'photographic revolver', is commonly considered today to be the earliest cinema precursor. In the following years, in order to study the variability or the motion of celestial objects, several instruments, either manually or automatically actuated, were devised to obtain as many photographs as possible of astronomical events in a short time interval. In this paper we strive to identify from the available documents the attempts made between 1873 and 1923, and discuss the motivations behind them and the results obtained. During the period studied, astronomical sequential photography was employed to determine the instants of contact in transits and occultations, and to study total solar eclipses. The technique was seldom used, but apparently the invention of the modern film camera played no role in this situation: astronomical sequential photographs were obtained both before and after 1895. We conclude that the development of astronomical sequential photography was constrained by the small number of subjects to which the technique could be applied.
Sensitivity Analysis in Sequential Decision Models.
Chen, Qiushi; Ayer, Turgay; Chhatwal, Jagpreet
2017-02-01
Sequential decision problems are frequently encountered in medical decision making and are commonly solved using Markov decision processes (MDPs). Modeling guidelines recommend conducting sensitivity analyses in decision-analytic models to assess the robustness of the model results against uncertainty in the model parameters. However, standard methods of conducting sensitivity analyses cannot be directly applied to sequential decision problems because this would require evaluating all possible decision sequences, typically on the order of trillions, which is not practically feasible. As a result, most MDP-based modeling studies do not examine confidence in their recommended policies. In this study, we provide an approach to estimate uncertainty and confidence in the results of sequential decision models. First, we provide a probabilistic univariate method to identify the most sensitive parameters in MDPs. Second, we present a probabilistic multivariate approach to estimate the overall confidence in the recommended optimal policy, considering joint uncertainty in the model parameters. We provide a graphical representation, which we call a policy acceptability curve, to summarize the confidence in the optimal policy by incorporating stakeholders' willingness to accept the base-case policy. For a cost-effectiveness analysis, we provide an approach to construct a cost-effectiveness acceptability frontier, which shows the most cost-effective policy as well as the confidence in it for a given willingness-to-pay threshold. We demonstrate our approach using a simple MDP case study. We developed a method to conduct sensitivity analysis in sequential decision models, which could increase the credibility of these models among stakeholders.
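The probabilistic multivariate idea can be made concrete with a minimal sketch (this is not the authors' method; the 2-state MDP numbers and the Dirichlet perturbation scheme are invented for illustration): sample transition matrices around a base case, re-solve the MDP each time, and report how often the optimal policy agrees with the base-case policy.

```python
import numpy as np

rng = np.random.default_rng(0)

def value_iteration(P, R, gamma=0.95, tol=1e-8):
    """Solve a small MDP; P has shape (A, S, S), R has shape (A, S).
    Returns the optimal action for each state."""
    A, S, _ = P.shape
    V = np.zeros(S)
    while True:
        Q = R + gamma * P @ V          # action-value table, shape (A, S)
        V_new = Q.max(axis=0)
        if np.max(np.abs(V_new - V)) < tol:
            return Q.argmax(axis=0)
        V = V_new

# Base-case 2-state, 2-action MDP (hypothetical numbers).
P0 = np.array([[[0.9, 0.1], [0.4, 0.6]],
               [[0.2, 0.8], [0.1, 0.9]]])
R0 = np.array([[1.0, 0.0],
               [0.5, 2.0]])
base_policy = value_iteration(P0, R0)

# Probabilistic sensitivity: perturb each transition row with a Dirichlet
# centred on the base case, and count how often the re-solved optimal
# policy matches the base-case policy.
n, agree = 500, 0
for _ in range(n):
    P = np.array([[rng.dirichlet(row * 50 + 1) for row in Pa] for Pa in P0])
    agree += np.array_equal(value_iteration(P, R0), base_policy)
confidence = agree / n   # fraction of samples agreeing with the base case
```

Sweeping the acceptance threshold over such confidence estimates is, in spirit, what the paper's policy acceptability curve summarizes.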
S.M.P. SEQUENTIAL MATHEMATICS PROGRAM.
CICIARELLI, V; LEONARD, JOSEPH
A sequential mathematics program beginning with the basic fundamentals at the fourth-grade level is presented. Included are an understanding of our number system and the basic operations of working with whole numbers: addition, subtraction, multiplication, and division. Common fractions are taught in the fifth, sixth, and seventh grades. A…
The curse of sequentiality in routing games
Correa, José; de Jong, Jasper; de Keijzer, Bart; Uetz, Marc Jochen; Markakis, Evangelos; Schäfer, Guido
2015-01-01
In "The curse of simultaneity", Paes Leme et al. show that there are interesting classes of games for which sequential decision making and the corresponding subgame perfect equilibria avoid worst-case Nash equilibria, resulting in substantial improvements to the price of anarchy. This is called the
A framework for sequential multiblock component methods
Smilde, A.K.; Westerhuis, J.A.; Jong, S.de
2003-01-01
Multiblock or multiset methods are starting to be used in chemistry and biology to study complex data sets. In chemometrics, sequential multiblock methods are popular; that is, methods that calculate one component at a time and use deflation for finding the next component. In this paper a framework
STABILIZED SEQUENTIAL QUADRATIC PROGRAMMING: A SURVEY
Directory of Open Access Journals (Sweden)
Damián Fernández
2014-12-01
Full Text Available We review the motivation for, the current state of the art in convergence results for, and some open questions concerning the stabilized version of the sequential quadratic programming algorithm for constrained optimization. We also discuss the tools required for its local convergence analysis, globalization challenges, and extensions of the method to more general variational problems.
Adult Word Recognition and Visual Sequential Memory
Holmes, V. M.
2012-01-01
Two experiments were conducted investigating the role of visual sequential memory skill in the word recognition efficiency of undergraduate university students. Word recognition was assessed in a lexical decision task using regularly and strangely spelt words, and nonwords that were either standard orthographically legal strings or items made from…
Interpretability degrees of finitely axiomatized sequential theories
Visser, Albert
In this paper we show that the degrees of interpretability of finitely axiomatized extensions-in-the-same-language of a finitely axiomatized sequential theory (like Elementary Arithmetic EA, IΣ1, or the Gödel-Bernays theory of sets and classes GB) have suprema. This partially answers a question posed
Interpretability Degrees of Finitely Axiomatized Sequential Theories
Visser, Albert
2012-01-01
In this paper we show that the degrees of interpretability of finitely axiomatized extensions-in-the-same-language of a finitely axiomatized sequential theory (like Elementary Arithmetic EA, IΣ1, or the Gödel-Bernays theory of sets and classes GB) have suprema. This partially answers a question
Sequential auctions for full truckload allocation
Mes, Martijn R.K.
2008-01-01
In this thesis we examine the use of sequential auctions for the dynamic allocation of transportation jobs. For all players, buyers and sellers, we develop strategies and examine their performance both in terms of individual benefits and with respect to the global logistical performance (resource
Antitwilight II: Monte Carlo simulations.
Richtsmeier, Steven C; Lynch, David K; Dearborn, David S P
2017-07-01
For this paper, we employ the Monte Carlo scene (MCScene) radiative transfer code to elucidate the underlying physics giving rise to the structure and colors of the antitwilight, i.e., twilight opposite the Sun. MCScene calculations successfully reproduce colors and spatial features observed in videos and still photos of the antitwilight taken under clear, aerosol-free sky conditions. Through simulations, we examine the effects of solar elevation angle, Rayleigh scattering, molecular absorption, aerosol scattering, multiple scattering, and surface reflectance on the appearance of the antitwilight. We also compare MCScene calculations with predictions made by the MODTRAN radiative transfer code for a solar elevation angle of +1°.
Joint Center for Satellite Data Assimilation Overview and Research Activities
Auligne, T.
2017-12-01
In 2001 NOAA/NESDIS, NOAA/NWS, NOAA/OAR, and NASA, subsequently joined by the US Navy and Air Force, came together to form the Joint Center for Satellite Data Assimilation (JCSDA) for the common purpose of accelerating the use of satellite data in environmental numerical prediction modeling by developing, using, and anticipating advances in numerical modeling, satellite-based remote sensing, and data assimilation methods. The primary focus was to bring these advances together to improve operational numerical model-based forecasting, under the premise that these partners have common technical and logistical challenges in assimilating satellite observations into their modeling enterprises that could be better addressed through cooperative action and/or common solutions. Over the last 15 years, the JCSDA has made and continues to make major contributions to the operational assimilation of satellite data. The JCSDA is a multi-agency U.S. government-owned-and-operated organization conceived as a venue for the partner agencies (NOAA, NASA, USAF, and USN) to collaborate on advancing the development and operational use of satellite observations in numerical model-based environmental analysis and forecasting. The primary mission of the JCSDA is to "accelerate and improve the quantitative use of research and operational satellite data in weather, ocean, climate and environmental analysis and prediction systems." This mission is fulfilled through directed research targeting the following key science objectives: improved radiative transfer modeling; new instrument assimilation; assimilation of humidity, cloud, and precipitation observations; assimilation of land surface observations; assimilation of ocean surface observations; atmospheric composition; and chemistry and aerosols. The goal of this presentation is to briefly introduce the JCSDA's mission and vision, and to describe recent research activities across the various JCSDA partners.
Monte Carlo techniques in radiation therapy
Verhaegen, Frank
2013-01-01
Modern cancer treatment relies on Monte Carlo simulations to help radiotherapists and clinical physicists better understand and compute radiation dose from imaging devices as well as exploit four-dimensional imaging data. With Monte Carlo-based treatment planning tools now available from commercial vendors, a complete transition to Monte Carlo-based dose calculation methods in radiotherapy could likely take place in the next decade. Monte Carlo Techniques in Radiation Therapy explores the use of Monte Carlo methods for modeling various features of internal and external radiation sources, including light ion beams. The book-the first of its kind-addresses applications of the Monte Carlo particle transport simulation technique in radiation therapy, mainly focusing on external beam radiotherapy and brachytherapy. It presents the mathematical and technical aspects of the methods in particle transport simulations. The book also discusses the modeling of medical linacs and other irradiation devices; issues specific...
Development of ionospheric data assimilation model under geomagnetic storm conditions
Lin, C. C. H.; Chen, C. H.; Chen, W.; Matsuo, T.
2016-12-01
This study attempts to construct an ionospheric data assimilation model for both quiet and storm-time conditions. The model assimilates radio occultation and ground-based GNSS observations of the global ionosphere using the Ensemble Kalman Filter (EnKF) software of the Data Assimilation Research Testbed (DART) together with the theoretical thermosphere-ionosphere-electrodynamics general circulation model (TIEGCM) developed by the National Center for Atmospheric Research (NCAR). Using DART-TIEGCM, we investigate the effects of rapid assimilation-forecast cycling for the 26 September 2011 geomagnetic storm period. The effects of various assimilation-forecast cycle lengths (60, 30, and 10 minutes) on the ionospheric forecast are examined using the global root-mean-square of the observation-minus-forecast (OmF) TEC residuals during the entire storm period. The examinations show that the 10-minute assimilation cycle greatly improves the quality of the model forecast under storm conditions. Additionally, the storm-time forecast quality is examined for the different high-latitude forcings given by the Heelis and Weimer empirical models.
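For reference, the analysis step of a stochastic Ensemble Kalman Filter of the kind DART provides can be sketched in a few lines. This is a toy linear example with invented numbers, not the DART/TIEGCM implementation:

```python
import numpy as np

rng = np.random.default_rng(42)

def enkf_analysis(X, y, H, R):
    """Stochastic (perturbed-observation) EnKF analysis step.
    X: (n_state, n_ens) forecast ensemble; y: (n_obs,) observation;
    H: (n_obs, n_state) observation operator; R: (n_obs, n_obs) obs error cov."""
    n_ens = X.shape[1]
    A = X - X.mean(axis=1, keepdims=True)        # ensemble anomalies
    PfHt = A @ (H @ A).T / (n_ens - 1)           # Pf H^T from the ensemble
    S = H @ PfHt + R                             # innovation covariance
    K = PfHt @ np.linalg.inv(S)                  # Kalman gain
    # Perturb the observation for each member (Burgers et al. variant).
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, n_ens).T
    return X + K @ (Y - H @ X)

# Toy setup: 3-component state, only the first component observed.
truth = np.array([1.0, 2.0, 3.0])
H = np.array([[1.0, 0.0, 0.0]])
R = np.array([[0.01]])
X = truth[:, None] + rng.normal(0.0, 1.0, (3, 200))  # forecast ensemble
y = np.array([truth[0]])
Xa = enkf_analysis(X, y, H, R)
# The analysed first component collapses toward the accurate observation.
```

In an assimilation-forecast cycle, `Xa` would be propagated by the model to the next observation time and the step repeated.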
Cholesterol Assimilation by Lactobacillus Probiotic Bacteria: An In Vitro Investigation
Directory of Open Access Journals (Sweden)
Catherine Tomaro-Duchesneau
2014-01-01
Full Text Available Excess cholesterol is associated with cardiovascular diseases (CVD), an important cause of mortality worldwide. Current CVD therapeutic measures, lifestyle and dietary interventions, and pharmaceutical agents for regulating cholesterol levels are inadequate. Probiotic bacteria have demonstrated potential to lower cholesterol levels by different mechanisms, including bile salt hydrolase activity, production of compounds that inhibit enzymes such as 3-hydroxy-3-methylglutaryl coenzyme A (HMG-CoA) reductase, and cholesterol assimilation. This work investigates 11 Lactobacillus strains for cholesterol assimilation. Probiotic strains for investigation were selected from the literature: Lactobacillus reuteri NCIMB 11951, L. reuteri NCIMB 701359, L. reuteri NCIMB 702655, L. reuteri NCIMB 701089, L. reuteri NCIMB 702656, Lactobacillus fermentum NCIMB 5221, L. fermentum NCIMB 8829, L. fermentum NCIMB 2797, Lactobacillus rhamnosus ATCC 53103 GG, Lactobacillus acidophilus ATCC 314, and Lactobacillus plantarum ATCC 14917. Cholesterol assimilation was investigated in culture media and under simulated intestinal conditions. The best cholesterol assimilator was L. plantarum ATCC 14917 (15.18 ± 0.55 mg/10¹⁰ cfu) in MRS broth. L. reuteri NCIMB 701089 assimilated over 67% (2254.70 ± 63.33 mg/10¹⁰ cfu) of the cholesterol, the most of all the strains, under intestinal conditions. This work demonstrates that probiotic bacteria can assimilate cholesterol under intestinal conditions, with L. reuteri NCIMB 701089 showing great potential as a CVD therapeutic.
IASI Radiance Data Assimilation in Local Ensemble Transform Kalman Filter
Cho, K.; Hyoung-Wook, C.; Jo, Y.
2016-12-01
The Korea Institute of Atmospheric Prediction Systems (KIAPS) is developing an NWP model with data assimilation systems. The Local Ensemble Transform Kalman Filter (LETKF) system, one of these data assimilation systems, has been developed for the KIAPS Integrated Model (KIM), which is based on a cubed-sphere grid, and has successfully assimilated real data. The LETKF data assimilation system has been extended to 4D-LETKF, which considers time-evolving error covariance within the assimilation window, and to IASI radiance data assimilation using KPOP (the KIAPS package for observation processing) with RTTOV (Radiative Transfer for TOVS). The LETKF system has been running semi-operational predictions including conventional (sonde, aircraft) observations and AMSU-A (Advanced Microwave Sounding Unit-A) radiance data since April. Recently, in July, the semi-operational prediction system updated its radiance observations to include GPS-RO, AMV, and IASI (Infrared Atmospheric Sounding Interferometer) data. A set of simulations of KIM at ne30np4 resolution with 50 vertical levels (model top at 0.3 hPa) was carried out for short-range (10-day) forecasts within the semi-operational LETKF prediction system with a 50-member ensemble forecast. In order to isolate the IASI impact, our experiments added only conventional and IASI radiance data to the same semi-operational prediction setup. We carried out sensitivity tests for the IASI thinning method (3D and 4D). The number of IASI observations was increased by temporal (4D) thinning, and an improved impact of IASI radiance data on the forecast skill of the model is expected.
Plans for Assimilating AQUA data at NASA's DAO
Frank, Donald; Joiner, Joanna; Atlas, Robert; Stajner, Ivanka
2002-01-01
NASA's Data Assimilation Office (DAO) is expanding its work with the TIROS Operational Vertical Sounder (TOVS) to assimilate data from the advanced instruments which will fly on NASA's AQUA satellite in early 2002. The Atmospheric Infrared Sounder (AIRS), which has over 2000 channels, together with the Advanced Microwave Sounding Unit-A (AMSU-A) and the Humidity Sounder for Brazil (HSB), will provide many technical challenges for data assimilation centers. One of the primary concerns is how best to subset the data in order to efficiently extract information about the Earth's atmosphere and surface. This includes static and dynamic channel selection as well as pixel thinning. The DAO is currently experimenting with simulated AIRS/AMSU/HSB radiances within the framework of our finite volume data assimilation system (NDAS), using the OPTRAN radiative transfer code developed as part of the NOAA/NASA Joint Center for Satellite Data Assimilation. The short-term goals include the assessment of the cost of processing various data subsets and preparation for near-real-time assimilation within a few months of launch. We will also discuss plans and tools for evaluating the quality of AIRS data, including radiances and level 2 products from the AIRS science team.
Monte Carlo simulations of neutron scattering instruments
International Nuclear Information System (INIS)
Åstrand, Per-Olof; Copenhagen Univ.; Lefmann, K.; Nielsen, K.
2001-01-01
A Monte Carlo simulation is an important computational tool used in many areas of science and engineering. The use of Monte Carlo techniques for simulating neutron scattering instruments is discussed. The basic ideas, techniques and approximations are presented. Since the construction of a neutron scattering instrument is very expensive, Monte Carlo software used for the design of instruments has to be validated and tested extensively. The McStas software was designed with these aspects in mind, and some of its basic principles are discussed. Finally, some future prospects are discussed for using Monte Carlo simulations in optimizing neutron scattering experiments. (R.P.)
Status of Monte Carlo dose planning
International Nuclear Information System (INIS)
Mackie, T.R.
1995-01-01
Monte Carlo simulation will become increasingly important for treatment planning for radiotherapy. The EGS4 Monte Carlo system, a general particle transport system, has been used most often for simulation tasks in radiotherapy, although ETRAN/ITS and MCNP have also been used. Monte Carlo treatment planning requires that the beam characteristics, such as the energy spectrum and angular distribution of particles emerging from clinical accelerators, be accurately represented. An EGS4 Monte Carlo code, called BEAM, was developed by the OMEGA Project (a collaboration between the University of Wisconsin and the National Research Council of Canada) to transport particles through linear accelerator heads. This information was used as input to simulate the passage of particles through CT-based representations of phantoms or patients using both an EGS4 code (DOSXYZ) and the macro Monte Carlo (MMC) method. Monte Carlo computed 3-D electron beam dose distributions compare well to measurements obtained in simple and complex heterogeneous phantoms. The present drawback with most Monte Carlo codes is that simulation times are slower than those of most non-stochastic dose computation algorithms. This is especially true for photon dose planning. In the future, dedicated Monte Carlo treatment planning systems like Peregrine (from Lawrence Livermore National Laboratory), which will be capable of computing the dose from all beam types, or the Macro Monte Carlo (MMC) system, which is an order of magnitude faster than other algorithms, may dominate the field.
Khaki, M; Forootan, E; Kuhn, M; Awange, J; Papa, F; Shum, C K
2018-06-01
Climate change can significantly influence terrestrial water changes around the world, particularly in places that have proven to be more vulnerable, such as Bangladesh. In the past few decades, climate impacts, together with those of excessive human water use, have changed the country's water availability structure. In this study, we use multi-mission remotely sensed measurements along with a hydrological model to separately analyze groundwater and soil moisture variations for the period 2003-2013, and their interactions with rainfall in Bangladesh. To improve the model's estimates of water storages, terrestrial water storage (TWS) data obtained from the Gravity Recovery And Climate Experiment (GRACE) satellite mission are assimilated into the World-Wide Water Resources Assessment (W3RA) model using the ensemble-based sequential technique of the Square Root Analysis (SQRA) filter. We investigate the capability of the data assimilation approach to use a non-regional hydrological model for a regional case study. Based on these estimates, we investigate relationships between the model-derived sub-surface water storage changes and remotely sensed precipitation, as well as altimetry-derived river level variations in Bangladesh, by applying the empirical mode decomposition (EMD) method. A larger correlation is found between river level heights and rainfall (78% on average) than between groundwater storage variations and rainfall (57% on average). The results indicate a significant decline in groundwater storage (∼32% reduction) for Bangladesh between 2003 and 2013, which is equivalent to an average rate of 8.73 ± 2.45 mm/year.
Wang, C.; Rubin, Y.
2014-12-01
The spatial distribution of the compression modulus Es, an important geotechnical parameter, contributes considerably to the understanding of the underlying geological processes and to the adequate assessment of the mechanical effects of Es on the differential settlement of large continuous structure foundations. These analyses should be derived using an assimilating approach that combines in-situ static cone penetration tests (CPT) with borehole experiments. To achieve such a task, the Es distribution of a stratum of silty clay in region A of the China Expo Center (Shanghai) is studied using the Bayesian maximum entropy method. This method rigorously and efficiently integrates geotechnical investigations of different precisions and sources of uncertainty. Single CPT samplings were modeled as a rational probability density curve by maximum entropy theory. The spatial prior multivariate probability density function (PDF) and the likelihood PDF at the CPT positions were built from the borehole experiments and the potential value of the prediction point; then, after numerical integration over the CPT probability density curves, the posterior probability density curve of the prediction point was calculated within the Bayesian inverse interpolation framework. The results of the Gaussian Sequential Stochastic Simulation and Bayesian methods were compared. The differences between single CPT samplings under a normal distribution and the simulated probability density curve based on maximum entropy theory were also discussed. It is shown that the study of Es spatial distributions can be improved by properly incorporating CPT sampling variation into the interpolation process, and more informative estimates are generated by considering CPT uncertainty at the estimation points. The calculation illustrates the significance of stochastic Es characterization in a stratum and identifies limitations associated with inadequate geostatistical interpolation techniques. This characterization result will provide a multi
MCOR - Monte Carlo depletion code for reference LWR calculations
Energy Technology Data Exchange (ETDEWEB)
Puente Espel, Federico, E-mail: fup104@psu.edu [Department of Mechanical and Nuclear Engineering, Pennsylvania State University (United States); Tippayakul, Chanatip, E-mail: cut110@psu.edu [Department of Mechanical and Nuclear Engineering, Pennsylvania State University (United States); Ivanov, Kostadin, E-mail: kni1@psu.edu [Department of Mechanical and Nuclear Engineering, Pennsylvania State University (United States); Misu, Stefan, E-mail: Stefan.Misu@areva.com [AREVA, AREVA NP GmbH, Erlangen (Germany)
2011-04-15
Research highlights: > Introduction of a reference Monte Carlo based depletion code with extended capabilities. > Verification and validation results for MCOR. > Utilization of MCOR for benchmarking deterministic lattice physics (spectral) codes. - Abstract: The MCOR (MCnp-kORigen) code system is a Monte Carlo based depletion system for reference fuel assembly and core calculations. The MCOR code is designed as an interfacing code that provides depletion capability to the LANL Monte Carlo code by coupling two codes: MCNP5 with the AREVA NP depletion code, KORIGEN. The physical quality of both codes is unchanged. The MCOR code system has been maintained and continuously enhanced since it was initially developed and validated. The verification of the coupling was made by evaluating the MCOR code against similar sophisticated code systems like MONTEBURNS, OCTOPUS and TRIPOLI-PEPIN. After its validation, the MCOR code has been further improved with important features. The MCOR code presents several valuable capabilities such as: (a) a predictor-corrector depletion algorithm, (b) utilization of KORIGEN as the depletion module, (c) individual depletion calculation of each burnup zone (no burnup zone grouping is required, which is particularly important for the modeling of gadolinium rings), and (d) on-line burnup cross-section generation by the Monte Carlo calculation for 88 isotopes and usage of the KORIGEN libraries for PWR and BWR typical spectra for the remaining isotopes. Besides the just mentioned capabilities, the MCOR code newest enhancements focus on the possibility of executing the MCNP5 calculation in sequential or parallel mode, a user-friendly automatic re-start capability, a modification of the burnup step size evaluation, and a post-processor and test-matrix, just to name the most important. The article describes the capabilities of the MCOR code system; from its design and development to its latest improvements and further ameliorations. Additionally
MCOR - Monte Carlo depletion code for reference LWR calculations
International Nuclear Information System (INIS)
Puente Espel, Federico; Tippayakul, Chanatip; Ivanov, Kostadin; Misu, Stefan
2011-01-01
Research highlights: → Introduction of a reference Monte Carlo based depletion code with extended capabilities. → Verification and validation results for MCOR. → Utilization of MCOR for benchmarking deterministic lattice physics (spectral) codes. - Abstract: The MCOR (MCnp-kORigen) code system is a Monte Carlo based depletion system for reference fuel assembly and core calculations. The MCOR code is designed as an interfacing code that provides depletion capability to the LANL Monte Carlo code by coupling two codes: MCNP5 with the AREVA NP depletion code, KORIGEN. The physical quality of both codes is unchanged. The MCOR code system has been maintained and continuously enhanced since it was initially developed and validated. The verification of the coupling was made by evaluating the MCOR code against similar sophisticated code systems like MONTEBURNS, OCTOPUS and TRIPOLI-PEPIN. After its validation, the MCOR code has been further improved with important features. The MCOR code presents several valuable capabilities such as: (a) a predictor-corrector depletion algorithm, (b) utilization of KORIGEN as the depletion module, (c) individual depletion calculation of each burnup zone (no burnup zone grouping is required, which is particularly important for the modeling of gadolinium rings), and (d) on-line burnup cross-section generation by the Monte Carlo calculation for 88 isotopes and usage of the KORIGEN libraries for PWR and BWR typical spectra for the remaining isotopes. Besides the just mentioned capabilities, the MCOR code newest enhancements focus on the possibility of executing the MCNP5 calculation in sequential or parallel mode, a user-friendly automatic re-start capability, a modification of the burnup step size evaluation, and a post-processor and test-matrix, just to name the most important. The article describes the capabilities of the MCOR code system; from its design and development to its latest improvements and further ameliorations
Monte Carlo simulation based reliability evaluation in a multi-bilateral contracts market
International Nuclear Information System (INIS)
Goel, L.; Viswanath, P.A.; Wang, P.
2004-01-01
This paper presents a time-sequential Monte Carlo simulation technique to evaluate customer load-point reliability in a multi-bilateral contracts market. The effects of bilateral transactions, reserve agreements, and the priority commitments of generating companies on customer load-point reliability have been investigated. A generating company with bilateral contracts is modelled as an equivalent time-varying multi-state generation (ETMG) unit. A procedure to determine load-point reliability based on the ETMG has been developed, and is applied to a reliability test system to illustrate the technique. Representing each bilateral contract by an ETMG provides flexibility in determining the reliability at various customer load points. (authors)
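The flavor of a time-sequential reliability simulation can be shown with a toy system (the numbers and the two-state per-unit model are invented for illustration; the paper's ETMG model is more elaborate): each generating unit alternates between up and down states hour by hour, and hours in which surviving capacity falls short of load accumulate into a loss-of-load expectation (LOLE).

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy system: 3 identical 50 MW units serving a constant 80 MW load.
n_units, unit_mw, load_mw = 3, 50.0, 80.0
p_fail, p_repair = 1.0 / 1000.0, 1.0 / 50.0  # hourly transition probabilities
years, hours = 50, 8760

lole = 0.0  # loss-of-load hours, summed over all simulated years
for _ in range(years):
    up = np.ones(n_units, dtype=bool)  # all units start in service
    for _ in range(hours):
        u = rng.random(n_units)
        # in-service units may fail; failed units may be repaired
        up = np.where(up, u > p_fail, u < p_repair)
        if up.sum() * unit_mw < load_mw:
            lole += 1
lole /= years  # average loss-of-load hours per year
```

Because the simulation is chronological, contract-specific dispatch rules or time-varying loads can be inserted directly into the inner loop, which is the flexibility the paper exploits.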
Remarks on a financial inverse problem by means of Monte Carlo Methods
Cuomo, Salvatore; Di Somma, Vittorio; Sica, Federica
2017-10-01
Estimating the price of a barrier option is a typical inverse problem. In this paper we present a numerical and statistical framework for a market with a risk-free interest rate and a risky asset described by a geometric Brownian motion (GBM). After approximating the risky asset with a numerical method, we find the final option price by following an approach based on sequential Monte Carlo methods. All theoretical results are applied to the case of an option whose underlying is a real stock.
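As a baseline for such a framework, a plain Monte Carlo estimate of a discretely monitored down-and-out call under GBM fits in a few lines (all parameters are invented; the paper's sequential Monte Carlo approach is more involved):

```python
import numpy as np

rng = np.random.default_rng(1)

# Down-and-out European call under GBM, discrete daily monitoring.
s0, strike, barrier = 100.0, 100.0, 80.0
r, sigma, T = 0.03, 0.2, 1.0
n_paths, n_steps = 20_000, 252
dt = T / n_steps

# Simulate log-price increments and build the price paths.
z = rng.standard_normal((n_paths, n_steps))
increments = (r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z
s = s0 * np.exp(np.cumsum(increments, axis=1))

alive = s.min(axis=1) > barrier               # knocked out if barrier touched
payoff = np.where(alive, np.maximum(s[:, -1] - strike, 0.0), 0.0)
price = np.exp(-r * T) * payoff.mean()        # discounted expected payoff
```

With the barrier well below the spot, the estimate sits a little below the corresponding vanilla Black-Scholes call price, reflecting the paths that touch the barrier and are knocked out.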
Assimilative domain proficiency and performance in chemistry coursework
Byrnes, Scott William
The assimilation and synthesis of knowledge is essential for students to be successful in chemistry, yet not all students synthesize knowledge as intended. The study used the Learning Preference Checklist to classify students into one of three learning modalities: visual, auditory, or kinesthetic (VAK). It also used the Kolb Learning Style Inventory (KLSI), which utilizes four learning domains (Converging, Accommodating, Diverging, and Assimilating) to explain the students' maturation process by showing a shift from any domain towards the Assimilating domain. A shift approaching this domain was considered an improvement in the assimilation and synthesis of knowledge. This pre-experimental one-group pretest-posttest study tested the hypothesis that modifying a high school chemistry curriculum to accentuate a student's learning preference would result in a shift towards the Assimilative domain on the KLSI, and whether there was a correlation between the improvement in student learning and a shift towards the KLSI Assimilating domain. Forty-two high school students were administered the VAK and provided with differentiated instruction via homologous cooperative learning groups. Pre- and post-KLSI and chemistry concepts tests were administered. T-test analyses showed no significant shift towards the Assimilating domain. Further Pearson's r analyses showed no significant correlation between the KLSI and exam scores. This study contributes to social change by providing empirical evidence related to the effectiveness of infusing learning styles into the science curriculum and of integrating the KLSI to monitor cognitive development as tools for raising standardized test scores and enhancing academic achievement. Results from the study can also inform future research into learning styles through their incorporation into the science curriculum.
Assimilation of GPM GMI Rainfall Product with WRF GSI
Li, Xuanli; Mecikalski, John; Zavodsky, Bradley
2015-01-01
The Global Precipitation Measurement (GPM) mission is an international mission to provide next-generation observations of rain and snow worldwide. GPM builds on the Tropical Rainfall Measuring Mission (TRMM) legacy, while its core observatory extends the observations to higher latitudes. The GPM observations can help advance our understanding of precipitation microphysics and storm structures. Launched on February 27th, 2014, the GPM core observatory carries advanced instruments that can be used to quantify when, where, and how much it rains or snows around the world. The use of GPM data in numerical modeling is therefore a new area that will have a broad impact in both the research and operational communities. The goal of this research is to examine the methodology of assimilating the GPM retrieved products. The data assimilation system used in this study is the community Gridpoint Statistical Interpolation (GSI) system for the Weather Research and Forecasting (WRF) model developed by the Developmental Testbed Center (DTC). The community GSI system runs independently of any operational environment, yet is functionally equivalent to the systems run at operational centers. In collaboration with the NASA Short-term Prediction Research and Transition (SPoRT) Center, this research explores regional assimilation of the GPM products with case studies. Our presentation will highlight our recent effort on the assimilation of the GPM product 2AGPROFGMI, the retrieved GPM Microwave Imager (GMI) rainfall rate data, for initializing a real convective storm. WRF model simulations and storm-scale data assimilation experiments will be examined, emphasizing both model initialization and short-term forecasts of precipitation fields and processes. In addition, discussion will be provided on the development of enhanced assimilation procedures in the GSI system with respect to other GPM products. Further details of the methodology of data assimilation, preliminary result and test on the impact of GPM data and the
DART: A Community Facility for Ensemble Data Assimilation
Hoar, T. J.; Raeder, K.; Anderson, J. L.; Collins, N.; Liu, H.; Romine, G.; Arellano, A. F.; Lawson, G.
2009-12-01
The Data Assimilation Research Testbed (DART) is a mature community software facility providing researchers access to state-of-the-art ensemble data assimilation tools. The freely available DART distribution includes fully functional low-order and high-order models, support for commonly available observations, hooks to easily add both new models and observation types, diagnostic programs to interpret the results, and a full tutorial suitable for self-study or teaching data assimilation concepts, including exercises using the models distributed with DART. DART is used regularly with a number of geophysical models, including NCAR's WRF and CAM atmospheric models. DART/WRF is being used for tropical storm analysis and prediction in the Pacific and Atlantic and was used to produce real-time predictions during the 2009 Atlantic hurricane season. DART/CAM has played an integral part in the development of the new CAM version 4 that will be used for NCAR's contribution to the next IPCC assessment. DART/CAM has been run for many model configurations to evaluate CAM systematic errors and parameterization options. DART is also in use for chemical assimilation in the WRF-CHEM and CAM-CHEM versions of these models. New models, both small and large, continue to be added to the set compatible with DART. During 2009, DART assimilation was developed for the POP (Parallel Ocean Program) ocean general circulation model that is being used for decadal coupled atmosphere/ocean predictions at NCAR. The newest version of the Planet WRF model, configured for Martian data assimilation, is also now in use with DART. Novel observation types also continue to be added to DART. For instance, assimilation capabilities for radiance observations from the COSMIC and MOPITT instruments on Earth and from TES on Mars were added in 2009.
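The analysis step at the heart of an ensemble facility like DART can be sketched in a few lines. Note the hedge: DART's default algorithm is a deterministic ensemble adjustment Kalman filter; the classic stochastic (perturbed-observation) variant below is shown only because it is the shortest complete ensemble update to write down.

```python
import random

def enkf_update(ensemble, obs, obs_var, obs_index, rng):
    """Stochastic EnKF analysis for one scalar observation of state
    component `obs_index`. `ensemble` is a list of state vectors."""
    n = len(ensemble)
    dims = len(ensemble[0])
    hx = [m[obs_index] for m in ensemble]
    hx_bar = sum(hx) / n
    var_hx = sum((v - hx_bar) ** 2 for v in hx) / (n - 1)
    means = [sum(m[j] for m in ensemble) / n for j in range(dims)]
    # Kalman gain per state component from sample covariances with H(x).
    gain = []
    for j in range(dims):
        cov_j = sum((m[j] - means[j]) * (m[obs_index] - hx_bar)
                    for m in ensemble) / (n - 1)
        gain.append(cov_j / (var_hx + obs_var))
    updated = []
    for m in ensemble:
        # Each member is nudged toward its own perturbed observation.
        y = obs + rng.gauss(0.0, obs_var ** 0.5)
        updated.append([m[j] + gain[j] * (y - m[obs_index])
                        for j in range(dims)])
    return updated
```

After the update the ensemble mean moves toward the observation and the ensemble spread shrinks, which is the behavior any Kalman-type analysis should exhibit.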
Manzini, Paola; Mariotti, Marco
2004-01-01
A sequentially rationalizable choice function is a choice function which can be obtained by applying sequentially a fixed set of asymmetric binary relations (rationales). A Rational Shortlist Method (RSM) is a choice function which is sequentially rationalizable by two rationales. These concepts translate into economic language some human choice heuristics studied in psychology. We provide a full characterization of RSMs and study some properties of sequential rationalizability. These properti...
Sequential shrink photolithography for plastic microlens arrays
Dyer, David; Shreim, Samir; Jayadev, Shreshta; Lew, Valerie; Botvinick, Elliot; Khine, Michelle
2011-01-01
Endeavoring to push the boundaries of microfabrication with shrinkable polymers, we have developed a sequential shrink photolithography process. We demonstrate the utility of this approach by rapidly fabricating plastic microlens arrays. First, we create a mask out of the children’s toy Shrinky Dinks by simply printing dots using a standard desktop printer. Upon retraction of this pre-stressed thermoplastic sheet, the dots shrink to a fraction of their original size, which we then lithographically transfer onto photoresist-coated commodity shrink wrap film. This shrink film reduces in area by 95% when briefly heated, creating smooth convex photoresist bumps down to 30 µm. Taken together, this sequential shrink process provides a complete process to create microlenses, with an almost 99% reduction in area from the original pattern size. Finally, with a lithography molding step, we emboss these bumps into optical grade plastics such as cyclic olefin copolymer for functional microlens arrays. PMID:21863126
Directory of Open Access Journals (Sweden)
C. E. Chung
2010-07-01
Full Text Available An estimate of monthly 3-D aerosol solar heating rates and surface solar fluxes in Asia from 2001 to 2004 is described here. This product stems from an Asian aerosol assimilation project, in which (a) the PNNL regional model bounded by the NCEP reanalyses was used to provide meteorology, (b) MODIS and AERONET data were integrated for aerosol observations, (c) the Iowa aerosol/chemistry model STEM-2K1 used the PNNL meteorology and assimilated aerosol observations, and (d) 3-D (X-Y-Z) aerosol simulations from the STEM-2K1 were used in the Scripps Monte-Carlo Aerosol Cloud Radiation (MACR) model to produce total and anthropogenic aerosol direct solar forcing for average cloudy skies. The MACR model and STEM-2K1 both used the PNNL model resolution of 0.45°×0.4° in the horizontal and of 23 layers in the troposphere.
The 2001–2004 averaged anthropogenic all-sky aerosol forcing is −1.3 Wm^{−2} (TOA), +7.3 Wm^{−2} (atmosphere) and −8.6 Wm^{−2} (surface), averaged in Asia (60–138° E and Equator–45° N). In the absence of AERONET SSA assimilation, the absorbing aerosol concentration (especially BC aerosol) is much smaller, giving −2.3 Wm^{−2} (TOA), +4.5 Wm^{−2} (atmosphere) and −6.8 Wm^{−2} (surface), averaged in Asia. In the vertical, monthly forcing is mainly concentrated below 600 hPa with a maximum around 800 hPa. Seasonally, low-level forcing is far larger in the dry season than in the wet season in South Asia, whereas the wet season forcing exceeds the dry season forcing in East Asia. The anthropogenic forcing in the present study is similar to that in Chung et al. (2005) in overall magnitude, but the former offers fine-scale features and simulated vertical profiles. The interannual variability of the computed anthropogenic forcing is significant and extremely large over major emission outflow areas. Given the interannual variability, the present study's estimate is within the implicated range of
Monte Carlo Simulation of Phase Transitions
Murai, N. (村井信行); College of Liberal Arts, Chukyo University
1983-01-01
In the Monte Carlo simulation of phase transitions, a simple heat bath method is applied to the classical Heisenberg model in two dimensions. It reproduces the correlation length predicted by the Monte Carlo renormalization group and also computed in the non-linear σ model.
Advanced Computational Methods for Monte Carlo Calculations
Energy Technology Data Exchange (ETDEWEB)
Brown, Forrest B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2018-01-12
This course is intended for graduate students who already have a basic understanding of Monte Carlo methods. It focuses on advanced topics that may be needed for thesis research, for developing new state-of-the-art methods, or for working with modern production Monte Carlo codes.
Monte Carlo lattice program KIM
International Nuclear Information System (INIS)
Cupini, E.; De Matteis, A.; Simonini, R.
1980-01-01
The Monte Carlo program KIM solves the steady-state linear neutron transport equation for a fixed-source problem or, by successive fixed-source runs, for the eigenvalue problem, in a two-dimensional thermal reactor lattice. Fluxes and reaction rates are the main quantities computed by the program, from which power distribution and few-group averaged cross sections are derived. The simulation ranges from 10 MeV to zero and includes anisotropic and inelastic scattering in the fast energy region, the epithermal Doppler broadening of the resonances of some nuclides, and the thermalization phenomenon by taking into account the thermal velocity distribution of some molecules. Besides the well known combinatorial geometry, the program allows complex configurations to be represented by a discrete set of points, an approach greatly improving calculation speed
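The analog transport game a code like KIM plays can be illustrated on a deliberately stripped-down problem: one energy group, a 1-D slab, isotropic scattering, no combinatorial geometry. All cross-section values in the usage are illustrative.

```python
import math
import random

def slab_transmission(sigma_t, scatter_prob, thickness,
                      histories=20000, seed=3):
    """Analog Monte Carlo for a homogeneous 1-D slab.

    Mono-energetic neutrons enter at x = 0 travelling in +x. Free
    flight lengths are exponential with total cross section sigma_t;
    at each collision the neutron scatters isotropically (probability
    scatter_prob) or is absorbed. Returns the transmission fraction.
    """
    rng = random.Random(seed)
    transmitted = 0
    for _ in range(histories):
        x, mu = 0.0, 1.0
        while True:
            # Sample an exponential free flight along direction mu.
            x += mu * (-math.log(rng.random()) / sigma_t)
            if x >= thickness:
                transmitted += 1
                break
            if x < 0.0:
                break                       # leaked back out (reflected)
            if rng.random() > scatter_prob:
                break                       # absorbed
            mu = 2.0 * rng.random() - 1.0   # isotropic: mu uniform on [-1, 1]
    return transmitted / histories
```

A useful sanity check: with scattering switched off, the transmission must reproduce the analytic attenuation exp(-sigma_t * thickness).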
Robust Sequential Analysis for Special Capacities
Takahashi, H. (高橋一)
1985-01-01
Huber-type robustness will be considered for some extensions of Wald's Sequential Probability Ratio Test, including Wald's three-decision problem and the Kiefer-Weiss formulation. The results of Huber (1965, 1968), Huber and Strassen (1973), Rieder (1977) and Österreicher (1978) will be extended to derive a least favorable tuple in the multiple decision problem. And then the asymptotically least favorable Kiefer-Weiss procedure together with its asymptotic relative efficiency for the s-contami...
Sleep memory processing: the sequential hypothesis
Giuditta, Antonio
2014-01-01
According to the sequential hypothesis (SH), memories acquired during wakefulness are processed during sleep in two serial steps, respectively occurring during slow wave sleep (SWS) and rapid eye movement (REM) sleep. During SWS, memories to be retained are distinguished from irrelevant or competing traces that undergo downgrading or elimination. Processed memories are stored again during REM sleep, which integrates them with preexisting memories. The hypothesis received support from a wealth of ...
Sequential neural models with stochastic layers
DEFF Research Database (Denmark)
Fraccaro, Marco; Sønderby, Søren Kaae; Paquet, Ulrich
2016-01-01
How can we efficiently propagate uncertainty in a latent state representation with recurrent neural networks? This paper introduces stochastic recurrent neural networks which glue a deterministic recurrent neural network and a state space model together to form a stochastic and sequential neural...... generative model. The clear separation of deterministic and stochastic layers allows a structured variational inference network to track the factorization of the model's posterior distribution. By retaining both the nonlinear recursive structure of a recurrent neural network and averaging over...
On Locally Most Powerful Sequential Rank Tests
Czech Academy of Sciences Publication Activity Database
Kalina, Jan
2017-01-01
Roč. 36, č. 1 (2017), s. 111-125 ISSN 0747-4946 R&D Projects: GA ČR GA17-07384S Grant - others:Nadační fond na podporu vědy(CZ) Neuron Institutional support: RVO:67985807 Keywords : nonparametric tests * sequential ranks * stopping variable Subject RIV: BA - General Mathematics OBOR OECD: Pure mathematics Impact factor: 0.339, year: 2016
Sequential pattern recognition by maximum conditional informativity
Czech Academy of Sciences Publication Activity Database
Grim, Jiří
2014-01-01
Roč. 45, č. 1 (2014), s. 39-45 ISSN 0167-8655 R&D Projects: GA ČR(CZ) GA14-02652S; GA ČR(CZ) GA14-10911S Keywords : Multivariate statistics * Statistical pattern recognition * Sequential decision making * Product mixtures * EM algorithm * Shannon information Subject RIV: IN - Informatics, Computer Science Impact factor: 1.551, year: 2014 http://library.utia.cas.cz/separaty/2014/RO/grim-0428565.pdf
Sequential tests for near-real-time accounting
International Nuclear Information System (INIS)
Cobb, D.D.
1981-01-01
Statistical hypothesis testing is used in the analysis of nuclear materials accounting data for evidence of diversion. Sequential hypothesis testing is particularly well suited for analyzing data that arise sequentially in time from near-real-time accounting systems. The properties of selected sequential tests adapted for this application are described. 10 figures, 12 tables
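Wald's sequential probability ratio test, the basic building block of the sequential tests this record describes, is short enough to state in full. The sketch below tests for a shift in the mean of Gaussian material-balance data; the thresholds and the diversion scenario in the usage are illustrative, not taken from the report.

```python
import math

def sprt(observations, mu0, mu1, sigma, alpha=0.05, beta=0.05):
    """Wald's SPRT for H0: mean = mu0 vs H1: mean = mu1 (known sigma).

    Returns (decision, number of observations used). Data are examined
    one at a time, so a decision can be reached as soon as the evidence
    is strong enough, which suits near-real-time accounting.
    """
    lower = math.log(beta / (1.0 - alpha))
    upper = math.log((1.0 - beta) / alpha)
    llr = 0.0
    for n, x in enumerate(observations, start=1):
        # Gaussian log-likelihood ratio increment for this observation.
        llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2.0 * sigma ** 2)
        if llr >= upper:
            return "reject H0 (possible diversion)", n
        if llr <= lower:
            return "accept H0", n
    return "continue sampling", len(observations)
```

The appeal for accounting data is that clear evidence, in either direction, terminates sampling early instead of waiting for a fixed sample size.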
Background Error Statistics for Assimilation of Atmospheric CO2
Chatterjee, A.; Engelen, R. J.; Kawa, S. R.; Sweeney, C.; Michalak, A. M.
2012-12-01
Recent improvements in the CO2 observational density have spurred the development and application of data assimilation systems for extracting information about global CO2 distributions from available observations. A novel application that has been pursued at the European Centre for Medium-Range Weather Forecasts (ECMWF), as part of the Monitoring Atmospheric Composition and Climate (MACC) project, is to use a state-of-the-art 4DVAR system to assimilate CO2 observations, along with meteorological variables to obtain a consistent estimate of atmospheric CO2 concentrations. Global CO2 fields generated in this way enhance the observational database, because the data assimilation procedure uses physical and dynamical laws, along with the available observations, to constrain the analysis. As in any data assimilation framework, the background error covariance matrix plays the critical role of filtering the observed information and propagating it to nearby grid points and levels of the assimilating model. For atmospheric CO2 assimilation, however, the errors in the background are not only impacted by the uncertainties in the CO2 transport but also by the spatial and temporal variability of the carbon exchange at the Earth surface. The background errors cannot be prescribed via traditional forecast-based methods as these fail to account for the uncertainties in the carbon emissions and uptake, resulting in an overall underestimation of the errors. We present a unique approach for characterizing the background error statistics whereby the differences between two CO2 model concentrations are used as a proxy for the statistics of the background errors. The resulting error statistics - 1) vary regionally and seasonally to better capture the changing degree of variability in the background CO2 field, 2) are independent of the observation density, and 3) have a discernible impact on the analysis estimates by allowing observations to adjust predictions over a larger area. In this
The MC21 Monte Carlo Transport Code
International Nuclear Information System (INIS)
Sutton TM; Donovan TJ; Trumbull TH; Dobreff PS; Caro E; Griesheimer DP; Tyburski LJ; Carpenter DC; Joo H
2007-01-01
MC21 is a new Monte Carlo neutron and photon transport code currently under joint development at the Knolls Atomic Power Laboratory and the Bettis Atomic Power Laboratory. MC21 is the Monte Carlo transport kernel of the broader Common Monte Carlo Design Tool (CMCDT), which is also currently under development. The vision for CMCDT is to provide an automated, computer-aided modeling and post-processing environment integrated with a Monte Carlo solver that is optimized for reactor analysis. CMCDT represents a strategy to push the Monte Carlo method beyond its traditional role as a benchmarking tool or "tool of last resort" and into a dominant design role. This paper describes various aspects of the code, including the neutron physics and nuclear data treatments, the geometry representation, and the tally and depletion capabilities
Monte Carlo simulation in nuclear medicine
International Nuclear Information System (INIS)
Morel, Ch.
2007-01-01
The Monte Carlo method allows for simulating random processes by using series of pseudo-random numbers. It has become an important tool in nuclear medicine to assist in the design of new medical imaging devices, optimise their use and analyse their data. Presently, the sophistication of the simulation tools allows the introduction of Monte Carlo predictions in data correction and image reconstruction processes. The ability to simulate time-dependent processes opens up new horizons for Monte Carlo simulation in nuclear medicine. In the near future, these developments will allow imaging and dosimetry issues to be tackled simultaneously, and case-specific Monte Carlo simulations may soon become part of the nuclear medicine diagnostic process. This paper describes some Monte Carlo method basics and the sampling methods that were developed for it. It gives a referenced list of different simulation software used in nuclear medicine and enumerates some of their present and prospective applications. (author)
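The two workhorse sampling methods the abstract alludes to, inverse-transform and rejection sampling, are easy to show concretely. The target densities below are generic illustrations (an exponential free path and a sine-shaped angular density), not specific to any nuclear medicine package.

```python
import math
import random

rng = random.Random(7)

def sample_path_length(sigma):
    """Inverse-transform sampling of an exponential free path:
    F(x) = 1 - exp(-sigma*x), so x = F^{-1}(u) = -ln(1-u)/sigma."""
    return -math.log(1.0 - rng.random()) / sigma

def sample_sine_angle():
    """Rejection sampling of p(x) = sin(x)/2 on [0, pi]: propose x
    uniformly, accept with probability sin(x) (uniform envelope)."""
    while True:
        x = math.pi * rng.random()
        if rng.random() <= math.sin(x):
            return x
```

Inverse transform needs an invertible CDF; rejection sampling trades that requirement for some wasted proposals, which is why both appear in practice.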
Supporting Operational Data Assimilation Capabilities to the Research Community
Shao, H.; Hu, M.; Stark, D. R.; Zhou, C.; Beck, J.; Ge, G.
2017-12-01
The Developmental Testbed Center (DTC), in partnership with the National Centers for Environmental Prediction (NCEP) and other operational and research institutions, provides operational data assimilation capabilities to the research community and helps transition research advances to operations. The primary data assimilation systems currently supported by the DTC are the Gridpoint Statistical Interpolation (GSI) system and the National Oceanic and Atmospheric Administration (NOAA) Ensemble Kalman Filter (EnKF) system. GSI is a variational system used for daily operations at NOAA, NCEP, the National Aeronautics and Space Administration, and other operational agencies. Recently, GSI has evolved into a four-dimensional EnVar system. Since 2009, the DTC has been releasing the GSI code to the research community annually and providing user support. In addition to GSI, the DTC began supporting the ensemble-based EnKF data assimilation system in 2015. EnKF shares the observation operator with GSI and therefore, just as GSI, can assimilate both conventional and non-conventional data (e.g., satellite radiance). Currently, EnKF is being implemented as part of the GSI-based hybrid EnVar system for NCEP Global Forecast System operations. This paper will summarize the current code management and support framework for these two systems, followed by a description of available community services and facilities. Also presented is the pathway for researchers to contribute their development to the daily operations of these data assimilation systems.
Recent Developments in DAO's Finite-Volume Data Assimilation System
daSilva, Arlindo; Lin, S.-J.; Joiner, J.; Dee, D.; Frank, D.; Norris, P.; Poli, P.; Atlas, Robert (Technical Monitor)
2001-01-01
The Physical-space/Finite-volume Data Assimilation System (fvDAS) is the next-generation global atmospheric data assimilation system in development at the Data Assimilation Office (DAO) at NASA's Goddard Space Flight Center. It is based on a new finite-volume general circulation model jointly developed by NASA and NCAR and on the Physical-Space Statistical Analysis System (PSAS) developed at the DAO. The data assimilation method implemented in CODAS incorporates a simplified version of the model bias estimation and correction algorithm, as described by Dee and da Silva (1998). In this talk we will briefly describe the general system formulation, and focus on the impact of 3 data types recently introduced, namely: 1) cloud-track winds from the Multi-angle Imaging Spectrometer by the US Air Force, and 3) temperature and moisture information derived from GPS refractivity occultation measurements. The impact of these data types on observation-minus-6hr forecast (O-F) statistics, as well as 5-day forecast skills, will be discussed. In addition we will assess the impact of cloud assimilation on top-of-the-atmosphere radiation fields estimated from CERES measurements.
Architecting Service Based Sensor Networks for the Intelligent Assimilation
Directory of Open Access Journals (Sweden)
R. S. Ponmagal
2014-01-01
Full Text Available The aim of this paper is to propose an architectural model for assimilating distributed sensor networks through the cloud paradigm. This strategy can be applied to monitor and control physical parameters such as temperature, pressure, and level. It is proposed to use a service-oriented architecture (SOA) to program and deploy the sensed parameters. The service-oriented architecture for the sensor network has been implemented in such a way that, for every specific requirement of the monitor center, the assimilation agent invokes the services of the sensors through a registry, and specific changes in the sensed parameters are notified as auditable events using the push interaction pattern of SOA. The assimilation agent serves as an intelligent component by providing authentication services. This SOA is extended to integrate different types of sensor networks through a cloud environment. Hence several sensors can be networked together to monitor different process parameters, and they are assimilated with the Internet by registering them as services, yielding a complete distributed assimilation environment.
Assimilating uncertain, dynamic and intermittent streamflow observations in hydrological models
Mazzoleni, Maurizio; Alfonso, Leonardo; Chacon-Hurtado, Juan; Solomatine, Dimitri
2015-09-01
Catastrophic floods cause significant socioeconomic losses. Non-structural measures, such as real-time flood forecasting, can potentially reduce flood risk. To this end, data assimilation methods have been used to improve flood forecasts by integrating static ground observations, and in some cases also remote sensing observations, within water models. Current hydrologic and hydraulic research works consider assimilation of observations coming from traditional, static sensors. At the same time, low-cost mobile sensors and mobile communication devices are becoming increasingly available. The main goal and innovation of this study is to demonstrate the usefulness of assimilating uncertain streamflow observations that are dynamic in space and intermittent in time in the context of two different semi-distributed hydrological model structures. The developed method is applied to the Brue basin, where the dynamic observations are imitated by synthetic observations of discharge. The results of this study show how model structures and sensor locations affect the assimilation of streamflow observations in different ways. In addition, it shows how assimilation of such uncertain observations from dynamic sensors can provide model improvements similar to those from streamflow observations coming from a non-optimal network of static physical sensors. This can be a potential application of recent efforts to build citizen observatories of water, which can make citizens an active part in information capturing, evaluation and communication, simultaneously helping to improve model-based flood forecasting.
Assimilation of GNSS radio occultation observations in GRAPES
Liu, Y.; Xue, J.
2014-07-01
This paper reviews the development of the assimilation of global navigation satellite system (GNSS) radio occultation (RO) observations in the Global/Regional Assimilation and PrEdiction System (GRAPES) of the China Meteorological Administration, including the choice of data to assimilate, the data quality control, the observation operator, the tuning of observation error, and the results of the observation impact experiments. The results indicate that RO data have a significantly positive effect on analysis and forecast at all ranges in GRAPES, not only in the Southern Hemisphere, where conventional observations are lacking, but also in the Northern Hemisphere, where data are rich. It is noted that a relatively simple assimilation and forecast system in which only the conventional and RO observations are assimilated still has analysis and forecast skill even after nine months of integration, and the analysis difference between the two hemispheres is gradually reduced with height when compared with the NCEP (National Centers for Environmental Prediction) analysis. Finally, given the new onboard payload of the Chinese FengYun-3 (FY-3) satellites, the research status of FY-3 RO is also presented.
Development Of A Data Assimilation Capability For RAPID
Emery, C. M.; David, C. H.; Turmon, M.; Hobbs, J.; Allen, G. H.; Famiglietti, J. S.
2017-12-01
The global decline of in situ observations, together with the increasing ability to monitor surface water from space, motivates the creation of data assimilation algorithms that merge computer models and space-based observations to produce consistent estimates of terrestrial hydrology that fill the spatiotemporal gaps in observations. RAPID is a routing model based on the Muskingum method that is capable of estimating river streamflow over large scales with a relatively short computing time. The model only requires limited inputs: a reach-based river network, and lateral surface and subsurface flow into the rivers. The relatively simple model physics imply that RAPID simulations could be significantly improved by including a data assimilation capability. Here we present the early development of such a data assimilation approach for RAPID. Given the linear and matrix-based structure of the model, we chose to apply a direct Kalman filter, which preserves the high computational speed. We correct the simulated streamflows by assimilating streamflow observations, and our early results demonstrate the feasibility of the approach. Additionally, the limited availability of in situ gauges at continental scales motivates the application of our new data assimilation scheme to altimetry measurements from existing (e.g. EnviSat, Jason 2) and upcoming satellite missions (e.g. SWOT), and ultimately to apply the scheme globally.
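The forecast/analysis cycle of a direct Kalman filter on a linear routing model can be reduced to a scalar sketch. This is not RAPID's matrix Muskingum state, just a single linear reservoir with assumed noise variances; observations marked `None` are simply skipped, which is also how intermittent gauge or altimetry records would enter the cycle.

```python
def kalman_streamflow(inflow, obs, a=0.9, q_var=4.0, r_var=1.0):
    """Scalar Kalman filter for a linear-reservoir routing model
    Q_k = a*Q_{k-1} + (1-a)*I_k, assimilating gauge observations.

    obs[k] is None whenever no observation is available at step k.
    q_var and r_var are assumed model and observation error variances.
    """
    x, p = 0.0, 10.0          # initial state and error variance
    estimates = []
    for i_k, y in zip(inflow, obs):
        # Forecast step: propagate the state and its error variance.
        x = a * x + (1.0 - a) * i_k
        p = a * a * p + q_var
        # Analysis step: direct Kalman update when a gauge value exists.
        if y is not None:
            k = p / (p + r_var)
            x += k * (y - x)
            p *= (1.0 - k)
        estimates.append(x)
    return estimates
```

Because the model and the update are both linear, the whole cycle stays cheap, which is the property the abstract exploits at continental scale.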
Chattopadhyay, Bhargab; Kelley, Ken
2016-01-01
The coefficient of variation is an effect size measure with many potential uses in psychology and related disciplines. We propose a general theory for sequential estimation of the population coefficient of variation that considers both the sampling error and the study cost, importantly without specific distributional assumptions. Fixed sample size planning methods, commonly used in psychology and related fields, cannot simultaneously minimize both the sampling error and the study cost. The sequential procedure we develop is the first sequential sampling procedure developed for estimating the coefficient of variation. We first present a method of planning a pilot sample size after the research goals are specified by the researcher. Then, after collecting a sample as large as the estimated pilot sample size, a check is performed to assess whether the conditions necessary to stop the data collection have been satisfied. If not, an additional observation is collected and the check is performed again. This process continues, sequentially, until a stopping rule involving a risk function is satisfied. Our method ensures that the sampling error and the study costs are considered simultaneously so that the cost is not higher than necessary for the tolerable sampling error. We also demonstrate a variety of properties of the distribution of the final sample size for five different distributions under a variety of conditions with a Monte Carlo simulation study. In addition, we provide freely available functions via the MBESS package in R to implement the methods discussed.
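The pilot-then-one-at-a-time loop described above can be sketched as follows. The stopping rule here is a generic accuracy criterion (stop once a normal-theory standard error of the sample CV falls below a tolerance), not the paper's risk function that also prices in the cost per observation; it assumes the population mean is bounded away from zero.

```python
import random
import statistics

def sequential_cv(draw, pilot=10, tol=0.01, max_n=100000):
    """Purely sequential estimate of the coefficient of variation.

    draw: callable returning one new observation. A pilot sample is
    collected first; then observations are added one at a time until
    the approximate standard error of the sample CV drops below tol.
    Returns (cv_estimate, final_sample_size).
    """
    data = [draw() for _ in range(pilot)]
    while True:
        n = len(data)
        cv = statistics.stdev(data) / statistics.mean(data)
        # Large-sample SE of the sample CV for roughly normal data.
        se = cv * ((0.5 + cv * cv) / n) ** 0.5
        if se < tol or n >= max_n:
            return cv, n
        data.append(draw())
```

The final sample size is itself random, which is exactly the quantity whose distribution the abstract studies by Monte Carlo.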
Energy Technology Data Exchange (ETDEWEB)
Keating, B.A.; Evenson, J.P.; Fukai, S.
1982-12-01
Assimilate distribution and storage organ (storage roots plus swollen planting piece) yield of serial plantings of the cassava cultivar M Aus 10, made throughout a year and each grown for a one-year duration, were studied with sequential harvests in S.E. Queensland (latitude 27 degrees 37'S), Australia. Seasonal differences in the proportion of total dry matter assimilation partitioned to storage organs over a given time period (referred to as the distribution ratio, DR) were observed, with low DR over the mid-summer (January to March) period (0.1 to 0.3), when crop growth rate (CGR) was at a maximum, compared with 0.4 to 0.5 in November to December and 0.5 to 1.0 in late autumn to winter (April to July). This period of low DR restricted storage organ yields, which were generally lower (6-9 t DW ha-1 year-1) than those reported for adapted germplasm at lower latitudes. Multiple regression models were developed which accounted for much of the variation in DR in terms of mean air temperature or photoperiod and leaf area index (R2 = 0.73). High temperatures, long photoperiods and high leaf area indices were associated with reduced DR. Mean air temperature and photoperiod are highly correlated in this environment and their separate effects on DR could not be distinguished. This model of distribution ratio was combined with earlier published models of CGR, and storage organ growth rate was predicted. (Refs. 20).
Data assimilation in the decision support system RODOS
DEFF Research Database (Denmark)
Rojas-Palma, C.; Madsen, H.; Gering, F.
2003-01-01
The process of combining model predictions and observations, usually referred to as data assimilation, is described in this article within the framework of the real time on-line decision support system (RODOS) for off-site nuclear emergency management in Europe. Data assimilation capabilities, based on Kalman...... filters, are under development for several modules of the RODOS system, including the atmospheric dispersion, deposition, food chain and hydrological models. The use of such a generic data assimilation methodology enables the propagation of uncertainties throughout the various modules of the system....... This would in turn provide decision makers with uncertainty estimates taking into account both model and observation errors. This paper describes the methodology employed as well as results of some preliminary studies based on simulated data....
[Post-photosynthetic use of labeled assimilates in fiber flax].
Chikov, V I; Avvakumova, N Iu; Bakirova, G G
2003-01-01
The distribution of 14C in various tissues of fiber flax was assayed 1, 17, and 21 days after 30-min assimilation of 14CO2 by the whole rapidly growing plant. Polymeric photosynthetic products were largely hydrolyzed in the 14C-donor part of the shoot, and the hydrolysates were transported upward. The content of 14C in pigments and lipids of the donor leaves (those that absorbed 14CO2) was significantly higher than that in the 14C-acceptor ones. Additional nitrogen feeding decreased the labeled sucrose:hexose ratio and inhibited transport of the assimilates from both 14C-donor and acceptor leaves. 14C transported to the shoot tip was largely used for synthesis of poorly soluble proteins (extractable with alkali and Triton X-100) in the acceptor tissues. In the donor part of the shoot, particularly in the bast, cellulose was mainly synthesized from the "new" assimilates.
Storm surge model based on variational data assimilation method
Directory of Open Access Journals (Sweden)
Shi-li Huang
2010-06-01
Full Text Available By combining computation and observation information, the variational data assimilation method has the ability to eliminate errors caused by the uncertainty of parameters in practical forecasting. It was applied to a storm surge model based on unstructured grids with high spatial resolution, with the aim of improving the forecasting accuracy of the storm surge. By controlling the wind stress drag coefficient, the variational model was developed and validated through data assimilation tests in an actual storm surge induced by a typhoon. In the data assimilation tests, the model accurately identified the wind stress drag coefficient and obtained results close to the true state. The actual storm surge induced by Typhoon 0515 was then forecast by the developed model, and the results demonstrate its efficiency in practical application.
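The essence of the variational approach, calibrating the drag coefficient by minimizing a model-observation misfit, can be illustrated with a toy identical-twin experiment. Everything below (the linear surge response, the constants, the step size) is invented for illustration and is far simpler than the paper's unstructured-grid model:

```python
import numpy as np

# Toy "storm surge" model: surge height responds linearly to wind stress,
# tau = c_d * rho_air * U**2, with c_d the unknown drag coefficient.
rho_air, alpha = 1.2, 0.05             # hypothetical constants
winds = np.array([10.0, 15.0, 20.0])   # wind speeds (m/s)

def surge(c_d):
    return alpha * c_d * rho_air * winds**2

c_true = 1.5e-3
obs = surge(c_true)                    # synthetic "observations" (twin test)

def cost(c_d):                         # variational cost: model-obs misfit
    return 0.5 * np.sum((surge(c_d) - obs) ** 2)

# Steepest descent on J(c_d); the gradient is analytic for this linear model
g = alpha * rho_air * winds**2         # d(surge)/d(c_d)
c = 1.0e-3                             # first guess
for _ in range(200):
    grad = np.sum((surge(c) - obs) * g)
    c -= 1.0e-3 * grad                 # step size tuned for this toy problem
```

In this noise-free twin setting, the descent recovers the true drag coefficient; real applications minimize a cost that also weights background and observation error covariances.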
Data assimilation on atmospheric dispersion of radioactive materials
DEFF Research Database (Denmark)
Drews, Martin
During a nuclear accident in which radionuclides are released to the atmosphere, off-site dose assessment using atmospheric dispersion models plays an important role in facilitating optimized interventions, i.e. for mitigating the radiological consequences. By using data assimilation methods, radiological observations, e.g. dose rate measurements, can be used to improve these model predictions and to obtain real-time estimates of the atmospheric dispersion parameters. This thesis examines data assimilation in the context of atmospheric dispersion of radioactive materials. In particular, it presents assimilation methods in a realistic setting. New experimental studies of atmospheric dispersion of radioactive material were carried out in October 2001 at the SCK•CEN in Mol, Belgium. In the Mol experiment, the radiation field from routine releases of 41Ar is recorded by an array of gamma detectors along …
Data Assimilation in Integrated and Distributed Hydrological Models
DEFF Research Database (Denmark)
Zhang, Donghua
Integrated hydrological models are frequently used in water-related environmental resource management. With our better understanding of the hydrological processes and the improved computational power, hydrological models are becoming increasingly complex as they integrate multiple hydrological processes. Data assimilation is key to efficient use of traditional and new observational data in integrated hydrological models, as this technique can improve model prediction and reduce model uncertainty. The thesis investigates several challenges within the scope of data assimilation in integrated hydrological models. From the methodological point of view, different assimilation methodologies and techniques have been developed or customized to better serve hydrological assimilation. From the application point of view, real data and real-world complex catchments are used with the focus of investigating the models' improvements with data …
Data Assimilation in Hydrodynamic Models of Continental Shelf Seas
DEFF Research Database (Denmark)
Sørensen, Jacob Viborg Tornfeldt
2004-01-01
The main focus has been on the development of robust and efficient techniques applicable in real operational settings. The applied assimilation techniques all use a Kalman filter approach. They consist of a stochastic state propagation step using a numerical hydrodynamic model and an update step based on a best linear unbiased estimator when new measurements are available. The main challenge is to construct a stochastic model of the high-dimensional ocean state that provides sufficient skill for a proper update. Assimilation of sea surface temperature and parameter estimation in hydrodynamic models are also considered. … and forecast skill in the Inner Danish Waters. The framework for combining data assimilation and off-line error correction techniques is discussed and presented. Early results show a potential for such an approach, but a more elaborate investigation is needed to further develop the idea. Finally, work has been …
3D Data Assimilation using VERB Diffusion Code
Shprits, Y.; Kondrashov, D. A.; Kellerman, A. C.; Subbotin, D.
2012-12-01
Significant progress has been made in recent years in the application of data assimilation tools to radiation belt research. Previous studies concentrated on the analysis of radial profiles of phase space density using multi-satellite measurements and radial transport models. In this study we present an analysis of the 3D phase space density using the VERB-3D code blended with CRRES observations by means of operator-splitting Kalman filtering. Assimilating electron fluxes at various energies and pitch angles into the model allows us to utilize a vast amount of data, including information on pitch-angle distributions and radial energy spectra. 3D data assimilation of the radiation belts allows us to differentiate between various acceleration and loss mechanisms. We present a reanalysis of the radiation belts and find tell-tale signatures of various physical processes.
Vernieres, Guillaume Rene Jean; Kovach, Robin M.; Keppenne, Christian L.; Akella, Santharam; Brucker, Ludovic; Dinnat, Emmanuel Phillippe
2014-01-01
Ocean salinity and temperature differences drive thermohaline circulations. These properties also play a key role in the ocean-atmosphere coupling. With the availability of L-band space-borne observations, it becomes possible to provide global-scale sea surface salinity (SSS) distributions. This study analyzes globally the along-track (Level 2) Aquarius SSS retrievals obtained using both passive and active L-band observations. Aquarius along-track retrieved SSS are assimilated into the ocean data assimilation component of Version 5 of the Goddard Earth Observing System (GEOS-5) assimilation and forecast model. We present a methodology to correct the large biases and errors apparent in Version 2.0 of the Aquarius SSS retrieval algorithm and to map the observed Aquarius SSS retrievals into the ocean model's bulk salinity in the topmost layer. The impact of the assimilation of the corrected SSS on the salinity analysis is evaluated by comparisons with in-situ salinity observations from Argo. The results show a significant reduction of the global biases and RMS of observation-minus-forecast differences at in-situ locations. The most striking results are found in the tropics and southern latitudes. Our results highlight the complementary role of, and the problems that arise during, the assimilation of salinity information from in-situ (Argo) and space-borne surface (SSS) observations.
Quantifying Monte Carlo uncertainty in ensemble Kalman filter
Energy Technology Data Exchange (ETDEWEB)
Thulin, Kristian; Naevdal, Geir; Skaug, Hans Julius; Aanonsen, Sigurd Ivar
2009-01-15
This report presents results obtained during Kristian Thulin's PhD study, and is a slightly modified form of a paper submitted to SPE Journal. Kristian Thulin did most of his portion of the work while a PhD student at CIPR, University of Bergen. The ensemble Kalman filter (EnKF) is currently considered one of the most promising methods for conditioning reservoir simulation models to production data. The EnKF is a sequential Monte Carlo method based on a low-rank approximation of the system covariance matrix. The posterior probability distribution of model variables may be estimated from the updated ensemble, but because of the low-rank covariance approximation, the updated ensemble members become correlated samples from the posterior distribution. We suggest using multiple EnKF runs, each with a smaller ensemble size, to obtain truly independent samples from the posterior distribution. This allows a point-wise confidence interval for the posterior cumulative distribution function (CDF) to be constructed. We present a methodology for finding an optimal combination of ensemble batch size (n) and number of EnKF runs (m) while keeping the total number of ensemble members (m x n) constant. The optimal combination of n and m is found by minimizing the integrated mean square error (MSE) of the CDFs, and we choose to define an EnKF run with 10,000 ensemble members as having zero Monte Carlo error. The methodology is tested on a simplistic, synthetic 2D model, but should be applicable also to larger, more realistic models. (author). 12 refs., figs., tabs.
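The idea of running m independent batches to get point-wise CDF confidence intervals can be mimicked with plain Monte Carlo samples standing in for the posterior samples of each EnKF run. The distributions and sizes below are arbitrary illustrations, not the paper's reservoir setup:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 20, 50                       # m independent "EnKF runs", n members each
# Stand-in for posterior samples from each run (here: standard normal draws)
runs = rng.normal(size=(m, n))

x_grid = np.linspace(-3, 3, 61)
# Empirical CDF of each run evaluated on the grid -> shape (m, 61)
cdfs = (runs[:, :, None] <= x_grid).mean(axis=1)

cdf_mean = cdfs.mean(axis=0)
lo, hi = np.percentile(cdfs, [2.5, 97.5], axis=0)   # point-wise 95% band
```

The spread between `lo` and `hi` is the Monte Carlo uncertainty of the estimated CDF; shrinking it by trading off m against n at fixed m x n is the optimization the paper studies.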
Assimilation scheme of the Mediterranean Forecasting System: operational implementation
Directory of Open Access Journals (Sweden)
E. Demirov
2003-01-01
Full Text Available This paper describes the operational implementation of the data assimilation scheme for the Mediterranean Forecasting System Pilot Project (MFSPP). The assimilation scheme, System for Ocean Forecast and Analysis (SOFA), is a reduced-order Optimal Interpolation (OI) scheme. The order reduction is achieved by projection of the state vector onto vertical Empirical Orthogonal Functions (EOFs). The data assimilated are Sea Level Anomaly (SLA) and temperature profiles from Expendable Bathythermographs (XBT). The data collection, quality control, assimilation and forecast procedures are all done in Near Real Time (NRT). The OI is used intermittently with an assimilation cycle of one week, so that an analysis is produced once a week. The forecast is then done for the ten days following the analysis day. The root mean square (RMS) difference between the model forecast and the analysis (the forecast RMS) is below 0.7°C in the surface layers and below 0.2°C in the layers deeper than 200 m for all ten forecast days. The RMS between forecast and initial condition (persistence RMS) is higher than the forecast RMS after the first day. This means that the model improves the forecast with respect to persistence. The calculation of the misfit between the forecast and the satellite data suggests that the model solution represents well the main space and time variability of the SLA, except for a relatively short period of three to four weeks during the summer, when the data show a fast transition between the cyclonic winter and anti-cyclonic summer regimes. This occurs in the surface layers, which are not corrected by our assimilation scheme hypothesis. On the basis of the forecast skill score analysis, conclusions are drawn about future improvements. Key words. Oceanography: general (marginal and semi-enclosed seas; numerical modeling; ocean prediction)
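The EOF-based order reduction used by such schemes can be sketched with a synthetic ensemble of vertical profiles; the profile shape, noise level and number of retained modes below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical ensemble of 100 vertical temperature profiles on 30 levels
n_lev, n_prof = 30, 100
z = np.linspace(0, 1, n_lev)
profiles = (20 - 15 * z)[:, None] + 0.5 * rng.normal(size=(n_lev, n_prof))

anomalies = profiles - profiles.mean(axis=1, keepdims=True)
# Vertical EOFs = left singular vectors of the anomaly matrix
U, s, _ = np.linalg.svd(anomalies, full_matrices=False)
k = 5                                   # retain the k leading EOFs
eofs = U[:, :k]

# Order reduction: represent a profile by k EOF coefficients, not 30 levels
coeffs = eofs.T @ anomalies[:, 0]
reconstructed = eofs @ coeffs + profiles.mean(axis=1)
```

The analysis update is then performed on the low-dimensional coefficient vector, which is what makes the OI affordable operationally.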
Royer, A.; Larue, F.; De Sève, D.; Roy, A.; Vionnet, V.; Picard, G.; Cosme, E.
2017-12-01
Over northern snow-dominated basins, the snow water equivalent (SWE) is of primary interest for spring streamflow forecasting. SWE retrievals from satellite data are still not well resolved, in particular from microwave (MW) measurements, the only type of data sensitive to snow mass. Also, the use of snowpack models is challenging due to the large uncertainties in meteorological input forcings. This project aims to improve SWE prediction by assimilation of satellite brightness temperature (TB), without any ground-based observations. The proposed approach couples a detailed multilayer snowpack model (Crocus) with a MW snow emission model (DMRT-ML). The assimilation scheme is a Sequential Importance Resampling particle filter, using ensembles of meteorological forcings perturbed according to their respective uncertainties. Crocus simulations driven by operational meteorological forecasts from the Canadian Global Environmental Multiscale model at 10 km spatial resolution were compared to continuous daily SWE measurements over Québec, North-Eastern Canada (56°-45°N). The results show that the maximum SWE is overestimated by 16% on average, with variations up to +32%. This large variability could have dramatic consequences for spring flood forecasts. Results of the Crocus-DMRT-ML coupling compared to surface-based TB measurements (at 11, 19 and 37 GHz) show that the Crocus snowpack microstructure described by sticky hard spheres within DMRT has to be scaled by a snow stickiness of 0.18, significantly reducing the overall RMSE of simulated TBs. The ability of assimilation of daily TBs to correct the simulated SWE is first presented through twin experiments with synthetic data, and then with AMSR-2 satellite time series of TBs along the winter, taking into account atmospheric and forest canopy interferences (absorption and emission). The differences between TBs at 19-37 GHz and at 11-19 GHz, in vertical polarization, were assimilated. This assimilation …
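A single Sequential Importance Resampling step of the kind used here can be sketched as follows; the linear SWE-to-TB "observation operator" and all numbers are hypothetical stand-ins for the Crocus/DMRT-ML chain:

```python
import numpy as np

rng = np.random.default_rng(2)

def sir_step(particles, y_obs, h, obs_std, rng):
    """One Sequential Importance Resampling step: weight each particle by
    the likelihood of the observation, then resample with replacement."""
    w = np.exp(-0.5 * ((h(particles) - y_obs) / obs_std) ** 2)
    w /= w.sum()
    idx = rng.choice(len(particles), size=len(particles), p=w)
    return particles[idx]

# Toy setting: particles are candidate SWE values; the observation operator
# maps SWE to a brightness-temperature-like quantity (purely illustrative).
particles = rng.uniform(0.0, 300.0, size=1000)   # prior SWE ensemble (mm)
h = lambda swe: 260.0 - 0.1 * swe                # hypothetical linear TB model
tb_obs = h(120.0)                                # synthetic observation
posterior = sir_step(particles, tb_obs, h, obs_std=2.0, rng=rng)
```

After resampling, the ensemble concentrates around SWE values consistent with the observed TB, which is how the filter pulls the perturbed-forcing ensemble back toward reality.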
SMAP Data Assimilation at NASA SPoRT
Blankenship, Clay B.; Case, Jonathan L.; Zavodsky, Bradley T.
2016-01-01
The NASA Short-Term Prediction Research and Transition (SPoRT) Center maintains a near-real-time run of the Noah Land Surface Model within the Land Information System (LIS) at 3-km resolution. Soil moisture products from this model are used by several NOAA/National Weather Service Weather Forecast Offices for flood and drought situational awareness. We have implemented assimilation of soil moisture retrievals from the Soil Moisture Ocean Salinity (SMOS) and Soil Moisture Active/Passive (SMAP) satellites, and are now evaluating the SMAP assimilation. The SMAP-enhanced LIS product is planned for public release by October 2016.
Assimilation of LAI time-series in crop production models
Kooistra, Lammert; Rijk, Bert; Nannes, Louis
2014-05-01
Agriculture is worldwide a large consumer of freshwater, nutrients and land. Spatially explicit agricultural management activities (e.g., fertilization, irrigation) could significantly improve efficiency in resource use. In previous studies and operational applications, remote sensing has been shown to be a powerful method for spatio-temporal monitoring of actual crop status. As a next step, yield forecasting by assimilating remote-sensing-based plant variables in crop production models would improve agricultural decision support at both the farm and field level. In this study we investigated the potential of remote-sensing-based Leaf Area Index (LAI) time-series assimilated into the crop production model LINTUL to improve yield forecasting at the field level. The effects of the assimilation method and the amount of assimilated observations were evaluated. The LINTUL-3 crop production model was calibrated and validated for a potato crop on two experimental fields in the south of the Netherlands. A range of data sources (e.g., in-situ soil moisture and weather sensors, destructive crop measurements) was used for calibration of the model for the experimental field in 2010. LAI from Cropscan field radiometer measurements and actual LAI measured with the LAI-2000 instrument were used as input for the LAI time-series. The LAI time-series were assimilated into the LINTUL model and validated for a second experimental field on which potatoes were grown in 2011. Yield in 2011 was simulated with an R2 of 0.82 when compared with field-measured yield. Furthermore, we analysed the potential of assimilating LAI into the LINTUL-3 model through the 'updating' assimilation technique. The deviation between measured and simulated yield decreased from 9371 kg/ha to 8729 kg/ha when assimilating weekly LAI measurements into the LINTUL model over the season of 2011. LINTUL-3 furthermore shows the main growth-reducing factors, which are useful for farm decision support. The combination of crop models and sensor …
Data assimilation in the early phase: Kalman filtering RIMPUFF
DEFF Research Database (Denmark)
Astrup, P.; Turcanu, C.; Puch, R.O.
2004-01-01
In the framework of the DAONEM project (Data Assimilation for Off-site Nuclear Emergency Management), a data assimilation module, ADUM (Atmospheric Dispersion Updating Module), has been developed for the mesoscale atmospheric dispersion program RIMPUFF (Risø Mesoscale Puff model), part of the early-phase programs, based on time-averaged measurements. Given reasonable conditions, i.e. a spatially dense distribution of gamma monitors and a realistic wind field, the developed ADUM module is found to be able to enhance the prediction of the gamma dose field. Based on some of the Kalman filtering parameters, another …
Assimilation of radar-based nowcast into HIRLAM NWP model
DEFF Research Database (Denmark)
Jensen, David Getreuer; Petersen, Claus; Rasmussen, Michael R.
2015-01-01
The present study introduces a nowcast scheme that assimilates radar extrapolation data (RED) into a nowcasting version of the high-resolution limited-area model (HIRLAM) numerical weather prediction (NWP) model covering the area of Denmark. The RED are based on the Co-TREC (tracking radar echoes by correlation) methodology and are generated from cleaned radar mosaics from the Danish weather radar network. The assimilation technique is a newly developed method that increases model precipitation by increasing low-level convergence and decreasing convergence aloft in order to increase the vertical velocity …
Reconstruction of Historical Weather by Assimilating Old Weather Diary Data
Neluwala, P.; Yoshimura, K.; Toride, K.; Hirano, J.; Ichino, M.; Okazaki, A.
2017-12-01
Climate can control not only the human life style but also other living beings. It is important to investigate historical climate to understand the current and future climates. Information about daily weather can give a better understanding of past life on earth. Long-term weather influences the crop calendar as well as the development of civilizations. Unfortunately, existing reconstructed daily weather data are limited to the 1850s onward due to the availability of instrumental data. Climate data prior to that are derived from proxy materials (e.g., tree-ring width, ice core isotopes, etc.), which are either in annual or decadal scale. However, there are many historical documents which contain information about weather, such as personal diaries. In Japan, around 20 diaries on average during the 16th-19th centuries have been collected and converted into a digitized form. Similar diary data exist in many other countries. This study aims to reconstruct historical daily weather during the 18th and 19th centuries using personal daily diaries, which contain analogue weather descriptions such as 'cloudy' or 'sunny'. A recent study has shown the possibility of assimilating coarse weather data using idealized experiments. We further extend this study by assimilating modern weather descriptions similar to diary data in recent periods. The Global Spectral Model (GSM) of the National Centers for Environmental Prediction (NCEP) is used to reconstruct weather with the Local Ensemble Transform Kalman Filter (LETKF). Descriptive data are first converted to model variables such as total cloud cover (TCC), solar radiation and precipitation using empirical relationships. Those variables are then assimilated on a daily basis after adding random errors to account for the uncertainty of actual diary data. The assimilation of downward shortwave solar radiation using weather descriptions improves RMSE from 64.3 W/m2 to 33.0 W/m2 and the correlation coefficient (R) from 0.5 to 0.8 compared with the case without any …
Real Time Radiation Belt Data Assimilation using DREAM
Henderson, M. G.; Koller, J.; Tokar, R. L.; Chen, Y.; Reeves, G. D.; Friedel, R. H.
2009-12-01
We present the first real-time version of the DREAM radiation belt data assimilation model. The model uses an Ensemble Kalman Filter to assimilate data in real time from inner magnetospheric spacecraft and computes Phase Space Density (PSD) as a function of L* and time at constant first and second adiabatic invariants. Results using multiple pairs of first and second invariants are computed in order to recover flux versus energy along arbitrary spacecraft trajectories. The model can also be used to monitor the evolution of artificial electron injections, and we show results using model inputs. We also present a visualization tool that can be used to examine the computed drift shells.
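A stochastic (perturbed-observation) ensemble Kalman analysis of the kind used in such models can be sketched as follows; the three-bin "phase space density" state and all numbers are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)

def enkf_analysis(E, y, H, obs_std, rng):
    """Stochastic EnKF analysis: update ensemble E (n_state x n_ens) with
    observation y using perturbed observations."""
    n_ens = E.shape[1]
    A = E - E.mean(axis=1, keepdims=True)            # ensemble anomalies
    HE = H @ E
    HA = HE - HE.mean(axis=1, keepdims=True)
    Pf_Ht = A @ HA.T / (n_ens - 1)                   # state-obs covariance
    S = HA @ HA.T / (n_ens - 1) + obs_std**2 * np.eye(len(y))
    K = Pf_Ht @ np.linalg.inv(S)
    Y = y[:, None] + obs_std * rng.normal(size=(len(y), n_ens))
    return E + K @ (Y - HE)

# Toy PSD-like state: 3 phase-space bins, 200 ensemble members
E = rng.normal(loc=[5.0, 3.0, 1.0], scale=1.0, size=(200, 3)).T
H = np.array([[1.0, 0.0, 0.0]])                      # observe bin 0 only
y = np.array([6.0])
Ea = enkf_analysis(E, y, H, obs_std=0.5, rng=rng)
```

With prior variance 1 and observation variance 0.25, the observed bin's ensemble mean is pulled most of the way toward the observation while its spread tightens.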
O'Keeffe, C J; Ren, Ruichao; Orkoulas, G
2007-11-21
Spatial updating grand canonical Monte Carlo algorithms are generalizations of random and sequential updating algorithms for lattice systems to continuum fluid models. The elementary steps, insertions or removals, are constructed by generating points in space either at random (random updating) or in a prescribed order (sequential updating). These algorithms have previously been developed only for systems of impenetrable spheres for which no particle overlap occurs. In this work, spatial updating grand canonical algorithms are generalized to continuous, soft-core potentials to account for overlapping configurations. Results on two- and three-dimensional Lennard-Jones fluids indicate that spatial updating grand canonical algorithms, both random and sequential, converge faster than standard grand canonical algorithms. Spatial algorithms based on sequential updating not only exhibit the fastest convergence but also are ideal for parallel implementation due to the absence of strict detailed balance and the nature of the updating that minimizes interprocessor communication. Parallel simulation results for three-dimensional Lennard-Jones fluids show a substantial reduction of simulation time for systems of moderate and large size. The efficiency improvement by parallel processing through domain decomposition is always in addition to the efficiency improvement by sequential updating.
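The elementary insertion/removal steps of grand canonical Monte Carlo are easiest to verify on an ideal gas, where interactions drop out and the mean particle number equals z*V exactly. The sketch below uses random (not sequential) updating and omits the spatial component entirely, so it is far simpler than the algorithms studied in the paper:

```python
import numpy as np

rng = np.random.default_rng(4)

# Grand canonical MC on an ideal gas: activity z, volume V, so that the
# equilibrium distribution of N is Poisson(z*V) and <N> = z*V.
z, V = 2.0, 10.0
N = 0
samples = []
for step in range(200_000):
    if rng.random() < 0.5:                            # attempt insertion
        if rng.random() < min(1.0, z * V / (N + 1)):  # acceptance rule
            N += 1
    else:                                             # attempt removal
        if N > 0 and rng.random() < min(1.0, N / (z * V)):
            N -= 1
    if step >= 50_000:                                # discard burn-in
        samples.append(N)

mean_N = np.mean(samples)   # should be close to z*V = 20 for an ideal gas
```

For interacting (e.g. Lennard-Jones) particles, the acceptance ratios acquire Boltzmann factors of the insertion/removal energy, and the spatial updating schemes discussed in the paper determine where trial positions are generated.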
Poage, J. L.
1975-01-01
A sequential nonparametric pattern classification procedure is presented. The method presented is an estimated version of the Wald sequential probability ratio test (SPRT). This method utilizes density function estimates, and the density estimate used is discussed, including a proof of convergence in probability of the estimate to the true density function. The classification procedure proposed makes use of the theory of order statistics, and estimates of the probabilities of misclassification are given. The procedure was tested on discriminating between two classes of Gaussian samples and on discriminating between two kinds of electroencephalogram (EEG) responses.
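The Wald SPRT that this procedure estimates can be sketched in its exact (known-density) Gaussian form; the data and thresholds below are illustrative:

```python
import math

def sprt(samples, mu0, mu1, sigma, alpha=0.05, beta=0.05):
    """Wald SPRT between two Gaussian means; returns ('H0'|'H1', n_used)
    or ('undecided', n) if the data run out first."""
    a = math.log(beta / (1 - alpha))        # lower (accept H0) threshold
    b = math.log((1 - beta) / alpha)        # upper (accept H1) threshold
    llr = 0.0
    for n, x in enumerate(samples, 1):
        # log-likelihood ratio increment for N(mu1, sigma) vs N(mu0, sigma)
        llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2 * sigma ** 2)
        if llr >= b:
            return "H1", n
        if llr <= a:
            return "H0", n
    return "undecided", n

decision, n_used = sprt([1.1, 0.9, 1.2, 1.0, 0.8, 1.1, 1.0, 0.9],
                        mu0=0.0, mu1=1.0, sigma=0.5)
```

The nonparametric version in the paper replaces the known densities in the likelihood ratio with density estimates built from training samples; the stopping logic is unchanged.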
Monte Carlo simulation for soot dynamics
Zhou, Kun
2012-01-01
A new Monte Carlo method, termed Comb-like frame Monte Carlo, is developed to simulate soot dynamics. A detailed stochastic error analysis is provided. Comb-like frame Monte Carlo is coupled with the gas-phase solver Chemkin II to simulate soot formation in a 1-D premixed burner-stabilized flame. The simulated soot number density, volume fraction, and particle size distribution all agree well with the measurements available in the literature. The origin of the bimodal particle size distribution is revealed with quantitative proof.
Monte Carlo approaches to light nuclei
International Nuclear Information System (INIS)
Carlson, J.
1990-01-01
Significant progress has been made recently in the application of Monte Carlo methods to the study of light nuclei. We review new Green's function Monte Carlo results for the alpha particle, Variational Monte Carlo studies of ¹⁶O, and methods for low-energy scattering and transitions. Through these calculations, a coherent picture of the structure and electromagnetic properties of light nuclei has arisen. In particular, we examine the effect of the three-nucleon interaction and the importance of exchange currents in a variety of experimentally measured properties, including form factors and capture cross sections. 29 refs., 7 figs.
Importance iteration in MORSE Monte Carlo calculations
International Nuclear Information System (INIS)
Kloosterman, J.L.; Hoogenboom, J.E.
1994-02-01
An expression to calculate point values (the expected detector response of a particle emerging from a collision or the source) is derived and implemented in the MORSE-SGC/S Monte Carlo code. It is outlined how these point values can be smoothed as a function of energy and as a function of the optical thickness between the detector and the source. The smoothed point values are subsequently used to calculate the biasing parameters of the Monte Carlo runs to follow. The method is illustrated by an example, which shows that the obtained biasing parameters lead to a more efficient Monte Carlo calculation. (orig.)
Rostkier-Edelstein, Dorita; Hacker, Joshua P.; Snyder, Chris
2014-05-01
Numerical weather prediction and data assimilation models are composed of coupled atmosphere and land-surface (LS) components. If possible, the assimilation procedure should be coupled so that observed information in one module is used to correct fields in the coupled module. There have been some attempts in this direction using optimal interpolation, nudging and 2D/3DVAR data assimilation techniques. Aside from satellite remote-sensed observations, reference-height in-situ observations of temperature and moisture have been used in these studies. Among other problems, difficulties in coupled atmosphere-LS assimilation arise as a result of the different time scales characteristic of each component and the unsteady correlation between these components under varying flow conditions. Ensemble data-assimilation techniques rely on flow-dependent observation-model covariances. Provided that correlations and covariances between land and atmosphere can be adequately simulated and sampled, ensemble data assimilation should enable appropriate assimilation of observations simultaneously into the atmospheric and LS states. Our aim is to explore assimilation of reference-height in-situ temperature and moisture observations into the coupled atmosphere-LS modules (simultaneously) in NCAR's WRF-ARW model using NCAR's DART ensemble data-assimilation system. Observing system simulation experiments (OSSEs) are performed using the single-column model (SCM) version of WRF. Numerical experiments during a warm season are centered on an atmospheric and soil column in the Southern Great Plains. Synthetic observations are derived from "truth" WRF-SCM runs for a given date, initialized and forced using North American Regional Reanalyses (NARR). WRF-SCM atmospheric and LS ensembles are created by mixing the atmospheric and soil NARR profile centered on a given date with that from another day (randomly chosen from the same season), with weights drawn from a logit-normal distribution. Three …
A probabilistic collocation based iterative Kalman filter for landfill data assimilation
Zheng, Qiang; Xu, Wenjie; Man, Jun; Zeng, Lingzao; Wu, Laosheng
2017-11-01
Accurate forecasting of landfill gas (LFG) transport has remained an active research area, due to safety and environmental concerns as well as its green energy potential. The iterative ensemble Kalman filter (IEnKF) has been used to characterize the heterogeneous permeability field of landfills. As a Monte Carlo-based method, IEnKF requires a sufficiently large ensemble size to guarantee its accuracy, which may result in a huge computational cost, especially for large-scale problems. In this study, an efficient probabilistic collocation based iterative Kalman filter (PCIKF) is developed. The polynomial chaos expansion (PCE) is employed to represent and propagate the uncertainties, and an iterative form of the Kalman filter is used to assimilate the measurements. To further reduce the computational cost, only the zeroth- and first-order ANOVA (analysis of variance) components are kept in the PCE approximation. As demonstrated by two numerical case studies, PCIKF shows significant superiority over IEnKF in terms of accuracy and efficiency. The developed method has the potential to reliably predict and develop best management practices for landfill gas production.
Bayesian modeling of the assimilative capacity component of nutrient total maximum daily loads
Faulkner, B. R.
2008-08-01
Implementing stream restoration techniques and best management practices to reduce nonpoint source nutrients implies enhancement of the assimilative capacity for the stream system. In this paper, a Bayesian method for evaluating this component of a total maximum daily load (TMDL) load capacity is developed and applied. The joint distribution of nutrient retention metrics from a literature review of 495 measurements was used for Monte Carlo sampling with a process transfer function for nutrient attenuation. Using the resulting histograms of nutrient retention, reference prior distributions were developed for sites in which some of the metrics contributing to the transfer function were measured. Contributing metrics for the prior include stream discharge, cross-sectional area, fraction of storage volume to free stream volume, denitrification rate constant, storage zone mass transfer rate, dispersion coefficient, and others. Confidence of compliance (CC) that any given level of nutrient retention has been achieved is also determined using this approach. The shape of the CC curve is dependent on the metrics measured and serves in part as a measure of the information provided by the metrics to predict nutrient retention. It is also a direct measurement, with a margin of safety, of the fraction of export load that can be reduced through changing retention metrics. For an impaired stream in western Oklahoma, a combination of prior information and measurement of nutrient attenuation was used to illustrate the proposed approach. This method may be considered for TMDL implementation.
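The Monte Carlo step, propagating sampled retention metrics through a transfer function and reading off confidence of compliance, can be sketched as follows. The first-order attenuation transfer function and the prior distributions below are invented placeholders, not the paper's fitted joint distribution of 495 measurements:

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical first-order attenuation transfer function: fraction of a
# nutrient load retained over a reach of length L at velocity v, rate k.
def retention(k, L, v):
    return 1.0 - np.exp(-k * L / v)

n = 100_000
# Illustrative priors for the metrics (NOT the paper's fitted distributions)
k = rng.lognormal(mean=np.log(1e-4), sigma=0.5, size=n)   # rate, 1/s
v = rng.lognormal(mean=np.log(0.3), sigma=0.3, size=n)    # velocity, m/s
L = 2000.0                                                # reach length, m

r = retention(k, L, v)
# Confidence of compliance: probability that retention meets a target level
cc_20 = (r >= 0.20).mean()
```

Sweeping the target level traces out the full CC curve described in the abstract; measuring any of the contributing metrics on-site narrows the priors and steepens the curve.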
Sequential tool use in great apes.
Directory of Open Access Journals (Sweden)
Gema Martin-Ordas
Full Text Available Sequential tool use is defined as using a tool to obtain another non-food object which subsequently itself will serve as a tool to act upon a further (subgoal. Previous studies have shown that birds and great apes succeed in such tasks. However, the inclusion of a training phase for each of the sequential steps and the low cost associated with retrieving the longest tools limits the scope of the conclusions. The goal of the experiments presented here was, first to replicate a previous study on sequential tool use conducted on New Caledonian crows and, second, extend this work by increasing the cost of retrieving a tool in order to test tool selectivity of apes. In Experiment 1, we presented chimpanzees, orangutans and bonobos with an out-of-reach reward, two tools that were available but too short to reach the food and four out-of-reach tools differing in functionality. Similar to crows, apes spontaneously used up to 3 tools in sequence to get the reward and also showed a strong preference for the longest out-of reach tool independently of the distance of the food. In Experiment 2, we increased the cost of reaching for the longest out-of reach tool. Now apes used up to 5 tools in sequence to get the reward and became more selective in their choice of the longest tool as the costs of its retrieval increased. The findings of the studies presented here contribute to the growing body of comparative research on tool use.
Adaptive Markov Chain Monte Carlo
Jadoon, Khan
2016-08-08
A substantial interpretation of electromagnetic induction (EMI) measurements requires quantifying the optimal model parameters and the uncertainty of a nonlinear inverse problem. For this purpose, an adaptive Bayesian Markov chain Monte Carlo (MCMC) algorithm is used to assess multi-orientation and multi-offset EMI measurements in an agricultural field with non-saline and saline soil. In the MCMC simulations, the posterior distribution was computed using Bayes' rule. An electromagnetic forward model based on the full solution of Maxwell's equations was used to simulate the apparent electrical conductivity measured with the configurations of the EMI instrument, the CMD Mini-Explorer. The model parameters and uncertainty for the three-layered earth model are investigated using synthetic data. Our results show that in the non-saline scenario the layer thicknesses are not as well estimated as the layer electrical conductivities, because layer thickness exhibits a low sensitivity to the EMI measurements and is hence difficult to resolve. Application of the proposed MCMC-based inversion to field measurements in a drip irrigation system demonstrates that the model parameters can be estimated better for the saline soil than for the non-saline soil, and provides useful insight into parameter uncertainty for the assessment of the model outputs.
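A minimal adaptive random-walk Metropolis sampler conveys the flavour of such an algorithm (a 1-D toy with a Gaussian stand-in log-posterior; the paper's inversion would replace it with the Maxwell-based forward model and field data):

```python
import random, math

# Minimal adaptive random-walk Metropolis sketch (1-D): the proposal scale
# is tuned toward a target acceptance rate with a diminishing adaptation.
# The quadratic log_post below is a stand-in, not the EMI posterior.

random.seed(0)

def log_post(x):                      # stand-in log-posterior: N(3, 1)
    return -0.5 * (x - 3.0) ** 2

def adaptive_metropolis(n=20000, target_acc=0.44):
    x, scale, chain, accepted = 0.0, 1.0, [], 0
    for i in range(1, n + 1):
        prop = x + random.gauss(0.0, scale)
        if math.log(random.random()) < log_post(prop) - log_post(x):
            x, accepted = prop, accepted + 1
        chain.append(x)
        if i % 100 == 0:              # adapt scale every 100 steps
            acc = accepted / i
            scale *= math.exp(0.1 * (acc - target_acc))
    return chain

chain = adaptive_metropolis()
mean = sum(chain[5000:]) / len(chain[5000:])
```

After discarding a burn-in, the chain mean approximates the posterior mean; in practice the adaptation must vanish over time (as it does here, since the cumulative acceptance rate stabilizes) to preserve the target distribution.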
Advanced computers and Monte Carlo
International Nuclear Information System (INIS)
Jordan, T.L.
1979-01-01
High-performance parallelism that is currently available is synchronous in nature. It is manifested in such architectures as the Burroughs ILLIAC-IV, CDC STAR-100, TI ASC, CRI CRAY-1, ICL DAP, and many special-purpose array processors designed for signal processing. This form of parallelism has apparently not been of significant value to many important Monte Carlo calculations. Nevertheless, there is much asynchronous parallelism in many of these calculations. A model of a production code that requires up to 20 hours per problem on a CDC 7600 is studied for suitability on some asynchronous architectures that are on the drawing board. The code is described, and some of its properties and resource requirements are identified for comparison with the corresponding properties and resources of some asynchronous multiprocessor architectures. Arguments are made for programmer aids and special syntax to identify and support important asynchronous parallelism. 2 figures, 5 tables
Norris, P.; da Silva, A.
2003-04-01
Cloud fraction and optical depth data from the International Satellite Cloud Climatology Project (ISCCP) and cloud liquid water path retrievals from the Special Sensor Microwave/Imager (SSM/I) instrument are assimilated into the NASA Data Assimilation Office finite volume Data Assimilation System (fvDAS) using a parameter adjustment method. The rationale behind this adjustment method is that there are several empirical parameters in the model cloud/radiation parameterizations (e.g., "critical relative humidity") that are not in fact universal but have remaining spatial and temporal dependencies. These parameters can be slowly adjusted in space and time to improve the model's representation of cloud properties. In this study, the Slingo-type relative-humidity-based diagnostic cloud fraction of the Community Climate Model (CCM3) is generalized to a two-parameter S-shaped dependence of cloud fraction on relative humidity. These two parameters are adjusted in both low and mid-high cloud bands using observed cloud fractions in these bands derived from ISCCP DX data. This procedure greatly improves the representation of cloud fraction in the model as compared with ISCCP data. It also significantly improves mid-latitude longwave cloud radiative forcing, as independently validated against Clouds and the Earth's Radiant Energy System (CERES) data, and mid-latitude column-averaged liquid water path (LWP) over ocean, as validated against TRMM Microwave Imager (TMI) data. The cloud fraction assimilation, by itself, degrades the shortwave cloud radiative forcing at the top-of-atmosphere, but this is recovered (as validated against CERES) by assimilating SSM/I LWP and ISCCP optical depth data via adjustment of the CCM3 diagnostic cloud liquid water parameterization and the dependence of the CCM3 column optical depth on layer cloud fractions. The net effect of this promising cloud data assimilation method is to improve forecast skill for cloud cover and optical depth. Other
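A two-parameter S-shaped dependence of cloud fraction on relative humidity can be sketched as a logistic curve (an illustrative functional form with made-up parameter values, not the exact fvDAS formula): one parameter sets the onset humidity, the other the sharpness, and the assimilation described above would slowly adjust both in space and time.

```python
import math

# Illustrative two-parameter S-shaped diagnostic cloud fraction:
# rh_c is the onset relative humidity, s the sharpness of the transition.
# Both parameter values are hypothetical.

def cloud_fraction(rh, rh_c=0.75, s=20.0):
    """S-shaped mapping from relative humidity (0-1) to cloud fraction (0-1)."""
    return 1.0 / (1.0 + math.exp(-s * (rh - rh_c)))

low = cloud_fraction(0.5)    # dry air: nearly cloud-free
high = cloud_fraction(0.95)  # moist air: nearly overcast
```

Shifting rh_c moves the onset of cloudiness, and increasing s sharpens the transition, which is the kind of spatially varying tuning the parameter adjustment exploits.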
Noh, Seong Jin; Mazzoleni, Maurizio; Lee, Haksu; Liu, Yuqiong; Seo, Dong Jun; Solomatine, Dimitri
2016-04-01
Reliable water depth estimation is an extremely important issue in operational early flood warning systems. Different water system models have been implemented in recent decades, and, in parallel, data assimilation approaches have been introduced in order to reduce the uncertainty of such models. The goal of this study is to compare the performance of a distributed hydrologic routing model with streamflow assimilation using six different data assimilation methods: direct insertion, nudging, Kalman filter, ensemble Kalman filter, asynchronous ensemble Kalman filter and a variational method. The model used in this study is a 3-parameter Muskingum model (O'Donnell, 1985) implemented for the Trinity River, within the Dallas-Fort Worth Metroplex area in Texas, USA. The first methodological step is to discretize the river reach into multiple 1-km sub-reaches in order to estimate water depth in a distributed fashion. Then, the different data assimilation approaches were implemented using the state-space formulation of the Muskingum model proposed by Georgakakos (1990). Finally, streamflow observations were assimilated at two points where flow sensors are located. The results of this work point out that assimilation of streamflow observations can noticeably improve the hydrologic routing model prediction and that ensemble definition is particularly important for both the ensemble Kalman filter and the asynchronous ensemble Kalman filter. This study is part of the FP7 European Project WeSenseIt Citizen Water Observatory (http://wesenseit.eu/) and the NSF Project Integrated Sensing and Prediction of Urban Water for Sustainable Cities (http://ispuw.uta.edu/nsf)
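The simplest of the six schemes, direct insertion, can be sketched on a Muskingum reach (a classic two-parameter K-x variant with made-up values; the study uses O'Donnell's three-parameter formulation): the routed outflow is simply overwritten wherever an observation is available.

```python
# Sketch of channel routing with direct-insertion assimilation: a classic
# two-parameter Muskingum reach (K, x; hypothetical values) routes inflow
# to outflow, and any available streamflow observation overwrites the state.

def muskingum_coeffs(K, x, dt):
    d = 2 * K * (1 - x) + dt
    c0 = (dt - 2 * K * x) / d
    c1 = (dt + 2 * K * x) / d
    c2 = (2 * K * (1 - x) - dt) / d
    return c0, c1, c2            # coefficients sum to 1 (mass conservation)

def route(inflow, K=2.0, x=0.2, dt=1.0, obs=None):
    """obs: dict mapping time step -> observed outflow (direct insertion)."""
    c0, c1, c2 = muskingum_coeffs(K, x, dt)
    out = [inflow[0]]
    for t in range(1, len(inflow)):
        q = c0 * inflow[t] + c1 * inflow[t - 1] + c2 * out[-1]
        if obs and t in obs:
            q = obs[t]           # direct insertion of the observation
        out.append(q)
    return out

hydrograph = [10, 20, 50, 40, 30, 20, 15, 12]
sim = route(hydrograph, obs={3: 38.0})
```

Because the inserted value feeds the recursion at the next step, a single observation also corrects the downstream trajectory, which is the minimal form of the state updating the study compares against ensemble methods.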
Reaction probability for sequential separatrix crossings
International Nuclear Information System (INIS)
Cary, J.R.; Skodje, R.T.
1988-01-01
The change of the crossing parameter (essentially the phase) between sequential slow separatrix crossings is calculated for Hamiltonian systems with one degree of freedom. Combined with the previous separatrix crossing analysis, these results reduce the dynamics of adiabatic systems with separatrices to a map. This map determines whether a trajectory leaving a given separatrix lobe is ultimately captured by the other lobe. Averaging these results over initial phase yields the reaction probability, which does not asymptote to the fully phase-mixed result even for arbitrarily long times between separatrix crossings
A sequential/parallel track selector
Bertolino, F; Bressani, Tullio; Chiavassa, E; Costa, S; Dellacasa, G; Gallio, M; Musso, A
1980-01-01
A medium-speed (approximately 1 μs) hardware pre-analyzer for the selection of events detected in four planes of drift chambers in the magnetic field of the Omicron Spectrometer at the CERN SC is described. Specific geometrical criteria determine patterns of hits in the four planes of vertical wires that have to be recognized and that are stored as patterns of '1's in random access memories. Pairs of good hits are found sequentially, then the RAMs are used as look-up tables. (6 refs).
THE DEVELOPMENT OF SPECIAL SEQUENTIALLY-TIMED
Directory of Open Access Journals (Sweden)
Stanislav LICHOROBIEC
2016-06-01
Full Text Available This article documents the development of the noninvasive use of explosives during the destruction of ice mass in river flows. The system of special sequentially-timed charges utilizes the increase in efficiency of cutting charges by covering them with bags filled with water, while simultaneously increasing the effect of the entire system of timed charges. Timing, spatial combinations during placement, and the linking of these charges results in the loosening of ice barriers on a frozen waterway, while at the same time regulating the size of the ice fragments. The developed charges will increase the operability and safety of IRS units.
Pass-transistor asynchronous sequential circuits
Whitaker, Sterling R.; Maki, Gary K.
1989-01-01
Design methods for asynchronous sequential pass-transistor circuits, which result in circuits that are hazard- and critical-race-free and which have added degrees of freedom for the input signals, are discussed. The design procedures are straightforward and easy to implement. Two single-transition-time state assignment methods are presented, and hardware bounds for each are established. A surprising result is that the hardware realizations for each next-state variable and output variable are identical for a given flow table. Thus, a state machine with N states and M outputs can be constructed using a single layout replicated N + M times.
From sequential to parallel programming with patterns
CERN. Geneva
2018-01-01
To increase both performance and efficiency, our programming models need to adapt to better exploit modern processors. The classic idioms and patterns for programming, such as loops, branches or recursion, are the pillars of almost every code and are well known among all programmers. These patterns all have in common that they are sequential in nature. Embracing parallel programming patterns, which allow us to program multi- and many-core hardware in a natural way, greatly simplifies the task of designing a program that scales and performs on modern hardware, independently of the programming language used, and in a generic way.
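The shift from a sequential loop to a parallel map pattern can be shown in a few lines (a generic sketch, not tied to any talk material): the map pattern expresses the same computation without fixing an execution order, so a worker pool can schedule it freely.

```python
from concurrent.futures import ThreadPoolExecutor

# The classic sequential loop idiom and its parallel-map counterpart.
# A thread pool is used here for brevity; for CPU-bound Python work a
# process pool (or a language without a GIL) is needed for real speedup.

def work(x):
    return x * x

data = list(range(100))

sequential = [work(x) for x in data]               # sequential loop idiom

with ThreadPoolExecutor(max_workers=4) as pool:    # parallel map pattern
    parallel = list(pool.map(work, data))
```

Both forms produce identical results; the point of the pattern is that only the parallel form leaves the runtime free to exploit many cores.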
Decoding restricted participation in sequential electricity markets
Energy Technology Data Exchange (ETDEWEB)
Knaut, Andreas; Paschmann, Martin
2017-06-15
Restricted participation in sequential markets may cause high price volatility and welfare losses. In this paper we therefore analyze the drivers of restricted participation in the German intraday auction, which is a short-term electricity market with quarter-hourly products. Applying a fundamental electricity market model with 15-minute temporal resolution, we identify the lack of sub-hourly market coupling as the most relevant driver of restricted participation. We derive a proxy for price volatility and find that full market coupling may cause quarter-hourly price volatility to decrease by a factor close to four.
Boundary conditions in random sequential adsorption
Cieśla, Michał; Ziff, Robert M.
2018-04-01
The influence of different boundary conditions on the density of random packings of disks is studied. Packings are generated using the random sequential adsorption algorithm with three different types of boundary conditions: periodic, open, and wall. It is found that the finite size effects are smallest for periodic boundary conditions, as expected. On the other hand, in the case of open and wall boundaries it is possible to introduce an effective packing size and a constant correction term to significantly improve the packing densities.
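The algorithm described above is simple enough to sketch directly (a toy version with a fixed attempt budget standing in for a proper jamming criterion; box size and disk radius are arbitrary choices): disks are placed at random and accepted only if they overlap no earlier disk, with periodic boundaries handled by minimum-image distances.

```python
import random, math

# Random sequential adsorption (RSA) of disks of radius r in an L x L box
# with periodic boundary conditions. A fixed number of attempts stands in
# for the jamming criterion; parameters are illustrative.

random.seed(1)

def rsa_periodic(L=10.0, r=0.5, attempts=20000):
    disks = []
    for _ in range(attempts):
        x, y = random.uniform(0, L), random.uniform(0, L)
        ok = True
        for (px, py) in disks:
            dx = min(abs(x - px), L - abs(x - px))   # minimum-image distance
            dy = min(abs(y - py), L - abs(y - py))
            if dx * dx + dy * dy < (2 * r) ** 2:
                ok = False                            # overlap: reject
                break
        if ok:
            disks.append((x, y))
    return disks

disks = rsa_periodic()
density = len(disks) * math.pi * 0.5 ** 2 / 100.0     # covered area fraction
```

With enough attempts the packing fraction approaches the known RSA jamming density for disks (about 0.547); the choice of boundary condition enters only through the distance computation, which is what makes the finite-size comparison in the paper possible.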
On Locally Most Powerful Sequential Rank Tests
Czech Academy of Sciences Publication Activity Database
Kalina, Jan
2017-01-01
Roč. 36, č. 1 (2017), s. 111-125 ISSN 0747-4946 R&D Projects: GA ČR GA17-07384S Grant - others:Nadační fond na podporu vědy(CZ) Neuron Institutional support: RVO:67985556 Keywords : nonparametric tests * sequential ranks * stopping variable Subject RIV: BA - General Mathematics OBOR OECD: Pure mathematics Impact factor: 0.339, year: 2016 http://library.utia.cas.cz/separaty/2017/SI/kalina-0474065.pdf
Davidson, Valerie J; Ryks, Joanne
2003-10-01
The objective of food safety risk assessment is to quantify levels of risk for consumers as well as to design improved processing, distribution, and preparation systems that reduce exposure to acceptable limits. Monte Carlo simulation tools have been used to deal with the inherent variability in food systems, but these tools require substantial data for estimates of probability distributions. The objective of this study was to evaluate the use of fuzzy values to represent uncertainty. Fuzzy mathematics and Monte Carlo simulations were compared to analyze the propagation of uncertainty through a number of sequential calculations in two different applications: estimation of biological impacts and economic cost in a general framework and survival of Campylobacter jejuni in a sequence of five poultry processing operations. Estimates of the proportion of a population requiring hospitalization were comparable, but using fuzzy values and interval arithmetic resulted in more conservative estimates of mortality and cost, in terms of the intervals of possible values and mean values, compared to Monte Carlo calculations. In the second application, the two approaches predicted the same reduction in mean concentration (-4 log CFU/ml of rinse), but the limits of the final concentration distribution were wider for the fuzzy estimate (-3.3 to 5.6 log CFU/ml of rinse) compared to the probability estimate (-2.2 to 4.3 log CFU/ml of rinse). Interval arithmetic with fuzzy values considered all possible combinations in calculations and maximum membership grade for each possible result. Consequently, fuzzy results fully included distributions estimated by Monte Carlo simulations but extended to broader limits. When limited data defines probability distributions for all inputs, fuzzy mathematics is a more conservative approach for risk assessment than Monte Carlo simulations.
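Why interval (fuzzy-support) arithmetic always brackets the Monte Carlo spread can be seen in a tiny sketch (hypothetical log-reduction numbers, not the paper's poultry data): intervals propagate worst-case bounds endpoint-wise, while Monte Carlo samples inside each interval, so the interval result necessarily contains every sampled total.

```python
import random

# Interval propagation vs. Monte Carlo for a chain of sequential
# log-reduction steps (illustrative (lo, hi) bounds in log CFU units).

random.seed(3)

steps = [(-1.5, -0.5), (-1.0, 0.5), (-2.0, -1.0)]   # hypothetical steps

# interval propagation: sum the endpoints
int_lo = sum(lo for lo, _ in steps)
int_hi = sum(hi for _, hi in steps)

# Monte Carlo propagation with uniform draws inside each interval
totals = [sum(random.uniform(lo, hi) for lo, hi in steps) for _ in range(5000)]
mc_lo, mc_hi = min(totals), max(totals)
```

The interval result (-4.5 to -1.0 here) is conservative by construction: extreme combinations that Monte Carlo almost never samples jointly are still counted, which mirrors the wider fuzzy limits reported above.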
11th International Conference on Monte Carlo and Quasi-Monte Carlo Methods in Scientific Computing
Nuyens, Dirk
2016-01-01
This book presents the refereed proceedings of the Eleventh International Conference on Monte Carlo and Quasi-Monte Carlo Methods in Scientific Computing that was held at the University of Leuven (Belgium) in April 2014. These biennial conferences are major events for Monte Carlo and quasi-Monte Carlo researchers. The proceedings include articles based on invited lectures as well as carefully selected contributed papers on all theoretical aspects and applications of Monte Carlo and quasi-Monte Carlo methods. Offering information on the latest developments in these very active areas, this book is an excellent reference resource for theoreticians and practitioners interested in solving high-dimensional computational problems, arising, in particular, in finance, statistics and computer graphics.
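The contrast between Monte Carlo and quasi-Monte Carlo integration is easy to demonstrate on a toy integral (a generic illustration, unrelated to any specific conference contribution): low-discrepancy points such as the base-2 van der Corput sequence fill the unit interval far more evenly than pseudorandom draws.

```python
import random

# Toy comparison of Monte Carlo and quasi-Monte Carlo integration of
# f(x) = x**2 on [0, 1], whose exact value is 1/3. The QMC points come
# from the base-2 van der Corput (radical inverse) sequence.

random.seed(2)

def van_der_corput(i, base=2):
    """Radical inverse of integer i in the given base."""
    q, denom = 0.0, 1.0
    while i:
        i, r = divmod(i, base)
        denom *= base
        q += r / denom
    return q

n = 4096
mc = sum(random.random() ** 2 for _ in range(n)) / n          # Monte Carlo
qmc = sum(van_der_corput(i) ** 2 for i in range(1, n + 1)) / n  # quasi-MC
```

For smooth integrands the QMC error decays close to O(log(n)/n) versus O(1/sqrt(n)) for plain Monte Carlo, which is the practical motivation behind much of the work these conferences cover.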
Monte Carlo simulations for plasma physics
International Nuclear Information System (INIS)
Okamoto, M.; Murakami, S.; Nakajima, N.; Wang, W.X.
2000-07-01
Plasma behaviours are very complicated and their analysis is generally difficult. However, when collisional processes play an important role in the plasma behaviour, the Monte Carlo method is often employed as a useful tool. For example, in neutral beam injection (NBI) heating, electron or ion cyclotron heating, and alpha heating, Coulomb collisions slow down highly energetic particles and pitch-angle scatter them. These processes are often studied by the Monte Carlo technique, and good agreement can be obtained with experimental results. Recently, the Monte Carlo method has been developed to study fast-particle transport associated with heating and the generation of the radial electric field. Further, it is applied to investigating neoclassical transport in plasmas with steep gradients of density and temperature, which is beyond the conventional neoclassical theory. In this report, we briefly summarize the research done by the present authors utilizing the Monte Carlo method. (author)
Hybrid Monte Carlo methods in computational finance
Leitao Rodriguez, A.
2017-01-01
Monte Carlo methods are highly appreciated and intensively employed in computational finance in the context of financial derivatives valuation or risk management. The method offers valuable advantages like flexibility, easy interpretation and straightforward implementation. Furthermore, the
Simulation and the Monte Carlo method
Rubinstein, Reuven Y
2016-01-01
Simulation and the Monte Carlo Method, Third Edition reflects the latest developments in the field and presents a fully updated and comprehensive account of the major topics that have emerged in Monte Carlo simulation since the publication of the classic First Edition more than a quarter of a century ago. While maintaining its accessible and intuitive approach, this revised edition features a wealth of up-to-date information that facilitates a deeper understanding of problem solving across a wide array of subject areas, such as engineering, statistics, computer science, mathematics, and the physical and life sciences. The book begins with a modernized introduction that addresses the basic concepts of probability, Markov processes, and convex optimization. Subsequent chapters discuss the dramatic changes that have occurred in the field of the Monte Carlo method, with coverage of many modern topics including: Markov Chain Monte Carlo, variance reduction techniques such as the transform likelihood ratio...
To Monte Carlo in spite of accidents / Aare Arula
Arula, Aare
2007-01-01
See also Tehnika dlja Vsehh no. 3, pp. 26-27. Karl Siitan and his crew, who set off from Tallinn on 26 January 1937 for the Monte Carlo rally, were in for adventures that almost cost them their lives
Monte Carlo methods for particle transport
Haghighat, Alireza
2015-01-01
The Monte Carlo method has become the de facto standard in radiation transport. Although powerful, if not understood and used appropriately, the method can give misleading results. Monte Carlo Methods for Particle Transport teaches appropriate use of the Monte Carlo method, explaining the method's fundamental concepts as well as its limitations. Concise yet comprehensive, this well-organized text: * Introduces the particle importance equation and its use for variance reduction * Describes general and particle-transport-specific variance reduction techniques * Presents particle transport eigenvalue issues and methodologies to address these issues * Explores advanced formulations based on the author's research activities * Discusses parallel processing concepts and factors affecting parallel performance Featuring illustrative examples, mathematical derivations, computer algorithms, and homework problems, Monte Carlo Methods for Particle Transport provides nuclear engineers and scientists with a practical guide ...
Monte Carlo code development in Los Alamos
International Nuclear Information System (INIS)
Carter, L.L.; Cashwell, E.D.; Everett, C.J.; Forest, C.A.; Schrandt, R.G.; Taylor, W.M.; Thompson, W.L.; Turner, G.D.
1974-01-01
The present status of Monte Carlo code development at Los Alamos Scientific Laboratory is discussed. A brief summary is given of several of the most important neutron, photon, and electron transport codes. 17 references. (U.S.)
Film of the year: the animated film "Mont Blanc" / Verni Leivak
Leivak, Verni, 1966-
2002-01-01
The Estonian Film Journalists' Union awarded the title of best film of 2001 to Priit Tender's animated film "Mont Blanc" (Eesti Joonisfilm, 2001). Also covered are film critics' preferences among the films shown in cinemas and on television in 2001
Quantum Monte Carlo approaches for correlated systems
Becca, Federico
2017-01-01
Over the past several decades, computational approaches to studying strongly-interacting systems have become increasingly varied and sophisticated. This book provides a comprehensive introduction to state-of-the-art quantum Monte Carlo techniques relevant for applications in correlated systems. It offers a clear overview of variational wave functions and a detailed presentation of stochastic sampling, including Markov chains and Langevin dynamics, which is developed into a discussion of Monte Carlo methods. The variational technique is described, from its foundations to a detailed description of its algorithms. Further topics discussed include optimisation techniques, real-time dynamics and projection methods, including Green's function, reptation and auxiliary-field Monte Carlo, from basic definitions to advanced algorithms for efficient codes, and the book concludes with recent developments on the continuum space. Quantum Monte Carlo Approaches for Correlated Systems provides an extensive reference ...
A reduced adjoint approach to variational data assimilation
Altaf, Muhammad
2013-02-01
The adjoint method has been used very often for variational data assimilation. The computational cost to run the adjoint model often exceeds several original model runs, and the method needs significant programming effort to implement the adjoint model code. The work proposed here is variational data assimilation based on proper orthogonal decomposition (POD), which avoids the implementation of the adjoint of the tangent linear approximation of the original nonlinear model. An ensemble of forward model simulations is used to determine the approximation of the covariance matrix, and only the dominant eigenvectors of this matrix are used to define a model subspace. The adjoint of the tangent linear model is replaced by the reduced adjoint based on this reduced space. Thus the adjoint model is run in reduced space with negligible computational cost. Once the gradient is obtained in reduced space it is projected back into full space, and the minimization process is carried out in full space. In the paper the reduced adjoint approach to variational data assimilation is introduced. The characteristics and performance of the method are illustrated with a number of data assimilation experiments in a groundwater subsurface contaminant model. © 2012 Elsevier B.V.
Naming game with biased assimilation over adaptive networks
Fu, Guiyuan; Zhang, Weidong
2018-01-01
The dynamics of the two-word naming game incorporating the influence of biased assimilation over an adaptive network is investigated in this paper. First, an extended naming game with biased assimilation (NGBA) is proposed. The hearer in NGBA accepts the received information in a biased manner: he may refuse to accept the conveyed word from the speaker with a predefined probability if the conveyed word differs from his current memory. Second, the adaptive network is formulated by rewiring the links. Theoretical analysis shows that the population in NGBA will eventually reach global consensus on either A or B. Numerical simulation results show that the larger the strength of biased assimilation on both words, the slower the convergence, while a larger strength of biased assimilation on only one word can slightly accelerate convergence; a larger population size slows convergence considerably as it grows from a relatively small size, while this effect becomes minor once the population is large; and adaptively reconnecting the existing links can greatly accelerate convergence, especially on a sparsely connected network.
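A minimal version of the biased two-word naming game can be simulated directly (on a complete graph with hypothetical parameters, omitting the paper's adaptive rewiring): a hearer refuses an unfamiliar word with probability beta, and the run ends when everyone holds the same single word.

```python
import random

# Minimal two-word naming game with biased assimilation on a complete
# graph. beta is the refusal probability (hypothetical value); beta = 0
# recovers the standard naming game. Adaptive rewiring is not modelled.

random.seed(5)

def naming_game(n=50, beta=0.2, max_steps=200000):
    mem = [{random.choice("AB")} for _ in range(n)]   # one-word memories
    for step in range(max_steps):
        s, h = random.sample(range(n), 2)             # speaker, hearer
        word = random.choice(sorted(mem[s]))
        if word in mem[h]:
            mem[s], mem[h] = {word}, {word}           # success: both collapse
        elif random.random() < beta:
            pass                                      # biased refusal
        else:
            mem[h].add(word)
        if all(m == mem[0] and len(m) == 1 for m in mem):
            return step + 1                           # global consensus
    return None

steps = naming_game()
```

Raising beta symmetrically slows the consensus time, matching the qualitative finding reported above.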
Assimilation efficiency in two herbivores, Oreochromis niloticus and ...
African Journals Online (AJOL)
Preferred Customer
that acid lysis of phytoplankton is the means for breaking down and releasing the contents of the cells. Fig. 3: Stomach fullness and stomach pH in O. niloticus at different times of the day (modified from Getachew Teferra and Fernando, 1989). The best-fit line drawn through all assimilation efficiency values for both fish and ...
Growth, assimilate partitioning and grain yield response of soybean ...
African Journals Online (AJOL)
Abstract. This investigation tested variation in the growth components, assimilate partitioning and grain yield of soybean (Glycine max L. Merrrill) varieties established in CO2 enriched atmosphere when inoculated with mixtures of Arbuscular mycorrhizal fungi (AMF) species in the humid rainforest of Nigeria. A pot and a field ...
Altimeter data assimilation in the tropical Indian Ocean using water ...
Indian Academy of Sciences (India)
Brahmaputra interannual discharge variations on Bay of Bengal salinity and temperature during the 1992–1999 period; J. Earth Syst. Sci. 120(5). 859–887. Evensen G, van Leeuwen P J 1996 Assimilation of Geosat altimeter data for the Agulhas Current ...
Opinion Dynamics with Heterogeneous Interactions and Information Assimilation
Mir Tabatabaei, Seydeh Anahita
2013-01-01
In any modern society, individuals interact to form opinions on various topics, including economic, political, and social aspects. Opinions evolve as the result of the continuous exchange of information among individuals and of the assimilation of information distributed by media. The impact of individuals' opinions on each other forms a network,…
Modelling Effluent Assimilative Capacity of Ikpoba River, Benin City ...
African Journals Online (AJOL)
The sheer display of reprehensible propensity on the part of public hospitals, abattoirs, breweries and city dwellers at large to discharge untreated waste, debris, scum and, in particular, municipal and industrial effluents into Ikpoba River has morphed into a situation whereby the assimilative capacity of the river has reached ...
Modelling nitrogen assimilation of Escherichia coli at low ammonium concentration.
Ma, H.; Boogerd, F.C.; Goryanin, I.
2009-01-01
Modelling is an important methodology in systems biology research. In this paper, we present a kinetic model for the complex ammonium assimilation regulation system of Escherichia coli. Based on a previously published model, the new model includes AmtB-mediated ammonium transport and AmtB
Phenological model of bird cherry Padus racemosa with data assimilation
Kalvāns, Andis; Sīle, Tija; Kalvāne, Gunta
2017-12-01
The accuracy of the operational models can be improved by using observational data to shift the model state in a process called data assimilation. Here, a data assimilation approach using the temperature similarity to control the extent of extrapolation of point-like phenological observations is explored. A degree-day model is used to describe the spring phenology of the bird cherry Padus racemosa in the Baltic region in 2014. The model results are compared to phenological observations that are expressed on a continuous scale based on the BBCH code. The air temperature data are derived from a numerical weather prediction (NWP) model. It is assumed that the phenology at two points with a similar temperature pattern should be similar. The root mean squared difference (RMSD) between the time series of hourly temperature data over a selected time interval are used to measure the temperature similarity of any two points. A sigmoidal function is used to scale the RMSD into a weight factor that determines how the modelled and observed phenophases are combined in the data assimilation. The parameter space for determining the weight of observations is explored. It is found that data assimilation improved the accuracy of the phenological model and that the value of the point-like observations can be increased through using a weighting function based on environmental parameters, such as temperature.
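The weighting scheme described above can be sketched as follows (with illustrative sigmoid parameters d0 and k, and made-up temperature series; the paper's degree-day model itself is not reproduced): the temperature RMSD between a model point and an observation site is squashed into a weight w in [0, 1], and the analysed phenophase is the w-weighted blend of observation and model.

```python
import math

# Sketch of RMSD-based assimilation weighting for a phenological model.
# d0 (RMSD at half weight) and k (steepness) are hypothetical parameters.

def rmsd(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

def weight(d, d0=2.0, k=2.0):
    """Sigmoid: w -> 1 as the temperature series become similar (d -> 0)."""
    return 1.0 / (1.0 + math.exp(k * (d - d0)))

def assimilate(model_phase, obs_phase, t_model, t_obs):
    w = weight(rmsd(t_model, t_obs))
    return w * obs_phase + (1 - w) * model_phase    # BBCH-scale blend

t_site = [1.0, 2.0, 4.0, 6.0]                       # observed temperatures
analysed_near = assimilate(60.0, 63.0, [1.1, 2.1, 4.2, 5.9], t_site)
analysed_far = assimilate(60.0, 63.0, [8.0, 9.0, 11.0, 13.0], t_site)
```

A grid point whose temperatures resemble the observation site is pulled strongly toward the observed phenophase, while a dissimilar point keeps essentially the model value, which limits the spatial extrapolation of point-like observations.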
On existence and uniqueness of solutions for variational data assimilation
Bröcker, Jochen
2017-04-01
Data assimilation is a term from the geosciences and refers to methods for estimating orbits of dynamical models from observations. Variational techniques for data assimilation estimate these orbits by minimising an appropriate cost functional which takes into account both the error with respect to the observations and the deviations of the orbits from the model equations. Such techniques are very important in practice. In this contribution, the problem of existence and uniqueness of solutions to variational data assimilation is investigated. Under mild hypotheses a solution to this problem exists. The problem of uniqueness is investigated as well, and several results (which all have analogues in optimal control) are established in the present context. The value function is introduced as the cost of an optimal trajectory starting from a given initial condition. The necessary conditions in combination with an envelope theorem can be used to demonstrate that the solution is unique if and only if the value function is differentiable at the given initial condition. This occurs for all initial conditions except perhaps on a set of Lebesgue measure zero. Several examples are studied which demonstrate that non-uniqueness of solutions cannot be ruled out altogether, which has important consequences in practice. References: J. Bröcker, "Existence and Uniqueness For Four Dimensional Variational Data Assimilation in Discrete Time.", SIAM Journal of Applied Dynamical Systems (accepted).
Music playlist generation by assimilating GMMs into SOMs
Balkema, Wietse; van der Heijden, Ferdinand
A method for music playlist generation, using assimilated Gaussian mixture models (GMMs) in self organizing maps (SOMs) is presented. Traditionally, the neurons in a SOM are represented by vectors, but in this paper we propose to use GMMs instead. To this end, we introduce a method to adapt a GMM
Assimilation (in vitro) of cholesterol by yogurt bacteria.
Dilmi-Bouras, Abdelkader
2006-01-01
A considerable variation is noticed between the different species studied, and even between strains of the same species, in the assimilation of cholesterol in synthetic media, in the presence of different concentrations of bile salts and under anaerobic conditions. The results obtained show that certain strains of Streptococcus thermophilus and Lactobacillus bulgaricus resist bile salts and assimilate appreciable quantities of cholesterol in their presence. The study of associations shows that only strains assimilating cholesterol in a pure state remain active when they are put in association, but there is no additional effect. However, the symbiotic effect between the Streptococcus thermophilus and Lactobacillus bulgaricus of yogurt, with regard to bile salts, is confirmed. The lactic fermenters of yogurt (Y2) reduce the levels of total cholesterol, HDL-cholesterol and LDL-cholesterol in a well-balanced way. In all cases, the assimilated quantity of HDL-cholesterol is lower than that of LDL-cholesterol. Moreover, yogurt Y2 keeps a significant number of bacteria, greater than 10^8 cells ml^-1, and has a good taste 10 days after its production.
Economic Assimilation and Outmigration of Immigrants in West-Germany
Bellemare, C.
2003-01-01
By analyzing earnings of observed immigrant workers, the literature on the economic assimilation of immigrants has generally overlooked two potentially important selectivity issues. First, earnings of immigrant workers may differ substantially from those of non-workers. Second, earnings of immigrants
Data assimilation in integrated hydrological modeling using ensemble Kalman filtering
DEFF Research Database (Denmark)
Rasmussen, Jørn; Madsen, H.; Jensen, Karsten Høgh
2015-01-01
Groundwater head and stream discharge are assimilated using the ensemble transform Kalman filter in an integrated hydrological model with the aim of studying the relationship between the filter performance and the ensemble size. In an attempt to reduce the required number of ensemble members...
Remote sensing data assimilation for a prognostic phenology model
R. Stockli; T. Rutishauser; D. Dragoni; J. O'Keefe; P. E. Thornton; M. Jolly; L. Lu; A. S. Denning
2008-01-01
Predicting the global carbon and water cycle requires a realistic representation of vegetation phenology in climate models. However most prognostic phenology models are not yet suited for global applications, and diagnostic satellite data can be uncertain and lack predictive power. We present a framework for data assimilation of Fraction of Photosynthetically Active...
Altimeter data assimilation in the tropical Indian Ocean using water ...
Indian Academy of Sciences (India)
The subsurface effect of the assimilation could be judged by comparing the model simulated depth of the 20°C isotherm (hereafter referred to as D20), as a proxy of the thermocline depth, with the same quantity estimated from ARGO observations. In this case also, the impact is noteworthy. Effect on the dynamics has been ...
Catchment-scale hydrological modeling and data assimilation
Troch, P.A.A.; Paniconi, C.; McLaughlin, D.
2003-01-01
This special issue of Advances in Water Resources presents recent progress in the application of DA (data assimilation) for distributed hydrological modeling and in the use of in situ and remote sensing datasets for hydrological analysis and parameter estimation. The papers were presented at the De
Homophily and assimilation among sportactive adolescent substance users
Pearson, M; Steglich, Ch.; Snijders, T.A.B.
2006-01-01
We analyse the co-evolution of social networks and substance use behaviour of adolescents and address the problem of separating the effects of homophily and assimilation. Adolescents who prefer friends with the same substance-use behaviour exhibit the homophily principle. Adolescents who adapt their
Abscisic acid and assimilate partitioning during seed development
Bruijn, de S.M.
1993-01-01
This thesis describes the influence of abscisic acid (ABA) on the transport of assimilates to seeds and the deposition of reserves in seeds. It is well known from the literature that ABA accumulates in seeds during development, and that ABA concentrations in seeds correlate rather well with
Assimilation Ideology: Critically Examining Underlying Messages in Multicultural Literature
Yoon, Bogum; Simpson, Anne; Haag, Claudia
2010-01-01
Using the framework of multicultural education, this article presents an analysis of multicultural picture books that depict the features of assimilation ideology. The findings suggest that assimilationist ideas are presented through the main characters' identities in the resolution of the story and through the portrayal of a glorified dominant…
Satellite Data Assimilation within KIAPS-LETKF system
Jo, Y.; Lee, S., Sr.; Cho, K.
2016-12-01
The Korea Institute of Atmospheric Prediction Systems (KIAPS) has been developing an ensemble data assimilation system using a four-dimensional local ensemble transform Kalman filter (LETKF; Hunt et al., 2007) within the KIAPS Integrated Model (KIM), referred to as "KIAPS-LETKF". The KIAPS-LETKF system was successfully evaluated with various Observing System Simulation Experiments (OSSEs) using the NCAR Community Atmosphere Model - Spectral Element (Kang et al., 2013), which has fully unstructured quadrilateral meshes based on the cubed-sphere grid, the same grid system as KIM. Recently, assimilation of real observations has been conducted within the KIAPS-LETKF system with four-dimensional covariance functions over a 6-hr assimilation window. Conventional (e.g., sonde, aircraft, and surface) and satellite (e.g., AMSU-A, IASI, GPS-RO, and AMV) observations have been provided by the KIAPS Package for Observation Processing (KPOP). Ingestion of AMV was found most beneficial for wind speed prediction, while the improvement in temperature prediction is mostly due to the ingestion of AMSU-A and IASI. However, some degradation in the simulation of GPS-RO is present in the upper stratosphere, even though GPS-RO has positive impacts on the analyses and forecasts. We plan to test the bias correction method and several vertical localization strategies for radiance observations to improve analysis and forecast impacts.
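The LETKF analysis step mentioned above (Hunt et al., 2007) can be sketched in a few lines of NumPy. This is an illustrative reimplementation without localization, not KIAPS code; ensemble size, observation operator and covariances below are arbitrary test values:

```python
import numpy as np

def letkf_analysis(X, y, H, R):
    """One LETKF analysis step in ensemble space (no localization).
    X: (n, k) ensemble of model states; y: (p,) observations;
    H: (p, n) linear observation operator; R: (p, p) obs-error covariance."""
    n, k = X.shape
    xb = X.mean(axis=1)                      # background ensemble mean
    Xp = X - xb[:, None]                     # background perturbations
    Yp = H @ Xp                              # perturbations in observation space
    C = Yp.T @ np.linalg.inv(R)
    Pa = np.linalg.inv((k - 1) * np.eye(k) + C @ Yp)   # analysis cov, ensemble space
    wa = Pa @ C @ (y - H @ xb)               # weights for the mean update
    vals, vecs = np.linalg.eigh((k - 1) * Pa)
    Wa = vecs @ np.diag(np.sqrt(np.maximum(vals, 0))) @ vecs.T  # symmetric sqrt
    return xb[:, None] + Xp @ (wa[:, None] + Wa)       # analysis ensemble

rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(3, 20))       # background ensemble around 0
y = np.ones(3)                               # observations near 1
Xa = letkf_analysis(X, y, np.eye(3), 0.1 * np.eye(3))
# the analysis mean is pulled from the background mean toward y
```

With accurate observations (small R) the analysis mean moves most of the way toward y, which is the expected Kalman behavior.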
Screening of oleaginous yeast with xylose assimilating capacity for ...
African Journals Online (AJOL)
... in industrial-scale production. In our preliminary study, 57 oleaginous yeast strains with xylose-assimilating capacity were isolated from 13 soil samples, and 16 strains were identified as potential lipid biomass producers. Four strains which showed higher lipid content were used for further ethanol fermentation at different conditions.
Information Flow in an Atmospheric Model and Data Assimilation
Yoon, Young-noh
2011-01-01
Weather forecasting consists of two processes, model integration and analysis (data assimilation). During the model integration, the state estimate produced by the analysis evolves to the next cycle time according to the atmospheric model to become the background estimate. The analysis then produces a new state estimate by combining the background…
CASE REPORT: Assimilation of Atlas in Indian Dry Skulls
Directory of Open Access Journals (Sweden)
Surekha D Jadhav
2012-01-01
Background: A congenital fusion of the atlas to the base of the occiput is defined as assimilation of the atlas. It may produce narrowing of the foramen magnum, which may compress the spinal cord or brain stem. Rarely, it also results in vertebral artery compression, leading to dizziness, seizures and syncope. Multiple variations of partial assimilation have been reported and may involve any aspect of the atlanto-occipital articulation. Therefore the knowledge of such an anomaly is essential for orthopedists, anesthetists, and clinicians. Aims and Objectives: The aim of the present study was to find the incidence of assimilation of the atlas in Indian dry adult skulls of unknown sex and age, for which 150 skulls were examined. Results: Only in one skull did we observe fusion of the atlas vertebra with the occipital bone. The posterior arch and the two superior facets of the atlas had completely fused with the occipital condyles. The anterior arch had incompletely fused with the occipital bone, showing nonunion in the midline. Only the right transverse process was fused with the occipital bone. Conclusions: Assimilation of the atlas may cause orthopedic problems and occasionally produces neurological effects, especially when the lumen of the foramen magnum is reduced. Therefore, improved knowledge of the fusion of the atlas with the occipital bone is important in clinical practice, as it shows multiple variations and combinations.
Probability Maps for the Visualization of Assimilation Ensemble Flow Data
Hollt, Thomas
2015-05-25
Ocean forecasts nowadays are created by running ensemble simulations in combination with data assimilation techniques. Most of these techniques resample the ensemble members after each assimilation cycle. This means that in a time series, after resampling, every member can follow up on any of the members before resampling. Tracking behavior over time, such as all possible paths of a particle in an ensemble vector field, becomes very difficult, as the number of combinations rises exponentially with the number of assimilation cycles. In general a single possible path is not of interest, but only the probabilities that any point in space might be reached by a particle at some point in time. In this work we present an approach using probability-weighted piecewise particle trajectories to allow such a mapping interactively, instead of tracing quadrillions of individual particles. We achieve interactive rates by binning the domain and splitting up the tracing process into the individual assimilation cycles, so that particles that fall into the same bin after a cycle can be treated as a single particle with a larger probability as input for the next time step. As a result we lose the possibility to track individual particles, but can create probability maps for any desired seed at interactive rates.
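The binning idea can be illustrated with a one-dimensional sketch (bin counts and displacements are hypothetical): instead of tracing every combination of member paths across cycles, the probability mass in each bin is redistributed once per assimilation cycle, keeping the cost linear in the number of cycles.

```python
import numpy as np

def propagate_probability(p, displacements, n_bins):
    """One assimilation cycle of a binned probability map: the mass in
    each bin is split equally over the ensemble members' displacements,
    instead of tracing every individual particle path."""
    q = np.zeros(n_bins)
    w = 1.0 / len(displacements)
    for d in displacements:
        for b in range(n_bins):
            target = min(max(b + d, 0), n_bins - 1)  # clamp at domain edges
            q[target] += w * p[b]
    return q

# seed all probability in bin 0; two cycles, three possible moves per cycle
p = np.zeros(10)
p[0] = 1.0
for _ in range(2):
    p = propagate_probability(p, displacements=[1, 2, 3], n_bins=10)
```

After two cycles the map holds the reachability probabilities of 3^2 = 9 path combinations while only ever storing one array per cycle; probability mass is conserved.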
Sensitivity of Satellite Altimetry Data Assimilation on a Weapon Acoustic Preset Using MODAS
National Research Council Canada - National Science Library
Chu, Peter; Mancini, Steven; Gottshall, Eric; Cwalina, David; Barron, Charlie N
2007-01-01
...) is analyzed with SSP derived from the modular ocean data assimilation system (MODAS). The MODAS fields differ in that one uses altimeter data assimilated from three satellites while the other uses no altimeter data...
Sequential evidence accumulation in decision making
Directory of Open Access Journals (Sweden)
Daniel Hausmann
2008-03-01
Judgments and decisions under uncertainty are frequently linked to a prior sequential search for relevant information. In such cases, the subject has to decide when to stop the search for information. Evidence accumulation models from social and cognitive psychology assume an active and sequential information search until enough evidence has been accumulated to pass a decision threshold. In line with such theories, we conceptualize the evidence threshold as the "desired level of confidence" (DLC) of a person. This model is tested against a fixed stopping rule (one-reason decision making) and against the class of multi-attribute information-integrating models. A series of experiments using an information board for horse race betting demonstrates an advantage of the proposed model by measuring the individual DLC of each subject and confirming its correctness in two separate stages. In addition to a better understanding of the stopping rule (within the narrow framework of simple heuristics), the results indicate that individual aspiration levels might be a relevant factor when modelling decision making by task analysis of statistical environments.
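A minimal sketch of a DLC-style stopping rule follows. The confidence mapping and cue validities here are hypothetical illustrations, not the paper's measurement model; the point is only that search length varies with the individual threshold.

```python
def sequential_search(cue_values, cue_validities, dlc):
    """Stop the information search once confidence in the leading option
    reaches the desired level of confidence (DLC).
    cue_values: +1 if the cue favors option A, -1 if it favors B.
    Returns (chosen option, number of cues inspected)."""
    evidence = 0.0
    total = sum(cue_validities)
    for i, (v, validity) in enumerate(zip(cue_values, cue_validities), start=1):
        evidence += v * validity                       # accumulate weighted evidence
        confidence = 0.5 + abs(evidence) / (2.0 * total)  # illustrative mapping
        if confidence >= dlc:                          # DLC reached: stop searching
            return ('A' if evidence > 0 else 'B', i)
    return ('A' if evidence >= 0 else 'B', len(cue_values))
```

A subject with a low DLC behaves like a one-reason decision maker (stops after the first cue); a high-DLC subject integrates all available cues.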
Unsupervised Sequential Outlier Detection With Deep Architectures.
Lu, Weining; Cheng, Yu; Xiao, Cao; Chang, Shiyu; Huang, Shuai; Liang, Bin; Huang, Thomas
2017-09-01
Unsupervised outlier detection is a vital task with high impact on a wide variety of application domains, such as image analysis and video surveillance. It has gained long-standing attention and has been extensively studied in multiple research areas. Detecting and acting on outliers as quickly as possible is imperative in order to protect networks and related stakeholders and to maintain the reliability of critical systems. However, outlier detection is difficult due to its one-class nature and challenges in feature construction. Sequential anomaly detection is even harder, with additional challenges from temporal correlation in the data, as well as the presence of noise and high dimensionality. In this paper, we introduce a novel deep structured framework to solve the challenging sequential outlier detection problem. We use autoencoder models to capture the intrinsic difference between outliers and normal instances, and integrate these models into recurrent neural networks, which allows the learning to make use of previous context and makes the learners more robust to warping along the time axis. Furthermore, we propose a layerwise training procedure, which significantly simplifies training and hence helps achieve efficient and scalable training. In addition, we investigate a fine-tuning step that updates the full parameter set by incorporating the temporal correlation in the sequence. We apply the proposed models in systematic experiments on five real-world benchmark data sets. Experimental results demonstrate the effectiveness of our model compared with other state-of-the-art approaches.
Continuous sequential boundaries for vaccine safety surveillance.
Li, Rongxia; Stewart, Brock; Weintraub, Eric; McNeil, Michael M
2014-08-30
Various recently developed sequential methods have been used to detect signals for post-marketing surveillance in drug and vaccine safety. Among these, the maximized sequential probability ratio test (MaxSPRT) has been used to detect elevated risks of adverse events following vaccination using large healthcare databases. However, a limitation of MaxSPRT is that it only provides a time-invariant flat boundary. In this study, we propose the use of time-varying boundaries for controlling how type I error is distributed throughout the surveillance period. This is especially useful in two scenarios: (i) when we desire generally larger sample sizes before a signal is generated, for example, when early adopters are not representative of the larger population; and (ii) when it is desired for a signal to be generated as early as possible, for example, when the adverse event is considered rare but serious. We consider four specific time-varying boundaries (which we call critical value functions), and we study their statistical power and average time to signal detection. The methodology we present here can be viewed as a generalization or flexible extension of MaxSPRT. Published 2014. This article is a US Government work and is in the public domain in the USA.
Noncommutative Biology: Sequential Regulation of Complex Networks.
Directory of Open Access Journals (Sweden)
William Letsou
2016-08-01
Single-cell variability in gene expression is important for generating distinct cell types, but it is unclear how cells use the same set of regulatory molecules to specifically control similarly regulated genes. While combinatorial binding of transcription factors at promoters has been proposed as a solution for cell-type specific gene expression, we found that such models resulted in substantial information bottlenecks. We sought to understand the consequences of adopting sequential logic wherein the time-ordering of factors informs the final outcome. We showed that with noncommutative control, it is possible to independently control targets that would otherwise be activated simultaneously using combinatorial logic. Consequently, sequential logic overcomes the information bottleneck inherent in complex networks. We derived scaling laws for two noncommutative models of regulation, motivated by phosphorylation/neural networks and chromosome folding, respectively, and showed that they scale super-exponentially in the number of regulators. We also showed that specificity in control is robust to the loss of a regulator. Lastly, we connected these theoretical results to real biological networks that demonstrate specificity in the context of promiscuity. These results show that achieving a desired outcome often necessitates roundabout steps.
Monte Carlo Algorithms for Linear Problems
Dimov, Ivan
2000-01-01
MSC Subject Classification: 65C05, 65U05. Monte Carlo methods are a powerful tool in many fields of mathematics, physics and engineering. It is known that these methods give statistical estimates for a functional of the solution by performing random sampling of a certain chance variable whose mathematical expectation is the desired functional. Monte Carlo methods are methods for solving problems using random variables. In the book [16] edited by Yu. A. Shreider one can find the following...
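The basic idea described above can be shown in a few lines: the sample mean of a function over independent random draws is a statistical estimate whose mathematical expectation is the desired functional. The specific functional below (E[U^2] for uniform U) is a hypothetical example, not one from the text.

```python
import random

def mc_estimate(f, sampler, n):
    """Monte Carlo estimate of a functional: average f over n independent
    draws of the chance variable produced by sampler()."""
    return sum(f(sampler()) for _ in range(n)) / n

random.seed(0)
# example functional: E[U^2] = 1/3 for U ~ Uniform(0, 1)
est = mc_estimate(lambda x: x * x, random.random, 100_000)
```

By the law of large numbers the estimate converges to the functional, with a statistical error that shrinks like n^(-1/2).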
Data Assimilation to Extract Soil Moisture Information from SMAP Observations
Directory of Open Access Journals (Sweden)
Jana Kolassa
2017-11-01
This study compares different methods to extract soil moisture information through the assimilation of Soil Moisture Active Passive (SMAP) observations. Neural network (NN) and physically-based SMAP soil moisture retrievals were assimilated into the National Aeronautics and Space Administration (NASA) Catchment model over the contiguous United States for April 2015 to March 2017. By construction, the NN retrievals are consistent with the global climatology of the Catchment model soil moisture. Assimilating the NN retrievals without further bias correction improved the surface and root zone correlations against in situ measurements from 14 SMAP core validation sites (CVS) by 0.12 and 0.16, respectively, over the model-only skill, and reduced the surface and root zone unbiased root-mean-square error (ubRMSE) by 0.005 m³ m⁻³ and 0.001 m³ m⁻³, respectively. The assimilation reduced the average absolute surface bias against the CVS measurements by 0.009 m³ m⁻³, but increased the root zone bias by 0.014 m³ m⁻³. Assimilating the NN retrievals after a localized bias correction yielded slightly lower surface correlation and ubRMSE improvements, but generally the skill differences were small. The assimilation of the physically-based SMAP Level-2 passive soil moisture retrievals using a global bias correction yielded similar skill improvements, as did the direct assimilation of locally bias-corrected SMAP brightness temperatures within the SMAP Level-4 soil moisture algorithm. The results show that global bias correction methods may be able to extract more independent information from SMAP observations compared to local bias correction methods, but without accurate quality control and observation error characterization they are also more vulnerable to adverse effects from retrieval errors related to uncertainties in the retrieval inputs and algorithm. Furthermore, the results show that using global bias correction approaches without a
Assimilation of 3D radar reflectivities with an ensemble Kalman filter on the convective scale
Bick, T.; Simmer, C.; Trömel, S.; Wapler, K.; Hendricks Franssen, H.-J.; Stephan, K.; Blahak, U.; Schraff, C.; Reich, H.; Zeng, Y.; Potthast, Roland
2016-01-01
An ensemble data assimilation system for 3D radar reflectivity data is introduced for the convection-permitting numerical weather prediction model of the COnsortium for Small-scale MOdelling (COSMO) based on the Kilometre-scale ENsemble Data Assimilation system (KENDA), developed by Deutscher Wetterdienst and its partners. KENDA provides a state-of-the-art ensemble data assimilation system on the convective scale for operational data assimilation and forecasting based on the Local Ensemble Tr...
Satellite data assimilation in global forecast system in India
Basu, Swati
2014-11-01
Satellite data are very important for model initialization and verification. A large number of satellite observations are currently assimilated into the Numerical Weather Prediction (NWP) systems at the National Centre for Medium Range Weather Forecasting (NCMRWF). Apart from global meteorological observations from the GTS, near-real-time satellite observations are received at NCMRWF from other operational centres like ISRO, NOAA/NESDIS, EUMETCAST, etc. Recently India has become a member of the Asia-Pacific Regional ATOVS Retransmission Service (APRARS) for faster access to high-resolution global satellite data useful for high-resolution regional models. The Indian HRPT at Chennai covers the APRARS data gap region over South East Asia. A robust data monitoring system has been implemented at NCMRWF to assess the quantity and quality of the data, as well as the satellite sensor strength, before the data are assimilated in the models. Validation of new satellite observations, especially from Indian satellites, is being carried out against in situ observations and similar space-borne platforms. After establishing the quality of the data, Observing System Experiments (OSEs) are being conducted to study their impact on the assimilation and forecast systems. OSEs have been carried out with the Oceansat-2 scatterometer winds and radiance data from the Megha-Tropiques SAPHIR sensor. A daily rainfall analysis dataset is being generated by merging satellite estimates and in situ observations. ASCAT soil wetness measurements from the METOP satellite are being assimilated into the global model. Land surface parameters (LULC and albedo) retrieved from Indian satellites are being explored for possible usage in the global and regional models. OLR from Indian satellites is used for validating model outputs. This paper reviews the efforts made at NCMRWF in (i) assimilating data from Indian/international satellites and (ii) generating useful products from the satellite data.
Assimilating soil moisture into an Earth System Model
Stacke, Tobias; Hagemann, Stefan
2017-04-01
Several modelling studies have reported potential impacts of soil moisture anomalies on regional climate. In particular for short prediction periods, perturbations of the soil moisture state may result in significant alteration of surface temperature in the following season. However, it is not yet clear whether soil moisture anomalies also affect climate on larger temporal and spatial scales. In an earlier study, we showed that soil moisture anomalies can persist for several seasons in the deeper soil layers of a land surface model. Additionally, those anomalies can influence root zone moisture, particularly during very dry or wet periods. Thus, one prerequisite for predictability, namely the existence of long-term memory, is evident for simulated soil moisture and might be exploited to improve climate predictions. The second prerequisite is the sensitivity of the climate system to soil moisture. In order to investigate this sensitivity for decadal simulations, we implemented a soil moisture assimilation scheme into the Max Planck Institute for Meteorology's Earth System Model (MPI-ESM). The assimilation scheme is based on a simple nudging algorithm and updates the surface soil moisture state once per day. In our experiments we use the MPI-ESM, which includes model components for the interactive simulation of atmosphere, land and ocean. Artificial assimilation data are created from a control simulation to nudge the MPI-ESM towards predominantly dry and wet states. First analyses focus on the impact of the assimilation on land surface variables and reveal distinct differences in the long-term mean values between wet- and dry-state simulations. Precipitation, evapotranspiration and runoff are larger in the wet state compared to the dry state, resulting in an increased moisture transport from the land to the atmosphere and ocean. Consequently, surface temperatures are lower in the wet-state simulations by more than one Kelvin. In terms of spatial pattern
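A nudging update of the kind described is simple to sketch. The relaxation coefficient and moisture values below are hypothetical, not MPI-ESM settings; the point is only the daily relaxation of the state toward the assimilation target.

```python
def nudge(state, target, relaxation):
    """Simple Newtonian nudging: relax the model state toward the
    assimilation target. relaxation in (0, 1]; 1 replaces the state
    entirely, small values give a gentle pull."""
    return state + relaxation * (target - state)

sm = 0.10                      # initial volumetric soil moisture (m3/m3)
for day in range(30):          # one update per day toward a "wet" target
    sm = nudge(sm, 0.35, relaxation=0.2)
```

The residual distance to the target decays geometrically, by a factor (1 - relaxation) per day, so after a month the state has essentially converged to the imposed wet state.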
Assimilation of diazotrophic nitrogen into pelagic food webs.
Directory of Open Access Journals (Sweden)
Ryan J Woodland
The fate of diazotrophic nitrogen (N_D) fixed by planktonic cyanobacteria in pelagic food webs remains unresolved, particularly for toxic cyanophytes that are selectively avoided by most herbivorous zooplankton. Current theory suggests that N_D fixed during cyanobacterial blooms can enter planktonic food webs contemporaneously with peak bloom biomass via direct grazing of zooplankton on cyanobacteria or via the uptake of bioavailable N_D (exuded from viable cyanobacterial cells) by palatable phytoplankton or microbial consortia. Alternatively, N_D can enter planktonic food webs post-bloom following the remineralization of bloom detritus. Although the relative contribution of these processes to planktonic nutrient cycles is unknown, we hypothesized that assimilation of bioavailable N_D (e.g., nitrate, ammonium) by palatable phytoplankton and subsequent grazing by zooplankton (either during or after the cyanobacterial bloom) would be the primary pathway by which N_D is incorporated into the planktonic food web. Instead, in situ stable isotope measurements and grazing experiments clearly documented that the assimilation of N_D by zooplankton outpaced assimilation by palatable phytoplankton during a bloom of toxic Nodularia spumigena Mertens. We identified two distinct temporal phases in the trophic transfer of N_D from N. spumigena to the plankton community. The first phase was a highly dynamic transfer of N_D to zooplankton at rates that covaried with bloom biomass while bypassing other phytoplankton taxa; a trophic transfer that we infer was routed through bloom-associated bacteria. The second phase was a slowly accelerating assimilation of the dissolved N_D pool by phytoplankton that was decoupled from contemporaneous variability in N. spumigena concentrations. These findings provide empirical evidence that N_D can be assimilated and transferred rapidly throughout natural plankton communities and yield insights into the specific processes
A virtual reality catchment for data assimilation experiments
Schalge, Bernd; Rihani, Jehan; Haese, Barbara; Baroni, Gabriele; Erdal, Daniel; Neuweiler, Insa; Hendricks-Franssen, Harrie-Jan; Geppert, Gernot; Ament, Felix; Kollet, Stefan; Cirpka, Olaf; Saavedra, Pablo; Han, Xujun; Attinger, Sabine; Kunstmann, Harald; Vereecken, Harry; Simmer, Clemens
2016-04-01
Current data assimilation (DA) systems often lack the possibility to assimilate measurements across compartments to accurately estimate states and fluxes in subsurface-land surface-atmosphere systems (SLAS). In order to develop a new DA framework that is able to realize this cross-compartmental assimilation, a comprehensive testing environment is needed. Therefore a virtual reality (VR) catchment is constructed with the Terrestrial System Modeling Platform (TerrSysMP). This catchment mimics the Neckar catchment in Germany. TerrSysMP employs the atmospheric model COSMO, the land surface model CLM and the hydrological model ParFlow, coupled with the external coupler OASIS. We will show statistical tests supporting the plausibility of the VR. The VR is run in a fully coupled mode (subsurface - land surface - atmosphere) which includes the interactions of subsurface dynamics with the atmosphere, such as the effects of soil moisture, which can influence near-surface temperatures, convection patterns or the surface heat fluxes. A reference high-resolution run serves as the "truth" from which virtual observations are extracted with observation operators such as virtual rain gauges, synoptic stations and satellite observations (amongst others). This effectively solves the data scarcity issues otherwise often encountered in DA. Furthermore, an ensemble of model runs at a reduced resolution is performed. This ensemble also serves for open-loop runs to be compared with data assimilation experiments. Runs with this ensemble served to identify sets of parameters that are especially sensitive to changes and have the largest impact on the system. These parameters were the focus of subsequent ensemble simulations and DA experiments. We will show to what extent the VR states can be reconstructed using data assimilation methods with only a limited number of virtual observations available.
Personalized glucose forecasting for type 2 diabetes using data assimilation.
Directory of Open Access Journals (Sweden)
David J Albers
2017-04-01
Type 2 diabetes leads to premature death and reduced quality of life for 8% of Americans. Nutrition management is critical to maintaining glycemic control, yet it is difficult to achieve due to the high individual differences in glycemic response to nutrition. Anticipating the glycemic impact of different meals can be challenging not only for individuals with diabetes, but also for expert diabetes educators. Personalized computational models that can accurately forecast the impact of a given meal on an individual's blood glucose levels can serve as the engine for a new generation of decision support tools for individuals with diabetes. However, to be useful in practice, these computational engines need to generate accurate forecasts based on limited datasets consistent with typical self-monitoring practices of individuals with type 2 diabetes. This paper uses three forecasting machines: (i) data assimilation, a technique borrowed from atmospheric physics and engineering that uses Bayesian modeling to infuse data with human knowledge represented in a mechanistic model, to generate real-time, personalized, adaptable glucose forecasts; (ii) model averaging of data assimilation output; and (iii) dynamical Gaussian process model regression. The proposed data assimilation machine, the primary focus of the paper, uses a modified dual unscented Kalman filter to estimate states and parameters, personalizing the mechanistic models. Model selection is used to make a personalized model selection for the individual and their measurement characteristics. The data assimilation forecasts are empirically evaluated against actual postprandial glucose measurements captured by individuals with type 2 diabetes, and against predictions generated by experienced diabetes educators after reviewing a set of historical nutritional records and glucose measurements for the same individual. The evaluation suggests that the data assimilation forecasts compare well with specific
Development of KIAPS Observation Processing Package for Data Assimilation System
Kang, Jeon-Ho; Chun, Hyoung-Wook; Lee, Sihye; Han, Hyun-Jun; Ha, Su-Jin
2015-04-01
The Korea Institute of Atmospheric Prediction Systems (KIAPS) was founded in 2011 by the Korea Meteorological Administration (KMA) to develop Korea's own global Numerical Weather Prediction (NWP) system as a nine-year (2011-2019) project. The data assimilation team at KIAPS has been developing the observation processing system (KIAPS Package for Observation Processing: KPOP) to provide optimal observations to the data assimilation system for the KIAPS Global Model (KIAPS Integrated Model - Spectral Element method based on HOMME: KIM-SH). Currently, KPOP is capable of processing satellite radiance data (AMSU-A, IASI), GPS Radio Occultation (GPS-RO), aircraft observations (AMDAR, AIREP, etc.), and synoptic observations (SONDE and SURFACE). KPOP adopted Radiative Transfer for TOVS version 10 (RTTOV_v10) to compute the brightness temperature (TB) for each channel at the top of the atmosphere (TOA), and the Radio Occultation Processing Package (ROPP) 1-dimensional forward module to compute the bending angle (BA) at each tangent point. The observation data are obtained from the KMA in BUFR format and converted to ODB, as used for operational data assimilation and monitoring at the KMA. The Unified Model (UM), Community Atmosphere - Spectral Element (CAM-SE) and KIM-SH model outputs are used for the bias correction (BC) and quality control (QC) of the observations, respectively. KPOP provides radiance and RO data for the Local Ensemble Transform Kalman Filter (LETKF) and also provides SONDE, SURFACE and AIRCRAFT data for Three-Dimensional Variational Assimilation (3DVAR). We expect that every observation type processed in KPOP will soon be usable with both data assimilation methods. Preliminary results for each observation type will be presented together with the current development status of KPOP.
Assimilation of SMOS Retrievals in the Land Information System
Blankenship, Clay B.; Case, Jonathan L.; Zavodsky, Bradley T.; Crosson, William L.
2016-01-01
The Soil Moisture and Ocean Salinity (SMOS) satellite provides retrievals of soil moisture in the upper 5 cm with a 30-50 km resolution and a mission accuracy requirement of 0.04 cm³ cm⁻³. These observations can be used to improve land surface model soil moisture states through data assimilation. In this paper, SMOS soil moisture retrievals are assimilated into the Noah land surface model via an Ensemble Kalman Filter within the NASA Land Information System. Bias correction is implemented using Cumulative Distribution Function (CDF) matching, with points aggregated by either land cover or soil type to reduce sampling error in generating the CDFs. An experiment was run for the warm season of 2011 to test SMOS data assimilation and to compare assimilation methods. Verification of soil moisture analyses in the 0-10 cm upper layer and root zone (0-1 m) was conducted using in situ measurements from several observing networks in the central and southeastern United States. This experiment showed that SMOS data assimilation significantly increased the anomaly correlation of Noah soil moisture with station measurements from 0.45 to 0.57 in the 0-10 cm layer. Time series at specific stations demonstrate the ability of SMOS DA to increase the dynamic range of soil moisture in a manner consistent with station measurements. Among the bias correction methods, the correction based on soil type performed best at bias reduction but also reduced correlations. The vegetation-based correction did not produce any significant differences compared to using a simple uniform correction curve.
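The CDF matching step can be sketched as generic quantile mapping: each retrieval is mapped to the model-climatology value with the same cumulative probability. This is an illustration under assumed (synthetic, uniformly distributed) climatologies, not the Land Information System implementation.

```python
import numpy as np

def cdf_match(retrievals, model_clim, obs_clim):
    """Map each retrieval to the model-climatology value at the same
    cumulative probability, removing systematic bias between the
    observation and model soil moisture climatologies."""
    obs_sorted = np.sort(obs_clim)
    ranks = np.searchsorted(obs_sorted, retrievals, side='right') / obs_sorted.size
    return np.quantile(np.sort(model_clim), np.clip(ranks, 0.0, 1.0))

# hypothetical climatologies: retrievals run 0.1 m3/m3 wetter than the model
obs_clim = np.linspace(0.2, 0.5, 1001)
model_clim = np.linspace(0.1, 0.4, 1001)
corrected = cdf_match(np.array([0.35]), model_clim, obs_clim)
```

Aggregating the climatology samples by land cover or soil type, as in the paper, simply means building separate `model_clim`/`obs_clim` pairs per class before applying the same mapping.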
Verma, Arjun; Privman, Vladimir
2018-02-01
We study the approach to the large-time jammed state of deposited particles in the model of random sequential adsorption. The convergence laws are usually derived from an argument of Pomeau, which assumes that, at large enough times, deposition is dominated by small landing regions into each of which only a single particle can be deposited without overlapping earlier deposited particles, and that such regions, after a certain time, are no longer created by depositions in larger gaps. A second assumption has been that the size distribution of gaps open for particle-center landing in this large-time, small-gap regime is finite in the limit of zero gap size. We report numerical Monte Carlo studies of a recently introduced model of random sequential adsorption on patterned one-dimensional substrates which suggest that the second assumption must be generalized. We argue that a region exists in the parameter space of the studied model in which the gap-size distribution in the Pomeau large-time regime actually vanishes linearly at zero gap size. In another region, the distribution develops a threshold property, i.e., there are no gaps below a certain size. We discuss the implications of these findings for new asymptotic power-law and exponential-modified-by-a-power-law convergences to jamming in irreversible one-dimensional deposition.
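The classical 1-D continuum version of this process (Rényi car parking) is easy to simulate. The sketch below is a minimal Monte Carlo illustration of irreversible deposition to jamming, not the authors' patterned-substrate model; the line length and stopping rule are arbitrary choices.

```python
import bisect
import random

def rsa_jamming(L=500.0, max_failures=50_000, seed=1):
    """1-D random sequential adsorption of unit segments (Renyi car parking).

    Unit-length particles land at uniformly random positions and stick only
    if they do not overlap previously deposited ones; overlapping attempts
    are rejected. The run stops after a long streak of rejections, by which
    point only gaps too small to admit a particle center remain open.
    """
    rng = random.Random(seed)
    placed = []                      # sorted left endpoints of stuck particles
    failures = 0
    while failures < max_failures:
        x = rng.uniform(0.0, L - 1.0)
        i = bisect.bisect_left(placed, x)
        # Overlap check against left and right neighbours only.
        if (i > 0 and placed[i - 1] > x - 1.0) or \
           (i < len(placed) and placed[i] < x + 1.0):
            failures += 1            # rejected attempt
        else:
            bisect.insort(placed, x)
            failures = 0
    return len(placed) / L           # jamming coverage

coverage = rsa_jamming()
```

For large L the coverage approaches the Rényi jamming density of about 0.7476; tracking the gap-size histogram near zero gap size in such a simulation is exactly what distinguishes the scenarios discussed in the abstract.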
Directory of Open Access Journals (Sweden)
K. Boniface
2009-07-01
Full Text Available The impact of GPS (Global Positioning System) data assimilation is assessed here using a high-resolution numerical weather prediction system at 2.5 km horizontal resolution. The Zenithal Tropospheric Delay (ZTD) GPS data from mesoscale networks are assimilated with the 3DVAR AROME data assimilation scheme. Data from more than 280 stations over the model domain have been assimilated during 15-day assimilation cycles prior to each of the two studied events. The results of these assimilation cycles show that the assimilation of GPS ZTD with the AROME system performs well, producing analyses that are on average closer to the ZTD observations. The impact of assimilating GPS data on the precipitation forecast was then evaluated. For the first case, only the AROME runs starting a few hours prior to the triggering of the convective system are able to simulate the convective precipitation. The assimilation of GPS ZTD observations improves the simulation of the spatial extent of the precipitation, but in that case slightly underestimates the heaviest precipitation compared with the experiment without GPS. The accuracy of the precipitation forecast for the second case is much better. The analyses from the control assimilation cycle already provide a good description of the atmospheric state that cannot be further improved by the assimilation of GPS observations. Only for the last day (22 November 2007) were significant differences found between the two parallel cycles. In that case, the assimilation of GPS ZTD improves the first 6 to 12 h of the precipitation forecast.
ECTOPIC PREGNANCY AFTER SEQUENTIAL EMBRYO TRANSFER: REVIEW OF 22 CASES
Nadkarni Purnima K, Nadkarni Kishore, Singh Pooja P, Singh Prabhakar, Nadkarni Aditi A, Agarwal Neha R
2015-01-01
Objective: To assess the prevalence of ectopic pregnancy among women who conceived with assisted reproductive technology and to determine whether there is increased risk after sequential embryo transfer. Methods: The ectopic pregnancy rate for ART pregnancies was calculated among women who conceived and had ectopic pregnancy after ICSI followed by sequential embryo transfer at an ART centre. Variation in ectopic risk by patient and ART treatment factors was assessed, including sequential transfer, risk...
Norris, Peter M.; Da Silva, Arlindo M.
2016-01-01
A method is presented to constrain a statistical model of sub-gridcolumn moisture variability using high-resolution satellite cloud data. The method can be used for large-scale model parameter estimation or cloud data assimilation. The gridcolumn model includes assumed probability density function (PDF) intra-layer horizontal variability and a copula-based inter-layer correlation model. The observables used in the current study are Moderate Resolution Imaging Spectroradiometer (MODIS) cloud-top pressure, brightness temperature and cloud optical thickness, but the method should be extensible to direct cloudy radiance assimilation for a small number of channels. The algorithm is a form of Bayesian inference with a Markov chain Monte Carlo (MCMC) approach to characterizing the posterior distribution. This approach is especially useful in cases where the background state is clear but cloudy observations exist. In traditional linearized data assimilation methods, a subsaturated background cannot produce clouds via any infinitesimal equilibrium perturbation, but the Monte Carlo approach is not gradient-based and allows jumps into regions of non-zero cloud probability. The current study uses a skewed-triangle distribution for layer moisture. The article also includes a discussion of the Metropolis and multiple-try Metropolis versions of MCMC.
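The core of the MCMC machinery referred to above is the Metropolis rule. Below is a minimal random-walk Metropolis sketch on a toy one-dimensional target (invented purely for illustration; the study's actual state is the set of sub-gridcolumn moisture PDF parameters).

```python
import math
import random

def metropolis(log_target, x0, n_steps, step=1.0, seed=0):
    """Random-walk Metropolis sampler.

    Proposes a Gaussian jump and accepts it with probability
    min(1, target_ratio). No gradients are required, so the chain can jump
    into any region of non-zero target probability.
    """
    rng = random.Random(seed)
    x, logp = x0, log_target(x0)
    samples = []
    for _ in range(n_steps):
        prop = x + step * rng.gauss(0.0, 1.0)
        logp_prop = log_target(prop)
        # Metropolis acceptance test on the density ratio.
        if rng.random() < math.exp(min(0.0, logp_prop - logp)):
            x, logp = prop, logp_prop
        samples.append(x)
    return samples

# Toy target: a standard normal posterior, chain started far from the mode.
chain = metropolis(lambda x: -0.5 * x * x, x0=5.0, n_steps=20_000)
burned = chain[2_000:]                       # discard burn-in
mean = sum(burned) / len(burned)
var = sum((v - mean) ** 2 for v in burned) / len(burned)
```

Because acceptance depends only on the target-density ratio, a chain started in a "clear" (low-probability) state can still jump into cloudy regions of non-zero probability, which is the property the abstract contrasts with gradient-based linearized assimilation.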
Lemak, Alexander; Steren, Carlos A; Arrowsmith, Cheryl H; Llinás, Miguel
2008-05-01
ABACUS [Grishaev et al. (2005) Proteins 61:36-43] is a novel protocol for automated protein structure determination via NMR. ABACUS starts from molecular fragments defined by unassigned J-coupled spin systems and involves a Monte Carlo stochastic search in assignment space, probabilistic sequence selection, and assembly of fragments into structures that are used to guide the stochastic search. Here, we report further development of the two main algorithms, increasing the flexibility and robustness of the method. Performance of the BACUS [Grishaev and Llinás (2004) J Biomol NMR 28:1-101] algorithm was significantly improved through the use of sequential connectivities available from through-bond correlated 3D-NMR experiments, and a new set of likelihood probabilities derived from a database of 56 ultra-high-resolution X-ray structures. A multicanonical Monte Carlo procedure, Fragment Monte Carlo (FMC), was developed for sequence-specific assignment of spin systems. It relies on enhanced assignment sampling and provides the uncertainty of assignments in a quantitative manner. The efficiency of the protocol was validated on data from four proteins of 68-116 residues, yielding 100% accuracy in sequence-specific assignment of backbone and side-chain resonances.
Adaptive Learning in Extensive Form Games and Sequential Equilibrium
DEFF Research Database (Denmark)
Groes, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte
1999-01-01
This paper studies adaptive learning in extensive form games and provides conditions for convergence points of adaptive learning to be sequential equilibria. Precisely, we present a set of conditions on learning sequences such that an assessment is a sequential equilibrium if and only if there is...
Bayesian statistics and Monte Carlo methods
Koch, K. R.
2018-03-01
The Bayesian approach allows an intuitive way to derive the methods of statistics. Probability is defined as a measure of the plausibility of statements or propositions. Three rules are sufficient to obtain the laws of probability. If the statements refer to the numerical values of variables, the so-called random variables, univariate and multivariate distributions follow. They lead to point estimation, by which unknown quantities, i.e. unknown parameters, are computed from measurements. The unknown parameters are random variables; in traditional statistics, which is not founded on Bayes' theorem, they are fixed quantities. Bayesian statistics therefore recommends itself for Monte Carlo methods, which generate random variates from given distributions. Monte Carlo methods, of course, can also be applied in traditional statistics. The unknown parameters are introduced as functions of the measurements, and the Monte Carlo methods give the covariance matrix and the expectation of these functions. A confidence region is derived in which the unknown parameters are situated with a given probability. Following a method of traditional statistics, hypotheses are tested by determining whether a value for an unknown parameter lies inside or outside the confidence region. The error propagation of a random vector by the Monte Carlo methods is presented as an application. If the random vector results from a nonlinearly transformed vector, its covariance matrix and its expectation follow from the Monte Carlo estimate. This saves a considerable number of derivatives to be computed, and errors of linearization are avoided; the Monte Carlo method is therefore efficient. If the functions of the measurements are given by a sum of two or more random vectors with different multivariate distributions, the resulting distribution is generally not known. The Monte Carlo methods are then needed to obtain the covariance matrix and the expectation of the sum.
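The error-propagation point can be made concrete with a one-dimensional toy case where the exact answer is known: for x normal, f(x) = exp(x) is lognormal, so the Monte Carlo moments can be checked against closed-form values. The numbers below are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(42)

# 'Measurement' x ~ N(mu, sigma^2); nonlinear transformation f(x) = exp(x).
mu, sigma, n = 0.0, 0.5, 200_000
y = np.exp(rng.normal(mu, sigma, n))

mc_mean, mc_var = y.mean(), y.var()   # Monte Carlo expectation and variance

# Linearized error propagation: f(mu) and f'(mu)^2 sigma^2.
# It misses the curvature of f entirely.
lin_mean = np.exp(mu)                 # = 1.0
lin_var = (np.exp(mu) * sigma) ** 2   # = 0.25

# Exact lognormal moments for comparison.
exact_mean = np.exp(mu + sigma**2 / 2)
exact_var = (np.exp(sigma**2) - 1.0) * np.exp(2 * mu + sigma**2)
```

The Monte Carlo moments converge to the exact lognormal values, while the linearized approximation is biased low by the missing factor exp(sigma^2/2), which is the abstract's argument for the Monte Carlo route.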
Dancing Twins: Stellar Hierarchies That Formed Sequentially?
Tokovinin, Andrei
2018-04-01
This paper draws attention to the class of resolved triple stars with moderate ratios of inner and outer periods (possibly in a mean motion resonance) and nearly circular, mutually aligned orbits. Moreover, stars in the inner pair are twins with almost identical masses, while the mass sum of the inner pair is comparable to the mass of the outer component. Such systems could be formed either sequentially (inside-out) by disk fragmentation with subsequent accretion and migration, or by a cascade hierarchical fragmentation of a rotating cloud. Orbits of the outer and inner subsystems are computed or updated in four such hierarchies: LHS 1070 (GJ 2005, periods 77.6 and 17.25 years), HIP 9497 (80 and 14.4 years), HIP 25240 (1200 and 47.0 years), and HIP 78842 (131 and 10.5 years).
Sequential scintigraphic staging of small cell carcinoma
International Nuclear Information System (INIS)
Bitran, J.D.; Bekerman, C.; Pinsky, S.
1981-01-01
Thirty patients with small cell carcinoma (SCC) of the lung were sequentially staged, following a history and physical exam, with liver, brain, bone, and gallium-67 citrate scans. Scintigraphic evaluation disclosed 7 of 30 patients (23%) with advanced disease, stage IIIM1. When gallium-67 scans were used as the sole criterion for staging, they proved to be accurate and identified six of the seven patients with occult metastatic disease. Gallium-67 scans proved to be accurate in detecting thoracic and extrathoracic metastases in the 30 patients with SCC, especially within the liver and lymph-node-bearing areas. The diagnostic accuracy of gallium-67 fell in regions such as bone or brain. Despite the limitations of gallium-67 scanning, the authors conclude that these scans are useful in staging patients with SCC and should be the initial scans used in staging such patients.
Gleason-Busch theorem for sequential measurements
Flatt, Kieran; Barnett, Stephen M.; Croke, Sarah
2017-12-01
Gleason's theorem is a statement that, given some reasonable assumptions, the Born rule used to calculate probabilities in quantum mechanics is essentially unique [A. M. Gleason, Indiana Univ. Math. J. 6, 885 (1957), 10.1512/iumj.1957.6.56050]. We show that Gleason's theorem contains within it also the structure of sequential measurements, and along with this the state update rule. We give a small set of axioms, which are physically motivated and analogous to those in Busch's proof of Gleason's theorem [P. Busch, Phys. Rev. Lett. 91, 120403 (2003), 10.1103/PhysRevLett.91.120403], from which the familiar Kraus operator form follows. An axiomatic approach has practical relevance as well as fundamental interest, in making clear those assumptions which underlie the security of quantum communication protocols. Interestingly, the two-time formalism is seen to arise naturally in this approach.
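For reference, the Kraus-operator structure that the axioms recover is the textbook one (standard quantum mechanics, not a result reproduced from the paper's proof): a measurement with Kraus operators \(K_a\) has outcome probabilities and state-update rule

```latex
p(a) = \operatorname{Tr}\!\left(K_a \rho K_a^\dagger\right), \qquad
\rho \;\longmapsto\; \rho_a = \frac{K_a \rho K_a^\dagger}
{\operatorname{Tr}\!\left(K_a \rho K_a^\dagger\right)}, \qquad
\sum_a K_a^\dagger K_a = \mathbb{1},
```

so that for a sequential pair of measurements the joint probability is \(p(a,b) = \operatorname{Tr}\!\big(L_b K_a \rho K_a^\dagger L_b^\dagger\big)\), the sequential structure to which the abstract refers.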
Sequential Therapy in Metastatic Renal Cell Carcinoma
Directory of Open Access Journals (Sweden)
Bradford R Hirsch
2016-04-01
Full Text Available The treatment of metastatic renal cell carcinoma (mRCC has changed dramatically in the past decade. As the number of available agents, and related volume of research, has grown, it is increasingly complex to know how to optimally treat patients. The authors are practicing medical oncologists at the US Oncology Network, the largest community-based network of oncology providers in the country, and represent the leadership of the Network's Genitourinary Research Committee. We outline our thought process in approaching sequential therapy of mRCC and the use of real-world data to inform our approach. We also highlight the evolving literature that will impact practicing oncologists in the near future.
Prosody and alignment: a sequential perspective
Szczepek Reed, Beatrice
2010-12-01
In their analysis of a corpus of classroom interactions in an inner-city high school, Roth and Tobin describe how teachers and students accomplish interactional alignment by prosodically matching each other's turns. Prosodic matching and specific prosodic patterns are interpreted as signs of, and contributions to, successful interactional outcomes and positive emotions; a lack of prosodic matching and other specific prosodic patterns are interpreted as features of unsuccessful interactions and negative emotions. This forum focuses on the article's analysis of the relation between interpersonal alignment, emotion and prosody. It argues that prosodic matching, and other prosodic linking practices, play a primarily sequential role, i.e. one that displays the way in which participants place and design their turns in relation to other participants' turns. Prosodic matching, rather than being a conversational action in itself, is argued to be an interactional practice (Schegloff 1997), which is not always employed for the accomplishment of 'positive', or aligning, actions.
Sequential Stereotype Priming: A Meta-Analysis.
Kidder, Ciara K; White, Katherine R; Hinojos, Michelle R; Sandoval, Mayra; Crites, Stephen L
2017-08-01
Psychological interest in stereotype measurement has spanned nearly a century, with researchers adopting implicit measures in the 1980s to complement explicit measures. One of the most frequently used implicit measures of stereotypes is the sequential priming paradigm. The current meta-analysis examines stereotype priming, focusing specifically on this paradigm. To contribute to ongoing discussions regarding methodological rigor in social psychology, one primary goal was to identify methodological moderators of the stereotype priming effect-whether priming is due to a relation between the prime and target stimuli, the prime and target response, participant task, stereotype dimension, stimulus onset asynchrony (SOA), and stimuli type. Data from 39 studies yielded 87 individual effect sizes from 5,497 participants. Analyses revealed that stereotype priming is significantly moderated by the presence of prime-response relations, participant task, stereotype dimension, target stimulus type, SOA, and prime repetition. These results carry both practical and theoretical implications for future research on stereotype priming.
Discriminative predation: Simultaneous and sequential encounter experiments
Directory of Open Access Journals (Sweden)
C. D. Beatty, D. W. Franks
2012-08-01
Full Text Available There are many situations in which the ability of animals to distinguish between two similar-looking objects can have significant selective consequences. For example, the objects that require discrimination may be edible versus defended prey, predators versus non-predators, or mates of varying quality. Working from the premise that there are situations in which discrimination may be more or less successful, we hypothesized that individuals find it more difficult to distinguish between stimuli when they encounter them sequentially rather than simultaneously. Our study has wide biological and psychological implications from the perspective of signal perception, signal evolution, and discrimination, and could apply to any system in which individuals make relative judgments or choices between two or more stimuli or signals. While this is a general principle that might seem intuitive, it has not been experimentally tested in this context, and is often not considered in the design of models or experiments, or in the interpretation of a wide range of studies. Our study differs from previous studies in psychology in that (a) the level of similarity of the stimuli is gradually varied to obtain selection gradients, and (b) we discuss the implications of our study for specific areas in ecology, such as the level of perfection of mimicry in predator-prey systems. Our experiments provide evidence that it is indeed more difficult to distinguish between stimuli – and to learn to distinguish between stimuli – when they are encountered sequentially rather than simultaneously, even if the intervening time interval is short [Current Zoology 58 (4): 649–657, 2012].
Exploring the potential of sequential simulation.
Powell, Polly; Sorefan, Zinah; Hamilton, Sara; Kneebone, Roger; Bello, Fernando
2016-04-01
Several recent papers have highlighted the need for better integrated care to improve health care for children and families. Our team spent a year exploring the potential of 'Sequential Simulation' (SqS) as a teaching tool to address this need with young people and multidisciplinary teams. SqS allows the simulation of a series of key events or 'crunch points' that come together to represent the patient journey, and highlights the impact of individuals on this journey. The pilot SqS was based on an adolescent with asthma - a common condition that requires excellent multidisciplinary care with the patient at the centre. The SqS was designed using transportable sets and audio-visual equipment to create realism. Actors were employed to play the roles of the young person and mother, and health professionals played themselves. The SqS was run at different events with varied audiences, including young people, health professionals and teachers. It was used to explore the difficulties that can arise during a patient journey, the importance of communication throughout, and to highlight the significance of each individual in the patient experience. The SqS was met with enthusiasm and felt to be an innovative and effective way of promoting better teamwork and communication. It was well received at a school asthma education event for pupils and community teams, demonstrating its varied potential. The year was the first step in the introduction of this exciting new concept that has the potential to help promote better integrated care for paediatric patients and their families. © 2015 John Wiley & Sons Ltd.
Directory of Open Access Journals (Sweden)
O. Sus
2013-04-01
Full Text Available Agroecosystem models are strongly dependent on information on land management patterns for regional applications. Land management practices play a major role in determining global yield variability, and add an anthropogenic signal to the observed seasonality of atmospheric CO2 concentrations. However, there is still little knowledge of the spatial and temporal variability of important farmland activities such as crop sowing dates, and thus these remain rather crudely approximated within carbon cycle studies. In this study, we present a framework allowing for spatio-temporally resolved simulation of cropland carbon fluxes under observational constraints on land management and canopy greenness. We apply data assimilation methodology in order to explicitly account for information on sowing dates and modelled leaf area index. MODIS 250 m vegetation index data were assimilated into a crop carbon mass balance model (SPAc), both in batch calibration for sowing date estimation and sequentially, using the ensemble Kalman filter (EnKF), for improved model state estimation. In doing so, we are able to quantify the multiannual (2000–2006) regional carbon flux and biometry seasonality of maize–soybean crop rotations surrounding the Bondville Ameriflux eddy covariance site, averaged over 104 pixel locations within the wider area. (1) Validation at the Bondville site shows that growing season C cycling is simulated accurately with MODIS-derived sowing dates, and we expect that this framework allows for accurate simulations of C cycling at locations for which ground-truth data are not available. Thus, this framework enables modellers to simulate current (i.e. last 10 yr) carbon cycling of major agricultural regions. Averaged over the 104 field patches analysed, relative spatial variability for biometry and net ecosystem exchange ranges from ∼7% to ∼18%. The annual sign of net biome productivity is not significantly different from carbon neutrality. (2) Moreover
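The EnKF analysis step used for sequential state estimation in studies like this one can be sketched generically. The toy below is a stochastic EnKF with a scalar observation and invented numbers, not the SPAc/MODIS configuration.

```python
import numpy as np

def enkf_update(ensemble, y_obs, H, r_var, rng):
    """Stochastic ensemble Kalman filter analysis step.

    ensemble : (n_ens, n_state) forecast states
    y_obs    : scalar observed value
    H        : (n_state,) linear observation operator
    r_var    : observation-error variance
    Each member is updated against its own perturbed observation.
    """
    X = np.asarray(ensemble, dtype=float)
    n_ens = X.shape[0]
    Xm = X - X.mean(axis=0)            # ensemble anomalies
    Hx = X @ H                         # predicted observations, (n_ens,)
    Hxm = Hx - Hx.mean()
    # Sample covariances P H^T and H P H^T estimated from the ensemble.
    PHt = Xm.T @ Hxm / (n_ens - 1)     # (n_state,)
    HPHt = Hxm @ Hxm / (n_ens - 1)     # scalar
    K = PHt / (HPHt + r_var)           # Kalman gain, (n_state,)
    y_pert = y_obs + rng.normal(0.0, np.sqrt(r_var), n_ens)
    return X + np.outer(y_pert - Hx, K)

rng = np.random.default_rng(3)
truth = np.array([1.0, 2.0])                 # e.g. (LAI, carbon pool), invented
ens = truth + rng.normal(0.0, 1.0, (50, 2))  # prior ensemble, spread 1.0
H = np.array([1.0, 0.0])                     # only the first state is observed
y = truth @ H + rng.normal(0.0, 0.1)         # accurate observation
analysis = enkf_update(ens, y, H, 0.01, rng)
```

The analysis mean moves toward the accurate observation and the ensemble spread contracts, which is exactly the behaviour exploited when assimilating vegetation-index observations into the crop model states.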
Successful vectorization - reactor physics Monte Carlo code
International Nuclear Information System (INIS)
Martin, W.R.
1989-01-01
Most particle transport Monte Carlo codes in use today are based on the "history-based" algorithm, wherein one particle history at a time is simulated. Unfortunately, the history-based approach (present in all Monte Carlo codes until recent years) is inherently scalar and cannot be vectorized. In particular, the history-based algorithm cannot take advantage of vector architectures, which characterize the largest and fastest computers of the current time, vector supercomputers such as the Cray X/MP or IBM 3090/600. However, substantial progress has been made in recent years in developing and implementing a vectorized Monte Carlo algorithm. This algorithm follows portions of many particle histories at the same time and forms the basis for all successful vectorized Monte Carlo codes in use today. This paper describes the basic vectorized algorithm along with several variations developed by different researchers for specific applications, mainly in the areas of neutron transport in nuclear reactor and shielding analysis and photon transport in fusion plasmas. The relative merits of the various schemes are discussed, and the present status of known vectorization efforts is summarized along with available timing results, including results from the successful vectorization of 3-D general geometry, continuous energy Monte Carlo. (orig.)
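The contrast between the two algorithms can be caricatured in a few lines: the history-based version is an irreducibly scalar loop over particles, while the event-based version advances the whole surviving population one event at a time and can exploit vector hardware. The toy physics below (scatter until absorbed, with a fixed absorption probability) is invented for illustration and is far simpler than real transport.

```python
import numpy as np

def history_based(n, p_absorb, rng):
    """Scalar style: follow one particle history at a time to completion."""
    collisions = 0
    for _ in range(n):
        collisions += 1                 # count the current collision
        while rng.random() > p_absorb:  # survived: scatter again
            collisions += 1
    return collisions

def event_based(n, p_absorb, rng):
    """Vectorized style: process one event for the whole surviving
    population at once, compressing absorbed particles out of the stack."""
    alive, collisions = n, 0
    while alive:
        collisions += alive             # every survivor undergoes a collision
        # Survivors of this event scatter on; the rest are absorbed.
        alive = int((rng.random(alive) > p_absorb).sum())
    return collisions

n, p = 100_000, 0.5
mean_hist = history_based(n, p, np.random.default_rng(0)) / n
mean_event = event_based(n, p, np.random.default_rng(1)) / n
```

Both estimate the expected 1/p = 2 collisions per history; only the second organizes the work as long vector operations over the surviving particles, which is the structural idea behind the vectorized codes described above.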
Mathematical foundations of hybrid data assimilation from a synchronization perspective
Penny, Stephen G.
2017-12-01
The state-of-the-art data assimilation methods used today in operational weather prediction centers around the world can be classified as generalized one-way coupled impulsive synchronization. This classification permits the investigation of hybrid data assimilation methods, which combine dynamic error estimates of the system state with long time-averaged (climatological) error estimates, from a synchronization perspective. Illustrative results show how dynamically informed formulations of the coupling matrix (via an Ensemble Kalman Filter, EnKF) can lead to synchronization when observing networks are sparse and how hybrid methods can lead to synchronization when those dynamic formulations are inadequate (due to small ensemble sizes). A large-scale application with a global ocean general circulation model is also presented. Results indicate that the hybrid methods also have useful applications in generalized synchronization, in particular, for correcting systematic model errors.
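One-way coupled impulsive synchronization, the classification given above, can be demonstrated on a toy chaotic system: a receiver copy of Lorenz-63 is nudged toward the transmitter's observed x-component only, and only at discrete observation times. All parameters below (gain, observation interval, integrator) are illustrative choices, not those of any operational system.

```python
import numpy as np

def lorenz_step(state, dt=0.002):
    """One forward-Euler step of the Lorenz-63 system (toy integrator)."""
    x, y, z = state
    rhs = np.array([10.0 * (y - x),
                    x * (28.0 - z) - y,
                    x * y - (8.0 / 3.0) * z])
    return state + dt * rhs

# Transmitter ('truth') and a receiver started far away. Every 5 steps the
# receiver's observed component is impulsively corrected by K*(y_obs - H x),
# with H observing only the x-component: a sparse observing network.
truth = np.array([1.0, 1.0, 1.0])
recv = np.array([8.0, -5.0, 30.0])
K = 0.5
for step in range(25_000):
    truth = lorenz_step(truth)
    recv = lorenz_step(recv)
    if step % 5 == 0:
        recv[0] += K * (truth[0] - recv[0])  # impulsive nudge, observed part only
err = float(np.linalg.norm(truth - recv))    # receiver has synchronized
```

The unobserved y and z components lock on through the dynamics alone, which is the sense in which a well-chosen coupling matrix (here a trivial scalar gain) achieves synchronization despite sparse observations.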
Ensemble-Based Data Assimilation in Reservoir Characterization: A Review
Directory of Open Access Journals (Sweden)
Seungpil Jung
2018-02-01
Full Text Available This paper presents a review of ensemble-based data assimilation for strongly nonlinear problems on the characterization of heterogeneous reservoirs with different production histories. It concentrates on ensemble Kalman filter (EnKF and ensemble smoother (ES as representative frameworks, discusses their pros and cons, and investigates recent progress to overcome their drawbacks. The typical weaknesses of ensemble-based methods are non-Gaussian parameters, improper prior ensembles and finite population size. Three categorized approaches, to mitigate these limitations, are reviewed with recent accomplishments; improvement of Kalman gains, add-on of transformation functions, and independent evaluation of observed data. The data assimilation in heterogeneous reservoirs, applying the improved ensemble methods, is discussed on predicting unknown dynamic data in reservoir characterization.
GPS: A Good Candidate for Data Assimilation?
Poli, P.; Joiner, J.; Kursinski, R.; Einaudi, Franco (Technical Monitor)
2000-01-01
The Global Positioning System (GPS) enables positioning anywhere about our planet. The microwave signals sent by the 24 transmitters are sensitive to the atmosphere. Using the radio occultation technique, it is possible to perform soundings with a GPS receiver on a Low Earth Orbiter (700 km). Insensitivity to clouds and aerosols, relatively high vertical resolution (1.5 km), and the self-calibration and stability of the GPS make it a priori a potentially good observing-system candidate for data assimilation. A simple, low-computing-cost method to retrieve both temperature and humidity will be presented. Comparisons with radiosondes show the capability of the GPS to resolve the tropopause. Options for using GPS for data assimilation and remaining issues will be discussed.
Regulation of assimilate partitioning by daylength and spectral quality
Energy Technology Data Exchange (ETDEWEB)
Britz, S.J. [USDA-Climate Stress Lab., Beltsville, MD (United States)
1994-12-31
Photosynthesis is the process by which plants utilize light energy to assimilate and transform carbon dioxide into products that support growth and development. The preceding review provides an excellent summary of photosynthetic mechanisms and diurnal patterns of carbon metabolism with emphasis on the importance of gradual changes in photosynthetically-active radiation at dawn and dusk. In addition to these direct effects of irradiance, there are indirect effects of light period duration and spectral quality on carbohydrate metabolism and assimilate partitioning. Both daylength and spectral quality trigger developmental phenomena such as flowering (e.g., photoperiodism) and shade avoidance responses, but their effects on partitioning of photoassimilates in leaves are less well known. Moreover, the adaptive significance to the plants of such effects is sometimes not clear.
Herbicides effect on the nitrogen fertilizer assimilation by sensitive plants
International Nuclear Information System (INIS)
Ladonin, V.F.; Samojlov, L.N.
1976-01-01
In studying the effect of herbicides on pea plants, it has been established that the penetration of the preparations into the tissues of leaves and stems results in a slight increase in the rate of formation of dry matter in the leaves of treated plants within 24 hours after treatment as compared with the control, whereas at later analysis times the herbicides strongly inhibit the formation of dry matter in leaves. The applied herbicide doses resulted in drastic changes in the distribution of plant-assimilated nitrogen between the protein and non-protein fractions in the leaves and stems of pea. Under the influence of the studied herbicides, the fertilizer nitrogen supply to the pea plants changes, and the rate of fertilizer nitrogen assimilation by the plants varies noticeably. The regularities of the incorporation of fertilizer nitrogen into the protein and non-protein nitrogen compounds of the above-ground pea organs have been studied.
Investigation of glycerol assimilation and cofactor metabolism in Lactococcus lactis
DEFF Research Database (Denmark)
Holm, Anders Koefoed
: anaerobic, aerobic and respiration permissive growth in combination with either glycerol as a sole substrate or with co-metabolization of glycerol with common sugar substrates. Although no growth on glycerol was seen, both positive and detrimental effects were observed from cultures with glycerol...... itself under both anaerobic and respiration permissive conditions, but was not found to have the same profound effect on other sugar substrates such as galactose or ribose. Supplementation of nucleosides to the growth medium or increased substrate concentration were found to counteract the inhibitory...... of glycerol kinase from L. lactis, introduction of a heterologous glycerol assimilation pathway and construction of a library of NADH oxidase activity. Based on a preliminary analysis of transcription level data, an attempt was made to stimulate glycerol assimilation by overexpressing the glycerol kinase...
Background error covariance estimation for atmospheric CO2 data assimilation
Chatterjee, Abhishek; Engelen, Richard J.; Kawa, Stephan R.; Sweeney, Colm; Michalak, Anna M.
2013-09-01
In any data assimilation framework, the background error covariance statistics play the critical role of filtering the observed information and determining the quality of the analysis. For atmospheric CO2 data assimilation, however, the background errors cannot be prescribed via traditional forecast or ensemble-based techniques as these fail to account for the uncertainties in the carbon emissions and uptake, or for the errors associated with the CO2 transport model. We propose an approach where the differences between two modeled CO2 concentration fields, based on different but plausible CO2 flux distributions and atmospheric transport models, are used as a proxy for the statistics of the background errors. The resulting error statistics: (1) vary regionally and seasonally to better capture the uncertainty in the background CO2 field, and (2) have a positive impact on the analysis estimates by allowing observations to adjust predictions over large areas. A state-of-the-art four-dimensional variational (4D-VAR) system developed at the European Centre for Medium-Range Weather Forecasts (ECMWF) is used to illustrate the impact of the proposed approach for characterizing background error statistics on atmospheric CO2 concentration estimates. Observations from the Greenhouse gases Observing SATellite "IBUKI" (GOSAT) are assimilated into the ECMWF 4D-VAR system along with meteorological variables, using both the new error statistics and those based on a traditional forecast-based technique. Evaluation of the four-dimensional CO2 fields against independent CO2 observations confirms that the performance of the data assimilation system improves substantially in the summer, when significant variability and uncertainty in the fluxes are present.
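The proxy idea can be sketched numerically: if two plausible model realizations carry independent errors of similar size, the statistics of their difference overestimate a single model's error variance by exactly a factor of two. The sketch below uses invented fields and assumes equal, independent errors, a simplification of the paper's regionally and seasonally varying statistics.

```python
import numpy as np

rng = np.random.default_rng(7)

# Two plausible modeled CO2 fields (ppm) on a toy grid: a shared underlying
# field plus model-dependent errors of (assumed) equal, independent size.
base = 400.0 + rng.normal(0.0, 2.0, size=(60, 90))
model_a = base + rng.normal(0.0, 1.0, size=base.shape)
model_b = base + rng.normal(0.0, 1.0, size=base.shape)

# If the two model errors are independent with variance s^2, then
# var(a - b) = 2 s^2, so the difference field yields a proxy for the
# background-error standard deviation s of a single model.
diff = model_a - model_b
bg_err_std = float(diff.std() / np.sqrt(2.0))
```

Computed gridpoint-by-gridpoint over regions and seasons rather than globally, such difference statistics give the spatially and temporally varying background errors the abstract describes.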
Variational Assimilation of Glider Data in Monterey Bay
2011-01-01
gliders and ten Slocum gliders were deployed in the Monterey Bay region, collecting temperature and salinity profiles (Ramp et al., 2008). Authors: Chudong Pan, Max... Temperature and salinity profiles observed by gliders in the Monterey Bay in August 2003 are assimilated into the NCOM model in the framework of a 3dVar scheme with a hybrid background error
Nitrogen and sulfur assimilation in plants and algae
Czech Academy of Sciences Publication Activity Database
Giordano, Mario; Raven, John A.
2014-01-01
Vol. 118, No. 2 (2014), pp. 45-61. ISSN 0304-3770. Grants - others: University of Dundee (GB) SC 015096; Italian Ministry for Agriculture (IT) MIPAF, Bioforme project; Italian Ministry of Foreign Affairs (IT) MAE, Joint Italian-Israel Cooperation Program. Institutional support: RVO:61388971. Keywords: nitrogen * sulfur * assimilation * algae. Subject RIV: EE - Microbiology, Virology. Impact factor: 1.608, year: 2014
The Role of Data Assimilation in Model Diagnostics
Nearing, G. S.; Ruddell, B.; Clark, M. P.; Nijssen, B.
2016-12-01
To make reliable predictions under nonstationary conditions we must build physically realistic models. However, because hydrological systems are compendiums of dynamic interactions between many different components and processes, it is difficult to use a direct comparison between model predictions and observations of integrated system responses to determine exactly which process representation(s) in any given model might contribute to or compensate for prediction error. Hydrologists and Earth Systems Modelers might recognize this problem as one of model diagnostics; however, this is a classical problem that affects almost all aspects of scientific investigation: it is the confirmation-holism aspect of the Duhem-Quine thesis. We propose here a systematic way to attack this problem that is based on a fundamental logic-based interpretation of the Duhem-Quine problem. The general idea is that diagnostic evaluation of complex systems models will involve tracking information flows through and between different interacting components in a given model structure, and that it is actually these information flows that we should wish to validate, evaluate or benchmark against observations. The problem is that we rarely have observations of all pertinent states and fluxes at all relevant spatiotemporal scales, and we propose that the fundamental resolution to this problem is data assimilation. The key insight is that data assimilation is simply the projection of information onto the states of a dynamical systems model. We discuss the implications of this for doing science and making predictions with coupled land surface hydrology models, as well as the risks associated with using sub-optimal data assimilation strategies. We will finally outline a few application examples where we found that land surface models apparently have a systematic problem with underestimating the connectivity between hydrological and ecological processes.
We will use these examples to show how data assimilation
Derivation of sequential, real-time, process-control programs
Marzullo, Keith; Schneider, Fred B.; Budhiraja, Navin
1991-01-01
The use of weakest-precondition predicate transformers in the derivation of sequential, process-control software is discussed. Only one extension to Dijkstra's calculus for deriving ordinary sequential programs was found to be necessary: function-valued auxiliary variables. These auxiliary variables are needed for reasoning about states of a physical process that exists during program transitions.
Factor screening for simulation with multiple responses : Sequential bifurcation
Shi, W.; Kleijnen, J.P.C.; Liu, Z.
2014-01-01
The goal of factor screening is to find the really important inputs (factors) among the many inputs that may be changed in a realistic simulation experiment. A specific method is sequential bifurcation (SB), which is a sequential method that changes groups of inputs simultaneously. SB is most
Factor Screening for Simulation with Multiple Responses : Sequential Bifurcation
Shi, W.; Kleijnen, Jack P.C.; Liu, Zhixue
2012-01-01
Abstract: Factor screening searches for the really important inputs (factors) among the many inputs that are changed in a realistic simulation experiment. Sequential bifurcation (or SB) is a sequential method that changes groups of inputs simultaneously. SB is the most efficient and effective method
Factor Screening For Simulation With Multiple Responses : Sequential Bifurcation
Shi, W.; Kleijnen, Jack P.C.; Liu, Zhixue
2013-01-01
Abstract: Factor screening searches for the really important inputs (factors) among the many inputs that are changed in a realistic simulation experiment. Sequential bifurcation (SB) is a sequential method that changes groups of inputs simultaneously. SB is the most efficient and effective method if
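The group-screening idea shared by the sequential bifurcation abstracts above can be sketched for the simplest case: a deterministic simulation with known-positive first-order effects. This is a pedagogical sketch under those assumptions, not the multi-response SB procedure of the papers.

```python
def sequential_bifurcation(simulate, n_factors, threshold):
    """Minimal sequential-bifurcation sketch.

    `simulate(x)` takes a tuple of 0/1 factor levels and returns a
    scalar output. With all effects known to be non-negative, the joint
    effect of a factor group i..j-1 is y(first j high) - y(first i high);
    groups whose effect exceeds `threshold` are bisected until the
    important individual factors are isolated."""
    cache = {}

    def y(k):
        # Output with the first k factors at their high level
        if k not in cache:
            cache[k] = simulate(tuple(1 if i < k else 0
                                      for i in range(n_factors)))
        return cache[k]

    important = []
    groups = [(0, n_factors)]            # half-open index ranges
    while groups:
        lo, hi = groups.pop()
        if y(hi) - y(lo) <= threshold:
            continue                     # whole group unimportant
        if hi - lo == 1:
            important.append(lo)         # single important factor found
        else:
            mid = (lo + hi) // 2
            groups.extend([(lo, mid), (mid, hi)])
    return sorted(important)

# Toy example: 8 factors, only factors 2 and 5 matter
betas = [0, 0, 4.0, 0, 0, 7.0, 0, 0]
f = lambda x: sum(b * xi for b, xi in zip(betas, x))
print(sequential_bifurcation(f, 8, threshold=1.0))  # [2, 5]
```

Because unimportant groups are discarded wholesale, the number of simulation runs grows roughly with the number of important factors times the logarithm of the total number of factors, which is what makes SB attractive when factors are many and important ones few.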
A Survey of Multi-Objective Sequential Decision-Making
Roijers, D.M.; Vamplew, P.; Whiteson, S.; Dazeley, R.
2013-01-01
Sequential decision-making problems with multiple objectives arise naturally in practice and pose unique challenges for research in decision-theoretic planning and learning, which has largely focused on single-objective settings. This article surveys algorithms designed for sequential
Accounting for Heterogeneous Returns in Sequential Schooling Decisions
Zamarro, G.
2006-01-01
This paper presents a method for estimating returns to schooling that takes into account that returns may be heterogeneous among agents and that educational decisions are made sequentially. A sequential decision model is interesting because it explicitly considers that the level of education of each
The sequential price of anarchy for atomic congestion games
de Jong, Jasper; Uetz, Marc Jochen; Liu, Tie-Yan; Qi, Qi; Ye, Yinyu
2014-01-01
In situations without central coordination, the price of anarchy relates the quality of any Nash equilibrium to the quality of a global optimum. Instead of assuming that all players choose their actions simultaneously, we consider games where players choose their actions sequentially. The sequential
Applications of Bayesian decision theory to sequential mastery testing
Vos, Hendrik J.
1999-01-01
The purpose of this paper is to formulate optimal sequential rules for mastery tests. The framework for the approach is derived from Bayesian sequential decision theory. Both a threshold and linear loss structure are considered. The binomial probability distribution is adopted as the psychometric
Quantum Probability Zero-One Law for Sequential Terminal Events
Rehder, Wulf
1980-07-01
On the basis of the Jauch-Piron quantum probability calculus a zero-one law for sequential terminal events is proven, and the significance of certain crucial axioms in the quantum probability calculus is discussed. The result shows that the Jauch-Piron set of axioms is appropriate for the non-Boolean algebra of sequential events.
The Clinical effectiveness of sequential treatment of skeletal class III ...
African Journals Online (AJOL)
Aim: To assess the dentofacial changes induced by the sequential treatment in the skeletal class III malocclusion with maxillary retrognathism. Study design: Controlled clinical trial assessing the effectiveness of sequential treatment of skeletal class III malocclusion. Materials and Methods: The treated group consisted of 30 ...
Lag Sequential Analysis: Taking Consultation Communication Research to the Movies.
Benes, Kathryn M.; Gutkin, Terry B.; Kramer, Jack J.
1995-01-01
Describes lag-sequential analysis and its unique contributions to research literature, addressing communication processes in school-based consultation. For purposes of demonstrating the application and potential utility of lag-sequential analysis, article analyzes the communication behaviors of two consultants. Considers directions for future…
Sequential injection spectrophotometric determination of V(V) in ...
African Journals Online (AJOL)
Sequential injection spectrophotometric determination of V(V) in environmental polluted waters. ES Silva, PCAG Pinto, JLFC Lima, MLMFS Saraiva. Abstract. A fast and robust sequential injection analysis (SIA) methodology for routine determination of V(V) in environmental polluted waters is presented. The determination ...
Monte Carlo simulation of Markov unreliability models
International Nuclear Information System (INIS)
Lewis, E.E.; Boehm, F.
1984-01-01
A Monte Carlo method is formulated for the evaluation of the unreliability of complex systems with known component failure and repair rates. The formulation is in terms of a Markov process allowing dependences between components to be modeled and computational efficiencies to be achieved in the Monte Carlo simulation. Two variance reduction techniques, forced transition and failure biasing, are employed to increase computational efficiency of the random walk procedure. For an example problem these result in improved computational efficiency by more than three orders of magnitude over analog Monte Carlo. The method is generalized to treat problems with distributed failure and repair rate data, and a batching technique is introduced and shown to result in substantial increases in computational efficiency for an example problem. A method for separating the variance due to the data uncertainty from that due to the finite number of random walks is presented. (orig.)
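The forced-transition idea mentioned in the abstract above can be illustrated on the smallest possible case, a single component with a constant failure rate, where the exact unreliability is known in closed form. This is a didactic sketch, not the paper's multi-component Markov scheme.

```python
import math
import random

def unreliability_mc(lam, t_mission, n_walks, forced=False, seed=1):
    """Monte Carlo estimate of P(first failure < t_mission) for one
    component with constant failure rate `lam`; the exact answer is
    1 - exp(-lam * t_mission), which makes the estimators easy to check.

    Analog sampling scores a history only when the sampled failure time
    lands inside the mission, so most histories are wasted when lam*t is
    small. With forced=True, the failure is forced to occur by sampling
    from the exponential truncated to (0, t_mission) and weighting every
    history by p = 1 - exp(-lam * t_mission)."""
    rng = random.Random(seed)
    p_force = 1.0 - math.exp(-lam * t_mission)
    total = 0.0
    for _ in range(n_walks):
        if forced:
            u = rng.random() * p_force          # inverse CDF on (0, p)
            t = -math.log(1.0 - u) / lam        # truncated exponential
            total += p_force                    # likelihood-ratio weight
        else:
            t = -math.log(1.0 - rng.random()) / lam
            total += 1.0 if t < t_mission else 0.0
    return total / n_walks

exact = 1.0 - math.exp(-1e-2 * 10.0)
analog = unreliability_mc(1e-2, 10.0, n_walks=100_000)
forced = unreliability_mc(1e-2, 10.0, n_walks=100, forced=True)
```

For a single component the forced estimator is trivially exact with zero variance; in a real system the forcing is applied transition by transition along the random walk, and failure biasing additionally tilts which transition is taken, with the same likelihood-ratio bookkeeping.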
Adiabatic optimization versus diffusion Monte Carlo methods
Jarret, Michael; Jordan, Stephen P.; Lackey, Brad
2016-10-01
Most experimental and theoretical studies of adiabatic optimization use stoquastic Hamiltonians, whose ground states are expressible using only real nonnegative amplitudes. This raises a question as to whether classical Monte Carlo methods can simulate stoquastic adiabatic algorithms with polynomial overhead. Here we analyze diffusion Monte Carlo algorithms. We argue that, based on differences between L1- and L2-normalized states, these algorithms suffer from certain obstructions preventing them from efficiently simulating stoquastic adiabatic evolution in generality. In practice, however, we obtain good performance by introducing a method that we call Substochastic Monte Carlo. In fact, our simulations are good classical optimization algorithms in their own right, competitive with the best previously known heuristic solvers for MAX-k-SAT at k = 2, 3, 4.
Shell model the Monte Carlo way
International Nuclear Information System (INIS)
Ormand, W.E.
1995-01-01
The formalism for the auxiliary-field Monte Carlo approach to the nuclear shell model is presented. The method is based on a linearization of the two-body part of the Hamiltonian in an imaginary-time propagator using the Hubbard-Stratonovich transformation. The foundation of the method, as applied to the nuclear many-body problem, is discussed. Topics presented in detail include: (1) the density-density formulation of the method, (2) computation of the overlaps, (3) the sign of the Monte Carlo weight function, (4) techniques for performing Monte Carlo sampling, and (5) the reconstruction of response functions from an imaginary-time auto-correlation function using MaxEnt techniques. Results obtained using schematic interactions, which have no sign problem, are presented to demonstrate the feasibility of the method, while an extrapolation method for realistic Hamiltonians is presented. In addition, applications at finite temperature are outlined
Shell model the Monte Carlo way
Energy Technology Data Exchange (ETDEWEB)
Ormand, W.E.
1995-03-01
The formalism for the auxiliary-field Monte Carlo approach to the nuclear shell model is presented. The method is based on a linearization of the two-body part of the Hamiltonian in an imaginary-time propagator using the Hubbard-Stratonovich transformation. The foundation of the method, as applied to the nuclear many-body problem, is discussed. Topics presented in detail include: (1) the density-density formulation of the method, (2) computation of the overlaps, (3) the sign of the Monte Carlo weight function, (4) techniques for performing Monte Carlo sampling, and (5) the reconstruction of response functions from an imaginary-time auto-correlation function using MaxEnt techniques. Results obtained using schematic interactions, which have no sign problem, are presented to demonstrate the feasibility of the method, while an extrapolation method for realistic Hamiltonians is presented. In addition, applications at finite temperature are outlined.
Off-diagonal expansion quantum Monte Carlo.
Albash, Tameem; Wagenbreth, Gene; Hen, Itay
2017-12-01
We propose a Monte Carlo algorithm designed to simulate quantum as well as classical systems at equilibrium, bridging the algorithmic gap between quantum and classical thermal simulation algorithms. The method is based on a decomposition of the quantum partition function that can be viewed as a series expansion about its classical part. We argue that the algorithm not only provides a theoretical advancement in the field of quantum Monte Carlo simulations, but is optimally suited to tackle quantum many-body systems that exhibit a range of behaviors from "fully quantum" to "fully classical," in contrast to many existing methods. We demonstrate the advantages, sometimes by orders of magnitude, of the technique by comparing it against existing state-of-the-art schemes such as path integral quantum Monte Carlo and stochastic series expansion. We also illustrate how our method allows for the unification of quantum and classical thermal parallel tempering techniques into a single algorithm and discuss its practical significance.
Off-diagonal expansion quantum Monte Carlo
Albash, Tameem; Wagenbreth, Gene; Hen, Itay
2017-12-01
We propose a Monte Carlo algorithm designed to simulate quantum as well as classical systems at equilibrium, bridging the algorithmic gap between quantum and classical thermal simulation algorithms. The method is based on a decomposition of the quantum partition function that can be viewed as a series expansion about its classical part. We argue that the algorithm not only provides a theoretical advancement in the field of quantum Monte Carlo simulations, but is optimally suited to tackle quantum many-body systems that exhibit a range of behaviors from "fully quantum" to "fully classical," in contrast to many existing methods. We demonstrate the advantages, sometimes by orders of magnitude, of the technique by comparing it against existing state-of-the-art schemes such as path integral quantum Monte Carlo and stochastic series expansion. We also illustrate how our method allows for the unification of quantum and classical thermal parallel tempering techniques into a single algorithm and discuss its practical significance.
Random Numbers and Monte Carlo Methods
Scherer, Philipp O. J.
Many-body problems often involve the calculation of integrals of very high dimension which cannot be treated by standard methods. For the calculation of thermodynamic averages, Monte Carlo methods, which sample the integration volume at randomly chosen points, are very useful. After summarizing some basic statistics, we discuss algorithms for the generation of pseudo-random numbers with a given probability distribution, which are essential for all Monte Carlo methods. We show how the efficiency of Monte Carlo integration can be improved by preferentially sampling the important configurations. Finally, the famous Metropolis algorithm is applied to classical many-particle systems. Computer experiments visualize the central limit theorem and apply the Metropolis method to the traveling salesman problem.
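The Metropolis accept/reject rule described in this abstract can be shown in its simplest form: sampling a one-dimensional standard normal target with a symmetric random-walk proposal. A minimal sketch, not tied to the book's many-particle examples.

```python
import math
import random

def metropolis_normal(n_samples, step=1.0, seed=2):
    """Minimal Metropolis sampler targeting the standard normal density
    (unnormalized weight exp(-x**2 / 2)): propose a symmetric uniform
    step, then accept with probability min(1, pi(x_new) / pi(x))."""
    rng = random.Random(seed)
    x = 0.0
    samples = []
    for _ in range(n_samples):
        x_new = x + rng.uniform(-step, step)   # symmetric proposal
        # Acceptance ratio pi(x_new)/pi(x) = exp((x^2 - x_new^2)/2)
        if rng.random() < math.exp(0.5 * (x * x - x_new * x_new)):
            x = x_new                          # accept; else keep x
        samples.append(x)
    return samples

s = metropolis_normal(100_000)
mean = sum(s) / len(s)
var = sum(xi * xi for xi in s) / len(s) - mean * mean
print(round(mean, 2), round(var, 2))
```

Because rejected proposals repeat the current state, the chain's stationary distribution is the target density even though its normalizing constant is never computed, which is exactly what makes the method usable for high-dimensional thermodynamic averages.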
Monte Carlo strategies in scientific computing
Liu, Jun S
2008-01-01
This paperback edition is a reprint of the 2001 Springer edition. This book provides a self-contained and up-to-date treatment of the Monte Carlo method and develops a common framework under which various Monte Carlo techniques can be "standardized" and compared. Given the interdisciplinary nature of the topics and a moderate prerequisite for the reader, this book should be of interest to a broad audience of quantitative researchers such as computational biologists, computer scientists, econometricians, engineers, probabilists, and statisticians. It can also be used as the textbook for a graduate-level course on Monte Carlo methods. Many problems discussed in the later chapters can be potential thesis topics for masters' or PhD students in statistics or computer science departments. Jun Liu is Professor of Statistics at Harvard University, with a courtesy Professor appointment at the Harvard Biostatistics Department. Professor Liu was the recipient of the 2002 COPSS Presidents' Award, the most prestigious one for sta...
Directory of Open Access Journals (Sweden)
Chun Yang
2016-06-01
Full Text Available A method to assimilate all-sky radiances from the Advanced Microwave Scanning Radiometer 2 (AMSR2) was developed within the Weather Research and Forecasting (WRF) model's data assimilation (WRFDA) system. The four essential elements are: (1) extending the Community Radiative Transfer Model's (CRTM) interface to include hydrometeor profiles; (2) using total water Qt as the moisture control variable; (3) using a warm-rain physics scheme for partitioning the Qt increment into individual increments of water vapour, cloud liquid water and rain; and (4) adopting a symmetric observation error model for all-sky radiance assimilation. Compared to a benchmark experiment with no AMSR2 data, the impact of assimilating clear-sky or all-sky AMSR2 radiances on the analysis and forecast of Hurricane Sandy (2012) was assessed through analysis/forecast cycling experiments using WRF and WRFDA's three-dimensional variational (3DVAR) data assimilation scheme. With more cloud/precipitation-affected data being assimilated around tropical cyclone (TC) core areas in the all-sky AMSR2 assimilation experiment, better analyses were obtained in terms of the TC's central sea level pressure (CSLP), warm-core structure and cloud distribution. Substantial (>20 %) error reduction in track and CSLP forecasts was achieved from both clear-sky and all-sky AMSR2 assimilation experiments, and this improvement was consistent from the analysis time to 72-h forecasts. Moreover, the all-sky assimilation experiment consistently yielded better track and CSLP forecasts than the clear-sky one did for all forecast lead times, due to a better analysis in the TC core areas. Positive forecast impact from assimilating AMSR2 radiances is also seen when verified against the European Centre for Medium-Range Weather Forecasts (ECMWF) analysis and the Stage IV precipitation analysis, with an overall larger positive impact from the all-sky assimilation experiment.
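The symmetric observation error model named in element (4) of the abstract above can be sketched as an error that grows with the average of the observed and model-simulated cloud amounts, so large cloud mismatches are de-weighted rather than rejected. The linear shape and all coefficient values below are illustrative assumptions, not those of the WRFDA implementation.

```python
def symmetric_obs_error(c_obs, c_model, e_clear, e_cloudy, c_cloudy=0.5):
    """Sketch of a symmetric observation-error model for all-sky
    radiance assimilation: the assigned error standard deviation ramps
    linearly from `e_clear` to `e_cloudy` as the symmetric cloud
    predictor (mean of observed and simulated cloud amounts) rises from
    0 to the saturation point `c_cloudy`, then stays flat."""
    c_sym = 0.5 * (c_obs + c_model)            # symmetric cloud predictor
    if c_sym >= c_cloudy:
        return e_cloudy
    return e_clear + (e_cloudy - e_clear) * c_sym / c_cloudy

print(symmetric_obs_error(0.0, 0.0, 2.0, 20.0))  # 2.0  (clear sky)
print(symmetric_obs_error(0.8, 0.6, 2.0, 20.0))  # 20.0 (cloudy)
```

Using the mean of both cloud amounts, rather than the observed one alone, keeps the assigned error large whenever either the model or the observation is cloudy, which is what lets cloud-affected radiances enter the analysis with appropriately reduced weight.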