WorldWideScience

Sample records for adaptive importance sampling

  1. Adaptive Importance Sampling in Particle Filtering

    Šmídl, Václav; Hofman, Radek

    Istanbul: ISIF, 2013. ISBN 978-605-86311-1-3. [16th International Conference on Information Fusion, Istanbul (TR), 09.07.2013-12.07.2013]. R&D Projects: GA MV VG20102013018; GA ČR(CZ) GAP102/11/0437. Keywords: importance sampling; sequential Monte Carlo; sufficient statistics. Subject RIV: BC - Control Systems Theory. http://library.utia.cas.cz/separaty/2013/AS/smidl-adaptive importance sampling in particle filtering.pdf

  2. Adaptive Importance Sampling for Control and Inference

    Kappen, H. J.; Ruiz, H. C.

    2016-03-01

    Path integral (PI) control problems are a restricted class of non-linear control problems that can be solved formally as a Feynman-Kac PI and can be estimated using Monte Carlo sampling. In this contribution we review PI control theory in the finite horizon case. We subsequently focus on the problem of how to compute and represent control solutions, and review the methods most commonly used in robotics and control. Within PI theory, the question of how to compute the control becomes a question of importance sampling. Efficient importance samplers are state feedback controllers, and using these requires an efficient representation. Learning and representing effective state-feedback controllers for non-linear stochastic control problems is a very challenging, and largely unsolved, problem. We show how to learn and represent such controllers using ideas from the cross entropy method, and derive a gradient descent method that allows one to learn feedback controllers with an arbitrary parametrisation. We refer to this method as the path integral cross entropy method, or PICE, and illustrate it on some simple examples. PI control methods can also be used to estimate the posterior distribution in latent state models. In neuroscience such problems arise when estimating connectivity from neural recording data using EM. We demonstrate the PI control method as an accurate alternative to particle filtering.
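
    The cross-entropy idea behind PICE can be illustrated in a few lines. The sketch below adapts the mean and width of a Gaussian importance sampler by weighted maximum likelihood on a toy one-dimensional problem; the cost function and all constants are invented for illustration, and the paper's feedback-controller parametrisation is not reproduced.

```python
import numpy as np

# Cross-entropy adaptation of a Gaussian importance sampler (toy sketch).
# Target: tilted density q*(x) ∝ p(x) exp(-J(x)) with base p = N(0,1) and
# an invented cost J; here q* is N(1.5, 0.5), so the loop should find it.
rng = np.random.default_rng(0)

def cost(x):                         # hypothetical cost function
    return 0.5 * (x - 3.0) ** 2

mu, sigma = 0.0, 1.0                 # initial proposal N(mu, sigma^2)
for _ in range(20):
    x = rng.normal(mu, sigma, size=2000)
    # log importance weights: log p(x) - J(x) - log q(x), constants dropped
    log_p = -0.5 * x ** 2
    log_q = -0.5 * ((x - mu) / sigma) ** 2 - np.log(sigma)
    log_w = log_p - cost(x) - log_q
    w = np.exp(log_w - log_w.max())
    w /= w.sum()
    # cross-entropy step = weighted maximum-likelihood refit of the proposal
    mu = float(np.sum(w * x))
    sigma = float(np.sqrt(np.sum(w * (x - mu) ** 2)))
print(f"adapted proposal: mu={mu:.3f} (exp. 1.5), sigma={sigma:.3f} (exp. 0.707)")
```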

  3. Adaptive importance sampling of random walks on continuous state spaces

    The authors consider adaptive importance sampling for a random walk with scoring in a general state space. Conditions under which exponential convergence to the zero-variance solution occurs are reviewed. These results generalize previous work for finite, discrete state spaces in Kollman (1993) and in Kollman, Baggerly, Cox, and Picard (1996). This paper is intended for nonstatisticians and includes considerable explanatory material.

  4. AIS-BN: An Adaptive Importance Sampling Algorithm for Evidential Reasoning in Large Bayesian Networks

    Cheng, J.; doi:10.1613/jair.764

    2011-01-01

    Stochastic sampling algorithms, while an attractive alternative to exact algorithms in very large Bayesian network models, have been observed to perform poorly in evidential reasoning with extremely unlikely evidence. To address this problem, we propose an adaptive importance sampling algorithm, AIS-BN, that shows promising convergence rates even under extreme conditions and seems to outperform the existing sampling algorithms consistently. Three sources of this performance improvement are (1) two heuristics for initialization of the importance function, based on the theoretical properties of importance sampling in finite-dimensional integrals and the structural advantages of Bayesian networks, (2) a smooth learning method for the importance function, and (3) a dynamic weighting function for combining samples from different stages of the algorithm. We tested the performance of the AIS-BN algorithm against two state-of-the-art general-purpose sampling algorithms, likelihood weighting (Fung and Chang...
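
    For contrast with the adaptive scheme, the fixed-proposal baseline mentioned above, likelihood weighting, is easy to sketch. A minimal version on a hypothetical three-node chain network (all probabilities invented):

```python
import numpy as np

# Likelihood weighting on a made-up chain network A -> B -> C (binary
# nodes), estimating P(A=1 | C=1).  All probabilities are illustrative.
rng = np.random.default_rng(1)
P_A = 0.3                       # P(A=1)
P_B = {0: 0.2, 1: 0.9}          # P(B=1 | A=a)
P_C = {0: 0.05, 1: 0.7}         # P(C=1 | B=b)

num = den = 0.0
for _ in range(100_000):
    a = rng.random() < P_A      # sample non-evidence nodes from the prior
    b = rng.random() < P_B[int(a)]
    w = P_C[int(b)]             # weight = likelihood of the evidence C=1
    num += w * a
    den += w
print("P(A=1 | C=1) ~", num / den)   # exact value: ~0.602
```

    When the evidence is extremely unlikely, almost all of the weight mass above concentrates on a few samples; adapting the importance function toward the posterior, as AIS-BN does, is what keeps the estimator usable in that regime.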

  5. Adaptive Importance Sampling for Performance Evaluation and Parameter Optimization of Communication Systems

    Remondo, David; Srinivasan, Rajan; Nicola, Victor F.; van Etten, Wim C.; Tattje, Henk E.P.

    2000-01-01

    We present new adaptive importance sampling techniques based on stochastic Newton recursions. Their applicability to the performance evaluation of communication systems is studied. Besides bit-error rate (BER) estimation, the techniques are used for system parameter optimization. Two system models that are analytically tractable are employed to demonstrate the validity of the techniques. As an application to situations that are analytically intractable and numerically intensive, the influence...

  6. An adaptive importance sampling algorithm for Bayesian inversion with multimodal distributions

    Parametric uncertainties are encountered in the simulations of many physical systems, and may be reduced by an inverse modeling procedure that calibrates the simulation results to observations of the real system being simulated. Following Bayes' rule, a general approach to inverse modeling problems is to sample from the posterior distribution of the uncertain model parameters given the observations. However, the large number of repetitive forward simulations required in the sampling process can pose a prohibitive computational burden. This difficulty is particularly challenging when the posterior is multimodal. We present in this paper an adaptive importance sampling algorithm to tackle these challenges. Its two essential ingredients are: (1) a Gaussian mixture (GM) model, adaptively constructed as the proposal distribution, to approximate the possibly multimodal target posterior, and (2) a mixture of polynomial chaos (PC) expansions, built according to the GM proposal, as a surrogate model to alleviate the burden of computationally demanding forward model evaluations. In three illustrative examples, the proposed algorithm demonstrates its ability to automatically find a GM proposal with an appropriate number of modes for the problem under study, and to obtain a sample that accurately and efficiently represents the posterior with a limited number of forward simulations
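
    A stripped-down version of the adaptive-mixture idea (without the polynomial chaos surrogate) can be written as a loop of draw/weight/resample/refit steps. Everything below, the bimodal one-dimensional target included, is an illustrative assumption, and scikit-learn's GaussianMixture stands in for the paper's adaptive GM construction:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Draw / weight / resample / refit loop with a Gaussian-mixture proposal.
# The bimodal 1-D target is an illustrative stand-in for a multimodal
# posterior; the paper's polynomial-chaos surrogate is not reproduced.
rng = np.random.default_rng(2)

def log_target(x):               # mixture of N(-3,1) and N(3,1), unnormalized
    return np.logaddexp(-0.5 * (x + 3.0) ** 2, -0.5 * (x - 3.0) ** 2)

gm = GaussianMixture(n_components=2, random_state=0)
gm.fit(rng.normal(0.0, 5.0, size=(500, 1)))        # broad initial proposal
for _ in range(5):
    x, _ = gm.sample(2000)                         # draw from current proposal
    log_w = log_target(x[:, 0]) - gm.score_samples(x)
    w = np.exp(log_w - log_w.max())
    w /= w.sum()
    idx = rng.choice(len(x), size=len(x), p=w)     # weighted resampling
    gm = GaussianMixture(n_components=2, random_state=0).fit(x[idx])
print("adapted component means:", np.sort(gm.means_.ravel()))  # ~ [-3, 3]
```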

  7. An improved adaptive kriging-based importance technique for sampling multiple failure regions of low probability

    The estimation of system failure probabilities can be a difficult task when the values involved are very small, so that sampling-based Monte Carlo methods become computationally impractical, especially if the computer codes used to model the system response require large computational efforts in both time and memory. This paper proposes a modification of an algorithm from the literature for the efficient estimation of small failure probabilities, which combines FORM with an adaptive kriging-based importance sampling strategy (AK-IS). The modification overcomes an important limitation of the original AK-IS by giving the algorithm the flexibility to deal with multiple failure regions characterized by complex, non-linear limit states. The modified algorithm is shown to offer satisfactory results on four case studies from the literature, in general outperforming several alternative methods. - Highlights: • We tackle low failure probability estimation within the reliability analysis context. • We improve a kriging-based importance sampling for estimating failure probabilities. • The new algorithm is capable of dealing with multiple disconnected failure regions. • Its performance is better than that of other methods from the literature on 4 test case studies
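
    The importance sampling half of AK-IS is the classical design-point shift, which is easy to show on its own; the kriging metamodel and the multiple-region extension are the paper's contribution and are not sketched here. The limit state and design point below are assumed for illustration:

```python
import numpy as np

# Importance sampling for a small failure probability P[g(U) <= 0] in
# standard normal space, with the proposal recentred at the design point.
# The limit state g is invented; for it, the design point is u* = (2, 2)
# and the exact answer is 1 - Phi(4 / sqrt(2)) ~ 2.3e-3.
rng = np.random.default_rng(3)

def g(u):                               # failure when g(u) <= 0
    return 4.0 - u[:, 0] - u[:, 1]

u_star = np.array([2.0, 2.0])           # most likely failure point
n = 100_000
u = rng.standard_normal((n, 2)) + u_star          # proposal N(u*, I)
# weight = phi(u) / phi(u - u*), computed in log form for stability
log_w = -0.5 * np.sum(u ** 2, axis=1) + 0.5 * np.sum((u - u_star) ** 2, axis=1)
pf = np.mean(np.exp(log_w) * (g(u) <= 0.0))
print("estimated failure probability:", pf)
```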

  8. Network and adaptive sampling

    Chaudhuri, Arijit

    2014-01-01

    Combining the two statistical techniques of network sampling and adaptive sampling, this book illustrates the advantages of using them in tandem to effectively capture sparsely located elements in unknown pockets. It shows how network sampling is a reliable guide in capturing inaccessible entities through linked auxiliaries. The text also explores how adaptive sampling is strengthened in information content through subsidiary sampling with devices to mitigate unmanageable expanding sample sizes. Empirical data illustrates the applicability of both methods.

  9. ADAPTIVE ANNEALED IMPORTANCE SAMPLING FOR MULTIMODAL POSTERIOR EXPLORATION AND MODEL SELECTION WITH APPLICATION TO EXTRASOLAR PLANET DETECTION

    Liu, Bin, E-mail: bins@ieee.org [School of Computer Science and Technology, Nanjing University of Posts and Telecommunications, Nanjing 210023 (China)

    2014-07-01

    We describe an algorithm that can adaptively provide mixture summaries of multimodal posterior distributions. The parameter space of the involved posteriors ranges in size from a few dimensions to dozens of dimensions. This work was motivated by an astrophysical problem called extrasolar planet (exoplanet) detection, wherein the computation of stochastic integrals that are required for Bayesian model comparison is challenging. The difficulty comes from the highly nonlinear models that lead to multimodal posterior distributions. We resort to importance sampling (IS) to estimate the integrals, which translates the problem into finding a parametric approximation of the posterior. To capture the multimodal structure in the posterior, we initialize a mixture proposal distribution and then carefully tailor its parameters to make it resemble the posterior as closely as possible. We use the effective sample size (ESS), calculated from the IS draws, to measure the degree of approximation: the larger the ESS, the better the proposal resembles the posterior. A difficulty within this tailoring operation lies in adjusting the number of mixing components in the mixture proposal. Brute-force methods simply preset it to a large constant, which increases the required computational resources. We provide an iterative delete/merge/add process, working in tandem with an expectation-maximization step, to tune this number online. The efficiency of our proposed method is tested via both simulation studies and real exoplanet data analysis.
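
    The ESS criterion used to score the proposal is one line of algebra: for normalized weights w_i, ESS = 1 / Σ w_i², which equals the number of draws for a perfect proposal and collapses toward 1 as the weights degenerate. A minimal sketch:

```python
import numpy as np

def effective_sample_size(log_w):
    """ESS of an importance sample from its unnormalized log-weights."""
    w = np.exp(log_w - np.max(log_w))    # stabilize, then normalize
    w /= w.sum()
    return 1.0 / np.sum(w ** 2)

# widely spread log-weights -> ESS far below the nominal 1000 draws
log_w = np.random.default_rng(4).normal(scale=3.0, size=1000)
print(f"ESS = {effective_sample_size(log_w):.1f} out of 1000")
```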

  10. Covariance-Adaptive Slice Sampling

    Thompson, Madeleine; Neal, Radford M.

    2010-01-01

    We describe two slice sampling methods for taking multivariate steps using the crumb framework. These methods use the gradients at rejected proposals to adapt to the local curvature of the log-density surface, a technique that can produce much better proposals when parameters are highly correlated. We evaluate our methods on four distributions and compare their performance to that of a non-adaptive slice sampling method and a Metropolis method. The adaptive methods perform favorably on low-di...

  11. Adaptive sampling for noisy problems

    Cantu-Paz, E

    2004-03-26

    The usual approach to deal with noise present in many real-world optimization problems is to take an arbitrary number of samples of the objective function and use the sample average as an estimate of the true objective value. The number of samples is typically chosen arbitrarily and remains constant for the entire optimization process. This paper studies an adaptive sampling technique that varies the number of samples based on the uncertainty of deciding between two individuals. Experiments demonstrate the effect of adaptive sampling on the final solution quality reached by a genetic algorithm and the computational cost required to find the solution. The results suggest that the adaptive technique can effectively eliminate the need to set the sample size a priori, but in many cases it requires high computational costs.
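
    The stopping rule can be sketched as a sequential comparison that keeps sampling until the estimated difference exceeds a multiple of its standard error; the noise model and thresholds below are illustrative, not the paper's settings:

```python
import numpy as np

# Sequential comparison of two noisy objective values: sample until the
# mean difference clears z standard errors or a budget runs out.
rng = np.random.default_rng(5)

def noisy_f(mean):                        # hypothetical noisy objective
    return mean + rng.normal(0.0, 1.0)

def compare(mean_a, mean_b, z=2.0, n_min=5, n_max=500):
    a, b = [], []
    d = np.zeros(1)
    while len(a) < n_max:
        a.append(noisy_f(mean_a))
        b.append(noisy_f(mean_b))
        if len(a) >= n_min:
            d = np.array(a) - np.array(b)
            se = d.std(ddof=1) / np.sqrt(len(d))
            if abs(d.mean()) > z * se:    # confident enough: stop early
                break
    return len(a), float(d.mean())

print(compare(0.0, 0.5))    # similar individuals -> many samples
print(compare(0.0, 3.0))    # clearly different  -> few samples
```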

  12. Quantization based recursive Importance Sampling

    Frikha, Noufel

    2011-01-01

    In this paper we investigate an alternative to the simulation-based recursive importance sampling procedure for estimating the optimal change of measure in Monte Carlo simulations. We propose an algorithm which combines (vector and functional) optimal quantization with a Newton-Raphson zero search procedure. Our approach can be seen as a robust and automatic deterministic counterpart of recursive importance sampling by stochastic approximation, which in practice may require tuning and a good knowledge of the payoff function. Moreover, unlike recursive importance sampling procedures, the proposed methodology does not rely on simulations, so it is quite generic and can come on top of Monte Carlo simulations. We first emphasize the consistency of quantization for designing an importance sampling algorithm for both multi-dimensional distributions and diffusion processes. We show that the induced error on the optimal change of measure is controlled by the mean quantizatio...

  13. Adaptive Sampling in Hierarchical Simulation

    Knap, J; Barton, N R; Hornung, R D; Arsenlis, A; Becker, R; Jefferson, D R

    2007-07-09

    We propose an adaptive sampling methodology for hierarchical multi-scale simulation. The method utilizes a moving kriging interpolation to significantly reduce the number of evaluations of finer-scale response functions to provide essential constitutive information to a coarser-scale simulation model. The underlying interpolation scheme is unstructured and adaptive to handle the transient nature of a simulation. To handle the dynamic construction and searching of a potentially large set of finer-scale response data, we employ a dynamic metric tree database. We study the performance of our adaptive sampling methodology for a two-level multi-scale model involving a coarse-scale finite element simulation and a finer-scale crystal plasticity based constitutive law.

  14. Large-Flip Importance Sampling

    Hamze, Firas; De Freitas, Nando

    2012-01-01

    We propose a new Monte Carlo algorithm for complex discrete distributions. The algorithm is motivated by the N-Fold Way, which is an ingenious event-driven MCMC sampler that avoids rejection moves at any specific state. The N-Fold Way can however get "trapped" in cycles. We surmount this problem by modifying the sampling process. This correction does introduce bias, but the bias is subsequently corrected with a carefully engineered importance sampler.

  15. A new design for sampling with adaptive sample plots

    Yang, Haijun; Kleinn, Christoph; Fehrmann, Lutz; Tang, Shouzheng; Magnussen, Steen

    2009-01-01

    Adaptive cluster sampling (ACS) is a sampling technique for sampling rare and geographically clustered populations. Aiming to enhance the practicability of ACS while maintaining some of its major characteristics, an adaptive sample plot design is introduced in this study which facilitates field work compared to “standard” ACS. The plot design is based on a conditional plot expansion: a larger plot (by a pre-defined plot size factor) is installed at a sample point instead of the smaller initia...

  16. Simulated Maximum Likelihood using Tilted Importance Sampling

    Christian N. Brinch

    2008-01-01

    This paper develops the important distinction between tilted and simple importance sampling as methods for simulating likelihood functions for use in simulated maximum likelihood. It is shown that tilted importance sampling removes a lower bound on simulation error for a given importance sample size that is inherent in simulated maximum likelihood using simple importance sampling, the main method for simulating likelihood functions in the statistics literature. In addit...

  17. Adaptive processing with signal contaminated training samples

    Besson, Olivier; Bidon, Stéphanie

    2013-01-01

    We consider the adaptive beamforming or adaptive detection problem in the case of signal contaminated training samples, i.e., when the latter may contain a signal-like component. Since this results in a significant degradation of the signal to interference and noise ratio at the output of the adaptive filter, we investigate a scheme to jointly detect the contaminated samples and subsequently take this information into account for estimation of the disturbance covariance matrix. Towards this e...

  18. Monte Carlo Integration Using Importance Sampling and Gibbs Sampling

    Hörmann, Wolfgang; Leydold, Josef

    2005-01-01

    To evaluate the expectation of a simple function with respect to a complicated multivariate density, Monte Carlo integration has become the main technique. Gibbs sampling and importance sampling are the most popular methods for this task. In this contribution we propose a new simple general-purpose importance sampling procedure. In a simulation study we compare the performance of this method with that of Gibbs sampling and of importance sampling using a vector of independent variate...
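
    The basic self-normalized importance sampling estimator being compared here is short enough to state in code; the target, proposal, and integrand below are assumptions chosen so the true answer is known (E[X²] = 2 under N(1,1)):

```python
import numpy as np

# Self-normalized importance sampling for E_f[h(X)] with f known only up
# to a constant: draw from a tractable q, weight by f/q, normalize.
rng = np.random.default_rng(6)

def h(x):      return x ** 2
def log_f(x):  return -0.5 * (x - 1.0) ** 2     # target N(1,1), unnormalized
def log_q(x):  return -0.125 * x ** 2           # proposal N(0,4), unnormalized

x = rng.normal(0.0, 2.0, size=200_000)          # draws from q
log_w = log_f(x) - log_q(x)
w = np.exp(log_w - log_w.max())
print("E[h(X)] ~", np.sum(w * h(x)) / np.sum(w))   # true value: 1 + 1 = 2
```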

  1. An Adaptive Importance Sampling Theory Based on the Generalized Genetic Algorithm

    Dong Cong; Guo Xiaohua

    2000-01-01

    In the present paper, using the generalized genetic algorithm, the problem of finding all design points in the case of generalized multiple design points is solved; by establishing a recursion-type bound-and-classification algorithm, the problem of reducing and synthesizing generalized multiple design points is also solved. The paper shows that the adaptive importance sampling theory based on the generalized genetic algorithm is a more efficient tool for the reliability simulation of nonlinear systems.

  2. Importance Sampling for the Infinite Sites Model*

    Hobolth, Asger; Uyenoyama, Marcy K.; Wiuf, Carsten

    2008-01-01

    Importance sampling or Markov chain Monte Carlo sampling is required for state-of-the-art statistical analysis of population genetics data. The applicability of these sampling-based inference techniques depends crucially on the proposal distribution. In this paper, we discuss importance sampling for the infinite sites model. The infinite sites assumption is attractive because it constrains the number of possible genealogies, thereby allowing for the analysis of larger data sets. We recall th...

  3. On Invertible Sampling and Adaptive Security

    Ishai, Yuval; Kumarasubramanian, Abishek; Orlandi, Claudio;

    2011-01-01

    Secure multiparty computation (MPC) is one of the most general and well studied problems in cryptography. We focus on MPC protocols that are required to be secure even when the adversary can adaptively corrupt parties during the protocol, and under the assumption that honest parties cannot reliably erase their secrets prior to corruption. Previous feasibility results for adaptively secure MPC in this setting applied either to deterministic functionalities or to randomized functionalities which satisfy a certain technical requirement. The question whether adaptive security is possible for all functionalities was left open. We provide the first convincing evidence that the answer to this question is negative, namely that some (randomized) functionalities cannot be realized with adaptive security. We obtain this result by studying the following related invertible sampling problem: given an efficient...

  4. Adaptive Stochastic Methods for Sampling Driven Systems

    Jones, Andrew; Leimkuhler, Benedict

    2011-01-01

    Thermostatting methods are discussed in the context of canonical sampling in the presence of driving stochastic forces. Generalisations of the Nosé-Hoover method and Langevin dynamics are introduced which are able to dissipate excess heat introduced by steady Brownian perturbation (without a priori knowledge of its strength) while preserving ergodicity. Implementation and parameter selection are considered. It is demonstrated using numerical experiments that the methods derived can adaptively...

  5. A Self-adapting Stratified and Importance Sampling Method for Power System Reliability Evaluation

    Wang Xiaobin; Guo Ruipeng; Cao Yijia; Yu Xiuyue; Yang Guizhong

    2011-01-01

    A new method for power system reliability evaluation called self-adapting stratified and importance sampling (SASIS) is presented. With SASIS, the system state space is partitioned into one contingency-free state subspace and various contingency-order state subspaces. Sampling of the contingency-free state subspace is avoided entirely, so SASIS converges fast in systems with high reliability. The number of samples is optimally allocated among the contingency-order state subspaces, and the optimal importance sampling probability density function is steadily refined. This method markedly increases calculation efficiency and resolves the low efficiency of previous Monte Carlo methods in highly reliable systems. Applied to reliability evaluation of the generation and transmission parts of the IEEE-RTS test system, and compared with other Monte Carlo methods, the results show that the proposed method is rational, highly effective, and free from degradation. This work is supported by Important Zhejiang Science & Technology Specific Projects (No. 2007C11098).
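
    The allocation idea translates to a small stratified estimator: never sample the zero-contribution stratum and allocate the rest by a Neyman-style p·σ score from a pilot run. All probabilities and the per-stratum loss model below are invented:

```python
import numpy as np

# Stratified estimation sketch: the contingency-free stratum contributes
# nothing and is never sampled; the rest get samples in proportion to a
# pilot p_k * sigma_k score.
rng = np.random.default_rng(7)
p = np.array([0.90, 0.08, 0.02])          # stratum probabilities
def loss(k, n):                           # hypothetical loss in stratum k
    return rng.gamma(shape=k, scale=float(k), size=n) if k > 0 else np.zeros(n)

score = np.array([p[k] * loss(k, 200).std() for k in range(3)])  # pilot run
alloc = np.round(10_000 * score / score.sum()).astype(int)       # k=0 gets 0
est = sum(p[k] * loss(k, alloc[k]).mean() for k in range(3) if alloc[k] > 0)
print("stratified estimate of expected loss:", est)  # true value: 0.16
```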

  6. Feature Adaptive Sampling for Scanning Electron Microscopy

    Dahmen, Tim; Engstler, Michael; Pauly, Christoph; Trampert, Patrick; de Jonge, Niels; Mücklich, Frank; Slusallek, Philipp

    2016-01-01

    A new method for image acquisition in scanning electron microscopy (SEM) was introduced. The method used adaptively increased pixel-dwell times to improve the signal-to-noise ratio (SNR) in areas of high detail. In areas of low detail, the electron dose was reduced on a per pixel basis, and a-posteriori image processing techniques were applied to remove the resulting noise. The technique was realized by scanning the sample twice. The first, quick scan used small pixel-dwell times to gener...

  7. A software sampling frequency adaptive algorithm for reducing spectral leakage

    PAN Li-dong; WANG Fei

    2006-01-01

    Spectral leakage caused by synchronization error in a nonsynchronous sampling system is an important cause of reduced accuracy in spectral analysis and harmonic measurement. This paper presents a software sampling frequency adaptive algorithm that obtains the actual signal frequency more accurately, then adjusts the sampling interval based on the frequency calculated by the software algorithm and modifies the sampling frequency adaptively. It can reduce synchronization error and the impact of spectral leakage, thereby improving the accuracy of spectral analysis and harmonic measurement for power system signals whose frequency changes slowly. Simulations show that the algorithm has high precision, and it can be a practical method for power system harmonic analysis since it is easily implemented.

  8. An adaptive sampling scheme for deep-penetration calculation

    The deep-penetration problem has long been one of the important and difficult problems in shielding calculations with the Monte Carlo method. In this paper, an adaptive Monte Carlo method that uses the emission point as a sampling station for shielding calculations is investigated. The numerical results show that the adaptive method may improve the efficiency of shielding calculations and might, to some degree, overcome the under-estimation problem that easily occurs in deep-penetration calculations.

  9. A support vector density-based importance sampling for reliability assessment

    An importance sampling method based on adaptive Markov chain simulation and support vector density estimation is developed in this paper for efficient structural reliability assessment. The methodology involves the generation of samples that can adaptively populate the important region by the adaptive Metropolis algorithm, and the construction of the importance sampling density by support vector density estimation. The use of the adaptive Metropolis algorithm may effectively improve the convergence and stability of the classical Markov chain simulation. The support vector density can approximate the sampling density with fewer samples than conventional kernel density estimation. The proposed importance sampling method can effectively reduce the number of structural analyses required to achieve a given accuracy. Examples involving both numerical and practical structural problems are given to illustrate the application and efficiency of the proposed methodology.

  10. Adaptive Sampling Algorithms for Probabilistic Risk Assessment of Nuclear Simulations

    Diego Mandelli; Dan Maljovec; Bei Wang; Valerio Pascucci; Peer-Timo Bremer

    2013-09-01

    Nuclear simulations are often computationally expensive, time-consuming, and high-dimensional with respect to the number of input parameters. Thus, exploring the space of all possible simulation outcomes is infeasible using finite computing resources. During simulation-based probabilistic risk analysis, it is important to discover the relationship between a potentially large number of input parameters and the output of a simulation using as few simulation trials as possible. This is a typical context for performing adaptive sampling, where a few observations are obtained from the simulation, a surrogate model is built to represent the simulation space, and new samples are selected based on the model constructed. The surrogate model is then updated based on the simulation results of the sampled points. In this way, we attempt to gain the most information possible with a small number of carefully selected sampled points, limiting the number of expensive trials needed to understand features of the simulation space. We analyze the specific use case of identifying the limit surface, i.e., the boundaries in the simulation space between system failure and system success. In this study, we explore several techniques for adaptively sampling the parameter space in order to reconstruct the limit surface. We focus on several adaptive sampling schemes. First, we seek to learn a global model of the entire simulation space using prediction models or neighborhood graphs and extract the limit surface as an iso-surface of the global model. Second, we estimate the limit surface by sampling in the neighborhood of the current estimate based on topological segmentations obtained locally. Our techniques draw inspiration from the topological structure known as the Morse-Smale complex. We highlight the advantages and disadvantages of using a global prediction model versus a local topological view of the simulation space, comparing several different strategies for adaptive sampling in both
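
    A crude version of the limit-surface loop, with a nearest-neighbour vote standing in for the surrogate models and topological machinery described above, fits in a few lines; the pass/fail "simulator" is a made-up unit circle:

```python
import numpy as np

# Limit-surface search: label points with an expensive pass/fail simulator,
# then keep sampling where nearby labels disagree.
rng = np.random.default_rng(8)

def simulator(x):                              # 1.0 = failure, 0.0 = success
    return float(x[0] ** 2 + x[1] ** 2 > 1.0)

X = rng.uniform(-2.0, 2.0, size=(10, 2))       # small initial design
y = np.array([simulator(x) for x in X])
for _ in range(40):
    cand = rng.uniform(-2.0, 2.0, size=(500, 2))
    d = np.linalg.norm(cand[:, None, :] - X[None, :, :], axis=2)
    frac = y[np.argsort(d, axis=1)[:, :5]].mean(axis=1)  # 5-NN failure vote
    x_new = cand[np.argmax(frac * (1.0 - frac))]         # most ambiguous point
    X = np.vstack([X, x_new])
    y = np.append(y, simulator(x_new))
hits = np.abs(np.linalg.norm(X, axis=1) - 1.0) < 0.1
print(f"{hits.sum()} of {len(X)} samples lie within 0.1 of the limit surface")
```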

  11. Sparse signals estimation for adaptive sampling

    Andrey Ordin

    2014-08-01

    This paper presents an estimation procedure for sparse signals in an adaptive setting. We show that when the pure signal is strong enough, the value of the loss function is asymptotically the same as for an optimal estimator, up to a constant multiplier.

  12. New adaptive sampling method in particle image velocimetry

    This study proposes a new adaptive method that enables the number of interrogation windows and their positions in a particle image velocimetry (PIV) image interrogation algorithm to adapt to the seeding density. The proposed method relaxes the constraint of uniform sampling rate and uniform window size commonly adopted in traditional PIV algorithms. In addition, the positions of the sampling points are redistributed on the basis of the spring force generated by the sampling points. The advantages include control of the number of interrogation windows according to the local seeding density and a smoother distribution of sampling points. The reliability of the adaptive sampling method is illustrated by processing synthetic and experimental images. The synthetic example attests to the advantages of the sampling method. Compared with the uniform interrogation technique in the experimental application, the spatial resolution is locally enhanced when using the proposed sampling method. (technical design note)

  13. Adaptive Sampling for Large Scale Boosting

    Dubout, Charles; Fleuret, Francois

    2014-01-01

    Classical Boosting algorithms, such as AdaBoost, build a strong classifier without concern for the computational cost. Some applications, in particular in computer vision, may involve millions of training examples and very large feature spaces. In such contexts, the training time of off-the-shelf Boosting algorithms may become prohibitive. Several methods exist to accelerate training, typically either by sampling the features or the examples used to train the weak learners. Even if some of th...

  14. Domain Adaptation: Overfitting and Small Sample Statistics

    Foster, Dean; Salakhutdinov, Ruslan

    2011-01-01

    We study the prevalent problem when a test distribution differs from the training distribution. We consider a setting where our training set consists of a small number of sample domains, but where we have many samples in each domain. Our goal is to generalize to a new domain. For example, we may want to learn a similarity function using only certain classes of objects, but we desire that this similarity function be applicable to object classes not present in our training sample (e.g. we might seek to learn that "dogs are similar to dogs" even though images of dogs were absent from our training set). Our theoretical analysis shows that we can select many more features than domains while avoiding overfitting by utilizing data-dependent variance properties. We present a greedy feature selection algorithm based on using T-statistics. Our experiments validate this theory showing that our T-statistic based greedy feature selection is more robust at avoiding overfitting than the classical greedy procedure.

  15. Averaging analysis for discrete time and sampled data adaptive systems

    Fu, Li-Chen; Bai, Er-Wei; Sastry, Shankar S.

    1986-01-01

    Earlier continuous time averaging theorems are extended to the nonlinear discrete time case. Theorems for the convergence analysis of discrete time adaptive identification and control systems are used. Instability theorems are also derived and used for the study of robust stability and instability of adaptive control schemes applied to sampled data systems. As a by-product, the effects of sampling on unmodeled dynamics in continuous time systems are also studied.

  16. Pricing and Risk Management with Stochastic Volatility Using Importance Sampling

    Przemyslaw S. Stilger, Simon Acomb and Ser-Huang Poon

    2012-01-01

    In this paper, we apply importance sampling to Heston's stochastic volatility model and Bates's stochastic volatility model with jumps. We propose an effective numerical scheme that dramatically improves the speed of importance sampling. We show how the Greeks can be computed using the Likelihood Ratio Method based on characteristic function, and how combining it with importance sampling leads to a significant variance reduction for the Greeks. All results are illustrated using European and b...

  17. Application of adaptive cluster sampling to low-density populations of freshwater mussels

    Smith, D.R.; Villella, R.F.; Lemarie, D.P.

    2003-01-01

    Freshwater mussels appear to be promising candidates for adaptive cluster sampling because they are benthic macroinvertebrates that cluster spatially and are frequently found at low densities. We applied adaptive cluster sampling to estimate density of freshwater mussels at 24 sites along the Cacapon River, WV, where a preliminary timed search indicated that mussels were present at low density. Adaptive cluster sampling increased yield of individual mussels and detection of uncommon species; however, it did not improve precision of density estimates. Because finding uncommon species, collecting individuals of those species, and estimating their densities are important conservation activities, additional research is warranted on application of adaptive cluster sampling to freshwater mussels. However, at this time we do not recommend routine application of adaptive cluster sampling to freshwater mussel populations. The ultimate, and currently unanswered, question is how to tell when adaptive cluster sampling should be used, i.e., when is a population sufficiently rare and clustered for adaptive cluster sampling to be efficient and practical? A cost-effective procedure needs to be developed to identify biological populations for which adaptive cluster sampling is appropriate.

  18. Cellular adaptation as an important response during chemical carcinogenesis

    Since disease processes are largely expressions of how living organisms react and respond to perturbations in the external and internal environments, adaptive or protective responses, their modulations, and their mechanisms are of the greatest concern in fundamental studies of disease pathogenesis. Such considerations are also of the greatest relevance in toxicology, including how living organisms respond to low levels of single and multiple xenobiotics and radiations. As the steps and mechanisms of cancer development are studied in greater depth, phenomena become apparent which suggest that adaptive reactions and responses may play important or even critical roles in the process of carcinogenesis. The question becomes whether the process of carcinogenesis is fundamentally an adversarial one (i.e., an abnormal cell in a vulnerable host), or whether it is more in the nature of a physiological selection or differentiation that has survival value for the host as an adaptive phenomenon. The very early initial interactions of mutagenic chemical carcinogens, radiations, and viruses with DNA lead most to consider the adversarial 'abnormal' view as the appropriate one. Yet the unusually common nature of the earliest altered rare cells that appear during carcinogenesis, their unusually bland nature, and their spontaneous differentiation to normal-appearing adult liver should be carefully considered.

  19. State-dependent importance sampling for a Jackson tandem network

    Miretskiy, Denis; Scheinhardt, Werner; Mandjes, Michel

    2010-01-01

    This article considers importance sampling as a tool for rare-event simulation. The focus is on estimating the probability of overflow in the downstream queue of a Jacksonian two-node tandem queue; it is known that in this setting "traditional" state-independent importance-sampling distributions per

  1. Fast Efficient Importance Sampling by State Space Methods

    Koopman, S.J.; Nguyen, T.M.

    2012-01-01

    We show that efficient importance sampling for nonlinear non-Gaussian state space models can be implemented by computationally efficient Kalman filter and smoothing methods. The result provides some new insights but it primarily leads to a simple and fast method for efficient importance sampling. A simulation study and empirical illustration provide some evidence of the computational gains.

  2. Adaptive maximal poisson-disk sampling on surfaces

    Yan, Dongming

    2012-01-01

    In this paper, we study the generation of maximal Poisson-disk sets with varying radii on surfaces. Based on the concepts of the power diagram and regular triangulation, we present a geometric analysis of gaps in such disk sets on surfaces, which is the key ingredient of the adaptive maximal Poisson-disk sampling framework. Moreover, we adapt the presented sampling framework for remeshing applications. Several novel and efficient operators are developed for improving the sampling/meshing quality over the state of the art.
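
    Plain dart throwing gives the flavour of varying-radius Poisson-disk sampling, though it does not achieve the maximality that the paper's gap analysis guarantees; the radius field below is an arbitrary example:

```python
import numpy as np

# Dart-throwing sketch of varying-radius Poisson-disk sampling in the
# unit square.  Unlike the paper's method, the result is not guaranteed
# to be maximal (no gap analysis is performed).
rng = np.random.default_rng(9)

def radius(p):                          # smaller disks near the left edge
    return 0.02 + 0.08 * p[0]

pts = []
for _ in range(20_000):
    q = rng.random(2)
    if all(np.linalg.norm(q - p) >= max(radius(q), radius(p)) for p in pts):
        pts.append(q)
print(len(pts), "samples accepted")
```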

  3. Adaptive sampling program support for expedited site characterization

    Johnson, R.

    1993-10-01

    Expedited site characterizations offer substantial savings in time and money when assessing hazardous waste sites. Key to some of these savings is the ability to adapt a sampling program to the "real-time" data generated by an expedited site characterization. This paper presents a two-prong approach to supporting adaptive sampling programs: a specialized object-oriented database/geographical information system for data fusion, management and display; and combined Bayesian/geostatistical methods for contamination extent estimation and sample location selection.

  4. Multi-Level Monte Carlo Simulations with Importance Sampling

    Przemyslaw S. Stilger and Ser-Huang Poon

    2013-01-01

    We present an application of importance sampling to multi-asset options under the Heston and the Bates models as well as to the Heston-Hull-White and the Heston-Cox-Ingersoll-Ross models. Moreover, we provide an efficient importance sampling scheme in a Multi-Level Monte Carlo simulation. In all cases, we explain how the Greeks can be computed in the different simulation schemes using the Likelihood Ratio Method, and how combining it with importance sampling leads to a significant variance re...

  5. Adaptive Sampling for High Throughput Data Using Similarity Measures

    Bulaevskaya, V. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Sales, A. P. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-05-06

    The need for adaptive sampling arises in the context of high throughput data because the rates of data arrival are many orders of magnitude larger than the rates at which they can be analyzed. A very fast decision must therefore be made regarding the value of each incoming observation and its inclusion in the analysis. In this report we discuss one approach to adaptive sampling, based on the new data point’s similarity to the other data points being considered for inclusion. We present preliminary results for one real and one synthetic data set.
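
    The similarity-based rule can be sketched as a streaming filter that keeps an observation only if it is far enough from everything already retained; the Euclidean distance and the threshold value are placeholders for the report's actual similarity measure:

```python
import numpy as np

# Streaming filter: keep an incoming observation only if its distance to
# everything already retained exceeds a threshold.
rng = np.random.default_rng(10)
kept = []
for _ in range(5000):                    # stand-in for a fast data stream
    x = rng.normal(size=3)
    if not kept or min(np.linalg.norm(x - k) for k in kept) > 2.5:
        kept.append(x)                   # novel enough: keep for analysis
print(f"retained {len(kept)} of 5000 observations")
```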

  6. Computing Greeks with Multilevel Monte Carlo Methods using Importance Sampling

    Euget, Thomas

    2012-01-01

    This paper presents a new efficient way to reduce the variance of estimators of popular payoffs and Greeks encountered in financial mathematics. The idea is to apply importance sampling with the multilevel Monte Carlo method recently introduced by M.B. Giles. So far, importance sampling has proved successful in combination with the standard Monte Carlo method. We show the efficiency of our approach on the estimation of financial derivative prices and then on the estimation of Greeks (i.e. sensitivitie...

  7. Two-phase importance sampling for inference about transmission trees

    Numminen, E.; Chewapreecha, C.; Siren, J.; Turner, C.; Turner, P; Bentley, S.D.; Corander, J.

    2014-01-01

    There has been growing interest in the statistics community to develop methods for inferring transmission pathways of infectious pathogens from molecular sequence data. For many datasets, the computational challenge lies in the huge dimension of the missing data. Here, we introduce an importance sampling scheme in which the transmission trees and phylogenies of pathogens are both sampled from reasonable importance distributions, alleviating the inference. Using this approach, arbitrary models...

  8. On the Use of Importance Sampling in Particle Transport Problems

    The idea of importance sampling is applied to the problem of solving integral equations of Fredholm type, especially Boltzmann's neutron transport equation. For the solution of the latter equation, an importance sampling technique is derived from some simple transformations of the original transport equation into a similar equation. Examples of transformations are given which have been used with great success in practice.

  9. Iterative importance sampling algorithms for parameter estimation problems

    Morzfeld, Matthias; Day, Marcus S.; Grout, Ray W.; Pau, George Shu Heng; Finsterle, Stefan A.; Bell, John B.

    2016-01-01

    In parameter estimation problems one approximates a posterior distribution over uncertain parameters defined jointly by a prior distribution, a numerical model, and noisy data. Typically, Markov Chain Monte Carlo (MCMC) is used for the numerical solution of such problems. An alternative to MCMC is importance sampling, where one draws samples from a proposal distribution, and attaches weights to each sample to account for the fact that the proposal distribution is not the posterior distribut...

  10. Geometrical importance sampling in Geant4 from design to verification

    Dressel, M

    2003-01-01

    The addition of flexible, general implementations of geometrical splitting and Russian roulette, in combination called geometrical importance sampling, for variance reduction, and of a scoring system for controlling the sampling, are described. The efficiency of the variance-reduction implementation is measured in a simulation of a typical benchmark experiment for neutron shielding. Using geometrical importance sampling, a reduction of the computing time by a factor of 89 compared to the analog calculation was achieved for obtaining a neutron flux with a given precision in the benchmark application.
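
    The splitting half of geometrical importance sampling reduces to weight bookkeeping at importance boundaries (Russian roulette is the same bookkeeping run in the direction of decreasing importance). A toy slab-transmission sketch with invented absorption probabilities:

```python
import numpy as np

# Splitting sketch: a particle crosses 5 slabs, each absorbing it with
# probability 0.5.  Importance doubles per slab, so each boundary crossing
# splits the particle 2-for-1 at half the weight, keeping many light
# particles alive deep in the shield instead of one rare heavy one.
rng = np.random.default_rng(11)

def transmission(slabs=5, n=20_000):
    score = 0.0
    for _ in range(n):
        stack = [(0, 1.0)]               # (slab index, statistical weight)
        while stack:
            k, w = stack.pop()
            if k == slabs:
                score += w               # escaped: tally the carried weight
            elif rng.random() >= 0.5:    # survived this slab
                stack.append((k + 1, w / 2.0))   # split into two half-weight
                stack.append((k + 1, w / 2.0))   # copies at the boundary
    return score / n

print("transmission ~", transmission(), "(exact: 0.5**5 =", 0.5 ** 5, ")")
```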

  11. Adaptation of the methodology of sample surveys for marketing researches

    Kataev Andrey

    2015-08-01

    The article presents results on adapting the theory of sample surveys to the purposes of marketing, which allows one to answer the fundamental question of any marketing research: how many objects should be studied to draw adequate conclusions.

  12. Efficient computation of smoothing splines via adaptive basis sampling

    Ma, Ping

    2015-06-24

    Smoothing splines provide flexible nonparametric regression estimators. However, the high computational cost of smoothing splines for large datasets has hindered their wide application. In this article, we develop a new method, named adaptive basis sampling, for efficient computation of smoothing splines in super-large samples. Except for the univariate case, where the Reinsch algorithm is applicable, a smoothing spline for a regression problem with sample size n can be expressed as a linear combination of n basis functions, and its computational complexity is generally O(n^3). We achieve a more scalable computation in the multivariate case by evaluating the smoothing spline using a smaller set of basis functions, obtained by an adaptive sampling scheme that uses values of the response variable. Our asymptotic analysis shows that smoothing splines computed via adaptive basis sampling converge to the true function at the same rate as full-basis smoothing splines. Using simulation studies and a large-scale deep earth core-mantle boundary imaging study, we show that the proposed method outperforms a sampling method that does not use the values of response variables.

  13. An improved adaptive sampling and experiment design method for aerodynamic optimization

    Huang Jiangtao

    2015-10-01

    Experiment design is key to constructing a highly reliable surrogate model for numerical optimization in large-scale projects. Within such a method, the experimental design criterion directly affects the accuracy of the surrogate model and the optimization efficiency. To address the shortcomings of traditional experimental design, an improved adaptive sampling method is proposed in this paper. The surrogate model is first constructed from basic sparse samples. Then the supplementary sampling position is detected according to specified criteria, which introduce energy-function and curvature sampling criteria based on a radial basis function (RBF) network. The sampling detection criteria consider both the uniformity of the sample distribution and the description of hypersurface curvature, so as to significantly improve the prediction accuracy of the surrogate model with far fewer samples. For a surrogate model constructed with sparse samples, sample uniformity is an important factor for interpolation accuracy in the initial stage of adaptive sampling and surrogate model training. As uniformity improves, the curvature description of the objective function surface gradually becomes more important. In consideration of these issues, a crowdness enhance function and a root mean square error (RMSE) feedback function are introduced in the criterion expression. Thus, a new sampling method called RMSE and crowdness enhance (RCE) adaptive sampling is established. The validity of the RCE adaptive sampling method is studied first through a typical test function and then on an airfoil/wing aerodynamic optimization design problem with a high-dimensional design space. The results show that the RCE adaptive sampling method not only reduces the required number of samples, but also effectively improves the prediction accuracy of the surrogate model, which gives it broad prospects for application.

  14. Adaptive video compressed sampling in the wavelet domain

    Dai, Hui-dong; Gu, Guo-hua; He, Wei-ji; Chen, Qian; Mao, Tian-yi

    2016-07-01

    In this work, we propose a multiscale video acquisition framework called adaptive video compressed sampling (AVCS) that involves sparse sampling and motion estimation in the wavelet domain. Implementing a combination of a binary DMD and a single-pixel detector, AVCS acquires successively finer resolution sparse wavelet representations in moving regions directly based on extended wavelet trees, and alternately uses these representations to estimate the motion in the wavelet domain. Then, we can remove the spatial and temporal redundancies and provide a method to reconstruct video sequences from compressed measurements in real time. In addition, the proposed method allows adaptive control over the reconstructed video quality. The numerical simulation and experimental results indicate that AVCS performs better than the conventional CS-based methods at the same sampling rate even under the influence of noise, and the reconstruction time and measurements required can be significantly reduced.

  15. An Importance Sampling Simulation Method for Bayesian Decision Feedback Equalizers

    Chen, S.; Hanzo, L.

    2000-01-01

    An importance sampling (IS) simulation technique is presented for evaluating the lower-bound bit error rate (BER) of the Bayesian decision feedback equalizer (DFE) under the assumption of correct decisions being fed back. A design procedure is developed, which chooses appropriate bias vectors for the simulation density to ensure asymptotic efficiency of the IS simulation.
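
    The underlying IS trick, biasing the simulation noise toward the error region and unweighting by the density ratio, can be shown on a bare Gaussian tail; the scalar mean shift below plays the role of the bias vector, and the threshold is an arbitrary stand-in for the equalizer's decision distance:

```python
import numpy as np

# Mean-shift importance sampling for a Gaussian tail probability P(N > 4),
# the generic mechanism behind IS-based BER estimation.
rng = np.random.default_rng(12)
t, n = 4.0, 100_000
x = rng.standard_normal(n) + t           # biased simulation density N(t, 1)
w = np.exp(-t * x + 0.5 * t * t)         # weight phi(x) / phi(x - t)
print("P(N > 4) ~", np.mean(w * (x > t)))   # exact: ~3.17e-5
```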

  16. A flexible importance sampling method for integrating subgrid processes

    Raut, E. K.; Larson, V. E.

    2016-01-01

    Numerical models of weather and climate need to compute grid-box-averaged rates of physical processes such as microphysics. These averages are computed by integrating subgrid variability over a grid box. For this reason, an important aspect of atmospheric modeling is spatial integration over subgrid scales. The needed integrals can be estimated by Monte Carlo integration. Monte Carlo integration is simple and general but requires many evaluations of the physical process rate. To reduce the number of function evaluations, this paper describes a new, flexible method of importance sampling. It divides the domain of integration into eight categories, such as the portion that contains both precipitation and cloud, or the portion that contains precipitation but no cloud. It then allows the modeler to prescribe the density of sample points within each of the eight categories. The new method is incorporated into the Subgrid Importance Latin Hypercube Sampler (SILHS). The resulting method is tested on drizzling cumulus and stratocumulus cases. In the cumulus case, the sampling error can be considerably reduced by drawing more sample points from the region of rain evaporation.
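
    The category mechanism can be mimicked with a weighted stratified average: oversample the rare categories at prescribed rates, then recombine with the true area fractions. The fractions, sample counts, and "microphysics" rates below are all invented:

```python
import numpy as np

# Category-weighted sampling in the spirit of SILHS: oversample the rare
# rainy categories, then recombine with the true area fractions.
rng = np.random.default_rng(13)
frac  = {"clear": 0.70, "cloud": 0.25, "rain": 0.04, "cloud+rain": 0.01}
n_per = {"clear": 10,   "cloud": 30,   "rain": 80,   "cloud+rain": 80}
base  = {"clear": 0.0,  "cloud": 1.0,  "rain": 5.0,  "cloud+rain": 8.0}

def rate(cat):                           # hypothetical process rate
    return base[cat] + rng.normal(0.0, 0.5)

avg = sum(frac[c] * np.mean([rate(c) for _ in range(n_per[c])]) for c in frac)
print("grid-box-averaged rate ~", avg)   # true value: 0.53
```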

  17. The Importance of Formalizing Computational Models of Face Adaptation Aftereffects

    Ross, David A.; Palmeri, Thomas J.

    2016-01-01

    Face adaptation is widely used as a means to probe the neural representations that support face recognition. While the theories that relate face adaptation to behavioral aftereffects may seem conceptually simple, our work has shown that testing computational instantiations of these theories can lead to unexpected results. Instantiating a model of face adaptation not only requires specifying how faces are represented and how adaptation shapes those representations but also specifying how decisions are made, translating hidden representational states into observed responses. Considering the high-dimensionality of face representations, the parallel activation of multiple representations, and the non-linearity of activation functions and decision mechanisms, intuitions alone are unlikely to succeed. If the goal is to understand mechanism, not simply to examine the boundaries of a behavioral phenomenon or correlate behavior with brain activity, then formal computational modeling must be a component of theory testing. To illustrate, we highlight our recent computational modeling of face adaptation aftereffects and discuss how models can be used to understand the mechanisms by which faces are recognized. PMID:27378960

  1. Importance sampling for failure probabilities in computing and data transmission

    Asmussen, Søren

    2009-01-01

    We study efficient simulation algorithms for estimating P(X > x), where X is the total time of a job with ideal time T that needs to be restarted after a failure. The main tool is importance sampling, where one tries to identify a good importance distribution via an asymptotic description of the conditional distribution of T given X > x. If T ≡ t is constant, the problem reduces to the efficient simulation of geometric sums, and a standard algorithm involving a Cramér-type root γ(t) is ...
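
    The restart model itself takes only a few lines to simulate by plain Monte Carlo, which also shows why importance sampling is needed for large x: hits on {X > x} become vanishingly rare. The failure rate and the gamma-distributed ideal time below are illustrative assumptions:

```python
import numpy as np

# Plain Monte Carlo for the restart model: a job with ideal time T must
# restart from scratch whenever a failure occurs (failures at Poisson
# times); X is the total time until an uninterrupted run of length T.
rng = np.random.default_rng(15)

def total_time(T, lam=1.0):
    x = 0.0
    while True:
        f = rng.exponential(1.0 / lam)   # time until the next failure
        if f >= T:
            return x + T                 # the job finishes before failing
        x += f                           # failed: restart from scratch

samples = np.array([total_time(T=rng.gamma(2.0, 1.0)) for _ in range(20_000)])
print("P(X > 30) ~", np.mean(samples > 30.0))   # rare: this is where IS helps
```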

  2. Adaptive Sampling of Time Series During Remote Exploration

    Thompson, David R.

    2012-01-01

    This work deals with the challenge of online adaptive data collection in a time series. A remote sensor or explorer agent adapts its rate of data collection in order to track anomalous events while obeying constraints on time and power. This problem is challenging because the agent has limited visibility (all its datapoints lie in the past) and limited control (it can only decide when to collect its next datapoint). This problem is treated from an information-theoretic perspective, fitting a probabilistic model to collected data and optimizing the future sampling strategy to maximize information gain. The performance characteristics of stationary and nonstationary Gaussian process models are compared. Self-throttling sensors could benefit environmental sensor networks and monitoring as well as robotic exploration. Explorer agents can improve performance by adjusting their data collection rate, preserving scarce power or bandwidth resources during uninteresting times while fully covering anomalous events of interest. For example, a remote earthquake sensor could conserve power by limiting its measurements during normal conditions and increasing its cadence during rare earthquake events. A similar capability could improve sensor platforms traversing a fixed trajectory, such as an exploration rover transect or a deep space flyby. These agents can adapt observation times to improve sample coverage during moments of rapid change. An adaptive sampling approach couples sensor autonomy, instrument interpretation, and sampling. The challenge is addressed as an active learning problem, which already has extensive theoretical treatment in the statistics and machine learning literature. A statistical Gaussian process (GP) model is employed to guide sample decisions that maximize information gain. Nonstationary (e.g., time-varying) covariance relationships permit the system to represent and track local anomalies, in contrast with current GP approaches. Most common GP models
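
    The core loop, fit a Gaussian process to past samples and measure next where the predictive variance is largest, can be sketched directly; the squared-exponential kernel, its length-scale, and the noise level are invented, and a real system would maximize an information-gain criterion rather than raw variance:

```python
import numpy as np

# Next-measurement selection: fit a small Gaussian process to past sample
# times and pick the candidate time with the largest predictive variance.
rng = np.random.default_rng(14)

def kern(a, b, ell=2.0):
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)

t_obs = np.array([0.0, 1.0, 2.0, 6.0, 9.0])        # past sample times
y_obs = np.sin(t_obs) + 0.1 * rng.normal(size=5)   # measurements (variance
                                                   # depends only on times)
K = kern(t_obs, t_obs) + 0.01 * np.eye(5)          # noise on the diagonal
t_new = np.linspace(0.0, 10.0, 201)                # candidate next times
Ks = kern(t_new, t_obs)
var = 1.0 - np.sum(Ks @ np.linalg.inv(K) * Ks, axis=1)   # predictive variance
print("sample next at t =", t_new[np.argmax(var)])       # middle of the gap
```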

  3. Learning Adaptive Forecasting Models from Irregularly Sampled Multivariate Clinical Data

    Liu, Zitao; Hauskrecht, Milos

    2016-01-01

    Building accurate predictive models of clinical multivariate time series is crucial for understanding the patient condition, the dynamics of a disease, and clinical decision making. A challenging aspect of this process is that the model should be flexible and adaptive, reflecting patient-specific temporal behaviors well even when the available patient-specific data are sparse and span only a short time. To address this problem we propose and develop an adaptive two-stage forecasting approach for modeling multivariate, irregularly sampled clinical time series of varying lengths. The proposed model (1) learns the population trend from a collection of time series for past patients; (2) captures individual-specific short-term multivariate variability; and (3) adapts by automatically adjusting its predictions based on new observations. The proposed forecasting model is evaluated on a real-world clinical time series dataset. The results demonstrate the benefits of our approach on the prediction tasks for multivariate, irregularly sampled clinical time series, and show that it can outperform both the population based and patient-specific time series prediction models in terms of prediction accuracy.

  4. Distributed Database Kriging for Adaptive Sampling (D2 KAS)

    Roehm, Dominic; Pavel, Robert S.; Barros, Kipton; Rouet-Leduc, Bertrand; McPherson, Allen L.; Germann, Timothy C.; Junghans, Christoph

    2015-07-01

    We present an adaptive sampling method supplemented by a distributed database and a prediction method for multiscale simulations using the Heterogeneous Multiscale Method. A finite-volume scheme integrates the macro-scale conservation laws for elastodynamics, which are closed by momentum and energy fluxes evaluated at the micro-scale. In the original approach, molecular dynamics (MD) simulations are launched for every macro-scale volume element. Our adaptive sampling scheme replaces a large fraction of costly micro-scale MD simulations with fast table lookup and prediction. The cloud database Redis provides the plain table lookup, and with locality aware hashing we gather input data for our prediction scheme. For the latter we use kriging, which estimates an unknown value and its uncertainty (error) at a specific location in parameter space by using weighted averages of the neighboring points. We find that our adaptive scheme significantly improves simulation performance by a factor of 2.5-25, while retaining high accuracy for various choices of the algorithm parameters.
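
    The kriging step can be sketched generically. The snippet below is plain simple kriging with an assumed zero mean and squared-exponential covariance, not the D2KAS/Redis implementation: the estimate at a query point is a weighted average of neighboring database entries, and the kriging variance indicates when the prediction is trustworthy enough to replace an MD simulation.

    ```python
    import numpy as np

    def simple_kriging(X, y, x0, ell=1.0, sf2=1.0, nugget=1e-6):
        """Simple-kriging mean and variance at query point x0 from neighbors
        (X, y), assuming zero mean and a squared-exponential covariance."""
        d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
        C = sf2 * np.exp(-0.5 * d2 / ell**2) + nugget * np.eye(len(X))
        c0 = sf2 * np.exp(-0.5 * ((X - x0) ** 2).sum(-1) / ell**2)
        w = np.linalg.solve(C, c0)      # kriging weights of the neighbors
        return w @ y, sf2 - w @ c0      # weighted average, error variance

    rng = np.random.default_rng(0)
    X = rng.uniform(-2.0, 2.0, (30, 2))       # stored parameter-space points
    y = np.sin(X[:, 0]) * np.cos(X[:, 1])     # stored micro-scale responses
    mean, var = simple_kriging(X, y, np.array([0.3, -0.5]))
    print(f"prediction {mean:.3f}, kriging variance {var:.5f}")
    ```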

  5. Distributed database kriging for adaptive sampling (D2KAS)

    We present an adaptive sampling method supplemented by a distributed database and a prediction method for multiscale simulations using the Heterogeneous Multiscale Method. A finite-volume scheme integrates the macro-scale conservation laws for elastodynamics, which are closed by momentum and energy fluxes evaluated at the micro-scale. In the original approach, molecular dynamics (MD) simulations are launched for every macro-scale volume element. Our adaptive sampling scheme replaces a large fraction of costly micro-scale MD simulations with fast table lookup and prediction. The cloud database Redis provides the plain table lookup, and with locality aware hashing we gather input data for our prediction scheme. For the latter we use kriging, which estimates an unknown value and its uncertainty (error) at a specific location in parameter space by using weighted averages of the neighboring points. We find that our adaptive scheme significantly improves simulation performance by a factor of 2.5 to 25, while retaining high accuracy for various choices of the algorithm parameters

  6. Gap processing for adaptive maximal Poisson-disk sampling

    Yan, Dongming

    2013-09-01

    In this article, we study the generation of maximal Poisson-disk sets with varying radii. First, we present a geometric analysis of gaps in such disk sets. This analysis is the basis for maximal and adaptive sampling in Euclidean space and on manifolds. Second, we propose efficient algorithms and data structures to detect gaps and update gaps when disks are inserted, deleted, moved, or when their radii are changed. We build on the concepts of regular triangulations and the power diagram. Third, we show how our analysis contributes to the state-of-the-art in surface remeshing. © 2013 ACM.
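
    For intuition, adaptive (varying-radius) Poisson-disk sampling can be approximated by dart throwing with one common conflict rule: reject a candidate whose distance to an accepted sample is below the larger of the two radii. Unlike the gap processing above, this sketch cannot certify maximality; the radius field is an invented example.

    ```python
    import numpy as np

    def radius(p):
        """Spatially varying target radius (invented: disks grow away from origin)."""
        return 0.02 + 0.08 * float(np.linalg.norm(p))

    def adaptive_dart_throwing(n_attempts=5000, seed=0):
        """Rejection-based adaptive Poisson-disk sampling in the unit square."""
        rng = np.random.default_rng(seed)
        pts, radii = [], []
        for _ in range(n_attempts):
            c = rng.uniform(0.0, 1.0, size=2)
            rc = radius(c)
            if all(np.linalg.norm(c - p) >= max(rc, rp)
                   for p, rp in zip(pts, radii)):
                pts.append(c)
                radii.append(rc)
        return np.array(pts)

    print(len(adaptive_dart_throwing()), "samples kept")
    ```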

  7. Roof Reconstruction from Point Clouds using Importance Sampling

    Nguatem, W.; Drauschke, M.; Mayer, H.

    2013-10-01

    We propose a novel fully automatic technique for roof fitting in 3D point clouds based on sequential importance sampling (SIS). Our approach makes no assumption about the nature (sparse, dense) or origin (LIDAR, image matching) of the point clouds and further distinguishes, automatically, between different basic roof types based on model selection. The algorithm comprises an inherent data parallelism, the lack of which has been a major drawback of most Monte Carlo schemes. A further speedup is achieved by applying a coarse-to-fine search within all probable roof configurations in the sample space of roofs. The robustness and effectiveness of our roof reconstruction algorithm are illustrated for point clouds of varying nature.

  8. Effect of imperfect detectability on adaptive and conventional sampling: Simulated sampling of freshwater mussels in the upper Mississippi River

    Smith, D.R.; Gray, B.R.; Newton, T.J.; Nichols, D.

    2010-01-01

    Adaptive sampling designs are recommended where, as is typical with freshwater mussels, the outcome of interest is rare and clustered. However, the performance of adaptive designs has not been investigated when outcomes are not only rare and clustered but also imperfectly detected. We address this combination of challenges using data simulated to mimic properties of freshwater mussels from a reach of the upper Mississippi River. Simulations were conducted under a range of sample sizes and detection probabilities. Under perfect detection, efficiency of the adaptive sampling design increased relative to the conventional design as sample size increased and as density decreased. Also, the probability of sampling occupied habitat was four times higher for adaptive than conventional sampling of the lowest density population examined. However, imperfect detection resulted in substantial biases in sample means and variances under both adaptive and conventional designs. The efficiency of adaptive sampling declined with decreasing detectability. Also, the probability of encountering an occupied unit during adaptive sampling, relative to conventional sampling, declined with decreasing detectability. Thus, the potential gains in the application of adaptive sampling to rare and clustered populations relative to conventional sampling are reduced when detection is imperfect. The results highlight the need to increase or estimate detection to improve performance of conventional and adaptive sampling designs.

  9. Elucidating Microbial Adaptation Dynamics via Autonomous Exposure and Sampling

    Grace, J. M.; Verseux, C.; Gentry, D.; Moffet, A.; Thayabaran, R.; Wong, N.; Rothschild, L.

    2013-12-01

    The adaptation of micro-organisms to their environments is a complex process of interaction between the pressures of the environment and of competition. Reducing this multifactorial process to environmental exposure in the laboratory is a common tool for elucidating individual mechanisms of evolution, such as mutation rates [Wielgoss et al., 2013]. Although such studies inform fundamental questions about the way adaptation and even speciation occur, they are often limited by labor-intensive manual techniques [Wassmann et al., 2010]. Current methods for controlled study of microbial adaptation limit the length of time, the depth of collected data, and the breadth of applied environmental conditions. Small idiosyncrasies in manual techniques can have large effects on outcomes; for example, there are significant variations in induced radiation resistances following similar repeated exposure protocols [Alcántara-Díaz et al., 2004; Goldman and Travisano, 2011]. We describe here a project under development to allow rapid cycling of multiple types of microbial environmental exposure. The system allows continuous autonomous monitoring and data collection of both single species and sampled communities, independently and concurrently providing multiple types of controlled environmental pressure (temperature, radiation, chemical presence or absence, and so on) to a microbial community in dynamic response to the ecosystem's current status. When combined with DNA sequencing and extraction, such a controlled environment can cast light on microbial functional development, population dynamics, inter- and intra-species competition, and microbe-environment interaction. The project's goal is to allow rapid, repeatable iteration of studies of both natural and artificial microbial adaptation. As an example, the same system can be used both to increase the pH of a wet soil aliquot over time while periodically sampling it for genetic activity analysis, or to repeatedly expose a culture of

  10. Semigroups and sequential importance sampling for multiway tables

    Yoshida, Ruriko; Wei, Shaoceng; Zhou, Feng; Haws, David

    2011-01-01

    When an interval of integers between the lower bound $l_i$ and the upper bound $u_i$ is the support of the marginal distribution $n_i|(n_{i-1}, ...,n_1)$, Chen et al. (2005) noticed that sampling from the interval at each step, for $n_i$ during a sequential importance sampling (SIS) procedure, always produces a table which satisfies the marginal constraints. However, in general, the interval may not be equal to the support of the marginal distribution. In this case, the SIS procedure may produce tables which do not satisfy the marginal constraints, leading to rejection (Chen et al. 2006). In this paper we consider the uniform distribution as the target distribution. First we show that if we fix the number of rows and columns of the design matrix of the model for contingency tables then there exists a polynomial time algorithm in terms of the input size to sample a table from the set of all tables satisfying all marginals defined by the given model via the SIS procedure without rejection. We then show experimentall...
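
    A minimal sketch of the interval proposal for plain two-way tables (the generic Chen et al.-style SIS, not this paper's polynomial-time construction for general design matrices): each cell is drawn uniformly from its feasible interval, which for two-way margins is exactly the marginal support, so no table is ever rejected; averaging 1/q over sampled tables gives an unbiased estimate of the number of tables.

    ```python
    import numpy as np

    def sample_table(rows, cols, rng):
        """Fill a two-way table cell by cell, each entry uniform on its
        feasible interval; returns the table and its proposal probability q."""
        c = list(cols)
        T = np.zeros((len(rows), len(c)), dtype=int)
        q = 1.0
        for i, ri in enumerate(rows):
            rem = ri
            for j in range(len(c)):
                tail = sum(c[j + 1:])               # capacity of later columns
                lo, hi = max(0, rem - tail), min(rem, c[j])
                v = int(rng.integers(lo, hi + 1))   # uniform on [lo, hi]
                q /= hi - lo + 1
                T[i, j] = v
                rem -= v
                c[j] -= v
        return T, q

    rng = np.random.default_rng(0)
    w = [1.0 / sample_table([2, 2], [2, 2], rng)[1] for _ in range(10000)]
    print("estimated number of tables:", np.mean(w))   # exact count is 3 here
    ```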

  11. Stochastic approximation Monte Carlo importance sampling for approximating exact conditional probabilities

    Cheon, Sooyoung

    2013-02-16

    Importance sampling and Markov chain Monte Carlo methods have been used in exact inference for contingency tables for a long time; however, their performance is not always satisfactory. In this paper, we propose a stochastic approximation Monte Carlo importance sampling (SAMCIS) method for tackling this problem. SAMCIS is a combination of adaptive Markov chain Monte Carlo and importance sampling, which employs the stochastic approximation Monte Carlo algorithm (Liang et al., J. Am. Stat. Assoc., 102(477):305-320, 2007) to draw samples from an enlarged reference set with a known Markov basis. Compared to the existing importance sampling and Markov chain Monte Carlo methods, SAMCIS has a few advantages, such as fast convergence, ergodicity, and the ability to achieve a desired proportion of valid tables. The numerical results indicate that SAMCIS can outperform the existing importance sampling and Markov chain Monte Carlo methods: It can produce much more accurate estimates in much shorter CPU time than the existing methods, especially for the tables with high degrees of freedom. © 2013 Springer Science+Business Media New York.

  12. Local Importance Sampling: A Novel Technique to Enhance Particle Filtering

    Péter Torma

    2006-04-01

    In the low observation noise limit particle filters become inefficient. In this paper a simple-to-implement particle filter is suggested as a solution to this well-known problem. The proposed Local Importance Sampling based particle filters draw the particles' positions in a two-step process that makes use of both the dynamics of the system and the most recent observation. Experiments with the standard bearings-only tracking problem indicate that the proposed new particle filter method is indeed very successful when observations are reliable. Experiments with a high-dimensional variant of this problem further show that the advantage of the new filter grows with the increasing dimensionality of the system.
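
    The two-step idea (drawing particle positions using both the system dynamics and the newest observation) is easiest to see with the standard locally optimal proposal in a linear-Gaussian toy model. This is a sketch of the general principle, not the paper's Local Importance Sampling algorithm; the model and its parameters a, q2, r2 are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    a, q2, r2, T, N = 0.9, 1.0, 0.01, 50, 500   # small r2: reliable observations

    # simulate a linear-Gaussian state-space model
    x, y = np.zeros(T), np.zeros(T)
    for t in range(1, T):
        x[t] = a * x[t - 1] + rng.normal(0.0, np.sqrt(q2))
        y[t] = x[t] + rng.normal(0.0, np.sqrt(r2))

    # particle filter whose proposal p(x_t | x_{t-1}, y_t) uses the dynamics
    # AND the newest observation, keeping particles efficient when r2 is small
    s2 = 1.0 / (1.0 / q2 + 1.0 / r2)            # proposal variance
    p, est = np.zeros(N), np.zeros(T)
    for t in range(1, T):
        w = np.exp(-0.5 * (y[t] - a * p) ** 2 / (q2 + r2))  # p(y_t | x_{t-1})
        w /= w.sum()
        idx = rng.choice(N, N, p=w)                          # resample parents
        mu = s2 * (a * p[idx] / q2 + y[t] / r2)              # proposal means
        p = mu + rng.normal(0.0, np.sqrt(s2), N)
        est[t] = p.mean()
    print("filter RMSE:", np.sqrt(np.mean((est - x) ** 2)))
    ```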

  13. Semigroups and sequential importance sampling for multiway tables and beyond

    Xi, Jing; Zhou, Feng; Yoshida, Ruriko; Haws, David

    2011-01-01

    When an interval of integers between the lower bound l_i and the upper bound u_i is the support of the marginal distribution n_i|(n_{i-1}, ...,n_1), Chen et al. (2005) noticed that sampling from the interval at each step, for n_i during the sequential importance sampling (SIS) procedure, always produces a table which satisfies the marginal constraints. However, in general, the interval may not be equal to the support of the marginal distribution. In this case, the SIS procedure may produce tables which do not satisfy the marginal constraints, leading to rejection [Chen et al. 2006]. Rejecting tables is computationally expensive, and incorrect proposal distributions result in biased estimators for the number of tables given its marginal sums. This paper has two focuses: (1) we propose a correction coefficient which corrects an interval of integers between the lower bound l_i and the upper bound u_i to the support of the marginal distribution asymptotically even with rejections and with the same time complexity ...

  14. Structured estimation - Sample size reduction for adaptive pattern classification

    Morgera, S.; Cooper, D. B.

    1977-01-01

    The Gaussian two-category classification problem with known category mean value vectors and identical but unknown category covariance matrices is considered. The weight vector depends on the unknown common covariance matrix, so the procedure is to estimate the covariance matrix in order to obtain an estimate of the optimum weight vector. The measure of performance for the adapted classifier is the output signal-to-interference noise ratio (SIR). A simple approximation for the expected SIR is obtained by using the general sample covariance matrix estimator; this performance is independent of both the signal and the true covariance matrix. An approximation is also found for the expected SIR obtained by using a Toeplitz form covariance matrix estimator; this performance is found to depend on both the signal and the true covariance matrix.

  15. Consistent Adjoint Driven Importance Sampling using Space, Energy and Angle

    Peplow, Douglas E. [ORNL]; Mosher, Scott W. [ORNL]; Evans, Thomas M. [ORNL]

    2012-08-01

    For challenging radiation transport problems, hybrid methods combine the accuracy of Monte Carlo methods with the global information present in deterministic methods. One of the most successful hybrid methods is CADIS (Consistent Adjoint Driven Importance Sampling). This method uses a deterministic adjoint solution to construct a biased source distribution and consistent weight windows to optimize a specific tally in a Monte Carlo calculation. The method has been implemented into transport codes using just the spatial and energy information from the deterministic adjoint and has been used in many applications to compute tallies with much higher figures-of-merit than analog calculations. CADIS also outperforms user-supplied importance values, which usually take long periods of user time to develop. This work extends CADIS to develop weight windows that are a function of the position, energy, and direction of the Monte Carlo particle. Two types of consistent source biasing are presented: one method that biases the source in space and energy while preserving the original directional distribution and one method that biases the source in space, energy, and direction. Seven simple example problems are presented which compare the use of the standard space/energy CADIS with the new space/energy/angle treatments.
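
    The space/energy CADIS relations themselves are compact: with analog source density q and adjoint (importance) flux phi†, the response estimate is R = sum(q·phi†), the biased source is q·phi†/R, and the consistent weight-window centers are R/phi†, so source particles are born with weights inside their windows. A toy sketch with invented numbers:

    ```python
    import numpy as np

    def cadis_biasing(q, phi_adj):
        """Standard CADIS relations on a discretized (space, energy) grid:
        biased source ~ q * phi_adj, weight-window centers R / phi_adj."""
        R = np.sum(q * phi_adj)        # adjoint-weighted source = response
        q_biased = q * phi_adj / R     # sample source cells by importance
        w_center = R / phi_adj         # birth weight q/q_biased is consistent
        return q_biased, w_center

    q = np.array([0.50, 0.30, 0.15, 0.05])     # toy 4-cell analog source
    phi = np.array([0.01, 0.10, 1.00, 10.0])   # toy adjoint solution
    qb, wc = cadis_biasing(q, phi)
    print("biased source :", qb)
    print("window centers:", wc)
    ```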

  16. The Importance of Introductory Statistics Students Understanding Appropriate Sampling Techniques

    Menil, Violeta C.

    2005-01-01

    In this paper the author discusses the meaning of sampling, the reasons for sampling, the Central Limit Theorem, and the different techniques of sampling. Practical and relevant examples are given to make the appropriate sampling techniques understandable to students of Introductory Statistics courses. With a thorough knowledge of sampling…

  17. Multi-Scaling Sampling: An Adaptive Sampling Method for Discovering Approximate Association Rules

    Cai-Yan Jia; Xie-Ping Gao

    2005-01-01

    One of the obstacles to efficient association rule mining is the explosive expansion of data sets, since it is costly or impossible to scan large databases, especially multiple times. A popular solution to improve the speed and scalability of association rule mining is to run the algorithm on a random sample instead of the entire database. But how to effectively define and efficiently estimate the degree of error with respect to the outcome of the algorithm, and how to determine the sample size needed, have remained open research questions. In this paper, an effective and efficient algorithm is given, based on PAC (Probably Approximately Correct) learning theory, to measure and estimate sample error. Then, a new adaptive, on-line, fast sampling strategy - multi-scaling sampling - is presented, inspired by MRA (Multi-Resolution Analysis) and the Shannon sampling theorem, for quickly obtaining acceptably approximate association rules at an appropriate sample size. Both theoretical analysis and empirical study show that the sampling strategy can achieve a very good speed-accuracy trade-off.
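
    The PAC ingredient can be made concrete with a standard Hoeffding-style bound (a generic sketch; the paper's multi-scaling strategy instead grows the sample in stages rather than fixing n up front):

    ```python
    import math

    def pac_sample_size(eps, delta, m=1):
        """Hoeffding bound: n >= ln(2m/delta) / (2 eps^2) transactions keep the
        sampled support of each of m itemsets within eps of its true support,
        except with probability delta (union bound over the m itemsets)."""
        return math.ceil(math.log(2.0 * m / delta) / (2.0 * eps ** 2))

    print(pac_sample_size(0.01, 0.05))        # one itemset: 18445 transactions
    print(pac_sample_size(0.01, 0.05, 1000))  # 1000 itemsets: 52984
    ```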

  18. Importance sampling approach for the nonstationary approximation error method

    The approximation error approach has previously been proposed to handle modelling, numerical and computational errors. This approach has been developed both for stationary and nonstationary inverse problems (Kalman filtering). The key idea of the approach is to compute the approximate statistics of the errors over the distribution of all unknowns and uncertainties and carry out approximate marginalization with respect to these errors. In nonstationary problems, however, information is accumulated over time, and the initial uncertainties may turn out to have been exaggerated. In this paper, we propose an algorithm with which the approximation error statistics can be updated during the accumulation of measurement information. The proposed algorithm is based on importance sampling. The recursions that are proposed here are, however, based on the (extended) Kalman filter and therefore do not employ the often exceedingly heavy computational load of particle filtering. As a computational example, we study an estimation problem that is related to a convection–diffusion problem in which the velocity field is not accurately specified

  19. Turkish adaptation of the Fear of Spiders Questionnaire: Reliability and validity in non-clinical samples

    Robert W. Booth

    2016-12-01

    The rapid, objective measurement of spider fear is important for clinicians, and for researchers studying fear. To facilitate this, we adapted the Fear of Spiders Questionnaire (FSQ) into Turkish. The FSQ is quick to complete and easy to understand. Compared to the commonly used Spider Phobia Questionnaire, it has shown superior test-retest reliability and better discrimination of lower levels of spider fear, facilitating fear research in non-clinical samples. In two studies, with 137 and 105 undergraduates and unselected volunteers, our adapted FSQ showed excellent internal consistency (Cronbach’s α = .95 and .96) and test-retest reliability (r = .90), and good discriminant validity against the State–Trait Anxiety Inventory—Trait (r = .23) and Beck Anxiety Inventory—Trait (r = .07). Most importantly, our adapted FSQ significantly predicted 26 students’ self-reported discomfort upon approaching a caged tarantula; however, a measure of behavioural avoidance of the tarantula yielded little variability, so a more sensitive task will be required for future behavioural testing. Based on this initial testing, we recommend our adapted FSQ for research use. Further research is required to verify that our adapted FSQ discriminates individuals with and without phobia effectively. A Turkish-language report of the studies is included as supplementary material.

  20. Monte Carlo importance sampling for the MCNP{trademark} general source

    Lichtenstein, H.

    1996-01-09

    Research was performed to develop an importance sampling procedure for a radiation source. The procedure was developed for the MCNP radiation transport code, but the approach itself is general and can be adapted to other Monte Carlo codes. The procedure, as adapted to MCNP, relies entirely on existing MCNP capabilities. It has been tested for very complex descriptions of a general source, in the context of the design of spent-reactor-fuel storage casks. Dramatic improvements in calculation efficiency have been observed in some test cases. In addition, the procedure has been found to accelerate convergence to acceptable levels, as well as to quickly identify user-specified variance reduction in the transport that causes unstable convergence.

  1. Large Deviations and Importance Sampling for Systems of Slow-Fast Motion

    Spiliopoulos, Konstantinos, E-mail: kspiliop@dam.brown.edu [Brown University, Division of Applied Mathematics (United States)]

    2013-02-15

    In this paper we develop the large deviations principle and a rigorous mathematical framework for asymptotically efficient importance sampling schemes for general, fully dependent systems of stochastic differential equations of slow and fast motion with small noise in the slow component. We assume periodicity with respect to the fast component. Depending on the interaction of the fast scale with the smallness of the noise, we get different behavior. We examine how one range of interaction differs from the other one both for the large deviations and for the importance sampling. We use the large deviations results to identify asymptotically optimal importance sampling schemes in each case. Standard Monte Carlo schemes perform poorly in the small noise limit. In the presence of multiscale aspects one faces additional difficulties and straightforward adaptation of importance sampling schemes for standard small noise diffusions will not produce efficient schemes. It turns out that one has to consider the so called cell problem from the homogenization theory for Hamilton-Jacobi-Bellman equations in order to guarantee asymptotic optimality. We use stochastic control arguments.

  2. Innovation and adaptation in a Turkish sample: a preliminary study.

    Oner, B

    2000-11-01

    The aim of this study was to examine the representations of adaptation and innovation among adults in Turkey. Semi-structured interviews were carried out with a sample of 20 Turkish adults (10 men, 10 women) from various occupations. The participants' ages ranged from 21 to 58 years. Results of content analysis showed that the representation of innovation varied with the type of context. Innovation was not preferred within the family and interpersonal relationship contexts, whereas it was relatively more readily welcomed within the contexts of work, science, and technology. This finding may indicate that the concept of innovation that is assimilated in traditional Turkish culture has limits. Contents of the interviews were also analyzed with respect to M. J. Kirton's (1976) subscales of originality, efficiency, and rule-group conformity. The participants favored efficient innovators, whereas they thought that the risk of failure was high in cases of inefficient innovation. The reasons for and indications of the representations of innovativeness among Turkish people are discussed in relation to their social structure and cultural expectations. PMID:11092420

  3. The importance of cooling of urine samples for doping analysis

    Kuenen, J.G.; Konings, W.N.

    2009-01-01

    Storing and transporting of urine samples for doping analysis, as performed by the anti-doping organizations associated with the World Anti-Doping Agency, does not include a specific protocol for cooled transport from the place of urine sampling to the doping laboratory, although low cost cooling fa

  4. The importance of cooling of urine samples for doping analysis

    Kuenen, J. Gijs; Konings, Wil N.

    2010-01-01

    Storing and transporting of urine samples for doping analysis, as performed by the anti-doping organizations associated with the World Anti-Doping Agency, does not include a specific protocol for cooled transport from the place of urine sampling to the doping laboratory, although low cost cooling fa

  5. Job performance ratings : The relative importance of mental ability, conscientiousness, and career adaptability

    Ohme, Melanie; Zacher, Hannes

    2015-01-01

    According to career construction theory, continuous adaptation to the work environment is crucial to achieve work and career success. In this study, we examined the relative importance of career adaptability for job performance ratings using an experimental policy-capturing design. Employees (N = 13

  6. Joint importance sampling of low-order volumetric scattering

    Georgiev, Iliyan; Křivánek, Jaroslav; Hachisuka, Toshiya; Nowrouzezahrai, Derek; Jarosz, Wojciech

    Central to all Monte Carlo-based rendering algorithms is the construction of light transport paths from the light sources to the eye. Existing rendering approaches sample path vertices incrementally when constructing these light transport paths. The resulting probability density is thus a product of the conditional densities of each local sampling step, constructed without explicit control over the form of the final joint distribution of the complete path. We analyze why current incremental construction schemes often lead to high variance in the presence of participating media, and reveal ... Our sampling routines are applied to path-based rendering algorithms such as path tracing, bidirectional path tracing, and many-light methods. We also use our sampling routines to generalize deterministic shadow connections to connection subpaths consisting of two or three random decisions, to efficiently simulate higher-order multiple ...

  7. On the importance sampling of self-avoiding walks

    Bousquet-Mélou, Mireille

    2014-01-01

    In a 1976 paper published in Science, Knuth presented an algorithm to sample (non-uniform) self-avoiding walks crossing a square of side k. From this sample, he constructed an estimator for the number of such walks. The quality of this estimator is directly related to the (relative) variance of a certain random variable X_k. From his experiments, Knuth suspected that this variance was extremely large (so that the estimator would not be very efficient). But how large? For the analogous Rosenbl...

  8. Stress avoidance in a common annual: reproductive timing is important for local adaptation and geographic distribution.

    Griffith, T M; Watson, M A

    2005-11-01

    Adaptation to local environments may be an important determinant of species' geographic range. However, little is known about which traits contribute to adaptation or whether their further evolution would facilitate range expansion. In this study, we assessed the adaptive value of stress avoidance traits in the common annual Cocklebur (Xanthium strumarium) by performing a reciprocal transplant across a broad latitudinal gradient extending to the species' northern border. Populations were locally adapted and stress avoidance traits accounted for most fitness differences between populations. At the northern border where growing seasons are cooler and shorter, native populations had evolved to reproduce earlier than native populations in the lower latitude gardens. This clinal pattern in reproductive timing corresponded to a shift in selection from favouring later to earlier reproduction. Thus, earlier reproduction is an important adaptation to northern latitudes and constraint on the further evolution of this trait in marginal populations could potentially limit distribution. PMID:16313471

  9. Determination of free energy profiles by repository based adaptive umbrella sampling: Bridging nonequilibrium and quasiequilibrium simulations

    Zheng, Han; Zhang, Yingkai

    2008-01-01

    We propose a new adaptive sampling approach to determine free energy profiles with molecular dynamics simulations, which is called “repository based adaptive umbrella sampling” (RBAUS). Its main idea is that a sampling repository is continuously updated based on the latest simulation data, and the accumulated knowledge and sampling history are then employed to determine whether and how to update the biasing umbrella potential for subsequent simulations. In comparison with other adaptive me...

  10. Parallel importance sampling in conditional linear Gaussian networks

    Salmerón, Antonio; Ramos-López, Darío; Borchani, Hanen;

    2015-01-01

    In this paper we analyse the problem of probabilistic inference in CLG networks when evidence comes in streams. In such situations, fast and scalable algorithms, able to provide accurate responses in a short time are required. We consider the instantiation of variational inference and importance ...

  11. On the importance sampling of self-avoiding walks

    Bousquet-Mélou, Mireille

    2011-01-01

    In a 1976 paper published in Science, Knuth presented an algorithm to sample (non-uniform) self-avoiding walks crossing a square of side k. From this sample, he constructed an estimator for the number of such walks. The quality of this estimator is directly related to the (relative) variance of a certain random variable X_k. From his experiments, Knuth suspected that this variance was extremely large, so that the estimator would not be very efficient. A few years ago, Bassetti and Diaconis showed that, for a similar sampler that only generates walks consisting of North and East steps, the relative variance is O(\\sqrt k). In this note we go one step further by showing that, for walks consisting of North, South and East steps, the relative variance is of the order of 2^{k(k+1)}/(k+1)^{2k}, and thus much larger than exponential in k. We also obtain partial results for general self-avoiding walks, suggesting that the relative variance could be as large as \\mu^{k^2} for some \\mu>1. Knuth's algorithm is a basic exa...
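
    The flavor of Knuth's estimator is easy to reproduce for ordinary n-step self-avoiding walks (his actual setting, walks crossing a square of side k, adds boundary constraints): grow a walk by choosing uniformly among unvisited neighbors and average the product of the choice counts, which unbiasedly estimates the number of walks. The weight spread gives a direct feel for the large relative variance analyzed above.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def rosenbluth_weight(n):
        """Grow one n-step self-avoiding walk on Z^2, choosing uniformly among
        unvisited neighbors; return the product of the choice counts (0 if the
        walk traps itself). E[weight] = number of n-step self-avoiding walks."""
        pos, visited, w = (0, 0), {(0, 0)}, 1.0
        for _ in range(n):
            nbrs = [(pos[0] + dx, pos[1] + dy)
                    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
                    if (pos[0] + dx, pos[1] + dy) not in visited]
            if not nbrs:
                return 0.0
            w *= len(nbrs)
            pos = nbrs[rng.integers(len(nbrs))]
            visited.add(pos)
        return w

    n, reps = 10, 50000
    w = np.array([rosenbluth_weight(n) for _ in range(reps)])
    print(f"estimated walks of length {n}: {w.mean():.0f} (exact: 44100), "
          f"relative std error: {w.std() / w.mean() / np.sqrt(reps):.4f}")
    ```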

  12. An adaptive two-stage sequential design for sampling rare and clustered populations

    Brown, J.A.; Salehi, M.M.; Moradi, M.; Bell, G.; Smith, D.R.

    2008-01-01

    How to design an efficient large-area survey continues to be an interesting question for ecologists. In sampling large areas, as is common in environmental studies, adaptive sampling can be efficient because it ensures survey effort is targeted to subareas of high interest. In two-stage sampling, higher density primary sample units are usually of more interest than lower density primary units when populations are rare and clustered. Two-stage sequential sampling has been suggested as a method for allocating second stage sample effort among primary units. Here, we suggest a modification: adaptive two-stage sequential sampling. In this method, the adaptive part of the allocation process means the design is more flexible in how much extra effort can be directed to higher-abundance primary units. We discuss how best to design an adaptive two-stage sequential sample. © 2008 The Society of Population Ecology and Springer.

  13. State-independent importance sampling for random walks with regularly varying increments

    Karthyek R. A. Murthy

    2015-03-01

    We develop importance sampling based efficient simulation techniques for three commonly encountered rare event probabilities associated with random walks having i.i.d. regularly varying increments: (1) the large deviation probabilities, (2) the level crossing probabilities, and (3) the level crossing probabilities within a regenerative cycle. Exponential twisting based state-independent methods, which are effective in efficiently estimating these probabilities for light-tailed increments, are not applicable when the increments are heavy-tailed. To address the latter case, more complex and elegant state-dependent efficient simulation algorithms have been developed in the literature over the last few years. We propose that by suitably decomposing these rare event probabilities into a dominant and further residual components, simpler state-independent importance sampling algorithms can be devised for each component, resulting in composite unbiased estimators with desirable efficiency properties. When the increments have infinite variance, there is an added complexity in estimating the level crossing probabilities as even the well known zero-variance measures have an infinite expected termination time. We adapt our algorithms so that this expectation is finite while the estimators remain strongly efficient. Numerically, the proposed estimators perform at least as well, and sometimes substantially better than the existing state-dependent estimators in the literature.

  14. An Adaptive Sampling System for Sensor Nodes in Body Area Networks.

    Rieger, R; Taylor, J

    2014-04-23

    The importance of body sensor networks to monitor patients over a prolonged period of time has increased with advances in home healthcare applications. Sensor nodes need to operate with very low power consumption and under the constraint of limited memory capacity. Therefore, it is wasteful to digitize the sensor signal at a constant sample rate, given that the frequency contents of the signals vary with time. Adaptive sampling is established as a practical method to reduce the sample data volume. In this paper a low-power analog system is proposed, which adjusts the converter clock rate to perform a peak-picking algorithm on the second derivative of the input signal. The presented implementation does not require an analog-to-digital converter or a digital processor in the sample selection process. The criteria for selecting a suitable detection threshold are discussed, so that the maximum sampling error can be limited. A circuit level implementation is presented. Measured results exhibit a significant reduction in the average sample frequency and data rate of over 50% and 38% respectively. PMID:24760918
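
    A digital caricature of the sampling policy (the paper implements the peak picking in analog hardware, without an ADC in the loop; the step sizes and threshold here are invented): march through a buffered signal and shorten the sampling step wherever the second difference, a proxy for the second derivative, is large.

    ```python
    import numpy as np

    def adaptive_sample(signal, base_step=8, thresh=0.02):
        """Keep samples densely where |second difference| exceeds thresh,
        and step through smooth regions quickly."""
        idx, i = [0], 0
        while i < len(signal) - 2:
            d2 = abs(signal[i + 2] - 2.0 * signal[i + 1] + signal[i])
            i += 1 if d2 > thresh else base_step
            idx.append(min(i, len(signal) - 1))
        return np.array(idx)

    t = np.linspace(0.0, 1.0, 2000)
    sig = np.where((t > 0.4) & (t < 0.5),
                   np.sin(2 * np.pi * 50 * t),       # brief high-activity burst
                   0.1 * np.sin(2 * np.pi * t))      # slow background
    kept = adaptive_sample(sig)
    print(f"kept {kept.size} of {sig.size} samples "
          f"({100 * (1 - kept.size / sig.size):.0f}% reduction)")
    ```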

  15. Multiview Sample Classification Algorithm Based on L1-Graph Domain Adaptation Learning

    Huibin Lu; Zhengping Hu; Hongxiao Gao

    2015-01-01

    In the case of multiview sample classification with different distribution, training and testing samples are from different domains. In order to improve the classification performance, a multiview sample classification algorithm based on L1-Graph domain adaptation learning is presented. First of all, a framework of nonnegative matrix trifactorization based on domain adaptation learning is formed, in which the unchanged information is regarded as the bridge of knowledge transformation from the...

  16. Climate variables explain neutral and adaptive variation within salmonid metapopulations: The importance of replication in landscape genetics

    Hand, Brian K; Muhlfeld, Clint C.; Wade, Alisa A.; Kovach, Ryan; Whited, Diane C.; Narum, Shawn R; Matala, Andrew P; Ackerman, Michael W.; Garner, B. A.; Kimball, John S; Stanford, Jack A.; Luikart, Gordon

    2016-01-01

    Understanding how environmental variation influences population genetic structure is important for conservation management because it can reveal how human stressors influence population connectivity, genetic diversity and persistence. We used riverscape genetics modelling to assess whether climatic and habitat variables were related to neutral and adaptive patterns of genetic differentiation (population-specific and pairwise FST) within five metapopulations (79 populations, 4583 individuals) of steelhead trout (Oncorhynchus mykiss) in the Columbia River Basin, USA. Using 151 putatively neutral and 29 candidate adaptive SNP loci, we found that climate-related variables (winter precipitation, summer maximum temperature, winter highest 5% flow events and summer mean flow) best explained neutral and adaptive patterns of genetic differentiation within metapopulations, suggesting that climatic variation likely influences both demography (neutral variation) and local adaptation (adaptive variation). However, we did not observe consistent relationships between climate variables and FST across all metapopulations, underscoring the need for replication when extrapolating results from one scale to another (e.g. basin-wide to the metapopulation scale). Sensitivity analysis (leave-one-population-out) revealed consistent relationships between climate variables and FST within three metapopulations; however, these patterns were not consistent in two metapopulations likely due to small sample sizes (N = 10). These results provide correlative evidence that climatic variation has shaped the genetic structure of steelhead populations and highlight the need for replication and sensitivity analyses in land and riverscape genetics.

  17. An Adaptive Spectral Clustering Algorithm Based on the Importance of Shared Nearest Neighbors

    Xiaoqi He

    2015-05-01

    The construction of a similarity matrix is a significant step in spectral clustering, and the Gaussian kernel function is one of the most common measures for constructing it. However, with a fixed scaling parameter, the similarity between two data points is not adaptive and is inappropriate for multi-scale datasets. In this paper, by quantifying the importance of each vertex of the similarity graph, the Gaussian kernel function is scaled, and an adaptive Gaussian kernel similarity measure is proposed. An adaptive spectral clustering algorithm is then obtained based on the importance of shared nearest neighbors. The idea is that the greater the importance of the shared neighbors between two vertices, the more likely it is that the two vertices belong to the same cluster; the importance value of the shared neighbors is obtained with an iterative method that considers both local structural information and distance similarity information, so as to improve the algorithm's performance. Experimental results on different datasets show that this spectral clustering algorithm outperforms other spectral clustering algorithms, such as self-tuning spectral clustering and adaptive spectral clustering based on shared nearest neighbors, in clustering accuracy on most datasets.
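
    A sketch of the kernel-scaling idea under simplifying assumptions: local scales sigma_i taken from the k-th-neighbor distance (as in self-tuning spectral clustering) and a boost for pairs that share many of their k nearest neighbors. The paper's iterative importance computation is replaced here by a single shared-neighbor fraction.

    ```python
    import numpy as np

    def snn_adaptive_affinity(X, k=7):
        """Locally scaled Gaussian affinity, emphasized for vertex pairs with
        many shared nearest neighbors."""
        d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
        order = np.argsort(d, axis=1)
        knn = [set(row[1:k + 1]) for row in order]     # k nearest neighbors
        sigma = d[np.arange(len(X)), order[:, k]]      # local scale per point
        W = np.exp(-d ** 2 / np.outer(sigma, sigma))
        snn = np.array([[len(knn[i] & knn[j]) / k for j in range(len(X))]
                        for i in range(len(X))])
        return W * (1.0 + snn)      # shared-neighbor pairs gain similarity

    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0.0, 0.3, (20, 2)),      # two separated clusters
                   rng.normal(3.0, 0.3, (20, 2))])
    A = snn_adaptive_affinity(X)
    print(A.shape, float(A.min()), float(A.max()))
    ```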

  18. Adaptive optics for deeper imaging of biological samples.

    Girkin, John M; Poland, Simon; Wright, Amanda J

    2009-02-01

    Optical microscopy has been a cornerstone of life science investigations since its first practical application around 400 years ago, with the goal being subcellular resolution, three-dimensional images, at depth, in living samples. Nonlinear microscopy brought this dream a step closer, but as one images more deeply, the material through which one images can greatly distort the view. By using optical devices, originally developed for astronomy, whose optical properties can be changed in real time, active compensation for sample-induced aberrations is possible. Submicron resolution images are now routinely recorded from depths over 1 mm into tissue. Such active optical elements can also be used to keep conventional microscopes, both confocal and widefield, in optimal alignment. PMID:19272766

  19. Sampling Plant Diversity and Rarity at Landscape Scales: Importance of Sampling Time in Species Detectability

    Zhang, Jian; Nielsen, Scott E.; Grainger, Tess N.; Kohler, Monica; Chipchar, Tim; Farr, Daniel R.

    2014-01-01

    Documenting and estimating species richness at regional or landscape scales has been a major emphasis for conservation efforts, as well as for the development and testing of evolutionary and ecological theory. Rarely, however, are sampling efforts assessed on how they affect detection and estimates of species richness and rarity. In this study, vascular plant richness was sampled in 356 quarter hectare time-unlimited survey plots in the boreal region of northeast Alberta. These surveys consisted of 15,856 observations of 499 vascular plant species (97 considered to be regionally rare) collected by 12 observers over a 2 year period. Average survey time for each quarter-hectare plot was 82 minutes, ranging from 20 to 194 minutes, with a positive relationship between total survey time and total plant richness. When survey time was limited to a 20-minute search, as in other Alberta biodiversity methods, 61 species were missed. Extending the survey time to 60 minutes, reduced the number of missed species to 20, while a 90-minute cut-off time resulted in the loss of 8 species. When surveys were separated by habitat type, 60 minutes of search effort sampled nearly 90% of total observed richness for all habitats. Relative to rare species, time-unlimited surveys had ∼65% higher rare plant detections post-20 minutes than during the first 20 minutes of the survey. Although exhaustive sampling was attempted, observer bias was noted among observers when a subsample of plots was re-surveyed by different observers. Our findings suggest that sampling time, combined with sample size and observer effects, should be considered in landscape-scale plant biodiversity surveys. PMID:24740179

  20. Adapting sampling plans to caribou distribution on calving grounds

    Michel Crête

    1991-10-01

    Between 1984 and 1988, the size of the two caribou herds in northern Québec was derived by combining estimates of female numbers on calving grounds in June and composition counts during rut in autumn. Sampling with aerial photos was conducted on calving grounds to determine the number of animals per km², telemetry served to estimate the proportion of females in the census area at the time of photography in addition to summer survival rate, and helicopter or ground observations were used for composition counts. Observers were able to detect on black and white negatives over 95 percent of caribou counted from a helicopter flying at low altitude over the same area; photo scale varied between 1:3 600 and 1:6 000. Sampling units covering less than 15-20 ha were the best for sampling caribou distribution on calving grounds, where density generally averaged ≈ 10 individuals/km². Around 90 percent of caribou on calving grounds were females; others were mostly yearling males. During the 1-2 day photographic census, 64 to 77 percent of the females were present on the calving areas. Summer survival exceeded 95 percent in three summers. In autumn, females composed between 45 and 54 percent of each herd. The Rivière George herd was estimated at 682 000 individuals (± 36%; alpha = 0.10) in 1988. This estimate was imprecise due to insufficient sample size for measuring animal density on the calving ground and for determining the proportion of females on the calving ground at the time of the photo census. To improve precision and reduce cost, it is proposed to estimate herd size of tundra caribou in one step, using only aerial photos in early June without telemetry.

  1. Long-term dynamics of adaptive evolution in a globally important phytoplankton species to ocean acidification.

    Schlüter, Lothar; Lohbeck, Kai T; Gröger, Joachim P; Riebesell, Ulf; Reusch, Thorsten B H

    2016-07-01

    Marine phytoplankton may adapt to ocean change, such as acidification or warming, because of their large population sizes and short generation times. Long-term adaptation to novel environments is a dynamic process, and phenotypic change can take place thousands of generations after exposure to novel conditions. We conducted a long-term evolution experiment (4 years = 2100 generations), starting with a single clone of the abundant and widespread coccolithophore Emiliania huxleyi exposed to three different CO2 levels simulating ocean acidification (OA). Growth rates as a proxy for Darwinian fitness increased only moderately under both levels of OA [+3.4% and +4.8%, respectively, at 1100 and 2200 μatm partial pressure of CO2 (Pco2)] relative to control treatments (ambient CO2, 400 μatm). Long-term adaptation to OA was complex, and initial phenotypic responses of ecologically important traits were later reverted. The biogeochemically important trait of calcification, in particular, that had initially been restored within the first year of evolution was later reduced to levels lower than the performance of nonadapted populations under OA. Calcification was not constitutively lost but returned to control treatment levels when high CO2-adapted isolates were transferred back to present-day control CO2 conditions. Selection under elevated CO2 exacerbated a general decrease of cell sizes under long-term laboratory evolution. Our results show that phytoplankton may evolve complex phenotypic plasticity that can affect biogeochemically important traits, such as calcification. Adaptive evolution may play out over longer time scales (>1 year) in an unforeseen way under future ocean conditions that cannot be predicted from initial adaptation responses. PMID:27419227

  2. Estimating the Importance of Private Adaptation to Climate Change in Agriculture: A Review of Empirical Methods

    Moore, F.; Burke, M.

    2015-12-01

    A wide range of studies using a variety of methods strongly suggest that climate change will have a negative impact on agricultural production in many areas. Farmers, though, should be able to learn about a changing climate and to adjust what they grow and how they grow it in order to reduce these negative impacts. However, it remains unclear how effective these private (autonomous) adaptations will be, or how quickly they will be adopted. Constraining the uncertainty on this adaptation is important for understanding the impacts of climate change on agriculture. Here we review a number of empirical methods that have been proposed for understanding the rate and effectiveness of private adaptation to climate change. We compare these methods using data on agricultural yields in the United States and western Europe.

  3. Dangerous climate change and the importance of adaptation for the Arctic's Inuit population

    Ford, James D.

    2009-04-01

    The Arctic's climate is changing rapidly, to the extent that 'dangerous' climate change as defined by the United Nations Framework on Climate Change might already be occurring. These changes are having implications for the Arctic's Inuit population and are being exacerbated by the dependence of Inuit on biophysical resources for livelihoods and the low socio-economic-health status of many northern communities. Given the nature of current climate change and projections of a rapidly warming Arctic, climate policy assumes a particular importance for Inuit regions. This paper argues that efforts to stabilize and reduce greenhouse gas emissions are urgent if we are to avoid runaway climate change in the Arctic, but unlikely to prevent changes which will be dangerous for Inuit. In this context, a new policy discourse on climate change is required for Arctic regions—one that focuses on adaptation. The paper demonstrates that states with Inuit populations and the international community in general has obligations to assist Inuit to adapt to climate change through international human rights and climate change treaties. However, the adaptation deficit, in terms of what we know and what we need to know to facilitate successful adaptation, is particularly large in an Arctic context and limiting the ability to develop response options. Moreover, adaptation as an option of response to climate change is still marginal in policy negotiations and Inuit political actors have been slow to argue the need for adaptation assistance. A new focus on adaptation in both policy negotiations and scientific research is needed to enhance Inuit resilience and reduce vulnerability in a rapidly changing climate.

  4. Dangerous climate change and the importance of adaptation for the Arctic's Inuit population

    The Arctic's climate is changing rapidly, to the extent that 'dangerous' climate change as defined by the United Nations Framework on Climate Change might already be occurring. These changes are having implications for the Arctic's Inuit population and are being exacerbated by the dependence of Inuit on biophysical resources for livelihoods and the low socio-economic-health status of many northern communities. Given the nature of current climate change and projections of a rapidly warming Arctic, climate policy assumes a particular importance for Inuit regions. This paper argues that efforts to stabilize and reduce greenhouse gas emissions are urgent if we are to avoid runaway climate change in the Arctic, but unlikely to prevent changes which will be dangerous for Inuit. In this context, a new policy discourse on climate change is required for Arctic regions-one that focuses on adaptation. The paper demonstrates that states with Inuit populations and the international community in general has obligations to assist Inuit to adapt to climate change through international human rights and climate change treaties. However, the adaptation deficit, in terms of what we know and what we need to know to facilitate successful adaptation, is particularly large in an Arctic context and limiting the ability to develop response options. Moreover, adaptation as an option of response to climate change is still marginal in policy negotiations and Inuit political actors have been slow to argue the need for adaptation assistance. A new focus on adaptation in both policy negotiations and scientific research is needed to enhance Inuit resilience and reduce vulnerability in a rapidly changing climate.

  5. Adaptation and Initial Validation of the Passion Scale in a Portuguese Sample

    Gabriela Gonçalves

    2014-08-01

    Passion is defined as a strong inclination to engage in an activity that people like, that they find important, and in which they invest time and energy. As no specific measure to assess levels of passion in the workplace in Portugal is available, the aim of this study was to adapt the Passion scale into Portuguese and validate it. The scale was translated from English into Portuguese using the forward-backward translation method and administered to a sample of 551 Portuguese workers. Exploratory factor analyses were conducted to test the replicability of the scale. The results confirmed the expected two-factor structure: harmonious passion and obsessive passion. However, the initial criterion of the replication of the factorial structure based on item factor loadings was not fulfilled. Criterion-related validity was tested by correlations with passion and job satisfaction. Regarding internal consistency, adequate alpha coefficients were obtained for both factors.

  6. Parks, people, and change: the importance of multistakeholder engagement in adaptation planning for conserved areas

    Corrine N. Knapp

    2014-12-01

    Climate change challenges the traditional goals and conservation strategies of protected areas, necessitating adaptation to changing conditions. Denali National Park and Preserve (Denali) in south central Alaska, USA, is a vast landscape that is responding to climate change in ways that will impact both ecological resources and local communities. Local observations help to inform understanding of climate change and adaptation planning, but whose knowledge is most important to consider? For this project we interviewed long-term Denali staff, scientists, subsistence community members, bus drivers, and business owners to assess what types of observations each can contribute, how climate change is impacting each, and what they think the National Park Service should do to adapt. The project shows that each type of long-term observer has different types of observations, but that those who depend more directly on natural resources for their livelihoods have more and different observations than those who do not. These findings suggest that engaging multiple groups of stakeholders who interact with the park in distinct ways adds substantially to the information provided by Denali staff and scientists and offers a broader foundation for adaptation planning. It also suggests that traditional protected area paradigms that fail to learn from and foster appropriate engagement of people may be maladaptive in the context of climate change.

  7. Estimating the abundance of clustered animal population by using adaptive cluster sampling and negative binomial distribution

    Bo, Yizhou; Shifa, Naima

    2013-09-01

    An estimator for finding the abundance of a rare, clustered and mobile population has been introduced. This model is based on adaptive cluster sampling (ACS) to identify the location of the population and negative binomial distribution to estimate the total in each site. To identify the location of the population we consider both sampling with replacement (WR) and sampling without replacement (WOR). Some mathematical properties of the model are also developed.

  8. Adaptive sampling based on the cumulative distribution function of order statistics to delineate heavy-metal contaminated soils using kriging

    Correctly classifying 'contaminated' areas in soils, based on the threshold for a contaminated site, is important for determining effective clean-up actions. Pollutant mapping by means of kriging is increasingly being used for the delineation of contaminated soils. However, those areas where the kriged pollutant concentrations are close to the threshold have a high possibility for being misclassified. In order to reduce the misclassification due to the over- or under-estimation from kriging, an adaptive sampling using the cumulative distribution function of order statistics (CDFOS) was developed to draw additional samples for delineating contaminated soils, while kriging. A heavy-metal contaminated site in Hsinchu, Taiwan was used to illustrate this approach. The results showed that compared with random sampling, adaptive sampling using CDFOS reduced the kriging estimation errors and misclassification rates, and thus would appear to be a better choice than random sampling, as additional sampling is required for delineating the 'contaminated' areas. - A sampling approach was derived for drawing additional samples while kriging

  9. La recherche en gestion en Afrique de l'Ouest:importation ou adaptation?

    Livian, Yves

    2013-01-01

    Management research in the countries of the "North" grew out of an economic and social context quite different from that of African countries today. Pure importation is doomed to failure, as is the "essentialist" tendency of an irreducible African specificity. A few directions are sketched for adapting research to the local context, an approach with great potential for development given the vast terrain still to be explored.

  10. Intraspecific shape variation in horseshoe crabs: the importance of sexual and natural selection for local adaptation

    Faurby, Søren; Nielsen, Kasper Sauer Kollerup; Bussarawit, Somchai;

    2011-01-01

    A morphometric analysis of the body shape of three species of horseshoe crabs was undertaken in order to infer the importance of natural and sexual selection. It was expected that natural selection would be most intense, leading to highest regional differentiation, in the American species Limulus polyphemus, which has the largest climatic differences between different populations. Local adaptation driven by sexual selection was expected in males but not females because horseshoe crab mating behaviour leads to competition between males, but not between females. Three hundred fifty-nine horseshoe crabs...

  11. Assessing employability capacities and career adaptability in a sample of human resource professionals

    Melinde Coetzee; Nadia Ferreira; Ingrid L. Potgieter

    2015-01-01

    Orientation: Employers have come to recognise graduates’ employability capacities and their ability to adapt to new work demands as important human capital resources for sustaining a competitive business advantage. Research purpose: The study sought (1) to ascertain whether a significant relationship exists between a set of graduate employability capacities and a set of career adaptability capacities and (2) to identify the variables that contributed the most to this relationship. Motivation fo...

  12. Data reduction in the ITMS system through a data acquisition model with self-adaptive sampling rate

    Ruiz, M. [Grupo de Investigacion en Instrumentacion y Acustica Aplicada, Universidad Politecnica de Madrid (UPM), Crta. Valencia Km-7, Madrid 28031 (Spain)], E-mail: mariano.ruiz@upm.es; Lopez, JM.; Arcas, G. de [Grupo de Investigacion en Instrumentacion y Acustica Aplicada, Universidad Politecnica de Madrid (UPM), Crta. Valencia Km-7, Madrid 28031 (Spain)]; Barrera, E. [Departamento de Sistemas Electronicos y de Control, Universidad Politecnica de Madrid (UPM), Crta. Valencia Km-7, Madrid 28031 (Spain)]; Melendez, R. [Grupo de Investigacion en Instrumentacion y Acustica Aplicada, Universidad Politecnica de Madrid (UPM), Crta. Valencia Km-7, Madrid 28031 (Spain)]; Vega, J. [Asociacion EURATOM/CIEMAT para Fusion, Madrid (Spain)]

    2008-04-15

    Long pulse or steady state operation of fusion experiments requires data acquisition and processing systems that reduce the volume of data involved. The availability of self-adaptive sampling rate systems and the use of real-time lossless data compression techniques can help solve these problems. The former is important for continuous adaptation of the sampling frequency to experimental requirements. The latter allows the maintenance of continuous digitization under limited memory conditions. This can be achieved by permanent transmission of compressed data to other systems. The compacted transfer ensures the use of minimum bandwidth. This paper presents an implementation based on the intelligent test and measurement system (ITMS), a data acquisition system architecture with multiprocessing capabilities that permits it to adapt the system's sampling frequency throughout the experiment. The sampling rate can be controlled depending on the experiment's specific requirements by using an external dc voltage signal or by defining user events through software. The system takes advantage of the high processing capabilities of the ITMS platform to implement a data reduction mechanism based on lossless data compression algorithms, which are themselves based on periodic deltas.

  13. Low Bit-Rate Image Compression using Adaptive Down-Sampling technique

    V.Swathi; Prof. K ASHOK BABU

    2011-01-01

    In this paper, we use a practical approach of uniform down-sampling in image space while making the sampling adaptive by spatially varying, directional low-pass pre-filtering. The resulting down-sampled pre-filtered image remains a conventional square sample grid, and, thus, it can be compressed and transmitted without any change to current image coding standards and systems. The decoder first decompresses the low-resolution image and then up-converts it to the original resolut...

  14. The importance of the EU green paper on climate adaptation for the Netherlands

    An analysis of the EU Green Paper on Climate Adaptation shows that it is not inconsistent with the Dutch national adaptation strategy, but does differ from it. The Green Paper also highlights the European dimension of climate adaptation and approaches climate adaptation in a broader context. Furthermore, the challenge is to remain alert and to bring Dutch ideas into the EU policy process when concrete measures are formulated

  15. How to apply importance-sampling techniques to simulations of optical systems

    McKinstrie, C. J.; Winzer, P. J.

    2003-01-01

    This report contains a tutorial introduction to the method of importance sampling. The use of this method is illustrated for simulations of the noise-induced energy jitter of return-to-zero pulses in optical communication systems.
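    Since this record is a tutorial introduction, a generic illustration may be useful. The sketch below estimates the tail probability P(X > 4) for a standard normal: plain Monte Carlo almost never hits the rare event, while drawing from a proposal density shifted into the tail and reweighting by the likelihood ratio recovers the answer with far fewer samples (the threshold and the mean-shifted proposal are arbitrary choices for this sketch, not taken from the report):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, a = 100_000, 4.0                    # sample size, rare-event threshold

# Plain Monte Carlo: P(X > 4) ~ 3e-5, so only ~3 hits in 100k samples.
x = rng.standard_normal(n)
mc = np.mean(x > a)

# Importance sampling: draw from N(a, 1), reweight by the likelihood ratio.
y = rng.normal(a, 1.0, n)
w = stats.norm.pdf(y) / stats.norm.pdf(y, loc=a, scale=1.0)
is_est = np.mean(w * (y > a))

print(f"exact={stats.norm.sf(a):.3e}  plain MC={mc:.3e}  IS={is_est:.3e}")
```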

  16. An Importance Sampling Scheme on Dual Factor Graphs. II. Models with Strong Couplings

    Molkaraie, Mehdi

    2014-01-01

    We consider the problem of estimating the partition function of the two-dimensional ferromagnetic Ising model in an external magnetic field. The estimation is done via importance sampling in the dual of the Forney factor graph representing the model. We present importance sampling schemes that can efficiently compute an estimate of the partition function in a wide range of model parameters. Emphasis is on models in which a subset of the coupling parameters is strong.
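    For concreteness, the quantity being estimated can be shown on a toy model. The sketch below is not the dual factor graph scheme of this record; it is the naive importance-sampling baseline with a uniform proposal, Z = 2^N * E_uniform[exp(-beta*H(x))], on a grid small enough that the exact partition function can be enumerated:

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(1)
L, beta, h = 3, 0.4, 0.2               # 3x3 grid so exact enumeration is feasible

def energy(s):
    # Nearest-neighbour ferromagnetic couplings plus an external field h.
    return -(np.sum(s[:-1, :] * s[1:, :]) + np.sum(s[:, :-1] * s[:, 1:])
             + h * np.sum(s))

# Exact partition function over all 2^9 = 512 spin configurations.
Z_exact = sum(np.exp(-beta * energy(np.array(c).reshape(L, L)))
              for c in product([-1, 1], repeat=L * L))

# Naive IS with a uniform proposal: Z = 2^(L*L) * E_uniform[exp(-beta*H)].
n = 20_000
samples = rng.choice([-1, 1], size=(n, L, L))
Z_hat = 2 ** (L * L) * np.mean([np.exp(-beta * energy(s)) for s in samples])
print(f"Z exact = {Z_exact:.1f}, naive IS estimate = {Z_hat:.1f}")
```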

  17. An upgraded version of an importance sampling algorithm for large scale shell model calculations

    Bianco, D; Andreozzi, F; Lo Iudice, N; Porrino, A [Universita di Napoli Federico II, Dipartimento Scienze Fisiche, Monte S. Angelo, via Cintia, 80126 Napoli (Italy)]; Dimitrova, S, E-mail: loiudice@na.infn.it [Institute of Nuclear Research and Nuclear Energy, Sofia (Bulgaria)]

    2010-01-01

    An importance sampling iterative algorithm, developed a few years ago, for generating exact eigensolutions of large matrices is upgraded so as to allow large scale shell model calculations in the uncoupled m-scheme. By exploiting the sparsity properties of the Hamiltonian matrix and effectively projecting out the good angular momentum, the new importance sampling drastically reduces the sizes of the matrices while keeping full control of the accuracy of the eigensolutions. Illustrative numerical examples are presented.
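    The idea of sampling basis states by their estimated importance can be illustrated on a generic sparse symmetric matrix. The sketch below is not the authors' algorithm; it uses a simple second-order perturbative importance estimate, |<j|H|psi>|^2 / (H_jj - E0), to decide which states to add to the truncated space (the matrix, thresholds, and batch sizes are all illustrative):

```python
import numpy as np

rng = np.random.default_rng(6)
n = 400

# Random sparse symmetric stand-in "Hamiltonian" with a dominant diagonal.
H = np.diag(np.sort(rng.uniform(0.0, 50.0, n)))
off = rng.normal(0.0, 0.5, (n, n)) * (rng.random((n, n)) < 0.05)
H += np.triu(off, 1) + np.triu(off, 1).T

sel = list(range(10))            # start from the 10 lowest diagonal states
for _ in range(15):
    E, V = np.linalg.eigh(H[np.ix_(sel, sel)])
    e0, c = E[0], V[:, 0]
    rest = [j for j in range(n) if j not in sel]
    # Second-order importance of each excluded state j; keep states above
    # a (purely illustrative) threshold, at most 20 per iteration.
    coup = H[np.ix_(rest, sel)] @ c
    imp = coup ** 2 / (H[rest, rest] - e0)
    add = [rest[k] for k in np.argsort(imp)[::-1] if imp[k] > 1e-4][:20]
    if not add:
        break
    sel += add

E_sub = np.linalg.eigh(H[np.ix_(sel, sel)])[0][0]
E_full = np.linalg.eigh(H)[0][0]
print(f"kept {len(sel)}/{n} states: E0 = {E_sub:.4f} (exact {E_full:.4f})")
```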

  18. Multiview Sample Classification Algorithm Based on L1-Graph Domain Adaptation Learning

    Huibin Lu

    2015-01-01

    Full Text Available In the case of multiview sample classification with different distributions, training and testing samples come from different domains. In order to improve the classification performance, a multiview sample classification algorithm based on L1-Graph domain adaptation learning is presented. First of all, a framework of nonnegative matrix trifactorization based on domain adaptation learning is formed, in which the unchanged information is regarded as the bridge of knowledge transformation from the source domain to the target domain; the second step is to construct an L1-Graph on the basis of sparse representation, so as to adaptively search for nearest-neighbor data and preserve the geometric structure of the samples; lastly, we integrate the two complementary objective functions into a unified optimization problem and solve it with an iterative algorithm, which completes the estimation of the testing sample classification. Comparative experiments are conducted on the USPS-Binary digital database, the Three-Domain Object Benchmark database, and the ALOI database; the experimental results verify the effectiveness of the proposed algorithm, which improves the recognition accuracy and ensures the robustness of the algorithm.

  19. Sample based 3D face reconstruction from a single frontal image by adaptive locally linear embedding

    ZHANG Jian; ZHUANG Yue-ting

    2007-01-01

    In this paper, we propose a highly automatic approach for 3D photorealistic face reconstruction from a single frontal image. The key point of our work is the implementation of adaptive manifold learning approach. Beforehand, an active appearance model (AAM) is trained for automatic feature extraction and adaptive locally linear embedding (ALLE) algorithm is utilized to reduce the dimensionality of the 3D database. Then, given an input frontal face image, the corresponding weights between 3D samples and the image are synthesized adaptively according to the AAM selected facial features. Finally, geometry reconstruction is achieved by linear weighted combination of adaptively selected samples. Radial basis function (RBF) is adopted to map facial texture from the frontal image to the reconstructed face geometry. The texture of invisible regions between the face and the ears is interpolated by sampling from the frontal image. This approach has several advantages: (1) Only a single frontal face image is needed for highly automatic face reconstruction; (2) Compared with former works, our reconstruction approach provides higher accuracy; (3) Constraint based RBF texture mapping provides natural appearance for reconstructed face.

  20. The use of importance sampling in a trial assessment to obtain converged estimates of radiological risk

    In developing a methodology for assessing potential sites for the disposal of radioactive wastes, the Department of the Environment has conducted a series of trial assessment exercises. In order to produce converged estimates of radiological risk using the SYVAC A/C simulation system, an efficient sampling procedure is required. Previous work has demonstrated that importance sampling can substantially increase sampling efficiency. This study used importance sampling to produce converged estimates of risk for the first DoE trial assessment. Four major nuclide chains were analysed. In each case importance sampling produced converged risk estimates with between 10 and 170 times fewer runs of the SYVAC A/C model. This increase in sampling efficiency can reduce the total elapsed time required to obtain a converged estimate of risk from one nuclide chain by a factor of 20. The results of this study suggest that the use of importance sampling could reduce the elapsed time required to perform a risk assessment of a potential site by a factor of ten. (author)

  1. Adaptation to climate change and climate variability:The importance of understanding agriculture as performance

    Crane, T.A.; Roncoli, C.; Hoogenboom, G.

    2011-01-01

    Most climate change studies that address potential impacts and potential adaptation strategies are largely based on modelling technologies. While models are useful for visualizing potential future outcomes and evaluating options for potential adaptation, they do not adequately represent and integrate...

  2. FloodNet: Coupling Adaptive Sampling with Energy Aware Routing in a Flood Warning System

    Jing Zhou; David De Roure

    2007-01-01

    We describe the design of FloodNet, a flood warning system, which uses a grid-based flood predictor model developed by environmental experts to make flood predictions based on readings of water level collected by a set of sensor nodes. To optimize battery consumption, the reporting frequency of sensor nodes is required to be adaptive to local conditions as well as the flood predictor model. We therefore propose an energy aware routing protocol which allows sensor nodes to consume energy according to this need. This system is notable both for the adaptive sampling regime and the methodology adopted in the design of the adaptive behavior, which involved development of simulation tools and very close collaboration with environmental experts.

  3. Assessing employability capacities and career adaptability in a sample of human resource professionals

    Melinde Coetzee

    2015-03-01

    Full Text Available Orientation: Employers have come to recognise graduates’ employability capacities and their ability to adapt to new work demands as important human capital resources for sustaining a competitive business advantage. Research purpose: The study sought (1) to ascertain whether a significant relationship exists between a set of graduate employability capacities and a set of career adaptability capacities and (2) to identify the variables that contributed the most to this relationship. Motivation for the study: Global competitive markets and technological advances are increasingly driving the demand for graduate knowledge and skills in a wide variety of jobs. Contemporary career theory further emphasises career adaptability across the lifespan as a critical skill for career management agency. Despite the apparent importance attached to employees’ employability and career adaptability, there seems to be a general lack of research investigating the association between these constructs. Research approach, design and method: A cross-sectional, quantitative research design approach was followed. Descriptive statistics, Pearson product-moment correlations and canonical correlation analysis were performed to achieve the objective of the study. The participants (N = 196) were employed in professional positions in the human resource field and were predominantly early career black people and women. Main findings: The results indicated positive multivariate relationships between the variables and showed that lifelong learning capacities and problem solving, decision-making and interactive skills contributed the most to explaining the participants’ career confidence, career curiosity and career control. Practical/managerial implications: The study suggests that developing professional graduates’ employability capacities may strengthen their career adaptability. These capacities were shown to explain graduates’ active engagement in career management strategies

  4. Low Bit-Rate Image Compression using Adaptive Down-Sampling technique

    V.Swathi

    2011-09-01

    Full Text Available In this paper, we use a practical approach of uniform down-sampling in image space while making the sampling adaptive by spatially varying, directional low-pass pre-filtering. The resulting down-sampled pre-filtered image remains a conventional square sample grid, and, thus, it can be compressed and transmitted without any change to current image coding standards and systems. The decoder first decompresses the low-resolution image and then up-converts it to the original resolution in a constrained least squares restoration process, using a 2-D piecewise autoregressive model and the knowledge of directional low-pass pre-filtering. The proposed compression approach of collaborative adaptive down-sampling and up-conversion (CADU) outperforms JPEG 2000 in PSNR measure at low to medium bit rates and achieves superior visual quality as well. The superior low bit-rate performance of the CADU approach seems to suggest that over-sampling not only wastes hardware resources and energy but could also be counterproductive to image quality given a tight bit budget.

  5. Using continuous in-situ measurements to adaptively trigger urban storm water samples

    Wong, B. P.; Kerkez, B.

    2015-12-01

    Until cost-effective in-situ sensors are available for biological parameters, nutrients and metals, automated samplers will continue to be the primary source of reliable water quality measurements. Given limited sample bottles, however, autosamplers often obscure insights on nutrient sources and biogeochemical processes which would otherwise be captured using a continuous sampling approach. To that end, we evaluate the efficacy of a novel method to measure first-flush nutrient dynamics in flashy, urban watersheds. Our approach reduces the number of samples required to capture water quality dynamics by leveraging an internet-connected sensor node, which is equipped with a suite of continuous in-situ sensors and an automated sampler. To capture both the initial baseflow as well as storm concentrations, a cloud-hosted adaptive algorithm analyzes the high-resolution sensor data along with local weather forecasts to optimize a sampling schedule. The method was tested in a highly developed urban catchment in Ann Arbor, Michigan and collected samples of nitrate, phosphorus, and suspended solids throughout several storm events. Results indicate that the watershed does not exhibit first flush dynamics, a behavior that would have been obscured when using a non-adaptive sampling approach.
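    The triggering logic described here can be caricatured in a few lines. The sketch below is a hypothetical decision rule, not the authors' cloud algorithm: it spends a bottle when the water level is rising quickly or rain is forecast, so that both baseflow and storm samples fit within the bottle budget (all thresholds are invented for illustration):

```python
from dataclasses import dataclass

@dataclass
class SamplerState:
    bottles_left: int = 24      # autosampler capacity (illustrative)
    last_level_m: float = 0.0

def should_trigger(state, level_m, rain_prob,
                   rise_thresh=0.05, rain_thresh=0.6):
    """Spend a bottle when the hydrograph rises quickly or a storm is
    forecast, so baseflow and first flush both fit in the bottle budget."""
    rising = (level_m - state.last_level_m) > rise_thresh
    state.last_level_m = level_m
    if state.bottles_left == 0:
        return False
    if rising or rain_prob > rain_thresh:
        state.bottles_left -= 1
        return True
    return False

state = SamplerState()
for level, rain in [(0.10, 0.1), (0.12, 0.7), (0.30, 0.9), (0.31, 0.2)]:
    print(f"level={level} rain_prob={rain} -> trigger={should_trigger(state, level, rain)}")
```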

  6. An importance sampling algorithm for generating exact eigenstates of the nuclear Hamiltonian

    Andreozzi, F; Iudice, N. Lo; Porrino, A.

    2003-01-01

    We endow a recently devised algorithm for generating exact eigensolutions of large matrices with an importance sampling, which is in control of the extent and accuracy of the truncation of their dimensions. We made several tests on typical nuclei using a correlated basis obtained from partitioning the shell model space. The sampling so implemented allows not only for a substantial reduction of the shell model space but also for an extrapolation to exact eigenvalues and E2 strengths.

  7. An importance sampling algorithm for generating exact eigenstates of the nuclear Hamiltonian

    Andreozzi, F [Dipartimento di Scienze Fisiche, Universita di Napoli Federico II, Naples (Italy)]; Iudice, N Lo [Dipartimento di Scienze Fisiche, Universita di Napoli Federico II, Naples (Italy)]; Porrino, A [Dipartimento di Scienze Fisiche, Universita di Napoli Federico II, Naples (Italy)]

    2003-10-01

    We endow a recently devised algorithm for generating exact eigensolutions of large matrices with an importance sampling, which is in control of the extent and accuracy of the truncation of their dimensions. We performed several tests on typical nuclei using a correlated basis obtained from partitioning the shell model space. The sampling so implemented allows not only for a substantial reduction of the shell model space but also for an extrapolation to exact eigenvalues and E2 strengths.

  8. Enhanced modeling via network theory: Adaptive sampling of Markov state models

    Bowman, Gregory R; Ensign, Daniel L.; Pande, Vijay S.

    2010-01-01

    Computer simulations can complement experiments by providing insight into molecular kinetics with atomic resolution. Unfortunately, even the most powerful supercomputers can only simulate small systems for short timescales, leaving modeling of most biologically relevant systems and timescales intractable. In this work, however, we show that molecular simulations driven by adaptive sampling of networks called Markov State Models (MSMs) can yield tremendous time and resource savings, allowing p...

  9. A double-loop adaptive sampling approach for sensitivity-free dynamic reliability analysis

    Dynamic reliability measures reliability of an engineered system considering time-variant operation condition and component deterioration. Due to high computational costs, conducting dynamic reliability analysis at an early system design stage remains challenging. This paper presents a confidence-based meta-modeling approach, referred to as double-loop adaptive sampling (DLAS), for efficient sensitivity-free dynamic reliability analysis. The DLAS builds a Gaussian process (GP) model sequentially to approximate extreme system responses over time, so that Monte Carlo simulation (MCS) can be employed directly to estimate dynamic reliability. A generic confidence measure is developed to evaluate the accuracy of dynamic reliability estimation while using the MCS approach based on developed GP models. A double-loop adaptive sampling scheme is developed to efficiently update the GP model in a sequential manner, by considering system input variables and time concurrently in two sampling loops. The model updating process using the developed sampling scheme can be terminated once the user defined confidence target is satisfied. The developed DLAS approach eliminates the computationally expensive sensitivity analysis process, thus substantially improving the efficiency of dynamic reliability analysis. Three case studies are used to demonstrate the efficacy of DLAS for dynamic reliability analysis. - Highlights: • Developed a novel adaptive sampling approach for dynamic reliability analysis. • Developed a new metric to quantify the accuracy of dynamic reliability estimation. • Developed a new sequential sampling scheme to efficiently update surrogate models. • Three case studies were used to demonstrate the efficacy of the new approach. • Case study results showed substantially enhanced efficiency with high accuracy
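    A simplified, single-loop flavour of this kind of GP-based adaptive refinement can be sketched as follows, assuming a one-dimensional stand-in limit-state function and the common U-criterion for picking the next sample (the paper's double-loop scheme over inputs and time, and its confidence measure, are not reproduced here):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(2)
g = lambda x: (x * np.sin(3 * x)).ravel()     # stand-in limit-state function

X = rng.uniform(0.0, 3.0, 5).reshape(-1, 1)   # small initial design
threshold = 1.0
for _ in range(10):
    gp = GaussianProcessRegressor(kernel=RBF(1.0), normalize_y=True).fit(X, g(X))
    cand = np.linspace(0.0, 3.0, 200).reshape(-1, 1)
    mu, sd = gp.predict(cand, return_std=True)
    # U-criterion: small |mu - threshold| / sd marks uncertain classification.
    u = np.abs(mu - threshold) / np.maximum(sd, 1e-12)
    X = np.vstack([X, cand[np.argmin(u)]])

gp = GaussianProcessRegressor(kernel=RBF(1.0), normalize_y=True).fit(X, g(X))
x_mc = rng.uniform(0.0, 3.0, 100_000).reshape(-1, 1)  # plain MCS on surrogate
pf = np.mean(gp.predict(x_mc) > threshold)
print(f"estimated P(g > {threshold}) with {len(X)} model runs: {pf:.3f}")
```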

  10. Adaptation and Validation of the Sexual Assertiveness Scale (SAS) in a Sample of Male Drug Users.

    Vallejo-Medina, Pablo; Sierra, Juan Carlos

    2015-01-01

    The aim of the present study was to adapt and validate the Sexual Assertiveness Scale (SAS) in a sample of male drug users. A sample of 326 male drug users and 322 non-clinical males was selected by cluster sampling and convenience sampling, respectively. Results showed that the scale had good psychometric properties and adequate internal consistency reliability (Initiation = .66, Refusal = .74 and STD-P = .79). An evaluation of the invariance showed strong factor equivalence between both samples. A high and moderate effect of Differential Item Functioning was only found in items 1 and 14 (ΔR² Nagelkerke = .076 and .037, respectively). We strongly recommend not using item 1 if the goal is to compare the scores of both groups, otherwise the comparison will be biased. Correlations obtained between the CSFQ-14 and the safe sex ratio and the SAS subscales were significant (CI = 95%) and indicated good concurrent validity. Scores of male drug users were similar to those of non-clinical males. Therefore, the adaptation of the SAS to drug users provides enough guarantees for reliable and valid use in both clinical practice and research, although care should be taken with item 1. PMID:25896498

  11. Adapting chain referral methods to sample new migrants: Possibilities and limitations

    Lucinda Platt

    2015-09-01

    Full Text Available Background: Demographic research on migration requires representative samples of migrant populations. Yet recent immigrants, who are particularly informative about current migrant flows, are difficult to capture even in specialist surveys. Respondent-driven sampling (RDS), a chain referral sampling and analysis technique, potentially offers the opportunity to achieve population-level inference of recently arrived migrant populations. Objective: We evaluate the attempt to use RDS to sample two groups of migrants, from Pakistan and Poland, who had arrived in the UK within the previous 18 months, and we present an alternative approach adapted to recent migrants. Methods: We discuss how connectedness, privacy, clustering, and motivation are expected to differ among recently arrived migrants, compared to typical applications of RDS. We develop a researcher-led chain referral approach, and compare success in recruitment and indicators of representativeness to standard RDS recruitment. Results: Our researcher-led approach led to higher rates of chain-referral, and enabled us to reach population members with smaller network sizes. The researcher-led approach resulted in similar recruiter-recruit transition probabilities to traditional RDS across many demographic and social characteristics. However, we did not succeed in building up long referral chains, largely due to the lack of connectedness of our target populations and some reluctance to refer. There were some differences between the two migrant groups, with less mobile and less hidden Pakistani men producing longer referral chains. Conclusions: Chain referral is difficult to implement for sampling newly arrived migrants. However, our researcher-led adaptation shows promise for less hidden and more stable recent immigrant populations. Contribution: The paper offers an evaluation of RDS for surveying recent immigrants and an adaptation that may be effective under certain conditions.

  12. An Importance Sampling Algorithm for Diagonalizing the Nuclear Shell-Model Hamiltonian

    We have developed an iterative algorithm for generating exact eigensolutions of large matrices and endowed it with an importance sampling which allows for a reduction of the sizes of the matrices while keeping full control of the accuracy of the eigensolutions. We illustrate the potential of the method through its application to the nuclear shell-model eigenproblem

  13. Adaptive sampling strategy support for the unlined chromic acid pit, chemical waste landfill, Sandia National Laboratories, Albuquerque, New Mexico

    Johnson, R.L.

    1993-11-01

    Adaptive sampling programs offer substantial savings in time and money when assessing hazardous waste sites. Key to some of these savings is the ability to adapt a sampling program to the real-time data generated by an adaptive sampling program. This paper presents a two-prong approach to supporting adaptive sampling programs: a specialized object-oriented database/geographical information system (SitePlanner™) for data fusion, management, and display and combined Bayesian/geostatistical methods (PLUME) for contamination-extent estimation and sample location selection. This approach is applied in a retrospective study of a subsurface chromium plume at Sandia National Laboratories' chemical waste landfill. Retrospective analyses suggest the potential for characterization cost savings on the order of 60% through a reduction in the number of sampling programs, total number of soil boreholes, and number of samples analyzed from each borehole.

  14. Importance sampling for Lambda-coalescents in the infinitely many sites model

    Birkner, Matthias; Steinruecken, Matthias; 10.1016/j.tpb.2011.01.005

    2011-01-01

    We present and discuss new importance sampling schemes for the approximate computation of the sample probability of observed genetic types in the infinitely many sites model from population genetics. More specifically, we extend the 'classical framework', where genealogies are assumed to be governed by Kingman's coalescent, to the more general class of Lambda-coalescents and develop further Hobolth et al.'s (2008) idea of deriving importance sampling schemes based on 'compressed genetrees'. The resulting schemes extend earlier work by Griffiths and Tavaré (1994), Stephens and Donnelly (2000), Birkner and Blath (2008) and Hobolth et al. (2008). We conclude with a performance comparison of classical and new schemes for Beta- and Kingman coalescents.

  15. Decomposition and (importance) sampling techniques for multi-stage stochastic linear programs

    Infanger, G.

    1993-11-01

    The difficulty of solving large-scale multi-stage stochastic linear programs arises from the sheer number of scenarios associated with numerous stochastic parameters. The number of scenarios grows exponentially with the number of stages and problems get easily out of hand even for very moderate numbers of stochastic parameters per stage. Our method combines dual (Benders) decomposition with Monte Carlo sampling techniques. We employ importance sampling to efficiently obtain accurate estimates of both expected future costs and gradients and right-hand sides of cuts. The method enables us to solve practical large-scale problems with many stages and numerous stochastic parameters per stage. We discuss the theory of sharing and adjusting cuts between different scenarios in a stage. We derive probabilistic lower and upper bounds, where we use importance path sampling for the upper bound estimation. Initial numerical results turned out to be promising.

  16. Improved Algorithms and Coupled Neutron-Photon Transport for Auto-Importance Sampling Method

    Wang, Xin; Qiu, Rui; Li, Chun-Yan; Liang, Man-Chun; Zhang, Hui; Li, Jun-Li

    2016-01-01

    Auto-Importance Sampling (AIS) is a Monte Carlo variance reduction technique proposed by Tsinghua University for deep penetration problems, which can improve computational efficiency significantly without pre-calculation of the importance distribution. However, the AIS method has only been validated on several basic deep penetration problems with simple geometries and cannot be used for coupled neutron-photon transport. This paper first presents the latest algorithm improvements for the AIS method, including particle transport, fictitious particle creation and adjustment, fictitious surface geometry, random number allocation and calculation of the estimated relative error, which make the AIS method applicable to complicated deep penetration problems. Then, a coupled Neutron-Photon Auto-Importance Sampling (NP-AIS) method is proposed to apply the AIS method with the improved algorithms in coupled neutron-photon Monte Carlo transport. Finally, the NUREG/CR-6115 PWR benchmark model was calculated with the method of geometry splitti...

  17. The Portuguese adaptation of the Gudjonsson Suggestibility Scale (GSS1) in a sample of inmates.

    Pires, Rute; Silva, Danilo R; Ferreira, Ana Sousa

    2014-01-01

    This paper comprises two studies which address the validity of the Portuguese adaptation of the Gudjonsson Suggestibility Scale, GSS1. In study 1, the means and standard deviations for the suggestibility results of a sample of Portuguese inmates (N=40, Mage=37.5 years, SD=8.1) were compared to those of a sample of Icelandic inmates (Gudjonsson, 1997; Gudjonsson & Sigurdsson, 1996). Portuguese inmates' results were in line with the original results. In study 2, the means and standard deviations for the suggestibility results of the sample of Portuguese inmates were compared to those of a general Portuguese population sample (N=57, Mage=36.1 years, SD=12.7). The forensic sample obtained significantly higher scores in suggestibility measures than the general population sample. ANOVA confirmed that the increased suggestibility in the inmates sample was due to the limited memory capacity of this latter group. Given that the results of both studies 1 and 2 are in keeping with the author's original results (Gudjonsson, 1997), this may be regarded as a confirmation of the validity of the Portuguese GSS1. PMID:24289862

  18. Performance evaluation of an importance sampling technique in a Jackson network

    Mahdipour, Ebrahim; Masoud Rahmani, Amir; Setayeshi, Saeed

    2014-03-01

    Importance sampling is a technique that is commonly used to speed up Monte Carlo simulation of rare events. However, little is known regarding the design of efficient importance sampling algorithms in the context of queueing networks. The standard approach, which simulates the system using an a priori fixed change of measure suggested by large deviation analysis, has been shown to fail in even the simplest network settings. Estimating probabilities associated with rare events has been a topic of great importance in queueing theory, and in applied probability at large. In this article, we analyse the performance of an importance sampling estimator for a rare event probability in a Jackson network. The article applies strict deadlines to a two-node Jackson network with feedback whose arrival and service rates are modulated by an exogenous finite state Markov process. We have estimated the probability of network blocking for various sets of parameters, and also the probability of customers missing their deadlines for different loads and deadlines. We have finally shown that the probability of total population overflow may be affected by various deadline values, service rates and arrival rates.

  19. Use of a Quantum Computer to do Importance and Metropolis-Hastings Sampling of a Classical Bayesian Network

    Tucci, Robert R.

    2008-01-01

    Importance sampling and Metropolis-Hastings sampling (of which Gibbs sampling is a special case) are two methods commonly used to sample multi-variate probability distributions (that is, Bayesian networks). Heretofore, the sampling of Bayesian networks has been done on a conventional "classical computer". In this paper, we propose methods for doing importance sampling and Metropolis-Hastings sampling of a classical Bayesian network on a quantum computer.

  20. Adaptation to lactose in lactose malabsorbers - importance of the intestinal microflora

    Fondén, Rangne

    2001-01-01

    At high intakes most lactose in lactose malabsorbers will be fermented by the intestinal microflora to hydrogen and other fermentation products, as are all other low-molecular, non-absorbable, fermentable carbohydrates. By adaptation higher intakes of lactose could be tolerated partly due to a lower net formation of hydrogen. This shift in fermentation is at least partly caused by a change in the activities of the intestinal microflora. Keywords: Adaptation, hydrogen production, lactose malab...

  1. Improved importance sampling technique for efficient simulation of digital communication systems

    Lu, Dingqing; Yao, Kung

    1988-01-01

    A new, improved importance sampling (IIS) approach to simulation is considered. Some basic concepts of IS are introduced, and detailed evaluations of simulation estimation variances for Monte Carlo (MC) and IS simulations are given. The general results obtained from these evaluations are applied to the specific previously known conventional importance sampling (CIS) technique and the new IIS technique. The derivation for a linear system with no signal random memory is considered in some detail. For the CIS technique, the optimum input scaling parameter is found, while for the IIS technique, the optimum translation parameter is found. The results are generalized to a linear system with memory and signals. Specific numerical and simulation results are given which show the advantages of CIS over MC and IIS over CIS for simulations of digital communications systems.

  2. Performance evaluation of Bayesian decision feedback equalizer with M-PAM symbols using importance sampling simulation

    Chen, S.

    2002-01-01

    An importance sampling (IS) simulation method is presented for evaluating the lower-bound symbol error rate (SER) of the Bayesian decision feedback equalizer (DFE) with M-PAM symbols, under the assumption of correct decision feedback. By exploiting an asymptotic property of the Bayesian DFE, a design procedure is developed, which chooses appropriate bias vectors for the simulation density to ensure asymptotic efficiency (AE) of the IS simulation.

  3. An Importance Sampling Scheme on Dual Factor Graphs. I. Models in a Strong External Field

    Molkaraie, Mehdi

    2014-01-01

    We propose an importance sampling scheme to estimate the partition function of the two-dimensional ferromagnetic Ising model and the two-dimensional ferromagnetic q-state Potts model, both in the presence of an external magnetic field. The proposed scheme operates in the dual Forney factor graph and is capable of efficiently computing an estimate of the partition function under a wide range of model parameters. In particular, we consider models that are in a strong external magnetic field.

  4. Specific determination of clinical and toxicological important substances in biological samples by LC-MS

    The subject of this dissertation is the specific determination of clinically and toxicologically important substances in biological samples by LC-MS. Nicotine was determined in serum after application of nicotine patches and nicotine nasal spray with HPLC-ESI-MS. Cotinine was determined directly in urine with HPLC-ESI-MS. Short-acting anesthetics were determined in blood and cytostatics were determined in cerebrospinal fluid with HPLC-ESI-MS. (botek)

  5. A laser microdissection-based workflow for FFPE tissue microproteomics: Important considerations for small sample processing.

    Longuespée, Rémi; Alberts, Deborah; Pottier, Charles; Smargiasso, Nicolas; Mazzucchelli, Gabriel; Baiwir, Dominique; Kriegsmann, Mark; Herfs, Michael; Kriegsmann, Jörg; Delvenne, Philippe; De Pauw, Edwin

    2016-07-15

    Proteomic methods are today widely applied to formalin-fixed paraffin-embedded (FFPE) tissue samples for several applications in research, especially in molecular pathology. To date, there is an unmet need for the analysis of small tissue samples, such as for early cancerous lesions. Indeed, no method has yet been proposed for the reproducible processing of small FFPE tissue samples to allow biomarker discovery. In this work, we tested several procedures to process laser microdissected tissue pieces bearing less than 3000 cells. Combined with appropriate settings for liquid chromatography mass spectrometry-mass spectrometry (LC-MS/MS) analysis, a citric acid antigen retrieval (CAAR)-based procedure was established, allowing the identification of more than 1400 proteins from a single microdissected breast cancer tissue biopsy. This work demonstrates important considerations concerning the handling and processing of laser microdissected tissue samples of extremely limited size, in the process opening new perspectives in molecular pathology. A proof of the proposed method for biomarker discovery, with respect to these specific handling considerations, is illustrated using the differential proteomic analysis of invasive breast carcinoma of no special type and invasive lobular triple-negative breast cancer tissues. This work will be of utmost importance for early biomarker discovery or in support of matrix-assisted laser desorption/ionization (MALDI) imaging for microproteomics from small regions of interest. PMID:26690073

  6. Organ sample generator for expected treatment dose construction and adaptive inverse planning optimization

    Purpose: To create an organ sample generator (OSG) for expected treatment dose construction and adaptive inverse planning optimization. The OSG generates random samples of organs of interest from a distribution obeying the patient specific organ variation probability density function (PDF) during the course of adaptive radiotherapy. Methods: Principal component analysis (PCA) and a time-varying least-squares regression (LSR) method were used on patient specific geometric variations of organs of interest manifested on multiple daily volumetric images obtained during the treatment course. The construction of the OSG includes the determination of eigenvectors of the organ variation using PCA, and the determination of the corresponding coefficients using time-varying LSR. The coefficients can be either random variables or random functions of the elapsed treatment days depending on the characteristics of organ variation as a stationary or a nonstationary random process. The LSR method with time-varying weighting parameters was applied to the precollected daily volumetric images to determine the function form of the coefficients. Eleven head and neck (H&N) cancer patients with 30 daily cone beam CT images each were included in the evaluation of the OSG. The evaluation was performed using a total of 18 organs of interest, including 15 organs at risk and 3 targets. Results: Geometric variations of organs of interest during H&N cancer radiotherapy can be represented using the first 3-4 eigenvectors. These eigenvectors were variable during treatment, and need to be updated using new daily images obtained during the treatment course. The OSG generates random samples of organs of interest from the estimated organ variation PDF of the individual. The accuracy of the estimated PDF can be improved recursively using extra daily image feedback during the treatment course. The average deviations in the estimation of the mean and standard deviation of the organ variation PDF for H&N...

  7. Local adaptation of a bacterium is as important as its presence in structuring a natural microbial community.

    Gómez, Pedro; Paterson, Steve; De Meester, Luc; Liu, Xuan; Lenzi, Luca; Sharma, M D; McElroy, Kerensa; Buckling, Angus

    2016-01-01

    Local adaptation of a species can affect community composition, yet the importance of local adaptation compared with species presence per se is unknown. Here we determine how a compost bacterial community exposed to elevated temperature changes over 2 months as a result of the presence of a focal bacterium, Pseudomonas fluorescens SBW25, that had been pre-adapted or not to the compost for 48 days. The effect of local adaptation on community composition is as great as the effect of species presence per se, with these results robust to the presence of an additional strong selection pressure: an SBW25-specific virus. These findings suggest that evolution occurring over ecological time scales can be a key driver of the structure of natural microbial communities, particularly in situations where some species have an evolutionary head start following large perturbations, such as exposure to antibiotics or crop planting and harvesting. PMID:27501868

  8. Accelerating Markov chain Monte Carlo simulation by differential evolution with self-adaptive randomized subspace sampling

    Vrugt, Jasper A [Los Alamos National Laboratory]; Hyman, James M [Los Alamos National Laboratory]; Robinson, Bruce A [Los Alamos National Laboratory]; Higdon, Dave [Los Alamos National Laboratory]; Ter Braak, Cajo J F [NETHERLANDS]; Diks, Cees G H [UNIV OF AMSTERDAM]

    2008-01-01

    Markov chain Monte Carlo (MCMC) methods have found widespread use in many fields of study to estimate the average properties of complex systems, and for posterior inference in a Bayesian framework. Existing theory and experiments prove convergence of well constructed MCMC schemes to the appropriate limiting distribution under a variety of different conditions. In practice, however this convergence is often observed to be disturbingly slow. This is frequently caused by an inappropriate selection of the proposal distribution used to generate trial moves in the Markov Chain. Here we show that significant improvements to the efficiency of MCMC simulation can be made by using a self-adaptive Differential Evolution learning strategy within a population-based evolutionary framework. This scheme, entitled DiffeRential Evolution Adaptive Metropolis or DREAM, runs multiple different chains simultaneously for global exploration, and automatically tunes the scale and orientation of the proposal distribution in randomized subspaces during the search. Ergodicity of the algorithm is proved, and various examples involving nonlinearity, high-dimensionality, and multimodality show that DREAM is generally superior to other adaptive MCMC sampling approaches. The DREAM scheme significantly enhances the applicability of MCMC simulation to complex, multi-modal search problems.
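    The core proposal mechanism behind DREAM is the differential-evolution jump of DE-MC, in which a chain moves along the difference of two other randomly chosen chains. Below is a minimal sketch on a toy 2-D Gaussian target, without DREAM's randomized subspace sampling, crossover adaptation, or outlier handling:

```python
import numpy as np

rng = np.random.default_rng(3)
log_p = lambda x: -0.5 * np.sum(x ** 2)       # toy target: standard 2-D Gaussian

n_chains, d, n_iter = 10, 2, 5000
gamma = 2.38 / np.sqrt(2 * d)                 # standard DE jump scale
X = rng.normal(size=(n_chains, d))            # current states of all chains
chains = []

for _ in range(n_iter):
    for i in range(n_chains):
        # Propose along the difference of two other randomly chosen chains,
        # plus a small jitter to keep the chain irreducible.
        a, b = rng.choice([j for j in range(n_chains) if j != i],
                          size=2, replace=False)
        prop = X[i] + gamma * (X[a] - X[b]) + 1e-6 * rng.normal(size=d)
        if np.log(rng.random()) < log_p(prop) - log_p(X[i]):  # Metropolis accept
            X[i] = prop
    chains.append(X.copy())

draws = np.concatenate(chains[1000:])         # discard burn-in
print("posterior mean ~", draws.mean(axis=0).round(2))
```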

  9. A Surrogate-based Adaptive Sampling Approach for History Matching and Uncertainty Quantification

    Li, Weixuan; Zhang, Dongxiao; Lin, Guang

    2015-02-25

    A critical procedure in reservoir simulations is history matching (or data assimilation in a broader sense), which calibrates model parameters such that the simulation results are consistent with field measurements, and hence improves the credibility of the predictions given by the simulations. Often there exist non-unique combinations of parameter values that all yield the simulation results matching the measurements. For such ill-posed history matching problems, Bayesian theorem provides a theoretical foundation to represent different solutions and to quantify the uncertainty with the posterior PDF. Lacking an analytical solution in most situations, the posterior PDF may be characterized with a sample of realizations, each representing a possible scenario. A novel sampling algorithm is presented here for the Bayesian solutions to history matching problems. We aim to deal with two commonly encountered issues: 1) as a result of the nonlinear input-output relationship in a reservoir model, the posterior distribution could be in a complex form, such as multimodal, which violates the Gaussian assumption required by most of the commonly used data assimilation approaches; 2) a typical sampling method requires intensive model evaluations and hence may cause unaffordable computational cost. In the developed algorithm, we use a Gaussian mixture model as the proposal distribution in the sampling process, which is simple but also flexible to approximate non-Gaussian distributions and is particularly efficient when the posterior is multimodal. Also, a Gaussian process is utilized as a surrogate model to speed up the sampling process. Furthermore, an iterative scheme of adaptive surrogate refinement and re-sampling ensures sampling accuracy while keeping the computational cost at a minimum level. The developed approach is demonstrated with an illustrative example and shows its capability in handling the above-mentioned issues. Multimodal posterior of the history matching
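    The mixture-proposal idea in this record can be reduced to a few lines in one dimension: fit a Gaussian mixture to a weighted set of draws, then importance sample from the fitted mixture. The sketch below uses a toy bimodal density in place of a real posterior and skips the surrogate model and iterative refinement (all distributions and sample sizes are illustrative):

```python
import numpy as np
from scipy import stats
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(4)
# Toy bimodal "posterior" standing in for a multimodal history-matching target.
post = lambda x: 0.5 * stats.norm.pdf(x, -2, 0.5) + 0.5 * stats.norm.pdf(x, 2, 0.5)

# Seed draws from a broad uniform "prior", weight them, and resample.
seed = rng.uniform(-4.0, 4.0, 500)
w0 = post(seed) / (1.0 / 8.0)                  # target over uniform density
resampled = rng.choice(seed, 500, p=w0 / w0.sum())

# Fit a two-component Gaussian mixture as the adapted proposal.
gmm = GaussianMixture(n_components=2, random_state=0).fit(resampled.reshape(-1, 1))

# Importance sampling from the mixture proposal.
x = gmm.sample(20_000)[0].ravel()
q = np.exp(gmm.score_samples(x.reshape(-1, 1)))
w = post(x) / q
print("E[x] ~", float(np.average(x, weights=w)))   # ~0 for symmetric modes
```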

  10. Adaptive Kalman Filter Based on Adjustable Sampling Interval in Burst Detection for Water Distribution System

    Doo Yong Choi

    2016-04-01

    Full Text Available Rapid detection of bursts and leaks in water distribution systems (WDSs) can reduce the social and economic costs incurred through direct loss of water into the ground, additional energy demand for water supply, and service interruptions. Many real-time burst detection models have been developed in accordance with the use of supervisory control and data acquisition (SCADA) systems and the establishment of district meter areas (DMAs). Nonetheless, no consideration has been given to how frequently a flow meter measures and transmits data for predicting breaks and leaks in pipes. This paper analyzes the effect of sampling interval when an adaptive Kalman filter is used for detecting bursts in a WDS. A new sampling algorithm is presented that adjusts the sampling interval depending on the normalized residuals of flow after filtering. The proposed algorithm is applied to a virtual sinusoidal flow curve and real DMA flow data obtained from Jeongeup city in South Korea. The simulation results prove that the self-adjusting algorithm for determining the sampling interval is efficient and maintains reasonable accuracy in burst detection. The proposed sampling method has a significant potential for water utilities to build and operate real-time DMA monitoring systems combined with smart customer metering systems.
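    A minimal sketch of the mechanism described here, assuming a one-dimensional random-walk Kalman filter on flow and an invented back-off rule for the sampling interval (the thresholds and the simulated burst are illustrative, not Jeongeup data):

```python
import numpy as np

rng = np.random.default_rng(5)
Q, R = 0.5, 4.0                         # process and measurement noise variances
x_hat, P = 100.0, 1.0                   # initial flow estimate and variance
interval, MIN_DT, MAX_DT = 60, 15, 240  # sampling interval bounds, seconds

def kalman_step(x_hat, P, z):
    P = P + Q                           # predict (random-walk flow model)
    S = P + R                           # innovation variance
    resid = z - x_hat
    x_hat += (P / S) * resid            # update with Kalman gain P/S
    P *= 1 - P / S
    return x_hat, P, resid / np.sqrt(S)  # normalized innovation

for t in range(20):
    z = 100.0 + (25.0 if t > 12 else 0.0) + rng.normal(0, 2)  # burst after t=12
    x_hat, P, nu = kalman_step(x_hat, P, z)
    # Large normalized residual -> sample fast; otherwise relax the interval.
    interval = MIN_DT if abs(nu) > 2.0 else min(MAX_DT, int(interval * 1.2))
    print(f"t={t:2d}  flow={z:6.1f}  resid={nu:5.2f}  next sample in {interval:3d}s")
```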

  11. Geographic variation in the songs of neotropical singing mice: testing the relative importance of drift and local adaptation.

    Campbell, Polly; Pasch, Bret; Pino, Jorge L; Crino, Ondi L; Phillips, Molly; Phelps, Steven M

    2010-07-01

    Patterns of geographic variation in communication systems can provide insight into the processes that drive phenotypic evolution. Although work in birds, anurans, and insects demonstrates that acoustic signals are sensitive to diverse selective and stochastic forces, processes that shape variation in mammalian vocalizations are poorly understood. We quantified geographic variation in the advertisement songs of sister species of singing mice, montane rodents with a unique mode of vocal communication. We tested three hypotheses to explain spatial variation in the song of the lower altitude species, Scotinomys teguina: selection for species recognition in sympatry with congener, S. xerampelinus, acoustic adaptation to different environments, and stochastic divergence. Mice were sampled at seven sites in Costa Rica and Panamá; genetic distances were estimated from mitochondrial control region sequences, between-site differences in acoustic environment were estimated from climatic data. Acoustic, genetic and geographic distances were all highly correlated in S. teguina, suggesting that population differentiation in song is largely shaped by genetic drift. Contrasts between interspecific genetic-acoustic distances were significantly greater than expectations derived from intraspecific contrasts, indicating accelerated evolution of species-specific song. We propose that, although much intraspecific acoustic variation is effectively neutral, selection has been important in shaping species differences in song. PMID:20148958

  12. Role of importance of X-ray fluorescence analysis of forensic samples

    Full text: In the field of forensic science, it is very important to investigate the evidential samples obtained at various crime scenes. X-ray fluorescence (XRF) is used widely in forensic science [1]. Its main strength is its non-destructive nature, thus preserving evidence [2, 3]. In this paper, we report the application of XRF to examine evidence such as the purity of gold and silver jewelry (Indian ornaments), remnants of glass pieces, and paint chips recovered from crime scenes. The experimental measurements on these samples were made using an X-ray fluorescence spectrometer (LAB Center XRF-1800) procured from Shimadzu Scientific Instruments, USA. The results are explained in terms of quantitative/qualitative analysis of trace elements. (author)

  13. Cold adaptation in geographical populations of Drosophila melanogaster : phenotypic plasticity is more important than genetic variability

    Ayrinhac, A; Debat; Gibert, P; Kister, AG; Legout, H; Moreteau, B; Vergilino, R; David

    2004-01-01

    1. According to their geographical distribution, most Drosophila species may be classified as either temperate or tropical, and this pattern is assumed to reflect differences in their thermal adaptation, especially in their cold tolerance. We investigated cold tolerance in a global collection of D. melanogaster...

  14. Adaption of G-TAG Software for Validating Touch and Go Asteroid Sample Return Design Methodology

    Blackmore, Lars James C.; Acikmese, Behcet; Mandic, Milan

    2012-01-01

    A software tool is used to demonstrate the feasibility of Touch and Go (TAG) sampling for Asteroid Sample Return missions. TAG is a concept whereby a spacecraft is in contact with the surface of a small body, such as a comet or asteroid, for a few seconds or less before ascending to a safe location away from the small body. Previous work at JPL developed the G-TAG simulation tool, which provides a software environment for fast, multi-body simulations of the TAG event. G-TAG is described in Multibody Simulation Software Testbed for Small-Body Exploration and Sampling, (NPO-47196) NASA Tech Briefs, Vol. 35, No. 11 (November 2011), p.54. This current innovation adapts this tool to a mission that intends to return a sample from the surface of an asteroid. In order to demonstrate the feasibility of the TAG concept, the new software tool was used to generate extensive simulations that demonstrate the designed spacecraft meets key requirements. These requirements state that contact force and duration must be sufficient to ensure that enough material from the surface is collected in the brushwheel sampler (BWS), and that the spacecraft must survive the contact and must be able to recover and ascend to a safe position, and maintain velocity and orientation after the contact.

  15. An Unbiased Adaptive Sampling Algorithm for the Exploration of RNA Mutational Landscapes under Evolutionary Pressure

    Waldispühl, Jérôme; Ponty, Yann

    The analysis of the impact of mutations on folding properties of RNAs is essential to decipher principles driving molecular evolution and to design new molecules. We recently introduced an algorithm called RNAmutants which samples RNA sequence-structure maps in polynomial time and space. However, since the mutation probabilities depend on the free energy of the structures, RNAmutants is biased toward G+C-rich regions of the mutational landscape. In this paper we introduce an unbiased adaptive sampling algorithm that enables RNAmutants to sample regions of the mutational landscape poorly covered by previous techniques. We applied the method to sample mutations in complete RNA sequence-structure maps of sizes up to 40 nucleotides. Our results indicate that the G+C-content has a strong influence on the evolutionarily accessible structural ensembles. In particular, we show that low G+C-content favors the appearance of internal loops, while high G+C-content reduces the size of the evolutionarily accessible mutational landscapes.

  16. Evaluation of endoscopically obtained duodenal biopsy samples from cats and dogs in an adapter-modified Ussing chamber

    Ruhnke, Isabelle; DeBiasio, John V.; Suchodolski, Jan S.; Newman, Shelley; Musch, Mark W.; Steiner, Jörg M.

    2014-01-01

    This study was conducted to evaluate an adapter-modified Ussing chamber for assessment of transport physiology in endoscopically obtained duodenal biopsies from healthy cats and dogs, as well as dogs with chronic enteropathies. 17 duodenal biopsies from five cats and 51 duodenal biopsies from 13 dogs were obtained. Samples were transferred into an adapter-modified Ussing chamber and sequentially exposed to various absorbagogues and secretagogues. Overall, 78.6% of duodenal samples obtained fr...

  17. Component-adaptive up-sampling for inter layer interpolation in scalable video coding

    WANG Zhang; ZHANG JiXian; LI HaiTao

    2009-01-01

    Scalable video coding (SVC) is a newly emerging standard to be finalized as an extension of H.264/AVC. The most attractive characteristics of SVC are the inter-layer prediction techniques, such as Intra_BL mode. But in the current SVC scheme, a uniform up-sampling filter (UUSF) is employed to magnify all components of an image, which is very inefficient and results in a lot of redundant computational complexity. To overcome this, we propose an efficient component-adaptive up-sampling filter (CAUSF) for inter-layer interpolation. In CAUSF, one characteristic of the human visual system is considered, and different up-sampling filters are assigned to different components. In particular, the six-tap FIR filter used in UUSF is kept and assigned to the luminance component. But for chrominance components, a new four-tap FIR filter is used. Experimental results show that CAUSF maintains the performance of coded bit-rate and PSNR-Y without any noticeable loss, and provides significant reduction in computational complexity.

  18. Do women's voices provide cues of the likelihood of ovulation? The importance of sampling regime.

    Julia Fischer

    Full Text Available The human voice provides a rich source of information about individual attributes such as body size, developmental stability and emotional state. Moreover, there is evidence that female voice characteristics change across the menstrual cycle. A previous study reported that women speak with higher fundamental frequency (F0) in the high-fertility compared to the low-fertility phase. To gain further insights into the mechanisms underlying this variation in perceived attractiveness and the relationship between vocal quality and the timing of ovulation, we combined hormone measurements and acoustic analyses to characterize voice changes on a day-to-day basis throughout the menstrual cycle. Voice characteristics were measured from free speech as well as sustained vowels. In addition, we asked men to rate vocal attractiveness from selected samples. The free speech samples revealed marginally significant variation in F0 with an increase prior to and a distinct drop during ovulation. Overall variation throughout the cycle, however, precluded unequivocal identification of the period with the highest conception risk. The analysis of vowel samples revealed a significant increase in degree of unvoiceness and noise-to-harmonic ratio during menstruation, possibly related to an increase in tissue water content. Neither estrogen nor progestogen levels predicted the observed changes in acoustic characteristics. The perceptual experiments revealed a preference by males for voice samples recorded during the pre-ovulatory period compared to other periods in the cycle. While overall we confirm earlier findings in that women speak with a higher and more variable fundamental frequency just prior to ovulation, the present study highlights the importance of taking the full range of variation into account before drawing conclusions about the value of these cues for the detection of ovulation.

  19. PARALLEL ADAPTIVE MULTILEVEL SAMPLING ALGORITHMS FOR THE BAYESIAN ANALYSIS OF MATHEMATICAL MODELS

    Prudencio, Ernesto

    2012-01-01

    In recent years, Bayesian model updating techniques based on measured data have been applied to many engineering and applied science problems. At the same time, parallel computational platforms are becoming increasingly more powerful and are being used more frequently by the engineering and scientific communities. Bayesian techniques usually require the evaluation of multi-dimensional integrals related to the posterior probability density function (PDF) of uncertain model parameters. The fact that such integrals cannot be computed analytically motivates the research of stochastic simulation methods for sampling posterior PDFs. One such algorithm is the adaptive multilevel stochastic simulation algorithm (AMSSA). In this paper we discuss the parallelization of AMSSA, formulating the necessary load balancing step as a binary integer programming problem. We present a variety of results showing the effectiveness of load balancing on the overall performance of AMSSA in a parallel computational environment.

  20. Accelerating the convergence of replica exchange simulations using Gibbs sampling and adaptive temperature sets

    Vogel, Thomas

    2015-01-01

    We recently introduced a novel replica-exchange scheme in which an individual replica can sample from states encountered by other replicas at any previous time by way of a global configuration database, enabling the fast propagation of relevant states through the whole ensemble of replicas. This mechanism depends on the knowledge of global thermodynamic functions which are measured during the simulation and not coupled to the heat bath temperatures driving the individual simulations. Therefore, this setup also allows for a continuous adaptation of the temperature set. In this paper, we will review the new scheme and demonstrate its capability. The method is particularly useful for the fast and reliable estimation of the microcanonical temperature T(U) or, equivalently, of the density of states g(U) over a wide range of energies.
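
    For context, a conventional nearest-neighbour replica-exchange sweep can be sketched as follows (Python); the scheme reviewed in the paper additionally lets replicas draw configurations from a global database and adapts the temperature set, neither of which is shown here.

        import math, random

        def replica_exchange_sweep(betas, energies, configs):
            """One nearest-neighbour swap sweep. Exchanging the configurations
            of replicas i and i+1 is accepted with the standard Metropolis
            probability p = min(1, exp[(beta_i - beta_{i+1})(E_i - E_{i+1})])."""
            for i in range(len(betas) - 1):
                d = (betas[i] - betas[i + 1]) * (energies[i] - energies[i + 1])
                if random.random() < math.exp(min(0.0, d)):
                    configs[i], configs[i + 1] = configs[i + 1], configs[i]
                    energies[i], energies[i + 1] = energies[i + 1], energies[i]
            return energies, configs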

  1. An Energy Aware Adaptive Sampling Algorithm for Energy Harvesting WSN with Energy Hungry Sensors

    Srbinovski, Bruno; Magno, Michele; Edwards-Murphy, Fiona; Pakrashi, Vikram; Popovici, Emanuel

    2016-01-01

    Wireless sensor nodes have a limited power budget, though they are often expected to be functional in the field once deployed for extended periods of time. Therefore, minimization of energy consumption and energy harvesting technology in Wireless Sensor Networks (WSN) are key tools for maximizing network lifetime, and achieving self-sustainability. This paper proposes an energy aware Adaptive Sampling Algorithm (ASA) for WSN with power hungry sensors and harvesting capabilities, an energy management technique that can be implemented on any WSN platform with enough processing power to execute the proposed algorithm. An existing state-of-the-art ASA developed for wireless sensor networks with power hungry sensors is optimized and enhanced to adapt the sampling frequency according to the available energy of the node. The proposed algorithm is evaluated using two in-field testbeds that are supplied by two different energy harvesting sources (solar and wind). Simulation and comparison between the state-of-the-art ASA and the proposed energy aware ASA (EASA) in terms of energy durability are carried out using in-field measured harvested energy (using both wind and solar sources) and power hungry sensors (ultrasonic wind sensor and gas sensors). The simulation results demonstrate that using ASA in combination with an energy aware function on the nodes can drastically increase the lifetime of a WSN node and enable self-sustainability. In fact, the proposed EASA in conjunction with energy harvesting capability can lead towards perpetual WSN operation and significantly outperform the state-of-the-art ASA. PMID:27043559
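
    The abstract does not spell out the EASA update rule, so the sketch below (Python) only illustrates the general shape of an energy-aware sampling policy: shorten the sampling interval when stored energy plus expected harvest is healthy, stretch it when the budget is strained. All names, scalings and constants are hypothetical.

        def next_sampling_interval(base_interval, battery_level, harvest_rate,
                                   energy_per_sample, min_s=1.0, max_s=600.0):
            """Hypothetical energy-aware rule (not the EASA update itself):
            sample faster when the energy budget is healthy, slower otherwise.
            battery_level in [0, 1]; harvest_rate in J/s; energy_per_sample in J."""
            expected_harvest = harvest_rate * base_interval / energy_per_sample
            budget = battery_level + min(expected_harvest, 1.0)
            interval = base_interval / max(budget, 1e-3)
            return min(max(interval, min_s), max_s)

        # a node with a nearly full battery and solar input samples faster than
        # the 60 s baseline; a starved node backs off toward max_s:
        print(next_sampling_interval(60.0, 0.9, harvest_rate=0.05, energy_per_sample=1.5))
        print(next_sampling_interval(60.0, 0.05, harvest_rate=0.0, energy_per_sample=1.5))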

  3. Towards an Effective Importance Sampling in Monte Carlo Simulations of a System with a Complex Action

    Anagnostopoulos, K.; Azuma, T.; Nishimura, J.

    The sign problem is a notorious problem which occurs in Monte Carlo simulations of systems with a partition function whose integrand is not positive. One way to simulate such a system is to use the factorization method, in which one enforces sampling in the part of the configuration space which gives important contributions to the partition function. This is accomplished by using constraints on some appropriately chosen observables and minimizing the free energy associated with their joint distribution functions. These observables are maximally correlated with the complex phase. Observables not in this set essentially decouple from the phase and can be calculated without the sign problem in the corresponding "microcanonical" ensemble. These ideas are applied to a simple matrix model with a very strong sign problem, and the results are found to be consistent with analytic calculations using the Gaussian Expansion Method.
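
    The baseline that the factorization method improves upon is the standard reweighting identity <O> = <O e^{i*phase}>_0 / <e^{i*phase}>_0, where <.>_0 denotes the phase-quenched ensemble. A toy one-dimensional sketch (Python) shows both the identity and why it degrades, since the denominator shrinks exponentially as the sign problem gets stronger; the factorization method itself is not reproduced here.

        import numpy as np
        rng = np.random.default_rng(0)

        # Toy partition function Z = integral dx exp(-x**2/2 + i*a*x): the
        # integrand is not positive, so sample the phase-quenched density
        # proportional to exp(-x**2/2) and fold the phase into the observable.
        a = 2.0
        x = rng.normal(size=200_000)                  # phase-quenched ensemble
        phase = np.exp(1j * a * x)

        mean_x2 = (np.mean(x**2 * phase) / np.mean(phase)).real
        print(mean_x2, "exact:", 1 - a**2)            # <x**2> = 1 - a**2 here
        print("average phase:", np.mean(phase).real)  # ~exp(-a**2/2): severity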

  4. Estimation variance bounds of importance sampling simulations in digital communication systems

    Lu, D.; Yao, K.

    1991-01-01

    In practical applications of importance sampling (IS) simulation, two basic problems are encountered, that of determining the estimation variance and that of evaluating the proper IS parameters needed in the simulations. The authors derive new upper and lower bounds on the estimation variance which are applicable to IS techniques. The upper bound is simple to evaluate and may be minimized by the proper selection of the IS parameter. Thus, lower and upper bounds on the improvement ratio of various IS techniques relative to the direct Monte Carlo simulation are also available. These bounds are shown to be useful and computationally simple to obtain. Based on the proposed technique, one can readily find practical suboptimum IS parameters. Numerical results indicate that these bounding techniques are useful for IS simulations of linear and nonlinear communication systems with intersymbol interference in which bit error rate and IS estimation variances cannot be obtained readily using prior techniques.
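
    A minimal sketch (Python) of the quantities the paper bounds: an IS estimate of a small error probability, its empirical estimation variance, and, implicitly, the improvement ratio over direct Monte Carlo. The Gaussian tail stands in for a bit-error event, and shifting the sampling mean to the threshold is one common choice of the IS parameter; the paper's analytical bounds are not reproduced.

        import numpy as np
        rng = np.random.default_rng(1)

        t, n = 5.0, 100_000                     # "error" threshold, sample size
        x = rng.normal(size=n)                  # direct Monte Carlo: P(X > t)
        mc = (x > t).astype(float)

        y = rng.normal(loc=t, size=n)           # IS: shift the mean to the threshold
        w = np.exp(-t * y + t**2 / 2) * (y > t) # weight f(y)/g(y) on the error event

        for name, z in (("direct MC", mc), ("IS", w)):
            print(f"{name}: estimate {z.mean():.3e}, "
                  f"estimator variance {z.var(ddof=1) / n:.3e}")
        # exact P(X > 5) ~ 2.87e-7; direct MC usually records no hits at this n,
        # while the IS estimator's variance is finite and can be minimized in t.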

  5. Simulative Investigation on Spectral Efficiency of Unipolar Codes based OCDMA System using Importance Sampling Technique

    Farhat, A.; Menif, M.; Rezig, H.

    2013-09-01

    This paper analyses the spectral efficiency of an Optical Code Division Multiple Access (OCDMA) system using the Importance Sampling (IS) technique. We consider three configurations of the OCDMA system, namely Direct Sequence (DS), Spectral Amplitude Coding (SAC) and Fast Frequency Hopping (FFH), that exploit Fiber Bragg Grating (FBG) based encoders/decoders. We evaluate the spectral efficiency of the considered system by taking into consideration the effect of different families of unipolar codes for both coherent and incoherent sources. The results show that the spectral efficiency of the OCDMA system with a coherent source is higher than in the incoherent case. We also demonstrate that DS-OCDMA outperforms the other two configurations in terms of spectral efficiency in all conditions.

  6. The importance of including variability in climate change projections used for adaptation

    Sexton, David M. H.; Harris, Glen R.

    2015-10-01

    Our understanding of mankind's influence on the climate is largely based on computer simulations. Model output is typically averaged over several decades so that the anthropogenic climate change signal stands out from the largely unpredictable 'noise' of climate variability. Similar averaging periods (30-year) are used for regional climate projections to inform adaptation. According to two such projections, UKCIP02 and UKCP09, the UK will experience 'hotter drier summers and warmer wetter winters' in the future. This message is about a typical rather than any individual future season, and these projections should not be compared directly to observed weather, as this neglects the sizeable contribution from year-to-year climate variability. Therefore, despite the apparent contradiction with the messages, it is a fallacy to suggest that recent cold UK winters like 2009/2010 disprove human-made climate change. Nevertheless, such claims understandably cause public confusion and doubt. Here we include year-to-year variability to provide projections for individual seasons. This approach has two advantages. First, it allows fair comparisons with recent weather events, for instance showing that recent cold winters are within projected ranges. Second, it allows the projections to be expressed in terms of the extreme hot, cold, wet or dry seasons that impact society, providing a better idea of adaptation needs.

  7. Importance Sampling Variance Reduction for the Fokker-Planck Rarefied Gas Particle Method

    Collyer, Benjamin; Lockerby, Duncan

    2015-01-01

    Models and methods that are able to accurately and efficiently predict the flows of low-speed rarefied gases are in high demand, due to the increasing ability to manufacture devices at micro and nano scales. One such model and method is a Fokker-Planck approximation to the Boltzmann equation, which can be solved numerically by a stochastic particle method. The stochastic nature of this method leads to noisy estimates of the thermodynamic quantities one wishes to sample when the signal is small in comparison to the thermal velocity of the gas. Recently, Gorji et al. have proposed a method which is able to greatly reduce the variance of the estimators, by creating a correlated stochastic process which acts as a control variate for the noisy estimates. However, there are potential difficulties involved when the geometry of the problem is complex, as the method requires the density to be solved for independently. Importance sampling is a variance reduction technique that has already been shown to successfully redu...

  8. Importance sampling implemented in the code PRIZMA for deep penetration and detection problems in reactor physics

    At RFNC-VNIITF, the PRIZMA code, which has been developed for more than 30 years, is used to model radiation transport by the Monte Carlo method. The code implements individual and coupled tracking of neutrons, photons, electrons, positrons and ions in one-dimensional (1D), 2D or 3D geometry. Attendance estimators are used for tallying, i.e., estimators whose scores are only nonzero for particles which cross a region or surface of interest. Importance sampling is used to make deep penetration and detection calculations more effective. However, its application to reactor analysis proved to have its own peculiarities and required further development. The paper reviews the methods used for deep penetration and detection calculations by PRIZMA. It describes how these calculations differ when applied to reactor analysis and how we compute approximate importance functions and parameters for biased distributions. Methods to control the statistical weight of particles are also discussed. A number of test and applied calculations which were done for the purpose of verification are provided. They are shown to agree either with asymptotic solutions, if such exist, or with results of analog calculations or predictions by other codes. The applied calculations include the estimation of ex-core detector response from neutron sources arranged in the core, and the estimation of in-core detector response. (authors)
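
    The abstract mentions statistical weight control without giving PRIZMA's exact rules, so the sketch below (Python) shows the generic splitting / Russian-roulette weight window that Monte Carlo transport codes typically use in some form; both operations preserve the expected total weight, so the estimate stays unbiased.

        import random

        def weight_window(particles, w_low, w_high):
            """Generic splitting / Russian-roulette weight control: particles
            above the window are split into copies of equal weight, particles
            below it survive roulette with probability w/w_survival or are
            killed. (Illustrative; not PRIZMA's exact rule.)"""
            w_survival = (w_low + w_high) / 2.0
            out = []
            for state, w in particles:
                if w > w_high:                        # split into n copies
                    n = int(w / w_high) + 1
                    out += [(state, w / n)] * n
                elif w < w_low:                       # Russian roulette
                    if random.random() < w / w_survival:
                        out.append((state, w_survival))
                else:
                    out.append((state, w))
            return out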

  9. Unified Importance Sampling Schemes for Efficient Simulation of Outage Capacity over Generalized Fading Channels

    Ben Rached, Nadhir

    2015-11-13

    The outage capacity (OC) is among the most important performance metrics of communication systems operating over fading channels. Of interest in the present paper is the evaluation of the OC at the output of Equal Gain Combining (EGC) and Maximum Ratio Combining (MRC) receivers. In this case, the problem turns out to be that of computing the Cumulative Distribution Function (CDF) of a sum of independent random variables. Since finding a closed-form expression for the CDF of the sum distribution is out of reach for a wide class of commonly used distributions, methods based on Monte Carlo (MC) simulations take pride of place. In order to allow for the estimation of the operating range of small outage probabilities, it is of paramount importance to develop fast and efficient estimation methods, as naive MC simulations would require high computational complexity. Along these lines, we propose in this work two unified, yet efficient, hazard rate twisting Importance Sampling (IS) based approaches that efficiently estimate the OC of MRC or EGC diversity techniques over generalized independent fading channels. The first estimator is shown to possess the asymptotic optimality criterion and applies to arbitrary fading models, whereas the second one achieves the well-desired bounded relative error property for the majority of the well-known fading variates. Moreover, the second estimator is shown to achieve the asymptotic optimality property under the particular Log-normal environment. Some selected simulation results are finally provided in order to illustrate the substantial computational gain achieved by the proposed IS schemes over naive MC simulations.
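
    A simplified cousin of the proposed estimators (Python): to estimate the left-tail probability P(X1 + ... + Xn <= gamma) for i.i.d. Exp(1) channel gains, exponentially tilt each gain by exp(-theta*x), i.e. sample Exp(1 + theta), and correct with the likelihood ratio. Plain exponential twisting is used instead of the paper's hazard rate twisting for brevity, and the exponential model stands in for generalized fading variates.

        import math
        import numpy as np
        rng = np.random.default_rng(2)

        # Outage-style problem: P(sum of n i.i.d. Exp(1) gains <= gamma), with
        # gamma far in the left tail of the sum distribution.
        n, gamma, N = 8, 0.5, 100_000
        theta = n / gamma - 1.0          # twist so the tilted mean sum equals gamma

        x = rng.exponential(1.0 / (1.0 + theta), size=(N, n))
        s = x.sum(axis=1)
        # likelihood ratio prod f(x)/f_theta(x) = e^{theta*s}/(1+theta)^n on the event
        w = np.exp(theta * s - n * math.log(1.0 + theta)) * (s <= gamma)

        exact = 1.0 - math.exp(-gamma) * sum(gamma**k / math.factorial(k)
                                             for k in range(n))
        print(f"IS estimate {w.mean():.3e} +/- {w.std(ddof=1)/math.sqrt(N):.1e}, "
              f"exact {exact:.3e}")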

  11. Quantitative assessment of the importance of phenotypic plasticity in adaptation to climate change in wild bird populations.

    Vedder, Oscar; Bouwhuis, Sandra; Sheldon, Ben C

    2013-07-01

    Predictions about the fate of species or populations under climate change scenarios typically neglect adaptive evolution and phenotypic plasticity, the two major mechanisms by which organisms can adapt to changing local conditions. As a consequence, we have little understanding of the scope for organisms to track changing environments by in situ adaptation. Here, we use a detailed individual-specific long-term population study of great tits (Parus major) breeding in Wytham Woods, Oxford, UK to parameterise a mechanistic model and thus directly estimate the rate of environmental change to which in situ adaptation is possible. Using the effect of changes in early spring temperature on temporal synchrony between birds and a critical food resource, we focus in particular on the contribution of phenotypic plasticity to population persistence. Despite using conservative estimates for evolutionary and reproductive potential, our results suggest little risk of population extinction under projected local temperature change; however, this conclusion relies heavily on the extent to which phenotypic plasticity tracks the changing environment. Extrapolating the model to a broad range of life histories in birds suggests that the importance of phenotypic plasticity for adjustment to projected rates of temperature change increases with slower life histories, owing to lower evolutionary potential. Understanding the determinants and constraints on phenotypic plasticity in natural populations is thus crucial for characterising the risks that rapidly changing environments pose for the persistence of such populations. PMID:23874152

  12. Adapt

    Bargatze, L. F.

    2015-12-01

    Active Data Archive Product Tracking (ADAPT) is a collection of software routines that permits one to generate XML metadata files to describe and register data products in support of the NASA Heliophysics Virtual Observatory VxO effort. ADAPT is also a philosophy. The ADAPT concept is to use any and all available metadata associated with scientific data to produce XML metadata descriptions in a consistent, uniform, and organized fashion to provide blanket access to the full complement of data stored on a targeted data server. In this poster, we present an application of ADAPT to describe all of the data products that are stored by using the Common Data File (CDF) format served out by the CDAWEB and SPDF data servers hosted at the NASA Goddard Space Flight Center. These data servers are the primary repositories for NASA Heliophysics data. For this purpose, the ADAPT routines have been used to generate data resource descriptions by using an XML schema named Space Physics Archive, Search, and Extract (SPASE). SPASE is the designated standard for documenting Heliophysics data products, as adopted by the Heliophysics Data and Model Consortium. The set of SPASE XML resource descriptions produced by ADAPT includes high-level descriptions of numerical data products, display data products, or catalogs and also includes low-level "Granule" descriptions. A SPASE Granule is effectively a universal access metadata resource; a Granule associates an individual data file (e.g. a CDF file) with a "parent" high-level data resource description, assigns a resource identifier to the file, and lists the corresponding access URL(s). The CDAWEB and SPDF file systems were queried to provide the input required by the ADAPT software to create an initial set of SPASE metadata resource descriptions. Then, the CDAWEB and SPDF data repositories were queried subsequently on a nightly basis and the CDF file lists were checked for any changes such as the occurrence of new, modified, or deleted files.

  13. Cortisol Secretion and Functional Disabilities in Old Age: Importance of Using Adaptive Control Strategies

    Wrosch, Carsten; Miller, Gregory E.; Schulz, Richard

    2009-01-01

    Objectives To examine whether the use of health-related control strategies moderates the association between elevated diurnal cortisol secretion and increases in older adults’ functional disabilities. Methods Functional disabilities of 164 older adults were assessed over 4 years by measuring participants’ problems with performing activities of daily living. The main predictors included baseline levels of diurnal cortisol secretion and control strategies used to manage physical health threats. Results A large increase in functional disabilities was observed among participants who secreted elevated baseline levels of cortisol and did not use health-related control strategies. By contrast, high cortisol level was not associated with increases in functional disabilities among participants who reported using these control strategies. Among participants with low cortisol level, there was a relatively smaller increase in functional disabilities over time, and the use of control strategies was not significantly associated with changes in functional disabilities. Conclusions The findings suggest that high cortisol level is associated with an increase in older adults’ functional disabilities, but only if older adults do not engage in adaptive control strategies. PMID:19875635

  14. Estimation of failure probabilities of linear dynamic systems by importance sampling

    Anna Ivanova Olsen; Arvid Naess

    2006-08-01

    An iterative method for estimating the failure probability for certain time-variant reliability problems has been developed. In the paper, the focus is on the displacement response of a linear oscillator driven by white noise. Failure is then assumed to occur when the displacement response exceeds a critical threshold. The iteration procedure is a two-step method. On the first iteration, a simple control function promoting failure is constructed using the design point weighting principle. After time discretization, two points are chosen to construct a compound deterministic control function. It is based on the time point when the first maximum of the homogeneous solution has occurred and on the point at the end of the considered time interval. An importance sampling technique is used in order to estimate the failure probability functional on a set of initial values of state space variables and time. On the second iteration, the concept of optimal control function can be implemented to construct a Markov control which allows much better accuracy in the failure probability estimate than the simple control function. On both iterations, the concept of changing the probability measure by the Girsanov transformation is utilized. As a result the CPU time is substantially reduced compared with the crude Monte Carlo procedure.
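
    The measure-change idea can be sketched as follows (Python): simulate the oscillator with an extra drift u added to the driving noise and weight each path by the Girsanov likelihood ratio; because the likelihood-ratio process is a martingale, stopping it at the first threshold crossing keeps the estimator unbiased. The constant drift and all parameter values are illustrative stand-ins for the paper's design-point and Markov control constructions.

        import numpy as np
        rng = np.random.default_rng(3)

        w0, zeta, sig = 1.0, 0.1, 1.0        # oscillator: x'' + 2*zeta*w0*x' + w0**2*x = sig*noise
        b, T, dt, N = 3.0, 5.0, 0.01, 5_000  # threshold, horizon, step, sample paths
        u = 0.8                              # constant control drift (illustrative)
        steps = int(T / dt)

        vals = np.zeros(N)
        for p in range(N):
            x = v = log_w = 0.0
            for _ in range(steps):
                xi = rng.normal()                      # N(0,1) increment under the IS measure
                dW = u * dt + np.sqrt(dt) * xi         # drifted driving noise
                v += (-2*zeta*w0*v - w0**2*x) * dt + sig * dW
                x += v * dt
                log_w += -u * np.sqrt(dt) * xi - 0.5 * u**2 * dt   # log dP/dQ
                if x > b:                              # failure: threshold crossed
                    vals[p] = np.exp(log_w)
                    break

        print(f"P(failure) ~ {vals.mean():.3e} +/- {vals.std(ddof=1)/np.sqrt(N):.1e}")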

  15. 40 CFR 80.1348 - What gasoline sample retention requirements apply to refiners and importers?

    2010-07-01

    Title 40 (Protection of Environment), vol. 16, 2010-07-01: Environmental Protection Agency (continued), Air Programs (continued), Regulation of Fuels and Fuel Additives, Gasoline Benzene Sampling, Testing and Retention Requirements, § 80.1348, What gasoline sample retention requirements apply to refiners and importers?

  18. Physical activity: an important adaptative mechanism for body-weight control.

    Finelli, Carmine; Gioia, Saverio; La Sala, Nicolina

    2012-01-01

    We review the current concepts about energy expenditure and evaluate physical activity (PhA) in the context of this knowledge and the available literature. Regular PhA is correlated with low body weight and low body fat mass. The negative fat balance is probably secondary to this negative energy balance. Nonexercise activity thermogenesis (NEAT) and physical activity, which are crucial for weight control, may be important in the physiology of weight change. An intriguing doubt that remains unresolved is whether changes in nutrient intake or body composition secondarily affect the spontaneous physical activity. PMID:24533208

  19. Massively parallel sampling of lattice proteins reveals foundations of thermal adaptation

    Venev, Sergey V.; Zeldovich, Konstantin B.

    2015-08-01

    Evolution of proteins in bacteria and archaea living in different conditions leads to significant correlations between amino acid usage and environmental temperature. The origins of these correlations are poorly understood, and an important question of protein theory, physics-based prediction of the types of amino acids overrepresented in highly thermostable proteins, remains largely unsolved. Here, we extend the random energy model of protein folding by weighting the interaction energies of amino acids by their frequencies in protein sequences and predict the energy gap of proteins designed to fold well at elevated temperatures. To test the model, we present a novel scalable algorithm for simultaneous energy calculation for many sequences in many structures, targeting massively parallel computing architectures such as graphics processing units (GPUs). The energy calculation is performed by multiplying two matrices, one representing the complete set of sequences, and the other describing the contact maps of all structural templates. An implementation of the algorithm for the CUDA platform is available at http://www.github.com/kzeldovich/galeprot and calculates protein folding energies over 250 times faster than a single central processing unit. Analysis of amino acid usage in 64-mer cubic lattice proteins designed to fold well at different temperatures demonstrates an excellent agreement between theoretical and simulated values of the energy gap. The theoretical predictions of temperature trends of amino acid frequencies are significantly correlated with bioinformatics data on 191 bacteria and archaea, and highlight protein folding constraints as a fundamental selection pressure during thermal adaptation in biological evolution.
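
    The matrix-multiply formulation can be sketched in a few lines of NumPy: one matrix holds, per sequence, the vectorized table of pairwise interaction energies at every position pair; the other holds the vectorized contact maps of the structural templates; their product yields all sequence-structure folding energies at once. Random interaction parameters and random contact maps stand in for Miyazawa-Jernigan-type energies and for the lattice structures of the paper (a 27-mer with about 28 contacts is used for brevity instead of the 64-mer).

        import numpy as np
        rng = np.random.default_rng(4)

        L, n_seq, n_struct, A = 27, 1000, 50, 20       # 3x3x3 lattice, 20 amino acids
        e = rng.normal(size=(A, A)); e = (e + e.T) / 2 # stand-in contact energies
        seqs = rng.integers(A, size=(n_seq, L))

        # Random symmetric contact maps stand in for lattice-structure templates.
        maps = np.zeros((n_struct, L, L))
        for m in range(n_struct):
            for _ in range(28):                  # ~28 contacts per 27-mer (duplicates possible)
                i, j = rng.integers(L, size=2)
                if i != j:
                    maps[m, i, j] = maps[m, j, i] = 0.5   # 0.5: each pair appears twice

        S = e[seqs[:, :, None], seqs[:, None, :]].reshape(n_seq, L * L)  # pair energies
        C = maps.reshape(n_struct, L * L).T                              # vectorized maps
        E = S @ C            # E[n, m] = folding energy of sequence n in structure m
        print(E.shape)       # (1000, 50): one matrix product for all pairs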

  20. The Importance of Pressure Sampling Frequency in Models for Determination of Critical Wave Loadings on Monolithic Structures

    Burcharth, Hans F.; Andersen, Thomas Lykke; Meinert, Palle

    2008-01-01

    This paper discusses the influence of wave load sampling frequency on calculated sliding distance in an overall stability analysis of a monolithic caisson. It is demonstrated by a specific example of caisson design that for this kind of analyses the sampling frequency in a small scale model could...

  1. Indigenizing or Adapting? Importing Buddhism into a Settler-colonial Society

    Sally McAra

    2015-02-01

    In this paper I problematize the phrase "indigenization of Buddhism" (Spuler 2003, cf. Baumann 1997) through an investigation of a Buddhist project in a settler-colonial society. An international organization called the Foundation for the Preservation of the Mahayana Tradition (FPMT) is constructing a forty-five-meter-high stupa in rural Australia with the intention "to provide a refuge of peace and serenity for all." In 2003, a woman of Aboriginal descent met with the stupa developers to express her concern about the project. While her complaint does not represent local Aboriginal views about the stupa (other Aboriginal groups expressed support for it), it illustrates how in settler-colonial societies, Buddhist cultural imports that mark the land can have unexpected implications for indigenous people. This paper offers a glimpse of the multi-layered power relations that form the often invisible backdrop to the establishment of Buddhism in settler-colonial societies and suggests that we need to find terms other than "indigenization" when analyzing this.

  2. Eco-Physiologic studies an important tool for the adaptation of forestry to global changes.

    HASAN CANI

    2014-06-01

    Forests are the dominant land use in Albania, occupying almost 1.5 million hectares [11], but ca. 70% of the forest area consists of coppices and shrub forests, as a result of unsustainable practices, intensive cutting and overgrazing. Forest ecosystems serve many ecological roles, including regulation of the planet's carbon and water cycles. Forests are also important components of economic systems. Research in the Forest Ecophysiology studies at the Faculty of Forestry Sciences is intended to produce biological knowledge that can be used to better manage forest resources for sustainable production of economic and non-economic values, and aims to improve the understanding of past and current dynamics of Mediterranean and temperate forests. The overarching goal is to quantify the influence of genetics, climate, environmental stresses, and forest management inputs on forest productivity and carbon sequestration, and to understand the physiological mechanisms underlying these responses. Process-based models open the way to useful predictions of the future growth rate of forests and provide a means of assessing the probable effects of variations in climate and management on forest productivity. As such they have the potential to overcome the limitations of conventional forest growth and yield models. This paper discusses the basic physiological processes that determine the growth of plants, the way they are affected by environmental factors, and how we can improve processes that are well understood, such as growth from leaf to stand level and productivity. The study tries to show a clear relationship between temperature and water relations and other factors affecting forest plant germination and growth that are often looked at separately. This integrated approach will provide the most comprehensive source for process-based modelling, which is valuable to ecologists, plant physiologists, forest planners and environmental scientists [10]. Actually the

  3. Importance of sampling in relation to the gamma spectroscopic analysis of NORM and TENORM material

    This paper describes how, over the past 25 years, low-background gamma spectroscopic analysis of NORM and TENORM materials has been developed into a state-of-the-art semi-automatic gamma analysis system. The developments were initiated in the early 1980s in order to be able to measure low specific activities in fly ash samples. They involved modifications and improvements of commercially available hardware and auxiliary equipment, and the improvement and development of analyzing, correction and processing software, up to semi-automatic reporting of the analysis results. This effort has led to detection limits of 238U: 3 Bq/kg, 235U: 0.3 Bq/kg, 226Ra: 5 Bq/kg, 210Pb: 30 Bq/kg, 40K: 60 Bq/kg, with a measuring time of 70,000 s using a specially tuned gamma spectroscopy system for NORM and TENORM materials. These low detection limits show the need to set up representative sampling procedures for NORM and TENORM materials. It is not possible to define a sampling procedure that would be valid for all types of sampling. It is therefore advised that, where sampling is expected to be performed at regular times, a sampling procedure for the materials being dealt with be set up and validated. The procedure has to be based on an existing national or international standard. (author)

  4. Cas9 function and host genome sampling in Type II-A CRISPR–Cas adaptation

    Wei, Yunzhou; Terns, Rebecca M.; Terns, Michael P.

    2015-01-01

    Wei et al. found that Cas9, previously identified as the nuclease responsible for ultimate invader destruction, is also essential for adaptation in Streptococcus thermophilus. Cas9 nuclease activity is dispensable for adaptation. Wei et al. also revealed extensive, unbiased acquisition of the self-targeting host genome sequence by the CRISPR–Cas system that is masked in the presence of active target destruction.

  5. The control of radioactivity in the imported food and other samples in Republic of Serbia for period 1990-1994

    The results of measurements of the 137Cs activities in imported food and other samples (1663) in Serbia for the period 1990-1994 are presented. The 137Cs activities in the majority of samples are in the interval 0.2-0.3 Bq/kg, and are in agreement with the results of the radioactivity monitoring program in the environment in Serbia. (author)

  6. 40 CFR 80.583 - What alternative sampling and testing requirements apply to importers who transport motor vehicle...

    2010-07-01

    ... applicable. (b) Quality assurance program. The importer must conduct a quality assurance program, as specified in this paragraph (b), for each truck or rail car loading terminal. (1) Quality assurance samples... an independent laboratory, and the terminal operator must not know in advance when samples are to...

  7. A whole-path importance-sampling scheme for Feynman path integral calculations of absolute partition functions and free energies.

    Mielke, Steven L; Truhlar, Donald G

    2016-01-21

    Using Feynman path integrals, a molecular partition function can be written as a double integral with the inner integral involving all closed paths centered at a given molecular configuration, and the outer integral involving all possible molecular configurations. In previous work employing Monte Carlo methods to evaluate such partition functions, we presented schemes for importance sampling and stratification in the molecular configurations that constitute the path centroids, but we relied on free-particle paths for sampling the path integrals. At low temperatures, the path sampling is expensive because the paths can travel far from the centroid configuration. We now present a scheme for importance sampling of whole Feynman paths based on harmonic information from an instantaneous normal mode calculation at the centroid configuration, which we refer to as harmonically guided whole-path importance sampling (WPIS). We obtain paths conforming to our chosen importance function by rejection sampling from a distribution of free-particle paths. Sample calculations on CH4 demonstrate that at a temperature of 200 K, about 99.9% of the free-particle paths can be rejected without integration, and at 300 K, about 98% can be rejected. We also show that it is typically possible to reduce the overhead associated with the WPIS scheme by sampling the paths using a significantly lower-order path discretization than that which is needed to converge the partition function. PMID:26801023
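
    A heavily simplified picture of the rejection step (Python): closed free-particle-style paths around a fixed centroid are drawn from Gaussian Fourier modes, and each path is accepted with a probability given by a harmonic weight, which here plays the role of the instantaneous-normal-mode importance function. Mode variances, stiffness and units are all illustrative; because the weight is bounded by one, accepting with that probability is an exact rejection sampler for the reweighted path distribution.

        import numpy as np
        rng = np.random.default_rng(5)

        P, c = 64, 0.8     # beads per path, harmonic stiffness (illustrative units)

        def free_path(P):
            """Closed free-particle-style path around centroid 0, assembled from
            Gaussian Fourier modes with amplitude ~ 1/m (illustrative units)."""
            k = np.arange(P)
            x = np.zeros(P)
            for m in range(1, P // 2 + 1):
                x += (rng.normal() * np.cos(2 * np.pi * m * k / P) +
                      rng.normal() * np.sin(2 * np.pi * m * k / P)) / m
            return x

        def sample_weighted_path():
            """Rejection sampling: the target is p_free(path) * g(path) with the
            harmonic weight g = exp(-c * mean((x - centroid)**2)) <= 1, so drawing
            from p_free and accepting with probability g is exact."""
            while True:
                x = free_path(P)
                if rng.random() < np.exp(-c * np.mean(x**2)):
                    return x

        paths = [sample_weighted_path() for _ in range(200)]
        print(np.mean([np.mean(p**2) for p in paths]))  # pulled in vs. free paths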

  9. Adaptation of G-TAG Software for Validating Touch-and-Go Comet Surface Sampling Design Methodology

    Mandic, Milan; Acikmese, Behcet; Blackmore, Lars

    2011-01-01

    The G-TAG software tool was developed under the R&TD on Integrated Autonomous Guidance, Navigation, and Control for Comet Sample Return, and represents a novel, multi-body dynamics simulation software tool for studying TAG sampling. The G-TAG multi-body simulation tool provides a simulation environment in which a Touch-and-Go (TAG) sampling event can be extensively tested. TAG sampling requires the spacecraft to descend to the surface, contact the surface with a sampling collection device, and then to ascend to a safe altitude. The TAG event lasts only a few seconds but is mission-critical with potentially high risk. Consequently, there is a need for the TAG event to be well characterized and studied by simulation and analysis in order for the proposal teams to converge on a reliable spacecraft design. This adaptation of the G-TAG tool was developed to support the Comet Odyssey proposal effort, and is specifically focused to address comet sample return missions. In this application, the spacecraft descends to and samples from the surface of a comet. Performance of the spacecraft during TAG is assessed based on survivability and sample collection performance. For the adaptation of the G-TAG simulation tool to comet scenarios, models are developed that accurately describe the properties of the spacecraft, approach trajectories, and descent velocities, as well as the models of the external forces and torques acting on the spacecraft. The adapted models of the spacecraft, descent profiles, and external sampling forces/torques were more sophisticated and customized for comets than those available in the basic G-TAG simulation tool. Scenarios implemented include the study of variations in requirements, spacecraft design (size, locations, etc. of the spacecraft components), and the environment (surface properties, slope, disturbances, etc.). The simulations, along with their visual representations using G-View, contributed to the Comet Odyssey New Frontiers proposal

  10. Importance of covariance components of waveform data with high sampling rate in seismic source inversion

    Yagi, Y.; Fukahata, Y.

    2007-12-01

    As computer technology has advanced, it has become possible to observe seismic waves at higher sampling rates and to perform inversions for larger data sets. In general, waveform data with a higher sampling rate are needed to obtain a finer image of seismic source processes. This raises the question of whether there is any limit to the sampling rate useful in waveform inversion. In traditional seismic source inversion, covariance components of sampled waveform data have commonly been neglected. In fact, however, observed waveform data are not completely independent of each other, at least in the time domain, because they are always affected by anelastic attenuation as seismic waves propagate through the Earth. In this study, we have developed a method of seismic source inversion that takes the data covariance into account, and applied it to teleseismic P-wave data of the 2003 Boumerdes-Zemmouri, Algeria earthquake. From a comparison of final slip distributions inverted with the new formulation and the traditional formulation, we found that the effect of covariance components is crucial for data sets with higher sampling rates (≥ 5 Hz). For higher sampling rates, the slip distributions from the new formulation remain stable, whereas those from the traditional formulation tend to concentrate into small patches because the information content of the observed data is overestimated. Our result indicates that the anelastic attenuation of the Earth places a limit on the resolution of inverted seismic source models. It has been pointed out that seismic source models obtained from waveform data analyses are quite different from one another. One possible reason for the discrepancy is the neglect of covariance components. The new formulation should be useful for obtaining a standard seismic source model.
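
    The effect can be demonstrated with a toy generalized-least-squares inversion (Python/NumPy): when neighbouring samples share correlated noise, for instance through anelastic attenuation, the covariance-aware estimate m = (G^T C^-1 G)^-1 G^T C^-1 d downweights the redundant information that ordinary least squares counts at face value. The exponential covariance model and all numbers are assumptions for illustration.

        import numpy as np
        rng = np.random.default_rng(6)

        # Toy linear inversion d = G m + noise with temporally correlated noise.
        n, dt, tau = 200, 0.05, 0.3                      # 20 Hz sampling, 0.3 s memory
        t = np.arange(n) * dt
        G = np.column_stack([np.sin(t), np.cos(t), t])   # arbitrary design matrix
        m_true = np.array([1.0, -0.5, 0.2])

        C = 0.1**2 * np.exp(-np.abs(t[:, None] - t[None, :]) / tau)  # assumed model
        d = G @ m_true + rng.multivariate_normal(np.zeros(n), C)

        Ci = np.linalg.inv(C)
        m_gls = np.linalg.solve(G.T @ Ci @ G, G.T @ Ci @ d)   # covariance-aware
        m_ols = np.linalg.lstsq(G, d, rcond=None)[0]          # covariance ignored
        print("GLS:", m_gls, "OLS:", m_ols)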

  11. On the importance of sampling variance to investigations of temporal variation in animal population size

    Link, W.A.; Nichols, J.D.

    1994-01-01

    Our purpose here is to emphasize the need to properly deal with sampling variance when studying population variability and to present a means of doing so. We present an estimator for temporal variance of population size for the general case in which there are both sampling variances and covariances associated with estimates of population size. We illustrate the estimation approach with a series of population size estimates for black-capped chickadees (Parus atricapillus) wintering in a Connecticut study area and with a series of population size estimates for breeding populations of ducks in southwestern Manitoba.
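
    A method-of-moments sketch of the idea (Python): the sample variance of the yearly point estimates mixes true temporal variation with sampling noise, so an estimate of the average sampling variance is subtracted out. The paper's estimator additionally handles sampling covariances between estimates, which this sketch omits.

        import numpy as np

        def temporal_variance(estimates, sampling_vars):
            """Method-of-moments sketch: the variance of the point estimates
            combines true temporal variation with sampling noise, so subtract
            the mean sampling variance (covariance terms omitted here)."""
            estimates = np.asarray(estimates, dtype=float)
            total = estimates.var(ddof=1)
            return max(total - np.mean(sampling_vars), 0.0)

        # yearly population estimates and their squared standard errors:
        print(temporal_variance([310, 270, 345, 290], [400.0, 380.0, 450.0, 360.0]))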

  12. Sampling procedure in a willow plantation for chemical elements important for biomass combustion quality

    Liu, Na; Nielsen, Henrik Kofoed; Jørgensen, Uffe;

    2015-01-01

    Willow (Salix spp.) is expected to contribute significantly to the woody bioenergy system in the future, so more information on how to sample the quality of the willow biomass is needed. The objectives of this study were to investigate the spatial variation of elements within shoots of a willow...

  13. 40 CFR 80.335 - What gasoline sample retention requirements apply to refiners and importers?

    2010-07-01

    … certify that the procedures meet the requirements of the ASTM procedures required under 40 CFR 80.330. … Title 40 (Protection of Environment), vol. 16, 2010-07-01: Environmental Protection Agency (continued), Air Programs (continued), Regulation of Fuels and Fuel Additives, Gasoline …, § 80.335, What gasoline sample retention requirements apply to refiners and importers?

  14. The Importance of Cultural and Gastronomic Tourism in Local Economic Development: Zile Sample

    Mehmet Kocaman; Emel Memis Kocaman

    2014-01-01

    More rational resource allocation in Turkey has recently brought the principle of optimality to the fore in investment planning, and many rural areas have been negatively affected as a result. Alternative tourism accordingly provides important opportunities for rural regions. People living in these regions have come to attach importance to the local tangible and intangible cultural assets present in their environment and to gastronomic products consisting of regional tastes. ...

  15. 19 CFR 19.8 - Examination of goods by importer; sampling; repacking; examination of merchandise by prospective...

    2010-04-01

    Title 19 (Customs Duties), § 19.8: Examination of goods by importer; sampling; repacking; examination of merchandise by prospective purchasers. Importers … conduct of Customs business and no danger to the revenue, prospective purchaser may be permitted to …

  16. Sample preparation and biomass determination of SRF model mixture using cryogenic milling and the adapted balance method

    Schnöller, Johannes, E-mail: johannes.schnoeller@chello.at; Aschenbrenner, Philipp; Hahn, Manuel; Fellner, Johann; Rechberger, Helmut

    2014-11-15

    Highlights: • An alternative sample comminution procedure for SRF is tested. • Proof of principle is shown on a SRF model mixture. • The biogenic content of the SRF is analyzed with the adapted balance method. • The novel method combines combustion analysis and a data reconciliation algorithm. • Factors for the variance of the analysis results are statistically quantified. - Abstract: The biogenic fraction of a simple solid recovered fuel (SRF) mixture (80 wt% printer paper/20 wt% high density polyethylene) is analyzed with the in-house developed adapted balance method (aBM). This fairly new approach is a combination of combustion elemental analysis (CHNS) and a data reconciliation algorithm based on successive linearisation for evaluation of the analysis results. This method shows a great potential as an alternative way to determine the biomass content in SRF. However, the employed analytical technique (CHNS elemental analysis) restricts the probed sample mass to low amounts in the range of a few hundred milligrams. This requires sample comminution to small grain sizes (<200 μm) to generate representative SRF specimen. This is not easily accomplished for certain material mixtures (e.g. SRF with rubber content) by conventional means of sample size reduction. This paper presents a proof of principle investigation of the sample preparation and analysis of an SRF model mixture with the use of cryogenic impact milling (final sample comminution) and the adapted balance method (determination of biomass content). The so derived sample preparation methodology (cutting mills and cryogenic impact milling) shows a better performance in accuracy and precision for the determination of the biomass content than one solely based on cutting mills. The results for the determination of the biogenic fraction are within 1–5% of the data obtained by the reference methods, selective dissolution method (SDM) and 14C-method (14C-M)

  17. Fast-adapting mechanoreceptors are important for force control in precision grip but not for sensorimotor memory.

    Park, Susanna B; Davare, Marco; Falla, Marika; Kennedy, William R; Selim, Mona M; Wendelschafer-Crabb, Gwen; Koltzenburg, Martin

    2016-06-01

    Sensory feedback from cutaneous mechanoreceptors in the fingertips is important in effective object manipulation, allowing appropriate scaling of grip and load forces during precision grip. However, the role of mechanoreceptor subtypes in these tasks remains incompletely understood. To address this issue, psychophysical tasks that may specifically assess function of type I fast-adapting (FAI) and slowly adapting (SAI) mechanoreceptors were used with object manipulation experiments to examine the regulation of grip force control in an experimental model of graded reduction in tactile sensitivity (healthy volunteers wearing 2 layers of latex gloves). With gloves, tactile sensitivity decreased significantly from 1.9 ± 0.4 to 12.3 ± 2.2 μm in the Bumps task assessing function of FAI afferents but not in a grating orientation task assessing SAI afferents (1.6 ± 0.1 to 1.8 ± 0.2 mm). Six axis force/torque sensors measured peak grip (PGF) and load (PLF) forces generated by the fingertips during a grip-lift task. With gloves there was a significant increase of PGF (14 ± 6%), PLF (17 ± 5%), and grip and load force rates (26 ± 8%, 20 ± 8%). A variable-weight series task was used to examine sensorimotor memory. There was a 20% increase in PGF when the lift of a light object was preceded by a heavy relative to a light object. This relationship was not significantly altered when lifting with gloves, suggesting that the addition of gloves did not change sensorimotor memory effects. We conclude that FAI fibers may be important for the online force scaling but not for the buildup of a sensorimotor memory. PMID:27052582

  18. Adaption of egg and larvae sampling techniques for lake sturgeon and broadcast spawning fishes in a deep river

    Roseman, Edward F.; Kennedy, Gregory W.; Craig, Jaquelyn; Boase, James; Soper, Karen

    2011-01-01

    In this report we describe how we adapted two techniques for sampling lake sturgeon (Acipenser fulvescens) and other fish early life history stages to meet our research needs in the Detroit River, a deep, flowing Great Lakes connecting channel. First, we developed a buoy-less method for sampling fish eggs and spawning activity using egg mats deployed on the river bottom. The buoy-less method allowed us to fish gear in areas frequented by boaters and recreational anglers, thus eliminating surface obstructions that interfered with recreational and boating activities. The buoy-less method also reduced gear loss due to drift when masses of floating aquatic vegetation would accumulate on buoys and lines, increasing the drag on the gear and pulling it downstream. Second, we adapted a D-frame drift net system formerly employed in shallow streams to assess larval lake sturgeon dispersal for use in the deeper (>8 m) Detroit River using an anchor and buoy system.

  19. Optimal importance sampling for tracking in image sequences: application to point tracking

    Arnaud, Elise; Memin, Etienne

    2004-01-01

    In this paper, we propose a particle filtering technique for tracking applications in image sequences. The system we propose combines a measurement equation and a dynamic equation which both depend on the image sequence. Taking into account several possible observations, the peculiar measure model we consider is a linear combination of Gaussian laws. Such a model allows us to infer an analytic expression of the optimal importance function used in the diffusion proces...
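
    For a single linear-Gaussian transition/measurement pair, the optimal importance function p(x_t | x_{t-1}, y_t) is available in closed form; the paper's mixture-of-Gaussians measurement model generalizes exactly this building block. A one-dimensional sketch in Python, with all parameter values illustrative:

        import numpy as np
        rng = np.random.default_rng(7)

        # transition x_t = a x_{t-1} + N(0, q^2); measurement y_t = h x_t + N(0, r^2)
        a, q, h, r = 0.95, 0.5, 1.0, 0.3

        def optimal_is_step(particles, weights, y):
            """Propose from p(x_t | x_{t-1}, y_t) (Gaussian here) and update the
            weights with the predictive likelihood p(y_t | x_{t-1})."""
            s2 = 1.0 / (1.0 / q**2 + h**2 / r**2)          # posterior variance
            mu = s2 * (a * particles / q**2 + h * y / r**2)
            new = mu + np.sqrt(s2) * rng.normal(size=particles.shape)
            pred_var = h**2 * q**2 + r**2                  # var of y | x_{t-1}
            weights = weights * np.exp(-0.5 * (y - h * a * particles)**2 / pred_var)
            return new, weights / weights.sum()

        parts = rng.normal(size=500)
        w = np.ones(500) / 500
        parts, w = optimal_is_step(parts, w, y=1.2)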

  20. Numerically Accelerated Importance Sampling for Nonlinear Non-Gaussian State Space Models

    Koopman, S.J.; Lucas, A.; Scharth Figueiredo Pinto, M.

    2011-01-01

    This paper led to a publication in the 'Journal of Business & Economic Statistics', 2015, 33 (1), 114-127. We introduce a new efficient importance sampler for nonlinear non-Gaussian state space models. We propose a general and efficient likelihood evaluation method for this class of models via the combination of numerical and Monte Carlo integration methods. Our methodology explores the idea that only a small part of the likelihood evaluation problem requires simulation. We refer to our new ...

  1. A Monte Carlo Simulation of the Flow Network Reliability using Importance and Stratified Sampling

    Bulteau, Stéphane; El Khadiri, Mohamed

    1997-01-01

    We consider the evaluation of the flow network reliability parameter. Because the exact evaluation of this parameter has exponential time complexity, simulation methods are used to derive an estimate. In this paper, we use the state space decomposition methodology of Doulliez and Jamoulle to construct a new simulation method which combines the importance and the stratified Monte Carlo principles. We show that the related estimator belongs to the variance-reduction family. By numerical c...
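
    The stratification half of the combination is easy to sketch (Python): partition the link-failure states by the number of failed links, so that the stratum weights are exact binomial probabilities and only the conditional system-failure frequency inside each stratum is estimated. The flow-feasibility oracle, the importance sampling component, and the Doulliez-Jamoulle decomposition itself are left abstract here.

        import math, random

        def stratified_reliability(n_links, p_fail, system_ok, per_stratum=2000):
            """Estimate P(network delivers the required flow). system_ok takes a
            frozenset of failed link indices and returns True if the residual
            network still satisfies demand (e.g. via a max-flow check); it is
            left abstract in this sketch."""
            p = 0.0
            for k in range(n_links + 1):            # stratum: exactly k failures
                w = math.comb(n_links, k) * p_fail**k * (1 - p_fail)**(n_links - k)
                hits = sum(system_ok(frozenset(random.sample(range(n_links), k)))
                           for _ in range(per_stratum))
                p += w * hits / per_stratum
            return p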

  2. The importance of effective sampling for exploring the population dynamics of haploid-diploid seaweeds.

    Krueger-Hadfield, Stacy A; Hoban, Sean M

    2016-02-01

    The mating system partitions genetic diversity within and among populations, and the links between life history traits and mating systems have been extensively studied in diploid organisms. As such, most evolutionary theory is focused on species for which sexual reproduction occurs between diploid male and diploid female individuals. However, there are many multicellular organisms with biphasic life cycles in which the haploid stage is prolonged and undergoes substantial somatic development. In particular, biphasic life cycles are found across green, brown and red macroalgae. Yet, few studies have addressed the population structure and genetic diversity in both the haploid and diploid stages of these life cycles. We have developed some broad guidelines for population genetic studies of haploid-diploid macroalgae and for quantifying the relationship between power and sampling strategy. We address three common goals for studying macroalgal population dynamics, including haploid-diploid ratios, genetic structure and paternity analyses. PMID:26987084

  3. Importance sampling techniques and treatment of electron transport in MCNP 4A

    The continuous energy Monte Carlo code MCNP was developed by the Radiation Transport Group at Los Alamos National Laboratory, and version MCNP 4A is now available. MCNP 4A is able to perform coupled neutron-secondary gamma-ray-electron-bremsstrahlung calculations. Calculated results, such as energy spectra and tally fluctuation charts, together with the geometrical input data, can be displayed on a workstation. The MCNP 4A documentation gives no description of the subroutines, except for a few such as 'SOURCE' and 'TALLYX'. However, when we want to improve the MCNP Monte Carlo sampling techniques to obtain more accurate or more efficient results for some problems, some subroutines need to be added or revised. Three subroutines have been revised and built into the MCNP 4A code. (author)

  4. Gravimetric and volumetric approaches adapted for hydrogen sorption measurements with in situ conditioning on small sorbent samples

    We present high sensitivity (0 to 1 bar, 295 K) gravimetric and volumetric hydrogen sorption measurement systems adapted for in situ sample conditioning at high temperature and high vacuum. These systems are designed especially for experiments on sorbents available in small masses (mg) and requiring thorough degassing prior to sorption measurements. Uncertainty analysis from instrumental specifications and hydrogen absorption measurements on palladium are presented. The gravimetric and volumetric systems yield cross-checkable results within about 0.05 wt % on samples weighing from (3 to 25) mg. Hydrogen storage capacities of single-walled carbon nanotubes measured at 1 bar and 295 K with both systems are presented

  5. Low-carbon steel samples deformed by cold rolling - analysis by the magnetic adaptive testing

    Tomáš, Ivan; Vértesy, G.; Kobayashi, S.; Kadlecová, Jana; Stupakov, Oleksandr

    2009-01-01

    Roč. 321, č. 17 (2009), s. 2670-2676. ISSN 0304-8853 R&D Projects: GA MŠk MEB040702; GA ČR GA102/06/0866; GA AV ČR 1QS100100508 Institutional research plan: CEZ:AV0Z10100520 Keywords : magnetic NDE * magnetic adaptive testing * plastic deformation * low-carbon steel Subject RIV: BM - Solid Matter Physics ; Magnetism Impact factor: 1.204, year: 2009

  6. Spatiotonal adaptivity in super-resolution of under-sampled image sequences

    Pham, T Q

    2006-01-01

    This thesis concerns the use of spatial and tonal adaptivity in improving the resolution of aliased image sequences under scene or camera motion. Each of the five content chapters focuses on a different subtopic of super-resolution: image registration (chapter 2), image fusion (chapter 3 and 4), super-resolution restoration (chapter 5), and super-resolution synthesis (chapter 6). Chapter 2 derives the Cramer-Rao lower bound of image registration and shows that iterative gradient-based estimat...

  7. Complex mixture analysis of environmentally important samples utilizing GC/Matrix isolation-FTIR-MS

    Gas chromatography/matrix isolation Fourier transform infrared spectroscopy-mass spectroscopy (GC/MI-FTIR-MS) is a highly sensitive and specific hyphenated technique that combines the capabilities of capillary gas chromatography for separating components of complex mixtures with the high sensitivity and specificity of both matrix isolation infrared and mass spectrometric detection. Research intended to extend the application of this methodology to the analysis of a variety of environmental mixtures will be described. The authors' instrument utilizes a Hewlett-Packard 58900 GC with a 40:40:20 three-way splitter to the infrared detector, mass selective detector and flame ionization detector, respectively. The FTIR used is a Mattson Cryolect 4800, while the MS is a Hewlett-Packard 5970B MSD. Their most recent results from the analysis of mixtures containing picogram quantities of such environmentally important materials as PAHs and chlorinated pesticides will be presented. These mixtures can be chromatographically separated and analyzed at the low-nanogram to several-hundred-picogram level. The results presented will include both MS and IR spectra, and MS and IR reconstructed chromatograms, as well as FID traces. The results of MS and IR database searches will also be shown

  8. Importance Sampling Based Decision Trees for Security Assessment and the Corresponding Preventive Control Schemes: the Danish Case Study

    Liu, Leo; Rather, Zakir Hussain; Chen, Zhe;

    2013-01-01

    and adopts a methodology of importance sampling to maximize the information contained in the database so as to increase the accuracy of DT. Further, this paper also studies the effectiveness of DT by implementing its corresponding preventive control schemes. These approaches are tested on the detailed model...

  9. Quantitative Assessment of the Importance of Phenotypic Plasticity in Adaptation to Climate Change in Wild Bird Populations

    Vedder, Oscar; Bouwhuis, Sandra; Sheldon, Ben C.

    2013-01-01

    Predictions about the fate of species or populations under climate change scenarios typically neglect adaptive evolution and phenotypic plasticity, the two major mechanisms by which organisms can adapt to changing local conditions. As a consequence, we have little understanding of the scope for organisms to track changing environments by in situ adaptation. Here, we use a detailed individual-specific long-term population study of great tits (Parus major) breeding in Wytham Woods, Oxford, UK t...

  10. FloodNet: Coupling Adaptive Sampling with Energy Aware Routing in a Flood Warning System

    Jing ZHOU; De Roure, David

    2007-01-01

    We describe the design of FloodNet, a flood warning system, which uses a grid-based flood predictor model developed by environmental experts to make flood predictions based on readings of water level collected by a set of sensor nodes. To optimize battery consumption, the reporting frequency of sensor nodes is required to be adaptive to local conditions as well as the flood predictor model. We therefore propose an energy aware routing protocol which allows sensor nodes to consume energy accord...

  11. Clinical importance and representation of toxigenic and non-toxigenic Clostridium difficile cultivated from stool samples of hospitalized patients

    Predrag Stojanovic

    2012-03-01

    Full Text Available The aim of this study was to establish the clinical importance and representation of toxigenic and non-toxigenic Clostridium difficile isolated from stool samples of hospitalized patients. This survey included 80 hospitalized patients with diarrhea and positive findings of Clostridium difficile in stool samples, and 100 hospitalized patients with formed stool as a control group. Bacteriological examination of stool samples was conducted using standard microbiological methods. Stool samples were inoculated directly onto nutrient media for bacterial cultivation (blood agar with 5% sheep blood, Endo agar, selective Salmonella Shigella agar, Selenite-F broth, CIN agar and Skirrow's medium) and onto selective cycloserine-cefoxitin-fructose agar (CCFA) (Biomedics, Parque tecnológico, Madrid, Spain) for isolation of Clostridium difficile. Clostridium difficile toxin was detected by the ELISA ridascreen Clostridium difficile Toxin A/B test (R-Biopharm AG, Germany) and the ColorPAC Toxin A test (Becton Dickinson, USA). Examination of stool specimens for the presence of parasites causing diarrhea was done using standard methods (conventional microscopy, the commercial concentration test Paraprep S Gold kit (Dia Mondial, France) and the RIDA®QUICK Cryptosporidium/Giardia Combi test (R-Biopharm AG, Germany)). Examination of stool specimens for the presence of fungi causing diarrhea was performed by standard methods. All stool samples positive for Clostridium difficile were tested for Rota, Noro, Astro and Adeno viruses by ELISA ridascreen (R-Biopharm AG, Germany). In this research we isolated 99 Clostridium difficile strains from 116 stool samples of 80 hospitalized patients with diarrhea. Fifty-three (66.25%) of the patients with diarrhea were positive for toxins A and B, and one (1.25%) was positive for toxin B only. Non-toxigenic Clostridium difficile was isolated from samples of 26 (32.5%) patients. However, other pathogenic microorganisms of the intestinal tract were cultivated from samples of 16 patients

  12. Unconstrained Recursive Importance Sampling

    Lemaire, Vincent; Pagès, Gilles

    2010-01-01

    We propose an unconstrained stochastic approximation method for finding the optimal change of measure (in an a priori parametric family) to reduce the variance of a Monte Carlo simulation. We consider different parametric families based on the Girsanov theorem and the Esscher transform (exponential tilting). In [Monte Carlo Methods Appl. 10 (2004) 1–24], a projected Robbins–Monro procedure was described for selecting the parameter minimizing the variance in a multidimensional Gaussian framework. I...
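
    As a rough illustration of the idea, the sketch below (Python with NumPy) runs a Robbins–Monro recursion on a Gaussian mean-shift parameter to reduce the second moment of an importance-sampling estimator. The payoff f, the step-size schedule and the clipping safeguard are illustrative choices, not taken from the paper (which avoids such truncations altogether).

        import numpy as np

        rng = np.random.default_rng(0)

        def f(x):
            # Toy payoff whose plain Monte Carlo variance is large.
            return np.exp(2.0 * x) * (x > 1.5)

        # Robbins-Monro search for a mean shift theta minimizing the second
        # moment v(theta) = E[f(X)^2 exp(-theta X + theta^2 / 2)], X ~ N(0,1).
        theta = 0.0
        for n in range(1, 50_001):
            x = rng.standard_normal()
            grad = f(x) ** 2 * (theta - x) * np.exp(-theta * x + theta**2 / 2)
            theta -= (0.1 / n**0.75) * np.clip(grad, -1e3, 1e3)  # crude safeguard

        # Importance-sampling estimate with the learned shift:
        # E[f(X)] = E[f(X + theta) exp(-theta X - theta^2 / 2)].
        x = rng.standard_normal(200_000)
        est_is = np.mean(f(x + theta) * np.exp(-theta * x - theta**2 / 2))
        print(theta, est_is, np.mean(f(x)))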

  13. An adaptive non-raster scanning method in atomic force microscopy for simple sample shapes

    It is a significant challenge to reduce the scanning time in atomic force microscopy while retaining imaging quality. In this paper, a novel non-raster scanning method for high-speed imaging is presented. The method proposed here is developed for a specimen with the simple shape of a cell. The image is obtained by scanning the boundary of the specimen at successively increasing heights, creating a set of contours. The scanning speed is increased by employing a combined prediction algorithm, using a weighted prediction from the contours scanned earlier, and from the currently scanned contour. In addition, an adaptive change in the height step after each contour scan is suggested. A rigorous simulation test bed recreates the x–y specimen stage dynamics and the cantilever height control dynamics, so that a detailed parametric comparison of the scanning algorithms is possible. The data from different scanning algorithms are compared after the application of an image interpolation algorithm (the Delaunay interpolation algorithm), which can also run on-line. (paper)
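
    A minimal sketch of such a blended prediction (Python; the blend weight, the three-point window and the linear extrapolation rule are hypothetical choices, not the paper's tuned algorithm):

        import numpy as np

        def predict_next_point(prev_contour_pt, current_pts, w=0.6):
            # Blend the corresponding point of the previously scanned contour
            # with a linear extrapolation of the contour being scanned now.
            recent = np.asarray(current_pts[-3:], dtype=float)
            extrap = recent[-1] + (recent[-1] - recent[0]) / 2.0
            return w * np.asarray(prev_contour_pt, dtype=float) + (1.0 - w) * extrap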

  14. Efficient Bayes-Adaptive Reinforcement Learning using Sample-Based Search

    Guez, Arthur; Dayan, Peter

    2012-01-01

    Bayesian model-based reinforcement learning is a formally elegant approach to learning optimal behaviour under model uncertainty. In this setting, a Bayes-optimal policy captures the ideal trade-off between exploration and exploitation. Unfortunately, finding Bayes-optimal policies is notoriously taxing due to the enormous search space in the augmented belief-state MDP. In this paper we exploit recent advances in sample-based planning, based on Monte-Carlo tree search, to introduce a tractable method for approximate Bayes-optimal planning. Unlike prior work in this area, we avoid expensive applications of Bayes rule within the search tree, by lazily sampling models from the current beliefs. Our approach outperformed prior Bayesian model-based RL algorithms by a significant margin on several well-known benchmark problems.
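
    The core trick, drawing one model per simulation instead of applying Bayes rule inside the tree, can be sketched as follows (Python; a toy MDP with known rewards, a Dirichlet posterior over transitions, and tree nodes keyed by (state, depth), which collapses the belief-dependent histories that the full algorithm keeps):

        import numpy as np

        rng = np.random.default_rng(1)
        nS, nA, gamma, horizon = 3, 2, 0.95, 15

        alpha = np.ones((nS, nA, nS))   # Dirichlet posterior counts per (s, a)
        R = rng.random((nS, nA))        # rewards assumed known, for brevity

        N, Q = {}, {}                   # visit counts / values per (state, depth)

        def simulate(s, depth, P):
            # One UCT descent through a single sampled MDP P; no belief
            # updates happen inside the tree (the 'lazy' sampling idea).
            if depth == horizon:
                return 0.0
            key = (s, depth)
            if key not in N:
                N[key], Q[key] = np.zeros(nA), np.zeros(nA)
                a = int(rng.integers(nA))        # first visit: random expansion
            else:
                ucb = Q[key] + 2.0 * np.sqrt(np.log(N[key].sum() + 1.0) / (N[key] + 1e-9))
                a = int(np.argmax(ucb))
            s2 = int(rng.choice(nS, p=P[s, a]))
            ret = R[s, a] + gamma * simulate(s2, depth + 1, P)
            N[key][a] += 1
            Q[key][a] += (ret - Q[key][a]) / N[key][a]
            return ret

        def plan(s0, n_sims=2000):
            for _ in range(n_sims):
                P = np.array([[rng.dirichlet(alpha[s, a]) for a in range(nA)]
                              for s in range(nS)])   # one model per simulation
                simulate(s0, 0, P)
            return int(np.argmax(Q[(s0, 0)]))

        print("recommended action:", plan(0))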

  15. SAMPLING ADAPTIVE STRATEGY AND SPATIAL ORGANISATION ESTIMATION OF SOIL ANIMAL COMMUNITIES AT VARIOUS HIERARCHICAL LEVELS OF URBANISED TERRITORIES

    Baljuk J.A.

    2014-12-01

    Full Text Available This work presents an algorithm for an adaptive strategy of optimal spatial sampling for studying the spatial organisation of soil animal communities under urbanization. The operating variables were the principal components obtained from an analysis of field data on soil penetration resistance, soil electrical conductivity and forest stand density, collected on a quasi-regular grid. The locations of the experimental polygons were determined by means of the program ESAP. Sampling was performed on a regular grid within the experimental polygons. The biogeocoenological assessment of the experimental polygons was based on A. L. Belgard's ecomorphic analysis. The spatial configuration of biogeocoenosis types was established from remote sensing data and from analysis of a digital elevation model. The suggested algorithm reveals the spatial organisation of soil animal communities at the levels of the investigated point, the biogeocoenosis, and the landscape.

  16. Preliminary Efficacy of Adapted Responsive Teaching for Infants at Risk of Autism Spectrum Disorder in a Community Sample

    Grace T. Baranek

    2015-01-01

    Full Text Available This study examined the (a) feasibility of enrolling 12-month-olds at risk of ASD from a community sample into a randomized controlled trial, (b) subsequent utilization of community services, and (c) potential of a novel parent-mediated intervention to improve outcomes. The First Year Inventory was used to screen and recruit 12-month-old infants at risk of ASD to compare the effects of 6–9 months of Adapted Responsive Teaching (ART) versus referral to early intervention and monitoring (REIM). Eighteen families were followed for ~20 months. Assessments were conducted before randomization, after treatment, and at 6-month follow-up. Utilization of community services was highest for the REIM group. ART significantly outperformed REIM on parent-reported and observed measures of child receptive language with good linear model fit. Multiphase growth models had better fit for more variables, showing the greatest effects in the active treatment phase, where ART outperformed REIM on parental interactive style (less directive), child sensory responsiveness (less hyporesponsive), and adaptive behavior (increased communication and socialization). This study demonstrates the promise of a parent-mediated intervention for improving developmental outcomes for infants at risk of ASD in a community sample and highlights the utility of earlier identification for access to community services earlier than standard practice.

  17. Assessment of Different Sampling Methods for Measuring and Representing Macular Cone Density Using Flood-Illuminated Adaptive Optics

    Feng, Shu; Gale, Michael J.; Fay, Jonathan D.; Faridi, Ambar; Titus, Hope E.; Garg, Anupam K.; Michaels, Keith V.; Erker, Laura R.; Peters, Dawn; Smith, Travis B.; Pennesi, Mark E.

    2015-01-01

    Purpose To describe a standardized flood-illuminated adaptive optics (AO) imaging protocol suitable for the clinical setting and to assess sampling methods for measuring cone density. Methods Cone density was calculated following three measurement protocols: 50 × 50-μm sampling window values every 0.5° along the horizontal and vertical meridians (fixed-interval method), the mean density of expanding 0.5°-wide arcuate areas in the nasal, temporal, superior, and inferior quadrants (arcuate mean method), and the peak cone density of a 50 × 50-μm sampling window within expanding arcuate areas near the meridian (peak density method). Repeated imaging was performed in nine subjects to determine intersession repeatability of cone density. Results Cone density montages could be created for 67 of the 74 subjects. Image quality was determined to be adequate for automated cone counting for 35 (52%) of the 67 subjects. We found that cone density varied with different sampling methods and regions tested. In the nasal and temporal quadrants, peak density most closely resembled histological data, whereas the arcuate mean and fixed-interval methods tended to underestimate the density compared with histological data. However, in the inferior and superior quadrants, arcuate mean and fixed-interval methods most closely matched histological data, whereas the peak density method overestimated cone density compared with histological data. Intersession repeatability testing showed that repeatability was greatest when sampling by arcuate mean and lowest when sampling by fixed interval. Conclusions We show that different methods of sampling can significantly affect cone density measurements. Therefore, care must be taken when interpreting cone density results, even in a normal population. PMID:26325414

  18. Adaptive foveated single-pixel imaging with dynamic super-sampling

    Phillips, David B; Taylor, Jonathan M; Edgar, Matthew P; Barnett, Stephen M; Gibson, Graham G; Padgett, Miles J

    2016-01-01

    As an alternative to conventional multi-pixel cameras, single-pixel cameras enable images to be recorded using a single detector that measures the correlations between the scene and a set of patterns. However, to fully sample a scene in this way requires at least the same number of correlation measurements as there are pixels in the reconstructed image. Therefore single-pixel imaging systems typically exhibit low frame-rates. To mitigate this, a range of compressive sensing techniques have been developed which rely on a priori knowledge of the scene to reconstruct images from an under-sampled set of measurements. In this work we take a different approach and adopt a strategy inspired by the foveated vision systems found in the animal kingdom - a framework that exploits the spatio-temporal redundancy present in many dynamic scenes. In our single-pixel imaging system a high-resolution foveal region follows motion within the scene, but unlike a simple zoom, every frame delivers new spatial information from acros...

  19. Generalized likelihood uncertainty estimation (GLUE) using adaptive Markov chain Monte Carlo sampling

    Blasone, Roberta-Serena; Vrugt, Jasper A.; Madsen, Henrik;

    2008-01-01

    estimate of the associated uncertainty. This uncertainty arises from incomplete process representation, uncertainty in initial conditions, input, output and parameter error. The generalized likelihood uncertainty estimation (GLUE) framework was one of the first attempts to represent prediction uncertainty within the context of Monte Carlo (MC) analysis coupled with Bayesian estimation and propagation of uncertainty. Because of its flexibility, ease of implementation and its suitability for parallel implementation on distributed computer systems, the GLUE method has been used in a wide variety of applications. However, the MC based sampling strategy of the prior parameter space typically utilized in GLUE is not particularly efficient in finding behavioral simulations. This becomes especially problematic for high-dimensional parameter estimation problems, and in the case of complex simulation models...
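
    For orientation, a minimal sketch of the classic MC-based GLUE loop that the paper improves upon (Python; the simulator, the likelihood measure and the behavioral threshold are illustrative stand-ins):

        import numpy as np

        rng = np.random.default_rng(2)

        def model(theta, t):
            # Stand-in simulator: two-parameter exponential recession curve.
            return theta[0] * np.exp(-theta[1] * t)

        t_obs = np.linspace(0.0, 10.0, 25)
        y_obs = model((2.0, 0.3), t_obs) + rng.normal(0.0, 0.1, t_obs.size)

        # 1) Plain MC sampling of the prior parameter space (the step the
        #    paper replaces with adaptive MCMC to find behavioral runs faster).
        thetas = rng.uniform([0.5, 0.05], [5.0, 1.0], size=(20_000, 2))
        sims = np.array([model(th, t_obs) for th in thetas])

        # 2) Informal likelihood (Nash-Sutcliffe style) and behavioral cutoff.
        sse = np.sum((sims - y_obs) ** 2, axis=1)
        L = np.maximum(0.0, 1.0 - sse / np.sum((y_obs - y_obs.mean()) ** 2))
        behavioral = L > 0.7           # subjective threshold, as in classic GLUE

        # 3) Likelihood-weighted predictions from the behavioral ensemble.
        w = L[behavioral] / L[behavioral].sum()
        pred_mean = w @ sims[behavioral]
        lo, hi = np.percentile(sims[behavioral], [5, 95], axis=0)
        print(behavioral.sum(), "behavioral runs")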

  20. THE IMPORTANCE OF THE MAGNETIC FIELD FROM AN SMA-CSO-COMBINED SAMPLE OF STAR-FORMING REGIONS

    Koch, Patrick M.; Tang, Ya-Wen; Ho, Paul T. P.; Chen, Huei-Ru Vivien; Liu, Hau-Yu Baobab; Yen, Hsi-Wei; Lai, Shih-Ping [Academia Sinica, Institute of Astronomy and Astrophysics, Taipei, Taiwan (China); Zhang, Qizhou; Chen, How-Huan; Ching, Tao-Chung [Harvard-Smithsonian Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States); Girart, Josep M. [Institut de Ciències de l' Espai, CSIC-IEEC, Campus UAB, Facultat de Ciències, C5p 2, 08193 Bellaterra, Catalonia (Spain); Frau, Pau [Observatorio Astronómico Nacional, Alfonso XII, 3 E-28014 Madrid (Spain); Li, Hua-Bai [Department of Physics, The Chinese University of Hong Kong (Hong Kong); Li, Zhi-Yun [Department of Astronomy, University of Virginia, P.O. Box 400325, Charlottesville, VA 22904 (United States); Padovani, Marco [Laboratoire Univers et Particules de Montpellier, UMR 5299 du CNRS, Université de Montpellier II, place E. Bataillon, cc072, F-34095 Montpellier (France); Qiu, Keping [School of Astronomy and Space Science, Nanjing University, 22 Hankou Road, Nanjiing 210093 (China); Rao, Ramprasad, E-mail: pmkoch@asiaa.sinica.edu.tw [Academia Sinica, Institute of Astronomy and Astrophysics, 645 N. Aohoku Place, Hilo, HI 96720 (United States)

    2014-12-20

    Submillimeter dust polarization measurements of a sample of 50 star-forming regions, observed with the Submillimeter Array (SMA) and the Caltech Submillimeter Observatory (CSO) covering parsec-scale clouds to milliparsec-scale cores, are analyzed in order to quantify the magnetic field importance. The magnetic field misalignment δ (the local angle between magnetic field and dust emission gradient) is found to be a prime observable, revealing distinct distributions for sources where the magnetic field is preferentially aligned with or perpendicular to the source minor axis. Source-averaged misalignment angles ⟨|δ|⟩ fall into systematically different ranges, reflecting the different source-magnetic field configurations. Possible bimodal ⟨|δ|⟩ distributions are found for the separate SMA and CSO samples. Combining both samples broadens the distribution with a wide maximum peak at small ⟨|δ|⟩ values. Assuming the 50 sources to be representative, the prevailing source-magnetic field configuration is one that statistically prefers small magnetic field misalignments |δ|. When interpreting |δ| together with a magnetohydrodynamics force equation, as developed in the framework of the polarization-intensity gradient method, a sample-based log-linear scaling fits the magnetic field tension-to-gravity force ratio ⟨Σ_B⟩ versus ⟨|δ|⟩ with ⟨Σ_B⟩ = 0.116 · exp(0.047 · ⟨|δ|⟩) ± 0.20 (mean error), providing a way to estimate the relative importance of the magnetic field, based only on measurable field misalignments |δ|. The force ratio Σ_B discriminates systems that are collapsible on average (⟨Σ_B⟩ < 1) from other molecular clouds where the magnetic field still provides enough resistance against gravitational collapse (⟨Σ_B⟩ > 1). The sample-wide trend shows a transition around ⟨|δ|⟩ ≈ 45°. Defining an effective gravitational force ∼1 − ⟨Σ_B⟩, the average magnetic-field-reduced star formation efficiency is at least a

  1. Using adaptive sampling and triangular meshes for the processing and inversion of potential field data

    Foks, Nathan Leon

    The interpretation of geophysical data plays an important role in the analysis of potential field data in resource exploration industries. Two categories of interpretation techniques are discussed in this thesis: boundary detection and geophysical inversion. Fault or boundary detection is a method to interpret the locations of subsurface boundaries from measured data, while inversion is a computationally intensive method that provides 3D information about subsurface structure. My research focuses on these two aspects of interpretation techniques. First, I develop a method to aid in the interpretation of faults and boundaries from magnetic data. These processes are traditionally carried out using raster grid and image processing techniques. Instead, I use unstructured meshes of triangular facets that can extract inferred boundaries using mesh edges. Next, to address the computational issues of geophysical inversion, I develop an approach to reduce the number of data in a data set. The approach selects the data points according to a user-specified proxy for signal content. The approach is performed in the data domain and requires no modification to existing inversion codes. This technique adds to the existing suite of compressive inversion algorithms. Finally, I develop an algorithm to invert gravity data for an interfacing surface using an unstructured mesh of triangular facets. A pertinent property of unstructured meshes is their flexibility at representing oblique, or arbitrarily oriented structures. This flexibility makes unstructured meshes an ideal candidate for geometry-based interface inversions. The approaches I have developed provide a suite of algorithms geared towards large-scale interpretation of potential field data, by using an unstructured representation of both the data and model parameters.

  2. Performances of a bent-crystal spectrometer adapted to resonant x-ray emission measurements on gas-phase samples

    We describe a bent-crystal spectrometer adapted to measure x-ray emission resulting from core-level excitation of gas-phase molecules in the 0.8-8 keV energy range. The spectrometer is based on the Johann principle and uses a microfocused photon beam to provide high resolution (resolving power of ∼7500). A gas cell was designed to hold a high-pressure (300 mbar) sample of gas while maintaining a high vacuum (10^-9 mbar) in the chamber. The cell was designed to optimize the counting rate (2000 cts/s at the maximum of the Cl Kα emission line) while minimizing self-absorption. An example, the Kα emission lines of the CH3Cl molecule, is presented to illustrate the capabilities of this new instrument.

  3. A combined Importance Sampling and Kriging reliability method for small failure probabilities with time-demanding numerical models

    Applying reliability methods to a complex structure is often delicate for two main reasons. First, such a structure is fortunately designed with codified rules leading to a large safety margin, which means that failure is a small-probability event. Such a probability level is difficult to assess efficiently. Second, the structure's mechanical behaviour is modelled numerically in an attempt to reproduce the real response, and the numerical model tends to become more and more time-demanding as its complexity is increased to improve accuracy and to capture particular mechanical behaviour. As a consequence, performing a large number of model computations cannot be considered as a way to assess the failure probability. To overcome these issues, this paper proposes an original and easily implementable method called AK-IS, for active learning and Kriging-based Importance Sampling. This new method is based on the AK-MCS algorithm previously published by Echard et al. [AK-MCS: an active learning reliability method combining Kriging and Monte Carlo simulation. Structural Safety 2011;33(2):145–54]. It associates the Kriging metamodel, with its advantageous stochastic property, with the Importance Sampling method to assess small failure probabilities. It enables the correction or validation of the FORM approximation with only very few mechanical model computations. The efficiency of the method is first demonstrated on two academic applications. It is then applied to assess the reliability of a challenging aerospace case study subjected to fatigue.
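
    The importance-sampling ingredient can be sketched as follows (Python with SciPy; a toy linear limit state stands in for both the time-demanding model and its actively refined Kriging surrogate, so only the sampling-around-the-design-point idea is shown):

        import numpy as np
        from scipy.stats import multivariate_normal, norm

        rng = np.random.default_rng(3)

        def g(u):
            # Toy limit-state function in standard-normal space; failure: g <= 0.
            # In AK-IS, calls like this would go to a Kriging surrogate.
            return 5.0 - u[:, 0] - u[:, 1]

        u_star = np.array([2.5, 2.5])    # FORM design point of this linear g
        beta = np.linalg.norm(u_star)

        n = 50_000
        u = rng.standard_normal((n, 2)) + u_star       # sample centred at u*
        phi = multivariate_normal([0.0, 0.0]).pdf(u)   # nominal density
        psi = multivariate_normal(u_star).pdf(u)       # instrumental density
        pf = np.mean((g(u) <= 0) * phi / psi)

        print(f"FORM approx: {norm.cdf(-beta):.2e}   IS estimate: {pf:.2e}")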

  4. Predicting the impacts of climate change on animal distributions: the importance of local adaptation and species' traits

    HELLMANN, J. J.; LOBO, N. F.

    2011-12-20

    The geographic range limits of many species are strongly affected by climate and are expected to change under global warming. For species that are able to track changing climate over broad geographic areas, we expect to see shifts in species distributions toward the poles and away from the equator. A number of ecological and evolutionary factors, however, could restrict this shifting or redistribution under climate change. These factors include restricted habitat availability, restricted capacity for or barriers to movement, or reduced abundance of colonists due to the perturbation effect of climate change. This research project examined the last of these constraints - that climate change could perturb local conditions to which populations are adapted, reducing the likelihood that a species will shift its distribution by diminishing the number of potential colonists. In the most extreme cases, species ranges could collapse over a broad geographic area with no poleward migration and an increased risk of species extinction. Changes in individual species ranges are the processes that drive larger phenomena such as changes in land cover, ecosystem type, and even changes in carbon cycling. For example, consider the poleward range shift and population outbreaks of the mountain pine beetle that has decimated millions of acres of Douglas fir trees in the western US and Canada. Standing dead trees cause forest fires and release vast quantities of carbon to the atmosphere. The beetle likely shifted its range because it is not locally adapted across its range, and it appears to be limited by winter low temperatures that have steadily increased in the last decades. To understand range and abundance changes like those of the pine beetle, we must reveal the extent of adaptive variation across species ranges - and the physiological basis of that adaptation - to know if other species will change as readily as the pine beetle. Ecologists tend to assume that range shifts are the dominant

  5. Exploring equivalence domain in nonlinear inverse problems using Covariance Matrix Adaption Evolution Strategy (CMAES) and random sampling

    Grayver, Alexander V.; Kuvshinov, Alexey V.

    2016-05-01

    This paper presents a methodology to sample the equivalence domain (ED) in nonlinear partial differential equation (PDE)-constrained inverse problems. For this purpose, we first applied the state-of-the-art stochastic optimization algorithm called Covariance Matrix Adaptation Evolution Strategy (CMAES) to identify low-misfit regions of the model space. These regions were then randomly sampled to create an ensemble of equivalent models and quantify uncertainty. CMAES is aimed at exploring model space globally and is robust on very ill-conditioned problems. We show that the number of iterations required to converge grows at a moderate rate with respect to the number of unknowns and that the algorithm is embarrassingly parallel. We formulated the problem by using the generalized Gaussian distribution. This enabled us to seamlessly use arbitrary norms for residual and regularization terms. We show that various regularization norms facilitate studying different classes of equivalent solutions. We further show how the performance of the standard Metropolis-Hastings Markov chain Monte Carlo algorithm can be substantially improved by using the information CMAES provides. This methodology was tested by using individual and joint inversions of magnetotelluric, controlled-source electromagnetic (EM) and global EM induction data.
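
    A compact sketch of the two-stage workflow (Python, assuming the third-party cma package, pip install cma; the misfit function, perturbation scale and equivalence threshold below are illustrative stand-ins for the EM inverse problem, and the paper's MCMC speed-up is omitted):

        import numpy as np
        import cma  # third-party package: pip install cma

        rng = np.random.default_rng(4)

        def misfit(m):
            # Stand-in for the PDE-constrained data misfit functional.
            return float(np.sum((np.tanh(np.asarray(m)) - 0.5) ** 2))

        # 1) Global search for a low-misfit region with CMA-ES.
        es = cma.CMAEvolutionStrategy(5 * [0.0], 1.0, {'verbose': -9})
        es.optimize(misfit)
        m_best, f_best = es.result.xbest, es.result.fbest

        # 2) Random sampling of the equivalence domain: keep perturbed models
        #    whose misfit stays below a chosen equivalence threshold.
        threshold = f_best + 0.05
        ensemble = []
        while len(ensemble) < 500:
            cand = m_best + rng.normal(0.0, 0.2, size=m_best.shape)
            if misfit(cand) <= threshold:
                ensemble.append(cand)

        spread = np.std(ensemble, axis=0)   # simple per-parameter uncertainty
        print(spread)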

  6. Exploring equivalence domain in non-linear inverse problems using Covariance Matrix Adaption Evolution Strategy (CMAES) and random sampling

    Grayver, Alexander V.; Kuvshinov, Alexey V.

    2016-02-01

    This paper presents a methodology to sample the equivalence domain (ED) in non-linear PDE-constrained inverse problems. For this purpose, we first applied the state-of-the-art stochastic optimization algorithm called Covariance Matrix Adaptation Evolution Strategy (CMAES) to identify low-misfit regions of the model space. These regions were then randomly sampled to create an ensemble of equivalent models and quantify uncertainty. CMAES is aimed at exploring model space globally and is robust on very ill-conditioned problems. We show that the number of iterations required to converge grows at a moderate rate with respect to the number of unknowns and that the algorithm is embarrassingly parallel. We formulated the problem by using the generalized Gaussian distribution. This enabled us to seamlessly use arbitrary norms for residual and regularization terms. We show that various regularization norms facilitate studying different classes of equivalent solutions. We further show how the performance of the standard Metropolis-Hastings Markov chain Monte Carlo (MCMC) algorithm can be substantially improved by using the information CMAES provides. This methodology was tested by using individual and joint inversions of magnetotelluric, controlled-source electromagnetic (EM) and global EM induction data.

  7. HOW TO ESTIMATE THE AMOUNT OF IMPORTANT CHARACTERISTICS MISSING IN A CONSUMERS SAMPLE BY USING BAYESIAN ESTIMATORS

    Sueli A. Mingoti

    2001-06-01

    Full Text Available Consumer surveys are conducted very often by many companies, with the main objective of obtaining information about the opinions consumers have about a specific prototype, product or service. In many situations the goal is to identify the characteristics that are considered important by the consumers when deciding to buy or use the products or services. When the survey is performed, some characteristics that are present in the consumer population might not be reported by the consumers in the observed sample. Therefore, some important characteristics of the product, according to the consumers' opinions, could be missing in the observed sample. The main objective of this paper is to show how the number of characteristics missing in the observed sample can be easily estimated by using Bayesian estimators proposed by Mingoti & Meeden (1992) and Mingoti (1999). An example of application related to an automobile survey is presented. Market surveys are frequently conducted with the purpose of obtaining information about consumers' opinions of products already on the market, prototypes, or certain types of services provided by a company. In many situations one wishes to identify the characteristics considered important by consumers when deciding to buy the product or to opt for the service provided by the company. Since the surveys are carried out with samples of consumers from the potential market, some characteristics considered important by the population may not be represented in the samples. The objective of this article is to show how the number of characteristics present in the population but not represented in the samples can be easily estimated through the Bayesian estimators proposed by Mingoti & Meeden (1992) and Mingoti (1999). As an illustration, an example of a market survey about an automobile is presented.
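
    The abstract does not reproduce the estimators themselves; as a stand-in, the sketch below estimates the number of unseen characteristics from the frequency-of-frequencies of those observed, in the spirit of the simpler Chao1/Good-Turing estimators rather than the Bayesian ones of Mingoti & Meeden (Python; toy data):

        from collections import Counter

        def estimate_missing(mentions_per_characteristic):
            # Chao1-style estimate of unseen characteristics from the counts
            # of characteristics mentioned exactly once (f1) and twice (f2).
            freq = Counter(mentions_per_characteristic.values())
            f1, f2 = freq.get(1, 0), freq.get(2, 0)
            return f1 * f1 / (2 * f2) if f2 else f1 * (f1 - 1) / 2.0

        # Toy survey: number of consumers mentioning each characteristic.
        counts = {"price": 40, "design": 12, "warranty": 2, "color": 1, "weight": 1}
        print(round(estimate_missing(counts), 1))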

  8. Comparative chemical composition and antimicrobial activity study of essential oils from two imported lemon fruits samples against pathogenic bacteria

    Najwa Nasser AL-Jabri

    2014-12-01

    Full Text Available The aim of this work was to isolate and identify two essential oils, obtained by hydrodistillation from two imported lemon fruit samples collected from a local supermarket, and to evaluate their antimicrobial activity against pathogenic bacteria by the disc diffusion method. The essential oils were obtained from Turkish and Indian lemon fruit samples by hydrodistillation using a Clevenger-type apparatus. Both isolated essential oils were identified by GC–MS, and their in vitro antimicrobial activity against pathogenic bacteria was determined by the agar gel method. Twenty-two bioactive ingredients with different percentages were identified based on GC retention time in the Turkish and Indian lemons. The predominant bioactive ingredients with high percentages in the Turkish essential oil were dl-limonene (78.92%), α-pinene (5.08%), l-α-terpineol (4.61%), β-myrcene (1.75%), β-pinene (1.47%) and β-linalool (0.95%), and in the Indian essential oil dl-limonene (53.57%), l-α-terpineol (15.15%), β-pinene (7.44%), α-terpinolene (4.33%), terpinen-4-ol (3.55%), cymene (2.88%) and E-citral (2.38%), respectively. Both isolated essential oils were tested for antimicrobial activity against four pathogenic bacterial strains: Staphylococcus aureus (S. aureus), Escherichia coli (E. coli), Pseudomonas aeruginosa (P. aeruginosa) and Proteus vulgaris (P. vulgaris). Almost none of the bacterial strains were inhibited by the employed essential oils at the different concentrations tested. Therefore, the obtained results indicate that both essential oils require further extensive biological study, including of their mechanism of action.

  9. Comparison of sample preparation methods for the determination of essential and toxic elements in important indigenous medicinal plant Aloe barbadensis

    The role of elements, particularly trace elements, in health and disease is now well established. In this paper we investigate the presence of various elements in the important medicinal herb Aloe barbadensis, which is commonly used in different ailments, especially of the alimentary tract. We used four extraction methods for the determination of total elements in Aloe barbadensis. The procedure found to be most efficient at decomposing the biological material was digestion with nitric acid and 30% hydrogen peroxide. The plant samples were collected from the surroundings of Hyderabad and Sindh University, and voucher specimens were prepared following standard herbarium techniques. Fifteen essential, trace and toxic elements, such as Zn, Cr, K, Mg, Ca, Na, Fe, Pb, Al, Ba, Mn, Co, Ni and Cd, were determined in the plant and in its decoction using a Hitachi Model 180-50 flame atomic absorption spectrophotometer. The levels of essential elements were found to be high compared with the levels of toxic elements. (author)

  10. Glove box adaptation of a high resolution ICP emission spectrometer and its operating experience for analysis of radioactive samples

    ICP-AES units are commercially available from many well-established companies. These units are compact in design and are not suitable for glove-box adaptation. To meet our divisional requirement for an ICP-AES incorporated in a glove box for analyzing radioactive material, it was decided to keep all electronic and optical components outside the radioactive containment and to place the entire assembly of ICP torch, r.f. coil, nebulizer, spray chamber, peristaltic pump and drainage system inside the glove box. At the same time, it was essential to maintain the analytical performance of the spectrometer. From its ore to nuclear fuel to reprocessing and disposal, uranium undergoes several different transformations within the nuclear fuel cycle, including concentration, purification, isotope enrichment, metallurgical processing and recovery of the precious element plutonium (Pu). The determination of impurities in uranium/plutonium at various stages of the nuclear fuel cycle plays an important role in quality control and in achieving chemical and metallurgical requirements

  11. Methodological Adaptations for Investigating the Perceptions of Language-Impaired Adolescents Regarding the Relative Importance of Selected Communication Skills

    Reed, Vicki A.; Brammall, Helen

    2006-01-01

    This article describes the systematic and detailed processes undertaken to modify a research methodology for use with language-impaired adolescents. The original methodology had been used previously with normally achieving adolescents and speech pathologists to obtain their opinions about the relative importance of selected communication skills…

  12. Importance of Mobile Genetic Elements and Conjugal Gene Transfer for Subsurface Microbial Community Adaptation to Biotransformation of Metals

    Soils used in the present DOE project were obtained from the Field Research Center (FRC) through correspondence with FRC Manager David Watson. We obtained a total of six soils sampled at different depths below the surface: (A) Non-contaminated surface soil from Hinds Creek Floodplain (0 mbs (meters below surface)). (B) Mercury-contaminated surface soil from Lower East Fork Poplar Creek Floodplain (0 mbs). (C) Mercury-contaminated subsurface soil from Lower East Fork Poplar Creek Floodplain (0.5 mbs). (D) Mercury-contaminated subsurface soil from Lower East Fork Poplar Creek Floodplain (1.0 mbs). (E) Non-contaminated surface soil from Ish Creek Floodplain (0 mbs). (F) Non-contaminated subsurface soil from Ish Creek Floodplain (0.5 mbs)

  13. A Keck Adaptive Optics Survey of a Representative Sample of Gravitationally-Lensed Star-Forming Galaxies: High Spatial Resolution Studies of Kinematics and Metallicity Gradients

    Leethochawalit, Nicha; Ellis, Richard S; Stark, Daniel P; Richard, Johan; Zitrin, Adi; Auger, Matthew

    2015-01-01

    We discuss spatially resolved emission line spectroscopy secured for a total sample of 15 gravitationally lensed star-forming galaxies at a mean redshift of z ≃ 2 based on Keck laser-assisted adaptive optics observations undertaken with the recently improved OSIRIS integral field unit (IFU) spectrograph. By exploiting gravitationally lensed sources drawn primarily from the CASSOWARY survey, we sample these sub-L* galaxies with source-plane resolutions of a few hundred parsecs ensuring well-sampled 2-D velocity data and resolved variations in the gas-phase metallicity. Such high spatial resolution data offers a critical check on the structural properties of larger samples derived with coarser sampling using multiple-IFU instruments. We demonstrate how serious errors of interpretation can only be revealed through better sampling. Although we include four sources from our earlier work, the present study provides a more representative sample unbiased with respect to emission line strength. Contrary t...

  14. Neuronal Hypoxia Induces Hsp40-Mediated Nuclear Import of Type 3 Deiodinase As an Adaptive Mechanism to Reduce Cellular Metabolism

    Jo, S.; Kallo, I.; Bardoczi, Z.; Arrojo e Drigo, R.; Zeold, A.; Liposits, Z.; Oliva, A.; Lemmon, V. P.; Bixby, J. L.; Gereben, B.; Bianco, A. C.

    2012-01-01

    In neurons, the type 3 deiodinase (D3) inactivates thyroid hormone and reduces oxygen consumption, thus creating a state of cell-specific hypothyroidism. Here we show that hypoxia leads to nuclear import of D3 in neurons, without which thyroid hormone signaling and metabolism cannot be reduced. After unilateral hypoxia in the rat brain, D3 protein level is increased predominantly in the nucleus of the neurons in the pyramidal and granular ipsilateral layers, as well as in the hilus of the den...

  15. 76 FR 65165 - Importation of Plants for Planting; Risk-Based Sampling and Inspection Approach and Propagative...

    2011-10-20

    ..., to be planted or replanted. The definition of plant in that section includes any plant (including any... October 17, 2011. The risk-based sampling will be implemented following further analysis of the sampling... planting infested with quarantine pests do not enter the United States, while providing a...

  16. Swarm-founding in the polistine wasps: the importance of finding many microsatellite loci in studies of adaptation.

    Henshaw, M T; Strassmann, J E; Queller, D C

    2001-01-01

    We developed 52 microsatellite loci for the wasp, Polybioides tabidus, for the purpose of studying the evolution and inclusive fitness consequences of swarm-founding. The large number of loci is important for three reasons that may apply to many other systems. Heterozygosity was low in our target species, yet we found enough polymorphic loci for accurate kinship studies in this species. Many monomorphic loci were polymorphic in other polistine wasps, making comparative studies possible. Finally, enough loci amplified over a broad range of species to add a historical dimension. We sequenced six loci in other polistine wasps and used the flanking sequences to construct a phylogeny. Based on this phylogeny, we infer that swarm-founding has evolved independently three times in the polistine wasps. PMID:11251797

  17. Perceptions of Australian marine protected area managers regarding the role, importance, and achievability of adaptation for managing the risks of climate change

    Christopher Cvitanovic

    2014-12-01

    Full Text Available The rapid development of adaptation as a mainstream strategy for managing the risks of climate change has led to the emergence of a broad range of adaptation policies and management strategies globally. However, the success of such policies or management interventions depends on the effective integration of new scientific research into the decision-making process. Ineffective communication between scientists and environmental decision makers represents one of the key barriers limiting the integration of science into the decision-making process in many areas of natural resource management. This can be overcome by understanding the perceptions of end users, so as to identify knowledge gaps and develop improved and targeted strategies for communication and engagement. We assessed what one group of environmental decision makers, Australian marine protected area (MPA managers, viewed as the major risks associated with climate change, and their perceptions regarding the role, importance, and achievability of adaptation for managing these risks. We also assessed what these managers perceived as the role of science in managing the risks from climate change, and identified the factors that increased their trust in scientific information. We do so by quantitatively surveying 30 MPA managers across 3 Australian management agencies. We found that although MPA managers have a very strong awareness of the range and severity of risks posed by climate change, their understanding of adaptation as an option for managing these risks is less comprehensive. We also found that although MPA managers view science as a critical source of information for informing the decision-making process, it should be considered in context with other knowledge types such as community and cultural knowledge, and be impartial, evidence based, and pragmatic in outlining policy and management recommendations that are realistically achievable.

  18. A quasi-exclusive European ancestry in the Senepol tropical cattle breed highlights the importance of the slick locus in tropical adaptation.

    Laurence Flori

    Full Text Available BACKGROUND: The Senepol cattle breed (SEN) was created in the early 20th century from a presumed cross between a European taurine (EUT) breed (Red Poll) and a West African taurine (AFT) breed (N'Dama). Well adapted to tropical conditions, it is also believed to be trypanotolerant in line with its putative AFT ancestry. However, such origins needed to be verified to define relevant husbandry practices, and the genetic background underlying such adaptation needed to be characterized. METHODOLOGY/PRINCIPAL FINDINGS: We genotyped 153 SEN individuals on 47,365 SNPs and combined the resulting data with those available on 18 other populations representative of EUT, AFT and Zebu (ZEB) cattle. We found on average 89% EUT, 10.4% ZEB and 0.6% AFT ancestries in the SEN genome. We further looked for footprints of recent selection using standard tests based on the extent of haplotype homozygosity. We underlined (i) three footprints on chromosome (BTA) 01, two of which are within or close to the polled locus underlying the absence of horns, and (ii) one footprint on BTA20 within the slick hair coat locus, involved in thermotolerance. Annotation of these regions allowed us to propose three candidate genes to explain the observed signals (TIAM1, GRIK1 and RAI14). CONCLUSIONS/SIGNIFICANCE: Our results do not support the accepted view of the AFT origin of the SEN breed. Initial AFT ancestry (if any) might have been counter-selected in early generations due to breeding objectives oriented in particular toward meat production and a hornless phenotype. Therefore, SEN animals are likely susceptible to African trypanosomes, which questions the importation of SEN within the West African tsetse belt, as promoted by some breeding societies. Besides, our results revealed that the SEN breed is predominantly a EUT breed well adapted to tropical conditions and confirmed the importance of the slick locus in thermotolerance.

  19. Climate impacts on European agriculture and water management in the context of adaptation and mitigation-The importance of an integrated approach

    We review and qualitatively assess the importance of interactions and feedbacks in assessing climate change impacts on water and agriculture in Europe. We focus particularly on the impact of future hydrological changes on agricultural greenhouse gas (GHG) mitigation and adaptation options. Future projected trends in European agriculture include northward movement of crop suitability zones and increasing crop productivity in Northern Europe, but declining productivity and suitability in Southern Europe. This may be accompanied by a widening of water resource differences between the North and South, and an increase in extreme rainfall events and droughts. Changes in future hydrology and water management practices will influence agricultural adaptation measures and alter the effectiveness of agricultural mitigation strategies. These interactions are often highly complex and influenced by a number of factors which are themselves influenced by climate. Mainly positive impacts may be anticipated for Northern Europe, where agricultural adaptation may be shaped by reduced vulnerability of production, increased water supply and reduced water demand. However, increasing flood hazards may present challenges for agriculture, and summer irrigation shortages may result from earlier spring runoff peaks in some regions. Conversely, the need for effective adaptation will be greatest in Southern Europe as a result of increased production vulnerability, reduced water supply and increased demands for irrigation. Increasing flood and drought risks will further contribute to the need for robust management practices. The impacts of future hydrological changes on agricultural mitigation in Europe will depend on the balance between changes in productivity and rates of decomposition and GHG emission, both of which depend on climatic, land and management factors. Small increases in European soil organic carbon (SOC) stocks per unit land area are anticipated considering changes in climate

  20. Bottom–up protein identifications from microliter quantities of individual human tear samples. Important steps towards clinical relevance.

    Peter Raus

    2015-12-01

    With 375 confidently identified proteins in the healthy adult tear, the obtained results are comprehensive and in large agreement with previously published observations on pooled samples of multiple patients. We conclude that, to a limited extent, bottom–up tear protein identifications from individual patients may have clinical relevance.

  1. Liver kinetics of glucose analogs measured in pigs by PET: importance of dual-input blood sampling

    Munk, O L; Bass, L; Roelsgaard, K; Bender, D; Hansen, S B; Keiding, S

    2001-01-01

    Metabolic processes studied by PET are quantified traditionally using compartmental models, which relate the time course of the tracer concentration in tissue to that in arterial blood. For liver studies, the use of arterial input may, however, cause systematic errors to the estimated kinetic parameters, because of ignorance of the dual blood supply from the hepatic artery and the portal vein to the liver. METHODS: Six pigs underwent PET after [15O]carbon monoxide inhalation, 3-O-[11C]methylglucose (MG) injection, and [18F]FDG injection. For the glucose scans, PET data were acquired for 90 min. Hepatic arterial and portal venous blood samples and flows were measured during the scan. The dual-input function was calculated as the flow-weighted input. RESULTS: For both MG and FDG, the compartmental analysis using arterial input led to systematic underestimation of the rate constants for rapid blood...
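
    The flow-weighted dual-input function mentioned in the methods reduces to a one-line combination of the two measured input curves (Python; shapes, units and a shared sampling grid are assumed):

        import numpy as np

        def dual_input(c_art, c_pv, f_art, f_pv):
            # Flow-weighted mixture of hepatic arterial and portal venous
            # tracer concentration curves sampled on a common time grid.
            c_art, c_pv = np.asarray(c_art), np.asarray(c_pv)
            return (f_art * c_art + f_pv * c_pv) / (f_art + f_pv)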

  2. Mapping transmission risk of Lassa fever in West Africa: the importance of quality control, sampling bias, and error weighting.

    Peterson, A Townsend; Moses, Lina M; Bausch, Daniel G

    2014-01-01

    Lassa fever is a disease that has been reported from sites across West Africa; it is caused by an arenavirus that is hosted by the rodent M. natalensis. Although it is confined to West Africa, and has been documented in detail in some well-studied areas, the details of the distribution of risk of Lassa virus infection remain poorly known at the level of the broader region. In this paper, we explored the effects of certainty of diagnosis, oversampling in well-studied regions, and error balance on the results of mapping exercises. Each of the three factors assessed in this study had clear and consistent influences on model results, overestimating risk in southern, humid zones in West Africa, and underestimating risk in drier and more northern areas. The final, adjusted risk map indicates broad risk areas across much of West Africa. Although risk maps are increasingly easy to develop from disease occurrence data and raster data sets summarizing aspects of environments and landscapes, this process is highly sensitive to issues of data quality, sampling design, and design of analysis, with macrogeographic implications of each of these issues and the potential for misrepresenting real patterns of risk. PMID:25105746

  5. Who art thou? Personality predictors of artistic preferences in a large UK sample: the importance of openness.

    Chamorro-Premuzic, Tomas; Reimers, Stian; Hsu, Anne; Ahmetoglu, Gorkan

    2009-08-01

    The present study examined individual differences in artistic preferences in a sample of 91,692 participants (60% women and 40% men), aged 13-90 years. Participants completed a Big Five personality inventory (Goldberg, 1999) and provided preference ratings for 24 different paintings corresponding to cubism, renaissance, impressionism, and Japanese art, which loaded on to a latent factor of overall art preferences. As expected, the personality trait openness to experience was the strongest and only consistent personality correlate of artistic preferences, affecting both overall and specific preferences, as well as visits to galleries, and artistic (rather than scientific) self-perception. Overall preferences were also positively influenced by age and visits to art galleries, and to a lesser degree, by artistic self-perception and conscientiousness (negatively). As for specific styles, after overall preferences were accounted for, more agreeable, more conscientious and less open individuals reported higher preference levels for impressionist art; younger and more extraverted participants showed higher levels of preference for cubism (as did males); and younger participants, as well as males, reported higher levels of preference for renaissance art. Limitations and recommendations for future research are discussed. PMID:19026107

  6. Sonochemical degradation of ethyl paraben in environmental samples: Statistically important parameters determining kinetics, by-products and pathways.

    Papadopoulos, Costas; Frontistis, Zacharias; Antonopoulou, Maria; Venieri, Danae; Konstantinou, Ioannis; Mantzavinos, Dionissios

    2016-07-01

    The sonochemical degradation of ethyl paraben (EP), a representative of the parabens family, was investigated. Experiments were conducted at a constant ultrasound frequency of 20 kHz and liquid bulk temperature of 30°C in the following range of experimental conditions: EP concentration 250-1250 μg/L, ultrasound (US) density 20-60 W/L, reaction time up to 120 min, initial pH 3-8 and sodium persulfate 0-100 mg/L, either in ultrapure water or secondary treated wastewater. A factorial design methodology was adopted to elucidate the statistically important effects and their interactions, and a full empirical model comprising seventeen terms was originally developed. Omitting several terms of lower significance, a reduced model that can reliably simulate the process was finally proposed; this includes EP concentration, reaction time, power density and initial pH, as well as the interactions (EP concentration)×(US density), (EP concentration)×(pHo) and (EP concentration)×(time). Experiments at an increased EP concentration of 3.5 mg/L were also performed to identify degradation by-products. LC-TOF-MS analysis revealed that EP sonochemical degradation occurs through dealkylation of the ethyl chain to form methyl paraben, while successive hydroxylation of the aromatic ring yields 4-hydroxybenzoic, 2,4-dihydroxybenzoic and 3,4-dihydroxybenzoic acids. By-products are less toxic to the bacterium V. fischeri than the parent compound. PMID:26964924
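
    A reduced empirical model of this kind can be fitted by ordinary least squares on coded factors, as sketched below (Python; the factor names follow the abstract, but the synthetic responses and fitted coefficients are fabricated purely for illustration and do not reproduce the paper's model):

        import numpy as np

        rng = np.random.default_rng(5)
        # Hypothetical coded factors in [-1, 1]: EP concentration, time,
        # US density and initial pH, with a fake response surface.
        C, t, D, pH = (rng.uniform(-1, 1, 40) for _ in range(4))
        y = 50 + 8*t + 5*D - 4*C - 3*pH + 2*C*D + rng.normal(0, 1, 40)

        X = np.column_stack([
            np.ones_like(C), C, t, D, pH,   # intercept and main effects
            C*D, C*pH, C*t,                 # retained two-way interactions
        ])
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        for name, b in zip(["1", "C", "t", "D", "pH", "CxD", "CxpH", "Cxt"], coef):
            print(f"{name:>5s}: {b:+.2f}")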

  7. Virulence Characterisation of Salmonella enterica Isolates of Differing Antimicrobial Resistance Recovered from UK Livestock and Imported Meat Samples

    Card, Roderick; Vaughan, Kelly; Bagnall, Mary; Spiropoulos, John; Cooley, William; Strickland, Tony; Davies, Rob; Anjum, Muna F.

    2016-01-01

    Salmonella enterica is a foodborne zoonotic pathogen of significant public health concern. We have characterized the virulence and antimicrobial resistance gene content of 95 Salmonella isolates from 11 serovars by DNA microarray recovered from UK livestock or imported meat. Genes encoding resistance to sulphonamides (sul1, sul2), tetracycline [tet(A), tet(B)], streptomycin (strA, strB), aminoglycoside (aadA1, aadA2), beta-lactam (blaTEM), and trimethoprim (dfrA17) were common. Virulence gene content differed between serovars; S. Typhimurium formed two subclades based on virulence plasmid presence. Thirteen isolates were selected by their virulence profile for pathotyping using the Galleria mellonella pathogenesis model. Infection with a chicken invasive S. Enteritidis or S. Gallinarum isolate, a multidrug resistant S. Kentucky, or a S. Typhimurium DT104 isolate resulted in high mortality of the larvae; notably presence of the virulence plasmid in S. Typhimurium was not associated with increased larvae mortality. Histopathological examination showed that infection caused severe damage to the Galleria gut structure. Enumeration of intracellular bacteria in the larvae 24 h post-infection showed increases of up to 7 log above the initial inoculum and transmission electron microscopy (TEM) showed bacterial replication in the haemolymph. TEM also revealed the presence of vacuoles containing bacteria in the haemocytes, similar to Salmonella containing vacuoles observed in mammalian macrophages; although there was no evidence from our work of bacterial replication within vacuoles. This work shows that microarrays can be used for rapid virulence genotyping of S. enterica and that the Galleria animal model replicates some aspects of Salmonella infection in mammals. These procedures can be used to help inform on the pathogenicity of isolates that may be antibiotic resistant and have scope to aid the assessment of their potential public and animal health risk. PMID:27199965

  8. A Keck Adaptive Optics Survey of a Representative Sample of Gravitationally Lensed Star-forming Galaxies: High Spatial Resolution Studies of Kinematics and Metallicity Gradients

    Leethochawalit, Nicha; Jones, Tucker A.; Ellis, Richard S.; Stark, Daniel P.; Richard, Johan; Zitrin, Adi; Auger, Matthew

    2016-04-01

    We discuss spatially resolved emission line spectroscopy secured for a total sample of 15 gravitationally lensed star-forming galaxies at a mean redshift of z ≃ 2 based on Keck laser-assisted adaptive optics observations undertaken with the recently improved OSIRIS integral field unit (IFU) spectrograph. By exploiting gravitationally lensed sources drawn primarily from the CASSOWARY survey, we sample these sub-L* galaxies with source-plane resolutions of a few hundred parsecs, ensuring well-sampled 2D velocity data and resolved variations in the gas-phase metallicity. Such high spatial resolution data offer a critical check on the structural properties of larger samples derived with coarser sampling using multiple-IFU instruments. We demonstrate how kinematic complexities essential to understanding the maturity of an early star-forming galaxy can often only be revealed with better sampled data. Although we include four sources from our earlier work, the present study provides a more representative sample unbiased with respect to emission line strength. Contrary to earlier suggestions, our data indicate a more diverse range of kinematic and metal gradient behavior, inconsistent with a simple picture of well-ordered rotation developing concurrently with established steep metal gradients in all but merging systems. Comparing our observations with the predictions of hydrodynamical simulations suggests that gas and metals have been mixed by outflows or other strong feedback processes, flattening the metal gradients in early star-forming galaxies.
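
    The "metal gradient" measured in such studies is the slope of a straight-line fit of gas-phase metallicity against galactocentric radius. The sketch below shows that computation on hypothetical resolved measurements; it is a simplified illustration, not the survey's actual pipeline.

        import numpy as np

        # Hypothetical source-plane measurements for one lensed galaxy:
        # radius (kpc) and gas-phase 12 + log(O/H) per spaxel bin.
        radius_kpc  = np.array([0.3, 0.8, 1.4, 2.1, 2.9, 3.6])
        metallicity = np.array([8.52, 8.50, 8.47, 8.46, 8.44, 8.43])

        # Gradient in dex/kpc: a steep negative slope suggests inside-out
        # enrichment; a flat slope suggests mixing by outflows/feedback.
        slope, intercept = np.polyfit(radius_kpc, metallicity, 1)
        print(f"gradient = {slope:+.3f} dex/kpc, "
              f"central 12+log(O/H) = {intercept:.2f}")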

  9. FACE Analysis as a Fast and Reliable Methodology to Monitor the Sulfation and Total Amount of Chondroitin Sulfate in Biological Samples of Clinical Importance

    Evgenia Karousou

    2014-06-01

    Glycosaminoglycans (GAGs), due to their hydrophilic character and high anionic charge densities, play important roles in various (patho)physiological processes. The identification and quantification of GAGs in biological samples and tissues could be useful prognostic and diagnostic tools in pathological conditions. Despite the noteworthy progress in the development of sensitive and accurate methodologies for the determination of GAGs, there is a significant lack of protocols for sample preparation and of reliable, fast analysis methods enabling the simultaneous analysis of several biological samples. In this report, protocols developed for the isolation of GAGs from biological samples were applied to analyze various sulfated chondroitin sulfate- and hyaluronan-derived disaccharides using fluorophore-assisted carbohydrate electrophoresis (FACE). Applications to biological samples of clinical importance include blood serum, lens capsule tissue and urine. The sample preparation protocol followed by FACE analysis allows quantification with optimal linearity over the concentration range 1.0–220.0 µg/mL, affording a limit of quantitation of 50 ng of disaccharides. Validation of the FACE results was performed by capillary electrophoresis and high performance liquid chromatography techniques.
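
    Quantitation over a linear range, as described above, reduces to a calibration line fitted to standards. Below is a minimal sketch with hypothetical FACE band intensities (not the paper's calibration data).

        import numpy as np

        # Hypothetical standards: disaccharide concentration (ug/mL)
        # versus measured band fluorescence (arbitrary units).
        conc_std   = np.array([1.0, 5.0, 20.0, 80.0, 220.0])
        signal_std = np.array([210.0, 1050.0, 4180.0, 16900.0, 46100.0])

        # Least-squares line over the reported linear range (1.0-220.0 ug/mL).
        slope, intercept = np.polyfit(conc_std, signal_std, 1)

        # Quantify an unknown sample from its band intensity.
        signal_unknown = 9800.0
        conc_unknown = (signal_unknown - intercept) / slope
        print(f"estimated concentration: {conc_unknown:.1f} ug/mL")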

  10. Adaptation of the Participant Role Scale (PRS) in a Spanish youth sample: measurement invariance across gender and relationship with sociometric status.

    Lucas-Molina, Beatriz; Williamson, Ariel A; Pulido, Rosa; Calderón, Sonsoles

    2014-11-01

    In recent years, bullying research has transitioned from investigating the characteristics of the bully-victim dyad to examining bullying as a group-level process in which the majority of children play some kind of role. This study used a shortened adaptation of the Participant Role Scale (PRS) to identify these roles in a representative sample of 2,050 Spanish children aged 8 to 13 years. Confirmatory factor analysis revealed three distinct roles, indicating that the adapted scale remains a reliable way to distinguish the Bully, Defender, and Outsider roles. In addition, measurement invariance of the adapted scale was examined to analyze possible gender differences among the roles. Peer status was assessed separately by gender through two sociometric procedures: the nominations-based method and the ratings-based method. Across genders, children in the Bully role were more often rated as rejected, whereas Defenders were more popular. Results suggest that although the PRS can reveal several different peer roles in the bullying process, a clearer distinction between bullying roles (i.e., Bully, Assistant, and Reinforcer) could better inform strategies for bullying interventions. PMID:24707035
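
    For readers unfamiliar with the nominations-based sociometric method mentioned above, the sketch below computes social preference and impact scores and flags popular/rejected children using the widely cited Coie, Dodge and Coppotelli (1982) criteria. The nomination counts are invented for illustration, and the thresholds are one common convention, not necessarily those used in this study.

        import numpy as np

        # Hypothetical classroom nominations: "liked most" (lm) and
        # "liked least" (ll) counts per child.
        lm = np.array([9.0, 1.0, 4.0, 0.0, 6.0, 2.0, 3.0, 5.0])
        ll = np.array([0.0, 7.0, 2.0, 8.0, 1.0, 5.0, 3.0, 1.0])

        def z(x):
            return (x - x.mean()) / x.std()

        preference = z(lm) - z(ll)   # social preference
        impact     = z(lm) + z(ll)   # social impact (computed for completeness)

        popular  = (preference >  1.0) & (z(lm) > 0) & (z(ll) < 0)
        rejected = (preference < -1.0) & (z(ll) > 0) & (z(lm) < 0)
        print("popular children:",  np.where(popular)[0])
        print("rejected children:", np.where(rejected)[0])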