WorldWideScience

Sample records for adaptive importance sampling

  1. Adaptive Importance Sampling in Particle Filtering

    Šmídl, Václav; Hofman, Radek

    Istanbul: ISIF, 2013. ISBN 978-605-86311-1-3. [16th International Conference on Information Fusion, Istanbul (TR), 09.07.2013-12.07.2013]. R&D Projects: GA MV VG20102013018; GA ČR(CZ) GAP102/11/0437. Keywords: importance sampling; sequential Monte Carlo; sufficient statistics. Subject RIV: BC - Control Systems Theory. http://library.utia.cas.cz/separaty/2013/AS/smidl-adaptive importance sampling in particle filtering.pdf

  2. Adaptive Importance Sampling for Control and Inference

    Kappen, H. J.; Ruiz, H. C.

    2016-03-01

    Path integral (PI) control problems are a restricted class of non-linear control problems that can be solved formally as a Feynman-Kac PI and can be estimated using Monte Carlo sampling. In this contribution we review PI control theory in the finite horizon case. We subsequently focus on the problem of how to compute and represent control solutions. We review the most commonly used methods in robotics and control. Within the PI theory, the question of how to compute becomes the question of importance sampling. Efficient importance samplers are state feedback controllers, and the use of these requires an efficient representation. Learning and representing effective state-feedback controllers for non-linear stochastic control problems is a very challenging, and largely unsolved, problem. We show how to learn and represent such controllers using ideas from the cross entropy method. We derive a gradient descent method that allows one to learn feedback controllers using an arbitrary parametrisation. We refer to this method as the path integral cross entropy method, or PICE. We illustrate this method for some simple examples. The PI control methods can be used to estimate the posterior distribution in latent state models. In neuroscience these problems arise when estimating connectivity from neural recording data using EM. We demonstrate the PI control method as an accurate alternative to particle filtering.
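
    As a hedged illustration of the cross-entropy idea invoked here, the sketch below tunes a one-dimensional linear feedback gain by exponentially reweighting sampled rollout costs, path-integral style. It is an editor's toy example, not the authors' PICE algorithm: the dynamics, quadratic cost, temperature lam, and the name rollout_cost are all invented.

      # Cross-entropy-style tuning of a scalar feedback gain (illustrative sketch).
      import numpy as np

      rng = np.random.default_rng(0)

      def rollout_cost(K, T=50, dt=0.1, noise=0.3):
          """Simulate x' = u + noise with u = -K*x and accumulate a quadratic cost."""
          x, cost = 1.0, 0.0
          for _ in range(T):
              u = -K * x
              cost += dt * (x**2 + 0.1 * u**2)
              x += dt * u + np.sqrt(dt) * noise * rng.standard_normal()
          return cost

      mu, sigma, lam = 0.0, 1.0, 0.5                    # proposal over the gain K; lam = temperature
      for it in range(20):
          K = mu + sigma * rng.standard_normal(200)     # sample candidate gains
          J = np.array([rollout_cost(k) for k in K])
          w = np.exp(-(J - J.min()) / lam)              # exponential (path-integral-style) weights
          w /= w.sum()
          mu = np.sum(w * K)                            # reweighted moment-matching update
          sigma = max(np.sqrt(np.sum(w * (K - mu)**2)), 1e-3)
      print("tuned feedback gain:", round(mu, 3))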

  3. Adaptive importance sampling of random walks on continuous state spaces

    The authors consider adaptive importance sampling for a random walk with scoring in a general state space. Conditions under which exponential convergence occurs to the zero-variance solution are reviewed. These results generalize previous work for finite, discrete state spaces in Kollman (1993) and in Kollman, Baggerly, Cox, and Picard (1996). This paper is intended for nonstatisticians and includes considerable explanatory material.
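
    The generic adaptive importance sampling loop behind records like this one can be sketched in a few lines: draw from the current proposal, weight against the nominal density, and re-fit the proposal by weighted moment matching so it drifts toward the zero-variance density. The rare tail probability below is an illustrative stand-in for the random-walk score, and all settings are invented.

      # Minimal adaptive (population Monte Carlo style) importance sampling loop.
      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(1)
      p = norm(0.0, 1.0)                      # nominal density
      h = lambda x: (x > 4.0).astype(float)   # rare "score": P(X > 4) ~ 3.2e-5

      mu, sd = 0.0, 3.0                       # deliberately wide initial proposal
      for it in range(10):
          q = norm(mu, sd)
          x = q.rvs(size=5000, random_state=rng)
          w = h(x) * p.pdf(x) / q.pdf(x)      # importance weights
          est = w.mean()
          if w.sum() > 0:                     # moment-match toward the zero-variance density
              wn = w / w.sum()
              mu = np.sum(wn * x)
              sd = max(np.sqrt(np.sum(wn * (x - mu) ** 2)), 0.1)
      print(f"adaptive IS estimate {est:.2e}  (exact {norm.sf(4.0):.2e})")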

  4. AIS-BN: An Adaptive Importance Sampling Algorithm for Evidential Reasoning in Large Bayesian Networks

    Cheng, J.; DOI: 10.1613/jair.764

    2011-01-01

    Stochastic sampling algorithms, while an attractive alternative to exact algorithms in very large Bayesian network models, have been observed to perform poorly in evidential reasoning with extremely unlikely evidence. To address this problem, we propose an adaptive importance sampling algorithm, AIS-BN, that shows promising convergence rates even under extreme conditions and seems to outperform the existing sampling algorithms consistently. Three sources of this performance improvement are (1) two heuristics for initialization of the importance function that are based on the theoretical properties of importance sampling in finite-dimensional integrals and the structural advantages of Bayesian networks, (2) a smooth learning method for the importance function, and (3) a dynamic weighting function for combining samples from different stages of the algorithm. We tested the performance of the AIS-BN algorithm along with two state-of-the-art general-purpose sampling algorithms, likelihood weighting (Fung and Chang...
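
    For orientation, the baseline that AIS-BN improves upon, likelihood weighting, fits in a dozen lines: unobserved nodes are sampled from their conditional probability tables, and each sample is weighted by the probability of the evidence given its parents. The sketch below uses the textbook sprinkler network with standard illustrative CPT values, not anything from the paper.

      # Likelihood weighting on the textbook sprinkler Bayesian network.
      import numpy as np

      rng = np.random.default_rng(2)

      def likelihood_weighting(n=100_000):
          """Estimate P(Rain=1 | WetGrass=1): sample unobserved nodes from their
          CPTs, weight each sample by P(evidence | parents)."""
          num = den = 0.0
          for _ in range(n):
              cloudy    = rng.random() < 0.5
              rain      = rng.random() < (0.8 if cloudy else 0.2)
              sprinkler = rng.random() < (0.1 if cloudy else 0.5)
              # the evidence WetGrass=1 is not sampled; it contributes a weight
              w = {(0, 0): 0.0, (0, 1): 0.9, (1, 0): 0.9, (1, 1): 0.99}[(int(sprinkler), int(rain))]
              num += w * rain
              den += w
          return num / den

      print("P(Rain=1 | WetGrass=1) ~", round(likelihood_weighting(), 3))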

  5. Adaptive Importance Sampling for Performance Evaluation and Parameter Optimization of Communication Systems

    Remondo, David; Srinivasan, Rajan; Nicola, Victor F.; Etten, van Wim C.; Tattje, Henk E.P.

    2000-01-01

    We present new adaptive importance sampling techniques based on stochastic Newton recursions. Their applicability to the performance evaluation of communication systems is studied. Besides bit-error rate (BER) estimation, the techniques are used for system parameter optimization. Two system models that are analytically tractable are employed to demonstrate the validity of the techniques. As an application to situations that are analytically intractable and numerically intensive, the influence...

  6. An adaptive importance sampling algorithm for Bayesian inversion with multimodal distributions

    Parametric uncertainties are encountered in the simulations of many physical systems, and may be reduced by an inverse modeling procedure that calibrates the simulation results to observations on the real system being simulated. Following Bayes' rule, a general approach for inverse modeling problems is to sample from the posterior distribution of the uncertain model parameters given the observations. However, the large number of repetitive forward simulations required in the sampling process could pose a prohibitive computational burden. This difficulty is particularly challenging when the posterior is multimodal. We present in this paper an adaptive importance sampling algorithm to tackle these challenges. Two essential ingredients of the algorithm are: 1) a Gaussian mixture (GM) model adaptively constructed as the proposal distribution to approximate the possibly multimodal target posterior, and 2) a mixture of polynomial chaos (PC) expansions, built according to the GM proposal, as a surrogate model to alleviate the computational burden caused by computationally demanding forward model evaluations. In three illustrative examples, the proposed adaptive importance sampling algorithm demonstrates its capability of automatically finding a GM proposal with an appropriate number of modes for the specific problem under study, and of obtaining a sample that accurately and efficiently represents the posterior with a limited number of forward simulations.
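
    A minimal version of the first ingredient, self-normalized importance sampling with a Gaussian-mixture proposal refined by a weighted EM step, is sketched below on a toy bimodal posterior. The target and all settings are invented for illustration, and no polynomial chaos surrogate is included.

      # Self-normalized IS with a two-component Gaussian-mixture proposal.
      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(3)
      log_post = lambda x: np.logaddexp(norm.logpdf(x, -3, 0.5), norm.logpdf(x, 2, 0.7))

      pi, mu, sd = np.array([0.5, 0.5]), np.array([-4.0, 3.0]), np.array([2.0, 2.0])
      for sweep in range(5):
          comp = rng.choice(2, size=4000, p=pi)              # sample mixture components
          x = rng.normal(mu[comp], sd[comp])
          q = pi[0] * norm.pdf(x, mu[0], sd[0]) + pi[1] * norm.pdf(x, mu[1], sd[1])
          w = np.exp(log_post(x)) / q                        # unnormalized IS weights
          w /= w.sum()
          r = np.stack([pi[k] * norm.pdf(x, mu[k], sd[k]) for k in range(2)])
          r /= r.sum(axis=0)                                 # component responsibilities
          for k in range(2):                                 # weighted EM refinement
              wk = w * r[k]
              tot = wk.sum() + 1e-300
              pi[k] = wk.sum()
              mu[k] = np.sum(wk * x) / tot
              sd[k] = max(np.sqrt(np.sum(wk * (x - mu[k])**2) / tot), 0.1)
          pi /= pi.sum()
      print("posterior mean estimate:", round(float(np.sum(w * x)), 3))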

  7. An improved adaptive kriging-based importance technique for sampling multiple failure regions of low probability

    The estimation of system failure probabilities may be a difficult task when the values involved are very small, so that sampling-based Monte Carlo methods may become computationally impractical, especially if the computer codes used to model the system response require large computational efforts, both in terms of time and memory. This paper proposes a modification of an algorithm from the literature for the efficient estimation of small failure probabilities, which combines FORM with an adaptive kriging-based importance sampling strategy (AK-IS). The modification overcomes an important limitation of the original AK-IS in that it provides the algorithm with the flexibility to deal with multiple failure regions characterized by complex, non-linear limit states. The modified algorithm is shown to offer satisfactory results on four case studies from the literature, in general outperforming several alternative methods. - Highlights: • We tackle low failure probability estimation within the reliability analysis context. • We improve a kriging-based importance sampling for estimating failure probabilities. • The new algorithm is capable of dealing with multiple disconnected failure regions. • Its performance is better than that of other methods from the literature on 4 test case studies.

  8. Network and adaptive sampling

    Chaudhuri, Arijit

    2014-01-01

    Combining the two statistical techniques of network sampling and adaptive sampling, this book illustrates the advantages of using them in tandem to effectively capture sparsely located elements in unknown pockets. It shows how network sampling is a reliable guide in capturing inaccessible entities through linked auxiliaries. The text also explores how adaptive sampling is strengthened in information content through subsidiary sampling with devices to mitigate unmanageable expanding sample sizes. Empirical data illustrates the applicability of both methods.

  9. ADAPTIVE ANNEALED IMPORTANCE SAMPLING FOR MULTIMODAL POSTERIOR EXPLORATION AND MODEL SELECTION WITH APPLICATION TO EXTRASOLAR PLANET DETECTION

    Liu, Bin, E-mail: bins@ieee.org [School of Computer Science and Technology, Nanjing University of Posts and Telecommunications, Nanjing 210023 (China)

    2014-07-01

    We describe an algorithm that can adaptively provide mixture summaries of multimodal posterior distributions. The parameter space of the involved posteriors ranges in size from a few dimensions to dozens of dimensions. This work was motivated by an astrophysical problem called extrasolar planet (exoplanet) detection, wherein the computation of stochastic integrals that are required for Bayesian model comparison is challenging. The difficulty comes from the highly nonlinear models that lead to multimodal posterior distributions. We resort to importance sampling (IS) to estimate the integrals, thus translating the problem into one of finding a parametric approximation of the posterior. To capture the multimodal structure in the posterior, we initialize a mixture proposal distribution and then carefully tailor its parameters to make it resemble the posterior as closely as possible. We use the effective sample size (ESS), calculated from the IS draws, to measure the degree of approximation: the bigger the ESS, the better the proposal resembles the posterior. A difficulty within this tailoring operation lies in adjusting the number of mixture components in the proposal. Brute-force methods simply preset it as a large constant, which increases the required computational resources. We provide an iterative delete/merge/add process, which works in tandem with an expectation-maximization step to tailor this number online. The efficiency of our proposed method is tested via both simulation studies and real exoplanet data analysis.
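
    The ESS diagnostic used here to score proposal quality is simple to compute from importance weights; the sketch below does so from log-weights for numerical stability. This is the standard formula, not code from the paper.

      # Effective sample size (ESS) of self-normalized importance weights.
      import numpy as np

      def effective_sample_size(log_w):
          """ESS = (sum w)^2 / sum(w^2), computed from log-weights for stability."""
          log_w = np.asarray(log_w) - np.max(log_w)   # stabilize before exponentiating
          w = np.exp(log_w)
          return w.sum() ** 2 / np.sum(w ** 2)

      # near-uniform weights give ESS ~ n; one dominant weight gives ESS ~ 1
      print(effective_sample_size(np.zeros(1000)))          # -> 1000.0
      print(effective_sample_size([0.0] + [-50.0] * 999))   # -> ~1.0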

  10. Covariance-Adaptive Slice Sampling

    Thompson, Madeleine; Neal, Radford M.

    2010-01-01

    We describe two slice sampling methods for taking multivariate steps using the crumb framework. These methods use the gradients at rejected proposals to adapt to the local curvature of the log-density surface, a technique that can produce much better proposals when parameters are highly correlated. We evaluate our methods on four distributions and compare their performance to that of a non-adaptive slice sampling method and a Metropolis method. The adaptive methods perform favorably on low-di...

  11. Adaptive sampling for noisy problems

    Cantu-Paz, E

    2004-03-26

    The usual approach to deal with noise present in many real-world optimization problems is to take an arbitrary number of samples of the objective function and use the sample average as an estimate of the true objective value. The number of samples is typically chosen arbitrarily and remains constant for the entire optimization process. This paper studies an adaptive sampling technique that varies the number of samples based on the uncertainty of deciding between two individuals. Experiments demonstrate the effect of adaptive sampling on the final solution quality reached by a genetic algorithm and the computational cost required to find the solution. The results suggest that the adaptive technique can effectively eliminate the need to set the sample size a priori, but in many cases it requires high computational costs.
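
    One simple instance of the idea, deciding between two noisy candidates by sampling until a Welch-type t-statistic clears a threshold or a budget runs out, is sketched below. The threshold, budget, and helper names are invented; this is not the paper's exact rule.

      # Adaptive sample size for comparing two noisy objective values.
      import numpy as np

      rng = np.random.default_rng(4)

      def compare_adaptive(f_a, f_b, t_crit=2.0, n0=5, n_max=200):
          """Return +1 if a looks better (lower mean), -1 if b, 0 if undecided at budget."""
          a = [f_a() for _ in range(n0)]
          b = [f_b() for _ in range(n0)]
          while len(a) < n_max:
              ma, mb = np.mean(a), np.mean(b)
              se = np.sqrt(np.var(a, ddof=1) / len(a) + np.var(b, ddof=1) / len(b))
              if se > 0 and abs(ma - mb) / se > t_crit:
                  return 1 if ma < mb else -1
              a.append(f_a()); b.append(f_b())      # still ambiguous: one more sample each
          return 0

      # two individuals with true qualities 1.0 and 1.2 under unit noise
      noisy = lambda q: (lambda: q + rng.standard_normal())
      print(compare_adaptive(noisy(1.0), noisy(1.2)))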

  12. Quantization based recursive Importance Sampling

    Frikha, Noufel

    2011-01-01

    We investigate in this paper an alternative to the simulation-based recursive importance sampling procedure used to estimate the optimal change of measure for Monte Carlo simulations. We propose an algorithm which combines (vector and functional) optimal quantization with a Newton-Raphson zero search procedure. Our approach can be seen as a robust and automatic deterministic counterpart of recursive importance sampling by means of stochastic approximation, which, in practice, may require tuning and a good knowledge of the payoff function. Moreover, unlike recursive importance sampling procedures, the proposed methodology does not rely on simulations, so it is quite generic and can sit on top of Monte Carlo simulations. We first emphasize the consistency of quantization for designing an importance sampling algorithm for both multi-dimensional distributions and diffusion processes. We show that the induced error on the optimal change of measure is controlled by the mean quantizatio...
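
    The flavor of a deterministic, quantization-based search for the optimal change of measure can be conveyed with a toy: replace N(0,1) by a fixed Gauss-Hermite grid (standing in for an optimal quantizer) and run Newton-Raphson on the second moment of a mean-shifted Gaussian importance sampler. This is an editor's sketch in the same spirit, not the authors' algorithm; the payoff f is invented.

      # Deterministic Newton search for the optimal Gaussian drift theta.
      import numpy as np

      f = lambda x: np.maximum(x - 2.0, 0.0)          # illustrative call-like payoff

      # probabilists' Gauss-Hermite grid as a stand-in for an optimal quantizer
      x, w = np.polynomial.hermite_e.hermegauss(80)
      p = w / w.sum()                                 # discrete weights approximating N(0,1)

      theta = 0.0
      for _ in range(30):
          e = np.exp(-theta * x + 0.5 * theta**2)     # likelihood-ratio factor p/q
          g = np.sum(p * f(x)**2 * (theta - x) * e)   # d/dtheta of the 2nd moment
          h = np.sum(p * f(x)**2 * (1 + (theta - x)**2) * e)   # 2nd derivative (> 0)
          theta -= g / h                              # Newton step
      print("optimal drift theta ~", round(theta, 3))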

  13. Adaptive Sampling in Hierarchical Simulation

    Knap, J; Barton, N R; Hornung, R D; Arsenlis, A; Becker, R; Jefferson, D R

    2007-07-09

    We propose an adaptive sampling methodology for hierarchical multi-scale simulation. The method utilizes a moving kriging interpolation to significantly reduce the number of evaluations of finer-scale response functions to provide essential constitutive information to a coarser-scale simulation model. The underlying interpolation scheme is unstructured and adaptive to handle the transient nature of a simulation. To handle the dynamic construction and searching of a potentially large set of finer-scale response data, we employ a dynamic metric tree database. We study the performance of our adaptive sampling methodology for a two-level multi-scale model involving a coarse-scale finite element simulation and a finer-scale crystal plasticity based constitutive law.

  14. Large-Flip Importance Sampling

    Hamze, Firas; De Freitas, Nando

    2012-01-01

    We propose a new Monte Carlo algorithm for complex discrete distributions. The algorithm is motivated by the N-Fold Way, which is an ingenious event-driven MCMC sampler that avoids rejection moves at any specific state. The N-Fold Way can however get "trapped" in cycles. We surmount this problem by modifying the sampling process. This correction does introduce bias, but the bias is subsequently corrected with a carefully engineered importance sampler.

  15. A new design for sampling with adaptive sample plots

    Yang, Haijun; Kleinn, Christoph; Fehrmann, Lutz; Tang, Shouzheng; Magnussen, Steen

    2009-01-01

    Adaptive cluster sampling (ACS) is a sampling technique for sampling rare and geographically clustered populations. Aiming to enhance the practicability of ACS while maintaining some of its major characteristics, an adaptive sample plot design is introduced in this study which facilitates field work compared to “standard” ACS. The plot design is based on a conditional plot expansion: a larger plot (by a pre-defined plot size factor) is installed at a sample point instead of the smaller initia...

  16. Simulated Maximum Likelihood using Tilted Importance Sampling

    Christian N. Brinch

    2008-01-01

    This paper develops the important distinction between tilted and simple importance sampling as methods for simulating likelihood functions for use in simulated maximum likelihood. It is shown that tilted importance sampling removes a lower bound on simulation error for a given importance sample size that is inherent in simulated maximum likelihood using simple importance sampling, the main method for simulating likelihood functions in the statistics literature. In addit...

  17. Adaptive processing with signal contaminated training samples

    Besson, Olivier; Bidon, Stéphanie

    2013-01-01

    We consider the adaptive beamforming or adaptive detection problem in the case of signal contaminated training samples, i.e., when the latter may contain a signal-like component. Since this results in a significant degradation of the signal to interference and noise ratio at the output of the adaptive filter, we investigate a scheme to jointly detect the contaminated samples and subsequently take this information into account for estimation of the disturbance covariance matrix. Towards this e...

  18. Monte Carlo Integration Using Importance Sampling and Gibbs Sampling

    Hörmann, Wolfgang; Leydold, Josef

    2005-01-01

    To evaluate the expectation of a simple function with respect to a complicated multivariate density Monte Carlo integration has become the main technique. Gibbs sampling and importance sampling are the most popular methods for this task. In this contribution we propose a new simple general purpose importance sampling procedure. In a simulation study we compare the performance of this method with the performance of Gibbs sampling and of importance sampling using a vector of independent variate...
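
    For reference, the importance sampling identity E_p[f(X)] = E_q[f(X) p(X)/q(X)] that such methods build on takes only a few lines; the sketch below uses a heavy-tailed t proposal, the kind of simple general-purpose importance density advocated above. The densities and integrand are illustrative.

      # Basic importance sampling with a heavy-tailed general-purpose proposal.
      import numpy as np
      from scipy.stats import norm, t

      rng = np.random.default_rng(5)
      p = norm(0, 1)                  # target density (a stand-in for a complicated one)
      f = lambda x: np.exp(0.5 * x)   # integrand whose tails matter

      q = t(df=4, scale=1.5)          # heavier tails than p keep the weights bounded
      x = q.rvs(size=200_000, random_state=rng)
      est = np.mean(f(x) * p.pdf(x) / q.pdf(x))
      print(f"IS estimate {est:.4f}  vs exact exp(1/8) = {np.exp(0.125):.4f}")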

  1. An Adaptive Importance Sampling Theory Based on the Generalized Genetic Algorithm

    董聪; 郭晓华

    2000-01-01

    In the present paper, using the generalized genetic algorithm, the problem of finding all design points in the case of generalized multiple design points is solved; by establishing a recursion-type bound-and-classification algorithm, the problem of reducing and synthesizing generalized multiple design points is also solved. The paper shows that adaptive importance sampling theory based on the generalized genetic algorithm is a more efficient tool for the reliability simulation of nonlinear systems.

  2. Importance Sampling for the Infinite Sites Model*

    Hobolth, Asger; Uyenoyama, Marcy K.; Wiuf, Carsten

    2008-01-01

    Importance sampling or Markov chain Monte Carlo sampling is required for state-of-the-art statistical analysis of population genetics data. The applicability of these sampling-based inference techniques depends crucially on the proposal distribution. In this paper, we discuss importance sampling for the infinite sites model. The infinite sites assumption is attractive because it constrains the number of possible genealogies, thereby allowing for the analysis of larger data sets. We recall th...

  3. On Invertible Sampling and Adaptive Security

    Ishai, Yuval; Kumarasubramanian, Abishek; Orlandi, Claudio

    2011-01-01

    Secure multiparty computation (MPC) is one of the most general and well studied problems in cryptography. We focus on MPC protocols that are required to be secure even when the adversary can adaptively corrupt parties during the protocol, and under the assumption that honest parties cannot reliably erase their secrets prior to corruption. Previous feasibility results for adaptively secure MPC in this setting applied either to deterministic functionalities or to randomized functionalities which satisfy a certain technical requirement. The question whether adaptive security is possible for all functionalities was left open. We provide the first convincing evidence that the answer to this question is negative, namely that some (randomized) functionalities cannot be realized with adaptive security. We obtain this result by studying the following related invertible sampling problem: given an efficient...

  4. Adaptive Stochastic Methods for Sampling Driven Systems

    Jones, Andrew; Leimkuhler, Benedict

    2011-01-01

    Thermostatting methods are discussed in the context of canonical sampling in the presence of driving stochastic forces. Generalisations of the Nosé-Hoover method and Langevin dynamics are introduced which are able to dissipate excess heat introduced by steady Brownian perturbation (without a priori knowledge of its strength) while preserving ergodicity. Implementation and parameter selection are considered. It is demonstrated using numerical experiments that the methods derived can adaptively...

  5. A Self-adapting Stratified and Importance Sampling Method for Power System Reliability Evaluation

    王晓滨; 郭瑞鹏; 曹一家; 余秀月; 杨桂钟

    2011-01-01

    A new method for power system reliability evaluation called self-adapting stratified and importance sampling (SASIS) is presented. With SASIS, the system state space is partitioned into one contingency-free state subspace and various contingency-order state subspaces. As sampling of the contingency-free state subspace is completely avoided, SASIS converges fast in systems with high reliability. The number of samples is optimally allocated among the contingency-order state subspaces, and the optimal importance sampling probability density function is steadily rectified, which markedly increases calculating efficiency and resolves the low efficiency of earlier Monte Carlo methods in highly reliable systems. Results for the generation and transmission part of the IEEE-RTS test system show that the proposed method is rational, highly effective, and free from degradation. This work is supported by Important Zhejiang Science & Technology Specific Projects (No. 2007C11098).
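
    The stratification idea translates directly into code: partition system states by contingency order, never sample the contingency-free stratum, and recombine stratum estimates with their binomial probabilities. The sketch below does this for an invented 10-component system; it illustrates stratified sampling by contingency order, not the SASIS algorithm itself.

      # Stratified Monte Carlo reliability estimate by contingency order.
      import numpy as np
      from math import comb

      rng = np.random.default_rng(6)
      m, pf = 10, 0.02                        # 10 identical components, outage prob 0.02
      cap = np.full(m, 15.0); demand = 120.0  # system fails if surviving capacity < demand

      def p_stratum(k):                       # P(exactly k components out)
          return comb(m, k) * pf**k * (1 - pf)**(m - k)

      est = 0.0
      for k in range(1, m + 1):               # the k = 0 (contingency-free) stratum is never sampled
          fail = 0
          for _ in range(2000):
              out = rng.choice(m, size=k, replace=False)
              fail += np.delete(cap, out).sum() < demand
          est += p_stratum(k) * fail / 2000
      print(f"loss-of-load probability ~ {est:.3e}")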

  6. Feature Adaptive Sampling for Scanning Electron Microscopy

    Dahmen, Tim; Engstler, Michael; Pauly, Christoph; Trampert, Patrick; de Jonge, Niels; Mücklich, Frank; Slusallek, Philipp

    2016-01-01

    A new method for the image acquisition in scanning electron microscopy (SEM) was introduced. The method used adaptively increased pixel-dwell times to improve the signal-to-noise ratio (SNR) in areas of high detail. In areas of low detail, the electron dose was reduced on a per pixel basis, and a-posteriori image processing techniques were applied to remove the resulting noise. The technique was realized by scanning the sample twice. The first, quick scan used small pixel-dwell times to gener...

  7. A software sampling frequency adaptive algorithm for reducing spectral leakage

    PAN Li-dong; WANG Fei

    2006-01-01

    Spectral leakage caused by synchronization error in a nonsynchronous sampling system is an important cause of reduced accuracy in spectral analysis and harmonic measurement. This paper presents a software sampling-frequency adaptive algorithm that obtains the actual signal frequency more accurately, then adjusts the sampling interval based on the frequency calculated by the software algorithm and modifies the sampling frequency adaptively. It can reduce synchronization error and the impact of spectral leakage, thereby improving the accuracy of spectral analysis and harmonic measurement for power system signals whose frequency changes slowly. Simulations show that this algorithm has high precision, and it can be a practical method for power system harmonic analysis since it can be implemented easily.

  8. An adaptive sampling scheme for deep-penetration calculation

    The deep-penetration problem has for decades been one of the important and difficult problems in shielding calculations with the Monte Carlo method. In this paper, an adaptive Monte Carlo method for shielding calculation, using the emission point as a sampling station, is investigated. The numerical results show that the adaptive method may improve the efficiency of shielding calculations and may, to some degree, overcome the underestimation problem that easily arises in deep-penetration calculations.

  9. A support vector density-based importance sampling for reliability assessment

    An importance sampling method based on adaptive Markov chain simulation and support vector density estimation is developed in this paper for efficient structural reliability assessment. The methodology involves the generation of samples that can adaptively populate the important region by the adaptive Metropolis algorithm, and the construction of the importance sampling density by support vector density estimation. The use of the adaptive Metropolis algorithm may effectively improve the convergence and stability of the classical Markov chain simulation. The support vector density can approximate the sampling density with fewer samples than conventional kernel density estimation. The proposed importance sampling method can effectively reduce the number of structural analyses required to achieve a given accuracy. Examples involving both numerical and practical structural problems are given to illustrate the application and efficiency of the proposed methodology.
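
    A stripped-down version of the pipeline, a Metropolis chain confined to the failure region followed by an importance density fitted to the chain, is sketched below, with a single Gaussian fit standing in for the paper's support vector density estimate. The limit state and all settings are invented.

      # Failure-probability IS with an MCMC-populated, fitted importance density.
      import numpy as np
      from scipy.stats import multivariate_normal as mvn, norm

      rng = np.random.default_rng(7)
      g = lambda x: 5.0 - x.sum(axis=-1)        # limit state: failure when g(x) <= 0
      p = mvn(mean=np.zeros(2))                 # nominal N(0, I) input model

      # Metropolis chain restricted to the failure domain {g <= 0}
      x = np.array([3.0, 3.0]); chain = []
      for _ in range(4000):
          y = x + 0.5 * rng.standard_normal(2)
          if g(y) <= 0 and rng.random() < p.pdf(y) / p.pdf(x):
              x = y
          chain.append(x.copy())
      chain = np.array(chain[1000:])            # drop burn-in

      # Gaussian fit to the failure samples as the importance density
      q = mvn(mean=chain.mean(axis=0), cov=np.cov(chain.T) + 0.1 * np.eye(2))
      s = q.rvs(size=20_000, random_state=rng)
      p_f = np.mean((g(s) <= 0) * p.pdf(s) / q.pdf(s))
      print(f"failure probability ~ {p_f:.2e}  (exact {norm.sf(5 / np.sqrt(2)):.2e})")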

  10. Adaptive Sampling Algorithms for Probabilistic Risk Assessment of Nuclear Simulations

    Diego Mandelli; Dan Maljovec; Bei Wang; Valerio Pascucci; Peer-Timo Bremer

    2013-09-01

    Nuclear simulations are often computationally expensive, time-consuming, and high-dimensional with respect to the number of input parameters. Thus exploring the space of all possible simulation outcomes is infeasible using finite computing resources. During simulation-based probabilistic risk analysis, it is important to discover the relationship between a potentially large number of input parameters and the output of a simulation using as few simulation trials as possible. This is a typical context for performing adaptive sampling, where a few observations are obtained from the simulation, a surrogate model is built to represent the simulation space, and new samples are selected based on the model constructed. The surrogate model is then updated based on the simulation results of the sampled points. In this way, we attempt to gain the most information possible with a small number of carefully selected sampled points, limiting the number of expensive trials needed to understand features of the simulation space. We analyze the specific use case of identifying the limit surface, i.e., the boundaries in the simulation space between system failure and system success. In this study, we explore several techniques for adaptively sampling the parameter space in order to reconstruct the limit surface. We focus on several adaptive sampling schemes. First, we seek to learn a global model of the entire simulation space using prediction models or neighborhood graphs and extract the limit surface as an iso-surface of the global model. Second, we estimate the limit surface by sampling in the neighborhood of the current estimate based on topological segmentations obtained locally. Our techniques draw inspiration from the topological structure known as the Morse-Smale complex. We highlight the advantages and disadvantages of using a global prediction model versus a local topological view of the simulation space, comparing several different strategies for adaptive sampling in both
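
    One generic adaptive step for limit-surface recovery is easy to sketch: run the expensive simulator on a seed design, build a crude nearest-neighbour surrogate, and spend the next batch of runs where neighbouring outcomes disagree, i.e. near the estimated failure boundary. The stand-in simulator and all settings below are invented; the paper's Morse-Smale machinery is not reproduced.

      # Adaptive sampling near the failure/success boundary via a k-NN surrogate.
      import numpy as np

      rng = np.random.default_rng(8)
      simulator = lambda X: (X[:, 0]**2 + X[:, 1] > 1.2).astype(int)   # 1 = failure

      X = rng.uniform(0, 1, size=(30, 2)); y = simulator(X)            # initial design
      for step in range(5):
          C = rng.uniform(0, 1, size=(500, 2))                         # candidate pool
          d = np.linalg.norm(C[:, None, :] - X[None, :, :], axis=2)
          nn = np.argsort(d, axis=1)[:, :7]                            # 7 nearest past runs
          frac = y[nn].mean(axis=1)                                    # local failure fraction
          score = np.abs(frac - 0.5)                                   # 0 = maximal disagreement
          pick = C[np.argsort(score)[:10]]                             # most ambiguous points
          X = np.vstack([X, pick]); y = np.append(y, simulator(pick))
      print("total simulator runs spent:", len(y))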

  11. Sparse signals estimation for adaptive sampling

    Andrey Ordin

    2014-08-01

    This paper presents an estimation procedure for sparse signals in an adaptive setting. We show that when the pure signal is strong enough, the value of the loss function is asymptotically the same as for an optimal estimator, up to a constant multiplier.

  12. New adaptive sampling method in particle image velocimetry

    This study proposes a new adaptive method to enable the number of interrogation windows and their positions in a particle image velocimetry (PIV) image interrogation algorithm to become self-adapted according to the seeding density. The proposed method can relax the constraint of uniform sampling rate and uniform window size commonly adopted in the traditional PIV algorithm. In addition, the positions of the sampling points are redistributed on the basis of the spring force generated by the sampling points. The advantages include control of the number of interrogation windows according to the local seeding density and smoother distribution of sampling points. The reliability of the adaptive sampling method is illustrated by processing synthetic and experimental images. The synthetic example attests to the advantages of the sampling method. Compared with that of the uniform interrogation technique in the experimental application, the spatial resolution is locally enhanced when using the proposed sampling method. (technical design note)

  13. Adaptive Sampling for Large Scale Boosting

    Dubout, Charles; Fleuret, Francois

    2014-01-01

    Classical Boosting algorithms, such as AdaBoost, build a strong classifier without concern for the computational cost. Some applications, in particular in computer vision, may involve millions of training examples and very large feature spaces. In such contexts, the training time of off-the-shelf Boosting algorithms may become prohibitive. Several methods exist to accelerate training, typically either by sampling the features or the examples used to train the weak learners. Even if some of th...

  14. Domain Adaptation: Overfitting and Small Sample Statistics

    Foster, Dean; Salakhutdinov, Ruslan

    2011-01-01

    We study the prevalent problem when a test distribution differs from the training distribution. We consider a setting where our training set consists of a small number of sample domains, but where we have many samples in each domain. Our goal is to generalize to a new domain. For example, we may want to learn a similarity function using only certain classes of objects, but we desire that this similarity function be applicable to object classes not present in our training sample (e.g. we might seek to learn that "dogs are similar to dogs" even though images of dogs were absent from our training set). Our theoretical analysis shows that we can select many more features than domains while avoiding overfitting by utilizing data-dependent variance properties. We present a greedy feature selection algorithm based on using T-statistics. Our experiments validate this theory showing that our T-statistic based greedy feature selection is more robust at avoiding overfitting than the classical greedy procedure.

  15. Averaging analysis for discrete time and sampled data adaptive systems

    Fu, Li-Chen; Bai, Er-Wei; Sastry, Shankar S.

    1986-01-01

    Earlier continuous time averaging theorems are extended to the nonlinear discrete time case. Theorems for the study of the convergence analysis of discrete time adaptive identification and control systems are used. Instability theorems are also derived and used for the study of robust stability and instability of adaptive control schemes applied to sampled data systems. As a by-product, the effects of sampling on unmodeled dynamics in continuous time systems are also studied.

  16. Pricing and Risk Management with Stochastic Volatility Using Importance Sampling

    Przemyslaw S. Stilger, Simon Acomb and Ser-Huang Poon

    2012-01-01

    In this paper, we apply importance sampling to Heston's stochastic volatility model and Bates's stochastic volatility model with jumps. We propose an effective numerical scheme that dramatically improves the speed of importance sampling. We show how the Greeks can be computed using the Likelihood Ratio Method based on characteristic function, and how combining it with importance sampling leads to a significant variance reduction for the Greeks. All results are illustrated using European and b...

  17. Application of adaptive cluster sampling to low-density populations of freshwater mussels

    Smith, D.R.; Villella, R.F.; Lemarie, D.P.

    2003-01-01

    Freshwater mussels appear to be promising candidates for adaptive cluster sampling because they are benthic macroinvertebrates that cluster spatially and are frequently found at low densities. We applied adaptive cluster sampling to estimate density of freshwater mussels at 24 sites along the Cacapon River, WV, where a preliminary timed search indicated that mussels were present at low density. Adaptive cluster sampling increased yield of individual mussels and detection of uncommon species; however, it did not improve precision of density estimates. Because finding uncommon species, collecting individuals of those species, and estimating their densities are important conservation activities, additional research is warranted on application of adaptive cluster sampling to freshwater mussels. However, at this time we do not recommend routine application of adaptive cluster sampling to freshwater mussel populations. The ultimate, and currently unanswered, question is how to tell when adaptive cluster sampling should be used, i.e., when is a population sufficiently rare and clustered for adaptive cluster sampling to be efficient and practical? A cost-effective procedure needs to be developed to identify biological populations for which adaptive cluster sampling is appropriate.

  18. Cellular adaptation as an important response during chemical carcinogenesis

    Since disease processes are largely expressions of how living organisms react and respond to perturbations in the external and internal environments, adaptive or protective responses and their modulations and mechanisms are of the greatest concern in fundamental studies of disease pathogenesis. Such considerations are also of the greatest relevance in toxicology, including how living organisms respond to low levels of single and multiple xenobiotics and radiations. As the steps and mechanisms during cancer development are studied in greater depth, phenomena become apparent that suggest that adaptive reactions and responses may play important or even critical roles in the process of carcinogenesis. The question becomes whether the process of carcinogenesis is fundamentally an adversarial one (i.e., an abnormal cell in a vulnerable host), or is it more in the nature of a physiological selection or differentiation, which has survival value for the host as an adaptive phenomenon? The very early initial interactions of mutagenic chemical carcinogens, radiations and viruses with DNA prejudice most to consider the adversarial 'abnormal' view as the appropriate one. Yet, the unusually common nature of the earliest altered rare cells that appear during carcinogenesis, their unusually bland nature, and their spontaneous differentiation to normal-appearing adult liver should be carefully considered.

  19. State-dependent importance sampling for a Jackson tandem network

    Miretskiy, Denis; Scheinhardt, Werner; Mandjes, Michel

    2010-01-01

    This article considers importance sampling as a tool for rare-event simulation. The focus is on estimating the probability of overflow in the downstream queue of a Jacksonian two-node tandem queue; it is known that in this setting “traditional” state-independent importance-sampling distributions per

  1. Fast Efficient Importance Sampling by State Space Methods

    Koopman, S.J.; Nguyen, T.M.

    2012-01-01

    We show that efficient importance sampling for nonlinear non-Gaussian state space models can be implemented by computationally efficient Kalman filter and smoothing methods. The result provides some new insights but it primarily leads to a simple and fast method for efficient importance sampling. A simulation study and empirical illustration provide some evidence of the computational gains.

  2. Adaptive sampling program support for expedited site characterization

    Johnson, R.

    1993-10-01

    Expedited site characterizations offer substantial savings in time and money when assessing hazardous waste sites. Key to some of these savings is the ability to adapt a sampling program to the "real-time" data generated by an expedited site characterization. This paper presents a two-prong approach to supporting adaptive sampling programs: a specialized object-oriented database/geographical information system for data fusion, management, and display; and combined Bayesian/geostatistical methods for contamination extent estimation and sample location selection.

  3. Adaptive maximal poisson-disk sampling on surfaces

    Yan, Dongming

    2012-01-01

    In this paper, we study the generation of maximal Poisson-disk sets with varying radii on surfaces. Based on the concepts of the power diagram and regular triangulation, we present a geometric analysis of gaps in such disk sets on surfaces, which is the key ingredient of the adaptive maximal Poisson-disk sampling framework. Moreover, we adapt the presented sampling framework for remeshing applications. Several novel and efficient operators are developed for improving the sampling/meshing quality over the state of the art.

  4. Multi-Level Monte Carlo Simulations with Importance Sampling

    Przemyslaw S. Stilger and Ser-Huang Poon

    2013-01-01

    We present an application of importance sampling to multi-asset options under the Heston and the Bates models as well as to the Heston-Hull-White and the Heston-Cox-Ingersoll-Ross models. Moreover, we provide an efficient importance sampling scheme in a Multi-Level Monte Carlo simulation. In all cases, we explain how the Greeks can be computed in the different simulation schemes using the Likelihood Ratio Method, and how combining it with importance sampling leads to a significant variance re...

  5. Adaptive Sampling for High Throughput Data Using Similarity Measures

    Bulaevskaya, V. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Sales, A. P. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-05-06

    The need for adaptive sampling arises in the context of high throughput data because the rates of data arrival are many orders of magnitude larger than the rates at which they can be analyzed. A very fast decision must therefore be made regarding the value of each incoming observation and its inclusion in the analysis. In this report we discuss one approach to adaptive sampling, based on the new data point’s similarity to the other data points being considered for inclusion. We present preliminary results for one real and one synthetic data set.
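
    A minimal keep/discard rule of this kind: retain an incoming observation only if its distance to everything already retained exceeds a threshold. The distance, threshold, and data below are illustrative choices, not the report's.

      # Similarity-based adaptive subsampling of a fast data stream.
      import numpy as np

      rng = np.random.default_rng(9)

      def adaptive_subsample(stream, threshold=0.8):
          kept = []
          for x in stream:
              # keep only points sufficiently dissimilar from the retained set
              if not kept or min(np.linalg.norm(x - k) for k in kept) > threshold:
                  kept.append(x)
          return np.array(kept)

      stream = rng.standard_normal((5000, 3))          # synthetic high-rate data
      kept = adaptive_subsample(stream)
      print(f"retained {len(kept)} of {len(stream)} points")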

  6. Computing Greeks with Multilevel Monte Carlo Methods using Importance Sampling

    Euget, Thomas

    2012-01-01

    This paper presents a new efficient way to reduce the variance of an estimator of popular payoffs and Greeks encountered in financial mathematics. The idea is to apply importance sampling with the multilevel Monte Carlo method recently introduced by M.B. Giles. So far, importance sampling has proved successful in combination with the standard Monte Carlo method. We show the efficiency of our approach on the estimation of financial derivative prices and then on the estimation of Greeks (i.e., sensitivitie...

  7. On the Use of Importance Sampling in Particle Transport Problems

    The idea of importance sampling is applied to the problem of solving integral equations of Fredholm type; in particular, Boltzmann's neutron transport equation is considered. For the solution of the latter equation, an importance sampling technique is derived from some simple transformations of the original transport equation into a similar equation. Examples of transformations are given which have been used with great success in practice.

  8. Two-phase importance sampling for inference about transmission trees

    Numminen, E.; Chewapreecha, C.; Siren, J.; Turner, C.; Turner, P.; Bentley, S.D.; Corander, J.

    2014-01-01

    There has been growing interest in the statistics community to develop methods for inferring transmission pathways of infectious pathogens from molecular sequence data. For many datasets, the computational challenge lies in the huge dimension of the missing data. Here, we introduce an importance sampling scheme in which the transmission trees and phylogenies of pathogens are both sampled from reasonable importance distributions, alleviating the inference. Using this approach, arbitrary models...

  9. Iterative importance sampling algorithms for parameter estimation problems

    Morzfeld, Matthias; Day, Marcus S.; Grout, Ray W.; Pau, George Shu Heng; Finsterle, Stefan A.; Bell, John B.

    2016-01-01

    In parameter estimation problems one approximates a posterior distribution over uncertain parameters defined jointly by a prior distribution, a numerical model, and noisy data. Typically, Markov Chain Monte Carlo (MCMC) is used for the numerical solution of such problems. An alternative to MCMC is importance sampling, where one draws samples from a proposal distribution, and attaches weights to each sample to account for the fact that the proposal distribution is not the posterior distribut...

  10. Geometrical importance sampling in Geant4 from design to verification

    Dressel, M

    2003-01-01

    The addition of flexible, general implementations of geometrical splitting and Russian roulette (in combination called geometrical importance sampling) for variance reduction, and of a scoring system for controlling the sampling, is described. The efficiency of the variance reduction implementation is measured in a simulation of a typical benchmark experiment for neutron shielding. Using geometrical importance sampling, a reduction of the computing time by a factor of 89 compared to the analog calculation was achieved for obtaining a neutron flux with a certain precision in the benchmark application.
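
    The two halves of geometrical importance sampling have a compact weight-bookkeeping core: crossing into a region of higher importance splits the particle into copies with proportionally reduced weight, and crossing into lower importance plays Russian roulette with a compensating weight boost, so total weight is preserved in expectation. The sketch below is a toy of that bookkeeping, not Geant4 code.

      # Splitting and Russian roulette at an importance-region boundary.
      import numpy as np

      rng = np.random.default_rng(10)

      def cross_boundary(particles, imp_from, imp_to):
          """particles: list of (weight,) tuples crossing between importance regions."""
          out = []
          r = imp_to / imp_from
          for (w,) in particles:
              if r >= 1:                                # splitting: ~r copies, weight w/r each
                  n = int(r) + (rng.random() < r - int(r))
                  out += [(w / r,)] * n
              elif rng.random() < r:                    # Russian roulette survivor, boosted weight
                  out.append((w / r,))
          return out

      batch = [(1.0,)] * 10_000
      batch = cross_boundary(batch, imp_from=1, imp_to=4)   # split going 1 -> 4
      batch = cross_boundary(batch, imp_from=4, imp_to=1)   # roulette going back down
      print(len(batch), "particles, total weight ~", round(sum(w for (w,) in batch), 1))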

  11. Adaptation of the methodology of sample surveys for marketing researches

    Kataev Andrey

    2015-08-01

    The article presents the results of adapting the theory of sample surveys for the purposes of marketing, which allows one to answer the fundamental question of any marketing research: how many objects should be studied to draw adequate conclusions.

  12. Efficient computation of smoothing splines via adaptive basis sampling

    Ma, Ping

    2015-06-24

    Smoothing splines provide flexible nonparametric regression estimators. However, the high computational cost of smoothing splines for large datasets has hindered their wide application. In this article, we develop a new method, named adaptive basis sampling, for efficient computation of smoothing splines in super-large samples. Except for the univariate case, where the Reinsch algorithm is applicable, a smoothing spline for a regression problem with sample size n can be expressed as a linear combination of n basis functions, and its computational complexity is generally O(n³). We achieve a more scalable computation in the multivariate case by evaluating the smoothing spline using a smaller set of basis functions, obtained by an adaptive sampling scheme that uses values of the response variable. Our asymptotic analysis shows that smoothing splines computed via adaptive basis sampling converge to the true function at the same rate as full-basis smoothing splines. Using simulation studies and a large-scale deep earth core-mantle boundary imaging study, we show that the proposed method outperforms a sampling method that does not use the values of response variables.

  13. An improved adaptive sampling and experiment design method for aerodynamic optimization

    Huang Jiangtao

    2015-10-01

    The experiment design method is key to constructing a highly reliable surrogate model for numerical optimization in large-scale projects. Within the method, the experimental design criterion directly affects the accuracy of the surrogate model and the optimization efficiency. To address the shortcomings of traditional experimental design, an improved adaptive sampling method is proposed in this paper. The surrogate model is first constructed from basic sparse samples. Then the supplementary sampling position is detected according to specified criteria, which introduce energy-function and curvature sampling criteria based on a radial basis function (RBF) network. The sampling detection criteria consider both the uniformity of the sample distribution and the description of the hypersurface curvature, so as to significantly improve the prediction accuracy of the surrogate model with far fewer samples. For a surrogate model constructed with sparse samples, sample uniformity is an important factor in the interpolation accuracy in the initial stage of adaptive sampling and surrogate model training. As uniformity improves, the curvature description of the objective function surface gradually becomes more important. In consideration of these issues, a crowdness enhance function and a root mean square error (RMSE) feedback function are introduced in the criterion expression. Thus, a new sampling method called RMSE and crowdness enhance (RCE) adaptive sampling is established. The validity of the RCE adaptive sampling method is studied first on typical test functions and then on an airfoil/wing aerodynamic optimization design problem with a high-dimensional design space. The results show that the RCE adaptive sampling method not only reduces the required number of samples, but also effectively improves the prediction accuracy of the surrogate model, giving it broad prospects for application.

  14. Adaptive video compressed sampling in the wavelet domain

    Dai, Hui-dong; Gu, Guo-hua; He, Wei-ji; Chen, Qian; Mao, Tian-yi

    2016-07-01

    In this work, we propose a multiscale video acquisition framework called adaptive video compressed sampling (AVCS) that involves sparse sampling and motion estimation in the wavelet domain. Implementing a combination of a binary DMD and a single-pixel detector, AVCS acquires successively finer resolution sparse wavelet representations in moving regions directly based on extended wavelet trees, and alternately uses these representations to estimate the motion in the wavelet domain. Then, we can remove the spatial and temporal redundancies and provide a method to reconstruct video sequences from compressed measurements in real time. In addition, the proposed method allows adaptive control over the reconstructed video quality. The numerical simulation and experimental results indicate that AVCS performs better than the conventional CS-based methods at the same sampling rate even under the influence of noise, and the reconstruction time and measurements required can be significantly reduced.

  15. An Importance Sampling Simulation Method for Bayesian Decision Feedback Equalizers

    Chen, S.; Hanzo, L.

    2000-01-01

    An importance sampling (IS) simulation technique is presented for evaluating the lower-bound bit error rate (BER) of the Bayesian decision feedback equalizer (DFE) under the assumption of correct decisions being fed back. A design procedure is developed, which chooses appropriate bias vectors for the simulation density to ensure asymptotic efficiency of the IS simulation.
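
    The classic mean-translation recipe behind this kind of IS BER estimator fits in a few lines: simulate noise from a density shifted toward the error region and weight each sample by the likelihood ratio. The sketch below applies it to antipodal signaling in Gaussian noise; the bias choice is the textbook one, not the paper's design procedure for the Bayesian DFE.

      # Importance-sampling BER estimation with a mean-shifted noise density.
      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(11)
      A, sigma = 1.0, 0.25                    # transmit +A; an error occurs when A + n < 0
      n_sim, bias = 20_000, -1.0              # shift the noise mean toward the error region

      n = rng.normal(bias, sigma, size=n_sim) # samples from the biased simulation density
      w = norm.pdf(n, 0, sigma) / norm.pdf(n, bias, sigma)   # likelihood-ratio weights
      ber = np.mean((A + n < 0) * w)
      print(f"IS BER estimate {ber:.3e}  vs exact Q(A/sigma) = {norm.sf(A / sigma):.3e}")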

  16. A flexible importance sampling method for integrating subgrid processes

    Raut, E. K.; Larson, V. E.

    2016-01-01

    Numerical models of weather and climate need to compute grid-box-averaged rates of physical processes such as microphysics. These averages are computed by integrating subgrid variability over a grid box. For this reason, an important aspect of atmospheric modeling is spatial integration over subgrid scales. The needed integrals can be estimated by Monte Carlo integration. Monte Carlo integration is simple and general but requires many evaluations of the physical process rate. To reduce the number of function evaluations, this paper describes a new, flexible method of importance sampling. It divides the domain of integration into eight categories, such as the portion that contains both precipitation and cloud, or the portion that contains precipitation but no cloud. It then allows the modeler to prescribe the density of sample points within each of the eight categories. The new method is incorporated into the Subgrid Importance Latin Hypercube Sampler (SILHS). The resulting method is tested on drizzling cumulus and stratocumulus cases. In the cumulus case, the sampling error can be considerably reduced by drawing more sample points from the region of rain evaporation.

  17. The Importance of Formalizing Computational Models of Face Adaptation Aftereffects

    Ross, David A.; Palmeri, Thomas J.

    2016-01-01

    Face adaptation is widely used as a means to probe the neural representations that support face recognition. While the theories that relate face adaptation to behavioral aftereffects may seem conceptually simple, our work has shown that testing computational instantiations of these theories can lead to unexpected results. Instantiating a model of face adaptation not only requires specifying how faces are represented and how adaptation shapes those representations but also specifying how decisions are made, translating hidden representational states into observed responses. Considering the high-dimensionality of face representations, the parallel activation of multiple representations, and the non-linearity of activation functions and decision mechanisms, intuitions alone are unlikely to succeed. If the goal is to understand mechanism, not simply to examine the boundaries of a behavioral phenomenon or correlate behavior with brain activity, then formal computational modeling must be a component of theory testing. To illustrate, we highlight our recent computational modeling of face adaptation aftereffects and discuss how models can be used to understand the mechanisms by which faces are recognized. PMID:27378960

  1. Importance sampling for failure probabilities in computing and data transmission

    Asmussen, Søren

    2009-01-01

    We study efficient simulation algorithms for estimating P(Χ > χ), where Χ is the total time of a job with ideal time T that needs to be restarted after a failure. The main tool is importance sampling where one tries to identify a good importance distribution via an asymptotic description of the conditional distribution of T given Χ > χ. If T ≡ t is constant, the problem reduces to the efficient simulation of geometric sums, and a standard algorithm involving a Cramér type root  γ(t) is ...

  2. Adaptive Sampling of Time Series During Remote Exploration

    Thompson, David R.

    2012-01-01

    This work deals with the challenge of online adaptive data collection in a time series. A remote sensor or explorer agent adapts its rate of data collection in order to track anomalous events while obeying constraints on time and power. This problem is challenging because the agent has limited visibility (all its datapoints lie in the past) and limited control (it can only decide when to collect its next datapoint). This problem is treated from an information-theoretic perspective, fitting a probabilistic model to collected data and optimizing the future sampling strategy to maximize information gain. The performance characteristics of stationary and nonstationary Gaussian process models are compared. Self-throttling sensors could benefit environmental sensor networks and monitoring as well as robotic exploration. Explorer agents can improve performance by adjusting their data collection rate, preserving scarce power or bandwidth resources during uninteresting times while fully covering anomalous events of interest. For example, a remote earthquake sensor could conserve power by limiting its measurements during normal conditions and increasing its cadence during rare earthquake events. A similar capability could improve sensor platforms traversing a fixed trajectory, such as an exploration rover transect or a deep space flyby. These agents can adapt observation times to improve sample coverage during moments of rapid change. An adaptive sampling approach couples sensor autonomy, instrument interpretation, and sampling. The challenge is addressed as an active learning problem, which already has extensive theoretical treatment in the statistics and machine learning literature. A statistical Gaussian process (GP) model is employed to guide sample decisions that maximize information gain. Nonstationary (e.g., time-varying) covariance relationships permit the system to represent and track local anomalies, in contrast with current GP approaches. Most common GP models
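
    The core of such an information-driven sampler is choosing the next measurement time where the Gaussian process predictive variance is largest. The sketch below does exactly that with an RBF kernel on an irregular set of past sample times; the kernel, noise level, and data are invented.

      # Pick the next sample time by maximizing GP predictive variance.
      import numpy as np

      def rbf(a, b, ell=0.5):
          return np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ell**2)

      t_obs = np.array([0.0, 0.3, 0.35, 2.0])          # past (irregular) sample times
      y_obs = np.sin(3 * t_obs)                        # measured values (unused by the criterion)
      noise = 1e-4

      t_grid = np.linspace(0, 3, 301)                  # candidate sample times
      K  = rbf(t_obs, t_obs) + noise * np.eye(len(t_obs))
      Ks = rbf(t_grid, t_obs)
      # predictive variance: k(t,t) - k_s K^{-1} k_s^T (diagonal terms only)
      var = 1.0 - np.einsum('ij,jk,ik->i', Ks, np.linalg.inv(K), Ks)
      t_next = t_grid[np.argmax(var)]
      print(f"sample next at t = {t_next:.2f} (largest predictive uncertainty)")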

  3. Learning Adaptive Forecasting Models from Irregularly Sampled Multivariate Clinical Data

    Liu, Zitao; Hauskrecht, Milos

    2016-01-01

Building accurate predictive models of clinical multivariate time series is crucial for understanding the patient's condition, the dynamics of the disease, and clinical decision making. A challenging aspect of this process is that the model should be flexible and adaptive enough to reflect patient-specific temporal behavior, even when the available patient-specific data are sparse and span only a short time. To address this problem, we propose and develop an adaptive two-stage forecasting approach for modeling multivariate, irregularly sampled clinical time series of varying lengths. The proposed model (1) learns the population trend from a collection of time series for past patients; (2) captures individual-specific short-term multivariate variability; and (3) adapts by automatically adjusting its predictions based on new observations. The proposed forecasting model is evaluated on a real-world clinical time series dataset. The results demonstrate the benefits of our approach on prediction tasks for multivariate, irregularly sampled clinical time series, and show that it can outperform both population-based and patient-specific time series prediction models in terms of prediction accuracy.
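
A minimal way to picture the two-stage idea, not the authors' actual model: predict with a population-level trend, then adapt a patient-specific offset as new observations arrive. Everything below (function names, the smoothing rule, the toy data) is an illustrative assumption.

```python
import numpy as np

def forecast(pop_trend, patient_obs, obs_times, alpha=0.5):
    """Two-stage sketch: population trend plus an adaptively smoothed
    patient-specific offset.

    pop_trend   -- population mean value at each discrete time step
    patient_obs -- observed values for this patient (possibly few)
    obs_times   -- integer time indices of those observations
    alpha       -- how strongly the offset adapts to the newest residual
    """
    offset = 0.0
    for y, t in zip(patient_obs, obs_times):
        residual = y - pop_trend[t]            # deviation from the trend
        offset = alpha * residual + (1 - alpha) * offset
    return pop_trend + offset                  # adjusted forecast

trend = np.linspace(5.0, 7.0, 10)              # toy population trend
print(forecast(trend, patient_obs=[6.5, 7.2], obs_times=[2, 4]))
```

With no patient data the forecast falls back to the population trend, mirroring the motivation for stage (1) above.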

  4. Distributed Database Kriging for Adaptive Sampling (D2 KAS)

    Roehm, Dominic; Pavel, Robert S.; Barros, Kipton; Rouet-Leduc, Bertrand; McPherson, Allen L.; Germann, Timothy C.; Junghans, Christoph

    2015-07-01

We present an adaptive sampling method supplemented by a distributed database and a prediction method for multiscale simulations using the Heterogeneous Multiscale Method. A finite-volume scheme integrates the macro-scale conservation laws for elastodynamics, which are closed by momentum and energy fluxes evaluated at the micro-scale. In the original approach, molecular dynamics (MD) simulations are launched for every macro-scale volume element. Our adaptive sampling scheme replaces a large fraction of costly micro-scale MD simulations with fast table lookup and prediction. The cloud database Redis provides the plain table lookup, and with locality-aware hashing we gather input data for our prediction scheme. For the latter we use kriging, which estimates an unknown value and its uncertainty (error) at a specific location in parameter space by using weighted averages of the neighboring points. We find that our adaptive scheme significantly improves simulation performance, by a factor of 2.5 to 25, while retaining high accuracy for various choices of the algorithm parameters.
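
The kriging step is compact enough to sketch directly. The code below implements a zero-mean simple-kriging (Gaussian process) predictor with a squared-exponential covariance; the database machinery (Redis, locality-aware hashing) is out of scope here, and the hyperparameters are placeholders.

```python
import numpy as np

def simple_krige(X, y, x_star, length=1.0, sigma2=1.0, nugget=1e-8):
    """Predict mean and variance at x_star from neighbor points (X, y)."""
    def k(a, b):   # squared-exponential covariance
        d2 = np.sum((a[:, None, :] - b[None, :, :]) ** 2, axis=-1)
        return sigma2 * np.exp(-0.5 * d2 / length ** 2)

    K = k(X, X) + nugget * np.eye(len(X))     # covariances among neighbors
    k_star = k(X, x_star[None, :])[:, 0]      # covariances with the query
    w = np.linalg.solve(K, k_star)            # kriging weights
    return w @ y, sigma2 - k_star @ np.linalg.solve(K, k_star)

X = np.array([[0.0], [1.0], [2.0]])           # neighbor inputs
y = np.array([1.0, 2.0, 1.5])                 # neighbor responses
print(simple_krige(X, y, np.array([1.5])))    # (mean, variance)
```

The predicted variance is what lets an adaptive scheme decide when the lookup-and-predict shortcut is trustworthy and when a fresh MD simulation is needed.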

  6. Gap processing for adaptive maximal Poisson-disk sampling

    Yan, Dongming

    2013-09-01

In this article, we study the generation of maximal Poisson-disk sets with varying radii. First, we present a geometric analysis of gaps in such disk sets. This analysis is the basis for maximal and adaptive sampling in Euclidean space and on manifolds. Second, we propose efficient algorithms and data structures to detect gaps and update gaps when disks are inserted, deleted, moved, or when their radii are changed. We build on the concepts of regular triangulations and the power diagram. Third, we show how our analysis contributes to the state-of-the-art in surface remeshing. © 2013 ACM.
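
For intuition, the simplest (and far less efficient) way to generate a varying-radius Poisson-disk set is rejection-based dart throwing; the gap analysis in the article is precisely what replaces this brute-force loop. A toy sketch, with made-up radius bounds:

```python
import numpy as np

rng = np.random.default_rng(1)

def dart_throwing(n_attempts=20_000, r_min=0.02, r_max=0.08):
    """Naive adaptive Poisson-disk sampling in the unit square: each
    candidate carries its own radius and is accepted only if its disk
    overlaps no previously accepted disk."""
    pts, radii = [], []
    for _ in range(n_attempts):
        p, r = rng.random(2), rng.uniform(r_min, r_max)
        if all(np.linalg.norm(p - q) >= r + s for q, s in zip(pts, radii)):
            pts.append(p)
            radii.append(r)
    return np.array(pts), np.array(radii)

pts, radii = dart_throwing()
print(len(pts), "disks placed")
```

Dart throwing stalls long before the set is maximal, since late candidates almost always collide; detecting and filling the remaining gaps, as the article does, is what guarantees maximality.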

  7. Roof Reconstruction from Point Clouds using Importance Sampling

    Nguatem, W.; Drauschke, M.; Mayer, H.

    2013-10-01

We propose a novel, fully automatic technique for roof fitting in 3D point clouds based on sequential importance sampling (SIS). Our approach makes no assumption about the nature (sparse, dense) or origin (LIDAR, image matching) of the point clouds and further distinguishes, automatically, between different basic roof types based on model selection. The algorithm comprises an inherent data parallelism, the lack of which has been a major drawback of most Monte Carlo schemes. A further speedup is achieved by applying a coarse-to-fine search within all probable roof configurations in the sample space of roofs. The robustness and effectiveness of our roof reconstruction algorithm are illustrated for point clouds of varying nature.

  8. Effect of imperfect detectability on adaptive and conventional sampling: Simulated sampling of freshwater mussels in the upper Mississippi River

    Smith, D.R.; Gray, B.R.; Newton, T.J.; Nichols, D.

    2010-01-01

Adaptive sampling designs are recommended where, as is typical with freshwater mussels, the outcome of interest is rare and clustered. However, the performance of adaptive designs has not been investigated when outcomes are not only rare and clustered but also imperfectly detected. We address this combination of challenges using data simulated to mimic properties of freshwater mussels from a reach of the upper Mississippi River. Simulations were conducted under a range of sample sizes and detection probabilities. Under perfect detection, efficiency of the adaptive sampling design increased relative to the conventional design as sample size increased and as density decreased. Also, the probability of sampling occupied habitat was four times higher for adaptive than conventional sampling of the lowest density population examined. However, imperfect detection resulted in substantial biases in sample means and variances under both adaptive and conventional designs. The efficiency of adaptive sampling declined with decreasing detectability. Also, the probability of encountering an occupied unit during adaptive sampling, relative to conventional sampling, declined with decreasing detectability. Thus, the potential gains in the application of adaptive sampling to rare and clustered populations relative to conventional sampling are reduced when detection is imperfect. The results highlight the need to increase or estimate detection to improve performance of conventional and adaptive sampling designs.
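
The core bias mechanism is easy to demonstrate: if each individual is detected independently with probability p, the observed count at a site is Binomial(true count, p), so the sample mean is scaled down by p regardless of design. A toy simulation (the distributional choices are ours, not the authors'):

```python
import numpy as np

rng = np.random.default_rng(2)

# Rare, clustered (overdispersed) true counts at 5,000 sites.
true_counts = rng.negative_binomial(n=0.5, p=0.1, size=5_000)

for p_detect in (1.0, 0.8, 0.5):
    observed = rng.binomial(true_counts, p_detect)   # imperfect detection
    print(f"p = {p_detect}: true mean {true_counts.mean():.2f}, "
          f"observed mean {observed.mean():.2f}")
```

The same thinning also makes adaptive designs less likely to trigger extra sampling around occupied units, which is the interaction the study quantifies.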

  9. Elucidating Microbial Adaptation Dynamics via Autonomous Exposure and Sampling

    Grace, J. M.; Verseux, C.; Gentry, D.; Moffet, A.; Thayabaran, R.; Wong, N.; Rothschild, L.

    2013-12-01

The adaptation of micro-organisms to their environments is a complex process of interaction between the pressures of the environment and of competition. Reducing this multifactorial process to environmental exposure in the laboratory is a common tool for elucidating individual mechanisms of evolution, such as mutation rates [Wielgoss et al., 2013]. Although such studies inform fundamental questions about the way adaptation and even speciation occur, they are often limited by labor-intensive manual techniques [Wassmann et al., 2010]. Current methods for controlled study of microbial adaptation limit the length of time, the depth of collected data, and the breadth of applied environmental conditions. Small idiosyncrasies in manual techniques can have large effects on outcomes; for example, there are significant variations in induced radiation resistances following similar repeated exposure protocols [Alcántara-Díaz et al., 2004; Goldman and Travisano, 2011]. We describe here a project under development to allow rapid cycling of multiple types of microbial environmental exposure. The system allows continuous autonomous monitoring and data collection of both single species and sampled communities, independently and concurrently providing multiple types of controlled environmental pressure (temperature, radiation, chemical presence or absence, and so on) to a microbial community in dynamic response to the ecosystem's current status. When combined with DNA sequencing and extraction, such a controlled environment can cast light on microbial functional development, population dynamics, inter- and intra-species competition, and microbe-environment interaction. The project's goal is to allow rapid, repeatable iteration of studies of both natural and artificial microbial adaptation. As an example, the same system can be used both to increase the pH of a wet soil aliquot over time while periodically sampling it for genetic activity analysis, or to repeatedly expose a culture of ...

  10. Semigroups and sequential importance sampling for multiway tables

    Yoshida, Ruriko; Wei, Shaoceng; Zhou, Feng; Haws, David

    2011-01-01

When an interval of integers between the lower bound $l_i$ and the upper bound $u_i$ is the support of the marginal distribution $n_i|(n_{i-1}, ..., n_1)$, Chen et al. (2005) noticed that sampling from the interval at each step, for $n_i$ during a sequential importance sampling (SIS) procedure, always produces a table which satisfies the marginal constraints. However, in general, the interval may not be equal to the support of the marginal distribution. In this case, the SIS procedure may produce tables which do not satisfy the marginal constraints, leading to rejection (Chen et al. 2006). In this paper we consider the uniform distribution as the target distribution. First we show that if we fix the number of rows and columns of the design matrix of the model for contingency tables, then there exists a polynomial time algorithm, in terms of the input size, to sample a table from the set of all tables satisfying all marginals defined by the given model via the SIS procedure without rejection. We then show experimentall...

  11. Stochastic approximation Monte Carlo importance sampling for approximating exact conditional probabilities

    Cheon, Sooyoung

    2013-02-16

Importance sampling and Markov chain Monte Carlo methods have been used in exact inference for contingency tables for a long time; however, their performance is not always satisfactory. In this paper, we propose a stochastic approximation Monte Carlo importance sampling (SAMCIS) method for tackling this problem. SAMCIS is a combination of adaptive Markov chain Monte Carlo and importance sampling, which employs the stochastic approximation Monte Carlo algorithm (Liang et al., J. Am. Stat. Assoc., 102(477):305-320, 2007) to draw samples from an enlarged reference set with a known Markov basis. Compared to the existing importance sampling and Markov chain Monte Carlo methods, SAMCIS has a few advantages, such as fast convergence, ergodicity, and the ability to achieve a desired proportion of valid tables. The numerical results indicate that SAMCIS can outperform the existing importance sampling and Markov chain Monte Carlo methods: It can produce much more accurate estimates in much shorter CPU time than the existing methods, especially for tables with high degrees of freedom. © 2013 Springer Science+Business Media New York.

  12. Local Importance Sampling: A Novel Technique to Enhance Particle Filtering

    Péter Torma

    2006-04-01

In the low observation noise limit, particle filters become inefficient. In this paper a simple-to-implement particle filter is suggested as a solution to this well-known problem. The proposed Local Importance Sampling based particle filters draw the particles' positions in a two-step process that makes use of both the dynamics of the system and the most recent observation. Experiments with the standard bearings-only tracking problem indicate that the proposed new particle filter method is indeed very successful when observations are reliable. Experiments with a high-dimensional variant of this problem further show that the advantage of the new filter grows with the increasing dimensionality of the system.
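
For readers unfamiliar with the baseline being improved on, a minimal bootstrap particle filter looks as follows; the paper's local importance sampling replaces the propagate step's prior proposal with one that also conditions on the newest observation. The model below is a toy random walk, not the bearings-only problem.

```python
import numpy as np

rng = np.random.default_rng(3)

def bootstrap_pf(obs, n_part=500, sig_x=1.0, sig_y=0.5):
    """Bootstrap PF for x_t = x_{t-1} + N(0, sig_x^2), y_t = x_t + N(0, sig_y^2)."""
    x = rng.normal(0.0, 1.0, n_part)               # initial particle cloud
    means = []
    for y in obs:
        x = x + rng.normal(0.0, sig_x, n_part)     # propagate (prior proposal)
        logw = -0.5 * ((y - x) / sig_y) ** 2       # observation likelihood
        w = np.exp(logw - logw.max())
        w /= w.sum()
        means.append(np.sum(w * x))                # filtered mean
        x = x[rng.choice(n_part, size=n_part, p=w)]  # multinomial resampling
    return np.array(means)

print(bootstrap_pf(obs=[0.2, 0.5, 1.1, 0.9]))
```

When observation noise is small, the likelihood above becomes nearly a spike and most prior-proposed particles receive negligible weight, which is exactly the degeneracy the two-step local proposal addresses.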

  13. Semigroups and sequential importance sampling for multiway tables and beyond

    Xi, Jing; Zhou, Feng; Yoshida, Ruriko; Haws, David

    2011-01-01

When an interval of integers between the lower bound l_i and the upper bound u_i is the support of the marginal distribution n_i|(n_{i-1}, ..., n_1), Chen et al. (2005) noticed that sampling from the interval at each step, for n_i during the sequential importance sampling (SIS) procedure, always produces a table which satisfies the marginal constraints. However, in general, the interval may not be equal to the support of the marginal distribution. In this case, the SIS procedure may produce tables which do not satisfy the marginal constraints, leading to rejection (Chen et al. 2006). Rejecting tables is computationally expensive, and incorrect proposal distributions result in biased estimators for the number of tables given its marginal sums. This paper has two focuses: (1) we propose a correction coefficient which corrects an interval of integers between the lower bound l_i and the upper bound u_i to the support of the marginal distribution asymptotically even with rejections and with the same time complexity ...

  14. Consistent Adjoint Driven Importance Sampling using Space, Energy and Angle

Peplow, Douglas E. [ORNL]; Mosher, Scott W. [ORNL]; Evans, Thomas M. [ORNL]

    2012-08-01

For challenging radiation transport problems, hybrid methods combine the accuracy of Monte Carlo methods with the global information present in deterministic methods. One of the most successful hybrid methods is CADIS (Consistent Adjoint Driven Importance Sampling). This method uses a deterministic adjoint solution to construct a biased source distribution and consistent weight windows to optimize a specific tally in a Monte Carlo calculation. The method has been implemented into transport codes using just the spatial and energy information from the deterministic adjoint and has been used in many applications to compute tallies with much higher figures-of-merit than analog calculations. CADIS also outperforms user-supplied importance values, which usually take long periods of user time to develop. This work extends CADIS to develop weight windows that are a function of the position, energy, and direction of the Monte Carlo particle. Two types of consistent source biasing are presented: one method that biases the source in space and energy while preserving the original directional distribution and one method that biases the source in space, energy, and direction. Seven simple example problems are presented which compare the use of the standard space/energy CADIS with the new space/energy/angle treatments.
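
The consistency at the heart of CADIS is easy to illustrate in one dimension: bias the source toward high-importance bins and give each history a compensating weight, so the tally's expectation is unchanged. The numbers below are toy values, not a transport calculation.

```python
import numpy as np

rng = np.random.default_rng(4)

s = np.array([0.7, 0.2, 0.1])            # true source distribution (3 bins)
imp = np.array([0.01, 0.1, 1.0])         # adjoint importance of each bin
contrib = np.array([0.001, 0.01, 0.1])   # per-history tally contribution (toy)

q = s * imp / np.sum(s * imp)            # biased source, favors important bins
w = s / q                                # consistent statistical weights

bins = rng.choice(3, size=100_000, p=q)
print(f"biased estimate {np.mean(w[bins] * contrib[bins]):.5f}, "
      f"exact mean {s @ contrib:.5f}")
```

Because E_q[w * contrib] = sum(s * contrib) exactly, the biasing changes only the variance; extending the weighting to space, energy, and angle is the contribution of the work above.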

  15. Structured estimation - Sample size reduction for adaptive pattern classification

    Morgera, S.; Cooper, D. B.

    1977-01-01

    The Gaussian two-category classification problem with known category mean value vectors and identical but unknown category covariance matrices is considered. The weight vector depends on the unknown common covariance matrix, so the procedure is to estimate the covariance matrix in order to obtain an estimate of the optimum weight vector. The measure of performance for the adapted classifier is the output signal-to-interference noise ratio (SIR). A simple approximation for the expected SIR is gained by using the general sample covariance matrix estimator; this performance is both signal and true covariance matrix independent. An approximation is also found for the expected SIR obtained by using a Toeplitz form covariance matrix estimator; this performance is found to be dependent on both the signal and the true covariance matrix.

  16. The Importance of Introductory Statistics Students Understanding Appropriate Sampling Techniques

    Menil, Violeta C.

    2005-01-01

    In this paper the author discusses the meaning of sampling, the reasons for sampling, the Central Limit Theorem, and the different techniques of sampling. Practical and relevant examples are given to make the appropriate sampling techniques understandable to students of Introductory Statistics courses. With a thorough knowledge of sampling…

  17. Multi-Scaling Sampling: An Adaptive Sampling Method for Discovering Approximate Association Rules

    Cai-Yan Jia; Xie-Ping Gao

    2005-01-01

One of the obstacles to efficient association rule mining is the explosive expansion of data sets, since it is costly or impossible to scan large databases, especially multiple times. A popular solution to improve the speed and scalability of association rule mining is to run the algorithm on a random sample instead of the entire database. But how to effectively define and efficiently estimate the degree of error with respect to the outcome of the algorithm, and how to determine the sample size needed, have remained entangled research questions. In this paper, an effective and efficient algorithm is given, based on PAC (Probably Approximately Correct) learning theory, to measure and estimate sample error. Then, a new adaptive, online, fast sampling strategy - multi-scaling sampling - is presented, inspired by MRA (Multi-Resolution Analysis) and the Shannon sampling theorem, for quickly obtaining acceptably approximate association rules at an appropriate sample size. Both theoretical analysis and empirical study show that the sampling strategy can achieve a very good speed-accuracy trade-off.
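
The PAC ingredient can be made concrete with a standard Hoeffding bound: to estimate an itemset's support within ε of its true value with probability at least 1 − δ, roughly ln(2/δ)/(2ε²) transactions suffice. This back-of-envelope bound is our illustration, not the paper's multi-scaling schedule.

```python
import math

def pac_sample_size(eps: float, delta: float) -> int:
    """Hoeffding bound: sample size so the empirical support of an itemset
    deviates from its true support by more than eps with prob. <= delta."""
    return math.ceil(math.log(2.0 / delta) / (2.0 * eps ** 2))

print(pac_sample_size(eps=0.01, delta=0.05))   # 18445 transactions
```

Multi-scaling sampling refines this one-shot bound by growing the sample in stages and stopping as soon as the empirical error estimate is acceptable.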

  18. Importance sampling approach for the nonstationary approximation error method

    The approximation error approach has previously been proposed to handle modelling, numerical and computational errors. This approach has been developed both for stationary and nonstationary inverse problems (Kalman filtering). The key idea of the approach is to compute the approximate statistics of the errors over the distribution of all unknowns and uncertainties and carry out approximative marginalization with respect to these errors. In nonstationary problems, however, information is accumulated over time, and the initial uncertainties may turn out to have been exaggerated. In this paper, we propose an algorithm with which the approximation error statistics can be updated during the accumulation of measurement information. The proposed algorithm is based on importance sampling. The recursions that are proposed here are, however, based on the (extended) Kalman filter and therefore do not employ the often exceedingly heavy computational load of particle filtering. As a computational example, we study an estimation problem that is related to a convection–diffusion problem in which the velocity field is not accurately specified

19. Monte Carlo importance sampling for the MCNP™ general source

    Lichtenstein, H.

    1996-01-09

Research was performed to develop an importance sampling procedure for a radiation source. The procedure was developed for the MCNP radiation transport code, but the approach itself is general and can be adapted to other Monte Carlo codes. The procedure, as adapted to MCNP, relies entirely on existing MCNP capabilities. It has been tested for very complex descriptions of a general source, in the context of the design of spent-reactor-fuel storage casks. Dramatic improvements in calculation efficiency have been observed in some test cases. In addition, the procedure has been found to provide acceleration to acceptable convergence, as well as the benefit of quickly identifying user-specified variance reduction in the transport that causes unstable convergence.

  20. Turkish adaptation of the Fear of Spiders Questionnaire: Reliability and validity in non-clinical samples

    Robert W. Booth

    2016-12-01

The rapid, objective measurement of spider fear is important for clinicians, and for researchers studying fear. To facilitate this, we adapted the Fear of Spiders Questionnaire (FSQ) into Turkish. The FSQ is quick to complete and easy to understand. Compared to the commonly used Spider Phobia Questionnaire, it has shown superior test-retest reliability and better discrimination of lower levels of spider fear, facilitating fear research in non-clinical samples. In two studies, with 137 and 105 undergraduates and unselected volunteers, our adapted FSQ showed excellent internal consistency (Cronbach’s α = .95 and .96) and test-retest reliability (r = .90), and good discriminant validity against the State–Trait Anxiety Inventory—Trait (r = .23) and Beck Anxiety Inventory—Trait (r = .07). Most importantly, our adapted FSQ significantly predicted 26 students’ self-reported discomfort upon approaching a caged tarantula; however, a measure of behavioural avoidance of the tarantula yielded little variability, so a more sensitive task will be required for future behavioural testing. Based on this initial testing, we recommend our adapted FSQ for research use. Further research is required to verify that our adapted FSQ discriminates effectively between individuals with and without phobia. A Turkish-language report of the studies is included as supplementary material.
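
The internal-consistency figures quoted above come from the standard Cronbach's alpha formula, which is simple to reproduce from raw item scores; the data below are made up for illustration.

```python
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, k_items) score matrix.
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars / total_var)

scores = np.array([[3, 4, 3], [1, 2, 1], [4, 5, 5], [2, 2, 3]])  # toy data
print(f"alpha = {cronbach_alpha(scores):.2f}")
```

Values near .95, as reported for the adapted FSQ, indicate that the items measure a single construct very consistently.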

  1. Large Deviations and Importance Sampling for Systems of Slow-Fast Motion

    Spiliopoulos, Konstantinos, E-mail: kspiliop@dam.brown.edu [Brown University, Division of Applied Mathematics (United States)

    2013-02-15

    In this paper we develop the large deviations principle and a rigorous mathematical framework for asymptotically efficient importance sampling schemes for general, fully dependent systems of stochastic differential equations of slow and fast motion with small noise in the slow component. We assume periodicity with respect to the fast component. Depending on the interaction of the fast scale with the smallness of the noise, we get different behavior. We examine how one range of interaction differs from the other one both for the large deviations and for the importance sampling. We use the large deviations results to identify asymptotically optimal importance sampling schemes in each case. Standard Monte Carlo schemes perform poorly in the small noise limit. In the presence of multiscale aspects one faces additional difficulties and straightforward adaptation of importance sampling schemes for standard small noise diffusions will not produce efficient schemes. It turns out that one has to consider the so called cell problem from the homogenization theory for Hamilton-Jacobi-Bellman equations in order to guarantee asymptotic optimality. We use stochastic control arguments.

  2. The importance of cooling of urine samples for doping analysis

    Kuenen, J.G.; Konings, W.N.

    2009-01-01

Storing and transporting of urine samples for doping analysis, as performed by the anti-doping organizations associated with the World Anti-Doping Agency, does not include a specific protocol for cooled transport from the place of urine sampling to the doping laboratory, although low-cost cooling facilities ...

  4. Innovation and adaptation in a Turkish sample: a preliminary study.

    Oner, B

    2000-11-01

    The aim of this study was to examine the representations of adaptation and innovation among adults in Turkey. Semi-structured interviews were carried out with a sample of 20 Turkish adults (10 men, 10 women) from various occupations. The participants' ages ranged from 21 to 58 years. Results of content analysis showed that the representation of innovation varied with the type of context. Innovation was not preferred within the family and interpersonal relationship contexts, whereas it was relatively more readily welcomed within the contexts of work, science, and technology. This finding may indicate that the concept of innovation that is assimilated in traditional Turkish culture has limits. Contents of the interviews were also analyzed with respect to M. J. Kirton's (1976) subscales of originality, efficiency, and rule-group conformity. The participants favored efficient innovators, whereas they thought that the risk of failure was high in cases of inefficient innovation. The reasons for and indications of the representations of innovativeness among Turkish people are discussed in relation to their social structure and cultural expectations. PMID:11092420

  5. Joint importance sampling of low-order volumetric scattering

    Georgiev, Iliyan; Křivánek, Jaroslav; Hachisuka, Toshiya; Nowrouzezahrai, Derek; Jarosz, Wojciech

Central to all Monte Carlo-based rendering algorithms is the construction of light transport paths from the light sources to the eye. Existing rendering approaches sample path vertices incrementally when constructing these light transport paths. The resulting probability density is thus a product of the conditional densities of each local sampling step, constructed without explicit control over the form of the final joint distribution of the complete path. We analyze why current incremental construction schemes often lead to high variance in the presence of participating media, and reveal ... path-based rendering algorithms such as path tracing, bidirectional path tracing, and many-light methods. We also use our sampling routines to generalize deterministic shadow connections to connection subpaths consisting of two or three random decisions, to efficiently simulate higher-order multiple ...

  6. Job performance ratings : The relative importance of mental ability, conscientiousness, and career adaptability

    Ohme, Melanie; Zacher, Hannes

    2015-01-01

According to career construction theory, continuous adaptation to the work environment is crucial to achieve work and career success. In this study, we examined the relative importance of career adaptability for job performance ratings using an experimental policy-capturing design. Employees (N = 13 ...

  7. On the importance sampling of self-avoiding walks

    Bousquet-Mélou, Mireille

    2014-01-01

    In a 1976 paper published in Science, Knuth presented an algorithm to sample (non-uniform) self-avoiding walks crossing a square of side k. From this sample, he constructed an estimator for the number of such walks. The quality of this estimator is directly related to the (relative) variance of a certain random variable X_k. From his experiments, Knuth suspected that this variance was extremely large (so that the estimator would not be very efficient). But how large? For the analogous Rosenbl...

  8. Stress avoidance in a common annual: reproductive timing is important for local adaptation and geographic distribution.

    Griffith, T M; Watson, M A

    2005-11-01

    Adaptation to local environments may be an important determinant of species' geographic range. However, little is known about which traits contribute to adaptation or whether their further evolution would facilitate range expansion. In this study, we assessed the adaptive value of stress avoidance traits in the common annual Cocklebur (Xanthium strumarium) by performing a reciprocal transplant across a broad latitudinal gradient extending to the species' northern border. Populations were locally adapted and stress avoidance traits accounted for most fitness differences between populations. At the northern border where growing seasons are cooler and shorter, native populations had evolved to reproduce earlier than native populations in the lower latitude gardens. This clinal pattern in reproductive timing corresponded to a shift in selection from favouring later to earlier reproduction. Thus, earlier reproduction is an important adaptation to northern latitudes and constraint on the further evolution of this trait in marginal populations could potentially limit distribution. PMID:16313471

  9. Parallel importance sampling in conditional linear Gaussian networks

    Salmerón, Antonio; Ramos-López, Darío; Borchani, Hanen;

    2015-01-01

In this paper we analyse the problem of probabilistic inference in CLG networks when evidence comes in streams. In such situations, fast and scalable algorithms, able to provide accurate responses in a short time, are required. We consider the instantiation of variational inference and importance ...

  10. Determination of free energy profiles by repository based adaptive umbrella sampling: Bridging nonequilibrium and quasiequilibrium simulations

    Zheng, Han; Zhang, Yingkai

    2008-01-01

We propose a new adaptive sampling approach to determine free energy profiles with molecular dynamics simulations, called “repository based adaptive umbrella sampling” (RBAUS). Its main idea is that a sampling repository is continuously updated based on the latest simulation data, and the accumulated knowledge and sampling history are then employed to determine whether and how to update the biasing umbrella potential for subsequent simulations. In comparison with other adaptive me...

  11. On the importance sampling of self-avoiding walks

    Bousquet-Mélou, Mireille

    2011-01-01

In a 1976 paper published in Science, Knuth presented an algorithm to sample (non-uniform) self-avoiding walks crossing a square of side $k$. From this sample, he constructed an estimator for the number of such walks. The quality of this estimator is directly related to the (relative) variance of a certain random variable $X_k$. From his experiments, Knuth suspected that this variance was extremely large, so that the estimator would not be very efficient. A few years ago, Bassetti and Diaconis showed that, for a similar sampler that only generates walks consisting of North and East steps, the relative variance is $O(\sqrt{k})$. In this note we go one step further by showing that, for walks consisting of North, South and East steps, the relative variance is of the order of $2^{k(k+1)}/(k+1)^{2k}$, and thus much larger than exponential in $k$. We also obtain partial results for general self-avoiding walks, suggesting that the relative variance could be as large as $\mu^{k^2}$ for some $\mu > 1$. Knuth's algorithm is a basic exa...
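
Knuth's sampler itself fits in a dozen lines: grow the walk one step at a time, choose uniformly among the self-avoiding continuations, and multiply the choice counts; the product is an unbiased estimator of the number of walks, and its enormous relative variance is exactly what the note above quantifies. The sketch below counts unrestricted n-step walks on Z²; the square-crossing variant adds boundary and endpoint constraints.

```python
import random

MOVES = [(1, 0), (-1, 0), (0, 1), (0, -1)]
rng = random.Random(5)

def saw_estimate(n_steps):
    """One Rosenbluth/Knuth run: unbiased estimate of the number of
    n-step self-avoiding walks (0 if the walk traps itself)."""
    pos, visited, weight = (0, 0), {(0, 0)}, 1
    for _ in range(n_steps):
        free = [(pos[0] + dx, pos[1] + dy) for dx, dy in MOVES
                if (pos[0] + dx, pos[1] + dy) not in visited]
        if not free:
            return 0
        weight *= len(free)        # number of choices at this step
        pos = rng.choice(free)
        visited.add(pos)
    return weight

runs = [saw_estimate(10) for _ in range(50_000)]
print(sum(runs) / len(runs))       # ~44,100 = number of 10-step SAWs on Z^2
```

Unbiasedness follows because each walk of length n is produced with probability 1/(d_1 ... d_n), the reciprocal of its weight.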

  12. An adaptive two-stage sequential design for sampling rare and clustered populations

    Brown, J.A.; Salehi, M.M.; Moradi, M.; Bell, G.; Smith, D.R.

    2008-01-01

    How to design an efficient large-area survey continues to be an interesting question for ecologists. In sampling large areas, as is common in environmental studies, adaptive sampling can be efficient because it ensures survey effort is targeted to subareas of high interest. In two-stage sampling, higher density primary sample units are usually of more interest than lower density primary units when populations are rare and clustered. Two-stage sequential sampling has been suggested as a method for allocating second stage sample effort among primary units. Here, we suggest a modification: adaptive two-stage sequential sampling. In this method, the adaptive part of the allocation process means the design is more flexible in how much extra effort can be directed to higher-abundance primary units. We discuss how best to design an adaptive two-stage sequential sample. ?? 2008 The Society of Population Ecology and Springer.

  13. State-independent importance sampling for random walks with regularly varying increments

    Karthyek R. A. Murthy

    2015-03-01

We develop importance sampling based efficient simulation techniques for three commonly encountered rare event probabilities associated with random walks having i.i.d. regularly varying increments: namely, (1) the large deviation probabilities, (2) the level crossing probabilities, and (3) the level crossing probabilities within a regenerative cycle. Exponential twisting based state-independent methods, which are effective in efficiently estimating these probabilities for light-tailed increments, are not applicable when the increments are heavy-tailed. To address the latter case, more complex and elegant state-dependent efficient simulation algorithms have been developed in the literature over the last few years. We propose that by suitably decomposing these rare event probabilities into a dominant and further residual components, simpler state-independent importance sampling algorithms can be devised for each component, resulting in composite unbiased estimators with desirable efficiency properties. When the increments have infinite variance, there is an added complexity in estimating the level crossing probabilities, as even the well known zero-variance measures have an infinite expected termination time. We adapt our algorithms so that this expectation is finite while the estimators remain strongly efficient. Numerically, the proposed estimators perform at least as well as, and sometimes substantially better than, the existing state-dependent estimators in the literature.

  14. An Adaptive Sampling System for Sensor Nodes in Body Area Networks.

    Rieger, R; Taylor, J

    2014-04-23

The importance of body sensor networks for monitoring patients over prolonged periods of time has increased with advances in home healthcare applications. Sensor nodes need to operate with very low power consumption and under the constraint of limited memory capacity. Therefore, it is wasteful to digitize the sensor signal at a constant sample rate, given that the frequency content of the signals varies with time. Adaptive sampling is established as a practical method to reduce the sample data volume. In this paper a low-power analog system is proposed, which adjusts the converter clock rate to perform a peak-picking algorithm on the second derivative of the input signal. The presented implementation does not require an analog-to-digital converter or a digital processor in the sample selection process. The criteria for selecting a suitable detection threshold are discussed, so that the maximum sampling error can be limited. A circuit-level implementation is presented. Measured results exhibit a significant reduction in the average sample frequency and data rate of over 50% and 38%, respectively. PMID:24760918

  15. Multiview Sample Classification Algorithm Based on L1-Graph Domain Adaptation Learning

    Huibin Lu; Zhengping Hu; Hongxiao Gao

    2015-01-01

In multiview sample classification with different distributions, training and testing samples come from different domains. In order to improve the classification performance, a multiview sample classification algorithm based on L1-Graph domain adaptation learning is presented. First of all, a framework of nonnegative matrix tri-factorization based on domain adaptation learning is formed, in which the unchanged information is regarded as the bridge of knowledge transformation from the...

  16. Climate variables explain neutral and adaptive variation within salmonid metapopulations: The importance of replication in landscape genetics

    Hand, Brian K; Muhlfeld, Clint C.; Wade, Alisa A.; Kovach, Ryan; Whited, Diane C.; Narum, Shawn R; Matala, Andrew P; Ackerman, Michael W.; Garner, B. A.; Kimball, John S; Stanford, Jack A.; Luikart, Gordon

    2016-01-01

    Understanding how environmental variation influences population genetic structure is important for conservation management because it can reveal how human stressors influence population connectivity, genetic diversity and persistence. We used riverscape genetics modelling to assess whether climatic and habitat variables were related to neutral and adaptive patterns of genetic differentiation (population-specific and pairwise FST) within five metapopulations (79 populations, 4583 individuals) of steelhead trout (Oncorhynchus mykiss) in the Columbia River Basin, USA. Using 151 putatively neutral and 29 candidate adaptive SNP loci, we found that climate-related variables (winter precipitation, summer maximum temperature, winter highest 5% flow events and summer mean flow) best explained neutral and adaptive patterns of genetic differentiation within metapopulations, suggesting that climatic variation likely influences both demography (neutral variation) and local adaptation (adaptive variation). However, we did not observe consistent relationships between climate variables and FST across all metapopulations, underscoring the need for replication when extrapolating results from one scale to another (e.g. basin-wide to the metapopulation scale). Sensitivity analysis (leave-one-population-out) revealed consistent relationships between climate variables and FST within three metapopulations; however, these patterns were not consistent in two metapopulations likely due to small sample sizes (N = 10). These results provide correlative evidence that climatic variation has shaped the genetic structure of steelhead populations and highlight the need for replication and sensitivity analyses in land and riverscape genetics.

  17. An Adaptive Spectral Clustering Algorithm Based on the Importance of Shared Nearest Neighbors

    Xiaoqi He

    2015-05-01

The construction of a similarity matrix is one significant step for the spectral clustering algorithm, and the Gaussian kernel function is one of the most common measures for constructing the similarity matrix. However, with a fixed scaling parameter, the similarity between two data points is not adaptive and is inappropriate for multi-scale datasets. In this paper, through quantifying the value of the importance for each vertex of the similarity graph, the Gaussian kernel function is scaled, and an adaptive Gaussian kernel similarity measure is proposed. An adaptive spectral clustering algorithm is then obtained based on the importance of shared nearest neighbors. The idea is that the greater the importance of the shared neighbors between two vertexes, the more likely it is that these two vertexes belong to the same cluster; the importance value of the shared neighbors is obtained with an iterative method, which considers both the local structural information and the distance similarity information, so as to improve the algorithm’s performance. Experimental results on different datasets show that our spectral clustering algorithm outperforms other spectral clustering algorithms, such as self-tuning spectral clustering and adaptive spectral clustering based on shared nearest neighbors, in clustering accuracy on most datasets.
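
The locally scaled baseline the abstract compares against (the self-tuning construction of Zelnik-Manor and Perona) is worth seeing explicitly, since the proposed method replaces its distance-based scales with shared-nearest-neighbor importance values. The sketch below is that baseline only, with illustrative parameters:

```python
import numpy as np

def self_tuning_affinity(X, k=7):
    """Locally scaled Gaussian affinity: A_ij = exp(-d_ij^2 / (s_i * s_j)),
    where s_i is the distance from point i to its k-th nearest neighbor."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    sigma = np.sort(d, axis=1)[:, k]      # column 0 is the self-distance
    A = np.exp(-d ** 2 / np.outer(sigma, sigma))
    np.fill_diagonal(A, 0.0)              # no self-affinity
    return A

X = np.random.default_rng(6).random((30, 2))
print(self_tuning_affinity(X).shape)      # (30, 30) similarity matrix
```

Feeding such a matrix to any standard spectral clustering pipeline (graph Laplacian, eigenvectors, k-means) completes the algorithm; the paper's contribution is a better choice of the per-point scales.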

  18. Sampling Plant Diversity and Rarity at Landscape Scales: Importance of Sampling Time in Species Detectability

    Zhang, Jian; Nielsen, Scott E.; Grainger, Tess N.; Kohler, Monica; Chipchar, Tim; Farr, Daniel R.

    2014-01-01

    Documenting and estimating species richness at regional or landscape scales has been a major emphasis for conservation efforts, as well as for the development and testing of evolutionary and ecological theory. Rarely, however, are sampling efforts assessed on how they affect detection and estimates of species richness and rarity. In this study, vascular plant richness was sampled in 356 quarter hectare time-unlimited survey plots in the boreal region of northeast Alberta. These surveys consisted of 15,856 observations of 499 vascular plant species (97 considered to be regionally rare) collected by 12 observers over a 2 year period. Average survey time for each quarter-hectare plot was 82 minutes, ranging from 20 to 194 minutes, with a positive relationship between total survey time and total plant richness. When survey time was limited to a 20-minute search, as in other Alberta biodiversity methods, 61 species were missed. Extending the survey time to 60 minutes, reduced the number of missed species to 20, while a 90-minute cut-off time resulted in the loss of 8 species. When surveys were separated by habitat type, 60 minutes of search effort sampled nearly 90% of total observed richness for all habitats. Relative to rare species, time-unlimited surveys had ∼65% higher rare plant detections post-20 minutes than during the first 20 minutes of the survey. Although exhaustive sampling was attempted, observer bias was noted among observers when a subsample of plots was re-surveyed by different observers. Our findings suggest that sampling time, combined with sample size and observer effects, should be considered in landscape-scale plant biodiversity surveys. PMID:24740179

  19. Adaptive optics for deeper imaging of biological samples.

    Girkin, John M; Poland, Simon; Wright, Amanda J

    2009-02-01

Optical microscopy has been a cornerstone of life science investigations since its first practical application around 400 years ago, with the goal being subcellular resolution, three-dimensional images, at depth, in living samples. Nonlinear microscopy brought this dream a step closer, but as one images more deeply, the material through which one images can greatly distort the view. By using optical devices originally developed for astronomy, whose optical properties can be changed in real time, active compensation for sample-induced aberrations is possible. Submicron-resolution images are now routinely recorded from depths over 1 mm into tissue. Such active optical elements can also be used to keep conventional microscopes, both confocal and widefield, in optimal alignment. PMID:19272766

  20. Adapting sampling plans to caribou distribution on calving grounds

    Michel Crête

    1991-10-01

Between 1984 and 1988, the size of the two caribou herds in northern Québec was derived by combining estimates of female numbers on calving grounds in June and composition counts during rut in autumn. Sampling with aerial photos was conducted on calving grounds to determine the number of animals per km², telemetry served to estimate the proportion of females in the census area at the time of photography in addition to summer survival rate, and helicopter or ground observations were used for composition counts. Observers were able to detect on black-and-white negatives over 95 percent of caribou counted from a helicopter flying at low altitude over the same area; photo scale varied between 1:3 600 and 1:6 000. Sampling units covering less than 15-20 ha were the best for sampling caribou distribution on calving grounds, where density generally averaged ≈ 10 individuals/km². Around 90 percent of caribou on calving grounds were females; others were mostly yearling males. During the 1-2 day photographic census, 64 to 77 percent of the females were present on the calving areas. Summer survival exceeded 95 percent in three summers. In autumn, females composed between 45 and 54 percent of each herd. The Rivière George herd was estimated at 682 000 individuals (±36%; alpha = 0.10) in 1988. This estimate was imprecise due to insufficient sample size for measuring animal density on the calving ground and for determining the proportion of females on the calving ground at the time of the photo census. To improve precision and reduce cost, it is proposed to estimate herd size of tundra caribou in one step, using only aerial photos in early June without telemetry.

  1. Long-term dynamics of adaptive evolution in a globally important phytoplankton species to ocean acidification.

    Schlüter, Lothar; Lohbeck, Kai T; Gröger, Joachim P; Riebesell, Ulf; Reusch, Thorsten B H

    2016-07-01

    Marine phytoplankton may adapt to ocean change, such as acidification or warming, because of their large population sizes and short generation times. Long-term adaptation to novel environments is a dynamic process, and phenotypic change can take place thousands of generations after exposure to novel conditions. We conducted a long-term evolution experiment (4 years = 2100 generations), starting with a single clone of the abundant and widespread coccolithophore Emiliania huxleyi exposed to three different CO2 levels simulating ocean acidification (OA). Growth rates as a proxy for Darwinian fitness increased only moderately under both levels of OA [+3.4% and +4.8%, respectively, at 1100 and 2200 μatm partial pressure of CO2 (Pco2)] relative to control treatments (ambient CO2, 400 μatm). Long-term adaptation to OA was complex, and initial phenotypic responses of ecologically important traits were later reverted. The biogeochemically important trait of calcification, in particular, that had initially been restored within the first year of evolution was later reduced to levels lower than the performance of nonadapted populations under OA. Calcification was not constitutively lost but returned to control treatment levels when high CO2-adapted isolates were transferred back to present-day control CO2 conditions. Selection under elevated CO2 exacerbated a general decrease of cell sizes under long-term laboratory evolution. Our results show that phytoplankton may evolve complex phenotypic plasticity that can affect biogeochemically important traits, such as calcification. Adaptive evolution may play out over longer time scales (>1 year) in an unforeseen way under future ocean conditions that cannot be predicted from initial adaptation responses. PMID:27419227

  2. Estimating the Importance of Private Adaptation to Climate Change in Agriculture: A Review of Empirical Methods

    Moore, F.; Burke, M.

    2015-12-01

A wide range of studies using a variety of methods strongly suggest that climate change will have a negative impact on agricultural production in many areas. Farmers, though, should be able to learn about a changing climate and to adjust what they grow and how they grow it in order to reduce these negative impacts. However, it remains unclear how effective these private (autonomous) adaptations will be, or how quickly they will be adopted. Constraining the uncertainty on this adaptation is important for understanding the impacts of climate change on agriculture. Here we review a number of empirical methods that have been proposed for understanding the rate and effectiveness of private adaptation to climate change. We compare these methods using data on agricultural yields in the United States and western Europe.

  3. Dangerous climate change and the importance of adaptation for the Arctic's Inuit population

    Ford, James D.

    2009-04-01

    The Arctic's climate is changing rapidly, to the extent that 'dangerous' climate change as defined by the United Nations Framework on Climate Change might already be occurring. These changes are having implications for the Arctic's Inuit population and are being exacerbated by the dependence of Inuit on biophysical resources for livelihoods and the low socio-economic-health status of many northern communities. Given the nature of current climate change and projections of a rapidly warming Arctic, climate policy assumes a particular importance for Inuit regions. This paper argues that efforts to stabilize and reduce greenhouse gas emissions are urgent if we are to avoid runaway climate change in the Arctic, but unlikely to prevent changes which will be dangerous for Inuit. In this context, a new policy discourse on climate change is required for Arctic regions—one that focuses on adaptation. The paper demonstrates that states with Inuit populations and the international community in general has obligations to assist Inuit to adapt to climate change through international human rights and climate change treaties. However, the adaptation deficit, in terms of what we know and what we need to know to facilitate successful adaptation, is particularly large in an Arctic context and limiting the ability to develop response options. Moreover, adaptation as an option of response to climate change is still marginal in policy negotiations and Inuit political actors have been slow to argue the need for adaptation assistance. A new focus on adaptation in both policy negotiations and scientific research is needed to enhance Inuit resilience and reduce vulnerability in a rapidly changing climate.

  5. Adaptation and Initial Validation of the Passion Scale in a Portuguese Sample

    Gabriela Gonçalves

    2014-08-01

Passion is defined as a strong inclination to engage in an activity that people like, that they find important, and in which they invest time and energy. As no specific measure to assess levels of passion in the workplace in Portugal is available, the aim of this study was to adapt the Passion scale into Portuguese and validate it. The scale was translated from English into Portuguese using the forward-backward translation method and administered to a sample of 551 Portuguese workers. Exploratory factor analyses were conducted to test the replicability of the scale. The results confirmed the expected two-factor structure: harmonious passion and obsessive passion. However, the initial criterion of the replication of the factorial structure based on item factor loadings was not fulfilled. Criterion-related validity was tested by correlations with passion and job satisfaction. Regarding internal consistency, adequate alpha coefficients were obtained for both factors.

  6. Parks, people, and change: the importance of multistakeholder engagement in adaptation planning for conserved areas

    Corrine N. Knapp

    2014-12-01

Climate change challenges the traditional goals and conservation strategies of protected areas, necessitating adaptation to changing conditions. Denali National Park and Preserve (Denali) in south central Alaska, USA, is a vast landscape that is responding to climate change in ways that will impact both ecological resources and local communities. Local observations help to inform understanding of climate change and adaptation planning, but whose knowledge is most important to consider? For this project we interviewed long-term Denali staff, scientists, subsistence community members, bus drivers, and business owners to assess what types of observations each can contribute, how climate change is impacting each, and what they think the National Park Service should do to adapt. The project shows that each type of long-term observer has different types of observations, but that those who depend more directly on natural resources for their livelihoods have more and different observations than those who do not. These findings suggest that engaging multiple groups of stakeholders who interact with the park in distinct ways adds substantially to the information provided by Denali staff and scientists, and offers a broader foundation for adaptation planning. It also suggests that traditional protected-area paradigms that fail to learn from and foster appropriate engagement of people may be maladaptive in the context of climate change.

  7. Estimating the abundance of clustered animal population by using adaptive cluster sampling and negative binomial distribution

    Bo, Yizhou; Shifa, Naima

    2013-09-01

An estimator for finding the abundance of a rare, clustered, and mobile population is introduced. This model is based on adaptive cluster sampling (ACS) to identify the location of the population and the negative binomial distribution to estimate the total at each site. To identify the location of the population, we consider both sampling with replacement (WR) and sampling without replacement (WOR). Some mathematical properties of the model are also developed.
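
The negative binomial stage can be sketched with method-of-moments estimates: for overdispersed counts with sample mean m and variance v > m, take p = m/v and r = m²/(v − m). The site counts below are invented for illustration.

```python
import numpy as np

def nb_moment_fit(counts):
    """Method-of-moments NB fit: mean m, variance v > m gives
    p = m/v and r = m^2/(v - m)."""
    m, v = np.mean(counts), np.var(counts, ddof=1)
    if v <= m:
        raise ValueError("no overdispersion; a Poisson model may suffice")
    return m * m / (v - m), m / v          # (r, p)

counts = np.array([0, 0, 1, 0, 7, 12, 0, 3, 0, 5])   # clustered site counts
r, p = nb_moment_fit(counts)
print(f"r = {r:.2f}, p = {p:.2f}, implied mean = {r * (1 - p) / p:.2f}")
```

In the full design, adaptive cluster sampling decides which neighborhoods get surveyed, and an NB model like this one converts the per-site counts into an abundance estimate with uncertainty.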

  8. Adaptive sampling based on the cumulative distribution function of order statistics to delineate heavy-metal contaminated soils using kriging

Correctly classifying 'contaminated' areas in soils, based on the threshold for a contaminated site, is important for determining effective clean-up actions. Pollutant mapping by means of kriging is increasingly being used for the delineation of contaminated soils. However, those areas where the kriged pollutant concentrations are close to the threshold have a high possibility of being misclassified. In order to reduce the misclassification due to the over- or under-estimation from kriging, an adaptive sampling method using the cumulative distribution function of order statistics (CDFOS) was developed to draw additional samples for delineating contaminated soils while kriging. A heavy-metal contaminated site in Hsinchu, Taiwan was used to illustrate this approach. The results showed that, compared with random sampling, adaptive sampling using CDFOS reduced the kriging estimation errors and misclassification rates, and thus would appear to be a better choice than random sampling when additional sampling is required for delineating the 'contaminated' areas.

  9. Intraspecific shape variation in horseshoe crabs: the importance of sexual and natural selection for local adaptation

    Faurby, Søren; Nielsen, Kasper Sauer Kollerup; Bussarawit, Somchai;

    2011-01-01

A morphometric analysis of the body shape of three species of horseshoe crabs was undertaken in order to infer the importance of natural and sexual selection. It was expected that natural selection would be most intense, leading to the highest regional differentiation, in the American species Limulus polyphemus, which has the largest climatic differences between populations. Local adaptation driven by sexual selection was expected in males but not females, because horseshoe crab mating behaviour leads to competition between males but not between females. Three hundred fifty-nine horseshoe crabs...

  10. Management research in West Africa: importation or adaptation?

    Livian, Yves

    2013-01-01

Management research in the countries of the "North" developed in an economic and social context quite different from that of African countries today. Pure importation is doomed to failure, as is the "essentialist" tendency to posit an irreducible African specificity. A few avenues are sketched for adapting research to the local context, an effort promised considerable development given the vast empirical terrain still to be explored.

  11. Assessing employability capacities and career adaptability in a sample of human resource professionals

    Melinde Coetzee; Nadia Ferreira; Ingrid L. Potgieter

    2015-01-01

    Orientation: Employers have come to recognise graduates’ employability capacities and their ability to adapt to new work demands as important human capital resources for sustaining a competitive business advantage. Research purpose: The study sought (1) to ascertain whether a significant relationship exists between a set of graduate employability capacities and a set of career adaptability capacities and (2) to identify the variables that contributed the most to this relationship. Motivation fo...

  12. Data reduction in the ITMS system through a data acquisition model with self-adaptive sampling rate

    Ruiz, M. [Grupo de Investigacion en Instrumentacion y Acustica Aplicada, Universidad Politecnica de Madrid (UPM), Crta. Valencia Km-7, Madrid 28031 (Spain)], E-mail: mariano.ruiz@upm.es; Lopez, JM.; Arcas, G. de [Grupo de Investigacion en Instrumentacion y Acustica Aplicada, Universidad Politecnica de Madrid (UPM), Crta. Valencia Km-7, Madrid 28031 (Spain); Barrera, E. [Departamento de Sistemas Electronicos y de Control, Universidad Politecnica de Madrid (UPM), Crta. Valencia Km-7, Madrid 28031 (Spain); Melendez, R. [Grupo de Investigacion en Instrumentacion y Acustica Aplicada, Universidad Politecnica de Madrid (UPM), Crta. Valencia Km-7, Madrid 28031 (Spain); Vega, J. [Asociacion EURATOM/CIEMAT para Fusion, Madrid (Spain)

    2008-04-15

    Long pulse or steady state operation of fusion experiments requires data acquisition and processing systems that reduce the volume of data involved. The availability of self-adaptive sampling rate systems and the use of real-time lossless data compression techniques can help solve these problems. The former is important for continuous adaptation of the sampling frequency to experimental requirements. The latter allows the maintenance of continuous digitization under limited memory conditions. This can be achieved by permanent transmission of compressed data to other systems. The compacted transfer ensures the use of minimum bandwidth. This paper presents an implementation based on the intelligent test and measurement system (ITMS), a data acquisition system architecture with multiprocessing capabilities that permits the system's sampling frequency to be adapted throughout the experiment. The sampling rate can be controlled depending on the experiment's specific requirements by using an external dc voltage signal or by defining user events through software. The system takes advantage of the high processing capabilities of the ITMS platform to implement a data reduction mechanism based on lossless data compression algorithms, which are themselves based on periodic deltas.
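
    The ITMS implementation is not spelled out in the abstract; as a rough Python illustration of the "periodic deltas" idea, first-order differencing of a slowly varying digitized signal concentrates values near zero, so whatever entropy coder sits downstream compresses the stream far better than the raw samples would:

        import numpy as np

        def delta_encode(x: np.ndarray) -> np.ndarray:
            # First sample kept verbatim, then first-order differences.
            d = np.empty_like(x)
            d[0] = x[0]
            d[1:] = np.diff(x)
            return d

        def delta_decode(d: np.ndarray) -> np.ndarray:
            return np.cumsum(d)                 # exact inverse: lossless

        x = (1000 * np.sin(np.linspace(0, 6, 10_000))).astype(np.int32)
        d = delta_encode(x)
        assert np.array_equal(delta_decode(d), x)
        # Far fewer distinct values in the deltas => lower entropy, better compression
        print(len(np.unique(x)), "raw symbols vs", len(np.unique(d)), "delta symbols")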

  13. The importance of the EU green paper on climate adaptation for the Netherlands

    An analysis of the EU Green Paper on Climate Adaptation shows that it is not inconsistent with the Dutch national adaptation strategy, but it does differ from it. The Green Paper also highlights the European dimension of climate adaptation and approaches climate adaptation in a broader context. Furthermore, the challenge is to remain alert and to bring Dutch ideas into the EU policy process when concrete measures are formulated.

  14. Low Bit-Rate Image Compression using Adaptive Down-Sampling technique

    V.Swathi; Prof. K ASHOK BABU

    2011-01-01

    In this paper, we are going to use a practical approach of uniform down sampling in image space and yet making the sampling adaptive by spatially varying, directional low-pass pre-filtering. The resulting down-sampled pre-filtered image remains a conventional square sample grid, and, thus, it can be compressed and transmitted without any change to current image coding standards and systems. The decoder first decompresses the low-resolution image and then up-converts it to the original resolut...

  15. How to apply importance-sampling techniques to simulations of optical systems

    McKinstrie, C. J.; Winzer, P. J.

    2003-01-01

    This report contains a tutorial introduction to the method of importance sampling. The use of this method is illustrated for simulations of the noise-induced energy jitter of return-to-zero pulses in optical communication systems.
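
    The report itself is not reproduced here, but the core trick it teaches can be shown in a few lines of Python on a generic toy problem: estimating the tail probability P(X > 5) of a standard normal by sampling from a mean-shifted proposal and reweighting with the likelihood ratio. The optical-system specifics are not attempted.

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(42)
        a, n = 5.0, 100_000                 # P(X > 5) for X ~ N(0,1), about 2.9e-7

        naive = (rng.standard_normal(n) > a).mean()   # plain MC: almost always 0

        x = rng.normal(loc=a, scale=1.0, size=n)      # proposal centred on the rare set
        w = norm.pdf(x) / norm.pdf(x, loc=a)          # likelihood-ratio weights
        is_est = np.mean((x > a) * w)

        print(f"exact = {norm.sf(a):.3e}, naive MC = {naive:.3e}, IS = {is_est:.3e}")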

  16. An Importance Sampling Scheme on Dual Factor Graphs. II. Models with Strong Couplings

    Molkaraie, Mehdi

    2014-01-01

    We consider the problem of estimating the partition function of the two-dimensional ferromagnetic Ising model in an external magnetic field. The estimation is done via importance sampling in the dual of the Forney factor graph representing the model. We present importance sampling schemes that can efficiently compute an estimate of the partition function in a wide range of model parameters. Emphasis is on models in which a subset of the coupling parameters is strong.
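
    The dual-graph construction cannot be reconstructed from the abstract; the toy Python sketch below only shows what "estimating a partition function by importance sampling" means, in the primal domain and on a grid small enough to enumerate exactly. A uniform proposal, as used here, degrades precisely in the strong-coupling regime that motivates the paper's better proposals.

        import itertools
        import numpy as np

        rng = np.random.default_rng(7)
        L, beta, h = 3, 0.4, 0.2            # tiny grid: Z is exactly enumerable

        def energy(s):
            s = s.reshape(L, L)             # nearest-neighbour couplings + field h
            e = -(s[:, :-1] * s[:, 1:]).sum() - (s[:-1, :] * s[1:, :]).sum()
            return e - h * s.sum()

        configs = np.array(list(itertools.product([-1, 1], repeat=L * L)))
        Z_exact = np.exp([-beta * energy(c) for c in configs]).sum()

        # Importance sampling with a uniform proposal over configurations:
        # Z = 2^(L*L) * E_uniform[exp(-beta * H)]
        n = 20_000
        samp = rng.choice([-1, 1], size=(n, L * L))
        Z_is = 2 ** (L * L) * np.mean(np.exp([-beta * energy(s) for s in samp]))

        print(f"Z exact = {Z_exact:.1f}, IS estimate = {Z_is:.1f}")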

  17. An upgraded version of an importance sampling algorithm for large scale shell model calculations

    Bianco, D; Andreozzi, F; Lo Iudice, N; Porrino, A [Universita di Napoli Federico II, Dipartimento Scienze Fisiche, Monte S. Angelo, via Cintia, 80126 Napoli (Italy); Dimitrova, S, E-mail: loiudice@na.infn.i [Institute of Nuclear Research and Nuclear Energy, Sofia (Bulgaria)]

    2010-01-01

    An importance sampling iterative algorithm, developed a few years ago for generating exact eigensolutions of large matrices, is upgraded so as to allow large scale shell model calculations in the uncoupled m-scheme. By exploiting the sparsity properties of the Hamiltonian matrix and effectively projecting out the good angular momentum, the new importance sampling allows the sizes of the matrices to be reduced drastically while keeping full control of the accuracy of the eigensolutions. Illustrative numerical examples are presented.

  18. The use of importance sampling in a trial assessment to obtain converged estimates of radiological risk

    In developing a methodology for assessing potential sites for the disposal of radioactive wastes, the Department of the Environment has conducted a series of trial assessment exercises. In order to produce converged estimates of radiological risk using the SYVAC A/C simulation system, an efficient sampling procedure is required. Previous work has demonstrated that importance sampling can substantially increase sampling efficiency. This study used importance sampling to produce converged estimates of risk for the first DoE trial assessment. Four major nuclide chains were analysed. In each case importance sampling produced converged risk estimates with between 10 and 170 times fewer runs of the SYVAC A/C model. This increase in sampling efficiency can reduce the total elapsed time required to obtain a converged estimate of risk from one nuclide chain by a factor of 20. The results of this study suggest that the use of importance sampling could reduce the elapsed time required to perform a risk assessment of a potential site by a factor of ten. (author)

  19. Multiview Sample Classification Algorithm Based on L1-Graph Domain Adaptation Learning

    Huibin Lu

    2015-01-01

    Full Text Available In the case of multiview sample classification with different distributions, training and testing samples are from different domains. In order to improve the classification performance, a multiview sample classification algorithm based on L1-Graph domain adaptation learning is presented. First, a framework of nonnegative matrix trifactorization based on domain adaptation learning is formed, in which the invariant information is regarded as the bridge for knowledge transfer from the source domain to the target domain. Second, an L1-Graph is constructed on the basis of sparse representation, so as to search for nearest-neighbour data adaptively and preserve the samples and their geometric structure. Lastly, the two complementary objective functions are integrated into a unified optimization problem and handled with an iterative algorithm, after which the classification of the testing samples is estimated. Comparative experiments were conducted on the USPS-Binary digital database, the Three-Domain Object Benchmark database, and the ALOI database; the experimental results verify the effectiveness of the proposed algorithm, which improves recognition accuracy and ensures the robustness of the algorithm.

  20. Sample based 3D face reconstruction from a single frontal image by adaptive locally linear embedding

    ZHANG Jian; ZHUANG Yue-ting

    2007-01-01

    In this paper, we propose a highly automatic approach for 3D photorealistic face reconstruction from a single frontal image. The key point of our work is the implementation of an adaptive manifold learning approach. Beforehand, an active appearance model (AAM) is trained for automatic feature extraction, and an adaptive locally linear embedding (ALLE) algorithm is utilized to reduce the dimensionality of the 3D database. Then, given an input frontal face image, the corresponding weights between 3D samples and the image are synthesized adaptively according to the AAM-selected facial features. Finally, geometry reconstruction is achieved by linear weighted combination of adaptively selected samples. A radial basis function (RBF) is adopted to map facial texture from the frontal image to the reconstructed face geometry. The texture of invisible regions between the face and the ears is interpolated by sampling from the frontal image. This approach has several advantages: (1) Only a single frontal face image is needed for highly automatic face reconstruction; (2) Compared with former works, our reconstruction approach provides higher accuracy; (3) Constraint-based RBF texture mapping provides natural appearance for the reconstructed face.

  1. Adaptation to climate change and climate variability: The importance of understanding agriculture as performance

    Crane, T.A.; Roncoli, C.; Hoogenboom, G.

    2011-01-01

    Most climate change studies that address potential impacts and potential adaptation strategies are largely based on modelling technologies. While models are useful for visualizing potential future outcomes and evaluating options for potential adaptation, they do not adequately represent and integrat

  2. FloodNet: Coupling Adaptive Sampling with Energy Aware Routing in a Flood Warning System

    Jing Zhou; David De Roure

    2007-01-01

    We describe the design of FloodNet, a flood warning system, which uses a grid-based flood predictor model developed by environmental experts to make flood predictions based on readings of water level collected by a set of sensor nodes. To optimize battery consumption, the reporting frequency of sensor nodes is required to be adaptive to local conditions as well as the flood predictor model. We therefore propose an energy aware routing protocol which allows sensor nodes to consume energy according to this need. This system is notable both for the adaptive sampling regime and the methodology adopted in the design of the adaptive behavior, which involved development of simulation tools and very close collaboration with environmental experts.

  3. Assessing employability capacities and career adaptability in a sample of human resource professionals

    Melinde Coetzee

    2015-03-01

    Full Text Available Orientation: Employers have come to recognise graduates’ employability capacities and their ability to adapt to new work demands as important human capital resources for sustaining a competitive business advantage. Research purpose: The study sought (1) to ascertain whether a significant relationship exists between a set of graduate employability capacities and a set of career adaptability capacities and (2) to identify the variables that contributed the most to this relationship. Motivation for the study: Global competitive markets and technological advances are increasingly driving the demand for graduate knowledge and skills in a wide variety of jobs. Contemporary career theory further emphasises career adaptability across the lifespan as a critical skill for career management agency. Despite the apparent importance attached to employees’ employability and career adaptability, there seems to be a general lack of research investigating the association between these constructs. Research approach, design and method: A cross-sectional, quantitative research design approach was followed. Descriptive statistics, Pearson product-moment correlations and canonical correlation analysis were performed to achieve the objective of the study. The participants (N = 196) were employed in professional positions in the human resource field and were predominantly early career black people and women. Main findings: The results indicated positive multivariate relationships between the variables and showed that lifelong learning capacities and problem solving, decision-making and interactive skills contributed the most to explaining the participants’ career confidence, career curiosity and career control. Practical/managerial implications: The study suggests that developing professional graduates’ employability capacities may strengthen their career adaptability. These capacities were shown to explain graduates’ active engagement in career management strategies.

  4. An importance sampling algorithm for generating exact eigenstates of the nuclear Hamiltonian

    Andreozzi, F; Iudice, N. Lo; Porrino, A.

    2003-01-01

    We endow a recently devised algorithm for generating exact eigensolutions of large matrices with an importance sampling, which controls the extent and accuracy of the truncation of their dimensions. We made several tests on typical nuclei using a correlated basis obtained from partitioning the shell model space. The sampling so implemented allows not only for a substantial reduction of the shell model space but also for an extrapolation to exact eigenvalues and E2 strengths.

  5. An importance sampling algorithm for generating exact eigenstates of the nuclear Hamiltonian

    Andreozzi, F [Dipartimento di Scienze Fisiche, Universita di Napoli Federico II, Naples (Italy); Iudice, N Lo [Dipartimento di Scienze Fisiche, Universita di Napoli Federico II, Naples (Italy); Porrino, A [Dipartimento di Scienze Fisiche, Universita di Napoli Federico II, Naples (Italy)

    2003-10-01

    We endow a recently devised algorithm for generating exact eigensolutions of large matrices with an importance sampling, which controls the extent and accuracy of the truncation of their dimensions. We performed several tests on typical nuclei using a correlated basis obtained from partitioning the shell model space. The sampling so implemented allows not only for a substantial reduction of the shell model space but also for an extrapolation to exact eigenvalues and E2 strengths.

  6. Using continuous in-situ measurements to adaptively trigger urban storm water samples

    Wong, B. P.; Kerkez, B.

    2015-12-01

    Until cost-effective in-situ sensors are available for biological parameters, nutrients and metals, automated samplers will continue to be the primary source of reliable water quality measurements. Given limited sample bottles, however, autosamplers often obscure insights on nutrient sources and biogeochemical processes which would otherwise be captured using a continuous sampling approach. To that end, we evaluate the efficacy of a novel method to measure first-flush nutrient dynamics in flashy, urban watersheds. Our approach reduces the number of samples required to capture water quality dynamics by leveraging an internet-connected sensor node, which is equipped with a suite of continuous in-situ sensors and an automated sampler. To capture both the initial baseflow as well as storm concentrations, a cloud-hosted adaptive algorithm analyzes the high-resolution sensor data along with local weather forecasts to optimize a sampling schedule. The method was tested in a highly developed urban catchment in Ann Arbor, Michigan and collected samples of nitrate, phosphorus, and suspended solids throughout several storm events. Results indicate that the watershed does not exhibit first-flush dynamics, a behavior that would have been obscured when using a non-adaptive sampling approach.

  7. Low Bit-Rate Image Compression using Adaptive Down-Sampling technique

    V.Swathi

    2011-09-01

    Full Text Available In this paper, we are going to use a practical approach of uniform down sampling in image space and yet making the sampling adaptive by spatially varying, directional low-pass pre-filtering. The resulting down-sampled pre-filtered image remains a conventional square sample grid, and, thus, it can be compressed and transmitted without any change to current image coding standards and systems. The decoder first decompresses the low-resolution image and then up-converts it to the original resolution in a constrained least squares restoration process, using a 2-D piecewise autoregressive model and the knowledge of directional low-pass pre-filtering. The proposed compression approach of collaborative adaptive down-sampling and up-conversion (CADU) outperforms JPEG 2000 in PSNR measure at low to medium bit rates and achieves superior visual quality as well. The superior low bit-rate performance of the CADU approach seems to suggest that over-sampling not only wastes hardware resources and energy but could also be counterproductive to image quality given a tight bit budget.

  8. Enhanced modeling via network theory: Adaptive sampling of Markov state models

    Bowman, Gregory R; Ensign, Daniel L.; Pande, Vijay S.

    2010-01-01

    Computer simulations can complement experiments by providing insight into molecular kinetics with atomic resolution. Unfortunately, even the most powerful supercomputers can only simulate small systems for short timescales, leaving modeling of most biologically relevant systems and timescales intractable. In this work, however, we show that molecular simulations driven by adaptive sampling of networks called Markov State Models (MSMs) can yield tremendous time and resource savings, allowing p...

  9. A double-loop adaptive sampling approach for sensitivity-free dynamic reliability analysis

    Dynamic reliability measures reliability of an engineered system considering time-variant operation conditions and component deterioration. Due to high computational costs, conducting dynamic reliability analysis at an early system design stage remains challenging. This paper presents a confidence-based meta-modeling approach, referred to as double-loop adaptive sampling (DLAS), for efficient sensitivity-free dynamic reliability analysis. The DLAS builds a Gaussian process (GP) model sequentially to approximate extreme system responses over time, so that Monte Carlo simulation (MCS) can be employed directly to estimate dynamic reliability. A generic confidence measure is developed to evaluate the accuracy of dynamic reliability estimation while using the MCS approach based on developed GP models. A double-loop adaptive sampling scheme is developed to efficiently update the GP model in a sequential manner, by considering system input variables and time concurrently in two sampling loops. The model updating process using the developed sampling scheme can be terminated once the user-defined confidence target is satisfied. The developed DLAS approach eliminates the computationally expensive sensitivity analysis process, and thus substantially improves the efficiency of dynamic reliability analysis. Three case studies are used to demonstrate the efficacy of DLAS for dynamic reliability analysis. - Highlights: • Developed a novel adaptive sampling approach for dynamic reliability analysis. • Developed a new metric to quantify the accuracy of dynamic reliability estimation. • Developed a new sequential sampling scheme to efficiently update surrogate models. • Three case studies were used to demonstrate the efficacy of the new approach. • Case study results showed substantially enhanced efficiency with high accuracy.
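
    Neither DLAS's confidence measure nor its two concurrent sampling loops can be reconstructed from the abstract; the Python sketch below shows only the single-loop pattern the method builds on: refine a GP surrogate where the failure/safe classification is least certain, then run MCS on the surrogate. The limit-state function and the U-type refinement rule are common stand-ins, not the paper's own.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF

        rng = np.random.default_rng(3)

        def g(x):                                   # illustrative limit state: fail if g < 0
            return 2.5 - x[:, 0] * np.sin(3 * x[:, 1])

        X = rng.uniform(-2, 2, (8, 2))              # small initial design
        y = g(X)
        cand = rng.uniform(-2, 2, (5000, 2))        # fixed MCS population

        for _ in range(25):                         # sequential surrogate refinement
            gp = GaussianProcessRegressor(kernel=RBF(1.0), alpha=1e-8).fit(X, y)
            mu, sd = gp.predict(cand, return_std=True)
            u = np.abs(mu) / np.maximum(sd, 1e-12)  # low u = uncertain classification
            X = np.vstack([X, cand[np.argmin(u)]])
            y = np.append(y, g(cand[np.argmin(u)][None]))

        gp = GaussianProcessRegressor(kernel=RBF(1.0), alpha=1e-8).fit(X, y)
        pf = (gp.predict(cand) < 0).mean()          # MCS on the refined surrogate
        print(f"estimated failure probability: {pf:.4f}")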

  10. Adaptation and Validation of the Sexual Assertiveness Scale (SAS) in a Sample of Male Drug Users.

    Vallejo-Medina, Pablo; Sierra, Juan Carlos

    2015-01-01

    The aim of the present study was to adapt and validate the Sexual Assertiveness Scale (SAS) in a sample of male drug users. A sample of 326 male drug users and 322 non-clinical males was selected by cluster sampling and convenience sampling, respectively. Results showed that the scale had good psychometric properties and adequate internal consistency reliability (Initiation = .66, Refusal = .74 and STD-P = .79). An evaluation of the invariance showed strong factor equivalence between both samples. A high and a moderate effect of Differential Item Functioning were found only in items 1 and 14 (ΔR² Nagelkerke = .076 and .037, respectively). We strongly recommend not using item 1 if the goal is to compare the scores of both groups, otherwise the comparison will be biased. Correlations obtained between the CSFQ-14 and the safe sex ratio and the SAS subscales were significant (CI = 95%) and indicated good concurrent validity. Scores of male drug users were similar to those of non-clinical males. Therefore, the adaptation of the SAS to drug users provides enough guarantees for reliable and valid use in both clinical practice and research, although care should be taken with item 1. PMID:25896498

  11. An Importance Sampling Algorithm for Diagonalizing the Nuclear Shell-Model Hamiltonian

    We have developed an iterative algorithm for generating exact eigensolutions of large matrices and endowed it with an importance sampling which allows for a reduction of the sizes of the matrices while keeping full control of the accuracy of the eigensolutions. We illustrate the potential of the method through its application to the nuclear shell-model eigenproblem

  12. Decomposition and (importance) sampling techniques for multi-stage stochastic linear programs

    Infanger, G.

    1993-11-01

    The difficulty of solving large-scale multi-stage stochastic linear programs arises from the sheer number of scenarios associated with numerous stochastic parameters. The number of scenarios grows exponentially with the number of stages and problems get easily out of hand even for very moderate numbers of stochastic parameters per stage. Our method combines dual (Benders) decomposition with Monte Carlo sampling techniques. We employ importance sampling to efficiently obtain accurate estimates of both expected future costs and gradients and right-hand sides of cuts. The method enables us to solve practical large-scale problems with many stages and numerous stochastic parameters per stage. We discuss the theory of sharing and adjusting cuts between different scenarios in a stage. We derive probabilistic lower and upper bounds, where we use importance path sampling for the upper bound estimation. Initial numerical results turned out to be promising.

  13. Importance sampling for Lambda-coalescents in the infinitely many sites model

    Birkner, Matthias; Steinruecken, Matthias; 10.1016/j.tpb.2011.01.005

    2011-01-01

    We present and discuss new importance sampling schemes for the approximate computation of the sample probability of observed genetic types in the infinitely many sites model from population genetics. More specifically, we extend the 'classical framework', where genealogies are assumed to be governed by Kingman's coalescent, to the more general class of Lambda-coalescents and develop further Hobolth et al.'s (2008) idea of deriving importance sampling schemes based on 'compressed genetrees'. The resulting schemes extend earlier work by Griffiths and Tavaré (1994), Stephens and Donnelly (2000), Birkner and Blath (2008) and Hobolth et al. (2008). We conclude with a performance comparison of classical and new schemes for Beta- and Kingman coalescents.

  14. Adapting chain referral methods to sample new migrants: Possibilities and limitations

    Lucinda Platt

    2015-09-01

    Full Text Available Background: Demographic research on migration requires representative samples of migrant populations. Yet recent immigrants, who are particularly informative about current migrant flows, are difficult to capture even in specialist surveys. Respondent-driven sampling (RDS), a chain referral sampling and analysis technique, potentially offers the opportunity to achieve population-level inference of recently arrived migrant populations. Objective: We evaluate the attempt to use RDS to sample two groups of migrants, from Pakistan and Poland, who had arrived in the UK within the previous 18 months, and we present an alternative approach adapted to recent migrants. Methods: We discuss how connectedness, privacy, clustering, and motivation are expected to differ among recently arrived migrants, compared to typical applications of RDS. We develop a researcher-led chain referral approach, and compare success in recruitment and indicators of representativeness to standard RDS recruitment. Results: Our researcher-led approach led to higher rates of chain-referral, and enabled us to reach population members with smaller network sizes. The researcher-led approach resulted in similar recruiter-recruit transition probabilities to traditional RDS across many demographic and social characteristics. However, we did not succeed in building up long referral chains, largely due to the lack of connectedness of our target populations and some reluctance to refer. There were some differences between the two migrant groups, with less mobile and less hidden Pakistani men producing longer referral chains. Conclusions: Chain referral is difficult to implement for sampling newly arrived migrants. However, our researcher-led adaptation shows promise for less hidden and more stable recent immigrant populations. Contribution: The paper offers an evaluation of RDS for surveying recent immigrants and an adaptation that may be effective under certain conditions.

  15. Adaptive sampling strategy support for the unlined chromic acid pit, chemical waste landfill, Sandia National Laboratories, Albuquerque, New Mexico

    Johnson, R.L.

    1993-11-01

    Adaptive sampling programs offer substantial savings in time and money when assessing hazardous waste sites. Key to some of these savings is the ability to adapt a sampling program to the real-time data generated by an adaptive sampling program. This paper presents a two-prong approach to supporting adaptive sampling programs: a specialized object-oriented database/geographical information system (SitePlanner™) for data fusion, management, and display and combined Bayesian/geostatistical methods (PLUME) for contamination-extent estimation and sample location selection. This approach is applied in a retrospective study of a subsurface chromium plume at Sandia National Laboratories' chemical waste landfill. Retrospective analyses suggest the potential for characterization cost savings on the order of 60% through a reduction in the number of sampling programs, total number of soil boreholes, and number of samples analyzed from each borehole.

  16. Improved Algorithms and Coupled Neutron-Photon Transport for Auto-Importance Sampling Method

    Wang, Xin; Qiu, Rui; Li, Chun-Yan; Liang, Man-Chun; Zhang, Hui; Li, Jun-Li

    2016-01-01

    Auto-Importance Sampling (AIS) is a Monte Carlo variance reduction technique proposed by Tsinghua University for deep penetration problems, which can improve computational efficiency significantly without pre-calculation of the importance distribution. However, the AIS method had only been validated on several basic deep penetration problems with simple geometries and could not be used for coupled neutron-photon transport. This paper first presents the latest algorithm improvements for the AIS method, including particle transport, fictitious particle creation and adjustment, fictitious surface geometry, random number allocation, and calculation of the estimated relative error, which make the AIS method applicable to complicated deep penetration problems. Then, a coupled Neutron-Photon Auto-Importance Sampling (NP-AIS) method is proposed to apply the AIS method with the improved algorithms in coupled neutron-photon Monte Carlo transport. Finally, the NUREG/CR-6115 PWR benchmark model was calculated with the method of geometry splitti...

  17. Performance evaluation of an importance sampling technique in a Jackson network

    Mahdipour, Ebrahim; Rahmani, Amir Masoud; Setayeshi, Saeed

    2014-03-01

    Importance sampling is a technique that is commonly used to speed up Monte Carlo simulation of rare events. However, little is known regarding the design of efficient importance sampling algorithms in the context of queueing networks. The standard approach, which simulates the system using an a priori fixed change of measure suggested by large deviation analysis, has been shown to fail in even the simplest network settings. Estimating probabilities associated with rare events has been a topic of great importance in queueing theory, and in applied probability at large. In this article, we analyse the performance of an importance sampling estimator for a rare event probability in a Jackson network. The article applies strict deadlines to a two-node Jackson network with feedback whose arrival and service rates are modulated by an exogenous finite state Markov process. We have estimated the probability of network blocking for various sets of parameters, and also the probability of missing the deadline of customers for different loads and deadlines. We have finally shown that the probability of total population overflow may be affected by various deadline values, service rates and arrival rates.
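
    The modulated two-node network of the article is not reconstructed here; the single-queue ancestor of the same idea fits in a short Python sketch: to estimate the small probability that an M/M/1 queue climbs from level 1 to N before emptying, simulate under the classic change of measure that swaps arrival and service rates, and correct each path with its likelihood ratio.

        import numpy as np

        rng = np.random.default_rng(11)
        lam, mu, N = 0.3, 1.0, 20       # stable queue, so overflow to N is rare
        p = lam / (lam + mu)            # up-step probability of the embedded walk
        q = mu / (lam + mu)             # IS measure: arrival/service rates swapped

        def path_weight():
            """Simulate one embedded path from level 1 under the swapped measure;
            return its likelihood ratio if it hits N before 0, else 0."""
            level, w = 1, 1.0
            while 0 < level < N:
                if rng.random() < q:
                    level, w = level + 1, w * p / q
                else:
                    level, w = level - 1, w * (1 - p) / (1 - q)
            return w if level == N else 0.0

        est = np.mean([path_weight() for _ in range(20_000)])
        exact = (1 - mu / lam) / (1 - (mu / lam) ** N)   # gambler's-ruin formula
        print(f"IS estimate = {est:.3e}, exact = {exact:.3e}")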

  18. The Portuguese adaptation of the Gudjonsson Suggestibility Scale (GSS1) in a sample of inmates.

    Pires, Rute; Silva, Danilo R; Ferreira, Ana Sousa

    2014-01-01

    This paper comprises two studies which address the validity of the Portuguese adaptation of the Gudjonsson Suggestibility Scale, GSS1. In study 1, the means and standard deviations for the suggestibility results of a sample of Portuguese inmates (N=40, Mage=37.5 years, SD=8.1) were compared to those of a sample of Icelandic inmates (Gudjonsson, 1997; Gudjonsson & Sigurdsson, 1996). Portuguese inmates' results were in line with the original results. In study 2, the means and standard deviations for the suggestibility results of the sample of Portuguese inmates were compared to those of a general Portuguese population sample (N=57, Mage=36.1 years, SD=12.7). The forensic sample obtained significantly higher scores in suggestibility measures than the general population sample. ANOVA confirmed that the increased suggestibility in the inmates sample was due to the limited memory capacity of this latter group. Given that the results of both studies 1 and 2 are in keeping with the author's original results (Gudjonsson, 1997), this may be regarded as a confirmation of the validity of the Portuguese GSS1. PMID:24289862

  19. Use of a Quantum Computer to do Importance and Metropolis-Hastings Sampling of a Classical Bayesian Network

    Tucci, Robert R.

    2008-01-01

    Importance sampling and Metropolis-Hastings sampling (of which Gibbs sampling is a special case) are two methods commonly used to sample multi-variate probability distributions (that is, Bayesian networks). Heretofore, the sampling of Bayesian networks has been done on a conventional "classical computer". In this paper, we propose methods for doing importance sampling and Metropolis-Hastings sampling of a classical Bayesian network on a quantum computer.

  20. Adaptation to lactose in lactose malabsorbers - importance of the intestinal microflora

    Fondén, Rangne

    2001-01-01

    At high intakes most lactose in lactose malabsorbers will be fermented by the intestinal microflora to hydrogen and other fermentation products, as are all other low-molecular, non-absorbable, fermentable carbohydrates. By adaptation higher intakes of lactose could be tolerated partly due to a lower net formation of hydrogen. This shift in fermentation is at least partly caused by a change in the activities of the intestinal microflora. Keywords: Adaptation, hydrogen production, lactose malab...

  1. Improved importance sampling technique for efficient simulation of digital communication systems

    Lu, Dingqing; Yao, Kung

    1988-01-01

    A new, improved importance sampling (IIS) approach to simulation is considered. Some basic concepts of IS are introduced, and detailed evaluations of simulation estimation variances for Monte Carlo (MC) and IS simulations are given. The general results obtained from these evaluations are applied to the specific previously known conventional importance sampling (CIS) technique and the new IIS technique. The derivation for a linear system with no signal random memory is considered in some detail. For the CIS technique, the optimum input scaling parameter is found, while for the IIS technique, the optimum translation parameter is found. The results are generalized to a linear system with memory and signals. Specific numerical and simulation results are given which show the advantages of CIS over MC and IIS over CIS for simulations of digital communication systems.

  2. An Importance Sampling Scheme on Dual Factor Graphs. I. Models in a Strong External Field

    Molkaraie, Mehdi

    2014-01-01

    We propose an importance sampling scheme to estimate the partition function of the two-dimensional ferromagnetic Ising model and the two-dimensional ferromagnetic q-state Potts model, both in the presence of an external magnetic field. The proposed scheme operates in the dual Forney factor graph and is capable of efficiently computing an estimate of the partition function under a wide range of model parameters. In particular, we consider models that are in a strong external magnetic field.

  3. Specific determination of clinically and toxicologically important substances in biological samples by LC-MS

    The subject of this dissertation is the specific determination of clinically and toxicologically important substances in biological samples by LC-MS. Nicotine was determined in serum after application of nicotine patches and nicotine nasal spray with HPLC-ESI-MS. Cotinine was determined directly in urine with HPLC-ESI-MS. Short-acting anesthetics were determined in blood, and cytostatics were determined in cerebrospinal fluid with HPLC-ESI-MS. (botek)

  4. Performance evaluation of Bayesian decision feedback equalizer with M-PAM symbols using importance sampling simulation

    Chen, S.

    2002-01-01

    An importance sampling (IS) simulation method is presented for evaluating the lower-bound symbol error rate (SER) of the Bayesian decision feedback equalizer (DFE) with M-PAM symbols, under the assumption of correct decision feedback. By exploiting an asymptotic property of the Bayesian DFE, a design procedure is developed, which chooses appropriate bias vectors for the simulation density to ensure asymptotic efficiency (AE) of the IS simulation.

  5. A laser microdissection-based workflow for FFPE tissue microproteomics: Important considerations for small sample processing.

    Longuespée, Rémi; Alberts, Deborah; Pottier, Charles; Smargiasso, Nicolas; Mazzucchelli, Gabriel; Baiwir, Dominique; Kriegsmann, Mark; Herfs, Michael; Kriegsmann, Jörg; Delvenne, Philippe; De Pauw, Edwin

    2016-07-15

    Proteomic methods are today widely applied to formalin-fixed paraffin-embedded (FFPE) tissue samples for several applications in research, especially in molecular pathology. To date, there is an unmet need for the analysis of small tissue samples, such as for early cancerous lesions. Indeed, no method has yet been proposed for the reproducible processing of small FFPE tissue samples to allow biomarker discovery. In this work, we tested several procedures to process laser microdissected tissue pieces bearing less than 3000 cells. Combined with appropriate settings for liquid chromatography mass spectrometry-mass spectrometry (LC-MS/MS) analysis, a citric acid antigen retrieval (CAAR)-based procedure was established, allowing more than 1400 proteins to be identified from a single microdissected breast cancer tissue biopsy. This work demonstrates important considerations concerning the handling and processing of laser microdissected tissue samples of extremely limited size, in the process opening new perspectives in molecular pathology. A proof of concept of the proposed method for biomarker discovery, with respect to these specific handling considerations, is illustrated using the differential proteomic analysis of invasive breast carcinoma of no special type and invasive lobular triple-negative breast cancer tissues. This work will be of utmost importance for early biomarker discovery or in support of matrix-assisted laser desorption/ionization (MALDI) imaging for microproteomics from small regions of interest. PMID:26690073

  6. Organ sample generator for expected treatment dose construction and adaptive inverse planning optimization

    Purpose: To create an organ sample generator (OSG) for expected treatment dose construction and adaptive inverse planning optimization. The OSG generates random samples of organs of interest from a distribution obeying the patient specific organ variation probability density function (PDF) during the course of adaptive radiotherapy. Methods: Principal component analysis (PCA) and a time-varying least-squares regression (LSR) method were used on patient specific geometric variations of organs of interest manifested on multiple daily volumetric images obtained during the treatment course. The construction of the OSG includes the determination of eigenvectors of the organ variation using PCA, and the determination of the corresponding coefficients using time-varying LSR. The coefficients can be either random variables or random functions of the elapsed treatment days depending on the characteristics of organ variation as a stationary or a nonstationary random process. The LSR method with time-varying weighting parameters was applied to the precollected daily volumetric images to determine the function form of the coefficients. Eleven head and neck cancer patients with 30 daily cone beam CT images each were included in the evaluation of the OSG. The evaluation was performed using a total of 18 organs of interest, including 15 organs at risk and 3 targets. Results: Geometric variations of organs of interest during head and neck cancer radiotherapy can be represented using the first 3-4 eigenvectors. These eigenvectors were variable during treatment, and need to be updated using new daily images obtained during the treatment course. The OSG generates random samples of organs of interest from the estimated organ variation PDF of the individual. The accuracy of the estimated PDF can be improved recursively using extra daily image feedback during the treatment course. The average deviations in the estimation of the mean and standard deviation of the organ variation PDF for head and neck
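
    The time-varying LSR half of the OSG cannot be reconstructed from the abstract; the PCA half can be sketched in Python for the stationary case: learn the dominant eigenvectors of day-to-day organ variation and draw random organ realizations from them. Synthetic vectors stand in for contours extracted from daily cone beam CT.

        import numpy as np

        rng = np.random.default_rng(5)

        # Stand-in data: 30 daily organ contours, each flattened to a vector
        # of 3D point coordinates.
        n_days, n_dof = 30, 3 * 50
        daily = rng.normal(size=(n_days, n_dof)) * rng.uniform(0.1, 2.0, n_dof)

        mean = daily.mean(axis=0)
        U, S, Vt = np.linalg.svd(daily - mean, full_matrices=False)
        k = 4                                  # 3-4 modes suffice per the paper
        modes, sigma = Vt[:k], S[:k] / np.sqrt(n_days - 1)

        def sample_organ(n=1):
            """Random organ realizations: mean shape plus random coefficients
            on the dominant eigenvectors (stationary-coefficient case only)."""
            return mean + (rng.normal(size=(n, k)) * sigma) @ modes

        organs = sample_organ(1000)            # input to expected-dose construction
        print(organs.shape)                    # (1000, 150)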

  7. Local adaptation of a bacterium is as important as its presence in structuring a natural microbial community.

    Gómez, Pedro; Paterson, Steve; De Meester, Luc; Liu, Xuan; Lenzi, Luca; Sharma, M D; McElroy, Kerensa; Buckling, Angus

    2016-01-01

    Local adaptation of a species can affect community composition, yet the importance of local adaptation compared with species presence per se is unknown. Here we determine how a compost bacterial community exposed to elevated temperature changes over 2 months as a result of the presence of a focal bacterium, Pseudomonas fluorescens SBW25, that had been pre-adapted or not to the compost for 48 days. The effect of local adaptation on community composition is as great as the effect of species presence per se, with these results robust to the presence of an additional strong selection pressure: an SBW25-specific virus. These findings suggest that evolution occurring over ecological time scales can be a key driver of the structure of natural microbial communities, particularly in situations where some species have an evolutionary head start following large perturbations, such as exposure to antibiotics or crop planting and harvesting. PMID:27501868

  8. Accelerating Markov chain Monte Carlo simulation by differential evolution with self-adaptive randomized subspace sampling

    Vrugt, Jasper A [Los Alamos National Laboratory; Hyman, James M [Los Alamos National Laboratory; Robinson, Bruce A [Los Alamos National Laboratory; Higdon, Dave [Los Alamos National Laboratory; Ter Braak, Cajo J F [NETHERLANDS; Diks, Cees G H [UNIV OF AMSTERDAM

    2008-01-01

    Markov chain Monte Carlo (MCMC) methods have found widespread use in many fields of study to estimate the average properties of complex systems, and for posterior inference in a Bayesian framework. Existing theory and experiments prove convergence of well constructed MCMC schemes to the appropriate limiting distribution under a variety of different conditions. In practice, however, this convergence is often observed to be disturbingly slow. This is frequently caused by an inappropriate selection of the proposal distribution used to generate trial moves in the Markov chain. Here we show that significant improvements to the efficiency of MCMC simulation can be made by using a self-adaptive Differential Evolution learning strategy within a population-based evolutionary framework. This scheme, entitled DiffeRential Evolution Adaptive Metropolis or DREAM, runs multiple different chains simultaneously for global exploration, and automatically tunes the scale and orientation of the proposal distribution in randomized subspaces during the search. Ergodicity of the algorithm is proved, and various examples involving nonlinearity, high-dimensionality, and multimodality show that DREAM is generally superior to other adaptive MCMC sampling approaches. The DREAM scheme significantly enhances the applicability of MCMC simulation to complex, multi-modal search problems.
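
    DREAM layers randomized subspace updates, multiple chain pairs and outlier handling on top of a basic differential-evolution proposal; that bare proposal alone is easy to sketch in Python on a bimodal toy target (all settings here are illustrative, not DREAM's defaults):

        import numpy as np

        rng = np.random.default_rng(1)

        def log_post(x):                       # bimodal toy "posterior"
            return np.logaddexp(-0.5 * np.sum((x - 3)**2), -0.5 * np.sum((x + 3)**2))

        n_chains, d, n_iter = 10, 2, 5000
        X = rng.normal(size=(n_chains, d))
        logp = np.array([log_post(x) for x in X])
        gamma = 2.38 / np.sqrt(2 * d)          # standard DE-MC jump scale

        chains = []
        for _ in range(n_iter):
            for i in range(n_chains):
                # Proposal along the difference of two other chains: the
                # population spread auto-tunes the scale and orientation of moves.
                a, b = rng.choice([j for j in range(n_chains) if j != i], 2, replace=False)
                prop = X[i] + gamma * (X[a] - X[b]) + rng.normal(0, 1e-6, d)
                lp = log_post(prop)
                if np.log(rng.random()) < lp - logp[i]:    # Metropolis accept
                    X[i], logp[i] = prop, lp
            chains.append(X.copy())

        draws = np.concatenate(chains[1000:])  # discard burn-in
        print(draws.mean(axis=0), draws.std(axis=0))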

  9. Geographic variation in the songs of neotropical singing mice: testing the relative importance of drift and local adaptation.

    Campbell, Polly; Pasch, Bret; Pino, Jorge L; Crino, Ondi L; Phillips, Molly; Phelps, Steven M

    2010-07-01

    Patterns of geographic variation in communication systems can provide insight into the processes that drive phenotypic evolution. Although work in birds, anurans, and insects demonstrates that acoustic signals are sensitive to diverse selective and stochastic forces, processes that shape variation in mammalian vocalizations are poorly understood. We quantified geographic variation in the advertisement songs of sister species of singing mice, montane rodents with a unique mode of vocal communication. We tested three hypotheses to explain spatial variation in the song of the lower altitude species, Scotinomys teguina: selection for species recognition in sympatry with congener, S. xerampelinus, acoustic adaptation to different environments, and stochastic divergence. Mice were sampled at seven sites in Costa Rica and Panamá; genetic distances were estimated from mitochondrial control region sequences, between-site differences in acoustic environment were estimated from climatic data. Acoustic, genetic and geographic distances were all highly correlated in S. teguina, suggesting that population differentiation in song is largely shaped by genetic drift. Contrasts between interspecific genetic-acoustic distances were significantly greater than expectations derived from intraspecific contrasts, indicating accelerated evolution of species-specific song. We propose that, although much intraspecific acoustic variation is effectively neutral, selection has been important in shaping species differences in song. PMID:20148958

  10. A Surrogate-based Adaptive Sampling Approach for History Matching and Uncertainty Quantification

    Li, Weixuan; Zhang, Dongxiao; Lin, Guang

    2015-02-25

    A critical procedure in reservoir simulations is history matching (or data assimilation in a broader sense), which calibrates model parameters such that the simulation results are consistent with field measurements, and hence improves the credibility of the predictions given by the simulations. Often there exist non-unique combinations of parameter values that all yield the simulation results matching the measurements. For such ill-posed history matching problems, Bayesian theorem provides a theoretical foundation to represent different solutions and to quantify the uncertainty with the posterior PDF. Lacking an analytical solution in most situations, the posterior PDF may be characterized with a sample of realizations, each representing a possible scenario. A novel sampling algorithm is presented here for the Bayesian solutions to history matching problems. We aim to deal with two commonly encountered issues: 1) as a result of the nonlinear input-output relationship in a reservoir model, the posterior distribution could be in a complex form, such as multimodal, which violates the Gaussian assumption required by most of the commonly used data assimilation approaches; 2) a typical sampling method requires intensive model evaluations and hence may cause unaffordable computational cost. In the developed algorithm, we use a Gaussian mixture model as the proposal distribution in the sampling process, which is simple but also flexible to approximate non-Gaussian distributions and is particularly efficient when the posterior is multimodal. Also, a Gaussian process is utilized as a surrogate model to speed up the sampling process. Furthermore, an iterative scheme of adaptive surrogate refinement and re-sampling ensures sampling accuracy while keeping the computational cost at a minimum level. The developed approach is demonstrated with an illustrative example and shows its capability in handling the above-mentioned issues. Multimodal posterior of the history matching

  11. Adaptive Kalman Filter Based on Adjustable Sampling Interval in Burst Detection for Water Distribution System

    Doo Yong Choi

    2016-04-01

    Full Text Available Rapid detection of bursts and leaks in water distribution systems (WDSs) can reduce the social and economic costs incurred through direct loss of water into the ground, additional energy demand for water supply, and service interruptions. Many real-time burst detection models have been developed in accordance with the use of supervisory control and data acquisition (SCADA) systems and the establishment of district meter areas (DMAs). Nonetheless, no consideration has been given to how frequently a flow meter measures and transmits data for predicting breaks and leaks in pipes. This paper analyzes the effect of sampling interval when an adaptive Kalman filter is used for detecting bursts in a WDS. A new sampling algorithm is presented that adjusts the sampling interval depending on the normalized residuals of flow after filtering. The proposed algorithm is applied to a virtual sinusoidal flow curve and real DMA flow data obtained from Jeongeup city in South Korea. The simulation results prove that the self-adjusting algorithm for determining the sampling interval is efficient and maintains reasonable accuracy in burst detection. The proposed sampling method has a significant potential for water utilities to build and operate real-time DMA monitoring systems combined with smart customer metering systems.
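
    The paper's exact adjustment rule is not given in the abstract; a scalar Python sketch conveys the mechanism: a random-walk Kalman filter tracks DMA inflow, the normalized innovation flags suspect bursts, and the sampling interval stretches while the signal is quiet and snaps back when an anomaly appears. All tuning values are invented.

        import numpy as np

        rng = np.random.default_rng(2)
        q_var, r_var = 0.05, 1.0            # process / measurement noise variances
        x, P = 50.0, 10.0                   # flow estimate and its variance
        dt, DT_MIN, DT_MAX = 5.0, 1.0, 15.0 # sampling-interval bounds (minutes)

        def measure(t):                     # synthetic inflow with a burst at t = 300
            return 50.0 + (8.0 if t > 300 else 0.0) + rng.normal(0, 1.0)

        t = 0.0
        while t < 600:
            t += dt
            z = measure(t)
            P_pred = P + q_var * dt         # predict: uncertainty grows with the gap
            S = P_pred + r_var
            resid = (z - x) / np.sqrt(S)    # normalized innovation
            K = P_pred / S                  # Kalman gain
            x, P = x + K * (z - x), (1 - K) * P_pred
            if abs(resid) > 3.0:
                dt = DT_MIN                 # suspected burst: sample as fast as allowed
                print(f"t = {t:5.1f}  alarm, normalized residual = {resid:.1f}")
            else:
                dt = min(dt * 1.5, DT_MAX)  # quiet: back off to save telemetry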

  12. Role and importance of X-ray fluorescence analysis of forensic samples

    Full text: In the field of forensic science, it is very important to investigate the evidential samples obtained at various crime scenes. X-ray fluorescence (XRF) is used widely in forensic science [1]. Its main strength is its non-destructive nature, thus preserving evidence [2, 3]. In this paper, we report the application of XRF to examine evidence such as the purity of gold and silver jewelry (Indian ornaments), remnants of glass pieces, and paint chips recovered from crime scenes. The experimental measurements on these samples were made using an X-ray fluorescence spectrometer (LAB Center XRF-1800) procured from Shimadzu Scientific Instruments, USA. The results are explained in terms of quantitative/qualitative analysis of trace elements. (author)

  13. Cold adaptation in geographical populations of Drosophila melanogaster: phenotypic plasticity is more important than genetic variability

    Ayrinhac, A; Debat, V; Gibert, P; Kister, AG; Legout, H; Moreteau, B; Vergilino, R; David, JR

    2004-01-01

    1. According to their geographical distribution, most Drosophila species may be classified as either temperate or tropical, and this pattern is assumed to reflect differences in their thermal adaptation, especially in their cold tolerance. We investigated cold tolerance in a global collection of D.

  14. An Unbiased Adaptive Sampling Algorithm for the Exploration of RNA Mutational Landscapes under Evolutionary Pressure

    Waldispühl, Jérôme; Ponty, Yann

    The analysis of the impact of mutations on the folding properties of RNAs is essential to decipher principles driving molecular evolution and to design new molecules. We recently introduced an algorithm called RNAmutants which samples RNA sequence-structure maps in polynomial time and space. However, since the mutation probabilities depend on the free energy of the structures, RNAmutants is biased toward G+C-rich regions of the mutational landscape. In this paper we introduce an unbiased adaptive sampling algorithm that enables RNAmutants to sample regions of the mutational landscape poorly covered by previous techniques. We applied the method to sample mutations in complete RNA sequence-structure maps of sizes up to 40 nucleotides. Our results indicate that the G+C-content has a strong influence on the evolutionarily accessible structural ensembles. In particular, we show that low G+C-contents favor the appearance of internal loops, while high G+C-contents reduce the size of the evolutionarily accessible mutational landscapes.

  15. Adaption of G-TAG Software for Validating Touch and Go Asteroid Sample Return Design Methodology

    Blackmore, Lars James C.; Acikmese, Behcet; Mandic, Milan

    2012-01-01

    A software tool is used to demonstrate the feasibility of Touch and Go (TAG) sampling for Asteroid Sample Return missions. TAG is a concept whereby a spacecraft is in contact with the surface of a small body, such as a comet or asteroid, for a few seconds or less before ascending to a safe location away from the small body. Previous work at JPL developed the G-TAG simulation tool, which provides a software environment for fast, multi-body simulations of the TAG event. G-TAG is described in Multibody Simulation Software Testbed for Small-Body Exploration and Sampling, (NPO-47196) NASA Tech Briefs, Vol. 35, No. 11 (November 2011), p.54. This current innovation adapts this tool to a mission that intends to return a sample from the surface of an asteroid. In order to demonstrate the feasibility of the TAG concept, the new software tool was used to generate extensive simulations that demonstrate the designed spacecraft meets key requirements. These requirements state that contact force and duration must be sufficient to ensure that enough material from the surface is collected in the brushwheel sampler (BWS), and that the spacecraft must survive the contact and must be able to recover and ascend to a safe position, and maintain velocity and orientation after the contact.

  16. Evaluation of endoscopically obtained duodenal biopsy samples from cats and dogs in an adapter-modified Ussing chamber

    Ruhnke, Isabelle; DeBiasio, John V.; Suchodolski, Jan S.; Newman, Shelley; Musch, Mark W.; Steiner, Jörg M.

    2014-01-01

    This study was conducted to evaluate an adapter-modified Ussing chamber for assessment of transport physiology in endoscopically obtained duodenal biopsies from healthy cats and dogs, as well as dogs with chronic enteropathies. 17 duodenal biopsies from five cats and 51 duodenal biopsies from 13 dogs were obtained. Samples were transferred into an adapter-modified Ussing chamber and sequentially exposed to various absorbagogues and secretagogues. Overall, 78.6% of duodenal samples obtained fr...

  17. Do women's voices provide cues of the likelihood of ovulation? The importance of sampling regime.

    Julia Fischer

    Full Text Available The human voice provides a rich source of information about individual attributes such as body size, developmental stability and emotional state. Moreover, there is evidence that female voice characteristics change across the menstrual cycle. A previous study reported that women speak with higher fundamental frequency (F0) in the high-fertility compared to the low-fertility phase. To gain further insights into the mechanisms underlying this variation in perceived attractiveness and the relationship between vocal quality and the timing of ovulation, we combined hormone measurements and acoustic analyses to characterize voice changes on a day-to-day basis throughout the menstrual cycle. Voice characteristics were measured from free speech as well as sustained vowels. In addition, we asked men to rate vocal attractiveness from selected samples. The free speech samples revealed marginally significant variation in F0, with an increase prior to and a distinct drop during ovulation. Overall variation throughout the cycle, however, precluded unequivocal identification of the period with the highest conception risk. The analysis of vowel samples revealed a significant increase in degree of unvoiceness and noise-to-harmonic ratio during menstruation, possibly related to an increase in tissue water content. Neither estrogen nor progestogen levels predicted the observed changes in acoustic characteristics. The perceptual experiments revealed a preference by males for voice samples recorded during the pre-ovulatory period compared to other periods in the cycle. While overall we confirm earlier findings in that women speak with a higher and more variable fundamental frequency just prior to ovulation, the present study highlights the importance of taking the full range of variation into account before drawing conclusions about the value of these cues for the detection of ovulation.

  18. Component-adaptive up-sampling for inter layer interpolation in scalable video coding

    WANG Zhang; ZHANG JiXian; LI HaiTao

    2009-01-01

    Scalable video coding (SVC) is a newly emerging standard to be finalized as an extension of H.264/AVC. The most attractive characteristics of SVC are the inter-layer prediction techniques, such as the Intra_BL mode. But in the current SVC scheme, a uniform up-sampling filter (UUSF) is employed to magnify all components of an image, which is very inefficient and results in a lot of redundant computational complexity. To overcome this, we propose an efficient component-adaptive up-sampling filter (CAUSF) for inter-layer interpolation. In CAUSF, one characteristic of the human visual system is considered, and different up-sampling filters are assigned to different components. In particular, the six-tap FIR filter used in UUSF is kept and assigned to the luminance component, but for the chrominance components a new four-tap FIR filter is used. Experimental results show that CAUSF maintains coded bit-rate and PSNR-Y performance without any noticeable loss, and provides significant reduction in computational complexity.

  19. Accelerating the convergence of replica exchange simulations using Gibbs sampling and adaptive temperature sets

    Vogel, Thomas

    2015-01-01

    We recently introduced a novel replica-exchange scheme in which an individual replica can sample from states encountered by other replicas at any previous time by way of a global configuration database, enabling the fast propagation of relevant states through the whole ensemble of replicas. This mechanism depends on the knowledge of global thermodynamic functions which are measured during the simulation and not coupled to the heat bath temperatures driving the individual simulations. Therefore, this setup also allows for a continuous adaptation of the temperature set. In this paper, we will review the new scheme and demonstrate its capability. The method is particularly useful for the fast and reliable estimation of the microcanonical temperature T(U) or, equivalently, of the density of states g(U) over a wide range of energies.

  20. PARALLEL ADAPTIVE MULTILEVEL SAMPLING ALGORITHMS FOR THE BAYESIAN ANALYSIS OF MATHEMATICAL MODELS

    Prudencio, Ernesto

    2012-01-01

    In recent years, Bayesian model updating techniques based on measured data have been applied to many engineering and applied science problems. At the same time, parallel computational platforms are becoming increasingly more powerful and are being used more frequently by the engineering and scientific communities. Bayesian techniques usually require the evaluation of multi-dimensional integrals related to the posterior probability density function (PDF) of uncertain model parameters. The fact that such integrals cannot be computed analytically motivates the research of stochastic simulation methods for sampling posterior PDFs. One such algorithm is the adaptive multilevel stochastic simulation algorithm (AMSSA). In this paper we discuss the parallelization of AMSSA, formulating the necessary load balancing step as a binary integer programming problem. We present a variety of results showing the effectiveness of load balancing on the overall performance of AMSSA in a parallel computational environment.
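
    The load-balancing step is posed in the paper as a binary integer program; as a rough stand-in for what is being balanced (not the authors' formulation), a greedy longest-processing-time heuristic assigns chain workloads to processors:

    ```python
    import heapq

    def greedy_balance(task_costs, n_procs):
        """LPT heuristic: give each task (e.g. a chain's expected work) to the
        currently least-loaded processor; a stand-in for the exact
        binary-integer-programming formulation referenced in the abstract."""
        heap = [(0.0, p) for p in range(n_procs)]      # (load, processor id)
        heapq.heapify(heap)
        assignment = {}
        for tid, cost in sorted(enumerate(task_costs), key=lambda t: -t[1]):
            load, p = heapq.heappop(heap)
            assignment[tid] = p
            heapq.heappush(heap, (load + cost, p))
        return assignment

    print(greedy_balance([5.0, 3.0, 2.0, 2.0, 1.0], 2))  # {0: 0, 1: 1, 2: 1, 3: 0, 4: 1}
    ```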

  1. An Energy Aware Adaptive Sampling Algorithm for Energy Harvesting WSN with Energy Hungry Sensors.

    Srbinovski, Bruno; Magno, Michele; Edwards-Murphy, Fiona; Pakrashi, Vikram; Popovici, Emanuel

    2016-01-01

    Wireless sensor nodes have a limited power budget, though they are often expected to be functional in the field once deployed for extended periods of time. Therefore, minimization of energy consumption and energy harvesting technology in Wireless Sensor Networks (WSN) are key tools for maximizing network lifetime, and achieving self-sustainability. This paper proposes an energy aware Adaptive Sampling Algorithm (ASA) for WSN with power hungry sensors and harvesting capabilities, an energy management technique that can be implemented on any WSN platform with enough processing power to execute the proposed algorithm. An existing state-of-the-art ASA developed for wireless sensor networks with power hungry sensors is optimized and enhanced to adapt the sampling frequency according to the available energy of the node. The proposed algorithm is evaluated using two in-field testbeds that are supplied by two different energy harvesting sources (solar and wind). Simulation and comparison between the state-of-the-art ASA and the proposed energy aware ASA (EASA) in terms of energy durability are carried out using in-field measured harvested energy (using both wind and solar sources) and power hungry sensors (ultrasonic wind sensor and gas sensors). The simulation results demonstrate that using ASA in combination with an energy aware function on the nodes can drastically increase the lifetime of a WSN node and enable self-sustainability. In fact, the proposed EASA in conjunction with energy harvesting capability can lead towards perpetual WSN operation and significantly outperform the state-of-the-art ASA. PMID:27043559
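
    As a hedged illustration of the general idea, not the published EASA rules, an energy-aware policy that stretches the sampling period as stored energy drops might look as follows; the bounds, the linear rule and the unit handling are all assumptions.

    ```python
    # Toy energy-aware adaptive sampling policy for a harvesting sensor node.
    T_MIN, T_MAX = 30.0, 600.0            # sampling period bounds, seconds (assumed)

    def next_period(soc, harvest_w, load_w):
        """soc: battery state of charge in [0, 1];
        harvest_w, load_w: measured harvested and consumed power (W)."""
        margin = max(0.0, min(1.0, soc + (harvest_w - load_w)))
        # full energy -> sample fast (T_MIN); empty and draining -> slow (T_MAX)
        return T_MAX - (T_MAX - T_MIN) * margin

    print(next_period(0.9, 0.2, 0.1))     # plenty of energy: 30.0 (fastest)
    print(next_period(0.2, 0.0, 0.1))     # depleted and draining: 543.0
    ```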

  3. Estimation variance bounds of importance sampling simulations in digital communication systems

    Lu, D.; Yao, K.

    1991-01-01

    In practical applications of importance sampling (IS) simulation, two basic problems are encountered, that of determining the estimation variance and that of evaluating the proper IS parameters needed in the simulations. The authors derive new upper and lower bounds on the estimation variance which are applicable to IS techniques. The upper bound is simple to evaluate and may be minimized by the proper selection of the IS parameter. Thus, lower and upper bounds on the improvement ratio of various IS techniques relative to the direct Monte Carlo simulation are also available. These bounds are shown to be useful and computationally simple to obtain. Based on the proposed technique, one can readily find practical suboptimum IS parameters. Numerical results indicate that these bounding techniques are useful for IS simulations of linear and nonlinear communication systems with intersymbol interference in which bit error rate and IS estimation variances cannot be obtained readily using prior techniques.
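
    To make the quantities concrete: for a Gaussian tail probability, the IS estimator, its estimated variance, and the improvement over crude Monte Carlo can be computed as below; the shifted-mean proposal is a textbook choice, not the parameterization whose bounds are derived in this paper.

    ```python
    import numpy as np
    rng = np.random.default_rng(0)

    # Estimate p = P(X > t) for X ~ N(0,1), a BER-like rare-event probability.
    t, n = 4.0, 100_000

    # Crude Monte Carlo: mean of the indicator.
    x = rng.normal(size=n)
    mc = (x > t).astype(float)

    # Importance sampling: draw from N(t, 1) and reweight by the likelihood
    # ratio w(y) = phi(y) / phi(y - t) = exp(t*t/2 - t*y).
    y = rng.normal(loc=t, size=n)
    is_ = (y > t) * np.exp(t * t / 2 - t * y)

    for name, s in (("crude MC", mc), ("importance", is_)):
        print(f"{name}: p_hat={s.mean():.3e}, est.var={s.var(ddof=1)/n:.3e}")
    # The ratio of the two estimator variances is the improvement ratio that
    # the bounds in this abstract aim to predict without running simulations.
    ```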

  4. Simulative Investigation on Spectral Efficiency of Unipolar Codes based OCDMA System using Importance Sampling Technique

    Farhat, A.; Menif, M.; Rezig, H.

    2013-09-01

    This paper analyses the spectral efficiency of an Optical Code Division Multiple Access (OCDMA) system using the Importance Sampling (IS) technique. We consider three configurations of the OCDMA system, namely Direct Sequence (DS), Spectral Amplitude Coding (SAC) and Fast Frequency Hopping (FFH), which exploit Fiber Bragg Grating (FBG) based encoders/decoders. We evaluate the spectral efficiency of the considered system by taking into consideration the effect of different families of unipolar codes for both coherent and incoherent sources. The results show that the spectral efficiency of the OCDMA system with a coherent source is higher than in the incoherent case. We also demonstrate that DS-OCDMA outperforms the other two configurations in terms of spectral efficiency under all conditions.

  5. Towards an Effective Importance Sampling in Monte Carlo Simulations of a System with a Complex Action

    Anagnostopoulos, K.; Azuma, T.; Nishimura, J.

    The sign problem is a notorious problem which occurs in Monte Carlo simulations of a system with a partition function whose integrand is not positive. One way to simulate such a system is to use the factorization method, where one enforces sampling in the part of the configuration space which gives an important contribution to the partition function. This is accomplished by using constraints on some appropriately chosen observables and minimizing the free energy associated with their joint distribution functions. These observables are maximally correlated with the complex phase. Observables not in this set essentially decouple from the phase and can be calculated without the sign problem in the corresponding "microcanonical" ensemble. These ideas are applied to a simple matrix model with a very strong sign problem, and the results are found to be consistent with analytic calculations using the Gaussian Expansion Method.
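
    The starting point of such methods is the reweighting identity, in which full-weight expectations become ratios of phase-quenched averages. A one-dimensional toy sketch (not the factorization method itself, which constrains and factorizes selected observables) shows both the identity and why it degrades: the average phase is exponentially small.

    ```python
    import numpy as np
    rng = np.random.default_rng(0)

    # Toy weight exp(-x*x/2 + i*a*x): sample the phase-quenched part exp(-x*x/2)
    # and reweight:  <O>_full = <O e^{iax}>_pq / <e^{iax}>_pq.
    a, n = 2.0, 1_000_000
    x = rng.normal(size=n)                     # phase-quenched draws
    phase = np.exp(1j * a * x)

    avg_phase = phase.mean()                   # ~ exp(-a*a/2): the sign problem
    x2 = (x * x * phase).mean() / avg_phase    # reweighted estimate of <x^2>
    print(abs(avg_phase), x2.real)             # exact: exp(-2) ~ 0.135 and 1 - a*a = -3
    ```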

  6. The importance of including variability in climate change projections used for adaptation

    Sexton, David M. H.; Harris, Glen R.

    2015-10-01

    Our understanding of mankind’s influence on the climate is largely based on computer simulations. Model output is typically averaged over several decades so that the anthropogenic climate change signal stands out from the largely unpredictable 'noise' of climate variability. Similar averaging periods (30-year) are used for regional climate projections to inform adaptation. According to two such projections, UKCIP02 (ref. ) and UKCP09 (ref. ), the UK will experience 'hotter drier summers and warmer wetter winters' in the future. This message is about a typical rather than any individual future season, and these projections should not be compared directly to observed weather as this neglects the sizeable contribution from year-to-year climate variability. Therefore, despite the apparent contradiction with the messages, it is a fallacy to suggest the recent cold UK winters like 2009/2010 disprove human-made climate change. Nevertheless, such claims understandably cause public confusion and doubt. Here we include year-to-year variability to provide projections for individual seasons. This approach has two advantages. First, it allows fair comparisons with recent weather events, for instance showing that recent cold winters are within projected ranges. Second, it allows the projections to be expressed in terms of the extreme hot, cold, wet or dry seasons that impact society, providing a better idea of adaptation needs.

  7. Importance Sampling Variance Reduction for the Fokker-Planck Rarefied Gas Particle Method

    Collyer, Benjamin; Lockerby, Duncan

    2015-01-01

    Models and methods that are able to accurately and efficiently predict the flows of low-speed rarefied gases are in high demand, due to the increasing ability to manufacture devices at micro and nano scales. One such model and method is a Fokker-Planck approximation to the Boltzmann equation, which can be solved numerically by a stochastic particle method. The stochastic nature of this method leads to noisy estimates of the thermodynamic quantities one wishes to sample when the signal is small in comparison to the thermal velocity of the gas. Recently, Gorji et al. have proposed a method which is able to greatly reduce the variance of the estimators, by creating a correlated stochastic process which acts as a control variate for the noisy estimates. However, there are potential difficulties involved when the geometry of the problem is complex, as the method requires the density to be solved for independently. Importance sampling is a variance reduction technique that has already been shown to successfully redu...

  8. Importance sampling implemented in the code PRIZMA for deep penetration and detection problems in reactor physics

    At RFNC-VNIITF, the PRIZMA code, which has been under development for more than 30 years, is used to model radiation transport by the Monte Carlo method. The code implements individual and coupled tracking of neutrons, photons, electrons, positrons and ions in one-dimensional (1D), 2D or 3D geometry. Attendance estimators are used for tallying, i.e., estimators whose scores are nonzero only for particles that cross a region or surface of interest. Importance sampling is used to make deep penetration and detection calculations more effective. However, its application to reactor analysis proved peculiar and required further development. The paper reviews the methods used for deep penetration and detection calculations by PRIZMA. It describes how these calculations differ when applied to reactor analysis and how we compute approximate importance functions and parameters for biased distributions. Methods to control the statistical weight of particles are also discussed. A number of test and applied calculations done for verification purposes are provided. They are shown to agree with asymptotic solutions where these exist, or with results of analog calculations or predictions by other codes. The applied calculations include the estimation of ex-core detector response from neutron sources arranged in the core, and the estimation of in-core detector response. (authors)
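
    The two weight-control moves mentioned here, splitting on entry into more important regions and Russian roulette on the way out, can be sketched generically; the thresholds and importance ratio below are illustrative, not PRIZMA's internals.

    ```python
    import random
    rng = random.Random(0)

    def split(particle, importance_ratio):
        """Entering a more important region: clone the particle n times and
        divide its statistical weight, keeping the estimate unbiased."""
        n = int(importance_ratio)
        return [dict(particle, w=particle["w"] / n) for _ in range(n)]

    def roulette(particle, w_min=0.1, w_survive=0.5):
        """Weight below threshold: survive with probability w / w_survive and
        continue at weight w_survive, so the expected weight is unchanged."""
        if particle["w"] >= w_min:
            return particle
        if rng.random() < particle["w"] / w_survive:
            return dict(particle, w=w_survive)
        return None                           # absorbed

    p = {"x": 0.0, "w": 1.0}
    print(split(p, 4))                        # four clones of weight 0.25
    print(roulette({"x": 0.0, "w": 0.02}))    # usually None, sometimes w = 0.5
    ```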

  9. Unified Importance Sampling Schemes for Efficient Simulation of Outage Capacity over Generalized Fading Channels

    Ben Rached, Nadhir

    2015-11-13

    The outage capacity (OC) is among the most important performance metrics of communication systems operating over fading channels. Of interest in the present paper is the evaluation of the OC at the output of Equal Gain Combining (EGC) and Maximum Ratio Combining (MRC) receivers. In this case, the problem turns out to be that of computing the Cumulative Distribution Function (CDF) of a sum of independent random variables. Since a closed-form expression for the CDF of the sum distribution is out of reach for a wide class of commonly used distributions, methods based on Monte Carlo (MC) simulations take pride of place. In order to allow for the estimation of the operating range of small outage probabilities, it is of paramount importance to develop fast and efficient estimation methods, as naive MC simulations would require high computational complexity. Along these lines, we propose in this work two unified, yet efficient, hazard rate twisting Importance Sampling (IS) based approaches that efficiently estimate the OC of MRC or EGC diversity techniques over generalized independent fading channels. The first estimator is shown to possess the asymptotic optimality criterion and applies to arbitrary fading models, whereas the second one achieves the well-desired bounded relative error property for the majority of the well-known fading variates. Moreover, the second estimator is shown to achieve the asymptotic optimality property under the particular Log-normal environment. Some selected simulation results are finally provided in order to illustrate the substantial computational gain achieved by the proposed IS schemes over naive MC simulations.
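
    Hazard-rate twisting itself is distribution-specific, but its simpler relative, exponential twisting, conveys the mechanics for the left tail of a sum of unit-mean exponential gains (a crude stand-in for a combined fading SNR); the tilt below is naive moment matching, not the paper's parameterization.

    ```python
    import numpy as np
    rng = np.random.default_rng(0)

    # Estimate P(S < gamma) with S = X_1 + ... + X_n, X_i ~ Exp(1) i.i.d.
    n, gamma, N = 8, 1.0, 200_000
    theta = 1.0 - n / gamma                 # theta < 0 so that E_theta[S] = gamma
    xs = rng.exponential(scale=1.0 / (1.0 - theta), size=(N, n))
    s = xs.sum(axis=1)
    w = np.exp(-theta * s) / (1.0 - theta) ** n    # likelihood ratio f / f_theta
    print(np.mean((s < gamma) * w))         # exact: P(Poisson(1) >= 8) ~ 1.02e-5
    ```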

  10. Quantitative assessment of the importance of phenotypic plasticity in adaptation to climate change in wild bird populations.

    Oscar Vedder

    2013-07-01

    Full Text Available Predictions about the fate of species or populations under climate change scenarios typically neglect adaptive evolution and phenotypic plasticity, the two major mechanisms by which organisms can adapt to changing local conditions. As a consequence, we have little understanding of the scope for organisms to track changing environments by in situ adaptation. Here, we use a detailed individual-specific long-term population study of great tits (Parus major) breeding in Wytham Woods, Oxford, UK to parameterise a mechanistic model and thus directly estimate the rate of environmental change to which in situ adaptation is possible. Using the effect of changes in early spring temperature on temporal synchrony between birds and a critical food resource, we focus in particular on the contribution of phenotypic plasticity to population persistence. Despite using conservative estimates for evolutionary and reproductive potential, our results suggest little risk of population extinction under projected local temperature change; however, this conclusion relies heavily on the extent to which phenotypic plasticity tracks the changing environment. Extrapolating the model to a broad range of life histories in birds suggests that the importance of phenotypic plasticity for adjustment to projected rates of temperature change increases with slower life histories, owing to lower evolutionary potential. Understanding the determinants and constraints on phenotypic plasticity in natural populations is thus crucial for characterising the risks that rapidly changing environments pose for the persistence of such populations.

  11. Quantitative assessment of the importance of phenotypic plasticity in adaptation to climate change in wild bird populations.

    Vedder, Oscar; Bouwhuis, Sandra; Sheldon, Ben C

    2013-07-01

    Predictions about the fate of species or populations under climate change scenarios typically neglect adaptive evolution and phenotypic plasticity, the two major mechanisms by which organisms can adapt to changing local conditions. As a consequence, we have little understanding of the scope for organisms to track changing environments by in situ adaptation. Here, we use a detailed individual-specific long-term population study of great tits (Parus major) breeding in Wytham Woods, Oxford, UK to parameterise a mechanistic model and thus directly estimate the rate of environmental change to which in situ adaptation is possible. Using the effect of changes in early spring temperature on temporal synchrony between birds and a critical food resource, we focus in particular on the contribution of phenotypic plasticity to population persistence. Despite using conservative estimates for evolutionary and reproductive potential, our results suggest little risk of population extinction under projected local temperature change; however, this conclusion relies heavily on the extent to which phenotypic plasticity tracks the changing environment. Extrapolating the model to a broad range of life histories in birds suggests that the importance of phenotypic plasticity for adjustment to projected rates of temperature change increases with slower life histories, owing to lower evolutionary potential. Understanding the determinants and constraints on phenotypic plasticity in natural populations is thus crucial for characterising the risks that rapidly changing environments pose for the persistence of such populations. PMID:23874152

  12. Cortisol Secretion and Functional Disabilities in Old Age: Importance of Using Adaptive Control Strategies

    Wrosch, Carsten; Miller, Gregory E.; Schulz, Richard

    2009-01-01

    Objectives To examine whether the use of health-related control strategies moderates the association between elevated diurnal cortisol secretion and increases in older adults’ functional disabilities. Methods Functional disabilities of 164 older adults were assessed over 4 years by measuring participants’ problems with performing activities of daily living. The main predictors included baseline levels of diurnal cortisol secretion and control strategies used to manage physical health threats. Results A large increase in functional disabilities was observed among participants who secreted elevated baseline levels of cortisol and did not use health-related control strategies. By contrast, high cortisol level was not associated with increases in functional disabilities among participants who reported using these control strategies. Among participants with low cortisol level, there was a relatively smaller increase in functional disabilities over time, and the use of control strategies was not significantly associated with changes in functional disabilities. Conclusions The findings suggest that high cortisol level is associated with an increase in older adults’ functional disabilities, but only if older adults do not engage in adaptive control strategies. PMID:19875635

  13. Adapt

    Bargatze, L. F.

    2015-12-01

    Active Data Archive Product Tracking (ADAPT) is a collection of software routines that permits one to generate XML metadata files to describe and register data products in support of the NASA Heliophysics Virtual Observatory VxO effort. ADAPT is also a philosophy. The ADAPT concept is to use any and all available metadata associated with scientific data to produce XML metadata descriptions in a consistent, uniform, and organized fashion to provide blanket access to the full complement of data stored on a targeted data server. In this poster, we present an application of ADAPT to describe all of the data products that are stored by using the Common Data File (CDF) format served out by the CDAWEB and SPDF data servers hosted at the NASA Goddard Space Flight Center. These data servers are the primary repositories for NASA Heliophysics data. For this purpose, the ADAPT routines have been used to generate data resource descriptions by using an XML schema named Space Physics Archive, Search, and Extract (SPASE). SPASE is the designated standard for documenting Heliophysics data products, as adopted by the Heliophysics Data and Model Consortium. The set of SPASE XML resource descriptions produced by ADAPT includes high-level descriptions of numerical data products, display data products, or catalogs and also includes low-level "Granule" descriptions. A SPASE Granule is effectively a universal access metadata resource; a Granule associates an individual data file (e.g. a CDF file) with a "parent" high-level data resource description, assigns a resource identifier to the file, and lists the corresponding access URL(s). The CDAWEB and SPDF file systems were queried to provide the input required by the ADAPT software to create an initial set of SPASE metadata resource descriptions. Then, the CDAWEB and SPDF data repositories were subsequently queried on a nightly basis and the CDF file lists were checked for any changes such as the occurrence of new, modified, or deleted

  14. Estimation of failure probabilities of linear dynamic systems by importance sampling

    Anna Ivanova Olsen; Arvid Naess

    2006-08-01

    An iterative method for estimating the failure probability for certain time-variant reliability problems has been developed. In the paper, the focus is on the displacement response of a linear oscillator driven by white noise. Failure is then assumed to occur when the displacement response exceeds a critical threshold. The iteration procedure is a two-step method. On the first iteration, a simple control function promoting failure is constructed using the design point weighting principle. After time discretization, two points are chosen to construct a compound deterministic control function. It is based on the time point when the first maximum of the homogeneous solution has occurred and on the point at the end of the considered time interval. An importance sampling technique is used in order to estimate the failure probability functional on a set of initial values of state space variables and time. On the second iteration, the concept of optimal control function can be implemented to construct a Markov control which allows much better accuracy in the failure probability estimate than the simple control function. On both iterations, the concept of changing the probability measure by the Girsanov transformation is utilized. As a result, the CPU time is substantially reduced compared with the crude Monte Carlo procedure.
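
    A minimal sketch of the change-of-measure machinery: drive a discretized oscillator with mean-shifted Gaussian increments and correct by the discrete analogue of the Girsanov likelihood ratio. The constant drift stands in for the paper's design-point and Markov controls; every constant below is a toy value.

    ```python
    import numpy as np
    rng = np.random.default_rng(0)

    w0, zeta, dt, T = 2.0, 0.05, 0.01, 10.0   # oscillator and time grid (assumed)
    b, N, m = 3.0, 2_000, 0.04                # threshold, paths, noise mean-shift

    est = np.zeros(N)
    for k in range(N):
        x = v = logw = 0.0
        for _ in range(int(T / dt)):
            xi = rng.normal(m, 1.0)                 # biased increment ~ N(m, 1)
            logw += -m * xi + 0.5 * m * m           # log phi(xi) - log phi(xi - m)
            v += (-2*zeta*w0*v - w0*w0*x) * dt + np.sqrt(dt) * xi
            x += v * dt
            if x > b:                               # first passage = failure
                est[k] = np.exp(logw)
                break
    print(est.mean())     # importance-sampling estimate of P(max displacement > b)
    ```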

  15. 40 CFR 80.1348 - What gasoline sample retention requirements apply to refiners and importers?

    2010-07-01

    ... 40 Protection of Environment 16 2010-07-01 2010-07-01 false What gasoline sample retention... PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) REGULATION OF FUELS AND FUEL ADDITIVES Gasoline Benzene Sampling, Testing and Retention Requirements § 80.1348 What gasoline sample retention...

  16. Eco-Physiologic studies an important tool for the adaptation of forestry to global changes.

    HASAN CANI; ARSEN PROKO; VATH TABAKU

    2014-01-01

    Forests are the dominant land use in Albania, occupying almost 1.5 million hectares [11], but ca. 70% of the forest area belongs to coppices and shrub forests, as the result of unsustainable practices, intensive cutting and overgrazing. Forest ecosystems serve many ecological roles, including regulation of the planet's carbon and water cycles. Forests are also important components of economic systems. Research in the Forest Ecophysiology studies at the Faculty of Forestry Sciences is intended t...

  17. Physical Activity: An Important Adaptative Mechanism for Body-Weight Control

    Finelli, Carmine; Gioia, Saverio; La Sala, Nicolina

    2012-01-01

    We review the current concepts about energy expenditure and evaluate the physical activity (PhA) in the context of this knowledge and the available literature. Regular PhA is correlated with low body weight and low body fat mass. The negative fat balance is probably secondary to this negative energy balance. Nonexercise activity thermogenesis (NEAT) and physical activity, which is crucial for weight control, may be important in the physiology of weight change. An intriguing doubt that remains ...

  18. Physical activity: an important adaptative mechanism for body-weight control.

    Finelli, Carmine; Gioia, Saverio; La Sala, Nicolina

    2012-01-01

    We review the current concepts about energy expenditure and evaluate the physical activity (PhA) in the context of this knowledge and the available literature. Regular PhA is correlated with low body weight and low body fat mass. The negative fat balance is probably secondary to this negative energy balance. Nonexercise activity thermogenesis (NEAT) and physical activity, which is crucial for weight control, may be important in the physiology of weight change. An intriguing doubt that remains unresolved is whether changes in nutrient intake or body composition secondarily affect the spontaneous physical activity. PMID:24533208

  19. Massively parallel sampling of lattice proteins reveals foundations of thermal adaptation

    Venev, Sergey V.; Zeldovich, Konstantin B.

    2015-08-01

    Evolution of proteins in bacteria and archaea living in different conditions leads to significant correlations between amino acid usage and environmental temperature. The origins of these correlations are poorly understood, and an important question of protein theory, physics-based prediction of the types of amino acids overrepresented in highly thermostable proteins, remains largely unsolved. Here, we extend the random energy model of protein folding by weighting the interaction energies of amino acids by their frequencies in protein sequences and predict the energy gap of proteins designed to fold well at elevated temperatures. To test the model, we present a novel scalable algorithm for simultaneous energy calculation for many sequences in many structures, targeting massively parallel computing architectures such as graphics processing units. The energy calculation is performed by multiplying two matrices, one representing the complete set of sequences, and the other describing the contact maps of all structural templates. An implementation of the algorithm for the CUDA platform is available at http://www.github.com/kzeldovich/galeprot and calculates protein folding energies over 250 times faster than a single central processing unit. Analysis of amino acid usage in 64-mer cubic lattice proteins designed to fold well at different temperatures demonstrates an excellent agreement between theoretical and simulated values of the energy gap. The theoretical predictions of temperature trends of amino acid frequencies are significantly correlated with bioinformatics data on 191 bacteria and archaea, and highlight protein folding constraints as a fundamental selection pressure during thermal adaptation in biological evolution.
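
    The core trick, all sequence-structure energies in one matrix product, is easy to sketch in NumPy (the paper's CUDA implementation performs the same multiplication on a GPU); the sequences, contact maps and pair potential below are random stand-ins.

    ```python
    import numpy as np
    rng = np.random.default_rng(0)

    # E[s, c] = sum over contacts (i, j) of U[a_i, a_j], for every sequence s
    # and structure c at once.
    L, n_seq, n_str = 64, 500, 20                   # 64-mer lattice proteins

    U = rng.normal(size=(20, 20)); U = (U + U.T) / 2    # symmetric pair potential
    seqs = rng.integers(0, 20, size=(n_seq, L))         # amino-acid indices

    maps = np.zeros((n_str, L * L))                     # flattened contact maps
    for c in range(n_str):
        C = np.triu(rng.random((L, L)) < 0.05, k=2)     # sparse upper-tri contacts
        maps[c] = C.ravel().astype(float)

    # Q[s, i*L + j] = U[a_i, a_j], flattened the same way as the contact maps.
    Q = U[seqs[:, :, None], seqs[:, None, :]].reshape(n_seq, L * L)

    E = Q @ maps.T                # (n_seq, n_str) folding energies in one GEMM
    print(E.shape, E[0, :3])
    ```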

  20. The Importance of Pressure Sampling Frequency in Models for Determination of Critical Wave Loadingson Monolithic Structures

    Burcharth, Hans F.; Andersen, Thomas Lykke; Meinert, Palle

    2008-01-01

    This paper discusses the influence of wave load sampling frequency on calculated sliding distance in an overall stability analysis of a monolithic caisson. It is demonstrated by a specific example of caisson design that for this kind of analyses the sampling frequency in a small scale model could...

  1. Indigenizing or Adapting? Importing Buddhism into a Settler-colonial Society

    Sally McAra

    2015-02-01

    Full Text Available In this paper I problematize the phrase "indigenization of Buddhism" (Spuler 2003, cf. Baumann 1997) through an investigation of a Buddhist project in a settler-colonial society. An international organization called the Foundation for the Preservation of the Mahayana Tradition (FPMT) is constructing a forty-five-meter-high stupa in rural Australia with the intention "to provide a refuge of peace and serenity for all." In 2003, a woman of Aboriginal descent met with the stupa developers to express her concern about the project. While her complaint does not represent local Aboriginal views about the stupa (other Aboriginal groups expressed support for it), it illustrates how, in settler-colonial societies, Buddhist cultural imports that mark the land can have unexpected implications for indigenous people. This paper offers a glimpse of the multi-layered power relations that form the often invisible backdrop to the establishment of Buddhism in settler-colonial societies and suggests that we need to find terms other than "indigenization" when analyzing this.

  2. Eco-Physiologic studies an important tool for the adaptation of forestry to global changes.

    HASAN CANI

    2014-06-01

    Full Text Available Forests are the dominant land use in Albania, occupying almost 1.5 million hectares [11], but ca. 70% of the forest area belongs to coppices and shrub forests, as the result of unsustainable practices, intensive cutting and overgrazing. Forest ecosystems serve many ecological roles, including regulation of the planet's carbon and water cycles. Forests are also important components of economic systems. Research in the Forest Ecophysiology studies at the Faculty of Forestry Sciences is intended to produce biological knowledge that can be used to better manage forest resources for sustainable production of economic and non-economic values, and aims to improve the understanding of past and current dynamics of Mediterranean and temperate forests. The overarching goal is to quantify the influence of genetics, climate, environmental stresses, and forest management inputs on forest productivity and carbon sequestration, and to understand the physiological mechanisms underlying these responses. Process-based models open the way to useful predictions of the future growth rate of forests and provide a means of assessing the probable effects of variations in climate and management on forest productivity. As such they have the potential to overcome the limitations of conventional forest growth and yield models. This paper discusses the basic physiological processes that determine the growth of plants, the way they are affected by environmental factors, and how we can improve well-understood processes such as growth from leaf to stand level and productivity. The study tries to show a clear relationship between temperature and water relations and other factors affecting forest plant germination and growth that are often looked at separately. This integrated approach will provide the most comprehensive source for process-based modelling, which is valuable to ecologists, plant physiologists, forest planners and environmental scientists [10]. Actually the

  3. Importance of sampling in relation to the gamma spectroscopic analysis of NORM and TENORM material

    This paper describes developments over the past 25 years, from low-background gamma spectroscopic analysis of NORM and TENORM materials to a state-of-the-art semi-automatic gamma analysis system. The developments were initiated in the early 1980s in order to be able to measure low specific activities in fly ash samples. They involved modifications and improvements of commercially available hardware and auxiliary equipment, and the improvement and development of analysis, correction and processing software, up to semi-automatic reporting of the analysis results. This effort has led to detection limits of 238U: 3 Bq/kg, 235U: 0.3 Bq/kg, 226Ra: 5 Bq/kg, 210Pb: 30 Bq/kg, 40K: 60 Bq/kg, with a measuring time of 70,000 s using a specially tuned gamma spectroscopy system for NORM and TENORM materials. These low detection limits underline the need for representative sampling procedures for NORM and TENORM materials. It is not possible to define a single sampling procedure valid for all types of sampling. It is therefore advised that, where sampling is expected to be performed at regular times, a sampling procedure for the materials in question be set up and validated. The procedure has to be based on an existing national or international standard. (author)

  4. 40 CFR 80.583 - What alternative sampling and testing requirements apply to importers who transport motor vehicle...

    2010-07-01

    ... applicable. (b) Quality assurance program. The importer must conduct a quality assurance program, as specified in this paragraph (b), for each truck or rail car loading terminal. (1) Quality assurance samples... an independent laboratory, and the terminal operator must not know in advance when samples are to...

  5. A whole-path importance-sampling scheme for Feynman path integral calculations of absolute partition functions and free energies.

    Mielke, Steven L; Truhlar, Donald G

    2016-01-21

    Using Feynman path integrals, a molecular partition function can be written as a double integral with the inner integral involving all closed paths centered at a given molecular configuration, and the outer integral involving all possible molecular configurations. In previous work employing Monte Carlo methods to evaluate such partition functions, we presented schemes for importance sampling and stratification in the molecular configurations that constitute the path centroids, but we relied on free-particle paths for sampling the path integrals. At low temperatures, the path sampling is expensive because the paths can travel far from the centroid configuration. We now present a scheme for importance sampling of whole Feynman paths based on harmonic information from an instantaneous normal mode calculation at the centroid configuration, which we refer to as harmonically guided whole-path importance sampling (WPIS). We obtain paths conforming to our chosen importance function by rejection sampling from a distribution of free-particle paths. Sample calculations on CH4 demonstrate that at a temperature of 200 K, about 99.9% of the free-particle paths can be rejected without integration, and at 300 K, about 98% can be rejected. We also show that it is typically possible to reduce the overhead associated with the WPIS scheme by sampling the paths using a significantly lower-order path discretization than that which is needed to converge the partition function. PMID:26801023
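
    A heavily simplified sketch of the rejection step, using a random-walk stand-in for free-particle paths and a harmonic importance weight from curvature at the centroid, is given below; the discretization and all constants are toy assumptions, not the paper's.

    ```python
    import numpy as np
    rng = np.random.default_rng(0)

    P, beta, k = 32, 2.0, 4.0      # path slices, inverse temperature, force constant

    def free_path(x0):
        """Random-walk path re-centered so that its centroid is x0."""
        path = np.cumsum(rng.normal(scale=np.sqrt(beta / P), size=P))
        return path - path.mean() + x0

    def harmonic_weight(path, x0):
        """In (0, 1], so paths can be rejected without any integration."""
        return np.exp(-0.5 * k * beta * np.mean((path - x0) ** 2))

    n, kept = 20_000, 0
    for _ in range(n):
        if rng.random() < harmonic_weight(free_path(0.0), 0.0):
            kept += 1
    print(f"accepted {kept / n:.1%} of free-particle paths")
    ```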

  6. The control of radioactivity in the imported food and other samples in Republic of Serbia for period 1990-1994

    The results of measurements of the 137Cs activities in imported food and other samples (1663) in Serbia for the period 1990-1994 are presented. The 137Cs activities in the majority of samples are in the interval 0.2-0.3 Bq/kg, and are in agreement with the results of the environmental radioactivity monitoring program in Serbia. (author)

  8. Cas9 function and host genome sampling in Type II-A CRISPR–Cas adaptation

    Wei, Yunzhou; Terns, Rebecca M.; Terns, Michael P.

    2015-01-01

    Wei et al. found that Cas9, previously identified as the nuclease responsible for ultimate invader destruction, is also essential for adaptation in Streptococcus thermophilus. Cas9 nuclease activity is dispensable for adaptation. Wei et al. also revealed extensive, unbiased acquisition of the self-targeting host genome sequence by the CRISPR–Cas system that is masked in the presence of active target destruction.

  9. Importance of covariance components of waveform data with high sampling rate in seismic source inversion

    Yagi, Y.; Fukahata, Y.

    2007-12-01

    As computer technology has advanced, it has become possible to observe seismic waves at higher sampling rates and to perform inversions for larger data sets. In general, waveform data with a higher sampling rate are needed to obtain a finer image of seismic source processes. We then encounter the question of whether there is any limit to the useful sampling rate in waveform inversion. In traditional seismic source inversion, covariance components of sampled waveform data have commonly been neglected. In fact, however, observed waveform data are not completely independent of each other, at least in the time domain, because they are always affected by un-elastic attenuation during the propagation of seismic waves through the Earth. In this study, we have developed a method of seismic source inversion that takes the data covariance into account, and applied it to teleseismic P-wave data of the 2003 Boumerdes-Zemmouri, Algeria earthquake. From a comparison of final slip distributions inverted by the new formulation and the traditional formulation, we found that the effect of covariance components is crucial for data sets with higher sampling rates (≥ 5 Hz). For higher sampling rates, the slip distributions from the new formulation look stable, whereas the slip distributions from the traditional formulation tend to concentrate into small patches owing to overestimation of the information content of the observed data. Our result indicates that the un-elastic behaviour of the Earth places a limit on the resolution of inverted seismic source models. It has been pointed out that seismic source models obtained from waveform data analyses are quite different from one another. One possible reason for the discrepancy is the neglect of covariance components. The new formulation should be useful for obtaining a standard seismic source model.
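
    A toy linear inversion illustrates the difference in formulation: generalized least squares with an assumed exponential temporal covariance (a crude stand-in for attenuation-induced correlation) against ordinary least squares that neglects it.

    ```python
    import numpy as np
    rng = np.random.default_rng(0)

    n, dt, tau = 400, 0.04, 0.3        # samples, 25 Hz data, correlation time (assumed)
    t = np.arange(n) * dt
    Cd = 0.2**2 * np.exp(-np.abs(t[:, None] - t[None, :]) / tau)   # data covariance

    G = np.column_stack([np.sin(2 * np.pi * f * t) for f in (0.3, 0.7, 1.3)])
    m_true = np.array([1.0, -0.5, 0.25])
    d = G @ m_true + np.linalg.cholesky(Cd + 1e-12 * np.eye(n)) @ rng.normal(size=n)

    m_ols = np.linalg.lstsq(G, d, rcond=None)[0]        # covariance neglected
    W = np.linalg.inv(Cd + 1e-12 * np.eye(n))
    m_gls = np.linalg.solve(G.T @ W @ G, G.T @ W @ d)   # covariance included
    print("OLS:", m_ols)
    print("GLS:", m_gls)
    ```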

  10. On the importance of sampling variance to investigations of temporal variation in animal population size

    Link, W.A.; Nichols, J.D.

    1994-01-01

    Our purpose here is to emphasize the need to properly deal with sampling variance when studying population variability and to present a means of doing so. We present an estimator for temporal variance of population size for the general case in which there are both sampling variances and covariances associated with estimates of population size. We illustrate the estimation approach with a series of population size estimates for black-capped chickadees (Parus atricapillus) wintering in a Connecticut study area and with a series of population size estimates for breeding populations of ducks in southwestern Manitoba.
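
    In the simplest case, with no sampling covariances, the logic of such an estimator reduces to subtracting the mean squared standard error from the raw variance of the yearly estimates; the numbers below are invented for illustration, and the paper's estimator additionally handles covariances among the estimates.

    ```python
    import numpy as np

    N_hat = np.array([210.0, 180.0, 250.0, 195.0, 230.0])   # yearly estimates (toy)
    se = np.array([25.0, 20.0, 30.0, 22.0, 28.0])           # their standard errors

    raw_var = N_hat.var(ddof=1)                         # temporal + sampling variation
    temporal_var = max(0.0, raw_var - np.mean(se**2))   # noise-corrected estimate
    print(raw_var, temporal_var)
    ```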

  11. Adaptation of G-TAG Software for Validating Touch-and-Go Comet Surface Sampling Design Methodology

    Mandic, Milan; Acikmese, Behcet; Blackmore, Lars

    2011-01-01

    The G-TAG software tool was developed under the R&TD on Integrated Autonomous Guidance, Navigation, and Control for Comet Sample Return, and represents a novel, multi-body dynamics simulation software tool for studying TAG sampling. The G-TAG multi-body simulation tool provides a simulation environment in which a Touch-and-Go (TAG) sampling event can be extensively tested. TAG sampling requires the spacecraft to descend to the surface, contact the surface with a sampling collection device, and then to ascend to a safe altitude. The TAG event lasts only a few seconds but is mission-critical with potentially high risk. Consequently, there is a need for the TAG event to be well characterized and studied by simulation and analysis in order for the proposal teams to converge on a reliable spacecraft design. This adaptation of the G-TAG tool was developed to support the Comet Odyssey proposal effort, and is specifically focused to address comet sample return missions. In this application, the spacecraft descends to and samples from the surface of a comet. Performance of the spacecraft during TAG is assessed based on survivability and sample collection performance. For the adaptation of the G-TAG simulation tool to comet scenarios, models are developed that accurately describe the properties of the spacecraft, approach trajectories, and descent velocities, as well as the models of the external forces and torques acting on the spacecraft. The adapted models of the spacecraft, descent profiles, and external sampling forces/torques were more sophisticated and customized for comets than those available in the basic G-TAG simulation tool. Scenarios implemented include the study of variations in requirements, spacecraft design (size, locations, etc. of the spacecraft components), and the environment (surface properties, slope, disturbances, etc.). The simulations, along with their visual representations using G-View, contributed to the Comet Odyssey New Frontiers proposal

  12. Sampling procedure in a willow plantation for chemical elements important for biomass combustion quality

    Liu, Na; Nielsen, Henrik Kofoed; Jørgensen, Uffe;

    2015-01-01

    Willow (Salix spp.) is expected to contribute significantly to the woody bioenergy system in the future, so more information on how to sample the quality of the willow biomass is needed. The objectives of this study were to investigate the spatial variation of elements within shoots of a willow...

  13. 40 CFR 80.335 - What gasoline sample retention requirements apply to refiners and importers?

    2010-07-01

    ... certify that the procedures meet the requirements of the ASTM procedures required under 40 CFR 80.330. (d... 40 Protection of Environment 16 2010-07-01 2010-07-01 false What gasoline sample retention... PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) REGULATION OF FUELS AND FUEL ADDITIVES Gasoline...

  14. The Importance of Cultural and Gastronomic Tourism in Local Economic Development: Zile Sample

    Mehmet Kocaman; Emel Memis Kocaman

    2014-01-01

    More rational resource distribution in Turkey has recently brought the principles of optimality in investment planning to the fore. As a result, many rural areas have been negatively affected. Accordingly, alternative tourism provides important opportunities for rural regions. People living in these regions have come to value the local tangible and intangible cultural assets present in their environment, as well as gastronomic products consisting of regional tastes. ...

  15. 19 CFR 19.8 - Examination of goods by importer; sampling; repacking; examination of merchandise by prospective...

    2010-04-01

    ...; repacking; examination of merchandise by prospective purchasers. 19.8 Section 19.8 Customs Duties U.S... goods by importer; sampling; repacking; examination of merchandise by prospective purchasers. Importers... conduct of Customs business and no danger to the revenue prospective purchaser may be permitted to...

  16. Fast-adapting mechanoreceptors are important for force control in precision grip but not for sensorimotor memory.

    Park, Susanna B; Davare, Marco; Falla, Marika; Kennedy, William R; Selim, Mona M; Wendelschafer-Crabb, Gwen; Koltzenburg, Martin

    2016-06-01

    Sensory feedback from cutaneous mechanoreceptors in the fingertips is important in effective object manipulation, allowing appropriate scaling of grip and load forces during precision grip. However, the role of mechanoreceptor subtypes in these tasks remains incompletely understood. To address this issue, psychophysical tasks that may specifically assess function of type I fast-adapting (FAI) and slowly adapting (SAI) mechanoreceptors were used with object manipulation experiments to examine the regulation of grip force control in an experimental model of graded reduction in tactile sensitivity (healthy volunteers wearing 2 layers of latex gloves). With gloves, tactile sensitivity decreased significantly from 1.9 ± 0.4 to 12.3 ± 2.2 μm in the Bumps task assessing function of FAI afferents but not in a grating orientation task assessing SAI afferents (1.6 ± 0.1 to 1.8 ± 0.2 mm). Six axis force/torque sensors measured peak grip (PGF) and load (PLF) forces generated by the fingertips during a grip-lift task. With gloves there was a significant increase of PGF (14 ± 6%), PLF (17 ± 5%), and grip and load force rates (26 ± 8%, 20 ± 8%). A variable-weight series task was used to examine sensorimotor memory. There was a 20% increase in PGF when the lift of a light object was preceded by a heavy relative to a light object. This relationship was not significantly altered when lifting with gloves, suggesting that the addition of gloves did not change sensorimotor memory effects. We conclude that FAI fibers may be important for the online force scaling but not for the buildup of a sensorimotor memory. PMID:27052582

  17. Sample preparation and biomass determination of SRF model mixture using cryogenic milling and the adapted balance method

    Schnöller, Johannes; Aschenbrenner, Philipp; Hahn, Manuel; Fellner, Johann; Rechberger, Helmut

    2014-11-15

    Highlights: • An alternative sample comminution procedure for SRF is tested. • Proof of principle is shown on an SRF model mixture. • The biogenic content of the SRF is analyzed with the adapted balance method. • The novel method combines combustion analysis and a data reconciliation algorithm. • Factors for the variance of the analysis results are statistically quantified. - Abstract: The biogenic fraction of a simple solid recovered fuel (SRF) mixture (80 wt% printer paper/20 wt% high density polyethylene) is analyzed with the in-house developed adapted balance method (aBM). This fairly new approach is a combination of combustion elemental analysis (CHNS) and a data reconciliation algorithm based on successive linearisation for evaluation of the analysis results. This method shows great potential as an alternative way to determine the biomass content in SRF. However, the employed analytical technique (CHNS elemental analysis) restricts the probed sample mass to low amounts in the range of a few hundred milligrams. This requires sample comminution to small grain sizes (<200 μm) to generate representative SRF specimens. This is not easily accomplished for certain material mixtures (e.g. SRF with rubber content) by conventional means of sample size reduction. This paper presents a proof of principle investigation of the sample preparation and analysis of an SRF model mixture with the use of cryogenic impact milling (final sample comminution) and the adapted balance method (determination of biomass content). The so derived sample preparation methodology (cutting mills and cryogenic impact milling) shows a better performance in accuracy and precision for the determination of the biomass content than one solely based on cutting mills. The results for the determination of the biogenic fraction are within 1–5% of the data obtained by the reference methods, the selective dissolution method (SDM) and the 14C-method (14C-M)

  18. Numerically Accelerated Importance Sampling for Nonlinear Non-Gaussian State Space Models

    Koopman, S.J.; Lucas, A.; Scharth Figueiredo Pinto, M.

    2011-01-01

    This paper led to a publication in the Journal of Business & Economic Statistics, 2015, 33 (1), 114-127. We introduce a new efficient importance sampler for nonlinear non-Gaussian state space models. We propose a general and efficient likelihood evaluation method for this class of models via the combination of numerical and Monte Carlo integration methods. Our methodology explores the idea that only a small part of the likelihood evaluation problem requires simulation. We refer to our new ...

  19. Optimal importance sampling for tracking in image sequences: application to point tracking

    Arnaud, Elise; Memin, Etienne

    2004-01-01

    In this paper, we propose a particle filtering technique for tracking applications in image sequences. The system we propose combines a measurement equation and a dynamic equation which both depend on the image sequence. Taking into account several possible observations, the measurement model we consider is a linear combination of Gaussian laws. Such a model allows us to infer an analytic expression of the optimal importance function used in the diffusion proces...
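
    For the special case of a single linear-Gaussian observation, the optimal importance function and its weight update are available in closed form, as sketched below; the model constants are assumed, and the mixture-of-Gaussians measurement model of the paper follows the same pattern component-wise.

    ```python
    import numpy as np
    rng = np.random.default_rng(0)

    # Model: x_t = a x_{t-1} + q e_t,  y_t = x_t + r n_t (all constants assumed).
    a, q, r, T, Np = 0.95, 0.5, 0.3, 50, 500

    x_true, ys = 0.0, []
    for _ in range(T):                           # simulate a track to filter
        x_true = a * x_true + q * rng.normal()
        ys.append(x_true + r * rng.normal())

    parts = np.zeros(Np)
    w = np.full(Np, 1.0 / Np)
    s2 = 1.0 / (1.0 / q**2 + 1.0 / r**2)         # optimal proposal variance
    for y in ys:
        # weight update uses the predictive p(y | x_{t-1}) = N(a x, q^2 + r^2)
        w *= np.exp(-0.5 * (y - a * parts) ** 2 / (q**2 + r**2))
        mu = s2 * (a * parts / q**2 + y / r**2)  # optimal proposal mean
        parts = mu + np.sqrt(s2) * rng.normal(size=Np)
        w /= w.sum()
        idx = rng.choice(Np, Np, p=w)            # resample
        parts, w = parts[idx], np.full(Np, 1.0 / Np)
    print(np.mean(parts), ys[-1])                # filtered mean vs last observation
    ```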

  20. A Monte Carlo Simulation of the Flow Network Reliability using Importance and Stratified Sampling

    Bulteau, Stéphane; El Khadiri, Mohamed

    1997-01-01

    We consider the evaluation of the flow network reliability parameter. Because the exact evaluation of this parameter has exponential time complexity, simulation methods are used to derive an estimate. In this paper, we use the state space decomposition methodology of Doulliez and Jamoulle to construct a new simulation method which combines the importance and stratified Monte Carlo principles. We show that the related estimator belongs to the variance-reduction family. By numerical c...

  1. Adaption of egg and larvae sampling techniques for lake sturgeon and broadcast spawning fishes in a deep river

    Roseman, Edward F.; Kennedy, Gregory W.; Craig, Jaquelyn; Boase, James; Soper, Karen

    2011-01-01

    In this report we describe how we adapted two techniques for sampling lake sturgeon (Acipenser fulvescens) and other fish early life history stages to meet our research needs in the Detroit River, a deep, flowing Great Lakes connecting channel. First, we developed a buoy-less method for sampling fish eggs and spawning activity using egg mats deployed on the river bottom. The buoy-less method allowed us to fish gear in areas frequented by boaters and recreational anglers, thus eliminating surface obstructions that interfered with recreational and boating activities. The buoy-less method also reduced gear loss due to drift when masses of floating aquatic vegetation would accumulate on buoys and lines, increasing the drag on the gear and pulling it downstream. Second, we adapted a D-frame drift net system formerly employed in shallow streams to assess larval lake sturgeon dispersal for use in the deeper (>8 m) Detroit River using an anchor and buoy system.

  2. The importance of effective sampling for exploring the population dynamics of haploid-diploid seaweeds.

    Krueger-Hadfield, Stacy A; Hoban, Sean M

    2016-02-01

    The mating system partitions genetic diversity within and among populations, and the links between life history traits and mating systems have been extensively studied in diploid organisms. As such, most evolutionary theory is focused on species for which sexual reproduction occurs between diploid male and diploid female individuals. However, there are many multicellular organisms with biphasic life cycles in which the haploid stage is prolonged and undergoes substantial somatic development. In particular, biphasic life cycles are found across green, brown and red macroalgae. Yet, few studies have addressed the population structure and genetic diversity of both the haploid and diploid stages in these life cycles. We have developed some broad guidelines with which to design population genetic studies of haploid-diploid macroalgae and to quantify the relationship between power and sampling strategy. We address three common goals for studying macroalgal population dynamics, including haploid-diploid ratios, genetic structure and paternity analyses. PMID:26987084

  3. Importance sampling techniques and treatment of electron transport in MCNP 4A

    The continuous-energy Monte Carlo code MCNP was developed by the Radiation Transport Group at Los Alamos National Laboratory, and the MCNP 4A version is now available. MCNP 4A can perform coupled neutron/secondary-gamma-ray/electron/bremsstrahlung calculations. Calculated results, such as energy spectra, the tally fluctuation chart, and geometrical input data, can be displayed on a workstation. The MCNP 4A documentation contains no description of the subroutines, except for a few such as 'SOURCE' and 'TALLYX'. However, improving the MCNP Monte Carlo sampling techniques to obtain more accurate or efficient results for some problems requires adding or revising subroutines. Three subroutines have been revised and built into the MCNP 4A code. (author)

  4. Gravimetric and volumetric approaches adapted for hydrogen sorption measurements with in situ conditioning on small sorbent samples

    We present high-sensitivity (0 to 1 bar, 295 K) gravimetric and volumetric hydrogen sorption measurement systems adapted for in situ sample conditioning at high temperature and high vacuum. These systems are designed especially for experiments on sorbents available in small masses (mg) and requiring thorough degassing prior to sorption measurements. Uncertainty analyses based on instrumental specifications and on hydrogen absorption measurements on palladium are presented. The gravimetric and volumetric systems yield cross-checkable results within about 0.05 wt % on samples weighing from (3 to 25) mg. Hydrogen storage capacities of single-walled carbon nanotubes measured at 1 bar and 295 K with both systems are presented

  5. Low-carbon steel samples deformed by cold rolling - analysis by the magnetic adaptive testing

    Tomáš, Ivan; Vértesy, G.; Kobayashi, S.; Kadlecová, Jana; Stupakov, Oleksandr

    2009-01-01

    Roč. 321, č. 17 (2009), s. 2670-2676. ISSN 0304-8853 R&D Projects: GA MŠk MEB040702; GA ČR GA102/06/0866; GA AV ČR 1QS100100508 Institutional research plan: CEZ:AV0Z10100520 Keywords : magnetic NDE * magnetic adaptive testing * plastic deformation * low-carbon steel Subject RIV: BM - Solid Matter Physics ; Magnetism Impact factor: 1.204, year: 2009

  6. Spatiotonal adaptivity in super-resolution of under-sampled image sequences

    Pham, T Q

    2006-01-01

    This thesis concerns the use of spatial and tonal adaptivity in improving the resolution of aliased image sequences under scene or camera motion. Each of the five content chapters focuses on a different subtopic of super-resolution: image registration (chapter 2), image fusion (chapter 3 and 4), super-resolution restoration (chapter 5), and super-resolution synthesis (chapter 6). Chapter 2 derives the Cramer-Rao lower bound of image registration and shows that iterative gradient-based estimat...

  7. Complex mixture analysis of environmentally important samples utilizing GC/Matrix isolation-FTIR-MS

    Gas chromatography/matrix isolation Fourier transform infrared spectroscopy-mass spectroscopy (GC/MI-FTIR-MS) is a highly sensitive and specific hyphenated technique that combines the capabilities of capillary gas chromatography for separating components of complex mixtures with the high sensitivity and specificity of both matrix isolation infrared and mass spectrometric detection. Research intended to extend application of this methodology to the analysis of a variety of environmental mixtures will be described. The authors' instrument uses a Hewlett-Packard 58900 GC with a 40:40:20 three-way splitter to the infrared detector, mass selective detector and flame ionization detector, respectively. The FTIR used is a Mattson Cryolect 4800, while the MS is a Hewlett-Packard 5970B MSD. Their most recent results from the analysis of mixtures containing picogram quantities of such environmentally important materials as PAHs and chlorinated pesticides will be presented. These mixtures can be chromatographically separated and analyzed at the low-nanogram to several-hundred-picogram level. The results presented will include both MS and IR spectra, MS and IR reconstructed chromatograms, as well as FID traces. The results of MS and IR database searches will also be shown.

  8. Importance Sampling Based Decision Trees for Security Assessment and the Corresponding Preventive Control Schemes: the Danish Case Study

    Liu, Leo; Rather, Zakir Hussain; Chen, Zhe;

    2013-01-01

    and adopts a methodology of importance sampling to maximize the information contained in the database so as to increase the accuracy of DT. Further, this paper also studies the effectiveness of DT by implementing its corresponding preventive control schemes. These approaches are tested on the detailed model...

  9. Quantitative Assessment of the Importance of Phenotypic Plasticity in Adaptation to Climate Change in Wild Bird Populations

    Vedder, Oscar; Bouwhuis, Sandra; Sheldon, Ben C.

    2013-01-01

    Predictions about the fate of species or populations under climate change scenarios typically neglect adaptive evolution and phenotypic plasticity, the two major mechanisms by which organisms can adapt to changing local conditions. As a consequence, we have little understanding of the scope for organisms to track changing environments by in situ adaptation. Here, we use a detailed individual-specific long-term population study of great tits (Parus major) breeding in Wytham Woods, Oxford, UK t...

  10. FloodNet: Coupling Adaptive Sampling with Energy Aware Routing in a Flood Warning System

    Jing ZHOU; De Roure, David

    2007-01-01

    We describe the design of FloodNet, a flood warning system, which uses a grid-based flood predictor model developed by environmental experts to make flood predictions based on readings of water level collected by a set of sensor nodes. To optimize battery consumption, the reporting frequency of sensor nodes is required to be adaptive to local conditions as well as to the flood predictor model. We therefore propose an energy aware routing protocol which allows sensor nodes to consume energy accord...

  11. Clinical importance and representation of toxigenic and non-toxigenic Clostridium difficile cultivated from stool samples of hospitalized patients

    Predrag Stojanovic

    2012-03-01

    Full Text Available The aim of this study was to establish the clinical importance and representation of toxigenic and non-toxigenic Clostridium difficile isolated from stool samples of hospitalized patients. This survey included 80 hospitalized patients with diarrhea and positive findings of Clostridium difficile in stool samples, and 100 hospitalized patients with formed stool as a control group. Bacteriological examination of stool samples was conducted using standard microbiological methods. Stool samples were inoculated directly on nutrient media for bacterial cultivation (blood agar using 5% sheep blood, Endo agar, selective Salmonella Shigella agar, Selenite-F broth, CIN agar and Skirrow's medium), and on selective cycloserine-cefoxitin-fructose agar (CCFA) (Biomedics, Parque Tecnológico, Madrid, Spain) for isolation of Clostridium difficile. Clostridium difficile toxin was detected by ELISA-ridascreen Clostridium difficile Toxin A/B (R-Biopharm AG, Germany) and the ColorPAC Toxin A test (Becton Dickinson, USA). Examination of stool specimens for the presence of parasites (causing diarrhea) was done using standard methods (conventional microscopy), the commercial concentration test Paraprep S Gold kit (Dia Mondial, France) and the RIDA®QUICK Cryptosporidium/Giardia Combi test (R-Biopharm AG, Germany). Examination of stool specimens for the presence of fungi (causing diarrhea) was performed by standard methods. All stool samples positive for Clostridium difficile were tested for Rota, Noro, Astro and Adeno viruses by ELISA-ridascreen (R-Biopharm AG, Germany). In this research we isolated 99 Clostridium difficile strains from 116 stool samples of 80 hospitalized patients with diarrhea. Fifty-three (66.25%) of the patients with diarrhea were positive for toxins A and B, and one (1.25%) was positive for toxin B only. Non-toxigenic Clostridium difficile was isolated from samples of 26 (32.5%) patients. However, other pathogenic microorganisms of the intestinal tract were cultivated from samples of 16 patients

  12. Unconstrained Recursive Importance Sampling

    Lemaire, Vincent; Pagès, Gilles

    2010-01-01

    We propose an unconstrained stochastic approximation method for finding the optimal change of measure (in an a priori parametric family) to reduce the variance of a Monte Carlo simulation. We consider different parametric families based on the Girsanov theorem and the Esscher transform (exponential tilting). In [Monte Carlo Methods Appl. 10 (2004) 1–24], a projected Robbins–Monro procedure was described to select the parameter minimizing the variance in a multidimensional Gaussian framework. I...
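
    The recursion at the heart of this approach can be sketched compactly. Below is a minimal, hypothetical Python illustration (not the authors' implementation) of an unconstrained Robbins–Monro recursion that adapts the mean shift theta of a Gaussian importance distribution so as to reduce the variance of an estimate of E[f(X)] with X ~ N(0, I):

        import numpy as np

        def adaptive_is_gaussian(f, dim, n_iter=5000, seed=0):
            """Robbins-Monro adaptation of a Gaussian mean shift for importance sampling.

            Samples X ~ N(theta, I) and weights by the likelihood ratio
            w = exp(-theta.x + |theta|^2 / 2), so the average stays unbiased
            for E[f(X)] under N(0, I); theta follows a stochastic gradient
            descent on the second moment of the weighted estimator.
            """
            rng = np.random.default_rng(seed)
            theta = np.zeros(dim)
            total = 0.0
            for n in range(1, n_iter + 1):
                x = theta + rng.standard_normal(dim)
                w = np.exp(-theta @ x + 0.5 * theta @ theta)  # dN(0,I)/dN(theta,I)
                h = f(x) * w
                # An unbiased gradient of the variance proxy E[(f w)^2] is
                # h^2 (theta - x); descend with Robbins-Monro step 1/n.
                theta = theta - (1.0 / n) * h**2 * (theta - x)
                total += h
            return total / n_iter, theta

    For a rare-event integrand such as f(x) = 1{x[0] > 3}, the recursion drifts theta toward the rare region while the weighted average remains unbiased.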

  13. An adaptive non-raster scanning method in atomic force microscopy for simple sample shapes

    It is a significant challenge to reduce the scanning time in atomic force microscopy while retaining imaging quality. In this paper, a novel non-raster scanning method for high-speed imaging is presented. The method proposed here is developed for a specimen with the simple shape of a cell. The image is obtained by scanning the boundary of the specimen at successively increasing heights, creating a set of contours. The scanning speed is increased by employing a combined prediction algorithm, using a weighted prediction from the contours scanned earlier, and from the currently scanned contour. In addition, an adaptive change in the height step after each contour scan is suggested. A rigorous simulation test bed recreates the x–y specimen stage dynamics and the cantilever height control dynamics, so that a detailed parametric comparison of the scanning algorithms is possible. The data from different scanning algorithms are compared after the application of an image interpolation algorithm (the Delaunay interpolation algorithm), which can also run on-line. (paper)
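
    As a rough illustration of the combined prediction step described above (a sketch under assumed conventions, not the paper's algorithm), the next boundary point can be blended from the contour scanned at the previous height and an extrapolation of the contour currently being scanned:

        import numpy as np

        def predict_next_point(prev_contour, current_partial, alpha=0.6):
            """Weighted prediction of the next boundary point to scan.

            prev_contour   : (N, 2) array, contour scanned at the previous height
            current_partial: (M, 2) array, M >= 2 points scanned so far at this height
            alpha          : weight on the previous-contour predictor (assumed value)
            """
            m = len(current_partial)
            # Predictor 1: the corresponding point on the previously scanned contour.
            from_prev = prev_contour[m % len(prev_contour)]
            # Predictor 2: linear extrapolation of the current contour.
            from_curr = 2 * current_partial[-1] - current_partial[-2]
            return alpha * from_prev + (1 - alpha) * from_curr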

  14. Efficient Bayes-Adaptive Reinforcement Learning using Sample-Based Search

    Guez, Arthur; Dayan, Peter

    2012-01-01

    Bayesian model-based reinforcement learning is a formally elegant approach to learning optimal behaviour under model uncertainty. In this setting, a Bayes-optimal policy captures the ideal trade-off between exploration and exploitation. Unfortunately, finding Bayes-optimal policies is notoriously taxing due to the enormous search space in the augmented belief-state MDP. In this paper we exploit recent advances in sample-based planning, based on Monte-Carlo tree search, to introduce a tractable method for approximate Bayes-optimal planning. Unlike prior work in this area, we avoid expensive applications of Bayes rule within the search tree, by lazily sampling models from the current beliefs. Our approach outperformed prior Bayesian model-based RL algorithms by a significant margin on several well-known benchmark problems.
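
    The key trick (sampling a concrete model from the current posterior once per simulation, rather than applying Bayes' rule at every node of the search tree) can be caricatured with root sampling and random rollouts. This is a deliberately simplified sketch with assumed data structures, not the tree-search algorithm of the paper:

        import numpy as np

        def root_sampling_plan(counts, rewards, s0, n_sims=1000, horizon=20,
                               gamma=0.95, seed=0):
            """Approximate Bayes-optimal action choice by per-simulation model sampling.

            counts : (S, A, S) Dirichlet posterior counts over transitions
            rewards: (S, A) known mean rewards (a simplifying assumption)
            """
            rng = np.random.default_rng(seed)
            S, A, _ = counts.shape
            q = np.zeros(A)
            for _ in range(n_sims):
                # Sample one concrete MDP from the posterior for this simulation.
                P = np.array([[rng.dirichlet(counts[s, a]) for a in range(A)]
                              for s in range(S)])
                for a0 in range(A):
                    s, a, ret, disc = s0, a0, 0.0, 1.0
                    for _ in range(horizon):
                        ret += disc * rewards[s, a]
                        disc *= gamma
                        s = rng.choice(S, p=P[s, a])
                        a = rng.integers(A)  # uniform random rollout policy
                    q[a0] += ret
            return int(np.argmax(q))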

  15. SAMPLING ADAPTIVE STRATEGY AND SPATIAL ORGANISATION ESTIMATION OF SOIL ANIMAL COMMUNITIES AT VARIOUS HIERARCHICAL LEVELS OF URBANISED TERRITORIES

    Baljuk J.A.

    2014-12-01

    Full Text Available In this work, an algorithm implementing an adaptive strategy of optimal spatial sampling for studying the spatial organisation of soil animal communities under urbanization is presented. The principal components obtained from the analysis of field data on soil penetration resistance, soil electrical conductivity and forest stand density, collected on a quasi-regular grid, were used as operating variables. The locations of the experimental polygons were determined by means of the program ESAP. Sampling was performed on a regular grid within the experimental polygons. The biogeocoenological estimation of the experimental polygons was made on the basis of A. L. Belgard's ecomorphic analysis. The spatial configuration of biogeocoenosis types was established on the basis of earth remote sensing data and the analysis of a digital elevation model. The suggested algorithm reveals the spatial organisation of soil animal communities at the level of the investigated point, the biogeocoenosis, and the landscape.

  16. Preliminary Efficacy of Adapted Responsive Teaching for Infants at Risk of Autism Spectrum Disorder in a Community Sample

    Grace T. Baranek

    2015-01-01

    Full Text Available This study examined (a) the feasibility of enrolling 12-month-olds at risk of ASD from a community sample into a randomized controlled trial, (b) the subsequent utilization of community services, and (c) the potential of a novel parent-mediated intervention to improve outcomes. The First Year Inventory was used to screen and recruit 12-month-old infants at risk of ASD to compare the effects of 6–9 months of Adapted Responsive Teaching (ART) versus referral to early intervention and monitoring (REIM). Eighteen families were followed for ~20 months. Assessments were conducted before randomization, after treatment, and at 6-month follow-up. Utilization of community services was highest for the REIM group. ART significantly outperformed REIM on parent-reported and observed measures of child receptive language with good linear model fit. Multiphase growth models had better fit for more variables, showing the greatest effects in the active treatment phase, where ART outperformed REIM on parental interactive style (less directive), child sensory responsiveness (less hyporesponsive), and adaptive behavior (increased communication and socialization). This study demonstrates the promise of a parent-mediated intervention for improving developmental outcomes for infants at risk of ASD in a community sample and highlights the utility of earlier identification for access to community services earlier than standard practice.

  17. Assessment of Different Sampling Methods for Measuring and Representing Macular Cone Density Using Flood-Illuminated Adaptive Optics

    Feng, Shu; Gale, Michael J.; Fay, Jonathan D.; Faridi, Ambar; Titus, Hope E.; Garg, Anupam K.; Michaels, Keith V.; Erker, Laura R.; Peters, Dawn; Smith, Travis B.; Pennesi, Mark E.

    2015-01-01

    Purpose: To describe a standardized flood-illuminated adaptive optics (AO) imaging protocol suitable for the clinical setting and to assess sampling methods for measuring cone density. Methods: Cone density was calculated following three measurement protocols: 50 × 50-μm sampling window values every 0.5° along the horizontal and vertical meridians (fixed-interval method), the mean density of expanding 0.5°-wide arcuate areas in the nasal, temporal, superior, and inferior quadrants (arcuate mean method), and the peak cone density of a 50 × 50-μm sampling window within expanding arcuate areas near the meridian (peak density method). Repeated imaging was performed in nine subjects to determine intersession repeatability of cone density. Results: Cone density montages could be created for 67 of the 74 subjects. Image quality was determined to be adequate for automated cone counting for 35 (52%) of the 67 subjects. We found that cone density varied with different sampling methods and regions tested. In the nasal and temporal quadrants, peak density most closely resembled histological data, whereas the arcuate mean and fixed-interval methods tended to underestimate the density compared with histological data. However, in the inferior and superior quadrants, arcuate mean and fixed-interval methods most closely matched histological data, whereas the peak density method overestimated cone density compared with histological data. Intersession repeatability testing showed that repeatability was greatest when sampling by arcuate mean and lowest when sampling by fixed interval. Conclusions: We show that different methods of sampling can significantly affect cone density measurements. Therefore, care must be taken when interpreting cone density results, even in a normal population. PMID:26325414
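
    For intuition, the three protocols differ only in where sampling windows are placed and how their densities are aggregated. A schematic helper (hypothetical names, assuming a binary cone-location map at a known pixel scale):

        import numpy as np

        def window_density(cone_map, cx, cy, win_um=50, um_per_px=1.0):
            """Cone density (cones/mm^2) in a win_um x win_um window centred at (cx, cy)."""
            half = int(win_um / um_per_px / 2)
            patch = cone_map[cy - half:cy + half, cx - half:cx + half]
            return patch.sum() / (win_um * 1e-3) ** 2

        # Fixed-interval: one window every 0.5 deg along a meridian.
        # Arcuate mean : mean density over an expanding 0.5 deg-wide arcuate area.
        # Peak density : maximum window density found within the arcuate area.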

  18. Adaptive foveated single-pixel imaging with dynamic super-sampling

    Phillips, David B; Taylor, Jonathan M; Edgar, Matthew P; Barnett, Stephen M; Gibson, Graham G; Padgett, Miles J

    2016-01-01

    As an alternative to conventional multi-pixel cameras, single-pixel cameras enable images to be recorded using a single detector that measures the correlations between the scene and a set of patterns. However, to fully sample a scene in this way requires at least the same number of correlation measurements as there are pixels in the reconstructed image. Therefore single-pixel imaging systems typically exhibit low frame-rates. To mitigate this, a range of compressive sensing techniques have been developed which rely on a priori knowledge of the scene to reconstruct images from an under-sampled set of measurements. In this work we take a different approach and adopt a strategy inspired by the foveated vision systems found in the animal kingdom - a framework that exploits the spatio-temporal redundancy present in many dynamic scenes. In our single-pixel imaging system a high-resolution foveal region follows motion within the scene, but unlike a simple zoom, every frame delivers new spatial information from acros...

  19. Generalized likelihood uncertainty estimation (GLUE) using adaptive Markov chain Monte Carlo sampling

    Blasone, Roberta-Serena; Vrugt, Jasper A.; Madsen, Henrik;

    2008-01-01

    estimate of the associated uncertainty. This uncertainty arises from incomplete process representation, uncertainty in initial conditions, input, output and parameter error. The generalized likelihood uncertainty estimation (GLUE) framework was one of the first attempts to represent prediction uncertainty...... within the context of Monte Carlo (MC) analysis coupled with Bayesian estimation and propagation of uncertainty. Because of its flexibility, ease of implementation and its suitability for parallel implementation on distributed computer systems, the GLUE method has been used in a wide variety of...... applications. However, the MC based sampling strategy of the prior parameter space typically utilized in GLUE is not particularly efficient in finding behavioral simulations. This becomes especially problematic for high-dimensional parameter estimation problems, and in the case of complex simulation models...
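
    The core GLUE loop is simple to state: draw parameter sets from the prior, keep those whose informal likelihood measure exceeds a behavioural threshold, and weight predictions accordingly. A minimal sketch of the basic (non-adaptive) sampler, with a hypothetical model interface and the Nash-Sutcliffe efficiency as the likelihood measure:

        import numpy as np

        def glue(model, obs, prior_sampler, n_draws=10000, threshold=0.7, seed=0):
            """Basic GLUE: retain and weight behavioural simulations.

            model        : theta -> simulated series (same length as obs)
            prior_sampler: rng -> one parameter vector theta
            """
            rng = np.random.default_rng(seed)
            var_obs = np.var(obs)
            kept, weights = [], []
            for _ in range(n_draws):
                theta = prior_sampler(rng)
                sim = model(theta)
                nse = 1.0 - np.mean((sim - obs) ** 2) / var_obs
                if nse > threshold:  # behavioural simulation
                    kept.append(theta)
                    weights.append(nse)
            w = np.array(weights)
            return np.array(kept), w / w.sum()

    The record's point is precisely that this brute-force prior sampling finds few behavioural sets in high dimensions, which motivates replacing it with an adaptive MCMC search of the behavioural space.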

  20. THE IMPORTANCE OF THE MAGNETIC FIELD FROM AN SMA-CSO-COMBINED SAMPLE OF STAR-FORMING REGIONS

    Koch, Patrick M.; Tang, Ya-Wen; Ho, Paul T. P.; Chen, Huei-Ru Vivien; Liu, Hau-Yu Baobab; Yen, Hsi-Wei; Lai, Shih-Ping [Academia Sinica, Institute of Astronomy and Astrophysics, Taipei, Taiwan (China); Zhang, Qizhou; Chen, How-Huan; Ching, Tao-Chung [Harvard-Smithsonian Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States); Girart, Josep M. [Institut de Ciències de l' Espai, CSIC-IEEC, Campus UAB, Facultat de Ciències, C5p 2, 08193 Bellaterra, Catalonia (Spain); Frau, Pau [Observatorio Astronómico Nacional, Alfonso XII, 3 E-28014 Madrid (Spain); Li, Hua-Bai [Department of Physics, The Chinese University of Hong Kong (Hong Kong); Li, Zhi-Yun [Department of Astronomy, University of Virginia, P.O. Box 400325, Charlottesville, VA 22904 (United States); Padovani, Marco [Laboratoire Univers et Particules de Montpellier, UMR 5299 du CNRS, Université de Montpellier II, place E. Bataillon, cc072, F-34095 Montpellier (France); Qiu, Keping [School of Astronomy and Space Science, Nanjing University, 22 Hankou Road, Nanjiing 210093 (China); Rao, Ramprasad, E-mail: pmkoch@asiaa.sinica.edu.tw [Academia Sinica, Institute of Astronomy and Astrophysics, 645 N. Aohoku Place, Hilo, HI 96720 (United States)

    2014-12-20

    Submillimeter dust polarization measurements of a sample of 50 star-forming regions, observed with the Submillimeter Array (SMA) and the Caltech Submillimeter Observatory (CSO) covering parsec-scale clouds to milliparsec-scale cores, are analyzed in order to quantify the magnetic field importance. The magnetic field misalignment δ—the local angle between magnetic field and dust emission gradient—is found to be a prime observable, revealing distinct distributions for sources where the magnetic field is preferentially aligned with or perpendicular to the source minor axis. Source-averaged misalignment angles ⟨|δ|⟩ fall into systematically different ranges, reflecting the different source-magnetic field configurations. Possible bimodal ⟨|δ|⟩ distributions are found for the separate SMA and CSO samples. Combining both samples broadens the distribution with a wide maximum peak at small ⟨|δ|⟩ values. Assuming the 50 sources to be representative, the prevailing source-magnetic field configuration is one that statistically prefers small magnetic field misalignments |δ|. When interpreting |δ| together with a magnetohydrodynamics force equation, as developed in the framework of the polarization-intensity gradient method, a sample-based log-linear scaling fits the magnetic field tension-to-gravity force ratio Σ_B versus ⟨|δ|⟩ with ⟨Σ_B⟩ = 0.116 · exp(0.047 · ⟨|δ|⟩) ± 0.20 (mean error), providing a way to estimate the relative importance of the magnetic field based only on measurable field misalignments |δ|. The force ratio Σ_B discriminates systems that are collapsible on average (⟨Σ_B⟩ < 1) from other molecular clouds where the magnetic field still provides enough resistance against gravitational collapse (⟨Σ_B⟩ > 1). The sample-wide trend shows a transition around ⟨|δ|⟩ ≈ 45°. Defining an effective gravitational force ∼1 − ⟨Σ_B⟩, the average magnetic-field-reduced star formation efficiency is at least a
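
    The quoted sample-based scaling is directly computable; a small sketch using the fitted constants from the abstract (δ in degrees):

        import math

        def sigma_B(delta_abs_deg):
            """Field tension-to-gravity force ratio from the sample-based fit
            <Sigma_B> = 0.116 * exp(0.047 * <|delta|>); values below 1 indicate
            systems that are on average collapsible."""
            return 0.116 * math.exp(0.047 * delta_abs_deg)

        print(sigma_B(45.0))  # ~0.96, consistent with the transition near 45 deg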

  1. A combined Importance Sampling and Kriging reliability method for small failure probabilities with time-demanding numerical models

    Applying reliability methods to a complex structure is often delicate for two main reasons. First, such a structure is fortunately designed with codified rules leading to a large safety margin, which means that failure is a small probability event. Such a probability level is difficult to assess efficiently. Second, the structure's mechanical behaviour is modelled numerically in an attempt to reproduce the real response, and the numerical model tends to become more and more time-demanding as its complexity is increased to improve accuracy and to account for particular mechanical behaviour. As a consequence, performing a large number of model computations cannot be considered as a way to assess the failure probability. To overcome these issues, this paper proposes an original and easily implementable method called AK-IS, for active learning and Kriging-based Importance Sampling. This new method is based on the AK-MCS algorithm previously published by Echard et al. [AK-MCS: an active learning reliability method combining Kriging and Monte Carlo simulation. Structural Safety 2011;33(2):145–54]. It associates the Kriging metamodel and its advantageous stochastic property with the Importance Sampling method to assess small failure probabilities. It enables the correction or validation of the FORM approximation with only very few mechanical model computations. The efficiency of the method is first proved on two academic applications. It is then applied to assess the reliability of a challenging aerospace case study subject to fatigue.
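
    In outline, AK-IS centres the importance sampling population at the FORM design point and lets a Kriging surrogate classify those samples, calling the expensive performance function only where the surrogate is still uncertain. A schematic sketch (illustrative assumptions throughout, using scikit-learn's Gaussian process as the Kriging model):

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor

        def ak_is(g, u_star, n_is=5000, n_init=12, n_max=100, seed=0):
            """Sketch of Kriging-assisted importance sampling around a design point.

            g      : performance function in standard normal space (failure: g <= 0)
            u_star : FORM design point; the IS density is N(u_star, I)
            """
            rng = np.random.default_rng(seed)
            U = u_star + rng.standard_normal((n_is, len(u_star)))   # IS population
            w = np.exp(0.5 * u_star @ u_star - U @ u_star)          # N(0,I)/N(u*,I)
            idx = rng.choice(n_is, n_init, replace=False)           # initial design
            X, y = U[idx], np.array([g(u) for u in U[idx]])
            gp = GaussianProcessRegressor(normalize_y=True)
            while len(y) < n_max:
                gp.fit(X, y)
                mu, sd = gp.predict(U, return_std=True)
                learn = np.abs(mu) / np.maximum(sd, 1e-12)          # 'U' learning function
                i = int(np.argmin(learn))
                if learn[i] >= 2.0:        # all failure signs classified confidently
                    break
                X = np.vstack([X, U[i]])
                y = np.append(y, g(U[i]))
            return float(np.mean((mu <= 0) * w))                    # failure probability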

  2. Using adaptive sampling and triangular meshes for the processing and inversion of potential field data

    Foks, Nathan Leon

    The interpretation of geophysical data plays an important role in the analysis of potential field data in resource exploration industries. Two categories of interpretation techniques are discussed in this thesis: boundary detection and geophysical inversion. Fault or boundary detection is a method to interpret the locations of subsurface boundaries from measured data, while inversion is a computationally intensive method that provides 3D information about subsurface structure. My research focuses on these two aspects of interpretation techniques. First, I develop a method to aid in the interpretation of faults and boundaries from magnetic data. These processes are traditionally carried out using raster grids and image processing techniques. Instead, I use unstructured meshes of triangular facets that can extract inferred boundaries using mesh edges. Next, to address the computational issues of geophysical inversion, I develop an approach to reduce the number of data in a data set. The approach selects data points according to a user-specified proxy for signal content. The approach is performed in the data domain and requires no modification to existing inversion codes. This technique adds to the existing suite of compressive inversion algorithms. Finally, I develop an algorithm to invert gravity data for an interfacing surface using an unstructured mesh of triangular facets. A pertinent property of unstructured meshes is their flexibility at representing oblique, or arbitrarily oriented, structures. This flexibility makes unstructured meshes an ideal candidate for geometry-based interface inversions. The approaches I have developed provide a suite of algorithms geared towards large-scale interpretation of potential field data, using an unstructured representation of both the data and the model parameters.

  3. Performances of a bent-crystal spectrometer adapted to resonant x-ray emission measurements on gas-phase samples

    We describe a bent-crystal spectrometer adapted to measure x-ray emission resulting from core-level excitation of gas-phase molecules in the 0.8-8 keV energy range. The spectrometer is based on the Johann principle and uses a microfocused photon beam to provide high resolution (resolving power of ∼7500). A gas cell was designed to hold a high-pressure (300 mbar) sample of gas while maintaining a high vacuum (10⁻⁹ mbar) in the chamber. The cell was designed to optimize the counting rate (2000 cts/s at the maximum of the Cl Kα emission line) while minimizing self-absorption. An example of the Kα emission lines of CH3Cl molecules is presented to illustrate the capabilities of this new instrument.

  4. Predicting the impacts of climate change on animal distributions: the importance of local adaptation and species' traits

    HELLMANN, J. J.; LOBO, N. F.

    2011-12-20

    The geographic range limits of many species are strongly affected by climate and are expected to change under global warming. For species that are able to track changing climate over broad geographic areas, we expect to see shifts in species distributions toward the poles and away from the equator. A number of ecological and evolutionary factors, however, could restrict this shifting or redistribution under climate change. These factors include restricted habitat availability, restricted capacity for or barriers to movement, or reduced abundance of colonists due to the perturbation effect of climate change. This research project examined the last of these constraints - that climate change could perturb local conditions to which populations are adapted, reducing the likelihood that a species will shift its distribution by diminishing the number of potential colonists. In the most extreme cases, species ranges could collapse over a broad geographic area with no poleward migration and an increased risk of species extinction. Changes in individual species ranges are the processes that drive larger phenomena such as changes in land cover, ecosystem type, and even changes in carbon cycling. For example, consider the poleward range shift and population outbreaks of the mountain pine beetle that has decimated millions of acres of pine trees in the western US and Canada. Standing dead trees cause forest fires and release vast quantities of carbon to the atmosphere. The beetle likely shifted its range because it is not locally adapted across its range, and it appears to be limited by winter low temperatures that have steadily increased in recent decades. To understand range and abundance changes like those of the pine beetle, we must reveal the extent of adaptive variation across species ranges - and the physiological basis of that adaptation - to know if other species will change as readily as the pine beetle. Ecologists tend to assume that range shifts are the dominant

  5. Exploring equivalence domain in nonlinear inverse problems using Covariance Matrix Adaption Evolution Strategy (CMAES) and random sampling

    Grayver, Alexander V.; Kuvshinov, Alexey V.

    2016-05-01

    This paper presents a methodology to sample equivalence domain (ED) in nonlinear partial differential equation (PDE)-constrained inverse problems. For this purpose, we first applied state-of-the-art stochastic optimization algorithm called Covariance Matrix Adaptation Evolution Strategy (CMAES) to identify low-misfit regions of the model space. These regions were then randomly sampled to create an ensemble of equivalent models and quantify uncertainty. CMAES is aimed at exploring model space globally and is robust on very ill-conditioned problems. We show that the number of iterations required to converge grows at a moderate rate with respect to number of unknowns and the algorithm is embarrassingly parallel. We formulated the problem by using the generalized Gaussian distribution. This enabled us to seamlessly use arbitrary norms for residual and regularization terms. We show that various regularization norms facilitate studying different classes of equivalent solutions. We further show how performance of the standard Metropolis-Hastings Markov chain Monte Carlo algorithm can be substantially improved by using information CMAES provides. This methodology was tested by using individual and joint inversions of magnetotelluric, controlled-source electromagnetic (EM) and global EM induction data.
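
    The two-stage recipe in this record (global stochastic search for low-misfit regions, followed by random sampling of equivalent models) can be sketched with the widely used cma package; this is an assumed interface for illustration, not the authors' implementation:

        import numpy as np
        import cma

        def sample_equivalence_domain(misfit, x0, sigma0=0.5, n_samples=1000,
                                      tol=1.1, seed=0):
            """Locate low-misfit models with CMA-ES, then sample equivalent ones.

            misfit : model vector -> scalar data misfit
            tol    : models within tol * best misfit count as equivalent (assumed)
            """
            es = cma.CMAEvolutionStrategy(x0, sigma0, {'seed': seed, 'verbose': -9})
            while not es.stop():
                xs = es.ask()                          # propose a population
                es.tell(xs, [misfit(x) for x in xs])   # rank by misfit
            best, best_f = es.result.xbest, es.result.fbest
            # Random sampling around the optimum, spread guided by the search.
            rng = np.random.default_rng(seed)
            spread = es.result.stds
            ensemble = []
            while len(ensemble) < n_samples:
                x = best + rng.standard_normal(len(best)) * spread
                if misfit(x) <= tol * best_f:
                    ensemble.append(x)
            return np.array(ensemble)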

  6. Exploring equivalence domain in non-linear inverse problems using Covariance Matrix Adaption Evolution Strategy (CMAES) and random sampling

    Grayver, Alexander V.; Kuvshinov, Alexey V.

    2016-02-01

    This paper presents a methodology to sample equivalence domain (ED) in non-linear PDE-constrained inverse problems. For this purpose, we first applied state-of-the-art stochastic optimization algorithm called Covariance Matrix Adaptation Evolution Strategy (CMAES) to identify low misfit regions of the model space. These regions were then randomly sampled to create an ensemble of equivalent models and quantify uncertainty. CMAES is aimed at exploring model space globally and is robust on very ill-conditioned problems. We show that the number of iterations required to converge grows at a moderate rate with respect to number of unknowns and the algorithm is embarrassingly parallel. We formulated the problem by using the generalized Gaussian distribution. This enabled us to seamlessly use arbitrary norms for residual and regularization terms. We show that various regularization norms facilitate studying different classes of equivalent solutions. We further show how performance of the standard Metropolis-Hastings Markov chain Monte Carlo (MCMC) algorithm can be substantially improved by using information CMAES provides. This methodology was tested by using individual and joint inversions of Magnetotelluric, Controlled-source Electromagnetic (EM) and Global EM induction data.

  7. HOW TO ESTIMATE THE AMOUNT OF IMPORTANT CHARACTERISTICS MISSING IN A CONSUMERS SAMPLE BY USING BAYESIAN ESTIMATORS

    Sueli A. Mingoti

    2001-06-01

    Full Text Available Consumer surveys are conducted very often by many companies with the main objective of obtaining information about the opinions the consumers have about a specific prototype, product or service. In many situations the goal is to identify the characteristics that are considered important by the consumers when taking the decision of buying or using the products or services. When the survey is performed, some characteristics that are present in the consumer population might not be reported by those consumers in the observed sample. Therefore, some important characteristics of the product according to the consumers' opinions could be missing in the observed sample. The main objective of this paper is to show how the number of characteristics missing in the observed sample can be easily estimated by using the Bayesian estimators proposed by Mingoti & Meeden (1992) and Mingoti (1999). An example of application related to an automobile survey is presented.
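
    The underlying question (how much of the population's characteristic mass is missing from the observed sample) is the classic unseen-species problem. As a simple stand-in for the cited Bayesian estimators, which this sketch does not reproduce, a Good-Turing style estimate from reported-characteristic frequencies:

        from collections import Counter

        def estimate_missing_mass(mentions):
            """Good-Turing estimate of the probability mass of unseen characteristics.

            mentions: one entry per time a consumer reported a characteristic.
            Returns (coverage, unseen_mass).
            """
            counts = Counter(mentions)
            n = len(mentions)
            singletons = sum(1 for c in counts.values() if c == 1)
            unseen = singletons / n  # P(next mention is a new characteristic)
            return 1.0 - unseen, unseen

        cov, miss = estimate_missing_mass(
            ["price", "price", "safety", "design", "design", "design",
             "comfort", "fuel", "fuel", "brand"])
        print(f"coverage ~ {cov:.2f}, unseen mass ~ {miss:.2f}")  # 0.70, 0.30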

  8. Comparative chemical composition and antimicrobial activity study of essential oils from two imported lemon fruits samples against pathogenic bacteria

    Najwa Nasser AL-Jabri

    2014-12-01

    Full Text Available The aim of this work was to isolate and identify two essential oils by the hydro distillation method from two imported lemon fruit samples collected from a local supermarket, and to evaluate their antimicrobial activity against pathogenic bacteria through the disc diffusion method. The essential oil was obtained from Turkish and Indian lemon fruit samples by hydro distillation using a Clevenger-type apparatus. Both isolated essential oils were identified by GC–MS, and their in vitro antimicrobial activity against pathogenic bacteria was determined through the agar gel method. Twenty-two bioactive ingredients with different percentages were identified based on GC retention time from the Turkish and Indian lemons. The predominant bioactive ingredients with high percentages in the Turkish essential oil were dl-limonene (78.92%), α-pinene (5.08%), l-α-terpineol (4.61%), β-myrcene (1.75%), β-pinene (1.47%) and β-linalool (0.95%), and in the Indian essential oil were dl-limonene (53.57%), l-α-terpineol (15.15%), β-pinene (7.44%), α-terpinolene (4.33%), terpinen-4-ol (3.55%), cymene (2.88%) and E-citral (2.38%), respectively. Both isolated essential oils were used for the study of antimicrobial activity against four pathogenic bacterial strains: Staphylococcus aureus (S. aureus), Escherichia coli (E. coli), Pseudomonas aeruginosa (P. aeruginosa) and Proteus vulgaris (P. vulgaris). Almost none of the bacterial strains showed susceptibility to the employed essential oils at the tested concentrations. Therefore, the obtained results show that both essential oils need further extensive biological study of their mechanism of action.

  9. Comparison of sample preparation methods for the determination of essential and toxic elements in important indigenous medicinal plant Aloe barbadensis

    The role of elements, particularly trace elements, in health and disease is now well established. In this paper we investigate the presence of various elements in the very important herb Aloe barbadensis, which is commonly used for different ailments, especially of the alimentary tract. We used four extraction methods for the determination of total elements in Aloe barbadensis. The procedure found to be most efficient at decomposing the biological material uses nitric acid and 30% hydrogen peroxide, as compared to the other methods. Plant samples were collected from the surroundings of Hyderabad and Sindh University, and voucher specimens were prepared following standard herbarium techniques. Fifteen essential, trace and toxic elements, such as Zn, Cr, K, Mg, Ca, Na, Fe, Pb, Al, Ba, Mn, Co, Ni and Cd, were determined in the plant and in its decoction using a Flame Atomic Absorption Spectrophotometer (Hitachi Model 180-50). The level of essential elements was found to be high compared to the level of toxic elements. (author)

  10. Methodological Adaptations for Investigating the Perceptions of Language-Impaired Adolescents Regarding the Relative Importance of Selected Communication Skills

    Reed, Vicki A.; Brammall, Helen

    2006-01-01

    This article describes the systematic and detailed processes undertaken to modify a research methodology for use with language-impaired adolescents. The original methodology had been used previously with normally achieving adolescents and speech pathologists to obtain their opinions about the relative importance of selected communication skills…

  11. Glove box adaptation of a high resolution ICP emission spectrometer and its operating experience for analysis of radioactive samples

    ICP-AES units are commercially available from many well-established companies. These units are compact in design and are not suitable for glove box adaptation. To meet our divisional requirement for an ICP-AES incorporated in a glove box for the analysis of radioactive materials, it was decided to keep all electronic and optical components outside the radioactive containment and to place the entire assembly of ICP torch, r.f. coil, nebulizer, spray chamber, peristaltic pump and drainage system inside the glove box. At the same time, it was essential to maintain the analytical performance of the spectrometer. From its ore to nuclear fuel to reprocessing and disposal, uranium undergoes several different transformations within the nuclear fuel cycle, including concentration, purification, isotope enrichment and metallurgical processing, as well as the recovery of the precious element plutonium (Pu). The determination of impurities in uranium/plutonium at various stages of the nuclear fuel cycle plays an important role in quality control and in meeting chemical and metallurgical requirements.

  12. Importance of Mobile Genetic Elements and Conjugal Gene Transfer for Subsurface Microbial Community Adaptation to Biotransformation of Metals

    Soils used in the present DOE project were obtained from the Field Research Center (FRC) through correspondence with FRC Manager David Watson. We obtained a total of six soils sampled at different distances from the surface: (A) Non-contaminated surface soil from Hinds Creek Floodplain (0 mbs (meter below surface)). (B) Mercury-contaminated surface soil from Lower East Fork Poplar Creek Floodplain (0 mbs). (C) Mercury-contaminated subsurface soil from Lower East Fork Poplar Creek Floodplain (0.5 mbs). (D) Mercury-contaminated subsurface soil from Lower East Fork Poplar Creek Floodplain (1.0 mbs). (E) Non-contaminated surface soil from Ish Creek Floodplain (0 mbs). (F) Non-contaminated surface soil from Ish Creek Floodplain (0.5 mbs)

  13. Neuronal Hypoxia Induces Hsp40-Mediated Nuclear Import of Type 3 Deiodinase As an Adaptive Mechanism to Reduce Cellular Metabolism

    Jo, S; Kallo, I.; Bardoczi, Z.; Arrojo e Drigo, R.; Zeold, A.; Liposits, Z.; Oliva, A.; Lemmon, V.P.; Bixby, J. L.; Gereben, B.; A.C. Bianco

    2012-01-01

    In neurons, the type 3 deiodinase (D3) inactivates thyroid hormone and reduces oxygen consumption, thus creating a state of cell-specific hypothyroidism. Here we show that hypoxia leads to nuclear import of D3 in neurons, without which thyroid hormone signaling and metabolism cannot be reduced. After unilateral hypoxia in the rat brain, D3 protein level is increased predominantly in the nucleus of the neurons in the pyramidal and granular ipsilateral layers, as well as in the hilus of the den...

  14. 76 FR 65165 - Importation of Plants for Planting; Risk-Based Sampling and Inspection Approach and Propagative...

    2011-10-20

    ..., to be planted or replanted. The definition of plant in that section includes any plant (including any... October 17, 2011. The risk-based sampling will be implemented following further analysis of the sampling... planting infested with quarantine pests do not enter the United States, while providing a...

  15. A Keck Adaptive Optics Survey of a Representative Sample of Gravitationally-Lensed Star-Forming Galaxies: High Spatial Resolution Studies of Kinematics and Metallicity Gradients

    Leethochawalit, Nicha; Ellis, Richard S; Stark, Daniel P; Richard, Johan; Zitrin, Adi; Auger, Matthew

    2015-01-01

    We discuss spatially resolved emission line spectroscopy secured for a total sample of 15 gravitationally lensed star-forming galaxies at a mean redshift of z ≃ 2 based on Keck laser-assisted adaptive optics observations undertaken with the recently-improved OSIRIS integral field unit (IFU) spectrograph. By exploiting gravitationally lensed sources drawn primarily from the CASSOWARY survey, we sample these sub-L* galaxies with source-plane resolutions of a few hundred parsecs ensuring well-sampled 2-D velocity data and resolved variations in the gas-phase metallicity. Such high spatial resolution data offers a critical check on the structural properties of larger samples derived with coarser sampling using multiple-IFU instruments. We demonstrate how serious errors of interpretation can only be revealed through better sampling. Although we include four sources from our earlier work, the present study provides a more representative sample unbiased with respect to emission line strength. Contrary t...

  16. Swarm-founding in the polistine wasps: the importance of finding many microsatellite loci in studies of adaptation.

    Henshaw, M T; Strassmann, J E; Queller, D C

    2001-01-01

    We developed 52 microsatellite loci for the wasp, Polybioides tabidus, for the purpose of studying the evolution and inclusive fitness consequences of swarm-founding. The large number of loci is important for three reasons that may apply to many other systems. Heterozygosity was low in our target species, yet we found enough polymorphic loci for accurate kinship studies in this species. Many monomorphic loci were polymorphic in other polistine wasps, making comparative studies possible. Finally, enough loci amplified over a broad range of species to add a historical dimension. We sequenced six loci in other polistine wasps and used the flanking sequences to construct a phylogeny. Based on this phylogeny, we infer that swarm-founding has evolved independently three times in the polistine wasps. PMID:11251797

  17. Perceptions of Australian marine protected area managers regarding the role, importance, and achievability of adaptation for managing the risks of climate change

    Christopher Cvitanovic

    2014-12-01

    Full Text Available The rapid development of adaptation as a mainstream strategy for managing the risks of climate change has led to the emergence of a broad range of adaptation policies and management strategies globally. However, the success of such policies or management interventions depends on the effective integration of new scientific research into the decision-making process. Ineffective communication between scientists and environmental decision makers represents one of the key barriers limiting the integration of science into the decision-making process in many areas of natural resource management. This can be overcome by understanding the perceptions of end users, so as to identify knowledge gaps and develop improved and targeted strategies for communication and engagement. We assessed what one group of environmental decision makers, Australian marine protected area (MPA managers, viewed as the major risks associated with climate change, and their perceptions regarding the role, importance, and achievability of adaptation for managing these risks. We also assessed what these managers perceived as the role of science in managing the risks from climate change, and identified the factors that increased their trust in scientific information. We do so by quantitatively surveying 30 MPA managers across 3 Australian management agencies. We found that although MPA managers have a very strong awareness of the range and severity of risks posed by climate change, their understanding of adaptation as an option for managing these risks is less comprehensive. We also found that although MPA managers view science as a critical source of information for informing the decision-making process, it should be considered in context with other knowledge types such as community and cultural knowledge, and be impartial, evidence based, and pragmatic in outlining policy and management recommendations that are realistically achievable.

  18. A quasi-exclusive European ancestry in the Senepol tropical cattle breed highlights the importance of the slick locus in tropical adaptation.

    Laurence Flori

    Full Text Available BACKGROUND: The Senepol cattle breed (SEN) was created in the early 20th century from a presumed cross between a European taurine (EUT) breed (Red Poll) and a West African taurine (AFT) breed (N'Dama). Well adapted to tropical conditions, it is also believed to be trypanotolerant according to its putative AFT ancestry. However, such origins needed to be verified to define relevant husbandry practices, and the genetic background underlying such adaptation needed to be characterized. METHODOLOGY/PRINCIPAL FINDINGS: We genotyped 153 SEN individuals on 47,365 SNPs and combined the resulting data with those available on 18 other populations representative of EUT, AFT and Zebu (ZEB) cattle. We found on average 89% EUT, 10.4% ZEB and 0.6% AFT ancestries in the SEN genome. We further looked for footprints of recent selection using standard tests based on the extent of haplotype homozygosity. We underlined (i) three footprints on chromosome (BTA) 01, two of which are within or close to the polled locus underlying the absence of horns, and (ii) one footprint on BTA20 within the slick hair coat locus, involved in thermotolerance. Annotation of these regions allowed us to propose three candidate genes to explain the observed signals (TIAM1, GRIK1 and RAI14). CONCLUSIONS/SIGNIFICANCE: Our results do not support the accepted view of the AFT origin of the SEN breed. Initial AFT ancestry (if any) might have been counter-selected in early generations due to breeding objectives oriented in particular toward meat production and a hornless phenotype. Therefore, SEN animals are likely susceptible to African trypanosomes, which calls into question the importation of SEN within the West African tsetse belt, as promoted by some breeding societies. Besides, our results reveal that SEN is predominantly a EUT breed well adapted to tropical conditions and confirm the importance of the slick locus in thermotolerance.

  19. Climate impacts on European agriculture and water management in the context of adaptation and mitigation-The importance of an integrated approach

    We review and qualitatively assess the importance of interactions and feedbacks in assessing climate change impacts on water and agriculture in Europe. We focus particularly on the impact of future hydrological changes on agricultural greenhouse gas (GHG) mitigation and adaptation options. Future projected trends in European agriculture include northward movement of crop suitability zones and increasing crop productivity in Northern Europe, but declining productivity and suitability in Southern Europe. This may be accompanied by a widening of water resource differences between the North and South, and an increase in extreme rainfall events and droughts. Changes in future hydrology and water management practices will influence agricultural adaptation measures and alter the effectiveness of agricultural mitigation strategies. These interactions are often highly complex and influenced by a number of factors which are themselves influenced by climate. Mainly positive impacts may be anticipated for Northern Europe, where agricultural adaptation may be shaped by reduced vulnerability of production, increased water supply and reduced water demand. However, increasing flood hazards may present challenges for agriculture, and summer irrigation shortages may result from earlier spring runoff peaks in some regions. Conversely, the need for effective adaptation will be greatest in Southern Europe as a result of increased production vulnerability, reduced water supply and increased demands for irrigation. Increasing flood and drought risks will further contribute to the need for robust management practices. The impacts of future hydrological changes on agricultural mitigation in Europe will depend on the balance between changes in productivity and rates of decomposition and GHG emission, both of which depend on climatic, land and management factors. Small increases in European soil organic carbon (SOC) stocks per unit land area are anticipated considering changes in climate

  20. Bottom–up protein identifications from microliter quantities of individual human tear samples. Important steps towards clinical relevance.

    Peter Raus

    2015-12-01

    With 375 confidently identified proteins in the healthy adult tear, the obtained results are comprehensive and in broad agreement with previously published observations on pooled samples from multiple patients. We conclude that, to a limited extent, bottom–up tear protein identifications from individual patients may have clinical relevance.

  1. Liver kinetics of glucose analogs measured in pigs by PET: importance of dual-input blood sampling

    Munk, O L; Bass, L; Roelsgaard, K; Bender, D; Hansen, S B; Keiding, S

    2001-01-01

    Metabolic processes studied by PET are quantified traditionally using compartmental models, which relate the time course of the tracer concentration in tissue to that in arterial blood. For liver studies, the use of arterial input may, however, cause systematic errors to the estimated kinetic parameters, because of ignorance of the dual blood supply from the hepatic artery and the portal vein to the liver. METHODS: Six pigs underwent PET after [15O]carbon monoxide inhalation, 3-O-[11C]methylglucose (MG) injection, and [18F]FDG injection. For the glucose scans, PET data were acquired for 90 min. Hepatic arterial and portal venous blood samples and flows were measured during the scan. The dual-input function was calculated as the flow-weighted input. RESULTS: For both MG and FDG, the compartmental analysis using arterial input led to systematic underestimation of the rate constants for rapid blood...
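
    The flow-weighted dual-input function mentioned above has a simple closed form. A sketch with hypothetical variable names, assuming measured concentration time-courses and flows:

        import numpy as np

        def dual_input(c_art, c_pv, f_art, f_pv):
            """Flow-weighted dual-input function for liver kinetic modelling.

            c_art, c_pv: tracer concentrations in the hepatic artery and portal
                         vein on a common time grid
            f_art, f_pv: hepatic arterial and portal venous blood flows
            """
            c_art, c_pv = np.asarray(c_art), np.asarray(c_pv)
            return (f_art * c_art + f_pv * c_pv) / (f_art + f_pv)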

  2. Mapping transmission risk of Lassa fever in West Africa: the importance of quality control, sampling bias, and error weighting.

    A Townsend Peterson

    Full Text Available Lassa fever is a disease that has been reported from sites across West Africa; it is caused by an arenavirus that is hosted by the rodent M. natalensis. Although it is confined to West Africa, and has been documented in detail in some well-studied areas, the details of the distribution of risk of Lassa virus infection remain poorly known at the level of the broader region. In this paper, we explored the effects of certainty of diagnosis, oversampling in well-studied regions, and error balance on the results of mapping exercises. Each of the three factors assessed in this study had clear and consistent influences on model results, overestimating risk in southern, humid zones in West Africa, and underestimating risk in drier and more northern areas. The final, adjusted risk map indicates broad risk areas across much of West Africa. Although risk maps are increasingly easy to develop from disease occurrence data and raster data sets summarizing aspects of environments and landscapes, this process is highly sensitive to issues of data quality, sampling design, and design of analysis, with macrogeographic implications of each of these issues and the potential for misrepresenting real patterns of risk.

  3. Mapping transmission risk of Lassa fever in West Africa: the importance of quality control, sampling bias, and error weighting.

    Peterson, A Townsend; Moses, Lina M; Bausch, Daniel G

    2014-01-01

    Lassa fever is a disease that has been reported from sites across West Africa; it is caused by an arenavirus that is hosted by the rodent M. natalensis. Although it is confined to West Africa, and has been documented in detail in some well-studied areas, the details of the distribution of risk of Lassa virus infection remain poorly known at the level of the broader region. In this paper, we explored the effects of certainty of diagnosis, oversampling in well-studied regions, and error balance on the results of mapping exercises. Each of the three factors assessed in this study had clear and consistent influences on model results, overestimating risk in southern, humid zones in West Africa, and underestimating risk in drier and more northern areas. The final, adjusted risk map indicates broad risk areas across much of West Africa. Although risk maps are increasingly easy to develop from disease occurrence data and raster data sets summarizing aspects of environments and landscapes, this process is highly sensitive to issues of data quality, sampling design, and design of analysis, with macrogeographic implications of each of these issues and the potential for misrepresenting real patterns of risk. PMID:25105746

  4. Mapping Transmission Risk of Lassa Fever in West Africa: The Importance of Quality Control, Sampling Bias, and Error Weighting

    Peterson, A. Townsend; Moses, Lina M.; Bausch, Daniel G.

    2014-01-01

    Lassa fever is a disease that has been reported from sites across West Africa; it is caused by an arenavirus that is hosted by the rodent M. natalensis. Although it is confined to West Africa, and has been documented in detail in some well-studied areas, the details of the distribution of risk of Lassa virus infection remain poorly known at the level of the broader region. In this paper, we explored the effects of certainty of diagnosis, oversampling in well-studied regions, and error balance on the results of mapping exercises. Each of the three factors assessed in this study had clear and consistent influences on model results, overestimating risk in southern, humid zones in West Africa, and underestimating risk in drier and more northern areas. The final, adjusted risk map indicates broad risk areas across much of West Africa. Although risk maps are increasingly easy to develop from disease occurrence data and raster data sets summarizing aspects of environments and landscapes, this process is highly sensitive to issues of data quality, sampling design, and design of analysis, with macrogeographic implications of each of these issues and the potential for misrepresenting real patterns of risk. PMID:25105746

  5. Who art thou? Personality predictors of artistic preferences in a large UK sample: the importance of openness.

    Chamorro-Premuzic, Tomas; Reimers, Stian; Hsu, Anne; Ahmetoglu, Gorkan

    2009-08-01

    The present study examined individual differences in artistic preferences in a sample of 91,692 participants (60% women and 40% men), aged 13-90 years. Participants completed a Big Five personality inventory (Goldberg, 1999) and provided preference ratings for 24 different paintings corresponding to cubism, renaissance, impressionism, and Japanese art, which loaded on to a latent factor of overall art preferences. As expected, the personality trait openness to experience was the strongest and only consistent personality correlate of artistic preferences, affecting both overall and specific preferences, as well as visits to galleries, and artistic (rather than scientific) self-perception. Overall preferences were also positively influenced by age and visits to art galleries, and to a lesser degree, by artistic self-perception and conscientiousness (negatively). As for specific styles, after overall preferences were accounted for, more agreeable, more conscientious and less open individuals reported higher preference levels for impressionism; younger and more extraverted participants showed higher levels of preference for cubism (as did males); and younger participants, as well as males, reported higher levels of preference for renaissance. Limitations and recommendations for future research are discussed. PMID:19026107

  6. Sonochemical degradation of ethyl paraben in environmental samples: Statistically important parameters determining kinetics, by-products and pathways.

    Papadopoulos, Costas; Frontistis, Zacharias; Antonopoulou, Maria; Venieri, Danae; Konstantinou, Ioannis; Mantzavinos, Dionissios

    2016-07-01

    The sonochemical degradation of ethyl paraben (EP), a representative of the parabens family, was investigated. Experiments were conducted at a constant ultrasound frequency of 20 kHz and liquid bulk temperature of 30°C in the following range of experimental conditions: EP concentration 250-1250 μg/L, ultrasound (US) density 20-60 W/L, reaction time up to 120 min, initial pH 3-8 and sodium persulfate 0-100 mg/L, either in ultrapure water or secondary treated wastewater. A factorial design methodology was adopted to elucidate the statistically important effects and their interactions, and a full empirical model comprising seventeen terms was originally developed. Omitting several terms of lower significance, a reduced model that can reliably simulate the process was finally proposed; this includes EP concentration, reaction time, power density and initial pH, as well as the interactions (EP concentration)×(US density), (EP concentration)×(pHo) and (EP concentration)×(time). Experiments at an increased EP concentration of 3.5 mg/L were also performed to identify degradation by-products. LC-TOF-MS analysis revealed that EP sonochemical degradation occurs through dealkylation of the ethyl chain to form methyl paraben, while successive hydroxylation of the aromatic ring yields 4-hydroxybenzoic, 2,4-dihydroxybenzoic and 3,4-dihydroxybenzoic acids. By-products are less toxic to the bacterium V. fischeri than the parent compound. PMID:26964924
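
    A reduced empirical model of the kind described (main effects plus the retained two-factor interactions, fitted from a factorial design) can be written down generically. A least-squares sketch over coded factor levels; the variable names are assumptions, not the authors' coefficients:

        import numpy as np

        def fit_reduced_model(X, y):
            """Fit y ~ x1 + x2 + x3 + x4 + x1:x3 + x1:x4 + x1:x2 by least squares.

            X columns (coded -1/+1 levels): x1 = EP concentration, x2 = reaction
            time, x3 = US density, x4 = initial pH.
            """
            x1, x2, x3, x4 = X.T
            A = np.column_stack([np.ones_like(x1), x1, x2, x3, x4,
                                 x1 * x3, x1 * x4, x1 * x2])
            coef, *_ = np.linalg.lstsq(A, y, rcond=None)
            return coef  # intercept, main effects, interaction effects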

  7. Virulence Characterisation of Salmonella enterica Isolates of Differing Antimicrobial Resistance Recovered from UK Livestock and Imported Meat Samples

    Card, Roderick; Vaughan, Kelly; Bagnall, Mary; Spiropoulos, John; Cooley, William; Strickland, Tony; Davies, Rob; Anjum, Muna F.

    2016-01-01

    Salmonella enterica is a foodborne zoonotic pathogen of significant public health concern. We have characterized the virulence and antimicrobial resistance gene content of 95 Salmonella isolates from 11 serovars by DNA microarray recovered from UK livestock or imported meat. Genes encoding resistance to sulphonamides (sul1, sul2), tetracycline [tet(A), tet(B)], streptomycin (strA, strB), aminoglycoside (aadA1, aadA2), beta-lactam (blaTEM), and trimethoprim (dfrA17) were common. Virulence gene content differed between serovars; S. Typhimurium formed two subclades based on virulence plasmid presence. Thirteen isolates were selected by their virulence profile for pathotyping using the Galleria mellonella pathogenesis model. Infection with a chicken invasive S. Enteritidis or S. Gallinarum isolate, a multidrug resistant S. Kentucky, or a S. Typhimurium DT104 isolate resulted in high mortality of the larvae; notably presence of the virulence plasmid in S. Typhimurium was not associated with increased larvae mortality. Histopathological examination showed that infection caused severe damage to the Galleria gut structure. Enumeration of intracellular bacteria in the larvae 24 h post-infection showed increases of up to 7 log above the initial inoculum and transmission electron microscopy (TEM) showed bacterial replication in the haemolymph. TEM also revealed the presence of vacuoles containing bacteria in the haemocytes, similar to Salmonella containing vacuoles observed in mammalian macrophages; although there was no evidence from our work of bacterial replication within vacuoles. This work shows that microarrays can be used for rapid virulence genotyping of S. enterica and that the Galleria animal model replicates some aspects of Salmonella infection in mammals. These procedures can be used to help inform on the pathogenicity of isolates that may be antibiotic resistant and have scope to aid the assessment of their potential public and animal health risk. PMID:27199965

  8. A Keck Adaptive Optics Survey of a Representative Sample of Gravitationally Lensed Star-forming Galaxies: High Spatial Resolution Studies of Kinematics and Metallicity Gradients

    Leethochawalit, Nicha; Jones, Tucker A.; Ellis, Richard S.; Stark, Daniel P.; Richard, Johan; Zitrin, Adi; Auger, Matthew

    2016-04-01

    We discuss spatially resolved emission line spectroscopy secured for a total sample of 15 gravitationally lensed star-forming galaxies at a mean redshift of z ≃ 2 based on Keck laser-assisted adaptive optics observations undertaken with the recently improved OSIRIS integral field unit (IFU) spectrograph. By exploiting gravitationally lensed sources drawn primarily from the CASSOWARY survey, we sample these sub-L* galaxies with source-plane resolutions of a few hundred parsecs ensuring well-sampled 2D velocity data and resolved variations in the gas-phase metallicity. Such high spatial resolution data offer a critical check on the structural properties of larger samples derived with coarser sampling using multiple-IFU instruments. We demonstrate how kinematic complexities essential to understanding the maturity of an early star-forming galaxy can often only be revealed with better sampled data. Although we include four sources from our earlier work, the present study provides a more representative sample unbiased with respect to emission line strength. Contrary to earlier suggestions, our data indicate a more diverse range of kinematic and metal gradient behavior inconsistent with a simple picture of well-ordered rotation developing concurrently with established steep metal gradients in all but merging systems. Comparing our observations with the predictions of hydrodynamical simulations suggests that gas and metals have been mixed by outflows or other strong feedback processes, flattening the metal gradients in early star-forming galaxies.

  9. FACE Analysis as a Fast and Reliable Methodology to Monitor the Sulfation and Total Amount of Chondroitin Sulfate in Biological Samples of Clinical Importance

    Evgenia Karousou

    2014-06-01

    Glycosaminoglycans (GAGs), due to their hydrophilic character and high anionic charge densities, play important roles in various (patho)physiological processes. The identification and quantification of GAGs in biological samples and tissues could be useful prognostic and diagnostic tools in pathological conditions. Despite the noteworthy progress in the development of sensitive and accurate methodologies for the determination of GAGs, there is a significant lack of methodologies regarding sample preparation and reliable fast analysis methods enabling the simultaneous analysis of several biological samples. In this report, developed protocols for the isolation of GAGs in biological samples were applied to analyze various sulfated chondroitin sulfate- and hyaluronan-derived disaccharides using fluorophore-assisted carbohydrate electrophoresis (FACE). Applications to biological samples of clinical importance include blood serum, lens capsule tissue and urine. The sample preparation protocol followed by FACE analysis allows quantification with an optimal linearity over the concentration range 1.0-220.0 µg/mL, affording a limit of quantitation of 50 ng of disaccharides. Validation of the FACE results was performed by capillary electrophoresis and high performance liquid chromatography techniques.

  10. Adaptation of the Participant Role Scale (PRS) in a Spanish youth sample: measurement invariance across gender and relationship with sociometric status.

    Lucas-Molina, Beatriz; Williamson, Ariel A; Pulido, Rosa; Calderón, Sonsoles

    2014-11-01

    In recent years, bullying research has transitioned from investigating the characteristics of the bully-victim dyad to examining bullying as a group-level process, in which the majority of children play some kind of role. This study used a shortened adaptation of the Participant Role Scale (PRS) to identify these roles in a representative sample of 2,050 Spanish children aged 8 to 13 years. Confirmatory factor analysis revealed three different roles, indicating that the adapted scale remains a reliable way to distinguish the Bully, Defender, and Outsider roles. In addition, measurement invariance of the adapted scale was examined to analyze possible gender differences among the roles. Peer status was assessed separately by gender through two sociometric procedures: the nominations-based method and the ratings-based method. Across genders, children in the Bully role were more often rated as rejected, whereas Defenders were more popular. Results suggest that although the PRS can reveal several different peer roles in the bullying process, a clearer distinction between bullying roles (i.e., Bully, Assistant, and Reinforcer) could better inform strategies for bullying interventions. PMID:24707035

  11. Adaptation of triple axis neutron spectrometer for SANS measurements using alumina samples at TRIGA reactor of Bangladesh

    Ahmed, F. U.; Kamal, I.; Yunus, S. M.; Datta, T. K.; Azad, A. K.; Zakaria, A. K. M.; Goyal, P. S.

    2005-09-01

    The double-crystal method, known as Bonse and Hart's technique, has been employed to develop a small angle neutron scattering (SANS) facility on a triple axis neutron spectrometer at the TRIGA Mark II (3 MW) research reactor of the Atomic Energy Research Establishment (AERE), Savar, Dhaka, Bangladesh. Two Si(111) crystals with a very small mosaic spread of ∼1 min have been used for this purpose. At an incident neutron wavelength of 1.24 Å, this device is useful for SANS in the Q range between 1.6×10⁻³ and 10⁻¹ Å⁻¹. This Q range allows investigating particle sizes and interparticle correlations on a length scale of ∼200 Å. Results of SANS experiments on three alumina (Al₂O₃) samples performed using the above setup are presented. It is seen that Al₂O₃ particles indeed scatter neutrons at small angles. It is also seen that the scattering differs between samples, showing that it changes with a change in particle size.

  12. Slice Sampling

    Neal, Radford M.

    2000-01-01

    Markov chain sampling methods that automatically adapt to characteristics of the distribution being sampled can be constructed by exploiting the principle that one can sample from a distribution by sampling uniformly from the region under the plot of its density function. A Markov chain that converges to this uniform distribution can be constructed by alternating uniform sampling in the vertical direction with uniform sampling from the horizontal 'slice' defined by the current vertical position...
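
    The procedure sketched in this abstract can be written down compactly. The following is a minimal univariate slice sampler with the standard step-out and shrinkage moves, offered as an illustration of the idea rather than a reproduction of Neal's full algorithm.

```python
import numpy as np

def slice_sample(log_density, x0, n_samples, w=1.0, rng=None):
    """Univariate slice sampler with step-out and shrinkage.

    log_density: log of an (unnormalised) target density.
    w: initial width of the horizontal interval.
    """
    rng = rng or np.random.default_rng()
    x, out = x0, []
    for _ in range(n_samples):
        # Vertical step: draw an auxiliary height under the density plot.
        log_y = log_density(x) + np.log(rng.uniform())
        # Horizontal step: place an interval of width w around x ...
        left = x - rng.uniform() * w
        right = left + w
        # ... and step out until both ends fall outside the slice.
        while log_density(left) > log_y:
            left -= w
        while log_density(right) > log_y:
            right += w
        # Sample uniformly from the interval, shrinking on rejection.
        while True:
            x_new = rng.uniform(left, right)
            if log_density(x_new) > log_y:
                x = x_new
                break
            if x_new < x:
                left = x_new
            else:
                right = x_new
        out.append(x)
    return np.array(out)

# Example: sample a standard normal via its unnormalised log-density.
samples = slice_sample(lambda x: -0.5 * x * x, x0=0.0, n_samples=5000)
print(samples.mean(), samples.std())  # should be near 0 and 1
```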

  13. Adaptive skills

    Staša Stropnik

    2013-02-01

    Adaptive skills are defined as a collection of conceptual, social and practical skills that are learned by people in order to function in their everyday lives. They include an individual's ability to adapt to and manage his or her surroundings in order to function effectively and meet social or community expectations. Good adaptive skills promote an individual's independence in different environments, whereas poorly developed adaptive skills are connected to dependency and a greater need for control and help with everyday tasks. Assessment of adaptive skills is often connected to the assessment of intellectual disability, because the diagnosis of intellectual disability includes lower achievement on standardized tests of intellectual abilities as well as significant deficits in adaptive skills. Assessment of adaptive behavior is part of the standard assessment battery for children and adults with different problems, disorders or disabilities that affect their everyday functioning. This contribution also presents the psychometric tools most regularly used for the assessment of adaptive skills and the characteristics of adaptive skills in individual clinical groups.

  14. Ambiguous Adaptation

    Møller Larsen, Marcus; Lyngsie, Jacob

    We investigate why some exchange relationships terminate prematurely. We argue that investments in informal governance structures induce premature termination in relationships already governed by formal contracts. The formalized adaptive behavior of formal governance structures and the flexible and reciprocal adaptation of informal governance structures create ambiguity in situations of contingencies, which, subsequently, increases the likelihood of premature relationship termination. Using a large sample of exchange relationships in the global service provider industry, we find support for a hypothesis...

  15. Importance of the market portfolio description in the assessment of a sample of Spanish investment funds through the Jensen’s Alpha

    BELÉN VALLEJO ALONSO

    2003-06-01

    The right assessment of the investment funds performance and of the manager's ability to add value with their management is an important aspect that has received special attention. Among the traditional performance measures, one of the most used is the Jensen's alpha. However, one of the main problems of the evaluation methods using the beta as a risk measure and, hence, of the Jensen's alpha, is their sensitivity to the market portfolio. In this work we aim to study the importance of the market portfolio description in the assessment of a sample of Spanish investment funds through the Jensen's alpha. We analyze the market portfolio influence, on the one hand, on the alpha outcomes and, on the other, on the ranking of the funds that they provide.

  16. Ex vivo transcriptional profiling reveals a common set of genes important for the adaptation of Pseudomonas aeruginosa to chronically infected host sites

    Bielecki, P.; Komor, U.; Bielecka, A.; Müsken, M.; Puchalka, J.; Pletz, M.W.; Ballmann, M.; Martins Dos Santos, V.A.P.; Weiss, S.; Häussler, S.

    2013-01-01

    The opportunistic bacterium Pseudomonas aeruginosa is a major nosocomial pathogen causing both devastating acute and chronic persistent infections. During the course of an infection, P. aeruginosa rapidly adapts to the specific conditions within the host. In the present study, we aimed at the identification...

  17. Investigation on the Importance of Fast Air Temperature Measurements in the Sampling Cell of Short-Tube Closed-Path Gas Analyzer for Eddy-Covariance Fluxes

    Kathilankal, J. C.; Fratini, G.; Burba, G. G.

    2014-12-01

    High-speed, precise gas analyzers used in eddy covariance flux research measure gas content in a known volume, thus essentially measuring gas density. The classical eddy flux equation, however, is based on the dry mole fraction. The relation between dry mole fraction and density is regulated by the ideal gas law and law of partial pressures, and depends on water vapor content, temperature and pressure of air. If the instrument can output precise fast dry mole fraction, the flux processing is significantly simplified and WPL terms accounting for air density fluctuations are no longer required. This will also lead to a reduction in the uncertainties associated with the WPL terms. For instruments adopting an open-path design, this method is difficult to use because of complexities with maintaining reliable fast temperature measurements integrated over the entire measuring path, and also because of extraordinary challenges with accurate measurements of fast pressure in the open air flow. For instruments utilizing a traditional long-tube closed-path design, with tube length 1000 or more times the tube diameter, this method can be used when instantaneous fluctuations in the air temperature of the sampled air are effectively dampened, instantaneous pressure fluctuations are regulated or negligible, and water vapor is measured simultaneously with the gas, or the sample is dried. For instruments with a short-tube enclosed design, most - but not all - of the temperature fluctuations are attenuated, so calculating unbiased fluxes using fast dry mole fraction output requires high-speed, precise temperature measurements of the air stream inside the cell. In this presentation, the authors examine short-term and long-term data sets to assess the importance of high-speed, precise air temperature measurements in the sampling cell of short-tube enclosed gas analyzers. The CO2 and H2O half hourly flux calculations, as well as long-term carbon and water budgets, are examined.
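
    The density-to-dry-mole-fraction conversion that this abstract relies on follows directly from the ideal gas law and the law of partial pressures. The sketch below is a minimal illustration under textbook assumptions; the constants and example values are ours, not any instrument vendor's.

```python
# Converting measured CO2 mass density to dry mole fraction, assuming the
# cell temperature T, pressure P and water vapour density rho_v are all
# measured at the same (fast) rate.
R   = 8.314          # J mol-1 K-1, universal gas constant
M_C = 44.01e-3       # kg mol-1, molar mass of CO2
M_V = 18.02e-3       # kg mol-1, molar mass of H2O

def dry_mole_fraction(rho_c, rho_v, T, P):
    """rho_c, rho_v in kg m-3; T in K; P in Pa. Returns mol CO2 per mol dry air."""
    e = (rho_v / M_V) * R * T      # water vapour partial pressure (Pa)
    n_dry = (P - e) / (R * T)      # molar density of dry air (mol m-3)
    return (rho_c / M_C) / n_dry

# e.g. 700 mg m-3 CO2 and 10 g m-3 H2O at 25 C and standard pressure
print(dry_mole_fraction(700e-6, 10e-3, 298.15, 101325) * 1e6, "umol/mol")
```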

  18. Neural Adaptive Sequential Monte Carlo

    Gu, Shixiang; Ghahramani, Zoubin; Turner, Richard E

    2015-01-01

    Sequential Monte Carlo (SMC), or particle filtering, is a popular class of methods for sampling from an intractable target distribution using a sequence of simpler intermediate distributions. Like other importance sampling-based methods, performance is critically dependent on the proposal distribution: a bad proposal can lead to arbitrarily inaccurate estimates of the target distribution. This paper presents a new method for automatically adapting the proposal using an approximation of the Kullback-Leibler divergence...
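
    To make the role of the proposal concrete, the following is a minimal bootstrap particle filter for a toy 1-D linear-Gaussian model, in which the proposal is simply the transition prior; adaptive methods such as the one described above would replace that choice with a learned, better-matched proposal. All model parameters here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
T, N = 50, 500                   # time steps, particles
a, q, r = 0.9, 1.0, 0.5          # transition coeff., process/obs. noise std

# Simulate a hidden state and noisy observations
x_true = np.zeros(T)
for t in range(1, T):
    x_true[t] = a * x_true[t-1] + q * rng.standard_normal()
y = x_true + r * rng.standard_normal(T)

particles = rng.standard_normal(N)
est = []
for t in range(T):
    # Proposal step: here, the transition prior (bootstrap filter).
    particles = a * particles + q * rng.standard_normal(N)
    # Importance weights: observation likelihood of each particle.
    logw = -0.5 * ((y[t] - particles) / r) ** 2
    w = np.exp(logw - logw.max()); w /= w.sum()
    est.append(np.sum(w * particles))
    # Multinomial resampling resets the weights to uniform.
    particles = particles[rng.choice(N, size=N, p=w)]
print(np.mean((np.array(est) - x_true) ** 2))  # filtering MSE
```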

  19. The Importance of In Situ Measurements and Sample Return in the Search for Chemical Biosignatures on Mars or other Solar System Bodies (Invited)

    Glavin, D. P.; Brinckerhoff, W. B.; Conrad, P. G.; Dworkin, J. P.; Eigenbrode, J. L.; Getty, S.; Mahaffy, P. R.

    2013-12-01

    The search for evidence of life on Mars and elsewhere will continue to be one of the primary goals of NASA's robotic exploration program for decades to come. NASA and ESA are currently planning a series of robotic missions to Mars with the goal of understanding its climate, resources, and potential for harboring past or present life. One key goal will be the search for chemical biomarkers including organic compounds important in life on Earth and their geological forms. These compounds include amino acids, the monomer building blocks of proteins and enzymes, nucleobases and sugars which form the backbone of DNA and RNA, and lipids, the structural components of cell membranes. Many of these organic compounds can also be formed abiotically as demonstrated by their prevalence in carbonaceous meteorites [1], though, their molecular characteristics may distinguish a biological source [2]. It is possible that in situ instruments may reveal such characteristics, however, return of the right samples to Earth (i.e. samples containing chemical biosignatures or having a high probability of biosignature preservation) would enable more intensive laboratory studies using a broad array of powerful instrumentation for bulk characterization, molecular detection, isotopic and enantiomeric compositions, and spatially resolved chemistry that may be required for confirmation of extant or extinct life on Mars or elsewhere. In this presentation we will review the current in situ analytical capabilities and strategies for the detection of organics on the Mars Science Laboratory (MSL) rover using the Sample Analysis at Mars (SAM) instrument suite [3] and discuss how both future advanced in situ instrumentation [4] and laboratory measurements of samples returned from Mars and other targets of astrobiological interest including the icy moons of Jupiter and Saturn will help advance our understanding of chemical biosignatures in the Solar System. References: [1] Cronin, J. R and Chang S. (1993

  20. Sexual Excitation/Sexual Inhibition Inventory (SESII-W/M): Adaptation and Validation Within a Portuguese Sample of Men and Women.

    Neves, Cide Filipe; Milhausen, Robin R; Carvalheira, Ana

    2016-08-17

    The SESII-W/M is a self-report measure assessing factors that inhibit and enhance sexual arousal in men and women. The goal of this study was to adapt and validate it in a sample of Portuguese men and women. A total of 1,723 heterosexual men and women participated through a web survey, with ages ranging from 18 to 72 years old (M = 36.05, SD = 11.93). The levels of internal consistency were considered satisfactory in the first four factors, but not in Setting and Dyadic Elements of the Sexual Interaction. Confirmatory factor analysis partially supported the six-factor, 30-item model, as factor loadings and squared multiple correlations pointed to problems with items mainly loading on those two factors. General fit indices were lower than the ones estimated by Milhausen, Graham, Sanders, Yarber, and Maitland (2010). Psychometric sensitivity and construct validity were adequate and gender differences were consistent with the original study. The six-factor, 30-item model was retained, but changes to the factors Setting and Dyadic Elements of the Sexual Interaction, and their corresponding items, were recommended in order to strengthen the measure. PMID:26548421

  1. Hemoglobin and testosterone: importance in high altitude acclimatization and adaptation

    Gustavo F. Gonzales

    2011-03-01

    The different types of mechanisms that the organism employs when facing hypoxia include accommodation, acclimatization and adaptation. Accommodation is the initial response to acute exposure to high-altitude hypoxia and is characterized by an increase in ventilation and heart rate. Acclimatization occurs in individuals who are temporarily exposed to high altitude and, to a certain degree, allows them to tolerate it. In this phase there is an increase in erythropoiesis, the hemoglobin concentration rises and the oxygen transport capacity improves. Adaptation is the process of natural acclimatization in which genetic variation and acclimatization come into play, allowing individuals to live without difficulty at high altitude. Testosterone, a hormone that regulates erythropoiesis and ventilation, could be associated with the processes of acclimatization and adaptation to altitude. The excessive erythrocytosis that leads to chronic mountain sickness is caused by low arterial oxygen saturation, ventilatory inefficiency and a reduced ventilatory response to hypoxia. Testosterone increases during acute exposure to high altitude and in high-altitude natives with excessive erythrocytosis. The results of current research suggest that increases in testosterone and hemoglobin are good for acquired acclimatization, since they improve oxygen transport, but not for adaptation to altitude, given that high serum testosterone values are associated with excessive erythrocytosis.

  2. Thriving While Engaging in Risk? Examining Trajectories of Adaptive Functioning, Delinquency, and Substance Use in a Nationally Representative Sample of U.S. Adolescents

    Warren, Michael T.; Wray-Lake, Laura; Rote, Wendy M.; Shubert, Jennifer

    2016-01-01

    Recent advances in positive youth development theory and research explicate complex associations between adaptive functioning and risk behavior, acknowledging that high levels of both co-occur in the lives of some adolescents. However, evidence on nuanced overlapping developmental trajectories of adaptive functioning and risk has been limited to 1…

  3. Evolution of the MIDTAL microarray: the adaption and testing of oligonucleotide 18S and 28S rDNA probes and evaluation of subsequent microarray generations with Prymnesium spp. cultures and field samples.

    McCoy, Gary R; Touzet, Nicolas; Fleming, Gerard T A; Raine, Robin

    2015-07-01

    The toxic microalgal species Prymnesium parvum and Prymnesium polylepis are responsible for numerous fish kills, causing economic stress on the aquaculture industry, and, through the consumption of contaminated shellfish, can potentially impact on human health. Monitoring of toxic phytoplankton is traditionally carried out by light microscopy; however, molecular methods of identification and quantification are becoming more commonplace. This study documents the optimisation of the novel Microarrays for the Detection of Toxic Algae (MIDTAL) microarray from its initial stages to the final commercial version now available from Microbia Environnement (France). Existing oligonucleotide probes for Prymnesium species used in whole-cell fluorescent in situ hybridisation (FISH), from higher-group probes to species-level probes, were adapted and tested on the first-generation microarray. The combination and interaction of numerous other probes specific for a whole range of phytoplankton taxa also spotted on the chip surface caused high cross reactivity, resulting in false-positive results on the microarray. The probe sequences were extended for the subsequent second-generation microarray, and further adaptations of the hybridisation protocol and incubation temperatures significantly reduced false-positive readings from the first to the second-generation chip, thereby increasing the specificity of the MIDTAL microarray. Additional refinement of the subsequent third-generation microarray protocols with the addition of a poly-T amino linker to the 5' end of each probe further enhanced the microarray performance but also highlighted the importance of optimising RNA labelling efficiency when testing with natural seawater samples from Killary Harbour, Ireland. PMID:25631743

  4. Defining “Normophilic” and “Paraphilic” Sexual Fantasies in a Population‐Based Sample: On the Importance of Considering Subgroups

    2015-01-01

    ...criteria for paraphilia are too inclusive. Suggestions are given to improve the definition of pathological sexual interests, and the crucial difference between SF and sexual interest is underlined. Joyal CC. Defining “normophilic” and “paraphilic” sexual fantasies in a population‐based sample: On the importance of considering subgroups. Sex Med 2015;3:321–330. PMID:26797067

  5. THE IMPORTANCE OF THE STANDARD SAMPLE FOR ACCURATE ESTIMATION OF THE CONCENTRATION OF NET ENERGY FOR LACTATION IN FEEDS ON THE BASIS OF GAS PRODUCED DURING THE INCUBATION OF SAMPLES WITH RUMEN LIQUOR

    ŽNIDARŠIČ, T.; Verbič, J.; Babnik, D.

    2003-01-01

    The aim of this work was to examine the necessity of using a standard sample in the Hohenheim gas test. During a three-year period, 24 runs of forage samples were incubated with rumen liquor in vitro. Besides the forage samples, the standard hay sample provided by the Hohenheim University (HFT-99) was also included in the experiment. Half of the runs were incubated with rumen liquor of cattle and half with rumen liquor of sheep. Gas produced during the 24 h incubation of the standard sample was measured and compared to a declared value of sample HFT-99...

  6. The Importance of Sampling Strategies on AMS Determination of Dykes II. Further Examples from the Kapaa Quarry, Koolau Volcano, Oahu, Hawaii

    Mendoza-Borunda, R.; Herrero-Bervera, E.; Canon-Tapia, E.

    2012-12-01

    Recent work has suggested the convenience of dyke sampling along several profiles parallel and perpendicular to its walls to increase the probability of determining a geologically significant magma flow direction using anisotropy of magnetic susceptibility (AMS) measurements. For this work, we have resampled in great detail some dykes from the Kapaa Quarry, Koolau Volcano in Oahu Hawaii, comparing the results of a more detailed sampling scheme with those obtained previously with a traditional sampling scheme. In addition to the AMS results we will show magnetic properties, including magnetic grain sizes, Curie points and AMS measured at two different frequencies on a new MFK1-FA Spinner Kappabridge. Our results thus far provide further empirical evidence supporting the occurrence of a definite cyclic fabric acquisition during the emplacement of at least some of the dykes. This cyclic behavior can be captured using the new sampling scheme, but might be easily overlooked if the simple, more traditional sampling scheme is used. Consequently, previous claims concerning the advantages of adopting a more complex sampling scheme are justified since this approach can serve to reduce the uncertainty in the interpretation of AMS results.

  7. Effect of some important sample parameters on the X-ray fluorescence determination of impurities in pure materials: determination of Ca, Y, Gd and Th in uranium

    Calibration standards obtained by dry mixing of the components in appropriate proportions may be compositionally accurate, but they can be a major source of error in X-ray fluorescence analytical results owing to the mismatch between their degree of homogeneity and that of the sample. An in-depth study has been made of this aspect, taking U₃O₈ as a matrix and Ca, Y, Gd and Th in the low concentration range (0-1500 ppm) as representative impurities. This study clearly illustrates the incompatibility of truly homogeneous standards with standards obtained by dry mixing and grinding. This incompatibility depends on the analytical line wavelength, being large for soft X-ray lines and small for harder X-rays. The effect of the binding material, which is often required for briquetting the sample into pellet form and must necessarily be mixed with the sample by dry mixing, is discussed. (author)

  8. A Self-adaptive Sampling Method for Time Difference Method Ultrasonic Flow-meter

    罗永; 王让定; 姚灵

    2012-01-01

    For the requirements of high precision and low power in ultrasonic flow-meters, a self-adaptive sampling method is proposed to overcome the deficiencies of periodic sampling. The rate of change of the time difference between adjacent measurements is used as the parameter controlling the sampling period, which is adjusted automatically according to the fluid flow. Comparative analysis of experimental data for self-adaptive and periodic sampling shows that the self-adaptive method not only significantly improves measurement accuracy under fluctuating flow, but also reduces system power consumption when the flow is stable.
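
    A toy version of such a scheme can be sketched as follows; the thresholds, bounds and halving/doubling rule are our assumptions, not the authors' algorithm, but they show how the rate of change of the measured time difference can drive the sampling period.

```python
# Adapt the sampling period from the relative change of successive
# transit-time differences: fast flow changes shorten the period,
# stable flow lengthens it, trading resolution against power use.
def next_period(period, dt_prev, dt_curr,
                t_min=0.05, t_max=2.0, threshold=0.02):
    """period in s; dt_prev/dt_curr are successive time differences (us)."""
    rate = abs(dt_curr - dt_prev) / max(abs(dt_prev), 1e-9)
    if rate > threshold:          # fluid fluctuating: sample faster
        return max(t_min, period / 2)
    return min(t_max, period * 2)  # fluid stable: save power

period = 0.5
for dt_prev, dt_curr in [(10.0, 10.01), (10.01, 11.5), (11.5, 11.51)]:
    period = next_period(period, dt_prev, dt_curr)
    print(f"rate-adapted period: {period:.2f} s")
```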

  9. THE IMPORTANCE OF THE STANDARD SAMPLE FOR ACCURATE ESTIMATION OF THE CONCENTRATION OF NET ENERGY FOR LACTATION IN FEEDS ON THE BASIS OF GAS PRODUCED DURING THE INCUBATION OF SAMPLES WITH RUMEN LIQUOR

    T ŽNIDARŠIČ

    2003-10-01

    The aim of this work was to examine the necessity of using a standard sample in the Hohenheim gas test. During a three-year period, 24 runs of forage samples were incubated with rumen liquor in vitro. Besides the forage samples, the standard hay sample provided by the Hohenheim University (HFT-99) was also included in the experiment. Half of the runs were incubated with rumen liquor of cattle and half with rumen liquor of sheep. Gas produced during the 24 h incubation of the standard sample was measured and compared to the declared value of sample HFT-99. Besides HFT-99, 25 test samples with known digestibility coefficients determined in vivo were included in the experiment. Based on the gas production of HFT-99, it was found that the donor animal (cattle or sheep) did not significantly affect the activity of rumen liquor (41.4 vs. 42.2 ml of gas per 200 mg dry matter, P>0.1). Neither were differences between years (41.9, 41.2 and 42.3 ml of gas per 200 mg dry matter, P>0.1) significant. However, a variability of about 10% (from 38.9 to 43.7 ml of gas per 200 mg dry matter) was observed between runs. In the present experiment, the gas production of HFT-99 was about 6% lower than the value obtained by the Hohenheim University (41.8 vs. 44.43 ml per 200 mg dry matter). This indicates a systematic error between the laboratories. In the case of the twenty-five test samples, correction on the basis of the standard sample reduced the average difference of the in vitro estimates of net energy for lactation (NEL) from the in vivo determined values. It was concluded that, due to variation between runs and systematic differences in rumen liquor activity between the two laboratories, the results of the Hohenheim gas test have to be corrected on the basis of the standard sample.
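
    One simple form of such a correction, presumably what is meant here, scales each test sample's gas production by the ratio of the declared to the measured value of the standard in the same run. A minimal sketch using the values quoted in the abstract:

```python
# Run-wise correction against the standard sample: each test sample's gas
# production is scaled by declared/measured for HFT-99 in the same run.
declared_hft99 = 44.43   # ml gas / 200 mg DM, declared by Hohenheim
measured_hft99 = 41.8    # ml gas / 200 mg DM, measured in this run

def correct(gas_measured):
    return gas_measured * declared_hft99 / measured_hft99

print(correct(40.0))     # corrected gas production of a test sample
```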

  10. Importance of covariance components in inversion analyses of densely sampled observed data: an application to waveform data inversion for seismic source processes

    Yagi, Yuji; Fukahata, Yukitoshi

    2008-10-01

    Nominally continuous data in space and/or time are obtained in various geophysical observations. Thanks to advances in computing technology, we can now invert such observed data at very high sampling rates. Densely sampled observed data are usually not completely independent of each other, and so we must take this effect into account. Seismic waveform data, for example, have at least temporal correlation due to the effect of inelastic attenuation of the Earth. Taking the data covariance into account, we have developed a method of seismic source inversion and applied it to teleseismic P-wave data of the 2003 Boumerdes-Zemmouri, Algeria earthquake. From the comparison of the final slip distributions inverted with and without the covariance components, we found that the effect of covariance components is crucial for data sets of higher sampling rates (≥5 Hz). If we neglect the covariance components, the inverted results become unstable due to overestimation of the information contained in the observed data. So far, it has been widely believed that we can obtain a finer image of seismic source processes by inverting waveform data with a higher sampling rate. However, the covariance components of observed data originating from the inelastic effects of the Earth place a limit on the resolution of inverted seismic source models.
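
    The effect of including covariance components can be demonstrated on a toy linear inversion d = Gm + e with AR(1)-correlated noise; the sketch below (our construction, not the authors' seismic setup) contrasts ordinary least squares with generalised least squares that whitens the data using the Cholesky factor of the covariance matrix.

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 200, 3
G = rng.standard_normal((n, p))
m_true = np.array([1.0, -2.0, 0.5])

# AR(1)-like covariance mimicking temporal correlation of waveform noise
rho = 0.9
C = rho ** np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
L = np.linalg.cholesky(C)
d = G @ m_true + 0.5 * (L @ rng.standard_normal(n))

# Ordinary least squares: covariance ignored
m_ols = np.linalg.lstsq(G, d, rcond=None)[0]
# Generalised least squares: whiten data and kernel with L
Gw, dw = np.linalg.solve(L, G), np.linalg.solve(L, d)
m_gls = np.linalg.lstsq(Gw, dw, rcond=None)[0]
print("OLS:", m_ols, "\nGLS:", m_gls)
```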

  11. Importance of the market portfolio description in the assessment of a sample of Spanish investment funds through the Jensen’s Alpha

    BELÉN VALLEJO ALONSO

    2003-01-01

    The right assessment of the investment funds performance and of the manager's ability to add value with their management is an important aspect that has received special attention. Among the traditional performance measures, one of the most used is the Jensen's alpha. However, one of the main problems of the evaluation methods using the beta as a risk measure and, hence, of the Jensen's alpha, is their sensitivity to the market portfolio. In this work we aim to study the importance of the...

  12. Signal sampling circuit

    Louwsma, Simon Minze; Vertregt, Maarten

    2010-01-01

    A sampling circuit for sampling a signal is disclosed. The sampling circuit comprises a plurality of sampling channels adapted to sample the signal in time-multiplexed fashion, each sampling channel comprising a respective track-and-hold circuit connected to a respective analogue-to-digital converter...

  13. Signal sampling circuit

    Louwsma, Simon Minze; Vertregt, Maarten

    2011-01-01

    A sampling circuit for sampling a signal is disclosed. The sampling circuit comprises a plurality of sampling channels adapted to sample the signal in time-multiplexed fashion, each sampling channel comprising a respective track-and-hold circuit connected to a respective analogue-to-digital converter...

  14. Robust Adaptive Photon Tracing using Photon Path Visibility

    Hachisuka, Toshiya; Jensen, Henrik Wann

    2011-01-01

    We present a new adaptive photon tracing algorithm which can handle illumination settings that are considered difficult for photon tracing approaches, such as outdoor scenes, close-ups of a small part of an illuminated region, and illumination coming through a small gap. The key contribution in our algorithm is the use of photon path visibility as the importance function, which ensures that our sampling algorithm focuses on paths that are visible from the given viewpoint. Our sampling algorithm builds on two recent developments in Markov chain Monte Carlo methods: adaptive Markov chain sampling and replica exchange. Using these techniques, each photon path is adaptively mutated and explores the sampling space efficiently without being stuck at a local peak of the importance function. We have implemented this sampling approach in the progressive photon mapping algorithm, which provides visibility...
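
    A heavily stripped-down 1-D analogue of the two MCMC ingredients named above, Metropolis sampling against a (near-)binary visibility importance function and Robbins-Monro-style adaptation of the mutation size toward a target acceptance rate, might look as follows; it illustrates only the sampling machinery, not a photon tracer.

```python
import numpy as np

rng = np.random.default_rng(3)
# Importance function: "visible" only on a narrow sub-interval, with a
# tiny floor so the chain can still escape invisible regions.
visible = lambda x: 1.0 if 0.2 < x % 1.0 < 0.3 else 1e-6

x, step, target = 0.25, 0.05, 0.234
for i in range(1, 20001):
    x_new = x + step * rng.standard_normal()   # mutate the "path"
    accepted = rng.uniform() < visible(x_new) / visible(x)
    if accepted:
        x = x_new
    # Adapt the mutation size toward the target acceptance rate
    step *= np.exp((accepted - target) / np.sqrt(i))
print("final mutation size:", step)
```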

  15. Global dust attenuation in disc galaxies: strong variation with specific star formation and stellar mass, and the importance of sample selection

    Devour, Brian M.; Bell, Eric F.

    2016-06-01

    We study the relative dust attenuation-inclination relation in 78 721 nearby galaxies using the axis ratio dependence of optical-near-IR colour, as measured by the Sloan Digital Sky Survey, the Two Micron All Sky Survey, and the Wide-field Infrared Survey Explorer. To avoid attenuation-driven biases to the greatest extent possible, we carefully select galaxies using dust attenuation-independent near- and mid-IR luminosities and colours. Relative u-band attenuation between face-on and edge-on disc galaxies along the star-forming main sequence varies from ˜0.55 mag up to ˜1.55 mag. The strength of the relative attenuation varies strongly with both specific star formation rate and galaxy luminosity (or stellar mass). The dependence of relative attenuation on luminosity is not monotonic, but rather peaks at M(3.4 μm) ≈ -21.5, corresponding to M* ≈ 3 × 10¹⁰ M⊙. This behaviour seemingly stands in contrast to some older studies; we show that older works failed to reliably probe to higher luminosities, and were insensitive to the decrease in attenuation with increasing luminosity for the brightest star-forming discs. Back-of-the-envelope scaling relations predict the strong variation of dust optical depth with specific star formation rate and stellar mass. More in-depth comparisons using the scaling relations to model the relative attenuation require the inclusion of star-dust geometry to reproduce the details of these variations (especially at high luminosities), highlighting the importance of these geometrical effects.

  16. Technology transfer for adaptation

    Biagini, Bonizella; Kuhl, Laura; Gallagher, Kelly Sims; Ortiz, Claudia

    2014-09-01

    Technology alone will not be able to solve adaptation challenges, but it is likely to play an important role. As a result of the role of technology in adaptation and the importance of international collaboration for climate change, technology transfer for adaptation is a critical but understudied issue. Through an analysis of Global Environment Facility-managed adaptation projects, we find there is significantly more technology transfer occurring in adaptation projects than might be expected given the pessimistic rhetoric surrounding technology transfer for adaptation. Most projects focused on demonstration and early deployment/niche formation for existing technologies rather than earlier stages of innovation, which is understandable considering the pilot nature of the projects. Key challenges for the transfer process, including technology selection and appropriateness under climate change, markets and access to technology, and diffusion strategies are discussed in more detail.

  17. Personality and adaptive performance at work: a meta-analytic investigation.

    Huang, Jason L; Ryan, Ann Marie; Zabel, Keith L; Palmer, Ashley

    2014-01-01

    We examined emotional stability, ambition (an aspect of extraversion), and openness as predictors of adaptive performance at work, based on the evolutionary relevance of these traits to human adaptation to novel environments. A meta-analysis on 71 independent samples (N = 7,535) demonstrated that emotional stability and ambition are both related to overall adaptive performance. Openness, however, does not contribute to the prediction of adaptive performance. Analysis of predictor importance suggests that ambition is the most important predictor for proactive forms of adaptive performance, whereas emotional stability is the most important predictor for reactive forms of adaptive performance. Job level (managers vs. employees) moderates the effects of personality traits: Ambition and emotional stability exert stronger effects on adaptive performance for managers as compared to employees. PMID:24016205

  18. Adaptive regularization

    Hansen, Lars Kai; Rasmussen, Carl Edward; Svarer, C.;

    1994-01-01

    Regularization, e.g., in the form of weight decay, is important for training and optimization of neural network architectures. In this work the authors provide a tool, based on asymptotic sampling theory, for iterative estimation of weight decay parameters. The basic idea is to do a gradient descent...

  19. Adaptive Lighting

    Petersen, Kjell Yngve; Søndergaard, Karin; Kongshaug, Jesper

    2015-01-01

    Adaptive lighting is based on a partial automation of the possibilities to adjust the colour tone and brightness levels of light in order to adapt to people’s needs and desires. IT support is key to the technical developments that afford adaptive control systems. The possibilities offered by adaptive lighting control are created by the ways that the system components, the network and data flow can be coordinated through software so that the dynamic variations are controlled in...

  20. Origins of adaptive immunity.

    Liongue, Clifford; John, Liza B; Ward, Alister

    2011-01-01

    Adaptive immunity, involving distinctive antibody- and cell-mediated responses to specific antigens based on "memory" of previous exposure, is a hallmark of higher vertebrates. It has been argued that adaptive immunity arose rapidly, as articulated in the "big bang theory" surrounding its origins, which stresses the importance of coincident whole-genome duplications. Through a close examination of the key molecules and molecular processes underpinning adaptive immunity, this review suggests a less-extreme model, in which adaptive immunity emerged as part of longer evolutionary journey. Clearly, whole-genome duplications provided additional raw genetic materials that were vital to the emergence of adaptive immunity, but a variety of other genetic events were also required to generate some of the key molecules, whereas others were preexisting and simply co-opted into adaptive immunity. PMID:21395512

  1. Adaptive skills

    Staša Stropnik; Jana Kodrič

    2013-01-01

    Adaptive skills are defined as a collection of conceptual, social and practical skills that are learned by people in order to function in their everyday lives. They include an individual's ability to adapt to and manage her or his surroundings to effectively function and meet social or community expectations. Good adaptive skills promote individual's independence in different environments, whereas poorly developed adaptive skills are connected to individual's dependency and with g...

  2. Validation of a simplified field-adapted procedure for routine determinations of methyl mercury at trace levels in natural water samples using species-specific isotope dilution mass spectrometry

    Lambertsson, Lars [Umeaa Marine Sciences Centre, Hoernefors (Sweden); Umeaa University, Department of Chemistry, Analytical Chemistry, Umeaa (Sweden); Bjoern, Erik [Umeaa University, Department of Chemistry, Analytical Chemistry, Umeaa (Sweden)

    2004-12-01

    A field-adapted procedure based on species-specific isotope dilution (SSID) methodology for trace-level determinations of methyl mercury (CH₃Hg⁺) in mire, fresh and sea water samples was developed, validated and applied in a field study. In the field study, mire water samples were filtered, standardised volumetrically with isotopically enriched CH₃²⁰⁰Hg⁺, and frozen on dry ice. The samples were derivatised in the laboratory without further pre-treatment using sodium tetraethyl borate (NaB(C₂H₅)₄) and the ethylated methyl mercury was purge-trapped on Tenax columns. The analyte was thermo-desorbed onto a GC-ICP-MS system for analysis. Investigations preceding field application of the method showed that when using SSID, for all tested matrices, identical results were obtained between samples that were freeze-preserved or analysed unpreserved. For DOC-rich samples (mire water) additional experiments showed no difference in CH₃Hg⁺ concentration between samples that were derivatised without pre-treatment or after liquid extraction. Extractions of samples for matrix-analyte separation prior to derivatisation are therefore not necessary. No formation of CH₃Hg⁺ was observed during sample storage and treatment when spiking samples with ¹⁹⁸Hg²⁺. Total uncertainty budgets for the field application of the method showed that for analyte concentrations higher than 1.5 pg g⁻¹ (as Hg) the relative expanded uncertainty (REU) was approximately 5% and dominated by the uncertainty in the isotope standard concentration. Below 0.5 pg g⁻¹ (as Hg), the REU was >10% and dominated by variations in the field blank. The uncertainty of the method is sufficiently low to accurately determine CH₃Hg⁺ concentrations at trace levels. The detection limit was determined to be 4 fg g⁻¹ (as Hg) based on replicate analyses of laboratory blanks. The described procedure is reliable, considerably faster and simplified compared to non-SSID methods...

  3. Validation of a simplified field-adapted procedure for routine determinations of methyl mercury at trace levels in natural water samples using species-specific isotope dilution mass spectrometry.

    Lambertsson, Lars; Björn, Erik

    2004-12-01

    A field-adapted procedure based on species-specific isotope dilution (SSID) methodology for trace-level determinations of methyl mercury (CH₃Hg⁺) in mire, fresh and sea water samples was developed, validated and applied in a field study. In the field study, mire water samples were filtered, standardised volumetrically with isotopically enriched CH₃²⁰⁰Hg⁺, and frozen on dry ice. The samples were derivatised in the laboratory without further pre-treatment using sodium tetraethyl borate (NaB(C₂H₅)₄) and the ethylated methyl mercury was purge-trapped on Tenax columns. The analyte was thermo-desorbed onto a GC-ICP-MS system for analysis. Investigations preceding field application of the method showed that when using SSID, for all tested matrices, identical results were obtained between samples that were freeze-preserved or analysed unpreserved. For DOC-rich samples (mire water) additional experiments showed no difference in CH₃Hg⁺ concentration between samples that were derivatised without pre-treatment or after liquid extraction. Extractions of samples for matrix-analyte separation prior to derivatisation are therefore not necessary. No formation of CH₃Hg⁺ was observed during sample storage and treatment when spiking samples with ¹⁹⁸Hg²⁺. Total uncertainty budgets for the field application of the method showed that for analyte concentrations higher than 1.5 pg g⁻¹ (as Hg) the relative expanded uncertainty (REU) was approximately 5% and dominated by the uncertainty in the isotope standard concentration. Below 0.5 pg g⁻¹ (as Hg), the REU was >10% and dominated by variations in the field blank. The uncertainty of the method is sufficiently low to accurately determine CH₃Hg⁺ concentrations at trace levels. The detection limit was determined to be 4 fg g⁻¹ (as Hg) based on replicate analyses of laboratory blanks. The described procedure is reliable, considerably faster and simplified compared to non-SSID methods and thereby very...

  4. Adaptive Rationality, Adaptive Behavior and Institutions

    Volchik Vyacheslav, V.

    2015-12-01

    The economic literature focused on understanding decision-making and choice processes reveals a vast collection of approaches to human rationality. Theorists’ attention has moved from absolutely rational, utility-maximizing individuals to boundedly rational and adaptive ones. A number of economists have criticized the concepts of adaptive rationality and adaptive behavior. One of the recent trends in the economic literature is to consider humans irrational. This paper offers an approach which examines adaptive behavior in the context of existing institutions and a constantly changing institutional environment. It is assumed that adaptive behavior is a process of evolutionary adjustment to fundamental uncertainty. We emphasize the importance of actors’ engagement in trial and error learning, since if they are involved in this process, they obtain experience and are able to adapt to existing and new institutions. The paper aims at identifying relevant institutions, adaptive mechanisms, informal working rules and practices that influence actors’ behavior in the field of Higher Education in Russia (the Rostov Region education services market is taken as an example). The paper emphasizes the application of qualitative interpretative methods (interviews and discourse analysis) in examining actors’ behavior.

  5. Adaptive trial designs.

    Lai, Tze Leung; Lavori, Philip William; Shih, Mei-Chiung

    2012-01-01

    We review adaptive designs for clinical trials, giving special attention to the control of the Type I error in late-phase confirmatory trials, when the trial planner wishes to adjust the final sample size of the study in response to an unblinded analysis of interim estimates of treatment effects. We point out that there is considerable inefficiency in using the adaptive designs that employ conditional power calculations to reestimate the sample size and that maintain the Type I error by using certain weighted test statistics. Although these adaptive designs have little advantage over familiar group-sequential designs, our review also describes recent developments in adaptive designs that are both flexible and efficient. We also discuss the use of Bayesian designs, when the context of use demands control over operating characteristics (Type I and II errors) and correction of the bias of estimated treatment effects. PMID:21838549

  6. Adaptive Lighting

    Petersen, Kjell Yngve; Søndergaard, Karin; Kongshaug, Jesper

    2015-01-01

    Adaptive lighting is based on a partial automation of the possibilities to adjust the colour tone and brightness levels of light in order to adapt to people’s needs and desires. IT support is key to the technical developments that afford adaptive control systems. The possibilities offered by adaptive lighting control are created by the ways that the system components, the network and data flow can be coordinated through software so that the dynamic variations are controlled in ways that meaningfully adapt according to people’s situations and design intentions. This book discusses...

  7. Measuring the dimensions of adaptive capacity: a psychometric approach

    Michael Lockwood

    2015-03-01

    Although previous studies have examined adaptive capacity using a range of self-assessment procedures, no objective self-report approaches have been used to identify the dimensions of adaptive capacity and their relative importance. We examine the content, structure, and relative importance of dimensions of adaptive capacity as perceived by rural landholders in an agricultural landscape in South-Eastern Australia. Our findings indicate that the most important dimensions influencing perceived landholder adaptive capacity are related to their management style, particularly their change orientation. Other important dimensions are individual financial capacity, labor availability, and the capacity of communities and local networks to support landholders' management practices. Trust and confidence in government with respect to native vegetation management was not found to be a significant dimension of perceived adaptive capacity. The scale items presented, particularly those with high factor loadings, provide a solid foundation for assessment of adaptive capacity in other study areas, as well as exploration of relationships between the individual dimensions of adaptive capacity and dependent variables such as perceived resilience. Further work is needed to refine the scale items and compare the findings from this case study with those from other contexts and population samples.

  8. Appraising Adaptive Management

    Kai N. Lee

    1999-12-01

    Adaptive management is appraised as a policy implementation approach by examining its conceptual, technical, equity, and practical strengths and limitations. Three conclusions are drawn: (1) Adaptive management has been more influential, so far, as an idea than as a practical means of gaining insight into the behavior of ecosystems utilized and inhabited by humans. (2) Adaptive management should be used only after disputing parties have agreed to an agenda of questions to be answered using the adaptive approach; this is not how the approach has been used. (3) Efficient, effective social learning, of the kind facilitated by adaptive management, is likely to be of strategic importance in governing ecosystems as humanity searches for a sustainable economy.

  9. Adaptive Lighting

    Petersen, Kjell Yngve; Søndergaard, Karin; Kongshaug, Jesper

    2015-01-01

    Adaptive lighting is based on a partial automation of the possibilities to adjust the colour tone and brightness levels of light in order to adapt to people’s needs and desires. IT support is key to the technical developments that afford adaptive control systems. The possibilities offered by adaptive lighting control are created by the ways that the system components, the network and data flow can be coordinated through software so that the dynamic variations are controlled in ways that meaningfully adapt according to people’s situations and design intentions. This book discusses the investigations of lighting scenarios carried out in two test installations: White Cube and White Box. The test installations are discussed as large-scale experiential instruments. In these test installations we examine what could potentially occur when light using LED technology is integrated and...

  10. ADAPT Dataset

    National Aeronautics and Space Administration — Advanced Diagnostics and Prognostics Testbed (ADAPT). Project Lead: Scott Poll. Subject: Fault diagnosis in electrical power systems. Description: The Advanced...

  11. Adaptively Sharing Time-Series with Differential Privacy

    Fan, Liyue

    2012-01-01

    Sharing real-time aggregate statistics of private data has given much benefit to the public to perform data mining for understanding important phenomena, such as Influenza outbreaks and traffic congestion. We propose an adaptive approach with sampling and estimation to release aggregated time series under differential privacy, the key innovation of which is that we utilize feedback loops based on observed (perturbed) values to dynamically adjust the estimation model as well as the sampling rate. To minimize the overall privacy cost, our solution uses the PID controller to adaptively sample long time-series according to detected data dynamics. To improve the accuracy of data release per timestamp, the Kalman filter is used to predict data values at non-sampling points and to estimate true values from perturbed query answers at sampling points. Our experiments with three real data sets show that it is beneficial to incorporate feedback into both the estimation model and the sampling process. The results confirm...
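
    A schematic of the feedback idea, with a PID controller shrinking the sampling interval when the prediction error on (perturbed) observations grows, could look like the sketch below; the gains, error signal and interval update are our assumptions, not the paper's exact design.

```python
class PIDSampler:
    """Toy PID controller mapping prediction error to a sampling interval."""
    def __init__(self, kp=0.8, ki=0.1, kd=0.05,
                 base_interval=10.0, min_interval=1.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.base, self.min = base_interval, min_interval
        self.integral = 0.0
        self.prev_err = 0.0

    def next_interval(self, predicted, observed):
        # Relative deviation between model prediction and noisy observation
        err = abs(observed - predicted) / max(abs(observed), 1e-9)
        self.integral = 0.9 * self.integral + err   # leaky integral term
        deriv = err - self.prev_err
        self.prev_err = err
        control = self.kp * err + self.ki * self.integral + self.kd * deriv
        # Larger control signal (more dynamics) -> shorter sampling interval
        return max(self.min, self.base / (1.0 + 10.0 * control))

s = PIDSampler()
for pred, obs in [(100, 101), (100, 150), (150, 151)]:
    print(round(s.next_interval(pred, obs), 2))
```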

  12. Adaptations, exaptations, and spandrels.

    Buss, D M; Haselton, M G; Shackelford, T K; Bleske, A L; Wakefield, J C

    1998-05-01

    Adaptation and natural selection are central concepts in the emerging science of evolutionary psychology. Natural selection is the only known causal process capable of producing complex functional organic mechanisms. These adaptations, along with their incidental by-products and a residue of noise, comprise all forms of life. Recently, S. J. Gould (1991) proposed that exaptations and spandrels may be more important than adaptations for evolutionary psychology. These refer to features that did not originally arise for their current use but rather were co-opted for new purposes. He suggested that many important phenomena--such as art, language, commerce, and war--although evolutionary in origin, are incidental spandrels of the large human brain. The authors outline the conceptual and evidentiary standards that apply to adaptations, exaptations, and spandrels and discuss the relative utility of these concepts for psychological science. PMID:9612136

  13. Introduction to adaptive arrays

    Monzingo, Bob; Haupt, Randy

    2011-01-01

    This second edition is an extensive modernization of the bestselling introduction to the subject of adaptive array sensor systems. With the number of applications of adaptive array sensor systems growing each year, this look at the principles and fundamental techniques that are critical to these systems is more important than ever before. Introduction to Adaptive Arrays, 2nd Edition is organized as a tutorial, taking the reader by the hand and leading them through the maze of jargon that often surrounds this highly technical subject. It is easy to read and easy to follow as fundamental concepts...

  14. ADAPTATION DEBATE, AHMET VEFİK PAŞA AND ZORAKİ TABİB SAMPLE / ADAPTASYON MESELESİ, AHMET VEFİK PAŞA VE ZORAKİ TABİB ÖRNEĞİ

    Dr. Bayram YILDIZ

    2007-01-01

    Adaptation, an element of theatre in Turkish literature beginning with the Tanzimat era, was opposed on the grounds that it would harm the development of a national theatre and the structure of Turkish society. On the other hand, adaptation was supported as beneficial to the development of modern theatre. Though common opinion was against adaptation, Ahmet Vefik Paşa's adaptations from Molière found acceptance. This acceptance might be due to his preference for comedy and for its best performer in...

  15. Adaptive Sampling using Support Vector Machines

    D. Mandelli; C. Smith

    2012-11-01

    Reliability/safety analysis of stochastic dynamic systems (e.g., nuclear power plants, airplanes, chemical plants) is currently performed through a combination of Event-Trees and Fault-Trees. However, these conventional methods suffer from certain drawbacks: the timing of events is not explicitly modeled, the ordering of events is preset by the analyst, and the modeling of complex accident scenarios is driven by expert judgment. For these reasons, there is currently increasing interest in the development of dynamic PRA methodologies, since they can be used to address the deficiencies of conventional methods listed above.
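
    One common way to realize adaptive sampling with support vector machines, which may or may not match the authors' implementation, is to train an SVM on the runs so far and place new runs where the decision function is nearest zero, i.e. near the estimated failure boundary. The sketch below uses a hypothetical `run_simulator` stand-in for the system model.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(4)
# Hypothetical stand-in for an expensive dynamic PRA simulation:
# "failure" when the state lies outside the unit circle.
run_simulator = lambda X: (X[:, 0] ** 2 + X[:, 1] ** 2 > 1.0).astype(int)

X = rng.uniform(-2, 2, size=(40, 2))          # initial random design
y = run_simulator(X)
for _ in range(5):                            # adaptive refinement rounds
    svm = SVC(kernel="rbf", C=10.0).fit(X, y)
    cand = rng.uniform(-2, 2, size=(2000, 2))
    # Pick candidates closest to the current decision boundary
    idx = np.argsort(np.abs(svm.decision_function(cand)))[:20]
    X = np.vstack([X, cand[idx]])
    y = np.concatenate([y, run_simulator(cand[idx])])
print("total simulator runs:", len(y))
```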

  16. Staff Adaptation in Selected Company

    Štolcová, Jana

    2011-01-01

    The work focuses on the personnel activity of employee adaptation since, nowadays, it is very important to maintain a good and skilled staff. The main aim of this work is to analyze and evaluate the adaptation process of new employees at the headquarters of BILLA, Ltd., which operates more than 200 supermarkets around the Czech Republic. Another task is to propose partial measures to improve the adaptation process in the company. The literature review discusses the importance of...

  17. Transformational adaptation when incremental adaptations to climate change are insufficient.

    Kates, Robert W; Travis, William R; Wilbanks, Thomas J

    2012-05-01

    All human-environment systems adapt to climate and its natural variation. Adaptation to human-induced change in climate has largely been envisioned as increments of these adaptations intended to avoid disruptions of systems at their current locations. In some places, for some systems, however, vulnerabilities and risks may be so sizeable that they require transformational rather than incremental adaptations. Three classes of transformational adaptations are those that are adopted at a much larger scale, that are truly new to a particular region or resource system, and that transform places and shift locations. We illustrate these with examples drawn from Africa, Europe, and North America. Two conditions set the stage for transformational adaptation to climate change: large vulnerability in certain regions, populations, or resource systems; and severe climate change that overwhelms even robust human use systems. However, anticipatory transformational adaptation may be difficult to implement because of uncertainties about climate change risks and adaptation benefits, the high costs of transformational actions, and institutional and behavioral actions that tend to maintain existing resource systems and policies. Implementing transformational adaptation requires effort to initiate it and then to sustain the effort over time. In initiating transformational adaptation, focusing events and multiple stresses are important, combined with local leadership. In sustaining transformational adaptation, it seems likely that supportive social contexts and the availability of acceptable options and resources for actions are key enabling factors. Early steps would include incorporating transformational adaptation into risk management and initiating research to expand the menu of innovative transformational adaptations. PMID:22509036

  18. Adaptation of a radiofrequency glow discharge optical emission spectrometer (RF-GD-OES) to the analysis of light elements (carbon, nitrogen, oxygen and hydrogen) in solids: glove box integration for the analysis of nuclear samples

    The purpose of this work is to use radiofrequency glow discharge optical emission spectrometry to quantitatively determine carbon, nitrogen, oxygen and hydrogen at low concentration (in the ppm range) in nuclear materials. In this study, and before the definitive contamination of the system, work was carried out on non-radioactive materials (steel, pure iron, copper and titanium). As the initial apparatus could not deliver an RF power inducing a reproducible discharge and was not adapted to the analysis of light elements: (1) the radiofrequency system had to be changed; (2) the systems controlling gaseous atmospheres had to be improved in order to obtain analytical signals stemming strictly from the sample; (3) three discharge lamps had to be tested and compared in terms of performance; and (4) the system for collecting light had to be optimized. The modifications brought to the initial system improved the intensities and stabilities of the signals, which allowed lower detection limits (1000 times lower for carbon). The latter are in the ppm range for carbon and about a few tens of ppm for nitrogen and oxygen in pure iron. Calibration curves were plotted in materials presenting very different sputtering rates in order to check the existence of a 'function of analytical transfer', with the purpose of compensating for the lack of reference materials certified in light elements at low concentration. Transposition of this type of function to other matrices remains to be checked. Concerning hydrogen, since no reference material usable with our technique is available, materials certified in deuterium (chosen as a surrogate for hydrogen) were studied in order to demonstrate the feasibility of hydrogen analysis. In parallel to this work, results obtained by modeling an RF discharge show that the performance of the lamp can be improved and that the optical system must be strictly adapted to the glow discharge. (author)

  19. Calibration Transfer without Standards for Spectral Analysis Based on Stability Competitive Adaptive Reweighted Sampling

    张晓羽; 李庆波; 张广军

    2014-01-01

    A novel calibration transfer method based on stability competitive adaptive reweighted sampling (SCARS) is proposed. An informative criterion, the stability index, defined as the absolute value of a regression coefficient divided by its standard deviation, is used together with the root mean squared error of prediction (RMSEP) after transfer. Wavelength variables that are important and insensitive to changes in measurement parameters are selected, and the differences in the responses of different instruments or measurement conditions for a given sample are then eliminated or reduced, improving the calibration transfer results. Moreover, the proposed method compresses the spectral variables, making calibration transfer more stable. The method was evaluated on calibration transfer for NIR analysis of corn measured on different NIR spectrometers. The results show that it corrects the differences between instruments well and improves analytical accuracy. Compared with orthogonal signal correction (OSC), Monte Carlo uninformative variable elimination (MCUVE) and competitive adaptive reweighted sampling (CARS) on the same data, the proposed method gave the best analytical accuracy and was effective for spectroscopic data compression, which simplifies and optimizes the transfer process.
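
    The stability index at the heart of this approach is simple to compute. The sketch below estimates it by refitting a PLS model on random subsamples of the calibration set (an assumed resampling scheme for illustration; the paper's reweighted sampling details are omitted):

```python
# Sketch of the stability index (|mean coefficient| / its standard deviation),
# estimated by refitting a PLS model on random subsamples of the calibration set.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

def stability_index(X, y, n_sub=50, frac=0.8, n_comp=5, seed=0):
    rng = np.random.default_rng(seed)
    n, p = X.shape
    coefs = np.empty((n_sub, p))
    for i in range(n_sub):
        idx = rng.choice(n, size=int(frac * n), replace=False)
        pls = PLSRegression(n_components=n_comp).fit(X[idx], y[idx])
        coefs[i] = pls.coef_.ravel()
    # High index = stable, informative wavelength; SCARS keeps such variables.
    return np.abs(coefs.mean(axis=0)) / (coefs.std(axis=0) + 1e-12)
```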

  20. A Double Kriging Model Method Based on Optimized Sample Points for Importance Measure Analysis

    李大伟; 吕震宙; 张磊刚

    2014-01-01

    For engineering problems involving implicit limit state functions, a double Kriging model method based on optimized sample points for importance measure analysis is discussed in this paper. Firstly, a small number of initial sample points are used to build a Kriging surrogate model relating the basic variables to the response. Subsequent points with relatively high predictive uncertainty, found by global optimization, are then added to the sample set. Finally, the Kriging surrogate model achieves good accuracy with a minimum number of sample points. The relationships between the basic variables and the response function, and between the basic variables and the conditional probability of failure, are replaced by Kriging models, so the computational cost of the importance measure is reduced substantially. Numerical and engineering examples illustrate the engineering applicability and feasibility of the method.
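
    The sequential enrichment step can be sketched with a Gaussian process standing in for the Kriging model; here a random candidate search replaces the paper's global optimizer, and the limit state function g is a toy stand-in:

```python
# Sketch: sequentially add the point of largest predictive uncertainty to the
# design, with a scikit-learn GP standing in for the Kriging surrogate.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def g(x):                                         # toy limit state function
    return np.sin(3 * x[:, 0]) + 0.5 * x[:, 1]

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(8, 2))               # few initial sample points
y = g(X)

for _ in range(15):
    gp = GaussianProcessRegressor(kernel=RBF(), normalize_y=True).fit(X, y)
    cand = rng.uniform(-1, 1, size=(2000, 2))     # random candidate search
    _, std = gp.predict(cand, return_std=True)
    x_new = cand[np.argmax(std)]                  # largest Kriging uncertainty
    X = np.vstack([X, x_new])
    y = np.append(y, g(x_new[None, :]))
```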

  1. IMPORTANT NOTIFICATION

    HR Department

    2009-01-01

    Green plates, removals and importation of personal effects Please note that, as from 1 April 2009, formalities relating to K and CD special series French vehicle plates (green plates), removals and importation of personal effects into France and Switzerland will be dealt with by GS Department (Building 73/3-014, tel. 73683/74407). Importation and purchase of tax-free vehicles in Switzerland, as well as diplomatic privileges, will continue to be dealt with by the Installation Service of HR Department (Building 33/1-011, tel. 73962). HR and GS Departments

  2. ADAPTATION DEBATE, AHMET VEFİK PAŞA AND ZORAKİ TABİB SAMPLE

    Dr. Bayram YILDIZ

    2007-08-01

    Full Text Available Adaptation, a form of theatre that entered Turkish literature with the Tanzimat, was opposed on the grounds that it would harm the development of a national theatre and the structure of Turkish society. On the other hand, adaptation was supported because it would benefit the development of modern theatre. Though common opinion was against adaptation, Ahmet Vefik Pasa's adaptations of Moliere found acceptance. This acceptance might be due to his preference for comedy and for its best performer in world literature, Moliere, and to his adapting plays consistent with Turkish social structure and moral virtues, as in the case of Zoraki Tabip.

  3. Signal sampling circuit

    Louwsma, Simon Minze; Vertregt, Maarten

    2011-01-01

    A sampling circuit for sampling a signal is disclosed. The sampling circuit comprises a plurality of sampling channels adapted to sample the signal in time-multiplexed fashion, each sampling channel comprising a respective track-and-hold circuit connected to a respective analogue to digital converter via a respective output switch. The output switch of each channel opens for a tracking time period when the track-and-hold circuit is in a tracking mode for sampling the signal, and closes for a ...

  4. Reconstructing pictures from sampled data

    Corrections for two important degrading effects in gamma-ray imaging systems are described. The adaptive local operator has the advantage that, given the assumptions of the method, the optimum correction is made for both sources of error simultaneously. The probability operator method, although less soundly based in classical signal processing theory, makes more use of the known statistical properties of possible inputs and consequently might make a better estimate of the true sample values. A separate correction would need to be made for blurring effects. The initial results indicate that the methods are worthy of further investigation.

  5. Strategic Adaptation

    Andersen, Torben Juul

    2015-01-01

    This article provides an overview of theoretical contributions that have influenced the discourse around strategic adaptation, including contingency perspectives, strategic fit reasoning, decision structure, information processing, corporate entrepreneurship, and strategy process. The related concepts of strategic renewal, dynamic managerial capabilities, dynamic capabilities, and strategic response capabilities are discussed and contextualized against strategic responsiveness. The insights derived from this article are used to outline the contours of a dynamic process of strategic adaptation...

  6. F-VIPGI: a new adapted version of VIPGI for FORS2 spectroscopy. Application to a sample of 16 X-ray selected galaxy clusters at 0.6 < z < 1.2

    Nastasi, Alessandro; Fassbender, Rene; Boehringer, Hans; Pierini, Daniele; Verdugo, Miguel; Garilli, Bianca; Franzetti, Paolo

    2013-01-01

    The goal of this paper is twofold. Firstly, we present F-VIPGI, a new version of the VIMOS Interactive Pipeline and Graphical Interface (VIPGI) adapted to handle FORS2 spectroscopic data. Secondly, we investigate the spectro-photometric properties of a sample of galaxies residing in distant X-ray selected galaxy clusters, the optical spectra of which were reduced with this new pipeline. We provide basic technical information about the innovations of the new software and, as a demonstration of the capabilities of the new pipeline, we show results obtained for 16 distant (0.65 < z < 1.25) X-ray luminous galaxy clusters selected within the XMM-Newton Distant Cluster Project. We performed a spectral indices analysis of the extracted optical spectra of their members, based on which we created a library of composite high signal-to-noise ratio spectra representative of passive and star-forming galaxies residing in distant galaxy clusters. The spectroscopic templates are provided to the community in electronic ...

  7. Context-aware adaptive spelling in motor imagery BCI

    Perdikis, S.; Leeb, R.; Millán, J. d. R.

    2016-06-01

    Objective. This work presents a first motor imagery-based, adaptive brain–computer interface (BCI) speller, which is able to exploit application-derived context for improved, simultaneous classifier adaptation and spelling. Online spelling experiments with ten able-bodied users evaluate the ability of our scheme, first, to alleviate non-stationarity of brain signals for restoring the subject’s performances, second, to guide naive users into BCI control avoiding initial offline BCI calibration and, third, to outperform regular unsupervised adaptation. Approach. Our co-adaptive framework combines the BrainTree speller with smooth-batch linear discriminant analysis adaptation. The latter enjoys contextual assistance through BrainTree’s language model to improve online expectation-maximization maximum-likelihood estimation. Main results. Our results verify the possibility to restore single-sample classification and BCI command accuracy, as well as spelling speed for expert users. Most importantly, context-aware adaptation performs significantly better than its unsupervised equivalent and similar to the supervised one. Although no significant differences are found with respect to the state-of-the-art PMean approach, the proposed algorithm is shown to be advantageous for 30% of the users. Significance. We demonstrate the possibility to circumvent supervised BCI recalibration, saving time without compromising the adaptation quality. On the other hand, we show that this type of classifier adaptation is not as efficient for BCI training purposes.

  8. Is adaptation. Truly an adaptation?

    Thais Flores Nogueira Diniz

    2008-04-01

    Full Text Available The article begins by historicizing film adaptation from the arrival of cinema, pointing out the many theoretical approaches under which the process has been seen: from the concept of "the same story told in a different medium" to a comprehensible definition such as "the process through which works can be transformed, forming an intersection of textual surfaces, quotations, conflations and inversions of other texts". To illustrate this new concept, the article discusses Spike Jonze's film Adaptation. according to James Naremore's proposal, which considers the study of adaptation as part of a general theory of repetition, joined with the study of recycling, remaking, and every form of retelling. The film deals with the attempt by the scriptwriter Charles Kaufman, played by Nicolas Cage, to adapt/translate a non-fictional book to the cinema, but ends up with a kind of film which is by no means what it intended to be: a film of action in the model of Hollywood productions. During the process of creation, Charles and his twin brother, Donald, undergo a series of adventures involving some real persons from the world of film, the author and the protagonist of the book, all of them turning into fictional characters in the film. In the film, adaptation then signifies something different from its traditional meaning.

  9. Sample rotating turntable kit for infrared spectrometers

    Eckels, Joel Del; Klunder, Gregory L.

    2008-03-04

    An infrared spectrometer sample rotating turntable kit has a rotatable sample cup containing the sample. The infrared spectrometer has an infrared spectrometer probe for analyzing the sample and the rotatable sample cup is adapted to receive the infrared spectrometer probe. A reflectance standard is located in the rotatable sample cup. A sleeve is positioned proximate the sample cup and adapted to receive the probe. A rotator rotates the rotatable sample cup. A battery is connected to the rotator.

  10. Real-time Visual Tracking of Multiple Targets Using Bootstrap Importance Sampling

    沈乐君; 游志胜; 李晓峰

    2012-01-01

    Ambiguity is the major difficulty in multi-object tracking, caused by the interaction (partial or complete occlusion) of multiple targets. A Markov random field (MRF) can resolve this ambiguity without explicit data association, but general probabilistic inference algorithms for MRFs are computationally expensive. This paper makes three contributions to this problem. First, a new recursive Bayesian tracking framework with a "distributed-centralized-distributed" structure, the bootstrap importance sampling particle filter (BIS-PF), is devised; its importance density incorporates the observation at the current time, avoiding the curse of dimensionality and reducing the computational complexity from exponential to linear growth. Second, a new Monte Carlo strategy, bootstrap importance sampling, is proposed, which exploits the factorization property of the MRF for importance sampling and uses the bootstrap to generate low-cost, high-quality samples, reduce the number of likelihood evaluations and maintain multi-modal distributions. Third, a new marginalization technique is adopted: marginalization using auxiliary-variable sampling, with bootstrap histograms used for density estimation of the marginal posterior distributions. Experimental results show that the proposed algorithm can track a large number of targets in real time, handle complex interactions between targets, and maintain multi-modal distributions after targets disappear.
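
    For readers unfamiliar with the underlying machinery, the sketch below shows a generic single-target bootstrap particle filter step; it illustrates only the importance-sampling mechanics, not the paper's MRF factorization or auxiliary-variable marginalization:

```python
# Generic single-target bootstrap particle filter step (scalar state), shown
# only to make the importance-sampling mechanics concrete.
import numpy as np

def bootstrap_pf_step(particles, weights, z, rng):
    # Propagate through the motion model (here: a random walk) ...
    particles = particles + rng.normal(0.0, 0.5, size=particles.shape)
    # ... reweight by the likelihood of the new observation z ...
    weights = weights * np.exp(-0.5 * ((z - particles) / 0.3) ** 2)
    weights = weights / weights.sum()
    # ... and resample when the effective sample size degenerates.
    if 1.0 / np.sum(weights ** 2) < 0.5 * len(weights):
        idx = rng.choice(len(particles), size=len(particles), p=weights)
        particles = particles[idx]
        weights = np.full(len(particles), 1.0 / len(particles))
    return particles, weights
```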

  11. High Speed Network Sampling

    2005-01-01

    Classical sampling methods play an important role in the current practice of Internet measurement. With today's high-speed networks, routers cannot manage to generate complete Netflow data for every packet; they have to perform restricted sampling. This thesis summarizes some of the most important sampling schemes and their applications before diving into an analysis of the effect of sampling Netflow records.

  13. Analysis of the Importance of the Direct Imprint Rapid Sampling Method in the Environmental Monitoring of Hospital-Acquired Infection

    刘素珍; 张甜; 欧阳琳

    2013-01-01

    Objective: To analyze the importance of the direct imprint rapid sampling method in the environmental monitoring of hospital-acquired infection. Method: From June 2011 to May 2012, 240 surface samples from our hospital (air, the hands of medical staff, object surfaces, instruments, and autoclaved items) were tested for bacterial counts. The observation group was sampled with the direct imprint rapid sampling method and the control group with the traditional saline cotton-swab method, and the detection results of the two groups were compared. Result: The bacterial detection rate of the observation group was significantly higher than that of the control group, and the difference was statistically significant (P<0.05). Conclusion: Compared with the traditional saline cotton-swab method, the direct imprint rapid sampling method has clear advantages for environmental sampling in the monitoring of hospital-acquired infection: it reduces the errors introduced by the many intermediate steps, shortens the time from sampling to the incubator, and is convenient, easy to operate and economical. It is worth popularizing in medical institutions at the same level.

  14. Staff Adaptation in Selected Company

    Haňáčková, Ivana

    2014-01-01

    This thesis deals with the system for managing the adaptation of workers in the chosen company. The adaptation of staff is very important today because it helps businesses retain good and loyal employees. The main goal of this work is to analyze and evaluate the system for managing the adaptation of workers in a particular company, Primagra, a.s., and, where deficiencies are identified, to propose appropriate measures. The theoretical part of the paper summarizes the findings of the lite...

  15. Adaptation and perceptual norms

    Webster, Michael A.; Yasuda, Maiko; Haber, Sara; Leonard, Deanne; Ballardini, Nicole

    2007-02-01

    We used adaptation to examine the relationship between perceptual norms--the stimuli observers describe as psychologically neutral, and response norms--the stimulus levels that leave visual sensitivity in a neutral or balanced state. Adapting to stimuli on opposite sides of a neutral point (e.g. redder or greener than white) biases appearance in opposite ways. Thus the adapting stimulus can be titrated to find the unique adapting level that does not bias appearance. We compared these response norms to subjectively defined neutral points both within the same observer (at different retinal eccentricities) and between observers. These comparisons were made for visual judgments of color, image focus, and human faces, stimuli that are very different and may depend on very different levels of processing, yet which share the property that for each there is a well defined and perceptually salient norm. In each case the adaptation aftereffects were consistent with an underlying sensitivity basis for the perceptual norm. Specifically, response norms were similar to and thus covaried with the perceptual norm, and under common adaptation differences between subjectively defined norms were reduced. These results are consistent with models of norm-based codes and suggest that these codes underlie an important link between visual coding and visual experience.

  16. Adaptive gain control during human perceptual choice

    Cheadle, Samuel; WYART, Valentin; Tsetsos, Konstantinos; Myers, Nicholas; de Gardelle, Vincent; Herce Castañón, Santiago; Summerfield, Christopher

    2014-01-01

    Neural systems adapt to background levels of stimulation. Adaptive gain control has been extensively studied in sensory systems, but overlooked in decision-theoretic models. Here, we describe evidence for adaptive gain control during the serial integration of decision-relevant information. Human observers judged the average information provided by a rapid stream of visual events (samples). The impact that each sample wielded over choices depended on its consistency with the previous sample, w...

  17. Adaptive test

    Kjeldsen, Lars Peter; Eriksen, Mette Rose

    2010-01-01

    The article is an evaluation of the adaptive tests that were introduced in the Danish primary school (folkeskolen). It focuses in particular on assessment in the primary school, and it contributes guidance on assessment, assessment tools and subject-specific assessment materials.

  18. Adaptive Face Recognition via Structured Representation

    ZHANG Yu-hua; ZENG Xiao-ming

    2014-01-01

    In this paper, we propose a face recognition approach, Structured Sparse Representation-based classification, for the case when the number of measurements of the test sample is less than the number of training samples of each subject. When this condition is not satisfied, we exploit the Nearest Subspace approach to classify the test sample. To cover all cases, we combine the two approaches into an adaptive classification method, the Adaptive approach. The Adaptive approach yields greater recognition accuracy than the SRC and CRC_RLS approaches at low sample rates on the Extended Yale B dataset, and it is more efficient than the other two approaches.

  19. Adaptive Metric Kernel Regression

    Goutte, Cyril; Larsen, Jan

    1998-01-01

    Kernel smoothing is a widely used nonparametric pattern recognition technique. By nature, it suffers from the curse of dimensionality and is usually difficult to apply to high input dimensions. In this paper, we propose an algorithm that adapts the input metric used in multivariate regression by minimising a cross-validation estimate of the generalisation error. This allows one to automatically adjust the importance of different dimensions. The improvement in terms of modelling performance is illustrated on a variable selection task where the adaptive metric kernel clearly outperforms the standard...

  20. STUDYING COMPLEX ADAPTIVE SYSTEMS

    John H. Holland

    2006-01-01

    Complex adaptive systems (cas) - systems that involve many components that adapt or learn as they interact - are at the heart of important contemporary problems. The study of cas poses unique challenges: Some of our most powerful mathematical tools, particularly methods involving fixed points, attractors, and the like, are of limited help in understanding the development of cas. This paper suggests ways to modify research methods and tools, with an emphasis on the role of computer-based models, to increase our understanding of cas.

  1. Adaptive metric kernel regression

    Goutte, Cyril; Larsen, Jan

    2000-01-01

    Kernel smoothing is a widely used non-parametric pattern recognition technique. By nature, it suffers from the curse of dimensionality and is usually difficult to apply to high input dimensions. In this contribution, we propose an algorithm that adapts the input metric used in multivariate regression by minimising a cross-validation estimate of the generalisation error. This allows one to automatically adjust the importance of different dimensions. The improvement in terms of modelling performance is illustrated on a variable selection task where the adaptive metric kernel clearly outperforms the standard...
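
    A minimal sketch of the idea (not the authors' code): Nadaraya-Watson regression with one bandwidth per input dimension, tuned by minimizing a leave-one-out cross-validation error so that irrelevant inputs receive wide bandwidths and hence little weight:

```python
# Sketch: Nadaraya-Watson regression with per-dimension bandwidths h, tuned by
# minimizing leave-one-out cross-validation error (toy data, 3 inputs).
import numpy as np
from scipy.optimize import minimize

def nw_predict(Xtr, ytr, Xte, log_h):
    h = np.exp(log_h)                                  # one bandwidth per input
    d2 = ((Xte[:, None, :] - Xtr[None, :, :]) / h) ** 2
    K = np.exp(-0.5 * d2.sum(axis=2))                  # Gaussian product kernel
    return (K @ ytr) / (K.sum(axis=1) + 1e-12)

def loo_error(log_h, X, y):
    idx = np.arange(len(y))
    preds = [nw_predict(X[idx != i], y[idx != i], X[i:i + 1], log_h)[0]
             for i in idx]
    return np.mean((y - np.array(preds)) ** 2)

rng = np.random.default_rng(2)
X = rng.normal(size=(100, 3))                          # only input 0 is relevant
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=100)
res = minimize(loo_error, x0=np.zeros(3), args=(X, y), method="Nelder-Mead")
print(np.exp(res.x))      # small bandwidth for input 0, large for inputs 1-2
```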

  2. Intestinal mucosal adaptation

    Laurie Drozdowski; Alan BR Thomson

    2006-01-01

    Intestinal failure is a condition characterized by malnutrition and/or dehydration as a result of the inadequate digestion and absorption of nutrients. The most common cause of intestinal failure is short bowel syndrome, which occurs when the functional gut mass is reduced below the level necessary for adequate nutrient and water absorption. This condition may be congenital, or may be acquired as a result of a massive resection of the small bowel. Following resection, the intestine is capable of adaptation in response to enteral nutrients as well as other trophic stimuli. Identifying factors that may enhance the process of intestinal adaptation is an exciting area of research with important potential clinical applications.

  3. Deconvolution with correct sampling

    Magain, P; Sohy, S

    1997-01-01

    A new method for improving the resolution of astronomical images is presented. It is based on the principle that sampled data cannot be fully deconvolved without violating the sampling theorem. Thus, the sampled image should not be deconvolved by the total Point Spread Function, but by a narrower function chosen so that the resolution of the deconvolved image is compatible with the adopted sampling. Our deconvolution method gives results which are markedly superior to those of other existing techniques: in particular, it does not produce ringing around point sources superimposed on a smooth background. Moreover, it allows one to perform accurate astrometry and photometry of crowded fields. These improvements are a consequence of both the correct treatment of sampling and the recognition that the most probable astronomical image is not a flat one. The method is also well adapted to the optimal combination of different images of the same object, as can be obtained, e.g., via adaptive optics techniques.

  4. Adaptive Playware in Physical Games

    Lund, Henrik Hautop; Thorsteinsson, Arnar Tumi

    2011-01-01

    We describe how playware and games may adapt to the interaction of the individual user. We hypothesize that in physical games there are individual differences in user interaction capabilities and styles, and that adaptive playware may adapt to the individual user's capabilities, so that the activity automatically will match the capability of the individual user. With small test groups, we investigate how different age groups and gender groups physically interact with some playware games, and find indications of differences between the groups. Despite the small test set, the results are a proof of existence of differences and of the need for adaptation, and therefore we investigate adaptation as an important issue for playware. With simple playware games, we show that the adaptation will speed the physical game up and down to find the appropriate level that matches the reaction speed of...

  5. Adaptation Laboratory

    Huq, Saleemul

    2011-11-15

    Efforts to help the world's poor will face crises in coming decades as climate change radically alters conditions. Action Research for Community Adaptation in Bangladesh (ARCAB) is an action-research programme on responding to climate change impacts through community-based adaptation. Set in Bangladesh at 20 sites that are vulnerable to floods, droughts, cyclones and sea level rise, ARCAB will follow impacts and adaptation as they evolve over half a century or more. National and international 'research partners', collaborating with ten NGO 'action partners' with global reach, seek knowledge and solutions applicable worldwide. After a year setting up ARCAB, we share lessons on the programme's design and move into our first research cycle.

  6. Hedonic "adaptation"

    Paul Rozin

    2008-02-01

    Full Text Available People live in a world in which they are surrounded by potential disgust elicitors such as "used" chairs, air, silverware, and money, as well as excretory activities. People function in this world by ignoring most of these, by active avoidance, reframing, or adaptation. The issue is particularly striking for professions, such as morticians, surgeons, or sanitation workers, in which there is frequent contact with major disgust elicitors. In this study, we study the "adaptation" process to dead bodies as disgust elicitors, by measuring specific types of disgust sensitivity in medical students before and after they have spent a few months dissecting a cadaver. Using the Disgust Scale, we find a significant reduction in disgust responses to death and body envelope violation elicitors, but no significant change in any other specific type of disgust. There is a clear reduction in discomfort at touching a cold dead body, but not in touching a human body which is still warm after death.

  7. Adaptive ethnography

    Berth, Mette

    2005-01-01

    This paper focuses on the use of an adaptive ethnography when studying such phenomena as young people's use of mobile media in a learning perspective. Mobile media such as PDAs and mobile phones have a number of affordances which make them potential tools for learning. However, before we begin to design and develop educational materials for mobile media platforms we must first understand everyday use of and behaviour with a medium such as a mobile phone. The paper outlines the research design for a PhD project on mobile learning which focuses on mobile phones as a way to bridge the gap between formal and informal learning contexts. The paper also proposes several adaptive methodological techniques for studying young people's interaction with mobiles.

  8. Adaptable positioner

    This paper describes the circuits and programs, written in assembly language, developed to control the two DC motors that give mobility to a mechanical arm with two degrees of freedom. As a whole, the system is based on an adaptable regulator designed around an 8-bit microprocessor that, starting from a mode of regulation based on the successive approximation method, evolves to another mode in which a single approximation is sufficient to reach the right position of each motor. (Author) 22 fig. 6 ref

  9. Adaptive positioner

    This paper describes the circuits and programs, written in assembly language, developed to control the two DC motors that give mobility to a mechanical arm with two degrees of freedom. As a whole, the system is based on an adaptable regulator designed around an 8-bit microprocessor that, starting from a mode of regulation based on the successive approximation method, evolves to another mode in which a single approximation is sufficient to reach the right position of each motor. (Author) 6 refs

  10. Adaptive noise

    Viney, Mark; Reece, Sarah E.

    2013-01-01

    In biology, noise implies error and disorder and is therefore something which organisms may seek to minimize and mitigate against. We argue that such noise can be adaptive. Recent studies have shown that gene expression can be noisy, noise can be genetically controlled, genes and gene networks vary in how noisy they are and noise generates phenotypic differences among genetically identical cells. Such phenotypic differences can have fitness benefits, suggesting that evolution can shape noise ...

  11. The Adaptation Gap Report - a Preliminary Assessment

    Alverson, Keith; Olhoff, Anne; Noble, Ian;

    This first Adaptation Gap report provides an equally sobering assessment of the gap between adaptation needs and reality, based on preliminary thinking on how baselines, future goals or targets, and gaps between them might be defined for climate change adaptation. The report focuses on gaps in developing countries in three important areas: finance, technology and knowledge...

  12. Procedures for Sampling Vegetation

    US Fish and Wildlife Service, Department of the Interior — This report outlines vegetation sampling procedures used on various refuges in Region 3. The importance of sampling the response of marsh vegetation to management...

  13. On the Convergence of Adaptive Sequential Monte Carlo Methods

    Beskos, Alexandros; Jasra, Ajay; Kantas, Nikolas; Thiery, Alexandre

    2013-01-01

    In several implementations of Sequential Monte Carlo (SMC) methods it is natural, and important in terms of algorithmic efficiency, to exploit the information of the history of the samples to optimally tune their subsequent propagations. In this article we provide a carefully formulated asymptotic theory for a class of such adaptive SMC methods. The theoretical framework developed here will cover, under assumptions, several commonly used SMC algorithms. There are only limited results a...
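
    A toy tempered SMC sampler of the kind analyzed here, with the random-walk move scale adapted from the current particle cloud, might look as follows (a sketch under simplifying assumptions, not the paper's algorithm):

```python
# Toy tempered SMC sampler whose move scale is adapted from the particle cloud.
# Assumptions: N(0, 1) prior, log_target is a vectorized log-likelihood.
import numpy as np

def adaptive_smc(log_target, n=1000, n_temps=20, seed=0):
    rng = np.random.default_rng(seed)
    temps = np.linspace(0.0, 1.0, n_temps)
    x = rng.normal(size=n)                         # particles from the prior
    for t0, t1 in zip(temps[:-1], temps[1:]):
        logw = (t1 - t0) * log_target(x)           # reweight to next temperature
        w = np.exp(logw - logw.max())
        x = x[rng.choice(n, size=n, p=w / w.sum())]  # resample
        scale = 2.38 * np.std(x)                   # adapted from sample history
        prop = x + rng.normal(0.0, scale, size=n)  # Metropolis move at t1
        d = t1 * (log_target(prop) - log_target(x)) - 0.5 * (prop**2 - x**2)
        x = np.where(np.log(rng.uniform(size=n)) < d, prop, x)
    return x
```

    For instance, adaptive_smc(lambda x: -0.5 * ((x - 3.0) / 0.5) ** 2) returns particles that approximate the resulting tempered posterior; the asymptotic theory in the paper concerns exactly this kind of history-dependent tuning.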

  14. Conceptual Model of User Adaptive Enterprise Application

    Inese Šūpulniece

    2015-07-01

    Full Text Available The user adaptive enterprise application is a software system, which adapts its behavior to an individual user on the basis of nontrivial inferences from information about the user. The objective of this paper is to elaborate a conceptual model of the user adaptive enterprise applications. In order to conceptualize the user adaptive enterprise applications, their main characteristics are analyzed, the meta-model defining the key concepts relevant to these applications is developed, and the user adaptive enterprise application and its components are defined in terms of the meta-model. Modeling of the user adaptive enterprise application incorporates aspects of enterprise modeling, application modeling, and design of adaptive characteristics of the application. The end-user and her expectations are identified as two concepts of major importance not sufficiently explored in the existing research. Understanding these roles improves the adaptation result in the user adaptive applications.

  15. Economics of adaptation to climate change

    This report proposes a general economic framework for the issue of adaptation to climate change in order to help public and private actors to build up efficient adaptation strategies. It proposes a general definition of adaptation, identifies the major stakes for these strategies, and discusses the assessment of global costs of adaptation to climate change. It discusses the role and modalities of public action and gives some examples of possible adaptation measures in some important sectors (building and town planning, energy and transport infrastructures, water and agriculture, ecosystems, insurance). It examines the regional and national dimensions of adaptation and their relationship, and defines steps for implementing an adaptation strategy. It describes and discusses the use of economic tools in the elaboration of an adaptation strategy, i.e. how to take uncertainties into account, which scenarios to choose, how to use economic calculations to assess adaptation policies

  16. Adaptations In Buyer-Seller Relationships

    Brennan, R.; Turnbull, P W

    1995-01-01

    The concept of inter-firm adaptations has been an important component of the IMP Interaction Approach since the first IMP study. Researchers working within the IMP tradition have put forward a number of taxonomies of adaptations, but no satisfactory definition of the concept. The management of inter-firm adaptations is a critical component of relationship portfolio management. In order better to understand this concept, and to support the formulation of inter-firm adaptation strategy, progres...

  17. Adaptive manifold learning.

    Zhang, Zhenyue; Wang, Jing; Zha, Hongyuan

    2012-02-01

    Manifold learning algorithms seek to find a low-dimensional parameterization of high-dimensional data. They heavily rely on the notion of what can be considered as local, how accurately the manifold can be approximated locally, and, last but not least, how the local structures can be patched together to produce the global parameterization. In this paper, we develop algorithms that address two key issues in manifold learning: 1) the adaptive selection of the local neighborhood sizes when imposing a connectivity structure on the given set of high-dimensional data points and 2) the adaptive bias reduction in the local low-dimensional embedding by accounting for the variations in the curvature of the manifold as well as its interplay with the sampling density of the data set. We demonstrate the effectiveness of our methods for improving the performance of manifold learning algorithms using both synthetic and real-world data sets. PMID:21670485

  18. Adaptive management

    Rist, Lucy; Campbell, Bruce Morgan; Frost, Peter

    2013-01-01

    Adaptive management (AM) emerged in the literature in the mid-1970s in response both to a realization of the extent of uncertainty involved in management, and a frustration with attempts to use modelling to integrate knowledge and make predictions. The term has since become increasingly widely used in scientific articles, policy documents and management plans, but both understanding and application of the concept is mixed. This paper reviews recent literature from conservation and natural resource management journals to assess diversity in how the term is used, highlight ambiguities and consider how the concept might be further assessed. AM is currently being used to describe many different management contexts, scales and locations. Few authors define the term explicitly or describe how it offers a means to improve management outcomes in their specific management context. Many do not adhere to the idea...

  19. Sampling methods

    Kılıç, Selim

    2013-01-01

    Most studies related to health are conducted with samples. Samples may be selected with random or nonrandom sampling methods, and the choice of method may vary with the characteristics of the study population, the researchers' aims, and the available facilities. It is necessary to select samples by random sampling methods in order to ensure the representativeness of the study population. The advantages and disadvantages of different sampling methods are presented in this manuscript, which ...

  20. Kinetic Solvers with Adaptive Mesh in Phase Space

    Arslanbekov, Robert R; Frolova, Anna A

    2013-01-01

    An Adaptive Mesh in Phase Space (AMPS) methodology has been developed for solving multi-dimensional kinetic equations by the discrete velocity method. A Cartesian mesh for both configuration (r) and velocity (v) spaces is produced using a tree-of-trees data structure. The mesh in r-space is automatically generated around embedded boundaries and dynamically adapted to local solution properties. The mesh in v-space is created on-the-fly for each cell in r-space. Mappings between neighboring v-space trees are implemented for the advection operator in configuration space. We have developed new algorithms for solving the full Boltzmann and linear Boltzmann equations with AMPS. Several recent innovations were used to calculate the full Boltzmann collision integral with a dynamically adaptive mesh in velocity space: importance sampling, a multi-point projection method, and the variance reduction method. We have developed an efficient algorithm for calculating the linear Boltzmann collision integral for elastic and inelastic...

  1. Boat sampling

    This presentation describes essential boat sampling activities: on-site boat sampling process optimization and qualification; boat sampling of base material (beltline region); boat sampling of weld material (weld No. 4); and problems associated with weld crown variations, RPV shell inner-radius tolerance, local corrosion pitting and water clarity. The equipment used for boat sampling is described too. 7 pictures

  2. Adaptive vehicle motion estimation and prediction

    Zhao, Liang; Thorpe, Chuck E.

    1999-01-01

    Accurate motion estimation and reliable maneuver prediction enable an automated car to react quickly and correctly to the rapid maneuvers of the other vehicles, and so allow safe and efficient navigation. In this paper, we present a car tracking system which provides motion estimation, maneuver prediction and detection of the tracked car. The three strategies employed - adaptive motion modeling, adaptive data sampling, and adaptive model switching probabilities - result in an adaptive interacting multiple model algorithm (AIMM). The experimental results on simulated and real data demonstrate that our tracking system is reliable, flexible, and robust. The adaptive tracking makes the system intelligent and useful in various autonomous driving tasks.

  3. Face Adaptation Without a Face

    Ghuman, Avniel Singh; McDaniel, Jonathan R.; Martin, Alex

    2010-01-01

    Prolonged viewing of a stimulus results in a subsequent perceptual bias [1], [2] and [3]. This perceptual adaptation and the resulting aftereffect reveal important characteristics regarding how perceptual systems are tuned [2], [4], [5] and [6]. These aftereffects occur not only for simple stimulus features but also for high-level stimulus properties [7], [8], [9] and [10]. Here we report a novel cross-category adaptation aftereffect demonstrating that prolonged viewing of a human body withou...

  4. Adaptive method of lines

    Saucez, Ph

    2001-01-01

    The general Method of Lines (MOL) procedure provides a flexible format for the solution of all the major classes of partial differential equations (PDEs) and is particularly well suited to evolutionary, nonlinear wave PDEs. Despite its utility, however, there are relatively few texts that explore it at a more advanced level and reflect the method's current state of development. Written by distinguished researchers in the field, Adaptive Method of Lines reflects the diversity of techniques and applications related to the MOL. Most of its chapters focus on a particular application but also provide a discussion of underlying philosophy and technique. Particular attention is paid to the concept of both temporal and spatial adaptivity in solving time-dependent PDEs. Many important ideas and methods are introduced, including moving grids and grid refinement, static and dynamic gridding, the equidistribution principle and the concept of a monitor function, the minimization of a functional, and the moving finite elem...

  5. Viewer preferences for adaptive playout

    Deshpande, Sachin

    2013-03-01

    Adaptive media playout techniques are used to avoid buffer underflow in a dynamic streaming environment where the available bandwidth may be fluctuating. In this paper we report human perceptions from audio quality studies that we performed on speech and music samples for adaptive audio playout. Test methods based on the ITU-R BS.1534-1 recommendation were used. Studies were conducted for both slow playout and fast playout. Two scales, a coarse scale and a finer scale, were used for the slow and fast audio playout factors. Results from our study can be used to determine acceptable slow and fast playout factors for speech and music content. An adaptive media playout algorithm could use knowledge of these upper and lower bounds on playback speeds to decide its adaptive playback schedule.

  6. Integrating Adaptive Functionality in a LMS

    Kees van der Sluijs

    2009-12-01

    Full Text Available Learning management systems are becoming more and more important in the learning process in both educational and corporate settings. Nowadays they can even be used to serve actual courses to the learner. However, one important feature is lacking in learning management systems: personalization. In this paper we look into this issue of personalization, which enables courses to be adapted to the knowledge level and learning preferences of the user. We briefly review the state of the art in adaptive systems that allow creating adaptive courses. Then, using the popular LMS called CLIX as an example, we look at the authoring of an adaptive Business English course. We demonstrate how such a static course can be made adaptive by using the GALE adaptive engine. We then show that GALE can be integrated into CLIX, and into other LMSs as well, so that personalization and adaptation can become widely established technology.

  7. Adaptation-Based Programming in Haskell

    Tim Bauer

    2011-09-01

    Full Text Available We present an embedded DSL to support adaptation-based programming (ABP) in Haskell. ABP is an abstract model for defining adaptive values, called adaptives, which adapt in response to some associated feedback. We show how our design choices in Haskell motivate higher-level combinators and constructs and help us derive more complicated compositional adaptives. We also show that an important specialization of ABP is in support of reinforcement learning constructs, which optimize adaptive values based on a programmer-specified objective function. This permits ABP users to easily define adaptive values that express uncertainty anywhere in their programs. Over repeated executions, these adaptive values adjust to more efficient ones and enable the user's programs to self-optimize. The design of our DSL depends significantly on the use of type classes. We will illustrate, along with presenting our DSL, how the use of type classes can support the gradual evolution of DSLs.

  8. Adaptation-Based Programming in Haskell

    Bauer, Tim; Fern, Alan; Pinto, Jervis; 10.4204/EPTCS.66.1

    2011-01-01

    We present an embedded DSL to support adaptation-based programming (ABP) in Haskell. ABP is an abstract model for defining adaptive values, called adaptives, which adapt in response to some associated feedback. We show how our design choices in Haskell motivate higher-level combinators and constructs and help us derive more complicated compositional adaptives. We also show that an important specialization of ABP is in support of reinforcement learning constructs, which optimize adaptive values based on a programmer-specified objective function. This permits ABP users to easily define adaptive values that express uncertainty anywhere in their programs. Over repeated executions, these adaptive values adjust to more efficient ones and enable the user's programs to self-optimize. The design of our DSL depends significantly on the use of type classes. We will illustrate, along with presenting our DSL, how the use of type classes can support the gradual evolution of DSLs.

  9. Biological sample collector

    Murphy, Gloria A.

    2010-09-07

    A biological sample collector is adapted to a collect several biological samples in a plurality of filter wells. A biological sample collector may comprise a manifold plate for mounting a filter plate thereon, the filter plate having a plurality of filter wells therein; a hollow slider for engaging and positioning a tube that slides therethrough; and a slide case within which the hollow slider travels to allow the tube to be aligned with a selected filter well of the plurality of filter wells, wherein when the tube is aligned with the selected filter well, the tube is pushed through the hollow slider and into the selected filter well to sealingly engage the selected filter well and to allow the tube to deposit a biological sample onto a filter in the bottom of the selected filter well. The biological sample collector may be portable.

  10. COPING: IMPORTANCE OF CONTEXTUAL FACTORS AND MEASUREMENT

    Kaya, Cahit

    2014-01-01

    Coping skills cover an important area in the rehabilitation counseling field. The bipolar view of coping skills as being either adaptive or not adaptive has been prevalent throughout the literature, while the influence of contextual factors on coping skills has been underemphasized. Recent research indicates that contextual factors play a major role in coping skills. This paper examines the importance of contextual factors for coping skills, particularly in relation to assessment issues in rehabilitation c...

  11. Adaptive gain control during human perceptual choice

    Cheadle, Samuel; Wyart, Valentin; Tsetsos, Konstantinos; Myers, Nicholas; de Gardelle, Vincent; Castañón, Santiago Herce; Summerfield, Christopher

    2015-01-01

    Neural systems adapt to background levels of stimulation. Adaptive gain control has been extensively studied in sensory systems, but overlooked in decision-theoretic models. Here, we describe evidence for adaptive gain control during the serial integration of decision-relevant information. Human observers judged the average information provided by a rapid stream of visual events (samples). The impact that each sample wielded over choices depended on its consistency with the previous sample, with more consistent or expected samples wielding the greatest influence over choice. This bias was also visible in the encoding of decision information in pupillometric signals, and in cortical responses measured with functional neuroimaging. These data can be accounted for with a new serial sampling model in which the gain of information processing adapts rapidly to reflect the average of the available evidence. PMID:24656259
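
    A cartoon of the serial sampling idea (an assumed functional form for illustration, not the authors' fitted model) is sketched below: each sample's impact on the accumulated evidence is scaled by a gain that is largest when the sample agrees with the running average of past evidence.

```python
# Cartoon of serial evidence integration with adaptive gain: a sample that is
# consistent with the running average of past evidence wields more influence.
import numpy as np

def gain_weighted_choice(samples, tau=1.0):
    avg, total = 0.0, 0.0
    for k, s in enumerate(samples, start=1):
        gain = np.exp(-abs(s - avg) / tau)   # high gain for consistent samples
        total += gain * s                    # gain-weighted accumulation
        avg += (s - avg) / k                 # running average of the evidence
    return 1 if total > 0 else -1            # binary choice
```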

  12. Optimizing heterologous expression in Dictyostelium: importance of 5' codon adaptation

    Vervoort, EB; van Ravestein, A; van Peij, NNME; Heikoop, JC; van Haastert, OJM; Verheijden, GF; Linskens, MHK; Heikoop, Judith C.; Haastert, Peter J.M. van; Verheijden, Gijs F.

    2000-01-01

    Expression of heterologous proteins in Dictyostelium discoideum presents unique research opportunities, such as the functional analysis of complex human glycoproteins after random mutagenesis. In one study, human chorionic gonadotropin (hCG) and human follicle stimulating hormone were expressed in D

  13. Importance of heterotrophic adaptations of corals to maintain energy reserves

    Seemann, Janina; Carballo-Bolanos, R.; Berry, K. L.; González, C. T.; Richter, Claudio; Leinfelder, R. R.

    2012-01-01

    We examined the ability of the two hard coral species Agaricia tenuifolia and Porites furcata to store lipids under natural conditions, under experimental starvation (weekly vs. daily feeding) and under heat stress. P. furcata fed more and accumulated greater lipid quantities than A. tenuifolia. Overall, lipid levels in situ showed an inverse relationship to turbidity and eutrophication with highest values at the least anthropogenically impacted site. Although zooxanthellae, chlorophyll a con...

  14. Plant sphingolipids: Their importance in cellular organization and adaption.

    Michaelson, Louise V; Napier, Johnathan A; Molino, Diana; Faure, Jean-Denis

    2016-09-01

    Sphingolipids and their phosphorylated derivatives are ubiquitous bio-active components of cells. They are structural elements in the lipid bilayer and contribute to the dynamic nature of the membrane. They have been implicated in many cellular processes in yeast and animal cells, including aspects of signaling, apoptosis, and senescence. Although sphingolipids have a better defined role in animal systems, they have been shown to be central to many essential processes in plants including but not limited to, pollen development, signal transduction and in the response to biotic and abiotic stress. A fuller understanding of the roles of sphingolipids within plants has been facilitated by classical biochemical studies and the identification of mutants of model species. Recently the development of powerful mass spectrometry techniques hailed the advent of the emerging field of lipidomics enabling more accurate sphingolipid detection and quantitation. This review will consider plant sphingolipid biosynthesis and function in the context of these new developments. This article is part of a Special Issue entitled: Plant Lipid Biology edited by Kent D. Chapman and Ivo Feussner. PMID:27086144

  15. Use Case Design for AdaptIVe

    Wolter, Stefan; Kelsch, Johann

    2014-01-01

    AdaptIVe is a large scale European project on vehicle automation and the pertaining human-machine interaction. The use case design process is a crucial part of the system design process and a part of the human-vehicle integration subproject. This paper explains the methodology for describing use cases in AdaptIVe. They are primarily based on sequence diagrams with main and alternative flows.

  16. Economics of adaptation to climate change

    Perthuis, Ch.; Hallegatte, St.; Lecocq, F.

    2010-02-15

    This report proposes a general economic framework for the issue of adaptation to climate change in order to help public and private actors to build up efficient adaptation strategies. It proposes a general definition of adaptation, identifies the major stakes for these strategies, and discusses the assessment of global costs of adaptation to climate change. It discusses the role and modalities of public action and gives some examples of possible adaptation measures in some important sectors (building and town planning, energy and transport infrastructures, water and agriculture, ecosystems, insurance). It examines the regional and national dimensions of adaptation and their relationship, and defines steps for implementing an adaptation strategy. It describes and discusses the use of economic tools in the elaboration of an adaptation strategy, i.e. how to take uncertainties into account, which scenarios to choose, how to use economic calculations to assess adaptation policies

  17. Adaptively robust filtering with classified adaptive factors

    CUI Xianqiang; YANG Yuanxi

    2006-01-01

    The key problems in applying adaptively robust filtering to navigation are to establish an equivalent weight matrix for the measurements and a suitable adaptive factor for balancing the contributions of the measurements and the predicted state information to the state parameter estimates. In this paper, an adaptively robust filtering with classified adaptive factors was proposed, based on the principles of adaptively robust filtering and bi-factor robust estimation for correlated observations. According to the constant velocity model of Kalman filtering, the state parameter vector was divided into two groups, namely position and velocity. The estimator of the adaptively robust filtering with classified adaptive factors was derived, and the calculation expressions of the classified adaptive factors were presented. Test results show that the adaptively robust filtering with classified adaptive factors is not only robust in controlling measurement outliers and disturbances of the kinematic state but also reasonable in balancing the contributions of the predicted position and velocity, respectively, and its filtering accuracy is superior to that of the adaptively robust filter with a single adaptive factor based on the discrepancy of the predicted position or the predicted velocity.

  18. Sampling with Costs

    Skufca, Joseph D; Ben-Avraham, Daniel

    2015-01-01

    We consider the problem of choosing the best of $n$ samples, out of a large random pool, when the sampling of each member is associated with a certain cost. The quality (worth) of the best sample clearly increases with $n$, but so do the sampling costs, and one important question is how many to sample for optimal gain (worth minus costs). If, in addition, the assessment of worth for each sample is associated with some "measurement error," the perceived best out of $n$ might not be the actual ...
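
    The trade-off has a clean closed form in a toy case: for samples uniform on [0, 1] the best of $n$ has expected worth n/(n+1), so with a linear cost c per sample the expected gain n/(n+1) - c*n peaks near n* = 1/sqrt(c) - 1. A quick numerical check (illustration only; the paper treats general distributions and measurement error):

```python
# Numerical check of the toy optimum: worth of best-of-n minus sampling costs.
import numpy as np

c = 0.01                                   # cost per sample
n = np.arange(1, 200)
gain = n / (n + 1.0) - c * n               # E[max of n uniforms] - total cost
print(n[np.argmax(gain)], 1.0 / np.sqrt(c) - 1.0)   # both give n* = 9
```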

  19. Adaptation and diversification on islands.

    Losos, Jonathan B; Ricklefs, Robert E

    2009-02-12

    Charles Darwin's travels on HMS Beagle taught him that islands are an important source of evidence for evolution. Because many islands are young and have relatively few species, evolutionary adaptation and species proliferation are obvious and easy to study. In addition, the geographical isolation of many islands has allowed evolution to take its own course, free of influence from other areas, resulting in unusual faunas and floras, often unlike those found anywhere else. For these reasons, island research provides valuable insights into speciation and adaptive radiation, and into the relative importance of contingency and determinism in evolutionary diversification. PMID:19212401

  20. Genetic structure of different cat populations in Europe and South America at a microgeographic level: importance of the choice of an adequate sampling level in the accuracy of population genetics interpretations

    Manuel Ruiz-Garcia

    1999-12-01

    The phenotypic markers coat color, pattern, and hair length of natural domestic cat populations observed in four cities (Barcelona, Catalonia; Palma Majorca, Balearic Islands; Rimini, Italy; and Buenos Aires, Argentina) were studied at a microgeographical level. Various population genetics techniques revealed that the degree of genetic differentiation between populations of Felis catus within these cities is relatively low when compared with that found between populations of other mammals. Two different levels of sampling were used. One was that of "natural" colonies of cat families living together at specific points within the cities, and the other referred to "artificial" subpopulations, or groups of colonies, inhabiting the same district within a city. For the two sampling levels, some of the results were identical: (1) little genic heterogeneity, (2) existence of panmixia, (3) similar levels of expected heterozygosity in all populations analyzed, (4) no spatial autocorrelation, with certain differentiation of the Buenos Aires population compared to the others, and (5) very high correlations between colonies and subpopulations with the first factors from a Q factor analysis. Nevertheless, other population genetic statistics were greatly affected by the choice of sampling level. This was the case for: (1) the amount of heterogeneity of the FST and GST statistics between the cities, which was greater at the subpopulation level than at the colony level, (2) the existence of correlations between genic differentiation statistics and size variables at the subpopulation level but not at the colony level, and (3) the relationships between the genetic variables and the principal factors of the R factorial analysis. This suggests that care should be taken in the choice of the sampling unit for inferences on population genetics to be valid at the microgeographical level.

  1. RESEARCH SAMPLING

    NISHA MD

    2012-01-01

    No aspect of the research plan is more critical for assuring the usefulness of a study than the sampling strategy. It determines whether the results of the study can be applied as evidence, and it contributes to the trustworthiness of the results. The sampling strategy is a critical part of research design. An appropriate sampling plan is vital for drawing the right conclusions from a study. Good sampling is critical for the confident application of the study findings to other people, settings, or...

  2. Solving delay differential equations in S-ADAPT by method of steps.

    Bauer, Robert J; Mo, Gary; Krzyzanski, Wojciech

    2013-09-01

    S-ADAPT is a version of the ADAPT program that contains additional simulation and optimization abilities such as parametric population analysis. S-ADAPT utilizes LSODA to solve ordinary differential equations (ODEs), an algorithm designed for large-dimension non-stiff and stiff problems. However, S-ADAPT does not have a solver for delay differential equations (DDEs). Our objective was to implement in S-ADAPT a DDE solver using the method of steps. The method of steps allows one to solve virtually any DDE system by transforming it to an ODE system. The solver was validated for scalar linear DDEs with one delay and bolus and infusion inputs, for which explicit analytic solutions were derived. Solutions of nonlinear DDE problems coded in S-ADAPT were validated by comparing them with ones obtained by the MATLAB DDE solver dde23. The estimation of parameters was tested on MATLAB-simulated population pharmacodynamics data. The S-ADAPT solutions for the DDE problems agreed with both the explicit solutions and the MATLAB solutions to at least 7 significant digits. The population parameter estimates obtained using importance sampling expectation-maximization in S-ADAPT agreed with the ones used to generate the data. PMID:23810514
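    A minimal Python sketch of the method of steps, using SciPy's solve_ivp as the ODE engine in place of LSODA inside S-ADAPT: on each interval of length equal to the delay, the lagged term is known from the previous segment, so the DDE reduces to an ODE.

        import numpy as np
        from scipy.integrate import solve_ivp

        # x'(t) = -x(t - 1),  x(t) = 1 for t <= 0.  On each interval
        # [k, k+1] the delayed term comes from the previous segment.
        tau = 1.0
        segments = []                      # dense solutions, one per interval

        def history(t):
            # evaluate x(t) from the initial function or an earlier segment
            if t <= 0:
                return 1.0
            k = min(int(np.floor(t / tau)), len(segments) - 1)
            return float(segments[k].sol(t))

        x0 = 1.0
        for k in range(5):                 # integrate over [0, 5]
            seg = solve_ivp(lambda t, x: [-history(t - tau)],
                            (k * tau, (k + 1) * tau), [x0],
                            dense_output=True, max_step=0.01)
            segments.append(seg)
            x0 = seg.y[0, -1]

        print(history(2.5))                # exact solution is piecewise polynomial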

  3. Adaptive independent component analysis to analyze electrocardiograms

    Yim, Seong-Bin; Szu, Harold H.

    2001-03-01

    In this work, we apply an adaptive version of independent component analysis (ICA) to nonlinear measurements of electrocardiographic (ECG) signals for the potential detection of abnormal conditions in the heart. In principle, unsupervised ICA neural networks can demix the components of measured ECG signals. However, the nonlinear pre-amplification and post-measurement processing make the linear ICA model no longer valid. We therefore propose an adaptive rectification pre-processing step to linearize the ECG preamplifier, after which linear ICA is applied iteratively until the outputs reach stable kurtosis values of their own. We call this new approach adaptive ICA. Each component may correspond to an individual heart function, either normal or abnormal. Adaptive ICA neural networks have the potential to make abnormal components more apparent, even when they are masked by normal components in the original measured signals. This is particularly important for diagnosis well in advance of the actual onset of a heart attack, when abnormalities in the original measured ECG signals may be difficult to detect. This is the first known work that applies adaptive ICA to ECG signals beyond noise extraction, to the detection of abnormal heart function.
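    The linear demixing stage can be sketched with scikit-learn's FastICA on synthetic two-channel mixtures; the toy waveforms, the mixing matrix and the kurtosis check are illustrative assumptions, not the paper's data or implementation.

        import numpy as np
        from scipy.stats import kurtosis
        from sklearn.decomposition import FastICA

        rng = np.random.default_rng(0)
        t = np.linspace(0, 10, 4000)
        s1 = np.sign(np.sin(2 * np.pi * 1.2 * t))     # spiky "heartbeat"
        s2 = np.sin(2 * np.pi * 0.3 * t)              # slow baseline drift
        S = np.c_[s1, s2] + 0.05 * rng.standard_normal((t.size, 2))
        A = np.array([[1.0, 0.6], [0.4, 1.0]])        # unknown mixing matrix
        X = S @ A.T                                   # observed channels

        ica = FastICA(n_components=2, random_state=0)
        components = ica.fit_transform(X)
        # stable, non-Gaussian kurtosis of each output is the stopping
        # cue mentioned in the abstract
        print(kurtosis(components, axis=0))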

  4. Adaptive Image Denoising by Mixture Adaptation.

    Luo, Enming; Chan, Stanley H; Nguyen, Truong Q

    2016-10-01

    We propose an adaptive learning procedure to learn patch-based image priors for image denoising. The new algorithm, called the expectation-maximization (EM) adaptation, takes a generic prior learned from a generic external database and adapts it to the noisy image to generate a specific prior. Different from existing methods that combine internal and external statistics in ad hoc ways, the proposed algorithm is rigorously derived from a Bayesian hyper-prior perspective. There are two contributions of this paper. First, we provide full derivation of the EM adaptation algorithm and demonstrate methods to improve the computational complexity. Second, in the absence of the latent clean image, we show how EM adaptation can be modified based on pre-filtering. The experimental results show that the proposed adaptation algorithm yields consistently better denoising results than the one without adaptation and is superior to several state-of-the-art algorithms. PMID:27416593
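    A single EM-adaptation step for the means of a Gaussian-mixture patch prior can be sketched as follows; the shrinkage weight rho standing in for the hyper-prior strength, and the mean-only update, are simplifications of the full derivation in the paper.

        import numpy as np
        from scipy.stats import multivariate_normal

        def em_adapt_means(weights, means, covs, patches, rho=0.5):
            # E-step: responsibilities of each mixture component
            K = len(weights)
            log_r = np.stack([np.log(weights[k]) +
                              multivariate_normal.logpdf(patches, means[k], covs[k])
                              for k in range(K)], axis=1)
            log_r -= log_r.max(axis=1, keepdims=True)
            r = np.exp(log_r)
            r /= r.sum(axis=1, keepdims=True)

            # M-step with shrinkage: components with little support in the
            # new image stay close to the generic prior means
            new_means = means.copy()
            for k in range(K):
                nk = r[:, k].sum()
                if nk > 1e-8:
                    mk = (r[:, k][:, None] * patches).sum(axis=0) / nk
                    w = nk / (nk + rho * len(patches))   # data-vs-prior weight
                    new_means[k] = w * mk + (1 - w) * means[k]
            return new_means

        rng = np.random.default_rng(0)
        K, d = 3, 4
        weights = np.full(K, 1.0 / K)
        means = rng.standard_normal((K, d))            # "generic" prior means
        covs = np.array([np.eye(d)] * K)
        patches = rng.standard_normal((500, d)) + 1.0  # image-specific data
        means_adapted = em_adapt_means(weights, means, covs, patches)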

  5. Adaptive estimation of qubits by symmetry measurements

    Happ, Christof J.; Freyberger, Matthias

    2008-01-01

    We analyze quantum state estimation for finite samples based on symmetry information. The measurement concept used compares an unknown qubit to a reference state. We explicitly describe an adaptive strategy that enhances the estimation fidelity of these measurements.

  6. Mexico: Imports or exports?

    This presentation provides an overview of Mexico's energy sector. Proven oil reserves place Mexico in ninth position in the world and fourth largest in natural gas reserves. Energy is one of the most important economic activities of the country, representing 3 per cent of Gross Domestic Product (GDP). Oil exports represent 8.4 per cent of total exports. Approximately 40 per cent of total public investment is earmarked for energy projects. The author discusses energy resources and energy sector limitations. The energy sector plan for the period 2001-2006 is discussed. Its goals are to ensure energy supply, to develop the energy sector, to stimulate participation of Mexican enterprises, to promote renewable energy sources, and to strengthen international energy cooperation. The regulatory framework is being adapted to increase private investment. Some graphs are presented, displaying the primary energy production and primary energy consumption. Energy sector reforms are reviewed, as are electricity and natural gas reforms. The energy sector demand for 2000-2010 and investment requirements are reviewed, as well as fuel consumption for power generation. The author discusses the National Pipeline System (SNG) and the bottlenecks caused by pressure efficiency in the northeast, flow restriction on several pipeline segments, variability of the Petroleos Mexicanos (PEMEX) own use, and pressure drop on central regions. The entire prospect for natural gas in the country is reviewed, along with the Strategic Gas Program (PEG) consisting of 20 projects, including 4 non-associated natural gas, 9 exploration and 7 optimization. A section dealing with multiple service contracts is included in the presentation. The authors conclude by stating that the priority is a national energy policy to address Mexico's energy security requirements, to increase natural gas production while promoting the diversification of imports, and a regulatory framework to be updated in light of current

  7. Farming System Evolution and Adaptive Capacity: Insights for Adaptation Support

    Jami L. Dixon

    2014-02-01

    Studies of climate impacts on agriculture and adaptation often provide current or future assessments, ignoring the historical contexts farming systems are situated within. We investigate how historical trends have influenced farming system adaptive capacity in Uganda using data from household surveys, semi-structured interviews, focus-group discussions and observations. By comparing two farming systems, we note three major findings: (1) similar trends in farming system evolution have had differential impacts on the diversity of farming systems; (2) trends have contributed to the erosion of informal social and cultural institutions and an increasing dependence on formal institutions; and (3) trade-offs between components of adaptive capacity are made at the farm scale, thus influencing farming system adaptive capacity. To identify the actual impacts of future climate change and variability, it is important to recognize the dynamic nature of adaptation. In practice, areas identified for further adaptation support include: a shift away from a one-size-fits-all approach toward the identification and integration of appropriate modern farming methods; a greater focus on building inclusive formal and informal institutions; and a more nuanced understanding regarding the roles and decision-making processes of influential, but external, actors. More research is needed to understand farm-scale trade-offs and the resulting impacts across spatial and temporal scales.

  8. Adaptation and creativity in cultural context

    Leonora M. Cohen

    2012-06-01

    Adaptation is the fit between the individual and the environment. The dynamic interplay between person, culture, and environment is one of the most important issues in analyzing creativity. Adaptation is defined as the fit or adjustment of the individual to external conditions, but adaptation can also mean moving from one environment to another more suitable one, or even forcing the environment to adapt in response to creative efforts. Culture impacts creativity by limiting acceptable boundaries, yet it provides the artifacts used in creating. Culture is in turn impacted and changed by creative efforts. Tight conformity to confining environments or cultures can stifle creativity. The creator must be aware of cultural values and not overstep these boundaries for work to be accepted. A developmental continuum of adaptive, creative behaviors suggests a shift from individual adaptation to the environment to adaptation by the world to the individual.

  9. Coevolution Based Adaptive Monte Carlo Localization (CEAMCL)

    Luo Ronghua; Hong Bingrong

    2004-01-01

    An adaptive Monte Carlo localization algorithm based on coevolution mechanism of ecological species is proposed. Samples are clustered into species, each of which represents a hypothesis of the robot's pose. Since the coevolution between the species ensures that the multiple distinct hypotheses can be tracked stably, the problem of premature convergence when using MCL in highly symmetric environments can be solved. And the sample size can be adjusted adaptively over time according to the unce...

  10. Adaptive Modular Playware

    Lund, Henrik Hautop; Þorsteinsson, Arnar Tumi

    2011-01-01

    In this paper, we describe the concept of adaptive modular playware, where the playware adapts to the interaction of the individual user. We hypothesize that there are individual differences in user interaction capabilities and styles, and that adaptive playware may adapt to the individual user’s...

  11. Filter Bank Design for Subband Adaptive Filtering

    Haan, Jan Mark de

    2001-01-01

    Adaptive filtering is an important subject in the field of signal processing and has numerous applications in fields such as speech processing and communications. Examples in speech processing include speech enhancement, echo- and interference- cancellation, and speech coding. Subband filter banks have been introduced in the area of adaptive filtering in order to improve the performance of time domain adaptive filters. The main improvements are faster convergence speed and the reduction of co...
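    For reference, the fullband baseline whose convergence subband structures are designed to improve is the normalized LMS adaptive filter; a minimal Python sketch with illustrative step size and tap count:

        import numpy as np

        def nlms(x, d, taps=32, mu=0.5, eps=1e-6):
            # time-domain normalized LMS: identify the filter mapping x to d
            w = np.zeros(taps)
            e = np.zeros_like(d)
            for n in range(taps - 1, len(x)):
                u = x[n - taps + 1:n + 1][::-1]      # x[n], x[n-1], ...
                e[n] = d[n] - w @ u
                w += mu * e[n] * u / (u @ u + eps)   # normalized update
            return w, e

        rng = np.random.default_rng(1)
        h = rng.standard_normal(32) * np.exp(-np.arange(32) / 8)  # echo path
        x = rng.standard_normal(20000)
        d = np.convolve(x, h)[:len(x)]
        w, e = nlms(x, d)                            # w converges toward h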

  12. Climate Policy Must Favour Mitigation Over Adaptation

    SCHUMACHER, Ingmar

    2016-01-01

    In climate change policy, adaptation tends to be viewed as being as important as mitigation. In this article we present a simple yet general argument for which mitigation must be preferred to adaptation. The argument rests on the observation that mitigation is a public good while adaptation is a private one. This implies that the more one disaggregates the units in a social welfare function, i.e. the more one teases out the public good nature of mitigation, the lower is average income and thus less ...

  13. Device-aware Adaptation of Websites

    Barsomo, Milad; Hurtig, Mats

    2014-01-01

    The use of handheld devices such as smart phones and tablets has exploded in the last few years. These mobile devices differ from regular desktops in having limited battery power, processing power, bandwidth, internal memory, and screen size. With many device types, and with mobile adaptation being done in many ways, it is therefore important for websites to adapt to mobile users. This thesis characterises how websites currently adapt to mobile devices. For our analysis and data collect...

  14. A Survey on Adaptation to Climate Change

    Dinda, Soumyananda

    2015-01-01

    In this 21st century, human civilization faces its toughest challenge: tackling climate change for sustainable development. Civil society must adapt to climate change and reduce vulnerability for non-declining welfare. This paper reviews major papers on adaptation to climate change and provides an overview of climate change and of developing adaptive mechanisms across the globe. Drawing on the major articles, this study provides clarity on the concept of adaptation, types of adapta...

  15. Sampling Development

    Adolph, Karen E.; Robinson, Scott R.

    2011-01-01

    Research in developmental psychology requires sampling at different time points. Accurate depictions of developmental change provide a foundation for further empirical studies and theories about developmental mechanisms. However, overreliance on widely spaced sampling intervals in cross-sectional and longitudinal designs threatens the validity of the enterprise. This article discusses how to sample development in order to accurately discern the shape of developmental change. The ideal solutio...

  16. Bayesian analysis for exponential random graph models using the adaptive exchange sampler

    Jin, Ick Hoon

    2013-01-01

    Exponential random graph models have been widely used in social network analysis. However, these models are extremely difficult to handle from a statistical viewpoint, because of the existence of intractable normalizing constants. In this paper, we consider a fully Bayesian analysis for exponential random graph models using the adaptive exchange sampler, which solves the issue of intractable normalizing constants encountered in Markov chain Monte Carlo (MCMC) simulations. The adaptive exchange sampler can be viewed as a MCMC extension of the exchange algorithm, and it generates auxiliary networks via an importance sampling procedure from an auxiliary Markov chain running in parallel. The convergence of this algorithm is established under mild conditions. The adaptive exchange sampler is illustrated using a few social networks, including the Florentine business network, molecule synthetic network, and dolphins network. The results indicate that the adaptive exchange algorithm can produce more accurate estimates than approximate exchange algorithms, while maintaining the same computational efficiency.

  17. Bayesian Analysis for Exponential Random Graph Models Using the Adaptive Exchange Sampler.

    Jin, Ick Hoon; Yuan, Ying; Liang, Faming

    2013-10-01

    Exponential random graph models have been widely used in social network analysis. However, these models are extremely difficult to handle from a statistical viewpoint, because of the intractable normalizing constant and model degeneracy. In this paper, we consider a fully Bayesian analysis for exponential random graph models using the adaptive exchange sampler, which solves the intractable normalizing constant and model degeneracy issues encountered in Markov chain Monte Carlo (MCMC) simulations. The adaptive exchange sampler can be viewed as a MCMC extension of the exchange algorithm, and it generates auxiliary networks via an importance sampling procedure from an auxiliary Markov chain running in parallel. The convergence of this algorithm is established under mild conditions. The adaptive exchange sampler is illustrated using a few social networks, including the Florentine business network, molecule synthetic network, and dolphins network. The results indicate that the adaptive exchange algorithm can produce more accurate estimates than approximate exchange algorithms, while maintaining the same computational efficiency. PMID:24653788
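    The exchange step that the adaptive sampler extends can be conveyed with a toy Python example: the likelihood of a normal location model is treated as if its normalizing constant were unknown, and an exact auxiliary draw makes the constants cancel. The paper's refinement, generating auxiliary networks by importance sampling from a parallel chain, is omitted here.

        import numpy as np

        def log_f(y, theta):                 # unnormalized log-likelihood
            return -0.5 * np.sum((y - theta) ** 2)

        rng = np.random.default_rng(2)
        y = rng.normal(1.5, 1.0, size=20)    # observed data
        theta, chain = 0.0, []
        for _ in range(5000):
            theta_p = theta + 0.5 * rng.standard_normal()  # RW proposal
            y_aux = rng.normal(theta_p, 1.0, size=y.size)  # exact auxiliary draw
            # normalizing constants cancel in the exchange ratio (flat prior)
            log_a = (log_f(y, theta_p) + log_f(y_aux, theta)
                     - log_f(y, theta) - log_f(y_aux, theta_p))
            if np.log(rng.uniform()) < log_a:
                theta = theta_p
            chain.append(theta)
        print(np.mean(chain[1000:]))         # close to the sample mean of y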

  18. Sample Size Calculations

    Noordzij, Marlies; Dekker, Friedo W.; Zoccali, Carmine; Jager, Kitty J.

    2011-01-01

    The sample size is the number of patients or other experimental units that need to be included in a study to answer the research question. Pre-study calculation of the sample size is important; if a sample size is too small, one will not be able to detect an effect, while a sample that is too large may be a waste of time and money. Methods to calculate the sample size are explained in statistical textbooks, but because there are many different formulas available, it can be difficult for inves...
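    For instance, the standard normal-approximation formula for comparing two means gives the sample size per group as n = 2((z_(1-alpha/2) + z_(1-beta)) * sigma / delta)^2; a short Python version with illustrative numbers:

        from scipy.stats import norm

        def n_per_group(delta, sigma, alpha=0.05, power=0.80):
            # n to detect mean difference delta (SD sigma) between two groups
            z_a = norm.ppf(1 - alpha / 2)
            z_b = norm.ppf(power)
            return 2 * ((z_a + z_b) * sigma / delta) ** 2

        print(n_per_group(delta=5.0, sigma=10.0))   # about 63 per group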

  19. A method of language sampling

    Rijkhoff, Jan; Bakker, Dik; Hengeveld, Kees;

    1993-01-01

    In recent years more attention has been paid to the quality of language samples in typological work. Without an adequate sampling strategy, samples may suffer from various kinds of bias. In this article we propose a sampling method in which the genetic criterion is taken as the most important: samples ...

  20. How important is importance for prospective memory?

    Stefan eWalter; Beat eMeier

    2014-01-01

    Forgetting to carry out an intention as planned can have serious consequences in everyday life. People sometimes even forget intentions that they consider very important. Here, we review the literature on the impact of importance on prospective memory performance. We highlight different methods used to manipulate the importance of a prospective memory task, such as providing rewards, importance relative to other ongoing activities, absolute importance, and providing social motives. Moreover...

  1. Innovation Model of the Concept of Professional Adaptation of Personnel

    Kurina Nataliya S.; Darchenko Nataliya D.

    2013-01-01

    The article considers the essence and types of adaptation as an important element of the modern theory of personnel management. It analyses problems of the practical adaptation of personnel at domestic and Russian enterprises. It demonstrates the urgency of the issue and offers a concept of professional adaptation: adaptation management. It describes the main points of the model-concept of professional adaptation of young specialists, the possibilities and prospects of its introduction, and its risks and weaknesses. It shows innovat...

  2. The adaptation nature and content: a philosophical analysis

    Попович, О. В.

    2014-01-01

    A philosophical analysis of the phenomenon of adaptation is urgently needed, as the scientific discourse confirms the diversity of meanings attached to adaptation and the lack not only of an integrative concept but also of consistent terminology across the allied sciences. The etymology of "adaptation" (from the Latin "adaptatio" (adaptation) and "adaptio" (to adapt)) allows it to be interpreted both as a process and as a result of adjustment. In the development of the science of great importance...

  3. Bayesian Adaptive Exploration

    Loredo, Thomas J.

    2004-04-01

    I describe a framework for adaptive scientific exploration based on iterating an Observation-Inference-Design cycle that allows adjustment of hypotheses and observing protocols in response to the results of observation on-the-fly, as data are gathered. The framework uses a unified Bayesian methodology for the inference and design stages: Bayesian inference to quantify what we have learned from the available data and predict future data, and Bayesian decision theory to identify which new observations would teach us the most. When the goal of the experiment is simply to make inferences, the framework identifies a computationally efficient iterative ``maximum entropy sampling'' strategy as the optimal strategy in settings where the noise statistics are independent of signal properties. Results of applying the method to two ``toy'' problems with simulated data-measuring the orbit of an extrasolar planet, and locating a hidden one-dimensional object-show the approach can significantly improve observational efficiency in settings that have well-defined nonlinear models. I conclude with a list of open issues that must be addressed to make Bayesian adaptive exploration a practical and reliable tool for optimizing scientific exploration.
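    For Gaussian predictive distributions, the design stage reduces to picking the candidate with the largest predictive variance. Below is a Python sketch with Bayesian linear regression on polynomial features standing in for the paper's nonlinear models; the priors and the toy target function are assumptions made for illustration.

        import numpy as np

        rng = np.random.default_rng(3)
        def phi(x):                                  # cubic feature map
            return np.stack([np.ones_like(x), x, x ** 2, x ** 3], axis=-1)

        sigma2, tau2 = 0.05, 10.0                    # noise / prior variances
        true_f = lambda x: 0.5 * x - 0.8 * x ** 3    # unknown signal
        candidates = np.linspace(-1, 1, 201)
        X, y = [], []
        for step in range(8):
            # posterior precision of the weights given data so far
            A = np.eye(4) / tau2
            if X:
                Phi = phi(np.array(X))
                A += Phi.T @ Phi / sigma2
            A_inv = np.linalg.inv(A)
            # design stage: maximize predictive variance (entropy)
            pv = sigma2 + np.einsum('ij,jk,ik->i',
                                    phi(candidates), A_inv, phi(candidates))
            x_next = candidates[np.argmax(pv)]
            # observation stage
            X.append(x_next)
            y.append(true_f(x_next) + np.sqrt(sigma2) * rng.standard_normal())
        # inference stage: posterior mean of the weights
        Phi = phi(np.array(X))
        A = Phi.T @ Phi / sigma2 + np.eye(4) / tau2
        w_mean = np.linalg.solve(A, Phi.T @ np.array(y) / sigma2)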

  4. Adaptive Sticky Generalized Metropolis

    Fabrizio Leisen; Roberto Casarin; David Luengo; Luca Martino

    2013-01-01

    We introduce a new class of adaptive Metropolis algorithms called adaptive sticky algorithms for efficient general-purpose simulation from a target probability distribution. The transition of the Metropolis chain is based on a multiple-try scheme and the different proposals are generated by adaptive nonparametric distributions. Our adaptation strategy uses the interpolation of support points from the past history of the chain as in the adaptive rejection Metropolis. The algorithm efficiency i...
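    The sticky multiple-try construction itself is involved, but the underlying idea of a proposal that learns from the chain's history can be conveyed by the classic covariance-adapting Metropolis sketch below. This is a simpler relative of the algorithm above, not the algorithm itself, and a full treatment needs diminishing adaptation to guarantee ergodicity.

        import numpy as np

        rng = np.random.default_rng(4)
        def log_target(x):                   # banana-shaped 2-D target
            return -0.5 * (x[0] ** 2 + (x[1] - x[0] ** 2) ** 2)

        x, chain = np.zeros(2), [np.zeros(2)]
        cov = 0.1 * np.eye(2)
        for n in range(1, 20000):
            if n > 500 and n % 100 == 0:     # re-fit proposal to chain history
                cov = (np.cov(np.array(chain).T) * 2.38 ** 2 / 2
                       + 1e-6 * np.eye(2))
            prop = rng.multivariate_normal(x, cov)
            if np.log(rng.uniform()) < log_target(prop) - log_target(x):
                x = prop
            chain.append(x)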

  5. Sampling Development

    Adolph, Karen E.; Robinson, Scott R.

    2011-01-01

    Research in developmental psychology requires sampling at different time points. Accurate depictions of developmental change provide a foundation for further empirical studies and theories about developmental mechanisms. However, overreliance on widely spaced sampling intervals in cross-sectional and longitudinal designs threatens the validity of…

  6. Language sampling

    Rijkhoff, Jan; Bakker, Dik

    1998-01-01

    This article has two aims: [1] to present a revised version of the sampling method that was originally proposed in 1993 by Rijkhoff, Bakker, Hengeveld and Kahrel, and [2] to discuss a number of other approaches to language sampling in the light of our own method. We will also demonstrate how our...

  7. Bayesian Adaptive Exploration

    Loredo, T J

    2004-01-01

    I describe a framework for adaptive scientific exploration based on iterating an Observation--Inference--Design cycle that allows adjustment of hypotheses and observing protocols in response to the results of observation on-the-fly, as data are gathered. The framework uses a unified Bayesian methodology for the inference and design stages: Bayesian inference to quantify what we have learned from the available data and predict future data, and Bayesian decision theory to identify which new observations would teach us the most. When the goal of the experiment is simply to make inferences, the framework identifies a computationally efficient iterative ``maximum entropy sampling'' strategy as the optimal strategy in settings where the noise statistics are independent of signal properties. Results of applying the method to two ``toy'' problems with simulated data--measuring the orbit of an extrasolar planet, and locating a hidden one-dimensional object--show the approach can significantly improve observational eff...

  8. Environmental sampling

    Environmental Sampling (ES) is a technology option that can have application in transparency in nuclear nonproliferation. The basic process is to take a sample from the environment, e.g., soil, water, vegetation, or dust and debris from a surface, and through very careful sample preparation and analysis, determine the types, elemental concentration, and isotopic composition of actinides in the sample. The sample is prepared and the analysis performed in a clean chemistry laboratory (CCL). This ES capability is part of the IAEA Strengthened Safeguards System. Such a Laboratory is planned to be built by JAERI at Tokai and will give Japan an intrinsic ES capability. This paper presents options for the use of ES as a transparency measure for nuclear nonproliferation

  9. Motion Tracking with Fast Adaptive Background Subtraction

    Xiao De-gui; Yu Sheng-sheng; Zhou Jing-li

    2003-01-01

    Extracting and tracking moving objects is usually one of the most important tasks of intelligent video surveillance systems. This paper presents a fast and adaptive background subtraction algorithm and the motion tracking process using this algorithm. The algorithm uses only the luminance components of sampled image sequence pixels and models every pixel with a statistical model. The algorithm is characterized by its ability to detect sudden lighting changes in real time and to extract and track moving objects quickly. It is shown that our algorithm can be realized with lower time and space complexity and an adjustable object detection error rate in comparison with other background subtraction algorithms. Using the algorithm, an indoor monitoring system has also been worked out, and the motion tracking process is presented in this paper. Experimental results confirm the algorithm's good performance when used in an indoor monitoring system.
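    A minimal luminance-only running-average version of such a background model is sketched below; the thresholds and the sudden-light-change test are illustrative values, and the paper's statistical per-pixel model is not reproduced.

        import numpy as np

        def update(background, frame, alpha=0.02, tau=25, light_jump=40):
            frame = frame.astype(np.float32)
            diff = np.abs(frame - background)
            if diff.mean() > light_jump:
                # sudden global lighting change: re-initialize the model
                # instead of flagging the whole frame as motion
                return frame, np.zeros(diff.shape, dtype=bool)
            mask = diff > tau                # per-pixel foreground test
            # adapt the model only where the pixel is judged background
            background = np.where(mask, background,
                                  (1 - alpha) * background + alpha * frame)
            return background, mask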

  10. Tone-Mapped Mean-Shift Based Environment Map Sampling.

    Feng, Wei; Yang, Ying; Wan, Liang; Yu, Changguo

    2016-09-01

    In this paper, we present a novel approach for environment map sampling, an effective and pragmatic technique for reducing the computational cost of realistic rendering while obtaining plausible rendered images. The proposed approach exploits the advantage of adaptive mean-shift image clustering with the aid of tone mapping, yielding oversegmented strata that have uniform intensities and capture the shapes of light regions. The resulting strata, however, have unbalanced importance metric values for rendering, and the number of strata is not user-controlled. To handle these issues, we develop an adaptive split-and-merge scheme that refines the strata and obtains a better balanced strata distribution. Compared to the state-of-the-art methods, our approach achieves comparable and even better rendering quality in terms of the SSIM, RMSE and HDRVDP2 image quality metrics. Experimental results further show that our approach is more robust to variation of viewpoint, environment rotation, and sample number. PMID:26584494

  11. Adapting agriculture with traditional knowledge

    Swiderska, Krystyna; Reid, Hannah [IIED, London (United Kingdom); Song, Yiching; Li, Jingsong [Centre for Chinese Agriculutral Policy (China); Mutta, Doris [Kenya Forestry Research Institute (Kenya)

    2011-10-15

    Over the coming decades, climate change is likely to pose a major challenge to agriculture; temperatures are rising, rainfall is becoming more variable and extreme weather is becoming a more common event. Researchers and policymakers agree that adapting agriculture to these impacts is a priority for ensuring future food security. Strategies to achieve that in practice tend to focus on modern science. But evidence, both old and new, suggests that the traditional knowledge and crop varieties of indigenous peoples and local communities could prove even more important in adapting agriculture to climate change.

  12. Adaptation: Paradigm for the Gut and an Academic Career

    Warner, Brad W.

    2013-01-01

    Adaptation is an important compensatory response to environmental cues resulting in enhanced survival. In the gut, the abrupt loss of intestinal length is characterized by increased rates of enterocyte proliferation and apoptosis and culminates in adaptive villus and crypt growth. In the development of an academic pediatric surgical career, adaptation is also an important compensatory response to survive the ever changing research, clinical, and economic environment. The ability to adapt in b...

  13. The importance of genus Candida in human samples

    Bojić-Miličević Gordana M.

    2008-01-01

    Microbiology is a rapidly changing field. As new research and experience broaden our knowledge, changes in the approach to diagnosis and therapy have become necessary and appropriate. Recommended dosages of drugs, methods and durations of administration, as well as contraindications to use, evolve over time for all drugs. Over the last 2 decades, Candida species have emerged as causes of substantial morbidity and mortality in hospitalized individuals. Isolation of Candida from blood or other sterile sites, excluding the urinary tract, defines invasive candidiasis. Candida species are currently the fourth most common cause of bloodstream infections (that is, candidemia) in U.S. hospitals and occur primarily in the intensive care unit (ICU), where candidemia is recognized in up to 1% of patients and where deep-seated Candida infections are recognized in an additional 1 to 2% of patients. Despite the introduction of newer anti-Candida agents, invasive candidiasis continues to have an attributable mortality rate of 40 to 49%; excess ICU and hospital stays of 12.7 days and 15.5 days, respectively; and increased care costs. Postmortem studies suggest that death rates related to invasive candidiasis might, in fact, be higher than those described because of undiagnosed and therefore untreated infection. The diagnosis of invasive candidiasis remains challenging for both clinicians and microbiologists. Reasons for missed diagnoses include nonspecific risk factors and clinical manifestations, low sensitivity of microbiological culture techniques, and unavailability of deep tissue cultures because of risks associated with the invasive procedures used to obtain them. Thus, a substantial proportion of invasive candidiasis in patients in the ICU is assumed to be undiagnosed and untreated. Yet even when invasive candidiasis is diagnosed, culture diagnosis delays treatment for 2 to 3 days, which contributes to mortality. Interventions that do not rely on a specific diagnosis and are implemented early in the course of Candida infection (that is, empirical therapy) or before Candida infection occurs (that is, prophylaxis) might improve patient survival and may be warranted. Selective and nonselective administration of anti-Candida prophylaxis is practiced in some ICUs. Several trials have tested this, but results were limited by low statistical power and choice of outcomes. Thus, the role of anti-Candida prophylaxis for patients in the ICU remains controversial. Initiating anti-Candida therapy for patients in the ICU who have suspected infection but have not responded to antibacterial therapy (empirical therapy) is practiced in some hospitals. This practice, however, remains a subject of considerable debate. These patients are perceived to be at higher risk from invasive candidiasis and therefore are likely to benefit from empirical therapy. Nonetheless, empirical anti-Candida therapies have not been evaluated in a randomized trial and would share shortcomings similar to those described for prophylactic strategies. Current treatment guidelines by the Infectious Diseases Society of America (IDSA) do not specify whether empirical anti-Candida therapy should be provided to immunocompetent patients. If such therapy is given, IDSA recommends that its use be limited to patients with Candida colonization at multiple sites, patients with several other risk factors, and patients with no uncorrected causes of fever.
Without data from clinical trials, determining an optimal anti-Candida strategy for patients in the ICU is challenging. Identifying such a strategy can help guide clinicians in choosing adequate therapy and may improve patient outcomes. In our study, we developed a decision analytic model to evaluate the cost-effectiveness of empirical anti-Candida therapy given to high-risk patients in the ICU, defined as those with altered temperature (fever or hypothermia) or unexplained hypotension despite 3 days of antibacterial therapy in the ICU.

  14. Performance evaluation of communication systems via importance sampling

    Remondo Bueno, D.

    2000-01-01

    In the design and development of telecommunication systems, the preparation of experiments can be made more effective by predicting the system performance and its dependence on the different system parameters. This can be done by modeling the system and using performance evaluation methods. This dis
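    The core trick behind importance-sampling performance evaluation is easy to demonstrate: to estimate a rare error probability, sample from a density shifted toward the error region and reweight by the likelihood ratio. A Python toy for P(X > 4) with X ~ N(0,1), an event naive Monte Carlo would almost never see:

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(5)
        N = 100_000
        x = rng.normal(4.0, 1.0, N)                  # draws from the proposal
        w = norm.pdf(x) / norm.pdf(x, loc=4.0)       # likelihood ratios
        p_is = np.mean((x > 4.0) * w)
        print(p_is, norm.sf(4.0))                    # both about 3.2e-5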

  15. Expressing Adaptation Strategies Using Adaptation Patterns

    Zemirline, N.; Bourda, Y.; Reynaud, C.

    2012-01-01

    Today, there is a real challenge to enable personalized access to information. Several systems have been proposed to address this challenge including Adaptive Hypermedia Systems (AHSs). However, the specification of adaptation strategies remains a difficult task for creators of such systems. In this paper, we consider the problem of the definition…

  16. Bayesian Optimization for Adaptive MCMC

    Mahendran, Nimalan; Wang, Ziyu; Hamze, Firas; De Freitas, Nando

    2011-01-01

    This paper proposes a new randomized strategy for adaptive MCMC using Bayesian optimization. This approach applies to non-differentiable objective functions and trades off exploration and exploitation to reduce the number of potentially costly objective function evaluations. We demonstrate the strategy in the complex setting of sampling from constrained, discrete and densely connected probabilistic graphical models where, for each variation of the problem, one needs to adjust the parameters o...

  17. Mobile, Flexible, and Adaptable

    Agergaard, Jytte; Thao, Thi Vu

    2011-01-01

    Industrialisation and urban growth are constitutive aspects of Vietnam's new economy and are important driving forces behind increasing rural-to-urban migration. Growth in informal sector employment is a significant aspect of this development, which has provided for both male and female migrants... networking, and remittance practices. The paper is organised around why and how migrants have entered the informal labour market in Hanoi and how they make their livings there while also maintaining ties with their rural homes. In conclusion, we discuss how the migration networks and remittance practices of... the female porters demonstrate a particular way of adapting to the migration process. Also, it is emphasised how women's flexible practices are facilitated by women's own village-based networks. It is suggested that ‘in-betweenness’, which stands for the simultaneous and overlapping presence of urban...

  18. Low dose effects. Adaptive response

    The purpose of this work was to evaluate whether there are disturbances in the adaptive response when lymphocytes of people living in areas polluted with radionuclides after the Chernobyl disaster, and of liquidators who took part in the accident cleanup, are investigated. The level of lymphocytes with micronuclei (MN) was scored in Moscow donors and in people living in the Bryansk region with contamination of 15 - 40 Ci/km². The doses obtained by the liquidators were not higher than 25 cGy. The mean spontaneous level of MN does not differ significantly between control people and people from the Chernobyl zones, nor does the individual variability around the mean differ significantly between the two populations. Another important finding is that in lymphocytes of people living in polluted areas, chronic low-dose irradiation does not induce the adaptive response. In Moscow people, the adaptive response is observed in most cases (≅ 59 %), and in some cases the demonstration of the adaptive response is not significant (≅ 1 %). In the Chernobyl population exposed to chronic low-level, low-dose-rate irradiation there are fewer people with a distinct adaptive response (≅ 38 %), and some individuals show increased radiosensitivity after the conditioning dose. No such individuals with enhanced radiosensitivity were observed in Moscow. In liquidators the same types of effects have been registered. These results were obtained on adults. The adaptive response in children 8 - 14 years old living in Moscow and in the Chernobyl zone was investigated too. In this case the spontaneous level of MN is higher in children living in polluted areas, and after 1.0 Gy irradiation the individual variability is very large. Only 5 % of children have a distinct adaptive response, and enhancement of radiosensitivity after the conditioning dose is observed. (authors)

  19. Localizing recent adaptive evolution in the human genome

    Williamson, Scott H; Hubisz, Melissa J; Clark, Andrew G;

    2007-01-01

    Identifying genomic locations that have experienced selective sweeps is an important first step toward understanding the molecular basis of adaptive evolution. Using statistical methods that account for the confounding effects of population demography, recombination rate variation, and single-nucleotide polymorphism ascertainment, while also providing fine-scale estimates of the position of the selected site, we analyzed a genomic dataset of 1.2 million human single-nucleotide polymorphisms genotyped in African-American, European-American, and Chinese samples. We identify 101 regions of the human genome with..., clusters of olfactory receptors, genes involved in nervous system development and function, immune system genes, and heat shock genes. We also observe consistent evidence of selective sweeps in centromeric regions. In general, we find that recent adaptation is strikingly pervasive in the human genome, with...

  20. Sample size methodology

    Desu, M M

    2012-01-01

    One of the most important problems in designing an experiment or a survey is sample size determination and this book presents the currently available methodology. It includes both random sampling from standard probability distributions and from finite populations. Also discussed is sample size determination for estimating parameters in a Bayesian setting by considering the posterior distribution of the parameter and specifying the necessary requirements. The determination of the sample size is considered for ranking and selection problems as well as for the design of clinical trials. Appropria

  1. Uncertainty in adaptive capacity

    The capacity to adapt is a critical element of the process of adaptation: it is the vector of resources that represent the asset base from which adaptation actions can be made. Adaptive capacity can in theory be identified and measured at various scales, from the individual to the nation. The assessment of uncertainty within such measures comes from the contested knowledge domain and the theories surrounding the nature of the determinants of adaptive capacity and the human action of adaptation. While generic adaptive capacity at the national level, for example, is often postulated as being dependent on health, governance and political rights, literacy, and economic well-being, the determinants of these variables at the national level are not widely understood. We outline the nature of this uncertainty for the major elements of adaptive capacity and illustrate these issues with the example of a social vulnerability index for countries in Africa. (authors)

  2. ASIC DESIGN OF ADAPTIVE THRESHOLD DENOISE DWT CHIP

    2002-01-01

    According to the relationship between the wavelet transform and perfectly reconstructing FIR filter banks, this paper presents a real-time chip with an adaptive Donoho non-linear soft threshold for denoising at different levels of the multi-scale space, by rearranging the input data during convolution, filtering and sub-sampling. More importantly, it gives a simple iterative algorithm to calculate the variance of the noise in intervals with no signal. It works well whether the signal or noise is stationary or not.
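    In software, the chip's processing pipeline corresponds to a multi-level DWT, Donoho soft thresholding per detail band, and an inverse DWT. A sketch using the PyWavelets package, with the noise level estimated from the finest detail coefficients (assumed to be essentially signal-free):

        import numpy as np
        import pywt

        rng = np.random.default_rng(6)
        t = np.linspace(0, 1, 1024)
        clean = np.sin(8 * np.pi * t) * (t > 0.3)
        x = clean + 0.2 * rng.standard_normal(t.size)

        coeffs = pywt.wavedec(x, 'db4', level=5)
        # robust noise estimate from the finest detail band (MAD)
        sigma = np.median(np.abs(coeffs[-1])) / 0.6745
        thresh = sigma * np.sqrt(2 * np.log(x.size))   # universal threshold
        denoised = [coeffs[0]] + [pywt.threshold(c, thresh, mode='soft')
                                  for c in coeffs[1:]]
        x_hat = pywt.waverec(denoised, 'db4')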

  3. Adaptation and initial validation of the Patient Health Questionnaire - 9 (PHQ-9) and the Generalized Anxiety Disorder - 7 Questionnaire (GAD-7) in an Arabic speaking Lebanese psychiatric outpatient sample.

    Sawaya, Helen; Atoui, Mia; Hamadeh, Aya; Zeinoun, Pia; Nahas, Ziad

    2016-05-30

    The Patient Health Questionnaire - 9 (PHQ-9) and Generalized Anxiety Disorder - 7 (GAD-7) are short screening measures used in medical and community settings to assess depression and anxiety severity. The aim of this study is to translate the screening tools into Arabic and evaluate their psychometric properties in an Arabic-speaking Lebanese psychiatric outpatient sample. The patients completed the questionnaires, among others, prior to being evaluated by a clinical psychiatrist or psychologist. The scales' internal consistency and factor structure were measured and convergent and discriminant validity were established by comparing the scores with clinical diagnoses and the Psychiatric Diagnostic Screening Questionnaire - MDD subset (PDSQ - MDD). Results showed that the PHQ-9 and GAD-7 are reliable screening tools for depression and anxiety and their factor structures replicated those reported in the literature. Sensitivity and specificity analyses showed that the PHQ-9 is sensitive but not specific at capturing depressive symptoms when compared to clinician diagnoses whereas the GAD-7 was neither sensitive nor specific at capturing anxiety symptoms. The implications of these findings are discussed in reference to the scales themselves and the cultural specificity of the Lebanese population. PMID:27031595

  4. The Importance of Resilience for Well-Being in Retirement

    Cristiane Pimentel Nalin

    2015-08-01

    The increase in the elderly population has prompted research on retirement. This study investigated the importance of resilience, economic satisfaction, length of retirement, and planning for well-being in retirement among 270 participants. The majority of this sample were men (64%), and the mean age was 65 years (SD = 5.7). The participants were retired members of 10 public and private organizations in Rio de Janeiro. Factor analysis and hierarchical regression were performed. The results showed that determined resilience (mastery, adaptability, confidence and perseverance) and socioeconomic satisfaction were the main predictors of well-being in retirement and explained 28% of this model. The findings suggest that well-being in retirement is closely related to socioeconomic satisfaction and determined resilience. Additional research should address the importance of resilience for the well-being of retirees who are or are not members of retirement associations. Resilience attitudes should be promoted in Retirement Education Programs.

  5. 5. Sampling

    The sampling is described for radionuclide X-ray fluorescence analysis. Aerosols are captured with various filter materials whose properties are summed up in the table. Fine dispersed solid and liquid particles and gaseous admixtures may be captured by bubbling air through a suitable absorption solution. The concentration of small amounts of impurities from large volumes of air is done by adsorbing impurities on surfactants, e.g., activated charcoal, silica gel, etc. Aerosols may be captured using an electrostatic precipitator and aerosol fractions may be separated with a cascade impactor. Water sampling differs by the water source, i.e., ground water, surface water, rain or waste water. Soil samples are taken by probes. (ES)

  6. Adaptive Coordinate Descent

    Loshchilov, Ilya; Schoenauer, Marc; Sebag, Michèle

    2011-01-01

    Independence from the coordinate system is one source of efficiency and robustness for the Covariance Matrix Adaptation Evolution Strategy (CMA-ES). The recently proposed Adaptive Encoding (AE) procedure generalizes CMA-ES adaptive mechanism, and can be used together with any optimization algorithm. Adaptive Encoding gradually builds a transformation of the coordinate system such that the new coordinates are as decorrelated as possible with respect to the objective function. But any optimizat...

  7. Systems and methods for self-synchronized digital sampling

    Samson, Jr., John R. (Inventor)

    2008-01-01

    Systems and methods for self-synchronized data sampling are provided. In one embodiment, a system for capturing synchronous data samples is provided. The system includes an analog to digital converter adapted to capture signals from one or more sensors and convert the signals into a stream of digital data samples at a sampling frequency determined by a sampling control signal; and a synchronizer coupled to the analog to digital converter and adapted to receive a rotational frequency signal from a rotating machine, wherein the synchronizer is further adapted to generate the sampling control signal, and wherein the sampling control signal is based on the rotational frequency signal.

  8. Adaptation to climate change

    J. Carmin; K. Tierney; E. Chu; L.M. Hunter; J.T. Roberts; L. Shi

    2015-01-01

    Climate change adaptation involves major global and societal challenges such as finding adequate and equitable adaptation funding and integrating adaptation and development programs. Current funding is insufficient. Debates between the Global North and South center on how best to allocate the financ

  9. Coevolution Based Adaptive Monte Carlo Localization (CEAMCL

    Luo Ronghua

    2008-11-01

    An adaptive Monte Carlo localization algorithm based on the coevolution mechanism of ecological species is proposed. Samples are clustered into species, each of which represents a hypothesis of the robot's pose. Since the coevolution between the species ensures that multiple distinct hypotheses can be tracked stably, the problem of premature convergence when using MCL in highly symmetric environments can be solved. The sample size can also be adjusted adaptively over time according to the uncertainty of the robot's pose by using the population growth model. In addition, by using the crossover and mutation operators of evolutionary computation, intra-species evolution can drive the samples toward the regions where the desired posterior density is large, so a small set of samples can represent the desired density well enough for precise localization. The new algorithm is termed coevolution-based adaptive Monte Carlo localization (CEAMCL). Experiments have been carried out to prove the efficiency of the new localization algorithm.
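    Leaving the coevolving-species machinery aside, the adaptive-sample-size idea alone can be sketched with a toy 1-D particle filter in which the spread of the posterior drives the number of particles kept after resampling; the motion model, measurement model and sizing rule below are illustrative assumptions:

        import numpy as np

        rng = np.random.default_rng(7)
        def localize(observations, n_min=100, n_max=5000):
            particles = rng.uniform(0, 100, n_max)     # unknown start pose
            for z in observations:
                # motion update: move 1 unit per step, with noise
                particles = particles + 1.0 + 0.1 * rng.standard_normal(particles.size)
                # measurement update: sensor reads the pose with noise
                w = np.exp(-0.5 * ((z - particles) / 2.0) ** 2)
                w /= w.sum()
                # posterior spread drives the next sample size
                mean = np.sum(w * particles)
                spread = np.sqrt(np.sum(w * (particles - mean) ** 2))
                n = int(np.clip(n_min * (1 + spread), n_min, n_max))
                particles = rng.choice(particles, size=n, p=w)   # resample
            return particles.mean()

        true_path = 10.0 + np.arange(30)               # true poses
        z = true_path + rng.standard_normal(30)        # noisy readings
        print(localize(z))                             # close to 39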

  10. Molecular evolution and thermal adaptation

    Chen, Peiqiu

    2011-12-01

    ... generations. Diversity plays an important role in thermal adaptation: while monoclonal strains adapt via the acquisition and rapid fixation of new early mutations, wild populations adapt via standing genetic variation, and they are more robust against thermal shocks due to the greater diversity within the initial population.

  11. Adaptive resolution refinement for high-fidelity continuum parameterizations

    Anderson, J.W.; Khamayseh, A. [Los Alamos National Lab., NM (United States); Jean, B.A. [Mississippi State Univ., Starkville, MS (United States)

    1996-10-01

    This paper describes an algorithm that adaptively samples a parametric continuum so that a fidelity metric is satisfied. Using the divide-and-conquer strategy of adaptive sampling eliminates the guesswork of traditional uniform parameterization techniques. The space and time complexity of parameterization are increased in a controllable manner so that a desired fidelity is obtained.
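    A minimal Python version of such divide-and-conquer refinement, using midpoint-to-chord deviation as a stand-in fidelity metric (the paper's actual metric may differ):

        import numpy as np

        def adaptive_sample(f, t0, t1, tol=1e-3, depth=0):
            tm = 0.5 * (t0 + t1)
            p0, pm, p1 = f(t0), f(tm), f(t1)
            # fidelity metric: distance from the curve midpoint to the chord
            if np.linalg.norm(pm - 0.5 * (p0 + p1)) < tol or depth > 20:
                return [t0, t1]
            left = adaptive_sample(f, t0, tm, tol, depth + 1)
            right = adaptive_sample(f, tm, t1, tol, depth + 1)
            return left[:-1] + right       # drop the duplicated midpoint

        circle = lambda t: np.array([np.cos(t), np.sin(t)])
        ts = adaptive_sample(circle, 0.0, np.pi / 2)
        # curved spans receive many parameter samples, flat spans few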

  12. An analysis of adaptation negotiations in Poznan

    Hastily presented as one of the major accomplishments of the 14th United Nations Conference on Climate Change in Poznan, discussions on adaptation actually need careful analysis. Obviously, an increasing number of stakeholders (whether Parties, delegation members, civil society, or businesses) see adaptation as a top concern, and this resulted in Poznan in a strong presence of the issue in plenary sessions, contact and informal groups, side events, press conferences, stands, etc. With respect to the historical treatment of adaptation, which was quite light before COP 13 in Bali (2007), the vogue for adaptation may be good news. However, all the difficulty now lies in translating the semantic success and political momentum into operational outcomes. As the following critical synthesis shows, Poznan can hardly be considered a major breakthrough in that regard, although some significant steps forward have been made. Little importance was given to adaptation in the climate change talks until the middle of this decade. In the early days of discussions (the 1980s), climate change was not seen as a pressing matter; impacts were not expected to occur if action to reduce climate change was appropriately taken, and there was thus no hurry to adapt. Then, in the late 1990s, adaptation was seen as a possible alternative to mitigation, and those defending adaptation were seen as resigned. Adaptation only started to gain some momentum in 2005 in Montreal, and was finally considered on an equal footing with mitigation in 2007 in Bali. Discussions on adaptation are thus still not at the level of those on mitigation, but Poznan was in a sense a major accomplishment in bringing adaptation to the top of the agenda. Before Poznan, adaptation under the UNFCCC was limited to a couple of loose work programmes (see below) and three small funds financing adaptation activities in developing countries. One of these activities, arguably the most visible, is the realisation of National

  13. Continuous-time adaptive critics.

    Hanselmann, Thomas; Noakes, Lyle; Zaknich, Anthony

    2007-05-01

    A continuous-time formulation of an adaptive critic design (ACD) is investigated. Connections to the discrete case are made, where backpropagation through time (BPTT) and real-time recurrent learning (RTRL) are prevalent. Practical benefits are that this framework fits in well with plant descriptions given by differential equations and that any standard integration routine with adaptive step-size does an adaptive sampling for free. A second-order actor adaptation using Newton's method is established for fast actor convergence for a general plant and critic. Also, a fast critic update for concurrent actor-critic training is introduced to immediately apply necessary adjustments of critic parameters induced by actor updates to keep the Bellman optimality correct to first-order approximation after actor changes. Thus, critic and actor updates may be performed at the same time until some substantial error build up in the Bellman optimality or temporal difference equation, when a traditional critic training needs to be performed and then another interval of concurrent actor-critic training may resume. PMID:17526332

  14. Uses of risk importance measures

    Risk importance measures provide an understandable and practical way of presenting probabilistic safety analysis results, which too often remain abstract numbers without real insight into their content. The report clarifies the definitions, relationships and interpretations of the three most basic measures: the risk increase factor, the risk decrease factor, and the fractional contribution. These three measures already cover the main types of risk importance measures; many other importance measures presented in the literature are close variants of these three. In many cases they are so related that, for a given technical system, the two other measures can be derived from the one calculated first. However, the practical interpretations differ, and hence each of the three measures has its own uses and right to exist. The fundamental aspect of importance measures is that they express some specific influence of a basic event on the total risk. The basic failure or error events are the elements from which the reliability and risk models are constituted. The importance measures are relative, which is an advantage compared to absolute risk numbers, due to their insensitivity to quantification uncertainties. They are therefore particularly suited to giving first-hand guidance on where to focus interest from the system's risk and reliability point of view, and from where to continue the analysis with more sophisticated methods requiring more effort.
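    Concretely, once the risk R is written as a function of the basic-event probabilities, the three measures follow from re-evaluating R with one event forced sure or impossible; a toy fault tree in Python:

        def risk(p):
            a, b, c = p
            return a + (1 - a) * b * c      # exact P(TOP), TOP = A or (B and C)

        def importance(p, i):
            hi, lo = list(p), list(p)
            hi[i], lo[i] = 1.0, 0.0
            r = risk(p)
            return {'risk increase factor': risk(hi) / r,
                    'risk decrease factor': r / risk(lo),
                    'fractional contribution': (r - risk(lo)) / r}

        p = [1e-3, 1e-2, 5e-2]              # P(A), P(B), P(C), independent
        for i, name in enumerate('ABC'):
            print(name, importance(p, i))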

  15. The Scope of Adaptive Digital Games for Education

    Prince, Rikki; Davis, Hugh

    2008-01-01

    In learning technologies, there is a distinct difference between the user sequencing in a system based on IMS simple sequencing and an adaptive hypermedia system. This range of possibilities is important to consider when attempting to augment educational games with adaptive elements. This poster demonstrates how truly adaptive games could be designed and discusses why this is useful in the field of education.

  16. Adaptive research supervision : Exploring expert thesis supervisors' practical knowledge

    de Kleijn, Renske A M; Meijer, Paulien C.; Brekelmans, Mieke; Pilot, Albert

    2015-01-01

    Several researchers have suggested the importance of being responsive to students' needs in research supervision. Adapting support strategies to students' needs in light of the goals of a task is referred to as adaptivity. In the present study, the practice of adaptivity is explored by interviewing

  17. Cognitive adaptation to nonmelanoma skin cancer.

    Czajkowska, Zofia; Radiotis, George; Roberts, Nicole; Körner, Annett

    2013-01-01

    Taylor's (1983) cognitive adaptation theory posits that when people go through life transitions, such as being diagnosed with a chronic disease, they adjust to their new reality. The adjustment process revolves around three themes: the search for positive meaning in the experience (optimism), an attempt to regain a sense of mastery in life, and an effort to enhance self-esteem. In a sample of 57 patients with nonmelanoma skin cancer, the Cognitive Adaptation Index successfully predicted participants' distress, accounting for 60% of the variance and lending support to Taylor's theory of cognitive adaptation in this population. PMID:23844920

  18. Imaging deep and clear in thick inhomogeneous samples

    Andilla, Jordi; Olarte, Omar E.; Aviles-Espinosa, Rodrigo; Loza-Alvarez, Pablo

    2014-03-01

    Acquisition of images deep inside large samples is one of the improvements most in demand in current biology applications. Absorption, scattering and optical aberrations are the main difficulties encountered in these types of samples. Adaptive optics has been imported from astronomy to deal with the optical aberrations induced by the sample. Nonlinear microscopy and SPIM have been proposed as interesting options for imaging deep into a sample. In particular, light-sheet microscopy, owing to its low photobleaching, opens new opportunities to obtain information, for example in long time lapses for large 3D imaging. In this work, we give an overview of the application of adaptive optics to fluorescence microscopy in linear and non-linear modalities. We then focus on the light-sheet microscopy architecture, whose two orthogonal optical paths imply new requirements in terms of optical correction. We examine the different issues that appear in light-sheet microscopy, particularly when imaging large and non-flat samples. Finally, we study the problem of isoplanatic patches.

  19. Adaptation of the Wechsler Intelligence Scale for Children-IV (WISC-IV) for Vietnam.

    Dang, Hoang-Minh; Weiss, Bahr; Pollack, Amie; Nguyen, Minh Cao

    2012-12-01

    Intelligence testing is used for many purposes, including identifying children for proper educational placement (e.g., children with learning disabilities, or intellectually gifted students) and guiding education by identifying cognitive strengths and weaknesses so that teachers can adapt their instruction to students' specific learning styles. Most research involving intelligence tests has been conducted in highly developed Western countries, yet the need for intelligence testing is as great or greater in developing countries. The present study, conducted through the Vietnam National University Clinical Psychology CRISP Center, focused on the cultural adaptation of the WISC-IV intelligence test for Vietnam. We report on (a) the adaptation process, including the translation, cultural analysis and modifications involved; (b) the results of two pilot studies; and (c) the collection of the standardization sample and the results of analyses with it, with the goal of sharing our experience with other researchers who may be involved in or interested in adapting or developing IQ tests for non-Western, non-English-speaking cultures. PMID:23833330

  20. Dubbing: adapting cultures in the global communication era

    Lidia Canu

    2012-01-01

    Adapting translation for dubbing is not a mere linguistic fact: it is mainly the adaptation of cultures. In fact, audiovisual translation and adaptation implicitly take into account the importance of the historical background behind the multiplicity of languages and cultures, and by doing so, they become a means of cultural diffusion. That peculiarity enables what we can describe as the “socio-anthropological function” of the adaptation of translation for dubbing, which is the obj...

  1. An exploration study to find important factors influencing on expert systems

    Naser Azad

    2013-09-01

    Knowledge management plays an important role in modern management systems, since many existing systems are moving towards learning organizations. Expert systems, on the other hand, are considered the most popular techniques for adapting recent developments in knowledge management. This paper presents an empirical investigation to find the important factors influencing the adaptation of expert systems. The study designs a Likert-scale questionnaire of 25 questions and distributes it among 258 people who recently graduated in computer science and are familiar with the implementation of expert systems. Cronbach's alpha is calculated as 0.730, and the Kaiser-Meyer-Olkin measure of sampling adequacy and the approximate chi-square are 0.748 and 1377.397, respectively. The study implemented principal component analysis, and the results indicate four factors influencing expert systems: systems management, intelligence systems, system analysis and specialized analysis.

  2. Sequential sampling: a novel method in farm animal welfare assessment.

    Heath, C A E; Main, D C J; Mullan, S; Haskell, M J; Browne, W J

    2016-02-01

    Lameness in dairy cows is an important welfare issue. As part of a welfare assessment, herd level lameness prevalence can be estimated from scoring a sample of animals, where higher levels of accuracy are associated with larger sample sizes. As the financial cost is related to the number of cows sampled, smaller samples are preferred. Sequential sampling schemes have been used for informing decision making in clinical trials. Sequential sampling involves taking samples in stages, where sampling can stop early depending on the estimated lameness prevalence. When welfare assessment is used for a pass/fail decision, a similar approach could be applied to reduce the overall sample size. The sampling schemes proposed here apply the principles of sequential sampling within a diagnostic testing framework. This study develops three sequential sampling schemes of increasing complexity to classify 80 fully assessed UK dairy farms, each with known lameness prevalence. Using the Welfare Quality herd-size-based sampling scheme, the first 'basic' scheme involves two sampling events. At the first sampling event half the Welfare Quality sample size is drawn, and then depending on the outcome, sampling either stops or is continued and the same number of animals is sampled again. In the second 'cautious' scheme, an adaptation is made to ensure that correctly classifying a farm as 'bad' is done with greater certainty. The third scheme is the only scheme to go beyond lameness as a binary measure and investigates the potential for increasing accuracy by incorporating the number of severely lame cows into the decision. The three schemes are evaluated with respect to accuracy and average sample size by running 100 000 simulations for each scheme, and a comparison is made with the fixed size Welfare Quality herd-size-based sampling scheme. All three schemes performed almost as well as the fixed size scheme but with much smaller average sample sizes. For the third scheme, an overall
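
    A minimal sketch of the 'basic' two-stage scheme as described: score half the Welfare Quality sample, stop early on a clear pass or fail, otherwise score the second half. The cut-offs and simulation settings below are hypothetical, not the paper's:

    ```python
    import random

    def classify_farm(true_prev, half_n, pass_cut=0.05, fail_cut=0.10):
        """Two-stage 'basic' sequential scheme; each random draw simulates
        scoring one cow as lame with probability true_prev."""
        lame1 = sum(random.random() < true_prev for _ in range(half_n))
        p1 = lame1 / half_n
        if p1 <= pass_cut:
            return "pass", half_n              # early stop: clearly good
        if p1 >= fail_cut:
            return "fail", half_n              # early stop: clearly bad
        lame2 = sum(random.random() < true_prev for _ in range(half_n))
        p = (lame1 + lame2) / (2 * half_n)     # full-sample decision
        return ("fail" if p >= fail_cut else "pass"), 2 * half_n

    runs = [classify_farm(0.08, half_n=30) for _ in range(100000)]
    avg_n = sum(n for _, n in runs) / len(runs)   # average sample size used
    print(avg_n)
    ```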

  3. Is adaptive co-management ethical?

    Fennell, David; Plummer, Ryan; Marschke, Melissa

    2008-07-01

    'Good' governance and adaptive co-management hold broad appeal due to their positive connotations and 'noble ethical claims'. This paper poses a fundamental question: is adaptive co-management ethical? In pursuing an answer, the concept of adaptive co-management is succinctly summarized and three ethical perspectives (deontology, teleology and existentialism) are explored. The case of adaptive co-management in Cambodia is described and subsequently considered through the lens of ethical triangulation. The case illuminates important ethical considerations and directs attention towards the need for meditative thinking that increases the value of tradition, ecology, and culture. Giving ethics a central position makes clear the potential for adaptive co-management to be an agent of governance that is good, right and authentic, as well as an arena in which to embrace uncertainty. PMID:17391840

  4. Successful adaptation to climate change across scales

    Climate change impacts and responses are presently observed in physical and ecological systems. Adaptation to these impacts is increasingly being observed in both physical and ecological systems as well as in human adjustments to resource availability and risk at different spatial and societal scales. We review the nature of adaptation and the implications of different spatial scales for these processes. We outline a set of normative evaluative criteria for judging the success of adaptations at different scales. We argue that elements of effectiveness, efficiency, equity and legitimacy are important in judging success in terms of the sustainability of development pathways into an uncertain future. We further argue that each of these elements of decision-making is implicit within presently formulated scenarios of socio-economic futures of both emission trajectories and adaptation, though with different weighting. The process by which adaptations are to be judged at different scales will involve new and challenging institutional processes. (author)

  5. Inducible competitors and adaptive diversification

    Beren W. ROBINSON, David W. PFENNIG

    2013-08-01

    Identifying the causes of diversification is central to evolutionary biology. The ecological theory of adaptive diversification holds that the evolution of phenotypic differences between populations and species, and the formation of new species, stem from divergent natural selection, often arising from competitive interactions. Although increasing evidence suggests that phenotypic plasticity can facilitate this process, it is not generally appreciated that competitively mediated selection often also provides ideal conditions for phenotypic plasticity to evolve in the first place. Here, we discuss how competition plays at least two key roles in adaptive diversification, depending on its pattern. First, heterogeneous competition initially generates heterogeneity in resource use that favors adaptive plasticity in the form of “inducible competitors”. Second, once such competitively induced plasticity evolves, its capacity to rapidly generate phenotypic variation and expose phenotypes to alternate selective regimes allows populations to respond readily to selection favoring diversification, as may occur when competition generates steady diversifying selection that permanently drives the evolutionary divergence of populations using different resources. Thus, competition plays two important roles in adaptive diversification, one well known and the other only now emerging, mediated through its effect on the evolution of phenotypic plasticity [Current Zoology 59(4): 537–552, 2013].

  6. Diffusion Adaptation over Networks

    Sayed, Ali H

    2012-01-01

    Adaptive networks are well suited to performing decentralized information processing and optimization tasks and to modeling various types of self-organized and complex behavior encountered in nature. Adaptive networks consist of a collection of agents with processing and learning abilities. The agents are linked together through a connection topology, and they cooperate with each other through local interactions to solve distributed inference problems in real time. The continuous diffusion of information across the network enables agents to adapt their performance to changing data and network conditions; it also results in improved adaptation and learning performance relative to non-cooperative networks. This article provides an overview of diffusion strategies for adaptation and learning over networks. The article is divided into several sections: 1. Motivation; 2. Mean-Square-Error Estimation; 3. Distributed Optimization via Diffusion Strategies; 4. Adaptive Diffusion Strategies; 5. Performance of Ste...
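
    As a concrete illustration of one such diffusion strategy, the following sketch implements adapt-then-combine (ATC) diffusion LMS for a common estimation task; the network size, topology and noise level are illustrative:

    ```python
    import numpy as np

    # Adapt-then-combine (ATC) diffusion LMS over a small fully connected
    # network of agents estimating a common parameter vector.
    rng = np.random.default_rng(0)
    N, M, mu = 5, 4, 0.02                 # agents, filter length, step size
    w_true = rng.standard_normal(M)       # common parameter to estimate
    A = np.full((N, N), 1.0 / N)          # uniform combination weights
    W = np.zeros((N, M))                  # one local estimate per agent

    for _ in range(2000):
        psi = np.empty_like(W)
        for k in range(N):                # adapt: local LMS step per agent
            u = rng.standard_normal(M)    # local regressor
            d = u @ w_true + 0.1 * rng.standard_normal()  # noisy measurement
            psi[k] = W[k] + mu * (d - u @ W[k]) * u
        W = A @ psi                       # combine: fuse neighbours' estimates

    print(np.linalg.norm(W - w_true, axis=1))  # all agents close to w_true
    ```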

  7. Quantifying adaptive evolution in the Drosophila immune system.

    Darren J Obbard

    2009-10-01

    It is estimated that a large proportion of amino acid substitutions in Drosophila have been fixed by natural selection, and as organisms face an ever-changing array of pathogens and parasites to which they must adapt, we have investigated parasite-mediated selection as a likely cause. To quantify the effect, and to identify which genes and pathways are most likely to be involved in the host-parasite arms race, we re-sequenced population samples of 136 immunity and 287 position-matched non-immunity genes in two species of Drosophila. Using these data, and a new extension of the McDonald-Kreitman approach, we estimate that natural selection fixes advantageous amino acid changes in immunity genes at nearly double the rate of other genes. We find the rate of adaptive evolution in immunity genes is also more variable than in other genes, with a small subset of immune genes evolving under intense selection. These genes, which are likely to represent hotspots of host-parasite coevolution, tend to share similar functions or belong to the same pathways, such as the antiviral RNAi pathway and the IMD signalling pathway. These patterns appear to be general features of immune system evolution in both species, as rates of adaptive evolution are correlated between the D. melanogaster and D. simulans lineages. In summary, our data provide quantitative estimates of the elevated rate of adaptive evolution in immune system genes relative to the rest of the genome, and they suggest that adaptation to parasites is an important force driving molecular evolution.
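
    The classic McDonald-Kreitman-style estimator underlying such analyses (here the simple form popularized by Smith and Eyre-Walker, not the paper's specific extension) can be stated in a few lines; the counts below are invented:

    ```python
    def alpha_mk(dn, ds, pn, ps):
        """Proportion of amino acid substitutions fixed by positive selection,
        from nonsynonymous/synonymous divergence (dn, ds) and polymorphism
        (pn, ps) counts: alpha = 1 - (ds*pn)/(dn*ps)."""
        return 1.0 - (ds * pn) / (dn * ps)

    print(alpha_mk(dn=40, ds=60, pn=10, ps=45))  # ~0.67 for these made-up counts
    ```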

  8. Consciousness And Adaptive Behavior

    Sieb, Richard/A.

    2005-01-01

    Consciousness has resisted scientific explanation for centuries. The main problem in explaining consciousness is its subjectivity. Subjective systems may be adaptive. Humans can produce voluntary new or novel intentional (adaptive) action and such action is always accompanied by consciousness. Action normally arises from perception. Perception must be rerepresented in order to produce new or novel adaptive action. The internal explicit states produced by a widespread nonlinear emergen...

  9. Human Adaptations: Free divers

    Tournat, Troy Z.

    2014-01-01

    Freediving has been around for thousands of years and was the only way to dive until the invention of oxygen tanks in the 19th century. Around the world, people dove for goods such as pearls, and today people freedive for sport. Divers stretch the limits of their body's and mind's capabilities through physiological adaptations involving thermal, respiratory, and cardiovascular responses. Findings conclude that thermal adaptation is a process similar to the cold adaptive response. With the implementation of wets...

  10. Assessing institutional capacities to adapt to climate change - integrating psychological dimensions in the Adaptive Capacity Wheel

    Grothmann, T.; Grecksch, K.; Winges, M.; Siebenhüner, B.

    2013-03-01

    Several case studies show that "soft social factors" (e.g. institutions, perceptions, social capital) strongly affect social capacities to adapt to climate change. Many soft social factors can probably be changed faster than "hard social factors" (e.g. economic and technological development) and are therefore particularly important for building social capacities. However, there are almost no methodologies for the systematic assessment of soft social factors. Gupta et al. (2010) have developed the Adaptive Capacity Wheel (ACW) for assessing the adaptive capacity of institutions. The ACW differentiates 22 criteria to assess six dimensions: variety, learning capacity, room for autonomous change, leadership, availability of resources, fair governance. To include important psychological factors we extended the ACW by two dimensions: "adaptation motivation" refers to actors' motivation to realise, support and/or promote adaptation to climate. "Adaptation belief" refers to actors' perceptions of realisability and effectiveness of adaptation measures. We applied the extended ACW to assess adaptive capacities of four sectors - water management, flood/coastal protection, civil protection and regional planning - in North Western Germany. The assessments of adaptation motivation and belief provided a clear added value. The results also revealed some methodological problems in applying the ACW (e.g. overlap of dimensions), for which we propose methodological solutions.

  11. Assessing institutional capacities to adapt to climate change: integrating psychological dimensions in the Adaptive Capacity Wheel

    Grothmann, T.; Grecksch, K.; Winges, M.; Siebenhüner, B.

    2013-12-01

    Several case studies show that social factors like institutions, perceptions and social capital strongly affect social capacities to adapt to climate change. Together with economic and technological development they are important for building social capacities. However, there are almost no methodologies for the systematic assessment of social factors. After reviewing existing methodologies we identify the Adaptive Capacity Wheel (ACW) by Gupta et al. (2010), developed for assessing the adaptive capacity of institutions, as the most comprehensive and operationalised framework to assess social factors. The ACW differentiates 22 criteria to assess 6 dimensions: variety, learning capacity, room for autonomous change, leadership, availability of resources, fair governance. To include important psychological factors we extended the ACW by two dimensions: "adaptation motivation" refers to actors' motivation to realise, support and/or promote adaptation to climate; "adaptation belief" refers to actors' perceptions of realisability and effectiveness of adaptation measures. We applied the extended ACW to assess adaptive capacities of four sectors - water management, flood/coastal protection, civil protection and regional planning - in northwestern Germany. The assessments of adaptation motivation and belief provided a clear added value. The results also revealed some methodological problems in applying the ACW (e.g. overlap of dimensions), for which we propose methodological solutions.

  12. Returning Samples from Enceladus

    Tsou, P.; Kanik, I.; Brownlee, D.; McKay, C.; Anbar, A.; Glavin, D.; Yano, H.

    2012-12-01

    From the first half century of space exploration, we have obtained samples only from the Moon, comet Wild 2, the solar wind and the asteroid Itokawa. In-depth analyses of these samples in terrestrial laboratories have yielded profound knowledge that could not have been obtained without returned samples. While obtaining samples from Solar System bodies is crucial science, it is rarely done owing to cost and complexity. Cassini's discovery of geysers and organic materials on Enceladus indicates that there is an exceptional opportunity and science rationale for a low-cost flyby sample return mission, similar to what was done by Stardust. The earliest possible low-cost flight opportunity is the next Discovery Mission [Tsou et al 2012]. Enceladus Plume Discovery - While Voyager provided evidence for young surfaces on Enceladus, the existence of the Enceladus plumes was discovered by Cassini. Enceladus and comets are the only known solar system bodies that have jets, enabling sample collection without landing or surface contact. Cassini in situ Findings - Cassini made many discoveries at Saturn, including the breakup of large organics in the plumes of Enceladus. Four prime criteria for habitability are liquid water, a heat source, organics and nitrogen [McKay et al. 2008, Waite et al. 2009, Postberg et al. 2011]. Of all the NASA-designated habitability targets, Enceladus is the single body that presents evidence for all four criteria. Significant advances in the exploration of the biological potential of Enceladus can be made on returned samples in terrestrial laboratories, where the full power of state-of-the-art laboratory instrumentation and procedures can be used. Without serious limits on power, mass or even cost, terrestrial laboratories provide the ultimate in analytical capability, adaptability, reproducibility and reliability. What Questions can Samples Address? - Samples collected from the Enceladus plume will enable a thorough and replicated

  13. Adaptive Multimedia Retrieval: Semantics, Context, and Adaptation

    This book constitutes the thoroughly refereed post-conference proceedings of the 10th International Conference on Adaptive Multimedia Retrieval, AMR 2012, held in Copenhagen, Denmark, in October 2012. The 17 revised full papers presented were carefully reviewed and selected from numerous submissions.

  14. Decentralized adaptive control

    Oh, B. J.; Jamshidi, M.; Seraji, H.

    1988-01-01

    A decentralized adaptive control is proposed to stabilize and track nonlinear, interconnected subsystems with unknown parameters. The adaptation of the controller gain is derived using model reference adaptive control theory based on Lyapunov's direct method. The adaptive gains consist of sigma, proportional, and integral combinations of the measured and reference values of the corresponding subsystem. The proposed control is applied to the joint control of a two-link robot manipulator, and the performance in computer simulation corresponds with what is expected from the theoretical development.
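
    A single-loop sketch of Lyapunov-based model reference adaptive control conveys the flavor of the gain adaptation; the decentralized, multi-subsystem structure and the sigma/proportional/integral gain combination of the paper are omitted, and all parameters are invented:

    ```python
    import numpy as np

    # First-order MRAC with a Lyapunov-rule gain update (sign of b assumed
    # known and positive); Euler integration of plant and reference model.
    a, b = 1.0, 2.0             # unknown plant: x' = a*x + b*u
    am, bm = -4.0, 4.0          # reference model: xm' = am*xm + bm*r
    gamma, dt = 5.0, 1e-3       # adaptation gain, step size
    x = xm = th_r = th_x = 0.0  # states and adaptive gains

    for k in range(100000):
        r = 1.0 if (k * dt) % 10 < 5 else -1.0   # square-wave reference
        u = th_r * r + th_x * x                  # adaptive control law
        e = x - xm                               # model-following error
        th_r -= gamma * e * r * dt               # Lyapunov adaptation laws
        th_x -= gamma * e * x * dt
        x += (a * x + b * u) * dt
        xm += (am * xm + bm * r) * dt

    print(th_r, th_x)   # converge toward bm/b = 2 and (am - a)/b = -2.5
    ```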

  15. Adaptive Wireless Transceiver Project

    National Aeronautics and Space Administration — Wireless technologies are an increasingly attractive means for spatial data, input, manipulation, and distribution. Mobitrum is proposing an innovative Adaptive...

  16. Local adaptation in brown trout early life-history traits: implications for climate change adaptability

    Jensen, L.F.; Hansen, Michael Møller; Pertoldi, C.;

    2008-01-01

    adapt. Temperature-related adaptability in traits related to phenology and early life history are expected to be particularly important in salmonid fishes. We focused on the latter and investigated whether four populations of brown trout (Salmo trutta) are locally adapted in early life-history traits......) for two traits, indicating local adaptation. A temperature effect was observed for three traits. However, this effect varied among populations due to locally adapted reaction norms, corresponding to the temperature regimes experienced by the populations in their native environments. Additive genetic...... variance and heritable variation in phenotypic plasticity suggest that although increasing temperatures are likely to affect some populations negatively, they may have the potential to adapt to changing temperature regimes.  ...

  17. The technological influence on health professionals' care: translation and adaptation of scales

    Almeida, Carlos Manuel Torres; Almeida, Filipe Nuno Alves dos Santos; Escola, Joaquim José Jacinto; Rodrigues, Vitor Manuel Costa Pereira

    2016-01-01

    Objectives: in this study, two research tools were validated to study the impact of technological influence on health professionals' care practice. Methods: the following methodological steps were taken: bibliographic review, selection of the scales, translation and cultural adaptation and analysis of psychometric properties. Results: the psychometric properties of the scale were assessed based on its application to a sample of 341 individuals (nurses, physicians, final-year nursing and medical students). The validity, reliability and internal consistency were tested. Two scales were found: Caring Attributes Questionnaire (adapted) with a Cronbach's Alpha coefficient of 0.647 and the Technological Influence Questionnaire (adapted) with an Alpha coefficient of 0.777. Conclusions: the scales are easy to apply and reveal reliable psychometric properties, an additional quality as they permit generalized studies on a theme as important as the impact of technological influence in health care. PMID:27143537

  18. A General Framework for Sequential and Adaptive Methods in Survival Studies

    Luo, Xiaolong; Ying, Zhiliang

    2011-01-01

    Adaptive treatment allocation schemes based on interim responses have generated a great deal of recent interest in clinical trials and other follow-up studies. An important application of such schemes is in survival studies, where the response variable of interest is the time to occurrence of a certain event. Due to possible dependency structures inherited from the enrollment and allocation schemes, existing approaches to survival models, including those that handle staggered entry, cannot be applied directly. This paper develops a new general framework, with its theoretical foundation, for handling such adaptive designs. The new approach is based on marked point processes and differs from existing approaches in that it considers entry and calendar times rather than survival and calendar times. Large sample properties, which are essential for statistical inference, are established. Special attention is given to the Cox model and related score processes. Applications to adaptive and sequential designs are discus...

  19. Ship detection for high resolution optical imagery with adaptive target filter

    Ju, Hongbin

    2015-10-01

    Ship detection is important due to both its civil and military uses. In this paper, we propose a novel ship detection method, the Adaptive Target Filter (ATF), for high-resolution optical imagery. The proposed framework can be grouped into two stages. In the first stage, a test image is densely divided into different detection windows and each window is transformed into a feature vector in its feature space, with Histograms of Oriented Gradients (HOG) accumulated as the basic feature descriptor. In the second stage, the proposed ATF highlights all the ship regions and suppresses the undesired backgrounds adaptively. Each detection window is assigned a score representing the degree to which the window belongs to a certain ship category. The ATF can be obtained adaptively by weighted Logistic Regression (WLR) according to the distribution of backgrounds and targets in the input image. The main innovation of our method is that we only need to collect positive training samples to build the filter, while the negative training samples are generated adaptively from the input image. This differs from other classification methods such as the Support Vector Machine (SVM) and Logistic Regression (LR), which need both positive and negative training samples. Experimental results on 1-m high-resolution optical images show the proposed method achieves the desired ship detection performance with higher quality and robustness than other methods, e.g., SVM and LR.
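
    A sketch of the second-stage scoring idea, assuming HOG features and a weighted logistic regression trained on offline positives plus background windows mined from the input image; the helper names and the weighting rule are assumptions, not the paper's implementation:

    ```python
    import numpy as np
    from skimage.feature import hog
    from sklearn.linear_model import LogisticRegression

    def train_atf(positive_windows, background_windows, bg_weight=0.5):
        """Weighted logistic regression scorer: positives collected offline,
        negatives drawn adaptively from the input image's own background."""
        feats = lambda ws: np.array([hog(w, pixels_per_cell=(8, 8)) for w in ws])
        X = np.vstack([feats(positive_windows), feats(background_windows)])
        y = np.r_[np.ones(len(positive_windows)), np.zeros(len(background_windows))]
        wts = np.r_[np.ones(len(positive_windows)),
                    np.full(len(background_windows), bg_weight)]
        clf = LogisticRegression(max_iter=1000)
        clf.fit(X, y, sample_weight=wts)   # weighted LR fit
        return clf   # clf.predict_proba(features)[:, 1] scores new windows
    ```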

  20. Determination of Sample Size

    Naing, Nyi Nyi

    2003-01-01

    It is particularly important to determine the basic minimum required sample size 'n' to measure a particular characteristic of a particular population. This article highlights the determination of an appropriate sample size for estimating population parameters.

  1. Multiple branched adaptive steered molecular dynamics

    Ozer, Gungor; Keyes, Thomas; Quirk, Stephen; Hernandez, Rigoberto

    2014-08-01

    Steered molecular dynamics, SMD, [S. Park and K. Schulten, J. Chem. Phys. 120, 5946 (2004)] combined with Jarzynski's equality has been used widely in generating free energy profiles for various biological problems, e.g., protein folding and ligand binding. However, the calculated averages are generally dominated by "rare events" from the ensemble of nonequilibrium trajectories. The recently proposed adaptive steered molecular dynamics, ASMD, introduced a new idea for selecting important events and eliminating the non-contributing trajectories, thus decreasing the overall computation needed. ASMD was shown to reduce the number of trajectories needed by a factor of 10 in a benchmarking study of decaalanine stretching. Here we propose a novel, highly efficient "multiple branching" (MB) version, MB-ASMD, which obtains a more complete enhanced sampling of the important trajectories, while still eliminating non-contributing segments. Compared to selecting a single configuration in ASMD, MB-ASMD offers to select multiple configurations at each segment along the reaction coordinate based on the distribution of work trajectories. We show that MB-ASMD has all benefits of ASMD such as faster convergence of the PMF even when pulling 1000 times faster than the reversible limit while greatly reducing the probability of getting trapped in a non-significant path. We also analyze the hydrogen bond breaking within the decaalanine peptide as we force the helix into a random coil and confirm ASMD results with less noise in the numerical averages.
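
    The Jarzynski average that SMD and ASMD rely on estimates the free energy difference from nonequilibrium work samples, dF = -kT ln<exp(-W/kT)>; a minimal sketch with synthetic work values:

    ```python
    import numpy as np

    def jarzynski_free_energy(work, kT=0.593):   # kT in kcal/mol near 298 K
        """Free-energy estimate from nonequilibrium work samples via
        Jarzynski's equality, with a log-sum-exp shift for stability;
        the average is dominated by rare low-work trajectories."""
        w = np.asarray(work)
        shift = w.min()
        return shift - kT * np.log(np.mean(np.exp(-(w - shift) / kT)))

    work = np.random.default_rng(1).normal(12.0, 2.0, size=200)  # synthetic W
    print(jarzynski_free_energy(work))  # below mean(W): dissipation biases W up
    ```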

  2. Twenty-five years of confirmatory adaptive designs: opportunities and pitfalls.

    Bauer, Peter; Bretz, Frank; Dragalin, Vladimir; König, Franz; Wassmer, Gernot

    2016-02-10

    'Multistage testing with adaptive designs' was the title of an article by Peter Bauer that appeared in 1989 in the German journal Biometrie und Informatik in Medizin und Biologie. The journal no longer exists, but the methodology found widespread interest in the scientific community over the past 25 years. The use of such multistage adaptive designs raised many controversial discussions from the beginning, especially after the publication by Bauer and Köhne in 1994 in Biometrics: broad enthusiasm about potential applications of such designs faced critical positions regarding their statistical efficiency. Despite, or possibly because of, this controversy, the methodology and its areas of application grew steadily over the years, with significant contributions from statisticians working in academia, industry and agencies around the world. In the meantime, this type of adaptive design has become the subject of two major regulatory guidance documents in the US and Europe, and the field is still evolving. Developments are particularly noteworthy in the most important applications of adaptive designs, including sample size reassessment, treatment selection procedures, and population enrichment designs. In this article, we summarize the developments of the past 25 years from different perspectives. We provide a historical overview of the early days, review the key methodological concepts and summarize regulatory and industry perspectives on such designs. Then, we illustrate the application of adaptive designs with three case studies, including unblinded sample size reassessment, adaptive treatment selection, and adaptive endpoint selection. We also discuss the availability of software for evaluating and performing such designs. We conclude with a critical review of how the expectations from the beginning were fulfilled and, if not, discuss potential reasons why this did not happen. PMID:25778935
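
    For the flavor of such designs, here is a sketch of a two-stage test with Fisher's product combination, the construction introduced by Bauer and Köhne; the early-stopping boundaries are illustrative rather than the calibrated published values:

    ```python
    import math
    from scipy import stats

    def two_stage_fisher(p1, p2=None, alpha1=0.01, alpha0=0.5, alpha=0.025):
        """Two-stage test with Fisher's product combination, in the spirit
        of Bauer and Koehne (1994). In the real method the stage-two
        critical value is calibrated jointly with alpha1 and alpha0."""
        if p1 <= alpha1:
            return "reject at interim"       # early stop for efficacy
        if p1 >= alpha0 or p2 is None:
            return "stop for futility"       # early stop, stage 2 never run
        # stage 2 may use a reassessed sample size; the combination only
        # requires the stage-wise p-values to be independent and uniform
        stat = -2.0 * math.log(p1 * p2)
        return "reject" if stat > stats.chi2.ppf(1 - alpha, df=4) else "accept"

    print(two_stage_fisher(0.08, 0.02))      # stat ~ 12.9 > 11.1 -> "reject"
    ```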

  3. Comparison of semi-automatized assays for anti-T. gondii IgG detection in low-reactivity serum samples: importance of the results in patient counseling

    Paulo Guilherme Leser

    2003-06-01

    Toxoplasmosis is a disease which can cause severe congenital infection and is normally diagnosed by the detection of T. gondii-specific antibodies in the serum of infected patients. Several different tests make it possible to distinguish recent from past infections and to quantify anti-T. gondii specific IgG, and the results can be used as markers for immunity. In the present study, we compare the performance of two different methodologies, the Elfa (bioMérieux S.A.) and the Meia (Abbott Laboratories), in detecting T. gondii-specific IgG in low-reactivity sera. Of 76 analyzed samples, three presented discrepant results, being positive in the Abbott AxSYM Toxo IgG assay and negative in the bioMérieux Vidas Toxo IgG II assay. By using other tests, the three sera were confirmed to be negative. The results are discussed in the context of their importance for patient management, especially during pregnancy.

  4. Pulse front adaptive optics in multiphoton microscopy

    Sun, B.; Salter, P. S.; Booth, M. J.

    2016-03-01

    The accurate focusing of ultrashort laser pulses is extremely important in multiphoton microscopy. Using adaptive optics to manipulate the incident ultrafast beam in either the spectral or spatial domain can introduce significant benefits when imaging. Here we introduce pulse front adaptive optics: manipulating an ultrashort pulse in both the spatial and temporal domains. A deformable mirror and a spatial light modulator are operated in concert to modify contours of constant intensity in space and time within an ultrashort pulse. Through adaptive control of the pulse front, we demonstrate an enhancement in the measured fluorescence from a two photon microscope.

  5. Pulse front control with adaptive optics

    Sun, B.; Salter, P. S.; Booth, M. J.

    2016-03-01

    The focusing of ultrashort laser pulses is extremely important for processes including microscopy, laser fabrication and fundamental science. Adaptive optic elements, such as liquid crystal spatial light modulators or membrane deformable mirrors, are routinely used for the correction of aberrations in these systems, leading to improved resolution and efficiency. Here, we demonstrate that adaptive elements used with ultrashort pulses should not be considered simply in terms of wavefront modification, but that changes to the incident pulse front can also occur. We experimentally show how adaptive elements may be used to engineer pulse fronts with spatial resolution.

  6. Adaptation through chromosomal inversions in Anopheles

    Diego eAyala

    2014-05-01

    Chromosomal inversions have been repeatedly involved in local adaptation in a large number of animals and plants. The ecological and behavioral plasticity of Anopheles species - human malaria vectors - is mirrored by large numbers of polymorphic inversions. The adaptive significance of chromosomal inversions has been consistently attested by strong and significant correlations between their frequencies and a number of phenotypic traits. Here, we provide an extensive literature review of the different adaptive traits associated with chromosomal inversions in the genus Anopheles. Traits having important consequences for the success of present and future vector control measures, such as insecticide resistance and behavioral changes, are discussed.

  7. Adaptive Control Algorithms, Analysis and Applications

    Landau, Ioan; Lozano, Rogelio; M'Saad, Mohammed; Karimi, Alireza

    2011-01-01

    Adaptive Control (second edition) shows how a desired level of system performance can be maintained automatically and in real time, even when process or disturbance parameters are unknown and variable. It is a coherent exposition of the many aspects of this field, setting out the problems to be addressed and moving on to solutions, their practical significance and their application. Discrete-time aspects of adaptive control are emphasized to reflect the importance of digital computers in the ...

  8. Energy-efficient adaptive wireless network design

    Havinga, Paul J. M.; Smit, Gerard J.M.; Bos, Martinus

    2000-01-01

    Energy efficiency is an important issue for mobile computers since they must rely on their batteries. We present an energy-efficient highly adaptive architecture of a network interface and novel data link layer protocol for wireless networks that provides quality of service (QoS) support for diverse traffic types. Due to the dynamic nature of wireless networks, adaptations are necessary to achieve energy efficiency and an acceptable quality of service. The paper provides a review of ideas and...

  9. Parallel Adaptive Mesh Refinement

    Diachin, L; Hornung, R; Plassmann, P; WIssink, A

    2005-03-04

    As large-scale, parallel computers have become more widely available and numerical models and algorithms have advanced, the range of physical phenomena that can be simulated has expanded dramatically. Many important science and engineering problems exhibit solutions with localized behavior where highly-detailed salient features or large gradients appear in certain regions which are separated by much larger regions where the solution is smooth. Examples include chemically-reacting flows with radiative heat transfer, high Reynolds number flows interacting with solid objects, and combustion problems where the flame front is essentially a two-dimensional sheet occupying a small part of a three-dimensional domain. Modeling such problems numerically requires approximating the governing partial differential equations on a discrete domain, or grid. Grid spacing is an important factor in determining the accuracy and cost of a computation. A fine grid may be needed to resolve key local features while a much coarser grid may suffice elsewhere. Employing a fine grid everywhere may be inefficient at best and, at worst, may make an adequately resolved simulation impractical. Moreover, the location and resolution of the fine grid required for an accurate solution is a dynamic property of a problem's transient features and may not be known a priori. Adaptive mesh refinement (AMR) is a technique that can be used with both structured and unstructured meshes to adjust local grid spacing dynamically to capture solution features with an appropriate degree of resolution. Thus, computational resources can be focused where and when they are needed most to efficiently achieve an accurate solution without incurring the cost of a globally-fine grid. Figure 1.1 shows two example computations using AMR; on the left is a structured mesh calculation of an impulsively sheared contact surface and on the right is the fuselage and volume discretization of an RAH-66 Comanche helicopter [35]. Note the
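
    The refinement criterion at the heart of AMR can be conveyed with a toy one-dimensional example that flags cells near a sharp front; real frameworks operate on hierarchical block-structured or unstructured grids:

    ```python
    import numpy as np

    def flag_for_refinement(u, dx, grad_tol):
        """Mark cells whose local gradient exceeds a tolerance; a toy 1-D
        stand-in for the refinement criterion in AMR."""
        return np.abs(np.gradient(u, dx)) > grad_tol

    x = np.linspace(0.0, 1.0, 200)
    u = np.tanh((x - 0.5) / 0.02)          # sharp front near x = 0.5
    flags = flag_for_refinement(u, x[1] - x[0], grad_tol=5.0)
    print(f"{flags.sum()} of {flags.size} cells flagged")  # only near the front
    ```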

  10. Compressive adaptive computational ghost imaging

    Aßmann, Marc; 10.1038/srep01545

    2013-01-01

    Compressive sensing is considered a huge breakthrough in signal acquisition. It allows recording an image consisting of $N^2$ pixels using much fewer than $N^2$ measurements if it can be transformed to a basis where most pixels take on negligibly small values. Standard compressive sensing techniques suffer from the computational overhead needed to reconstruct an image with typical computation times between hours and days and are thus not optimal for applications in physics and spectroscopy. We demonstrate an adaptive compressive sampling technique that performs measurements directly in a sparse basis. It needs much fewer than $N^2$ measurements without any computational overhead, so the result is available instantly.

  11. Cardiovascular adaptations to exercise training

    Hellsten, Ylva; Nyberg, Michael

    2016-01-01

    Aerobic exercise training leads to cardiovascular changes that markedly increase aerobic power and lead to improved endurance performance. The functionally most important adaptation is the improvement in maximal cardiac output which is the result of an enlargement in cardiac dimension, improved...... arteries is reduced, a factor contributing to increased arterial compliance. Endurance training may also induce alterations in the vasodilator capacity, although such adaptations are more pronounced in individuals with reduced vascular function. The microvascular net increases in size within the muscle...... allowing for an improved capacity for oxygen extraction by the muscle through a greater area for diffusion, a shorter diffusion distance, and a longer mean transit time for the erythrocyte to pass through the smallest blood vessels. The present article addresses the effect of endurance training on systemic...

  12. Behavioral Adaptation and Acceptance

    Martens, M.H.; Jenssen, G.D.

    2012-01-01

    One purpose of Intelligent Vehicles is to improve road safety, throughput, and emissions. However, the predicted effects are not always as large as aimed for. Part of this is due to indirect behavioral changes of drivers, also called behavioral adaptation. Behavioral adaptation (BA) refers to uninte

  13. Behavioural adaptation and acceptance

    Martens, M.H.; Jenssen, G.D.; Eskandarian, A.

    2012-01-01

    One purpose of Intelligent Vehicles is to improve road safety, throughput, and emissions. However, the predicted effects are not always as large as aimed for. Part of this is due to indirect behavioral changes of drivers, also called behavioral adaptation. Behavioral adaptation (BA) refers to uninte

  14. Adaptive Capacity and Traps

    William A. Brock

    2008-12-01

    Adaptive capacity is the ability of a living system, such as a social–ecological system, to adjust responses to changing internal demands and external drivers. Although adaptive capacity is a frequent topic of study in the resilience literature, there are few formal models. This paper introduces such a model and uses it to explore adaptive capacity by contrast with the opposite condition, or traps. In a social–ecological rigidity trap, strong self-reinforcing controls prevent the flexibility needed for adaptation. In the model, too much control erodes adaptive capacity and thereby increases the risk of catastrophic breakdown. In a social–ecological poverty trap, loose connections prevent the mobilization of ideas and resources to solve problems. In the model, too little control impedes the focus needed for adaptation. Fluctuations of internal demand or external shocks generate pulses of adaptive capacity, which may gain traction and pull the system out of the poverty trap. The model suggests some general properties of traps in social–ecological systems. It is general and flexible, so it can be used as a building block in more specific and detailed models of adaptive capacity for a particular region.

  15. Importance of Corneal Thickness

    Corneal thickness is important because it can mask an accurate reading of eye pressure, causing doctors to treat you ...

  16. Imported Inputs and Productivity

    László Halpern; Miklós Koren; Adam Szeidl

    2011-01-01

    We estimate a model of importers in Hungarian micro data and conduct counterfactual policy analysis to investigate the effect of imports on productivity. We find that importing all foreign varieties would increase firm productivity by 12 percent, almost two-fifths of which is due to imperfect substitution between foreign and domestic goods. The effectiveness of import use is higher for foreign firms and increases when a firm becomes foreign-owned. Our estimates imply that during 1993-2002 one...

  17. Sample size estimation and sampling techniques for selecting a representative sample

    Aamir Omair

    2014-01-01

    Introduction: The purpose of this article is to provide a general understanding of the concepts of sampling as applied to health-related research. Sample Size Estimation: It is important to select a representative sample in quantitative research in order to be able to generalize the results to the target population. The sample should be of the required sample size and must be selected using an appropriate probability sampling technique. There are many hidden biases which can adversely affect the outcome of the study. Important factors to consider when estimating the sample size include the size of the study population, the confidence level, the expected proportion of the outcome variable (for categorical variables) or the standard deviation of the outcome variable (for numerical variables), and the required precision (margin of accuracy) of the study. The greater the required precision, the larger the required sample size. Sampling Techniques: The probability sampling techniques applied in health-related research include simple random sampling, systematic random sampling, stratified random sampling, cluster sampling, and multistage sampling. These are recommended over nonprobability sampling techniques because the results of the study can then be generalized to the target population.
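
    For a categorical outcome, those factors plug into the standard normal-approximation formula n = z^2 p(1-p)/d^2; a minimal sketch (finite-population correction omitted):

    ```python
    import math

    def sample_size_proportion(p_expected, margin, confidence=0.95):
        """Minimum n to estimate a proportion to within +/- margin,
        using n = z^2 * p * (1 - p) / d^2."""
        # two-sided z critical values for common confidence levels
        z = {0.90: 1.645, 0.95: 1.96, 0.99: 2.576}[confidence]
        return math.ceil(z**2 * p_expected * (1 - p_expected) / margin**2)

    print(sample_size_proportion(0.30, 0.05))  # 323 respondents at 95% confidence
    ```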

  18. Adapt or Become Extinct!

    Goumas, Georgios; McKee, Sally A.; Själander, Magnus;

    2011-01-01

    boundaries (walls) for applications which limit software development (parallel programming wall), performance (memory wall, communication wall) and viability (power wall). The only way to survive in such a demanding environment is by adaptation. In this paper we discuss how dynamic information collected...... during the execution of an application can be utilized to adapt the execution context and may lead to performance gains beyond those provided by static information and compile-time adaptation. We consider specialization based on dynamic information like user input, architectural characteristics such as...... from static analysis (either during ahead-of-time or just-in-time) compilation. We extend the notion of information-driven adaptation and outline the architecture of an infrastructure designed to enable information flow and adaptation throughout the life-cycle of an application....

  19. Adaptive noise cancellation

    In this report we describe the concept of adaptive noise canceling, an alternative method of estimating signals corrupted by additive noise or interference. The method uses a 'primary' input containing the corrupted signal and a 'reference' input containing noise correlated in some unknown way with the primary noise; the reference input is adaptively filtered and subtracted from the primary input to obtain the signal estimate. Adaptive filtering before subtraction allows the treatment of inputs that are deterministic or stochastic, stationary or time-variable. When the reference input is free of signal and certain other conditions are met, the noise in the primary input can be essentially eliminated without signal distortion. It is further shown that the adaptive filter also acts as a notch filter. Simulated results illustrate the usefulness of the adaptive noise canceling technique. (author)
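
    A minimal LMS-based sketch of the canceller described; the correlated path from the reference noise to the primary noise is simulated here with an arbitrary FIR filter:

    ```python
    import numpy as np

    def lms_canceller(primary, reference, n_taps=16, mu=0.01):
        """Adaptive noise canceller: filter the reference with LMS and
        subtract it from the primary; the error is the signal estimate."""
        w = np.zeros(n_taps)
        out = np.zeros(len(primary))
        for n in range(n_taps, len(primary)):
            x = reference[n - n_taps:n][::-1]   # recent reference samples
            e = primary[n] - w @ x              # error = cleaned sample
            w += 2 * mu * e * x                 # LMS weight update
            out[n] = e
        return out

    t = np.arange(8000) / 8000.0
    signal = np.sin(2 * np.pi * 5 * t)
    noise = np.random.default_rng(2).standard_normal(t.size)
    primary = signal + np.convolve(noise, [0.6, 0.3, 0.1], mode="same")
    cleaned = lms_canceller(primary, noise)     # converges toward `signal`
    ```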

  20. Adaptive signal processor

    An experimental, general purpose adaptive signal processor system has been developed, utilizing a quantized (clipped) version of the Widrow-Hoff least-mean-square adaptive algorithm developed by Moschner. The system accommodates 64 adaptive weight channels with 8-bit resolution for each weight. Internal weight update arithmetic is performed with 16-bit resolution, and the system error signal is measured with 12-bit resolution. An adapt cycle of adjusting all 64 weight channels is accomplished in 8 μsec. Hardware of the signal processor utilizes primarily Schottky-TTL type integrated circuits. A prototype system with 24 weight channels has been constructed and tested. This report presents details of the system design and describes basic experiments performed with the prototype signal processor. Finally some system configurations and applications for this adaptive signal processor are discussed
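
    The quantized update at the core of such a processor can be sketched in a few lines; the report's exact quantization is not reproduced here, so the sign-clipping of the regressor is an assumption:

    ```python
    import numpy as np

    def clipped_lms_update(w, x, d, mu=2.0 ** -8):
        """One weight-update cycle of clipped LMS: the regressor is replaced
        by its sign, so each weight changes by mu*e*(+/-1), reducing the
        multiply to an add/subtract in fixed-point hardware."""
        e = d - w @ x                 # error measurement
        return w + mu * e * np.sign(x), e
    ```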

  1. Contributions to sampling statistics

    Conti, Pier; Ranalli, Maria

    2014-01-01

    This book contains a selection of the papers presented at the ITACOSM 2013 Conference, held in Milan in June 2013. ITACOSM is the bi-annual meeting of the Survey Sampling Group S2G of the Italian Statistical Society, intended as an international  forum of scientific discussion on the developments of theory and application of survey sampling methodologies and applications in human and natural sciences. The book gathers research papers carefully selected from both invited and contributed sessions of the conference. The whole book appears to be a relevant contribution to various key aspects of sampling methodology and techniques; it deals with some hot topics in sampling theory, such as calibration, quantile-regression and multiple frame surveys, and with innovative methodologies in important topics of both sampling theory and applications. Contributions cut across current sampling methodologies such as interval estimation for complex samples, randomized responses, bootstrap, weighting, modeling, imputati...

  2. Adaptation of OCA-P, a probabilistic fracture-mechanics code, to a personal computer

    The OCA-P probabilistic fracture-mechanics code can now be executed on a personal computer with 512 kilobytes of memory, a math coprocessor, and a hard disk. A user's guide for this particular adaptation has been prepared, and additional importance sampling techniques for OCA-P have been developed that allow the sampling of only the tails of selected distributions. Features have also been added to OCA-P that permit RTNDT to be used as an "independent" variable in the calculation of P
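
    Sampling only the tails of a distribution amounts to drawing from the truncated distribution via the inverse CDF and carrying the truncated mass as an importance weight; a sketch with a Gaussian stand-in and an invented consequence function:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)

    def tail_expectation(f, threshold, n=10000):
        """Estimate E[f(X); X > threshold] for X ~ N(0,1) by sampling only
        the tail: draw from the truncated normal, reweight by tail mass."""
        tail_mass = stats.norm.sf(threshold)          # P(X > threshold)
        u = rng.uniform(1.0 - tail_mass, 1.0, size=n) # uniform in upper CDF range
        x = stats.norm.ppf(u)                         # inverse CDF: tail samples
        return tail_mass * f(x).mean()                # importance-weighted mean

    # e.g. a consequence that itself depends on the load x beyond the threshold:
    print(tail_expectation(lambda x: 1.0 - np.exp(-(x - 4.0)), 4.0))
    # small probability, but no samples are wasted below the threshold
    ```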

  3. Adaptive management of natural resources-framework and issues

    Williams, B.K.

    2011-01-01

    Adaptive management, an approach for simultaneously managing and learning about natural resources, has been around for several decades. Interest in adaptive decision making has grown steadily over that time, and by now many in natural resources conservation claim that adaptive management is the approach they use in meeting their resource management responsibilities. Yet there remains considerable ambiguity about what adaptive management actually is, and how it is to be implemented by practitioners. The objective of this paper is to present a framework and conditions for adaptive decision making, and discuss some important challenges in its application. Adaptive management is described as a two-phase process of deliberative and iterative phases, which are implemented sequentially over the timeframe of an application. Key elements, processes, and issues in adaptive decision making are highlighted in terms of this framework. Special emphasis is given to the question of geographic scale, the difficulties presented by non-stationarity, and organizational challenges in implementing adaptive management.

  4. Kinetic solvers with adaptive mesh in phase space.

    Arslanbekov, Robert R; Kolobov, Vladimir I; Frolova, Anna A

    2013-12-01

    An adaptive mesh in phase space (AMPS) methodology has been developed for solving multidimensional kinetic equations by the discrete velocity method. A Cartesian mesh for both configuration (r) and velocity (v) spaces is produced using a "tree of trees" (ToT) data structure. The r mesh is automatically generated around embedded boundaries, and is dynamically adapted to local solution properties. The v mesh is created on-the-fly in each r cell. Mappings between neighboring v-space trees are implemented for the advection operator in r space. We have developed algorithms for solving the full Boltzmann and linear Boltzmann equations with AMPS. Several recent innovations were used to calculate the discrete Boltzmann collision integral with dynamically adaptive v mesh: the importance sampling, multipoint projection, and variance reduction methods. We have developed an efficient algorithm for calculating the linear Boltzmann collision integral for elastic and inelastic collisions of hot light particles in a Lorentz gas. Our AMPS technique has been demonstrated for simulations of hypersonic rarefied gas flows, ion and electron kinetics in weakly ionized plasma, radiation and light-particle transport through thin films, and electron streaming in semiconductors. We have shown that AMPS allows minimizing the number of cells in phase space to reduce the computational cost and memory usage for solving challenging kinetic problems. PMID:24483578

  5. Kinetic solvers with adaptive mesh in phase space

    Arslanbekov, Robert R.; Kolobov, Vladimir I.; Frolova, Anna A.

    2013-12-01

    An adaptive mesh in phase space (AMPS) methodology has been developed for solving multidimensional kinetic equations by the discrete velocity method. A Cartesian mesh for both configuration (r) and velocity (v) spaces is produced using a “tree of trees” (ToT) data structure. The r mesh is automatically generated around embedded boundaries, and is dynamically adapted to local solution properties. The v mesh is created on-the-fly in each r cell. Mappings between neighboring v-space trees are implemented for the advection operator in r space. We have developed algorithms for solving the full Boltzmann and linear Boltzmann equations with AMPS. Several recent innovations were used to calculate the discrete Boltzmann collision integral with dynamically adaptive v mesh: the importance sampling, multipoint projection, and variance reduction methods. We have developed an efficient algorithm for calculating the linear Boltzmann collision integral for elastic and inelastic collisions of hot light particles in a Lorentz gas. Our AMPS technique has been demonstrated for simulations of hypersonic rarefied gas flows, ion and electron kinetics in weakly ionized plasma, radiation and light-particle transport through thin films, and electron streaming in semiconductors. We have shown that AMPS allows minimizing the number of cells in phase space to reduce the computational cost and memory usage for solving challenging kinetic problems.

  6. Modular microfluidic system for biological sample preparation

    Rose, Klint A.; Mariella, Jr., Raymond P.; Bailey, Christopher G.; Ness, Kevin Dean

    2015-09-29

    A reconfigurable modular microfluidic system for preparation of a biological sample, including a series of reconfigurable modules for automated sample preparation adapted to selectively include a) a microfluidic acoustic focusing filter module, b) a dielectrophoresis bacteria filter module, c) a dielectrophoresis virus filter module, d) an isotachophoresis nucleic acid filter module, e) a lysis module, and f) an isotachophoresis-based nucleic acid filter.

  7. Are integral controllers adapted to the new era of ELT adaptive optics?

    Conan, J.-M.; Raynaud, H.-F.; Kulcsár, C.; Meimon, S.

    2011-09-01

    With ELTs we are now entering a new era in adaptive optics developments. Meeting an unprecedented level of performance with incredibly complex systems implies reconsidering AO concepts at all levels, including controller design. Concentrating mainly on temporal aspects, one may wonder if integral controllers remain an adequate solution. This question is all the more important since, with ever larger numbers of degrees of freedom, one may be tempted to discard more sophisticated approaches because they are deemed too complex to implement. The respective performance of integrator versus LQG control should therefore be carefully evaluated in the ELT context. We recall for instance the impressive correction improvement brought by such controllers for the rejection of windshake and vibration components. The LQG controller significantly outperforms the integrator because its disturbance rejection transfer function closely matches the energy concentration, respectively at low temporal frequencies for windshake, and around localized resonant peaks for vibrations. The application to turbulent modes should also be investigated, especially for very low spatial frequencies now explored on the huge ELT pupil. The questions addressed here are: 1/ How do integral and LQG controllers compare in terms of performance for a given sampling frequency and noise level?; 2/ Could we relax the sampling frequency with LQG control?; 3/ Does a mode-to-mode adaptation of temporal rejection bring significant performance improvement?; 4/ Which modes particularly benefit from this fine-tuning of the rejection transfer function? Based on a simplified ELT AO configuration, and through a simple analytical formulation, performance is evaluated for several control approaches. Various assumptions concerning the perturbation parameters (seeing and outer-scale value, windshake amplitude) are considered. Bode's integral theorem allows intuitive understanding of the results. Practical implementation and computation complexity
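
    For reference, the integrator baseline being compared is just a scalar accumulation per mode; a toy single-mode closed loop with a one-frame measurement delay (gain and disturbance invented):

    ```python
    import numpy as np

    # Toy single-mode AO loop: residual = disturbance - correction, and the
    # integrator acts on the previous frame's measurement (one-frame delay).
    rng = np.random.default_rng(4)
    T, g = 5000, 0.5
    t = np.arange(T)
    disturbance = np.sin(2 * np.pi * 0.01 * t) + 0.05 * rng.standard_normal(T)
    u, meas = 0.0, 0.0
    residual = np.empty(T)
    for k in range(T):
        residual[k] = disturbance[k] - u
        u += g * meas                # integrator update on delayed measurement
        meas = residual[k]
    print(np.var(residual[1000:]))   # closed-loop residual variance
    ```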

  8. Local adaptation in the flowering-time gene network of balsam poplar, Populus balsamifera L.

    Keller, Stephen R; Levsen, Nicholas; Olson, Matthew S; Tiffin, Peter

    2012-10-01

    Identifying the signature and targets of local adaptation is an increasingly important goal in empirical population genetics. Using data from 443 balsam poplar Populus balsamifera trees sampled from 31 populations, we tested for evidence of geographically variable selection shaping diversity at 27 homologues of the Arabidopsis flowering-time network. These genes are implicated in the control of seasonal phenology, an important determinant of fitness. Using 335 candidate and 412 reference single nucleotide polymorphisms (SNPs), we tested for evidence of local adaptation by searching for elevated population differentiation using F(ST)-based outlier analyses implemented in BayeScan or a hierarchical model in Arlequin, and by testing for significant associations between allele frequency and environmental variables using BAYENV. A total of 46 SNPs from 14 candidate genes had signatures of local adaptation: either significantly greater population differentiation or significant covariance with one or more environmental variables relative to reference SNP distributions. Only 11 SNPs from two genes exhibited both elevated population differentiation and covariance with one or more environmental variables. Several genes, including the abscisic acid gene ABI1B and the circadian clock genes ELF3 and GI5, harbored a large number of SNPs with signatures of local adaptation, with SNPs in GI5 strongly covarying with both latitude and precipitation and SNPs in ABI1B strongly covarying with temperature. In contrast to several other systems, we find little evidence that photoreceptors, including phytochromes, play an important role in local adaptation. Our results additionally show that detecting local adaptation is sensitive to the analytical approaches used and that model-based significance thresholds should be viewed with caution. PMID:22513286

  9. Statistics and sampling in transuranic studies

    The existing data on transuranics in the environment exhibit a remarkably high variability from sample to sample (coefficients of variation of 100% or greater). This chapter stresses the necessity of adequate sample size and suggests various ways to increase sampling efficiency. Sampling objectives are regarded as being of great importance in making decisions about sampling methodology. Four different classes of sampling methods are described: (1) descriptive sampling, (2) sampling for spatial pattern, (3) analytical sampling, and (4) sampling for modeling. A number of research needs are identified in the various sampling categories, along with several problems that appear to be common to two or more such areas.
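
    To make the sample-size point concrete: with coefficients of variation near 100%, the usual large-sample formula n = (z·CV/E)² gives sample sizes on the order of a hundred for modest relative errors. A minimal sketch (the 20% relative error and 95% confidence level are illustrative assumptions):

        from math import ceil

        def sample_size(cv, rel_error, z=1.96):
            # Samples needed for the sample mean to fall within rel_error of the
            # true mean at ~95% confidence, assuming approximately normal means.
            return ceil((z * cv / rel_error) ** 2)

        # With coefficients of variation near 100%, pinning the mean down to
        # within 20% already takes on the order of a hundred samples.
        print(sample_size(cv=1.0, rel_error=0.20))   # -> 97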

  10. Salmonella Typhimurium undergoes distinct genetic adaption during chronic infections of mice

    Søndberg, Emilie; Jelsbak, Lotte

    2016-01-01

    . Typhi and serve as the reservoir for the disease. The specific mechanisms and adaptive strategies enabling S. Typhi to survive inside the host for extended periods are incompletely understood. Yet, elucidation of these processes is of major importance for improvement of therapeutic strategies. In the...... type strains of S. Typhimurium 4/74 were used to establish chronic infections of 129X1/SvJ mice. Over the course of infections, S. Typhimurium bacteria were isolated from feces and from livers and spleens upon termination of the experiment. In all samples dominant clones were identified and select...... clones were subjected to whole genome sequencing. Dominant clones isolated from either systemic organs or fecal samples exhibited distinct single nucleotide polymorphisms (SNPs). One mouse appeared to have distinct adapted clones in the spleen and liver, respectively. Three mice were colonized in the...

  11. The process of organisational adaptation through innovations, and organisational adaptability

    Tikka, Tommi

    2010-01-01

    This study is about the process of organisational adaptation and organisational adaptability. The study generates a theoretical framework about organisational adaptation behaviour and conditions that have influence on success of organisational adaptation. The research questions of the study are: How does an organisation adapt through innovations, and which conditions enhance or impede organisational adaptation through innovations? The data were gathered from five case organisations withi...

  12. Adaptive color quantization using the baker's transformation

    Montagne, Christophe; Lelandais, Sylvie; Smolarz, André; Cornu, Philippe; Larabi, Mohamed-Chaker; Fernandez-Maloigne, Christine

    2006-01-01

    In this article we propose an original technique to reduce the number of colors contained in an image. This method uses the "Baker's Transformation", which produces a statistically suitable mixture of the pixels of the image. From this mixture, we can extract several samples, which present the same characteristics as the initial image. The concept we imagined is to consider these samples as potential palettes of colors. These palettes make it possible to do an adaptive qua...

  13. Adaptive network countermeasures.

    McClelland-Bane, Randy; Van Randwyk, Jamie A.; Carathimas, Anthony G.; Thomas, Eric D.

    2003-10-01

    This report describes the results of a two-year LDRD funded by the Differentiating Technologies investment area. The project investigated the use of countermeasures in protecting computer networks, as well as how current countermeasures could be changed in order to adapt to both evolving networks and evolving attackers. The work involved collaboration between Sandia employees and students in the Sandia - California Center for Cyber Defenders (CCD) program. We include an explanation of the need for adaptive countermeasures, a description of the architecture we designed to provide adaptive countermeasures, and evaluations of the system.

  14. [Adaptive optics for ophthalmology].

    Saleh, M

    2016-04-01

    Adaptive optics is a technology enhancing the visual performance of an optical system by correcting its optical aberrations. Adaptive optics has already enabled several breakthroughs in the field of visual sciences, such as improvement of visual acuity in normal and diseased eyes beyond physiologic limits, and the correction of presbyopia. Adaptive optics technology also provides high-resolution, in vivo imaging of the retina that may eventually help to detect the onset of retinal conditions at an early stage and provide better assessment of treatment efficacy. PMID:27019970

  15. Theory of adaptive adjustment

    Weihong Huang

    2000-01-01

    Conventional adaptive expectation, as a mechanism for stabilizing an unstable economic process, is reexamined through a generalization to an adaptive adjustment framework. The generic structures of equilibria that can be stabilized through an adaptive adjustment mechanism are identified. The generalization can be applied to a broad class of discrete economic processes in which the variables of interest can be adjusted or controlled directly by economic agents, such as in cobweb dynamics, Cournot games, oligopoly markets, tatonnement price adjustment, tariff games, population control through immigration, etc.
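
    A minimal sketch of the adaptive adjustment mechanism described here, x(t+1) = (1 - gamma)·x(t) + gamma·f(x(t)), applied to a hypothetical unstable cobweb-type map (the map and the adjustment weight gamma are assumptions for illustration): a suitably small gamma stabilizes the fixed point that the raw process overshoots.

        # Adaptive adjustment sketch: x(t+1) = (1 - gamma) * x(t) + gamma * f(x(t)).
        # The unstable map below is hypothetical (fixed point 0.5, slope -3).
        def adaptive_adjust(f, x0, gamma, steps=50):
            x = x0
            for _ in range(steps):
                x = (1 - gamma) * x + gamma * f(x)
            return x

        f = lambda x: 0.5 - 3.0 * (x - 0.5)

        print(adaptive_adjust(f, x0=0.6, gamma=1.0))   # raw process: oscillates and diverges
        print(adaptive_adjust(f, x0=0.6, gamma=0.3))   # adjusted process: converges to 0.5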

  16. Symmetry Adapted Basis Sets

    Avery, John Scales; Rettrup, Sten; Avery, James Emil

    In theoretical physics, theoretical chemistry and engineering, one often wishes to solve partial differential equations subject to a set of boundary conditions. This gives rise to eigenvalue problems of which some solutions may be very difficult to find. For example, the problem of finding...... such problems can be much reduced by making use of symmetry-adapted basis functions. The conventional method for generating symmetry-adapted basis sets is through the application of group theory, but this can be difficult. This book describes an easier method for generating symmetry-adapted basis sets...

  17. US Oil import dependency

    After declining in the first half of the 1980s, US oil imports have been rising again since 1986; the world oil price collapse of 1986 was obviously a major factor in this rise. However, it may merely have expedited a trend that was already in the making rather than started a new one. The growing import dependence has evident impacts on the balance of trade (though not a dominant one), on policies encouraging domestic production as a hedge against supply disruptions, and on environmental policies.

  18. Interrelations between psychosocial functioning and adaptive- and maladaptive-range personality traits.

    Ro, Eunyoe; Clark, Lee Anna

    2013-08-01

    Decrements in one or more domains of psychosocial functioning (e.g., poor job performance, poor interpersonal relations) are commonly observed in psychiatric patients. The purpose of this study is to increase understanding of psychosocial functioning as a broad, multifaceted construct as well as its associations with both adaptive- and maladaptive-range personality traits in both nonclinical and psychiatric outpatient samples. The study was conducted in two phases. In Study 1, a nonclinical sample (N = 429) was administered seven psychosocial functioning and adaptive-range personality trait measures. In Study 2, psychiatric outpatients (N = 181) were administered the same psychosocial functioning measures, and maladaptive- as well as adaptive-range personality trait measures. Exploratory (both studies) and confirmatory (Study 2) factor analyses indicated a common three-factor, hierarchical structure of psychosocial functioning: Well Being, Social/Interpersonal Functioning, and Basic Functioning. These psychosocial functioning domains were closely, and differentially, linked with personality traits, especially strongly so in patients. Across samples, Well Being was associated with both Neuroticism/Negative Affectivity and Extraversion/Positive Affectivity, Social/Interpersonal Functioning was associated with both Agreeableness and Conscientiousness/Disinhibition, and Basic Functioning was associated with Conscientiousness/Disinhibition, although only modestly in the nonclinical sample. These relations generally were maintained even after partialing out current general dysphoric symptoms. These findings have implications for considering psychosocial functioning as an important third domain in a tripartite model together with personality and psychopathology. PMID:24016019

  19. Sensorless adaptive optics and the effect of field of view in biological second harmonic generation microscopy

    Vandendriessche, Stefaan; Vanbel, Maarten K.; Verbiest, Thierry

    2014-05-01

    In light of the population aging in many developed countries, there is great economic interest in improving the speed and cost-efficiency of healthcare. Clinical diagnostic tools are key to these improvements, with biophotonics providing a means to achieve them. Standard optical microscopy of in vitro biological samples has been an important diagnostic tool since the invention of the microscope, with well-known resolution limits. Nonlinear optical imaging improves on the resolution limits of linear microscopy, while providing higher contrast images and a greater penetration depth due to the red-shifted incident light compared to standard optical microscopy. It also provides information on molecular orientation and chirality. Adaptive optics can improve the quality of nonlinear optical images. We analyzed the effect of sensorless adaptive optics on the quality of the nonlinear optical images of biological samples. We demonstrate that care needs to be taken when using a large field of view. Our findings provide information on how to improve the quality of nonlinear optical imaging, and can be generalized to other in vitro biological samples. The image quality improvements achieved by adaptive optics should help speed up clinical diagnostics in vitro, while increasing their accuracy and helping decrease detection limits. The same principles apply to in vivo biological samples, and in the future it may be possible to extend these findings to other nonlinear optical effects used in biological imaging.
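
    Sensorless adaptive optics of the kind used here typically optimizes an image-quality metric directly over corrector modes, with no wavefront sensor. The following toy loop is a sketch under stated assumptions (the quadratic metric stands in for, e.g., total second harmonic signal, and the six "modes" are hypothetical), not the authors' procedure:

        import numpy as np

        rng = np.random.default_rng(6)

        # Toy sensorless-AO loop: optimize each corrector mode by probing +/- steps
        # and keeping whichever maximizes an image-quality metric.
        true_aberration = rng.normal(0, 0.5, size=6)

        def metric(correction):
            residual = true_aberration + correction
            return 1.0 / (1.0 + np.sum(residual**2))   # maximal when aberration is nulled

        correction = np.zeros(6)
        probe = 0.25
        for _ in range(5):                             # a few passes over all modes
            for m in range(6):
                candidates = [correction.copy() for _ in range(3)]
                candidates[1][m] += probe
                candidates[2][m] -= probe
                correction = max(candidates, key=metric)
            probe *= 0.5                               # refine the probe amplitude

        print("metric after optimization:", round(float(metric(correction)), 4))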

  20. Spatial adaption for predicting random functions

    Müller-Gronbach, Thomas; Ritter, Klaus

    1998-01-01

    We study integration and reconstruction of Gaussian random functions with inhomogeneous local smoothness. A single realization may only be observed at a finite sampling design and the correct local smoothness is unknown. We construct adaptive two-stage designs that lead to asymptotically optimal methods. We show that every nonadaptive design is less efficient.
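
    A crude illustration of the two-stage idea (the test function, roughness proxy and point budgets are assumptions, not the paper's construction): a coarse nonadaptive first stage estimates local roughness, and second-stage observations are allocated where the function appears least smooth.

        import numpy as np

        rng = np.random.default_rng(7)

        # Assumed test function: smooth on [0, 0.5), rough on [0.5, 1].
        f = lambda x: np.where(x < 0.5, np.sin(4 * x), np.sin(40 * x))

        stage1 = np.linspace(0.0, 1.0, 21)             # coarse, nonadaptive first stage
        rough = np.abs(np.diff(f(stage1)))             # crude local-roughness proxy per cell
        weights = rough / rough.sum()

        cells = rng.choice(len(rough), size=60, p=weights)        # 60 adaptive points
        stage2 = stage1[cells] + rng.uniform(0.0, 0.05, size=60)  # place inside each cell
        print("fraction of second-stage points in the rough half:", np.mean(stage2 > 0.5))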

  1. Adaptation to and Recovery from Global Catastrophe

    Seth D. Baum

    2013-03-01

    Global catastrophes, such as nuclear war, pandemics and ecological collapse threaten the sustainability of human civilization. To date, most work on global catastrophes has focused on preventing the catastrophes, neglecting what happens to any catastrophe survivors. To address this gap in the literature, this paper discusses adaptation to and recovery from global catastrophe. The paper begins by discussing the importance of global catastrophe adaptation and recovery, noting that successful adaptation/recovery could have value on even astronomical scales. The paper then discusses how the adaptation/recovery could proceed and makes connections to several lines of research. Research on resilience theory is considered in detail and used to develop a new method for analyzing the environmental and social stressors that global catastrophe survivors would face. This method can help identify options for increasing survivor resilience and promoting successful adaptation and recovery. A key point is that survivors may exist in small isolated communities disconnected from global trade and, thus, must be able to survive and rebuild on their own. Understanding the conditions facing isolated survivors can help promote successful adaptation and recovery. That said, the processes of global catastrophe adaptation and recovery are highly complex and uncertain; further research would be of great value.

  2. Exploring the Use of Adaptively Restrained Particles for Graphics Simulations

    Pierre-Luc Manteaux; François Faure; Stephane Redon; Marie-Paule Cani

    2013-01-01

    In this paper, we explore the use of Adaptively Restrained (AR) particles for graphics simulations. Contrary to previous methods, Adaptively Restrained Particle Simulations (ARPS) do not adapt time or space sampling, but rather switch the positional degrees of freedom of particles on and off, while letting their momenta evolve. Therefore, inter-particle forces do not have to be updated at each time step, in contrast with traditional methods that spend a lot of time ...
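
    A rough one-dimensional sketch of the AR idea (the kinetic-energy criterion, threshold and harmonic force are assumptions; the published ARPS formulation is more careful about stability): momenta always evolve, while positions are updated only for sufficiently energetic particles.

        import numpy as np

        rng = np.random.default_rng(1)

        # 1-D sketch of adaptively restrained particles (assumed parameters).
        n, dt, eps = 1000, 1e-2, 0.05
        x = rng.uniform(-1.0, 1.0, n)
        v = rng.normal(0.0, 1.0, n)

        for _ in range(100):
            v += dt * (-x)                    # momenta always evolve (harmonic force)
            active = 0.5 * v**2 > eps         # positional DOF on only above the threshold
            x[active] += dt * v[active]       # restrained particles keep their position

        print(f"{(~active).mean():.0%} of particles restrained at the final step")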

  3. Adaptive Hamiltonian and Riemann Manifold Monte Carlo Samplers

    Wang, Ziyu; MOHAMED, SHAKIR; De Freitas, Nando

    2013-01-01

    In this paper we address the widely experienced difficulty in tuning Hamiltonian-based Monte Carlo samplers. We develop an algorithm that adapts Hamiltonian and Riemann manifold Hamiltonian Monte Carlo samplers using Bayesian optimization, allowing infinite adaptation of the parameters of these samplers. We show that the resulting sampling algorithms are ergodic, and that the use of our adaptive algorithms makes it easy to obtain more efficient samplers, in some ca...
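
    The tuning signal such adaptation needs can be as simple as the expected squared jumping distance. The sketch below scores a small grid of step sizes with plain HMC on a standard Gaussian target; the grid search is a crude stand-in for the paper's Bayesian optimization, and all parameters are assumed.

        import numpy as np

        rng = np.random.default_rng(2)

        def logp(x):                 # standard Gaussian target, assumed for illustration
            return -0.5 * np.dot(x, x)

        def grad_logp(x):
            return -x

        def mean_sq_jump(eps, L=10, iters=200, dim=5):
            # Plain HMC; mean squared jumping distance serves as the tuning signal.
            x, jumps = np.zeros(dim), []
            for _ in range(iters):
                p = rng.normal(size=dim)
                x_new, p_new = x.copy(), p.copy()
                for _ in range(L):                       # leapfrog integration
                    p_new += 0.5 * eps * grad_logp(x_new)
                    x_new += eps * p_new
                    p_new += 0.5 * eps * grad_logp(x_new)
                log_acc = (logp(x_new) - logp(x)
                           + 0.5 * (p @ p - p_new @ p_new))
                if np.log(rng.uniform()) < log_acc:      # Metropolis accept/reject
                    jumps.append(float(np.sum((x_new - x) ** 2)))
                    x = x_new
                else:
                    jumps.append(0.0)
            return np.mean(jumps)

        # Crude stand-in for the paper's Bayesian optimization: score a step-size grid.
        grid = [0.05, 0.1, 0.2, 0.4, 0.8]
        print("selected step size:", max(grid, key=mean_sq_jump))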

  4. Adaptive image processing a computational intelligence perspective

    Guan, Ling; Wong, Hau San

    2002-01-01

    Adaptive image processing is one of the most important techniques in visual information processing, especially in early vision such as image restoration, filtering, enhancement, and segmentation. While existing books present some important aspects of the issue, there is not a single book that treats this problem from a viewpoint that is directly linked to human perception - until now. This reference treats adaptive image processing from a computational intelligence viewpoint, systematically and successfully, from theory to applications, using the synergies of neural networks, fuzzy logic, and

  5. Partial update least-square adaptive filtering

    Xie, Bei

    2014-01-01

    Adaptive filters play an important role in the fields related to digital signal processing and communication, such as system identification, noise cancellation, channel equalization, and beamforming. In practical applications, the computational complexity of an adaptive filter is an important consideration. The Least Mean Square (LMS) algorithm is widely used because of its low computational complexity (O(N)) and simplicity in implementation. The least squares algorithms, such as Recursive Least Squares (RLS), Conjugate Gradient (CG), and Euclidean Direction Search (EDS), can converge faster a
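
    A partial-update variant of LMS keeps the O(N) cost down further by touching only a subset of taps per iteration. The sketch below implements an M-max-style selection (the unknown system, filter length and step size are assumptions for illustration):

        import numpy as np

        rng = np.random.default_rng(3)

        # M-max partial-update LMS sketch: per iteration only the M taps with the
        # largest input magnitudes are updated. System and constants are assumed.
        N, M, mu, T = 32, 8, 0.01, 5000
        w_true = rng.normal(size=N)                # unknown system to identify
        w = np.zeros(N)
        x_buf = np.zeros(N)

        for _ in range(T):
            x_buf = np.roll(x_buf, 1)
            x_buf[0] = rng.normal()
            d = w_true @ x_buf + 0.01 * rng.normal()   # noisy desired signal
            e = d - w @ x_buf                          # a priori error
            idx = np.argsort(np.abs(x_buf))[-M:]       # the M taps selected this step
            w[idx] += mu * e * x_buf[idx]              # update only the selected taps

        print("misalignment:", np.linalg.norm(w - w_true) / np.linalg.norm(w_true))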

  6. Sparse adaptive filters for echo cancellation

    Paleologu, Constantin

    2011-01-01

    Adaptive filters with a large number of coefficients are usually involved in both network and acoustic echo cancellation. Consequently, it is important to improve the convergence rate and tracking of the conventional algorithms used for these applications. This can be achieved by exploiting the sparseness character of the echo paths. Identification of sparse impulse responses was addressed mainly in the last decade with the development of the so-called "proportionate"-type algorithms. The goal of this book is to present the most important sparse adaptive filters developed for echo cancellati
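
    The proportionate idea can be sketched in a few lines: step-size gains proportional to the current tap magnitudes let the few active taps of a sparse echo path adapt quickly. Below is a PNLMS-style sketch (the sparse path and all constants are assumptions, and details differ across the algorithms the book covers):

        import numpy as np

        rng = np.random.default_rng(4)

        # PNLMS-style sketch: per-tap gains proportional to |w| let the few
        # active taps of a sparse echo path adapt fast. Path is assumed.
        N, mu, delta, rho = 128, 0.5, 1e-3, 0.01
        w_true = np.zeros(N)
        w_true[[10, 40, 90]] = [0.8, -0.5, 0.3]     # sparse echo path
        w = np.zeros(N)
        x_buf = np.zeros(N)

        for _ in range(4000):
            x_buf = np.roll(x_buf, 1)
            x_buf[0] = rng.normal()
            d = w_true @ x_buf + 1e-3 * rng.normal()
            e = d - w @ x_buf
            g = np.maximum(np.abs(w), rho * max(float(np.abs(w).max()), delta))
            g /= g.sum()                            # proportionate step-size gains
            w += mu * e * g * x_buf / (delta + (g * x_buf) @ x_buf)

        print("misalignment:", np.linalg.norm(w - w_true) / np.linalg.norm(w_true))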

  7. Asimovian Adaptive Agents

    Gordon, D F

    2011-01-01

    The goal of this research is to develop agents that are adaptive and predictable and timely. At first blush, these three requirements seem contradictory. For example, adaptation risks introducing undesirable side effects, thereby making agents' behavior less predictable. Furthermore, although formal verification can assist in ensuring behavioral predictability, it is known to be time-consuming. Our solution to the challenge of satisfying all three requirements is the following. Agents have finite-state automaton plans, which are adapted online via evolutionary learning (perturbation) operators. To ensure that critical behavioral constraints are always satisfied, agents' plans are first formally verified. They are then reverified after every adaptation. If reverification concludes that constraints are violated, the plans are repaired. The main objective of this paper is to improve the efficiency of reverification after learning, so that agents have a sufficiently rapid response time. We present two solutions: ...
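
    The verify-adapt-reverify-repair cycle can be illustrated with a toy finite-state plan and a reachability check for a forbidden state (the states, transitions and "learning" mutation below are hypothetical):

        from collections import deque

        # Verify-adapt-reverify-repair sketch: plans are finite-state automata and
        # the safety constraint is "no path from the start reaches a forbidden state".
        def reaches_forbidden(transitions, start, forbidden):
            seen, queue = {start}, deque([start])
            while queue:
                state = queue.popleft()
                if state in forbidden:
                    return True
                for nxt in transitions.get(state, ()):
                    if nxt not in seen:
                        seen.add(nxt)
                        queue.append(nxt)
            return False

        plan = {"patrol": ["scan"], "scan": ["patrol", "report"], "report": ["patrol"]}
        assert not reaches_forbidden(plan, "patrol", {"self_modify"})   # initial check

        mutated = dict(plan, report=["patrol", "self_modify"])          # learning operator
        if reaches_forbidden(mutated, "patrol", {"self_modify"}):       # reverification
            mutated["report"].remove("self_modify")                     # repair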

  8. Limits to adaptation

    Dow, Kirstin; Berkhout, Frans; Preston, Benjamin L.; Klein, Richard J. T.; Midgley, Guy; Shaw, M. Rebecca

    2013-04-01

    An actor-centered, risk-based approach to defining limits to social adaptation provides a useful analytic framing for identifying and anticipating these limits and informing debates over society's responses to climate change.

  9. The genomics of adaptation.

    Radwan, Jacek; Babik, Wiesław

    2012-12-22

    The amount and nature of genetic variation available to natural selection affect the rate, course and outcome of evolution. Consequently, the study of the genetic basis of adaptive evolutionary change has occupied biologists for decades, but progress has been hampered by the lack of resolution and the absence of a genome-level perspective. Technological advances in recent years should now allow us to answer many long-standing questions about the nature of adaptation. The data gathered so far are beginning to challenge some widespread views of the way in which natural selection operates at the genomic level. Papers in this Special Feature of Proceedings of the Royal Society B illustrate various aspects of the broad field of adaptation genomics. This introductory article sets up a context and, on the basis of a few selected examples, discusses how genomic data can advance our understanding of the process of adaptation. PMID:23097510

  10. Adaptive shared control system

    Sanders, David

    2009-01-01

    A control system to aid mobility is presented that is intended to assist independent living and to provide physical guidance. The system has two levels: a human-machine interface and an adaptive shared controller.

  11. Adapt or Die

    Brody, Joshua Eric; Larsen, Kasper Green

    2015-01-01

    In this paper, we study the role non-adaptivity plays in maintaining dynamic data structures. Roughly speaking, a data structure is non-adaptive if the memory locations it reads and/or writes when processing a query or update depend only on the query or update and not on the contents of previously...... read cells. We study such non-adaptive data structures in the cell probe model. This model is one of the least restrictive lower bound models and in particular, cell probe lower bounds apply to data structures developed in the popular word-RAM model. Unfortunately, this generality comes at a high cost......: the highest lower bound proved for any data structure problem is only polylogarithmic. Our main result is to demonstrate that one can in fact obtain polynomial cell probe lower bounds for non-adaptive data structures. To shed more light on the seemingly inherent polylogarithmic lower bound barrier, we...

  12. Adaptive Space Structures

    Wada, B.

    1993-01-01

    The term adaptive structures refers to a structural control approach in which sensors, actuators, electronics, materials, structures, structural concepts, and system-performance-validation strategies are integrated to achieve specific objectives.

  13. Adaptive Spectral Doppler Estimation

    Gran, Fredrik; Jakobsson, Andreas; Jensen, Jørgen Arendt

    2009-01-01

    In this paper, 2 adaptive spectral estimation techniques are analyzed for spectral Doppler ultrasound. The purpose is to minimize the observation window needed to estimate the spectrogram to provide a better temporal resolution and gain more flexibility when designing the data acquisition sequence....... The methods can also provide better quality of the estimated power spectral density (PSD) of the blood signal. Adaptive spectral estimation techniques are known to provide good spectral resolution and contrast even when the observation window is very short. The 2 adaptive techniques are tested and...... compared with the averaged periodogram (Welch’s method). The blood power spectral capon (BPC) method is based on a standard minimum variance technique adapted to account for both averaging over slow-time and depth. The blood amplitude and phase estimation technique (BAPES) is based on finding a set of...
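
    The minimum-variance principle behind the BPC method is compact enough to sketch: the Capon spectrum is P(f) = 1 / (a(f)^H R^{-1} a(f)), with R a sample covariance built from short snapshots. Below is a minimal illustration on a synthetic tone (the sampling rate, filter order and snapshot length are assumed, and the averaging is far simpler than the paper's slow-time/depth scheme):

        import numpy as np

        rng = np.random.default_rng(5)

        # Minimal Capon / minimum-variance spectral estimate on a synthetic tone:
        # P(f) = 1 / (a(f)^H R^{-1} a(f)). All parameters are assumed.
        fs, M, n = 1000.0, 16, 64                 # sampling rate, filter order, window
        t = np.arange(n) / fs
        x = np.cos(2 * np.pi * 123.0 * t) + 0.1 * rng.normal(size=n)

        snaps = np.array([x[i:i + M] for i in range(n - M)])   # short overlapping snapshots
        R = snaps.T @ snaps / len(snaps) + 1e-3 * np.eye(M)    # covariance + diagonal loading
        Rinv = np.linalg.inv(R)

        freqs = np.linspace(0.0, fs / 2, 256)
        a = np.exp(2j * np.pi * np.outer(np.arange(M), freqs) / fs)   # steering vectors
        P = 1.0 / np.real(np.einsum("im,ij,jm->m", a.conj(), Rinv, a))
        print("spectral peak near %.0f Hz" % freqs[np.argmax(P)])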

  14. Adaptive multiresolution methods

    Schneider Kai

    2011-12-01

    These lecture notes present adaptive multiresolution schemes for evolutionary PDEs in Cartesian geometries. The discretization schemes are based either on finite volume or finite difference schemes. The concept of multiresolution analyses, including Harten’s approach for point and cell averages, is described in some detail. Then the sparse point representation method is discussed. Different strategies for adaptive time-stepping, like local scale dependent time stepping and time step control, are presented. Numerous numerical examples in one, two and three space dimensions validate the adaptive schemes and illustrate the accuracy and the gain in computational efficiency in terms of CPU time and memory requirements. Another aspect, the modeling of turbulent flows using multiresolution decompositions (the so-called Coherent Vortex Simulation approach), is also described, and examples are given for computations of three-dimensional weakly compressible mixing layers. Most of the material concerning applications to PDEs is assembled and adapted from previous publications [27, 31, 32, 34, 67, 69].
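
    Harten's point-value multiresolution reduces, in its simplest form, to interpolation errors as detail coefficients plus thresholding. A minimal one-dimensional sketch (linear prediction, an assumed threshold, and a steep-front test function chosen for illustration):

        import numpy as np

        # Harten-style point-value multiresolution sketch: linear interpolation
        # from the coarse grid predicts the odd points; prediction errors are the
        # details, and details below an assumed threshold are discarded.
        def decompose(u, eps=1e-3):
            kept = total = 0
            while len(u) > 3:
                coarse = u[::2]
                pred = 0.5 * (coarse[:-1] + coarse[1:])   # predict odd points
                detail = u[1::2] - pred
                detail[np.abs(detail) < eps] = 0.0        # threshold small details
                kept += np.count_nonzero(detail)
                total += detail.size
                u = coarse
            return kept, total

        x = np.linspace(0.0, 1.0, 2**10 + 1)
        u = np.tanh(50 * (x - 0.5))                       # steep front at x = 0.5
        kept, total = decompose(u)
        print(f"details kept after thresholding: {kept} of {total}")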

  15. Adaptive Heat Engine

    Allahverdyan, A. E.; Babajanyan, S. G.; Martirosyan, N. H.; Melkikh, A. V.

    2016-07-01

    A major limitation of many heat engines is that their functioning demands on-line control and/or an external fit between the environmental parameters (e.g., temperatures of thermal baths) and internal parameters of the engine. We study a model for an adaptive heat engine where, due to feedback from the functional part, the engine's structure adapts to the given thermal baths. Hence, no on-line control and no external fitting are needed. The engine can employ unknown resources; it can also adapt to the results of its own functioning, which bring the bath temperatures closer together. We determine the resources of adaptation and relate them to the prior information available about the environment.

  16. Adaptive Architectural Envelope

    Foged, Isak Worre; Kirkegaard, Poul Henning

    2010-01-01

    Recent years have seen an increasing variety of applications of adaptive architectural structures for improvement of structural performance by recognizing changes in their environments and loads, adapting to meet goals, and using past events to improve future performance or maintain serviceability....... The general scope of this paper is to develop a new adaptive kinetic architectural structure, particularly a reconfigurable architectural structure which can transform body shape from planar geometries to hyper-surfaces using different control strategies, i.e. a transformation into more than one or...... two different shape alternatives. The adaptive structure is a proposal for a responsive building envelope, an idea of a first-level operational framework for present and future investigations towards performance-based responsive architectures through a set of responsive typologies. A mock-up

  17. Statistical Physics of Adaptation

    Perunov, Nikolai; England, Jeremy

    2014-01-01

    All living things exhibit adaptations that enable them to survive and reproduce in the natural environment that they inhabit. From a biological standpoint, it has long been understood that adaptation comes from natural selection, whereby maladapted individuals do not pass their traits effectively to future generations. However, we may also consider the phenomenon of adaptation from the standpoint of physics, and ask whether it is possible to delineate what the difference is in terms of physical properties between something that is well-adapted to its surrounding environment, and something that is not. In this work, we undertake to address this question from a theoretical standpoint. Building on past fundamental results in far-from-equilibrium statistical mechanics, we demonstrate a generalization of the Helmholtz free energy for the finite-time stochastic evolution of driven Newtonian matter. By analyzing this expression term by term, we are able to argue for a general tendency in driven many-particle systems...

  18. Agricultural adaptation to climate change in China

    2001-01-01

    This paper presents a study on agricultural adaptation to climate change that adopts an assumed land-use change strategy to resist water shortage and to build the capacity to adapt to expected climate change in northern China. The cost-benefit analysis shows that the assumed land-use change from high-water-consuming rice cultivation to other crops is very effective: over 7 billion m3 of water can be saved. Potential conflicts between different social interest groups, between regions, between demand and supply, and between present and future interests are analyzed in order to form a policy to implement the adaptation strategy. Trade, usually taken as one of the adaptation strategies, is suggested as a policy option to support land-use change, since it not only meets consumption demand but also, in terms of resources, imports water resources.

  19. Investigation on children's social adaptive capacity

    WANG Ya-ping; WANG Bao-yan; CHEN Yun-qi; WANG Ai-rong; ZHANG Rong; NIU Xiao-lin

    2002-01-01

    Objective: To understand the present state of children's social adaptive capacity. Methods: Social viability and its influencing factors were investigated in 628 children in 7 kindergartens in 4 cities in China. Results: The general trend of development of children's social adaptive capacity was fairly good. The proportion at the borderline level was 0.3%, while 16.4% and 7% of the children were at the excellent and extraordinary intelligence levels, respectively. The family environment played a very important role in children's social adaptive capacity. Conclusion: The research revealed that, in training a child's social adaptive ability, the initiative of each family should be brought into full play; negative influences should be overcome, the contradiction between relaxing control and taking care of everything should be resolved, the conscious activity of the child should be aroused, and the unity and balance between the child's own body and the living environment should be ensured.

  20. Better economics: supporting adaptation with stakeholder analysis

    Chambwera, Muyeye; Zou, Ye; Boughlala, Mohamed

    2011-11-15

    Across the developing world, decision makers understand the need to adapt to climate change — particularly in agriculture, which supports a large proportion of low-income groups who are especially vulnerable to impacts such as increasing water scarcity or more erratic weather. But policymakers are often less clear about what adaptation action to take. Cost-benefit analyses can provide information on the financial feasibility and economic efficiency of a given policy. But such methods fail to capture the non-monetary benefits of adaptation, which can be even more important than the monetary ones. Ongoing work in Morocco shows how combining cost-benefit analysis with a more participatory stakeholder analysis can support effective decision making by identifying cross-sector benefits, highlighting areas of mutual interest among different stakeholders and more effectively assessing impacts on adaptive capacity.