WorldWideScience

Sample records for adaptive importance sampling

  1. Adaptive Importance Sampling for Control and Inference

    Science.gov (United States)

    Kappen, H. J.; Ruiz, H. C.

    2016-03-01

    Path integral (PI) control problems are a restricted class of non-linear control problems that can be solved formally as a Feynman-Kac PI and can be estimated using Monte Carlo sampling. In this contribution we review PI control theory in the finite horizon case. We subsequently focus on the problem of how to compute and represent control solutions. We review the most commonly used methods in robotics and control. Within the PI theory, the question of how to compute becomes the question of importance sampling. Efficient importance samplers are state-feedback controllers, and their use requires an efficient representation. Learning and representing effective state-feedback controllers for non-linear stochastic control problems is a very challenging, and largely unsolved, problem. We show how to learn and represent such controllers using ideas from the cross-entropy method. We derive a gradient descent method that allows one to learn feedback controllers using an arbitrary parametrisation. We refer to this method as the path integral cross-entropy method, or PICE. We illustrate this method on some simple examples. PI control methods can be used to estimate the posterior distribution in latent state models. In neuroscience these problems arise when estimating connectivity from neural recording data using EM. We demonstrate the PI control method as an accurate alternative to particle filtering.
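    The PICE algorithm itself is not reproduced here. As a hedged illustration of the cross-entropy idea referenced above, the sketch below adapts an open-loop control proposal for a toy one-dimensional finite-horizon problem by reweighting sampled control sequences with exponentiated path costs; the dynamics, costs, and temperature lam are all assumptions made for the example.

      import numpy as np

      # Hedged sketch: cross-entropy-style adaptive importance sampling for a toy
      # 1-D control problem (illustrative; not the authors' PICE implementation).
      rng = np.random.default_rng(0)
      T, n_samples, n_iters = 20, 500, 30
      sigma = 1.0            # exploration noise of the sampling controller
      lam = 5.0              # temperature of the exponential path weight (assumed)
      mean_u = np.zeros(T)   # current open-loop control proposal (the importance sampler)

      def path_cost(u, x0=2.0):
          """Quadratic state cost plus control effort for x_{t+1} = x_t + u_t."""
          x, cost = x0, 0.0
          for t in range(T):
              cost += x ** 2 + 0.1 * u[t] ** 2
              x = x + u[t]
          return cost + 10.0 * x ** 2          # terminal cost

      for it in range(n_iters):
          U = mean_u + sigma * rng.standard_normal((n_samples, T))   # sample control sequences
          S = np.array([path_cost(u) for u in U])
          w = np.exp(-(S - S.min()) / lam)     # exponentiated (shifted) path costs
          w /= w.sum()
          mean_u = w @ U                       # re-fit the proposal to the weighted samples

      print("cost of the adapted control sequence:", path_cost(mean_u))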

  2. Adaptive importance sampling of random walks on continuous state spaces

    International Nuclear Information System (INIS)

    The authors consider adaptive importance sampling for a random walk with scoring in a general state space. Conditions under which exponential convergence occurs to the zero-variance solution are reviewed. These results generalize previous work for finite, discrete state spaces in Kollman (1993) and in Kollman, Baggerly, Cox, and Picard (1996). This paper is intended for nonstatisticians and includes considerable explanatory material.

  3. AIS-BN: An Adaptive Importance Sampling Algorithm for Evidential Reasoning in Large Bayesian Networks

    CERN Document Server

    Cheng, J; 10.1613/jair.764

    2011-01-01

    Stochastic sampling algorithms, while an attractive alternative to exact algorithms in very large Bayesian network models, have been observed to perform poorly in evidential reasoning with extremely unlikely evidence. To address this problem, we propose an adaptive importance sampling algorithm, AIS-BN, that shows promising convergence rates even under extreme conditions and seems to outperform the existing sampling algorithms consistently. Three sources of this performance improvement are (1) two heuristics for initialization of the importance function that are based on the theoretical properties of importance sampling in finite-dimensional integrals and the structural advantages of Bayesian networks, (2) a smooth learning method for the importance function, and (3) a dynamic weighting function for combining samples from different stages of the algorithm. We tested the performance of the AIS-BN algorithm along with two state-of-the-art general-purpose sampling algorithms, likelihood weighting (Fung and Chang...
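    The sketch below shows the generic adaptive-importance-sampling loop behind point (2), smoothly pulling a proposal toward the weighted target; it is an illustration on a tiny discrete space, not the AIS-BN algorithm, its initialization heuristics, or its dynamic weighting. The target, learning rate, and stage sizes are assumptions.

      import numpy as np

      # Hedged sketch of the generic adaptive-importance-sampling loop: smoothly
      # pull a discrete proposal toward the weighted target (not AIS-BN itself).
      rng = np.random.default_rng(1)

      p_unnorm = np.array([0.001, 0.01, 0.2, 5.0, 0.05])   # unnormalised target (assumed)
      q = np.full(5, 0.2)                                  # initial uniform proposal
      eta, n_per_stage = 0.5, 2000                         # learning rate, samples per stage

      for stage in range(10):
          idx = rng.choice(5, size=n_per_stage, p=q)
          w = p_unnorm[idx] / q[idx]                       # importance weights
          Z_hat = w.mean()                                 # estimate of the normalising constant
          target_hat = np.bincount(idx, weights=w, minlength=5)
          target_hat /= target_hat.sum()                   # weighted histogram ~ normalised target
          q = (1.0 - eta) * q + eta * target_hat           # smooth update of the importance function
          q /= q.sum()

      print("Z estimate:", Z_hat, "exact:", p_unnorm.sum())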

  4. Adaptive Importance Sampling for Performance Evaluation and Parameter Optimization of Communication Systems

    NARCIS (Netherlands)

    Remondo, David; Srinivasan, Rajan; Nicola, Victor F.; Etten, van Wim C.; Tattje, Henk E.P.

    2000-01-01

    We present new adaptive importance sampling techniques based on stochastic Newton recursions. Their applicability to the performance evaluation of communication systems is studied. Besides bit-error rate (BER) estimation, the techniques are used for system parameter optimization. Two system models that are analytically tractable are employed to demonstrate the validity of the techniques.

  5. Adaptive Importance Sampling for Performance Evaluation and Parameter Optimization of Communication Systems

    OpenAIRE

    Remondo, David; Srinivasan, Rajan; Nicola, Victor F.; Etten, van Wim C.; Tattje, Henk E.P.

    2000-01-01

    We present new adaptive importance sampling techniques based on stochastic Newton recursions. Their applicability to the performance evaluation of communication systems is studied. Besides bit-error rate (BER) estimation, the techniques are used for system parameter optimization. Two system models that are analytically tractable are employed to demonstrate the validity of the techniques. As an application to situations that are analytically intractable and numerically intensive, the influence...
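    As a hedged illustration of adaptively biasing a simulation density for error-rate estimation, the sketch below uses a cross-entropy update with an intermediate level instead of the stochastic Newton recursions described above; the Gaussian noise model, threshold, and sample sizes are assumptions for the example.

      import numpy as np

      # Hedged sketch: adapt the mean shift of the simulation density for a tail
      # (error) probability via a cross-entropy-style update, not stochastic Newton.
      rng = np.random.default_rng(2)
      a, sigma, n_per_iter = 5.0, 1.0, 5000    # an "error" occurs when the noise exceeds a
      theta = 0.0                              # mean shift of the biased simulation density

      for it in range(8):
          x = rng.normal(theta, sigma, size=n_per_iter)
          w = np.exp((theta ** 2 - 2.0 * theta * x) / (2.0 * sigma ** 2))  # N(0,s^2)/N(theta,s^2)
          level = min(a, np.quantile(x, 0.95))          # intermediate level (cross-entropy trick)
          elite = x > level
          theta = np.sum(w[elite] * x[elite]) / np.sum(w[elite])   # re-centre on the error region

      x = rng.normal(theta, sigma, size=20000)           # final estimation run with the adapted bias
      w = np.exp((theta ** 2 - 2.0 * theta * x) / (2.0 * sigma ** 2))
      p_hat = np.mean(w * (x > a))                       # importance-sampling error-probability estimate
      print("adapted bias:", theta, "error probability estimate:", p_hat)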

  6. Network and adaptive sampling

    CERN Document Server

    Chaudhuri, Arijit

    2014-01-01

    Combining the two statistical techniques of network sampling and adaptive sampling, this book illustrates the advantages of using them in tandem to effectively capture sparsely located elements in unknown pockets. It shows how network sampling is a reliable guide in capturing inaccessible entities through linked auxiliaries. The text also explores how adaptive sampling is strengthened in information content through subsidiary sampling with devices to mitigate unmanageable expanding sample sizes. Empirical data illustrates the applicability of both methods.

  7. ADAPTIVE ANNEALED IMPORTANCE SAMPLING FOR MULTIMODAL POSTERIOR EXPLORATION AND MODEL SELECTION WITH APPLICATION TO EXTRASOLAR PLANET DETECTION

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Bin, E-mail: bins@ieee.org [School of Computer Science and Technology, Nanjing University of Posts and Telecommunications, Nanjing 210023 (China)

    2014-07-01

    We describe an algorithm that can adaptively provide mixture summaries of multimodal posterior distributions. The parameter space of the involved posteriors ranges in size from a few dimensions to dozens of dimensions. This work was motivated by an astrophysical problem called extrasolar planet (exoplanet) detection, wherein the computation of stochastic integrals that are required for Bayesian model comparison is challenging. The difficulty comes from the highly nonlinear models that lead to multimodal posterior distributions. We resort to importance sampling (IS) to estimate the integrals, and thus translate the problem to be how to find a parametric approximation of the posterior. To capture the multimodal structure in the posterior, we initialize a mixture proposal distribution and then tailor its parameters elaborately to make it resemble the posterior to the greatest extent possible. We use the effective sample size (ESS) calculated based on the IS draws to measure the degree of approximation. The bigger the ESS is, the better the proposal resembles the posterior. A difficulty within this tailoring operation lies in the adjustment of the number of mixing components in the mixture proposal. Brute force methods just preset it as a large constant, which leads to an increase in the required computational resources. We provide an iterative delete/merge/add process, which works in tandem with an expectation-maximization step to tailor such a number online. The efficiency of our proposed method is tested via both simulation studies and real exoplanet data analysis.
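    The effective sample size used as the adaptation criterion above can be computed directly from importance weights. Below is a minimal sketch with a numerically stable log-weight formulation; the toy bimodal target and single-Gaussian proposal are assumptions chosen to show a poor ESS when one mode is missed.

      import numpy as np

      # Hedged sketch: effective sample size (ESS) of importance weights,
      # ESS = (sum w)^2 / sum(w^2), computed stably from log-weights.
      def effective_sample_size(log_w):
          log_w = np.asarray(log_w) - np.max(log_w)
          w = np.exp(log_w)
          return w.sum() ** 2 / np.sum(w ** 2)

      rng = np.random.default_rng(3)
      x = rng.normal(0.0, 1.0, size=5000)                # draws from a single-Gaussian proposal
      log_target = np.logaddexp(-0.5 * (x + 3.0) ** 2,   # toy bimodal target (unnormalised)
                                -0.5 * (x - 3.0) ** 2)
      log_proposal = -0.5 * x ** 2
      print("ESS:", effective_sample_size(log_target - log_proposal), "out of", x.size)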

  8. Importance sampling for characterizing STAP detectors

    NARCIS (Netherlands)

    Srinivasan, R.; Rangaswamy, M.

    2007-01-01

    This paper describes the development of adaptive importance sampling techniques for estimating false alarm probabilities of detectors that use space-time adaptive processing (STAP) algorithms. Fast simulation using importance sampling methods has been notably successful in the study of conventional

  9. Adaptive sampling for noisy problems

    Energy Technology Data Exchange (ETDEWEB)

    Cantu-Paz, E

    2004-03-26

    The usual approach to deal with noise present in many real-world optimization problems is to take an arbitrary number of samples of the objective function and use the sample average as an estimate of the true objective value. The number of samples is typically chosen arbitrarily and remains constant for the entire optimization process. This paper studies an adaptive sampling technique that varies the number of samples based on the uncertainty of deciding between two individuals. Experiments demonstrate the effect of adaptive sampling on the final solution quality reached by a genetic algorithm and the computational cost required to find the solution. The results suggest that the adaptive technique can effectively eliminate the need to set the sample size a priori, but in many cases it requires high computational costs.
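    A hedged sketch of the underlying idea follows: when comparing two noisy individuals, keep sampling only until the difference between their means is statistically clear or a budget is exhausted. The z threshold and budgets below are assumptions, not the paper's settings.

      import numpy as np

      # Hedged sketch: adaptive number of samples for a noisy pairwise comparison.
      def adaptive_compare(f_a, f_b, z=2.0, n_min=3, n_max=200):
          a = [f_a() for _ in range(n_min)]
          b = [f_b() for _ in range(n_min)]
          while len(a) < n_max:
              se = np.sqrt(np.var(a, ddof=1) / len(a) + np.var(b, ddof=1) / len(b))
              if abs(np.mean(a) - np.mean(b)) > z * se:   # decision is confident enough
                  break
              a.append(f_a())                             # otherwise take one more sample of each
              b.append(f_b())
          return np.mean(a) < np.mean(b), len(a)          # "a beats b" (minimisation), samples used

      rng = np.random.default_rng(4)
      noisy_a = lambda: 1.0 + rng.normal(0.0, 2.0)        # true objective 1.0
      noisy_b = lambda: 1.5 + rng.normal(0.0, 2.0)        # true objective 1.5
      print(adaptive_compare(noisy_a, noisy_b))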

  10. Quantization based recursive Importance Sampling

    CERN Document Server

    Frikha, Noufel

    2011-01-01

    We investigate in this paper an alternative to the simulation-based recursive importance sampling procedure for estimating the optimal change of measure in Monte Carlo simulations. We propose an algorithm which combines (vector and functional) optimal quantization with a Newton-Raphson zero search procedure. Our approach can be seen as a robust and automatic deterministic counterpart of recursive importance sampling by stochastic approximation, which in practice may require tuning and a good knowledge of the payoff function. Moreover, unlike recursive importance sampling procedures, the proposed methodology does not rely on simulations, so it is quite generic and can be used on top of Monte Carlo simulations. We first emphasize the consistency of quantization for designing an importance sampling algorithm for both multi-dimensional distributions and diffusion processes. We show that the induced error on the optimal change of measure is controlled by the mean quantizatio...

  11. Adaptive Sampling in Hierarchical Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Knap, J; Barton, N R; Hornung, R D; Arsenlis, A; Becker, R; Jefferson, D R

    2007-07-09

    We propose an adaptive sampling methodology for hierarchical multi-scale simulation. The method utilizes a moving kriging interpolation to significantly reduce the number of evaluations of finer-scale response functions to provide essential constitutive information to a coarser-scale simulation model. The underlying interpolation scheme is unstructured and adaptive to handle the transient nature of a simulation. To handle the dynamic construction and searching of a potentially large set of finer-scale response data, we employ a dynamic metric tree database. We study the performance of our adaptive sampling methodology for a two-level multi-scale model involving a coarse-scale finite element simulation and a finer-scale crystal plasticity based constitutive law.

  12. A new design for sampling with adaptive sample plots

    OpenAIRE

    Yang, Haijun; Kleinn, Christoph; Fehrmann, Lutz; Tang, Shouzheng; Magnussen, Steen

    2009-01-01

    Adaptive cluster sampling (ACS) is a sampling technique for sampling rare and geographically clustered populations. Aiming to enhance the practicability of ACS while maintaining some of its major characteristics, an adaptive sample plot design is introduced in this study which facilitates field work compared to “standard” ACS. The plot design is based on a conditional plot expansion: a larger plot (by a pre-defined plot size factor) is installed at a sample point instead of the smaller initia...

  13. Adaptive sampling algorithm for detection of superpoints

    Institute of Scientific and Technical Information of China (English)

    CHENG Guang; GONG Jian; DING Wei; WU Hua; QIANG ShiQiang

    2008-01-01

    The superpoints are the sources (or destinations) that connect with a large number of destinations (or sources) during a measurement time interval, so detecting superpoints in real time is very important for network security and management. Previous algorithms are not able to control memory usage or to deliver the desired accuracy, so it is hard to detect superpoints on a high-speed link in real time. In this paper, we propose an adaptive sampling algorithm to detect superpoints in real time, which uses a flow sample-and-hold module to reduce the detection of non-superpoints and to improve the measurement accuracy of superpoints. We also design a data stream structure to maintain the flow records, which compensates statistically for flow hash collisions. An adaptive process based on different sampling probabilities is used to maintain the recorded IP addresses in the limited memory. This algorithm is compared with other algorithms by analyzing real network trace data. Experimental results and mathematical analysis show that this algorithm has the advantages of both limited memory requirements and high measurement accuracy.
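    A minimal sketch of the flow "sample and hold" building block mentioned above: once a flow is sampled it is held, and every later packet of that flow is counted. The adaptive sampling probabilities and the superpoint data-stream structure of the paper are omitted; the stream and probability below are assumptions.

      import random
      from collections import defaultdict

      # Hedged sketch of a flow "sample and hold" counter (illustrative only).
      def sample_and_hold(packets, p=0.01):
          held = defaultdict(int)
          for flow_id in packets:
              if flow_id in held:
                  held[flow_id] += 1          # already held: count every packet
              elif random.random() < p:
                  held[flow_id] = 1           # sampled for the first time: start holding
          return held

      random.seed(5)
      stream = ["heavy"] * 10000 + ["light-%d" % i for i in range(5000)]
      random.shuffle(stream)
      counts = sample_and_hold(stream)
      print(counts["heavy"], "of 10000 packets of the heavy flow were counted")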

  14. Importance Sampling for the Infinite Sites Model*

    OpenAIRE

    Hobolth, Asger; Uyenoyama, Marcy K; Wiuf, Carsten

    2008-01-01

    Importance sampling or Markov Chain Monte Carlo sampling is required for state-of-the-art statistical analysis of population genetics data. The applicability of these sampling-based inference techniques depends crucially on the proposal distribution. In this paper, we discuss importance sampling for the infinite sites model. The infinite sites assumption is attractive because it constrains the number of possible genealogies, thereby allowing for the analysis of larger data sets. We recall th...

  15. An Adaptive Importance Sampling Theory Based on the Generalized Genetic Algorithm

    Institute of Scientific and Technical Information of China (English)

    董聪; 郭晓华

    2000-01-01

    In the present paper, using the generalized genetic algorithm, the problem of finding all design points in the case of generalized multiple design points is solved; by establishing a recursion-type bound-and-classification algorithm, the problem of reducing and synthesizing generalized multiple design points is also solved. The paper shows that the adaptive importance sampling theory based on the generalized genetic algorithm is a more efficient tool for the reliability simulation of nonlinear systems.

  16. Importance sampling for NMF class of STAP detectors

    NARCIS (Netherlands)

    Anitori, L.; Srinivasan, R.; Rangaswamy, M.

    2006-01-01

    Importance sampling (IS) techniques are applied to space-time adaptive processing (STAP) radar detection algorithms for performance characterization via fast estimation of false alarm probabilities (FAP’s). The work here builds on and extends the initial thrust in this area provided in a recent pape

  17. On Invertible Sampling and Adaptive Security

    DEFF Research Database (Denmark)

    Ishai, Yuval; Kumarasubramanian, Abishek; Orlandi, Claudio;

    2011-01-01

    Secure multiparty computation (MPC) is one of the most general and well studied problems in cryptography. We focus on MPC protocols that are required to be secure even when the adversary can adaptively corrupt parties during the protocol, and under the assumption that honest parties cannot reliably erase their secrets prior to corruption. Previous feasibility results for adaptively secure MPC in this setting applied either to deterministic functionalities or to randomized functionalities which satisfy a certain technical requirement. The question whether adaptive security is possible for all functionalities was left open. We provide the first convincing evidence that the answer to this question is negative, namely that some (randomized) functionalities cannot be realized with adaptive security. We obtain this result by studying the following related invertible sampling problem: given an efficient...

  18. A Self-adapting Stratified and Importance Sampling Method for Power System Reliability Evaluation

    Institute of Scientific and Technical Information of China (English)

    王晓滨; 郭瑞鹏; 曹一家; 余秀月; 杨桂钟

    2011-01-01

    A new method for power system reliability evaluation, called self-adapting stratified and importance sampling (SASIS), is presented. The system state space is partitioned into one contingency-free state subspace and several contingency-order state subspaces. Sampling of the contingency-free subspace is avoided entirely, so SASIS converges fast for highly reliable systems. The number of samples is optimally allocated among the contingency-order state subspaces, and the optimal importance-sampling probability density function is steadily refined. The method markedly increases computational efficiency and overcomes the low efficiency of plain Monte Carlo simulation in highly reliable systems reported in the past. Reliability evaluation of the generation and transmission part of the IEEE-RTS test system shows that the proposed method is rational, highly effective, and free from degeneration. This work is supported by Important Zhejiang Science & Technology Specific Projects (No. 2007C11098).
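    The stratification idea can be sketched as follows: skip the contingency-free stratum (it contributes nothing) and allocate samples over the contingency-order strata in proportion to probability times standard deviation (Neyman allocation). The toy stratum probabilities and loss model below are assumptions, not the paper's power-system model.

      import numpy as np

      # Hedged sketch: stratified estimation with Neyman allocation on a toy model.
      rng = np.random.default_rng(6)

      p_stratum = np.array([0.90, 0.08, 0.015, 0.005])    # P(k components failed), k = 0..3
      loss_prob = [0.0, 0.02, 0.30, 0.90]                 # toy P(loss of load | stratum k)

      def sample_loss(k, size):
          return (rng.random(size) < loss_prob[k]).astype(float)

      pilot = {k: sample_loss(k, 200) for k in range(1, 4)}           # small pilot run per stratum
      sigma = np.array([pilot[k].std() for k in range(1, 4)])
      alloc = p_stratum[1:] * (sigma + 1e-6)                          # small floor avoids a degenerate pilot
      alloc = np.maximum((alloc / alloc.sum() * 10000).astype(int), 100)

      estimate = 0.0                                       # stratum 0 contributes exactly zero
      for k, n_k in zip(range(1, 4), alloc):
          estimate += p_stratum[k] * sample_loss(k, n_k).mean()
      print("loss-of-load probability estimate:", estimate)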

  19. Important ingredients for health adaptive information systems.

    Science.gov (United States)

    Senathirajah, Yalini; Bakken, Suzanne

    2011-01-01

    Healthcare information systems frequently do not truly meet clinician needs, due to the complexity, variability, and rapid change in medical contexts. Recently the internet world has been transformed by approaches commonly termed 'Web 2.0'. This paper proposes a Web 2.0 model for a healthcare adaptive architecture. The vision includes creating modular, user-composable systems which aim to make all necessary information from multiple internal and external sources available via a platform, for the user to use, arrange, recombine, author, and share at will, using rich interfaces where advisable. Clinicians can create a set of 'widgets' and 'views' which can transform data, reflect their domain knowledge and cater to their needs, using simple drag and drop interfaces without the intervention of programmers. We have built an example system, MedWISE, embodying the user-facing parts of the model. This approach to HIS is expected to have several advantages, including greater suitability to user needs (reflecting clinician rather than programmer concepts and priorities), incorporation of multiple information sources, agile reconfiguration to meet emerging situations and new treatment deployment, capture of user domain expertise and tacit knowledge, efficiencies due to workflow and human-computer interaction improvements, and greater user acceptance.

  20. A software sampling frequency adaptive algorithm for reducing spectral leakage

    Institute of Scientific and Technical Information of China (English)

    PAN Li-dong; WANG Fei

    2006-01-01

    Spectral leakage caused by synchronization error in a non-synchronous sampling system is an important cause of reduced accuracy in spectral analysis and harmonic measurement. This paper presents a software sampling-frequency adaptive algorithm that obtains the actual signal frequency more accurately, then adjusts the sampling interval based on the frequency calculated by the software algorithm and modifies the sampling frequency adaptively. It reduces synchronization error and the impact of spectral leakage, thereby improving the accuracy of spectral analysis and harmonic measurement for power system signals whose frequency changes slowly. Simulations show that the algorithm has high precision, and it can be a practical method for power system harmonic analysis since it is easy to implement.
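    A hedged sketch of the adjustment step follows: estimate the actual signal frequency from a windowed FFT peak (quadratic interpolation), then choose a sampling rate for which the analysis window holds a whole number of periods. The estimator shown is an assumption; the paper's software frequency estimator differs.

      import numpy as np

      # Hedged sketch: frequency estimation plus synchronous sampling-rate adjustment.
      def estimate_frequency(x, fs):
          X = np.abs(np.fft.rfft(x * np.hanning(len(x))))
          k = int(np.argmax(X[1:-1])) + 1                 # peak bin (skip DC and Nyquist)
          a, b, c = X[k - 1], X[k], X[k + 1]
          p = 0.5 * (a - c) / (a - 2.0 * b + c)           # quadratic interpolation of the peak
          return (k + p) * fs / len(x)

      def synchronous_rate(x, fs):
          f = estimate_frequency(x, fs)
          cycles = max(1, round(len(x) * f / fs))         # whole periods to fit in the window
          return f, len(x) * f / cycles                   # adjusted sampling frequency

      fs = 1000.0
      t = np.arange(1024) / fs
      signal = np.sin(2.0 * np.pi * 50.3 * t)             # 50.3 Hz is not synchronous with fs
      print(synchronous_rate(signal, fs))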

  1. Adaptive sampling for mesh spectrum editing

    Institute of Scientific and Technical Information of China (English)

    ZHAO Xiang-jun; ZHANG Hong-xin; BAO Hu-jun

    2006-01-01

    A mesh editing framework is presented in this paper, which integrates Free-Form Deformation (FFD) and geometry signal processing. By using a simplified model of the original mesh, the editing task can be accomplished with a few operations. We take the deformation of the proxy and the position coordinates of the mesh models as geometry signals. Wavelet analysis is employed to separate local detail information gracefully. The crucial innovation of this paper is a new adaptive regular sampling approach for our signal-analysis-based editing framework. In our approach, an original mesh is resampled and then refined iteratively, which reflects optimization of our proposed spectrum-preserving energy. As an extension of our spectrum editing scheme, the editing principle is applied to geometry detail transfer, which brings satisfying results.

  2. An adaptive sampling scheme for deep-penetration calculation

    International Nuclear Information System (INIS)

    As is well known, the deep-penetration problem has been one of the important and difficult problems in shielding calculations with the Monte Carlo method for several decades. In this paper, an adaptive Monte Carlo method that uses the emission point as a sampling station for shielding calculation is investigated. The numerical results show that the adaptive method may improve the efficiency of shielding calculations and may, to some degree, overcome the under-estimation problem that easily occurs in deep-penetration calculations.

  3. Importance sampling of severe wind gusts

    NARCIS (Netherlands)

    Bos, R.; Bierbooms, W.A.A.M.; Van Bussel, G.J.W.

    2015-01-01

    An important problem that arises during the design of wind turbines is estimating extreme loads with sufficient accuracy. This is especially difficult during iterative design phases when computational resources are scarce. Over the years, many methods have been proposed to extrapolate extreme load d

  4. Sparse signals estimation for adaptive sampling

    Directory of Open Access Journals (Sweden)

    Andrey Ordin

    2014-08-01

    This paper presents an estimation procedure for sparse signals in an adaptive setting. We show that when the pure signal is strong enough, the value of the loss function is asymptotically the same as for an optimal estimator, up to a constant multiplier.

  5. Adaptive Sampling Algorithms for Probabilistic Risk Assessment of Nuclear Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Diego Mandelli; Dan Maljovec; Bei Wang; Valerio Pascucci; Peer-Timo Bremer

    2013-09-01

    Nuclear simulations are often computationally expensive, time-consuming, and high-dimensional with respect to the number of input parameters. Thus exploring the space of all possible simulation outcomes is infeasible using finite computing resources. During simulation-based probabilistic risk analysis, it is important to discover the relationship between a potentially large number of input parameters and the output of a simulation using as few simulation trials as possible. This is a typical context for performing adaptive sampling where a few observations are obtained from the simulation, a surrogate model is built to represent the simulation space, and new samples are selected based on the model constructed. The surrogate model is then updated based on the simulation results of the sampled points. In this way, we attempt to gain the most information possible with a small number of carefully selected sampled points, limiting the number of expensive trials needed to understand features of the simulation space. We analyze the specific use case of identifying the limit surface, i.e., the boundaries in the simulation space between system failure and system success. In this study, we explore several techniques for adaptively sampling the parameter space in order to reconstruct the limit surface. We focus on several adaptive sampling schemes. First, we seek to learn a global model of the entire simulation space using prediction models or neighborhood graphs and extract the limit surface as an iso-surface of the global model. Second, we estimate the limit surface by sampling in the neighborhood of the current estimate based on topological segmentations obtained locally. Our techniques draw inspirations from topological structure known as the Morse-Smale complex. We highlight the advantages and disadvantages of using a global prediction model versus local topological view of the simulation space, comparing several different strategies for adaptive sampling in both
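    A hedged sketch of surrogate-guided limit-surface sampling follows: score random candidates with a simple k-NN surrogate and run the expensive simulator only where the predicted failure probability is closest to 0.5. The toy simulator, the k-NN surrogate, and the budgets are assumptions; the report studies richer surrogates and topological (Morse-Smale) methods not shown here.

      import numpy as np

      # Hedged sketch: adaptive sampling of a pass/fail limit surface via a k-NN surrogate.
      rng = np.random.default_rng(7)

      def simulator(x):                                    # stand-in for an expensive code
          return float(x[0] ** 2 + x[1] ** 2 > 1.0)        # "failure" outside the unit circle

      def knn_failure_prob(X, y, query, k=7):
          d = np.linalg.norm(X - query, axis=1)
          return y[np.argsort(d)[:k]].mean()

      X = rng.uniform(-2.0, 2.0, size=(20, 2))             # initial space-filling samples
      y = np.array([simulator(x) for x in X])

      for _ in range(80):                                  # adaptive refinement loop
          cand = rng.uniform(-2.0, 2.0, size=(200, 2))
          probs = np.array([knn_failure_prob(X, y, c) for c in cand])
          pick = cand[np.argmin(np.abs(probs - 0.5))]      # most ambiguous candidate
          X = np.vstack([X, pick])
          y = np.append(y, simulator(pick))

      r = np.linalg.norm(X[20:], axis=1)
      print("fraction of adaptive samples near the true limit surface:",
            np.mean(np.abs(r - 1.0) < 0.15))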

  6. Autonomous spatially adaptive sampling in experiments based on curvature, statistical error and sample spacing with applications in LDA measurements

    Science.gov (United States)

    Theunissen, Raf; Kadosh, Jesse S.; Allen, Christian B.

    2015-06-01

    Spatially varying signals are typically sampled by collecting uniformly spaced samples irrespective of the signal content. For signals with inhomogeneous information content, this leads to unnecessarily dense sampling in regions of low interest or insufficient sample density at important features, or both. A new adaptive sampling technique is presented directing sample collection in proportion to local information content, capturing adequately the short-period features while sparsely sampling less dynamic regions. The proposed method incorporates a data-adapted sampling strategy on the basis of signal curvature, sample space-filling, variable experimental uncertainty and iterative improvement. Numerical assessment has indicated a reduction in the number of samples required to achieve a predefined uncertainty level overall while improving local accuracy for important features. The potential of the proposed method has been further demonstrated on the basis of Laser Doppler Anemometry experiments examining the wake behind a NACA0012 airfoil and the boundary layer characterisation of a flat plate.

  7. Adaptive Sampling for Large Scale Boosting

    OpenAIRE

    Dubout, Charles; Fleuret, Francois

    2014-01-01

    Classical Boosting algorithms, such as AdaBoost, build a strong classifier without concern for the computational cost. Some applications, in particular in computer vision, may involve millions of training examples and very large feature spaces. In such contexts, the training time of off-the-shelf Boosting algorithms may become prohibitive. Several methods exist to accelerate training, typically either by sampling the features or the examples used to train the weak learners. Even if some of th...

  8. Domain Adaptation: Overfitting and Small Sample Statistics

    CERN Document Server

    Foster, Dean; Salakhutdinov, Ruslan

    2011-01-01

    We study the prevalent problem when a test distribution differs from the training distribution. We consider a setting where our training set consists of a small number of sample domains, but where we have many samples in each domain. Our goal is to generalize to a new domain. For example, we may want to learn a similarity function using only certain classes of objects, but we desire that this similarity function be applicable to object classes not present in our training sample (e.g. we might seek to learn that "dogs are similar to dogs" even though images of dogs were absent from our training set). Our theoretical analysis shows that we can select many more features than domains while avoiding overfitting by utilizing data-dependent variance properties. We present a greedy feature selection algorithm based on using T-statistics. Our experiments validate this theory showing that our T-statistic based greedy feature selection is more robust at avoiding overfitting than the classical greedy procedure.

  9. 19 CFR 151.67 - Sampling by importer.

    Science.gov (United States)

    2010-04-01

    ... 19 Customs Duties 2 2010-04-01 2010-04-01 false Sampling by importer. 151.67 Section 151.67 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE... importer. The importer may be permitted after entry to draw samples under Customs supervision in...

  10. State-dependent importance sampling for a Jackson tandem network

    NARCIS (Netherlands)

    Miretskiy, D.I.; Scheinhardt, W.R.W.; Mandjes, M.R.H.

    2008-01-01

    This paper considers importance sampling as a tool for rare-event simulation. The focus is on estimating the probability of overflow in the downstream queue of a Jacksonian two-node tandem queue – it is known that in this setting ‘traditional’ stateindependent importance-sampling distributions perfo

  11. State-dependent importance sampling for a Jackson tandem network

    NARCIS (Netherlands)

    Miretskiy, D.I.; Scheinhardt, W.R.W.; Mandjes, M.R.H.

    2008-01-01

    This paper considers importance sampling as a tool for rare-event simulation. The focus is on estimating the probability of overflow in the downstream queue of a Jacksonian two-node tandem queue – it is known that in this setting ‘traditional’ state-independent importance-sampling distributions perf

  12. State-dependent importance sampling for a Jackson tandem network

    NARCIS (Netherlands)

    Miretskiy, Denis; Scheinhardt, Werner; Mandjes, Michel

    2010-01-01

    This article considers importance sampling as a tool for rare-event simulation. The focus is on estimating the probability of overflow in the downstream queue of a Jacksonian two-node tandem queue; it is known that in this setting “traditional” state-independent importance-sampling distributions per

  13. State-dependent importance sampling for a Jackson tandem network

    NARCIS (Netherlands)

    D. Miretskiy; W. Scheinhardt; M. Mandjes

    2010-01-01

    This article considers importance sampling as a tool for rare-event simulation. The focus is on estimating the probability of overflow in the downstream queue of a Jacksonian two-node tandem queue; it is known that in this setting "traditional" state-independent importance-sampling distributions per

  14. Adaptive maximal poisson-disk sampling on surfaces

    KAUST Repository

    Yan, Dongming

    2012-01-01

    In this paper, we study the generation of maximal Poisson-disk sets with varying radii on surfaces. Based on the concepts of power diagram and regular triangulation, we present a geometric analysis of gaps in such disk sets on surfaces, which is the key ingredient of the adaptive maximal Poisson-disk sampling framework. Moreover, we adapt the presented sampling framework for remeshing applications. Several novel and efficient operators are developed for improving the sampling/meshing quality over the state-of-the-art. © 2012 ACM.

  15. Adaptive sampling program support for expedited site characterization

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, R.

    1993-10-01

    Expedited site characterizations offer substantial savings in time and money when assessing hazardous waste sites. Key to some of these savings is the ability to adapt a sampling program to the "real-time" data generated by an expedited site characterization. This paper presents a two-pronged approach to supporting adaptive sampling programs: a specialized object-oriented database/geographical information system for data fusion, management, and display; and combined Bayesian/geostatistical methods for contamination extent estimation and sample location selection.

  16. Multi-Level Monte Carlo Simulations with Importance Sampling

    OpenAIRE

    Przemyslaw S. Stilger and Ser-Huang Poon

    2013-01-01

    We present an application of importance sampling to multi-asset options under the Heston and the Bates models as well as to the Heston-Hull-White and the Heston-Cox-Ingersoll-Ross models. Moreover, we provide an efficient importance sampling scheme in a Multi-Level Monte Carlo simulation. In all cases, we explain how the Greeks can be computed in the different simulation schemes using the Likelihood Ratio Method, and how combining it with importance sampling leads to a significant variance re...

  17. Adaptive Monte Carlo on multivariate binary sampling spaces

    CERN Document Server

    Schäfer, Christian

    2010-01-01

    A Monte Carlo algorithm is said to be adaptive if it can adjust its current proposal distribution automatically, using past simulations. The choice of the parametric family that defines the set of proposal distributions is critical for good performance. We treat the problem of constructing such parametric families for adaptive sampling on multivariate binary spaces. A practical motivation for this problem is variable selection in a linear regression context, where we need to either find the best model, with respect to some criterion, or to sample from a Bayesian posterior distribution on the model space. In terms of adaptive algorithms, we focus on the Cross-Entropy (CE) method for optimisation, and the Sequential Monte Carlo (SMC) methods for sampling. Raw versions of both SMC and CE algorithms are easily implemented using binary vectors with independent components. However, for high-dimensional model choice problems, these straightforward proposals do not yield satisfactory results. The key to advanced a...

  18. Adaptive Sampling for High Throughput Data Using Similarity Measures

    Energy Technology Data Exchange (ETDEWEB)

    Bulaevskaya, V. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Sales, A. P. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-05-06

    The need for adaptive sampling arises in the context of high throughput data because the rates of data arrival are many orders of magnitude larger than the rates at which they can be analyzed. A very fast decision must therefore be made regarding the value of each incoming observation and its inclusion in the analysis. In this report we discuss one approach to adaptive sampling, based on the new data point’s similarity to the other data points being considered for inclusion. We present preliminary results for one real and one synthetic data set.
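    A minimal sketch of similarity-based adaptive sampling of a fast stream: retain an incoming observation only if it is sufficiently dissimilar from everything already retained. The Euclidean distance, threshold, and retention budget below are assumptions; the report evaluates other similarity measures.

      import numpy as np

      # Hedged sketch: keep a new data point only if it is novel enough.
      def stream_sample(stream, threshold=0.5, max_keep=1000):
          kept = []
          for x in stream:
              if not kept or min(np.linalg.norm(x - k) for k in kept) > threshold:
                  kept.append(x)                      # novel enough: keep for analysis
              if len(kept) >= max_keep:
                  break                               # retention budget exhausted
          return np.array(kept)

      rng = np.random.default_rng(8)
      data = rng.normal(size=(5000, 3))
      print("retained", len(stream_sample(data)), "of", len(data), "observations")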

  19. Two-phase importance sampling for inference about transmission trees

    OpenAIRE

    Numminen, E.; Chewapreecha, C.; Siren, J.; Turner, C.; Turner, P; Bentley, S.D.; Corander, J.

    2014-01-01

    There has been growing interest in the statistics community to develop methods for inferring transmission pathways of infectious pathogens from molecular sequence data. For many datasets, the computational challenge lies in the huge dimension of the missing data. Here, we introduce an importance sampling scheme in which the transmission trees and phylogenies of pathogens are both sampled from reasonable importance distributions, alleviating the inference. Using this approach, arbitrary models...

  20. Computing Greeks with Multilevel Monte Carlo Methods using Importance Sampling

    OpenAIRE

    Euget, Thomas

    2012-01-01

    This paper presents a new efficient way to reduce the variance of an estimator of popular payoffs and Greeks encountered in financial mathematics. The idea is to apply Importance Sampling with the Multilevel Monte Carlo method recently introduced by M.B. Giles. So far, Importance Sampling has proved successful in combination with the standard Monte Carlo method. We will show the efficiency of our approach on the estimation of financial derivative prices and then on the estimation of Greeks (i.e. sensitivitie...

  1. Iterative importance sampling algorithms for parameter estimation problems

    OpenAIRE

    Morzfeld, Matthias; Day, Marcus S.; Grout, Ray W.; Pau, George Shu Heng; Finsterle, Stefan A.; Bell, John B.

    2016-01-01

    In parameter estimation problems one approximates a posterior distribution over uncertain parameters defined jointly by a prior distribution, a numerical model, and noisy data. Typically, Markov Chain Monte Carlo (MCMC) is used for the numerical solution of such problems. An alternative to MCMC is importance sampling, where one draws samples from a proposal distribution, and attaches weights to each sample to account for the fact that the proposal distribution is not the posterior distribut...

  2. Joint importance sampling of low-order volumetric scattering

    DEFF Research Database (Denmark)

    Georgiev, Iliyan; Křivánek, Jaroslav; Hachisuka, Toshiya;

    2013-01-01

    that such approaches are an unnecessary legacy inherited from traditional surface-based rendering algorithms. We devise joint importance sampling of path vertices in participating media to construct paths that explicitly account for the product of all scattering and geometry terms along a sequence of vertices, instead of just locally at a single vertex. This leads to a number of practical importance sampling routines to explicitly construct single- and double-scattering subpaths in anisotropically scattering media. We demonstrate the benefit of our new sampling techniques, integrating them into several path...

  3. Geometrical importance sampling in Geant4 from design to verification

    CERN Document Server

    Dressel, M

    2003-01-01

    The addition of flexible, general implementations of geometrical splitting and Russian Roulette, in combination called geometrical importance sampling, for variance reduction and of a scoring system, for controlling the sampling, are described. The efficiency of the variance reduction implementation is measured in a simulation of a typical benchmark experiment for neutron shielding. Using geometrical importance sampling a reduction of the computing time of a factor 89 compared to the analog calculation, for obtaining a neutron flux with a certain precision, was achieved for the benchmark application.
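    To illustrate what geometrical splitting and Russian roulette do to particle weights, the sketch below runs a toy 1-D slab random walk in which cell importances double with depth: a particle entering a more important cell is split (weight divided), one entering a less important cell is rouletted (weight multiplied on survival). This is a hedged toy transport model with assumed cross-sections, not the Geant4 implementation or the benchmark described above.

      import numpy as np

      # Hedged sketch: splitting/Russian roulette variance reduction on a 1-D slab.
      rng = np.random.default_rng(9)

      slab, p_scatter = 5.0, 0.7                       # slab thickness (mean free paths), scatter prob.
      importance = [2.0 ** j for j in range(6)]        # one importance per unit-width cell
      cell = lambda x: min(int(x), 5)

      def transmission(n_source=2000):
          score = 0.0
          for _ in range(n_source):
              stack = [(0.0, 1.0, 1.0)]                # (position, direction, statistical weight)
              while stack:
                  x, mu, w = stack.pop()
                  x_new = x + mu * rng.exponential(1.0)
                  if x_new >= slab:
                      score += w                       # transmitted: tally the weight
                      continue
                  if x_new < 0.0:
                      continue                         # leaked back out of the front face
                  ratio = importance[cell(x_new)] / importance[cell(x)]
                  if ratio >= 1.0:
                      n_copies, w = int(ratio), w / int(ratio)    # splitting
                  else:
                      if rng.random() > ratio:
                          continue                     # killed by Russian roulette
                      n_copies, w = 1, w / ratio       # survivor carries increased weight
                  for _ in range(n_copies):
                      if rng.random() > p_scatter:
                          continue                     # absorbed at the collision site
                      mu_new = 1.0 if rng.random() < 0.5 else -1.0
                      stack.append((x_new, mu_new, w))
          return score / n_source

      print("estimated transmission probability:", transmission())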

  4. Adaptation of the methodology of sample surveys for marketing researches

    Directory of Open Access Journals (Sweden)

    Kataev Andrey

    2015-08-01

    The article presents the results of adapting sample-survey theory for the purposes of marketing, which allows one to answer the fundamental question of any marketing research: how many objects should be studied to draw adequate conclusions.

  5. Efficient computation of smoothing splines via adaptive basis sampling

    KAUST Repository

    Ma, Ping

    2015-06-24

    © 2015 Biometrika Trust. Smoothing splines provide flexible nonparametric regression estimators. However, the high computational cost of smoothing splines for large datasets has hindered their wide application. In this article, we develop a new method, named adaptive basis sampling, for efficient computation of smoothing splines in super-large samples. Except for the univariate case where the Reinsch algorithm is applicable, a smoothing spline for a regression problem with sample size n can be expressed as a linear combination of n basis functions and its computational complexity is generally O(n^3). We achieve a more scalable computation in the multivariate case by evaluating the smoothing spline using a smaller set of basis functions, obtained by an adaptive sampling scheme that uses values of the response variable. Our asymptotic analysis shows that smoothing splines computed via adaptive basis sampling converge to the true function at the same rate as full basis smoothing splines. Using simulation studies and a large-scale deep earth core-mantle boundary imaging study, we show that the proposed method outperforms a sampling method that does not use the values of response variables.

  6. An improved adaptive sampling and experiment design method for aerodynamic optimization

    Institute of Scientific and Technical Information of China (English)

    Huang Jiangtao; Gao Zhenghong; Zhou Zhu; Zhao Ke

    2015-01-01

    The experiment design method is key to constructing a highly reliable surrogate model for numerical optimization in large-scale projects. Within the method, the experimental design criterion directly affects the accuracy of the surrogate model and the optimization efficiency. To address the shortcomings of traditional experimental design, an improved adaptive sampling method is proposed in this paper. The surrogate model is first constructed from basic sparse samples. Then the supplementary sampling position is detected according to specified criteria, which introduce energy-function and curvature sampling criteria based on a radial basis function (RBF) network. The sampling detection criteria consider both the uniformity of the sample distribution and the description of hypersurface curvature so as to significantly improve the prediction accuracy of the surrogate model with far fewer samples. For a surrogate model constructed with sparse samples, sample uniformity is an important factor for interpolation accuracy in the initial stage of adaptive sampling and surrogate model training. As uniformity improves, the curvature description of the objective function surface gradually becomes more important. In consideration of these issues, a crowdness enhance function and a root mean square error (RMSE) feedback function are introduced in the C criterion expression. Thus, a new sampling method called RMSE and crowdness enhance (RCE) adaptive sampling is established. The validity of the RCE adaptive sampling method is studied first on typical test functions and then on an airfoil/wing aerodynamic optimization design problem with a high-dimensional design space. The results show that the RCE adaptive sampling method not only reduces the required number of samples but also effectively improves the prediction accuracy of the surrogate model, giving it broad prospects for applications.

  7. Adaptive video compressed sampling in the wavelet domain

    Science.gov (United States)

    Dai, Hui-dong; Gu, Guo-hua; He, Wei-ji; Chen, Qian; Mao, Tian-yi

    2016-07-01

    In this work, we propose a multiscale video acquisition framework called adaptive video compressed sampling (AVCS) that involves sparse sampling and motion estimation in the wavelet domain. Implementing a combination of a binary DMD and a single-pixel detector, AVCS acquires successively finer resolution sparse wavelet representations in moving regions directly based on extended wavelet trees, and alternately uses these representations to estimate the motion in the wavelet domain. Then, we can remove the spatial and temporal redundancies and provide a method to reconstruct video sequences from compressed measurements in real time. In addition, the proposed method allows adaptive control over the reconstructed video quality. The numerical simulation and experimental results indicate that AVCS performs better than the conventional CS-based methods at the same sampling rate even under the influence of noise, and the reconstruction time and measurements required can be significantly reduced.

  8. Adaptation of a Digitally Predistorted RF Amplifier Using Selective Sampling

    Institute of Scientific and Technical Information of China (English)

    R. Neil Braithwaite

    2011-01-01

    In this paper, a reduced-cost method of measuring residual nonlinearities in an adaptive digitally predistorted amplifier is proposed. Measurements obtained by selective sampling of the amplifier output are integrated over the input envelope range to adapt a fourth-order polynomial predistorter with memory correction. Results for a WCDMA input with a 101 carrier configuration show that a transmitter using the proposed method can meet the adjacent channel leakage ratio (ACLR) specification. Inverse modeling of the nonlinearity is proposed as a future extension that will reduce the cost of the system further.

  9. An Importance Sampling Simulation Method for Bayesian Decision Feedback Equalizers

    OpenAIRE

    Chen, S.; Hanzo, L.

    2000-01-01

    An importance sampling (IS) simulation technique is presented for evaluating the lower-bound bit error rate (BER) of the Bayesian decision feedback equalizer (DFE) under the assumption of correct decisions being fed back. A design procedure is developed, which chooses appropriate bias vectors for the simulation density to ensure asymptotic efficiency of the IS simulation.

  10. Variance Analysis and Adaptive Sampling for Indirect Light Path Reuse

    Institute of Scientific and Technical Information of China (English)

    Hao Qin; Xin Sun; Jun Yan; Qi-Ming Hou; Zhong Ren; Kun Zhou

    2016-01-01

    In this paper, we study the estimation variance of a set of global illumination algorithms based on indirect light path reuse. These algorithms usually contain two passes — in the first pass, a small number of indirect light samples are generated and evaluated, and they are then reused by a large number of reconstruction samples in the second pass. Our analysis shows that the covariance of the reconstruction samples dominates the estimation variance under high reconstruction rates and increasing the reconstruction rate cannot effectively reduce the covariance. We also find that the covariance represents to what degree the indirect light samples are reused during reconstruction. This analysis motivates us to design a heuristic approximating the covariance as well as an adaptive sampling scheme based on this heuristic to reduce the rendering variance. We validate our analysis and adaptive sampling scheme in the indirect light field reconstruction algorithm and the axis-aligned filtering algorithm for indirect lighting. Experiments are in accordance with our analysis and show that rendering artifacts can be greatly reduced at a similar computational cost.

  11. Stochastic seismic inversion using greedy annealed importance sampling

    Science.gov (United States)

    Xue, Yang; Sen, Mrinal K.

    2016-10-01

    A global optimization method called very fast simulated annealing (VFSA) inversion has been applied to seismic inversion. Here we address some of the limitations of VFSA by developing a new stochastic inference method, named greedy annealed importance sampling (GAIS). GAIS combines VFSA and greedy importance sampling (GIS), which uses a greedy search in the important regions located by VFSA, in order to attain fast convergence and provide unbiased estimation. We demonstrate the performance of GAIS with application to seismic inversion of field post- and pre-stack datasets. The results indicate that GAIS can improve lateral continuity of the inverted impedance profiles and provide better estimation of uncertainties than using VFSA alone. Thus this new hybrid method combining global and local optimization methods can be applied in seismic reservoir characterization and reservoir monitoring for accurate estimation of reservoir models and their uncertainties.

  12. Importance Sampling for Failure Probabilities in Computing and Data Transmission

    DEFF Research Database (Denmark)

    Asmussen, Søren

    We study efficient simulation algorithms for estimating P(X > x), where X is the total time of a job with ideal time T that needs to be restarted after a failure. The main tool is importance sampling, where one tries to identify a good importance distribution via an asymptotic description... these asymptotic descriptions have bounded relative error as x → ∞ when combined with the ideas used for a fixed t. Nevertheless, the paper gives examples showing that algorithms carefully designed to enjoy bounded relative error may provide little or no asymptotic improvement over crude Monte Carlo simulation when...

  13. Importance sampling for failure probabilities in computing and data transmission

    DEFF Research Database (Denmark)

    Asmussen, Søren

    2009-01-01

    In this paper we study efficient simulation algorithms for estimating P(X > x), where X is the total time of a job with ideal time T that needs to be restarted after a failure. The main tool is importance sampling, where a good importance distribution is identified via an asymptotic description... these asymptotic descriptions have bounded relative error as x → ∞ when combined with the ideas used for a fixed t. Nevertheless, we give examples of algorithms carefully designed to enjoy bounded relative error that may provide little or no asymptotic improvement over crude Monte Carlo simulation when...

  14. Adaptive Sampling of Time Series During Remote Exploration

    Science.gov (United States)

    Thompson, David R.

    2012-01-01

    This work deals with the challenge of online adaptive data collection in a time series. A remote sensor or explorer agent adapts its rate of data collection in order to track anomalous events while obeying constraints on time and power. This problem is challenging because the agent has limited visibility (all its datapoints lie in the past) and limited control (it can only decide when to collect its next datapoint). This problem is treated from an information-theoretic perspective, fitting a probabilistic model to collected data and optimizing the future sampling strategy to maximize information gain. The performance characteristics of stationary and nonstationary Gaussian process models are compared. Self-throttling sensors could benefit environmental sensor networks and monitoring as well as robotic exploration. Explorer agents can improve performance by adjusting their data collection rate, preserving scarce power or bandwidth resources during uninteresting times while fully covering anomalous events of interest. For example, a remote earthquake sensor could conserve power by limiting its measurements during normal conditions and increasing its cadence during rare earthquake events. A similar capability could improve sensor platforms traversing a fixed trajectory, such as an exploration rover transect or a deep space flyby. These agents can adapt observation times to improve sample coverage during moments of rapid change. An adaptive sampling approach couples sensor autonomy, instrument interpretation, and sampling. The challenge is addressed as an active learning problem, which already has extensive theoretical treatment in the statistics and machine learning literature. A statistical Gaussian process (GP) model is employed to guide sample decisions that maximize information gain. Nonstationary (e.g., time-varying) covariance relationships permit the system to represent and track local anomalies, in contrast with current GP approaches. Most common GP models
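    A hedged sketch of the sample-scheduling idea: with a Gaussian process model with fixed RBF hyperparameters (an assumption), the information gain of a future measurement depends only on its time, so the agent can pick the candidate time with the largest predictive variance. The article's nonstationary models and power constraints are not shown.

      import numpy as np

      # Hedged sketch: choose the next sample time by GP predictive variance.
      def rbf(a, b, length=2.0, var=1.0):
          return var * np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length ** 2)

      def next_sample_time(t_obs, t_candidates, noise=1e-2):
          K = rbf(t_obs, t_obs) + noise * np.eye(len(t_obs))
          Ks = rbf(t_obs, t_candidates)
          Kss = rbf(t_candidates, t_candidates)
          var = np.diag(Kss - Ks.T @ np.linalg.solve(K, Ks))   # GP predictive variance
          return t_candidates[np.argmax(var)]

      t_obs = np.array([0.0, 1.0, 2.0, 6.0])                   # times already sampled
      candidates = np.linspace(2.5, 10.0, 50)                  # allowed future sampling times
      print("next measurement at t =", next_sample_time(t_obs, candidates))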

  15. Distributed Database Kriging for Adaptive Sampling (D2 KAS)

    Science.gov (United States)

    Roehm, Dominic; Pavel, Robert S.; Barros, Kipton; Rouet-Leduc, Bertrand; McPherson, Allen L.; Germann, Timothy C.; Junghans, Christoph

    2015-07-01

    We present an adaptive sampling method supplemented by a distributed database and a prediction method for multiscale simulations using the Heterogeneous Multiscale Method. A finite-volume scheme integrates the macro-scale conservation laws for elastodynamics, which are closed by momentum and energy fluxes evaluated at the micro-scale. In the original approach, molecular dynamics (MD) simulations are launched for every macro-scale volume element. Our adaptive sampling scheme replaces a large fraction of costly micro-scale MD simulations with fast table lookup and prediction. The cloud database Redis provides the plain table lookup, and with locality aware hashing we gather input data for our prediction scheme. For the latter we use kriging, which estimates an unknown value and its uncertainty (error) at a specific location in parameter space by using weighted averages of the neighboring points. We find that our adaptive scheme significantly improves simulation performance by a factor of 2.5-25, while retaining high accuracy for various choices of the algorithm parameters.

  16. Learning Adaptive Forecasting Models from Irregularly Sampled Multivariate Clinical Data

    Science.gov (United States)

    Liu, Zitao; Hauskrecht, Milos

    2016-01-01

    Building accurate predictive models of clinical multivariate time series is crucial for understanding the patient condition, the dynamics of a disease, and clinical decision making. A challenging aspect of this process is that the model should be flexible and adaptive to reflect patient-specific temporal behaviors well, even when the available patient-specific data are sparse and span only a short time. To address this problem we propose and develop an adaptive two-stage forecasting approach for modeling multivariate, irregularly sampled clinical time series of varying lengths. The proposed model (1) learns the population trend from a collection of time series for past patients; (2) captures individual-specific short-term multivariate variability; and (3) adapts by automatically adjusting its predictions based on new observations. The proposed forecasting model is evaluated on a real-world clinical time series dataset. The results demonstrate the benefits of our approach on the prediction tasks for multivariate, irregularly sampled clinical time series, and show that it can outperform both the population-based and patient-specific time series prediction models in terms of prediction accuracy.

  17. The Importance of Formalizing Computational Models of Face Adaptation Aftereffects

    Science.gov (United States)

    Ross, David A.; Palmeri, Thomas J.

    2016-01-01

    Face adaptation is widely used as a means to probe the neural representations that support face recognition. While the theories that relate face adaptation to behavioral aftereffects may seem conceptually simple, our work has shown that testing computational instantiations of these theories can lead to unexpected results. Instantiating a model of face adaptation not only requires specifying how faces are represented and how adaptation shapes those representations but also specifying how decisions are made, translating hidden representational states into observed responses. Considering the high-dimensionality of face representations, the parallel activation of multiple representations, and the non-linearity of activation functions and decision mechanisms, intuitions alone are unlikely to succeed. If the goal is to understand mechanism, not simply to examine the boundaries of a behavioral phenomenon or correlate behavior with brain activity, then formal computational modeling must be a component of theory testing. To illustrate, we highlight our recent computational modeling of face adaptation aftereffects and discuss how models can be used to understand the mechanisms by which faces are recognized. PMID:27378960

  18. Gap processing for adaptive maximal Poisson-disk sampling

    KAUST Repository

    Yan, Dongming

    2013-09-01

    In this article, we study the generation of maximal Poisson-disk sets with varying radii. First, we present a geometric analysis of gaps in such disk sets. This analysis is the basis for maximal and adaptive sampling in Euclidean space and on manifolds. Second, we propose efficient algorithms and data structures to detect gaps and update gaps when disks are inserted, deleted, moved, or when their radii are changed. We build on the concepts of regular triangulations and the power diagram. Third, we show how our analysis contributes to the state-of-the-art in surface remeshing. © 2013 ACM.

  19. Elucidating Microbial Adaptation Dynamics via Autonomous Exposure and Sampling

    Science.gov (United States)

    Grace, J. M.; Verseux, C.; Gentry, D.; Moffet, A.; Thayabaran, R.; Wong, N.; Rothschild, L.

    2013-12-01

    The adaptation of micro-organisms to their environments is a complex process of interaction between the pressures of the environment and of competition. Reducing this multifactorial process to environmental exposure in the laboratory is a common tool for elucidating individual mechanisms of evolution, such as mutation rates[Wielgoss et al., 2013]. Although such studies inform fundamental questions about the way adaptation and even speciation occur, they are often limited by labor-intensive manual techniques[Wassmann et al., 2010]. Current methods for controlled study of microbial adaptation limit the length of time, the depth of collected data, and the breadth of applied environmental conditions. Small idiosyncrasies in manual techniques can have large effects on outcomes; for example, there are significant variations in induced radiation resistances following similar repeated exposure protocols[Alcántara-Díaz et al., 2004; Goldman and Travisano, 2011]. We describe here a project under development to allow rapid cycling of multiple types of microbial environmental exposure. The system allows continuous autonomous monitoring and data collection of both single species and sampled communities, independently and concurrently providing multiple types of controlled environmental pressure (temperature, radiation, chemical presence or absence, and so on) to a microbial community in dynamic response to the ecosystem's current status. When combined with DNA sequencing and extraction, such a controlled environment can cast light on microbial functional development, population dynamics, inter- and intra-species competition, and microbe-environment interaction. The project's goal is to allow rapid, repeatable iteration of studies of both natural and artificial microbial adaptation. As an example, the same system can be used both to increase the pH of a wet soil aliquot over time while periodically sampling it for genetic activity analysis, or to repeatedly expose a culture of

  20. Semigroups and sequential importance sampling for multiway tables

    CERN Document Server

    Yoshida, Ruriko; Wei, Shaoceng; Zhou, Feng; Haws, David

    2011-01-01

    When an interval of integers between the lower bound $l_i$ and the upper bound $u_i$ is the support of the marginal distribution $n_i|(n_{i-1}, ...,n_1)$, Chen et al. 2005 noticed that sampling from the interval at each step, for $n_i$ during a sequential importance sampling (SIS) procedure, always produces a table which satisfies the marginal constraints. However, in general, the interval may not be equal to the support of the marginal distribution. In this case, the SIS procedure may produce tables which do not satisfy the marginal constraints, leading to rejection [Chen et al. 2006]. In this paper we consider the uniform distribution as the target distribution. First, we show that if we fix the number of rows and columns of the design matrix of the model for contingency tables then there exists a polynomial time algorithm in terms of the input size to sample a table from the set of all tables satisfying all marginals defined by the given model via the SIS procedure without rejection. We then show experimentall...

  1. Adaptive Sampling-Based Information Collection for Wireless Body Area Networks.

    Science.gov (United States)

    Xu, Xiaobin; Zhao, Fang; Wang, Wendong; Tian, Hui

    2016-01-01

    To collect important health information, WBAN applications typically sense data at a high frequency. However, limited by the quality of the wireless link, the uploading of sensed data is constrained to an upper frequency. To reduce upload frequency, most of the existing WBAN data collection approaches collect data with a tolerable error. These approaches can guarantee the precision of the collected data, but they are not able to ensure that the upload frequency stays within the upper frequency. Some traditional sampling-based approaches can control upload frequency directly; however, they usually incur a high loss of information. Since the core task of WBAN applications is to collect health information, this paper aims to collect optimized information under the limitation of upload frequency. The importance of sensed data is defined according to information theory for the first time. Information-aware adaptive sampling is proposed to collect uniformly distributed data. Then we propose Adaptive Sampling-based Information Collection (ASIC), which consists of two algorithms. An adaptive sampling probability algorithm is proposed to compute sampling probabilities of different sensed values. A multiple uniform sampling algorithm provides uniform samplings for values in different intervals. Experiments based on a real dataset show that the proposed approach has higher performance in terms of data coverage and information quantity. The parameter analysis shows the optimized parameter settings and the discussion shows the underlying reasons for the high performance of the proposed approach. PMID:27589758
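
    A generic, self-contained sketch of the information-aware idea described above is given below (the histogram-based frequency estimate, the upload budget, and all parameter values are hypothetical illustrations, not the paper's ASIC algorithms): sensed values that occur rarely carry more information, so they are retained with higher probability under a fixed upload budget.

```python
import numpy as np

rng = np.random.default_rng(6)
values = rng.normal(70, 5, 10_000)      # e.g. simulated heart-rate readings
budget = 1_000                          # maximum number of samples that may be uploaded

# Estimate how common each sensed value is via a coarse histogram.
counts, edges = np.histogram(values, bins=20)
bin_idx = np.clip(np.digitize(values, edges[1:-1]), 0, len(counts) - 1)
freq = counts[bin_idx] / len(values)    # empirical probability of each value's bin

# Rarer (more informative) values get proportionally larger keep probabilities,
# scaled so that the expected number of kept samples matches the budget.
raw = 1.0 / freq
keep_prob = np.minimum(1.0, budget * raw / raw.sum())
kept = values[rng.random(len(values)) < keep_prob]
print(f"kept {len(kept)} of {len(values)} samples")
```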

  2. Stochastic approximation Monte Carlo importance sampling for approximating exact conditional probabilities

    KAUST Repository

    Cheon, Sooyoung

    2013-02-16

    Importance sampling and Markov chain Monte Carlo methods have been used in exact inference for contingency tables for a long time; however, their performance is not always very satisfactory. In this paper, we propose a stochastic approximation Monte Carlo importance sampling (SAMCIS) method for tackling this problem. SAMCIS is a combination of adaptive Markov chain Monte Carlo and importance sampling, which employs the stochastic approximation Monte Carlo algorithm (Liang et al., J. Am. Stat. Assoc., 102(477):305-320, 2007) to draw samples from an enlarged reference set with a known Markov basis. Compared to the existing importance sampling and Markov chain Monte Carlo methods, SAMCIS has a few advantages, such as fast convergence, ergodicity, and the ability to achieve a desired proportion of valid tables. The numerical results indicate that SAMCIS can outperform the existing importance sampling and Markov chain Monte Carlo methods: It can produce much more accurate estimates in much shorter CPU time than the existing methods, especially for the tables with high degrees of freedom. © 2013 Springer Science+Business Media New York.

  3. Local Importance Sampling: A Novel Technique to Enhance Particle Filtering

    Directory of Open Access Journals (Sweden)

    Péter Torma

    2006-04-01

    In the low observation noise limit particle filters become inefficient. In this paper a simple-to-implement particle filter is suggested as a solution to this well-known problem. The proposed Local Importance Sampling based particle filters draw the particles’ positions in a two-step process that makes use of both the dynamics of the system and the most recent observation. Experiments with the standard bearings-only tracking problem indicate that the proposed new particle filter method is indeed very successful when observations are reliable. Experiments with a high-dimensional variant of this problem further show that the advantage of the new filter grows with the increasing dimensionality of the system.
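
    As a rough illustration of the general principle of drawing particles using both the system dynamics and the most recent observation, the sketch below implements a locally optimal proposal for an assumed one-dimensional linear-Gaussian model (all model parameters are hypothetical); it is not the authors' two-step Local Importance Sampling scheme.

```python
import numpy as np

rng = np.random.default_rng(1)
a, q, r = 0.9, 0.1, 0.01          # dynamics gain, process noise var, observation noise var
n_particles = 500
particles = rng.standard_normal(n_particles)
weights = np.full(n_particles, 1.0 / n_particles)

def filter_step(particles, weights, y):
    """One particle-filter step with a proposal that uses dynamics and observation."""
    # Locally optimal proposal p(x_t | x_{t-1}, y_t) for this linear-Gaussian model.
    s2 = 1.0 / (1.0 / q + 1.0 / r)
    mean = s2 * (a * particles / q + y / r)
    new_particles = mean + np.sqrt(s2) * rng.standard_normal(len(particles))
    # The incremental importance weight is the predictive likelihood p(y_t | x_{t-1}).
    pred_var = q + r
    incr = np.exp(-0.5 * (y - a * particles) ** 2 / pred_var) / np.sqrt(2 * np.pi * pred_var)
    new_weights = weights * incr
    new_weights /= new_weights.sum()
    return new_particles, new_weights

particles, weights = filter_step(particles, weights, y=0.3)
print("posterior mean estimate:", np.sum(weights * particles))
```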

  4. Semigroups and sequential importance sampling for multiway tables and beyond

    CERN Document Server

    Xi, Jing; Zhou, Feng; Yoshida, Ruriko; Haws, David

    2011-01-01

    When an interval of integers between the lower bound l_i and the upper bound u_i is the support of the marginal distribution n_i|(n_{i-1}, ...,n_1), Chen et al. 2005 noticed that sampling from the interval at each step, for n_i during the sequential importance sampling (SIS) procedure, always produces a table which satisfies the marginal constraints. However, in general, the interval may not be equal to the support of the marginal distribution. In this case, the SIS procedure may produce tables which do not satisfy the marginal constraints, leading to rejection [Chen et al. 2006]. Rejecting tables is computationally expensive and incorrect proposal distributions result in biased estimators for the number of tables given its marginal sums. This paper has two focuses: (1) we propose a correction coefficient which corrects an interval of integers between the lower bound l_i and the upper bound u_i to the support of the marginal distribution asymptotically even with rejections and with the same time complexity ...

  5. Consistent Adjoint Driven Importance Sampling using Space, Energy and Angle

    Energy Technology Data Exchange (ETDEWEB)

    Peplow, Douglas E. [ORNL; Mosher, Scott W [ORNL; Evans, Thomas M [ORNL

    2012-08-01

    For challenging radiation transport problems, hybrid methods combine the accuracy of Monte Carlo methods with the global information present in deterministic methods. One of the most successful hybrid methods is CADIS (Consistent Adjoint Driven Importance Sampling). This method uses a deterministic adjoint solution to construct a biased source distribution and consistent weight windows to optimize a specific tally in a Monte Carlo calculation. The method has been implemented into transport codes using just the spatial and energy information from the deterministic adjoint and has been used in many applications to compute tallies with much higher figures-of-merit than analog calculations. CADIS also outperforms user-supplied importance values, which usually take long periods of user time to develop. This work extends CADIS to develop weight windows that are a function of the position, energy, and direction of the Monte Carlo particle. Two types of consistent source biasing are presented: one method that biases the source in space and energy while preserving the original directional distribution and one method that biases the source in space, energy, and direction. Seven simple example problems are presented which compare the use of the standard space/energy CADIS with the new space/energy/angle treatments.
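
    A simplified, discretized sketch of the space/energy CADIS relations summarized above is shown below (the arrays are hypothetical stand-ins for a deterministic adjoint solution; this is not the ORNL implementation): the adjoint flux defines a biased source and consistent weight-window centers for the Monte Carlo run.

```python
import numpy as np

rng = np.random.default_rng(2)
n_cells, n_groups = 50, 8
source = rng.random((n_cells, n_groups))        # hypothetical source q(r, E)
adjoint_flux = rng.random((n_cells, n_groups))  # hypothetical adjoint flux for the tally

# Estimated detector response R = sum over space and energy of q * adjoint flux.
response = np.sum(source * adjoint_flux)

# Biased source: sample source particles in proportion to their expected contribution.
biased_source = source * adjoint_flux / response          # sums to 1

# Consistent weight-window centers: a particle born from the biased source starts
# exactly at the center of its window, so no immediate splitting or roulette occurs.
ww_centers = response / adjoint_flux
```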

  6. Multi-Scaling Sampling: An Adaptive Sampling Method for Discovering Approximate Association Rules

    Institute of Scientific and Technical Information of China (English)

    Cai-Yan Jia; Xie-Ping Gao

    2005-01-01

    One of the obstacles of efficient association rule mining is the explosive expansion of data sets, since it is costly or impossible to scan large databases, especially multiple times. A popular solution to improve the speed and scalability of association rule mining is to run the algorithm on a random sample instead of the entire database. But how to effectively define and efficiently estimate the degree of error with respect to the outcome of the algorithm, and how to determine the sample size needed, have remained entangled research problems until now. In this paper, an effective and efficient algorithm is given, based on the PAC (Probably Approximately Correct) learning theory, to measure and estimate sample error. Then, a new adaptive, on-line, fast sampling strategy - multi-scaling sampling - is presented, inspired by MRA (Multi-Resolution Analysis) and the Shannon sampling theorem, for quickly obtaining acceptably approximate association rules at an appropriate sample size. Both theoretical analysis and empirical study have shown that the sampling strategy can achieve a very good speed-accuracy trade-off.
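
    One standard way to make the PAC-style sample-error reasoning concrete is a Hoeffding-type bound on the sample size; the sketch below is offered only as an illustration of that reasoning and is not the paper's error measure or multi-scaling schedule.

```python
import math

def pac_sample_size(eps: float, delta: float, n_itemsets: int = 1) -> int:
    """Hoeffding bound with a union bound over the candidate itemsets:
    with probability at least 1 - delta, every estimated support is within
    eps of its true value."""
    return math.ceil(math.log(2 * n_itemsets / delta) / (2 * eps**2))

# e.g. support error at most 1% with 95% confidence over 1000 candidate itemsets
print(pac_sample_size(eps=0.01, delta=0.05, n_itemsets=1000))
```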

  7. The importance of cooling of urine samples for doping analysis

    NARCIS (Netherlands)

    Kuenen, J.G.; Konings, W.N.

    2009-01-01

    Storing and transporting of urine samples for doping analysis, as performed by the anti-doping organizations associated with the World Anti-Doping Agency, does not include a specific protocol for cooled transport from the place of urine sampling to the doping laboratory, although low cost cooling fa

  8. The importance of cooling of urine samples for doping analysis

    NARCIS (Netherlands)

    Kuenen, J. Gijs; Konings, Wil N.

    2010-01-01

    Storing and transporting of urine samples for doping analysis, as performed by the anti-doping organizations associated with the World Anti-Doping Agency, does not include a specific protocol for cooled transport from the place of urine sampling to the doping laboratory, although low cost cooling fa

  9. Innovation and adaptation in a Turkish sample: a preliminary study.

    Science.gov (United States)

    Oner, B

    2000-11-01

    The aim of this study was to examine the representations of adaptation and innovation among adults in Turkey. Semi-structured interviews were carried out with a sample of 20 Turkish adults (10 men, 10 women) from various occupations. The participants' ages ranged from 21 to 58 years. Results of content analysis showed that the representation of innovation varied with the type of context. Innovation was not preferred within the family and interpersonal relationship contexts, whereas it was relatively more readily welcomed within the contexts of work, science, and technology. This finding may indicate that the concept of innovation that is assimilated in traditional Turkish culture has limits. Contents of the interviews were also analyzed with respect to M. J. Kirton's (1976) subscales of originality, efficiency, and rule-group conformity. The participants favored efficient innovators, whereas they thought that the risk of failure was high in cases of inefficient innovation. The reasons for and indications of the representations of innovativeness among Turkish people are discussed in relation to their social structure and cultural expectations. PMID:11092420

  10. Large Deviations and Importance Sampling for Systems of Slow-Fast Motion

    Energy Technology Data Exchange (ETDEWEB)

    Spiliopoulos, Konstantinos, E-mail: kspiliop@dam.brown.edu [Brown University, Division of Applied Mathematics (United States)

    2013-02-15

    In this paper we develop the large deviations principle and a rigorous mathematical framework for asymptotically efficient importance sampling schemes for general, fully dependent systems of stochastic differential equations of slow and fast motion with small noise in the slow component. We assume periodicity with respect to the fast component. Depending on the interaction of the fast scale with the smallness of the noise, we get different behavior. We examine how one range of interaction differs from the other one both for the large deviations and for the importance sampling. We use the large deviations results to identify asymptotically optimal importance sampling schemes in each case. Standard Monte Carlo schemes perform poorly in the small noise limit. In the presence of multiscale aspects one faces additional difficulties and straightforward adaptation of importance sampling schemes for standard small noise diffusions will not produce efficient schemes. It turns out that one has to consider the so called cell problem from the homogenization theory for Hamilton-Jacobi-Bellman equations in order to guarantee asymptotic optimality. We use stochastic control arguments.

  11. Turkish adaptation of the Fear of Spiders Questionnaire: Reliability and validity in non-clinical samples

    Directory of Open Access Journals (Sweden)

    Robert W. Booth

    2016-12-01

    The rapid, objective measurement of spider fear is important for clinicians, and for researchers studying fear. To facilitate this, we adapted the Fear of Spiders Questionnaire (FSQ) into Turkish. The FSQ is quick to complete and easy to understand. Compared to the commonly used Spider Phobia Questionnaire, it has shown superior test-retest reliability and better discrimination of lower levels of spider fear, facilitating fear research in non-clinical samples. In two studies, with 137 and 105 undergraduates and unselected volunteers, our adapted FSQ showed excellent internal consistency (Cronbach’s α = .95 and .96) and test-retest reliability (r = .90), and good discriminant validity against the State–Trait Anxiety Inventory—Trait (r = .23) and Beck Anxiety Inventory—Trait (r = .07). Most importantly, our adapted FSQ significantly predicted 26 students’ self-reported discomfort upon approaching a caged tarantula; however, a measure of behavioural avoidance of the tarantula yielded little variability, so a more sensitive task will be required for future behavioural testing. Based on this initial testing, we recommend our adapted FSQ for research use. Further research is required to verify that our adapted FSQ discriminates individuals with and without phobia effectively. A Turkish-language report of the studies is included as supplementary material.

  12. Importance of eccentric actions in performance adaptations to resistance training

    Science.gov (United States)

    Dudley, Gary A.; Miller, Bruce J.; Buchanan, Paul; Tesch, Per A.

    1991-01-01

    The importance of eccentric (ecc) muscle actions in resistance training for the maintenance of muscle strength and mass in hypogravity was investigated in experiments in which human subjects, divided into three groups, were asked to perform four to five sets of 6 to 12 repetitions (rep) per set of three leg press and leg extension exercises, 2 days each week for 19 weeks. One group, labeled 'con', performed each rep with only concentric (con) actions, while the con/ecc group performed each rep with both con and ecc actions; the third group, con/con, performed twice as many sets with only con actions. Control subjects did not train. It was found that resistance training with both con and ecc actions induced greater increases in muscle strength than did training with only con actions.

  13. Parallel importance sampling in conditional linear Gaussian networks

    DEFF Research Database (Denmark)

    Salmerón, Antonio; Ramos-López, Darío; Borchani, Hanen;

    2015-01-01

    In this paper we analyse the problem of probabilistic inference in CLG networks when evidence comes in streams. In such situations, fast and scalable algorithms that are able to provide accurate responses in a short time are required. We consider the instantiation of variational inference and importance ...

  14. Job performance ratings : The relative importance of mental ability, conscientiousness, and career adaptability

    NARCIS (Netherlands)

    Ohme, Melanie; Zacher, Hannes

    2015-01-01

    According to career construction theory, continuous adaptation to the work environment is crucial to achieve work and career success. In this study, we examined the relative importance of career adaptability for job performance ratings using an experimental policy-capturing design. Employees (N = 13

  15. Importance Nested Sampling and the MultiNest Algorithm

    CERN Document Server

    Feroz, F; Cameron, E; Pettitt, A N

    2013-01-01

    Bayesian inference involves two main computational challenges. First, in estimating the parameters of some model for the data, the posterior distribution may well be highly multi-modal: a regime in which the convergence to stationarity of traditional Markov Chain Monte Carlo (MCMC) techniques becomes incredibly slow. Second, in selecting between a set of competing models the necessary estimation of the Bayesian evidence for each is, by definition, a (possibly high-dimensional) integration over the entire parameter space; again this can be a daunting computational task, although new Monte Carlo (MC) integration algorithms offer solutions of ever increasing efficiency. Nested sampling (NS) is one such contemporary MC strategy targeted at calculation of the Bayesian evidence, but which also enables posterior inference as a by-product, thereby allowing simultaneous parameter estimation and model selection. The widely-used MultiNest algorithm presents a particularly efficient implementation of the NS technique for...
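
    A toy sketch of the basic nested sampling loop is shown below for a one-dimensional Gaussian likelihood under a uniform prior (replacement points are drawn by naive rejection from the prior; MultiNest's ellipsoidal decomposition and importance nested sampling are deliberately not reproduced here).

```python
import numpy as np

rng = np.random.default_rng(3)

def log_like(theta):
    # Standard normal log-likelihood.
    return -0.5 * theta**2 - 0.5 * np.log(2 * np.pi)

n_live, n_iter = 200, 1000
live = rng.uniform(-5.0, 5.0, n_live)          # uniform prior on [-5, 5]
live_logl = log_like(live)
log_z, log_x_prev = -np.inf, 0.0

for i in range(1, n_iter + 1):
    worst = np.argmin(live_logl)
    log_x = -i / n_live                        # expected log prior volume remaining
    log_dx = np.log(np.exp(log_x_prev) - np.exp(log_x))
    log_z = np.logaddexp(log_z, live_logl[worst] + log_dx)
    # Replace the worst live point by a prior draw with strictly higher likelihood.
    threshold = live_logl[worst]
    while True:
        cand = rng.uniform(-5.0, 5.0)
        if log_like(cand) > threshold:
            break
    live[worst], live_logl[worst] = cand, log_like(cand)
    log_x_prev = log_x

# Add the contribution of the remaining live points, then compare with the
# analytic evidence, which is roughly 0.1 for this prior and likelihood.
log_z = np.logaddexp(log_z, np.logaddexp.reduce(live_logl) - np.log(n_live) + log_x_prev)
print(np.exp(log_z))
```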

  16. Stress avoidance in a common annual: reproductive timing is important for local adaptation and geographic distribution.

    Science.gov (United States)

    Griffith, T M; Watson, M A

    2005-11-01

    Adaptation to local environments may be an important determinant of species' geographic range. However, little is known about which traits contribute to adaptation or whether their further evolution would facilitate range expansion. In this study, we assessed the adaptive value of stress avoidance traits in the common annual Cocklebur (Xanthium strumarium) by performing a reciprocal transplant across a broad latitudinal gradient extending to the species' northern border. Populations were locally adapted and stress avoidance traits accounted for most fitness differences between populations. At the northern border where growing seasons are cooler and shorter, native populations had evolved to reproduce earlier than native populations in the lower latitude gardens. This clinal pattern in reproductive timing corresponded to a shift in selection from favouring later to earlier reproduction. Thus, earlier reproduction is an important adaptation to northern latitudes and constraint on the further evolution of this trait in marginal populations could potentially limit distribution. PMID:16313471

  17. Multi-species attributes as the condition for adaptive sampling of rare species using two-stage sequential sampling with an auxiliary variable

    Science.gov (United States)

    Panahbehagh, B.; Smith, D.R.; Salehi, M.M.; Hornbach, D.J.; Brown, D.J.; Chan, F.; Marinova, D.; Anderssen, R.S.

    2011-01-01

    Assessing populations of rare species is challenging because of the large effort required to locate patches of occupied habitat and achieve precise estimates of density and abundance. The presence of a rare species has been shown to be correlated with presence or abundance of more common species. Thus, ecological community richness or abundance can be used to inform sampling of rare species. Adaptive sampling designs have been developed specifically for rare and clustered populations and have been applied to a wide range of rare species. However, adaptive sampling can be logistically challenging, in part, because variation in final sample size introduces uncertainty in survey planning. Two-stage sequential sampling (TSS), a recently developed design, allows for adaptive sampling, but avoids edge units and has an upper bound on final sample size. In this paper we present an extension of two-stage sequential sampling that incorporates an auxiliary variable (TSSAV), such as community attributes, as the condition for adaptive sampling. We develop a set of simulations to approximate sampling of endangered freshwater mussels to evaluate the performance of the TSSAV design. The performance measures that we are interested in are efficiency and probability of sampling a unit occupied by the rare species. Efficiency measures the precision of population estimate from the TSSAV design relative to a standard design, such as simple random sampling (SRS). The simulations indicate that the density and distribution of the auxiliary population is the most important determinant of the performance of the TSSAV design. Of the design factors, such as sample size, the fraction of the primary units sampled was most important. For the best scenarios, the odds of sampling the rare species was approximately 1.5 times higher for TSSAV compared to SRS and efficiency was as high as 2 (i.e., variance from TSSAV was half that of SRS). We have found that design performance, especially for adaptive

  18. State-independent importance sampling for random walks with regularly varying increments

    Directory of Open Access Journals (Sweden)

    Karthyek R. A. Murthy

    2015-03-01

    We develop importance sampling based efficient simulation techniques for three commonly encountered rare event probabilities associated with random walks having i.i.d. regularly varying increments; namely, (1) the large deviation probabilities, (2) the level crossing probabilities, and (3) the level crossing probabilities within a regenerative cycle. Exponential twisting based state-independent methods, which are effective in efficiently estimating these probabilities for light-tailed increments, are not applicable when the increments are heavy-tailed. To address the latter case, more complex and elegant state-dependent efficient simulation algorithms have been developed in the literature over the last few years. We propose that by suitably decomposing these rare event probabilities into a dominant and further residual components, simpler state-independent importance sampling algorithms can be devised for each component resulting in composite unbiased estimators with desirable efficiency properties. When the increments have infinite variance, there is an added complexity in estimating the level crossing probabilities as even the well known zero-variance measures have an infinite expected termination time. We adapt our algorithms so that this expectation is finite while the estimators remain strongly efficient. Numerically, the proposed estimators perform at least as well, and sometimes substantially better than the existing state-dependent estimators in the literature.

  19. An Adaptive Sampling System for Sensor Nodes in Body Area Networks.

    Science.gov (United States)

    Rieger, R; Taylor, J

    2014-04-23

    The importance of body sensor networks to monitor patients over a prolonged period of time has increased with an advance in home healthcare applications. Sensor nodes need to operate with very low-power consumption and under the constraint of limited memory capacity. Therefore, it is wasteful to digitize the sensor signal at a constant sample rate, given that the frequency contents of the signals vary with time. Adaptive sampling is established as a practical method to reduce the sample data volume. In this paper a low-power analog system is proposed, which adjusts the converter clock rate to perform a peak-picking algorithm on the second derivative of the input signal. The presented implementation does not require an analog-to-digital converter or a digital processor in the sample selection process. The criteria for selecting a suitable detection threshold are discussed, so that the maximum sampling error can be limited. A circuit level implementation is presented. Measured results exhibit a significant reduction in the average sample frequency and data rate of over 50% and 38% respectively. PMID:24760918
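
    A software analogue of the sample-selection idea is sketched below (the threshold and test signal are hypothetical; the paper itself describes an analog circuit implementation): samples are kept where the magnitude of the discrete second derivative exceeds a threshold, so the effective sampling rate follows how quickly the signal changes.

```python
import numpy as np

def adaptive_select(signal: np.ndarray, threshold: float) -> np.ndarray:
    """Indices of samples to keep: the endpoints plus points of high curvature."""
    d2 = np.abs(np.diff(signal, n=2))               # discrete second derivative
    keep = np.concatenate(([True], d2 > threshold, [True]))
    return np.flatnonzero(keep)

t = np.linspace(0.0, 1.0, 2000)
# Slow baseline plus one sharp, ECG-like spike around t = 0.5.
test_signal = np.sin(2 * np.pi * 1.2 * t) + 0.8 * np.exp(-((t - 0.5) / 0.01) ** 2)
idx = adaptive_select(test_signal, threshold=1e-3)
print(f"kept {len(idx)} of {len(t)} samples")
```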

  20. Multiview Sample Classification Algorithm Based on L1-Graph Domain Adaptation Learning

    OpenAIRE

    Huibin Lu; Zhengping Hu; Hongxiao Gao

    2015-01-01

    In the case of multiview sample classification with different distribution, training and testing samples are from different domains. In order to improve the classification performance, a multiview sample classification algorithm based on L1-Graph domain adaptation learning is presented. First of all, a framework of nonnegative matrix trifactorization based on domain adaptation learning is formed, in which the unchanged information is regarded as the bridge of knowledge transformation from the...

  1. Climate variables explain neutral and adaptive variation within salmonid metapopulations: the importance of replication in landscape genetics.

    Science.gov (United States)

    Hand, Brian K; Muhlfeld, Clint C; Wade, Alisa A; Kovach, Ryan P; Whited, Diane C; Narum, Shawn R; Matala, Andrew P; Ackerman, Michael W; Garner, Brittany A; Kimball, John S; Stanford, Jack A; Luikart, Gordon

    2016-02-01

    Understanding how environmental variation influences population genetic structure is important for conservation management because it can reveal how human stressors influence population connectivity, genetic diversity and persistence. We used riverscape genetics modelling to assess whether climatic and habitat variables were related to neutral and adaptive patterns of genetic differentiation (population-specific and pairwise FST ) within five metapopulations (79 populations, 4583 individuals) of steelhead trout (Oncorhynchus mykiss) in the Columbia River Basin, USA. Using 151 putatively neutral and 29 candidate adaptive SNP loci, we found that climate-related variables (winter precipitation, summer maximum temperature, winter highest 5% flow events and summer mean flow) best explained neutral and adaptive patterns of genetic differentiation within metapopulations, suggesting that climatic variation likely influences both demography (neutral variation) and local adaptation (adaptive variation). However, we did not observe consistent relationships between climate variables and FST across all metapopulations, underscoring the need for replication when extrapolating results from one scale to another (e.g. basin-wide to the metapopulation scale). Sensitivity analysis (leave-one-population-out) revealed consistent relationships between climate variables and FST within three metapopulations; however, these patterns were not consistent in two metapopulations likely due to small sample sizes (N = 10). These results provide correlative evidence that climatic variation has shaped the genetic structure of steelhead populations and highlight the need for replication and sensitivity analyses in land and riverscape genetics.

  2. Climate variables explain neutral and adaptive variation within salmonid metapopulations: The importance of replication in landscape genetics

    Science.gov (United States)

    Hand, Brian K; Muhlfeld, Clint C.; Wade, Alisa A.; Kovach, Ryan; Whited, Diane C.; Narum, Shawn R; Matala, Andrew P; Ackerman, Michael W.; Garner, B. A.; Kimball, John S; Stanford, Jack A.; Luikart, Gordon

    2016-01-01

    Understanding how environmental variation influences population genetic structure is important for conservation management because it can reveal how human stressors influence population connectivity, genetic diversity and persistence. We used riverscape genetics modelling to assess whether climatic and habitat variables were related to neutral and adaptive patterns of genetic differentiation (population-specific and pairwise FST) within five metapopulations (79 populations, 4583 individuals) of steelhead trout (Oncorhynchus mykiss) in the Columbia River Basin, USA. Using 151 putatively neutral and 29 candidate adaptive SNP loci, we found that climate-related variables (winter precipitation, summer maximum temperature, winter highest 5% flow events and summer mean flow) best explained neutral and adaptive patterns of genetic differentiation within metapopulations, suggesting that climatic variation likely influences both demography (neutral variation) and local adaptation (adaptive variation). However, we did not observe consistent relationships between climate variables and FST across all metapopulations, underscoring the need for replication when extrapolating results from one scale to another (e.g. basin-wide to the metapopulation scale). Sensitivity analysis (leave-one-population-out) revealed consistent relationships between climate variables and FST within three metapopulations; however, these patterns were not consistent in two metapopulations likely due to small sample sizes (N = 10). These results provide correlative evidence that climatic variation has shaped the genetic structure of steelhead populations and highlight the need for replication and sensitivity analyses in land and riverscape genetics.

  3. Long-term dynamics of adaptive evolution in a globally important phytoplankton species to ocean acidification

    Science.gov (United States)

    Schlüter, Lothar; Lohbeck, Kai T.; Gröger, Joachim P.; Riebesell, Ulf; Reusch, Thorsten B. H.

    2016-01-01

    Marine phytoplankton may adapt to ocean change, such as acidification or warming, because of their large population sizes and short generation times. Long-term adaptation to novel environments is a dynamic process, and phenotypic change can take place thousands of generations after exposure to novel conditions. We conducted a long-term evolution experiment (4 years = 2100 generations), starting with a single clone of the abundant and widespread coccolithophore Emiliania huxleyi exposed to three different CO2 levels simulating ocean acidification (OA). Growth rates as a proxy for Darwinian fitness increased only moderately under both levels of OA [+3.4% and +4.8%, respectively, at 1100 and 2200 μatm partial pressure of CO2 (Pco2)] relative to control treatments (ambient CO2, 400 μatm). Long-term adaptation to OA was complex, and initial phenotypic responses of ecologically important traits were later reverted. The biogeochemically important trait of calcification, in particular, that had initially been restored within the first year of evolution was later reduced to levels lower than the performance of nonadapted populations under OA. Calcification was not constitutively lost but returned to control treatment levels when high CO2–adapted isolates were transferred back to present-day control CO2 conditions. Selection under elevated CO2 exacerbated a general decrease of cell sizes under long-term laboratory evolution. Our results show that phytoplankton may evolve complex phenotypic plasticity that can affect biogeochemically important traits, such as calcification. Adaptive evolution may play out over longer time scales (>1 year) in an unforeseen way under future ocean conditions that cannot be predicted from initial adaptation responses. PMID:27419227

  4. Long-term dynamics of adaptive evolution in a globally important phytoplankton species to ocean acidification.

    Science.gov (United States)

    Schlüter, Lothar; Lohbeck, Kai T; Gröger, Joachim P; Riebesell, Ulf; Reusch, Thorsten B H

    2016-07-01

    Marine phytoplankton may adapt to ocean change, such as acidification or warming, because of their large population sizes and short generation times. Long-term adaptation to novel environments is a dynamic process, and phenotypic change can take place thousands of generations after exposure to novel conditions. We conducted a long-term evolution experiment (4 years = 2100 generations), starting with a single clone of the abundant and widespread coccolithophore Emiliania huxleyi exposed to three different CO2 levels simulating ocean acidification (OA). Growth rates as a proxy for Darwinian fitness increased only moderately under both levels of OA [+3.4% and +4.8%, respectively, at 1100 and 2200 μatm partial pressure of CO2 (Pco2)] relative to control treatments (ambient CO2, 400 μatm). Long-term adaptation to OA was complex, and initial phenotypic responses of ecologically important traits were later reverted. The biogeochemically important trait of calcification, in particular, that had initially been restored within the first year of evolution was later reduced to levels lower than the performance of nonadapted populations under OA. Calcification was not constitutively lost but returned to control treatment levels when high CO2-adapted isolates were transferred back to present-day control CO2 conditions. Selection under elevated CO2 exacerbated a general decrease of cell sizes under long-term laboratory evolution. Our results show that phytoplankton may evolve complex phenotypic plasticity that can affect biogeochemically important traits, such as calcification. Adaptive evolution may play out over longer time scales (>1 year) in an unforeseen way under future ocean conditions that cannot be predicted from initial adaptation responses. PMID:27419227

  5. Estimating the Importance of Private Adaptation to Climate Change in Agriculture: A Review of Empirical Methods

    Science.gov (United States)

    Moore, F.; Burke, M.

    2015-12-01

    A wide range of studies using a variety of methods strongly suggest that climate change will have a negative impact on agricultural production in many areas. Farmers though should be able to learn about a changing climate and to adjust what they grow and how they grow it in order to reduce these negative impacts. However, it remains unclear how effective these private (autonomous) adaptations will be, or how quickly they will be adopted. Constraining the uncertainty on this adaptation is important for understanding the impacts of climate change on agriculture. Here we review a number of empirical methods that have been proposed for understanding the rate and effectiveness of private adaptation to climate change. We compare these methods using data on agricultural yields in the United States and western Europe.

  6. Dangerous climate change and the importance of adaptation for the Arctic's Inuit population

    Science.gov (United States)

    Ford, James D.

    2009-04-01

    The Arctic's climate is changing rapidly, to the extent that 'dangerous' climate change as defined by the United Nations Framework Convention on Climate Change might already be occurring. These changes are having implications for the Arctic's Inuit population and are being exacerbated by the dependence of Inuit on biophysical resources for livelihoods and the low socio-economic-health status of many northern communities. Given the nature of current climate change and projections of a rapidly warming Arctic, climate policy assumes a particular importance for Inuit regions. This paper argues that efforts to stabilize and reduce greenhouse gas emissions are urgent if we are to avoid runaway climate change in the Arctic, but are unlikely to prevent changes that will be dangerous for Inuit. In this context, a new policy discourse on climate change is required for Arctic regions—one that focuses on adaptation. The paper demonstrates that states with Inuit populations and the international community in general have obligations to assist Inuit to adapt to climate change through international human rights and climate change treaties. However, the adaptation deficit, in terms of what we know and what we need to know to facilitate successful adaptation, is particularly large in an Arctic context and limits the ability to develop response options. Moreover, adaptation as an option of response to climate change is still marginal in policy negotiations and Inuit political actors have been slow to argue the need for adaptation assistance. A new focus on adaptation in both policy negotiations and scientific research is needed to enhance Inuit resilience and reduce vulnerability in a rapidly changing climate.

  7. Contractile activity-induced adaptations in the mitochondrial protein import system.

    Science.gov (United States)

    Takahashi, M; Chesley, A; Freyssenet, D; Hood, D A

    1998-05-01

    We previously demonstrated that subsarcolemmal (SS) and intermyofibrillar (IMF) mitochondrial subfractions import proteins at different rates. This study was undertaken to investigate 1) whether protein import is altered by chronic contractile activity, which induces mitochondrial biogenesis, and 2) whether these two subfractions adapt similarly. Using electrical stimulation (10 Hz, 3 h/day for 7 and 14 days) to induce contractile activity, we observed that malate dehydrogenase import into the matrix of the SS and IMF mitochondria isolated from stimulated muscle was significantly increased by 1.4- to 1.7-fold, although the pattern of increase differed for each subfraction. This acceleration of import may be mitochondrial compartment-specific, since the import of Bcl-2 into the outer membrane was not affected. Contractile activity also modified the mitochondrial content of proteins comprising the import machinery, as evident from increases in the levels of the intramitochondrial chaperone mtHSP70 as well as the outer membrane import receptor Tom20 in SS and IMF mitochondria. Addition of cytosol isolated from stimulated or control muscles to the import reaction resulted in similar twofold increases in the ability of mitochondria to import malate dehydrogenase, despite elevations in the concentration of mitochondrial import-stimulating factor within the cytosol of chronically stimulated muscle. These results suggest that chronic contractile activity modifies the extra- and intramitochondrial environments in a fashion that favors the acceleration of precursor protein import into the matrix of the organelle. This increase in protein import is likely an important adaptation in the overall process of mitochondrial biogenesis. PMID:9612226

  8. Simulation of a Jackson tandem network using state-dependent importance sampling

    NARCIS (Netherlands)

    D.I. Miretskiy; W.R.W. Scheinhardt; M.R.H. Mandjes

    2008-01-01

    This paper considers importance sampling as a tool for rare-event simulation. The focus is on estimating the probability of overflow in the downstream queue of a Jackson two-node tandem queue. It is known that in this setting `traditional' state-independent importance-sampling distributions perform

  9. Simulation of a Jackson tandem network using state-dependent importance sampling

    NARCIS (Netherlands)

    Miretskiy, D.I.; Scheinhardt, W.R.W.; Mandjes, M.R.H.

    2008-01-01

    This paper considers importance sampling as a tool for rare-event simulation. The focus is on estimating the probability of overflow in the downstream queue of a Jackson two-node tandem queue. It is known that in this setting ‘traditional’ state-independent importance-sampling distributions perform po

  10. Parks, people, and change: the importance of multistakeholder engagement in adaptation planning for conserved areas

    Directory of Open Access Journals (Sweden)

    Corrine N. Knapp

    2014-12-01

    Climate change challenges the traditional goals and conservation strategies of protected areas, necessitating adaptation to changing conditions. Denali National Park and Preserve (Denali), in south central Alaska, USA, is a vast landscape that is responding to climate change in ways that will impact both ecological resources and local communities. Local observations help to inform understanding of climate change and adaptation planning, but whose knowledge is most important to consider? For this project we interviewed long-term Denali staff, scientists, subsistence community members, bus drivers, and business owners to assess what types of observations each can contribute, how climate change is impacting each, and what they think the National Park Service should do to adapt. The project shows that each type of long-term observer has different types of observations, but that those who depend more directly on natural resources for their livelihoods have more and different observations than those who do not. These findings suggest that engaging multiple groups of stakeholders who interact with the park in distinct ways adds substantially to the information provided by Denali staff and scientists and offers a broader foundation for adaptation planning. It also suggests that traditional protected area paradigms that fail to learn from and foster appropriate engagement of people may be maladaptive in the context of climate change.

  11. Estimating the abundance of clustered animal population by using adaptive cluster sampling and negative binomial distribution

    Science.gov (United States)

    Bo, Yizhou; Shifa, Naima

    2013-09-01

    An estimator for finding the abundance of a rare, clustered and mobile population is introduced. The model is based on adaptive cluster sampling (ACS) to identify the location of the population and on the negative binomial distribution to estimate the total in each site. To identify the location of the population we consider both sampling with replacement (WR) and sampling without replacement (WOR). Some mathematical properties of the model are also developed.

  12. Iterative Monte Carlo with bead-adapted sampling for complex-time correlation functions

    Science.gov (United States)

    Jadhao, Vikram; Makri, Nancy

    2010-03-01

    In a recent communication [V. Jadhao and N. Makri, J. Chem. Phys. 129, 161102 (2008)], we introduced an iterative Monte Carlo (IMC) path integral methodology for calculating complex-time correlation functions. This method constitutes a stepwise evaluation of the path integral on a grid selected by a Monte Carlo procedure, circumventing the exponential growth of statistical error with increasing propagation time, while realizing the advantageous scaling of importance sampling in the grid selection and integral evaluation. In the present paper, we present an improved formulation of IMC, which is based on a bead-adapted sampling procedure, leading to grid point distributions that closely resemble the absolute value of the integrand at each iteration. We show that the statistical error of IMC does not grow upon repeated iteration, in sharp contrast to the performance of the conventional path integral approach, which leads to an exponential increase in statistical uncertainty. Numerical results on systems with up to 13 degrees of freedom and propagation up to 30 times the "thermal" time ℏβ/2 illustrate these features.

  13. Hybrid algorithm of ensemble transform and importance sampling for assimilation of non-Gaussian observations

    Directory of Open Access Journals (Sweden)

    Shin'ya Nakano

    2014-05-01

    A hybrid algorithm that combines the ensemble transform Kalman filter (ETKF) and the importance sampling approach is proposed. Since the ETKF assumes a linear Gaussian observation model, the estimate obtained by the ETKF can be biased in cases with nonlinear or non-Gaussian observations. The particle filter (PF) is based on the importance sampling technique, and is applicable to problems with nonlinear or non-Gaussian observations. However, the PF usually requires an unrealistically large sample size in order to achieve a good estimation, and thus it is computationally prohibitive. In the proposed hybrid algorithm, we obtain a proposal distribution similar to the posterior distribution by using the ETKF. A large number of samples are then drawn from the proposal distribution, and these samples are weighted to approximate the posterior distribution according to the importance sampling principle. Since the importance sampling provides an estimate of the probability density function (PDF) without assuming linearity or Gaussianity, we can resolve the bias due to the nonlinear or non-Gaussian observations. Finally, in the next forecast step, we reduce the sample size to achieve computational efficiency based on the Gaussian assumption, while we use a relatively large number of samples in the importance sampling in order to consider the non-Gaussian features of the posterior PDF. The use of the ETKF is also beneficial in terms of the computational simplicity of generating a number of random samples from the proposal distribution and in weighting each of the samples. The proposed algorithm is not necessarily effective in cases where the ensemble is located far from the true state. However, monitoring the effective sample size and tuning the factor for covariance inflation could resolve this problem. In this paper, the proposed hybrid algorithm is introduced and its performance is evaluated through experiments with non-Gaussian observations.
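
    The weighting step at the heart of such a hybrid can be illustrated with a minimal one-dimensional toy problem (hypothetical prior, Laplace observation error, and parameter values; this is not the authors' ensemble implementation): a Gaussian analysis distribution plays the role of the proposal, and importance weights correct it towards the exact posterior.

```python
import numpy as np

rng = np.random.default_rng(4)
y, b = 1.5, 0.5                        # observation and Laplace error scale

# Step 1: Gaussian (ETKF-like) analysis, treating the observation error as
# Gaussian with matching variance R = 2 * b**2; the prior is N(0, 1).
R = 2 * b**2
gain = 1.0 / (1.0 + R)
m_a, s_a = gain * y, np.sqrt(R / (1.0 + R))

# Step 2: draw a large sample from this Gaussian proposal and importance-weight
# it with the exact prior and the Laplace likelihood (normalising constants
# cancel once the weights are normalised).
n = 50_000
x = rng.normal(m_a, s_a, n)
logw = (-0.5 * x**2) + (-np.abs(y - x) / b) - (-0.5 * ((x - m_a) / s_a) ** 2)
w = np.exp(logw - logw.max())
w /= w.sum()

posterior_mean = np.sum(w * x)
ess = 1.0 / np.sum(w**2)               # effective sample size, to monitor degeneracy
print(posterior_mean, ess)
```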

  14. Assessing employability capacities and career adaptability in a sample of human resource professionals

    OpenAIRE

    Melinde Coetzee; Nadia Ferreira; Ingrid L. Potgieter

    2015-01-01

    Orientation: Employers have come to recognise graduates’ employability capacities and their ability to adapt to new work demands as important human capital resources for sustaining a competitive business advantage.Research purpose: The study sought (1) to ascertain whether a significant relationship exists between a set of graduate employability capacities and a set of career adaptability capacities and (2) to identify the variables that contributed the most to this relationship.Motivation fo...

  15. Efficient calculation of risk measures by importance sampling -- the heavy tailed case

    CERN Document Server

    Hult, Henrik

    2009-01-01

    Computation of extreme quantiles and tail-based risk measures using standard Monte Carlo simulation can be inefficient. A method to speed up computations is provided by importance sampling. We show that importance sampling algorithms, designed for efficient tail probability estimation, can significantly improve Monte Carlo estimators of tail-based risk measures. In the heavy-tailed setting, when the random variable of interest has a regularly varying distribution, we provide sufficient conditions for the asymptotic relative error of importance sampling estimators of risk measures, such as Value-at-Risk and expected shortfall, to be small. The results are illustrated by some numerical examples.
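
    As a minimal illustration of importance sampling for tail-based risk measures with regularly varying tails (a hypothetical Pareto toy model, not the paper's estimators or efficiency proofs), the sketch below estimates a tail probability and the corresponding expected shortfall using a proposal with an even heavier tail, so that the likelihood ratio stays bounded on the rare event.

```python
import numpy as np

rng = np.random.default_rng(5)
u, n = 100.0, 100_000                  # threshold and sample size

# Target: Pareto with tail index 2 on [1, inf): f(x) = 2 x**-3, P(X > x) = x**-2.
# Proposal: Pareto with tail index 1, g(x) = x**-2, sampled as X = 1 / U.
x = 1.0 / rng.random(n)
lr = 2.0 / x                           # likelihood ratio f(x) / g(x)
hit = x > u

p_tail = np.mean(hit * lr)                     # estimates P(X > u) = 1e-4
shortfall = np.mean(x * hit * lr) / p_tail     # estimates E[X | X > u] = 2 * u
print(p_tail, shortfall)
```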

  16. Low Bit-Rate Image Compression using Adaptive Down-Sampling technique

    OpenAIRE

    V.Swathi; Prof. K ASHOK BABU

    2011-01-01

    In this paper, we use a practical approach of uniform down-sampling in image space, while making the sampling adaptive through spatially varying, directional low-pass pre-filtering. The resulting down-sampled, pre-filtered image remains a conventional square sample grid and, thus, can be compressed and transmitted without any change to current image coding standards and systems. The decoder first decompresses the low-resolution image and then up-converts it to the original resolut...

  17. Data reduction in the ITMS system through a data acquisition model with self-adaptive sampling rate

    Energy Technology Data Exchange (ETDEWEB)

    Ruiz, M. [Grupo de Investigacion en Instrumentacion y Acustica Aplicada, Universidad Politecnica de Madrid (UPM), Crta. Valencia Km-7, Madrid 28031 (Spain)], E-mail: mariano.ruiz@upm.es; Lopez, JM.; Arcas, G. de [Grupo de Investigacion en Instrumentacion y Acustica Aplicada, Universidad Politecnica de Madrid (UPM), Crta. Valencia Km-7, Madrid 28031 (Spain); Barrera, E. [Departamento de Sistemas Electronicos y de Control, Universidad Politecnica de Madrid (UPM), Crta. Valencia Km-7, Madrid 28031 (Spain); Melendez, R. [Grupo de Investigacion en Instrumentacion y Acustica Aplicada, Universidad Politecnica de Madrid (UPM), Crta. Valencia Km-7, Madrid 28031 (Spain); Vega, J. [Asociacion EURATOM/CIEMAT para Fusion, Madrid (Spain)

    2008-04-15

    Long pulse or steady state operation of fusion experiments requires data acquisition and processing systems that reduce the volume of data involved. The availability of self-adaptive sampling rate systems and the use of real-time lossless data compression techniques can help solve these problems. The former is important for the continuous adaptation of the sampling frequency to experimental requirements. The latter allows the maintenance of continuous digitization under limited memory conditions. This can be achieved by permanent transmission of compressed data to other systems. The compacted transfer ensures the use of minimum bandwidth. This paper presents an implementation based on the intelligent test and measurement system (ITMS), a data acquisition system architecture with multiprocessing capabilities that permits the system's sampling frequency to be adapted throughout the experiment. The sampling rate can be controlled depending on the experiment's specific requirements by using an external dc voltage signal or by defining user events through software. The system takes advantage of the high processing capabilities of the ITMS platform to implement a data reduction mechanism based on lossless data compression algorithms, which are themselves based on periodic deltas.
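
    A minimal sketch of lossless packing based on periodic deltas is given below (the block length and test signal are hypothetical; this is not the ITMS implementation): each block stores one absolute reference sample plus small integer differences, which are cheap to transmit or entropy-code.

```python
import numpy as np

def delta_encode(samples: np.ndarray, block: int = 256):
    """Split the signal into blocks of one reference sample plus integer deltas."""
    blocks = []
    for start in range(0, len(samples), block):
        chunk = samples[start:start + block]
        blocks.append((chunk[0], np.diff(chunk)))
    return blocks

def delta_decode(blocks):
    return np.concatenate([np.concatenate(([ref], ref + np.cumsum(deltas)))
                           for ref, deltas in blocks])

signal = (1000 * np.sin(np.linspace(0, 20, 5000))).astype(np.int32)
encoded = delta_encode(signal)
assert np.array_equal(signal, delta_decode(encoded))   # reconstruction is lossless
```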

  18. How to apply importance-sampling techniques to simulations of optical systems

    OpenAIRE

    McKinstrie, C. J.; Winzer, P. J.

    2003-01-01

    This report contains a tutorial introduction to the method of importance sampling. The use of this method is illustrated for simulations of the noise-induced energy jitter of return-to-zero pulses in optical communication systems.
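
    For readers unfamiliar with the method, a generic toy example of importance sampling is sketched below (a Gaussian tail probability with a mean-shifted proposal; this is not the optical-system jitter simulation discussed in the report): samples are drawn where the rare event actually occurs and reweighted by the likelihood ratio.

```python
import numpy as np

rng = np.random.default_rng(0)
n, threshold = 100_000, 4.0

# Naive Monte Carlo: hardly any samples fall in the tail, so the estimate is noisy.
x_mc = rng.standard_normal(n)
p_naive = np.mean(x_mc > threshold)

# Importance sampling: draw from N(threshold, 1), where the rare event is common,
# and reweight each sample by the likelihood ratio
# f(x) / g(x) = exp(-threshold * x + threshold**2 / 2).
x_is = rng.normal(threshold, 1.0, n)
weights = np.exp(-threshold * x_is + 0.5 * threshold**2)
p_is = np.mean((x_is > threshold) * weights)

print(p_naive, p_is)   # the true value is about 3.17e-5
```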

  19. Reward and Punishment based Cooperative Adaptive Sampling in Wireless Sensor Networks

    NARCIS (Netherlands)

    Masoum, Alireza; Meratnia, Nirvana; Taghikhaki, Zahra; Havinga, Paul J.M.

    2010-01-01

    Energy conservation is one of the main concerns in wireless sensor networks. One of the mechanisms to better manage energy in wireless sensor networks is adaptive sampling, by which instead of using a fixed frequency interval for sensing and data transmission, the wireless sensor network employs a d

  20. Career Adapt-Abilities Scale in a French-Speaking Swiss Sample: Psychometric Properties and Relationships to Personality and Work Engagement

    Science.gov (United States)

    Rossier, Jerome; Zecca, Gregory; Stauffer, Sarah D.; Maggiori, Christian; Dauwalder, Jean-Pierre

    2012-01-01

    The aim of this study was to analyze the psychometric properties of the Career Adapt-Abilities Scale (CAAS) in a French-speaking Swiss sample and its relationship with personality dimensions and work engagement. The heterogeneous sample of 391 participants (M[subscript age] = 39.59, SD = 12.30) completed the CAAS-International and a short version…

  1. The importance of the EU green paper on climate adaptation for the Netherlands

    International Nuclear Information System (INIS)

    An analysis of the EU Green Paper on Climate Adaptation shows that it is not inconsistent with the Dutch national adaptation strategy, but it does differ from it. The Green Paper also highlights the European dimension of climate adaptation and approaches climate adaptation in a broader context. Furthermore, the challenge is to remain alert and to bring Dutch ideas into the EU policy process when concrete measures are formulated.

  2. The use of importance sampling in a trial assessment to obtain converged estimates of radiological risk

    International Nuclear Information System (INIS)

    In developing a methodology for assessing potential sites for the disposal of radioactive wastes, the Department of the Environment has conducted a series of trial assessment exercises. In order to produce converged estimates of radiological risk using the SYVAC A/C simulation system, an efficient sampling procedure is required. Previous work has demonstrated that importance sampling can substantially increase sampling efficiency. This study used importance sampling to produce converged estimates of risk for the first DoE trial assessment. Four major nuclide chains were analysed. In each case importance sampling produced converged risk estimates with between 10 and 170 times fewer runs of the SYVAC A/C model. This increase in sampling efficiency can reduce the total elapsed time required to obtain a converged estimate of risk from one nuclide chain by a factor of 20. The results of this study suggest that the use of importance sampling could reduce the elapsed time required to perform a risk assessment of a potential site by a factor of ten. (author)

  3. Multiview Sample Classification Algorithm Based on L1-Graph Domain Adaptation Learning

    Directory of Open Access Journals (Sweden)

    Huibin Lu

    2015-01-01

    In the case of multiview sample classification with different distributions, the training and testing samples come from different domains. In order to improve classification performance, a multiview sample classification algorithm based on L1-Graph domain adaptation learning is presented. First, a framework of nonnegative matrix trifactorization based on domain adaptation learning is formed, in which the unchanged information is regarded as the bridge for knowledge transfer from the source domain to the target domain; second, an L1-Graph is constructed on the basis of sparse representation, so as to adaptively search for the nearest neighbor data and preserve the samples and the geometric structure; lastly, the two complementary objective functions are integrated into a unified optimization problem that is solved with an iterative algorithm, after which the classification of the testing samples is estimated. Comparative experiments are conducted on the USPS-Binary digital database, the Three-Domain Object Benchmark database, and the ALOI database; the experimental results verify the effectiveness of the proposed algorithm, which improves recognition accuracy and ensures the robustness of the algorithm.

  4. Sample based 3D face reconstruction from a single frontal image by adaptive locally linear embedding

    Institute of Scientific and Technical Information of China (English)

    ZHANG Jian; ZHUANG Yue-ting

    2007-01-01

    In this paper, we propose a highly automatic approach for 3D photorealistic face reconstruction from a single frontal image. The key point of our work is the implementation of an adaptive manifold learning approach. Beforehand, an active appearance model (AAM) is trained for automatic feature extraction, and an adaptive locally linear embedding (ALLE) algorithm is utilized to reduce the dimensionality of the 3D database. Then, given an input frontal face image, the corresponding weights between 3D samples and the image are synthesized adaptively according to the AAM-selected facial features. Finally, geometry reconstruction is achieved by linear weighted combination of adaptively selected samples. Radial basis function (RBF) is adopted to map facial texture from the frontal image to the reconstructed face geometry. The texture of invisible regions between the face and the ears is interpolated by sampling from the frontal image. This approach has several advantages: (1) Only a single frontal face image is needed for highly automatic face reconstruction; (2) Compared with former works, our reconstruction approach provides higher accuracy; (3) Constraint-based RBF texture mapping provides a natural appearance for the reconstructed face.

  5. Local Adaptation in European Firs Assessed through Extensive Sampling across Altitudinal Gradients in Southern Europe

    Science.gov (United States)

    Postolache, Dragos; Lascoux, Martin; Drouzas, Andreas D.; Källman, Thomas; Leonarduzzi, Cristina; Liepelt, Sascha; Piotti, Andrea; Popescu, Flaviu; Roschanski, Anna M.; Zhelev, Peter; Fady, Bruno; Vendramin, Giovanni Giuseppe

    2016-01-01

    Background: Local adaptation is a key driver of phenotypic and genetic divergence at loci responsible for adaptive trait variation in forest tree populations. Its experimental assessment requires rigorous sampling strategies such as those involving population pairs replicated across broad spatial scales. Methods: A hierarchical Bayesian model of selection (HBM) that explicitly considers both the replication of the environmental contrast and the hierarchical genetic structure among replicated study sites is introduced. Its power was assessed through simulations and compared to classical ‘within-site’ approaches (FDIST, BAYESCAN) and a simplified, within-site, version of the model introduced here (SBM). Results: HBM demonstrates that hierarchical approaches are very powerful to detect replicated patterns of adaptive divergence with low false-discovery (FDR) and false-non-discovery (FNR) rates compared to the analysis of different sites separately through within-site approaches. The hypothesis of local adaptation to altitude was further addressed by analyzing replicated Abies alba population pairs (low and high elevations) across the species’ southern distribution range, where the effects of climatic selection are expected to be the strongest. For comparison, a single population pair from the closely related species A. cephalonica was also analyzed. The hierarchical model did not detect any pattern of adaptive divergence to altitude replicated in the different study sites. Instead, idiosyncratic patterns of local adaptation among sites were detected by within-site approaches. Conclusion: Hierarchical approaches may miss idiosyncratic patterns of adaptation among sites, and we strongly recommend the use of both hierarchical (multi-site) and classical (within-site) approaches when addressing the question of adaptation across broad spatial scales. PMID:27392065

  6. Coalescent: an open-science framework for importance sampling in coalescent theory.

    Science.gov (United States)

    Tewari, Susanta; Spouge, John L

    2015-01-01

    Background. In coalescent theory, computer programs often use importance sampling to calculate likelihoods and other statistical quantities. An importance sampling scheme can exploit human intuition to improve statistical efficiency of computations, but unfortunately, in the absence of general computer frameworks on importance sampling, researchers often struggle to translate new sampling schemes computationally or benchmark against different schemes, in a manner that is reliable and maintainable. Moreover, most studies use computer programs lacking a convenient user interface or the flexibility to meet the current demands of open science. In particular, current computer frameworks can only evaluate the efficiency of a single importance sampling scheme or compare the efficiencies of different schemes in an ad hoc manner. Results. We have designed a general framework (http://coalescent.sourceforge.net; language: Java; License: GPLv3) for importance sampling that computes likelihoods under the standard neutral coalescent model of a single, well-mixed population of constant size over time following infinite sites model of mutation. The framework models the necessary core concepts, comes integrated with several data sets of varying size, implements the standard competing proposals, and integrates tightly with our previous framework for calculating exact probabilities. For a given dataset, it computes the likelihood and provides the maximum likelihood estimate of the mutation parameter. Well-known benchmarks in the coalescent literature validate the accuracy of the framework. The framework provides an intuitive user interface with minimal clutter. For performance, the framework switches automatically to modern multicore hardware, if available. It runs on three major platforms (Windows, Mac and Linux). Extensive tests and coverage make the framework reliable and maintainable. Conclusions. In coalescent theory, many studies of computational efficiency consider only

  7. An Adaptive Defect Weighted Sampling Algorithm to Design Pseudoknotted RNA Secondary Structures.

    Science.gov (United States)

    Zandi, Kasra; Butler, Gregory; Kharma, Nawwaf

    2016-01-01

    Computational design of RNA sequences that fold into targeted secondary structures has many applications in biomedicine, nanotechnology and synthetic biology. An RNA molecule is made of different types of secondary structure elements and an important RNA element named pseudoknot plays a key role in stabilizing the functional form of the molecule. However, due to the computational complexities associated with characterizing pseudoknotted RNA structures, most of the existing RNA sequence designer algorithms generally ignore this important structural element and therefore limit their applications. In this paper we present a new algorithm to design RNA sequences for pseudoknotted secondary structures. We use NUPACK as the folding algorithm to compute the equilibrium characteristics of the pseudoknotted RNAs, and describe a new adaptive defect weighted sampling algorithm named Enzymer to design low ensemble defect RNA sequences for targeted secondary structures including pseudoknots. We used a biological data set of 201 pseudoknotted structures from the Pseudobase library to benchmark the performance of our algorithm. We compared the quality characteristics of the RNA sequences we designed by Enzymer with the results obtained from the state of the art MODENA and antaRNA. Our results show our method succeeds more frequently than MODENA and antaRNA do, and generates sequences that have lower ensemble defect, lower probability defect and higher thermostability. Finally by using Enzymer and by constraining the design to a naturally occurring and highly conserved Hammerhead motif, we designed 8 sequences for a pseudoknotted cis-acting Hammerhead ribozyme. Enzymer is available for download at https://bitbucket.org/casraz/enzymer. PMID:27499762

  8. FloodNet: Coupling Adaptive Sampling with Energy Aware Routing in a Flood Warning System

    Institute of Scientific and Technical Information of China (English)

    Jing Zhou; David De Roure

    2007-01-01

    We describe the design of FloodNet, a flood warning system, which uses a grid-based flood predictor model developed by environmental experts to make flood predictions based on readings of water level collected by a set of sensor nodes. To optimize battery consumption, the reporting frequency of sensor nodes is required to be adaptive to local conditions as well as the flood predictor model. We therefore propose an energy aware routing protocol which allows sensor nodes to consume energy according to this need. This system is notable both for the adaptive sampling regime and the methodology adopted in the design of the adaptive behavior, which involved development of simulation tools and very close collaboration with environmental experts.

  9. Intraspecific shape variation in horseshoe crabs: the importance of sexual and natural selection for local adaptation

    DEFF Research Database (Denmark)

    Faurby, Søren; Nielsen, Kasper Sauer Kollerup; Bussarawit, Somchai;

    2011-01-01

    A morphometric analysis of the body shape of three species of horseshoe crabs was undertaken in order to infer the importance of natural and sexual selection. It was expected that natural selection would be most intense, leading to highest regional differentiation, in the American species Limulus...... polyphemus, which has the largest climatic differences between different populations. Local adaptation driven by sexual selection was expected in males but not females because horseshoe crab mating behaviour leads to competition between males, but not between females. Three hundred fifty-nine horseshoe crabs...... of geographically-based intraspecific variation. An admixture analysis showed regional intraspecific differentiation for males and females of L. polyphemus and males of the Asian horseshoe crab Carcinoscorpius rotundicauda, but not for females of C. rotundicauda and another Asian horseshoe crab, Tachypleus gigas...

  10. Assessing employability capacities and career adaptability in a sample of human resource professionals

    Directory of Open Access Journals (Sweden)

    Melinde Coetzee

    2015-03-01

    Full Text Available Orientation: Employers have come to recognise graduates’ employability capacities and their ability to adapt to new work demands as important human capital resources for sustaining a competitive business advantage. Research purpose: The study sought (1) to ascertain whether a significant relationship exists between a set of graduate employability capacities and a set of career adaptability capacities and (2) to identify the variables that contributed the most to this relationship. Motivation for the study: Global competitive markets and technological advances are increasingly driving the demand for graduate knowledge and skills in a wide variety of jobs. Contemporary career theory further emphasises career adaptability across the lifespan as a critical skill for career management agency. Despite the apparent importance attached to employees’ employability and career adaptability, there seems to be a general lack of research investigating the association between these constructs. Research approach, design and method: A cross-sectional, quantitative research design approach was followed. Descriptive statistics, Pearson product-moment correlations and canonical correlation analysis were performed to achieve the objective of the study. The participants (N = 196) were employed in professional positions in the human resource field and were predominantly early career black people and women. Main findings: The results indicated positive multivariate relationships between the variables and showed that lifelong learning capacities and problem solving, decision-making and interactive skills contributed the most to explaining the participants’ career confidence, career curiosity and career control. Practical/managerial implications: The study suggests that developing professional graduates’ employability capacities may strengthen their career adaptability. These capacities were shown to explain graduates’ active engagement in career management strategies

  11. Adaptation to climate change and climate variability:The importance of understanding agriculture as performance

    NARCIS (Netherlands)

    Crane, T.A.; Roncoli, C.; Hoogenboom, G.

    2011-01-01

    Most climate change studies that address potential impacts and potential adaptation strategies are largely based on modelling technologies. While models are useful for visualizing potential future outcomes and evaluating options for potential adaptation, they do not adequately represent and integrat

  12. Adaptation to climate change and climate variability in European agriculture: The importance of farm level responses

    NARCIS (Netherlands)

    Reidsma, P.; Ewert, F.; Oude Lansink, A.G.J.M.; Leemans, R.

    2010-01-01

    Climatic conditions and hence climate change influence agriculture. Most studies that addressed the vulnerability of agriculture to climate change have focused on potential impacts without considering adaptation. When adaptation strategies are considered, socio-economic conditions and farm managemen

  13. Low Bit-Rate Image Compression using Adaptive Down-Sampling technique

    Directory of Open Access Journals (Sweden)

    V.Swathi

    2011-09-01

    Full Text Available In this paper, we use a practical approach of uniform down-sampling in image space while making the sampling adaptive by spatially varying, directional low-pass pre-filtering. The resulting down-sampled pre-filtered image remains a conventional square sample grid, and, thus, it can be compressed and transmitted without any change to current image coding standards and systems. The decoder first decompresses the low-resolution image and then up-converts it to the original resolution in a constrained least squares restoration process, using a 2-D piecewise autoregressive model and the knowledge of directional low-pass pre-filtering. The proposed compression approach of collaborative adaptive down-sampling and up-conversion (CADU) outperforms JPEG 2000 in PSNR measure at low to medium bit rates and achieves superior visual quality, as well. The superior low bit-rate performance of the CADU approach seems to suggest that over-sampling not only wastes hardware resources and energy but can also be counterproductive to image quality given a tight bit budget.

  14. Using continuous in-situ measurements to adaptively trigger urban storm water samples

    Science.gov (United States)

    Wong, B. P.; Kerkez, B.

    2015-12-01

    Until cost-effective in-situ sensors are available for biological parameters, nutrients and metals, automated samplers will continue to be the primary source of reliable water quality measurements. Given limited sample bottles, however, autosamplers often obscure insights on nutrient sources and biogeochemical processes which would otherwise be captured using a continuous sampling approach. To that end, we evaluate the efficacy of a novel method to measure first-flush nutrient dynamics in flashy, urban watersheds. Our approach reduces the number of samples required to capture water quality dynamics by leveraging an internet-connected sensor node, which is equipped with a suite of continuous in-situ sensors and an automated sampler. To capture both the initial baseflow as well as storm concentrations, a cloud-hosted adaptive algorithm analyzes the high-resolution sensor data along with local weather forecasts to optimize a sampling schedule. The method was tested in a highly developed urban catchment in Ann Arbor, Michigan and collected samples of nitrate, phosphorus, and suspended solids throughout several storm events. Results indicate that the watershed does not exhibit first flush dynamics, a behavior that would have been obscured when using a non-adaptive sampling approach.

  15. Enhanced modeling via network theory: Adaptive sampling of Markov state models

    OpenAIRE

    Bowman, Gregory R; Ensign, Daniel L.; Pande, Vijay S.

    2010-01-01

    Computer simulations can complement experiments by providing insight into molecular kinetics with atomic resolution. Unfortunately, even the most powerful supercomputers can only simulate small systems for short timescales, leaving modeling of most biologically relevant systems and timescales intractable. In this work, however, we show that molecular simulations driven by adaptive sampling of networks called Markov State Models (MSMs) can yield tremendous time and resource savings, allowing p...

  16. Efficient adaptive designs with mid-course sample size adjustment in clinical trials

    CERN Document Server

    Bartroff, Jay

    2011-01-01

    Adaptive designs have been proposed for clinical trials in which the nuisance parameters or alternative of interest are unknown or likely to be misspecified before the trial. Whereas most previous works on adaptive designs and mid-course sample size re-estimation have focused on two-stage or group sequential designs in the normal case, we consider here a new approach that involves at most three stages and is developed in the general framework of multiparameter exponential families. Not only does this approach maintain the prescribed type I error probability, but it also provides a simple but asymptotically efficient sequential test whose finite-sample performance, measured in terms of the expected sample size and power functions, is shown to be comparable to the optimal sequential design, determined by dynamic programming, in the simplified normal mean case with known variance and prespecified alternative, and superior to the existing two-stage designs and also to adaptive group sequential designs when the al...

  17. Adaptation and Validation of the Sexual Assertiveness Scale (SAS) in a Sample of Male Drug Users.

    Science.gov (United States)

    Vallejo-Medina, Pablo; Sierra, Juan Carlos

    2015-01-01

    The aim of the present study was to adapt and validate the Sexual Assertiveness Scale (SAS) in a sample of male drug users. A sample of 326 male drug users and 322 non-clinical males was selected by cluster sampling and convenience sampling, respectively. Results showed that the scale had good psychometric properties and adequate internal consistency reliability (Initiation = .66, Refusal = .74 and STD-P = .79). An evaluation of the invariance showed strong factor equivalence between both samples. A high and moderate effect of Differential Item Functioning was only found in items 1 and 14 (ΔR² Nagelkerke = .076 and .037, respectively). We strongly recommend not using item 1 if the goal is to compare the scores of both groups, otherwise the comparison will be biased. Correlations obtained between the CSFQ-14 and the safe sex ratio and the SAS subscales were significant (CI = 95%) and indicated good concurrent validity. Scores of male drug users were similar to those of non-clinical males. Therefore, the adaptation of the SAS to drug users provides enough guarantees for reliable and valid use in both clinical practice and research, although care should be taken with item 1. PMID:25896498

  18. Decomposition and (importance) sampling techniques for multi-stage stochastic linear programs

    Energy Technology Data Exchange (ETDEWEB)

    Infanger, G.

    1993-11-01

    The difficulty of solving large-scale multi-stage stochastic linear programs arises from the sheer number of scenarios associated with numerous stochastic parameters. The number of scenarios grows exponentially with the number of stages and problems get easily out of hand even for very moderate numbers of stochastic parameters per stage. Our method combines dual (Benders) decomposition with Monte Carlo sampling techniques. We employ importance sampling to efficiently obtain accurate estimates of both expected future costs and gradients and right-hand sides of cuts. The method enables us to solve practical large-scale problems with many stages and numerous stochastic parameters per stage. We discuss the theory of sharing and adjusting cuts between different scenarios in a stage. We derive probabilistic lower and upper bounds, where we use importance path sampling for the upper bound estimation. Initial numerical results turned out to be promising.
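
    The record above describes where importance sampling enters (estimating expected future costs and the coefficients of cuts) without giving the estimator itself. As a hedged illustration of the weighting idea only, the sketch below estimates an expectation over scenarios under a shifted proposal; the cost function and distributions are hypothetical placeholders, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(0)

def recourse_cost(xi):
    # Hypothetical second-stage cost: non-zero only for large demand scenarios.
    return np.maximum(xi - 2.0, 0.0)

n = 10_000
# Nominal scenario law: demand ~ N(0, 1).  Proposal: N(2, 1), which puts more
# mass on the scenarios that actually contribute to the expected cost.
xi = rng.normal(2.0, 1.0, n)
log_w = -0.5 * xi**2 + 0.5 * (xi - 2.0) ** 2      # log of nominal density / proposal density
estimate_is = np.mean(np.exp(log_w) * recourse_cost(xi))

estimate_mc = np.mean(recourse_cost(rng.normal(0.0, 1.0, n)))   # crude Monte Carlo, for comparison
print(estimate_is, estimate_mc)
```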

  19. Importance sampling for Lambda-coalescents in the infinitely many sites model

    CERN Document Server

    Birkner, Matthias; Steinruecken, Matthias; 10.1016/j.tpb.2011.01.005

    2011-01-01

    We present and discuss new importance sampling schemes for the approximate computation of the sample probability of observed genetic types in the infinitely many sites model from population genetics. More specifically, we extend the 'classical framework', where genealogies are assumed to be governed by Kingman's coalescent, to the more general class of Lambda-coalescents and develop further Hobolth et al.'s (2008) idea of deriving importance sampling schemes based on 'compressed genetrees'. The resulting schemes extend earlier work by Griffiths and Tavaré (1994), Stephens and Donnelly (2000), Birkner and Blath (2008) and Hobolth et al. (2008). We conclude with a performance comparison of classical and new schemes for Beta- and Kingman coalescents.

  20. Adapting chain referral methods to sample new migrants: Possibilities and limitations

    Directory of Open Access Journals (Sweden)

    Lucinda Platt

    2015-09-01

    Full Text Available Background: Demographic research on migration requires representative samples of migrant populations. Yet recent immigrants, who are particularly informative about current migrant flows, are difficult to capture even in specialist surveys. Respondent-driven sampling (RDS), a chain referral sampling and analysis technique, potentially offers the opportunity to achieve population-level inference of recently arrived migrant populations. Objective: We evaluate the attempt to use RDS to sample two groups of migrants, from Pakistan and Poland, who had arrived in the UK within the previous 18 months, and we present an alternative approach adapted to recent migrants. Methods: We discuss how connectedness, privacy, clustering, and motivation are expected to differ among recently arrived migrants, compared to typical applications of RDS. We develop a researcher-led chain referral approach, and compare success in recruitment and indicators of representativeness to standard RDS recruitment. Results: Our researcher-led approach led to higher rates of chain-referral, and enabled us to reach population members with smaller network sizes. The researcher-led approach resulted in similar recruiter-recruit transition probabilities to traditional RDS across many demographic and social characteristics. However, we did not succeed in building up long referral chains, largely due to the lack of connectedness of our target populations and some reluctance to refer. There were some differences between the two migrant groups, with less mobile and less hidden Pakistani men producing longer referral chains. Conclusions: Chain referral is difficult to implement for sampling newly arrived migrants. However, our researcher-led adaptation shows promise for less hidden and more stable recent immigrant populations. Contribution: The paper offers an evaluation of RDS for surveying recent immigrants and an adaptation that may be effective under certain conditions.

  1. Social Daydreaming and Adjustment: An Experience-Sampling Study of Socio-Emotional Adaptation During a Life Transition.

    Science.gov (United States)

    Poerio, Giulia L; Totterdell, Peter; Emerson, Lisa-Marie; Miles, Eleanor

    2016-01-01

    Estimates suggest that up to half of waking life is spent daydreaming; that is, engaged in thought that is independent of, and unrelated to, one's current task. Emerging research indicates that daydreams are predominately social suggesting that daydreams may serve socio-emotional functions. Here we explore the functional role of social daydreaming for socio-emotional adjustment during an important and stressful life transition (the transition to university) using experience-sampling with 103 participants over 28 days. Over time, social daydreams increased in their positive characteristics and positive emotional outcomes; specifically, participants reported that their daydreams made them feel more socially connected and less lonely, and that the content of their daydreams became less fanciful and involved higher quality relationships. These characteristics then predicted less loneliness at the end of the study, which, in turn was associated with greater social adaptation to university. Feelings of connection resulting from social daydreams were also associated with less emotional inertia in participants who reported being less socially adapted to university. Findings indicate that social daydreaming is functional for promoting socio-emotional adjustment to an important life event. We highlight the need to consider the social content of stimulus-independent cognitions, their characteristics, and patterns of change, to specify how social thoughts enable socio-emotional adaptation.

  2. Improved Algorithms and Coupled Neutron-Photon Transport for Auto-Importance Sampling Method

    CERN Document Server

    Wang, Xin; Qiu, Rui; Li, Chun-Yan; Liang, Man-Chun; Zhang, Hui; Li, Jun-Li

    2016-01-01

    The Auto-Importance Sampling (AIS) method is a Monte Carlo variance reduction technique proposed by Tsinghua University for deep penetration problems, which can improve computational efficiency significantly without pre-calculation of the importance distribution. However, the AIS method has only been validated on several basic deep penetration problems with simple geometries and cannot be used for coupled neutron-photon transport. This paper first presents the latest algorithm improvements for the AIS method, including particle transport, fictitious particle creation and adjustment, fictitious surface geometry, random number allocation and calculation of the estimated relative error, which make the AIS method applicable to complicated deep penetration problems. Then, a coupled Neutron-Photon Auto-Importance Sampling (NP-AIS) method is proposed to apply the AIS method with the improved algorithms in coupled neutron-photon Monte Carlo transport. Finally, the NUREG/CR-6115 PWR benchmark model is calculated with the method of geometry splitti...

  3. Adaptive sampling strategy support for the unlined chromic acid pit, chemical waste landfill, Sandia National Laboratories, Albuquerque, New Mexico

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, R.L.

    1993-11-01

    Adaptive sampling programs offer substantial savings in time and money when assessing hazardous waste sites. Key to some of these savings is the ability to adapt a sampling program to the real-time data generated by an adaptive sampling program. This paper presents a two-prong approach to supporting adaptive sampling programs: a specialized object-oriented database/geographical information system (SitePlanner™) for data fusion, management, and display and combined Bayesian/geostatistical methods (PLUME) for contamination-extent estimation and sample location selection. This approach is applied in a retrospective study of a subsurface chromium plume at Sandia National Laboratories' chemical waste landfill. Retrospective analyses suggest the potential for characterization cost savings on the order of 60% through a reduction in the number of sampling programs, total number of soil boreholes, and number of samples analyzed from each borehole.

  4. Adaptive Sampling approach to environmental site characterization at Joliet Army Ammunition Plant: Phase 2 demonstration

    Energy Technology Data Exchange (ETDEWEB)

    Bujewski, G.E. [Sandia National Labs., Albuquerque, NM (United States). Environmental Restoration Technologies Dept.; Johnson, R.L. [Argonne National Lab., IL (United States)

    1996-04-01

    Adaptive sampling programs provide real opportunities to save considerable time and money when characterizing hazardous waste sites. This Strategic Environmental Research and Development Program (SERDP) project demonstrated two decision-support technologies, SitePlanner™ and Plume™, that can facilitate the design and deployment of an adaptive sampling program. A demonstration took place at Joliet Army Ammunition Plant (JAAP), and was unique in that it was tightly coupled with ongoing Army characterization work at the facility, with close scrutiny by both state and federal regulators. The demonstration was conducted in partnership with the Army Environmental Center's (AEC) Installation Restoration Program and AEC's Technology Development Program. AEC supported researchers from Tufts University who demonstrated innovative field analytical techniques for the analysis of TNT and DNT. SitePlanner™ is an object-oriented database specifically designed for site characterization that provides an effective way to compile, integrate, manage and display site characterization data as it is being generated. Plume™ uses a combination of Bayesian analysis and geostatistics to provide technical staff with the ability to quantitatively merge soft and hard information for an estimate of the extent of contamination. Plume™ provides an estimate of contamination extent, measures the uncertainty associated with the estimate, determines the value of additional sampling, and locates additional samples so that their value is maximized.

  5. Performance evaluation of an importance sampling technique in a Jackson network

    Science.gov (United States)

    Mahdipour, Ebrahim; Rahmani, Amir Masoud; Setayeshi, Saeed

    2014-03-01

    Importance sampling is a technique that is commonly used to speed up Monte Carlo simulation of rare events. However, little is known regarding the design of efficient importance sampling algorithms in the context of queueing networks. The standard approach, which simulates the system using an a priori fixed change of measure suggested by large deviation analysis, has been shown to fail in even the simplest network settings. Estimating probabilities associated with rare events has been a topic of great importance in queueing theory, and in applied probability at large. In this article, we analyse the performance of an importance sampling estimator for a rare event probability in a Jackson network. This article applies strict customer deadlines to a two-node Jackson network with feedback whose arrival and service rates are modulated by an exogenous finite-state Markov process. We have estimated the probability of network blocking for various sets of parameters, and also the probability of missing the deadline of customers for different loads and deadlines. We have finally shown that the probability of total population overflow may be affected by various deadline values, service rates and arrival rates.
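
    The network in the record is modulated and two-node, which is beyond a short sketch; as a hedged illustration of the underlying rare-event idea only, the code below estimates an overflow probability for a plain M/M/1 queue using the classical change of measure that swaps arrival and service rates. The parameters are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)

def overflow_prob(lam, mu, N, n_runs, swap=True):
    """IS estimate of P(queue reaches N before emptying | start at 1) for an M/M/1 queue."""
    p_nom = lam / (lam + mu)                       # nominal P(next event is an arrival)
    p_sim = mu / (lam + mu) if swap else p_nom     # swapped rates under the IS measure
    total = 0.0
    for _ in range(n_runs):
        x, log_w = 1, 0.0
        while 0 < x < N:
            if rng.random() < p_sim:               # arrival under the simulation measure
                log_w += np.log(p_nom / p_sim)
                x += 1
            else:                                  # departure
                log_w += np.log((1 - p_nom) / (1 - p_sim))
                x -= 1
        if x == N:
            total += np.exp(log_w)                 # weighted count of overflow paths
    return total / n_runs

rho = 0.3
print(overflow_prob(lam=rho, mu=1.0, N=20, n_runs=20_000))
print(rho**19 * (1 - rho) / (1 - rho**20))         # exact gambler's-ruin value, for comparison
```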

  6. The importance of trabecular hypertrophy in right ventricular adaptation to chronic pressure overload.

    Science.gov (United States)

    van de Veerdonk, Mariëlle C; Dusoswa, Sophie A; Marcus, J Tim; Bogaard, Harm-Jan; Spruijt, Onno; Kind, Taco; Westerhof, Nico; Vonk-Noordegraaf, Anton

    2014-02-01

    To assess the contribution of right ventricular (RV) trabeculae and papillary muscles (TPM) to RV mass and volumes in controls and patients with pulmonary arterial hypertension (PAH). Furthermore, to evaluate whether TPM shows a similar response as the RV free wall (RVFW) to changes in pulmonary artery pressure (PAP) during follow-up. 50 patients underwent cardiac magnetic resonance (CMR) and right heart catheterization at baseline and after one-year follow-up. Furthermore 20 controls underwent CMR. RV masses were assessed with and without TPM. TPM constituted a larger proportion of total RV mass and RV end-diastolic volume (RVEDV) in PAH than in controls (Mass: 35 ± 7 vs. 25 ± 5 %; p TPM mass was related to the RVFW mass in patients (baseline: R = 0.65; p TPM from the assessment resulted in altered RV mass, volumes and function than when included (all p TPM mass (β = 0.44; p = 0.004) but not the changes in RVFW mass (p = 0.095) were independently related to changes in PAP during follow-up. RV TPM showed a larger contribution to total RV mass in PAH (~35 %) compared to controls (~25 %). Inclusion of TPM in the analyses significantly influenced the magnitude of the RV volumes and mass. Furthermore, TPM mass was stronger related to changes in PAP than RVFW mass. Our results implicate that TPM are important contributors to RV adaptation during pressure overload and cannot be neglected from the RV assessment.

  7. Improved importance sampling technique for efficient simulation of digital communication systems

    Science.gov (United States)

    Lu, Dingqing; Yao, Kung

    1988-01-01

    A new, improved importance sampling (IIS) approach to simulation is considered. Some basic concepts of IS are introduced, and detailed evaluations of simulation estimation variances for Monte Carlo (MC) and IS simulations are given. The general results obtained from these evaluations are applied to the specific previously known conventional importance sampling (CIS) technique and the new IIS technique. The derivation for a linear system with no signal random memory is considered in some detail. For the CIS technique, the optimum input scaling parameter is found, while for the IIS technique, the optimum translation parameter is found. The results are generalized to a linear system with memory and signals. Specific numerical and simulation results are given which show the advantages of CIS over MC and IIS over CIS for simulations of digital communications systems.
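
    The abstract contrasts an optimum scaling parameter (CIS) with an optimum translation parameter (IIS) for the simulation density. As a hedged, stripped-down illustration, the sketch below estimates a Gaussian tail probability P(X > t), a stand-in for an error probability, with both a variance-scaled and a mean-translated proposal; the threshold and sample sizes are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(2)
t, n = 5.0, 100_000     # decision threshold (stand-in) and number of IS samples

def log_npdf(x, mu, sigma):
    return -0.5 * ((x - mu) / sigma) ** 2 - np.log(sigma) - 0.5 * np.log(2 * np.pi)

def is_estimate(mu_q, sigma_q):
    """Estimate P(X > t) for X ~ N(0, 1) using the proposal N(mu_q, sigma_q**2)."""
    x = rng.normal(mu_q, sigma_q, n)
    h = np.exp(log_npdf(x, 0.0, 1.0) - log_npdf(x, mu_q, sigma_q)) * (x > t)
    return h.mean(), h.std(ddof=1) / np.sqrt(n)   # estimate and its standard error

print("scaled proposal (CIS-like):     %.3e +/- %.1e" % is_estimate(0.0, t))
print("translated proposal (IIS-like): %.3e +/- %.1e" % is_estimate(t, 1.0))
# True value is about 2.87e-7; the translated proposal gives a much smaller standard error.
```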

  8. Performance evaluation of Bayesian decision feedback equalizer with M-PAM symbols using importance sampling simulation

    OpenAIRE

    Chen, S.

    2002-01-01

    An importance sampling (IS) simulation method is presented for evaluating the lower-bound symbol error rate (SER) of the Bayesian decision feedback equalizer (DFE) with M-PAM symbols, under the assumption of correct decision feedback. By exploiting an asymptotic property of the Bayesian DFE, a design procedure is developed, which chooses appropriate bias vectors for the simulation density to ensure asymptotic efficiency (AE) of the IS simulation.

  9. A laser microdissection-based workflow for FFPE tissue microproteomics: Important considerations for small sample processing.

    Science.gov (United States)

    Longuespée, Rémi; Alberts, Deborah; Pottier, Charles; Smargiasso, Nicolas; Mazzucchelli, Gabriel; Baiwir, Dominique; Kriegsmann, Mark; Herfs, Michael; Kriegsmann, Jörg; Delvenne, Philippe; De Pauw, Edwin

    2016-07-15

    Proteomic methods are today widely applied to formalin-fixed paraffin-embedded (FFPE) tissue samples for several applications in research, especially in molecular pathology. To date, there is an unmet need for the analysis of small tissue samples, such as for early cancerous lesions. Indeed, no method has yet been proposed for the reproducible processing of small FFPE tissue samples to allow biomarker discovery. In this work, we tested several procedures to process laser microdissected tissue pieces bearing less than 3000 cells. Combined with appropriate settings for liquid chromatography mass spectrometry-mass spectrometry (LC-MS/MS) analysis, a citric acid antigen retrieval (CAAR)-based procedure was established, allowing to identify more than 1400 proteins from a single microdissected breast cancer tissue biopsy. This work demonstrates important considerations concerning the handling and processing of laser microdissected tissue samples of extremely limited size, in the process opening new perspectives in molecular pathology. A proof of the proposed method for biomarker discovery, with respect to these specific handling considerations, is illustrated using the differential proteomic analysis of invasive breast carcinoma of no special type and invasive lobular triple-negative breast cancer tissues. This work will be of utmost importance for early biomarker discovery or in support of matrix-assisted laser desorption/ionization (MALDI) imaging for microproteomics from small regions of interest. PMID:26690073

  10. High-resolution in-depth imaging of optically cleared thick samples using an adaptive SPIM

    Science.gov (United States)

    Masson, Aurore; Escande, Paul; Frongia, Céline; Clouvel, Grégory; Ducommun, Bernard; Lorenzo, Corinne

    2015-11-01

    Today, Light Sheet Fluorescence Microscopy (LSFM) makes it possible to image fluorescent samples through depths of several hundreds of microns. However, LSFM also suffers from scattering, absorption and optical aberrations. Spatial variations in the refractive index inside the samples cause major changes to the light path resulting in loss of signal and contrast in the deepest regions, thus impairing in-depth imaging capability. These effects are particularly marked when inhomogeneous, complex biological samples are under study. Recently, chemical treatments have been developed to render a sample transparent by homogenizing its refractive index (RI), consequently enabling a reduction of scattering phenomena and a simplification of optical aberration patterns. One drawback of these methods is that the resulting RI of cleared samples does not match the working RI medium generally used for LSFM lenses. This RI mismatch leads to the presence of low-order aberrations and therefore to a significant degradation of image quality. In this paper, we introduce an original optical-chemical combined method based on an adaptive SPIM and a water-based clearing protocol enabling compensation for aberrations arising from RI mismatches induced by optical clearing methods and acquisition of high-resolution in-depth images of optically cleared complex thick samples such as Multi-Cellular Tumour Spheroids.

  11. Adaptive multi-sample-based photoacoustic tomography with imaging quality optimization

    Institute of Scientific and Technical Information of China (English)

    Yuxin Wang; Jie Yuan; Sidan Du; Xiaojun Liu; Guan Xu; Xueding Wang

    2015-01-01

    The energy of light exposed on human skin is strictly limited for safety reasons, which limits the power of the photoacoustic (PA) signal and its signal-to-noise ratio (SNR). Thus, the final reconstructed PA image quality is degraded. This Letter proposes an adaptive multi-sample-based approach to enhance the SNR of PA signals; in addition, detailed information in rebuilt PA images that used to be buried in the noise can be distinguished. Both ex vivo and in vivo experiments are conducted to validate the effectiveness of our proposed method, which demonstrates its potential value in clinical trials.

  12. Accelerating Markov chain Monte Carlo simulation by differential evolution with self-adaptive randomized subspace sampling

    Energy Technology Data Exchange (ETDEWEB)

    Vrugt, Jasper A [Los Alamos National Laboratory; Hyman, James M [Los Alamos National Laboratory; Robinson, Bruce A [Los Alamos National Laboratory; Higdon, Dave [Los Alamos National Laboratory; Ter Braak, Cajo J F [NETHERLANDS; Diks, Cees G H [UNIV OF AMSTERDAM

    2008-01-01

    Markov chain Monte Carlo (MCMC) methods have found widespread use in many fields of study to estimate the average properties of complex systems, and for posterior inference in a Bayesian framework. Existing theory and experiments prove convergence of well constructed MCMC schemes to the appropriate limiting distribution under a variety of different conditions. In practice, however this convergence is often observed to be disturbingly slow. This is frequently caused by an inappropriate selection of the proposal distribution used to generate trial moves in the Markov Chain. Here we show that significant improvements to the efficiency of MCMC simulation can be made by using a self-adaptive Differential Evolution learning strategy within a population-based evolutionary framework. This scheme, entitled DiffeRential Evolution Adaptive Metropolis or DREAM, runs multiple different chains simultaneously for global exploration, and automatically tunes the scale and orientation of the proposal distribution in randomized subspaces during the search. Ergodicity of the algorithm is proved, and various examples involving nonlinearity, high-dimensionality, and multimodality show that DREAM is generally superior to other adaptive MCMC sampling approaches. The DREAM scheme significantly enhances the applicability of MCMC simulation to complex, multi-modal search problems.
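
    A minimal sketch of the differential-evolution proposal that this family of samplers builds on, targeting a hypothetical 2-D Gaussian posterior. The full DREAM machinery (randomized subspace sampling, crossover probabilities, outlier handling, convergence diagnostics) is deliberately omitted.

```python
import numpy as np

rng = np.random.default_rng(3)

def log_post(x):
    # Hypothetical target: standard bivariate Gaussian log-density (up to a constant).
    return -0.5 * np.sum(x**2)

d, n_chains, n_iter = 2, 6, 5_000
gamma = 2.38 / np.sqrt(2 * d)                # commonly used differential-evolution jump scale
X = rng.normal(size=(n_chains, d))           # current state of every chain
kept = []

for it in range(n_iter):
    for i in range(n_chains):
        # Proposal: scaled difference of two other chains plus a small jitter term.
        r1, r2 = rng.choice([j for j in range(n_chains) if j != i], size=2, replace=False)
        prop = X[i] + gamma * (X[r1] - X[r2]) + rng.normal(0.0, 1e-6, d)
        if np.log(rng.random()) < log_post(prop) - log_post(X[i]):   # Metropolis acceptance
            X[i] = prop
    if it >= n_iter // 2:                    # keep the second half as posterior draws
        kept.append(X.copy())

draws = np.concatenate(kept)
print(draws.mean(axis=0), draws.std(axis=0))  # should be close to [0, 0] and [1, 1]
```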

  13. Adaptation to lactose in lactose malabsorbers - importance of the intestinal microflora

    OpenAIRE

    Fondén, Rangne

    2001-01-01

    At high intakes most lactose in lactose malabsorbers will be fermented by the intestinal microflora to hydrogen and other fermentation products, as are all other low-molecular, non-absorbable, fermentable carbohydrates. By adaptation higher intakes of lactose could be tolerated partly due to a lower net formation of hydrogen. This shift in fermentation is at least partly caused by a change in the activities of the intestinal microflora. Keywords: Adaptation, hydrogen production, lactose malab...

  14. Adaptive Kalman Filter Based on Adjustable Sampling Interval in Burst Detection for Water Distribution System

    Directory of Open Access Journals (Sweden)

    Doo Yong Choi

    2016-04-01

    Full Text Available Rapid detection of bursts and leaks in water distribution systems (WDSs) can reduce the social and economic costs incurred through direct loss of water into the ground, additional energy demand for water supply, and service interruptions. Many real-time burst detection models have been developed in accordance with the use of supervisory control and data acquisition (SCADA) systems and the establishment of district meter areas (DMAs). Nonetheless, no consideration has been given to how frequently a flow meter measures and transmits data for predicting breaks and leaks in pipes. This paper analyzes the effect of sampling interval when an adaptive Kalman filter is used for detecting bursts in a WDS. A new sampling algorithm is presented that adjusts the sampling interval depending on the normalized residuals of flow after filtering. The proposed algorithm is applied to a virtual sinusoidal flow curve and real DMA flow data obtained from Jeongeup city in South Korea. The simulation results prove that the self-adjusting algorithm for determining the sampling interval is efficient and maintains reasonable accuracy in burst detection. The proposed sampling method has a significant potential for water utilities to build and operate real-time DMA monitoring systems combined with smart customer metering systems.
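
    A hedged sketch of the core mechanism described above: a scalar Kalman filter tracks DMA inflow, and the normalized innovation decides whether the next measurement is scheduled sooner or later. The random-walk flow model, thresholds and intervals below are illustrative assumptions, not the paper's calibrated values.

```python
import numpy as np

rng = np.random.default_rng(4)

q, r = 0.05, 1.0          # process and measurement noise variances (illustrative)
x_hat, p = 10.0, 1.0      # filter state (flow estimate) and its variance
interval = 15             # current sampling interval in minutes
t = 0
while t < 24 * 60:
    # Synthetic flow reading: baseline 10 units, plus a burst of 5 units after noon.
    z = 10.0 + (5.0 if t > 12 * 60 else 0.0) + rng.normal(0.0, np.sqrt(r))

    p += q * interval          # Kalman prediction for a random-walk flow model
    s = p + r                  # innovation variance
    resid = z - x_hat          # innovation
    nis = resid**2 / s         # normalized innovation squared
    k = p / s                  # Kalman gain, then measurement update
    x_hat += k * resid
    p *= 1.0 - k

    # Adjustable sampling interval: sample faster when the residual looks anomalous.
    if nis > 4.0:
        interval = max(1, interval // 2)
    elif nis < 1.0:
        interval = min(60, interval * 2)
    print(f"t={t:5d} min  flow={z:6.2f}  next interval={interval:3d} min  alarm={nis > 9.0}")
    t += interval
```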

  15. Future tendencies of climate indicators important for adaptation and mitigation strategies in forestry

    Science.gov (United States)

    Galos, Borbala; Hänsler, Andreas; Gulyas, Krisztina; Bidlo, Andras; Czimber, Kornel

    2014-05-01

    impact analyses and build an important basis of the future adaptation strategies in forestry, agriculture and water management. Funding: The research is supported by the TÁMOP-4.2.2.A-11/1/KONV-2012-0013 and TÁMOP-4.1.1.C-12/1/KONV-2012-0012 (ZENFE) joint EU-national research projects. Keywords: climate indices, climate change impacts, forestry, regional climate modelling

  16. An Energy Aware Adaptive Sampling Algorithm for Energy Harvesting WSN with Energy Hungry Sensors.

    Science.gov (United States)

    Srbinovski, Bruno; Magno, Michele; Edwards-Murphy, Fiona; Pakrashi, Vikram; Popovici, Emanuel

    2016-03-28

    Wireless sensor nodes have a limited power budget, though they are often expected to be functional in the field once deployed for extended periods of time. Therefore, minimization of energy consumption and energy harvesting technology in Wireless Sensor Networks (WSN) are key tools for maximizing network lifetime, and achieving self-sustainability. This paper proposes an energy aware Adaptive Sampling Algorithm (ASA) for WSN with power hungry sensors and harvesting capabilities, an energy management technique that can be implemented on any WSN platform with enough processing power to execute the proposed algorithm. An existing state-of-the-art ASA developed for wireless sensor networks with power hungry sensors is optimized and enhanced to adapt the sampling frequency according to the available energy of the node. The proposed algorithm is evaluated using two in-field testbeds that are supplied by two different energy harvesting sources (solar and wind). Simulation and comparison between the state-of-the-art ASA and the proposed energy aware ASA (EASA) in terms of energy durability are carried out using in-field measured harvested energy (using both wind and solar sources) and power hungry sensors (ultrasonic wind sensor and gas sensors). The simulation results demonstrate that using ASA in combination with an energy aware function on the nodes can drastically increase the lifetime of a WSN node and enable self-sustainability. In fact, the proposed EASA in conjunction with energy harvesting capability can lead towards perpetual WSN operation and significantly outperform the state-of-the-art ASA.
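
    A hedged sketch of the energy-aware idea summarized above: lengthen the sampling period while the node is draining its battery and shorten it when harvested energy exceeds consumption. The energy model, thresholds and constants are illustrative placeholders rather than the published EASA parameters.

```python
import numpy as np

rng = np.random.default_rng(5)

battery, capacity = 500.0, 1000.0   # stored energy and capacity in joules (illustrative)
period = 60.0                       # seconds between readings of the power-hungry sensor
cost_per_sample = 2.0               # joules consumed per reading (illustrative)

for hour in range(48):
    harvested = max(0.0, rng.normal(6.0, 4.0))        # solar/wind intake this hour (J)
    consumed = (3600.0 / period) * cost_per_sample    # energy spent on sampling this hour
    battery = min(capacity, max(0.0, battery + harvested - consumed))

    # Energy-aware adaptation of the sampling frequency.
    if harvested < consumed or battery < 0.2 * capacity:
        period = min(900.0, period * 1.5)   # net drain or low charge: sample less often
    elif battery > 0.6 * capacity:
        period = max(30.0, period / 1.5)    # energy surplus: sample more often
    print(f"hour {hour:2d}: battery={battery:7.1f} J  period={period:6.1f} s")
```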

  17. Motion-adapted pulse sequences for oriented sample (OS) solid-state NMR of biopolymers.

    Science.gov (United States)

    Lu, George J; Opella, Stanley J

    2013-08-28

    One of the main applications of solid-state NMR is to study the structure and dynamics of biopolymers, such as membrane proteins, under physiological conditions where the polypeptides undergo global motions as they do in biological membranes. The effects of NMR radiofrequency irradiations on nuclear spins are strongly influenced by these motions. For example, we previously showed that the MSHOT-Pi4 pulse sequence yields spectra with resonance line widths about half of those observed using the conventional pulse sequence when applied to membrane proteins undergoing rapid uniaxial rotational diffusion in phospholipid bilayers. In contrast, the line widths were not changed in microcrystalline samples where the molecules did not undergo global motions. Here, we demonstrate experimentally and describe analytically how some Hamiltonian terms are susceptible to sample motions, and it is their removal through the critical π/2 Z-rotational symmetry that confers the "motion adapted" property to the MSHOT-Pi4 pulse sequence. This leads to the design of separated local field pulse sequence "Motion-adapted SAMPI4" and is generalized to an approach for the design of decoupling sequences whose performance is superior in the presence of molecular motions. It works by cancelling the spin interaction by explicitly averaging the reduced Wigner matrix to zero, rather than utilizing the 2π nutation to average spin interactions. This approach is applicable to both stationary and magic angle spinning solid-state NMR experiments.

  18. The importance of training strategy adaptation: a learner-oriented approach for improving older adults' memory and transfer.

    Science.gov (United States)

    Bottiroli, Sara; Cavallini, Elena; Dunlosky, John; Vecchi, Tomaso; Hertzog, Christopher

    2013-09-01

    We investigated the benefits of strategy-adaptation training for promoting transfer effects. This learner-oriented approach--which directly encourages the learner to generalize strategic behavior to new tasks--helps older adults appraise new tasks and adapt trained strategies to them. In Experiment 1, older adults in a strategy-adaptation training group used 2 strategies (imagery and sentence generation) while practicing 2 tasks (list and associative learning); they were then instructed on how to do a simple task analysis to help them adapt the trained strategies for 2 different unpracticed tasks (place learning and text learning) that were discussed during training. Two additional criterion tasks (name-face associative learning and grocery-list learning) were never mentioned during training. Two other groups were included: A strategy training group (who received strategy training and transfer instructions but not strategy-adaptation training) and a waiting-list control group. Both training procedures enhanced older adults' performance on the trained tasks and those tasks that were discussed during training, but transfer was greatest after strategy-adaptation training. Experiment 2 found that strategy-adaptation training conducted via a manual that older adults used at home also promoted transfer. These findings demonstrate the importance of adopting a learner-oriented approach to promote transfer of strategy training.

  19. Role of importance of X-ray fluorescence analysis of forensic samples

    International Nuclear Information System (INIS)

    Full text: In the field of forensic science, it is very important to investigate the evidential samples obtained at various crime scenes. X-ray fluorescence (XRF) is used widely in forensic science [1]. Its main strength is its non-destructive nature, thus preserving evidence [2, 3]. In this paper, we report the application of XRF to the examination of evidence such as the purity of gold and silver jewelry (Indian ornaments), remnants of glass pieces, and paint chips recovered from crime scenes. The experimental measurements on these samples were made using an X-ray fluorescence spectrometer (LAB Center XRF-1800) procured from Shimadzu Scientific Instruments, USA. The results are explained in terms of quantitative/qualitative analysis of trace elements. (author)

  20. Local adaptation of a bacterium is as important as its presence in structuring a natural microbial community.

    Science.gov (United States)

    Gómez, Pedro; Paterson, Steve; De Meester, Luc; Liu, Xuan; Lenzi, Luca; Sharma, M D; McElroy, Kerensa; Buckling, Angus

    2016-01-01

    Local adaptation of a species can affect community composition, yet the importance of local adaptation compared with species presence per se is unknown. Here we determine how a compost bacterial community exposed to elevated temperature changes over 2 months as a result of the presence of a focal bacterium, Pseudomonas fluorescens SBW25, that had been pre-adapted or not to the compost for 48 days. The effect of local adaptation on community composition is as great as the effect of species presence per se, with these results robust to the presence of an additional strong selection pressure: an SBW25-specific virus. These findings suggest that evolution occurring over ecological time scales can be a key driver of the structure of natural microbial communities, particularly in situations where some species have an evolutionary head start following large perturbations, such as exposure to antibiotics or crop planting and harvesting. PMID:27501868

  1. On importance sampling with mixtures for random walks with heavy tails

    CERN Document Server

    Hult, Henrik

    2009-01-01

    Importance sampling algorithms for heavy-tailed random walks are considered. Using a specification with algorithms based on mixtures of the original distribution with some other distribution, sufficient conditions for obtaining bounded relative error are presented. It is proved that mixture algorithms of this kind can achieve asymptotically optimal relative error. Some examples of mixture algorithms are presented, including mixture algorithms using a scaling of the original distribution, and the bounds of the relative errors are calculated. The algorithms are evaluated numerically in a simple setting.
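
    As a hedged illustration of the mixture idea in this record, the sketch below estimates P(S_n > b) for a random walk with Pareto increments, drawing each increment from a defensive mixture of the original distribution and a stretched copy of it; the particular mixture, scale and parameters are arbitrary choices for the example, not the algorithms analyzed in the paper.

```python
import numpy as np

rng = np.random.default_rng(6)

alpha, n_steps, b = 1.5, 10, 200.0   # Pareto(alpha) increments on [1, inf), walk length, level
p_mix, scale = 0.7, b                # weight on the original law; stretched component reaches b

def pareto_pdf(x, a=alpha, xm=1.0):
    return np.where(x >= xm, a * xm**a / x**(a + 1), 0.0)

def estimate(n_runs=20_000):
    vals = np.zeros(n_runs)
    for k in range(n_runs):
        use_orig = rng.random(n_steps) < p_mix
        x = rng.pareto(alpha, n_steps) + 1.0            # draws from the original Pareto law
        x = np.where(use_orig, x, scale * x)            # stretched component of the mixture
        f = pareto_pdf(x)                               # original density at the drawn points
        f_scaled = pareto_pdf(x / scale) / scale        # density of the stretched component
        w = f / (p_mix * f + (1.0 - p_mix) * f_scaled)  # per-increment likelihood ratio
        vals[k] = np.prod(w) * (x.sum() > b)
    return vals.mean(), vals.std(ddof=1) / np.sqrt(n_runs)

print(estimate())   # estimate of P(S_n > b) and its standard error
```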

  2. Evaluation of endoscopically obtained duodenal biopsy samples from cats and dogs in an adapter-modified Ussing chamber

    OpenAIRE

    Ruhnke, Isabelle; DeBiasio, John V.; Suchodolski, Jan S.; Newman, Shelley; Musch, Mark W.; Steiner, Jörg M.

    2014-01-01

    This study was conducted to evaluate an adapter-modified Ussing chamber for assessment of transport physiology in endoscopically obtained duodenal biopsies from healthy cats and dogs, as well as dogs with chronic enteropathies. 17 duodenal biopsies from five cats and 51 duodenal biopsies from 13 dogs were obtained. Samples were transferred into an adapter-modified Ussing chamber and sequentially exposed to various absorbagogues and secretagogues. Overall, 78.6% of duodenal samples obtained fr...

  3. Cold adaptation in geographical populations of Drosophila melanogaster : phenotypic plasticity is more important than genetic variability

    NARCIS (Netherlands)

    Ayrinhac, A; Debat, [No Value; Gibert, P; Kister, AG; Legout, H; Moreteau, B; Vergilino, R; David, [No Value

    2004-01-01

    1. According to their geographical distribution, most Drosophila species may be classified as either temperate or tropical, and this pattern is assumed to reflect differences in their thermal adaptation, especially in their cold tolerance. We investigated cold tolerance in a global collection of D.

  4. Do women's voices provide cues of the likelihood of ovulation? The importance of sampling regime.

    Directory of Open Access Journals (Sweden)

    Julia Fischer

    Full Text Available The human voice provides a rich source of information about individual attributes such as body size, developmental stability and emotional state. Moreover, there is evidence that female voice characteristics change across the menstrual cycle. A previous study reported that women speak with higher fundamental frequency (F0) in the high-fertility compared to the low-fertility phase. To gain further insights into the mechanisms underlying this variation in perceived attractiveness and the relationship between vocal quality and the timing of ovulation, we combined hormone measurements and acoustic analyses, to characterize voice changes on a day-to-day basis throughout the menstrual cycle. Voice characteristics were measured from free speech as well as sustained vowels. In addition, we asked men to rate vocal attractiveness from selected samples. The free speech samples revealed marginally significant variation in F0 with an increase prior to and a distinct drop during ovulation. Overall variation throughout the cycle, however, precluded unequivocal identification of the period with the highest conception risk. The analysis of vowel samples revealed a significant increase in degree of unvoiceness and noise-to-harmonic ratio during menstruation, possibly related to an increase in tissue water content. Neither estrogen nor progestogen levels predicted the observed changes in acoustic characteristics. The perceptual experiments revealed a preference by males for voice samples recorded during the pre-ovulatory period compared to other periods in the cycle. While overall we confirm earlier findings in that women speak with a higher and more variable fundamental frequency just prior to ovulation, the present study highlights the importance of taking the full range of variation into account before drawing conclusions about the value of these cues for the detection of ovulation.

  5. Component-adaptive up-sampling for inter layer interpolation in scalable video coding

    Institute of Scientific and Technical Information of China (English)

    WANG Zhang; ZHANG JiXian; LI HaiTao

    2009-01-01

    Scalable video coding (SVC) is a newly emerging standard to be finalized as an extension of H.264/AVC. The most attractive features in SVC are the inter layer prediction techniques, such as the Intra_BL mode. However, in the current SVC scheme, a uniform up-sampling filter (UUSF) is employed to magnify all components of an image, which is inefficient and results in considerable redundant computational complexity. To overcome this, we propose an efficient component-adaptive up-sampling filter (CAUSF) for inter layer interpolation. In CAUSF, one characteristic of the human visual system is considered, and different up-sampling filters are assigned to different components. In particular, the six-tap FIR filter used in UUSF is kept and assigned to the luminance component, while a new four-tap FIR filter is used for the chrominance components. Experimental results show that CAUSF maintains coded bit-rate and PSNR-Y performance without any noticeable loss and provides a significant reduction in computational complexity.

  6. Sampling

    CERN Document Server

    Thompson, Steven K

    2012-01-01

    Praise for the Second Edition "This book has never had a competitor. It is the only book that takes a broad approach to sampling . . . any good personal statistics library should include a copy of this book." —Technometrics "Well-written . . . an excellent book on an important subject. Highly recommended." —Choice "An ideal reference for scientific researchers and other professionals who use sampling." —Zentralblatt Math Features new developments in the field combined with all aspects of obtaining, interpreting, and using sample data Sampling provides an up-to-date treat

  7. Accelerating the convergence of replica exchange simulations using Gibbs sampling and adaptive temperature sets

    CERN Document Server

    Vogel, Thomas

    2015-01-01

    We recently introduced a novel replica-exchange scheme in which an individual replica can sample from states encountered by other replicas at any previous time by way of a global configuration database, enabling the fast propagation of relevant states through the whole ensemble of replicas. This mechanism depends on the knowledge of global thermodynamic functions which are measured during the simulation and not coupled to the heat bath temperatures driving the individual simulations. Therefore, this setup also allows for a continuous adaptation of the temperature set. In this paper, we will review the new scheme and demonstrate its capability. The method is particularly useful for the fast and reliable estimation of the microcanonical temperature T(U) or, equivalently, of the density of states g(U) over a wide range of energies.
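
    The record summarizes the new scheme at a high level; for orientation, the sketch below shows only the conventional replica-exchange building block it extends, with local Metropolis moves and the standard swap criterion between neighbouring temperatures. The global configuration database and the adaptive temperature set of the paper are not reproduced, and the double-well energy is a hypothetical example.

```python
import numpy as np

rng = np.random.default_rng(7)

def energy(x):
    return (x**2 - 1.0) ** 2        # hypothetical double-well potential with minima at +/-1

betas = np.array([0.2, 0.5, 1.0, 2.0, 5.0])   # inverse temperatures, hottest first
x = rng.normal(size=len(betas))               # one configuration per replica
cold = []                                     # samples collected from the coldest replica

for sweep in range(20_000):
    # Local Metropolis update within each replica.
    for i, beta in enumerate(betas):
        prop = x[i] + rng.normal(0.0, 0.5)
        if np.log(rng.random()) < -beta * (energy(prop) - energy(x[i])):
            x[i] = prop
    # Attempt a swap between a random pair of neighbouring temperatures.
    i = rng.integers(len(betas) - 1)
    log_acc = (betas[i] - betas[i + 1]) * (energy(x[i]) - energy(x[i + 1]))
    if np.log(rng.random()) < log_acc:
        x[i], x[i + 1] = x[i + 1], x[i]
    cold.append(x[-1])

print(np.mean(np.abs(cold[10_000:])))   # coldest replica concentrates near the wells at +/-1
```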

  8. PARALLEL ADAPTIVE MULTILEVEL SAMPLING ALGORITHMS FOR THE BAYESIAN ANALYSIS OF MATHEMATICAL MODELS

    KAUST Repository

    Prudencio, Ernesto

    2012-01-01

    In recent years, Bayesian model updating techniques based on measured data have been applied to many engineering and applied science problems. At the same time, parallel computational platforms are becoming increasingly more powerful and are being used more frequently by the engineering and scientific communities. Bayesian techniques usually require the evaluation of multi-dimensional integrals related to the posterior probability density function (PDF) of uncertain model parameters. The fact that such integrals cannot be computed analytically motivates the research of stochastic simulation methods for sampling posterior PDFs. One such algorithm is the adaptive multilevel stochastic simulation algorithm (AMSSA). In this paper we discuss the parallelization of AMSSA, formulating the necessary load balancing step as a binary integer programming problem. We present a variety of results showing the effectiveness of load balancing on the overall performance of AMSSA in a parallel computational environment.

  9. An Energy Aware Adaptive Sampling Algorithm for Energy Harvesting WSN with Energy Hungry Sensors

    Science.gov (United States)

    Srbinovski, Bruno; Magno, Michele; Edwards-Murphy, Fiona; Pakrashi, Vikram; Popovici, Emanuel

    2016-01-01

    Wireless sensor nodes have a limited power budget, though they are often expected to be functional in the field once deployed for extended periods of time. Therefore, minimization of energy consumption and energy harvesting technology in Wireless Sensor Networks (WSN) are key tools for maximizing network lifetime, and achieving self-sustainability. This paper proposes an energy aware Adaptive Sampling Algorithm (ASA) for WSN with power hungry sensors and harvesting capabilities, an energy management technique that can be implemented on any WSN platform with enough processing power to execute the proposed algorithm. An existing state-of-the-art ASA developed for wireless sensor networks with power hungry sensors is optimized and enhanced to adapt the sampling frequency according to the available energy of the node. The proposed algorithm is evaluated using two in-field testbeds that are supplied by two different energy harvesting sources (solar and wind). Simulation and comparison between the state-of-the-art ASA and the proposed energy aware ASA (EASA) in terms of energy durability are carried out using in-field measured harvested energy (using both wind and solar sources) and power hungry sensors (ultrasonic wind sensor and gas sensors). The simulation results demonstrate that using ASA in combination with an energy aware function on the nodes can drastically increase the lifetime of a WSN node and enable self-sustainability. In fact, the proposed EASA in conjunction with energy harvesting capability can lead towards perpetual WSN operation and significantly outperform the state-of-the-art ASA. PMID:27043559

  10. An Energy Aware Adaptive Sampling Algorithm for Energy Harvesting WSN with Energy Hungry Sensors

    Directory of Open Access Journals (Sweden)

    Bruno Srbinovski

    2016-03-01

    Wireless sensor nodes have a limited power budget, though they are often expected to be functional in the field once deployed for extended periods of time. Therefore, minimization of energy consumption and energy harvesting technology in Wireless Sensor Networks (WSN) are key tools for maximizing network lifetime, and achieving self-sustainability. This paper proposes an energy aware Adaptive Sampling Algorithm (ASA) for WSN with power hungry sensors and harvesting capabilities, an energy management technique that can be implemented on any WSN platform with enough processing power to execute the proposed algorithm. An existing state-of-the-art ASA developed for wireless sensor networks with power hungry sensors is optimized and enhanced to adapt the sampling frequency according to the available energy of the node. The proposed algorithm is evaluated using two in-field testbeds that are supplied by two different energy harvesting sources (solar and wind). Simulation and comparison between the state-of-the-art ASA and the proposed energy aware ASA (EASA) in terms of energy durability are carried out using in-field measured harvested energy (using both wind and solar sources) and power hungry sensors (ultrasonic wind sensor and gas sensors). The simulation results demonstrate that using ASA in combination with an energy aware function on the nodes can drastically increase the lifetime of a WSN node and enable self-sustainability. In fact, the proposed EASA in conjunction with energy harvesting capability can lead towards perpetual WSN operation and significantly outperform the state-of-the-art ASA.

  11. Real-time nutrient monitoring in rivers: adaptive sampling strategies, technological challenges and future directions

    Science.gov (United States)

    Blaen, Phillip; Khamis, Kieran; Lloyd, Charlotte; Bradley, Chris

    2016-04-01

    Excessive nutrient concentrations in river waters threaten aquatic ecosystem functioning and can pose substantial risks to human health. Robust monitoring strategies are therefore required to generate reliable estimates of river nutrient loads and to improve understanding of the catchment processes that drive spatiotemporal patterns in nutrient fluxes. Furthermore, these data are vital for prediction of future trends under changing environmental conditions and thus the development of appropriate mitigation measures. In recent years, technological developments have led to an increase in the use of continuous in-situ nutrient analysers, which enable measurements at far higher temporal resolutions than can be achieved with discrete sampling and subsequent laboratory analysis. However, such instruments can be costly to run and difficult to maintain (e.g. due to high power consumption and memory requirements), leading to trade-offs between temporal and spatial monitoring resolutions. Here, we highlight how adaptive monitoring strategies, comprising a mixture of temporal sample frequencies controlled by one or more 'trigger variables' (e.g. river stage, turbidity, or nutrient concentration), can advance our understanding of catchment nutrient dynamics while simultaneously overcoming many of the practical and economic challenges encountered in typical in-situ river nutrient monitoring applications. We present examples of short-term variability in river nutrient dynamics, driven by complex catchment behaviour, which support our case for the development of monitoring systems that can adapt in real-time to rapid environmental changes. In addition, we discuss the advantages and disadvantages of current nutrient monitoring techniques, and suggest new research directions based on emerging technologies and highlight how these might improve: 1) monitoring strategies, and 2) understanding of linkages between catchment processes and river nutrient fluxes.
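
    The trigger-variable idea can be made concrete with a few lines of Python; the thresholds and intervals below are purely illustrative and would have to be tuned to the catchment and sensor suite in question.

```python
def sampling_interval(stage_m, turbidity_ntu,
                      base_s=3600, storm_s=300,
                      stage_trigger_m=1.5, turbidity_trigger_ntu=100):
    """Return the interval until the next nutrient sample.

    Hypothetical thresholds: switch from routine hourly sampling to a
    5-minute interval whenever either trigger variable indicates an event.
    """
    if stage_m >= stage_trigger_m or turbidity_ntu >= turbidity_trigger_ntu:
        return storm_s
    return base_s

print(sampling_interval(stage_m=0.8, turbidity_ntu=20))   # baseline: 3600 s
print(sampling_interval(stage_m=1.9, turbidity_ntu=20))   # storm event: 300 s
```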

  13. Model reduction algorithms for optimal control and importance sampling of diffusions

    Science.gov (United States)

    Hartmann, Carsten; Schütte, Christof; Zhang, Wei

    2016-08-01

    We propose numerical algorithms for solving optimal control and importance sampling problems based on simplified models. The algorithms combine model reduction techniques for multiscale diffusions and stochastic optimization tools, with the aim of reducing the original, possibly high-dimensional problem to a lower dimensional representation of the dynamics, in which only a few relevant degrees of freedom are controlled or biased. Specifically, we study situations in which either a reaction coordinate onto which the dynamics can be projected is known, or situations in which the dynamics shows strongly localized behavior in the small noise regime. No explicit assumptions about small parameters or scale separation have to be made. We illustrate the approach with simple, but paradigmatic numerical examples.

  14. Estimation variance bounds of importance sampling simulations in digital communication systems

    Science.gov (United States)

    Lu, D.; Yao, K.

    1991-01-01

    In practical applications of importance sampling (IS) simulation, two basic problems are encountered, that of determining the estimation variance and that of evaluating the proper IS parameters needed in the simulations. The authors derive new upper and lower bounds on the estimation variance which are applicable to IS techniques. The upper bound is simple to evaluate and may be minimized by the proper selection of the IS parameter. Thus, lower and upper bounds on the improvement ratio of various IS techniques relative to the direct Monte Carlo simulation are also available. These bounds are shown to be useful and computationally simple to obtain. Based on the proposed technique, one can readily find practical suboptimum IS parameters. Numerical results indicate that these bounding techniques are useful for IS simulations of linear and nonlinear communication systems with intersymbol interference in which bit error rate and IS estimation variances cannot be obtained readily using prior techniques.
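
    The quantity the bounds refer to, the variance of an importance sampling estimator relative to direct Monte Carlo, can be illustrated on a toy tail-probability problem; the Gaussian example and mean-shifted biasing density below are standard textbook choices, not the communication-system setting or the analytic bounds of the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
gamma, n = 4.0, 200_000                        # threshold and sample size

# Naive Monte Carlo estimate of p = P(X > gamma), X ~ N(0,1).
x = rng.standard_normal(n)
mc = (x > gamma).astype(float)

# Importance sampling with the density shifted to the threshold, q = N(gamma, 1).
y = rng.standard_normal(n) + gamma
w = np.exp(-gamma * y + 0.5 * gamma**2)        # likelihood ratio phi(y)/phi(y - gamma)
is_ = (y > gamma) * w

for name, z in [("naive MC", mc), ("importance sampling", is_)]:
    print(f"{name:20s} p_hat = {z.mean():.3e}   estimator variance = {z.var() / n:.3e}")
# The ratio of the two empirical estimator variances is the kind of
# "improvement ratio" that the proposed bounds bracket without simulation.
```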

  15. Importance Sampling Variance Reduction for the Fokker-Planck Rarefied Gas Particle Method

    CERN Document Server

    Collyer, Benjamin; Lockerby, Duncan

    2015-01-01

    Models and methods that are able to accurately and efficiently predict the flows of low-speed rarefied gases are in high demand, due to the increasing ability to manufacture devices at micro and nano scales. One such model and method is a Fokker-Planck approximation to the Boltzmann equation, which can be solved numerically by a stochastic particle method. The stochastic nature of this method leads to noisy estimates of the thermodynamic quantities one wishes to sample when the signal is small in comparison to the thermal velocity of the gas. Recently, Gorji et al have proposed a method which is able to greatly reduce the variance of the estimators, by creating a correlated stochastic process which acts as a control variate for the noisy estimates. However, there are potential difficulties involved when the geometry of the problem is complex, as the method requires the density to be solved for independently. Importance sampling is a variance reduction technique that has already been shown to successfully redu...

  16. Efficient estimation of abundance for patchily distributed populations via two-phase, adaptive sampling.

    Science.gov (United States)

    Conroy, M.J.; Runge, J.P.; Barker, R.J.; Schofield, M.R.; Fonnesbeck, C.J.

    2008-01-01

    Many organisms are patchily distributed, with some patches occupied at high density, others at lower densities, and others not occupied. Estimation of overall abundance can be difficult and is inefficient via intensive approaches such as capture-mark-recapture (CMR) or distance sampling. We propose a two-phase sampling scheme and model in a Bayesian framework to estimate abundance for patchily distributed populations. In the first phase, occupancy is estimated by binomial detection samples taken on all selected sites, where selection may be of all sites available, or a random sample of sites. Detection can be by visual surveys, detection of sign, physical captures, or other approach. At the second phase, if a detection threshold is achieved, CMR or other intensive sampling is conducted via standard procedures (grids or webs) to estimate abundance. Detection and CMR data are then used in a joint likelihood to model probability of detection in the occupancy sample via an abundance-detection model. CMR modeling is used to estimate abundance for the abundance-detection relationship, which in turn is used to predict abundance at the remaining sites, where only detection data are collected. We present a full Bayesian modeling treatment of this problem, in which posterior inference on abundance and other parameters (detection, capture probability) is obtained under a variety of assumptions about spatial and individual sources of heterogeneity. We apply the approach to abundance estimation for two species of voles (Microtus spp.) in Montana, USA. We also use a simulation study to evaluate the frequentist properties of our procedure given known patterns in abundance and detection among sites as well as design criteria. For most population characteristics and designs considered, bias and mean-square error (MSE) were low, and coverage of true parameter values by Bayesian credibility intervals was near nominal. Our two-phase, adaptive approach allows efficient estimation of

  17. Unified Importance Sampling Schemes for Efficient Simulation of Outage Capacity over Generalized Fading Channels

    KAUST Repository

    Ben Rached, Nadhir

    2015-11-13

    The outage capacity (OC) is among the most important performance metrics of communication systems operating over fading channels. Of interest in the present paper is the evaluation of the OC at the output of the Equal Gain Combining (EGC) and the Maximum Ratio Combining (MRC) receivers. In this case, it can be seen that this problem turns out to be that of computing the Cumulative Distribution Function (CDF) of the sum of independent random variables. Since finding a closed-form expression for the CDF of the sum distribution is out of reach for a wide class of commonly used distributions, methods based on Monte Carlo (MC) simulations take pride of place. In order to allow for the estimation of the operating range of small outage probabilities, it is of paramount importance to develop fast and efficient estimation methods, as naive MC simulations would require high computational complexity. In this line, we propose in this work two unified, yet efficient, hazard rate twisting Importance Sampling (IS) based approaches that efficiently estimate the OC of MRC or EGC diversity techniques over generalized independent fading channels. The first estimator is shown to possess the asymptotic optimality criterion and applies for arbitrary fading models, whereas the second one achieves the well-desired bounded relative error property for the majority of the well-known fading variates. Moreover, the second estimator is shown to achieve the asymptotic optimality property under the particular Log-normal environment. Some selected simulation results are finally provided in order to illustrate the substantial computational gain achieved by the proposed IS schemes over naive MC simulations.
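
    As a much simplified illustration of estimating a left-tail CDF of a sum of variates by a change of measure, the sketch below uses plain exponential tilting of i.i.d. unit-mean exponential gains rather than the unified hazard rate twisting estimators proposed in the paper; the number of branches, threshold and tilting rate are arbitrary.

```python
import math
import numpy as np

rng = np.random.default_rng(2)
L, gamma, n = 4, 0.1, 200_000          # branches, outage threshold, samples
lam = L / gamma                        # tilted rate pushing mass toward small sums

# Sample the channel gains from the tilted density g(x) = lam * exp(-lam * x).
x = rng.exponential(scale=1.0 / lam, size=(n, L))
s = x.sum(axis=1)
w = lam**-L * np.exp((lam - 1.0) * s)  # likelihood ratio prod f(x_i) / g(x_i)
p_is = np.mean((s <= gamma) * w)

# Exact Erlang(L, 1) CDF for reference.
p_exact = 1.0 - math.exp(-gamma) * sum(gamma**k / math.factorial(k) for k in range(L))
print(f"IS estimate {p_is:.3e}   exact {p_exact:.3e}")
```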

  18. Motion-adapted pulse sequences for oriented sample (OS) solid-state NMR of biopolymers

    Science.gov (United States)

    Lu, George J.; Opella, Stanley J.

    2013-01-01

    One of the main applications of solid-state NMR is to study the structure and dynamics of biopolymers, such as membrane proteins, under physiological conditions where the polypeptides undergo global motions as they do in biological membranes. The effects of NMR radiofrequency irradiations on nuclear spins are strongly influenced by these motions. For example, we previously showed that the MSHOT-Pi4 pulse sequence yields spectra with resonance line widths about half of those observed using the conventional pulse sequence when applied to membrane proteins undergoing rapid uniaxial rotational diffusion in phospholipid bilayers. In contrast, the line widths were not changed in microcrystalline samples where the molecules did not undergo global motions. Here, we demonstrate experimentally and describe analytically how some Hamiltonian terms are susceptible to sample motions, and it is their removal through the critical π/2 Z-rotational symmetry that confers the “motion adapted” property to the MSHOT-Pi4 pulse sequence. This leads to the design of separated local field pulse sequence “Motion-adapted SAMPI4” and is generalized to an approach for the design of decoupling sequences whose performance is superior in the presence of molecular motions. It works by cancelling the spin interaction by explicitly averaging the reduced Wigner matrix to zero, rather than utilizing the 2π nutation to average spin interactions. This approach is applicable to both stationary and magic angle spinning solid-state NMR experiments. PMID:24006989

  19. 40 CFR 80.583 - What alternative sampling and testing requirements apply to importers who transport motor vehicle...

    Science.gov (United States)

    2010-07-01

    ... diesel fuel, or ECA marine fuel by truck or rail car? Importers who import diesel fuel subject to the 15... rail car for import to the U.S., the importer must obtain a copy of the terminal test result that... diesel fuel samples and perform audits. These inspections or audits may be either announced...

  20. Estimation of failure probabilities of linear dynamic systems by importance sampling

    Indian Academy of Sciences (India)

    Anna Ivanova Olsen; Arvid Naess

    2006-08-01

    An iterative method for estimating the failure probability for certain time-variant reliability problems has been developed. In the paper, the focus is on the displacement response of a linear oscillator driven by white noise. Failure is then assumed to occur when the displacement response exceeds a critical threshold. The iteration procedure is a two-step method. On the first iteration, a simple control function promoting failure is constructed using the design point weighting principle. After time discretization, two points are chosen to construct a compound deterministic control function. It is based on the time point when the first maximum of the homogeneous solution has occurred and on the point at the end of the considered time interval. An importance sampling technique is used in order to estimate the failure probability functional on a set of initial values of state space variables and time. On the second iteration, the concept of optimal control function can be implemented to construct a Markov control which allows much better accuracy in the failure probability estimate than the simple control function. On both iterations, the concept of changing the probability measure by the Girsanov transformation is utilized. As a result the CPU time is substantially reduced compared with the crude Monte Carlo procedure.
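
    A static analogue of the design-point idea is sketched below: the failure event is a linear exceedance in standard normal space, the sampling density is shifted to the design point, and the result can be checked against the exact answer. The time-variant oscillator problem and the Girsanov-based control functions of the paper are not reproduced here.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(3)

# Static analogue: failure when a linear combination of standard normal
# variables exceeds a threshold; the exact answer is Phi(-beta).
a = np.array([1.0, 2.0, 2.0])
b = 12.0
beta = b / np.linalg.norm(a)                     # reliability index
design_point = beta * a / np.linalg.norm(a)      # most likely failure point

n = 100_000
u = rng.standard_normal((n, 3)) + design_point   # sample around the design point
# Likelihood ratio phi(u) / phi(u - design_point) for the shifted standard normal.
w = np.exp(-u @ design_point + 0.5 * design_point @ design_point)
p_is = np.mean((u @ a > b) * w)

p_exact = 0.5 * (1.0 - erf(beta / sqrt(2.0)))    # Phi(-beta)
print(f"IS estimate {p_is:.3e}   exact {p_exact:.3e}")
```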

  1. 40 CFR 80.1348 - What gasoline sample retention requirements apply to refiners and importers?

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 16 2010-07-01 2010-07-01 false What gasoline sample retention... PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) REGULATION OF FUELS AND FUEL ADDITIVES Gasoline Benzene Sampling, Testing and Retention Requirements § 80.1348 What gasoline sample retention...

  2. Quantitative assessment of the importance of phenotypic plasticity in adaptation to climate change in wild bird populations.

    Directory of Open Access Journals (Sweden)

    Oscar Vedder

    2013-07-01

    Predictions about the fate of species or populations under climate change scenarios typically neglect adaptive evolution and phenotypic plasticity, the two major mechanisms by which organisms can adapt to changing local conditions. As a consequence, we have little understanding of the scope for organisms to track changing environments by in situ adaptation. Here, we use a detailed individual-specific long-term population study of great tits (Parus major) breeding in Wytham Woods, Oxford, UK to parameterise a mechanistic model and thus directly estimate the rate of environmental change to which in situ adaptation is possible. Using the effect of changes in early spring temperature on temporal synchrony between birds and a critical food resource, we focus in particular on the contribution of phenotypic plasticity to population persistence. Despite using conservative estimates for evolutionary and reproductive potential, our results suggest little risk of population extinction under projected local temperature change; however, this conclusion relies heavily on the extent to which phenotypic plasticity tracks the changing environment. Extrapolating the model to a broad range of life histories in birds suggests that the importance of phenotypic plasticity for adjustment to projected rates of temperature change increases with slower life histories, owing to lower evolutionary potential. Understanding the determinants and constraints on phenotypic plasticity in natural populations is thus crucial for characterising the risks that rapidly changing environments pose for the persistence of such populations.

  3. Cortisol Secretion and Functional Disabilities in Old Age: Importance of Using Adaptive Control Strategies

    Science.gov (United States)

    Wrosch, Carsten; Miller, Gregory E.; Schulz, Richard

    2009-01-01

    Objectives To examine whether the use of health-related control strategies moderates the association between elevated diurnal cortisol secretion and increases in older adults’ functional disabilities. Methods Functional disabilities of 164 older adults were assessed over 4 years by measuring participants’ problems with performing activities of daily living. The main predictors included baseline levels of diurnal cortisol secretion and control strategies used to manage physical health threats. Results A large increase in functional disabilities was observed among participants who secreted elevated baseline levels of cortisol and did not use health-related control strategies. By contrast, high cortisol level was not associated with increases in functional disabilities among participants who reported using these control strategies. Among participants with low cortisol level, there was a relatively smaller increase in functional disabilities over time, and the use of control strategies was not significantly associated with changes in functional disabilities. Conclusions The findings suggest that high cortisol level is associated with an increase in older adults’ functional disabilities, but only if older adults do not engage in adaptive control strategies. PMID:19875635

  4. Blood Volume: Importance and Adaptations to Exercise Training, Environmental Stresses and Trauma/Sickness

    Science.gov (United States)

    Sawka, Michael N.; Convertino, Victor A.; Eichner, E. Randy; Schnieder, Suzanne M.; Young, Andrew J.

    2000-01-01

    This paper reviews the influence of several perturbations (physical exercise, heat stress, terrestrial altitude, microgravity, and trauma/sickness) on adaptations of blood volume (BV), erythrocyte volume (EV), and plasma volume (PV). Exercise training can induce BV expansion; PV expansion usually occurs immediately, but EV expansion takes weeks. EV and PV expansion contribute to aerobic power improvements associated with exercise training. Repeated heat exposure induces PV expansion but does not alter EV. PV expansion does not improve thermoregulation, but EV expansion improves thermoregulation during exercise in the heat. Dehydration decreases PV (and increases plasma tonicity) which elevates heat strain and reduces exercise performance. High altitude exposure causes rapid (hours) plasma loss. During initial weeks at altitude, EV is unaffected, but a gradual expansion occurs with extended acclimatization. BV adjustments contribute, but are not key, to altitude acclimatization. Microgravity decreases PV and EV, which contribute to orthostatic intolerance and decreased exercise capacity in astronauts. PV decreases may result from lower set points for total body water and central venous pressure, while EV decreases may result from increased erythrocyte destruction. Trauma, renal disease, and chronic diseases cause anemia from hemorrhage and immune activation, which suppresses erythropoiesis. The re-establishment of EV is associated with healing, improved life quality, and exercise capabilities for these injured/sick persons.

  5. Adapt

    Science.gov (United States)

    Bargatze, L. F.

    2015-12-01

    Active Data Archive Product Tracking (ADAPT) is a collection of software routines that permits one to generate XML metadata files to describe and register data products in support of the NASA Heliophysics Virtual Observatory VxO effort. ADAPT is also a philosophy. The ADAPT concept is to use any and all available metadata associated with scientific data to produce XML metadata descriptions in a consistent, uniform, and organized fashion to provide blanket access to the full complement of data stored on a targeted data server. In this poster, we present an application of ADAPT to describe all of the data products that are stored using the Common Data Format (CDF) and served out by the CDAWEB and SPDF data servers hosted at the NASA Goddard Space Flight Center. These data servers are the primary repositories for NASA Heliophysics data. For this purpose, the ADAPT routines have been used to generate data resource descriptions by using an XML schema named Space Physics Archive, Search, and Extract (SPASE). SPASE is the designated standard for documenting Heliophysics data products, as adopted by the Heliophysics Data and Model Consortium. The set of SPASE XML resource descriptions produced by ADAPT includes high-level descriptions of numerical data products, display data products, or catalogs and also includes low-level "Granule" descriptions. A SPASE Granule is effectively a universal access metadata resource; a Granule associates an individual data file (e.g. a CDF file) with a "parent" high-level data resource description, assigns a resource identifier to the file, and lists the corresponding access URL(s). The CDAWEB and SPDF file systems were queried to provide the input required by the ADAPT software to create an initial set of SPASE metadata resource descriptions. Then, the CDAWEB and SPDF data repositories were queried subsequently on a nightly basis and the CDF file lists were checked for any changes such as the occurrence of new, modified, or deleted

  6. Important aspects of residue sampling in drilling dikes; Aspectos importantes para a amostragem de residuos em diques de perfuracao

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Gilvan Ferreira da [PETROBRAS, Rio de Janeiro (Brazil). Centro de Pesquisas. Div. de Explotacao

    1989-12-31

    This paper describes the importance of sampling in the evaluation of physical and chemical properties of residues found in drilling dikes, considering the subsequent selection of treatment or disposal methods for these residues. We present the fundamental concepts of applied statistics, which are essential to the elaboration of sampling plans, with a view to obtaining accurate and precise results. Other types of samples are also presented, as well as sampling equipment and methods for storage and preservation of the samples. As a conclusion, we present an example of the implementation of a sampling plan. (author) 3 refs., 9 figs., 3 tabs.

  7. 76 FR 56491 - Culturally Significant Objects Imported for Exhibition Determinations: “Adapting the Eye: An...

    Science.gov (United States)

    2011-09-13

    ... British in India, 1770-1830'' Summary: Notice is hereby given of the following determinations: Pursuant to...: An Archive of the British in India, 1770-1830,'' imported from abroad for temporary exhibition within... objects at the Yale Center for British Art, New Haven, Connecticut, from on or about October 11,...

  8. Massively parallel sampling of lattice proteins reveals foundations of thermal adaptation

    Science.gov (United States)

    Venev, Sergey V.; Zeldovich, Konstantin B.

    2015-08-01

    Evolution of proteins in bacteria and archaea living in different conditions leads to significant correlations between amino acid usage and environmental temperature. The origins of these correlations are poorly understood, and an important question of protein theory, physics-based prediction of types of amino acids overrepresented in highly thermostable proteins, remains largely unsolved. Here, we extend the random energy model of protein folding by weighting the interaction energies of amino acids by their frequencies in protein sequences and predict the energy gap of proteins designed to fold well at elevated temperatures. To test the model, we present a novel scalable algorithm for simultaneous energy calculation for many sequences in many structures, targeting massively parallel computing architectures such as graphics processing units. The energy calculation is performed by multiplying two matrices, one representing the complete set of sequences, and the other describing the contact maps of all structural templates. An implementation of the algorithm for the CUDA platform is available at http://www.github.com/kzeldovich/galeprot and calculates protein folding energies over 250 times faster than a single central processing unit. Analysis of amino acid usage in 64-mer cubic lattice proteins designed to fold well at different temperatures demonstrates an excellent agreement between theoretical and simulated values of energy gap. The theoretical predictions of temperature trends of amino acid frequencies are significantly correlated with bioinformatics data on 191 bacteria and archaea, and highlight protein folding constraints as a fundamental selection pressure during thermal adaptation in biological evolution.
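
    The batched energy evaluation described above (one matrix of sequences times one matrix of contact maps) can be sketched in plain NumPy; the random pair potential and random contact maps below are stand-ins, and this is not the CUDA implementation distributed by the authors.

```python
import numpy as np

rng = np.random.default_rng(4)
n_seq, n_struct, L, n_aa = 5, 3, 16, 20

# Toy inputs: random sequences, a random symmetric pair potential, and random
# symmetric contact maps standing in for lattice folds.
seqs = rng.integers(n_aa, size=(n_seq, L))
U = rng.normal(size=(n_aa, n_aa))
U = 0.5 * (U + U.T)                                              # pair potential
maps = np.zeros((n_struct, L, L))
for m in maps:
    i, j = np.triu_indices(L, k=3)
    pick = rng.random(i.size) < 0.2
    m[i[pick], j[pick]] = 1.0                                    # upper-triangle contacts

# Matrix A: one row per sequence, flattened entry (i, j) -> U(a_i, a_j).
A = np.stack([U[np.ix_(s, s)].ravel() for s in seqs])            # (n_seq, L*L)
# Matrix B: one column per structure, the flattened contact map.
B = maps.reshape(n_struct, -1).T                                 # (L*L, n_struct)

energies = A @ B                                                 # (n_seq, n_struct)
print(energies)
```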

  9. Bandpass Sampling--An Opportunity to Stress the Importance of In-Depth Understanding

    Science.gov (United States)

    Stern, Harold P. E.

    2010-01-01

    Many bandpass signals can be sampled at rates lower than the Nyquist rate, allowing significant practical advantages. Illustrating this phenomenon after discussing (and proving) Shannon's sampling theorem provides a valuable opportunity for an instructor to reinforce the principle that innovation is possible when students strive to have a complete…
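
    The sub-Nyquist rates alluded to above follow from the standard bandpass sampling condition 2*f_high/n <= fs <= 2*f_low/(n-1) for integer n; the helper below simply enumerates the admissible intervals for an example band chosen for illustration.

```python
def valid_bandpass_rates(f_low, f_high):
    """Return the (min_fs, max_fs) intervals that avoid aliasing when
    uniformly sampling a bandpass signal occupying [f_low, f_high] Hz."""
    bandwidth = f_high - f_low
    intervals = []
    n_max = int(f_high // bandwidth)          # largest usable integer band position
    for n in range(n_max, 0, -1):             # n = 1 recovers the lowpass rate 2*f_high
        lo = 2.0 * f_high / n
        hi = 2.0 * f_low / (n - 1) if n > 1 else float("inf")
        if lo <= hi:
            intervals.append((lo, hi))
    return intervals

# Example: a 5 MHz wide band centred at 22.5 MHz can be sampled far below the
# 50 MHz that the lowpass Nyquist criterion would demand.
for lo, hi in valid_bandpass_rates(20e6, 25e6):
    print(f"{lo / 1e6:6.2f} MHz  to  {hi / 1e6:6.2f} MHz")
```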

  10. The Importance of Pressure Sampling Frequency in Models for Determination of Critical Wave Loadings on Monolithic Structures

    DEFF Research Database (Denmark)

    Burcharth, Hans F.; Andersen, Thomas Lykke; Meinert, Palle

    2008-01-01

    This paper discusses the influence of wave load sampling frequency on calculated sliding distance in an overall stability analysis of a monolithic caisson. It is demonstrated by a specific example of caisson design that for this kind of analysis the sampling frequency in a small scale model could

  11. Physical Activity: An Important Adaptative Mechanism for Body-Weight Control

    OpenAIRE

    Finelli, Carmine; Gioia, Saverio; La Sala, Nicolina

    2012-01-01

    We review the current concepts about energy expenditure and evaluate the physical activity (PhA) in the context of this knowledge and the available literature. Regular PhA is correlated with low body weight and low body fat mass. The negative fat balance is probably secondary to this negative energy balance. Nonexercise activity thermogenesis (NEAT) and physical activity, that is crucial for weight control, may be important in the physiology of weight change. An intriguing doubt that remains ...

  12. Eco-Physiologic studies an important tool for the adaptation of forestry to global changes.

    OpenAIRE

    HASAN CANI; ARSEN PROKO; VATH TABAKU

    2014-01-01

    Forests are the dominant land use in Albania, occupying almost 1.5 million hectares [11], but ca. 70% of the forest area belongs to coppices and shrub forests, as a result of unsustainable practices, intensive cutting and overgrazing. Forest ecosystems serve many ecological roles, including regulation of the planet's carbon and water cycles. Forests are also important components of economic systems. Research in the Forest Ecophysiology studies on the Faculty of Forestry Sciences is intended t...

  13. The Importance of Sample Processing in Analysis of Asbestos Content in Rocks and Soils

    Science.gov (United States)

    Neumann, R. D.; Wright, J.

    2012-12-01

    Analysis of asbestos content in rocks and soils using Air Resources Board (ARB) Test Method 435 (M435) involves the processing of samples for subsequent analysis by polarized light microscopy (PLM). The use of different equipment and procedures by commercial laboratories to pulverize rock and soil samples could result in different particle size distributions. It has long been theorized that asbestos-containing samples can be over-pulverized to the point where the particle dimensions of the asbestos no longer meet the required 3:1 length-to-width aspect ratio or the particles become so small that they no longer can be tested for optical characteristics using PLM where maximum PLM magnification is typically 400X. Recent work has shed some light on this issue. ARB staff conducted an interlaboratory study to investigate variability in preparation and analytical procedures used by laboratories performing M435 analysis. With regard to sample processing, ARB staff found that different pulverization equipment and processing procedures produced powders that have varying particle size distributions. PLM analysis of the finest powders produced by one laboratory showed all but one of the 12 samples were non-detect or below the PLM reporting limit; in contrast to the other 36 coarser samples from the same field sample and processed by three other laboratories where 21 samples were above the reporting limit. The set of 12, exceptionally fine powder samples produced by the same laboratory was re-analyzed by transmission electron microscopy (TEM) and results showed that these samples contained asbestos above the TEM reporting limit. However, the use of TEM as a stand-alone analytical procedure, usually performed at magnifications between 3,000 to 20,000X, also has its drawbacks because of the miniscule mass of sample that this method examines. The small amount of powder analyzed by TEM may not be representative of the field sample. The actual mass of the sample powder analyzed by

  14. Physical activity: an important adaptative mechanism for body-weight control.

    Science.gov (United States)

    Finelli, Carmine; Gioia, Saverio; La Sala, Nicolina

    2012-01-01

    We review the current concepts about energy expenditure and evaluate the physical activity (PhA) in the context of this knowledge and the available literature. Regular PhA is correlated with low body weight and low body fat mass. The negative fat balance is probably secondary to this negative energy balance. Nonexercise activity thermogenesis (NEAT) and physical activity, that is crucial for weight control, may be important in the physiology of weight change. An intriguing doubt that remains unresolved is whether changes in nutrient intake or body composition secondarily affect the spontaneous physical activity. PMID:24533208

  15. The soft palate is an important site of adaptation for transmissible influenza viruses.

    Science.gov (United States)

    Lakdawala, Seema S; Jayaraman, Akila; Halpin, Rebecca A; Lamirande, Elaine W; Shih, Angela R; Stockwell, Timothy B; Lin, Xudong; Simenauer, Ari; Hanson, Christopher T; Vogel, Leatrice; Paskel, Myeisha; Minai, Mahnaz; Moore, Ian; Orandle, Marlene; Das, Suman R; Wentworth, David E; Sasisekharan, Ram; Subbarao, Kanta

    2015-10-01

    Influenza A viruses pose a major public health threat by causing seasonal epidemics and sporadic pandemics. Their epidemiological success relies on airborne transmission from person to person; however, the viral properties governing airborne transmission of influenza A viruses are complex. Influenza A virus infection is mediated via binding of the viral haemagglutinin (HA) to terminally attached α2,3 or α2,6 sialic acids on cell surface glycoproteins. Human influenza A viruses preferentially bind α2,6-linked sialic acids whereas avian influenza A viruses bind α2,3-linked sialic acids on complex glycans on airway epithelial cells. Historically, influenza A viruses with preferential association with α2,3-linked sialic acids have not been transmitted efficiently by the airborne route in ferrets. Here we observe efficient airborne transmission of a 2009 pandemic H1N1 (H1N1pdm) virus (A/California/07/2009) engineered to preferentially bind α2,3-linked sialic acids. Airborne transmission was associated with rapid selection of virus with a change at a single HA site that conferred binding to long-chain α2,6-linked sialic acids, without loss of α2,3-linked sialic acid binding. The transmissible virus emerged in experimentally infected ferrets within 24 hours after infection and was remarkably enriched in the soft palate, where long-chain α2,6-linked sialic acids predominate on the nasopharyngeal surface. Notably, presence of long-chain α2,6-linked sialic acids is conserved in ferret, pig and human soft palate. Using a loss-of-function approach with this one virus, we demonstrate that the ferret soft palate, a tissue not normally sampled in animal models of influenza, rapidly selects for transmissible influenza A viruses with human receptor (α2,6-linked sialic acids) preference.

  16. Cas9 function and host genome sampling in Type II-A CRISPR–Cas adaptation

    OpenAIRE

    Wei, Yunzhou; Terns, Rebecca M.; Terns, Michael P.

    2015-01-01

    Wei et al. found that Cas9, previously identified as the nuclease responsible for ultimate invader destruction, is also essential for adaptation in Streptococcus thermophilus. Cas9 nuclease activity is dispensable for adaptation. Wei et al. also revealed extensive, unbiased acquisition of the self-targeting host genome sequence by the CRISPR–Cas system that is masked in the presence of active target destruction.

  17. Importance of covariance components of waveform data with high sampling rate in seismic source inversion

    Science.gov (United States)

    Yagi, Y.; Fukahata, Y.

    2007-12-01

    As computer technology has advanced, it has become possible to observe seismic waves at higher sampling rates and to perform inversions for larger data sets. In general, to obtain a finer image of seismic source processes, waveform data with a higher sampling rate are needed. We then encounter the question of whether there is any limit to the useful sampling rate in waveform inversion. In traditional seismic source inversion, covariance components of sampled waveform data have commonly been neglected. In fact, however, observed waveform data are not completely independent of each other, at least in the time domain, because they are always affected by un-elastic attenuation in the propagation of seismic waves through the Earth. In this study, we have developed a method of seismic source inversion that takes the data covariance into account, and applied it to teleseismic P-wave data of the 2003 Boumerdes-Zemmouri, Algeria earthquake. From a comparison of final slip distributions inverted by the new formulation and the traditional formulation, we found that the effect of covariance components is crucial for data sets of higher sampling rates (≥ 5 Hz). For higher sampling rates, the slip distributions by the new formulation look stable, whereas the slip distributions by the traditional formulation tend to concentrate into small patches due to overestimation of the information from observed data. Our result indicates that the un-elastic effect of the Earth places a limit on the resolution of inverted seismic source models. It has been pointed out that seismic source models obtained from waveform data analyses are quite different from one another. One possible reason for the discrepancy is the neglect of covariance components. The new formulation should be useful for obtaining a standard seismic source model.
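
    The role of off-diagonal data covariance can be illustrated with a generic linear inverse problem solved by ordinary versus generalised least squares; the exponential correlation model, design matrix and noise level below are arbitrary stand-ins for the teleseismic waveform setting of the paper.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy linear inverse problem d = G m + noise, with the noise correlated in time
# (a crude stand-in for attenuation-induced correlation of waveform samples).
n_data, n_model, dt, tau = 200, 4, 0.05, 0.4
t = np.arange(n_data) * dt
G = np.vander(t, n_model)                 # arbitrary design matrix
m_true = np.array([0.2, -1.0, 0.5, 2.0])

C = 0.1 * np.exp(-np.abs(t[:, None] - t[None, :]) / tau)   # exponential covariance
noise = rng.multivariate_normal(np.zeros(n_data), C)
d = G @ m_true + noise

# Ordinary least squares (covariance neglected) vs generalised least squares.
m_ols = np.linalg.lstsq(G, d, rcond=None)[0]
Ci = np.linalg.inv(C)
m_gls = np.linalg.solve(G.T @ Ci @ G, G.T @ Ci @ d)
print("true", m_true, "\nOLS ", m_ols, "\nGLS ", m_gls)
```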

  18. A whole-path importance-sampling scheme for Feynman path integral calculations of absolute partition functions and free energies.

    Science.gov (United States)

    Mielke, Steven L; Truhlar, Donald G

    2016-01-21

    Using Feynman path integrals, a molecular partition function can be written as a double integral with the inner integral involving all closed paths centered at a given molecular configuration, and the outer integral involving all possible molecular configurations. In previous work employing Monte Carlo methods to evaluate such partition functions, we presented schemes for importance sampling and stratification in the molecular configurations that constitute the path centroids, but we relied on free-particle paths for sampling the path integrals. At low temperatures, the path sampling is expensive because the paths can travel far from the centroid configuration. We now present a scheme for importance sampling of whole Feynman paths based on harmonic information from an instantaneous normal mode calculation at the centroid configuration, which we refer to as harmonically guided whole-path importance sampling (WPIS). We obtain paths conforming to our chosen importance function by rejection sampling from a distribution of free-particle paths. Sample calculations on CH4 demonstrate that at a temperature of 200 K, about 99.9% of the free-particle paths can be rejected without integration, and at 300 K, about 98% can be rejected. We also show that it is typically possible to reduce the overhead associated with the WPIS scheme by sampling the paths using a significantly lower-order path discretization than that which is needed to converge the partition function. PMID:26801023
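
    A one-dimensional toy analogue of the whole-path rejection step is sketched below: closed 'free-particle' paths are proposed as Brownian bridges and accepted against a harmonic importance factor. The bead number, step width and harmonic strength are arbitrary, and the instantaneous-normal-mode information used by the actual WPIS scheme is not computed here.

```python
import numpy as np

rng = np.random.default_rng(6)
P, sigma, c = 32, 0.15, 0.8     # path beads, free-particle step width, harmonic strength

def free_particle_path(P, sigma, rng):
    """Closed 1-D path (displacements from the centroid) built as a Brownian
    bridge, standing in for a free-particle Feynman path."""
    x = np.zeros(P + 1)
    for k in range(P - 1):
        frac = (P - k - 1) / (P - k)
        x[k + 1] = rng.normal(frac * x[k], sigma * np.sqrt(frac))
    return x

def harmonically_guided_path(P, sigma, c, rng):
    """Rejection-sample paths whose density is the free-particle density times a
    harmonic factor exp(-c * sum x_k^2) <= 1 (so no envelope constant is needed)."""
    trials = 0
    while True:
        trials += 1
        x = free_particle_path(P, sigma, rng)
        if rng.random() < np.exp(-c * np.sum(x**2)):
            return x, trials

x, trials = harmonically_guided_path(P, sigma, c, rng)
print(f"accepted after {trials} proposals; path spread {x.std():.3f}")
```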

  20. Sampling procedure in a willow plantation for chemical elements important for biomass combustion quality

    DEFF Research Database (Denmark)

    Liu, Na; Nielsen, Henrik Kofoed; Jørgensen, Uffe;

    2015-01-01

    Willow (Salix spp.) is expected to contribute significantly to the woody bioenergy system in the future, so more information on how to sample the quality of the willow biomass is needed. The objectives of this study were to investigate the spatial variation of elements within shoots of a willow c...... the mean concentration of the whole stem (from 86% to 108%, except for Mg, Na, Al and Fe). For practical reasons it is recommended to sample 10 cm sections at the breast height (125–135 cm) to minimise labour costs....

  1. Eco-Physiologic studies an important tool for the adaptation of forestry to global changes.

    Directory of Open Access Journals (Sweden)

    HASAN CANI

    2014-06-01

    Forests are the dominant land use in Albania, occupying almost 1.5 million hectares [11], but ca. 70% of the forest area belongs to coppices and shrub forests, as a result of unsustainable practices, intensive cutting and overgrazing. Forest ecosystems serve many ecological roles, including regulation of the planet's carbon and water cycles. Forests are also important components of economic systems. Research in the Forest Ecophysiology studies on the Faculty of Forestry Sciences is intended to produce biological knowledge that can be used to better manage forest resources for sustainable production of economic and non-economic values and aims to improve the understanding of past and current dynamics of Mediterranean and temperate forests. The overarching goal is to quantify the influence of genetics, climate, environmental stresses, and forest management inputs on forest productivity and carbon sequestration, and to understand the physiological mechanisms underlying these responses. Process-based models open the way to useful predictions of the future growth rate of forests and provide a means of assessing the probable effects of variations in climate and management on forest productivity. As such they have the potential to overcome the limitations of conventional forest growth and yield models. This paper discusses the basic physiological processes that determine the growth of plants, the way they are affected by environmental factors and how we can improve processes that are well understood, such as growth from leaf to stand level and productivity. The study tries to show a clear relationship between temperature and water relations and other factors affecting forest plant germination and growth that are often looked at separately. This integrated approach will provide the most comprehensive source for process-based modelling, which is valuable to ecologists, plant physiologists, forest planners and environmental scientists [10]. Actually the

  2. 40 CFR 80.335 - What gasoline sample retention requirements apply to refiners and importers?

    Science.gov (United States)

    2010-07-01

    ... certify that the procedures meet the requirements of the ASTM procedures required under 40 CFR 80.330. (d... 40 Protection of Environment 16 2010-07-01 2010-07-01 false What gasoline sample retention... PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) REGULATION OF FUELS AND FUEL ADDITIVES Gasoline...

  3. Sample preparation and biomass determination of SRF model mixture using cryogenic milling and the adapted balance method

    Energy Technology Data Exchange (ETDEWEB)

    Schnöller, Johannes, E-mail: johannes.schnoeller@chello.at; Aschenbrenner, Philipp; Hahn, Manuel; Fellner, Johann; Rechberger, Helmut

    2014-11-15

    Highlights: • An alternative sample comminution procedure for SRF is tested. • Proof of principle is shown on an SRF model mixture. • The biogenic content of the SRF is analyzed with the adapted balance method. • The novel method combines combustion analysis and a data reconciliation algorithm. • Factors for the variance of the analysis results are statistically quantified. - Abstract: The biogenic fraction of a simple solid recovered fuel (SRF) mixture (80 wt% printer paper/20 wt% high density polyethylene) is analyzed with the in-house developed adapted balance method (aBM). This fairly new approach is a combination of combustion elemental analysis (CHNS) and a data reconciliation algorithm based on successive linearisation for evaluation of the analysis results. This method shows great potential as an alternative way to determine the biomass content in SRF. However, the employed analytical technique (CHNS elemental analysis) restricts the probed sample mass to low amounts in the range of a few hundred milligrams. This requires sample comminution to small grain sizes (<200 μm) to generate representative SRF specimens. This is not easily accomplished for certain material mixtures (e.g. SRF with rubber content) by conventional means of sample size reduction. This paper presents a proof of principle investigation of the sample preparation and analysis of an SRF model mixture with the use of cryogenic impact milling (final sample comminution) and the adapted balance method (determination of biomass content). The resulting sample preparation methodology (cutting mills plus cryogenic impact milling) shows better performance in accuracy and precision for the determination of the biomass content than one based solely on cutting mills. The results for the determination of the biogenic fraction are within 1–5% of the data obtained by the reference methods, the selective dissolution method (SDM) and the ¹⁴C method (¹⁴C-M).

  4. Optimal importance sampling for tracking in image sequences: application to point tracking

    OpenAIRE

    Arnaud, Elise; Memin, Etienne

    2004-01-01

    In this paper, we propose a particle filtering technique for tracking applications in image sequences. The system we propose combines a measurement equation and a dynamic equation which both depend on the image sequence. Taking into account several possible observations, the particular measurement model we consider is a linear combination of Gaussian laws. Such a model allows us to infer an analytic expression of the optimal importance function used in the diffusion proces...
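
    For a linear-Gaussian state-space model the optimal importance function is available in closed form, which makes the idea easy to sketch; the scalar model below is a stand-in, not the image-based measurement equation (a mixture of Gaussians) treated in the paper.

```python
import numpy as np

rng = np.random.default_rng(7)

# Linear-Gaussian model: x_t = a x_{t-1} + q*eps,  y_t = x_t + r*eta.
a, q, r, T, N = 0.95, 0.5, 0.7, 100, 500

# Simulate data.
x_true = np.zeros(T); y = np.zeros(T)
for t in range(1, T):
    x_true[t] = a * x_true[t - 1] + q * rng.standard_normal()
    y[t] = x_true[t] + r * rng.standard_normal()

# Particle filter drawing from the optimal importance density
# p(x_t | x_{t-1}, y_t), which is Gaussian for this model.
var_opt = 1.0 / (1.0 / q**2 + 1.0 / r**2)
particles = np.zeros(N)
weights = np.full(N, 1.0 / N)
estimates = np.zeros(T)
for t in range(1, T):
    pred = a * particles
    # Incremental weight: p(y_t | x_{t-1}) = N(y_t; a x_{t-1}, q^2 + r^2).
    weights *= np.exp(-0.5 * (y[t] - pred) ** 2 / (q**2 + r**2))
    weights /= weights.sum()
    mean_opt = var_opt * (pred / q**2 + y[t] / r**2)
    particles = mean_opt + np.sqrt(var_opt) * rng.standard_normal(N)
    estimates[t] = np.dot(weights, particles)
    if 1.0 / np.sum(weights**2) < N / 2:                # resample when degenerate
        idx = rng.choice(N, size=N, p=weights)
        particles, weights = particles[idx], np.full(N, 1.0 / N)

print(f"RMSE of filter mean: {np.sqrt(np.mean((estimates - x_true) ** 2)):.3f}")
```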

  5. Numerically Accelerated Importance Sampling for Nonlinear Non-Gaussian State Space Models

    OpenAIRE

    Koopman, S.J.; Lucas, A.; Scharth Figueiredo Pinto, M.

    2011-01-01

    This paper led to a publication in the 'Journal of Business & Economic Statistics', 2015, 33 (1), 114-127. We introduce a new efficient importance sampler for nonlinear non-Gaussian state space models. We propose a general and efficient likelihood evaluation method for this class of models via the combination of numerical and Monte Carlo integration methods. Our methodology explores the idea that only a small part of the likelihood evaluation problem requires simulation. We refer to our new ...

  6. A Monte Carlo Simulation of the Flow Network Reliability using Importance and Stratified Sampling

    OpenAIRE

    Bulteau, Stéphane; El Khadiri, Mohamed

    1997-01-01

    We consider the evaluation of the flow network reliability parameter. Because the exact evaluation of this parameter has exponential time complexity, simulation methods are used to derive an estimate. In this paper, we use the state space decomposition methodology of Doulliez and Jamoulle for constructing a new simulation method which combines the importance and the stratified Monte Carlo principles. We show that the related estimator belongs to the variance-reduction family. By numerical c...
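
    The stratification idea can be illustrated on the simpler two-terminal connectivity version of network reliability: strata are defined by the number of failed links, each stratum is sampled uniformly and weighted by its binomial probability. The tiny graph and failure probability are invented, and the state space decomposition of Doulliez and Jamoulle used in the paper is not implemented.

```python
import math
import random

random.seed(0)

# Tiny undirected network; estimate the probability that nodes 0 and 3 stay
# connected when each edge independently fails with probability p_fail.
edges = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 3)]
p_fail, samples_per_stratum = 0.2, 2000

def connected(up_edges):
    """Depth-first search from node 0; is node 3 reachable?"""
    adj = {n: [] for n in range(4)}
    for u, v in up_edges:
        adj[u].append(v); adj[v].append(u)
    seen, stack = {0}, [0]
    while stack:
        for w in adj[stack.pop()]:
            if w not in seen:
                seen.add(w); stack.append(w)
    return 3 in seen

# Stratify on the number of failed edges; within each stratum, sample failure
# patterns uniformly and weight the stratum by its binomial probability.
m = len(edges)
reliability = 0.0
for k in range(m + 1):
    p_stratum = math.comb(m, k) * p_fail**k * (1 - p_fail) ** (m - k)
    hits = 0
    for _ in range(samples_per_stratum):
        failed = set(random.sample(range(m), k))
        up = [e for i, e in enumerate(edges) if i not in failed]
        hits += connected(up)
    reliability += p_stratum * hits / samples_per_stratum
print(f"stratified estimate of P(0 and 3 connected) = {reliability:.4f}")
```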

  7. How are farmers adapting to climate change in Vietnam?: Endogeneity and sample selection in a rice yield model

    OpenAIRE

    Yu, Bingxin; Zhu, Tingju; Breisinger, Clemens; Manh Hai, Nguyen

    2013-01-01

    This paper examines how a changing climate may affect rice production and how Vietnamese farmers are likely to adapt to various climatic conditions using an innovative yield function approach, taking into account sample selection bias and endogeneity of inputs. Model results suggest that although climate change can potentially reduce rice production, farmers will respond mainly by adjusting the production portfolio and levels of input use. However, investments in rural infrastructure and huma...

  8. Research on non-uniform sampling problem when adapting wavenumber algorithm to multiple-receiver synthetic aperture sonar

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    The azimuth sampling of multiple-receiver SAS systems is non-uniform, which means that the standard wavenumber algorithm (ω-κ) cannot be applied directly to multiple-receiver SAS image reconstruction. To solve this problem, two methods are presented which can adapt the standard ω-κ algorithm to multiple-receiver SAS systems. One method, named Non-uniform Separate Fourier Transform (NSFFT), converts the Fourier Transform (FT) of the non-uniform samples in the azimuth direction into several uniform FTs on the assumption that the sonar array...

  9. Importance of Sample Size for the Estimation of Repeater F Waves in Amyotrophic Lateral Sclerosis

    Directory of Open Access Journals (Sweden)

    Jia Fang

    2015-01-01

    Background: In amyotrophic lateral sclerosis (ALS), repeater F waves are increased. Accurate assessment of repeater F waves requires an adequate sample size. Methods: We studied the F waves of left ulnar nerves in ALS patients. Based on the presence or absence of pyramidal signs in the left upper limb, the ALS patients were divided into two groups: One group with pyramidal signs designated as P group and the other without pyramidal signs designated as NP group. The Index repeating neurons (RN) and Index repeater F waves (Freps) were compared among the P, NP and control groups following 20 and 100 stimuli respectively. For each group, the Index RN and Index Freps obtained from 20 and 100 stimuli were compared. Results: In the P group, the Index RN (P = 0.004) and Index Freps (P = 0.001) obtained from 100 stimuli were significantly higher than from 20 stimuli. For F waves obtained from 20 stimuli, no significant differences were identified between the P and NP groups for Index RN (P = 0.052) and Index Freps (P = 0.079); The Index RN (P < 0.001) and Index Freps (P < 0.001) of the P group were significantly higher than the control group; The Index RN (P = 0.002) of the NP group was significantly higher than the control group. For F waves obtained from 100 stimuli, the Index RN (P < 0.001) and Index Freps (P < 0.001) of the P group were significantly higher than the NP group; The Index RN (P < 0.001) and Index Freps (P < 0.001) of the P and NP groups were significantly higher than the control group. Conclusions: Increased repeater F waves reflect increased excitability of the motor neuron pool and indicate upper motor neuron dysfunction in ALS. For an accurate evaluation of repeater F waves in ALS patients, especially those with moderate to severe muscle atrophy, 100 stimuli would be required.

  10. Importance of Sample Size for the Estimation of Repeater F Waves in Amyotrophic Lateral Sclerosis

    Institute of Scientific and Technical Information of China (English)

    Jia Fang; Ming-Sheng Liu; Yu-Zhou Guan; Bo Cui; Li-Ying Cui

    2015-01-01

    Background: In amyotrophic lateral sclerosis (ALS), repeater F waves are increased. Accurate assessment of repeater F waves requires an adequate sample size. Methods: We studied the F waves of left ulnar nerves in ALS patients. Based on the presence or absence of pyramidal signs in the left upper limb, the ALS patients were divided into two groups: One group with pyramidal signs designated as P group and the other without pyramidal signs designated as NP group. The Index repeating neurons (RN) and Index repeater F waves (Freps) were compared among the P, NP and control groups following 20 and 100 stimuli respectively. For each group, the Index RN and Index Freps obtained from 20 and 100 stimuli were compared. Results: In the P group, the Index RN (P = 0.004) and Index Freps (P = 0.001) obtained from 100 stimuli were significantly higher than from 20 stimuli. For F waves obtained from 20 stimuli, no significant differences were identified between the P and NP groups for Index RN (P = 0.052) and Index Freps (P = 0.079); The Index RN (P < 0.001) and Index Freps (P < 0.001) of the P group were significantly higher than the control group; The Index RN (P = 0.002) of the NP group was significantly higher than the control group. For F waves obtained from 100 stimuli, the Index RN (P < 0.001) and Index Freps (P < 0.001) of the P group were significantly higher than the NP group; The Index RN (P < 0.001) and Index Freps (P < 0.001) of the P and NP groups were significantly higher than the control group. Conclusions: Increased repeater F waves reflect increased excitability of the motor neuron pool and indicate upper motor neuron dysfunction in ALS. For an accurate evaluation of repeater F waves in ALS patients, especially those with moderate to severe muscle atrophy, 100 stimuli would be required.

  11. Quantitative Assessment of the Importance of Phenotypic Plasticity in Adaptation to Climate Change in Wild Bird Populations

    OpenAIRE

    Vedder, Oscar; Bouwhuis, Sandra; SHELDON, BEN C.

    2013-01-01

    Predictions about the fate of species or populations under climate change scenarios typically neglect adaptive evolution and phenotypic plasticity, the two major mechanisms by which organisms can adapt to changing local conditions. As a consequence, we have little understanding of the scope for organisms to track changing environments by in situ adaptation. Here, we use a detailed individual-specific long-term population study of great tits (Parus major) breeding in Wytham Woods, Oxford, UK t...

  12. Reliability and Validity of the Spanish Adaptation of EOSS, Comparing Normal and Clinical Samples

    Science.gov (United States)

    Valero-Aguayo, Luis; Ferro-Garcia, Rafael; Lopez-Bermudez, Miguel Angel; de Huralde, Ma. Angeles Selva-Lopez

    2012-01-01

    The Experiencing of Self Scale (EOSS) was created for the evaluation of Functional Analytic Psychotherapy (Kohlenberg & Tsai, 1991, 2001, 2008) in relation to the concept of the experience of personal self as socially and verbally constructed. This paper presents a reliability and validity study of the EOSS with a Spanish sample (582 participants,…

  13. Efficient Bayes-Adaptive Reinforcement Learning using Sample-Based Search

    CERN Document Server

    Guez, Arthur; Dayan, Peter

    2012-01-01

    Bayesian model-based reinforcement learning is a formally elegant approach to learning optimal behaviour under model uncertainty. In this setting, a Bayes-optimal policy captures the ideal trade-off between exploration and exploitation. Unfortunately, finding Bayes-optimal policies is notoriously taxing due to the enormous search space in the augmented belief-state MDP. In this paper we exploit recent advances in sample-based planning, based on Monte-Carlo tree search, to introduce a tractable method for approximate Bayes-optimal planning. Unlike prior work in this area, we avoid expensive applications of Bayes rule within the search tree, by lazily sampling models from the current beliefs. Our approach outperformed prior Bayesian model-based RL algorithms by a significant margin on several well-known benchmark problems.
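
    What "Bayes-optimal" means can be seen on a toy two-armed Bernoulli bandit, where the belief-state tree is small enough to enumerate exactly; the sketch below does that enumeration and is emphatically not the BAMCP algorithm of the paper, whose point is to approximate this computation with Monte-Carlo tree search and lazily sampled models when enumeration is hopeless.

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def value(belief, horizon):
    """Bayes-optimal value of a two-armed Bernoulli bandit.

    `belief` holds a (successes, failures) Beta posterior per arm; the recursion
    enumerates the belief-state tree exactly, which is only feasible for tiny
    horizons; that blow-up is what sample-based search is designed to avoid.
    """
    if horizon == 0:
        return 0.0
    best = 0.0
    for arm in (0, 1):
        s, f = belief[arm]
        p = (s + 1) / (s + f + 2)                     # posterior mean, Beta(1,1) prior
        win = list(belief); win[arm] = (s + 1, f)
        lose = list(belief); lose[arm] = (s, f + 1)
        q = p * (1 + value(tuple(win), horizon - 1)) \
            + (1 - p) * value(tuple(lose), horizon - 1)
        best = max(best, q)
    return best

# Exploration has value: an untried arm is worth probing even against a
# moderately good known arm, because its posterior can still improve.
print(value(((0, 0), (3, 2)), 10))
print(value(((0, 0), (0, 0)), 10))
```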

  14. SAMPLING ADAPTIVE STRATEGY AND SPATIAL ORGANISATION ESTIMATION OF SOIL ANIMAL COMMUNITIES AT VARIOUS HIERARCHICAL LEVELS OF URBANISED TERRITORIES

    Directory of Open Access Journals (Sweden)

    Baljuk J.A.

    2014-12-01

    Full Text Available In this work, an algorithm implementing an adaptive strategy for optimal spatial sampling to study the spatial organisation of soil animal communities under urbanisation is presented. The principal components obtained from an analysis of field data on soil penetration resistance, soil electrical conductivity and forest stand density, collected on a quasiregular grid, were used as operating variables. The locations of the experimental polygons were determined by means of the ESAP program. Sampling was carried out on a regular grid within the experimental polygons. The biogeocoenological assessment of the experimental polygons was based on A. L. Belgard's ecomorphic analysis. The spatial configuration of biogeocoenosis types was established from remote sensing data and the analysis of a digital elevation model. The suggested algorithm makes it possible to reveal the spatial organisation of soil animal communities at the level of the investigated point, the biogeocoenosis and the landscape.

  15. Adaptive foveated single-pixel imaging with dynamic super-sampling

    CERN Document Server

    Phillips, David B; Taylor, Jonathan M; Edgar, Matthew P; Barnett, Stephen M; Gibson, Graham G; Padgett, Miles J

    2016-01-01

    As an alternative to conventional multi-pixel cameras, single-pixel cameras enable images to be recorded using a single detector that measures the correlations between the scene and a set of patterns. However, to fully sample a scene in this way requires at least the same number of correlation measurements as there are pixels in the reconstructed image. Therefore single-pixel imaging systems typically exhibit low frame-rates. To mitigate this, a range of compressive sensing techniques have been developed which rely on a priori knowledge of the scene to reconstruct images from an under-sampled set of measurements. In this work we take a different approach and adopt a strategy inspired by the foveated vision systems found in the animal kingdom - a framework that exploits the spatio-temporal redundancy present in many dynamic scenes. In our single-pixel imaging system a high-resolution foveal region follows motion within the scene, but unlike a simple zoom, every frame delivers new spatial information from acros...

  16. Assessment of Different Sampling Methods for Measuring and Representing Macular Cone Density Using Flood-Illuminated Adaptive Optics

    Science.gov (United States)

    Feng, Shu; Gale, Michael J.; Fay, Jonathan D.; Faridi, Ambar; Titus, Hope E.; Garg, Anupam K.; Michaels, Keith V.; Erker, Laura R.; Peters, Dawn; Smith, Travis B.; Pennesi, Mark E.

    2015-01-01

    Purpose To describe a standardized flood-illuminated adaptive optics (AO) imaging protocol suitable for the clinical setting and to assess sampling methods for measuring cone density. Methods Cone density was calculated following three measurement protocols: 50 × 50-μm sampling window values every 0.5° along the horizontal and vertical meridians (fixed-interval method), the mean density of expanding 0.5°-wide arcuate areas in the nasal, temporal, superior, and inferior quadrants (arcuate mean method), and the peak cone density of a 50 × 50-μm sampling window within expanding arcuate areas near the meridian (peak density method). Repeated imaging was performed in nine subjects to determine intersession repeatability of cone density. Results Cone density montages could be created for 67 of the 74 subjects. Image quality was determined to be adequate for automated cone counting for 35 (52%) of the 67 subjects. We found that cone density varied with different sampling methods and regions tested. In the nasal and temporal quadrants, peak density most closely resembled histological data, whereas the arcuate mean and fixed-interval methods tended to underestimate the density compared with histological data. However, in the inferior and superior quadrants, arcuate mean and fixed-interval methods most closely matched histological data, whereas the peak density method overestimated cone density compared with histological data. Intersession repeatability testing showed that repeatability was greatest when sampling by arcuate mean and lowest when sampling by fixed interval. Conclusions We show that different methods of sampling can significantly affect cone density measurements. Therefore, care must be taken when interpreting cone density results, even in a normal population. PMID:26325414

  17. Adaptation of the Thinking Styles Inventory (TSI) within a Romanian student sample

    Directory of Open Access Journals (Sweden)

    Maricutoiu, L.P.

    2014-07-01

    Full Text Available The present paper presents the psychometric properties of the Thinking Styles Inventory (TSI) in a sample of 543 Romanian undergraduate students. The TSI is a self-report questionnaire developed for the assessment of 13 types of preferences for problem solving (or thinking styles). The internal reliability analyses indicated that the TSI scales have poor reliability (Cronbach's alphas between .26 and .72, with a median value of .62), and these values were only slightly improved after we removed 10 items from the original questionnaire. Confirmatory factor analyses failed to identify an appropriate solution for describing the relationships between the TSI items, indicating poor structural validity of the questionnaire. Further analyses indicated that the sex of the respondent has small effects on the TSI scales. Also, the results indicated that the TSI scales can be used effectively to predict the academic specialization of the respondent.

  18. THE IMPORTANCE OF THE MAGNETIC FIELD FROM AN SMA-CSO-COMBINED SAMPLE OF STAR-FORMING REGIONS

    Energy Technology Data Exchange (ETDEWEB)

    Koch, Patrick M.; Tang, Ya-Wen; Ho, Paul T. P.; Chen, Huei-Ru Vivien; Liu, Hau-Yu Baobab; Yen, Hsi-Wei; Lai, Shih-Ping [Academia Sinica, Institute of Astronomy and Astrophysics, Taipei, Taiwan (China); Zhang, Qizhou; Chen, How-Huan; Ching, Tao-Chung [Harvard-Smithsonian Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States); Girart, Josep M. [Institut de Ciències de l' Espai, CSIC-IEEC, Campus UAB, Facultat de Ciències, C5p 2, 08193 Bellaterra, Catalonia (Spain); Frau, Pau [Observatorio Astronómico Nacional, Alfonso XII, 3 E-28014 Madrid (Spain); Li, Hua-Bai [Department of Physics, The Chinese University of Hong Kong (Hong Kong); Li, Zhi-Yun [Department of Astronomy, University of Virginia, P.O. Box 400325, Charlottesville, VA 22904 (United States); Padovani, Marco [Laboratoire Univers et Particules de Montpellier, UMR 5299 du CNRS, Université de Montpellier II, place E. Bataillon, cc072, F-34095 Montpellier (France); Qiu, Keping [School of Astronomy and Space Science, Nanjing University, 22 Hankou Road, Nanjiing 210093 (China); Rao, Ramprasad, E-mail: pmkoch@asiaa.sinica.edu.tw [Academia Sinica, Institute of Astronomy and Astrophysics, 645 N. Aohoku Place, Hilo, HI 96720 (United States)

    2014-12-20

    Submillimeter dust polarization measurements of a sample of 50 star-forming regions, observed with the Submillimeter Array (SMA) and the Caltech Submillimeter Observatory (CSO) covering parsec-scale clouds to milliparsec-scale cores, are analyzed in order to quantify the magnetic field importance. The magnetic field misalignment δ—the local angle between magnetic field and dust emission gradient—is found to be a prime observable, revealing distinct distributions for sources where the magnetic field is preferentially aligned with or perpendicular to the source minor axis. Source-averaged misalignment angles ⟨|δ|⟩ fall into systematically different ranges, reflecting the different source-magnetic field configurations. Possible bimodal ⟨|δ|⟩ distributions are found for the separate SMA and CSO samples. Combining both samples broadens the distribution with a wide maximum peak at small ⟨|δ|⟩ values. Assuming the 50 sources to be representative, the prevailing source-magnetic field configuration is one that statistically prefers small magnetic field misalignments |δ|. When interpreting |δ| together with a magnetohydrodynamics force equation, as developed in the framework of the polarization-intensity gradient method, a sample-based log-linear scaling fits the magnetic field tension-to-gravity force ratio Σ_B versus ⟨|δ|⟩ with ⟨Σ_B⟩ = 0.116 · exp(0.047 · ⟨|δ|⟩) ± 0.20 (mean error), providing a way to estimate the relative importance of the magnetic field, only based on measurable field misalignments |δ|. The force ratio Σ_B discriminates systems that are collapsible on average (⟨Σ_B⟩ < 1) from other molecular clouds where the magnetic field still provides enough resistance against gravitational collapse (⟨Σ_B⟩ > 1). The sample-wide trend shows a transition around ⟨|δ|⟩ ≈ 45°. Defining an effective gravitational force ∼1 – ⟨Σ_B⟩, the average magnetic-field-reduced star formation efficiency is at least a
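
    For readers who want to apply the quoted scaling, the fitted relation between the source-averaged misalignment ⟨|δ|⟩ (in degrees) and the tension-to-gravity force ratio ⟨Σ_B⟩ can be evaluated directly. The snippet below simply implements the published fit and its ⟨Σ_B⟩ ≷ 1 reading; the ±0.20 mean error is noted but not modelled.

```python
import math

def sigma_B(mean_abs_delta_deg):
    """Sample-based fit quoted in the abstract: <Sigma_B> = 0.116 * exp(0.047 * <|delta|>), delta in degrees."""
    return 0.116 * math.exp(0.047 * mean_abs_delta_deg)

def classify(mean_abs_delta_deg):
    s = sigma_B(mean_abs_delta_deg)
    # <Sigma_B> < 1: gravity dominates on average (collapsible);
    # <Sigma_B> > 1: magnetic tension still resists gravitational collapse.
    return s, ("collapsible" if s < 1.0 else "magnetically supported")

print(classify(30.0))   # well below the ~45 degree transition quoted in the abstract
print(classify(60.0))   # above it
```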

  19. The Importance of Sample Return in Establishing Chemical Evidence for Life on Mars or Other Solar System Bodies

    Science.gov (United States)

    Glavin, D. P.; Conrad, P.; Dworkin, J. P.; Eigenbrode, J.; Mahaffy, P. R.

    2011-01-01

    The search for evidence of life on Mars and elsewhere will continue to be one of the primary goals of NASA's robotic exploration program over the next decade. NASA and ESA are currently planning a series of robotic missions to Mars with the goal of understanding its climate, resources, and potential for harboring past or present life. One key goal will be the search for chemical biomarkers including complex organic compounds important in life on Earth. These include amino acids, the monomer building blocks of proteins and enzymes, nucleobases and sugars which form the backbone of DNA and RNA, and lipids, the structural components of cell membranes. Many of these organic compounds can also be formed abiotically, as demonstrated by their prevalence in carbonaceous meteorites [1], though their molecular characteristics may distinguish a biological source [2]. It is possible that in situ instruments may reveal such characteristics; however, return of the right sample (i.e. one with biosignatures or having a high probability of biosignatures) to Earth would allow for more intensive laboratory studies using a broad array of powerful instrumentation for bulk characterization, molecular detection, isotopic and enantiomeric compositions, and spatially resolved chemistry that may be required for confirmation of extant or extinct Martian life. Here we will discuss the current analytical capabilities and strategies for the detection of organics on the Mars Science Laboratory (MSL) using the Sample Analysis at Mars (SAM) instrument suite and how sample return missions from Mars and other targets of astrobiological interest will help advance our understanding of chemical biosignatures in the solar system.

  20. Inferring the demographic history from DNA sequences: An importance sampling approach based on non-homogeneous processes.

    Science.gov (United States)

    Ait Kaci Azzou, S; Larribe, F; Froda, S

    2016-10-01

    In Ait Kaci Azzou et al. (2015) we introduced an Importance Sampling (IS) approach for estimating the demographic history of a sample of DNA sequences, the skywis plot. More precisely, we proposed a new nonparametric estimate of a population size that changes over time. We showed on simulated data that the skywis plot can work well in typical situations where the effective population size does not undergo very steep changes. In this paper, we introduce an iterative procedure which extends the previous method and gives good estimates under such rapid variations. In the iterative calibrated skywis plot we approximate the effective population size by a piecewise constant function, whose values are re-estimated at each step. These piecewise constant functions are used to generate the waiting times of non homogeneous Poisson processes related to a coalescent process with mutation under a variable population size model. Moreover, the present IS procedure is based on a modified version of the Stephens and Donnelly (2000) proposal distribution. Finally, we apply the iterative calibrated skywis plot method to a simulated data set from a rapidly expanding exponential model, and we show that the method based on this new IS strategy correctly reconstructs the demographic history.
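
    The generic importance sampling identity that coalescent-based likelihood estimators of this kind rely on, and that the proposal distribution q is designed to make efficient, is the standard construction written below; the paper's specific iterative calibration of the population-size function is not reproduced here.

```latex
% Generic importance sampling identity for the likelihood of sequence data D under
% demographic parameters \Theta, averaging over latent genealogies/histories H
% drawn from a proposal distribution q:
\[
  L(\Theta) \;=\; \sum_{H} p(D, H \mid \Theta)
            \;=\; \mathbb{E}_{q}\!\left[ \frac{p(D, H \mid \Theta)}{q(H)} \right]
  \;\approx\; \frac{1}{M} \sum_{i=1}^{M} \frac{p(D, H_i \mid \Theta)}{q(H_i)},
  \qquad H_i \sim q .
\]
```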

  1. HOW TO ESTIMATE THE AMOUNT OF IMPORTANT CHARACTERISTICS MISSING IN A CONSUMERS SAMPLE BY USING BAYESIAN ESTIMATORS

    Directory of Open Access Journals (Sweden)

    Sueli A. Mingoti

    2001-06-01

    Full Text Available Consumer surveys are conducted very often by many companies with the main objective of obtaining information about the opinions consumers have about a specific prototype, product or service. In many situations the goal is to identify the characteristics that are considered important by the consumers when taking the decision to buy or use the products or services. When the survey is performed, some characteristics that are present in the consumer population might not be reported by the consumers in the observed sample. Therefore, some important characteristics of the product according to the consumers' opinions could be missing in the observed sample. The main objective of this paper is to show how the amount of characteristics missing in the observed sample can be easily estimated by using Bayesian estimators proposed by Mingoti & Meeden (1992) and Mingoti (1999). An example of application related to an automobile survey is presented.

  2. Predicting the impacts of climate change on animal distributions: the importance of local adaptation and species' traits

    Energy Technology Data Exchange (ETDEWEB)

    HELLMANN, J. J.; LOBO, N. F.

    2011-12-20

    The geographic range limits of many species are strongly affected by climate and are expected to change under global warming. For species that are able to track changing climate over broad geographic areas, we expect to see shifts in species distributions toward the poles and away from the equator. A number of ecological and evolutionary factors, however, could restrict this shifting or redistribution under climate change. These factors include restricted habitat availability, restricted capacity for or barriers to movement, or reduced abundance of colonists due to the perturbation effect of climate change. This research project examined the last of these constraints - that climate change could perturb local conditions to which populations are adapted, reducing the likelihood that a species will shift its distribution by diminishing the number of potential colonists. In the most extreme cases, species ranges could collapse over a broad geographic area with no poleward migration and an increased risk of species extinction. Changes in individual species ranges are the processes that drive larger phenomena such as changes in land cover, ecosystem type, and even changes in carbon cycling. For example, consider the poleward range shift and population outbreaks of the mountain pine beetle that has decimated millions of acres of Douglas fir trees in the western US and Canada. Standing dead trees cause forest fires and release vast quantities of carbon to the atmosphere. The beetle likely shifted its range because it is not locally adapted across its range, and it appears to be limited by winter low temperatures that have steadily increased in the last decades. To understand range and abundance changes like those of the pine beetle, we must reveal the extent of adaptive variation across species ranges - and the physiological basis of that adaptation - to know if other species will change as readily as the pine beetle. Ecologists tend to assume that range shifts are the dominant

  3. A new calculation method adapted to the experimental conditions for determining samples γ-activities induced by 14 MeV neutrons

    Science.gov (United States)

    Rzama, A.; Erramli, H.; Misdaq, M. A.

    1994-09-01

    Induced gamma-activities of different disk-shaped samples and standards irradiated with 14 MeV neutrons have been determined by using a Monte Carlo calculation method adapted to the experimental conditions. The self-absorption of the multienergetic emitted gamma rays has been taken into account in the final sample activities. The influence of the different activation parameters has been studied. Na, K, Cl and P contents in biological (red beet) samples have been determined.

  4. Non-essential amino acids play an important role in adaptation of the rat exocrine pancreas to high nitrogen feeding.

    OpenAIRE

    Hara, Hiroshi; Akatsuka, Naoki; Aoyama, Yoritaka

    2001-01-01

    We have previously demonstrated that feeding a diet with a high amino acid (60% AA diet) content, as a mixture simulating casein, induced pancreatic growth and pancreatic protease production in rats. In the present study, we examined the effects of an increasing dietary content of essential amino acids (EAA, x1 - x3 in exp. 1 and x1 - x3.3 in exp. 2) and non-essential amino acids (NEAA, x1 - x3 in exp. 1 and x1 - x5.2 in exp. 2) on pancreatic growth, amylase and protease adaptation using case...

  5. Comparative chemical composition and antimicrobial activity study of essential oils from two imported lemon fruits samples against pathogenic bacteria

    Directory of Open Access Journals (Sweden)

    Najwa Nasser AL-Jabri

    2014-12-01

    Full Text Available The aim of this work was to isolate and identify, by hydro distillation, two essential oils from two imported lemon fruit samples collected from a local supermarket, and to evaluate their antimicrobial activity against pathogenic bacteria through the disc diffusion method. The essential oil was obtained from Turkish and Indian lemon fruit samples by hydro distillation using a Clevenger-type apparatus. Both isolated essential oils were identified by GC–MS and their in vitro antimicrobial activity against pathogenic bacteria was determined through the agar gel method. Twenty-two bioactive ingredients with different percentages were identified, based on GC retention time, from the Turkish and Indian lemons collected from the local supermarket. The predominant bioactive ingredients with high percentages in the Turkish essential oil were dl-limonene (78.92%), α-pinene (5.08%), l-α-terpineol (4.61%), β-myrcene (1.75%), β-pinene (1.47%) and β-linalool (0.95%), and in the Indian essential oil were dl-limonene (53.57%), l-α-terpineol (15.15%), β-pinene (7.44%), α-terpinolene (4.33%), terpinen-4-ol (3.55%), cymene (2.88%) and E-citral (2.38%), respectively. Both essential oils isolated by hydro distillation were used for the study of antimicrobial activity against four pathogenic bacterial strains: Staphylococcus aureus (S. aureus), Escherichia coli (E. coli), Pseudomonas aeruginosa (P. aeruginosa) and Proteus vulgaris (P. vulgaris). Almost all bacterial strains showed no sensitivity to the employed essential oils at the different concentrations. Therefore, the obtained results show that both essential oils need further extensive biological study, including of their mechanism of action.

  6. Glove box adaptation of a high resolution ICP emission spectrometer and its operating experience for analysis of radioactive samples

    International Nuclear Information System (INIS)

    ICP-AES units are commercially available from many well-established companies. These units are compact in design and are not suitable for glove box adaptation. To meet our divisional requirement for an ICP-AES incorporated in a glove box for the analysis of radioactive material, it was decided to keep all electronic and optical components outside the radioactive containment and to place the entire assembly of ICP torch, r.f. coil, nebulizer, spray chamber, peristaltic pump and drainage system inside the glove box. At the same time it was essential to maintain the analytical performance of the spectrometer. From ore to nuclear fuel to reprocessing and disposal, uranium undergoes several different transformations within the nuclear fuel cycle, including concentration, purification, isotope enrichment, metallurgical processing and the recovery of the valuable element plutonium (Pu). The determination of impurities in uranium/plutonium at various stages of the nuclear fuel cycle plays an important role in quality control and in meeting chemical and metallurgical requirements.

  7. Methodological Adaptations for Investigating the Perceptions of Language-Impaired Adolescents Regarding the Relative Importance of Selected Communication Skills

    Science.gov (United States)

    Reed, Vicki A.; Brammall, Helen

    2006-01-01

    This article describes the systematic and detailed processes undertaken to modify a research methodology for use with language-impaired adolescents. The original methodology had been used previously with normally achieving adolescents and speech pathologists to obtain their opinions about the relative importance of selected communication skills…

  8. A Keck Adaptive Optics Survey of a Representative Sample of Gravitationally-Lensed Star-Forming Galaxies: High Spatial Resolution Studies of Kinematics and Metallicity Gradients

    CERN Document Server

    Leethochawalit, Nicha; Ellis, Richard S; Stark, Daniel P; Richard, Johan; Zitrin, Adi; Auger, Matthew

    2015-01-01

    We discuss spatially resolved emission line spectroscopy secured for a total sample of 15 gravitationally lensed star-forming galaxies at a mean redshift of $z\simeq2$ based on Keck laser-assisted adaptive optics observations undertaken with the recently-improved OSIRIS integral field unit (IFU) spectrograph. By exploiting gravitationally lensed sources drawn primarily from the CASSOWARY survey, we sample these sub-L$^{\ast}$ galaxies with source-plane resolutions of a few hundred parsecs ensuring well-sampled 2-D velocity data and resolved variations in the gas-phase metallicity. Such high spatial resolution data offers a critical check on the structural properties of larger samples derived with coarser sampling using multiple-IFU instruments. We demonstrate how serious errors of interpretation can only be revealed through better sampling. Although we include four sources from our earlier work, the present study provides a more representative sample unbiased with respect to emission line strength. Contrary t...

  9. Perceptions of Australian marine protected area managers regarding the role, importance, and achievability of adaptation for managing the risks of climate change

    Directory of Open Access Journals (Sweden)

    Christopher Cvitanovic

    2014-12-01

    Full Text Available The rapid development of adaptation as a mainstream strategy for managing the risks of climate change has led to the emergence of a broad range of adaptation policies and management strategies globally. However, the success of such policies or management interventions depends on the effective integration of new scientific research into the decision-making process. Ineffective communication between scientists and environmental decision makers represents one of the key barriers limiting the integration of science into the decision-making process in many areas of natural resource management. This can be overcome by understanding the perceptions of end users, so as to identify knowledge gaps and develop improved and targeted strategies for communication and engagement. We assessed what one group of environmental decision makers, Australian marine protected area (MPA) managers, viewed as the major risks associated with climate change, and their perceptions regarding the role, importance, and achievability of adaptation for managing these risks. We also assessed what these managers perceived as the role of science in managing the risks from climate change, and identified the factors that increased their trust in scientific information. We do so by quantitatively surveying 30 MPA managers across 3 Australian management agencies. We found that although MPA managers have a very strong awareness of the range and severity of risks posed by climate change, their understanding of adaptation as an option for managing these risks is less comprehensive. We also found that although MPA managers view science as a critical source of information for informing the decision-making process, it should be considered in context with other knowledge types such as community and cultural knowledge, and be impartial, evidence based, and pragmatic in outlining policy and management recommendations that are realistically achievable.

  10. Bottom–up protein identifications from microliter quantities of individual human tear samples. Important steps towards clinical relevance.

    Directory of Open Access Journals (Sweden)

    Peter Raus

    2015-12-01

    With 375 confidently identified proteins in the healthy adult tear, the obtained results are comprehensive and in large agreement with previously published observations on pooled samples of multiple patients. We conclude that, to a limited extent, bottom–up tear protein identifications from individual patients may have clinical relevance.

  11. Testing Set-Point Theory in a Swiss National Sample: Reaction and Adaptation to Major Life Events.

    Science.gov (United States)

    Anusic, Ivana; Yap, Stevie C Y; Lucas, Richard E

    2014-12-01

    Set-point theory posits that individuals react to the experience of major life events, but quickly adapt back to pre-event baseline levels of subjective well-being in the years following the event. A large, nationally representative panel study of Swiss households was used to examine set-point theory by investigating the extent of adaptation following the experience of marriage, childbirth, widowhood, unemployment, and disability. Our results demonstrate that major life events are associated with marked change in life satisfaction and, for some events (e.g., marriage, disability), these changes are relatively long lasting even when accounting for normative, age related change.

  12. A quasi-exclusive European ancestry in the Senepol tropical cattle breed highlights the importance of the slick locus in tropical adaptation.

    Directory of Open Access Journals (Sweden)

    Laurence Flori

    Full Text Available BACKGROUND: The Senepol cattle breed (SEN) was created in the early XXth century from a presumed cross between a European (EUT) breed (Red Poll) and a West African taurine (AFT) breed (N'Dama). Well adapted to tropical conditions, it is also believed trypanotolerant according to its putative AFT ancestry. However, such origins needed to be verified to define relevant husbandry practices, and the genetic background underlying such adaptation needed to be characterized. METHODOLOGY/PRINCIPAL FINDINGS: We genotyped 153 SEN individuals on 47,365 SNPs and combined the resulting data with those available on 18 other populations representative of EUT, AFT and Zebu (ZEB) cattle. We found on average 89% EUT, 10.4% ZEB and 0.6% AFT ancestries in the SEN genome. We further looked for footprints of recent selection using standard tests based on the extent of haplotype homozygosity. We underlined (i) three footprints on chromosome (BTA) 01, two of which are within or close to the polled locus underlying the absence of horns, and (ii) one footprint on BTA20 within the slick hair coat locus, involved in thermotolerance. Annotation of these regions allowed us to propose three candidate genes to explain the observed signals (TIAM1, GRIK1 and RAI14). CONCLUSIONS/SIGNIFICANCE: Our results do not support the accepted concept about the AFT origin of the SEN breed. Initial AFT ancestry (if any) might have been counter-selected in early generations due to breeding objectives oriented in particular toward meat production and a hornless phenotype. Therefore, SEN animals are likely susceptible to African trypanosomes, which questions the importation of SEN within the West African tsetse belt, as promoted by some breeding societies. Besides, our results revealed that the SEN breed is predominantly a EUT breed well adapted to tropical conditions and confirmed the importance in thermotolerance of the slick locus.

  13. Cold shock genes cspA and cspB from Caulobacter crescentus are posttranscriptionally regulated and important for cold adaptation.

    Science.gov (United States)

    Mazzon, Ricardo R; Lang, Elza A S; Silva, Carolina A P T; Marques, Marilis V

    2012-12-01

    Cold shock proteins (CSPs) are nucleic acid binding chaperones, first described as being induced to solve the problem of mRNA stabilization after temperature downshift. Caulobacter crescentus has four CSPs: CspA and CspB, which are cold induced, and CspC and CspD, which are induced only in stationary phase. In this work we have determined that the synthesis of both CspA and CspB reaches the maximum levels early in the acclimation phase. The deletion of cspA causes a decrease in growth at low temperature, whereas the strain with a deletion of cspB has a very subtle and transient cold-related growth phenotype. The cspA cspB double mutant has a slightly more severe phenotype than that of the cspA mutant, suggesting that although CspA may be more important to cold adaptation than CspB, both proteins have a role in this process. Gene expression analyses were carried out using cspA and cspB regulatory fusions to the lacZ reporter gene and showed that both genes are regulated at the transcriptional and posttranscriptional levels. Deletion mapping of the long 5'-untranslated region (5'-UTR) of each gene identified a common region important for cold induction, probably via translation enhancement. In contrast to what was reported for other bacteria, these cold shock genes have no regulatory regions downstream from ATG that are important for cold induction. This work shows that the importance of CspA and CspB to C. crescentus cold adaptation, mechanisms of regulation, and pattern of expression during the acclimation phase apparently differs in many aspects from what has been described so far for other bacteria.

  14. Climate impacts on European agriculture and water management in the context of adaptation and mitigation--the importance of an integrated approach.

    Science.gov (United States)

    Falloon, Pete; Betts, Richard

    2010-11-01

    We review and qualitatively assess the importance of interactions and feedbacks in assessing climate change impacts on water and agriculture in Europe. We focus particularly on the impact of future hydrological changes on agricultural greenhouse gas (GHG) mitigation and adaptation options. Future projected trends in European agriculture include northward movement of crop suitability zones and increasing crop productivity in Northern Europe, but declining productivity and suitability in Southern Europe. This may be accompanied by a widening of water resource differences between the North and South, and an increase in extreme rainfall events and droughts. Changes in future hydrology and water management practices will influence agricultural adaptation measures and alter the effectiveness of agricultural mitigation strategies. These interactions are often highly complex and influenced by a number of factors which are themselves influenced by climate. Mainly positive impacts may be anticipated for Northern Europe, where agricultural adaptation may be shaped by reduced vulnerability of production, increased water supply and reduced water demand. However, increasing flood hazards may present challenges for agriculture, and summer irrigation shortages may result from earlier spring runoff peaks in some regions. Conversely, the need for effective adaptation will be greatest in Southern Europe as a result of increased production vulnerability, reduced water supply and increased demands for irrigation. Increasing flood and drought risks will further contribute to the need for robust management practices. The impacts of future hydrological changes on agricultural mitigation in Europe will depend on the balance between changes in productivity and rates of decomposition and GHG emission, both of which depend on climatic, land and management factors. Small increases in European soil organic carbon (SOC) stocks per unit land area are anticipated considering changes in climate

  15. Mapping transmission risk of Lassa fever in West Africa: the importance of quality control, sampling bias, and error weighting.

    Directory of Open Access Journals (Sweden)

    A Townsend Peterson

    Full Text Available Lassa fever is a disease that has been reported from sites across West Africa; it is caused by an arenavirus that is hosted by the rodent M. natalensis. Although it is confined to West Africa, and has been documented in detail in some well-studied areas, the details of the distribution of risk of Lassa virus infection remain poorly known at the level of the broader region. In this paper, we explored the effects of certainty of diagnosis, oversampling in well-studied region, and error balance on results of mapping exercises. Each of the three factors assessed in this study had clear and consistent influences on model results, overestimating risk in southern, humid zones in West Africa, and underestimating risk in drier and more northern areas. The final, adjusted risk map indicates broad risk areas across much of West Africa. Although risk maps are increasingly easy to develop from disease occurrence data and raster data sets summarizing aspects of environments and landscapes, this process is highly sensitive to issues of data quality, sampling design, and design of analysis, with macrogeographic implications of each of these issues and the potential for misrepresenting real patterns of risk.

  16. Mapping transmission risk of Lassa fever in West Africa: the importance of quality control, sampling bias, and error weighting.

    Science.gov (United States)

    Peterson, A Townsend; Moses, Lina M; Bausch, Daniel G

    2014-01-01

    Lassa fever is a disease that has been reported from sites across West Africa; it is caused by an arenavirus that is hosted by the rodent M. natalensis. Although it is confined to West Africa, and has been documented in detail in some well-studied areas, the details of the distribution of risk of Lassa virus infection remain poorly known at the level of the broader region. In this paper, we explored the effects of certainty of diagnosis, oversampling in well-studied region, and error balance on results of mapping exercises. Each of the three factors assessed in this study had clear and consistent influences on model results, overestimating risk in southern, humid zones in West Africa, and underestimating risk in drier and more northern areas. The final, adjusted risk map indicates broad risk areas across much of West Africa. Although risk maps are increasingly easy to develop from disease occurrence data and raster data sets summarizing aspects of environments and landscapes, this process is highly sensitive to issues of data quality, sampling design, and design of analysis, with macrogeographic implications of each of these issues and the potential for misrepresenting real patterns of risk. PMID:25105746

  17. Mapping Transmission Risk of Lassa Fever in West Africa: The Importance of Quality Control, Sampling Bias, and Error Weighting

    Science.gov (United States)

    Peterson, A. Townsend; Moses, Lina M.; Bausch, Daniel G.

    2014-01-01

    Lassa fever is a disease that has been reported from sites across West Africa; it is caused by an arenavirus that is hosted by the rodent M. natalensis. Although it is confined to West Africa, and has been documented in detail in some well-studied areas, the details of the distribution of risk of Lassa virus infection remain poorly known at the level of the broader region. In this paper, we explored the effects of certainty of diagnosis, oversampling in well-studied region, and error balance on results of mapping exercises. Each of the three factors assessed in this study had clear and consistent influences on model results, overestimating risk in southern, humid zones in West Africa, and underestimating risk in drier and more northern areas. The final, adjusted risk map indicates broad risk areas across much of West Africa. Although risk maps are increasingly easy to develop from disease occurrence data and raster data sets summarizing aspects of environments and landscapes, this process is highly sensitive to issues of data quality, sampling design, and design of analysis, with macrogeographic implications of each of these issues and the potential for misrepresenting real patterns of risk. PMID:25105746

  18. Behavioral Regulation, Visual Spatial Maturity in Kindergarten, and the Relationship of School Adaptation in the First Grade for a Sample of Turkish Children.

    Science.gov (United States)

    Özer, Serap

    2016-04-01

    Behavioral regulation has recently become an important variable in research looking at kindergarten and first-grade achievement of children in private and public schools. The purpose of this study was to examine a measure of behavioral regulation, the Head Toes Knees Shoulders Task, and to evaluate its relationship with visual spatial maturity at the end of kindergarten. Later, in first grade, teachers were asked to rate the children (N = 82) in terms of academic and behavioral adaptation. Behavioral regulation and visual spatial maturity were significantly different between the two school types, but ratings by the teachers in the first grade were affected by children's visual spatial maturity rather than by behavioral regulation. Socioeducational opportunities provided by the two types of schools may be more important to school adaptation than behavioral regulation.

  19. A Keck Adaptive Optics Survey of a Representative Sample of Gravitationally Lensed Star-forming Galaxies: High Spatial Resolution Studies of Kinematics and Metallicity Gradients

    Science.gov (United States)

    Leethochawalit, Nicha; Jones, Tucker A.; Ellis, Richard S.; Stark, Daniel P.; Richard, Johan; Zitrin, Adi; Auger, Matthew

    2016-04-01

    We discuss spatially resolved emission line spectroscopy secured for a total sample of 15 gravitationally lensed star-forming galaxies at a mean redshift of z ≃ 2 based on Keck laser-assisted adaptive optics observations undertaken with the recently improved OSIRIS integral field unit (IFU) spectrograph. By exploiting gravitationally lensed sources drawn primarily from the CASSOWARY survey, we sample these sub-L* galaxies with source-plane resolutions of a few hundred parsecs ensuring well-sampled 2D velocity data and resolved variations in the gas-phase metallicity. Such high spatial resolution data offer a critical check on the structural properties of larger samples derived with coarser sampling using multiple-IFU instruments. We demonstrate how kinematic complexities essential to understanding the maturity of an early star-forming galaxy can often only be revealed with better sampled data. Although we include four sources from our earlier work, the present study provides a more representative sample unbiased with respect to emission line strength. Contrary to earlier suggestions, our data indicate a more diverse range of kinematic and metal gradient behavior inconsistent with a simple picture of well-ordered rotation developing concurrently with established steep metal gradients in all but merging systems. Comparing our observations with the predictions of hydrodynamical simulations suggests that gas and metals have been mixed by outflows or other strong feedback processes, flattening the metal gradients in early star-forming galaxies.

  20. FACE Analysis as a Fast and Reliable Methodology to Monitor the Sulfation and Total Amount of Chondroitin Sulfate in Biological Samples of Clinical Importance

    Directory of Open Access Journals (Sweden)

    Evgenia Karousou

    2014-06-01

    Full Text Available Glycosaminoglycans (GAGs), due to their hydrophilic character and high anionic charge densities, play important roles in various (patho)physiological processes. The identification and quantification of GAGs in biological samples and tissues could be useful prognostic and diagnostic tools in pathological conditions. Despite the noteworthy progress in the development of sensitive and accurate methodologies for the determination of GAGs, there is a significant lack of methodologies regarding sample preparation and reliable fast analysis methods enabling the simultaneous analysis of several biological samples. In this report, developed protocols for the isolation of GAGs in biological samples were applied to analyze various sulfated chondroitin sulfate- and hyaluronan-derived disaccharides using fluorophore-assisted carbohydrate electrophoresis (FACE). Applications to biological samples of clinical importance include blood serum, lens capsule tissue and urine. The sample preparation protocol followed by FACE analysis allows quantification with optimal linearity over the concentration range 1.0–220.0 µg/mL, affording a limit of quantitation of 50 ng of disaccharides. Validation of the FACE results was performed by capillary electrophoresis and high performance liquid chromatography techniques.

  1. The two-sample problem for Poisson processes: adaptive tests with a non-asymptotic wild bootstrap approach

    CERN Document Server

    Reynaud-Bouret, Patricia; Laurent, Béatrice

    2012-01-01

    Considering two independent Poisson processes, we address the question of testing equality of their respective intensities. We construct multiple testing procedures from the aggregation of single tests whose testing statistics come from model selection, thresholding and/or kernel estimation methods. The corresponding critical values are computed through a non-asymptotic wild bootstrap approach. The obtained tests are proved to be exactly of level $\alpha$, and to satisfy non-asymptotic oracle type inequalities. From these oracle type inequalities, we deduce that our tests are adaptive in the minimax sense over a large variety of classes of alternatives based on classical and weak Besov bodies in the univariate case, but also Sobolev and anisotropic Nikol'skii-Besov balls in the multivariate case. A simulation study furthermore shows that they perform strongly in practice.
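
    As a point of reference for the underlying testing problem (and not the authors' aggregated, wild-bootstrap-calibrated procedure), the classical baseline for comparing two homogeneous Poisson intensities conditions on the total count:

```latex
% Two independent homogeneous Poisson processes observed on the same window [0,T],
% with counts N_1, N_2 and intensities \lambda_1, \lambda_2.
% Under H_0: \lambda_1 = \lambda_2, conditioning on the total count gives
\[
  N_1 \mid (N_1 + N_2 = n) \;\sim\; \mathrm{Binomial}\!\left(n, \tfrac{1}{2}\right),
\]
% so a two-sided exact binomial test of p = 1/2 provides a simple level-\alpha benchmark
% against which adaptive procedures can be compared.
```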

  2. Mobile membrane introduction tandem mass spectrometry for on-the-fly measurements and adaptive sampling of VOCs around oil and gas projects in Alberta, Canada

    Science.gov (United States)

    Krogh, E.; Gill, C.; Bell, R.; Davey, N.; Martinsen, M.; Thompson, A.; Simpson, I. J.; Blake, D. R.

    2012-12-01

    The release of hydrocarbons into the environment can have significant environmental and economic consequences. The evolution of smaller, more portable mass spectrometers to the field can provide spatially and temporally resolved information for rapid detection, adaptive sampling and decision support. We have deployed a mobile platform membrane introduction mass spectrometer (MIMS) for the in-field simultaneous measurement of volatile and semi-volatile organic compounds. In this work, we report instrument and data handling advances that produce geographically referenced data in real-time and preliminary data where these improvements have been combined with high precision ultra-trace VOCs analysis to adaptively sample air plumes near oil and gas operations in Alberta, Canada. We have modified a commercially available ion-trap mass spectrometer (Griffin ICX 400) with an in-house temperature controlled capillary hollow fibre polydimethylsiloxane (PDMS) polymer membrane interface and in-line permeation tube flow cell for a continuously infused internal standard. The system is powered by 24 VDC for remote operations in a moving vehicle. Software modifications include the ability to run continuous, interlaced tandem mass spectrometry (MS/MS) experiments for multiple contaminants/internal standards. All data are time and location stamped with on-board GPS and meteorological data to facilitate spatial and temporal data mapping. Tandem MS/MS scans were employed to simultaneously monitor ten volatile and semi-volatile analytes, including benzene, toluene, ethylbenzene and xylene (BTEX), reduced sulfur compounds, halogenated organics and naphthalene. Quantification was achieved by calibrating against a continuously infused deuterated internal standard (toluene-d8). Time referenced MS/MS data were correlated with positional data and processed using Labview and Matlab to produce calibrated, geographical Google Earth data-visualizations that enable adaptive sampling protocols

  3. Field-adapted sampling of whole blood to determine the levels of amodiaquine and its metabolite in children with uncomplicated malaria treated with amodiaquine plus artesunate combination

    Directory of Open Access Journals (Sweden)

    Gustafsson Lars L

    2009-03-01

    Full Text Available Abstract Background Artemisinin combination therapy (ACT) has been widely adopted as first-line treatment for uncomplicated falciparum malaria. In Uganda, amodiaquine plus artesunate (AQ+AS) is the alternative first-line regimen to Coartem® (artemether + lumefantrine) for the treatment of uncomplicated falciparum malaria. Currently, there are few field-adapted analytical techniques for monitoring amodiaquine utilization in patients. This study evaluates the field applicability of a new method to determine amodiaquine and its metabolite concentrations in whole blood dried on filter paper. Methods Twelve patients aged 1.5 to 8 years with uncomplicated malaria received three standard oral doses of AQ+AS. Filter paper blood samples were collected before drug intake and at six different time points over a 28-day period. A new field-adapted sampling procedure and liquid chromatographic method was used for quantitative determination of amodiaquine and its metabolite in whole blood. Results The sampling procedure was successfully applied in the field. Amodiaquine could be quantified for at least three days and the metabolite for up to 28 days. All parasites in all 12 patients cleared within the first three days of treatment and no adverse drug effects were observed. Conclusion The methodology is suitable for field studies. The possibility of determining the concentration of the active metabolite of amodiaquine up to 28 days suggests that the method is sensitive enough to monitor amodiaquine utilization in patients. Amodiaquine plus artesunate seems effective for the treatment of falciparum malaria.

  4. Adaptation and Validation of the Brief Sexual Opinion Survey (SOS) in a Colombian Sample and Factorial Equivalence with the Spanish Version

    Science.gov (United States)

    Sierra, Juan Carlos; Soler, Franklin

    2016-01-01

    Attitudes toward sexuality are a key variable for sexual health. It is really important for psychology and education to have adapted and validated questionnaires to evaluate these attitudes. Therefore, the objective of this research was to adapt, validate and calculate the equivalence of the Colombia Sexual Opinion Survey as compared to the same survey from Spain. To this end, a total of eight experts were consulted and 1,167 subjects from Colombia and Spain answered the Sexual Opinion Survey, the Sexual Assertiveness Scale, the Massachusetts General Hospital-Sexual Functioning Questionnaire, and the Sexuality Scale. The evaluation was conducted online and the results show adequate qualitative and quantitative properties of the items, with adequate reliability and external validity and compliance with the strong invariance between the two countries. Consequently, the Colombia Sexual Opinion Survey is a valid and reliable scale and its scores can be compared with the ones from the Spain survey, with minimum bias. PMID:27627114

  5. Adaptation and Validation of the Brief Sexual Opinion Survey (SOS) in a Colombian Sample and Factorial Equivalence with the Spanish Version.

    Science.gov (United States)

    Vallejo-Medina, Pablo; Marchal-Bertrand, Laurent; Gómez-Lugo, Mayra; Espada, José Pedro; Sierra, Juan Carlos; Soler, Franklin; Morales, Alexandra

    2016-01-01

    Attitudes toward sexuality are a key variable for sexual health. It is really important for psychology and education to have adapted and validated questionnaires to evaluate these attitudes. Therefore, the objective of this research was to adapt, validate and calculate the equivalence of the Colombia Sexual Opinion Survey as compared to the same survey from Spain. To this end, a total of eight experts were consulted and 1,167 subjects from Colombia and Spain answered the Sexual Opinion Survey, the Sexual Assertiveness Scale, the Massachusetts General Hospital-Sexual Functioning Questionnaire, and the Sexuality Scale. The evaluation was conducted online and the results show adequate qualitative and quantitative properties of the items, with adequate reliability and external validity and compliance with the strong invariance between the two countries. Consequently, the Colombia Sexual Opinion Survey is a valid and reliable scale and its scores can be compared with the ones from the Spain survey, with minimum bias. PMID:27627114

  6. Adaptation of the Participant Role Scale (PRS) in a Spanish youth sample: measurement invariance across gender and relationship with sociometric status.

    Science.gov (United States)

    Lucas-Molina, Beatriz; Williamson, Ariel A; Pulido, Rosa; Calderón, Sonsoles

    2014-11-01

    In recent years, bullying research has transitioned from investigating the characteristics of the bully-victim dyad to examining bullying as a group-level process, in which the majority of children play some kind of role. This study used a shortened adaptation of the Participant Role Scale (PRS) to identify these roles in a representative sample of 2,050 Spanish children aged 8 to 13 years. Confirmatory factor analysis revealed three different roles, indicating that the adapted scale remains a reliable way to distinguish the Bully, Defender, and Outsider roles. In addition, measurement invariance of the adapted scale was examined to analyze possible gender differences among the roles. Peer status was assessed separately by gender through two sociometric procedures: the nominations-based method and the ratings-based method. Across genders, children in the Bully role were more often rated as rejected, whereas Defenders were more popular. Results suggest that although the PRS can reveal several different peer roles in the bullying process, a clearer distinction between bullying roles (i.e., Bully, Assistant, and Reinforcer) could better inform strategies for bullying interventions. PMID:24707035

  7. Adaptation of triple axis neutron spectrometer for SANS measurements using alumina samples at TRIGA reactor of Bangladesh

    Science.gov (United States)

    Ahmed, F. U.; Kamal, I.; Yunus, S. M.; Datta, T. K.; Azad, A. K.; Zakaria, A. K. M.; Goyal, P. S.

    2005-09-01

    The double crystal method, known as Bonse and Hart's technique, has been employed to develop a small angle neutron scattering (SANS) facility on a triple axis neutron spectrometer at the TRIGA Mark II (3 MW) research reactor of the Atomic Energy Research Establishment (AERE), Savar, Dhaka, Bangladesh. Two Si(111) crystals with a very small mosaic spread of ∼1 min have been used for this purpose. At an incident neutron wavelength of 1.24 Å, this device is useful for SANS in the Q range between 1.6×10⁻³ and 10⁻¹ Å⁻¹. This Q range allows investigating particle sizes and interparticle correlations on a length scale of ∼200 Å. Results of SANS experiments on three alumina (Al2O3) samples performed using the above setup are presented. It is seen that the Al2O3 particles indeed scatter neutrons in the small-angle region. It is also seen that the scattering is different for different samples, showing that it changes with a change in particle size.
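
    For orientation, the quoted Q range maps onto real-space length scales through the standard small-angle scattering relations; the block below states those textbook relations only and is not specific to this instrument.

```latex
% Elastic scattering at scattering angle 2\theta with neutron wavelength \lambda:
\[
  Q \;=\; \frac{4\pi}{\lambda} \sin\theta ,
  \qquad
  d \;\sim\; \frac{2\pi}{Q} ,
\]
% so the accessible Q window sets the range of real-space distances d that can be probed:
% smaller Q corresponds to larger particle sizes and correlation lengths.
```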

  8. A novel β hyper-plane based importance sampling method

    Institute of Scientific and Technical Information of China (English)

    张峰; 吕震宙; 赵新攀

    2011-01-01

    A novel β hyper-plane based importance sampling method is presented to estimate the failure probability of a structure. Based on the β section, a virtual hyper-plane tangent to the failure surface at the design point, the variable space is separated into the importance region R and the unimportance region S, on which the truncated importance sampling functions hR(x) and hS(x) are established, respectively. The numbers of samples generated from hR(x) and hS(x) depend on the contributions of R and S to the failure probability, and they are determined by iterative simulation. The formulae for the failure probability estimate, its variance and its coefficient of variation are derived for the presented β hyper-plane importance sampling method. The presented method is suitable for failure probability estimation for both a single failure mode and multiple failure modes in parallel. The examples show that, for the same computational accuracy, the presented method requires fewer samples than the traditional importance sampling method.
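
    The estimator underlying such schemes — drawing samples from a biased density h(x) concentrated on the failure-relevant region and reweighting by f(x)/h(x) — is the standard importance sampling construction for a failure probability. The sketch below shows that generic estimator with a design-point-centred Gaussian proposal; it does not implement the paper's β-plane truncation or its iterative allocation of samples between R and S.

```python
import numpy as np

def failure_probability_is(g, design_point, n=10_000, seed=0):
    """Generic importance sampling estimate of P[g(X) <= 0] for X ~ N(0, I).

    g            : limit-state function, g(x) <= 0 means failure
    design_point : approximate most-probable failure point (e.g. from a FORM step)
    The proposal h is a unit-variance Gaussian centred on the design point;
    this is the textbook IS scheme, not the beta-plane truncated variant.
    """
    rng = np.random.default_rng(seed)
    d = len(design_point)
    x = rng.normal(loc=design_point, scale=1.0, size=(n, d))      # samples from h
    log_f = -0.5 * np.sum(x**2, axis=1)                           # log N(0, I), constant dropped
    log_h = -0.5 * np.sum((x - design_point)**2, axis=1)          # same constant cancels
    w = np.exp(log_f - log_h)                                     # importance weights f/h
    indicator = np.array([g(xi) <= 0.0 for xi in x], dtype=float)
    p_hat = np.mean(indicator * w)
    # coefficient of variation of the estimate (guard against a zero estimate)
    cov = np.std(indicator * w, ddof=1) / (np.sqrt(n) * p_hat) if p_hat > 0 else float("inf")
    return p_hat, cov

# Example: linear limit state g(x) = 3 - x[0], whose exact failure probability is Phi(-3)
p, cov = failure_probability_is(lambda x: 3.0 - x[0], design_point=np.array([3.0, 0.0]))
```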

  9. Enhancing the Frequency Adaptability of Periodic Current Controllers with a Fixed Sampling Rate for Grid-Connected Power Converters

    DEFF Research Database (Denmark)

    Yang, Yongheng; Zhou, Keliang; Blaabjerg, Frede

    2016-01-01

    Grid-connected power converters should employ advanced current controllers, e.g., Proportional Resonant (PR) and Repetitive Controllers (RC), in order to produce high-quality feed-in currents that are required to be synchronized with the grid. The synchronization is actually to detect the instantaneous grid information (e.g., frequency and phase of the grid voltage) for the current control, which is commonly performed by a Phase-Locked-Loop (PLL) system. Hence, harmonics and deviations in the estimated frequency by the PLL could lead to current tracking performance degradation, especially... of the resonant controllers and by approximating the fractional delay using a Lagrange interpolating polynomial for the RC, respectively, the frequency-variation-immunity of these periodic current controllers with a fixed sampling rate is improved. Experiments on a single-phase grid-connected system are presented...
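
    The fractional-delay approximation mentioned for the repetitive controller is commonly realised as a short FIR filter whose taps come from a Lagrange interpolating polynomial. The helper below computes those coefficients in the usual way; the sampling rate, grid frequency and filter order in the example are illustrative values, not parameters from the paper.

```python
import numpy as np

def lagrange_fractional_delay(delay, order=3):
    """FIR coefficients h[k], k = 0..order, approximating z^(-delay) for a
    non-integer 'delay' (in samples) via Lagrange interpolation:
        h[k] = prod_{i != k} (delay - i) / (k - i)
    """
    k = np.arange(order + 1)
    h = np.ones(order + 1)
    for i in range(order + 1):
        mask = k != i
        h[mask] *= (delay - i) / (k[mask] - i)
    return h

# Example with hypothetical numbers: 10 kHz sampling, grid drifted to 50.3 Hz
fs, f_grid = 10_000.0, 50.3
N = fs / f_grid                                      # ~198.8 samples per fundamental period
frac = N - int(N)                                    # fractional part of the RC delay line
coeffs = lagrange_fractional_delay(frac, order=3)    # FIR taps approximating z^(-frac)
```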

  10. The relative importance of Staphylococcus saprophyticus as a urinary tract pathogen: distribution of bacteria among urinary samples analysed during 1 year at a major Swedish laboratory.

    Science.gov (United States)

    Eriksson, Andreas; Giske, Christian G; Ternhag, Anders

    2013-01-01

    To determine the distribution of urinary tract pathogens with a focus on Staphylococcus saprophyticus and to analyse the seasonality, antibiotic susceptibility, and gender and age distributions in a large Swedish cohort. S. saprophyticus is considered an important causative agent of urinary tract infection (UTI) in young women, and some earlier studies have reported up to approximately 40% of UTIs in this patient group being caused by S. saprophyticus. We hypothesized that this may be true only in very specific outpatient settings. During the year 2010, 113,720 urine samples were sent for culture to the Karolinska University Hospital, from both clinics in the hospital and from primary care units. Patient age, gender and month of sampling were analysed for S. saprophyticus, Escherichia coli, Klebsiella pneumoniae and Proteus mirabilis. Species data were obtained for 42,633 (37%) of the urine samples. The most common pathogens were E. coli (57.0%), Enterococcus faecalis (6.5%), K. pneumoniae (5.9%), group B streptococci (5.7%), P. mirabilis (3.0%) and S. saprophyticus (1.8%). The majority of subjects with S. saprophyticus were women 15-29 years of age (63.8%). In this age group, S. saprophyticus constituted 12.5% of all urinary tract pathogens. S. saprophyticus is a common urinary tract pathogen in young women, but its relative importance is low compared with E. coli even in this patient group. For women of other ages and for men, growth of S. saprophyticus is quite an uncommon finding.

  11. Computationally efficient video restoration for Nyquist sampled imaging sensors combining an affine-motion-based temporal Kalman filter and adaptive Wiener filter.

    Science.gov (United States)

    Rucci, Michael; Hardie, Russell C; Barnard, Kenneth J

    2014-05-01

    In this paper, we present a computationally efficient video restoration algorithm to address both blur and noise for a Nyquist sampled imaging system. The proposed method utilizes a temporal Kalman filter followed by a correlation-model based spatial adaptive Wiener filter (AWF). The Kalman filter employs an affine background motion model and novel process-noise variance estimate. We also propose and demonstrate a new multidelay temporal Kalman filter designed to more robustly treat local motion. The AWF is a spatial operation that performs deconvolution and adapts to the spatially varying residual noise left in the Kalman filter stage. In image areas where the temporal Kalman filter is able to provide significant noise reduction, the AWF can be aggressive in its deconvolution. In other areas, where less noise reduction is achieved with the Kalman filter, the AWF balances the deconvolution with spatial noise reduction. In this way, the Kalman filter and AWF work together effectively, but without the computational burden of full joint spatiotemporal processing. We also propose a novel hybrid system that combines a temporal Kalman filter and BM3D processing. To illustrate the efficacy of the proposed methods, we test the algorithms on both simulated imagery and video collected with a visible camera.
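
    As a rough illustration of the spatial stage only, the sketch below implements a generic locally adaptive (Lee-type) Wiener filter driven by local statistics; it is not the correlation-model AWF of the paper, and the window size and noise variance are assumptions supplied by the user.

    import numpy as np
    from scipy.ndimage import uniform_filter

    # Locally adaptive Wiener filter: suppress noise where the local variance is
    # close to the noise variance, preserve detail where the signal dominates.
    def adaptive_wiener(image, noise_var, window=5):
        image = np.asarray(image, dtype=float)
        local_mean = uniform_filter(image, window)
        local_sq_mean = uniform_filter(image**2, window)
        local_var = np.maximum(local_sq_mean - local_mean**2, 0.0)
        gain = np.maximum(local_var - noise_var, 0.0) / np.maximum(local_var, 1e-12)
        return local_mean + gain * (image - local_mean)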

  12. Importance of the market portfolio description in the assessment of a sample of Spanish investment funds through the Jensen’s Alpha

    Directory of Open Access Journals (Sweden)

    BELÉN VALLEJO ALONSO

    2003-06-01

    Full Text Available The right assessment of investment fund performance, and of the managers' ability to add value through their management, is an important issue that has received special attention. Among the traditional performance measures, one of the most widely used is Jensen's alpha. However, one of the main problems of evaluation methods that use beta as a risk measure, and hence of Jensen's alpha, is their sensitivity to the market portfolio. In this work we study the importance of the market portfolio description in the assessment of a sample of Spanish investment funds through Jensen's alpha. We analyze the influence of the market portfolio, on the one hand, on the alpha outcomes and, on the other, on the ranking of the funds that the alphas provide.
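
    For reference, the sketch below computes Jensen's alpha of a single fund against a chosen market-portfolio proxy from simple return series; the inputs are illustrative, and the point of the record is precisely that changing the proxy changes both alpha and the resulting fund ranking.

    import numpy as np

    # Jensen's alpha via the usual excess-return regression: the slope is the
    # fund's beta against the chosen market proxy, the intercept is alpha.
    def jensens_alpha(fund_returns, market_returns, risk_free):
        rp = np.asarray(fund_returns) - risk_free    # fund excess returns
        rm = np.asarray(market_returns) - risk_free  # market excess returns
        beta = np.cov(rp, rm, ddof=1)[0, 1] / np.var(rm, ddof=1)
        alpha = rp.mean() - beta * rm.mean()
        return alpha, beta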

  13. Single sample face recognition using adaptively weighted extended local binary pattern

    Institute of Scientific and Technical Information of China (English)

    高涛; 马祥; 白璘

    2012-01-01

    This paper presents a method for face description with a single sample per person based on an adaptively weighted extended local binary pattern (AWELBP). The face image is partitioned into multi-scale sub-blocks and an extended uniform LBP operator is applied to each sub-block, while a local entropy map (LEM) is generated to measure the contribution of each sub-block to the overall texture description of the face. The LBP histogram of each sub-block is adaptively weighted according to this contribution map, and the weighted histograms are concatenated to form the face feature, which preserves identity information and handles partial occlusion. Experiments on the ORL, Yale and Yale B face databases under partial occlusion, expression variation and illumination changes, and comparisons with the traditional algorithm and several improved LBP algorithms, show that the proposed algorithm describes single-sample faces well under these conditions.
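
    To make the block-and-weight idea concrete, the sketch below computes basic 3x3 LBP codes, builds a histogram per sub-block and weights each histogram by the block's entropy as a stand-in for the paper's contribution map; the grid size and the use of plain (non-uniform, non-extended) LBP codes are simplifying assumptions.

    import numpy as np

    # Basic 3x3 LBP codes plus per-block histograms weighted by block entropy,
    # a simplified stand-in for the AWELBP descriptor.
    def lbp_codes(img):
        img = np.asarray(img, dtype=float)
        c = img[1:-1, 1:-1]
        shifts = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
        code = np.zeros(c.shape, dtype=np.uint8)
        for bit, (dy, dx) in enumerate(shifts):
            neigh = img[1 + dy:img.shape[0] - 1 + dy, 1 + dx:img.shape[1] - 1 + dx]
            code |= (neigh >= c).astype(np.uint8) << bit
        return code

    def weighted_block_histograms(img, grid=(4, 4)):
        codes = lbp_codes(img)
        feature = []
        for rows in np.array_split(codes, grid[0], axis=0):
            for block in np.array_split(rows, grid[1], axis=1):
                hist = np.bincount(block.ravel(), minlength=256).astype(float)
                p = hist / max(hist.sum(), 1.0)
                entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))  # block contribution
                feature.append(entropy * p)
        return np.concatenate(feature)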

  14. Adaptive sampling by information maximization

    CERN Document Server

    Machens, C K

    2002-01-01

    The investigation of input-output systems often requires a sophisticated choice of test inputs to make best use of limited experimental time. Here we present an iterative algorithm that continuously adjusts an ensemble of test inputs online, subject to the data already acquired about the system under study. The algorithm focuses the input ensemble by maximizing the mutual information between input and output. We apply the algorithm to simulated neurophysiological experiments and show that it serves to extract the ensemble of stimuli that a given neural system ``expects'' as a result of its natural history.

  15. Ex vivo transcriptional profiling reveals a common set of genes important for the adaptation of Pseudomonas aeruginosa to chronically infected host sites

    NARCIS (Netherlands)

    Bielecki, P.; Komor, U.; Bielecka, A.; Müsken, M.; Puchalka, J.; Pletz, M.W.; Ballmann, M.; Martins Dos Santos, V.A.P.; Weiss, S.; Häussler, S.

    2013-01-01

    The opportunistic bacterium Pseudomonas aeruginosa is a major nosocomial pathogen causing both devastating acute and chronic persistent infections. During the course of an infection, P. aeruginosa rapidly adapts to the specific conditions within the host. In the present study, we aimed at the ident

  16. Face recognition with single training sample per person based on adaptive weighted LBP

    Institute of Scientific and Technical Information of China (English)

    赵汝哲; 房斌; 文静

    2012-01-01

    When facing the problem of face recognition with a single training sample per person, conventional methods suffer a serious drop in recognition rate or even fail to work. To solve this problem, this paper proposes an adaptive weighted local binary pattern (LBP) method that extracts texture information as well as block-based topological information and, more importantly, fuses these features with suitable weights. The facial image is partitioned into blocks and LBP is used to extract texture features; the variance of the sub-images is used to implement the adaptive weighted fusion of the features; and a nearest neighbour classifier is adopted for recognition. Experimental results on the ORL face database show that the method effectively improves the recognition rate.
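
    The sketch below covers the matching side only, assuming variance-derived block weights and a chi-square histogram distance (the particular distance is an assumption; the abstract specifies only variance-based weighting and a nearest neighbour classifier).

    import numpy as np

    # Fuse per-block histogram distances with block weights, then classify a
    # probe against the single gallery sample of each person.
    def weighted_chi2(hist_a, hist_b, weights, eps=1e-12):
        d = (hist_a - hist_b) ** 2 / (hist_a + hist_b + eps)   # (n_blocks, n_bins)
        return float(np.sum(weights[:, None] * d))

    def nearest_neighbour(probe, gallery, labels, weights):
        dists = [weighted_chi2(probe, g, weights) for g in gallery]
        return labels[int(np.argmin(dists))]

    # weights could, e.g., be the normalised per-block variances of the probe image.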

  17. Estimation of number and density, and random distribution testing of important plant species in Ban Pong Forest, Sansai District, Chiang Mai Province, Thailand using T-Square sampling

    Directory of Open Access Journals (Sweden)

    Phahol Sakkatat

    Full Text Available A study using the T-square sampling method was conducted to investigate important plant species in Ban Pong Forest, Sansai district, Chiang Mai province, by estimating their number and density and testing their random distribution. The results showed that there were 14 important plant species, viz. Dipterocarpus tuberculatus Roxb., Shorea obtusa Wall. ex Blume, Bridelia retusa (L.) A. Juss., Derris scandens Benth., Thyrsostachys siamensis, Parinari anamense Hance, Vitex pinnata L.f., Canarium subulatum Guill., Litsea glutinosa C.B. Roxb., Alphonsea glabrifolia Craib, Pueraria mirifica, Vatica stapfiana van Slooten, Walsura robusta Roxb. and Dipterocarpus alatus Roxb. By far, Dipterocarpus tuberculatus Roxb. was the greatest in number and density, and all of the species had a random distribution except Walsura robusta Roxb. and Dipterocarpus alatus Roxb.

  18. The Importance of In Situ Measurements and Sample Return in the Search for Chemical Biosignatures on Mars or other Solar System Bodies (Invited)

    Science.gov (United States)

    Glavin, D. P.; Brinckerhoff, W. B.; Conrad, P. G.; Dworkin, J. P.; Eigenbrode, J. L.; Getty, S.; Mahaffy, P. R.

    2013-12-01

    The search for evidence of life on Mars and elsewhere will continue to be one of the primary goals of NASA's robotic exploration program for decades to come. NASA and ESA are currently planning a series of robotic missions to Mars with the goal of understanding its climate, resources, and potential for harboring past or present life. One key goal will be the search for chemical biomarkers including organic compounds important in life on Earth and their geological forms. These compounds include amino acids, the monomer building blocks of proteins and enzymes, nucleobases and sugars which form the backbone of DNA and RNA, and lipids, the structural components of cell membranes. Many of these organic compounds can also be formed abiotically as demonstrated by their prevalence in carbonaceous meteorites [1], though, their molecular characteristics may distinguish a biological source [2]. It is possible that in situ instruments may reveal such characteristics, however, return of the right samples to Earth (i.e. samples containing chemical biosignatures or having a high probability of biosignature preservation) would enable more intensive laboratory studies using a broad array of powerful instrumentation for bulk characterization, molecular detection, isotopic and enantiomeric compositions, and spatially resolved chemistry that may be required for confirmation of extant or extinct life on Mars or elsewhere. In this presentation we will review the current in situ analytical capabilities and strategies for the detection of organics on the Mars Science Laboratory (MSL) rover using the Sample Analysis at Mars (SAM) instrument suite [3] and discuss how both future advanced in situ instrumentation [4] and laboratory measurements of samples returned from Mars and other targets of astrobiological interest including the icy moons of Jupiter and Saturn will help advance our understanding of chemical biosignatures in the Solar System. References: [1] Cronin, J. R and Chang S. (1993

  19. Sexual Excitation/Sexual Inhibition Inventory (SESII-W/M): Adaptation and Validation Within a Portuguese Sample of Men and Women.

    Science.gov (United States)

    Neves, Cide Filipe; Milhausen, Robin R; Carvalheira, Ana

    2016-08-17

    The SESII-W/M is a self-report measure assessing factors that inhibit and enhance sexual arousal in men and women. The goal of this study was to adapt and validate it in a sample of Portuguese men and women. A total of 1,723 heterosexual men and women participated through a web survey, with ages ranging from 18 to 72 years old (M  = 36.05, SD =  11.93). The levels of internal consistency were considered satisfactory in the first four factors, but not in Setting and Dyadic Elements of the Sexual Interaction. Confirmatory factor analysis partially supported the six-factor, 30-item model, as factor loadings and squared multiple correlations pointed to problems with items mainly loading on those two factors. General fit indices were lower than the ones estimated by Milhausen, Graham, Sanders, Yarber, and Maitland (2010). Psychometric sensitivity and construct validity were adequate and gender differences were consistent with the original study. The six-factor, 30-item model was retained, but changes to the factors Setting and Dyadic Elements of the Sexual Interaction, and their corresponding items, were recommended in order to strengthen the measure. PMID:26548421

  20. Neural Adaptive Sequential Monte Carlo

    OpenAIRE

    Gu, Shixiang; Ghahramani, Zoubin; Turner, Richard E

    2015-01-01

    Sequential Monte Carlo (SMC), or particle filtering, is a popular class of methods for sampling from an intractable target distribution using a sequence of simpler intermediate distributions. Like other importance sampling-based methods, performance is critically dependent on the proposal distribution: a bad proposal can lead to arbitrarily inaccurate estimates of the target distribution. This paper presents a new method for automatically adapting the proposal using an approximation of the Ku...
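
    For context, the sketch below is the plain sequential importance resampling baseline that such adaptive methods build on; the transition, likelihood and initialisation functions are placeholders to be supplied by the user, and no proposal adaptation is performed.

    import numpy as np

    # Bootstrap particle filter: propose from the transition, weight by the
    # likelihood, resample, and accumulate the log marginal likelihood.
    def particle_filter(observations, n_particles, init, transition, loglik, seed=0):
        rng = np.random.default_rng(seed)
        particles = init(rng, n_particles)
        log_evidence = 0.0
        for obs in observations:
            particles = transition(rng, particles)
            logw = loglik(obs, particles)                 # log importance weights
            shifted = np.exp(logw - logw.max())
            log_evidence += logw.max() + np.log(shifted.mean())
            weights = shifted / shifted.sum()
            idx = rng.choice(n_particles, size=n_particles, p=weights)  # resample
            particles = particles[idx]
        return log_evidence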

  1. The R Package MitISEM: Mixture of Student-t Distributions using Importance Sampling Weighted Expectation Maximization for Efficient and Robust Simulation

    NARCIS (Netherlands)

    N. Basturk (Nalan); L.F. Hoogerheide (Lennart); A. Opschoor (Anne); H.K. van Dijk (Herman)

    2012-01-01

    This paper presents the R package MitISEM, which provides an automatic and flexible method to approximate a non-elliptical target density using adaptive mixtures of Student-t densities, where only a kernel of the target density is required. The approximation can be used as a candidate de

  2. Thriving While Engaging in Risk? Examining Trajectories of Adaptive Functioning, Delinquency, and Substance Use in a Nationally Representative Sample of U.S. Adolescents

    Science.gov (United States)

    Warren, Michael T.; Wray-Lake, Laura; Rote, Wendy M.; Shubert, Jennifer

    2016-01-01

    Recent advances in positive youth development theory and research explicate complex associations between adaptive functioning and risk behavior, acknowledging that high levels of both co-occur in the lives of some adolescents. However, evidence on nuanced overlapping developmental trajectories of adaptive functioning and risk has been limited to 1…

  3. Hemoglobina y testosterona: importancia en la aclimatación y adaptación a la altura Hemoglobin and testosterone: importance on high altitude acclimatization and adaptation

    Directory of Open Access Journals (Sweden)

    Gustavo F. Gonzales

    2011-03-01

    Full Text Available The different types of response mechanisms that the organism uses when exposed to hypoxia include accommodation, acclimatization and adaptation. Accommodation is the initial response to acute exposure to high-altitude hypoxia and is characterized by an increase in ventilation and heart rate. Acclimatization is observed in individuals temporarily exposed to high altitude and, to some degree, allows them to tolerate the altitude. In this phase there is an increase in erythropoiesis, the hemoglobin concentration rises and the oxygen transport capacity improves. Adaptation is the process of natural acclimatization in which genetic variation and acclimatization come into play, allowing individuals to live without difficulty at high altitude. Testosterone is a hormone that regulates erythropoiesis and ventilation and could be associated with the processes of acclimatization and adaptation to high altitude. Excessive erythrocytosis, which leads to chronic mountain sickness, is caused by low arterial oxygen saturation, ventilatory inefficiency and a reduced ventilatory response to hypoxia. Testosterone increases during acute exposure to high altitude and in high-altitude natives with excessive erythrocytosis. Current research results suggest that increases in testosterone and hemoglobin are good for acquired acclimatization, since they improve oxygen transport, but not for adaptation to high altitude, given that high serum testosterone values are associated with excessive erythrocytosis.

  4. Evolution of the MIDTAL microarray: the adaption and testing of oligonucleotide 18S and 28S rDNA probes and evaluation of subsequent microarray generations with Prymnesium spp. cultures and field samples.

    Science.gov (United States)

    McCoy, Gary R; Touzet, Nicolas; Fleming, Gerard T A; Raine, Robin

    2015-07-01

    The toxic microalgal species Prymnesium parvum and Prymnesium polylepis are responsible for numerous fish kills causing economic stress on the aquaculture industry and, through the consumption of contaminated shellfish, can potentially impact on human health. Monitoring of toxic phytoplankton is traditionally carried out by light microscopy. However, molecular methods of identification and quantification are becoming more common place. This study documents the optimisation of the novel Microarrays for the Detection of Toxic Algae (MIDTAL) microarray from its initial stages to the final commercial version now available from Microbia Environnement (France). Existing oligonucleotide probes used in whole-cell fluorescent in situ hybridisation (FISH) for Prymnesium species from higher group probes to species-level probes were adapted and tested on the first-generation microarray. The combination and interaction of numerous other probes specific for a whole range of phytoplankton taxa also spotted on the chip surface caused high cross reactivity, resulting in false-positive results on the microarray. The probe sequences were extended for the subsequent second-generation microarray, and further adaptations of the hybridisation protocol and incubation temperatures significantly reduced false-positive readings from the first to the second-generation chip, thereby increasing the specificity of the MIDTAL microarray. Additional refinement of the subsequent third-generation microarray protocols with the addition of a poly-T amino linker to the 5' end of each probe further enhanced the microarray performance but also highlighted the importance of optimising RNA labelling efficiency when testing with natural seawater samples from Killary Harbour, Ireland. PMID:25631743

  5. Low incidence of clonality in cold water corals revealed through the novel use of standardized protocol adapted to deep sea sampling

    Science.gov (United States)

    Becheler, Ronan; Cassone, Anne-Laure; Noel, Philippe; Mouchel, Olivier; Morrison, Cheryl; Arnaud-Haond, Sophie

    2016-01-01

    Sampling in the deep sea is a technical challenge, which has hindered the acquisition of robust datasets that are necessary to determine the fine-grained biological patterns and processes that may shape genetic diversity. Estimates of the extent of clonality in deep-sea species, despite the importance of clonality in shaping the local dynamics and evolutionary trajectories, have been largely obscured by such limitations. Cold-water coral reefs along European margins are formed mainly by two reef-building species, Lophelia pertusa and Madrepora oculata. Here we present a fine-grained analysis of the genotypic and genetic composition of reefs occurring in the Bay of Biscay, based on an innovative deep-sea sampling protocol. This strategy was designed to be standardized, random, and allowed the georeferencing of all sampled colonies. Clonal lineages discriminated through their Multi-Locus Genotypes (MLG) at 6–7 microsatellite markers could thus be mapped to assess the level of clonality and the spatial spread of clonal lineages. High values of clonal richness were observed for both species across all sites suggesting a limited occurrence of clonality, which likely originated through fragmentation. Additionally, spatial autocorrelation analysis underlined the possible occurrence of fine-grained genetic structure in several populations of both L. pertusa and M. oculata. The two cold-water coral species examined had contrasting patterns of connectivity among canyons, with among-canyon genetic structuring detected in M. oculata, whereas L. pertusa was panmictic at the canyon scale. This study exemplifies that a standardized, random and georeferenced sampling strategy, while challenging, can be applied in the deep sea, and associated benefits outlined here include improved estimates of fine grained patterns of clonality and dispersal that are comparable across sites and among species.

   6. Comparison of Adaptive Cluster Samplings for Inventory of Rare Plant Hedysarum scoparium

    Institute of Scientific and Technical Information of China (English)

    朱光玉; 吕勇

    2011-01-01

    Adaptive cluster sampling (ACS) is considered an effective method for sampling rare, clustered populations, and forest vegetation in western China is typically sparse and clustered. Taking the stem density of Hedysarum scoparium at the edge of the Ulanbuh desert as the study object, repeated sampling simulations were carried out for six sampling methods: simple random sampling with replacement, simple random sampling without replacement, ACS based on the modified Hansen-Hurwitz estimator with the initial sample drawn with replacement, ACS based on the modified Hansen-Hurwitz estimator with the initial sample drawn without replacement, ACS based on the modified Horvitz-Thompson estimator with the initial sample drawn with replacement, and ACS based on the modified Horvitz-Thompson estimator with the initial sample drawn without replacement. Comparison of the simulation results shows that ACS based on the Horvitz-Thompson estimator with the initial sample drawn without replacement performs best, with a relative error of the mean estimate of 0.037% and an estimated variance of the mean of 0.03571. The results help to improve the accuracy and efficiency of inventories of rare, clustered forest resources.
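
    The sketch below gives the modified Hansen-Hurwitz estimator used by the HH-based designs, assuming the population's network labels and the indices of the initial simple random sample are given (variable names are illustrative).

    import numpy as np

    # Each initially selected unit contributes the mean of the network it belongs
    # to; edge units count as networks of size one. The variance below is the
    # without-replacement form for an initial simple random sample.
    def hansen_hurwitz_acs(y, network_id, initial_sample):
        y = np.asarray(y, dtype=float)
        network_id = np.asarray(network_id)
        w = np.array([y[np.flatnonzero(network_id == network_id[i])].mean()
                      for i in initial_sample])
        mean_hat = w.mean()
        var_hat = (1 - len(w) / len(y)) * w.var(ddof=1) / len(w)
        return mean_hat, var_hat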

  7. "Como Se Dice HIV?" Adapting Human Immunodeficiency Virus Prevention Messages to Reach Homosexual and Bisexual Hispanic Men: The Importance of Hispanic Cultural and Health Beliefs.

    Science.gov (United States)

    Bowdy, Matthew A.

    HIV/AIDS prevention messages catered to Anglo homosexual/bisexual men are not effective in teaching preventative behaviors to Hispanic homosexual/bisexual men. Hispanic sociocultural traits associated with homosexuality and bisexuality prevent the effectiveness of these messages. The Hispanic family is also extremely important in influencing…

  8. The Importance of Sampling Strategies on AMS Determination of Dykes II. Further Examples from the Kapaa Quarry, Koolau Volcano, Oahu, Hawaii

    Science.gov (United States)

    Mendoza-Borunda, R.; Herrero-Bervera, E.; Canon-Tapia, E.

    2012-12-01

    Recent work has suggested the convenience of dyke sampling along several profiles parallel and perpendicular to its walls to increase the probability of determining a geologically significant magma flow direction using anisotropy of magnetic susceptibility (AMS) measurements. For this work, we have resampled in great detail some dykes from the Kapaa Quarry, Koolau Volcano in Oahu Hawaii, comparing the results of a more detailed sampling scheme with those obtained previously with a traditional sampling scheme. In addition to the AMS results we will show magnetic properties, including magnetic grain sizes, Curie points and AMS measured at two different frequencies on a new MFK1-FA Spinner Kappabridge. Our results thus far provide further empirical evidence supporting the occurrence of a definite cyclic fabric acquisition during the emplacement of at least some of the dykes. This cyclic behavior can be captured using the new sampling scheme, but might be easily overlooked if the simple, more traditional sampling scheme is used. Consequently, previous claims concerning the advantages of adopting a more complex sampling scheme are justified since this approach can serve to reduce the uncertainty in the interpretation of AMS results.

  9. THE IMPORTANCE OF THE STANDARD SAMPLE FOR ACCURATE ESTIMATION OF THE CONCENTRATION OF NET ENERGY FOR LACTATION IN FEEDS ON THE BASIS OF GAS PRODUCED DURING THE INCUBATION OF SAMPLES WITH RUMEN LIQUOR

    Directory of Open Access Journals (Sweden)

    T ŽNIDARŠIČ

    2003-10-01

    Full Text Available The aim of this work was to examine the necessity of using a standard sample in the Hohenheim gas test. Over a three-year period, 24 runs of forage samples were incubated with rumen liquor in vitro. Besides the forage samples, the standard hay sample provided by the Hohenheim University (HFT-99) was included in the experiment. Half of the runs were incubated with rumen liquor from cattle and half with rumen liquor from sheep. The gas produced during the 24 h incubation of the standard sample was measured and compared to the declared value of sample HFT-99. Besides HFT-99, 25 test samples with known digestibility coefficients determined in vivo were included in the experiment. Based on the gas production of HFT-99, it was found that the donor animal (cattle or sheep) did not significantly affect the activity of the rumen liquor (41.4 vs. 42.2 ml of gas per 200 mg dry matter, P>0.1). Differences between years (41.9, 41.2 and 42.3 ml of gas per 200 mg dry matter, P>0.1) were not significant either. However, a variability of about 10% (from 38.9 to 43.7 ml of gas per 200 mg dry matter) was observed between runs. In the present experiment, the gas production of HFT-99 was about 6% lower than the value obtained by the Hohenheim University (41.8 vs. 44.43 ml per 200 mg dry matter), which indicates a systematic error between the laboratories. For the twenty-five test samples, correction on the basis of the standard sample reduced the average difference between the in vitro estimates of net energy for lactation (NEL) and the values determined in vivo. It was concluded that, owing to the variation between runs and the systematic difference in rumen liquor activity between the two laboratories, the results of the Hohenheim gas test have to be corrected on the basis of the standard sample.
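
    The sketch below applies such a correction as a simple multiplicative run-level factor; that form is an assumption, since the abstract does not spell out the exact correction formula.

    # Rescale every test-sample reading of a run by the ratio of the declared to
    # the measured gas production of the standard sample (HFT-99) in that run.
    def correct_gas_production(observed_ml, standard_measured_ml, standard_declared_ml):
        factor = standard_declared_ml / standard_measured_ml
        return [g * factor for g in observed_ml]

    # e.g. a run where HFT-99 gave 41.8 ml instead of the declared 44.43 ml
    corrected = correct_gas_production([38.2, 45.1, 50.3], 41.8, 44.43)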

  10. Importance of covariance components in inversion analyses of densely sampled observed data: an application to waveform data inversion for seismic source processes

    Science.gov (United States)

    Yagi, Yuji; Fukahata, Yukitoshi

    2008-10-01

    Nominally continuous data in space and/or time are obtained in various geophysical observations. Owing to advances in computing, such observed data can now be inverted at very high sampling rates. Densely sampled observed data are usually not completely independent of each other, and this effect must be taken into account. Seismic waveform data, for example, have at least a temporal correlation due to the inelastic attenuation of the Earth. Taking the data covariance into account, we have developed a method of seismic source inversion and applied it to teleseismic P-wave data of the 2003 Boumerdes-Zemmouri, Algeria earthquake. From a comparison of the final slip distributions inverted with and without the covariance components, we found that the effect of the covariance components is crucial for data sets with higher sampling rates (>=5 Hz). If the covariance components are neglected, the inverted results become unstable because the information content of the observed data is overestimated. So far, it has been widely believed that a finer image of seismic source processes can be obtained by inverting waveform data with a higher sampling rate. However, the covariance components of the observed data, which originate from the inelastic behaviour of the Earth, place a limit on the resolution of inverted seismic source models.
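
    To show how a data covariance matrix enters a linear inversion, the sketch below uses generalized least squares by Cholesky whitening with a generic forward matrix G and covariance C; this is the standard construction, not the authors' source-inversion code.

    import numpy as np

    # Ordinary least squares treats every sample as independent; whitening by the
    # Cholesky factor of C down-weights correlated, densely sampled data.
    def gls_inversion(G, d, C=None):
        if C is None:
            return np.linalg.lstsq(G, d, rcond=None)[0]
        L = np.linalg.cholesky(C)
        Gw = np.linalg.solve(L, G)   # L^{-1} G
        dw = np.linalg.solve(L, d)   # L^{-1} d
        return np.linalg.lstsq(Gw, dw, rcond=None)[0]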

  11. A Self-adaptive Sampling Method for Time Difference Method Ultrasonic Flow-meter

    Institute of Scientific and Technical Information of China (English)

    罗永; 王让定; 姚灵

    2012-01-01

    To meet the requirements of high precision and low power consumption in ultrasonic flow meters, a self-adaptive sampling method is proposed to overcome the shortcomings of periodic sampling. The rate of change of the time difference between two adjacent samples is taken as the main indicator for dynamically controlling the sampling period, which is adjusted automatically according to the flow of the fluid. A comparative analysis of experimental data for the self-adaptive and periodic sampling algorithms shows that the self-adaptive sampling method not only significantly improves measurement accuracy when the flow fluctuates, but also reduces system power consumption when the flow is stable.
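
    The sketch below illustrates the control idea with a threshold, scaling factors and bounds chosen purely for illustration; the abstract does not give the actual update rule.

    # Adjust the sampling period from the relative change of the transit-time
    # difference between consecutive measurements: sample faster when the flow
    # fluctuates, more slowly when it is steady, within fixed bounds.
    def next_period(prev_dt, curr_dt, period, t_min=0.05, t_max=2.0,
                    shrink=0.5, grow=1.5, threshold=0.05):
        change = abs(curr_dt - prev_dt) / max(abs(prev_dt), 1e-12)
        period *= shrink if change > threshold else grow
        return min(max(period, t_min), t_max)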

  12. Signal sampling circuit

    NARCIS (Netherlands)

    Louwsma, Simon Minze; Vertregt, Maarten

    2010-01-01

    A sampling circuit for sampling a signal is disclosed. The sampling circuit comprises a plurality of sampling channels adapted to sample the signal in time-multiplexed fashion, each sampling channel comprising a respective track-and-hold circuit connected to a respective analogue to digital converte

  13. Signal sampling circuit

    NARCIS (Netherlands)

    Louwsma, Simon Minze; Vertregt, Maarten

    2011-01-01

    A sampling circuit for sampling a signal is disclosed. The sampling circuit comprises a plurality of sampling channels adapted to sample the signal in time-multiplexed fashion, each sampling channel comprising a respective track-and-hold circuit connected to a respective analogue to digital converte

  14. Importance of the market portfolio description in the assessment of a sample of Spanish investment funds through the Jensen’s Alpha

    OpenAIRE

    BELÉN VALLEJO ALONSO

    2003-01-01

    The right assessment of the investment funds performance and of the manager’s ability to add value with their management is an important aspect to which has been paid special attention. Among the traditional performance measures, one of the most used is the Jensen’s alpha. However, one of the main problems of the evaluation methods using the beta as a risk measure and, hence of the Jensen’s alpha, is their sensibility to the market portfolio. In this work we aim to study the importance of the...

  15. Global dust attenuation in disc galaxies: strong variation with specific star formation and stellar mass, and the importance of sample selection

    Science.gov (United States)

    Devour, Brian M.; Bell, Eric F.

    2016-06-01

    We study the relative dust attenuation-inclination relation in 78 721 nearby galaxies using the axis ratio dependence of optical-near-IR colour, as measured by the Sloan Digital Sky Survey, the Two Micron All Sky Survey, and the Wide-field Infrared Survey Explorer. In order to avoid to the greatest extent possible attenuation-driven biases, we carefully select galaxies using dust attenuation-independent near- and mid-IR luminosities and colours. Relative u-band attenuation between face-on and edge-on disc galaxies along the star-forming main sequence varies from ˜0.55 mag up to ˜1.55 mag. The strength of the relative attenuation varies strongly with both specific star formation rate and galaxy luminosity (or stellar mass). The dependence of relative attenuation on luminosity is not monotonic, but rather peaks at M3.4 μm ≈ -21.5, corresponding to M* ≈ 3 × 1010 M⊙. This behaviour stands seemingly in contrast to some older studies; we show that older works failed to reliably probe to higher luminosities, and were insensitive to the decrease in attenuation with increasing luminosity for the brightest star-forming discs. Back-of-the-envelope scaling relations predict the strong variation of dust optical depth with specific star formation rate and stellar mass. More in-depth comparisons using the scaling relations to model the relative attenuation require the inclusion of star-dust geometry to reproduce the details of these variations (especially at high luminosities), highlighting the importance of these geometrical effects.

  16. Adaptation and Psychometric Properties of the Self-Efficacy/Social Support for Activity for Persons with Intellectual Disability Scale (SE/SS-AID) in a Spanish Sample

    Science.gov (United States)

    Cuesta-Vargas, Antonio Ignacio; Paz-Lourido, Berta; Lee, Miyoung; Peterson-Besse, Jana J.

    2013-01-01

    Background: In this study we aimed to develop a Spanish version of the Self-Efficacy/Social Support Scales for Activity for persons with Intellectual Disability (SE/SS-AID). Method: A cross-sectional study was carried out in a sample of 117 individuals with intellectual disability (ID). The SE/SS-AID scales were translated into Spanish and their…

  17. Technology transfer for adaptation

    Science.gov (United States)

    Biagini, Bonizella; Kuhl, Laura; Gallagher, Kelly Sims; Ortiz, Claudia

    2014-09-01

    Technology alone will not be able to solve adaptation challenges, but it is likely to play an important role. As a result of the role of technology in adaptation and the importance of international collaboration for climate change, technology transfer for adaptation is a critical but understudied issue. Through an analysis of Global Environment Facility-managed adaptation projects, we find there is significantly more technology transfer occurring in adaptation projects than might be expected given the pessimistic rhetoric surrounding technology transfer for adaptation. Most projects focused on demonstration and early deployment/niche formation for existing technologies rather than earlier stages of innovation, which is understandable considering the pilot nature of the projects. Key challenges for the transfer process, including technology selection and appropriateness under climate change, markets and access to technology, and diffusion strategies are discussed in more detail.

  18. Comparison Between Adaptive Cluster Sampling and Simple Random Sampling: a Case Study of Coastal Bruguiera gymnorrhiza

    Institute of Scientific and Technical Information of China (English)

    史京京; 雷渊才; 赵天忠

    2011-01-01

    A study was conducted to compare a new unequal-probability sampling method, adaptive cluster sampling (ACS), with a conventional method, simple random sampling (SRS), using field data for a coastal mangrove species (Bruguiera gymnorrhiza). Results of the Hansen-Hurwitz (HH) and Horvitz-Thompson (HT) estimators in adaptive cluster sampling were analysed for various areas of the initial sampling units, initial sample sizes and criterion values. The results show that, compared with SRS, the sampling variance for ACS is usually lower. For a given sampling unit area and initial sample size, the estimated variances for HT and HH increase with the criterion value; for a given initial sample size and criterion value, they are inversely proportional to the area of the sampling unit. The estimated variance for HT is lower than that for HH when the sampling unit area is small, while the two become similar as the sampling unit area increases.
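
    The sketch below gives the modified Horvitz-Thompson estimator compared in the study, assuming the distinct networks intersected by the initial sample are known together with their totals and sizes (edge units are ignored, as in the standard modified estimator).

    from math import comb

    # Each intersected network contributes its total divided by the probability
    # that an initial simple random sample of size n from N units intersects it.
    def horvitz_thompson_acs(network_totals, network_sizes, N, n):
        total = 0.0
        for y_star, x_k in zip(network_totals, network_sizes):
            pi_k = 1.0 - comb(N - x_k, n) / comb(N, n)
            total += y_star / pi_k
        return total / N   # estimated population mean per unit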

  19. Adaptive Lighting

    OpenAIRE

    Petersen, Kjell Yngve; Søndergaard, Karin; Kongshaug, Jesper

    2015-01-01

    Adaptive lighting is based on a partial automation of the possibilities to adjust the colour tone and brightness levels of light in order to adapt to people’s needs and desires. IT support is key to the technical developments that afford adaptive control systems. The possibilities offered by adaptive lighting control are created by the ways that the system components, the network and data flow can be coordinated through software so that the dynamic variations are controlled i...

  20. Multi-target tracking algorithm based on adaptive sampling interval in wireless sensor networks

    Institute of Scientific and Technical Information of China (English)

    王建平; 赵高丽; 胡孟杰; 陈伟

    2014-01-01

    Multi-target tracking is a hot topic in current research on wireless sensor networks (WSN). We propose a multi-target tracking algorithm based on an adaptive sampling interval in order to save energy and prevent tracking loss in WSNs. We model the target motion using position metadata and predict the target state with an extended Kalman filter (EKF). The probability density function (PDF) of the estimated targets is used to establish the tracking cluster. By defining the tracking centre, the Mahalanobis distance is used to quantify the election of the main node (MN). The influence strength of each target is computed from its importance and its distance to the MN node, and the adaptive-interval tracking algorithm is built on this quantity. Simulation experiments in MATLAB show that the proposed algorithm accurately predicts target trajectories and adjusts the sampling interval as the targets move. Analysis of the experimental data shows that the algorithm improves tracking precision and clearly reduces the energy consumption of the WSN.
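
    The sketch below shows the state predictor such a tracker relies on: one predict/update cycle of a constant-velocity Kalman filter, where dt is the current (possibly adaptive) sampling interval; the noise levels are illustrative, and the full EKF, clustering and node-election logic of the paper are not reproduced.

    import numpy as np

    # One predict/update step of a 1-D constant-velocity Kalman filter with a
    # position-only measurement z and sampling interval dt.
    def kalman_cv_step(x, P, z, dt, q=0.1, r=1.0):
        F = np.array([[1.0, dt], [0.0, 1.0]])
        H = np.array([[1.0, 0.0]])
        Q = q * np.array([[dt**3 / 3, dt**2 / 2], [dt**2 / 2, dt]])
        R = np.array([[r]])
        x = F @ x                       # predict
        P = F @ P @ F.T + Q
        S = H @ P @ H.T + R             # update
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (np.atleast_1d(z) - H @ x)
        P = (np.eye(2) - K @ H) @ P
        return x, P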

  1. Validation of a simplified field-adapted procedure for routine determinations of methyl mercury at trace levels in natural water samples using species-specific isotope dilution mass spectrometry.

    Science.gov (United States)

    Lambertsson, Lars; Björn, Erik

    2004-12-01

    A field-adapted procedure based on species-specific isotope dilution (SSID) methodology for trace-level determinations of methyl mercury (CH(3)Hg(+)) in mire, fresh and sea water samples was developed, validated and applied in a field study. In the field study, mire water samples were filtered, standardised volumetrically with isotopically enriched CH(3) (200)Hg(+), and frozen on dry ice. The samples were derivatised in the laboratory without further pre-treatment using sodium tetraethyl borate (NaB(C(2)H(5))(4)) and the ethylated methyl mercury was purge-trapped on Tenax columns. The analyte was thermo-desorbed onto a GC-ICP-MS system for analysis. Investigations preceding field application of the method showed that when using SSID, for all tested matrices, identical results were obtained between samples that were freeze-preserved or analysed unpreserved. For DOC-rich samples (mire water) additional experiments showed no difference in CH(3)Hg(+) concentration between samples that were derivatised without pre-treatment or after liquid extraction. Extractions of samples for matrix-analyte separation prior to derivatisation are therefore not necessary. No formation of CH(3)Hg(+) was observed during sample storage and treatment when spiking samples with (198)Hg(2+). Total uncertainty budgets for the field application of the method showed that for analyte concentrations higher than 1.5 pg g(-1) (as Hg) the relative expanded uncertainty (REU) was approximately 5% and dominated by the uncertainty in the isotope standard concentration. Below 0.5 pg g(-1) (as Hg), the REU was >10% and dominated by variations in the field blank. The uncertainty of the method is sufficiently low to accurately determine CH(3)Hg(+) concentrations at trace levels. The detection limit was determined to be 4 fg g(-1) (as Hg) based on replicate analyses of laboratory blanks. The described procedure is reliable, considerably faster and simplified compared to non-SSID methods and thereby very

  2. Robust Adaptive Photon Tracing using Photon Path Visibility

    DEFF Research Database (Denmark)

    Hachisuka, Toshiya; Jensen, Henrik Wann

    2011-01-01

    algorithm is the use of visibility of photon path as the importance function which ensures that our sampling algorithm focuses on paths that are visible from the given viewpoint. Our sampling algorithm builds on two recent developments in Markov chain Monte Carlo methods: adaptive Markov chain sampling...... and replica exchange. Using these techniques, each photon path is adaptively mutated and it explores the sampling space efficiently without being stuck at a local peak of the importance function. We have implemented this sampling approach in the progressive photon mapping algorithm which provides visibility...... information in a natural way when a photon path contributes to a measurement point. We demonstrate that the final algorithm is strikingly simple, yet effective at sampling photons under lighting conditions that would be difficult for existing Monte Carlo ray tracing-based methods....

  3. Origins of adaptive immunity.

    Science.gov (United States)

    Liongue, Clifford; John, Liza B; Ward, Alister

    2011-01-01

    Adaptive immunity, involving distinctive antibody- and cell-mediated responses to specific antigens based on "memory" of previous exposure, is a hallmark of higher vertebrates. It has been argued that adaptive immunity arose rapidly, as articulated in the "big bang theory" surrounding its origins, which stresses the importance of coincident whole-genome duplications. Through a close examination of the key molecules and molecular processes underpinning adaptive immunity, this review suggests a less-extreme model, in which adaptive immunity emerged as part of longer evolutionary journey. Clearly, whole-genome duplications provided additional raw genetic materials that were vital to the emergence of adaptive immunity, but a variety of other genetic events were also required to generate some of the key molecules, whereas others were preexisting and simply co-opted into adaptive immunity.

  4. Adaptive Rationality, Adaptive Behavior and Institutions

    Directory of Open Access Journals (Sweden)

    Volchik Vyacheslav, V.

    2015-12-01

    Full Text Available The economic literature focused on understanding decision-making and choice processes reveals a vast collection of approaches to human rationality. Theorists’ attention has moved from absolutely rational, utility-maximizing individuals to boundedly rational and adaptive ones. A number of economists have criticized the concepts of adaptive rationality and adaptive behavior. One of the recent trends in the economic literature is to consider humans irrational. This paper offers an approach which examines adaptive behavior in the context of existing institutions and a constantly changing institutional environment. It is assumed that adaptive behavior is a process of evolutionary adjustment to fundamental uncertainty. We emphasize the importance of actors’ engagement in trial and error learning, since if they are involved in this process, they obtain experience and are able to adapt to existing and new institutions. The paper aims at identifying relevant institutions, adaptive mechanisms, informal working rules and practices that influence actors’ behavior in the field of Higher Education in Russia (the Rostov Region education services market is taken as an example). The paper emphasizes the application of qualitative interpretative methods (interviews and discourse analysis) in examining actors’ behavior.

  5. Adaptive Lighting

    DEFF Research Database (Denmark)

    Petersen, Kjell Yngve; Søndergaard, Karin; Kongshaug, Jesper

    2015-01-01

    Adaptive Lighting Adaptive lighting is based on a partial automation of the possibilities to adjust the colour tone and brightness levels of light in order to adapt to people’s needs and desires. IT support is key to the technical developments that afford adaptive control systems. The possibilities...... offered by adaptive lighting control are created by the ways that the system components, the network and data flow can be coordinated through software so that the dynamic variations are controlled in ways that meaningfully adapt according to people’s situations and design intentions. This book discusses...... distributed differently into an architectural body. We also examine what might occur when light is dynamic and able to change colour, intensity and direction, and when it is adaptive and can be brought into interaction with its surroundings. In short, what happens to an architectural space when artificial...

  6. From equivalence to adaptation

    Directory of Open Access Journals (Sweden)

    Paulina Borowczyk

    2009-01-01

    Full Text Available The aim of this paper is to illustrate in which cases the translators use the adaptation when they are confronted with a term related to sociocultural aspects. We will discuss the notions of equivalence and adaptation and their limits in the translation. Some samples from Arte TV news and from the American film Shrek translated into Polish, German and French will be provided as a support for this article.

  7. Measuring the dimensions of adaptive capacity: a psychometric approach

    Directory of Open Access Journals (Sweden)

    Michael Lockwood

    2015-03-01

    Full Text Available Although previous studies have examined adaptive capacity using a range of self-assessment procedures, no objective self-report approaches have been used to identify the dimensions of adaptive capacity and their relative importance. We examine the content, structure, and relative importance of dimensions of adaptive capacity as perceived by rural landholders in an agricultural landscape in South-Eastern Australia. Our findings indicate that the most important dimensions influencing perceived landholder adaptive capacity are related to their management style, particularly their change orientation. Other important dimensions are individual financial capacity, labor availability, and the capacity of communities and local networks to support landholders' management practices. Trust and confidence in government with respect to native vegetation management was not found to be a significant dimension of perceived adaptive capacity. The scale items presented, particularly those with high factor loadings, provide a solid foundation for assessment of adaptive capacity in other study areas, as well as exploration of relationships between the individual dimensions of adaptive capacity and dependent variables such as perceived resilience. Further work is needed to refine the scale items and compare the findings from this case study with those from other contexts and population samples.

  8. Adaptive Lighting

    DEFF Research Database (Denmark)

    Petersen, Kjell Yngve; Søndergaard, Karin; Kongshaug, Jesper

    2015-01-01

    Adaptive Lighting Adaptive lighting is based on a partial automation of the possibilities to adjust the colour tone and brightness levels of light in order to adapt to people’s needs and desires. IT support is key to the technical developments that afford adaptive control systems. The possibilities...... offered by adaptive lighting control are created by the ways that the system components, the network and data flow can be coordinated through software so that the dynamic variations are controlled in ways that meaningfully adapt according to people’s situations and design intentions. This book discusses...... the investigations of lighting scenarios carried out in two test installations: White Cube and White Box. The test installations are discussed as large-scale experiential instruments. In these test installations we examine what could potentially occur when light using LED technology is integrated and...

  9. Adaptation: A Partially Automated Approach

    OpenAIRE

    Manjing, Tham; Bukhsh, F.A.; Weigand, H.

    2014-01-01

    This paper showcases the possibility of creating an adaptive auditing system. Adaptation in an audit environment needs human intervention at some point. Based on a case study, this paper focuses on automation of the adaptation process. It is divided into solution design and validation parts. The artefact design is developed around the import procedures of M-company. An overview of the artefact is discussed in detail to fully describe the adaptation mechanism with automatic adjustment for compliance re...

  10. Appraising Adaptive Management

    Directory of Open Access Journals (Sweden)

    Kai N. Lee

    1999-12-01

    Full Text Available Adaptive management is appraised as a policy implementation approach by examining its conceptual, technical, equity, and practical strengths and limitations. Three conclusions are drawn: (1) Adaptive management has been more influential, so far, as an idea than as a practical means of gaining insight into the behavior of ecosystems utilized and inhabited by humans. (2) Adaptive management should be used only after disputing parties have agreed to an agenda of questions to be answered using the adaptive approach; this is not how the approach has been used. (3) Efficient, effective social learning, of the kind facilitated by adaptive management, is likely to be of strategic importance in governing ecosystems as humanity searches for a sustainable economy.

  11. Adaptive regularization

    DEFF Research Database (Denmark)

    Hansen, Lars Kai; Rasmussen, Carl Edward; Svarer, C.;

    1994-01-01

    Regularization, e.g., in the form of weight decay, is important for training and optimization of neural network architectures. In this work the authors provide a tool based on asymptotic sampling theory, for iterative estimation of weight decay parameters. The basic idea is to do a gradient descent...... in the estimated generalization error with respect to the regularization parameters. The scheme is implemented in the authors' Designer Net framework for network training and pruning, i.e., is based on the diagonal Hessian approximation. The scheme does not require essential computational overhead in addition...
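
    The sketch below captures the underlying idea, substituting a held-out validation error for the asymptotic generalization-error estimate and a finite-difference gradient for the Hessian-based one; both substitutions are simplifying assumptions made for brevity.

    import numpy as np

    # Tune a single weight-decay parameter by gradient descent on a validation
    # error, using a closed-form ridge solution at each step.
    def tune_weight_decay(Xtr, ytr, Xval, yval, log_lam=0.0, lr=0.5, steps=50):
        def val_err(log_l):
            lam = np.exp(log_l)
            w = np.linalg.solve(Xtr.T @ Xtr + lam * np.eye(Xtr.shape[1]), Xtr.T @ ytr)
            return np.mean((Xval @ w - yval) ** 2)
        eps = 1e-3
        for _ in range(steps):
            grad = (val_err(log_lam + eps) - val_err(log_lam - eps)) / (2 * eps)
            log_lam -= lr * grad
        return np.exp(log_lam)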

  12. Adaptive Computing.

    Science.gov (United States)

    Harrell, William

    1999-01-01

    Provides information on various adaptive technology resources available to people with disabilities. (Contains 19 references, an annotated list of 129 websites, and 12 additional print resources.) (JOW)

  13. ADAPT Dataset

    Data.gov (United States)

    National Aeronautics and Space Administration — Advanced Diagnostics and Prognostics Testbed (ADAPT). Project lead: Scott Poll. Subject: fault diagnosis in electrical power systems. Description: The Advanced...

  14. Signal sampling circuit

    OpenAIRE

    Louwsma, Simon Minze; Vertregt, Maarten

    2011-01-01

    A sampling circuit for sampling a signal is disclosed. The sampling circuit comprises a plurality of sampling channels adapted to sample the signal in time-multiplexed fashion, each sampling channel comprising a respective track-and-hold circuit connected to a respective analogue to digital converter via a respective output switch. The output switch of each channel opens for a tracking time period when the track-and-hold circuit is in a tracking mode for sampling the signal, and closes for a ...

  15. A Double Kriging Model Method Based on Optimization Sample Points for Importance Measure Analysis

    Institute of Scientific and Technical Information of China (English)

    李大伟; 吕震宙; 张磊刚

    2014-01-01

    For engineering problems involving implicit limit state functions, a double Kriging model method based on optimized sample points for importance measure analysis is discussed in this paper. Firstly, a small number of initial sample points are used to build a Kriging surrogate model relating the basic variables to the response. Then, points where the Kriging prediction is relatively uncertain are found by global optimization and added to the sample points, so that the Kriging surrogate model achieves good accuracy with a minimum number of sample points. The relationships between the basic variables and the response function, and between the basic variables and the conditional probability of failure, are replaced by Kriging models, so the computational cost of the moment-independent importance measure of the basic variables on the failure probability is reduced substantially while accuracy is maintained. Numerical and engineering examples illustrate the applicability and feasibility of the method.
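
    The sketch below mirrors the enrichment loop described above, using a generic Gaussian-process surrogate and a finite candidate pool scanned for the largest predictive uncertainty; the library, the candidate-pool construction and the fixed number of added points are illustrative assumptions, not the paper's optimization scheme.

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor

    # Refit the surrogate and add the candidate where the predictive standard
    # deviation is largest, so accuracy is reached with few limit-state calls.
    def build_surrogate(g, x_init, candidates, n_add=20):
        X = np.atleast_2d(x_init)
        y = np.array([g(x) for x in X])
        gp = GaussianProcessRegressor(normalize_y=True)
        for _ in range(n_add):
            gp.fit(X, y)
            _, std = gp.predict(candidates, return_std=True)
            x_new = candidates[np.argmax(std)]   # most uncertain prediction
            X = np.vstack([X, x_new])
            y = np.append(y, g(x_new))
        return gp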

  16. An importance sampling procedure for estimating failure probabilities of non-linear dynamic systems

    Institute of Scientific and Technical Information of China (English)

    任丽梅; 徐伟; 李战国

    2013-01-01

    The failure probability is one of the most important reliability measures in the structural reliability assessment of dynamical systems. Here, a procedure for estimating failure probabilities of non-linear systems based on importance sampling is presented. Firstly, by using the Rice formula, an equivalent linearized version of the non-linear system with the same mean up-crossing rate is derived. From this equivalent linear equation, an analytical expression for the design point is obtained and used to construct the control function. Secondly, importance sampling with this control function is used to estimate the first excursion probability of the non-linear system. Finally, a Duffing oscillator is taken as an example. The simulation results show that the proposed method is correct and effective: the number of samples and the computational time are reduced significantly compared with direct Monte Carlo simulation.

  17. Adaptively Sharing Time-Series with Differential Privacy

    CERN Document Server

    Fan, Liyue

    2012-01-01

    Sharing real-time aggregate statistics of private data has given much benefit to the public to perform data mining for understanding important phenomena, such as Influenza outbreaks and traffic congestions. We propose an adaptive approach with sampling and estimation to release aggregated time series under differential privacy, the key innovation of which is that we utilize feedback loops based on observed (perturbed) values to dynamically adjust the estimation model as well as the sampling rate. To minimize the overall privacy cost, our solution uses the PID controller to adaptively sample long time-series according to detected data dynamics. To improve the accuracy of data release per timestamp, the Kalman filter is used to predict data values at non-sampling points and to estimate true values from perturbed query answers at sampling points. Our experiments with three real data sets show that it is beneficial to incorporate feedback into both the estimation model and the sampling process. The results confir...
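
    The sketch below keeps only the budget-splitting idea, with a fixed sampling interval and last-value-hold estimation standing in for the paper's PID-controlled sampling and Kalman-filter estimation, neither of which is reproduced here.

    import numpy as np

    # Spend the privacy budget only at sampled timestamps (Laplace mechanism,
    # sequential composition) and repeat the last released value in between.
    def release_series(values, epsilon, sample_every=5, sensitivity=1.0, seed=0):
        rng = np.random.default_rng(seed)
        n_samples = int(np.ceil(len(values) / sample_every))
        eps_per_sample = epsilon / n_samples
        released, last = [], 0.0
        for t, v in enumerate(values):
            if t % sample_every == 0:
                last = v + rng.laplace(scale=sensitivity / eps_per_sample)
            released.append(last)
        return released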

  18. Adaptations, exaptations, and spandrels.

    Science.gov (United States)

    Buss, D M; Haselton, M G; Shackelford, T K; Bleske, A L; Wakefield, J C

    1998-05-01

    Adaptation and natural selection are central concepts in the emerging science of evolutionary psychology. Natural selection is the only known causal process capable of producing complex functional organic mechanisms. These adaptations, along with their incidental by-products and a residue of noise, comprise all forms of life. Recently, S. J. Gould (1991) proposed that exaptations and spandrels may be more important than adaptations for evolutionary psychology. These refer to features that did not originally arise for their current use but rather were co-opted for new purposes. He suggested that many important phenomena--such as art, language, commerce, and war--although evolutionary in origin, are incidental spandrels of the large human brain. The authors outline the conceptual and evidentiary standards that apply to adaptations, exaptations, and spandrels and discuss the relative utility of these concepts for psychological science. PMID:9612136

  19. ADAPTATION DEBATE, AHMET VEFİK PAŞA AND ZORAKİ TABİB SAMPLE / ADAPTASYON MESELESİ, AHMET VEFİK PAŞA VE ZORAKİ TABİB ÖRNEĞİ

    Directory of Open Access Journals (Sweden)

    Dr. Bayram YILDIZ

    2007-08-01

    Full Text Available Adaptation, an element of theatre in Turkish literature beginning with the Tanzimat, was opposed on the grounds that it would have negative effects on the development of a national theatre and on the structure of Turkish society. On the other hand, adaptation was supported because it would benefit the development of modern theatre. Though the common view was against adaptation, Ahmet Vefik Pasa's adaptations of Moliere found acceptance. This acceptance may be due to his preference for comedy and for its best performer in world literature, Moliere, and to his adaptation of plays consistent with Turkish social structure and moral virtues, as in the case of Zoraki Tabip.

  20. Calibration Transfer without Standards for Spectral Analysis Based on Stability Competitive Adaptive Reweighted Sampling

    Institute of Scientific and Technical Information of China (English)

    张晓羽; 李庆波; 张广军

    2014-01-01

    A novel calibration transfer method based on stability competitive adaptive reweighted sampling (SCARS) is proposed. An informativeness criterion, the stability index, defined as the absolute value of a regression coefficient divided by its standard deviation, is used together with the root mean squared error of prediction (RMSEP) after transfer. Wavelength variables that are important and insensitive to changes in measurement parameters are selected, so that differences in the responses of different instruments or measurement conditions to a given sample are eliminated or reduced, improving the calibration transfer results. Moreover, the spectral variables are compressed, making the calibration transfer more stable. The method was evaluated on calibration transfer for NIR analysis of corn measured on different NIR spectrometers. The results show that it corrects the differences between instruments well and improves analytical accuracy. Compared with orthogonal signal correction (OSC), Monte Carlo uninformative variable elimination (MCUVE) and competitive adaptive reweighted sampling (CARS), the proposed method gave the best analytical accuracy and effectively compressed the spectroscopic data, which simplifies and optimizes the transfer process.
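
    The core of the selection step described above is the stability index: the absolute value of a regression coefficient divided by its standard deviation across repeated subsamples. A minimal sketch of that criterion is given below; it uses a plain ridge regression on synthetic data rather than the PLS models and the reweighted sampling loop of the actual SCARS procedure, and all data and parameter values are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

def stability_index(X, y, n_runs=50, frac=0.8, ridge=1e-3):
    """Hypothetical sketch of the stability index used for variable selection:
    repeatedly fit a (ridge-regularised) linear model on random subsets of the
    calibration samples and score each wavelength by |mean(b)| / std(b)."""
    n, p = X.shape
    coefs = np.empty((n_runs, p))
    for r in range(n_runs):
        idx = rng.choice(n, size=int(frac * n), replace=False)
        Xs, ys = X[idx], y[idx]
        # ridge solution b = (X'X + lambda*I)^-1 X'y
        coefs[r] = np.linalg.solve(Xs.T @ Xs + ridge * np.eye(p), Xs.T @ ys)
    return np.abs(coefs.mean(axis=0)) / (coefs.std(axis=0) + 1e-12)

# toy NIR-like data: 60 samples x 120 "wavelengths", 5 informative variables
X = rng.normal(size=(60, 120))
true_b = np.zeros(120)
true_b[[10, 30, 55, 80, 100]] = [2, -1.5, 1, 3, -2]
y = X @ true_b + 0.1 * rng.normal(size=60)

scores = stability_index(X, y)
selected = np.argsort(scores)[-10:]          # keep the 10 most stable wavelengths
print("selected wavelengths:", np.sort(selected))
```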

  1. Introduction to adaptive arrays

    CERN Document Server

    Monzingo, Bob; Haupt, Randy

    2011-01-01

    This second edition is an extensive modernization of the bestselling introduction to the subject of adaptive array sensor systems. With the number of applications of adaptive array sensor systems growing each year, this look at the principles and fundamental techniques that are critical to these systems is more important than ever before. Introduction to Adaptive Arrays, 2nd Edition is organized as a tutorial, taking the reader by the hand and leading them through the maze of jargon that often surrounds this highly technical subject. It is easy to read and easy to follow as fundamental concept

  2. Reconstructing pictures from sampled data

    International Nuclear Information System (INIS)

    Corrections for two important degrading effects in gamma ray imaging systems are described. The adaptive local operator has the advantage that given the assumptions of the method the optimum correction is made for both sources of error simultaneously. The probability operator method although less soundly based in classical signal processing theory does make more use of the known statistical properties of possible inputs and consequently might make a better estimate of the true sample values. A separate correction would need to be made for blurring effects. The initial results indicate that the methods are worthy of further investigation

  3. Staff Adaptation in Selected Company

    OpenAIRE

    Štolcová, Jana

    2011-01-01

    The work focuses on the personnel actions involved in employee adaptation, as nowadays it is very important to retain good and skilled staff. The main aim of this work is to analyze and evaluate the process of adaptation of new employees at the headquarters of BILLA, Ltd., which operates more than 200 supermarkets around the Czech Republic. Another task is to propose partial measures which would improve the process of adaptation in the company. The literature review discusses the importance of ...

  4. IMPORTANT NOTIFICATION

    CERN Multimedia

    HR Department

    2009-01-01

    Green plates, removals and importation of personal effects Please note that, as from 1 April 2009, formalities relating to K and CD special series French vehicle plates (green plates), removals and importation of personal effects into France and Switzerland will be dealt with by GS Department (Building 73/3-014, tel. 73683/74407). Importation and purchase of tax-free vehicles in Switzerland, as well as diplomatic privileges, will continue to be dealt with by the Installation Service of HR Department (Building 33/1-011, tel. 73962). HR and GS Departments

  5. F-VIPGI: a new adapted version of VIPGI for FORS2 spectroscopy. Application to a sample of 16 X-ray selected galaxy clusters at 0.6 < z < 1.2

    CERN Document Server

    Nastasi, Alessandro; Fassbender, Rene; Boehringer, Hans; Pierini, Daniele; Verdugo, Miguel; Garilli, Bianca; Franzetti, Paolo

    2013-01-01

    The goal of this paper is twofold. Firstly, we present F-VIPGI, a new version of the VIMOS Interactive Pipeline and Graphical Interface (VIPGI) adapted to handle FORS2 spectroscopic data. Secondly, we investigate the spectro-photometric properties of a sample of galaxies residing in distant X-ray selected galaxy clusters, the optical spectra of which were reduced with this new pipeline. We provide basic technical information about the innovations of the new software and, as a demonstration of the capabilities of the new pipeline, we show results obtained for 16 distant (0.65 < z < 1.25) X-ray luminous galaxy clusters selected within the XMM-Newton Distant Cluster Project. We performed a spectral indices analysis of the extracted optical spectra of their members, based on which we created a library of composite high signal-to-noise ratio spectra representative of passive and star-forming galaxies residing in distant galaxy clusters. The spectroscopic templates are provided to the community in electronic ...

  6. Sample rotating turntable kit for infrared spectrometers

    Science.gov (United States)

    Eckels, Joel Del; Klunder, Gregory L.

    2008-03-04

    An infrared spectrometer sample rotating turntable kit has a rotatable sample cup containing the sample. The infrared spectrometer has an infrared spectrometer probe for analyzing the sample and the rotatable sample cup is adapted to receive the infrared spectrometer probe. A reflectance standard is located in the rotatable sample cup. A sleeve is positioned proximate the sample cup and adapted to receive the probe. A rotator rotates the rotatable sample cup. A battery is connected to the rotator.

  7. Real-time Visual Tracking of Multiple Targets Using Bootstrap Importance Sampling

    Institute of Scientific and Technical Information of China (English)

    沈乐君; 游志胜; 李晓峰

    2012-01-01

    The main difficulty of multi-target visual tracking arises from the ambiguity caused by interactions among targets (partial or complete occlusion). A Markov random field (MRF) can resolve this ambiguity without explicit data association, but general-purpose probabilistic inference algorithms for MRFs are computationally expensive. This paper makes three contributions to this problem. 1) A new recursive Bayesian tracking framework with a "distributed-centralized-distributed" structure, the bootstrap importance sampling particle filter (BIS-PF), is devised; it uses a suboptimal importance density that incorporates the current observation, so it does not suffer from the curse of dimensionality and reduces the computational complexity from exponential to linear growth. 2) A new Monte Carlo strategy, bootstrap importance sampling, is proposed; it exploits the factorization property of the MRF for importance sampling, uses the bootstrap to generate low-cost, high-quality samples, reduces the number of likelihood evaluations and maintains multi-modal distributions. 3) A new marginalization technique is adopted: auxiliary-variable sampling is used for marginalization, and a bootstrap histogram is used for density estimation of the marginal posterior. Experimental results show that the proposed algorithm can track a large number of targets in real time, handle complex interactions among targets, and maintain multi-modal distributions after a target disappears.
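
    At the core of the tracker is a particle filter whose importance weights come from the observation model. The sketch below shows only that generic bootstrap-filter core on a one-dimensional toy problem; the MRF interaction terms, the observation-informed importance density and the auxiliary-variable marginalisation of the actual method are not reproduced, and all model parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

def bootstrap_particle_filter(observations, n_particles=500,
                              proc_std=1.0, obs_std=2.0):
    """Minimal bootstrap particle filter for a 1-D random-walk target.
    The transition prior is used as the importance density, so the importance
    weight of each particle is just its observation likelihood.  (The paper's
    tracker is far more elaborate; this only illustrates the importance
    sampling / resampling core.)"""
    particles = rng.normal(0.0, 5.0, n_particles)
    estimates = []
    for z in observations:
        # propagate through the dynamics (sampling from the importance density)
        particles = particles + rng.normal(0.0, proc_std, n_particles)
        # weight by the observation likelihood
        w = np.exp(-0.5 * ((z - particles) / obs_std) ** 2)
        w /= w.sum()
        estimates.append(np.sum(w * particles))
        # multinomial resampling (the "bootstrap" step)
        particles = particles[rng.choice(n_particles, n_particles, p=w)]
    return np.array(estimates)

truth = np.cumsum(rng.normal(0.0, 1.0, 100))           # hidden trajectory
obs = truth + rng.normal(0.0, 2.0, 100)                # noisy measurements
est = bootstrap_particle_filter(obs)
print("RMSE:", np.sqrt(np.mean((est - truth) ** 2)))
```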

  8. Adaptive digital filters

    CERN Document Server

    Kovačević, Branko; Milosavljević, Milan

    2013-01-01

    "Adaptive Digital Filters" presents an important discipline applied to the domain of speech processing. The book first makes the reader acquainted with the basic terms of filtering and adaptive filtering, before introducing the field of advanced modern algorithms, some of which are contributed by the authors themselves. Working in the field of adaptive signal processing requires the use of complex mathematical tools. The book offers a detailed presentation of the mathematical models that is clear and consistent, an approach that allows everyone with a college level of mathematics knowledge to successfully follow the mathematical derivations and descriptions of algorithms. The algorithms are presented in flow charts, which facilitates their practical implementation. The book presents many experimental results and treats the aspects of practical application of adaptive filtering in real systems, making it a valuable resource for both undergraduate and graduate students, and for all others interested in m...

  9. Principal component importance sampling for bank credit portfolio risk management

    Institute of Scientific and Technical Information of China (English)

    龚朴; 邓洋; 胡祖辉

    2012-01-01

    The measurement of bank credit portfolio default risk is of great significance to bank supervision. One of the most popular methods for estimating the default probability of credit assets is Monte Carlo simulation, and to improve simulation efficiency more and more studies have adopted importance sampling, usually implemented through two steps: conditional independence and a mean shift. Building on these results, this paper proposes an importance sampling procedure that does not rely on those steps: a multi-factor, variance-scaling algorithm based on the default correlation matrix, in which principal component analysis selects the dominant components of the default structure and their variance is enlarged. Numerical experiments show that, when a credit portfolio faces extreme events, the approach offers substantial variance reduction and improved accuracy, outperforming the plain Monte Carlo algorithm and the Morokoff importance sampling algorithm.
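
    The key idea is to leave the factor structure intact but inflate the variance of the dominant principal components of the default correlation matrix, so that extreme portfolio losses are sampled far more often and then reweighted. The following sketch applies that idea to a simple Gaussian latent-variable default model; it is not the authors' algorithm, and the portfolio, correlation matrix and scaling factor are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

def tail_prob_pc_is(corr, thresh, loss_level, n_sims=20000, n_pc=1, scale=4.0):
    """Hedged sketch of variance-scaling importance sampling along the dominant
    principal components of the default correlation matrix.  Obligor i defaults
    when its latent Gaussian factor falls below thresh[i]; the proposal inflates
    the variance of the leading components by `scale` and reweights each sample
    by the density ratio."""
    m = len(thresh)
    lam, V = np.linalg.eigh(corr)                 # eigenvalues in ascending order
    lam_q = lam.copy()
    lam_q[-n_pc:] *= scale                        # enlarge the dominant components
    z = rng.normal(size=(n_sims, m)) * np.sqrt(lam_q)   # sample in PC space
    x = z @ V.T                                   # back to obligor space
    losses = (x < thresh).sum(axis=1)             # number of defaults per scenario
    # importance weight p(z)/q(z) for independent Gaussians in PC space
    log_w = 0.5 * (np.log(lam_q / lam).sum()
                   + (z ** 2 * (1.0 / lam_q - 1.0 / lam)).sum(axis=1))
    return np.mean(np.exp(log_w) * (losses >= loss_level))

m = 100                                           # obligors
corr = np.full((m, m), 0.3) + 0.7 * np.eye(m)     # one dominant common factor
thresh = np.full(m, -2.0)                         # approx. 2.3% default probability
print("P(defaults >= 20):", tail_prob_pc_is(corr, thresh, 20))
```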

  10. Analysis of the Importance of the Direct Imprint Rapid Sampling Method in Environmental Monitoring of Nosocomial Infection

    Institute of Scientific and Technical Information of China (English)

    刘素珍; 张甜; 欧阳琳

    2013-01-01

    Objective: To analyze the importance of the direct imprint rapid sampling method in the environmental monitoring of nosocomial infection. Method: From June 2011 to May 2012, 240 surface samples from hospital air, the hands of medical staff, object surfaces, instruments and autoclaved items were tested for bacterial counts. The observation group was sampled with the direct imprint rapid sampling method and the control group with the traditional normal-saline cotton swab method, and the detection results of the two groups were compared. Result: The bacterial detection rate of the observation group was significantly higher than that of the control group, and the difference was statistically significant (P<0.05). Conclusion: Compared with the traditional saline cotton swab method, the direct imprint rapid sampling method has obvious advantages in environmental monitoring of nosocomial infection: it reduces the errors introduced by intermediate steps, shortens the time from sampling to incubation, and is convenient, easy to operate and economical, so it is worth popularizing in medical institutions at the same level.

  11. Is adaptation. Truly an adaptation? Is adaptation. Truly an adaptation?

    Directory of Open Access Journals (Sweden)

    Thais Flores Nogueira Diniz

    2008-04-01

    Full Text Available The article begins by historicizing film adaptation from the arrival of cinema, pointing out the many theoretical approaches under which the process has been seen: from the concept of “the same story told in a different medium” to a comprehensible definition such as “the process through which works can be transformed, forming an intersection of textual surfaces, quotations, conflations and inversions of other texts”. To illustrate this new concept, the article discusses Spike Jonze’s film Adaptation. according to James Naremore’s proposal which considers the study of adaptation as part of a general theory of repetition, joined with the study of recycling, remaking, and every form of retelling. The film deals with the attempt by the scriptwriter Charles Kaufman, played by Nicolas Cage, to adapt/translate a non-fictional book to the cinema, but ends up with a kind of film which is by no means what it intended to be: a film of action in the model of Hollywood productions. During the process of creation, Charles and his twin brother, Donald, undergo a series of adventures involving some real persons from the world of film, the author and the protagonist of the book, all of them turning into fictional characters in the film. In the film, adaptation then signifies something different from its traditional meaning.

  12. Context-aware adaptive spelling in motor imagery BCI

    Science.gov (United States)

    Perdikis, S.; Leeb, R.; Millán, J. d. R.

    2016-06-01

    Objective. This work presents a first motor imagery-based, adaptive brain-computer interface (BCI) speller, which is able to exploit application-derived context for improved, simultaneous classifier adaptation and spelling. Online spelling experiments with ten able-bodied users evaluate the ability of our scheme, first, to alleviate non-stationarity of brain signals for restoring the subject’s performances, second, to guide naive users into BCI control avoiding initial offline BCI calibration and, third, to outperform regular unsupervised adaptation. Approach. Our co-adaptive framework combines the BrainTree speller with smooth-batch linear discriminant analysis adaptation. The latter enjoys contextual assistance through BrainTree’s language model to improve online expectation-maximization maximum-likelihood estimation. Main results. Our results verify the possibility to restore single-sample classification and BCI command accuracy, as well as spelling speed for expert users. Most importantly, context-aware adaptation performs significantly better than its unsupervised equivalent and similar to the supervised one. Although no significant differences are found with respect to the state-of-the-art PMean approach, the proposed algorithm is shown to be advantageous for 30% of the users. Significance. We demonstrate the possibility to circumvent supervised BCI recalibration, saving time without compromising the adaptation quality. On the other hand, we show that this type of classifier adaptation is not as efficient for BCI training purposes.
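
    The adaptation scheme sketched in this abstract blends the classifier's own posteriors with probabilities supplied by the application (here, BrainTree's language model) and then updates the classifier from unlabelled trials. The toy sketch below captures only that blending-and-smoothing idea for a two-class LDA with a shared covariance; it is not the smooth-batch EM algorithm of the paper, and the class AdaptiveLDA, the drift model and all parameter values are hypothetical.

```python
import numpy as np

class AdaptiveLDA:
    """Simplified, hypothetical sketch of context-assisted unsupervised LDA
    adaptation (the paper's smooth-batch EM scheme is more sophisticated).
    Class means are updated with exponential smoothing, weighted by the class
    posterior blended with an external context prior."""

    def __init__(self, mean0, mean1, cov, eta=0.05):
        self.means = [np.asarray(mean0, float), np.asarray(mean1, float)]
        self.prec = np.linalg.inv(cov)       # shared precision matrix
        self.eta = eta                       # smoothing rate

    def posteriors(self, x, context_prior=(0.5, 0.5)):
        # Gaussian class-conditional log-likelihoods with shared covariance
        ll = [-0.5 * (x - m) @ self.prec @ (x - m) for m in self.means]
        logp = np.array(ll) + np.log(np.asarray(context_prior))
        p = np.exp(logp - logp.max())
        return p / p.sum()

    def classify_and_adapt(self, x, context_prior=(0.5, 0.5)):
        p = self.posteriors(x, context_prior)
        for k in (0, 1):                     # soft, posterior-weighted mean update
            self.means[k] += self.eta * p[k] * (x - self.means[k])
        return int(p.argmax())

rng = np.random.default_rng(4)
lda = AdaptiveLDA(mean0=[0, 0], mean1=[2, 2], cov=np.eye(2))
correct = 0
for t in range(500):
    label = rng.integers(2)
    drift = 0.004 * t                        # non-stationarity: class means drift
    x = rng.normal([drift, drift] if label == 0 else [2 + drift, 2 + drift], 1.0)
    prior = (0.7, 0.3) if label == 0 else (0.3, 0.7)   # stand-in for a language model
    correct += lda.classify_and_adapt(x, prior) == label
print("accuracy under drift:", correct / 500)
```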

  14. Deconvolution with correct sampling

    CERN Document Server

    Magain, P; Sohy, S

    1997-01-01

    A new method for improving the resolution of astronomical images is presented. It is based on the principle that sampled data cannot be fully deconvolved without violating the sampling theorem. Thus, the sampled image should not be deconvolved by the total Point Spread Function, but by a narrower function chosen so that the resolution of the deconvolved image is compatible with the adopted sampling. Our deconvolution method gives results which are markedly superior to those of other existing techniques: in particular, it does not produce ringing around point sources superimposed on a smooth background. Moreover, it allows one to perform accurate astrometry and photometry of crowded fields. These improvements are a consequence of both the correct treatment of sampling and the recognition that the most probable astronomical image is not a flat one. The method is also well adapted to the optimal combination of different images of the same object, as can be obtained, e.g., via adaptive optics techniques.
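
    The central principle here, dividing out only part of the point spread function so that the result remains compatible with the sampling, can be illustrated in one dimension with ordinary FFTs. The sketch below assumes Gaussian profiles for both the total and the target PSF and a small regularisation constant; it is a toy illustration of the principle, not the authors' algorithm, and the noise level and widths are invented.

```python
import numpy as np

rng = np.random.default_rng(5)

def gaussian_psf(n, sigma):
    x = np.arange(n) - n // 2
    g = np.exp(-0.5 * (x / sigma) ** 2)
    return np.fft.ifftshift(g / g.sum())        # unit-sum kernel centred at index 0

def partial_deconvolve(observed, sigma_total, sigma_target, eps=1e-3):
    """Illustrative 1-D sketch of the paper's principle: instead of dividing by
    the full PSF (width sigma_total), deconvolve only down to a narrower target
    PSF (width sigma_target) that is still compatible with the sampling.  In
    Fourier space this multiplies the data by F_target / F_total."""
    n = len(observed)
    F_obs = np.fft.fft(observed)
    F_tot = np.fft.fft(gaussian_psf(n, sigma_total))
    F_tgt = np.fft.fft(gaussian_psf(n, sigma_target))
    return np.real(np.fft.ifft(F_obs * F_tgt / (F_tot + eps)))

# a point source on a smooth background, blurred and noisy
n = 256
truth = 10.0 + 50.0 * np.exp(-0.5 * ((np.arange(n) - 128) / 1.0) ** 2)
observed = np.real(np.fft.ifft(np.fft.fft(truth) * np.fft.fft(gaussian_psf(n, 6.0))))
observed += rng.normal(0, 0.05, n)

restored = partial_deconvolve(observed, sigma_total=6.0, sigma_target=2.0)
print("peak before/after:", round(observed.max(), 2), round(restored.max(), 2))
```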

  15. First Excursion Probabilities of Dynamical Systems by Importance Sampling

    Institute of Scientific and Technical Information of China (English)

    任丽梅; 徐伟; 肖玉柱; 王文杰

    2012-01-01

    Based on the Girsanov transformation, this paper develops an importance sampling method for estimating the first-excursion failure probability of structural dynamical systems excited by stationary Gaussian white noise. The focus is the construction of a control function that concentrates the sample paths in the part of the sample space most likely to cause a first excursion, so as to achieve variance reduction. The control function is constructed from design points. For linear systems, the approach is combined with time-invariant structural reliability theory and the design points are obtained by solving a constrained optimization problem. For non-linear systems, the design points are obtained by the mirror-image method, using the design-point excitation proposed by Heonsang Koo. Finally, two examples are given; compared with the primitive Monte Carlo method, the simulation results show the proposed method to be correct and effective.
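
    For a linear system, the response is a linear map of the discretised excitation, so the design point for exceedance at any instant has a closed form and can be used to shift the sampling density. The sketch below follows that spirit for a single-degree-of-freedom oscillator with a one-sided barrier, using a mixture of Gaussians centred at per-instant design points; it is not the algorithm of the paper (which also treats non-linear systems via mirror images), and the oscillator parameters and threshold are illustrative.

```python
import numpy as np
from math import erfc, sqrt

rng = np.random.default_rng(6)

def sdof_response_matrix(T=8.0, dt=0.05, omega=2 * np.pi, zeta=0.05):
    """Lower-triangular map r = L @ w from discretised unit white noise w to the
    displacement response of a single-degree-of-freedom oscillator."""
    n = int(T / dt)
    t = np.arange(n) * dt
    wd = omega * sqrt(1.0 - zeta ** 2)
    h = np.exp(-zeta * omega * t) * np.sin(wd * t) / wd    # impulse response
    L = np.zeros((n, n))
    for i in range(n):
        L[i, :i + 1] = h[i::-1]
    return dt * L

def first_excursion_is(b=4.0, n_sims=2000):
    # Hedged sketch in the spirit of design-point importance sampling for
    # first-excursion probabilities (one-sided barrier only); the original
    # algorithm differs in detail.  The proposal is a mixture of unit-variance
    # Gaussians centred at the design point for exceedance at each instant.
    L = sdof_response_matrix()
    sig = np.linalg.norm(L, axis=1)                 # response std at each instant
    thresh = b * sig.max()                          # barrier level
    pf = np.array([0.5 * erfc(thresh / (s * sqrt(2))) if s > 0 else 0.0
                   for s in sig])                   # elementary exceedance probs
    keep = pf > 0
    U = thresh * L[keep] / (sig[keep] ** 2)[:, None]   # design points
    pi = pf[keep] / pf[keep].sum()                     # mixture weights
    comp = rng.choice(len(pi), size=n_sims, p=pi)
    w = U[comp] + rng.normal(size=(n_sims, L.shape[0]))
    fail = (w @ L.T).max(axis=1) > thresh
    # importance weight p(w)/q(w) for the Gaussian-mixture proposal
    a = w @ U.T + (np.log(pi) - 0.5 * (U ** 2).sum(axis=1))
    m = a.max(axis=1)
    log_q_over_p = m + np.log(np.exp(a - m[:, None]).sum(axis=1))
    return np.mean(np.exp(-log_q_over_p) * fail)

print("estimated one-sided first-excursion probability:", first_excursion_is())
```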

  16. Genomics of local adaptation with gene flow.

    Science.gov (United States)

    Tigano, Anna; Friesen, Vicki L

    2016-05-01

    Gene flow is a fundamental evolutionary force in adaptation that is especially important to understand as humans are rapidly changing both the natural environment and natural levels of gene flow. Theory proposes a multifaceted role for gene flow in adaptation, but it focuses mainly on the disruptive effect that gene flow has on adaptation when selection is not strong enough to prevent the loss of locally adapted alleles. The role of gene flow in adaptation is now better understood due to the recent development of both genomic models of adaptive evolution and genomic techniques, which both point to the importance of genetic architecture in the origin and maintenance of adaptation with gene flow. In this review, we discuss three main topics on the genomics of adaptation with gene flow. First, we investigate selection on migration and gene flow. Second, we discuss the three potential sources of adaptive variation in relation to the role of gene flow in the origin of adaptation. Third, we explain how local adaptation is maintained despite gene flow: we provide a synthesis of recent genomic models of adaptation, discuss the genomic mechanisms and review empirical studies on the genomics of adaptation with gene flow. Despite predictions on the disruptive effect of gene flow in adaptation, an increasing number of studies show that gene flow can promote adaptation, that local adaptations can be maintained despite high gene flow, and that genetic architecture plays a fundamental role in the origin and maintenance of local adaptation with gene flow.

  17. Adaptive test

    DEFF Research Database (Denmark)

    Kjeldsen, Lars Peter; Eriksen, Mette Rose

    2010-01-01

    The article is an evaluation of the adaptive tests that were introduced in the Danish primary and lower-secondary school (folkeskolen). It focuses in particular on assessment in the folkeskole, and it contributes guidance on evaluation, evaluation tools and subject-specific assessment material.

  18. Strategic Adaptation

    DEFF Research Database (Denmark)

    Andersen, Torben Juul

    2015-01-01

    This article provides an overview of theoretical contributions that have influenced the discourse around strategic adaptation including contingency perspectives, strategic fit reasoning, decision structure, information processing, corporate entrepreneurship, and strategy process. The related concepts of strategic renewal, dynamic managerial capabilities, dynamic capabilities, and strategic response capabilities are discussed and contextualized against strategic responsiveness. The insights derived from this article are used to outline the contours of a dynamic process of strategic adaptation. This model incorporates elements of central strategizing, autonomous entrepreneurial behavior, interactive information processing, and open communication systems that enhance the organization's ability to observe exogenous changes and respond effectively to them.

  19. Adaptive management

    Science.gov (United States)

    Allen, Craig R.; Garmestani, Ahjond S.

    2015-01-01

    Adaptive management is an approach to natural resource management that emphasizes learning through management where knowledge is incomplete, and when, despite inherent uncertainty, managers and policymakers must act. Unlike a traditional trial and error approach, adaptive management has explicit structure, including a careful elucidation of goals, identification of alternative management objectives and hypotheses of causation, and procedures for the collection of data followed by evaluation and reiteration. The process is iterative, and serves to reduce uncertainty, build knowledge and improve management over time in a goal-oriented and structured process.

  20. Adaptation and perceptual norms

    Science.gov (United States)

    Webster, Michael A.; Yasuda, Maiko; Haber, Sara; Leonard, Deanne; Ballardini, Nicole

    2007-02-01

    We used adaptation to examine the relationship between perceptual norms--the stimuli observers describe as psychologically neutral, and response norms--the stimulus levels that leave visual sensitivity in a neutral or balanced state. Adapting to stimuli on opposite sides of a neutral point (e.g. redder or greener than white) biases appearance in opposite ways. Thus the adapting stimulus can be titrated to find the unique adapting level that does not bias appearance. We compared these response norms to subjectively defined neutral points both within the same observer (at different retinal eccentricities) and between observers. These comparisons were made for visual judgments of color, image focus, and human faces, stimuli that are very different and may depend on very different levels of processing, yet which share the property that for each there is a well defined and perceptually salient norm. In each case the adaptation aftereffects were consistent with an underlying sensitivity basis for the perceptual norm. Specifically, response norms were similar to and thus covaried with the perceptual norm, and under common adaptation differences between subjectively defined norms were reduced. These results are consistent with models of norm-based codes and suggest that these codes underlie an important link between visual coding and visual experience.

  1. Adapting Bulls to Florida

    Science.gov (United States)

    The adaptation of bulls used for natural breeding purposes to the Gulf Coast region of the United States including all of Florida is an important topic. Nearly 40% of the U.S. cow/calf population resides in the Gulf Coast and Southeast. Thus, as A.I. is relatively rare, the number of bulls used for ...

  2. Adaptive Face Recognition via Structured Representation

    Institute of Scientific and Technical Information of China (English)

    ZHANG Yu-hua; ZENG Xiao-ming

    2014-01-01

    In this paper, we propose a face recognition approach, Structured Sparse Representation-based classification, for the case in which the number of measurements of the test sample is less than the number of training samples of each subject. When this condition is not satisfied, we exploit the Nearest Subspace approach to classify the test sample. To handle all cases, we combine the two approaches into an adaptive classification method, the Adaptive approach. The adaptive approach yields greater recognition accuracy than the SRC approach and the CRC_RLS approach at low sampling rates on the Extended Yale B dataset, and it is more efficient than the other two approaches.

  3. Applying IRT_Δb Procedure and Adapted LR Procedure to Detect DIF in Tests with Matrix Sampling

    Institute of Scientific and Technical Information of China (English)

    张勋; 李凌艳; 刘红云; 孙研

    2013-01-01

    Matrix sampling is a useful technique widely used in large-scale educational assessments. In an assessment with a matrix sampling design, each examinee takes one of multiple booklets containing only part of the items. A critical problem in detecting differential item functioning (DIF) in this scenario has gained much attention in recent years: it is not appropriate to take the observed total score obtained from an individual booklet as the matching variable, so traditional detection methods such as Mantel-Haenszel (MH), SIBTEST and Logistic Regression (LR) are not suitable. IRT_Δb may be an alternative because it provides a valid matching variable; however, a DIF classification criterion for IRT_Δb has not yet been well established. Thus, the purposes of this study were: 1) to investigate the efficiency and robustness of using ability parameters estimated from an Item Response Theory (IRT) model as the matching variable, compared with using traditional observed raw total scores; 2) to identify which factors influence the ability of the two methods to detect DIF; and 3) to propose a DIF classification criterion for IRT_Δb. Both simulated and empirical data were employed to explore the robustness and efficiency of the two prevailing DIF detection methods: the IRT_Δb method and the adapted LR method, which uses the IRT-based group-level ability estimate as the matching variable. In the Monte Carlo study, a matrix sampling test was generated and various experimental conditions were simulated: 1) different proportions of DIF items; 2) different examinee ability distributions; 3) different sample sizes; 4) different sizes of DIF. The two DIF detection methods were then applied and their results compared. In addition, power functions were established in order to derive a DIF classification rule for IRT_Δb based on the current rules for LR. In the empirical study, through

  4. STUDYING COMPLEX ADAPTIVE SYSTEMS

    Institute of Scientific and Technical Information of China (English)

    John H. Holland

    2006-01-01

    Complex adaptive systems (cas) - systems that involve many components that adapt or learn as they interact - are at the heart of important contemporary problems. The study of cas poses unique challenges: Some of our most powerful mathematical tools, particularly methods involving fixed points, attractors, and the like, are of limited help in understanding the development of cas. This paper suggests ways to modify research methods and tools, with an emphasis on the role of computer-based models, to increase our understanding of cas.

  5. Intestinal mucosal adaptation

    Institute of Scientific and Technical Information of China (English)

    Laurie Drozdowski; Alan BR Thomson

    2006-01-01

    Intestinal failure is a condition characterized by malnutrition and/or dehydration as a result of the inadequate digestion and absorption of nutrients. The most common cause of intestinal failure is short bowel syndrome, which occurs when the functional gut mass is reduced below the level necessary for adequate nutrient and water absorption. This condition may be congenital, or may be acquired as a result of a massive resection of the small bowel. Following resection, the intestine is capable of adaptation in response to enteral nutrients as well as other trophic stimuli. Identifying factors that may enhance the process of intestinal adaptation is an exciting area of research with important potential clinical applications.

  6. Adaptive Metric Kernel Regression

    DEFF Research Database (Denmark)

    Goutte, Cyril; Larsen, Jan

    1998-01-01

    Kernel smoothing is a widely used nonparametric pattern recognition technique. By nature, it suffers from the curse of dimensionality and is usually difficult to apply to high input dimensions. In this paper, we propose an algorithm that adapts the input metric used in multivariate regression by minimising a cross-validation estimate of the generalisation error. This allows one to automatically adjust the importance of different dimensions. The improvement in terms of modelling performance is illustrated on a variable selection task where the adaptive metric kernel clearly outperforms the standard...

  7. Adaptive metric kernel regression

    DEFF Research Database (Denmark)

    Goutte, Cyril; Larsen, Jan

    2000-01-01

    Kernel smoothing is a widely used non-parametric pattern recognition technique. By nature, it suffers from the curse of dimensionality and is usually difficult to apply to high input dimensions. In this contribution, we propose an algorithm that adapts the input metric used in multivariate regression by minimising a cross-validation estimate of the generalisation error. This allows one to automatically adjust the importance of different dimensions. The improvement in terms of modelling performance is illustrated on a variable selection task where the adaptive metric kernel clearly outperforms the...
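
    Both versions of this abstract describe the same mechanism: give each input dimension its own scale in the kernel and choose those scales by minimising a cross-validation estimate of the error, so irrelevant dimensions are effectively switched off. The sketch below is a toy version of that idea with a Nadaraya-Watson estimator, leave-one-out error and a simple coordinate search in place of the papers' gradient-based optimisation; the data and all parameter values are invented.

```python
import numpy as np

rng = np.random.default_rng(7)

def loo_error(X, y, log_bw):
    """Leave-one-out squared error of a Nadaraya-Watson estimator with a
    diagonal (per-dimension) bandwidth -- the 'adaptive metric'."""
    bw = np.exp(log_bw)
    d2 = ((X[:, None, :] - X[None, :, :]) / bw) ** 2   # scaled squared distances
    K = np.exp(-0.5 * d2.sum(axis=2))
    np.fill_diagonal(K, 0.0)                            # leave one out
    pred = (K @ y) / np.maximum(K.sum(axis=1), 1e-12)
    return np.mean((pred - y) ** 2)

def fit_adaptive_metric(X, y, n_iter=40, step=0.5):
    """Hypothetical sketch (not the authors' optimiser): greedy coordinate
    search on log-bandwidths to minimise the LOO CV error.  Dimensions that
    end up with large bandwidths are effectively ignored."""
    log_bw = np.zeros(X.shape[1])
    best = loo_error(X, y, log_bw)
    for _ in range(n_iter):
        for j in range(X.shape[1]):
            for delta in (-step, step):
                trial = log_bw.copy()
                trial[j] += delta
                e = loo_error(X, y, trial)
                if e < best:
                    best, log_bw = e, trial
    return np.exp(log_bw), best

# toy regression: only the first of five inputs is relevant
X = rng.normal(size=(150, 5))
y = np.sin(2 * X[:, 0]) + 0.1 * rng.normal(size=150)
bw, err = fit_adaptive_metric(X, y)
print("learned bandwidths:", bw.round(2), " LOO error:", round(err, 3))
```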

  8. Adaptation Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Huq, Saleemul

    2011-11-15

    Efforts to help the world's poor will face crises in coming decades as climate change radically alters conditions. Action Research for Community Adaptation in Bangladesh (ARCAB) is an action-research programme on responding to climate change impacts through community-based adaptation. Set in Bangladesh at 20 sites that are vulnerable to floods, droughts, cyclones and sea level rise, ARCAB will follow impacts and adaptation as they evolve over half a century or more. National and international 'research partners', collaborating with ten NGO 'action partners' with global reach, seek knowledge and solutions applicable worldwide. After a year setting up ARCAB, we share lessons on the programme's design and move into our first research cycle.

  9. Adaptive ethnography

    DEFF Research Database (Denmark)

    Berth, Mette

    2005-01-01

    This paper focuses on the use of an adaptive ethnography when studying such phenomena as young people's use of mobile media in a learning perspective. Mobile media such as PDAs and mobile phones have a number of affordances which make them potential tools for learning. However, before we begin to design and develop educational materials for mobile media platforms we must first understand everyday use and behaviour with a medium such as a mobile phone. The paper outlines the research design for a PhD project on mobile learning which focuses on mobile phones as a way to bridge the gap between formal and informal learning contexts. The paper also proposes several adaptive methodological techniques for studying young people's interaction with mobiles.

  10. Hedonic "adaptation"

    Directory of Open Access Journals (Sweden)

    Paul Rozin

    2008-02-01

    Full Text Available People live in a world in which they are surrounded by potential disgust elicitors such as "used" chairs, air, silverware, and money as well as excretory activities. People function in this world by ignoring most of these, by active avoidance, reframing, or adaptation. The issue is particularly striking for professions, such as morticians, surgeons, or sanitation workers, in which there is frequent contact with major disgust elicitors. In this study, we study the "adaptation" process to dead bodies as disgust elicitors, by measuring specific types of disgust sensitivity in medical students before and after they have spent a few months dissecting a cadaver. Using the Disgust Scale, we find a significant reduction in disgust responses to death and body envelope violation elicitors, but no significant change in any other specific type of disgust. There is a clear reduction in discomfort at touching a cold dead body, but not in touching a human body which is still warm after death.

  11. Adaptive noise

    OpenAIRE

    Viney, Mark; Reece, Sarah E.

    2013-01-01

    In biology, noise implies error and disorder and is therefore something which organisms may seek to minimize and mitigate against. We argue that such noise can be adaptive. Recent studies have shown that gene expression can be noisy, noise can be genetically controlled, genes and gene networks vary in how noisy they are and noise generates phenotypic differences among genetically identical cells. Such phenotypic differences can have fitness benefits, suggesting that evolution can shape noise ...

  12. Adaptable positioner

    International Nuclear Information System (INIS)

    This paper describes the circuits and the assembly-language programs developed to control the two DC motors that give mobility to a mechanical arm with two degrees of freedom. As a whole, the system is based on an adaptable regulator designed around an 8-bit microprocessor that, starting from a mode of regulation based on the successive approximation method, evolves to another mode in which a single approximation is sufficient to reach the correct position of each motor. (Author) 22 fig. 6 ref

  13. Image Sampling with Quasicrystals

    CERN Document Server

    Grundland, Mark; Masakova, Zuzana; Dodgson, Neil A; 10.3842/SIGMA.2009.075

    2009-01-01

    We investigate the use of quasicrystals in image sampling. Quasicrystals produce space-filling, non-periodic point sets that are uniformly discrete and relatively dense, thereby ensuring the sample sites are evenly spread out throughout the sampled image. Their self-similar structure can be attractive for creating sampling patterns endowed with a decorative symmetry. We present a brief general overview of the algebraic theory of cut-and-project quasicrystals based on the geometry of the golden ratio. To assess the practical utility of quasicrystal sampling, we evaluate the visual effects of a variety of non-adaptive image sampling strategies on photorealistic image reconstruction and non-photorealistic image rendering used in multiresolution image representations. For computer visualization of point sets used in image sampling, we introduce a mosaic rendering technique.

  14. Image Sampling with Quasicrystals

    Directory of Open Access Journals (Sweden)

    Mark Grundland

    2009-07-01

    Full Text Available We investigate the use of quasicrystals in image sampling. Quasicrystals produce space-filling, non-periodic point sets that are uniformly discrete and relatively dense, thereby ensuring the sample sites are evenly spread out throughout the sampled image. Their self-similar structure can be attractive for creating sampling patterns endowed with a decorative symmetry. We present a brief general overview of the algebraic theory of cut-and-project quasicrystals based on the geometry of the golden ratio. To assess the practical utility of quasicrystal sampling, we evaluate the visual effects of a variety of non-adaptive image sampling strategies on photorealistic image reconstruction and non-photorealistic image rendering used in multiresolution image representations. For computer visualization of point sets used in image sampling, we introduce a mosaic rendering technique.
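
    Cut-and-project is the construction behind the quasicrystal point sets described in these two records: points of a higher-dimensional lattice are kept when their component orthogonal to a "physical" subspace falls inside a window, and their physical components become the sample sites. The sketch below shows the construction in one dimension (a Fibonacci chain built on the golden ratio) rather than the two-dimensional sampling patterns of the paper; the function name and the window choice follow the textbook construction and are not taken from the paper.

```python
import numpy as np

def fibonacci_quasicrystal(n_max=40):
    """Minimal 1-D cut-and-project illustration: project integer lattice points
    (m, n) onto a line whose slope involves the golden ratio and keep those
    whose orthogonal component lies inside the projected unit cell.  The
    accepted coordinates form a non-periodic Fibonacci chain that is uniformly
    discrete and relatively dense, i.e. evenly spread sample sites."""
    tau = (1 + np.sqrt(5)) / 2
    norm = np.sqrt(1 + tau ** 2)
    m, n = [a.ravel() for a in np.meshgrid(np.arange(-n_max, n_max + 1),
                                           np.arange(-n_max, n_max + 1))]
    physical = (m + n * tau) / norm                  # component along the cut
    perp = (m * tau - n) / norm                      # orthogonal component
    in_window = (perp >= -1 / norm) & (perp < tau / norm)
    pts = np.sort(physical[in_window])
    return pts[np.abs(pts) < n_max * 0.6]            # drop finite-patch edge effects

pts = fibonacci_quasicrystal()
gaps = np.diff(pts)
print("number of sites:", len(pts))
print("distinct spacings:", np.unique(gaps.round(6)))
print("long/short spacing ratio:", round(gaps.max() / gaps.min(), 6))
```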

  15. Adaptive Playware in Physical Games

    DEFF Research Database (Denmark)

    Lund, Henrik Hautop; Thorsteinsson, Arnar Tumi

    2011-01-01

    that the activity automatically will match the capability of the individual user. With small test groups, we investigate how different age groups and gender groups physically interact with some playware games, and find indications of differences between the groups. Despite the small test set, the results......We describe how playware and games may adapt to the interaction of the individual user. We hypothesize that in physical games there are individual differences in user interaction capabilities and styles, and that adaptive playware may adapt to the individual user’s capabilities, so...... are a proof of existence of differences and of the need for adaptation, and therefore we investigate adaptation as an important issue for playware. With simple playware games, we show that the adaptation will speed the physical game up and down to find the appropriate level that matches the reaction speed...

  16. Procedures for Sampling Vegetation

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This report outlines vegetation sampling procedures used on various refuges in Region 3. The importance of sampling the response of marsh vegetation to management...

  17. Evaluating sampling designs by computer simulation: A case study with the Missouri bladderpod

    Science.gov (United States)

    Morrison, L.W.; Smith, D.R.; Young, C.C.; Nichols, D.W.

    2008-01-01

    To effectively manage rare populations, accurate monitoring data are critical. Yet many monitoring programs are initiated without careful consideration of whether chosen sampling designs will provide accurate estimates of population parameters. Obtaining accurate estimates is especially difficult when natural variability is high, or limited budgets determine that only a small fraction of the population can be sampled. The Missouri bladderpod, Lesquerella filiformis Rollins, is a federally threatened winter annual that has an aggregated distribution pattern and exhibits dramatic interannual population fluctuations. Using the simulation program SAMPLE, we evaluated five candidate sampling designs appropriate for rare populations, based on 4 years of field data: (1) simple random sampling, (2) adaptive simple random sampling, (3) grid-based systematic sampling, (4) adaptive grid-based systematic sampling, and (5) GIS-based adaptive sampling. We compared the designs based on the precision of density estimates for fixed sample size, cost, and distance traveled. Sampling fraction and cost were the most important factors determining precision of density estimates, and relative design performance changed across the range of sampling fractions. Adaptive designs did not provide uniformly more precise estimates than conventional designs, in part because the spatial distribution of L. filiformis was relatively widespread within the study site. Adaptive designs tended to perform better as sampling fraction increased and when sampling costs, particularly distance traveled, were taken into account. The rate that units occupied by L. filiformis were encountered was higher for adaptive than for conventional designs. Overall, grid-based systematic designs were more efficient and practically implemented than the others. © 2008 The Society of Population Ecology and Springer.
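
    The evaluation strategy in this record, drawing repeated samples from a known spatial population and comparing the spread of the resulting density estimates across designs, is easy to reproduce in miniature. The sketch below does this for just two designs (simple random and grid-based systematic sampling) on a synthetic clustered population; it does not use the SAMPLE program, the bladderpod data or the adaptive designs of the study, and every number in it is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(8)

def clustered_population(grid=40, n_clusters=12, lam=30, spread=1.5):
    """Synthetic aggregated population of counts on a grid of quadrats
    (a stand-in for clustered plant counts; not the authors' data)."""
    counts = np.zeros((grid, grid))
    centres = rng.uniform(0, grid, size=(n_clusters, 2))
    for cx, cy in centres:
        k = rng.poisson(lam)
        xs = np.clip(rng.normal(cx, spread, k).astype(int), 0, grid - 1)
        ys = np.clip(rng.normal(cy, spread, k).astype(int), 0, grid - 1)
        np.add.at(counts, (xs, ys), 1)
    return counts

def simulate_designs(counts, n_reps=2000):
    """Compare the precision (standard error over repeated draws) of the mean
    density estimate under simple random sampling of 100 quadrats and under a
    10 x 10 systematic grid with a random start."""
    flat = counts.ravel()
    srs = [flat[rng.choice(flat.size, 100, replace=False)].mean()
           for _ in range(n_reps)]
    sys_ = []
    for _ in range(n_reps):
        r0, c0 = rng.integers(4), rng.integers(4)
        sys_.append(counts[r0::4, c0::4].mean())
    return np.std(srs), np.std(sys_)

counts = clustered_population()
sd_srs, sd_sys = simulate_designs(counts)
print("true mean density :", round(counts.mean(), 3))
print("SE simple random  :", round(sd_srs, 3))
print("SE systematic grid:", round(sd_sys, 3))
```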

  18. Boat sampling

    International Nuclear Information System (INIS)

    This presentation describes essential boat sampling activities: on site boat sampling process optimization and qualification; boat sampling of base material (beltline region); boat sampling of weld material (weld No. 4); problems accompanied with weld crown varieties, RPV shell inner radius tolerance, local corrosion pitting and water clarity. The equipment used for boat sampling is described too. 7 pictures

  19. [Cellular adaptation and cancerogenesis].

    Science.gov (United States)

    La Torre, F; Silpigni, A; Tomasello, R; Picone, G S; La Torre, I; Aragona, M

    1998-06-01

    , managing to escape the immune system using various adaptive mechanisms which induce immune tolerance/anergy. From this point of view, cancer may be regarded as an incidental factor in the host's cell adaptation processes; the latter are much more important from a biological point of view and their absence is incompatible with life: cancer might therefore be regarded as a cell adaptation pathology. PMID:9739355

  20. Sample Design.

    Science.gov (United States)

    Ross, Kenneth N.

    1987-01-01

    This article considers various kinds of probability and non-probability samples in both experimental and survey studies. Throughout, how a sample is chosen is stressed. Size alone is not the determining consideration in sample selection. Good samples do not occur by accident; they are the result of a careful design. (Author/JAZ)

  1. Balanced sampling

    NARCIS (Netherlands)

    Brus, D.J.

    2015-01-01

    In balanced sampling a linear relation between the soil property of interest and one or more covariates with known means is exploited in selecting the sampling locations. Recent developments make this sampling design attractive for statistical soil surveys. This paper introduces balanced sampling

  2. Multi-Directional Motion Adaptation

    Directory of Open Access Journals (Sweden)

    David Patrick McGovern

    2012-05-01

    Full Text Available The direction aftereffect (DAE is a phenomenon whereby prolonged exposure to a moving stimulus biases the perceived direction of subsequent stimuli. It is believed to arise through a selective suppression of directionally tuned neurons in the visual cortex, causing shifts in the population response away from the adapted direction. Whereas most studies consider only unidirectional adaptation, here we examine how concurrent adaptation to multiple directions affects the DAE. Observers were required to judge whether a random dot kinematogram (RDK moved clockwise or counter-clockwise relative to upwards. In different conditions, observers adapted to a stimulus comprised of directions drawn from a distribution or to bidirectional motion. Increasing the variance of normally distributed directions reduced the magnitude of the peak DAE and broadened its tuning profile. Asymmetric sampling of Gaussian and uniform distributions resulted in shifts of DAE tuning profiles consistent with changes in the perceived global direction of the adapting stimulus. Discrimination thresholds were elevated by an amount that related to the magnitude of the bias. For bidirectional adaptors, adding dots in directions away from the adapting motion led to a pronounced reduction in the DAE. This reduction was observed when dots were added in opposite or orthogonal directions to the adaptor suggesting that it may arise via inhibition from a broadly tuned normalisation pool. Preliminary simulations with a population coding model, where the gain of a direction-selective neuron is inversely proportional to its response to the adapting stimulus, suggest that it provides a parsimonious account of these adaptation effects.
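
    The population-coding account mentioned at the end of the abstract, in which each direction-tuned unit's gain is reduced in proportion to its response to the adaptor, is straightforward to simulate. The sketch below does so with von Mises tuning curves and a population-vector readout; the tuning width, adaptation strength and the 30-degree adaptor offset are illustrative assumptions, not the authors' fitted parameters.

```python
import numpy as np

def von_mises_tuning(pref, stim, kappa=3.0):
    """Response of direction-tuned units with preferred directions `pref`
    (radians) to a stimulus moving in direction `stim`."""
    return np.exp(kappa * (np.cos(stim - pref) - 1.0))

def decoded_direction(stim, adapt_dir=None, adapt_strength=0.6, n_units=90):
    """Hedged sketch of the population-coding account: each unit's gain is
    reduced in proportion to its response to the adapting stimulus, and the
    perceived direction is read out with a population vector."""
    pref = np.linspace(-np.pi, np.pi, n_units, endpoint=False)
    gain = np.ones(n_units)
    if adapt_dir is not None:
        gain = 1.0 - adapt_strength * von_mises_tuning(pref, adapt_dir)
    resp = gain * von_mises_tuning(pref, stim)
    return np.angle(np.sum(resp * np.exp(1j * pref)))   # population vector

test = np.deg2rad(0.0)                 # "upward" test direction
adapt = np.deg2rad(30.0)               # adaptor 30 deg clockwise of the test
baseline = decoded_direction(test)
adapted = decoded_direction(test, adapt_dir=adapt)
dae = np.rad2deg(adapted - baseline)
print("direction aftereffect: %.2f deg (repulsion away from the adaptor)" % dae)
```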

  3. Economics of adaptation to climate change

    International Nuclear Information System (INIS)

    This report proposes a general economic framework for the issue of adaptation to climate change in order to help public and private actors to build up efficient adaptation strategies. It proposes a general definition of adaptation, identifies the major stakes for these strategies, and discusses the assessment of global costs of adaptation to climate change. It discusses the role and modalities of public action and gives some examples of possible adaptation measures in some important sectors (building and town planning, energy and transport infrastructures, water and agriculture, ecosystems, insurance). It examines the regional and national dimensions of adaptation and their relationship, and defines steps for implementing an adaptation strategy. It describes and discusses the use of economic tools in the elaboration of an adaptation strategy, i.e. how to take uncertainties into account, which scenarios to choose, how to use economic calculations to assess adaptation policies

  4. Adaptive management

    DEFF Research Database (Denmark)

    Rist, Lucy; Campbell, Bruce Morgan; Frost, Peter

    2013-01-01

    Adaptive management (AM) emerged in the literature in the mid-1970s in response both to a realization of the extent of uncertainty involved in management, and a frustration with attempts to use modelling to integrate knowledge and make predictions. The term has since become increasingly widely used in scientific articles, policy documents and management plans, but both understanding and application of the concept is mixed. This paper reviews recent literature from conservation and natural resource management journals to assess diversity in how the term is used, highlight ambiguities and consider how ... a management framework, as well as of identified challenges and pathologies, are needed. Further discussion and systematic assessment of the approach is required, together with greater attention to its definition and description, enabling the assessment of new approaches to managing uncertainty, and AM itself.

  5. Sequencing of 50 human exomes reveals adaptation to high altitude

    DEFF Research Database (Denmark)

    Yi, Xin; Liang, Yu; Huerta-Sanchez, Emilia;

    2010-01-01

    Residents of the Tibetan Plateau show heritable adaptations to extreme altitude. We sequenced 50 exomes of ethnic Tibetans, encompassing coding sequences of 92% of human genes, with an average coverage of 18x per individual. Genes showing population-specific allele frequency changes, which represent strong candidates for altitude adaptation, were identified. The strongest signal of natural selection came from endothelial Per-Arnt-Sim (PAS) domain protein 1 (EPAS1), a transcription factor involved in response to hypoxia. One single-nucleotide polymorphism (SNP) at EPAS1 shows a 78% frequency difference between Tibetan and Han samples, representing the fastest allele frequency change observed at any human gene to date. This SNP's association with erythrocyte abundance supports the role of EPAS1 in adaptation to hypoxia. Thus, a population genomic survey has revealed a functionally important locus...

  6. Adaptive vehicle motion estimation and prediction

    Science.gov (United States)

    Zhao, Liang; Thorpe, Chuck E.

    1999-01-01

    Accurate motion estimation and reliable maneuver prediction enable an automated car to react quickly and correctly to the rapid maneuvers of the other vehicles, and so allow safe and efficient navigation. In this paper, we present a car tracking system which provides motion estimation, maneuver prediction and detection of the tracked car. The three strategies employed - adaptive motion modeling, adaptive data sampling, and adaptive model switching probabilities - result in an adaptive interacting multiple model algorithm (AIMM). The experimental results on simulated and real data demonstrate that our tracking system is reliable, flexible, and robust. The adaptive tracking makes the system intelligent and useful in various autonomous driving tasks.

  7. Face Adaptation Without a Face

    OpenAIRE

    Ghuman, Avniel Singh; McDaniel, Jonathan R.; Martin, Alex

    2010-01-01

    Prolonged viewing of a stimulus results in a subsequent perceptual bias [1], [2] and [3]. This perceptual adaptation and the resulting aftereffect reveal important characteristics regarding how perceptual systems are tuned [2], [4], [5] and [6]. These aftereffects occur not only for simple stimulus features but also for high-level stimulus properties [7], [8], [9] and [10]. Here we report a novel cross-category adaptation aftereffect demonstrating that prolonged viewing of a human body withou...

  8. Sampling with Costs

    OpenAIRE

    Skufca, Joseph D; Ben-Avraham, Daniel

    2015-01-01

    We consider the problem of choosing the best of $n$ samples, out of a large random pool, when the sampling of each member is associated with a certain cost. The quality (worth) of the best sample clearly increases with $n$, but so do the sampling costs, and one important question is how many to sample for optimal gain (worth minus costs). If, in addition, the assessment of worth for each sample is associated with some "measurement error," the perceived best out of $n$ might not be the actual ...
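
    The trade-off posed in the abstract, the worth of the best of $n$ samples against a linear sampling cost, with measurement error blurring which sample really is best, can be explored with a few lines of Monte Carlo. The sketch below assumes standard-normal worths, Gaussian measurement error and an arbitrary unit cost, so the printed optimal $n$ only illustrates the shape of the problem and is not a result from the paper.

```python
import numpy as np

rng = np.random.default_rng(9)

def expected_gain(n, cost=0.05, noise=0.0, n_reps=4000):
    """Monte Carlo sketch of the trade-off: sample n candidates (standard-normal
    'worth'), pick the one that looks best under measurement noise, and subtract
    a linear sampling cost.  Returns the average realised worth of the chosen
    candidate minus the total cost."""
    worth = rng.normal(size=(n_reps, n))
    observed = worth + noise * rng.normal(size=(n_reps, n))
    chosen = worth[np.arange(n_reps), observed.argmax(axis=1)]
    return chosen.mean() - cost * n

for noise in (0.0, 1.0):
    gains = {n: expected_gain(n, noise=noise) for n in range(1, 41)}
    best_n = max(gains, key=gains.get)
    print("noise=%.1f  optimal n ~ %d  expected gain ~ %.2f"
          % (noise, best_n, gains[best_n]))
```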

  9. Adaptive method of lines

    CERN Document Server

    Saucez, Ph

    2001-01-01

    The general Method of Lines (MOL) procedure provides a flexible format for the solution of all the major classes of partial differential equations (PDEs) and is particularly well suited to evolutionary, nonlinear wave PDEs. Despite its utility, however, there are relatively few texts that explore it at a more advanced level and reflect the method's current state of development. Written by distinguished researchers in the field, Adaptive Method of Lines reflects the diversity of techniques and applications related to the MOL. Most of its chapters focus on a particular application but also provide a discussion of underlying philosophy and technique. Particular attention is paid to the concept of both temporal and spatial adaptivity in solving time-dependent PDEs. Many important ideas and methods are introduced, including moving grids and grid refinement, static and dynamic gridding, the equidistribution principle and the concept of a monitor function, the minimization of a functional, and the moving finite elem...

  10. Viewer preferences for adaptive playout

    Science.gov (United States)

    Deshpande, Sachin

    2013-03-01

    Adaptive media playout techniques are used to avoid buffer underflow in a dynamic streaming environment where the available bandwidth may be fluctuating. In this paper we report human perceptions from audio quality studies that we performed on speech and music samples for adaptive audio playout. Test methods based on ITU-R BS. 1534-1 recommendation were used. Studies were conducted for both slow playout and fast playout. Two scales - a coarse scale and a finer scale was used for the slow and fast audio playout factors. Results from our study can be used to determine acceptable slow and fast playout factors for speech and music content. An adaptive media playout algorithm could use knowledge of these upper and lower bounds on playback speeds to decide its adaptive playback schedule.
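
    Studies like this one are meant to supply the perceptual bounds that an adaptive playout controller is allowed to use. The sketch below shows where such bounds enter: a toy buffer simulation that scales the playout rate linearly with buffer occupancy, clamped to an assumed ±15% deviation from real time; the channel model, the clamp and all other numbers are invented for illustration and are not the study's findings.

```python
import numpy as np

rng = np.random.default_rng(10)

def simulate_playout(duration_s=120, target_buf=4.0, max_dev=0.15):
    """Toy adaptive media playout: media arrives over a fluctuating channel and
    the playout rate is scaled between (1 - max_dev) and (1 + max_dev) of real
    time according to buffer occupancy.  The bounds stand in for the acceptable
    slow/fast playout factors that listening tests are designed to establish."""
    buffer_s, underflows, factors = 2.0, 0, []
    for _ in range(duration_s):
        arrival = max(0.0, rng.normal(1.0, 0.5))      # seconds of media received
        buffer_s += arrival
        # linear control: play slower when the buffer is low, faster when high
        factor = 1.0 + np.clip((buffer_s - target_buf) / target_buf,
                               -1.0, 1.0) * max_dev
        consumed = min(buffer_s, factor)              # media played this second
        if consumed < factor:
            underflows += 1
        buffer_s -= consumed
        factors.append(factor)
    return underflows, np.mean(factors)

u, f = simulate_playout()
print("underflow seconds:", u, " average playout factor:", round(f, 3))
```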

  11. New competitive dendrimer-based and highly selective immunosensor for determination of atrazine in environmental, feed and food samples: the importance of antibody selectivity for discrimination among related triazinic metabolites.

    Science.gov (United States)

    Giannetto, Marco; Umiltà, Eleonora; Careri, Maria

    2014-01-01

    A new voltammetric competitive immunosensor selective for atrazine, based on the immobilization of a conjugate atrazine-bovine serum albumine on a nanostructured gold substrate previously functionalized with poliamidoaminic dendrimers, was realized, characterized, and validated in different real samples of environmental and food concern. Response of the sensor was reliable, highly selective and suitable for the detection and quantification of atrazine at trace levels in complex matrices such as territorial waters, corn-cultivated soils, corn-containing poultry and bovine feeds and corn flakes for human use. Selectivity studies were focused on desethylatrazine, the principal metabolite generated by long-term microbiological degradation of atrazine, terbutylazine-2-hydroxy and simazine as potential interferents. The response of the developed immunosensor for atrazine was explored over the 10(-2)-10(3) ng mL(-1) range. Good sensitivity was proved, as limit of detection and limit of quantitation of 1.2 and 5 ng mL(-1), respectively, were estimated for atrazine. RSD values <5% over the entire explored range attested a good precision of the device.

  12. Genetic structure of different cat populations in Europe and South America at a microgeographic level: importance of the choice of an adequate sampling level in the accuracy of population genetics interpretations

    Directory of Open Access Journals (Sweden)

    Manuel Ruiz-Garcia

    1999-12-01

    Full Text Available The phenotypic markers coat color, pattern and hair length of natural domestic cat populations observed in four cities (Barcelona, Catalonia; Palma Majorca, Balearic Islands; Rimini, Italy; and Buenos Aires, Argentina) were studied at a microgeographical level. Various population genetics techniques revealed that the degree of genetic differentiation between populations of Felis catus within these cities is relatively low compared with that found between populations of other mammals. Two different levels of sampling were used. One was that of "natural" colonies of cat families living together at specific points within the cities, and the other referred to "artificial" subpopulations, or groups of colonies, inhabiting the same district within a city. For the two sampling levels, some of the results were identical: 1) little genic heterogeneity, 2) existence of panmixia, 3) similar levels of expected heterozygosity in all populations analyzed, 4) no spatial autocorrelation, with certain differentiation of the Buenos Aires population compared to the others, and 5) very high correlations between colonies and subpopulations with the first factors from a Q factor analysis. Nevertheless, other population genetic statistics were greatly affected by the choice of sampling level. This was the case for: 1) the amount of heterogeneity of the FST and GST statistics between the cities, which was greater at the subpopulation level than at the colony level, 2) the existence of correlations between genic differentiation statistics and size variables at the subpopulation level but not at the colony level, and 3) the relationships between the genetic variables and the principal factors of the R factorial analysis. This suggests that care should be taken in the choice of the sampling unit for inferences on population genetics to be valid at the microgeographical level.

  13. Integrating Adaptive Functionality in a LMS

    Directory of Open Access Journals (Sweden)

    Kees van der Sluijs

    2009-12-01

    Full Text Available Learning management systems are becoming more and more important in the learning process in both educational and corporate settings. They can nowadays even be used to serve actual courses to the learner. However, one important feature is lacking in learning management systems: personalization. In this paper we look into this issue of personalization, which enables courses to be adapted to the knowledge level and learning preferences of the user. We briefly review the state of the art in adaptive systems that allow the creation of adaptive courses. Then, taking the popular LMS CLIX as an example, we look at the authoring of an adaptive Business English course. We demonstrate how such a static course can be made adaptive by using the GALE adaptive engine. We then show that GALE can be integrated into CLIX, and into other LMSs as well, so that personalization and adaptation can become widely established technology.

  14. Adaptation-Based Programming in Haskell

    Directory of Open Access Journals (Sweden)

    Tim Bauer

    2011-09-01

    Full Text Available We present an embedded DSL to support adaptation-based programming (ABP in Haskell. ABP is an abstract model for defining adaptive values, called adaptives, which adapt in response to some associated feedback. We show how our design choices in Haskell motivate higher-level combinators and constructs and help us derive more complicated compositional adaptives. We also show an important specialization of ABP is in support of reinforcement learning constructs, which optimize adaptive values based on a programmer-specified objective function. This permits ABP users to easily define adaptive values that express uncertainty anywhere in their programs. Over repeated executions, these adaptive values adjust to more efficient ones and enable the user's programs to self optimize. The design of our DSL depends significantly on the use of type classes. We will illustrate, along with presenting our DSL, how the use of type classes can support the gradual evolution of DSLs.

  15. Adaptation-Based Programming in Haskell

    CERN Document Server

    Bauer, Tim; Fern, Alan; Pinto, Jervis; 10.4204/EPTCS.66.1

    2011-01-01

    We present an embedded DSL to support adaptation-based programming (ABP) in Haskell. ABP is an abstract model for defining adaptive values, called adaptives, which adapt in response to some associated feedback. We show how our design choices in Haskell motivate higher-level combinators and constructs and help us derive more complicated compositional adaptives. We also show an important specialization of ABP is in support of reinforcement learning constructs, which optimize adaptive values based on a programmer-specified objective function. This permits ABP users to easily define adaptive values that express uncertainty anywhere in their programs. Over repeated executions, these adaptive values adjust to more efficient ones and enable the user's programs to self optimize. The design of our DSL depends significantly on the use of type classes. We will illustrate, along with presenting our DSL, how the use of type classes can support the gradual evolution of DSLs.

  16. The short version of the Critical Care Family Needs Inventory (CCFNI): adaptation and validation for a Spanish sample

    Directory of Open Access Journals (Sweden)

    S. Gómez Martínez

    2011-12-01

    Full Text Available Relatives play a very important role in the disease process and care of patients admitted to Intensive Care Units (ICU). It is therefore essential to know their needs in order to improve their adaptation to a situation as difficult as an ICU admission. The aim of this study was to adapt and validate the short version of the Critical Care Family Needs Inventory (CCFNI) in a Spanish sample. The adapted questionnaire, prepared according to international guidelines, was administered to 55 relatives of patients admitted to the ICU of the Hospital General Universitario de Castellón. After removing three items for various reasons, an exploratory factor analysis was performed on the remaining 11 items to obtain the factor structure. A descriptive item analysis was carried out, internal consistency was estimated with Cronbach's α, and construct validity was assessed with Pearson's correlation coefficient. The CCFNI showed a four-factor structure corresponding to: medical care of the patient, personal attention to the relative, information and doctor-family communication, and perceived possible improvements. This version of the CCFNI showed good internal consistency for both the total scale and the factors. The version of the CCFNI validated in this study is an adequate measure for assessing the different needs of relatives of patients admitted to an ICU, showing adequate psychometric properties.

  17. Optimizing heterologous expression in Dictyostelium: importance of 5' codon adaptation

    NARCIS (Netherlands)

    Vervoort, EB; van Ravestein, A; van Peij, NNME; Heikoop, JC; van Haastert, OJM; Verheijden, GF; Linskens, MHK; Heikoop, Judith C.; Haastert, Peter J.M. van; Verheijden, Gijs F.

    2000-01-01

    Expression of heterologous proteins in Dictyostelium discoideum presents unique research opportunities, such as the functional analysis of complex human glycoproteins after random mutagenesis. In one study, human chorionic gonadotropin (hCG) and human follicle stimulating hormone were expressed in D

  18. Plant sphingolipids: Their importance in cellular organization and adaption.

    Science.gov (United States)

    Michaelson, Louise V; Napier, Johnathan A; Molino, Diana; Faure, Jean-Denis

    2016-09-01

    Sphingolipids and their phosphorylated derivatives are ubiquitous bio-active components of cells. They are structural elements in the lipid bilayer and contribute to the dynamic nature of the membrane. They have been implicated in many cellular processes in yeast and animal cells, including aspects of signaling, apoptosis, and senescence. Although sphingolipids have a better defined role in animal systems, they have been shown to be central to many essential processes in plants including but not limited to, pollen development, signal transduction and in the response to biotic and abiotic stress. A fuller understanding of the roles of sphingolipids within plants has been facilitated by classical biochemical studies and the identification of mutants of model species. Recently the development of powerful mass spectrometry techniques hailed the advent of the emerging field of lipidomics enabling more accurate sphingolipid detection and quantitation. This review will consider plant sphingolipid biosynthesis and function in the context of these new developments. This article is part of a Special Issue entitled: Plant Lipid Biology edited by Kent D. Chapman and Ivo Feussner. PMID:27086144

  19. COPING: IMPORTANCE OF CONTEXTUAL FACTORS AND MEASUREMENT

    OpenAIRE

    Kaya, Cahit

    2014-01-01

    Coping skills cover an important area in the rehabilitation counseling field. The bipolarity of coping skills as being either adaptive or not adaptive has been prevalent throughout the literature. On the other hand, the influence of contextual factors on coping skills has been underemphasized. Recent research indicates that contextual factors play a major role in coping skills. This paper examines the importance of contextual factors for coping skills, particularly in relation to assessment issues in rehabilitation c...

  20. Adaptively robust filtering with classified adaptive factors

    Institute of Scientific and Technical Information of China (English)

    CUI Xianqiang; YANG Yuanxi

    2006-01-01

    The key problems in applying the adaptively robust filtering to navigation are to establish an equivalent weight matrix for the measurements and a suitable adaptive factor for balancing the contributions of the measurements and the predicted state information to the state parameter estimates. In this paper, an adaptively robust filtering with classified adaptive factors was proposed, based on the principles of the adaptively robust filtering and bi-factor robust estimation for correlated observations. According to the constant velocity model of Kalman filtering, the state parameter vector was divided into two groups, namely position and velocity. The estimator of the adaptively robust filtering with classified adaptive factors was derived, and the calculation expressions of the classified adaptive factors were presented. Test results show that the adaptively robust filtering with classified adaptive factors is not only robust in controlling the measurement outliers and the kinematic state disturbing but also reasonable in balancing the contributions of the predicted position and velocity, respectively, and its filtering accuracy is superior to the adaptively robust filter with single adaptive factor based on the discrepancy of the predicted position or the predicted velocity.
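
    The record above describes balancing measurement and predicted-state information through adaptive factors split by state group. A minimal numpy sketch of that general idea follows, using a constant-velocity model and an innovation-based factor that inflates the position and velocity blocks of the predicted covariance separately; the thresholds and the square-root split are illustrative assumptions, not the paper's formulas.

```python
import numpy as np

def adaptive_cv_kalman(zs, dt=1.0, r=1.0, q=0.01, c=1.5):
    """Constant-velocity Kalman filter with a simple innovation-based
    adaptive factor (illustrative sketch, not the paper's exact scheme)."""
    F = np.array([[1.0, dt], [0.0, 1.0]])
    H = np.array([[1.0, 0.0]])
    Q = q * np.array([[dt**3 / 3, dt**2 / 2], [dt**2 / 2, dt]])
    R = np.array([[r]])

    x = np.array([zs[0], 0.0])          # initial state: position, velocity
    P = np.eye(2)
    out = []
    for z in zs:
        # prediction
        x_pred = F @ x
        P_pred = F @ P @ F.T + Q
        # innovation and its standardized magnitude
        v = z - (H @ x_pred)[0]
        S = (H @ P_pred @ H.T + R)[0, 0]
        t = abs(v) / np.sqrt(S)
        # adaptive factor: down-weight the prediction when the innovation is large
        alpha = 1.0 if t <= c else c / t
        # "classified" flavour: inflate position and velocity blocks separately
        infl = np.diag([1.0 / alpha, 1.0 / np.sqrt(alpha)])
        P_pred = infl @ P_pred @ infl
        # measurement update
        K = P_pred @ H.T / (H @ P_pred @ H.T + R)[0, 0]
        x = x_pred + K[:, 0] * v
        P = (np.eye(2) - K @ H) @ P_pred
        out.append(x.copy())
    return np.array(out)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    truth = 0.5 * np.arange(100)                      # object moving at 0.5 units/step
    zs = truth + rng.normal(0, 1.0, size=100)
    zs[50] += 15.0                                    # one gross measurement outlier
    est = adaptive_cv_kalman(zs)
    print("final position/velocity estimate:", est[-1])
```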

  1. GLOBALIZATION AND IMPORT RISKS

    Directory of Open Access Journals (Sweden)

    Popa Ioan

    2014-07-01

    Full Text Available Delocalization of production and diversification of the sources of offer in the global market place the issue of protection of consumer rights in major consumption centres, namely the European Union, in a new light. A review of policies for the protection of consumer rights in the EU, USA and China reveals major differences regarding the protection of consumer rights and the existence of gaps, in particular in the implementation of effective legislation in this regard. As such, the risks associated with imports have become a major concern in the European Union. The consumer has – one can say – a central role in the globalization process, which justifies the measures aimed at its protection. Although worldwide there are major differences in the degree of market regulation in matters of protection of consumer rights, the trend is the continuous adaptation of the offer to the requirements of global demand. However, one can still find significant gaps which translate into risks specific to the consumers in developed countries, namely in the EU. An important issue arises from this radical change in the localization of production centres in relation to the main consumption centres. While in the developed world consumer rights protection has reached high levels, both through an appropriate legislative framework and through consumer awareness and activism regarding their rights, in the areas from which much of the offer on the Western market comes (China, India, etc.), the modern mentality on the protection of consumer rights is just emerging. A major requirement is therefore the provision of a status of the consumer compatible with the benefits and risks of globalization, a status defined by safety and protection of imports. This paper confirms the thesis that, ultimately, the main factor counteracting the risks in matters of protection of consumer rights is the consumer and their awareness of their rights.

  2. Convergence Performance of Adaptive Algorithms of L-Filters

    Directory of Open Access Journals (Sweden)

    Robert Hudec

    2003-01-01

    Full Text Available This paper deals with the determination of the convergence parameters of the adaptive algorithms used in adaptive L-filter design. The stability of the adaptation process, the convergence rate (or adaptation time), and the behaviour of the convergence curve are among the basic properties of adaptive algorithms. L-filters with a variety of adaptive algorithms were used to determine them. Knowing the convergence behaviour of adaptive filters is important mainly for hardware applications, where real-time filtering or adaptation of the filter coefficients with a limited amount of input data is required.
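
    Since the abstract discusses convergence curves of adaptive algorithms, a small sketch of how such a curve is obtained for a basic LMS filter may help; the filter order, step size and noise level below are arbitrary illustrative choices, and L-filter specifics (order statistics) are not reproduced.

```python
import numpy as np

def lms_learning_curve(n_taps=8, mu=0.01, n_samples=2000, n_runs=50, seed=0):
    """Ensemble-averaged squared-error learning curve of an LMS adaptive
    filter identifying an unknown FIR system (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    h_true = rng.normal(size=n_taps)          # unknown system to identify
    mse = np.zeros(n_samples)
    for _ in range(n_runs):
        w = np.zeros(n_taps)                  # adaptive filter weights
        x_buf = np.zeros(n_taps)              # most recent input samples
        x = rng.normal(size=n_samples)
        d = np.convolve(x, h_true)[:n_samples] + 0.01 * rng.normal(size=n_samples)
        for n in range(n_samples):
            x_buf = np.roll(x_buf, 1)
            x_buf[0] = x[n]
            e = d[n] - w @ x_buf              # a-priori error
            w += 2 * mu * e * x_buf           # LMS weight update
            mse[n] += e**2
    return mse / n_runs                       # convergence curve

curve = lms_learning_curve()
print("MSE after adaptation:", curve[-100:].mean())
```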

  3. Economics of adaptation to climate change; Economie de l'adaptation au changement climatique

    Energy Technology Data Exchange (ETDEWEB)

    Perthuis, Ch.; Hallegatte, St.; Lecocq, F.

    2010-02-15

    This report proposes a general economic framework for the issue of adaptation to climate change in order to help public and private actors to build up efficient adaptation strategies. It proposes a general definition of adaptation, identifies the major stakes for these strategies, and discusses the assessment of global costs of adaptation to climate change. It discusses the role and modalities of public action and gives some examples of possible adaptation measures in some important sectors (building and town planning, energy and transport infrastructures, water and agriculture, ecosystems, insurance). It examines the regional and national dimensions of adaptation and their relationship, and defines steps for implementing an adaptation strategy. It describes and discusses the use of economic tools in the elaboration of an adaptation strategy, i.e. how to take uncertainties into account, which scenarios to choose, how to use economic calculations to assess adaptation policies

  4. Language sampling

    DEFF Research Database (Denmark)

    Rijkhoff, Jan; Bakker, Dik

    1998-01-01

    This article has two aims: [1] to present a revised version of the sampling method that was originally proposed in 1993 by Rijkhoff, Bakker, Hengeveld and Kahrel, and [2] to discuss a number of other approaches to language sampling in the light of our own method. We will also demonstrate how our...

  5. Sampling Development

    Science.gov (United States)

    Adolph, Karen E.; Robinson, Scott R.

    2011-01-01

    Research in developmental psychology requires sampling at different time points. Accurate depictions of developmental change provide a foundation for further empirical studies and theories about developmental mechanisms. However, overreliance on widely spaced sampling intervals in cross-sectional and longitudinal designs threatens the validity of…

  6. A method of language sampling

    DEFF Research Database (Denmark)

    Rijkhoff, Jan; Bakker, Dik; Hengeveld, Kees;

    1993-01-01

    In recent years more attention has been paid to the quality of language samples in typological work. Without an adequate sampling strategy, samples may suffer from various kinds of bias. In this article we propose a sampling method in which the genetic criterion is taken as the most important: samples ...

  7. Adaptive Image Denoising by Mixture Adaptation.

    Science.gov (United States)

    Luo, Enming; Chan, Stanley H; Nguyen, Truong Q

    2016-10-01

    We propose an adaptive learning procedure to learn patch-based image priors for image denoising. The new algorithm, called the expectation-maximization (EM) adaptation, takes a generic prior learned from a generic external database and adapts it to the noisy image to generate a specific prior. Different from existing methods that combine internal and external statistics in ad hoc ways, the proposed algorithm is rigorously derived from a Bayesian hyper-prior perspective. There are two contributions of this paper. First, we provide full derivation of the EM adaptation algorithm and demonstrate methods to improve the computational complexity. Second, in the absence of the latent clean image, we show how EM adaptation can be modified based on pre-filtering. The experimental results show that the proposed adaptation algorithm yields consistently better denoising results than the one without adaptation and is superior to several state-of-the-art algorithms. PMID:27416593
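
    As a rough illustration of the adaptation idea (a generic prior pulled toward the statistics of the data at hand), the sketch below runs a few EM iterations that adapt a 1-D Gaussian mixture to new samples. It is only the classical EM update on toy data, not the Bayesian hyper-prior derivation or the image-patch pipeline of the paper.

```python
import numpy as np

def em_adapt_gmm(x, means, variances, weights, n_iter=5):
    """A few EM iterations adapting a 'generic' 1-D Gaussian mixture to new
    data x (illustrative sketch of EM adaptation, not the paper's derivation)."""
    means = np.asarray(means, dtype=float)
    variances = np.asarray(variances, dtype=float)
    weights = np.asarray(weights, dtype=float)
    for _ in range(n_iter):
        # E-step: responsibilities of each component for each sample
        diff = x[:, None] - means[None, :]
        log_pdf = -0.5 * (diff**2 / variances + np.log(2 * np.pi * variances))
        log_r = np.log(weights) + log_pdf
        log_r -= log_r.max(axis=1, keepdims=True)
        r = np.exp(log_r)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: update mixture parameters from the responsibilities
        nk = r.sum(axis=0)
        weights = nk / len(x)
        means = (r * x[:, None]).sum(axis=0) / nk
        variances = (r * (x[:, None] - means)**2).sum(axis=0) / nk
    return means, variances, weights

rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(-2.2, 0.5, 300), rng.normal(2.5, 0.8, 300)])
# a generic prior learned elsewhere, adapted to the observed data
print(em_adapt_gmm(data, means=[-1.0, 1.0], variances=[1.0, 1.0], weights=[0.5, 0.5]))
```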

  8. Adaptive independent component analysis to analyze electrocardiograms

    Science.gov (United States)

    Yim, Seong-Bin; Szu, Harold H.

    2001-03-01

    In this work, we apply an adaptive version of independent component analysis (ICA) to the nonlinear measurement of electro-cardio-graphic (ECG) signals for potential detection of abnormal conditions in the heart. In principle, unsupervised ICA neural networks can demix the components of measured ECG signals. However, the nonlinear pre-amplification and post-measurement processing make the linear ICA model no longer valid. This is handled by a proposed adaptive rectification pre-processing step that linearizes the ECG preamplifier; linear ICA is then applied in an iterative manner until the outputs have stable kurtosis. We call this new approach adaptive ICA. Each component may correspond to an individual heart function, either normal or abnormal. Adaptive ICA neural networks have the potential to make abnormal components more apparent, even when they are masked by normal components in the original measured signals. This is particularly important for diagnosis well in advance of the actual onset of heart attack, in which abnormalities in the original measured ECG signals may be difficult to detect. This is the first known work that applies adaptive ICA to ECG signals beyond noise extraction, to the detection of abnormal heart function.
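
    A minimal demixing example in the spirit of the record, assuming synthetic sources and a hypothetical 2x2 mixing matrix in place of real ECG leads; the adaptive rectification pre-processing is not reproduced, and scikit-learn's FastICA stands in for the ICA stage, with kurtosis used as a rough stability check.

```python
import numpy as np
from scipy.stats import kurtosis
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 8, 2000)
# two synthetic "source" components standing in for heart-related signals
s1 = np.sign(np.sin(2 * np.pi * 1.2 * t))          # spiky, super-Gaussian-like source
s2 = np.sin(2 * np.pi * 0.3 * t)                   # slow baseline-like source
S = np.c_[s1, s2] + 0.05 * rng.normal(size=(len(t), 2))

A = np.array([[1.0, 0.6], [0.4, 1.0]])             # hypothetical linear mixing ("leads")
X = S @ A.T                                        # observed mixtures

ica = FastICA(n_components=2, random_state=0)
components = ica.fit_transform(X)                  # demixed components

# kurtosis as a rough check that the recovered components have settled on
# non-Gaussian statistics, in the spirit of the iterative procedure above
print("kurtosis of recovered components:", kurtosis(components, axis=0))
```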

  9. Mexico: Imports or exports?

    International Nuclear Information System (INIS)

    This presentation provides an overview of Mexico's energy sector. Proven oil reserves place Mexico in ninth position in the world and fourth largest in natural gas reserves. Energy is one of the most important economic activities of the country, representing 3 per cent of Gross Domestic Product (GDP). Oil exports represent 8.4 per cent of total exports. Approximately 40 per cent of total public investment is earmarked for energy projects. The author discusses energy resources and energy sector limitations. The energy sector plan for the period 2001-2006 is discussed. Its goals are to ensure energy supply, to develop the energy sector, to stimulate participation of Mexican enterprises, to promote renewable energy sources, and to strengthen international energy cooperation. The regulatory framework is being adapted to increase private investment. Some graphs are presented, displaying the primary energy production and primary energy consumption. Energy sector reforms are reviewed, as are electricity and natural gas reforms. The energy sector demand for 2000-2010 and investment requirements are reviewed, as well as fuel consumption for power generation. The author discusses the National Pipeline System (SNG) and the bottlenecks caused by pressure efficiency in the northeast, flow restriction on several pipeline segments, variability of the Petroleos Mexicanos (PEMEX) own use, and pressure drop on central regions. The entire prospect for natural gas in the country is reviewed, along with the Strategic Gas Program (PEG) consisting of 20 projects, including 4 non-associated natural gas, 9 exploration and 7 optimization. A section dealing with multiple service contracts is included in the presentation. The authors conclude by stating that the priority is a national energy policy to address Mexico's energy security requirements, to increase natural gas production while promoting the diversification of imports, and a regulatory framework to be updated in light of current

  10. Adaptation and creativity in cultural context

    Directory of Open Access Journals (Sweden)

    Leonora M. Cohen

    2012-06-01

    Full Text Available Adaptation is the fit between the individual and the environment. The dynamic interplay between person, culture, and environment is one of the most important issues in analyzing creativity. Adaptation is defined as the fit or adjustment of the individual to external conditions, but adaptation can also mean moving from one environment to another more suitable, or even forcing the environment to adapt in response to creative efforts. Culture impacts creativity in limiting acceptable boundaries, yet providing the artifacts used in creating. Culture is impacted and changed by creative efforts. Tight conformity to confining environments or cultures can stifle. The creator must be aware of cultural values and not overstep these boundaries for work to be accepted. A developmental continuum of adaptive, creative behaviors suggests a shift from individual adaptation to the environment to adaptation by the world to the individual.

  11. Adaptive Modular Playware

    DEFF Research Database (Denmark)

    Lund, Henrik Hautop; Þorsteinsson, Arnar Tumi

    2011-01-01

    In this paper, we describe the concept of adaptive modular playware, where the playware adapts to the interaction of the individual user. We hypothesize that there are individual differences in user interaction capabilities and styles, and that adaptive playware may adapt to the individual user’s...

  12. Performance evaluation of communication systems via importance sampling

    NARCIS (Netherlands)

    Remondo Bueno, D.

    2000-01-01

    In the design and development of telecommunication systems, the preparation of experiments can be made more effective by predicting the system performance and its dependence on the different system parameters. This can be done by modeling the system and using performance evaluation methods. This dis

  13. The importance of genus Candida in human samples

    Directory of Open Access Journals (Sweden)

    Bojić-Miličević Gordana M.

    2008-01-01

    Full Text Available Microbiology is a rapidly changing field. As new research and experience broaden our knowledge, changes in the approach to diagnosis and therapy have become necessary and appropriate. Recommended dosage of drugs, method and duration of administration, as well as contraindications to use, evolve over time for all drugs. Over the last 2 decades, Candida species have emerged as causes of substantial morbidity and mortality in hospitalized individuals. Isolation of Candida from blood or other sterile sites, excluding the urinary tract, defines invasive candidiasis. Candida species are currently the fourth most common cause of bloodstream infections (that is, candidemia) in U.S. hospitals and occur primarily in the intensive care unit (ICU), where candidemia is recognized in up to 1% of patients and where deep-seated Candida infections are recognized in an additional 1 to 2% of patients. Despite the introduction of newer anti-Candida agents, invasive candidiasis continues to have an attributable mortality rate of 40 to 49%, excess ICU and hospital stays of 12.7 days and 15.5 days, respectively, and increased care costs. Postmortem studies suggest that death rates related to invasive candidiasis might, in fact, be higher than those described because of undiagnosed and therefore untreated infection. The diagnosis of invasive candidiasis remains challenging for both clinicians and microbiologists. Reasons for missed diagnoses include nonspecific risk factors and clinical manifestations, low sensitivity of microbiological culture techniques, and unavailability of deep tissue cultures because of risks associated with the invasive procedures used to obtain them. Thus, a substantial proportion of invasive candidiasis in patients in the ICU is assumed to be undiagnosed and untreated. Yet even when invasive candidiasis is diagnosed, culture diagnosis delays treatment for 2 to 3 days, which contributes to mortality. Interventions that do not rely on a specific diagnosis and are implemented early in the course of Candida infection (that is, empirical therapy) or before Candida infection occurs (that is, prophylaxis) might improve patient survival and may be warranted. Selective and nonselective administration of anti-Candida prophylaxis is practiced in some ICUs. Several trials have tested this, but results were limited by low statistical power and choice of outcomes. Thus, the role of anti-Candida prophylaxis for patients in the ICU remains controversial. Initiating anti-Candida therapy for patients in the ICU who have suspected infection but have not responded to antibacterial therapy (empirical therapy) is practiced in some hospitals. This practice, however, remains a subject of considerable debate. These patients are perceived to be at higher risk from invasive candidiasis and therefore are likely to benefit from empirical therapy. Nonetheless, empirical anti-Candida therapies have not been evaluated in a randomized trial and would share shortcomings that are similar to those described for prophylactic strategies. Current treatment guidelines by the Infectious Diseases Society of America (IDSA) do not specify whether empirical anti-Candida therapy should be provided to immunocompetent patients. If such therapy is given, IDSA recommends that its use should be limited to patients with Candida colonization in multiple sites, patients with several other risk factors, and patients with no uncorrected causes of fever. Without data from clinical trials, determining an optimal anti-Candida strategy for patients in the ICU is challenging. Identifying such a strategy can help guide clinicians in choosing adequate therapy and may improve patient outcomes. In our study, we developed a decision analytic model to evaluate the cost-effectiveness of empirical anti-Candida therapy given to high-risk patients in the ICU, defined as those with altered temperature (fever or hypothermia) or unexplained hypotension despite 3 days of antibacterial therapy in the ICU.

  14. How important is importance for prospective memory?

    OpenAIRE

    Stefan eWalter; Beat eMeier

    2014-01-01

    Forgetting to carry out an intention as planned can have serious consequences in everyday life. People sometimes even forget intentions that they consider as very important. Here, we review the literature on the impact of importance on prospective memory performance. We highlight different methods used to manipulate the importance of a prospective memory task such as providing rewards, importance relative to other ongoing activities, absolute importance, and providing social motives. Moreover...

  15. Optimization under uncertainty: Adaptive variance reduction, adaptive metamodeling, and investigation of robustness measures

    Science.gov (United States)

    Medina, Juan Camilo

    This dissertation offers computational and theoretical advances for optimization under uncertainty problems that utilize a probabilistic framework for addressing such uncertainties, and adopt a probabilistic performance as objective function. Emphasis is placed on applications that involve potentially complex numerical and probability models. A generalized approach is adopted, treating the system model as a "black-box" and relying on stochastic simulation for evaluating the probabilistic performance. This approach can impose, though, an elevated computational cost, and two of the advances offered in this dissertation aim at decreasing the computational burden associated with stochastic simulation when integrated with optimization applications. The first one develops an adaptive implementation of importance sampling (a popular variance reduction technique) by sharing information across the iterations of the numerical optimization algorithm. The system model evaluations from the current iteration are utilized to formulate importance sampling densities for subsequent iterations with only a small additional computational effort. The characteristics of these densities as well as the specific model parameters these densities span are explicitly optimized. The second advancement focuses on adaptive tuning of a kriging metamodel to replace the computationally intensive system model. A novel implementation is considered, establishing a metamodel with respect to both the uncertain model parameters as well as the design variables, offering significant computational savings. Additionally, the adaptive selection of certain characteristics of the metamodel, such as support points or order of basis functions, is considered by utilizing readily available information from the previous iteration of the optimization algorithm. The third advancement extends to a different application and considers the assessment of the appropriateness of different candidate robust designs. A novel
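
    The first advance described above, re-using model evaluations from one optimization iteration to build an importance sampling density for the next, can be sketched with a toy limit-state function: a Gaussian proposal is recentred on the failure samples found so far. The function g and all parameter values below are hypothetical stand-ins, not the dissertation's formulation.

```python
import numpy as np
from scipy import stats

def g(x, d):
    """Toy limit-state function; failure when g <= 0. The design variable d
    shifts the threshold (hypothetical stand-in for the optimized system)."""
    return d - (x[:, 0] + x[:, 1])

def adaptive_is_pf(d, n=2000, iters=4, seed=0):
    """Estimate P(g(X, d) <= 0), X ~ N(0, I), by recentering a Gaussian
    importance sampling density on previously found failure samples."""
    rng = np.random.default_rng(seed)
    mu = np.zeros(2)                                  # start from the nominal density
    pf = 0.0
    for _ in range(iters):
        x = rng.normal(mu, 1.0, size=(n, 2))
        fail = g(x, d) <= 0
        # importance weights: nominal density / current proposal density
        w = np.exp(stats.multivariate_normal(np.zeros(2)).logpdf(x)
                   - stats.multivariate_normal(mu).logpdf(x))
        pf = np.mean(fail * w)
        if fail.any():                                # recenter on weighted failure samples
            mu = np.average(x[fail], axis=0, weights=w[fail])
    return pf

print("estimated failure probability:", adaptive_is_pf(d=5.0))
```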

  16. Adaptive Robust Variable Selection

    CERN Document Server

    Fan, Jianqing; Barut, Emre

    2012-01-01

    Heavy-tailed high-dimensional data are commonly encountered in various scientific fields and pose great challenges to modern statistical analysis. A natural procedure to address this problem is to use penalized least absolute deviation (LAD) method with weighted $L_1$-penalty, called weighted robust Lasso (WR-Lasso), in which weights are introduced to ameliorate the bias problem induced by the $L_1$-penalty. In the ultra-high dimensional setting, where the dimensionality can grow exponentially with the sample size, we investigate the model selection oracle property and establish the asymptotic normality of the WR-Lasso. We show that only mild conditions on the model error distribution are needed. Our theoretical results also reveal that adaptive choice of the weight vector is essential for the WR-Lasso to enjoy these nice asymptotic properties. To make the WR-Lasso practically feasible, we propose a two-step procedure, called adaptive robust Lasso (AR-Lasso), in which the weight vector in the second step is c...
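
    A hedged sketch of the general two-step weighted-L1 idea follows: an initial fit supplies data-driven weights, and the weighted problem is solved by rescaling the design matrix so an off-the-shelf Lasso applies. Squared loss is used here for brevity, whereas the AR-Lasso of the paper uses least absolute deviations; the penalty levels are arbitrary.

```python
import numpy as np
from sklearn.linear_model import Lasso

def adaptive_weighted_lasso(X, y, lam1=0.1, lam2=0.1, eps=1e-6):
    """Two-step weighted-L1 sketch: an initial fit supplies data-driven
    weights for a second, weighted Lasso (squared loss, not the paper's LAD)."""
    beta0 = Lasso(alpha=lam1).fit(X, y).coef_
    w = 1.0 / (np.abs(beta0) + eps)            # large weight = stronger shrinkage
    Xw = X / w                                 # reparameterize so a standard Lasso applies
    gamma = Lasso(alpha=lam2).fit(Xw, y).coef_
    return gamma / w                           # map back to the original coefficients

rng = np.random.default_rng(0)
n, p = 100, 20
X = rng.normal(size=(n, p))
beta_true = np.zeros(p)
beta_true[:3] = [3.0, -2.0, 1.5]
y = X @ beta_true + rng.standard_t(df=3, size=n)   # heavy-tailed noise
print(np.round(adaptive_weighted_lasso(X, y), 2))
```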

  17. Bayesian Adaptive Exploration

    Science.gov (United States)

    Loredo, Thomas J.

    2004-04-01

    I describe a framework for adaptive scientific exploration based on iterating an Observation-Inference-Design cycle that allows adjustment of hypotheses and observing protocols in response to the results of observation on-the-fly, as data are gathered. The framework uses a unified Bayesian methodology for the inference and design stages: Bayesian inference to quantify what we have learned from the available data and predict future data, and Bayesian decision theory to identify which new observations would teach us the most. When the goal of the experiment is simply to make inferences, the framework identifies a computationally efficient iterative ``maximum entropy sampling'' strategy as the optimal strategy in settings where the noise statistics are independent of signal properties. Results of applying the method to two ``toy'' problems with simulated data-measuring the orbit of an extrasolar planet, and locating a hidden one-dimensional object-show the approach can significantly improve observational efficiency in settings that have well-defined nonlinear models. I conclude with a list of open issues that must be addressed to make Bayesian adaptive exploration a practical and reliable tool for optimizing scientific exploration.
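
    A small sketch of the "maximum entropy sampling" design step, assuming a toy conjugate Gaussian linear model with a quadratic feature map: the next observation is the candidate with the largest posterior predictive variance. For Gaussian models this variance does not depend on the observed y values, which is what makes the strategy computationally cheap.

```python
import numpy as np

def design(x):
    """Quadratic feature map for a toy 1-D model (assumed for illustration)."""
    return np.c_[np.ones_like(x), x, x**2]

def max_entropy_next_point(x_obs, y_obs, candidates, sigma=0.1, tau=10.0):
    """Pick the candidate with the largest posterior predictive variance;
    for Gaussian models this is the maximum entropy sampling choice.
    (y_obs is unused because the Gaussian predictive variance depends only
    on the design, not on the observed values.)"""
    Phi = design(x_obs)
    # posterior covariance of the weights for a conjugate Gaussian linear model
    A = Phi.T @ Phi / sigma**2 + np.eye(Phi.shape[1]) / tau**2
    cov_w = np.linalg.inv(A)
    Phi_c = design(candidates)
    pred_var = sigma**2 + np.einsum("ij,jk,ik->i", Phi_c, cov_w, Phi_c)
    return candidates[np.argmax(pred_var)]

rng = np.random.default_rng(0)
x_obs = np.array([0.1, 0.2, 0.25])                 # data gathered so far
y_obs = np.sin(3 * x_obs) + 0.1 * rng.normal(size=3)
grid = np.linspace(0, 1, 101)                      # possible next observations
print("most informative next x:", max_entropy_next_point(x_obs, y_obs, grid))
```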

  18. Device-aware Adaptation of Websites

    OpenAIRE

    Barsomo, Milad; Hurtig, Mats

    2014-01-01

    The use of handheld devices such as smart phones and tablets has exploded in the last few years. These mobile devices differ from regular desktops by having limited battery power, processing power, bandwidth, internal memory, and screen size. With many device types and with mobile adaptation being done in many ways, it is important for websites to adapt to mobile users. This thesis characterises how websites currently adapt to mobile devices. For our analysis and data collect...

  19. Climate Policy Must Favour Mitigation Over Adaptation

    OpenAIRE

    SCHUMACHER, Ingmar

    2016-01-01

    In climate change policy, adaptation tends to be viewed as being as important as mitigation. In this article we present a simple yet general argument for which mitigation must be preferred to adaptation. The argument rests on the observation that mitigation is a public good while adaptation is a private one. This implies that the more one disaggregates the units in a social welfare function, i.e. the more one teases out the public good nature of mitigation, the lower is average income and thus less ...

  20. Tone-Mapped Mean-Shift Based Environment Map Sampling.

    Science.gov (United States)

    Feng, Wei; Yang, Ying; Wan, Liang; Yu, Changguo

    2016-09-01

    In this paper, we present a novel approach for environment map sampling, which is an effective and pragmatic technique to reduce the computational cost of realistic rendering and obtain plausible rendered images. The proposed approach exploits the advantage of adaptive mean-shift image clustering with the aid of tone-mapping, yielding oversegmented strata that have uniform intensities and capture the shapes of light regions. The resulting strata, however, have unbalanced importance metric values for rendering, and the number of strata is not user-controlled. To handle these issues, we develop an adaptive split-and-merge scheme that refines the strata and obtains a better balanced strata distribution. Compared to the state-of-the-art methods, our approach achieves comparable and even better rendering quality in terms of the SSIM, RMSE and HDRVDP2 image quality metrics. Experimental results further show that our approach is more robust to variation of viewpoint, environment rotation, and sample number. PMID:26584494

  1. 5. Sampling

    International Nuclear Information System (INIS)

    The sampling is described for radionuclide X-ray fluorescence analysis. Aerosols are captured with various filter materials whose properties are summed up in the table. Fine dispersed solid and liquid particles and gaseous admixtures may be captured by bubbling air through a suitable absorption solution. The concentration of small amounts of impurities from large volumes of air is done by adsorbing impurities on surfactants, e.g., activated charcoal, silica gel, etc. Aerosols may be captured using an electrostatic precipitator and aerosol fractions may be separated with a cascade impactor. Water sampling differs by the water source, i.e., ground water, surface water, rain or waste water. Soil samples are taken by probes. (ES)

  2. Bayesian Analysis for Exponential Random Graph Models Using the Adaptive Exchange Sampler.

    Science.gov (United States)

    Jin, Ick Hoon; Yuan, Ying; Liang, Faming

    2013-10-01

    Exponential random graph models have been widely used in social network analysis. However, these models are extremely difficult to handle from a statistical viewpoint, because of the intractable normalizing constant and model degeneracy. In this paper, we consider a fully Bayesian analysis for exponential random graph models using the adaptive exchange sampler, which solves the intractable normalizing constant and model degeneracy issues encountered in Markov chain Monte Carlo (MCMC) simulations. The adaptive exchange sampler can be viewed as a MCMC extension of the exchange algorithm, and it generates auxiliary networks via an importance sampling procedure from an auxiliary Markov chain running in parallel. The convergence of this algorithm is established under mild conditions. The adaptive exchange sampler is illustrated using a few social networks, including the Florentine business network, molecule synthetic network, and dolphins network. The results indicate that the adaptive exchange algorithm can produce more accurate estimates than approximate exchange algorithms, while maintaining the same computational efficiency. PMID:24653788
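
    For orientation, the sketch below implements the plain exchange algorithm on a toy model whose normalizing constant is pretended to be unknown; the adaptive exchange sampler of the paper replaces the exact auxiliary-data simulation with importance-reweighted draws from an auxiliary chain run in parallel, which is not reproduced here.

```python
import numpy as np

def log_q(y, theta):
    """Unnormalized log-likelihood; pretend its normalizing constant is unknown."""
    return -0.5 * np.sum((y - theta) ** 2)

def exchange_sampler(y, n_iter=5000, prop_sd=0.3, prior_sd=10.0, seed=0):
    """Basic exchange algorithm for a doubly-intractable posterior (sketch).
    The intractable Z(theta) terms cancel in the acceptance ratio because an
    auxiliary data set is simulated exactly at the proposed parameter."""
    rng = np.random.default_rng(seed)
    log_prior = lambda th: -0.5 * th**2 / prior_sd**2
    theta = y.mean()
    chain = np.empty(n_iter)
    for t in range(n_iter):
        theta_new = theta + prop_sd * rng.normal()
        y_aux = theta_new + rng.normal(size=y.size)    # auxiliary data ~ f(.|theta_new)
        log_ratio = (log_q(y, theta_new) + log_prior(theta_new) + log_q(y_aux, theta)
                     - log_q(y, theta) - log_prior(theta) - log_q(y_aux, theta_new))
        if np.log(rng.uniform()) < log_ratio:
            theta = theta_new
        chain[t] = theta
    return chain

rng = np.random.default_rng(1)
y = rng.normal(2.0, 1.0, size=50)
chain = exchange_sampler(y)
print("posterior mean estimate:", chain[1000:].mean())
```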

  3. Bayesian analysis for exponential random graph models using the adaptive exchange sampler

    KAUST Repository

    Jin, Ick Hoon

    2013-01-01

    Exponential random graph models have been widely used in social network analysis. However, these models are extremely difficult to handle from a statistical viewpoint, because of the existence of intractable normalizing constants. In this paper, we consider a fully Bayesian analysis for exponential random graph models using the adaptive exchange sampler, which solves the issue of intractable normalizing constants encountered in Markov chain Monte Carlo (MCMC) simulations. The adaptive exchange sampler can be viewed as a MCMC extension of the exchange algorithm, and it generates auxiliary networks via an importance sampling procedure from an auxiliary Markov chain running in parallel. The convergence of this algorithm is established under mild conditions. The adaptive exchange sampler is illustrated using a few social networks, including the Florentine business network, molecule synthetic network, and dolphins network. The results indicate that the adaptive exchange algorithm can produce more accurate estimates than approximate exchange algorithms, while maintaining the same computational efficiency.

  4. Bayesian Analysis for Exponential Random Graph Models Using the Adaptive Exchange Sampler*

    Science.gov (United States)

    Jin, Ick Hoon; Yuan, Ying; Liang, Faming

    2014-01-01

    Exponential random graph models have been widely used in social network analysis. However, these models are extremely difficult to handle from a statistical viewpoint, because of the intractable normalizing constant and model degeneracy. In this paper, we consider a fully Bayesian analysis for exponential random graph models using the adaptive exchange sampler, which solves the intractable normalizing constant and model degeneracy issues encountered in Markov chain Monte Carlo (MCMC) simulations. The adaptive exchange sampler can be viewed as a MCMC extension of the exchange algorithm, and it generates auxiliary networks via an importance sampling procedure from an auxiliary Markov chain running in parallel. The convergence of this algorithm is established under mild conditions. The adaptive exchange sampler is illustrated using a few social networks, including the Florentine business network, molecule synthetic network, and dolphins network. The results indicate that the adaptive exchange algorithm can produce more accurate estimates than approximate exchange algorithms, while maintaining the same computational efficiency. PMID:24653788

  5. Sample size methodology

    CERN Document Server

    Desu, M M

    2012-01-01

    One of the most important problems in designing an experiment or a survey is sample size determination and this book presents the currently available methodology. It includes both random sampling from standard probability distributions and from finite populations. Also discussed is sample size determination for estimating parameters in a Bayesian setting by considering the posterior distribution of the parameter and specifying the necessary requirements. The determination of the sample size is considered for ranking and selection problems as well as for the design of clinical trials. Appropria
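
    Two of the most common calculations covered by such texts can be sketched in a few lines: the sample size needed for a confidence interval of given half-width, and the per-group size for a two-sample comparison under a normal approximation. The numbers in the example calls are arbitrary.

```python
from math import ceil
from scipy.stats import norm

def n_for_mean(sigma, margin, conf=0.95):
    """Sample size so a conf-level CI for a mean has half-width <= margin."""
    z = norm.ppf(0.5 + conf / 2)
    return ceil((z * sigma / margin) ** 2)

def n_per_group(sigma, delta, alpha=0.05, power=0.80):
    """Per-group sample size for a two-sample z-test detecting a mean
    difference delta (normal approximation)."""
    z_a, z_b = norm.ppf(1 - alpha / 2), norm.ppf(power)
    return ceil(2 * ((z_a + z_b) * sigma / delta) ** 2)

print(n_for_mean(sigma=15, margin=3))        # 97 subjects
print(n_per_group(sigma=10, delta=5))        # 63 per group
```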

  6. Bayesian Adaptive Exploration

    CERN Document Server

    Loredo, T J

    2004-01-01

    I describe a framework for adaptive scientific exploration based on iterating an Observation--Inference--Design cycle that allows adjustment of hypotheses and observing protocols in response to the results of observation on-the-fly, as data are gathered. The framework uses a unified Bayesian methodology for the inference and design stages: Bayesian inference to quantify what we have learned from the available data and predict future data, and Bayesian decision theory to identify which new observations would teach us the most. When the goal of the experiment is simply to make inferences, the framework identifies a computationally efficient iterative ``maximum entropy sampling'' strategy as the optimal strategy in settings where the noise statistics are independent of signal properties. Results of applying the method to two ``toy'' problems with simulated data--measuring the orbit of an extrasolar planet, and locating a hidden one-dimensional object--show the approach can significantly improve observational eff...

  7. Adaptive compressive sensing camera

    Science.gov (United States)

    Hsu, Charles; Hsu, Ming K.; Cha, Jae; Iwamura, Tomo; Landa, Joseph; Nguyen, Charles; Szu, Harold

    2013-05-01

    We have embedded an Adaptive Compressive Sensing (ACS) algorithm on a Charge-Coupled-Device (CCD) camera, based on the simple concept that each pixel is a charge bucket whose charge comes from the Einstein photoelectric conversion effect. Applying the manufacturing design principle, we only allow each working component to be altered by a minimum of one step. We then simulated what such a camera could do for real-world persistent surveillance, taking into account diurnal, all-weather, and seasonal variations. The data storage savings are immense, and the order of magnitude of the saving is inversely proportional to the target angular speed. We designed two new components of the CCD camera. Due to the matured CMOS (complementary metal-oxide-semiconductor) technology, the on-chip Sample and Hold (SAH) circuitry can be designed as a dual Photon Detector (PD) analog circuit for change detection that predicts skipping or going forward at a sufficient sampling frame rate. For an admitted frame, a purely random sparse matrix [Φ] is implemented at the bucket (pixel) level by biasing the charge-transport voltage toward neighbouring buckets or not; if not, the charge goes to the ground drainage. Since the snapshot image is not a video, we could not apply the usual MPEG video compression and Huffman entropy codec, nor the powerful WaveNet Wrapper, at the sensor level. We shall compare (i) pre-processing by FFT, thresholding of the significant Fourier mode components and inverse FFT to check PSNR; and (ii) post-processing image recovery done selectively by the CDT&D adaptive version of linear programming with L1 minimization and L2 similarity. For (ii), we need to determine, in the selection of new frames by the SAH circuitry, the degree of information (d.o.i.) K(t), which dictates the number of purely random linear sparse combinations of measurement data via [Φ]M,N, M(t) = K(t) log N(t).
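
    The post-processing recovery step mentioned in (ii) amounts to solving a sparse L1 problem from compressive measurements y = Φx. The sketch below uses generic iterative soft-thresholding (ISTA) on simulated data, not the CDT&D scheme or any camera-specific detail; signal length, measurement count and sparsity are illustrative.

```python
import numpy as np

def ista(Phi, y, lam=0.05, n_iter=300):
    """Iterative soft-thresholding for min_x 0.5*||y - Phi x||^2 + lam*||x||_1,
    a generic L1 recovery (not the CDT&D scheme referenced above)."""
    L = np.linalg.norm(Phi, 2) ** 2            # Lipschitz constant of the gradient
    x = np.zeros(Phi.shape[1])
    for _ in range(n_iter):
        grad = Phi.T @ (Phi @ x - y)
        z = x - grad / L
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)   # soft threshold
    return x

rng = np.random.default_rng(0)
N, M, K = 256, 64, 5                            # signal length, measurements, sparsity
x_true = np.zeros(N)
x_true[rng.choice(N, K, replace=False)] = rng.normal(0, 1, K)
Phi = rng.normal(0, 1 / np.sqrt(M), size=(M, N))   # purely random measurement matrix
y = Phi @ x_true + 0.01 * rng.normal(size=M)
x_hat = ista(Phi, y)
print("relative recovery error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```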

  8. Motion Tracking with Fast Adaptive Background Subtraction

    Institute of Scientific and Technical Information of China (English)

    Xiao De-gui; Yu Sheng-sheng; Zhou Jing-li

    2003-01-01

    Extracting and tracking moving objects is usually one of the most important tasks of intelligent video surveillance systems. This paper presents a fast and adaptive background subtraction algorithm and the motion tracking process using this algorithm. The algorithm uses only the luminance components of sampled image sequence pixels and models every pixel with a statistical model. The algorithm is characterized by its ability to detect sudden lighting changes in real time, and to extract and track moving objects faster. It is shown that our algorithm can be realized with lower time and space complexity and an adjustable object detection error rate in comparison with other background subtraction algorithms. Making use of the algorithm, an indoor monitoring system is also worked out and the motion tracking process is presented in this paper. Experimental results attest to the algorithm's good performance when used in an indoor monitoring system.
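
    A compact sketch of the kind of per-pixel statistical background model the abstract describes, using a running Gaussian on luminance frames; the learning rate, threshold and the synthetic frames are illustrative assumptions rather than the paper's exact model.

```python
import numpy as np

class RunningGaussianBackground:
    """Per-pixel running mean/variance background model on luminance frames
    (a simple sketch of adaptive background subtraction, not the paper's
    exact statistical model)."""

    def __init__(self, first_frame, alpha=0.02, k=2.5):
        self.mean = first_frame.astype(float)
        self.var = np.full(first_frame.shape, 15.0**2)
        self.alpha, self.k = alpha, k

    def apply(self, frame):
        frame = frame.astype(float)
        diff = frame - self.mean
        mask = np.abs(diff) > self.k * np.sqrt(self.var)   # foreground pixels
        a = np.where(mask, 0.0, self.alpha)                # adapt background pixels only
        self.mean += a * diff
        self.var = (1 - a) * self.var + a * diff**2
        return mask

rng = np.random.default_rng(0)
bg = rng.normal(100, 5, size=(120, 160))                   # synthetic luminance background
model = RunningGaussianBackground(bg)
frame = bg + rng.normal(0, 5, size=bg.shape)
frame[40:60, 70:90] += 80                                  # a bright moving object
print("foreground pixels detected:", int(model.apply(frame).sum()))
```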

  9. Motion Tracking with Fast Adaptive Background Subtraction

    Institute of Scientific and Technical Information of China (English)

    Xiao, De-Gui; Yu, Sheng-sheng; et al.

    2003-01-01

    Extracting and tracking moving objects is usually one of the most important tasks of intelligent video surveillance systems. This paper presents a fast and adaptive background subtraction algorithm and the motion tracking process using this algorithm. The algorithm uses only the luminance components of sampled image sequence pixels and models every pixel with a statistical model. The algorithm is characterized by its ability to detect sudden lighting changes in real time, and to extract and track moving objects faster. It is shown that our algorithm can be realized with lower time and space complexity and an adjustable object detection error rate in comparison with other background subtraction algorithms. Making use of the algorithm, an indoor monitoring system is also worked out and the motion tracking process is presented in this paper. Experimental results attest to the algorithm's good performance when used in an indoor monitoring system.

  10. Adaptation of Jobs for the Disabled.

    Science.gov (United States)

    International Labour Office, Geneva (Switzerland).

    The handbook provides an illustrated guide to ways of improving useful employment of the disabled by adapting jobs. How to increase employment opportunities by making simple adjustments (adaption or redesign of tools, machines, work places) is demonstrated. The nature of occupational handicap is discussed, stressing the importance of job and…

  11. Adaptation and initial validation of the Patient Health Questionnaire - 9 (PHQ-9) and the Generalized Anxiety Disorder - 7 Questionnaire (GAD-7) in an Arabic speaking Lebanese psychiatric outpatient sample.

    Science.gov (United States)

    Sawaya, Helen; Atoui, Mia; Hamadeh, Aya; Zeinoun, Pia; Nahas, Ziad

    2016-05-30

    The Patient Health Questionnaire - 9 (PHQ-9) and Generalized Anxiety Disorder - 7 (GAD-7) are short screening measures used in medical and community settings to assess depression and anxiety severity. The aim of this study is to translate the screening tools into Arabic and evaluate their psychometric properties in an Arabic-speaking Lebanese psychiatric outpatient sample. The patients completed the questionnaires, among others, prior to being evaluated by a clinical psychiatrist or psychologist. The scales' internal consistency and factor structure were measured and convergent and discriminant validity were established by comparing the scores with clinical diagnoses and the Psychiatric Diagnostic Screening Questionnaire - MDD subset (PDSQ - MDD). Results showed that the PHQ-9 and GAD-7 are reliable screening tools for depression and anxiety and their factor structures replicated those reported in the literature. Sensitivity and specificity analyses showed that the PHQ-9 is sensitive but not specific at capturing depressive symptoms when compared to clinician diagnoses whereas the GAD-7 was neither sensitive nor specific at capturing anxiety symptoms. The implications of these findings are discussed in reference to the scales themselves and the cultural specificity of the Lebanese population. PMID:27031595

  12. Adapting agriculture with traditional knowledge

    Energy Technology Data Exchange (ETDEWEB)

    Swiderska, Krystyna; Reid, Hannah [IIED, London (United Kingdom); Song, Yiching; Li, Jingsong [Centre for Chinese Agriculutral Policy (China); Mutta, Doris [Kenya Forestry Research Institute (Kenya)

    2011-10-15

    Over the coming decades, climate change is likely to pose a major challenge to agriculture; temperatures are rising, rainfall is becoming more variable and extreme weather is becoming a more common event. Researchers and policymakers agree that adapting agriculture to these impacts is a priority for ensuring future food security. Strategies to achieve that in practice tend to focus on modern science. But evidence, both old and new, suggests that the traditional knowledge and crop varieties of indigenous peoples and local communities could prove even more important in adapting agriculture to climate change.

  13. Expressing Adaptation Strategies Using Adaptation Patterns

    Science.gov (United States)

    Zemirline, N.; Bourda, Y.; Reynaud, C.

    2012-01-01

    Today, there is a real challenge to enable personalized access to information. Several systems have been proposed to address this challenge including Adaptive Hypermedia Systems (AHSs). However, the specification of adaptation strategies remains a difficult task for creators of such systems. In this paper, we consider the problem of the definition…

  14. Adaptation: Paradigm for the Gut and an Academic Career

    OpenAIRE

    Warner, Brad W.

    2013-01-01

    Adaptation is an important compensatory response to environmental cues resulting in enhanced survival. In the gut, the abrupt loss of intestinal length is characterized by increased rates of enterocyte proliferation and apoptosis and culminates in adaptive villus and crypt growth. In the development of an academic pediatric surgical career, adaptation is also an important compensatory response to survive the ever changing research, clinical, and economic environment. The ability to adapt in b...

  15. Systems and methods for self-synchronized digital sampling

    Science.gov (United States)

    Samson, Jr., John R. (Inventor)

    2008-01-01

    Systems and methods for self-synchronized data sampling are provided. In one embodiment, a system for capturing synchronous data samples is provided. The system includes an analog to digital converter adapted to capture signals from one or more sensors and convert the signals into a stream of digital data samples at a sampling frequency determined by a sampling control signal; and a synchronizer coupled to the analog to digital converter and adapted to receive a rotational frequency signal from a rotating machine, wherein the synchronizer is further adapted to generate the sampling control signal, and wherein the sampling control signal is based on the rotational frequency signal.
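
    The essential relationship is that the sampling control signal tracks the machine's rotational frequency so that samples stay locked to shaft angle; a trivial sketch with an assumed samples-per-revolution count:

```python
def sampling_control_frequency(rotation_hz, samples_per_rev=256):
    """Sampling rate that keeps samples locked to shaft angle: the control
    signal tracks the measured rotational frequency (illustrative sketch)."""
    return samples_per_rev * rotation_hz

# as the machine speeds up, the sampling clock follows, so sample k always
# falls at shaft angle 2*pi*k/samples_per_rev
for rpm in (1500, 1800, 3600):
    print(rpm, "rpm ->", sampling_control_frequency(rpm / 60.0), "Hz")
```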

  16. Adaptive immunity to fungi.

    Science.gov (United States)

    Verma, Akash; Wüthrich, Marcel; Deepe, George; Klein, Bruce

    2014-11-06

    Life-threatening fungal infections have risen sharply in recent years, owing to the advances and intensity of medical care that may blunt immunity in patients. This emerging crisis has created the growing need to clarify immune defense mechanisms against fungi with the ultimate goal of therapeutic intervention. We describe recent insights in understanding the mammalian immune defenses that are deployed against pathogenic fungi. We focus on adaptive immunity to the major medically important fungi and emphasize three elements that coordinate the response: (1) dendritic cells and subsets that are mobilized against fungi in various anatomical compartments; (2) fungal molecular patterns and their corresponding receptors that signal responses and shape the differentiation of T-cell subsets and B cells; and, ultimately (3) the effector and regulatory mechanisms that eliminate these invaders while constraining collateral damage to vital tissue. These insights create a foundation for the development of new, immune-based strategies for prevention or enhanced clearance of systemic fungal diseases.

  17. Brazilian Portuguese transcultural adaptation of Barkley Deficits in Executive Functioning Scale (BDEFS)

    Directory of Open Access Journals (Sweden)

    Victor Polignano Godoy

    2015-12-01

    Full Text Available Abstract Background Considering the importance of executive functions in clinical and nonclinical situations, Barkley proposed a new theory of executive functioning based on an evolutionary neuropsychological perspective and clinical research using large samples of clinical and community-identified adults and children, as well as children with ADHD followed to adulthood. Objective The present study aims to adapt the Barkley Deficits in Executive Functioning Scale (BDEFS) to Brazilian Portuguese and to assess its construct validity in a sample of normal Brazilian adults. Methods The original version of the scale was adapted to Brazilian Portuguese according to the guidelines of the ISPOR Task Force. To assess the semantic equivalence between the original and adapted versions, both were applied to a sample of 25 Brazilian bilingual adults. Finally, 60 Brazilian adults completed the BDEFS and the Brazilian versions of the Barratt Impulsiveness Scale (BIS-11) and the Adult Self-Report Scale (ASRS-18) to assess convergent validity. Results The BDEFS Brazilian Portuguese version has semantic correspondence with the original version, indicating that the adaptation procedure was successful. The BDEFS correlated significantly with the impulsivity and attention scores from the BIS-11 and ASRS-18, supporting its construct validity. Cronbach's alpha (α = 0.961) indicated that the translated version of the BDEFS has satisfactory internal consistency. Discussion Together, these findings indicate the successful adaptation of the BDEFS to Brazilian Portuguese and support its utility in that population.

  18. Low dose effects. Adaptive response

    International Nuclear Information System (INIS)

    The purpose of this work was to evaluate whether there are disturbances in the adaptive response when lymphocytes of people living in areas contaminated with radionuclides after the Chernobyl disaster, and of liquidators who took part in the clean-up, are investigated. The level of lymphocytes with micronuclei (MN) was scored in Moscow donors and in people living in the Bryansk region with a contamination level of 15-40 Ci/km². The doses received by the liquidators were not higher than 25 cGy. The mean spontaneous level of MN does not differ significantly between the control people and the people from the Chernobyl zones, and the individual variability around the mean does not differ significantly between the two populations either. Another important fact is that in lymphocytes of people living in the polluted areas the chronic low dose irradiation does not induce the adaptive response. In Moscow donors the adaptive response is observed in most cases (≅ 59 %), and in some cases its demonstration is not significant (≅ 1 %). In the Chernobyl population exposed to chronic low level, low dose rate irradiation there are fewer people with a distinct adaptive response (≅ 38 %), and some individuals show increased radiosensitivity after the conditioning dose. Such a subpopulation with enhanced radiosensitivity has not been observed in Moscow. In liquidators the same types of effects have been registered. These results were obtained on adults. The adaptive response in children 8-14 years old living in Moscow and in the Chernobyl zone has been investigated too. In this case the spontaneous level of MN is higher in children living in the polluted areas, and after the 1.0 Gy irradiation the individual variability is very large. Only 5 % of the children show a distinct adaptive response, and an enhancement of radiosensitivity after the conditioning dose is observed. (authors)

  19. Approximation of NURBS Curves and Surfaces Using Adaptive Equidistant Parameterizations

    Institute of Scientific and Technical Information of China (English)

    Aziguli Wulamu; GOETTING Marc; ZECKZER Dirk

    2005-01-01

    Non-uniform rational B-spline (NURBS) curves and surfaces are very important tools for modelling curves and surfaces. Several important details, such as the choice of the sample points, of the parameterization, and of the termination condition, are however not well described. These details have a great influence on the performance of the approximation algorithm, both in terms of quality and of time and space usage. This paper describes how to sample points, examining two standard parameterizations: equidistant and chordal. A new and local parameterization, namely an adaptive equidistant model, was proposed, which enhances the equidistant model. Localization can also be used to enhance the chordal parameterization. For NURBS surfaces, one must choose which direction will be approximated first and must pay special attention to surfaces of degree 1, which have to be handled as a special case.
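
    The two standard parameterizations compared in the paper can be computed directly from the sample points; the sketch below shows equidistant and chord-length parameter values for a few 2-D points. The adaptive, localized refinement proposed in the paper is not reproduced.

```python
import numpy as np

def equidistant_params(points):
    """Uniform (equidistant) parameter values in [0, 1]."""
    return np.linspace(0.0, 1.0, len(points))

def chordal_params(points):
    """Chord-length parameterization: spacing proportional to the distances
    between consecutive sample points."""
    d = np.linalg.norm(np.diff(points, axis=0), axis=1)
    t = np.concatenate([[0.0], np.cumsum(d)])
    return t / t[-1]

pts = np.array([[0.0, 0.0], [1.0, 0.2], [1.2, 1.5], [3.0, 1.6]])
print("equidistant:", np.round(equidistant_params(pts), 3))
print("chordal:    ", np.round(chordal_params(pts), 3))
```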

  20. Principles of adaptive optics

    CERN Document Server

    Tyson, Robert

    2010-01-01

    History and Background; Introduction; History; Physical Optics; Terms in Adaptive Optics; Sources of Aberrations; Atmospheric Turbulence; Thermal Blooming; Nonatmospheric Sources; Adaptive Optics Compensation; Phase Conjugation; Limitations of Phase Conjugation; Artificial Guide Stars; Lasers for Guide Stars; Combining the Limitations; Linear Analysis; Partial Phase Conjugation; Adaptive Optics Systems; Adaptive Optics Imaging Systems; Beam Propagation Syst

  1. ASIC DESIGN OF ADAPTIVE THRESHOLD DENOISE DWT CHIP

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    According to the relationship between the wavelet transform and perfectly reconstructing FIR filter banks, this paper presents a real-time chip with an adaptive Donoho non-linear soft threshold for denoising at different levels of the multi-scale space, obtained by rearranging the input data during convolution, filtering and sub-sampling. More importantly, it gives a simple iterative algorithm to calculate the variance of the noise in intervals with no signal. It works well whether the signal or noise is stationary or not.
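
    A software analogue of the chip's processing chain, assuming PyWavelets for the multi-level DWT: detail coefficients are soft-thresholded with Donoho's universal threshold, and the noise level is estimated robustly from the finest detail level rather than by the chip's iterative no-signal-interval method. Wavelet choice and test signal are illustrative.

```python
import numpy as np
import pywt

def wavelet_soft_denoise(signal, wavelet="db4", level=4):
    """Multi-level DWT denoising with Donoho's soft threshold (sketch)."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745          # robust noise estimate
    thr = sigma * np.sqrt(2 * np.log(len(signal)))          # universal threshold
    denoised = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(denoised, wavelet)[: len(signal)]

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 1024)
clean = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sign(np.sin(2 * np.pi * 2 * t))
noisy = clean + 0.3 * rng.normal(size=t.size)
print("noisy RMSE:   ", np.sqrt(np.mean((noisy - clean) ** 2)))
print("denoised RMSE:", np.sqrt(np.mean((wavelet_soft_denoise(noisy) - clean) ** 2)))
```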

  2. The Importance of Resilience for Well-Being in Retirement

    Directory of Open Access Journals (Sweden)

    Cristiane Pimentel Nalin

    2015-08-01

    Full Text Available The increase in the elderly population has prompted research on retirement. This study investigated the importance of resilience, economic satisfaction, length of retirement, and planning to well-being during retirement among 270 participants. The majority of this sample were men (64%), and the mean age was 65 years (SD = 5.7). The participants were retired members of 10 public and private organizations in Rio de Janeiro. Factor analysis and hierarchical regression were performed. The results showed that determined resilience (mastery, adaptability, confidence and perseverance) and socioeconomic satisfaction were the main predictors of well-being in retirement and explained 28% of the variance in this model. The findings suggest that well-being in retirement is closely related to socioeconomic satisfaction and determined resilience. Additional research should address the importance of resilience for the well-being of retirees who are or are not members of retirement associations. Resilience attitudes should be promoted in Retirement Education Programs.

  3. Adaptation to climate change

    NARCIS (Netherlands)

    J. Carmin; K. Tierney; E. Chu; L.M. Hunter; J.T. Roberts; L. Shi

    2015-01-01

    Climate change adaptation involves major global and societal challenges such as finding adequate and equitable adaptation funding and integrating adaptation and development programs. Current funding is insufficient. Debates between the Global North and South center on how best to allocate the financ

  4. Molecular evolution and thermal adaptation

    Science.gov (United States)

    Chen, Peiqiu

    2011-12-01

    generations. Diversity plays an important role in thermal adaptation: while monoclonal strains adapt via the acquisition and rapid fixation of new early mutations, wild populations adapt via standing genetic variation, and they are more robust against thermal shocks due to the greater diversity within the initial population.

  5. Improved Reliability Sensitivity Estimation and its Variance Analysis by a Novel β Hyper-plane Based Importance Sampling Method

    Institute of Scientific and Technical Information of China (English)

    张峰; 吕震宙; 崔利杰

    2011-01-01

    A novel β hyper-plane based importance sampling method is presented to estimate the reliability sensitivity of a structure. By introducing a virtual hyper-plane tangent to the failure surface at the design point, the variable space is separated into an importance region R and an unimportance region S, on which the truncated importance sampling functions hR(x) and hS(x) are established, respectively. The sample sizes generated from hR(x) and hS(x) are allocated according to the contributions of R and S to the reliability sensitivity, which are determined by iterative simulations. The formulae of the reliability sensitivity estimate, its variance and its coefficient of variation are derived for the presented β hyper-plane importance sampling method, and the method is extended to reliability sensitivity estimation for both the single failure mode and multiple failure modes in parallel systems. Examples show that the proposed method requires fewer samples than the traditional importance sampling method and the β hyper-sphere truncated importance sampling method when the coefficients of variation of the three estimates are equal and the relative error of the estimate is below 2%.
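
    The β hyper-plane construction itself is specific to the paper, but the underlying idea of shifting the sampling density toward the design point is standard importance sampling. A minimal sketch of that baseline for a toy limit-state function with standard-normal inputs (all values hypothetical, not from the paper) is:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Limit-state function: failure when g(x) <= 0.  Toy example, not from the paper.
def g(x):
    return 3.0 - x[:, 0] - x[:, 1]

# Standard-normal input variables; the design point of this g is at (1.5, 1.5).
design_point = np.array([1.5, 1.5])
n = 20_000

# Importance sampling density: standard-normal density shifted to the design point.
samples = design_point + rng.standard_normal((n, 2))
f = stats.multivariate_normal(mean=[0, 0]).pdf(samples)        # original density
h = stats.multivariate_normal(mean=design_point).pdf(samples)  # sampling density
weights = f / h

indicator = (g(samples) <= 0.0).astype(float)
pf = np.mean(indicator * weights)
cov = np.std(indicator * weights) / (np.sqrt(n) * pf)  # coefficient of variation
print(f"failure probability ~ {pf:.4e}, c.o.v. ~ {cov:.3f}")
```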

  6. Continuous-time adaptive critics.

    Science.gov (United States)

    Hanselmann, Thomas; Noakes, Lyle; Zaknich, Anthony

    2007-05-01

    A continuous-time formulation of an adaptive critic design (ACD) is investigated. Connections to the discrete case are made, where backpropagation through time (BPTT) and real-time recurrent learning (RTRL) are prevalent. Practical benefits are that this framework fits in well with plant descriptions given by differential equations and that any standard integration routine with adaptive step-size does an adaptive sampling for free. A second-order actor adaptation using Newton's method is established for fast actor convergence for a general plant and critic. Also, a fast critic update for concurrent actor-critic training is introduced to immediately apply necessary adjustments of critic parameters induced by actor updates to keep the Bellman optimality correct to first-order approximation after actor changes. Thus, critic and actor updates may be performed at the same time until some substantial error build up in the Bellman optimality or temporal difference equation, when a traditional critic training needs to be performed and then another interval of concurrent actor-critic training may resume. PMID:17526332

  7. Sequential sampling: a novel method in farm animal welfare assessment.

    Science.gov (United States)

    Heath, C A E; Main, D C J; Mullan, S; Haskell, M J; Browne, W J

    2016-02-01

    Lameness in dairy cows is an important welfare issue. As part of a welfare assessment, herd level lameness prevalence can be estimated from scoring a sample of animals, where higher levels of accuracy are associated with larger sample sizes. As the financial cost is related to the number of cows sampled, smaller samples are preferred. Sequential sampling schemes have been used for informing decision making in clinical trials. Sequential sampling involves taking samples in stages, where sampling can stop early depending on the estimated lameness prevalence. When welfare assessment is used for a pass/fail decision, a similar approach could be applied to reduce the overall sample size. The sampling schemes proposed here apply the principles of sequential sampling within a diagnostic testing framework. This study develops three sequential sampling schemes of increasing complexity to classify 80 fully assessed UK dairy farms, each with known lameness prevalence. Using the Welfare Quality herd-size-based sampling scheme, the first 'basic' scheme involves two sampling events. At the first sampling event half the Welfare Quality sample size is drawn, and then depending on the outcome, sampling either stops or is continued and the same number of animals is sampled again. In the second 'cautious' scheme, an adaptation is made to ensure that correctly classifying a farm as 'bad' is done with greater certainty. The third scheme is the only scheme to go beyond lameness as a binary measure and investigates the potential for increasing accuracy by incorporating the number of severely lame cows into the decision. The three schemes are evaluated with respect to accuracy and average sample size by running 100 000 simulations for each scheme, and a comparison is made with the fixed size Welfare Quality herd-size-based sampling scheme. All three schemes performed almost as well as the fixed size scheme but with much smaller average sample sizes. For the third scheme, an overall
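
    A minimal sketch of the 'basic' two-stage idea is given below. The sample sizes, stopping thresholds and pass/fail rule are hypothetical placeholders, not the Welfare Quality values used in the study:

```python
import random

def two_stage_assessment(herd, full_sample_size, fail_threshold=0.10):
    """Two-stage ('basic') sequential scheme: score half the sample first and
    stop early if the interim lameness prevalence is clearly low or clearly
    high; otherwise score the second half and classify on the pooled estimate.
    Thresholds are illustrative placeholders, not Welfare Quality values."""
    half = full_sample_size // 2
    cows = random.sample(herd, full_sample_size)

    stage1 = cows[:half]
    p1 = sum(stage1) / len(stage1)           # interim lameness prevalence
    if p1 <= 0.5 * fail_threshold:
        return "pass", half                  # clearly below the limit: stop early
    if p1 >= 2.0 * fail_threshold:
        return "fail", half                  # clearly above the limit: stop early

    p_all = sum(cows) / len(cows)            # otherwise use the full sample
    return ("fail" if p_all > fail_threshold else "pass"), full_sample_size

# Herd encoded as 1 = lame, 0 = not lame; 15 % true prevalence.
herd = [1] * 15 + [0] * 85
print(two_stage_assessment(herd, full_sample_size=40))
```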

  9. An analysis of adaptation negotiations in Poznan

    International Nuclear Information System (INIS)

    Hastily presented as one of the major accomplishments of the 14th United Nations Conference on Climate Change in Poznan, the discussions on adaptation actually need careful analysis. Obviously, an increasing number of stakeholders (whether Parties, delegation members, civil society or businesses) see adaptation as a top concern, and this resulted in Poznan in a strong presence of the issue in plenary sessions, contact and informal groups, side events, press conferences, stands, etc. With respect to the historical treatment of adaptation, which was quite light before COP 13 in Bali (2007), the vogue for adaptation may be good news. However, all the difficulty now lies in translating the semantic success and political momentum into operational outcomes. As the following critical synthesis shows, Poznan can hardly be considered a major breakthrough in that regard, although some significant steps forward have been made. In the past, little importance was given to adaptation in the climate change talks until the middle of this decade. In the early days of discussions (the 1980s), climate change was not seen as a pressing matter, impacts were not expected to occur if action to reduce climate change was appropriately taken, and there was thus no hurry to adapt. Then, in the late 1990s, adaptation was seen as a possible alternative to mitigation, and those defending adaptation were seen as resigned. Adaptation only started to gain some momentum in 2005 in Montreal, and was finally considered on an equal footing with mitigation in 2007 in Bali. Discussions on adaptation are thus still not at the level of those on mitigation, but Poznan was in a sense a major accomplishment in bringing adaptation to the top of the agenda. Before Poznan, adaptation under the UNFCCC was limited to a couple of loose work programmes (see below) and three small funds financing adaptation activities in developing countries. One of these activities, arguably the most visible, is the realisation of National

  10. Cognitive adaptation to nonmelanoma skin cancer.

    Science.gov (United States)

    Czajkowska, Zofia; Radiotis, George; Roberts, Nicole; Körner, Annett

    2013-01-01

    Taylor's (1983) cognitive adaptation theory posits that when people go through life transitions, such as being diagnosed with a chronic disease, they adjust to their new reality. The adjustment process revolves around three themes: a search for positive meaning in the experience or optimism, an attempt to regain a sense of mastery in life, as well as an effort to enhance self-esteem. In a sample of 57 patients with nonmelanoma skin cancer, the Cognitive Adaptation Index successfully predicted participants' distress, accounting for 60% of the variance and lending support for Taylor's theory of cognitive adaptation in this population. PMID:23844920

  12. The Scope of Adaptive Digital Games for Education

    OpenAIRE

    Prince, Rikki; Davis, Hugh

    2008-01-01

    In learning technologies, there is a distinct difference between the user sequencing in a system based on IMS simple sequencing and an adaptive hypermedia system. This range of possibilities is important to consider when attempting to augment educational games with adaptive elements. This poster demonstrates how truly adaptive games could be designed and discusses why this is useful in the field of education.

  13. Adaptive research supervision : Exploring expert thesis supervisors' practical knowledge

    NARCIS (Netherlands)

    de Kleijn, Renske A M; Meijer, Paulien C.; Brekelmans, Mieke; Pilot, Albert

    2015-01-01

    Several researchers have suggested the importance of being responsive to students' needs in research supervision. Adapting support strategies to students' needs in light of the goals of a task is referred to as adaptivity. In the present study, the practice of adaptivity is explored by interviewing

  14. Nutrition and training adaptations in aquatic sports.

    Science.gov (United States)

    Mujika, Iñigo; Stellingwerff, Trent; Tipton, Kevin

    2014-08-01

    The adaptive response to training is determined by the combination of the intensity, volume, and frequency of the training. Various periodized approaches to training are used by aquatic sports athletes to achieve performance peaks. Nutritional support to optimize training adaptations should take periodization into consideration; that is, nutrition should also be periodized to optimally support training and facilitate adaptations. Moreover, other aspects of training (e.g., overload training, tapering and detraining) should be considered when making nutrition recommendations for aquatic athletes. There is evidence, albeit not in aquatic sports, that restricting carbohydrate availability may enhance some training adaptations. More research needs to be performed, particularly in aquatic sports, to determine the optimal strategy for periodizing carbohydrate intake to optimize adaptations. Protein nutrition is an important consideration for optimal training adaptations. Factors other than the total amount of daily protein intake should be considered. For instance, the type of protein, timing and pattern of protein intake and the amount of protein ingested at any one time influence the metabolic response to protein ingestion. Body mass and composition are important for aquatic sport athletes in relation to power-to-mass and for aesthetic reasons. Protein may be particularly important for athletes desiring to maintain muscle while losing body mass. Nutritional supplements, such as β-alanine and sodium bicarbonate, may have particular usefulness for aquatic athletes' training adaptation.

  15. Sample size estimation and sampling techniques for selecting a representative sample

    Directory of Open Access Journals (Sweden)

    Aamir Omair

    2014-01-01

    Full Text Available Introduction: The purpose of this article is to provide a general understanding of the concepts of sampling as applied to health-related research. Sample Size Estimation: It is important to select a representative sample in quantitative research in order to be able to generalize the results to the target population. The sample should be of the required sample size and must be selected using an appropriate probability sampling technique. There are many hidden biases which can adversely affect the outcome of the study. Important factors to consider for estimating the sample size include the size of the study population, the confidence level, the expected proportion of the outcome variable (for categorical variables) or the standard deviation of the outcome variable (for numerical variables), and the required precision (margin of accuracy) of the study. The more precision required, the greater the required sample size. Sampling Techniques: The probability sampling techniques applied in health-related research include simple random sampling, systematic random sampling, stratified random sampling, cluster sampling, and multistage sampling. These are recommended over nonprobability sampling techniques because the results of the study can then be generalized to the target population.
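
    As a worked example of the estimation step described above, the sketch below applies the usual formula for a single proportion, n = z^2 p(1-p) / d^2, with an optional finite-population correction. The formula is standard and is added here only for illustration; it is not reproduced from the article:

```python
import math
from scipy.stats import norm

def sample_size_proportion(p, margin, confidence=0.95, population=None):
    """n = z^2 * p * (1 - p) / d^2, with an optional finite-population correction."""
    z = norm.ppf(1 - (1 - confidence) / 2)        # e.g. 1.96 for 95 % confidence
    n = (z ** 2) * p * (1 - p) / margin ** 2
    if population is not None:                    # finite-population correction
        n = n / (1 + (n - 1) / population)
    return math.ceil(n)

# Expected prevalence 30 %, +/- 5 % precision, 95 % confidence:
print(sample_size_proportion(p=0.30, margin=0.05))               # about 323
print(sample_size_proportion(p=0.30, margin=0.05, population=2000))
```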

  16. Contributions to sampling statistics

    CERN Document Server

    Conti, Pier; Ranalli, Maria

    2014-01-01

    This book contains a selection of the papers presented at the ITACOSM 2013 Conference, held in Milan in June 2013. ITACOSM is the bi-annual meeting of the Survey Sampling Group S2G of the Italian Statistical Society, intended as an international  forum of scientific discussion on the developments of theory and application of survey sampling methodologies and applications in human and natural sciences. The book gathers research papers carefully selected from both invited and contributed sessions of the conference. The whole book appears to be a relevant contribution to various key aspects of sampling methodology and techniques; it deals with some hot topics in sampling theory, such as calibration, quantile-regression and multiple frame surveys, and with innovative methodologies in important topics of both sampling theory and applications. Contributions cut across current sampling methodologies such as interval estimation for complex samples, randomized responses, bootstrap, weighting, modeling, imputati...

  17. Diffusion Adaptation over Networks

    CERN Document Server

    Sayed, Ali H

    2012-01-01

    Adaptive networks are well-suited to perform decentralized information processing and optimization tasks and to model various types of self organized and complex behavior encountered in nature. Adaptive networks consist of a collection of agents with processing and learning abilities. The agents are linked together through a connection topology, and they cooperate with each other through local interactions to solve distributed inference problems in real-time. The continuous diffusion of information across the network enables agents to adapt their performance in relation to changing data and network conditions; it also results in improved adaptation and learning performance relative to non-cooperative networks. This article provides an overview of diffusion strategies for adaptation and learning over networks. The article is divided into several sections: 1. Motivation; 2. Mean-Square-Error Estimation; 3. Distributed Optimization via Diffusion Strategies; 4. Adaptive Diffusion Strategies; 5. Performance of Ste...
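
    To make the diffusion strategies concrete, here is a minimal adapt-then-combine (ATC) diffusion LMS sketch for a distributed mean-square-error estimation task. The ring network, step size and uniform combination weights are illustrative choices, not taken from the article:

```python
import numpy as np

rng = np.random.default_rng(1)

# Ring network of N agents; each agent combines with itself and its two neighbours.
N, M, steps = 10, 4, 2000
w_true = rng.standard_normal(M)                     # common parameter to estimate
A = np.zeros((N, N))                                # combination matrix (rows sum to 1)
for k in range(N):
    for l in (k - 1, k, (k + 1) % N):
        A[k, l] = 1.0 / 3.0

mu = 0.01                                           # LMS step size
W = np.zeros((N, M))                                # each agent's current estimate

for _ in range(steps):
    # Adapt: each agent takes an LMS step using its own streaming data.
    psi = np.empty_like(W)
    for k in range(N):
        u = rng.standard_normal(M)                    # regression vector at agent k
        d = u @ w_true + 0.1 * rng.standard_normal()  # noisy measurement
        psi[k] = W[k] + mu * u * (d - u @ W[k])
    # Combine: each agent averages the intermediate estimates of its neighbours.
    W = A @ psi

print("mean-square deviation:", np.mean(np.sum((W - w_true) ** 2, axis=1)))
```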

  18. Adaptation of the Wechsler Intelligence Scale for Children-IV (WISC-IV) for Vietnam.

    Science.gov (United States)

    Dang, Hoang-Minh; Weiss, Bahr; Pollack, Amie; Nguyen, Minh Cao

    2012-12-01

    Intelligence testing is used for many purposes including identification of children for proper educational placement (e.g., children with learning disabilities, or intellectually gifted students), and to guide education by identifying cognitive strengths and weaknesses so that teachers can adapt their instructional style to students' specific learning styles. Most of the research involving intelligence tests has been conducted in highly developed Western countries, yet the need for intelligence testing is as or even more important in developing countries. The present study, conducted through the Vietnam National University Clinical Psychology CRISP Center, focused on the cultural adaptation of the WISC-IV intelligence test for Vietnam. We report on (a) the adaptation process including the translation, cultural analysis and modifications involved in adaptation, (b) present results of two pilot studies, and (c) describe collection of the standardization sample and results of analyses with the standardization sample, with the goal of sharing our experience with other researchers who may be involved in or interested in adapting or developing IQ tests for non-Western, non-English speaking cultures. PMID:23833330

  19. Inducible competitors and adaptive diversification

    Directory of Open Access Journals (Sweden)

    Beren W. ROBINSON, David W. PFENNIG

    2013-08-01

    Full Text Available Identifying the causes of diversification is central to evolutionary biology. The ecological theory of adaptive diversification holds that the evolution of phenotypic differences between populations and species––and the formation of new species––stems from divergent natural selection, often arising from competitive interactions. Although increasing evidence suggests that phenotypic plasticity can facilitate this process, it is not generally appreciated that competitively mediated selection often also provides ideal conditions for phenotypic plasticity to evolve in the first place. Here, we discuss how competition plays at least two key roles in adaptive diversification depending on its pattern. First, heterogeneous competition initially generates heterogeneity in resource use that favors adaptive plasticity in the form of “inducible competitors”. Second, once such competitively induced plasticity evolves, its capacity to rapidly generate phenotypic variation and expose phenotypes to alternate selective regimes allows populations to respond readily to selection favoring diversification, as may occur when competition generates steady diversifying selection that permanently drives the evolutionary divergence of populations that use different resources. Thus, competition plays two important roles in adaptive diversification––one well-known and the other only now emerging––mediated through its effect on the evolution of phenotypic plasticity [Current Zoology 59 (4): 537–552, 2013].

  20. Inducible competitors and adaptive diversification

    Institute of Scientific and Technical Information of China (English)

    Beren W.ROBINSON; David W.PFENNIG

    2013-01-01

    Identifying the causes of diversification is central to evolutionary biology. The ecological theory of adaptive diversification holds that the evolution of phenotypic differences between populations and species - and the formation of new species - stems from divergent natural selection, often arising from competitive interactions. Although increasing evidence suggests that phenotypic plasticity can facilitate this process, it is not generally appreciated that competitively mediated selection often also provides ideal conditions for phenotypic plasticity to evolve in the first place. Here, we discuss how competition plays at least two key roles in adaptive diversification depending on its pattern. First, heterogeneous competition initially generates heterogeneity in resource use that favors adaptive plasticity in the form of "inducible competitors". Second, once such competitively induced plasticity evolves, its capacity to rapidly generate phenotypic variation and expose phenotypes to alternate selective regimes allows populations to respond readily to selection favoring diversification, as may occur when competition generates steady diversifying selection that permanently drives the evolutionary divergence of populations that use different resources. Thus, competition plays two important roles in adaptive diversification - one well-known and the other only now emerging - mediated through its effect on the evolution of phenotypic plasticity.

  1. Comparison of semi-automatized assays for anti-T. gondii IgG detection in low-reactivity serum samples: importance of the results in patient counseling

    Directory of Open Access Journals (Sweden)

    Paulo Guilherme Leser

    2003-06-01

    Full Text Available Toxoplasmosis is a disease which can cause severe congenital infection and is normally diagnosed by the detection of T. gondii specific antibodies in the serum of infected patients. Several different tests make it possible to distinguish recent from past infections and to quantify anti-T. gondii specific IgG, and the results can be used as markers for immunity. In the present study, we compare the performance of two different methodologies, the Elfa (bioMérieux S.A.) and the Meia (Abbott Laboratories), in detecting T. gondii specific IgG in low-reactivity sera. Of 76 analyzed samples, three presented discrepant results, being positive in the Abbott AxSYM Toxo IgG assay and negative in the bioMérieux Vidas Toxo IgG II assay. By using other tests, the three sera were confirmed to be negative. The results are discussed in the context of their importance for patient management, especially during pregnancy.

  2. The Adaptive Organization

    DEFF Research Database (Denmark)

    Andersen, Torben Juul; Hallin, Carina Antonia

    2016-01-01

    Contemporary organizations operate under turbulent business conditions and must adapt their strategies to ongoing changes. This article argues that sustainable organizational performance is achieved when top management directs and coordinates interactive processes anchored in emerging organizatio... the adaptive organization...

  3. Human Adaptations: Free divers

    OpenAIRE

    Tournat, Troy Z.

    2014-01-01

    Freediving has been around for thousands of years and was the only way to dive until the invention of oxygen tanks in the 19th century. Around the world, people dove for goods such as pearls, and today people freedive for sport. Divers stretch the limits of their body's and mind's capabilities through physiological adaptations involving thermal, respiratory, and cardiovascular responses. Findings conclude that thermal adaptations follow a process similar to the cold adaptive response. With the implementation of wets...

  4. Consciousness And Adaptive Behavior

    OpenAIRE

    Sieb, Richard/A.

    2005-01-01

    Consciousness has resisted scientific explanation for centuries. The main problem in explaining consciousness is its subjectivity. Subjective systems may be adaptive. Humans can produce voluntary new or novel intentional (adaptive) action and such action is always accompanied by consciousness. Action normally arises from perception. Perception must be rerepresented in order to produce new or novel adaptive action. The internal explicit states produced by a widespread nonlinear emergen...

  5. Adaptive learning in moodle: three practical cases

    Directory of Open Access Journals (Sweden)

    Dolores LERÍS LÓPEZ

    2015-12-01

    Full Text Available One of the most important challenges that education will have to face is the need to adapt the learning process to the student's characteristics. Nowadays, technological support is still weak and personalised learning practices are few. Two main types of e-learning platforms have been developed over the last years: Learning Management Systems (LMS) and Adaptive Educational Hypermedia Systems. Both lines of development are converging, so that the new versions of LMS incorporate adaptive capacities that allow individualized or differentiated instruction to be designed. In this paper the adaptive functionalities available in Moodle are reviewed, and it is explained how to implement three adaptive instructional designs in Moodle. Moreover, their effectiveness, in terms of the learning achieved by the students, and their efficiency, through the reuse of materials from previous learning experiences, are assessed.

  6. Is adaptive co-management ethical?

    Science.gov (United States)

    Fennell, David; Plummer, Ryan; Marschke, Melissa

    2008-07-01

    'Good' governance and adaptive co-management hold broad appeal due to their positive connotations and 'noble ethical claims'. This paper poses a fundamental question: is adaptive co-management ethical? In pursuing an answer to this question, the concept of adaptive co-management is succinctly summarized and three ethical perspectives (deontology, teleology and existentialism) are explored. The case of adaptive co-management in Cambodia is described and subsequently considered through the lens of ethical triangulation. The case illuminates important ethical considerations and directs attention towards the need for meditative thinking which increases the value of tradition, ecology, and culture. Giving ethics a central position makes clear the potential for adaptive co-management to be an agent for governance, which is good, right and authentic as well as an arena to embrace uncertainty. PMID:17391840

  7. Group adaptation, formal darwinism and contextual analysis.

    Science.gov (United States)

    Okasha, S; Paternotte, C

    2012-06-01

    We consider the question: under what circumstances can the concept of adaptation be applied to groups, rather than individuals? Gardner and Grafen (2009, J. Evol. Biol. 22: 659-671) develop a novel approach to this question, building on Grafen's 'formal Darwinism' project, which defines adaptation in terms of links between evolutionary dynamics and optimization. They conclude that only clonal groups, and to a lesser extent groups in which reproductive competition is repressed, can be considered as adaptive units. We re-examine the conditions under which the selection-optimization links hold at the group level. We focus on an important distinction between two ways of understanding the links, which have different implications regarding group adaptationism. We show how the formal Darwinism approach can be reconciled with G.C. Williams' famous analysis of group adaptation, and we consider the relationships between group adaptation, the Price equation approach to multi-level selection, and the alternative approach based on contextual analysis.

  8. Adaptation of teleosts to very high salinity

    DEFF Research Database (Denmark)

    Laverty, Gary; Skadhauge, Erik

    2012-01-01

    A number of species of euryhaline teleosts have the remarkable ability to adapt and survive in environments of extreme salinity, up to two or even three times the osmolality of seawater. This review looks at some of the literature describing the adaptive changes that occur, primarily...... with intestinal water absorption and with the properties of the gill epithelium. While there is much that is still not completely understood, recent work has begun to look at these adaptations at the cellular and molecular level. As with seawater osmoregulation, fish adapting to hypersaline conditions generally...... in several species. Adaptive changes in the gill epithelium are also critical in this process, allowing for secretion of absorbed NaCl from the extracellular fluids. Most notably there are important changes in the numbers and size of mitochondrion-rich (MR) cells, the sites of active secretion of Cl...

  9. Quantifying the adaptive cycle

    Science.gov (United States)

    Angeler, David G.; Allen, Craig R.; Garmestani, Ahjond S.; Gunderson, Lance H.; Hjerne, Olle; Winder, Monika

    2015-01-01

    The adaptive cycle was proposed as a conceptual model to portray patterns of change in complex systems. Despite the model having potential for elucidating change across systems, it has been used mainly as a metaphor, describing system dynamics qualitatively. We use a quantitative approach for testing premises (reorganisation, conservatism, adaptation) in the adaptive cycle, using Baltic Sea phytoplankton communities as an example of such complex system dynamics. Phytoplankton organizes in recurring spring and summer blooms, a well-established paradigm in planktology and succession theory, with characteristic temporal trajectories during blooms that may be consistent with adaptive cycle phases. We used long-term (1994–2011) data and multivariate analysis of community structure to assess key components of the adaptive cycle. Specifically, we tested predictions about: reorganisation: spring and summer blooms comprise distinct community states; conservatism: community trajectories during individual adaptive cycles are conservative; and adaptation: phytoplankton species during blooms change in the long term. All predictions were supported by our analyses. Results suggest that traditional ecological paradigms such as phytoplankton successional models have potential for moving the adaptive cycle from a metaphor to a framework that can improve our understanding how complex systems organize and reorganize following collapse. Quantifying reorganization, conservatism and adaptation provides opportunities to cope with the intricacies and uncertainties associated with fast ecological change, driven by shifting system controls. Ultimately, combining traditional ecological paradigms with heuristics of complex system dynamics using quantitative approaches may help refine ecological theory and improve our understanding of the resilience of ecosystems.

  10. Adaptive Wireless Transceiver Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Wireless technologies are an increasingly attractive means for spatial data, input, manipulation, and distribution. Mobitrum is proposing an innovative Adaptive...

  11. Quantifying adaptive evolution in the Drosophila immune system.

    Directory of Open Access Journals (Sweden)

    Darren J Obbard

    2009-10-01

    Full Text Available It is estimated that a large proportion of amino acid substitutions in Drosophila have been fixed by natural selection, and as organisms are faced with an ever-changing array of pathogens and parasites to which they must adapt, we have investigated the role of parasite-mediated selection as a likely cause. To quantify the effect, and to identify which genes and pathways are most likely to be involved in the host-parasite arms race, we have re-sequenced population samples of 136 immunity and 287 position-matched non-immunity genes in two species of Drosophila. Using these data, and a new extension of the McDonald-Kreitman approach, we estimate that natural selection fixes advantageous amino acid changes in immunity genes at nearly double the rate of other genes. We find the rate of adaptive evolution in immunity genes is also more variable than other genes, with a small subset of immune genes evolving under intense selection. These genes, which are likely to represent hotspots of host-parasite coevolution, tend to share similar functions or belong to the same pathways, such as the antiviral RNAi pathway and the IMD signalling pathway. These patterns appear to be general features of immune system evolution in both species, as rates of adaptive evolution are correlated between the D. melanogaster and D. simulans lineages. In summary, our data provide quantitative estimates of the elevated rate of adaptive evolution in immune system genes relative to the rest of the genome, and they suggest that adaptation to parasites is an important force driving molecular evolution.
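
    The McDonald-Kreitman logic referred to above compares fixed differences and polymorphisms at nonsynonymous versus synonymous sites; a common summary statistic is the proportion of adaptive substitutions, alpha = 1 - (Ds * Pn) / (Dn * Ps). The sketch below computes this classical (unextended) statistic on made-up counts, purely as an illustration of the approach the study builds on:

```python
def mk_alpha(dn, ds, pn, ps):
    """Proportion of amino acid substitutions fixed by positive selection,
    alpha = 1 - (Ds * Pn) / (Dn * Ps).
    dn, ds: nonsynonymous / synonymous fixed differences between species.
    pn, ps: nonsynonymous / synonymous polymorphisms within the sample."""
    if dn == 0 or ps == 0:
        raise ValueError("alpha is undefined when Dn or Ps is zero")
    return 1.0 - (ds * pn) / (dn * ps)

# Hypothetical counts for one gene (not data from the study):
print(round(mk_alpha(dn=40, ds=60, pn=10, ps=50), 3))  # 0.7 -> strong adaptive signal
```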

  12. Assessing institutional capacities to adapt to climate change - integrating psychological dimensions in the Adaptive Capacity Wheel

    Science.gov (United States)

    Grothmann, T.; Grecksch, K.; Winges, M.; Siebenhüner, B.

    2013-03-01

    Several case studies show that "soft social factors" (e.g. institutions, perceptions, social capital) strongly affect social capacities to adapt to climate change. Many soft social factors can probably be changed faster than "hard social factors" (e.g. economic and technological development) and are therefore particularly important for building social capacities. However, there are almost no methodologies for the systematic assessment of soft social factors. Gupta et al. (2010) have developed the Adaptive Capacity Wheel (ACW) for assessing the adaptive capacity of institutions. The ACW differentiates 22 criteria to assess six dimensions: variety, learning capacity, room for autonomous change, leadership, availability of resources, fair governance. To include important psychological factors we extended the ACW by two dimensions: "adaptation motivation" refers to actors' motivation to realise, support and/or promote adaptation to climate. "Adaptation belief" refers to actors' perceptions of realisability and effectiveness of adaptation measures. We applied the extended ACW to assess adaptive capacities of four sectors - water management, flood/coastal protection, civil protection and regional planning - in North Western Germany. The assessments of adaptation motivation and belief provided a clear added value. The results also revealed some methodological problems in applying the ACW (e.g. overlap of dimensions), for which we propose methodological solutions.

  13. Assessing institutional capacities to adapt to climate change: integrating psychological dimensions in the Adaptive Capacity Wheel

    Science.gov (United States)

    Grothmann, T.; Grecksch, K.; Winges, M.; Siebenhüner, B.

    2013-12-01

    Several case studies show that social factors like institutions, perceptions and social capital strongly affect social capacities to adapt to climate change. Together with economic and technological development they are important for building social capacities. However, there are almost no methodologies for the systematic assessment of social factors. After reviewing existing methodologies we identify the Adaptive Capacity Wheel (ACW) by Gupta et al. (2010), developed for assessing the adaptive capacity of institutions, as the most comprehensive and operationalised framework to assess social factors. The ACW differentiates 22 criteria to assess 6 dimensions: variety, learning capacity, room for autonomous change, leadership, availability of resources, fair governance. To include important psychological factors we extended the ACW by two dimensions: "adaptation motivation" refers to actors' motivation to realise, support and/or promote adaptation to climate; "adaptation belief" refers to actors' perceptions of realisability and effectiveness of adaptation measures. We applied the extended ACW to assess adaptive capacities of four sectors - water management, flood/coastal protection, civil protection and regional planning - in northwestern Germany. The assessments of adaptation motivation and belief provided a clear added value. The results also revealed some methodological problems in applying the ACW (e.g. overlap of dimensions), for which we propose methodological solutions.

  14. Modular microfluidic system for biological sample preparation

    Energy Technology Data Exchange (ETDEWEB)

    Rose, Klint A.; Mariella, Jr., Raymond P.; Bailey, Christopher G.; Ness, Kevin Dean

    2015-09-29

    A reconfigurable modular microfluidic system for preparation of a biological sample including a series of reconfigurable modules for automated sample preparation adapted to selectively include a) a microfluidic acoustic focusing filter module, b) a dielectrophoresis bacteria filter module, c) a dielectrophoresis virus filter module, d) an isotachophoresis nucleic acid filter module, e) a lysis module, and f) an isotachophoresis-based nucleic acid filter.

  15. The technological influence on health professionals' care: translation and adaptation of scales

    Science.gov (United States)

    Almeida, Carlos Manuel Torres; Almeida, Filipe Nuno Alves dos Santos; Escola, Joaquim José Jacinto; Rodrigues, Vitor Manuel Costa Pereira

    2016-01-01

    Objectives: in this study, two research tools were validated to study the impact of technological influence on health professionals' care practice. Methods: the following methodological steps were taken: bibliographic review, selection of the scales, translation and cultural adaptation and analysis of psychometric properties. Results: the psychometric properties of the scale were assessed based on its application to a sample of 341 individuals (nurses, physicians, final-year nursing and medical students). The validity, reliability and internal consistency were tested. Two scales were found: Caring Attributes Questionnaire (adapted) with a Cronbach's Alpha coefficient of 0.647 and the Technological Influence Questionnaire (adapted) with an Alpha coefficient of 0.777. Conclusions: the scales are easy to apply and reveal reliable psychometric properties, an additional quality as they permit generalized studies on a theme as important as the impact of technological influence in health care. PMID:27143537
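
    Since the record reports Cronbach's alpha coefficients for both adapted scales, a short sketch of how this internal-consistency statistic is computed from an item-response matrix may help; it uses the standard formula and toy data, not the authors' dataset:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x n_items) score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars.sum() / total_var)

# Toy data: 6 respondents answering 4 Likert-type items.
scores = [[4, 5, 4, 5],
          [3, 3, 4, 3],
          [2, 2, 3, 2],
          [5, 5, 5, 4],
          [3, 4, 3, 3],
          [4, 4, 4, 5]]
print(round(cronbach_alpha(scores), 3))
```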

  16. A General Framework for Sequential and Adaptive Methods in Survival Studies

    CERN Document Server

    Luo, Xiaolong; Ying, Zhiliang

    2011-01-01

    Adaptive treatment allocation schemes based on interim responses have generated a great deal of recent interest in clinical trials and other follow-up studies. An important application of such schemes is in survival studies, where the response variable of interest is time to the occurrence of a certain event. Due to possible dependency structures inherited from the enrollment and allocation schemes, existing approaches to survival models, including those that handle staggered entry, cannot be applied directly. This paper develops a new general framework with its theoretical foundation for handling such adaptive designs. The new approach is based on marked point processes and differs from existing approaches in that it considers entry and calendar times rather than survival and calendar times. Large sample properties, which are essential for statistical inference, are established. Special attention is given to the Cox model and related score processes. Applications to adaptive and sequential designs are discus...

  17. Adaptive Multimedia Retrieval: Semantics, Context, and Adaptation

    DEFF Research Database (Denmark)

    This book constitutes the thoroughly refereed post-conference proceedings of the 10th International Conference on Adaptive Multimedia Retrieval, AMR 2012, held in Copenhagen, Denmark, in October 2012. The 17 revised full papers presented were carefully reviewed and selected from numerous submissi...

  18. Adaptive sharpening of photos

    Science.gov (United States)

    Safonov, Ilia V.; Rychagov, Michael N.; Kang, KiMin; Kim, Sang Ho

    2008-01-01

    Sharpness is an important attribute that contributes to the overall impression of printed photo quality. Often it is impossible to estimate sharpness prior to printing, and it is sometimes a complex task for a consumer to obtain accurate sharpening results by editing a photo on a computer. A novel method of adaptive sharpening aimed at photo printers is proposed. Our approach includes three key techniques: sharpness level estimation, local tone mapping and boosting of local contrast. Non-reference automatic sharpness level estimation is based on the analysis of variations of edge histograms, where the edges are produced by high-pass filters with various kernel sizes; an array of integrals of the logarithm of the edge histograms characterizes photo sharpness, and machine learning is applied to choose optimal parameters for a given printing size and resolution. Local tone mapping with ordering is applied to decrease the edge transition slope length without noticeable artifacts and with some noise suppression. An unsharp mask via a bilateral filter is applied to boost local contrast; this stage does not produce the strong halo artifact that is typical of the traditional unsharp mask filter. The quality of the proposed approach is evaluated by surveying observers' opinions; according to the replies obtained, the proposed method enhances the majority of photos.
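
    The local-contrast stage described above replaces the Gaussian blur of a classical unsharp mask with an edge-preserving base layer. A rough OpenCV sketch of that general idea follows; the parameter values and file names are placeholders, not those of the paper:

```python
import cv2
import numpy as np

def unsharp_via_bilateral(img_bgr, amount=0.6, d=9, sigma_color=40, sigma_space=9):
    """Boost local contrast by adding back the difference between the image and
    an edge-preserving (bilateral-filtered) base layer.  Because the base layer
    keeps strong edges sharp, the halo around them is weaker than with the
    classical Gaussian unsharp mask."""
    img = img_bgr.astype(np.float32)
    base = cv2.bilateralFilter(img, d, sigma_color, sigma_space)
    detail = img - base
    sharpened = img + amount * detail
    return np.clip(sharpened, 0, 255).astype(np.uint8)

if __name__ == "__main__":
    photo = cv2.imread("photo.jpg")          # path is a placeholder
    cv2.imwrite("photo_sharpened.jpg", unsharp_via_bilateral(photo))
```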

  19. Ship detection for high resolution optical imagery with adaptive target filter

    Science.gov (United States)

    Ju, Hongbin

    2015-10-01

    Ship detection is important due to both its civil and military use. In this paper, we propose a novel ship detection method, the Adaptive Target Filter (ATF), for high resolution optical imagery. The proposed framework can be grouped into two stages. In the first stage, a test image is densely divided into different detection windows and each window is transformed to a feature vector in its feature space, with Histograms of Oriented Gradients (HOG) accumulated as the basic feature descriptor. In the second stage, the proposed ATF highlights all the ship regions and suppresses the undesired backgrounds adaptively. Each detection window is assigned a score, which represents the degree to which the window belongs to a certain ship category. The ATF can be obtained adaptively by weighted Logistic Regression (WLR) according to the distribution of backgrounds and targets in the input image. The main innovation of our method is that we only need to collect positive training samples to build the filter, while the negative training samples are adaptively generated from the input image. This differs from other classification methods such as Support Vector Machine (SVM) and Logistic Regression (LR), which need to collect both positive and negative training samples. Experimental results on 1-m high resolution optical images show that the proposed method achieves the desired ship detection performance with higher quality and robustness than other methods, e.g., SVM and LR.
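
    A rough software sketch of the scoring idea follows, using scikit-image HOG features and scikit-learn logistic regression with per-sample weights. The window handling and weighting scheme are placeholders and do not reproduce the ATF's adaptive negative-sample generation:

```python
import numpy as np
from skimage.feature import hog
from sklearn.linear_model import LogisticRegression

def hog_feature(window):
    """HOG descriptor of one grayscale detection window."""
    return hog(window, orientations=9, pixels_per_cell=(8, 8),
               cells_per_block=(2, 2), feature_vector=True)

def train_weighted_scorer(pos_windows, neg_windows, neg_weight=0.5):
    """Weighted logistic regression scorer: positive (ship) windows are collected
    offline, negative windows are drawn from the input image itself and
    down-weighted.  The weighting here is a placeholder, not the ATF."""
    X = np.array([hog_feature(w) for w in pos_windows + neg_windows])
    y = np.array([1] * len(pos_windows) + [0] * len(neg_windows))
    w = np.array([1.0] * len(pos_windows) + [neg_weight] * len(neg_windows))
    clf = LogisticRegression(max_iter=1000)
    clf.fit(X, y, sample_weight=w)
    return clf

def score_windows(clf, windows):
    """Ship-likeness score (probability of the positive class) per window."""
    X = np.array([hog_feature(w) for w in windows])
    return clf.predict_proba(X)[:, 1]

# Example with random 64x64 windows standing in for real image patches:
rng = np.random.default_rng(0)
pos = [rng.random((64, 64)) for _ in range(20)]
neg = [rng.random((64, 64)) for _ in range(40)]
clf = train_weighted_scorer(pos, neg)
print(score_windows(clf, [rng.random((64, 64))]))
```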

  20. Contrast adaptation in the Limulus lateral eye.

    Science.gov (United States)

    Valtcheva, Tchoudomira M; Passaglia, Christopher L

    2015-12-01

    Luminance and contrast adaptation are neuronal mechanisms employed by the visual system to adjust our sensitivity to light. They are mediated by an assortment of cellular and network processes distributed across the retina and visual cortex. Both have been demonstrated in the eyes of many vertebrates, but only luminance adaptation has been shown in invertebrate eyes to date. Since the computational benefits of contrast adaptation should apply to all visual systems, we investigated whether this mechanism operates in horseshoe crab eyes, one of the best-understood neural networks in the animal kingdom. The spike trains of optic nerve fibers were recorded in response to light stimuli modulated randomly in time and delivered to single ommatidia or the whole eye. We found that the retina adapts to both the mean luminance and contrast of a white-noise stimulus, that luminance- and contrast-adaptive processes are largely independent, and that they originate within an ommatidium. Network interactions are not involved. A published computer model that simulates existing knowledge of the horseshoe crab eye did not show contrast adaptation, suggesting that a heretofore unknown mechanism may underlie the phenomenon. This mechanism does not appear to reside in photoreceptors because white-noise analysis of electroretinogram recordings did not show contrast adaptation. The likely site of origin is therefore the spike discharge mechanism of optic nerve fibers. The finding of contrast adaption in a retinal network as simple as the horseshoe crab eye underscores the broader importance of this image processing strategy to vision. PMID:26445869

  1. Adaptation through chromosomal inversions in Anopheles

    Directory of Open Access Journals (Sweden)

    Diego eAyala

    2014-05-01

    Full Text Available Chromosomal inversions have been repeatedly involved in local adaptation in a large number of animals and plants. The ecological and behavioral plasticity of Anopheles species - human malaria vectors - is mirrored by high amounts of polymorphic inversions. The adaptive significance of chromosomal inversions has been consistently attested by strong and significant correlations between their frequencies and a number of phenotypic traits. Here, we provide an extensive literature review of the different adaptive traits associated with chromosomal inversions in the genus Anopheles. Traits having important consequences for the success of present and future vector control measures, such as insecticide resistance and behavioral changes, are discussed.

  2. Human Maternal Brain Plasticity: Adaptation to Parenting.

    Science.gov (United States)

    Kim, Pilyoung

    2016-09-01

    New mothers undergo dynamic neural changes that support positive adaptation to parenting and the development of mother-infant relationships. In this article, I review important psychological adaptations that mothers experience during pregnancy and the early postpartum period. I then review evidence of structural and functional plasticity in human mothers' brains, and explore how such plasticity supports mothers' psychological adaptation to parenting and sensitive maternal behaviors. Last, I discuss pregnancy and the early postpartum period as a window of vulnerabilities and opportunities when the human maternal brain is influenced by stress and psychopathology, but also receptive to interventions. PMID:27589497

  3. Pulse front adaptive optics in multiphoton microscopy

    Science.gov (United States)

    Sun, B.; Salter, P. S.; Booth, M. J.

    2016-03-01

    The accurate focusing of ultrashort laser pulses is extremely important in multiphoton microscopy. Using adaptive optics to manipulate the incident ultrafast beam in either the spectral or spatial domain can introduce significant benefits when imaging. Here we introduce pulse front adaptive optics: manipulating an ultrashort pulse in both the spatial and temporal domains. A deformable mirror and a spatial light modulator are operated in concert to modify contours of constant intensity in space and time within an ultrashort pulse. Through adaptive control of the pulse front, we demonstrate an enhancement in the measured fluorescence from a two photon microscope.

  4. Pulse front control with adaptive optics

    Science.gov (United States)

    Sun, B.; Salter, P. S.; Booth, M. J.

    2016-03-01

    The focusing of ultrashort laser pulses is extremely important for processes including microscopy, laser fabrication and fundamental science. Adaptive optic elements, such as liquid crystal spatial light modulators or membrane deformable mirrors, are routinely used for the correction of aberrations in these systems, leading to improved resolution and efficiency. Here, we demonstrate that adaptive elements used with ultrashort pulses should not be considered simply in terms of wavefront modification, but that changes to the incident pulse front can also occur. We experimentally show how adaptive elements may be used to engineer pulse fronts with spatial resolution.

  5. Parallel Adaptive Mesh Refinement

    Energy Technology Data Exchange (ETDEWEB)

    Diachin, L; Hornung, R; Plassmann, P; WIssink, A

    2005-03-04

    As large-scale, parallel computers have become more widely available and numerical models and algorithms have advanced, the range of physical phenomena that can be simulated has expanded dramatically. Many important science and engineering problems exhibit solutions with localized behavior where highly-detailed salient features or large gradients appear in certain regions which are separated by much larger regions where the solution is smooth. Examples include chemically-reacting flows with radiative heat transfer, high Reynolds number flows interacting with solid objects, and combustion problems where the flame front is essentially a two-dimensional sheet occupying a small part of a three-dimensional domain. Modeling such problems numerically requires approximating the governing partial differential equations on a discrete domain, or grid. Grid spacing is an important factor in determining the accuracy and cost of a computation. A fine grid may be needed to resolve key local features while a much coarser grid may suffice elsewhere. Employing a fine grid everywhere may be inefficient at best and, at worst, may make an adequately resolved simulation impractical. Moreover, the location and resolution of fine grid required for an accurate solution is a dynamic property of a problem's transient features and may not be known a priori. Adaptive mesh refinement (AMR) is a technique that can be used with both structured and unstructured meshes to adjust local grid spacing dynamically to capture solution features with an appropriate degree of resolution. Thus, computational resources can be focused where and when they are needed most to efficiently achieve an accurate solution without incurring the cost of a globally-fine grid. Figure 1.1 shows two example computations using AMR; on the left is a structured mesh calculation of a impulsively-sheared contact surface and on the right is the fuselage and volume discretization of an RAH-66 Comanche helicopter [35]. Note the
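
    As a toy illustration of the refine-where-needed idea (far simpler than the structured-AMR machinery discussed in this report), the sketch below flags the cells of a 1D grid where the solution jumps sharply and subdivides only those cells:

```python
import numpy as np

def refine_1d(x, f, tol=0.05, max_levels=4):
    """Adaptively refine a 1D grid: split any cell where the jump in f between
    its endpoints exceeds tol.  Returns the refined set of node coordinates."""
    for _ in range(max_levels):
        fx = f(x)
        jumps = np.abs(np.diff(fx))
        flagged = np.where(jumps > tol)[0]          # cells needing refinement
        if flagged.size == 0:
            break
        midpoints = 0.5 * (x[flagged] + x[flagged + 1])
        x = np.sort(np.concatenate([x, midpoints]))
    return x

# Steep front near x = 0.5 gets fine cells; smooth regions stay coarse.
f = lambda x: np.tanh(50.0 * (x - 0.5))
coarse = np.linspace(0.0, 1.0, 11)
fine = refine_1d(coarse, f)
print(f"coarse nodes: {coarse.size}, refined nodes: {fine.size}")
```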

  6. Compressive adaptive computational ghost imaging

    CERN Document Server

    Aßmann, Marc; 10.1038/srep01545

    2013-01-01

    Compressive sensing is considered a huge breakthrough in signal acquisition. It allows recording an image consisting of $N^2$ pixels using much fewer than $N^2$ measurements if the image can be transformed to a basis in which most coefficients take on negligibly small values. Standard compressive sensing techniques suffer from the computational overhead needed to reconstruct an image, with typical computation times between hours and days, and are thus not optimal for applications in physics and spectroscopy. We demonstrate an adaptive compressive sampling technique that performs measurements directly in a sparse basis. It needs much fewer than $N^2$ measurements without any computational overhead, so the result is available instantly.
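
    The following sketch illustrates the general principle of measuring adaptively in a sparse (Haar-like) basis: each 'measurement' returns the mean of the unknown scene over one block, and blocks are split only where the two half-block means differ. It is a schematic software analogue of adaptive sampling, not the optical ghost-imaging setup of the paper:

```python
import numpy as np

def adaptive_block_sampling(measure, n, tol=0.02, min_block=4):
    """Adaptive sampling in a Haar-like basis: each call to measure(lo, hi) is
    one 'measurement' (the mean of the unknown scene over one block).  Blocks
    are split only where the two half-block means differ noticeably, so smooth
    regions cost very few measurements."""
    recon = np.zeros(n)
    count = {"m": 0}

    def mean_of(lo, hi):
        count["m"] += 1
        return measure(lo, hi)

    def refine(lo, hi):
        mid = (lo + hi) // 2
        left, right = mean_of(lo, mid), mean_of(mid, hi)
        if hi - lo <= min_block or abs(left - right) < tol:
            recon[lo:mid] = left       # smooth (or smallest) block: stop refining
            recon[mid:hi] = right
        else:
            refine(lo, mid)
            refine(mid, hi)

    refine(0, n)
    return recon, count["m"]

# Piecewise-smooth "scene"; measure() plays the role of the bucket detector.
n = 1024
x = np.linspace(0.0, 1.0, n)
scene = np.where(x < 0.3, 0.2, np.where(x < 0.7, 0.9, 0.4))
recon, m = adaptive_block_sampling(lambda lo, hi: scene[lo:hi].mean(), n)
print(f"{m} measurements instead of {n} pixel values")
```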

  7. Twenty-five years of confirmatory adaptive designs: opportunities and pitfalls.

    Science.gov (United States)

    Bauer, Peter; Bretz, Frank; Dragalin, Vladimir; König, Franz; Wassmer, Gernot

    2016-02-10

    'Multistage testing with adaptive designs' was the title of an article by Peter Bauer that appeared 1989 in the German journal Biometrie und Informatik in Medizin und Biologie. The journal does not exist anymore but the methodology found widespread interest in the scientific community over the past 25 years. The use of such multistage adaptive designs raised many controversial discussions from the beginning on, especially after the publication by Bauer and Köhne 1994 in Biometrics: Broad enthusiasm about potential applications of such designs faced critical positions regarding their statistical efficiency. Despite, or possibly because of, this controversy, the methodology and its areas of applications grew steadily over the years, with significant contributions from statisticians working in academia, industry and agencies around the world. In the meantime, such type of adaptive designs have become the subject of two major regulatory guidance documents in the US and Europe and the field is still evolving. Developments are particularly noteworthy in the most important applications of adaptive designs, including sample size reassessment, treatment selection procedures, and population enrichment designs. In this article, we summarize the developments over the past 25 years from different perspectives. We provide a historical overview of the early days, review the key methodological concepts and summarize regulatory and industry perspectives on such designs. Then, we illustrate the application of adaptive designs with three case studies, including unblinded sample size reassessment, adaptive treatment selection, and adaptive endpoint selection. We also discuss the availability of software for evaluating and performing such designs. We conclude with a critical review of how expectations from the beginning were fulfilled, and - if not - discuss potential reasons why this did not happen. PMID:25778935

  8. [Postvagotomy adaptation syndrome].

    Science.gov (United States)

    Shapovalov, V A

    1998-01-01

    It was established experimentally that the changes in the indexes of the organism's natural resistance and in peritoneal cavity cytology have a compensatory-adaptive character during the occurrence and progression of the denervation-adaptation syndrome, which may be assessed as eustress. Vagotomy and operative trauma cause qualitatively different reactions of the organism.

  9. Adaptive Wavelet Transforms

    Energy Technology Data Exchange (ETDEWEB)

    Szu, H.; Hsu, C. [Univ. of Southwestern Louisiana, Lafayette, LA (United States)

    1996-12-31

    Human sensor systems (HSS) may be approximately described as an adaptive or self-learning version of the Wavelet Transform (WT), capable of learning suitable mother wavelets from several input-output associative pairs. Such an Adaptive WT (AWT) is a redundant combination of mother wavelets used either to represent or to classify inputs.

  10. Behavioral Adaptation and Acceptance

    NARCIS (Netherlands)

    Martens, M.H.; Jenssen, G.D.

    2012-01-01

    One purpose of Intelligent Vehicles is to improve road safety, throughput, and emissions. However, the predicted effects are not always as large as aimed for. Part of this is due to indirect behavioral changes of drivers, also called behavioral adaptation. Behavioral adaptation (BA) refers to uninte

  11. Behavioural adaptation and acceptance

    NARCIS (Netherlands)

    Martens, M.H.; Jenssen, G.D.; Eskandarian, A.

    2012-01-01

    One purpose of Intelligent Vehicles is to improve road safety, throughput, and emissions. However, the predicted effects are not always as large as aimed for. Part of this is due to indirect behavioral changes of drivers, also called behavioral adaptation. Behavioral adaptation (BA) refers to uninte

  12. Adaptive Control Algorithms, Analysis and Applications

    OpenAIRE

    Landau, Ioan; Lozano, Rogelio; M'Saad, Mohammed; Karimi, Alireza

    2011-01-01

    Adaptive Control (second edition) shows how a desired level of system performance can be maintained automatically and in real time, even when process or disturbance parameters are unknown and variable. It is a coherent exposition of the many aspects of this field, setting out the problems to be addressed and moving on to solutions, their practical significance and their application. Discrete-time aspects of adaptive control are emphasized to reflect the importance of digital computers in the ...

  13. Importance of Corneal Thickness

    Science.gov (United States)

    Corneal thickness is important because it can mask an accurate reading of eye pressure, causing doctors to treat you ...

  14. Cardiovascular adaptations to exercise training

    DEFF Research Database (Denmark)

    Hellsten, Ylva; Nyberg, Michael

    2016-01-01

    Aerobic exercise training leads to cardiovascular changes that markedly increase aerobic power and lead to improved endurance performance. The functionally most important adaptation is the improvement in maximal cardiac output which is the result of an enlargement in cardiac dimension, improved...... arteries is reduced, a factor contributing to increased arterial compliance. Endurance training may also induce alterations in the vasodilator capacity, although such adaptations are more pronounced in individuals with reduced vascular function. The microvascular net increases in size within the muscle...... allowing for an improved capacity for oxygen extraction by the muscle through a greater area for diffusion, a shorter diffusion distance, and a longer mean transit time for the erythrocyte to pass through the smallest blood vessels. The present article addresses the effect of endurance training on systemic...

  15. Adaptive noise cancellation

    CERN Document Server

    Akram, N

    1999-01-01

    In this report we describe the concept of adaptive noise cancelling, an alternative method of estimating signals corrupted by additive noise or interference. The method uses a 'primary' input containing the corrupted signal and a 'reference' input containing noise correlated in some unknown way with the primary noise; the reference input is adaptively filtered and subtracted from the primary input to obtain the signal estimate. Adaptive filtering before subtraction allows the treatment of inputs that are deterministic or stochastic, stationary or time-variable. When the reference input is free of signal and certain other conditions are met, the noise in the primary input can be essentially eliminated without signal distortion. It is further shown that the adaptive filter also acts as a notch filter. Simulated results illustrate the usefulness of the adaptive noise cancelling technique.
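
    The classic realization of this scheme is an FIR filter whose weights are adjusted by the Widrow-Hoff LMS rule; the report does not prescribe a specific adaptation rule, so the sketch below should be read as one standard, hedged illustration. The filter length, step size and toy signals are assumptions made for the example.

    ```python
    import numpy as np

    def lms_noise_canceller(primary, reference, n_taps=32, mu=0.01):
        """Adaptive noise canceller: filter the reference input with an LMS
        adaptive FIR filter and subtract it from the primary input; the error
        signal is the estimate of the underlying signal."""
        w = np.zeros(n_taps)
        estimate = np.zeros(len(primary))
        for n in range(n_taps - 1, len(primary)):
            x = reference[n - n_taps + 1:n + 1][::-1]   # most recent reference samples
            y = w @ x                                   # adaptive filter output
            e = primary[n] - y                          # error = signal estimate
            w += mu * e * x                             # Widrow-Hoff LMS weight update
            estimate[n] = e
        return estimate, w

    # Toy demo: a slow sinusoid buried in noise that reaches the primary input
    # through an unknown FIR path; the reference input sees the raw noise.
    rng = np.random.default_rng(0)
    noise = rng.standard_normal(5000)
    signal = np.sin(2 * np.pi * 0.005 * np.arange(5000))
    noise_path = np.convolve(noise, [0.8, -0.4, 0.2], mode="full")[:5000]
    primary = signal + noise_path
    estimate, _ = lms_noise_canceller(primary, noise)
    print(np.mean((estimate[1000:] - signal[1000:]) ** 2))   # residual error after convergence
    ```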

  16. Adaptive signal processor

    International Nuclear Information System (INIS)

    An experimental, general purpose adaptive signal processor system has been developed, utilizing a quantized (clipped) version of the Widrow-Hoff least-mean-square adaptive algorithm developed by Moschner. The system accommodates 64 adaptive weight channels with 8-bit resolution for each weight. Internal weight update arithmetic is performed with 16-bit resolution, and the system error signal is measured with 12-bit resolution. An adapt cycle of adjusting all 64 weight channels is accomplished in 8 μsec. Hardware of the signal processor utilizes primarily Schottky-TTL type integrated circuits. A prototype system with 24 weight channels has been constructed and tested. This report presents details of the system design and describes basic experiments performed with the prototype signal processor. Finally some system configurations and applications for this adaptive signal processor are discussed

  17. Adaptation of OCA-P, a probabilistic fracture-mechanics code, to a personal computer

    International Nuclear Information System (INIS)

    The OCA-P probabilistic fracture-mechanics code can now be executed on a personal computer with 512 kilobytes of memory, a math coprocessor, and a hard disk. A user's guide for the particular adaptation has been prepared, and additional importance sampling techniques for OCA-P have been developed that allow the sampling of only the tails of selected distributions. Features have also been added to OCA-P that permit RTNDT to be used as an "independent" variable in the calculation of P

  18. Kinetic solvers with adaptive mesh in phase space.

    Science.gov (United States)

    Arslanbekov, Robert R; Kolobov, Vladimir I; Frolova, Anna A

    2013-12-01

    An adaptive mesh in phase space (AMPS) methodology has been developed for solving multidimensional kinetic equations by the discrete velocity method. A Cartesian mesh for both configuration (r) and velocity (v) spaces is produced using a "tree of trees" (ToT) data structure. The r mesh is automatically generated around embedded boundaries, and is dynamically adapted to local solution properties. The v mesh is created on-the-fly in each r cell. Mappings between neighboring v-space trees are implemented for the advection operator in r space. We have developed algorithms for solving the full Boltzmann and linear Boltzmann equations with AMPS. Several recent innovations were used to calculate the discrete Boltzmann collision integral with a dynamically adaptive v mesh: the importance sampling, multipoint projection, and variance reduction methods. We have developed an efficient algorithm for calculating the linear Boltzmann collision integral for elastic and inelastic collisions of hot light particles in a Lorentz gas. Our AMPS technique has been demonstrated for simulations of hypersonic rarefied gas flows, ion and electron kinetics in weakly ionized plasma, radiation and light-particle transport through thin films, and electron streaming in semiconductors. We have shown that AMPS allows minimizing the number of cells in phase space to reduce the computational cost and memory usage for solving challenging kinetic problems. PMID:24483578

  19. A MODEL FOR RUN-TIME SOFTWARE ARCHITECTURE ADAPTATION

    Directory of Open Access Journals (Sweden)

    Fatemeh Khorasani

    2015-01-01

    Full Text Available Since the global demand for software systems is increasing and their environments are constantly changing, the adaptability of software systems is of significant importance. Because the architecture of a software system is a high-level view of the system and makes modifiability possible at an overall level, adapting software systems by changing their architecture configuration can be considered an effective approach. In this study, the architecture configuration is modified through xADL, a software architecture description language with high flexibility. Software architecture reconfiguration is done based on the existing rules of a rule-based system, which are written with respect to three strategies: load balancing, fixed bandwidth and fixed latency. The proposed model is simulated on samples of a client-server system, a video conferencing system and a students' grading system. The proposed model can be used with all types of architecture, including client-server architecture, service-oriented architecture, etc.

  20. Multi-decadal range changes vs. thermal adaptation for north east Atlantic oceanic copepods in the face of climate change.

    Science.gov (United States)

    Hinder, Stephanie L; Gravenor, Mike B; Edwards, Martin; Ostle, Clare; Bodger, Owen G; Lee, Patricia L M; Walne, Antony W; Hays, Graeme C

    2014-01-01

    Populations may potentially respond to climate change in various ways including moving to new areas or alternatively staying where they are and adapting as conditions shift. Traditional laboratory and mesocosm experiments last days to weeks and thus only give a limited picture of thermal adaptation, whereas ocean warming occurring over decades allows the potential for selection of new strains better adapted to warmer conditions. Evidence for adaptation in natural systems is equivocal. We used a 50-year time series comprising 117 056 samples in the NE Atlantic, to quantify the abundance and distribution of two particularly important and abundant members of the ocean plankton (copepods of the genus Calanus) that play a key trophic role for fisheries. Abundance of C. finmarchicus, a cold-water species, and C. helgolandicus, a warm-water species, were negatively and positively related to sea surface temperature (SST) respectively. However, the abundance vs. SST relationships for neither species changed over time in a manner consistent with thermal adaptation. Accompanying the lack of evidence for thermal adaptation there has been an unabated range contraction for C. finmarchicus and range expansion for C. helgolandicus. Our evidence suggests that thermal adaptation has not mitigated the impacts of ocean warming for dramatic range changes of these key species and points to continued dramatic climate induced changes in the biology of the oceans.

  1. Adaptive management of natural resources-framework and issues

    Science.gov (United States)

    Williams, B.K.

    2011-01-01

    Adaptive management, an approach for simultaneously managing and learning about natural resources, has been around for several decades. Interest in adaptive decision making has grown steadily over that time, and by now many in natural resources conservation claim that adaptive management is the approach they use in meeting their resource management responsibilities. Yet there remains considerable ambiguity about what adaptive management actually is, and how it is to be implemented by practitioners. The objective of this paper is to present a framework and conditions for adaptive decision making, and discuss some important challenges in its application. Adaptive management is described as a two-phase process of deliberative and iterative phases, which are implemented sequentially over the timeframe of an application. Key elements, processes, and issues in adaptive decision making are highlighted in terms of this framework. Special emphasis is given to the question of geographic scale, the difficulties presented by non-stationarity, and organizational challenges in implementing adaptive management. © 2010.

  2. An overview of importance splitting for rare event simulation

    Energy Technology Data Exchange (ETDEWEB)

    Morio, Jerome; Pastel, Rudy [Office National d' Etudes et Recherches Aerospatiales, The French Aerospace Lab, Long-term Design and System Integration Department (ONERA-DPRS-SSD), BP72, 29 avenue de la Division Leclerc, FR-92322 Chatillon Cedex (France); Le Gland, Francois, E-mail: jerome.morio@onera.f [INRIA Rennes, ASPI Applications of Interacting Particle Systems to Statistics, Campus de Beaulieu 35042, Rennes (France)

    2010-09-15

    Monte Carlo simulations are a classical tool to analyse physical systems. When unlikely events are to be simulated, the importance sampling technique is often used instead of Monte Carlo. Importance sampling has some drawbacks when the problem dimensionality is high or when the optimal importance sampling density is complex to obtain. In this paper, we focus on a relatively novel but little-known alternative to importance sampling called importance splitting.
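
    One common form of importance splitting replaces the single rare-event estimate by a product of easier conditional estimates, repopulating each intermediate level with conditional MCMC moves (the subset-simulation flavour). The sketch below illustrates that idea for a standard-normal input; it is a simplified stand-in rather than the exact algorithm reviewed in the paper, and the one-move-per-sample repopulation, sample sizes and threshold are assumptions that keep the example short.

    ```python
    import numpy as np

    def splitting_estimate(score, dim, threshold, n_per_level=1000, p0=0.1, seed=0):
        """Multilevel splitting estimate of P(score(X) > threshold) for X ~ N(0, I_dim):
        a chain of easier conditional levels replaces one direct Monte Carlo estimate."""
        rng = np.random.default_rng(seed)
        x = rng.standard_normal((n_per_level, dim))
        s = np.apply_along_axis(score, 1, x)
        prob = 1.0
        while True:
            level = np.quantile(s, 1.0 - p0)            # adaptive intermediate level
            if level >= threshold:                      # final level reached
                return prob * np.mean(s > threshold)
            prob *= p0                                  # P(next level | current level) ~ p0
            seeds = x[s > level]                        # survivors become seeds
            new_x = np.empty_like(x)
            for i in range(n_per_level):                # repopulate by conditional MCMC
                xi = seeds[rng.integers(len(seeds))].copy()
                cand = xi + 0.5 * rng.standard_normal(dim)
                accept = rng.random() < np.exp(0.5 * (xi @ xi - cand @ cand))
                if accept and score(cand) > level:      # stay above the current level
                    xi = cand
                new_x[i] = xi                           # (real codes run longer chains here)
            x = new_x
            s = np.apply_along_axis(score, 1, x)

    # Rare event: P(sum of 10 iid N(0,1) > 12), roughly 7e-5.
    print(splitting_estimate(lambda v: v.sum(), dim=10, threshold=12.0))
    ```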

  3. Calculation of Parameter Failure Probability of Thermodynamic System by Response Surface and Importance Sampling Method

    Institute of Scientific and Technical Information of China (English)

    尚彦龙; 蔡琦; 陈力生; 张杨伟

    2012-01-01

    In this paper, the combined method of response surface and importance sampling was applied to calculate the parameter failure probability of a thermodynamic system. A mathematical model was presented for parameter failure of the physical process in the thermodynamic system, and on this basis the combined response surface and importance sampling algorithm was established; the performance degradation model of the components and the importance-sampling-based simulation procedure for parameter failure in the physical process of the thermodynamic system were also presented. The parameter failure probability of the purification water system in a nuclear reactor was then obtained with the combined method. The results show that, for calculating the parameter failure probability of a complex thermodynamic system with high dimensionality, pronounced non-linear characteristics and performance degradation, importance sampling reaches satisfactory precision with higher efficiency than direct sampling, whereas the response surface method alone has limitations; the combined response surface and importance sampling method is an effective way to analyse parameter failure in the physical processes of thermodynamic systems.
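
    A minimal sketch of how the two ingredients are typically combined is given below: a quadratic response surface is fitted to a stand-in limit-state function from a small design of experiments, an approximate design point is located on the cheap surrogate, and importance sampling is centred there with density-ratio weights. The limit-state function, design sizes and distributions are assumptions for illustration; the paper's component degradation models are not reproduced.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def g_true(x):
        """Stand-in for an expensive limit-state function: failure when g < 0."""
        return 4.0 - x[..., 0] - x[..., 1] - 0.1 * x[..., 0] * x[..., 1]

    # Step 1: fit a quadratic response surface from a small experimental design.
    def features(x):
        x1, x2 = x[..., 0], x[..., 1]
        return np.stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2], axis=-1)

    design = rng.standard_normal((50, 2)) * 2.0
    beta, *_ = np.linalg.lstsq(features(design), g_true(design), rcond=None)
    g_hat = lambda x: features(x) @ beta                  # cheap surrogate of g_true

    # Step 2: crude design-point search on the surrogate (failed point nearest the origin).
    cand = rng.standard_normal((20000, 2)) * 3.0
    failed = cand[g_hat(cand) < 0.0]
    x_star = failed[np.argmin(np.linalg.norm(failed, axis=1))]

    # Step 3: importance sampling with a unit-variance Gaussian centred on the design
    # point, reweighted by the ratio of the nominal N(0, I) density to the proposal.
    n = 20000
    samples = x_star + rng.standard_normal((n, 2))
    log_w = -0.5 * (samples**2).sum(axis=1) + 0.5 * ((samples - x_star) ** 2).sum(axis=1)
    pf = np.mean((g_hat(samples) < 0.0) * np.exp(log_w))
    print("estimated failure probability:", pf)
    ```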

  4. Adapt or Become Extinct!

    DEFF Research Database (Denmark)

    Goumas, Georgios; McKee, Sally A.; Själander, Magnus;

    2011-01-01

    boundaries (walls) for applications which limit software development (parallel programming wall), performance (memory wall, communication wall) and viability (power wall). The only way to survive in such a demanding environment is by adaptation. In this paper we discuss how dynamic information collected...... static analysis (either during ahead-of-time or just-in-time) compilation. We extend the notion of information-driven adaptation and outline the architecture of an infrastructure designed to enable information ow and adaptation throughout the life-cycle of an application....

  5. Adaptation investments and homeownership

    DEFF Research Database (Denmark)

    Hansen, Jørgen Drud; Skak, Morten

    2008-01-01

    This article develops a model where ownership improves efficiency of the housing market as it enhances the utility of housing consumption for some consumers. The model is based on an extended Hotelling-Lancaster utility approach in which the ideal variant of housing is obtainable only by adapting...... the home through a supplementary investment. Ownership offers low costs of adaptation, but has high contract costs compared with renting. Consumers simultaneously decide housing demand and tenure, and because of the different cost structure only consumers with strong preferences for individual adaptation...

  6. [Adaptive optics for ophthalmology].

    Science.gov (United States)

    Saleh, M

    2016-04-01

    Adaptive optics is a technology enhancing the visual performance of an optical system by correcting its optical aberrations. Adaptive optics have already enabled several breakthroughs in the field of visual sciences, such as improvement of visual acuity in normal and diseased eyes beyond physiologic limits, and the correction of presbyopia. Adaptive optics technology also provides high-resolution, in vivo imaging of the retina that may eventually help to detect the onset of retinal conditions at an early stage and provide better assessment of treatment efficacy.

  7. Adaptive network countermeasures.

    Energy Technology Data Exchange (ETDEWEB)

    McClelland-Bane, Randy; Van Randwyk, Jamie A.; Carathimas, Anthony G.; Thomas, Eric D.

    2003-10-01

    This report describes the results of a two-year LDRD funded by the Differentiating Technologies investment area. The project investigated the use of countermeasures in protecting computer networks as well as how current countermeasures could be changed in order to adapt with both evolving networks and evolving attackers. The work involved collaboration between Sandia employees and students in the Sandia - California Center for Cyber Defenders (CCD) program. We include an explanation of the need for adaptive countermeasures, a description of the architecture we designed to provide adaptive countermeasures, and evaluations of the system.

  8. Adaptive color quantization using the baker's transformation

    OpenAIRE

    Montagne, Christophe; Lelandais, Sylvie; Smolarz, André; Cornu, Philippe; Larabi, Mohamed-Chaker; Fernandez-Maloigne, Christine

    2006-01-01

    In this article we propose an original technique to reduce the number of colors contained in an image. The method uses the baker's transformation, which produces a statistically suitable mixture of the pixels of the image. From this mixture, we can extract several samples, which present the same characteristics as the initial image. The idea is to consider these samples as potential color palettes. These palettes make it possible to do an adaptive qua...

  9. The process of organisational adaptation through innovations, and organisational adaptability

    OpenAIRE

    Tikka, Tommi

    2010-01-01

    This study is about the process of organisational adaptation and organisational adaptability. The study generates a theoretical framework about organisational adaptation behaviour and conditions that have influence on success of organisational adaptation. The research questions of the study are: How does an organisation adapt through innovations, and which conditions enhance or impede organisational adaptation through innovations? The data were gathered from five case organisations withi...

  10. Sensorless adaptive optics and the effect of field of view in biological second harmonic generation microscopy

    Science.gov (United States)

    Vandendriessche, Stefaan; Vanbel, Maarten K.; Verbiest, Thierry

    2014-05-01

    In light of the population aging in many developed countries, there is great economic interest in improving the speed and cost-efficiency of healthcare. Clinical diagnostic tools are key to these improvements, with biophotonics providing a means to achieve them. Standard optical microscopy of in vitro biological samples has been an important diagnostic tool since the invention of the microscope, with well known resolution limits. Nonlinear optical imaging improves on the resolution limits of linear microscopy, while providing higher contrast images and a greater penetration depth due to the red-shifted incident light compared to standard optical microscopy. It also provides information on molecular orientation and chirality. Adaptive optics can improve the quality of nonlinear optical images. We analyzed the effect of sensorless adaptive optics on the quality of the nonlinear optical images of biological samples. We demonstrate that care needs to be taken when using a large field of view. Our findings provide information on how to improve the quality of nonlinear optical imaging, and can be generalized to other in vitro biological samples. The image quality improvements achieved by adaptive optics should help speed up clinical diagnostics in vitro, while increasing their accuracy and helping decrease detection limits. The same principles apply to in vivo biological samples, and in the future it may be possible to extend these findings to other nonlinear optical effects used in biological imaging.

  11. Interrelations between psychosocial functioning and adaptive- and maladaptive-range personality traits.

    Science.gov (United States)

    Ro, Eunyoe; Clark, Lee Anna

    2013-08-01

    Decrements in one or more domains of psychosocial functioning (e.g., poor job performance, poor interpersonal relations) are commonly observed in psychiatric patients. The purpose of this study is to increase understanding of psychosocial functioning as a broad, multifaceted construct as well as its associations with both adaptive- and maladaptive-range personality traits in both nonclinical and psychiatric outpatient samples. The study was conducted in two phases. In Study 1, a nonclinical sample (N = 429) was administered seven psychosocial functioning and adaptive-range personality trait measures. In Study 2, psychiatric outpatients (N = 181) were administered the same psychosocial functioning measures, and maladaptive- as well as adaptive-range personality trait measures. Exploratory (both studies) and confirmatory (Study 2) factor analyses indicated a common three-factor, hierarchical structure of psychosocial functioning-Well Being, Social/Interpersonal Functioning, and Basic Functioning. These psychosocial functioning domains were closely--and differentially--linked with personality traits, especially strongly so in patients. Across samples, Well Being was associated with both Neuroticism/Negative Affectivity and Extraversion/Positive Affectivity, Social/Interpersonal Functioning was associated with both Agreeableness and Conscientiousness/Disinhibition, and Basic Functioning was associated with Conscientiousness/Disinhibition, although only modestly in the nonclinical sample. These relations generally were maintained even after partialing out current general dysphoric symptoms. These findings have implications for considering psychosocial functioning as an important third domain in a tripartite model together with personality and psychopathology. PMID:24016019

  12. Adaptation to and Recovery from Global Catastrophe

    Directory of Open Access Journals (Sweden)

    Seth D. Baum

    2013-03-01

    Full Text Available Global catastrophes, such as nuclear war, pandemics and ecological collapse threaten the sustainability of human civilization. To date, most work on global catastrophes has focused on preventing the catastrophes, neglecting what happens to any catastrophe survivors. To address this gap in the literature, this paper discusses adaptation to and recovery from global catastrophe. The paper begins by discussing the importance of global catastrophe adaptation and recovery, noting that successful adaptation/recovery could have value on even astronomical scales. The paper then discusses how the adaptation/recovery could proceed and makes connections to several lines of research. Research on resilience theory is considered in detail and used to develop a new method for analyzing the environmental and social stressors that global catastrophe survivors would face. This method can help identify options for increasing survivor resilience and promoting successful adaptation and recovery. A key point is that survivors may exist in small isolated communities disconnected from global trade and, thus, must be able to survive and rebuild on their own. Understanding the conditions facing isolated survivors can help promote successful adaptation and recovery. That said, the processes of global catastrophe adaptation and recovery are highly complex and uncertain; further research would be of great value.

  13. Women's positive adaptation in childhood and adulthood : A longitudinal study

    OpenAIRE

    Andersson, Håkan

    2007-01-01

    An area within psychology that looks at the strengths and positive sides of human life has emerged over the last decade. It is called positive psychology, and one area related to it is positive adaptation. The main purpose of this paper is to describe the natural history of females’ positive extrinsic and intrinsic adaptation from childhood to adulthood, with a focus on typical positive patterns of adaptation and how these patterns develop within the same individual. The sample consisted of about...

  14. Exploring the Use of Adaptively Restrained Particles for Graphics Simulations

    OpenAIRE

    Pierre-Luc Manteaux; François Faure; Stephane Redon; Marie-Paule Cani

    2013-01-01

    In this paper, we explore the use of Adaptively Restrained (AR) particles for graphics simulations. Contrary to previous methods, Adaptively Restrained Particle Simulations (ARPS) do not adapt time or space sampling, but rather switch the positional degrees of freedom of particles on and off, while letting their momenta evolve. Therefore, inter-particle forces do not have to be updated at each time step, in contrast with traditional methods that spend a lot of time ...

  15. Adapt or Die

    DEFF Research Database (Denmark)

    Brody, Joshua Eric; Larsen, Kasper Green

    2015-01-01

    In this paper, we study the role non-adaptivity plays in maintaining dynamic data structures. Roughly speaking, a data structure is non-adaptive if the memory locations it reads and/or writes when processing a query or update depend only on the query or update and not on the contents of previously...... read cells. We study such non-adaptive data structures in the cell probe model. This model is one of the least restrictive lower bound models and in particular, cell probe lower bounds apply to data structures developed in the popular word-RAM model. Unfortunately, this generality comes at a high cost...... several different notions of non-adaptivity and identify key properties that must be dealt with if we are to prove polynomial lower bounds without restrictions on the data structures. Finally, our results also unveil an interesting connection between data structures and depth-2 circuits. This allows us...

  16. Adaptive Architectural Envelope

    DEFF Research Database (Denmark)

    Foged, Isak Worre; Kirkegaard, Poul Henning

    2010-01-01

    Recent years have seen an increasing variety of applications of adaptive architectural structures for improvement of structural performance by recognizing changes in their environments and loads, adapting to meet goals, and using past events to improve future performance or maintain serviceability....... The general scopes of this paper are to develop a new adaptive kinetic architectural structure, particularly a reconfigurable architectural structure which can transform body shape from planar geometries to hyper-surfaces using different control strategies, i.e. a transformation into more than one or two...... different shape alternatives. The adaptive structure is a proposal for a responsive building envelope which is an idea of a first level operational framework for present and future investigations towards performance based responsive architectures through a set of responsive typologies. A mock-up concept

  17. Asimovian Adaptive Agents

    CERN Document Server

    Gordon, D F

    2011-01-01

    The goal of this research is to develop agents that are adaptive and predictable and timely. At first blush, these three requirements seem contradictory. For example, adaptation risks introducing undesirable side effects, thereby making agents' behavior less predictable. Furthermore, although formal verification can assist in ensuring behavioral predictability, it is known to be time-consuming. Our solution to the challenge of satisfying all three requirements is the following. Agents have finite-state automaton plans, which are adapted online via evolutionary learning (perturbation) operators. To ensure that critical behavioral constraints are always satisfied, agents' plans are first formally verified. They are then reverified after every adaptation. If reverification concludes that constraints are violated, the plans are repaired. The main objective of this paper is to improve the efficiency of reverification after learning, so that agents have a sufficiently rapid response time. We present two solutions: ...

  18. The genomics of adaptation.

    Science.gov (United States)

    Radwan, Jacek; Babik, Wiesław

    2012-12-22

    The amount and nature of genetic variation available to natural selection affect the rate, course and outcome of evolution. Consequently, the study of the genetic basis of adaptive evolutionary change has occupied biologists for decades, but progress has been hampered by the lack of resolution and the absence of a genome-level perspective. Technological advances in recent years should now allow us to answer many long-standing questions about the nature of adaptation. The data gathered so far are beginning to challenge some widespread views of the way in which natural selection operates at the genomic level. Papers in this Special Feature of Proceedings of the Royal Society B illustrate various aspects of the broad field of adaptation genomics. This introductory article sets up a context and, on the basis of a few selected examples, discusses how genomic data can advance our understanding of the process of adaptation.

  19. Statistical Physics of Adaptation

    CERN Document Server

    Perunov, Nikolai; England, Jeremy

    2014-01-01

    All living things exhibit adaptations that enable them to survive and reproduce in the natural environment that they inhabit. From a biological standpoint, it has long been understood that adaptation comes from natural selection, whereby maladapted individuals do not pass their traits effectively to future generations. However, we may also consider the phenomenon of adaptation from the standpoint of physics, and ask whether it is possible to delineate what the difference is in terms of physical properties between something that is well-adapted to its surrounding environment, and something that is not. In this work, we undertake to address this question from a theoretical standpoint. Building on past fundamental results in far-from-equilibrium statistical mechanics, we demonstrate a generalization of the Helmholtz free energy for the finite-time stochastic evolution of driven Newtonian matter. By analyzing this expression term by term, we are able to argue for a general tendency in driven many-particle systems...

  20. Islands, resettlement and adaptation

    Science.gov (United States)

    Barnett, Jon; O'Neill, Saffron J.

    2012-01-01

    Resettlement of people living on islands in anticipation of climate impacts risks maladaptation, but some forms of population movement carry fewer risks and larger rewards in terms of adapting to climate change.

  1. Adaptive Heat Engine

    Science.gov (United States)

    Allahverdyan, A. E.; Babajanyan, S. G.; Martirosyan, N. H.; Melkikh, A. V.

    2016-07-01

    A major limitation of many heat engines is that their functioning demands on-line control and/or an external fitting between the environmental parameters (e.g., temperatures of thermal baths) and internal parameters of the engine. We study a model for an adaptive heat engine, where—due to feedback from the functional part—the engine's structure adapts to given thermal baths. Hence, no on-line control and no external fitting are needed. The engine can employ unknown resources; it can also adapt to results of its own functioning that make the bath temperatures closer. We determine resources of adaptation and relate them to the prior information available about the environment.

  2. Limits to adaptation

    Science.gov (United States)

    Dow, Kirstin; Berkhout, Frans; Preston, Benjamin L.; Klein, Richard J. T.; Midgley, Guy; Shaw, M. Rebecca

    2013-04-01

    An actor-centered, risk-based approach to defining limits to social adaptation provides a useful analytic framing for identifying and anticipating these limits and informing debates over society's responses to climate change.

  3. The genomics of adaptation.

    Science.gov (United States)

    Radwan, Jacek; Babik, Wiesław

    2012-12-22

    The amount and nature of genetic variation available to natural selection affect the rate, course and outcome of evolution. Consequently, the study of the genetic basis of adaptive evolutionary change has occupied biologists for decades, but progress has been hampered by the lack of resolution and the absence of a genome-level perspective. Technological advances in recent years should now allow us to answer many long-standing questions about the nature of adaptation. The data gathered so far are beginning to challenge some widespread views of the way in which natural selection operates at the genomic level. Papers in this Special Feature of Proceedings of the Royal Society B illustrate various aspects of the broad field of adaptation genomics. This introductory article sets up a context and, on the basis of a few selected examples, discusses how genomic data can advance our understanding of the process of adaptation. PMID:23097510

  4. Introducing the Adaptive Convex Enveloping

    CERN Document Server

    Yu, Sheng

    2011-01-01

    Convexity, though extremely important in mathematical programming, has not drawn enough attention in the field of dynamic programming. This paper gives conditions for verifying convexity of the cost-to-go functions, and introduces an accurate, fast and reliable algorithm for solving convex dynamic programs with multivariate continuous states and actions, called Adaptive Convex Enveloping. This is a short introduction to the core technique created and used in my dissertation, so it is less formal and omits some parts, such as the literature review and references, compared to a full journal paper.

  5. Regulation of pH in human skeletal muscle: adaptations to physical activity

    DEFF Research Database (Denmark)

    Juel, C

    2008-01-01

    resonance technique to exercise experiments including blood sampling and muscle biopsies. The present review characterizes the cellular buffering system as well as the most important membrane transport systems involved (Na(+)/H(+) exchange, Na-bicarbonate co-transport and lactate/H(+) co......-transport) and describes the contribution of each transport system in pH regulation at rest and during muscle activity. It is reported that the mechanisms involved in pH regulation can undergo adaptational changes in association with physical activity and that these changes are of functional importance....

  6. Adaptive quantum teleportation

    OpenAIRE

    Modlawska, Joanna; Grudka, Andrzej

    2008-01-01

    We consider multiple teleportation in the Knill-Laflamme-Milburn (KLM) scheme. We introduce adaptive teleportation, i.e., such that the choice of entangled state used in the next teleportation depends on the results of the measurements performed during the previous teleportations. We show that adaptive teleportation enables an increase in the probability of faithful multiple teleportation in the KLM scheme. In particular if a qubit is to be teleported more than once then it is better to use n...

  7. Opportunistic Adaptation Knowledge Discovery

    OpenAIRE

    Badra, Fadi; Cordier, Amélie; Lieber, Jean

    2009-01-01

    The original publication is available at www.springerlink.com. Adaptation has long been considered as the Achilles' heel of case-based reasoning since it requires some domain-specific knowledge that is difficult to acquire. In this paper, two strategies are combined in order to reduce the knowledge engineering cost induced by the adaptation knowledge (CA) acquisition task: CA is learned from the case base by means of knowledge discovery techniques, and the CA a...

  8. Frustratingly Easy Domain Adaptation

    CERN Document Server

    Daumé, Hal

    2009-01-01

    We describe an approach to domain adaptation that is appropriate exactly in the case when one has enough "target" data to do slightly better than just using only "source" data. Our approach is incredibly simple, easy to implement as a preprocessing step (10 lines of Perl!) and outperforms state-of-the-art approaches on a range of datasets. Moreover, it is trivially extended to a multi-domain adaptation problem, where one has data from a variety of different domains.
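
    The preprocessing step alluded to here is widely known as feature augmentation: each feature vector is copied into a shared block plus a domain-specific block, so a linear model can learn shared and domain-specific weights at once. A minimal sketch (in Python rather than Perl, with illustrative function and domain names) might look like:

    ```python
    import numpy as np

    def easy_adapt(x, domain):
        """Feature augmentation for two domains: [shared, source-only, target-only]."""
        zeros = np.zeros_like(x)
        if domain == "source":
            return np.concatenate([x, x, zeros])   # shared block + source block
        return np.concatenate([x, zeros, x])       # shared block + target block

    # A source and a target example, each with 3 original features:
    print(easy_adapt(np.array([1.0, 2.0, 3.0]), "source"))
    print(easy_adapt(np.array([1.0, 2.0, 3.0]), "target"))
    ```

    Any off-the-shelf linear classifier can then be trained on the augmented vectors from both domains with no further changes.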

  9. Robust Adaptive Control

    Science.gov (United States)

    Narendra, K. S.; Annaswamy, A. M.

    1985-01-01

    Several concepts and results in robust adaptive control are discussed, organized in three parts. The first part surveys existing algorithms. Different formulations of the problem and theoretical solutions that have been suggested are reviewed here. The second part contains new results related to the role of persistent excitation in robust adaptive systems and the use of hybrid control to improve robustness. In the third part promising new areas for future research are suggested which combine different approaches currently known.

  10. The genomics of adaptation

    OpenAIRE

    Radwan, Jacek; Babik, Wiesław

    2012-01-01

    The amount and nature of genetic variation available to natural selection affect the rate, course and outcome of evolution. Consequently, the study of the genetic basis of adaptive evolutionary change has occupied biologists for decades, but progress has been hampered by the lack of resolution and the absence of a genome-level perspective. Technological advances in recent years should now allow us to answer many long-standing questions about the nature of adaptation. The data gathered so far ...

  11. Development of SYVAC sampling techniques

    International Nuclear Information System (INIS)

    This report describes the requirements of a sampling scheme for use with the SYVAC radiological assessment model. The constraints on the number of samples that may be taken are considered. The conclusions from earlier studies using the deterministic generator sampling scheme are summarised. The method of Importance Sampling and a High Dose algorithm, which are designed to preferentially sample in the high dose region of the parameter space, are reviewed in the light of experience gained from earlier studies and the requirements of a site assessment and sensitivity analyses. In addition the use of an alternative numerical integration method for estimating risk is discussed. It is recommended that the method of Importance Sampling is developed and tested for use with SYVAC. An alternative numerical integration method is not recommended for investigation at this stage but should be the subject of future work. (author)
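
    The idea of preferentially sampling the high-dose region can be illustrated with a generic importance-sampling sketch: draw parameters from a density shifted toward the region of interest and reweight each sample by the ratio of the nominal to the biased density. The dose model, threshold and shift below are stand-ins, not SYVAC quantities.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def dose(x):
        """Stand-in consequence model: the dose grows steeply with the parameter x."""
        return np.exp(3.0 * x)

    threshold = np.exp(12.0)   # "high dose" event; under the nominal N(0,1) it needs x > 4
    N = 50_000

    # Crude Monte Carlo almost never lands in the high-dose region.
    x_mc = rng.standard_normal(N)
    p_mc = np.mean(dose(x_mc) > threshold)

    # Importance sampling: shift the sampling density into the high-dose region and
    # correct each sample by the density ratio nominal / biased.
    shift = 4.0
    x_is = rng.standard_normal(N) + shift
    weights = np.exp(-shift * x_is + 0.5 * shift**2)      # N(0,1) pdf / N(shift,1) pdf
    p_is = np.mean((dose(x_is) > threshold) * weights)

    print(p_mc, p_is)   # the weighted estimate resolves the ~3e-5 probability reliably
    ```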

  12. Leak test adapter for containers

    Science.gov (United States)

    Hallett, Brian H.; Hartley, Michael S.

    1996-01-01

    An adapter is provided for facilitating the charging of containers and leak testing penetration areas. The adapter comprises an adapter body and stem which are secured to the container's penetration areas. The container is then pressurized with a tracer gas. Manipulating the adapter stem installs a penetration plug allowing the adapter to be removed and the penetration to be leak tested with a mass spectrometer. Additionally, a method is provided for using the adapter.

  13. Sparse adaptive filters for echo cancellation

    CERN Document Server

    Paleologu, Constantin

    2011-01-01

    Adaptive filters with a large number of coefficients are usually involved in both network and acoustic echo cancellation. Consequently, it is important to improve the convergence rate and tracking of the conventional algorithms used for these applications. This can be achieved by exploiting the sparseness character of the echo paths. Identification of sparse impulse responses was addressed mainly in the last decade with the development of the so-called "proportionate"-type algorithms. The goal of this book is to present the most important sparse adaptive filters developed for echo cancellati

  14. Adaptive image processing a computational intelligence perspective

    CERN Document Server

    Guan, Ling; Wong, Hau San

    2002-01-01

    Adaptive image processing is one of the most important techniques in visual information processing, especially in early vision such as image restoration, filtering, enhancement, and segmentation. While existing books present some important aspects of the issue, there is not a single book that treats this problem from a viewpoint that is directly linked to human perception - until now. This reference treats adaptive image processing from a computational intelligence viewpoint, systematically and successfully, from theory to applications, using the synergies of neural networks, fuzzy logic, and

  15. Partial update least-square adaptive filtering

    CERN Document Server

    Xie, Bei

    2014-01-01

    Adaptive filters play an important role in the fields related to digital signal processing and communication, such as system identification, noise cancellation, channel equalization, and beamforming. In practical applications, the computational complexity of an adaptive filter is an important consideration. The Least Mean Square (LMS) algorithm is widely used because of its low computational complexity (O(N)) and simplicity in implementation. The least squares algorithms, such as Recursive Least Squares (RLS), Conjugate Gradient (CG), and Euclidean Direction Search (EDS), can converge faster a
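
    One widely used partial-update scheme is M-max LMS, in which only the taps associated with the largest-magnitude input samples are updated at each iteration. The sketch below is an illustration of that idea on a toy system-identification problem; it is not taken from the book, and the filter length, step size and selection size are assumed values.

    ```python
    import numpy as np

    def mmax_lms(x, d, n_taps=64, mu=0.005, m_update=16):
        """M-max partial-update LMS: per iteration only the `m_update` taps whose
        input samples have the largest magnitude are updated, trading slower
        convergence for a roughly n_taps/m_update reduction in update cost."""
        w = np.zeros(n_taps)
        err = np.zeros(len(x))
        for n in range(n_taps - 1, len(x)):
            u = x[n - n_taps + 1:n + 1][::-1]                        # regressor vector
            err[n] = d[n] - w @ u
            idx = np.argpartition(np.abs(u), -m_update)[-m_update:]  # taps selected this step
            w[idx] += mu * err[n] * u[idx]
        return w, err

    # System-identification demo: recover an unknown 64-tap FIR response.
    rng = np.random.default_rng(3)
    h = rng.standard_normal(64) * np.exp(-0.1 * np.arange(64))       # "unknown" echo path
    x = rng.standard_normal(20_000)
    d = np.convolve(x, h, mode="full")[:len(x)] + 0.01 * rng.standard_normal(len(x))
    w_hat, err = mmax_lms(x, d)
    print(np.linalg.norm(w_hat - h) / np.linalg.norm(h))             # relative misalignment
    ```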

  16. Adaptation through proportion

    Science.gov (United States)

    Xiong, Liyang; Shi, Wenjia; Tang, Chao

    2016-08-01

    Adaptation is a ubiquitous feature in biological sensory and signaling networks. It has been suggested that adaptive systems may follow certain simple design principles across diverse organisms, cells and pathways. One class of networks that can achieve adaptation utilizes an incoherent feedforward control, in which two parallel signaling branches exert opposite but proportional effects on the output at steady state. In this paper, we generalize this adaptation mechanism by establishing a steady-state proportionality relationship among a subset of nodes in a network. Adaptation can be achieved by using any two nodes in the sub-network to respectively regulate the output node positively and negatively. We focus on enzyme networks and first identify basic regulation motifs consisting of two and three nodes that can be used to build small networks with proportional relationships. Larger proportional networks can then be constructed modularly similar to LEGOs. Our method provides a general framework to construct and analyze a class of proportional and/or adaptation networks with arbitrary size, flexibility and versatile functional features.
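
    The incoherent feedforward ("sniffer") motif named above can be reproduced in a few lines: the input drives the output directly while also driving, through a buffer species, the output's removal, so the steady state cancels the input level. The sketch below integrates one standard parameterization with forward Euler; the rate constants and step input are illustrative, and the paper's generalized proportionality construction is not reproduced.

    ```python
    import numpy as np

    def simulate_sniffer(input_fn, t_end=40.0, dt=0.001,
                         k1=1.0, k2=1.0, k3=1.0, k4=1.0):
        """Incoherent feedforward 'sniffer' motif, integrated with forward Euler:
        the input s drives the output z directly (positive branch) and, through the
        buffer x, drives z's removal (negative branch). At steady state both branches
        are proportional to s, so z returns to the same level after any step."""
        n = int(t_end / dt)
        t = np.arange(n) * dt
        x = np.zeros(n)            # buffer node, tracks the input
        z = np.zeros(n)            # output node
        x[0], z[0] = 1.0, 1.0      # steady state for the pre-step input s = 1
        for i in range(1, n):
            s = input_fn(t[i - 1])
            x[i] = x[i - 1] + dt * (k1 * s - k2 * x[i - 1])
            z[i] = z[i - 1] + dt * (k3 * s - k4 * x[i - 1] * z[i - 1])
        return t, z

    # Step the input from 1 to 3 at t = 10: z spikes, then adapts back to
    # z* = k2 * k3 / (k1 * k4) = 1, independent of the input level.
    t, z = simulate_sniffer(lambda t: 1.0 if t < 10 else 3.0)
    print(z[0], z.max(), z[-1])    # initial level, transient peak, adapted level
    ```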

  17. Learning to Adapt. Organisational Adaptation to Climate Change Impacts

    International Nuclear Information System (INIS)

    Analysis of human adaptation to climate change should be based on realistic models of adaptive behaviour at the level of organisations and individuals. The paper sets out a framework for analysing adaptation to the direct and indirect impacts of climate change in business organisations with new evidence presented from empirical research into adaptation in nine case-study companies. It argues that adaptation to climate change has many similarities with processes of organisational learning. The paper suggests that business organisations face a number of obstacles in learning how to adapt to climate change impacts, especially in relation to the weakness and ambiguity of signals about climate change and the uncertainty about benefits flowing from adaptation measures. Organisations rarely adapt 'autonomously', since their adaptive behaviour is influenced by policy and market conditions, and draws on resources external to the organisation. The paper identifies four adaptation strategies that pattern organisational adaptive behaviour

  18. Agricultural adaptation to climate change in China

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    This paper presents a study on agricultural adaptation to climate change, adopting an assumed land use change strategy to counter water shortage and to build the capacity to adapt to the expected climate change in northern China. The cost-benefit analysis shows that the assumed land use change from high water-consuming rice cultivation to other crops is very effective: over 7 billion m3 of water can be saved. Potential conflicts between different social interest groups, different regions, demand and supply, and present and future interests have been analyzed in order to form a policy to implement the adaptation strategy. Trade, usually taken as one of the adaptation strategies, was suggested as a policy option to support land use change, which not only meets the consumption demand but also, in terms of resources, imports water resources.

  19. Migration from atolls as climate change adaptation

    DEFF Research Database (Denmark)

    Birk, Thomas Ladegaard Kümmel; Rasmussen, Kjeld

    2014-01-01

    Adaptive strategies are important for reducing the vulnerability of atoll communities to climate change and sea level rise in both the short and long term. This paper seeks to contribute to the emerging discourse on migration as a form of adaptation to climate change based on empirical studies...... in the two atoll communities, Reef Islands and Ontong Java, which are located in the periphery of Solomon Islands. The paper will outline current migration patterns in the two island groups and discuss how some of this migration may contribute to adaptation to climate change and other stresses. It shows...... in adaptation to climate change in exposed atoll communities, addressing some of the barriers to migration seems logical. This may be done by efforts to stimulate migrant income opportunities, by improving migrant living conditions and by improving the transport services to the islands....

  20. Investigation on children's social adaptive capacity

    Institute of Scientific and Technical Information of China (English)

    WANG Ya-ping; WANG Bao-yan; CHEN Yun-qi; WANG Ai-rong; ZHANG Rong; NIU Xiao-lin

    2002-01-01

    Objective: To understand the present state of children's social adaptive capacity. Methods: Social adaptive capacity and its influencing factors were investigated in 628 children in 7 kindergartens in 4 cities in China. Results: The general trend of development of children's social adaptive capacity was fairly good. The proportion at the borderline level was 0.3%, while 16.4% and 7% of the children were rated excellent and of extraordinary intelligence, respectively. The family environment played a very important role in children's social adaptive capacity. Conclusion: The research revealed that, in training a child's social adaptive ability, the initiative of each family should be brought into full play; negative influences should be overcome, the contradiction between relaxing control and taking care of everything should be resolved, the child's own conscious activity should be aroused, and the unity and balance between the child's own body and the living environment should be ensured.

  1. Dynamic optimization and adaptive controller design

    Science.gov (United States)

    Inamdar, S. R.

    2010-10-01

    In this work I present a new type of controller: an adaptive tracking controller that employs dynamic optimization of the current controller action for the temperature control of a nonisothermal continuously stirred tank reactor (CSTR). We begin with a two-state model of the nonisothermal CSTR, consisting of the mass and heat balance equations, and then add cooling-system dynamics to eliminate input multiplicity. The initial design value is obtained using the local stability of the steady states, where the approach temperature for the cooling action is specified both as a steady state and as a design specification. We then modify the dynamics so that the material balance is manipulated to use feed concentration as a system parameter, an adaptive control measure that avoids actuator saturation in the main control loop. The analysis leading to the design of the dynamic-optimization-based parameter-adaptive controller is presented. An important component of this mathematical framework is reference trajectory generation, which forms the adaptive control measure.

  2. Better economics: supporting adaptation with stakeholder analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chambwera, Muyeye; Zou, Ye; Boughlala, Mohamed

    2011-11-15

    Across the developing world, decision makers understand the need to adapt to climate change — particularly in agriculture, which supports a large proportion of low-income groups who are especially vulnerable to impacts such as increasing water scarcity or more erratic weather. But policymakers are often less clear about what adaptation action to take. Cost-benefit analyses can provide information on the financial feasibility and economic efficiency of a given policy. But such methods fail to capture the non-monetary benefits of adaptation, which can be even more important than the monetary ones. Ongoing work in Morocco shows how combining cost-benefit analysis with a more participatory stakeholder analysis can support effective decision making by identifying cross-sector benefits, highlighting areas of mutual interest among different stakeholders and more effectively assessing impacts on adaptive capacity.

  3. Adaptive designs based on the truncated product method

    Directory of Open Access Journals (Sweden)

    Neuhäuser Markus

    2005-09-01

    Full Text Available Abstract Background: Adaptive designs are becoming increasingly important in clinical research. One approach subdivides the study into several (two or more) stages and combines the p-values of the different stages using Fisher's combination test. Methods: As an alternative to Fisher's test, the recently proposed truncated product method (TPM) can be applied to combine the p-values. The TPM uses the product of only those p-values that do not exceed some fixed cut-off value. Here, these two competing analyses are compared. Results: When an early termination due to insufficient effects is not appropriate, such as in dose-response analyses, the probability of stopping the trial early with rejection of the null hypothesis is increased when the TPM is applied. Therefore, the expected total sample size is decreased. This decrease in the sample size is not connected with a loss in power. The TPM turns out to be less advantageous when an early termination of the study due to insufficient effects is possible. This is due to a decrease in the probability of stopping the trial early. Conclusion: It is recommended to apply the TPM rather than Fisher's combination test whenever an early termination due to insufficient effects is not suitable within the adaptive design.
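
    The TPM statistic itself is simple: the product of only those p-values at or below the cut-off τ. The sketch below computes the statistic and a Monte Carlo p-value under the null of independent uniform p-values (a closed-form null distribution exists, but simulation keeps the example short); the cut-off, example p-values and simulation size are illustrative choices.

    ```python
    import numpy as np

    def truncated_product_stat(pvalues, tau=0.05):
        """Truncated product statistic: product of only those p-values <= tau."""
        p = np.asarray(pvalues)
        kept = p[p <= tau]
        return kept.prod() if kept.size else 1.0

    def tpm_pvalue(pvalues, tau=0.05, n_mc=200_000, seed=0):
        """Monte Carlo p-value of the TPM for independent uniform null p-values."""
        rng = np.random.default_rng(seed)
        w_obs = truncated_product_stat(pvalues, tau)
        null = rng.random((n_mc, len(pvalues)))
        masked = np.where(null <= tau, null, 1.0)   # p-values above tau drop out
        w_null = masked.prod(axis=1)
        return np.mean(w_null <= w_obs)             # smaller products are more extreme

    # Two-stage example: a clearly significant stage and an unremarkable one.
    print(tpm_pvalue([0.003, 0.40]))
    ```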

  4. Importance of Family Routines

    Science.gov (United States)

    Every family needs ... child to sleep. These rituals can include storytelling, reading aloud, conversation, and songs. Try to avoid exciting ...

  5. Mouse EEG spike detection based on the adapted continuous wavelet transform

    Science.gov (United States)

    Tieng, Quang M.; Kharatishvili, Irina; Chen, Min; Reutens, David C.

    2016-04-01

    Objective. Electroencephalography (EEG) is an important tool in the diagnosis of epilepsy. Interictal spikes on EEG are used to monitor the development of epilepsy and the effects of drug therapy. EEG recordings are generally long and the data voluminous. Thus developing a sensitive and reliable automated algorithm for analyzing EEG data is necessary. Approach. A new algorithm for detecting and classifying interictal spikes in mouse EEG recordings is proposed, based on the adapted continuous wavelet transform (CWT). The construction of the adapted mother wavelet is founded on a template obtained from a sample comprising the first few minutes of an EEG data set. Main Result. The algorithm was tested with EEG data from a mouse model of epilepsy and experimental results showed that the algorithm could distinguish EEG spikes from other transient waveforms with a high degree of sensitivity and specificity. Significance. Differing from existing approaches, the proposed approach combines wavelet denoising, to isolate transient signals, with adapted CWT-based template matching, to detect true interictal spikes. Using the adapted wavelet constructed from a predefined template, the adapted CWT is calculated on small EEG segments to fit dynamical changes in the EEG recording.
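
    A heavily simplified stand-in for the detection step is sliding normalized cross-correlation against a spike template taken from the start of the record; the paper instead builds an adapted CWT mother wavelet from such a template and combines it with wavelet denoising, which is not reproduced here. The template shape, thresholds and toy record below are assumptions.

    ```python
    import numpy as np

    def detect_spikes(eeg, template, corr_thresh=0.7):
        """Flag positions where the normalized cross-correlation between the record
        and a spike template exceeds a threshold, then merge nearby detections."""
        m = len(template)
        tpl = (template - template.mean()) / template.std()
        hits = []
        for i in range(len(eeg) - m):
            win = eeg[i:i + m]
            win = (win - win.mean()) / (win.std() + 1e-12)
            if np.dot(win, tpl) / m > corr_thresh:
                hits.append(i)
        events = []
        for h in hits:                       # merge detections closer than one template length
            if not events or h - events[-1] > m:
                events.append(h)
        return events

    # Toy record: background noise with three scaled copies of the template inserted.
    rng = np.random.default_rng(7)
    template = np.exp(-0.5 * ((np.arange(50) - 25) / 4.0) ** 2)     # spike-like bump
    eeg = 0.3 * rng.standard_normal(5000)
    for pos in (800, 2300, 4100):
        eeg[pos:pos + 50] += 2.0 * template
    print(detect_spikes(eeg, template))     # approximate onsets of the three spikes
    ```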

  6. IMPORTANCE OF MAIZE CROPPING

    OpenAIRE

    Mohammed Dhary Yousif EL-JUBOURI

    2012-01-01

    Corn, wheat and rice together are the main crops. Maize is a plant that responds well to chemical and organic fertilization and to irrigation, but it is sensitive to the optimum sowing time and requires integrated control of weeds, pests and diseases (2). Maize is the most important plant product from a commercial point of view and is used primarily as fodder. Maize is also an important source of vegetable oil and has many applications in industry, in the manufacture of diverse items: cosme...

  7. Advanced hierarchical distance sampling

    Science.gov (United States)

    Royle, Andy

    2016-01-01

    In this chapter, we cover a number of important extensions of the basic hierarchical distance-sampling (HDS) framework from Chapter 8. First, we discuss the inclusion of “individual covariates,” such as group size, in the HDS model. This is important in many surveys where animals form natural groups that are the primary observation unit, with the size of the group expected to have some influence on detectability. We also discuss HDS integrated with time-removal and double-observer or capture-recapture sampling. These “combined protocols” can be formulated as HDS models with individual covariates, and thus they have a commonality with HDS models involving group structure (group size being just another individual covariate). We cover several varieties of open-population HDS models that accommodate population dynamics. On one end of the spectrum, we cover models that allow replicate distance sampling surveys within a year, which estimate abundance relative to availability and temporary emigration through time. We consider a robust design version of that model. We then consider models with explicit dynamics based on the Dail and Madsen (2011) model and the work of Sollmann et al. (2015). The final major theme of this chapter is relatively newly developed spatial distance sampling models that accommodate explicit models describing the spatial distribution of individuals known as Point Process models. We provide novel formulations of spatial DS and HDS models in this chapter, including implementations of those models in the unmarked package using a hack of the pcount function for N-mixture models.

  8. Massively Parallel Dimension Independent Adaptive Metropolis

    KAUST Repository

    Chen, Yuxin

    2015-05-14

    This work considers black-box Bayesian inference over high-dimensional parameter spaces. The well-known and widely respected adaptive Metropolis (AM) algorithm is extended herein to asymptotically scale uniformly with respect to the underlying parameter dimension, by respecting the variance, for Gaussian targets. The resulting algorithm, referred to as the dimension-independent adaptive Metropolis (DIAM) algorithm, also shows improved performance with respect to adaptive Metropolis on non-Gaussian targets. This algorithm is further improved, and the possibility of probing high-dimensional targets is enabled, via GPU-accelerated numerical libraries and periodically synchronized concurrent chains (justified a posteriori). Asymptotically in dimension, this massively parallel dimension-independent adaptive Metropolis (MPDIAM) GPU implementation exhibits a factor of four improvement versus the CPU-based Intel MKL version alone, which is itself already a factor of three improvement versus the serial version. The scaling to multiple CPUs and GPUs exhibits a form of strong scaling in terms of the time necessary to reach a certain convergence criterion, through a combination of longer time per sample batch (weak scaling) and yet fewer necessary samples to convergence. This is illustrated by efficiently sampling from several Gaussian and non-Gaussian targets for dimension d 1000.
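
    The classical ingredient the record builds on is Haario-style adaptive Metropolis, in which the Gaussian proposal covariance is the running covariance of the chain scaled by 2.38^2/d. The sketch below implements only that baseline; the dimension-independent rescaling, GPU acceleration and synchronized concurrent chains described in the abstract are not reproduced, and the target, iteration counts and warm-up length are illustrative.

    ```python
    import numpy as np

    def adaptive_metropolis(log_target, x0, n_iter=20_000, adapt_start=500, eps=1e-8, seed=0):
        """Haario-style adaptive Metropolis: the Gaussian proposal covariance is the
        running covariance of the chain history, scaled by 2.38**2 / d."""
        rng = np.random.default_rng(seed)
        d = len(x0)
        x = np.array(x0, dtype=float)
        lp = log_target(x)
        chain = np.empty((n_iter, d))
        mean, cov = np.zeros(d), np.zeros((d, d))
        for i in range(n_iter):
            if i < adapt_start:
                prop_cov = np.eye(d) / d                          # fixed warm-up proposal
            else:
                prop_cov = (2.38**2 / d) * cov + eps * np.eye(d)  # adapted proposal
            cand = rng.multivariate_normal(x, prop_cov)
            lp_cand = log_target(cand)
            if np.log(rng.random()) < lp_cand - lp:               # Metropolis accept/reject
                x, lp = cand, lp_cand
            chain[i] = x
            delta = x - mean                                      # running mean/covariance
            mean += delta / (i + 1)
            cov += (np.outer(delta, x - mean) - cov) / (i + 1)
        return chain

    # Strongly correlated 10-dimensional Gaussian target.
    d = 10
    sigma = 0.5 * np.eye(d) + 0.5                # unit variances, 0.5 pairwise correlation
    sigma_inv = np.linalg.inv(sigma)
    chain = adaptive_metropolis(lambda x: -0.5 * x @ sigma_inv @ x, np.zeros(d))
    print(chain[10_000:].mean(axis=0))           # should be close to the zero mean
    ```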

  9. Integrating climate change adaptation into forest management

    Energy Technology Data Exchange (ETDEWEB)

    Spittlehouse, D.L. [British Columbia Ministry of Forestry, Victoria, BC (Canada)

    2005-10-01

    Although forest management decisions are often based on the assumption that the climate will remain relatively stable throughout a forest's life-time, the prospect of future climate change has challenged current decision-making processes. This paper reviewed challenges currently impeding responses to climate change and presented suggestions for integrating adaptation strategies into forest management. Adaptive actions reduce the risks of climate change by preparing for adverse effects and capitalizing on the benefits. However, the importance of forest ecosystems to society means that the direction and timing of adaptation should be carefully managed. Uncertainty in the magnitude and timing of future climate change is a significant challenge. In addition, different ecosystems are vulnerable to different aspects of change, and an important component of adaptation will be the balancing of different values. The size of the forested land base in most of Canada will mean that much of the forest will have to adjust to climatic changes without human intervention. Seed planning zones, reforestation standards and hydrologic and wildlife management guidelines are designed for the current climate regime, and there are currently no regulatory requirements for adaptation strategies. Societal adaptation will be a major component of any forest management adaptation strategy, and demands on forest resources will need to be revised. Adaptation to reduce the vulnerability of resources such as water quality and quantity and biological conservation will become high priorities in some areas. It was suggested that the adaptation of culverts, bridges and roads should be incorporated into an infrastructure replacement cycle. Areas for preservation where the future climate will become suitable for species whose current range is threatened by climate change should be identified. Adapting the forest through reforestation after disturbances such as harvest or fire was recommended. Other

  10. Solar tomography adaptive optics.

    Science.gov (United States)

    Ren, Deqing; Zhu, Yongtian; Zhang, Xi; Dou, Jiangpei; Zhao, Gang

    2014-03-10

    Conventional solar adaptive optics uses one deformable mirror (DM) and one guide star for wave-front sensing, which seriously limits high-resolution imaging over a large field of view (FOV). Recent progress toward multiconjugate adaptive optics indicates that atmospheric-turbulence-induced wave-front distortion at different altitudes can be reconstructed by using multiple guide stars. To maximize performance over a large FOV, we propose a solar tomography adaptive optics (TAO) system that uses tomographic wave-front information with a single DM. We show that by fully taking advantage of the three-dimensional wave-front distribution, a classical solar adaptive optics system with one DM can provide an extra performance gain for high-resolution imaging over a large FOV in the near infrared. The TAO will allow existing one-deformable-mirror solar adaptive optics to deliver better performance over a large FOV for high-resolution magnetic field investigation, where solar activities occur in a two-dimensional field up to 60'', and where the near infrared is superior to the visible in terms of magnetic field sensitivity.

  11. Solar Adaptive Optics

    Directory of Open Access Journals (Sweden)

    Thomas R. Rimmele

    2011-06-01

    Full Text Available Adaptive optics (AO) has become an indispensable tool at ground-based solar telescopes. AO enables the ground-based observer to overcome the adverse effects of atmospheric seeing and obtain diffraction-limited observations. Over the last decade adaptive optics systems have been deployed at major ground-based solar telescopes and revitalized ground-based solar astronomy. The relatively small aperture of solar telescopes and the bright source make solar AO possible for visible wavelengths, where the majority of solar observations are still performed. Solar AO systems enable diffraction-limited observations of the Sun for a significant fraction of the available observing time at ground-based solar telescopes, which often have a larger aperture than equivalent space-based observatories such as HINODE. New groundbreaking scientific results have been achieved with solar adaptive optics and this trend continues. New large-aperture telescopes are currently being deployed or are under construction. With the aid of solar AO these telescopes will obtain observations of the highly structured and dynamic solar atmosphere with unprecedented resolution. This paper reviews solar adaptive optics techniques and summarizes recent progress in the field of solar adaptive optics. An outlook to future solar AO developments, including a discussion of Multi-Conjugate AO (MCAO) and Ground-Layer AO (GLAO), will be given.

  12. Complex and Adaptive Dynamical Systems A Primer

    CERN Document Server

    Gros, Claudius

    2011-01-01

    We are living in an ever more complex world, an epoch where human actions can accordingly acquire far-reaching potentialities. Complex and adaptive dynamical systems are ubiquitous in the world surrounding us and require us to adapt to new realities and the way of dealing with them. This primer has been developed with the aim of conveying a wide range of "common-sense" knowledge in the field of quantitative complex system science at an introductory level, providing an entry point to this both fascinating and vitally important subject. The approach is modular and phenomenology driven. Examples of emerging phenomena of generic importance treated in this book are: -- The small world phenomenon in social and scale-free networks. -- Phase transitions and self-organized criticality in adaptive systems. -- Life at the edge of chaos and coevolutionary avalanches resulting from the unfolding of all living. -- The concept of living dynamical systems and emotional diffusive control within cognitive system theory. Techn...

  13. Import of textile machinery

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    In 2007, the total import of our textile machinery amounted to US$4.948 billion, an increase of 20.62% over the same period of the previous year and a new high for textile machinery. Among the products imported in 2007, different imported products saw growth of varying degrees. The large-scale increase in textile machinery imports indicates the acceleration of textile technology and the upgrading of the textile industry, and demonstrates that our textile machinery industry still lags behind the international advanced level in terms of product quality, product stability and product reliability, although rapid improvement has been made in manufacturing in the Chinese textile machinery industry in the last few years. In addition, the possibility of RMB appreciation still exists. The increase in textile machinery imports brought a new historical high in 2007.

  14. Formation and Regulation of Adaptive Response in Nematode Caenorhabditis elegans

    Directory of Open Access Journals (Sweden)

    Y.-L. Zhao

    2012-01-01

    Full Text Available All organisms respond to environmental stresses (e.g., heavy metal, heat, UV irradiation, hyperoxia, food limitation, etc.) with coordinated adjustments in order to deal with the consequences and/or injuries caused by the severe stress. The nematode Caenorhabditis elegans often exerts adaptive responses if preconditioned with low concentrations of agents or stressors. In C. elegans, three types of adaptive responses can be formed: hormesis, cross-adaptation, and dietary restriction. Several factors influence the formation of adaptive responses in nematodes, and some mechanisms can explain their response formation. In particular, the antioxidation system, heat-shock proteins, metallothioneins, glutathione, signaling transduction, and metabolic signals may play important roles in regulating the formation of adaptive responses. In this paper, we summarize the published evidence demonstrating that several types of adaptive responses have converged in C. elegans and discuss some possible alternative theories explaining the control of the adaptive response.

  15. Adaptation in the context of technology development and transfer

    DEFF Research Database (Denmark)

    Olhoff, Anne

    2015-01-01

    Starting from a summary of key developments under the United Nations Framework Convention on Climate Change (UNFCCC) related to adaptation and technologies, the commentary provides an initial review of the available literature relevant to adaptation in the context of technology development and transfer. It summarizes what technologies for adaptation are, how they relate to development, and what their role is in adaptation. It subsequently highlights a number of policy and research issues that could be important to inform future policy. The commentary has two key messages. First, it argues that informed policy decisions on technology development and transfer to enhance adaptation require systematic assessments of the findings in the theoretical and empirical literature. Second, in light of the potential for overlap between processes for adaptation and processes for technologies for adaptation…

  16. Roadmap towards justice in urban climate adaptation research

    Science.gov (United States)

    Shi, Linda; Chu, Eric; Anguelovski, Isabelle; Aylett, Alexander; Debats, Jessica; Goh, Kian; Schenk, Todd; Seto, Karen C.; Dodman, David; Roberts, Debra; Roberts, J. Timmons; Vandeveer, Stacy D.

    2016-02-01

    The 2015 United Nations Climate Change Conference in Paris (COP21) highlighted the importance of cities to climate action, as well as the unjust burdens borne by the world's most disadvantaged peoples in addressing climate impacts. Few studies have documented the barriers to redressing the drivers of social vulnerability as part of urban local climate change adaptation efforts, or evaluated how emerging adaptation plans impact marginalized groups. Here, we present a roadmap to reorient research on the social dimensions of urban climate adaptation around four issues of equity and justice: (1) broadening participation in adaptation planning; (2) expanding adaptation to rapidly growing cities and those with low financial or institutional capacity; (3) adopting a multilevel and multi-scalar approach to adaptation planning; and (4) integrating justice into infrastructure and urban design processes. Responding to these empirical and theoretical research needs is the first step towards identifying pathways to more transformative adaptation policies.

  17. Importance measures for nuclear waste repositories

    International Nuclear Information System (INIS)

    Several importance measures are identified for possible use in the performance assessment of a high-level nuclear waste repository. These importance measures are based on concepts of importance used in system reliability analysis, but the concepts are modified and adapted to the special characteristics of the repository and similar passive systems. In particular, the importance measures proposed here are based on risk (in comparison to traditional importance measures which are based on frequency of failure) and are intended to be more suitable to systems comprised of components whose behavior is most easily and naturally represented as continuous, rather than binary. These importance measures appear to be able to evaluate systems comprised of both continuous-behavior and binary-behavior components. Three separate examples are provided to illustrate the concepts and behavior of these importance measures. The first example demonstrates various formulations for the importance measures and their implementation for a simple radiation safety system comprised of a radiation source and three shields. The second example demonstrates use of these importance measures for a system comprised of components modeled with binary behavior and components modeled with continuous behavior. The third example investigates the use of these importance measures for a proposed repository system, using a total system model and code currently under development. Currently, these concepts and formulations of importance are undergoing further evaluation for a repository system to determine to what degree they provide useful insights and to determine which formulations are most useful
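
    The report's own formulations are not reproduced here; the following Python sketch only illustrates the general idea of a risk-based importance measure for components with continuous behavior, using a hypothetical source-plus-three-shields attenuation model loosely echoing the first example. Risk is taken as the expected transmitted dose under uncertainty in shield thicknesses, and a component's importance is read off as the risk ratio when that component is degraded; the uncertainty model, degradation factor, and all parameter values are assumptions.

    import numpy as np

    rng = np.random.default_rng(0)

    def dose(thicknesses, mu=0.5, source=1.0):
        # Transmitted dose through shields in series, with exponential
        # attenuation and (for simplicity) one shared attenuation coefficient.
        return source * np.exp(-mu * np.sum(thicknesses))

    def mean_risk(sample_thicknesses, n=20000):
        # Risk = expected dose over the uncertainty in the shield thicknesses.
        t = sample_thicknesses(n)                 # (n, 3) array of sampled thicknesses
        return np.mean([dose(row) for row in t])

    # hypothetical uncertainty model: three shields with lognormal thicknesses
    nominal = np.array([2.0, 3.0, 1.5])
    sample_nominal = lambda n: rng.lognormal(np.log(nominal), 0.1, size=(n, 3))

    base = mean_risk(sample_nominal)
    for i in range(3):
        # degrade shield i (halve its thickness) and see how much the risk rises:
        # a risk-achievement-worth style measure applied to a continuous component
        def sample_degraded(n, i=i):
            t = sample_nominal(n)
            t[:, i] *= 0.5
            return t
        print(f"shield {i}: risk ratio when degraded = {mean_risk(sample_degraded) / base:.2f}")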

  18. An Overview of Importance Splitting for Rare Event Simulation

    Science.gov (United States)

    Morio, Jerome; Pastel, Rudy; Le Gland, Francois

    2010-01-01

    Monte Carlo simulations are a classical tool to analyse physical systems. When unlikely events are to be simulated, the importance sampling technique is often used instead of Monte Carlo. Importance sampling has some drawbacks when the problem dimensionality is high or when the optimal importance sampling density is complex to obtain. In this…
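
    For context, here is a minimal Python sketch of the plain importance-sampling estimator that the overview contrasts with splitting: a rare tail probability of a standard normal is estimated by drawing from a proposal shifted toward the rare region and reweighting each draw by the likelihood ratio between target and proposal. The threshold, proposal, and sample sizes are illustrative choices, not taken from the paper.

    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(1)
    threshold, n = 4.0, 100000
    true_p = norm.sf(threshold)                   # exact tail probability, for reference

    # crude Monte Carlo: almost no samples land in the rare region
    x = rng.standard_normal(n)
    crude = np.mean(x > threshold)

    # importance sampling: propose from N(threshold, 1) so the rare region is hit
    # often, then reweight by the target/proposal likelihood ratio to stay unbiased
    y = rng.normal(threshold, 1.0, n)
    weights = norm.pdf(y) / norm.pdf(y, loc=threshold)
    is_est = np.mean((y > threshold) * weights)

    print(f"exact {true_p:.3e}  crude MC {crude:.3e}  importance sampling {is_est:.3e}")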

  19. Importation and Innovation

    OpenAIRE

    Frank R Lichtenberg

    2006-01-01

    Importation of drugs into the U.S. would result in a decline in U.S. drug prices. The purpose of this paper is to assess the consequences of importation for new drug development. A simple theoretical model of drug development suggests that the elasticity of innovation with respect to the expected price of drugs should be at least as great as the elasticity of innovation with respect to expected market size (disease incidence). I examine the cross-sectional relationship between pharmaceutical ...

  20. Import vs. Imitation?

    DEFF Research Database (Denmark)

    Kölcze, Zsófia

    2012-01-01

    …in producing, maintaining and reproducing social identities, communicating new ideas and technological innovations, and creating ideologies and cosmologies. Our understanding of material culture has obtained a social dimension, and we as archaeologists have become aware of the importance of making this aspect… Instead, we have to focus on the social life and role of an artifact - whether it is an original object, imported from a different cultural sphere, or an imitation, produced locally. Through my presentation I would like to stimulate the debate about this new approach, starting off with the hoard finds…