WorldWideScience

Sample records for adaptive importance sampling

  1. Adaptive importance sampling for control and inference

    OpenAIRE

    Kappen, Hilbert Johan; Ruiz, Hans Christian

    2015-01-01

    Path integral (PI) control problems are a restricted class of non-linear control problems that can be solved formally as a Feynman-Kac path integral and can be estimated using Monte Carlo sampling. In this contribution we review path integral control theory in the finite horizon case. We subsequently focus on the problem of how to compute and represent control solutions. Within the PI theory, the question of how to compute becomes the question of importance sampling. Efficient importance samplers...

  2. Adaptive importance sampling of random walks on continuous state spaces

    Energy Technology Data Exchange (ETDEWEB)

    Baggerly, K.; Cox, D.; Picard, R.

    1998-11-01

    The authors consider adaptive importance sampling for a random walk with scoring in a general state space. Conditions under which exponential convergence occurs to the zero-variance solution are reviewed. These results generalize previous work for finite, discrete state spaces in Kollman (1993) and in Kollman, Baggerly, Cox, and Picard (1996). This paper is intended for nonstatisticians and includes considerable explanatory material.

  3. AIS-BN: An Adaptive Importance Sampling Algorithm for Evidential Reasoning in Large Bayesian Networks

    CERN Document Server

    Cheng, J; 10.1613/jair.764

    2011-01-01

    Stochastic sampling algorithms, while an attractive alternative to exact algorithms in very large Bayesian network models, have been observed to perform poorly in evidential reasoning with extremely unlikely evidence. To address this problem, we propose an adaptive importance sampling algorithm, AIS-BN, that shows promising convergence rates even under extreme conditions and seems to outperform the existing sampling algorithms consistently. Three sources of this performance improvement are (1) two heuristics for initialization of the importance function that are based on the theoretical properties of importance sampling in finite-dimensional integrals and the structural advantages of Bayesian networks, (2) a smooth learning method for the importance function, and (3) a dynamic weighting function for combining samples from different stages of the algorithm. We tested the performance of the AIS-BN algorithm along with two state-of-the-art general-purpose sampling algorithms, likelihood weighting (Fung and Chang...

  4. Improving Adaptive Importance Sampling Simulation of Markovian Queueing Models using Non-parametric Smoothing

    NARCIS (Netherlands)

    Woudt, Edwin; de Boer, Pieter-Tjerk; van Ommeren, Jan C.W.

    2007-01-01

    Previous work on state-dependent adaptive importance sampling techniques for the simulation of rare events in Markovian queueing models used either no smoothing or a parametric smoothing technique, which was known to be non-optimal. In this paper, we introduce the use of kernel smoothing in this con

  5. Adaptive Importance Sampling for Performance Evaluation and Parameter Optimization of Communication Systems

    NARCIS (Netherlands)

    Remondo Bueno, D.; Srinivasan, R.; Nicola, V.F.; van Etten, Wim; Tattje, H.E.P.

    2000-01-01

    We present new adaptive importance sampling techniques based on stochastic Newton recursions. Their applicability to the performance evaluation of communication systems is studied. Besides bit-error rate (BER) estimation, the techniques are used for system parameter optimization. Two system models

  6. An adaptive importance sampling algorithm for Bayesian inversion with multimodal distributions

    Energy Technology Data Exchange (ETDEWEB)

    Li, Weixuan [Pacific Northwest National Laboratory, Richland, WA 99352 (United States); Lin, Guang, E-mail: guanglin@purdue.edu [Department of Mathematics and School of Mechanical Engineering, Purdue University, West Lafayette, IN 47907 (United States)

    2015-08-01

    Parametric uncertainties are encountered in the simulations of many physical systems, and may be reduced by an inverse modeling procedure that calibrates the simulation results to observations on the real system being simulated. Following Bayes' rule, a general approach for inverse modeling problems is to sample from the posterior distribution of the uncertain model parameters given the observations. However, the large number of repetitive forward simulations required in the sampling process could pose a prohibitive computational burden. This difficulty is particularly challenging when the posterior is multimodal. We present in this paper an adaptive importance sampling algorithm to tackle these challenges. Two essential ingredients of the algorithm are: 1) a Gaussian mixture (GM) model adaptively constructed as the proposal distribution to approximate the possibly multimodal target posterior, and 2) a mixture of polynomial chaos (PC) expansions, built according to the GM proposal, as a surrogate model to alleviate the burden caused by computationally demanding forward model evaluations. In three illustrative examples, the proposed adaptive importance sampling algorithm demonstrates its capability of automatically finding a GM proposal with an appropriate number of modes for the specific problem under study, and of obtaining a sample that accurately and efficiently represents the posterior with a limited number of forward simulations.
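
    The adaptation loop described above (draw from a Gaussian mixture proposal, weight by posterior over proposal, refit the mixture) can be illustrated with a minimal sketch. Assumptions: a one-dimensional toy bimodal target stands in for the inversion posterior, the mixture is refit with a single importance-weighted EM step per iteration, and the polynomial chaos surrogate from the abstract is omitted.

```python
import numpy as np
from scipy import stats
from scipy.special import logsumexp

rng = np.random.default_rng(0)

def log_post(x):
    # Toy bimodal "posterior" standing in for the Bayesian-inversion target.
    return logsumexp(np.stack([stats.norm.logpdf(x, -2.0, 0.5),
                               stats.norm.logpdf(x, 3.0, 0.8)]), axis=0) - np.log(2.0)

# Gaussian mixture proposal: weights, means, standard deviations.
w, mu, sd = np.array([0.5, 0.5]), np.array([-5.0, 5.0]), np.array([2.0, 2.0])

def log_prop(x):
    comp = stats.norm.logpdf(x[:, None], mu, sd) + np.log(w)
    return logsumexp(comp, axis=1)

for it in range(20):
    k = rng.choice(len(w), size=2000, p=w)          # 1) draw from the mixture
    x = rng.normal(mu[k], sd[k])
    lw = log_post(x) - log_prop(x)                  # 2) importance weights
    iw = np.exp(lw - lw.max()); iw /= iw.sum()
    # 3) one importance-weighted EM step on the mixture parameters
    r = np.exp(stats.norm.logpdf(x[:, None], mu, sd) + np.log(w))
    r = r / r.sum(axis=1, keepdims=True) * iw[:, None]
    nk = r.sum(axis=0)
    w = nk / nk.sum()
    mu = (r * x[:, None]).sum(axis=0) / nk
    sd = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk) + 1e-6

print("fitted means:", mu, "weights:", w)           # means approach -2 and 3
```

    After a few iterations the component means settle near the two target modes; the abstract's additional contribution of choosing the number of components automatically is not reproduced here (it is fixed at two).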

  7. Adaptive state-dependent importance sampling simulation of Markovian queueing networks

    NARCIS (Netherlands)

    de Boer, Pieter-Tjerk; Nicola, V.F.

    2002-01-01

    In this paper, a method is presented for the efficient estimation of rare-event (buffer overflow) probabilities in queueing networks using importance sampling. Unlike previously proposed changes of measure, the one used here is not static, i.e., it depends on the buffer contents at each of the

  8. A Class of Adaptive Importance Sampling Weighted EM Algorithms for Efficient and Robust Posterior and Predictive Simulation

    OpenAIRE

    Hoogerheide, L.F.; Opschoor, A.; van Dijk, Nico M.

    2012-01-01

    This discussion paper was published in the Journal of Econometrics (2012). Vol. 171(2), 101-120. A class of adaptive sampling methods is introduced for efficient posterior and predictive simulation. The proposed methods are robust in the sense that they can handle target distributions that exhibit non-elliptical shapes such as multimodality and skewness. The basic method makes use of sequences of importance weighted Expectation Maximization steps in order to efficiently construct a mixture of...

  9. Network and adaptive sampling

    CERN Document Server

    Chaudhuri, Arijit

    2014-01-01

    Combining the two statistical techniques of network sampling and adaptive sampling, this book illustrates the advantages of using them in tandem to effectively capture sparsely located elements in unknown pockets. It shows how network sampling is a reliable guide in capturing inaccessible entities through linked auxiliaries. The text also explores how adaptive sampling is strengthened in information content through subsidiary sampling with devices to mitigate unmanageable expanding sample sizes. Empirical data illustrates the applicability of both methods.

  10. ADAPTIVE ANNEALED IMPORTANCE SAMPLING FOR MULTIMODAL POSTERIOR EXPLORATION AND MODEL SELECTION WITH APPLICATION TO EXTRASOLAR PLANET DETECTION

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Bin, E-mail: bins@ieee.org [School of Computer Science and Technology, Nanjing University of Posts and Telecommunications, Nanjing 210023 (China)

    2014-07-01

    We describe an algorithm that can adaptively provide mixture summaries of multimodal posterior distributions. The parameter space of the involved posteriors ranges in size from a few dimensions to dozens of dimensions. This work was motivated by an astrophysical problem called extrasolar planet (exoplanet) detection, wherein the computation of stochastic integrals that are required for Bayesian model comparison is challenging. The difficulty comes from the highly nonlinear models that lead to multimodal posterior distributions. We resort to importance sampling (IS) to estimate the integrals, which translates the problem into finding a parametric approximation of the posterior. To capture the multimodal structure in the posterior, we initialize a mixture proposal distribution and then carefully tune its parameters to make it resemble the posterior as closely as possible. We use the effective sample size (ESS) calculated from the IS draws to measure the degree of approximation: the larger the ESS, the better the proposal resembles the posterior. A difficulty within this tailoring operation lies in adjusting the number of mixture components in the proposal. Brute-force methods simply preset it to a large constant, which increases the required computational resources. We provide an iterative delete/merge/add process, which works in tandem with an expectation-maximization step to tailor this number online. The efficiency of our proposed method is tested via both simulation studies and real exoplanet data analysis.
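
    The ESS used above as the adaptation criterion is a standard quantity computed directly from the importance weights. A minimal sketch follows; the Gaussian target/proposal pair is an assumed toy example, not the exoplanet posterior.

```python
import numpy as np

def effective_sample_size(log_weights):
    """ESS = (sum w)^2 / sum(w^2) for unnormalized importance weights,
    computed stably in log space.  It equals N for a perfect proposal and
    approaches 1 when a single draw dominates."""
    lw = np.asarray(log_weights)
    w = np.exp(lw - lw.max())          # guard against overflow
    return w.sum() ** 2 / (w ** 2).sum()

# Example: a mismatched proposal yields an ESS well below the sample size.
rng = np.random.default_rng(1)
x = rng.normal(0.0, 3.0, size=5000)                          # proposal N(0, 3^2)
log_w = -0.5 * x**2 - (-0.5 * (x / 3.0)**2 - np.log(3.0))    # target N(0,1) over proposal
print(effective_sample_size(log_w))                          # well below 5000
```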

  11. AND/OR Importance Sampling

    OpenAIRE

    Gogate, Vibhav; Dechter, Rina

    2012-01-01

    The paper introduces AND/OR importance sampling for probabilistic graphical models. In contrast to importance sampling, AND/OR importance sampling caches samples in the AND/OR space and then extracts a new sample mean from the stored samples. We prove that AND/OR importance sampling may have lower variance than importance sampling; thereby providing a theoretical justification for preferring it over importance sampling. Our empirical evaluation demonstrates that AND/OR importance sampling is ...

  12. Importance sampling for characterizing STAP detectors

    NARCIS (Netherlands)

    Srinivasan, R.; Rangaswamy, M.

    2007-01-01

    This paper describes the development of adaptive importance sampling techniques for estimating false alarm probabilities of detectors that use space-time adaptive processing (STAP) algorithms. Fast simulation using importance sampling methods has been notably successful in the study of conventional

  13. Adaptively biased sequential importance sampling for rare events in reaction networks with comparison to exact solutions from finite buffer dCME method.

    Science.gov (United States)

    Cao, Youfang; Liang, Jie

    2013-07-14

    Critical events that occur rarely in biological processes are of great importance, but are challenging to study using Monte Carlo simulation. By introducing biases to reaction selection and reaction rates, weighted stochastic simulation algorithms based on importance sampling allow rare events to be sampled more effectively. However, existing methods do not address the important issue of barrier crossing, which often arises from multistable networks and systems with complex probability landscapes. In addition, the proliferation of parameters and the associated computing cost pose significant problems. Here we introduce a general theoretical framework for obtaining optimized biases in sampling individual reactions for estimating probabilities of rare events. We further describe a practical algorithm called the adaptively biased sequential importance sampling (ABSIS) method for efficient probability estimation. By adopting a look-ahead strategy and by enumerating short paths from the current state, we estimate the reaction-specific and state-specific forward and backward moving probabilities of the system, which are then used to bias reaction selections. The ABSIS algorithm can automatically detect barrier-crossing regions, and can adjust the bias adaptively at different steps of the sampling process, with the bias determined by the outcome of exhaustively generated short paths. In addition, there are only two bias parameters to be determined, regardless of the number of reactions and the complexity of the network. We have applied the ABSIS method to four biochemical networks: the birth-death process, the reversible isomerization, the bistable Schlögl model, and the enzymatic futile cycle model. For comparison, we have also applied the finite buffer discrete chemical master equation (dCME) method recently developed to obtain exact numerical solutions of the underlying discrete chemical master equations of these problems. This allows us to assess sampling results objectively

  14. Fuzzy adaptive importance sampling method based on MCMC

    Institute of Scientific and Technical Information of China (English)

    王进玲; 曾声奎; 马纪明

    2012-01-01

    The traditional adaptive importance sampling method based on Markov chain Monte Carlo (MCMC) applies only to systems with a crisp failure domain and not to gradually degrading structural systems whose failure domain is fuzzy. A new fuzzy adaptive importance sampling method based on MCMC is proposed. First, Markov chain samples are constructed with the Metropolis algorithm, starting from an initial sample in the failure domain. Then a kernel sampling probability density function is obtained by adaptive kernel density estimation and importance sampling is carried out. Finally, the fuzzy failure domain is discretized to compute the fuzzy failure probability. This approach addresses the difficulty of analyzing the performance reliability of gradually degrading structural systems by simulation and the low efficiency of such simulations. The feasibility and effectiveness of the method are demonstrated on an actuator system case study.
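
    For the crisp-failure-domain setting that this paper generalizes, the MCMC-plus-kernel-density adaptation can be sketched as follows. Everything in the sketch is an assumed toy setup (a linear limit-state function g with a known exact failure probability); the fuzzy discretization step of the paper is not included.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Limit-state function: failure when g(x) <= 0 (assumed 2-D toy example).
def g(x):
    return 3.5 - x.sum(axis=-1) / np.sqrt(2.0)     # exact P_f = Phi(-3.5) ~ 2.3e-4

f = stats.multivariate_normal(mean=[0, 0], cov=np.eye(2))   # nominal input density

# 1) Metropolis random walk restricted to the failure domain,
#    started from a point known to fail.
x, chain = np.array([2.6, 2.6]), []
for _ in range(3000):
    cand = x + rng.normal(scale=0.5, size=2)
    # accept moves that stay in the failure domain with ratio f(cand)/f(x)
    if g(cand) <= 0 and rng.random() < np.exp(f.logpdf(cand) - f.logpdf(x)):
        x = cand
    chain.append(x.copy())
chain = np.array(chain[500:])                       # drop burn-in

# 2) Kernel sampling density built on the chain (Gaussian KDE).
kde = stats.gaussian_kde(chain.T)

# 3) Importance sampling with the kernel density as proposal.
n = 20000
xs = kde.resample(n).T
w = np.exp(f.logpdf(xs) - kde.logpdf(xs.T))         # f(x) / h(x)
pf = np.mean(w * (g(xs) <= 0))
print("estimated failure probability:", pf)
```

    Because the kernel density concentrates the proposal in the failure region, the importance-sampling estimate needs far fewer samples than crude Monte Carlo for a rare failure event of this size.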

  15. Covariance-Adaptive Slice Sampling

    OpenAIRE

    Thompson, Madeleine; Neal, Radford M.

    2010-01-01

    We describe two slice sampling methods for taking multivariate steps using the crumb framework. These methods use the gradients at rejected proposals to adapt to the local curvature of the log-density surface, a technique that can produce much better proposals when parameters are highly correlated. We evaluate our methods on four distributions and compare their performance to that of a non-adaptive slice sampling method and a Metropolis method. The adaptive methods perform favorably on low-di...

  16. Adaptive sampling for noisy problems

    Energy Technology Data Exchange (ETDEWEB)

    Cantu-Paz, E

    2004-03-26

    The usual approach to deal with noise present in many real-world optimization problems is to take an arbitrary number of samples of the objective function and use the sample average as an estimate of the true objective value. The number of samples is typically chosen arbitrarily and remains constant for the entire optimization process. This paper studies an adaptive sampling technique that varies the number of samples based on the uncertainty of deciding between two individuals. Experiments demonstrate the effect of adaptive sampling on the final solution quality reached by a genetic algorithm and the computational cost required to find the solution. The results suggest that the adaptive technique can effectively eliminate the need to set the sample size a priori, but in many cases it requires high computational costs.
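
    The decision rule described in the abstract (keep sampling two noisy individuals until the comparison becomes statistically clear, or a budget is exhausted) can be illustrated as follows. The normal-approximation stopping rule and all constants below are assumptions for illustration, not the paper's exact criterion.

```python
import numpy as np

rng = np.random.default_rng(4)

def compare_adaptive(sample_a, sample_b, z=1.96, n0=5, n_max=200):
    """Sample two noisy objective functions until the confidence interval of
    the mean difference excludes zero or the budget n_max is exhausted.
    Returns a decision in {-1, 0, +1} (a<b, undecided, a>b) and the cost."""
    a = [sample_a() for _ in range(n0)]
    b = [sample_b() for _ in range(n0)]
    while len(a) < n_max:
        ma, mb = np.mean(a), np.mean(b)
        se = np.sqrt(np.var(a, ddof=1) / len(a) + np.var(b, ddof=1) / len(b))
        if abs(ma - mb) > z * se:                     # confident decision
            return int(np.sign(ma - mb)), len(a) + len(b)
        a.append(sample_a()); b.append(sample_b())    # take one more of each
    return 0, len(a) + len(b)

# Example: two individuals with true fitness 1.0 and 1.2, noise sd 1.0.
decision, cost = compare_adaptive(lambda: rng.normal(1.0, 1.0),
                                  lambda: rng.normal(1.2, 1.0))
print(decision, cost)   # easy pairs resolve cheaply, close pairs use more samples
```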

  17. Adaptive Importance Sampling of Hybrid Variable Systems Based on MCMC

    Institute of Scientific and Technical Information of China (English)

    王进玲; 曾声奎; 马纪明; 庞怡

    2012-01-01

    The occurrence of key failures in a system may cause the system to degrade into different discrete states of performance. The classical importance sampling method based on Markov chain Monte Carlo (MCMC) can only be applied to a continuous variable system and cannot resolve the problem of mixed systems that include discrete variables. Therefore, an improved adaptive importance sampling method based on MCMC is proposed to support the efficient simulation of system performance reliability. First, a failure space is constructed by combining the different failure domains, and Markov chain samples are obtained by letting an initial sample perform a random walk in the failure space. Second, with a comprehensive consideration of continuous and discrete variables, a hybrid sampling density function is obtained through kernel density estimation. Then, importance sampling simulation is carried out according to this hybrid sampling density function and the performance reliability is computed. Finally, the simulation efficiency is analyzed in theory. The validity and high efficiency of the proposed method are demonstrated by the case of an electro-hydrostatic actuator (EHA) system.

  18. Adaptive Sampling in Hierarchical Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Knap, J; Barton, N R; Hornung, R D; Arsenlis, A; Becker, R; Jefferson, D R

    2007-07-09

    We propose an adaptive sampling methodology for hierarchical multi-scale simulation. The method utilizes a moving kriging interpolation to significantly reduce the number of evaluations of finer-scale response functions to provide essential constitutive information to a coarser-scale simulation model. The underlying interpolation scheme is unstructured and adaptive to handle the transient nature of a simulation. To handle the dynamic construction and searching of a potentially large set of finer-scale response data, we employ a dynamic metric tree database. We study the performance of our adaptive sampling methodology for a two-level multi-scale model involving a coarse-scale finite element simulation and a finer-scale crystal plasticity based constitutive law.

  19. Examples comparing importance sampling and the Metropolis algorithm

    OpenAIRE

    Bassetti, Federico; Diaconis, Persi

    2006-01-01

    Importance sampling, particularly sequential and adaptive importance sampling, has emerged as a competitive simulation technique to Markov chain Monte Carlo. We compare importance sampling and the Metropolis algorithm as two ways of changing the output of a Markov chain to get a different stationary distribution.
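
    The two estimators being compared can be put side by side on a toy target; a minimal sketch (the Gaussian target and all tuning constants are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(5)
log_target = lambda x: -0.5 * (x - 2.0) ** 2        # unnormalized N(2, 1)

# Self-normalized importance sampling with a N(0, 2^2) proposal.
n = 50000
xq = rng.normal(0.0, 2.0, size=n)
lw = log_target(xq) - (-0.5 * (xq / 2.0) ** 2 - np.log(2.0))
w = np.exp(lw - lw.max()); w /= w.sum()
print("IS estimate of E[X]:", np.sum(w * xq))        # close to 2

# Random-walk Metropolis targeting the same density.
x, chain = 0.0, []
for _ in range(n):
    cand = x + rng.normal(scale=1.0)
    if np.log(rng.random()) < log_target(cand) - log_target(x):
        x = cand
    chain.append(x)
print("Metropolis estimate of E[X]:", np.mean(chain[1000:]))   # close to 2
```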

  20. Importance Sampling for the Infinite Sites Model

    OpenAIRE

    Hobolth, Asger; Uyenoyama, Marcy K; Wiuf, Carsten

    2008-01-01

    Importance sampling or Markov chain Monte Carlo sampling is required for state-of-the-art statistical analysis of population genetics data. The applicability of these sampling-based inference techniques depends crucially on the proposal distribution. In this paper, we discuss importance sampling for the infinite sites model. The infinite sites assumption is attractive because it constrains the number of possible genealogies, thereby allowing for the analysis of larger data sets. We recall th...

  1. Adaptive sampling algorithm for detection of superpoints

    Institute of Scientific and Technical Information of China (English)

    CHENG Guang; GONG Jian; DING Wei; WU Hua; QIANG ShiQiang

    2008-01-01

    The superpoints are the sources (or destinations) that connect with a large number of destinations (or sources) during a measurement time interval, so detecting the superpoints in real time is very important to network security and management. Previous algorithms are not able to control memory usage while delivering the desired accuracy, so it is hard to detect superpoints on a high-speed link in real time. In this paper, we propose an adaptive sampling algorithm to detect superpoints in real time, which uses a flow sample-and-hold module to reduce the detection of non-superpoints and to improve the measurement accuracy of the superpoints. We also design a data stream structure to maintain the flow records, which compensates statistically for flow hash collisions. An adaptive process based on different sampling probabilities is used to maintain the recorded IP addresses in the limited memory. The algorithm is compared with other algorithms by analyzing real network trace data. Experimental results and mathematical analysis show that the algorithm has the advantages of both a limited memory requirement and high measurement accuracy.
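
    The flow sample-and-hold idea with an adaptive sampling probability can be sketched in a few lines. This is a generic toy version under assumed parameter names (p0, max_flows, threshold); it does not reproduce the paper's data-stream structure or its hash-collision compensation.

```python
import random
from collections import defaultdict

def sample_and_hold(packets, p0=0.01, max_flows=10_000, threshold=100):
    """Toy flow sample-and-hold: a flow (src, dst) enters the table with
    probability p; once held, every further packet of that flow is counted.
    When the table fills up, the sampling probability is halved (the adaptive
    step).  Sources whose distinct-destination count exceeds `threshold` are
    reported as superpoint candidates."""
    p = p0
    flows = set()                       # held (src, dst) pairs
    fanout = defaultdict(set)           # src -> destinations seen via held flows
    for src, dst in packets:
        key = (src, dst)
        if key in flows:
            fanout[src].add(dst)
        elif random.random() < p:
            if len(flows) >= max_flows:
                p /= 2                  # adapt: sample less aggressively
                continue
            flows.add(key)
            fanout[src].add(dst)
    return {s for s, ds in fanout.items() if len(ds) > threshold}

# Tiny demo: one scanning source (many destinations) plus background traffic.
pkts = [("scanner", f"dst{i}") for i in range(20000)]
pkts += [(f"host{i % 50}", "server") for i in range(20000)]
random.shuffle(pkts)
print(sample_and_hold(pkts, p0=0.05))   # expected to report only "scanner"
```

    Halving the sampling probability whenever the table fills keeps memory bounded, while already-held (likely superpoint) flows continue to be counted exactly.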

  2. An Adaptive Importance Sampling Theory Based on the Generalized Genetic Algorithm

    Institute of Scientific and Technical Information of China (English)

    董聪; 郭晓华

    2000-01-01

    In the present paper, using the generalized genetic algorithm, the problem of finding all design points in the case of generalized multiple design points is solved. By establishing a recursion-type bound-and-classification algorithm, the problem of reducing and synthesizing generalized multiple design points is also solved. The paper shows that the adaptive importance sampling theory based on the generalized genetic algorithm is a more efficient tool for the reliability simulation of nonlinear systems.

  3. DMATIS: Dark Matter ATtenuation Importance Sampling

    Science.gov (United States)

    Mahdawi, Mohammad Shafi; Farrar, Glennys R.

    2017-05-01

    DMATIS (Dark Matter ATtenuation Importance Sampling) calculates the trajectories of DM particles that propagate in the Earth's crust and the lead shield to reach the DAMIC detector using an importance sampling Monte-Carlo simulation. A detailed Monte-Carlo simulation avoids the deficiencies of the SGED/KS method that uses a mean energy loss description to calculate the lower bound on the DM-proton cross section. The code implementing the importance sampling technique makes the brute-force Monte-Carlo simulation of moderately strongly interacting DM with nucleons computationally feasible. DMATIS is written in Python 3 and MATHEMATICA.

  4. Importance sampling the Rayleigh phase function

    DEFF Research Database (Denmark)

    Frisvad, Jeppe Revall

    2011-01-01

    Rayleigh scattering is used frequently in Monte Carlo simulation of multiple scattering. The Rayleigh phase function is quite simple, and one might expect that it should be simple to importance sample it efficiently. However, there seems to be no one good way of sampling it in the literature. This paper provides the details of several different techniques for importance sampling the Rayleigh phase function, and it includes a comparison of their performance as well as hints toward efficient implementation.
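
    One of the standard options (not necessarily the one the paper recommends) is exact inversion of the cumulative distribution of cos θ, which reduces to solving a depressed cubic. A minimal sketch:

```python
import numpy as np

def sample_rayleigh_cos_theta(u):
    """Sample mu = cos(theta) from the Rayleigh phase function
    p(mu) = 3/8 * (1 + mu^2), mu in [-1, 1], by inverting the CDF.
    The CDF condition  mu^3 + 3*mu - (8u - 4) = 0  is a depressed cubic
    with a single real root, given by Cardano's formula."""
    z = 4.0 * u - 2.0
    s = np.sqrt(z * z + 1.0)
    return np.cbrt(z + s) + np.cbrt(z - s)

# Quick sanity check against analytic moments of p(mu).
rng = np.random.default_rng(6)
mu = sample_rayleigh_cos_theta(rng.random(200_000))
print(mu.min(), mu.max())          # within [-1, 1]
print(np.mean(mu ** 2))            # E[mu^2] = 2/5 for p(mu) = 3/8 (1 + mu^2)
```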

  5. Importance Sampling Variance Reduction in GRESS ATMOSIM

    Energy Technology Data Exchange (ETDEWEB)

    Wakeford, Daniel Tyler [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-04-26

    This document is intended to introduce the importance sampling method of variance reduction to a Geant4 user for application to neutral particle Monte Carlo transport through the atmosphere, as implemented in GRESS ATMOSIM.

  6. Importance sampling for NMF class of STAP detectors

    NARCIS (Netherlands)

    Anitori, L.; Srinivasan, R.; Rangaswamy, M.

    2006-01-01

    Importance sampling (IS) techniques are applied to space-time adaptive processing (STAP) radar detection algorithms for performance characterization via fast estimation of false alarm probabilities (FAP’s). The work here builds on and extends the initial thrust in this area provided in a recent pape

  8. Importance sampling for jump processes and applications to finance

    OpenAIRE

    Badouraly Kassim, Laetitia; Lelong, Jérôme; Loumrhari, Imane

    2013-01-01

    Adaptive importance sampling techniques are widely known for the Gaussian setting of Brownian driven diffusions. In this work, we want to extend them to jump processes. Our approach relies on a change of the jump intensity combined with the standard exponential tilting for the Brownian motion. The free parameters of our framework are optimized using sample average approximation techniques. We illustrate the efficiency of our method on the valuation of financial derivat...

  9. Better Confidence Intervals for Importance Sampling

    OpenAIRE

    Sak, Halis; Hörmann, Wolfgang; Leydold, Josef

    2010-01-01

    It is well known that for highly skewed distributions the standard method of using the t statistic for the confidence interval of the mean does not give robust results. This is an important problem for importance sampling (IS) as its final distribution is often skewed due to a heavy tailed weight distribution. In this paper, we first explain Hall's transformation and its variants to correct the confidence interval of the mean and then evaluate the performance of these methods for two numerica...

  10. Explaining Adaptive Radial-Based Direction Sampling

    NARCIS (Netherlands)

    L. Bauwens (Luc); C.S. Bos (Charles); H.K. van Dijk (Herman); R.D. van Oest (Rutger)

    2003-01-01

    In this short paper we summarize the computational steps of Adaptive Radial-Based Direction Sampling (ARDS), which can be used for Bayesian analysis of ill behaved target densities. We consider one simulation experiment in order to illustrate the good performance of ARDS relative to the

  11. Sequential adaptive compressed sampling via Huffman codes

    CERN Document Server

    Aldroubi, Akram; Zarringhalam, Kourosh

    2008-01-01

    There are two main approaches in compressed sensing: the geometric approach and the combinatorial approach. In this paper we introduce an information theoretic approach and use results from the theory of Huffman codes to construct a sequence of binary sampling vectors to determine a sparse signal. Unlike other approaches, our approach is adaptive in the sense that each sampling vector depends on the previous sample. The number of measurements we need for a k-sparse vector in n-dimensional space is no more than O(k log n) and the reconstruction is O(k).

  12. Adaptive Rate Sampling and Filtering Based on Level Crossing Sampling

    Directory of Open Access Journals (Sweden)

    Saeed Mian Qaisar

    2009-01-01

    The recent sophistication of mobile systems and sensor networks demands more and more processing resources. In order to maintain system autonomy, energy saving is becoming one of the most difficult industrial challenges in mobile computing. Most efforts to achieve this goal focus on improving embedded systems design and battery technology, but very few studies aim to exploit the time-varying nature of the input signal. This paper aims to achieve power efficiency by intelligently adapting the processing activity to the local characteristics of the input signal. It does so by completely rethinking the processing chain, adopting a non-conventional sampling scheme and adaptive-rate filtering. The proposed approach, based on the LCSS (Level Crossing Sampling Scheme), presents two filtering techniques able to adapt their sampling rate and filter order by analyzing the input signal variations online. The principle is to intelligently exploit the signal's local characteristics, which are usually never considered, to filter only the relevant signal parts with filters of the appropriate order. This leads to a drastic gain in computational efficiency, and hence in processing power, compared to classical techniques.
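
    The level-crossing idea itself is easy to illustrate: a sample is taken only when the signal crosses one of a set of fixed amplitude levels, so quiet stretches generate almost no samples. A toy sketch under an assumed level spacing q follows; real LCSS implementations record the exact crossing instants rather than the nearest grid time.

```python
import numpy as np

def level_crossing_sample(t, x, q=0.1):
    """Toy level-crossing sampling: emit a sample whenever the signal crosses
    one of the uniformly spaced levels k*q.  Slowly varying stretches of the
    signal therefore produce almost no samples."""
    times, values = [t[0]], [x[0]]
    last_level = np.floor(x[0] / q)
    for ti, xi in zip(t[1:], x[1:]):
        level = np.floor(xi / q)
        if level != last_level:          # a level was crossed since the last sample
            times.append(ti); values.append(xi)
            last_level = level
    return np.array(times), np.array(values)

# Example: a signal that is quiet in its first half and active in its second
# half is sampled densely only where it actually moves.
t = np.linspace(0, 1, 10_000)
x = np.where(t < 0.5, 0.05 * np.sin(2 * np.pi * t), np.sin(2 * np.pi * 20 * t))
ts, xs = level_crossing_sample(t, x, q=0.1)
print(len(ts), "samples instead of", len(t))
```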

  13. HASE - The Helsinki adaptive sample preparation line

    Energy Technology Data Exchange (ETDEWEB)

    Palonen, V., E-mail: vesa.palonen@helsinki.fi [Department of Physics, University of Helsinki, P.O. Box 43, FI-00014 (Finland); Pesonen, A. [Laboratory of Chronology, Finnish Museum of Natural History, P.O. Box 64, FI-00014 (Finland); Herranen, T.; Tikkanen, P. [Department of Physics, University of Helsinki, P.O. Box 43, FI-00014 (Finland); Oinonen, M. [Laboratory of Chronology, Finnish Museum of Natural History, P.O. Box 64, FI-00014 (Finland)

    2013-01-15

    We have designed and built an adaptive sample preparation line with separate modules for combustion, molecular sieve handling, CO2 gas cleaning, CO2 storage, and graphitization. The line is also connected to an elemental analyzer. Operation of the vacuum equipment, a flow controller, pressure sensors, ovens, and graphitization reactors is automated with a reliable NI-cRIO real-time system. Stepped combustion can be performed in two ovens at temperatures up to 900 °C. Depending on the application, CuO or O2-flow combustion can be used. A flow controller is used to adjust the O2 flow and pressure during combustion. For environmental samples, a module for molecular sieve regeneration and sample desorption is attached to the line, replacing the combustion module. In the storage module, CO2 samples can be stored behind a gas-tight diaphragm valve, either for later graphitization or for measurement with separate equipment (AMS gas ion source or a separate mass spectrometer). The graphitization module consists of four automated reactors, capable of graphitizing samples with masses from 3 mg down to 50 µg.

  14. Important ingredients for health adaptive information systems.

    Science.gov (United States)

    Senathirajah, Yalini; Bakken, Suzanne

    2011-01-01

    Healthcare information systems frequently do not truly meet clinician needs, due to the complexity, variability, and rapid change in medical contexts. Recently the internet world has been transformed by approaches commonly termed 'Web 2.0'. This paper proposes a Web 2.0 model for a healthcare adaptive architecture. The vision includes creating modular, user-composable systems which aim to make all necessary information from multiple internal and external sources available via a platform, for the user to use, arrange, recombine, author, and share at will, using rich interfaces where advisable. Clinicians can create a set of 'widgets' and 'views' which can transform data, reflect their domain knowledge and cater to their needs, using simple drag and drop interfaces without the intervention of programmers. We have built an example system, MedWISE, embodying the user-facing parts of the model. This approach to HIS is expected to have several advantages, including greater suitability to user needs (reflecting clinician rather than programmer concepts and priorities), incorporation of multiple information sources, agile reconfiguration to meet emerging situations and new treatment deployment, capture of user domain expertise and tacit knowledge, efficiencies due to workflow and human-computer interaction improvements, and greater user acceptance.

  15. A software sampling frequency adaptive algorithm for reducing spectral leakage

    Institute of Scientific and Technical Information of China (English)

    PAN Li-dong; WANG Fei

    2006-01-01

    Spectral leakage caused by synchronization error in a nonsynchronous sampling system is an important cause of reduced accuracy in spectral analysis and harmonic measurement. This paper presents a software sampling-frequency adaptive algorithm that obtains the actual signal frequency more accurately, then adjusts the sampling interval based on the frequency calculated by the software algorithm and modifies the sampling frequency adaptively. It reduces synchronization error and the impact of spectral leakage, thereby improving the accuracy of spectral analysis and harmonic measurement for power system signals whose frequency changes slowly. Simulations show that the algorithm has high precision, and it can be a practical method for power system harmonic analysis since it is easy to implement.

  16. Adaptive sampling for mesh spectrum editing

    Institute of Scientific and Technical Information of China (English)

    ZHAO Xiang-jun; ZHANG Hong-xin; BAO Hu-jun

    2006-01-01

    A mesh editing framework is presented in this paper, which integrates Free-Form Deformation (FFD) and geometry signal processing. By using a simplified model of the original mesh, the editing task can be accomplished with a few operations. We take the deformation of the proxy and the position coordinates of the mesh models as geometry signals. Wavelet analysis is employed to separate local detail information gracefully. The crucial innovation of this paper is a new adaptive regular sampling approach for our signal-analysis-based editing framework. In our approach, an original mesh is resampled and then refined iteratively, which reflects the optimization of our proposed spectrum-preserving energy. As an extension of our spectrum editing scheme, the editing principle is applied to transferring geometry details, which brings satisfying results.

  17. Importance of sampling frequency when collecting diatoms

    KAUST Repository

    Wu, Naicheng

    2016-11-14

    There has been increasing interest in diatom-based bio-assessment but we still lack a comprehensive understanding of how to capture diatoms’ temporal dynamics with an appropriate sampling frequency (ASF). To cover this research gap, we collected and analyzed daily riverine diatom samples over a 1-year period (25 April 2013–30 April 2014) at the outlet of a German lowland river. The samples were classified into five clusters (1–5) by a Kohonen Self-Organizing Map (SOM) method based on similarity between species compositions over time. ASFs were determined to be 25 days at Cluster 2 (June-July 2013) and 13 days at Cluster 5 (February-April 2014), whereas no specific ASFs were found at Cluster 1 (April-May 2013), 3 (August-November 2013) (>30 days) and Cluster 4 (December 2013 - January 2014) (<1 day). ASFs showed dramatic seasonality and were negatively related to hydrological wetness conditions, suggesting that sampling interval should be reduced with increasing catchment wetness. A key implication of our findings for freshwater management is that long-term bio-monitoring protocols should be developed with the knowledge of tracking algal temporal dynamics with an appropriate sampling frequency.

  18. Importance of sampling frequency when collecting diatoms

    Science.gov (United States)

    Wu, Naicheng; Faber, Claas; Sun, Xiuming; Qu, Yueming; Wang, Chao; Ivetic, Snjezana; Riis, Tenna; Ulrich, Uta; Fohrer, Nicola

    2016-11-01

    There has been increasing interest in diatom-based bio-assessment but we still lack a comprehensive understanding of how to capture diatoms’ temporal dynamics with an appropriate sampling frequency (ASF). To cover this research gap, we collected and analyzed daily riverine diatom samples over a 1-year period (25 April 2013–30 April 2014) at the outlet of a German lowland river. The samples were classified into five clusters (1–5) by a Kohonen Self-Organizing Map (SOM) method based on similarity between species compositions over time. ASFs were determined to be 25 days at Cluster 2 (June-July 2013) and 13 days at Cluster 5 (February-April 2014), whereas no specific ASFs were found at Cluster 1 (April-May 2013), 3 (August-November 2013) (>30 days) and Cluster 4 (December 2013 - January 2014) (<1 day). ASFs showed dramatic seasonality and were negatively related to hydrological wetness conditions, suggesting that sampling interval should be reduced with increasing catchment wetness. A key implication of our findings for freshwater management is that long-term bio-monitoring protocols should be developed with the knowledge of tracking algal temporal dynamics with an appropriate sampling frequency.

  19. Importance sampling of severe wind gusts

    NARCIS (Netherlands)

    Bos, R.; Bierbooms, W.A.A.M.; Van Bussel, G.J.W.

    2015-01-01

    An important problem that arises during the design of wind turbines is estimating extreme loads with sufficient accuracy. This is especially difficult during iterative design phases when computational resources are scarce. Over the years, many methods have been proposed to extrapolate extreme load d

  1. On Invertible Sampling and Adaptive Security

    DEFF Research Database (Denmark)

    Ishai, Yuval; Kumarasubramanian, Abishek; Orlandi, Claudio

    2011-01-01

    sampling algorithm A, obtain another sampling algorithm B such that the output of B is computationally indistinguishable from the output of A, but B can be efficiently inverted (even if A cannot). This invertible sampling problem is independently motivated by other cryptographic applications. We show...

  2. Adaptive Sampling Algorithms for Probabilistic Risk Assessment of Nuclear Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Diego Mandelli; Dan Maljovec; Bei Wang; Valerio Pascucci; Peer-Timo Bremer

    2013-09-01

    Nuclear simulations are often computationally expensive, time-consuming, and high-dimensional with respect to the number of input parameters. Thus exploring the space of all possible simulation outcomes is infeasible using finite computing resources. During simulation-based probabilistic risk analysis, it is important to discover the relationship between a potentially large number of input parameters and the output of a simulation using as few simulation trials as possible. This is a typical context for performing adaptive sampling where a few observations are obtained from the simulation, a surrogate model is built to represent the simulation space, and new samples are selected based on the model constructed. The surrogate model is then updated based on the simulation results of the sampled points. In this way, we attempt to gain the most information possible with a small number of carefully selected sampled points, limiting the number of expensive trials needed to understand features of the simulation space. We analyze the specific use case of identifying the limit surface, i.e., the boundaries in the simulation space between system failure and system success. In this study, we explore several techniques for adaptively sampling the parameter space in order to reconstruct the limit surface. We focus on several adaptive sampling schemes. First, we seek to learn a global model of the entire simulation space using prediction models or neighborhood graphs and extract the limit surface as an iso-surface of the global model. Second, we estimate the limit surface by sampling in the neighborhood of the current estimate based on topological segmentations obtained locally. Our techniques draw inspirations from topological structure known as the Morse-Smale complex. We highlight the advantages and disadvantages of using a global prediction model versus local topological view of the simulation space, comparing several different strategies for adaptive sampling in both

  3. Adapted random sampling patterns for accelerated MRI.

    Science.gov (United States)

    Knoll, Florian; Clason, Christian; Diwoky, Clemens; Stollberger, Rudolf

    2011-02-01

    Variable density random sampling patterns have recently become increasingly popular for accelerated imaging strategies, as they lead to incoherent aliasing artifacts. However, the design of these sampling patterns is still an open problem. Current strategies use model assumptions like polynomials of different order to generate a probability density function that is then used to generate the sampling pattern. This approach relies on the optimization of design parameters which is very time consuming and therefore impractical for daily clinical use. This work presents a new approach that generates sampling patterns by making use of power spectra of existing reference data sets and hence requires neither parameter tuning nor an a priori mathematical model of the density of sampling points. The approach is validated with downsampling experiments, as well as with accelerated in vivo measurements. The proposed approach is compared with established sampling patterns, and the generalization potential is tested by using a range of reference images. Quantitative evaluation is performed for the downsampling experiments using RMS differences to the original, fully sampled data set. Our results demonstrate that the image quality of the method presented in this paper is comparable to that of an established model-based strategy when optimization of the model parameter is carried out and yields superior results to non-optimized model parameters. However, no random sampling pattern showed superior performance when compared to conventional Cartesian subsampling for the considered reconstruction strategy.
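
    The core step, turning a reference power spectrum into a variable-density sampling mask, can be sketched briefly. The normalization floor and the synthetic reference image below are assumptions; the paper's exact construction and validation pipeline are not reproduced here.

```python
import numpy as np

def pattern_from_reference(ref_image, n_samples, rng=None):
    """Draw a k-space sampling mask whose density follows the power spectrum
    of a reference image (sketch of the idea only).  A small floor keeps every
    location selectable so sampling without replacement cannot fail."""
    rng = rng or np.random.default_rng()
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(ref_image))) ** 2
    pdf = spectrum + 1e-6 * spectrum.max()
    pdf /= pdf.sum()
    idx = rng.choice(pdf.size, size=n_samples, replace=False, p=pdf.ravel())
    mask = np.zeros(pdf.shape, dtype=bool)
    mask.ravel()[idx] = True
    return mask

# Example with a smooth synthetic reference: samples concentrate near the
# k-space centre, giving an incoherent variable-density pattern.
ref = np.outer(np.hanning(128), np.hanning(128))
mask = pattern_from_reference(ref, n_samples=128 * 128 // 4)
print(mask.sum(), "of", mask.size, "k-space locations sampled")
```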

  4. Adaptive bandwidth measurements of importance functions for speech intelligibility prediction.

    Science.gov (United States)

    Whitmal, Nathaniel A; DeRoy, Kristina

    2011-12-01

    The Articulation Index (AI) and Speech Intelligibility Index (SII) predict intelligibility scores from measurements of speech and hearing parameters. One component in the prediction is the "importance function," a weighting function that characterizes contributions of particular spectral regions of speech to speech intelligibility. Previous work with SII predictions for hearing-impaired subjects suggests that prediction accuracy might improve if importance functions for individual subjects were available. Unfortunately, previous importance function measurements have required extensive intelligibility testing with groups of subjects, using speech processed by various fixed-bandwidth low-pass and high-pass filters. A more efficient approach appropriate to individual subjects is desired. The purpose of this study was to evaluate the feasibility of measuring importance functions for individual subjects with adaptive-bandwidth filters. In two experiments, ten subjects with normal hearing listened to vowel-consonant-vowel (VCV) nonsense words processed by low-pass and high-pass filters whose bandwidths were varied adaptively to produce specified performance levels in accordance with the transformed up-down rules of Levitt [(1971). J. Acoust. Soc. Am. 49, 467-477]. Local linear psychometric functions were fit to the resulting data and used to generate an importance function for VCV words. Results indicate that the adaptive method is reliable and efficient, and produces importance function data consistent with the corresponding AI/SII importance function.
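
    The transformed up-down rule referenced above can be sketched as a 2-down/1-up staircase on the filter cutoff, which converges near the 70.7%-correct point of the psychometric function. The logistic psychometric function and all numeric constants below are assumptions for illustration, not the study's measured VCV data.

```python
import numpy as np

rng = np.random.default_rng(7)

def true_pc(bandwidth_hz):
    # Assumed psychometric function: intelligibility rises with the low-pass cutoff.
    return 1.0 / (1.0 + np.exp(-(bandwidth_hz - 1500.0) / 300.0))

def two_down_one_up(start_hz=4000.0, step_hz=200.0, n_trials=200):
    """2-down/1-up transformed up-down rule (Levitt, 1971): the bandwidth is
    decreased after two consecutive correct responses and increased after each
    error, so the track converges near the 70.7%-correct bandwidth."""
    bw, correct_in_a_row, track = start_hz, 0, []
    for _ in range(n_trials):
        correct = rng.random() < true_pc(bw)
        if correct:
            correct_in_a_row += 1
            if correct_in_a_row == 2:
                bw -= step_hz; correct_in_a_row = 0
        else:
            bw += step_hz; correct_in_a_row = 0
        bw = max(bw, 0.0)
        track.append(bw)
    return np.mean(track[n_trials // 2:])   # average the later part of the track

print("bandwidth tracking ~70.7% correct:", two_down_one_up())
```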

  5. Autonomous spatially adaptive sampling in experiments based on curvature, statistical error and sample spacing with applications in LDA measurements

    Science.gov (United States)

    Theunissen, Raf; Kadosh, Jesse S.; Allen, Christian B.

    2015-06-01

    Spatially varying signals are typically sampled by collecting uniformly spaced samples irrespective of the signal content. For signals with inhomogeneous information content, this leads to unnecessarily dense sampling in regions of low interest or insufficient sample density at important features, or both. A new adaptive sampling technique is presented directing sample collection in proportion to local information content, capturing adequately the short-period features while sparsely sampling less dynamic regions. The proposed method incorporates a data-adapted sampling strategy on the basis of signal curvature, sample space-filling, variable experimental uncertainty and iterative improvement. Numerical assessment has indicated a reduction in the number of samples required to achieve a predefined uncertainty level overall while improving local accuracy for important features. The potential of the proposed method has been further demonstrated on the basis of Laser Doppler Anemometry experiments examining the wake behind a NACA0012 airfoil and the boundary layer characterisation of a flat plate.

  6. 19 CFR 151.67 - Sampling by importer.

    Science.gov (United States)

    2010-04-01

    ... 19 Customs Duties 2 2010-04-01 2010-04-01 false Sampling by importer. 151.67 Section 151.67 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE... importer. The importer may be permitted after entry to draw samples under Customs supervision in...

  7. Domain Adaptation: Overfitting and Small Sample Statistics

    CERN Document Server

    Foster, Dean; Salakhutdinov, Ruslan

    2011-01-01

    We study the prevalent problem when a test distribution differs from the training distribution. We consider a setting where our training set consists of a small number of sample domains, but where we have many samples in each domain. Our goal is to generalize to a new domain. For example, we may want to learn a similarity function using only certain classes of objects, but we desire that this similarity function be applicable to object classes not present in our training sample (e.g. we might seek to learn that "dogs are similar to dogs" even though images of dogs were absent from our training set). Our theoretical analysis shows that we can select many more features than domains while avoiding overfitting by utilizing data-dependent variance properties. We present a greedy feature selection algorithm based on using T-statistics. Our experiments validate this theory showing that our T-statistic based greedy feature selection is more robust at avoiding overfitting than the classical greedy procedure.

  8. State-dependent importance sampling for a Jackson tandem network

    NARCIS (Netherlands)

    Miretskiy, D.I.; Scheinhardt, W.R.W.; Mandjes, M.R.H.

    2008-01-01

    This paper considers importance sampling as a tool for rare-event simulation. The focus is on estimating the probability of overflow in the downstream queue of a Jacksonian two-node tandem queue – it is known that in this setting ‘traditional’ state-independent importance-sampling distributions perfo

  10. State-dependent importance sampling for a Jackson tandem network

    NARCIS (Netherlands)

    Miretskiy, Denis; Scheinhardt, Werner; Mandjes, Michel

    2010-01-01

    This article considers importance sampling as a tool for rare-event simulation. The focus is on estimating the probability of overflow in the downstream queue of a Jacksonian two-node tandem queue; it is known that in this setting “traditional” state-independent importance-sampling distributions per

  12. Irregular and adaptive sampling for automatic geophysic measure systems

    Science.gov (United States)

    Avagnina, Davide; Lo Presti, Letizia; Mulassano, Paolo

    2000-07-01

    In this paper a sampling method, based on an irregular and adaptive strategy, is described. It can be used as an automatic guide for rovers designed to explore terrestrial and planetary environments. Starting from the hypothesis that an exploration vehicle is equipped with a payload able to acquire measurements of quantities of interest, the method is able to detect objects of interest from measured points and to realize an adaptive sampling, while only coarsely describing the uninteresting background.

  13. Robust Fusion of Irregularly Sampled Data Using Adaptive Normalized Convolution

    Directory of Open Access Journals (Sweden)

    Schutte Klamer

    2006-01-01

    We present a novel algorithm for image fusion from irregularly sampled data. The method is based on the framework of normalized convolution (NC), in which the local signal is approximated through a projection onto a subspace. The use of polynomial basis functions in this paper makes NC equivalent to a local Taylor series expansion. Unlike the traditional framework, however, the window function of adaptive NC is adapted to local linear structures. This leads to more samples of the same modality being gathered for the analysis, which in turn improves signal-to-noise ratio and reduces diffusion across discontinuities. A robust signal certainty is also adapted to the sample intensities to minimize the influence of outliers. Excellent fusion capability of adaptive NC is demonstrated through an application of super-resolution image reconstruction.

  14. Time-Stampless Adaptive Nonuniform Sampling for Stochastic Signals

    Science.gov (United States)

    Feizi, Soheil; Goyal, Vivek K.; Medard, Muriel

    2012-10-01

    In this paper, we introduce a time-stampless adaptive nonuniform sampling (TANS) framework, in which time increments between samples are determined by a function of the $m$ most recent increments and sample values. Since only past samples are used in computing time increments, it is not necessary to save sampling times (time stamps) for use in the reconstruction process. We focus on two TANS schemes for discrete-time stochastic signals: a greedy method, and a method based on dynamic programming. We analyze the performances of these schemes by computing (or bounding) their trade-offs between sampling rate and expected reconstruction distortion for autoregressive and Markovian signals. Simulation results support the analysis of the sampling schemes. We show that, by opportunistically adapting to local signal characteristics, TANS may lead to improved power efficiency in some applications.
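
    A toy instance of the framework described above: the next sampling instant is computed from the most recent samples only, so the decoder can regenerate the sampling times from the samples themselves. The specific increment rule below (interval inversely proportional to the local slope) is an assumption for illustration, not one of the two schemes analyzed in the paper.

```python
import numpy as np

def tans_greedy(signal_fn, t0=0.0, t_end=10.0, dt_min=0.01, dt_max=1.0, c=0.05):
    """Toy TANS-like sampler: the next time increment is a function of the most
    recent samples only (here, inversely proportional to a local slope
    estimate), so no time stamps need to be stored for reconstruction."""
    t, x = [t0, t0 + dt_min], [signal_fn(t0), signal_fn(t0 + dt_min)]
    while t[-1] < t_end:
        slope = abs(x[-1] - x[-2]) / (t[-1] - t[-2])
        dt = np.clip(c / (slope + 1e-9), dt_min, dt_max)
        t.append(t[-1] + dt)
        x.append(signal_fn(t[-1]))
    return np.array(t), np.array(x)

# A chirp is sampled sparsely at the start and densely once it speeds up.
ts, xs = tans_greedy(lambda t: np.sin(0.5 * t ** 2))
print(len(ts), "samples over", ts[-1], "seconds")
```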

  16. Adaptive maximal poisson-disk sampling on surfaces

    KAUST Repository

    Yan, Dongming

    2012-01-01

    In this paper, we study the generation of maximal Poisson-disk sets with varying radii on surfaces. Based on the concepts of power diagram and regular triangulation, we present a geometric analysis of gaps in such disk sets on surfaces, which is the key ingredient of the adaptive maximal Poisson-disk sampling framework. Moreover, we adapt the presented sampling framework for remeshing applications. Several novel and efficient operators are developed for improving the sampling/meshing quality over the state-of-the-art.

  17. Adaptive sampling program support for expedited site characterization

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, R.

    1993-10-01

    Expedited site characterizations offer substantial savings in time and money when assessing hazardous waste sites. Key to some of these savings is the ability to adapt a sampling program to the "real-time" data generated by an expedited site characterization. This paper presents a two-prong approach to supporting adaptive sampling programs: a specialized object-oriented database/geographical information system for data fusion, management and display; and combined Bayesian/geostatistical methods for contamination extent estimation and sample location selection.

  18. Identifying important nodes by adaptive LeaderRank

    Science.gov (United States)

    Xu, Shuang; Wang, Pei

    2017-03-01

    Spreading process is a common phenomenon in complex networks. Identifying important nodes in complex networks is of great significance in real-world applications. Based on the spreading process on networks, a lot of measures have been proposed to evaluate the importance of nodes. However, most of the existing measures are appropriate to static networks, which are fragile to topological perturbations. Many real-world complex networks are dynamic rather than static, meaning that the nodes and edges of such networks may change with time, which challenge numerous existing centrality measures. Based on a new weighted mechanism and the newly proposed H-index and LeaderRank (LR), this paper introduces a variant of the LR measure, called adaptive LeaderRank (ALR), which is a new member of the LR-family. Simulations on six real-world networks reveal that the new measure can well balance between prediction accuracy and robustness. More interestingly, the new measure can better adapt to the adjustment or local perturbations of network topologies, as compared with the existing measures. By discussing the detailed properties of the measures from the LR-family, we illustrate that the ALR has its competitive advantages over the other measures. The proposed algorithm enriches the measures to understand complex networks, and may have potential applications in social networks and biological systems.

  19. Efficient importance sampling in low dimensions using affine arithmetic

    OpenAIRE

    Everitt, Richard G.

    2017-01-01

    Despite the development of sophisticated techniques such as sequential Monte Carlo, importance sampling (IS) remains an important Monte Carlo method for low dimensional target distributions. This paper describes a new technique for constructing proposal distributions for IS, using affine arithmetic. This work builds on the Moore rejection sampler to which we provide a comparison.

  20. Adaptive Monte Carlo on multivariate binary sampling spaces

    CERN Document Server

    Schäfer, Christian

    2010-01-01

    A Monte Carlo algorithm is said to be adaptive if it can adjust its current proposal distribution automatically, using past simulations. The choice of the parametric family that defines the set of proposal distributions is critical for good performance. We treat the problem of constructing such parametric families for adaptive sampling on multivariate binary spaces. A practical motivation for this problem is variable selection in a linear regression context, where we need either to find the best model, with respect to some criterion, or to sample from a Bayesian posterior distribution on the model space. In terms of adaptive algorithms, we focus on the Cross-Entropy (CE) method for optimisation, and the Sequential Monte Carlo (SMC) methods for sampling. Raw versions of both SMC and CE algorithms are easily implemented using binary vectors with independent components. However, for high-dimensional model choice problems, these straightforward proposals do not yield satisfactory results. The key to advanced a...

  1. Adaptive Sampling for High Throughput Data Using Similarity Measures

    Energy Technology Data Exchange (ETDEWEB)

    Bulaevskaya, V. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Sales, A. P. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-05-06

    The need for adaptive sampling arises in the context of high throughput data because the rates of data arrival are many orders of magnitude larger than the rates at which they can be analyzed. A very fast decision must therefore be made regarding the value of each incoming observation and its inclusion in the analysis. In this report we discuss one approach to adaptive sampling, based on the new data point’s similarity to the other data points being considered for inclusion. We present preliminary results for one real and one synthetic data set.
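
    The report is summarized only briefly here, so the following is a hedged illustration of the general idea rather than the authors' specific rule: retain a streaming observation only if it is sufficiently dissimilar (here, in Euclidean distance) from the observations already kept. The threshold and simulated stream are illustrative.

```python
import numpy as np

def retain(stream, threshold=1.0):
    """Keep an incoming observation only if its Euclidean distance to every
    previously retained observation exceeds the threshold."""
    kept = []
    for x in stream:
        if not kept or min(np.linalg.norm(x - k) for k in kept) > threshold:
            kept.append(x)
    return np.array(kept)

rng = np.random.default_rng(2)
stream = rng.normal(size=(5000, 3))            # simulated high-rate stream
sample = retain(stream, threshold=1.5)
print(f"retained {len(sample)} of {len(stream)} observations")
```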

  2. Adaptation of the methodology of sample surveys for marketing researches

    Directory of Open Access Journals (Sweden)

    Kataev Andrey

    2015-08-01

    Full Text Available The article presents results on adapting sample survey theory for marketing purposes, which makes it possible to answer the fundamental question of any marketing research: how many objects should be studied to draw adequate conclusions.

  3. Efficient computation of smoothing splines via adaptive basis sampling

    KAUST Repository

    Ma, Ping

    2015-06-24

    © 2015 Biometrika Trust. Smoothing splines provide flexible nonparametric regression estimators. However, the high computational cost of smoothing splines for large datasets has hindered their wide application. In this article, we develop a new method, named adaptive basis sampling, for efficient computation of smoothing splines in super-large samples. Except for the univariate case where the Reinsch algorithm is applicable, a smoothing spline for a regression problem with sample size n can be expressed as a linear combination of n basis functions and its computational complexity is generally O(n³). We achieve a more scalable computation in the multivariate case by evaluating the smoothing spline using a smaller set of basis functions, obtained by an adaptive sampling scheme that uses values of the response variable. Our asymptotic analysis shows that smoothing splines computed via adaptive basis sampling converge to the true function at the same rate as full basis smoothing splines. Using simulation studies and a large-scale deep earth core-mantle boundary imaging study, we show that the proposed method outperforms a sampling method that does not use the values of response variables.

  4. Importance Sampling Simulations of Markovian Reliability Systems using Cross Entropy

    NARCIS (Netherlands)

    Ridder, Ad

    2004-01-01

    This paper reports simulation experiments applying the cross-entropy method as the importance sampling algorithm for efficient estimation of rare-event probabilities in Markovian reliability systems. The method is compared to various failure-biasing schemes that have been proved to give estimato

  6. Stochastic seismic inversion using greedy annealed importance sampling

    Science.gov (United States)

    Xue, Yang; Sen, Mrinal K.

    2016-10-01

    A global optimization method called very fast simulated annealing (VFSA) inversion has been applied to seismic inversion. Here we address some of the limitations of VFSA by developing a new stochastic inference method, named greedy annealed importance sampling (GAIS). GAIS combines VFSA and greedy importance sampling (GIS), which uses a greedy search in the important regions located by VFSA, in order to attain fast convergence and provide unbiased estimation. We demonstrate the performance of GAIS with application to seismic inversion of field post- and pre-stack datasets. The results indicate that GAIS can improve lateral continuity of the inverted impedance profiles and provide better estimation of uncertainties than using VFSA alone. Thus this new hybrid method combining global and local optimization methods can be applied in seismic reservoir characterization and reservoir monitoring for accurate estimation of reservoir models and their uncertainties.

  7. An improved adaptive sampling and experiment design method for aerodynamic optimization

    Institute of Scientific and Technical Information of China (English)

    Huang Jiangtao; Gao Zhenghong; Zhou Zhu; Zhao Ke

    2015-01-01

    The experiment design method is key to constructing a highly reliable surrogate model for numerical optimization in large-scale projects. Within the method, the experimental design criterion directly affects the accuracy of the surrogate model and the optimization efficiency. To address the shortcomings of traditional experimental design, an improved adaptive sampling method is proposed in this paper. The surrogate model is first constructed from basic sparse samples. Then the supplementary sampling position is detected according to specified criteria, which introduce energy-function and curvature sampling criteria based on a radial basis function (RBF) network. The sampling detection criteria consider both the uniformity of the sample distribution and the description of hypersurface curvature, so as to significantly improve the prediction accuracy of the surrogate model with far fewer samples. For a surrogate model constructed with sparse samples, sample uniformity is an important factor for interpolation accuracy in the initial stage of adaptive sampling and surrogate model training. As uniformity improves, the curvature description of the objective function surface gradually becomes more important. In consideration of these issues, a crowdness enhance function and a root mean square error (RMSE) feedback function are introduced in the C criterion expression. Thus, a new sampling method called RMSE and crowdness enhance (RCE) adaptive sampling is established. The validity of the RCE adaptive sampling method is studied first on typical test functions and then on the airfoil/wing aerodynamic optimization design problem, which has a high-dimensional design space. The results show that the RCE adaptive sampling method not only reduces the required number of samples but also effectively improves the prediction accuracy of the surrogate model, and it has broad prospects for applications.
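
    The RCE criterion itself is not spelled out in the abstract, so the sketch below shows only a generic one-dimensional adaptive refinement loop in the same spirit: fit an RBF surrogate, then place the next sample where a combined distance (uniformity) and curvature score is largest. The objective, kernel width and scoring rule are illustrative assumptions, not the paper's method.

```python
import numpy as np

def f(x):                                   # "expensive" objective to emulate
    return np.sin(3 * x) + 0.5 * x**2

def fit_rbf(X, y, length=0.3):
    # Gaussian RBF interpolant with a small jitter for numerical stability.
    Phi = np.exp(-(X[:, None] - X[None, :])**2 / (2 * length**2))
    w = np.linalg.solve(Phi + 1e-10 * np.eye(len(X)), y)
    return lambda q: np.exp(-(q[:, None] - X[None, :])**2 / (2 * length**2)) @ w

X = np.linspace(-2, 2, 5)                   # basic sparse samples
y = f(X)
for _ in range(10):
    surrogate = fit_rbf(X, y)
    cand = np.linspace(-2, 2, 400)
    pred = surrogate(cand)
    curvature = np.abs(np.gradient(np.gradient(pred, cand), cand))
    spacing = np.min(np.abs(cand[:, None] - X[None, :]), axis=1)
    score = spacing * (1.0 + curvature)     # trade off uniformity vs. curvature
    x_new = cand[np.argmax(score)]
    X, y = np.append(X, x_new), np.append(y, f(x_new))
print(np.sort(X))
```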

  8. Adaptation of a Digitally Predistorted RF Amplifier Using Selective Sampling

    Institute of Scientific and Technical Information of China (English)

    R. Neil Braithwaite

    2011-01-01

    In this paper, a reduced-cost method of measuring residual nonlinearities in an adaptive digitally predistorted amplifier is proposed. Measurements obtained by selective sampling of the amplifier output are integrated over the input envelope range to adapt a fourth-order polynomial predistorter with memory correction. Results for a WCDMA input with a 101 carrier configuration show that a transmitter using the proposed method can meet the adjacent channel leakage ratio (ACLR) specification. Inverse modeling of the nonlinearity is proposed as a future extension that will reduce the cost of the system further.

  9. Importance Sampling for Failure Probabilities in Computing and Data Transmission

    DEFF Research Database (Denmark)

    Asmussen, Søren

    We study efficient simulation algorithms for estimating P(X > x), where X is the total time of a job with ideal time T that needs to be restarted after a failure. The main tool is importance sampling where one tries to identify a good importance distribution via an asymptotic description of the c...... the computational effort is taken into account. To resolve this problem, an alternative algorithm using two-sided Lundberg bounds is suggested....

  10. Importance sampling for failure probabilities in computing and data transmission

    DEFF Research Database (Denmark)

    Asmussen, Søren

    2009-01-01

    In this paper we study efficient simulation algorithms for estimating P(X > x), where X is the total time of a job with ideal time T that needs to be restarted after a failure. The main tool is importance sampling, where a good importance distribution is identified via an asymptotic description...... these asymptotic descriptions have bounded relative error as x→∞ when combined with the ideas used for a fixed t. Nevertheless, we give examples of algorithms carefully designed to enjoy bounded relative error that may provide little or no asymptotic improvement over crude Monte Carlo simulation when...

  11. Joint importance sampling of low-order volumetric scattering

    DEFF Research Database (Denmark)

    Georgiev, Iliyan; Křivánek, Jaroslav; Hachisuka, Toshiya

    2013-01-01

    Central to all Monte Carlo-based rendering algorithms is the construction of light transport paths from the light sources to the eye. Existing rendering approaches sample path vertices incrementally when constructing these light transport paths. The resulting probability density is thus a product...... of the conditional densities of each local sampling step, constructed without explicit control over the form of the final joint distribution of the complete path. We analyze why current incremental construction schemes often lead to high variance in the presence of participating media, and reveal...... that such approaches are an unnecessary legacy inherited from traditional surface-based rendering algorithms. We devise joint importance sampling of path vertices in participating media to construct paths that explicitly account for the product of all scattering and geometry terms along a sequence of vertices instead...

  12. Importance of accurate sampling techniques in microbiological diagnosis of endophthalmitis

    OpenAIRE

    Banu A.; Sriprakash KS; Nagaraj ER; Meundi M

    2011-01-01

    Background: Endophthalmitis is an ocular emergency and bacteria are the commonest aetiological agents of infectious endophthalmitis. Any delay in treatment will result in serious complications like complete loss of vision. Therefore, obtaining the most appropriate sample is of paramount importance for a microbiologist to identify the aetiological agents that help the ophthalmologist in planning treatment. Objective: This study was undertaken to determine the intraocular specimen that is most likely to yie...

  13. Variance Analysis and Adaptive Sampling for Indirect Light Path Reuse

    Institute of Scientific and Technical Information of China (English)

    Hao Qin; Xin Sun; Jun Yan; Qi-Ming Hou; Zhong Ren; Kun Zhou

    2016-01-01

    In this paper, we study the estimation variance of a set of global illumination algorithms based on indirect light path reuse. These algorithms usually contain two passes — in the first pass, a small number of indirect light samples are generated and evaluated, and they are then reused by a large number of reconstruction samples in the second pass. Our analysis shows that the covariance of the reconstruction samples dominates the estimation variance under high reconstruction rates and increasing the reconstruction rate cannot effectively reduce the covariance. We also find that the covariance represents to what degree the indirect light samples are reused during reconstruction. This analysis motivates us to design a heuristic approximating the covariance as well as an adaptive sampling scheme based on this heuristic to reduce the rendering variance. We validate our analysis and adaptive sampling scheme in the indirect light field reconstruction algorithm and the axis-aligned filtering algorithm for indirect lighting. Experiments are in accordance with our analysis and show that rendering artifacts can be greatly reduced at a similar computational cost.

  14. Importance of accurate sampling techniques in microbiological diagnosis of endophthalmitis

    Directory of Open Access Journals (Sweden)

    Banu A

    2011-05-01

    Full Text Available Background: Endophthalmitis is an ocular emergency and bacteria are the commonest aetiological agents of infectious endophthalmitis. Any delay in treatment will result in serious complications like complete loss of vision. Therefore, obtaining the most appropriate sample is of paramount importance for a microbiologist to identify the aetiological agents that help the ophthalmologist in planning treatment. Objective: This study was undertaken to determine the intraocular specimen that is most likely to yield a positive culture on microbiological examination. Methods: From 60 cases, intraocular samples were collected in the operation theatre under anaesthesia. The samples obtained were aqueous humour and vitreous humour by vitreous tap, vitreous biopsy or pars plana vitrectomy. The specimens were processed within half an hour, first by inoculating onto culture media and then by direct smear examination with Gram's stain. Results: Eighty samples were obtained from 60 cases, of which most were vitreous fluid (vitreous biopsy/tap + vitrectomy fluid), i.e., 75%. Culture was positive in 88% of vitrectomy fluid as compared to 74% in vitreous tap/biopsy, followed by 20% in aqueous fluid. Conclusions: Vitrectomy fluid appears to be the best sample for culture from clinically diagnosed endophthalmitis cases.

  15. A cheap and quickly adaptable in situ electrical contacting TEM sample holder design.

    Science.gov (United States)

    Börrnert, Felix; Voigtländer, Ralf; Rellinghaus, Bernd; Büchner, Bernd; Rümmeli, Mark H; Lichte, Hannes

    2014-04-01

    In situ electrical characterization of nanostructures inside a transmission electron microscope provides crucial insight into the mechanisms of functioning micro- and nano-electronic devices. Such in situ investigations require specialized sample holders. A simple, affordable, yet flexible design is important; in particular, when sample geometries change, a holder should be adaptable with minimum effort. Atomic resolution imaging is standard nowadays, so a sample holder must ensure this capability. A sample holder design for on-chip samples is presented that fulfils these requisites. On-chip sample devices have the advantage that they can be manufactured via standard fabrication routes.

  16. SAChES: Scalable Adaptive Chain-Ensemble Sampling.

    Energy Technology Data Exchange (ETDEWEB)

    Swiler, Laura Painton [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ray, Jaideep [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Ebeida, Mohamed Salah [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Huang, Maoyi [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hou, Zhangshuan [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Bao, Jie [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Ren, Huiying [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2017-08-01

    We present the development of a parallel Markov Chain Monte Carlo (MCMC) method called SAChES, Scalable Adaptive Chain-Ensemble Sampling. This capability is targeted at Bayesian calibration of computationally expensive simulation models. SAChES involves a hybrid of two methods: Differential Evolution Monte Carlo followed by Adaptive Metropolis. Both methods involve parallel chains. Differential evolution allows one to explore high-dimensional parameter spaces using loosely coupled (i.e., largely asynchronous) chains. Loose coupling allows the use of large chain ensembles, with far more chains than the number of parameters to explore. This reduces the per-chain sampling burden and enables high-dimensional inversions and the use of computationally expensive forward models. The large number of chains can also ameliorate the impact of silent errors, which may affect only a few chains. The chain ensemble can also be sampled to provide an initial condition when an aberrant chain is re-spawned. Adaptive Metropolis takes the best points from the differential evolution and efficiently hones in on the posterior density. The multitude of chains in SAChES is leveraged to (1) enable efficient exploration of the parameter space; and (2) ensure robustness to silent errors which may be unavoidable in extreme-scale computational platforms of the future. This report outlines SAChES, describes four papers that are the result of the project, and discusses some additional results.

  17. An importance sampling algorithm for estimating extremes of perpetuity sequences

    DEFF Research Database (Denmark)

    Collamore, Jeffrey F.

    2012-01-01

    In a wide class of problems in insurance and financial mathematics, it is of interest to study the extremal events of a perpetuity sequence. This paper addresses the problem of numerically evaluating these rare event probabilities. Specifically, an importance sampling algorithm is described which...... is efficient in the sense that it exhibits bounded relative error, and which is optimal in an appropriate asymptotic sense. The main idea of the algorithm is to use a "dual" change of measure, which is applied to an associated Markov chain over a randomly stopped time interval. The algorithm also makes use...

  18. The Importance of Formalizing Computational Models of Face Adaptation Aftereffects

    Science.gov (United States)

    Ross, David A.; Palmeri, Thomas J.

    2016-01-01

    Face adaptation is widely used as a means to probe the neural representations that support face recognition. While the theories that relate face adaptation to behavioral aftereffects may seem conceptually simple, our work has shown that testing computational instantiations of these theories can lead to unexpected results. Instantiating a model of face adaptation not only requires specifying how faces are represented and how adaptation shapes those representations but also specifying how decisions are made, translating hidden representational states into observed responses. Considering the high-dimensionality of face representations, the parallel activation of multiple representations, and the non-linearity of activation functions and decision mechanisms, intuitions alone are unlikely to succeed. If the goal is to understand mechanism, not simply to examine the boundaries of a behavioral phenomenon or correlate behavior with brain activity, then formal computational modeling must be a component of theory testing. To illustrate, we highlight our recent computational modeling of face adaptation aftereffects and discuss how models can be used to understand the mechanisms by which faces are recognized. PMID:27378960

  19. Distributed Database Kriging for Adaptive Sampling (D2 KAS)

    Science.gov (United States)

    Roehm, Dominic; Pavel, Robert S.; Barros, Kipton; Rouet-Leduc, Bertrand; McPherson, Allen L.; Germann, Timothy C.; Junghans, Christoph

    2015-07-01

    We present an adaptive sampling method supplemented by a distributed database and a prediction method for multiscale simulations using the Heterogeneous Multiscale Method. A finite-volume scheme integrates the macro-scale conservation laws for elastodynamics, which are closed by momentum and energy fluxes evaluated at the micro-scale. In the original approach, molecular dynamics (MD) simulations are launched for every macro-scale volume element. Our adaptive sampling scheme replaces a large fraction of costly micro-scale MD simulations with fast table lookup and prediction. The cloud database Redis provides the plain table lookup, and with locality aware hashing we gather input data for our prediction scheme. For the latter we use kriging, which estimates an unknown value and its uncertainty (error) at a specific location in parameter space by using weighted averages of the neighboring points. We find that our adaptive scheme significantly improves simulation performance by a factor of 2.5-25, while retaining high accuracy for various choices of the algorithm parameters.

  20. Adaptive Sampling of Time Series During Remote Exploration

    Science.gov (United States)

    Thompson, David R.

    2012-01-01

    This work deals with the challenge of online adaptive data collection in a time series. A remote sensor or explorer agent adapts its rate of data collection in order to track anomalous events while obeying constraints on time and power. This problem is challenging because the agent has limited visibility (all its datapoints lie in the past) and limited control (it can only decide when to collect its next datapoint). This problem is treated from an information-theoretic perspective, fitting a probabilistic model to collected data and optimizing the future sampling strategy to maximize information gain. The performance characteristics of stationary and nonstationary Gaussian process models are compared. Self-throttling sensors could benefit environmental sensor networks and monitoring as well as robotic exploration. Explorer agents can improve performance by adjusting their data collection rate, preserving scarce power or bandwidth resources during uninteresting times while fully covering anomalous events of interest. For example, a remote earthquake sensor could conserve power by limiting its measurements during normal conditions and increasing its cadence during rare earthquake events. A similar capability could improve sensor platforms traversing a fixed trajectory, such as an exploration rover transect or a deep space flyby. These agents can adapt observation times to improve sample coverage during moments of rapid change. An adaptive sampling approach couples sensor autonomy, instrument interpretation, and sampling. The challenge is addressed as an active learning problem, which already has extensive theoretical treatment in the statistics and machine learning literature. A statistical Gaussian process (GP) model is employed to guide sample decisions that maximize information gain. Nonstationary (e.g., time-varying) covariance relationships permit the system to represent and track local anomalies, in contrast with current GP approaches. Most common GP models
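
    A minimal sketch of the information-gain idea described above, under the simplifying assumption of a stationary squared-exponential Gaussian process with fixed hyperparameters: the next measurement time is the candidate with the largest predictive variance. The nonstationary covariances discussed in the record are not reproduced here; all constants are illustrative.

```python
import numpy as np

def rbf_kernel(a, b, length=1.0, var=1.0):
    return var * np.exp(-0.5 * (a[:, None] - b[None, :])**2 / length**2)

def next_sample_time(t_obs, y_obs, t_cand, noise=1e-3):
    """Pick the candidate time with the largest GP predictive variance, i.e.
    the most informative next measurement under a stationary GP.
    y_obs does not enter the variance for a fixed kernel, but would drive
    hyperparameter re-fitting in a fuller implementation."""
    K = rbf_kernel(t_obs, t_obs) + noise * np.eye(len(t_obs))
    K_star = rbf_kernel(t_cand, t_obs)
    solve = np.linalg.solve(K, K_star.T)
    # Predictive variance: k(t,t) - k_* K^{-1} k_*^T (diagonal only).
    var = rbf_kernel(t_cand, t_cand).diagonal() - np.sum(K_star * solve.T, axis=1)
    return t_cand[np.argmax(var)]

t_obs = np.array([0.0, 1.0, 1.5, 4.0])
y_obs = np.sin(t_obs)
t_cand = np.linspace(0.0, 5.0, 101)
print("next measurement at t =", next_sample_time(t_obs, y_obs, t_cand))
```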

  1. Semigroups and sequential importance sampling for multiway tables

    CERN Document Server

    Yoshida, Ruriko; Wei, Shaoceng; Zhou, Feng; Haws, David

    2011-01-01

    When an interval of integers between the lower bound $l_i$ and the upper bound $u_i$ is the support of the marginal distribution $n_i|(n_{i-1}, ...,n_1)$, Chen et al. (2005) noticed that sampling from the interval at each step, for $n_i$ during a sequential importance sampling (SIS) procedure, always produces a table which satisfies the marginal constraints. However, in general, the interval may not be equal to the support of the marginal distribution. In this case, the SIS procedure may produce tables which do not satisfy the marginal constraints, leading to rejection (Chen et al. 2006). In this paper we consider the uniform distribution as the target distribution. First we show that if we fix the number of rows and columns of the design matrix of the model for contingency tables then there exists a polynomial time algorithm in terms of the input size to sample a table from the set of all tables satisfying all marginals defined by the given model via the SIS procedure without rejection. We then show experimentall...

  2. Adaptive Sampling for Learning Gaussian Processes Using Mobile Sensor Networks

    Directory of Open Access Journals (Sweden)

    Yunfei Xu

    2011-03-01

    Full Text Available This paper presents a novel class of self-organizing sensing agents that adaptively learn an anisotropic, spatio-temporal Gaussian process using noisy measurements and move in order to improve the quality of the estimated covariance function. This approach is based on a class of anisotropic covariance functions of Gaussian processes introduced to model a broad range of spatio-temporal physical phenomena. The covariance function is assumed to be unknown a priori. Hence, it is estimated by the maximum a posteriori probability (MAP) estimator. The prediction of the field of interest is then obtained based on the MAP estimate of the covariance function. An optimal sampling strategy is proposed to minimize the information-theoretic cost function of the Fisher Information Matrix. Simulation results demonstrate the effectiveness and the adaptability of the proposed scheme.

  3. Improving Wang-Landau sampling with adaptive windows.

    Science.gov (United States)

    Cunha-Netto, A G; Caparica, A A; Tsai, Shan-Ho; Dickman, Ronald; Landau, D P

    2008-11-01

    Wang-Landau sampling (WLS) of large systems requires dividing the energy range into "windows" and joining the results of simulations in each window. The resulting density of states (and associated thermodynamic functions) is shown to suffer from boundary effects in simulations of lattice polymers and the five-state Potts model. Here, we implement WLS using adaptive windows. Instead of defining fixed energy windows (or windows in the energy-magnetization plane for the Potts model), the boundary positions depend on the set of energy values on which the histogram is flat at a given stage of the simulation. Shifting the windows each time the modification factor f is reduced, we eliminate border effects that arise in simulations using fixed windows. Adaptive windows extend significantly the range of system sizes that may be studied reliably using WLS.

  5. Adaptation pathways of global wheat production: Importance of strategic adaptation to climate change.

    Science.gov (United States)

    Tanaka, Akemi; Takahashi, Kiyoshi; Masutomi, Yuji; Hanasaki, Naota; Hijioka, Yasuaki; Shiogama, Hideo; Yamanaka, Yasuhiro

    2015-09-16

    Agricultural adaptation is necessary to reduce the negative impacts of climate change on crop yields and to maintain food production. However, few studies have assessed the course of adaptation along with the progress of climate change in each of the current major food producing countries. Adaptation pathways, which describe the temporal sequences of adaptations, are helpful for illustrating the timing and intensity of the adaptation required. Here we present adaptation pathways in the current major wheat-producing countries, based on sequential introduction of the minimum adaptation measures necessary to maintain current wheat yields through the 21st century. We considered two adaptation options: (i) expanding irrigation infrastructure; and (ii) switching crop varieties and developing new heat-tolerant varieties. We find that the adaptation pathways differ markedly among the countries. The adaptation pathways are sensitive to both the climate model uncertainty and natural variability of the climate system, and the degree of sensitivity differs among countries. Finally, the negative impacts of climate change could be moderated by implementing adaptations steadily according to forecasts of the necessary future adaptations, as compared to missing the appropriate timing to implement adaptations.

  6. Gap processing for adaptive maximal Poisson-disk sampling

    KAUST Repository

    Yan, Dongming

    2013-09-01

    In this article, we study the generation of maximal Poisson-disk sets with varying radii. First, we present a geometric analysis of gaps in such disk sets. This analysis is the basis for maximal and adaptive sampling in Euclidean space and on manifolds. Second, we propose efficient algorithms and data structures to detect gaps and update gaps when disks are inserted, deleted, moved, or when their radii are changed. We build on the concepts of regular triangulations and the power diagram. Third, we show how our analysis contributes to the state-of-the-art in surface remeshing. © 2013 ACM.
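
    The gap-detection and power-diagram machinery of the article is beyond a short example, but the target object, a Poisson-disk set with spatially varying radii, can be illustrated with naive dart throwing: accept a candidate point only if it lies outside the (pairwise averaged) disks of all previously accepted points. The radius function and candidate budget are illustrative, and the result is not guaranteed to be maximal.

```python
import numpy as np

rng = np.random.default_rng(4)

def radius(p):
    # Spatially varying disk radius: denser sampling near the domain centre.
    return 0.02 + 0.08 * np.linalg.norm(p - 0.5)

def adaptive_poisson_disk(n_candidates=20000):
    """Naive dart throwing on the unit square: accept a candidate only if it
    respects the (averaged) disk radii of all previously accepted points."""
    accepted = []
    for _ in range(n_candidates):
        p = rng.random(2)
        if all(np.linalg.norm(p - q) >= 0.5 * (radius(p) + radius(q))
               for q in accepted):
            accepted.append(p)
    return np.array(accepted)

points = adaptive_poisson_disk()
print(len(points), "points accepted")
```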

  7. Importance sampling. I. Computing multimodel p values in linkage analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kong, A.; Frigge, M.; Irwin, M.; Cox, N. (Univ. of Chicago, IL (United States))

    1992-12-01

    In linkage analysis, when the lod score is maximized over multiple genetic models, standard asymptotic approximation of the significance level does not apply. Monte Carlo methods can be used to estimate the p value, but procedures currently used are extremely inefficient. The authors propose a Monte Carlo procedure based on the concept of importance sampling, which can be thousands of times more efficient than current procedures. With a reasonable amount of computing time, extremely accurate estimates of the p values can be obtained. Both theoretical results and an example of maturity-onset diabetes of the young (MODY) are presented to illustrate the efficiency performance of their method. Relations between single-model and multimodel p values are explored. The new procedure is also used to investigate the performance of asymptotic approximations in a single model situation. 22 refs., 6 figs., 1 tab.

  8. Stochastic approximation Monte Carlo importance sampling for approximating exact conditional probabilities

    KAUST Repository

    Cheon, Sooyoung

    2013-02-16

    Importance sampling and Markov chain Monte Carlo methods have been used in exact inference for contingency tables for a long time, however, their performances are not always very satisfactory. In this paper, we propose a stochastic approximation Monte Carlo importance sampling (SAMCIS) method for tackling this problem. SAMCIS is a combination of adaptive Markov chain Monte Carlo and importance sampling, which employs the stochastic approximation Monte Carlo algorithm (Liang et al., J. Am. Stat. Assoc., 102(477):305-320, 2007) to draw samples from an enlarged reference set with a known Markov basis. Compared to the existing importance sampling and Markov chain Monte Carlo methods, SAMCIS has a few advantages, such as fast convergence, ergodicity, and the ability to achieve a desired proportion of valid tables. The numerical results indicate that SAMCIS can outperform the existing importance sampling and Markov chain Monte Carlo methods: It can produce much more accurate estimates in much shorter CPU time than the existing methods, especially for the tables with high degrees of freedom. © 2013 Springer Science+Business Media New York.

  9. Semigroups and sequential importance sampling for multiway tables and beyond

    CERN Document Server

    Xi, Jing; Zhou, Feng; Yoshida, Ruriko; Haws, David

    2011-01-01

    When an interval of integers between the lower bound l_i and the upper bound u_i is the support of the marginal distribution n_i|(n_{i-1}, ...,n_1), Chen et al. (2005) noticed that sampling from the interval at each step, for n_i during the sequential importance sampling (SIS) procedure, always produces a table which satisfies the marginal constraints. However, in general, the interval may not be equal to the support of the marginal distribution. In this case, the SIS procedure may produce tables which do not satisfy the marginal constraints, leading to rejection [Chen et al. 2006]. Rejecting tables is computationally expensive and incorrect proposal distributions result in biased estimators for the number of tables given their marginal sums. This paper has two foci: (1) we propose a correction coefficient which corrects an interval of integers between the lower bound l_i and the upper bound u_i to the support of the marginal distribution asymptotically even with rejections and with the same time complexity ...

  10. Elucidating Microbial Adaptation Dynamics via Autonomous Exposure and Sampling

    Science.gov (United States)

    Grace, Joseph M.; Verseux, Cyprien; Gentry, Diana; Moffet, Amy; Thayabaran, Ramanen; Wong, Nathan; Rothschild, Lynn

    2013-01-01

    The adaptation of micro-organisms to their environments is a complex process of interaction between the pressures of the environment and of competition. Reducing this multifactorial process to environmental exposure in the laboratory is a common tool for elucidating individual mechanisms of evolution, such as mutation rates. Although such studies inform fundamental questions about the way adaptation and even speciation occur, they are often limited by labor-intensive manual techniques. Current methods for controlled study of microbial adaptation limit the length of time, the depth of collected data, and the breadth of applied environmental conditions. Small idiosyncrasies in manual techniques can have large effects on outcomes; for example, there are significant variations in induced radiation resistances following similar repeated exposure protocols. We describe here a project under development to allow rapid cycling of multiple types of microbial environmental exposure. The system allows continuous autonomous monitoring and data collection of both single species and sampled communities, independently and concurrently providing multiple types of controlled environmental pressure (temperature, radiation, chemical presence or absence, and so on) to a microbial community in dynamic response to the ecosystem's current status. When combined with DNA sequencing and extraction, such a controlled environment can cast light on microbial functional development, population dynamics, inter- and intra-species competition, and microbe-environment interaction. The project's goal is to allow rapid, repeatable iteration of studies of both natural and artificial microbial adaptation. As an example, the same system can be used both to increase the pH of a wet soil aliquot over time while periodically sampling it for genetic activity analysis, or to repeatedly expose a culture of bacteria to the presence of a toxic metal, automatically adjusting the level of toxicity based on the

  11. Elucidating Microbial Adaptation Dynamics via Autonomous Exposure and Sampling

    Science.gov (United States)

    Grace, J. M.; Verseux, C.; Gentry, D.; Moffet, A.; Thayabaran, R.; Wong, N.; Rothschild, L.

    2013-12-01

    The adaptation of micro-organisms to their environments is a complex process of interaction between the pressures of the environment and of competition. Reducing this multifactorial process to environmental exposure in the laboratory is a common tool for elucidating individual mechanisms of evolution, such as mutation rates[Wielgoss et al., 2013]. Although such studies inform fundamental questions about the way adaptation and even speciation occur, they are often limited by labor-intensive manual techniques[Wassmann et al., 2010]. Current methods for controlled study of microbial adaptation limit the length of time, the depth of collected data, and the breadth of applied environmental conditions. Small idiosyncrasies in manual techniques can have large effects on outcomes; for example, there are significant variations in induced radiation resistances following similar repeated exposure protocols[Alcántara-Díaz et al., 2004; Goldman and Travisano, 2011]. We describe here a project under development to allow rapid cycling of multiple types of microbial environmental exposure. The system allows continuous autonomous monitoring and data collection of both single species and sampled communities, independently and concurrently providing multiple types of controlled environmental pressure (temperature, radiation, chemical presence or absence, and so on) to a microbial community in dynamic response to the ecosystem's current status. When combined with DNA sequencing and extraction, such a controlled environment can cast light on microbial functional development, population dynamics, inter- and intra-species competition, and microbe-environment interaction. The project's goal is to allow rapid, repeatable iteration of studies of both natural and artificial microbial adaptation. As an example, the same system can be used both to increase the pH of a wet soil aliquot over time while periodically sampling it for genetic activity analysis, or to repeatedly expose a culture of

  12. Consistent Adjoint Driven Importance Sampling using Space, Energy and Angle

    Energy Technology Data Exchange (ETDEWEB)

    Peplow, Douglas E. [ORNL; Mosher, Scott W [ORNL; Evans, Thomas M [ORNL

    2012-08-01

    For challenging radiation transport problems, hybrid methods combine the accuracy of Monte Carlo methods with the global information present in deterministic methods. One of the most successful hybrid methods is CADIS (Consistent Adjoint Driven Importance Sampling). This method uses a deterministic adjoint solution to construct a biased source distribution and consistent weight windows to optimize a specific tally in a Monte Carlo calculation. The method has been implemented into transport codes using just the spatial and energy information from the deterministic adjoint and has been used in many applications to compute tallies with much higher figures-of-merit than analog calculations. CADIS also outperforms user-supplied importance values, which usually take long periods of user time to develop. This work extends CADIS to develop weight windows that are a function of the position, energy, and direction of the Monte Carlo particle. Two types of consistent source biasing are presented: one method that biases the source in space and energy while preserving the original directional distribution and one method that biases the source in space, energy, and direction. Seven simple example problems are presented which compare the use of the standard space/energy CADIS with the new space/energy/angle treatments.

  13. Sampling Random Bioinformatics Puzzles using Adaptive Probability Distributions

    DEFF Research Database (Denmark)

    Have, Christian Theil; Appel, Emil Vincent; Bork-Jensen, Jette

    2016-01-01

    We present a probabilistic logic program to generate an educational puzzle that introduces the basic principles of next generation sequencing, gene finding and the translation of genes to proteins following the central dogma in biology. In the puzzle, a secret "protein word" must be found...... by assembling DNA from fragments (reads), locating a gene in this sequence and translating the gene to a protein. Sampling using this program generates random instances of the puzzle, but it is possible to constrain the difficulty and to customize the secret protein word. Because of these constraints...... and the randomness of the generation process, sampling may fail to generate a satisfactory puzzle. To avoid failure we employ a strategy using adaptive probabilities which change in response to previous steps of the generative process, thus minimizing the risk of failure....

  14. The Importance of Introductory Statistics Students Understanding Appropriate Sampling Techniques

    Science.gov (United States)

    Menil, Violeta C.

    2005-01-01

    In this paper the author discusses the meaning of sampling, the reasons for sampling, the Central Limit Theorem, and the different techniques of sampling. Practical and relevant examples are given to make the appropriate sampling techniques understandable to students of Introductory Statistics courses. With a thorough knowledge of sampling…

  15. Adaptive Sampling-Based Information Collection for Wireless Body Area Networks.

    Science.gov (United States)

    Xu, Xiaobin; Zhao, Fang; Wang, Wendong; Tian, Hui

    2016-08-31

    To collect important health information, WBAN applications typically sense data at a high frequency. However, limited by the quality of the wireless link, the uploading of sensed data has an upper frequency limit. To reduce the upload frequency, most existing WBAN data collection approaches collect data with a tolerable error. These approaches can guarantee the precision of the collected data, but they cannot ensure that the upload frequency stays within the limit. Some traditional sampling-based approaches can control the upload frequency directly; however, they usually suffer a high loss of information. Since the core task of WBAN applications is to collect health information, this paper aims to collect optimized information under the upload frequency limitation. The importance of sensed data is defined according to information theory for the first time. Information-aware adaptive sampling is proposed to collect uniformly distributed data. We then propose Adaptive Sampling-based Information Collection (ASIC), which consists of two algorithms. An adaptive sampling probability algorithm computes sampling probabilities for different sensed values, and a multiple uniform sampling algorithm provides uniform sampling for values in different intervals. Experiments based on a real dataset show that the proposed approach performs better in terms of data coverage and information quantity. The parameter analysis shows the optimized parameter settings, and the discussion explains the underlying reason for the high performance of the proposed approach.
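
    A hedged sketch of the information-aware idea: give rare sensed values a higher inclusion probability (inversely proportional to the empirical frequency of their value interval) so that the uploaded data are closer to uniform across intervals while respecting an upload budget. The bin count, budget and heart-rate example are illustrative assumptions; this is not the ASIC algorithm itself.

```python
import numpy as np

rng = np.random.default_rng(5)

def sampling_probabilities(values, bins=10, budget=0.2):
    """Assign each observation an inclusion probability inversely proportional
    to the empirical frequency of its value interval, scaled so that the
    expected fraction of uploaded observations matches the budget."""
    hist, edges = np.histogram(values, bins=bins)
    idx = np.clip(np.digitize(values, edges[1:-1]), 0, bins - 1)
    raw = 1.0 / hist[idx]                   # rare values get higher probability
    probs = raw * (budget * len(values) / raw.sum())
    return np.clip(probs, 0.0, 1.0)

sensed = np.concatenate([rng.normal(70, 2, 5000),    # mostly routine heart rates
                         rng.normal(110, 5, 100)])   # rare, informative episodes
p = sampling_probabilities(sensed)
uploaded = sensed[rng.random(len(sensed)) < p]
print(f"uploaded {len(uploaded)} of {len(sensed)} readings")
```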

  16. Sample-adaptive-prediction for HEVC SCC intra coding with ridge estimation from spatially neighboring samples

    Science.gov (United States)

    Kang, Je-Won; Ryu, Soo-Kyung

    2017-02-01

    In this paper, a sample-adaptive prediction technique is proposed to yield efficient coding performance in intra coding for screen content video coding. The sample-based prediction reduces spatial redundancies among neighboring samples. To this end, the proposed technique uses a weighted linear combination of neighboring samples and applies a robust optimization technique, namely ridge estimation, to derive the weights on the decoder side. Ridge estimation uses an L2-norm-based regularization term, and thus the solution is more robust to high-variance samples, such as the sharp edges and high color contrasts exhibited in screen content videos. Experimental results demonstrate that the proposed technique provides an improved coding gain compared to the HEVC screen content video coding reference software.
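
    The closed-form ridge estimate at the heart of this technique is easy to illustrate. The sketch below uses a toy one-dimensional signal in place of the 2-D reconstructed pixel neighbourhoods used in the actual codec; the regularization weight and neighbourhood size are illustrative assumptions.

```python
import numpy as np

def ridge_weights(A, y, lam=1.0):
    """Closed-form ridge estimate of the linear prediction weights:
    w = (A^T A + lam I)^{-1} A^T y."""
    d = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(d), A.T @ y)

# Toy 1-D "reconstructed neighbourhood": each row holds the two samples to the
# left of a known sample; the target is that sample itself. A decoder can build
# the same matrices from already-decoded samples, so no weights are transmitted.
signal = np.array([10, 10, 12, 30, 31, 30, 12, 10, 10, 11], dtype=float)
A = np.column_stack([signal[1:-1], signal[:-2]])    # two previous samples
y = signal[2:]
w = ridge_weights(A, y, lam=2.0)
prediction_for_next = w @ np.array([signal[-1], signal[-2]])
print("weights:", w, "predicted next sample:", prediction_for_next)
```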

  17. Efficiently sampling conformations and pathways using the concurrent adaptive sampling (CAS) algorithm

    Science.gov (United States)

    Ahn, Surl-Hee; Grate, Jay W.; Darve, Eric F.

    2017-08-01

    Molecular dynamics simulations are useful in obtaining thermodynamic and kinetic properties of bio-molecules, but they are limited by the time scale barrier. That is, we may not obtain properties efficiently because we need to run microseconds or longer simulations using femtosecond time steps. To overcome this time scale barrier, we can use the weighted ensemble (WE) method, a powerful enhanced sampling method that efficiently samples thermodynamic and kinetic properties. However, the WE method requires an appropriate partitioning of phase space into discrete macrostates, which can be problematic when we have a high-dimensional collective space or when little is known a priori about the molecular system. Hence, we developed a new WE-based method, called the "Concurrent Adaptive Sampling (CAS) algorithm," to tackle these issues. The CAS algorithm is not constrained to use only one or two collective variables, unlike most reaction coordinate-dependent methods. Instead, it can use a large number of collective variables and adaptive macrostates to enhance the sampling in the high-dimensional space. This is especially useful for systems in which we do not know what the right reaction coordinates are, in which case we can use many collective variables to sample conformations and pathways. In addition, a clustering technique based on the committor function is used to accelerate sampling the slowest process in the molecular system. In this paper, we introduce the new method and show results from two-dimensional models and bio-molecules, specifically penta-alanine and a triazine trimer.

  18. Peptide Backbone Sampling Convergence with the Adaptive Biasing Force Algorithm

    Science.gov (United States)

    Faller, Christina E.; Reilly, Kyle A.; Hills, Ronald D.; Guvench, Olgun

    2013-01-01

    Complete Boltzmann sampling of reaction coordinates in biomolecular systems continues to be a challenge for unbiased molecular dynamics simulations. A growing number of methods have been developed for applying biases to biomolecular systems to enhance sampling while enabling recovery of the unbiased (Boltzmann) distribution of states. The Adaptive Biasing Force (ABF) algorithm is one such method, and works by canceling out the average force along the desired reaction coordinate(s) using an estimate of this force progressively accumulated during the simulation. Upon completion of the simulation, the potential of mean force, and therefore Boltzmann distribution of states, is obtained by integrating this average force. In an effort to characterize the expected performance in applications such as protein loop sampling, ABF was applied to the full ranges of the Ramachandran ϕ/ψ backbone dihedral reaction coordinates for dipeptides of the 20 amino acids using all-atom explicit-water molecular dynamics simulations. Approximately half of the dipeptides exhibited robust and rapid convergence of the potential of mean force as a function of ϕ/ψ in triplicate 50-ns simulations, while the remainder exhibited varying degrees of less complete convergence. The greatest difficulties in achieving converged ABF sampling were seen in the branched-sidechain amino acids threonine and valine, as well as the special case of proline. Proline dipeptide sampling was further complicated by trans-to-cis peptide bond isomerization not observed in unbiased control molecular dynamics simulations. Overall, the ABF method was found to be a robust means of sampling the entire ϕ/ψ reaction coordinate for the 20 amino acids, including high free-energy regions typically inaccessible in standard molecular dynamics simulations. PMID:23215032
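
    A minimal one-dimensional toy version of ABF, assuming overdamped Langevin dynamics on a double-well potential with the coordinate itself as the reaction coordinate: the instantaneous force is accumulated per bin, the bias cancels the running mean force, and integrating the mean force recovers the potential of mean force. Real applications, as in the record, use all-atom MD and ramp the bias in gradually once enough samples accumulate; the potential and constants here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(6)

def dU(x):                      # double-well potential U(x) = (x^2 - 1)^2
    return 4 * x * (x**2 - 1)   # dU/dx

nbins, lo, hi = 60, -2.0, 2.0
edges = np.linspace(lo, hi, nbins + 1)
force_sum = np.zeros(nbins)     # running sums of the instantaneous force per bin
count = np.zeros(nbins)

x, dt, beta = -1.0, 1e-3, 1.0
for step in range(200_000):
    b = min(nbins - 1, max(0, int((x - lo) / (hi - lo) * nbins)))
    f_inst = -dU(x)             # instantaneous force along the coordinate
    force_sum[b] += f_inst
    count[b] += 1
    bias = -force_sum[b] / count[b]          # ABF: cancel the running mean force
    # Overdamped Langevin step with the biased force (no ramp-up, for brevity).
    x += (f_inst + bias) * dt + np.sqrt(2 * dt / beta) * rng.normal()
    x = min(hi, max(lo, x))                  # reflecting walls at the boundaries

mean_force = np.where(count > 0, force_sum / np.maximum(count, 1), 0.0)
pmf = -np.cumsum(mean_force) * (hi - lo) / nbins   # integrate -<F> for the PMF
pmf -= pmf.min()
print(np.round(pmf[::10], 2))
```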

  19. Adaptive sampling for nonlinear dimensionality reduction based on manifold learning

    DEFF Research Database (Denmark)

    Franz, Thomas; Zimmermann, Ralf; Goertz, Stefan

    2017-01-01

    We make use of the non-intrusive dimensionality reduction method Isomap in order to emulate nonlinear parametric flow problems that are governed by the Reynolds-averaged Navier-Stokes equations. Isomap is a manifold learning approach that provides a low-dimensional embedding space...... that is approximately isometric to the manifold that is assumed to be formed by the high-fidelity Navier-Stokes flow solutions under smooth variations of the inflow conditions. The focus of the work at hand is the adaptive construction and refinement of the Isomap emulator: We exploit the non-Euclidean Isomap metric...... to detect and fill up gaps in the sampling in the embedding space. The performance of the proposed manifold filling method will be illustrated by numerical experiments, where we consider nonlinear parameter-dependent steady-state Navier-Stokes flows in the transonic regime....

  20. Temporally adaptive sampling: a case study in rare species survey design with marbled salamanders (Ambystoma opacum).

    Directory of Open Access Journals (Sweden)

    Noah D Charney

    Full Text Available Improving detection rates for elusive species with clumped distributions is often accomplished through adaptive sampling designs. This approach can be extended to include species with temporally variable detection probabilities. By concentrating survey effort in years when the focal species are most abundant or visible, overall detection rates can be improved. This requires either long-term monitoring at a few locations where the species are known to occur or models capable of predicting population trends using climatic and demographic data. For marbled salamanders (Ambystoma opacum) in Massachusetts, we demonstrate that annual variation in detection probability of larvae is regionally correlated. In our data, the difference in survey success between years was far more important than the difference among the three survey methods we employed: diurnal surveys, nocturnal surveys, and dipnet surveys. Based on these data, we simulate future surveys to locate unknown populations under a temporally adaptive sampling framework. In the simulations, when pond dynamics are correlated over the focal region, the temporally adaptive design improved mean survey success by as much as 26% over a non-adaptive sampling design. Employing a temporally adaptive strategy costs very little, is simple, and has the potential to substantially improve the efficient use of scarce conservation funds.

  1. Temporally adaptive sampling: a case study in rare species survey design with marbled salamanders (Ambystoma opacum).

    Science.gov (United States)

    Charney, Noah D; Kubel, Jacob E; Eiseman, Charles S

    2015-01-01

    Improving detection rates for elusive species with clumped distributions is often accomplished through adaptive sampling designs. This approach can be extended to include species with temporally variable detection probabilities. By concentrating survey effort in years when the focal species are most abundant or visible, overall detection rates can be improved. This requires either long-term monitoring at a few locations where the species are known to occur or models capable of predicting population trends using climatic and demographic data. For marbled salamanders (Ambystoma opacum) in Massachusetts, we demonstrate that annual variation in detection probability of larvae is regionally correlated. In our data, the difference in survey success between years was far more important than the difference among the three survey methods we employed: diurnal surveys, nocturnal surveys, and dipnet surveys. Based on these data, we simulate future surveys to locate unknown populations under a temporally adaptive sampling framework. In the simulations, when pond dynamics are correlated over the focal region, the temporally adaptive design improved mean survey success by as much as 26% over a non-adaptive sampling design. Employing a temporally adaptive strategy costs very little, is simple, and has the potential to substantially improve the efficient use of scarce conservation funds.

  2. Multi-Scaling Sampling: An Adaptive Sampling Method for Discovering Approximate Association Rules

    Institute of Scientific and Technical Information of China (English)

    Cai-Yan Jia; Xie-Ping Gao

    2005-01-01

    One of the obstacles to efficient association rule mining is the explosive expansion of data sets, since it is costly or impossible to scan large databases, especially multiple times. A popular solution to improve the speed and scalability of association rule mining is to run the algorithm on a random sample instead of the entire database. However, how to effectively define and efficiently estimate the degree of error with respect to the outcome of the algorithm, and how to determine the sample size needed, have remained open research questions. In this paper, an effective and efficient algorithm is given based on PAC (Probably Approximately Correct) learning theory to measure and estimate the sample error. Then, a new adaptive, online, fast sampling strategy, multi-scaling sampling, is presented, inspired by MRA (Multi-Resolution Analysis) and the Shannon sampling theorem, for quickly obtaining acceptably approximate association rules at an appropriate sample size. Both theoretical analysis and empirical study have shown that the sampling strategy can achieve a very good speed-accuracy trade-off.
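
    A worked example of the kind of PAC-style bound alluded to above, here a plain Hoeffding bound for the support of a single itemset rather than the paper's multi-scaling scheme; epsilon and delta are illustrative.

```python
import math

def pac_sample_size(eps, delta):
    """Hoeffding-style bound: number of transactions needed so that the
    empirical support of a single itemset deviates from its true support
    by more than eps with probability at most delta."""
    return math.ceil(math.log(2.0 / delta) / (2.0 * eps**2))

# Estimate itemset supports to within 1% with 99% confidence.
print(pac_sample_size(eps=0.01, delta=0.01))   # ~26,492 transactions
```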

  3. The importance of cooling of urine samples for doping analysis

    NARCIS (Netherlands)

    Kuenen, J. Gijs; Konings, Wil N.

    2010-01-01

    Storing and transporting of urine samples for doping analysis, as performed by the anti-doping organizations associated with the World Anti-Doping Agency, does not include a specific protocol for cooled transport from the place of urine sampling to the doping laboratory, although low cost cooling fa

  4. The importance of cooling of urine samples for doping analysis

    NARCIS (Netherlands)

    Kuenen, J.G.; Konings, W.N.

    2009-01-01

    Storing and transporting of urine samples for doping analysis, as performed by the anti-doping organizations associated with the World Anti-Doping Agency, does not include a specific protocol for cooled transport from the place of urine sampling to the doping laboratory, although low cost cooling fa

  5. Monte Carlo importance sampling for the MCNP™ general source

    Energy Technology Data Exchange (ETDEWEB)

    Lichtenstein, H.

    1996-01-09

    Research was performed to develop an importance sampling procedure for a radiation source. The procedure was developed for the MCNP radiation transport code, but the approach itself is general and can be adapted to other Monte Carlo codes. The procedure, as adapted to MCNP, relies entirely on existing MCNP capabilities. It has been tested for very complex descriptions of a general source, in the context of the design of spent-reactor-fuel storage casks. Dramatic improvements in calculation efficiency have been observed in some test cases. In addition, the procedure has been found to provide an acceleration to acceptable convergence, as well as the benefit of quickly identifying user-specified variance reduction in the transport that effects unstable convergence.
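
    The MCNP-specific machinery is not reproduced here; the sketch below only illustrates the generic principle of source importance sampling that such a procedure relies on: draw source particles from a biased distribution concentrated near the region of interest and carry a statistical weight p/q so the estimate stays unbiased. The 1-D attenuation "transport", densities and constants are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)

# True source: uniform on [0, 10] cm. The tally of interest sits beyond a thick
# shield at x = 10, so particles born near x = 10 matter most. Bias the source
# linearly toward large x and compensate with statistical weights p/q.
n = 200_000
u = rng.random(n)
x = 10.0 * np.sqrt(u)                 # biased positions, density q(x) = x/50
p = np.full(n, 1.0 / 10.0)            # true source density
q = x / 50.0                          # biasing density
weight = p / q                        # statistical weight carried by each history

# Toy "transport": exponential attenuation from the birth position to x = 10.
mu = 1.5                              # attenuation coefficient, 1/cm
score = weight * np.exp(-mu * (10.0 - x))
print("biased estimate :", score.mean())
# Analog reference: sample x uniformly and score without weights.
xa = 10.0 * rng.random(n)
print("analog estimate :", np.mean(np.exp(-mu * (10.0 - xa))))
```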

  6. Importance of eccentric actions in performance adaptations to resistance training

    Science.gov (United States)

    Dudley, Gary A.; Miller, Bruce J.; Buchanan, Paul; Tesch, Per A.

    1991-01-01

    The importance of eccentric (ecc) muscle actions in resistance training for the maintenance of muscle strength and mass in hypogravity was investigated in experiments in which human subjects, divided into three groups, were asked to perform four to five sets of 6 to 12 repetitions (rep) per set of three leg press and leg extension exercises, 2 days each week for 19 weeks. One group, labeled 'con', performed each rep with only concentric (con) actions, while group con/ecc performed each rep with both con and ecc actions; the third group, con/con, performed twice as many sets with only con actions. Control subjects did not train. It was found that resistance training with both con and ecc actions induced greater increases in muscle strength than did training with only con actions.

  7. Efficiently Sampling Conformations and Pathways Using the Concurrent Adaptive Sampling (CAS) Algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Ahn, Surl Hee; Grate, Jay W.; Darve, Eric F.

    2017-08-21

    Molecular dynamics (MD) simulations are useful in obtaining thermodynamic and kinetic properties of bio-molecules but are limited by the timescale barrier, i.e., we may be unable to efficiently obtain properties because we need to run microseconds or longer simulations using femtosecond time steps. While there are several existing methods to overcome this timescale barrier and efficiently sample thermodynamic and/or kinetic properties, problems remain in regard to being able to sample unknown systems, deal with high-dimensional space of collective variables, and focus the computational effort on slow timescales. Hence, a new sampling method, called the "Concurrent Adaptive Sampling (CAS) algorithm," has been developed to tackle these three issues and efficiently obtain conformations and pathways. The method is not constrained to use only one or two collective variables, unlike most reaction coordinate-dependent methods. Instead, it can use a large number of collective variables and uses macrostates (a partition of the collective variable space) to enhance the sampling. The exploration is done by running a large number of short simulations, and a clustering technique is used to accelerate the sampling. In this paper, we introduce the new methodology and show results from two-dimensional models and bio-molecules, such as penta-alanine and triazine polymer

  8. Parallel importance sampling in conditional linear Gaussian networks

    DEFF Research Database (Denmark)

    Salmerón, Antonio; Ramos-López, Darío; Borchani, Hanen

    2015-01-01

    In this paper we analyse the problem of probabilistic inference in CLG networks when evidence comes in streams. In such situations, fast and scalable algorithms, able to provide accurate responses in a short time are required. We consider the instantiation of variational inference and importance ...

  9. Adaptive behavior in autism: Minimal clinically important differences on the Vineland-II.

    Science.gov (United States)

    Chatham, C H; Taylor, K I; Charman, T; Liogier D'ardhuy, X; Eule, E; Fedele, A; Hardan, A Y; Loth, E; Murtagh, L; Del Valle Rubido, M; San Jose Caceres, A; Sevigny, J; Sikich, L; Snyder, L; Tillmann, J E; Ventola, P E; Walton-Bowen, K L; Wang, P P; Willgoss, T; Bolognani, F

    2017-09-21

    Autism Spectrum Disorder (ASD) is associated with persistent impairments in adaptive abilities across multiple domains. These social, personal, and communicative impairments become increasingly pronounced with development, and are present regardless of IQ. The Vineland Adaptive Behavior Scales, Second Edition (Vineland-II) is the most commonly used instrument for quantifying these impairments, but minimal clinically important differences (MCIDs) on Vineland-II scores have not been rigorously established in ASD. We pooled data from several consortia/registries (EU-AIMS LEAP study, ABIDE-I, ABIDE-II, INFOR, Simons Simplex Collection and Autism Treatment Network [ATN]) and clinical investigations and trials (Stanford, Yale, Roche) resulting in a data set of over 9,000 individuals with ASD. Two approaches were used to estimate MCIDs: distribution-based methods and anchor-based methods. Distribution-based MCID [d-MCID] estimates included the standard error of the measurement, as well as one-fifth and one-half of the covariate-adjusted standard deviation (both cross-sectionally and longitudinally). Anchor-based MCID [a-MCID] estimates include the slope of linear regression of clinician ratings of severity on the Vineland-II score, the slope of linear regression of clinician ratings of longitudinal improvement category on Vineland-II change, the Vineland-II change score maximally differentiating clinical impressions of minimal versus no improvement, and equipercentile equating. Across strata, the Vineland-II Adaptive Behavior Composite standardized score MCID estimates range from 2.01 to 3.2 for distribution-based methods, and from 2.42 to 3.75 for sample-size-weighted anchor-based methods. Lower Vineland-II standardized score MCID estimates were observed for younger and more cognitively impaired populations. These MCID estimates enable users of Vineland-II to assess both the statistical and clinical significance of any observed change. Autism Res 2017. © 2017
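
    The distribution-based estimates mentioned above reduce to simple formulas. The short sketch below computes the standard error of measurement and the one-fifth and one-half SD benchmarks for made-up summary statistics; the SD and reliability values are illustrative assumptions, not the study's pooled data.

```python
import math

# Illustrative inputs (assumed values, not the study's pooled data).
sd_baseline = 10.5        # covariate-adjusted SD of Vineland-II composite scores
reliability = 0.93        # assumed test-retest reliability coefficient

sem = sd_baseline * math.sqrt(1.0 - reliability)   # standard error of measurement
mcid_fifth_sd = 0.2 * sd_baseline                  # one-fifth of the SD
mcid_half_sd = 0.5 * sd_baseline                   # one-half of the SD

print(f"SEM-based d-MCID estimate: {sem:.2f}")
print(f"0.2 * SD d-MCID estimate:  {mcid_fifth_sd:.2f}")
print(f"0.5 * SD d-MCID estimate:  {mcid_half_sd:.2f}")
```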

  10. Importance Nested Sampling and the MultiNest Algorithm

    CERN Document Server

    Feroz, F; Cameron, E; Pettitt, A N

    2013-01-01

    Bayesian inference involves two main computational challenges. First, in estimating the parameters of some model for the data, the posterior distribution may well be highly multi-modal: a regime in which the convergence to stationarity of traditional Markov Chain Monte Carlo (MCMC) techniques becomes incredibly slow. Second, in selecting between a set of competing models the necessary estimation of the Bayesian evidence for each is, by definition, a (possibly high-dimensional) integration over the entire parameter space; again this can be a daunting computational task, although new Monte Carlo (MC) integration algorithms offer solutions of ever increasing efficiency. Nested sampling (NS) is one such contemporary MC strategy targeted at calculation of the Bayesian evidence, but which also enables posterior inference as a by-product, thereby allowing simultaneous parameter estimation and model selection. The widely-used MultiNest algorithm presents a particularly efficient implementation of the NS technique for...
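
    For readers unfamiliar with nested sampling, the toy Python loop below shows how the evidence accumulates from likelihood-ordered live points and a geometrically shrinking prior volume. It uses plain rejection sampling for the constrained prior draws and a 1-D Gaussian likelihood, so it is only a sketch of the textbook NS scheme, not of MultiNest's ellipsoidal machinery; all numbers are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy 1-D problem: uniform prior on [-5, 5], Gaussian likelihood (illustrative only).
def log_likelihood(theta):
    return -0.5 * (theta / 0.5) ** 2 - 0.5 * np.log(2 * np.pi * 0.5 ** 2)

n_live, n_iter = 200, 1000
live = rng.uniform(-5, 5, n_live)
live_logL = log_likelihood(live)
log_Z = -np.inf
log_width = np.log(1.0 - np.exp(-1.0 / n_live))   # width of the first prior-volume shell

for i in range(n_iter):
    worst = np.argmin(live_logL)
    log_Z = np.logaddexp(log_Z, log_width + live_logL[worst])
    # Replace the worst live point with a new prior draw at higher likelihood
    # (plain rejection here; MultiNest uses ellipsoidal decomposition instead).
    while True:
        cand = rng.uniform(-5, 5)
        if log_likelihood(cand) > live_logL[worst]:
            break
    live[worst], live_logL[worst] = cand, log_likelihood(cand)
    log_width -= 1.0 / n_live                     # prior volume shrinks every iteration

# Termination: add the contribution of the remaining live points.
log_Z = np.logaddexp(log_Z, -n_iter / n_live + np.log(np.mean(np.exp(live_logL))))
print(f"log-evidence ≈ {log_Z:.3f} (analytic ≈ {np.log(0.1):.3f})")
```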

  11. Job performance ratings : The relative importance of mental ability, conscientiousness, and career adaptability

    NARCIS (Netherlands)

    Ohme, Melanie; Zacher, Hannes

    2015-01-01

    According to career construction theory, continuous adaptation to the work environment is crucial to achieve work and career success. In this study, we examined the relative importance of career adaptability for job performance ratings using an experimental policy-capturing design. Employees (N = 13

  12. Sampling technique is important for optimal isolation of pharyngeal gonorrhoea.

    Science.gov (United States)

    Mitchell, M; Rane, V; Fairley, C K; Whiley, D M; Bradshaw, C S; Bissessor, M; Chen, M Y

    2013-11-01

    Culture is insensitive for the detection of pharyngeal gonorrhoea but isolation is pivotal to antimicrobial resistance surveillance. The aim of this study was to ascertain whether recommendations provided to clinicians (doctors and nurses) on pharyngeal swabbing technique could improve gonorrhoea detection rates and to determine which aspects of swabbing technique are important for optimal isolation. This study was undertaken at the Melbourne Sexual Health Centre, Australia. Detection rates among clinicians for pharyngeal gonorrhoea were compared before (June 2006-May 2009) and after (June 2009-June 2012) recommendations on swabbing technique were provided. Associations between detection rates and reported swabbing technique obtained via a clinician questionnaire were examined. The overall yield from testing before and after provision of the recommendations among 28 clinicians was 1.6% (134/8586) and 1.8% (264/15,046) respectively (p=0.17). Significantly higher detection rates were seen following the recommendations among clinicians who reported a change in their swabbing technique in response to the recommendations (2.1% vs. 1.5%; p=0.004), swabbing a larger surface area (2.0% vs. 1.5%; p=0.02), applying more swab pressure (2.5% vs. 1.5%; p<0.001) and a change in the anatomical sites they swabbed (2.2% vs. 1.5%; p=0.002). The predominant change in sites swabbed was an increase in swabbing of the oropharynx: from a median of 0% to 80% of the time. More thorough swabbing improves the isolation of pharyngeal gonorrhoea using culture. Clinicians should receive training to ensure swabbing is performed with sufficient pressure and that it covers an adequate area that includes the oropharynx.

  13. State-independent importance sampling for random walks with regularly varying increments

    Directory of Open Access Journals (Sweden)

    Karthyek R. A. Murthy

    2015-03-01

    Full Text Available We develop importance sampling based efficient simulation techniques for three commonly encountered rare event probabilities associated with random walks having i.i.d. regularly varying increments; namely, (1) the large deviation probabilities, (2) the level crossing probabilities, and (3) the level crossing probabilities within a regenerative cycle. Exponential twisting based state-independent methods, which are effective in efficiently estimating these probabilities for light-tailed increments, are not applicable when the increments are heavy-tailed. To address the latter case, more complex and elegant state-dependent efficient simulation algorithms have been developed in the literature over the last few years. We propose that by suitably decomposing these rare event probabilities into a dominant and further residual components, simpler state-independent importance sampling algorithms can be devised for each component resulting in composite unbiased estimators with desirable efficiency properties. When the increments have infinite variance, there is an added complexity in estimating the level crossing probabilities as even the well known zero-variance measures have an infinite expected termination time. We adapt our algorithms so that this expectation is finite while the estimators remain strongly efficient. Numerically, the proposed estimators perform at least as well, and sometimes substantially better than the existing state-dependent estimators in the literature.
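
    The sketch below illustrates only the basic change-of-measure identity that all such estimators share: increments are drawn from a heavier-tailed proposal and the likelihood ratio is carried as a weight. It is not the decomposition-based, provably efficient estimators developed in the record above, and the Pareto indices, walk length, and level are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

def pareto_pdf(x, alpha):
    # Pareto(alpha) density on [1, inf).
    return alpha * x ** (-alpha - 1.0)

def sample_pareto(alpha, size):
    # Inverse-CDF sampling: F(x) = 1 - x**(-alpha).
    return (1.0 - rng.random(size)) ** (-1.0 / alpha)

n, b = 10, 500.0               # walk length and (rare) level
alpha_f, alpha_g = 2.5, 1.2    # original and heavier-tailed proposal tail indices
n_rep = 200_000

x = sample_pareto(alpha_g, (n_rep, n))                          # proposal increments
log_w = np.sum(np.log(pareto_pdf(x, alpha_f))
               - np.log(pareto_pdf(x, alpha_g)), axis=1)        # likelihood ratios
hit = x.sum(axis=1) > b
estimate = np.mean(np.exp(log_w) * hit)                         # unbiased IS estimate

# For comparison, plain Monte Carlo typically sees few or no hits at this level.
naive = np.mean(sample_pareto(alpha_f, (n_rep, n)).sum(axis=1) > b)
print(f"IS estimate    P(S_n > b) ≈ {estimate:.3e}")
print(f"naive estimate P(S_n > b) ≈ {naive:.3e}")
```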

  14. Adaptive cluster sampling: An efficient method for assessing inconspicuous species

    Science.gov (United States)

    Andrea M. Silletti; Joan Walker

    2003-01-01

    Restorationists typically evaluate the success of a project by estimating the population sizes of species that have been planted or seeded. Because total census is rarely feasible, they must rely on sampling methods for population estimates. However, traditional random sampling designs may be inefficient for species that, for one reason or another, are challenging to...

  15. Sampling plant diversity and rarity at landscape scales: importance of sampling time in species detectability.

    Directory of Open Access Journals (Sweden)

    Jian Zhang

    Full Text Available Documenting and estimating species richness at regional or landscape scales has been a major emphasis for conservation efforts, as well as for the development and testing of evolutionary and ecological theory. Rarely, however, are sampling efforts assessed on how they affect detection and estimates of species richness and rarity. In this study, vascular plant richness was sampled in 356 quarter hectare time-unlimited survey plots in the boreal region of northeast Alberta. These surveys consisted of 15,856 observations of 499 vascular plant species (97 considered to be regionally rare) collected by 12 observers over a 2 year period. Average survey time for each quarter-hectare plot was 82 minutes, ranging from 20 to 194 minutes, with a positive relationship between total survey time and total plant richness. When survey time was limited to a 20-minute search, as in other Alberta biodiversity methods, 61 species were missed. Extending the survey time to 60 minutes reduced the number of missed species to 20, while a 90-minute cut-off time resulted in the loss of 8 species. When surveys were separated by habitat type, 60 minutes of search effort sampled nearly 90% of total observed richness for all habitats. Relative to rare species, time-unlimited surveys had ∼ 65% higher rare plant detections post-20 minutes than during the first 20 minutes of the survey. Although exhaustive sampling was attempted, observer bias was noted among observers when a subsample of plots was re-surveyed by different observers. Our findings suggest that sampling time, combined with sample size and observer effects, should be considered in landscape-scale plant biodiversity surveys.

  16. Species ecological similarity modulates the importance of colonization history for adaptive radiation.

    Science.gov (United States)

    Tan, Jiaqi; Yang, Xian; Jiang, Lin

    2017-06-01

    Adaptive radiation is an important evolutionary process, through which a single ancestral lineage rapidly gives rise to multiple newly formed lineages that specialize in different niches. In the first-arrival hypothesis, David Lack emphasized the importance of species colonization history for adaptive radiation, suggesting that the earlier arrival of a diversifying species would allow it to radiate to a greater extent. Here, we report on the first rigorous experimental test of this hypothesis, using the rapidly evolving bacterium Pseudomonas fluorescens SBW25 and six different bacterial competitors. We show that the earlier arrival of P. fluorescens facilitated its diversification. Nevertheless, significant effects of colonization history, which led to alternative diversification trajectories, were observed only when the competitors shared similar niche and competitive fitness with P. fluorescens. These results highlight the important role of species colonization history, modified by their ecological differences, for adaptive radiation. © 2017 The Author(s). Evolution © 2017 The Society for the Study of Evolution.

  17. Adaptive sampling based on the cumulative distribution function of order statistics to delineate heavy-metal contaminated soils using kriging.

    Science.gov (United States)

    Juang, Kai-Wei; Lee, Dar-Yuan; Teng, Yun-Lung

    2005-11-01

    Correctly classifying "contaminated" areas in soils, based on the threshold for a contaminated site, is important for determining effective clean-up actions. Pollutant mapping by means of kriging is increasingly being used for the delineation of contaminated soils. However, those areas where the kriged pollutant concentrations are close to the threshold have a high possibility for being misclassified. In order to reduce the misclassification due to the over- or under-estimation from kriging, an adaptive sampling using the cumulative distribution function of order statistics (CDFOS) was developed to draw additional samples for delineating contaminated soils, while kriging. A heavy-metal contaminated site in Hsinchu, Taiwan was used to illustrate this approach. The results showed that compared with random sampling, adaptive sampling using CDFOS reduced the kriging estimation errors and misclassification rates, and thus would appear to be a better choice than random sampling, as additional sampling is required for delineating the "contaminated" areas.

  18. Robust Fusion of Irregularly Sampled Data Using Adaptive Normalized Convolution

    NARCIS (Netherlands)

    Pham, T.Q.; Van Vliet, L.J.; Schutte, K.

    2006-01-01

    We present a novel algorithm for image fusion from irregularly sampled data. The method is based on the framework of normalized convolution (NC), in which the local signal is approximated through a projection onto a subspace. The use of polynomial basis functions in this paper makes NC equivalent to

  19. Robust Fusion of Irregularly Sampled Data Using Adaptive Normalized Convolution

    NARCIS (Netherlands)

    Pham, T.Q.; Vliet, L.J. van; Schutte, K.

    2006-01-01

    We present a novel algorithm for image fusion from irregularly sampled data. The method is based on the framework of normalized convolution (NC), in which the local signal is approximated through a projection onto a subspace. The use of polynomial basis functions in this paper makes NC equivalent to

  20. Climate variables explain neutral and adaptive variation within salmonid metapopulations: the importance of replication in landscape genetics.

    Science.gov (United States)

    Hand, Brian K; Muhlfeld, Clint C; Wade, Alisa A; Kovach, Ryan P; Whited, Diane C; Narum, Shawn R; Matala, Andrew P; Ackerman, Michael W; Garner, Brittany A; Kimball, John S; Stanford, Jack A; Luikart, Gordon

    2016-02-01

    Understanding how environmental variation influences population genetic structure is important for conservation management because it can reveal how human stressors influence population connectivity, genetic diversity and persistence. We used riverscape genetics modelling to assess whether climatic and habitat variables were related to neutral and adaptive patterns of genetic differentiation (population-specific and pairwise FST ) within five metapopulations (79 populations, 4583 individuals) of steelhead trout (Oncorhynchus mykiss) in the Columbia River Basin, USA. Using 151 putatively neutral and 29 candidate adaptive SNP loci, we found that climate-related variables (winter precipitation, summer maximum temperature, winter highest 5% flow events and summer mean flow) best explained neutral and adaptive patterns of genetic differentiation within metapopulations, suggesting that climatic variation likely influences both demography (neutral variation) and local adaptation (adaptive variation). However, we did not observe consistent relationships between climate variables and FST across all metapopulations, underscoring the need for replication when extrapolating results from one scale to another (e.g. basin-wide to the metapopulation scale). Sensitivity analysis (leave-one-population-out) revealed consistent relationships between climate variables and FST within three metapopulations; however, these patterns were not consistent in two metapopulations likely due to small sample sizes (N = 10). These results provide correlative evidence that climatic variation has shaped the genetic structure of steelhead populations and highlight the need for replication and sensitivity analyses in land and riverscape genetics.

  1. Climate variables explain neutral and adaptive variation within salmonid metapopulations: The importance of replication in landscape genetics

    Science.gov (United States)

    Hand, Brian K; Muhlfeld, Clint C.; Wade, Alisa A.; Kovach, Ryan; Whited, Diane C.; Narum, Shawn R.; Matala, Andrew P; Ackerman, Michael W.; Garner, B. A.; Kimball, John S; Stanford, Jack A.; Luikart, Gordon

    2016-01-01

    Understanding how environmental variation influences population genetic structure is important for conservation management because it can reveal how human stressors influence population connectivity, genetic diversity and persistence. We used riverscape genetics modelling to assess whether climatic and habitat variables were related to neutral and adaptive patterns of genetic differentiation (population-specific and pairwise FST) within five metapopulations (79 populations, 4583 individuals) of steelhead trout (Oncorhynchus mykiss) in the Columbia River Basin, USA. Using 151 putatively neutral and 29 candidate adaptive SNP loci, we found that climate-related variables (winter precipitation, summer maximum temperature, winter highest 5% flow events and summer mean flow) best explained neutral and adaptive patterns of genetic differentiation within metapopulations, suggesting that climatic variation likely influences both demography (neutral variation) and local adaptation (adaptive variation). However, we did not observe consistent relationships between climate variables and FST across all metapopulations, underscoring the need for replication when extrapolating results from one scale to another (e.g. basin-wide to the metapopulation scale). Sensitivity analysis (leave-one-population-out) revealed consistent relationships between climate variables and FST within three metapopulations; however, these patterns were not consistent in two metapopulations likely due to small sample sizes (N = 10). These results provide correlative evidence that climatic variation has shaped the genetic structure of steelhead populations and highlight the need for replication and sensitivity analyses in land and riverscape genetics.

  2. Adaptive Sampling for WSAN Control Applications Using Artificial Neural Networks

    OpenAIRE

    2012-01-01

    Wireless sensor actuator networks are becoming a solution for control applications. Reliable data transmission and real time constraints are the most significant challenges. Control applications will have some Quality of Service (QoS) requirements from the sensor network, such as minimum delay and guaranteed delivery of packets. We investigate a variable sampling method to mitigate the effects of time delays in wireless networked control systems using an observer based control system model. Our...

  3. Adapting sampling plans to caribou distribution on calving grounds

    Directory of Open Access Journals (Sweden)

    Michel Crête

    1991-10-01

    Full Text Available Between 1984 and 1988, the size of the two caribou herds in northern Québec was derived by combining estimates of female numbers on calving grounds in June and composition counts during rut in autumn. Sampling with aerial photos was conducted on calving grounds to determine the number of animals per km2, telemetry served to estimate the proportion of females in the census area at the time of photography in addition to summer survival rate, and helicopter or ground observations were used for composition counts. Observers were able to detect on black and white negatives over 95 percent of caribou counted from a helicopter flying at low altitude over the same area; photo scale varied between 1:3 600 and 1:6 000. Sampling units covering less than 15-20 ha were the best for sampling caribou distribution on calving grounds, where density generally averaged ≈ 10 individuals/km2. Around 90 percent of caribou on calving grounds were females; others were mostly yearling males. During the 1-2 day photographic census, 64 to 77 percent of the females were present on the calving areas. Summer survival exceeded 95 percent in three summers. In autumn, females composed between 45 and 54 percent of each herd. The Rivière George herd was estimated at 682 000 individuals (± 36%; alpha = 0.10) in 1988. This estimate was imprecise due to insufficient sample size for measuring animal density on the calving ground and for determining proportion of females on the calving ground at the time of the photo census. To improve precision and reduce cost, it is proposed to estimate herd size of tundra caribou in one step, using only aerial photos in early June without telemetry.

  4. Long-term dynamics of adaptive evolution in a globally important phytoplankton species to ocean acidification

    Science.gov (United States)

    Schlüter, Lothar; Lohbeck, Kai T.; Gröger, Joachim P.; Riebesell, Ulf; Reusch, Thorsten B. H.

    2016-01-01

    Marine phytoplankton may adapt to ocean change, such as acidification or warming, because of their large population sizes and short generation times. Long-term adaptation to novel environments is a dynamic process, and phenotypic change can take place thousands of generations after exposure to novel conditions. We conducted a long-term evolution experiment (4 years = 2100 generations), starting with a single clone of the abundant and widespread coccolithophore Emiliania huxleyi exposed to three different CO2 levels simulating ocean acidification (OA). Growth rates as a proxy for Darwinian fitness increased only moderately under both levels of OA [+3.4% and +4.8%, respectively, at 1100 and 2200 μatm partial pressure of CO2 (Pco2)] relative to control treatments (ambient CO2, 400 μatm). Long-term adaptation to OA was complex, and initial phenotypic responses of ecologically important traits were later reverted. The biogeochemically important trait of calcification, in particular, that had initially been restored within the first year of evolution was later reduced to levels lower than the performance of nonadapted populations under OA. Calcification was not constitutively lost but returned to control treatment levels when high CO2–adapted isolates were transferred back to present-day control CO2 conditions. Selection under elevated CO2 exacerbated a general decrease of cell sizes under long-term laboratory evolution. Our results show that phytoplankton may evolve complex phenotypic plasticity that can affect biogeochemically important traits, such as calcification. Adaptive evolution may play out over longer time scales (>1 year) in an unforeseen way under future ocean conditions that cannot be predicted from initial adaptation responses. PMID:27419227

  5. The relative power of genome scans to detect local adaptation depends on sampling design and statistical method.

    Science.gov (United States)

    Lotterhos, Katie E; Whitlock, Michael C

    2015-03-01

    Although genome scans have become a popular approach towards understanding the genetic basis of local adaptation, the field still does not have a firm grasp on how sampling design and demographic history affect the performance of genome scans on complex landscapes. To explore these issues, we compared 20 different sampling designs in equilibrium (i.e. island model and isolation by distance) and nonequilibrium (i.e. range expansion from one or two refugia) demographic histories in spatially heterogeneous environments. We simulated spatially complex landscapes, which allowed us to exploit local maxima and minima in the environment in 'pair' and 'transect' sampling strategies. We compared F(ST) outlier and genetic-environment association (GEA) methods for each of two approaches that control for population structure: with a covariance matrix or with latent factors. We show that while the relative power of two methods in the same category (F(ST) or GEA) depended largely on the number of individuals sampled, overall GEA tests had higher power in the island model and F(ST) had higher power under isolation by distance. In the refugia models, however, these methods varied in their power to detect local adaptation at weakly selected loci. At weakly selected loci, paired sampling designs had equal or higher power than transect or random designs to detect local adaptation. Our results can inform sampling designs for studies of local adaptation and have important implications for the interpretation of genome scans based on landscape data. © 2015 John Wiley & Sons Ltd.

  6. Recruitment of hard-to-reach population subgroups via adaptations of the snowball sampling strategy.

    Science.gov (United States)

    Sadler, Georgia Robins; Lee, Hau-Chen; Lim, Rod Seung-Hwan; Fullerton, Judith

    2010-09-01

    Nurse researchers and educators often engage in outreach to narrowly defined populations. This article offers examples of how variations on the snowball sampling recruitment strategy can be applied in the creation of culturally appropriate, community-based information dissemination efforts related to recruitment to health education programs and research studies. Examples from the primary author's program of research are provided to demonstrate how adaptations of snowball sampling can be used effectively in the recruitment of members of traditionally underserved or vulnerable populations. The adaptation of snowball sampling techniques, as described in this article, helped the authors to gain access to each of the more-vulnerable population groups of interest. The use of culturally sensitive recruitment strategies is both appropriate and effective in enlisting the involvement of members of vulnerable populations. Adaptations of snowball sampling strategies should be considered when recruiting participants for education programs or for research studies when the recruitment of a population-based sample is not essential.

  7. Simulation of a Jackson tandem network using state-dependent importance sampling

    NARCIS (Netherlands)

    Miretskiy, D.I.; Scheinhardt, W.R.W.; Mandjes, M.R.H.

    2008-01-01

    This paper considers importance sampling as a tool for rare-event simulation. The focus is on estimating the probability of overflow in the downstream queue of a Jackson two-node tandem queue. It is known that in this setting ‘traditional’ state-independent importance-sampling distributions perform po

  8. Simulation of a Jackson tandem network using state-dependent importance sampling

    NARCIS (Netherlands)

    Miretskiy, D.I.; Scheinhardt, W.R.W.; Mandjes, M.R.H.

    2008-01-01

    This paper considers importance sampling as a tool for rare-event simulation. The focus is on estimating the probability of overflow in the downstream queue of a Jackson two-node tandem queue. It is known that in this setting `traditional' state-independent importance-sampling distributions perform

  9. Adaptive free energy sampling in multidimensional collective variable space using boxed molecular dynamics.

    Science.gov (United States)

    O'Connor, Mike; Paci, Emanuele; McIntosh-Smith, Simon; Glowacki, David R

    2016-12-22

    The past decade has seen the development of a new class of rare event methods in which molecular configuration space is divided into a set of boundaries/interfaces, and then short trajectories are run between boundaries. For all these methods, an important concern is how to generate boundaries. In this paper, we outline an algorithm for adaptively generating boundaries along a free energy surface in multi-dimensional collective variable (CV) space, building on the boxed molecular dynamics (BXD) rare event algorithm. BXD is a simple technique for accelerating the simulation of rare events and free energy sampling which has proven useful for calculating kinetics and free energy profiles in reactive and non-reactive molecular dynamics (MD) simulations across a range of systems, in both NVT and NVE ensembles. Two key developments outlined in this paper make it possible to automate BXD, and to adaptively map free energy and kinetics in complex systems. First, we have generalized BXD to multidimensional CV space. Using strategies from rigid-body dynamics, we have derived a simple and general velocity-reflection procedure that conserves energy for arbitrary collective variable definitions in multiple dimensions, and show that it is straightforward to apply BXD to sampling in multidimensional CV space so long as the Cartesian gradients ∇CV are available. Second, we have modified BXD to undertake on-the-fly statistical analysis during a trajectory, harnessing the information content latent in the dynamics to automatically determine boundary locations. Such automation not only makes BXD considerably easier to use; it also guarantees optimal boundaries, speeding up convergence. We have tested the multidimensional adaptive BXD procedure by calculating the potential of mean force for a chemical reaction recently investigated using both experimental and computational approaches - i.e., F + CD3CN → DF + D2CN in both the gas phase and a strongly coupled explicit CD3CN solvent
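
    The key ingredient mentioned above, a velocity-reflection rule that conserves energy for an arbitrary collective variable, can be written compactly. The sketch below implements a mass-weighted momentum inversion along the Cartesian gradient of the boundary's collective variable; it is a simplified single-boundary version written from the description above, with made-up masses and gradients, not the authors' code.

```python
import numpy as np

def bxd_reflect(p, grad_phi, inv_mass):
    """Mass-weighted momentum inversion applied when a trajectory hits a
    boundary phi(x) = const.  `p` are Cartesian momenta, `grad_phi` the
    Cartesian gradient of the collective variable defining the boundary,
    and `inv_mass` the inverse masses per degree of freedom.  The update
        p -> p - 2 (grad_phi . M^-1 p) / (grad_phi . M^-1 grad_phi) grad_phi
    reverses the velocity component along grad_phi and leaves the kinetic
    energy unchanged (a sketch assuming a single scalar boundary)."""
    num = np.dot(grad_phi, inv_mass * p)
    den = np.dot(grad_phi, inv_mass * grad_phi)
    return p - 2.0 * (num / den) * grad_phi

# Tiny check that kinetic energy is conserved by the reflection.
rng = np.random.default_rng(4)
masses = rng.uniform(1.0, 16.0, 9)      # e.g. 3 atoms x 3 Cartesian components
p = rng.normal(size=9)
grad_phi = rng.normal(size=9)
p_new = bxd_reflect(p, grad_phi, 1.0 / masses)
ke = lambda mom: 0.5 * np.sum(mom ** 2 / masses)
print(f"KE before = {ke(p):.6f}, KE after = {ke(p_new):.6f}")
```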

  10. Adaptive Biasing Combined with Hamiltonian Replica Exchange to Improve Umbrella Sampling Free Energy Simulations.

    Science.gov (United States)

    Zeller, Fabian; Zacharias, Martin

    2014-02-11

    The accurate calculation of potentials of mean force (PMFs) for ligand-receptor binding is one of the most important applications of molecular simulation techniques. Typically, the separation distance between ligand and receptor is chosen as a reaction coordinate along which a PMF can be calculated with the aid of umbrella sampling (US) techniques. In addition, restraints can be applied on the relative position and orientation of the partner molecules to reduce accessible phase space. An approach combining such phase space reduction with flattening of the free energy landscape and configurational exchanges has been developed, which significantly improves the convergence of PMF calculations in comparison with standard umbrella sampling. The free energy surface along the reaction coordinate is smoothed by iteratively adapting biasing potentials corresponding to previously calculated PMFs. Configurations are allowed to exchange between the umbrella simulation windows via the Hamiltonian replica exchange method. The application to a DNA molecule in complex with a minor groove binding ligand indicates significantly improved convergence and complete reversibility of the sampling along the pathway. The calculated binding free energy is in excellent agreement with experimental results. In contrast, the application of standard US resulted in large differences between PMFs calculated for association and dissociation pathways. The approach could be a useful alternative to standard US for computational studies on biomolecular recognition processes.
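
    The Hamiltonian replica-exchange step referred to above uses a standard Metropolis swap criterion between neighbouring umbrella windows. The fragment below sketches that rule for two harmonic bias potentials on a distance coordinate; the force constant, window centres, and temperature factor are illustrative assumptions, not the authors' setup.

```python
import numpy as np

def exchange_accepted(xi_i, xi_j, bias_i, bias_j, kT=2.494):
    """Metropolis criterion for swapping configurations between umbrella
    windows i and j (standard Hamiltonian replica-exchange rule).  `xi_*`
    are the current reaction-coordinate values and `bias_*` the window
    bias potentials; kT is in kJ/mol (~300 K assumed)."""
    delta = (bias_i(xi_j) + bias_j(xi_i) - bias_i(xi_i) - bias_j(xi_j)) / kT
    return np.random.random() < np.exp(-delta)

# Two harmonic umbrella windows on a distance coordinate (illustrative numbers).
k = 1000.0                                   # kJ/mol/nm^2
bias_1 = lambda xi: 0.5 * k * (xi - 1.00) ** 2
bias_2 = lambda xi: 0.5 * k * (xi - 1.05) ** 2
print(exchange_accepted(1.02, 1.03, bias_1, bias_2))
```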

  11. Parks, people, and change: the importance of multistakeholder engagement in adaptation planning for conserved areas

    Directory of Open Access Journals (Sweden)

    Corrine N. Knapp

    2014-12-01

    Full Text Available Climate change challenges the traditional goals and conservation strategies of protected areas, necessitating adaptation to changing conditions. Denali National Park and Preserve (Denali) in south central Alaska, USA, is a vast landscape that is responding to climate change in ways that will impact both ecological resources and local communities. Local observations help to inform understanding of climate change and adaptation planning, but whose knowledge is most important to consider? For this project we interviewed long-term Denali staff, scientists, subsistence community members, bus drivers, and business owners to assess what types of observations each can contribute, how climate change is impacting each, and what they think the National Park Service should do to adapt. The project shows that each type of long-term observer has different types of observations, but that those who depend more directly on natural resources for their livelihoods have more and different observations than those who do not. These findings suggest that engaging multiple groups of stakeholders who interact with the park in distinct ways adds substantially to the information provided by Denali staff and scientists and offers a broader foundation for adaptation planning. It also suggests that traditional protected area paradigms that fail to learn from and foster appropriate engagement of people may be maladaptive in the context of climate change.

  12. Energy consumption in development countries - effects on economics and compulsory adaptation of oil importers and consumers

    Energy Technology Data Exchange (ETDEWEB)

    Czakainski, M.

    1983-09-01

    Energy supply of developing countries is mainly based on mineral oil products and traditional energy carriers. The oil-importing developing countries were hard hit in their development efforts by the two oil-price surges of 1973/74 and, especially, of 1979/80. Adaptation measures taken by the oil-importing countries of the third world should aim at tapping their own energy reserves, at a mature energy policy in terms of both concepts and instruments, and at reducing the strong dependence of their consumption structures on oil.

  13. A robust adaptive sampling method for faster acquisition of MR images.

    Science.gov (United States)

    Vellagoundar, Jaganathan; Machireddy, Ramasubba Reddy

    2015-06-01

    A robust adaptive k-space sampling method is proposed for faster acquisition and reconstruction of MR images. In this method, undersampling patterns are generated based on the magnitude profile of fully acquired 2-D k-space data. Images are reconstructed using a compressive sampling reconstruction algorithm. Simulation experiments are done to assess the performance of the proposed method under various signal-to-noise ratio (SNR) levels. The performance of the method is better than the non-adaptive variable density sampling method when k-space SNR is greater than 10 dB. The method is implemented on fully acquired multi-slice raw k-space data and quality assurance phantom data. Data reduction of up to 60% is achieved in the multi-slice imaging data and 75% is achieved in the phantom imaging data. The results show that reconstruction accuracy is improved over non-adaptive or conventional variable density sampling methods. The proposed sampling method is signal dependent and the estimation of sampling locations is robust to noise. As a result, it eliminates the necessity of a mathematical model and parameter tuning to compute k-space sampling patterns as required in non-adaptive sampling methods. Copyright © 2015 Elsevier Inc. All rights reserved.

  14. Importance of Interaction between Integrin and Actin Cytoskeleton in Suspension Adaptation of CHO cells.

    Science.gov (United States)

    Walther, Christa G; Whitfield, Robert; James, David C

    2016-04-01

    The biopharmaceutical production process relies upon mammalian cell technology where single cells proliferate in suspension in a chemically defined synthetic environment. This environment lacks exogenous growth factors, usually contributing to proliferation of fibroblastic cell types such as Chinese hamster ovary (CHO) cells. Use of CHO cells for production hence requires a lengthy 'adaptation' process to select clones capable of proliferation as single cells in suspension. The underlying molecular changes permitting proliferation in suspension are not known. Comparison of the non-suspension-adapted clone CHO-AD and a suspension-adapted proprietary cell line CHO-SA by flow cytometric analysis revealed a highly variable bi-modal expression pattern for cell-to-cell contact proteins in contrast to the expression pattern seen for integrins. Those have a uni-modal expression on suspension and adherent cells. Integrins showed a conformation distinguished by regularly distributed clusters forming a sphere on the cell membrane of suspension-adapted cells. Actin cytoskeleton analysis revealed reorganisation from the typical fibrillar morphology found in adherent cells to an enforced spherical subcortical actin sheath in suspension cells. The uni-modal expression and specific clustering of integrins could be confirmed for CHO-S, another suspension cell line. Cytochalasin D treatment resulted in breakdown of the actin sheath and the sphere-like integrin conformation demonstrating the link between integrins and actin in suspension-adapted CHO cells. The data demonstrates the importance of signalling changes, leading to an integrin rearrangement on the cell surface, and the necessity of the reinforcement of the actin cytoskeleton for proliferation in suspension conditions.

  15. Hybrid algorithm of ensemble transform and importance sampling for assimilation of non-Gaussian observations

    Directory of Open Access Journals (Sweden)

    Shin'ya Nakano

    2014-05-01

    Full Text Available A hybrid algorithm that combines the ensemble transform Kalman filter (ETKF) and the importance sampling approach is proposed. Since the ETKF assumes a linear Gaussian observation model, the estimate obtained by the ETKF can be biased in cases with nonlinear or non-Gaussian observations. The particle filter (PF) is based on the importance sampling technique, and is applicable to problems with nonlinear or non-Gaussian observations. However, the PF usually requires an unrealistically large sample size in order to achieve a good estimation, and thus it is computationally prohibitive. In the proposed hybrid algorithm, we obtain a proposal distribution similar to the posterior distribution by using the ETKF. A large number of samples are then drawn from the proposal distribution, and these samples are weighted to approximate the posterior distribution according to the importance sampling principle. Since the importance sampling provides an estimate of the probability density function (PDF) without assuming linearity or Gaussianity, we can resolve the bias due to the nonlinear or non-Gaussian observations. Finally, in the next forecast step, we reduce the sample size to achieve computational efficiency based on the Gaussian assumption, while we use a relatively large number of samples in the importance sampling in order to consider the non-Gaussian features of the posterior PDF. The use of the ETKF is also beneficial in terms of the computational simplicity of generating a number of random samples from the proposal distribution and in weighting each of the samples. The proposed algorithm is not necessarily effective in cases where the ensemble is located far from the true state. However, monitoring the effective sample size and tuning the factor for covariance inflation could resolve this problem. In this paper, the proposed hybrid algorithm is introduced and its performance is evaluated through experiments with non-Gaussian observations.
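
    A minimal sketch of the two-stage idea, assuming a 1-D state and a heavy-tailed observation error (both illustrative): a Gaussian, ETKF-like analysis is used only to build a proposal, and importance weights of the form prior x likelihood / proposal then correct for the non-Gaussian observation. This is not the authors' implementation, which works with ensembles and transform matrices rather than a closed-form Gaussian update.

```python
import numpy as np
from scipy.stats import norm, t as student_t

rng = np.random.default_rng(5)

# Toy 1-D assimilation problem (all numbers are illustrative assumptions):
# Gaussian forecast (prior), but a heavy-tailed, non-Gaussian observation error.
prior_mean, prior_sd = 0.0, 1.0
obs, obs_scale = 2.0, 0.5
log_lik = lambda x: student_t.logpdf(obs - x, df=3, scale=obs_scale)

# Step 1 (ETKF-like analysis): a Gaussian update as if the error were Gaussian,
# used here only to construct the importance-sampling proposal.
gain = prior_sd ** 2 / (prior_sd ** 2 + obs_scale ** 2)
prop_mean = prior_mean + gain * (obs - prior_mean)
prop_sd = np.sqrt((1.0 - gain) * prior_sd ** 2)

# Step 2 (importance sampling): many draws from the proposal, weighted by
# prior * likelihood / proposal to correct for the non-Gaussian observation.
n = 50_000
x = rng.normal(prop_mean, prop_sd, n)
log_w = (norm.logpdf(x, prior_mean, prior_sd) + log_lik(x)
         - norm.logpdf(x, prop_mean, prop_sd))
w = np.exp(log_w - log_w.max())
w /= w.sum()
print(f"posterior mean ≈ {np.sum(w * x):.3f}, "
      f"effective sample size ≈ {1.0 / np.sum(w ** 2):.0f} of {n}")
```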

  16. Statistical evaluation of the New Zealand food safety authority sampling protocol for imported food.

    Science.gov (United States)

    Govindaraju, Kondaswamy; Bebbington, Mark; Wrathall, Thewaporn

    2010-05-01

    The New Zealand Food Safety Authority sampling protocol for compliance inspection of imported food products is evaluated for its ability to provide consumer protection. The sampling protocol involves both partial testing of imported consignments and complete skipping inspection of consignments based on the quality history. The risk posed by the strategies of partial testing and skipping inspection of imports is evaluated using the average outgoing quality limit and other performance measures. The cost dimension of sampling inspection is also considered. Suggestions for improvement, which include tightening the skipping inspection parameters, are made.

  17. RF Sub-sampling Receiver Architecture based on Milieu Adapting Techniques

    DEFF Research Database (Denmark)

    Behjou, Nastaran; Larsen, Torben; Jensen, Ole Kiel

    2012-01-01

    A novel sub-sampling based architecture is proposed which has the ability of reducing the problem of image distortion and improving the signal to noise ratio significantly. The technique is based on sensing the environment and adapting the sampling rate of the receiver to the best possible selection. The proposed technique is applied to an RF sub-sampling receiver and has revealed great improvements in the SNIR of the receiver. Measurements on an experimental sub-sampling receiver show that the presented method provides up to 85.9 dB improvements in the SNIR of the receiver when comparing it for best and worst choice of sampling rate.

  18. Adaptive sampling based on the cumulative distribution function of order statistics to delineate heavy-metal contaminated soils using kriging

    Energy Technology Data Exchange (ETDEWEB)

    Juang, K.-W. [Department of Post-Modern Agriculture, MingDao University, Pitou, Changhua, Taiwan (China); Lee, D.-Y. [Graduate Institute of Agricultural Chemistry, National Taiwan University, Taipei, Taiwan (China)]. E-mail: dylee@ccms.ntu.edu.tw; Teng, Y.-L. [Graduate Institute of Agricultural Chemistry, National Taiwan University, Taipei, Taiwan (China)

    2005-11-15

    Correctly classifying 'contaminated' areas in soils, based on the threshold for a contaminated site, is important for determining effective clean-up actions. Pollutant mapping by means of kriging is increasingly being used for the delineation of contaminated soils. However, those areas where the kriged pollutant concentrations are close to the threshold have a high possibility for being misclassified. In order to reduce the misclassification due to the over- or under-estimation from kriging, an adaptive sampling using the cumulative distribution function of order statistics (CDFOS) was developed to draw additional samples for delineating contaminated soils, while kriging. A heavy-metal contaminated site in Hsinchu, Taiwan was used to illustrate this approach. The results showed that compared with random sampling, adaptive sampling using CDFOS reduced the kriging estimation errors and misclassification rates, and thus would appear to be a better choice than random sampling, as additional sampling is required for delineating the 'contaminated' areas. - A sampling approach was derived for drawing additional samples while kriging.

  19. Iterative Monte Carlo with bead-adapted sampling for complex-time correlation functions

    Science.gov (United States)

    Jadhao, Vikram; Makri, Nancy

    2010-03-01

    In a recent communication [V. Jadhao and N. Makri, J. Chem. Phys. 129, 161102 (2008)], we introduced an iterative Monte Carlo (IMC) path integral methodology for calculating complex-time correlation functions. This method constitutes a stepwise evaluation of the path integral on a grid selected by a Monte Carlo procedure, circumventing the exponential growth of statistical error with increasing propagation time, while realizing the advantageous scaling of importance sampling in the grid selection and integral evaluation. In the present paper, we present an improved formulation of IMC, which is based on a bead-adapted sampling procedure, thus leading to grid point distributions that closely resemble the absolute value of the integrand at each iteration. We show that the statistical error of IMC does not grow upon repeated iteration, in sharp contrast to the performance of the conventional path integral approach which leads to exponential increase in statistical uncertainty. Numerical results on systems with up to 13 degrees of freedom and propagation up to 30 times the "thermal" time ℏβ/2 illustrate these features.

  20. Intraspecific shape variation in horseshoe crabs: the importance of sexual and natural selection for local adaptation

    DEFF Research Database (Denmark)

    Faurby, Søren; Nielsen, Kasper Sauer Kollerup; Bussarawit, Somchai

    2011-01-01

    A morphometric analysis of the body shape of three species of horseshoe crabs was undertaken in order to infer the importance of natural and sexual selection. It was expected that natural selection would be most intense, leading to highest regional differentiation, in the American species Limulus polyphemus, which has the largest climatic differences between different populations. Local adaptation driven by sexual selection was expected in males but not females because horseshoe crab mating behaviour leads to competition between males, but not between females. Three hundred fifty-nine horseshoe crabs...

  1. 40 CFR 80.335 - What gasoline sample retention requirements apply to refiners and importers?

    Science.gov (United States)

    2010-07-01

    Title 40 — Protection of Environment, vol. 16 (2010-07-01): ENVIRONMENTAL PROTECTION AGENCY (CONTINUED), AIR PROGRAMS (CONTINUED), REGULATION OF FUELS AND FUEL ADDITIVES, Gasoline Sulfur, Sampling, Testing and Retention Requirements for Refiners and Importers, § 80.335 What gasoline...

  2. Implementation of time-efficient adaptive sampling function design for improved undersampled MRI reconstruction.

    Science.gov (United States)

    Choi, Jinhyeok; Kim, Hyeonjin

    2016-12-01

    To improve the efficacy of undersampled MRI, a method of designing adaptive sampling functions is proposed that is simple to implement on an MR scanner and yet effectively improves the performance of the sampling functions. An approximation of the energy distribution of an image (E-map) is estimated from highly undersampled k-space data acquired in a prescan and efficiently recycled in the main scan. An adaptive probability density function (PDF) is generated by combining the E-map with a modeled PDF. A set of candidate sampling functions are then prepared from the adaptive PDF, among which the one with maximum energy is selected as the final sampling function. To validate its computational efficiency, the proposed method was implemented on an MR scanner, and its robust performance in Fourier-transform (FT) MRI and compressed sensing (CS) MRI was tested by simulations and in a cherry tomato. The proposed method consistently outperforms the conventional modeled PDF approach for undersampling ratios of 0.2 or higher in both FT-MRI and CS-MRI. To fully benefit from undersampled MRI, it is preferable that the design of adaptive sampling functions be performed online immediately before the main scan. In this way, the proposed method may further improve the efficacy of the undersampled MRI. Copyright © 2016 Elsevier Inc. All rights reserved.
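
    As a rough sketch of the workflow described above (not the authors' implementation), the fragment below forms an energy map from prescan k-space magnitude, blends it with a modeled centre-weighted PDF, draws several candidate undersampling masks from the blend, and keeps the candidate capturing the most k-space energy. The blend weight, phantom, matrix size, and undersampling ratio are assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)

def adaptive_sampling_mask(prescan_mag, undersampling=0.25, n_candidates=20):
    """Sketch of an adaptive-PDF mask design: blend an energy map estimated
    from prescan k-space magnitude with a modeled variable-density PDF, draw
    candidate masks from the blend, and keep the one with maximum energy."""
    ny, nx = prescan_mag.shape
    e_map = prescan_mag / prescan_mag.sum()                     # energy distribution
    ky, kx = np.meshgrid(np.linspace(-1, 1, ny), np.linspace(-1, 1, nx), indexing="ij")
    modeled = np.exp(-4.0 * (ky ** 2 + kx ** 2))                # centre-weighted model PDF
    modeled /= modeled.sum()
    pdf = 0.5 * e_map + 0.5 * modeled                           # adaptive PDF (equal blend assumed)
    n_keep = int(undersampling * ny * nx)
    best_mask, best_energy = None, -np.inf
    for _ in range(n_candidates):
        idx = rng.choice(ny * nx, size=n_keep, replace=False, p=pdf.ravel())
        mask = np.zeros(ny * nx, dtype=bool)
        mask[idx] = True
        energy = prescan_mag.ravel()[mask].sum()
        if energy > best_energy:
            best_mask, best_energy = mask.reshape(ny, nx), energy
    return best_mask

# Illustrative "prescan": magnitude of the FFT of a smooth phantom-like image.
img = np.outer(np.hanning(64), np.hanning(64))
prescan = np.abs(np.fft.fftshift(np.fft.fft2(img)))
mask = adaptive_sampling_mask(prescan)
print("sampled fraction:", mask.mean())
```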

  3. Coalescent: an open-science framework for importance sampling in coalescent theory.

    Science.gov (United States)

    Tewari, Susanta; Spouge, John L

    2015-01-01

    Background. In coalescent theory, computer programs often use importance sampling to calculate likelihoods and other statistical quantities. An importance sampling scheme can exploit human intuition to improve statistical efficiency of computations, but unfortunately, in the absence of general computer frameworks on importance sampling, researchers often struggle to translate new sampling schemes computationally or benchmark against different schemes, in a manner that is reliable and maintainable. Moreover, most studies use computer programs lacking a convenient user interface or the flexibility to meet the current demands of open science. In particular, current computer frameworks can only evaluate the efficiency of a single importance sampling scheme or compare the efficiencies of different schemes in an ad hoc manner. Results. We have designed a general framework (http://coalescent.sourceforge.net; language: Java; License: GPLv3) for importance sampling that computes likelihoods under the standard neutral coalescent model of a single, well-mixed population of constant size over time following the infinite sites model of mutation. The framework models the necessary core concepts, comes integrated with several data sets of varying size, implements the standard competing proposals, and integrates tightly with our previous framework for calculating exact probabilities. For a given dataset, it computes the likelihood and provides the maximum likelihood estimate of the mutation parameter. Well-known benchmarks in the coalescent literature validate the accuracy of the framework. The framework provides an intuitive user interface with minimal clutter. For performance, the framework switches automatically to modern multicore hardware, if available. It runs on three major platforms (Windows, Mac and Linux). Extensive tests and coverage make the framework reliable and maintainable. Conclusions. In coalescent theory, many studies of computational efficiency consider only

  4. Cross-entropy optimisation of importance sampling parameters for statistical model checking

    CERN Document Server

    Jégourel, Cyrille; Sedwards, Sean

    2012-01-01

    Statistical model checking avoids the exponential growth of states associated with probabilistic model checking by estimating properties from multiple executions of a system and by giving results within confidence bounds. Rare properties are often very important but pose a particular challenge for simulation-based approaches, hence a key objective under these circumstances is to reduce the number and length of simulations necessary to produce a given level of confidence. Importance sampling is a well-established technique that achieves this, however to maintain the advantages of statistical model checking it is necessary to find good importance sampling distributions without considering the entire state space. Motivated by the above, we present a simple algorithm that uses the notion of cross-entropy to find the optimal parameters for an importance sampling distribution. In contrast to previous work, our algorithm uses a low dimensional vector of parameters to define this distribution and thus avoids the ofte...
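
    The cross-entropy idea of iteratively re-fitting a low-dimensional parametric importance-sampling distribution can be illustrated on a classic toy problem: estimating the probability that a sum of exponentials exceeds a high level. In the sketch below the proposal mean is updated by the standard weighted-mean CE rule over elite samples; the dimension, level, and sample sizes are illustrative assumptions, and this is not the statistical-model-checking tool described above.

```python
import numpy as np

rng = np.random.default_rng(7)

# Rare event: P(X1 + ... + Xn > gamma) with Xi ~ Exp(mean 1).
n, gamma = 5, 40.0
n_samples, rho = 2_000, 0.1            # samples per iteration, elite fraction

def log_density(x, mean):
    # Log of the joint density of n i.i.d. exponentials with the given mean.
    return np.sum(-x / mean - np.log(mean), axis=1)

v = 1.0                                 # proposal mean, initialised at the nominal value
for it in range(10):                    # cross-entropy iterations
    x = rng.exponential(v, size=(n_samples, n))
    s = x.sum(axis=1)
    level = min(gamma, np.quantile(s, 1.0 - rho))        # intermediate level
    elite = s >= level
    w = np.exp(log_density(x, 1.0) - log_density(x, v))  # likelihood ratios
    # CE update for the exponential family: weighted mean of the elite samples.
    v = np.sum(w[elite][:, None] * x[elite]) / (n * np.sum(w[elite]))
    if level >= gamma:
        break

# Final importance-sampling estimate with the optimised parameter.
x = rng.exponential(v, size=(100_000, n))
w = np.exp(log_density(x, 1.0) - log_density(x, v))
est = np.mean(w * (x.sum(axis=1) > gamma))
print(f"optimised proposal mean ≈ {v:.2f}, P(sum > gamma) ≈ {est:.3e}")
```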

  5. Data reduction in the ITMS system through a data acquisition model with self-adaptive sampling rate

    Energy Technology Data Exchange (ETDEWEB)

    Ruiz, M. [Grupo de Investigacion en Instrumentacion y Acustica Aplicada, Universidad Politecnica de Madrid (UPM), Crta. Valencia Km-7, Madrid 28031 (Spain)], E-mail: mariano.ruiz@upm.es; Lopez, JM.; Arcas, G. de [Grupo de Investigacion en Instrumentacion y Acustica Aplicada, Universidad Politecnica de Madrid (UPM), Crta. Valencia Km-7, Madrid 28031 (Spain); Barrera, E. [Departamento de Sistemas Electronicos y de Control, Universidad Politecnica de Madrid (UPM), Crta. Valencia Km-7, Madrid 28031 (Spain); Melendez, R. [Grupo de Investigacion en Instrumentacion y Acustica Aplicada, Universidad Politecnica de Madrid (UPM), Crta. Valencia Km-7, Madrid 28031 (Spain); Vega, J. [Asociacion EURATOM/CIEMAT para Fusion, Madrid (Spain)

    2008-04-15

    Long pulse or steady state operation of fusion experiments requires data acquisition and processing systems that reduce the volume of data involved. The availability of self-adaptive sampling rate systems and the use of real-time lossless data compression techniques can help solve these problems. The former is important for continuous adaptation of the sampling frequency to experimental requirements. The latter allows the maintenance of continuous digitization under limited memory conditions. This can be achieved by continuous transmission of compressed data to other systems. The compacted transfer ensures the use of minimum bandwidth. This paper presents an implementation based on the intelligent test and measurement system (ITMS), a data acquisition system architecture with multiprocessing capabilities that permits it to adapt the system's sampling frequency throughout the experiment. The sampling rate can be controlled depending on the experiment's specific requirements by using an external dc voltage signal or by defining user events through software. The system takes advantage of the high processing capabilities of the ITMS platform to implement a data reduction mechanism based on lossless data compression algorithms which are themselves based on periodic deltas.

  6. Career Adapt-Abilities Scale in a French-Speaking Swiss Sample: Psychometric Properties and Relationships to Personality and Work Engagement

    Science.gov (United States)

    Rossier, Jerome; Zecca, Gregory; Stauffer, Sarah D.; Maggiori, Christian; Dauwalder, Jean-Pierre

    2012-01-01

    The aim of this study was to analyze the psychometric properties of the Career Adapt-Abilities Scale (CAAS) in a French-speaking Swiss sample and its relationship with personality dimensions and work engagement. The heterogeneous sample of 391 participants (M[subscript age] = 39.59, SD = 12.30) completed the CAAS-International and a short version…

  7. Career Adapt-Abilities Scale in a French-Speaking Swiss Sample: Psychometric Properties and Relationships to Personality and Work Engagement

    Science.gov (United States)

    Rossier, Jerome; Zecca, Gregory; Stauffer, Sarah D.; Maggiori, Christian; Dauwalder, Jean-Pierre

    2012-01-01

    The aim of this study was to analyze the psychometric properties of the Career Adapt-Abilities Scale (CAAS) in a French-speaking Swiss sample and its relationship with personality dimensions and work engagement. The heterogeneous sample of 391 participants (M[subscript age] = 39.59, SD = 12.30) completed the CAAS-International and a short version…

  8. Estimating and Projecting Trends in HIV/AIDS Generalized Epidemics Using Incremental Mixture Importance Sampling.

    Science.gov (United States)

    Raftery, Adrian E; Bao, Le

    2010-12-01

    The Joint United Nations Programme on HIV/AIDS (UNAIDS) has decided to use Bayesian melding as the basis for its probabilistic projections of HIV prevalence in countries with generalized epidemics. This combines a mechanistic epidemiological model, prevalence data, and expert opinion. Initially, the posterior distribution was approximated by sampling-importance-resampling, which is simple to implement, easy to interpret, transparent to users, and gave acceptable results for most countries. For some countries, however, this is not computationally efficient because the posterior distribution tends to be concentrated around nonlinear ridges and can also be multimodal. We propose instead incremental mixture importance sampling (IMIS), which iteratively builds up a better importance sampling function. This retains the simplicity and transparency of sampling importance resampling, but is much more efficient computationally. It also leads to a simple estimator of the integrated likelihood that is the basis for Bayesian model comparison and model averaging. In simulation experiments and on real data, it outperformed both sampling importance resampling and three publicly available generic Markov chain Monte Carlo algorithms for this kind of problem.
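
    A bare-bones sketch of the IMIS loop on a made-up 2-D problem: initial draws from the prior are weighted by prior x likelihood / proposal, a Gaussian component is centred at the current highest-weight point with a covariance taken (here, simplistically) from its nearest neighbours, the component is added to the mixture proposal, and new draws are appended. The target, sample sizes, and neighbour rule are assumptions rather than the UNAIDS implementation.

```python
import numpy as np
from scipy.stats import multivariate_normal as mvn

rng = np.random.default_rng(8)

# Toy 2-D problem: Gaussian prior, curved (non-Gaussian) log-likelihood (illustrative).
prior_mean, prior_cov = np.zeros(2), 4.0 * np.eye(2)
prior = mvn(mean=prior_mean, cov=prior_cov)
def log_lik(x):
    return -0.5 * (((x[:, 0] ** 2 - x[:, 1]) ** 2) / 0.5 + ((x[:, 0] - 1.0) ** 2) / 0.5)

n0, b, n_iter = 1000, 250, 15
samples = rng.multivariate_normal(prior_mean, prior_cov, n0)
components = []                        # Gaussian mixture components added so far

def importance_weights(x):
    # Proposal q(x): mixture of the prior and all added components, mixed by counts.
    q = prior.pdf(x) * n0
    for comp in components:
        q += comp.pdf(x) * b
    q /= (n0 + b * len(components))
    log_w = log_lik(x) + prior.logpdf(x) - np.log(q)
    w = np.exp(log_w - log_w.max())
    return w / w.sum()

for it in range(n_iter):
    w = importance_weights(samples)
    centre = samples[np.argmax(w)]     # current highest-weight point
    # Covariance from the b nearest neighbours of the centre (simplified from
    # the weighted-covariance rule IMIS proper uses).
    d = np.linalg.norm(samples - centre, axis=1)
    cov = np.cov(samples[np.argsort(d)[:b]].T) + 1e-6 * np.eye(2)
    components.append(mvn(mean=centre, cov=cov))
    samples = np.vstack([samples, rng.multivariate_normal(centre, cov, b)])

w = importance_weights(samples)
print(f"total samples: {len(samples)}, effective sample size ≈ {1.0 / np.sum(w ** 2):.0f}")
```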

  9. Optimal adaptive group sequential design with flexible timing of sample size determination.

    Science.gov (United States)

    Cui, Lu; Zhang, Lanju; Yang, Bo

    2017-04-26

    Flexible sample size designs, including group sequential and sample size re-estimation designs, have been used as alternatives to fixed sample size designs to achieve more robust statistical power and better trial efficiency. In this work, a new representation of the sample size re-estimation design suggested by Cui et al. [5,6] is introduced as an adaptive group sequential design with flexible timing of sample size determination. This generalized adaptive group sequential design allows one-time sample size determination either before the start of or in the mid-course of a clinical study. The new approach leads to possible design optimization on an expanded space of design parameters. Its equivalence to the sample size re-estimation design proposed by Cui et al. provides further insight on re-estimation design and helps to address common confusions and misunderstanding. Issues in designing a flexible sample size trial, including design objective, performance evaluation, and implementation, are touched upon with an example to illustrate. Copyright © 2017. Published by Elsevier Inc.

  10. Dynamically optimized Wang-Landau sampling with adaptive trial moves and modification factors.

    Science.gov (United States)

    Koh, Yang Wei; Lee, Hwee Kuan; Okabe, Yutaka

    2013-11-01

    The density of states of continuous models is known to span many orders of magnitude at different energies due to the small volume of phase space near the ground state. Consequently, the traditional Wang-Landau sampling, which uses the same trial move for all energies, faces difficulties sampling the low-entropic states. We developed an adaptive variant of the Wang-Landau algorithm that very effectively samples the density of states of continuous models across the entire energy range. By extending the acceptance ratio method of Bouzida, Kumar, and Swendsen such that the step size of the trial move and the acceptance rate are adapted in an energy-dependent fashion, the random walker efficiently adapts its sampling according to the local phase space structure. The Wang-Landau modification factor is also made energy dependent in accordance with the step size, enhancing the accumulation of the density of states. Numerical simulations show that our proposed method performs much better than the traditional Wang-Landau sampling.
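
    As a concrete illustration of energy-dependent trial moves, the Python sketch below runs Wang-Landau sampling on a one-dimensional double-well potential and adapts each energy bin's step size from its local acceptance rate, loosely following the acceptance-ratio idea cited above. The binning, the 50% acceptance target, the flatness criterion, and the use of a single global modification factor (rather than the energy-dependent one proposed in the paper) are simplifying assumptions.

      # Hedged sketch: Wang-Landau sampling of a 1-D double-well potential with an
      # energy-dependent trial step adapted from per-bin acceptance rates.
      import numpy as np

      rng = np.random.default_rng(1)

      def energy(x):
          return (x * x - 1.0) ** 2            # double-well potential (assumed toy model)

      N_BINS, E_MAX = 40, 4.0
      edges = np.linspace(0.0, E_MAX, N_BINS + 1)
      log_g = np.zeros(N_BINS)                 # running estimate of ln(density of states)
      hist = np.zeros(N_BINS)                  # visit histogram for the flatness check
      step = np.full(N_BINS, 0.5)              # energy-dependent trial-move step sizes
      ln_f = 1.0                               # Wang-Landau modification factor (global here)

      def bin_of(e):
          return min(int(np.searchsorted(edges, e, side="right")) - 1, N_BINS - 1)

      x = 0.0
      b = bin_of(energy(x))

      while ln_f > 1e-2:
          accepted = np.zeros(N_BINS)
          proposed = np.zeros(N_BINS)
          for _ in range(20000):
              b_old = b
              x_new = x + rng.uniform(-step[b_old], step[b_old])   # step depends on current bin
              e_new = energy(x_new)
              proposed[b_old] += 1
              if e_new < E_MAX:
                  b_new = bin_of(e_new)
                  # Wang-Landau acceptance: favour states whose current g-estimate is low.
                  if np.log(rng.random()) < log_g[b_old] - log_g[b_new]:
                      x, b = x_new, b_new
                      accepted[b_old] += 1
              log_g[b] += ln_f                 # update the estimate at the current state
              hist[b] += 1

          # Adapt each bin's step size toward ~50% acceptance (acceptance-ratio idea).
          rate = np.divide(accepted, proposed, out=np.full(N_BINS, 0.5), where=proposed > 0)
          step *= np.clip(rate / 0.5, 0.5, 2.0)

          # Reduce the modification factor once the visit histogram is roughly flat.
          visited = hist[hist > 0]
          if visited.size and visited.min() > 0.8 * visited.mean():
              ln_f *= 0.5
              hist[:] = 0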

  11. Local Adaptation in European Firs Assessed through Extensive Sampling across Altitudinal Gradients in Southern Europe

    Science.gov (United States)

    Postolache, Dragos; Lascoux, Martin; Drouzas, Andreas D.; Källman, Thomas; Leonarduzzi, Cristina; Liepelt, Sascha; Piotti, Andrea; Popescu, Flaviu; Roschanski, Anna M.; Zhelev, Peter; Fady, Bruno; Vendramin, Giovanni Giuseppe

    2016-01-01

    Background Local adaptation is a key driver of phenotypic and genetic divergence at loci responsible for adaptive trait variation in forest tree populations. Its experimental assessment requires rigorous sampling strategies such as those involving population pairs replicated across broad spatial scales. Methods A hierarchical Bayesian model of selection (HBM) that explicitly considers both the replication of the environmental contrast and the hierarchical genetic structure among replicated study sites is introduced. Its power was assessed through simulations and compared to classical ‘within-site’ approaches (FDIST, BAYESCAN) and a simplified, within-site, version of the model introduced here (SBM). Results HBM demonstrates that hierarchical approaches are very powerful for detecting replicated patterns of adaptive divergence with low false-discovery (FDR) and false-non-discovery (FNR) rates compared to the analysis of different sites separately through within-site approaches. The hypothesis of local adaptation to altitude was further addressed by analyzing replicated Abies alba population pairs (low and high elevations) across the species’ southern distribution range, where the effects of climatic selection are expected to be the strongest. For comparison, a single population pair from the closely related species A. cephalonica was also analyzed. The hierarchical model did not detect any pattern of adaptive divergence to altitude replicated in the different study sites. Instead, idiosyncratic patterns of local adaptation among sites were detected by within-site approaches. Conclusion Hierarchical approaches may miss idiosyncratic patterns of adaptation among sites, and we strongly recommend the use of both hierarchical (multi-site) and classical (within-site) approaches when addressing the question of adaptation across broad spatial scales. PMID:27392065

  12. Sample based 3D face reconstruction from a single frontal image by adaptive locally linear embedding

    Institute of Scientific and Technical Information of China (English)

    ZHANG Jian; ZHUANG Yue-ting

    2007-01-01

    In this paper, we propose a highly automatic approach for 3D photorealistic face reconstruction from a single frontal image. The key point of our work is the implementation of an adaptive manifold learning approach. Beforehand, an active appearance model (AAM) is trained for automatic feature extraction, and an adaptive locally linear embedding (ALLE) algorithm is utilized to reduce the dimensionality of the 3D database. Then, given an input frontal face image, the corresponding weights between 3D samples and the image are synthesized adaptively according to the AAM-selected facial features. Finally, geometry reconstruction is achieved by a linear weighted combination of adaptively selected samples. Radial basis function (RBF) is adopted to map facial texture from the frontal image to the reconstructed face geometry. The texture of invisible regions between the face and the ears is interpolated by sampling from the frontal image. This approach has several advantages: (1) Only a single frontal face image is needed for highly automatic face reconstruction; (2) Compared with former works, our reconstruction approach provides higher accuracy; (3) Constraint-based RBF texture mapping provides a natural appearance for the reconstructed face.
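
    The "linear weighted combination of adaptively selected samples" step can be pictured with a small Python sketch: locally-linear-embedding-style reconstruction weights are computed for the input feature vector against the selected 3D samples and then used to blend their geometries. Feature extraction (AAM), the adaptive sample selection itself, and the RBF texture mapping are not reproduced here; all array shapes, function names and the regularisation constant are assumptions.

      # Hedged sketch: LLE-style reconstruction weights over selected 3-D samples,
      # then a weighted blend of their geometries; shapes and data are stand-ins.
      import numpy as np

      def reconstruction_weights(x, neighbours, reg=1e-3):
          """x: (d,) input feature vector; neighbours: (k, d) features of the selected
          samples.  Returns weights summing to 1 that minimise ||x - w @ neighbours||."""
          diff = neighbours - x                     # (k, d)
          gram = diff @ diff.T                      # local Gram matrix
          gram += reg * np.trace(gram) * np.eye(len(neighbours))   # regularisation
          w = np.linalg.solve(gram, np.ones(len(neighbours)))
          return w / w.sum()

      def blend_geometry(weights, sample_geometries):
          """sample_geometries: (k, n_vertices, 3) meshes of the selected samples."""
          return np.tensordot(weights, sample_geometries, axes=1)  # (n_vertices, 3)

      rng = np.random.default_rng(7)
      features = rng.normal(size=(10, 40))          # stand-in for AAM features of 10 samples
      geometries = rng.normal(size=(10, 500, 3))    # stand-in for their 3-D vertex coordinates
      x = rng.normal(size=40)                       # stand-in for the input image's features
      w = reconstruction_weights(x, features)
      face = blend_geometry(w, geometries)          # reconstructed 3-D geometry
      print(face.shape)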

  13. Profit based phase II sample size determination when adaptation by design is adopted

    OpenAIRE

    Martini, D.

    2014-01-01

    Background. Adaptation by design consists in conservatively estimating the phase III sample size on the basis of phase II data, and can be applied in almost all therapeutic areas; it is based on the assumption that the effect size of the drug is the same in phase II and phase III trials, which is a very common scenario assumed in product development. Adaptation by design reduces the probability of underpowered experiments and can improve the overall success probability of phase II and III tria...

  14. The Vineland Adaptive Behavior Scale in a sample of normal French Children: a research note.

    Science.gov (United States)

    Fombonne, E; Achard, S

    1993-09-01

    The Vineland Adaptive Behavior scale (survey form) was used in a sample of 151 normal children under age 18. Standardized mean scores of French children were comparable to those of the American normative sample. From the age of 6 onwards, French children scored consistently lower in the Daily Living Skills domain though the magnitude of this difference remained moderate. While the overall findings support the cross-cultural stability of the psychometric properties of this instrument, attention is drawn to potential problems in the use of the Vineland scales, with special reference to autistic samples.

  15. Adaptation to climate change and climate variability:The importance of understanding agriculture as performance

    NARCIS (Netherlands)

    Crane, T.A.; Roncoli, C.; Hoogenboom, G.

    2011-01-01

    Most climate change studies that address potential impacts and potential adaptation strategies are largely based on modelling technologies. While models are useful for visualizing potential future outcomes and evaluating options for potential adaptation, they do not adequately represent and integrate…

  16. Adaptation to climate change and climate variability in European agriculture: The importance of farm level responses

    NARCIS (Netherlands)

    Reidsma, P.; Ewert, F.; Oude Lansink, A.G.J.M.; Leemans, R.

    2010-01-01

    Climatic conditions and hence climate change influence agriculture. Most studies that addressed the vulnerability of agriculture to climate change have focused on potential impacts without considering adaptation. When adaptation strategies are considered, socio-economic conditions and farm management…

  17. An Adaptive Defect Weighted Sampling Algorithm to Design Pseudoknotted RNA Secondary Structures.

    Science.gov (United States)

    Zandi, Kasra; Butler, Gregory; Kharma, Nawwaf

    2016-01-01

    Computational design of RNA sequences that fold into targeted secondary structures has many applications in biomedicine, nanotechnology and synthetic biology. An RNA molecule is made of different types of secondary structure elements, and an important RNA element, the pseudoknot, plays a key role in stabilizing the functional form of the molecule. However, due to the computational complexities associated with characterizing pseudoknotted RNA structures, most existing RNA sequence design algorithms generally ignore this important structural element, which limits their applications. In this paper we present a new algorithm to design RNA sequences for pseudoknotted secondary structures. We use NUPACK as the folding algorithm to compute the equilibrium characteristics of the pseudoknotted RNAs, and describe a new adaptive defect weighted sampling algorithm named Enzymer to design low ensemble defect RNA sequences for targeted secondary structures including pseudoknots. We used a biological data set of 201 pseudoknotted structures from the Pseudobase library to benchmark the performance of our algorithm. We compared the quality characteristics of the RNA sequences we designed by Enzymer with the results obtained from the state-of-the-art MODENA and antaRNA. Our results show that our method succeeds more frequently than MODENA and antaRNA do, and generates sequences that have lower ensemble defect, lower probability defect and higher thermostability. Finally, by using Enzymer and constraining the design to a naturally occurring and highly conserved Hammerhead motif, we designed 8 sequences for a pseudoknotted cis-acting Hammerhead ribozyme. Enzymer is available for download at https://bitbucket.org/casraz/enzymer.

  18. FloodNet: Coupling Adaptive Sampling with Energy Aware Routing in a Flood Warning System

    Institute of Scientific and Technical Information of China (English)

    Jing Zhou; David De Roure

    2007-01-01

    We describe the design of FloodNet, a flood warning system, which uses a grid-based flood predictor model developed by environmental experts to make flood predictions based on readings of water level collected by a set of sensor nodes. To optimize battery consumption, the reporting frequency of sensor nodes is required to be adaptive to local conditions as well as the flood predictor model. We therefore propose an energy aware routing protocol which allows sensor nodes to consume energy according to this need. This system is notable both for the adaptive sampling regime and the methodology adopted in the design of the adaptive behavior, which involved development of simulation tools and very close collaboration with environmental experts.

  19. Efficient Importance Sampling Heuristics for the Simulation of Population Overflow in Feed-Forward Queueing Networks

    NARCIS (Netherlands)

    Nicola, Victor F.; Zaburnenko, Tatiana S.

    2006-01-01

    In this paper we propose a state-dependent importance sampling heuristic to estimate the probability of population overflow in feed-forward networks. This heuristic attempts to approximate the “optimal” state-dependent change of measure without the need for difficult analysis or costly optimization…

  20. Resampling: An improvement of importance sampling in varying population size models.

    Science.gov (United States)

    Merle, C; Leblois, R; Rousset, F; Pudlo, P

    2017-04-01

    Sequential importance sampling algorithms have been defined to estimate likelihoods in models of ancestral population processes. However, these algorithms are based on features of the models with constant population size, and become inefficient when the population size varies in time, making likelihood-based inferences difficult in many demographic situations. In this work, we modify a previous sequential importance sampling algorithm to improve the efficiency of the likelihood estimation. Our procedure is still based on features of the model with constant size, but uses a resampling technique with a new resampling probability distribution depending on the pairwise composite likelihood. We tested our algorithm, called sequential importance sampling with resampling (SISR), on simulated data sets under different demographic cases. In most cases, we divided the computational cost by two for the same accuracy of inference, and in some cases even by one hundred. This study provides the first assessment of the impact of such resampling techniques on parameter inference using sequential importance sampling, and extends the range of situations where likelihood inferences can be easily performed.
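
    The resampling idea can be sketched generically in Python: particles are propagated by sequential importance sampling, and whenever the effective sample size collapses they are resampled and the weights reset, with the running likelihood estimate accumulated at each reset. The toy Gaussian state-space model and the ESS-based trigger below are illustrative stand-ins; the paper's coalescent-specific proposal and its pairwise-composite-likelihood resampling distribution are not reproduced.

      # Hedged sketch: sequential importance sampling with resampling (SISR) for a
      # toy Gaussian random-walk state-space model; not the coalescent algorithm.
      import numpy as np

      rng = np.random.default_rng(2)

      def sisr_loglik(y, n_particles=500, ess_threshold=0.5):
          x = rng.normal(0.0, 1.0, size=n_particles)    # initial particles
          log_w = np.zeros(n_particles)                 # log importance weights
          log_lik = 0.0

          for obs in y:
              x = x + rng.normal(0.0, 0.5, size=n_particles)             # proposal = prior dynamics
              log_w += -0.5 * (obs - x) ** 2 - 0.5 * np.log(2 * np.pi)   # N(obs; x, 1) density

              w = np.exp(log_w - log_w.max())
              w /= w.sum()
              ess = 1.0 / np.sum(w**2)                  # effective sample size

              if ess < ess_threshold * n_particles:
                  # Fold the current weights into the likelihood estimate, then
                  # resample particles proportionally to the weights and reset them.
                  log_lik += log_w.max() + np.log(np.mean(np.exp(log_w - log_w.max())))
                  x = x[rng.choice(n_particles, size=n_particles, p=w)]
                  log_w = np.zeros(n_particles)

          log_lik += log_w.max() + np.log(np.mean(np.exp(log_w - log_w.max())))
          return log_lik

      observations = np.cumsum(rng.normal(size=50))     # synthetic data for the demo
      print("estimated log-likelihood:", sisr_loglik(observations))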

  1. Importance Sampling Simulation of Population Overflow in Two-node Tandem Networks

    NARCIS (Netherlands)

    Nicola, Victor F.; Zaburnenko, Tatiana S.; Baier, C.; Chiola, G.; Smirni, E.

    2005-01-01

    In this paper we consider the application of importance sampling in simulations of Markovian tandem networks in order to estimate the probability of rare events, such as network population overflow. We propose a heuristic methodology to obtain a good approximation to the 'optimal' state-dependent change of measure…

  2. Rare-event simulation for tandem queues: A simple and efficient importance sampling scheme

    NARCIS (Netherlands)

    Miretskiy, D.; Scheinhardt, W.; Mandjes, M.

    2009-01-01

    This paper focuses on estimating the rare event of overflow in the downstream queue of a tandem Jackson queue, relying on importance sampling. It is known that in this setting ‘traditional’ state-independent schemes perform poorly. More sophisticated state-dependent schemes yield asymptotic efficiency…

  3. Importance sampling for Lambda-coalescents in the infinitely many sites model

    CERN Document Server

    Birkner, Matthias; Steinruecken, Matthias; 10.1016/j.tpb.2011.01.005

    2011-01-01

    We present and discuss new importance sampling schemes for the approximate computation of the sample probability of observed genetic types in the infinitely many sites model from population genetics. More specifically, we extend the 'classical framework', where genealogies are assumed to be governed by Kingman's coalescent, to the more general class of Lambda-coalescents and develop further Hobolth et al.'s (2008) idea of deriving importance sampling schemes based on 'compressed genetrees'. The resulting schemes extend earlier work by Griffiths and Tavaré (1994), Stephens and Donnelly (2000), Birkner and Blath (2008) and Hobolth et al. (2008). We conclude with a performance comparison of classical and new schemes for Beta- and Kingman coalescents.

  4. Assessing employability capacities and career adaptability in a sample of human resource professionals

    Directory of Open Access Journals (Sweden)

    Melinde Coetzee

    2015-03-01

    Full Text Available Orientation: Employers have come to recognise graduates’ employability capacities and their ability to adapt to new work demands as important human capital resources for sustaining a competitive business advantage. Research purpose: The study sought (1) to ascertain whether a significant relationship exists between a set of graduate employability capacities and a set of career adaptability capacities and (2) to identify the variables that contributed the most to this relationship. Motivation for the study: Global competitive markets and technological advances are increasingly driving the demand for graduate knowledge and skills in a wide variety of jobs. Contemporary career theory further emphasises career adaptability across the lifespan as a critical skill for career management agency. Despite the apparent importance attached to employees’ employability and career adaptability, there seems to be a general lack of research investigating the association between these constructs. Research approach, design and method: A cross-sectional, quantitative research design approach was followed. Descriptive statistics, Pearson product-moment correlations and canonical correlation analysis were performed to achieve the objective of the study. The participants (N = 196) were employed in professional positions in the human resource field and were predominantly early career black people and women. Main findings: The results indicated positive multivariate relationships between the variables and showed that lifelong learning capacities and problem solving, decision-making and interactive skills contributed the most to explaining the participants’ career confidence, career curiosity and career control. Practical/managerial implications: The study suggests that developing professional graduates’ employability capacities may strengthen their career adaptability. These capacities were shown to explain graduates’ active engagement in career management strategies.

  5. Adaptive list sequential sampling method for population-based observational studies

    Science.gov (United States)

    2014-01-01

    Background In population-based observational studies, non-participation and delayed response to the invitation to participate are complications that often arise during the recruitment of a sample. When both are not properly dealt with, the composition of the sample can be different from the desired composition. Inviting too many individuals or too few individuals from a particular subgroup could lead to unnecessary costs or decreased precision. Another problem is that there is frequently no or only partial information available about the willingness to participate. In this situation, we cannot adjust the recruitment procedure for non-participation before the recruitment period starts. Methods We have developed an adaptive list sequential sampling method that can deal with unknown participation probabilities and delayed responses to the invitation to participate in the study. In a sequential way, we evaluate whether we should invite a person from the population or not. During this evaluation, we correct for the fact that this person could decline to participate using an estimated participation probability. We use the information from all previously invited persons to estimate the participation probabilities for the non-evaluated individuals. Results The simulations showed that the adaptive list sequential sampling method can be used to estimate the participation probability during the recruitment period, and that it can successfully recruit a sample with a specific composition. Conclusions The adaptive list sequential sampling method can successfully recruit a sample with a specific desired composition when we have partial or no information about the willingness to participate before we start the recruitment period and when individuals may have a delayed response to the invitation. PMID:24965316
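
    A toy Python sketch of the list-sequential idea is given below: the sampling frame is walked through person by person, each person's participation probability is estimated from the outcomes of everyone invited so far, and a subgroup stops being invited once its expected number of participants reaches the target. The covariate, the logistic participation model, the immediate (rather than delayed) responses and the stopping rule are illustrative assumptions rather than the authors' exact procedure.

      # Hedged toy sketch of adaptive list sequential sampling; the covariate, the
      # logistic participation model and the stopping rule are assumptions.
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(3)

      def recruit(frame, target_per_group, prior_p=0.5):
          """frame: array of shape (N, 2) with columns [group, covariate]."""
          invited, responded = [], []                    # history used to re-fit the model
          expected = {g: 0.0 for g in target_per_group}  # expected participants per group
          model = None

          for group, covariate in frame:
              group = int(group)
              if expected[group] >= target_per_group[group]:
                  continue                               # subgroup already covered in expectation

              # Estimate this person's participation probability from past invitees.
              p = model.predict_proba([[covariate]])[0, 1] if model else prior_p

              invited.append([covariate])
              # Hypothetical "true" participation behaviour (unknown in practice).
              responded.append(int(rng.random() < 1 / (1 + np.exp(-(covariate - 0.2)))))
              expected[group] += p                       # credit the expected yield, as in the abstract

              if len(set(responded)) == 2:               # re-fit once both outcomes are observed
                  model = LogisticRegression().fit(invited, responded)

          return expected, sum(responded)

      frame = np.column_stack([rng.integers(0, 2, size=2000), rng.normal(size=2000)])
      print(recruit(frame, target_per_group={0: 50, 1: 50}))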

  6. Adaptive list sequential sampling method for population-based observational studies.

    Science.gov (United States)

    Hof, Michel H; Ravelli, Anita C J; Zwinderman, Aeilko H

    2014-06-25

    In population-based observational studies, non-participation and delayed response to the invitation to participate are complications that often arise during the recruitment of a sample. When both are not properly dealt with, the composition of the sample can be different from the desired composition. Inviting too many individuals or too few individuals from a particular subgroup could lead to unnecessary costs or decreased precision. Another problem is that there is frequently no or only partial information available about the willingness to participate. In this situation, we cannot adjust the recruitment procedure for non-participation before the recruitment period starts. We have developed an adaptive list sequential sampling method that can deal with unknown participation probabilities and delayed responses to the invitation to participate in the study. In a sequential way, we evaluate whether we should invite a person from the population or not. During this evaluation, we correct for the fact that this person could decline to participate using an estimated participation probability. We use the information from all previously invited persons to estimate the participation probabilities for the non-evaluated individuals. The simulations showed that the adaptive list sequential sampling method can be used to estimate the participation probability during the recruitment period, and that it can successfully recruit a sample with a specific composition. The adaptive list sequential sampling method can successfully recruit a sample with a specific desired composition when we have partial or no information about the willingness to participate before we start the recruitment period and when individuals may have a delayed response to the invitation.

  7. Improved Algorithms and Coupled Neutron-Photon Transport for Auto-Importance Sampling Method

    CERN Document Server

    Wang, Xin; Qiu, Rui; Li, Chun-Yan; Liang, Man-Chun; Zhang, Hui; Li, Jun-Li

    2016-01-01

    The Auto-Importance Sampling (AIS) method is a Monte Carlo variance reduction technique proposed by Tsinghua University for deep penetration problems, which can improve computational efficiency significantly without pre-calculation of an importance distribution. However, the AIS method had only been validated on several basic deep penetration problems with simple geometries and could not be used for coupled neutron-photon transport. This paper first presented the latest algorithm improvements for the AIS method, including particle transport, fictitious particle creation and adjustment, fictitious surface geometry, random number allocation and calculation of the estimated relative error, which made the AIS method applicable to complicated deep penetration problems. Then, a coupled Neutron-Photon Auto-Importance Sampling (NP-AIS) method was proposed to apply the AIS method with the improved algorithms in coupled neutron-photon Monte Carlo transport. Finally, the NUREG/CR-6115 PWR benchmark model was calculated with the method of geometry splitting...

  8. Adaptive lambda square dynamics simulation: an efficient conformational sampling method for biomolecules.

    Science.gov (United States)

    Ikebe, Jinzen; Sakuraba, Shun; Kono, Hidetoshi

    2014-01-05

    A novel, efficient sampling method for biomolecules is proposed. The partial multicanonical molecular dynamics (McMD) was recently developed as a method that improved generalized ensemble (GE) methods to focus sampling only on a part of a system (GEPS); however, it was not tested well. We found that partial McMD did not work well for polylysine decapeptide and gave significantly worse sampling efficiency than a conventional GE. Herein, we elucidate the fundamental reason for this and propose a novel GEPS, adaptive lambda square dynamics (ALSD), which can resolve the problem faced when using partial McMD. We demonstrate that ALSD greatly increases the sampling efficiency over a conventional GE. We believe that ALSD is an effective method and is applicable to the conformational sampling of larger and more complicated biomolecule systems. Copyright © 2013 Wiley Periodicals, Inc.

  9. Sample selection and adaptive weight allocation for compressive MIMO UWB noise radar

    Science.gov (United States)

    Kwon, Yangsoo; Narayanan, Ram M.; Rangaswamy, Muralidhar

    2012-06-01

    In this paper, we propose a sample selection method for compressive multiple-input multiple-output (MIMO) ultra-wideband (UWB) noise radar imaging. The proposed sample selection is based on comparing norm values of the transmitted sequences, and selects the largest M samples among N candidates per antenna. Moreover, we propose an adaptive weight allocation which improves normalized mean-square error (NMSE) by maximizing the mutual information between target echoes and the transmitted signals. Further, this weighting scheme is applicable to both sample selection schemes, a conventional random sampling and the proposed selection. Simulations show that the proposed selection method can improve the multiple target detection probability and NMSE. Moreover, the proposed weight allocation scheme is applicable to those selection methods and obtains spatial diversity and signal-to-noise ratio (SNR) gains.
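
    The norm-based selection step can be written in a few lines of Python: for each antenna, keep the M candidate samples with the largest magnitudes. The array shapes below, and the omission of the adaptive weight allocation and of the compressive recovery stage, are simplifications for illustration.

      # Hedged sketch: keep the M largest-magnitude samples per antenna.
      import numpy as np

      def select_samples(received, m):
          """received: complex array (n_antennas, n_candidates); returns the indices of
          the m largest-magnitude samples for each antenna."""
          return np.argsort(np.abs(received), axis=1)[:, -m:]

      rng = np.random.default_rng(4)
      y = rng.normal(size=(4, 256)) + 1j * rng.normal(size=(4, 256))   # 4 antennas, 256 candidates
      idx = select_samples(y, 64)                                      # keep the strongest 64
      kept = np.take_along_axis(y, idx, axis=1)                        # (4, 64) retained samples
      print(kept.shape)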

  10. Comparative transcriptomics of elasmobranchs and teleosts highlight important processes in adaptive immunity and regional endothermy.

    Science.gov (United States)

    Marra, Nicholas J; Richards, Vincent P; Early, Angela; Bogdanowicz, Steve M; Pavinski Bitar, Paulina D; Stanhope, Michael J; Shivji, Mahmood S

    2017-01-30

    -taxa transcriptomic-based perspective on differences between elasmobranchs and teleosts, and suggests various unique features associated with the adaptive immune system of elasmobranchs, pointing in particular to the potential importance of MHC Class II. This in turn suggests that expanded comparative work involving additional tissues, as well as genome sequencing of multiple elasmobranch species would be productive in elucidating the regulatory and genome architectural hallmarks of elasmobranchs.

  11. Correlations for the Vineland Adaptive Behavior Scales with Kaufman Brief Intelligence Test in a forensic sample.

    Science.gov (United States)

    Hayes, Susan; Farnill, Douglas

    2003-04-01

    People with an intellectual disability are over-represented in the criminal justice system in many western countries. Identifying accused persons with intellectual disability is important if they are to receive protections available under the law. Accurate diagnosis is also relevant for correctional administrators, probation and parole services, and community services. Diagnosis of intellectual disability must be made on the basis of both cognitive skills (intelligence) and adaptive behavior. In this study, the Kaufman Brief Intelligence Test assessed intelligence, and the Vineland Adaptive Behavior Scales assessed adaptive behavior, through self-report. Tests were administered to 150 offenders, ranging in age from 13 to 53 years, in Australian prisons, juvenile detention centers, legal aid offices, and probation services. Pearson product-moment correlation coefficients calculated among all subtests and between total scores were significant. ROC curve analyses demonstrated that performance on each effectively predicted a standard score of less than 70 on the other one.

  12. Compact Ocean Models Enable Onboard AUV Autonomy and Decentralized Adaptive Sampling

    Science.gov (United States)

    2014-09-30

    DISTRIBUTION STATEMENT A. Approved for public release; distribution is unlimited. (No abstract available; only report documentation page fields were recovered for this record.) Related publication: Frolov, S., R. Kudela, J. Bellingham, “…onboard autonomy of underwater vehicles”, in Proc. AGU Ocean Science Meeting, Salt Lake City, UT. [published]

  13. Adaptive Polar Sampling: A New MC Technique for the Analysis of Ill-behaved Surfaces

    OpenAIRE

    BAUWENS, Luc; Bos, Charles S.; Van Dijk, Herman K.

    1998-01-01

    This discussion paper resulted in a publication in: (W. Jansen and J.G. Bethlehem eds.) 'Compstat 2000, Statistics Netherlands', 2000, pages 13-14. Adaptive Polar Sampling is proposed as an algorithm where random drawings are directly generated from the target function (posterior) in all-but-one directions of the parameter space. The method is based on the mixed integration technique of Van Dijk, Kloek & Boender (1985) but extends this one by replacing the one-dimensional quadrature step by Monte...

  14. Performance evaluation of an importance sampling technique in a Jackson network

    Science.gov (United States)

    Mahdipour, Ebrahim; Masoud Rahmani, Amir; Setayeshi, Saeed

    2014-03-01

    Importance sampling is a technique that is commonly used to speed up Monte Carlo simulation of rare events. However, little is known regarding the design of efficient importance sampling algorithms in the context of queueing networks. The standard approach, which simulates the system using an a priori fixed change of measure suggested by large deviation analysis, has been shown to fail in even the simplest network settings. Estimating probabilities associated with rare events has been a topic of great importance in queueing theory, and in applied probability at large. In this article, we analyse the performance of an importance sampling estimator for a rare event probability in a Jackson network. The analysis considers strict deadlines in a two-node Jackson network with feedback whose arrival and service rates are modulated by an exogenous finite-state Markov process. We have estimated the probability of network blocking for various sets of parameters, and also the probability of missing the deadline of customers for different loads and deadlines. We have finally shown that the probability of total population overflow may be affected by various deadline values, service rates and arrival rates.

  15. Efficient adaptive designs with mid-course sample size adjustment in clinical trials

    CERN Document Server

    Bartroff, Jay

    2011-01-01

    Adaptive designs have been proposed for clinical trials in which the nuisance parameters or alternative of interest are unknown or likely to be misspecified before the trial. Whereas most previous works on adaptive designs and mid-course sample size re-estimation have focused on two-stage or group sequential designs in the normal case, we consider here a new approach that involves at most three stages and is developed in the general framework of multiparameter exponential families. Not only does this approach maintain the prescribed type I error probability, but it also provides a simple but asymptotically efficient sequential test whose finite-sample performance, measured in terms of the expected sample size and power functions, is shown to be comparable to the optimal sequential design, determined by dynamic programming, in the simplified normal mean case with known variance and prespecified alternative, and superior to the existing two-stage designs and also to adaptive group sequential designs when the al...

  16. Variability in Adaptive Behavior in Autism: Evidence for the Importance of Family History

    OpenAIRE

    Mazefsky, C.A.; Williams, D. L.; Minshew, N. J.

    2008-01-01

    Adaptive behavior in autism is highly variable and strongly related to prognosis. This study explored family history as a potential source of variability in adaptive behavior in autism. Participants included 77 individuals (mean age=18) with average or better intellectual ability and autism. Parents completed the Family History Interview about the presence of broader autism phenotype symptoms and major psychiatric disorders in first degree relatives. Adaptive behavior was assessed via the Vineland...

  17. Adaptation and Validation of the Sexual Assertiveness Scale (SAS) in a Sample of Male Drug Users.

    Science.gov (United States)

    Vallejo-Medina, Pablo; Sierra, Juan Carlos

    2015-04-21

    The aim of the present study was to adapt and validate the Sexual Assertiveness Scale (SAS) in a sample of male drug users. A sample of 326 male drug users and 322 non-clinical males was selected by cluster sampling and convenience sampling, respectively. Results showed that the scale had good psychometric properties and adequate internal consistency reliability (Initiation = .66, Refusal = .74 and STD-P = .79). An evaluation of the invariance showed strong factor equivalence between both samples. A high and moderate effect of Differential Item Functioning was only found in items 1 and 14 (ΔR² Nagelkerke = .076 and .037, respectively). We strongly recommend not using item 1 if the goal is to compare the scores of both groups, otherwise the comparison will be biased. Correlations obtained between the CSFQ-14 and the safe sex ratio and the SAS subscales were significant (CI = 95%) and indicated good concurrent validity. Scores of male drug users were similar to those of non-clinical males. Therefore, the adaptation of the SAS to drug users provides enough guarantees for reliable and valid use in both clinical practice and research, although care should be taken with item 1.

  18. Self-organizing adaptive map: autonomous learning of curves and surfaces from point samples.

    Science.gov (United States)

    Piastra, Marco

    2013-05-01

    Competitive Hebbian Learning (CHL) (Martinetz, 1993) is a simple and elegant method for estimating the topology of a manifold from point samples. The method has been adopted in a number of self-organizing networks described in the literature and has given rise to related studies in the fields of geometry and computational topology. Recent results from these fields have shown that a faithful reconstruction can be obtained using the CHL method only for curves and surfaces. Within these limitations, these findings constitute a basis for defining a CHL-based, growing self-organizing network that produces a faithful reconstruction of an input manifold. The SOAM (Self-Organizing Adaptive Map) algorithm adapts its local structure autonomously in such a way that it can match the features of the manifold being learned. The adaptation process is driven by the defects arising when the network structure is inadequate, which cause a growth in the density of units. Regions of the network undergo a phase transition and change their behavior whenever a simple, local condition of topological regularity is met. The phase transition is eventually completed across the entire structure and the adaptation process terminates. In specific conditions, the structure thus obtained is homeomorphic to the input manifold. During the adaptation process, the network also has the capability to focus on the acquisition of input point samples in critical regions, with a substantial increase in efficiency. The behavior of the network has been assessed experimentally with typical data sets for surface reconstruction, including suboptimal conditions, e.g. with undersampling and noise.
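
    The Competitive Hebbian Learning step at the core of such networks is easy to sketch in Python: for every input sample, an edge is created between the two reference units closest to it. The growing, phase-transition and unit-adaptation machinery of the SOAM itself is not reproduced; the fixed circular unit layout below is purely illustrative.

      # Hedged sketch of the Competitive Hebbian Learning step: connect the two
      # reference units nearest to every input sample.
      import numpy as np

      def chl_edges(units, samples):
          """units: (K, d) reference vectors; samples: (N, d) input points.
          Returns the set of undirected edges induced by Competitive Hebbian Learning."""
          edges = set()
          for x in samples:
              nearest = np.argsort(np.linalg.norm(units - x, axis=1))[:2]
              edges.add(tuple(sorted(map(int, nearest))))
          return edges

      # Illustrative input: points near a circle; the edges recover its topology.
      rng = np.random.default_rng(5)
      theta = np.linspace(0.0, 2.0 * np.pi, 30, endpoint=False)
      units = np.column_stack([np.cos(theta), np.sin(theta)])
      phi = rng.uniform(0.0, 2.0 * np.pi, size=2000)
      samples = np.column_stack([np.cos(phi), np.sin(phi)]) + rng.normal(0.0, 0.02, size=(2000, 2))
      print(len(chl_edges(units, samples)), "edges")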

  19. Importance of impacts scenarios for the adaptation of agriculture to climate change

    Science.gov (United States)

    Zullo, J.; Macedo, C.; Pinto, H. S.; Assad, E. D.; Koga Vicente, A.

    2012-12-01

    The great possibility that the climate is already changing, and in the most drastic way possible, increases the challenge of agricultural engineering, especially in environmentally vulnerable areas and in regions where agriculture has a high economic and social importance. Knowledge of the potential impacts that may be caused by changes in water and thermal regimes in the coming decades is increasingly strategic, as it allows the development of techniques to adapt agriculture to climate change and therefore minimizes the risk of undesirable impacts, for example, on food and nutritional security. Thus, the main objective of this paper is to describe a way to generate impacts scenarios caused by anomalies of precipitation and temperature in the definition of climate risk zoning for an agricultural crop that is very important in the tropics, sugar cane, especially in central-southern Brazil, which is one of its main world producers. A key point here is the choice of the climate model to be used, considering that 23 different models were used in the fourth IPCC report published in 2007. The number and range of available models require the definition of criteria for choosing the most suitable ones for the preparation of the impacts scenarios. One way, proposed and used in this work, is based on the definition of two groups of models according to 27 of their technical attributes. The clustering of the 23 models into two groups, with a model representing each group (UKMO_HadCM3 and MIROC3.2_medres), assists the generation and comparison of impacts scenarios, making them more representative and useful. Another important aspect in the generation of impacts scenarios is the estimation of the relative importance of the anomalies of precipitation and temperature, which are the most commonly used. To assess this relative importance, scenarios are generated considering one anomaly at a time and both together. The impacts scenarios for a high emission of greenhouse gases (A2), from 2010

  20. Improved importance sampling technique for efficient simulation of digital communication systems

    Science.gov (United States)

    Lu, Dingqing; Yao, Kung

    1988-01-01

    A new, improved importance sampling (IIS) approach to simulation is considered. Some basic concepts of IS are introduced, and detailed evolutions of simulation estimation variances for Monte Carlo (MC) and IS simulations are given. The general results obtained from these evolutions are applied to the specific previously known conventional importance sampling (CIS) technique and the new IIS technique. The derivation for a linear system with no signal random memory is considered in some detail. For the CIS technique, the optimum input scaling parameter is found, while for the IIS technique, the optimum translation parameter is found. The results are generalized to a linear system with memory and signals. Specific numerical and simulation results are given which show the advantages of CIS over MC and IIS over CIS for simulations of digital communications systems.
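
    The contrast drawn above between plain Monte Carlo and importance sampling with a translated biasing density can be illustrated with a small Python example that estimates the tail probability P(X > t) for standard Gaussian noise; the threshold and the choice of shift are illustrative and this is not the paper's communication-system model.

      # Hedged toy comparison: plain Monte Carlo versus importance sampling with a
      # translated (mean-shifted) Gaussian biasing density for a small tail probability.
      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(6)
      t, n = 5.0, 100_000                     # decision threshold and number of trials

      # Plain Monte Carlo: hardly any samples ever exceed the threshold.
      x = rng.normal(0.0, 1.0, size=n)
      p_mc = np.mean(x > t)

      # Importance sampling: draw from N(t, 1) and weight by the likelihood ratio
      # f(y)/q(y) = exp(t**2 / 2 - t * y) for f = N(0, 1) and q = N(t, 1).
      y = rng.normal(t, 1.0, size=n)
      p_is = np.mean((y > t) * np.exp(0.5 * t**2 - t * y))

      print(f"exact {norm.sf(t):.3e}   plain MC {p_mc:.3e}   translated IS {p_is:.3e}")

    With 10^5 trials, plain Monte Carlo typically observes no exceedances at all for t = 5, whereas the translated estimator recovers the roughly 2.9e-7 tail probability with small relative error, which is the kind of gain the abstract attributes to biasing-density translation.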

  1. Improved importance sampling technique for efficient simulation of digital communication systems

    Science.gov (United States)

    Lu, Dingqing; Yao, Kung

    1988-01-01

    A new, improved importance sampling (IIS) approach to simulation is considered. Some basic concepts of IS are introduced, and detailed evolutions of simulation estimation variances for Monte Carlo (MC) and IS simulations are given. The general results obtained from these evolutions are applied to the specific previously known conventional importance sampling (CIS) technique and the new IIS technique. The derivation for a linear system with no signal random memory is considered in some detail. For the CIS technique, the optimum input scaling parameter is found, while for the IIS technique, the optimum translation parameter is found. The results are generalized to a linear system with memory and signals. Specific numerical and simulation results are given which show the advantages of CIS over MC and IIS over CIS for simulations of digital communications systems.

  2. Burnout and Engagement: Relative Importance of Predictors and Outcomes in Two Health Care Worker Samples.

    Science.gov (United States)

    Fragoso, Zachary L; Holcombe, Kyla J; McCluney, Courtney L; Fisher, Gwenith G; McGonagle, Alyssa K; Friebe, Susan J

    2016-06-09

    This study's purpose was twofold: first, to examine the relative importance of job demands and resources as predictors of burnout and engagement, and second, to examine the relative importance of engagement and burnout in relation to health, depressive symptoms, work ability, organizational commitment, and turnover intentions in two samples of health care workers. Nurse leaders (n = 162) and licensed emergency medical technicians (EMTs; n = 102) completed surveys. In both samples, job demands predicted burnout more strongly than job resources, and job resources predicted engagement more strongly than job demands. Engagement held more weight than burnout for predicting commitment, and burnout held more weight for predicting health outcomes, depressive symptoms, and work ability. Results have implications for the design, evaluation, and effectiveness of workplace interventions to reduce burnout and improve engagement among health care workers. Actionable recommendations for increasing engagement and decreasing burnout in health care organizations are provided.

  3. Adapting chain referral methods to sample new migrants: Possibilities and limitations

    Directory of Open Access Journals (Sweden)

    Lucinda Platt

    2015-09-01

    Full Text Available Background: Demographic research on migration requires representative samples of migrant populations. Yet recent immigrants, who are particularly informative about current migrant flows, are difficult to capture even in specialist surveys. Respondent-driven sampling (RDS), a chain referral sampling and analysis technique, potentially offers the opportunity to achieve population-level inference of recently arrived migrant populations. Objective: We evaluate the attempt to use RDS to sample two groups of migrants, from Pakistan and Poland, who had arrived in the UK within the previous 18 months, and we present an alternative approach adapted to recent migrants. Methods: We discuss how connectedness, privacy, clustering, and motivation are expected to differ among recently arrived migrants, compared to typical applications of RDS. We develop a researcher-led chain referral approach, and compare success in recruitment and indicators of representativeness to standard RDS recruitment. Results: Our researcher-led approach led to higher rates of chain-referral, and enabled us to reach population members with smaller network sizes. The researcher-led approach resulted in similar recruiter-recruit transition probabilities to traditional RDS across many demographic and social characteristics. However, we did not succeed in building up long referral chains, largely due to the lack of connectedness of our target populations and some reluctance to refer. There were some differences between the two migrant groups, with less mobile and less hidden Pakistani men producing longer referral chains. Conclusions: Chain referral is difficult to implement for sampling newly arrived migrants. However, our researcher-led adaptation shows promise for less hidden and more stable recent immigrant populations. Contribution: The paper offers an evaluation of RDS for surveying recent immigrants and an adaptation that may be effective under certain conditions.

  4. Social daydreaming and adjustment: An experience-sampling study of socio-emotional adaptation during a life transition

    Directory of Open Access Journals (Sweden)

    Giulia Lara Poerio

    2016-01-01

    Full Text Available Estimates suggest that up to half of waking life is spent daydreaming; that is, engaged in thought that is independent of, and unrelated to, one’s current task. Emerging research indicates that daydreams are predominately social suggesting that daydreams may serve socio-emotional functions. Here we explore the functional role of social daydreaming for socio-emotional adjustment during an important and stressful life transition (the transition to university) using experience-sampling with 103 participants over 28 days. Over time, social daydreams increased in their positive characteristics and positive emotional outcomes; specifically, participants reported that their daydreams made them feel more socially connected and less lonely, and that the content of their daydreams became less fanciful and involved higher quality relationships. These characteristics then predicted less loneliness at the end of the study, which, in turn was associated with greater social adaptation to university. Feelings of connection resulting from social daydreams were also associated with less emotional inertia in participants who reported being less socially adapted to university. Findings indicate that social daydreaming is functional for promoting socio-emotional adjustment to an important life event. We highlight the need to consider the social content of stimulus-independent cognitions, their characteristics, and patterns of change, to specify how social thoughts enable socio-emotional adaptation.

  5. Social Daydreaming and Adjustment: An Experience-Sampling Study of Socio-Emotional Adaptation During a Life Transition.

    Science.gov (United States)

    Poerio, Giulia L; Totterdell, Peter; Emerson, Lisa-Marie; Miles, Eleanor

    2016-01-01

    Estimates suggest that up to half of waking life is spent daydreaming; that is, engaged in thought that is independent of, and unrelated to, one's current task. Emerging research indicates that daydreams are predominately social suggesting that daydreams may serve socio-emotional functions. Here we explore the functional role of social daydreaming for socio-emotional adjustment during an important and stressful life transition (the transition to university) using experience-sampling with 103 participants over 28 days. Over time, social daydreams increased in their positive characteristics and positive emotional outcomes; specifically, participants reported that their daydreams made them feel more socially connected and less lonely, and that the content of their daydreams became less fanciful and involved higher quality relationships. These characteristics then predicted less loneliness at the end of the study, which, in turn was associated with greater social adaptation to university. Feelings of connection resulting from social daydreams were also associated with less emotional inertia in participants who reported being less socially adapted to university. Findings indicate that social daydreaming is functional for promoting socio-emotional adjustment to an important life event. We highlight the need to consider the social content of stimulus-independent cognitions, their characteristics, and patterns of change, to specify how social thoughts enable socio-emotional adaptation.

  6. RNAseq revealed the important gene pathways controlling adaptive mechanisms under waterlogged stress in maize.

    Science.gov (United States)

    Arora, Kanika; Panda, Kusuma Kumari; Mittal, Shikha; Mallikarjuna, Mallana Gowdra; Rao, Atmakuri Ramakrishna; Dash, Prasanta Kumar; Thirunavukkarasu, Nepolean

    2017-09-08

    Waterlogging causes yield penalties in maize-growing countries of subtropical regions. Transcriptome analysis of the roots of the tolerant inbred HKI1105 using RNA sequencing revealed 21,364 differentially expressed genes (DEGs) under waterlogged stress conditions. These 21,364 DEGs are known to regulate important pathways including energy production, programmed cell death (PCD), aerenchyma formation, and ethylene responsiveness. High up-regulation of invertase (49-fold) and hexokinase (36-fold) in roots explained the ATP requirement under waterlogging conditions. Also, high up-regulation of expansins (42-fold), plant aspartic protease A3 (19-fold), polygalacturonases (16-fold), respiratory burst oxidase homolog (12-fold), and hydrolases (11-fold) explained the PCD of root cortical cells followed by the formation of aerenchyma tissue during waterlogging stress. We hypothesized that oxygen transfer in waterlogged roots is promoted by a cross-talk of fermentative, metabolic, and glycolytic pathways that generates ATP for PCD and aerenchyma formation in root cortical cells. SNPs were mapped to the DEGs regulating aerenchyma formation (12), ethylene-responsive factors (11), and glycolysis (4) under stress. RNAseq-derived SNPs can be used in selection approaches to breed tolerant hybrids. Overall, this investigation provided significant evidence of genes operating in adaptive traits such as ethylene production and aerenchyma formation to cope with waterlogging stress.

  7. The importance of trabecular hypertrophy in right ventricular adaptation to chronic pressure overload.

    Science.gov (United States)

    van de Veerdonk, Mariëlle C; Dusoswa, Sophie A; Marcus, J Tim; Bogaard, Harm-Jan; Spruijt, Onno; Kind, Taco; Westerhof, Nico; Vonk-Noordegraaf, Anton

    2014-02-01

    To assess the contribution of right ventricular (RV) trabeculae and papillary muscles (TPM) to RV mass and volumes in controls and patients with pulmonary arterial hypertension (PAH). Furthermore, to evaluate whether TPM show a similar response as the RV free wall (RVFW) to changes in pulmonary artery pressure (PAP) during follow-up. 50 patients underwent cardiac magnetic resonance (CMR) and right heart catheterization at baseline and after one-year follow-up; furthermore, 20 controls underwent CMR. RV masses were assessed with and without TPM. TPM constituted a larger proportion of total RV mass and RV end-diastolic volume (RVEDV) in PAH than in controls (mass: 35 ± 7 vs. 25 ± 5 %; p < …). TPM mass was related to RVFW mass in patients (baseline: R = 0.65; p < …). Excluding TPM from the assessment resulted in different RV mass, volumes and function than when TPM were included (all p < …). Changes in TPM mass (β = 0.44; p = 0.004), but not changes in RVFW mass (p = 0.095), were independently related to changes in PAP during follow-up. RV TPM showed a larger contribution to total RV mass in PAH (~35 %) compared to controls (~25 %). Inclusion of TPM in the analyses significantly influenced the magnitude of the RV volumes and mass. Furthermore, TPM mass was more strongly related to changes in PAP than RVFW mass. Our results imply that TPM are important contributors to RV adaptation during pressure overload and cannot be neglected in the RV assessment.

  8. Phylogeographic differentiation versus transcriptomic adaptation to warm temperatures in Zostera marina, a globally important seagrass

    NARCIS (Netherlands)

    Jueterbock, Alexander; Franssen, S. U.; Bergmann, N.; Gu, J.; Coyer, J. A.; Reusch, T. B. H.; Bornberg-Bauer, E.; Olsen, J. L.

    2016-01-01

    Populations distributed across a broad thermal cline are instrumental in addressing adaptation to increasing temperatures under global warming. Using a space-for-time substitution design, we tested for parallel adaptation to warm temperatures along two independent thermal clines in Zostera marina,

  9. Phylogeographic differentiation versus transcriptomic adaptation to warm temperatures in Zostera marina, a globally important seagrass

    NARCIS (Netherlands)

    Jueterbock, Alexander; Franssen, S. U.; Bergmann, N.; Gu, J.; Coyer, J. A.; Reusch, T. B. H.; Bornberg-Bauer, E.; Olsen, J. L.

    2016-01-01

    Populations distributed across a broad thermal cline are instrumental in addressing adaptation to increasing temperatures under global warming. Using a space-for-time substitution design, we tested for parallel adaptation to warm temperatures along two independent thermal clines in Zostera marina, t

  10. Adaptive sampling strategy support for the unlined chromic acid pit, chemical waste landfill, Sandia National Laboratories, Albuquerque, New Mexico

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, R.L.

    1993-11-01

    Adaptive sampling programs offer substantial savings in time and money when assessing hazardous waste sites. Key to some of these savings is the ability to adapt a sampling program to the real-time data generated by an adaptive sampling program. This paper presents a two-prong approach to supporting adaptive sampling programs: a specialized object-oriented database/geographical information system (SitePlanner{trademark}) for data fusion, management, and display and combined Bayesian/geostatistical methods (PLUME) for contamination-extent estimation and sample location selection. This approach is applied in a retrospective study of a subsurface chromium plume at Sandia National Laboratories' chemical waste landfill. Retrospective analyses suggest the potential for characterization cost savings on the order of 60% through a reduction in the number of sampling programs, total number of soil boreholes, and number of samples analyzed from each borehole.

  11. Adaptive sampling strategy support for the unlined chromic acid pit, chemical waste landfill, Sandia National Laboratories, Albuquerque, New Mexico

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, R.L.

    1993-11-01

    Adaptive sampling programs offer substantial savings in time and money when assessing hazardous waste sites. Key to some of these savings is the ability to adapt a sampling program to the real-time data generated by an adaptive sampling program. This paper presents a two-prong approach to supporting adaptive sampling programs: a specialized object-oriented database/geographical information system (SitePlanner{trademark}) for data fusion, management, and display and combined Bayesian/geostatistical methods (PLUME) for contamination-extent estimation and sample location selection. This approach is applied in a retrospective study of a subsurface chromium plume at Sandia National Laboratories' chemical waste landfill. Retrospective analyses suggest the potential for characterization cost savings on the order of 60% through a reduction in the number of sampling programs, total number of soil boreholes, and number of samples analyzed from each borehole.

  12. Adaptive Sampling approach to environmental site characterization at Joliet Army Ammunition Plant: Phase 2 demonstration

    Energy Technology Data Exchange (ETDEWEB)

    Bujewski, G.E. [Sandia National Labs., Albuquerque, NM (United States). Environmental Restoration Technologies Dept.; Johnson, R.L. [Argonne National Lab., IL (United States)

    1996-04-01

    Adaptive sampling programs provide real opportunities to save considerable time and money when characterizing hazardous waste sites. This Strategic Environmental Research and Development Program (SERDP) project demonstrated two decision-support technologies, SitePlanner{trademark} and Plume{trademark}, that can facilitate the design and deployment of an adaptive sampling program. A demonstration took place at Joliet Army Ammunition Plant (JAAP), and was unique in that it was tightly coupled with ongoing Army characterization work at the facility, with close scrutiny by both state and federal regulators. The demonstration was conducted in partnership with the Army Environmental Center's (AEC) Installation Restoration Program and AEC's Technology Development Program. AEC supported researchers from Tufts University who demonstrated innovative field analytical techniques for the analysis of TNT and DNT. SitePlanner{trademark} is an object-oriented database specifically designed for site characterization that provides an effective way to compile, integrate, manage and display site characterization data as it is being generated. Plume{trademark} uses a combination of Bayesian analysis and geostatistics to provide technical staff with the ability to quantitatively merge soft and hard information for an estimate of the extent of contamination. Plume{trademark} provides an estimate of contamination extent, measures the uncertainty associated with the estimate, determines the value of additional sampling, and locates additional samples so that their value is maximized.

  13. The Portuguese adaptation of the Gudjonsson Suggestibility Scale (GSS1) in a sample of inmates.

    Science.gov (United States)

    Pires, Rute; Silva, Danilo R; Ferreira, Ana Sousa

    2014-01-01

    This paper comprises two studies which address the validity of the Portuguese adaptation of the Gudjonsson Suggestibility Scale, GSS1. In study 1, the means and standard deviations for the suggestibility results of a sample of Portuguese inmates (N=40, Mage=37.5 years, SD=8.1) were compared to those of a sample of Icelandic inmates (Gudjonsson, 1997; Gudjonsson & Sigurdsson, 1996). Portuguese inmates' results were in line with the original results. In study 2, the means and standard deviations for the suggestibility results of the sample of Portuguese inmates were compared to those of a general Portuguese population sample (N=57, Mage=36.1 years, SD=12.7). The forensic sample obtained significantly higher scores in suggestibility measures than the general population sample. ANOVA confirmed that the increased suggestibility in the inmates sample was due to the limited memory capacity of this latter group. Given that the results of both studies 1 and 2 are in keeping with the author's original results (Gudjonsson, 1997), this may be regarded as a confirmation of the validity of the Portuguese GSS1. Copyright © 2013 Elsevier Ltd. All rights reserved.

  14. Resilient microorganisms in dust samples of the International Space Station-survival of the adaptation specialists.

    Science.gov (United States)

    Mora, Maximilian; Perras, Alexandra; Alekhova, Tatiana A; Wink, Lisa; Krause, Robert; Aleksandrova, Alina; Novozhilova, Tatiana; Moissl-Eichinger, Christine

    2016-12-20

    The International Space Station (ISS) represents a unique biotope for the human crew but also for introduced microorganisms. Microbes experience selective pressures such as microgravity, desiccation, poor nutrient-availability due to cleaning, and an increased radiation level. We hypothesized that the microbial community inside the ISS is modified by adapting to these stresses. For this reason, we analyzed 8-12 year old dust samples from Russian ISS modules with a major focus on the long-time surviving portion of the microbial community. We consequently assessed the cultivable microbiota of these samples in order to analyze their extremotolerant potential against desiccation, heat-shock, and clinically relevant antibiotics. In addition, we studied the bacterial and archaeal communities from the stored Russian dust samples via molecular methods (next-generation sequencing, NGS) and compared our new data with previously derived information from the US American ISS dust microbiome. We cultivated and identified in total 85 bacterial, non-pathogenic isolates (17 different species) and 1 fungal isolate from the 8-12 year old dust samples collected in the Russian segment of the ISS. Most of these isolates exhibited robust resistance against heat-shock and clinically relevant antibiotics. Microbial 16S rRNA gene and archaeal 16S rRNA gene targeting Next Generation Sequencing showed signatures of human-associated microorganisms (Corynebacterium, Staphylococcus, Coprococcus etc.), but also specifically adapted extremotolerant microorganisms. Besides bacteria, the detection of archaeal signatures in higher abundance was striking. Our findings reveal (i) the occurrence of living, hardy microorganisms in archived Russian ISS dust samples, (ii) a profound resistance capacity of ISS microorganisms against environmental stresses, and (iii) the presence of archaeal signatures on board. In addition, we found indications that the microbial community in the Russian segment dust

  15. Mapping differential elemental accumulation in fish tissues: Importance of fish tissue sampling standardization

    Directory of Open Access Journals (Sweden)

    Jovičić Katarina

    2016-01-01

    Full Text Available The concentrations of As, Cd, Co, Cr, Cu, Fe, Hg, Mn, Ni, Pb, Se and Zn in the muscle, gills, liver and intestine of the wels catfish (Silurus glanis) from the Danube River were analyzed by inductively coupled plasma mass spectrometry (ICP-MS). The aim of the study was to determine whether in complex muscle/skin, gill filament/gill arch, proximal/distal liver and proximal/median/distal intestine samples, particular components differ in concentrations of the analyzed elements. Results indicated that there were no differences in the accumulation of different elements between the proximal and distal liver segments and between the proximal and median intestine sections. Conversely, elemental accumulation patterns in muscle and skin differed significantly. Significant differences were also observed between the gill arch and filaments, as well as between the distal and the two upper intestine sections. Findings indicated the importance of detailed reporting of tissue sampling, i.e. whether the skin was included in the muscle sample, as well as if the gill arch and filaments were analyzed together. Due to a potential bias that can be produced by different muscle/skin or gill arch/filament ratios included in the sample, we strongly recommend that they should not be analyzed together. Results of the present study might be of interest to the scientific community and stakeholders involved in aquatic ecosystem monitoring programs. [Project of the Ministry of Science of the Republic of Serbia, Nos. TR37009 and 173045]

  16. Adaptive expertise in teamwork environment: the importance of social aspects in expert work and learning

    OpenAIRE

    Pihlaja, K. (Kaisa)

    2016-01-01

    Today’s society and modern working life are in constant change, which poses challenges for professional expertise as well as for the educational systems expected to produce future experts. Work tasks are becoming increasingly complex and multifaceted, so domain-specific knowledge and routine expertise may no longer suffice; instead, they call for adaptive expertise: the ability to adapt to new and unfamiliar settings and to use knowledge flexibly in creating high-quality, innovative solutions...

  17. Future tendencies of climate indicators important for adaptation and mitigation strategies in forestry

    Science.gov (United States)

    Galos, Borbala; Hänsler, Andreas; Gulyas, Krisztina; Bidlo, Andras; Czimber, Kornel

    2014-05-01

    impact analyses and build an important basis of the future adaptation strategies in forestry, agriculture and water management. Funding: The research is supported by the TÁMOP-4.2.2.A-11/1/KONV-2012-0013 and TÁMOP-4.1.1.C-12/1/KONV-2012-0012 (ZENFE) joint EU-national research projects. Keywords: climate indices, climate change impacts, forestry, regional climate modelling

  18. High-resolution in-depth imaging of optically cleared thick samples using an adaptive SPIM

    Science.gov (United States)

    Masson, Aurore; Escande, Paul; Frongia, Céline; Clouvel, Grégory; Ducommun, Bernard; Lorenzo, Corinne

    2015-11-01

    Today, Light Sheet Fluorescence Microscopy (LSFM) makes it possible to image fluorescent samples through depths of several hundreds of microns. However, LSFM also suffers from scattering, absorption and optical aberrations. Spatial variations in the refractive index inside the samples cause major changes to the light path resulting in loss of signal and contrast in the deepest regions, thus impairing in-depth imaging capability. These effects are particularly marked when inhomogeneous, complex biological samples are under study. Recently, chemical treatments have been developed to render a sample transparent by homogenizing its refractive index (RI), consequently enabling a reduction of scattering phenomena and a simplification of optical aberration patterns. One drawback of these methods is that the resulting RI of cleared samples does not match the working RI medium generally used for LSFM lenses. This RI mismatch leads to the presence of low-order aberrations and therefore to a significant degradation of image quality. In this paper, we introduce an original optical-chemical combined method based on an adaptive SPIM and a water-based clearing protocol enabling compensation for aberrations arising from RI mismatches induced by optical clearing methods and acquisition of high-resolution in-depth images of optically cleared complex thick samples such as Multi-Cellular Tumour Spheroids.

  19. Importance sampling method of correction for multiple testing in affected sib-pair linkage analysis

    OpenAIRE

    Klein, Alison P.; Kovac, Ilija; Sorant, Alexa JM; Baffoe-Bonnie, Agnes; Doan, Betty Q; Ibay, Grace; Lockwood, Erica; Mandal, Diptasri; Santhosh, Lekshmi; Weissbecker, Karen; Woo, Jessica; Zambelli-Weiner, April; Zhang, Jie; Naiman, Daniel Q.; Malley, James

    2003-01-01

    Using the Genetic Analysis Workshop 13 simulated data set, we compared the technique of importance sampling to several other methods designed to adjust p-values for multiple testing: the Bonferroni correction, the method proposed by Feingold et al., and naïve Monte Carlo simulation. We performed affected sib-pair linkage analysis for each of the 100 replicates for each of five binary traits and adjusted the derived p-values using each of the correction methods. The type I error rates for each...

  20. Adaptive multi-sample-based photoacoustic tomography with imaging quality optimization

    Institute of Scientific and Technical Information of China (English)

    Yuxin Wang; Jie Yuan; Sidan Du; Xiaojun Liu; Guan Xu; Xueding Wang

    2015-01-01

    The energy of light that can be delivered to human skin is strictly limited for safety reasons, which restricts the power of the photoacoustic (PA) signal and its signal-to-noise ratio (SNR). Thus, the final reconstructed PA image quality is degraded. This Letter proposes an adaptive multi-sample-based approach to enhance the SNR of PA signals; in addition, detailed information in the rebuilt PA images that used to be buried in noise can be distinguished. Both ex vivo and in vivo experiments are conducted to validate the effectiveness of the proposed method, demonstrating its potential value in clinical trials.
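
    As a schematic of the multi-sample idea described above, the sketch below averages repeated noisy acquisitions of the same photoacoustic A-line and shows the resulting SNR gain (roughly proportional to the square root of the number of acquisitions). The pulse shape, noise level and sample count are assumptions for illustration; this is not the Letter's adaptive algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical PA A-line: a short pulse buried in acquisition noise.
t = np.linspace(0.0, 1.0, 1000)
clean = np.exp(-((t - 0.5) ** 2) / (2 * 0.01 ** 2))   # assumed signal shape
noise_sigma = 0.5                                      # assumed noise level

def acquire():
    """One noisy acquisition of the same (laser-limited) PA signal."""
    return clean + rng.normal(0.0, noise_sigma, size=t.size)

def snr_db(signal, reference):
    noise = signal - reference
    return 10 * np.log10(np.sum(reference ** 2) / np.sum(noise ** 2))

single = acquire()
averaged = np.mean([acquire() for _ in range(64)], axis=0)  # combine 64 samples

print(f"single-shot SNR : {snr_db(single, clean):5.1f} dB")
print(f"64-sample SNR   : {snr_db(averaged, clean):5.1f} dB")  # roughly +18 dB expected
```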

  1. Organ sample generator for expected treatment dose construction and adaptive inverse planning optimization

    Energy Technology Data Exchange (ETDEWEB)

    Nie Xiaobo; Liang Jian; Yan Di [Department of Radiation Oncology, Beaumont Health System, Royal Oak, Michigan 48073 (United States)

    2012-12-15

    Purpose: To create an organ sample generator (OSG) for expected treatment dose construction and adaptive inverse planning optimization. The OSG generates random samples of organs of interest from a distribution obeying the patient specific organ variation probability density function (PDF) during the course of adaptive radiotherapy. Methods: Principal component analysis (PCA) and a time-varying least-squares regression (LSR) method were used on patient specific geometric variations of organs of interest manifested on multiple daily volumetric images obtained during the treatment course. The construction of the OSG includes the determination of eigenvectors of the organ variation using PCA, and the determination of the corresponding coefficients using time-varying LSR. The coefficients can be either random variables or random functions of the elapsed treatment days depending on the characteristics of organ variation as a stationary or a nonstationary random process. The LSR method with time-varying weighting parameters was applied to the precollected daily volumetric images to determine the function form of the coefficients. Eleven head and neck (H&N) cancer patients with 30 daily cone beam CT images each were included in the evaluation of the OSG. The evaluation was performed using a total of 18 organs of interest, including 15 organs at risk and 3 targets. Results: Geometric variations of organs of interest during H&N cancer radiotherapy can be represented using the first 3-4 eigenvectors. These eigenvectors were variable during treatment, and need to be updated using new daily images obtained during the treatment course. The OSG generates random samples of organs of interest from the estimated organ variation PDF of the individual. The accuracy of the estimated PDF can be improved recursively using extra daily image feedback during the treatment course. The average deviations in the estimation of the mean and standard deviation of the organ variation PDF for h
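
    A minimal sketch of the PCA part of such an organ sample generator, assuming a stand-in matrix of daily organ-surface coordinates: the mean shape and the first few eigenvectors are extracted, and random coefficients are drawn to produce new organ samples. The authors additionally fit the coefficients with time-varying least-squares regression, which is not reproduced here; all dimensions and distributions below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical daily observations: each row is a flattened organ surface
# (n_points * 3 coordinates) measured on one treatment day.
n_days, n_coords = 30, 300
daily_shapes = rng.normal(size=(n_days, n_coords))           # stand-in for contour data

mean_shape = daily_shapes.mean(axis=0)
centered = daily_shapes - mean_shape

# PCA via SVD: rows of Vt are eigenvectors of the organ-variation covariance.
U, s, Vt = np.linalg.svd(centered, full_matrices=False)
n_modes = 4                                                   # first 3-4 modes, as in the record
eigvecs = Vt[:n_modes]                                        # (n_modes, n_coords)
coeff_std = s[:n_modes] / np.sqrt(n_days - 1)                 # std of the PCA coefficients

def sample_organ(n_samples):
    """Draw random organ shapes from the estimated variation PDF
    (coefficients assumed independent Gaussian here)."""
    coeffs = rng.normal(0.0, coeff_std, size=(n_samples, n_modes))
    return mean_shape + coeffs @ eigvecs

samples = sample_organ(100)
print(samples.shape)   # (100, 300): 100 random organ-of-interest realizations
```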

  2. Adaptive k-space sampling design for edge-enhanced DCE-MRI using compressed sensing.

    Science.gov (United States)

    Raja, Rajikha; Sinha, Neelam

    2014-09-01

    The critical challenge in dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) is the trade-off between spatial and temporal resolution due to the limited availability of acquisition time. To address this, it is imperative to under-sample k-space and to develop specific reconstruction techniques. Our proposed method reconstructs high-quality images from under-sampled dynamic k-space data through two main improvements: (i) the design of an adaptive k-space sampling lattice and (ii) an edge-enhanced reconstruction technique. A high-resolution data set obtained before the start of the dynamic phase is utilized. The sampling pattern is designed to adapt to the nature of k-space energy distribution obtained from the static high-resolution data. For image reconstruction, the well-known compressed sensing-based total variation (TV) minimization constrained reconstruction scheme is utilized by incorporating the gradient information obtained from the static high-resolution data. The proposed method is tested on seven real dynamic time series consisting of 2 breast data sets and 5 abdomen data sets spanning 1196 images in all. For data availability of only 10%, performance improvement is seen across various quality metrics. Average improvements in Universal Image Quality Index and Structural Similarity Index Metric of up to 28% and 24% on breast data and about 17% and 9% on abdomen data, respectively, are obtained for the proposed method as against the baseline TV reconstruction with variable density random sampling pattern. Copyright © 2014 Elsevier Inc. All rights reserved.
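
    A schematic sketch of an energy-adapted under-sampling mask of the kind described, assuming a static high-resolution reference frame is available: the sampling probability follows the reference k-space energy and a small fully sampled centre block is kept. The paper's actual lattice design and TV-based reconstruction are not reproduced; the fractions and sizes below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

def adaptive_mask(reference_image, keep_fraction=0.10, center_fraction=0.02):
    """Under-sampling mask adapted to the k-space energy of a reference image."""
    k = np.fft.fftshift(np.fft.fft2(reference_image))
    energy = np.abs(k) ** 2
    prob = energy / energy.sum()                       # sample preferentially where energy is

    n_total = reference_image.size
    n_keep = int(keep_fraction * n_total)
    idx = rng.choice(n_total, size=n_keep, replace=False, p=prob.ravel())

    mask = np.zeros(n_total, dtype=bool)
    mask[idx] = True
    mask = mask.reshape(reference_image.shape)

    # Always acquire a fully sampled low-frequency block around the k-space centre.
    ny, nx = reference_image.shape
    cy, cx = int(center_fraction * ny), int(center_fraction * nx)
    mask[ny // 2 - cy: ny // 2 + cy, nx // 2 - cx: nx // 2 + cx] = True
    return mask

reference = rng.normal(size=(128, 128))                # stand-in for the static high-res frame
mask = adaptive_mask(reference)
print(f"sampled fraction: {mask.mean():.3f}")
```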

  3. The importance of training strategy adaptation: a learner-oriented approach for improving older adults' memory and transfer.

    Science.gov (United States)

    Bottiroli, Sara; Cavallini, Elena; Dunlosky, John; Vecchi, Tomaso; Hertzog, Christopher

    2013-09-01

    We investigated the benefits of strategy-adaptation training for promoting transfer effects. This learner-oriented approach--which directly encourages the learner to generalize strategic behavior to new tasks--helps older adults appraise new tasks and adapt trained strategies to them. In Experiment 1, older adults in a strategy-adaptation training group used 2 strategies (imagery and sentence generation) while practicing 2 tasks (list and associative learning); they were then instructed on how to do a simple task analysis to help them adapt the trained strategies for 2 different unpracticed tasks (place learning and text learning) that were discussed during training. Two additional criterion tasks (name-face associative learning and grocery-list learning) were never mentioned during training. Two other groups were included: A strategy training group (who received strategy training and transfer instructions but not strategy-adaptation training) and a waiting-list control group. Both training procedures enhanced older adults' performance on the trained tasks and those tasks that were discussed during training, but transfer was greatest after strategy-adaptation training. Experiment 2 found that strategy-adaptation training conducted via a manual that older adults used at home also promoted transfer. These findings demonstrate the importance of adopting a learner-oriented approach to promote transfer of strategy training.

  4. Accelerating Markov chain Monte Carlo simulation by differential evolution with self-adaptive randomized subspace sampling

    Energy Technology Data Exchange (ETDEWEB)

    Vrugt, Jasper A [Los Alamos National Laboratory; Hyman, James M [Los Alamos National Laboratory; Robinson, Bruce A [Los Alamos National Laboratory; Higdon, Dave [Los Alamos National Laboratory; Ter Braak, Cajo J F [NETHERLANDS; Diks, Cees G H [UNIV OF AMSTERDAM

    2008-01-01

    Markov chain Monte Carlo (MCMC) methods have found widespread use in many fields of study to estimate the average properties of complex systems, and for posterior inference in a Bayesian framework. Existing theory and experiments prove convergence of well-constructed MCMC schemes to the appropriate limiting distribution under a variety of different conditions. In practice, however, this convergence is often observed to be disturbingly slow. This is frequently caused by an inappropriate selection of the proposal distribution used to generate trial moves in the Markov Chain. Here we show that significant improvements to the efficiency of MCMC simulation can be made by using a self-adaptive Differential Evolution learning strategy within a population-based evolutionary framework. This scheme, entitled DiffeRential Evolution Adaptive Metropolis or DREAM, runs multiple different chains simultaneously for global exploration, and automatically tunes the scale and orientation of the proposal distribution in randomized subspaces during the search. Ergodicity of the algorithm is proved, and various examples involving nonlinearity, high-dimensionality, and multimodality show that DREAM is generally superior to other adaptive MCMC sampling approaches. The DREAM scheme significantly enhances the applicability of MCMC simulation to complex, multi-modal search problems.
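
    A bare-bones differential-evolution Metropolis step in the spirit of the scheme described (without DREAM's randomized subspace sampling, adaptive crossover or outlier handling), run on a toy bimodal target. The chain count, the jump scale 2.38/sqrt(2d) and the small jitter term follow common DE-MC practice; everything else is illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)

def log_post(x):
    """Toy bimodal target; stands in for a real posterior."""
    return np.logaddexp(-0.5 * np.sum((x - 2.0) ** 2), -0.5 * np.sum((x + 2.0) ** 2))

def de_mc(n_chains=8, dim=2, n_iter=5000):
    """Differential-evolution Metropolis: proposals are built from other chains' states."""
    X = rng.normal(size=(n_chains, dim))           # one current state per chain
    logp = np.array([log_post(x) for x in X])
    gamma = 2.38 / np.sqrt(2 * dim)                # commonly used DE jump scale
    chain = []
    for _ in range(n_iter):
        for i in range(n_chains):
            r1, r2 = rng.choice([j for j in range(n_chains) if j != i], 2, replace=False)
            prop = X[i] + gamma * (X[r1] - X[r2]) + rng.normal(0, 1e-6, dim)
            logp_prop = log_post(prop)
            if np.log(rng.random()) < logp_prop - logp[i]:   # Metropolis accept/reject
                X[i], logp[i] = prop, logp_prop
        chain.append(X.copy())
    return np.array(chain)

samples = de_mc()
print("mean of |x1| over all chains:", np.abs(samples[:, :, 0]).mean())   # close to 2
```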

  5. Importance of anthropogenic climate impact, sampling error and urban development in sewer system design.

    Science.gov (United States)

    Egger, C; Maurer, M

    2015-04-15

    Urban drainage design relying on observed precipitation series neglects the uncertainties associated with current and indeed future climate variability. Urban drainage design is further affected by the large stochastic variability of precipitation extremes and sampling errors arising from the short observation periods of extreme precipitation. Stochastic downscaling addresses anthropogenic climate impact by allowing relevant precipitation characteristics to be derived from local observations and an ensemble of climate models. This multi-climate model approach seeks to reflect the uncertainties in the data due to structural errors of the climate models. An ensemble of outcomes from stochastic downscaling allows for addressing the sampling uncertainty. These uncertainties are clearly reflected in the precipitation-runoff predictions of three urban drainage systems. They were mostly due to the sampling uncertainty. The contribution of climate model uncertainty was found to be of minor importance. Under the applied greenhouse gas emission scenario (A1B) and within the period 2036-2065, the potential for urban flooding in our Swiss case study is slightly reduced on average compared to the reference period 1981-2010. Scenario planning was applied to consider urban development associated with future socio-economic factors affecting urban drainage. The impact of scenario uncertainty was to a large extent found to be case-specific, thus emphasizing the need for scenario planning in every individual case. The results represent a valuable basis for discussions of new drainage design standards aiming specifically to include considerations of uncertainty. Copyright © 2015 Elsevier Ltd. All rights reserved.

  6. The importance of socio-ecological system dynamics in understanding adaptation to global change in the forestry sector.

    Science.gov (United States)

    Blanco, Victor; Brown, Calum; Holzhauer, Sascha; Vulturius, Gregor; Rounsevell, Mark D A

    2017-07-01

    Adaptation is necessary to cope with or take advantage of the effects of climate change on socio-ecological systems. This is especially important in the forestry sector, which is sensitive to the ecological and economic impacts of climate change, and where the adaptive decisions of owners play out over long periods of time. Relatively little is known about how successful these decisions are likely to be in meeting demands for ecosystem services in an uncertain future. We explore adaptation to global change in the forestry sector using CRAFTY-Sweden; an agent-based model that represents large-scale land-use dynamics, based on the demand and supply of ecosystem services. Future impacts and adaptation within the Swedish forestry sector were simulated for scenarios of socio-economic change (Shared Socio-economic Pathways) and climatic change (Representative Concentration Pathways, for three climate models), between 2010 and 2100. Substantial differences were found in the competitiveness and coping ability of land owners implementing different management strategies through time. Generally, multi-objective management was found to provide the best basis for adaptation. Across large regions, however, a combination of management strategies was better at meeting ecosystem service demands. Results also show that adaptive capacity evolves through time in response to external (global) drivers and interactions between individual actors. This suggests that process-based models are more appropriate for the study of autonomous adaptation and future adaptive and coping capacities than models based on indicators, discrete time snapshots or exogenous proxies. Nevertheless, a combination of planned and autonomous adaptation by institutions and forest owners is likely to be more successful than either group acting alone. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Adaptive Kalman Filter Based on Adjustable Sampling Interval in Burst Detection for Water Distribution System

    Directory of Open Access Journals (Sweden)

    Doo Yong Choi

    2016-04-01

    Full Text Available Rapid detection of bursts and leaks in water distribution systems (WDSs) can reduce the social and economic costs incurred through direct loss of water into the ground, additional energy demand for water supply, and service interruptions. Many real-time burst detection models have been developed in accordance with the use of supervisory control and data acquisition (SCADA) systems and the establishment of district meter areas (DMAs). Nonetheless, no consideration has been given to how frequently a flow meter measures and transmits data for predicting breaks and leaks in pipes. This paper analyzes the effect of sampling interval when an adaptive Kalman filter is used for detecting bursts in a WDS. A new sampling algorithm is presented that adjusts the sampling interval depending on the normalized residuals of flow after filtering. The proposed algorithm is applied to a virtual sinusoidal flow curve and real DMA flow data obtained from Jeongeup city in South Korea. The simulation results prove that the self-adjusting algorithm for determining the sampling interval is efficient and maintains reasonable accuracy in burst detection. The proposed sampling method has a significant potential for water utilities to build and operate real-time DMA monitoring systems combined with smart customer metering systems.
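
    A toy version of the adjustable-interval idea: a scalar Kalman filter tracks a synthetic DMA inflow, and the sampling interval is shortened when the normalized innovation (residual) is large and relaxed otherwise. The flow model (a random walk), noise variances, thresholds and interval bounds are assumptions, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic DMA inflow: smooth diurnal pattern with a burst (step increase) at t = 600 min.
t = np.arange(0, 1440)
flow = 50 + 20 * np.sin(2 * np.pi * t / 1440) + rng.normal(0, 1.0, t.size)
flow[600:] += 15.0                                    # simulated burst

# Scalar Kalman filter with a random-walk flow model (assumed, not the paper's model).
q, r = 0.5, 1.0          # process / measurement noise variances
x, p = flow[0], 1.0      # state estimate and its variance

interval, k = 5, 0       # start by sampling every 5 minutes
min_dt, max_dt = 1, 15
while k < t.size:
    z = flow[k]                                       # new flow measurement
    p += q * interval                                 # predict: variance grows with the gap
    s = p + r
    residual = (z - x) / np.sqrt(s)                   # normalized innovation
    gain = p / s
    x += gain * (z - x)                               # update
    p *= (1 - gain)

    if abs(residual) > 3.0:                           # suspicious: sample faster
        interval = min_dt
        print(f"t={k:4d} min  possible burst, residual={residual:+.1f}")
    else:                                             # quiet: relax the sampling interval
        interval = min(max_dt, interval + 1)
    k += interval
```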

  8. Image classification with densely sampled image windows and generalized adaptive multiple kernel learning.

    Science.gov (United States)

    Yan, Shengye; Xu, Xinxing; Xu, Dong; Lin, Stephen; Li, Xuelong

    2015-03-01

    We present a framework for image classification that extends beyond the window sampling of fixed spatial pyramids and is supported by a new learning algorithm. Based on the observation that fixed spatial pyramids sample a rather limited subset of the possible image windows, we propose a method that accounts for a comprehensive set of windows densely sampled over location, size, and aspect ratio. A concise high-level image feature is derived to effectively deal with this large set of windows, and this higher level of abstraction offers both efficient handling of the dense samples and reduced sensitivity to misalignment. In addition to dense window sampling, we introduce generalized adaptive l(p)-norm multiple kernel learning (GA-MKL) to learn a robust classifier based on multiple base kernels constructed from the new image features and multiple sets of prelearned classifiers from other classes. With GA-MKL, multiple levels of image features are effectively fused, and information is shared among different classifiers. Extensive evaluation on benchmark datasets for object recognition (Caltech256 and Caltech101) and scene recognition (15Scenes) demonstrate that the proposed method outperforms the state-of-the-art under a broad range of settings.

  9. Adaptive sampling in two-phase designs: a biomarker study for progression in arthritis

    Science.gov (United States)

    McIsaac, Michael A; Cook, Richard J

    2015-01-01

    Response-dependent two-phase designs are used increasingly often in epidemiological studies to ensure sampling strategies offer good statistical efficiency while working within resource constraints. Optimal response-dependent two-phase designs are difficult to implement, however, as they require specification of unknown parameters. We propose adaptive two-phase designs that exploit information from an internal pilot study to approximate the optimal sampling scheme for an analysis based on mean score estimating equations. The frequency properties of estimators arising from this design are assessed through simulation, and they are shown to be similar to those from optimal designs. The design procedure is then illustrated through application to a motivating biomarker study in an ongoing rheumatology research program. Copyright © 2015 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd. PMID:25951124

  10. Sheet metal hardening curve determined by laminated sample and its adaptability to sheet forming processes

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    The hardening curve for sheet metal can be determined from the load-displacement curve of a tensile specimen with a rectangular cross-section. Therefore, a uniaxial compression test on a cylindrical specimen made from a laminated sample is put forward. Considering the influence of anisotropy on hardening properties and the stress state in common forming processes, a plane strain compression test on a cubic specimen made from a laminated sample was also developed. Results show that the deformation range of the hardening curves obtained from the presented methods is wide, which meets the needs of sheet metal forming applications. In view of the characteristics of the methods presented in this paper and the stress-strain states of various forming processes, the adaptability of the two methods is discussed.

  11. Improved algorithms and coupled neutron-photon transport for auto-importance sampling method

    Science.gov (United States)

    Wang, Xin; Li, Jun-Li; Wu, Zhen; Qiu, Rui; Li, Chun-Yan; Liang, Man-Chun; Zhang, Hui; Gang, Zhi; Xu, Hong

    2017-01-01

    The Auto-Importance Sampling (AIS) method is a Monte Carlo variance reduction technique proposed for deep penetration problems, which can significantly improve computational efficiency without pre-calculations for importance distribution. However, the AIS method has only been validated with several simple examples, and cannot be used for coupled neutron-photon transport. This paper presents improved algorithms for the AIS method, including particle transport, fictitious particle creation and adjustment, fictitious surface geometry, random number allocation and calculation of the estimated relative error. These improvements allow the AIS method to be applied to complicated deep penetration problems with complex geometry and multiple materials. A completely coupled Neutron-Photon Auto-Importance Sampling (CNP-AIS) method is proposed to solve the deep penetration problems of coupled neutron-photon transport using the improved algorithms. The NUREG/CR-6115 PWR benchmark was calculated by using the methods of CNP-AIS, geometry splitting with Russian roulette and analog Monte Carlo, respectively. The calculation results of CNP-AIS are in good agreement with those of geometry splitting with Russian roulette and the benchmark solutions. The computational efficiency of CNP-AIS for both neutron and photon is much better than that of geometry splitting with Russian roulette in most cases, and increased by several orders of magnitude compared with that of the analog Monte Carlo. Supported by the National Science and Technology Major Project of China (2013ZX06002001-007, 2011ZX06004-007) and National Natural Science Foundation of China (11275110, 11375103)
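
    For readers unfamiliar with why importance sampling is needed in deep-penetration problems, the sketch below estimates the uncollided transmission probability through a thick slab by stretching the sampled free paths and reweighting, and compares it with analog Monte Carlo. This is a textbook-style exponential-biasing illustration, not the AIS/CNP-AIS algorithm of the record; the cross-section and slab thickness are invented.

```python
import numpy as np

rng = np.random.default_rng(5)

sigma, L = 1.0, 20.0            # total cross-section and slab thickness (in mean free paths)
exact = np.exp(-sigma * L)      # uncollided transmission probability, about 2e-9
n = 10**6

# Analog Monte Carlo: almost never scores for a 20-mfp-thick slab.
s_analog = rng.exponential(1 / sigma, n)
p_analog = np.mean(s_analog > L)

# Importance sampling: stretch the free path (smaller fictitious cross-section) and
# carry the likelihood-ratio weight  (sigma/sigma_b) * exp(-(sigma - sigma_b) * s).
sigma_b = 1.0 / L               # biased cross-section pushes particles deep into the slab
s_bias = rng.exponential(1 / sigma_b, n)
w = (sigma / sigma_b) * np.exp(-(sigma - sigma_b) * s_bias)
scores = w * (s_bias > L)
p_is = scores.mean()
rel_err = scores.std(ddof=1) / np.sqrt(n) / p_is

print(f"exact      : {exact:.3e}")
print(f"analog MC  : {p_analog:.3e}   (usually 0 with 1e6 histories)")
print(f"IS estimate: {p_is:.3e}   relative error ~ {rel_err:.1%}")
```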

  12. Motion-adapted pulse sequences for oriented sample (OS) solid-state NMR of biopolymers.

    Science.gov (United States)

    Lu, George J; Opella, Stanley J

    2013-08-28

    One of the main applications of solid-state NMR is to study the structure and dynamics of biopolymers, such as membrane proteins, under physiological conditions where the polypeptides undergo global motions as they do in biological membranes. The effects of NMR radiofrequency irradiations on nuclear spins are strongly influenced by these motions. For example, we previously showed that the MSHOT-Pi4 pulse sequence yields spectra with resonance line widths about half of those observed using the conventional pulse sequence when applied to membrane proteins undergoing rapid uniaxial rotational diffusion in phospholipid bilayers. In contrast, the line widths were not changed in microcrystalline samples where the molecules did not undergo global motions. Here, we demonstrate experimentally and describe analytically how some Hamiltonian terms are susceptible to sample motions, and it is their removal through the critical π/2 Z-rotational symmetry that confers the "motion adapted" property to the MSHOT-Pi4 pulse sequence. This leads to the design of separated local field pulse sequence "Motion-adapted SAMPI4" and is generalized to an approach for the design of decoupling sequences whose performance is superior in the presence of molecular motions. It works by cancelling the spin interaction by explicitly averaging the reduced Wigner matrix to zero, rather than utilizing the 2π nutation to average spin interactions. This approach is applicable to both stationary and magic angle spinning solid-state NMR experiments.

  13. An Energy Aware Adaptive Sampling Algorithm for Energy Harvesting WSN with Energy Hungry Sensors.

    Science.gov (United States)

    Srbinovski, Bruno; Magno, Michele; Edwards-Murphy, Fiona; Pakrashi, Vikram; Popovici, Emanuel

    2016-03-28

    Wireless sensor nodes have a limited power budget, though they are often expected to be functional in the field once deployed for extended periods of time. Therefore, minimization of energy consumption and energy harvesting technology in Wireless Sensor Networks (WSN) are key tools for maximizing network lifetime, and achieving self-sustainability. This paper proposes an energy aware Adaptive Sampling Algorithm (ASA) for WSN with power hungry sensors and harvesting capabilities, an energy management technique that can be implemented on any WSN platform with enough processing power to execute the proposed algorithm. An existing state-of-the-art ASA developed for wireless sensor networks with power hungry sensors is optimized and enhanced to adapt the sampling frequency according to the available energy of the node. The proposed algorithm is evaluated using two in-field testbeds that are supplied by two different energy harvesting sources (solar and wind). Simulation and comparison between the state-of-the-art ASA and the proposed energy aware ASA (EASA) in terms of energy durability are carried out using in-field measured harvested energy (using both wind and solar sources) and power hungry sensors (ultrasonic wind sensor and gas sensors). The simulation results demonstrate that using ASA in combination with an energy aware function on the nodes can drastically increase the lifetime of a WSN node and enable self-sustainability. In fact, the proposed EASA in conjunction with energy harvesting capability can lead towards perpetual WSN operation and significantly outperform the state-of-the-art ASA.
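
    A hypothetical energy-aware sampling rule in the spirit of this record: the sampling period is scaled by the battery state of charge and by the ratio of harvested to consumed power, so sampling slows down when energy is scarce. The thresholds and scaling law are invented for illustration and are not the published ASA/EASA algorithm.

```python
def next_sampling_period(base_period_s, soc, harvest_w, sensor_power_w,
                         soc_low=0.3, soc_high=0.8):
    """Pick the next sampling period from the battery state of charge (soc, 0..1)
    and the current harvested vs. consumed power. Illustrative rule only."""
    energy_ratio = min(harvest_w / sensor_power_w, 1.0) if sensor_power_w > 0 else 1.0

    if soc >= soc_high:
        scale = 1.0                       # plenty of energy: sample at the base rate
    elif soc <= soc_low:
        scale = 4.0 - 2.0 * energy_ratio  # critical: slow down 2x-4x depending on harvesting
    else:
        # Linear blend between the comfortable and critical regimes.
        frac = (soc_high - soc) / (soc_high - soc_low)
        scale = 1.0 + frac * (3.0 - 2.0 * energy_ratio)
    return base_period_s * scale


# Example: an ultrasonic wind sensor sampled every 60 s at full battery.
for soc in (0.9, 0.5, 0.2):
    for harvest in (0.0, 0.5):
        period = next_sampling_period(60, soc, harvest_w=harvest, sensor_power_w=0.5)
        print(f"SoC={soc:.1f} harvest={harvest:.1f} W -> sample every {period:5.1f} s")
```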

  14. Psychometric validity and clinical usefulness of the Vineland Adaptive Behavior Scales and the AAMD Adaptive Behavior Scale for an autistic sample.

    Science.gov (United States)

    Perry, A; Factor, D C

    1989-03-01

    Two prominent assessment measures of adaptive behavior were compared and evaluated in terms of their psychometric properties and their clinical usefulness for autistic children and adolescents. The AAMD Adaptive Behavior Scale-School Edition (Lambert & Windmiller, 1981) and the Vineland Adaptive Behavior Scales (Sparrow, Balla, & Cicchetti, 1984) were compared in 15 autistic persons aged 8 to 18. Correlations between the two instruments revealed good concurrent validity. The psychometric properties of the tests were similar to those found in samples of mentally retarded persons. The use of adaptive behavior measures for autistic children and adolescents is encouraged. Clinical advantages and disadvantages of the two tests are discussed.

  15. Adapting the Vegetative Vigour Terrestrial Plant Test for assessing ecotoxicity of aerosol samples.

    Science.gov (United States)

    Kováts, Nora; Horváth, Eszter; Eck-Varanka, Bettina; Csajbók, Eszter; Hoffer, András

    2017-06-01

    Plants, being recognized to show high sensitivity to air pollution, have long been used to assess the ecological effects of airborne contaminants. However, many changes in vegetation are now generally attributed to atmospheric deposition of aerosol particles; the dose-effect relationships of this process are usually poorly known. In contrast to bioindication studies, ecotoxicological tests (or bioassays) are controlled and reproducible, and ecological responses are determined quantitatively. In our study, the No. 227 OECD Guideline for the Testing of Chemicals: Terrestrial Plant Test: Vegetative Vigour Test (hereinafter referred to as 'Guideline') was adapted and its applicability for assessing the ecotoxicity of the water-soluble compounds of aerosol samples was evaluated. In the aqueous extract of the sample, the concentrations of metals, benzenes, aliphatic hydrocarbons and PAHs were determined analytically. Cucumis sativus L. plants were sprayed with the aqueous extract of urban aerosol samples collected in a winter sampling campaign in Budapest. After the termination of the test, on day 22, the following endpoints were measured: fresh weight, shoot length and visible symptoms. The higher concentrations applied caused leaf necrosis due to toxic compounds found in the extract. On the other hand, the extract elicited a stimulatory effect at low concentrations on both fresh weight and shoot length. The test protocol, based on the Guideline, seems sensitive enough to assess the phytotoxicity of aqueous aerosol extracts and to establish a clear cause-effect relationship.

  16. Estimation variance bounds of importance sampling simulations in digital communication systems

    Science.gov (United States)

    Lu, D.; Yao, K.

    1991-01-01

    In practical applications of importance sampling (IS) simulation, two basic problems are encountered, that of determining the estimation variance and that of evaluating the proper IS parameters needed in the simulations. The authors derive new upper and lower bounds on the estimation variance which are applicable to IS techniques. The upper bound is simple to evaluate and may be minimized by the proper selection of the IS parameter. Thus, lower and upper bounds on the improvement ratio of various IS techniques relative to the direct Monte Carlo simulation are also available. These bounds are shown to be useful and computationally simple to obtain. Based on the proposed technique, one can readily find practical suboptimum IS parameters. Numerical results indicate that these bounding techniques are useful for IS simulations of linear and nonlinear communication systems with intersymbol interference in which bit error rate and IS estimation variances cannot be obtained readily using prior techniques.
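
    A minimal BER-estimation example in the spirit of this record, for antipodal signaling in Gaussian noise: the noise mean is shifted onto the decision boundary, each sample is weighted by the likelihood ratio, and the empirical standard errors of the naive MC and IS estimators can then be compared. The channel model and parameters are textbook assumptions, not the authors' systems with intersymbol interference.

```python
import numpy as np
from math import erfc, sqrt

rng = np.random.default_rng(6)

sigma = 0.25                 # noise std; we send +1, and a bit error occurs when 1 + noise < 0
n_sim = 10**5
exact = 0.5 * erfc(1.0 / (sigma * sqrt(2)))      # Q(1/sigma) = Q(4), about 3.2e-5

# Naive Monte Carlo estimator of the BER.
noise = rng.normal(0.0, sigma, n_sim)
err_mc = (1.0 + noise < 0.0).astype(float)

# Importance sampling: bias the noise mean onto the decision boundary (mu = -1)
# and weight each sample by the likelihood ratio N(0, s^2) / N(mu, s^2).
mu = -1.0
noise_is = rng.normal(mu, sigma, n_sim)
w = np.exp((mu**2 - 2.0 * mu * noise_is) / (2.0 * sigma**2))
err_is = w * (1.0 + noise_is < 0.0)

print(f"exact BER          : {exact:.3e}")
for name, e in (("naive MC", err_mc), ("importance sampling", err_is)):
    # Standard error of the sample-mean estimator: an empirical check on estimation variance.
    print(f"{name:19s}: {e.mean():.3e}  (std.err {e.std(ddof=1) / np.sqrt(n_sim):.1e})")
```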

  18. A hybrid algorithm for reliability analysis combining Kriging and subset simulation importance sampling

    Energy Technology Data Exchange (ETDEWEB)

    Tong, Cao; Sun, Zhili; Zhao, Qianli; Wang, Qibin [Northeastern University, Shenyang (China); Wang, Shuang [Jiangxi University of Science and Technology, Ganzhou (China)

    2015-08-15

    To solve the problem of large computation when failure probability with time-consuming numerical model is calculated, we propose an improved active learning reliability method called AK-SSIS based on AK-IS algorithm. First, an improved iterative stopping criterion in active learning is presented so that iterations decrease dramatically. Second, the proposed method introduces Subset simulation importance sampling (SSIS) into the active learning reliability calculation, and then a learning function suitable for SSIS is proposed. Finally, the efficiency of AK-SSIS is proved by two academic examples from the literature. The results show that AK-SSIS requires fewer calls to the performance function than AK-IS, and the failure probability obtained from AK-SSIS is very robust and accurate. Then this method is applied on a spur gear pair for tooth contact fatigue reliability analysis.

  19. Model reduction algorithms for optimal control and importance sampling of diffusions

    Science.gov (United States)

    Hartmann, Carsten; Schütte, Christof; Zhang, Wei

    2016-08-01

    We propose numerical algorithms for solving optimal control and importance sampling problems based on simplified models. The algorithms combine model reduction techniques for multiscale diffusions and stochastic optimization tools, with the aim of reducing the original, possibly high-dimensional problem to a lower dimensional representation of the dynamics, in which only a few relevant degrees of freedom are controlled or biased. Specifically, we study situations in which either a reaction coordinate onto which the dynamics can be projected is known, or situations in which the dynamics shows strongly localized behavior in the small noise regime. No explicit assumptions about small parameters or scale separation have to be made. We illustrate the approach with simple, but paradigmatic numerical examples.

  20. Adapting dried blood spot sampling for an anti-therapeutic antibody immunogenicity assay.

    Science.gov (United States)

    Xiang, Yuhong; Welch, Mackenzie; Amaravadi, Lakshmi; Stebbins, Christopher

    2013-07-31

    Dried blood spot sampling is a microvolume sampling technique with many potential advantages. It allows for easier handling and less expensive shipment and storage of biological samples. Additionally, it can provide ethical benefits in the pre-clinical setting through a reduction in animal usage by allowing intensive serial sample collection from the same animals. In the clinical setting, ease of sample collection, greater flexibility of sample storage, and shipping are distinct advantages. These advantages can enhance preclinical and clinical data quality, where immunogenicity monitoring plays an important role in the interpretation of pharmacokinetic data. To date, a method for usage of dried blood spot sampling with an immunogenicity assay has not been published. Herein we demonstrate that the measurement of anti-drug antibodies (ADA) using DBS was comparable to traditional methods in terms of reproducibility, assay sensitivity and drug tolerance. The data demonstrate that DBS is a viable sample collection method, and in some cases may be preferred, over classic serum or plasma sampling for antidrug antibody assays. Copyright © 2013 Elsevier B.V. All rights reserved.

  1. The jigsaw puzzle of sequence phenotype inference: Piecing together Shannon entropy, importance sampling, and Empirical Bayes.

    Science.gov (United States)

    Shreif, Zeina; Striegel, Deborah A; Periwal, Vipul

    2015-09-07

    A nucleotide sequence 35 base pairs long can take 1,180,591,620,717,411,303,424 possible values. An example of systems biology datasets, protein binding microarrays, contain activity data from about 40,000 such sequences. The discrepancy between the number of possible configurations and the available activities is enormous. Thus, albeit that systems biology datasets are large in absolute terms, they oftentimes require methods developed for rare events due to the combinatorial increase in the number of possible configurations of biological systems. A plethora of techniques for handling large datasets, such as Empirical Bayes, or rare events, such as importance sampling, have been developed in the literature, but these cannot always be simultaneously utilized. Here we introduce a principled approach to Empirical Bayes based on importance sampling, information theory, and theoretical physics in the general context of sequence phenotype model induction. We present the analytical calculations that underlie our approach. We demonstrate the computational efficiency of the approach on concrete examples, and demonstrate its efficacy by applying the theory to publicly available protein binding microarray transcription factor datasets and to data on synthetic cAMP-regulated enhancer sequences. As further demonstrations, we find transcription factor binding motifs, predict the activity of new sequences and extract the locations of transcription factor binding sites. In summary, we present a novel method that is efficient (requiring minimal computational time and reasonable amounts of memory), has high predictive power that is comparable with that of models with hundreds of parameters, and has a limited number of optimized parameters, proportional to the sequence length.

  2. Importance Sampling Variance Reduction for the Fokker-Planck Rarefied Gas Particle Method

    CERN Document Server

    Collyer, Benjamin; Lockerby, Duncan

    2015-01-01

    Models and methods that are able to accurately and efficiently predict the flows of low-speed rarefied gases are in high demand, due to the increasing ability to manufacture devices at micro and nano scales. One such model and method is a Fokker-Planck approximation to the Boltzmann equation, which can be solved numerically by a stochastic particle method. The stochastic nature of this method leads to noisy estimates of the thermodynamic quantities one wishes to sample when the signal is small in comparison to the thermal velocity of the gas. Recently, Gorji et al have proposed a method which is able to greatly reduce the variance of the estimators, by creating a correlated stochastic process which acts as a control variate for the noisy estimates. However, there are potential difficulties involved when the geometry of the problem is complex, as the method requires the density to be solved for independently. Importance sampling is a variance reduction technique that has already been shown to successfully redu...

  3. Sampling

    CERN Document Server

    Thompson, Steven K

    2012-01-01

    Praise for the Second Edition "This book has never had a competitor. It is the only book that takes a broad approach to sampling . . . any good personal statistics library should include a copy of this book." —Technometrics "Well-written . . . an excellent book on an important subject. Highly recommended." —Choice "An ideal reference for scientific researchers and other professionals who use sampling." —Zentralblatt Math Features new developments in the field combined with all aspects of obtaining, interpreting, and using sample data Sampling provides an up-to-date treat

  4. Intraspecific shape variation in horseshoe crabs: the importance of sexual and natural selection for local adaptation

    DEFF Research Database (Denmark)

    Faurby, Søren; Nielsen, Kasper Sauer Kollerup; Bussarawit, Somchai

    2011-01-01

    polyphemus, which has the largest climatic differences between different populations. Local adaptation driven by sexual selection was expected in males but not females because horseshoe crab mating behaviour leads to competition between males, but not between females. Three hundred fifty-nine horseshoe crabs...... from nine populations, representing three species, were analyzed using a digitizer to position sixty morphometric landmarks in a three-dimensional space. Discriminant analysis revealed strong regional structuring within a species, which suggests strong philopatry, and showed the existence...

  5. Terminological Importation for Adapting Reusable Knowledge Representation Components in the KSM Environment

    OpenAIRE

    Sierra, José Luis; Molina, Martin

    1998-01-01

    This paper describes the adaptation approach of reusable knowledge representation components used in the KSM environment for the formulation and operationalisation of structured knowledge models. Reusable knowledge representation components in KSM are called primitives of representation. A primitive of representation provides: (1) a knowledge representation formalism (2) a set of tasks that use this knowledge together with several problem-solving methods to carry out these tasks (3) a knowled...

  6. Unified Importance Sampling Schemes for Efficient Simulation of Outage Capacity over Generalized Fading Channels

    KAUST Repository

    Rached, Nadhir B.

    2015-11-13

    The outage capacity (OC) is among the most important performance metrics of communication systems operating over fading channels. Of interest in the present paper is the evaluation of the OC at the output of the Equal Gain Combining (EGC) and the Maximum Ratio Combining (MRC) receivers. In this case, it can be seen that this problem turns out to be that of computing the Cumulative Distribution Function (CDF) for the sum of independent random variables. Since finding a closed-form expression for the CDF of the sum distribution is out of reach for a wide class of commonly used distributions, methods based on Monte Carlo (MC) simulations take pride of place. In order to allow for the estimation of the operating range of small outage probabilities, it is of paramount importance to develop fast and efficient estimation methods as naive MC simulations would require high computational complexity. In this line, we propose in this work two unified, yet efficient, hazard rate twisting Importance Sampling (IS) based approaches that efficiently estimate the OC of MRC or EGC diversity techniques over generalized independent fading channels. The first estimator is shown to possess the asymptotic optimality criterion and applies for arbitrary fading models, whereas the second one achieves the desirable bounded relative error property for the majority of the well-known fading variates. Moreover, the second estimator is shown to achieve the asymptotic optimality property under the particular Log-normal environment. Some selected simulation results are finally provided in order to illustrate the substantial computational gain achieved by the proposed IS schemes over naive MC simulations.
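
    A simplified version of the left-tail problem treated above, assuming i.i.d. exponential stand-ins for the fading gains: P(sum <= threshold) is estimated by exponentially tilting each branch toward zero and reweighting. This is plain exponential tilting for the exponential case, not the paper's hazard rate twisting for general fading; the number of branches, the threshold and the tilting rate are illustrative.

```python
import numpy as np
from math import exp, factorial

rng = np.random.default_rng(7)

L, gamma = 4, 0.1          # number of diversity branches and the (small) outage threshold
n = 10**5

# Exact left-tail probability for a sum of L i.i.d. Exp(1) gains (Erlang CDF).
exact = 1.0 - exp(-gamma) * sum(gamma**k / factorial(k) for k in range(L))

# Naive MC: the event {sum <= gamma} is almost never observed.
x = rng.exponential(1.0, size=(n, L))
p_mc = np.mean(x.sum(axis=1) <= gamma)

# Importance sampling: tilt each branch toward zero by sampling Exp(lam) with lam = L/gamma,
# so the biased sum has mean gamma; weight = prod f(x_i)/g(x_i) = lam**(-L) * exp((lam-1)*sum x_i).
lam = L / gamma
x_is = rng.exponential(1.0 / lam, size=(n, L))
s = x_is.sum(axis=1)
w = lam**(-L) * np.exp((lam - 1.0) * s)
est = w * (s <= gamma)
p_is = est.mean()
rel_err = est.std(ddof=1) / np.sqrt(n) / p_is

print(f"exact    : {exact:.3e}")
print(f"naive MC : {p_mc:.3e}")
print(f"IS       : {p_is:.3e}  (relative error ~ {rel_err:.1%})")
```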

  7. Component-adaptive up-sampling for inter layer interpolation in scalable video coding

    Institute of Scientific and Technical Information of China (English)

    WANG Zhang; ZHANG JiXian; LI HaiTao

    2009-01-01

    Scalable video coding (SVC) is a newly emerging standard to be finalized as an extension of H.264/AVC. The most attractive characters in SVC are the inter layer prediction techniques, such as Intra_BL mode. But in current SVC scheme, a uniform up-sampling filter (UUSF) is employed to magnify all components of an image, which will be very inefficient and result in a lot of redundant computational complexity. To overcome this, we propose an efficient component-adaptive up-sampling filter (CAUSF) for inter layer interpolation. In CAUSF, one character of human vision system is considered, and different up-sampling filters are assigned to different components. In particular, the six-tap FIR filter used in UUSF is kept and assigned for luminance component. But for chrominance components, a new four-tap FIR filter is used. Experimental results show that CAUSF maintains the performances of coded bit-rate and PSNR-Y without any noticeable loss, and provides significant reduction in computational complexity.
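
    A schematic of the component-adaptive idea: the luma plane is 2x up-sampled with a six-tap kernel (the H.264 half-pel filter is used here as a stand-in) and the chroma planes with a shorter four-tap kernel. The four-tap coefficients and the filtering details (sample alignment, border handling) are illustrative and are not the filters defined in the paper.

```python
import numpy as np

# Six-tap kernel (H.264 half-pel interpolator, used as a stand-in for the luma filter).
LUMA_TAPS = np.array([1, -5, 20, 20, -5, 1], dtype=float) / 32.0
# Shorter, cheaper kernel for chroma; these coefficients are illustrative only.
CHROMA_TAPS = np.array([-2, 18, 18, -2], dtype=float) / 32.0

def upsample_1d(line, taps):
    """2x upsampling along one axis: keep the original samples and fill the
    inserted positions with filtered values (alignment kept schematic)."""
    half = np.convolve(line, taps, mode="same")
    out = np.empty(2 * line.size)
    out[0::2] = line
    out[1::2] = half
    return out

def upsample_plane(plane, taps):
    plane = np.apply_along_axis(upsample_1d, 1, plane, taps)   # horizontally
    plane = np.apply_along_axis(upsample_1d, 0, plane, taps)   # then vertically
    return plane

rng = np.random.default_rng(8)
y = rng.integers(0, 256, (16, 16)).astype(float)    # base-layer luma
cb = rng.integers(0, 256, (8, 8)).astype(float)     # base-layer chroma (4:2:0)

y_up = upsample_plane(y, LUMA_TAPS)                 # 6-tap path for luminance
cb_up = upsample_plane(cb, CHROMA_TAPS)             # 4-tap path for chrominance
print(y_up.shape, cb_up.shape)                      # (32, 32) (16, 16)
```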

  8. PARALLEL ADAPTIVE MULTILEVEL SAMPLING ALGORITHMS FOR THE BAYESIAN ANALYSIS OF MATHEMATICAL MODELS

    KAUST Repository

    Prudencio, Ernesto

    2012-01-01

    In recent years, Bayesian model updating techniques based on measured data have been applied to many engineering and applied science problems. At the same time, parallel computational platforms are becoming increasingly more powerful and are being used more frequently by the engineering and scientific communities. Bayesian techniques usually require the evaluation of multi-dimensional integrals related to the posterior probability density function (PDF) of uncertain model parameters. The fact that such integrals cannot be computed analytically motivates the research of stochastic simulation methods for sampling posterior PDFs. One such algorithm is the adaptive multilevel stochastic simulation algorithm (AMSSA). In this paper we discuss the parallelization of AMSSA, formulating the necessary load balancing step as a binary integer programming problem. We present a variety of results showing the effectiveness of load balancing on the overall performance of AMSSA in a parallel computational environment.

  9. Accelerating the convergence of replica exchange simulations using Gibbs sampling and adaptive temperature sets

    CERN Document Server

    Vogel, Thomas

    2015-01-01

    We recently introduced a novel replica-exchange scheme in which an individual replica can sample from states encountered by other replicas at any previous time by way of a global configuration database, enabling the fast propagation of relevant states through the whole ensemble of replicas. This mechanism depends on the knowledge of global thermodynamic functions which are measured during the simulation and not coupled to the heat bath temperatures driving the individual simulations. Therefore, this setup also allows for a continuous adaptation of the temperature set. In this paper, we will review the new scheme and demonstrate its capability. The method is particularly useful for the fast and reliable estimation of the microcanonical temperature T(U) or, equivalently, of the density of states g(U) over a wide range of energies.

  10. [Attaching importance to study on acute health risk assessment and adaptation of air pollution and climate change].

    Science.gov (United States)

    Shi, X M

    2017-03-10

    Air pollution and climate change have become key environmental and public health problems around the world, which poses serious threat to human health. How to assess and mitigate the health risks and increase the adaptation of the public have become an urgent topic of research in this area. The six papers in this issue will provide important and rich information on design, analysis method, indicator selection and setting about acute health risk assessment and adaptation study of air pollution and climate change in China, reflecting the advanced conceptions of multi-center and area-specific study and multi-pollutant causing acute effect study. However, the number and type of the cities included in these studies were still limited. In future, researchers should further expand detailed multi-center and multi-area study coverage, conduct area specific predicting and early warning study and strengthen adaptation study.

  11. Recruiting hard-to-reach United States population sub-groups via adaptations of snowball sampling strategy

    Science.gov (United States)

    Sadler, Georgia Robins; Lee, Hau-Chen; Seung-Hwan Lim, Rod; Fullerton, Judith

    2011-01-01

    Nurse researchers and educators often engage in outreach to narrowly defined populations. This article offers examples of how variations on the snowball sampling recruitment strategy can be applied in the creation of culturally appropriate, community-based information dissemination efforts related to recruitment to health education programs and research studies. Examples from the primary author’s program of research are provided to demonstrate how adaptations of snowball sampling can be effectively used in the recruitment of members of traditionally underserved or vulnerable populations. The adaptation of snowball sampling techniques, as described in this article, helped the authors to gain access to each of the more vulnerable population groups of interest. The use of culturally sensitive recruitment strategies is both appropriate and effective in enlisting the involvement of members of vulnerable populations. Adaptations of snowball sampling strategies should be considered when recruiting participants for education programs or subjects for research studies when recruitment of a population based sample is not essential. PMID:20727089

  12. Rapid detection and differentiation of important Campylobacter spp. in poultry samples by dot blot and PCR.

    Science.gov (United States)

    Fontanot, Marco; Iacumin, Lucilla; Cecchini, Francesca; Comi, Giuseppe; Manzano, Marisa

    2014-10-01

    The detection of Campylobacter, the most commonly reported cause of foodborne gastroenteritis in the European Union, is very important for human health. The most commonly recognised risk factor for infection is the handling and/or consumption of undercooked poultry meat. The methods typically applied to evaluate the presence/absence of Campylobacter in food samples are direct plating and/or enrichment culture based on the Horizontal Method for Detection and Enumeration of Campylobacter spp. (ISO 10272-1B: 2006) and PCR. Molecular methods also allow for the detection of cells that are viable but cannot be cultivated on agar media, and they decrease the time required for species identification. The current study proposes the use of two molecular methods for species identification: dot blot and PCR. The dot blot method had a sensitivity of 25 ng for detection of DNA extracted from a pure culture using a digoxigenin-labelled probe for hybridisation; the target DNA was extracted from the enrichment broth at 24 h. PCR was performed using a pair of sensitive and specific primers for the detection of Campylobacter jejuni and Campylobacter coli after 24 h of enrichment in Preston broth. The initial samples were contaminated by 5 × 10 C. jejuni cells/g and 1.5 × 10² C. coli cells/g, thus the number of cells present in the enrichment broth at 0 h was 1 or 3 cells/g, respectively.

  13. The importance of inducible clindamycin resistance in enterotoxin positive S. aureus isolated from clinical samples

    Directory of Open Access Journals (Sweden)

    Memariani M

    2009-07-01

    Full Text Available Background: Clindamycin is a suitable antibiotic for treatment of skin and soft tissue infections. Moreover, it can suppress toxin production in many pathogenic bacteria such as S. aureus. There are two mechanisms of resistance to this antibiotic. Constitutive resistance can be detected by the standard disk diffusion method, but in the case of inducible resistance, a D-test should be carried out. The main aim of this study is to determine the prevalence of inducible clindamycin resistance among methicillin resistant and susceptible isolates of S. aureus isolated from different clinical samples. Methods: A total of 87 clinical isolates from clinical samples were collected. Methicillin resistance was determined using the standard disk diffusion method. Subsequently, the D-test was carried out according to the CLSI guideline. Presence of the sea gene (enterotoxin A) was detected by PCR using specific primers. Results: Out of 87 isolates, 18 (20.7%) were clindamycin inducible resistant, while constitutive resistance was detected among 21 (24.1%) isolates. The 95% confidence interval for the proportion of inducible clindamycin resistance among clinical isolates of S. aureus was 12.2% to 29.2%. The inducible phenotype in MRSA isolates was more common than in MSSA isolates (33.3% vs. 5.1%). Significant differences were found between the prevalence of inducible clindamycin resistance and the type of infection (p=0.045). Importantly, there was a significant correlation between the sea gene and constitutive/inducible resistance (p<0.0001). Conclusions: Due to the high prevalence of inducible clindamycin resistance among clinical isolates of S. aureus, we recommend the D-test to avoid treatment failure.

  14. Real-time nutrient monitoring in rivers: adaptive sampling strategies, technological challenges and future directions

    Science.gov (United States)

    Blaen, Phillip; Khamis, Kieran; Lloyd, Charlotte; Bradley, Chris

    2016-04-01

    Excessive nutrient concentrations in river waters threaten aquatic ecosystem functioning and can pose substantial risks to human health. Robust monitoring strategies are therefore required to generate reliable estimates of river nutrient loads and to improve understanding of the catchment processes that drive spatiotemporal patterns in nutrient fluxes. Furthermore, these data are vital for prediction of future trends under changing environmental conditions and thus the development of appropriate mitigation measures. In recent years, technological developments have led to an increase in the use of continuous in-situ nutrient analysers, which enable measurements at far higher temporal resolutions than can be achieved with discrete sampling and subsequent laboratory analysis. However, such instruments can be costly to run and difficult to maintain (e.g. due to high power consumption and memory requirements), leading to trade-offs between temporal and spatial monitoring resolutions. Here, we highlight how adaptive monitoring strategies, comprising a mixture of temporal sample frequencies controlled by one or more 'trigger variables' (e.g. river stage, turbidity, or nutrient concentration), can advance our understanding of catchment nutrient dynamics while simultaneously overcoming many of the practical and economic challenges encountered in typical in-situ river nutrient monitoring applications. We present examples of short-term variability in river nutrient dynamics, driven by complex catchment behaviour, which support our case for the development of monitoring systems that can adapt in real-time to rapid environmental changes. In addition, we discuss the advantages and disadvantages of current nutrient monitoring techniques, and suggest new research directions based on emerging technologies and highlight how these might improve: 1) monitoring strategies, and 2) understanding of linkages between catchment processes and river nutrient fluxes.
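
    A minimal sketch of a trigger-variable rule of the kind discussed: the analyser runs at a baseline interval and switches to a high-frequency storm mode when river stage crosses a rising threshold, with hysteresis so the instrument is not toggled by noise. All thresholds and intervals below are invented for illustration.

```python
def choose_interval(stage_m, in_storm_mode,
                    rise_trigger=1.2, fall_trigger=0.9,
                    base_interval_min=60, storm_interval_min=5):
    """Return (sampling interval, new mode) for a nutrient analyser driven by river stage.
    Hysteresis (rise_trigger > fall_trigger) prevents rapid toggling. Values illustrative."""
    if in_storm_mode:
        if stage_m < fall_trigger:
            in_storm_mode = False
    else:
        if stage_m > rise_trigger:
            in_storm_mode = True
    interval = storm_interval_min if in_storm_mode else base_interval_min
    return interval, in_storm_mode


# Example trace: baseflow, a storm pulse, then recession.
stages = [0.8, 0.9, 1.3, 1.6, 1.1, 0.95, 0.85]
mode = False
for stage in stages:
    interval, mode = choose_interval(stage, mode)
    print(f"stage={stage:.2f} m  mode={'storm' if mode else 'base '}  sample every {interval} min")
```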

  15. An Energy Aware Adaptive Sampling Algorithm for Energy Harvesting WSN with Energy Hungry Sensors

    Directory of Open Access Journals (Sweden)

    Bruno Srbinovski

    2016-03-01

    Full Text Available Wireless sensor nodes have a limited power budget, though they are often expected to be functional in the field once deployed for extended periods of time. Therefore, minimization of energy consumption and energy harvesting technology in Wireless Sensor Networks (WSN are key tools for maximizing network lifetime, and achieving self-sustainability. This paper proposes an energy aware Adaptive Sampling Algorithm (ASA for WSN with power hungry sensors and harvesting capabilities, an energy management technique that can be implemented on any WSN platform with enough processing power to execute the proposed algorithm. An existing state-of-the-art ASA developed for wireless sensor networks with power hungry sensors is optimized and enhanced to adapt the sampling frequency according to the available energy of the node. The proposed algorithm is evaluated using two in-field testbeds that are supplied by two different energy harvesting sources (solar and wind. Simulation and comparison between the state-of-the-art ASA and the proposed energy aware ASA (EASA in terms of energy durability are carried out using in-field measured harvested energy (using both wind and solar sources and power hungry sensors (ultrasonic wind sensor and gas sensors. The simulation results demonstrate that using ASA in combination with an energy aware function on the nodes can drastically increase the lifetime of a WSN node and enable self-sustainability. In fact, the proposed EASA in conjunction with energy harvesting capability can lead towards perpetual WSN operation and significantly outperform the state-of-the-art ASA.
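
    The following toy sketch shows one way an energy-aware rule could scale a node's sampling frequency with its battery state and the mean harvested power; it is not the published EASA algorithm, and all constants are hypothetical.

```python
# Illustrative sketch only (hypothetical constants, not the published EASA):
# scale a node's sampling frequency by its battery state of charge and by the
# rate that harvested energy alone could sustain.

F_MAX = 1.0 / 60      # samples per second at full energy (one per minute)
F_MIN = 1.0 / 3600    # guaranteed minimum rate (one per hour)

def sampling_frequency(battery_soc, harvest_w, sensor_cost_j):
    """battery_soc in [0, 1]; harvest_w = mean harvested power (W);
    sensor_cost_j = energy used by one power-hungry sensor reading (J)."""
    f_sustain = harvest_w / sensor_cost_j              # rate harvesting can sustain
    f = battery_soc * F_MAX + (1.0 - battery_soc) * min(f_sustain, F_MAX)
    return max(F_MIN, min(F_MAX, f))

print(sampling_frequency(0.9, 0.02, 5.0))    # healthy battery -> close to F_MAX
print(sampling_frequency(0.2, 0.002, 5.0))   # depleted battery -> throttled rate
```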

  16. Estimation of failure probabilities of linear dynamic systems by importance sampling

    Indian Academy of Sciences (India)

    Anna Ivanova Olsen; Arvid Naess

    2006-08-01

    An iterative method for estimating the failure probability for certain time-variant reliability problems has been developed. In the paper, the focus is on the displacement response of a linear oscillator driven by white noise. Failure is then assumed to occur when the displacement response exceeds a critical threshold. The iteration procedure is a two-step method. In the first iteration, a simple control function promoting failure is constructed using the design point weighting principle. After time discretization, two points are chosen to construct a compound deterministic control function. It is based on the time point when the first maximum of the homogeneous solution occurs and on the point at the end of the considered time interval. An importance sampling technique is used in order to estimate the failure probability functional on a set of initial values of state space variables and time. In the second iteration, the concept of an optimal control function can be implemented to construct a Markov control, which allows much better accuracy in the failure probability estimate than the simple control function. In both iterations, the concept of changing the probability measure by the Girsanov transformation is utilized. As a result, the CPU time is substantially reduced compared with the crude Monte Carlo procedure.
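
    As a rough illustration of the change-of-measure idea, the sketch below estimates a first-passage failure probability for a damped linear oscillator by drawing the discretized excitation from a mean-shifted (biased) distribution and reweighting each sample by the likelihood ratio. The resonant drift, parameters, and threshold are assumed values for demonstration; the paper's design-point and Markov control constructions are not reproduced.

```python
import numpy as np

# Sketch only: importance sampling of a first-passage failure probability for
# a damped linear oscillator driven by discretized white noise. All parameters
# below are assumptions chosen for illustration.

rng = np.random.default_rng(0)
omega0, zeta = 2.0 * np.pi, 0.05          # natural frequency (rad/s), damping ratio
dt, n_steps = 0.01, 500                   # time step and horizon (5 s)
sigma, b = 1.0, 0.05                      # per-step forcing std, failure threshold
t = np.arange(n_steps) * dt
u = 0.3 * np.sin(omega0 * t)              # deterministic drift promoting failure

def max_displacement(forcing):
    x = v = x_max = 0.0
    for f in forcing:                      # symplectic Euler integration
        v += (f - 2.0 * zeta * omega0 * v - omega0**2 * x) * dt
        x += v * dt
        x_max = max(x_max, abs(x))
    return x_max

n_sim, acc = 2000, 0.0
for _ in range(n_sim):
    f = rng.normal(u, sigma)                                # excitation drawn under the biased measure
    log_w = np.sum(u**2 - 2.0 * u * f) / (2.0 * sigma**2)   # log likelihood ratio p/q
    if max_displacement(f) > b:
        acc += np.exp(log_w)

print(f"importance sampling estimate of the failure probability: {acc / n_sim:.2e}")
```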

  17. Efficient estimation of abundance for patchily distributed populations via two-phase, adaptive sampling.

    Science.gov (United States)

    Conroy, M.J.; Runge, J.P.; Barker, R.J.; Schofield, M.R.; Fonnesbeck, C.J.

    2008-01-01

    Many organisms are patchily distributed, with some patches occupied at high density, others at lower densities, and others not occupied. Estimation of overall abundance can be difficult and is inefficient via intensive approaches such as capture-mark-recapture (CMR) or distance sampling. We propose a two-phase sampling scheme and model in a Bayesian framework to estimate abundance for patchily distributed populations. In the first phase, occupancy is estimated by binomial detection samples taken on all selected sites, where selection may include all available sites or a random sample of sites. Detection can be by visual surveys, detection of sign, physical captures, or other approaches. In the second phase, if a detection threshold is achieved, CMR or other intensive sampling is conducted via standard procedures (grids or webs) to estimate abundance. Detection and CMR data are then used in a joint likelihood to model probability of detection in the occupancy sample via an abundance-detection model. CMR modeling is used to estimate abundance for the abundance-detection relationship, which in turn is used to predict abundance at the remaining sites, where only detection data are collected. We present a full Bayesian modeling treatment of this problem, in which posterior inference on abundance and other parameters (detection, capture probability) is obtained under a variety of assumptions about spatial and individual sources of heterogeneity. We apply the approach to abundance estimation for two species of voles (Microtus spp.) in Montana, USA. We also use a simulation study to evaluate the frequentist properties of our procedure given known patterns in abundance and detection among sites as well as design criteria. For most population characteristics and designs considered, bias and mean-square error (MSE) were low, and coverage of true parameter values by Bayesian credibility intervals was near nominal. Our two-phase, adaptive approach allows efficient estimation of
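
    A much-simplified, non-Bayesian sketch of the two-phase logic on synthetic data: detections are recorded at every site, intensive abundance estimation is carried out only where detections reach a threshold, and a fitted detection-abundance relationship predicts abundance at the remaining sites. The threshold, distributions, and linear fit are illustrative stand-ins for the paper's joint likelihood and CMR estimates.

```python
import numpy as np

# Toy, non-Bayesian sketch of the two-phase idea; all numbers are invented.

rng = np.random.default_rng(1)
n_sites, threshold = 200, 2

occupied = rng.random(n_sites) < 0.4                       # patchy occupancy
true_n = np.where(occupied, rng.poisson(8, n_sites), 0)    # site abundances
det = rng.binomial(true_n, 0.3)                            # phase-1 detections

phase2 = det >= threshold                                  # sites surveyed intensively
n_hat_phase2 = true_n[phase2].astype(float)                # stand-in for a CMR estimate

# Simple linear detection-abundance relationship fitted on phase-2 sites.
slope, intercept = np.polyfit(det[phase2], n_hat_phase2, 1)
n_hat_rest = np.clip(intercept + slope * det[~phase2], 0.0, None)

total = n_hat_phase2.sum() + n_hat_rest.sum()
print(f"estimated total abundance: {total:.0f} (true total: {true_n.sum()})")
```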

  18. 40 CFR 80.1348 - What gasoline sample retention requirements apply to refiners and importers?

    Science.gov (United States)

    2010-07-01

    40 CFR 80.1348 (2010-07-01 edition), Protection of Environment, Environmental Protection Agency (continued), Air Programs (continued), Regulation of Fuels and Fuel Additives, Gasoline Benzene Sampling, Testing and Retention Requirements: What gasoline sample retention requirements apply to refiners and importers?

  19. Motion-adapted pulse sequences for oriented sample (OS) solid-state NMR of biopolymers

    Science.gov (United States)

    Lu, George J.; Opella, Stanley J.

    2013-01-01

    One of the main applications of solid-state NMR is to study the structure and dynamics of biopolymers, such as membrane proteins, under physiological conditions where the polypeptides undergo global motions as they do in biological membranes. The effects of NMR radiofrequency irradiations on nuclear spins are strongly influenced by these motions. For example, we previously showed that the MSHOT-Pi4 pulse sequence yields spectra with resonance line widths about half of those observed using the conventional pulse sequence when applied to membrane proteins undergoing rapid uniaxial rotational diffusion in phospholipid bilayers. In contrast, the line widths were not changed in microcrystalline samples where the molecules did not undergo global motions. Here, we demonstrate experimentally and describe analytically how some Hamiltonian terms are susceptible to sample motions, and it is their removal through the critical π/2 Z-rotational symmetry that confers the “motion adapted” property to the MSHOT-Pi4 pulse sequence. This leads to the design of separated local field pulse sequence “Motion-adapted SAMPI4” and is generalized to an approach for the design of decoupling sequences whose performance is superior in the presence of molecular motions. It works by cancelling the spin interaction by explicitly averaging the reduced Wigner matrix to zero, rather than utilizing the 2π nutation to average spin interactions. This approach is applicable to both stationary and magic angle spinning solid-state NMR experiments. PMID:24006989

  20. Comparison of estimates of hardwood bole volume using importance sampling, the centroid method, and some taper equations

    Science.gov (United States)

    Harry V., Jr. Wiant; Michael L. Spangler; John E. Baumgras

    2002-01-01

    Various taper systems and the centroid method were compared to unbiased volume estimates made by importance sampling for 720 hardwood trees selected throughout the state of West Virginia. Only the centroid method consistently gave volume estimates that did not differ significantly from those made by importance sampling, although some taper equations did well for most...

  1. Flexible binding simulation by a novel and improved version of virtual-system coupled adaptive umbrella sampling

    Science.gov (United States)

    Dasgupta, Bhaskar; Nakamura, Haruki; Higo, Junichi

    2016-10-01

    Virtual-system coupled adaptive umbrella sampling (VAUS) enhances sampling along a reaction coordinate by using a virtual degree of freedom. However, VAUS and regular adaptive umbrella sampling (AUS) methods are still computationally expensive. To decrease the computational burden further, improvements of VAUS for all-atom explicit solvent simulation are presented here. The improvements include probability distribution calculation by a Markov approximation, parameterization of biasing forces by iterative polynomial fitting, and force scaling. These improvements, when applied to study Ala-pentapeptide dimerization in explicit solvent, showed an advantage over regular AUS. With the improved VAUS, larger biological systems become amenable to simulation.

  2. Insights on antioxidant assays for biological samples based on the reduction of copper complexes-the importance of analytical conditions.

    Science.gov (United States)

    Marques, Sara S; Magalhães, Luís M; Tóth, Ildikó V; Segundo, Marcela A

    2014-06-25

    Total antioxidant capacity assays are recognized as instrumental to establish antioxidant status of biological samples, however the varying experimental conditions result in conclusions that may not be transposable to other settings. After selection of the complexing agent, reagent addition order, buffer type and concentration, copper reducing assays were adapted to a high-throughput scheme and validated using model biological antioxidant compounds of ascorbic acid, Trolox (a soluble analogue of vitamin E), uric acid and glutathione. A critical comparison was made based on real samples including NIST-909c human serum certified sample, and five study samples. The validated method provided linear range up to 100 µM Trolox, (limit of detection 2.3 µM; limit of quantification 7.7 µM) with recovery results above 85% and precision <5%. The validated developed method with an increased sensitivity is a sound choice for assessment of TAC in serum samples.

  3. Insights on Antioxidant Assays for Biological Samples Based on the Reduction of Copper Complexes—The Importance of Analytical Conditions

    Directory of Open Access Journals (Sweden)

    Sara S. Marques

    2014-06-01

    Full Text Available Total antioxidant capacity assays are recognized as instrumental to establish antioxidant status of biological samples, however the varying experimental conditions result in conclusions that may not be transposable to other settings. After selection of the complexing agent, reagent addition order, buffer type and concentration, copper reducing assays were adapted to a high-throughput scheme and validated using model biological antioxidant compounds of ascorbic acid, Trolox (a soluble analogue of vitamin E, uric acid and glutathione. A critical comparison was made based on real samples including NIST-909c human serum certified sample, and five study samples. The validated method provided linear range up to 100 µM Trolox, (limit of detection 2.3 µM; limit of quantification 7.7 µM with recovery results above 85% and precision <5%. The validated developed method with an increased sensitivity is a sound choice for assessment of TAC in serum samples.

  4. Adaptive Sampling-Based Information Collection for Wireless Body Area Networks

    OpenAIRE

    Xiaobin Xu; Fang Zhao; Wendong Wang; Hui Tian

    2016-01-01

    To collect important health information, WBAN applications typically sense data at a high frequency. However, limited by the quality of wireless link, the uploading of sensed data has an upper frequency. To reduce upload frequency, most of the existing WBAN data collection approaches collect data with a tolerable error. These approaches can guarantee precision of the collected data, but they are not able to ensure that the upload frequency is within the upper frequency. Some traditional sampl...

  5. Quantitative assessment of the importance of phenotypic plasticity in adaptation to climate change in wild bird populations.

    Science.gov (United States)

    Vedder, Oscar; Bouwhuis, Sandra; Sheldon, Ben C

    2013-07-01

    Predictions about the fate of species or populations under climate change scenarios typically neglect adaptive evolution and phenotypic plasticity, the two major mechanisms by which organisms can adapt to changing local conditions. As a consequence, we have little understanding of the scope for organisms to track changing environments by in situ adaptation. Here, we use a detailed individual-specific long-term population study of great tits (Parus major) breeding in Wytham Woods, Oxford, UK to parameterise a mechanistic model and thus directly estimate the rate of environmental change to which in situ adaptation is possible. Using the effect of changes in early spring temperature on temporal synchrony between birds and a critical food resource, we focus in particular on the contribution of phenotypic plasticity to population persistence. Despite using conservative estimates for evolutionary and reproductive potential, our results suggest little risk of population extinction under projected local temperature change; however, this conclusion relies heavily on the extent to which phenotypic plasticity tracks the changing environment. Extrapolating the model to a broad range of life histories in birds suggests that the importance of phenotypic plasticity for adjustment to projected rates of temperature change increases with slower life histories, owing to lower evolutionary potential. Understanding the determinants and constraints on phenotypic plasticity in natural populations is thus crucial for characterising the risks that rapidly changing environments pose for the persistence of such populations.

  6. Quantitative assessment of the importance of phenotypic plasticity in adaptation to climate change in wild bird populations.

    Directory of Open Access Journals (Sweden)

    Oscar Vedder

    2013-07-01

    Full Text Available Predictions about the fate of species or populations under climate change scenarios typically neglect adaptive evolution and phenotypic plasticity, the two major mechanisms by which organisms can adapt to changing local conditions. As a consequence, we have little understanding of the scope for organisms to track changing environments by in situ adaptation. Here, we use a detailed individual-specific long-term population study of great tits (Parus major) breeding in Wytham Woods, Oxford, UK to parameterise a mechanistic model and thus directly estimate the rate of environmental change to which in situ adaptation is possible. Using the effect of changes in early spring temperature on temporal synchrony between birds and a critical food resource, we focus in particular on the contribution of phenotypic plasticity to population persistence. Despite using conservative estimates for evolutionary and reproductive potential, our results suggest little risk of population extinction under projected local temperature change; however, this conclusion relies heavily on the extent to which phenotypic plasticity tracks the changing environment. Extrapolating the model to a broad range of life histories in birds suggests that the importance of phenotypic plasticity for adjustment to projected rates of temperature change increases with slower life histories, owing to lower evolutionary potential. Understanding the determinants and constraints on phenotypic plasticity in natural populations is thus crucial for characterising the risks that rapidly changing environments pose for the persistence of such populations.

  7. Important aspects of residue sampling in drilling dikes; Aspectos importantes para a amostragem de residuos em diques de perfuracao

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Gilvan Ferreira da [PETROBRAS, Rio de Janeiro (Brazil). Centro de Pesquisas. Div. de Explotacao

    1989-12-31

    This paper describes the importance of sampling in the evaluation of the physical and chemical properties of residues found in drilling dikes, considering the later selection of treatment methods or disposal of these residues. We present the fundamental concepts of applied statistics, which are essential to the elaboration of sampling plans, with a view to obtaining accurate and precise results. Other types of samples are also presented, as well as sampling equipment and methods for storage and preservation of the samples. As a conclusion, we present an example of the implementation of a sampling plan. (author) 3 refs., 9 figs., 3 tabs.

  8. Cortisol secretion and functional disabilities in old age: importance of using adaptive control strategies.

    Science.gov (United States)

    Wrosch, Carsten; Miller, Gregory E; Schulz, Richard

    2009-11-01

    To examine whether the use of health-related control strategies moderates the association between elevated diurnal cortisol secretion and increases in older adults' functional disabilities. Functional disabilities of 164 older adults were assessed over 4 years by measuring participants' problems with performing activities of daily living. The main predictors included baseline levels of diurnal cortisol secretion and control strategies used to manage physical health threats. A large increase in functional disabilities was observed among participants who secreted elevated baseline levels of cortisol and did not use health-related control strategies. By contrast, high cortisol level was not associated with increases in functional disabilities among participants who reported using these control strategies. Among participants with low cortisol level, there was a relatively smaller increase in functional disabilities over time, and the use of control strategies was not significantly associated with changes in functional disabilities. The findings suggest that high cortisol level is associated with an increase in older adults' functional disabilities, but only if older adults do not engage in adaptive control strategies.

  9. Blood Volume: Importance and Adaptations to Exercise Training, Environmental Stresses and Trauma/Sickness

    Science.gov (United States)

    Sawka, Michael N.; Convertino, Victor A.; Eichner, E. Randy; Schnieder, Suzanne M.; Young, Andrew J.

    2000-01-01

    This paper reviews the influence of several perturbations (physical exercise, heat stress, terrestrial altitude, microgravity, and trauma/sickness) on adaptations of blood volume (BV), erythrocyte volume (EV), and plasma volume (PV). Exercise training can induce BV expansion; PV expansion usually occurs immediately, but EV expansion takes weeks. EV and PV expansion contribute to aerobic power improvements associated with exercise training. Repeated heat exposure induces PV expansion but does not alter EV. PV expansion does not improve thermoregulation, but EV expansion improves thermoregulation during exercise in the heat. Dehydration decreases PV (and increases plasma tonicity), which elevates heat strain and reduces exercise performance. High altitude exposure causes rapid (hours) plasma loss. During initial weeks at altitude, EV is unaffected, but a gradual expansion occurs with extended acclimatization. BV adjustments contribute, but are not key, to altitude acclimatization. Microgravity decreases PV and EV, which contributes to orthostatic intolerance and decreased exercise capacity in astronauts. PV decreases may result from lower set points for total body water and central venous pressure, while EV decreases may result from increased erythrocyte destruction. Trauma, renal disease, and chronic diseases cause anemia from hemorrhage and immune activation, which suppresses erythropoiesis. The re-establishment of EV is associated with healing, improved quality of life, and exercise capabilities for these injured/sick persons.

  10. The Importance of Pressure Sampling Frequency in Models for Determination of Critical Wave Loadingson Monolithic Structures

    DEFF Research Database (Denmark)

    Burcharth, Hans F.; Andersen, Thomas Lykke; Meinert, Palle

    2008-01-01

    This paper discusses the influence of wave load sampling frequency on calculated sliding distance in an overall stability analysis of a monolithic caisson. It is demonstrated by a specific example of caisson design that for this kind of analysis the sampling frequency in a small scale model could be as low as 100 Hz in model scale. However, for design of structure elements like the wave wall on the top of a caisson, the wave load sampling frequency must be much higher, in the order of 1000 Hz in the model. Elastic-plastic deformations of foundation and structure were not included in the analysis.

  11. Cultural adaptation in measuring common client characteristics with an urban Mainland Chinese sample.

    Science.gov (United States)

    Song, Xiaoxia; Anderson, Timothy; Beutler, Larry E; Sun, Shijin; Wu, Guohong; Kimpara, Satoko

    2015-01-01

    This study aimed to develop a culturally adapted version of the Systematic Treatment Selection-Innerlife (STS) in China. A total of 300 nonclinical participants were recruited from Mainland China, and 240 nonclinical US participants were drawn from archival data. A Chinese version of the STS was developed using translation and back-translation procedures. After confirmatory factor analysis (CFA) of the original STS subscales failed in both samples, exploratory factor analysis (EFA) was used to assess whether a simple structure would emerge from the STS treatment items. Parallel analysis and the minimum average partial method were used to determine the number of factors to retain. Three cross-cultural factors were found in this study: Internalized Distress, Externalized Distress, and Interpersonal Relations. This supports the view that, regardless of whether one is in the presumably different cultural contexts of the USA or China, psychological distress is expressed through a few basic channels of internalized distress, externalized distress, and interpersonal relations; different manifestations of these channels in different cultures are also discussed.

  12. What do we call Adaptive Management? A general characterization from a global sample

    Directory of Open Access Journals (Sweden)

    T. Espigares

    2008-03-01

    Full Text Available This study presents a characterisation of the implementation of Adaptive Management (AM) based on the analysis of 35 projects around the world. Our results reveal that AM projects are usually aimed at ecosystem management, conservation and restoration. They mainly act upon forest or epicontinental water ecosystems, their goal is generally species exploitation, and in most cases these projects act at a local scale. From a methodological point of view, most AM cases use an active approach and monitoring programs, and most were at the phase of problem identification. We found the following differences in the implementation of AM between the developed and developing countries present in our sample: AM projects in developed countries were typically carried out by state agencies and focused on solving problems concerning epicontinental waters and the public use of ecosystems; they had the support of national funds and used modelling techniques. In contrast, AM projects in developing countries were mainly aimed at the conservation of natural protected areas and at the mitigation of environmental impacts derived from mining activities; their financial support was frequently provided by international organizations, and the use of modelling techniques was uncommon. For a better exploitation of all the possibilities of AM, we suggest that criteria be customized to the specific needs of the socio-economic reality of every country and that results be monitored at a global scale to continuously improve this practice.

  13. French Adaptation of the Narcissistic Personality Inventory in a Belgian French-Speaking Sample.

    Science.gov (United States)

    Braun, Stéphanie; Kempenaers, Chantal; Linkowski, Paul; Loas, Gwenolé

    2016-01-01

    The Narcissistic Personality Inventory (NPI) is the most widely used self-report scale to assess the construct of narcissism, especially in its grandiose expression. Over the years, several factor models have been proposed in order to improve the understanding of the multidimensional aspect of this construct. The available data are heterogeneous, suggesting from one to at least seven factors. In this study, we propose a French adaptation of the NPI administered to a sample of Belgian French-speaking students (n = 942). We performed a principal component analysis on a tetrachoric correlation matrix to explore its factor structure. Unlike previous studies, our study shows that a first factor explains the largest part of the variance. Internal consistency is excellent, and we reproduced the sex differences reported for the original scale. Correlations with social desirability are taken into account in the interpretation of our results. Altogether, the results of this study support a unidimensional structure for the NPI, using the total score as a self-report measure of Narcissistic Personality Disorder in its grandiose form. Future studies including confirmatory factor analysis and gender invariance measurement are also discussed.

  14. Blood Volume: Importance and Adaptations to Exercise Training, Environmental Stresses and Trauma Sickness

    Science.gov (United States)

    2000-02-01

    The elevation of Hb concentration (from plasma loss) is the most important factor contributing to the performance improvement by facilitating O2... [remainder of this record consists of fragmentary reference entries by Convertino, Brock, Keil, Bernauer, and Greenleaf in J. Appl. Physiol.]

  15. Monte Carlo sampling and multivariate adaptive regression splines as tools for QSAR modelling of HIV-1 reverse transcriptase inhibitors.

    Science.gov (United States)

    Alamdari, R F; Mani-Varnosfaderani, A; Asadollahi-Baboli, M; Khalafi-Nezhad, A

    2012-10-01

    The present work focuses on the development of an interpretable quantitative structure-activity relationship (QSAR) model for predicting the anti-HIV activities of 67 thiazolylthiourea derivatives. This set of molecules has been proposed as potent HIV-1 reverse transcriptase inhibitors (RT-INs). The molecules were encoded to a diverse set of molecular descriptors, spanning different physical and chemical properties. Monte Carlo (MC) sampling and multivariate adaptive regression spline (MARS) techniques were used to select the most important descriptors and to predict the activity of the molecules. The most important descriptor was found to be the asphericity index. The analysis of variance (ANOVA) and interpretable spline equations showed that the geometrical shape of the molecules has a considerable effect on their activities. It seems that the linear molecules are more active than symmetric top compounds. The final MARS model derived displayed a good predictive ability judging from the determination coefficients corresponding to the leave-multiple-out (LMO) cross-validation technique, i.e. r² = 0.828 (M = 12) and r² = 0.813 (M = 20). The results of this work showed that the developed spline model is robust, has a good predictive power, and can then be used as a reliable tool for designing novel HIV-1 RT-INs.
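
    To make the selection loop concrete, the sketch below performs Monte Carlo sampling of descriptor subsets on synthetic data and keeps the subset with the best cross-validated score. A plain linear model is used as a stand-in for the MARS spline model of the paper, and all data, subset sizes, and iteration counts are invented.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

# Monte Carlo descriptor-subset selection on synthetic data; a linear model
# stands in for MARS, and every number here is illustrative.

rng = np.random.default_rng(0)
n_mol, n_desc = 67, 40
X = rng.normal(size=(n_mol, n_desc))                        # molecular descriptors
y = X[:, 0] - 0.5 * X[:, 3] + 0.3 * rng.normal(size=n_mol)  # toy activities

best_subset, best_score = None, -np.inf
for _ in range(500):                                        # Monte Carlo sampling of subsets
    subset = rng.choice(n_desc, size=5, replace=False)
    score = cross_val_score(LinearRegression(), X[:, subset], y, cv=5).mean()
    if score > best_score:
        best_subset, best_score = subset, score

print("selected descriptors:", sorted(best_subset.tolist()),
      "cross-validated r2:", round(best_score, 3))
```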

  16. Interexaminer agreement in dental caries epidemiological surveys: the importance of disease prevalence in the sample

    Directory of Open Access Journals (Sweden)

    Aline Sampieri Tonello

    Full Text Available Abstract: Objective: To identify desirable characteristics, including different sample sizes and dental caries prevalences, in virtual samples that allow, at the same time, higher values of the general percentage agreement (GPA) and the Kappa coefficient (κ) under a low confidence interval (CI) in reproducibility studies. Method: A total of 384 statistical simulations of inter-examiner calibration were undertaken, varying sample size (12, 15, 20, 60, 200 and 500 individuals), caries prevalence (30, 50, 60 and 90%) and percentages of positive (PA) and negative (NA) agreement (30, 50, 60 and 90%). GPA and κ were used to measure reproducibility and to define the deviation between them. Results: The sample of 60 individuals, under a caries prevalence of 50% and PA and NA of 90%, presented GPA and Kappa values of 90% and 80%, respectively, a relatively small confidence interval (95%CI 0.65-0.95) and a GPA/Kappa deviation of 10.00. Conclusion: A virtual sample of 60 individuals, under a caries prevalence of 50%, seems feasible to produce satisfactory interexaminer agreement under epidemiological conditions. However, epidemiological studies to corroborate or refute this assertion are necessary.
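
    A small simulation in the same spirit, computing the general percentage agreement and Cohen's kappa for one virtual calibration exercise; the sample size, prevalence, and agreement probabilities mirror the scenario highlighted above, but the generating mechanism is an assumption made for illustration.

```python
import numpy as np

# One virtual inter-examiner calibration: 60 individuals, 50% prevalence,
# 90% positive/negative agreement; report GPA and Cohen's kappa.

rng = np.random.default_rng(0)
n, prevalence = 60, 0.5
pa = na = 0.9                                   # positive / negative agreement probabilities

truth = rng.random(n) < prevalence
ex1 = truth                                     # examiner 1 taken as the reference
agree_prob = np.where(ex1, pa, na)
ex2 = np.where(rng.random(n) < agree_prob, ex1, ~ex1)

gpa = np.mean(ex1 == ex2)                       # general percentage agreement
p1, p2 = ex1.mean(), ex2.mean()
pe = p1 * p2 + (1 - p1) * (1 - p2)              # expected chance agreement
kappa = (gpa - pe) / (1 - pe)
print(f"GPA = {gpa:.2f}, kappa = {kappa:.2f}")
```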

  17. A Feedforward Adaptive Controller to Reduce the Imaging Time of Large-Sized Biological Samples with a SPM-Based Multiprobe Station

    Directory of Open Access Journals (Sweden)

    Manel Puig-Vidal

    2012-01-01

    Full Text Available The time required to image large samples is an important limiting factor in SPM-based systems. In multiprobe setups, especially when working with biological samples, this drawback can make certain experiments impossible to conduct. In this work, we present a feedforward controller based on bang-bang and adaptive controls. The controls are based on the difference between the maximum speeds that can be used for imaging, depending on the flatness of the sample zone. Topographic images of Escherichia coli bacteria samples were acquired using the implemented controllers. Results show that going faster in the flat zones, rather than using a constant scanning speed for the whole image, speeds up the imaging process of large samples by up to a 4x factor.
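
    A minimal sketch of the underlying control idea: choose the scan speed for the next line from the height variation observed on the previous line, moving fast over flat zones and slow over features. The speeds, roughness limit, and synthetic profiles are hypothetical.

```python
import numpy as np

# Hypothetical parameters; fast over flat zones, slow over features, in the
# spirit of the bang-bang/adaptive control described above.

V_FAST, V_SLOW = 40.0, 10.0       # scan speeds in um/s (illustrative)
ROUGHNESS_LIMIT = 5.0             # nm; above this the zone is treated as not flat

def scan_speed(prev_line_heights_nm):
    """Return the speed for the next scan line from the previous line's heights."""
    roughness = np.ptp(prev_line_heights_nm)      # peak-to-peak height range
    return V_FAST if roughness < ROUGHNESS_LIMIT else V_SLOW

rng = np.random.default_rng(0)
flat_zone = 2.0 + rng.normal(0.0, 0.5, 256)                                   # bare substrate
cell_zone = flat_zone + 80.0 * np.exp(-((np.arange(256) - 128) / 20.0) ** 2)  # bacterium-sized bump
print(scan_speed(flat_zone))   # -> 40.0 (fast)
print(scan_speed(cell_zone))   # -> 10.0 (slow)
```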

  18. A whole-path importance-sampling scheme for Feynman path integral calculations of absolute partition functions and free energies.

    Science.gov (United States)

    Mielke, Steven L; Truhlar, Donald G

    2016-01-21

    Using Feynman path integrals, a molecular partition function can be written as a double integral with the inner integral involving all closed paths centered at a given molecular configuration, and the outer integral involving all possible molecular configurations. In previous work employing Monte Carlo methods to evaluate such partition functions, we presented schemes for importance sampling and stratification in the molecular configurations that constitute the path centroids, but we relied on free-particle paths for sampling the path integrals. At low temperatures, the path sampling is expensive because the paths can travel far from the centroid configuration. We now present a scheme for importance sampling of whole Feynman paths based on harmonic information from an instantaneous normal mode calculation at the centroid configuration, which we refer to as harmonically guided whole-path importance sampling (WPIS). We obtain paths conforming to our chosen importance function by rejection sampling from a distribution of free-particle paths. Sample calculations on CH4 demonstrate that at a temperature of 200 K, about 99.9% of the free-particle paths can be rejected without integration, and at 300 K, about 98% can be rejected. We also show that it is typically possible to reduce the overhead associated with the WPIS scheme by sampling the paths using a significantly lower-order path discretization than that which is needed to converge the partition function.

  19. The two-component system CBO2306/CBO2307 is important for cold adaptation of Clostridium botulinum ATCC 3502.

    Science.gov (United States)

    Derman, Yağmur; Isokallio, Marita; Lindström, Miia; Korkeala, Hannu

    2013-10-01

    Clostridium botulinum is a notorious foodborne pathogen. Its ability to adapt to and grow at low temperatures is of interest for food safety. Two-component systems (TCSs) have been reported to be involved in the cold-shock response and in growth at low temperatures. Here we show the importance of the TCS CBO2306/CBO2307 in the cold-shock response of C. botulinum ATCC 3502. The relative expression levels of cbo2306 and cbo2307 were induced up to 4.4-fold in cold-shocked cultures but negatively regulated in the late-log and stationary growth phases relative to the early logarithmic growth phase in non-shocked cultures. The importance of CBO2306/CBO2307 under cold stress was further demonstrated by the impaired growth of insertional cbo2306 or cbo2307 knockout mutants relative to the wild-type strain ATCC 3502. The results suggest that the TCS CBO2306/CBO2307 is important for the cold-shock response and adaptation of C. botulinum ATCC 3502 to low temperature.

  20. The soft palate is an important site of adaptation for transmissible influenza viruses.

    Science.gov (United States)

    Lakdawala, Seema S; Jayaraman, Akila; Halpin, Rebecca A; Lamirande, Elaine W; Shih, Angela R; Stockwell, Timothy B; Lin, Xudong; Simenauer, Ari; Hanson, Christopher T; Vogel, Leatrice; Paskel, Myeisha; Minai, Mahnaz; Moore, Ian; Orandle, Marlene; Das, Suman R; Wentworth, David E; Sasisekharan, Ram; Subbarao, Kanta

    2015-10-01

    Influenza A viruses pose a major public health threat by causing seasonal epidemics and sporadic pandemics. Their epidemiological success relies on airborne transmission from person to person; however, the viral properties governing airborne transmission of influenza A viruses are complex. Influenza A virus infection is mediated via binding of the viral haemagglutinin (HA) to terminally attached α2,3 or α2,6 sialic acids on cell surface glycoproteins. Human influenza A viruses preferentially bind α2,6-linked sialic acids whereas avian influenza A viruses bind α2,3-linked sialic acids on complex glycans on airway epithelial cells. Historically, influenza A viruses with preferential association with α2,3-linked sialic acids have not been transmitted efficiently by the airborne route in ferrets. Here we observe efficient airborne transmission of a 2009 pandemic H1N1 (H1N1pdm) virus (A/California/07/2009) engineered to preferentially bind α2,3-linked sialic acids. Airborne transmission was associated with rapid selection of virus with a change at a single HA site that conferred binding to long-chain α2,6-linked sialic acids, without loss of α2,3-linked sialic acid binding. The transmissible virus emerged in experimentally infected ferrets within 24 hours after infection and was remarkably enriched in the soft palate, where long-chain α2,6-linked sialic acids predominate on the nasopharyngeal surface. Notably, presence of long-chain α2,6-linked sialic acids is conserved in ferret, pig and human soft palate. Using a loss-of-function approach with this one virus, we demonstrate that the ferret soft palate, a tissue not normally sampled in animal models of influenza, rapidly selects for transmissible influenza A viruses with human receptor (α2,6-linked sialic acids) preference.

  1. Indigenizing or Adapting? Importing Buddhism into a Settler-colonial Society

    Directory of Open Access Journals (Sweden)

    Sally McAra

    2015-02-01

    Full Text Available In this paper I problematize the phrase "indigenization of Buddhism" (Spuler 2003, cf. Baumann 1997) through an investigation of a Buddhist project in a settler-colonial society. An international organization called the Foundation for the Preservation of the Mahayana Tradition (FPMT) is constructing a forty-five-meter high stupa in rural Australia with the intention "to provide a refuge of peace and serenity for all." In 2003, a woman of Aboriginal descent met with the stupa developers to express her concern about the project. While her complaint does not represent local Aboriginal views about the stupa (other Aboriginal groups expressed support for it), it illustrates how in settler-colonial societies, Buddhist cultural imports that mark the land can have unexpected implications for indigenous people. This paper offers a glimpse of the multi-layered power relations that form the often invisible backdrop to the establishment of Buddhism in settler-colonial societies and suggests that we need to find terms other than "indigenization" when analyzing this.

  2. Sampling procedure in a willow plantation for chemical elements important for biomass combustion quality

    DEFF Research Database (Denmark)

    Liu, Na; Nielsen, Henrik Kofoed; Jørgensen, Uffe

    2015-01-01

    Willow (Salix spp.) is expected to contribute significantly to the woody bioenergy system in the future, so more information on how to sample the quality of the willow biomass is needed. The objectives of this study were to investigate the spatial variation of elements within shoots of a willow clone ‘Tordis’, and to reveal the relationship between sampling position, shoot diameters, and distribution of elements. Five Tordis willow shoots were cut into 10–50 cm sections from base to top. The ash content and concentration of twelve elements (Al, Ca, Cd, Cu, Fe, K, Mg, Mn, Na, P, Si, and Zn) in each section were determined. The results showed large spatial variation in the distribution of most elements along the length of the willow shoots. Concentrations of elements in 2-year old shoots of the willow clone Tordis were fairly stable within the range of 100–285 cm above ground and resembled

  3. Importance of sample pH on recovery of mutagenicity from drinking water by XAD resins

    Energy Technology Data Exchange (ETDEWEB)

    Ringhand, H.P.; Meier, J.R.; Kopfler, F.C.; Schenck, K.M.; Kaylor, W.H.; Mitchell, D.E.

    1987-04-01

    Sample pH and the presence of a chlorine residual were evaluated for their effects on the recovery of mutagenicity in drinking water following concentration by XAD resins. The levels of mutagenicity in the pH 2 concentrates were 7-8-fold higher than those of the pH 8 concentrates, suggesting that acidic compounds accounted for the majority of the mutagenicity. The presence of a chlorine residual had little effect on the levels of mutagenicity at either pH. Comparisons of the mutagenic activity for the pH 2 resin concentrates vs. pH 8 concentrates prepared by lyophilization further indicated that the acidic mutagens were products of disinfection with chlorine and not artifacts of the sample acidification step in the concentration procedure. 27 references, 6 figures, 1 table.

  4. Importance of sampling design and analysis in animal population studies: a comment on Sergio et al

    Science.gov (United States)

    Kery, M.; Royle, J. Andrew; Schmid, Hans

    2008-01-01

    1. The use of predators as indicators and umbrellas in conservation has been criticized. In the Trentino region, Sergio et al. (2006; hereafter SEA) counted almost twice as many bird species in quadrats located in raptor territories than in controls. However, SEA detected astonishingly few species. We used contemporary Swiss Breeding Bird Survey data from an adjacent region and a novel statistical model that corrects for overlooked species to estimate the expected number of bird species per quadrat in that region. 2. There are two anomalies in SEA which render their results ambiguous. First, SEA detected on average only 6.8 species, whereas a value of 32 might be expected. Hence, they probably overlooked almost 80% of all species. Secondly, the precision of their mean species counts was greater in two-thirds of cases than in the unlikely case that all quadrats harboured exactly the same number of equally detectable species. This suggests that they detected consistently only a biased, unrepresentative subset of species. 3. Conceptually, expected species counts are the product of true species number and species detectability p. Plenty of factors may affect p, including date, hour, observer, previous knowledge of a site and mobbing behaviour of passerines in the presence of predators. Such differences in p between raptor and control quadrats could have easily created the observed effects. Without a method that corrects for such biases, or without quantitative evidence that species detectability was indeed similar between raptor and control quadrats, the meaning of SEA's counts is hard to evaluate. Therefore, the evidence presented by SEA in favour of raptors as indicator species for enhanced levels of biodiversity remains inconclusive. 4. Synthesis and application. Ecologists should pay greater attention to sampling design and analysis in animal population estimation. Species richness estimation means sampling a community. Samples should be representative for the
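
    A back-of-envelope check of point 3 using the numbers quoted in point 2: if expected counts are the product of true species richness and detectability p, the reported mean count implies a very low per-survey detectability.

```python
# Worked example with the figures given in the comment above.
true_richness = 32      # expected number of species per quadrat
mean_count = 6.8        # mean species count reported by SEA
p = mean_count / true_richness
print(f"implied detectability p = {p:.2f}; overlooked fraction = {1 - p:.0%}")
# -> p = 0.21, i.e. roughly 79% of species overlooked ("almost 80%").
```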

  5. Adaptive geostatistical sampling enables efficient identification of malaria hotspots in repeated cross-sectional surveys in rural Malawi

    Science.gov (United States)

    Chipeta, Michael G.; McCann, Robert S.; Phiri, Kamija S.; van Vugt, Michèle; Takken, Willem; Diggle, Peter; Terlouw, Anja D.

    2017-01-01

    Introduction In the context of malaria elimination, interventions will need to target high burden areas to further reduce transmission. Current tools to monitor and report disease burden lack the capacity to continuously detect fine-scale spatial and temporal variations of disease distribution exhibited by malaria. These tools use random sampling techniques that are inefficient for capturing underlying heterogeneity while health facility data in resource-limited settings are inaccurate. Continuous community surveys of malaria burden provide real-time results of local spatio-temporal variation. Adaptive geostatistical design (AGD) improves prediction of outcome of interest compared to current random sampling techniques. We present findings of continuous malaria prevalence surveys using an adaptive sampling design. Methods We conducted repeated cross sectional surveys guided by an adaptive sampling design to monitor the prevalence of malaria parasitaemia and anaemia in children below five years old in the communities living around Majete Wildlife Reserve in Chikwawa district, Southern Malawi. AGD sampling uses previously collected data to sample new locations of high prediction variance or, where prediction exceeds a set threshold. We fitted a geostatistical model to predict malaria prevalence in the area. Findings We conducted five rounds of sampling, and tested 876 children aged 6–59 months from 1377 households over a 12-month period. Malaria prevalence prediction maps showed spatial heterogeneity and presence of hotspots—where predicted malaria prevalence was above 30%; predictors of malaria included age, socio-economic status and ownership of insecticide-treated mosquito nets. Conclusions Continuous malaria prevalence surveys using adaptive sampling increased malaria prevalence prediction accuracy. Results from the surveys were readily available after data collection. The tool can assist local managers to target malaria control interventions in areas with the
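
    A toy sketch of the adaptive design principle (not the authors' geostatistical model): fit a spatial model to the locations sampled so far and place the next batch of samples where the prediction uncertainty is largest. The Gaussian-process model, kernel, and synthetic prevalence surface are assumptions for illustration.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Adaptive-design toy: sample next where prediction uncertainty is highest.

rng = np.random.default_rng(0)
grid = np.stack(np.meshgrid(np.linspace(0, 1, 25),
                            np.linspace(0, 1, 25)), axis=-1).reshape(-1, 2)

initial = rng.choice(len(grid), 30, replace=False)            # initial random sample
X = grid[initial]
prev = np.exp(-5.0 * ((X[:, 0] - 0.3) ** 2 + (X[:, 1] - 0.7) ** 2))  # fake prevalence

gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2), alpha=0.01).fit(X, prev)
_, sd = gp.predict(grid, return_std=True)

next_batch = np.argsort(sd)[-10:]                             # 10 most uncertain locations
print("next survey locations:\n", np.round(grid[next_batch], 2))
```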

  6. Eco-Physiologic studies an important tool for the adaptation of forestry to global changes.

    Directory of Open Access Journals (Sweden)

    HASAN CANI

    2014-06-01

    Full Text Available Forests are the dominant land use in Albania, occupying almost 1.5 million hectares [11], but ca. 70% of the forest area consists of coppices and shrub forests, as a result of unsustainable practices, intensive cutting and overgrazing. Forest ecosystems serve many ecological roles, including regulation of the planet's carbon and water cycles. Forests are also important components of economic systems. Research in forest ecophysiology at the Faculty of Forestry Sciences is intended to produce biological knowledge that can be used to better manage forest resources for sustainable production of economic and non-economic values, and aims to improve the understanding of past and current dynamics of Mediterranean and temperate forests. The overarching goal is to quantify the influence of genetics, climate, environmental stresses, and forest management inputs on forest productivity and carbon sequestration, and to understand the physiological mechanisms underlying these responses. Process-based models open the way to useful predictions of the future growth rate of forests and provide a means of assessing the probable effects of variations in climate and management on forest productivity. As such they have the potential to overcome the limitations of conventional forest growth and yield models. This paper discusses the basic physiological processes that determine the growth of plants, the way they are affected by environmental factors, and how we can improve processes that are well understood, such as growth from leaf to stand level and productivity. The study tries to show a clear relationship between temperature, water relations and other factors affecting forest plant germination and growth that are often looked at separately. This integrated approach will provide the most comprehensive source for process-based modelling, which is valuable to ecologists, plant physiologists, forest planners and environmental scientists [10]. Actually the

  7. 19 CFR 19.8 - Examination of goods by importer; sampling; repacking; examination of merchandise by prospective...

    Science.gov (United States)

    2010-04-01

    19 CFR 19.8 (2010-04-01 edition), Customs Duties: Examination of goods by importer; sampling; repacking; examination of merchandise by prospective purchasers. A prospective purchaser may be permitted to examine merchandise provided there is no interference with the conduct of Customs business and no danger to the revenue.

  8. A comparison of adaptive sampling designs and binary spatial models: A simulation study using a census of Bromus inermis

    Science.gov (United States)

    Irvine, Kathryn M.; Thornton, Jamie; Backus, Vickie M.; Hohmann, Matthew G.; Lehnhoff, Erik A.; Maxwell, Bruce D.; Michels, Kurt; Rew, Lisa

    2013-01-01

    Commonly in environmental and ecological studies, species distribution data are recorded as presence or absence throughout a spatial domain of interest. Field based studies typically collect observations by sampling a subset of the spatial domain. We consider the effects of six different adaptive and two non-adaptive sampling designs and choice of three binary models on both predictions to unsampled locations and parameter estimation of the regression coefficients (species–environment relationships). Our simulation study is unique compared to others to date in that we virtually sample a true known spatial distribution of a nonindigenous plant species, Bromus inermis. The census of B. inermis provides a good example of a species distribution that is both sparsely (1.9 % prevalence) and patchily distributed. We find that modeling the spatial correlation using a random effect with an intrinsic Gaussian conditionally autoregressive prior distribution was equivalent or superior to Bayesian autologistic regression in terms of predicting to un-sampled areas when strip adaptive cluster sampling was used to survey B. inermis. However, inferences about the relationships between B. inermis presence and environmental predictors differed between the two spatial binary models. The strip adaptive cluster designs we investigate provided a significant advantage in terms of Markov chain Monte Carlo chain convergence when trying to model a sparsely distributed species across a large area. In general, there was little difference in the choice of neighborhood, although the adaptive king was preferred when transects were randomly placed throughout the spatial domain.

  9. Numerical Study of φ4 Model by Potential Importance Sampling Method

    Institute of Scientific and Technical Information of China (English)

    YUAN Qing-Xin; DING Guo-Hui

    2006-01-01

    We investigate the phenomena of spontaneous symmetry breaking for the φ4 model on a square lattice in the parameter space by using the potential importance sampling method, which was proposed by Milchev, Heermann, and Binder [J. Stat. Phys. 44 (1986) 749]. The critical values of the parameters allow us to determine the phase diagram of the model. At the same time, some relevant quantities such as susceptibility and specific heat are also obtained.
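
    For orientation, the sketch below runs a plain Metropolis simulation of the 2D lattice φ4 model and measures the magnetization and a susceptibility estimate; it is a standard baseline, not the potential importance sampling method of Milchev, Heermann, and Binder, and the couplings are illustrative.

```python
import numpy as np

# Plain Metropolis baseline for the 2D lattice phi^4 model (NOT the potential
# importance sampling method). Lattice Hamiltonian:
#   H = sum_x [ 1/2 * sum_mu (phi_{x+mu} - phi_x)^2 + r/2 * phi_x^2 + u/4 * phi_x^4 ]

rng = np.random.default_rng(0)
L, r, u = 16, -2.0, 1.0
n_sweeps, burn_in, step = 600, 200, 0.5
phi = rng.normal(0.0, 0.1, (L, L))

def local_energy(field, i, j, val):
    nbrs = field[(i+1) % L, j] + field[(i-1) % L, j] + field[i, (j+1) % L] + field[i, (j-1) % L]
    return 2.0 * val**2 - val * nbrs + 0.5 * r * val**2 + 0.25 * u * val**4

mags = []
for sweep in range(n_sweeps):
    for i in range(L):
        for j in range(L):
            old = phi[i, j]
            new = old + rng.uniform(-step, step)
            dE = local_energy(phi, i, j, new) - local_energy(phi, i, j, old)
            if dE < 0.0 or rng.random() < np.exp(-dE):   # Metropolis acceptance
                phi[i, j] = new
    if sweep >= burn_in:
        mags.append(abs(phi.mean()))

mags = np.array(mags)
print(f"<|m|> = {mags.mean():.3f}, susceptibility ~ {L * L * mags.var():.3f}")
```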

  10. De novo mutations from sporadic schizophrenia cases highlight important signaling genes in an independent sample.

    Science.gov (United States)

    Kranz, Thorsten M; Harroch, Sheila; Manor, Orly; Lichtenberg, Pesach; Friedlander, Yechiel; Seandel, Marco; Harkavy-Friedman, Jill; Walsh-Messinger, Julie; Dolgalev, Igor; Heguy, Adriana; Chao, Moses V; Malaspina, Dolores

    2015-08-01

    Schizophrenia is a debilitating syndrome with high heritability. Genomic studies reveal more than a hundred genetic variants, largely nonspecific and of small effect size, and not accounting for its high heritability. De novo mutations are one mechanism whereby disease-related alleles may be introduced into the population, although these have not been leveraged to explore the disease in general samples. This paper describes a framework to find high-impact genes for schizophrenia. This study consists of two different datasets. First, whole exome sequencing was conducted to identify disruptive de novo mutations in 14 complete parent-offspring trios with sporadic schizophrenia from Jerusalem, which identified 5 sporadic cases with de novo gene mutations in 5 different genes (PTPRG, TGM5, SLC39A13, BTK, CDKN3). Next, targeted exome capture of these genes was conducted in 48 well-characterized, unrelated, ethnically diverse schizophrenia cases, recruited and characterized by the same research team in New York (NY sample), which demonstrated extremely rare and potentially damaging variants in three of the five genes. Functional de novo mutations in protein-interaction domains in sporadic schizophrenia can illuminate risk genes that increase the propensity to develop schizophrenia across ethnicities.

  11. Allergic contact dermatitis from exotic woods: importance of patch-testing with patient-provided samples.

    Science.gov (United States)

    Podjasek, Joshua O; Cook-Norris, Robert H; Richardson, Donna M; Drage, Lisa A; Davis, Mark D P

    2011-01-01

    Exotic woods from tropical and subtropical regions (eg, from South America, south Asia, and Africa) frequently are used occupationally and recreationally by woodworkers and hobbyists. These exotic woods more commonly provoke irritant contact dermatitis reactions, but they also can provoke allergic contact dermatitis reactions. We report three patients seen at Mayo Clinic (Rochester, MN) with allergic contact dermatitis reactions to exotic woods. Patch testing was performed and included patient-provided wood samples. Avoidance of identified allergens was recommended. For all patients, the dermatitis cleared or improved after avoidance of the identified allergens. Clinicians must be aware of the potential for allergic contact dermatitis reactions to compounds in exotic woods. Patch testing should be performed with suspected woods for diagnostic confirmation and allowance of subsequent avoidance of the allergens.

  12. Importance of long-time simulations for rare event sampling in zinc finger proteins.

    Science.gov (United States)

    Godwin, Ryan; Gmeiner, William; Salsbury, Freddie R

    2016-01-01

    Molecular dynamics (MD) simulation methods have seen significant improvement since their inception in the late 1950s. Constraints of simulation size and duration that once impeded the field have lessened with the advent of better algorithms, faster processors, and parallel computing. With newer techniques and hardware available, MD simulations of more biologically relevant timescales can now sample a broader range of conformational and dynamical changes including rare events. One concern in the literature has been under which circumstances it is sufficient to perform many shorter timescale simulations and under which circumstances fewer longer simulations are necessary. Herein, our simulations of the zinc finger NEMO (2JVX) using multiple simulations of length 15, 30, 1000, and 3000 ns are analyzed to provide clarity on this point.

  13. The importance of measuring and accounting for potential biases in respondent-driven samples.

    Science.gov (United States)

    Rudolph, Abby E; Fuller, Crystal M; Latkin, Carl

    2013-07-01

    Respondent-driven sampling (RDS) is often viewed as a superior method for recruiting hard-to-reach populations disproportionately burdened with poor health outcomes. As an analytic approach, it has been praised for its ability to generate unbiased population estimates via post-stratified weights which account for non-random recruitment. However, population estimates generated with RDSAT (RDS Analysis Tool) are sensitive to variations in degree weights. Several assumptions are implicit in the degree weight and are not routinely assessed. Failure to meet these assumptions could result in inaccurate degree measures and consequently result in biased population estimates. We highlight potential biases associated with violating the assumptions implicit in degree weights for the RDSAT estimator and propose strategies to measure and possibly correct for biases in the analysis.
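
    The role of the degree weights can be illustrated in a few lines: in an RDS-II (Volz-Heckathorn) style estimator, each respondent is weighted by the inverse of their reported network size, so errors in reported degree propagate directly into the prevalence estimate. The data below are synthetic.

```python
import numpy as np

# Synthetic illustration of inverse-degree weighting in RDS analysis.

rng = np.random.default_rng(0)
n = 500
degree = rng.integers(1, 50, n)                              # reported network sizes
outcome = rng.random(n) < (0.2 + 0.3 * (degree > 25))        # outcome correlated with degree

w = 1.0 / degree
naive = outcome.mean()
rds2 = np.sum(w * outcome) / np.sum(w)                       # degree-weighted prevalence
print(f"naive sample proportion = {naive:.2f}, degree-weighted estimate = {rds2:.2f}")
```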

  14. Performance of sampling density-weighted and postfiltered density-adapted projection reconstruction in sodium magnetic resonance imaging.

    Science.gov (United States)

    Konstandin, Simon; Nagel, Armin M

    2013-02-01

    Sampling density-weighted apodization projection reconstruction sequences are evaluated for three-dimensional radial imaging. The readout gradients of the sampling density-weighted apodization sequence are designed such that the locally averaged sampling density matches a Hamming filter function. This technique is compared with density-adapted projection reconstruction with nonfiltered and postfiltered image reconstruction. Sampling density-weighted apodization theoretically allows for a 1.28-fold higher signal-to-noise ratio compared with postfiltered density-adapted projection reconstruction sequences, if T2* decay is negligible compared with the readout duration TRO. Simulations of the point-spread functions are performed for monoexponential and biexponential decay to investigate the effects of T2* decay on the performance of the different sequences. Postfiltered density-adapted projection reconstruction performs superior to sampling density-weighted apodization for large TRO/T2* ratios [>1.36 (monoexponential decay); >0.35 (biexponential decay with T2s*/T2f* = 10)], if the signal-to-noise ratio of point-like objects is considered. In conclusion, it depends on the readout parameters, the T2* relaxation times, and the dimensions of the subject which of the two sequences is most suitable. Copyright © 2012 Wiley Periodicals, Inc.

  15. Study of a MEMS-based Shack-Hartmann wavefront sensor with adjustable pupil sampling for astronomical adaptive optics.

    Science.gov (United States)

    Baranec, Christoph; Dekany, Richard

    2008-10-01

    We introduce a Shack-Hartmann wavefront sensor for adaptive optics that enables dynamic control of the spatial sampling of an incoming wavefront using a segmented mirror microelectrical mechanical systems (MEMS) device. Unlike a conventional lenslet array, subapertures are defined by either segments or groups of segments of a mirror array, with the ability to change spatial pupil sampling arbitrarily by redefining the segment grouping. Control over the spatial sampling of the wavefront allows for the minimization of wavefront reconstruction error for different intensities of guide source and different atmospheric conditions, which in turn maximizes an adaptive optics system's delivered Strehl ratio. Requirements for the MEMS devices needed in this Shack-Hartmann wavefront sensor are also presented.

  16. Adaptation of G-TAG Software for Validating Touch-and-Go Comet Surface Sampling Design Methodology

    Science.gov (United States)

    Mandic, Milan; Acikmese, Behcet; Blackmore, Lars

    2011-01-01

    The G-TAG software tool was developed under the R&TD on Integrated Autonomous Guidance, Navigation, and Control for Comet Sample Return, and represents a novel, multi-body dynamics simulation software tool for studying TAG sampling. The G-TAG multi-body simulation tool provides a simulation environment in which a Touch-and-Go (TAG) sampling event can be extensively tested. TAG sampling requires the spacecraft to descend to the surface, contact the surface with a sampling collection device, and then to ascend to a safe altitude. The TAG event lasts only a few seconds but is mission-critical with potentially high risk. Consequently, there is a need for the TAG event to be well characterized and studied by simulation and analysis in order for the proposal teams to converge on a reliable spacecraft design. This adaptation of the G-TAG tool was developed to support the Comet Odyssey proposal effort, and is specifically focused to address comet sample return missions. In this application, the spacecraft descends to and samples from the surface of a comet. Performance of the spacecraft during TAG is assessed based on survivability and sample collection performance. For the adaptation of the G-TAG simulation tool to comet scenarios, models are developed that accurately describe the properties of the spacecraft, approach trajectories, and descent velocities, as well as the models of the external forces and torques acting on the spacecraft. The adapted models of the spacecraft, descent profiles, and external sampling forces/torques were more sophisticated and customized for comets than those available in the basic G-TAG simulation tool. Scenarios implemented include the study of variations in requirements, spacecraft design (size, locations, etc. of the spacecraft components), and the environment (surface properties, slope, disturbances, etc.). The simulations, along with their visual representations using G-View, contributed to the Comet Odyssey New Frontiers proposal

  17. Adaptation and psychometric properties of the student career construction inventory for a Portuguese sample: formative and reflective constructs.

    Science.gov (United States)

    Rocha, Magda; Guimarães, Maria Isabel

    2012-12-01

    The adaptation of the student career construction inventory was carried out with a Portuguese sample of 356 first-year economics, management, psychology, nursing, nutrition sciences, bio-engineering, and biosciences students (244 women, 112 men; M age = 19.4, SD = 4.4) in the Catholic University of Portugal, Porto. Confirmatory factorial analysis supported the prior structure of the reflective models, with acceptable fit indexes. Internal consistency coefficients for the scales were poor to acceptable (.51 to .89). The formative nature of career adaptability was supported in a complex model identified by structural relations for which the fit indexes were weak but acceptable for a preliminary study.

  18. Importance of sample size for the estimation of repeater F waves in amyotrophic lateral sclerosis.

    Science.gov (United States)

    Fang, Jia; Liu, Ming-Sheng; Guan, Yu-Zhou; Cui, Bo; Cui, Li-Ying

    2015-02-20

    In amyotrophic lateral sclerosis (ALS), repeater F waves are increased. Accurate assessment of repeater F waves requires an adequate sample size. We studied the F waves of left ulnar nerves in ALS patients. Based on the presence or absence of pyramidal signs in the left upper limb, the ALS patients were divided into two groups: One group with pyramidal signs designated as P group and the other without pyramidal signs designated as NP group. The Index repeating neurons (RN) and Index repeater F waves (Freps) were compared among the P, NP and control groups following 20 and 100 stimuli respectively. For each group, the Index RN and Index Freps obtained from 20 and 100 stimuli were compared. In the P group, the Index RN (P = 0.004) and Index Freps (P = 0.001) obtained from 100 stimuli were significantly higher than from 20 stimuli. For F waves obtained from 20 stimuli, no significant differences were identified between the P and NP groups for Index RN (P = 0.052) and Index Freps (P = 0.079); the Index RN (P < 0.001) and Index Freps (P < 0.001) of the P group were significantly higher than those of the control group; the Index RN (P = 0.002) of the NP group was significantly higher than that of the control group. For F waves obtained from 100 stimuli, the Index RN (P < 0.001) and Index Freps (P < 0.001) of the P group were significantly higher than those of the NP group; the Index RN (P < 0.001) and Index Freps (P < 0.001) of the P and NP groups were significantly higher than those of the control group. Increased repeater F waves reflect increased excitability of the motor neuron pool and indicate upper motor neuron dysfunction in ALS. For an accurate evaluation of repeater F waves in ALS patients, especially those with moderate to severe muscle atrophy, 100 stimuli would be required.

  19. A Modified Trap for Adult Sampling of Medically Important Flies (Insecta: Diptera)

    Directory of Open Access Journals (Sweden)

    Kamran Akbarzadeh

    2012-12-01

    Full Text Available Background: Bait-trapping appears to be a generally useful method of studying fly populations. The aim of this study was to construct a new adult flytrap by making some modifications to former versions and to evaluate its applicability in a subtropical zone in southern Iran. Methods: The traps were constructed by adding some equipment to a polyethylene container (18 × 20 × 33 cm) with lid. Fresh sheep meat was used as bait. In total, 27 modified adult traps were made and tested for their efficacy in attracting adult flies. The experiment was carried out in a range of different topographic areas of Fars Province during June 2010. Results: The traps were able to attract various groups of adult flies belonging to the families Calliphoridae, Sarcophagidae, Muscidae, and Faniidae. The species Calliphora vicina (Diptera: Calliphoridae), Sarcophaga argyrostoma (Diptera: Sarcophagidae) and Musca domestica (Diptera: Muscidae) comprised the majority of the flies collected by this sheep-meat baited trap. Conclusion: This adult flytrap can be recommended for routine field sampling to study the diversity and population dynamics of flies where daily collection is difficult.

  20. Importance of Sample Size for the Estimation of Repeater F Waves in Amyotrophic Lateral Sclerosis

    Directory of Open Access Journals (Sweden)

    Jia Fang

    2015-01-01

    Full Text Available Background: In amyotrophic lateral sclerosis (ALS), repeater F waves are increased. Accurate assessment of repeater F waves requires an adequate sample size. Methods: We studied the F waves of left ulnar nerves in ALS patients. Based on the presence or absence of pyramidal signs in the left upper limb, the ALS patients were divided into two groups: One group with pyramidal signs designated as P group and the other without pyramidal signs designated as NP group. The Index repeating neurons (RN) and Index repeater F waves (Freps) were compared among the P, NP and control groups following 20 and 100 stimuli respectively. For each group, the Index RN and Index Freps obtained from 20 and 100 stimuli were compared. Results: In the P group, the Index RN (P = 0.004) and Index Freps (P = 0.001) obtained from 100 stimuli were significantly higher than from 20 stimuli. For F waves obtained from 20 stimuli, no significant differences were identified between the P and NP groups for Index RN (P = 0.052) and Index Freps (P = 0.079); The Index RN (P < 0.001) and Index Freps (P < 0.001) of the P group were significantly higher than the control group; The Index RN (P = 0.002) of the NP group was significantly higher than the control group. For F waves obtained from 100 stimuli, the Index RN (P < 0.001) and Index Freps (P < 0.001) of the P group were significantly higher than the NP group; The Index RN (P < 0.001) and Index Freps (P < 0.001) of the P and NP groups were significantly higher than the control group. Conclusions: Increased repeater F waves reflect increased excitability of motor neuron pool and indicate upper motor neuron dysfunction in ALS. For an accurate evaluation of repeater F waves in ALS patients especially those with moderate to severe muscle atrophy, 100 stimuli would be required.

  1. Accounting for sampling patterns reverses the relative importance of trade and climate for the global sharing of exotic plants

    Science.gov (United States)

    Sofaer, Helen; Jarnevich, Catherine S.

    2017-01-01

    Aim: The distributions of exotic species reflect patterns of human-mediated dispersal, species climatic tolerances and a suite of other biotic and abiotic factors. The relative importance of each of these factors will shape how the spread of exotic species is affected by ongoing economic globalization and climate change. However, patterns of trade may be correlated with variation in scientific sampling effort globally, potentially confounding studies that do not account for sampling patterns. Location: Global. Time period: Museum records, generally from the 1800s up to 2015. Major taxa studied: Plant species exotic to the United States. Methods: We used data from the Global Biodiversity Information Facility (GBIF) to summarize the number of plant species with exotic occurrences in the United States that also occur in each other country world-wide. We assessed the relative importance of trade and climatic similarity for explaining variation in the number of shared species while evaluating several methods to account for variation in sampling effort among countries. Results: Accounting for variation in sampling effort reversed the relative importance of trade and climate for explaining numbers of shared species. Trade was strongly correlated with numbers of shared U.S. exotic plants between the United States and other countries before, but not after, accounting for sampling variation among countries. Conversely, accounting for sampling effort strengthened the relationship between climatic similarity and species sharing. Using the number of records as a measure of sampling effort provided a straightforward approach for the analysis of occurrence data, whereas species richness estimators and rarefaction were less effective at removing sampling bias. Main conclusions: Our work provides support for broad-scale climatic limitation on the distributions of exotic species, illustrates the need to account for variation in sampling effort in large biodiversity databases, and highlights the

  2. Importance of Sample Size for the Estimation of Repeater F Waves in Amyotrophic Lateral Sclerosis

    Institute of Scientific and Technical Information of China (English)

    Jia Fang; Ming-Sheng Liu; Yu-Zhou Guan; Bo Cui; Li-Ying Cui

    2015-01-01

    Background: In amyotrophic lateral sclerosis (ALS), repeater F waves are increased. Accurate assessment of repeater F waves requires an adequate sample size. Methods: We studied the F waves of left ulnar nerves in ALS patients. Based on the presence or absence of pyramidal signs in the left upper limb, the ALS patients were divided into two groups: One group with pyramidal signs designated as P group and the other without pyramidal signs designated as NP group. The Index repeating neurons (RN) and Index repeater F waves (Freps) were compared among the P, NP and control groups following 20 and 100 stimuli respectively. For each group, the Index RN and Index Freps obtained from 20 and 100 stimuli were compared. Results: In the P group, the Index RN (P = 0.004) and Index Freps (P = 0.001) obtained from 100 stimuli were significantly higher than from 20 stimuli. For F waves obtained from 20 stimuli, no significant differences were identified between the P and NP groups for Index RN (P = 0.052) and Index Freps (P = 0.079); The Index RN (P < 0.001) and Index Freps (P < 0.001) of the P group were significantly higher than the control group; The Index RN (P = 0.002) of the NP group was significantly higher than the control group. For F waves obtained from 100 stimuli, the Index RN (P < 0.001) and Index Freps (P < 0.001) of the P group were significantly higher than the NP group; The Index RN (P < 0.001) and Index Freps (P < 0.001) of the P and NP groups were significantly higher than the control group. Conclusions: Increased repeater F waves reflect increased excitability of motor neuron pool and indicate upper motor neuron dysfunction in ALS. For an accurate evaluation of repeater F waves in ALS patients especially those with moderate to severe muscle atrophy, 100 stimuli would be required.

  3. Spatiotonal adaptivity in super-resolution of under-sampled image sequences

    NARCIS (Netherlands)

    Pham, T.Q.

    2006-01-01

    This thesis concerns the use of spatial and tonal adaptivity in improving the resolution of aliased image sequences under scene or camera motion. Each of the five content chapters focuses on a different subtopic of super-resolution: image registration (chapter 2), image fusion (chapter 3 and 4), sup

  4. Spatiotonal adaptivity in super-resolution of under-sampled image sequences

    NARCIS (Netherlands)

    Pham, T.Q.

    2006-01-01

    This thesis concerns the use of spatial and tonal adaptivity in improving the resolution of aliased image sequences under scene or camera motion. Each of the five content chapters focuses on a different subtopic of super-resolution: image registration (chapter 2), image fusion (chapter 3 and 4),

  5. An energy-efficient adaptive sampling scheme for wireless sensor networks

    NARCIS (Netherlands)

    Masoum, Alireza; Meratnia, Nirvana; Havinga, Paul J.M.

    2013-01-01

    Wireless sensor networks are new monitoring platforms. To cope with their resource constraints, in terms of energy and bandwidth, spatial and temporal correlation in sensor data can be exploited to find an optimal sampling strategy to reduce the number of sampling nodes and/or sampling frequencies while

  6. Evaluation of endoscopically obtained duodenal biopsy samples from cats and dogs in an adapter-modified Ussing chamber

    Science.gov (United States)

    DeBiasio, John V.; Suchodolski, Jan S.; Newman, Shelley; Musch, Mark W.; Steiner, Jörg M.

    2014-01-01

    This study was conducted to evaluate an adapter-modified Ussing chamber for assessment of transport physiology in endoscopically obtained duodenal biopsies from healthy cats and dogs, as well as dogs with chronic enteropathies. 17 duodenal biopsies from five cats and 51 duodenal biopsies from 13 dogs were obtained. Samples were transferred into an adapter-modified Ussing chamber and sequentially exposed to various absorbagogues and secretagogues. Overall, 78.6% of duodenal samples obtained from cats responded to at least one compound. In duodenal biopsies obtained from dogs, the rate of overall response ranged from 87.5% (healthy individuals; n = 8), to 63.6% (animals exhibiting clinical signs of gastrointestinal disease and histopathological unremarkable duodenum; n = 15), and 32.1% (animals exhibiting clinical signs of gastrointestinal diseases and moderate to severe histopathological lesions; n = 28). Detailed information regarding the magnitude and duration of the response are provided. The adapter-modified Ussing chamber enables investigation of the absorptive and secretory capacity of endoscopically obtained duodenal biopsies from cats and dogs and has the potential to become a valuable research tool. The response of samples was correlated with histopathological findings. PMID:24378587

  7. Research on non-uniform sampling problem when adapting wavenumber algorithm to multiple-receiver synthetic aperture sonar

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    The azimuth sampling of multiple-receiver SAS systems is non-uniform, which means that the standard wavenumber algorithm (ω-κ) cannot be applied directly to multiple-receiver SAS image reconstruction. To solve this problem, two methods are presented that adapt the standard ω-κ algorithm to multiple-receiver SAS systems. One method, named Non-uniform Separate Fourier Transform (NSFFT), converts the Fourier Transform (FT) of the non-uniform samples in the azimuth direction into several uniform FTs, on the assumption that the sonar array...

  8. Detecting Local Adaptation Using the Joint Sampling of Polymorphism Data in the Parental and Derived Populations

    OpenAIRE

    Innan, Hideki; Kim, Yuseob

    2008-01-01

    When a local colonization in a new niche occurs, the new derived population should be subject to different selective pressures from that in the original parental population; consequently it is likely that many loci will be subject to directional selection. In such a quick adaptation event through environmental changes, it is reasonable to consider that selection utilizes genetic variations accumulated in the precolonization phase. This mode of selection from standing variation would play an i...

  9. Clinical importance and representation of toxigenic and non-toxigenic Clostridium difficile cultivated from stool samples of hospitalized patients

    Directory of Open Access Journals (Sweden)

    Predrag Stojanovic

    2012-03-01

    Full Text Available The aim of this study was to establish the clinical importance and representation of toxigenic and non-toxigenic Clostridium difficile cultivated from stool samples of hospitalized patients. This survey included 80 hospitalized patients with diarrhea and positive findings of Clostridium difficile in stool samples, and 100 hospitalized patients with formed stool as a control group. Bacteriological examination of the stool samples was conducted using standard microbiological methods. Stool samples were inoculated directly on nutrient media for bacterial cultivation (blood agar using 5% sheep blood, Endo agar, selective Salmonella Shigella agar, Selenite-F broth, CIN agar and Skirrow's medium), and on selective cycloserine-cefoxitin-fructose agar (CCFA) (Biomedics, Parque Tecnológico, Madrid, Spain) for isolation of Clostridium difficile. Clostridium difficile toxin was detected by ELISA-ridascreen Clostridium difficile Toxin A/B (R-Biopharm AG, Germany) and the ColorPAC ToxinA test (Becton Dickinson, USA). Examination of stool specimens for the presence of parasites (causing diarrhea) was done using standard methods (conventional microscopy, the commercial concentration test Paraprep S Gold kit (Dia Mondial, France) and the RIDA®QUICK Cryptosporidium/Giardia Combi test (R-Biopharm AG, Germany)). Examination of stool specimens for the presence of fungi (causing diarrhea) was performed by standard methods. All stool samples positive for Clostridium difficile were tested for Rota, Noro, Astro and Adeno viruses by ELISA-ridascreen (R-Biopharm AG, Germany). In this research we isolated 99 Clostridium difficile strains from 116 stool samples of 80 hospitalized patients with diarrhea. Fifty-three (66.25%) of the patients with diarrhea were positive for toxins A and B, and one (1.25%) was positive for only toxin B. Non-toxigenic Clostridium difficile was isolated from samples of 26 (32.5%) patients. However, other pathogenic microorganisms of the intestinal tract were cultivated from samples of 16 patients

  10. Enhancing the Frequency Adaptability of Periodic Current Controllers with a Fixed Sampling Rate for Grid-Connected Power Converters

    DEFF Research Database (Denmark)

    Yang, Yongheng; Zhou, Keliang; Blaabjerg, Frede

    2016-01-01

    ... the instantaneous grid information (e.g., frequency and phase of the grid voltage) for the current control, which is commonly performed by a Phase-Locked-Loop (PLL) system. Hence, harmonics and deviations in the estimated frequency by the PLL could lead to current tracking performance degradation, especially for the periodic signal controllers (e.g., PR and RC) with a fixed sampling rate. In this paper, the impacts of frequency deviations induced by the PLL and/or the grid disturbances on the selected current controllers are investigated by analyzing the frequency adaptability of these current controllers. Subsequently, strategies to enhance the frequency adaptability of the current controllers are proposed for the power converters to produce high quality feed-in currents even in the presence of grid frequency deviations. Specifically, by feeding back the PLL estimated frequency to update the center frequencies ...
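    The center-frequency update described above can be illustrated with a short sketch. This is a minimal, hypothetical example rather than the authors' implementation: it re-discretizes an ideal resonant term at a fixed sampling rate whenever the PLL reports a new grid frequency, using SciPy's bilinear transform; the gain, sampling rate and frequencies are placeholder values.

```python
import numpy as np
from scipy.signal import bilinear

def resonant_coefficients(f_est, fs, kr=100.0):
    """Discretize the ideal resonant term G(s) = kr * s / (s^2 + w0^2)
    with the bilinear (Tustin) transform (prewarping omitted for brevity),
    using the PLL-estimated grid frequency f_est (Hz) as the resonator
    center frequency, at the fixed sampling rate fs (Hz)."""
    w0 = 2.0 * np.pi * f_est
    b, a = bilinear([kr, 0.0], [1.0, 0.0, w0 ** 2], fs=fs)
    return b, a  # difference-equation coefficients at sampling rate fs

# Example: recompute coefficients when the PLL reports 49.8 Hz instead of 50 Hz
b_nom, a_nom = resonant_coefficients(50.0, fs=10e3)
b_adp, a_adp = resonant_coefficients(49.8, fs=10e3)
print(a_nom, a_adp)  # only the frequency-dependent terms change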

  11. An adaptive sampling algorithm for Doppler-shift fluorescence velocimetry in high-speed flows

    Science.gov (United States)

    Le Page, Laurent M.; O'Byrne, Sean

    2017-03-01

    We present an approach to improving the efficiency of obtaining samples over a given domain for the peak location of Gaussian line-shapes. The method uses parameter estimates obtained from previous measurements to determine subsequent sampling locations. The method may be applied to determine the location of a spectral peak, where the monetary or time cost is too high to allow a less efficient search method, such as sampling at uniformly distributed domain locations, to be used. We demonstrate the algorithm using linear least-squares fitting of log-scaled planar laser-induced fluorescence data combined with Monte-Carlo simulation of measurements, to accurately determine the Doppler-shifted fluorescence peak frequency for each pixel of a fluorescence image. A simulated comparison between this approach and a uniformly spaced sampling approach is carried out using fits both for a single pixel and for a collection of pixels representing the fluorescence images that would be obtained in a hypersonic flow facility. In all cases, the peak location of Doppler-shifted line-shapes were determined to a similar precision with fewer samples than could be achieved using the more typical uniformly distributed sampling approach.
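    The peak-finding step described above (linear least-squares fitting of log-scaled line-shape data) can be sketched as follows. This is a minimal illustration assuming a Gaussian line-shape with multiplicative noise; the frequencies, width and noise level are made up, and the adaptive choice of subsequent sampling locations from previous parameter estimates is not shown.

```python
import numpy as np

def gaussian_peak_from_log_fit(freqs, signal):
    """Estimate the peak location of a Gaussian line-shape by fitting a
    quadratic to the log-scaled signal with linear least squares; the peak
    is at mu = -c1 / (2*c2) for log(y) = c0 + c1*f + c2*f^2."""
    logy = np.log(signal)
    A = np.vander(freqs, 3, increasing=True)  # columns: 1, f, f^2
    c0, c1, c2 = np.linalg.lstsq(A, logy, rcond=None)[0]
    return -c1 / (2.0 * c2)

# Toy demonstration with a synthetic, noisy Gaussian line-shape
rng = np.random.default_rng(0)
true_peak = 2.3                               # arbitrary Doppler-shifted peak position
f = np.linspace(1.0, 4.0, 15)                 # sampled frequencies
y = np.exp(-(f - true_peak) ** 2 / (2 * 0.4 ** 2)) * (1 + 0.02 * rng.standard_normal(f.size))
print(gaussian_peak_from_log_fit(f, y))       # close to 2.3
```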

  12. Quantitative Assessment of the Importance of Phenotypic Plasticity in Adaptation to Climate Change in Wild Bird Populations

    NARCIS (Netherlands)

    Vedder, Oscar; Bouwhuis, Sandra; Sheldon, Ben C.

    Predictions about the fate of species or populations under climate change scenarios typically neglect adaptive evolution and phenotypic plasticity, the two major mechanisms by which organisms can adapt to changing local conditions. As a consequence, we have little understanding of the scope for

  13. Adaptation of the Nomophobia Questionnaire (NMP-Q) to Spanish in a sample of adolescents.

    Science.gov (United States)

    González-Cabrera, Joaquín; León-Mejía, Ana; Pérez-Sancho, Carlota; Calvete, Esther

    2017-07-01

    Nomophobia is the fear of being out of mobile phone contact. People suffering from this anxiety disorder have feelings of stress and nervousness when access to their mobiles or computers is not possible. This work is an adaptation and validation study of the Spanish version of the Nomophobia Questionnaire (NMP-Q). The study included 306 students (46.1% males and 53.9% females) with ages ranging from 13 to 19 years (Md = 15.41 ± 1.22). Exploratory factor analysis revealed four dimensions that accounted for 64.4% of total variance. The ordinal α-value was 0.95, ranging from 0.75 to 0.92 across factors. Stability was assessed by the test-retest method (r = 0.823). Indicators of convergence with the Spanish versions of the “Mobile Phone Problem Use Scale” (r = 0.654) and the “Generalized Problematic Internet Use Scale” (r = 0.531) were identified. Problematic mobile phone use patterns were examined taking the 15th, 80th and 95th percentiles as cut-off points. Scores of 39, 87 and 116 on the NMP-Q corresponded to occasional, at-risk and problematic users, respectively. Psychometric analysis shows that the Spanish version of the NMP-Q is a valid and reliable tool for the study of nomophobia.

  14. Efficient Bayes-Adaptive Reinforcement Learning using Sample-Based Search

    CERN Document Server

    Guez, Arthur; Dayan, Peter

    2012-01-01

    Bayesian model-based reinforcement learning is a formally elegant approach to learning optimal behaviour under model uncertainty. In this setting, a Bayes-optimal policy captures the ideal trade-off between exploration and exploitation. Unfortunately, finding Bayes-optimal policies is notoriously taxing due to the enormous search space in the augmented belief-state MDP. In this paper we exploit recent advances in sample-based planning, based on Monte-Carlo tree search, to introduce a tractable method for approximate Bayes-optimal planning. Unlike prior work in this area, we avoid expensive applications of Bayes rule within the search tree, by lazily sampling models from the current beliefs. Our approach outperformed prior Bayesian model-based RL algorithms by a significant margin on several well-known benchmark problems.

  15. Adaptive robust image registration approach based on adequately sampling polar transform and weighted angular projection function

    Science.gov (United States)

    Wei, Zhao; Tao, Feng; Jun, Wang

    2013-10-01

    An efficient, robust, and accurate approach is developed for image registration, which is especially suitable for large-scale change and arbitrary rotation. It is named the adequately sampling polar transform and weighted angular projection function (ASPT-WAPF). The proposed ASPT model overcomes the oversampling problem of conventional log-polar transform. Additionally, the WAPF presented as the feature descriptor is robust to the alteration in the fovea area of an image, and reduces the computational cost of the following registration process. The experimental results show two major advantages of the proposed method. First, it can register images with high accuracy even when the scale factor is up to 10 and the rotation angle is arbitrary. However, the maximum scaling estimated by the state-of-the-art algorithms is 6. Second, our algorithm is more robust to the size of the sampling region while not decreasing the accuracy of the registration.

  16. THE IMPORTANCE OF THE MAGNETIC FIELD FROM AN SMA-CSO-COMBINED SAMPLE OF STAR-FORMING REGIONS

    Energy Technology Data Exchange (ETDEWEB)

    Koch, Patrick M.; Tang, Ya-Wen; Ho, Paul T. P.; Chen, Huei-Ru Vivien; Liu, Hau-Yu Baobab; Yen, Hsi-Wei; Lai, Shih-Ping [Academia Sinica, Institute of Astronomy and Astrophysics, Taipei, Taiwan (China); Zhang, Qizhou; Chen, How-Huan; Ching, Tao-Chung [Harvard-Smithsonian Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States); Girart, Josep M. [Institut de Ciències de l' Espai, CSIC-IEEC, Campus UAB, Facultat de Ciències, C5p 2, 08193 Bellaterra, Catalonia (Spain); Frau, Pau [Observatorio Astronómico Nacional, Alfonso XII, 3 E-28014 Madrid (Spain); Li, Hua-Bai [Department of Physics, The Chinese University of Hong Kong (Hong Kong); Li, Zhi-Yun [Department of Astronomy, University of Virginia, P.O. Box 400325, Charlottesville, VA 22904 (United States); Padovani, Marco [Laboratoire Univers et Particules de Montpellier, UMR 5299 du CNRS, Université de Montpellier II, place E. Bataillon, cc072, F-34095 Montpellier (France); Qiu, Keping [School of Astronomy and Space Science, Nanjing University, 22 Hankou Road, Nanjiing 210093 (China); Rao, Ramprasad, E-mail: pmkoch@asiaa.sinica.edu.tw [Academia Sinica, Institute of Astronomy and Astrophysics, 645 N. Aohoku Place, Hilo, HI 96720 (United States)

    2014-12-20

    Submillimeter dust polarization measurements of a sample of 50 star-forming regions, observed with the Submillimeter Array (SMA) and the Caltech Submillimeter Observatory (CSO) covering parsec-scale clouds to milliparsec-scale cores, are analyzed in order to quantify the magnetic field importance. The magnetic field misalignment δ—the local angle between magnetic field and dust emission gradient—is found to be a prime observable, revealing distinct distributions for sources where the magnetic field is preferentially aligned with or perpendicular to the source minor axis. Source-averaged misalignment angles ⟨|δ|⟩ fall into systematically different ranges, reflecting the different source-magnetic field configurations. Possible bimodal ⟨|δ|⟩ distributions are found for the separate SMA and CSO samples. Combining both samples broadens the distribution with a wide maximum peak at small ⟨|δ|⟩ values. Assuming the 50 sources to be representative, the prevailing source-magnetic field configuration is one that statistically prefers small magnetic field misalignments |δ|. When interpreting |δ| together with a magnetohydrodynamics force equation, as developed in the framework of the polarization-intensity gradient method, a sample-based log-linear scaling fits the magnetic field tension-to-gravity force ratio Σ_B versus ⟨|δ|⟩ with Σ_B = 0.116 · exp(0.047 · ⟨|δ|⟩) ± 0.20 (mean error), providing a way to estimate the relative importance of the magnetic field based only on measurable field misalignments |δ|. The force ratio Σ_B discriminates systems that are collapsible on average (⟨Σ_B⟩ < 1) from other molecular clouds where the magnetic field still provides enough resistance against gravitational collapse (⟨Σ_B⟩ > 1). The sample-wide trend shows a transition around ⟨|δ|⟩ ≈ 45°. Defining an effective gravitational force ∼1 − ⟨Σ_B⟩, the average magnetic-field-reduced star formation efficiency is at least a
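    The quoted fit can be turned into a small numerical check. The sketch below simply evaluates Σ_B = 0.116 · exp(0.047 · ⟨|δ|⟩) for a few misalignment angles and flags the Σ_B = 1 boundary; the coefficients come from the abstract, everything else is illustrative.

```python
import numpy as np

def sigma_B(delta_mean_deg):
    """Magnetic field tension-to-gravity force ratio from the quoted
    sample-based fit Sigma_B = 0.116 * exp(0.047 * <|delta|>)."""
    return 0.116 * np.exp(0.047 * np.asarray(delta_mean_deg, dtype=float))

for d in (10, 30, 45, 60):
    s = sigma_B(d)
    regime = "collapsible on average" if s < 1 else "magnetically supported"
    print(f"<|delta|> = {d:2d} deg -> Sigma_B ~ {s:.2f} ({regime})")
# The Sigma_B = 1 boundary falls near <|delta|> ~ 45 deg, matching the
# transition quoted in the abstract.
```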

  17. Adaptation of ATI-R Scale to Turkish Samples: Validity and Reliability Analyses

    Science.gov (United States)

    Tezci, Erdogan

    2017-01-01

    Teachers' teaching approaches have become an important issue in the search of quality in education and teaching because of their effect on students' learning. Improvements in teachers' knowledge and awareness of their own teaching approaches enable them to adopt teaching process in accordance with their students' learning styles. The Approaches to…

  18. Random Transect with Adaptive Clustering Sampling Design - ArcPad Applet Manual

    Science.gov (United States)

    2011-09-01


  19. SAMPLING ADAPTIVE STRATEGY AND SPATIAL ORGANISATION ESTIMATION OF SOIL ANIMAL COMMUNITIES AT VARIOUS HIERARCHICAL LEVELS OF URBANISED TERRITORIES

    Directory of Open Access Journals (Sweden)

    Baljuk J.A.

    2014-12-01

    Full Text Available This work presents an algorithm for an adaptive strategy of optimal spatial sampling for studying the spatial organisation of soil animal communities under urbanisation. The principal components obtained from the analysis of field data on soil penetration resistance, soil electrical conductivity and forest stand density, collected on a quasi-regular grid, were used as operating variables. The locations of the experimental polygons were determined with the program ESAP. Sampling was carried out on a regular grid within the experimental polygons. The biogeocoenological assessment of the experimental polygons was made on the basis of A. L. Belgard's ecomorphic analysis. The spatial configuration of biogeocoenosis types was established from earth remote sensing data and the analysis of a digital elevation model. The suggested algorithm reveals the spatial organisation of soil animal communities at the levels of the sampled point, the biogeocoenosis, and the landscape.

  20. Diagnosis of Cerebral Toxoplasmosis in AIDS Patients in Brazil: Importance of Molecular and Immunological Methods Using Peripheral Blood Samples

    Science.gov (United States)

    Colombo, Fabio A.; Vidal, José E.; Oliveira, Augusto C. Penalva de; Hernandez, Adrián V.; Bonasser-Filho, Francisco; Nogueira, Roberta S.; Focaccia, Roberto; Pereira-Chioccola, Vera Lucia

    2005-01-01

    Cerebral toxoplasmosis is the most common cerebral focal lesion in AIDS and still accounts for high morbidity and mortality in Brazil. Its occurrence is more frequent in patients with low CD4+ T-cell counts. It is directly related to the prevalence of anti-Toxoplasma gondii antibodies in the population. Therefore, it is important to evaluate sensitive, less invasive, and rapid diagnostic tests. We evaluated the value of PCR using peripheral blood samples on the diagnosis of cerebral toxoplasmosis and whether its association with immunological assays can contribute to a timely diagnosis. We prospectively analyzed blood samples from 192 AIDS patients divided into two groups. The first group was composed of samples from 64 patients with cerebral toxoplasmosis diagnosed by clinical and radiological features. The second group was composed of samples from 128 patients with other opportunistic diseases. Blood collection from patients with cerebral toxoplasmosis was done before or on the third day of anti-toxoplasma therapy. PCR for T. gondii, indirect immunofluorescence, enzyme-linked immunosorbent assay, and an avidity test for toxoplasmosis were performed on all samples. The PCR sensitivity and specificity for diagnosis of cerebral toxoplasmosis in blood were 80% and 98%, respectively. Patients with cerebral toxoplasmosis (89%) presented higher titers of anti-T. gondii IgG antibodies than patients with other diseases (57%) (P < 0.001). These findings suggest the clinical value of the use of both PCR and high titers of anti-T. gondii IgG antibodies for the diagnosis of cerebral toxoplasmosis. This strategy may prevent more invasive approaches. PMID:16207959

  1. Preliminary Efficacy of Adapted Responsive Teaching for Infants at Risk of Autism Spectrum Disorder in a Community Sample

    Directory of Open Access Journals (Sweden)

    Grace T. Baranek

    2015-01-01

    Full Text Available This study examined the (a) feasibility of enrolling 12-month-olds at risk of ASD from a community sample into a randomized controlled trial, (b) subsequent utilization of community services, and (c) potential of a novel parent-mediated intervention to improve outcomes. The First Year Inventory was used to screen and recruit 12-month-old infants at risk of ASD to compare the effects of 6–9 months of Adapted Responsive Teaching (ART) versus referral to early intervention and monitoring (REIM). Eighteen families were followed for ~20 months. Assessments were conducted before randomization, after treatment, and at 6-month follow-up. Utilization of community services was highest for the REIM group. ART significantly outperformed REIM on parent-reported and observed measures of child receptive language with good linear model fit. Multiphase growth models had better fit for more variables, showing the greatest effects in the active treatment phase, where ART outperformed REIM on parental interactive style (less directive), child sensory responsiveness (less hyporesponsive), and adaptive behavior (increased communication and socialization). This study demonstrates the promise of a parent-mediated intervention for improving developmental outcomes for infants at risk of ASD in a community sample and highlights the utility of earlier identification for access to community services earlier than standard practice.

  2. Adaptive foveated single-pixel imaging with dynamic super-sampling

    CERN Document Server

    Phillips, David B; Taylor, Jonathan M; Edgar, Matthew P; Barnett, Stephen M; Gibson, Graham G; Padgett, Miles J

    2016-01-01

    As an alternative to conventional multi-pixel cameras, single-pixel cameras enable images to be recorded using a single detector that measures the correlations between the scene and a set of patterns. However, to fully sample a scene in this way requires at least the same number of correlation measurements as there are pixels in the reconstructed image. Therefore single-pixel imaging systems typically exhibit low frame-rates. To mitigate this, a range of compressive sensing techniques have been developed which rely on a priori knowledge of the scene to reconstruct images from an under-sampled set of measurements. In this work we take a different approach and adopt a strategy inspired by the foveated vision systems found in the animal kingdom - a framework that exploits the spatio-temporal redundancy present in many dynamic scenes. In our single-pixel imaging system a high-resolution foveal region follows motion within the scene, but unlike a simple zoom, every frame delivers new spatial information from acros...

  3. Assessment of Different Sampling Methods for Measuring and Representing Macular Cone Density Using Flood-Illuminated Adaptive Optics.

    Science.gov (United States)

    Feng, Shu; Gale, Michael J; Fay, Jonathan D; Faridi, Ambar; Titus, Hope E; Garg, Anupam K; Michaels, Keith V; Erker, Laura R; Peters, Dawn; Smith, Travis B; Pennesi, Mark E

    2015-09-01

    To describe a standardized flood-illuminated adaptive optics (AO) imaging protocol suitable for the clinical setting and to assess sampling methods for measuring cone density. Cone density was calculated following three measurement protocols: 50 × 50-μm sampling window values every 0.5° along the horizontal and vertical meridians (fixed-interval method), the mean density of expanding 0.5°-wide arcuate areas in the nasal, temporal, superior, and inferior quadrants (arcuate mean method), and the peak cone density of a 50 × 50-μm sampling window within expanding arcuate areas near the meridian (peak density method). Repeated imaging was performed in nine subjects to determine intersession repeatability of cone density. Cone density montages could be created for 67 of the 74 subjects. Image quality was determined to be adequate for automated cone counting for 35 (52%) of the 67 subjects. We found that cone density varied with different sampling methods and regions tested. In the nasal and temporal quadrants, peak density most closely resembled histological data, whereas the arcuate mean and fixed-interval methods tended to underestimate the density compared with histological data. However, in the inferior and superior quadrants, arcuate mean and fixed-interval methods most closely matched histological data, whereas the peak density method overestimated cone density compared with histological data. Intersession repeatability testing showed that repeatability was greatest when sampling by arcuate mean and lowest when sampling by fixed interval. We show that different methods of sampling can significantly affect cone density measurements. Therefore, care must be taken when interpreting cone density results, even in a normal population.
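    The difference between the fixed-interval and peak-density protocols described above can be sketched with a toy cone mosaic. The window size matches the 50 × 50-μm windows in the abstract, but the coordinates, step size, rectangular (rather than arcuate) search region and synthetic mosaic are illustrative assumptions, not the study's data or code.

```python
import numpy as np

WIN = 50.0  # sampling window side length in micrometres

def window_density(cones, x0, y0, win=WIN):
    """Cone density (cones/mm^2) in a win x win micrometre window whose
    lower-left corner is at (x0, y0); cones is an (N, 2) array in um."""
    inside = ((cones[:, 0] >= x0) & (cones[:, 0] < x0 + win) &
              (cones[:, 1] >= y0) & (cones[:, 1] < y0 + win))
    return inside.sum() / (win * win) * 1e6   # per um^2 -> per mm^2

def peak_density(cones, region_x, region_y, step=10.0, win=WIN):
    """Peak density over windows stepped across a rectangular region,
    a simplified stand-in for searching within an arcuate area."""
    best = 0.0
    for x0 in np.arange(region_x[0], region_x[1] - win, step):
        for y0 in np.arange(region_y[0], region_y[1] - win, step):
            best = max(best, window_density(cones, x0, y0, win))
    return best

# Toy example: synthetic cone mosaic on a jittered grid (6 um spacing)
rng = np.random.default_rng(1)
grid = np.stack(np.meshgrid(np.arange(0, 300, 6.0), np.arange(0, 300, 6.0)), -1).reshape(-1, 2)
cones = grid + rng.uniform(-1.5, 1.5, grid.shape)
print(window_density(cones, 100, 100))          # fixed-interval style estimate
print(peak_density(cones, (0, 300), (0, 300)))  # peak-density style estimate
```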

  4. Focussing over the edge: adaptive subsurface laser fabrication up to the sample face.

    Science.gov (United States)

    Salter, P S; Booth, M J

    2012-08-27

    Direct laser writing is widely used for fabrication of subsurface, three dimensional structures in transparent media. However, the accessible volume is limited by distortion of the focussed beam at the sample edge. We determine the aberrated focal intensity distribution for light focused close to the edge of the substrate. Aberrations are modelled by dividing the pupil into two regions, each corresponding to light passing through the top and side facets. Aberration correction is demonstrated experimentally using a liquid crystal spatial light modulator for femtosecond microfabrication in fused silica. This technique allows controlled subsurface fabrication right up to the edge of the substrate. This can benefit a wide range of applications using direct laser writing, including the manufacture of waveguides and photonic crystals.

  5. Adaptation of the Thinking Styles Inventory (TSI) within a Romanian student sample

    Directory of Open Access Journals (Sweden)

    Maricutoiu, L.P.

    2014-07-01

    Full Text Available This paper presents the psychometric properties of the Thinking Styles Inventory (TSI) in a sample of 543 Romanian undergraduate students. The TSI is a self-report questionnaire developed for the assessment of 13 types of preferences for problem solving (or thinking styles). The internal reliability analyses indicated that the TSI scales have poor reliability (Cronbach's alphas between .26 and .72, with a median value of .62), and these values were slightly improved after we removed 10 items from the original questionnaire. Confirmatory factor analyses failed to identify an appropriate solution for describing the relationships between the TSI items, indicating poor structural validity of the questionnaire. Further analyses indicated that the sex of the respondent has small effects on the TSI scales. Also, results indicated that the TSI scales can be used effectively to predict the academic specialization of the respondent.
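    For reference, the internal-consistency statistic reported above (Cronbach's alpha) can be computed in a few lines. The sketch below uses simulated item scores, not the TSI data; it only shows the standard formula.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_var / total_var)

# Simulated 5-item scale answered by 200 respondents (items share a latent factor)
rng = np.random.default_rng(2)
latent = rng.normal(size=(200, 1))
responses = latent + rng.normal(scale=1.0, size=(200, 5))
print(round(cronbach_alpha(responses), 2))
```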

  6. On the importance of accounting for competing risks in pediatric brain cancer: II. Regression modeling and sample size.

    Science.gov (United States)

    Tai, Bee-Choo; Grundy, Richard; Machin, David

    2011-03-15

    To accurately model the cumulative need for radiotherapy in trials designed to delay or avoid irradiation among children with malignant brain tumor, it is crucial to account for competing events and evaluate how each contributes to the timing of irradiation. An appropriate choice of statistical model is also important for adequate determination of sample size. We describe the statistical modeling of competing events (A, radiotherapy after progression; B, no radiotherapy after progression; and C, elective radiotherapy) using proportional cause-specific and subdistribution hazard functions. The procedures of sample size estimation based on each method are outlined. These are illustrated by use of data comparing children with ependymoma and other malignant brain tumors. The results from these two approaches are compared. The cause-specific hazard analysis showed a reduction in hazards among infants with ependymoma for all event types, including Event A (adjusted cause-specific hazard ratio, 0.76; 95% confidence interval, 0.45-1.28). Conversely, the subdistribution hazard analysis suggested an increase in hazard for Event A (adjusted subdistribution hazard ratio, 1.35; 95% confidence interval, 0.80-2.30), but the reduction in hazards for Events B and C remained. Analysis based on subdistribution hazard requires a larger sample size than the cause-specific hazard approach. Notable differences in effect estimates and anticipated sample size were observed between methods when the main event showed a beneficial effect whereas the competing events showed an adverse effect on the cumulative incidence. The subdistribution hazard is the most appropriate for modeling treatment when its effects on both the main and competing events are of interest. Copyright © 2011 Elsevier Inc. All rights reserved.
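    The competing-risks setting described above can be made concrete with a small nonparametric sketch. The code estimates the cumulative incidence of one event type (e.g., radiotherapy after progression) in the presence of competing events with an Aalen-Johansen-style calculation; the toy data, event coding and one-at-a-time handling of ties are simplifying assumptions, and the cause-specific and subdistribution regression models from the abstract are not reproduced here.

```python
import numpy as np

def cumulative_incidence(time, event, cause):
    """Aalen-Johansen-style estimate of the cumulative incidence of `cause`.
    `event` codes 0 = censored, 1, 2, ... = competing event types.
    Ties are processed one observation at a time for brevity."""
    time, event = np.asarray(time, float), np.asarray(event, int)
    order = np.argsort(time)
    time, event = time[order], event[order]
    n = len(time)
    surv = 1.0            # all-cause survival just before the current time
    cif = 0.0
    out_t, out_cif = [0.0], [0.0]
    for i, t in enumerate(time):
        at_risk = n - i
        if event[i] == cause:
            cif += surv / at_risk          # increment CIF for the cause of interest
        if event[i] != 0:
            surv *= 1.0 - 1.0 / at_risk    # any event reduces overall survival
        out_t.append(t)
        out_cif.append(cif)
    return np.array(out_t), np.array(out_cif)

# Toy data: cause 1 = radiotherapy after progression, cause 2 = competing events
t = [2, 3, 3, 5, 6, 8, 9, 12]
e = [1, 2, 0, 1, 2, 1, 0, 2]
times, cif1 = cumulative_incidence(t, e, cause=1)
print(cif1[-1])   # proportion eventually experiencing cause 1, accounting for competition
```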

  7. Evaluation of single and two-stage adaptive sampling designs for estimation of density and abundance of freshwater mussels in a large river

    Science.gov (United States)

    Smith, D.R.; Rogala, J.T.; Gray, B.R.; Zigler, S.J.; Newton, T.J.

    2011-01-01

    Reliable estimates of abundance are needed to assess consequences of proposed habitat restoration and enhancement projects on freshwater mussels in the Upper Mississippi River (UMR). Although there is general guidance on sampling techniques for population assessment of freshwater mussels, the actual performance of sampling designs can depend critically on the population density and spatial distribution at the project site. To evaluate various sampling designs, we simulated sampling of populations, which varied in density and degree of spatial clustering. Because of logistics and costs of large river sampling and spatial clustering of freshwater mussels, we focused on adaptive and non-adaptive versions of single and two-stage sampling. The candidate designs performed similarly in terms of precision (CV) and probability of species detection for fixed sample size. Both CV and species detection were determined largely by density, spatial distribution and sample size. However, designs did differ in the rate that occupied quadrats were encountered. Occupied units had a higher probability of selection using adaptive designs than conventional designs. We used two measures of cost: sample size (i.e. number of quadrats) and distance travelled between the quadrats. Adaptive and two-stage designs tended to reduce distance between sampling units, and thus performed better when distance travelled was considered. Based on the comparisons, we provide general recommendations on the sampling designs for the freshwater mussels in the UMR, and presumably other large rivers.

  8. Low-discrepancy sampling of parametric surface using adaptive space-filling curves (SFC)

    Science.gov (United States)

    Hsu, Charles; Szu, Harold

    2014-05-01

    Space-Filling Curves (SFCs) are encountered in different fields of engineering and computer science, especially where it is important to linearize multidimensional data for effective and robust interpretation of the information. Examples of multidimensional data are matrices, images, tables, computational grids, and Electroencephalography (EEG) sensor data resulting from the discretization of partial differential equations (PDEs). Data operations like matrix multiplications, load/store operations and updating and partitioning of data sets can be simplified when we choose an efficient way of going through the data. In many applications SFCs present just this optimal manner of mapping multidimensional data onto a one dimensional sequence. In this report, we begin with an example of a space-filling curve and demonstrate how it can be used to find the most similarity using Fast Fourier transform (FFT) through a set of points. Next we give a general introduction to space-filling curves and discuss properties of them. Finally, we consider a discrete version of space-filling curves and present experimental results on discrete space-filling curves optimized for special tasks.
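    As a concrete illustration of linearizing multidimensional data with a space-filling curve, the sketch below uses the Z-order (Morton) curve, one common SFC; it is not necessarily the curve or the FFT-based similarity search used in the report.

```python
def morton_key(x, y, bits=16):
    """Interleave the bits of (x, y) to get a Z-order (Morton) index,
    a simple space-filling curve that keeps nearby cells nearby in 1-D."""
    key = 0
    for i in range(bits):
        key |= ((x >> i) & 1) << (2 * i)
        key |= ((y >> i) & 1) << (2 * i + 1)
    return key

# Sort 2-D grid cells along the curve so neighbouring cells tend to be
# stored close together in the resulting 1-D sequence.
cells = [(x, y) for x in range(4) for y in range(4)]
for cell in sorted(cells, key=lambda c: morton_key(*c)):
    print(cell, morton_key(*cell))
```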

  9. Importance of Mobile Genetic Elements and Conjugal Gene Transfer for Subsurface Microbial Community Adaptation to Biotransformation of Metals

    Energy Technology Data Exchange (ETDEWEB)

    Sorensen, Soren J.

    2005-06-01

    The overall goal of this project is to investigate the effect of mobile genetic elements and conjugal gene transfer on subsurface microbial community adaptation to mercury and chromium stress and biotransformation. Our studies focus on the interaction between the fate of these metals in the subsurface and the microbial community structure and activity.

  10. Importance Sampling Based Decision Trees for Security Assessment and the Corresponding Preventive Control Schemes: the Danish Case Study

    DEFF Research Database (Denmark)

    Liu, Leo; Rather, Zakir Hussain; Chen, Zhe

    2013-01-01

    Decision Trees (DT) based security assessment helps Power System Operators (PSO) by providing them with the most significant system attributes and guiding them in implementing the corresponding emergency control actions to prevent system insecurity and blackouts. DT is obtained offline from time-domain simulation and the process of data mining, which is then implemented online as guidelines for preventive control schemes. An algorithm named Classification and Regression Trees (CART) is used to train the DT, and the key to this approach lies in the accuracy of the DT. This paper proposes contingency oriented DT and adopts a methodology of importance sampling to maximize the information contained in the database so as to increase the accuracy of DT. Further, this paper also studies the effectiveness of DT by implementing its corresponding preventive control schemes. These approaches are tested on the detailed model ...
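    One practical way to combine an importance-sampled database with CART-style training is to carry the importance weights into the tree fit, so the tree is not biased by the deliberately skewed sampling of rare insecure cases. The sketch below shows this idea with scikit-learn on synthetic operating points; the features, weighting scheme and labels are invented for illustration and are not the Danish system database or the paper's algorithm.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(3)

# Synthetic "operating points": two attributes, label 1 = insecure.
X = rng.normal(size=(2000, 2))
y = (X[:, 0] + 0.5 * X[:, 1] + 0.3 * rng.normal(size=2000) > 1.2).astype(int)

# Suppose rare insecure cases were oversampled when building the database.
# Importance weights (target density / sampling density) restore the
# original operating-point distribution during training.
oversample = np.where(y == 1, 5.0, 1.0)             # how strongly each case was favoured
keep = rng.random(2000) < oversample / oversample.max()
X_db, y_db = X[keep], y[keep]
weights = 1.0 / oversample[keep]                     # importance weights for the kept cases

tree = DecisionTreeClassifier(max_depth=4, random_state=0)
tree.fit(X_db, y_db, sample_weight=weights)          # CART-style training with weights
print(tree.score(X, y))                              # evaluate on the unbiased population
```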

  11. Using adaptive sampling and triangular meshes for the processing and inversion of potential field data

    Science.gov (United States)

    Foks, Nathan Leon

    The interpretation of geophysical data plays an important role in the analysis of potential field data in resource exploration industries. Two categories of interpretation techniques are discussed in this thesis; boundary detection and geophysical inversion. Fault or boundary detection is a method to interpret the locations of subsurface boundaries from measured data, while inversion is a computationally intensive method that provides 3D information about subsurface structure. My research focuses on these two aspects of interpretation techniques. First, I develop a method to aid in the interpretation of faults and boundaries from magnetic data. These processes are traditionally carried out using raster grid and image processing techniques. Instead, I use unstructured meshes of triangular facets that can extract inferred boundaries using mesh edges. Next, to address the computational issues of geophysical inversion, I develop an approach to reduce the number of data in a data set. The approach selects the data points according to a user specified proxy for its signal content. The approach is performed in the data domain and requires no modification to existing inversion codes. This technique adds to the existing suite of compressive inversion algorithms. Finally, I develop an algorithm to invert gravity data for an interfacing surface using an unstructured mesh of triangular facets. A pertinent property of unstructured meshes is their flexibility at representing oblique, or arbitrarily oriented structures. This flexibility makes unstructured meshes an ideal candidate for geometry based interface inversions. The approaches I have developed provide a suite of algorithms geared towards large-scale interpretation of potential field data, by using an unstructured representation of both the data and model parameters.
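    The data-reduction idea described above (selecting data points according to a proxy for their signal content) can be sketched as follows. The proxy used here, absolute deviation from the median of the data, is a deliberately crude stand-in chosen for illustration; the thesis's actual proxy and the unstructured-mesh machinery are not reproduced.

```python
import numpy as np

def select_by_signal_proxy(xy, values, keep_fraction=0.2):
    """Keep the fraction of scattered observations with the largest proxy
    score.  Here the proxy is the absolute deviation from the overall
    median, a crude stand-in for 'signal content'."""
    proxy = np.abs(values - np.median(values))
    k = max(1, int(keep_fraction * len(values)))
    idx = np.argsort(proxy)[-k:]                   # indices of the k largest scores
    return xy[idx], values[idx]

# Toy gravity-like data: smooth background plus a compact anomaly
rng = np.random.default_rng(4)
xy = rng.uniform(0, 10, size=(5000, 2))
anomaly = 2.0 * np.exp(-np.sum((xy - 5.0) ** 2, axis=1) / 1.5)
data = anomaly + 0.05 * rng.standard_normal(5000)
xy_red, data_red = select_by_signal_proxy(xy, data, keep_fraction=0.1)
print(len(data), "->", len(data_red), "points retained for inversion")
```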

  12. Stable isotope probing reveals the importance of Comamonas and Pseudomonadaceae in RDX degradation in samples from a Navy detonation site.

    Science.gov (United States)

    Jayamani, Indumathy; Cupples, Alison M

    2015-07-01

    This study investigated the microorganisms involved in hexahydro-1,3,5-trinitro-1,3,5-triazine (RDX) degradation from a detonation area at a Navy base. Using Illumina sequencing, microbial communities were compared between the initial sample, samples following RDX degradation, and controls not amended with RDX to determine which phylotypes increased in abundance following RDX degradation. The effect of glucose on these communities was also examined. In addition, stable isotope probing (SIP) using labeled ((13)C3, (15)N3-ring) RDX was performed. Illumina sequencing revealed that several phylotypes were more abundant following RDX degradation compared to the initial soil and the no-RDX controls. For the glucose-amended samples, this trend was strong for an unclassified Pseudomonadaceae phylotype and for Comamonas. Without glucose, Acinetobacter exhibited the greatest increase following RDX degradation compared to the initial soil and no-RDX controls. Rhodococcus, a known RDX degrader, also increased in abundance following RDX degradation. For the SIP study, unclassified Pseudomonadaceae was the most abundant phylotype in the heavy fractions in both the presence and absence of glucose. In the glucose-amended heavy fractions, the 16S ribosomal RNA (rRNA) genes of Comamonas and Anaeromyxobacter were also present. Without glucose, the heavy fractions also contained the 16S rRNA genes of Azohydromonas and Rhodococcus. However, all four phylotypes were present at a much lower level compared to unclassified Pseudomonadaceae. Overall, these data indicate that unclassified Pseudomonadaceae was primarily responsible for label uptake in both treatments. This study indicates, for the first time, the importance of Comamonas for RDX removal.

  13. The Importance of Sample Return in Establishing Chemical Evidence for Life on Mars or Other Solar System Bodies

    Science.gov (United States)

    Glavin, D. P.; Conrad, P.; Dworkin, J. P.; Eigenbrode, J.; Mahaffy, P. R.

    2011-01-01

    The search for evidence of life on Mars and elsewhere will continue to be one of the primary goals of NASA's robotic exploration program over the next decade. NASA and ESA are currently planning a series of robotic missions to Mars with the goal of understanding its climate, resources, and potential for harboring past or present life. One key goal will be the search for chemical biomarkers including complex organic compounds important in life on Earth. These include amino acids, the monomer building blocks of proteins and enzymes, nucleobases and sugars which form the backbone of DNA and RNA, and lipids, the structural components of cell membranes. Many of these organic compounds can also be formed abiotically as demonstrated by their prevalence in carbonaceous meteorites [1], though their molecular characteristics may distinguish a biological source [2]. It is possible that in situ instruments may reveal such characteristics; however, return of the right sample (i.e. one with biosignatures or having a high probability of biosignatures) to Earth would allow for more intensive laboratory studies using a broad array of powerful instrumentation for bulk characterization, molecular detection, isotopic and enantiomeric compositions, and spatially resolved chemistry that may be required for confirmation of extant or extinct Martian life. Here we will discuss the current analytical capabilities and strategies for the detection of organics on the Mars Science Laboratory (MSL) using the Sample Analysis at Mars (SAM) instrument suite and how sample return missions from Mars and other targets of astrobiological interest will help advance our understanding of chemical biosignatures in the solar system.

  14. Inferring the demographic history from DNA sequences: An importance sampling approach based on non-homogeneous processes.

    Science.gov (United States)

    Ait Kaci Azzou, S; Larribe, F; Froda, S

    2016-10-01

    In Ait Kaci Azzou et al. (2015) we introduced an Importance Sampling (IS) approach for estimating the demographic history of a sample of DNA sequences, the skywis plot. More precisely, we proposed a new nonparametric estimate of a population size that changes over time. We showed on simulated data that the skywis plot can work well in typical situations where the effective population size does not undergo very steep changes. In this paper, we introduce an iterative procedure which extends the previous method and gives good estimates under such rapid variations. In the iterative calibrated skywis plot we approximate the effective population size by a piecewise constant function, whose values are re-estimated at each step. These piecewise constant functions are used to generate the waiting times of non homogeneous Poisson processes related to a coalescent process with mutation under a variable population size model. Moreover, the present IS procedure is based on a modified version of the Stephens and Donnelly (2000) proposal distribution. Finally, we apply the iterative calibrated skywis plot method to a simulated data set from a rapidly expanding exponential model, and we show that the method based on this new IS strategy correctly reconstructs the demographic history.
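    The core building block described above, waiting times of a non-homogeneous process under a piecewise-constant population size, can be sketched by time-rescaling a unit-rate exponential draw. The code below generates coalescent waiting times only; mutation, the IS proposal distribution and the re-estimation loop of the calibrated skywis plot are not included, and the epoch boundaries and sizes are arbitrary examples.

```python
import numpy as np

def coalescent_waiting_time(k, t0, breakpoints, sizes, rng):
    """Waiting time until the next coalescence among k lineages, starting at
    time t0 (backwards in time), when the effective population size is the
    piecewise-constant function defined by `breakpoints` (epoch start times,
    breakpoints[0] == 0) and `sizes`.

    The non-homogeneous rate lambda(t) = C(k,2) / N(t) is inverted by
    time-rescaling a unit-rate exponential draw."""
    target = rng.exponential(1.0)            # Exp(1) in rescaled time
    pair_rate = k * (k - 1) / 2.0
    t, acc = t0, 0.0
    epoch = np.searchsorted(breakpoints, t, side="right") - 1
    while True:
        n_e = sizes[epoch]
        end = breakpoints[epoch + 1] if epoch + 1 < len(breakpoints) else np.inf
        seg = (end - t) * pair_rate / n_e    # rescaled time available in this epoch
        if acc + seg >= target:
            t += (target - acc) * n_e / pair_rate
            return t - t0
        acc += seg
        t = end
        epoch += 1

# Two epochs: N = 1000 for t < 0.5, then N = 100 further back in time
rng = np.random.default_rng(5)
waits = [coalescent_waiting_time(5, 0.0, np.array([0.0, 0.5]), [1000.0, 100.0], rng)
         for _ in range(3)]
print(waits)
```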

  15. Predicting the impacts of climate change on animal distributions: the importance of local adaptation and species' traits

    Energy Technology Data Exchange (ETDEWEB)

    HELLMANN, J. J.; LOBO, N. F.

    2011-12-20

    The geographic range limits of many species are strongly affected by climate and are expected to change under global warming. For species that are able to track changing climate over broad geographic areas, we expect to see shifts in species distributions toward the poles and away from the equator. A number of ecological and evolutionary factors, however, could restrict this shifting or redistribution under climate change. These factors include restricted habitat availability, restricted capacity for or barriers to movement, or reduced abundance of colonists due to the perturbation effect of climate change. This research project examined the last of these constraints - that climate change could perturb local conditions to which populations are adapted, reducing the likelihood that a species will shift its distribution by diminishing the number of potential colonists. In the most extreme cases, species ranges could collapse over a broad geographic area with no poleward migration and an increased risk of species extinction. Changes in individual species ranges are the processes that drive larger phenomena such as changes in land cover, ecosystem type, and even changes in carbon cycling. For example, consider the poleward range shift and population outbreaks of the mountain pine beetle that has decimated millions of acres of Douglas fir trees in the western US and Canada. Standing dead trees cause forest fires and release vast quantities of carbon to the atmosphere. The beetle likely shifted its range because it is not locally adapted across its range, and it appears to be limited by winter low temperatures that have steadily increased in the last decades. To understand range and abundance changes like those of the pine beetle, we must reveal the extent of adaptive variation across species ranges - and the physiological basis of that adaptation - to know if other species will change as readily as the pine beetle. Ecologists tend to assume that range shifts are the dominant

  16. Using implicit association tests in age-heterogeneous samples: The importance of cognitive abilities and quad model processes.

    Science.gov (United States)

    Wrzus, Cornelia; Egloff, Boris; Riediger, Michaela

    2017-08-01

    Implicit association tests (IATs) are increasingly used to indirectly assess people's traits, attitudes, or other characteristics. In addition to measuring traits or attitudes, IAT scores also reflect differences in cognitive abilities because scores are based on reaction times (RTs) and errors. As cognitive abilities change with age, questions arise concerning the usage and interpretation of IATs for people of different ages. To address these questions, the current study examined how cognitive abilities and cognitive processes (i.e., quad model parameters) contribute to IAT results in a large age-heterogeneous sample. Participants (N = 549; 51% female) in an age-stratified sample (range = 12-88 years) completed different IATs and 2 tasks to assess cognitive processing speed and verbal ability. From the IAT data, D2-scores were computed based on RTs, and quad process parameters (activation of associations, overcoming bias, detection, guessing) were estimated from individual error rates. IAT scores and all quad processes except guessing varied substantially with age. Quad processes AC and D predicted D2-scores of the content-specific IAT. Importantly, the effects of cognitive abilities and quad processes on IAT scores were not significantly moderated by participants' age. These findings suggest that IATs seem suitable for age-heterogeneous studies from adolescence to old age when IATs are constructed and analyzed appropriately, for example with D-scores and process parameters. We offer further insight into how D-scoring controls for method effects in IATs and what IAT scores capture in addition to implicit representations of characteristics. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
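
    For readers unfamiliar with RT-based IAT scoring, the general idea behind D-type scores is to divide the difference in mean latencies between the two critical block types by the standard deviation of latencies pooled over both. A minimal sketch of that generic computation follows (Python; the published D2 algorithm adds trial-exclusion and error-penalty rules not shown here, and the latencies are invented):

      import statistics

      def iat_d_score(compatible_rts, incompatible_rts):
          """Generic D-type IAT score: difference of block mean latencies divided
          by the standard deviation of latencies pooled over both blocks."""
          pooled_sd = statistics.stdev(list(compatible_rts) + list(incompatible_rts))
          return (statistics.mean(incompatible_rts)
                  - statistics.mean(compatible_rts)) / pooled_sd

      # Invented latencies in milliseconds for one participant.
      print(iat_d_score([612, 655, 701, 640], [820, 760, 905, 840]))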

  17. [Study of the surface tear tension and evaluation of its importance for the retinal physiology and pathology in contact correction and in adaptation to soft contact lenses].

    Science.gov (United States)

    Cherepnin, A I; Smoliakova, G P; Sorokin, E L

    2003-01-01

    The surface lachrymal-fluid (LF) tension was investigated by teardrop dissection in 115 patients with myopia before they were prescribed soft contact lenses (SCL). Such tension was found to be of clinical importance for the development of SCL adaptation disorders. A longer adaptation period in patients with myopia was associated with a low surface LF tension. A high surface LF tension concurrent with the teardrop dissection mode of the destruction type was typical of the pathological nature of SCL adaptation (12.1% of patients). These data are needed for timely detection of the risk of dysadaptation disorders and corneal complications before SCL prescription, so that pathogenetically substantiated medication can be undertaken to prevent such complications.

  18. Defining "Normophilic" and "Paraphilic" Sexual Fantasies in a Population-Based Sample: On the Importance of Considering Subgroups.

    Science.gov (United States)

    Joyal, Christian C

    2015-12-01

    interests, and the crucial difference between SF and sexual interest is underlined. Joyal CC. Defining "normophilic" and "paraphilic" sexual fantasies in a population-based sample: On the importance of considering subgroups. Sex Med 2015;3:321-330.

  19. RESULTS FROM EPA FUNDED RESEARCH PROGRAMS ON THE IMPORTANCE OF PURGE VOLUME, SAMPLE VOLUME, SAMPLE FLOW RATE AND TEMPORAL VARIATIONS ON SOIL GAS CONCENTRATIONS

    Science.gov (United States)

    Two research studies funded and overseen by EPA have been conducted since October 2006 on soil gas sampling methods and variations in shallow soil gas concentrations with the purpose of improving our understanding of soil gas methods and data for vapor intrusion applications. Al...

  20. Comparative genomics reveals high biological diversity and specific adaptations in the industrially and medically important fungal genus Aspergillus

    DEFF Research Database (Denmark)

    de Vries, Ronald P.; Riley, Robert; Wiebenga, Ad

    2017-01-01

    Background:  The fungal genus Aspergillus is of critical importance to humankind. Species include those with industrial applications, important pathogens of humans, animals and crops, a source of potent carcinogenic contaminants of food, and an important genetic model. The genome sequences of eight...... here, allows for the first time a genus-wide view of the biological diversity of the aspergilli and in many, but not all, cases linked genome differences to phenotype. Insights gained could be exploited for biotechnological and medical applications of fungi....

  1. Whole genome amplification for CGH analysis: Linker-adapter PCR as the method of choice for difficult and limited samples.

    Science.gov (United States)

    Pirker, Christine; Raidl, Maria; Steiner, Elisabeth; Elbling, Leonilla; Holzmann, Klaus; Spiegl-Kreinecker, Sabine; Aubele, Michaela; Grasl-Kraupp, Bettina; Marosi, Christine; Micksche, Michael; Berger, Walter

    2004-09-01

    Comparative genomic hybridization (CGH) is a powerful method to investigate chromosomal imbalances in tumor cells. However, DNA quantity and quality can be limiting factors for successful CGH analysis. The aim of this study was to investigate the applicability of degenerate oligonucleotide-primed PCR (DOP-PCR) and a recently developed linker-adapter-mediated PCR (LA-PCR) for whole genome amplification for use in CGH, especially for difficult source material. We comparatively analyzed DNA of variable quality derived from different cell/tissue types. Additionally, dilution experiments down to the DNA content of a single cell were performed. FISH and/or classical cytogenetic analyses were used as controls. In the case of high quality DNA samples, both methods were equally suitable for CGH. When analyzing very small amounts of these DNA samples (equivalent to one or a few human diploid cells), DOP-PCR-CGH, but not LA-PCR-CGH, frequently produced false-positive signals (e.g., gains in 1p and 16p, and losses in chromosome 4q). In the case of formalin-fixed paraffin-embedded tissues, success rates by LA-PCR-CGH were significantly higher than with DOP-PCR-CGH. DNA of minor quality frequently could be analyzed correctly by LA-PCR-CGH, but was prone to give false-positive and/or false-negative results by DOP-PCR-CGH. LA-PCR is superior to DOP-PCR for amplification of DNA for CGH analysis, especially in the case of very limited or partly degraded source material. Copyright 2004 Wiley-Liss, Inc.

  2. Methodological Adaptations for Investigating the Perceptions of Language-Impaired Adolescents Regarding the Relative Importance of Selected Communication Skills

    Science.gov (United States)

    Reed, Vicki A.; Brammall, Helen

    2006-01-01

    This article describes the systematic and detailed processes undertaken to modify a research methodology for use with language-impaired adolescents. The original methodology had been used previously with normally achieving adolescents and speech pathologists to obtain their opinions about the relative importance of selected communication skills…

  4. "Best Practices in Using Large, Complex Samples: The Importance of Using Appropriate Weights and Design Effect Compensation"

    Directory of Open Access Journals (Sweden)

    Jason W. Osborne

    2011-09-01

    Large surveys often use probability sampling in order to obtain representative samples, and these data sets are valuable tools for researchers in all areas of science. Yet many researchers are not formally prepared to appropriately utilize these resources. Indeed, users of one popular dataset were generally found not to have modeled the analyses to take account of the complex sample (Johnson & Elliott, 1998), even when publishing in highly-regarded journals. It is well known that failure to appropriately model the complex sample can substantially bias the results of the analysis. Examples presented in this paper highlight the risk of error of inference and mis-estimation of parameters from failure to analyze these data sets appropriately.
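
    The central point is easy to reproduce numerically: ignoring probability-of-selection weights lets oversampled units dominate the estimate. A minimal sketch follows (Python; the values and weights are invented, and real analyses would additionally adjust standard errors for the design effect):

      # Weighted vs. unweighted estimation with inverse-probability sampling weights.
      values  = [10.0, 12.0, 15.0, 30.0, 35.0]
      weights = [5.0, 5.0, 5.0, 1.0, 1.0]     # inverse selection probabilities

      unweighted_mean = sum(values) / len(values)                                 # 20.4
      weighted_mean = sum(w * y for w, y in zip(weights, values)) / sum(weights)  # ~14.7

      # The unweighted mean over-represents the two oversampled high-value units;
      # the weighted mean is the design-consistent estimate for the population.
      print(unweighted_mean, weighted_mean)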

  5. Diagnosing Intellectual Disability in a Forensic Sample: Gender and Age Effects on the Relationship between Cognitive and Adaptive Functioning

    Science.gov (United States)

    Hayes, Susan C.

    2005-01-01

    Background: The relationship between adaptive behaviour and cognitive functioning in offenders with intellectual disabilities is not well researched. This study aims to examine gender and age effects on the relationship between these two areas of functioning. Method: The "Vineland Adaptive Behavior Scales" (VABS) and the "Kaufman…

  6. Importance of Mobile Genetic Elements and Conjugal Gene Transfer for Subsurface Microbial Community Adaptation to Biotransformation of Metals

    Energy Technology Data Exchange (ETDEWEB)

    Sorensen, Soren J.

    2004-06-01

    Soils used in the present DOE project were obtained from the Field Research Center (FRC) through correspondence with FRC Manager David Watson. We obtained a total of six soils sampled at different distances from the surface: (A) Non-contaminated surface soil from Hinds Creek Floodplain (0 mbs (meter below surface)). (B) Mercury-contaminated surface soil from Lower East Fork Poplar Creek Floodplain (0 mbs). (C) Mercury-contaminated subsurface soil from Lower East Fork Poplar Creek Floodplain (0.5 mbs). (D) Mercury-contaminated subsurface soil from Lower East Fork Poplar Creek Floodplain (1.0 mbs). (E) Non-contaminated surface soil from Ish Creek Floodplain (0 mbs). (F) Non-contaminated surface soil from Ish Creek Floodplain (0.5 mbs).

  7. 76 FR 65165 - Importation of Plants for Planting; Risk-Based Sampling and Inspection Approach and Propagative...

    Science.gov (United States)

    2011-10-20

    ... Service, USDA. ACTION: Notice. SUMMARY: We are advising the public of our decision to implement a risk... will evaluate the risk associated with combinations of taxa of plants for planting and countries from... planting from a certain country is determined to present a medium or low risk, it will be sampled at the...

  8. The importance of quality control in validating concentrations of contaminants of emerging concern in source and treated drinking water samples.

    Science.gov (United States)

    A national-scale survey of 247 contaminants of emerging concern (CECs), including organic and inorganic chemical compounds, and microbial contaminants, was conducted in source and treated drinking water samples from 25 treatment plants across the United States. Multiple methods w...

  9. Bottom–up protein identifications from microliter quantities of individual human tear samples. Important steps towards clinical relevance.

    Directory of Open Access Journals (Sweden)

    Peter Raus

    2015-12-01

    With 375 confidently identified proteins in the healthy adult tear, the results obtained are comprehensive and largely in agreement with previously published observations on pooled samples from multiple patients. We conclude that, to a limited extent, bottom–up tear protein identifications from individual patients may have clinical relevance.

  10. A Keck Adaptive Optics Survey of a Representative Sample of Gravitationally-Lensed Star-Forming Galaxies: High Spatial Resolution Studies of Kinematics and Metallicity Gradients

    CERN Document Server

    Leethochawalit, Nicha; Ellis, Richard S; Stark, Daniel P; Richard, Johan; Zitrin, Adi; Auger, Matthew

    2015-01-01

    We discuss spatially resolved emission line spectroscopy secured for a total sample of 15 gravitationally lensed star-forming galaxies at a mean redshift of $z\\simeq2$ based on Keck laser-assisted adaptive optics observations undertaken with the recently-improved OSIRIS integral field unit (IFU) spectrograph. By exploiting gravitationally lensed sources drawn primarily from the CASSOWARY survey, we sample these sub-L$^{\\ast}$ galaxies with source-plane resolutions of a few hundred parsecs ensuring well-sampled 2-D velocity data and resolved variations in the gas-phase metallicity. Such high spatial resolution data offers a critical check on the structural properties of larger samples derived with coarser sampling using multiple-IFU instruments. We demonstrate how serious errors of interpretation can only be revealed through better sampling. Although we include four sources from our earlier work, the present study provides a more representative sample unbiased with respect to emission line strength. Contrary t...

  11. Liver kinetics of glucose analogs measured in pigs by PET: importance of dual-input blood sampling

    DEFF Research Database (Denmark)

    Munk, O L; Bass, L; Roelsgaard, K;

    2001-01-01

    Metabolic processes studied by PET are quantified traditionally using compartmental models, which relate the time course of the tracer concentration in tissue to that in arterial blood. For liver studies, the use of arterial input may, however, cause systematic errors to the estimated kinetic....... Hepatic arterial and portal venous blood samples and flows were measured during the scan. The dual-input function was calculated as the flow-weighted input. RESULTS: For both MG and FDG, the compartmental analysis using arterial input led to systematic underestimation of the rate constants for rapid blood...... of conventional arterial sampling underestimated these parameters compared with independent measurements of hepatic flow and hepatic blood volume. In contrast, the linear Gjedde-Patlak analysis, being less informative but more robust, gave similar parameter estimates (K, V) with both input functions...

  12. Statistical analysis of hydrological response in urbanising catchments based on adaptive sampling using inter-amount times

    Science.gov (United States)

    ten Veldhuis, Marie-Claire; Schleiss, Marc

    2017-04-01

    Urban catchments are typically characterised by a more flashy nature of the hydrological response compared to natural catchments. Predicting flow changes associated with urbanisation is not straightforward, as they are influenced by interactions between impervious cover, basin size, drainage connectivity and stormwater management infrastructure. In this study, we present an alternative approach to statistical analysis of hydrological response variability and basin flashiness, based on the distribution of inter-amount times. We analyse inter-amount time distributions of high-resolution streamflow time series for 17 (semi-)urbanised basins in North Carolina, USA, ranging from 13 to 238 km2 in size. We show that in the inter-amount-time framework, sampling frequency is tuned to the local variability of the flow pattern, resulting in a different representation and weighting of high and low flow periods in the statistical distribution. This leads to important differences in the way the distribution quantiles, mean, coefficient of variation and skewness vary across scales and results in lower mean intermittency and improved scaling. Moreover, we show that inter-amount-time distributions can be used to detect regulation effects on flow patterns, identify critical sampling scales and characterise flashiness of hydrological response. The possibility to use both the classical approach and the inter-amount-time framework to identify minimum observable scales and analyse flow data opens up interesting areas for future research.
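
    The inter-amount-time idea can be stated compactly: instead of sampling the flow at fixed time intervals, record the time needed to accumulate each successive fixed amount of flow. A minimal sketch of that transformation for a regularly sampled discharge series follows (Python; the series, time step and accumulation amount are invented, and within-step interpolation is ignored):

      def inter_amount_times(flows, dt, amount):
          """Times needed to accumulate each successive fixed 'amount' of flow from
          a regularly sampled flow series (flow units per time step of length dt).
          Sub-step timing is not interpolated, so several amounts completed within
          one step share that step's end time."""
          times, acc, t, t_last = [], 0.0, 0.0, 0.0
          for q in flows:
              acc += q * dt
              t += dt
              while acc >= amount:             # one or more amounts completed
                  times.append(t - t_last)
                  t_last = t
                  acc -= amount
          return times

      # Invented flashy record: a long low-flow spell followed by a short peak.
      flows = [0.2] * 20 + [5.0] * 3 + [0.3] * 10
      print(inter_amount_times(flows, dt=1.0, amount=2.0))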

  13. Fast simulated annealing and adaptive Monte Carlo sampling based parameter optimization for dense optical-flow deformable image registration of 4DCT lung anatomy

    Science.gov (United States)

    Dou, Tai H.; Min, Yugang; Neylon, John; Thomas, David; Kupelian, Patrick; Santhanam, Anand P.

    2016-03-01

    Deformable image registration (DIR) is an important step in radiotherapy treatment planning. An optimal input registration parameter set is critical to achieve the best registration performance with the specific algorithm. Methods: In this paper, we investigated a parameter optimization strategy for Optical-flow based DIR of the 4DCT lung anatomy. A novel fast simulated annealing with adaptive Monte Carlo sampling algorithm (FSA-AMC) was investigated for solving the complex non-convex parameter optimization problem. The metric for registration error for a given parameter set was computed using landmark-based mean target registration error (mTRE) between a given volumetric image pair. To reduce the computational time in the parameter optimization process, a GPU based 3D dense optical-flow algorithm was employed for registering the lung volumes. Numerical analyses on the parameter optimization for the DIR were performed using 4DCT datasets generated with breathing motion models and open-source 4DCT datasets. Results showed that the proposed method efficiently estimated the optimum parameters for optical-flow and closely matched the best registration parameters obtained using an exhaustive parameter search method.
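
    As a rough illustration of the optimization component, the sketch below runs a generic simulated-annealing loop over a two-dimensional parameter vector, with a toy quadratic cost standing in for the landmark-based mTRE; it is not the FSA-AMC algorithm itself, and the step size, cooling rate and cost function are invented:

      import math
      import random

      def simulated_annealing(cost, x0, step, t0=1.0, cooling=0.95, iters=200, rng=random):
          """Generic simulated annealing: propose a Gaussian perturbation, always
          accept improvements, accept worse moves with probability
          exp(-delta / temperature), and cool the temperature geometrically."""
          x, fx = list(x0), cost(x0)
          best, fbest = list(x0), fx
          temp = t0
          for _ in range(iters):
              cand = [xi + rng.gauss(0.0, step) for xi in x]
              fc = cost(cand)
              if fc < fx or rng.random() < math.exp(-(fc - fx) / temp):
                  x, fx = cand, fc
                  if fx < fbest:
                      best, fbest = list(x), fx
              temp *= cooling
          return best, fbest

      # Toy stand-in for a registration-error surface (true optimum at (2, -1)).
      cost = lambda p: (p[0] - 2.0) ** 2 + (p[1] + 1.0) ** 2
      random.seed(0)
      print(simulated_annealing(cost, x0=[0.0, 0.0], step=0.5))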

  14. Perceptions of Australian marine protected area managers regarding the role, importance, and achievability of adaptation for managing the risks of climate change

    Directory of Open Access Journals (Sweden)

    Christopher Cvitanovic

    2014-12-01

    The rapid development of adaptation as a mainstream strategy for managing the risks of climate change has led to the emergence of a broad range of adaptation policies and management strategies globally. However, the success of such policies or management interventions depends on the effective integration of new scientific research into the decision-making process. Ineffective communication between scientists and environmental decision makers represents one of the key barriers limiting the integration of science into the decision-making process in many areas of natural resource management. This can be overcome by understanding the perceptions of end users, so as to identify knowledge gaps and develop improved and targeted strategies for communication and engagement. We assessed what one group of environmental decision makers, Australian marine protected area (MPA) managers, viewed as the major risks associated with climate change, and their perceptions regarding the role, importance, and achievability of adaptation for managing these risks. We also assessed what these managers perceived as the role of science in managing the risks from climate change, and identified the factors that increased their trust in scientific information. We did so by quantitatively surveying 30 MPA managers across 3 Australian management agencies. We found that although MPA managers have a very strong awareness of the range and severity of risks posed by climate change, their understanding of adaptation as an option for managing these risks is less comprehensive. We also found that although MPA managers view science as a critical source of information for informing the decision-making process, it should be considered in context with other knowledge types such as community and cultural knowledge, and be impartial, evidence based, and pragmatic in outlining policy and management recommendations that are realistically achievable.

  15. 40 CFR 80.583 - What alternative sampling and testing requirements apply to importers who transport motor vehicle...

    Science.gov (United States)

    2010-07-01

    ... requirements apply to importers who transport motor vehicle diesel fuel, NRLM diesel fuel, or ECA marine fuel by truck or rail car? 80.583 Section 80.583 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) REGULATION OF FUELS AND FUEL ADDITIVES Motor Vehicle Diesel...

  16. Liver kinetics of glucose analogs measured in pigs by PET: importance of dual-input blood sampling

    DEFF Research Database (Denmark)

    Munk, O L; Bass, L; Roelsgaard, K

    2001-01-01

    parameters, because of ignorance of the dual blood supply from the hepatic artery and the portal vein to the liver. METHODS: Six pigs underwent PET after [15O]carbon monoxide inhalation, 3-O-[11C]methylglucose (MG) injection, and [18F]FDG injection. For the glucose scans, PET data were acquired for 90 min....... Hepatic arterial and portal venous blood samples and flows were measured during the scan. The dual-input function was calculated as the flow-weighted input. RESULTS: For both MG and FDG, the compartmental analysis using arterial input led to systematic underestimation of the rate constants for rapid blood...

  17. Integration of morphological data sets for phylogenetic analysis of Amniota: the importance of integumentary characters and increased taxonomic sampling.

    Science.gov (United States)

    Hill, Robert V

    2005-08-01

    Several mutually exclusive hypotheses have been advanced to explain the phylogenetic position of turtles among amniotes. Traditional morphology-based analyses place turtles among extinct anapsids (reptiles with a solid skull roof), whereas more recent studies of both morphological and molecular data support an origin of turtles from within Diapsida (reptiles with a doubly fenestrated skull roof). Evaluation of these conflicting hypotheses has been hampered by nonoverlapping taxonomic samples and the exclusion of significant taxa from published analyses. Furthermore, although data from soft tissues and anatomical systems such as the integument may be particularly relevant to this problem, they are often excluded from large-scale analyses of morphological systematics. Here, conflicting hypotheses of turtle relationships are tested by (1) combining published data into a supermatrix of morphological characters to address issues of character conflict and missing data; (2) increasing taxonomic sampling by more than doubling the number of operational taxonomic units to test internal relationships within suprageneric ingroup taxa; and (3) increasing character sampling by approximately 25% by adding new data on the osteology and histology of the integument, an anatomical system that has been historically underrepresented in morphological systematics. The morphological data set assembled here represents the largest yet compiled for Amniota. Reevaluation of character data from prior studies of amniote phylogeny favors the hypothesis that turtles indeed have diapsid affinities. Addition of new ingroup taxa alone leads to a decrease in overall phylogenetic resolution, indicating that existing characters used for amniote phylogeny are insufficient to explain the evolution of more highly nested taxa. Incorporation of new data from the soft and osseous components of the integument, however, helps resolve relationships among both basal and highly nested amniote taxa. Analysis of a

  18. Cold shock genes cspA and cspB from Caulobacter crescentus are posttranscriptionally regulated and important for cold adaptation.

    Science.gov (United States)

    Mazzon, Ricardo R; Lang, Elza A S; Silva, Carolina A P T; Marques, Marilis V

    2012-12-01

    Cold shock proteins (CSPs) are nucleic acid binding chaperones, first described as being induced to solve the problem of mRNA stabilization after temperature downshift. Caulobacter crescentus has four CSPs: CspA and CspB, which are cold induced, and CspC and CspD, which are induced only in stationary phase. In this work we have determined that the synthesis of both CspA and CspB reaches the maximum levels early in the acclimation phase. The deletion of cspA causes a decrease in growth at low temperature, whereas the strain with a deletion of cspB has a very subtle and transient cold-related growth phenotype. The cspA cspB double mutant has a slightly more severe phenotype than that of the cspA mutant, suggesting that although CspA may be more important to cold adaptation than CspB, both proteins have a role in this process. Gene expression analyses were carried out using cspA and cspB regulatory fusions to the lacZ reporter gene and showed that both genes are regulated at the transcriptional and posttranscriptional levels. Deletion mapping of the long 5'-untranslated region (5'-UTR) of each gene identified a common region important for cold induction, probably via translation enhancement. In contrast to what was reported for other bacteria, these cold shock genes have no regulatory regions downstream from ATG that are important for cold induction. This work shows that the importance of CspA and CspB to C. crescentus cold adaptation, mechanisms of regulation, and pattern of expression during the acclimation phase apparently differs in many aspects from what has been described so far for other bacteria.

  19. A "sample-and-hold" pulse-counting integrator as a mechanism for graded memory underlying sensorimotor adaptation.

    Science.gov (United States)

    Oestreich, Jörg; Dembrow, Nikolai C; George, Andrew A; Zakon, Harold H

    2006-02-16

    The mechanisms behind the induction of cellular correlates of memory by sensory input and their contribution to meaningful behavioral changes are largely unknown. We previously reported a graded memory in the form of sensorimotor adaptation in the electromotor output of electric fish. Here we show that the mechanism for this adaptation is a synaptically induced long-lasting shift in intrinsic neuronal excitability. This mechanism rapidly integrates hundreds of spikes in a second, or gradually integrates the same number of spikes delivered over tens of minutes. Thus, this mechanism appears immune to frequency-dependent fluctuations in input and operates as a simple pulse counter over a wide range of time scales, enabling it to transduce graded sensory information into a graded memory and a corresponding change in the behavioral output. This adaptation is based on an NMDA receptor-mediated change in intrinsic excitability of the postsynaptic neurons involving the Ca2+-dependent activation of TRP channels.

  20. Climate impacts on European agriculture and water management in the context of adaptation and mitigation--the importance of an integrated approach.

    Science.gov (United States)

    Falloon, Pete; Betts, Richard

    2010-11-01

    We review and qualitatively assess the importance of interactions and feedbacks in assessing climate change impacts on water and agriculture in Europe. We focus particularly on the impact of future hydrological changes on agricultural greenhouse gas (GHG) mitigation and adaptation options. Future projected trends in European agriculture include northward movement of crop suitability zones and increasing crop productivity in Northern Europe, but declining productivity and suitability in Southern Europe. This may be accompanied by a widening of water resource differences between the North and South, and an increase in extreme rainfall events and droughts. Changes in future hydrology and water management practices will influence agricultural adaptation measures and alter the effectiveness of agricultural mitigation strategies. These interactions are often highly complex and influenced by a number of factors which are themselves influenced by climate. Mainly positive impacts may be anticipated for Northern Europe, where agricultural adaptation may be shaped by reduced vulnerability of production, increased water supply and reduced water demand. However, increasing flood hazards may present challenges for agriculture, and summer irrigation shortages may result from earlier spring runoff peaks in some regions. Conversely, the need for effective adaptation will be greatest in Southern Europe as a result of increased production vulnerability, reduced water supply and increased demands for irrigation. Increasing flood and drought risks will further contribute to the need for robust management practices. The impacts of future hydrological changes on agricultural mitigation in Europe will depend on the balance between changes in productivity and rates of decomposition and GHG emission, both of which depend on climatic, land and management factors. Small increases in European soil organic carbon (SOC) stocks per unit land area are anticipated considering changes in climate

  1. Testing Set-Point Theory in a Swiss National Sample: Reaction and Adaptation to Major Life Events.

    Science.gov (United States)

    Anusic, Ivana; Yap, Stevie C Y; Lucas, Richard E

    2014-12-01

    Set-point theory posits that individuals react to the experience of major life events, but quickly adapt back to pre-event baseline levels of subjective well-being in the years following the event. A large, nationally representative panel study of Swiss households was used to examine set-point theory by investigating the extent of adaptation following the experience of marriage, childbirth, widowhood, unemployment, and disability. Our results demonstrate that major life events are associated with marked change in life satisfaction and, for some events (e.g., marriage, disability), these changes are relatively long lasting even when accounting for normative, age related change.

  2. Sleepiness and Motor Vehicle Crashes in a Representative Sample of Portuguese Drivers: The Importance of Epidemiological Representative Surveys.

    Science.gov (United States)

    Gonçalves, M; Peralta, A R; Monteiro Ferreira, J; Guilleminault, Christian

    2015-01-01

    Sleepiness is considered to be a leading cause of crashes. Despite the huge amount of information collected in questionnaire studies, only some are based on representative samples of the population. Specifics of the populations studied hinder the generalization of these previous findings. For the Portuguese population, data from sleep-related car crashes/near misses and sleepiness while driving are missing. The objective of this study is to determine the prevalence of near-miss and nonfatal motor vehicle crashes related to sleepiness in a representative sample of Portuguese drivers. Structured phone interviews regarding sleepiness and sleep-related crashes and near misses, driving habits, demographic data, and sleep quality were conducted using the Pittsburgh Sleep Quality Index and sleep apnea risk using the Berlin questionnaire. A multivariate regression analysis was used to determine the associations with sleepy driving (feeling sleepy or falling asleep while driving) and sleep-related near misses and crashes. Nine hundred subjects, representing the Portuguese population of drivers, were included; 3.1% acknowledged falling asleep while driving during the previous year and 0.67% recalled sleepiness-related crashes. Higher education, driving more than 15,000 km/year, driving more frequently between 12:00 a.m. and 6 a.m., fewer years of having a driver's license, less total sleep time per night, and higher scores on the Epworth Sleepiness Scale (ESS) were all independently associated with sleepy driving. Sleepiness-related crashes and near misses were associated only with falling asleep at the wheel in the previous year. Sleep-related crashes occurred more frequently in drivers who had also had sleep-related near misses. Portugal has lower self-reported sleepiness at the wheel and sleep-related near misses than most other countries where epidemiological data are available. Different population characteristics and cultural, social, and road safety specificities may

  3. Mapping transmission risk of Lassa fever in West Africa: the importance of quality control, sampling bias, and error weighting.

    Science.gov (United States)

    Peterson, A Townsend; Moses, Lina M; Bausch, Daniel G

    2014-01-01

    Lassa fever is a disease that has been reported from sites across West Africa; it is caused by an arenavirus that is hosted by the rodent M. natalensis. Although it is confined to West Africa, and has been documented in detail in some well-studied areas, the details of the distribution of risk of Lassa virus infection remain poorly known at the level of the broader region. In this paper, we explored the effects of certainty of diagnosis, oversampling in well-studied region, and error balance on results of mapping exercises. Each of the three factors assessed in this study had clear and consistent influences on model results, overestimating risk in southern, humid zones in West Africa, and underestimating risk in drier and more northern areas. The final, adjusted risk map indicates broad risk areas across much of West Africa. Although risk maps are increasingly easy to develop from disease occurrence data and raster data sets summarizing aspects of environments and landscapes, this process is highly sensitive to issues of data quality, sampling design, and design of analysis, with macrogeographic implications of each of these issues and the potential for misrepresenting real patterns of risk.

  4. Mapping transmission risk of Lassa fever in West Africa: the importance of quality control, sampling bias, and error weighting.

    Directory of Open Access Journals (Sweden)

    A Townsend Peterson

    Lassa fever is a disease that has been reported from sites across West Africa; it is caused by an arenavirus that is hosted by the rodent M. natalensis. Although it is confined to West Africa, and has been documented in detail in some well-studied areas, the details of the distribution of risk of Lassa virus infection remain poorly known at the level of the broader region. In this paper, we explored the effects of certainty of diagnosis, oversampling in well-studied region, and error balance on results of mapping exercises. Each of the three factors assessed in this study had clear and consistent influences on model results, overestimating risk in southern, humid zones in West Africa, and underestimating risk in drier and more northern areas. The final, adjusted risk map indicates broad risk areas across much of West Africa. Although risk maps are increasingly easy to develop from disease occurrence data and raster data sets summarizing aspects of environments and landscapes, this process is highly sensitive to issues of data quality, sampling design, and design of analysis, with macrogeographic implications of each of these issues and the potential for misrepresenting real patterns of risk.

  5. Virulence characterisation of Salmonella enterica isolates of differing antimicrobial resistance recovered from UK livestock and imported meat samples.

    Directory of Open Access Journals (Sweden)

    Roderick eCard

    2016-05-01

    Salmonella enterica is a foodborne zoonotic pathogen of significant public health concern. We have characterised the virulence and antimicrobial resistance gene content of 95 Salmonella isolates from 11 serovars by DNA microarray recovered from UK livestock or imported meat. Genes encoding resistance to sulphonamides (sul1, sul2), tetracycline (tet(A), tet(B)), streptomycin (strA, strB), aminoglycoside (aadA1, aadA2), beta-lactam (blaTEM), and trimethoprim (dfrA17) were common. Virulence gene content differed between serovars; S. Typhimurium formed two subclades based on virulence plasmid presence. Thirteen isolates were selected by their virulence profile for pathotyping using the Galleria mellonella pathogenesis model. Infection with a chicken invasive S. Enteritidis or S. Gallinarum isolate, a multidrug resistant S. Kentucky, or a S. Typhimurium DT104 isolate resulted in high mortality of the larvae; notably presence of the virulence plasmid in S. Typhimurium was not associated with increased larvae mortality. Histopathological examination showed that infection caused severe damage to the Galleria gut structure. Enumeration of intracellular bacteria in the larvae 24 hours post-infection showed increases of up to 7 log above the initial inoculum and transmission electron microscopy (TEM) showed bacterial replication in the haemolymph. TEM also revealed the presence of vacuoles containing bacteria in the haemocytes, similar to Salmonella containing vacuoles observed in mammalian macrophages; although there was no evidence from our work of bacterial replication within vacuoles. This work shows that microarrays can be used for rapid virulence genotyping of S. enterica and that the Galleria animal model replicates some aspects of Salmonella infection in mammals. These procedures can be used to help inform on the pathogenicity of isolates that may be antibiotic resistant and have scope to aid the assessment of their potential public and animal health risk.

  6. FACE Analysis as a Fast and Reliable Methodology to Monitor the Sulfation and Total Amount of Chondroitin Sulfate in Biological Samples of Clinical Importance

    Directory of Open Access Journals (Sweden)

    Evgenia Karousou

    2014-06-01

    Glycosaminoglycans (GAGs), due to their hydrophilic character and high anionic charge densities, play important roles in various (patho)physiological processes. The identification and quantification of GAGs in biological samples and tissues could be useful prognostic and diagnostic tools in pathological conditions. Despite the noteworthy progress in the development of sensitive and accurate methodologies for the determination of GAGs, there is a significant lack of methodologies for sample preparation and of reliable, fast analysis methods enabling the simultaneous analysis of several biological samples. In this report, developed protocols for the isolation of GAGs in biological samples were applied to analyze various sulfated chondroitin sulfate- and hyaluronan-derived disaccharides using fluorophore-assisted carbohydrate electrophoresis (FACE). Applications to biologic samples of clinical importance include blood serum, lens capsule tissue and urine. The sample preparation protocol followed by FACE analysis allows quantification with an optimal linearity over the concentration range 1.0–220.0 µg/mL, affording a limit of quantitation of 50 ng of disaccharides. Validation of FACE results was performed by capillary electrophoresis and high performance liquid chromatography techniques.

  7. Behavioral Regulation, Visual Spatial Maturity in Kindergarten, and the Relationship of School Adaptation in the First Grade for a Sample of Turkish Children.

    Science.gov (United States)

    Özer, Serap

    2016-04-01

    Behavioral regulation has recently become an important variable in research looking at kindergarten and first-grade achievement of children in private and public schools. The purpose of this study was to examine a measure of behavioral regulation, the Head Toes Knees Shoulders Task, and to evaluate its relationship with visual spatial maturity at the end of kindergarten. Later, in first grade, teachers were asked to rate the children (N = 82) in terms of academic and behavioral adaptation. Behavioral regulation and visual spatial maturity were significantly different between the two school types, but ratings by the teachers in the first grade were affected by children's visual spatial maturity rather than by behavioral regulation. Socioeducational opportunities provided by the two types of schools may be more important to school adaptation than behavioral regulation.

  8. Optimizing trial design in pharmacogenetics research: comparing a fixed parallel group, group sequential, and adaptive selection design on sample size requirements.

    Science.gov (United States)

    Boessen, Ruud; van der Baan, Frederieke; Groenwold, Rolf; Egberts, Antoine; Klungel, Olaf; Grobbee, Diederick; Knol, Mirjam; Roes, Kit

    2013-01-01

    Two-stage clinical trial designs may be efficient in pharmacogenetics research when there is some but inconclusive evidence of effect modification by a genomic marker. Two-stage designs allow early stopping for efficacy or futility and can offer the additional opportunity to enrich the study population to a specific patient subgroup after an interim analysis. This study compared sample size requirements for fixed parallel group, group sequential, and adaptive selection designs with equal overall power and control of the family-wise type I error rate. The designs were evaluated across scenarios that defined the effect sizes in the marker positive and marker negative subgroups and the prevalence of marker positive patients in the overall study population. Effect sizes were chosen to reflect realistic planning scenarios, where at least some effect is present in the marker negative subgroup. In addition, scenarios were considered in which the assumed 'true' subgroup effects (i.e., the postulated effects) differed from those hypothesized at the planning stage. As expected, both two-stage designs generally required fewer patients than a fixed parallel group design, and the advantage increased as the difference between subgroups increased. The adaptive selection design added little further reduction in sample size, as compared with the group sequential design, when the postulated effect sizes were equal to those hypothesized at the planning stage. However, when the postulated effects deviated strongly in favor of enrichment, the comparative advantage of the adaptive selection design increased, which precisely reflects the adaptive nature of the design. Copyright © 2013 John Wiley & Sons, Ltd.

  9. Adaptation and evaluation of a rapid test for the diagnosis of sheep scrapie in samples of rectal mucosa.

    Science.gov (United States)

    González, Lorenzo; Horton, Robert; Ramsay, Drew; Toomik, Reet; Leathers, Valerie; Tonelli, Quentin; Dagleish, Mark P; Jeffrey, Martin; Terry, Linda

    2008-03-01

    In recent publications, it was shown that disease-associated prion protein (PrP(d)) accumulates in the lymphoid tissue of the rectal mucosa of a high proportion of scrapie-infected sheep at clinical and preclinical stages, regardless of several host factors; PrP(d) can also be detected in biopsy specimens of rectal mucosa, with an increased probability proportional to age or incubation period and with an efficiency almost identical to that of tonsil biopsies. Rectal biopsies have the advantages of providing higher numbers of lymphoid follicles and of being simpler to perform, which makes them suitable for scrapie screening in the field. In biopsy samples, PrP(d) could be demonstrated by immunohistochemical (IHC) and Western immunoblotting methods, and the purpose of the present study was to optimize and evaluate a "rapid test" for the diagnosis of scrapie in rectal biopsy samples. The HerdChek CWD (chronic wasting disease) antigen EIA (enzyme immunoassay) test was chosen and, once optimized, provided specificity and sensitivity figures of 99.2% and 93.5%, respectively, compared with IHC results in the same samples obtained at a postmortem. The sensitivity of the assay increased from 82.1%, when a single rectal mucosa sample was tested to 99.4% for those sheep in which 3 or more samples were analyzed. Similarly, sensitivity values of the HerdChek CWD antigen EIA test on biopsy samples increased from 95% to 100% for sheep subjected to 1 or 2 sequential biopsies 4 months apart, respectively. Thus, a preclinical diagnosis of scrapie in live sheep can be achieved by a combination of a simple sampling procedure, which can be repeated several times with no detrimental effect for the animals, and a rapid and efficient laboratory method.
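
    The reported accuracy figures correspond to the usual sensitivity and specificity definitions against the IHC reference. A minimal worked example follows (Python; the 2x2 counts below are invented so that the ratios roughly reproduce the reported percentages, they are not the study's data):

      # Sensitivity and specificity of a rapid test against a reference method.
      tp, fn = 130, 9       # reference-positive samples: detected / missed
      tn, fp = 1190, 10     # reference-negative samples: correct / false positive

      sensitivity = tp / (tp + fn)   # ~0.935
      specificity = tn / (tn + fp)   # ~0.992
      print(round(sensitivity, 3), round(specificity, 3))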

  10. The two-sample problem for Poisson processes: adaptive tests with a non-asymptotic wild bootstrap approach

    CERN Document Server

    Reynaud-Bouret, Patricia; Laurent, Béatrice

    2012-01-01

    Considering two independent Poisson processes, we address the question of testing equality of their respective intensities. We construct multiple testing procedures from the aggregation of single tests whose testing statistics come from model selection, thresholding and/or kernel estimation methods. The corresponding critical values are computed through a non-asymptotic wild bootstrap approach. The obtained tests are proved to be exactly of level $\\alpha$, and to satisfy non-asymptotic oracle type inequalities. From these oracle type inequalities, we deduce that our tests are adaptive in the minimax sense over a large variety of classes of alternatives based on classical and weak Besov bodies in the univariate case, but also Sobolev and anisotropic Nikol'skii-Besov balls in the multivariate case. A simulation study furthermore shows that they perform strongly in practice.
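
    A much simpler relative of such tests helps fix ideas: conditionally on the total number of events, equal intensities mean each event falls in the first process with probability proportional to its observation length, which can be checked by Monte Carlo. The sketch below implements only that basic count-based test (Python; it is not the adaptive multiple-testing wild-bootstrap procedure of the paper, and the counts are invented):

      import random

      def mc_two_sample_poisson_test(n1, n2, t1, t2, reps=20000, rng=random):
          """Monte Carlo p-value for H0: equal intensities, using only the event
          counts n1, n2 over observation lengths t1, t2. Under H0, given n1 + n2,
          each event lands in process 1 with probability t1 / (t1 + t2)."""
          n, p = n1 + n2, t1 / (t1 + t2)
          obs = abs(n1 - n * p)
          hits = sum(
              1 for _ in range(reps)
              if abs(sum(1 for _ in range(n) if rng.random() < p) - n * p) >= obs
          )
          return hits / reps

      random.seed(0)
      print(mc_two_sample_poisson_test(n1=48, n2=30, t1=1.0, t2=1.0))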

  11. Factorial Validity and Internal Consistency of Malaysian Adapted Depression Anxiety Stress Scale - 21 in an Adolescent Sample

    Directory of Open Access Journals (Sweden)

    Hairul Anuar Hashim

    2011-01-01

    Background: A psychometrically sound measurement instrument is a fundamental requirement across a broad range of research areas. In negative affect research, the Depression Anxiety Stress Scale (DASS) has been identified as a psychometrically sound instrument to measure depression, anxiety and stress, especially the 21-item version. However, its psychometric properties in adolescents have been less consistent. Objectives: Thus, the present study sought to examine the factorial validity and internal consistency of the adapted 21-item version of the DASS in Malaysian adolescents. Method: Using a cross-sectional study design, the DASS-21 was administered to 750 Malaysian adolescents (mean age = 13.40 ± 0.49). The data were then analyzed using confirmatory factor analysis (CFA), in which the original DASS-21 factor structure (depression-stress-anxiety) was compared to 8 other alternative models. Results: CFA results revealed weak support for the DASS-21 as a differentiated measure of depression, anxiety and stress in Malaysian adolescents. Extremely high latent factor intercorrelations were observed in the model reflecting the original DASS factor structure. On the other hand, despite the best overall fit of a 4-factor model consisting of depression, anxiety, and stress, as well as a general negative affect factor, individual factor loadings for the specific factors were uninterpretable. Although the model fit of the 1-factor model was inferior when compared with the other competing models, this model (1-factor) exhibited reasonable model fit. Conclusion: We concluded that the use of the Malaysian adapted DASS-21 as a differentiated measure of stress, anxiety, and depression in Malaysian adolescents should proceed with caution, and further refinement of the scale is necessary before a concrete conclusion can be made.

  12. Mobile membrane introduction tandem mass spectrometry for on-the-fly measurements and adaptive sampling of VOCs around oil and gas projects in Alberta, Canada

    Science.gov (United States)

    Krogh, E.; Gill, C.; Bell, R.; Davey, N.; Martinsen, M.; Thompson, A.; Simpson, I. J.; Blake, D. R.

    2012-12-01

    The release of hydrocarbons into the environment can have significant environmental and economic consequences. The evolution of smaller, more portable mass spectrometers to the field can provide spatially and temporally resolved information for rapid detection, adaptive sampling and decision support. We have deployed a mobile platform membrane introduction mass spectrometer (MIMS) for the in-field simultaneous measurement of volatile and semi-volatile organic compounds. In this work, we report instrument and data handling advances that produce geographically referenced data in real-time and preliminary data where these improvements have been combined with high precision ultra-trace VOCs analysis to adaptively sample air plumes near oil and gas operations in Alberta, Canada. We have modified a commercially available ion-trap mass spectrometer (Griffin ICX 400) with an in-house temperature controlled capillary hollow fibre polydimethylsiloxane (PDMS) polymer membrane interface and in-line permeation tube flow cell for a continuously infused internal standard. The system is powered by 24 VDC for remote operations in a moving vehicle. Software modifications include the ability to run continuous, interlaced tandem mass spectrometry (MS/MS) experiments for multiple contaminants/internal standards. All data are time and location stamped with on-board GPS and meteorological data to facilitate spatial and temporal data mapping. Tandem MS/MS scans were employed to simultaneously monitor ten volatile and semi-volatile analytes, including benzene, toluene, ethylbenzene and xylene (BTEX), reduced sulfur compounds, halogenated organics and naphthalene. Quantification was achieved by calibrating against a continuously infused deuterated internal standard (toluene-d8). Time referenced MS/MS data were correlated with positional data and processed using Labview and Matlab to produce calibrated, geographical Google Earth data-visualizations that enable adaptive sampling protocols
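
    The quantification step described (signal normalisation to a continuously infused internal standard) reduces to a ratio calculation. A minimal sketch follows (Python; the signal values, internal-standard concentration and response factor are invented, not measurements from this work):

      def quantify(analyte_signal, istd_signal, istd_concentration, relative_response_factor):
          """Single-point internal-standard quantification: analyte concentration
          estimated from the analyte/internal-standard signal ratio, scaled by the
          internal standard's effective concentration and a relative response
          factor measured in a prior calibration."""
          return (analyte_signal / istd_signal) * istd_concentration / relative_response_factor

      # e.g. a benzene MS/MS signal against a toluene-d8 internal standard.
      print(quantify(analyte_signal=3400.0, istd_signal=5100.0,
                     istd_concentration=12.0, relative_response_factor=0.85))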

  13. Ant species diversity in the 'Grands Causses' (Aveyron, France): In search of sampling methods adapted to temperate climates.

    Science.gov (United States)

    Groc, Sarah; Delabie, Jacques H C; Céréghino, Régis; Orivel, Jérôme; Jaladeau, Frédéric; Grangier, Julien; Mariano, Cléa S F; Dejean, Alain

    2007-12-01

    This study aimed at showing the applicability of using a combination of four sampling methods (i.e., Winkler extractors, pitfall traps, baiting and manual collection), an approach most often used in the tropics, to create an inventory of ant species diversity in temperate environments. We recorded a total of 33 ant species in the Grands Causses by comparing three vegetation types: a steppic lawn ('causse' sensu stricto), which was the most species-rich (29 species), followed by an oak grove (22 species) and a pine forest (17 species). No sampling method alone is efficient enough to provide adequate sampling, but their combination permits a suitable inventory of the myrmecofauna and yields information on the ecology of these ant species.

  14. Field-adapted sampling of whole blood to determine the levels of amodiaquine and its metabolite in children with uncomplicated malaria treated with amodiaquine plus artesunate combination

    Directory of Open Access Journals (Sweden)

    Gustafsson Lars L

    2009-03-01

    Background: Artemisinin combination therapy (ACT) has been widely adopted as first-line treatment for uncomplicated falciparum malaria. In Uganda, amodiaquine plus artesunate (AQ+AS) is the alternative first-line regimen to Coartem® (artemether + lumefantrine) for the treatment of uncomplicated falciparum malaria. Currently, there are few field-adapted analytical techniques for monitoring amodiaquine utilization in patients. This study evaluates the field applicability of a new method to determine amodiaquine and its metabolite concentrations in whole blood dried on filter paper. Methods: Twelve patients aged 1.5 to 8 years with uncomplicated malaria received three standard oral doses of AQ+AS. Filter paper blood samples were collected before drug intake and at six different time points over a 28-day period. A new field-adapted sampling procedure and liquid chromatographic method was used for quantitative determination of amodiaquine and its metabolite in whole blood. Results: The sampling procedure was successfully applied in the field. Amodiaquine could be quantified for at least three days and the metabolite up to 28 days. All parasites in all the 12 patients cleared within the first three days of treatment and no adverse drug effects were observed. Conclusion: The methodology is suitable for field studies. The possibility of determining the concentration of the active metabolite of amodiaquine up to 28 days suggests that the method is sensitive enough to monitor amodiaquine utilization in patients. Amodiaquine plus artesunate seems effective for treatment of falciparum malaria.

  15. Adaptation and Validation of the Brief Sexual Opinion Survey (SOS) in a Colombian Sample and Factorial Equivalence with the Spanish Version

    Science.gov (United States)

    Sierra, Juan Carlos; Soler, Franklin

    2016-01-01

    Attitudes toward sexuality are a key variable for sexual health. It is important for psychology and education to have adapted and validated questionnaires to evaluate these attitudes. Therefore, the objective of this research was to adapt and validate the Colombian Sexual Opinion Survey and to assess its equivalence with the Spanish version. To this end, a total of eight experts were consulted and 1,167 subjects from Colombia and Spain answered the Sexual Opinion Survey, the Sexual Assertiveness Scale, the Massachusetts General Hospital-Sexual Functioning Questionnaire, and the Sexuality Scale. The evaluation was conducted online, and the results show adequate qualitative and quantitative properties of the items, adequate reliability and external validity, and strong measurement invariance between the two countries. Consequently, the Colombian Sexual Opinion Survey is a valid and reliable scale and its scores can be compared with those from the Spanish version with minimal bias. PMID:27627114

  16. A novel β hyper-plane based importance sampling method

    Institute of Scientific and Technical Information of China (English)

    张峰; 吕震宙; 赵新攀

    2011-01-01

    A novel β hyper-plane based importance sampling method is presented to estimate the failure probability of a structure. Based on the β section, a virtual hyper-plane tangent to the failure surface at the design point, the variable space is separated into the importance region R and the unimportance region S, on which the truncated importance sampling functions hR(x) and hS(x) are established respectively. The numbers of samples drawn from hR(x) and hS(x) depend on the contributions of R and S to the failure probability and are determined by iterative simulation. Formulae for the failure probability estimate, its variance and its coefficient of variation are derived for the presented β hyper-plane importance sampling method. The method is suitable for estimating the failure probability of a single failure mode as well as of multiple failure modes in parallel systems. The examples show that the presented method is more efficient than the traditional importance sampling method.
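
    For context, plain design-point importance sampling for a failure probability P[g(X) <= 0] with standard normal inputs re-centres the sampling density at the design point and reweights failure samples by the density ratio; the β hyper-plane method refines this by splitting the space at a plane tangent to the failure surface. The sketch below shows only the plain design-point version (Python; it is not the truncated method of the paper, and the limit state and sample size are invented):

      import math
      import random

      def importance_sampling_pf(g, design_point, n=20000, rng=random):
          """Estimate P[g(U) <= 0] for U ~ N(0, I) by sampling from a unit-variance
          normal re-centred at the design point and reweighting each failure sample
          by the density ratio f(u)/h(u) = exp(-u.d + 0.5*|d|^2). Returns the
          estimate and its coefficient of variation."""
          d = design_point
          s1 = s2 = 0.0
          for _ in range(n):
              u = [di + rng.gauss(0.0, 1.0) for di in d]
              if g(u) <= 0.0:
                  w = math.exp(-sum(di * ui for di, ui in zip(d, u))
                               + 0.5 * sum(di * di for di in d))
                  s1 += w
                  s2 += w * w
          pf = s1 / n
          var = (s2 / n - pf * pf) / n
          cov = math.sqrt(max(var, 0.0)) / pf if pf > 0 else float("nan")
          return pf, cov

      # Toy linear limit state g(u) = beta - (u1 + u2)/sqrt(2); exact pf = Phi(-beta).
      beta = 3.0
      g = lambda u: beta - (u[0] + u[1]) / math.sqrt(2.0)
      design_point = [beta / math.sqrt(2.0)] * 2   # most probable failure point
      random.seed(0)
      print(importance_sampling_pf(g, design_point))   # close to 1.35e-3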

  17. Adaptation of the Participant Role Scale (PRS) in a Spanish youth sample: measurement invariance across gender and relationship with sociometric status.

    Science.gov (United States)

    Lucas-Molina, Beatriz; Williamson, Ariel A; Pulido, Rosa; Calderón, Sonsoles

    2014-11-01

    In recent years, bullying research has transitioned from investigating the characteristics of the bully-victim dyad to examining bullying as a group-level process, in which the majority of children play some kind of role. This study used a shortened adaptation of the Participant Role Scale (PRS) to identify these roles in a representative sample of 2,050 Spanish children aged 8 to 13 years. Confirmatory factor analysis revealed three different roles, indicating that the adapted scale remains a reliable way to distinguish the Bully, Defender, and Outsider roles. In addition, measurement invariance of the adapted scale was examined to analyze possible gender differences among the roles. Peer status was assessed separately by gender through two sociometric procedures: the nominations-based method and the ratings-based method. Across genders, children in the Bully role were more often rated as rejected, whereas Defenders were more popular. Results suggest that although the PRS can reveal several different peer roles in the bullying process, a more clear distinction between bullying roles (i.e., Bully, Assistant, and Reinforcer) could better inform strategies for bullying interventions.

  18. Whole genome resequencing of a laboratory-adapted Drosophila melanogaster population sample [version 1; referees: 2 approved]

    Directory of Open Access Journals (Sweden)

    William P. Gilks

    2016-11-01

    Full Text Available As part of a study into the molecular genetics of sexually dimorphic complex traits, we used next-generation sequencing to obtain data on genomic variation in an outbred laboratory-adapted fruit fly (Drosophila melanogaster) population. We successfully resequenced the whole genome of 220 hemiclonal females that were heterozygous for the same Berkeley reference line genome (BDGP6/dm6) and a unique haplotype from the outbred base population (LHM). The use of a static and known genetic background enabled us to obtain sequences from whole genome phased haplotypes. We used a BWA-Picard-GATK pipeline for mapping sequence reads to the dm6 reference genome assembly, at a median depth of coverage of 31X, and have made the resulting data publicly-available in the NCBI Short Read Archive (Accession number SRP058502). We used Haplotype Caller to discover and genotype 1,726,931 small genomic variants (SNPs and indels, <200bp). Additionally we detected and genotyped 167 large structural variants (1-100Kb in size) using GenomeStrip/2.0. Sequence and genotype data are publicly-available at the corresponding NCBI databases: Short Read Archive, dbSNP and dbVar (BioProject PRJNA282591). We have also released the unfiltered genotype data, and the code and logs for data processing and summary statistics (https://zenodo.org/communities/sussex_drosophila_sequencing/).

  19. The relative importance of Staphylococcus saprophyticus as a urinary tract pathogen: distribution of bacteria among urinary samples analysed during 1 year at a major Swedish laboratory.

    Science.gov (United States)

    Eriksson, Andreas; Giske, Christian G; Ternhag, Anders

    2013-01-01

    To determine the distribution of urinary tract pathogens, with a focus on Staphylococcus saprophyticus, and analyse the seasonality, antibiotic susceptibility, and gender and age distributions in a large Swedish cohort. S. saprophyticus is considered an important causative agent of urinary tract infection (UTI) in young women, and some earlier studies have reported that up to approximately 40% of UTIs in this patient group are caused by S. saprophyticus. We hypothesized that this may be true only in very specific outpatient settings. During the year 2010, 113,720 urine samples were sent for culture to the Karolinska University Hospital, from both clinics in the hospital and from primary care units. Patient age, gender and month of sampling were analysed for S. saprophyticus, Escherichia coli, Klebsiella pneumoniae and Proteus mirabilis. Species data were obtained for 42,633 (37%) of the urine samples. The most common pathogens were E. coli (57.0%), Enterococcus faecalis (6.5%), K. pneumoniae (5.9%), group B streptococci (5.7%), P. mirabilis (3.0%) and S. saprophyticus (1.8%). The majority of subjects with S. saprophyticus were women 15-29 years of age (63.8%). In this age group, S. saprophyticus constituted 12.5% of all urinary tract pathogens. S. saprophyticus is a common urinary tract pathogen in young women, but its relative importance is low compared with E. coli even in this patient group. For women of other ages and for men, growth of S. saprophyticus is an uncommon finding.

  20. Assessing an Adaptive Cycle in a Social System under External Pressure to Change: the Importance of Intergroup Relations in Recreational Fisheries Governance

    Directory of Open Access Journals (Sweden)

    Robert Arlinghaus

    2011-06-01

    Full Text Available The adaptive cycle constitutes a heuristic originally used to interpret the dynamics of complex ecosystems in response to disturbance and change. It is assumed that socially constructed governance systems go through similar phases (K, Ω [omega], α [alpha], r) as evident in ecological adaptive cycles. Two key dimensions of change shaping the four phases of an adaptive cycle are the degree of connectedness and the range of potential in the system. Our purpose was to quantitatively assess the four phases of the adaptive cycle in a social system by measuring the potential and connectedness dimensions and their different levels in each of the four phases. We assessed these dimensions using quantitative data from content analysis of magazine articles describing the transition process of East German recreational fisheries governance after the fall of the Berlin Wall in 1989. This process was characterized by the discussion of two governance alternatives amenable to implementation: a central East German and a decentralized West German approach. Contrary to assumptions in the adaptive cycle heuristic, we were unable to identify the four phases of the adaptive cycle in our governance system based on quantitatively assessed levels of connectedness and potential alone. However, the insertion of in-group (East Germans) and out-group (West Germans) dimensions representing the two governance alternatives in our analysis enabled us to identify the specific time frames for all four phases of the adaptive cycle on a monthly basis. These findings suggest that an unmodified "figure-eight model" of the adaptive cycle may not necessarily hold in social systems. Inclusion of disciplinary theories such as intergroup relation theory will help in understanding adaptation processes in social systems.

  1. Implications of the minimal clinically important difference for health-related quality-of-life outcomes: a comparison of sample size requirements for an incontinence treatment trial.

    Science.gov (United States)

    Halme, Alex S; Fritel, Xavier; Benedetti, Andrea; Eng, Ken; Tannenbaum, Cara

    2015-03-01

    Sample size calculations for treatment trials that aim to assess health-related quality-of-life (HRQOL) outcomes are often difficult to perform. Researchers must select a target minimal clinically important difference (MCID) in HRQOL for the trial, estimate the effect size of the intervention, and then consider the responsiveness of different HRQOL measures for detecting improvements. Generic preference-based HRQOL measures are usually less sensitive to gains in HRQOL than are disease-specific measures, but are nonetheless recommended to quantify an impact on HRQOL that can be translated into quality-adjusted life-years during cost-effectiveness analyses. Mapping disease-specific measures onto generic measures is a proposed method for yielding more efficient sample size requirements while retaining the ability to generate utility weights for cost-effectiveness analyses. This study sought to test this mapping strategy to calculate and compare the effect on sample size of three different methods. Three different methods were used for determining an MCID in HRQOL in patients with incontinence: 1) a global rating of improvement, 2) an incontinence-specific HRQOL instrument, and 3) a generic preference-based HRQOL instrument using mapping coefficients. The sample size required to detect a 20% difference in the MCID for the global rating of improvement was 52 per trial arm, 172 per arm for the incontinence-specific HRQOL outcome, and 500 per arm for the generic preference-based HRQOL outcome. We caution that treatment trials of conditions for which improvements are not easy to measure on generic HRQOL instruments will still require significantly greater sample size even when mapping functions are used to try to gain efficiency. Copyright © 2015 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
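
    The effect of instrument responsiveness on required sample size can be illustrated with the standard normal-approximation formula for a two-sample comparison of means. The sketch below uses hypothetical standardized effect sizes for the three types of outcome measure; the values are assumptions chosen only to show the order-of-magnitude differences, not the study's own calculations.

```python
from scipy.stats import norm

def n_per_arm(effect_size_d, alpha=0.05, power=0.80):
    """Per-arm sample size for a two-sample comparison of means, using the usual
    normal approximation n = 2 * (z_{1-alpha/2} + z_{power})**2 / d**2."""
    z_alpha = norm.ppf(1.0 - alpha / 2.0)
    z_beta = norm.ppf(power)
    return 2.0 * (z_alpha + z_beta) ** 2 / effect_size_d ** 2

# Hypothetical standardized effect sizes for the same underlying improvement as seen
# by three instruments of decreasing responsiveness (illustrative values only).
for label, d in [("global rating of improvement", 0.55),
                 ("incontinence-specific HRQOL", 0.30),
                 ("generic preference-based HRQOL", 0.18)]:
    print(f"{label:>32s}: d = {d:.2f} -> about {n_per_arm(d):.0f} per arm")
```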

  2. Evaluation of the orthodontic treatment need in a paediatric sample from Southern Italy and its importance among paediatricians for improving oral health in pediatric dentistry

    Science.gov (United States)

    Ierardo, Gaetano; Corridore, Denise; Di Carlo, Gabriele; Di Giorgio, Gianni; Leonardi, Emanuele; Campus, Guglielmo-Giuseppe; Vozza, Iole; Polimeni, Antonella; Bossù, Maurizio

    2017-01-01

    Background: Data from epidemiological studies investigating the prevalence and severity of malocclusions in children are of great relevance to public health programs aimed at orthodontic prevention. Previous epidemiological studies focused mainly on the adolescent age group and reported a highly variable prevalence of malocclusion, ranging from 32% to 93%. The aim of our study was to assess the need for orthodontic treatment in a paediatric sample from Southern Italy in order to improve awareness among paediatricians about oral health preventive strategies in pediatric dentistry. Material and Methods: The study used the IOTN-DHC index to evaluate the need for orthodontic treatment for several malocclusions (overjet, reverse overjet, overbite, openbite, crossbite) in a sample of 579 children in the 2-9 years age range. Results: The most frequently altered occlusal parameter was the overbite (prevalence: 24.5%), while the occlusal anomaly that most frequently presented a need for orthodontic treatment was the crossbite (8.8%). The overall prevalence of need for orthodontic treatment was 19.3%, while 49% of the sample showed one or more altered occlusal parameters. No statistically significant difference was found between males and females. Conclusions: Results from this study support the idea that the establishment of a malocclusion is a gradual process starting at an early age. Effective orthodontic prevention programs should therefore include preschool children and make paediatricians aware of the importance of an early first dental visit. Key words: Orthodontic treatment, malocclusion, oral health, pediatric dentistry. PMID:28936290

  3. Slice Sampling

    CERN Document Server

    Neal, R M

    2000-01-01

    Markov chain sampling methods that automatically adapt to characteristics of the distribution being sampled can be constructed by exploiting the principle that one can sample from a distribution by sampling uniformly from the region under the plot of its density function. A Markov chain that converges to this uniform distribution can be constructed by alternating uniform sampling in the vertical direction with uniform sampling from the horizontal `slice' defined by the current vertical position, or more generally, with some update that leaves the uniform distribution over this slice invariant. Variations on such `slice sampling' methods are easily implemented for univariate distributions, and can be used to sample from a multivariate distribution by updating each variable in turn. This approach is often easier to implement than Gibbs sampling, and more efficient than simple Metropolis updates, due to the ability of slice sampling to adaptively choose the magnitude of changes made. It is therefore attractive f...
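
    A minimal univariate slice sampler with stepping-out and shrinkage, in the spirit of the method described above, might look as follows (the target density, bracket width and step limits are illustrative choices):

```python
import math
import random

def slice_sample(log_f, x0, n_samples, w=1.0, max_steps=50, seed=0):
    """Univariate slice sampler with stepping-out and shrinkage.
    log_f: log of an (unnormalized) density, x0: starting point, w: initial bracket width."""
    rng = random.Random(seed)
    samples, x = [], x0
    for _ in range(n_samples):
        log_y = log_f(x) + math.log(rng.random())      # vertical level defining the slice
        # Step out to find an interval [left, right] that brackets the slice.
        left = x - w * rng.random()
        right = left + w
        steps = 0
        while log_f(left) > log_y and steps < max_steps:
            left -= w
            steps += 1
        steps = 0
        while log_f(right) > log_y and steps < max_steps:
            right += w
            steps += 1
        # Shrinkage: draw uniformly from the interval, shrinking it on each rejection.
        while True:
            x_new = left + (right - left) * rng.random()
            if log_f(x_new) > log_y:
                x = x_new
                break
            if x_new < x:
                left = x_new
            else:
                right = x_new
        samples.append(x)
    return samples

# Example: draw from a standard normal through its unnormalized log-density.
draws = slice_sample(lambda x: -0.5 * x * x, x0=0.0, n_samples=5000)
mean = sum(draws) / len(draws)
var = sum(d * d for d in draws) / len(draws) - mean ** 2
print(f"sample mean {mean:.3f} (expect ~0), sample variance {var:.3f} (expect ~1)")
```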

  4. Could the Food Neophobia Scale be adapted to pregnant women? A confirmatory factor analysis in a Portuguese sample.

    Science.gov (United States)

    Paupério, Ana; Severo, Milton; Lopes, Carla; Moreira, Pedro; Cooke, Lucy; Oliveira, Andreia

    2014-04-01

    The Food Neophobia Scale (FNS) is widely used in different countries; however, appropriate psychometric analyses are required to allow cross-cultural comparisons. To our knowledge, most studies have been conducted among children and adult populations, with no reference to pregnant women. The objective of this study was to translate and test the psychometric properties of a Portuguese version of the FNS, and to identify clusters of food neophobia during pregnancy. The FNS was translated into Portuguese by three health researchers, and back-translated into English by an independent native English speaker and professional translator. The scale was self-administered in a sample of 219 women from the baseline evaluation of the Taste intervention study (HabEat project: http://www.habeat.eu/), who attended medical visits in two hospitals in Porto, Portugal, reporting food neophobia during the last trimester of pregnancy. The FNS consists of 10 items with a 7-point rating scale. An exploratory analysis was performed to evaluate the scale's dimensionality, followed by a confirmatory factor analysis to test the fit of the previous model using different indexes. Cronbach's alpha coefficient was calculated to evaluate the internal reliability of the scale. The construct validity was assessed by comparing the FNS scores by categories of education, age and fruit and vegetables intake by ANOVA. A model-based clustering was used to identify patterns of food neophobia; the number of latent classes was defined according to the Bayesian information criterion. A two-factor model solution was obtained (after excluding item 8 with a factor loading foods; less neophobic traits) and items 2, 3, 5 and 7 loaded into a second factor (i.e. more neophobic traits). A good global fit of the model was confirmed by fit indexes: TLI=0.876, CFI=0.911, RMSEA=0.088 and SRMR=0.051. The higher the education, age, and fruit and vegetables intake the lower the neophobic tendency, measured by

  5. AdBagging: adaptive sampling parameters online bagging algorithm

    Institute of Scientific and Technical Information of China (English)

    李小斌; 李世银

    2011-01-01

    By analyzing the concept drift problem in data stream classification, a new algorithm based on online bagging, named adaptive lambda bagging (AdBagging), is introduced. The new algorithm dynamically adjusts the sampling parameter of the Poisson distribution used in online bagging according to the number of misclassified samples in the data stream. In this way the algorithm pays more attention to misclassified samples and less attention to correctly classified samples, and the learning weight of each sample can also be adjusted according to its temporal order, so that the ensemble classifier can dynamically adjust its diversity and cope with concept drift in the data stream. The algorithm retains the efficiency and simplicity of online bagging, and experiments on synthetic and real data sets demonstrate its effectiveness.
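
    A rough sketch of the adaptive-lambda idea is given below: each incoming example is presented to every base learner a Poisson-distributed number of times, with the Poisson parameter raised when the current ensemble misclassifies the example. The base learner, the specific lambda update rule and the drifting toy stream are assumptions for illustration, not the paper's exact algorithm.

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(0)

class AdaptiveOnlineBagging:
    """Online bagging in which the Poisson parameter for an example is raised when the
    current ensemble misclassifies it (the adaptive-lambda idea; update rule assumed)."""

    def __init__(self, n_estimators=10, classes=(0, 1), lam_wrong=2.0, lam_right=0.5):
        self.learners = [SGDClassifier(random_state=i) for i in range(n_estimators)]
        self.classes = np.array(classes)
        self.lam_wrong, self.lam_right = lam_wrong, lam_right
        self.started = False

    def predict_one(self, x):
        votes = [clf.predict(x.reshape(1, -1))[0] for clf in self.learners]
        values, counts = np.unique(votes, return_counts=True)
        return values[np.argmax(counts)]            # majority vote

    def learn_one(self, x, y):
        # Weight the example more heavily if the current ensemble gets it wrong.
        if self.started:
            lam = self.lam_wrong if self.predict_one(x) != y else self.lam_right
        else:
            lam = 1.0
        for clf in self.learners:
            k = rng.poisson(lam)                    # how often this learner sees the example
            if not self.started:
                k = max(k, 1)                       # make sure every learner is initialized
            for _ in range(k):
                clf.partial_fit(x.reshape(1, -1), [y], classes=self.classes)
        self.started = True

# Tiny synthetic stream whose concept changes halfway through (illustrative only).
X = rng.normal(size=(2000, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
y[1000:] = (X[1000:, 0] - X[1000:, 1] > 0).astype(int)   # simulated concept drift

model, correct = AdaptiveOnlineBagging(), 0
for xi, yi in zip(X, y):
    if model.started:
        correct += int(model.predict_one(xi) == yi)
    model.learn_one(xi, yi)
print("prequential accuracy:", round(correct / (len(X) - 1), 3))
```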

  6. Computationally efficient video restoration for Nyquist sampled imaging sensors combining an affine-motion-based temporal Kalman filter and adaptive Wiener filter.

    Science.gov (United States)

    Rucci, Michael; Hardie, Russell C; Barnard, Kenneth J

    2014-05-01

    In this paper, we present a computationally efficient video restoration algorithm to address both blur and noise for a Nyquist sampled imaging system. The proposed method utilizes a temporal Kalman filter followed by a correlation-model based spatial adaptive Wiener filter (AWF). The Kalman filter employs an affine background motion model and novel process-noise variance estimate. We also propose and demonstrate a new multidelay temporal Kalman filter designed to more robustly treat local motion. The AWF is a spatial operation that performs deconvolution and adapts to the spatially varying residual noise left in the Kalman filter stage. In image areas where the temporal Kalman filter is able to provide significant noise reduction, the AWF can be aggressive in its deconvolution. In other areas, where less noise reduction is achieved with the Kalman filter, the AWF balances the deconvolution with spatial noise reduction. In this way, the Kalman filter and AWF work together effectively, but without the computational burden of full joint spatiotemporal processing. We also propose a novel hybrid system that combines a temporal Kalman filter and BM3D processing. To illustrate the efficacy of the proposed methods, we test the algorithms on both simulated imagery and video collected with a visible camera.
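
    A heavily simplified sketch of the temporal-Kalman-plus-adaptive-Wiener idea is shown below: a per-pixel scalar Kalman filter with a random-walk state model (i.e. no affine motion compensation and no deconvolution, both of which the method above includes) followed by a locally adaptive Wiener filter. The noise variances and the synthetic scene are assumptions for illustration only.

```python
import numpy as np
from scipy.signal import wiener

rng = np.random.default_rng(0)

def restore_video(frames, q=1e-4, r=4e-2, wiener_window=5):
    """Per-pixel temporal Kalman filter (random-walk state model, process noise q,
    measurement noise r) followed by scipy's locally adaptive Wiener filter."""
    x = frames[0].astype(float)            # per-pixel state estimate
    p = np.ones_like(x)                    # per-pixel state variance
    restored = []
    for z in frames:
        z = z.astype(float)
        p = p + q                          # predict step
        k = p / (p + r)                    # Kalman gain
        x = x + k * (z - x)                # update with the new frame
        p = (1.0 - k) * p
        restored.append(wiener(x, mysize=wiener_window))   # spatially adaptive stage
    return restored

# Synthetic example: a static scene observed through additive Gaussian noise.
clean = np.tile(np.linspace(0.0, 1.0, 64), (64, 1))
frames = [clean + rng.normal(scale=0.2, size=clean.shape) for _ in range(20)]
out = restore_video(frames)
print("noisy frame MSE:   ", round(float(np.mean((frames[-1] - clean) ** 2)), 5))
print("restored frame MSE:", round(float(np.mean((out[-1] - clean) ** 2)), 5))
```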

  7. Estimation of number and density, and random distribution testing of important plant species in Ban Pong Forest, Sansai District, Chiang Mai Province, Thailand using T-Square sampling

    Directory of Open Access Journals (Sweden)

    Phahol Sakkatat

    Full Text Available A study using the T-square sampling method was conducted to investigate important plant species in Ban Pong Forest, Sansai district, Chiang Mai province, by estimating their number and density and testing their random distribution. The results showed that there were 14 important plant species, viz. Dipterocarpus tuberculatus Roxb., Shorea obtusa Wall. ex Blume, Bridelia retusa (L.) A. Juss., Derris scandens Benth., Thyrsostachys siamensis, Parinari anamense Hance, Vitex pinnata L.f., Canarium subulatum Guill., Litsea glutinosa C.B. Roxb., Alphonsea glabrifolia Craib, Pueraria mirifica, Vatica stapfiana van Slooten, Walsura robusta Roxb. and Dipterocarpus alatus Roxb. By far, Dipterocarpus tuberculatus Roxb. was greatest in number and density, and all of the species had a random distribution, except Walsura robusta Roxb. and Dipterocarpus alatus Roxb.

  8. Improvement of near infrared spectroscopic (NIRS) analysis of caffeine in roasted Arabica coffee by variable selection method of stability competitive adaptive reweighted sampling (SCARS)

    Science.gov (United States)

    Zhang, Xuan; Li, Wei; Yin, Bin; Chen, Weizhong; Kelly, Declan P.; Wang, Xiaoxin; Zheng, Kaiyi; Du, Yiping

    2013-10-01

    Coffee is the most heavily consumed beverage in the world after water, and quality is a key consideration in its commercial trade. Caffeine content, which has a significant effect on the final quality of coffee products, therefore needs to be determined quickly and reliably by new analytical techniques. The main purpose of this work was to establish a powerful and practical analytical method based on near infrared spectroscopy (NIRS) and chemometrics for the quantitative determination of caffeine content in roasted Arabica coffees. Ground coffee samples covering a wide range of roasting levels were analyzed by NIR, while their caffeine contents were determined quantitatively by the commonly used HPLC-UV method to provide reference values. Calibration models were then developed based on chemometric analyses of the NIR spectral data and the reference concentrations of the coffee samples. Partial least squares (PLS) regression was used to construct the models. Furthermore, diverse spectral pretreatment and variable selection techniques were applied in order to obtain robust and reliable reduced-spectrum regression models. Comparing the quality of the different models constructed, the application of second derivative pretreatment and stability competitive adaptive reweighted sampling (SCARS) variable selection provided a notably improved regression model, with a root mean square error of cross validation (RMSECV) of 0.375 mg/g and a correlation coefficient (R) of 0.918 at a PLS factor of 7. An independent test set was used to assess the model, giving a root mean square error of prediction (RMSEP) of 0.378 mg/g, a mean relative error of 1.976% and a mean relative standard deviation (RSD) of 1.707%. The results provided by this high-quality calibration model thus demonstrate the feasibility of NIR spectroscopy for at-line prediction of the caffeine content of unknown roasted coffee samples, thanks to an analysis time of a few seconds and the non-destructive nature of the measurement.
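
    The core calibration step, PLS regression on derivative-pretreated spectra evaluated by cross-validation, can be sketched as below on synthetic stand-in data; the SCARS variable-selection step is omitted, and all sizes, ranges and noise levels are assumptions rather than the study's data.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)

# Synthetic stand-in for NIR spectra: the real study uses measured spectra with
# HPLC-UV caffeine reference values; sizes, ranges and noise level here are assumptions.
n_samples, n_wavelengths = 120, 700
caffeine = rng.uniform(8.0, 14.0, n_samples)               # mg/g, hypothetical range
basis = rng.normal(size=(3, n_wavelengths))
spectra = (np.outer(caffeine, basis[0])
           + np.outer(rng.normal(size=n_samples), basis[1])
           + np.outer(rng.normal(size=n_samples), basis[2])
           + rng.normal(scale=0.5, size=(n_samples, n_wavelengths)))

# Crude second-derivative-like pretreatment: two numerical differences along wavelength.
pretreated = np.diff(spectra, n=2, axis=1)

pls = PLSRegression(n_components=7)                         # 7 latent variables, as above
predicted = cross_val_predict(pls, pretreated, caffeine, cv=10).ravel()
rmsecv = float(np.sqrt(np.mean((predicted - caffeine) ** 2)))
r = float(np.corrcoef(predicted, caffeine)[0, 1])
print(f"RMSECV = {rmsecv:.3f} mg/g, R = {r:.3f}")
```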

  9. The Importance of In Situ Measurements and Sample Return in the Search for Chemical Biosignatures on Mars or other Solar System Bodies (Invited)

    Science.gov (United States)

    Glavin, D. P.; Brinckerhoff, W. B.; Conrad, P. G.; Dworkin, J. P.; Eigenbrode, J. L.; Getty, S.; Mahaffy, P. R.

    2013-12-01

    The search for evidence of life on Mars and elsewhere will continue to be one of the primary goals of NASA's robotic exploration program for decades to come. NASA and ESA are currently planning a series of robotic missions to Mars with the goal of understanding its climate, resources, and potential for harboring past or present life. One key goal will be the search for chemical biomarkers including organic compounds important in life on Earth and their geological forms. These compounds include amino acids, the monomer building blocks of proteins and enzymes, nucleobases and sugars which form the backbone of DNA and RNA, and lipids, the structural components of cell membranes. Many of these organic compounds can also be formed abiotically as demonstrated by their prevalence in carbonaceous meteorites [1], though, their molecular characteristics may distinguish a biological source [2]. It is possible that in situ instruments may reveal such characteristics, however, return of the right samples to Earth (i.e. samples containing chemical biosignatures or having a high probability of biosignature preservation) would enable more intensive laboratory studies using a broad array of powerful instrumentation for bulk characterization, molecular detection, isotopic and enantiomeric compositions, and spatially resolved chemistry that may be required for confirmation of extant or extinct life on Mars or elsewhere. In this presentation we will review the current in situ analytical capabilities and strategies for the detection of organics on the Mars Science Laboratory (MSL) rover using the Sample Analysis at Mars (SAM) instrument suite [3] and discuss how both future advanced in situ instrumentation [4] and laboratory measurements of samples returned from Mars and other targets of astrobiological interest including the icy moons of Jupiter and Saturn will help advance our understanding of chemical biosignatures in the Solar System. References: [1] Cronin, J. R and Chang S. (1993

  10. Face recognition with single training sample per person based on adaptive weighted LBP

    Institute of Scientific and Technical Information of China (English)

    赵汝哲; 房斌; 文静

    2012-01-01

    Faced with the problem of face recognition with a single training sample per person, conventional face recognition methods suffer a serious drop in recognition rate or even fail to work. To solve this problem, this paper proposes an adaptive weighted Local Binary Pattern (LBP) method. The method extracts texture information, incorporates block-based topological information and, more importantly, fuses these features with suitable weights. The facial image is partitioned into blocks and LBP is used to extract texture features; the variance is used to implement the adaptive weighted fusion of the features; and a nearest neighbor classifier is adopted for recognition. Experimental results on the ORL face database show that the method effectively improves the recognition rate.
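
    A sketch of a block-wise LBP descriptor with variance-based weights and nearest-neighbour matching is given below. The exact weighting and partitioning used in the paper are not reproduced; the block grid, LBP parameters and toy images are assumptions.

```python
import numpy as np
from skimage.feature import local_binary_pattern

def weighted_lbp_feature(img, grid=(4, 4), n_points=8, radius=1):
    """Block-wise uniform-LBP histograms, each scaled by the block's normalized
    gray-level variance (a stand-in for the paper's adaptive weighting)."""
    lbp = local_binary_pattern(img, n_points, radius, method="uniform")
    n_bins = n_points + 2
    h, w = img.shape
    bh, bw = h // grid[0], w // grid[1]
    feats, weights = [], []
    for i in range(grid[0]):
        for j in range(grid[1]):
            sl = np.s_[i * bh:(i + 1) * bh, j * bw:(j + 1) * bw]
            hist, _ = np.histogram(lbp[sl], bins=n_bins, range=(0, n_bins), density=True)
            feats.append(hist)
            weights.append(img[sl].var())           # block variance drives its weight
    weights = np.asarray(weights)
    weights = weights / (weights.sum() + 1e-12)
    return np.concatenate([wt * f for wt, f in zip(weights, feats)])

def nearest_neighbour(query_feat, gallery_feats, gallery_labels):
    distances = np.linalg.norm(gallery_feats - query_feat, axis=1)
    return gallery_labels[np.argmin(distances)]

# Toy usage with random images standing in for the single training image per person.
rng = np.random.default_rng(0)
gallery_imgs = [rng.integers(0, 256, size=(64, 64), dtype=np.uint8) for _ in range(5)]
gallery_feats = np.stack([weighted_lbp_feature(im) for im in gallery_imgs])
labels = np.arange(5)
probe = np.clip(gallery_imgs[2].astype(int) + rng.normal(scale=10.0, size=(64, 64)),
                0, 255).astype(np.uint8)            # noisy copy of person 2
print("predicted identity:", nearest_neighbour(weighted_lbp_feature(probe), gallery_feats, labels))
```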

  11. QR code sampling method based on adaptive match

    Institute of Scientific and Technical Information of China (English)

    宋贤媛; 张多英

    2015-01-01

    The QR code image acquired by a camera always contains some distortion, so it must be recognised and converted to a standard QR code before decoding. This paper analyses distortion and correction in QR code recognition. Because some distortion inevitably remains after tilt correction and geometric correction, traditional methods cannot sample the QR code accurately. To address this problem, an adaptive matching method is proposed that acquires the effective sampling region of the QR code from the matching rate of two adjacent pixel rows (or columns). Experiments show that the method is stable and real-time, and can sample the QR code quickly and accurately.
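
    One way to realise the adjacent-row matching idea is sketched below: compute the fraction of identical pixels between neighbouring rows (and columns) of the binarized symbol, treat drops in this matching rate as module boundaries, and sample one pixel per module. The threshold and the synthetic test pattern are assumptions; the paper's own grid-construction details are not reproduced.

```python
import numpy as np

def adjacent_match_rates(binary_img, axis=0):
    """Fraction of identical pixels between each pair of adjacent rows (axis=0)
    or columns (axis=1) of a binarized QR image."""
    a = np.moveaxis(binary_img, axis, 0)
    return (a[:-1] == a[1:]).mean(axis=1)

def module_boundaries(match_rates, threshold=0.95):
    """Positions where the match rate drops, taken as candidate module boundaries
    (the fixed threshold is an assumption made for this sketch)."""
    return np.where(match_rates < threshold)[0] + 1

# Toy QR-like pattern: a 21x21 module matrix rendered at 7 pixels per module.
rng = np.random.default_rng(0)
modules = rng.integers(0, 2, size=(21, 21))
img = np.kron(modules, np.ones((7, 7), dtype=int))

rows = module_boundaries(adjacent_match_rates(img, axis=0))
cols = module_boundaries(adjacent_match_rates(img, axis=1))

# Sample one pixel per module, midway between consecutive boundaries.
row_centres = (np.r_[0, rows] + np.r_[rows, img.shape[0]]) // 2
col_centres = (np.r_[0, cols] + np.r_[cols, img.shape[1]]) // 2
sampled = img[np.ix_(row_centres, col_centres)]
print("module matrix recovered:", bool(np.array_equal(sampled, modules)))
```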

  12. Pulse transient hot strip technique adapted for slab sample geometry to study anisotropic thermal transport properties of μm-thin crystalline films.

    Science.gov (United States)

    Ma, Y; Gustavsson, J S; Haglund, A; Gustavsson, M; Gustafsson, S E

    2014-04-01

    A new method based on the adaptation of the Pulse Transient Hot Strip technique to slab sample geometry has been developed for studying the thermal conductivity and thermal diffusivity of anisotropic thin film materials (conductivity in the 0.01-100 W/mK range) deposited on thin substrates (i.e., wafers). A strength of this technique is that it provides a well-controlled thermal probing depth, making it possible to probe a predetermined depth of the sample layer and thereby avoid the influence of material(s) deeper down in the sample. To verify the technique, a series of measurements was conducted on a y-cut single crystal quartz wafer. A Hot Strip sensor (32-μm wide, 3.2-mm long) was deposited along each of two orthogonal crystallographic (x- and z-) directions and two independent pulse transients were recorded. Thereafter, the data were fitted to our theoretical model, and the anisotropic thermal transport properties were determined. Using a thermal probing depth of only 30 μm, we obtained a thermal conductivity along the direction perpendicular (parallel) to the z-, i.e., optic, axis of 6.48 (11.4) W/mK, and a thermal diffusivity of 3.62 (6.52) mm(2)/s. This yields a volumetric specific heat of 1.79 MJ/m(3)K. These values agree well with tabulated data on bulk crystalline quartz, supporting the accuracy of the technique, and the obtained standard deviation of less than 2.7% demonstrates the precision of this new measurement technique.

  13. Climate adaptation

    Science.gov (United States)

    Kinzig, Ann P.

    2015-03-01

    This paper is intended as a brief introduction to climate adaptation in a conference devoted otherwise to the physics of sustainable energy. Whereas mitigation involves measures to reduce the probability of a potential event, such as climate change, adaptation refers to actions that lessen the impact of climate change. Mitigation and adaptation differ in other ways as well. Adaptation does not necessarily have to be implemented immediately to be effective; it only needs to be in place before the threat arrives. Also, adaptation does not necessarily require global, coordinated action; many effective adaptation actions can be local. Some urban communities, because of land-use change and the urban heat-island effect, currently face changes similar to some expected under climate change, such as changes in water availability, heat-related morbidity, or changes in disease patterns. Concern over those impacts might motivate the implementation of measures that would also help in climate adaptation, despite skepticism among some policy makers about anthropogenic global warming. Studies of ancient civilizations in the southwestern US lend some insight into factors that may or may not be important to successful adaptation.

  14. Career adaptability profiles and their relationship to adaptivity and adapting

    OpenAIRE

    Hirschi, Andreas; Valero, Domingo

    2015-01-01

    Research on career adaptability predominantly uses variable-centered approaches that focus on the average effects in terms of the predictors and outcomes within a given sample. Extending this research, the present paper used a person-centered approach to determine whether subgroups with distinct adaptability profiles in terms of concern, control, curiosity and confidence can be identified. We also explored the relationship between the various adaptability profiles and adapting (career plannin...

  15. The R Package MitISEM: Mixture of Student-t Distributions using Importance Sampling Weighted Expectation Maximization for Efficient and Robust Simulation

    NARCIS (Netherlands)

    N. Basturk (Nalan); L.F. Hoogerheide (Lennart); A. Opschoor (Anne); H.K. van Dijk (Herman)

    2012-01-01

    This paper presents the R package MitISEM, which provides an automatic and flexible method to approximate a non-elliptical target density using adaptive mixtures of Student-t densities, where only a kernel of the target density is required. The approximation can be used as a candidate density.
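
    A stripped-down illustration of using a Student-t candidate for importance sampling is given below; MitISEM itself adapts a mixture of Student-t components by importance-sampling-weighted EM, which is not reproduced here, and the banana-shaped target kernel is an arbitrary example.

```python
import numpy as np
from scipy.stats import multivariate_t

# Non-elliptical ("banana"-shaped) target kernel, given only up to a constant.
def log_kernel(x):
    return -0.5 * (x[:, 0] ** 2 / 4.0 + (x[:, 1] + 0.5 * x[:, 0] ** 2 - 2.0) ** 2)

# Candidate: a single Student-t density. (MitISEM adapts a *mixture* of Student-t
# components by importance-sampling-weighted EM; that adaptation step is omitted.)
candidate = multivariate_t(loc=[0.0, 0.0], shape=[[4.0, 0.0], [0.0, 4.0]], df=5)

n = 50_000
x = candidate.rvs(size=n, random_state=0)
log_w = log_kernel(x) - candidate.logpdf(x)
w = np.exp(log_w - log_w.max())
w /= w.sum()

posterior_mean = (w[:, None] * x).sum(axis=0)     # self-normalized IS estimate
ess = 1.0 / np.sum(w ** 2)                        # effective sample size of the weights
print("estimated mean:", np.round(posterior_mean, 3), "   ESS:", int(ess))
```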

  16. Long-term outcomes of primary systemic light chain (AL) amyloidosis in patients treated upfront with bortezomib or lenalidomide and the importance of risk adapted strategies.

    Science.gov (United States)

    Kastritis, Efstathios; Roussou, Maria; Gavriatopoulou, Maria; Migkou, Magdalini; Kalapanida, Despina; Pamboucas, Constantinos; Kaldara, Elisavet; Ntalianis, Argyrios; Psimenou, Erasmia; Toumanidis, Savvas T; Tasidou, Anna; Terpos, Evangelos; Dimopoulos, Meletios A

    2015-04-01

    Bortezomib and lenalidomide are increasingly used in patients with AL amyloidosis, but long term data on their use as primary therapy in AL amyloidosis are lacking while early mortality remains significant. Thus, we analyzed the long term outcomes of 85 consecutive unselected patients, which received primary therapy with bortezomib or lenalidomide and we prospectively evaluated a risk adapted strategy based on bortezomib/dexamethasone to reduce early mortality. Twenty-six patients received full-dose bortezomib/dexamethasone, 36 patients lenalidomide with oral cyclophosphamide and low-dose dexamethasone and 23 patients received bortezomib/dexamethasone at a dose and schedule adjusted to the risk of early death. On intent to treat, 67% of patients achieved a hematologic response (24% hemCRs) and 34% an organ response; both were more frequent with bortezomib. An early death occurred in 20%: in 36% of those treated with full-dose bortezomib/dexamethasone, in 22% of lenalidomide-treated patients but only in 4.5% of patients treated with risk-adapted bortezomib/dexamethasone. Activity of full vs. adjusted dose bortezomib/dexamethasone was similar; twice weekly vs. weekly administration of bortezomib also had similar activity. After a median follow up of 57 months, median survival is 47 months and is similar for patients treated with bortezomib vs. lenalidomide-based regimens. However, risk adjusted-bortezomib/dexamethasone was associated with improved 1-year survival vs. full-dose bortezomib/dexamethasone or lenalidomide-based therapy (81% vs. 56% vs. 53%, respectively). In conclusion, risk-adapted bortezomib/dexamethasone may reduce early mortality and preserve activity while long term follow up indicates that remissions obtained with lenalidomide or bortezomib may be durable, even without consolidation with alkylators. © 2015 Wiley Periodicals, Inc.

  17. Defining “Normophilic” and “Paraphilic” Sexual Fantasies in a Population‐Based Sample: On the Importance of Considering Subgroups

    Science.gov (United States)

    2015-01-01

    criteria for paraphilia are too inclusive. Suggestions are given to improve the definition of pathological sexual interests, and the crucial difference between SF and sexual interest is underlined. Joyal CC. Defining “normophilic” and “paraphilic” sexual fantasies in a population‐based sample: On the importance of considering subgroups. Sex Med 2015;3:321–330. PMID:26797067

  18. Stochastic Engine Final Report: Applying Markov Chain Monte Carlo Methods with Importance Sampling to Large-Scale Data-Driven Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Glaser, R E; Johannesson, G; Sengupta, S; Kosovic, B; Carle, S; Franz, G A; Aines, R D; Nitao, J J; Hanley, W G; Ramirez, A L; Newmark, R L; Johnson, V M; Dyer, K M; Henderson, K A; Sugiyama, G A; Hickling, T L; Pasyanos, M E; Jones, D A; Grimm, R J; Levine, R A

    2004-03-11

    Accurate prediction of complex phenomena can be greatly enhanced through the use of data and observations to update simulations. The ability to create these data-driven simulations is limited by error and uncertainty in both the data and the simulation. The stochastic engine project addressed this problem through the development and application of a family of Markov Chain Monte Carlo methods utilizing importance sampling driven by forward simulators to minimize time spent searching very large state spaces. The stochastic engine rapidly chooses among a very large number of hypothesized states and selects those that are consistent (within error) with all the information at hand. Predicted measurements from the simulator are used to estimate the likelihood of actual measurements, which in turn reduces the uncertainty in the original sample space via a conditional probability method called Bayesian inferencing. This highly efficient, staged Metropolis-type search algorithm allows us to address extremely complex problems and opens the door to solving many data-driven, nonlinear, multidimensional problems. A key challenge has been developing representation methods that integrate the local details of real data with the global physics of the simulations, enabling supercomputers to efficiently solve the problem. Development focused on large-scale problems, and on examining the mathematical robustness of the approach in diverse applications. Multiple data types were combined with large-scale simulations to evaluate systems with approximately 10^20,000 possible states (detecting underground leaks at the Hanford waste tanks). The probable uses of chemical process facilities were assessed using an evidence-tree representation and in-process updating. Other applications included contaminant flow paths at the Savannah River Site, locating structural flaws in buildings, improving models for seismic travel time systems used to monitor nuclear proliferation, characterizing the source
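
    The basic ingredient, a Metropolis-type sampler whose likelihood compares a forward simulator's predicted measurements with observed data, can be sketched as follows. The toy exponential-decay "simulator", the Gaussian error model and the flat prior are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical forward simulator: predicts measurements from a two-parameter state.
def forward(theta, t):
    return theta[0] * np.exp(-theta[1] * t)

t_obs = np.linspace(0.0, 5.0, 20)
theta_true = np.array([2.0, 0.7])
y_obs = forward(theta_true, t_obs) + rng.normal(scale=0.05, size=t_obs.size)

def log_likelihood(theta, sigma=0.05):
    residual = y_obs - forward(theta, t_obs)        # simulator output vs. observations
    return -0.5 * np.sum((residual / sigma) ** 2)

def log_prior(theta):
    return 0.0 if np.all((theta > 0.0) & (theta < 10.0)) else -np.inf   # flat prior on a box

# Metropolis random walk: propose a nearby state, accept with the posterior ratio,
# so the chain concentrates on states consistent (within error) with the data.
theta = np.array([1.0, 1.0])
log_post = log_likelihood(theta) + log_prior(theta)
chain = []
for _ in range(20_000):
    proposal = theta + rng.normal(scale=0.05, size=2)
    log_post_prop = log_likelihood(proposal) + log_prior(proposal)
    if np.log(rng.random()) < log_post_prop - log_post:
        theta, log_post = proposal, log_post_prop
    chain.append(theta)
chain = np.array(chain[5000:])                       # discard burn-in
print("posterior mean:", np.round(chain.mean(axis=0), 3), "  true:", theta_true)
```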

  19. Hemoglobin and testosterone: importance on high altitude acclimatization and adaptation

    Directory of Open Access Journals (Sweden)

    Gustavo F. Gonzales

    2011-03-01

    Full Text Available The different types of mechanisms that the organism uses when facing a situation of hypoxia include accommodation, acclimatization and adaptation. Accommodation is the initial response to acute exposure to high-altitude hypoxia and is characterized by an increase in ventilation and heart rate. Acclimatization occurs in individuals who are temporarily exposed to high altitude and, to a certain degree, allows them to tolerate the altitude. In this phase there is an increase in erythropoiesis, the hemoglobin concentration rises and the oxygen transport capacity improves. Adaptation is the process of natural acclimatization in which genetic variation and acclimatization come into play, allowing individuals to live without difficulty at high altitude. Testosterone, a hormone that regulates erythropoiesis and ventilation, could be associated with the processes of acclimatization and adaptation to high altitude. The excessive erythrocytosis that leads to chronic mountain sickness is caused by low arterial oxygen saturation, ventilatory inefficiency and a reduced ventilatory response to hypoxia. Testosterone increases during acute exposure to high altitude and in high-altitude natives with excessive erythrocytosis. The results of current research suggest that increases in testosterone and hemoglobin are good for acquired acclimatization, since they improve oxygen transport, but not for adaptation to high altitude, given that high serum testosterone values are associated with excessive erythrocytosis.

  20. Thriving While Engaging in Risk? Examining Trajectories of Adaptive Functioning, Delinquency, and Substance Use in a Nationally Representative Sample of U.S. Adolescents

    Science.gov (United States)

    Warren, Michael T.; Wray-Lake, Laura; Rote, Wendy M.; Shubert, Jennifer

    2016-01-01

    Recent advances in positive youth development theory and research explicate complex associations between adaptive functioning and risk behavior, acknowledging that high levels of both co-occur in the lives of some adolescents. However, evidence on nuanced overlapping developmental trajectories of adaptive functioning and risk has been limited to 1…

  1. Evolution of the MIDTAL microarray: the adaption and testing of oligonucleotide 18S and 28S rDNA probes and evaluation of subsequent microarray generations with Prymnesium spp. cultures and field samples.

    Science.gov (United States)

    McCoy, Gary R; Touzet, Nicolas; Fleming, Gerard T A; Raine, Robin

    2015-07-01

    The toxic microalgal species Prymnesium parvum and Prymnesium polylepis are responsible for numerous fish kills causing economic stress on the aquaculture industry and, through the consumption of contaminated shellfish, can potentially impact on human health. Monitoring of toxic phytoplankton is traditionally carried out by light microscopy. However, molecular methods of identification and quantification are becoming more common place. This study documents the optimisation of the novel Microarrays for the Detection of Toxic Algae (MIDTAL) microarray from its initial stages to the final commercial version now available from Microbia Environnement (France). Existing oligonucleotide probes used in whole-cell fluorescent in situ hybridisation (FISH) for Prymnesium species from higher group probes to species-level probes were adapted and tested on the first-generation microarray. The combination and interaction of numerous other probes specific for a whole range of phytoplankton taxa also spotted on the chip surface caused high cross reactivity, resulting in false-positive results on the microarray. The probe sequences were extended for the subsequent second-generation microarray, and further adaptations of the hybridisation protocol and incubation temperatures significantly reduced false-positive readings from the first to the second-generation chip, thereby increasing the specificity of the MIDTAL microarray. Additional refinement of the subsequent third-generation microarray protocols with the addition of a poly-T amino linker to the 5' end of each probe further enhanced the microarray performance but also highlighted the importance of optimising RNA labelling efficiency when testing with natural seawater samples from Killary Harbour, Ireland.

  2. Supporting Adaptive and Adaptable Hypermedia Presentation Semantics

    NARCIS (Netherlands)

    Bulterman, D.C.A.; Rutledge, L.; Hardman, L.; Ossenbruggen, J.R. van

    1999-01-01

    Having the content of a presentation adapt to the needs, resources and prior activities of a user can be an important benefit of electronic documents. While part of this adaptation is related to the encodings of individual data streams, much of the adaptation can/should be guided by the semantics in

  3. Gamma-hydroxybutyric acid endogenous production and post-mortem behaviour - the importance of different biological matrices, cut-off reference values, sample collection and storage conditions.

    Science.gov (United States)

    Castro, André L; Dias, Mário; Reis, Flávio; Teixeira, Helena M

    2014-10-01

    Gamma-Hydroxybutyric Acid (GHB) is an endogenous compound with a history of clinical use since the 1960s. However, due to its secondary effects, it has become a controlled substance, entering the illicit market for recreational and "dance club scene" use, muscle enhancement purposes and drug-facilitated sexual assaults. Its endogenous nature can create difficulties when interpreting, in a forensic context, the analytical values obtained in biological samples. This manuscript reviews several crucial aspects of GHB evaluation in forensic toxicology, such as its post-mortem behaviour in biological samples; endogenous production values, both in vivo and in post-mortem samples; sampling and storage conditions (including stability tests); and cut-off reference values for different biological samples, such as whole blood, plasma, serum, urine, saliva, bile, vitreous humour and hair. This review highlights the need for specific sampling care, storage conditions, and careful interpretation of cut-off reference values in different biological samples, which is essential for proper practical application in forensic toxicology. Copyright © 2014 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.

  4. Low incidence of clonality in cold water corals revealed through the novel use of standardized protocol adapted to deep sea sampling

    Science.gov (United States)

    Becheler, Ronan; Cassone, Anne-Laure; Noel, Philippe; Mouchel, Olivier; Morrison, Cheryl; Arnaud-Haond, Sophie

    2016-01-01

    Sampling in the deep sea is a technical challenge, which has hindered the acquisition of robust datasets that are necessary to determine the fine-grained biological patterns and processes that may shape genetic diversity. Estimates of the extent of clonality in deep-sea species, despite the importance of clonality in shaping the local dynamics and evolutionary trajectories, have been largely obscured by such limitations. Cold-water coral reefs along European margins are formed mainly by two reef-building species, Lophelia pertusa and Madrepora oculata. Here we present a fine-grained analysis of the genotypic and genetic composition of reefs occurring in the Bay of Biscay, based on an innovative deep-sea sampling protocol. This strategy was designed to be standardized, random, and allowed the georeferencing of all sampled colonies. Clonal lineages discriminated through their Multi-Locus Genotypes (MLG) at 6–7 microsatellite markers could thus be mapped to assess the level of clonality and the spatial spread of clonal lineages. High values of clonal richness were observed for both species across all sites suggesting a limited occurrence of clonality, which likely originated through fragmentation. Additionally, spatial autocorrelation analysis underlined the possible occurrence of fine-grained genetic structure in several populations of both L. pertusa and M. oculata. The two cold-water coral species examined had contrasting patterns of connectivity among canyons, with among-canyon genetic structuring detected in M. oculata, whereas L. pertusa was panmictic at the canyon scale. This study exemplifies that a standardized, random and georeferenced sampling strategy, while challenging, can be applied in the deep sea, and associated benefits outlined here include improved estimates of fine grained patterns of clonality and dispersal that are comparable across sites and among species.

  5. The Importance of Sampling Strategies on AMS Determination of Dykes II. Further Examples from the Kapaa Quarry, Koolau Volcano, Oahu, Hawaii

    Science.gov (United States)

    Mendoza-Borunda, R.; Herrero-Bervera, E.; Canon-Tapia, E.

    2012-12-01

    Recent work has suggested the convenience of dyke sampling along several profiles parallel and perpendicular to its walls to increase the probability of determining a geologically significant magma flow direction using anisotropy of magnetic susceptibility (AMS) measurements. For this work, we have resampled in great detail some dykes from the Kapaa Quarry, Koolau Volcano in Oahu Hawaii, comparing the results of a more detailed sampling scheme with those obtained previously with a traditional sampling scheme. In addition to the AMS results we will show magnetic properties, including magnetic grain sizes, Curie points and AMS measured at two different frequencies on a new MFK1-FA Spinner Kappabridge. Our results thus far provide further empirical evidence supporting the occurrence of a definite cyclic fabric acquisition during the emplacement of at least some of the dykes. This cyclic behavior can be captured using the new sampling scheme, but might be easily overlooked if the simple, more traditional sampling scheme is used. Consequently, previous claims concerning the advantages of adopting a more complex sampling scheme are justified since this approach can serve to reduce the uncertainty in the interpretation of AMS results.

  6. "Como Se Dice HIV?" Adapting Human Immunodeficiency Virus Prevention Messages to Reach Homosexual and Bisexual Hispanic Men: The Importance of Hispanic Cultural and Health Beliefs.

    Science.gov (United States)

    Bowdy, Matthew A.

    HIV/AIDS prevention messages catered to Anglo homosexual/bisexual men are not effective in teaching preventative behaviors to Hispanic homosexual/bisexual men. Hispanic sociocultural traits associated with homosexuality and bisexuality prevent the effectiveness of these messages. The Hispanic family is also extremely important in influencing…

  7. Adjusting for outcome misclassification: the importance of accounting for case-control sampling and other forms of outcome-related selection.

    Science.gov (United States)

    Jurek, Anne M; Maldonado, George; Greenland, Sander

    2013-03-01

    Special care must be taken when adjusting for outcome misclassification in case-control data. Basic adjustment formulas using either sensitivity and specificity or predictive values (as with external validation data) do not account for the fact that controls are sampled from a much larger pool of potential controls. A parallel problem arises in surveys and cohort studies in which participation or loss is outcome related. We review this problem and provide simple methods to adjust for outcome misclassification in case-control studies, and illustrate the methods in a case-control birth certificate study of cleft lip/palate and maternal cigarette smoking during pregnancy. Adjustment formulas for outcome misclassification that ignore case-control sampling can yield severely biased results. In the data we examined, the magnitude of error caused by not accounting for sampling is small when population sensitivity and specificity are high, but increases as (1) population sensitivity decreases, (2) population specificity decreases, and (3) the magnitude of the differentiality increases. Failing to account for case-control sampling can result in an odds ratio adjusted for outcome misclassification that is either too high or too low. One needs to account for outcome-related selection (such as case-control sampling) when adjusting for outcome misclassification using external information. Copyright © 2013 Elsevier Inc. All rights reserved.
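
    One way to make the control sampling explicit when back-correcting a 2x2 table for nondifferential outcome misclassification is sketched below: per exposure stratum, the classified case count and the control count divided by a known sampling fraction are treated as the classified source-population counts, and the misclassification system is solved. This is a sketch of the general idea under stated assumptions, not the estimator derived in the paper, and all numbers are hypothetical (constructed so that the true odds ratio is about 2).

```python
import numpy as np

def corrected_odds_ratio(obs_cases, obs_controls, se, sp, control_sampling_fraction):
    """obs_cases / obs_controls: [exposed, unexposed] counts of classified cases and of
    sampled controls. Controls are assumed to be a known fraction f of everyone classified
    as a non-case, so c / f estimates the classified non-case count in the source
    population; the 2x2 misclassification system is then solved per exposure stratum."""
    f = control_sampling_fraction
    true_cases, true_noncases = [], []
    for a_star, c in zip(obs_cases, obs_controls):
        # classified cases     = Se*A + (1-Sp)*B  = a_star
        # classified non-cases = (1-Se)*A + Sp*B  = c / f
        m = np.array([[se, 1.0 - sp],
                      [1.0 - se, sp]])
        a, b = np.linalg.solve(m, np.array([a_star, c / f]))
        true_cases.append(a)
        true_noncases.append(b)
    return (true_cases[0] / true_noncases[0]) / (true_cases[1] / true_noncases[1])

# Hypothetical counts (exposed, unexposed), control sampling fraction f = 1%.
obs_cases = [195, 510]        # classified cases
obs_controls = [300, 1198]    # sampled controls
naive_or = (obs_cases[0] / obs_controls[0]) / (obs_cases[1] / obs_controls[1])
adj_or = corrected_odds_ratio(obs_cases, obs_controls, se=0.90, sp=0.998,
                              control_sampling_fraction=0.01)
print(f"naive OR = {naive_or:.2f}, misclassification-adjusted OR = {adj_or:.2f}")
```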

  8. Sampling algorithms

    CERN Document Server

    Tillé, Yves

    2006-01-01

    Important progress in sampling methods has been achieved. This book draws up an inventory of methods that can be useful for selecting samples. Forty-six sampling methods are described in the framework of general theory. This book is suitable for experienced statisticians who are familiar with the theory of survey sampling.

  9. Untargeted metabolomic analysis of human serum samples associated with exposure levels of Persistent organic pollutants indicate important perturbations in Sphingolipids and Glycerophospholipids levels.

    Science.gov (United States)

    Carrizo, Daniel; Chevallier, Olivier P; Woodside, Jayne V; Brennan, Sarah F; Cantwell, Marie M; Cuskelly, Geraldine; Elliott, Christopher T

    2017-02-01

    Persistent organic pollutants (POPs) are distributed globally and are associated with adverse health effects in humans. A study combining gas chromatography-mass spectrometry (GC-MS), high resolution mass spectrometry (UPLC-QTof-MS) and chemometrics for the analysis of adult human serum samples was undertaken. Levels of serum POPs found were in the low range of what has been reported in similar populations across Europe (median 33.84 p,p'-DDE, 3.02 HCB, 83.55 β-HCH, 246.62 PCBs ng/g lipids). Results indicated that compound concentrations were significantly different between the two groups of POPs exposure (high vs low) and classes (DDE, β-HCH, HCB, PCBs). Using orthogonal partial least-squares discriminant analysis (OPLS-DA), multivariate models were created for both modes of acquisition and POPs classes, explaining the maximum amount of variation between sample groups (positive mode R2 = 98-90%, Q2 = 94-75%, root mean squared error of validation (RMSEV) = 12-20%; negative mode R2 = 98-91%, Q2 = 94-81%, RMSEV = 10-19%). In the serum samples analyzed, a total of 3076 and 3121 ions of interest were detected in positive and negative mode respectively. Of these, 40 were found to be significantly different (p < 0.05) between exposure levels. Sphingolipid and glycerophospholipid families were identified and found to differ significantly (p < 0.05) between high and low POPs exposure levels. This study has shown that such metabolomic fingerprints may have the potential to serve as biomarkers of POPs exposure. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. Modeling the uptake of semivolatile organic compounds by passive air samplers: importance of mass transfer processes within the porous sampling media.

    Science.gov (United States)

    Zhang, Xianming; Wania, Frank

    2012-09-04

    Air sampling based on diffusion of target molecules from the atmospheric gas phase to passive sampling media (PSMs) is currently modeled using the two-film approach. Originally developed to describe chemical exchange between air and water, it assumes a uniform chemical distribution in the bulk phases on either side of the interfacial films. Although such an assumption may be satisfied when modeling uptake in PSMs in which chemicals have high mobility, its validity is questionable for PSMs such as polyurethane foam disks and XAD-resin packed mesh cylinders. Mass transfer of chemicals through the PSMs may be subject to a large resistance because of the low mass fraction of gas-phase chemicals in the pores, where diffusion occurs. Here we present a model that does not assume that chemicals distribute uniformly in the PSMs. It describes the sequential diffusion of vapors through a stagnant air-side boundary layer and the PSM pores, and the reversible sorption onto the PSM. Sensitivity analyses reveal the potential influence of the latter two processes on passive sampling rates (PSRs) unless the air-side boundary layer is assumed to be extremely thick (i.e., representative of negligible wind speeds). The model also reveals that the temperature dependence of PSRs, differences in PSRs between different compounds, and a two-stage uptake, all observed in field calibrations, can be attributed to those mass transfer processes within the PSM. The kinetics of chemical sorption to the PSM from the gas phase in the macro-pores is a knowledge gap that needs to be addressed before the model can be applied to specific compounds.
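
    The qualitative effect of an additional resistance inside the sampling medium can be illustrated with a simple uptake model in which the air-side and PSM-side mass-transfer coefficients act as resistances in series; this is not the authors' model, and every parameter value below is an assumption chosen only to give plausible orders of magnitude.

```python
import numpy as np

def uptake_curve(days, k_air, k_psm, area, volume, K_psm_air, c_air, dt=0.01):
    """Uptake of a gas-phase compound into a passive sampling medium with the air-side
    and PSM-side mass-transfer coefficients treated as resistances in series:
    dM/dt = k_eff * A * (C_air - M / (K * V)).  All values here are illustrative."""
    k_eff = 1.0 / (1.0 / k_air + 1.0 / k_psm)
    t = np.arange(0.0, days, dt)
    mass = np.zeros_like(t)
    for i in range(1, t.size):
        flux = k_eff * area * (c_air - mass[i - 1] / (K_psm_air * volume))
        mass[i] = mass[i - 1] + flux * dt
    return t, mass

# Hypothetical PUF-disk-like parameters: 1 ng/m3 in air, 0.037 m2 sampler area,
# 2.1e-4 m3 sampler volume, PSM-air partition coefficient 1e6.
common = dict(days=100, k_air=100.0, area=0.037, volume=2.1e-4, K_psm_air=1e6, c_air=1.0)
t, m_air_only = uptake_curve(k_psm=1e9, **common)    # negligible PSM-side resistance
_, m_with_psm = uptake_curve(k_psm=50.0, **common)   # significant PSM-side resistance

# Effective sampling rate (m3/day) over the early, near-linear uptake phase (day 10).
i = int(10.0 / 0.01)
print("PSR, air-side resistance only:", round(m_air_only[i] / (t[i] * 1.0), 2), "m3/day")
print("PSR, plus PSM-side resistance:", round(m_with_psm[i] / (t[i] * 1.0), 2), "m3/day")
```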

  11. The importance of correcting for variable probe-sample interactions in AFM-IR spectroscopy: AFM-IR of dried bacteria on a polyurethane film.

    Science.gov (United States)

    Barlow, Daniel E; Biffinger, Justin C; Cockrell-Zugell, Allison L; Lo, Michael; Kjoller, Kevin; Cook, Debra; Lee, Woo Kyung; Pehrsson, Pehr E; Crookes-Goodson, Wendy J; Hung, Chia-Suei; Nadeau, Lloyd J; Russell, John N

    2016-08-02

    AFM-IR is a combined atomic force microscopy-infrared spectroscopy method that shows promise for nanoscale chemical characterization of biological-materials interactions. In an effort to apply this method to quantitatively probe mechanisms of microbiologically induced polyurethane degradation, we have investigated monolayer clusters of ∼200 nm thick Pseudomonas protegens Pf-5 bacteria (Pf) on a 300 nm thick polyether-polyurethane (PU) film. Here, the impact of the different biological and polymer mechanical properties on the thermomechanical AFM-IR detection mechanism was first assessed without the additional complication of polymer degradation. AFM-IR spectra of Pf and PU were compared with FTIR and showed good agreement. Local AFM-IR spectra of Pf on PU (Pf-PU) exhibited bands from both constituents, showing that AFM-IR is sensitive to chemical composition both at and below the surface. One distinct difference in local AFM-IR spectra on Pf-PU was an anomalous ∼4× increase in IR peak intensities for the probe in contact with Pf versus PU. This was attributed to differences in probe-sample interactions. In particular, significantly higher cantilever damping was observed for probe contact with PU, with a ∼10× smaller Q factor. AFM-IR chemical mapping at single wavelengths was also affected. We demonstrate ratioing of mapping data for chemical analysis as a simple method to cancel the extreme effects of the variable probe-sample interactions.

  12. How Stable Is the Personal Past? Stability of Most Important Autobiographical Memories and Life Narratives Across Eight Years in a Life Span Sample.

    Science.gov (United States)

    Köber, Christin; Habermas, Tilmann

    2017-03-23

    Considering life stories as the most individual layer of personality (McAdams, 2013) implies that life stories, similar to personality traits, exhibit some stability throughout life. Although stability of personality traits has been extensively investigated, only little is known about the stability of life stories. We therefore tested the influence of age, of the proportion of normative age-graded life events, and of global text coherence on the stability of the most important memories and of brief entire life narratives as 2 representations of the life story. We also explored whether normative age-graded life events form more stable parts of life narratives. In a longitudinal life span study covering up to 3 measurements across 8 years and 6 age groups (N = 164) the stability of important memories and of entire life narratives was measured as the percentage of events and narrative segments which were repeated in later tellings. Stability increased between ages 8 and 24, leveling off in middle adulthood. Beyond age, stability of life narratives was also predicted by proportion of normative age-graded life events and by causal-motivational text coherence in younger participants. Memories of normative developmental and social transitional life events were more stable than other memories. Stability of segments of life narratives exceeded the stability of single most important memories. Findings are discussed in terms of cognitive, personality, and narrative psychology and point to research questions in each of these fields. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  13. Signal sampling circuit

    NARCIS (Netherlands)

    Louwsma, S.M.; Vertregt, Maarten

    2010-01-01

    A sampling circuit for sampling a signal is disclosed. The sampling circuit comprises a plurality of sampling channels adapted to sample the signal in time-multiplexed fashion, each sampling channel comprising a respective track-and-hold circuit connected to a respective analogue to digital converter.

  14. Signal sampling circuit

    NARCIS (Netherlands)

    Louwsma, S.M.; Vertregt, Maarten

    2011-01-01

    A sampling circuit for sampling a signal is disclosed. The sampling circuit comprises a plurality of sampling channels adapted to sample the signal in time-multiplexed fashion, each sampling channel comprising a respective track-and-hold circuit connected to a respective analogue to digital converter.

  15. Signal sampling circuit

    NARCIS (Netherlands)

    Louwsma, Simon Minze; Vertregt, Maarten

    2011-01-01

    A sampling circuit for sampling a signal is disclosed. The sampling circuit comprises a plurality of sampling channels adapted to sample the signal in time-multiplexed fashion, each sampling channel comprising a respective track-and-hold circuit connected to a respective analogue to digital converter.

  16. Signal sampling circuit

    NARCIS (Netherlands)

    Louwsma, Simon Minze; Vertregt, Maarten

    2010-01-01

    A sampling circuit for sampling a signal is disclosed. The sampling circuit comprises a plurality of sampling channels adapted to sample the signal in time-multiplexed fashion, each sampling channel comprising a respective track-and-hold circuit connected to a respective analogue to digital converter.

  17. The relative importance of body change strategies, weight perception, perceived social support, and self-esteem on adolescent depressive symptoms: longitudinal findings from a national sample.

    Science.gov (United States)

    Rawana, Jennine S

    2013-07-01

    This study aimed to evaluate the relative importance of body change strategies and weight perception in adolescent depression after accounting for established risk factors for depression, namely low social support across key adolescent contexts. The moderating effect of self-esteem was also examined. Participants (N=4587, 49% female) were selected from the National Longitudinal Study of Adolescent Health. Regression analyses were conducted on the association between well-known depression risk factors (lack of perceived support from parents, peers, and schools), body change strategies, weight perception, and adolescent depressive symptoms one year later. Each well-known risk factor significantly predicted depressive symptoms. Body change strategies related to losing weight and overweight perceptions predicted depressive symptoms above and beyond established risk factors. Self-esteem moderated the relationship between trying to lose weight and depressive symptoms. Maladaptive weight loss strategies and overweight perceptions should be addressed in early identification depression programs. Copyright © 2013 Elsevier Inc. All rights reserved.

  18. Employee and job attributes as predictors of absenteeism in a national sample of workers: the importance of health and dangerous working conditions.

    Science.gov (United States)

    Leigh, J P

    1991-01-01

    This study reports on research which looks for employee and job characteristics which correlate with absenteeism. A large cross-sectional national probability sample of workers employed for at least 20 hr per week is analyzed (n = 1308). The dependent variable is the number of self-reported absences during the past 14 days. Thirty-seven independent variables are considered. Ordinary Least Squares (multiple regressions), two-limit Tobits, and two-part models are used to assess the statistical and practical significance of possible covariates. Statistically significant predictors included health variables such as being overweight, complaining of insomnia, and hazardous working conditions; job characteristics such as inflexible hours; and personal variables such as being a mother with small children. Variables reflecting dangerous working conditions appear to be the strongest correlates of absenteeism. Notable variables which do not predict absenteeism include age, race, wages, and job satisfaction. Future research should direct attention toward workers' health and working conditions as covariates of absenteeism, since they are strongly significant in this study and have been neglected by most absenteeism investigators.

  19. Global dust attenuation in disc galaxies: strong variation with specific star formation and stellar mass, and the importance of sample selection

    Science.gov (United States)

    Devour, Brian M.; Bell, Eric F.

    2016-06-01

    We study the relative dust attenuation-inclination relation in 78 721 nearby galaxies using the axis ratio dependence of optical-near-IR colour, as measured by the Sloan Digital Sky Survey, the Two Micron All Sky Survey, and the Wide-field Infrared Survey Explorer. In order to avoid to the greatest extent possible attenuation-driven biases, we carefully select galaxies using dust attenuation-independent near- and mid-IR luminosities and colours. Relative u-band attenuation between face-on and edge-on disc galaxies along the star-forming main sequence varies from ˜0.55 mag up to ˜1.55 mag. The strength of the relative attenuation varies strongly with both specific star formation rate and galaxy luminosity (or stellar mass). The dependence of relative attenuation on luminosity is not monotonic, but rather peaks at M3.4 μm ≈ -21.5, corresponding to M* ≈ 3 × 1010 M⊙. This behaviour stands seemingly in contrast to some older studies; we show that older works failed to reliably probe to higher luminosities, and were insensitive to the decrease in attenuation with increasing luminosity for the brightest star-forming discs. Back-of-the-envelope scaling relations predict the strong variation of dust optical depth with specific star formation rate and stellar mass. More in-depth comparisons using the scaling relations to model the relative attenuation require the inclusion of star-dust geometry to reproduce the details of these variations (especially at high luminosities), highlighting the importance of these geometrical effects.

  20. Human Cytomegalovirus and Human Umbilical Vein Endothelial Cells: Restriction of Primary Isolation to Blood Samples and Susceptibilities of Clinical Isolates from Other Sources to Adaptation

    OpenAIRE

    2002-01-01

    In immunocompromised patients with disseminated infection, human cytomegalovirus (HCMV) is widespread in the microvascular endothelium of multiple organs. Human umbilical vein endothelial cells (HUVEC) were used in parallel to human embryonic lung fibroblasts (HELF) to recover HCMV from blood samples of immunocompromised patients. Using the shell vial technique, comparable median numbers of p72-positive HUVEC and HELF cells were found with the 26 HCMV-positive buffy coat samples out of 150 ex...

  1. Importance Sampling for Stochastic Timed Automata

    DEFF Research Database (Denmark)

    Jegourel, Cyrille; Larsen, Kim Guldstrand; Legay, Axel

    2016-01-01

    -wise change of measure is then applied on-the-fly during simulations, ensuring that dead ends are never reached. The change of measure is guaranteed by construction to reduce the variance of the estimator with respect to crude Monte Carlo, while experimental results demonstrate that we can achieve substantial...

  2. Semantic Importance Sampling for Statistical Model Checking

    Science.gov (United States)

    2015-01-16

    approach called Statistical Model Checking (SMC) [16], which relies on Monte Carlo-based simulations to solve this verification task more scalably... Conclusion: Statistical model checking (SMC) is a prominent approach for rigorous analysis of stochastic systems using Monte Carlo simulations. In this... Monte Carlo simulations, for computing the bounded probability that a specific event occurs during a stochastic system's execution. Estimating the

  3. Robust Adaptive Photon Tracing using Photon Path Visibility

    DEFF Research Database (Denmark)

    Hachisuka, Toshiya; Jensen, Henrik Wann

    2011-01-01

    We present a new adaptive photon tracing algorithm which can handle illumination settings that are considered difficult for photon tracing approaches such as outdoor scenes, close-ups of a small part of an illuminated region, and illumination coming through a small gap. The key contribution in our...... algorithm is the use of visibility of photon path as the importance function which ensures that our sampling algorithm focuses on paths that are visible from the given viewpoint. Our sampling algorithm builds on two recent developments in Markov chain Monte Carlo methods: adaptive Markov chain sampling...... and replica exchange. Using these techniques, each photon path is adaptively mutated and it explores the sampling space efficiently without being stuck at a local peak of the importance function. We have implemented this sampling approach in the progressive photon mapping algorithm which provides visibility...

  4. Adaptation and Psychometric Properties of the Self-Efficacy/Social Support for Activity for Persons with Intellectual Disability Scale (SE/SS-AID) in a Spanish Sample

    Science.gov (United States)

    Cuesta-Vargas, Antonio Ignacio; Paz-Lourido, Berta; Lee, Miyoung; Peterson-Besse, Jana J.

    2013-01-01

    Background: In this study we aimed to develop a Spanish version of the Self-Efficacy/Social Support Scales for Activity for persons with Intellectual Disability (SE/SS-AID). Method: A cross-sectional study was carried out in a sample of 117 individuals with intellectual disability (ID). The SE/SS-AID scales were translated into Spanish and their…

  5. Intercepting tail importance sampling method of structural reliability based on the Cauchy inequality

    Institute of Scientific and Technical Information of China (English)

    房艳峰; 高华喜

    2013-01-01

    Cauchy's inequality is used to analyze and evaluate the variance of Monte Carlo simulation results when an importance sampling technique is involved. The concept of a medium probability is introduced and its relationship with the variance of the simulation result is deduced. On this basis, the sampling function that minimizes the simulation variance is derived, and annular and semi-annular intercepting-tail importance sampling methods are constructed. A distinctive feature of these methods is that the ratio of the sampling function to the original distribution function is constant, and the sampling points are distributed in the region outside a sphere centred at the origin whose radius equals the distance from the origin to the limit state surface; the reduction of the variance of the simulation results can therefore be described quantitatively. Theoretical analysis and the simulated example indicate that the annular intercepting-tail method reduces the variance of the simulation results to about one tenth of that of direct Monte Carlo simulation, and the semi-annular method to about one twentieth.
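
    The constant-likelihood-ratio property described above can be illustrated with a minimal Python sketch (this is not the authors' code; the two-dimensional limit state, the value of beta and the sample size are arbitrary choices for illustration). The proposal density is the standard normal restricted to the region outside the sphere of radius beta, so every sample carries the same weight P(||X|| >= beta):

```python
import math
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2-D reliability problem in standard normal space: failure when
# g(x) <= 0, with the limit state plane at distance beta from the origin, so the
# exact failure probability is Phi(-beta).
beta = 3.0
g = lambda x: beta - x[:, 0]

n = 100_000

# Crude Monte Carlo.
x_mc = rng.standard_normal((n, 2))
pf_mc = np.mean(g(x_mc) <= 0.0)

# "Intercepting tail" importance sampling: the proposal is the standard normal
# density truncated to ||x|| >= beta, so the likelihood ratio f/q is the constant
# P(||X|| >= beta) = exp(-beta^2/2) in two dimensions.  In 2-D, R^2 ~ chi^2(2) is
# exponential with mean 2, so a truncated radius is R^2 = beta^2 + Exp(mean 2).
r = np.sqrt(beta**2 + rng.exponential(scale=2.0, size=n))
theta = rng.uniform(0.0, 2.0 * np.pi, size=n)
x_is = np.column_stack([r * np.cos(theta), r * np.sin(theta)])
weight = math.exp(-beta**2 / 2.0)
pf_is = weight * np.mean(g(x_is) <= 0.0)

exact = 0.5 * math.erfc(beta / math.sqrt(2.0))
print(f"crude MC: {pf_mc:.3e}   tail IS: {pf_is:.3e}   exact: {exact:.3e}")
```

    Because the failure domain lies entirely outside the sphere of radius beta, no probability mass is spent on points that cannot fail, which is where the variance reduction comes from.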

  6. Separation and enrichment of palladium and gold in biological and environmental samples, adapted to the determination by total reflection X-ray fluorescence.

    Science.gov (United States)

    Messerschmidt, J; von Bohlen, A; Alt, F; Klockenkämper, R

    2000-03-01

    The reductive co-precipitation of trace and ultra-trace elements together with mercury followed by complete evaporation of the mercury makes it possible to determine palladium and gold by total reflection X-ray fluorescence. Both elements can be detected without interferences at optimal sensitivity in the pg range. Thus, detection limits of, e.g., 2.5 ng L-1 for palladium and 2.0 ng L-1 for gold, in urine, were obtained. The precision was determined to 0.04 at a palladium concentration of about 200 ng L-1 urine and to 0.19 at a gold concentration of only 18 ng L-1. The recovery for a urine sample spiked with known amounts of palladium and gold amounted to > 95%. Results of the combined procedure are given for the determination of palladium and gold in the urine of non-exposed and occupationally exposed persons and in some other environmentally relevant samples.

  7. Technology transfer for adaptation

    Science.gov (United States)

    Biagini, Bonizella; Kuhl, Laura; Gallagher, Kelly Sims; Ortiz, Claudia

    2014-09-01

    Technology alone will not be able to solve adaptation challenges, but it is likely to play an important role. As a result of the role of technology in adaptation and the importance of international collaboration for climate change, technology transfer for adaptation is a critical but understudied issue. Through an analysis of Global Environment Facility-managed adaptation projects, we find there is significantly more technology transfer occurring in adaptation projects than might be expected given the pessimistic rhetoric surrounding technology transfer for adaptation. Most projects focused on demonstration and early deployment/niche formation for existing technologies rather than earlier stages of innovation, which is understandable considering the pilot nature of the projects. Key challenges for the transfer process, including technology selection and appropriateness under climate change, markets and access to technology, and diffusion strategies are discussed in more detail.

  8. Personality and adaptive performance at work: a meta-analytic investigation.

    Science.gov (United States)

    Huang, Jason L; Ryan, Ann Marie; Zabel, Keith L; Palmer, Ashley

    2014-01-01

    We examined emotional stability, ambition (an aspect of extraversion), and openness as predictors of adaptive performance at work, based on the evolutionary relevance of these traits to human adaptation to novel environments. A meta-analysis on 71 independent samples (N = 7,535) demonstrated that emotional stability and ambition are both related to overall adaptive performance. Openness, however, does not contribute to the prediction of adaptive performance. Analysis of predictor importance suggests that ambition is the most important predictor for proactive forms of adaptive performance, whereas emotional stability is the most important predictor for reactive forms of adaptive performance. Job level (managers vs. employees) moderates the effects of personality traits: Ambition and emotional stability exert stronger effects on adaptive performance for managers as compared to employees. PsycINFO Database Record (c) 2014 APA, all rights reserved

  9. Cross-cultural adaptation of the short-form condom attitude scale: validity assessment in a sub-sample of rural-to-urban migrant workers in Bangladesh.

    Science.gov (United States)

    Roy, Tapash; Anderson, Claire; Evans, Catrin; Rahman, Mohammad Shafiqur; Rahman, Mosiur

    2013-03-19

    The reliable and valid measurement of attitudes towards condom use is essential to assist efforts to design population-specific interventions aimed at promoting positive attitudes towards, and increased use of, condoms. Although several studies, mostly in the English-speaking western world, have demonstrated the utility of condom attitude scales, very few culturally relevant condom attitude measures have been developed to date. We have developed a scale and evaluated its psychometric properties in a sub-sample of rural-to-urban migrant workers in Bangladesh. This paper reports mostly on the cross-sectional survey components of a mixed-methods sexual health research project in Bangladesh. The survey sample (n = 878) comprised rural-to-urban migrant taxi drivers (n = 437) and restaurant workers (n = 441) in Dhaka (aged 18-35 years). The study also involved focus group sessions with the same populations to establish the content validity and cultural equivalency of the scale. The scale was administered as part of a larger sexual health survey questionnaire and consisted of 10 items. Quantitative and qualitative data were assessed with statistical and thematic analysis, respectively. The participants found the scale simple and easy to understand and use. The internal consistency (α) of the scale was 0.89, with high construct validity (the first component accounted for about 52% and the second component for about 20% of the total variance, with an eigenvalue greater than one for both factors). The test-retest reliability (repeatability) was also found satisfactory, with high inter-item correlations (the majority of the intra-class correlation coefficient values were above 2 and significant for all items on the scale). The scale has good metric properties for assessing attitudes toward condom use and is a short, simple and reliable instrument for measuring attitudes towards condom use in vulnerable populations such as the current study population.

  10. Resilience through adaptation

    Science.gov (United States)

    van Voorn, George A. K.; Ligtenberg, Arend; Molenaar, Jaap

    2017-01-01

    Adaptation of agents through learning or evolution is an important component of the resilience of Complex Adaptive Systems (CAS). Without adaptation, the flexibility of such systems to cope with outside pressures would be much lower. To study the capabilities of CAS to adapt, social simulations with agent-based models (ABMs) provide a helpful tool. However, the value of ABMs for studying adaptation depends on the availability of methodologies for sensitivity analysis that can quantify resilience and adaptation in ABMs. In this paper we propose a sensitivity analysis methodology that is based on comparing time-dependent probability density functions of output of ABMs with and without agent adaptation. The differences between the probability density functions are quantified by the so-called earth-mover’s distance. We use this sensitivity analysis methodology to quantify the probability of occurrence of critical transitions and other long-term effects of agent adaptation. To test the potential of this new approach, it is used to analyse the resilience of an ABM of adaptive agents competing for a common-pool resource. Adaptation is shown to contribute positively to the resilience of this ABM. If adaptation proceeds sufficiently fast, it may delay or avert the collapse of this system. PMID:28196372

  11. Incorporating genetic sampling in long-term monitoring and adaptive management in the San Diego County Management Strategic Plan Area, Southern California

    Science.gov (United States)

    Vandergast, Amy G.

    2017-06-02

    Habitat and species conservation plans usually rely on monitoring to assess progress towards conservation goals. Southern California, USA, is a hotspot of biodiversity and home to many federally endangered and threatened species. Here, several regional multi-species conservation plans have been implemented to balance development and conservation goals, including in San Diego County. In the San Diego County Management Strategic Plan Area (MSPA), a monitoring framework for the preserve system has been developed with a focus on species monitoring, vegetation monitoring, threats monitoring and abiotic monitoring. Genetic sampling over time (genetic monitoring) has proven useful in gathering species presence and abundance data and detecting population trends, particularly related to species and threats monitoring objectives. This report reviews genetic concepts and techniques of genetics that relate to monitoring goals and outlines components of a genetic monitoring scheme that could be applied in San Diego or in other monitoring frameworks throughout the Nation.

  12. Some challenges with statistical inference in adaptive designs.

    Science.gov (United States)

    Hung, H M James; Wang, Sue-Jane; Yang, Peiling

    2014-01-01

    Adaptive designs have generated a great deal of attention to clinical trial communities. The literature contains many statistical methods to deal with added statistical uncertainties concerning the adaptations. Increasingly encountered in regulatory applications are adaptive statistical information designs that allow modification of sample size or related statistical information and adaptive selection designs that allow selection of doses or patient populations during the course of a clinical trial. For adaptive statistical information designs, a few statistical testing methods are mathematically equivalent, as a number of articles have stipulated, but arguably there are large differences in their practical ramifications. We pinpoint some undesirable features of these methods in this work. For adaptive selection designs, the selection based on biomarker data for testing the correlated clinical endpoints may increase statistical uncertainty in terms of type I error probability, and most importantly the increased statistical uncertainty may be impossible to assess.

  13. MOTIVATION IN ADAPTED SPORT

    Directory of Open Access Journals (Sweden)

    Miguel Ángel Torralba

    2014-05-01

    Full Text Available This study examines the motivation for sport practice of people with disabilities who take part in federated sport. The sample was composed of 134 athletes of both genders and with different disabilities. The “Participation Motivation Inventory Questionnaire” by Gill, Gross and Huddleston was used. The instrument was adapted to Paralympic sport and describes the main reasons that encourage sports activity practice. The results showed no significant differences between men and women, or between athletes with blindness or visual impairment, physical disabilities, and motor disabilities. Regarding the motivation for sport practice, it is worth highlighting the importance given to factors of fitness and health, such as practising sport, improving one's level, competing, feeling good and having fun, well above being popular, being influenced by coaches or satisfying one's parents.

  14. Multi-target tracking algorithm based on adaptive sampling interval in wireless sensor networks

    Institute of Scientific and Technical Information of China (English)

    王建平; 赵高丽; 胡孟杰; 陈伟

    2014-01-01

    Multi-target tracking is a hot topic of current research on wireless sensor networks (WSN). We propose a multi-target tracking algorithm based on an adaptive sampling interval in order to save energy and prevent track loss in WSN. We construct the target motion model using position metadata and predict the target motion state with an extended Kalman filter (EKF). We adopt the probability density function (PDF) of the estimated targets to establish the tracking cluster. By defining a tracking centre, we use the Mahalanobis distance to quantify the election of the main node (MN). We compute the impact strength of each target from its importance and its distance to the MN node, and use it to build the tracking algorithm. Simulation experiments in MATLAB show that the proposed algorithm can accurately predict the trajectories of the targets and adjust the sampling interval while the targets are moving. Analysis of the experimental data shows that the proposed algorithm clearly improves tracking precision and saves WSN energy consumption.
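
    As a rough illustration of coupling a filter's predicted uncertainty to the sampling interval (a much-simplified, hypothetical stand-in for the EKF-based scheme described above, with made-up noise levels and interval bounds), consider a constant-velocity Kalman filter whose next measurement time is chosen from the predicted position variance:

```python
import numpy as np

rng = np.random.default_rng(1)

def f_and_q(dt, q=0.5):
    """Constant-velocity transition matrix and process-noise covariance for step dt."""
    F = np.array([[1.0, dt], [0.0, 1.0]])
    Q = q * np.array([[dt**3 / 3.0, dt**2 / 2.0], [dt**2 / 2.0, dt]])
    return F, Q

H = np.array([[1.0, 0.0]])                    # position-only measurements
R = np.array([[0.25]])                        # measurement noise variance (assumed)
dt_min, dt_max, var_target = 0.1, 2.0, 1.0    # hypothetical interval-rule parameters

x_true = np.array([0.0, 1.0])                 # true [position, velocity]
x_est, P = np.array([0.0, 0.0]), np.eye(2)
t, dt = 0.0, 1.0

while t < 20.0:
    # Propagate the true target and take one noisy measurement after dt seconds.
    F, Q = f_and_q(dt)
    x_true = F @ x_true + rng.multivariate_normal(np.zeros(2), Q)
    z = H @ x_true + rng.normal(0.0, np.sqrt(R[0, 0]), size=1)
    t += dt

    # Kalman predict / update (an EKF reduces to this for a linear model).
    x_pred, P_pred = F @ x_est, F @ P @ F.T + Q
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_est = x_pred + (K @ (z - H @ x_pred)).ravel()
    P = (np.eye(2) - K @ H) @ P_pred

    # Adapt the next sampling interval: sample sooner when predicted variance is high.
    dt = float(np.clip(dt * var_target / P_pred[0, 0], dt_min, dt_max))
    print(f"t={t:5.2f}  next dt={dt:4.2f}  position error={abs(x_est[0] - x_true[0]):.3f}")
```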

  15. F-VIPGI: a new adapted version of VIPGI for FORS2 spectroscopy. Application to a sample of 16 X-ray selected galaxy clusters at 0.6 ≤ z ≤ 1.2

    Science.gov (United States)

    Nastasi, A.; Scodeggio, M.; Fassbender, R.; Böhringer, H.; Pierini, D.; Verdugo, M.; Garilli, B. M.; Franzetti, P.

    2013-02-01

    Aims: The goal of this paper is twofold. Firstly, we present F-VIPGI, a new version of the VIMOS Interactive Pipeline and Graphical Interface (VIPGI) adapted to handle FORS2 spectroscopic data taken with the standard instrument configuration. Secondly, we investigate the spectro-photometric properties of a sample of galaxies residing in distant X-ray selected galaxy clusters, the optical spectra of which were reduced with this new pipeline. Methods: We provide basic technical information about the innovations of the new software and refer the reader to the original VIPGI paper for a detailed description of the core functions and performances. As a demonstration of the capabilities of the new pipeline, we then show results obtained for 16 distant (0.65 ≤ z ≤ 1.25) X-ray luminous galaxy clusters selected within the XMM-Newton Distant Cluster Project. We performed a spectral indices analysis of the extracted optical spectra of their members, based on which we created a library of composite high signal-to-noise ratio spectra. We then compared the average spectra of the passive galaxies of our sample with those computed for the same class of objects that reside in the field at similar high redshift and in groups in the local Universe. Finally, We computed the "photometric" properties of our templates and compared them with those of the Coma Cluster galaxies, which we took as representative of the local cluster population. Results: We demonstrate the capabilities of F-VIPGI, whose strength is an increased efficiency and a simultaneous shortening of FORS2 spectroscopic data reduction time by a factor of ~10 w.r.t. the standard IRAF procedures. We then discuss the quality of the final stacked optical spectra and provide them in electronic form as high-quality spectral templates, representative of passive and star-forming galaxies residing in distant galaxy clusters. By comparing the spectro-photometric properties of our templates with the local and distant galaxy

  16. Validation of a simplified field-adapted procedure for routine determinations of methyl mercury at trace levels in natural water samples using species-specific isotope dilution mass spectrometry.

    Science.gov (United States)

    Lambertsson, Lars; Björn, Erik

    2004-12-01

    A field-adapted procedure based on species-specific isotope dilution (SSID) methodology for trace-level determinations of methyl mercury (CH(3)Hg(+)) in mire, fresh and sea water samples was developed, validated and applied in a field study. In the field study, mire water samples were filtered, standardised volumetrically with isotopically enriched CH(3) (200)Hg(+), and frozen on dry ice. The samples were derivatised in the laboratory without further pre-treatment using sodium tetraethyl borate (NaB(C(2)H(5))(4)) and the ethylated methyl mercury was purge-trapped on Tenax columns. The analyte was thermo-desorbed onto a GC-ICP-MS system for analysis. Investigations preceding field application of the method showed that when using SSID, for all tested matrices, identical results were obtained between samples that were freeze-preserved or analysed unpreserved. For DOC-rich samples (mire water) additional experiments showed no difference in CH(3)Hg(+) concentration between samples that were derivatised without pre-treatment or after liquid extraction. Extractions of samples for matrix-analyte separation prior to derivatisation are therefore not necessary. No formation of CH(3)Hg(+) was observed during sample storage and treatment when spiking samples with (198)Hg(2+). Total uncertainty budgets for the field application of the method showed that for analyte concentrations higher than 1.5 pg g(-1) (as Hg) the relative expanded uncertainty (REU) was approximately 5% and dominated by the uncertainty in the isotope standard concentration. Below 0.5 pg g(-1) (as Hg), the REU was >10% and dominated by variations in the field blank. The uncertainty of the method is sufficiently low to accurately determine CH(3)Hg(+) concentrations at trace levels. The detection limit was determined to be 4 fg g(-1) (as Hg) based on replicate analyses of laboratory blanks. The described procedure is reliable, considerably faster and simplified compared to non-SSID methods and thereby very

  17. The purpose of adaptation.

    Science.gov (United States)

    Gardner, Andy

    2017-10-06

    A central feature of Darwin's theory of natural selection is that it explains the purpose of biological adaptation. Here, I: emphasize the scientific importance of understanding what adaptations are for, in terms of facilitating the derivation of empirically testable predictions; discuss the population genetical basis for Darwin's theory of the purpose of adaptation, with reference to Fisher's 'fundamental theorem of natural selection'; and show that a deeper understanding of the purpose of adaptation is achieved in the context of social evolution, with reference to inclusive fitness and superorganisms.

  18. Origins of adaptive immunity.

    Science.gov (United States)

    Liongue, Clifford; John, Liza B; Ward, Alister

    2011-01-01

    Adaptive immunity, involving distinctive antibody- and cell-mediated responses to specific antigens based on "memory" of previous exposure, is a hallmark of higher vertebrates. It has been argued that adaptive immunity arose rapidly, as articulated in the "big bang theory" surrounding its origins, which stresses the importance of coincident whole-genome duplications. Through a close examination of the key molecules and molecular processes underpinning adaptive immunity, this review suggests a less-extreme model, in which adaptive immunity emerged as part of longer evolutionary journey. Clearly, whole-genome duplications provided additional raw genetic materials that were vital to the emergence of adaptive immunity, but a variety of other genetic events were also required to generate some of the key molecules, whereas others were preexisting and simply co-opted into adaptive immunity.

  19. Adaptive Rationality, Adaptive Behavior and Institutions

    Directory of Open Access Journals (Sweden)

    Volchik Vyacheslav, V.

    2015-12-01

    Full Text Available The economic literature focused on understanding decision-making and choice processes reveals a vast collection of approaches to human rationality. Theorists’ attention has moved from absolutely rational, utility-maximizing individuals to boundedly rational and adaptive ones. A number of economists have criticized the concepts of adaptive rationality and adaptive behavior. One of the recent trends in the economic literature is to consider humans irrational. This paper offers an approach which examines adaptive behavior in the context of existing institutions and a constantly changing institutional environment. It is assumed that adaptive behavior is a process of evolutionary adjustment to fundamental uncertainty. We emphasize the importance of actors’ engagement in trial and error learning, since if they are involved in this process, they obtain experience and are able to adapt to existing and new institutions. The paper aims at identifying relevant institutions, adaptive mechanisms, informal working rules and practices that influence actors’ behavior in the field of Higher Education in Russia (the Rostov Region education services market is taken as an example). The paper emphasizes the application of qualitative interpretative methods (interviews and discourse analysis) in examining actors’ behavior.

  20. Relations between psychological separation and adaptation of adolescents

    Directory of Open Access Journals (Sweden)

    Vukelić Marija

    2006-01-01

    Full Text Available The object of this research is the relation between psychological separation-individuation and adaptation to secondary and boarding school, and differences in separation and adaptation. Explorative research was performed on a sample of 586 adolescents aged 14-16. The instruments used were the Psychological Separation Inventory (PSI; Hoffman, 1984) and the Student Adaptation to College Questionnaire (SACQ; Baker & Siryk, 1984). The results showed that adolescents from boarding schools, compared to those who are not separated from their parents during secondary school, have a significantly higher level of separation from both parents, but discriminant analysis showed that adolescents from boarding schools express nostalgia for their parents and want more contact and support from them. Adolescents from boarding schools showed better adaptation in general, but lower emotional adaptation compared to non-separated adolescents. Discriminant analysis also showed that adolescents from boarding schools express low satisfaction with life in the boarding school. The results confirm the hypothesis of a connection between psychological separation from parents and adaptation in adolescence. Canonical correlation analysis showed two statistically significant canonical factors. The first factor shows a significant connection between lower independence and better adaptation, with 23% of variance explained. The second factor indicates a connection between lower functional, emotional and attitudinal independence and better adaptation, with 12% of variance explained. The results are discussed in light of separation-individuation theory and the importance of the meaning of separation from parents for adolescents' adaptation to the demands of secondary school and boarding school.

  1. Adaptive Lighting

    DEFF Research Database (Denmark)

    Petersen, Kjell Yngve; Søndergaard, Karin; Kongshaug, Jesper

    2015-01-01

    Adaptive Lighting Adaptive lighting is based on a partial automation of the possibilities to adjust the colour tone and brightness levels of light in order to adapt to people’s needs and desires. IT support is key to the technical developments that afford adaptive control systems. The possibilities...... offered by adaptive lighting control are created by the ways that the system components, the network and data flow can be coordinated through software so that the dynamic variations are controlled in ways that meaningfully adapt according to people’s situations and design intentions. This book discusses...... differently into an architectural body. We also examine what might occur when light is dynamic and able to change colour, intensity and direction, and when it is adaptive and can be brought into interaction with its surroundings. In short, what happens to an architectural space when artificial lighting ceases...

  2. An importance sampling procedure for estimating failure probabilities of non-linear dynamic systems

    Institute of Scientific and Technical Information of China (English)

    任丽梅; 徐伟; 李战国

    2013-01-01

    The failure probability is one of the most important reliability measures in the structural reliability assessment of dynamical systems. Here, a procedure for estimating failure probabilities of non-linear systems based on the importance sampling technique is presented. Firstly, by using the Rice formula, an equivalent linear version of the non-linear system is derived. The design point of this equivalent linear system is then used to construct the control function. Secondly, an importance sampling technique is used to estimate the first excursion probability of the non-linear system. Finally, a Duffing oscillator is taken as an example. The simulation results show that the proposed method is correct and effective; the number of samples and the computational time are reduced significantly compared with those of direct Monte Carlo simulation.
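
    To make the design-point/control-function idea concrete, here is a hedged Python sketch for a linear single-degree-of-freedom oscillator under discrete Gaussian white noise (all parameters, the threshold and the discretisation are invented for illustration; the published procedure additionally handles non-linear systems through the Rice-formula linearization). The excitation mean is shifted towards the minimum-norm input that drives the response to the threshold at the final time, and each sample is reweighted by the Gaussian likelihood ratio:

```python
import numpy as np

rng = np.random.default_rng(2)

# Assumed SDOF oscillator x'' + 2*zeta*w0*x' + w0^2*x = W(t), Euler-discretised.
w0, zeta, dt, n_steps = 2.0 * np.pi, 0.05, 0.01, 500
A = np.array([[1.0, dt], [-w0**2 * dt, 1.0 - 2.0 * zeta * w0 * dt]])
B = np.array([0.0, dt])
sigma = 1.0 / np.sqrt(dt)        # scaling so unit-variance inputs mimic white noise
b = 0.6                          # excursion threshold (arbitrary)

def response(excitation):
    """Displacement history produced by one excitation sequence of length n_steps."""
    s, out = np.zeros(2), np.empty(n_steps)
    for k in range(n_steps):
        s = A @ s + B * (sigma * excitation[k])
        out[k] = s[0]
    return out

# Impulse response of the final-time displacement gives the design-point excitation:
# the minimum-norm input u* with x(T) = b, used here as the mean-shift control function.
h = np.array([response(np.eye(n_steps)[k])[-1] for k in range(n_steps)])
u_star = b * h / np.dot(h, h)

n_sim, fails_mc, fails_is = 2000, 0.0, 0.0
for _ in range(n_sim):
    z = rng.standard_normal(n_steps)
    fails_mc += float(np.max(response(z)) >= b)             # crude Monte Carlo
    z_is = z + u_star                                        # sample from N(u*, I)
    w = np.exp(-np.dot(u_star, z_is) + 0.5 * np.dot(u_star, u_star))
    fails_is += w * float(np.max(response(z_is)) >= b)

print(f"crude MC estimate  : {fails_mc / n_sim:.3e}")
print(f"importance sampling: {fails_is / n_sim:.3e}")
```

    At rarer thresholds the crude Monte Carlo estimate may return zero hits at this sample size, while the mean-shifted estimator still produces an informative value, which is exactly the situation such importance sampling schemes are designed for.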

  3. IMPORTANT NOTIFICATION

    CERN Multimedia

    HR Department

    2009-01-01

    Green plates, removals and importation of personal effects Please note that, as from 1 April 2009, formalities relating to K and CD special series French vehicle plates (green plates), removals and importation of personal effects into France and Switzerland will be dealt with by GS Department (Building 73/3-014, tel. 73683/74407). Importation and purchase of tax-free vehicles in Switzerland, as well as diplomatic privileges, will continue to be dealt with by the Installation Service of HR Department (Building 33/1-011, tel. 73962). HR and GS Departments

  4. Adaptation and visual salience

    Science.gov (United States)

    McDermott, Kyle C.; Malkoc, Gokhan; Mulligan, Jeffrey B.; Webster, Michael A.

    2011-01-01

    We examined how the salience of color is affected by adaptation to different color distributions. Observers searched for a color target on a dense background of distractors varying along different directions in color space. Prior adaptation to the backgrounds enhanced search on the same background while adaptation to orthogonal background directions slowed detection. Advantages of adaptation were seen for both contrast adaptation (to different color axes) and chromatic adaptation (to different mean chromaticities). Control experiments, including analyses of eye movements during the search, suggest that these aftereffects are unlikely to reflect simple learning or changes in search strategies on familiar backgrounds, and instead result from how adaptation alters the relative salience of the target and background colors. Comparable effects were observed along different axes in the chromatic plane or for axes defined by different combinations of luminance and chromatic contrast, consistent with visual search and adaptation mediated by multiple color mechanisms. Similar effects also occurred for color distributions characteristic of natural environments with strongly selective color gamuts. Our results are consistent with the hypothesis that adaptation may play an important functional role in highlighting the salience of novel stimuli by discounting ambient properties of the visual environment. PMID:21106682

  5. From equivalence to adaptation

    Directory of Open Access Journals (Sweden)

    Paulina Borowczyk

    2009-01-01

    Full Text Available The aim of this paper is to illustrate in which cases the translators use the adaptation when they are confronted with a term related to sociocultural aspects. We will discuss the notions of equivalence and adaptation and their limits in the translation. Some samples from Arte TV news and from the American film Shrek translated into Polish, German and French will be provided as a support for this article.

  6. Adaptive Sampling in Autonomous Marine Sensor Networks

    Science.gov (United States)

    2006-06-01


  7. Measuring the dimensions of adaptive capacity: a psychometric approach

    Directory of Open Access Journals (Sweden)

    Michael Lockwood

    2015-03-01

    Full Text Available Although previous studies have examined adaptive capacity using a range of self-assessment procedures, no objective self-report approaches have been used to identify the dimensions of adaptive capacity and their relative importance. We examine the content, structure, and relative importance of dimensions of adaptive capacity as perceived by rural landholders in an agricultural landscape in South-Eastern Australia. Our findings indicate that the most important dimensions influencing perceived landholder adaptive capacity are related to their management style, particularly their change orientation. Other important dimensions are individual financial capacity, labor availability, and the capacity of communities and local networks to support landholders' management practices. Trust and confidence in government with respect to native vegetation management was not found to be a significant dimension of perceived adaptive capacity. The scale items presented, particularly those with high factor loadings, provide a solid foundation for assessment of adaptive capacity in other study areas, as well as exploration of relationships between the individual dimensions of adaptive capacity and dependent variables such as perceived resilience. Further work is needed to refine the scale items and compare the findings from this case study with those from other contexts and population samples.

  8. Appraising Adaptive Management

    Directory of Open Access Journals (Sweden)

    Kai N. Lee

    1999-12-01

    Full Text Available Adaptive management is appraised as a policy implementation approach by examining its conceptual, technical, equity, and practical strengths and limitations. Three conclusions are drawn: (1) Adaptive management has been more influential, so far, as an idea than as a practical means of gaining insight into the behavior of ecosystems utilized and inhabited by humans. (2) Adaptive management should be used only after disputing parties have agreed to an agenda of questions to be answered using the adaptive approach; this is not how the approach has been used. (3) Efficient, effective social learning, of the kind facilitated by adaptive management, is likely to be of strategic importance in governing ecosystems as humanity searches for a sustainable economy.

  9. ADAPTATION DEBATE, AHMET VEFİK PAŞA AND ZORAKİ TABİB SAMPLE / ADAPTASYON MESELESİ, AHMET VEFİK PAŞA VE ZORAKİ TABİB ÖRNEĞİ

    Directory of Open Access Journals (Sweden)

    Dr. Bayram YILDIZ

    2007-08-01

    Full Text Available Adaptation, an item of theatre in Turkish literature beginning with Tanzimat, has been opposed since it would have negative effects on development of national theatre and on Turkish society structure. On the other hand adaptation was supported as it would benefit development of modern theatre. Though the common belief was against adaptation, adaptations of Ahmet Vefik Pasa from Moliere have found acceptance. This acceptance might be due to his preference of comedy and of its best performer in world literature, Moliere, and adaptation of plays consistent with Turkish society structure and moral virtues as in the case of Zoraki Tabip.

  10. ADAPT Dataset

    Data.gov (United States)

    National Aeronautics and Space Administration — Advanced Diagnostics and Prognostics Testbed (ADAPT) Project Lead: Scott Poll Subject Fault diagnosis in electrical power systems Description The Advanced...

  11. Adaptively Sharing Time-Series with Differential Privacy

    CERN Document Server

    Fan, Liyue

    2012-01-01

    Sharing real-time aggregate statistics of private data has given much benefit to the public to perform data mining for understanding important phenomena, such as Influenza outbreaks and traffic congestions. We propose an adaptive approach with sampling and estimation to release aggregated time series under differential privacy, the key innovation of which is that we utilize feedback loops based on observed (perturbed) values to dynamically adjust the estimation model as well as the sampling rate. To minimize the overall privacy cost, our solution uses the PID controller to adaptively sample long time-series according to detected data dynamics. To improve the accuracy of data release per timestamp, the Kalman filter is used to predict data values at non-sampling points and to estimate true values from perturbed query answers at sampling points. Our experiments with three real data sets show that it is beneficial to incorporate feedback into both the estimation model and the sampling process. The results confir...
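
    A much-simplified sketch of the sampling-plus-estimation loop (hypothetical numbers; a plain threshold rule stands in for the PID controller, and a scalar random-walk Kalman filter stands in for the paper's estimation model) might look like this in Python:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical daily count series with a burst in the middle (e.g. an outbreak).
T = 200
true = 50.0 + 10.0 * np.sin(np.arange(T) / 10.0)
true[80:110] += 60.0

eps = 0.1                                  # privacy budget spent per released sample
Q, R = 4.0, 2.0 / eps**2                   # process noise / Laplace noise variance

est, var = true[0] + rng.laplace(scale=1.0 / eps), 10.0   # noisy initial release
interval, next_sample, n_used = 1, 1, 1
released = [est]

for t in range(1, T):
    var += Q                               # Kalman predict under a random-walk model
    if t == next_sample:
        z = true[t] + rng.laplace(scale=1.0 / eps)   # differentially private sample
        gain = var / (var + R)
        innovation = z - est
        est, var = est + gain * innovation, (1.0 - gain) * var
        n_used += 1
        # Feedback: a large innovation means the model is off, so sample sooner;
        # a small one lets the interval stretch (crude stand-in for the PID rule).
        shrink = abs(innovation) > 3.0 * np.sqrt(var + R)
        interval = int(np.clip(interval * (0.5 if shrink else 1.5), 1, 20))
        next_sample = t + interval
    released.append(est)                   # non-sampled steps publish the prediction

err = np.mean(np.abs(np.array(released) - true))
print(f"noisy samples used: {n_used}/{T}, mean absolute error: {err:.2f}")
```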

  12. Introduction to adaptive arrays

    CERN Document Server

    Monzingo, Bob; Haupt, Randy

    2011-01-01

    This second edition is an extensive modernization of the bestselling introduction to the subject of adaptive array sensor systems. With the number of applications of adaptive array sensor systems growing each year, this look at the principles and fundamental techniques that are critical to these systems is more important than ever before. Introduction to Adaptive Arrays, 2nd Edition is organized as a tutorial, taking the reader by the hand and leading them through the maze of jargon that often surrounds this highly technical subject. It is easy to read and easy to follow as fundamental concept

  13. Principal component importance sampling for bank credit portfolio risk management

    Institute of Scientific and Technical Information of China (English)

    龚朴; 邓洋; 胡祖辉

    2012-01-01

    The measurement of bank credit portfolio risk is of great significance for bank supervision. One of the most popular methods for estimating the default probability of credit assets is Monte Carlo simulation, and to improve simulation efficiency more and more studies have adopted importance sampling, typically implemented through conditional independence and a mean shift. In this paper, we propose an importance sampling procedure that does not rely on the conditional independence assumption that previous studies had to build on. The procedure uses principal component analysis to choose the dominant components of the default correlation structure and enlarges their variance. Numerical experiments show that, when a credit portfolio is exposed to extreme events, our approach offers substantial variance reduction and outperforms both the plain Monte Carlo algorithm and the Morokoff importance sampling algorithm.
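
    A toy illustration of the variance-enlargement idea (a hypothetical one-factor Gaussian copula portfolio, in which the single common factor plays the role of the dominant principal component; the parameters are invented and this is not the authors' algorithm):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(4)

# Hypothetical homogeneous portfolio: one common factor, 1% default probability.
n_obligors, rho, pd, loss_threshold = 200, 0.3, 0.01, 30
barrier = norm.ppf(pd)
n_sim, sigma_is = 20_000, 3.0          # enlarged factor standard deviation for IS

def portfolio_loss(factor):
    """Number of defaults (one loss unit each) for each common-factor draw."""
    eps = rng.standard_normal((factor.size, n_obligors))
    latent = np.sqrt(rho) * factor[:, None] + np.sqrt(1.0 - rho) * eps
    return (latent < barrier).sum(axis=1)

# Crude Monte Carlo.
p_mc = np.mean(portfolio_loss(rng.standard_normal(n_sim)) >= loss_threshold)

# Importance sampling: draw the dominant factor from N(0, sigma_is^2) and reweight
# by the density ratio phi(f) / (phi(f/sigma)/sigma) = sigma * exp(-f^2 (1 - 1/sigma^2) / 2).
f_is = sigma_is * rng.standard_normal(n_sim)
w = sigma_is * np.exp(-0.5 * f_is**2 * (1.0 - 1.0 / sigma_is**2))
p_is = np.mean(w * (portfolio_loss(f_is) >= loss_threshold))

print(f"P(loss >= {loss_threshold}):  crude MC {p_mc:.2e}   variance-scaled IS {p_is:.2e}")
```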

  14. Importance of the direct imprint rapid sampling method in environmental monitoring of hospital-acquired infection

    Institute of Scientific and Technical Information of China (English)

    刘素珍; 张甜; 欧阳琳

    2013-01-01

    Objective: To analyze the importance of the direct imprint rapid sampling method in the environmental monitoring of nosocomial infections. Method: From June 2011 to May 2012, surface bacterial counts were determined for 240 samples from our hospital, including air, the hands of medical staff, object surfaces, instruments and autoclaved items. The observation group was sampled with the direct imprint rapid sampling method and the control group with the traditional saline cotton-swab rubbing method, and the detection results of the two groups were compared. Result: The bacterial detection rate of the observation group was significantly higher than that of the control group, and the difference was statistically significant (P<0.05). Conclusion: Compared with the traditional saline cotton-swab rubbing method, the direct imprint rapid sampling method has obvious advantages in environmental sampling for nosocomial infection monitoring; it reduces the errors introduced by the many intermediate steps, shortens the time from sampling to incubation, and is quick, convenient, easy to operate and economical, making it worth promoting in medical institutions at the same level.

  15. Transformational adaptation when incremental adaptations to climate change are insufficient.

    Science.gov (United States)

    Kates, Robert W; Travis, William R; Wilbanks, Thomas J

    2012-05-08

    All human-environment systems adapt to climate and its natural variation. Adaptation to human-induced change in climate has largely been envisioned as increments of these adaptations intended to avoid disruptions of systems at their current locations. In some places, for some systems, however, vulnerabilities and risks may be so sizeable that they require transformational rather than incremental adaptations. Three classes of transformational adaptations are those that are adopted at a much larger scale, that are truly new to a particular region or resource system, and that transform places and shift locations. We illustrate these with examples drawn from Africa, Europe, and North America. Two conditions set the stage for transformational adaptation to climate change: large vulnerability in certain regions, populations, or resource systems; and severe climate change that overwhelms even robust human use systems. However, anticipatory transformational adaptation may be difficult to implement because of uncertainties about climate change risks and adaptation benefits, the high costs of transformational actions, and institutional and behavioral actions that tend to maintain existing resource systems and policies. Implementing transformational adaptation requires effort to initiate it and then to sustain the effort over time. In initiating transformational adaptation focusing events and multiple stresses are important, combined with local leadership. In sustaining transformational adaptation, it seems likely that supportive social contexts and the availability of acceptable options and resources for actions are key enabling factors. Early steps would include incorporating transformation adaptation into risk management and initiating research to expand the menu of innovative transformational adaptations.

  16. [Imported histoplasmosis].

    Science.gov (United States)

    Stete, Katarina; Kern, Winfried V; Rieg, Siegbert; Serr, Annerose; Maurer, Christian; Tintelnot, Kathrin; Wagner, Dirk

    2015-06-01

    Infections with Histoplasma capsulatum are rare in Germany, and mostly imported from endemic areas. Infections can present as localized or disseminated diseases in immunocompromised as well as immunocompetent hosts. A travel history may be a major clue for diagnosing histoplasmosis. Diagnostic tools include histology, cultural and molecular detection as well as serology. Here we present four cases of patients diagnosed and treated in Freiburg between 2004 and 2013 that demonstrate the broad range of clinical manifestations of histoplasmosis: an immunocompetent patient with chronic basal meningitis; a patient with HIV infection and fatal disseminated disease; a patient with pulmonary and cutaneous disease and mediastinal and cervical lymphadenopathy; and an immunosuppressed patient with disseminated involvement of lung, bone marrow and adrenal glands.

  17. The relevance of cross-cultural adaptation and clinimetrics for physical therapy instruments

    Directory of Open Access Journals (Sweden)

    CG Maher

    2007-08-01

    Full Text Available BACKGROUND: Self-report outcome measures (questionnaires) are widely used by physiotherapists for measuring patients' health status or treatment outcomes. Most of these measurement tools were developed in English and their usefulness is very limited in non-English speaking countries such as Brazil. The only way to solve this problem is to properly adapt the relevant questionnaires into a target language and culture (e.g. Brazilian-Portuguese) and then test the instrument by checking its psychometric (clinimetric) characteristics. OBJECTIVES: The purpose of this paper was to present relevant issues in the process of cross-cultural adaptation and clinimetric testing for self-report outcome measurements. Advice on how to perform a cross-cultural adaptation, how to properly check the clinimetric properties, how to select a relevant questionnaire and how to evaluate the quality of an adapted questionnaire is provided. Additionally, we present all Brazilian-Portuguese cross-cultural adaptations of low back pain measurements that we know of. CONCLUSIONS: There is a clear need for more effort in the field of cross-cultural adaptation and clinimetrics; without proper instruments, the management of patients from non-English speaking countries is compromised.

  18. Adaptive measurements of urban runoff quality

    Science.gov (United States)

    Wong, Brandon P.; Kerkez, Branko

    2016-11-01

    An approach to adaptively measure runoff water quality dynamics is introduced, focusing specifically on characterizing the timing and magnitude of urban pollutographs. Rather than relying on a static schedule or flow-weighted sampling, which can miss important water quality dynamics if parameterized inadequately, novel Internet-enabled sensor nodes are used to autonomously adapt their measurement frequency to real-time weather forecasts and hydrologic conditions. This dynamic approach has the potential to significantly improve the use of constrained experimental resources, such as automated grab samplers, which continue to provide a strong alternative to sampling water quality dynamics when in situ sensors are not available. Compared to conventional flow-weighted or time-weighted sampling schemes, which rely on preset thresholds, a major benefit of the approach is the ability to dynamically adapt to features of an underlying hydrologic signal. A 28 km2 urban watershed was studied to characterize concentrations of total suspended solids (TSS) and total phosphorus. Water quality samples were autonomously triggered in response to features in the underlying hydrograph and real-time weather forecasts. The study watershed did not exhibit a strong first flush, and intraevent concentration variability was driven by flow acceleration, wherein the largest loadings of TSS and total phosphorus corresponded with the steepest rising limbs of the storm hydrograph. The scalability of the proposed method is discussed in the context of larger sensor network deployments, as well as the potential to improve control of urban water quality.
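
    The bottle-allocation idea can be caricatured in a few lines of Python (a retrospective toy on a synthetic hydrograph, not the real-time, forecast-driven controller described above): spend the sampler's limited bottles on the steepest parts of the rising limb rather than on a fixed schedule.

```python
import numpy as np

# Synthetic storm hydrograph (arbitrary units) at 5-minute resolution.
t = np.arange(24 * 12)
flow = (0.5
        + 8.0 * np.exp(-0.5 * ((t - 90) / 12.0) ** 2)
        + 5.0 * np.exp(-0.5 * ((t - 170) / 20.0) ** 2))

bottles = 12                                   # capacity of the automated sampler
rise = np.gradient(flow)                       # rate of rise of the hydrograph

# Adaptive rule: allocate bottles to the steepest rising-limb time steps.
adaptive_idx = np.sort(np.argsort(-rise)[:bottles])
# Conventional alternative: evenly spaced (time-weighted) samples.
fixed_idx = np.linspace(0, t.size - 1, bottles).astype(int)

print("adaptive sample times (h):", np.round(t[adaptive_idx] / 12.0, 1))
print("fixed sample times (h):   ", np.round(t[fixed_idx] / 12.0, 1))
```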

  19. Markov chain Monte Carlo and importance sampling for multiple-target tracking

    Institute of Scientific and Technical Information of China (English)

    龙云利; 徐晖; 安玮

    2011-01-01

    This paper presents an algorithm based on Markov chain Monte Carlo and importance sampling (MCMC-IS) for tracking multiple targets in a dense clutter environment. The joint association events are sampled by Markov chain Monte Carlo, and from these samples the marginal probability of associating each measurement with a target is calculated. The probability density of single-target measurements is used for importance sampling when proposing association events, which improves sampling efficiency. Whereas exact joint probabilistic data association (JPDA) suffers from combinatorial explosion, MCMC-IS makes it possible to track multiple targets in real time under strong clutter. Simulation experiments analysing tracking precision and processing time demonstrate the effectiveness of the algorithm.
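
    A toy single-scan version of the idea (two targets, four measurements, made-up numbers; the real tracker works recursively over scans) can be sketched in Python: joint association events are explored with a Metropolis-Hastings chain whose proposal is importance-weighted by the single-target measurement likelihoods, and marginal association probabilities are read off as visit frequencies.

```python
import numpy as np
from scipy.stats import multivariate_normal as mvn

rng = np.random.default_rng(5)

# Hypothetical single scan: two predicted targets, four measurements (some clutter).
preds = [np.array([0.0, 0.0]), np.array([4.0, 0.0])]
cov = np.eye(2)                              # innovation covariance
meas = np.array([[0.3, 0.2], [3.6, -0.4], [2.0, 5.0], [1.8, 0.1]])
p_det, clutter = 0.9, 1e-3

options = list(range(-1, len(meas)))         # -1 means "target missed"

def lik(i, j):
    """Likelihood ratio of assigning measurement j (or a miss, j = -1) to target i."""
    return (1.0 - p_det) if j < 0 else p_det * mvn.pdf(meas[j], preds[i], cov) / clutter

L = np.array([[lik(i, j) for j in options] for i in range(len(preds))])
prop = L / L.sum(axis=1, keepdims=True)      # importance-weighted proposal per target

def posterior(assign):
    used = [j for j in assign if j >= 0]
    if len(set(used)) < len(used):           # two targets cannot share a measurement
        return 0.0
    return float(np.prod([L[i, options.index(j)] for i, j in enumerate(assign)]))

assign = [-1, -1]
counts = np.zeros((len(preds), len(options)))
for _ in range(20_000):
    i = rng.integers(len(preds))                       # pick a target to re-associate
    j_new = options[rng.choice(len(options), p=prop[i])]
    cand = list(assign)
    cand[i] = j_new
    # Metropolis-Hastings ratio with the asymmetric, likelihood-weighted proposal.
    num = posterior(cand) * prop[i, options.index(assign[i])]
    den = posterior(assign) * prop[i, options.index(j_new)]
    if den > 0 and rng.random() < min(1.0, num / den):
        assign = cand
    for k, j in enumerate(assign):
        counts[k, options.index(j)] += 1

print("marginal association probabilities (rows: targets; columns: miss, m0..m3):")
print(np.round(counts / counts.sum(axis=1, keepdims=True), 3))
```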

  20. First excursion probabilities of dynamical systems by importance sampling

    Institute of Scientific and Technical Information of China (English)

    任丽梅; 徐伟; 肖玉柱; 王文杰

    2012-01-01

    Based on the Girsanov transformation, this paper develops an importance sampling method for estimating the first excursion probability of structural dynamical systems excited by stationary Gaussian white noise. The focus is on constructing a control function that concentrates the sample paths in the part of the sample space most likely to lead to a first excursion, thereby achieving variance reduction. The control function is built from the design point: for linear systems, the approach combines with time-invariant structural reliability theory to obtain the design point by solving a constrained optimization problem; for non-linear systems, the design-point excitation proposed by Heonsang Koo is used and the design point is obtained with a mirror-image method. Finally, two examples are given. Comparison with direct Monte Carlo simulation shows that the method is correct and effective.

  1. Postnatal Cardiovascular Adaptation

    Directory of Open Access Journals (Sweden)

    Ferda Ozlu

    2016-06-01

    Full Text Available The fetus depends on placental circulation in utero. A successful transition from intrauterine to extrauterine life depends on successful physiological changes during labor. During delivery, the fetus transfers from a liquid environment, where oxygen arrives via the umbilical vein, to an air environment, where oxygenation is supported by breathing. Endocrinological changes are important for the fetus to adapt to extrauterine life, and cord clamping also plays a crucial role in postnatal adaptation. Successfully negotiating the fetal cardiovascular transition period is important for the establishment of postnatal life. [Archives Medical Review Journal 2016; 25(2): 181-190]

  2. Ambiguous Adaptation

    DEFF Research Database (Denmark)

    Møller Larsen, Marcus; Lyngsie, Jacob

    We investigate why some exchange relationships terminate prematurely. We argue that investments in informal governance structures induce premature termination in relationships already governed by formal contracts. The formalized adaptive behavior of formal governance structures and the flexible a...

  3. Toothbrush Adaptations.

    Science.gov (United States)

    Exceptional Parent, 1987

    1987-01-01

    Suggestions are presented for helping disabled individuals learn to use or adapt toothbrushes for proper dental care. A directory lists dental health instructional materials available from various organizations. (CB)

  4. Adaptive digital filters

    CERN Document Server

    Kovačević, Branko; Milosavljević, Milan

    2013-01-01

    “Adaptive Digital Filters” presents an important discipline applied to the domain of speech processing. The book first makes the reader acquainted with the basic terms of filtering and adaptive filtering, before introducing the field of advanced modern algorithms, some of which are contributed by the authors themselves. Working in the field of adaptive signal processing requires the use of complex mathematical tools. The book offers a detailed presentation of the mathematical models that is clear and consistent, an approach that allows everyone with a college level of mathematics knowledge to successfully follow the mathematical derivations and descriptions of algorithms. The algorithms are presented in flow charts, which facilitates their practical implementation. The book presents many experimental results and treats the aspects of practical application of adaptive filtering in real systems, making it a valuable resource for both undergraduate and graduate students, and for all others interested in m...

  5. Adaptive regularization

    DEFF Research Database (Denmark)

    Hansen, Lars Kai; Rasmussen, Carl Edward; Svarer, C.

    1994-01-01

    Regularization, e.g., in the form of weight decay, is important for training and optimization of neural network architectures. In this work the authors provide a tool based on asymptotic sampling theory, for iterative estimation of weight decay parameters. The basic idea is to do a gradient descent in the estimated generalization error with respect to the regularization parameters. The scheme is implemented in the authors' Designer Net framework for network training and pruning, i.e., is based on the diagonal Hessian approximation. The scheme does not require essential computational overhead in addition to what is needed for training and pruning. The viability of the approach is demonstrated in an experiment concerning prediction of the chaotic Mackey-Glass series. The authors find that the optimized weight decays are relatively large for densely connected networks in the initial pruning phase, while...

  6. Hedonic "adaptation"

    OpenAIRE

    2008-01-01

    People live in a world in which they are surrounded by potential disgust elicitors such as “used” chairs, air, silverware, and money as well as excretory activities. People function in this world by ignoring most of these, by active avoidance, reframing, or adaptation. The issue is particularly striking for professions, such as morticians, surgeons, or sanitation workers, in which there is frequent contact with major disgust elicitors. In this study, we study the “adaptation” process to d...

  7. Strategic Adaptation

    DEFF Research Database (Denmark)

    Andersen, Torben Juul

    2015-01-01

    This article provides an overview of theoretical contributions that have influenced the discourse around strategic adaptation including contingency perspectives, strategic fit reasoning, decision structure, information processing, corporate entrepreneurship, and strategy process. The related concepts of strategic renewal, dynamic managerial capabilities, dynamic capabilities, and strategic response capabilities are discussed and contextualized against strategic responsiveness. The insights derived from this article are used to outline the contours of a dynamic process of strategic adaptation...

  8. Adaptive Behavior Functioning in Children with Autism.

    Science.gov (United States)

    Malhi, Prahbhjot; Singhi, Pratibha

    2015-08-01

    To investigate the relationship between intellectual functioning, symptom severity, and adaptive behavior functioning of children with autism spectrum disorders (ASD). Retrospective case records (1999 to 2013) of 523 children [mean age 4.79 y (SD 2.37)] maintained by the Pediatric Psychology Unit at the Department of Pediatrics of a tertiary care teaching hospital in North India were examined. Adaptive behavior functioning was measured by the Indian adaptation of the Vineland Social Maturity Scale. Symptom severity was assessed using the Childhood Autism Rating Scale (CARS). The mean Social Quotient (SQ) of the sample was 62.40 (SD = 20.41). Nearly two-thirds (63.3%) of the children with ASD had SQs below 70 and only 15% had SQs above 85. Adaptive behavior scores in the lower-functioning ASD children were significantly higher than their Intelligence Quotient (IQ) scores, while for the high-functioning ASD group the SQs were significantly lower than their IQs. Multiple regression analysis revealed that IQ, age of the child, CARS score, and education of the mother accounted for 62.5% of the variance in the SQ of children with ASD (F = 198.01, P < 0.000). Adaptive behavior measures must constitute a crucial component of the diagnostic assessment of children with ASD and also an important goal of treatment.

  9. Context-aware adaptive spelling in motor imagery BCI

    Science.gov (United States)

    Perdikis, S.; Leeb, R.; Millán, J. d. R.

    2016-06-01

    Objective. This work presents a first motor imagery-based, adaptive brain-computer interface (BCI) speller, which is able to exploit application-derived context for improved, simultaneous classifier adaptation and spelling. Online spelling experiments with ten able-bodied users evaluate the ability of our scheme, first, to alleviate non-stationarity of brain signals for restoring the subject’s performances, second, to guide naive users into BCI control avoiding initial offline BCI calibration and, third, to outperform regular unsupervised adaptation. Approach. Our co-adaptive framework combines the BrainTree speller with smooth-batch linear discriminant analysis adaptation. The latter enjoys contextual assistance through BrainTree’s language model to improve online expectation-maximization maximum-likelihood estimation. Main results. Our results verify the possibility to restore single-sample classification and BCI command accuracy, as well as spelling speed for expert users. Most importantly, context-aware adaptation performs significantly better than its unsupervised equivalent and similar to the supervised one. Although no significant differences are found with respect to the state-of-the-art PMean approach, the proposed algorithm is shown to be advantageous for 30% of the users. Significance. We demonstrate the possibility to circumvent supervised BCI recalibration, saving time without compromising the adaptation quality. On the other hand, we show that this type of classifier adaptation is not as efficient for BCI training purposes.

  10. Is adaptation. Truly an adaptation?

    Directory of Open Access Journals (Sweden)

    Thais Flores Nogueira Diniz

    2008-04-01

    Full Text Available The article begins by historicizing film adaptation from the arrival of cinema, pointing out the many theoretical approaches under which the process has been seen: from the concept of “the same story told in a different medium” to a comprehensive definition such as “the process through which works can be transformed, forming an intersection of textual surfaces, quotations, conflations and inversions of other texts”. To illustrate this new concept, the article discusses Spike Jonze’s film Adaptation. according to James Naremore’s proposal, which considers the study of adaptation as part of a general theory of repetition, joined with the study of recycling, remaking, and every form of retelling. The film deals with the attempt by the scriptwriter Charles Kaufman, played by Nicolas Cage, to adapt/translate a non-fictional book to the cinema, but he ends up with a kind of film which is by no means what it was intended to be: an action film in the model of Hollywood productions. During the process of creation, Charles and his twin brother, Donald, undergo a series of adventures involving some real persons from the world of film, the author and the protagonist of the book, all of them turning into fictional characters in the film. In the film, adaptation then signifies something different from its traditional meaning.

  11. Feature-level domain adaptation

    DEFF Research Database (Denmark)

    Kouw, Wouter M.; Van Der Maaten, Laurens J P; Krijthe, Jesse H.

    2016-01-01

    Domain adaptation is the supervised learning setting in which the training and test data are sampled from different distributions: training data is sampled from a source domain, whilst test data is sampled from a target domain. This paper proposes and studies an approach, called feature...

  12. Genomics of local adaptation with gene flow.

    Science.gov (United States)

    Tigano, Anna; Friesen, Vicki L

    2016-05-01

    Gene flow is a fundamental evolutionary force in adaptation that is especially important to understand as humans are rapidly changing both the natural environment and natural levels of gene flow. Theory proposes a multifaceted role for gene flow in adaptation, but it focuses mainly on the disruptive effect that gene flow has on adaptation when selection is not strong enough to prevent the loss of locally adapted alleles. The role of gene flow in adaptation is now better understood due to the recent development of both genomic models of adaptive evolution and genomic techniques, which both point to the importance of genetic architecture in the origin and maintenance of adaptation with gene flow. In this review, we discuss three main topics on the genomics of adaptation with gene flow. First, we investigate selection on migration and gene flow. Second, we discuss the three potential sources of adaptive variation in relation to the role of gene flow in the origin of adaptation. Third, we explain how local adaptation is maintained despite gene flow: we provide a synthesis of recent genomic models of adaptation, discuss the genomic mechanisms and review empirical studies on the genomics of adaptation with gene flow. Despite predictions on the disruptive effect of gene flow in adaptation, an increasing number of studies show that gene flow can promote adaptation, that local adaptations can be maintained despite high gene flow, and that genetic architecture plays a fundamental role in the origin and maintenance of local adaptation with gene flow.

  13. Adaptive context exploitation

    Science.gov (United States)

    Steinberg, Alan N.; Bowman, Christopher L.

    2013-05-01

    This paper presents concepts and an implementation scheme to improve information exploitation processes and products by adaptive discovery and processing of contextual information. Context is used in data fusion - and in inferencing in general - to provide expectations and to constrain processing. It also is used to infer or refine desired information ("problem variables") on the basis of other available information ("context variables"). Contextual exploitation becomes critical in several classes of inferencing problems in which traditional information sources do not provide sufficient resolution between entity states or when such states are poorly or incompletely modeled. An adaptive evidence-accrual inference method - adapted from developments in target recognition and scene understanding - is presented; whereby context variables are selected on the basis of (a) their utility in refining explicit problem variables, (b) the probability of evaluating these variables to within a given accuracy, given candidate system actions (data collection, mining or processing), and (c) the cost of such actions. The Joint Directors of Laboratories (JDL) Data Fusion Model, with its extension to dual Resource Management functions, has been adapted to accommodate adaptive information exploitation, to include adaptive context exploitation. The interplay of Data Fusion and Resource Management (DF&RM) functionality in exploiting contextual information is illustrated in terms of the dual-node DF&RM architecture. An important advance is in the integration of data mining methods for data search/discovery and for abductive model refinement.

  14. Adaptive metric kernel regression

    DEFF Research Database (Denmark)

    Goutte, Cyril; Larsen, Jan

    2000-01-01

    regression by minimising a cross-validation estimate of the generalisation error. This allows to automatically adjust the importance of different dimensions. The improvement in terms of modelling performance is illustrated on a variable selection task where the adaptive metric kernel clearly outperforms...
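
    As a rough sketch of the idea (assumptions mine, not the authors' implementation), the snippet below tunes one Gaussian-kernel length scale per input dimension of a Nadaraya-Watson regressor by minimizing a leave-one-out cross-validation error; irrelevant dimensions end up with large length scales, i.e. low importance.

```python
# Adaptive-metric kernel regression: per-dimension length scales tuned by leave-one-out CV.
import numpy as np
from scipy.optimize import minimize

def loo_cv_error(log_scales, X, y):
    scales = np.exp(log_scales)                        # positive per-dimension length scales
    d2 = ((X[:, None, :] - X[None, :, :]) / scales) ** 2
    K = np.exp(-0.5 * d2.sum(axis=2))                  # Gaussian kernel with the adapted metric
    np.fill_diagonal(K, 0.0)                           # leave-one-out: drop each point's self-weight
    pred = (K @ y) / (K.sum(axis=1) + 1e-12)
    return np.mean((pred - y) ** 2)

def fit_adaptive_metric(X, y):
    x0 = np.zeros(X.shape[1])                          # start from unit length scales
    res = minimize(loo_cv_error, x0, args=(X, y), method="Nelder-Mead")
    return np.exp(res.x)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-1, 1, size=(200, 3))
    y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(200)   # only dimension 0 is relevant
    print("fitted length scales:", fit_adaptive_metric(X, y))  # expect large scales for dims 1 and 2
```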

  15. Adaptive Metric Kernel Regression

    DEFF Research Database (Denmark)

    Goutte, Cyril; Larsen, Jan

    1998-01-01

    by minimising a cross-validation estimate of the generalisation error. This allows one to automatically adjust the importance of different dimensions. The improvement in terms of modelling performance is illustrated on a variable selection task where the adaptive metric kernel clearly outperforms the standard...

  16. Adapting Bulls to Florida

    Science.gov (United States)

    The adaptation of bulls used for natural breeding purposes to the Gulf Coast region of the United States including all of Florida is an important topic. Nearly 40% of the U.S. cow/calf population resides in the Gulf Coast and Southeast. Thus, as A.I. is relatively rare, the number of bulls used for ...

  17. Strategic adaptation to climate change in Europe

    OpenAIRE

    Fankhauser, Sam; Soare, Raluca

    2012-01-01

    This paper analyses the priorities and challenges for Europe as it adapts to the impacts of climate change. Whatever the ultimate level of warming we will experience, adaptation will be a permanent feature of decision making from now on. As such it is important to go about it in a strategic, rational way. A strategic approach to adaptation involves setting priorities, both spatially (where to adapt) and inter-temporally (when to adapt). The paper reviews the available evidence on Europe's exp...

  18. Adaptive test

    DEFF Research Database (Denmark)

    Kjeldsen, Lars Peter; Rose, Mette

    2010-01-01

    The article is an evaluation of the adaptive tests that were introduced in the Danish folkeskole (primary and lower secondary school). It focuses in particular on assessment in the folkeskole and contributes guidance on evaluation, evaluation tools, and subject-specific assessment materials.

  19. Applying IRT_Δb Procedure and Adapted LR Procedure to Detect DIF in Tests with Matrix Sampling

    Institute of Scientific and Technical Information of China (English)

    张勋; 李凌艳; 刘红云; 孙研

    2013-01-01

    Matrix sampling is a technique widely used in large-scale educational assessments. In an assessment with a matrix sampling design, each examinee takes one of several booklets, each containing only part of the items. A critical problem in detecting differential item functioning (DIF) in this scenario has attracted attention in recent years: it is not appropriate to use the observed total score from an individual booklet as the matching variable, so traditional detection methods such as Mantel-Haenszel (MH), SIBTEST and Logistic Regression (LR) are not suitable. IRT_Δb may be an alternative because it provides a valid matching variable, but a DIF classification criterion for IRT_Δb has not yet been well established. The purposes of this study were therefore: 1) to investigate the efficiency and robustness of using ability parameters estimated with an Item Response Theory (IRT) model as the matching variable, compared with using traditional observed raw total scores; 2) to identify which factors influence the ability of the two methods to detect DIF; and 3) to propose a DIF classification criterion for IRT_Δb. Both simulated and empirical data were employed to explore the robustness and efficiency of the two DIF detection methods: the IRT_Δb method and the adapted LR method with the IRT-based group-level ability estimate as the matching variable. In the Monte Carlo study, a matrix sampling test was generated and various experimental conditions were simulated: 1) different proportions of DIF items; 2) different examinee ability distributions; 3) different sample sizes; and 4) different sizes of DIF. The two DIF detection methods were then applied and their results compared. In addition, power functions were established in order to derive a DIF classification rule for IRT_Δb based on current rules for LR. In the empirical study, through

  20. Adaptive Face Recognition via Structured Representation

    Institute of Scientific and Technical Information of China (English)

    ZHANG Yu-hua; ZENG Xiao-ming

    2014-01-01

    In this paper, we propose a face recognition approach, Structured Sparse Representation-based Classification, for the case in which the measurement dimension of the test sample is smaller than the number of training samples of each subject. When this condition is not satisfied, we use the Nearest Subspace approach to classify the test sample. To cover all cases, we combine the two approaches into an adaptive classification method, the Adaptive approach. The adaptive approach yields greater recognition accuracy than the SRC and CRC_RLS approaches at low sampling rates on the Extended Yale B dataset, and it is more efficient than the other two approaches.
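
    A hedged sketch of the adaptive rule described in the abstract (not the authors' code): a sparse-representation classifier, approximated here with an l1-regularized Lasso solve, is used when the measurement dimension of the test sample is below the per-subject training count, and a nearest-subspace classifier otherwise. `train_by_class` is assumed to be a list of (dimension x samples) arrays of vectorized training images, one array per subject.

```python
# Adaptive switch between sparse-representation and nearest-subspace classification.
import numpy as np
from sklearn.linear_model import Lasso

def nearest_subspace(train_by_class, y):
    # residual of projecting y onto each class subspace; the smallest residual wins
    residuals = []
    for A in train_by_class:                       # A: (dim, n_samples_of_class)
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        residuals.append(np.linalg.norm(y - A @ coef))
    return int(np.argmin(residuals))

def sparse_representation(train_by_class, y, alpha=1e-3):
    A = np.hstack(train_by_class)                  # dictionary of all training samples
    lasso = Lasso(alpha=alpha, fit_intercept=False, max_iter=5000)
    lasso.fit(A, y)                                # sparse code of the test sample
    x = lasso.coef_
    residuals, start = [], 0
    for Ac in train_by_class:                      # class-wise reconstruction residuals
        n = Ac.shape[1]
        residuals.append(np.linalg.norm(y - Ac @ x[start:start + n]))
        start += n
    return int(np.argmin(residuals))

def adaptive_classify(train_by_class, y):
    n_per_class = train_by_class[0].shape[1]
    if y.shape[0] < n_per_class:                   # condition quoted from the abstract
        return sparse_representation(train_by_class, y)
    return nearest_subspace(train_by_class, y)
```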

  1. Intestinal mucosal adaptation

    Institute of Scientific and Technical Information of China (English)

    Laurie Drozdowski; Alan BR Thomson

    2006-01-01

    Intestinal failure is a condition characterized by malnutrition and/or dehydration as a result of the inadequate digestion and absorption of nutrients. The most common cause of intestinal failure is short bowel syndrome, which occurs when the functional gut mass is reduced below the level necessary for adequate nutrient and water absorption. This condition may be congenital, or may be acquired as a result of a massive resection of the small bowel. Following resection, the intestine is capable of adaptation in response to enteral nutrients as well as other trophic stimuli. Identifying factors that may enhance the process of intestinal adaptation is an exciting area of research with important potential clinical applications.

  2. STUDYING COMPLEX ADAPTIVE SYSTEMS

    Institute of Scientific and Technical Information of China (English)

    John H. Holland

    2006-01-01

    Complex adaptive systems (cas) - systems that involve many components that adapt or learn as they interact - are at the heart of important contemporary problems. The study of cas poses unique challenges: Some of our most powerful mathematical tools, particularly methods involving fixed points, attractors, and the like, are of limited help in understanding the development of cas. This paper suggests ways to modify research methods and tools, with an emphasis on the role of computer-based models, to increase our understanding of cas.

  3. Image Sampling with Quasicrystals

    CERN Document Server

    Grundland, Mark; Masakova, Zuzana; Dodgson, Neil A; 10.3842/SIGMA.2009.075

    2009-01-01

    We investigate the use of quasicrystals in image sampling. Quasicrystals produce space-filling, non-periodic point sets that are uniformly discrete and relatively dense, thereby ensuring the sample sites are evenly spread out throughout the sampled image. Their self-similar structure can be attractive for creating sampling patterns endowed with a decorative symmetry. We present a brief general overview of the algebraic theory of cut-and-project quasicrystals based on the geometry of the golden ratio. To assess the practical utility of quasicrystal sampling, we evaluate the visual effects of a variety of non-adaptive image sampling strategies on photorealistic image reconstruction and non-photorealistic image rendering used in multiresolution image representations. For computer visualization of point sets used in image sampling, we introduce a mosaic rendering technique.

  4. Image Sampling with Quasicrystals

    Directory of Open Access Journals (Sweden)

    Mark Grundland

    2009-07-01

    Full Text Available We investigate the use of quasicrystals in image sampling. Quasicrystals produce space-filling, non-periodic point sets that are uniformly discrete and relatively dense, thereby ensuring the sample sites are evenly spread out throughout the sampled image. Their self-similar structure can be attractive for creating sampling patterns endowed with a decorative symmetry. We present a brief general overview of the algebraic theory of cut-and-project quasicrystals based on the geometry of the golden ratio. To assess the practical utility of quasicrystal sampling, we evaluate the visual effects of a variety of non-adaptive image sampling strategies on photorealistic image reconstruction and non-photorealistic image rendering used in multiresolution image representations. For computer visualization of point sets used in image sampling, we introduce a mosaic rendering technique.

  5. Procedures for Sampling Vegetation

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This report outlines vegetation sampling procedures used on various refuges in Region 3. The importance of sampling the response of marsh vegetation to management...

  6. Adaptive Playware in Physical Games

    DEFF Research Database (Denmark)

    Lund, Henrik Hautop; Thorsteinsson, Arnar Tumi

    2011-01-01

    We describe how playware and games may adapt to the interaction of the individual user. We hypothesize that in physical games there are individual differences in user interaction capabilities and styles, and that adaptive playware may adapt to the individual user’s capabilities, so that the activity automatically will match the capability of the individual user. With small test groups, we investigate how different age groups and gender groups physically interact with some playware games, and find indications of differences between the groups. Despite the small test set, the results are a proof of existence of differences and of the need for adaptation, and therefore we investigate adaptation as an important issue for playware. With simple playware games, we show that the adaptation will speed the physical game up and down to find the appropriate level that matches the reaction speed...

  7. Sequencing of 50 human exomes reveals adaptation to high altitude

    DEFF Research Database (Denmark)

    Yi, Xin; Liang, Yu; Huerta-Sanchez, Emilia

    2010-01-01

    represent strong candidates for altitude adaptation, were identified. The strongest signal of natural selection came from endothelial Per-Arnt-Sim (PAS) domain protein 1 (EPAS1), a transcription factor involved in response to hypoxia. One single-nucleotide polymorphism (SNP) at EPAS1 shows a 78% frequency difference between Tibetan and Han samples, representing the fastest allele frequency change observed at any human gene to date. This SNP's association with erythrocyte abundance supports the role of EPAS1 in adaptation to hypoxia. Thus, a population genomic survey has revealed a functionally important locus...

  8. Adaptation Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Huq, Saleemul

    2011-11-15

    Efforts to help the world's poor will face crises in coming decades as climate change radically alters conditions. Action Research for Community Adaptation in Bangladesh (ARCAB) is an action-research programme on responding to climate change impacts through community-based adaptation. Set in Bangladesh at 20 sites that are vulnerable to floods, droughts, cyclones and sea level rise, ARCAB will follow impacts and adaptation as they evolve over half a century or more. National and international 'research partners', collaborating with ten NGO 'action partners' with global reach, seek knowledge and solutions applicable worldwide. After a year setting up ARCAB, we share lessons on the programme's design and move into our first research cycle.

  9. Adaptive Lighting

    DEFF Research Database (Denmark)

    Petersen, Kjell Yngve; Søndergaard, Karin; Kongshaug, Jesper

    2015-01-01

    to be static, and no longer acts as a kind of spatial constancy maintaining stability and order? Moreover, what new potentials open in lighting design? This book is one of four books that is published in connection with the research project entitled LED Lighting; Interdisciplinary LED Lighting Research...... offered by adaptive lighting control are created by the ways that the system components, the network and data flow can be coordinated through software so that the dynamic variations are controlled in ways that meaningfully adapt according to people’s situations and design intentions. This book discusses the investigations of lighting scenarios carried out in two test installations: White Cube and White Box. The test installations are discussed as large-scale experiential instruments. In these test installations we examine what could potentially occur when light using LED technology is integrated and distributed...

  10. Hedonic "adaptation"

    Directory of Open Access Journals (Sweden)

    Paul Rozin

    2008-02-01

    Full Text Available People live in a world in which they are surrounded by potential disgust elicitors such as “used” chairs, air, silverware, and money as well as excretory activities. People function in this world by ignoring most of these, by active avoidance, reframing, or adaptation. The issue is particularly striking for professions, such as morticians, surgeons, or sanitation workers, in which there is frequent contact with major disgust elicitors. In this study, we study the “adaptation” process to dead bodies as disgust elicitors, by measuring specific types of disgust sensitivity in medical students before and after they have spent a few months dissecting a cadaver. Using the Disgust Scale, we find a significant reduction in disgust responses to death and body envelope violation elicitors, but no significant change in any other specific type of disgust. There is a clear reduction in discomfort at touching a cold dead body, but not in touching a human body which is still warm after death.

  11. ADAPTATION EVALUATION

    Directory of Open Access Journals (Sweden)

    Björn PETERS, M.Sc.

    2001-01-01

    Full Text Available Twenty subjects with lower limb disabilities participated in a simulator study. The purpose of the study was to investigate how an Adaptive Cruise Control (ACC) system together with two different hand controls for accelerator and brake influenced workload, comfort and driving behaviour, and to further develop a method to evaluate vehicle adaptations for drivers with disabilities. The installed ACC system could maintain a constant speed selected and set by the driver, and it also adapted speed in order to keep a safe distance to a leading vehicle. Furthermore, it included a stop-and-go function. Two common types of hand controls for accelerator and brake were used. The hand controls differed both with respect to function, single or dual levers, and position, on the steering column or between the front seats. The subjects were all experienced drivers of adapted cars equipped with hand controls. All subjects drove 100 km on two occasions, with and without the ACC system available but with the same hand control. Subjective workload was found to be significantly lower and performance better for the ACC condition. The difference in speed variation between manual and ACC-supported driving increased with the distance driven, which seems to support the previous finding. The subjects thought they could control both speed and distance to leading vehicles better while the ACC was available. ACC driving did not influence reaction time, speed level, lateral position or variation in lateral position. Headway during car-following situations was shorter for the ACC condition compared to manual driving. The ACC was well received, trusted and wanted. It was concluded that the ACC system substantially decreased workload, increased comfort and did not influence safety negatively. The only difference found between the two types of hand controls was that drivers using the dual lever system had less variation in lateral position. The applied evaluation method proved

  12. What Drives Business Model Adaptation?

    DEFF Research Database (Denmark)

    Saebi, Tina; Lien, Lasse B.; Foss, Nicolai Juul

    2016-01-01

    Business models change as managers not only innovate business models, but also engage in more mundane adaptation in response to external changes, such as changes in the level or composition of demand. However, little is known about what causes such business model adaptation. We employ threat-rigidity as well as prospect theory to examine business model adaptation in response to external threats and opportunities. Additionally, drawing on the behavioural theory of the firm, we argue that the past strategic orientation of a firm creates path dependencies that influence the propensity of the firm to adapt its business model. We test our hypotheses on a sample of 1196 Norwegian companies, and find that firms are more likely to adapt their business model under conditions of perceived threats than opportunities, and that strategic orientation geared towards market development is more conducive...

  13. Balanced sampling

    NARCIS (Netherlands)

    Brus, D.J.

    2015-01-01

    In balanced sampling a linear relation between the soil property of interest and one or more covariates with known means is exploited in selecting the sampling locations. Recent developments make this sampling design attractive for statistical soil surveys. This paper introduces balanced sampling

  14. Sample Design.

    Science.gov (United States)

    Ross, Kenneth N.

    1987-01-01

    This article considers various kinds of probability and non-probability samples in both experimental and survey studies. Throughout, how a sample is chosen is stressed. Size alone is not the determining consideration in sample selection. Good samples do not occur by accident; they are the result of a careful design. (Author/JAZ)

  15. Balanced sampling

    NARCIS (Netherlands)

    Brus, D.J.

    2015-01-01

    In balanced sampling a linear relation between the soil property of interest and one or more covariates with known means is exploited in selecting the sampling locations. Recent developments make this sampling design attractive for statistical soil surveys. This paper introduces balanced sampling

  16. The Adaptation Gap Report - a Preliminary Assessment

    DEFF Research Database (Denmark)

    Alverson, Keith; Olhoff, Anne; Noble, Ian;

    This first Adaptation Gap report provides an equally sobering assessment of the gap between adaptation needs and reality, based on preliminary thinking on how baselines, future goals or targets, and gaps between them might be defined for climate change adaptation. The report focuses on gaps in developing countries in three important areas: finance, technology and knowledge.

  17. Biological sample collector

    Science.gov (United States)

    Murphy, Gloria A.

    2010-09-07

    A biological sample collector is adapted to collect several biological samples in a plurality of filter wells. A biological sample collector may comprise a manifold plate for mounting a filter plate thereon, the filter plate having a plurality of filter wells therein; a hollow slider for engaging and positioning a tube that slides therethrough; and a slide case within which the hollow slider travels to allow the tube to be aligned with a selected filter well of the plurality of filter wells, wherein when the tube is aligned with the selected filter well, the tube is pushed through the hollow slider and into the selected filter well to sealingly engage the selected filter well and to allow the tube to deposit a biological sample onto a filter in the bottom of the selected filter well. The biological sample collector may be portable.

  18. Multi-Directional Motion Adaptation

    Directory of Open Access Journals (Sweden)

    David Patrick McGovern

    2012-05-01

    Full Text Available The direction aftereffect (DAE) is a phenomenon whereby prolonged exposure to a moving stimulus biases the perceived direction of subsequent stimuli. It is believed to arise through a selective suppression of directionally tuned neurons in the visual cortex, causing shifts in the population response away from the adapted direction. Whereas most studies consider only unidirectional adaptation, here we examine how concurrent adaptation to multiple directions affects the DAE. Observers were required to judge whether a random dot kinematogram (RDK) moved clockwise or counter-clockwise relative to upwards. In different conditions, observers adapted to a stimulus comprised of directions drawn from a distribution or to bidirectional motion. Increasing the variance of normally distributed directions reduced the magnitude of the peak DAE and broadened its tuning profile. Asymmetric sampling of Gaussian and uniform distributions resulted in shifts of DAE tuning profiles consistent with changes in the perceived global direction of the adapting stimulus. Discrimination thresholds were elevated by an amount that related to the magnitude of the bias. For bidirectional adaptors, adding dots in directions away from the adapting motion led to a pronounced reduction in the DAE. This reduction was observed when dots were added in opposite or orthogonal directions to the adaptor, suggesting that it may arise via inhibition from a broadly tuned normalisation pool. Preliminary simulations with a population coding model, where the gain of a direction-selective neuron is inversely proportional to its response to the adapting stimulus, suggest that it provides a parsimonious account of these adaptation effects.
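
    The population coding account mentioned at the end of the abstract can be illustrated with a small simulation (my assumptions, not the authors' model): direction-tuned units with von Mises tuning have their gains suppressed in proportion to their response to the adaptor, and population-vector decoding then yields a repulsive shift of perceived direction.

```python
# Toy population-coding model of the direction aftereffect (DAE).
import numpy as np

def population_response(direction, prefs, kappa=3.0, gains=None):
    gains = np.ones_like(prefs) if gains is None else gains
    return gains * np.exp(kappa * (np.cos(direction - prefs) - 1.0))   # von Mises tuning curves

def decode(resp, prefs):
    # population-vector readout of the represented direction
    return np.arctan2(np.sum(resp * np.sin(prefs)), np.sum(resp * np.cos(prefs)))

prefs = np.linspace(-np.pi, np.pi, 64, endpoint=False)       # preferred directions of the units
adapt_dir = np.deg2rad(0.0)                                   # adapting direction ("upwards")
gains = 1.0 / (1.0 + population_response(adapt_dir, prefs))   # suppression grows with response to adaptor

for test_deg in (5, 10, 20, 40):
    perceived = decode(population_response(np.deg2rad(test_deg), prefs, gains=gains), prefs)
    print(f"test {test_deg:2d} deg -> perceived {np.rad2deg(perceived):5.1f} deg (repelled from adaptor)")
```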

  19. Pengaruh Self-Acceptance Importance, Affiliation Importance, dan Community Feeling Importance terhadap Compulsive Buying

    Directory of Open Access Journals (Sweden)

    Euis Soliha

    2011-03-01

    Full Text Available This study focused on the phenomenon of compulsive buying. It examined how Self-Acceptance Importance, Affiliation Importance, and Community Feeling Importance influence Compulsive Buying. The population in this research was students in Kota Semarang, of whom 104 students were sampled. To answer the research problem accurately, the researcher applies an econometric Logit model. The results indicate that Self-Acceptance Importance, Affiliation Importance and Community Feeling Importance negatively influence Compulsive Buying. These results support all hypotheses and are consistent with theory. Keywords: compulsive buying, self-acceptance importance, affiliation importance, community feeling importance, Logit model

  20. Adaptive manifold learning.

    Science.gov (United States)

    Zhang, Zhenyue; Wang, Jing; Zha, Hongyuan

    2012-02-01

    Manifold learning algorithms seek to find a low-dimensional parameterization of high-dimensional data. They heavily rely on the notion of what can be considered as local, how accurately the manifold can be approximated locally, and, last but not least, how the local structures can be patched together to produce the global parameterization. In this paper, we develop algorithms that address two key issues in manifold learning: 1) the adaptive selection of the local neighborhood sizes when imposing a connectivity structure on the given set of high-dimensional data points and 2) the adaptive bias reduction in the local low-dimensional embedding by accounting for the variations in the curvature of the manifold as well as its interplay with the sampling density of the data set. We demonstrate the effectiveness of our methods for improving the performance of manifold learning algorithms using both synthetic and real-world data sets.
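
    A simplified sketch of the first issue, adaptive neighborhood selection (my own heuristic, not the paper's algorithm): for each point the neighborhood is grown while a local d-dimensional PCA still explains the neighbors well, so flat regions receive large neighborhoods while highly curved or sparsely sampled regions receive small ones.

```python
# Per-point adaptive neighborhood sizes for manifold learning.
import numpy as np

def adaptive_neighborhood_sizes(X, d=2, k_min=5, k_max=30, tol=0.05):
    n = X.shape[0]
    dists = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    order = np.argsort(dists, axis=1)
    ks = np.empty(n, dtype=int)
    for i in range(n):
        k = k_min
        for k_try in range(k_min, k_max + 1):
            nbrs = X[order[i, 1:k_try + 1]]                            # skip the point itself
            centered = nbrs - nbrs.mean(axis=0)
            s = np.linalg.svd(centered, compute_uv=False)
            resid = np.sqrt(np.sum(s[d:] ** 2) / np.sum(s ** 2))       # off-manifold energy fraction
            if resid > tol:                                            # local linearity violated: stop growing
                break
            k = k_try
        ks[i] = k
    return ks

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = rng.uniform(0, 3 * np.pi, 400)
    X = np.c_[t * np.cos(t), rng.uniform(0, 5, 400), t * np.sin(t)]    # Swiss-roll-like surface
    print(adaptive_neighborhood_sizes(X, d=2)[:10])
```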

  1. New competitive dendrimer-based and highly selective immunosensor for determination of atrazine in environmental, feed and food samples: the importance of antibody selectivity for discrimination among related triazinic metabolites.

    Science.gov (United States)

    Giannetto, Marco; Umiltà, Eleonora; Careri, Maria

    2014-01-01

    A new voltammetric competitive immunosensor selective for atrazine, based on the immobilization of an atrazine-bovine serum albumin conjugate on a nanostructured gold substrate previously functionalized with polyamidoamine dendrimers, was realized, characterized, and validated in different real samples of environmental and food concern. The response of the sensor was reliable, highly selective and suitable for the detection and quantification of atrazine at trace levels in complex matrices such as territorial waters, corn-cultivated soils, corn-containing poultry and bovine feeds, and corn flakes for human consumption. Selectivity studies focused on desethylatrazine, the principal metabolite generated by long-term microbiological degradation of atrazine, as well as terbutylazine-2-hydroxy and simazine as potential interferents. The response of the developed immunosensor for atrazine was explored over the 10(-2)-10(3) ng mL(-1) range. Good sensitivity was demonstrated, with an estimated limit of detection of 1.2 ng mL(-1) and limit of quantitation of 5 ng mL(-1) for atrazine. RSD values <5% over the entire explored range attested to the good precision of the device.

  2. Genetic structure of different cat populations in Europe and South America at a microgeographic level: importance of the choice of an adequate sampling level in the accuracy of population genetics interpretations

    Directory of Open Access Journals (Sweden)

    Manuel Ruiz-Garcia

    1999-12-01

    Full Text Available The phenotypic markers coat color, pattern and hair length of natural domestic cat populations observed in four cities (Barcelona, Catalonia; Palma Majorca, Balearic Islands; Rimini, Italy; and Buenos Aires, Argentina) were studied at a microgeographical level. Various population genetics techniques revealed that the degree of genetic differentiation between populations of Felis catus within these cities is relatively low compared with that found between populations of other mammals. Two different levels of sampling were used: one was that of "natural" colonies of cat families living together at specific points within the cities, and the other referred to "artificial" subpopulations, or groups of colonies, inhabiting the same district within a city. For the two sampling levels, some of the results were identical: 1) little genic heterogeneity, 2) existence of panmixia, 3) similar levels of expected heterozygosity in all populations analyzed, 4) no spatial autocorrelation, with certain differentiation of the Buenos Aires population compared to the others, and 5) very high correlations between colonies and subpopulations with the first factors from a Q factor analysis. Nevertheless, other population genetic statistics were greatly affected by the choice of sampling level. This was the case for: 1) the amount of heterogeneity of the FST and GST statistics between the cities, which was greater at the subpopulation level than at the colony level, 2) the existence of correlations between genic differentiation statistics and size variables at the subpopulation level, but not at the colony level, and 3) the relationships between the genetic variables and the principal factors of the R factor analysis. This suggests that care should be taken in the choice of the sampling unit, for inferences on population genetics to be valid at the microgeographical level.

  3. Adaptive management

    DEFF Research Database (Denmark)

    Rist, Lucy; Campbell, Bruce Morgan; Frost, Peter

    2013-01-01

    Adaptive management (AM) emerged in the literature in the mid-1970s in response both to a realization of the extent of uncertainty involved in management, and a frustration with attempts to use modelling to integrate knowledge and make predictions. The term has since become increasingly widely used in scientific articles, policy documents and management plans, but both understanding and application of the concept is mixed. This paper reviews recent literature from conservation and natural resource management journals to assess diversity in how the term is used, highlight ambiguities and consider how the concept might be further assessed. AM is currently being used to describe many different management contexts, scales and locations. Few authors define the term explicitly or describe how it offers a means to improve management outcomes in their specific management context. Many do not adhere to the idea...

  4. Capillary sample

    Science.gov (United States)

    ... several times a day using capillary blood sampling. Disadvantages to capillary blood sampling include: Only a limited ...

  5. GLOBALIZATION AND IMPORT RISKS

    Directory of Open Access Journals (Sweden)

    Popa Ioan

    2014-07-01

    Full Text Available Delocalization of production and diversification of the sources of offer in the global market place the issue of protection of consumer rights in major consumption centres, namely the European Union in a new light. A review of policies for the protection of consumer rights in the EU, USA and China, reveals major differences regarding the protection of consumer rights and the existence of gaps, and in particular the implementation of effective legislation in this regard. As such, the risks associated with imports have become a major concern in the European Union. The consumer has – one can say – a central role in the globalization process, which justifies the measures aimed at its protection. Although worldwide there are major differences in the degree of market regulation in matters of protection of consumer rights, the trend is the continuous adaptation of the offer to the requirements of global demand. However, one can still find significant gaps which translate into risks specific to the consumers in developed countries, namely in the EU. An important issue arises from this radical change of the localization of production centres in relation to the main consumption centres. While in the developed world, consumer rights protection has reached high levels both by creating an appropriate legislative framework and through consumer awareness and activism regarding their rights, in areas where much of the offer comes from the Western market (China, India, etc. modern mentality on the protection of consumer rights is just emerging. A major requirement is therefore the provision of a status of the consumer compatible with the benefits and risks of globalization, a status defined by safety and protection of imports. This paper confirms the thesis that, ultimately, the main factor counteracting the risks in matters of protection of consumer rights is the consumer, its awareness of its rights.

  6. Kinetic Solvers with Adaptive Mesh in Phase Space

    CERN Document Server

    Arslanbekov, Robert R; Frolova, Anna A

    2013-01-01

    An Adaptive Mesh in Phase Space (AMPS) methodology has been developed for solving multi-dimensional kinetic equations by the discrete velocity method. A Cartesian mesh for both configuration (r) and velocity (v) spaces is produced using a tree of trees data structure. The mesh in r-space is automatically generated around embedded boundaries and dynamically adapted to local solution properties. The mesh in v-space is created on-the-fly for each cell in r-space. Mappings between neighboring v-space trees are implemented for the advection operator in configuration space. We have developed new algorithms for solving the full Boltzmann and linear Boltzmann equations with AMPS. Several recent innovations were used to calculate the full Boltzmann collision integral with a dynamically adaptive mesh in velocity space: importance sampling, a multi-point projection method, and the variance reduction method. We have developed an efficient algorithm for calculating the linear Boltzmann collision integral for elastic and inelastic...

  7. Adaptive vehicle motion estimation and prediction

    Science.gov (United States)

    Zhao, Liang; Thorpe, Chuck E.

    1999-01-01

    Accurate motion estimation and reliable maneuver prediction enable an automated car to react quickly and correctly to the rapid maneuvers of the other vehicles, and so allow safe and efficient navigation. In this paper, we present a car tracking system which provides motion estimation, maneuver prediction and detection of the tracked car. The three strategies employed - adaptive motion modeling, adaptive data sampling, and adaptive model switching probabilities - result in an adaptive interacting multiple model algorithm (AIMM). The experimental results on simulated and real data demonstrate that our tracking system is reliable, flexible, and robust. The adaptive tracking makes the system intelligent and useful in various autonomous driving tasks.
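
    For orientation only, here is a bare-bones one-dimensional interacting multiple model (IMM) tracker; it is a generic textbook construction, not the AIMM of the paper, and it omits the adaptive motion modeling, adaptive data sampling and adaptive switching probabilities described above. Two constant-velocity Kalman filters with different process noise stand in for "cruise" and "maneuver" models.

```python
# Minimal 1D interacting multiple model (IMM) filter.
import numpy as np

dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])                 # constant-velocity dynamics
H = np.array([[1.0, 0.0]])                            # position-only measurement
R = np.array([[0.5 ** 2]])
Qs = [np.diag([1e-4, 1e-3]), np.diag([1e-2, 1.0])]    # low / high process noise models
P_trans = np.array([[0.95, 0.05], [0.05, 0.95]])      # fixed model switching probabilities

def kf_step(x, P, z, Q):
    x, P = F @ x, F @ P @ F.T + Q                     # predict
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    innov = z - H @ x
    lik = np.exp(-0.5 * innov @ np.linalg.inv(S) @ innov) / np.sqrt(2 * np.pi * np.linalg.det(S))
    return x + K @ innov, (np.eye(2) - K @ H) @ P, float(lik)

def imm_step(xs, Ps, mu, z):
    c = P_trans.T @ mu                                            # predicted model probabilities
    mix = (P_trans * mu[:, None]) / c[None, :]                    # mixing weights mix[i, j]
    xs_mix = [sum(mix[i, j] * xs[i] for i in range(2)) for j in range(2)]
    Ps_mix = [sum(mix[i, j] * (Ps[i] + np.outer(xs[i] - xs_mix[j], xs[i] - xs_mix[j]))
                  for i in range(2)) for j in range(2)]
    out = [kf_step(xs_mix[j], Ps_mix[j], z, Qs[j]) for j in range(2)]
    xs, Ps, liks = [o[0] for o in out], [o[1] for o in out], np.array([o[2] for o in out])
    mu = liks * c
    mu /= mu.sum()
    return xs, Ps, mu, sum(mu[j] * xs[j] for j in range(2))       # combined state estimate

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    xs, Ps, mu = [np.zeros(2), np.zeros(2)], [np.eye(2), np.eye(2)], np.array([0.5, 0.5])
    true = np.array([0.0, 1.0])
    for k in range(50):
        if k == 25:
            true[1] += 3.0                                        # sudden maneuver of the tracked car
        true = F @ true
        z = H @ true + rng.normal(0.0, 0.5, size=1)
        xs, Ps, mu, est = imm_step(xs, Ps, mu, z)
    print("final model probabilities:", mu)
```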

  8. Social Importance Dynamics: A Model for Culturally-Adaptive Agents

    NARCIS (Netherlands)

    Mascarenhas, S.; Prada, R.; Paiva, A.; Hofstede, G.J.

    2013-01-01

    The unwritten rules of human cultures greatly affect social behaviour and as such should be considered in the development of socially intelligent agents. So far, there has been a large focus on modeling cultural aspects related to non-verbal behaviour such as gaze or body posture. However, culture a

  9. Taking Root in Foreign Soil: Adaptation Processes of Imported Universities

    Science.gov (United States)

    Graham, Terrece F.

    2016-01-01

    The fall of the Berlin Wall in 1989 ushered in a period of change in higher-education systems across the former Eastern bloc. Reform-minded leaders in the region sought to introduce western models and policies promoted by foreign development aid agendas. Private higher-education institutions emerged. This qualitative multiple case study examines…

  10. Adaptive method of lines

    CERN Document Server

    Saucez, Ph

    2001-01-01

    The general Method of Lines (MOL) procedure provides a flexible format for the solution of all the major classes of partial differential equations (PDEs) and is particularly well suited to evolutionary, nonlinear wave PDEs. Despite its utility, however, there are relatively few texts that explore it at a more advanced level and reflect the method's current state of development. Written by distinguished researchers in the field, Adaptive Method of Lines reflects the diversity of techniques and applications related to the MOL. Most of its chapters focus on a particular application but also provide a discussion of underlying philosophy and technique. Particular attention is paid to the concept of both temporal and spatial adaptivity in solving time-dependent PDEs. Many important ideas and methods are introduced, including moving grids and grid refinement, static and dynamic gridding, the equidistribution principle and the concept of a monitor function, the minimization of a functional, and the moving finite elem...
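
    A minimal method-of-lines illustration (a generic textbook example, not taken from the book): the 1D heat equation u_t = u_xx is semi-discretized in space on a fixed grid and the resulting ODE system is handed to a stiff integrator. Adaptive MOL codes would additionally move or refine this grid, for instance by equidistributing a monitor function.

```python
# Method of lines for the 1D heat equation with Dirichlet boundary conditions.
import numpy as np
from scipy.integrate import solve_ivp

n = 101
x = np.linspace(0.0, 1.0, n)
dx = x[1] - x[0]

def rhs(t, u):
    dudt = np.zeros_like(u)
    dudt[1:-1] = (u[2:] - 2 * u[1:-1] + u[:-2]) / dx ** 2   # central differences in space
    return dudt                                             # boundary values stay fixed at 0

u0 = np.sin(np.pi * x)                                      # initial condition
sol = solve_ivp(rhs, (0.0, 0.1), u0, method="BDF", t_eval=[0.1])
exact = np.exp(-np.pi ** 2 * 0.1) * np.sin(np.pi * x)       # analytical solution for comparison
print("max error vs exact solution:", np.max(np.abs(sol.y[:, -1] - exact)))
```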

  11. Language sampling

    DEFF Research Database (Denmark)

    Rijkhoff, Jan; Bakker, Dik

    1998-01-01

    This article has two aims: [1] to present a revised version of the sampling method that was originally proposed in 1993 by Rijkhoff, Bakker, Hengeveld and Kahrel, and [2] to discuss a number of other approaches to language sampling in the light of our own method. We will also demonstrate how our sampling method is used with different genetic classifications (Voegelin & Voegelin 1977, Ruhlen 1987, Grimes ed. 1997) and argue that —on the whole— our sampling technique compares favourably with other methods, especially in the case of exploratory research.

  12. The short version of the Critical Care Family Needs Inventory (CCFNI): adaptation and validation for a Spanish sample

    Directory of Open Access Journals (Sweden)

    S. Gómez Martínez

    2011-12-01

    Full Text Available Relatives are a very important part of the illness process and the care of patients admitted to Intensive Care Units (ICU). It is therefore essential to know their needs in order to improve their adaptation to a situation as difficult as an ICU admission. The aim of the present study was to adapt and validate the short version of the Critical Care Family Needs Inventory (CCFNI) in a Spanish sample. The questionnaire, adapted according to international guidelines, was administered to 55 relatives of patients admitted to the ICU of the Hospital General Universitario de Castellón. After three items were eliminated for various reasons, an exploratory factor analysis was performed on the remaining 11 items to obtain the factor structure. A descriptive item analysis was carried out, internal consistency was estimated with Cronbach's α, and construct validity was assessed with Pearson's correlation coefficient. The CCFNI showed a four-factor structure corresponding to: medical care of the patient, personal attention to the relative, information and doctor-family communication, and perceived possible improvements. This version of the CCFNI showed good internal consistency both for the total scale and for the factors. The version of the CCFNI validated in the present study is an adequate measure for assessing the different needs of the relatives of patients admitted to an ICU, showing adequate psychometric properties.

  13. Language sampling

    DEFF Research Database (Denmark)

    Rijkhoff, Jan; Bakker, Dik

    1998-01-01

    This article has two aims: [1] to present a revised version of the sampling method that was originally proposed in 1993 by Rijkhoff, Bakker, Hengeveld and Kahrel, and [2] to discuss a number of other approaches to language sampling in the light of our own method. We will also demonstrate how our...... sampling method is used with different genetic classifications (Voegelin & Voegelin 1977, Ruhlen 1987, Grimes ed. 1997) and argue that —on the whole— our sampling technique compares favourably with other methods, especially in the case of exploratory research....

  14. Adaptation-Based Programming in Haskell

    CERN Document Server

    Bauer, Tim; Fern, Alan; Pinto, Jervis; 10.4204/EPTCS.66.1

    2011-01-01

    We present an embedded DSL to support adaptation-based programming (ABP) in Haskell. ABP is an abstract model for defining adaptive values, called adaptives, which adapt in response to some associated feedback. We show how our design choices in Haskell motivate higher-level combinators and constructs and help us derive more complicated compositional adaptives. We also show an important specialization of ABP is in support of reinforcement learning constructs, which optimize adaptive values based on a programmer-specified objective function. This permits ABP users to easily define adaptive values that express uncertainty anywhere in their programs. Over repeated executions, these adaptive values adjust to more efficient ones and enable the user's programs to self optimize. The design of our DSL depends significantly on the use of type classes. We will illustrate, along with presenting our DSL, how the use of type classes can support the gradual evolution of DSLs.

  15. Adaptation-Based Programming in Haskell

    Directory of Open Access Journals (Sweden)

    Tim Bauer

    2011-09-01

    Full Text Available We present an embedded DSL to support adaptation-based programming (ABP) in Haskell. ABP is an abstract model for defining adaptive values, called adaptives, which adapt in response to some associated feedback. We show how our design choices in Haskell motivate higher-level combinators and constructs and help us derive more complicated compositional adaptives. We also show an important specialization of ABP is in support of reinforcement learning constructs, which optimize adaptive values based on a programmer-specified objective function. This permits ABP users to easily define adaptive values that express uncertainty anywhere in their programs. Over repeated executions, these adaptive values adjust to more efficient ones and enable the user's programs to self optimize. The design of our DSL depends significantly on the use of type classes. We will illustrate, along with presenting our DSL, how the use of type classes can support the gradual evolution of DSLs.
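
    The paper's DSL is embedded in Haskell; purely to convey the core idea of an adaptive value, here is a small Python sketch of my own: a value chosen from a finite set that adapts over repeated executions in response to programmer-supplied feedback, using a simple epsilon-greedy rule.

```python
# An "adaptive" value optimized by feedback over repeated executions (epsilon-greedy).
import random

class Adaptive:
    """A value drawn from a finite set; suggest() picks one, feedback() rewards the last pick."""
    def __init__(self, choices, epsilon=0.1):
        self.choices = list(choices)
        self.epsilon = epsilon
        self.totals = {c: 0.0 for c in self.choices}
        self.counts = {c: 0 for c in self.choices}
        self._last = None

    def suggest(self):
        if random.random() < self.epsilon or not any(self.counts.values()):
            self._last = random.choice(self.choices)                              # explore
        else:
            self._last = max(self.choices,
                             key=lambda c: self.totals[c] / max(self.counts[c], 1))  # exploit
        return self._last

    def feedback(self, reward):
        self.totals[self._last] += reward
        self.counts[self._last] += 1

# Example: a program parameter that self-optimizes toward an objective unknown to the programmer.
batch_size = Adaptive([16, 32, 64, 128])
for _ in range(500):
    b = batch_size.suggest()
    batch_size.feedback(-abs(b - 64) + random.gauss(0, 5))   # noisy objective peaking at 64
best = max(batch_size.choices, key=lambda c: batch_size.totals[c] / max(batch_size.counts[c], 1))
print("learned preference:", best)
```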

  16. Exploring the path through which career adaptability increases job satisfaction and lowers work stress : the role of affect

    OpenAIRE

    Fiori, M.; Bollmann, G.; Rossier, J.

    2015-01-01

    The construct of career adaptability, or the ability to successfully manage one's career development and challenges, predicts several important outcomes; however, little is known about the mechanisms contributing to its positive effects. The present study investigated the impact of career adaptability on job satisfaction and work stress, as mediated by individuals' affective states. Using a representative sample of 1671 individuals employed in Switzerland we hypothesized that, over time, c...

  17. Sampling Development

    Science.gov (United States)

    Adolph, Karen E.; Robinson, Scott R.

    2011-01-01

    Research in developmental psychology requires sampling at different time points. Accurate depictions of developmental change provide a foundation for further empirical studies and theories about developmental mechanisms. However, overreliance on widely spaced sampling intervals in cross-sectional and longitudinal designs threatens the validity of…

  18. On Adaptive Optimal Input Design

    NARCIS (Netherlands)

    Stigter, J.D.; Vries, D.; Keesman, K.J.

    2003-01-01

    The problem of optimal input design (OID) for a fed-batch bioreactor case study is solved recursively. Here an adaptive receding horizon optimal control problem, involving the so-called E-criterion, is solved on-line, using the current estimate of the parameter vector at each sample instant {tk, k =

  19. Sample processing device and method

    DEFF Research Database (Denmark)

    2011-01-01

    A sample processing device is disclosed, which sample processing device comprises a first substrate and a second substrate, where the first substrate has a first surface comprising two area types, a first area type with a first contact angle with water and a second area type with a second contact...... a sample liquid comprising the sample and the first preparation system is adapted to receive a receiving liquid. In a particular embodiment, a magnetic sample transport component, such as a permanent magnet or an electromagnet, is arranged to move magnetic beads in between the first and second substrates....

  20. Binocular measurements of chromatic adaptation.

    Science.gov (United States)

    Troost, J M; Wei, L; de Weert, C M

    1992-10-01

    In this paper we present asymmetric matching data that were obtained with a binocular presentation method. Our main motivation was the question whether chromatic adaptation, one of the important mechanisms that contribute to colour constancy, has evolved towards a better performance in the range of colours that are present in the natural image. For the eye adapted to a bluish illuminant for example the presence of an object with a deep yellow colour is very unlikely. So, it was expected that the colour difference between adapting light and target has an influence on the extent of chromatic adaptation. It was found that the colour shift in the observers' matches that can be attributed to chromatic adaptation indeed has a maximum. The location of the maximum, however, was unexpected, i.e. colour differences between target and adapting light that lie around 0.05 u'v'-chromaticity units. Additionally, several models for chromatic adaptation were fitted to our data. It was found that, except for the simple von Kries model, Retinex Theory and difference contrast, a number of models gave good predictions for the L-wave and M-wave fundamental systems, but that predictions for the S-wave system were less accurate.
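
    For reference, the von Kries model mentioned above is a diagonal rescaling of cone responses; the sketch below applies the standard transform (a textbook construction, not the authors' fitting code), using the commonly quoted Hunt-Pointer-Estevez XYZ-to-LMS matrix, whose values should be treated as approximate here.

```python
# Simple von Kries chromatic adaptation: scale cone responses by the ratio of adapting whites.
import numpy as np

M_HPE = np.array([[ 0.38971, 0.68898, -0.07868],
                  [-0.22981, 1.18340,  0.04641],
                  [ 0.00000, 0.00000,  1.00000]])      # XYZ -> LMS (Hunt-Pointer-Estevez)

def von_kries_adapt(xyz, white_src, white_dst):
    """Map a colour viewed under white_src to its corresponding colour under white_dst."""
    lms = M_HPE @ xyz
    gain = (M_HPE @ white_dst) / (M_HPE @ white_src)   # diagonal von Kries scaling
    return np.linalg.solve(M_HPE, gain * lms)          # back to XYZ

# Example: a sample seen under illuminant A mapped to its corresponding colour under D65.
white_A   = np.array([1.0985, 1.0000, 0.3558])
white_D65 = np.array([0.9504, 1.0000, 1.0888])
print(von_kries_adapt(np.array([0.4, 0.3, 0.2]), white_A, white_D65))
```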