WorldWideScience

Sample records for computer adaptive short

  1. Towards psychologically adaptive brain-computer interfaces

    Science.gov (United States)

    Myrden, A.; Chau, T.

    2016-12-01

    Objective. Brain-computer interface (BCI) performance is sensitive to short-term changes in psychological states such as fatigue, frustration, and attention. This paper explores the design of a BCI that can adapt to these short-term changes. Approach. Eleven able-bodied individuals participated in a study during which they used a mental task-based EEG-BCI to play a simple maze navigation game while self-reporting their perceived levels of fatigue, frustration, and attention. In an offline analysis, a regression algorithm was trained to predict changes in these states, yielding Pearson correlation coefficients in excess of 0.45 between the self-reported and predicted states. Two means of fusing the resultant mental state predictions with mental task classification were investigated. First, single-trial mental state predictions were used to predict correct classification by the BCI during each trial. Second, an adaptive BCI was designed that retrained a new classifier for each testing sample using only those training samples for which predicted mental state was similar to that predicted for the current testing sample. Main results. Mental state-based prediction of BCI reliability exceeded chance levels. The adaptive BCI exhibited significant, but practically modest, increases in classification accuracy for five of 11 participants and no significant difference for the remaining six despite a smaller average training set size. Significance. Collectively, these findings indicate that adaptation to psychological state may allow the design of more accurate BCIs.
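
The adaptive retraining described above (select only training samples whose predicted mental state is close to that of the current test sample, then fit a fresh classifier) can be sketched as follows; the nearest-centroid classifier, the subset size k, and all names are illustrative assumptions rather than the authors' implementation.

```python
import numpy as np

def state_adapted_classify(X_train, y_train, s_train, x_test, s_test, k=20):
    """Retrain per test sample: keep the k training samples whose predicted
    mental state is closest to the test sample's state, then classify with
    a nearest-centroid rule over that subset."""
    # rank training samples by similarity of predicted mental state
    idx = np.argsort(np.abs(s_train - s_test))[:k]
    Xs, ys = X_train[idx], y_train[idx]
    labels = np.unique(ys)
    # nearest-centroid decision over the state-matched subset
    centroids = np.array([Xs[ys == c].mean(axis=0) for c in labels])
    d = np.linalg.norm(centroids - x_test, axis=1)
    return int(labels[np.argmin(d)])
```

A larger k recovers the conventional (non-adaptive) classifier, which matches the paper's observation that the adaptive variant works from a smaller effective training set.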

  2. Very-long-term and short-term chromatic adaptation: are their influences cumulative?

    Science.gov (United States)

    Belmore, Suzanne C; Shevell, Steven K

    2011-02-09

    Very-long-term (VLT) chromatic adaptation results from exposure to an altered chromatic environment for days or weeks. Color shifts from VLT adaptation are observed hours or days after leaving the altered environment. Short-term chromatic adaptation, on the other hand, results from exposure for a few minutes or less, with color shifts measured within seconds or a few minutes after the adapting light is extinguished; recovery to the pre-adapted state is complete in less than an hour. Here, both types of adaptation were combined. All adaptation was to reddish-appearing long-wavelength light. Shifts in unique yellow were measured following adaptation. Previous studies demonstrate shifts in unique yellow due to VLT chromatic adaptation, but shifts from short-term chromatic adaptation to comparable adapting light can be far greater than from VLT adaptation. The question considered here is whether the color shifts from VLT adaptation are cumulative with large shifts from short-term adaptation or, alternatively, does simultaneous short-term adaptation eliminate color shifts caused by VLT adaptation. The results show the color shifts from VLT and short-term adaptation together are cumulative, which indicates that both short-term and very-long-term chromatic adaptation affect color perception during natural viewing. Copyright © 2010 Elsevier Ltd. All rights reserved.

  3. 21 CFR 874.1070 - Short increment sensitivity index (SISI) adapter.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Short increment sensitivity index (SISI) adapter. 874.1070 Section 874.1070 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN... increment sensitivity index (SISI) adapter. (a) Identification. A short increment sensitivity index (SISI...

  4. Adaptation of HAMMER computer code to CYBER 170/750 computer

    International Nuclear Information System (INIS)

    Pinheiro, A.M.B.S.; Nair, R.P.K.

    1982-01-01

    The adaptation of the HAMMER computer code to the CYBER 170/750 computer is presented. The HAMMER code calculates cell parameters by multigroup transport theory and reactor parameters by few-group diffusion theory. The auxiliary programs, the modifications carried out, and the use of the HAMMER system adapted to the CYBER 170/750 computer are described. (M.C.K.) [pt

  5. Applying computer adaptive testing to optimize online assessment of suicidal behavior: a simulation study.

    NARCIS (Netherlands)

    de Beurs, D.P.; de Vries, A.L.M.; de Groot, M.H.; de Keijser, J.; Kerkhof, A.J.F.M.

    2014-01-01

    Background: The Internet is used increasingly for both suicide research and prevention. To optimize online assessment of suicidal patients, there is a need for short, good-quality tools to assess elevated risk of future suicidal behavior. Computer adaptive testing (CAT) can be used to reduce

  6. Adaptability of supercomputers to nuclear computations

    International Nuclear Information System (INIS)

    Asai, Kiyoshi; Ishiguro, Misako; Matsuura, Toshihiko.

    1983-01-01

    Recently, in the field of scientific and technical calculation, the usefulness of supercomputers represented by the CRAY-1 has been recognized, and they are utilized in various countries. The rapid computation of supercomputers is based on their vector-computation capability. The authors investigated the adaptability to vector computation of about 40 typical atomic energy codes over the past six years. Based on the results of this investigation, the suitability of the vector-computation capability of supercomputers for atomic energy codes, problems regarding their utilization, and future prospects are explained. The adaptability of individual calculation codes to vector computation depends largely on the algorithms and program structures used in the codes. The speed-up achieved by pipelined vector systems, the investigation at the Japan Atomic Energy Research Institute and its results, and examples of vectorizing codes for atomic energy, environmental safety, and nuclear fusion are reported. The speed-up factors for the 40 examples ranged from 1.5 to 9.0. It can be said that the adaptability of supercomputers to atomic energy codes is fairly good. (Kako, I.)

  7. DEFACTO: A Design Environment for Adaptive Computing Technology

    National Research Council Canada - National Science Library

    Hall, Mary

    2003-01-01

    This report describes the activities of the DEFACTO project, a Design Environment for Adaptive Computing Technology funded under the DARPA Adaptive Computing Systems and Just-In-Time-Hardware programs...

  8. Quantum Dynamics with Short-Time Trajectories and Minimal Adaptive Basis Sets.

    Science.gov (United States)

    Saller, Maximilian A C; Habershon, Scott

    2017-07-11

    Methods for solving the time-dependent Schrödinger equation via basis set expansion of the wave function can generally be categorized as having either static (time-independent) or dynamic (time-dependent) basis functions. We have recently introduced an alternative simulation approach which represents a middle road between these two extremes, employing dynamic (classical-like) trajectories to create a static basis set of Gaussian wavepackets in regions of phase-space relevant to future propagation of the wave function [J. Chem. Theory Comput., 11, 8 (2015)]. Here, we propose and test a modification of our methodology which aims to reduce the size of basis sets generated in our original scheme. In particular, we employ short-time classical trajectories to continuously generate new basis functions for short-time quantum propagation of the wave function; to avoid the continued growth of the basis set describing the time-dependent wave function, we employ Matching Pursuit to periodically minimize the number of basis functions required to accurately describe the wave function. Overall, this approach generates a basis set which is adapted to evolution of the wave function while also being as small as possible. In applications to challenging benchmark problems, namely a 4-dimensional model of photoexcited pyrazine and three different double-well tunnelling problems, we find that our new scheme enables accurate wave function propagation with basis sets which are around an order-of-magnitude smaller than our original trajectory-guided basis set methodology, highlighting the benefits of adaptive strategies for wave function propagation.
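
The Matching Pursuit step used above to prune the basis can be illustrated in generic form; this sketch compresses a target vector over a fixed dictionary and is a stand-in for the paper's wavepacket basis reduction (the dictionary, tolerance, and names are assumptions).

```python
import numpy as np

def matching_pursuit(target, dictionary, tol=1e-3, max_atoms=50):
    """Greedy Matching Pursuit: repeatedly pick the normalised dictionary
    atom most correlated with the residual and subtract its contribution,
    stopping once the residual is small. Returns atom indices and
    coefficients."""
    D = dictionary / np.linalg.norm(dictionary, axis=0, keepdims=True)
    r = target.astype(float).copy()
    idx, coef = [], []
    for _ in range(max_atoms):
        c = D.T @ r                        # correlation with each atom
        j = int(np.argmax(np.abs(c)))
        idx.append(j)
        coef.append(float(c[j]))
        r = r - c[j] * D[:, j]             # remove the matched component
        if np.linalg.norm(r) < tol * np.linalg.norm(target):
            break
    return idx, coef
```

For an orthonormal dictionary the selected coefficients reproduce the target exactly, which is the sense in which the pursuit keeps only as many basis functions as the wave function actually needs.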

  9. Human Adaptation to the Computer.

    Science.gov (United States)

    1986-09-01

    ... Human adaptation to the computer has not developed. Instead, what has developed is a "modern disease of adaptation" called "technostress," a phrase coined by Brod. Managers (according to Brod) have been implementing computers in ways that contribute directly to this stress [Ref. 3:p. 38].

  10. simulate_CAT: A Computer Program for Post-Hoc Simulation for Computerized Adaptive Testing

    Directory of Open Access Journals (Sweden)

    İlker Kalender

    2015-06-01

    Full Text Available This paper presents computer software developed by the author. The software conducts post-hoc simulations for computerized adaptive testing based on examinees' real responses to paper-and-pencil tests, under different parameters that can be defined by the user. In this paper, brief information is given about post-hoc simulations. After that, the working principle of the software is explained and a sample simulation with the required input files is shown. Finally, the output files are described.
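
A post-hoc CAT simulation of this kind can be sketched in a few lines; the 2PL item response model, maximum-information item selection, and grid-based EAP update below are common textbook choices assumed for illustration, not taken from the software itself.

```python
import numpy as np

def posthoc_cat(responses, a, b, n_items=5):
    """Post-hoc CAT under a 2PL IRT model: at each step, administer the
    unused item with maximum Fisher information at the current ability
    estimate, score it with the examinee's recorded paper-and-pencil
    response, and update ability by expected-a-posteriori (EAP) estimation
    on a grid."""
    grid = np.linspace(-4.0, 4.0, 161)
    post = np.exp(-0.5 * grid**2)               # standard-normal prior
    # item response probabilities over the whole ability grid
    p_grid = 1.0 / (1.0 + np.exp(-a[:, None] * (grid[None, :] - b[:, None])))
    administered = []
    theta = 0.0
    for _ in range(n_items):
        pt = 1.0 / (1.0 + np.exp(-a * (theta - b)))
        info = a**2 * pt * (1.0 - pt)           # 2PL Fisher information
        info[administered] = -np.inf            # never reuse an item
        j = int(np.argmax(info))
        administered.append(j)
        like = p_grid[j] if responses[j] == 1 else 1.0 - p_grid[j]
        post = post * like                      # Bayesian update
        theta = float(np.sum(grid * post) / np.sum(post))
    return theta, administered
```

Running the loop over every examinee in a real response matrix is exactly the "post-hoc" part: no new data are collected, only the order and number of administered items change.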

  11. Discrete linear canonical transform computation by adaptive method.

    Science.gov (United States)

    Zhang, Feng; Tao, Ran; Wang, Yue

    2013-07-29

    The linear canonical transform (LCT) describes the effect of quadratic phase systems on a wavefield and generalizes many optical transforms. In this paper, the computation method for the discrete LCT using the adaptive least-mean-square (LMS) algorithm is presented. The computation approaches of the block-based discrete LCT and the stream-based discrete LCT using the LMS algorithm are derived, and the implementation structures of these approaches by the adaptive filter system are considered. The proposed computation approaches have the inherent parallel structures which make them suitable for efficient VLSI implementations, and are robust to the propagation of possible errors in the computation process.
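
The computation rests on the least-mean-square update rule. A generic LMS adaptive filter (the tap count, step size, and the system being identified are assumptions here, not the authors' discrete-LCT structure) can be sketched as:

```python
import numpy as np

def lms_filter(x, d, n_taps=4, mu=0.05):
    """Generic LMS adaptive filter: the weights w are nudged along the
    instantaneous error gradient so that w . [x[n], x[n-1], ...] tracks
    the desired signal d."""
    w = np.zeros(n_taps)
    y = np.zeros(len(x))
    for n in range(n_taps - 1, len(x)):
        window = x[n - n_taps + 1:n + 1][::-1]  # newest sample first
        y[n] = w @ window
        e = d[n] - y[n]                         # instantaneous error
        w = w + 2.0 * mu * e * window           # LMS weight update
    return w, y
```

Because each update touches only the current window, the structure maps naturally onto the parallel, error-tolerant hardware implementations the abstract mentions.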

  12. Wavefront measurement using computational adaptive optics.

    Science.gov (United States)

    South, Fredrick A; Liu, Yuan-Zhi; Bower, Andrew J; Xu, Yang; Carney, P Scott; Boppart, Stephen A

    2018-03-01

    In many optical imaging applications, it is necessary to correct for aberrations to obtain high quality images. Optical coherence tomography (OCT) provides access to the amplitude and phase of the backscattered optical field for three-dimensional (3D) imaging samples. Computational adaptive optics (CAO) modifies the phase of the OCT data in the spatial frequency domain to correct optical aberrations without using a deformable mirror, as is commonly done in hardware-based adaptive optics (AO). This provides improvement of image quality throughout the 3D volume, enabling imaging across greater depth ranges and in highly aberrated samples. However, the CAO aberration correction has a complicated relation to the imaging pupil and is not a direct measurement of the pupil aberrations. Here we present new methods for recovering the wavefront aberrations directly from the OCT data without the use of hardware adaptive optics. This enables both computational measurement and correction of optical aberrations.
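
The core CAO idea (modify the phase of the complex data in the spatial-frequency domain, with no deformable mirror) can be sketched for a single assumed quadratic, defocus-like aberration; the phase model and coefficient below are illustrative assumptions, not the paper's measured wavefronts.

```python
import numpy as np

def cao_correct(field, defocus_coeff):
    """Computational aberration correction: multiply the 2-D spatial-
    frequency spectrum of a complex field by the conjugate of an assumed
    quadratic (defocus-like) phase, then transform back."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n)
    FX, FY = np.meshgrid(fx, fx)
    phase = defocus_coeff * (FX**2 + FY**2)   # assumed aberration phase
    F = np.fft.fft2(field)
    return np.fft.ifft2(F * np.exp(-1j * phase))
```

Applying the conjugate of the true aberration phase restores the field exactly, which is why recovering the wavefront directly from the data, as the paper proposes, is equivalent to measuring the aberration.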

  13. Unauthorised adaptation of computer programmes - is criminalisation a solution?

    Directory of Open Access Journals (Sweden)

    L Muswaka

    2011-12-01

    Full Text Available In Haupt t/a Softcopy v Brewers Marketing Intelligence (Pty) Ltd 2006 4 SA 458 (SCA), Haupt sought to enforce a copyright claim in the Data Explorer computer programme against Brewers Marketing Intelligence (Pty) Ltd. His claim was dismissed in the High Court and he appealed to the Supreme Court of Appeal. The Court held that copyright in the Data Explorer programme vested in Haupt. Haupt acquired copyright in the Data Explorer programme regardless of the fact that the programme was a result of an unauthorised adaptation of the Project AMPS programme, which belonged to Brewers Marketing Intelligence (Pty) Ltd. This case note inter alia analyses the possibility of an author being sued for infringement even though he has acquired copyright in a work that he created by making unauthorised adaptations to another's copyright material. Furthermore, it examines whether or not the law adequately protects copyright owners in situations where infringement takes the form of unauthorised adaptations of computer programmes. It is argued that the protection afforded by the Copyright Act 98 of 1978 (Copyright Act) in terms of section 27(1) to copyright owners of computer programmes is narrowly defined. It excludes from its ambit of criminal liability the act of making an unauthorised adaptation of a computer programme. The issue considered is therefore whether or not the unauthorised adaptation of computer programmes should attract a criminal sanction. In addressing this issue, and with the aim of making recommendations, the legal position in the United Kingdom (UK) is analysed. From the analysis it is recommended that the Copyright Act be amended by the insertion of a new section, section 27(1)(A), which will make the act of making an unauthorised adaptation of a computer programme an offence. This recommended section will close the gap that currently exists in our law with regard to unauthorised adaptations of computer programmes.

  14. Polynomial Phase Estimation Based on Adaptive Short-Time Fourier Transform.

    Science.gov (United States)

    Jing, Fulong; Zhang, Chunjie; Si, Weijian; Wang, Yu; Jiao, Shuhong

    2018-02-13

    Polynomial phase signals (PPSs) have numerous applications in many fields including radar, sonar, geophysics, and radio communication systems. Therefore, estimation of PPS coefficients is very important. In this paper, a novel approach for PPS parameter estimation based on an adaptive short-time Fourier transform (ASTFT), called the PPS-ASTFT estimator, is proposed. Using the PPS-ASTFT estimator, both one-dimensional and multi-dimensional searches and error propagation problems, which are widespread in the PPS field, are avoided. In the proposed algorithm, the instantaneous frequency (IF) is estimated by the S-transform (ST), which can preserve information on signal phase and provide a variable resolution similar to the wavelet transform (WT). The width of the ASTFT analysis window is equal to the local stationary length, which is measured by the instantaneous frequency gradient (IFG). The IFG is calculated by principal component analysis (PCA), which is robust to noise. Moreover, to improve estimation accuracy, a refinement strategy is presented to estimate signal parameters. Since the PPS-ASTFT avoids parameter search, the proposed algorithm can be computed in a reasonable amount of time. The estimation performance, computational cost, and implementation of the PPS-ASTFT are also analyzed. The conducted numerical simulations support our theoretical results and demonstrate an excellent statistical performance of the proposed algorithm.
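
One way to picture the window-width rule (window length tied to a local stationary length derived from the IF gradient) is the heuristic below; the inverse-square-root law, the constant c, and the clipping bounds are assumptions for illustration, not the PPS-ASTFT's actual formula.

```python
import numpy as np

def adaptive_window_lengths(inst_freq, fs, c=2.0, lmin=16, lmax=256):
    """Choose a per-sample STFT window length from the local instantaneous-
    frequency gradient: the faster the IF changes, the shorter the window."""
    ifg = np.abs(np.gradient(inst_freq) * fs)   # IF gradient in Hz/s
    L = c * fs / np.sqrt(ifg + 1e-12)           # heuristic stationary length
    return np.clip(L, lmin, lmax).astype(int)
```

For a slow chirp this yields long windows (fine frequency resolution); for a fast chirp it yields short windows, mirroring the trade-off the estimator adapts to.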

  15. Test Anxiety, Computer-Adaptive Testing and the Common Core

    Science.gov (United States)

    Colwell, Nicole Makas

    2013-01-01

    This paper highlights the current findings and issues regarding the role of computer-adaptive testing in test anxiety. The computer-adaptive test (CAT) proposed by one of the Common Core consortia brings these issues to the forefront. Research has long indicated that test anxiety impairs student performance. More recent research indicates that…

  16. Effect of Short-Term Study Abroad Programs on Students' Cultural Adaptability

    Science.gov (United States)

    Mapp, Susan C.

    2012-01-01

    The number of U.S. students studying abroad has been growing, particularly those participating in short-term trips. However, literature on the effect of these short-term trips is lacking. The purpose of this study was to assess quantitatively the effect on bachelor students' cross-cultural adaptability using a pre-post design. Significant changes…

  17. Adaptation and hybridization in computational intelligence

    CERN Document Server

    Fister Jr., Iztok

    2015-01-01

    This carefully edited book takes a walk through recent advances in adaptation and hybridization in the Computational Intelligence (CI) domain. It consists of ten chapters divided into three parts. The first part provides background information and some theoretical foundations for the CI domain, the second part deals with adaptation in CI algorithms, and the third part focuses on hybridization in CI. This book can serve as an ideal reference for researchers and students of computer science, electrical and civil engineering, economics, and the natural sciences who are confronted with solving optimization, modeling and simulation problems. It covers recent advances in CI that encompass nature-inspired algorithms such as artificial neural networks, evolutionary algorithms and swarm intelligence-based algorithms.

  18. Synthetic Computation: Chaos Computing, Logical Stochastic Resonance, and Adaptive Computing

    Science.gov (United States)

    Kia, Behnam; Murali, K.; Jahed Motlagh, Mohammad-Reza; Sinha, Sudeshna; Ditto, William L.

    Nonlinear and chaotic systems can exhibit numerous behaviors and patterns, and one can select different patterns from this rich library. In this paper we focus on synthetic computing, a field that engineers and synthesizes nonlinear systems to obtain computation. We explain the importance of nonlinearity, and describe how nonlinear systems can be engineered to perform computation. More specifically, we provide an overview of chaos computing, a field that manually programs chaotic systems to build different types of digital functions. We also briefly describe logical stochastic resonance (LSR), and then extend the LSR approach to realize combinational digital logic systems via suitable concatenation of existing logical stochastic resonance blocks. Finally, we demonstrate how a chaotic system can be engineered and mated with different machine learning techniques, such as artificial neural networks, random search, and genetic algorithms, to design autonomous systems that can adapt and respond to environmental conditions.

  19. Simple and Effective Algorithms: Computer-Adaptive Testing.

    Science.gov (United States)

    Linacre, John Michael

    Computer-adaptive testing (CAT) allows improved security, greater scoring accuracy, shorter testing periods, quicker availability of results, and reduced guessing and other undesirable test behavior. Simple approaches can be applied by the classroom teacher, or other content specialist, who possesses simple computer equipment and elementary…

  20. Adaptive synchrosqueezing based on a quilted short-time Fourier transform

    Science.gov (United States)

    Berrian, Alexander; Saito, Naoki

    2017-08-01

    In recent years, the synchrosqueezing transform (SST) has gained popularity as a method for the analysis of signals that can be broken down into multiple components determined by instantaneous amplitudes and phases. One such version of SST, based on the short-time Fourier transform (STFT), enables the sharpening of instantaneous frequency (IF) information derived from the STFT, as well as the separation of amplitude-phase components corresponding to distinct IF curves. However, this SST is limited by the time-frequency resolution of the underlying window function, and may not resolve signals exhibiting diverse time-frequency behaviors with sufficient accuracy. In this work, we develop a framework for an SST based on a "quilted" short-time Fourier transform (SST-QSTFT), which allows adaptation to signal behavior in separate time-frequency regions through the use of multiple windows. This motivates us to introduce a discrete reassignment frequency formula based on a finite difference of the phase spectrum, ensuring computational accuracy for a wider variety of windows. We develop a theoretical framework for the SST-QSTFT in both the continuous and the discrete settings, and describe an algorithm for the automatic selection of optimal windows depending on the region of interest. Using synthetic data, we demonstrate the superior numerical performance of SST-QSTFT relative to other SST methods in a noisy context. Finally, we apply SST-QSTFT to audio recordings of animal calls to demonstrate the potential of our method for the analysis of real bioacoustic signals.
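
The discrete reassignment-frequency idea, estimating instantaneous frequency from a finite difference of the STFT phase, can be sketched generically; the Hann window, hop size, and unwrapping choice below are assumptions for illustration, not the SST-QSTFT's exact discretisation.

```python
import numpy as np

def reassignment_freqs(x, fs, n_win=128, hop=16):
    """Estimate per-bin reassignment (instantaneous) frequencies from a
    first-order finite difference of the STFT phase between adjacent
    frames."""
    win = np.hanning(n_win)
    starts = range(0, len(x) - n_win, hop)
    frames = np.array([x[i:i + n_win] * win for i in starts])
    S = np.fft.rfft(frames, axis=1)
    phase = np.unwrap(np.angle(S), axis=0)      # unwrap along time
    # d(phase)/dt ~ frame-to-frame difference, converted to Hz
    omega = np.diff(phase, axis=0) * fs / (2.0 * np.pi * hop)
    return S, omega
```

Synchrosqueezing then reassigns the energy in each time-frequency cell to the estimated frequency omega, sharpening the ridges; the quilted version would repeat this with different windows in different regions.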

  1. Building fast, reliable, and adaptive software for computational science

    International Nuclear Information System (INIS)

    Rendell, A P; Antony, J; Armstrong, W; Janes, P; Yang, R

    2008-01-01

    Building fast, reliable, and adaptive software is a constant challenge for computational science, especially given recent developments in computer architecture. This paper outlines some of our efforts to address these three issues in the context of computational chemistry. First, a simple linear performance model that can be used to predict the performance of Hartree-Fock calculations is discussed. Second, the use of interval arithmetic to assess the numerical reliability of the sort of integrals used in electronic structure methods is presented. Third, the use of dynamic code modification as part of a framework to support adaptive software is outlined.

  2. Efficient computation of smoothing splines via adaptive basis sampling

    KAUST Repository

    Ma, Ping; Huang, Jianhua Z.; Zhang, Nan

    2015-06-24

    © 2015 Biometrika Trust. Smoothing splines provide flexible nonparametric regression estimators. However, the high computational cost of smoothing splines for large datasets has hindered their wide application. In this article, we develop a new method, named adaptive basis sampling, for efficient computation of smoothing splines in super-large samples. Except for the univariate case where the Reinsch algorithm is applicable, a smoothing spline for a regression problem with sample size n can be expressed as a linear combination of n basis functions and its computational complexity is generally O(n³). We achieve a more scalable computation in the multivariate case by evaluating the smoothing spline using a smaller set of basis functions, obtained by an adaptive sampling scheme that uses values of the response variable. Our asymptotic analysis shows that smoothing splines computed via adaptive basis sampling converge to the true function at the same rate as full basis smoothing splines. Using simulation studies and a large-scale deep earth core-mantle boundary imaging study, we show that the proposed method outperforms a sampling method that does not use the values of response variables.
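
A toy version of the reduced-basis idea (fit a penalised smoother on far fewer basis functions than data points, with centres sampled using the response values) might look like this; the Gaussian radial basis, ridge penalty, and sampling weights are stand-in assumptions, not the paper's spline construction.

```python
import numpy as np

def sampled_basis_smoother(x, y, n_basis=20, lam=1e-4, seed=0):
    """Penalised least-squares smoother on a small sampled basis: centres
    are drawn with probability guided by |y| (plus a floor for coverage),
    then Gaussian bumps at those centres are fit with a ridge penalty."""
    rng = np.random.default_rng(seed)
    p = np.abs(y) + np.abs(y).mean()            # response-guided, never zero
    centres = rng.choice(x, size=n_basis, replace=False, p=p / p.sum())
    h = 2.0 * (x.max() - x.min()) / n_basis     # basis width heuristic
    def design(t):
        return np.exp(-0.5 * ((t[:, None] - centres[None, :]) / h) ** 2)
    B = design(x)                               # n x n_basis design matrix
    coef = np.linalg.solve(B.T @ B + lam * np.eye(n_basis), B.T @ y)
    return lambda t: design(t) @ coef
```

The payoff is the same as in the paper: the linear solve scales with the number of sampled basis functions, not with the full sample size n.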

  4. Overview of adaptive finite element analysis in computational geodynamics

    Science.gov (United States)

    May, D. A.; Schellart, W. P.; Moresi, L.

    2013-10-01

    The use of numerical models to develop insight and intuition into the dynamics of the Earth over geological time scales is a firmly established practice in the geodynamics community. As our depth of understanding grows, and hand-in-hand with improvements in analytical techniques and higher resolution remote sensing of the physical structure and state of the Earth, there is a continual need to develop more efficient, accurate and reliable numerical techniques. This is necessary to ensure that we can meet the challenge of generating robust conclusions, interpretations and predictions from improved observations. In adaptive numerical methods, the desire is generally to maximise the quality of the numerical solution for a given amount of computational effort. Neither of these terms has a unique, universal definition, but typically there is a trade off between the number of unknowns we can calculate to obtain a more accurate representation of the Earth, and the resources (time and computational memory) required to compute them. In the engineering community, this topic has been extensively examined using the adaptive finite element (AFE) method. Recently, the applicability of this technique to geodynamic processes has started to be explored. In this review we report on the current status and usage of spatially adaptive finite element analysis in the field of geodynamics. The objective of this review is to provide a brief introduction to the area of spatially adaptive finite element analysis, including a summary of different techniques to define spatial adaptation and of different approaches to guide the adaptive process in order to control the discretisation error inherent within the numerical solution. An overview of the current state of the art in adaptive modelling in geodynamics is provided, together with a discussion pertaining to the issues related to using adaptive analysis techniques and perspectives for future research in this area. Additionally, we also provide a

  5. An adaptive random search for short term generation scheduling with network constraints.

    Directory of Open Access Journals (Sweden)

    J A Marmolejo

    Full Text Available This paper presents an adaptive random search approach to a short-term generation scheduling problem with network constraints, which determines the startup and shutdown schedules of thermal units over a given planning horizon. In this model, we consider the transmission network through capacity limits and line losses. The mathematical model is stated in the form of a mixed-integer nonlinear problem with binary variables. The proposed heuristic is a population-based method that generates a set of new potential solutions via a random search strategy. The random search is based on the Markov Chain Monte Carlo method. The key feature of the proposed method is that the noise level of the random search is adaptively controlled in order to explore and exploit the entire search space. To improve the solutions, we couple a local search into the random search process. Several test systems are presented to evaluate the performance of the proposed heuristic. We use a commercial optimizer to compare the quality of the solutions provided by the proposed method. The proposed algorithm achieved a significant reduction in computational effort with respect to the full-scale outer approximation commercial solver. Numerical results show the potential and robustness of our approach.
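
The noise-level adaptation can be illustrated with a generic sketch: widen the search when candidate moves succeed often (explore), shrink it when they rarely improve (exploit). The greedy acceptance rule, adaptation constants, and test function below are assumptions; the paper's method additionally uses Markov Chain Monte Carlo proposals and a coupled local search.

```python
import numpy as np

def adaptive_random_search(f, x0, iters=3000, sigma=1.0, target_acc=0.3, seed=0):
    """Greedy random search with adaptive noise: every 50 iterations the
    perturbation scale sigma is widened if improvements are frequent and
    shrunk if they are rare."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    best, fbest = x.copy(), fx
    accepted = 0
    for k in range(1, iters + 1):
        cand = x + rng.normal(0.0, sigma, size=x.shape)
        fc = f(cand)
        if fc < fx:                      # accept only improvements
            x, fx = cand, fc
            accepted += 1
            if fc < fbest:
                best, fbest = cand.copy(), fc
        if k % 50 == 0:                  # adapt the noise level
            rate = accepted / 50.0
            sigma *= 1.5 if rate > target_acc else 0.7
            accepted = 0
    return best, fbest
```

Shrinking sigma as improvements dry up is what lets a single mechanism cover both coarse exploration early on and fine exploitation near a minimum.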

  6. Scalable space-time adaptive simulation tools for computational electrocardiology

    OpenAIRE

    Krause, Dorian; Krause, Rolf

    2013-01-01

    This work is concerned with the development of computational tools for the solution of reaction-diffusion equations from the field of computational electrocardiology. We designed lightweight spatially and space-time adaptive schemes for large-scale parallel simulations. We propose two different adaptive schemes based on locally structured meshes, managed either via a conforming coarse tessellation or a forest of shallow trees. A crucial ingredient of our approach is a non-conforming morta...

  7. Short-term effects of playing computer games on attention.

    Science.gov (United States)

    Tahiroglu, Aysegul Yolga; Celik, Gonca Gul; Avci, Ayse; Seydaoglu, Gulsah; Uzel, Mehtap; Altunbas, Handan

    2010-05-01

    The main aim of the present study is to investigate the short-term cognitive effects of computer games in children with different psychiatric disorders and normal controls. One hundred one children aged between 9 and 12 years were recruited for the study. All participants played a motor-racing game on the computer for 1 hour. The TBAG form of the Stroop task was administered to all participants twice, before playing and immediately after playing the game. Participants with improved posttest scores, compared to their pretest scores, used the computer on average 0.67 +/- 1.1 hr/day, while participants with worse or unaltered scores used it on average 1.6 +/- 1.4 hr/day and 1.3 +/- 0.9 hr/day, respectively. According to the regression model, male gender, younger age, duration of daily computer use, and ADHD inattentive type were found to be independent risk factors for worsened posttest scores. Time spent playing computer games can exert a short-term effect on attention as measured by the Stroop test.

  8. A New Adaptive Checkpointing Strategy for Mobile Computing

    Institute of Scientific and Technical Information of China (English)

    MEN Chaoguang; ZUO Decheng; YANG Xiaozong

    2005-01-01

    Adaptive checkpointing is an efficient recovery scheme that is suitable for mobile computing systems. However, existing adaptive checkpointing schemes cannot correctly recover the system when a failure occurs during certain special periods. In this paper, the issues that lead to system inconsistency are first discussed, and then a new adaptive strategy that can recover the system to a correct consistent state is proposed. Our algorithm improves system recovery performance because only the failed process needs to roll back, with the help of logging.

  9. Processing Optimization of Typed Resources with Synchronized Storage and Computation Adaptation in Fog Computing

    Directory of Open Access Journals (Sweden)

    Zhengyang Song

    2018-01-01

    Full Text Available Wide application of Internet of Things (IoT) systems has been increasingly demanding more hardware facilities for processing various resources, including data, information, and knowledge. With the rapid growth in the quantity of generated resources, it is difficult to adapt to this situation using traditional cloud computing models. Fog computing enables storage and computing services to be performed at the edge of the network, extending cloud computing. However, there are problems such as restricted computation, limited storage, and expensive network bandwidth in Fog computing applications. It is a challenge to balance the distribution of network resources. We propose a processing optimization mechanism for typed resources with synchronized storage and computation adaptation in Fog computing. In this mechanism, we process typed resources in a wireless-network-based three-tier architecture consisting of a Data Graph, an Information Graph, and a Knowledge Graph. The proposed mechanism aims to minimize processing cost over network, computation, and storage while maximizing processing performance in a business-value-driven manner. Simulation results show that the proposed approach improves the ratio of performance over user investment. Meanwhile, conversions between resource types provide support for dynamically allocating network resources.

  10. An Adaptive Sensor Mining Framework for Pervasive Computing Applications

    Science.gov (United States)

    Rashidi, Parisa; Cook, Diane J.

    Analyzing sensor data in pervasive computing applications brings unique challenges to the KDD community. The challenge is heightened when the underlying data source is dynamic and the patterns change. We introduce a new adaptive mining framework that detects patterns in sensor data, and more importantly, adapts to the changes in the underlying model. In our framework, the frequent and periodic patterns of data are first discovered by the Frequent and Periodic Pattern Miner (FPPM) algorithm; and then any changes in the discovered patterns over the lifetime of the system are discovered by the Pattern Adaptation Miner (PAM) algorithm, in order to adapt to the changing environment. This framework also captures vital context information present in pervasive computing applications, such as the startup triggers and temporal information. In this paper, we present a description of our mining framework and validate the approach using data collected in the CASAS smart home testbed.
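
The frequent-pattern step that FPPM builds on can be conveyed with a much simpler sketch: count sliding windows over a sensor event stream and keep those above a support threshold. This is not the FPPM algorithm itself (which also handles periodicity, startup triggers, and temporal context); the event log and threshold below are invented for illustration.

```python
from collections import Counter

def frequent_patterns(events, length, min_support):
    """Count contiguous event windows and keep the frequent ones."""
    windows = zip(*(events[i:] for i in range(length)))
    counts = Counter(windows)
    return {p: c for p, c in counts.items() if c >= min_support}

# Hypothetical smart-home sensor log.
log = ["door", "light", "coffee", "door", "light", "coffee", "door", "tv"]
print(frequent_patterns(log, 2, 2))
```

A change-detection step in the spirit of PAM could then re-run this count on a recent window of events and compare the surviving patterns against the previously discovered set.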

  11. A comparison of computerized adaptive testing and fixed-length short forms for the Prosthetic Limb Users Survey of Mobility (PLUS-M™).

    Science.gov (United States)

    Amtmann, Dagmar; Bamer, Alyssa M; Kim, Jiseon; Bocell, Fraser; Chung, Hyewon; Park, Ryoungsun; Salem, Rana; Hafner, Brian J

    2017-09-01

    New health status instruments can be administered by computerized adaptive test or short forms. The Prosthetic Limb Users Survey of Mobility (PLUS-M™) is a self-report measure of mobility for prosthesis users with lower limb loss. This study used the PLUS-M to examine advantages and disadvantages of computerized adaptive test and short forms. To compare scores obtained from computerized adaptive test to scores obtained from fixed-length short forms (7-item and 12-item) in order to provide guidance to researchers and clinicians on how to select the best form of administration for different uses. Cross-sectional, observational study. Individuals with lower limb loss completed the PLUS-M by computerized adaptive test and short forms. Administration time, correlations between the scores, and standard errors were compared. Scores and standard errors from the computerized adaptive test, 7-item short form, and 12-item short form were highly correlated and all forms of administration were efficient. Computerized adaptive test required less time to administer than either paper or electronic short forms; however, time savings were minimal compared to the 7-item short form. Results indicate that the PLUS-M computerized adaptive test is most efficient, and differences in scores between administration methods are minimal. The main advantage of the computerized adaptive test was more reliable scores at higher levels of mobility compared to short forms. Clinical relevance: Health-related item banks, like the Prosthetic Limb Users Survey of Mobility (PLUS-M™), can be administered by computerized adaptive testing (CAT) or as fixed-length short forms (SFs). Results of this study will help clinicians and researchers decide whether they should invest in a CAT administration system or whether SFs are more appropriate.
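
The efficiency advantage of CAT over fixed short forms comes from its item selection: each next item is chosen to be maximally informative at the respondent's current ability estimate. A minimal sketch of that selection step under a 2PL IRT model is below; the item parameters are hypothetical, not the calibrated PLUS-M bank.

```python
import numpy as np

def p_correct(theta, a, b):
    """2PL probability of endorsing an item at ability theta."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def item_information(theta, a, b):
    """Fisher information of a 2PL item at ability theta."""
    p = p_correct(theta, a, b)
    return a**2 * p * (1.0 - p)

def next_item(theta, a_params, b_params, administered):
    """Index of the most informative not-yet-administered item."""
    info = item_information(theta, a_params, b_params)
    info[list(administered)] = -np.inf
    return int(np.argmax(info))

a = np.array([1.2, 0.8, 1.5, 1.0])   # discriminations (hypothetical)
b = np.array([-1.0, 0.0, 0.5, 1.5])  # difficulties (hypothetical)
print(next_item(0.0, a, b, administered={2}))
```

In a full CAT loop the ability estimate is updated after each response and the loop stops when the standard error falls below a target, which is why score precision stays more uniform across the mobility range than with a fixed short form.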

  12. The adaptation method in the Monte Carlo simulation for computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Hyoung Gun; Yoon, Chang Yeon; Lee, Won Ho [Dept. of Bio-convergence Engineering, Korea University, Seoul (Korea, Republic of); Cho, Seung Ryong [Dept. of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of); Park, Sung Ho [Dept. of Neurosurgery, Ulsan University Hospital, Ulsan (Korea, Republic of)

    2015-06-15

    The patient dose incurred from diagnostic procedures during advanced radiotherapy has become an important issue. Many researchers in medical physics are using computational simulations to calculate complex parameters in experiments. However, extended computation times make it difficult for personal computers to run the conventional Monte Carlo method to simulate radiological images with high-flux photons such as images produced by computed tomography (CT). To minimize the computation time without degrading imaging quality, we applied a deterministic adaptation to the Monte Carlo calculation and verified its effectiveness by simulating CT image reconstruction for an image evaluation phantom (Catphan; Phantom Laboratory, New York, NY, USA) and a human-like voxel phantom (KTMAN-2) (Los Alamos National Laboratory, Los Alamos, NM, USA). For the deterministic adaptation, the relationship between iteration numbers and the simulations was estimated and the option to simulate scattered radiation was evaluated. The processing times of simulations using the adaptive method were at least 500 times faster than those using a conventional statistical process. In addition, compared with the conventional statistical method, the adaptive method provided images that were more similar to the experimental images, which proved that the adaptive method was highly effective for simulations that require a large number of iterations. Assuming no radiation scattering in the vicinity of the detectors minimized artifacts in the reconstructed image.

  13. Adaptively detecting changes in Autonomic Grid Computing

    KAUST Repository

    Zhang, Xiangliang; Germain, Cécile; Sebag, Michèle

    2010-01-01

    Detecting changes is a common issue in many application fields due to the non-stationary distribution of the applicative data, e.g., sensor network signals, web logs and grid running logs. Toward Autonomic Grid Computing, adaptively detecting

  14. Significant decimal digits for energy representation on short-word computers

    International Nuclear Information System (INIS)

    Sartori, E.

    1989-01-01

    The general belief that single-precision floating point numbers always have at least seven significant decimal digits on short-word computers such as IBM machines is erroneous. Seven significant digits are, however, required for representing the energy variable in nuclear cross-section data sets containing sharp p-wave resonances at 0 K. It is suggested either that the energy variable be stored in double precision or that cross-section resonances be reconstructed to room temperature or higher on short-word computers.
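
The seven-digit limit is easy to demonstrate. In the minimal sketch below, two hypothetical resonance energies (invented values, for illustration only) that differ from the eighth significant digit onward collapse to the same number when stored in single precision.

```python
import numpy as np

# Two hypothetical resonance energies differing from the eighth
# significant decimal digit onward.
e1 = 123456789.0
e2 = 123456791.0

# Stored in single precision (one short 32-bit word), both round to the
# same representable value, 123456792.0: float32 carries only about
# seven significant decimal digits.
f1 = np.float32(e1)
f2 = np.float32(e2)
print(f1 == f2)   # the distinction between the energies is lost

# Double precision keeps the two energies distinct.
print(np.float64(e1) == np.float64(e2))
```

This is exactly the failure mode for sharp resonances: two adjacent energy grid points become indistinguishable, corrupting the reconstructed cross-section shape.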

  15. Sequential decision making in computational sustainability via adaptive submodularity

    Science.gov (United States)

    Krause, Andreas; Golovin, Daniel; Converse, Sarah J.

    2015-01-01

    Many problems in computational sustainability require making a sequence of decisions in complex, uncertain environments. Such problems are generally notoriously difficult. In this article, we review the recently discovered notion of adaptive submodularity, an intuitive diminishing returns condition that generalizes the classical notion of submodular set functions to sequential decision problems. Problems exhibiting the adaptive submodularity property can be efficiently and provably near-optimally solved using simple myopic policies. We illustrate this concept in several case studies of interest in computational sustainability: First, we demonstrate how it can be used to efficiently plan for resolving uncertainty in adaptive management scenarios. Secondly, we show how it applies to dynamic conservation planning for protecting endangered species, a case study carried out in collaboration with the US Geological Survey and the US Fish and Wildlife Service.
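
The "simple myopic policies" mentioned above are greedy: at each step, take the action with the largest marginal gain. A minimal non-adaptive sketch using plain set coverage (a classic submodular objective) conveys the idea; the sites and the species they protect are invented for illustration.

```python
# Greedy selection for a submodular coverage objective: repeatedly pick
# the site that newly covers the most species.

def greedy_max_coverage(sites, k):
    """Pick k sites greedily by marginal coverage gain."""
    covered, chosen = set(), []
    for _ in range(k):
        best = max(sites, key=lambda s: len(sites[s] - covered))
        chosen.append(best)
        covered |= sites[best]
    return chosen, covered

# Hypothetical candidate reserve sites and the species each protects.
sites = {
    "A": {1, 2, 3},
    "B": {3, 4},
    "C": {4, 5, 6, 7},
    "D": {1, 7},
}
chosen, covered = greedy_max_coverage(sites, 2)
print(chosen, covered)
```

In the adaptive setting the gains are expected values conditioned on observations made so far, and adaptive submodularity is the condition that guarantees this same greedy rule remains provably near-optimal.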

  16. An Adaptive Middleware for Improved Computational Performance

    DEFF Research Database (Denmark)

    Bonnichsen, Lars Frydendal

    , we are improving computational performance by exploiting modern hardware features, such as dynamic voltage-frequency scaling and transactional memory. Adapting software is an iterative process, requiring that we continually revisit it to meet new requirements or realities; a time consuming process......The performance improvements in computer systems over the past 60 years have been fueled by an exponential increase in energy efficiency. In recent years, the phenomenon known as the end of Dennard’s scaling has slowed energy efficiency improvements — but improving computer energy efficiency...... is more important now than ever. Traditionally, most improvements in computer energy efficiency have come from improvements in lithography — the ability to produce smaller transistors — and computer architecture - the ability to apply those transistors efficiently. Since the end of scaling, we have seen...

  17. Short-Term Effects of Playing Computer Games on Attention

    Science.gov (United States)

    Tahiroglu, Aysegul Yolga; Celik, Gonca Gul; Avci, Ayse; Seydaoglu, Gulsah; Uzel, Mehtap; Altunbas, Handan

    2010-01-01

    Objective: The main aim of the present study is to investigate the short-term cognitive effects of computer games in children with different psychiatric disorders and normal controls. Method: One hundred one children are recruited for the study (aged between 9 and 12 years). All participants played a motor-racing game on the computer for 1 hour.…

  18. Applications of decision theory to computer-based adaptive instructional systems

    NARCIS (Netherlands)

    Vos, Hendrik J.

    1988-01-01

    This paper considers applications of decision theory to the problem of instructional decision-making in computer-based adaptive instructional systems, using the Minnesota Adaptive Instructional System (MAIS) as an example. The first section indicates how the problem of selecting the appropriate

  19. Several problems of algorithmization in integrated computation programs on third generation computers for short circuit currents in complex power networks

    Energy Technology Data Exchange (ETDEWEB)

    Krylov, V.A.; Pisarenko, V.P.

    1982-01-01

    Methods of modeling complex power networks with short circuits in the networks are described. The methods are implemented in integrated programs that compute short circuit currents and equivalents in electrical networks with a large number of branch points (up to 1000) on a computer with a limited online memory capacity (M = 4030 for the computer).

  20. Genre-adaptive Semantic Computing and Audio-based Modelling for Music Mood Annotation

    DEFF Research Database (Denmark)

    Saari, Pasi; Fazekas, György; Eerola, Tuomas

    2016-01-01

    This study investigates whether taking genre into account is beneficial for automatic music mood annotation in terms of core affects valence, arousal, and tension, as well as several other mood scales. Novel techniques employing genre-adaptive semantic computing and audio-based modelling are prop......This study investigates whether taking genre into account is beneficial for automatic music mood annotation in terms of core affects valence, arousal, and tension, as well as several other mood scales. Novel techniques employing genre-adaptive semantic computing and audio-based modelling...... related to a set of 600 popular music tracks spanning multiple genres. The results show that ACTwg outperforms a semantic computing technique that does not exploit genre information, and ACTwg-SLPwg outperforms conventional techniques and other genre-adaptive alternatives. In particular, improvements......-based genre representation for genre-adaptive music mood analysis....

  1. Distributed Problem Solving: Adaptive Networks with a Computer Intermediary Resource. Intelligent Executive Computer Communication

    Science.gov (United States)

    1991-06-01

    Proceedings of The National Conference on Artificial Intelligence, pages 181-184, The American Association for Artificial Intelligence, Pittsburgh...Intermediary Resource: Intelligent Executive Computer Communication John Lyman and Carla J. Conaway University of California at Los Angeles for Contracting...Include Security Classification) Interim Report: Distributed Problem Solving: Adaptive Networks With a Computer Intermediary Resource: Intelligent

  2. Computer-Adaptive Testing in Second Language Contexts.

    Science.gov (United States)

    Chalhoub-Deville, Micheline; Deville, Craig

    1999-01-01

    Provides a broad overview of computerized testing issues with an emphasis on computer-adaptive testing (CAT). A survey of the potential benefits and drawbacks of CAT are given, the process of CAT development is described; and some L2 instruments developed to assess various language skills are summarized. (Author/VWL)

  3. Computing three-point functions for short operators

    International Nuclear Information System (INIS)

    Bargheer, Till; Institute for Advanced Study, Princeton, NJ; Minahan, Joseph A.; Pereira, Raul

    2013-11-01

    We compute the three-point structure constants for short primary operators of N=4 super Yang-Mills theory to leading order in 1/√(λ) by mapping the problem to a flat-space string theory calculation. We check the validity of our procedure by comparing to known results for three chiral primaries. We then compute the three-point functions for any combination of chiral and non-chiral primaries, with the non-chiral primaries all dual to string states at the first massive level. Along the way we find many cancellations that leave us with simple expressions, suggesting that integrability is playing an important role.

  4. Computing three-point functions for short operators

    Energy Technology Data Exchange (ETDEWEB)

    Bargheer, Till [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Institute for Advanced Study, Princeton, NJ (United States). School of Natural Sciences; Minahan, Joseph A.; Pereira, Raul [Uppsala Univ. (Sweden). Dept. of Physics and Astronomy

    2013-11-15

    We compute the three-point structure constants for short primary operators of N=4 super Yang-Mills theory to leading order in 1/√(λ) by mapping the problem to a flat-space string theory calculation. We check the validity of our procedure by comparing to known results for three chiral primaries. We then compute the three-point functions for any combination of chiral and non-chiral primaries, with the non-chiral primaries all dual to string states at the first massive level. Along the way we find many cancellations that leave us with simple expressions, suggesting that integrability is playing an important role.

  5. Humor in Human-Computer Interaction : A Short Survey

    NARCIS (Netherlands)

    Nijholt, Anton; Niculescu, Andreea; Valitutti, Alessandro; Banchs, Rafael E.; Joshi, Anirudha; Balkrishan, Devanuj K.; Dalvi, Girish; Winckler, Marco

    2017-01-01

    This paper is a short survey on humor in human-computer interaction. It describes how humor is designed and interacted with in social media, virtual agents, social robots and smart environments. Benefits and future use of humor in interactions with artificial entities are discussed based on

  6. Adaptation and Validation of the Foot Function Index-Revised Short Form into Polish

    OpenAIRE

    Rutkowski, Radosław; Gałczyńska-Rusin, Małgorzata; Gizińska, Małgorzata; Straburzyński-Lupa, Marcin; Zdanowska, Agata; Romanowski, Mateusz Wojciech; Romanowski, Wojciech; Budiman-Mak, Elly; Straburzyńska-Lupa, Anna

    2017-01-01

    Purpose The aim of the present study was to adapt the Foot Function Index-Revised Short Form (FFI-RS) questionnaire into Polish and verify its reliability and validity in a group of patients with rheumatoid arthritis (RA). Methods The study included 211 patients suffering from RA. The FFI-RS questionnaire underwent standard linguistic adaptation and its psychometric parameters were investigated. The enrolled participants had been recruited for seven months as a convenient sample from the rheu...

  7. Non-adaptive measurement-based quantum computation and multi-party Bell inequalities

    International Nuclear Information System (INIS)

    Hoban, Matty J; Campbell, Earl T; Browne, Dan E; Loukopoulos, Klearchos

    2011-01-01

    Quantum correlations exhibit behaviour that cannot be resolved with a local hidden variable picture of the world. In quantum information, they are also used as resources for information processing tasks, such as measurement-based quantum computation (MQC). In MQC, universal quantum computation can be achieved via adaptive measurements on a suitable entangled resource state. In this paper, we look at a version of MQC in which we remove the adaptivity of measurements and aim to understand what computational abilities remain in the resource. We show that there are explicit connections between this model of computation and the question of non-classicality in quantum correlations. We demonstrate this by focusing on deterministic computation of Boolean functions, in which natural generalizations of the Greenberger-Horne-Zeilinger paradox emerge; we then explore probabilistic computation, via which multipartite Bell inequalities can be defined. We use this correspondence to define families of multi-party Bell inequalities, which we show to have a number of interesting contrasting properties.

  8. Non-adaptive measurement-based quantum computation and multi-party Bell inequalities

    Energy Technology Data Exchange (ETDEWEB)

    Hoban, Matty J; Campbell, Earl T; Browne, Dan E [Department of Physics and Astronomy, University College London, Gower Street, London WC1E 6BT (United Kingdom); Loukopoulos, Klearchos, E-mail: m.hoban@ucl.ac.uk [Department of Materials, Oxford University, Parks Road, Oxford OX1 4PH (United Kingdom)

    2011-02-15

    Quantum correlations exhibit behaviour that cannot be resolved with a local hidden variable picture of the world. In quantum information, they are also used as resources for information processing tasks, such as measurement-based quantum computation (MQC). In MQC, universal quantum computation can be achieved via adaptive measurements on a suitable entangled resource state. In this paper, we look at a version of MQC in which we remove the adaptivity of measurements and aim to understand what computational abilities remain in the resource. We show that there are explicit connections between this model of computation and the question of non-classicality in quantum correlations. We demonstrate this by focusing on deterministic computation of Boolean functions, in which natural generalizations of the Greenberger-Horne-Zeilinger paradox emerge; we then explore probabilistic computation, via which multipartite Bell inequalities can be defined. We use this correspondence to define families of multi-party Bell inequalities, which we show to have a number of interesting contrasting properties.

  9. Adapt or Perish: A Review of Planning Approaches for Adaptation under Deep Uncertainty

    Directory of Open Access Journals (Sweden)

    Jan H. Kwakkel

    2013-03-01

    Full Text Available There is increasing interest in long-term plans that can adapt to changing situations under conditions of deep uncertainty. We argue that a sustainable plan should not only achieve economic, environmental, and social objectives, but should be robust and able to be adapted over time to (unforeseen) future conditions. Large numbers of papers dealing with robustness and adaptive plans have begun to appear, but the literature is fragmented. The papers appear in disparate journals, and deal with a wide variety of policy domains. This paper (1) describes and compares a family of related conceptual approaches to designing a sustainable plan, and (2) describes several computational tools supporting these approaches. The conceptual approaches all have their roots in an approach to long-term planning called Assumption-Based Planning. Guiding principles for the design of a sustainable adaptive plan are: explore a wide variety of relevant uncertainties, connect short-term targets to long-term goals over time, commit to short-term actions while keeping options open, and continuously monitor the world and take actions if necessary. A key computational tool across the conceptual approaches is a fast, simple (policy analysis) model that is used to make large numbers of runs, in order to explore the full range of uncertainties and to identify situations in which the plan would fail.

  10. Computer Adaptive Testing, Big Data and Algorithmic Approaches to Education

    Science.gov (United States)

    Thompson, Greg

    2017-01-01

    This article critically considers the promise of computer adaptive testing (CAT) and digital data to provide better and quicker data that will improve the quality, efficiency and effectiveness of schooling. In particular, it uses the case of the Australian NAPLAN test that will become an online, adaptive test from 2016. The article argues that…

  11. An Adaptive Reordered Method for Computing PageRank

    Directory of Open Access Journals (Sweden)

    Yi-Ming Bu

    2013-01-01

    Full Text Available We propose an adaptive reordered method to deal with the PageRank problem. It has been shown that one can reorder the hyperlink matrix of PageRank problem to calculate a reduced system and get the full PageRank vector through forward substitutions. This method can provide a speedup for calculating the PageRank vector. We observe that in the existing reordered method, the cost of the recursively reordering procedure could offset the computational reduction brought by minimizing the dimension of linear system. With this observation, we introduce an adaptive reordered method to accelerate the total calculation, in which we terminate the reordering procedure appropriately instead of reordering to the end. Numerical experiments show the effectiveness of this adaptive reordered method.
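
For context, the baseline computation that the reordered method accelerates is power iteration on the hyperlink matrix, x ← αHx + (1−α)/n. A minimal sketch on an invented four-page link graph:

```python
import numpy as np

def pagerank(links, alpha=0.85, tol=1e-10):
    """Power-iteration PageRank for a dict: page -> pages it links to."""
    n = len(links)
    # Column-stochastic hyperlink matrix H.
    H = np.zeros((n, n))
    for j, outs in links.items():
        for i in outs:
            H[i, j] = 1.0 / len(outs)
    x = np.full(n, 1.0 / n)
    while True:
        x_new = alpha * H @ x + (1.0 - alpha) / n
        if np.abs(x_new - x).sum() < tol:
            return x_new
        x = x_new

links = {0: [1, 2], 1: [2], 2: [0], 3: [0, 2]}  # hypothetical link graph
pr = pagerank(links)
print(pr.round(4))
```

The reordered method instead permutes H so that a reduced linear system is solved first and the remaining components follow by forward substitution, trading iterations for structure in the matrix.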

  12. An Overview of Recent Developments in Cognitive Diagnostic Computer Adaptive Assessments

    Directory of Open Access Journals (Sweden)

    Alan Huebner

    2010-01-01

    Full Text Available Cognitive diagnostic modeling has become an exciting new field of psychometric research. These models aim to diagnose examinees' mastery status of a group of discretely defined skills, or attributes, thereby providing them with detailed information regarding their specific strengths and weaknesses. Combining cognitive diagnosis with computer adaptive assessments has emerged as an important part of this new field. This article aims to provide practitioners and researchers with an introduction to and overview of recent developments in cognitive diagnostic computer adaptive assessments.

  13. Passive adaptation to stress in adulthood after short-term social instability stress during adolescence in mice.

    Science.gov (United States)

    de Lima, A P N; Massoco, C O

    2017-05-01

    This study reports that short-term social instability stress (SIS) in adolescence increases passive-coping in adulthood in male mice. Short-term SIS decreased the latency of immobility and increased the frequency and time of immobility in tail suspension test. These findings support the hypothesis that adolescent stress can induce a passive adaptation to stress in adulthood, even if it is a short period of stress.

  14. Indirect versus direct feedback in computer-based Prism Adaptation Therapy

    DEFF Research Database (Denmark)

    Wilms, Inge Linda; Rytter, Hana Malá

    2010-01-01

    Prism Adaptation Therapy (PAT) is an intervention method in the treatment of the attention disorder neglect (Frassinetti, Angeli, Meneghello, Avanzi, & Ladavas, 2002; Rossetti, et al., 1998). The aim of this study was to investigate whether one session of PAT using a computer-attached touchscreen...... in the aftereffect. The findings have direct implications for future implementations of computer-based methods of treatment of visuospatial disorders and computer-assisted rehabilitation in general....

  15. Cross-cultural adaptation and validation of the Danish version of the Short Musculoskeletal Function Assessment Questionnaire (SMFA)

    DEFF Research Database (Denmark)

    Lindahl, Marianne Pia; Andersen, Signe; Jørgensen, Annette

    2017-01-01

    PURPOSE: The aim of this study was to translate and culturally adapt the Short Musculoskeletal Function Assessment (SMFA) into Danish (SMFA-DK) and assess the psychometric properties. METHODS: SMFA was translated and cross-culturally adapted according to a standardized procedure. Minor changes......, content validity as coding according to the International Classification of Functioning, Disability and Health (ICF), floor/ceiling effects, construct validity as factor analysis, correlations between SMFA-DK and Short Form 36 and also known group method. Responsiveness and effect size were calculated...

  16. Computational Strategies for Dissecting the High-Dimensional Complexity of Adaptive Immune Repertoires

    Directory of Open Access Journals (Sweden)

    Enkelejda Miho

    2018-02-01

    Full Text Available The adaptive immune system recognizes antigens via an immense array of antigen-binding antibodies and T-cell receptors, the immune repertoire. The interrogation of immune repertoires is of high relevance for understanding the adaptive immune response in disease and infection (e.g., autoimmunity, cancer, HIV). Adaptive immune receptor repertoire sequencing (AIRR-seq) has driven the quantitative and molecular-level profiling of immune repertoires, thereby revealing the high-dimensional complexity of the immune receptor sequence landscape. Several methods for the computational and statistical analysis of large-scale AIRR-seq data have been developed to resolve immune repertoire complexity and to understand the dynamics of adaptive immunity. Here, we review the current research on (i) diversity, (ii) clustering and network, (iii) phylogenetic, and (iv) machine learning methods applied to dissect, quantify, and compare the architecture, evolution, and specificity of immune repertoires. We summarize outstanding questions in computational immunology and propose future directions for systems immunology toward coupling AIRR-seq with the computational discovery of immunotherapeutics, vaccines, and immunodiagnostics.
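
As a concrete instance of the diversity methods in (i), Shannon entropy over clonotype frequencies is one commonly used repertoire diversity index. A minimal sketch; the CDR3-like sequences and their counts are invented for illustration.

```python
import math
from collections import Counter

def shannon_diversity(counts):
    """Shannon entropy (in nats) of a clonotype count vector."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

# Hypothetical repertoire: clonotype sequence -> read count.
repertoire = Counter({"CARDYW": 50, "CASSLG": 30, "CAWSVG": 15, "CATGNE": 5})
h = shannon_diversity(repertoire.values())
print(round(h, 3))
```

A highly clonal repertoire (one dominant clonotype) drives the index toward zero, while an even repertoire of N clonotypes approaches log(N), which is why entropy-style indices are used to compare repertoires across disease states.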

  17. Computer-Adaptive Testing: Implications for Students' Achievement, Motivation, Engagement, and Subjective Test Experience

    Science.gov (United States)

    Martin, Andrew J.; Lazendic, Goran

    2018-01-01

    The present study investigated the implications of computer-adaptive testing (operationalized by way of multistage adaptive testing; MAT) and "conventional" fixed order computer testing for various test-relevant outcomes in numeracy, including achievement, test-relevant motivation and engagement, and subjective test experience. It did so…

  18. Three-phase short circuit calculation method based on pre-computed surface for doubly fed induction generator

    Science.gov (United States)

    Ma, J.; Liu, Q.

    2018-02-01

    This paper presents an improved short circuit calculation method, based on a pre-computed surface, to determine the short circuit current of a distribution system with multiple doubly fed induction generators (DFIGs). The short circuit current injected into the power grid by a DFIG is determined by its low voltage ride through (LVRT) control and protection under grid fault. However, existing methods are difficult to apply in engineering practice for calculating the short circuit current of a DFIG due to its complexity. A short circuit calculation method based on a pre-computed surface was proposed by developing the surface of the short circuit current as a function of the calculating impedance and the open circuit voltage, and the short circuit currents were derived by taking into account the rotor excitation and the crowbar activation time. Finally, the pre-computed surfaces of the short circuit current at different times were established, and the procedure for DFIG short circuit calculation considering its LVRT was designed. The correctness of the proposed method was verified by simulation.
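
The online part of such a scheme reduces to a table lookup: the short circuit current is tabulated offline over a grid of calculating impedance and open circuit voltage, then read back by bilinear interpolation at run time. The sketch below uses an invented toy surface (I = V/Z in per unit), not the DFIG model from the paper.

```python
import numpy as np

z_axis = np.array([0.1, 0.2, 0.4, 0.8])      # calculating impedance (p.u.)
v_axis = np.array([0.2, 0.5, 0.9])           # open circuit voltage (p.u.)
# surface[i, j] = current for impedance z_axis[i] and voltage v_axis[j];
# here a toy I = V / Z relation stands in for the offline computation.
surface = v_axis[None, :] / z_axis[:, None]

def lookup_current(z, v):
    """Bilinear interpolation of the pre-computed surface at (z, v)."""
    i = int(np.clip(np.searchsorted(z_axis, z) - 1, 0, len(z_axis) - 2))
    j = int(np.clip(np.searchsorted(v_axis, v) - 1, 0, len(v_axis) - 2))
    tz = (z - z_axis[i]) / (z_axis[i + 1] - z_axis[i])
    tv = (v - v_axis[j]) / (v_axis[j + 1] - v_axis[j])
    c00, c10 = surface[i, j], surface[i + 1, j]
    c01, c11 = surface[i, j + 1], surface[i + 1, j + 1]
    return ((1 - tz) * (1 - tv) * c00 + tz * (1 - tv) * c10
            + (1 - tz) * tv * c01 + tz * tv * c11)

print(lookup_current(0.3, 0.7))
```

A finer grid, or one pre-computed surface per time instant as in the paper, trades offline storage for fast and simple online evaluation.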

  19. A fiber orientation-adapted integration scheme for computing the hyperelastic Tucker average for short fiber reinforced composites

    Science.gov (United States)

    Goldberg, Niels; Ospald, Felix; Schneider, Matti

    2017-10-01

    In this article we introduce a fiber orientation-adapted integration scheme for Tucker's orientation averaging procedure applied to non-linear material laws, based on angular central Gaussian fiber orientation distributions. This method is stable w.r.t. fiber orientations degenerating into planar states and enables the construction of orthotropic hyperelastic energies for truly orthotropic fiber orientation states. We establish a reference scenario for fitting the Tucker average of a transversely isotropic hyperelastic energy, corresponding to a uni-directional fiber orientation, to microstructural simulations, obtained by FFT-based computational homogenization of neo-Hookean constituents. We carefully discuss ideas for accelerating the identification process, leading to a tremendous speed-up compared to a naive approach. The resulting hyperelastic material map turns out to be surprisingly accurate, simple to integrate in commercial finite element codes and fast in its execution. We demonstrate the capabilities of the extracted model by a finite element analysis of a fiber reinforced chain link.

  20. Short generators without quantum computers : the case of multiquadratics

    NARCIS (Netherlands)

    Bauch, J.; Bernstein, D.J.; de Valence, H.; Lange, T.; van Vredendaal, C.; Coron, J.-S.; Nielsen, J.B.

    2017-01-01

    Finding a short element g of a number field, given the ideal generated by g, is a classic problem in computational algebraic number theory. Solving this problem recovers the private key in cryptosystems introduced by Gentry, Smart–Vercauteren, Gentry–Halevi, Garg– Gentry–Halevi, et al. Work over the

  1. Computer intervention impact on psychosocial adaptation of rural women with chronic conditions.

    Science.gov (United States)

    Weinert, Clarann; Cudney, Shirley; Comstock, Bryan; Bansal, Aasthaa

    2011-01-01

    Adapting to living with chronic conditions is a life-long psychosocial challenge. The purpose of this study was to report the effect of a computer intervention on the psychosocial adaptation of rural women with chronic conditions. A two-group study design was used with 309 middle-aged, rural women who had chronic conditions, randomized into either a computer-based intervention or a control group. Data were collected at baseline, at the end of the intervention, and 6 months later on the psychosocial indicators of social support, self-esteem, acceptance of illness, stress, depression, and loneliness. The impact of the computer-based intervention was statistically significant for five of six of the psychosocial outcomes measured, with a modest impact on social support. The largest benefits were seen in depression, stress, and acceptance. The women-to-women intervention resulted in positive psychosocial responses that have the potential to contribute to successful management of illness and adaptation. Other components of adaptation to be examined are the impact of the intervention on illness management and quality of life and the interrelationships among environmental stimuli, psychosocial response, and illness management.

  2. Short-term saccadic adaptation in the macaque monkey: a binocular mechanism

    Science.gov (United States)

    Schultz, K. P.

    2013-01-01

    Saccadic eye movements are rapid transfers of gaze between objects of interest. Their duration is too short for the visual system to be able to follow their progress in time. Adaptive mechanisms constantly recalibrate the saccadic responses by detecting how close the landings are to the selected targets. The double-step saccadic paradigm is a common method to simulate alterations in saccadic gain. While the subject is responding to a first target shift, a second shift is introduced in the middle of this movement, which masks it from visual detection. The error in landing introduced by the second shift is interpreted by the brain as an error in the programming of the initial response, with gradual gain changes aimed at compensating the apparent sensorimotor mismatch. A second shift applied dichoptically to only one eye introduces disconjugate landing errors between the two eyes. A monocular adaptive system would independently modify only the gain of the eye exposed to the second shift in order to reestablish binocular alignment. Our results support a binocular mechanism. A version-based saccadic adaptive process detects postsaccadic version errors and generates compensatory conjugate gain alterations. A vergence-based saccadic adaptive process detects postsaccadic disparity errors and generates corrective nonvisual disparity signals that are sent to the vergence system to regain binocularity. This results in striking dynamical similarities between visually driven combined saccade-vergence gaze transfers, where the disparity is given by the visual targets, and the double-step adaptive disconjugate responses, where an adaptive disparity signal is generated internally by the saccadic system. PMID:23076111

  3. Improving personality facet scores with multidimensional computer adaptive testing

    DEFF Research Database (Denmark)

    Makransky, Guido; Mortensen, Erik Lykke; Glas, Cees A W

    2013-01-01

    personality tests contain many highly correlated facets. This article investigates the possibility of increasing the precision of the NEO PI-R facet scores by scoring items with multidimensional item response theory and by efficiently administering and scoring items with multidimensional computer adaptive...

  4.   Indirect versus direct feedback in computer-based Prism Adaptation Therapy

    DEFF Research Database (Denmark)

    Wilms, Inge Linda; Rytter, Hana Malá

    2010-01-01

      Prism Adaptation Therapy (PAT) is an intervention method in the treatment of the attention disorder neglect (Frassinetti, Angeli, Meneghello, Avanzi, & Ladavas, 2002; Rossetti, et al., 1998). The aim of this study was to investigate whether one session of PAT using a computer-attached touchscreen...... have direct implications for future implementations of computer-based methods of treatment of visuospatial disorders and computer-assisted rehabilitation in general....

  5. Techniques for grid manipulation and adaptation. [computational fluid dynamics

    Science.gov (United States)

    Choo, Yung K.; Eisemann, Peter R.; Lee, Ki D.

    1992-01-01

    Two approaches have been taken to provide systematic grid manipulation for improved grid quality. One is the control point form (CPF) of algebraic grid generation. It provides explicit control of the physical grid shape and grid spacing through the movement of the control points. It works well in the interactive computer graphics environment and hence can be a good candidate for integration with other emerging technologies. The other approach is grid adaptation using a numerical mapping between the physical space and a parametric space. Grid adaptation is achieved by modifying the mapping functions through the effects of grid control sources. The adaptation process can be repeated in a cyclic manner if satisfactory results are not achieved after a single application.

  6. Simple adaptive sparse representation based classification schemes for EEG based brain-computer interface applications.

    Science.gov (United States)

    Shin, Younghak; Lee, Seungchan; Ahn, Minkyu; Cho, Hohyun; Jun, Sung Chan; Lee, Heung-No

    2015-11-01

    One of the main problems related to electroencephalogram (EEG) based brain-computer interface (BCI) systems is the non-stationarity of the underlying EEG signals. This results in the deterioration of the classification performance during experimental sessions. Therefore, adaptive classification techniques are required for EEG based BCI applications. In this paper, we propose simple adaptive sparse representation based classification (SRC) schemes. Supervised and unsupervised dictionary update techniques for new test data and a dictionary modification method by using the incoherence measure of the training data are investigated. The proposed methods are very simple and additional computation for the re-training of the classifier is not needed. The proposed adaptive SRC schemes are evaluated using two BCI experimental datasets. The proposed methods are assessed by comparing classification results with the conventional SRC and other adaptive classification methods. On the basis of the results, we find that the proposed adaptive schemes show relatively improved classification accuracy as compared to conventional methods without requiring additional computation. Copyright © 2015 Elsevier Ltd. All rights reserved.
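
    The sparse representation classification (SRC) scheme referenced above assigns a test sample to the class whose training atoms best reconstruct it. A minimal sketch, assuming a greedy orthogonal-matching-pursuit sparse coder and an illustrative dictionary layout (not the authors' adaptive update rules):

```python
import numpy as np

def omp(D, y, k):
    """Greedy orthogonal matching pursuit: sparse-code y over dictionary D."""
    residual, idx = y.astype(float).copy(), []
    coef = np.zeros(0)
    for _ in range(k):
        j = int(np.argmax(np.abs(D.T @ residual)))  # most correlated atom
        if j not in idx:
            idx.append(j)
        coef, *_ = np.linalg.lstsq(D[:, idx], y, rcond=None)
        residual = y - D[:, idx] @ coef
    x = np.zeros(D.shape[1])
    x[idx] = coef
    return x

def src_classify(D, labels, y, k=2):
    """Assign y to the class whose atoms give the smallest reconstruction residual."""
    x = omp(D, y, k)
    residuals = {c: np.linalg.norm(y - D[:, labels == c] @ x[labels == c])
                 for c in np.unique(labels)}
    return min(residuals, key=residuals.get)
```

    Adaptive variants along the lines of the abstract then append confidently classified test samples to D (the supervised or unsupervised dictionary update), which is why no separate classifier re-training step is needed.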

  7. Adaptively detecting changes in Autonomic Grid Computing

    KAUST Repository

    Zhang, Xiangliang

    2010-10-01

    Detecting changes is a common issue in many application fields due to the non-stationary distribution of the applicative data, e.g., sensor network signals, web logs and grid-running logs. Toward Autonomic Grid Computing, adaptively detecting the changes in a grid system can help to signal anomalies, clean the noise, and report new patterns. In this paper, we propose an approach of self-adaptive change detection based on the Page-Hinkley statistical test. It handles non-stationary distributions without assumptions about the data distribution or empirical parameter settings. We validate the approach on the EGEE streaming jobs, and report its better performance in achieving higher accuracy compared to other change detection methods. Meanwhile, this change detection process helped to discover a device fault which was not claimed in the system logs. © 2010 IEEE.
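
    The Page-Hinkley statistic named above tracks the cumulative deviation of each observation from the running mean and raises an alarm when it drifts too far above its historical minimum. A minimal sketch for detecting an upward mean shift (the parameter values are illustrative; the paper's contribution is precisely the self-adaptive setting of them):

```python
class PageHinkley:
    """Page-Hinkley test for detecting an upward mean shift in a data stream."""

    def __init__(self, delta=0.005, threshold=5.0):
        self.delta, self.threshold = delta, threshold
        self.mean, self.n = 0.0, 0
        self.cum, self.cum_min = 0.0, 0.0

    def update(self, x):
        """Feed one observation; return True if a change is signalled."""
        self.n += 1
        self.mean += (x - self.mean) / self.n        # running mean
        self.cum += x - self.mean - self.delta       # cumulative deviation
        self.cum_min = min(self.cum_min, self.cum)   # historical minimum
        return self.cum - self.cum_min > self.threshold
```

    On a stream whose mean jumps upward, the gap `cum - cum_min` grows roughly linearly after the change point, so the alarm fires within a few samples.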

  8. The effects of short-lasting anti-saccade training in homonymous hemianopia with and without saccadic adaptation

    Directory of Open Access Journals (Sweden)

    Delphine Lévy-Bencheton

    2016-01-01

    Full Text Available Homonymous Visual Field Defects (HVFD) are common following stroke and can be highly debilitating for visual perception and higher-level cognitive functions such as exploring a visual scene or reading a text. Rehabilitation using oculomotor compensatory methods with automatic training over a short duration (~15 days) has been shown to be as efficient as longer voluntary training methods (>1 month). Here, we propose to evaluate and compare the effect of an original HVFD rehabilitation method based on a single 15-min voluntary anti-saccade (AS) task toward the blind hemifield, with automatic sensorimotor adaptation to increase AS amplitude. In order to distinguish between adaptation and training effects, fourteen left- or right-HVFD patients were exposed, one month apart, to three trainings: two with an isolated AS task (Delayed-shift and No-shift paradigms) and one combined with AS adaptation (Adaptation paradigm). A quality-of-life questionnaire (NEI-VFQ 25) and functional measurements (reading speed, visual exploration time in pop-out and serial tasks), as well as oculomotor measurements, were assessed before and after each training. We could not demonstrate significant adaptation at the group level, but we identified a group of 9 adapted patients. While AS training itself demonstrated significant functional improvements in the overall patient group, we could also demonstrate, in the sub-group of adapted patients and specifically following the adaptation training, an increase of saccade amplitude during the reading task (left-HVFD patients) and the serial exploration task, and improvement of the visual quality of life. We conclude that short-lasting AS training combined with adaptation could be implemented in rehabilitation methods for cognitive dysfunctions following HVFD. Indeed, both voluntary and automatic processes have shown interesting effects on the control of visually guided saccades in different cognitive tasks.

  9. A stereotactic adapter compatible with computed tomography

    International Nuclear Information System (INIS)

    Cacak, R.K.; Law, J.D.

    1982-01-01

    One application of computed-tomographic (CT) scanners is the localization of intracranial targets for stereotactic surgery. Unfortunately, conventional stereotactic devices affixed to the patient cause artifacts which obscure anatomic features in CT images. The authors describe the initial phase of a project to eliminate this problem by using an adapter that is free of metallic objects. Localization of the target point relative to the coordinate system of a Leksell stereotactic frame is achieved from CT image measurements.

  10. Smart swarms of bacteria-inspired agents with performance adaptable interactions.

    Directory of Open Access Journals (Sweden)

    Adi Shklarsh

    2011-09-01

    Full Text Available Collective navigation and swarming have been studied in animal groups, such as fish schools, bird flocks, bacteria, and slime molds. Computer modeling has shown that collective behavior of simple agents can result from simple interactions between the agents, which include short range repulsion, intermediate range alignment, and long range attraction. Here we study collective navigation of bacteria-inspired smart agents in complex terrains, with adaptive interactions that depend on performance. More specifically, each agent adjusts its interactions with the other agents according to its local environment--by decreasing the peers' influence while navigating in a beneficial direction, and increasing it otherwise. We show that inclusion of such performance dependent adaptable interactions significantly improves the collective swarming performance, leading to highly efficient navigation, especially in complex terrains. Notably, to afford such adaptable interactions, each modeled agent requires only simple computational capabilities with short-term memory, which can easily be implemented in simple swarming robots.

  11. Smart swarms of bacteria-inspired agents with performance adaptable interactions.

    Science.gov (United States)

    Shklarsh, Adi; Ariel, Gil; Schneidman, Elad; Ben-Jacob, Eshel

    2011-09-01

    Collective navigation and swarming have been studied in animal groups, such as fish schools, bird flocks, bacteria, and slime molds. Computer modeling has shown that collective behavior of simple agents can result from simple interactions between the agents, which include short range repulsion, intermediate range alignment, and long range attraction. Here we study collective navigation of bacteria-inspired smart agents in complex terrains, with adaptive interactions that depend on performance. More specifically, each agent adjusts its interactions with the other agents according to its local environment--by decreasing the peers' influence while navigating in a beneficial direction, and increasing it otherwise. We show that inclusion of such performance dependent adaptable interactions significantly improves the collective swarming performance, leading to highly efficient navigation, especially in complex terrains. Notably, to afford such adaptable interactions, each modeled agent requires only simple computational capabilities with short-term memory, which can easily be implemented in simple swarming robots.

  12. Water System Adaptation To Hydrological Changes: Module 11, Methods and Tools: Computational Models

    Science.gov (United States)

    This course will introduce students to the fundamental principles of water system adaptation to hydrological changes, with emphasis on data analysis and interpretation, technical planning, and computational modeling. Starting with real-world scenarios and adaptation needs, the co...

  13. Molecular determinants of enzyme cold adaptation: comparative structural and computational studies of cold- and warm-adapted enzymes.

    Science.gov (United States)

    Papaleo, Elena; Tiberti, Matteo; Invernizzi, Gaetano; Pasi, Marco; Ranzani, Valeria

    2011-11-01

    The identification of molecular mechanisms underlying enzyme cold adaptation is a hot topic both for fundamental research and industrial applications. In the present contribution, we review the last decades of structural and computational investigations on cold-adapted enzymes in comparison to their warm-adapted counterparts. Comparative sequence and structural studies allow the definition of a multitude of adaptation strategies. Different enzymes employ diverse mechanisms to adapt to low temperatures, so that a general theory for enzyme cold adaptation cannot be formulated. However, some common features can be traced in the dynamic and flexibility properties of these enzymes, as well as in their intra- and inter-molecular interaction networks. Interestingly, the current data suggest that a family-centered point of view is necessary in the comparative analyses of cold- and warm-adapted enzymes. In fact, enzymes belonging to the same family or superfamily, thus sharing at least the three-dimensional fold and common features of the functional sites, have evolved similar structural and dynamic patterns to overcome the detrimental effects of low temperatures.

  14. Moving finite elements: A continuously adaptive method for computational fluid dynamics

    International Nuclear Information System (INIS)

    Glasser, A.H.; Miller, K.; Carlson, N.

    1991-01-01

    Moving Finite Elements (MFE), a recently developed method for computational fluid dynamics, promises major advances in the ability of computers to model the complex behavior of liquids, gases, and plasmas. Applications of computational fluid dynamics occur in a wide range of scientifically and technologically important fields. Examples include meteorology, oceanography, global climate modeling, magnetic and inertial fusion energy research, semiconductor fabrication, biophysics, automobile and aircraft design, industrial fluid processing, chemical engineering, and combustion research. The improvements made possible by the new method could thus have substantial economic impact. Moving Finite Elements is a moving node adaptive grid method which has a tendency to pack the grid finely in regions where it is most needed at each time and to leave it coarse elsewhere. It does so in a manner which is simple and automatic, and does not require a large amount of human ingenuity to apply it to each particular problem. At the same time, it often allows the time step to be large enough to advance a moving shock by many shock thicknesses in a single time step, moving the grid smoothly with the solution and minimizing the number of time steps required for the whole problem. For 2D problems (two spatial variables) the grid is composed of irregularly shaped and irregularly connected triangles which are very flexible in their ability to adapt to the evolving solution. While other adaptive grid methods have been developed which share some of these desirable properties, this is the only method which combines them all. In many cases, the method can save orders of magnitude of computing time, equivalent to several generations of advancing computer hardware
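
    Full MFE treats the node positions as unknowns of the discretized PDE itself, which is beyond a short sketch. As a much simpler illustration of the adaptive-grid idea it builds on (pack points where the solution varies fastest), here is a 1D equidistribution sketch using an arc-length monitor function; this is an assumed stand-in for illustration, not the MFE algorithm:

```python
import numpy as np

def equidistribute(f, a, b, n, m=400):
    """Place n+1 grid points on [a, b] so that each cell carries an equal
    share of the monitor function M(x) = sqrt(1 + f'(x)^2) (arc length)."""
    x = np.linspace(a, b, m)
    dfdx = np.gradient(f(x), x)
    M = np.sqrt(1.0 + dfdx**2)
    # cumulative trapezoidal integral of M, normalized to [0, 1]
    s = np.concatenate([[0.0], np.cumsum(0.5 * (M[1:] + M[:-1]) * np.diff(x))])
    s /= s[-1]
    # invert the map: grid point k sits where s reaches k/n
    return np.interp(np.linspace(0.0, 1.0, n + 1), s, x)
```

    For f(x) = tanh(20x) on [-1, 1] the points cluster tightly around the steep front at x = 0 and stay coarse elsewhere, the "pack the grid finely in regions where it is most needed" behaviour described above.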

  15. Comparing computer adaptive and curriculum-based measures of math in progress monitoring.

    Science.gov (United States)

    Shapiro, Edward S; Dennis, Minyi Shih; Fu, Qiong

    2015-12-01

    The purpose of the study was to compare the use of a Computer Adaptive Test and Curriculum-Based Measurement in the assessment of mathematics. This study also investigated the degree to which slope or rate of change predicted student outcomes on the annual state assessment of mathematics above and beyond scores of single point screening assessments (i.e., the computer adaptive test or the CBM assessment just before the administration of the state assessment). Repeated measurement of mathematics once per month across a 7-month period using a Computer Adaptive Test (STAR-Math) and Curriculum-Based Measurement (CBM, AIMSweb Math Computation, AIMSweb Math Concepts/Applications) was collected for a maximum total of 250 third, fourth, and fifth grade students. Results showed STAR-Math in all 3 grades and AIMSweb Math Concepts/Applications in the third and fifth grades had primarily linear growth patterns in mathematics. AIMSweb Math Computation in all grades and AIMSweb Math Concepts/Applications in Grade 4 had decelerating positive trends. Predictive validity evidence showed the strongest relationships were between STAR-Math and outcomes for third and fourth grade students. The blockwise multiple regression by grade revealed that slopes accounted for only a very small proportion of additional variance above and beyond what was explained by the scores obtained on a single point of assessment just prior to the administration of the state assessment. (c) 2015 APA, all rights reserved.

  16. Hard Real-Time Task Scheduling in Cloud Computing Using an Adaptive Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Amjad Mahmood

    2017-04-01

    Full Text Available In the Infrastructure-as-a-Service cloud computing model, virtualized computing resources in the form of virtual machines are provided over the Internet. A user can rent an arbitrary number of computing resources to meet their requirements, making cloud computing an attractive choice for executing real-time tasks. Economical task allocation and scheduling on a set of leased virtual machines is an important problem in the cloud computing environment. This paper proposes a greedy algorithm and a genetic algorithm with an adaptive selection of suitable crossover and mutation operations (named AGA) to allocate and schedule real-time tasks with precedence constraints on heterogeneous virtual machines. A comprehensive simulation study has been done to evaluate the performance of the proposed algorithms in terms of their solution quality and efficiency. The simulation results show that AGA outperforms the greedy algorithm and a non-adaptive genetic algorithm in terms of solution quality.
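
    The adaptive part of such a genetic algorithm can be as simple as crediting whichever crossover or mutation operator recently improved fitness. The decay/reward rule below is a hypothetical stand-in for the paper's selection mechanism, and the operator names are illustrative:

```python
import random

class AdaptiveOperatorSelector:
    """Roulette-wheel operator selection whose weights adapt to each
    operator's recent success (hypothetical credit-assignment rule)."""

    def __init__(self, operators, decay=0.9, reward=1.0):
        self.ops = list(operators)
        self.weights = {op: 1.0 for op in self.ops}
        self.decay, self.reward = decay, reward

    def pick(self):
        """Sample an operator with probability proportional to its weight."""
        total = sum(self.weights.values())
        r, acc = random.random() * total, 0.0
        for op in self.ops:
            acc += self.weights[op]
            if r <= acc:
                return op
        return self.ops[-1]

    def feedback(self, op, improved):
        """Decay all weights; reward the operator that improved fitness."""
        for o in self.ops:
            self.weights[o] *= self.decay
        if improved:
            self.weights[op] += self.reward
```

    Inside the GA loop, `pick()` chooses the operator applied to each offspring and `feedback()` is called after evaluating it, so successful operators gradually dominate the roulette wheel.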

  17. Adaptive scapula bone remodeling computational simulation: Relevance to regenerative medicine

    International Nuclear Information System (INIS)

    Sharma, Gulshan B.; Robertson, Douglas D.

    2013-01-01

    Shoulder arthroplasty success has been attributed to many factors including bone quality, soft tissue balancing, surgeon experience, and implant design. Improved long-term success is primarily limited by glenoid implant loosening. Prosthesis design examines materials and shape and determines whether the design can withstand a lifetime of use. Finite element (FE) analyses have been extensively used to study stresses and strains produced in implants and bone. However, these static analyses only measure a moment in time and not the adaptive response to the altered environment produced by the therapeutic intervention. Computational analyses that integrate remodeling rules predict how bone will respond over time. Recent work has shown that subject-specific two- and three-dimensional adaptive bone remodeling models are feasible and valid. Feasibility and validation were achieved computationally by simulating bone remodeling using an intact human scapula, initially resetting the scapular bone material properties to be uniform, numerically simulating sequential loading, and comparing the bone remodeling simulation results to the actual scapula's material properties. A three-dimensional scapula FE bone model was created using volumetric computed tomography images. Muscle and joint load and boundary conditions were applied based on values reported in the literature. Internal bone remodeling was based on element strain-energy density. Initially, all bone elements were assigned a homogeneous density. All loads were applied for 10 iterations. After every iteration, each bone element's remodeling stimulus was compared to its corresponding reference stimulus and its material properties modified. The simulation achieved convergence. At the end of the simulation the predicted and actual specimen bone apparent density were plotted and compared. Location of high and low predicted bone density was comparable to the actual specimen. High predicted bone density was greater than
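
    The per-element update loop described above (compare a strain-energy-density stimulus to a reference, then adjust density) can be illustrated for a single element under constant stress. The stiffness-density power law E = c·rho² and every constant below are hypothetical, chosen only to make the feedback visible:

```python
def remodel_density(stress, rho0=0.8, s_ref=0.25, rate=0.5,
                    rho_min=0.01, rho_max=1.8, iters=200, c=3000.0):
    """Iterate rho <- rho + rate * (U/rho - s_ref) for one element under
    constant stress, with E(rho) = c * rho**2 (hypothetical power law)."""
    rho = rho0
    for _ in range(iters):
        E = c * rho**2
        U = stress**2 / (2.0 * E)              # strain-energy density
        stimulus = U / rho                     # density-normalized stimulus
        rho += rate * (stimulus - s_ref)       # remodeling rule
        rho = min(max(rho, rho_min), rho_max)  # physiological bounds
    return rho
```

    At equilibrium U/rho = s_ref, so rho³ = stress² / (2·c·s_ref): doubling the stress raises the converged density, mimicking denser bone in highly loaded regions.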

  18. Computational adaptive optics for broadband interferometric tomography of tissues and cells

    Science.gov (United States)

    Adie, Steven G.; Mulligan, Jeffrey A.

    2016-03-01

    Adaptive optics (AO) can shape aberrated optical wavefronts to physically restore the constructive interference needed for high-resolution imaging. With access to the complex optical field, however, many functions of optical hardware can be achieved computationally, including focusing and the compensation of optical aberrations to restore the constructive interference required for diffraction-limited imaging performance. Holography, which employs interferometric detection of the complex optical field, was developed based on this connection between hardware and computational image formation, although this link has only recently been exploited for 3D tomographic imaging in scattering biological tissues. This talk will present the underlying imaging science behind computational image formation with optical coherence tomography (OCT) -- a beam-scanned version of broadband digital holography. Analogous to hardware AO (HAO), we demonstrate computational adaptive optics (CAO) and optimization of the computed pupil correction in 'sensorless mode' (Zernike polynomial corrections with feedback from image metrics) or with the use of 'guide-stars' in the sample. We discuss the concept of an 'isotomic volume' as the volumetric extension of the 'isoplanatic patch' introduced in astronomical AO. Recent CAO results and ongoing work is highlighted to point to the potential biomedical impact of computed broadband interferometric tomography. We also discuss the advantages and disadvantages of HAO vs. CAO for the effective shaping of optical wavefronts, and highlight opportunities for hybrid approaches that synergistically combine the unique advantages of hardware and computational methods for rapid volumetric tomography with cellular resolution.

  19. Adaptive Remodeling of Achilles Tendon: A Multi-scale Computational Model.

    Directory of Open Access Journals (Sweden)

    Stuart R Young

    2016-09-01

    Full Text Available While it is known that musculotendon units adapt to their load environments, there is only a limited understanding of tendon adaptation in vivo. Here we develop a computational model of tendon remodeling based on the premise that mechanical damage and tenocyte-mediated tendon damage and repair processes modify the distribution of its collagen fiber lengths. We explain how these processes enable the tendon to geometrically adapt to its load conditions. Based on known biological processes, mechanical and strain-dependent proteolytic fiber damage are incorporated into our tendon model. Using a stochastic model of fiber repair, it is assumed that mechanically damaged fibers are repaired longer, whereas proteolytically damaged fibers are repaired shorter, relative to their pre-damage length. To study adaptation of tendon properties to applied load, our model musculotendon unit is a simplified three-component Hill-type model of the human Achilles-soleus unit. Our model results demonstrate that the geometric equilibrium state of the Achilles tendon can coincide with minimization of the total metabolic cost of muscle activation. The proposed tendon model independently predicts rates of collagen fiber turnover that are in general agreement with in vivo experimental measurements. While the computational model here only represents a first step in a new approach to understanding the complex process of tendon remodeling in vivo, given these findings, it appears likely that the proposed framework may itself provide a useful theoretical foundation for developing valuable qualitative and quantitative insights into tendon physiology and pathology.

  20. Short-term electric load forecasting using computational intelligence methods

    OpenAIRE

    Jurado, Sergio; Peralta, J.; Nebot, Àngela; Mugica, Francisco; Cortez, Paulo

    2013-01-01

    Accurate time series forecasting is a key issue to support individual and organizational decision making. In this paper, we introduce several methods for short-term electric load forecasting. All the presented methods stem from computational intelligence techniques: Random Forest, Nonlinear Autoregressive Neural Networks, Evolutionary Support Vector Machines and Fuzzy Inductive Reasoning. The performance of the suggested methods is experimentally justified with several experiments carried out...

  1. Processing-Efficient Distributed Adaptive RLS Filtering for Computationally Constrained Platforms

    Directory of Open Access Journals (Sweden)

    Noor M. Khan

    2017-01-01

    Full Text Available In this paper, a novel processing-efficient architecture of a group of inexpensive and computationally incapable small platforms is proposed for a parallely distributed adaptive signal processing (PDASP) operation. The proposed architecture runs computationally expensive procedures, like the complex adaptive recursive least squares (RLS) algorithm, cooperatively. The proposed PDASP architecture operates properly even if perfect time alignment among the participating platforms is not available. An RLS algorithm with the application of MIMO channel estimation is deployed on the proposed architecture. The complexity and processing time of the PDASP scheme with the MIMO RLS algorithm are compared with those of the sequentially operated MIMO RLS algorithm and the linear Kalman filter. It is observed that the PDASP scheme exhibits much lower computational complexity than the sequential MIMO RLS algorithm as well as the Kalman filter. Moreover, the proposed architecture reduces processing time by 95.83% and 82.29% compared to the sequentially operated Kalman filter and MIMO RLS algorithm, respectively, for a low Doppler rate. Likewise, for a high Doppler rate, the proposed architecture reduces processing time by 94.12% and 77.28% compared to the Kalman and RLS algorithms, respectively.
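
    The RLS recursion that the platforms execute cooperatively is the standard exponentially weighted one. A single-node sketch for FIR system identification (the distributed PDASP partitioning across platforms is not reproduced here):

```python
import numpy as np

def rls_identify(x, d, order=4, lam=0.99, delta=100.0):
    """Standard recursive least squares identification of an FIR filter
    mapping input x to desired output d."""
    w = np.zeros(order)            # filter weight estimate
    P = delta * np.eye(order)      # inverse correlation matrix
    u = np.zeros(order)            # regressor (most recent samples first)
    for n in range(len(x)):
        u = np.concatenate(([x[n]], u[:-1]))   # shift in the newest sample
        k = P @ u / (lam + u @ P @ u)          # gain vector
        e = d[n] - w @ u                       # a priori error
        w = w + k * e                          # weight update
        P = (P - np.outer(k, u @ P)) / lam     # inverse-correlation update
    return w
```

    On noise-free data generated by a known FIR filter, the recursion recovers the true taps; the forgetting factor `lam` < 1 is what lets it track time-varying (e.g., Doppler-affected) channels.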

  2. Evaluating the Appropriateness of a New Computer-Administered Measure of Adaptive Function for Children and Youth with Autism Spectrum Disorders

    Science.gov (United States)

    Coster, Wendy J.; Kramer, Jessica M.; Tian, Feng; Dooley, Meghan; Liljenquist, Kendra; Kao, Ying-Chia; Ni, Pengsheng

    2016-01-01

    The Pediatric Evaluation of Disability Inventory-Computer Adaptive Test is an alternative method for describing the adaptive function of children and youth with disabilities using a computer-administered assessment. This study evaluated the performance of the Pediatric Evaluation of Disability Inventory-Computer Adaptive Test with a national…

  3. A Computer Program for Short Circuit Analysis of Electric Power ...

    African Journals Online (AJOL)

    The Short Circuit Analysis Program (SCAP) is used to assess the composite effects of unbalanced and balanced faults on the overall reliability of an electric power system. The program uses the symmetrical components method to compute all phase and sequence quantities for any bus or branch of a given power network ...
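
    The symmetrical components method mentioned above maps the three phase quantities into zero-, positive-, and negative-sequence components via the rotation operator a = 1∠120°, so that an unbalanced network decouples into three balanced ones. A minimal sketch of the transform:

```python
import numpy as np

a = np.exp(2j * np.pi / 3)             # 120-degree rotation operator
A = np.array([[1, 1,    1   ],
              [1, a**2, a   ],
              [1, a,    a**2]])        # sequence -> phase matrix

def to_sequence(phases):
    """Phase quantities (Va, Vb, Vc) -> (zero, positive, negative) sequence."""
    return np.linalg.solve(A, np.asarray(phases, dtype=complex))

def to_phase(seq):
    """Sequence components back to phase quantities."""
    return A @ np.asarray(seq, dtype=complex)
```

    A perfectly balanced set (Va, Vb, Vc) = (1, a², a) maps to a pure positive-sequence component; fault conditions show up as nonzero zero- or negative-sequence terms.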

  4. Short-term adaptation and chronic cardiac remodelling to high altitude in lowlander natives and Himalayan Sherpa.

    Science.gov (United States)

    Stembridge, Mike; Ainslie, Philip N; Shave, Rob

    2015-11-01

    What is the topic of this review? At high altitude, the cardiovascular system must adapt in order to meet the metabolic demand for oxygen. This review summarizes recent findings relating to short-term and life-long cardiac adaptation to high altitude in the context of exercise capacity. What advances does it highlight? Both Sherpa and lowlanders exhibit smaller left ventricular volumes at high altitude; however, myocardial relaxation, as evidenced by diastolic untwist, is reduced only in Sherpa, indicating that short-term hypoxia does not impair diastolic relaxation. Potential remodelling of systolic function, as evidenced by lower left ventricular systolic twist in Sherpa, may facilitate the requisite sea-level mechanical reserve required during exercise, although this remains to be confirmed. Both short-term and life-long high-altitude exposure challenge the cardiovascular system to meet the metabolic demand for O2 in a hypoxic environment. As the demand for O2 delivery increases during exercise, the circulatory component of oxygen transport is placed under additional stress. Acute adaptation and chronic remodelling of cardiac structure and function may occur to facilitate O2 delivery in lowlanders during sojourn to high altitude and in permanent highland residents. However, our understanding of cardiac structural and functional adaptation in Sherpa remains confined to a higher maximal heart rate, lower pulmonary vascular resistance and no differences in resting cardiac output. Ventricular form and function are intrinsically linked through the left ventricular (LV) mechanics that facilitate efficient ejection, minimize myofibre stress during contraction and aid diastolic recoil. Recent examination of LV mechanics has allowed detailed insight into fundamental cardiac adaptation in high-altitude Sherpa.
In this symposium report, we review recent advances in our understanding of LV function in both lowlanders and Sherpa at rest and discuss the potential consequences

  5. A self-adaptive chaotic particle swarm algorithm for short term hydroelectric system scheduling in deregulated environment

    International Nuclear Information System (INIS)

    Jiang Chuanwen; Bompard, Etorre

    2005-01-01

    This paper proposes a short term hydroelectric plant dispatch model based on the rule of maximizing the benefit. The optimal dispatch model is a large scale nonlinear planning problem with multiple constraints and variables; to solve the short term generation scheduling of a hydro-system better in a deregulated environment, this paper proposes a novel self-adaptive chaotic particle swarm optimization algorithm. Since chaotic mapping enjoys determinism, ergodicity and stochasticity, the proposed approach introduces chaos mapping and an adaptive scaling term into the particle swarm optimization algorithm, which increases its convergence rate and resulting precision. The new method has been examined and tested on a practical hydro-system. The results are promising and show the effectiveness and robustness of the proposed approach in comparison with the traditional particle swarm optimization algorithm.
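
    The core idea (driving PSO parameters with a deterministic chaotic map rather than a fixed or pseudo-random schedule) can be sketched as follows. This minimal version uses a logistic map for the inertia weight only and a toy objective; it is not the paper's full self-adaptive algorithm or hydro dispatch model:

```python
import numpy as np

def chaotic_pso(f, dim=2, n=20, iters=200, lb=-5.0, ub=5.0, seed=1):
    """PSO whose inertia weight is driven by a logistic chaotic map
    (a minimal illustration of the chaos-mapping idea only)."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(lb, ub, (n, dim))
    v = np.zeros((n, dim))
    pbest = x.copy()
    pval = np.array([f(p) for p in x])
    g = pbest[np.argmin(pval)].copy()
    z = 0.7                                  # chaotic state (not a fixed point)
    for _ in range(iters):
        z = 4.0 * z * (1.0 - z)              # logistic map, fully chaotic r = 4
        w = 0.4 + 0.5 * z                    # chaotic inertia weight in [0.4, 0.9]
        r1, r2 = rng.random((2, n, dim))
        v = w * v + 2.0 * r1 * (pbest - x) + 2.0 * r2 * (g - x)
        x = np.clip(x + v, lb, ub)
        fx = np.array([f(p) for p in x])
        better = fx < pval
        pbest[better], pval[better] = x[better], fx[better]
        g = pbest[np.argmin(pval)].copy()
    return g, float(pval.min())

sphere = lambda p: float(np.sum(p**2))       # toy objective standing in for the dispatch model
```

    Because the logistic map is deterministic yet ergodic, the inertia weight sweeps its range without repeating a short cycle, which is the property the abstract credits for the improved convergence.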

  6. Computer Adaptive Multistage Testing: Practical Issues, Challenges and Principles

    Directory of Open Access Journals (Sweden)

    Halil Ibrahim SARI

    2016-12-01

    Full Text Available The purpose of many tests in educational and psychological measurement is to measure test takers' latent trait scores from responses given to a set of items. Over the years, this has been done by traditional methods (paper and pencil tests). However, compared to other test administration models (e.g., adaptive testing), traditional methods are extensively criticized for producing low measurement accuracy and long test lengths. Adaptive testing has been proposed to overcome these problems. There are two popular adaptive testing approaches: computerized adaptive testing (CAT) and computer adaptive multistage testing (ca-MST). The former is a well-known approach that has been predominantly used in this field. We believe that researchers and practitioners are fairly familiar with many aspects of CAT because it has more than a hundred years of history. However, the same is not true for the latter. Since ca-MST is relatively new, many researchers are not familiar with its features. The purpose of this study is to closely examine the characteristics of ca-MST, including its working principle, the adaptation procedure called the routing method, test assembly, and scoring, and to provide an overview to researchers, with the aim of drawing researchers' attention to ca-MST and encouraging them to contribute to research in this area. Books, software and future work for ca-MST are also discussed.
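
    The routing method the abstract refers to can be as simple as number-correct scoring: after each stage, the examinee's raw score on the current module decides which difficulty level is administered next. A toy sketch with hypothetical cutoffs:

```python
def route(num_correct, module_size, cutoffs=(0.4, 0.7)):
    """Number-correct routing to the next stage's easy/medium/hard module
    (cutoff proportions are hypothetical)."""
    p = num_correct / module_size            # proportion correct so far
    if p < cutoffs[0]:
        return "easy"
    if p < cutoffs[1]:
        return "medium"
    return "hard"
```

    Operational ca-MST designs typically route on a provisional IRT ability estimate instead of a raw score, but the branching logic between pre-assembled modules is the same.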

  7. Odor-context effects in free recall after a short retention interval: a new methodology for controlling adaptation.

    Science.gov (United States)

    Isarida, Takeo; Sakai, Tetsuya; Kubota, Takayuki; Koga, Miho; Katayama, Yu; Isarida, Toshiko K

    2014-04-01

    The present study investigated context effects of incidental odors in free recall after a short retention interval (5 min). With a short retention interval, the results are not confounded by extraneous odors or encounters with the experimental odor and possible rehearsal during a long retention interval. A short study time condition (4 s per item), predicted not to be affected by adaptation to the odor, and a long study time condition (8 s per item) were used. Additionally, we introduced a new method for recovery from adaptation, where a dissimilar odor was briefly presented at the beginning of the retention interval, and we demonstrated the effectiveness of this technique. An incidental learning paradigm was used to prevent overshadowing from confounding the results. In three experiments, undergraduates (N = 200) incidentally studied words presented one-by-one and received a free recall test. Two pairs of odors and a third odor having different semantic-differential characteristics were selected from 14 familiar odors. One of the odors was presented during encoding, and during the test, the same odor (same-context condition) or the other odor within the pair (different-context condition) was presented. Without using a recovery-from-adaptation method, a significant odor-context effect appeared in the 4-s/item condition, but not in the 8-s/item condition. Using the recovery-from-adaptation method, context effects were found for both the 8- and the 4-s/item conditions. The size of the recovered odor-context effect did not change with study time. There were no serial position effects. Implications of the present findings are discussed.

  8. Passenger thermal perceptions, thermal comfort requirements, and adaptations in short- and long-haul vehicles.

    Science.gov (United States)

    Lin, Tzu-Ping; Hwang, Ruey-Lung; Huang, Kuo-Tsang; Sun, Chen-Yi; Huang, Ying-Che

    2010-05-01

    While thermal comfort in mass transportation vehicles is relevant to service quality and energy consumption, benchmarks for such comfort that reflect the thermal adaptations of passengers are currently lacking. This study reports a field experiment involving simultaneous physical measurements and a questionnaire survey, collecting data from 2,129 respondents, that evaluated thermal comfort in short- and long-haul buses and trains. Experimental results indicate that high air temperature, strong solar radiation, and low air movement explain why passengers feel thermally uncomfortable. The overall insulation of clothing worn by passengers and thermal adaptive behaviour in vehicles differ from those in their living and working spaces. Passengers in short-haul vehicles habitually adjust the air outlets to increase thermal comfort, while passengers in long-haul vehicles prefer to draw the drapes to reduce discomfort from extended exposure to solar radiation. The neutral temperatures for short- and long-haul vehicles are 26.2 degrees C and 27.4 degrees C, while the comfort zones are 22.4-28.9 degrees C and 22.4-30.1 degrees C, respectively. The results of this study provide a valuable reference for practitioners involved in determining the adequate control and management of in-vehicle thermal environments, as well as facilitating design of buses and trains, ultimately contributing to efforts to achieve a balance between the thermal comfort satisfaction of passengers and energy conserving measures for air-conditioning in mass transportation vehicles.

  9. Short- and long-term adaptation to ethanol stress and its cross-protective consequences in Lactobacillus plantarum

    NARCIS (Netherlands)

    Bokhorst-van de Veen, van H.; Abee, T.; Tempelaars, M.H.; Bron, P.A.; Kleerebezem, M.; Marco, M.L.

    2011-01-01

    This paper describes the molecular responses of Lactobacillus plantarum WCFS1 toward ethanol exposure. Global transcriptome profiling using DNA microarrays demonstrated adaptation of the microorganism to the presence of 8% ethanol over short (10-min and 30-min) and long (24-h) time intervals. A

  10. A computer simulation of an adaptive noise canceler with a single input

    Science.gov (United States)

    Albert, Stuart D.

    1991-06-01

    A description of an adaptive noise canceler using Widrow's LMS algorithm is presented. A computer simulation of canceler performance (adaptive convergence time and frequency transfer function) was written for use as a design tool. The simulations, assumptions, and input parameters are described in detail. The simulation is used in a design example to predict the performance of an adaptive noise canceler in the simultaneous presence of both strong and weak narrow-band signals (a cosited frequency hopping radio scenario). On the basis of the simulation results, it is concluded that the simulation is suitable for use as an adaptive noise canceler design tool; i.e., it can be used to evaluate the effect of design parameter changes on canceler performance.
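
    The canceler's principle can be illustrated with a short sketch of the Widrow-Hoff LMS update: an FIR filter adapts so its output tracks the noise that is correlated with a reference channel, leaving the error as the cleaned signal. Note this sketch uses a separate reference input for clarity, whereas the paper's single-input canceler derives its reference from a delayed copy of the primary input; the signal shapes, filter length, and step size below are illustrative assumptions.

    ```python
    import math, random

    def lms_cancel(primary, reference, n_taps=8, mu=0.01):
        """Widrow-Hoff LMS canceler: adapt FIR weights w so that w * reference
        tracks the noise in primary; the error e is then the cleaned signal."""
        w = [0.0] * n_taps
        buf = [0.0] * n_taps
        cleaned = []
        for d, x in zip(primary, reference):
            buf = [x] + buf[:-1]                                    # reference tap line
            y = sum(wi * xi for wi, xi in zip(w, buf))              # noise estimate
            e = d - y                                               # error output
            w = [wi + 2.0 * mu * e * xi for wi, xi in zip(w, buf)]  # LMS weight update
            cleaned.append(e)
        return cleaned, w

    random.seed(0)
    n = 4000
    ref = [random.gauss(0.0, 1.0) for _ in range(n)]                    # noise source
    noise = [0.8 * ref[i] - (0.4 * ref[i - 1] if i else 0.0) for i in range(n)]
    sig = [math.sin(2.0 * math.pi * 0.01 * i) for i in range(n)]        # wanted signal
    primary = [s + v for s, v in zip(sig, noise)]

    cleaned, w = lms_cancel(primary, ref)
    residual = sum((c - s) ** 2 for c, s in zip(cleaned[-1000:], sig[-1000:])) / 1000.0
    print(round(residual, 3), [round(x, 2) for x in w[:2]])
    ```

    After convergence the leading weights approach the assumed noise path (0.8, -0.4), and the residual noise power is a small fraction of the original noise power; the step size mu controls the usual trade-off between convergence time and misadjustment.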

  11. New challenges in grid generation and adaptivity for scientific computing

    CERN Document Server

    Formaggia, Luca

    2015-01-01

    This volume collects selected contributions from the “Fourth Tetrahedron Workshop on Grid Generation for Numerical Computations”, which was held in Verbania, Italy in July 2013. The previous editions of this Workshop were hosted by the Weierstrass Institute in Berlin (2005), by INRIA Rocquencourt in Paris (2007), and by Swansea University (2010). This book covers different, though related, aspects of the field: the generation of quality grids for complex three-dimensional geometries; parallel mesh generation algorithms; mesh adaptation, including both theoretical and implementation aspects; grid generation and adaptation on surfaces – all with an interesting mix of numerical analysis, computer science and strongly application-oriented problems.

  12. Towards Static Analysis of Policy-Based Self-adaptive Computing Systems

    DEFF Research Database (Denmark)

    Margheri, Andrea; Nielson, Hanne Riis; Nielson, Flemming

    2016-01-01

    For supporting the design of self-adaptive computing systems, the PSCEL language offers a principled approach that relies on declarative definitions of adaptation and authorisation policies enforced at runtime. Policies permit managing system components by regulating their interactions and by dynamically introducing new actions to accomplish task-oriented goals. However, the runtime evaluation of policies and their effects on system components make the prediction of system behaviour challenging. In this paper, we introduce the construction of a flow graph that statically points out the policy evaluations that can take place at runtime and exploit it to analyse the effects of policy evaluations on the progress of system components.

  13. Let Documents Talk to Each Other: A Computer Model for Connection of Short Documents.

    Science.gov (United States)

    Chen, Z.

    1993-01-01

    Discusses the integration of scientific texts through the connection of documents and describes a computer model that can connect short documents. Information retrieval and artificial intelligence are discussed; a prototype system of the model is explained; and the model is compared to other computer models. (17 references) (LRW)

  14. Adaptive Dynamic Process Scheduling on Distributed Memory Parallel Computers

    Directory of Open Access Journals (Sweden)

    Wei Shu

    1994-01-01

    Full Text Available One of the challenges in programming distributed memory parallel machines is deciding how to allocate work to processors. This problem is particularly important for computations with unpredictable dynamic behaviors or irregular structures. We present a scheme for dynamic scheduling of medium-grained processes that is useful in this context. The adaptive contracting within neighborhood (ACWN) is a dynamic, distributed, load-dependent, and scalable scheme. It deals with dynamic and unpredictable creation of processes and adapts to different systems. The scheme is described and contrasted with two other schemes that have been proposed in this context, namely the randomized allocation and the gradient model. The performance of the three schemes on an Intel iPSC/2 hypercube is presented and analyzed. The experimental results show that even though the ACWN algorithm incurs somewhat larger overhead than the randomized allocation, it achieves better performance in most cases due to its adaptiveness. Its feature of quickly spreading the work helps it outperform the gradient model in performance and scalability.

  15. Wireless Adaptive Therapeutic TeleGaming in a Pervasive Computing Environment

    Science.gov (United States)

    Peters, James F.; Szturm, Tony; Borkowski, Maciej; Lockery, Dan; Ramanna, Sheela; Shay, Barbara

    This chapter introduces a wireless, pervasive computing approach to adaptive therapeutic telegaming considered in the context of near set theory. Near set theory provides a formal basis for observation, comparison and classification of perceptual granules. A perceptual granule is defined by a collection of objects that are graspable by the senses or by the mind. In the proposed pervasive computing approach to telegaming, a handicapped person (e.g., stroke patient with limited hand, finger, arm function) plays a video game by interacting with familiar instrumented objects such as cups, cutlery, soccer balls, nozzles, screw top-lids, spoons, so that the technology that makes therapeutic exercise game-playing possible is largely invisible (Archives of Physical Medicine and Rehabilitation 89:2213-2217, 2008). The basic approach to adaptive learning (AL) in the proposed telegaming environment is ethology-inspired and is quite different from the traditional approach to reinforcement learning. In biologically-inspired learning, organisms learn to achieve some goal by durable modification of behaviours in response to signals from the environment resulting from specific experiences (Animal Behavior, 1995). The term adaptive is used here in an ethological sense, where learning by an organism results from modifying behaviour in response to perceived changes in the environment. To instill adaptivity in a video game, it is assumed that learning by a video game is episodic. During an episode, the behaviour of a player is measured indirectly by tracking the occurrence of gaming events such as a hit or a miss of a target (e.g., hitting a moving ball with a game paddle). An ethogram provides a record of behaviour feature values that provides a basis for a functional registry of gaming adaptivity for handicapped players. An important practical application of adaptive gaming is therapeutic rehabilitation exercise carried out in parallel with playing action video games. Enjoyable and

  16. Introducing Enabling Computational Tools to the Climate Sciences: Multi-Resolution Climate Modeling with Adaptive Cubed-Sphere Grids

    Energy Technology Data Exchange (ETDEWEB)

    Jablonowski, Christiane [Univ. of Michigan, Ann Arbor, MI (United States)

    2015-07-14

    The research investigates and advances strategies for bridging the scale discrepancies between local, regional and global phenomena in climate models without the prohibitive computational costs of global cloud-resolving simulations. In particular, the research explores new frontiers in computational geoscience by introducing high-order Adaptive Mesh Refinement (AMR) techniques into climate research. AMR and statically-adapted variable-resolution approaches represent an emerging trend for atmospheric models and are likely to become the new norm in future-generation weather and climate models. The research advances the understanding of multi-scale interactions in the climate system and showcases a pathway to modeling these interactions effectively with advanced computational tools, like the Chombo AMR library developed at the Lawrence Berkeley National Laboratory. The research is interdisciplinary and combines applied mathematics, scientific computing and the atmospheric sciences. In this research project, a hierarchy of high-order atmospheric models on cubed-sphere computational grids has been developed that serves as an algorithmic prototype for the finite-volume solution-adaptive Chombo-AMR approach. The foci of the investigations have been the characteristics of both static mesh adaptations and dynamically-adaptive grids that can capture flow fields of interest, like tropical cyclones. Six research themes have been chosen. These are (1) the introduction of adaptive mesh refinement techniques into the climate sciences, (2) advanced algorithms for nonhydrostatic atmospheric dynamical cores, (3) an assessment of the interplay between resolved-scale dynamical motions and subgrid-scale physical parameterizations, (4) evaluation techniques for atmospheric model hierarchies, (5) the comparison of AMR refinement strategies and (6) tropical cyclone studies with a focus on multi-scale interactions and variable-resolution modeling.
The results of this research project

  17. Short-term effects of implemented high intensity shoulder elevation during computer work

    DEFF Research Database (Denmark)

    Larsen, Mette K.; Samani, Afshin; Madeleine, Pascal

    2009-01-01

    BACKGROUND: Work-site strength training sessions are shown effective to prevent and reduce neck-shoulder pain in computer workers, but difficult to integrate in normal working routines. A solution for avoiding neck-shoulder pain during computer work may be to implement high intensity voluntary contractions during the computer work. However, it is unknown how this may influence productivity, rate of perceived exertion (RPE) as well as activity and rest of neck-shoulder muscles during computer work. The aim of this study was to investigate short-term effects of a high intensity contraction ... computer work to prevent neck-shoulder pain may be possible without affecting the working routines. However, the unexpected reduction in clavicular trapezius rest during a pause with preceding high intensity contraction requires further investigation before high intensity shoulder elevations can ...

  18. Parallel Adaptive Mesh Refinement for High-Order Finite-Volume Schemes in Computational Fluid Dynamics

    Science.gov (United States)

    Schwing, Alan Michael

    For computational fluid dynamics, the governing equations are solved on a discretized domain of nodes, faces, and cells. The quality of the grid or mesh can be a driving source for error in the results. While refinement studies can help guide the creation of a mesh, grid quality is largely determined by user expertise and understanding of the flow physics. Adaptive mesh refinement is a technique for enriching the mesh during a simulation based on metrics for error, impact on important parameters, or location of important flow features. This can offload from the user some of the difficult and ambiguous decisions necessary when discretizing the domain. This work explores the implementation of adaptive mesh refinement in an implicit, unstructured, finite-volume solver. Consideration is made for applying modern computational techniques in the presence of hanging nodes and refined cells. The approach is developed to be independent of the flow solver in order to provide a path for augmenting existing codes. It is designed to be applicable for unsteady simulations, and refinement and coarsening of the grid does not impact the conservatism of the underlying numerics. The effect on high-order numerical fluxes of fourth- and sixth-order are explored. Provided the criteria for refinement are appropriately selected, solutions obtained using adapted meshes have no additional error when compared to results obtained on traditional, unadapted meshes. In order to leverage large-scale computational resources common today, the methods are parallelized using MPI. Parallel performance is considered for several test problems in order to assess scalability of both adapted and unadapted grids. Dynamic repartitioning of the mesh during refinement is crucial for load balancing an evolving grid. Development of the methods outlined here depends on a dual-memory approach that is described in detail. Validation of the solver developed here against a number of motivating problems shows favorable
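
    The flag-and-split cycle at the heart of adaptive mesh refinement can be illustrated with a toy one-dimensional sketch: flag cells where the solution jumps sharply across a face, split them, and repeat. The gradient-jump criterion, the tanh test function, and the pass count below are illustrative assumptions, not the solver's actual refinement metrics.

    ```python
    import math

    def refine(cells, f, tol=0.05):
        """cells: list of (x_left, x_right) intervals. Split every cell whose
        endpoint solution values differ by more than tol."""
        out = []
        for a, b in cells:
            if abs(f(b) - f(a)) > tol:
                m = 0.5 * (a + b)
                out.extend([(a, m), (m, b)])   # in 2-D/3-D this creates hanging nodes
            else:
                out.append((a, b))
        return out

    front = lambda x: math.tanh(20.0 * (x - 0.5))     # sharp front at x = 0.5
    cells = [(i / 10.0, (i + 1) / 10.0) for i in range(10)]
    for _ in range(3):                                 # three refinement passes
        cells = refine(cells, front)
    widths = sorted({round(b - a, 4) for a, b in cells})
    print(len(cells), widths)   # resolution concentrates around the front
    ```

    A production AMR pass would additionally enforce level balancing between neighbors, repartition the refined cells across MPI ranks, and conservatively transfer the solution between coarse and fine cells.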

  19. Neural Computations in a Dynamical System with Multiple Time Scales.

    Science.gov (United States)

    Mi, Yuanyuan; Lin, Xiaohan; Wu, Si

    2016-01-01

    Neural systems display rich short-term dynamics at various levels, e.g., spike-frequency adaptation (SFA) at the single-neuron level, and short-term facilitation (STF) and depression (STD) at the synapse level. These dynamical features typically cover a broad range of time scales and exhibit large diversity in different brain regions. It remains unclear what computational benefit the brain derives from such variability in short-term dynamics. In this study, we propose that the brain can exploit such dynamical features to implement multiple seemingly contradictory computations in a single neural circuit. To demonstrate this idea, we use a continuous attractor neural network (CANN) as a working model and include STF, SFA and STD with increasing time constants in its dynamics. Three computational tasks are considered, which are persistent activity, adaptation, and anticipative tracking. These tasks require conflicting neural mechanisms, and hence cannot be implemented by a single dynamical feature or any combination with similar time constants. However, with properly coordinated STF, SFA and STD, we show that the network is able to implement the three computational tasks concurrently. We hope this study will shed light on the understanding of how the brain orchestrates its rich dynamics at various levels to realize diverse cognitive functions.
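
    Of the dynamical features listed, spike-frequency adaptation is the simplest to sketch: a slow negative-feedback variable subtracts from the drive, so the firing rate peaks at stimulus onset and then sags toward a lower adapted level. The rate equations and time constants below are a generic textbook-style toy, not the CANN model of the paper.

    ```python
    def simulate_sfa(steps=2000, dt=0.1, tau_r=10.0, tau_a=200.0, g=0.5, drive=1.0):
        """Rate neuron with spike-frequency adaptation: the slow variable a
        tracks the rate and feeds back negatively onto the input."""
        r, a, trace = 0.0, 0.0, []
        for _ in range(steps):
            inp = max(drive - g * a, 0.0)        # adaptation opposes the input
            r += dt / tau_r * (-r + inp)         # fast rate dynamics
            a += dt / tau_a * (-a + r)           # slow adaptation tracks the rate
            trace.append(r)
        return trace

    trace = simulate_sfa()
    peak, late = max(trace), trace[-1]
    print(round(peak, 3), round(late, 3))   # onset peak, then adapted (lower) rate
    ```

    The separation of time scales (tau_a much larger than tau_r) is what produces the transient overshoot; in the paper's framework it is exactly this kind of slow feedback, combined with STF and STD on yet other time scales, that lets one circuit support conflicting computations.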

  20. Method and system for environmentally adaptive fault tolerant computing

    Science.gov (United States)

    Copenhaver, Jason L. (Inventor); Jeremy, Ramos (Inventor); Wolfe, Jeffrey M. (Inventor); Brenner, Dean (Inventor)

    2010-01-01

    A method and system for adapting fault tolerant computing. The method includes the steps of measuring an environmental condition representative of an environment. An on-board processing system's sensitivity to the measured environmental condition is measured. It is determined whether to reconfigure a fault tolerance of the on-board processing system based in part on the measured environmental condition. The fault tolerance of the on-board processing system may be reconfigured based in part on the measured environmental condition.

  1. Cross-cultural adaptation and validation of the Danish version of the Short Musculoskeletal Function Assessment questionnaire (SMFA).

    Science.gov (United States)

    Lindahl, Marianne; Andersen, Signe; Joergensen, Annette; Frandsen, Christian; Jensen, Liselotte; Benedikz, Eirikur

    2018-01-01

    The aim of this study was to translate and culturally adapt the Short Musculoskeletal Function Assessment (SMFA) into Danish (SMFA-DK) and assess its psychometric properties. The SMFA was translated and cross-culturally adapted according to a standardized procedure, with minor changes in the wording of three items to adapt to Danish conditions. Acute patients (n = 201) and rehabilitation patients (n = 231) with musculoskeletal problems, aged 18-87 years, were included. The following analyses were made to evaluate the psychometric quality of SMFA-DK: reliability with Cronbach's alpha, content validity as coding according to the International Classification of Functioning, Disability and Health (ICF), floor/ceiling effects, construct validity as factor analysis, correlations between SMFA-DK and the Short Form 36 (SF-36), and the known-groups method. Responsiveness and effect size were also calculated. Cronbach's alpha values were between 0.79 and 0.94. SMFA-DK captured all components of the ICF, and there were no floor/ceiling effects. Factor analysis demonstrated four subscales. SMFA-DK correlated well with the SF-36 subscales for the rehabilitation patients and less strongly for the newly injured patients. Effect sizes were excellent and better for SMFA-DK than for SF-36. The study indicates that SMFA-DK can be a valid and responsive outcome measure in rehabilitation settings.

  2. Adaptive Statistical Iterative Reconstruction-V Versus Adaptive Statistical Iterative Reconstruction: Impact on Dose Reduction and Image Quality in Body Computed Tomography.

    Science.gov (United States)

    Gatti, Marco; Marchisio, Filippo; Fronda, Marco; Rampado, Osvaldo; Faletti, Riccardo; Bergamasco, Laura; Ropolo, Roberto; Fonio, Paolo

    The aim of this study was to evaluate the impact on dose reduction and image quality of the new iterative reconstruction technique adaptive statistical iterative reconstruction-V (ASIR-V). Fifty consecutive oncologic patients acted as case controls, undergoing computed tomography scans during their follow-up with both ASIR and ASIR-V. Each study was analyzed in a double-blinded fashion by 2 radiologists. Both quantitative and qualitative analyses of image quality were conducted. Computed tomography scanner radiation output was 38% (29%-45%) lower (P < …) for the ASIR-V examinations than for the ASIR ones. The quantitative image noise was significantly lower (P < …) with ASIR-V. Adaptive statistical iterative reconstruction-V had a higher performance for the subjective image noise (P = 0.01 for 5 mm and P = 0.009 for 1.25 mm), the other parameters (image sharpness, diagnostic acceptability, and overall image quality) being similar (P > 0.05). Adaptive statistical iterative reconstruction-V is a new iterative reconstruction technique that has the potential to provide image quality equal to or greater than ASIR, with a dose reduction of around 40%.

  3. Administration of a dipeptidyl peptidase IV inhibitor enhances the intestinal adaptation in a mouse model of short bowel syndrome

    DEFF Research Database (Denmark)

    Okawada, Manabu; Holst, Jens Juul; Teitelbaum, Daniel H

    2011-01-01

    Glucagon-like peptide-2 induces small intestine mucosal epithelial cell proliferation and may have benefit for patients who suffer from short bowel syndrome. However, glucagon-like peptide-2 is inactivated rapidly in vivo by dipeptidyl peptidase IV. Therefore, we hypothesized that selectively inhibiting dipeptidyl peptidase IV would prolong the circulating life of glucagon-like peptide-2 and lead to increased intestinal adaptation after development of short bowel syndrome.

  4. PEAC: A Power-Efficient Adaptive Computing Technology for Enabling Swarm of Small Spacecraft and Deployable Mini-Payloads

    Data.gov (United States)

    National Aeronautics and Space Administration — This task is to develop and demonstrate a path-to-flight and power-adaptive avionics technology PEAC (Power Efficient Adaptive Computing). PEAC will enable emerging...

  5. Computational adaptive optics for broadband optical interferometric tomography of biological tissue.

    Science.gov (United States)

    Adie, Steven G; Graf, Benedikt W; Ahmad, Adeel; Carney, P Scott; Boppart, Stephen A

    2012-05-08

    Aberrations in optical microscopy reduce image resolution and contrast, and can limit imaging depth when focusing into biological samples. Static correction of aberrations may be achieved through appropriate lens design, but this approach does not offer the flexibility of simultaneously correcting aberrations for all imaging depths, nor the adaptability to correct for sample-specific aberrations for high-quality tomographic optical imaging. Incorporation of adaptive optics (AO) methods has demonstrated considerable improvement in optical image contrast and resolution in noninterferometric microscopy techniques, as well as in optical coherence tomography. Here we present a method to correct aberrations in a tomogram rather than the beam of a broadband optical interferometry system. Based on Fourier optics principles, we correct aberrations of a virtual pupil using Zernike polynomials. When used in conjunction with the computed imaging method interferometric synthetic aperture microscopy, this computational AO enables object reconstruction (within the single scattering limit) with ideal focal-plane resolution at all depths. Tomographic reconstructions of tissue phantoms containing subresolution titanium-dioxide particles and of ex vivo rat lung tissue demonstrate aberration correction in datasets acquired with a highly astigmatic illumination beam. These results also demonstrate that imaging with an aberrated astigmatic beam provides the advantage of a more uniform depth-dependent signal compared to imaging with a standard Gaussian beam. With further work, computational AO could enable the replacement of complicated and expensive optical hardware components with algorithms implemented on a standard desktop computer, making high-resolution 3D interferometric tomography accessible to a wider group of users and nonspecialists.

  6. Adaptive Management of Computing and Network Resources for Spacecraft Systems

    Science.gov (United States)

    Pfarr, Barbara; Welch, Lonnie R.; Detter, Ryan; Tjaden, Brett; Huh, Eui-Nam; Szczur, Martha R. (Technical Monitor)

    2000-01-01

    It is likely that NASA's future spacecraft systems will consist of distributed processes which will handle dynamically varying workloads in response to perceived scientific events, the spacecraft environment, spacecraft anomalies and user commands. Since all situations and possible uses of sensors cannot be anticipated during pre-deployment phases, an approach for dynamically adapting the allocation of distributed computational and communication resources is needed. To address this, we are evolving the DeSiDeRaTa adaptive resource management approach to enable reconfigurable ground and space information systems. The DeSiDeRaTa approach embodies a set of middleware mechanisms for adapting resource allocations, and a framework for reasoning about the real-time performance of distributed application systems. The framework and middleware will be extended to accommodate (1) the dynamic aspects of intra-constellation network topologies, and (2) the complete real-time path from the instrument to the user. We are developing a ground-based testbed that will enable NASA to perform early evaluation of adaptive resource management techniques without the expense of first deploying them in space. The benefits of the proposed effort are numerous, including the ability to use sensors in new ways not anticipated at design time; the production of information technology that ties the sensor web together; the accommodation of greater numbers of missions with fewer resources; and the opportunity to leverage the DeSiDeRaTa project's expertise, infrastructure and models for adaptive resource management for distributed real-time systems.

  7. 3D-SoftChip: A Novel Architecture for Next-Generation Adaptive Computing Systems

    Directory of Open Access Journals (Sweden)

    Lee Mike Myung-Ok

    2006-01-01

    Full Text Available This paper introduces a novel architecture for next-generation adaptive computing systems, which we term 3D-SoftChip. The 3D-SoftChip is a 3-dimensional (3D) vertically integrated adaptive computing system combining state-of-the-art processing and 3D interconnection technology. It comprises the vertical integration of two chips (a configurable array processor and an intelligent configurable switch) through an indium bump interconnection array (IBIA). The configurable array processor (CAP) is an array of heterogeneous processing elements (PEs), while the intelligent configurable switch (ICS) comprises a switch block, a 32-bit dedicated RISC processor for control, on-chip program/data memory, a data frame buffer, and a direct memory access (DMA) controller. This paper introduces the novel 3D-SoftChip architecture for real-time communication and multimedia signal processing as a next-generation computing system. The paper further describes the advanced HW/SW codesign and verification methodology, including high-level system modeling of the 3D-SoftChip using SystemC, used to determine the optimum hardware specification in the early design stage.

  8. A systems biology analysis of long and short-term memories of osmotic stress adaptation in fungi

    Directory of Open Access Journals (Sweden)

    You Tao

    2012-05-01

    Full Text Available Abstract Background Saccharomyces cerevisiae senses hyperosmotic conditions via the HOG signaling network, which activates the stress-activated protein kinase Hog1 and modulates metabolic fluxes and gene expression to generate appropriate adaptive responses. The integral control mechanism by which Hog1 modulates glycerol production remains uncharacterized. An additional Hog1-independent mechanism retains intracellular glycerol for adaptation. Candida albicans also adapts to hyperosmolarity via a HOG signaling network. However, it remains unknown whether Hog1 exerts integral or proportional control over glycerol production in C. albicans. Results We combined modeling and experimental approaches to study osmotic stress responses in S. cerevisiae and C. albicans. We propose a simple ordinary differential equation (ODE) model that highlights the integral control that Hog1 exerts over glycerol biosynthesis in these species. If integral control arises from a separation of time scales (i.e., rapid HOG activation of glycerol production capacity, which decays slowly under hyperosmotic conditions), then the model predicts that glycerol production rates elevate upon adaptation to a first stress, and this makes the cell adapt faster to a second hyperosmotic stress. It appears as if the cell is able to remember a stress history that is longer than the timescale of signal transduction. This is termed the long-term stress memory. Our experimental data verify this. Like S. cerevisiae, C. albicans minimizes glycerol efflux during adaptation to hyperosmolarity. Also, transient activation of intermediate kinases in the HOG pathway results in a short-term memory in the signaling pathway. This determines the amplitude of Hog1 phosphorylation under a periodic sequence of stress and non-stressed intervals. Our model suggests that the long-term memory also affects the way a cell responds to periodic stress conditions.
Hence, during osmohomeostasis, short-term memory is
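
    The long-term memory mechanism described above (rapid stress-driven build-up of glycerol production capacity that decays only slowly) can be illustrated with a toy integral-control simulation; the equations, rate constants, and stress protocol below are invented for illustration and are not the paper's ODE model.

    ```python
    import math

    def stress_episode(cap, stress_len=100.0, dt=0.01, k_on=0.1, k_off=0.005, k_exp=0.5):
        """One unit hyperosmotic step. Glycerol (gly) is the integrator's output;
        production capacity (cap) rises with Hog1 activity and decays slowly.
        Returns (time until 90% adapted, capacity at the end of the episode)."""
        gly, t, t_adapt = 0.0, 0.0, None
        while t < stress_len:
            e = 1.0 - gly                        # osmotic imbalance (external = 1)
            hog = max(e, 0.0)                    # Hog1 activity follows the error
            cap += dt * (k_on * hog - k_off * cap)
            gly += dt * (cap * hog + k_exp * min(e, 0.0))   # produce, or export excess
            if t_adapt is None and e < 0.1:
                t_adapt = t
            t += dt
        return t_adapt, cap

    def rest_interval(cap, gap=100.0, k_off=0.005):
        """Stress-free gap: glycerol is exported, but capacity decays only slowly."""
        return cap * math.exp(-k_off * gap)

    t1, cap = stress_episode(0.0)                 # naive cell
    t2, _ = stress_episode(rest_interval(cap))    # same stress after a 100-unit gap
    print(round(t1, 1), round(t2, 1))             # second adaptation is faster
    ```

    Because the capacity variable outlives both the stress and the recovery interval, the second episode starts with a nonzero production rate and reaches osmotic balance sooner, which is the qualitative signature of the long-term stress memory.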

  9. Configurable multiplier modules for an adaptive computing system

    Directory of Open Access Journals (Sweden)

    O. A. Pfänder

    2006-01-01

    Full Text Available The importance of reconfigurable hardware is increasing steadily. For example, the primary approach of using adaptive systems based on programmable gate arrays and configurable routing resources has gone mainstream and high-performance programmable logic devices are rivaling traditional application-specific hardwired integrated circuits. Also, the idea of moving from the 2-D domain into a 3-D design which stacks several active layers above each other is gaining momentum in research and industry, to cope with the demand for smaller devices with a higher scale of integration. However, optimized arithmetic blocks in coarse-grain reconfigurable arrays as well as field-programmable architectures still play an important role. In countless digital systems and signal processing applications, multiplication is one of the critical challenges, where in many cases a trade-off between area usage and data throughput has to be made. But the a priori choice of word-length and number representation can also be replaced by a dynamic choice at run-time, in order to improve flexibility, area efficiency and the level of parallelism in computation. In this contribution, we look at an adaptive computing system called 3-D-SoftChip to point out what parameters are crucial to implement flexible multiplier blocks into optimized elements for accelerated processing. The 3-D-SoftChip architecture uses a novel approach to 3-dimensional integration based on flip-chip bonding with indium bumps. The modular construction, the introduction of interfaces to realize the exchange of intermediate data, and the reconfigurable sign handling approach will be explained, as well as a beneficial way to handle and distribute the numerous required control signals.

  10. Discriminating Children with Autism from Children with Learning Difficulties with an Adaptation of the Short Sensory Profile

    Science.gov (United States)

    O'Brien, Justin; Tsermentseli, Stella; Cummins, Omar; Happe, Francesca; Heaton, Pamela; Spencer, Janine

    2009-01-01

    In this article, we examine the extent to which children with autism and children with learning difficulties can be discriminated on the basis of their responses to different patterns of sensory stimuli. Using an adapted version of the Short Sensory Profile (SSP), sensory processing was compared in 34 children with autism to 33 children with typical…

  11. Supporting Student Learning in Computer Science Education via the Adaptive Learning Environment ALMA

    Directory of Open Access Journals (Sweden)

    Alexandra Gasparinatou

    2015-10-01

    Full Text Available This study presents the ALMA environment (Adaptive Learning Models from texts and Activities). ALMA supports the processes of learning and assessment via: (1) texts differing in local and global cohesion for students with low, medium, and high background knowledge; (2) activities corresponding to different levels of comprehension which prompt the student to practically implement different text-reading strategies, with the recommended activity sequence adapted to the student’s learning style; (3) an overall framework for informing, guiding, and supporting students in performing the activities; and (4) individualized support and guidance according to student-specific characteristics. ALMA also supports students in distance learning or in blended learning, in which students participate in face-to-face learning supported by computer technology. The adaptive techniques provided via ALMA are: (a) adaptive presentation and (b) adaptive navigation. Digital learning material, in accordance with the text comprehension model described by Kintsch, was introduced into the ALMA environment. This material can be exploited in either distance or blended learning.

  12. Web-based computer adaptive assessment of individual perceptions of job satisfaction for hospital workplace employees.

    Science.gov (United States)

    Chien, Tsair-Wei; Lai, Wen-Pin; Lu, Chih-Wei; Wang, Weng-Chung; Chen, Shih-Chung; Wang, Hsien-Yi; Su, Shih-Bin

    2011-04-17

To develop a web-based computer adaptive testing (CAT) application for efficiently collecting data regarding workers' perceptions of job satisfaction, we examined whether a 37-item Job Content Questionnaire (JCQ-37) could evaluate the job satisfaction of individual employees as a single construct. The JCQ-37 makes data collection via CAT on the internet easy, viable and fast. A Rasch rating scale model was applied to analyze data from 300 randomly selected hospital employees who participated in job-satisfaction surveys in 2008 and 2009 via non-adaptive and computer-adaptive testing, respectively. Of the 37 items on the questionnaire, 24 items fit the model fairly well. Person-separation reliability for the 2008 surveys was 0.88. Measures from both years and item-8 job satisfaction for groups were successfully evaluated through item-by-item analyses using t-tests. Workers aged 26-35 felt that job satisfaction was significantly worse in 2009 than in 2008. The web-based CAT developed in the present paper was shown to be more efficient than traditional computer-based or pen-and-paper assessments at collecting data regarding workers' perceptions of job content.
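The CAT logic described above can be sketched in a few lines: a Rasch model gives the response probability, and the adaptive test picks whichever unasked item carries maximum Fisher information at the current ability estimate. This is a generic illustration (the study used the polytomous rating scale variant, and the item difficulties below are hypothetical):

```python
import math

def rasch_p(theta, b):
    """Dichotomous Rasch model: probability that a person with trait
    level theta gives a positive response to an item of difficulty b."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def pick_next_item(theta, difficulties, asked):
    """CAT item selection: among unasked items, choose the one with
    maximum Fisher information I(theta) = p * (1 - p)."""
    def info(b):
        p = rasch_p(theta, b)
        return p * (1.0 - p)
    return max((i for i in range(len(difficulties)) if i not in asked),
               key=lambda i: info(difficulties[i]))
```

Because I(theta) peaks where p = 0.5, the selected item is the one whose difficulty is closest to the current ability estimate, which is why a CAT needs far fewer items than a fixed-form test.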

  13. Web-based computer adaptive assessment of individual perceptions of job satisfaction for hospital workplace employees

    Directory of Open Access Journals (Sweden)

    Chen Shih-Chung

    2011-04-01

Full Text Available Abstract Background To develop a web-based computer adaptive testing (CAT) application for efficiently collecting data regarding workers' perceptions of job satisfaction, we examined whether a 37-item Job Content Questionnaire (JCQ-37) could evaluate the job satisfaction of individual employees as a single construct. Methods The JCQ-37 makes data collection via CAT on the internet easy, viable and fast. A Rasch rating scale model was applied to analyze data from 300 randomly selected hospital employees who participated in job-satisfaction surveys in 2008 and 2009 via non-adaptive and computer-adaptive testing, respectively. Results Of the 37 items on the questionnaire, 24 items fit the model fairly well. Person-separation reliability for the 2008 surveys was 0.88. Measures from both years and item-8 job satisfaction for groups were successfully evaluated through item-by-item analyses using t-tests. Workers aged 26-35 felt that job satisfaction was significantly worse in 2009 than in 2008. Conclusions The web-based CAT developed in the present paper was shown to be more efficient than traditional computer-based or pen-and-paper assessments at collecting data regarding workers' perceptions of job content.

  14. Integrable discretizations of the short pulse equation

    International Nuclear Information System (INIS)

    Feng Baofeng; Maruno, Ken-ichi; Ohta, Yasuhiro

    2010-01-01

    In this paper, we propose integrable semi-discrete and full-discrete analogues of the short pulse (SP) equation. The key construction is the bilinear form and determinant structure of solutions of the SP equation. We also give the determinant formulas of N-soliton solutions of the semi-discrete and full-discrete analogues of the SP equations, from which the multi-loop and multi-breather solutions can be generated. In the continuous limit, the full-discrete SP equation converges to the semi-discrete SP equation, and then to the continuous SP equation. Based on the semi-discrete SP equation, an integrable numerical scheme, i.e. a self-adaptive moving mesh scheme, is proposed and used for the numerical computation of the short pulse equation.
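For reference, the continuous short pulse equation that these semi-discrete and full-discrete analogues target is commonly written in normalized form as:

```latex
% Short pulse (SP) equation, normalized form:
u_{xt} = u + \tfrac{1}{6}\left(u^{3}\right)_{xx}
```

where $u(x,t)$ models the magnitude of an ultra-short optical pulse; the bilinear form and determinant solution structure mentioned in the abstract are built on this equation.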

  15. Robust and Adaptive Online Time Series Prediction with Long Short-Term Memory

    Directory of Open Access Journals (Sweden)

    Haimin Yang

    2017-01-01

Full Text Available Online time series prediction is the mainstream method in a wide range of fields, ranging from speech analysis and noise cancelation to stock market analysis. However, real-world data often contain many outliers as the length of a time series grows. These outliers can mislead the learned model if treated as normal points in the process of prediction. To address this issue, in this paper, we propose a robust and adaptive online gradient learning method, RoAdam (Robust Adam), for long short-term memory (LSTM) to predict time series with outliers. This method tunes the learning rate of the stochastic gradient algorithm adaptively in the process of prediction, which reduces the adverse effect of outliers. It tracks the relative prediction error of the loss function with a weighted average by modifying Adam, a popular stochastic gradient algorithm for training deep neural networks. In our algorithm, a large value of the relative prediction error corresponds to a small learning rate, and vice versa. Experiments on both synthetic data and real time series show that our method achieves better performance compared to existing methods based on LSTM.

  16. Robust and Adaptive Online Time Series Prediction with Long Short-Term Memory.

    Science.gov (United States)

    Yang, Haimin; Pan, Zhisong; Tao, Qing

    2017-01-01

Online time series prediction is the mainstream method in a wide range of fields, ranging from speech analysis and noise cancelation to stock market analysis. However, real-world data often contain many outliers as the length of a time series grows. These outliers can mislead the learned model if treated as normal points in the process of prediction. To address this issue, in this paper, we propose a robust and adaptive online gradient learning method, RoAdam (Robust Adam), for long short-term memory (LSTM) to predict time series with outliers. This method tunes the learning rate of the stochastic gradient algorithm adaptively in the process of prediction, which reduces the adverse effect of outliers. It tracks the relative prediction error of the loss function with a weighted average by modifying Adam, a popular stochastic gradient algorithm for training deep neural networks. In our algorithm, a large value of the relative prediction error corresponds to a small learning rate, and vice versa. Experiments on both synthetic data and real time series show that our method achieves better performance compared to existing methods based on LSTM.
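The adaptive learning-rate idea can be sketched as a scalar Adam-style update whose step is divided by a running estimate of the relative prediction error, so an outlier loss spike temporarily shrinks the step. The constants, clipping bounds, and tracking rule below are illustrative assumptions, not the paper's exact specification:

```python
import math

class RobustAdamSketch:
    """Sketch of the RoAdam idea: Adam whose effective learning rate is
    divided by a smoothed relative-prediction-error term d, so outlier
    losses (large relative error) shrink the step."""
    def __init__(self, lr=0.01, b1=0.9, b2=0.999, b3=0.9, eps=1e-8):
        self.lr, self.b1, self.b2, self.b3, self.eps = lr, b1, b2, b3, eps
        self.m = self.v = 0.0
        self.d = 1.0            # smoothed relative prediction error
        self.prev_loss = None
        self.t = 0

    def step(self, param, grad, loss):
        self.t += 1
        if self.prev_loss:      # track relative error, clipped (assumed bounds)
            r = min(max(abs(loss) / self.prev_loss, 0.1), 10.0)
            self.d = self.b3 * self.d + (1 - self.b3) * r
        self.prev_loss = abs(loss)
        self.m = self.b1 * self.m + (1 - self.b1) * grad
        self.v = self.b2 * self.v + (1 - self.b2) * grad * grad
        m_hat = self.m / (1 - self.b1 ** self.t)
        v_hat = self.v / (1 - self.b2 ** self.t)
        # dividing by d implements "large relative error -> small learning rate"
        return param - (self.lr / self.d) * m_hat / (math.sqrt(v_hat) + self.eps)
```

After an outlier loss, d rises above 1 and the next updates are damped; as the relative error returns to normal, d decays back toward 1 and the step size recovers.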

  17. Statistical Multiplexing of Computations in C-RAN with Tradeoffs in Latency and Energy

    DEFF Research Database (Denmark)

    Kalør, Anders Ellersgaard; Agurto Agurto, Mauricio Ignacio; Pratas, Nuno

    2017-01-01

frame duration, then this may result in additional access latency and limit the energy savings. In this paper we investigate the tradeoff by considering two extreme time-scales for the resource multiplexing: (i) long-term, where the computational resources are adapted over periods much larger than...... the access frame durations; (ii) short-term, where the adaptation is below the access frame duration. We develop a general C-RAN queuing model that models the access latency and show, for Poisson arrivals, that long-term multiplexing achieves savings comparable to short-term multiplexing, while offering low...

  18. From “Crash!” to Crash: Adapting the Adaptation

    Directory of Open Access Journals (Sweden)

    Ljubica Matek

    2017-12-01

Full Text Available The paper focuses on J.G. Ballard’s various adaptations of his own material related to the issue of the sexual and sensual nature of an automobile crash, and suggests that adaptation is one of the key methods in art and literature which can be used as a means of contemplating and developing various aesthetic and political ideas. Ballard’s short story “Crash!” was first published in the ICA’s (Institute of Contemporary Arts) Eventsheet in February 1969, and later became a chapter of his experimental novel The Atrocity Exhibition (1970). At the same time, Ballard adapted the idea into the “Crashed Cars” exhibition (1970) in London. The short story was then adapted into a short film, Crash!, directed by Harley Cokeliss (1971) and starring Ballard himself, to be finally adapted into the novel Crash (1973). Ballard’s adaptation of his initial ideas across literary forms and media testifies to the importance of adaptation as a process and method of creating art. Thus, rather than suggesting that adaptations merely “breathe life” into the written word, the paper points to the conclusion that form and content are mutually influential and that, in this case, the novel itself is an adaptation, rather than a hypotext (which it becomes in 1996 for David Cronenberg as he adapts it to film). The complexity of the relationship between the source text and its many adaptations has already contributed to the deconstruction, in Derrida’s terms, of the hierarchy (opposition) between the original and the copy. Rather, Ballard’s crossmedial and transmedial adaptations of his own ideas show how, as Ray would suggest, an adaptation cites the source and grafts it into a new context, giving it a new function, both aesthetic and political.

  19. Intricacies of Feedback in Computer-based Prism Adaptation Therapy

    DEFF Research Database (Denmark)

    Wilms, Inge Linda; Rytter, Hana Malá

Prism Adaptation Therapy (PAT) is an intervention method for the treatment of attentional disorders, such as neglect e.g. 1,2. The method involves repeated pointing at specified targets, with or without prism glasses, using a specifically designed wooden box. The aim of this study was to ascertain...... whether the PAT method can be executed with similar effect using a computer with a touch screen. 62 healthy subjects were subjected to two experimental conditions: 1) pointing at targets using the original box, 2) pointing at targets on a computer-attached touch screen. In both conditions......, the subjects performed a pre-test consisting of 30 targets without feedback, then an exposure-test of 90 targets with prism glasses and feedback, and finally a post-test of 60 targets, with no glasses and no feedback. Two experiments were carried out, 1) the feedback was provided by showing a cross...

  20. Are We Measuring Teachers’ Attitudes towards Computers in Detail?: Adaptation of a Questionnaire into Turkish Culture

    Directory of Open Access Journals (Sweden)

    Nilgün Günbaş

    2017-04-01

Full Text Available Teachers’ perceptions of computers play an important role in integrating computers into education. The related literature includes studies developing or adapting survey instruments for Turkish culture that measure teachers’ attitudes toward computers. These instruments have three to four factors (e.g., computer importance, computer enjoyment, computer confidence) and 18 to 26 items under these factors. The purpose of the present study is to adapt a more detailed and stronger survey questionnaire measuring more dimensions related to teachers’ attitudes. The source instrument was developed by Christensen and Knezek (2009) and is called Teachers’ Attitudes toward Computers (TAC). It has nine factors with 51 items. Before testing the instrument, the interaction (e-mail) factor was taken out because of cultural differences. The reliability and validity testing of the translated instrument was completed with 273 teacher candidates in a Faculty of Education in Turkey. The results showed that the translated instrument (Cronbach’s alpha: .94) included eight factors and consisted of 42 items under these factors, which was consistent with the original instrument. These factors were: Interest (α: .83), Comfort (α: .90), Accommodation (α: .87), Concern (α: .79), Utility (α: .90), Perception (α: .89), Absorption (α: .84), and Significance (α: .83). Additionally, the confirmatory factor analysis result for the model with eight factors was: RMSEA=0.050, χ2/df=1.69, RMR=0.075, SRMR=0.057, GFI=0.81, AGFI=0.78, NFI=0.94, NNFI=0.97, CFI=0.97, IFI=0.97. Accordingly, as a reliable, valid and stronger instrument, the adapted survey instrument can be suggested for use in Turkish academic studies.

  1. Understanding Coral's Short-term Adaptive Ability to Changing Environment

    Science.gov (United States)

    Tisthammer, K.; Richmond, R. H.

    2016-02-01

Corals in Maunalua Bay, Hawaii are under chronic pressure from sedimentation and terrestrial runoff containing multiple pollutants, a result of the large-scale urbanization that has taken place in the last 100 years. However, some individual corals thrive despite prolonged exposure to these environmental stressors, which suggests that these individuals may have adapted to withstand such stressors. A recent survey showed that the lobe coral Porites lobata from the 'high-stress' nearshore site had an elevated level of stress-induced proteins compared to those from the 'low-stress,' less polluted offshore site. To understand the genetic basis for the observed differential stress responses between the nearshore and offshore P. lobata populations, an analysis of the lineage-scale population genetic structure, as well as a reciprocal transplant experiment, were conducted. The genetic analysis revealed a clear genetic differentiation between P. lobata from the nearshore site and the offshore site. Following the 30-day reciprocal transplant experiment, protein expression profiles and other stress-related physiological characteristics were compared between the two populations. The experimental results suggest that the nearshore genotype copes better with sedimentation and pollutants than the offshore genotype. This indicates that the observed genetic differentiation is due to selection for tolerance to these environmental stressors. Understanding the little-known, lineage-scale genetic variation in corals offers critical insight into their short-term adaptive ability, which is indispensable for protecting corals from impending environmental and climate change. The results of this study also offer a valuable tool for resource managers to make effective decisions on coral reef conservation, such as designing marine protected areas that incorporate and maintain such genetic diversity, and establishing acceptable pollution run-off levels.

  2. Adaptive versus Non-Adaptive Security of Multi-Party Protocols

    DEFF Research Database (Denmark)

    Canetti, Ran; Damgård, Ivan Bjerre; Dziembowski, Stefan

    2004-01-01

    Security analysis of multi-party cryptographic protocols distinguishes between two types of adversarial settings: In the non-adaptive setting the set of corrupted parties is chosen in advance, before the interaction begins. In the adaptive setting the adversary chooses who to corrupt during...... the course of the computation. We study the relations between adaptive security (i.e., security in the adaptive setting) and nonadaptive security, according to two definitions and in several models of computation....

  3. A short note on the use of the red-black tree in Cartesian adaptive mesh refinement algorithms

    Science.gov (United States)

    Hasbestan, Jaber J.; Senocak, Inanc

    2017-12-01

    Mesh adaptivity is an indispensable capability to tackle multiphysics problems with large disparity in time and length scales. With the availability of powerful supercomputers, there is a pressing need to extend time-proven computational techniques to extreme-scale problems. Cartesian adaptive mesh refinement (AMR) is one such method that enables simulation of multiscale, multiphysics problems. AMR is based on construction of octrees. Originally, an explicit tree data structure was used to generate and manipulate an adaptive Cartesian mesh. At least eight pointers are required in an explicit approach to construct an octree. Parent-child relationships are then used to traverse the tree. An explicit octree, however, is expensive in terms of memory usage and the time it takes to traverse the tree to access a specific node. For these reasons, implicit pointerless methods have been pioneered within the computer graphics community, motivated by applications requiring interactivity and realistic three dimensional visualization. Lewiner et al. [1] provides a concise review of pointerless approaches to generate an octree. Use of a hash table and Z-order curve are two key concepts in pointerless methods that we briefly discuss next.
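The Z-order curve mentioned above is what lets a pointerless octree live in a hash table: each node is addressed by interleaving the bits of its integer coordinates, so parent and child keys can be computed arithmetically instead of being chased through pointers. A minimal sketch of the key computation:

```python
def morton3(x, y, z, bits=10):
    """Interleave the low `bits` bits of integer coordinates (x, y, z)
    into a single Z-order (Morton) key. The eight sibling octants of a
    parent differ only in the lowest 3 bits, so the parent key is key >> 3."""
    key = 0
    for i in range(bits):
        key |= ((x >> i) & 1) << (3 * i)        # x bits land at positions 0, 3, 6, ...
        key |= ((y >> i) & 1) << (3 * i + 1)    # y bits at positions 1, 4, 7, ...
        key |= ((z >> i) & 1) << (3 * i + 2)    # z bits at positions 2, 5, 8, ...
    return key
```

A node can then be stored as `nodes[key] = data` in a hash table, and tree traversal reduces to integer arithmetic on keys (child c of key k is `(k << 3) | c` at the next refinement level).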

  4. Quinoa - Adaptive Computational Fluid Dynamics, 0.2

    Energy Technology Data Exchange (ETDEWEB)

    2017-09-22

    Quinoa is a set of computational tools that enables research and numerical analysis in fluid dynamics. At this time it remains a test-bed to experiment with various algorithms using fully asynchronous runtime systems. Currently, Quinoa consists of the following tools: (1) Walker, a numerical integrator for systems of stochastic differential equations in time. It is a mathematical tool to analyze and design the behavior of stochastic differential equations. It allows the estimation of arbitrary coupled statistics and probability density functions and is currently used for the design of statistical moment approximations for multiple mixing materials in variable-density turbulence. (2) Inciter, an overdecomposition-aware finite element field solver for partial differential equations using 3D unstructured grids. Inciter is used to research asynchronous mesh-based algorithms and to experiment with coupling asynchronous to bulk-synchronous parallel code. Two planned new features of Inciter, compared to the previous release (LA-CC-16-015), to be implemented in 2017, are (a) a simple Navier-Stokes solver for ideal single-material compressible gases, and (b) solution-adaptive mesh refinement (AMR), which enables dynamically concentrating compute resources to regions with interesting physics. Using the NS-AMR problem we plan to explore how to scale such high-load-imbalance simulations, representative of large production multiphysics codes, to very large problems on very large computers using an asynchronous runtime system. (3) RNGTest, a test harness to subject random number generators to stringent statistical tests enabling quantitative ranking with respect to their quality and computational cost. (4) UnitTest, a unit test harness, running hundreds of tests per second, capable of testing serial, synchronous, and asynchronous functions. (5) MeshConv, a mesh file converter that can be used to convert 3D tetrahedron meshes from and to either of the following formats: Gmsh

  5. Computing Adaptive Feature Weights with PSO to Improve Android Malware Detection

    Directory of Open Access Journals (Sweden)

    Yanping Xu

    2017-01-01

Full Text Available Android malware detection is a complex and crucial issue. In this paper, we propose a malware detection model using a support vector machine (SVM) method based on feature weights that are computed by information gain (IG) and particle swarm optimization (PSO) algorithms. The IG weights are evaluated based on the relevance between features and class labels, and the PSO weights are adaptively calculated to result in the best fitness (the performance of the SVM classification model). Moreover, to overcome the defects of basic PSO, we propose a new adaptive inertia weight method called fitness-based and chaotic adaptive inertia weight-PSO (FCAIW-PSO) that improves on basic PSO and is based on the fitness and a chaotic term. The goal is to assign suitable weights to the features to ensure the best Android malware detection performance. The results of experiments indicate that the IG weights and PSO weights both improve the performance of SVM and that the performance of the PSO weights is better than that of the IG weights.
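A generic PSO loop for tuning a weight vector, of the kind the abstract builds on, can be sketched as below. This is plain PSO with a fixed inertia weight; the paper's FCAIW-PSO additionally makes the inertia weight fitness-based and chaotic, and its real fitness function is the SVM classification performance rather than the toy objective used in the test:

```python
import random

def pso_weights(fitness, dim, n_particles=30, iters=100, seed=1):
    """Minimal particle swarm optimizer: minimize `fitness` over a
    dim-dimensional weight vector. Standard inertia/cognitive/social
    constants (0.7, 1.5, 1.5) are conventional choices, not the paper's."""
    rng = random.Random(seed)
    w, c1, c2 = 0.7, 1.5, 1.5
    xs = [[rng.uniform(0.0, 1.0) for _ in range(dim)] for _ in range(n_particles)]
    vs = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in xs]                   # personal best positions
    pval = [fitness(x) for x in xs]
    g = min(range(n_particles), key=lambda i: pval[i])
    gbest, gval = pbest[g][:], pval[g]           # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vs[i][d] = (w * vs[i][d]
                            + c1 * r1 * (pbest[i][d] - xs[i][d])
                            + c2 * r2 * (gbest[d] - xs[i][d]))
                xs[i][d] += vs[i][d]
            f = fitness(xs[i])
            if f < pval[i]:
                pval[i], pbest[i] = f, xs[i][:]
                if f < gval:
                    gval, gbest = f, xs[i][:]
    return gbest, gval
```

In the feature-weighting setting, each position vector is a candidate set of per-feature weights and the fitness call would train and score a classifier on the reweighted features.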

  6. Study maps as a tool for the adaptive tests construction

    Directory of Open Access Journals (Sweden)

    Dita Dlabolová

    2013-01-01

Full Text Available Measurement of students’ knowledge is an essential part of the educational process. Teachers at universities often use computer-based tests to examine a large number of students in a short time. The question is what kind of information these tests provide, and whether it is possible to classify students on this basis. Practice shows that scalar test results in the form of simple numbers cannot be plainly interpreted as the level of knowledge; moreover, it is not easy to build tests that capture the necessary information. In the first part of the article we present the results of a pedagogical experiment focused on the difference between the information obtained through a computer-based test and a teacher’s interview with the same students. A possible starting point for improving the information from computer-based tests into non-scalar form is the construction of an adaptive test that adapts its items to identify knowledge in a way similar to a conversation with a teacher. As a tool for the design of such adaptive tests we use so-called study maps, which are described in the second part of the article.

  7. Quantum computer based on activated dielectric nanoparticles selectively interacting with short optical pulses

    International Nuclear Information System (INIS)

    Gadomskii, Oleg N; Kharitonov, Yu Ya

    2004-01-01

The operation principle of a quantum computer is proposed based on a system of dielectric nanoparticles activated with two-level atoms (qubits), in which electric dipole transitions are excited by short, intense optical pulses. It is proved that the logical operation (logical operator) CNOT (controlled NOT) is performed by means of time-dependent transfer of quantum information over 'long' (of the order of 10^4 nm) distances between spherical nanoparticles owing to the delayed interaction between them in the optical radiation field. It is shown that the one-qubit and two-qubit logical operators required for quantum calculations can be realised by selectively exciting dielectric particles with short optical pulses.

  8. The self-adaptation to dynamic failures for efficient virtual organization formations in grid computing context

    International Nuclear Information System (INIS)

    Han Liangxiu

    2009-01-01

Grid computing aims to enable 'resource sharing and coordinated problem solving in dynamic, multi-institutional virtual organizations (VOs)'. However, owing to the heterogeneous and dynamic nature of the resources, dynamic failures occur more often in a distributed grid environment than on traditional computation platforms, causing VO formations to fail. In this paper, we develop a novel self-adaptive mechanism for handling dynamic failures during VO formations. This self-adaptive scheme allows an individual member of a VO to automatically find another available or replaceable member once a failure happens, and therefore lets systems recover automatically from dynamic failures. We define the dynamic failure situations of a system using two standard indicators: mean time between failures (MTBF) and mean time to recover (MTTR). We model both MTBF and MTTR as Poisson distributions. We investigate and analyze the efficiency of the proposed self-adaptation mechanism by comparing the success probability of VO formations before and after adopting it in three different cases: (1) different failure situations; (2) different organizational structures and scales; (3) different task complexities. The experimental results show that the proposed scheme can automatically adapt to dynamic failures and effectively improve dynamic VO formation performance in the event of node failures, which provides a valuable addition to the field.
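As a back-of-the-envelope illustration of why such self-adaptation helps (a simple binomial availability model assumed here for illustration, not the paper's Poisson-process analysis): with steady-state node availability MTBF/(MTBF+MTTR), a VO that can draw replacements from spare members forms with much higher probability than one that needs every original member up.

```python
from math import comb

def availability(mtbf, mttr):
    """Steady-state probability that a single node is up."""
    return mtbf / (mtbf + mttr)

def vo_formation_prob(n, spares, a):
    """Probability that at least n of n+spares candidate nodes are up,
    i.e. a VO of size n can still form when failed members are replaced
    from `spares` stand-ins (independent nodes, availability a each)."""
    total = n + spares
    return sum(comb(total, k) * a**k * (1 - a)**(total - k)
               for k in range(n, total + 1))
```

For example, with MTBF = 90 h and MTTR = 10 h (a = 0.9), a 5-member VO with no replacements forms with probability 0.9^5 ≈ 0.59, while allowing two stand-by replacements raises this above 0.97.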

  9. Computation of the Short-Time Linear Canonical Transform with Dual Window

    Directory of Open Access Journals (Sweden)

    Lei Huang

    2017-01-01

Full Text Available The short-time linear canonical transform (STLCT), which maps a time-domain signal into the joint time-frequency domain, has recently attracted some attention in the area of signal processing. However, its applications are still limited by the fact that the selection of coefficients of the short-time linear canonical series (STLCS) is not unique, because the time and frequency elementary functions (together known as the basis functions of the STLCS) do not constitute an orthogonal basis. To solve this problem, this paper investigates a dual window solution. First, the nonorthogonality problem suffered by the original window is resolved by an orthogonality condition on a dual window. Then, based on the obtained condition, a dual window computation approach of the GT is extended to the STLCS. In addition, simulations verify the validity of the proposed condition and solutions. Furthermore, some possible applied directions are discussed.

  10. Slice image pretreatment for cone-beam computed tomography based on adaptive filter

    International Nuclear Information System (INIS)

    Huang Kuidong; Zhang Dinghua; Jin Yanfang

    2009-01-01

According to the noise properties and the serial slice image characteristics of a Cone-Beam Computed Tomography (CBCT) system, a slice image pretreatment for CBCT based on adaptive filtering is proposed. A judging criterion for the noise type is established first, and all pixels are classified into two classes: an adaptive center-weighted modified trimmed mean (ACWMTM) filter is used for pixels corrupted by Gaussian noise, and an adaptive median (AM) filter is used for pixels corrupted by impulse noise. In the ACWMTM filtering algorithm, the Gaussian noise standard deviation estimated in an offset window of the current slice image is replaced by the standard deviation estimated in the corresponding window of the adjacent slice, improving the filtering accuracy for serial images. A pretreatment experiment on CBCT slice images of a wax model of a hollow turbine blade shows that the method performs well both at eliminating noise and at preserving details. (authors)
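The AM half of such a scheme can be illustrated with the textbook adaptive median filter: start with a 3x3 window, grow it while the window median itself looks like an impulse, and replace the center pixel only when it is an extreme value. This is a generic sketch of adaptive median filtering, not the authors' exact ACWMTM/AM implementation:

```python
def adaptive_median(img, max_win=7):
    """Textbook adaptive median filter on a 2-D list of pixel values.
    Grow the window while its median equals the window min or max
    (i.e. the median itself may be an impulse); replace the center
    pixel only if it is an extreme value in the final window."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(h):
        for x in range(w):
            win = 3
            while True:
                r = win // 2
                vals = sorted(img[j][i]
                              for j in range(max(0, y - r), min(h, y + r + 1))
                              for i in range(max(0, x - r), min(w, x + r + 1)))
                zmin, zmed, zmax = vals[0], vals[len(vals) // 2], vals[-1]
                if zmin < zmed < zmax:              # median is not an impulse
                    if not (zmin < img[y][x] < zmax):
                        out[y][x] = zmed            # center is an impulse: replace
                    break
                win += 2
                if win > max_win:                   # stop growing: fall back to median
                    out[y][x] = zmed
                    break
    return out
```

Because the window only grows when the local median is itself suspect, isolated impulse pixels are removed while fine detail away from impulses is left untouched more often than with a plain fixed-window median.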

  11. Reconstruction of sparse-view X-ray computed tomography using adaptive iterative algorithms.

    Science.gov (United States)

    Liu, Li; Lin, Weikai; Jin, Mingwu

    2015-01-01

In this paper, we propose two reconstruction algorithms for sparse-view X-ray computed tomography (CT). Treating the reconstruction problems as data-fidelity-constrained total variation (TV) minimization, both algorithms adopt the alternating two-stage strategy: projection onto convex sets (POCS) for the data fidelity and non-negativity constraints, and steepest descent for TV minimization. The novelty of this work is to determine the iterative parameters automatically from the data, thus avoiding tedious manual parameter tuning. In TV minimization, the step sizes of steepest descent are adaptively adjusted according to the difference from the POCS update in either the projection domain or the image domain, while the step size of the algebraic reconstruction technique (ART) in POCS is determined based on the data noise level. In addition, projection errors are compared with the error bound to decide whether to perform ART, so as to reduce computational costs. The performance of the proposed methods is studied and evaluated using both simulated and physical phantom data. Our methods with automatic parameter tuning achieve similar, if not better, reconstruction performance compared to a representative two-stage algorithm. Copyright © 2014 Elsevier Ltd. All rights reserved.
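The TV-minimization stage of such a two-stage scheme can be sketched in 1-D: steepest descent on a smoothed total variation, with the descent step size tied to how far the preceding data-fidelity (POCS) update moved the solution, in the spirit of the adaptive step-size rule described above. The smoothing constant, the scale factor, and the 1-D setting are illustrative assumptions:

```python
import math

def tv_grad(u, eps=1e-6):
    """Gradient of the smoothed 1-D total variation
    sum_i sqrt((u[i+1] - u[i])^2 + eps)."""
    g = [0.0] * len(u)
    for i in range(len(u) - 1):
        d = u[i + 1] - u[i]
        s = d / math.sqrt(d * d + eps)
        g[i] -= s
        g[i + 1] += s
    return g

def tv_descent(u, pocs_change, n_steps=20, scale=0.05):
    """Normalized steepest-descent TV steps whose step size is a fixed
    fraction of the magnitude of the preceding POCS update (adaptive
    step-size idea; the fraction `scale` is illustrative)."""
    step = scale * pocs_change
    for _ in range(n_steps):
        g = tv_grad(u)
        norm = math.sqrt(sum(x * x for x in g)) or 1.0
        u = [ui - step * gi / norm for ui, gi in zip(u, g)]
    return u
```

Tying the TV step to the POCS change keeps the two stages balanced: when the data-fidelity projection barely moves the image, the regularization step shrinks with it instead of overwriting the data constraints.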

  12. ICAN Computer Code Adapted for Building Materials

    Science.gov (United States)

    Murthy, Pappu L. N.

    1997-01-01

The NASA Lewis Research Center has been involved in developing composite micromechanics and macromechanics theories over the last three decades. These activities have resulted in several composite mechanics theories and structural analysis codes whose applications range from material behavior design and analysis to structural component response. One of these computer codes, the Integrated Composite Analyzer (ICAN), is designed primarily to address issues related to designing polymer matrix composites and predicting their properties - including hygral, thermal, and mechanical load effects. Recently, under a cost-sharing cooperative agreement with a Fortune 500 corporation, Master Builders Inc., ICAN was adapted to analyze building materials. The high costs and technical difficulties involved with the fabrication of continuous-fiber-reinforced composites sometimes limit their use. Particulate-reinforced composites can be thought of as a viable alternative. They are as easily processed to near-net shape as monolithic materials, yet have the improved stiffness, strength, and fracture toughness that is characteristic of continuous-fiber-reinforced composites. For example, particle-reinforced metal-matrix composites show great potential for a variety of automotive applications, such as disk brake rotors, connecting rods, cylinder liners, and other high-temperature applications. Building materials, such as concrete, can be thought of as one of the oldest materials in this category of multiphase, particle-reinforced materials. The adaptation of ICAN to analyze particle-reinforced composite materials involved the development of new micromechanics-based theories. A derivative of the ICAN code, ICAN/PART, was developed and delivered to Master Builders Inc. as part of the cooperative activity.

  13. Adaptive control of Parkinson's state based on a nonlinear computational model with unknown parameters.

    Science.gov (United States)

    Su, Fei; Wang, Jiang; Deng, Bin; Wei, Xi-Le; Chen, Ying-Yuan; Liu, Chen; Li, Hui-Yan

    2015-02-01

    The objective here is to explore the use of adaptive input-output feedback linearization method to achieve an improved deep brain stimulation (DBS) algorithm for closed-loop control of Parkinson's state. The control law is based on a highly nonlinear computational model of Parkinson's disease (PD) with unknown parameters. The restoration of thalamic relay reliability is formulated as the desired outcome of the adaptive control methodology, and the DBS waveform is the control input. The control input is adjusted in real time according to estimates of unknown parameters as well as the feedback signal. Simulation results show that the proposed adaptive control algorithm succeeds in restoring the relay reliability of the thalamus, and at the same time achieves accurate estimation of unknown parameters. Our findings point to the potential value of adaptive control approach that could be used to regulate DBS waveform in more effective treatment of PD.

  14. Autonomic intrusion detection: Adaptively detecting anomalies over unlabeled audit data streams in computer networks

    KAUST Repository

    Wang, Wei; Guyet, Thomas; Quiniou, René ; Cordier, Marie-Odile; Masseglia, Florent; Zhang, Xiangliang

    2014-01-01

In this work, we propose a novel framework of autonomic intrusion detection that fulfills online and adaptive intrusion detection over unlabeled HTTP traffic streams in computer networks. The framework holds potential for self-managing: self-labeling, self-updating and self-adapting. Our framework employs the Affinity Propagation (AP) algorithm to learn a subject’s behaviors through dynamical clustering of the streaming data. It automatically labels the data and adapts to normal behavior changes while identifying anomalies. Two large real HTTP traffic streams collected in our institute, as well as a set of benchmark KDD’99 data, are used to validate the framework and the method. The test results show that the autonomic model achieves better results in terms of effectiveness and efficiency compared to the adaptive Sequential Karhunen–Loeve method and static AP, as well as three other static anomaly detection methods, namely, k-NN, PCA and SVM.

  16. On Adaptive vs. Non-adaptive Security of Multiparty Protocols

    DEFF Research Database (Denmark)

    Canetti, Ran; Damgård, Ivan Bjerre; Dziembowski, Stefan

    2001-01-01

Security analysis of multiparty cryptographic protocols distinguishes between two types of adversarial settings: in the non-adaptive setting, the set of corrupted parties is chosen in advance, before the interaction begins; in the adaptive setting, the adversary chooses who to corrupt during the course of the computation. We study the relations between adaptive security (i.e., security in the adaptive setting) and non-adaptive security, according to two definitions and in several models of computation. While affirming some prevailing beliefs, we also obtain some unexpected results. Some highlights of our results are: – According to the definition of Dodis-Micali-Rogaway (which is set in the information-theoretic model), adaptive and non-adaptive security are equivalent. This holds for both honest-but-curious and Byzantine adversaries, and for any number of parties. – According…

  17. A Gaussian mixture model based adaptive classifier for fNIRS brain-computer interfaces and its testing via simulation

    Science.gov (United States)

    Li, Zheng; Jiang, Yi-han; Duan, Lian; Zhu, Chao-zhe

    2017-08-01

Objective. Functional near-infrared spectroscopy (fNIRS) is a promising brain imaging technology for brain-computer interfaces (BCI). Future clinical uses of fNIRS will likely require operation over long time spans, during which neural activation patterns may change. However, current decoders for fNIRS signals are not designed to handle changing activation patterns. The objective of this study is to test via simulations a new adaptive decoder for fNIRS signals, the Gaussian mixture model adaptive classifier (GMMAC). Approach. GMMAC can simultaneously classify and track activation pattern changes without the need for ground-truth labels. This adaptive classifier uses computationally efficient variational Bayesian inference to label new data points and update mixture model parameters, using the previous model parameters as priors. We test GMMAC in simulations in which neural activation patterns change over time and compare to static decoders and unsupervised adaptive linear discriminant analysis classifiers. Main results. Our simulation experiments show GMMAC can accurately decode under time-varying activation patterns: shifts of activation region, expansions of activation region, and combined contractions and shifts of activation region. Furthermore, the experiments show the proposed method can track the changing shape of the activation region. Compared to prior work, GMMAC performed significantly better than the other unsupervised adaptive classifiers on a difficult activation pattern change simulation: 99% versus […]. Significance. GMMAC is applicable to fNIRS-based brain-computer interfaces, including neurofeedback training systems, where operation over long time spans is required.
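The paper's decoder uses variational Bayesian inference on a full Gaussian mixture; a much simpler stand-in conveys the core idea of unsupervised adaptation, i.e. updating class models from unlabeled data weighted by posterior responsibility. Everything below (class structure, learning rate, drift scenario) is an invented illustration, not GMMAC itself:

```python
import numpy as np

rng = np.random.default_rng(0)

class AdaptiveGaussianClassifier:
    """Two-class 1D Gaussian classifier that tracks drifting class means.

    Each unlabeled point softly updates both class means, weighted by its
    posterior responsibility (a crude stand-in for a variational-Bayes
    mixture update; the names and learning rate are invented).
    """
    def __init__(self, mu0, mu1, var=1.0, lr=0.05):
        self.mu = np.array([mu0, mu1], dtype=float)
        self.var, self.lr = var, lr

    def posteriors(self, x):
        logp = -((x - self.mu) ** 2) / (2 * self.var)
        p = np.exp(logp - logp.max())
        return p / p.sum()

    def classify_and_adapt(self, x):
        p = self.posteriors(x)
        self.mu += self.lr * p * (x - self.mu)  # responsibility-weighted update
        return int(np.argmax(p))

# Both class means drift upward over time; the classifier follows them.
clf = AdaptiveGaussianClassifier(mu0=-2.0, mu1=2.0)
correct, total = 0, 0
for t in range(2000):
    drift = 3.0 * t / 2000
    label = int(rng.integers(2))
    x = (-2.0 if label == 0 else 2.0) + drift + rng.normal(0, 0.5)
    correct += clf.classify_and_adapt(x) == label
    total += 1
accuracy = correct / total
```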

  18. A Conceptual Architecture for Adaptive Human-Computer Interface of a PT Operation Platform Based on Context-Awareness

    Directory of Open Access Journals (Sweden)

    Qing Xue

    2014-01-01

We present a conceptual architecture for an adaptive human-computer interface of a PT operation platform based on context-awareness. This architecture will form the basis for the design of such an interface. This paper describes the components, key technologies, and working principles of the architecture. The critical content covers context information modeling and processing, establishing relationships between contexts and interface design knowledge through adaptive knowledge reasoning, and implementing visualization of the adaptive interface with the aid of interface tools technology.

  19. Sub-module Short Circuit Fault Diagnosis in Modular Multilevel Converter Based on Wavelet Transform and Adaptive Neuro Fuzzy Inference System

    DEFF Research Database (Denmark)

    Liu, Hui; Loh, Poh Chiang; Blaabjerg, Frede

    2015-01-01

…for continuous operation and post-fault maintenance. In this article, a fault diagnosis technique is proposed for the short circuit fault in a modular multilevel converter sub-module using the wavelet transform and an adaptive neuro fuzzy inference system. The fault features are extracted from the output phase voltage…

  20. Adaptation of MPDATA Heterogeneous Stencil Computation to Intel Xeon Phi Coprocessor

    Directory of Open Access Journals (Sweden)

    Lukasz Szustak

    2015-01-01

The multidimensional positive definite advection transport algorithm (MPDATA) belongs to the group of nonoscillatory forward-in-time algorithms and performs a sequence of stencil computations. MPDATA is one of the major parts of the dynamic core of the EULAG geophysical model. In this work, we outline an approach to adapting the 3D MPDATA algorithm to the Intel MIC architecture. In order to utilize the available computing resources, we propose the (3 + 1)D decomposition of MPDATA heterogeneous stencil computations. This approach is based on a combination of the loop tiling and loop fusion techniques. It allows us to ease memory/communication bounds and better exploit the theoretical floating point efficiency of the target computing platforms. An important method of improving the efficiency of the (3 + 1)D decomposition is partitioning the available cores/threads into work teams, which reduces inter-cache communication overheads. This method also increases opportunities for the efficient distribution of MPDATA computation onto the available resources of the Intel MIC architecture, as well as Intel CPUs. We discuss preliminary performance results obtained on two hybrid platforms, each containing two CPUs and an Intel Xeon Phi. The top-of-the-line Intel Xeon Phi 7120P gives the best performance results, executing MPDATA almost 2 times faster than two Intel Xeon E5-2697v2 CPUs.
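The core technique named here, loop tiling for stencil codes, can be illustrated in miniature. The sketch below uses a 2D five-point stencil rather than the 3D MPDATA kernels, and the tiling is purely illustrative (in Python the payoff is correctness of the decomposition, not speed):

```python
import numpy as np

def stencil_naive(a):
    """5-point averaging stencil on the interior of a 2D grid."""
    out = a.copy()
    out[1:-1, 1:-1] = (a[1:-1, 1:-1] + a[:-2, 1:-1] + a[2:, 1:-1]
                       + a[1:-1, :-2] + a[1:-1, 2:]) / 5.0
    return out

def stencil_tiled(a, tile=32):
    """Same stencil computed tile by tile, as cache-blocking would."""
    out = a.copy()
    n, m = a.shape
    for i0 in range(1, n - 1, tile):
        for j0 in range(1, m - 1, tile):
            i1, j1 = min(i0 + tile, n - 1), min(j0 + tile, m - 1)
            out[i0:i1, j0:j1] = (a[i0:i1, j0:j1]
                                 + a[i0-1:i1-1, j0:j1] + a[i0+1:i1+1, j0:j1]
                                 + a[i0:i1, j0-1:j1-1] + a[i0:i1, j0+1:j1+1]) / 5.0
    return out

grid = np.random.default_rng(2).normal(size=(100, 100))
res_naive = stencil_naive(grid)
res_tiled = stencil_tiled(grid, tile=32)
```

In a compiled implementation each tile stays resident in cache while all stencil terms touching it are fused into one pass, which is the essence of the paper's (3 + 1)D decomposition.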

  1. Exploiting short-term memory in soft body dynamics as a computational resource.

    Science.gov (United States)

    Nakajima, K; Li, T; Hauser, H; Pfeifer, R

    2014-11-06

    Soft materials are not only highly deformable, but they also possess rich and diverse body dynamics. Soft body dynamics exhibit a variety of properties, including nonlinearity, elasticity and potentially infinitely many degrees of freedom. Here, we demonstrate that such soft body dynamics can be employed to conduct certain types of computation. Using body dynamics generated from a soft silicone arm, we show that they can be exploited to emulate functions that require memory and to embed robust closed-loop control into the arm. Our results suggest that soft body dynamics have a short-term memory and can serve as a computational resource. This finding paves the way towards exploiting passive body dynamics for control of a large class of underactuated systems. © 2014 The Author(s) Published by the Royal Society. All rights reserved.
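The computational principle here is that of reservoir computing: a fixed nonlinear dynamical system (the "body") is driven by an input, and only a linear readout is trained. A standard echo-state sketch of delayed-input recall, the classic short-term-memory benchmark, with a random network as an invented stand-in for the silicone arm:

```python
import numpy as np

rng = np.random.default_rng(42)

# A fixed random "body": a driven nonlinear dynamical system.
N, T, delay = 100, 1200, 2
W = rng.normal(0, 1, (N, N))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))   # scale to spectral radius 0.9
w_in = rng.normal(0, 1, N)
u = rng.uniform(-1, 1, T)                   # random input stream

states = np.zeros((T, N))
x = np.zeros(N)
for t in range(T):
    x = np.tanh(W @ x + w_in * u[t])        # passive nonlinear dynamics
    states[t] = x

# Linear readout trained to recall the input from `delay` steps ago.
washout = 100
X, y = states[washout:1000], u[washout - delay:1000 - delay]
w_out = np.linalg.lstsq(X, y, rcond=None)[0]
pred = states[1000:] @ w_out
target = u[1000 - delay:T - delay]
corr = float(np.corrcoef(pred, target)[0, 1])
```

A high correlation on held-out data means the system's transient state still carries the past input, i.e. the dynamics themselves act as short-term memory.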

  2. Adaptation and Validation of the Foot Function Index-Revised Short Form into Polish

    Directory of Open Access Journals (Sweden)

    Radosław Rutkowski

    2017-01-01

Purpose. The aim of the present study was to adapt the Foot Function Index-Revised Short Form (FFI-RS) questionnaire into Polish and verify its reliability and validity in a group of patients with rheumatoid arthritis (RA). Methods. The study included 211 patients suffering from RA. The FFI-RS questionnaire underwent standard linguistic adaptation and its psychometric parameters were investigated. The enrolled participants had been recruited over seven months as a convenience sample from the rheumatological hospital in Śrem (Poland). They represented different sociodemographic characteristics and included residents of both rural and urban environments. Results. The mean age of the patients was 58.9 ± 10.2 years. The majority of patients (85%) were female. The average final FFI-RS score was 62.9 ± 15.3. Internal consistency was high (Cronbach's alpha = 0.95), with an intraclass correlation coefficient ranging between 0.78 and 0.84. A strong correlation was observed between the FFI-RS and Health Assessment Questionnaire-Disability Index (HAQ-DI) questionnaires. Conclusion. The Polish version of the FFI-RS (FFI-RS-PL) is an important tool for evaluating the functional condition of patients' feet and can be applied in the diagnosis and treatment of Polish-speaking patients suffering from RA.

  3. Adaptation and Validation of the Foot Function Index-Revised Short Form into Polish.

    Science.gov (United States)

    Rutkowski, Radosław; Gałczyńska-Rusin, Małgorzata; Gizińska, Małgorzata; Straburzyński-Lupa, Marcin; Zdanowska, Agata; Romanowski, Mateusz Wojciech; Romanowski, Wojciech; Budiman-Mak, Elly; Straburzyńska-Lupa, Anna

    2017-01-01

The aim of the present study was to adapt the Foot Function Index-Revised Short Form (FFI-RS) questionnaire into Polish and verify its reliability and validity in a group of patients with rheumatoid arthritis (RA). The study included 211 patients suffering from RA. The FFI-RS questionnaire underwent standard linguistic adaptation and its psychometric parameters were investigated. The enrolled participants had been recruited over seven months as a convenience sample from the rheumatological hospital in Śrem (Poland). They represented different sociodemographic characteristics and included residents of both rural and urban environments. The mean age of the patients was 58.9 ± 10.2 years. The majority of patients (85%) were female. The average final FFI-RS score was 62.9 ± 15.3. Internal consistency was high (Cronbach's alpha = 0.95), with an intraclass correlation coefficient ranging between 0.78 and 0.84. A strong correlation was observed between the FFI-RS and Health Assessment Questionnaire-Disability Index (HAQ-DI) questionnaires. The Polish version of the FFI-RS (FFI-RS-PL) is an important tool for evaluating the functional condition of patients' feet and can be applied in the diagnosis and treatment of Polish-speaking patients suffering from RA.

  4. Implementing Molecular Dynamics for Hybrid High Performance Computers - 1. Short Range Forces

    International Nuclear Information System (INIS)

    Brown, W. Michael; Wang, Peng; Plimpton, Steven J.; Tharrington, Arnold N.

    2011-01-01

    The use of accelerators such as general-purpose graphics processing units (GPGPUs) have become popular in scientific computing applications due to their low cost, impressive floating-point capabilities, high memory bandwidth, and low electrical power requirements. Hybrid high performance computers, machines with more than one type of floating-point processor, are now becoming more prevalent due to these advantages. In this work, we discuss several important issues in porting a large molecular dynamics code for use on parallel hybrid machines - (1) choosing a hybrid parallel decomposition that works on central processing units (CPUs) with distributed memory and accelerator cores with shared memory, (2) minimizing the amount of code that must be ported for efficient acceleration, (3) utilizing the available processing power from both many-core CPUs and accelerators, and (4) choosing a programming model for acceleration. We present our solution to each of these issues for short-range force calculation in the molecular dynamics package LAMMPS. We describe algorithms for efficient short range force calculation on hybrid high performance machines. We describe a new approach for dynamic load balancing of work between CPU and accelerator cores. We describe the Geryon library that allows a single code to compile with both CUDA and OpenCL for use on a variety of accelerators. Finally, we present results on a parallel test cluster containing 32 Fermi GPGPUs and 180 CPU cores.
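A central building block of short-range force calculation, on CPUs and accelerators alike, is avoiding the O(N²) pair loop with cell (or neighbor) lists. The sketch below checks a cell-list implementation against brute force for a truncated Lennard-Jones system; parameters and units are illustrative, and this is not LAMMPS/Geryon code:

```python
import numpy as np

def lj_forces_brute(pos, box, rc):
    """Truncated Lennard-Jones forces, all pairs, minimum image."""
    n = len(pos)
    f = np.zeros_like(pos)
    for i in range(n):
        for j in range(i + 1, n):
            d = pos[i] - pos[j]
            d -= box * np.round(d / box)          # minimum-image convention
            r2 = float(d @ d)
            if r2 < rc * rc:
                inv6 = (1.0 / r2) ** 3
                fmag = (48 * inv6 * inv6 - 24 * inv6) / r2
                f[i] += fmag * d
                f[j] -= fmag * d
    return f

def lj_forces_cells(pos, box, rc):
    """Same forces via a cell list: O(N) candidate pairs instead of O(N^2)."""
    nc = max(1, int(box // rc))                   # cells per side, cubic box
    cell = (pos / box * nc).astype(int) % nc
    members = {}
    for i, c in enumerate(map(tuple, cell)):
        members.setdefault(c, []).append(i)
    f = np.zeros_like(pos)
    for i in range(len(pos)):
        ci = cell[i]
        # Deduplicated set of the (up to) 27 neighbouring cells.
        neigh = {tuple((ci + np.array(off) - 1) % nc) for off in np.ndindex(3, 3, 3)}
        for cj in neigh:
            for j in members.get(cj, []):
                if j <= i:
                    continue                      # count each pair once
                d = pos[i] - pos[j]
                d -= box * np.round(d / box)
                r2 = float(d @ d)
                if r2 < rc * rc:
                    inv6 = (1.0 / r2) ** 3
                    fmag = (48 * inv6 * inv6 - 24 * inv6) / r2
                    f[i] += fmag * d
                    f[j] -= fmag * d
    return f

rng = np.random.default_rng(11)
positions = rng.uniform(0.0, 10.0, (40, 3))
f_brute = lj_forces_brute(positions, 10.0, 2.5)
f_cells = lj_forces_cells(positions, 10.0, 2.5)
```

The same decomposition maps naturally onto accelerators: each cell (or particle) becomes a unit of parallel work, which is the starting point for the hybrid CPU/GPU partitioning the paper describes.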

  5. Adult zebrafish intestine resection: a novel model of short bowel syndrome, adaptation, and intestinal stem cell regeneration.

    Science.gov (United States)

    Schall, K A; Holoyda, K A; Grant, C N; Levin, D E; Torres, E R; Maxwell, A; Pollack, H A; Moats, R A; Frey, M R; Darehzereshki, A; Al Alam, D; Lien, C; Grikscheit, T C

    2015-08-01

Loss of significant intestinal length from congenital anomaly or disease may lead to short bowel syndrome (SBS); intestinal failure may be partially offset by a gain in epithelial surface area, termed adaptation. Current in vivo models of SBS are costly and technically challenging. Operative times and survival rates have slowed extension to transgenic models. We created a new reproducible in vivo model of SBS in zebrafish, a tractable vertebrate model, to facilitate investigation of the mechanisms of intestinal adaptation. Proximal intestinal diversion at segment 1 (S1, equivalent to jejunum) was performed in adult male zebrafish. SBS fish emptied distal intestinal contents via stoma as in the human disease. After 2 wk, S1 was dilated compared with controls and villus ridges had increased complexity, contributing to greater villus epithelial perimeter. The number of intervillus pockets, the intestinal stem cell zone of the zebrafish, increased, and these pockets contained a higher number of bromodeoxyuridine (BrdU)-labeled cells after 2 wk of SBS. Egf receptor and a subset of its ligands, also drivers of adaptation, were upregulated in SBS fish. Igf has been reported as a driver of intestinal adaptation in other animal models, and SBS fish exposed to a pharmacological inhibitor of the Igf receptor failed to demonstrate signs of intestinal adaptation, such as increased inner epithelial perimeter and BrdU incorporation. We describe a technically feasible model of human SBS in the zebrafish, a faster and less expensive tool to investigate intestinal stem cell plasticity as well as the mechanisms that drive intestinal adaptation. Copyright © 2015 the American Physiological Society.

  6. Colonic GLP-2 is not sufficient to promote jejunal adaptation in a PN-dependent rat model of human short bowel syndrome

    DEFF Research Database (Denmark)

    Koopmann, Matthew C; Liu, Xiaowen; Boehler, Christopher J

    2009-01-01

BACKGROUND: Bowel resection may lead to short bowel syndrome (SBS), which often requires parenteral nutrition (PN) due to inadequate intestinal adaptation. The objective of this study was to determine the time course of adaptation and proglucagon system responses after bowel resection in a PN… …and digestive capacity were assessed by mucosal mass, protein, DNA, histology, and sucrase activity. Plasma insulin-like growth factor I (IGF-I) and bioactive glucagon-like peptide 2 (GLP-2) were measured by radioimmunoassay. RESULTS: Jejunum cellularity changed significantly over time with resection…

  7. Short-term cardiorespiratory adaptation to high altitude in children compared with adults.

    Science.gov (United States)

    Kriemler, S; Radtke, T; Bürgi, F; Lambrecht, J; Zehnder, M; Brunner-La Rocca, H P

    2016-02-01

    As short-term cardiorespiratory adaptation to high altitude (HA) exposure has not yet been studied in children, we assessed acute mountain sickness (AMS), hypoxic ventilatory response (HVR) at rest and maximal exercise capacity (CPET) at low altitude (LA) and HA in pre-pubertal children and their fathers. Twenty father-child pairs (11 ± 1 years and 44 ± 4 years) were tested at LA (450 m) and HA (3450 m) at days 1, 2, and 3 after fast ascent (HA1/2/3). HVR was measured at rest and CPET was performed on a cycle ergometer. AMS severity was mild to moderate with no differences between generations. HVR was higher in children than adults at LA and increased at HA similarly in both groups. Peak oxygen uptake (VO2 peak) relative to body weight was similar in children and adults at LA and decreased significantly by 20% in both groups at HA; maximal heart rate did not change at HA in children while it decreased by 16% in adults (P < 0.001). Changes in HVR and VO2 peak from LA to HA were correlated among the biological child-father pairs. In conclusion, cardiorespiratory adaptation to altitude seems to be at least partly hereditary. Even though children and their fathers lose similar fractions of aerobic capacity going to high altitude, the mechanisms might be different. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  8. Method and system for rendering and interacting with an adaptable computing environment

    Science.gov (United States)

    Osbourn, Gordon Cecil [Albuquerque, NM; Bouchard, Ann Marie [Albuquerque, NM

    2012-06-12

    An adaptable computing environment is implemented with software entities termed "s-machines", which self-assemble into hierarchical data structures capable of rendering and interacting with the computing environment. A hierarchical data structure includes a first hierarchical s-machine bound to a second hierarchical s-machine. The first hierarchical s-machine is associated with a first layer of a rendering region on a display screen and the second hierarchical s-machine is associated with a second layer of the rendering region overlaying at least a portion of the first layer. A screen element s-machine is linked to the first hierarchical s-machine. The screen element s-machine manages data associated with a screen element rendered to the display screen within the rendering region at the first layer.

  9. Translation and cross-cultural adaptation of the Brazilian Portuguese version of the Driving Anger Scale (DAS): long form and short form

    Directory of Open Access Journals (Sweden)

    Jessye Almeida Cantini

    2015-03-01

Introduction: Driving anger has attracted the attention of researchers in recent years because it may induce individuals to drive aggressively or adopt risk behaviors. The Driving Anger Scale (DAS) was designed to evaluate the propensity of drivers to become angry or aggressive while driving. This study describes the cross-cultural adaptation of a Brazilian version of the short form and the long form of the DAS. Methods: Translation and adaptation were made in four steps: two translations and two back-translations carried out by independent evaluators; the development of a brief version by four bilingual experts in mental health and driving behaviors; a subsequent experimental application; and, finally, an investigation of operational equivalence. Results: Final Brazilian versions of the short form and of the long form of the DAS were produced and are presented. Conclusions: This important instrument, which assesses driving anger and aggressive behaviors, is now available to evaluate the driving behaviors of the Brazilian population, which facilitates research in this field.

  10. Multi-objective differential evolution with adaptive Cauchy mutation for short-term multi-objective optimal hydro-thermal scheduling

    Energy Technology Data Exchange (ETDEWEB)

    Qin Hui [College of Hydropower and Information Engineering, Huazhong University of Science and Technology, Wuhan 430074 (China); Zhou Jianzhong, E-mail: jz.zhou@hust.edu.c [College of Hydropower and Information Engineering, Huazhong University of Science and Technology, Wuhan 430074 (China); Lu Youlin; Wang Ying; Zhang Yongchuan [College of Hydropower and Information Engineering, Huazhong University of Science and Technology, Wuhan 430074 (China)

    2010-04-15

A new multi-objective optimization method based on differential evolution with adaptive Cauchy mutation (MODE-ACM) is presented to solve the short-term multi-objective optimal hydro-thermal scheduling (MOOHS) problem. Besides fuel cost, pollutant gas emission is also optimized as an objective. The water transport delay between connected reservoirs and the effect of valve-point loading of thermal units are also taken into account in the problem formulation. The proposed algorithm adopts an elitist archive to retain the non-dominated solutions obtained during the evolutionary process. It modifies DE's operators to suit multi-objective optimization (MOO) problems and improve its performance. Furthermore, to avoid premature convergence, an adaptive Cauchy mutation is proposed to preserve the diversity of the population. An effective constraint-handling method is utilized to handle the complex equality and inequality constraints. The effectiveness of the proposed algorithm is tested on a hydro-thermal system consisting of four cascaded hydro plants and three thermal units. The results obtained by MODE-ACM are compared with several previous studies and found to be superior in terms of both fuel cost and emission output, while requiring less computation time. Thus it can be a viable alternative for generating optimal trade-offs for the short-term MOOHS problem.
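The multi-objective machinery (elitist archive, constraint handling) is beyond a short sketch, but the two named ingredients, DE/rand/1/bin and an adaptive Cauchy mutation whose scale shrinks over generations, can be shown on a single-objective toy problem. The schedule and constants below are invented illustrations, not MODE-ACM:

```python
import numpy as np

rng = np.random.default_rng(1)

def de_cauchy(f, bounds, pop_size=30, gens=200, F=0.5, CR=0.9):
    """DE/rand/1/bin with an adaptive Cauchy mutation of the best individual.

    The Cauchy perturbation scale decays over generations: wide early on to
    escape premature convergence, narrow late for fine-tuning.
    """
    dim = len(bounds)
    lo, hi = bounds[:, 0], bounds[:, 1]
    pop = lo + rng.uniform(size=(pop_size, dim)) * (hi - lo)
    fit = np.array([f(p) for p in pop])
    for g in range(gens):
        for i in range(pop_size):
            a, b, c = rng.choice(pop_size, 3, replace=False)
            mutant = pop[a] + F * (pop[b] - pop[c])
            cross = rng.uniform(size=dim) < CR
            cross[rng.integers(dim)] = True       # at least one mutant gene
            trial = np.clip(np.where(cross, mutant, pop[i]), lo, hi)
            ft = f(trial)
            if ft <= fit[i]:                      # greedy selection
                pop[i], fit[i] = trial, ft
        # Adaptive Cauchy mutation: heavy-tailed jump, shrinking scale.
        scale = 0.1 * (1 - g / gens)
        best = int(np.argmin(fit))
        cand = np.clip(pop[best] + scale * rng.standard_cauchy(dim), lo, hi)
        fc = f(cand)
        if fc < fit[best]:
            pop[best], fit[best] = cand, fc
    best = int(np.argmin(fit))
    return pop[best], float(fit[best])

bounds = np.array([[-5.0, 5.0]] * 5)
x_best, f_best = de_cauchy(lambda x: float((x ** 2).sum()), bounds)
```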

  11. Computational modeling of ultra-short-pulse ablation of enamel

    Energy Technology Data Exchange (ETDEWEB)

London, R.A.; Bailey, D.S.; Young, D.A. [and others]

    1996-02-29

A computational model for the ablation of tooth enamel by ultra-short laser pulses is presented. The role of simulations using this model in designing and understanding laser drilling systems is discussed. Pulses of duration 300 fs and intensity greater than 10^12 W/cm^2 are considered. Laser absorption proceeds via a multi-photon initiated plasma mechanism. The hydrodynamic response is calculated with a finite difference method, using an equation of state constructed from thermodynamic functions including electronic, ion motion, and chemical binding terms. Results for the ablation efficiency are presented. An analytic model describing the ablation threshold and ablation depth is presented. Thermal coupling to the remaining tissue and long-time thermal conduction are calculated. Simulation results are compared to experimental measurements of the ablation efficiency. Desired improvements in the model are presented.

  12. Increased performance in the short-term water demand forecasting through the use of a parallel adaptive weighting strategy

    Science.gov (United States)

    Sardinha-Lourenço, A.; Andrade-Campos, A.; Antunes, A.; Oliveira, M. S.

    2018-03-01

Recent research on short-term water demand forecasting has shown that models using univariate time series based on historical data are useful and can be combined with other prediction methods to reduce errors. Water demand in drinking water distribution networks is largely repetitive in nature and, under similar meteorological conditions and consumer profiles, allows the development of a heuristic forecast model that, combined with other autoregressive models, can provide reliable forecasts. In this study, a parallel adaptive weighting strategy for forecasting water consumption over the next 24-48 h, using univariate time series of potable water consumption, is proposed. Two Portuguese potable water distribution networks are used as case studies, where the only input data are the consumption of water and the national calendar. For the development of the strategy, the Autoregressive Integrated Moving Average (ARIMA) method and a short-term forecast heuristic algorithm are used. Simulations with the model showed that, when using a parallel adaptive weighting strategy, the prediction error can be reduced by 15.96% and the average error by 9.20%. This reduction is important in the control and management of water supply systems. The proposed methodology can be extended to other forecast methods, especially given the availability of multiple forecast models.
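One simple way to realize an adaptive weighting of parallel forecasters, hedged as an illustration rather than the paper's exact scheme, is to weight each model by the inverse of its mean absolute error over a sliding window:

```python
import numpy as np

rng = np.random.default_rng(7)

def adaptive_weights(errors, window=24):
    """Inverse-MAE weights over a sliding window of recent errors."""
    recent = np.abs(errors[-window:])
    inv = 1.0 / (recent.mean(axis=0) + 1e-9)
    return inv / inv.sum()

# Two hypothetical forecasters: model A is consistently better than model B.
T = 500
truth = np.sin(np.arange(T) * 2 * np.pi / 24) * 100 + 500  # daily-ish demand
pred_a = truth + rng.normal(0, 5, T)      # good model
pred_b = truth + rng.normal(0, 20, T)     # weaker model

errors, combined = [], []
w = np.array([0.5, 0.5])
for t in range(T):
    combined.append(w[0] * pred_a[t] + w[1] * pred_b[t])
    errors.append([pred_a[t] - truth[t], pred_b[t] - truth[t]])
    if t >= 24:
        w = adaptive_weights(np.array(errors))   # re-weight from recent errors
mae_combined = float(np.mean(np.abs(np.array(combined) - truth)))
mae_b = float(np.mean(np.abs(pred_b - truth)))
final_w = w
```

The weights shift toward whichever model has been more accurate recently, so the combination tracks changes in the relative skill of the parallel models.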

  13. On-Line Testing and Reconfiguration of Field Programmable Gate Arrays (FPGAs) for Fault-Tolerant (FT) Applications in Adaptive Computing Systems (ACS)

    National Research Council Canada - National Science Library

    Abramovici, Miron

    2002-01-01

    Adaptive computing systems (ACS) rely on reconfigurable hardware to adapt the system operation to changes in the external environment, and to extend mission capability by implementing new functions on the same hardware platform...

  14. Adaptive tight frame based medical image reconstruction: a proof-of-concept study for computed tomography

    International Nuclear Information System (INIS)

    Zhou, Weifeng; Cai, Jian-Feng; Gao, Hao

    2013-01-01

A popular approach to medical image reconstruction has been sparsity regularization, which assumes the targeted image can be well approximated by sparse coefficients under some properly designed system. The wavelet tight frame is one widely used system, due to its capability for sparsely approximating piecewise-smooth functions such as medical images. However, using a fixed system may not always be optimal for reconstructing a variety of diversified images. Recently, methods based on adaptive over-complete dictionaries that are specific to the structures of the targeted images have demonstrated superiority in image processing. This work develops an adaptive wavelet tight frame method for image reconstruction. The proposed scheme first constructs an adaptive wavelet tight frame that is task specific, and then reconstructs the image of interest by solving an ℓ1-regularized minimization problem using the constructed adaptive tight frame system. A proof-of-concept study is performed for computed tomography (CT), and the simulation results suggest that the adaptive tight frame method improves the reconstructed CT image quality over the traditional tight frame method. (paper)
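For a fixed orthonormal wavelet system, the ℓ1-regularized step has a closed form: soft-thresholding of the coefficients. The sketch below uses a non-adaptive Haar basis on a 1D denoising problem (not CT reconstruction) purely to illustrate that step; the paper's adaptive frame construction is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(3)

def haar_forward(x):
    """Full orthonormal Haar transform of a length-2^k signal."""
    x = np.asarray(x, dtype=float)
    coeffs = []
    while len(x) > 1:
        coeffs.append((x[0::2] - x[1::2]) / np.sqrt(2))  # detail
        x = (x[0::2] + x[1::2]) / np.sqrt(2)             # approximation
    coeffs.append(x)                                     # coarsest scale
    return coeffs

def haar_inverse(coeffs):
    x = coeffs[-1]
    for d in reversed(coeffs[:-1]):
        a = x
        x = np.empty(2 * len(a))
        x[0::2] = (a + d) / np.sqrt(2)
        x[1::2] = (a - d) / np.sqrt(2)
    return x

def soft(c, lam):
    return np.sign(c) * np.maximum(np.abs(c) - lam, 0.0)

def haar_l1_denoise(y, lam):
    """argmin_x 0.5*||x - y||^2 + lam*||Wx||_1 for orthonormal Haar W:
    the exact solution is soft-thresholding of the Haar coefficients."""
    coeffs = haar_forward(y)
    coeffs = [soft(c, lam) for c in coeffs[:-1]] + [coeffs[-1]]
    return haar_inverse(coeffs)

clean = np.repeat([0.0, 4.0, -2.0, 1.0], 64).astype(float)  # piecewise constant
noisy = clean + rng.normal(0, 0.5, clean.size)
denoised = haar_l1_denoise(noisy, lam=1.0)
```

In the adaptive version, the fixed Haar filters would be replaced by filters learned from the data, making the sparse approximation tighter for the images at hand.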

  15. Automatic Delineation of On-Line Head-And-Neck Computed Tomography Images: Toward On-Line Adaptive Radiotherapy

    International Nuclear Information System (INIS)

    Zhang Tiezhi; Chi Yuwei; Meldolesi, Elisa; Yan Di

    2007-01-01

Purpose: To develop and validate a fully automatic region-of-interest (ROI) delineation method for on-line adaptive radiotherapy. Methods and Materials: On-line adaptive radiotherapy requires a robust and automatic image segmentation method to delineate ROIs in on-line volumetric images. We have implemented an atlas-based image segmentation method to automatically delineate ROIs of head-and-neck helical computed tomography images. A total of 32 daily computed tomography images from 7 head-and-neck patients were delineated using this automatic image segmentation method. Manually drawn contours on the daily images were used as references in the evaluation of automatically delineated ROIs. Two methods were used in quantitative validation: (1) the Dice similarity coefficient index, which indicates the overlapping ratio between the manually and automatically delineated ROIs; and (2) the distance transformation, which yields the distances between the manually and automatically delineated ROI surfaces. Results: Automatic segmentation showed agreement with manual contouring. For most ROIs, the Dice similarity coefficient indexes were approximately 0.8. Similarly, the distance transformation evaluation results showed that the distances between the manually and automatically delineated ROI surfaces were mostly within 3 mm. The distances between the two surfaces had a mean of 1 mm and standard deviation of <2 mm in most ROIs. Conclusion: With atlas-based image segmentation, it is feasible to automatically delineate ROIs on head-and-neck helical computed tomography images in on-line adaptive treatments.
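The first evaluation metric is easy to make concrete: DSC = 2|A∩B| / (|A| + |B|). A minimal computation on toy binary masks (the masks are invented for illustration):

```python
import numpy as np

def dice(a, b):
    """Dice similarity coefficient between two binary masks."""
    a, b = a.astype(bool), b.astype(bool)
    inter = np.logical_and(a, b).sum()
    return 2.0 * inter / (a.sum() + b.sum())

manual = np.zeros((10, 10), dtype=bool)
manual[2:8, 2:8] = True          # 36 pixels, "manually drawn" ROI
auto = np.zeros((10, 10), dtype=bool)
auto[3:8, 2:8] = True            # 30 pixels, one row eroded

dsc = dice(manual, auto)         # 2*30 / (36 + 30) = 10/11 ≈ 0.909
```

A DSC of 1 means perfect overlap and 0 means none; the paper's ~0.8 values sit in the range usually considered good agreement for organ contours.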

  16. Rapid Computation of Thermodynamic Properties over Multidimensional Nonbonded Parameter Spaces Using Adaptive Multistate Reweighting.

    Science.gov (United States)

    Naden, Levi N; Shirts, Michael R

    2016-04-12

We show how thermodynamic properties of molecular models can be computed over a large, multidimensional parameter space by combining multistate reweighting analysis with a linear basis function approach. This approach reduces the computational cost of estimating thermodynamic properties from molecular simulations for over 130,000 tested parameter combinations from over 1000 CPU years to tens of CPU days. This speed increase is achieved primarily by computing the potential energy as a linear combination of basis functions, computed from either modified simulation code or as the difference of energy between two reference states, which can be done without any simulation code modification. The thermodynamic properties are then estimated with the Multistate Bennett Acceptance Ratio (MBAR) as a function of multiple model parameters without the need to define a priori how the states are connected by a pathway. Instead, we adaptively sample a set of points in parameter space to create mutual configuration space overlap. The existence of regions of poor configuration space overlap is detected by analyzing the eigenvalues of the sampled states' overlap matrix. The configuration space overlap to sampled states is monitored alongside the mean and maximum uncertainty to determine convergence, as neither the uncertainty nor the configuration space overlap alone is a sufficient metric of convergence. This adaptive sampling scheme is demonstrated by estimating with high precision the solvation free energies of charged particles of Lennard-Jones plus Coulomb functional form with charges between -2 and +2 and generally physical values of σij and ϵij in TIP3P water. We also compute the entropy, enthalpy, and radial distribution functions of arbitrary unsampled parameter combinations using only the data from these sampled states, and we use the estimates of free energies over the entire space to examine the deviation of atomistic simulations from the Born approximation to the solvation free energy.
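MBAR itself is involved, but its single-state special case, exponential (Zwanzig) reweighting of samples from one parameter value to estimate an observable at an unsampled one, fits in a few lines. The harmonic toy model below is an illustration of the reweighting principle, not the paper's nonbonded-parameter setup:

```python
import numpy as np

rng = np.random.default_rng(5)

# Harmonic toy model: U(x; mu) = (x - mu)^2 / 2, sampled at mu = 0 (beta = 1).
samples = rng.normal(0.0, 1.0, 200_000)          # Boltzmann samples of U(.; 0)

def reweighted_mean_x(samples, mu_new):
    """Estimate <x> at an unsampled parameter mu_new by reweighting
    configurations sampled at mu = 0 (single-state exponential averaging;
    MBAR generalizes this to pooling many sampled states)."""
    dU = 0.5 * (samples - mu_new) ** 2 - 0.5 * samples ** 2
    w = np.exp(-(dU - dU.min()))                 # shift for numerical stability
    w /= w.sum()
    return float((w * samples).sum())

est = reweighted_mean_x(samples, mu_new=0.5)     # exact answer is 0.5
```

The estimate is only reliable when the sampled and target distributions overlap, which is exactly why the paper monitors configuration space overlap and samples new states adaptively where overlap is poor.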

  17. Integrable discretizations and self-adaptive moving mesh method for a coupled short pulse equation

    International Nuclear Information System (INIS)

    Feng, Bao-Feng; Chen, Junchao; Chen, Yong; Maruno, Ken-ichi; Ohta, Yasuhiro

    2015-01-01

In the present paper, integrable semi-discrete and fully discrete analogues of a coupled short pulse (CSP) equation are constructed. The key to the construction is the bilinear forms and determinant structure of the solutions of the CSP equation. We also construct N-soliton solutions for the semi-discrete and fully discrete analogues of the CSP equation in the form of Casorati determinants. In the continuous limit, we show that the fully discrete CSP equation converges to the semi-discrete CSP equation, and then further to the continuous CSP equation. Moreover, the integrable semi-discretization of the CSP equation is used as a self-adaptive moving mesh method for numerical simulations. The numerical results agree with the analytical results very well. (paper)
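For context, the scalar short pulse equation of Schäfer and Wayne, of which coupled short pulse systems are two-component generalizations, reads

```latex
u_{xt} = u + \frac{1}{6}\left(u^{3}\right)_{xx}.
```

The precise coupled form used in the paper is not reproduced here; coupled variants replace the cubic self-interaction with mixed terms in the two dependent variables.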

  18. Features of the adaptive control and measuring the effectiveness of distant teaching to computer science

    Directory of Open Access Journals (Sweden)

    Евгений Игоревич Горюшкин

    2009-06-01

    Full Text Available This paper describes approaches to building effective systems for monitoring the productivity of computer science instruction in higher education. It proposes basing such systems on adaptive testing, with artificial neural networks applied in the development of the tests.

  19. The impacts of computer adaptive testing from a variety of perspectives

    Directory of Open Access Journals (Sweden)

    Tetsuo Kimura

    2017-05-01

    Full Text Available Computer adaptive testing (CAT is a kind of tailored testing, in that it is a form of computer-based testing that is adaptive to each test-taker’s ability level. In this review, the impacts of CAT are discussed from different perspectives in order to illustrate crucial points to keep in mind during the development and implementation of CAT. Test developers and psychometricians often emphasize the efficiency and accuracy of CAT in comparison to traditional linear tests. However, many test-takers report feeling discouraged after taking CATs, and this feeling can reduce learning self-efficacy and motivation. A trade-off must be made between the psychological experiences of test-takers and measurement efficiency. From the perspective of educators and subject matter experts, nonstatistical specifications, such as content coverage, content balance, and form length are major concerns. Thus, accreditation bodies may be faced with a discrepancy between the perspectives of psychometricians and those of subject matter experts. In order to improve test-takers’ impressions of CAT, the author proposes increasing the target probability of answering correctly in the item selection algorithm even if doing so consequently decreases measurement efficiency. Two different methods, CAT with a shadow test approach and computerized multistage testing, have been developed in order to ensure the satisfaction of subject matter experts. In the shadow test approach, a full-length test is assembled that meets the constraints and provides maximum information at the current ability estimate, while computerized multistage testing gives subject matter experts an opportunity to review all test forms prior to administration.

  20. An adaptive multi-spline refinement algorithm in simulation based sailboat trajectory optimization using onboard multi-core computer systems

    Directory of Open Access Journals (Sweden)

    Dębski Roman

    2016-06-01

    Full Text Available A new dynamic programming based parallel algorithm adapted to on-board heterogeneous computers for simulation based trajectory optimization is studied in the context of “high-performance sailing”. The algorithm uses a new discrete space of continuously differentiable functions called the multi-splines as its search space representation. A basic version of the algorithm is presented in detail (pseudo-code, time and space complexity, search space auto-adaptation properties). Possible extensions of the basic algorithm are also described. The presented experimental results show that contemporary heterogeneous on-board computers can be effectively used for solving simulation based trajectory optimization problems. These computers can be considered micro high performance computing (HPC) platforms: they offer high performance while remaining energy and cost efficient. The simulation based approach can potentially give highly accurate results since the mathematical model that the simulator is built upon may be as complex as required. The approach described is applicable to many trajectory optimization problems due to its black-box represented performance measure and use of OpenCL.

  1. Efficient Computation of Multiscale Entropy over Short Biomedical Time Series Based on Linear State-Space Models

    Directory of Open Access Journals (Sweden)

    Luca Faes

    2017-01-01

    Full Text Available The most common approach to assess the dynamical complexity of a time series across multiple temporal scales makes use of the multiscale entropy (MSE and refined MSE (RMSE measures. In spite of their popularity, MSE and RMSE lack an analytical framework allowing their calculation for known dynamic processes and cannot be reliably computed over short time series. To overcome these limitations, we propose a method to assess RMSE for autoregressive (AR stochastic processes. The method makes use of linear state-space (SS models to provide the multiscale parametric representation of an AR process observed at different time scales and exploits the SS parameters to quantify analytically the complexity of the process. The resulting linear MSE (LMSE measure is first tested in simulations, both theoretically to relate the multiscale complexity of AR processes to their dynamical properties and over short process realizations to assess its computational reliability in comparison with RMSE. Then, it is applied to the time series of heart period, arterial pressure, and respiration measured for healthy subjects monitored in resting conditions and during physiological stress. This application to short-term cardiovascular variability documents that LMSE can describe better than RMSE the activity of physiological mechanisms producing biological oscillations at different temporal scales.
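
    The coarse-graining at the heart of MSE can be illustrated with a plain non-parametric implementation: average the series over non-overlapping windows of length s, then compute sample entropy at each scale. This is the classic estimator whose unreliability on short series motivates the paper's state-space LMSE; the analytical SS-based computation itself is not reproduced in this sketch.

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """SampEn(m, r) with tolerance r scaled by the series' standard deviation.
    Uses a slightly simplified template count, adequate for illustration."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()

    def matches(mm):
        # pairwise Chebyshev distances between all length-mm templates
        t = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
        d = np.max(np.abs(t[:, None] - t[None, :]), axis=2)
        return ((d <= tol).sum() - len(t)) / 2  # exclude self-matches

    b, a = matches(m), matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def multiscale_entropy(x, scales=(1, 2, 3), m=2, r=0.2):
    """Classic MSE: coarse-grain by non-overlapping averaging, then SampEn."""
    x = np.asarray(x, dtype=float)
    out = []
    for s in scales:
        n = len(x) // s
        coarse = x[:n * s].reshape(n, s).mean(axis=1)
        out.append(sample_entropy(coarse, m, r))
    return out
```

    Note how the coarse-grained series shortens by a factor of s at each scale; with short recordings the template counts become sparse quickly, which is precisely the regime where a parametric state-space estimator helps.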

  2. The Simulation Computer Based Learning (SCBL) for Short Circuit Multi Machine Power System Analysis

    Science.gov (United States)

    Rahmaniar; Putri, Maharani

    2018-03-01

    Strengthening the competitiveness of human resources is a core responsibility of universities as providers of formal higher education. The Electrical Engineering Program of UNPAB (Prodi TE UNPAB), which manages the field of electrical engineering expertise, has a very important part in preparing human resources: graduates of TE UNPAB are expected to be able to compete globally, especially in connection with the implementation of the ASEAN Economic Community (AEC), which requires graduates with competitive competence and quality. The formation of competitive human resources is pursued through strategies set out in the Seven Higher Education Standards, one part of which is the implementation of the teaching and learning process. Electrical system analysis with short circuit analysis (SCA) is a core course whose material underpins the competencies of other subjects in subsequent semesters. A Computer Based Learning (CBL) model was developed for the analysis of multi-machine short circuit faults, covering: (a) one-phase short circuit, (b) two-phase short circuit, (c) ground short circuit, and (d) one-phase-to-ground short circuit. The CBL model for the Electrical System Analysis course gives students room to be more active in solving complex problems, providing flexibility in how they actively work through short circuit analysis and fostering active student participation in learning (Student Center Learning) in the electrical power system analysis course.

  3. Usability of an adaptive computer assistant that improves self-care and health literacy of older adults

    NARCIS (Netherlands)

    Blanson Henkemans, O.A.; Rogers, W.A.; Fisk, A.D.; Neerincx, M.A.; Lindenberg, J.; Mast, C.A.P.G. van der

    2008-01-01

    Objectives: We developed an adaptive computer assistant for the supervision of diabetics' self-care, to help limit illness and the need for acute treatment, and to improve health literacy. This assistant monitors self-care activities logged in the patient's electronic diary. Accordingly, it provides

  4. An adaptive short-term prediction scheme for wind energy storage management

    International Nuclear Information System (INIS)

    Blonbou, Ruddy; Monjoly, Stephanie; Dorville, Jean-Francois

    2011-01-01

    Research highlights: → We develop a real-time algorithm for grid-connected wind energy storage management. → The method aims to guarantee, within a ±5% error margin, the power sent to the grid. → Dynamic scheduling of energy storage is based on short-term energy prediction. → Accurate predictions reduce the required storage capacity. -- Abstract: An efficient forecasting scheme that includes information on the likelihood of the forecast, and that is based on better knowledge of wind variation characteristics and their influence on power output, is of key importance for the optimal integration of wind energy into an island's power system. In the Guadeloupean archipelago (French West Indies), with a total wind power capacity of 25 MW, wind energy can represent up to 5% of instantaneous electricity production. At this level, the wind energy contribution can be equivalent to the network's current primary control reserve, which makes balancing difficult. The share of wind energy is due to grow even further, since the objective is set to reach 118 MW by 2020. For the network operator it is evident that, due to security concerns for the electrical grid, the share of wind generation should not increase unless solutions are found to the prediction problem. The University of the French West Indies and Guiana has developed a short-term wind energy prediction scheme that uses artificial neural networks and adaptive learning procedures based on a Bayesian approach and Gaussian approximation. This paper reports the results of the evaluation of the proposed approach; the improvement with respect to the simple persistence model was globally good. A discussion of how such a tool, combined with energy storage capacity, could help smooth wind power variation and improve the wind energy penetration rate in island utility networks is also proposed.

  5. Adaptation in CRISPR-Cas Systems.

    Science.gov (United States)

    Sternberg, Samuel H; Richter, Hagen; Charpentier, Emmanuelle; Qimron, Udi

    2016-03-17

    Clustered regularly interspaced short palindromic repeats (CRISPR) and CRISPR-associated (Cas) proteins constitute an adaptive immune system in prokaryotes. The system preserves memories of prior infections by integrating short segments of foreign DNA, termed spacers, into the CRISPR array in a process termed adaptation. During the past 3 years, significant progress has been made on the genetic requirements and molecular mechanisms of adaptation. Here we review these recent advances, with a focus on the experimental approaches that have been developed, the insights they generated, and a proposed mechanism for self- versus non-self-discrimination during the process of spacer selection. We further describe the regulation of adaptation and the protein players involved in this fascinating process that allows bacteria and archaea to harbor adaptive immunity. Copyright © 2016 Elsevier Inc. All rights reserved.

  6. Translation, cross-cultural adaptation and validation of the Diabetes Empowerment Scale – Short Form

    Directory of Open Access Journals (Sweden)

    Fernanda Figueredo Chaves

    Full Text Available ABSTRACT OBJECTIVE To translate, cross-culturally adapt and validate the Diabetes Empowerment Scale – Short Form for assessment of psychosocial self-efficacy in diabetes care within the Brazilian cultural context. METHODS Assessment of the instrument’s conceptual equivalence, as well as its translation and cross-cultural adaptation, were performed following international standards. The Expert Committee’s assessment of the translated version was conducted through a web questionnaire developed and applied via the web tool e-Surv. The cross-culturally adapted version was used for the pre-test, which was carried out via phone call in a group of eleven health care service users diagnosed with type 2 diabetes mellitus. The pre-test results were examined by a group of experts, composed of health care consultants, applied linguists and statisticians, aiming at an adequate version of the instrument, which was subsequently used for test and retest in a sample of 100 users diagnosed with type 2 diabetes mellitus via phone call, their answers being recorded by the web tool e-Surv. Internal consistency and reproducibility analyses were carried out in the R statistical programming environment. RESULTS Face and content validity were attained and the Brazilian Portuguese version, entitled Escala de Autoeficácia em Diabetes – Versão Curta, was established. The scale had acceptable internal consistency with Cronbach’s alpha of 0.634 (95%CI 0.494–0.737), while the correlation of the total score in the two periods was considered moderate (0.47). The intraclass correlation coefficient was 0.50. CONCLUSIONS The translated and cross-culturally adapted version of the instrument to spoken Brazilian Portuguese was considered valid and reliable to be used for assessment within the Brazilian population diagnosed with type 2 diabetes mellitus. The use of a web tool (e-Surv for recording the Expert Committee responses as well as the responses in the

  7. Building adaptive capacity in Assam

    Directory of Open Access Journals (Sweden)

    Soumyadeep Banerjee

    2015-05-01

    Full Text Available A starting point for adapting to longer-term climate change could be adaptation to short-term climate variability and extreme events. Making more informed choices about the use of remittances can enhance the adaptive capacity of remittance-receiving households.

  8. Detection of User Independent Single Trial ERPs in Brain Computer Interfaces: An Adaptive Spatial Filtering Approach

    DEFF Research Database (Denmark)

    Leza, Cristina; Puthusserypady, Sadasivan

    2017-01-01

    Brain Computer Interfaces (BCIs) use brain signals to communicate with the external world. The main challenges to address are speed, accuracy and adaptability. Here, a novel algorithm for a P300-based BCI spelling system is presented, specifically suited for single-trial detection of Event-Related Potentials (ERPs).

  9. Developing brief fatigue short forms calibrated to a common mathematical metric: is content-balancing important?

    Directory of Open Access Journals (Sweden)

    Karon F Cook

    2010-08-01

    Full Text Available Karon F Cook1, Seung W Choi2, Kurt L Johnson1, Dagmar Amtmann1; 1Department of Rehabilitation Medicine, University of Washington, Seattle, WA, USA; 2Northwestern University Feinberg School of Medicine, Chicago, IL, USA. Abstract: There are clinical and research settings in which concerns about respondent burden make the use of longer self-report measures impractical. Though computer adaptive testing provides an efficient strategy for measuring patient-reported outcomes, the requirement of a computer interface makes it impractical for some settings. This study evaluated how well brief short forms, constructed from a longer measure of patient-reported fatigue, reproduced scores on the full measure. When the items of an item bank are calibrated using an item response theory model, it is assumed that the items are fungible units. Theoretically, there should be no advantage to balancing the content coverage of the items. We compared short forms developed using a random item selection process to short forms developed with consideration of the items' relation to subdomains of fatigue (i.e., physical and cognitive fatigue). Scores on short forms developed using content balancing more successfully predicted full item bank scores than did scores on short forms developed by random selection of items. Keywords: psychometrics, outcomes, quality of life, measurement, fatigue

  10. Adaptive-Predictive Organ Localization Using Cone-Beam Computed Tomography for Improved Accuracy in External Beam Radiotherapy for Bladder Cancer

    International Nuclear Information System (INIS)

    Lalondrelle, Susan; Huddart, Robert; Warren-Oseni, Karole; Hansen, Vibeke Nordmark; McNair, Helen; Thomas, Karen; Dearnaley, David; Horwich, Alan; Khoo, Vincent

    2011-01-01

    Purpose: To examine patterns of bladder wall motion during high-dose hypofractionated bladder radiotherapy and to validate a novel adaptive planning method, A-POLO, to prevent subsequent geographic miss. Methods and Materials: Patterns of individual bladder filling were obtained with repeat computed tomography planning scans at 0, 15, and 30 minutes after voiding. A series of patient-specific plans corresponding to these time-displacement points was created. Pretreatment cone-beam computed tomography was performed before each fraction and assessed retrospectively for adaptive intervention. In fractions that would have required intervention, the most appropriate plan was chosen from the patient's 'library,' and the resulting target coverage was reassessed with repeat cone-beam computed tomography. Results: A large variation in patterns of bladder filling and interfraction displacement was seen. During radiotherapy, predominant translations occurred cranially (maximum 2.5 cm) and anteriorly (maximum 1.75 cm). No apparent explanation was found for this variation using pretreatment patient factors. A need for adaptive planning was demonstrated by 51% of fractions, and 73% of fractions would have been delivered correctly using A-POLO. The adaptive strategy improved target coverage and was able to account for intrafraction motion also. Conclusions: Bladder volume variation will result in geographic miss in a high proportion of delivered bladder radiotherapy treatments. The A-POLO strategy can be used to correct for this and can be implemented from the first fraction of radiotherapy; thus, it is particularly suited to hypofractionated bladder radiotherapy regimens.

  11. Improving Short-Range Ensemble Kalman Storm Surge Forecasting Using Robust Adaptive Inflation

    KAUST Repository

    Altaf, Muhammad

    2013-08-01

    This paper presents a robust ensemble filtering methodology for storm surge forecasting based on the singular evolutive interpolated Kalman (SEIK) filter, which has been implemented in the framework of the H∞ filter. By design, an H∞ filter is more robust than the common Kalman filter in the sense that the estimation error in the H∞ filter has, in general, a finite growth rate with respect to the uncertainties in assimilation. The computational hydrodynamical model used in this study is the Advanced Circulation (ADCIRC) model. The authors assimilate data obtained from Hurricanes Katrina and Ike as test cases. The results clearly show that the H∞-based SEIK filter provides more accurate short-range forecasts of storm surge compared to recently reported data assimilation results resulting from the standard SEIK filter.

  12. Improving Short-Range Ensemble Kalman Storm Surge Forecasting Using Robust Adaptive Inflation

    KAUST Repository

    Altaf, Muhammad; Butler, T.; Luo, X.; Dawson, C.; Mayo, T.; Hoteit, Ibrahim

    2013-01-01

    This paper presents a robust ensemble filtering methodology for storm surge forecasting based on the singular evolutive interpolated Kalman (SEIK) filter, which has been implemented in the framework of the H∞ filter. By design, an H∞ filter is more robust than the common Kalman filter in the sense that the estimation error in the H∞ filter has, in general, a finite growth rate with respect to the uncertainties in assimilation. The computational hydrodynamical model used in this study is the Advanced Circulation (ADCIRC) model. The authors assimilate data obtained from Hurricanes Katrina and Ike as test cases. The results clearly show that the H∞-based SEIK filter provides more accurate short-range forecasts of storm surge compared to recently reported data assimilation results resulting from the standard SEIK filter.

  13. Adaptive hybrid brain-computer interaction: ask a trainer for assistance!

    Science.gov (United States)

    Müller-Putz, Gernot R; Steyrl, David; Faller, Josef

    2014-01-01

    In applying mental imagery brain-computer interfaces (BCIs) to end users, training is a key part for novice users to get control. In general learning situations, it is an established concept that a trainer assists a trainee to improve his/her aptitude in certain skills. In this work, we want to evaluate whether we can apply this concept in the context of event-related desynchronization (ERD) based, adaptive, hybrid BCIs. Hence, in a first session we merged the features of a high aptitude BCI user, a trainer, and a novice user, the trainee, in a closed-loop BCI feedback task and automatically adapted the classifier over time. In a second session the trainees operated the system unassisted. Twelve healthy participants ran through this protocol. Along with the trainer, the trainees achieved a very high overall peak accuracy of 95.3 %. In the second session, where users operated the BCI unassisted, they still achieved a high overall peak accuracy of 83.6%. Ten of twelve first time BCI users successfully achieved significantly better than chance accuracy. Concluding, we can say that this trainer-trainee approach is very promising. Future research should investigate, whether this approach is superior to conventional training approaches. This trainer-trainee concept could have potential for future application of BCIs to end users.

  14. Computer prediction of subsurface radionuclide transport: an adaptive numerical method

    International Nuclear Information System (INIS)

    Neuman, S.P.

    1983-01-01

    Radionuclide transport in the subsurface is often modeled with the aid of the advection-dispersion equation. A review of existing computer methods for the solution of this equation shows that there is need for improvement. To answer this need, a new adaptive numerical method is proposed based on an Eulerian-Lagrangian formulation. The method is based on a decomposition of the concentration field into two parts, one advective and one dispersive, in a rigorous manner that does not leave room for ambiguity. The advective component of steep concentration fronts is tracked forward with the aid of moving particles clustered around each front. Away from such fronts the advection problem is handled by an efficient modified method of characteristics called single-step reverse particle tracking. When a front dissipates with time, its forward tracking stops automatically and the corresponding cloud of particles is eliminated. The dispersion problem is solved by an unconventional Lagrangian finite element formulation on a fixed grid which involves only symmetric and diagonal matrices. Preliminary tests against analytical solutions of one- and two-dimensional dispersion in a uniform steady state velocity field suggest that the proposed adaptive method can handle the entire range of Peclet numbers from 0 to infinity, with Courant numbers well in excess of 1

  15. Pipelining Computational Stages of the Tomographic Reconstructor for Multi-Object Adaptive Optics on a Multi-GPU System

    KAUST Repository

    Charara, Ali; Ltaief, Hatem; Gratadour, Damien; Keyes, David E.; Sevin, Arnaud; Abdelfattah, Ahmad; Gendron, Eric; Morel, Carine; Vidal, Fabrice

    2014-01-01

    called MOSAIC has been proposed to perform multi-object spectroscopy using the Multi-Object Adaptive Optics (MOAO) technique. The core implementation of the simulation lies in the intensive computation of a tomographic reconstructor (TR), which is used

  16. THE PLUTO CODE FOR ADAPTIVE MESH COMPUTATIONS IN ASTROPHYSICAL FLUID DYNAMICS

    International Nuclear Information System (INIS)

    Mignone, A.; Tzeferacos, P.; Zanni, C.; Bodo, G.; Van Straalen, B.; Colella, P.

    2012-01-01

    We present a description of the adaptive mesh refinement (AMR) implementation of the PLUTO code for solving the equations of classical and special relativistic magnetohydrodynamics (MHD and RMHD). The current release exploits, in addition to the static grid version of the code, the distributed infrastructure of the CHOMBO library for multidimensional parallel computations over block-structured, adaptively refined grids. We employ a conservative finite-volume approach where primary flow quantities are discretized at the cell center in a dimensionally unsplit fashion using the Corner Transport Upwind method. Time stepping relies on a characteristic tracing step where piecewise parabolic method, weighted essentially non-oscillatory, or slope-limited linear interpolation schemes can be handily adopted. A characteristic decomposition-free version of the scheme is also illustrated. The solenoidal condition of the magnetic field is enforced by augmenting the equations with a generalized Lagrange multiplier providing propagation and damping of divergence errors through a mixed hyperbolic/parabolic explicit cleaning step. Among the novel features, we describe an extension of the scheme to include non-ideal dissipative processes, such as viscosity, resistivity, and anisotropic thermal conduction without operator splitting. Finally, we illustrate an efficient treatment of point-local, potentially stiff source terms over hierarchical nested grids by taking advantage of the adaptivity in time. Several multidimensional benchmarks and applications to problems of astrophysical relevance assess the potentiality of the AMR version of PLUTO in resolving flow features separated by large spatial and temporal disparities.

  17. A Novel Adaptive Discrete Cuckoo Search Algorithm for parameter optimization in computer vision

    Directory of Open Access Journals (Sweden)

    loubna benchikhi

    2017-10-01

    Full Text Available Computer vision applications require choosing operators and their parameters in order to provide the best outcomes. Often, users must draw on expert knowledge and manually experiment with many combinations to find the best one. As performance, time and accuracy are important, it is necessary to automate parameter optimization, at least for crucial operators. In this paper, a novel approach based on an adaptive discrete cuckoo search algorithm (ADCS) is proposed. It automates the process of setting algorithm parameters and provides optimal parameters for vision applications. This work reconsiders a discretization problem to adapt the cuckoo search algorithm and presents the parameter optimization procedure. Experiments on real examples, and comparisons to other metaheuristic-based approaches (particle swarm optimization (PSO), reinforcement learning (RL) and ant colony optimization (ACO)), show the efficiency of this novel method.
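
    As a rough illustration of cuckoo-search-style optimization over a discrete parameter space (a generic sketch, not the paper's ADCS algorithm): candidate parameter vectors live on an integer grid, new candidates are generated by heavy-tailed Levy-like steps from the current best, and a fraction pa of the worst nests is abandoned and re-seeded each generation.

```python
import random

def discrete_cuckoo_search(cost, lo, hi, n_nests=15, pa=0.25, iters=200, seed=0):
    """Generic discrete cuckoo-search sketch: integer parameter vectors in
    [lo, hi], Levy-like integer steps, worst fraction pa abandoned per round."""
    rng = random.Random(seed)
    dim = len(lo)

    def clamp(v):
        return [min(hi[i], max(lo[i], v[i])) for i in range(dim)]

    def levy_step():
        # heavy-tailed step length (Pareto), random sign, rounded to the grid
        return int(round(rng.paretovariate(1.5))) * rng.choice((-1, 1))

    nests = [[rng.randint(lo[i], hi[i]) for i in range(dim)] for _ in range(n_nests)]
    best = min(nests, key=cost)
    for _ in range(iters):
        # lay a cuckoo egg: a Levy flight from the current best solution
        cuckoo = clamp([best[i] + levy_step() for i in range(dim)])
        j = rng.randrange(n_nests)
        if cost(cuckoo) < cost(nests[j]):
            nests[j] = cuckoo
        # abandon the worst pa fraction of nests and re-seed them randomly
        nests.sort(key=cost)
        for k in range(int((1 - pa) * n_nests), n_nests):
            nests[k] = [rng.randint(lo[i], hi[i]) for i in range(dim)]
        best = min(nests + [best], key=cost)
    return best, cost(best)

# Example: tuning two hypothetical integer vision parameters
# (say, a kernel size in [1, 15] and a threshold in [0, 255])
f = lambda p: (p[0] - 5) ** 2 + (p[1] - 40) ** 2
sol, val = discrete_cuckoo_search(f, lo=[1, 0], hi=[15, 255])
```

    The cost function here is a stand-in: in a vision application it would run the operator with the candidate parameters and score the output, which is why the approach is described as black-box.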

  18. AdapterRemoval v2

    DEFF Research Database (Denmark)

    Schubert, Mikkel; Lindgreen, Stinus; Orlando, Ludovic Antoine Alexandre

    2016-01-01

    BACKGROUND: As high-throughput sequencing platforms produce longer and longer reads, sequences generated from short inserts, such as those obtained from fossil and degraded material, are increasingly expected to contain adapter sequences. Efficient adapter trimming algorithms are also needed to p...

  19. Identifying Students at Risk: An Examination of Computer-Adaptive Measures and Latent Class Growth Analysis

    Science.gov (United States)

    Keller-Margulis, Milena; McQuillin, Samuel D.; Castañeda, Juan Javier; Ochs, Sarah; Jones, John H.

    2018-01-01

    Multitiered systems of support depend on screening technology to identify students at risk. The purpose of this study was to examine the use of a computer-adaptive test and latent class growth analysis (LCGA) to identify students at risk in reading with focus on the use of this methodology to characterize student performance in screening.…

  20. COMPUTER VISION SYNDROME: A SHORT REVIEW.

    OpenAIRE

    Sameena; Mohd Inayatullah

    2012-01-01

    Computers are probably one of the biggest scientific inventions of the modern era, and they have become an integral part of our lives. The increased usage of computers has led to a variety of ocular symptoms, which include eye strain, tired eyes, irritation, redness, blurred vision, and diplopia, collectively referred to as Computer Vision Syndrome (CVS). CVS may have a significant impact not only on visual comfort but also occupational productivit...

  1. Adaptive user interfaces

    CERN Document Server

    1990-01-01

    This book describes techniques for designing and building adaptive user interfaces developed in the large AID project undertaken by the contributors.Key Features* Describes one of the few large-scale adaptive interface projects in the world* Outlines the principles of adaptivity in human-computer interaction

  2. Modeling adaptive and non-adaptive responses to environmental change

    DEFF Research Database (Denmark)

    Coulson, Tim; Kendall, Bruce E; Barthold, Julia A.

    2017-01-01

    , with plastic responses being either adaptive or non-adaptive. We develop an approach that links quantitative genetic theory with data-driven structured models to allow prediction of population responses to environmental change via plasticity and adaptive evolution. After introducing general new theory, we...... construct a number of example models to demonstrate that evolutionary responses to environmental change over the short-term will be considerably slower than plastic responses, and that the rate of adaptive evolution to a new environment depends upon whether plastic responses are adaptive or non-adaptive....... Parameterization of the models we develop requires information on genetic and phenotypic variation and demography that will not always be available, meaning that simpler models will often be required to predict responses to environmental change. We consequently develop a method to examine whether the full...

  3. Adaptive phase measurements in linear optical quantum computation

    International Nuclear Information System (INIS)

    Ralph, T C; Lund, A P; Wiseman, H M

    2005-01-01

    Photon counting induces an effective non-linear optical phase shift in certain states derived by linear optics from single photons. Although this non-linearity is non-deterministic, it is sufficient in principle to allow scalable linear optics quantum computation (LOQC). The most obvious way to encode a qubit optically is as a superposition of the vacuum and a single photon in one mode, so-called 'single-rail' logic. Until now this approach was thought to be prohibitively expensive (in resources) compared to 'dual-rail' logic, where a qubit is stored by a photon across two modes. Here we attack this problem with real-time feedback control, which can realize a quantum-limited phase measurement on a single mode, as has been recently demonstrated experimentally. We show that with this added measurement resource, the resource requirements for single-rail LOQC are not substantially different from those of dual-rail LOQC. In particular, with adaptive phase measurements an arbitrary qubit state α|0⟩ + β|1⟩ can be prepared deterministically

  4. Computerized adaptive testing in computer assisted learning?

    NARCIS (Netherlands)

    Veldkamp, Bernard P.; Matteucci, Mariagiulia; Eggen, Theodorus Johannes Hendrikus Maria; De Wannemacker, Stefan; Clarebout, Geraldine; De Causmaecker, Patrick

    2011-01-01

    A major goal in computerized learning systems is to optimize learning, while in computerized adaptive tests (CAT) efficient measurement of the proficiency of students is the main focus. There seems to be a common interest to integrate computerized adaptive item selection in learning systems and

  5. An Adaptive and Integrated Low-Power Framework for Multicore Mobile Computing

    Directory of Open Access Journals (Sweden)

    Jongmoo Choi

    2017-01-01

    Full Text Available Employing multicore processors in mobile computing, such as smartphones and IoT (Internet of Things) devices, is a double-edged sword. It provides the ample computing capabilities required in recent intelligent mobile services including voice recognition, image processing, big data analysis, and deep learning. However, it consumes a great deal of power, creating thermal hot spots and putting pressure on the energy resources of a mobile device. In this paper, we propose a novel framework that integrates two well-known low-power techniques, DPM (Dynamic Power Management) and DVFS (Dynamic Voltage and Frequency Scaling), for energy efficiency in multicore mobile systems. The key feature of the proposed framework is adaptability. By monitoring online resource usage such as CPU utilization and power consumption, the framework can orchestrate diverse DPM and DVFS policies according to workload characteristics. Experiments on real implementations using three mobile devices have shown that it can reduce power consumption by 22% to 79%, while negligibly affecting the performance of workloads.
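
    A minimal sketch of how a DPM and DVFS decision might be orchestrated from monitored load, under assumed thresholds and an assumed frequency table (none of these values come from the paper):

```python
# Toy adaptive governor: sample recent CPU utilization, then pick a DVFS
# frequency for busy periods and a DPM idle state for idle ones.
FREQS_MHZ = [600, 1000, 1400, 1800]  # assumed per-core DVFS steps

def select_policy(utilization, idle_ms):
    """Map observed load to a (frequency_mhz, idle_state) pair.

    utilization: fraction of the last window the core was busy (0..1)
    idle_ms: length of the current idle period in milliseconds
    """
    if utilization < 0.05:
        # DPM decision: long idle justifies a deeper (slower-to-wake) state
        return FREQS_MHZ[0], ("C2" if idle_ms > 50 else "C1")
    # DVFS decision: scale frequency with load, leaving headroom for bursts
    idx = min(len(FREQS_MHZ) - 1, int(utilization * len(FREQS_MHZ)))
    return FREQS_MHZ[idx], "C0"
```

    A real governor would additionally weigh the wake-up latency and transition energy of deep idle states against the expected idle-period length, which is what makes integrating DPM with DVFS nontrivial.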

  6. Adaptive Lighting

    DEFF Research Database (Denmark)

    Petersen, Kjell Yngve; Søndergaard, Karin; Kongshaug, Jesper

    2015-01-01

    Adaptive Lighting Adaptive lighting is based on a partial automation of the possibilities to adjust the colour tone and brightness levels of light in order to adapt to people’s needs and desires. IT support is key to the technical developments that afford adaptive control systems. The possibilities...... offered by adaptive lighting control are created by the ways that the system components, the network and data flow can be coordinated through software so that the dynamic variations are controlled in ways that meaningfully adapt according to people’s situations and design intentions. This book discusses...... differently into an architectural body. We also examine what might occur when light is dynamic and able to change colour, intensity and direction, and when it is adaptive and can be brought into interaction with its surroundings. In short, what happens to an architectural space when artificial lighting ceases...

  7. Adaptation of OCA-P, a probabilistic fracture-mechanics code, to a personal computer

    International Nuclear Information System (INIS)

    Ball, D.G.; Cheverton, R.D.

    1985-01-01

    The OCA-P probabilistic fracture-mechanics code can now be executed on a personal computer with 512 kilobytes of memory, a math coprocessor, and a hard disk. A user's guide for the particular adaptation has been prepared, and additional importance sampling techniques for OCA-P have been developed that allow the sampling of only the tails of selected distributions. Features have also been added to OCA-P that permit RTNDT to be used as an ''independent'' variable in the calculation of P

  8. Power Spectral Analysis of Short-Term Heart Rate Variability in Healthy and Arrhythmia Subjects by the Adaptive Continuous Morlet Wavelet Transform

    Directory of Open Access Journals (Sweden)

    Ram Sewak SINGH

    2017-12-01

    Power spectral analysis of short-term heart rate variability (HRV) can provide instant, valuable information for understanding the functioning of autonomic control over the cardiovascular system. In this study, an adaptive continuous Morlet wavelet transform (ACMWT) method has been used to describe the time-frequency characteristics of HRV using band power spectra and the median value of the interquartile range. Adaptation of the method was based on the measurement of maximum energy concentration. The ACMWT was validated on synthetic signals (i.e. stationary and non-stationary signals with slowly varying and fast-changing frequency over time), modeled to be closest to the dynamic changes in HRV signals. The method was also tested in the presence of additive white Gaussian noise (AWGN) to show its robustness to noise. From the results of testing on synthetic signals, the ACMWT was found to be an enhanced energy concentration estimator for the assessment of power spectra of short-term HRV time series compared to the adaptive Stockwell transform (AST), adaptive modified Stockwell transform (AMST), standard continuous Morlet wavelet transform (CMWT), and Stockwell transform (ST) estimators at a statistical significance level of 5%. Further, the ACMWT was applied to real HRV data from the Fantasia and MIT-BIH databases, grouped into healthy young group (HYG), healthy elderly group (HEG), arrhythmia controlled medication group (ARCMG), and supraventricular tachycardia group (SVTG) subjects. The global results demonstrate that the spectral indices of low-frequency power (LFp) and high-frequency power (HFp) of HRV were decreased in HEG compared to HYG subjects (p<0.0001), while the LFp and HFp indices were increased in ARCMG compared to HEG (p<0.00001). The LFp and HFp components of HRV obtained from SVTG were reduced compared to the other groups (p<0.00001).
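
    A minimal non-adaptive sketch of the underlying computation (not the authors' ACMWT): band power of an evenly resampled RR series via a continuous Morlet wavelet transform implemented as direct convolution. The centre-frequency parameter w0 stands in for the quantity the paper tunes adaptively to maximise energy concentration.

```python
import numpy as np

def morlet_cwt(x, fs, freqs, w0=6.0):
    """Complex Morlet CWT of x (sampled at fs Hz) at the given frequencies."""
    x = np.asarray(x, float) - np.mean(x)
    out = np.empty((len(freqs), len(x)), complex)
    for i, f in enumerate(freqs):
        scale = w0 / (2 * np.pi * f)            # scale giving centre frequency f
        t = np.arange(-4 * scale, 4 * scale + 1 / fs, 1 / fs)
        wav = np.exp(1j * w0 * t / scale) * np.exp(-0.5 * (t / scale) ** 2)
        wav /= np.sqrt(scale)                    # energy normalisation
        out[i] = np.convolve(x, np.conj(wav)[::-1], mode="same")
    return out

def band_power(x, fs, lo, hi, n=20):
    """Mean squared CWT magnitude over a frequency band [lo, hi] Hz."""
    freqs = np.linspace(lo, hi, n)
    return np.mean(np.abs(morlet_cwt(x, fs, freqs)) ** 2)
```

    With the conventional HRV bands, LF is `band_power(x, fs, 0.04, 0.15)` and HF is `band_power(x, fs, 0.15, 0.4)`.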

  9. Robust Adaptive LCMV Beamformer Based On An Iterative Suboptimal Solution

    Directory of Open Access Journals (Sweden)

    Xiansheng Guo

    2015-06-01

    The main drawback of the closed-form solution of the linearly constrained minimum variance (CF-LCMV) beamformer is the dilemma between acquiring a long observation time for stable covariance matrix estimates and a short observation time to track the dynamic behavior of targets, leading to poor performance under low signal-to-noise ratio (SNR), low jammer-to-noise ratios (JNRs), and small numbers of snapshots. Additionally, CF-LCMV suffers from a heavy computational burden, which mainly comes from the two matrix inverse operations needed to compute the optimal weight vector. In this paper, we derive a low-complexity Robust Adaptive LCMV beamformer based on an Iterative Suboptimal solution (RAIS-LCMV) using the conjugate gradient (CG) optimization method. The merit of our proposed method is threefold. Firstly, the RAIS-LCMV beamformer reduces the complexity of CF-LCMV remarkably. Secondly, the RAIS-LCMV beamformer can adjust its output adaptively based on the measurements, with comparable convergence speed. Finally, the RAIS-LCMV algorithm is robust against low SNR, low JNRs, and small numbers of snapshots. Simulation results demonstrate the superiority of our proposed algorithm.
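
    The core idea of replacing matrix inversions with CG iterations can be sketched as follows (this is an illustrative reconstruction, not the authors' RAIS-LCMV): apply R⁻¹ via conjugate-gradient solves when evaluating the closed-form weights w = R⁻¹C (Cᴴ R⁻¹ C)⁻¹ f.

```python
import numpy as np

def cg_solve(A, b, iters=50, tol=1e-10):
    """CG for a Hermitian positive-definite A; returns x with A x ≈ b."""
    x = np.zeros_like(b)
    r = b - A @ x
    p = r.copy()
    rs = np.vdot(r, r)
    for _ in range(iters):
        Ap = A @ p
        alpha = rs / np.vdot(p, Ap)
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = np.vdot(r, r)
        if np.sqrt(abs(rs_new)) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

def lcmv_weights(R, C, f, iters=50):
    """LCMV weights with R^{-1} applied by CG; only a small LxL solve remains."""
    RinvC = np.column_stack([cg_solve(R, C[:, j], iters) for j in range(C.shape[1])])
    small = C.conj().T @ RinvC            # C^H R^{-1} C, small L x L matrix
    return RinvC @ np.linalg.solve(small, f)
```

    For an N-element array with L constraints (L « N), this trades the N×N inversion for a few matrix-vector products per CG iteration.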

  10. Cross-Cultural Adaptation and Psychometric Properties of the Malay Version of the Short Sensory Profile.

    Science.gov (United States)

    Ee, Su Im; Loh, Siew Yim; Chinna, Karuthan; Marret, Mary J

    2016-01-01

    To translate, culturally adapt, and examine the psychometric properties of the Malay version of the Short Sensory Profile (SSP-M). Pretesting (n = 30) of the original English SSP established its applicability for use with Malaysian children aged 3-10 years. This was followed by the translation and cross-cultural adaptation of the SSP-M. Two forward and two back translations were compared and reviewed by a committee of 10 experts who validated the content of the SSP-M before pilot testing (n = 30). The final SSP-M questionnaire was completed by 419 parents of typically developing children aged 3-10 years. Cronbach's alpha for each section of the SSP-M ranged from 0.73 to 0.93, and the intraclass correlation coefficient (ICC) indicated good reliability (0.62-0.93). The seven-factor model of the SSP-M had an adequate fit, with evidence of convergent and discriminant validity. We conclude that the SSP-M is a valid and reliable screening tool for use in Malaysia with Malay-speaking parents of children aged 3-10 years. The SSP-M enables Malay-speaking parents to answer the questionnaire with better reliability, and provides occupational therapists with a valid tool to screen for sensory processing difficulties.

  11. Adaptive Time Stepping for Transient Network Flow Simulation in Rocket Propulsion Systems

    Science.gov (United States)

    Majumdar, Alok K.; Ravindran, S. S.

    2017-01-01

    Fluid and thermal transients found in rocket propulsion systems, such as propellant feedline systems, are complex processes involving fast phases followed by slow phases. Their time-accurate computation therefore requires the use of a short time step initially, followed by a much larger time step. Yet there are instances that involve fast-slow-fast phases. In this paper, we present a feedback-control-based adaptive time stepping algorithm and discuss its use in network flow simulation of fluid and thermal transients. The time step is automatically controlled during the simulation by monitoring changes in certain key variables and by feedback. In order to demonstrate the viability of time adaptivity for engineering problems, we applied it to simulate water hammer and cryogenic chilldown in pipelines. Our comparison and validation demonstrate the accuracy and efficiency of this adaptive strategy.
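
    The feedback-controlled step-size idea can be sketched generically (this is a textbook-style illustration, not the authors' network-flow code): estimate the local error by step doubling, accept the step when the error is within tolerance, and let a proportional controller grow the step in slow phases and shrink it in fast ones.

```python
def integrate_adaptive(f, y0, t0, t1, dt=1e-3, tol=1e-6,
                       dt_min=1e-10, dt_max=0.5):
    """Integrate dy/dt = f(t, y) with Euler steps and adaptive dt."""
    t, y, steps = t0, y0, 0
    while t < t1:
        dt = min(dt, t1 - t)
        full = y + dt * f(t, y)                          # one full step
        half = y + 0.5 * dt * f(t, y)
        half = half + 0.5 * dt * f(t + 0.5 * dt, half)   # two half steps
        err = abs(full - half) + 1e-30                   # local error estimate
        if err <= tol:                                   # accept the step
            t, y, steps = t + dt, 2 * half - full, steps + 1
        # proportional feedback: grow dt when the solution is slow,
        # shrink it when the error estimate exceeds the tolerance
        dt = max(dt_min, min(dt_max, 0.9 * dt * (tol / err) ** 0.5))
    return y, steps
```

    The accepted value `2*half - full` is the Richardson extrapolation of the two estimates; the exponent 0.5 matches the O(dt²) error estimate of step doubling with Euler steps.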

  12. Effects of Short-Interval and Long-Interval Swimming Protocols on Performance, Aerobic Adaptations, and Technical Parameters: A Training Study.

    Science.gov (United States)

    Dalamitros, Athanasios A; Zafeiridis, Andreas S; Toubekis, Argyris G; Tsalis, George A; Pelarigo, Jailton G; Manou, Vasiliki; Kellis, Spiridon

    2016-10-01

    Dalamitros, AA, Zafeiridis, AS, Toubekis, AG, Tsalis, GA, Pelarigo, JG, Manou, V, and Kellis, S. Effects of short-interval and long-interval swimming protocols on performance, aerobic adaptations, and technical parameters: A training study. J Strength Cond Res 30(10): 2871-2879, 2016. This study compared two interval swimming training programs of different work interval durations, matched for total distance and exercise intensity, on swimming performance, aerobic adaptations, and technical parameters. Twenty-four former swimmers were equally divided into a short-interval training group (INT50, 12-16 × 50 m with 15 seconds rest), a long-interval training group (INT100, 6-8 × 100 m with 30 seconds rest), and a control group (CON). The 2 experimental groups followed the specified swimming training program for 8 weeks. Before and after training, swimming performance, technical parameters, and indices of aerobic adaptations were assessed. INT50 and INT100 improved swimming performance in the 100 and 400-m tests and the maximal aerobic speed (p ≤ 0.05); performance in the 50-m swim did not change. Posttraining V[Combining Dot Above]O2max values were higher compared with pretraining values in both training groups (p ≤ 0.05), whereas peak aerobic power output increased only in INT100 (p ≤ 0.05). The 1-minute heart rate and blood lactate recovery values decreased after training in both groups (p ≤ 0.05); no changes were observed in stroke rate after training. Comparisons between groups on posttraining mean values, after adjusting for pretraining values, revealed no significant differences between INT50 and INT100 for all variables; however, all measures were improved vs. the respective values in the CON.

  13. Computer adaptive test performance in children with and without disabilities: prospective field study of the PEDI-CAT

    NARCIS (Netherlands)

    Dumas, H.M.; Fragala-Pinkham, M.A.; Haley, S.M.; Ni, P.; Coster, W.; Kramer, J.M.; Kao, Y.C.; Moed, R.; Ludlow, L.H.

    2012-01-01

    PURPOSE: To examine the discriminant validity, test-retest reliability, administration time and acceptability of the pediatric evaluation of disability inventory computer adaptive test (PEDI-CAT). METHODS: A sample of 102 parents of children 3 through 20 years of age with (n = 50) and without (n =

  14. Turning the Page on Pen-and-Paper Questionnaires: Combining Ecological Momentary Assessment and Computer Adaptive Testing to Transform Psychological Assessment in the 21st Century.

    Science.gov (United States)

    Gibbons, Chris J

    2016-01-01

    The current paper describes new opportunities for patient-centred assessment methods which have come about through the increased adoption of affordable smart technologies in biopsychosocial research and medical care. In this commentary, we review modern assessment methods including item response theory (IRT), computer adaptive testing (CAT), and ecological momentary assessment (EMA) and explain how these methods may be combined to improve psychological assessment. We demonstrate both how a 'naïve' selection of a small group of items in an EMA can lead to unacceptably unreliable assessments, and how IRT can provide detailed information on the contribution of each individual item, thus allowing short-form assessments to be selected with acceptable reliability. The combination of CAT and IRT can ensure assessments are precise, efficient, and well targeted to the individual, allowing EMAs to be both brief and accurate.
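
    The IRT-driven item selection at the heart of CAT can be sketched with a two-parameter logistic (2PL) model: administer the unused item with maximum Fisher information at the current ability estimate. The item parameters below are invented for illustration.

```python
import math

def p_correct(theta, a, b):
    """2PL probability of a correct response at ability theta."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def item_information(theta, a, b):
    """Fisher information of a 2PL item at ability theta: a^2 p (1 - p)."""
    p = p_correct(theta, a, b)
    return a * a * p * (1.0 - p)

def next_item(theta, items, used):
    """items: list of (discrimination a, difficulty b) pairs.
    Return the index of the most informative unused item."""
    best, best_info = None, -1.0
    for i, (a, b) in enumerate(items):
        if i in used:
            continue
        info = item_information(theta, a, b)
        if info > best_info:
            best, best_info = i, info
    return best
```

    Information peaks where difficulty matches ability, which is why a CAT converges on well-targeted items and can match the reliability of a much longer fixed form.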

  15. Short bowel syndrome

    International Nuclear Information System (INIS)

    Engels, L.G.J.B.

    1983-01-01

    This thesis describes some aspects of short bowel syndrome. When approximately 1 m or less of small bowel is retained after extensive resection, a condition called short bowel syndrome is present. Since the advent of parenteral nutrition, the prognosis of patients with a very short bowel has dramatically improved. Patients with 40 to 100 cm of remaining jejunum and/or ileum can generally be maintained on oral nutrition due to increased absorption of the small bowel remnant as a result of intestinal adaptation. This study reports clinical, biochemical and nutritional aspects of short bowel patients on oral or parenteral nutrition, emphasizing data on absorption of various nutrients and on bone metabolism. Furthermore, some technical aspects concerning long-term parenteral nutrition are discussed. (Auth.)

  16. WRF4G project: Adaptation of WRF Model to Distributed Computing Infrastructures

    Science.gov (United States)

    Cofino, Antonio S.; Fernández Quiruelas, Valvanuz; García Díez, Markel; Blanco Real, Jose C.; Fernández, Jesús

    2013-04-01

    Nowadays Grid computing is a powerful computational tool ready to be used by the scientific community in different areas (such as biomedicine, astrophysics, climate, etc.). However, the use of these distributed computing infrastructures (DCIs) is not yet common practice in climate research, and only a few teams and applications in this area take advantage of them. Thus, the first objective of this project is to popularize the use of this technology in the atmospheric sciences area. In order to achieve this objective, one of the most used applications has been taken (WRF, a limited-area model and successor of the MM5 model), which has a user community of more than 8000 researchers worldwide. This community develops its research activity in different areas and could benefit from the advantages of Grid resources (case-study simulations, regional hindcasts/forecasts, sensitivity studies, etc.). The WRF model is also used as input by the energy and natural-hazards communities, which will therefore benefit as well. However, Grid infrastructures have some drawbacks for the execution of applications that make intensive use of CPU and memory for a long period of time. This makes it necessary to develop a specific framework (middleware). This middleware encapsulates the application and provides appropriate services for the monitoring and management of the jobs and the data. Thus, the second objective of the project consists of the development of a generic adaptation of WRF for the Grid (WRF4G), to be distributed as open source and to be integrated in the official WRF development cycle. The use of this WRF adaptation should be transparent and useful for facing any of the previously described studies, and should avoid the problems of the Grid infrastructure. Moreover, it should simplify access to the Grid infrastructures for research teams, and also free them from the technical and computational aspects of the use of the Grid.
Finally, in order to

  17. Health adaptation policy for climate vulnerable groups: a 'critical computational linguistics' analysis.

    Science.gov (United States)

    Seidel, Bastian M; Bell, Erica

    2014-11-28

    Many countries are developing or reviewing national adaptation policy for climate change, but the extent to which these policies meet the health needs of vulnerable groups has not been assessed. This study examines the adequacy of such policies for nine known climate-vulnerable groups: people with mental health conditions, Aboriginal people, culturally and linguistically diverse groups, aged people, people with disabilities, rural communities, children, women, and socioeconomically disadvantaged people. The study analyses an exhaustive sample of national adaptation policy documents from Annex 1 ('developed') countries of the United Nations Framework Convention on Climate Change: 20 documents from 12 countries. A 'critical computational linguistics' method was used, involving novel software-driven quantitative mapping and traditional critical discourse analysis. The study finds that references to vulnerable groups are sparse or non-existent, and are poorly connected to language about practical strategies and socio-economic contexts, both of which are also scarce. The conclusions offer strategies for developing policy that is better informed by a 'social determinants of health' definition of climate vulnerability, consistent with best practice in the literature and global policy prescriptions.

  18. Short Paper and Poster Proceedings of the 22nd Annual Conference on Computer Animation and Social Agents

    NARCIS (Netherlands)

    Nijholt, Antinus; Egges, A.; van Welbergen, H.; Hondorp, G.H.W.

    2009-01-01

    These are the proceedings containing the short and poster papers of CASA 2009, the twenty second international conference on Computer Animation and Social Agents. CASA 2009 was organized in Amsterdam, the Netherlands from the 17th to the 19th of June 2009. CASA is organized under the auspices of the

  19. An effective algorithm for approximating adaptive behavior in seasonal environments

    DEFF Research Database (Denmark)

    Sainmont, Julie; Andersen, Ken Haste; Thygesen, Uffe Høgsbro

    2015-01-01

    Behavior affects most aspects of ecological processes and rates, and yet modeling frameworks which efficiently predict and incorporate behavioral responses into ecosystem models remain elusive. Behavioral algorithms based on life-time optimization, adaptive dynamics or game theory are unsuited...... for large global models because of their high computational demand. We compare an easily integrated, computationally efficient behavioral algorithm known as Gilliam's rule against the solution from a life-history optimization. The approximation takes into account only the current conditions to optimize...... behavior; the so-called "myopic approximation", "short sighted", or "static optimization". We explore the performance of the myopic approximation with diel vertical migration (DVM) as an example of a daily routine, a behavior with seasonal dependence that trades off predation risk with foraging...
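
    Gilliam's rule, the myopic approximation discussed above, can be stated in a few lines: at each instant, choose the habitat minimising the ratio of mortality risk to energetic gain, using only current conditions. The mortality and gain profiles below are illustrative values, not from the study.

```python
def gilliam_choice(mortality, gain):
    """Return the index of the habitat minimising mortality[i] / gain[i]
    (all gains assumed positive)."""
    ratios = [m / g for m, g in zip(mortality, gain)]
    return ratios.index(min(ratios))

# Illustrative diel vertical migration (DVM): food is concentrated near
# the surface, but daylight makes shallow water risky.
day_mortality   = [0.9, 0.4, 0.1]    # surface, mid-water, deep
night_mortality = [0.15, 0.12, 0.1]
gain            = [1.0, 0.5, 0.2]
```

    Evaluated per time step, the rule reproduces the qualitative DVM pattern: deep refuge by day, surface foraging by night, without solving a full life-history optimization.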

  20. Short-Term Intercultural Psychotherapy: Ethnographic Inquiry

    Science.gov (United States)

    Seeley, Karen M.

    2004-01-01

    This article examines the challenges specific to short-term intercultural treatments and recently developed approaches to intercultural treatments based on notions of cultural knowledge and cultural competence. The article introduces alternative approaches to short-term intercultural treatments based on ethnographic inquiry adapted for clinical…

  1. Quality Assurance Challenges for Motion-Adaptive Radiation Therapy: Gating, Breath Holding, and Four-Dimensional Computed Tomography

    International Nuclear Information System (INIS)

    Jiang, Steve B.; Wolfgang, John; Mageras, Gig S.

    2008-01-01

    Compared with conventional three-dimensional (3D) conformal radiation therapy and intensity-modulated radiation therapy treatments, quality assurance (QA) for motion-adaptive radiation therapy involves various challenges because of the added temporal dimension. Here we discuss those challenges for three specific techniques related to motion-adaptive therapy: namely respiratory gating, breath holding, and four-dimensional computed tomography. Similar to the introduction of any other new technologies in clinical practice, typical QA measures should be taken for these techniques also, including initial testing of equipment and clinical procedures, as well as frequent QA examinations during the early stage of implementation. Here, rather than covering every QA aspect in depth, we focus on some major QA challenges. The biggest QA challenge for gating and breath holding is how to ensure treatment accuracy when internal target position is predicted using external surrogates. Recommended QA measures for each component of treatment, including simulation, planning, patient positioning, and treatment delivery and verification, are discussed. For four-dimensional computed tomography, some major QA challenges have also been discussed

  2. Human spaceflight and space adaptations: Computational simulation of gravitational unloading on the spine

    Science.gov (United States)

    Townsend, Molly T.; Sarigul-Klijn, Nesrin

    2018-04-01

    Living in reduced gravitational environments for a prolonged duration, such as a fly-by mission to Mars or an extended stay at the International Space Station, affects the human body - in particular, the spine. As the spine adapts to spaceflight, morphological and physiological changes cause the mechanical integrity of the spinal column to be compromised, potentially endangering internal organs, nervous health, and human body mechanical function. Therefore, a high-fidelity computational model and simulation of the whole human spine was created and validated for the purpose of investigating the mechanical integrity of the spine in crew members during exploratory space missions. A spaceflight-exposed spine has been developed through the adaptation of a three-dimensional nonlinear finite element model, with the updated Lagrangian formulation, of a healthy ground-based human spine in vivo. Simulation of the porohyperelastic response of the intervertebral disc to mechanical unloading resulted in a model capable of accurately predicting spinal swelling/lengthening, spinal motion, and internal stress distribution. The curvature of this space-adaptation-exposed spine model was compared to a control terrestrial-based finite element model, indicating how the shape changed. Finally, potential injury sites for crew members are predicted for a typical 9-day mission.

  3. Online adaptation of a c-VEP Brain-computer Interface(BCI) based on error-related potentials and unsupervised learning.

    Science.gov (United States)

    Spüler, Martin; Rosenstiel, Wolfgang; Bogdan, Martin

    2012-01-01

    The goal of a Brain-Computer Interface (BCI) is to control a computer by pure brain activity. Recently, BCIs based on code-modulated visual evoked potentials (c-VEPs) have shown great potential to establish high-performance communication. In this paper we present a c-VEP BCI that uses online adaptation of the classifier to reduce calibration time and increase performance. We compare two different approaches for online adaptation of the system: an unsupervised method and a method that uses the detection of error-related potentials. Both approaches were tested in an online study, in which an average accuracy of 96% was achieved with adaptation based on error-related potentials. This accuracy corresponds to an average information transfer rate of 144 bit/min, which is the highest bitrate reported so far for a non-invasive BCI. In a free-spelling mode, the subjects were able to write with an average of 21.3 error-free letters per minute, which shows the feasibility of the BCI system in a normal-use scenario. In addition we show that a calibration of the BCI system solely based on the detection of error-related potentials is possible, without knowing the true class labels.

  4. A Combined Methodology of Adaptive Neuro-Fuzzy Inference System and Genetic Algorithm for Short-term Energy Forecasting

    Directory of Open Access Journals (Sweden)

    KAMPOUROPOULOS, K.

    2014-02-01

    This document presents an energy forecast methodology using an Adaptive Neuro-Fuzzy Inference System (ANFIS) and Genetic Algorithms (GA). The GA has been used for the selection of the training inputs of the ANFIS in order to minimize the training error. The presented algorithm has been installed and is operating in an automotive manufacturing plant. It periodically communicates with the plant to obtain new information and update the database in order to improve its training results. Finally, the obtained results of the algorithm are used to provide short-term load forecasts for the different modeled consumption processes.
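
    The GA-for-input-selection idea can be sketched as a search over bitmasks, where each bit switches one candidate input on or off. In the paper the fitness is the ANFIS training error; here a caller-supplied `fitness` function stands in so the sketch stays self-contained, and all GA parameters are illustrative.

```python
import random

def evolve_mask(n_inputs, fitness, pop_size=20, gens=30, seed=1):
    """Minimise fitness(mask) over 0/1 input masks with a simple elitist GA."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_inputs)] for _ in range(pop_size)]
    pop[0] = [1] * n_inputs                      # seed with the all-inputs mask
    for _ in range(gens):
        pop.sort(key=fitness)                    # best masks first
        elite = pop[: pop_size // 2]             # elitism: keep the top half
        children = []
        for _ in range(pop_size - len(elite)):
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, n_inputs)
            child = a[:cut] + b[cut:]            # one-point crossover
            i = rng.randrange(n_inputs)
            child[i] ^= 1                        # point mutation
            children.append(child)
        pop = elite + children
    return min(pop, key=fitness)
```

    Because the all-inputs mask is in the initial population and elites are preserved, the selected subset can never score worse than simply using every input.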

  5. Short-term adaptations as a response to travel time increases: results of a stated adaptation experiment

    NARCIS (Netherlands)

    Psarra, I.; Arentze, T.A.; Timmermans, H.J.P.

    2016-01-01

    This study focused on short-term dynamics of activity-travel behavior as a response to travel time increases. It is assumed that short-term changes are triggered by stress, which is defined as the deviation between an individual’s aspirations and his or her daily experiences. When stress exceeds a

  6. Adaptation of the radiation dose for computed tomography of the body - background for the dose adaptation programme OmnimAs

    International Nuclear Information System (INIS)

    Nyman, Ulf; Kristiansson, Mattias; Leitz, Wolfram; Paahlstorp, Per-Aake

    2004-11-01

    When performing computed tomography examinations, the exposure factors are hardly ever adapted to the patient's size. One reason for that might be the lack of simple methods. In this report the computer programme OmnimAs is described, which calculates how the exposure factors should be varied with the patient's perimeter (which can easily be measured with a measuring tape). The first approximation is to calculate the exposure values giving the same noise level in the image irrespective of the patient's size. A clinical evaluation has shown that this relationship has to be modified. One chapter describes the physical background behind the programme. Results calculated with OmnimAs are in good agreement with a number of published studies. Clinical experience shows the usability of OmnimAs. Finally, the correlation between several parameters and image quality/dose is discussed, and how this correlation can be exploited for optimising CT examinations
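
    The constant-noise first approximation follows from simple attenuation physics, sketched below. This is a back-of-the-envelope illustration, not OmnimAs itself: image noise scales roughly as 1/sqrt(detected photons), transmission falls as exp(-mu·d) with water-equivalent diameter d ≈ perimeter/pi, so keeping noise constant requires the mAs to grow exponentially with perimeter. The value of `mu_eff` is an assumed effective attenuation coefficient.

```python
import math

def mas_for_constant_noise(perimeter_cm, ref_perimeter_cm, ref_mas,
                           mu_eff=0.18):
    """mAs keeping image noise roughly constant as patient girth changes.

    Assumes a circular water-equivalent cross-section (d = P / pi) and an
    effective attenuation coefficient mu_eff (1/cm), both illustrative."""
    d = perimeter_cm / math.pi
    d_ref = ref_perimeter_cm / math.pi
    return ref_mas * math.exp(mu_eff * (d - d_ref))
```

    With these assumptions, the required mAs roughly doubles for every ~12 cm of extra perimeter, which is why a clinically tuned relationship (as in the report) ends up flatter than the pure constant-noise curve.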

  7. Resource-adaptive cognitive processes

    CERN Document Server

    Crocker, Matthew W

    2010-01-01

    This book investigates the adaptation of cognitive processes to limited resources. The central topics of this book are heuristics considered as results of the adaptation to resource limitations, through natural evolution in the case of humans, or through artificial construction in the case of computational systems; the construction and analysis of resource control in cognitive processes; and an analysis of resource-adaptivity within the paradigm of concurrent computation. The editors integrated the results of a collaborative 5-year research project that involved over 50 scientists. After a mot

  8. Self-adaptive method to distinguish inner and outer contours of industrial computed tomography image for rapid prototype

    International Nuclear Information System (INIS)

    Duan Liming; Ye Yong; Zhang Xia; Zuo Jian

    2013-01-01

    A self-adaptive identification method is proposed for more accurate and efficient judgment of the inner and outer contours of industrial computed tomography (CT) slice images. The convexity-concavity of the single-pixel-wide closed contour is first identified with the angle method. Then, contours with concave vertices are distinguished as inner or outer contours with the ray method, and contours without concave vertices are distinguished with the extreme-coordinate-value method. The distinguishing method is thus chosen automatically by identifying the convexity and concavity of each contour. In this way, the disadvantages of the individual distinguishing methods, such as the long runtime of the ray method and the fallibility of the extreme-coordinate-value method, can be avoided. The experiments prove the adaptability, efficiency, and accuracy of the self-adaptive method. (authors)
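
    A simplified sketch of the ray method mentioned above: cast a horizontal ray from a representative point of one contour and count crossings with another closed contour; an odd count means the point, and hence the contour, lies inside, i.e. it is an inner contour. The square contours are made-up test data.

```python
def point_in_contour(pt, contour):
    """Even-odd ray casting; contour is a list of (x, y) vertices of a
    closed polygonal contour."""
    x, y = pt
    inside = False
    n = len(contour)
    for i in range(n):
        (x1, y1), (x2, y2) = contour[i], contour[(i + 1) % n]
        if (y1 > y) != (y2 > y):                 # edge straddles the ray's y
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x_cross > x:                      # crossing to the right of pt
                inside = not inside
    return inside

outer = [(0, 0), (10, 0), (10, 10), (0, 10)]     # outer boundary contour
inner = [(4, 4), (6, 4), (6, 6), (4, 6)]         # hole inside it
```

    Running this test for every contour pair is what makes the ray method slow on large slices, which is why the self-adaptive scheme reserves it for contours with concave vertices.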

  9. Adaptive Lighting

    DEFF Research Database (Denmark)

    Petersen, Kjell Yngve; Søndergaard, Karin; Kongshaug, Jesper

    2015-01-01

    the investigations of lighting scenarios carried out in two test installations: White Cube and White Box. The test installations are discussed as large-scale experiential instruments. In these test installations we examine what could potentially occur when light using LED technology is integrated and distributed......Adaptive Lighting Adaptive lighting is based on a partial automation of the possibilities to adjust the colour tone and brightness levels of light in order to adapt to people’s needs and desires. IT support is key to the technical developments that afford adaptive control systems. The possibilities...... differently into an architectural body. We also examine what might occur when light is dynamic and able to change colour, intensity and direction, and when it is adaptive and can be brought into interaction with its surroundings. In short, what happens to an architectural space when artificial lighting ceases...

  10. Computational Bench Testing to Evaluate the Short-Term Mechanical Performance of a Polymeric Stent.

    Science.gov (United States)

    Bobel, A C; Petisco, S; Sarasua, J R; Wang, W; McHugh, P E

    2015-12-01

    Over the last decade, there has been a significant volume of research focussed on the utilization of biodegradable polymers such as poly-L-lactide-acid (PLLA) for applications associated with cardiovascular disease. More specifically, there has been an emphasis on remedying the clinical shortfalls experienced with conventional bare metal stents and drug-eluting stents. One such approach, the adoption of fully formed polymeric stents, has led to a small number of products being commercialized. Unfortunately, these products are still in their market infancy, meaning that long-term data supporting their mechanical performance in vivo do not yet exist. Moreover, the load-carrying capacity and other mechanical properties essential to a fully optimized polymeric stent are difficult, time-consuming, and costly to establish. With the aim of compiling rapid and representative performance data for specific stent geometries, materials and designs, in addition to reducing experimental timeframes, computational bench testing via finite element analysis (FEA) offers itself as a very powerful tool. On this basis, the research presented in this paper is concentrated on the finite element simulation of the mechanical performance of PLLA, a fully biodegradable polymer, in the stent application, using a non-linear viscous material model. Three physical stent geometries, typically used for fully polymeric stents, are selected, and a comparative study is performed in relation to their short-term mechanical performance, with the aid of experimental data. From the simulated output results, an informed understanding can be established in relation to radial strength, flexibility and longitudinal resistance that can be compared with conventional permanent metal stent functionality, and the results show that it is indeed possible to generate a PLLA stent with comparable and sufficient mechanical performance.
The paper also demonstrates the attractiveness of FEA as a tool

  11. Computational identification of adaptive mutants using the VERT system

    Directory of Open Access Journals (Sweden)

    Winkler James

    2012-04-01

    Background: Evolutionary dynamics of microbial organisms can now be visualized using the Visualizing Evolution in Real Time (VERT) system, in which several isogenic strains expressing different fluorescent proteins compete during adaptive evolution and are tracked using fluorescent cell sorting to construct a population history over time. Mutations conferring enhanced growth rates can be detected by observing changes in the fluorescent population proportions. Results: Using data obtained from several VERT experiments, we construct a hidden-Markov-derived model to detect these adaptive events in VERT experiments without external intervention beyond initial training. Analysis of annotated data revealed that the model achieves consensus with human annotation for 85-93% of the data points when detecting adaptive events. A method to determine the optimal time point to isolate adaptive mutants is also introduced. Conclusions: The developed model offers a new way to monitor adaptive evolution experiments without the need for external intervention, thereby simplifying adaptive evolution efforts relying on population tracking. Future efforts to construct a fully automated system to isolate adaptive mutants may find the algorithm a useful tool.
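
    A stripped-down sketch of the hidden-Markov idea (not the authors' trained model): decode, with Viterbi, a two-state chain over per-interval changes in a colored subpopulation's proportion, where state 0 is neutral drift (zero-mean change) and state 1 is adaptive expansion (positive-mean change). All emission and transition parameters are illustrative.

```python
import math

def viterbi_adaptive(deltas, means=(0.0, 0.05), sd=0.02, p_switch=0.1):
    """Return the most likely 0/1 state sequence for proportion changes."""
    log = math.log
    def emit(d, s):                      # Gaussian log-likelihood (up to consts)
        return -0.5 * ((d - means[s]) / sd) ** 2
    trans = [[log(1 - p_switch), log(p_switch)],
             [log(p_switch), log(1 - p_switch)]]
    v = [emit(deltas[0], 0) + log(0.9),  # prior: start in the neutral state
         emit(deltas[0], 1) + log(0.1)]
    back = []
    for d in deltas[1:]:
        step, nv = [], []
        for s in (0, 1):
            cands = [v[p] + trans[p][s] for p in (0, 1)]
            p_best = 0 if cands[0] >= cands[1] else 1
            step.append(p_best)
            nv.append(cands[p_best] + emit(d, s))
        back.append(step)
        v = nv
    path = [0 if v[0] >= v[1] else 1]    # backtrack from the best final state
    for step in reversed(back):
        path.append(step[path[-1]])
    return path[::-1]
```

    Runs of state 1 mark candidate adaptive events; in the VERT setting the optimal isolation time point would be chosen inside such a run, while the subpopulation is still expanding.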

  12. Adaptive Radiotherapy Planning on Decreasing Gross Tumor Volumes as Seen on Megavoltage Computed Tomography Images

    International Nuclear Information System (INIS)

    Woodford, Curtis; Yartsev, Slav; Dar, A. Rashid; Bauman, Glenn; Van Dyk, Jake

    2007-01-01

    Purpose: To evaluate gross tumor volume (GTV) changes for patients with non-small-cell lung cancer by using daily megavoltage (MV) computed tomography (CT) studies acquired before each treatment fraction on helical tomotherapy and to relate the potential benefit of adaptive image-guided radiotherapy to changes in GTV. Methods and Materials: Seventeen patients were prescribed 30 fractions of radiotherapy on helical tomotherapy for non-small-cell lung cancer at London Regional Cancer Program from Dec 2005 to March 2007. The GTV was contoured on the daily MVCT studies of each patient. Adapted plans were created using merged MVCT-kilovoltage CT image sets to investigate the advantages of replanning for patients with differing GTV regression characteristics. Results: Average GTV change observed over 30 fractions was -38%, ranging from -12 to -87%. No significant correlation was observed between GTV change and a patient's physical or tumor features. Patterns of GTV changes in the 17 patients could be divided broadly into three groups with distinctive potential for benefit from adaptive planning. Conclusions: Changes in GTV are difficult to predict quantitatively based on patient or tumor characteristics. If changes occur, there are points in time during the treatment course when it may be appropriate to adapt the plan to improve sparing of normal tissues. If GTV decreases by greater than 30% at any point in the first 20 fractions of treatment, adaptive planning is appropriate to further improve the therapeutic ratio.

  13. Sleep facilitates long-term face adaptation

    OpenAIRE

    Ditye, Thomas; Javadi, Amir Homayoun; Carbon, Claus-Christian; Walsh, Vincent

    2013-01-01

    Adaptation is an automatic neural mechanism supporting the optimization of visual processing on the basis of previous experiences. While the short-term effects of adaptation on behaviour and physiology have been studied extensively, perceptual long-term changes associated with adaptation are still poorly understood. Here, we show that the integration of adaptation-dependent long-term shifts in neural function is facilitated by sleep. Perceptual shifts induced by adaptation to a distorted imag...

  14. Adaptive short-term electricity price forecasting using artificial neural networks in the restructured power markets

    International Nuclear Information System (INIS)

    Yamin, H.Y.; Shahidehpour, S.M.; Li, Z.

    2004-01-01

    This paper proposes a comprehensive model for adaptive short-term electricity price forecasting using Artificial Neural Networks (ANN) in the restructured power markets. The model consists of three parts: price simulation, price forecasting, and performance analysis. The factors impacting electricity price forecasting, including time factors, load factors, reserve factors, and historical price factors, are discussed. We adopted ANN and proposed a new definition of the MAPE using the median to study the relationship between these factors and market price, as well as the performance of the electricity price forecasting. The reserve factors are included to enhance the performance of the forecasting process. The proposed model handles price spikes more efficiently because it considers the median instead of the average. The IEEE 118-bus system and the practical California system are used to demonstrate the superiority of the proposed model. (author)
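The abstract does not reproduce the paper's exact median-based MAPE definition, so the sketch below takes one plausible reading: replacing the mean of absolute percentage errors with their median, so that isolated price spikes no longer dominate the statistic. The price data are invented for illustration.

```python
import statistics

def mape_mean(actual, forecast):
    """Conventional MAPE: mean of absolute percentage errors."""
    errors = [abs(a - f) / abs(a) for a, f in zip(actual, forecast)]
    return 100 * statistics.mean(errors)

def mape_median(actual, forecast):
    """Median-based MAPE in the spirit of the paper: the median damps
    the influence of price spikes that dominate the ordinary mean."""
    errors = [abs(a - f) / abs(a) for a, f in zip(actual, forecast)]
    return 100 * statistics.median(errors)

# Hourly prices with one spike: the spike inflates the mean-based
# MAPE but barely moves the median-based one.
actual   = [30.0, 32.0, 31.0, 250.0, 33.0]
forecast = [29.0, 33.0, 30.0,  60.0, 34.0]
print(round(mape_mean(actual, forecast), 1))    # 17.7
print(round(mape_median(actual, forecast), 1))  # 3.2
```

The same robustness argument is why the paper reports better handling of price spikes when evaluating forecasting performance.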

  15. Speed regulating Effects of Incentive-based Intelligent Speed Adaptation in the short and medium term

    DEFF Research Database (Denmark)

    Agerholm, Niels

    Speed regulating Effects of Incentive-based Intelligent Speed Adaptation in the short and medium term Despite massive improvements in vehicles' safety equipment, more information and a safer road network, inadequate road safety still causes more than 250 deaths and several...... thousand injuries each year in Denmark. Until a few years ago the number of fatalities in most countries had decreased while the amount of traffic increased. However, this trend has been replaced by a more uncertain development towards a constant or even somewhat increasing risk. Inappropriate speeding...... is a central cause of the high number of fatalities on the roads. Despite speed limits, speed-limit-violating driving behaviour is still widespread in Denmark. Traditional solutions to prevent speed violation have been enforcement, information, and enhanced road design. It seems, however, hard to achieve...

  16. The EORTC computer-adaptive tests measuring physical functioning and fatigue exhibited high levels of measurement precision and efficiency

    DEFF Research Database (Denmark)

    Petersen, Morten Aa; Aaronson, Neil K; Arraras, Juan I

    2013-01-01

    The European Organisation for Research and Treatment of Cancer (EORTC) Quality of Life Group is developing a computer-adaptive test (CAT) version of the EORTC Quality of Life Questionnaire (QLQ-C30). We evaluated the measurement properties of the CAT versions of physical functioning (PF...

  17. Integrated Adaptive Scenarios for Agriculture: Synergies and Tradeoffs.

    Science.gov (United States)

    Malek, K.; Rajagopalan, K.; Adam, J. C.; Brady, M.; Stockle, C.; Liu, M.; Kruger, C. E.

    2017-12-01

    A wide variety of factors can drive adaptation of the agricultural production sector in response to climate change. Warming and an increased growing season length can lead to adoption of newer plant varieties as well as increases in double cropping systems. Changes in expectations of drought frequency or economic factors could lead to adoption of new technology (such as irrigation technology or water trading systems) or crop choices with a view to reducing farm-level risk, and these choices can result in unintended system-wide effects. These are all examples of producer adaptation decisions made with a long-term (multiple decades) view. In addition, producers respond to short-term (current year) shocks - such as drought events - through management strategies that include deficit irrigation, fallowing, nutrient management, and engaging in water trading. The effects of these short- and long-term decisions are not independent, and each can drive or be driven by the other. For example, investment in new irrigation systems (long-term) can be driven by expectations of short-term crop productivity losses in drought years. Similarly, the capacity to manage short-term shocks will depend on crop type and variety as well as adopted irrigation technologies. Our overarching objective is to understand the synergies and tradeoffs that arise when combining three potential long-term adaptation strategies with two short-term adaptation strategies. We apply the integrated crop-hydrology modeling framework VIC-CropSyst, along with the water management module Yakima RiverWare, to address these questions over our test area, the Yakima River basin. We consider adoption of a) more efficient irrigation technologies, slower-growing crop varieties, and increased prevalence of double cropping systems as long-term adaptation strategies; and b) fallowing and deficit irrigation as short-term responses to droughts. 
We evaluate the individual and

  18. Modeling the Internet of Things, Self-Organizing and Other Complex Adaptive Communication Networks: A Cognitive Agent-Based Computing Approach.

    Directory of Open Access Journals (Sweden)

    Samreen Laghari

    Full Text Available Computer Networks have a tendency to grow at an unprecedented scale. Modern networks involve not only computers but also a wide variety of other interconnected devices ranging from mobile phones to other household items fitted with sensors. This vision of the "Internet of Things" (IoT) implies an inherent difficulty in modeling problems. It is practically impossible to implement and test all scenarios for large-scale and complex adaptive communication networks as part of Complex Adaptive Communication Networks and Environments (CACOONS). The goal of this study is to explore the use of Agent-based Modeling as part of the Cognitive Agent-based Computing (CABC) framework to model a complex communication network problem. We use Exploratory Agent-based Modeling (EABM), as part of the CABC framework, to develop an autonomous multi-agent architecture for managing carbon footprint in a corporate network. To evaluate the application of complexity in practical scenarios, we have also introduced a company-defined computer usage policy. The conducted experiments demonstrated two important results: first, a CABC-based modeling approach such as Agent-based Modeling can be an effective approach to modeling complex problems in the domain of IoT; second, the specific problem of managing the carbon footprint can be solved using a multi-agent system approach.

  19. Modeling the Internet of Things, Self-Organizing and Other Complex Adaptive Communication Networks: A Cognitive Agent-Based Computing Approach.

    Science.gov (United States)

    Laghari, Samreen; Niazi, Muaz A

    2016-01-01

    Computer Networks have a tendency to grow at an unprecedented scale. Modern networks involve not only computers but also a wide variety of other interconnected devices ranging from mobile phones to other household items fitted with sensors. This vision of the "Internet of Things" (IoT) implies an inherent difficulty in modeling problems. It is practically impossible to implement and test all scenarios for large-scale and complex adaptive communication networks as part of Complex Adaptive Communication Networks and Environments (CACOONS). The goal of this study is to explore the use of Agent-based Modeling as part of the Cognitive Agent-based Computing (CABC) framework to model a complex communication network problem. We use Exploratory Agent-based Modeling (EABM), as part of the CABC framework, to develop an autonomous multi-agent architecture for managing carbon footprint in a corporate network. To evaluate the application of complexity in practical scenarios, we have also introduced a company-defined computer usage policy. The conducted experiments demonstrated two important results: first, a CABC-based modeling approach such as Agent-based Modeling can be an effective approach to modeling complex problems in the domain of IoT; second, the specific problem of managing the carbon footprint can be solved using a multi-agent system approach.

  20. Adjusting for cross-cultural differences in computer-adaptive tests of quality of life.

    Science.gov (United States)

    Gibbons, C J; Skevington, S M

    2018-04-01

    Previous studies using the WHOQOL measures have demonstrated that the relationship between individual items and the underlying quality of life (QoL) construct may differ between cultures. If unaccounted for, these differing relationships can lead to measurement bias which, in turn, can undermine the reliability of results. We used item response theory (IRT) to assess differential item functioning (DIF) in WHOQOL data from diverse language versions collected in UK, Zimbabwe, Russia, and India (total N = 1332). Data were fitted to the partial credit 'Rasch' model. We used four item banks previously derived from the WHOQOL-100 measure, which provided excellent measurement for physical, psychological, social, and environmental quality of life domains (40 items overall). Cross-cultural differential item functioning was assessed using analysis of variance for item residuals and post hoc Tukey tests. Simulated computer-adaptive tests (CATs) were conducted to assess the efficiency and precision of the four items banks. Splitting item parameters by DIF results in four linked item banks without DIF or other breaches of IRT model assumptions. Simulated CATs were more precise and efficient than longer paper-based alternatives. Assessing differential item functioning using item response theory can identify measurement invariance between cultures which, if uncontrolled, may undermine accurate comparisons in computer-adaptive testing assessments of QoL. We demonstrate how compensating for DIF using item anchoring allowed data from all four countries to be compared on a common metric, thus facilitating assessments which were both sensitive to cultural nuance and comparable between countries.

  1. Adaptive Fault Tolerance for Many-Core Based Space-Borne Computing

    Science.gov (United States)

    James, Mark; Springer, Paul; Zima, Hans

    2010-01-01

    This paper describes an approach to providing software fault tolerance for future deep-space robotic NASA missions, which will require a high degree of autonomy supported by an enhanced on-board computational capability. Such systems have become possible as a result of the emerging many-core technology, which is expected to offer 1024-core chips by 2015. We discuss the challenges and opportunities of this new technology, focusing on introspection-based adaptive fault tolerance that takes into account the specific requirements of applications, guided by a fault model. Introspection supports runtime monitoring of the program execution with the goal of identifying, locating, and analyzing errors. Fault tolerance assertions for the introspection system can be provided by the user, domain-specific knowledge, or via the results of static or dynamic program analysis. This work is part of an on-going project at the Jet Propulsion Laboratory in Pasadena, California.

  2. An overview of the activities of the OECD/NEA Task Force on adapting computer codes in nuclear applications to parallel architectures

    Energy Technology Data Exchange (ETDEWEB)

    Kirk, B.L. [Oak Ridge National Lab., TN (United States); Sartori, E. [OCDE/OECD NEA Data Bank, Issy-les-Moulineaux (France); Viedma, L.G. de [Consejo de Seguridad Nuclear, Madrid (Spain)

    1997-06-01

    Subsequent to the introduction of High Performance Computing in the developed countries, the Organization for Economic Cooperation and Development/Nuclear Energy Agency (OECD/NEA) created the Task Force on Adapting Computer Codes in Nuclear Applications to Parallel Architectures (under the guidance of the Nuclear Science Committee`s Working Party on Advanced Computing) to study the growth area in supercomputing and its applicability to the nuclear community`s computer codes. The result has been four years of investigation for the Task Force in different subject fields - deterministic and Monte Carlo radiation transport, computational mechanics and fluid dynamics, nuclear safety, atmospheric models and waste management.

  3. An overview of the activities of the OECD/NEA Task Force on adapting computer codes in nuclear applications to parallel architectures

    International Nuclear Information System (INIS)

    Kirk, B.L.; Sartori, E.; Viedma, L.G. de

    1997-01-01

    Subsequent to the introduction of High Performance Computing in the developed countries, the Organization for Economic Cooperation and Development/Nuclear Energy Agency (OECD/NEA) created the Task Force on Adapting Computer Codes in Nuclear Applications to Parallel Architectures (under the guidance of the Nuclear Science Committee's Working Party on Advanced Computing) to study the growth area in supercomputing and its applicability to the nuclear community's computer codes. The result has been four years of investigation for the Task Force in different subject fields - deterministic and Monte Carlo radiation transport, computational mechanics and fluid dynamics, nuclear safety, atmospheric models and waste management

  4. When rapid adaptation paradigm is not too rapid: Evidence of face-sensitive N170 adaptation effects.

    Science.gov (United States)

    Tian, Tengxiang; Feng, Xue; Feng, Chunliang; Gu, Ruolei; Luo, Yue-Jia

    2015-07-01

    Recent findings have demonstrated that N170 adaptation effects evoked by face adaptors are general to face and non-face tests, implicating adaptor-locked interferences in the rapid adaptation paradigm. Here we examined the extent to which adaptor-locked interferences confound N170 adaptation effects under different experimental parameters by manipulating the stimulus onset asynchrony (SOA) duration and jitter between adaptors and tests. In the short SOA, those interferences were clearly visible in the grand-average ERP waveforms evoked by tests, and they are likely to render the rapid adaptation paradigm with short SOA unreliable. The adaptor-locked interferences were attenuated by appropriately increasing SOA duration, such that face-sensitive adaptation effects were evident in the long SOA for both baseline-to-peak and peak-to-peak N170 measurements. These findings suggest that the rapid adaptation paradigm may work with a relatively long SOA. Our findings provide useful information for future studies regarding the choice of appropriate experimental parameters and measurements for the rapid adaptation paradigm. In addition, future studies are needed to investigate how to objectively subtract the overlap of adaptors from tests and to validate the N170 adaptation effect with appropriate behavioral performance. Copyright © 2015 Elsevier B.V. All rights reserved.

  5. A universal electronical adaptation of automats for biochemical analysis to a central processing computer by applying CAMAC-signals

    International Nuclear Information System (INIS)

    Schaefer, R.

    1975-01-01

    A universal expansion of a CAMAC-subsystem - BORER 3000 - for adapting analysis instruments in biochemistry to a processing computer is described. The possibility of standardizing input interfaces for lab instruments with such circuits is discussed and the advantages achieved by applying the CAMAC-specifications are described

  6. The EORTC computer-adaptive tests measuring physical functioning and fatigue exhibited high levels of measurement precision and efficiency

    NARCIS (Netherlands)

    Petersen, M.A.; Aaronson, N.K.; Arraras, J.I.; Chie, W.C.; Conroy, T.; Constantini, A.; Giesinger, J.M.; Holzner, B.; King, M.T.; Singer, S.; Velikova, G.; Verdonck-de Leeuw, I.M.; Young, T.; Groenvold, M.

    2013-01-01

    Objectives The European Organisation for Research and Treatment of Cancer (EORTC) Quality of Life Group is developing a computer-adaptive test (CAT) version of the EORTC Quality of Life Questionnaire (QLQ-C30). We evaluated the measurement properties of the CAT versions of physical functioning (PF)

  7. The EORTC computer-adaptive tests measuring physical functioning and fatigue exhibited high levels of measurement precision and efficiency

    NARCIS (Netherlands)

    Petersen, M.A.; Aaronson, N.K.; Arraras, J.I.; Chie, W.C.; Conroy, T.; Costantini, A.; Giesinger, J.M.; Holzner, B.; King, M.T.; Singer, S.; Velikova, G.; de Leeuw, I.M.; Young, T.; Groenvold, M.

    2013-01-01

    Objectives: The European Organisation for Research and Treatment of Cancer (EORTC) Quality of Life Group is developing a computer-adaptive test (CAT) version of the EORTC Quality of Life Questionnaire (QLQ-C30). We evaluated the measurement properties of the CAT versions of physical functioning (PF)

  8. Computational hydrodynamics and optical performance of inductively-coupled plasma adaptive lenses

    Energy Technology Data Exchange (ETDEWEB)

    Mortazavi, M.; Urzay, J., E-mail: jurzay@stanford.edu; Mani, A. [Center for Turbulence Research, Stanford University, Stanford, California 94305-3024 (United States)

    2015-06-15

    This study addresses the optical performance of a plasma adaptive lens for aero-optical applications by using both axisymmetric and three-dimensional numerical simulations. Plasma adaptive lenses are based on the effects of free electrons on the phase velocity of incident light, which, in theory, can be used as a phase-conjugation mechanism. A closed cylindrical chamber filled with Argon plasma is used as a model lens into which a beam of light is launched. The plasma is sustained by applying a radio-frequency electric current through a coil that envelops the chamber. Four different operating conditions, ranging from low to high powers and induction frequencies, are employed in the simulations. The numerical simulations reveal complex hydrodynamic phenomena related to buoyant and electromagnetic laminar transport, which generate, respectively, large recirculating cells and wall-normal compression stresses in the form of local stagnation-point flows. In the axisymmetric simulations, the plasma motion is coupled with near-wall axial striations in the electron-density field, some of which propagate in the form of low-frequency traveling disturbances adjacent to vortical quadrupoles that are reminiscent of Taylor-Görtler flow structures in centrifugally unstable flows. Although the refractive-index fields obtained from axisymmetric simulations lead to smooth beam wavefronts, they are found to be unstable to azimuthal disturbances in three of the four three-dimensional cases considered. The azimuthal striations are optically detrimental, since they produce high-order angular aberrations that account for most of the beam wavefront error. A fourth case is computed at high input power and high induction frequency, which displays the best optical properties among all the three-dimensional simulations considered. 
In particular, the increase in induction frequency prevents local thermalization and leads to an axisymmetric distribution of electrons even after introduction of

  9. The comparison of the intestinal adaptation effects of subcutaneous ...

    African Journals Online (AJOL)

    Aim: Insulin has been reported to have positive effects on intestinal adaptation after short bowel syndrome when applicated oral or subcutaneously. The purpose of this study is to compare the intestinal adaptation effects of subcutaneous and oral routes of insulin in rats with short bowel syndrome. Materials and Methods: ...

  10. Adaptive Multilevel Monte Carlo Simulation

    KAUST Repository

    Hoel, H; von Schwerin, E; Szepessy, A; Tempone, Raul

    2011-01-01

    . An adaptive algorithm for ordinary, stochastic and partial differential equations. In Recent advances in adaptive computation, volume 383 of Contemp. Math., pages 325–343. Amer. Math. Soc., Providence, RI, 2005.). This form of the adaptive algorithm generates

  11. Drought analysis and short-term forecast in the Aison River Basin (Greece)

    Directory of Open Access Journals (Sweden)

    S. Kavalieratou

    2012-05-01

    Full Text Available A combined regional drought analysis and forecast is elaborated and applied to the Aison River Basin (Greece). The historical frequency, duration and severity were estimated using the standardized precipitation index (SPI) computed on variable time scales, while short-term drought forecast was investigated by means of 3-D log-linear models. A quasi-association model with homogeneous diagonal effect was proposed to fit the observed frequencies of class transitions of the SPI values computed on the 12-month time scale. Then, an adapted submodel was selected for each data set through the backward elimination method. The analysis and forecast of the drought class transition probabilities were based on the odds of the expected frequencies estimated by these submodels, and the respective confidence intervals of these odds. The parsimonious forecast models fitted the observed data adequately. Results gave a comprehensive insight into drought behavior, highlighting a dominant drought period (1988–1991) with extreme drought events and revealing, in most cases, smooth drought class transitions. The proposed approach can be an efficient tool in regional water resources management and short-term drought warning, especially in irrigated districts.
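For orientation, the 12-month-scale SPI underlying the analysis above can be approximated as follows. The operational SPI fits a gamma distribution to rolling precipitation totals and maps them to the standard normal; this sketch substitutes a plain z-score, so it is an illustrative simplification only, and the synthetic data are invented.

```python
import random
import statistics

def spi_12(monthly_precip):
    """Approximate SPI on a 12-month time scale: standardized rolling
    12-month precipitation totals.  (The operational SPI fits a gamma
    distribution and transforms to the standard normal; the z-score
    here is a simplification for illustration.)"""
    totals = [sum(monthly_precip[i:i + 12])
              for i in range(len(monthly_precip) - 11)]
    mu = statistics.mean(totals)
    sigma = statistics.stdev(totals)
    return [(t - mu) / sigma for t in totals]

# Synthetic monthly precipitation (mm) for four years:
random.seed(1)
precip = [random.uniform(20.0, 120.0) for _ in range(48)]
index = spi_12(precip)
print(len(index))                              # 37 rolling values
print(abs(statistics.mean(index)) < 1e-9)      # True: zero-mean by construction
```

Negative index values mark drier-than-normal 12-month windows; drought classes (moderate, severe, extreme) are then defined by SPI thresholds, and the paper models the transitions between those classes.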

  12. Neural Computations in a Dynamical System with Multiple Time Scales

    Directory of Open Access Journals (Sweden)

    Yuanyuan Mi

    2016-09-01

    Full Text Available Neural systems display rich short-term dynamics at various levels, e.g., spike-frequency adaptation (SFA) at single neurons, and short-term facilitation (STF) and depression (STD) at neuronal synapses. These dynamical features typically cover a broad range of time scales and exhibit large diversity in different brain regions. It remains unclear what the computational benefit is for the brain to have such variability in short-term dynamics. In this study, we propose that the brain can exploit such dynamical features to implement multiple seemingly contradictory computations in a single neural circuit. To demonstrate this idea, we use a continuous attractor neural network (CANN) as a working model and include STF, SFA and STD with increasing time constants in their dynamics. Three computational tasks are considered: persistent activity, adaptation, and anticipative tracking. These tasks require conflicting neural mechanisms, and hence cannot be implemented by a single dynamical feature or any combination with similar time constants. However, with properly coordinated STF, SFA and STD, we show that the network is able to implement the three computational tasks concurrently. We hope this study will shed light on the understanding of how the brain orchestrates its rich dynamics at various levels to realize diverse cognitive functions.

  13. Development of a Postacute Hospital Item Bank for the New Pediatric Evaluation of Disability Inventory-Computer Adaptive Test

    Science.gov (United States)

    Dumas, Helene M.

    2010-01-01

    The PEDI-CAT is a new computer adaptive test (CAT) version of the Pediatric Evaluation of Disability Inventory (PEDI). Additional PEDI-CAT items specific to postacute pediatric hospital care were recently developed using expert reviews and cognitive interviewing techniques. Expert reviews established face and construct validity, providing positive…

  14. Water System Adaptation To Hydrological Changes: Module 7, Adaptation Principles and Considerations

    Science.gov (United States)

    This course will introduce students to the fundamental principles of water system adaptation to hydrological changes, with emphasis on data analysis and interpretation, technical planning, and computational modeling. Starting with real-world scenarios and adaptation needs, the co...

  15. Adaptive algebraic reconstruction technique

    International Nuclear Information System (INIS)

    Lu Wenkai; Yin Fangfang

    2004-01-01

    Algebraic reconstruction techniques (ART) are iterative procedures for reconstructing objects from their projections. It is proven that ART can be computationally efficient by carefully arranging the order in which the collected data are accessed during the reconstruction procedure and adaptively adjusting the relaxation parameters. In this paper, an adaptive algebraic reconstruction technique (AART), which adopts the same projection access scheme in multilevel scheme algebraic reconstruction technique (MLS-ART), is proposed. By introducing adaptive adjustment of the relaxation parameters during the reconstruction procedure, one-iteration AART can produce reconstructions with better quality, in comparison with one-iteration MLS-ART. Furthermore, AART outperforms MLS-ART with improved computational efficiency
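As a concrete illustration of relaxation-parameter adaptation (though not of the MLS-ART projection-access ordering, which the paper's AART also uses and which is omitted here), a minimal Kaczmarz-style ART sweep with a decaying relaxation parameter might look like the following; the decay schedule and all names are assumptions for illustration.

```python
def adaptive_art(rows, b, n_unknowns, sweeps=50, lam0=1.0, decay=0.9):
    """Kaczmarz-style ART with an adaptively decaying relaxation
    parameter: each sweep projects the estimate onto every measurement
    hyperplane a·x = b_i, and the step size lam shrinks between sweeps
    so later corrections are damped."""
    x = [0.0] * n_unknowns
    lam = lam0
    for _ in range(sweeps):
        for a, bi in zip(rows, b):
            dot_ax = sum(ai * xi for ai, xi in zip(a, x))
            norm2 = sum(ai * ai for ai in a)
            step = lam * (bi - dot_ax) / norm2
            x = [xi + step * ai for xi, ai in zip(x, a)]
        lam *= decay  # adaptive relaxation between sweeps
    return x

# Consistent 2x2 system: x + y = 3, x - y = 1  ->  x = 2, y = 1
rows = [[1.0, 1.0], [1.0, -1.0]]
b = [3.0, 1.0]
print(adaptive_art(rows, b, 2))  # [2.0, 1.0]
```

In tomographic use, each row is one ray-sum measurement; the decaying relaxation suppresses the salt-and-pepper noise that full-strength corrections introduce in late iterations.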

  16. A Secure, Scalable and Elastic Autonomic Computing Systems Paradigm: Supporting Dynamic Adaptation of Self-* Services from an Autonomic Cloud

    Directory of Open Access Journals (Sweden)

    Abdul Jaleel

    2018-05-01

    Full Text Available Autonomic computing embeds self-management features in software systems using external feedback control loops, i.e., autonomic managers. In existing models of autonomic computing, adaptive behaviors are defined at design time, autonomic managers are statically configured, and the running system has a fixed set of self-* capabilities. An autonomic computing design should accommodate autonomic capability growth by allowing the dynamic configuration of self-* services, but this causes security and integrity issues. A secure, scalable and elastic autonomic computing system (SSE-ACS) paradigm is proposed to address the runtime inclusion of autonomic managers, ensuring secure communication between autonomic managers and managed resources. Applying the SSE-ACS concept, a layered approach for the dynamic adaptation of self-* services is presented, with an online ‘Autonomic_Cloud’ working as the middleware between Autonomic Managers (offering the self-* services) and the Autonomic Computing System (requiring the self-* services). A stock trading and forecasting system is used for simulation purposes. The security impact of the SSE-ACS paradigm is verified by testing possible attack cases over the autonomic computing system with single and multiple autonomic managers running on the same and different machines. The common vulnerability scoring system (CVSS) metric shows a decrease in the vulnerability severity score from high (8.8) for the existing ACS to low (3.9) for SSE-ACS. Autonomic managers are introduced into the system at runtime from the Autonomic_Cloud to test scalability and elasticity. With elastic AMs, the system optimizes the Central Processing Unit (CPU) share, resulting in improved execution time for business logic. For computing systems requiring continuous support of self-management services, the proposed system achieves a significant improvement in security, scalability, elasticity, autonomic efficiency, and issue resolving time.

  17. How visual short-term memory maintenance modulates subsequent visual aftereffects.

    Science.gov (United States)

    Saad, Elyana; Silvanto, Juha

    2013-05-01

    Prolonged viewing of a visual stimulus can result in sensory adaptation, giving rise to perceptual phenomena such as the tilt aftereffect (TAE). However, it is not known if short-term memory maintenance induces such effects. We examined how visual short-term memory (VSTM) maintenance modulates the strength of the TAE induced by subsequent visual adaptation. We reasoned that if VSTM maintenance induces aftereffects on subsequent encoding of visual information, then it should either enhance or reduce the TAE induced by a subsequent visual adapter, depending on the congruency of the memory cue and the adapter. Our results were consistent with this hypothesis and thus indicate that the effects of VSTM maintenance can outlast the maintenance period.

  18. Genetic Algorithms for Case Adaptation

    Energy Technology Data Exchange (ETDEWEB)

    Salem, A M [Computer Science Dept, Faculty of Computer and Information Sciences, Ain Shams University, Cairo (Egypt); Mohamed, A H [Solid State Dept., (NCRRT), Cairo (Egypt)

    2008-07-01

    Case based reasoning (CBR) paradigm has been widely used to provide computer support for recalling and adapting known cases to novel situations. Case adaptation algorithms generally rely on knowledge bases and heuristics in order to change past solutions to solve new problems. However, case adaptation has always been a difficult process for engineers within the CBR cycle; its difficulties can be attributed to domain dependency and computational cost. In an effort to solve this problem, this research explores a general-purpose method that applies a genetic algorithm (GA) to CBR adaptation, which can decrease the computational complexity of the search space in problems having a great dependency on their domain knowledge. The proposed model can be used to perform a variety of design tasks on a broad set of application domains; here it has been implemented for tablet formulation as the domain of application. The proposed system has improved the performance of CBR design systems.
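The idea of GA-driven case adaptation can be sketched as follows: the retrieved case's numeric parameters seed the initial population, and evolution searches its neighbourhood for a variant that better satisfies the new problem's constraints. Everything below, including the toy tablet-formulation fitness, is a hypothetical illustration rather than the paper's actual system.

```python
import random

def ga_adapt(retrieved, fitness, generations=60, pop_size=30,
             mut_sigma=0.1, seed=0):
    """Adapt a retrieved case (a list of numeric parameters) by
    minimizing `fitness` with a simple elitist genetic algorithm."""
    rng = random.Random(seed)
    def mutate(ind):
        return [g + rng.gauss(0.0, mut_sigma) for g in ind]
    def crossover(a, b):
        return [ga if rng.random() < 0.5 else gb for ga, gb in zip(a, b)]
    # Seed the population around the retrieved case.
    pop = [mutate(retrieved) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)                      # lower is better
        elite = pop[:pop_size // 2]
        children = [mutate(crossover(rng.choice(elite), rng.choice(elite)))
                    for _ in range(pop_size - len(elite))]
        pop = elite + children
    return min(pop, key=fitness)

# Hypothetical tablet formulation: (filler, binder, disintegrant)
# fractions should sum to 1 with binder near 0.10.
retrieved_case = [0.80, 0.15, 0.10]
def fitness(ind):
    return abs(sum(ind) - 1.0) + abs(ind[1] - 0.10)
best = ga_adapt(retrieved_case, fitness)
print(round(fitness(best), 3))
```

Because the population is seeded from the retrieved case, the GA spends its budget near known-good solutions, which is what lets it sidestep hand-crafted, domain-dependent adaptation rules.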

  19. Genetic Algorithms for Case Adaptation

    International Nuclear Information System (INIS)

    Salem, A.M.; Mohamed, A.H.

    2008-01-01

    The case-based reasoning (CBR) paradigm has been widely used to provide computer support for recalling and adapting known cases to novel situations. Case adaptation algorithms generally rely on knowledge bases and heuristics to change past solutions so that they solve new problems. However, case adaptation has always been a difficult step to engineer within the CBR cycle, owing to its domain dependency and computational cost. In an effort to address this problem, this research explores a general-purpose method that applies a genetic algorithm (GA) to CBR adaptation, which can decrease the computational complexity of searching the solution space in problems with a strong dependency on domain knowledge. The proposed model can be used to perform a variety of design tasks on a broad set of application domains; here it has been implemented for tablet formulation as the application domain. The proposed system improved the performance of the CBR design systems.

  20. Adaptive Multilevel Monte Carlo Simulation

    KAUST Repository

    Hoel, H

    2011-08-23

    This work generalizes the multilevel forward Euler Monte Carlo method introduced by Michael B. Giles (Oper. Res. 56(3):607–617, 2008) for the approximation of expected values depending on the solution to an Itô stochastic differential equation. That work proposed and analyzed a forward Euler multilevel Monte Carlo method based on a hierarchy of uniform time discretizations and control variates to reduce the computational effort required by a standard, single-level forward Euler Monte Carlo method. The present work introduces an adaptive hierarchy of non-uniform time discretizations, generated by an adaptive algorithm introduced in (Anna Dzougoutov et al. Adaptive Monte Carlo algorithms for stopped diffusion. In Multiscale Methods in Science and Engineering, volume 44 of Lect. Notes Comput. Sci. Eng., pages 59–88. Springer, Berlin, 2005; Kyoung-Sook Moon et al. Stoch. Anal. Appl. 23(3):511–558, 2005; Kyoung-Sook Moon et al. An adaptive algorithm for ordinary, stochastic and partial differential equations. In Recent Advances in Adaptive Computation, volume 383 of Contemp. Math., pages 325–343. Amer. Math. Soc., Providence, RI, 2005). This form of the adaptive algorithm generates stochastic, path-dependent time steps and is based on a posteriori error expansions first developed in (Anders Szepessy et al. Comm. Pure Appl. Math. 54(10):1169–1214, 2001). Our numerical results for a stopped diffusion problem exhibit savings in the computational cost to achieve an accuracy of O(TOL): the cost falls from O(TOL^-3) with a single-level version of the adaptive algorithm to O((TOL^-1 log(TOL))^2).
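
    The uniform-hierarchy baseline that the paper generalizes can be sketched in a few lines: couple fine and coarse forward Euler paths through shared Brownian increments, and sum the level corrections E[P_l - P_{l-1}]. The SDE, payoff and sample counts below are illustrative assumptions (geometric Brownian motion, payoff X_T); the paper's adaptive, path-dependent time stepping is not reproduced.

```python
import math
import random

def mlmc_estimate(T=1.0, L=4, n0=20000, mu=0.05, sigma=0.2, seed=1):
    """Multilevel Monte Carlo estimate of E[X_T] for geometric Brownian
    motion dX = mu*X dt + sigma*X dW, using coupled forward Euler paths
    on uniform grids halved at each level (illustrative parameters)."""
    rng = random.Random(seed)
    estimate = 0.0
    for level in range(L + 1):
        n_samples = max(n0 // 4 ** level, 100)  # geometric decay of samples
        nf = 2 ** level                         # fine-grid steps at this level
        dt = T / nf
        acc = 0.0
        for _ in range(n_samples):
            xf = xc = 1.0
            buf = []
            for i in range(nf):
                dw = rng.gauss(0.0, math.sqrt(dt))
                xf += mu * xf * dt + sigma * xf * dw
                buf.append(dw)
                if level > 0 and i % 2 == 1:    # coarse path reuses increments
                    xc += mu * xc * (2 * dt) + sigma * xc * (buf[-2] + buf[-1])
            acc += (xf - xc) if level > 0 else xf   # level 0: plain estimate
        estimate += acc / n_samples
    return estimate

approx = mlmc_estimate()   # estimates E[X_1] = exp(mu), about 1.051
```

    The variance of the fine-minus-coarse correction shrinks with the step size, which is why far fewer samples are needed on the expensive fine levels than on the cheap coarse ones.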

  1. Using Artificial Intelligence to Control and Adapt Level of Difficulty in Computer Based, Cognitive Therapy – an Explorative Study

    DEFF Research Database (Denmark)

    Wilms, Inge Linda

    2011-01-01

    Prism Adaptation Therapy (PAT) is an intervention method in the treatment of the attention disorder neglect (Frassinetti, Angeli, Meneghello, Avanzi, & Ladavas, 2002; Rossetti, et al., 1998). The aim of this study was to investigate whether one session of PAT using a computer-attached touchscreen...

  2. Spatial co-adaptation of cortical control columns in a micro-ECoG brain-computer interface

    Science.gov (United States)

    Rouse, A. G.; Williams, J. J.; Wheeler, J. J.; Moran, D. W.

    2016-10-01

    Objective. Electrocorticography (ECoG) has been used for a range of applications including electrophysiological mapping, epilepsy monitoring, and more recently as a recording modality for brain-computer interfaces (BCIs). Studies that examine ECoG electrodes designed and implanted chronically solely for BCI applications remain limited. The present study explored how two key factors influence chronic, closed-loop ECoG BCI: (i) the effect of inter-electrode distance on BCI performance and (ii) the differences in neural adaptation and performance when fixed versus adaptive BCI decoding weights are used. Approach. The amplitudes of epidural micro-ECoG signals between 75 and 105 Hz with 300 μm diameter electrodes were used for one-dimensional and two-dimensional BCI tasks. The effect of inter-electrode distance on BCI control was tested between 3 and 15 mm. Additionally, the performance and cortical modulation differences between constant, fixed decoding using a small subset of channels versus adaptive decoding weights using the entire array were explored. Main results. Successful BCI control was possible with two electrodes separated by 9 and 15 mm. Performance decreased and the signals became more correlated when the electrodes were only 3 mm apart. BCI performance in a 2D BCI task improved significantly when using adaptive decoding weights (80%-90%) compared to using constant, fixed weights (50%-60%). Additionally, modulation increased for channels previously unavailable for BCI control under the fixed decoding scheme upon switching to the adaptive, all-channel scheme. Significance. Our results clearly show that neural activity under a BCI recording electrode (which we define as a ‘cortical control column’) readily adapts to generate an appropriate control signal. These results show that the practical minimal spatial resolution of these control columns with micro-ECoG BCI is likely on the order of 3 mm. Additionally, they show that the combination and
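
    The fixed-versus-adaptive decoding comparison can be illustrated with the simplest adaptive decoder, a least-mean-squares (LMS) linear filter whose weights are updated after every sample. Everything below (two synthetic "band-power" channels, the target control signal, the learning rate) is an illustrative assumption, not the study's decoder.

```python
import random

def lms_decode(features, targets, lr=0.05):
    """Run an LMS-adaptive linear decoder over a session: predict the
    control signal from the feature vector, then nudge the weights along
    the error gradient. Returns final weights and per-sample squared errors."""
    w = [0.0] * len(features[0])
    sq_errs = []
    for x, y in zip(features, targets):
        y_hat = sum(wi * xi for wi, xi in zip(w, x))
        err = y - y_hat
        sq_errs.append(err * err)
        w = [wi + lr * err * xi for wi, xi in zip(w, x)]  # LMS weight update
    return w, sq_errs

rng = random.Random(0)
xs = [[rng.gauss(0, 1), rng.gauss(0, 1)] for _ in range(400)]  # two channels
ys = [0.8 * x[0] - 0.5 * x[1] for x in xs]   # hypothetical intended velocity
w, sq_errs = lms_decode(xs, ys)
```

    Late-session errors are far smaller than early ones because the weights converge toward the generating coefficients; a fixed-weight decoder would be stuck with whatever mapping it started with.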

  3. Cross-cultural adaptation of the US consumer form of the short Primary Care Assessment Tool (PCAT): the Korean consumer form of the short PCAT (KC PCAT) and the Korean standard form of the short PCAT (KS PCAT).

    Science.gov (United States)

    Jeon, Ki-Yeob

    2011-01-01

    It is well known that countries with well-structured primary care have better health outcomes, better health equity and reduced healthcare costs. This study aimed to culturally modify and validate the US consumer form of the short Primary Care Assessment Tool (PCAT) in primary care in the Republic of Korea (hereafter referred to as Korea). The Korean consumer form of the short PCAT (KC PCAT) was cross-culturally modified from the original version using a standardised transcultural adaptation method. A pre-test version of the KC PCAT was formulated by replacement of four items and modification of a further four items from the 37 items of the original consumer form of the short PCAT at face value evaluation meetings. Pilot testing was done with a convenience sample of 15 responders at two different sites. Test-retest showed high reliability. To validate the KC PCAT, 606 clients participated in a survey carried out in Korea between February and May 2006. Internal consistency reliability, test-retest reliability and factor analysis were conducted in order to test validity. Psychometric testing was carried out on 37 items of the KC PCAT to make the KS PCAT which has 30 items and has seven principal domains: first contact utilisation, first contact accessibility, ongoing accountable care (ongoing care and coordinated rapport care), integrated care (patient-centred care with integration between primary and specialty care or between different specialties), comprehensive care, community-oriented care and culturally-oriented care. Component factors of the verified KS PCAT explained 58.28% of the total variance in the total item scores of primary care. The verified KS PCAT has been characterised by the seven classic domains of primary care with minor modifications. This may provide clues concerning differences in expectations for primary care in the Korean population as compared with that of the US. 
The KS PCAT is a reliable and valid tool for the evaluation of the quality of

  4. Procedures for Computing Transonic Flows for Control of Adaptive Wind Tunnels. Ph.D. Thesis - Technische Univ., Berlin, Mar. 1986

    Science.gov (United States)

    Rebstock, Rainer

    1987-01-01

    Numerical methods are developed for control of three dimensional adaptive test sections. The physical properties of the design problem occurring in the external field computation are analyzed, and a design procedure suited for solution of the problem is worked out. To do this, the desired wall shape is determined by stepwise modification of an initial contour. The necessary changes in geometry are determined with the aid of a panel procedure, or, with incident flow near the sonic range, with a transonic small perturbation (TSP) procedure. The designed wall shape, together with the wall deflections set during the tunnel run, are the input to a newly derived one-step formula which immediately yields the adapted wall contour. This is particularly important since the classical iterative adaptation scheme is shown to converge poorly for 3D flows. Experimental results obtained in the adaptive test section with eight flexible walls are presented to demonstrate the potential of the procedure. Finally, a method is described to minimize wall interference in 3D flows by adapting only the top and bottom wind tunnel walls.

  5. Adaptive Distributed Data Structure Management for Parallel CFD Applications

    KAUST Repository

    Frisch, Jerome

    2013-09-01

    Computational fluid dynamics (CFD) simulations require substantial computing resources in terms of CPU time and memory in order to compute with reasonable physical accuracy. If only uniformly refined domains are applied, the number of computational cells grows very quickly once a small resolution is physically required. This can be remedied by applying adaptively refined grids. Unfortunately, the adaptive refinement procedures introduce errors which have to be taken into account. This paper focuses on implementation details of the applied adaptive data structure management and on a qualitative analysis of the introduced errors, obtained by analysing a Poisson problem on the given data structure, which has to be solved in every time step of a CFD analysis. Furthermore, an adaptive CFD benchmark example is computed, showing the benefits of adaptive refinement as well as measurements of parallel data distribution and performance. © 2013 IEEE.
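
    The Poisson solve mentioned above is, in its simplest uniform-grid form, a few lines of Jacobi iteration; the version below is a minimal 1-D stand-in (the paper works on adaptively refined, distributed parallel grids), with grid size and iteration count chosen purely for illustration.

```python
def solve_poisson_1d(f, n_iter=20000):
    """Jacobi iterations for the 1-D Poisson problem -u'' = f on (0, 1)
    with u(0) = u(1) = 0 on a uniform grid of len(f) interior nodes."""
    n = len(f)
    h = 1.0 / (n + 1)
    u = [0.0] * (n + 2)                # includes the two boundary nodes
    for _ in range(n_iter):
        new = u[:]
        for i in range(1, n + 1):      # relax every interior node
            new[i] = 0.5 * (u[i - 1] + u[i + 1] + h * h * f[i - 1])
        u = new
    return u

# -u'' = 2 has the exact solution u(x) = x * (1 - x)
u = solve_poisson_1d([2.0] * 49)
```

    On an adaptive grid the same relaxation applies, but the stencil must bridge hanging nodes between refinement levels, which is exactly where the refinement-induced errors analysed in the paper enter.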

  6. Adapting a computer-delivered brief alcohol intervention for veterans with Hepatitis C.

    Science.gov (United States)

    Cucciare, Michael A; Jamison, Andrea L; Combs, Ann S; Joshi, Gauri; Cheung, Ramsey C; Rongey, Catherine; Huggins, Joe; Humphreys, Keith

    2017-12-01

    This study adapted an existing computer-delivered brief alcohol intervention (cBAI) for use in Veterans with the hepatitis C virus (HCV) and examined its acceptability and feasibility in this patient population. A four-stage model consisting of initial pilot testing, qualitative interviews with key stakeholders, development of a beta version of the cBAI, and usability testing was used to achieve the study objectives. In-depth interviews gathered feedback for modifying the cBAI, including adding HCV-related content such as the health effects of alcohol on liver functioning, immune system functioning, and management of HCV, a preference for concepts to be displayed through "newer looking" graphics, and limiting the use of text to convey key concepts. Results from usability testing indicated that the modified cBAI was acceptable and feasible for use in this patient population. The development model used in this study is effective for gathering actionable feedback that can inform the development of a cBAI and can result in the development of an acceptable and feasible intervention for use in this population. Findings also have implications for developing computer-delivered interventions targeting behavior change more broadly.

  7. A structure-based approach to evaluation product adaptability in adaptable design

    International Nuclear Information System (INIS)

    Cheng, Qiang; Liu, Zhifeng; Cai, Ligang; Zhang, Guojun; Gu, Peihua

    2011-01-01

    Adaptable design, as a new design paradigm, involves creating designs and products that can be easily changed to satisfy different requirements. In this paper, two types of product adaptability are proposed, essential adaptability and behavioral adaptability, and a model for product adaptability evaluation is developed by measuring each of them. The essential adaptability evaluation proceeds by first analyzing the independence of functional requirements and function modules based on axiomatic design, and then measuring the adaptability of interfaces with three indices. The behavioral adaptability, reflected by the performance of adaptable requirements after adaptation, is measured based on the Kano model. Finally, the effectiveness of the proposed method is demonstrated by an illustrative example, the motherboard of a personal computer. The results show that the method can evaluate and reveal the adaptability of a product in essence, and that it provides useful guidance for design improvement and innovative design.

  8. Short-term locomotor adaptation to a robotic ankle exoskeleton does not alter soleus Hoffmann reflex amplitude.

    Science.gov (United States)

    Kao, Pei-Chun; Lewis, Cara L; Ferris, Daniel P

    2010-07-26

    To improve the design of robotic lower limb exoskeletons for gait rehabilitation, it is critical to identify the neural mechanisms that govern locomotor adaptation to robotic assistance. Previously, we demonstrated that soleus muscle recruitment decreased by approximately 35% when walking with a pneumatically powered ankle exoskeleton providing plantar flexor torque under soleus proportional myoelectric control. Since a substantial portion of soleus activation during walking results from the stretch reflex, increased reflex inhibition is one potential mechanism for reducing soleus recruitment when walking with exoskeleton assistance. This is clinically relevant because many neurologically impaired populations have hyperactive stretch reflexes, and training to reduce these reflexes could lead to substantial improvements in motor ability. The purpose of this study was to quantify soleus Hoffmann (H-) reflex responses during powered versus unpowered walking. We tested soleus H-reflex responses in neurologically intact subjects (n=8) who had trained to walk with the soleus-controlled robotic ankle exoskeleton. The soleus H-reflex was tested at mid and late stance while subjects walked with the exoskeleton on the treadmill at 1.25 m/s, first without power (first unpowered), then with power (powered), and finally without power again (second unpowered). We also collected joint kinematics and electromyography. When the robotic plantar flexor torque was provided, subjects walked with lower soleus electromyographic (EMG) activation (27-48%) and had concomitant reductions in H-reflex amplitude (12-24%) compared to the first unpowered condition. The H-reflex amplitude in proportion to the background soleus EMG during powered walking was not significantly different from the two unpowered conditions. These findings suggest that the nervous system does not inhibit the soleus H-reflex in response to short-term adaptation to exoskeleton assistance. Future studies should determine if the

  9. Refficientlib: an efficient load-rebalanced adaptive mesh refinement algorithm for high-performance computational physics meshes

    OpenAIRE

    Baiges Aznar, Joan; Bayona Roa, Camilo Andrés

    2017-01-01

    In this paper we present a novel algorithm for adaptive mesh refinement in computational physics meshes in a distributed memory parallel setting. The proposed method is developed for nodally based parallel domain partitions where the nodes of the mesh belong to a single processor, whereas the elements can belong to multiple processors. Some of the main features of the algorithm presented in this paper a...

  10. Short bowel syndrome.

    LENUS (Irish Health Repository)

    Donohoe, Claire L

    2012-02-01

    The short bowel syndrome (SBS) is a state of malabsorption following intestinal resection where there is less than 200 cm of intestinal length. The management of short bowel syndrome can be challenging and is best managed by a specialised multidisciplinary team. A good understanding of the pathophysiological consequences of resection of different portions of the small intestine is necessary to anticipate and prevent, where possible, consequences of SBS. Nutrient absorption and fluid and electrolyte management in the initial stages are critical to stabilisation of the patient and to facilitate the process of adaptation. Pharmacological adjuncts to promote adaptation are in the early stages of development. Primary restoration of bowel continuity, if possible, is the principal mode of surgical treatment. Surgical procedures to increase the surface area of the small intestine or improve its function may be of benefit in experienced hands, particularly in the paediatric population. Intestinal transplant is indicated at present for patients who have failed to tolerate long-term parenteral nutrition but with increasing experience, there may be a potentially expanded role for its use in the future.

  11. Computations of concentration of radon and its decay products against time. Computer program; Obliczanie koncentracji radonu i jego produktow rozpadu w funkcji czasu. Program komputerowy

    Energy Technology Data Exchange (ETDEWEB)

    Machaj, B. [Institute of Nuclear Chemistry and Technology, Warsaw (Poland)

    1996-12-31

    This research aims to develop a device for continuous monitoring of radon in air by measuring the alpha activity of radon and its short-lived decay products. The variation of the alpha activity of radon and its daughters influences the measured results, so this variation with time must be known. A computer program in the Turbo Pascal language was therefore developed to perform the computations using the known relations involved, the program being adapted for IBM PC computers. The program enables computation of the activity of {sup 222}Rn and its daughter products {sup 218}Po, {sup 214}Pb, {sup 214}Bi and {sup 214}Po every 1 min within the period of 0-255 min, for any state of radioactive equilibrium between the radon and its daughter products. The program also computes the alpha activity of {sup 222}Rn + {sup 218}Po + {sup 214}Po against time, and the total alpha activity over a selected interval of time. The results of the computations are stored on the computer hard disk in ASCII format and can be read by a graphics program, e.g. DrawPerfect, to make diagrams. The equations employed for computation of the alpha activity of radon and its decay products, as well as a description of the program's functions, are given. (author). 2 refs, 4 figs.
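
    The underlying computation is the serial decay chain {sup 222}Rn → {sup 218}Po → {sup 214}Pb → {sup 214}Bi, with {sup 214}Po's 164 μs half-life making its alpha activity track {sup 214}Bi almost instantly. Below is a minimal sketch in Python rather than Turbo Pascal, integrating the chain equations by forward Euler with textbook half-lives; the original program's 1-min output grid and equilibrium options are not reproduced.

```python
import math

# Half-lives in minutes for the Rn-222 chain down to Bi-214 (textbook values)
HALF_LIFE_MIN = {"Rn222": 3.8235 * 24 * 60, "Po218": 3.05,
                 "Pb214": 26.8, "Bi214": 19.9}
LAM = {k: math.log(2) / t for k, t in HALF_LIFE_MIN.items()}

def chain_activities(minutes, n0_rn=1.0, dt=0.01):
    """Integrate the serial decay chain dN_i/dt = lam_{i-1} N_{i-1} - lam_i N_i
    by forward Euler, starting from pure Rn-222 (no daughters present).
    Returns activities lam_i * N_i (arbitrary units) at the requested time."""
    n = {"Rn222": n0_rn, "Po218": 0.0, "Pb214": 0.0, "Bi214": 0.0}
    order = ["Rn222", "Po218", "Pb214", "Bi214"]
    for _ in range(round(minutes / dt)):
        feed = 0.0                           # decay rate of the parent isotope
        for iso in order:
            decay = LAM[iso] * n[iso]
            n[iso] += (feed - decay) * dt
            feed = decay
        # Po-214 is omitted: its activity equals Bi-214's almost instantly
    return {iso: LAM[iso] * n[iso] for iso in order}

act = chain_activities(180)  # three hours after sealing pure radon in the cell
```

    After a few daughter half-lives the chain approaches secular equilibrium: each daughter's activity converges toward the radon activity, which is the behaviour the original program tabulates minute by minute.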

  12. An efficient Adaptive Mesh Refinement (AMR) algorithm for the Discontinuous Galerkin method: Applications for the computation of compressible two-phase flows

    Science.gov (United States)

    Papoutsakis, Andreas; Sazhin, Sergei S.; Begg, Steven; Danaila, Ionut; Luddens, Francky

    2018-06-01

    We present an Adaptive Mesh Refinement (AMR) method suitable for hybrid unstructured meshes that allows for local refinement and de-refinement of the computational grid during the evolution of the flow. The adaptive implementation of the Discontinuous Galerkin (DG) method introduced in this work (ForestDG) is based on a topological representation of the computational mesh by a hierarchical structure consisting of oct-, quad- and binary trees. Adaptive mesh refinement (h-refinement) enables us to increase the spatial resolution of the computational mesh in the vicinity of points of interest such as interfaces, geometrical features, or flow discontinuities. The local increase in the expansion order (p-refinement) in areas of high strain rate or vorticity magnitude raises the order of accuracy in the region of shear layers and vortices. A graph of unitarian-trees, representing hexahedral, prismatic and tetrahedral elements, is used for the representation of the initial domain. The ancestral elements of the mesh can be split into self-similar elements, allowing each tree to grow branches to an arbitrary level of refinement. The connectivity of the elements, their genealogy and their partitioning are described by linked lists of pointers. An explicit calculation of these relations, presented in this paper, facilitates the on-the-fly splitting, merging and repartitioning of the computational mesh by rearranging the links of each node of the tree with a minimal computational overhead. The modal basis used in the DG implementation facilitates the mapping of the fluxes across the non-conformal faces. The AMR methodology is presented and assessed using a series of inviscid and viscous test cases. Also, the AMR methodology is used for the modelling of the interaction between droplets and the carrier phase in a two-phase flow. This approach is applied to the analysis of a spray injected into a chamber of quiescent air, using the Eulerian
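
    The tree-based h-refinement described above can be illustrated with a 2-D quadtree, the planar analogue of the paper's oct-/quad-/binary-tree hierarchy: each cell splits into self-similar children wherever an error indicator exceeds a tolerance. The indicator and all parameters below are illustrative assumptions, not ForestDG's.

```python
from dataclasses import dataclass, field

@dataclass
class Cell:
    """Quadtree cell over the unit square, refined toward features of interest."""
    x: float
    y: float
    size: float
    children: list = field(default_factory=list)

    def refine(self, indicator, tol, max_depth):
        """h-refinement: split into four self-similar children wherever the
        error indicator exceeds tol, down to max_depth further levels."""
        if max_depth == 0 or indicator(self.x, self.y, self.size) <= tol:
            return
        h = self.size / 2
        self.children = [Cell(self.x + i * h, self.y + j * h, h)
                         for i in (0, 1) for j in (0, 1)]
        for c in self.children:
            c.refine(indicator, tol, max_depth - 1)

    def leaves(self):
        """All leaf cells, i.e. the active computational mesh."""
        if not self.children:
            return [self]
        return [leaf for c in self.children for leaf in c.leaves()]

# refine toward a flow discontinuity along the line x = 0.5
def jump_indicator(x, y, size):
    return size if x <= 0.5 <= x + size else 0.0

root = Cell(0.0, 0.0, 1.0)
root.refine(jump_indicator, tol=1 / 32, max_depth=6)
```

    The leaves form a non-overlapping cover of the domain, finest along the discontinuity and coarse elsewhere; in the paper's setting each leaf carries a DG element, and sibling links support the on-the-fly merging and repartitioning.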

  13. Potential Bone to Implant Contact Area of Short Versus Standard Implants: An In Vitro Micro-Computed Tomography Analysis.

    Science.gov (United States)

    Quaranta, Alessandro; DʼIsidoro, Orlando; Bambini, Fabrizio; Putignano, Angelo

    2016-02-01

    To compare the available potential bone-implant contact (PBIC) area of standard and short dental implants by micro-computed tomography (μCT) assessment. Three short implants with different diameters (4.5 × 6 mm, 4.1 × 7 mm, and 4.1 × 6 mm) and 2 standard implants (3.5 × 10 mm and 3.3 × 9 mm) with diverse design and surface features were scanned with μCT. Cross-sectional images were obtained. Image data were manually processed to find the plane that corresponds to the most coronal contact point between the crestal bone and implant. The available PBIC was calculated for each sample. The cross-sectional slices were then processed by 3-dimensional (3D) software, and 3D images of each sample were used for descriptive analysis and to display the microtopography and macrotopography. The wide-diameter short implant (4.5 × 6 mm) showed the highest PBIC value (210.89 mm²), followed by the standard implants (178.07 mm² and 185.37 mm²) and the remaining short implants (130.70 mm² and 110.70 mm²). Wide-diameter short implants show a surface area comparable with standard implants. μCT analysis is a promising technique to evaluate surface area in dental implants with different macrodesigns, microdesigns, and surface features.

  14. Adaptive weighted anisotropic diffusion for computed tomography denoising

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Zhi; Silver, Michael D. [Toshiba Medical Research Institute USA, Inc., Vernon Hills, IL (United States); Noshi, Yasuhiro [Toshiba Medical System Corporation, Tokyo (Japan)

    2011-07-01

    With increasing awareness of radiation safety, dose reduction has become an important task of modern CT system development. This paper proposes an adaptive weighted anisotropic diffusion method and an adaptive weighted sharp source anisotropic diffusion method as image domain filters to potentially help dose reduction. Different from existing anisotropic diffusion methods, the proposed methods incorporate an edge-sensitive adaptive source term as part of the diffusion iteration. It provides better edge and detail preservation. Visual evaluation showed that the new methods can reduce noise substantially without apparent edge and detail loss. The quantitative evaluations also showed over 50% of noise reduction in terms of noise standard deviations, which is equivalent to over 75% of dose reduction for a normal dose image quality. (orig.)
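
    The core idea, anisotropic diffusion with an edge-sensitive source term, can be sketched in one dimension: an edge-stopping function suppresses smoothing across strong gradients, while a source term pulls edge regions back toward the input data so they are not blurred away. This is a Perona-Malik-style sketch of the idea, not the paper's CT filter; all coefficients are illustrative.

```python
import math

def adaptive_diffusion_1d(signal, n_iter=50, dt=0.2, kappa=0.3, beta=0.1):
    """1-D anisotropic diffusion with an edge-sensitive source term:
    the weight g = exp(-(grad/kappa)^2) blocks diffusion across edges,
    and beta * (1 - g_avg) * (data - u) re-anchors edge pixels to the input."""
    u = list(signal)
    f = list(signal)                        # the (noisy) data, kept as source
    for _ in range(n_iter):
        new = u[:]
        for i in range(1, len(u) - 1):
            gr = u[i + 1] - u[i]            # right and left gradients
            gl = u[i - 1] - u[i]
            cr = math.exp(-(gr / kappa) ** 2)    # edge-stopping weights
            cl = math.exp(-(gl / kappa) ** 2)
            edge = 1.0 - 0.5 * (cr + cl)    # near 1 at edges, near 0 when flat
            new[i] = (u[i] + dt * (cr * gr + cl * gl)
                      + beta * edge * (f[i] - u[i]))
        u = new
    return u

noisy_step = [0.0, 0.02, -0.01, 0.01, 0.0, 1.0, 0.99, 1.02, 0.98, 1.0]
smoothed = adaptive_diffusion_1d(noisy_step)
```

    On this toy signal the flat plateaus are smoothed to near-uniform values while the step between them survives essentially intact, which is the qualitative behaviour the paper reports for low-dose CT noise.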

  15. A shape and mesh adaptive computational methodology for gamma ray dose from volumetric sources

    International Nuclear Information System (INIS)

    Mirza, N.M.; Ali, B.; Mirza, S.M.; Tufail, M.; Ahmad, N.

    1991-01-01

    Indoor external exposure of the population is dominated by gamma rays emitted from the walls and the floor of a room. A shape and mesh size adaptive flux calculational approach has been developed for a typical wall source. Parametric studies of the effect of mesh size on the flux calculations have been carried out. The optimum value of the mesh size is found to depend strongly on the distance from the source, the permissible limits on uncertainty in flux predictions, and the computer central processing unit (CPU) time. To test the computations, a typical wall source was reduced to a point, a line, and an infinite slab source of finite thickness, and the computed flux values were compared with values from the corresponding analytical expressions for these sources. Results indicate that under optimum conditions the errors remain less than 6% for fluxes calculated with this approach compared with the analytical values for the point and line source approximations. When the wall is simulated as an infinite slab source of finite thickness, the errors in the computed-to-analytical flux ratios remain large for smaller wall dimensions; however, they fall below 10% when the wall dimensions exceed ten mean free paths for 3 MeV gamma rays. Specific dose rates from this methodology also remain within 15% of the values obtained by the Monte Carlo method. (author)
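
    The mesh-size study can be illustrated with the simplest validation case above, a line source evaluated by point-kernel summation and compared against its closed-form flux. The sketch below neglects attenuation and buildup and places the detector on the perpendicular bisector; all numbers are illustrative.

```python
import math

def flux_line_source(s_per_len, length, d, n_mesh):
    """Point-kernel flux from a line source discretised into n_mesh segments,
    each treated as a point source: phi = sum S_i / (4 pi r_i^2)."""
    dx = length / n_mesh
    phi = 0.0
    for i in range(n_mesh):
        x = -length / 2 + (i + 0.5) * dx      # segment midpoint
        r2 = x * x + d * d                    # squared distance to detector
        phi += s_per_len * dx / (4 * math.pi * r2)
    return phi

def flux_line_analytic(s_per_len, length, d):
    """Closed-form unattenuated line-source flux on the perpendicular bisector."""
    return s_per_len / (4 * math.pi * d) * 2 * math.atan(length / (2 * d))

coarse = flux_line_source(1.0, 4.0, 1.0, 4)     # 4 mesh cells
fine = flux_line_source(1.0, 4.0, 1.0, 400)     # 400 mesh cells
exact = flux_line_analytic(1.0, 4.0, 1.0)
```

    Halving the mesh size quarters the discretisation error (midpoint-rule behaviour), while the cost grows linearly with the number of cells, which is exactly the mesh-size versus CPU-time trade-off the paper optimises.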

  16. Development of Shunt-Type Three-Phase Active Power Filter with Novel Adaptive Control for Wind Generators

    Directory of Open Access Journals (Sweden)

    Ming-Hung Chen

    2015-01-01

    This paper proposes a new adaptive filter for wind generators that combines instantaneous reactive power compensation technology with a current prediction controller; the resulting system is characterized by low harmonic distortion, high power factor, and small DC-link voltage variations during load disturbances. The performance of the system was first simulated using MATLAB/Simulink, and the ability of an adaptive digital low-pass filter to eliminate current harmonics was confirmed in steady and transient states. Subsequently, a digital signal processor was used to implement an active power filter. The experimental results indicate that, for rated operation at 2 kVA, the system has a total harmonic distortion of current of less than 5.0% and a power factor of 1.0 on the utility side. Thus, the transient performance of the adaptive filter is superior to that of a traditional digital low-pass filter, and it is more economical because of its short computation time compared with other types of adaptive filters.

  17. Development of Shunt-Type Three-Phase Active Power Filter with Novel Adaptive Control for Wind Generators.

    Science.gov (United States)

    Chen, Ming-Hung

    2015-01-01

    This paper proposes a new adaptive filter for wind generators that combines instantaneous reactive power compensation technology with a current prediction controller; the resulting system is characterized by low harmonic distortion, high power factor, and small DC-link voltage variations during load disturbances. The performance of the system was first simulated using MATLAB/Simulink, and the ability of an adaptive digital low-pass filter to eliminate current harmonics was confirmed in steady and transient states. Subsequently, a digital signal processor was used to implement an active power filter. The experimental results indicate that, for rated operation at 2 kVA, the system has a total harmonic distortion of current of less than 5.0% and a power factor of 1.0 on the utility side. Thus, the transient performance of the adaptive filter is superior to that of a traditional digital low-pass filter, and it is more economical because of its short computation time compared with other types of adaptive filters.
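
    The filtering principle behind this scheme can be sketched as follows: low-pass the instantaneous power p = v * i down to its DC (active) component, reconstruct the in-phase active current, and command the shunt filter to inject the remainder (harmonics plus reactive part). This is a single-phase sketch with a fixed first-order IIR standing in for the paper's adaptive low-pass; all signals and parameters are illustrative.

```python
import math

def harmonic_reference(v, i_load, fs, fc=5.0):
    """Compensation-current reference for a shunt active power filter:
    filter p = v*i to its DC part, rebuild the fundamental active current,
    and output the residue the filter must inject."""
    alpha = 2 * math.pi * fc / fs            # first-order low-pass coefficient
    v_rms2 = sum(x * x for x in v) / len(v)  # mean-square voltage
    p_f, ref = 0.0, []
    for vk, ik in zip(v, i_load):
        p_f += alpha * (vk * ik - p_f)       # filtered instantaneous power
        i_active = p_f * vk / v_rms2         # in-phase active current
        ref.append(ik - i_active)            # harmonics + reactive residue
    return ref

fs = 10000
t = [k / fs for k in range(2000)]                     # 0.2 s at 10 kHz
v = [math.sin(2 * math.pi * 50 * x) for x in t]       # 50 Hz grid voltage
i_load = [math.sin(2 * math.pi * 50 * x)
          + 0.3 * math.sin(2 * math.pi * 150 * x) for x in t]  # 3rd harmonic
ref = harmonic_reference(v, i_load, fs)
i_source = [a - b for a, b in zip(i_load, ref)]       # what the grid supplies
```

    With the harmonics diverted into the filter, the grid supplies a nearly pure fundamental; the paper's contribution is making the low-pass cutoff adaptive so this settles quickly after load disturbances without sacrificing steady-state ripple rejection.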

  18. Translation, cross-cultural adaptation and validation of the Diabetes Empowerment Scale - Short Form.

    Science.gov (United States)

    Chaves, Fernanda Figueredo; Reis, Ilka Afonso; Pagano, Adriana Silvina; Torres, Heloísa de Carvalho

    2017-03-23

    To translate, cross-culturally adapt and validate the Diabetes Empowerment Scale - Short Form for assessment of psychosocial self-efficacy in diabetes care within the Brazilian cultural context. Assessment of the instrument's conceptual equivalence, as well as its translation and cross-cultural adaptation, was performed following international standards. The Expert Committee's assessment of the translated version was conducted through a web questionnaire developed and applied via the web tool e-Surv. The cross-culturally adapted version was used for the pre-test, which was carried out via phone call with a group of eleven health care service users diagnosed with type 2 diabetes mellitus. The pre-test results were examined by a group of experts, composed of health care consultants, applied linguists and statisticians, aiming at an adequate version of the instrument, which was subsequently used for test and retest in a sample of 100 users diagnosed with type 2 diabetes mellitus via phone call, their answers being recorded by the web tool e-Surv. Internal consistency and reproducibility analyses were carried out in the statistical programming environment R. Face and content validity were attained, and the Brazilian Portuguese version, entitled Escala de Autoeficácia em Diabetes - Versão Curta, was established. The scale showed acceptable internal consistency, with a Cronbach's alpha of 0.634 (95%CI 0.494-0.737), while the correlation of the total score between the two periods was considered moderate (0.47). The intraclass correlation coefficient was 0.50. The translated and cross-culturally adapted version of the instrument in Brazilian Portuguese was considered valid and reliable for assessment within the Brazilian population diagnosed with type 2 diabetes mellitus. The use of a web tool (e-Surv) for recording the Expert Committee responses as well as the responses in the validation tests proved to be a reliable, safe and innovative method.
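
    The reliability figure reported above is Cronbach's alpha, which is straightforward to compute from an item-by-respondent score matrix. A minimal sketch with toy data follows; the study's own 0.634 came from 100 respondents on the adapted scale, not from these numbers.

```python
def cronbach_alpha(items):
    """Cronbach's alpha: k/(k-1) * (1 - sum(item variances) / variance(totals)).
    `items` is a list of k item-score lists, one score per respondent."""
    k, n = len(items), len(items[0])

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(item[r] for item in items) for r in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

# two toy items answered by four respondents
alpha = cronbach_alpha([[1, 2, 3, 4], [2, 1, 4, 3]])
```

    When items co-vary strongly the total-score variance dwarfs the summed item variances and alpha approaches 1; perfectly duplicated items give exactly 1.0.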

  19. Multiparty Computations

    DEFF Research Database (Denmark)

    Dziembowski, Stefan

    In this thesis we study a problem of doing Verifiable Secret Sharing (VSS) and Multiparty Computations in a model where private channels between the players and a broadcast channel are available. The adversary is active, adaptive and has an unbounded computing power. The thesis is based on two... We show that, up to a polynomial time black-box reduction, the complexity of adaptively secure VSS is the same as that of ordinary secret sharing (SS), where security is only required against a passive, static adversary. Previously, such a connection was only known for linear secret sharing and VSS schemes. We then show... here and discuss other problems caused by the adaptiveness. All protocols in the thesis are formally specified and the proofs of their security are given. [1] Ronald Cramer, Ivan Damgård, Stefan Dziembowski, Martin Hirt, and Tal Rabin. Efficient multiparty computations with dishonest minority...

  20. Intelligent Adaptation and Personalization Techniques in Computer-Supported Collaborative Learning

    CERN Document Server

    Demetriadis, Stavros; Xhafa, Fatos

    2012-01-01

    Adaptation and personalization have been extensively studied in the CSCL research community, with the aim of designing intelligent systems that adaptively support eLearning processes and collaboration. Yet, with the fast development of Internet technologies, especially the emergence of new data technologies and mobile technologies, new opportunities and perspectives are opening for advanced adaptive and personalized systems. Adaptation and personalization pose new research and development challenges for today's CSCL systems. In particular, adaptation should be approached in a multi-dimensional way (cognitive, technological, context-aware and personal). Moreover, it should address the particularities of both individual learners and group collaboration. As a consequence, the aim of this book is twofold. On the one hand, it discusses the latest advances and findings in the area of intelligent adaptive and personalized learning systems. On the other hand, it analyzes the new implementation perspectives for intelligen...

  1. Unauthorised adaptation of computer programmes - is ...

    African Journals Online (AJOL)

    Haupt acquired copyright in the Data Explorer programme regardless of the fact that the programme was as a result of an unauthorised adaptation of the Project AMPS programme which belonged to Brewers Marketing Intelligence (Pty) Ltd. This case note inter alia analyses the possibility of an author being sued for ...

  2. An adaptive maneuvering logic computer program for the simulation of one-on-one air-to-air combat. Volume 1: General description

    Science.gov (United States)

    Burgin, G. H.; Fogel, L. J.; Phelps, J. P.

    1975-01-01

    A technique for computer simulation of air combat is described. Volume 1 describes the computer program and its development in general terms. Two versions of the program exist. Both incorporate a logic for selecting and executing air combat maneuvers with performance models of specific fighter aircraft. In the batch processing version the flight paths of two aircraft engaged in interactive aerial combat and controlled by the same logic are computed. The real-time version permits human pilots to fly air-to-air combat against the adaptive maneuvering logic (AML) in the Langley Differential Maneuvering Simulator (DMS). Volume 2 consists of a detailed description of the computer programs.

  3. Framework for Computation Offloading in Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Dejan Kovachev

    2012-12-01

    Full Text Available The inherently limited processing power and battery lifetime of mobile phones hinder the possible execution of computationally intensive applications like content-based video analysis or 3D modeling. Offloading of computationally intensive application parts from the mobile platform into a remote cloud infrastructure or nearby idle computers addresses this problem. This paper presents our Mobile Augmentation Cloud Services (MACS) middleware which enables adaptive extension of Android application execution from a mobile client into the cloud. Applications are developed by using the standard Android development pattern. The middleware does the heavy lifting of adaptive application partitioning, resource monitoring and computation offloading. These elastic mobile applications can run as usual mobile applications, but they can also use remote computing resources transparently. Two prototype applications using the MACS middleware demonstrate the benefits of the approach. The evaluation shows that applications, which involve costly computations, can benefit from offloading with around 95% energy savings and significant performance gains compared to local execution only.
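
    The offloading trade-off this abstract describes can be reduced to a cost comparison. Below is a minimal, hypothetical decision rule (a sketch of the general idea, not the actual MACS middleware logic): a task is offloaded when the estimated transfer-plus-cloud execution time beats local execution time.

```python
from dataclasses import dataclass

@dataclass
class Task:
    cycles: float        # required CPU cycles
    data_bytes: float    # input/output data to transfer

def should_offload(task, local_mips, cloud_mips, bandwidth_bps):
    """Return True if remote execution is estimated to be faster."""
    local_time = task.cycles / local_mips
    transfer_time = 8 * task.data_bytes / bandwidth_bps
    remote_time = transfer_time + task.cycles / cloud_mips
    return remote_time < local_time

heavy = Task(cycles=5e9, data_bytes=1e5)   # compute-bound: offloading pays off
chatty = Task(cycles=1e7, data_bytes=5e7)  # data-bound: stay local
print(should_offload(heavy, 1e9, 2e10, 1e7))   # True
print(should_offload(chatty, 1e9, 2e10, 1e7))  # False
```

    Real middleware would additionally weigh energy per transmitted byte and per CPU cycle, but the structure of the decision is the same.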

  4. Cross-Cultural adaptation of an instrument to computer accessibility evaluation for students with cerebral palsy

    Directory of Open Access Journals (Sweden)

    Gerusa Ferreira Lourenço

    2015-03-01

    Full Text Available The specific literature indicates that the successful education of children with cerebral palsy may require the implementation of appropriate assistive technology resources, allowing students to improve their performance and complete everyday tasks more efficiently and independently. To this end, these resources must be selected properly, emphasizing the importance of an appropriate initial assessment of the child and the possibilities of the resources available. The present study aimed to translate and theoretically adapt an American instrument that evaluates computer accessibility for people with cerebral palsy, in order to contextualize it for applicability to Brazilian students with cerebral palsy. The methodology involved the steps of translation and cross-cultural adaptation of this instrument, as well as the construction of a supplementary script for additional use of that instrument in the educational context. Translation procedures, theoretical and technical adaptation of the American instrument and theoretical analysis (content and semantics) were carried out with the participation of professional experts of the special education area as adjudicators. The results pointed to the relevance of the proposal of the translated instrument in conjunction with the script built to the reality of professionals involved with the education of children with cerebral palsy, such as occupational therapists and special educators.

  5. Success and adaptation

    CERN Multimedia

    2013-01-01

    Yesterday morning, the last colliding proton beams of 2013 were extracted from the LHC, heralding the start of the machine’s first long shutdown (LS1) and crowning its first three glorious years of running. I hardly need to tell the CERN community what a fabulous performance all the people running the machine, the experiments, the computing and all supporting infrastructures put in. Those people are you, and you all know very well what a great job everyone did.   Nevertheless, I would like to express my thanks to all the people who made this first LHC run such a success. Re-measuring the whole Standard Model in such a short period, and then completing it with the discovery of what looks increasingly like the Higgs boson, is no mean feat. What I’d like to focus on today is another aspect of our field: its remarkable ability to adapt. When I started out in research, experiments involved a handful of people and lasted a few years at most. The timescale for the development of ...

  6. Lessons Learned in Designing and Implementing a Computer-Adaptive Test for English

    Directory of Open Access Journals (Sweden)

    Jack Burston

    2014-09-01

    Full Text Available This paper describes the lessons learned in designing and implementing a computer-adaptive test (CAT) for English. The early identification of students with weak L2 English proficiency is of critical importance in university settings that have compulsory English language course graduation requirements. The most efficient means of diagnosing the L2 English ability of incoming students is by means of a computer-based test, since such evaluation can be administered quickly, automatically corrected, and the outcome known as soon as the test is completed. While the option of using a commercial CAT is available to institutions with the ability to pay substantial annual fees, or the means of passing these expenses on to their students, language instructors without these resources can only avail themselves of the advantages of CAT evaluation by creating their own tests. As is demonstrated by the E-CAT project described in this paper, this is a viable alternative even for those lacking any computer programming expertise. However, language teaching experience and testing expertise are critical to such an undertaking, which requires considerable effort and, above all, collaborative teamwork to succeed. A number of practical skills are also required. Firstly, the operation of a CAT authoring programme must be learned. Once this is done, test makers must master the art of creating a question database and assigning difficulty levels to test items. Lastly, if multimedia resources are to be exploited in a CAT, test creators need to be able to locate suitable copyright-free resources and re-edit them as needed.
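
    The core of a CAT of the kind described here is adaptive item selection: after each response, the next item is chosen to be maximally informative at the examinee's current ability estimate. A minimal sketch under a Rasch (1PL) model with a hypothetical item bank follows; this is illustrative of the technique only, not the E-CAT implementation.

```python
import math

def prob_correct(theta, b):
    """Rasch (1PL) model: probability of a correct response."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def item_information(theta, b):
    """Fisher information of an item at ability theta."""
    p = prob_correct(theta, b)
    return p * (1.0 - p)

def next_item(theta, bank, administered):
    """Pick the unadministered item with maximal information at theta."""
    candidates = [i for i in range(len(bank)) if i not in administered]
    return max(candidates, key=lambda i: item_information(theta, bank[i]))

bank = [-2.0, -1.0, 0.0, 1.0, 2.0]   # hypothetical difficulty parameters
print(next_item(0.3, bank, administered={2}))  # → 3 (b = 1.0 is closest to theta)
```

    Under the 1PL model, information peaks where difficulty matches ability, which is why a CAT homes in on items near the examinee's level.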

  7. SAGE - MULTIDIMENSIONAL SELF-ADAPTIVE GRID CODE

    Science.gov (United States)

    Davies, C. B.

    1994-01-01

    SAGE, Self Adaptive Grid codE, is a flexible tool for adapting and restructuring both 2D and 3D grids. Solution-adaptive grid methods are useful tools for efficient and accurate flow predictions. In supersonic and hypersonic flows, strong gradient regions such as shocks, contact discontinuities, shear layers, etc., require careful distribution of grid points to minimize grid error and produce accurate flow-field predictions. SAGE helps the user obtain more accurate solutions by intelligently redistributing (i.e. adapting) the original grid points based on an initial or interim flow-field solution. The user then computes a new solution using the adapted grid as input to the flow solver. The adaptive-grid methodology poses the problem in an algebraic, unidirectional manner for multi-dimensional adaptations. The procedure is analogous to applying tension and torsion spring forces proportional to the local flow gradient at every grid point and finding the equilibrium position of the resulting system of grid points. The multi-dimensional problem of grid adaption is split into a series of one-dimensional problems along the computational coordinate lines. The reduced one-dimensional problem then requires a tridiagonal solver to find the location of grid points along a coordinate line. Multi-directional adaption is achieved by the sequential application of the method in each coordinate direction. The tension forces direct the redistribution of points to the strong gradient region. To maintain smoothness and a measure of orthogonality of grid lines, torsional forces are introduced that relate information between the family of lines adjacent to one another. The smoothness and orthogonality constraints are direction-dependent, since they relate only the coordinate lines that are being adapted to the neighboring lines that have already been adapted. Therefore the solutions are non-unique and depend on the order and direction of adaption. Non-uniqueness of the adapted grid is
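
    The spring-analogy equilibrium along one coordinate line can be illustrated in one dimension: spring stiffness grows with the local solution gradient, and the equilibrium node positions come from a tridiagonal linear system, clustering points at the steep front. This is a sketch of the idea only, not the SAGE code; the `alpha` weighting is a made-up parameter.

```python
import numpy as np

def adapt_grid_1d(x, f, alpha=5.0):
    """Spring-analogy adaptation along one line (sketch, not SAGE itself):
    stiffness rises with the local gradient; equilibrium positions solve
    k[i-1]*(x[i]-x[i-1]) = k[i]*(x[i+1]-x[i]) at every interior node."""
    n = len(x)
    grad = np.abs(np.diff(f) / np.diff(x))           # gradient per interval
    k = 1.0 + alpha * grad / (grad.max() + 1e-12)    # spring stiffness per interval
    A = np.zeros((n - 2, n - 2))                     # tridiagonal system for interior nodes
    rhs = np.zeros(n - 2)
    for i in range(1, n - 1):
        j = i - 1
        A[j, j] = -(k[i - 1] + k[i])
        if j > 0:
            A[j, j - 1] = k[i - 1]
        else:
            rhs[j] -= k[i - 1] * x[0]                # fixed left boundary
        if j < n - 3:
            A[j, j + 1] = k[i]
        else:
            rhs[j] -= k[i] * x[-1]                   # fixed right boundary
    x_new = x.copy()
    x_new[1:-1] = np.linalg.solve(A, rhs)
    return x_new

x = np.linspace(0.0, 1.0, 11)
f = np.tanh(20.0 * (x - 0.5))   # steep front at x = 0.5
xa = adapt_grid_1d(x, f)        # grid points cluster toward the front
```

    Because stiffer springs end up shorter at equilibrium, spacing shrinks exactly where the gradient (and hence the stiffness) is large.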

  8. Translation, Validation, and Reliability of the Dutch Late-Life Function and Disability Instrument Computer Adaptive Test.

    Science.gov (United States)

    Arensman, Remco M; Pisters, Martijn F; de Man-van Ginkel, Janneke M; Schuurmans, Marieke J; Jette, Alan M; de Bie, Rob A

    2016-09-01

    Adequate and user-friendly instruments for assessing physical function and disability in older adults are vital for estimating and predicting health care needs in clinical practice. The Late-Life Function and Disability Instrument Computer Adaptive Test (LLFDI-CAT) is a promising instrument for assessing physical function and disability in gerontology research and clinical practice. The aims of this study were: (1) to translate the LLFDI-CAT to the Dutch language and (2) to investigate its validity and reliability in a sample of older adults who spoke Dutch and dwelled in the community. For the assessment of validity of the LLFDI-CAT, a cross-sectional design was used. To assess reliability, measurement of the LLFDI-CAT was repeated in the same sample. The item bank of the LLFDI-CAT was translated with a forward-backward procedure. A sample of 54 older adults completed the LLFDI-CAT, World Health Organization Disability Assessment Schedule 2.0, RAND 36-Item Short-Form Health Survey physical functioning scale (10 items), and 10-Meter Walk Test. The LLFDI-CAT was repeated in 2 to 8 days (mean=4.5 days). Pearson's r and the intraclass correlation coefficient (ICC) (2,1) were calculated to assess validity, group-level reliability, and participant-level reliability. A correlation of .74 for the LLFDI-CAT function scale and the RAND 36-Item Short-Form Health Survey physical functioning scale (10 items) was found. The correlations of the LLFDI-CAT disability scale with the World Health Organization Disability Assessment Schedule 2.0 and the 10-Meter Walk Test were -.57 and -.53, respectively. The ICC (2,1) of the LLFDI-CAT function scale was .84, with a group-level reliability score of .85. The ICC (2,1) of the LLFDI-CAT disability scale was .76, with a group-level reliability score of .81. The high percentage of women in the study and the exclusion of older adults with recent joint replacement or hospitalization limit the generalizability of the results. The Dutch LLFDI
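
    The reliability statistic reported above, ICC (2,1), can be computed directly from a subjects-by-sessions score matrix (Shrout-Fleiss two-way random effects, absolute agreement, single measurement). A sketch with hypothetical test-retest data:

```python
import numpy as np

def icc_2_1(data):
    """ICC(2,1): two-way random effects, absolute agreement, single
    measurement. `data` is a subjects x raters/sessions matrix."""
    data = np.asarray(data, dtype=float)
    n, k = data.shape
    grand = data.mean()
    row_means = data.mean(axis=1)
    col_means = data.mean(axis=0)
    ssr = k * ((row_means - grand) ** 2).sum()   # between subjects
    ssc = n * ((col_means - grand) ** 2).sum()   # between sessions
    sst = ((data - grand) ** 2).sum()
    sse = sst - ssr - ssc                        # residual
    msr = ssr / (n - 1)
    msc = ssc / (k - 1)
    mse = sse / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

scores = [[7, 9], [5, 6], [8, 8], [4, 5], [9, 10]]   # test / retest, hypothetical
print(round(icc_2_1(scores), 2))  # → 0.85
```

    Unlike ICC(3,1), this form penalizes systematic shifts between sessions, which is why it suits test-retest reliability.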

  9. Adaptation and validation of the short version WHOQOL-HIV in Ethiopia

    DEFF Research Database (Denmark)

    Tesfaye Woldeyohannes, Markos; Olsen, Mette Frahm; Medhin, Girmay

    2016-01-01

    BACKGROUND: Quality of life of patients is an important element in the evaluation of outcome of health care, social services and clinical trials. The WHOQOL instruments were originally developed for measurement of quality of life across cultures. However, there were concerns raised about the cross-cultural...... equivalence of the WHOQOL-HIV when used among people with HIV in Ethiopia. Therefore, this study aimed at adapting the WHOQOL-HIV bref for the Ethiopian setting. METHODS: A step-wise adaptation of the WHOQOL-HIV bref for use in Ethiopia was conducted to produce an Ethiopian version...... were recruited from HIV clinics. RESULTS: In the process of adaptation, new items of relevance to the context were added while seven items were deleted because of problems with acceptability and poor psychometric properties. The Cronbach's α for the final tool with twenty-seven items WHOQOL...

  10. A case study of evolutionary computation of biochemical adaptation

    International Nuclear Information System (INIS)

    François, Paul; Siggia, Eric D

    2008-01-01

    Simulations of evolution have a long history, but their relation to biology is questioned because of the perceived contingency of evolution. Here we provide an example of a biological process, adaptation, where simulations are argued to approach closer to biology. Adaptation is a common feature of sensory systems, and a plausible component of other biochemical networks because it rescales upstream signals to facilitate downstream processing. We create random gene networks numerically, by linking genes with interactions that model transcription, phosphorylation and protein–protein association. We define a fitness function for adaptation in terms of two functional metrics, and show that any reasonable combination of them will yield the same adaptive networks after repeated rounds of mutation and selection. Convergence to these networks is driven by positive selection and thus fast. There is always a path in parameter space of continuously improving fitness that leads to perfect adaptation, implying that the actual mutation rates we use in the simulation do not bias the results. Our results imply a kinetic view of evolution, i.e., it favors gene networks that can be learned quickly from the random examples supplied by mutation. This formulation allows for deductive predictions of the networks realized in nature
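
    The repeated rounds of mutation and selection described above follow a generic pattern that can be sketched with a toy scalar fitness standing in for the paper's two-metric adaptation score. This is illustrative only, not the gene-network simulation itself.

```python
import random

def evolve(fitness, genome, rounds=200, pop=20, sigma=0.1):
    """Minimal elitist mutation-selection loop (illustrative only):
    each round, mutate the current best genome and keep the fittest."""
    best = genome
    for _ in range(rounds):
        variants = [best] + [
            [g + random.gauss(0, sigma) for g in best] for _ in range(pop)
        ]
        best = max(variants, key=fitness)
    return best

random.seed(0)  # reproducible toy run

# toy fitness with a single optimum at (1, -2), standing in for the
# paper's combined adaptation metrics
target = (1.0, -2.0)
fit = lambda g: -sum((gi - ti) ** 2 for gi, ti in zip(g, target))
winner = evolve(fit, [0.0, 0.0])
print([round(w, 1) for w in winner])  # ≈ [1.0, -2.0]
```

    The paper's key claim maps onto this sketch: when a continuously improving path in parameter space exists, such a loop converges quickly under positive selection.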

  11. Translation, cross-cultural adaptation and psychometric evaluation of yoruba version of the short-form 36 health survey.

    Science.gov (United States)

    Mbada, Chidozie Emmanuel; Adeogun, Gafar Atanda; Ogunlana, Michael Opeoluwa; Adedoyin, Rufus Adesoji; Akinsulore, Adesanmi; Awotidebe, Taofeek Oluwole; Idowu, Opeyemi Ayodiipo; Olaoye, Olumide Ayoola

    2015-09-14

    The Short-Form Health Survey (SF-36) is a valid quality of life tool often employed to determine the impact of medical intervention and the outcome of health care services. However, the SF-36 is culturally sensitive, which necessitates its adaptation and translation into different languages. This study was conducted to cross-culturally adapt the SF-36 into the Yoruba language and determine its reliability and validity. Based on the International Quality of Life Assessment project guidelines, a sequence of translation, test of item-scale correlation, and validation was implemented for the translation of the Yoruba version of the SF-36. Following pilot testing, the English and the Yoruba versions of the SF-36 were administered to a random sample of 1087 apparently healthy individuals to test validity and 249 respondents completed the Yoruba SF-36 again after two weeks to test reliability. Data was analyzed using Pearson's product moment correlation analysis, independent t-test, one-way analysis of variance, multi-trait scaling analysis and Intra-Class Correlation (ICC) at p < 0.05. … The Yoruba SF-36 ranges between 0.636 and 0.843 for scales; and 0.783 and 0.851 for domains. The data quality, concurrent and discriminant validity, reliability and internal consistency of the Yoruba version of the SF-36 are adequate and it is recommended for measuring health-related quality of life among the Yoruba population.

  12. Heating Augmentation for Short Hypersonic Protuberances

    Science.gov (United States)

    Mazaheri, Ali R.; Wood, William A.

    2008-01-01

    Computational aeroheating analyses of the Space Shuttle Orbiter plug repair models are validated against data collected in the Calspan University of Buffalo Research Center (CUBRC) 48 inch shock tunnel. The comparison shows that the average difference between computed heat transfer results and the data is about 9.5%. Using CFD and Wind Tunnel (WT) data, an empirical correlation for estimating heating augmentation on short hypersonic protuberances (k/delta less than 0.3) is proposed. This proposed correlation is compared with several computed flight simulation cases and good agreement is achieved. Accordingly, this correlation is proposed for further investigation on other short hypersonic protuberances for estimating heating augmentation.

  13. Extensive Intestinal Resection Triggers Behavioral Adaptation, Intestinal Remodeling and Microbiota Transition in Short Bowel Syndrome

    Directory of Open Access Journals (Sweden)

    Camille Mayeur

    2016-03-01

    Full Text Available Extensive resection of small bowel often leads to short bowel syndrome (SBS). SBS patients develop clinical mal-absorption and dehydration relative to the reduction of absorptive area, acceleration of gastrointestinal transit time and modifications of the gastrointestinal intra-luminal environment. As a consequence of severe mal-absorption, patients require parenteral nutrition (PN). In adults, the overall adaptation following intestinal resection includes spontaneous and complex compensatory processes such as hyperphagia, mucosal remodeling of the remaining part of the intestine and major modifications of the microbiota. SBS patients, with colon in continuity, harbor a specific fecal microbiota that we called “lactobiota” because it is enriched in the Lactobacillus/Leuconostoc group and depleted in anaerobic micro-organisms (especially Clostridium and Bacteroides). In some patients, the lactobiota-driven fermentative activities lead to an accumulation of fecal d/l-lactates and an increased risk of d-encephalopathy. Better knowledge of clinical parameters and lactobiota characteristics has made it possible to stratify patients and define groups at risk for d-encephalopathy crises.

  14. Newnes short wave listening handbook

    CERN Document Server

    Pritchard, Joe

    2013-01-01

    Newnes Short Wave Listening Handbook is a guide for starting up in short wave listening (SWL). The book is comprised of 15 chapters that discuss the basics and fundamental concepts of short wave radio listening. The coverage of the text includes electrical principles; types of signals that can be heard in the radio spectrum; and using computers in SWL. The book also covers SWL equipment, such as receivers, converters, and circuits. The text will be of great use to individuals who want to get into short wave listening.

  15. An adaptive network-based fuzzy inference system for short-term natural gas demand estimation: Uncertain and complex environments

    International Nuclear Information System (INIS)

    Azadeh, A.; Asadzadeh, S.M.; Ghanbari, A.

    2010-01-01

    Accurate short-term natural gas (NG) demand estimation and forecasting is vital for the policy and decision-making process in the energy sector. Moreover, conventional methods may not provide accurate results. This paper presents an adaptive network-based fuzzy inference system (ANFIS) for estimation of NG demand. Standard input variables are used, which are day of the week, demand of the same day in the previous year, demand of a day before and demand of 2 days before. The proposed ANFIS approach is equipped with pre-processing and post-processing concepts. Moreover, input data are pre-processed (scaled) and finally output data are post-processed (returned to their original scale). The superiority and applicability of the ANFIS approach are shown for Iranian NG consumption from 22/12/2007 to 30/6/2008. Results show that ANFIS provides more accurate results than an artificial neural network (ANN) and a conventional time series approach. The results of this study provide policy makers with an appropriate tool to make more accurate predictions on future short-term NG demand. This is because the proposed approach is capable of handling non-linearity, complexity as well as uncertainty that may exist in actual data sets due to erratic responses and measurement errors.
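
    The input layout and pre/post-processing described in the abstract can be sketched as follows, with hypothetical demand figures; the ANFIS estimator itself is left out, so only the data preparation around it is shown.

```python
def minmax_scale(values):
    """Pre-processing: scale a series to [0, 1]."""
    lo, hi = min(values), max(values)
    scaled = [(v - lo) / (hi - lo) for v in values]
    return scaled, (lo, hi)

def minmax_invert(scaled, bounds):
    """Post-processing: return predictions to the original scale."""
    lo, hi = bounds
    return [s * (hi - lo) + lo for s in scaled]

def build_samples(demand, demand_last_year, weekdays):
    """Inputs per the abstract: day of week, same-day demand last year,
    demand one day before, demand two days before."""
    samples = []
    for t in range(2, len(demand)):
        x = (weekdays[t], demand_last_year[t], demand[t - 1], demand[t - 2])
        samples.append((x, demand[t]))
    return samples

demand = [420.0, 455.0, 470.0, 460.0, 500.0]   # hypothetical NG demand series
scaled, bounds = minmax_scale(demand)
restored = minmax_invert(scaled, bounds)
samples = build_samples(demand, [400.0] * 5, [0, 1, 2, 3, 4])
print(samples[0])  # ((2, 400.0, 455.0, 420.0), 470.0)
```

    Scaling inputs to a common range and inverting the transform on the outputs is exactly the "pre-processed ... post-processed" loop the abstract refers to.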

  16. Keyboards: from Typewriters to Tablet Computers

    Directory of Open Access Journals (Sweden)

    Gintautas Grigas

    2014-06-01

    Full Text Available The evolution of Lithuanian keyboards is reviewed. Keyboards are divided into three categories according to the flexibility of their adaptation for typing Lithuanian texts: 1) mechanical typewriter keyboards (heavily adaptable), 2) electromechanical desktop or laptop computer keyboards, and 3) programmable touch screen tablet computer keyboards (easily adaptable). It is discussed how they were adapted for the Lithuanian language, and solutions in other languages are compared. Both successful and unsuccessful solutions are discussed. The reasons for failures, as well as their negative impact on writing culture and the formation of bad habits in the work with computers, are analyzed. Recommendations on how to improve the current situation are presented.

  17. Duration Adaptation Occurs Across the Sub- and Supra-Second Systems.

    Science.gov (United States)

    Shima, Shuhei; Murai, Yuki; Hashimoto, Yuki; Yotsumoto, Yuko

    2016-01-01

    After repetitive exposure to a stimulus of relatively short duration, a subsequent stimulus of long duration is perceived as being even longer, and after repetitive exposure to a stimulus of relatively long duration, a subsequent stimulus of short duration is perceived as being even shorter. This phenomenon is called duration adaptation, and has been reported only for sub-second durations. We examined whether duration adaptation also occurs for supra-second durations (Experiment 1) and whether duration adaptation occurs across sub- and supra-second durations (Experiment 2). Duration adaptation occurred not only for sub-second durations, but also for supra-second durations and across sub- and supra-second durations. These results suggest that duration adaptation involves an interval-independent system or two functionally related systems that are associated with both the sub- and supra-second durations.

  18. Unstructured mesh adaptivity for urban flooding modelling

    Science.gov (United States)

    Hu, R.; Fang, F.; Salinas, P.; Pain, C. C.

    2018-05-01

    Over the past few decades, urban floods have been gaining more attention due to their increase in frequency. To provide reliable flooding predictions in urban areas, various numerical models have been developed to perform high-resolution flood simulations. However, the use of high-resolution meshes across the whole computational domain causes a high computational burden. In this paper, a 2D control-volume and finite-element flood model using adaptive unstructured mesh technology has been developed. This adaptive unstructured mesh technique enables meshes to be adapted optimally in time and space in response to the evolving flow features, thus providing sufficient mesh resolution where and when it is required. It has the advantage of capturing the details of local flows and the wetting and drying front while reducing the computational cost. Complex topographic features are represented accurately during the flooding process. For example, high-resolution meshes around the buildings and steep regions are placed when the flooding water reaches these regions. In this work a flooding event that happened in 2002 in Glasgow, Scotland, United Kingdom, has been simulated to demonstrate the capability of the adaptive unstructured mesh flooding model. The simulations have been performed using both fixed and adaptive unstructured meshes, and the results have been compared with previously published 2D and 3D results. The presented method shows that the 2D adaptive mesh model provides accurate results while having a low computational cost.
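
    Gradient-driven mesh adaptivity of this kind can be illustrated with a toy 1D analogue: cells are refined where the solution jumps sharply, so resolution concentrates at the wetting front. This is a sketch of the principle only, not the paper's control-volume/finite-element model.

```python
def adapt_mesh(x, f, tol):
    """Insert midpoints where the solution jump across a cell exceeds
    `tol` -- a toy 1D analogue of gradient-driven mesh adaptivity."""
    new_x = [x[0]]
    for i in range(len(x) - 1):
        if abs(f(x[i + 1]) - f(x[i])) > tol:
            new_x.append(0.5 * (x[i] + x[i + 1]))   # refine this cell
        new_x.append(x[i + 1])
    return new_x

# water depth with a sharp wetting front at x = 0.5 (hypothetical)
depth = lambda x: 0.0 if x < 0.5 else 1.0
mesh = [i / 10 for i in range(11)]
for _ in range(3):                                  # three adaptation sweeps
    mesh = adapt_mesh(mesh, depth, tol=0.5)
print(len(mesh))  # 14: every new point lands next to the front
```

    Each sweep adds exactly one point, always in the cell straddling the front, which is the 1D picture of "sufficient mesh resolution where and when it is required".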

  19. Computed Tomography Image Quality Evaluation of a New Iterative Reconstruction Algorithm in the Abdomen (Adaptive Statistical Iterative Reconstruction-V) a Comparison With Model-Based Iterative Reconstruction, Adaptive Statistical Iterative Reconstruction, and Filtered Back Projection Reconstructions.

    Science.gov (United States)

    Goodenberger, Martin H; Wagner-Bartak, Nicolaus A; Gupta, Shiva; Liu, Xinming; Yap, Ramon Q; Sun, Jia; Tamm, Eric P; Jensen, Corey T

    The purpose of this study was to compare abdominopelvic computed tomography images reconstructed with adaptive statistical iterative reconstruction-V (ASIR-V) with model-based iterative reconstruction (Veo 3.0), ASIR, and filtered back projection (FBP). Abdominopelvic computed tomography scans for 36 patients (26 males and 10 females) were reconstructed using FBP, ASIR (80%), Veo 3.0, and ASIR-V (30%, 60%, 90%). Mean ± SD patient age was 32 ± 10 years with mean ± SD body mass index of 26.9 ± 4.4 kg/m². Images were reviewed by 2 independent readers in a blinded, randomized fashion. Hounsfield unit, noise, and contrast-to-noise ratio (CNR) values were calculated for each reconstruction algorithm for further comparison. Phantom evaluation of low-contrast detectability (LCD) and high-contrast resolution was performed. Adaptive statistical iterative reconstruction-V 30%, ASIR-V 60%, and ASIR 80% were generally superior qualitatively compared with ASIR-V 90%, Veo 3.0, and FBP (P < .05). … ASIR-V 60%, with respective CNR values of 5.54 ± 2.39, 8.78 ± 3.15, and 3.49 ± 1.77 (P < .05). … and ASIR 80% had the best and worst spatial resolution, respectively. Adaptive statistical iterative reconstruction-V 30% and ASIR-V 60% provided the best combination of qualitative and quantitative performance. Adaptive statistical iterative reconstruction 80% was equivalent qualitatively, but demonstrated inferior spatial resolution and LCD.
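
    The contrast-to-noise ratio used in this comparison is commonly computed as the attenuation difference between a region of interest and its background, divided by the background noise (standard deviation). A sketch with hypothetical HU samples:

```python
import numpy as np

def cnr(roi, background):
    """Contrast-to-noise ratio as commonly defined for CT image quality:
    mean HU difference divided by the background noise (sample SD)."""
    roi, background = np.asarray(roi, float), np.asarray(background, float)
    return (roi.mean() - background.mean()) / background.std(ddof=1)

# hypothetical HU samples from a lesion ROI and adjacent liver background
lesion = [90, 95, 88, 92, 91]
liver = [60, 62, 58, 61, 59]
print(round(cnr(lesion, liver), 2))  # → 19.73
```

    Iterative reconstructions raise CNR chiefly by lowering the denominator (noise) at matched dose, which is what the reported CNR differences between algorithms reflect.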

  20. Normalised subband adaptive filtering with extended adaptiveness on degree of subband filters

    Science.gov (United States)

    Samuyelu, Bommu; Rajesh Kumar, Pullakura

    2017-12-01

    This paper proposes an adaptive normalised subband adaptive filtering (NSAF) scheme to improve NSAF performance. In the proposed NSAF, adaptiveness is extended beyond its variants in two ways: first, the step-size is made adaptive, and second, the selection of subbands is made adaptive. Hence, the proposed NSAF is termed here as variable step-size-based NSAF with selected subbands (VS-SNSAF). Experimental investigations are carried out to demonstrate the performance (in terms of convergence) of the VS-SNSAF against the conventional NSAF and its state-of-the-art adaptive variants. The results report the superior performance of VS-SNSAF over the traditional NSAF and its variants. Its stability and robustness against noise are also proved, though at the cost of substantial computing complexity.
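
    The variable step-size half of the proposal can be illustrated on a plain full-band NLMS filter, where the step size is driven by smoothed error power: large steps while the error is big, small steps near convergence. This is a sketch of the idea only, not the subband VS-SNSAF algorithm, and the step-size rule here is a made-up heuristic.

```python
import numpy as np

def vs_nlms(x, d, order=8, mu_max=1.0, alpha=0.95, eps=1e-6):
    """Full-band NLMS with an error-power-driven variable step size --
    an illustration of the variable-step idea, not the subband scheme."""
    w = np.zeros(order)
    p = 0.0                                   # smoothed error power
    errors = []
    for n in range(order, len(x)):
        u = x[n - order + 1:n + 1][::-1]      # most recent `order` samples
        e = d[n] - w @ u
        p = alpha * p + (1 - alpha) * e * e
        mu = mu_max * p / (p + 0.01)          # large step while error is big
        w += mu * e * u / (u @ u + eps)
        errors.append(e)
    return w, np.array(errors)

rng = np.random.default_rng(0)
x = rng.standard_normal(5000)
h = np.array([0.5, -0.3, 0.2, 0.1, 0.0, 0.0, 0.0, 0.0])   # unknown system
d = np.convolve(x, h)[:len(x)] + 0.01 * rng.standard_normal(len(x))
w, e = vs_nlms(x, d)
print(np.round(w[:4], 2))   # ≈ [0.5, -0.3, 0.2, 0.1]
```

    The variable step trades the usual fixed-step compromise between fast convergence and low steady-state misadjustment for both at once, the same motivation as in VS-SNSAF.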

  1. Circuit motifs for contrast-adaptive differentiation in early sensory systems: the role of presynaptic inhibition and short-term plasticity.

    Science.gov (United States)

    Zhang, Danke; Wu, Si; Rasch, Malte J

    2015-01-01

    In natural signals, such as the luminance values across a visual scene, abrupt changes in intensity value are often more relevant to an organism than intensity values at other positions and times. Thus to reduce redundancy, sensory systems are specialized to detect the times and amplitudes of informative abrupt changes in the input stream rather than coding the intensity values at all times. In theory, a system that responds transiently to fast changes is called a differentiator. In principle, several different neural circuit mechanisms exist that are capable of responding transiently to abrupt input changes. However, it is unclear which circuit would be best suited for early sensory systems, where the dynamic range of the natural input signals can be very wide. We here compare the properties of different simple neural circuit motifs for implementing signal differentiation. We found that a circuit motif based on presynaptic inhibition (PI) is unique in a sense that the vesicle resources in the presynaptic site can be stably maintained over a wide range of stimulus intensities, making PI a biophysically plausible mechanism to implement a differentiator with a very wide dynamical range. Moreover, by additionally considering short-term plasticity (STP), differentiation becomes contrast adaptive in the PI-circuit but not in other potential neural circuit motifs. Numerical simulations show that the behavior of the adaptive PI-circuit is consistent with experimental observations suggesting that adaptive presynaptic inhibition might be a good candidate neural mechanism to achieve differentiation in early sensory systems.
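
    The transient, differentiator-like behavior that vesicle dynamics can produce is easy to see in a minimal short-term synaptic depression model (Tsodyks-Markram style; one ingredient of the circuit motifs compared in the paper, with hypothetical parameters): the output spikes when the input rate steps up, then adapts back toward a nearly rate-independent level.

```python
def simulate(rates, dt=0.001, tau_rec=0.5, U=0.5):
    """Short-term synaptic depression, forward-Euler integration:
    dx/dt = (1 - x)/tau_rec - U*x*r(t), output y = U*x*r(t)."""
    x = 1.0                      # fraction of available vesicle resources
    out = []
    for r in rates:
        y = U * x * r            # resources released per unit time
        x += dt * ((1.0 - x) / tau_rec - y)
        out.append(y)
    return out

# input rate steps from 10 Hz to 50 Hz at t = 1 s
rates = [10.0] * 1000 + [50.0] * 1000
y = simulate(rates)
peak = max(y[1000:])             # transient right after the step
steady = y[-1]                   # adapted level
print(round(peak, 2), round(steady, 2))
```

    At steady state the output is U*r / (1 + tau_rec*U*r), nearly rate-independent for large r, so most of the response energy marks the change itself.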

  2. Goal-recognition-based adaptive brain-computer interface for navigating immersive robotic systems

    Science.gov (United States)

    Abu-Alqumsan, Mohammad; Ebert, Felix; Peer, Angelika

    2017-06-01

    Objective. This work proposes principled strategies for self-adaptations in EEG-based Brain-computer interfaces (BCIs) as a way out of the bandwidth bottleneck resulting from the considerable mismatch between the low-bandwidth interface and the bandwidth-hungry application, and a way to enable fluent and intuitive interaction in embodiment systems. The main focus is laid upon inferring the hidden target goals of users while navigating in a remote environment as a basis for possible adaptations. Approach. To reason about possible user goals, a general user-agnostic Bayesian update rule is devised to be recursively applied upon the arrival of evidences, i.e. user input and user gaze. Experiments were conducted with healthy subjects within robotic embodiment settings to evaluate the proposed method. These experiments varied along three factors: the type of the robot/environment (simulated and physical), the type of the interface (keyboard or BCI), and the way goal recognition (GR) is used to guide a simple shared control (SC) driving scheme. Main results. Our results show that the proposed GR algorithm is able to track and infer the hidden user goals with relatively high precision and recall. Further, the realized SC driving scheme benefits from the output of the GR system and is able to reduce the user effort needed to accomplish the assigned tasks. Despite the fact that the BCI requires higher effort compared to the keyboard conditions, most subjects were able to complete the assigned tasks, and the proposed GR system is additionally shown able to handle the uncertainty in user input during SSVEP-based interaction. The SC application of the belief vector indicates that the benefits of the GR module are more pronounced for BCIs, compared to the keyboard interface. Significance. Being based on intuitive heuristics that model the behavior of the general population during the execution of navigation tasks, the proposed GR method can be used without prior tuning for the
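
    The user-agnostic recursive Bayesian update at the heart of such a GR system reduces to posterior ∝ prior × likelihood per arriving evidence. The sketch below uses a hypothetical heading-based likelihood model, not the authors' exact one: inputs that point toward a candidate goal make that goal more likely.

```python
import math

def update_belief(belief, likelihoods):
    """One recursive Bayesian step: posterior ∝ prior × likelihood."""
    posterior = {g: belief[g] * likelihoods[g] for g in belief}
    z = sum(posterior.values())
    return {g: p / z for g, p in posterior.items()}

def heading_likelihood(goal_bearing, user_heading, kappa=2.0):
    """Hypothetical evidence model: inputs aimed at a goal raise its likelihood."""
    return math.exp(kappa * math.cos(goal_bearing - user_heading))

goals = {"door": 0.0, "desk": math.pi / 2, "window": math.pi}  # bearings
belief = {g: 1 / 3 for g in goals}                             # uniform prior
for heading in [0.1, -0.05, 0.2]:                              # successive user inputs
    belief = update_belief(
        belief, {g: heading_likelihood(b, heading) for g, b in goals.items()}
    )
best = max(belief, key=belief.get)
print(best, round(belief[best], 2))   # "door" dominates the belief
```

    A shared-control layer can then weight its assistance by this belief vector, intervening strongly only once one goal clearly dominates.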

  3. Cone Beam Computed Tomography-Derived Adaptive Radiotherapy for Radical Treatment of Esophageal Cancer

    International Nuclear Information System (INIS)

    Hawkins, Maria A.; Brooks, Corrinne; Hansen, Vibeke N.; Aitken, Alexandra; Tait, Diana M.

    2010-01-01

    Purpose: To investigate the potential for reduction in normal tissue irradiation by creating a patient-specific planning target volume (PTV) using cone beam computed tomography (CBCT) imaging acquired in the first week of radiotherapy for patients receiving radical radiotherapy. Methods and materials: Patients receiving radical RT for carcinoma of the esophagus were investigated. The PTV is defined as CTV (tumor, nodes) plus esophagus outlined 3 to 5 cm cranio-caudally, with a 1.5-cm circumferential margin added (clinical plan). Prefraction CBCTs are acquired on Days 1 to 4, then weekly. No correction for setup error is made. The images are imported into the planning system. The tumor and esophagus for the length of the PTV are contoured on each CBCT and a 5-mm margin is added. A composite volume (PTV1) is created using Week 1 composite CBCT volumes. The same process is repeated using CBCT Weeks 2 to 6 (PTV2). A new plan is created using PTV1 (adaptive plan). The coverage of the 95% isodose of PTV1 is evaluated on PTV2. Dose-volume histograms (DVH) for lungs, heart, and cord for the two plans are compared. Results: A total of 139 CBCTs for 14 cases were analyzed. For the adaptive plan the coverage of the 95% prescription isodose for PTV1 = 95.6% ± 4% and the PTV2 = 96.8% ± 4.1% (t test, 0.19). Lungs V20 (15.6 Gy vs. 10.2 Gy) and heart mean dose (26.9 Gy vs. 20.7 Gy) were significantly smaller for the adaptive plan. Conclusions: A reduced planning volume can be constructed within the first week of treatment using CBCT. A single plan modification can be performed within the second week of treatment with considerable reduction in organ-at-risk dose.

  4. Adaptive homodyne phase discrimination and qubit measurement

    International Nuclear Information System (INIS)

    Sarovar, Mohan; Whaley, K. Birgitta

    2007-01-01

Fast and accurate measurement is a highly desirable, if not vital, feature of quantum computing architectures. In this work we investigate the usefulness of adaptive measurements in improving the speed and accuracy of qubit measurement. We examine a particular class of quantum computing architectures, ones based on qubits coupled to well-controlled harmonic oscillator modes (reminiscent of cavity QED), where adaptive schemes for measurement are particularly appropriate. In such architectures, qubit measurement is equivalent to phase discrimination for a mode of the electromagnetic field, and we examine adaptive techniques for doing this. In the final section we present a concrete example of applying adaptive measurement to the particularly well-developed circuit-QED architecture.

  5. Adaptive finite element methods for differential equations

    CERN Document Server

    Bangerth, Wolfgang

    2003-01-01

These Lecture Notes discuss concepts of `self-adaptivity' in the numerical solution of differential equations, with emphasis on Galerkin finite element methods. The key issues are a posteriori error estimation and automatic mesh adaptation. Besides the traditional approach of energy-norm error control, a new duality-based technique, the Dual Weighted Residual method for goal-oriented error estimation, is discussed in detail. This method aims at economical computation of arbitrary quantities of physical interest by properly adapting the computational mesh. This is typically required in the design cycles of technical applications. For example, the drag coefficient of a body immersed in a viscous flow is computed, then it is minimized by varying certain control parameters, and finally the stability of the resulting flow is investigated by solving an eigenvalue problem. `Goal-oriented' adaptivity is designed to achieve these tasks with minimal cost. At the end of each chapter some exercises are posed in order ...

  6. Determinants of Short-Term Export Performance in Pakistan

    OpenAIRE

    Subhani, Muhammad Imtiaz; Osman, Ms.Amber; Habib, Sukaina

    2010-01-01

    This research investigates the interdependency between independent (Increase of pricing strategy adaptation, Increase of export intensity, Firm's commitment to exporting, Export market development, Export market competition, Past Pricing Strategy Adaptation, Past Export Performance Satisfaction, Past Export Intensity, Export market distance) and dependent variables (i.e. Expected Short-Term Export Performance improvement) of export performance. The framework is tested via a survey through que...

  7. Modeling Two Types of Adaptation to Climate Change

    Science.gov (United States)

    Mitigation and adaptation are the two key responses available to policymakers to reduce the risks of climate change. We model these two policies together in a new DICE-based integrated assessment model that characterizes adaptation as either short-lived flow spending or long-live...

  8. Convergence acceleration of Navier-Stokes equation using adaptive wavelet method

    International Nuclear Information System (INIS)

    Kang, Hyung Min; Ghafoor, Imran; Lee, Do Hyung

    2010-01-01

    An efficient adaptive wavelet method is proposed for the enhancement of computational efficiency of the Navier-Stokes equations. The method is based on sparse point representation (SPR), which uses the wavelet decomposition and thresholding to obtain a sparsely distributed dataset. The threshold mechanism is modified in order to maintain the spatial accuracy of a conventional Navier-Stokes solver by adapting the threshold value to the order of spatial truncation error. The computational grid can be dynamically adapted to a transient solution to reflect local changes in the solution. The flux evaluation is then carried out only at the points of the adapted dataset, which reduces the computational effort and memory requirements. A stabilization technique is also implemented to avoid the additional numerical errors introduced by the threshold procedure. The numerical results of the adaptive wavelet method are compared with a conventional solver to validate the enhancement in computational efficiency of Navier-Stokes equations without the degeneration of the numerical accuracy of a conventional solver
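The thresholding idea behind the sparse point representation can be illustrated with a minimal one-dimensional sketch: a single Haar decomposition level, keeping only grid points whose detail coefficient exceeds the threshold. The signal, threshold value, and function names below are illustrative assumptions, not the authors' implementation.

```python
def haar_details(signal):
    # One level of an (unnormalized) Haar transform: pairwise half-differences
    return [(a - b) / 2 for a, b in zip(signal[::2], signal[1::2])]

def sparse_points(signal, eps):
    # Keep only grid points whose detail coefficient exceeds the threshold,
    # mimicking the SPR idea: smooth regions are dropped, sharp features kept.
    return [2 * i for i, d in enumerate(haar_details(signal)) if abs(d) > eps]

signal = [0.0] * 7 + [1.0] * 9   # smooth everywhere except one jump
kept = sparse_points(signal, eps=0.1)
```

Only the grid point straddling the jump survives the threshold; a flux evaluation restricted to `kept` would skip the smooth region entirely.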

  9. Relative codon adaptation: a generic codon bias index for prediction of gene expression.

    Science.gov (United States)

    Fox, Jesse M; Erill, Ivan

    2010-06-01

The development of codon bias indices (CBIs) remains an active field of research due to their myriad applications in computational biology. Recently, the relative codon usage bias (RCBS) was introduced as a novel CBI able to estimate codon bias without using a reference set. The results of this new index when applied to Escherichia coli and Saccharomyces cerevisiae led the authors of the original publications to conclude that natural selection favours higher expression and enhanced codon usage optimization in short genes. Here, we show that this conclusion was flawed and based on the systematic oversight of an intrinsic bias for short sequences in the RCBS index and of biases in the small data sets used for validation in E. coli. Furthermore, we reveal how the RCBS can be corrected to produce useful results and how its underlying principle, which we here term relative codon adaptation (RCA), can be made into a powerful reference-set-based index that directly takes into account the genomic base composition. Finally, we show that RCA outperforms the codon adaptation index (CAI) as a predictor of gene expression when operating on the CAI reference set and that this improvement is significantly larger when analysing genomes with high mutational bias.
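As a point of comparison, a reference-set-based index in the spirit of the CAI mentioned above can be sketched in a few lines: relative adaptiveness w of each codon within its synonymous family, then the geometric mean of w along a gene. The toy reference set, the two-amino-acid codon table, and the 0.5 pseudo-count are illustrative assumptions; RCA itself additionally corrects for genomic base composition.

```python
from collections import Counter
from math import exp, log

# Synonymous families for the two amino acids in the toy example
FAMILIES = {"GCT": "A", "GCC": "A", "GCA": "A", "GCG": "A",  # Ala
            "AAA": "K", "AAG": "K"}                          # Lys

def codons(seq):
    return [seq[i:i + 3] for i in range(0, len(seq) - len(seq) % 3, 3)]

def relative_adaptiveness(reference):
    # w(codon) = count / max count within its synonymous family
    counts = Counter(c for seq in reference for c in codons(seq))
    w = {}
    for codon, aa in FAMILIES.items():
        family_max = max(counts[c] for c, a in FAMILIES.items() if a == aa)
        if family_max:
            w[codon] = max(counts[codon], 0.5) / family_max  # 0.5 pseudo-count
    return w

def cai(seq, w):
    # Geometric mean of the w values along the gene
    used = [w[c] for c in codons(seq) if c in w]
    return exp(sum(log(x) for x in used) / len(used))

reference = ["GCTGCCGCTAAAAAG", "GCTAAAGCTGCCAAG"]  # hypothetical reference set
w = relative_adaptiveness(reference)
score = cai("GCTGCTAAA", w)  # a gene using only the preferred codons
```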

  10. Large-Scale Assessment of a Fully Automatic Co-Adaptive Motor Imagery-Based Brain Computer Interface.

    Directory of Open Access Journals (Sweden)

    Laura Acqualagna

Full Text Available In recent years, Brain Computer Interface (BCI) technology has benefited from the development of sophisticated machine learning methods that let the user operate the BCI after a few trials of calibration. One remarkable example is the recent development of co-adaptive techniques that have proved to extend the use of BCIs to people unable to achieve successful control with the standard BCI procedure. Especially for BCIs based on the modulation of the Sensorimotor Rhythm (SMR) these improvements are essential, since a non-negligible percentage of users is unable to operate SMR-BCIs efficiently. In this study we evaluated for the first time a fully automatic co-adaptive BCI system on a large scale. A pool of 168 participants naive to BCIs operated the co-adaptive SMR-BCI in a single session. Different psychological interventions were performed prior to the BCI session in order to investigate how motor coordination training and relaxation could influence BCI performance. A neurophysiological indicator based on the Power Spectral Density (PSD) was extracted from the recording of a few minutes of resting-state brain activity and tested as a predictor of BCI performance. Results show that high accuracies in operating the BCI could be reached by the majority of the participants before the end of the session. BCI performance could be significantly predicted by the neurophysiological indicator, consolidating the validity of the model previously developed. Nevertheless, we still found about 22% of users with performance significantly lower than the threshold of efficient BCI control at the end of the session. Since inter-subject variability remains the major problem of BCI technology, we point out crucial issues for those who did not achieve sufficient control. Finally, we propose valid developments to move a step forward toward the applicability of the promising co-adaptive methods.

11. Computationally efficient implementation of sparse-tap FIR adaptive filters with tap-position control on Intel IA-32 processors

    OpenAIRE

    Hirano, Akihiro; Nakayama, Kenji

    2008-01-01

This paper presents a computationally efficient implementation of sparse-tap FIR adaptive filters with tap-position control on Intel IA-32 processors with single-instruction multiple-data (SIMD) capability. In order to overcome random-order memory access, which prevents vectorization, a block-based processing and a re-ordering buffer are introduced. A dynamic register allocation and the use of memory-to-register operations help maximize the loop-unrolling level. Up to 66% speedup ...

  12. libgapmis: extending short-read alignments.

    Science.gov (United States)

    Alachiotis, Nikolaos; Berger, Simon; Flouri, Tomáš; Pissis, Solon P; Stamatakis, Alexandros

    2013-01-01

A wide variety of short-read alignment programmes have been published recently to tackle the problem of mapping millions of short reads to a reference genome, focusing on different aspects of the procedure such as time and memory efficiency, sensitivity, and accuracy. These tools allow for a small number of mismatches in the alignment; however, their ability to allow for gaps varies greatly, with many performing poorly or not allowing them at all. The seed-and-extend strategy is applied in most short-read alignment programmes. After aligning a substring of the reference sequence against the high-quality prefix of a short read--the seed--an important problem is to find the best possible alignment between a substring of the reference sequence succeeding the seed and the remaining low-quality suffix of the read--the extension. The fact that the reads are rather short and that the gap occurrence frequency observed in various studies is rather low suggests that aligning (parts of) those reads with a single gap is in fact desirable. In this article, we present libgapmis, a library for extending pairwise short-read alignments. Apart from the standard CPU version, it includes ultrafast SSE- and GPU-based implementations. libgapmis is based on an algorithm computing a modified version of the traditional dynamic-programming matrix for sequence alignment. Extensive experimental results demonstrate that the functions of the CPU version provided in this library accelerate the computations by a factor of 20 compared to other programmes. The analogous SSE- and GPU-based implementations accelerate the computations by a factor of 6 and 11, respectively, compared to the CPU version. The library also provides the user the flexibility to split the read into fragments, based on the observed gap occurrence frequency and the length of the read, thereby allowing for a variable, but bounded, number of gaps in the alignment.
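The single-bounded-gap extension problem described above can be illustrated with a brute-force sketch: try every placement and length of one gap in either sequence and keep the alignment with the fewest mismatches. This is not libgapmis's algorithm (which computes a modified dynamic-programming matrix with SSE/GPU acceleration); the scoring rule, fewest mismatches then shortest gap, is a simplifying assumption.

```python
def mismatches(a, b):
    # Hamming-style count over the overlapping prefix of the two strings
    return sum(x != y for x, y in zip(a, b))

def extend_single_gap(ref, read, max_gap):
    # Enumerate a single gap of length 1..max_gap at every position,
    # deleting either from the reference or from the read.
    best = (mismatches(ref, read), 0)  # ungapped alignment
    for g in range(1, max_gap + 1):
        for pos in range(max(len(ref), len(read)) + 1):
            best = min(best, (mismatches(ref[:pos] + ref[pos + g:], read), g))
            best = min(best, (mismatches(ref, read[:pos] + read[pos + g:]), g))
    return best  # (mismatch count, gap length used)

# Read missing one base relative to the reference: one gap fixes it
best = extend_single_gap("ACGTTACGT", "ACGTACGT", max_gap=2)
```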

  13. Short Communication Report

    African Journals Online (AJOL)

    Dr Ahmed

    As information requirements become more complex, users have adapted computer in almost all their daily endeavors. This has made a lot of users to be online and perform most of their businesses online. The basic data security utility applications that are provided by most operating systems and other application software ...

  14. Adaptive mesh refinement for storm surge

    KAUST Repository

    Mandli, Kyle T.; Dawson, Clint N.

    2014-01-01

    An approach to utilizing adaptive mesh refinement algorithms for storm surge modeling is proposed. Currently numerical models exist that can resolve the details of coastal regions but are often too costly to be run in an ensemble forecasting framework without significant computing resources. The application of adaptive mesh refinement algorithms substantially lowers the computational cost of a storm surge model run while retaining much of the desired coastal resolution. The approach presented is implemented in the GeoClaw framework and compared to ADCIRC for Hurricane Ike along with observed tide gauge data and the computational cost of each model run. © 2014 Elsevier Ltd.

  15. Adaptive mesh refinement for storm surge

    KAUST Repository

    Mandli, Kyle T.

    2014-03-01

    An approach to utilizing adaptive mesh refinement algorithms for storm surge modeling is proposed. Currently numerical models exist that can resolve the details of coastal regions but are often too costly to be run in an ensemble forecasting framework without significant computing resources. The application of adaptive mesh refinement algorithms substantially lowers the computational cost of a storm surge model run while retaining much of the desired coastal resolution. The approach presented is implemented in the GeoClaw framework and compared to ADCIRC for Hurricane Ike along with observed tide gauge data and the computational cost of each model run. © 2014 Elsevier Ltd.

  16. Adaptive Spectral Doppler Estimation

    DEFF Research Database (Denmark)

    Gran, Fredrik; Jakobsson, Andreas; Jensen, Jørgen Arendt

    2009-01-01

In this paper, 2 adaptive spectral estimation techniques are analyzed for spectral Doppler ultrasound. The purpose is to minimize the observation window needed to estimate the spectrogram, to provide a better temporal resolution and gain more flexibility when designing the data acquisition sequence. The methods can also provide better quality of the estimated power spectral density (PSD) of the blood signal. Adaptive spectral estimation techniques are known to provide good spectral resolution and contrast even when the observation window is very short. The 2 adaptive techniques are tested and compared with the averaged periodogram (Welch's method). The blood power spectral capon (BPC) method is based on a standard minimum variance technique adapted to account for both averaging over slow-time and depth. The blood amplitude and phase estimation technique (BAPES) is based on finding a set ...

  17. Career Adapt-Abilities Scale in a French-Speaking Swiss Sample: Psychometric Properties and Relationships to Personality and Work Engagement

    Science.gov (United States)

    Rossier, Jerome; Zecca, Gregory; Stauffer, Sarah D.; Maggiori, Christian; Dauwalder, Jean-Pierre

    2012-01-01

The aim of this study was to analyze the psychometric properties of the Career Adapt-Abilities Scale (CAAS) in a French-speaking Swiss sample and its relationship with personality dimensions and work engagement. The heterogeneous sample of 391 participants (M_age = 39.59, SD = 12.30) completed the CAAS-International and a short version…

  18. Reducing adapter synthesis to controller synthesis

    NARCIS (Netherlands)

    Gierds, C.; Mooij, A.J.; Wolf, K.

    2012-01-01

    Service-oriented computing aims to create complex systems by composing less-complex systems, called services. Since services can be developed independently, the integration of services requires an adaptation mechanism for bridging any incompatibilities. Behavioral adapters aim to adjust the

  19. Water System Adaptation To Hydrological Changes: Module 12, Models and Tools for Stormwater and Wastewater System Adaptation

    Science.gov (United States)

    This course will introduce students to the fundamental principles of water system adaptation to hydrological changes, with emphasis on data analysis and interpretation, technical planning, and computational modeling. Starting with real-world scenarios and adaptation needs, the co...

  20. Adaptive hybrid mesh refinement for multiphysics applications

    International Nuclear Information System (INIS)

    Khamayseh, Ahmed; Almeida, Valmor de

    2007-01-01

    The accuracy and convergence of computational solutions of mesh-based methods is strongly dependent on the quality of the mesh used. We have developed methods for optimizing meshes that are comprised of elements of arbitrary polygonal and polyhedral type. We present in this research the development of r-h hybrid adaptive meshing technology tailored to application areas relevant to multi-physics modeling and simulation. Solution-based adaptation methods are used to reposition mesh nodes (r-adaptation) or to refine the mesh cells (h-adaptation) to minimize solution error. The numerical methods perform either the r-adaptive mesh optimization or the h-adaptive mesh refinement method on the initial isotropic or anisotropic meshes to equidistribute weighted geometric and/or solution error function. We have successfully introduced r-h adaptivity to a least-squares method with spherical harmonics basis functions for the solution of the spherical shallow atmosphere model used in climate modeling. In addition, application of this technology also covers a wide range of disciplines in computational sciences, most notably, time-dependent multi-physics, multi-scale modeling and simulation

  1. Synergistic effect of supplemental enteral nutrients and exogenous glucagon-like peptide 2 on intestinal adaptation in a rat model of short bowel syndrome

    DEFF Research Database (Denmark)

    Liu, Xiaowen; Nelson, David W; Holst, Jens Juul

    2006-01-01

BACKGROUND: Short bowel syndrome (SBS) can lead to intestinal failure and require total or supplemental parenteral nutrition (TPN or PN, respectively). Glucagon-like peptide 2 (GLP-2) is a nutrient-dependent, proglucagon-derived gut hormone that stimulates intestinal adaptation. OBJECTIVE: Our objective was to determine whether supplemental enteral nutrients (SEN) modulate the intestinotrophic response to a low dose of GLP-2 coinfused with PN in a rat model of SBS (60% jejunoileal resection plus cecectomy). DESIGN: Rats were randomly assigned to 8 treatments by using a 2 x 2 x 2 factorial design ...

  2. Partial update least-square adaptive filtering

    CERN Document Server

    Xie, Bei

    2014-01-01

Adaptive filters play an important role in the fields related to digital signal processing and communication, such as system identification, noise cancellation, channel equalization, and beamforming. In practical applications, the computational complexity of an adaptive filter is an important consideration. The Least Mean Square (LMS) algorithm is widely used because of its low computational complexity (O(N)) and simplicity in implementation. The least squares algorithms, such as Recursive Least Squares (RLS), Conjugate Gradient (CG), and Euclidean Direction Search (EDS), can converge faster ...
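The low complexity of the LMS update mentioned above is easy to see in code: each sample costs O(N) multiply-adds for an N-tap filter. Below is a minimal system-identification sketch; the unknown system, the deterministic pseudo-random excitation, and the step size are illustrative assumptions.

```python
def lcg(n, seed=1):
    # Deterministic pseudo-random excitation in [-0.5, 0.5)
    vals, s = [], seed
    for _ in range(n):
        s = (1103515245 * s + 12345) % (1 << 31)
        vals.append(s / (1 << 31) - 0.5)
    return vals

def lms_filter(x, d, n_taps=4, mu=0.1):
    # O(n_taps) work per sample: filter output, error, gradient update
    w = [0.0] * n_taps
    for i in range(n_taps, len(x)):
        tap_in = x[i - n_taps:i][::-1]  # newest sample first
        e = d[i] - sum(wi * xi for wi, xi in zip(w, tap_in))
        w = [wi + mu * e * xi for wi, xi in zip(w, tap_in)]
    return w

x = lcg(3000)
# Unknown system to identify: d[i] = 0.5*x[i-1] - 0.3*x[i-2]
d = [0.0, 0.0] + [0.5 * x[i - 1] - 0.3 * x[i - 2] for i in range(2, len(x))]
w = lms_filter(x, d)
```

After a few hundred samples the tap weights converge toward the unknown system's coefficients [0.5, -0.3, 0, 0].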

  3. Adaptive filtering and change detection

    CERN Document Server

    Gustafsson, Fredrik

    2003-01-01

Adaptive filtering is a classical branch of digital signal processing (DSP). Industrial interest in adaptive filtering grows continuously with the increase in computer performance that allows ever more complex algorithms to be run in real-time. Change detection is a type of adaptive filtering for non-stationary signals and is also the basic tool in fault detection and diagnosis. Often considered as separate subjects, Adaptive Filtering and Change Detection bridges a gap in the literature with a unified treatment of these areas, emphasizing that change detection is a natural extension ...

  4. TAREAN: a computational tool for identification and characterization of satellite DNA from unassembled short reads.

    Science.gov (United States)

    Novák, Petr; Ávila Robledillo, Laura; Koblížková, Andrea; Vrbová, Iva; Neumann, Pavel; Macas, Jirí

    2017-07-07

    Satellite DNA is one of the major classes of repetitive DNA, characterized by tandemly arranged repeat copies that form contiguous arrays up to megabases in length. This type of genomic organization makes satellite DNA difficult to assemble, which hampers characterization of satellite sequences by computational analysis of genomic contigs. Here, we present tandem repeat analyzer (TAREAN), a novel computational pipeline that circumvents this problem by detecting satellite repeats directly from unassembled short reads. The pipeline first employs graph-based sequence clustering to identify groups of reads that represent repetitive elements. Putative satellite repeats are subsequently detected by the presence of circular structures in their cluster graphs. Consensus sequences of repeat monomers are then reconstructed from the most frequent k-mers obtained by decomposing read sequences from corresponding clusters. The pipeline performance was successfully validated by analyzing low-pass genome sequencing data from five plant species where satellite DNA was previously experimentally characterized. Moreover, novel satellite repeats were predicted for the genome of Vicia faba and three of these repeats were verified by detecting their sequences on metaphase chromosomes using fluorescence in situ hybridization. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
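The monomer-reconstruction step described above, consensus from the most frequent k-mers, can be caricatured with a tiny tandem repeat: chain the most frequent k-mers by (k-1)-overlap until the monomer length is reached. The greedy walk and the toy read set are illustrative assumptions; TAREAN itself uses graph-based clustering of real reads and detects circular cluster-graph structures.

```python
from collections import Counter

def kmer_counts(reads, k):
    c = Counter()
    for r in reads:
        for i in range(len(r) - k + 1):
            c[r[i:i + k]] += 1
    return c

def greedy_monomer(counts, k, length):
    # Chain the most frequent k-mers by (k-1)-overlap; a tandem repeat
    # makes the k-mer graph circular, so the walk traces out the monomer.
    seq = counts.most_common(1)[0][0]
    while len(seq) < length:
        suffix = seq[-(k - 1):]
        nxt = max((km for km in counts if km.startswith(suffix)),
                  key=lambda km: counts[km])
        seq += nxt[-1]
    return seq

monomer = "ACGGTA"
reads = [(monomer * 5)[i:i + 12] for i in range(0, 18, 3)]  # overlapping reads
rebuilt = greedy_monomer(kmer_counts(reads, 4), k=4, length=len(monomer))
```

The reconstruction is recovered up to a rotation of the original monomer, as expected for a circular repeat.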

  5. Integrated mechanisms of anticipation and rate-of-change computations in cortical circuits.

    Directory of Open Access Journals (Sweden)

    Gabriel D Puccini

    2007-05-01

    Full Text Available Local neocortical circuits are characterized by stereotypical physiological and structural features that subserve generic computational operations. These basic computations of the cortical microcircuit emerge through the interplay of neuronal connectivity, cellular intrinsic properties, and synaptic plasticity dynamics. How these interacting mechanisms generate specific computational operations in the cortical circuit remains largely unknown. Here, we identify the neurophysiological basis of both the rate of change and anticipation computations on synaptic inputs in a cortical circuit. Through biophysically realistic computer simulations and neuronal recordings, we show that the rate-of-change computation is operated robustly in cortical networks through the combination of two ubiquitous brain mechanisms: short-term synaptic depression and spike-frequency adaptation. We then show how this rate-of-change circuit can be embedded in a convergently connected network to anticipate temporally incoming synaptic inputs, in quantitative agreement with experimental findings on anticipatory responses to moving stimuli in the primary visual cortex. Given the robustness of the mechanism and the widespread nature of the physiological machinery involved, we suggest that rate-of-change computation and temporal anticipation are principal, hard-wired functions of neural information processing in the cortical microcircuit.
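The core of the rate-of-change mechanism, short-term synaptic depression, can be sketched with a minimal Tsodyks-Markram-style resource model: steady-state output saturates with input rate, so the transient right after a rate step carries the derivative information. The parameters and the rate-based simplification are illustrative assumptions; the paper uses biophysically realistic spiking simulations and additionally includes spike-frequency adaptation.

```python
def depressing_synapse(rates, dt=0.001, tau=0.2, U=0.5):
    # Synaptic resources x recover with time constant tau and are consumed
    # in proportion to the presynaptic rate; output ~ U * x * rate.
    x, out = 1.0, []
    for r in rates:
        out.append(U * x * r)
        x += dt * ((1.0 - x) / tau - U * x * r)
    return out

# Step in input rate: the output spikes at the step, then relaxes toward a
# saturated steady state -- the transient encodes the rate of change.
rates = [10.0] * 500 + [20.0] * 500
out = depressing_synapse(rates)
```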

  6. Bacterial computing: a form of natural computing and its applications.

    Science.gov (United States)

    Lahoz-Beltra, Rafael; Navarro, Jorge; Marijuán, Pedro C

    2014-01-01

The capability to establish adaptive relationships with the environment is an essential characteristic of living cells. Both bacterial computing and bacterial intelligence are two general traits manifested along adaptive behaviors that respond to surrounding environmental conditions. These two traits have generated a variety of theoretical and applied approaches. As the different systems of bacterial signaling and the different ways of genetic change become better known and more carefully explored, the whole adaptive possibilities of bacteria may be studied under new angles. For instance, there appear to be instances of molecular "learning" within the mechanisms of evolution. More concretely, looking specifically at the time dimension, the bacterial mechanisms of learning and evolution appear as two different and related mechanisms for adaptation to the environment; in somatic time the former and in evolutionary time the latter. The present chapter reviews the possible application of both kinds of mechanisms to prokaryotic molecular computing schemes as well as to the solution of real-world problems.

  7. Adaptive-weighted total variation minimization for sparse data toward low-dose x-ray computed tomography image reconstruction.

    Science.gov (United States)

    Liu, Yan; Ma, Jianhua; Fan, Yi; Liang, Zhengrong

    2012-12-07

    Previous studies have shown that by minimizing the total variation (TV) of the to-be-estimated image with some data and other constraints, piecewise-smooth x-ray computed tomography (CT) can be reconstructed from sparse-view projection data without introducing notable artifacts. However, due to the piecewise constant assumption for the image, a conventional TV minimization algorithm often suffers from over-smoothness on the edges of the resulting image. To mitigate this drawback, we present an adaptive-weighted TV (AwTV) minimization algorithm in this paper. The presented AwTV model is derived by considering the anisotropic edge property among neighboring image voxels, where the associated weights are expressed as an exponential function and can be adaptively adjusted by the local image-intensity gradient for the purpose of preserving the edge details. Inspired by the previously reported TV-POCS (projection onto convex sets) implementation, a similar AwTV-POCS implementation was developed to minimize the AwTV subject to data and other constraints for the purpose of sparse-view low-dose CT image reconstruction. To evaluate the presented AwTV-POCS algorithm, both qualitative and quantitative studies were performed by computer simulations and phantom experiments. The results show that the presented AwTV-POCS algorithm can yield images with several notable gains, in terms of noise-resolution tradeoff plots and full-width at half-maximum values, as compared to the corresponding conventional TV-POCS algorithm.
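The adaptive weighting idea, an exponential function of the local intensity gradient, can be sketched in one dimension: neighbouring differences in smooth regions keep a weight near 1 (so noise is penalized), while large jumps get a weight near 0 (so edges survive). The weight form matches the description above, but the delta parameter, the signals, and the 1D setting are illustrative assumptions; the paper embeds these weights in a POCS reconstruction over image voxels.

```python
from math import exp

def tv(img):
    # Conventional total variation of a 1D signal
    return sum(abs(b - a) for a, b in zip(img, img[1:]))

def awtv(img, delta=0.1):
    # Adaptive-weighted TV: exponential down-weighting of large gradients
    total = 0.0
    for a, b in zip(img, img[1:]):
        w = exp(-((b - a) / delta) ** 2)  # ~1 in flat regions, ~0 at edges
        total += w * abs(b - a)
    return total

edge = [0.0, 0.0, 0.0, 1.0, 1.0, 1.0]      # one sharp edge
noisy = [0.0, 0.05, 0.0, 0.05, 0.0, 0.05]  # small oscillations
```

Plain TV penalizes the edge (tv(edge) = 1), while AwTV leaves it almost free yet still penalizes the small oscillations, which is the edge-preserving behaviour described above.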

  8. Computational complexity of algorithms for sequence comparison, short-read assembly and genome alignment.

    Science.gov (United States)

    Baichoo, Shakuntala; Ouzounis, Christos A

    A multitude of algorithms for sequence comparison, short-read assembly and whole-genome alignment have been developed in the general context of molecular biology, to support technology development for high-throughput sequencing, numerous applications in genome biology and fundamental research on comparative genomics. The computational complexity of these algorithms has been previously reported in original research papers, yet this often neglected property has not been reviewed previously in a systematic manner and for a wider audience. We provide a review of space and time complexity of key sequence analysis algorithms and highlight their properties in a comprehensive manner, in order to identify potential opportunities for further research in algorithm or data structure optimization. The complexity aspect is poised to become pivotal as we will be facing challenges related to the continuous increase of genomic data on unprecedented scales and complexity in the foreseeable future, when robust biological simulation at the cell level and above becomes a reality. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. Efficient computation of the elastography inverse problem by combining variational mesh adaption and a clustering technique

    International Nuclear Information System (INIS)

    Arnold, Alexander; Bruhns, Otto T; Reichling, Stefan; Mosler, Joern

    2010-01-01

    This paper is concerned with an efficient implementation suitable for the elastography inverse problem. More precisely, the novel algorithm allows us to compute the unknown stiffness distribution in soft tissue by means of the measured displacement field by considerably reducing the numerical cost compared to previous approaches. This is realized by combining and further elaborating variational mesh adaption with a clustering technique similar to those known from digital image compression. Within the variational mesh adaption, the underlying finite element discretization is only locally refined if this leads to a considerable improvement of the numerical solution. Additionally, the numerical complexity is reduced by the aforementioned clustering technique, in which the parameters describing the stiffness of the respective soft tissue are sorted according to a predefined number of intervals. By doing so, the number of unknowns associated with the elastography inverse problem can be chosen explicitly. A positive side effect of this method is the reduction of artificial noise in the data (smoothing of the solution). The performance and the rate of convergence of the resulting numerical formulation are critically analyzed by numerical examples.

  10. Validity of Cognitive ability tests – comparison of computerized adaptive testing with paper and pencil and computer-based forms of administrations

    Czech Academy of Sciences Publication Activity Database

    Žitný, P.; Halama, P.; Jelínek, Martin; Květon, Petr

    2012-01-01

    Roč. 54, č. 3 (2012), s. 181-194 ISSN 0039-3320 R&D Projects: GA ČR GP406/09/P284 Institutional support: RVO:68081740 Keywords : item response theory * computerized adaptive testing * paper and pencil * computer-based * criterion and construct validity * efficiency Subject RIV: AN - Psychology Impact factor: 0.215, year: 2012

  11. Short-Term Memory in Habituation and Dishabituation

    Science.gov (United States)

    Whitlow, Jesse William, Jr.

    1975-01-01

The present research evaluated the refractory-like response decrement, as found in habituation of auditory evoked peripheral vasoconstriction in rabbits, to determine whether or not it represents a short-term habituation process distinct from effector fatigue or sensory adaptation. (Editor)

  12. Adaptive governance : Towards a stable, accountable and responsive government

    NARCIS (Netherlands)

    Janssen, M.F.W.H.A.; van der Voort, H.G.

    2016-01-01

    Organizations are expected to adapt within a short time to deal with changes that might become disruptive if not adequately dealt with. Yet many organizations are unable to adapt effectively or quickly due to the established institutional arrangements and patterns of decision-making and

  13. Comparison of Rigid and Adaptive Methods of Propagating Gross Tumor Volume Through Respiratory Phases of Four-Dimensional Computed Tomography Image Data Set

    International Nuclear Information System (INIS)

    Ezhil, Muthuveni; Choi, Bum; Starkschall, George; Bucci, M. Kara; Vedam, Sastry; Balter, Peter

    2008-01-01

Purpose: To compare three different methods of propagating the gross tumor volume (GTV) through the respiratory phases that constitute a four-dimensional computed tomography image data set. Methods and Materials: Four-dimensional computed tomography data sets of 20 patients who had undergone definitive hypofractionated radiotherapy to the lung were acquired. The GTV regions of interest (ROIs) were manually delineated on each phase of the four-dimensional computed tomography data set. The ROI from the end-expiration phase was propagated to the remaining nine phases of respiration using the following three techniques: (1) rigid image registration using in-house software, (2) rigid image registration using research software from a commercial radiotherapy planning system vendor, and (3) rigid image registration followed by deformable adaptation originally intended for organ-at-risk delineation using the same software. The internal GTVs generated from the various propagation methods were compared with the manual internal GTV using the normalized Dice similarity coefficient (DSC) index. Results: The normalized DSC index of 1.01 ± 0.06 (SD) for rigid propagation using the in-house software program was identical to the normalized DSC index of 1.01 ± 0.06 for rigid propagation achieved with the vendor's research software. Adaptive propagation yielded poorer results, with a normalized DSC index of 0.89 ± 0.10 (paired t test, p < 0.001). Conclusion: Propagation of the GTV ROIs through the respiratory phases using rigid-body registration is an acceptable method within a 1-mm margin of uncertainty. The adaptive organ-at-risk propagation method was not applicable to propagating GTV ROIs, resulting in an unacceptable reduction of the volume and distortion of the ROIs.
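The Dice similarity coefficient used above to compare propagated and manual volumes is simple to compute over voxel sets; the toy volumes below are hypothetical.

```python
def dice(a, b):
    # Dice similarity coefficient: 2|A ∩ B| / (|A| + |B|)
    a, b = set(a), set(b)
    return 2 * len(a & b) / (len(a) + len(b))

# Voxel sets of a manual contour and a propagated contour (toy 2D example)
manual = {(0, 0), (0, 1), (1, 0), (1, 1)}
propagated = {(0, 1), (1, 0), (1, 1), (2, 1)}
score = dice(manual, propagated)
```

Identical volumes give 1.0, disjoint volumes give 0.0; the overlap above scores 0.75.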

  14. Complications with computer-aided designed/computer-assisted manufactured titanium and soldered gold bars for mandibular implant-overdentures: short-term observations.

    Science.gov (United States)

    Katsoulis, Joannis; Wälchli, Julia; Kobel, Simone; Gholami, Hadi; Mericske-Stern, Regina

    2015-01-01

    Implant-overdentures supported by rigid bars provide stability in the edentulous atrophic mandible. However, fractures of solder joints and matrices, and loosening of screws and matrices, were observed with soldered gold bars (G-bars). Computer-aided designed/computer-assisted manufactured (CAD/CAM) titanium bars (Ti-bars) may reduce technical complications due to enhanced material quality. The aim was to compare the prosthetic-technical maintenance service of mandibular implant-overdentures supported by CAD/CAM Ti-bars and soldered G-bars. Edentulous patients were consecutively admitted for implant-prosthodontic treatment with a maxillary complete denture and a mandibular implant-overdenture connected to a rigid G-bar or Ti-bar. Maintenance service and problems with the implant-retention device complex and the prosthesis were recorded during a minimum of 3-4 years. Annual peri-implant crestal bone level changes (ΔBIC) were radiographically assessed. Data from 213 edentulous patients (mean age 68 ± 10 years), who had received a total of 477 tapered implants, were available. The Ti-bar and G-bar groups comprised 101 and 112 patients with 231 and 246 implants, respectively. Ti-bars mostly exhibited distal bar extensions (96%) compared to 34% of G-bars. Overdentures supported by soldered gold bars or milled CAD/CAM Ti-bars are a successful treatment modality but require regular maintenance service. These short-term observations support the hypothesis that CAD/CAM Ti-bars reduce technical complications. Fracture location indicated that the titanium thickness around the screw-access hole should be increased. © 2013 Wiley Periodicals, Inc.

  15. Short Bowel Syndrome, a Case of Intestinal Rehabilitation

    Directory of Open Access Journals (Sweden)

    Dianna Ramírez Prada

    2015-05-01

    Full Text Available Case: The objective is to present the successful multidisciplinary management of a patient with short bowel syndrome and intestinal failure who progressed to intestinal adaptation. The patient was a premature newborn with type IV intestinal atresia (multiple intestinal atresias) who developed intestinal failure and required prolonged parenteral nutritional support, multiple antibiotic regimens, prebiotics, multivitamins, and enteral nutrition with an elemental formula to achieve intestinal adaptation and, ultimately, a normal diet. The management of intestinal failure in these patients is a challenge for the health care team, as it involves not only the surgical treatment of the underlying condition but also basic nutritional support, fluid and electrolyte balance, hepatic dysfunction with associated cholestasis, infections, etc. Discussion: Short bowel syndrome with progression to intestinal failure in children is a condition whose prevalence is increasing worldwide, thanks to advances in neonatal intensive care, neonatal surgery, and nutritional support of patients with conditions such as gastroschisis, omphalocele, and necrotizing enterocolitis. Despite the limitations of our health system, it is possible to offer a multidisciplinary, integrated treatment that leads to intestinal adaptation.

  16. An adaptable Boolean net trainable to control a computing robot

    International Nuclear Information System (INIS)

    Lauria, F. E.; Prevete, R.; Milo, M.; Visco, S.

    1999-01-01

    We discuss a method to implement a Hebbian rule in a Boolean neural network so as to obtain an adaptable universal control system. We start by presenting both the Boolean neural net and the Hebbian rule we have considered. Then we discuss, first, the problems arising when the latter is naively implemented in a Boolean neural net and, second, the method allowing us to overcome them and the ensuing adaptable Boolean neural net paradigm. Next, we present the adaptable Boolean neural net as an intelligent control system, actually controlling a writing robot, and discuss how to train it in the execution of the elementary arithmetic operations on operands represented by numerals with an arbitrary number of digits.

  17. Hybrid GPU-CPU adaptive precision ray-triangle intersection tests for robust high-performance GPU dosimetry computations

    International Nuclear Information System (INIS)

    Perrotte, Lancelot; Bodin, Bruno; Chodorge, Laurent

    2011-01-01

    Before an intervention on a nuclear site, it is essential to study different scenarios to identify the least dangerous one for the operator. It is therefore mandatory to have an efficient dosimetry simulation code that produces accurate results. One classical method in radiation protection is the straight-line attenuation method with build-up factors. In the case of 3D industrial scenes composed of meshes, the computational cost lies in the fast computation of all of the intersections between the rays and the triangles of the scene. Efficient GPU algorithms have already been proposed that enable dosimetry calculation for a huge scene (800,000 rays, 800,000 triangles) in a fraction of a second. But these algorithms are not robust: because of the rounding caused by floating-point arithmetic, the numerical results of the ray-triangle intersection tests can differ from the expected mathematical results. In the worst case, this can lead to a computed dose rate dramatically lower than the real dose rate to which the operator is exposed. In this paper, we present a hybrid GPU-CPU algorithm to manage adaptive-precision floating-point arithmetic. This algorithm allows robust ray-triangle intersection tests, with very small loss of performance (less than 5% overhead), and without any need for scene-dependent tuning. (author)
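    The non-robust baseline the paper improves on is a plain floating-point ray-triangle test. A sketch of the standard Möller-Trumbore algorithm with a fixed epsilon (exactly the kind of tolerance whose rounding behavior the paper's adaptive-precision scheme addresses; the scene geometry below is hypothetical):

```python
EPS = 1e-9  # fixed tolerance; the paper's point is that this is not robust

def sub(a, b): return (a[0]-b[0], a[1]-b[1], a[2]-b[2])
def cross(a, b): return (a[1]*b[2]-a[2]*b[1], a[2]*b[0]-a[0]*b[2], a[0]*b[1]-a[1]*b[0])
def dot(a, b): return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

def ray_hits_triangle(orig, direc, v0, v1, v2):
    """Möller-Trumbore test: does the ray orig + t*direc (t >= 0) cross triangle v0v1v2?"""
    e1, e2 = sub(v1, v0), sub(v2, v0)
    p = cross(direc, e2)
    det = dot(e1, p)
    if abs(det) < EPS:           # ray (nearly) parallel to the triangle plane
        return False
    inv = 1.0 / det
    s = sub(orig, v0)
    u = dot(s, p) * inv          # first barycentric coordinate
    if u < 0.0 or u > 1.0:
        return False
    q = cross(s, e1)
    v = dot(direc, q) * inv      # second barycentric coordinate
    if v < 0.0 or u + v > 1.0:
        return False
    return dot(e2, q) * inv >= 0.0   # t >= 0: hit in front of the ray origin

# A ray along +z through the unit triangle lying in the z = 1 plane
print(ray_hits_triangle((0.2, 0.2, 0.0), (0.0, 0.0, 1.0),
                        (0.0, 0.0, 1.0), (1.0, 0.0, 1.0), (0.0, 1.0, 1.0)))  # True
```

Near-degenerate configurations (rays grazing shared triangle edges) are where this double-precision version miscounts intersections, motivating the hybrid exact-arithmetic fallback.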

  18. HAMSTRING ARCHITECTURAL AND FUNCTIONAL ADAPTATIONS FOLLOWING LONG VS. SHORT MUSCLE LENGTH ECCENTRIC TRAINING

    Directory of Open Access Journals (Sweden)

    Kenny Guex

    2016-08-01

    Full Text Available Most common preventive eccentric-based exercises, such as the Nordic hamstring exercise, do not include any hip flexion, so the elongation stress reached is lower than during the late swing phase of sprinting. The aim of this study was to assess the evolution of hamstring architectural (fascicle length and pennation angle) and functional (concentric and eccentric optimum angles and concentric and eccentric peak torques) parameters following a 3-week eccentric resistance program performed at long (LML) versus short muscle length (SML). Both groups performed eight sessions of 3-5x8 slow maximal eccentric knee extensions on an isokinetic dynamometer: the SML group at 0° and the LML group at 80° of hip flexion. Architectural parameters were measured using ultrasound imaging and functional parameters using the isokinetic dynamometer. The fascicle length increased by 4.9% (p<0.01, medium effect size) in the SML and by 9.3% (p<0.001, large effect size) in the LML group. The pennation angle did not change (p=0.83) in the SML and tended to decrease by 0.7° (p=0.09, small effect size) in the LML group. The concentric optimum angle tended to decrease by 8.8° (p=0.09, medium effect size) in the SML and by 17.3° (p<0.01, large effect size) in the LML group. The eccentric optimum angle did not change (p=0.19, small effect size) in the SML and tended to decrease by 10.7° (p=0.06, medium effect size) in the LML group. The concentric peak torque did not change in the SML (p=0.37) and the LML (p=0.23) groups, whereas eccentric peak torque increased by 12.9% (p<0.01, small effect size) and 17.9% (p<0.001, small effect size) in the SML and the LML group, respectively. No group-by-time interaction was found for any parameter. A correlation was found between the training-induced change in fascicle length and the change in concentric optimum angle (r=-0.57, p<0.01). These results suggest that performing eccentric exercises leads to several architectural and functional adaptations. However

  19. Exposure Control Using Adaptive Multi-Stage Item Bundles.

    Science.gov (United States)

    Luecht, Richard M.

    This paper presents a multistage adaptive testing test development paradigm that promises to handle content balancing and other test development needs, psychometric reliability concerns, and item exposure. The bundled multistage adaptive testing (BMAT) framework is a modification of the computer-adaptive sequential testing framework introduced by…

  20. Cas4 Facilitates PAM-Compatible Spacer Selection during CRISPR Adaptation

    OpenAIRE

    Sebastian N. Kieper; Cristóbal Almendros; Juliane Behler; Rebecca E. McKenzie; Franklin L. Nobrega; Anna C. Haagsma; Jochem N.A. Vink; Wolfgang R. Hess; Stan J.J. Brouns

    2018-01-01

    Summary: CRISPR-Cas systems adapt their immunological memory against their invaders by integrating short DNA fragments into clustered regularly interspaced short palindromic repeat (CRISPR) loci. While Cas1 and Cas2 make up the core machinery of the CRISPR integration process, various class I and II CRISPR-Cas systems encode Cas4 proteins for which the role is unknown. Here, we introduced the CRISPR adaptation genes cas1, cas2, and cas4 from the type I-D CRISPR-Cas system of Synechocystis sp....

  1. A dynamically adaptive wavelet approach to stochastic computations based on polynomial chaos - capturing all scales of random modes on independent grids

    International Nuclear Information System (INIS)

    Ren Xiaoan; Wu Wenquan; Xanthis, Leonidas S.

    2011-01-01

    Highlights: → New approach for stochastic computations based on polynomial chaos. → Development of dynamically adaptive wavelet multiscale solver using space refinement. → Accurate capture of steep gradients and multiscale features in stochastic problems. → All scales of each random mode are captured on independent grids. → Numerical examples demonstrate the need for different space resolutions per mode. - Abstract: In stochastic computations, or uncertainty quantification methods, the spectral approach based on the polynomial chaos expansion in random space leads to a coupled system of deterministic equations for the coefficients of the expansion. The size of this system increases drastically when the number of independent random variables and/or order of polynomial chaos expansions increases. This is invariably the case for large scale simulations and/or problems involving steep gradients and other multiscale features; such features are variously reflected on each solution component or random/uncertainty mode requiring the development of adaptive methods for their accurate resolution. In this paper we propose a new approach for treating such problems based on a dynamically adaptive wavelet methodology involving space-refinement on physical space that allows all scales of each solution component to be refined independently of the rest. We exemplify this using the convection-diffusion model with random input data and present three numerical examples demonstrating the salient features of the proposed method. Thus we establish a new, elegant and flexible approach for stochastic problems with steep gradients and multiscale features based on polynomial chaos expansions.

  2. Short-Term Wind Power Forecasting Based on Clustering Pre-Calculated CFD Method

    Directory of Open Access Journals (Sweden)

    Yimei Wang

    2018-04-01

    Full Text Available To meet the increasing wind power forecasting (WPF) demands of newly built wind farms without historical data, physical WPF methods are widely used. The computational fluid dynamics (CFD) pre-calculated flow fields (CPFF)-based WPF is a promising physical approach, which can balance well the competing demands of computational efficiency and accuracy. To enhance its adaptability for wind farms in complex terrain, a WPF method combining wind turbine clustering with CPFF is first proposed, where the wind turbines in the wind farm are clustered and a forecast is undertaken for each cluster. K-means, hierarchical agglomerative and spectral analysis methods are used to establish the wind turbine clustering models. The Silhouette Coefficient, Calinski-Harabasz index and within-between index are proposed as criteria to evaluate the effectiveness of the established clustering models. Based on different clustering methods and schemes, various clustering databases are built for clustering pre-calculated CFD (CPCC)-based short-term WPF. For the wind farm case studied, clustering evaluation criteria show that hierarchical agglomerative clustering has reasonable results, spectral clustering is better and K-means gives the best performance. The WPF results produced by different clustering databases also prove the effectiveness of the three evaluation criteria in turn. The newly developed CPCC model has a much higher WPF accuracy than the CPFF model without using clustering techniques, both on temporal and spatial scales. The research provides support for both the development and improvement of short-term physical WPF systems.
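    One of the clustering criteria named above, the Silhouette Coefficient, can be sketched in a few lines. This is a generic textbook implementation, not the paper's code, and the 2-D turbine coordinates are hypothetical:

```python
from math import dist

def silhouette(points, labels):
    """Mean silhouette coefficient: (b - a) / max(a, b) averaged over points,
    where a = mean intra-cluster distance and b = mean distance to the
    nearest other cluster. Values near +1 indicate well-separated clusters."""
    clusters = set(labels)
    scores = []
    for i, p in enumerate(points):
        own = [dist(p, q) for j, q in enumerate(points)
               if labels[j] == labels[i] and j != i]
        if not own:                  # singleton cluster: silhouette defined as 0
            scores.append(0.0)
            continue
        a = sum(own) / len(own)
        b = min(sum(dist(p, q) for j, q in enumerate(points) if labels[j] == c)
                / labels.count(c)
                for c in clusters if c != labels[i])
        scores.append((b - a) / max(a, b))
    return sum(scores) / len(scores)

# Two well-separated turbine "clusters" score close to +1
pts = [(0, 0), (0, 1), (10, 0), (10, 1)]
print(round(silhouette(pts, [0, 0, 1, 1]), 3))
```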

  3. Short-term electricity demand and gas price forecasts using wavelet transforms and adaptive models

    International Nuclear Information System (INIS)

    Nguyen, Hang T.; Nabney, Ian T.

    2010-01-01

    This paper presents some forecasting techniques for energy demand and price prediction, one day ahead. These techniques combine wavelet transform (WT) with fixed and adaptive machine learning/time series models (multi-layer perceptron (MLP), radial basis functions, linear regression, or GARCH). To create an adaptive model, we use an extended Kalman filter or particle filter to update the parameters continuously on the test set. The adaptive GARCH model is a new contribution, broadening the applicability of GARCH methods. We empirically compared two approaches of combining the WT with prediction models: multicomponent forecasts and direct forecasts. These techniques are applied to large sets of real data (both stationary and non-stationary) from the UK energy markets, so as to provide comparative results that are statistically stronger than those previously reported. The results showed that the forecasting accuracy is significantly improved by using the WT and adaptive models. The best models on the electricity demand/gas price forecast are the adaptive MLP/GARCH with the multicomponent forecast; their NMSEs are 0.02314 and 0.15384 respectively. (author)
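    The multicomponent strategy compared in this abstract (decompose the series, forecast each component, recombine) can be illustrated with a one-level Haar transform and a naive persistence forecast per component. This is a deliberately simplified stand-in for the paper's MLP/GARCH component models; the demand numbers are hypothetical:

```python
def haar_level1(x):
    """One-level Haar transform of an even-length series:
    pairwise averages (approximation) and half-differences (detail)."""
    approx = [(x[i] + x[i + 1]) / 2 for i in range(0, len(x), 2)]
    detail = [(x[i] - x[i + 1]) / 2 for i in range(0, len(x), 2)]
    return approx, detail

def inverse_haar_level1(approx, detail):
    """Exact inverse of haar_level1."""
    x = []
    for a, d in zip(approx, detail):
        x += [a + d, a - d]
    return x

# Multicomponent forecast with per-component persistence (illustrative only):
demand = [30.0, 32.0, 31.0, 35.0, 34.0, 38.0]
a, d = haar_level1(demand)
a_next, d_next = a[-1], d[-1]            # naive forecast of each component
forecast_pair = inverse_haar_level1([a_next], [d_next])  # next two time steps
print(forecast_pair)
```

In the paper, each wavelet component would instead be fed to a fixed or adaptively updated model (MLP, RBF, linear regression, or GARCH) before recombination.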

  4. Adaptive Blending of Model and Observations for Automated Short-Range Forecasting: Examples from the Vancouver 2010 Olympic and Paralympic Winter Games

    Science.gov (United States)

    Bailey, Monika E.; Isaac, George A.; Gultepe, Ismail; Heckman, Ivan; Reid, Janti

    2014-01-01

    An automated short-range forecasting system, adaptive blending of observations and model (ABOM), was tested in real time during the 2010 Vancouver Olympic and Paralympic Winter Games in British Columbia. Data at 1-min time resolution were available from a newly established, dense network of surface observation stations. Climatological data were not available at these new stations. This, combined with output from new high-resolution numerical models, provided a unique and exciting setting to test nowcasting systems in mountainous terrain during winter weather conditions. The ABOM method blends extrapolations in time of recent local observations with numerical weather prediction (NWP) model forecasts to generate short-range point forecasts of surface variables out to 6 h. The relative weights of the model forecast and the observation extrapolation are based on performance over recent history. The average performance of ABOM nowcasts during February and March 2010 was evaluated using standard scores and thresholds important for Olympic events. Significant improvements over the model forecasts alone were obtained for continuous variables such as temperature, relative humidity and wind speed. The small improvements to forecasts of variables such as visibility and ceiling, subject to discontinuous changes, are attributed to the persistence component of ABOM.
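    The core idea of weighting two forecast sources by recent performance can be sketched as follows. The inverse-mean-absolute-error weighting below is one plausible rule, an assumption for illustration; the abstract does not specify ABOM's actual weighting formula, and all numbers are hypothetical:

```python
def blend(model_fc, extrap_fc, model_errs, extrap_errs, floor=1e-6):
    """Blend a model forecast and an observation extrapolation with weights
    inversely proportional to each source's recent mean absolute error
    (one plausible ABOM-style rule; the floor avoids division by zero)."""
    mae_m = max(sum(abs(e) for e in model_errs) / len(model_errs), floor)
    mae_x = max(sum(abs(e) for e in extrap_errs) / len(extrap_errs), floor)
    w_m, w_x = 1.0 / mae_m, 1.0 / mae_x
    return (w_m * model_fc + w_x * extrap_fc) / (w_m + w_x)

# Extrapolation has been twice as accurate recently, so it gets 2/3 of the weight
print(blend(model_fc=-4.0, extrap_fc=-1.0,
            model_errs=[2.0, 2.0], extrap_errs=[1.0, 1.0]))  # -2.0
```

As the lead time grows, a real system would shift weight back toward the NWP model, since observation extrapolations decay in skill beyond the first few hours.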

  5. Development of an item bank for the EORTC Role Functioning Computer Adaptive Test (EORTC RF-CAT)

    DEFF Research Database (Denmark)

    Gamper, Eva-Maria; Petersen, Morten Aa.; Aaronson, Neil

    2016-01-01

    a computer-adaptive test (CAT) for RF. This was part of a larger project whose objective is to develop a CAT version of the EORTC QLQ-C30 which is one of the most widely used HRQOL instruments in oncology. METHODS: In accordance with EORTC guidelines, the development of the RF-CAT comprised four phases...... with good psychometric properties. The resulting item bank exhibits excellent reliability (mean reliability = 0.85, median = 0.95). Using the RF-CAT may allow sample size savings from 11 % up to 50 % compared to using the QLQ-C30 RF scale. CONCLUSIONS: The RF-CAT item bank improves the precision...

  6. Water System Adaptation to Hydrological Changes: Module 10, Basic Principles of Incorporating Adaptation Science into Hydrologic Planning and Design

    Science.gov (United States)

    This course will introduce students to the fundamental principles of water system adaptation to hydrological changes, with emphasis on data analysis and interpretation, technical planning, and computational modeling. Starting with real-world scenarios and adaptation needs, the co...

  7. Water System Adaptation To Hydrological Changes: Module 14, Life Cycle Analysis (LCA) and Prioritization Tools in Water System Adaptation

    Science.gov (United States)

    This course will introduce students to the fundamental principles of water system adaptation to hydrological changes, with emphasis on data analysis and interpretation, technical planning, and computational modeling. Starting with real-world scenarios and adaptation needs, the co...

  8. An initial investigation on developing a new method to predict short-term breast cancer risk based on deep learning technology

    Science.gov (United States)

    Qiu, Yuchen; Wang, Yunzhi; Yan, Shiju; Tan, Maxine; Cheng, Samuel; Liu, Hong; Zheng, Bin

    2016-03-01

    In order to establish a new personalized breast cancer screening paradigm, it is critically important to accurately predict the short-term risk of a woman having image-detectable cancer after a negative mammographic screening. In this study, we developed and tested a novel short-term risk assessment model based on a deep learning method. During the experiment, a set of 270 "prior" negative screening cases was assembled. In the next sequential ("current") screening mammography, 135 cases were positive and 135 cases remained negative. These cases were randomly divided into a training set with 200 cases and a testing set with 70 cases. A deep learning based computer-aided diagnosis (CAD) scheme was then developed for the risk assessment, which consists of two modules: an adaptive feature identification module and a risk prediction module. The adaptive feature identification module is composed of three pairs of convolution-max-pooling layers, which contain 20, 10, and 5 feature maps, respectively. The risk prediction module is implemented by a multilayer perceptron (MLP) classifier, which produces a risk score to predict the likelihood of the woman developing short-term mammography-detectable cancer. The results show that the new CAD-based risk model yielded a positive predictive value of 69.2% and a negative predictive value of 74.2%, with a total prediction accuracy of 71.4%. This study demonstrated that applying a new deep learning technology may have significant potential to develop a new short-term risk prediction scheme with improved performance in detecting early abnormal signs in negative mammograms.
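    The reported figures follow directly from a 2x2 confusion matrix. As a sketch, the counts below are an assumption (back-calculated so that a 70-case test set reproduces the abstract's 69.2% PPV, 74.2% NPV, and 71.4% accuracy); the paper does not report the individual counts:

```python
def screening_metrics(tp, fp, tn, fn):
    """Positive/negative predictive value and accuracy from confusion counts."""
    ppv = tp / (tp + fp)           # P(cancer | predicted high risk)
    npv = tn / (tn + fn)           # P(no cancer | predicted low risk)
    acc = (tp + tn) / (tp + fp + tn + fn)
    return ppv, npv, acc

# Hypothetical counts consistent with a 70-case test split
ppv, npv, acc = screening_metrics(tp=27, fp=12, tn=23, fn=8)
print(round(ppv, 3), round(npv, 3), round(acc, 3))  # 0.692 0.742 0.714
```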

  9. Short-term adaptations following Complex Training in team-sports: A meta-analysis.

    Science.gov (United States)

    Freitas, Tomás T; Martinez-Rodriguez, Alejandro; Calleja-González, Julio; Alcaraz, Pedro E

    2017-01-01

    The purpose of this meta-analysis was to study the short-term adaptations in sprint and vertical jump (VJ) performance following Complex Training (CT) in team-sports. CT is a resistance training method aimed at developing both strength and power, which have a direct effect on sprint and VJ. It consists of alternating heavy resistance training exercises with plyometric/power ones, set for set, in the same workout. A search of electronic databases up to July 2016 (PubMed-MEDLINE, SPORTDiscus, Web of Knowledge) was conducted. Inclusion criteria: 1) at least one CT intervention group; 2) training protocols ≥4-wks; 3) sample of team-sport players; 4) sprint or VJ as an outcome variable. Effect sizes (ES) of each intervention were calculated and subgroup analyses were performed. A total of 9 studies (13 CT groups) met the inclusion criteria. Medium effect sizes (ES = 0.73) were obtained for pre-post improvements in sprint, and small (ES = 0.41) in VJ, following CT. Experimental groups presented better post-intervention sprint (ES = 1.01) and VJ (ES = 0.63) performance than control groups. Large ESs were exhibited in younger athletes (ES = 0.74) and in programs with >12 total sessions (ES = 0.81). Medium ESs were obtained for under-Division I individuals (ES = 0.56), protocols with intra-complex rest intervals ≥2 min (ES = 0.55), conditioning activities with intensities ≤85% 1RM (ES = 0.64), and basketball/volleyball players (ES = 0.55). Small ESs were found for younger athletes (ES = 0.42) and interventions ≥6 weeks (ES = 0.45). CT interventions have positive medium effects on sprint performance and small effects on VJ in team-sport athletes. This training method is a suitable option to include in season planning.
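    The subgroup values above are standardized mean differences. A sketch of the usual Cohen's d with a pooled standard deviation; the sprint times are hypothetical and this is the generic formula, not necessarily the exact estimator the meta-analysis used:

```python
from math import sqrt

def cohens_d(pre, post):
    """Cohen's d for independent pre/post samples using the pooled SD."""
    def mean(x):
        return sum(x) / len(x)
    def var(x):                          # unbiased sample variance
        m = mean(x)
        return sum((v - m) ** 2 for v in x) / (len(x) - 1)
    n1, n2 = len(pre), len(post)
    pooled = sqrt(((n1 - 1) * var(pre) + (n2 - 1) * var(post)) / (n1 + n2 - 2))
    return (mean(post) - mean(pre)) / pooled

# Hypothetical 20-m sprint times (s): faster post-training gives a negative d,
# i.e. a large effect in the "improvement" direction
pre  = [3.10, 3.05, 3.20, 3.15]
post = [3.00, 2.95, 3.10, 3.05]
print(round(cohens_d(pre, post), 2))
```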

  10. Sauna exposure immediately prior to short-term heat acclimation accelerates phenotypic adaptation in females.

    Science.gov (United States)

    Mee, Jessica A; Peters, Sophie; Doust, Jonathan H; Maxwell, Neil S

    2018-02-01

    Investigate whether a sauna exposure prior to short-term heat acclimation (HA) accelerates phenotypic adaptation in females. Randomised, repeated-measures, cross-over trial. Nine females performed two 5-d HA interventions (controlled hyperthermia, Tre ≥ 38.5°C), separated by 7 wk, during the follicular phase of the menstrual cycle confirmed by plasma concentrations of 17-β estradiol and progesterone. Prior to each 90-min HA session participants sat for 20 min in either a temperate environment (20°C, 40% RH; HAtemp) wearing shorts and sports bra or a hot environment (50°C, 30% RH) wearing a sauna suit to replicate sauna conditions (HAsauna). Participants performed a running heat tolerance test (RHTT) 24 h pre and 24 h post HA. Mean heart rate (HR) (85±4 vs. 68±5 bpm, p≤0.001), sweat rate (0.4±0.2 vs. 0.0±0.0 L·h-1, p≤0.001), and thermal sensation (6±0 vs. 5±1, p=0.050) were higher during the sauna compared to the temperate exposure. Resting rectal temperature (Tre) (-0.28±0.16°C), peak Tre (-0.42±0.22°C), resting HR (-10±4 bpm), peak HR (-12±7 bpm), Tre at sweating onset (-0.29±0.17°C) (p≤0.001), thermal sensation (-0.5±0.5; p=0.002), and perceived exertion (-3±2; p≤0.001) were reduced during the RHTT following HAsauna, but not HAtemp. Plasma volume expansion was greater following HAsauna (HAsauna, 9±7%; HAtemp, 1±5%; p=0.013). Sweat rate (p≤0.001) increased and sweat NaCl (p=0.006) reduced during the RHTT following both HAsauna and HAtemp. This novel strategy initiated HA with an attenuation of thermoregulatory, cardiovascular, and perceptual strain in females, due to a measurably greater strain in the sauna compared to the temperate exposure when adopted prior to STHA. Copyright © 2017 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.

  11. Bacterial computing: a form of natural computing and its applications

    Directory of Open Access Journals (Sweden)

    Rafael Lahoz-Beltra

    2014-03-01

    Full Text Available The capability to establish adaptive relationships with the environment is an essential characteristic of living cells. Bacterial computing and bacterial intelligence are two general traits manifested in adaptive behaviors that respond to surrounding environmental conditions. These two traits have generated a variety of theoretical and applied approaches. As the different systems of bacterial signaling and the different ways of genetic change become better known and more carefully explored, the whole range of adaptive possibilities of bacteria may be studied from new angles. For instance, there appear to be instances of molecular learning within the mechanisms of evolution. More concretely, and looking specifically at the time dimension, the bacterial mechanisms of learning and evolution appear as two different and related mechanisms for adaptation to the environment: the former in somatic time and the latter in evolutionary time. In the present chapter we review the possible application of both kinds of mechanisms to prokaryotic molecular computing schemes, as well as to the solution of real-world problems.

  12. Adaptive θ-methods for pricing American options

    Science.gov (United States)

    Khaliq, Abdul Q. M.; Voss, David A.; Kazmi, Kamran

    2008-12-01

    We develop adaptive θ-methods for solving the Black-Scholes PDE for American options. By adding a small, continuous term, the Black-Scholes PDE becomes an advection-diffusion-reaction equation on a fixed spatial domain. Standard implementation of θ-methods would require a Newton-type iterative procedure at each time step, thereby increasing the computational complexity of the methods. Our linearly implicit approach avoids such complications. We establish a general framework under which θ-methods satisfy a discrete version of the positivity constraint characteristic of American options, and numerically demonstrate the sensitivity of the constraint. The positivity results are established for the single-asset and independent two-asset models. In addition, we have incorporated and analyzed an adaptive time-step control strategy to increase the computational efficiency. Numerical experiments are presented for one- and two-asset American options, using adaptive exponential splitting for two-asset problems. The approach is compared with an iterative solution of the two-asset problem in terms of computational efficiency.
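    The θ-method family interpolates between explicit Euler (θ=0), Crank-Nicolson (θ=1/2), and implicit Euler (θ=1). As a minimal sketch, not the paper's Black-Scholes solver, one θ-step for a plain 1-D diffusion equation with homogeneous Dirichlet boundaries, solved with the Thomas (tridiagonal) algorithm:

```python
def theta_step(u, dt, dx, nu, theta):
    """One theta-method step for u_t = nu * u_xx with u = 0 at both ends.
    theta=0: explicit Euler; theta=1: implicit Euler; theta=0.5: Crank-Nicolson."""
    r = nu * dt / dx ** 2
    n = len(u)
    # Right-hand side: explicit part (I + (1-theta) r A) u
    rhs = u[:]
    for i in range(1, n - 1):
        rhs[i] = u[i] + (1 - theta) * r * (u[i - 1] - 2 * u[i] + u[i + 1])
    # Solve (I - theta r A) u_new = rhs via the Thomas algorithm
    a = [-theta * r] * n            # sub-diagonal
    b = [1 + 2 * theta * r] * n     # main diagonal
    c = [-theta * r] * n            # super-diagonal
    b[0] = b[-1] = 1.0              # Dirichlet boundary rows: u_new = u = 0
    c[0] = a[-1] = 0.0
    for i in range(1, n):           # forward elimination
        m = a[i] / b[i - 1]
        b[i] -= m * c[i - 1]
        rhs[i] -= m * rhs[i - 1]
    out = [0.0] * n                 # back substitution
    out[-1] = rhs[-1] / b[-1]
    for i in range(n - 2, -1, -1):
        out[i] = (rhs[i] - c[i] * out[i + 1]) / b[i]
    return out

u = [0.0, 0.0, 1.0, 0.0, 0.0]                         # initial "spike"
u = theta_step(u, dt=0.1, dx=1.0, nu=1.0, theta=0.5)  # one Crank-Nicolson step
print(u)
```

Because the system is linear, each step is a single tridiagonal solve, which is the kind of linearly implicit update the abstract contrasts with a Newton iteration; the paper's positivity analysis concerns when such steps keep the solution nonnegative.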

  13. Effect of Preparation Depth on the Marginal and Internal Adaptation of Computer-aided Design/Computer-assisted Manufacture Endocrowns.

    Science.gov (United States)

    Gaintantzopoulou, M D; El-Damanhoury, H M

    The aim of the study was to evaluate the effect of preparation depth and intraradicular extension on the marginal and internal adaptation of computer-aided design/computer-assisted manufacture (CAD/CAM) endocrown restorations. Standardized preparations were made in resin endodontic tooth models (Nissin Dental), with an intracoronal preparation depth of 2 mm (group H2), or with an extra 1-mm (group H3) or 2-mm (group H4) intraradicular extension in the root canals (n=12). Vita Enamic polymer-infiltrated ceramic-network material endocrowns were fabricated using the CEREC AC CAD/CAM system and were seated on the prepared teeth. Specimens were evaluated by microtomography. Horizontal and vertical tomographic sections were recorded and reconstructed by using the CTSkan software (TView v1.1, Skyscan). The surface/void volume (S/V) in the region of interest was calculated. Marginal gap (MG), absolute marginal discrepancy (MD), and internal marginal gap were measured at various measuring locations and calculated in microscale (μm). Marginal and internal discrepancy data (μm) were analyzed with nonparametric Kruskal-Wallis analysis of variance by ranks with Dunn's post hoc test, whereas S/V data were analyzed by one-way analysis of variance and Bonferroni multiple comparisons (α=0.05). Significant differences were found in MG, MD, and internal gap width values between the groups, with H2 showing the lowest values of all groups. S/V calculations presented significant differences between H2 and the other two groups (H3 and H4) tested, with H2 again showing the lowest values. Increasing the intraradicular extension of endocrown restorations increased their marginal and internal gaps.

  14. Apo AIV and Citrulline Plasma Concentrations in Short Bowel Syndrome Patients: The Influence of Short Bowel Anatomy

    Science.gov (United States)

    Targarona, Jordi; Ruiz, Jorge; García, Natalia; Oró, Denise; García-Villoria, Judit; Creus, Gloria; Pita, Ana M.

    2016-01-01

    Introduction: Parenteral nutrition (PN) dependence in short bowel syndrome (SBS) patients is linked to the functionality of the remnant small bowel (RSB). Patients may wean off PN following a period of intestinal adaptation that restores this functionality. Currently, plasma citrulline is the standard biomarker for monitoring intestinal functionality and adaptation. However, available studies reveal that the relationship of the biomarker with the length and function of the RSB is arguable. Thus, having additional biomarkers would improve the timing of PN weaning. Aim: By measuring concomitant changes in citrulline and the novel biomarker apolipoprotein AIV (Apo AIV), as well as taking into account the anatomy of the RSB, this exploratory study aims at a better understanding of the intestinal adaptation process and characterization of SBS patients under PN. Methods: Thirty-four adult SBS patients were selected and assigned to adapted (aSBS) and non-adapted (nSBS) groups after reconstructive surgeries. Remaining jejunum and ileum lengths were recorded. The aSBS patients were either on an oral diet (ORAL group), those with intestinal insufficiency, or on oral and home parenteral nutrition (HPN group), those with chronic intestinal failure. Apo AIV and citrulline were analyzed in plasma samples after overnight fasting. An exploratory ROC analysis using citrulline as gold standard was performed. Results: Both biomarkers, Apo AIV and citrulline, showed a significant correlation with remnant small bowel length (RSBL) in aSBS patients. In jejuno-ileocolic patients, only Apo AIV correlated with RSBL (rb = 0.54) and with ileum length (rb = 0.84). In patients without ileum neither biomarker showed any correlation with RSBL. ROC analysis indicated the Apo AIV cut-off value to be 4.6 mg/100 mL for differentiating between the aSBS HPN and ORAL groups. Conclusions: Therefore, in addition to citrulline, Apo AIV can be set as a biomarker to monitor intestinal adaptation in SBS patients. As short bowel anatomy is shown

  15. Spontaneous recovery of effects of contrast adaptation without awareness

    Directory of Open Access Journals (Sweden)

    Gaoxing Mei

    2015-09-01

    Full Text Available Prolonged exposure to a high-contrast stimulus reduces the neural sensitivity to subsequent similar patterns. Recent work has disclosed that contrast adaptation is controlled by multiple mechanisms operating over differing timescales. Adaptation to high contrast for a relatively long period can be rapidly eliminated by adaptation to a lower contrast (or, in the present study, a mean field). Such rapid deadaptation presumably causes a short-term mechanism to signal for a sensitivity increase, cancelling ongoing signals from long-term mechanisms. Once deadaptation ends, the short-term mechanism rapidly returns to baseline, and the slowly decaying effects in the long-term mechanisms reemerge, allowing the perceptual aftereffects to recover during continued testing. Although this spontaneous recovery effect is considered strong evidence supporting the multiple-mechanisms theory, it remains controversial whether the effect is mainly driven by visual memory established during the initial longer-term adaptation period. To resolve this debate, we used modified Continuous Flash Suppression (CFS) and visual crowding paradigms to render the adapting stimuli invisible, but still observed the spontaneous recovery phenomenon. These results exclude the possibility that spontaneous recovery found in the previous work was merely the consequence of explicit visual memory. Our findings also demonstrate that contrast adaptation, even at unconscious processing levels, is controlled by multiple mechanisms.

  16. Initial phantom study comparing image quality in computed tomography using adaptive statistical iterative reconstruction and new adaptive statistical iterative reconstruction v.

    Science.gov (United States)

    Lim, Kyungjae; Kwon, Heejin; Cho, Jinhan; Oh, Jongyoung; Yoon, Seongkuk; Kang, Myungjin; Ha, Dongho; Lee, Jinhwa; Kang, Eunju

    2015-01-01

    The purpose of this study was to assess the image quality of a novel advanced iterative reconstruction (IR) method called "adaptive statistical IR V" (ASIR-V) by comparing its image noise, contrast-to-noise ratio (CNR), and spatial resolution with those of filtered back projection (FBP) and adaptive statistical IR (ASIR) on computed tomography (CT) phantom images. We performed CT scans at 5 different tube currents (50, 70, 100, 150, and 200 mA) using 3 types of CT phantoms. Scanned images were subsequently reconstructed with 7 different settings: FBP and 3 levels each of ASIR and ASIR-V (30%, 50%, and 70%). The image noise was measured in the first study using a body phantom. The CNR was measured in the second study using a contrast phantom, and the spatial resolution was measured in the third study using a high-resolution phantom. We compared the image noise, CNR, and spatial resolution among the 7 reconstruction settings to determine whether noise reduction, high CNR, and high spatial resolution could be achieved with ASIR-V. Quantitative analysis of the first and second studies showed that the images reconstructed using ASIR-V had reduced image noise and improved CNR compared with those of FBP and ASIR (P < 0.05), and in the third study ASIR-V showed significantly improved spatial resolution compared with FBP and ASIR (P < 0.05). ASIR-V thus provides a significant reduction in image noise and a significant improvement in CNR as well as spatial resolution. Therefore, this technique has the potential to reduce the radiation dose further without compromising image quality.
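The CNR compared across reconstruction settings is typically computed from region-of-interest (ROI) statistics. A small sketch using the standard definition CNR = |mean_ROI - mean_background| / SD_background; the Hounsfield-unit samples below are made up for illustration:

```python
import statistics

def cnr(roi, background):
    """Contrast-to-noise ratio: absolute difference between the mean ROI
    and mean background attenuation, divided by the background noise
    (population standard deviation)."""
    return (abs(statistics.mean(roi) - statistics.mean(background))
            / statistics.pstdev(background))

# Hypothetical HU samples from a contrast insert and the background
roi = [110, 112, 108, 111, 109]
bg  = [50, 54, 46, 52, 48]
print(round(cnr(roi, bg), 2))  # 21.21
```

A reconstruction that lowers the background standard deviation at unchanged contrast (as IR methods aim to) directly raises this ratio.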

  17. Representing adaptive and adaptable Units of Learning. How to model personalized eLearning in IMS Learning Design

    NARCIS (Netherlands)

    Burgos, Daniel; Tattersall, Colin; Koper, Rob

    2006-01-01

    Burgos, D., Tattersall, C., & Koper, E. J. R. (2007). Representing adaptive and adaptable Units of Learning. How to model personalized eLearning in IMS Learning Design. In B. Fernández Manjon, J. M. Sanchez Perez, J. A. Gómez Pulido, M. A. Vega Rodriguez & J. Bravo (Eds.), Computers and Education:

  18. Modeling Adaptation as a Flow and Stock Decision with Mitigation

    Science.gov (United States)

    Mitigation and adaptation are the two key responses available to policymakers to reduce the risks of climate change. We model these two policies together in a new DICE-based integrated assessment model that characterizes adaptation as either short-lived flow spending or long-live...

  19. Modeling Adaptation as a Flow and Stock Decision with Mitigation

    Science.gov (United States)

    Mitigation and adaptation are the two key responses available to policymakers to reduce the risks of climate change. We model these two policies together in a new DICE-based integrated assessment model that characterizes adaptation as either short-lived flow spending or long-liv...

  20. Randomised controlled trial of colostrum to improve intestinal function in patients with short bowel syndrome

    DEFF Research Database (Denmark)

    Lund, Pernille; Sangild, Per Torp; Aunsholt, L.

    2012-01-01

    Colostrum is rich in immunoregulatory, antimicrobial and trophic components supporting intestinal development and function in newborns. We assessed whether bovine colostrum could enhance intestinal adaptation and function in adult short bowel syndrome (SBS) patients.

  1. Adaptive vibrational configuration interaction (A-VCI): A posteriori error estimation to efficiently compute anharmonic IR spectra

    Science.gov (United States)

    Garnier, Romain; Odunlami, Marc; Le Bris, Vincent; Bégué, Didier; Baraille, Isabelle; Coulaud, Olivier

    2016-05-01

    A new variational algorithm called adaptive vibrational configuration interaction (A-VCI), intended for the resolution of the vibrational Schrödinger equation, was developed. The main advantage of this approach is to efficiently reduce the dimension of the active space generated in the configuration interaction (CI) process. Here, we assume that the Hamiltonian is written as a sum of products of operators. This adaptive algorithm was developed with the use of three correlated conditions, i.e., a suitable starting space, a criterion for convergence, and a procedure to expand the approximate space. The speed of the algorithm was increased by using an a posteriori error estimator (residue) to select the most relevant directions in which to expand the space. Two examples were selected for benchmark. In the case of H2CO, we mainly study the performance of the A-VCI algorithm: comparison with the variation-perturbation method, choice of the initial space, and residual contributions. For CH3CN, we compare the A-VCI results with a computed reference spectrum using the same potential energy surface and an active space reduced by about 90%.
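The expand-by-largest-residual loop at the heart of such adaptive CI schemes can be illustrated on a toy symmetric eigenproblem. The matrix, shift and tolerance below are arbitrary stand-ins for a vibrational Hamiltonian, and the pure-Python power-iteration eigensolver is only for illustration:

```python
# Toy sketch of the A-VCI idea: grow an active space for the lowest
# eigenpair of a symmetric matrix H, at each step adding the basis
# index with the largest residual |(Hx - Ex)_i| outside the space.

def lowest_eigpair(H, idx, iters=500):
    """Lowest eigenpair of H restricted to rows/cols in `idx`,
    via power iteration on the shifted matrix c*I - H."""
    n = len(idx)
    sub = [[H[i][j] for j in idx] for i in idx]
    c = max(sum(abs(v) for v in row) for row in sub) + 1.0  # Gershgorin bound
    x = [1.0 / n] * n
    for _ in range(iters):
        y = [c * x[i] - sum(sub[i][j] * x[j] for j in range(n)) for i in range(n)]
        norm = sum(v * v for v in y) ** 0.5
        x = [v / norm for v in y]
    energy = sum(x[i] * sum(sub[i][j] * x[j] for j in range(n)) for i in range(n))
    return energy, x

def avci(H, tol=1e-8):
    idx = [0]                                   # suitable starting space
    while True:
        E, x = lowest_eigpair(H, idx)
        full = [0.0] * len(H)
        for k, i in enumerate(idx):
            full[i] = x[k]
        # a posteriori error estimator: residual over the *full* space
        r = [sum(H[i][j] * full[j] for j in range(len(H))) - E * full[i]
             for i in range(len(H))]
        outside = [(abs(r[i]), i) for i in range(len(H)) if i not in idx]
        worst, i_new = max(outside)
        if worst < tol:                         # convergence criterion
            return E, sorted(idx)
        idx.append(i_new)                       # expand along worst residual

H = [[1.0, 0.2, 0.0, 0.0],
     [0.2, 2.0, 0.1, 0.0],
     [0.0, 0.1, 3.0, 0.0],
     [0.0, 0.0, 0.0, 4.0]]
E, active = avci(H)
print(round(E, 4), active)   # the uncoupled basis index 3 is never added
```

The payoff mirrors the abstract's 90% reduction: the converged eigenvalue is obtained without ever including basis functions whose residual contribution is negligible.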

  2. Efficient computation in adaptive artificial spiking neural networks

    NARCIS (Netherlands)

    D. Zambrano (Davide); R.B.P. Nusselder (Roeland); H.S. Scholte; S.M. Bohte (Sander)

    2017-01-01

    Artificial Neural Networks (ANNs) are bio-inspired models of neural computation that have proven highly effective. Still, ANNs lack a natural notion of time, and neural units in ANNs exchange analog values in a frame-based manner, a computationally and energetically inefficient form of

  3. A Comprehensive Review on Adaptability of Network Forensics Frameworks for Mobile Cloud Computing

    Science.gov (United States)

    Abdul Wahab, Ainuddin Wahid; Han, Qi; Bin Abdul Rahman, Zulkanain

    2014-01-01

    Network forensics enables investigation and identification of network attacks through the retrieved digital content. The proliferation of smartphones and the cost-effective universal data access through the cloud have made Mobile Cloud Computing (MCC) a congenial target for network attacks. However, carrying out forensics in MCC is constrained by the autonomous cloud hosting companies and their policies of restricted access to the digital content in the back-end cloud platforms. This implies that existing Network Forensic Frameworks (NFFs) have limited impact in the MCC paradigm. To this end, we qualitatively analyze the adaptability of existing NFFs when applied to MCC. Explicitly, the fundamental mechanisms of NFFs are highlighted and then analyzed using the most relevant parameters. A classification is proposed to help understand the anatomy of existing NFFs. Subsequently, a comparison is given that explores the functional similarities and deviations among NFFs. The paper concludes by discussing research challenges for progressive network forensics in MCC. PMID:25097880

  4. A Comprehensive Review on Adaptability of Network Forensics Frameworks for Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Suleman Khan

    2014-01-01

    Full Text Available Network forensics enables investigation and identification of network attacks through the retrieved digital content. The proliferation of smartphones and the cost-effective universal data access through the cloud have made Mobile Cloud Computing (MCC) a congenial target for network attacks. However, carrying out forensics in MCC is constrained by the autonomous cloud hosting companies and their policies of restricted access to the digital content in the back-end cloud platforms. This implies that existing Network Forensic Frameworks (NFFs) have limited impact in the MCC paradigm. To this end, we qualitatively analyze the adaptability of existing NFFs when applied to MCC. Explicitly, the fundamental mechanisms of NFFs are highlighted and then analyzed using the most relevant parameters. A classification is proposed to help understand the anatomy of existing NFFs. Subsequently, a comparison is given that explores the functional similarities and deviations among NFFs. The paper concludes by discussing research challenges for progressive network forensics in MCC.

  5. A comprehensive review on adaptability of network forensics frameworks for mobile cloud computing.

    Science.gov (United States)

    Khan, Suleman; Shiraz, Muhammad; Wahab, Ainuddin Wahid Abdul; Gani, Abdullah; Han, Qi; Rahman, Zulkanain Bin Abdul

    2014-01-01

    Network forensics enables investigation and identification of network attacks through the retrieved digital content. The proliferation of smartphones and the cost-effective universal data access through the cloud have made Mobile Cloud Computing (MCC) a congenial target for network attacks. However, carrying out forensics in MCC is constrained by the autonomous cloud hosting companies and their policies of restricted access to the digital content in the back-end cloud platforms. This implies that existing Network Forensic Frameworks (NFFs) have limited impact in the MCC paradigm. To this end, we qualitatively analyze the adaptability of existing NFFs when applied to MCC. Explicitly, the fundamental mechanisms of NFFs are highlighted and then analyzed using the most relevant parameters. A classification is proposed to help understand the anatomy of existing NFFs. Subsequently, a comparison is given that explores the functional similarities and deviations among NFFs. The paper concludes by discussing research challenges for progressive network forensics in MCC.

  6. Communicating climate change adaptation information using web-based platforms

    Directory of Open Access Journals (Sweden)

    E. Karali

    2017-07-01

    Full Text Available To facilitate progress in climate change adaptation policy and practice, it is important not only to ensure the production of accurate, comprehensive and relevant information, but also easy, timely and affordable access to it. This can contribute to better-informed decisions and improve the design and implementation of adaptation policies and other relevant initiatives. Web-based platforms can play an important role in communicating and distributing data, information and knowledge that become constantly available, reaching out to a large group of potential users. Indeed, in the last decade there has been an extensive increase in the number of platforms developed for this purpose in many fields, including climate change adaptation. This short paper concentrates on the web-based adaptation platforms developed in Europe. It provides an overview of the recently emerged landscape, examines the basic characteristics of a set of platforms that operate at national, transnational and European level, and discusses some of the key challenges related to their development, maintenance and overall management. Findings presented in this short paper are discussed in greater detail in the European Environment Agency Technical Report "Overview of climate change adaptation platforms in Europe".

  7. Communicating climate change adaptation information using web-based platforms

    Science.gov (United States)

    Karali, Eleni; Mattern, Kati

    2017-07-01

    To facilitate progress in climate change adaptation policy and practice, it is important not only to ensure the production of accurate, comprehensive and relevant information, but also easy, timely and affordable access to it. This can contribute to better-informed decisions and improve the design and implementation of adaptation policies and other relevant initiatives. Web-based platforms can play an important role in communicating and distributing data, information and knowledge that become constantly available, reaching out to a large group of potential users. Indeed, in the last decade there has been an extensive increase in the number of platforms developed for this purpose in many fields, including climate change adaptation. This short paper concentrates on the web-based adaptation platforms developed in Europe. It provides an overview of the recently emerged landscape, examines the basic characteristics of a set of platforms that operate at national, transnational and European level, and discusses some of the key challenges related to their development, maintenance and overall management. Findings presented in this short paper are discussed in greater detail in the European Environment Agency Technical Report "Overview of climate change adaptation platforms in Europe".

  8. Improvement of resolution in full-view linear-array photoacoustic computed tomography using a novel adaptive weighting method

    Science.gov (United States)

    Omidi, Parsa; Diop, Mamadou; Carson, Jeffrey; Nasiriavanaki, Mohammadreza

    2017-03-01

    Linear-array-based photoacoustic computed tomography is a popular methodology for deep and high-resolution imaging. However, issues such as phase aberration, side-lobe effects, and propagation limitations deteriorate the resolution. The effect of phase aberration due to acoustic attenuation and the assumption of a constant speed of sound (SoS) can be reduced by applying an adaptive weighting method such as the coherence factor (CF). Utilizing an adaptive beamforming algorithm such as minimum variance (MV) can improve the resolution at the focal point by suppressing the side-lobes. Moreover, the invisibility of directional objects emitting parallel to the detection plane, such as vessels and other absorbing structures stretched in the direction perpendicular to the detection plane, can degrade resolution. In this study, we propose a full-view array-level weighting algorithm in which different weights are assigned to different positions of the linear array based on an orientation algorithm that uses the histogram of oriented gradients (HOG). Simulation results obtained from a synthetic phantom show the superior performance of the proposed method over existing reconstruction methods.
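The coherence factor named in the abstract is a standard adaptive weight in beamforming: CF = |Σ s_i|² / (N · Σ |s_i|²), equal to 1 for perfectly coherent channel data and near 0 for incoherent (aberrated) data. A minimal per-pixel sketch, with toy channel samples standing in for delayed RF data:

```python
# Delay-and-sum (DAS) output at one pixel, scaled by the coherence
# factor (CF). The channel samples are assumed to be already delayed
# to the pixel of interest; values here are toy illustrations.

def coherence_factor(samples):
    """CF = |sum s_i|^2 / (N * sum |s_i|^2); 1.0 for coherent channels,
    small for incoherent (phase-aberrated) ones."""
    n = len(samples)
    num = abs(sum(samples)) ** 2
    den = n * sum(abs(s) ** 2 for s in samples)
    return num / den if den else 0.0

def cf_weighted_das(samples):
    """CF-weighted delay-and-sum: incoherent pixels are suppressed."""
    return coherence_factor(samples) * sum(samples) / len(samples)

coherent   = [1.0, 1.0, 1.0, 1.0]     # aligned channel data (on-target)
incoherent = [1.0, -1.0, 1.0, -1.0]   # phase-aberrated data (clutter)
print(coherence_factor(coherent))     # 1.0
print(coherence_factor(incoherent))   # 0.0
```

Because the weight multiplies the DAS sum pixel by pixel, side-lobe and aberration clutter is attenuated while focal signals pass through unchanged.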

  9. Endurance Exercise in Hypoxia, Hyperoxia and Normoxia: Mitochondrial and Global Adaptations.

    Science.gov (United States)

    Przyklenk, Axel; Gutmann, Boris; Schiffer, Thorsten; Hollmann, Wildor; Strueder, Heiko K; Bloch, Wilhelm; Mierau, Andreas; Gehlert, Sebastian

    2017-07-01

    We hypothesized that short-term endurance exercise (EN) in hypoxia (HY) exerts decreased mitochondrial adaptation, peak oxygen consumption (VO2peak) and peak power output (PPO) compared to EN in normoxia (NOR) and hyperoxia (PER). 11 male subjects performed repeated unipedal cycling EN in HY, PER, and NOR over 4 weeks in a cross-over design. VO2peak, PPO, rate of perceived exertion (RPE) and blood lactate (Bla) were determined pre- and post-intervention to assess physiological demands and adaptation. Skeletal muscle biopsies were collected to determine molecular mitochondrial signaling and adaptation. Despite reduced exercise intensity in HY (P < 0.05), VO2peak and PPO did not differ significantly between conditions (P > 0.05). Electron transport chain complexes tended to increase in all groups, with the highest increase in HY (n.s.). EN-induced mitochondrial adaptability and exercise capacity neither decreased significantly in HY nor increased in PER compared to NOR. Despite decreased exercise intensity, short-term EN under HY may not necessarily impair mitochondrial adaptation and exercise capacity, while PER does not augment adaptation. HY might strengthen adaptive responses under circumstances when absolute training intensity has to be reduced. © Georg Thieme Verlag KG Stuttgart · New York.

  10. Development of an Adaptive Forecasting System: A Case Study of a PC Manufacturer in South Korea

    Directory of Open Access Journals (Sweden)

    Chihyun Jung

    2016-03-01

    Full Text Available We present a case study of the development of an adaptive forecasting system for a leading personal computer (PC) manufacturer in South Korea. It is widely accepted that demand forecasting for products with short product life cycles (PLCs) is difficult, and the PLC of a PC is generally very short. The firm has various types of products, and the volatile demand patterns differ by product. Moreover, we found that different departments have different requirements when it comes to the accuracy, timing and range of the forecasts. We divide the demand forecasting process into three stages depending on the requirements and purposes. A systematic forecasting process is then introduced to improve the accuracy of demand forecasting and to meet the department-specific requirements. Moreover, a newly devised short-term forecasting method is presented, which utilizes the long-term forecasting results of the preceding stages. We evaluate our systematic forecasting methods on actual sales data from the PC manufacturer, where our forecasting methods have been implemented.
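One simple way a short-term stage can reuse long-term forecasting results, as the abstract describes, is to rescale the remaining long-term plan by how far recent actuals have deviated from it. This is a hedged sketch of that idea only; the blending rule and the numbers are illustrative, not the firm's method:

```python
# Short-term forecast derived from a long-term plan: scale the
# not-yet-elapsed portion of the plan by the ratio of observed demand
# to planned demand over the elapsed weeks. Illustrative values only.

def short_term_forecast(long_term, actuals):
    """long_term: planned weekly demand for the period (list),
    actuals: observed demand for the weeks already elapsed."""
    k = len(actuals)
    expected_so_far = sum(long_term[:k])
    ratio = sum(actuals) / expected_so_far if expected_so_far else 1.0
    return [ratio * f for f in long_term[k:]]

plan = [100, 100, 120, 140]        # long-term forecast per week
observed = [90, 110]               # first two weeks of actual sales
print(short_term_forecast(plan, observed))  # [120.0, 140.0]
```

Here the first two weeks sum exactly to plan, so the remaining weeks pass through unchanged; had actuals run 10% hot, the remaining forecasts would be scaled up by 10%.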

  11. Design strategies for irregularly adapting parallel applications

    International Nuclear Information System (INIS)

    Oliker, Leonid; Biswas, Rupak; Shan, Hongzhang; Sing, Jaswinder Pal

    2000-01-01

    Achieving scalable performance for dynamic irregular applications is eminently challenging. Traditional message-passing approaches have been making steady progress towards this goal; however, they suffer from complex implementation requirements. The use of a global address space greatly simplifies the programming task, but can degrade the performance of dynamically adapting computations. In this work, we examine two major classes of adaptive applications, under five competing programming methodologies and four leading parallel architectures. Results indicate that it is possible to achieve message-passing performance using shared-memory programming techniques by carefully following the same high level strategies. Adaptive applications have computational work loads and communication patterns which change unpredictably at runtime, requiring dynamic load balancing to achieve scalable performance on parallel machines. Efficient parallel implementations of such adaptive applications are therefore a challenging task. This work examines the implementation of two typical adaptive applications, Dynamic Remeshing and N-Body, across various programming paradigms and architectural platforms. We compare several critical factors of the parallel code development, including performance, programmability, scalability, algorithmic development, and portability

  12. Cas4 Facilitates PAM-Compatible Spacer Selection during CRISPR Adaptation

    NARCIS (Netherlands)

    Kieper, Sebastian N.; Almendros, Cristóbal; Behler, Juliane; McKenzie, Rebecca E.; Nobrega, Franklin L.; Haagsma, Anna C.; Vink, Jochem N.A.; Hess, Wolfgang R.; Brouns, Stan J.J.

    2018-01-01

    CRISPR-Cas systems adapt their immunological memory against their invaders by integrating short DNA fragments into clustered regularly interspaced short palindromic repeat (CRISPR) loci. While Cas1 and Cas2 make up the core machinery of the CRISPR integration process, various class I and II

  13. Dynamical adaptation in photoreceptors.

    Directory of Open Access Journals (Sweden)

    Damon A Clark

    Full Text Available Adaptation is at the heart of sensation and nowhere is it more salient than in early visual processing. Light adaptation in photoreceptors is doubly dynamical: it depends upon the temporal structure of the input and it affects the temporal structure of the response. We introduce a non-linear dynamical adaptation model of photoreceptors. It is simple enough that it can be solved exactly and simulated with ease; analytical and numerical approaches combined provide both intuition on the behavior of dynamical adaptation and quantitative results to be compared with data. Yet the model is rich enough to capture intricate phenomenology. First, we show that it reproduces the known phenomenology of light response and short-term adaptation. Second, we present new recordings and demonstrate that the model reproduces cone response with great precision. Third, we derive a number of predictions on the response of photoreceptors to sophisticated stimuli such as periodic inputs, various forms of flickering inputs, and natural inputs. In particular, we demonstrate that photoreceptors undergo rapid adaptation of response gain and time scale, over ∼300 ms, i.e., over the time scale of the response itself, and we confirm this prediction with data. For natural inputs, this fast adaptation can modulate the response gain more than tenfold and is hence physiologically relevant.
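The qualitative behavior described above, a gain that adapts on the timescale of the response itself, can be sketched with a generic gain-control model. This is not the authors' model; the relaxation rule, time constants and stimulus values are illustrative assumptions:

```python
# Toy dynamical gain adaptation: the response gain g relaxes toward a
# level set by a running average of the input, so a step to bright
# input transiently overshoots and then adapts back down.

dt, tau_avg, tau_g = 1.0, 50.0, 100.0   # ms; illustrative constants

def simulate(stimulus):
    avg, g, out = stimulus[0], 1.0, []
    for intensity in stimulus:
        avg += dt / tau_avg * (intensity - avg)        # running mean of light
        g += dt / tau_g * (1.0 / (1.0 + avg) - g)      # gain tracks mean light
        out.append(g * intensity)                      # adapted response
    return out

# Step from dim (0.1) to bright (2.0) input
resp = simulate([0.1] * 200 + [2.0] * 400)
peak, steady = max(resp[200:]), resp[-1]
print(peak > steady)   # True: the response overshoots, then adapts down
```

The overshoot-then-relax shape is the signature of gain adapting over a few hundred milliseconds, the same order as the response duration in the abstract.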

  14. Hardware Acceleration of Adaptive Neural Algorithms.

    Energy Technology Data Exchange (ETDEWEB)

    James, Conrad D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-11-01

    As traditional numerical computing has faced challenges, researchers have turned towards alternative computing approaches to reduce power-per-computation metrics and improve algorithm performance. Here, we describe an approach towards non-conventional computing that strengthens the connection between machine learning and neuroscience concepts. The Hardware Acceleration of Adaptive Neural Algorithms (HAANA) project has developed neural machine learning algorithms and hardware for applications in image processing and cybersecurity. While machine learning methods are effective at extracting relevant features from many types of data, the effectiveness of these algorithms degrades when subjected to real-world conditions. Our team has generated novel neural-inspired approaches to improve the resiliency and adaptability of machine learning algorithms. In addition, we have also designed and fabricated hardware architectures and microelectronic devices specifically tuned towards the training and inference operations of neural-inspired algorithms. Finally, our multi-scale simulation framework allows us to assess the impact of microelectronic device properties on algorithm performance.

  15. A New Mobile Learning Adaptation Model

    OpenAIRE

    Mohamd Hassan Hassan; Jehad Al-Sadi

    2009-01-01

    This paper introduces a new model for m-Learning context adaptation, driven by the need to utilize mobile technology in education. Mobile learning (m-Learning for short) is considered to be one of the hottest topics in the educational community, and much research has been done to conceptualize this new form of learning. We present a promising design for a model to adapt the learning content in mobile learning applications in order to match the learner context, preferences and the educatio...

  16. Adaptive oxide electronics: A review

    Science.gov (United States)

    Ha, Sieu D.; Ramanathan, Shriram

    2011-10-01

    Novel information processing techniques are being actively explored to overcome fundamental limitations associated with CMOS scaling. A new paradigm of adaptive electronic devices is emerging that may reshape the frontiers of electronics and enable new modalities. Creating systems that can learn and adapt to various inputs has generally been a complex algorithmic problem in information science, albeit with wide-ranging and powerful applications from medical diagnosis to control systems. Recent work in oxide electronics suggests that it may be plausible to implement such systems at the device level, thereby drastically increasing computational density and power efficiency and expanding the potential for electronics beyond Boolean computation. Intriguing possibilities of adaptive electronics include the fabrication of devices that mimic human brain functionality: the strengthening and weakening of synapses emulated by electrically, magnetically, thermally, or optically tunable properties of materials. In this review, we detail materials and device physics studies on functional metal oxides that may be utilized for adaptive electronics. It has been shown that properties such as resistivity, polarization, and magnetization of many oxides can be modified electrically in a non-volatile manner, suggesting that these materials respond to electrical stimuli similarly to a neural synapse. We discuss which device characteristics are likely to be relevant for integration into adaptive platforms and then survey a variety of oxides with respect to these properties, such as, but not limited to, TaOx, SrTiO3, and Bi4-xLaxTi3O12. The physical mechanisms in each case are detailed and analyzed within the framework of adaptive electronics. We then review theoretically formulated and experimentally realized adaptive devices built with functional oxides, such as self-programmable logic and neuromorphic circuits. 
Finally, we speculate on what advances in materials physics and engineering may
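The synapse-like behavior the review attributes to tunable oxides can be caricatured with a bounded conductance-update model. This is a generic illustrative sketch, not a measured device model; the conductance window, update rule and constants are assumptions:

```python
# Toy two-terminal "synapse": conductance G is potentiated by positive
# voltage pulses and depressed by negative ones, with soft bounds that
# loosely mimic the non-volatile tunable resistance described for
# oxides such as TaOx. All constants are arbitrary illustrations.

G_MIN, G_MAX = 1e-6, 1e-3   # conductance window, siemens

def apply_pulse(g, v, alpha=0.1):
    """Positive pulses potentiate, negative depress; the soft-bound
    terms shrink the update as g approaches either limit."""
    if v > 0:
        g += alpha * v * (G_MAX - g)
    else:
        g += alpha * v * (g - G_MIN)
    return min(G_MAX, max(G_MIN, g))

g = 1e-4
for _ in range(5):
    g = apply_pulse(g, +1.0)   # potentiation pulse train
print(g > 1e-4)  # True: conductance strengthened, like LTP in a synapse
```

The non-volatility in the review corresponds to g persisting between pulses; the soft bounds give the saturating weight updates that neuromorphic circuits typically assume.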

  17. The Psychological Well-Being and Sociocultural Adaptation of Short-Term International Students in Ireland

    Science.gov (United States)

    O'Reilly, Aileen; Ryan, Dermot; Hickey, Tina

    2010-01-01

    This article reports on an empirical study of the psychosocial adaptation of international students in Ireland. Using measures of social support, loneliness, stress, psychological well-being, and sociocultural adaptation, data were obtained from international students and a comparison sample of Irish students. The study found that, although…

  18. Linear hypergeneralization of learned dynamics across movement speeds reveals anisotropic, gain-encoding primitives for motor adaptation.

    Science.gov (United States)

    Joiner, Wilsaan M; Ajayi, Obafunso; Sing, Gary C; Smith, Maurice A

    2011-01-01

    The ability to generalize learned motor actions to new contexts is a key feature of the motor system. For example, the ability to ride a bicycle or swing a racket is often first developed at lower speeds and later applied to faster velocities. A number of previous studies have examined the generalization of motor adaptation across movement directions and found that the learned adaptation decays in a pattern consistent with the existence of motor primitives that display narrow Gaussian tuning. However, few studies have examined the generalization of motor adaptation across movement speeds. Following adaptation to linear velocity-dependent dynamics during point-to-point reaching arm movements at one speed, we tested the ability of subjects to transfer this adaptation to short-duration higher-speed movements aimed at the same target. We found near-perfect linear extrapolation of the trained adaptation with respect to both the magnitude and the time course of the velocity profiles associated with the high-speed movements: a 69% increase in movement speed corresponded to a 74% extrapolation of the trained adaptation. The close match between the increase in movement speed and the corresponding increase in adaptation beyond what was trained indicates linear hypergeneralization. Computational modeling shows that this pattern of linear hypergeneralization across movement speeds is not compatible with previous models of adaptation in which motor primitives display isotropic Gaussian tuning of motor output around their preferred velocities. Instead, we show that this generalization pattern indicates that the primitives involved in the adaptation to viscous dynamics display anisotropic tuning in velocity space and encode the gain between motor output and motion state rather than motor output itself.
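The contrast at the center of this abstract, decaying Gaussian generalization versus linear gain scaling, can be made concrete numerically. The 69% speed increase comes from the abstract; the training speed and tuning width below are illustrative assumptions:

```python
import math

# Two generalization schemes for adaptation trained at speed v_train
# and probed at v_test: isotropic Gaussian tuning predicts *decay* at
# untrained speeds, gain-encoding primitives predict *linear* scaling.

v_train, v_test = 1.0, 1.69          # 69% faster test movement (abstract)
sigma = 0.4                          # hypothetical Gaussian tuning width

gaussian_pred = math.exp(-(v_test - v_train) ** 2 / (2 * sigma ** 2))
linear_pred = v_test / v_train       # gain encoding: output scales with speed

print(round(gaussian_pred, 3))  # ≈ 0.226: adaptation decays off-training
print(round(linear_pred, 2))    # 1.69: adaptation extrapolates linearly
```

The observed 74% extrapolation sits on the linear prediction, which is why the authors reject isotropic Gaussian tuning in favor of anisotropic, gain-encoding primitives.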

  19. Reconciling White-Box and Black-Box Perspectives on Behavioral Self-adaptation

    DEFF Research Database (Denmark)

    Bruni, Roberto; Corradini, Andrea; Gadducci, Fabio

    2015-01-01

    This paper proposes to reconcile two perspectives on behavioral adaptation commonly taken at different stages of the engineering of autonomic computing systems. Requirements engineering activities often take a black-box perspective: a system is considered to be adaptive with respect to an environment whenever the system is able to satisfy its goals irrespective of the environment perturbations. Modeling and programming engineering activities often take a white-box perspective: a system is equipped with suitable adaptation mechanisms and its behavior is classified as adaptive depending...

  20. Adaptive game AI with dynamic scripting

    OpenAIRE

    Spronck, P.; Ponsen, M.J.V.; Sprinkhuizen-Kuyper, I.G.; Postma, E.O.

    2006-01-01

    Online learning in commercial computer games allows computer-controlled opponents to adapt to the way the game is being played. As such it provides a mechanism to deal with weaknesses in the game AI, and to respond to changes in human player tactics. We argue that online learning of game AI should meet four computational and four functional requirements. The computational requirements are speed, effectiveness, robustness and efficiency. The functional requirements are clarity, variety, consi...
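The dynamic-scripting technique named in the title can be sketched as weighted rule selection plus post-encounter weight updates. This is a hedged outline of the idea only; the rule names, update constant and clamping values are illustrative, not taken from the paper:

```python
import random

# Sketch of dynamic scripting: an opponent script is drawn from a
# rulebase with probability proportional to rule weight, and weights
# are adapted after each encounter. Clamping keeps every rule
# selectable, which preserves variety in generated scripts.

class Rulebase:
    def __init__(self, rules, w_init=100.0, w_min=10.0, w_max=400.0):
        self.w = {r: w_init for r in rules}
        self.w_min, self.w_max = w_min, w_max

    def select_script(self, size, rng):
        """Draw `size` distinct rules, probability proportional to weight."""
        pool, script = dict(self.w), []
        for _ in range(size):
            pick = rng.uniform(0, sum(pool.values()))
            for rule, weight in list(pool.items()):
                pick -= weight
                if pick <= 0:
                    script.append(rule)
                    del pool[rule]
                    break
        return script

    def update(self, script, won, delta=30.0):
        """Reward rules used in a winning script, penalize a losing one."""
        adj = delta if won else -delta
        for rule in script:
            self.w[rule] = min(self.w_max, max(self.w_min, self.w[rule] + adj))

rb = Rulebase(["charge", "flank", "heal", "retreat", "fireball"])
script = rb.select_script(3, random.Random(0))     # 3-rule opponent script
rb.update(["charge", "fireball"], won=True)        # this script beat the player
rb.update(["charge", "heal"], won=False)           # this one lost
print(len(script), rb.w["fireball"], rb.w["heal"])  # 3 130.0 70.0
```

Over repeated encounters, rules that contribute to wins accumulate weight and appear in scripts more often, which is how the AI adapts online to the player's tactics.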

  1. Cas4 Facilitates PAM-Compatible Spacer Selection during CRISPR Adaptation

    NARCIS (Netherlands)

    Kieper, S.N.; Almendros, Cristóbal; Behler, Juliane; McKenzie, R.E.; Luzia De Nóbrega, F.; van Eijkeren-Haagsma, A.C.; Vink, J.N.A.; Hess, Wolfgang R.; Brouns, S.J.J.

    2018-01-01

    CRISPR-Cas systems adapt their immunological memory against their invaders by integrating short DNA fragments into clustered regularly interspaced short palindromic repeat (CRISPR) loci. While Cas1 and Cas2 make up the core machinery of the CRISPR integration process, various class I and II

  2. Computed 3D visualisation of an extinct cephalopod using computer tomographs.

    Science.gov (United States)

    Lukeneder, Alexander

    2012-08-01

    The first 3D visualisation of a heteromorph cephalopod species from the Southern Alps (Dolomites, northern Italy) is presented. Computed tomography, palaeontological data and 3D reconstructions were combined in the production of a movie showing a life reconstruction of the extinct organism. This detailed reconstruction accords with current knowledge of the shape, mode of life and habitat of this animal. The results are based on the most complete shell known thus far of the genus Dissimilites. Object-based combined analyses from computed tomography and various 3D computing programmes help in understanding morphological details as well as their ontogenetic changes in fossil material. In this study, an additional goal was to show changes in locomotion during different ontogenetic phases of such fossil, marine shell-bearing animals (ammonoids). Hence, the presented models and tools can serve as starting points for discussions on the morphology and locomotion of extinct cephalopods in general, and of the genus Dissimilites in particular. The heteromorph ammonoid genus Dissimilites is interpreted here as an active swimmer of the Tethyan Ocean. This study portrays non-destructive methods of 3D visualisation applied to palaeontological material, starting with computed tomography and resulting in animated, high-quality video clips. The 3D geometrical models and animation presented here, which are based on palaeontological material, demonstrate the wide range of applications and analytical techniques, and also outline possible limitations of 3D models in the earth sciences and palaeontology. The realistic 3D models and motion pictures can easily be shared amongst palaeontologists. Data, images and short clips can be discussed online and, if necessary, adapted in morphological detail and motion style to better represent the cephalopod animal.

  4. Test Information Targeting Strategies for Adaptive Multistage Testing Designs.

    Science.gov (United States)

    Luecht, Richard M.; Burgin, William

    Adaptive multistage testlet (MST) designs appear to be gaining popularity for many large-scale computer-based testing programs. These adaptive MST designs use a modularized configuration of preconstructed testlets and embedded score-routing schemes to prepackage different forms of an adaptive test. The conditional information targeting (CIT)…

  5. Adaptive Finite Element Method Assisted by Stochastic Simulation of Chemical Systems

    KAUST Repository

    Cotter, Simon L.; Vejchodský, Tomáš; Erban, Radek

    2013-01-01

    Stochastic models of chemical systems are often analyzed by solving the corresponding Fokker-Planck equation, which is a drift-diffusion partial differential equation for the probability distribution function. Efficient numerical solution of the Fokker-Planck equation requires adaptive mesh refinements. In this paper, we present a mesh refinement approach which makes use of a stochastic simulation of the underlying chemical system. By observing the stochastic trajectory for a relatively short amount of time, the areas of the state space with nonnegligible probability density are identified. By refining the finite element mesh in these areas, and coarsening elsewhere, a suitable mesh is constructed and used for the computation of the stationary probability density. Numerical examples demonstrate that the presented method is competitive with existing a posteriori methods. © 2013 Society for Industrial and Applied Mathematics.
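
    The workflow in this record (simulate briefly, locate the nonnegligible probability mass, refine there) can be sketched for a toy birth-death process; the model, rates and mesh spacings below are illustrative, not the paper's:

```python
import random

def ssa_birth_death(k_prod=20.0, k_deg=1.0, x0=0, t_end=50.0, seed=1):
    """Gillespie simulation of X -> X+1 (rate k_prod) and X -> X-1 (rate k_deg*X).
    Returns the set of states visited by the short trajectory."""
    random.seed(seed)
    t, x, visited = 0.0, x0, set()
    while t < t_end:
        a1, a2 = k_prod, k_deg * x
        a0 = a1 + a2
        t += random.expovariate(a0)            # time to next reaction
        x = x + 1 if random.random() < a1 / a0 else x - 1
        visited.add(x)
    return visited

visited = ssa_birth_death()
lo, hi = min(visited), max(visited)

# Refine only where the trajectory indicates nonnegligible density:
# unit spacing inside [lo, hi], coarse spacing in a padded hull outside.
fine = list(range(lo, hi + 1))
coarse = list(range(max(0, lo - 20), lo, 5)) + list(range(hi + 1, hi + 21, 5))
mesh = sorted(coarse + fine)
```

    The stationary density (here a Poisson-like distribution around k_prod/k_deg) would then be computed on `mesh` by the finite element solver.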

  6. Teduglutide for the treatment of short bowel syndrome

    DEFF Research Database (Denmark)

    Jeppesen, P B

    2013-01-01

    Glucagon-like peptide 2 (GLP-2) decreases gastric and intestinal motility, reduces gastric secretions, promotes intestinal growth and improves post-resection structural and functional adaptation in short bowel syndrome (SBS). Teduglutide, an analogue of GLP-2, has a prolonged half-life and provides...

  7. Partially Adaptive STAP Algorithm Approaches to functional MRI

    OpenAIRE

    Huang, Lejian; Thompson, Elizabeth A.; Schmithorst, Vincent; Holland, Scott K.; Talavage, Thomas M.

    2008-01-01

    In this work, the architectures of three partially adaptive STAP algorithms are introduced, one of which is explored in detail, that reduce dimensionality and improve tractability over fully adaptive STAP when used in construction of brain activation maps in fMRI. Computer simulations incorporating actual MRI noise and human data analysis indicate that element space partially adaptive STAP can attain close to the performance of fully adaptive STAP while significantly decreasing processing tim...

  8. The Benefits of Adaptive Partitioning for Parallel AMR Applications

    Energy Technology Data Exchange (ETDEWEB)

    Steensland, Johan [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Advanced Software Research and Development

    2008-07-01

    Parallel adaptive mesh refinement methods potentially lead to realistic modeling of complex three-dimensional physical phenomena. However, the dynamics inherent in these methods present significant challenges in data partitioning and load balancing. Significant human resources, including time, effort, experience, and knowledge, are required for determining the optimal partitioning technique for each new simulation. In reality, scientists resort to using the on-board partitioner of the computational framework, or to using the partitioning industry standard, ParMetis. Adaptive partitioning refers to repeatedly selecting, configuring and invoking the optimal partitioning technique at run-time, based on the current state of the computer and application. In theory, adaptive partitioning automatically delivers superior performance and eliminates the need for repeatedly spending valuable human resources for determining the optimal static partitioning technique. In practice, however, enabling frameworks are non-existent due to the inherent significant inter-disciplinary research challenges. This paper presents a study of a simple implementation of adaptive partitioning and discusses implied potential benefits from the perspective of common groups of users within computational science. The study is based on a large set of data derived from experiments including six real-life, multi-time-step adaptive applications from various scientific domains, five complementing and fundamentally different partitioning techniques, a large set of parameters corresponding to a wide spectrum of computing environments, and a flexible cost function that considers the relative impact of multiple partitioning metrics and diverse partitioning objectives. The results show that even a simple implementation of adaptive partitioning can automatically generate results statistically equivalent to the best static partitioning. 
Thus, it is possible to effectively eliminate the problem of determining the optimal static partitioning technique.
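
    The run-time selection step described above reduces to a few lines: score each candidate partitioner under a weighted cost function over the current state's metrics and invoke the cheapest. The partitioner names, metrics and weights below are purely illustrative:

```python
def select_partitioner(metrics, weights):
    """Adaptive partitioning in miniature: given per-partitioner metric
    predictions for the current application/machine state, pick the one
    minimizing a weighted cost function."""
    def cost(m):
        return sum(weights[k] * m[k] for k in weights)
    return min(metrics, key=lambda name: cost(metrics[name]))

# Hypothetical predicted metrics for this time step (all numbers invented):
metrics = {
    "ParMetis": {"load_imbalance": 0.05, "edge_cut": 0.30, "repartition_time": 0.20},
    "SFC":      {"load_imbalance": 0.10, "edge_cut": 0.40, "repartition_time": 0.05},
    "RCB":      {"load_imbalance": 0.08, "edge_cut": 0.35, "repartition_time": 0.10},
}
weights = {"load_imbalance": 0.5, "edge_cut": 0.3, "repartition_time": 0.2}
best = select_partitioner(metrics, weights)
```

    Changing the weights shifts the choice, which is exactly the paper's point: the "best" technique depends on the relative impact assigned to each partitioning metric.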

  9. Water System Adaptation To Hydrological Changes: Module 5, Water Quality and Infrastructure Response to Rapid Urbanization: Adaptation Case Study in China

    Science.gov (United States)

    This course will introduce students to the fundamental principles of water system adaptation to hydrological changes, with emphasis on data analysis and interpretation, technical planning, and computational modeling. Starting with real-world scenarios and adaptation needs, the co...

  10. Representing adaptive and adaptable Units of Learning. How to model personalized eLearning in IMS Learning Design

    OpenAIRE

    Burgos, Daniel; Tattersall, Colin; Koper, Rob

    2006-01-01

    Burgos, D., Tattersall, C., & Koper, E. J. R. (2007). Representing adaptive and adaptable Units of Learning. How to model personalized eLearning in IMS Learning Design. In B. Fernández Manjon, J. M. Sanchez Perez, J. A. Gómez Pulido, M. A. Vega Rodriguez & J. Bravo (Eds.), Computers and Education: E-learning - from theory to practice. Germany: Kluwer.

  11. Adaptive ILC with an adaptive iterative learning gain

    International Nuclear Information System (INIS)

    Ashraf, S.; Muhammad, E.

    2008-01-01

    This paper describes the design of an adaptive ILC (Iterative Learning Controller) with an adaptive iterative learning gain. The basic idea behind ILC is that information obtained from one trial can be used to improve the control input for the next trial. The proposed scheme extends this idea further and suggests that information obtained from one trial can also be used to improve the control algorithm's parameters (gain matrices). The scheme converges faster than conventional ILC; convergence speed, and hence the number of iterations, has always been an issue with ILC. Because of its simple mathematical structure, the scheme can be implemented with less memory and simpler hardware, as opposed to other adaptive schemes, which are computationally expensive. (author)
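
    A minimal sketch of the idea for an illustrative first-order plant (the paper's actual gain-matrix update is not reproduced in the abstract): the trial-to-trial update keeps the usual P-type ILC structure, while the scalar learning gain is itself adapted between trials:

```python
import numpy as np

def run_plant(u, a=0.3, b=0.5):
    """Illustrative first-order plant y[t+1] = a*y[t] + b*u[t]."""
    y = np.zeros(len(u) + 1)
    for t in range(len(u)):
        y[t + 1] = a * y[t] + b * u[t]
    return y[1:]

T = 50
y_ref = np.sin(np.linspace(0.0, np.pi, T))   # desired output over one trial
u = np.zeros(T)                              # control input, refined trial-to-trial
gain, prev_err = 0.5, np.inf

for trial in range(30):
    e = y_ref - run_plant(u)                 # tracking error of this trial
    err = np.linalg.norm(e)
    # Adapt the learning gain between trials: grow it while the error keeps
    # shrinking, back off if a trial made things worse.
    gain = min(1.5 * gain, 1.9) if err < prev_err else 0.5 * gain
    prev_err = err
    u = u + gain * e                         # P-type ILC update

final_err = np.linalg.norm(y_ref - run_plant(u))
```

    Growing the gain while trials keep improving is what cuts the iteration count relative to a fixed conservative gain.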

  12. Implementation of an Improved Adaptive Testing Theory

    Science.gov (United States)

    Al-A'ali, Mansoor

    2007-01-01

    Computer adaptive testing is the study of scoring tests and questions based on assumptions concerning the mathematical relationship between examinees' ability and the examinees' responses. Adaptive student tests, which are based on item response theory (IRT), have many advantages over conventional tests. We use the least square method, a…

  13. Computing Nash equilibria through computational intelligence methods

    Science.gov (United States)

    Pavlidis, N. G.; Parsopoulos, K. E.; Vrahatis, M. N.

    2005-03-01

    Nash equilibrium constitutes a central solution concept in game theory. The task of detecting the Nash equilibria of a finite strategic game remains a challenging problem to date. This paper investigates the effectiveness of three computational intelligence techniques, namely covariance matrix adaptation evolution strategies, particle swarm optimization, and differential evolution, in computing Nash equilibria of finite strategic games as global minima of a real-valued, nonnegative function. An issue of particular interest is detecting more than one Nash equilibrium of a game. The performance of the considered computational intelligence methods on this problem is investigated using multistart and deflection.
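
    The formulation can be illustrated on matching pennies: a nonnegative best-response-regret function is zero exactly at a Nash equilibrium, and a bare-bones differential evolution loop (not the paper's implementation) minimizes it:

```python
import numpy as np

# Matching pennies: payoff matrices for players 1 and 2 (zero-sum).
A = np.array([[1.0, -1.0], [-1.0, 1.0]])
B = -A

def objective(x):
    """Nonnegative function whose zeros are the Nash equilibria.
    x = (p, q): probability each player puts on their first action."""
    p = np.array([x[0], 1 - x[0]])
    q = np.array([x[1], 1 - x[1]])
    u1, u2 = p @ A @ q, p @ B @ q
    r1 = max(A @ q) - u1          # player 1's best-response regret
    r2 = max(p @ B) - u2          # player 2's best-response regret
    return max(r1, 0.0) ** 2 + max(r2, 0.0) ** 2

def differential_evolution(f, dim=2, pop=30, gens=200, F=0.7, CR=0.9, seed=0):
    """Minimal DE/rand/1/bin over the unit box."""
    rng = np.random.default_rng(seed)
    X = rng.random((pop, dim))
    fit = np.array([f(x) for x in X])
    for _ in range(gens):
        for i in range(pop):
            a, b, c = X[rng.choice(pop, 3, replace=False)]
            trial = np.clip(a + F * (b - c), 0.0, 1.0)   # mutation
            mask = rng.random(dim) < CR                  # binomial crossover
            trial = np.where(mask, trial, X[i])
            ft = f(trial)
            if ft <= fit[i]:                             # greedy selection
                X[i], fit[i] = trial, ft
    return X[np.argmin(fit)], fit.min()

x_star, f_star = differential_evolution(objective)
```

    Matching pennies has a unique mixed equilibrium at (0.5, 0.5), so driving the regret function to zero recovers it; the deflection technique mentioned in the abstract would then penalize this minimum to search for further equilibria.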

  14. Adaption of the radiation dose for computed tomography of the body - background for the dose adaption programme OmnimAs; Straaldosreglering vid kroppsdatortomografi - bakgrund till dosregleringsprogrammet OmnimAs

    Energy Technology Data Exchange (ETDEWEB)

    Nyman, Ulf; Kristiansson, Mattias [Trelleborg Hospital (Sweden); Leitz, Wolfram [Swedish Radiation Protection Authority, Stockholm (Sweden); Paahlstorp, Per-Aake [Siemens Medical Solutions, Solna (Sweden)

    2004-11-01

    When performing computed tomography examinations, the exposure factors are hardly ever adapted to the patient's size. One reason may be the lack of simple methods. This report describes the computer programme OmnimAs, which calculates how the exposure factors should be varied with the patient's perimeter (easily measured with a measuring tape). The first approximation is to calculate exposure values giving the same noise level in the image irrespective of the patient's size. A clinical evaluation has shown that this relationship has to be modified. One chapter describes the physical background behind the programme. Results calculated with OmnimAs are in good agreement with a number of published studies. Clinical experience demonstrates the usability of OmnimAs. Finally, the correlation between several parameters and image quality/dose is discussed, along with how this correlation can be used to optimise CT examinations.
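
    The abstract does not reproduce the programme's formula. As a hypothetical illustration of the first approximation (constant image noise): X-ray attenuation grows exponentially with tissue thickness, so the required tube current-time product rises roughly exponentially with perimeter. All names and parameter values below are invented, not those of OmnimAs:

```python
def mas_for_constant_noise(perimeter_cm, ref_perimeter_cm=90.0,
                           ref_mas=100.0, doubling_cm=8.0):
    """Hypothetical first-approximation rule: keep image noise constant by
    doubling the mAs for every `doubling_cm` of extra patient perimeter.
    All parameter values are illustrative."""
    return ref_mas * 2.0 ** ((perimeter_cm - ref_perimeter_cm) / doubling_cm)
```

    The clinical modification mentioned in the report would then flatten this curve, since radiologists tolerate somewhat more noise in larger patients.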

  16. Adaptive control in multi-threaded iterated integration

    International Nuclear Information System (INIS)

    Doncker, Elise de; Yuasa, Fukuko

    2013-01-01

    In recent years we have developed a technique for the direct computation of Feynman loop-integrals, which are notorious for the occurrence of integrand singularities. Especially for handling singularities in the interior of the domain, we approximate the iterated integral using an adaptive algorithm in the coordinate directions. We present a novel multi-core parallelization scheme for adaptive multivariate integration, by assigning threads to the rule evaluations in the outer dimensions of the iterated integral. The method ensures a large parallel granularity as each function evaluation by itself comprises an integral over the lower dimensions, while the application of the threads is governed by the adaptive control in the outer level. We give computational results for a test set of 3- to 6-dimensional integrals, where several problems exhibit a loop integral behavior.
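
    The parallelization scheme (threads assigned to the rule evaluations of the outer dimension, each of which is itself an inner integral) can be sketched as follows; the quadrature rule and granularity are illustrative, not those of the authors' code:

```python
from concurrent.futures import ThreadPoolExecutor

def simpson(f, a, b, n=200):
    """Composite Simpson rule (n must be even)."""
    h = (b - a) / n
    s = f(a) + f(b)
    s += sum(4 * f(a + i * h) for i in range(1, n, 2))
    s += sum(2 * f(a + i * h) for i in range(2, n, 2))
    return s * h / 3

def iterated_integral(f, ax, bx, ay, by, threads=4):
    """Outer-rule nodes are farmed out to threads; each evaluation of the
    outer integrand is itself an inner 1-D integral, giving the large
    parallel granularity described above."""
    def outer_integrand(x):
        return simpson(lambda y: f(x, y), ay, by)
    n, h = 200, (bx - ax) / 200
    xs = [ax + i * h for i in range(n + 1)]
    with ThreadPoolExecutor(max_workers=threads) as pool:
        vals = list(pool.map(outer_integrand, xs))
    w = [1] + [4 if i % 2 else 2 for i in range(1, n)] + [1]
    return sum(wi * vi for wi, vi in zip(w, vals)) * h / 3

# Integral of x*y over the unit square = 1/4
result = iterated_integral(lambda x, y: x * y, 0.0, 1.0, 0.0, 1.0)
```

    With CPython's GIL, a pure-Python integrand gains little wall-clock speedup; the sketch shows the structure (outer nodes mapped to threads), which pays off when the inner evaluation runs in native code, as in the authors' setting.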

  17. Adaptations to Short, Frequent Sessions of Endurance and Strength Training Are Similar to Longer, Less Frequent Exercise Sessions When the Total Volume Is the Same.

    Science.gov (United States)

    Kilen, Anders; Hjelvang, Line B; Dall, Niels; Kruse, Nanna L; Nordsborg, Nikolai B

    2015-11-01

    The hypothesis that the distribution of weekly training across several short sessions, as opposed to fewer longer sessions, enhances maximal strength gain without compromising maximal oxygen uptake was evaluated. Twenty-nine subjects completed an 8-week controlled parallel-group training intervention. One group ("micro training" [MI]: n = 21) performed nine 15-minute training sessions weekly, whereas a second group ("classical training" [CL]: n = 8) completed exactly the same training on a weekly basis but as three 45-minute sessions. For each group, each session comprised exclusively strength, high-intensity cardiovascular training or muscle endurance training. Both groups increased shuttle run performance (MI: 1,373 ± 133 m vs. 1,498 ± 126 m, p ≤ 0.05; CL: 1,074 ± 213 m vs. 1,451 ± 202 m, p ≤ 0.05) after the training intervention. In conclusion, similar training adaptations can be obtained with short, frequent exercise sessions or longer, less frequent sessions where the total volume of weekly training performed is the same.

  18. Sleep facilitates long-term face adaptation.

    Science.gov (United States)

    Ditye, Thomas; Javadi, Amir Homayoun; Carbon, Claus-Christian; Walsh, Vincent

    2013-10-22

    Adaptation is an automatic neural mechanism supporting the optimization of visual processing on the basis of previous experiences. While the short-term effects of adaptation on behaviour and physiology have been studied extensively, perceptual long-term changes associated with adaptation are still poorly understood. Here, we show that the integration of adaptation-dependent long-term shifts in neural function is facilitated by sleep. Perceptual shifts induced by adaptation to a distorted image of a famous person were larger in a group of participants who had slept (experiment 1) or merely napped for 90 min (experiment 2) during the interval between adaptation and test compared with controls who stayed awake. Participants' individual rapid eye movement sleep duration predicted the size of post-sleep behavioural adaptation effects. Our data suggest that sleep prevented decay of adaptation in a way that is qualitatively different from the effects of reduced visual interference known as 'storage'. In the light of the well-established link between sleep and memory consolidation, our findings link the perceptual mechanisms of sensory adaptation (which are usually not considered to play a relevant role in mnemonic processes) with learning and memory, and at the same time reveal a new function of sleep in cognition.

  19. Adaptive game AI with dynamic scripting

    NARCIS (Netherlands)

    Spronck, P.; Ponsen, M.J.V.; Sprinkhuizen-Kuyper, I.G.; Postma, E.O.

    2006-01-01

    Online learning in commercial computer games allows computer-controlled opponents to adapt to the way the game is being played. As such it provides a mechanism to deal with weaknesses in the game AI, and to respond to changes in human player tactics. We argue that online learning of game AI should
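
    Dynamic scripting, as described in this line of work, maintains a weighted rulebase: a script is drawn with probability proportional to rule weight, and after each encounter the weights of the script's rules are adjusted, with a compensating redistribution over the unused rules. A toy sketch (rule names, constants and the win condition are illustrative):

```python
import random

def select_script(weights, k, rng):
    """Draw k distinct rules, each with probability proportional to weight."""
    pool, script = list(weights), []
    for _ in range(k):
        total = sum(weights[r] for r in pool)
        pick, acc = rng.uniform(0, total), 0.0
        for r in pool:
            acc += weights[r]
            if acc >= pick:
                script.append(r)
                pool.remove(r)
                break
    return script

def update_weights(weights, script, won, delta=10.0, w_min=5.0, w_max=200.0):
    """Reward the rules of a winning script, punish those of a losing one,
    compensating on the unused rules (all weights clipped to [w_min, w_max])."""
    adj = delta if won else -delta
    unused = [r for r in weights if r not in script]
    for r in script:
        weights[r] = min(w_max, max(w_min, weights[r] + adj))
    if unused:
        comp = -adj * len(script) / len(unused)
        for r in unused:
            weights[r] = min(w_max, max(w_min, weights[r] + comp))

# Toy learning loop: pretend scripts containing rule "r0" always win.
rng = random.Random(0)
weights = {f"r{i}": 50.0 for i in range(6)}
for _ in range(50):
    script = select_script(weights, 3, rng)
    update_weights(weights, script, won=("r0" in script))
```

    Over repeated encounters the effective rules accumulate weight and dominate future scripts, which is how the game AI adapts online to the player's tactics.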

  20. Adaptive computations of flow around a delta wing with vortex breakdown

    Science.gov (United States)

    Modiano, David L.; Murman, Earll M.

    1993-01-01

    An adaptive unstructured mesh solution method for the three-dimensional Euler equations was used to simulate the flow around a sharp edged delta wing. Emphasis was on the breakdown of the leading edge vortex at high angle of attack. Large values of entropy, which indicate vortical regions of the flow, specified the region in which adaptation was performed. The aerodynamic normal force coefficients show excellent agreement with wind tunnel data measured by Jarrah, and demonstrate the importance of adaptation in obtaining an accurate solution. The pitching moment coefficient and the location of vortex breakdown are compared with experimental data measured by Hummel and Srinivasan, showing good agreement in cases in which vortex breakdown is located over the wing.

  1. Contrast adaptation in the Limulus lateral eye

    OpenAIRE

    Valtcheva, Tchoudomira M.; Passaglia, Christopher L.

    2015-01-01

    Luminance and contrast adaptation are neuronal mechanisms employed by the visual system to adjust our sensitivity to light. They are mediated by an assortment of cellular and network processes distributed across the retina and visual cortex. Both have been demonstrated in the eyes of many vertebrates, but only luminance adaptation has been shown in invertebrate eyes to date. Since the computational benefits of contrast adaptation should apply to all visual systems, we investigated whether thi...

  2. Exploring adaptive program behavior

    DEFF Research Database (Denmark)

    Bonnichsen, Lars Frydendal; Probst, Christian W.

    Modern computer systems are increasingly complex, with ever-changing bottlenecks. This makes it difficult to ensure consistent performance when porting software, or even when running it. Adaptivity, i.e., switching between program variations, and dynamic recompilation have been suggested as solutions. ... Both solutions come at a cost: adaptivity incurs a runtime overhead and requires more design effort, while dynamic recompilation takes time to perform. In this project, we plan to investigate the possibilities, limitations, and benefits of these techniques. This abstract covers our thoughts on how

  3. Computationally efficient video restoration for Nyquist sampled imaging sensors combining an affine-motion-based temporal Kalman filter and adaptive Wiener filter.

    Science.gov (United States)

    Rucci, Michael; Hardie, Russell C; Barnard, Kenneth J

    2014-05-01

    In this paper, we present a computationally efficient video restoration algorithm to address both blur and noise for a Nyquist sampled imaging system. The proposed method utilizes a temporal Kalman filter followed by a correlation-model based spatial adaptive Wiener filter (AWF). The Kalman filter employs an affine background motion model and novel process-noise variance estimate. We also propose and demonstrate a new multidelay temporal Kalman filter designed to more robustly treat local motion. The AWF is a spatial operation that performs deconvolution and adapts to the spatially varying residual noise left in the Kalman filter stage. In image areas where the temporal Kalman filter is able to provide significant noise reduction, the AWF can be aggressive in its deconvolution. In other areas, where less noise reduction is achieved with the Kalman filter, the AWF balances the deconvolution with spatial noise reduction. In this way, the Kalman filter and AWF work together effectively, but without the computational burden of full joint spatiotemporal processing. We also propose a novel hybrid system that combines a temporal Kalman filter and BM3D processing. To illustrate the efficacy of the proposed methods, we test the algorithms on both simulated imagery and video collected with a visible camera.
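
    A heavily simplified sketch of the temporal stage only: a per-pixel scalar Kalman filter with a static-scene model (no affine motion model or process-noise estimate from the paper, and illustrative parameters). The posterior variance map it returns is the quantity a following adaptive Wiener stage could use to decide how aggressively to deconvolve each region:

```python
import numpy as np

def temporal_kalman(frames, q=1e-4, r=1e-2):
    """Per-pixel scalar Kalman filter over time (random-walk scene model).
    Returns the filtered frame and the posterior variance map P."""
    est = frames[0].astype(float)
    P = np.full(est.shape, r)
    for z in frames[1:]:
        P = P + q                      # predict: scene assumed static plus drift q
        K = P / (P + r)                # Kalman gain
        est = est + K * (z - est)      # update toward the new frame
        P = (1 - K) * P
    return est, P

# Synthetic static scene corrupted by additive noise on each frame.
rng = np.random.default_rng(0)
clean = np.tile(np.linspace(0.0, 1.0, 32), (32, 1))
frames = [clean + 0.1 * rng.standard_normal(clean.shape) for _ in range(40)]
est, P = temporal_kalman(frames, q=0.0, r=0.01)
```

    With q = 0 this degenerates to recursive frame averaging; where local motion prevents such averaging (the multidelay variant in the paper), P stays high and the spatial stage must carry more of the noise reduction.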

  4. Adaptive Wireless Multimedia Services

    OpenAIRE

    Yi, Xiaokun

    2006-01-01

    Context-awareness is a hot topic in mobile computing currently. A lot of importance is being attached to facilitating the user of various mobile computing devices to provide services that are more “user-centric”. One aspect of context-awareness is to perceive variations in available resources, and to make decisions based on the feedback to enable applications to automatically adapt to the current environment. For Voice over IP (VoIP) software phones (softphones), variations in network perform...

  5. Design and development of a computer-based continuous monitor for the determination of the short-lived decay products of radon and thoron

    Energy Technology Data Exchange (ETDEWEB)

    Bigu, J [Department of Energy, Mines and Resources, Elliot Lake, Ontario (Canada). Elliot Lake Lab.; Raz, R; Golden, K; Dominguez, P [Alpha-NUCLEAR, Toronto, Ontario (Canada)

    1984-08-15

    A portable, rugged monitor has been designed and built for measuring the short-lived decay products of radon and thoron. The monitor is computer-based and employs a continuous filter strip which can be advanced at programmable time intervals to allow unattended continuous operation with automatic sampling, analysis and recording of radiation levels. Radionuclide analysis is carried out by two silicon diffused-junction alpha-detectors and electronic circuitry with multichannel spectral analysis capabilities. Standard gross α-count methods and α-spectroscopy methods can easily be implemented. The built-in computer performs a variety of operations via a specially designed interface module, including control and data recording functions, and computations, program storage and display functions. Programs and data are stored on the built-in cassette tape drive, and the computer's integrated CRT display and keyboard allow simple, prompted menu-type operation of standard software. Graphical presentations of α-spectra can be shown on the computer CRT and printed when required on the computer's built-in thermal printer. In addition to implementing the specially developed radionuclide analysis software, the operator can interact with and modify existing software, and program new software, through BASIC language programming, or employ the computer in a totally unrelated, general-purpose mode. Although the monitor is ideally suited for environmental radon (thoron) daughter monitoring, it could also be used in the determination of other airborne radionuclides provided adequate analytical procedures are developed or included in the already existing computer software.

  6. Design and development of a computer-based continuous monitor for the determination of the short-lived decay products of radon and thoron

    International Nuclear Information System (INIS)

    Bigu, J.

    1984-01-01

    A portable, rugged monitor has been designed and built for measuring the short-lived decay products of radon and thoron. The monitor is computer-based and employs a continuous filter strip which can be advanced at programmable time intervals to allow unattended continuous operation with automatic sampling, analysis and recording of radiation levels. Radionuclide analysis is carried out by two silicon diffused-junction alpha-detectors and electronic circuitry with multichannel spectral analysis capabilities. Standard gross α-count methods and α-spectroscopy methods can easily be implemented. The built-in computer performs a variety of operations via a specially designed interface module, including control and data recording functions, and computations, program storage and display functions. Programs and data are stored on the built-in cassette tape drive, and the computer's integrated CRT display and keyboard allow simple, prompted menu-type operation of standard software. Graphical presentations of α-spectra can be shown on the computer CRT and printed when required on the computer's built-in thermal printer. In addition to implementing the specially developed radionuclide analysis software, the operator can interact with and modify existing software, and program new software, through BASIC language programming, or employ the computer in a totally unrelated, general-purpose mode. Although the monitor is ideally suited for environmental radon (thoron) daughter monitoring, it could also be used in the determination of other airborne radionuclides provided adequate analytical procedures are developed or included in the already existing computer software. (orig.)

  7. Research on cloud computing solutions

    OpenAIRE

    Liudvikas Kaklauskas; Vaida Zdanytė

    2015-01-01

    Cloud computing can be defined as a new style of computing in which dynamically scalable and often virtualized resources are provided as services over the Internet. Advantages of the cloud computing technology include cost savings, high availability, and easy scalability. Voas and Zhang adapted six phases of computing paradigms, from dummy terminals/mainframes, to PCs, networking computing, to grid and cloud computing. There are four types of cloud computing: public cloud, private cloud, ...

  8. Adaptive Core Simulation Employing Discrete Inverse Theory - Part I: Theory

    International Nuclear Information System (INIS)

    Abdel-Khalik, Hany S.; Turinsky, Paul J.

    2005-01-01

    Use of adaptive simulation is intended to improve the fidelity and robustness of important core attribute predictions such as core power distribution, thermal margins, and core reactivity. Adaptive simulation utilizes a selected set of past and current reactor measurements of reactor observables, i.e., in-core instrumentation readings, to adapt the simulation in a meaningful way. A meaningful adaption will result in high-fidelity and robust adapted core simulator models. To perform adaption, we propose an inverse theory approach in which the multitudes of input data to core simulators, i.e., reactor physics and thermal-hydraulic data, are to be adjusted to improve agreement with measured observables while keeping core simulator models unadapted. At first glance, devising such adaption for typical core simulators with millions of input and observables data would spawn not only several prohibitive challenges but also numerous disparaging concerns. The challenges include the computational burdens of the sensitivity-type calculations required to construct Jacobian operators for the core simulator models. Also, the computational burdens of the uncertainty-type calculations required to estimate the uncertainty information of core simulator input data present a demanding challenge. The concerns however are mainly related to the reliability of the adjusted input data. The methodologies of adaptive simulation are well established in the literature of data adjustment. We adopt the same general framework for data adjustment; however, we refrain from solving the fundamental adjustment equations in a conventional manner. We demonstrate the use of our so-called Efficient Subspace Methods (ESMs) to overcome the computational and storage burdens associated with the core adaption problem. We illustrate the successful use of ESM-based adaptive techniques for a typical boiling water reactor core simulator adaption problem

  9. Numerical Computation of Underground Inundation in Multiple Layers Using the Adaptive Transfer Method

    Directory of Open Access Journals (Sweden)

    Hyung-Jun Kim

    2018-01-01

    Extreme rainfall causes surface runoff to flow towards lowlands and subterranean facilities, such as subway stations and buildings with underground spaces in densely packed urban areas. These facilities and areas are therefore vulnerable to catastrophic submergence. However, flood modeling of underground space has not yet been adequately studied because there are difficulties in reproducing the associated multiple horizontal layers connected with staircases or elevators. This study proposes a convenient approach to simulate underground inundation when two layers are connected. The main facet of this approach is to compute the flow flux passing through staircases in an upper layer and to transfer the equivalent quantity to a lower layer. This is defined as the ‘adaptive transfer method’. This method overcomes the limitations of 2D modeling by introducing layers connecting concepts to prevent large variations in mesh sizes caused by complicated underlying obstacles or local details. Consequently, this study aims to contribute to the numerical analysis of flow in inundated underground spaces with multiple floors.
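
    The core transfer step (compute a staircase flux in the upper layer, hand the equivalent volume to the lower layer) might be sketched with a weir-type closure; the paper's actual flux formula is not given in the abstract, so the discharge law, coefficient and numbers below are assumptions:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def staircase_flux(h_upper, width, cd=0.6):
    """Hypothetical free-overfall (weir-type) discharge through a staircase
    opening: Q = (2/3)*Cd*sqrt(2g)*B*h^(3/2), in m^3/s."""
    return (2.0 / 3.0) * cd * math.sqrt(2.0 * G) * width * h_upper ** 1.5

def transfer(upper_vol, lower_vol, h_upper, width, dt):
    """One time step of the transfer: the upper layer loses Q*dt of volume
    and the lower layer gains it, never exceeding what the upper layer holds."""
    dv = min(staircase_flux(h_upper, width) * dt, upper_vol)
    return upper_vol - dv, lower_vol + dv

# One step with 0.2 m of water over a 1.2 m wide staircase (illustrative).
up, low = transfer(100.0, 0.0, h_upper=0.2, width=1.2, dt=1.0)
```

    Capping the transferred volume at the upper layer's content keeps the coupling mass-conservative, which is the essential property of the layer-connecting concept.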

  10. Adaptive Gradient Multiobjective Particle Swarm Optimization.

    Science.gov (United States)

    Han, Honggui; Lu, Wei; Zhang, Lu; Qiao, Junfei

    2017-10-09

    An adaptive gradient multiobjective particle swarm optimization (AGMOPSO) algorithm, based on a multiobjective gradient (MOG) method and a self-adaptive flight parameters mechanism, is developed in this paper to improve computation performance. In this AGMOPSO algorithm, the MOG method is devised to update the archive to improve the convergence speed and the local exploitation in the evolutionary process. Meanwhile, the self-adaptive flight parameters mechanism, according to the diversity information of the particles, is established to balance the convergence and diversity of AGMOPSO. Attributed to the MOG method and the self-adaptive flight parameters mechanism, this AGMOPSO algorithm not only has faster convergence speed and higher accuracy, but its solutions also have better diversity. Additionally, the convergence is discussed to confirm the prerequisite of any successful application of AGMOPSO. Finally, with regard to computation performance, the proposed AGMOPSO algorithm is compared with some other multiobjective particle swarm optimization algorithms and two state-of-the-art multiobjective algorithms. The results demonstrate that the proposed AGMOPSO algorithm can find a better spread of solutions and converge faster to the true Pareto-optimal front.

  11. Unified Computational Intelligence for Complex Systems

    CERN Document Server

    Seiffertt, John

    2010-01-01

    Computational intelligence encompasses a wide variety of techniques that allow computation to learn, to adapt, and to seek. That is, they may be designed to learn information without explicit programming regarding the nature of the content to be retained, they may be imbued with the functionality to adapt to maintain their course within a complex and unpredictably changing environment, and they may help us seek out truths about our own dynamics and lives through their inclusion in complex system modeling. These capabilities place our ability to compute in a category apart from our ability to e

  12. Computationally Efficient Blind Code Synchronization for Asynchronous DS-CDMA Systems with Adaptive Antenna Arrays

    Directory of Open Access Journals (Sweden)

    Chia-Chang Hu

    2005-04-01

    Full Text Available A novel space-time adaptive near-far robust code-synchronization array detector for asynchronous DS-CDMA systems is developed in this paper. Its basic requirements are the same as those of the conventional matched filter of an asynchronous DS-CDMA system. For real-time applicability, a computationally efficient architecture of the proposed detector is developed, based on the concept of the multistage Wiener filter (MWF) of Goldstein and Reed. This multistage technique results in a self-synchronizing detection criterion that requires no inversion or eigendecomposition of a covariance matrix. As a consequence, this detector achieves a complexity that is only a linear function of the size of the antenna array (J), the rank of the MWF (M), the system processing gain (N), and the number of samples in a chip interval (S), that is, 𝒪(JMNS). The complexity of the equivalent detector based on the minimum mean-squared error (MMSE) criterion or on subspace-based eigenstructure analysis is 𝒪((JNS)³). Moreover, this multistage scheme provides rapid adaptive convergence under limited observation-data support. Simulations are conducted to evaluate the performance and convergence behavior of the proposed detector with the size of the J-element antenna array, the amount of the L-sample support, and the rank of the M-stage MWF. The performance advantage of the proposed detector over other DS-CDMA detectors is investigated as well.
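    The gap between a linear-complexity detector and a cubic-complexity one is easy to illustrate numerically. The parameter values below are arbitrary illustrations, not taken from the paper's simulations.

    ```python
    # Order-of-magnitude comparison of the two complexity estimates quoted
    # above: O(J*M*N*S) for the MWF-based detector versus O((J*N*S)**3) for
    # an MMSE/eigendecomposition-based detector.
    J, M, N, S = 4, 8, 31, 2   # antennas, MWF rank, processing gain, samples/chip

    mwf_ops = J * M * N * S          # linear in the problem dimensions
    mmse_ops = (J * N * S) ** 3      # cubic in the joint dimension J*N*S

    ratio = mmse_ops / mwf_ops       # how many times cheaper the MWF detector is
    ```

    Even for this modest configuration the cubic-complexity detector costs thousands of times more operations, which is why avoiding matrix inversion and eigendecomposition matters for real-time code synchronization.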

  13. Final Report: Symposium on Adaptive Methods for Partial Differential Equations

    Energy Technology Data Exchange (ETDEWEB)

    Pernice, M.; Johnson, C.R.; Smith, P.J.; Fogelson, A.

    1998-12-10

    OAK-B135 Final Report: Symposium on Adaptive Methods for Partial Differential Equations. Complex physical phenomena often include features that span a wide range of spatial and temporal scales. Accurate simulation of such phenomena can be difficult to obtain, and computations that are under-resolved can even exhibit spurious features. While it is possible to resolve small scale features by increasing the number of grid points, global grid refinement can quickly lead to problems that are intractable, even on the largest available computing facilities. These constraints are particularly severe for three dimensional problems that involve complex physics. One way to achieve the needed resolution is to refine the computational mesh locally, in only those regions where enhanced resolution is required. Adaptive solution methods concentrate computational effort in regions where it is most needed. These methods have been successfully applied to a wide variety of problems in computational science and engineering. Adaptive methods can be difficult to implement, prompting the development of tools and environments to facilitate their use. To ensure that the results of their efforts are useful, algorithm and tool developers must maintain close communication with application specialists. Conversely, it remains difficult for application specialists who are unfamiliar with the methods to evaluate the trade-offs between the benefits of enhanced local resolution and the effort needed to implement an adaptive solution method.
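    The local-refinement idea described above can be reduced to a toy illustration: flag for refinement only the cells where an error indicator exceeds a threshold, rather than refining the whole grid. The indicator (local solution jump) and threshold below are illustrative choices.

    ```python
    import numpy as np

    # 1-D solution with a sharp internal layer at x = 0.5
    x = np.linspace(0.0, 1.0, 41)
    u = np.tanh(50.0 * (x - 0.5))

    jump = np.abs(np.diff(u))              # error indicator per cell
    flag = jump > 0.1 * jump.max()         # refine only where the jump is large

    refined_cells = int(flag.sum())
    total_cells = len(jump)
    ```

    Only a handful of cells near the layer get flagged, which is exactly the payoff of adaptivity: resolution is added where the solution varies rapidly, while the smooth regions keep the coarse grid.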

  14. Changes in Jump-Down Performance After Space Flight: Short- and Long-Term Adaptation

    Science.gov (United States)

    Kofman, I. S.; Reschke, M. F.; Cerisano, J. M.; Fisher, E. A.; Lawrence, E. L.; Peters, B. T.; Bloomberg, J. J.

    2010-01-01

    INTRODUCTION Successful jump performance requires functional coordination of visual, vestibular, and somatosensory systems, which are affected by prolonged exposure to microgravity. Astronauts returning from space flight exhibit impaired ability to coordinate effective landing strategies when jumping from a platform to the ground. This study compares the jump strategies used by astronauts before and after flight, the changes to those strategies within a test session, and the recoveries in jump-down performance parameters across several postflight test sessions. These data were obtained as part of an ongoing interdisciplinary study (Functional Task Test, FTT) designed to evaluate both astronaut postflight functional performance and related physiological changes. METHODS Six astronauts from short-duration (Shuttle) and three from long-duration (International Space Station) flights performed 3 two-footed jumps from a platform 30 cm high. A force plate measured the ground reaction forces and center-of-pressure displacement from the landings. Muscle activation data were collected from the medial gastrocnemius and anterior tibialis of both legs using surface electromyography electrodes. Two load cells in the platform measured the load exerted by each foot during the takeoff phase of the jump. Data were collected in 2 preflight sessions, on landing day (Shuttle only), and 1, 6, and 30 days after flight. RESULTS AND CONCLUSION Many of the astronauts tested were unable to maintain balance on their first postflight jump landing but recovered by the third jump, showing a learning progression in which the performance improvement could be attributed to adjustments of strategy on takeoff, landing, or both. Takeoff strategy changes were evident in air time (time between takeoff and landing), which was significantly reduced after flight, and also in increased asymmetry in foot latencies on takeoff. Landing modifications were seen in changes in ground reaction force curves. The

  15. Parallel adaptive simulations on unstructured meshes

    International Nuclear Information System (INIS)

    Shephard, M S; Jansen, K E; Sahni, O; Diachin, L A

    2007-01-01

    This paper discusses methods being developed by the ITAPS center to support the execution of parallel adaptive simulations on unstructured meshes. The paper first outlines the ITAPS approach to the development of interoperable mesh, geometry and field services to support the needs of SciDAC applications in these areas. The paper then demonstrates the ability of unstructured adaptive meshing methods built on such interoperable services to effectively solve important physics problems. Attention is then focused on ITAPS' developing ability to solve adaptive unstructured mesh problems on massively parallel computers.

  16. Near-Body Grid Adaption for Overset Grids

    Science.gov (United States)

    Buning, Pieter G.; Pulliam, Thomas H.

    2016-01-01

    A solution adaption capability for curvilinear near-body grids has been implemented in the OVERFLOW overset grid computational fluid dynamics code. The approach follows closely that used for the Cartesian off-body grids, but inserts refined grids in the computational space of original near-body grids. Refined curvilinear grids are generated using parametric cubic interpolation, with one-sided biasing based on curvature and stretching ratio of the original grid. Sensor functions, grid marking, and solution interpolation tasks are implemented in the same fashion as for off-body grids. A goal-oriented procedure, based on largest error first, is included for controlling growth rate and maximum size of the adapted grid system. The adaption process is almost entirely parallelized using MPI, resulting in a capability suitable for viscous, moving body simulations. Two- and three-dimensional examples are presented.

  17. Parallel 3D Mortar Element Method for Adaptive Nonconforming Meshes

    Science.gov (United States)

    Feng, Huiyu; Mavriplis, Catherine; VanderWijngaart, Rob; Biswas, Rupak

    2004-01-01

    High order methods are frequently used in computational simulation for their high accuracy. An efficient way to avoid unnecessary computation in smooth regions of the solution is to use adaptive meshes which employ fine grids only in areas where they are needed. Nonconforming spectral elements allow the grid to be flexibly adjusted to satisfy the computational accuracy requirements. The method is suitable for computational simulations of unsteady problems with very disparate length scales or unsteady moving features, such as heat transfer, fluid dynamics or flame combustion. In this work, we select the Mortar Element Method (MEM) to handle the non-conforming interfaces between elements. A new technique is introduced to efficiently implement MEM in 3-D nonconforming meshes. By introducing an "intermediate mortar", the proposed method decomposes the projection between 3-D elements and mortars into two steps. In each step, projection matrices derived in 2-D are used. The two-step method avoids explicitly forming/deriving large projection matrices for 3-D meshes, and also helps to simplify the implementation. This new technique can be used for both h- and p-type adaptation. This method is applied to an unsteady 3-D moving heat source problem. With our new MEM implementation, mesh adaptation is able to efficiently refine the grid near the heat source and coarsen the grid once the heat source passes. The savings in computational work resulting from the dynamic mesh adaptation are demonstrated by the reduction of the number of elements used and CPU time spent. MEM and mesh adaptation, respectively, bring irregularity and dynamics to the computer memory access pattern. Hence, they provide a good way to gauge the performance of computer systems when running scientific applications whose memory access patterns are irregular and unpredictable. We select a 3-D moving heat source problem as the Unstructured Adaptive (UA) grid benchmark, a new component of the NAS Parallel
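    The benefit of a two-step projection can be illustrated algebraically with the Kronecker-product identity (A ⊗ B) vec(X) = vec(B X Aᵀ): applying two small direction-wise matrices in sequence gives the same result as one large tensor-product matrix, which therefore never needs to be formed. This is my own illustration of the "decompose the projection into two steps" idea, not the paper's actual mortar matrices.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    A = rng.normal(size=(3, 5))    # projection in one parametric direction
    B = rng.normal(size=(4, 6))    # projection in the other direction
    X = rng.normal(size=(6, 5))    # nodal values on an element face

    # One-step: explicit large matrix (what the two-step method avoids)
    big = np.kron(A, B)                               # shape (12, 30)
    y_onestep = big @ X.reshape(-1, order="F")        # column-major vec(X)

    # Two-step: two small dense products, no large matrix formed
    y_twostep = (B @ X @ A.T).reshape(-1, order="F")  # vec(B X A^T)
    ```

    Beyond saving memory, the two-step form replaces one large matrix-vector product with two small matrix-matrix products, which is also cheaper in floating-point operations for high polynomial orders.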

  18. Implementation and adaption of the Computer Code ECOSYS/EXCEL for Austria as OECOSYS/EXCEL

    International Nuclear Information System (INIS)

    Hick, H.; Suda, M.; Mueck, K.

    1998-03-01

    During 1989, under contract to the Austrian Chamber of the Federal Chancellor, department VII, the radioecological forecast model OECOSYS was implemented by the Austrian Research Centre Seibersdorf on a VAX computer using VAX Fortran. OECOSYS allows the prediction of the consequences after a large-scale contamination event. During 1992, under contract to the Austrian Federal Ministry of Health, Sports and Consumer Protection, department III, OECOSYS (in the version of 1989) was implemented on PCs in Seibersdorf and the Ministry using OS/2 and Microsoft Fortran. In March 1993, the Ministry ordered an update which had become necessary and the evaluation of two exercise scenarios. Since that time the prognosis model with its auxiliary program and communication facilities has been kept on stand-by, and yearly exercises are performed to maintain its readiness. The current report describes the implementation and adaptation to Austrian conditions of the newly available EXCEL version of the German ECOSYS prognosis model as OECOSYS. (author)

  19. A Framework for the Development of Context-Adaptable User Interfaces for Ubiquitous Computing Systems

    Science.gov (United States)

    Varela, Gervasio; Paz-Lopez, Alejandro; Becerra, Jose A.; Duro, Richard

    2016-01-01

    This paper addresses the problem of developing user interfaces for Ubiquitous Computing (UC) and Ambient Intelligence (AmI) systems. These kinds of systems are expected to provide a natural user experience, considering interaction modalities adapted to the user's abilities and preferences and using whatever interaction devices are present in the environment. These interaction devices are not necessarily known at design time. The task is quite complicated due to the variety of devices and technologies and the diversity of scenarios, and it usually burdens the developer with the need to create many different UIs in order to cover the foreseeable user-environment combinations. Here, we propose a UI abstraction framework for UC and AmI systems that effectively improves the portability of those systems between different environments and for different users. It allows developers to design and implement a single UI capable of being deployed with different devices and modalities regardless of the physical location. PMID:27399711

  20. A Framework for the Development of Context-Adaptable User Interfaces for Ubiquitous Computing Systems

    Directory of Open Access Journals (Sweden)

    Gervasio Varela

    2016-07-01

    Full Text Available This paper addresses the problem of developing user interfaces for Ubiquitous Computing (UC) and Ambient Intelligence (AmI) systems. These kinds of systems are expected to provide a natural user experience, considering interaction modalities adapted to the user's abilities and preferences and using whatever interaction devices are present in the environment. These interaction devices are not necessarily known at design time. The task is quite complicated due to the variety of devices and technologies and the diversity of scenarios, and it usually burdens the developer with the need to create many different UIs in order to cover the foreseeable user-environment combinations. Here, we propose a UI abstraction framework for UC and AmI systems that effectively improves the portability of those systems between different environments and for different users. It allows developers to design and implement a single UI capable of being deployed with different devices and modalities regardless of the physical location.

  1. A Framework for the Development of Context-Adaptable User Interfaces for Ubiquitous Computing Systems.

    Science.gov (United States)

    Varela, Gervasio; Paz-Lopez, Alejandro; Becerra, Jose A; Duro, Richard

    2016-07-07

    This paper addresses the problem of developing user interfaces for Ubiquitous Computing (UC) and Ambient Intelligence (AmI) systems. These kinds of systems are expected to provide a natural user experience, considering interaction modalities adapted to the user's abilities and preferences and using whatever interaction devices are present in the environment. These interaction devices are not necessarily known at design time. The task is quite complicated due to the variety of devices and technologies and the diversity of scenarios, and it usually burdens the developer with the need to create many different UIs in order to cover the foreseeable user-environment combinations. Here, we propose a UI abstraction framework for UC and AmI systems that effectively improves the portability of those systems between different environments and for different users. It allows developers to design and implement a single UI capable of being deployed with different devices and modalities regardless of the physical location.

  2. Pipelining Computational Stages of the Tomographic Reconstructor for Multi-Object Adaptive Optics on a Multi-GPU System

    KAUST Repository

    Charara, Ali

    2014-11-01

    The European Extremely Large Telescope project (E-ELT) is one of Europe's highest priorities in ground-based astronomy. ELTs are built on top of a variety of highly sensitive and critical astronomical instruments. In particular, a new instrument called MOSAIC has been proposed to perform multi-object spectroscopy using the Multi-Object Adaptive Optics (MOAO) technique. The core implementation of the simulation lies in the intensive computation of a tomographic reconstructor (TR), which is used to drive the deformable mirror in real time from the measurements. A new numerical algorithm is proposed (1) to capture the actual experimental noise and (2) to substantially speed up previous implementations by exposing more concurrency, while reducing the number of floating-point operations. Based on the Matrices Over Runtime System at Exascale numerical library (MORSE), a dynamic scheduler drives all computational stages of the tomographic reconstructor simulation and makes it possible to pipeline and run tasks out-of-order across different stages on heterogeneous systems, while ensuring data coherency and dependencies. The proposed TR simulation asymptotically outperforms previous state-of-the-art implementations, with up to a 13-fold speedup. At more than 50000 unknowns, this appears to be the largest-scale AO problem submitted to computation to date, and opens new research directions for extreme scale AO simulations. © 2014 IEEE.

  3. Computer performance evaluation of FACOM 230-75 computer system, (2)

    International Nuclear Information System (INIS)

    Fujii, Minoru; Asai, Kiyoshi

    1980-08-01

    In this report, computer performance evaluations of the FACOM 230-75 computers at JAERI are described. The evaluations cover the following items: (1) cost/benefit analysis of timesharing terminals, (2) analysis of the response time of timesharing terminals, (3) analysis of throughput time for batch job processing, (4) estimation of current potential demands for computer time, and (5) determination of the appropriate number of card readers and line printers. These evaluations are done mainly from the standpoint of reducing the cost of computing facilities. The techniques adopted are very practical ones. This report will be useful for those concerned with the management of a computing installation. (author)

  4. Time-adaptive quantile regression

    DEFF Research Database (Denmark)

    Møller, Jan Kloppenborg; Nielsen, Henrik Aalborg; Madsen, Henrik

    2008-01-01

    An algorithm for time-adaptive quantile regression is presented. The algorithm is based on the simplex algorithm, and the linear optimization formulation of the quantile regression problem is given. The observations have been split to allow a direct use of the simplex algorithm. The simplex method and an updating procedure are combined into a new algorithm for time-adaptive quantile regression, which generates new solutions on the basis of the old solution, leading to savings in computation time. The suggested algorithm is tested against a static quantile regression model on a data set with wind power production, where the models combine splines and quantile regression. The comparison indicates superior performance for the time-adaptive quantile regression in all the performance parameters considered.
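    The building block behind any quantile regression algorithm is the check (pinball) loss, whose minimizer is the τ-th quantile. A minimal illustration fits a constant quantile estimate by full-batch subgradient descent; the simplex formulation and updating machinery of the paper are not reproduced here.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    y = rng.normal(loc=5.0, scale=2.0, size=5000)
    tau = 0.9

    q = 0.0
    lr = 0.05
    for _ in range(4000):
        # subgradient of the mean pinball loss w.r.t. q:
        # equals F_empirical(q) - tau, so the fixed point is the tau-quantile
        grad = np.mean(np.where(y > q, -tau, 1.0 - tau))
        q -= lr * grad

    empirical = np.quantile(y, tau)
    ```

    The estimate converges to (approximately) the empirical τ-quantile, which is the property the linear-programming formulation exploits exactly via the simplex method.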

  5. Short Pulse Laser Applications Design

    International Nuclear Information System (INIS)

    Town, R.J.; Clark, D.S.; Kemp, A.J.; Lasinski, B.F.; Tabak, M.

    2008-01-01

    We are applying our recently developed, LDRD-funded computational simulation tool to optimize and develop applications of Fast Ignition (FI) for stockpile stewardship. This report summarizes the work performed during a one-year exploratory research LDRD to develop FI point designs for the National Ignition Facility (NIF). These results were sufficiently encouraging to successfully propose a strategic initiative LDRD to design and perform the definitive FI experiment on the NIF. Ignition experiments on the NIF will begin in 2010 using the central hot spot (CHS) approach, which relies on the simultaneous compression and ignition of a spherical fuel capsule. Unlike this approach, the fast ignition (FI) method separates fuel compression from the ignition phase. In the compression phase, a laser such as NIF is used to implode a shell either directly, or by x rays generated from the hohlraum wall, to form a compact dense (∼300 g/cm³) fuel mass with an areal density of ∼3.0 g/cm². Igniting such a fuel assembly requires depositing ∼20 kJ into a ∼35 µm spot in a time short compared to the fuel disassembly time (∼20 ps). This energy is delivered during the ignition phase by relativistic electrons generated by the interaction of an ultra-short high-intensity laser. The main advantages of FI over the CHS approach are higher gain, a lower ignition threshold, and a relaxation of the stringent symmetry requirements of the CHS approach. There is worldwide interest in FI and its associated science. Major experimental facilities are being constructed that will enable 'proof of principle' tests of FI in integrated subignition experiments, most notably the OMEGA-EP facility at the University of Rochester's Laboratory for Laser Energetics and the FIREX facility at Osaka University in Japan. Also, scientists in the European Union have recently proposed the construction of a new FI facility, called HiPER, designed to

  6. Contrast adaptation in the Limulus lateral eye.

    Science.gov (United States)

    Valtcheva, Tchoudomira M; Passaglia, Christopher L

    2015-12-01

    Luminance and contrast adaptation are neuronal mechanisms employed by the visual system to adjust our sensitivity to light. They are mediated by an assortment of cellular and network processes distributed across the retina and visual cortex. Both have been demonstrated in the eyes of many vertebrates, but only luminance adaptation has been shown in invertebrate eyes to date. Since the computational benefits of contrast adaptation should apply to all visual systems, we investigated whether this mechanism operates in horseshoe crab eyes, one of the best-understood neural networks in the animal kingdom. The spike trains of optic nerve fibers were recorded in response to light stimuli modulated randomly in time and delivered to single ommatidia or the whole eye. We found that the retina adapts to both the mean luminance and contrast of a white-noise stimulus, that luminance- and contrast-adaptive processes are largely independent, and that they originate within an ommatidium. Network interactions are not involved. A published computer model that simulates existing knowledge of the horseshoe crab eye did not show contrast adaptation, suggesting that a heretofore unknown mechanism may underlie the phenomenon. This mechanism does not appear to reside in photoreceptors because white-noise analysis of electroretinogram recordings did not show contrast adaptation. The likely site of origin is therefore the spike discharge mechanism of optic nerve fibers. The finding of contrast adaption in a retinal network as simple as the horseshoe crab eye underscores the broader importance of this image processing strategy to vision. Copyright © 2015 the American Physiological Society.

  7. Encoder-decoder optimization for brain-computer interfaces.

    Science.gov (United States)

    Merel, Josh; Pianto, Donald M; Cunningham, John P; Paninski, Liam

    2015-06-01

    Neuroprosthetic brain-computer interfaces are systems that decode neural activity into useful control signals for effectors, such as a cursor on a computer screen. It has long been recognized that both the user and decoding system can adapt to increase the accuracy of the end effector. Co-adaptation is the process whereby a user learns to control the system in conjunction with the decoder adapting to learn the user's neural patterns. We provide a mathematical framework for co-adaptation and relate co-adaptation to the joint optimization of the user's control scheme ("encoding model") and the decoding algorithm's parameters. When the assumptions of that framework are respected, co-adaptation cannot yield better performance than that obtainable by an optimal initial choice of fixed decoder, coupled with optimal user learning. For a specific case, we provide numerical methods to obtain such an optimized decoder. We demonstrate our approach in a model brain-computer interface system using an online prosthesis simulator, a simple human-in-the-loop psychophysics setup which provides a non-invasive simulation of the BCI setting. These experiments support two claims: that users can learn encoders matched to fixed, optimal decoders and that, once learned, our approach yields expected performance advantages.

  8. Encoder-decoder optimization for brain-computer interfaces.

    Directory of Open Access Journals (Sweden)

    Josh Merel

    2015-06-01

    Full Text Available Neuroprosthetic brain-computer interfaces are systems that decode neural activity into useful control signals for effectors, such as a cursor on a computer screen. It has long been recognized that both the user and decoding system can adapt to increase the accuracy of the end effector. Co-adaptation is the process whereby a user learns to control the system in conjunction with the decoder adapting to learn the user's neural patterns. We provide a mathematical framework for co-adaptation and relate co-adaptation to the joint optimization of the user's control scheme ("encoding model") and the decoding algorithm's parameters. When the assumptions of that framework are respected, co-adaptation cannot yield better performance than that obtainable by an optimal initial choice of fixed decoder, coupled with optimal user learning. For a specific case, we provide numerical methods to obtain such an optimized decoder. We demonstrate our approach in a model brain-computer interface system using an online prosthesis simulator, a simple human-in-the-loop psychophysics setup which provides a non-invasive simulation of the BCI setting. These experiments support two claims: that users can learn encoders matched to fixed, optimal decoders and that, once learned, our approach yields expected performance advantages.
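    A toy linear reduction of the encoder/decoder pairing discussed above: for a fixed linear "encoding model" mapping intended cursor velocity to neural activity, the optimal linear decoder in the least-squares sense can be fitted from training data. This is an illustrative simplification, not the paper's full co-adaptation framework; all dimensions and noise levels are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    n_trials, n_intent, n_neurons = 2000, 2, 10
    E = rng.normal(size=(n_neurons, n_intent))        # fixed encoding model
    intent = rng.normal(size=(n_trials, n_intent))    # intended cursor velocities
    neural = intent @ E.T + 0.1 * rng.normal(size=(n_trials, n_neurons))

    # least-squares decoder: D = argmin ||neural @ D - intent||^2
    D, *_ = np.linalg.lstsq(neural, intent, rcond=None)

    decoded = neural @ D
    mse = float(np.mean((decoded - intent) ** 2))
    ```

    In the paper's terms, holding such an optimal decoder fixed and letting the user adapt their encoder is as good as any co-adaptation schedule within the stated assumptions.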

  9. Event-driven Adaptation in COP

    Directory of Open Access Journals (Sweden)

    Pierpaolo Degano

    2016-06-01

    Full Text Available Context-Oriented Programming languages provide us with primitive constructs to adapt program behaviour depending on the evolution of their operational environment, namely the context. In previous work we proposed ML_CoDa, a context-oriented language with two components: a declarative constituent for programming the context and a functional one for computing. This paper describes an extension of ML_CoDa to deal with adaptation to unpredictable context changes notified by asynchronous events.
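    The essence of event-driven context adaptation can be sketched in plain Python, in the spirit of (but much simpler than) ML_CoDa: a context object holds active "layers", event handlers mutate the context, and dispatch picks the behaviour variation matching the current context. All names and events below are invented for illustration.

    ```python
    class Context:
        """Holds the set of currently active context layers."""
        def __init__(self):
            self.layers = set()

        def activate(self, layer):
            self.layers.add(layer)

        def deactivate(self, layer):
            self.layers.discard(layer)

    def greet(ctx):
        """Behaviour variations selected by the current context."""
        if "battery_low" in ctx.layers:
            return "text-only greeting"
        return "rich multimedia greeting"

    def on_event(ctx, event):
        """Event-driven adaptation: asynchronous notifications change the context."""
        if event == "BatteryLow":
            ctx.activate("battery_low")
        elif event == "BatteryOk":
            ctx.deactivate("battery_low")

    ctx = Context()
    before = greet(ctx)           # behaviour in the default context
    on_event(ctx, "BatteryLow")   # unpredictable context change arrives
    after = greet(ctx)            # behaviour adapted to the new context
    ```

    The same call site (`greet`) yields different behaviour before and after the event, which is the adaptation mechanism the declarative constituent of a COP language makes first-class.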

  10. Surface Plasmon Wave Adapter Designed with Transformation Optics

    DEFF Research Database (Denmark)

    Zhang, Jingjing; Xiao, Sanshui; Wubs, Martijn

    2011-01-01

    On the basis of transformation optics, we propose the design of a surface plasmon wave adapter which confines surface plasmon waves on non-uniform metal surfaces and enables adiabatic mode transformation of surface plasmon polaritons with very short tapers. This adapter can be simply achieved...... with homogeneous anisotropic naturally occurring materials or subwavelength grating-structured dielectric materials. Full wave simulations based on a finite-element method have been performed to validate our proposal....

  11. Effect of the Novel Polysaccharide PolyGlycopleX® on Short-Chain Fatty Acid Production in a Computer-Controlled in Vitro Model of the Human Large Intestine

    Directory of Open Access Journals (Sweden)

    Raylene A. Reimer

    2014-03-01

    Full Text Available Many of the health benefits associated with dietary fiber are attributed to its fermentation by the microbiota and the production of short-chain fatty acids (SCFA). The aim of this study was to investigate the fermentability of the functional fiber PolyGlycopleX® (PGX®) in vitro. A validated dynamic, computer-controlled in vitro system simulating the conditions in the proximal large intestine (TIM-2) was used. Sodium hydroxide (NaOH) consumption in the system was used as an indicator of fermentability, and SCFA and branched-chain fatty acid (BCFA) production was determined. NaOH consumption was significantly higher for fructooligosaccharide (FOS) than for PGX, which was higher than for cellulose (p = 0.002). At 32, 48 and 72 h, acetate and butyrate production were higher for FOS and PGX versus cellulose. Propionate production was higher for PGX than cellulose at 32, 48, 56 and 72 h and higher than FOS at 72 h (p = 0.014). Total BCFA production was lower for FOS compared to cellulose, whereas production with PGX was lower than for cellulose at 72 h. In conclusion, PGX is fermented by the colonic microbiota, which appeared to adapt to the substrate over time. The greater propionate production for PGX may explain part of the cholesterol-lowering properties of PGX seen in rodents and humans.

  12. Partially Adaptive STAP Algorithm Approaches to functional MRI

    Science.gov (United States)

    Huang, Lejian; Thompson, Elizabeth A.; Schmithorst, Vincent; Holland, Scott K.; Talavage, Thomas M.

    2010-01-01

    In this work, the architectures of three partially adaptive STAP algorithms, which reduce dimensionality and improve tractability over fully adaptive STAP when used in the construction of brain activation maps in fMRI, are introduced; one of them is explored in detail. Computer simulations incorporating actual MRI noise and human data analysis indicate that element-space partially adaptive STAP can attain performance close to that of fully adaptive STAP while significantly decreasing processing time and maximum memory requirements, and thus demonstrates potential in fMRI analysis. PMID:19272913

  13. Haloarcula hispanica CRISPR authenticates PAM of a target sequence to prime discriminative adaptation.

    Science.gov (United States)

    Li, Ming; Wang, Rui; Xiang, Hua

    2014-06-01

    The prokaryotic immune system CRISPR/Cas (Clustered Regularly Interspaced Short Palindromic Repeats/CRISPR-associated genes) adapts to foreign invaders by acquiring their short deoxyribonucleic acid (DNA) fragments as spacers, which guide subsequent interference to foreign nucleic acids based on sequence matching. The adaptation mechanism avoiding acquiring 'self' DNA fragments is poorly understood. In Haloarcula hispanica, we previously showed that CRISPR adaptation requires being primed by a pre-existing spacer partially matching the invader DNA. Here, we further demonstrate that flanking a fully-matched target sequence, a functional PAM (protospacer adjacent motif) is still required to prime adaptation. Interestingly, interference utilizes only four PAM sequences, whereas adaptation-priming tolerates as many as 23 PAM sequences. This relaxed PAM selectivity explains how adaptation-priming maximizes its tolerance of PAM mutations (that escape interference) while avoiding mis-targeting the spacer DNA within CRISPR locus. We propose that the primed adaptation, which hitches and cooperates with the interference pathway, distinguishes target from non-target by CRISPR ribonucleic acid guidance and PAM recognition. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.
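    The PAM-selectivity asymmetry reported above (4 motifs accepted for interference versus 23 tolerated for adaptation-priming) can be illustrated with a small sequence check. The motif sets and the invader sequence below are invented for illustration and are not the actual H. hispanica motifs.

    ```python
    # Hypothetical strict PAM set recognized by the interference pathway
    INTERFERENCE_PAMS = {"TTC", "TTT", "TAC", "TAT"}          # 4 motifs

    # Hypothetical relaxed PAM set tolerated by adaptation-priming
    PRIMING_PAMS = INTERFERENCE_PAMS | {
        "TTA", "TTG", "TAA", "TAG", "TCC", "TCT", "TGC", "TGT",
        "ATC", "ATT", "CTC", "CTT", "GTC", "GTT", "AAC", "AAT",
        "CAC", "CAT", "GAC",
    }  # 23 motifs in total

    def pam_of(target, protospacer_start, pam_len=3):
        """Extract the PAM immediately 5' of the protospacer (illustrative)."""
        return target[protospacer_start - pam_len:protospacer_start]

    target = "GGATTAGCGTACGATCGT"   # invented invader sequence
    pam = pam_of(target, 5)        # "ATT" for this sequence

    interfered = pam in INTERFERENCE_PAMS   # direct interference possible?
    primed = pam in PRIMING_PAMS            # priming of new adaptation possible?
    ```

    In this invented example the PAM escapes interference but still licenses primed adaptation, mirroring the paper's point that relaxed PAM selectivity lets priming catch escape mutants.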

  14. Variable mutation rates as an adaptive strategy in replicator populations.

    Directory of Open Access Journals (Sweden)

    Michael Stich

    2010-06-01

    Full Text Available For evolving populations of replicators, there is much evidence that the effect of mutations on fitness depends on the degree of adaptation to the selective pressures at play. In optimized populations, most mutations have deleterious effects, such that low mutation rates are favoured. In contrast to this, in populations thriving in changing environments a larger fraction of mutations have beneficial effects, providing the diversity necessary to adapt to new conditions. What is more, non-adapted populations occasionally benefit from an increase in the mutation rate. Therefore, there is no optimal universal value of the mutation rate and species attempt to adjust it to their momentary adaptive needs. In this work we have used stationary populations of RNA molecules evolving in silico to investigate the relationship between the degree of adaptation of an optimized population and the value of the mutation rate promoting maximal adaptation in a short time to a new selective pressure. Our results show that this value can significantly differ from the optimal value at mutation-selection equilibrium, being strongly influenced by the structure of the population when the adaptive process begins. In the short-term, highly optimized populations containing little variability respond better to environmental changes upon an increase of the mutation rate, whereas populations with a lower degree of optimization but higher variability benefit from reducing the mutation rate to adapt rapidly. These findings show a good agreement with the behaviour exhibited by actual organisms that replicate their genomes under broadly different mutation rates.
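    The trade-off described above can be reduced to a deterministic back-of-the-envelope model: for a genome of length L with k mismatched (deleterious) sites, a per-site mutation rate mu changes expected fitness by mu*(k - (L - k)) per replication (bad sites repaired minus good sites broken). The model and numbers are my illustrative assumptions, not the paper's RNA simulations.

    ```python
    L = 100   # genome length (illustrative)

    def expected_fitness_change(k, mu):
        """Expected change in the number of matched sites after one
        replication: mu*k sites improve, mu*(L - k) sites degrade."""
        return mu * (k - (L - k))

    adapted = expected_fitness_change(k=5, mu=0.01)       # well-optimized genome
    non_adapted = expected_fitness_change(k=70, mu=0.01)  # after environment shift
    ```

    An optimized genome (small k) loses fitness in expectation at any positive rate, so lower mutation rates are favoured; a non-adapted genome (k > L/2) gains more at higher rates, matching the abstract's observation that the best short-term rate depends on the population's degree of adaptation.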

  15. Adapting computational text analysis to social science (and vice versa

    Directory of Open Access Journals (Sweden)

    Paul DiMaggio

    2015-11-01

    Full Text Available Social scientists and computer scientists are divided by small differences in perspective, not by any significant disciplinary divide. In the field of text analysis, several such differences are noted: social scientists often use unsupervised models to explore corpora, whereas many computer scientists employ supervised models trained on labeled data; social scientists hold to more conventional causal notions than do most computer scientists, and often favor intense exploitation of existing algorithms, whereas computer scientists focus more on developing new models; and computer scientists tend to trust human judgment more than social scientists do. These differences have implications that potentially can improve the practice of social science.

  16. The Short French Internet Addiction Test Adapted to Online Sexual Activities: Validation and Links With Online Sexual Preferences and Addiction Symptoms.

    Science.gov (United States)

    Wéry, Aline; Burnay, Jonathan; Karila, Laurent; Billieux, Joël

    2016-01-01

    The goal of this study was to investigate the psychometric properties of a French version of the short Internet Addiction Test adapted to online sexual activities (s-IAT-sex). The French version of the s-IAT-sex was administered to a sample of 401 men. The participants also completed a questionnaire that screened for sexual addiction (PATHOS). The relationships of s-IAT-sex scores with time spent online for online sexual activities (OSAs) and the types of OSAs favored were also considered. Confirmatory analyses supported a two-factor model of s-IAT-sex, corresponding to the factorial structure found in earlier studies that used the short IAT. The first factor regroups loss of control and time management, whereas the second factor regroups craving and social problems. Internal consistency for each factor was evaluated with Cronbach's α coefficient, resulting in .87 for Factor 1, .76 for Factor 2, and .88 for the global scale. Concurrent validity was supported by relationships with symptoms of sexual addiction, types of OSAs practiced, and time spent online for OSAs. The prevalence of sexual addiction (measured by PATHOS) was 28.1% in the current sample of self-selected male OSA users. The French version of the s-IAT-sex presents good psychometric properties and constitutes a useful tool for researchers and practitioners.
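The internal-consistency figures reported above (Cronbach's α of .87, .76, and .88) can be computed directly from an item-score matrix; a minimal sketch, with hypothetical respondents and item scores rather than the study's data:

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)       # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of the sum score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 5-point Likert responses (rows: respondents, cols: items)
data = np.array([
    [4, 5, 4, 4],
    [2, 2, 3, 2],
    [5, 4, 5, 5],
    [3, 3, 2, 3],
    [1, 2, 1, 2],
])
alpha = cronbach_alpha(data)
print(round(alpha, 3))
```

Items that rise and fall together across respondents (as here) drive α toward 1; uncorrelated items drive it toward 0.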

  17. The nociceptive withdrawal reflex does not adapt to joint position change and short-term motor practice [v2; ref status: indexed, http://f1000r.es/2lr

    Directory of Open Access Journals (Sweden)

    Nathan Eckert

    2013-12-01

    Full Text Available The nociceptive withdrawal reflex is a protective mechanism mediating interactions with a potentially dangerous environment. The reflex is formed by action-based sensory encoding during the early post-natal developmental period, and it is unknown whether the protective motor function of the nociceptive withdrawal reflex in the human upper limb is adaptable based on the configuration of the arm, or whether it can be modified by short-term practice of a similar or opposing motor action. In the present study, nociceptive withdrawal reflexes were evoked by a brief train of electrical stimuli applied to digit II, (1) in five different static arm positions and (2) before and after motor practice that was opposite (EXT) or similar (FLEX) to the stereotyped withdrawal response, in 10 individuals. Withdrawal responses were quantified by the electromyographic (EMG) reflex response in several upper-limb muscles and by the forces and moments recorded at the wrist. EMG onset latencies and response amplitudes were not significantly different across the arm positions or between the EXT and FLEX practice conditions, and the general direction of the withdrawal response was similar across arm positions. In addition, the force vectors were not different after practice, either within or between the EXT and FLEX conditions. We conclude that the withdrawal response is insensitive to changes in elbow or shoulder joint angles and resistant to short-term adaptation through practice of motor actions, resulting in a generalized limb withdrawal in each case. We further hypothesize that multisensory feedback is weighted differently in each arm position but integrated to achieve a similar withdrawal response, safeguarding against erroneous motor responses that could cause further harm. The results remain consistent with the concept that nociceptive withdrawal reflexes are shaped through long-term, and not short-term, action-based sensory encoding.

  18. Adaptive security protocol selection for mobile computing

    NARCIS (Netherlands)

    Pontes Soares Rocha, B.; Costa, D.N.O.; Moreira, R.A.; Rezende, C.G.; Loureiro, A.A.F.; Boukerche, A.

    2010-01-01

    The mobile computing paradigm has introduced new problems for application developers. Challenges include heterogeneity of hardware, software, and communication protocols, variability of resource limitations and varying wireless channel quality. In this scenario, security becomes a major concern for

  19. Dosimetric Advantages of Four-Dimensional Adaptive Image-Guided Radiotherapy for Lung Tumors Using Online Cone-Beam Computed Tomography

    International Nuclear Information System (INIS)

    Harsolia, Asif; Hugo, Geoffrey D.; Kestin, Larry L.; Grills, Inga S.; Yan Di

    2008-01-01

    Purpose: This study compares multiple planning techniques designed to improve accuracy while allowing reduced planning target volume (PTV) margins through image-guided radiotherapy (IGRT) with four-dimensional (4D) cone-beam computed tomography (CBCT). Methods and Materials: Free-breathing planning and 4D-CBCT scans were obtained in 8 patients with lung tumors. Four plans were generated for each patient: 3D-conformal, 4D-union, 4D-offline adaptive with a single correction (offline ART), and 4D-online adaptive with daily correction (online ART). For the 4D-union plan, the union of gross tumor volumes from all phases of the 4D-CBCT was created with a 5-mm expansion applied for setup uncertainty. For offline and online ART, the gross tumor volume was delineated at the mean position of tumor motion from the 4D-CBCT. The PTV margins were calculated from the random components of tumor motion and setup uncertainty. Results: Adaptive IGRT techniques provided better PTV coverage with less irradiated normal tissue. Compared with 3D plans, mean relative decreases in PTV volumes were 15%, 39%, and 44% using the 4D-union, offline ART, and online ART planning techniques, respectively. This resulted in relative decreases in the mean lung volume receiving ≥20 Gy (V20) of 21%, 23%, and 31% and in mean lung dose of 16%, 26%, and 31% for the 4D-union, 4D-offline ART, and 4D-online ART plans, respectively. Conclusions: Adaptive IGRT using CBCT is feasible for the treatment of patients with lung tumors and significantly decreases PTV volume and dose to normal tissues, allowing for the possibility of dose escalation. All analyzed 4D planning strategies resulted in improvements over 3D plans, with 4D-online ART appearing optimal.
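The abstract does not give the margin formula it used, but population-based PTV margins are commonly built by combining independent error sources in quadrature; a sketch under that assumption, using the widely cited van Herk recipe (2.5Σ + 0.7σ) with hypothetical error magnitudes:

```python
import math

def ptv_margin(systematic_mm, random_mm):
    """Population-based PTV margin via the van Herk recipe: 2.5*Sigma + 0.7*sigma.

    systematic_mm / random_mm are lists of standard deviations (mm) of the
    independent systematic and random error sources; each group is combined
    in quadrature. The recipe and values here are illustrative assumptions,
    not the formula stated in the study.
    """
    Sigma = math.sqrt(sum(s ** 2 for s in systematic_mm))
    sigma = math.sqrt(sum(s ** 2 for s in random_mm))
    return 2.5 * Sigma + 0.7 * sigma

# Hypothetical SDs: one residual systematic setup error, plus random
# components of breathing motion and daily setup.
print(round(ptv_margin([1.0], [2.0, 1.5]), 2))
```

Reducing the systematic component (e.g. by online correction, as in online ART) shrinks the margin 2.5× faster than an equal reduction of the random component.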

  20. Quantum computing for physics research

    International Nuclear Information System (INIS)

    Georgeot, B.

    2006-01-01

    Quantum computers hold great promises for the future of computation. In this paper, this new kind of computing device is presented, together with a short survey of the status of research in this field. The principal algorithms are introduced, with an emphasis on the applications of quantum computing to physics. Experimental implementations are also briefly discussed

  1. Training Adaptability in Digital Skills

    National Research Council Canada - National Science Library

    Hess-Kosa, Kathleen

    2001-01-01

    .... As outlined in this Phase I report, Aptima and the Group for Organizational Effectiveness (gOE) have laid the groundwork for an innovative, computer-based, digital-skills training package designed to increase the adaptability of digital skills...

  2. Adapting a measure of acculturation for cross-cultural research.

    Science.gov (United States)

    dela Cruz, F A; Padilla, G V; Agustin, E O

    2000-07-01

    Although Filipino Americans are projected to become the largest Asian American ethnic group in this millennium, no acculturation measure existed for this group. This article describes a systematic and replicable process used in adapting and modifying A Short Acculturation Scale for Hispanics (ASASH) for use with Filipino Americans. It depicts the multiple and iterative steps of translation and backtranslation to produce A Short Acculturation Scale for Filipino Americans (ASASFA) in English and in Tagalog--the Philippine national language. Also, it describes the methods undertaken for the measures to achieve linguistic and cross-cultural validity through content, technical, experiential, semantic, and conceptual equivalence. With the dearth of linguistically and culturally valid measures for immigrant populations, the adaptation of valid measures developed for other cultures remains a viable option.

  3. Force control tasks with pure haptic feedback promote short-term focused attention.

    Science.gov (United States)

    Wang, Dangxiao; Zhang, Yuru; Yang, Xiaoxiao; Yang, Gaofeng; Yang, Yi

    2014-01-01

    Focused attention has a great impact on our quality of life. Our learning, social skills and even happiness are closely intertwined with our capacity for focused attention. The literature on attention promotion is replete with examples of training-induced increases in attention capability, most of which rely on visual and auditory stimulation; pure haptic stimulation is rarely used for this purpose. We show that accurate force control tasks with pure haptic feedback enhance short-term focused attention. Participants were trained on a force control task in which information from the visual and auditory channels was blocked and only haptic feedback was provided. The trainees were asked to exert a target force within a pre-defined force tolerance for a specific duration. The tolerance was adaptively adjusted across levels of difficulty to elicit full participant engagement. Three attention tests showed significant changes in different aspects of focused attention in participants who had been trained compared with those who had not, illustrating the role of haptic-based sensory-motor tasks in the promotion of short-term focused attention. The findings highlight the potential value of haptic stimuli in brain plasticity and serve as a new tool to extend existing computer games for cognitive enhancement.
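The adaptive tolerance adjustment described above can be sketched as a simple staircase procedure that tightens the tolerance after a success and widens it after a failure; the step factor and force limits below are hypothetical, not the study's actual parameters:

```python
def update_tolerance(tolerance, success, step=0.9, floor=0.5, ceiling=20.0):
    """1-up/1-down staircase: tighten the force tolerance (harder) after a
    success, widen it (easier) after a failure. Units are newtons; all
    parameter values are hypothetical."""
    tolerance = tolerance * step if success else tolerance / step
    return min(max(tolerance, floor), ceiling)   # keep within sane bounds

tol = 10.0
for outcome in [True, True, False, True]:
    tol = update_tolerance(tol, outcome)
print(round(tol, 3))
```

A 1-up/1-down rule converges to the difficulty at which the participant succeeds about half the time, which keeps engagement high without making the task impossible.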

  4. Towards Measuring the Adaptability of an AO4BPEL Process

    OpenAIRE

    Botangen, Khavee Agustus; Yu, Jian; Sheng, Michael

    2017-01-01

    Adaptability is a significant property which enables software systems to continuously provide the required functionality and achieve optimal performance. The recognised importance of adaptability makes its evaluation an essential task. However, the various adaptability dimensions and implementation mechanisms make adaptive strategies difficult to evaluate. In service oriented computing, several frameworks that extend the WS-BPEL, the de facto standard in composing distributed business applica...

  5. Automated Analysis of Short Responses in an Interactive Synthetic Tutoring System for Introductory Physics

    Science.gov (United States)

    Nakamura, Christopher M.; Murphy, Sytil K.; Christel, Michael G.; Stevens, Scott M.; Zollman, Dean A.

    2016-01-01

    Computer-automated assessment of students' text responses to short-answer questions represents an important enabling technology for online learning environments. We have investigated the use of machine learning to train computer models capable of automatically classifying short-answer responses and assessed the results. Our investigations are part…
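A minimal sketch of machine-learned short-answer classification, assuming a bag-of-words Naive Bayes model rather than the authors' actual classifier; the graded physics responses are invented:

```python
import math
from collections import Counter, defaultdict

def train_nb(samples):
    """Train a multinomial Naive Bayes text classifier.
    samples: list of (text, label). Returns a classify(text) function."""
    label_counts = Counter()
    word_counts = defaultdict(Counter)
    vocab = set()
    for text, label in samples:
        words = text.lower().split()
        label_counts[label] += 1
        word_counts[label].update(words)
        vocab.update(words)
    total = sum(label_counts.values())

    def classify(text):
        words = text.lower().split()
        best, best_lp = None, -math.inf
        for label, n in label_counts.items():
            lp = math.log(n / total)                       # class prior
            denom = sum(word_counts[label].values()) + len(vocab)
            for w in words:                                # Laplace smoothing
                lp += math.log((word_counts[label][w] + 1) / denom)
            if lp > best_lp:
                best, best_lp = label, lp
        return best

    return classify

# Hypothetical graded responses to a Newton's-first-law question.
train = [
    ("the net force is zero so velocity is constant", "correct"),
    ("zero net force means no acceleration", "correct"),
    ("the object slows down and stops", "incorrect"),
    ("heavier objects fall faster", "incorrect"),
]
classify = train_nb(train)
print(classify("net force zero means constant velocity"))
```

A real system would add feature engineering (stemming, n-grams) and a held-out evaluation, but the train-then-classify structure is the same.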

  6. Numerical computations with GPUs

    CERN Document Server

    Kindratenko, Volodymyr

    2014-01-01

    This book brings together research on numerical methods adapted for Graphics Processing Units (GPUs). It explains recent efforts to adapt classic numerical methods, including solution of linear equations and FFT, for massively parallel GPU architectures. This volume consolidates recent research and adaptations, covering widely used methods that are at the core of many scientific and engineering computations. Each chapter is written by authors working on a specific group of methods; these leading experts provide mathematical background, parallel algorithms and implementation details leading to

  7. Ultra-Wideband, Short Pulse Electromagnetics 9

    CERN Document Server

    Rachidi, Farhad; Kaelin, Armin; Sabath, Frank; UWB SP 9

    2010-01-01

    Ultra-wideband (UWB), short-pulse (SP) electromagnetics are now being used for an increasingly wide variety of applications, including collision avoidance radar, concealed object detection, and communications. Notable progress in UWB and SP technologies has been achieved by investigations of their theoretical bases and improvements in solid-state manufacturing, computers, and digitizers. UWB radar systems are also being used for mine clearing, oil pipeline inspections, archeology, geology, and electronic effects testing. Ultra-wideband Short-Pulse Electromagnetics 9 presents selected papers of deep technical content and high scientific quality from the UWB-SP9 Conference, which was held from July 21-25, 2008, in Lausanne, Switzerland. The wide-ranging coverage includes contributions on electromagnetic theory, time-domain computational techniques, modeling, antennas, pulsed-power, UWB interactions, radar systems, UWB communications, and broadband systems and components. This book serves as a state-of-the-art r...

  8. Computer Technology for Industry

    Science.gov (United States)

    1979-01-01

    In this age of the computer, more and more business firms are automating their operations for increased efficiency in a great variety of jobs, from simple accounting to managing inventories, from precise machining to analyzing complex structures. In the interest of national productivity, NASA is providing assistance both to longtime computer users and newcomers to automated operations. Through a special technology utilization service, NASA saves industry time and money by making available already developed computer programs which have secondary utility. A computer program is essentially a set of instructions which tells the computer how to produce desired information or effects by drawing upon its stored input. Developing a new program from scratch can be costly and time-consuming. Very often, however, a program developed for one purpose can readily be adapted to a totally different application. To help industry take advantage of existing computer technology, NASA operates the Computer Software Management and Information Center (COSMIC) (registered trademark), located at the University of Georgia. COSMIC maintains a large library of computer programs developed for NASA, the Department of Defense, the Department of Energy and other technology-generating agencies of the government. The Center gets a continual flow of software packages, screens them for adaptability to private-sector usage, stores them and informs potential customers of their availability.

  9. Adaptive screening for depression--recalibration of an item bank for the assessment of depression in persons with mental and somatic diseases and evaluation in a simulated computer-adaptive test environment.

    Science.gov (United States)

    Forkmann, Thomas; Kroehne, Ulf; Wirtz, Markus; Norra, Christine; Baumeister, Harald; Gauggel, Siegfried; Elhan, Atilla Halil; Tennant, Alan; Boecker, Maren

    2013-11-01

    This study conducted a simulation of computer-adaptive testing (CAT) based on the Aachen Depression Item Bank (ADIB), which was developed for the assessment of depression in persons with somatic diseases. Prior to the CAT simulation, the ADIB was newly calibrated. Recalibration was performed in a sample of 161 patients treated for a depressive syndrome, 103 patients from cardiology, and 103 patients from otorhinolaryngology (mean age 44.1, SD=14.0; 44.7% female) and was cross-validated in a sample of 117 patients undergoing rehabilitation for cardiac diseases (mean age 58.4, SD=10.5; 24.8% women). Unidimensionality of the item bank was checked, and a Rasch analysis was performed that evaluated local dependency (LD), differential item functioning (DIF), item fit and reliability. The CAT simulation was conducted with the total sample and additional simulated data. Recalibration resulted in a strictly unidimensional item bank with 36 items, showing good Rasch model fit (acceptable item fit residuals, no DIF, no LD). The CAT simulation revealed that 13 items on average were necessary to estimate depression in the range of -2 and +2 logits when terminating at SE≤0.32, and 4 items if using SE≤0.50. Receiver operating characteristic analysis showed that θ estimates based on the CAT algorithm have good criterion validity with regard to depression diagnoses (area under the curve ≥.78 for all cut-off criteria). The recalibration of the ADIB succeeded, and the simulation studies conducted suggest that it has good screening performance in the samples investigated and that it may reasonably add to the improvement of depression assessment. © 2013.
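The adaptive loop the abstract describes (administer the most informative remaining item at the current ability estimate, re-estimate, stop once the standard error falls below a threshold) can be sketched for a dichotomous Rasch model. The bank, difficulties, and simulated respondent below are hypothetical, and the dichotomous simplification means the SE thresholds reached here are only illustrative of the mechanism, not of ADIB performance:

```python
import math
import random

def rasch_p(theta, b):
    """Rasch probability of a keyed response for ability theta, difficulty b."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def ml_theta(difficulties, responses):
    """Newton-Raphson ML ability estimate, clipped to [-4, 4]; returns (theta, SE)."""
    theta, info = 0.0, 1e-6
    for _ in range(30):
        ps = [rasch_p(theta, b) for b in difficulties]
        info = max(sum(p * (1 - p) for p in ps), 1e-6)
        theta += (sum(responses) - sum(ps)) / info
        theta = max(-4.0, min(4.0, theta))
    return theta, 1.0 / math.sqrt(info)

def run_cat(bank, answer, se_target=0.32):
    theta, se = 0.0, float("inf")
    administered, responses = [], []
    remaining = list(bank)
    while remaining:
        item = min(remaining, key=lambda b: abs(b - theta))  # max Fisher info
        remaining.remove(item)
        administered.append(item)
        responses.append(answer(item))
        theta, se = ml_theta(administered, responses)
        if se <= se_target:          # precision-based stopping rule
            break
    return theta, se, len(administered)

random.seed(7)
true_theta = 0.3
# Hypothetical 60-item bank with difficulties spread over [-2, 2] logits.
bank = [-2 + 4 * i / 59 for i in range(60)]
answer = lambda b: 1 if random.random() < rasch_p(true_theta, b) else 0
theta, se, n_items = run_cat(bank, answer)
print(n_items, round(theta, 2), round(se, 2))
```

Polytomous depression items carry more information per item than the dichotomous items simulated here, which is why the real ADIB needs far fewer items to reach the same SE.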

  10. Adaptive control of robotic manipulators

    Science.gov (United States)

    Seraji, H.

    1987-01-01

    The author presents a novel approach to adaptive control of manipulators to achieve trajectory tracking by the joint angles. The central concept in this approach is the utilization of the manipulator inverse as a feedforward controller. The desired trajectory is applied as an input to the feedforward controller which behaves as the inverse of the manipulator at any operating point; the controller output is used as the driving torque for the manipulator. The controller gains are then updated by an adaptation algorithm derived from MRAC (model reference adaptive control) theory to cope with variations in the manipulator inverse due to changes of the operating point. An adaptive feedback controller and an auxiliary signal are also used to enhance closed-loop stability and to achieve faster adaptation. The proposed control scheme is computationally fast and does not require a priori knowledge of the complex dynamic model or the parameter values of the manipulator or the payload.
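The gain-adaptation idea (update controller gains from the model-following error so the closed loop tracks a reference model) can be sketched for a scalar discrete-time plant. This is a generic MIT-rule/MRAC illustration with hypothetical plant and model parameters, not the manipulator controller of the abstract:

```python
# Unknown plant: x+ = a*x + b*u; reference model: xm+ = a_m*xm + b_m*r.
# Gradient (MIT-rule) updates drive the adaptive gains toward the exact
# model-matching values k_x = (a_m - a)/b = -0.6 and k_r = b_m/b = 1.0.
a, b = 0.8, 0.5          # "unknown" to the controller
a_m, b_m = 0.5, 0.5      # desired reference-model dynamics
g = 0.05                 # adaptation gain (small for stability)

x = xm = 0.0
k_r = k_x = 0.0          # adaptive feedforward and feedback gains
errors = []
for k in range(4000):
    r = 1.0 if (k // 50) % 2 == 0 else -1.0   # square wave: persistent excitation
    u = k_r * r + k_x * x
    e = x - xm                                # model-following error
    k_r -= g * e * r                          # gradient updates (sign valid for b > 0)
    k_x -= g * e * x
    x = a * x + b * u
    xm = a_m * xm + b_m * r
    errors.append(abs(e))

early = sum(errors[:200]) / 200
late = sum(errors[-200:]) / 200
print(round(early, 3), round(late, 4))
```

As the gains approach their matching values, the plant output follows the reference model and the tracking error shrinks, which is the behaviour the manipulator scheme exploits at a much larger scale.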

  11. Computer controlled testing of batteries

    NARCIS (Netherlands)

    Kuiper, A.C.J.; Einerhand, R.E.F.; Visscher, W.

    1989-01-01

    A computerized testing device for batteries consists of a power supply, a multiplexer circuit connected to the batteries, a protection circuit, and an IBM Data Acquisition and Control Adapter card connected to a personal computer. The software is written in Turbo-Pascal and can be easily adapted to

  12. Applying a new computer-aided detection scheme generated imaging marker to predict short-term breast cancer risk

    Science.gov (United States)

    Mirniaharikandehei, Seyedehnafiseh; Hollingsworth, Alan B.; Patel, Bhavika; Heidari, Morteza; Liu, Hong; Zheng, Bin

    2018-05-01

    This study aims to investigate the feasibility of identifying a new quantitative imaging marker based on false-positives generated by a computer-aided detection (CAD) scheme to help predict short-term breast cancer risk. An image dataset including four-view mammograms acquired from 1044 women was retrospectively assembled. All mammograms were originally interpreted as negative by radiologists. In the next subsequent mammography screening, 402 women were diagnosed with breast cancer and 642 remained negative. An existing CAD scheme was applied ‘as is’ to process each image. From the CAD-generated results, four detection features were computed from each image: the total number of (1) initial detection seeds and (2) final detected false-positive regions, and the (3) average and (4) sum of detection scores. Then, by combining the features computed from the two bilateral images of the left and right breasts from either the craniocaudal or mediolateral oblique view, two logistic regression models were trained and tested using a leave-one-case-out cross-validation method to predict the likelihood of each testing case being positive in the next subsequent screening. The new prediction model yielded a maximum prediction accuracy with an area under the ROC curve of AUC  =  0.65  ±  0.017 and a maximum adjusted odds ratio of 4.49 with a 95% confidence interval of (2.95, 6.83). The results also showed an increasing trend in the adjusted odds ratio and risk prediction scores with increasing short-term breast cancer risk.
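The prediction design (per-case CAD features, a logistic regression model, leave-one-case-out cross-validation, ROC analysis) can be sketched on invented data; the feature values and labels below are hypothetical stand-ins for the CAD outputs:

```python
import numpy as np

def fit_logreg(X, y, lr=0.1, steps=2000):
    """Plain gradient-descent logistic regression (weights incl. intercept)."""
    Xb = np.hstack([X, np.ones((len(X), 1))])
    w = np.zeros(Xb.shape[1])
    for _ in range(steps):
        p = 1 / (1 + np.exp(-Xb @ w))
        w -= lr * Xb.T @ (p - y) / len(y)
    return w

def predict(w, X):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return 1 / (1 + np.exp(-Xb @ w))

# Hypothetical per-case CAD features: [n_false_positive_regions, sum_of_scores]
X = np.array([[1, 0.2], [2, 0.4], [2, 0.3], [3, 0.5], [3, 0.6], [4, 0.7],
              [5, 1.1], [6, 1.3], [6, 1.2], [7, 1.5], [7, 1.6], [8, 1.8]], float)
y = np.array([0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1], float)

# Leave-one-case-out cross-validation, as in the abstract's design.
scores = np.empty(len(y))
for i in range(len(y)):
    mask = np.arange(len(y)) != i
    w = fit_logreg(X[mask], y[mask])
    scores[i] = predict(w, X[i:i + 1])[0]

# AUC as the fraction of correctly ordered positive/negative score pairs.
pos, neg = scores[y == 1], scores[y == 0]
auc = np.mean([(p > n) + 0.5 * (p == n) for p in pos for n in neg])
print(round(float(auc), 2))
```

On these cleanly separated toy features the cross-validated AUC is near 1; the study's AUC of 0.65 reflects the much weaker, but still useful, signal in real CAD false-positives.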

  13. Natural Computing in Computational Finance Volume 4

    CERN Document Server

    O’Neill, Michael; Maringer, Dietmar

    2012-01-01

    This book follows on from Natural Computing in Computational Finance Volumes I, II and III. As in the previous volumes of this series, the book consists of a series of chapters, each of which was selected following a rigorous, peer-reviewed selection process. The chapters illustrate the application of a range of cutting-edge natural computing and agent-based methodologies in computational finance and economics. The applications explored include option model calibration, financial trend reversal detection, enhanced indexation, algorithmic trading, corporate payout determination, agent-based modeling of liquidity costs, and trade strategy adaptation. While describing cutting-edge applications, the chapters are written so that they are accessible to a wide audience. Hence, they should be of interest to academics, students and practitioners in the fields of computational finance and economics.

  14. Computational physics an introduction

    CERN Document Server

    Vesely, Franz J

    1994-01-01

    Author Franz J. Vesely offers students an introductory text on computational physics, providing them with the important basic numerical/computational techniques. His unique text sets itself apart from others by focusing on specific problems of computational physics. The author also provides a selection of modern fields of research. Students will benefit from the appendixes which offer a short description of some properties of computing and machines and outline the technique of 'Fast Fourier Transformation.'

  15. An integrated environmental analysis of short rotation forests as a biomass resource

    International Nuclear Information System (INIS)

    Stjernquist, Ingrid

    1994-01-01

    Short-rotation plantations are an environmentally sound energy resource if: (1) the biomass production systems are not pressed to maximum production, (2) cultivation measures are taken to minimize nutrient leaching, (3) the short-rotation plantations are designed for visual adaptation to the landscape, and (4) directed silvicultural measures are taken to retain and improve important habitats and protect marginal forest areas. (author)

  16. Effect of radiation dose and adaptive statistical iterative reconstruction on image quality of pulmonary computed tomography

    International Nuclear Information System (INIS)

    Sato, Jiro; Akahane, Masaaki; Inano, Sachiko; Terasaki, Mariko; Akai, Hiroyuki; Katsura, Masaki; Matsuda, Izuru; Kunimatsu, Akira; Ohtomo, Kuni

    2012-01-01

    The purpose of this study was to assess the effects of dose and adaptive statistical iterative reconstruction (ASIR) on image quality of pulmonary computed tomography (CT). Inflated and fixed porcine lungs were scanned with a 64-slice CT system at 10, 20, 40 and 400 mAs. Using automatic exposure control, 40 mAs was chosen as standard dose. Scan data were reconstructed with filtered back projection (FBP) and ASIR. Image pairs were obtained by factorial combination of images at a selected level. Using a 21-point scale, three experienced radiologists independently rated differences in quality between adjacently displayed paired images for image noise, image sharpness and conspicuity of tiny nodules. A subjective quality score (SQS) for each image was computed based on Anderson's functional measurement theory. The standard deviation was recorded as a quantitative noise measurement. At all doses examined, SQSs improved with ASIR for all evaluation items. No significant differences were noted between the SQSs for 40%-ASIR images obtained at 20 mAs and those for FBP images at 40 mAs. Compared to the FBP algorithm, ASIR for lung CT can enable an approximately 50% dose reduction from the standard dose while preserving visualization of small structures. (author)

  17. A possible role of midbrain dopamine neurons in short- and long-term adaptation of saccades to position-reward mapping.

    Science.gov (United States)

    Takikawa, Yoriko; Kawagoe, Reiko; Hikosaka, Okihide

    2004-10-01

    Dopamine (DA) neurons respond to sensory stimuli that predict reward. To understand how DA neurons acquire such ability, we trained monkeys on a one-direction-rewarded version of memory-guided saccade task (1DR) only when we recorded from single DA neurons. In 1DR, position-reward mapping was changed across blocks of trials. In the early stage of training of 1DR, DA neurons responded to reward delivery; in the later stages, they responded predominantly to the visual cue that predicted reward or no reward (reward predictor) differentially. We found that such a shift of activity from reward to reward predictor also occurred within a block of trials after position-reward mapping was altered. A main effect of long-term training was to accelerate the within-block reward-to-predictor shift of DA neuronal responses. The within-block shift appeared first in the intermediate stage, but was slow, and DA neurons often responded to the cue that indicated reward in the preceding block. In the advanced stage, the reward-to-predictor shift occurred quickly such that the DA neurons' responses to visual cues faithfully matched the current position-reward mapping. Changes in the DA neuronal responses co-varied with the reward-predictive differentiation of saccade latency both in short-term (within-block) and long-term adaptation. DA neurons' response to the fixation point also underwent long-term changes until it occurred predominantly in the first trial within a block. This might trigger a switch between the learned sets. These results suggest that midbrain DA neurons play an essential role in adapting oculomotor behavior to frequent switches in position-reward mapping.

  18. Computational Intelligence Agent-Oriented Modelling

    Czech Academy of Sciences Publication Activity Database

    Neruda, Roman

    2006-01-01

    Roč. 5, č. 2 (2006), s. 430-433 ISSN 1109-2777 R&D Projects: GA MŠk 1M0567 Institutional research plan: CEZ:AV0Z10300504 Keywords : multi-agent systems * adaptive agents * computational intelligence Subject RIV: IN - Informatics, Computer Science

  19. Ethical Considerations in Designing Adaptive Persuasive Games

    DEFF Research Database (Denmark)

    Pedersen, Christoffer Holmgård; Khaled, Rilla; Yannakakis, Georgios N.

    In this poster, we describe an ongoing project concerning the development of an Adaptive Treatment Game (ATG) for treating Post Traumatic Stress Disorder. The ATG uses biofeedback and computer game technology to enable multiple treatment techniques and goals. We examine how a multidisciplinary approach shaped the prototype and we discuss the ethical implications of creating a self-adaptive, semiautonomous treatment game.

  20. Between-Trial Forgetting Due to Interference and Time in Motor Adaptation.

    Directory of Open Access Journals (Sweden)

    Sungshin Kim

    Full Text Available Learning a motor task with temporally spaced presentations, or with other tasks intermixed between presentations, reduces performance during training but can enhance retention post training. These two effects are known as the spacing and contextual interference effects, respectively. Here, we aimed to test a unifying hypothesis of the spacing and contextual interference effects in visuomotor adaptation, according to which forgetting between trials, due to either spaced presentations or interference by another task, will depress performance during acquisition but promote retention. We first performed an experiment with three visuomotor adaptation conditions: a short inter-trial-interval (ITI) condition (SHORT-ITI); a long ITI condition (LONG-ITI); and an alternating condition with two alternated opposite tasks (ALT), with the same single-task ITI as in LONG-ITI. In the SHORT-ITI condition, there was the fastest increase in performance during training and the largest immediate forgetting in the retention tests. In contrast, in the ALT condition, there was the slowest increase in performance during training and little immediate forgetting in the retention tests. Compared to these two conditions, in the LONG-ITI condition we found an intermediate increase in performance during training and intermediate immediate forgetting. To account for these results, we fitted to the data six possible adaptation models with one or two time scales, and with interference in the fast, the slow, or both time scales. Model comparison confirmed that two time scales and some degree of interference in either time scale are needed to account for our experimental results. In summary, our results suggest that retention following adaptation is modulated by the degree of between-trial forgetting, which is due to time-based decay in a single adaptation task and to interference in multiple adaptation tasks.
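The two-time-scale models referred to above are commonly written as a fast and a slow process that both learn from the same error but retain differently; a sketch with parameter values of the kind used in the motor-adaptation literature (the exact values here are hypothetical, not the fitted ones):

```python
A_f, B_f = 0.92, 0.21    # fast process: quick learning, weak retention
A_s, B_s = 0.996, 0.02   # slow process: slow learning, strong retention

def simulate(n_trials, perturbation=1.0):
    """Two-state model: both states decay by A and learn B times the error."""
    x_f = x_s = 0.0
    for _ in range(n_trials):
        error = perturbation - (x_f + x_s)   # residual error on this trial
        x_f = A_f * x_f + B_f * error
        x_s = A_s * x_s + B_s * error
    return x_f, x_s

x_f, x_s = simulate(100)
print("after training:", round(x_f + x_s, 2))

# Retention-only intervals (no error feedback) stand in for time passing
# between trials: the fast state decays away, the slow state persists.
x_f_rest, x_s_rest = x_f * A_f ** 50, x_s * A_s ** 50
print("after rest:", round(x_f_rest, 3), round(x_s_rest, 3))
```

Inserting such retention-only decay between training trials is exactly the between-trial forgetting the abstract invokes: it drains the fast state, so more of the final adaptation is carried by the well-retained slow state.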

  1. Adaptive hybrid control of manipulators

    Science.gov (United States)

    Seraji, H.

    1987-01-01

    Simple methods for the design of adaptive force and position controllers for robot manipulators within the hybrid control architecture are presented. The force controller is composed of an adaptive PID feedback controller, an auxiliary signal and a force feedforward term, and it achieves tracking of desired force setpoints in the constraint directions. The position controller consists of adaptive feedback and feedforward controllers and an auxiliary signal, and it accomplishes tracking of desired position trajectories in the free directions. The controllers are capable of compensating for dynamic cross-couplings that exist between the position and force control loops in the hybrid control architecture. The adaptive controllers do not require knowledge of the complex dynamic model or parameter values of the manipulator or the environment. The proposed control schemes are computationally fast and suitable for implementation in on-line control with high sampling rates.

  2. Use of dynamic grid adaption in the ASWR-method

    International Nuclear Information System (INIS)

    Graf, U.; Romstedt, P.; Werner, W.

    1985-01-01

    A dynamic grid adaption method has been developed for use with the ASWR-method. The method automatically adapts the number and position of the spatial mesh points as the solution of hyperbolic or parabolic vector partial differential equations progresses in time. The mesh selection algorithm is based on minimization of the L2-norm of the spatial discretization error. The method permits accurate calculation of the evolution of inhomogeneities such as wave fronts, shock layers and other sharp transitions, while generally using a coarse computational grid. The number of required mesh points is significantly reduced relative to a fixed Eulerian grid. Since the mesh selection algorithm is computationally inexpensive, a corresponding reduction of computing time results.
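A common way to realize error-based mesh selection is to equidistribute a monitor function, placing the points so that each cell carries an equal share of the estimated error. A sketch with an arc-length monitor and a hypothetical sharp-front solution (illustrating the idea, not the ASWR algorithm itself):

```python
import numpy as np

def equidistribute(n_points, x_fine, u_fine):
    """Place n_points so each cell holds an equal share of the monitor
    integral M(x) = ∫ sqrt(1 + u_x^2) dx (arc-length monitor)."""
    du = np.gradient(u_fine, x_fine)
    w = np.sqrt(1.0 + du ** 2)                      # monitor: large where u is steep
    M = np.concatenate([[0.0],
                        np.cumsum(0.5 * (w[1:] + w[:-1]) * np.diff(x_fine))])
    targets = np.linspace(0.0, M[-1], n_points)     # equal monitor shares
    return np.interp(targets, M, x_fine)            # invert cumulative monitor

x_fine = np.linspace(0.0, 1.0, 2001)
u_fine = np.tanh(50.0 * (x_fine - 0.5))             # sharp front at x = 0.5
grid = equidistribute(41, x_fine, u_fine)
dx = np.diff(grid)
print(round(float(dx.min()), 4), round(float(dx.max()), 4))
```

The 41 points cluster tightly inside the front and spread out over the flat regions, which is how a moving-mesh method resolves a shock layer with a globally coarse grid.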

  3. An adaptive inverse kinematics algorithm for robot manipulators

    Science.gov (United States)

    Colbaugh, R.; Glass, K.; Seraji, H.

    1990-01-01

    An adaptive algorithm for solving the inverse kinematics problem for robot manipulators is presented. The algorithm is derived using model reference adaptive control (MRAC) theory and is computationally efficient for online applications. The scheme requires no a priori knowledge of the kinematics of the robot if Cartesian end-effector sensing is available, and it requires knowledge of only the forward kinematics if joint position sensing is used. Computer simulation results are given for the redundant seven-DOF robotics research arm, demonstrating that the proposed algorithm yields accurate joint angle trajectories for a given end-effector position/orientation trajectory.
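
    The abstract does not spell out the MRAC update law, but the flavor of iterative, feedback-only inverse kinematics can be sketched with the simpler Jacobian-transpose rule on a hypothetical 2-link planar arm. Like the paper's scheme, the update below is driven purely by Cartesian end-effector error; the arm geometry and gains are illustrative assumptions, not the seven-DOF arm of the paper.

```python
import math

def ik_jacobian_transpose(l1, l2, target, alpha=0.05, iters=5000):
    """Iterative inverse kinematics for a 2-link planar arm using the
    Jacobian-transpose rule: a gradient-descent step on the squared
    Cartesian end-effector error."""
    q1, q2 = 0.3, 0.3  # initial joint angles (radians)
    for _ in range(iters):
        # Forward kinematics of the end effector.
        x = l1 * math.cos(q1) + l2 * math.cos(q1 + q2)
        y = l1 * math.sin(q1) + l2 * math.sin(q1 + q2)
        ex, ey = target[0] - x, target[1] - y
        # Jacobian of (x, y) with respect to (q1, q2).
        j11 = -l1 * math.sin(q1) - l2 * math.sin(q1 + q2)
        j12 = -l2 * math.sin(q1 + q2)
        j21 = l1 * math.cos(q1) + l2 * math.cos(q1 + q2)
        j22 = l2 * math.cos(q1 + q2)
        # Update q <- q + alpha * J^T e.
        q1 += alpha * (j11 * ex + j21 * ey)
        q2 += alpha * (j12 * ex + j22 * ey)
    return q1, q2

q1, q2 = ik_jacobian_transpose(1.0, 1.0, target=(1.2, 0.8))
```

    For a reachable, nonsingular target the iteration converges to joint angles whose forward kinematics match the commanded end-effector position.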

  4. Adaptive Fuzzy-Lyapunov Controller Using Biologically Inspired Swarm Intelligence

    Directory of Open Access Journals (Sweden)

    Alejandro Carrasco Elizalde

    2008-01-01

    The collective behaviour of swarms produces smarter actions than those achieved by a single individual. Colonies of ants, flocks of birds and fish schools are examples of swarms interacting with their environment to achieve a common goal. This cooperative biological intelligence is the inspiration for an adaptive fuzzy controller developed in this paper. Swarm intelligence is used to adjust the parameters of the membership functions used in the adaptive fuzzy controller. The rules of the controller are designed using a computing-with-words approach called Fuzzy-Lyapunov synthesis to improve the stability and robustness of an adaptive fuzzy controller. Computing-with-words provides a powerful tool to manipulate numbers and symbols, like words in a natural language.

  5. An embedded implementation based on adaptive filter bank for brain-computer interface systems.

    Science.gov (United States)

    Belwafi, Kais; Romain, Olivier; Gannouni, Sofien; Ghaffari, Fakhreddine; Djemal, Ridha; Ouni, Bouraoui

    2018-07-15

    Brain-computer interface (BCI) is a new communication pathway for users with neurological deficiencies. The implementation of a BCI system requires complex electroencephalography (EEG) signal processing including filtering, feature extraction and classification algorithms. Most current BCI systems are implemented on personal computers. Therefore, there is great interest in implementing BCI on embedded platforms to meet system specifications in terms of time response, cost effectiveness, power consumption, and accuracy. This article presents an embedded-BCI (EBCI) system based on a Stratix-IV field programmable gate array. The proposed system relies on the weighted overlap-add (WOLA) algorithm to perform dynamic filtering of EEG signals by analyzing the event-related desynchronization/synchronization (ERD/ERS). The EEG signals are classified, using the linear discriminant analysis algorithm, based on their spatial features. The proposed system performs fast classification within a time delay of 0.430 s/trial, achieving an average accuracy of 76.80% according to an offline approach and 80.25% using our own recordings. The estimated power consumption of the prototype is approximately 0.7 W. Results show that the proposed EBCI system reduces the overall classification error rate for the three datasets of the BCI competition by 5% compared to other similar implementations. Moreover, experiments show that the proposed system maintains a high accuracy rate with a short processing time, low power consumption, and low cost. Performing dynamic filtering of EEG signals using WOLA increases the recognition rate of ERD/ERS patterns of motor imagery brain activity. This approach allows the development of a complete prototype of an EBCI system that achieves excellent accuracy rates. Copyright © 2018 Elsevier B.V. All rights reserved.

  6. An adaptive algorithm for simulation of stochastic reaction-diffusion processes

    International Nuclear Information System (INIS)

    Ferm, Lars; Hellander, Andreas; Loetstedt, Per

    2010-01-01

    We propose an adaptive hybrid method suitable for stochastic simulation of diffusion dominated reaction-diffusion processes. For such systems, simulation of the diffusion requires the predominant part of the computing time. In order to reduce the computational work, the diffusion in parts of the domain is treated macroscopically, in other parts with the tau-leap method and in the remaining parts with Gillespie's stochastic simulation algorithm (SSA) as implemented in the next subvolume method (NSM). The chemical reactions are handled by SSA everywhere in the computational domain. A trajectory of the process is advanced in time by an operator splitting technique and the timesteps are chosen adaptively. The spatial adaptation is based on estimates of the errors in the tau-leap method and the macroscopic diffusion. The accuracy and efficiency of the method are demonstrated in examples from molecular biology where the domain is discretized by unstructured meshes.
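
    The exact-simulation fallback named in the abstract, Gillespie's SSA, can be sketched in a few lines. This minimal version handles only a single degradation reaction A → ∅; the paper's hybrid method layers tau-leaping and macroscopic diffusion on top of this kind of kernel.

```python
import random

def ssa_degradation(k_deg, n0, t_end, seed=0):
    """Gillespie's stochastic simulation algorithm (SSA) for the single
    degradation reaction A -> 0 with rate constant k_deg: draw an
    exponential waiting time from the total propensity, fire, repeat."""
    rng = random.Random(seed)
    t, n = 0.0, n0
    while n > 0:
        propensity = k_deg * n
        t += rng.expovariate(propensity)  # time to the next reaction
        if t > t_end:
            break
        n -= 1  # one molecule of A degrades
    return n

# Averaged over trajectories, SSA reproduces the ODE mean n0 * exp(-k*t).
trials = [ssa_degradation(1.0, 100, 1.0, seed=s) for s in range(200)]
mean_n = sum(trials) / len(trials)
```

    For k = 1 and t = 1 the ensemble mean approaches 100·e⁻¹ ≈ 36.8 molecules, while individual trajectories retain the intrinsic noise that deterministic rate equations miss.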

  7. Mitigation and Adaptation within a Climate Policy Portfolio

    Science.gov (United States)

    An effective policy response to climate change will include, among other things, investments in lowering greenhouse gas emissions (mitigation), as well as short-term temporary (flow) and long-lived capital-intensive (stock) adaptation to climate change. A critical near-term ques...

  8. Degenerate target sites mediate rapid primed CRISPR adaptation

    NARCIS (Netherlands)

    Fineran, P.C.; Gerritzen, M.J.; Suarez-Diez, M.; Kunne, T.; Boekhorst, J.; Hijum, S.A.F.T. van; Staals, R.H.G.; Brouns, S.J.

    2014-01-01

    Prokaryotes encode adaptive immune systems, called CRISPR-Cas (clustered regularly interspaced short palindromic repeats-CRISPR associated), to provide resistance against mobile invaders, such as viruses and plasmids. Host immunity is based on incorporation of invader DNA sequences in a memory locus

  10. Short-distance expansion for the electromagnetic half-space Green's tensor: general results and an application to radiative lifetime computations

    International Nuclear Information System (INIS)

    Panasyuk, George Y; Schotland, John C; Markel, Vadim A

    2009-01-01

    We obtain a short-distance expansion for the half-space, frequency-domain electromagnetic Green's tensor. The small parameter of the theory is ωε₁L/c, where ω is the frequency, ε₁ is the permittivity of the upper half-space, in which both the source and the point of observation are located and which is assumed to be transparent, c is the speed of light in vacuum and L is a characteristic length, defined as the distance from the point of observation to the reflected (with respect to the planar interface) position of the source. In the case when the lower half-space (the substrate) is characterized by a complex permittivity ε₂, we compute the expansion to third order. For the case when the substrate is a transparent dielectric, we compute the imaginary part of the Green's tensor to seventh order. The analytical calculations are verified numerically. The practical utility of the obtained expansion is demonstrated by computing the radiative lifetime of two electromagnetically interacting molecules in the vicinity of a transparent dielectric substrate. The computation is performed in the strong interaction regime when the quasi-particle pole approximation is inapplicable. In this regime, the integral representation for the half-space Green's tensor is difficult to use, while its electrostatic limiting expression is grossly inadequate. However, the analytical expansion derived in this paper can be used directly and efficiently. The results of this study are also relevant to nano-optics and near-field imaging, especially when tomographic image reconstruction is involved.

  11. Sobol indices for dimension adaptivity in sparse grids

    NARCIS (Netherlands)

    Dwight, R.P.; Desmedt, S.G.L.; Shoeibi Omrani, P.

    2016-01-01

    Propagation of random variables through computer codes of many inputs is primarily limited by computational expense. The use of sparse grids mitigates these costs somewhat; here we show how Sobol indices can be used to perform dimension adaptivity to mitigate them further. The method is compared to

  12. Adaptive finite element method for shape optimization

    KAUST Repository

    Morin, Pedro; Nochetto, Ricardo H.; Pauletti, Miguel S.; Verani, Marco

    2012-01-01

    We examine shape optimization problems in the context of inexact sequential quadratic programming. Inexactness is a consequence of using adaptive finite element methods (AFEM) to approximate the state and adjoint equations (via the dual weighted residual method), update the boundary, and compute the geometric functional. We present a novel algorithm that equidistributes the errors due to shape optimization and discretization, thereby leading to coarse resolution in the early stages and fine resolution upon convergence, and thus optimizing the computational effort. We discuss the ability of the algorithm to detect whether or not geometric singularities such as corners are genuine to the problem or simply due to lack of resolution - a new paradigm in adaptivity. © EDP Sciences, SMAI, 2012.

  14. Mesh adaptation technique for Fourier-domain fluorescence lifetime imaging

    International Nuclear Information System (INIS)

    Soloviev, Vadim Y.

    2006-01-01

    A novel adaptive mesh technique in the Fourier domain is introduced for problems in fluorescence lifetime imaging. A dynamical adaptation of the three-dimensional scheme based on the finite volume formulation reduces computational time and balances the ill-posed nature of the inverse problem. Light propagation in the medium is modeled by the telegraph equation, while the lifetime reconstruction algorithm is derived from the Fredholm integral equation of the first kind. Stability and computational efficiency of the method are demonstrated by image reconstruction of two spherical fluorescent objects embedded in a tissue phantom

  15. Algorithms for adaptive nonlinear pattern recognition

    Science.gov (United States)

    Schmalz, Mark S.; Ritter, Gerhard X.; Hayden, Eric; Key, Gary

    2011-09-01

    In Bayesian pattern recognition research, static classifiers have featured prominently in the literature. A static classifier is essentially based on a static model of input statistics, thereby assuming input ergodicity that is not realistic in practice. Classical Bayesian approaches attempt to circumvent the limitations of static classifiers, which can include brittleness and narrow coverage, by training extensively on a data set that is assumed to cover more than the subtense of expected input. Such assumptions are not realistic for more complex pattern classification tasks, for example, object detection using pattern classification applied to the output of computer vision filters. In contrast, we have developed a two-step process that can render the majority of static classifiers adaptive, so that the tracking of input nonergodicities is supported. First, we developed operations that dynamically insert (respectively, delete) training patterns into (respectively, from) the classifier's pattern database, without requiring that the classifier's internal representation of its training database be completely recomputed. Second, we developed and applied a pattern replacement algorithm that uses the aforementioned pattern insertion/deletion operations. This algorithm is designed to optimize the pattern database for a given set of performance measures, thereby supporting closed-loop, performance-directed optimization. This paper presents theory and algorithmic approaches for the efficient computation of adaptive linear and nonlinear pattern recognition operators that use our pattern insertion/deletion technology - in particular, tabular nearest-neighbor encoding (TNE) and lattice associative memories (LAMs). Of particular interest is the classification of nonergodic datastreams that have noise corruption with time-varying statistics. 
The TNE and LAM based classifiers discussed herein have been successfully applied to the computation of object classification in hyperspectral

  16. Adaptation of cotton cultivars | Wondimu | African Crop Science ...

    African Journals Online (AJOL)

    For each cultivar a linear regression of yield on the mean yield of all cultivars for each year was computed to measure cultivar adaptation. The cultivars with the highest mean yield exhibited a similar degree of adaptation to different environments with regression coefficient close to 1.0. For example, the breeding lines, Acala ...
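
    The regression described in the abstract is straightforward to sketch: the adaptation coefficient is the ordinary least-squares slope of a cultivar's yearly yield against the mean yield of all cultivars that year (a Finlay-Wilkinson-style stability analysis). The data below are hypothetical, not from the study.

```python
def adaptation_coefficient(cultivar_yields, env_means):
    """Slope of the ordinary least-squares regression of a cultivar's
    yearly yield on the mean yield of all cultivars that year; a
    coefficient near 1.0 indicates average adaptation across
    environments."""
    n = len(env_means)
    mx = sum(env_means) / n
    my = sum(cultivar_yields) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(env_means, cultivar_yields))
    sxx = sum((x - mx) ** 2 for x in env_means)
    return sxy / sxx

# Hypothetical data: five years of environment means and one cultivar
# that responds slightly more strongly than average to good years.
env = [2.0, 2.5, 3.0, 3.5, 4.0]
cultivar = [1.8, 2.5, 3.2, 3.9, 4.6]
b = adaptation_coefficient(cultivar, env)  # slope > 1: above-average responsiveness
```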

  17. A Weibull distribution accrual failure detector for cloud computing.

    Science.gov (United States)

    Liu, Jiaxi; Wu, Zhibo; Wu, Jin; Dong, Jian; Zhao, Yao; Wen, Dongxin

    2017-01-01

    Failure detectors are a fundamental component for building high-availability distributed systems. To meet the requirements of complicated large-scale distributed systems, accrual failure detectors that can adapt to multiple applications have been studied extensively. However, several implementations of accrual failure detectors do not adapt well to the cloud service environment. To solve this problem, a new accrual failure detector based on the Weibull distribution, called the Weibull Distribution Failure Detector, has been proposed specifically for cloud computing. It can adapt to the dynamic and unexpected network conditions in cloud computing. The performance of the Weibull Distribution Failure Detector is evaluated and compared based on public classical experiment data and cloud computing experiment data. The results show that the Weibull Distribution Failure Detector has better performance in terms of speed and accuracy in unstable scenarios, especially in cloud computing.
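
    The abstract does not give the estimator's formulas, but a plausible accrual-style sketch replaces the usual normal assumption with a Weibull model of heartbeat inter-arrival times: suspicion grows as the observed silence becomes improbable under the fitted distribution. The φ-style log scale and the fixed shape/scale parameters below are assumptions for illustration; in practice they would be fitted from a sliding window of observed arrivals.

```python
import math

def weibull_cdf(t, shape, scale):
    """CDF of the Weibull distribution modeling heartbeat inter-arrival
    times (shape and scale would normally be fitted online)."""
    return 1.0 - math.exp(-((t / scale) ** shape)) if t > 0 else 0.0

def suspicion(t_since_last, shape, scale):
    """Phi-style accrual suspicion: -log10 of the probability that the
    awaited heartbeat is merely late rather than lost."""
    p_late = 1.0 - weibull_cdf(t_since_last, shape, scale)
    return -math.log10(max(p_late, 1e-300))

# Suspicion stays low shortly after a heartbeat and accrues steadily as
# the silence stretches well past the expected interval (scale = 1.0 s).
low = suspicion(0.5, shape=2.0, scale=1.0)
high = suspicion(3.0, shape=2.0, scale=1.0)
```

    An application then compares the accrued suspicion to its own threshold, which is how accrual detectors serve multiple applications with different timeliness requirements from one monitoring stream.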

  18. An Efficient Adaptive Window Size Selection Method for Improving Spectrogram Visualization

    Directory of Open Access Journals (Sweden)

    Shibli Nisar

    2016-01-01

    Short Time Fourier Transform (STFT) is an important technique for the time-frequency analysis of a time-varying signal. The basic approach behind it involves the application of a Fast Fourier Transform (FFT) to a signal multiplied with an appropriate window function of fixed resolution. The selection of an appropriate window size is difficult when no background information about the input signal is known. In this paper, a novel empirical model is proposed that adaptively adjusts the window size for a narrow-band signal using a spectrum sensing technique. For wide-band signals, where a fixed time-frequency resolution is undesirable, the approach adopts the constant-Q transform (CQT). Unlike the STFT, the CQT provides a varying time-frequency resolution. This results in a high spectral resolution at low frequencies and high temporal resolution at high frequencies. In this paper, a simple but effective switching framework is provided between both STFT and CQT. The proposed method also allows for the dynamic construction of a filter bank according to user-defined parameters. This helps in reducing redundant entries in the filter bank. Results obtained from the proposed method not only improve the spectrogram visualization but also reduce the computation cost, and the method selects the appropriate window length 87.71% of the time.
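
    The CQT side of the switching framework can be illustrated by how its window lengths are derived: each geometrically spaced bin k is analyzed over N_k = ceil(Q · sr / f_k) samples, so the time-frequency trade-off varies with frequency exactly as the abstract describes. The parameter values below are illustrative, not taken from the paper.

```python
import math

def cqt_window_lengths(f_min, bins_per_octave, n_bins, sr):
    """Constant-Q analysis: center frequencies are geometrically spaced,
    and bin k uses N_k = ceil(Q * sr / f_k) samples, giving long windows
    (fine spectral resolution) at low frequencies and short windows
    (fine temporal resolution) at high frequencies."""
    q = 1.0 / (2.0 ** (1.0 / bins_per_octave) - 1.0)  # quality factor
    freqs = [f_min * 2.0 ** (k / bins_per_octave) for k in range(n_bins)]
    lengths = [math.ceil(q * sr / f) for f in freqs]
    return freqs, lengths

# Four octaves above 55 Hz at 12 bins/octave, 44.1 kHz sampling rate.
freqs, lengths = cqt_window_lengths(55.0, 12, 48, 44100)
```

    Because all bins share the same quality factor Q, every bin has the same ratio of center frequency to bandwidth, in contrast to the STFT's uniform bin spacing and single window length.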

  19. Computer-Aided Sensor Development Focused on Security Issues.

    Science.gov (United States)

    Bialas, Andrzej

    2016-05-26

    The paper examines intelligent sensor and sensor system development according to the Common Criteria methodology, which is the basic security assurance methodology for IT products and systems. The paper presents how the development process can be supported by software tools, design patterns and knowledge engineering. The automation of this process brings cost-, quality-, and time-related advantages, because the most difficult and most laborious activities are software-supported and the design reusability is growing. The paper includes a short introduction to the Common Criteria methodology and its sensor-related applications. In the experimental section the computer-supported and patterns-based IT security development process is presented using the example of an intelligent methane detection sensor. This process is supported by an ontology-based tool for security modeling and analyses. The verified and justified models are transferred straight to the security target specification representing security requirements for the IT product. The novelty of the paper is to provide a patterns-based and computer-aided methodology for the sensors development with a view to achieving their IT security assurance. The paper summarizes the validation experiment focused on this methodology adapted for the sensors system development, and presents directions of future research.

  20. Emotional facial expressions reduce neural adaptation to face identity.

    Science.gov (United States)

    Gerlicher, Anna M V; van Loon, Anouk M; Scholte, H Steven; Lamme, Victor A F; van der Leij, Andries R

    2014-05-01

    In human social interactions, facial emotional expressions are a crucial source of information. Repeatedly presented information typically leads to an adaptation of neural responses. However, processing seems sustained for emotional facial expressions. Therefore, we tested whether sustained processing of emotional expressions, especially threat-related expressions, would attenuate neural adaptation. Neutral and emotional expressions (happy, mixed and fearful) of same and different identity were presented at 3 Hz. We used electroencephalography to record the evoked steady-state visual potentials (ssVEP) and tested to what extent the ssVEP amplitude adapts to repetitions of the same, as compared with different, face identities. We found adaptation to the identity of a neutral face. However, for emotional faces, adaptation was reduced, decreasing linearly with negative valence, with the least adaptation to fearful expressions. This short and straightforward method may prove to be a valuable new tool in the study of emotional processing.

  1. Cross-Cultural Adaptation of a Farsi Version of the Impulsive Behavior Scale-Short Form in Iran

    Directory of Open Access Journals (Sweden)

    Omid Shokri

    2016-12-01

    Background: The aim of the present study was to investigate the psychometric properties of the Impulsive Behavior Scale-Short Form (IBS-SF) among undergraduate Farsi-speaking Iranian students. In this study, 201 individuals (95 men, 106 women) responded to the IBS-SF and the Problematic and Risky Internet Use Screening Scale (PRIUSS). Methods: The confirmatory factor analysis and internal consistency methods were used to compute the factorial validity and reliability of the IBS-SF, respectively. In order to examine the construct validity of the IBS-SF, the correlation of different dimensions of the IBS-SF with the PRIUSS was determined. Results: The results of confirmatory factor analysis showed that a 5-factor structure of negative urgency, lack of perseverance, lack of premeditation, sensation seeking, and positive urgency was replicated in the Iranian sample. The IBS-SF's convergent validity was confirmed by a correlation between different features of the impulsivity trait and problematic and risky internet use behavior. The internal consistency of the different subscales of the impulsivity trait ranged from 0.67 to 0.80. Conclusion: The present study revealed that the IBS-SF is a valid and reliable scale for measuring the impulsivity trait among undergraduate Farsi-speaking Iranian students.

  2. Superresolution restoration of an image sequence: adaptive filtering approach.

    Science.gov (United States)

    Elad, M; Feuer, A

    1999-01-01

    This paper presents a new method based on adaptive filtering theory for superresolution restoration of continuous image sequences. The proposed methodology employs least-squares (LS) estimators that adapt in time via adaptive filters, either least mean squares (LMS) or recursive least squares (RLS). The adaptation enables the treatment of linear space- and time-variant blurring and arbitrary motion, both of them assumed known. The proposed new approach is shown to have relatively low computational requirements. Simulations demonstrating the superresolution restoration algorithms are presented.
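
    The LMS branch of the estimator family named in the abstract can be sketched in its textbook form: a stochastic-gradient update of the filter weights from the instantaneous error. The system-identification setup below is an illustrative stand-in for the paper's superresolution formulation, not a reproduction of it.

```python
import random

def lms_identify(x, d, n_taps, mu):
    """Least-mean-squares (LMS) adaptation: after each sample, move the
    filter weights along the negative gradient of the instantaneous
    squared error between the desired and filtered signals."""
    w = [0.0] * n_taps
    for n in range(n_taps - 1, len(x)):
        recent = x[n - n_taps + 1:n + 1][::-1]  # newest sample first
        y = sum(wi * xi for wi, xi in zip(w, recent))
        e = d[n] - y  # a priori estimation error
        w = [wi + mu * e * xi for wi, xi in zip(w, recent)]
    return w

# Identify a hypothetical unknown 3-tap system from input/output data.
rng = random.Random(1)
true_w = [0.5, -0.3, 0.2]
x = [rng.uniform(-1.0, 1.0) for _ in range(4000)]
d = [sum(true_w[k] * x[n - k] for k in range(3)) if n >= 2 else 0.0
     for n in range(len(x))]
w = lms_identify(x, d, n_taps=3, mu=0.05)
```

    RLS replaces the scalar step `mu` with a recursively updated inverse correlation matrix, converging faster at a higher per-sample cost, which is the trade-off the paper's LS estimators navigate.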

  3. A parallel adaptive finite difference algorithm for petroleum reservoir simulation

    Energy Technology Data Exchange (ETDEWEB)

    Hoang, Hai Minh

    2005-07-01

    Adaptive finite difference methods for problems arising in simulation of flow in porous media are considered. Such methods have proven useful for overcoming limitations of computational resources and improving the resolution of the numerical solutions to a wide range of problems. Local refinement of the computational mesh where it is needed to improve the accuracy of solutions yields better solution resolution, representing more efficient use of computational resources than is possible with traditional fixed-grid approaches. In this thesis, we propose a parallel adaptive cell-centered finite difference (PAFD) method for black-oil reservoir simulation models. This is an extension of the adaptive mesh refinement (AMR) methodology first developed by Berger and Oliger (1984) for hyperbolic problems. Our algorithm is fully adaptive in time and space through the use of subcycling, in which finer grids are advanced at smaller time steps than the coarser ones. When coarse and fine grids reach the same advanced time level, they are synchronized to ensure that the global solution is conservative and satisfies the divergence constraint across all levels of refinement. The material in this thesis is subdivided into three overall parts. First we explain the methodology and intricacies of the PAFD scheme, then we extend a cell-centered finite difference approximation to a multilevel hierarchy of refined grids, and finally we employ the algorithm on a parallel computer. The results in this work show that the approach presented is robust and stable, thus demonstrating the increased solution accuracy due to local refinement and reduced computing resource consumption. (Author)

  4. Final Report: Symposium on Adaptive Methods for Partial Differential Equations

    Energy Technology Data Exchange (ETDEWEB)

    Pernice, Michael; Johnson, Christopher R.; Smith, Philip J.; Fogelson, Aaron

    1998-12-08

    Complex physical phenomena often include features that span a wide range of spatial and temporal scales. Accurate simulation of such phenomena can be difficult to obtain, and computations that are under-resolved can even exhibit spurious features. While it is possible to resolve small scale features by increasing the number of grid points, global grid refinement can quickly lead to problems that are intractable, even on the largest available computing facilities. These constraints are particularly severe for three dimensional problems that involve complex physics. One way to achieve the needed resolution is to refine the computational mesh locally, in only those regions where enhanced resolution is required. Adaptive solution methods concentrate computational effort in regions where it is most needed. These methods have been successfully applied to a wide variety of problems in computational science and engineering. Adaptive methods can be difficult to implement, prompting the development of tools and environments to facilitate their use. To ensure that the results of their efforts are useful, algorithm and tool developers must maintain close communication with application specialists. Conversely it remains difficult for application specialists who are unfamiliar with the methods to evaluate the trade-offs between the benefits of enhanced local resolution and the effort needed to implement an adaptive solution method.

  5. Adaptive subdomain modeling: A multi-analysis technique for ocean circulation models

    Science.gov (United States)

    Altuntas, Alper; Baugh, John

    2017-07-01

    Many coastal and ocean processes of interest operate over large temporal and geographical scales and require a substantial amount of computational resources, particularly when engineering design and failure scenarios are also considered. This study presents an adaptive multi-analysis technique that improves the efficiency of these computations when multiple alternatives are being simulated. The technique, called adaptive subdomain modeling, concurrently analyzes any number of child domains, with each instance corresponding to a unique design or failure scenario, in addition to a full-scale parent domain providing the boundary conditions for its children. To contain the altered hydrodynamics originating from the modifications, the spatial extent of each child domain is adaptively adjusted during runtime depending on the response of the model. The technique is incorporated in ADCIRC++, a re-implementation of the popular ADCIRC ocean circulation model with an updated software architecture designed to facilitate this adaptive behavior and to utilize concurrent executions of multiple domains. The results of our case studies confirm that the method substantially reduces computational effort while maintaining accuracy.

  6. Interoperable adaptive educational hypermedia : a web service definition

    NARCIS (Netherlands)

    Meccawy, M.; Celik, I.; Cristea, A.I.; Stewart, C.; Ashman, H.; Kinshuk, xx; Koper, R.; Kommers, P.A.M.

    2006-01-01

    This paper presents an approach to resolve the problem of authoring and interchanging educational material, based on web services. Here we describe the ultimate goal of freely reusing and interchanging adaptive e-learning material, briefly sketch previous solutions, showing their benefits but also

  7. Adaptive grid generation in a patient-specific cerebral aneurysm

    Science.gov (United States)

    Hodis, Simona; Kallmes, David F.; Dragomir-Daescu, Dan

    2013-11-01

    Adapting grid density to flow behavior provides the advantage of increasing solution accuracy while decreasing the number of grid elements in the simulation domain, therefore reducing the computational time. One method for grid adaptation requires successive refinement of grid density based on observed solution behavior until the numerical errors between successive grids are negligible. However, such an approach is time consuming and it is often neglected by the researchers. We present a technique to calculate the grid size distribution of an adaptive grid for computational fluid dynamics (CFD) simulations in a complex cerebral aneurysm geometry based on the kinematic curvature and torsion calculated from the velocity field. The relationship between the kinematic characteristics of the flow and the element size of the adaptive grid leads to a mathematical equation to calculate the grid size in different regions of the flow. The adaptive grid density is obtained such that it captures the more complex details of the flow with locally smaller grid size, while less complex flow characteristics are calculated on locally larger grid size. The current study shows that kinematic curvature and torsion calculated from the velocity field in a cerebral aneurysm can be used to find the locations of complex flow where the computational grid needs to be refined in order to obtain an accurate solution. We found that the complexity of the flow can be adequately described by velocity and vorticity and the angle between the two vectors. For example, inside the aneurysm bleb, at the bifurcation, and at the major arterial turns the element size in the lumen needs to be less than 10% of the artery radius, while at the boundary layer, the element size should be smaller than 1% of the artery radius, for accurate results within a 0.5% relative approximation error. 
This technique of quantifying flow complexity and adaptive remeshing has the potential to improve results accuracy and reduce

  8. Hybrid Direct and Iterative Solver with Library of Multi-criteria Optimal Orderings for h Adaptive Finite Element Method Computations

    KAUST Repository

    AbouEisha, Hassan M.

    2016-06-02

    In this paper we present a multi-criteria optimization of element partition trees and resulting orderings for multi-frontal solver algorithms executed for the two-dimensional h-adaptive finite element method. In particular, the problem of optimal ordering of elimination of rows in the sparse matrices resulting from adaptive finite element method computations is reduced to the problem of finding optimal element partition trees. Given a two-dimensional h-refined mesh, we find all optimal element partition trees by using the dynamic programming approach. An element partition tree defines a prescribed order of elimination of degrees of freedom over the mesh. We utilize three different metrics to estimate the quality of the element partition tree. As the first criterion we consider the number of floating point operations (FLOPs) performed by the multi-frontal solver. As the second criterion we consider the number of memory transfers (MEMOPS) performed by the multi-frontal solver algorithm. As the third criterion we consider the memory usage (NONZEROS) of the multi-frontal direct solver. We show the optimization results for FLOPs vs MEMOPS as well as for the execution time, estimated as FLOPs + 100·MEMOPS, vs NONZEROS. We obtain Pareto fronts with multiple optimal trees, for each mesh and for each refinement level. We generate a library of optimal elimination trees for small grids with local singularities. We also propose an algorithm for a given large mesh with identified local sub-grids, each containing a local singularity: we compute Schur complements over the sub-grids using the optimal trees from the library, and we submit the sequence of Schur complements to the iterative solver ILUPCG.

  9. Prematurity reduces functional adaptation to intestinal resection in piglets

    DEFF Research Database (Denmark)

    Aunsholt, Lise; Thymann, Thomas; Qvist, Niels

    2015-01-01

    Background: Necrotizing enterocolitis and congenital gastrointestinal malformations in infants often require intestinal resection, with a subsequent risk of short bowel syndrome (SBS). We hypothesized that immediate intestinal adaptation following resection of the distal intestine with placement ...

  10. Adaptive Inference on General Graphical Models

    OpenAIRE

    Acar, Umut A.; Ihler, Alexander T.; Mettu, Ramgopal; Sumer, Ozgur

    2012-01-01

    Many algorithms and applications involve repeatedly solving variations of the same inference problem; for example, we may want to introduce new evidence to the model or perform updates to conditional dependencies. The goal of adaptive inference is to take advantage of what is preserved in the model and perform inference more rapidly than from scratch. In this paper, we describe techniques for adaptive inference on general graphs that support marginal computation and updates to the conditional ...

  11. Unstructured Grid Adaptation: Status, Potential Impacts, and Recommended Investments Toward CFD Vision 2030

    Science.gov (United States)

    Park, Michael A.; Krakos, Joshua A.; Michal, Todd; Loseille, Adrien; Alonso, Juan J.

    2016-01-01

    Unstructured grid adaptation is a powerful tool to control discretization error for Computational Fluid Dynamics (CFD). It has enabled key increases in the accuracy, automation, and capacity of some fluid simulation applications. Slotnick et al. provide a number of case studies in the CFD Vision 2030 Study: A Path to Revolutionary Computational Aerosciences to illustrate the current state of CFD capability and capacity. The authors forecast the potential impact of emerging High Performance Computing (HPC) environments in the year 2030 and identify that mesh generation and adaptivity continue to be significant bottlenecks in the CFD work flow. These bottlenecks may persist because very little government investment has been targeted in these areas. To motivate investment, the impacts of improved grid adaptation technologies are identified. The CFD Vision 2030 Study roadmap and anticipated capabilities in complementary disciplines are quoted to provide context for the progress made in grid adaptation in the past fifteen years, current status, and a forecast for the next fifteen years with recommended investments. These investments are specific to mesh adaptation and impact other aspects of the CFD process. Finally, a strategy is identified to diffuse grid adaptation technology into production CFD work flows.

  12. An adaptive Cartesian control scheme for manipulators

    Science.gov (United States)

    Seraji, H.

    1987-01-01

    An adaptive control scheme for direct control of manipulator end-effectors to achieve trajectory tracking in Cartesian space is developed. The control structure is obtained from linear multivariable theory and is composed of simple feedforward and feedback controllers and an auxiliary input. The direct adaptation laws are derived from model reference adaptive control theory and are not based on parameter estimation of the robot model. The utilization of feedforward control and the inclusion of auxiliary input are novel features of the present scheme and result in improved dynamic performance over existing adaptive control schemes. The adaptive controller does not require the complex mathematical model of the robot dynamics or any knowledge of the robot parameters or the payload, and is computationally fast for online implementation with high sampling rates.
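
    The abstract's direct adaptation laws are developed for full manipulator dynamics; as a hedged illustration of the underlying model-reference idea only, here is a textbook scalar MRAC loop for a first-order plant with an unknown parameter, using a Lyapunov-style gradient adaptation law. All numbers are illustrative and none of this is from the paper:

```python
def simulate_mrac(a=1.0, gamma=2.0, dt=0.01, steps=2000, r=1.0):
    """Scalar model-reference adaptive control (illustrative sketch).

    Plant:      dx/dt  = a*x + u        (a unknown to the controller)
    Reference:  dxm/dt = -xm + r
    Control:    u = theta*x + r, adapted by dtheta/dt = -gamma*e*x,
    where e = x - xm.  The ideal gain is theta* = -(1 + a), which makes
    the closed-loop plant match the reference model exactly.
    """
    x = xm = theta = 0.0
    for _ in range(steps):
        e = x - xm                      # tracking error
        u = theta * x + r               # adaptive control law
        theta += dt * (-gamma * e * x)  # gradient adaptation
        x += dt * (a * x + u)           # Euler step of the plant
        xm += dt * (-xm + r)            # Euler step of the reference model
    return x - xm, theta

e_final, theta_final = simulate_mrac()
print(round(e_final, 3), round(theta_final, 2))
```

    Even though the plant is open-loop unstable here (a = 1), the adaptation drives the tracking error toward zero without ever estimating a explicitly, which is the sense in which such laws are "direct".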

  13. Smartphone adapters for digital photomicrography.

    Science.gov (United States)

    Roy, Somak; Pantanowitz, Liron; Amin, Milon; Seethala, Raja R; Ishtiaque, Ahmed; Yousem, Samuel A; Parwani, Anil V; Cucoranu, Ioan; Hartman, Douglas J

    2014-01-01

    Photomicrographs in Anatomic Pathology provide a means of quickly sharing information from a glass slide for consultation, education, documentation and publication. While static image acquisition historically involved the use of a permanently mounted camera unit on a microscope, such cameras may be expensive, need to be connected to a computer, and often require proprietary software to acquire and process images. Another novel approach for capturing digital microscopic images is to use smartphones coupled with the eyepiece of a microscope. Recently, several smartphone adapters have emerged that allow users to attach mobile phones to the microscope. The aim of this study was to test the utility of these various smartphone adapters. We surveyed the market for adapters to attach smartphones to the ocular lens of a conventional light microscope. Three adapters (Magnifi, Skylight and Snapzoom) were tested. We assessed the designs of these adapters and their effectiveness at acquiring static microscopic digital images. All adapters facilitated the acquisition of digital microscopic images with a smartphone. The optimal adapter was dependent on the type of phone used. The Magnifi adapters for iPhone were incompatible when using a protective case. The Snapzoom adapter was easiest to use with iPhones and other smartphones even with protective cases. Smartphone adapters are inexpensive and easy to use for acquiring digital microscopic images. However, they require some adjustment by the user in order to optimize focus and obtain good quality images. Smartphone microscope adapters provide an economically feasible method of acquiring and sharing digital pathology photomicrographs.

  14. Modern computer hardware and the role of central computing facilities in particle physics

    International Nuclear Information System (INIS)

    Zacharov, V.

    1981-01-01

    Important recent changes in the hardware technology of computer system components are reviewed, and the impact of these changes assessed on the present and future pattern of computing in particle physics. The place of central computing facilities is particularly examined, to answer the important question as to what, if anything, should be their future role. Parallelism in computing system components is considered to be an important property that can be exploited with advantage. The paper includes a short discussion of the position of communications and network technology in modern computer systems. (orig.)

  15. Adaptive security systems -- Combining expert systems with adaptive technologies

    International Nuclear Information System (INIS)

    Argo, P.; Loveland, R.; Anderson, K.

    1997-01-01

    The Adaptive Multisensor Integrated Security System (AMISS) uses a variety of computational intelligence techniques to reason from raw sensor data through an array of processing layers to arrive at an assessment for alarm/alert conditions based on human behavior within a secure facility. In this paper, the authors give an overview of the system and briefly describe some of the major components of the system. This system is currently under development and testing in a realistic facility setting

  16. Iterative Adaptive Sampling For Accurate Direct Illumination

    National Research Council Canada - National Science Library

    Donikian, Michael

    2004-01-01

    This thesis introduces a new multipass algorithm, Iterative Adaptive Sampling, for efficiently computing the direct illumination in scenes with many lights, including area lights that cause realistic soft shadows...

  17. Rapid adaptation to oil exposure in the cosmopolitan copepod Acartia tonsa

    DEFF Research Database (Denmark)

    Krause, K. E.; Dinh, Khuong Van; Nielsen, Torkel Gissel

    Oil spills are potential environmental hazards to marine ecosystems worldwide. Oil spills may persist in seawater longer than one generation of many zooplankton species. However, whether populations of short-lived and fast-growing marine organisms adapt to oil exposure through natural selection ... in size at maturity of females was less pronounced in the second generation. Strikingly, survival, egg production, and hatching success all recovered in the second generation, indicating a rapid selection towards individuals with adaptations to deal with pyrene exposure. Our results show that populations of short-lived and fast-growing copepods have the potential of showing surprisingly strong resilience to the type of oil contamination they might face in their natural coastal habitats ...

  18. The population ecology of contemporary adaptations: what empirical studies reveal about the conditions that promote adaptive evolution.

    Science.gov (United States)

    Reznick, D N; Ghalambor, C K

    2001-01-01

    Under what conditions might organisms be capable of rapid adaptive evolution? We reviewed published studies documenting contemporary adaptations in natural populations and looked for general patterns in the population ecological causes. We found that studies of contemporary adaptation fall into two general settings: (1) colonization of new environments that established newly adapted populations, and (2) local adaptations within the context of heterogeneous environments and metapopulation structure. Local ecological processes associated with colonizations and introductions included exposure to: (1) a novel host or food resource; (2) a new biophysical environment; (3) a new predator community; and (4) a new coexisting competitor. The new environments that were colonized often had depauperate communities, sometimes because of anthropogenic disturbance. Local adaptation in heterogeneous environments was also often associated with recent anthropogenic changes, such as insecticide and herbicide resistance, or industrial melanism. A common feature of many examples is the combination of directional selection with at least a short-term opportunity for population growth. We suggest that such opportunities for population growth may be a key factor that promotes rapid evolution, since directional selection might otherwise be expected to cause population decline and create the potential for local extinction, which is an ever-present alternative to local adaptation. We also address the large discrepancy between the rate of evolution observed in contemporary studies and the apparent rate of evolution seen in the fossil record.

  19. Disturbance Accommodating Adaptive Control with Application to Wind Turbines

    Science.gov (United States)

    Frost, Susan

    2012-01-01

    Adaptive control techniques are well suited to applications that have unknown modeling parameters and poorly known operating conditions. Many physical systems experience external disturbances that are persistent or continually recurring. Flexible structures and systems with compliance between components often form a class of systems that fail to meet standard requirements for adaptive control. For these classes of systems, a residual mode filter can restore the ability of the adaptive controller to perform in a stable manner. New theory will be presented that enables adaptive control with accommodation of persistent disturbances using residual mode filters. After a short introduction to some of the control challenges of large utility-scale wind turbines, this theory will be applied to a high-fidelity simulation of a wind turbine.

  20. A white box perspective on behavioural adaptation

    DEFF Research Database (Denmark)

    Bruni, Roberto; Corradini, Andrea; Gadducci, Fabio

    2015-01-01

    We present a white-box conceptual framework for adaptation developed in the context of the EU Project ASCENS coordinated by Martin Wirsing. We called it CoDA, for Control Data Adaptation, since it is based on the notion of control data. CoDA promotes a neat separation between application and adaptation logic through a clear identification of the set of data that is relevant for the latter. The framework provides an original perspective from which we survey a representative set of approaches to adaptation, ranging from programming languages and paradigms to computational models and architectural ...

  1. Metabolic Adaptation to Muscle Ischemia

    Science.gov (United States)

    Cabrera, Marco E.; Coon, Jennifer E.; Kalhan, Satish C.; Radhakrishnan, Krishnan; Saidel, Gerald M.; Stanley, William C.

    2000-01-01

    Although all tissues in the body can adapt to varying physiological/pathological conditions, muscle is the most adaptable. To understand the significance of cellular events and their role in controlling metabolic adaptations in complex physiological systems, it is necessary to link cellular and system levels by means of mechanistic computational models. The main objective of this work is to improve understanding of the regulation of energy metabolism during skeletal/cardiac muscle ischemia by combining in vivo experiments and quantitative models of metabolism. Our main focus is to investigate factors affecting lactate metabolism (e.g., NADH/NAD) and the inter-regulation between carbohydrate and fatty acid metabolism during a reduction in regional blood flow. A mechanistic mathematical model of energy metabolism has been developed to link cellular metabolic processes and their control mechanisms to tissue (skeletal muscle) and organ (heart) physiological responses. We applied this model to simulate the relationship between tissue oxygenation, redox state, and lactate metabolism in skeletal muscle. The model was validated using human data from published occlusion studies. Currently, we are investigating the difference in the responses to sudden vs. gradual onset ischemia in swine by combining in vivo experimental studies with computational models of myocardial energy metabolism during normal and ischemic conditions.

  2. A short assessment of health literacy (SAHL) in the Netherlands

    NARCIS (Netherlands)

    Pander Maat, Henk; Essink-Bot, Marie-Louise; Leenaars, Karlijn EF; Fransen, Mirjam P.

    2014-01-01

    Abstract Background: An earlier attempt to adapt the REALM (Rapid Estimate of Adult Literacy in Medicine) word recognition test to Dutch was not entirely successful due to ceiling effects. In contrast to REALM, the Short Assessment of Health Literacy (SAHL) assesses both word recognition and

  3. Synaptic plasticity, neural circuits, and the emerging role of altered short-term information processing in schizophrenia

    Science.gov (United States)

    Crabtree, Gregg W.; Gogos, Joseph A.

    2014-01-01

    Synaptic plasticity alters the strength of information flow between presynaptic and postsynaptic neurons and thus modifies the likelihood that action potentials in a presynaptic neuron will lead to an action potential in a postsynaptic neuron. As such, synaptic plasticity and pathological changes in synaptic plasticity impact the synaptic computation which controls the information flow through the neural microcircuits responsible for the complex information processing necessary to drive adaptive behaviors. As current theories of neuropsychiatric disease suggest that distinct dysfunctions in neural circuit performance may critically underlie the unique symptoms of these diseases, pathological alterations in synaptic plasticity mechanisms may be fundamental to the disease process. Here we consider mechanisms of both short-term and long-term plasticity of synaptic transmission and their possible roles in information processing by neural microcircuits in both health and disease. As paradigms of neuropsychiatric diseases with strongly implicated risk genes, we discuss the findings in schizophrenia and autism and consider the alterations in synaptic plasticity and network function observed in both human studies and genetic mouse models of these diseases. Together these studies have begun to point toward a likely dominant role of short-term synaptic plasticity alterations in schizophrenia while dysfunction in autism spectrum disorders (ASDs) may be due to a combination of both short-term and long-term synaptic plasticity alterations. PMID:25505409

  4. Analysis of genetic polymorphism of nine short tandem repeat loci in ...

    African Journals Online (AJOL)

    Yomi

    2012-03-15

    Mar 15, 2012 ... Key words: short tandem repeat, repeat motif, genetic polymorphism, Han population, forensic genetics. Short tandem repeat (STR) is widely ... The exact test of Hardy-Weinberg equilibrium was conducted with Arlequin version 3.5 software (Computational and Molecular ...
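
    The snippet mentions testing Hardy-Weinberg equilibrium with Arlequin. As a hedged, self-contained illustration of the same idea, here is the simpler chi-square goodness-of-fit statistic for one biallelic locus (the paper's cited tool uses an exact test; the genotype counts below are made up):

```python
def hwe_chi_square(n_AA, n_Aa, n_aa):
    """Chi-square goodness-of-fit statistic for Hardy-Weinberg
    equilibrium at a biallelic locus (1 degree of freedom)."""
    n = n_AA + n_Aa + n_aa
    p = (2 * n_AA + n_Aa) / (2 * n)   # observed frequency of allele A
    q = 1 - p
    expected = {"AA": n * p * p, "Aa": 2 * n * p * q, "aa": n * q * q}
    observed = {"AA": n_AA, "Aa": n_Aa, "aa": n_aa}
    return sum((observed[g] - expected[g]) ** 2 / expected[g] for g in expected)

# Counts exactly at HWE proportions (p = q = 0.5) give a statistic of 0.
print(round(hwe_chi_square(25, 50, 25), 6))  # 0.0
```

    Compared against the chi-square critical value of 3.84 at one degree of freedom, a larger statistic would reject equilibrium at the 5% level.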

  5. Learn-and-Adapt Stochastic Dual Gradients for Network Resource Allocation

    OpenAIRE

    Chen, Tianyi; Ling, Qing; Giannakis, Georgios B.

    2017-01-01

    Network resource allocation shows revived popularity in the era of data deluge and information explosion. Existing stochastic optimization approaches fall short in attaining a desirable cost-delay tradeoff. Recognizing the central role of Lagrange multipliers in network resource allocation, a novel learn-and-adapt stochastic dual gradient (LA-SDG) method is developed in this paper to learn the sample-optimal Lagrange multiplier from historical data, and accordingly adapt the upcoming resource...

  6. [Cross-cultural adaptation and apparent and content validity of the short version of The Eating Motivation Survey (TEMS) in Brazilian Portuguese].

    Science.gov (United States)

    Moraes, Jéssica Maria Muniz; Alvarenga, Marle Dos Santos

    2017-10-26

    Understanding why people eat what they eat is essential for developing nutritional guidelines capable of modifying inadequate and dysfunctional eating patterns. Such understanding can be assessed by specific instruments, amongst which The Eating Motivation Survey (TEMS) allows the identification of factors that determine motivations for eating and food choices. The aim of this study is to present the cross-cultural adaptation of the short version of TEMS for use in studies in the Brazilian population. The process involved conceptual and item equivalences; semantic equivalence by 2 translators, 1 linguist, 22 experts (frequency of response understanding), and 23 bilingual individuals (with response comparisons by the paired t-test, Pearson correlation coefficient, and intra-class correlation coefficient); and operational equivalence, performed with 32 individuals. The measurement equivalence corresponding to psychometric properties is under way. All equivalences showed satisfactory results for the scale's use in Brazil, thus allowing application of TEMS to assess motivations for eating choices in the Brazilian context.

  7. Adapting inland fisheries management to a changing climate

    Science.gov (United States)

    Paukert, Craig P.; Glazer, Bob A.; Hansen, Gretchen J. A.; Irwin, Brian J.; Jacobson, Peter C.; Kershner, Jeffrey L.; Shuter, Brian J.; Whitney, James E.; Lynch, Abigail J.

    2016-01-01

    Natural resource decision makers are challenged to adapt management to a changing climate while balancing short-term management goals with long-term changes in aquatic systems. Adaptation will require developing resilient ecosystems and resilient management systems. Decision makers already have tools to develop or ensure resilient aquatic systems and fisheries such as managing harvest and riparian zones. Because fisheries management often interacts with multiple stakeholders, adaptation strategies involving fisheries managers and other partners focused on land use, policy, and human systems, coupled with long-term monitoring, are necessary for resilient systems. We show how agencies and organizations are adapting to a changing climate in Minnesota and Ontario lakes and Montana streams. We also present how the Florida Fish and Wildlife Commission created a management structure to develop adaptation strategies. These examples demonstrate how organizations and agencies can cope with climate change effects on fishes and fisheries through creating resilient management and ecological systems.

  8. Design of an LVDS to USB3.0 adapter and application

    Science.gov (United States)

    Qiu, Xiaohan; Wang, Yu; Zhao, Xin; Chang, Zhen; Zhang, Quan; Tian, Yuze; Zhang, Yunyi; Lin, Fang; Liu, Wenqing

    2016-10-01

    The USB 3.0 specification was published in 2008. With the development of technology, USB 3.0 is becoming popular. The LVDS (Low Voltage Differential Signaling) to USB 3.0 Adapter connects the communication port of a spectrometer device to the USB 3.0 port of a computer and converts the output data of an LVDS spectrometer device to USB. To adapt to changing and developing technology, the LVDS to USB 3.0 Adapter was designed and developed based on the LVDS to USB 2.0 Adapter. The CYUSB3014, a new generation of USB bus interface chip produced by Cypress and conforming to the USB 3.0 communication protocol, utilizes GPIF-II (GPIF, general programmable interface) to connect the FPGA and increases the effective communication speed to 2 Gbps. Therefore, the adapter, based on USB 3.0 technology, is able to connect more spectrometers to a single computer and provides a technical basis for the development of higher-speed industrial cameras. This article describes the design and development process of the LVDS to USB 3.0 adapter.

  9. High-Capacity Short-Range Optical Communication Links

    DEFF Research Database (Denmark)

    Tatarczak, Anna

    Over the last decade, we have observed a tremendous spread of end-user mobile devices. The user base of a mobile application can grow or shrink by millions per day. This situation creates a pressing need for highly scalable server infrastructure; a need nowadays satisfied through cloud computing offered by data centers. As the popularity of cloud computing soars, the demand for high-speed, short-range data center links grows. Vertical cavity surface emitting lasers (VCSEL) and multimode fibers (MMF) prove especially well-suited for such scenarios. VCSELs have high modulation bandwidths ... we achieve 10 Gbps over 400 m and then confirm the approach in an optimized system at 25 Gbps over 300 m. The techniques described in this thesis leverage additional degrees of freedom to better utilize the available resources of short-range links. The proposed schemes enable higher speeds and longer ...

  10. Adaptive value of sex in microbial pathogens.

    Science.gov (United States)

    Michod, Richard E; Bernstein, Harris; Nedelcu, Aurora M

    2008-05-01

    Explaining the adaptive value of sex is one of the great outstanding problems in biology. The challenge comes from the difficulty in identifying the benefits provided by sex, which must outweigh the substantial costs of sex. Here, we consider the adaptive value of sex in viruses, bacteria and fungi, and particularly the information available on the adaptive role of sex in pathogenic microorganisms. Our general theme is that the varied aspects of sex in pathogens illustrate the varied issues surrounding the evolution of sex generally. These include the benefits of sex (in the short and long term), as well as the costs of sex (both to the host and to the pathogen). For the benefits of sex (that is, its adaptive value), we consider three hypotheses: (i) sex provides for effective and efficient recombinational repair of DNA damages, (ii) sex provides DNA for food, and (iii) sex produces variation and reduces genetic associations among alleles under selection. Although the evolution of sex in microbial pathogens illustrates these general issues, our paper is not a general review of theories for the evolution of sex in all organisms. Rather, we focus on the adaptive value of sex in microbial pathogens and conclude that in terms of short-term benefits, the DNA repair hypothesis has the most support and is the most generally applicable hypothesis in this group. In particular, recombinational repair of DNA damages may substantially benefit pathogens when challenged by the oxidative defenses of the host. However, in the long-term, sex may help get rid of mutations, increase the rate of adaptation of the population, and, in pathogens, may infrequently create new infective strains. An additional general issue about sex illustrated by pathogens is that some of the most interesting consequences of sex are not necessarily the reasons for which sex evolved. For example, antibiotic resistance may be transferred by bacterial sex, but this transfer is probably not the reason sex

  11. Shorts due to diagnostic leads

    International Nuclear Information System (INIS)

    Ellis, J.F.; Lubell, M.S.; Pillsbury, R.D.; Shen, S.S.; Thome, R.J.; Walstrom, P.L.

    1985-01-01

    The superconducting toroidal field coils that are being tested in the Large Coil Test Facility (LCTF) are heavily instrumented. In the General Electric coil, a lead wire of an internal sensor became shorted across an estimated three or four turns of the pancake winding. This short occurred during the final stages of the winding fabrication and was not accessible for repair. Resistance, voltage gradient, and transient voltage decay measurements were performed to characterize the short and the magnetic damping of the large steel bobbin and outer structural ring. The 32-gage wire causing the short was estimated to be about 10 cm long, with a resistance of 55 mΩ. As a safety measure, we decided to burn out the shorted wire at room temperature before installing the coil in LCTF. Tests were made to determine the energy needed to vaporize a small wire. Computer calculations indicated that within the voltage limits set for the coil, it was not feasible to burn out the wire by rapidly dumping the coil from a low-current dc charge-up. We accomplished the burnout by applying 800 V at 3.25 A and 60 Hz for about 1 s. Transient voltage decay measurements made after the burnout, compared with those made before the attempt, confirmed that the short had indeed been opened.

  12. Rate-adaptive BCH codes for distributed source coding

    DEFF Research Database (Denmark)

    Salmistraro, Matteo; Larsen, Knud J.; Forchhammer, Søren

    2013-01-01

    This paper considers Bose-Chaudhuri-Hocquenghem (BCH) codes for distributed source coding. A feedback channel is employed to adapt the rate of the code during the decoding process. The focus is on codes with short block lengths for independently coding a binary source X and decoding it given its correlated side information Y. The proposed codes have been analyzed in a high-correlation scenario, where the marginal probability of each symbol, Xi in X, given Y is highly skewed (unbalanced). Rate-adaptive BCH codes are presented and applied to distributed source coding. Adaptive and fixed checking strategies for improving the reliability of the decoded result are analyzed, and methods for estimating the performance are proposed. In the analysis, noiseless feedback and noiseless communication are assumed. Simulation results show that rate-adaptive BCH codes achieve better performance than low ...
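
    A full BCH decoder is beyond an abstract, but the feedback-driven rate adaptation itself can be illustrated with a toy scheme, not the paper's code: the decoder holds a noisy copy y of the source x and uses the feedback channel to request parity bits over progressively smaller ranges until discrepancies are isolated. This sketch only corrects ranges whose discrepancy count is odd (a single flipped bit is the clean case); real rate-adaptive codes handle general noise:

```python
def adaptive_syndrome_decode(x, y):
    """Toy rate-adaptive decoding with a feedback channel.

    The encoder knows x; the decoder holds a noisy copy y and, for any
    range whose parity disagrees, requests parities of its two halves,
    recursing until a single flipped bit is isolated and corrected.
    Returns (decoded, number_of_parity_bits_sent).
    """
    y = list(y)
    sent = 0

    def parity(bits, lo, hi):
        return sum(bits[lo:hi]) % 2

    def resolve(lo, hi):
        nonlocal sent
        sent += 1                        # one parity bit crosses the channel
        if parity(x, lo, hi) == parity(y, lo, hi):
            return                       # no (odd-count) discrepancy here
        if hi - lo == 1:
            y[lo] ^= 1                   # isolated the flipped bit
            return
        mid = (lo + hi) // 2
        resolve(lo, mid)
        resolve(mid, hi)

    resolve(0, len(x))
    return y, sent

x = [1, 0, 1, 1, 0, 0, 1, 0]
y = [1, 0, 1, 1, 0, 1, 1, 0]             # bit 5 flipped
decoded, sent = adaptive_syndrome_decode(x, y)
print(decoded == x, sent)  # True 7
```

    With a clean copy, a single parity bit over the whole block suffices; the rate spent grows with the noise, which is the point of adapting the rate through feedback.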

  13. NeatSort - A practical adaptive algorithm

    OpenAIRE

    La Rocca, Marcello; Cantone, Domenico

    2014-01-01

    We present a new adaptive sorting algorithm which is optimal for most disorder metrics and, more importantly, has a simple and quick implementation. On input $X$, our algorithm has a theoretical $\Omega(|X|)$ lower bound and a $\mathcal{O}(|X|\log|X|)$ upper bound, exhibiting amazing adaptive properties which make it run closer to its lower bound as disorder (computed on different metrics) diminishes. From a practical point of view, NeatSort has proven itself competitive with (and of ...
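
    The abstract does not include NeatSort itself; as a hedged illustration of what "adaptive" means for sorting, here is the classic natural merge sort, whose cost approaches $\Omega(|X|)$ as the number of pre-sorted runs (one measure of disorder) shrinks:

```python
import heapq

def natural_merge_sort(xs):
    """Adaptive sort: split the input into maximal non-decreasing
    runs, then repeatedly merge adjacent runs pairwise."""
    if not xs:
        return []
    runs, run = [], [xs[0]]
    for x in xs[1:]:
        if x >= run[-1]:
            run.append(x)          # extend the current sorted run
        else:
            runs.append(run)       # disorder detected: start a new run
            run = [x]
    runs.append(run)
    while len(runs) > 1:           # ceil(log2(#runs)) merge passes
        runs = [list(heapq.merge(runs[i], runs[i + 1])) if i + 1 < len(runs)
                else runs[i] for i in range(0, len(runs), 2)]
    return runs[0]

print(natural_merge_sort([1, 2, 5, 3, 4, 0]))  # [0, 1, 2, 3, 4, 5]
```

    An already-sorted input yields a single run and no merge passes at all, the adaptive best case; a reversed input degenerates to the usual $\mathcal{O}(|X|\log|X|)$ merge sort.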

  14. Multiple model adaptive control with mixing

    Science.gov (United States)

    Kuipers, Matthew

    Despite the remarkable theoretical accomplishments and successful applications of adaptive control, the field is not sufficiently mature to solve challenging control problems requiring strict performance and safety guarantees. Towards addressing these issues, a novel deterministic multiple-model adaptive control approach called adaptive mixing control is proposed. In this approach, adaptation comes from a high-level system called the supervisor that mixes into feedback a number of candidate controllers, each finely-tuned to a subset of the parameter space. The mixing signal, the supervisor's output, is generated by estimating the unknown parameters and, at every instant of time, calculating the contribution level of each candidate controller based on certainty equivalence. The proposed architecture provides two characteristics relevant to solving stringent, performance-driven applications. First, the full-suite of linear time invariant control tools is available. A disadvantage of conventional adaptive control is its restriction to utilizing only those control laws whose solutions can be feasibly computed in real-time, such as model reference and pole-placement type controllers. Because its candidate controllers are computed off line, the proposed approach suffers no such restriction. Second, the supervisor's output is smooth and does not necessarily depend on explicit a priori knowledge of the disturbance model. These characteristics can lead to improved performance by avoiding the unnecessary switching and chattering behaviors associated with some other multiple adaptive control approaches. The stability and robustness properties of the adaptive scheme are analyzed. It is shown that the mean-square regulation error is of the order of the modeling error. And when the parameter estimate converges to its true value, which is guaranteed if a persistence of excitation condition is satisfied, the adaptive closed-loop system converges exponentially fast to a closed

  15. Computational Modeling in Tissue Engineering

    CERN Document Server

    2013-01-01

    One of the major challenges in tissue engineering is the translation of biological knowledge on complex cell and tissue behavior into a predictive and robust engineering process. Mastering this complexity is an essential step towards clinical applications of tissue engineering. This volume discusses computational modeling tools that allow studying the biological complexity in a more quantitative way. More specifically, computational tools can help in:  (i) quantifying and optimizing the tissue engineering product, e.g. by adapting scaffold design to optimize micro-environmental signals or by adapting selection criteria to improve homogeneity of the selected cell population; (ii) quantifying and optimizing the tissue engineering process, e.g. by adapting bioreactor design to improve quality and quantity of the final product; and (iii) assessing the influence of the in vivo environment on the behavior of the tissue engineering product, e.g. by investigating vascular ingrowth. The book presents examples of each...

  16. Big data extraction with adaptive wavelet analysis (Presentation Video)

    Science.gov (United States)

    Qu, Hongya; Chen, Genda; Ni, Yiqing

    2015-04-01

    Nondestructive evaluation and sensing technology have been increasingly applied to characterize material properties and detect local damage in structures. More often than not, they generate images or data strings in which it is difficult to see any physical features without novel data extraction techniques. In the literature, popular data analysis techniques include the Short-Time Fourier Transform, Wavelet Transform, and Hilbert Transform for time efficiency and adaptive recognition. In this study, a new data analysis technique is proposed and developed by introducing an adaptive central frequency of the continuous Morlet wavelet transform so that both high frequency and time resolution can be maintained in a time-frequency window of interest. The new analysis technique is referred to as Adaptive Wavelet Analysis (AWA). This paper is organized in several sections. In the first section, finite time-frequency resolution limitations in the traditional wavelet transform are introduced. Such limitations would greatly distort the transformed signals with a significant frequency variation with time. In the second section, Short Time Wavelet Transform (STWT), similar to Short Time Fourier Transform (STFT), is defined and developed to overcome such shortcomings of the traditional wavelet transform. In the third section, by utilizing the STWT and a time-variant central frequency of the Morlet wavelet, AWA can adapt the time-frequency resolution requirement to the signal variation over time. Finally, the advantage of the proposed AWA is demonstrated in Section 4 with a ground penetrating radar (GPR) image from a bridge deck, an analytical chirp signal with a large-range sinusoidal frequency change over time, and the train-induced acceleration responses of the Tsing-Ma Suspension Bridge in Hong Kong, China. The performance of the proposed AWA will be compared with the STFT and the traditional wavelet transform.
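
    As a hedged sketch of the wavelet machinery involved (not the authors' AWA implementation; the test tone, window width, and candidate frequencies below are made up), a Morlet coefficient at a chosen time and central frequency can be computed directly, and scanning candidate frequencies shows how an adaptive central frequency could track a signal's dominant frequency over time:

```python
import math

def morlet_coeff(signal, dt, t0, fc, n_cycles=6):
    """Magnitude of the continuous Morlet wavelet coefficient of a
    sampled signal at time t0 and central frequency fc (Hz).
    The Gaussian envelope spans roughly n_cycles of the carrier."""
    sigma = n_cycles / (2 * math.pi * fc)   # envelope width in seconds
    re = im = 0.0
    for k, x in enumerate(signal):
        t = k * dt - t0
        env = math.exp(-t * t / (2 * sigma * sigma))
        re += x * env * math.cos(2 * math.pi * fc * t) * dt
        im -= x * env * math.sin(2 * math.pi * fc * t) * dt
    return math.hypot(re, im)

# A 10 Hz test tone: scanning candidate central frequencies, the response
# peaks at the tone frequency; repeating this per time step is how an
# adaptive central frequency can follow a chirp.
dt = 0.001
tone = [math.sin(2 * math.pi * 10 * k * dt) for k in range(2000)]
scores = {fc: morlet_coeff(tone, dt, t0=1.0, fc=fc) for fc in (5, 10, 20)}
print(max(scores, key=scores.get))  # 10
```

    Because sigma shrinks as fc grows, this keeps a fixed number of cycles in the window, trading time resolution against frequency resolution, the tension AWA is designed to manage.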

  17. Accurate typing of short tandem repeats from genome-wide sequencing data and its applications.

    Science.gov (United States)

    Fungtammasan, Arkarachai; Ananda, Guruprasad; Hile, Suzanne E; Su, Marcia Shu-Wei; Sun, Chen; Harris, Robert; Medvedev, Paul; Eckert, Kristin; Makova, Kateryna D

    2015-05-01

    Short tandem repeats (STRs) are implicated in dozens of human genetic diseases and contribute significantly to genome variation and instability. Yet profiling STRs from short-read sequencing data is challenging because of their high sequencing error rates. Here, we developed STR-FM, short tandem repeat profiling using flank-based mapping, a computational pipeline that can detect the full spectrum of STR alleles from short-read data, can adapt to emerging read-mapping algorithms, and can be applied to heterogeneous genetic samples (e.g., tumors, viruses, and genomes of organelles). We used STR-FM to study STR error rates and patterns in publicly available human and in-house generated ultradeep plasmid sequencing data sets. We discovered that STRs sequenced with a PCR-free protocol have up to ninefold fewer errors than those sequenced with a PCR-containing protocol. We constructed an error correction model for genotyping STRs that can distinguish heterozygous alleles containing STRs with consecutive repeat numbers. Applying our model and pipeline to Illumina sequencing data with 100-bp reads, we could confidently genotype several disease-related long trinucleotide STRs. Utilizing this pipeline, for the first time we determined the genome-wide STR germline mutation rate from a deeply sequenced human pedigree. Additionally, we built a tool that recommends minimal sequencing depth for accurate STR genotyping, depending on repeat length and sequencing read length. The required read depth increases with STR length and is lower for a PCR-free protocol. This suite of tools addresses the pressing challenges surrounding STR genotyping, and thus is of wide interest to researchers investigating disease-related STRs and STR evolution. © 2015 Fungtammasan et al.; Published by Cold Spring Harbor Laboratory Press.
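The flank-based idea can be illustrated in miniature: anchor a read by its unique flanking sequences and count the repeat units between them. This is a toy sketch only; STR-FM maps flanks with a real read aligner and models sequencing errors, whereas the exact string matching and the function name `profile_str` here are assumptions for illustration.

```python
import re

def profile_str(read, left_flank, right_flank, motif):
    """Count repeat units of `motif` between two anchoring flanks.

    Returns the repeat count, or None if the flanks do not bracket a
    clean repeat tract in this read.
    """
    pattern = (re.escape(left_flank)
               + r"((?:" + re.escape(motif) + r")+)"
               + re.escape(right_flank))
    m = re.search(pattern, read)
    if m is None:
        return None
    return len(m.group(1)) // len(motif)
```

For example, a read containing the left flank, seven copies of a trinucleotide motif, and the right flank would be typed as a 7-repeat allele; reads whose flanks are absent (or interrupted) are rejected rather than mistyped.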

  18. SU-E-J-153: MRI Based, Daily Adaptive Radiotherapy for Rectal Cancer: Contour Adaptation

    International Nuclear Information System (INIS)

    Kleijnen, J; Burbach, M; Verbraeken, T; Weggers, R; Zoetelief, A; Reerink, O; Lagendijk, J; Raaymakers, B; Asselen, B

    2014-01-01

    Purpose: A major hurdle in adaptive radiotherapy is the adaptation of the planning MRI's delineations to the daily anatomy. We therefore investigate the accuracy of, and time needed for, online clinical target volume (CTV) adaptation by radiation therapists (RTTs), to be used in MRI-guided adaptive treatments on an MRI-Linac (MRL). Methods: Sixteen patients, diagnosed with early stage rectal cancer, underwent a T2-weighted MRI prior to each fraction of short-course radiotherapy, resulting in 4–5 scans per patient. On these scans, the CTV was delineated according to guidelines by an experienced radiation oncologist (RO) and considered to be the gold standard. For each patient, the first MRI was taken as the planning MRI and matched on bony anatomy to the 3–4 daily MRIs. The planning MRI's CTV delineation was rigidly propagated to the daily MRI scans as a proposal for adaptation. Three RTTs in training then adapted the CTV according to the guidelines, after a two-hour training lecture and a two-patient (n=7) training set. To assess the inter-therapist variation, all three RTTs altered the delineations of 3 patients (n=12). One RTT altered the CTV delineations (n=53) of the remaining 11 patients. The time needed to adapt the CTV to the guidelines was registered. As a measure of agreement, the conformity index (CI) was determined between the RTTs' delineations as a group. Dice similarity coefficients were determined between the delineations of the RTT and the RO. Results: We found good agreement between the RTTs' and RO's delineations (average Dice=0.91, SD=0.03). Furthermore, the inter-observer agreement between the RTTs was high (average CI=0.94, SD=0.02). Adaptation time decreased from 10:33 min (SD=3:46) to 2:56 min (SD=1:06) between the first and last ten delineations, respectively. Conclusion: Daily CTV adaptation by RTTs seems a feasible and safe way to introduce daily, online MRI-based plan adaptation for an MRL.
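The two agreement measures reported above are standard overlap statistics on binary contour masks. The sketch below shows one common definition of each; note the abstract does not state which conformity-index variant was used, so the intersection-over-union form here is an assumption, as are the function names.

```python
import numpy as np

def dice(a, b):
    """Dice similarity coefficient between two binary masks:
    2|A∩B| / (|A| + |B|)."""
    a, b = np.asarray(a, bool), np.asarray(b, bool)
    inter = np.logical_and(a, b).sum()
    return 2.0 * inter / (a.sum() + b.sum())

def conformity_index(masks):
    """Conformity index for a group of observers, here taken as the
    volume of the common intersection over the volume of the union."""
    stack = np.asarray(masks, bool)
    inter = np.logical_and.reduce(stack, axis=0).sum()
    union = np.logical_or.reduce(stack, axis=0).sum()
    return inter / union
```

Both statistics range from 0 (no overlap) to 1 (identical delineations), which is why the reported Dice of 0.91 and CI of 0.94 indicate close agreement.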

  19. Towards sustainable adaptation to climate change: The role of ...

    African Journals Online (AJOL)

    Towards sustainable adaptation to climate change: The role of indigenous ... From the short to the long term, climate change and variability threaten human and ... to food insecurity, lack of potable water and poor health, but also the cultural ...

  20. MOBILE CLOUD COMPUTING APPLIED TO HEALTHCARE APPROACH

    OpenAIRE

    Omar AlSheikSalem

    2016-01-01

    In the past few years, it became clear that mobile cloud computing, established by integrating mobile computing with cloud computing, gains in both storage space and processing speed. Integrating healthcare applications and services is one of the data-intensive approaches that can be adapted to mobile cloud computing. This work proposes a framework for global healthcare computing that combines mobile computing and cloud computing. This approach leads to integrating all of ...