WorldWideScience

Sample records for computer adaptive short

  1. Letting the CAT out of the bag: comparing computer adaptive tests and an 11-item short form of the Roland-Morris Disability Questionnaire.

    Science.gov (United States)

    Cook, Karon F; Choi, Seung W; Crane, Paul K; Deyo, Richard A; Johnson, Kurt L; Amtmann, Dagmar

    2008-05-20

    A post hoc simulation of a computer adaptive administration of the items of a modified version of the Roland-Morris Disability Questionnaire. To evaluate the effectiveness of adaptive administration of back pain-related disability items compared with a fixed 11-item short form. Short form versions of the Roland-Morris Disability Questionnaire have been developed. An alternative to paper-and-pencil short forms is to administer items adaptively so that items are presented based on a person's responses to previous items. Theoretically, this allows precise estimation of back pain disability with administration of only a few items. Data were gathered from 2 previously conducted studies of persons with back pain. An item response theory model was used to calibrate scores based on all items, items of a paper-and-pencil short form, and several computer adaptive tests (CATs). Correlations between each CAT condition and scores based on a 23-item version of the Roland-Morris Disability Questionnaire ranged from 0.93 to 0.98. Compared with an 11-item short form, an 11-item CAT produced scores that were significantly more highly correlated with scores based on the 23-item scale. CATs with even fewer items also produced scores that were highly correlated with scores based on all items. For example, scores from a 5-item CAT had a correlation of 0.93 with full scale scores. Seven- and 9-item CATs correlated at 0.95 and 0.97, respectively. A CAT with a standard-error-based stopping rule produced scores that correlated at 0.95 with full scale scores. A CAT-based back pain-related disability measure may be a valuable tool for use in clinical and research contexts. Use of CAT for other common measures in back pain research, such as other functional scales or measures of psychological distress, may offer similar advantages.
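
The adaptive administration simulated in this record can be sketched as a maximum-information item-selection loop with a standard-error stopping rule. The sketch below uses a one-parameter (Rasch) model with illustrative item difficulties and a hypothetical `respond` callback supplying a 0/1 answer; it is not the authors' calibration or item bank.

```python
import math

def rasch_prob(theta, b):
    """Probability of endorsing an item of difficulty b at ability theta (Rasch model)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def item_information(theta, b):
    """Fisher information of a Rasch item at theta: p * (1 - p)."""
    p = rasch_prob(theta, b)
    return p * (1.0 - p)

def adaptive_test(difficulties, respond, se_stop=0.4, max_items=11):
    """Administer items adaptively: always pick the most informative
    remaining item, re-estimate ability after each response, and stop
    when the standard error drops below se_stop or max_items is reached."""
    theta = 0.0
    administered = []                # (difficulty, response) pairs
    remaining = list(difficulties)
    while remaining and len(administered) < max_items:
        # Select the item with maximum information at the current estimate.
        b = max(remaining, key=lambda d: item_information(theta, d))
        remaining.remove(b)
        administered.append((b, respond(b)))     # response: 1 or 0
        # One Newton-Raphson step on the log-likelihood to update theta.
        info = sum(item_information(theta, bi) for bi, _ in administered)
        score = sum(xi - rasch_prob(theta, bi) for bi, xi in administered)
        theta += score / info
        if 1.0 / math.sqrt(info) < se_stop:      # standard-error stopping rule
            break
    return theta, len(administered)
```

With a fixed-length cap this mimics the 5-, 7-, 9-, and 11-item CAT conditions; with the `se_stop` rule it mimics the standard-error-based condition described above.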

  2. Towards psychologically adaptive brain-computer interfaces

    Science.gov (United States)

    Myrden, A.; Chau, T.

    2016-12-01

    Objective. Brain-computer interface (BCI) performance is sensitive to short-term changes in psychological states such as fatigue, frustration, and attention. This paper explores the design of a BCI that can adapt to these short-term changes. Approach. Eleven able-bodied individuals participated in a study during which they used a mental task-based EEG-BCI to play a simple maze navigation game while self-reporting their perceived levels of fatigue, frustration, and attention. In an offline analysis, a regression algorithm was trained to predict changes in these states, yielding Pearson correlation coefficients in excess of 0.45 between the self-reported and predicted states. Two means of fusing the resultant mental state predictions with mental task classification were investigated. First, single-trial mental state predictions were used to predict correct classification by the BCI during each trial. Second, an adaptive BCI was designed that retrained a new classifier for each testing sample using only those training samples for which predicted mental state was similar to that predicted for the current testing sample. Main results. Mental state-based prediction of BCI reliability exceeded chance levels. The adaptive BCI exhibited significant, but practically modest, increases in classification accuracy for five of 11 participants and no significant difference for the remaining six despite a smaller average training set size. Significance. Collectively, these findings indicate that adaptation to psychological state may allow the design of more accurate BCIs.
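
The retraining scheme described above (train a fresh classifier on only those training trials whose predicted mental state resembles the current test trial's) can be sketched as a nearest-mental-state selection step. The function name, the absolute-difference distance, and the choice of k below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def select_similar_training(X_train, y_train, state_train, state_test, k=50):
    """Return the k training trials whose predicted mental state is
    closest to the state predicted for the current test trial; a fresh
    classifier would then be trained on just these trials."""
    state_train = np.asarray(state_train)
    k = min(k, len(state_train))
    order = np.argsort(np.abs(state_train - state_test))  # nearest first
    idx = order[:k]
    return np.asarray(X_train)[idx], np.asarray(y_train)[idx]
```

The trade-off the results point to is visible here: conditioning on mental state shrinks the training set, so any accuracy gain must outweigh the cost of fewer samples.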

  3. Adaptive Computed Tomography Imaging Spectrometer Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The present proposal describes the development of an adaptive Computed Tomography Imaging Spectrometer (CTIS), or "Snapshot" spectrometer which can "instantaneously"...

  4. Obligatory and adaptive averaging in visual short-term memory.

    Science.gov (United States)

    Dubé, Chad; Sekuler, Robert

    2015-01-01

    Visual memory can draw upon averaged perceptual representations, a dependence that could be both adaptive and obligatory. In support of this idea, we review a wide range of evidence, including findings from our own lab. This evidence shows that time- and space-averaged memory representations influence detection and recognition responses, and do so without instruction to compute or report an average. Some of the work reviewed exploits fine-grained measures of retrieval from visual short-term memory to closely track the influence of stored averages on recall and recognition of briefly presented visual textures. Results show that reliance on perceptual averages is greatest when memory resources are taxed or when subjects are uncertain about the fidelity of their memory representation. We relate these findings to models of how summary statistics impact visual short-term memory, and discuss a neural signature for contexts in which perceptual averaging exerts maximal influence.

  5. QPSO-Based Adaptive DNA Computing Algorithm

    Directory of Open Access Journals (Sweden)

    Mehmet Karakose

    2013-01-01

DNA (deoxyribonucleic acid) computing, a computation model that uses DNA molecules for information storage, has been increasingly used for optimization and data analysis in recent years. However, the DNA computing algorithm has some limitations in terms of convergence speed, adaptability, and effectiveness. In this paper, a new approach for the improvement of DNA computing is proposed. This approach aims to run the DNA computing algorithm with adaptive parameters towards the desired goal using quantum-behaved particle swarm optimization (QPSO). The contributions of the proposed QPSO-based adaptive DNA computing algorithm are as follows: (1) the population size, crossover rate, maximum number of operations, enzyme and virus mutation rates, and fitness function of the DNA computing algorithm are tuned simultaneously for the adaptive process; (2) the adaptive algorithm is performed using QPSO for goal-driven progress, faster operation, and flexibility in data; and (3) a numerical realization of the DNA computing algorithm with the proposed approach is implemented in system identification. Two experiments with different systems were carried out to evaluate the performance of the proposed approach, with comparative results. Experimental results obtained with Matlab and FPGA demonstrate effective optimization, considerable convergence speed, and high accuracy relative to the basic DNA computing algorithm.
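
For reference, the QPSO update at the heart of the proposed approach can be sketched as follows. This is a generic QPSO minimizer on a toy objective, with an annealed contraction-expansion coefficient; it is not the paper's DNA-computing integration, and all parameter values are illustrative.

```python
import numpy as np

def qpso(f, dim, n_particles=20, iters=100, lo=-5.0, hi=5.0, seed=0):
    """Minimize f with quantum-behaved PSO: each particle is drawn around
    a local attractor (a random mix of its personal best and the global
    best), spread according to its distance from the mean best position.
    The contraction-expansion coefficient beta anneals from 1.0 to 0.5."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(lo, hi, (n_particles, dim))
    pbest = x.copy()
    pval = np.array([f(p) for p in pbest])
    g = pbest[pval.argmin()].copy()              # global best position
    for t in range(iters):
        beta = 1.0 - 0.5 * t / iters
        mbest = pbest.mean(axis=0)               # mean of personal bests
        phi = rng.random((n_particles, dim))
        p = phi * pbest + (1 - phi) * g          # local attractor
        u = rng.random((n_particles, dim))
        sign = np.where(rng.random((n_particles, dim)) < 0.5, -1.0, 1.0)
        x = p + sign * beta * np.abs(mbest - x) * np.log(1.0 / u)
        val = np.array([f(p_) for p_ in x])
        better = val < pval                      # update personal bests
        pbest[better] = x[better]
        pval[better] = val[better]
        g = pbest[pval.argmin()].copy()
    return g, pval.min()
```

In the paper's setting the objective being minimized would instead score a candidate set of DNA-computing parameters (population size, mutation rates, and so on) rather than a toy function.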

  6. Learning Words through Computer-Adaptive Tool

    DEFF Research Database (Denmark)

    Zhang, Chun

    2005-01-01

…the category of L2 lexical learning in a computer-adaptive learning environment. The reason to adopt a computer-adaptive tool in WPG is based on the following premises: 1. Lexical learning is incremental in nature. 2. Learning can be measured precisely with tests (objectivist epistemology). In the course of WPG construction, I stress the design of a test theory, namely, a learning algorithm. The learning algorithm is designed under such principles that users experience both 'elaborative rehearsal' (aspects in receptive and productive learning) and 'expanding rehearsal' (memory-based learning and repetitive act…). These design principles are coupled with cognitive approaches for design and analysis of learning and instruction in lexical learning.

  7. Evaluation Parameters for Computer-Adaptive Testing

    Science.gov (United States)

    Georgiadou, Elisabeth; Triantafillou, Evangelos; Economides, Anastasios A.

    2006-01-01

With the proliferation of computers in test delivery today, adaptive testing has become quite popular, especially when examinees must be classified into two categories (pass/fail, master/non-master). Several well-established organisations have provided standards and guidelines for the design and evaluation of educational and psychological testing.…

  8. Adaptation and hybridization in computational intelligence

    CERN Document Server

Fister Jr., Iztok

    2015-01-01

This carefully edited book takes a walk through recent advances in adaptation and hybridization in the Computational Intelligence (CI) domain. It consists of ten chapters divided into three parts. The first part provides background information and theoretical foundations for the CI domain, the second part deals with adaptation in CI algorithms, and the third part focuses on hybridization in CI. The book can serve as an ideal reference for researchers and students of computer science, electrical and civil engineering, economics, and the natural sciences who are confronted with solving optimization, modeling, and simulation problems. It covers recent advances in CI that encompass nature-inspired algorithms such as artificial neural networks, evolutionary algorithms, and swarm intelligence-based algorithms.

  9. A Computer Program for Short Circuit Analysis of Electric Power ...

    African Journals Online (AJOL)

This paper describes the mathematical basis and computational framework of a computer program developed for short circuit studies of electric power systems. The Short Circuit Analysis Program (SCAP) is to be used to assess the composite effects of unbalanced and balanced faults on the overall reliability of electric ...

  10. Adaptively detecting changes in Autonomic Grid Computing

    KAUST Repository

    Zhang, Xiangliang

    2010-10-01

Change detection is a common issue in many application fields because applicative data, e.g., sensor network signals, web logs, and grid-running logs, have non-stationary distributions. Toward Autonomic Grid Computing, adaptively detecting changes in a grid system can help to flag anomalies, clean noise, and report new patterns. In this paper, we propose a self-adaptive change detection approach based on the Page-Hinkley statistical test. It handles non-stationary distributions without assumptions about the data distribution and without empirical parameter settings. We validate the approach on EGEE streaming jobs and report its higher accuracy compared to other change detection methods. This change detection process also helped to discover a device fault that was not reported in the system logs. © 2010 IEEE.
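
The Page-Hinkley test at the core of this approach can be sketched in a few lines. The tolerance `delta` and `threshold` values below are illustrative empirical settings; the paper's contribution is precisely to adapt such settings automatically, so treat this as the plain, non-adaptive form of the test.

```python
class PageHinkley:
    """Page-Hinkley test for detecting an upward shift in the mean of a stream.

    delta: tolerance that makes the test robust to small fluctuations.
    threshold: detection threshold; larger values mean fewer false
    alarms but slower detection.
    """
    def __init__(self, delta=0.05, threshold=5.0):
        self.delta = delta
        self.threshold = threshold
        self.mean = 0.0      # running mean of the stream
        self.cum = 0.0       # cumulative deviation m_t
        self.min_cum = 0.0   # minimum of m_t seen so far
        self.n = 0

    def update(self, x):
        """Feed one observation; return True if a change is detected."""
        self.n += 1
        self.mean += (x - self.mean) / self.n
        self.cum += x - self.mean - self.delta
        self.min_cum = min(self.min_cum, self.cum)
        # Alarm when the cumulative deviation rises far above its minimum.
        return self.cum - self.min_cum > self.threshold
```

Feeding it a stream whose mean jumps (say, from 0 to 1) triggers an alarm a few observations after the jump; the detection delay shrinks as `threshold` is lowered, at the price of more false alarms.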

  11. Adaptive Digital Signature Design and Short-Data-Record Adaptive Filtering

    Science.gov (United States)

    2008-04-01

This report addresses adaptive digital signature design and short-data-record adaptive filtering, evaluated in terms of performance in and application to multiple-input-multiple-output (MIMO) systems. Topics include the design of minimum-PTSC binary antipodal signature sets, with underloaded systems (K ≤ L) treated as a separate case. Subject terms: short data record, adaptive filtering.

  12. Psychometric properties of the Polish adaptation of short form of the Empathy Quotient (EQ-Short).

    Science.gov (United States)

    Jankowiak-Siuda, Kamila; Kantor-Martynuska, Joanna; Siwy-Hudowska, Anna; Śmieja, Magdalena; Dobrołowicz-Konkol, Mariola; Zaraś-Wieczorek, Iwona; Siedler, Agnieszka

    2017-08-29

The purpose of the present study was to analyze the psychometric properties of the Polish-language version of the EQ-Short questionnaire, designed to measure affective and cognitive empathy. 940 subjects, aged 15-80, took part in the study. Subjects fluent in both Polish and English (N = 31) completed the questionnaire in the original English version and its Polish translation. The remaining subjects (N = 909) participated in a study designed to verify the construct validity and reliability of the Polish version of the tool. The Polish and English versions of the EQ-Short show linguistic equivalence at a satisfactory level (r = 0.80, statistically significant). The Polish-language EQ-Short has good psychometric properties (Cronbach's alpha = 0.78), comparable to the original version. In all age groups there were statistically significant sex differences in EQ-Short scores: women scored higher than men. The Polish-language adaptation of EQ-Short is linguistically and psychometrically similar to the English original and meets the criteria of a reliable tool for measuring empathy.

  13. ICAN Computer Code Adapted for Building Materials

    Science.gov (United States)

    Murthy, Pappu L. N.

    1997-01-01

The NASA Lewis Research Center has been involved in developing composite micromechanics and macromechanics theories over the last three decades. These activities have resulted in several composite mechanics theories and structural analysis codes whose applications range from material behavior design and analysis to structural component response. One of these computer codes, the Integrated Composite Analyzer (ICAN), is designed primarily to address issues related to designing polymer matrix composites and predicting their properties - including hygral, thermal, and mechanical load effects. Recently, under a cost-sharing cooperative agreement with a Fortune 500 corporation, Master Builders Inc., ICAN was adapted to analyze building materials. The high costs and technical difficulties involved with the fabrication of continuous-fiber-reinforced composites sometimes limit their use. Particulate-reinforced composites can be thought of as a viable alternative. They are as easily processed to near-net shape as monolithic materials, yet have the improved stiffness, strength, and fracture toughness that is characteristic of continuous-fiber-reinforced composites. For example, particle-reinforced metal-matrix composites show great potential for a variety of automotive applications, such as disk brake rotors, connecting rods, cylinder liners, and other high-temperature applications. Building materials, such as concrete, can be thought of as one of the oldest materials in this category of multiphase, particle-reinforced materials. The adaptation of ICAN to analyze particle-reinforced composite materials involved the development of new micromechanics-based theories. A derivative of the ICAN code, ICAN/PART, was developed and delivered to Master Builders Inc. as a part of the cooperative activity.

  14. Understanding Coral's Short-term Adaptive Ability to Changing Environment

    Science.gov (United States)

    Tisthammer, K.; Richmond, R. H.

    2016-02-01

Corals in Maunalua Bay, Hawaii are under chronic pressures from sedimentation and terrestrial runoffs containing multiple pollutants as a result of large scale urbanization that has taken place in the last 100 years. However, some individual corals thrive despite the prolonged exposure to these environmental stressors, which suggests that these individuals may have adapted to withstand such stressors. A recent survey showed that the lobe coral Porites lobata from the `high-stress' nearshore site had an elevated level of stress-induced proteins, compared to those from the `low-stress,' less polluted offshore site. To understand the genetic basis for the observed differential stress responses between the nearshore and offshore P. lobata populations, an analysis of the lineage-scale population genetic structure, as well as a reciprocal transplant experiment were conducted. The result of the genetic analysis revealed a clear genetic differentiation between P. lobata from the nearshore site and the offshore site. Following the 30-day reciprocal transplant experiment, protein expression profiles and other stress-related physiological characteristics were compared between the two populations. The experimental results suggest that the nearshore genotype can cope better with sedimentation/pollutants than the offshore genotype. This indicates that the observed genetic differentiation is due to selection for tolerance to these environmental stressors. Understanding the little-known, lineage-scale genetic variation in corals offers a critical insight into their short-term adaptive ability, which is indispensable for protecting corals from impending environmental and climate change. The results of this study also offer a valuable tool for resource managers to make effective decisions on coral reef conservation, such as designing marine protected areas that incorporate and maintain such genetic diversity, and establishing acceptable pollution run-off levels.

  15. RASCAL: A Rudimentary Adaptive System for Computer-Aided Learning.

    Science.gov (United States)

    Stewart, John Christopher

    Both the background of computer-assisted instruction (CAI) systems in general and the requirements of a computer-aided learning system which would be a reasonable assistant to a teacher are discussed. RASCAL (Rudimentary Adaptive System for Computer-Aided Learning) is a first attempt at defining a CAI system which would individualize the learning…

  16. An Adaptive Middleware for Improved Computational Performance

    DEFF Research Database (Denmark)

    Bonnichsen, Lars Frydendal

The performance improvements in computer systems over the past 60 years have been fueled by an exponential increase in energy efficiency. In recent years, the phenomenon known as the end of Dennard's scaling has slowed energy efficiency improvements, but improving computer energy efficiency is more important now than ever. Traditionally, most improvements in computer energy efficiency have come from improvements in lithography (the ability to produce smaller transistors) and computer architecture (the ability to apply those transistors efficiently). Since the end of scaling, we have seen… In this thesis we champion using software to improve energy efficiency; in particular, we develop guidelines for reasoning about and evaluating software performance on modern computers, and a middleware that has been designed for modern computers, improving computational performance both in terms of energy…

  17. Short-term effects of playing computer games on attention.

    Science.gov (United States)

    Tahiroglu, Aysegul Yolga; Celik, Gonca Gul; Avci, Ayse; Seydaoglu, Gulsah; Uzel, Mehtap; Altunbas, Handan

    2010-05-01

The main aim of the present study is to investigate the short-term cognitive effects of computer games in children with different psychiatric disorders and in normal controls. One hundred one children aged between 9 and 12 years were recruited for the study. All participants played a motor-racing game on the computer for 1 hour. The TBAG form of the Stroop task was administered to all participants twice, before playing and immediately after playing the game. Participants with improved posttest scores, compared to their pretest scores, used the computer on average 0.67 +/- 1.1 hr/day, whereas participants with worse or unaltered scores averaged 1.6 +/- 1.4 and 1.3 +/- 0.9 hr/day of computer use, respectively. According to the regression model, male gender, younger age, duration of daily computer use, and ADHD inattentive type were found to be independent risk factors for worsened posttest scores. Time spent playing computer games can exert a short-term effect on attention as measured by the Stroop test.

  18. Discrete linear canonical transform computation by adaptive method.

    Science.gov (United States)

    Zhang, Feng; Tao, Ran; Wang, Yue

    2013-07-29

    The linear canonical transform (LCT) describes the effect of quadratic phase systems on a wavefield and generalizes many optical transforms. In this paper, the computation method for the discrete LCT using the adaptive least-mean-square (LMS) algorithm is presented. The computation approaches of the block-based discrete LCT and the stream-based discrete LCT using the LMS algorithm are derived, and the implementation structures of these approaches by the adaptive filter system are considered. The proposed computation approaches have the inherent parallel structures which make them suitable for efficient VLSI implementations, and are robust to the propagation of possible errors in the computation process.
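
The LMS recursion underlying the proposed computation can be sketched in its generic system-identification form. This illustrates the adaptive filter itself, not the authors' discrete-LCT structures; the tap count and step size are illustrative.

```python
import numpy as np

def lms_filter(x, d, num_taps=8, mu=0.05):
    """Adapt FIR weights w so that the filtered input tracks the desired
    signal d.  mu is the step size; too large a value makes the
    recursion diverge.  Returns the final weights and the error signal."""
    w = np.zeros(num_taps)
    err = np.zeros(len(x))
    for n in range(num_taps - 1, len(x)):
        u = x[n - num_taps + 1:n + 1][::-1]   # [x[n], x[n-1], ...]
        y = w @ u                             # filter output
        err[n] = d[n] - y                     # instantaneous error
        w += 2 * mu * err[n] * u              # LMS gradient step
    return w, err
```

Driving the filter with white noise and a known 3-tap target system, the weights converge to the target impulse response; each tap update depends only on local data, which is the parallel, error-resilient structure the abstract highlights for VLSI implementation.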

  19. A computer-controlled adaptive antenna system

    Science.gov (United States)

    Fetterolf, P. C.; Price, K. M.

    The problem of active pattern control in multibeam or phased array antenna systems is one that is well suited to technologies based upon microprocessor feedback control systems. Adaptive arrays can be realized by incorporating microprocessors as control elements in closed-loop feedback paths. As intelligent controllers, microprocessors can detect variations in arrays and implement suitable configuration changes. The subject of this paper is the application of the Howells-Applebaum power inversion algorithm in a C-band multibeam antenna system. A proof-of-concept, microprocessor controlled, adaptive beamforming network (BFN) was designed, assembled, and subsequent tests were performed demonstrating the algorithm's capacity for nulling narrowband jammers.

  20. Active adaptive sound control in a duct - A computer simulation

    Science.gov (United States)

    Burgess, J. C.

    1981-09-01

    A digital computer simulation of adaptive closed-loop control for a specific application (sound cancellation in a duct) is discussed. The principal element is an extension of Sondhi's adaptive echo canceler and Widrow's adaptive noise canceler from signal processing to control. Thus, the adaptive algorithm is based on the LMS gradient search method. The simulation demonstrates that one or more pure tones can be canceled down to the computer bit noise level (-120 dB). When additive white noise is present, pure tones can be canceled to at least 10 dB below the noise spectrum level for SNRs down to at least 0 dB. The underlying theory suggests that the algorithm allows tracking tones with amplitudes and frequencies that change more slowly with time than the adaptive filter adaptation rate. It also implies that the method can cancel narrow-band sound in the presence of spectrally overlapping broadband sound.

  1. Computing three-point functions for short operators

    Energy Technology Data Exchange (ETDEWEB)

    Bargheer, Till [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Institute for Advanced Study, Princeton, NJ (United States). School of Natural Sciences; Minahan, Joseph A.; Pereira, Raul [Uppsala Univ. (Sweden). Dept. of Physics and Astronomy

    2013-11-15

We compute the three-point structure constants for short primary operators of N=4 super Yang-Mills theory to leading order in 1/√(λ) by mapping the problem to a flat-space string theory calculation. We check the validity of our procedure by comparing to known results for three chiral primaries. We then compute the three-point functions for any combination of chiral and non-chiral primaries, with the non-chiral primaries all dual to string states at the first massive level. Along the way we find many cancellations that leave us with simple expressions, suggesting that integrability is playing an important role.

  2. Computer Adaptive Testing, Big Data and Algorithmic Approaches to Education

    Science.gov (United States)

    Thompson, Greg

    2017-01-01

    This article critically considers the promise of computer adaptive testing (CAT) and digital data to provide better and quicker data that will improve the quality, efficiency and effectiveness of schooling. In particular, it uses the case of the Australian NAPLAN test that will become an online, adaptive test from 2016. The article argues that…

  3. A Guide to Computer Adaptive Testing Systems

    Science.gov (United States)

    Davey, Tim

    2011-01-01

    Some brand names are used generically to describe an entire class of products that perform the same function. "Kleenex," "Xerox," "Thermos," and "Band-Aid" are good examples. The term "computerized adaptive testing" (CAT) is similar in that it is often applied uniformly across a diverse family of testing methods. Although the various members of…

  4. Computation as an emergent feature of adaptive synchronization.

    Science.gov (United States)

    Zanin, M; Papo, D; Sendiña-Nadal, I; Boccaletti, S

    2011-12-01

    We report on the spontaneous emergence of computation from adaptive synchronization of networked dynamical systems. The fundamentals are nonlinear elements, interacting in a directed graph via a coupling that adapts itself to the synchronization level between two input signals. These units can emulate different Boolean logics, and perform any computational task in a Turing sense, each specific operation being associated with a given network's motif. The resilience of the computation against noise is proven, and the general applicability is demonstrated with regard to periodic and chaotic oscillators, and excitable systems mimicking neural dynamics.

  5. Unauthorised adaptation of computer programmes - is criminalisation a solution?

    Directory of Open Access Journals (Sweden)

    L Muswaka

    2011-12-01

In Haupt t/a Softcopy v Brewers Marketing Intelligence (Pty) Ltd 2006 4 SA 458 (SCA), Haupt sought to enforce a copyright claim in the Data Explorer computer programme against Brewers Marketing Intelligence (Pty) Ltd. His claim was dismissed in the High Court and he appealed to the Supreme Court of Appeal. The Court held that copyright in the Data Explorer programme vested in Haupt. Haupt acquired copyright in the Data Explorer programme regardless of the fact that the programme was the result of an unauthorised adaptation of the Project AMPS programme, which belonged to Brewers Marketing Intelligence (Pty) Ltd. This case note inter alia analyses the possibility of an author being sued for infringement even though he has acquired copyright in a work that he created by making unauthorised adaptations to another's copyright material. Furthermore, it examines whether or not the law adequately protects copyright owners in situations where infringement takes the form of unauthorised adaptations of computer programmes. It is argued that the protection afforded by the Copyright Act 98 of 1978 (Copyright Act) in terms of section 27(1) to copyright owners of computer programmes is narrowly defined. It excludes from its ambit of criminal liability the act of making an unauthorised adaptation of a computer programme. The issue considered is therefore whether or not the unauthorised adaptation of computer programmes should attract a criminal sanction. In addressing this issue, and with the aim of making recommendations, the legal position in the United Kingdom (UK) is analysed. From the analysis it is recommended that the Copyright Act be amended by the insertion of a new section, section 27(1A), which will make the act of making an unauthorised adaptation of a computer programme an offence. This recommended section will close the gap that currently exists in our law with regard to unauthorised adaptations of computer programmes.

  6. Short-Term Neural Adaptation to Simultaneous Bifocal Images

    Science.gov (United States)

    Radhakrishnan, Aiswaryah; Dorronsoro, Carlos; Sawides, Lucie; Marcos, Susana

    2014-01-01

    Simultaneous vision is an increasingly used solution for the correction of presbyopia (the age-related loss of ability to focus near images). Simultaneous Vision corrections, normally delivered in the form of contact or intraocular lenses, project on the patient's retina a focused image for near vision superimposed with a degraded image for far vision, or a focused image for far vision superimposed with the defocused image of the near scene. It is expected that patients with these corrections are able to adapt to the complex Simultaneous Vision retinal images, although the mechanisms or the extent to which this happens is not known. We studied the neural adaptation to simultaneous vision by studying changes in the Natural Perceived Focus and in the Perceptual Score of image quality in subjects after exposure to Simultaneous Vision. We show that Natural Perceived Focus shifts after a brief period of adaptation to a Simultaneous Vision blur, similar to adaptation to Pure Defocus. This shift strongly correlates with the magnitude and proportion of defocus in the adapting image. The magnitude of defocus affects perceived quality of Simultaneous Vision images, with 0.5 D defocus scored lowest and beyond 1.5 D scored “sharp”. Adaptation to Simultaneous Vision shifts the Perceptual Score of these images towards higher rankings. Larger improvements occurred when testing simultaneous images with the same magnitude of defocus as the adapting images, indicating that wearing a particular bifocal correction improves the perception of images provided by that correction. PMID:24664087

  7. Short-term neural adaptation to simultaneous bifocal images.

    Directory of Open Access Journals (Sweden)

    Aiswaryah Radhakrishnan

Simultaneous vision is an increasingly used solution for the correction of presbyopia (the age-related loss of ability to focus near images). Simultaneous Vision corrections, normally delivered in the form of contact or intraocular lenses, project on the patient's retina a focused image for near vision superimposed with a degraded image for far vision, or a focused image for far vision superimposed with the defocused image of the near scene. It is expected that patients with these corrections are able to adapt to the complex Simultaneous Vision retinal images, although the mechanisms or the extent to which this happens is not known. We studied the neural adaptation to simultaneous vision by studying changes in the Natural Perceived Focus and in the Perceptual Score of image quality in subjects after exposure to Simultaneous Vision. We show that Natural Perceived Focus shifts after a brief period of adaptation to a Simultaneous Vision blur, similar to adaptation to Pure Defocus. This shift strongly correlates with the magnitude and proportion of defocus in the adapting image. The magnitude of defocus affects perceived quality of Simultaneous Vision images, with 0.5 D defocus scored lowest and beyond 1.5 D scored "sharp". Adaptation to Simultaneous Vision shifts the Perceptual Score of these images towards higher rankings. Larger improvements occurred when testing simultaneous images with the same magnitude of defocus as the adapting images, indicating that wearing a particular bifocal correction improves the perception of images provided by that correction.

  8. Short-Pulse Laser-Matter Computational Workshop Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Town, R; Tabak, M

    2004-11-02

For three days at the end of August 2004, 55 plasma scientists met at the Four Points by Sheraton in Pleasanton to discuss some of the critical issues associated with the computational aspects of the interaction of short-pulse high-intensity lasers with matter. The workshop was organized around the following six key areas: (1) laser propagation/interaction through various density plasmas: micro scale; (2) anomalous electron transport effects: from micro to meso scale; (3) electron transport through plasmas: from meso to macro scale; (4) ion beam generation, transport, and focusing; (5) "atomic-scale" electron and proton stopping powers; and (6) Kα diagnostics.

  9. Translation project adaptation for MT-enhanced computer assisted translation

    OpenAIRE

    Cettolo, Mauro; Bertoldi, Nicola; Federico, Marcello; Schwenk, Holger; Barrault, loïc; Servan, Christophe

    2014-01-01

    International audience; The effective integration of MT technology into computer-assisted translation tools is a challenging topic both for academic research and the translation industry. In particular, professional translators consider it crucial that MT systems be able to adapt to their feedback. In this paper, we propose an adaptation scheme to tune a statistical MT system to a translation project using small amounts of post-edited texts, like those generated by a single user in even just one da...

  10. Towards a neuro-computational account of prism adaptation.

    Science.gov (United States)

    Petitet, Pierre; O'Reilly, Jill X; O'Shea, Jacinta

    2017-12-14

    Prism adaptation has a long history as an experimental paradigm used to investigate the functional and neural processes that underlie sensorimotor control. In the neuropsychology literature, prism adaptation behaviour is typically explained by reference to a traditional cognitive psychology framework that distinguishes putative functions, such as 'strategic control' versus 'spatial realignment'. This theoretical framework lacks conceptual clarity, quantitative precision and explanatory power. Here, we advocate for an alternative computational framework that offers several advantages: 1) an algorithmic explanatory account of the computations and operations that drive behaviour; 2) expressed in quantitative mathematical terms; 3) embedded within a principled theoretical framework (Bayesian decision theory, state-space modelling); 4) that offers a means to generate and test quantitative behavioural predictions. This computational framework offers a route towards mechanistic neurocognitive explanations of prism adaptation behaviour. Thus it constitutes a conceptual advance compared to the traditional theoretical framework. In this paper, we illustrate how Bayesian decision theory and state-space models offer principled explanations for a range of behavioural phenomena in the field of prism adaptation (e.g. visual capture, magnitude of visual versus proprioceptive realignment, spontaneous recovery and dynamics of adaptation memory). We argue that this explanatory framework can advance understanding of the functional and neural mechanisms that implement prism adaptation behaviour, by enabling quantitative tests of hypotheses that go beyond merely descriptive mapping claims that 'brain area X is (somehow) involved in psychological process Y'. Copyright © 2017. Published by Elsevier Ltd.
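    The state-space account referenced in this abstract is commonly formulated as one or more hidden states updated trial-by-trial from error feedback. Below is a minimal two-state sketch of that idea; all parameter values and names are illustrative and are not taken from the paper.

```python
def simulate_adaptation(n_trials, perturbation, a_fast=0.6, b_fast=0.3,
                        a_slow=0.99, b_slow=0.05):
    """Two-state model: on each trial the motor output is the sum of a fast
    and a slow hidden state; both learn from the visual error, with
    different retention (a) and learning (b) rates."""
    x_fast = x_slow = 0.0
    outputs = []
    for _ in range(n_trials):
        output = x_fast + x_slow
        error = perturbation - output               # error driving learning
        x_fast = a_fast * x_fast + b_fast * error   # learns fast, forgets fast
        x_slow = a_slow * x_slow + b_slow * error   # learns slowly, retains
        outputs.append(output)
    return outputs

out = simulate_adaptation(100, perturbation=10.0)
# output ramps from 0 toward (most of) the 10-degree perturbation
```

    The fast state dominates early adaptation and decays quickly, while the slow state retains learning; this separation is how such models capture phenomena like spontaneous recovery and adaptation memory.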

  11. Vergence Adaptation to Short-Duration Stimuli in Early Childhood.

    Science.gov (United States)

    Babinsky, Erin; Sreenivasan, Vidhyapriya; Candy, T Rowan

    2016-03-01

    To investigate whether nonstrabismic typically developing young children are capable of exhibiting vergence adaptation. Fifteen adults (19.5-35.8 years) and 34 children (2.5-7.3 years) provided usable data. None wore habitual refractive correction. Eye position and accommodation were recorded using Purkinje image eye tracking and eccentric photorefraction (MCS PowerRefractor). Vergence was measured in three conditions while the participant viewed naturalistic targets at 33 cm. Viewing was monocular for at least 60 seconds and then binocular for either 5 seconds (5-second condition), 60 seconds (60-second), or 60 seconds through a 10-pd base-out prism (prism 60-second). The right eye was then occluded again for 60 seconds and an exponential function was fit to these data to assess the impact of adaptation on alignment. The 63% time constant was significantly longer for the prism 60-second condition (mean = 11.5 seconds) compared to both the 5-second (5.3 seconds; P = 0.015) and the 60-second conditions (7.1 seconds; P = 0.035), with no significant difference between children and adults (P > 0.4). Correlations between the 63% time constant (prism 60-second condition) and age, refractive error, interpupillary distance (IPD), or baseline heterophoria were not significant (P > 0.4). The final stable monocular alignment, measured after binocular viewing, was similar to the baseline initial alignment across all conditions and ages. For a limited-duration near task, 2- to 7-year-old children showed comparable levels of vergence adaptation to adults. In a typically developing visual system, where IPD and refractive error are maturing, this adaptation could help maintain eye alignment.
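    The 63% time constant reported above is the parameter τ of an exponential fit to the post-occlusion alignment data. The sketch below extracts τ from synthetic data with a log-linear least-squares fit; the study's exact fitting procedure is not specified here, and all data and values are synthetic.

```python
import math

def time_constant(times, positions, baseline, amplitude):
    """Estimate the 63% time constant tau of an exponential decay
    y(t) = baseline + amplitude * exp(-t / tau) by a log-linear
    least-squares fit on the normalized residuals."""
    xs, ys = [], []
    for t, y in zip(times, positions):
        residual = (y - baseline) / amplitude
        if residual > 0:                      # log is defined only here
            xs.append(t)
            ys.append(math.log(residual))
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
            / sum((x - mean_x) ** 2 for x in xs)
    return -1.0 / slope   # tau: time for 63% of the change to occur

# synthetic decay with tau = 7 s, sampled at 1 Hz for 60 s
ts = list(range(60))
ys = [2.0 + 5.0 * math.exp(-t / 7.0) for t in ts]
tau = time_constant(ts, ys, baseline=2.0, amplitude=5.0)
# on noiseless data the fit recovers tau = 7 seconds
```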

  12. Adaptive Tracking Control for Robots With an Interneural Computing Scheme.

    Science.gov (United States)

    Tsai, Feng-Sheng; Hsu, Sheng-Yi; Shih, Mau-Hsiang

    2017-01-24

    Adaptive tracking control of mobile robots requires the ability to follow a trajectory generated by a moving target. The conventional analysis of adaptive tracking uses energy minimization to study the convergence and robustness of the tracking error when the mobile robot follows a desired trajectory. However, in the case that the moving target generates trajectories with uncertainties, a common Lyapunov-like function for energy minimization may be extremely difficult to determine. Here, to solve the adaptive tracking problem with uncertainties, we wish to implement an interneural computing scheme in the design of a mobile robot for behavior-based navigation. The behavior-based navigation adopts an adaptive plan of behavior patterns learning from the uncertainties of the environment. The characteristic feature of the interneural computing scheme is the use of neural path pruning with rewards and punishment interacting with the environment. On this basis, the mobile robot can be exploited to change its coupling weights in paths of neural connections systematically, which can then inhibit or enhance the effect of flow elimination in the dynamics of the evolutionary neural network. Such dynamical flow translation ultimately leads to robust sensory-to-motor transformations adapting to the uncertainties of the environment. A simulation result shows that the mobile robot with the interneural computing scheme can perform fault-tolerant behavior of tracking by maintaining suitable behavior patterns at high frequency levels.

  13. Estimating Skin Cancer Risk: Evaluating Mobile Computer-Adaptive Testing.

    Science.gov (United States)

    Djaja, Ngadiman; Janda, Monika; Olsen, Catherine M; Whiteman, David C; Chien, Tsair-Wei

    2016-01-22

    Response burden is a major detriment to questionnaire completion rates. Computer adaptive testing may offer advantages over non-adaptive testing, including reduction of numbers of items required for precise measurement. Our aim was to compare the efficiency of non-adaptive (NAT) and computer adaptive testing (CAT) facilitated by Partial Credit Model (PCM)-derived calibration to estimate skin cancer risk. We used a random sample from a population-based Australian cohort study of skin cancer risk (N=43,794). All 30 items of the skin cancer risk scale were calibrated with the Rasch PCM. A total of 1000 cases generated following a normal distribution (mean [SD] 0 [1]) were simulated using three Rasch models with three fixed-item (dichotomous, rating scale, and partial credit) scenarios, respectively. We calculated the comparative efficiency and precision of CAT and NAT (shortening of questionnaire length and the count difference number ratio less than 5% using independent t tests). We found that use of CAT led to smaller person standard error of the estimated measure than NAT, with substantially higher efficiency but no loss of precision, reducing response burden by 48%, 66%, and 66% for dichotomous, Rating Scale Model, and PCM models, respectively. CAT-based administrations of the skin cancer risk scale could substantially reduce participant burden without compromising measurement precision. A mobile computer adaptive test was developed to help people efficiently assess their skin cancer risk.
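    The efficiency gain from CAT comes from selecting, at each step, the unadministered item that is most informative at the current ability estimate, then re-estimating ability after each response. A minimal sketch using the dichotomous Rasch model follows; the study used the Partial Credit Model, and the item difficulties, deterministic respondent, and gradient-ascent estimator here are simplifications for illustration.

```python
import math

def rasch_p(theta, b):
    """Probability of a positive response under the dichotomous Rasch model."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def next_item(theta, remaining):
    """Rasch Fisher information p*(1-p) peaks when difficulty equals
    ability, so pick the remaining item closest to the estimate."""
    return min(remaining, key=lambda b: abs(b - theta))

def update_theta(responses, step=0.05, iters=200):
    """Crude maximum-likelihood ability update by gradient ascent."""
    theta = 0.0
    for _ in range(iters):
        grad = sum(x - rasch_p(theta, b) for b, x in responses)
        theta += step * grad
    return theta

# Simulate a short CAT (deterministic demo respondent: answers correctly
# whenever the model success probability exceeds one half).
bank = [-2.0, -1.0, -0.5, 0.0, 0.5, 1.0, 2.0]   # illustrative difficulties
true_theta = 0.8
theta, responses, remaining = 0.0, [], set(bank)
for _ in range(5):
    b = next_item(theta, remaining)
    remaining.remove(b)
    x = 1 if rasch_p(true_theta, b) > 0.5 else 0
    responses.append((b, x))
    theta = update_theta(responses)
```

    Five adaptively chosen items already bracket the respondent's ability, which is the mechanism behind the response-burden reductions reported above.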

  14. X-Y plotter adapter developed for SDS-930 computer

    Science.gov (United States)

    Robertson, J. B.

    1968-01-01

    Graphical Display Adapter provides a real time display for digital computerized experiments. This display uses a memory oscilloscope which records a single trace until erased. It is a small hardware unit which interfaces with the J-box feature of the SDS-930 computer to either an X-Y plotter or a memory oscilloscope.

  15. Computer Adaptive Testing for Small Scale Programs and Instructional Systems

    Science.gov (United States)

    Rudner, Lawrence M.; Guo, Fanmin

    2011-01-01

    This study investigates measurement decision theory (MDT) as an underlying model for computer adaptive testing when the goal is to classify examinees into one of a finite number of groups. The first analysis compares MDT with a popular item response theory model and finds little difference in terms of the percentage of correct classifications. The…

  16. An adaptive random search for short term generation scheduling with network constraints.

    Directory of Open Access Journals (Sweden)

    J A Marmolejo

    Full Text Available This paper presents an adaptive random search approach to a short-term generation scheduling problem with network constraints, which determines the startup and shutdown schedules of thermal units over a given planning horizon. In this model, we consider the transmission network through capacity limits and line losses. The mathematical model is stated as a Mixed Integer Non-Linear Problem with binary variables. The proposed heuristic is a population-based method that generates a set of new potential solutions via a random search strategy based on the Markov Chain Monte Carlo method. The key feature of the proposed method is that the noise level of the random search is adaptively controlled in order to explore and exploit the entire search space. To improve the solutions, we couple a local search into the random search process. Several test systems are presented to evaluate the performance of the proposed heuristic, and a commercial optimizer is used to compare the quality of the solutions it provides. The proposed algorithm achieved a significant reduction in computational effort with respect to the full-scale outer approximation commercial solver. Numerical results show the potential and robustness of our approach.
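    The core idea of adaptively controlling the noise level of a random search can be sketched on a toy continuous problem. This generic accept-if-better scheme is a simplification for illustration, not the paper's population-based Markov Chain Monte Carlo heuristic, and all constants are made up.

```python
import random

def adaptive_random_search(f, x0, iters=2000, sigma=1.0, seed=1):
    """Minimize f by Gaussian perturbations whose scale (the 'noise
    level') adapts online: widen it after an improvement to keep
    exploring, shrink it after a failure to exploit the current region."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    for _ in range(iters):
        cand = x + rng.gauss(0.0, sigma)
        f_cand = f(cand)
        if f_cand < fx:                 # accept only improvements
            x, fx = cand, f_cand
            sigma *= 1.1                # success: widen the search
        else:
            sigma *= 0.98               # failure: focus the search
        sigma = min(max(sigma, 1e-6), 10.0)   # keep the noise level bounded
    return x, fx

best_x, best_f = adaptive_random_search(lambda x: (x - 3.0) ** 2, x0=-5.0)
# best_x converges toward the minimizer at 3.0
```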

  17. Adaptive synchrosqueezing based on a quilted short-time Fourier transform

    Science.gov (United States)

    Berrian, Alexander; Saito, Naoki

    2017-08-01

    In recent years, the synchrosqueezing transform (SST) has gained popularity as a method for the analysis of signals that can be broken down into multiple components determined by instantaneous amplitudes and phases. One such version of SST, based on the short-time Fourier transform (STFT), enables the sharpening of instantaneous frequency (IF) information derived from the STFT, as well as the separation of amplitude-phase components corresponding to distinct IF curves. However, this SST is limited by the time-frequency resolution of the underlying window function, and may not resolve signals exhibiting diverse time-frequency behaviors with sufficient accuracy. In this work, we develop a framework for an SST based on a "quilted" short-time Fourier transform (SST-QSTFT), which allows adaptation to signal behavior in separate time-frequency regions through the use of multiple windows. This motivates us to introduce a discrete reassignment frequency formula based on a finite difference of the phase spectrum, ensuring computational accuracy for a wider variety of windows. We develop a theoretical framework for the SST-QSTFT in both the continuous and the discrete settings, and describe an algorithm for the automatic selection of optimal windows depending on the region of interest. Using synthetic data, we demonstrate the superior numerical performance of SST-QSTFT relative to other SST methods in a noisy context. Finally, we apply SST-QSTFT to audio recordings of animal calls to demonstrate the potential of our method for the analysis of real bioacoustic signals.

  18. Efficient computation of smoothing splines via adaptive basis sampling

    KAUST Repository

    Ma, Ping

    2015-06-24

    © 2015 Biometrika Trust. Smoothing splines provide flexible nonparametric regression estimators. However, the high computational cost of smoothing splines for large datasets has hindered their wide application. In this article, we develop a new method, named adaptive basis sampling, for efficient computation of smoothing splines in super-large samples. Except for the univariate case where the Reinsch algorithm is applicable, a smoothing spline for a regression problem with sample size n can be expressed as a linear combination of n basis functions and its computational complexity is generally O(n³). We achieve a more scalable computation in the multivariate case by evaluating the smoothing spline using a smaller set of basis functions, obtained by an adaptive sampling scheme that uses values of the response variable. Our asymptotic analysis shows that smoothing splines computed via adaptive basis sampling converge to the true function at the same rate as full basis smoothing splines. Using simulation studies and a large-scale deep earth core-mantle boundary imaging study, we show that the proposed method outperforms a sampling method that does not use the values of response variables.

  19. Short-time evolution in the adaptive immune system.

    Science.gov (United States)

    Guttenberg, Nicholas; Tabei, S M Ali; Dinner, Aaron R

    2011-09-01

    We exploit a simple model to numerically and analytically investigate the effect of enforcing a time constraint for achieving a system-wide goal during an evolutionary dynamics. This situation is relevant to finding antibody specificities in the adaptive immune response as well as to artificial situations in which an evolutionary dynamics is used to generate a desired capability in a limited number of generations. When the likelihood of finding the target phenotype is low, we find that the optimal mutation rate can exceed the error threshold, in contrast to conventional evolutionary dynamics. We also show how a logarithmic correction to the usual inverse scaling of population size with mutation rate arises. Implications for natural and artificial evolutionary situations are discussed.
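    The trade-off between mutation rate and a fixed time horizon can be illustrated with a toy evolution toward a target bitstring under a hard generation limit. This is not the model analyzed in the paper; the operators and all parameters are illustrative.

```python
import random

def evolve(mutation_rate, generations=40, pop_size=50, length=20, seed=7):
    """Toy fixed-horizon evolution toward an all-ones target string.
    Truncation selection keeps the fitter half; each parent yields two
    mutated children. Returns the best fitness reached in the horizon."""
    rng = random.Random(seed)
    target = [1] * length
    pop = [[rng.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    fitness = lambda g: sum(a == b for a, b in zip(g, target))
    best = 0
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        best = max(best, fitness(pop[0]))
        parents = pop[: pop_size // 2]
        pop = [[bit ^ (rng.random() < mutation_rate) for bit in p]
               for p in parents for _ in (0, 1)]
    return best

high = evolve(mutation_rate=0.05)
low = evolve(mutation_rate=0.0005)
# with a short horizon, too little mutation supplies too little variation
```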

  20. New challenges in grid generation and adaptivity for scientific computing

    CERN Document Server

    Formaggia, Luca

    2015-01-01

    This volume collects selected contributions from the “Fourth Tetrahedron Workshop on Grid Generation for Numerical Computations”, which was held in Verbania, Italy in July 2013. The previous editions of this Workshop were hosted by the Weierstrass Institute in Berlin (2005), by INRIA Rocquencourt in Paris (2007), and by Swansea University (2010). This book covers different, though related, aspects of the field: the generation of quality grids for complex three-dimensional geometries; parallel mesh generation algorithms; mesh adaptation, including both theoretical and implementation aspects; grid generation and adaptation on surfaces – all with an interesting mix of numerical analysis, computer science and strongly application-oriented problems.

  1. Overview of adaptive finite element analysis in computational geodynamics

    Science.gov (United States)

    May, D. A.; Schellart, W. P.; Moresi, L.

    2013-10-01

    The use of numerical models to develop insight and intuition into the dynamics of the Earth over geological time scales is a firmly established practice in the geodynamics community. As our depth of understanding grows, and hand-in-hand with improvements in analytical techniques and higher resolution remote sensing of the physical structure and state of the Earth, there is a continual need to develop more efficient, accurate and reliable numerical techniques. This is necessary to ensure that we can meet the challenge of generating robust conclusions, interpretations and predictions from improved observations. In adaptive numerical methods, the desire is generally to maximise the quality of the numerical solution for a given amount of computational effort. Neither of these terms has a unique, universal definition, but typically there is a trade off between the number of unknowns we can calculate to obtain a more accurate representation of the Earth, and the resources (time and computational memory) required to compute them. In the engineering community, this topic has been extensively examined using the adaptive finite element (AFE) method. Recently, the applicability of this technique to geodynamic processes has started to be explored. In this review we report on the current status and usage of spatially adaptive finite element analysis in the field of geodynamics. The objective of this review is to provide a brief introduction to the area of spatially adaptive finite element analysis, including a summary of different techniques to define spatial adaptation and of different approaches to guide the adaptive process in order to control the discretisation error inherent within the numerical solution. An overview of the current state of the art in adaptive modelling in geodynamics is provided, together with a discussion pertaining to the issues related to using adaptive analysis techniques and perspectives for future research in this area. Additionally, we also provide a

  2. Computer Adaptive Multistage Testing: Practical Issues, Challenges and Principles

    Directory of Open Access Journals (Sweden)

    Halil Ibrahim SARI

    2016-12-01

    Full Text Available The purpose of many tests in educational and psychological measurement is to measure test takers’ latent trait scores from responses given to a set of items. Over the years, this has been done by traditional methods (paper-and-pencil tests). However, compared to other test administration models (e.g., adaptive testing), traditional methods are extensively criticized for producing low measurement accuracy and long test lengths. Adaptive testing has been proposed to overcome these problems. There are two popular adaptive testing approaches: computerized adaptive testing (CAT) and computer adaptive multistage testing (ca-MST). The former is a well-known approach that has been predominantly used in this field, and researchers and practitioners are fairly familiar with many aspects of CAT because it has more than a hundred years of history. The same is not true for ca-MST: since it is relatively new, many researchers are not familiar with its features. The purpose of this study is to closely examine the characteristics of ca-MST, including its working principle, the adaptation procedure called the routing method, test assembly, and scoring, and to provide an overview that draws researchers’ attention to ca-MST and encourages them to contribute to research in this area. Books, software, and future work for ca-MST are also discussed.

  3. Computation emerges from adaptive synchronization of networking neurons.

    Directory of Open Access Journals (Sweden)

    Massimiliano Zanin

    Full Text Available The activity of networking neurons is largely characterized by the alternation of synchronous and asynchronous spiking sequences. One of the most relevant challenges that scientists are facing today is relating that evidence to the fundamental mechanisms through which the brain computes and processes information, as well as to the arousal (or progress) of a number of neurological illnesses. In other words, the problem is how to associate an organized dynamics of interacting neural assemblies with a computational task. Here we show that computation can be seen as a feature emerging from the collective dynamics of an ensemble of networking neurons, which interact by means of adaptive dynamical connections. Namely, by associating logical states with synchronous neurons' dynamics, we show how the usual Boolean logic can be fully recovered, and a universal Turing machine can be constructed. Furthermore, we show that, besides static binary gates, a wider class of logical operations can be efficiently constructed as the fundamental computational elements interact within an adaptive network, each operation being represented by a specific motif. Our approach qualitatively differs from past attempts to encode information and compute with complex systems, where computation was instead the consequence of control loops enforcing a desired state into the specific system's dynamics. Being the result of an emergent process, the computation mechanism described here is not limited to binary Boolean logic, but can involve a much larger number of states. As such, our results can inform new concepts for the understanding of the real computing processes taking place in the brain.

  4. Landau Theory of Adaptive Integration in Computational Intelligence

    CERN Document Server

    Plewczynski, Dariusz

    2010-01-01

    Computational Intelligence (CI) is a sub-branch of Artificial Intelligence paradigm focusing on the study of adaptive mechanisms to enable or facilitate intelligent behavior in complex and changing environments. There are several paradigms of CI [like artificial neural networks, evolutionary computations, swarm intelligence, artificial immune systems, fuzzy systems and many others], each of these has its origins in biological systems [biological neural systems, natural Darwinian evolution, social behavior, immune system, interactions of organisms with their environment]. Most of those paradigms evolved into separate machine learning (ML) techniques, where probabilistic methods are used complementary with CI techniques in order to effectively combine elements of learning, adaptation, evolution and Fuzzy logic to create heuristic algorithms that are, in some sense, intelligent. The current trend is to develop consensus techniques, since no single machine learning algorithms is superior to others in all possible...

  5. An Adaptive Reordered Method for Computing PageRank

    Directory of Open Access Journals (Sweden)

    Yi-Ming Bu

    2013-01-01

    Full Text Available We propose an adaptive reordered method to deal with the PageRank problem. It has been shown that one can reorder the hyperlink matrix of PageRank problem to calculate a reduced system and get the full PageRank vector through forward substitutions. This method can provide a speedup for calculating the PageRank vector. We observe that in the existing reordered method, the cost of the recursively reordering procedure could offset the computational reduction brought by minimizing the dimension of linear system. With this observation, we introduce an adaptive reordered method to accelerate the total calculation, in which we terminate the reordering procedure appropriately instead of reordering to the end. Numerical experiments show the effectiveness of this adaptive reordered method.
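    For context, the baseline that reordering methods accelerate is the standard power-iteration computation of the PageRank vector. The sketch below shows that baseline on a tiny graph; the reordering scheme itself, which reduces the linear system and recovers the full vector by forward substitutions, is not reproduced here.

```python
def pagerank(links, alpha=0.85, iters=100):
    """Power-iteration PageRank over an adjacency dict {node: [out-links]}.
    alpha is the damping factor; dangling nodes spread rank uniformly."""
    nodes = list(links)
    n = len(nodes)
    rank = {u: 1.0 / n for u in nodes}
    for _ in range(iters):
        new = {u: (1.0 - alpha) / n for u in nodes}
        for u in nodes:
            out = links[u]
            if out:
                share = alpha * rank[u] / len(out)
                for v in out:
                    new[v] += share
            else:                       # dangling node
                for v in nodes:
                    new[v] += alpha * rank[u] / n
        rank = new
    return rank

r = pagerank({"a": ["b"], "b": ["a", "c"], "c": ["a"]})
# ranks sum to 1; "a", with two in-links, receives the most rank
```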

  6. Adaptive Methods and Parallel Computation for Partial Differential Equations

    Science.gov (United States)

    1992-05-01

    Adaptive Methods and Parallel Computation for Partial Differential Equations. Rupak Biswas (RPI, Troy, NY), Messaoud Benantar (RPI), and Joseph E. Flaherty (RPI). US Army Armament Research, Development and Engineering Center, May 1992.

  7. Adaptive Management of Computing and Network Resources for Spacecraft Systems

    Science.gov (United States)

    Pfarr, Barbara; Welch, Lonnie R.; Detter, Ryan; Tjaden, Brett; Huh, Eui-Nam; Szczur, Martha R. (Technical Monitor)

    2000-01-01

    It is likely that NASA's future spacecraft systems will consist of distributed processes which will handle dynamically varying workloads in response to perceived scientific events, the spacecraft environment, spacecraft anomalies and user commands. Since all situations and possible uses of sensors cannot be anticipated during pre-deployment phases, an approach for dynamically adapting the allocation of distributed computational and communication resources is needed. To address this, we are evolving the DeSiDeRaTa adaptive resource management approach to enable reconfigurable ground and space information systems. The DeSiDeRaTa approach embodies a set of middleware mechanisms for adapting resource allocations, and a framework for reasoning about the real-time performance of distributed application systems. The framework and middleware will be extended to accommodate (1) the dynamic aspects of intra-constellation network topologies, and (2) the complete real-time path from the instrument to the user. We are developing a ground-based testbed that will enable NASA to perform early evaluation of adaptive resource management techniques without the expense of first deploying them in space. The benefits of the proposed effort are numerous, including the ability to use sensors in new ways not anticipated at design time; the production of information technology that ties the sensor web together; the accommodation of greater numbers of missions with fewer resources; and the opportunity to leverage the DeSiDeRaTa project's expertise, infrastructure and models for adaptive resource management for distributed real-time systems.

  8. Adaptive scapula bone remodeling computational simulation: Relevance to regenerative medicine

    Energy Technology Data Exchange (ETDEWEB)

    Sharma, Gulshan B., E-mail: gbsharma@ucalgary.ca [Emory University, Department of Radiology and Imaging Sciences, Spine and Orthopaedic Center, Atlanta, Georgia 30329 (United States); University of Pittsburgh, Swanson School of Engineering, Department of Bioengineering, Pittsburgh, Pennsylvania 15213 (United States); University of Calgary, Schulich School of Engineering, Department of Mechanical and Manufacturing Engineering, Calgary, Alberta T2N 1N4 (Canada); Robertson, Douglas D., E-mail: douglas.d.robertson@emory.edu [Emory University, Department of Radiology and Imaging Sciences, Spine and Orthopaedic Center, Atlanta, Georgia 30329 (United States); University of Pittsburgh, Swanson School of Engineering, Department of Bioengineering, Pittsburgh, Pennsylvania 15213 (United States)

    2013-07-01

    Shoulder arthroplasty success has been attributed to many factors including, bone quality, soft tissue balancing, surgeon experience, and implant design. Improved long-term success is primarily limited by glenoid implant loosening. Prosthesis design examines materials and shape and determines whether the design should withstand a lifetime of use. Finite element (FE) analyses have been extensively used to study stresses and strains produced in implants and bone. However, these static analyses only measure a moment in time and not the adaptive response to the altered environment produced by the therapeutic intervention. Computational analyses that integrate remodeling rules predict how bone will respond over time. Recent work has shown that subject-specific two- and three dimensional adaptive bone remodeling models are feasible and valid. Feasibility and validation were achieved computationally, simulating bone remodeling using an intact human scapula, initially resetting the scapular bone material properties to be uniform, numerically simulating sequential loading, and comparing the bone remodeling simulation results to the actual scapula’s material properties. Three-dimensional scapula FE bone model was created using volumetric computed tomography images. Muscle and joint load and boundary conditions were applied based on values reported in the literature. Internal bone remodeling was based on element strain-energy density. Initially, all bone elements were assigned a homogeneous density. All loads were applied for 10 iterations. After every iteration, each bone element’s remodeling stimulus was compared to its corresponding reference stimulus and its material properties modified. The simulation achieved convergence. At the end of the simulation the predicted and actual specimen bone apparent density were plotted and compared. Location of high and low predicted bone density was comparable to the actual specimen. High predicted bone density was greater than

  9. The Cultural Adaptation Process during a Short-Term Study Abroad Experience in Swaziland

    Science.gov (United States)

    Conner, Nathan W.; Roberts, T. Grady

    2015-01-01

    Globalization continuously shapes our world and influences post-secondary education. This study explored the cultural adaptation process of participants during a short-term study abroad program. Participants experienced stages which included initial feelings, cultural uncertainty, cultural barriers, cultural negativity, academic and career growth,…

  10. Improving personality facet scores with multidimensional computer adaptive testing

    DEFF Research Database (Denmark)

    Makransky, Guido; Mortensen, Erik Lykke; Glas, Cees A W

    2013-01-01

    personality tests contain many highly correlated facets. This article investigates the possibility of increasing the precision of the NEO PI-R facet scores by scoring items with multidimensional item response theory and by efficiently administering and scoring items with multidimensional computer adaptive testing (MCAT). The increase in the precision of personality facet scores is obtained from exploiting the correlations between the facets. Results indicate that the NEO PI-R could be substantially shorter without attenuating precision when the MCAT methodology is used. Furthermore, the study shows...

  11. Short Stories via Computers in EFL Classrooms: An Empirical Study for Reading and Writing Skills

    Science.gov (United States)

    Yilmaz, Adnan

    2015-01-01

    The present empirical study scrutinizes the use of short stories via computer technologies in teaching and learning English language. The objective of the study is two-fold: to examine how short stories could be used through computer programs in teaching and learning English and to collect data about students' perceptions of this technique via…

  12. Short-term adaptations of the dynamic disparity vergence and phoria systems.

    Science.gov (United States)

    Kim, Eun H; Vicci, Vincent R; Granger-Donetti, Bérangère; Alvarez, Tara L

    2011-07-01

    The ability to adapt is critical to survival and varies between individuals. Adaptation of one motor system may be related to the ability to adapt another. This study sought to determine whether phoria adaptation was correlated with the ability to modify the dynamics of disparity vergence. Eye movements from ten subjects were recorded during dynamic disparity vergence modification and phoria adaptation experiments. Two different convergent stimuli were presented during the dynamic vergence modification experiment: a test stimulus (4° step) and a conditioning stimulus (4° double step). Dynamic disparity vergence responses were quantified by measuring the peak velocity (°/s). Phoria adaptation experiments measured the changes in phoria over a 5-min period of sustained fixation. The maximum velocity of phoria adaptation was determined from an exponential fit of the phoria data points. Phoria and dynamic disparity vergence peak velocity were both significantly modified, and the magnitude of vergence modification was strongly correlated with the maximum velocity of phoria adaptation (r > 0.89). Correlations between phoria adaptation, dynamic disparity vergence, and other oculomotor parameters can yield insights into the plasticity of short-term adaptation mechanisms.

  13. Adaptive Dynamic Process Scheduling on Distributed Memory Parallel Computers

    Directory of Open Access Journals (Sweden)

    Wei Shu

    1994-01-01

    One of the challenges in programming distributed memory parallel machines is deciding how to allocate work to processors. This problem is particularly important for computations with unpredictable dynamic behaviors or irregular structures. We present a scheme for dynamic scheduling of medium-grained processes that is useful in this context. The adaptive contracting within neighborhood (ACWN) is a dynamic, distributed, load-dependent, and scalable scheme. It deals with dynamic and unpredictable creation of processes and adapts to different systems. The scheme is described and contrasted with two other schemes that have been proposed in this context, namely the randomized allocation and the gradient model. The performance of the three schemes on an Intel iPSC/2 hypercube is presented and analyzed. The experimental results show that even though the ACWN algorithm incurs somewhat larger overhead than the randomized allocation, it achieves better performance in most cases due to its adaptiveness. Its feature of quickly spreading the work helps it outperform the gradient model in performance and scalability.
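The contracting-within-neighborhood idea can be illustrated with a toy simulation: a node keeps a newly created process unless a neighbor is strictly less loaded, in which case the process contracts into the least-loaded neighbor's queue. This is a hedged sketch of the load-dependent placement rule only (ring topology, no process migration or execution), not the published ACWN algorithm.

```python
from collections import deque

class Node:
    def __init__(self, nid):
        self.nid = nid
        self.queue = deque()
        self.neighbors = []          # filled in after construction

    def load(self):
        return len(self.queue)

    def spawn(self, task):
        """ACWN-style placement (sketch): keep the task locally unless a
        neighbor is strictly less loaded, in which case contract the task
        into the least-loaded neighbor's queue."""
        target = min(self.neighbors, key=Node.load, default=self)
        if target.load() < self.load():
            target.queue.append(task)
        else:
            self.queue.append(task)

# 4 nodes on a ring (a stand-in for the hypercube used in the paper).
nodes = [Node(i) for i in range(4)]
for n in nodes:
    n.neighbors = [nodes[(n.nid - 1) % 4], nodes[(n.nid + 1) % 4]]

for t in range(12):                  # node 0 creates all the work
    nodes[0].spawn(f"task-{t}")

loads = [n.load() for n in nodes]
```

Starting all work on one node, the rule spreads tasks across that node's immediate neighborhood rather than the whole machine, which is exactly the locality/balance trade-off such neighborhood schemes make.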

  14. Evaluation of an MMPI-A short form: implications for adaptive testing.

    Science.gov (United States)

    Archer, R P; Tirrell, C A; Elkins, D E

    2001-02-01

    Efforts to develop a viable short form of the MMPI (Hathaway & McKinley, 1943) span more than 50 years, with more recent attempts to significantly shorten the item pool focused on the use of adaptive computerized test administration. In this article, we report some psychometric properties of an MMPI-Adolescent version (MMPI-A; Butcher et al., 1992) short form based on administration of the first 150 items of this test instrument. We report results for both the MMPI-A normative sample of 1,620 adolescents and a clinical sample of 565 adolescents in a variety of treatment settings. We summarize results for the MMPI-A basic scales in terms of Pearson product-moment correlations generated between full administration and short-form administration formats and mean T-score elevations for the basic scales generated by each approach. In this investigation, we also examined single-scale and 2-point congruences found for the MMPI-A basic clinical scales as derived from standard and short-form administrations. We present the relative strengths and weaknesses of the MMPI-A short form and discuss the findings in terms of implications for attempts to shorten the item pool through the use of computerized adaptive assessment approaches.
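The full-versus-short-form comparison reported here rests on Pearson product-moment correlations between scores from the two administration formats. A minimal sketch with hypothetical raw-score totals (the study's actual samples and scales are not reproduced here):

```python
import math

def pearson(xs, ys):
    """Pearson product-moment correlation of two equal-length score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical totals: full administration vs. first-150-items short form.
full_scores  = [52, 61, 47, 70, 58, 66, 49, 73]
short_scores = [50, 63, 45, 68, 60, 64, 51, 70]
r = pearson(full_scores, short_scores)
```

In the study itself such correlations are computed per basic scale over the normative and clinical samples; a high r indicates the short form preserves the rank ordering of respondents.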

  15. A short history of optical computing: rise, decline, and evolution

    Science.gov (United States)

    Ambs, Pierre

    2009-10-01

    This paper gives a brief historical review of the development of optical computing from the early years, 60 years ago, until today. All the major inventions in the field were made in the sixties, generating a lot of enthusiasm. However, it was between 1980 and 2000 that optical computing had its golden age, with numerous new technologies and innovative optical processors designed and constructed for real applications. Today the field of optical computing has evolved, and its results benefit new research topics such as nanooptics, biophotonics, and communication systems.

  16. The future of outcomes measurement: item banking, tailored short-forms, and computerized adaptive assessment.

    Science.gov (United States)

    Cella, David; Gershon, Richard; Lai, Jin-Shei; Choi, Seung

    2007-01-01

    The use of item banks and computerized adaptive testing (CAT) begins with clear definitions of important outcomes, and references those definitions to specific questions gathered into large and well-studied pools, or "banks" of items. Items can be selected from the bank to form customized short scales, or can be administered in a sequence and length determined by a computer programmed for precision and clinical relevance. Although far from perfect, such item banks can form a common definition and understanding of human symptoms and functional problems such as fatigue, pain, depression, mobility, social function, sensory function, and many other health concepts that we can only measure by asking people directly. The support of the National Institutes of Health (NIH), as witnessed by its cooperative agreement with measurement experts through the NIH Roadmap Initiative known as PROMIS (www.nihpromis.org), is a big step in that direction. Our approach to item banking and CAT is practical; as focused on application as it is on science or theory. From a practical perspective, we frequently must decide whether to re-write and retest an item, add more items to fill gaps (often at the ceiling of the measure), re-test a bank after some modifications, or split up a bank into units that are more unidimensional, yet less clinically relevant or complete. These decisions are not easy, and yet they are rarely unforgiving. We encourage people to build practical tools that are capable of producing multiple short form measures and CAT administrations from common banks, and to further our understanding of these banks with various clinical populations and ages, so that with time the scores that emerge from these many activities begin to have not only a common metric and range, but a shared meaning and understanding across users. In this paper, we provide an overview of item banking and CAT, discuss our approach to item banking and its byproducts, describe testing options, discuss an…
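The CAT loop described above (select the item judged most informative at the current ability estimate, score the response, re-estimate, repeat) can be sketched with a toy two-parameter-logistic (2PL) bank. All item parameters, the deterministic stand-in respondent, and the fixed five-item stopping rule are hypothetical; operational CATs use calibrated banks and precision-based stopping rules.

```python
import math

BANK = [  # (discrimination a, difficulty b): hypothetical 2PL calibrations
    (1.2, -2.0), (1.5, -1.0), (1.8, -0.5), (1.6, 0.0), (2.0, 0.3),
    (1.7, 0.6), (1.9, 0.9), (1.4, 1.2), (1.3, 1.5), (1.1, 2.0),
]

def p_correct(theta, a, b):
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def info(theta, a, b):
    p = p_correct(theta, a, b)
    return a * a * p * (1.0 - p)      # 2PL Fisher information

GRID = [i / 10 - 4.0 for i in range(81)]      # theta grid on [-4, 4]

def eap(responses):
    """Expected-a-posteriori theta with a standard-normal prior."""
    post = []
    for th in GRID:
        lp = -0.5 * th * th                    # log prior (up to a constant)
        for (a, b), u in responses:
            p = p_correct(th, a, b)
            lp += math.log(p if u else 1.0 - p)
        post.append(math.exp(lp))
    z = sum(post)
    return sum(th * w for th, w in zip(GRID, post)) / z

true_theta = 0.8
theta, responses, remaining = 0.0, [], list(BANK)
for _ in range(5):                             # fixed-length 5-item CAT
    item = max(remaining, key=lambda ab: info(theta, *ab))
    remaining.remove(item)
    u = int(true_theta >= item[1])             # deterministic stand-in respondent
    responses.append((item, u))
    theta = eap(responses)
```

The estimate homes in on the region between the hardest item answered correctly and the easiest item missed, which is why a handful of well-chosen items can rival a much longer fixed form.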

  17. Cultural Adaptation and Psychometric Testing of the Short Form of Iranian Childbirth Self Efficacy Inventory

    OpenAIRE

    Khorsandi, Mahboubeh; Asghari Jafarabadi, Mohammad; Jahani, Farzaneh; Rafiei, Mohammad

    2013-01-01

    Background: To assess maternal confidence in her ability to cope with labor, a measure of childbirth self efficacy is necessary. Objectives: This paper aims to assess the cultural adaptation and psychometric testing of the short form of childbirth self-efficacy Inventory among Iranian pregnant women. Patients and Methods: In this descriptive-methodological study, we investigated 383 Iranian pregnant women in the third trimester. They were recruited from the outpatient prenatal care clinic of ...

  18. Configurable multiplier modules for an adaptive computing system

    Directory of Open Access Journals (Sweden)

    O. A. Pfänder

    2006-01-01

    The importance of reconfigurable hardware is increasing steadily. For example, the primary approach of using adaptive systems based on programmable gate arrays and configurable routing resources has gone mainstream, and high-performance programmable logic devices are rivaling traditional application-specific hardwired integrated circuits. Also, the idea of moving from the 2-D domain into a 3-D design which stacks several active layers above each other is gaining momentum in research and industry, to cope with the demand for smaller devices with a higher scale of integration. However, optimized arithmetic blocks in coarse-grain reconfigurable arrays as well as field-programmable architectures still play an important role. In countless digital systems and signal processing applications, multiplication is one of the critical challenges, where in many cases a trade-off between area usage and data throughput has to be made. But the a priori choice of word-length and number representation can also be replaced by a dynamic choice at run-time, in order to improve flexibility, area efficiency and the level of parallelism in computation. In this contribution, we look at an adaptive computing system called 3-D-SoftChip to point out which parameters are crucial for implementing flexible multiplier blocks as optimized elements for accelerated processing. The 3-D-SoftChip architecture uses a novel approach to 3-dimensional integration based on flip-chip bonding with indium bumps. The modular construction, the introduction of interfaces to realize the exchange of intermediate data, and the reconfigurable sign handling approach will be explained, as well as a beneficial way to handle and distribute the numerous required control signals.

  19. ADAPTIVE NEURO-FUZZY COMPUTING TECHNIQUE FOR PRECIPITATION ESTIMATION

    Directory of Open Access Journals (Sweden)

    Dalibor Petković

    2016-08-01

    The paper investigates the accuracy of an adaptive neuro-fuzzy computing technique in precipitation estimation. The monthly precipitation data from 29 synoptic stations in Serbia during 1946-2012 are used as case studies. Even though a number of mathematical functions have been proposed for modeling precipitation estimation, these models still suffer from disadvantages such as being very demanding in terms of calculation time. An artificial neural network (ANN) can be used as an alternative to the analytical approach since it offers advantages such as no required knowledge of internal system parameters, compact solutions for multi-variable problems and fast calculation. Given the importance of the problem, this paper presents a process constructed so as to simulate precipitation with an adaptive neuro-fuzzy inference system (ANFIS) method. ANFIS is a specific type of the ANN family and shows very good learning and prediction capabilities, which makes it an efficient tool for dealing with encountered uncertainties in any system such as precipitation. The neural network in ANFIS adjusts the parameters of the membership functions in the fuzzy logic of the fuzzy inference system (FIS). This intelligent algorithm is implemented using Matlab/Simulink and its performance is investigated. The simulation results presented in this paper show the effectiveness of the developed method.
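The core ANFIS computation, a Sugeno-type fuzzy inference whose membership and consequent parameters would normally be tuned by the neural network, can be sketched in its forward-pass form. The single input, the two rules, and every parameter value below are hypothetical; only the inference arithmetic (firing strengths, normalised weights, weighted linear consequents) is the point.

```python
import math

def gauss(x, c, s):
    """Gaussian membership function with centre c and width s."""
    return math.exp(-((x - c) ** 2) / (2 * s * s))

# Two hypothetical first-order Takagi-Sugeno rules on one input x:
#   R1: if x is LOW  then y = 0.2*x + 1.0
#   R2: if x is HIGH then y = 1.5*x + 0.5
RULES = [
    ((2.0, 1.5), (0.2, 1.0)),   # (membership (centre, width), consequent (p, q))
    ((7.0, 1.5), (1.5, 0.5)),
]

def sugeno(x):
    """ANFIS forward pass (sketch): firing strengths -> normalised weights
    -> weighted average of the linear rule consequents."""
    w = [gauss(x, c, s) for (c, s), _ in RULES]
    z = sum(w)
    return sum(wi / z * (p * x + q) for wi, (_, (p, q)) in zip(w, RULES))

y_low, y_high = sugeno(2.0), sugeno(7.0)
```

Near each rule's centre the output approaches that rule's linear consequent (0.2*2 + 1.0 = 1.4 at x = 2, and 1.5*7 + 0.5 = 11.0 at x = 7); in a real ANFIS the centres, widths, p and q are what training adjusts.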

  20. Learners' Perceptions and Illusions of Adaptivity in Computer-Based Learning Environments

    Science.gov (United States)

    Vandewaetere, Mieke; Vandercruysse, Sylke; Clarebout, Geraldine

    2012-01-01

    Research on computer-based adaptive learning environments has shown exemplary growth. Although the mechanisms of effective adaptive instruction are unraveled systematically, little is known about the relative effect of learners' perceptions of adaptivity in adaptive learning environments. As previous research has demonstrated that the learners'…

  1. Speed regulating Effects of Incentive-based Intelligent Speed Adaptation in the short and medium term

    DEFF Research Database (Denmark)

    Agerholm, Niels

    Despite massive improvements in vehicles' safety equipment, more information and a safer road network, inappropriate road safety still causes more than 250 people to be killed and several … sufficient further road safety cannot be achieved on the basis of these solutions alone, while additional solutions known as Intelligent Transport Systems, and more particularly Intelligent Speed Adaptation (ISA), can be seen as a central solution towards a safer road network. ISA can be informative: it informs the driver about … trials were developed involving an incentive-based ISA system aimed at the target groups of young drivers and commercial drivers, on a commercial basis. The trials were Pay As You Speed (PAYS) and Intelligent Speed Adaptation Commercial (ISA C). In PAYS the size of the insurance rate would depend on the driver's amount …

  2. A short course in computational geometry and topology

    CERN Document Server

    Edelsbrunner, Herbert

    2014-01-01

    With the aim to bring the subject of Computational Geometry and Topology closer to the scientific audience, this book is written in thirteen ready-to-teach sections organized in four parts: tessellations, complexes, homology, persistence. To speak to the non-specialist, detailed formalisms are often avoided in favor of lively 2- and 3-dimensional illustrations. The book is warmly recommended to everybody who loves geometry and the fascinating world of shapes.

  3. Quinoa - Adaptive Computational Fluid Dynamics, 0.2

    Energy Technology Data Exchange (ETDEWEB)

    2017-09-22

    Quinoa is a set of computational tools that enables research and numerical analysis in fluid dynamics. At this time it remains a test-bed to experiment with various algorithms using fully asynchronous runtime systems. Currently, Quinoa consists of the following tools: (1) Walker, a numerical integrator for systems of stochastic differential equations in time. It is a mathematical tool to analyze and design the behavior of stochastic differential equations. It allows the estimation of arbitrary coupled statistics and probability density functions and is currently used for the design of statistical moment approximations for multiple mixing materials in variable-density turbulence. (2) Inciter, an overdecomposition-aware finite element field solver for partial differential equations using 3D unstructured grids. Inciter is used to research asynchronous mesh-based algorithms and to experiment with coupling asynchronous to bulk-synchronous parallel code. Two planned new features of Inciter, compared to the previous release (LA-CC-16-015), to be implemented in 2017, are (a) a simple Navier-Stokes solver for ideal single-material compressible gases, and (b) solution-adaptive mesh refinement (AMR), which enables dynamically concentrating compute resources to regions with interesting physics. Using the NS-AMR problem we plan to explore how to scale such high-load-imbalance simulations, representative of large production multiphysics codes, to very large problems on very large computers using an asynchronous runtime system. (3) RNGTest, a test harness to subject random number generators to stringent statistical tests enabling quantitative ranking with respect to their quality and computational cost. (4) UnitTest, a unit test harness, running hundreds of tests per second, capable of testing serial, synchronous, and asynchronous functions. (5) MeshConv, a mesh file converter that can be used to convert 3D tetrahedron meshes from and to either of the following formats: Gmsh
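Walker's role, numerically integrating systems of stochastic differential equations in time, can be illustrated with the simplest such scheme, Euler-Maruyama, applied to an Ornstein-Uhlenbeck test problem. This is a generic sketch, not Quinoa code; the drift, diffusion, and all parameter values are assumptions.

```python
import math, random

def euler_maruyama(drift, diffusion, x0, dt, steps, rng):
    """Integrate dX = drift(X) dt + diffusion(X) dW with the Euler-Maruyama
    scheme, the most basic method of the family Walker applies to coupled
    SDE systems."""
    xs = [x0]
    for _ in range(steps):
        x = xs[-1]
        dw = rng.gauss(0.0, math.sqrt(dt))   # Brownian increment
        xs.append(x + drift(x) * dt + diffusion(x) * dw)
    return xs

# Ornstein-Uhlenbeck test problem (hypothetical parameters):
theta_r, mu, sigma = 2.0, 1.0, 0.3
rng = random.Random(42)
xs = euler_maruyama(lambda x: theta_r * (mu - x),
                    lambda x: sigma, x0=0.0, dt=0.01, steps=20000, rng=rng)
tail = xs[5000:]                             # discard the initial transient
mean_est = sum(tail) / len(tail)
var_est = sum((x - mean_est) ** 2 for x in tail) / len(tail)
# Stationary statistics: mean -> mu, variance -> sigma**2 / (2 * theta_r)
```

Checking the estimated stationary mean and variance against the closed-form values mu and sigma^2/(2*theta_r) is the kind of coupled-statistics validation the abstract describes for variable-density turbulence models, in miniature.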

  4. Passenger thermal perceptions, thermal comfort requirements, and adaptations in short- and long-haul vehicles.

    Science.gov (United States)

    Lin, Tzu-Ping; Hwang, Ruey-Lung; Huang, Kuo-Tsang; Sun, Chen-Yi; Huang, Ying-Che

    2010-05-01

    While thermal comfort in mass transportation vehicles is relevant to service quality and energy consumption, benchmarks for such comfort that reflect the thermal adaptations of passengers are currently lacking. This study reports a field experiment involving simultaneous physical measurements and a questionnaire survey that collected data from 2,129 respondents to evaluate thermal comfort in short- and long-haul buses and trains. Experimental results indicate that high air temperature, strong solar radiation, and low air movement explain why passengers feel thermally uncomfortable. The overall insulation of clothing worn by passengers and thermal adaptive behaviour in vehicles differ from those in their living and working spaces. Passengers in short-haul vehicles habitually adjust the air outlets to increase thermal comfort, while passengers in long-haul vehicles prefer to draw the drapes to reduce discomfort from extended exposure to solar radiation. The neutral temperatures for short- and long-haul vehicles are 26.2 degrees C and 27.4 degrees C, while the comfort zones are 22.4-28.9 degrees C and 22.4-30.1 degrees C, respectively. The results of this study provide a valuable reference for practitioners involved in determining the adequate control and management of in-vehicle thermal environments, as well as facilitating the design of buses and trains, ultimately contributing to efforts to achieve a balance between the thermal comfort satisfaction of passengers and energy conserving measures for air-conditioning in mass transportation vehicles.
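A neutral temperature of the kind reported here is conventionally read off a regression of mean thermal sensation vote on temperature: the temperature at which the fitted vote crosses zero. A minimal sketch with synthetic binned data, chosen for illustration only so that it reproduces the 26.2 degrees C short-haul figure; the study's real survey data are not reproduced here.

```python
def linfit(xs, ys):
    """Ordinary least-squares line fit: returns (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

# Hypothetical binned survey data: in-vehicle temperature (deg C) vs mean
# thermal sensation vote on the 7-point scale (-3 cold .. +3 hot).
temps = [22.0, 24.0, 26.0, 28.0, 30.0]
votes = [-0.84, -0.44, -0.04, 0.36, 0.76]
slope, intercept = linfit(temps, votes)
neutral_temp = -intercept / slope        # temperature where the mean vote is 0
```

A comfort zone is then typically taken as the temperature band where the fitted vote stays within some tolerance of zero (e.g. plus or minus 0.5 scale units).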

  5. A Short Course in Computational Science and Engineering

    Science.gov (United States)

    Yevick, David

    2012-05-01

    1. Introduction; 2. Octave programming; 3. Installing and running the Dev-C++ programming environment; 4. Introduction to computer and software architecture; 5. Fundamental concepts; 6. Procedural programming basics; 7. An introduction to object-oriented analysis; 8. C++ object-oriented programming syntax; 9. Arrays and matrices; 10. Input and output stream; 11. References; 12. Pointers and dynamic memory allocation; 13. Memory management; 14. The static keyword, multiple and virtual inheritance, templates and the STL library; 15. Creating a Java development environment; 16. Basic Java programming constructs; 17. Java classes and objects; 18. Advanced Java features; 19. Introductory numerical analysis; 20. Linear algebra; 21. Fourier transforms; 22. Differential equations; 23. Monte-Carlo methods; 24. Parabolic partial differential equation solvers; Index.

  6. Computational design of short pulse laser driven iron opacity experiments

    Science.gov (United States)

    Martin, M. E.; London, R. A.; Goluoglu, S.; Whitley, H. D.

    2017-02-01

    The resolution of current disagreements between solar parameters calculated from models and observations would benefit from the experimental validation of theoretical opacity models. Iron's complex ionic structure and large contribution to the opacity in the radiative zone of the sun make iron a good candidate for validation. Short pulse lasers can be used to heat buried layer targets to plasma conditions comparable to the radiative zone of the sun, and the frequency dependent opacity can be inferred from the target's measured x-ray emission. Target and laser parameters must be optimized to reach specific plasma conditions and meet x-ray emission requirements. The HYDRA radiation hydrodynamics code is used to investigate the effects of modifying laser irradiance and target dimensions on the plasma conditions, x-ray emission, and inferred opacity of iron and iron-magnesium buried layer targets. It was determined that plasma conditions are dominantly controlled by the laser energy and the tamper thickness. The accuracy of the inferred opacity is sensitive to tamper emission and optical depth effects. Experiments at conditions relevant to the radiative zone of the sun would investigate the validity of opacity theories important to resolving disagreements between solar parameters calculated from models and observations.

  7. Short-term saccadic adaptation in the macaque monkey: a binocular mechanism

    Science.gov (United States)

    Schultz, K. P.

    2013-01-01

    Saccadic eye movements are rapid transfers of gaze between objects of interest. Their duration is too short for the visual system to be able to follow their progress in time. Adaptive mechanisms constantly recalibrate the saccadic responses by detecting how close the landings are to the selected targets. The double-step saccadic paradigm is a common method to simulate alterations in saccadic gain. While the subject is responding to a first target shift, a second shift is introduced in the middle of this movement, which masks it from visual detection. The error in landing introduced by the second shift is interpreted by the brain as an error in the programming of the initial response, with gradual gain changes aimed at compensating the apparent sensorimotor mismatch. A second shift applied dichoptically to only one eye introduces disconjugate landing errors between the two eyes. A monocular adaptive system would independently modify only the gain of the eye exposed to the second shift in order to reestablish binocular alignment. Our results support a binocular mechanism. A version-based saccadic adaptive process detects postsaccadic version errors and generates compensatory conjugate gain alterations. A vergence-based saccadic adaptive process detects postsaccadic disparity errors and generates corrective nonvisual disparity signals that are sent to the vergence system to regain binocularity. This results in striking dynamical similarities between visually driven combined saccade-vergence gaze transfers, where the disparity is given by the visual targets, and the double-step adaptive disconjugate responses, where an adaptive disparity signal is generated internally by the saccadic system. PMID:23076111

  8. Computer-adaptive test to measure community reintegration of Veterans.

    Science.gov (United States)

    Resnik, Linda; Tian, Feng; Ni, Pengsheng; Jette, Alan

    2012-01-01

    The Community Reintegration of Injured Service Members (CRIS) measure consists of three scales measuring extent of, perceived limitations in, and satisfaction with community reintegration. Length of the CRIS may be a barrier to its widespread use. Using item response theory (IRT) and computer-adaptive test (CAT) methodologies, this study developed and evaluated a briefer community reintegration measure called the CRIS-CAT. Large item banks for each CRIS scale were constructed. A convenience sample of 517 Veterans responded to all items. Exploratory and confirmatory factor analyses (CFAs) were used to identify the dimensionality within each domain, and IRT methods were used to calibrate items. Accuracy and precision of CATs of different lengths were compared with the full-item bank, and data were examined for differential item functioning (DIF). CFAs supported unidimensionality of scales. Acceptable item fit statistics were found for final models. Accuracy of 10-, 15-, 20-, and variable-item CATs for all three scales was 0.88 or above. CAT precision increased with number of items administered and decreased at the upper ranges of each scale. Three items exhibited moderate DIF by sex. The CRIS-CAT demonstrated promising measurement properties and is recommended for use in community reintegration assessment.

  9. Adaptive ultra-short-term wind power prediction based on risk assessment

    DEFF Research Database (Denmark)

    Xue, Yusheng; Yu, Chen; Li, Kang

    2016-01-01

    A risk assessment based adaptive ultra-short-term wind power prediction (USTWPP) method is proposed in this paper. The method first extracts features from the historical data and splits every wind power time series (WPTS) into several subsets defined by their stationary patterns. A WPTS that does not match any of the stationary patterns is then included in a subset of non-stationary patterns. Every WPTS subset is then related to a USTWPP model which is specially selected and optimized offline based on the proposed risk assessment index. For on-line applications, the pattern of the last short WPTS is first recognized, and the relevant prediction model is applied for USTWPP. Experimental results confirm the efficacy of the proposed method.
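The split-by-pattern-then-select-a-model structure can be sketched crudely: classify a wind power time series as stationary or not, then dispatch to a matching predictor. The half-mean stationarity test, the 0.2 threshold, and the two stand-in models below are hypothetical simplifications of the paper's feature extraction and offline model selection.

```python
def classify_pattern(wpts):
    """Crude stand-in for the paper's pattern matching: call a wind-power
    series stationary when its two halves have similar means relative to
    the series' overall spread."""
    h = len(wpts) // 2
    m1 = sum(wpts[:h]) / h
    m2 = sum(wpts[h:]) / (len(wpts) - h)
    spread = max(wpts) - min(wpts) or 1.0
    return "stationary" if abs(m2 - m1) / spread < 0.2 else "non-stationary"

def predict_next(wpts):
    """Model selection per pattern: persistence for stationary series,
    linear extrapolation of the last two points otherwise."""
    if classify_pattern(wpts) == "stationary":
        return wpts[-1]
    return wpts[-1] + (wpts[-1] - wpts[-2])

flat = [5.0, 5.2, 4.9, 5.1, 5.0, 5.05, 4.95, 5.1]   # steady output
ramp = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0]     # sustained ramp
```

The paper's contribution is in how the subsets and their models are chosen (via the risk assessment index); the dispatch structure itself is this simple.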

  10. Short-Term Load Forecasting Using Adaptive Annealing Learning Algorithm Based Reinforcement Neural Network

    Directory of Open Access Journals (Sweden)

    Cheng-Ming Lee

    2016-11-01

    A reinforcement learning algorithm is proposed to improve the accuracy of short-term load forecasting (STLF) in this article. The proposed model integrates a radial basis function neural network (RBFNN), support vector regression (SVR), and an adaptive annealing learning algorithm (AALA). In the proposed methodology, firstly, the initial structure of the RBFNN is determined by using an SVR. Then, an AALA with time-varying learning rates is used to optimize the initial parameters of the SVR-RBFNN (AALA-SVR-RBFNN). In order to overcome stagnation when searching for the optimal RBFNN, particle swarm optimization (PSO) is applied to simultaneously find promising learning rates in the AALA. Finally, the short-term load demands are predicted by using the optimal RBFNN. The performance of the proposed methodology is verified on an actual load dataset from the Taiwan Power Company (TPC). Simulation results reveal that the proposed AALA-SVR-RBFNN can achieve a better load forecasting precision compared to various RBFNNs.
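The final stage of the pipeline, fitting an RBFNN to load data, reduces to a linear least-squares problem once the centres and widths are fixed. The sketch below shows only that linear half; the SVR-based initialisation, the AALA, and the PSO tuning are omitted, and the centres, width, ridge term, and synthetic "load curve" (built from the basis functions so the fit is exact) are all assumptions.

```python
import math

def rbf_features(x, centers, width):
    """Gaussian RBF activations for a scalar input."""
    return [math.exp(-((x - c) ** 2) / (2 * width ** 2)) for c in centers]

def solve(A, b):
    """Gaussian elimination with partial pivoting for the small normal system."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            for c in range(i, n + 1):
                M[r][c] -= f * M[i][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][c] * x[c] for c in range(i + 1, n))) / M[i][i]
    return x

def fit_rbfnn(xs, ys, centers, width, ridge=1e-8):
    """Least-squares output weights of a Gaussian RBF network (normal
    equations with a tiny ridge term for numerical safety)."""
    Phi = [rbf_features(x, centers, width) for x in xs]
    k = len(centers)
    A = [[sum(Phi[r][i] * Phi[r][j] for r in range(len(xs)))
          + (ridge if i == j else 0.0) for j in range(k)] for i in range(k)]
    b = [sum(Phi[r][i] * ys[r] for r in range(len(xs))) for i in range(k)]
    return solve(A, b)

def predict(x, w, centers, width):
    return sum(wi * f for wi, f in zip(w, rbf_features(x, centers, width)))

# Hypothetical hourly "load curve" on a normalised day, built from two of
# the basis functions (weights 1.0 and 0.5) so the fit can be exact.
xs = [h / 23 for h in range(24)]
centers, width = [0.0, 0.25, 0.5, 0.75, 1.0], 0.25
ys = [1.0 * math.exp(-((x - 0.25) ** 2) / (2 * width ** 2))
      + 0.5 * math.exp(-((x - 0.75) ** 2) / (2 * width ** 2)) for x in xs]
w = fit_rbfnn(xs, ys, centers, width)
err = max(abs(predict(x, w, centers, width) - y) for x, y in zip(xs, ys))
```

Everything the paper adds (SVR structure selection, annealed learning rates, PSO) is about choosing the nonlinear parameters that this sketch simply fixes.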

  11. Computer-Adaptive Testing: Implications for Students' Achievement, Motivation, Engagement, and Subjective Test Experience

    Science.gov (United States)

    Martin, Andrew J.; Lazendic, Goran

    2018-01-01

    The present study investigated the implications of computer-adaptive testing (operationalized by way of multistage adaptive testing; MAT) and "conventional" fixed order computer testing for various test-relevant outcomes in numeracy, including achievement, test-relevant motivation and engagement, and subjective test experience. It did so…

  12. Translation, cultural adaptation and validation of the English "Short form SF 12v2" into Bengali in rheumatoid arthritis patients

    NARCIS (Netherlands)

    Islam, Nazrul; Khan, Ikramul Hasan; Ferdous, Nira; Rasker, Johannes J.

    2017-01-01

    Background: To develop a culturally adapted and validated Bengali Short Form SF 12v2 among Rheumatoid arthritis (RA) patients. Methods: The English SF 12v2 was translated, adapted and back translated into and from Bengali, pre-tested by 60 patients. The Bengali SF 12v2 was administered twice with 14

  13. Age related neural adaptation following short term resistance training in women.

    Science.gov (United States)

    Bemben, M G; Murphy, R E

    2001-09-01

    This study examined the influence of age on neural facilitation and neural cross-education following short term unilateral dynamic resistance training, with the hypothesis that older women may have a diminished ability for adaptation. This was a prospective, repeated measures design. The non-dominant left arm served as a control limb and follow-up testing was performed two weeks after pretesting. Testing was conducted in the Neuromuscular Research Laboratory at the University of Oklahoma. 20 females (n=10, young (YF) 20.8+/-0.1 yrs; n=10, older (OF) 58.1+/-0.14) were assessed. 14 days of training of the right elbow flexors only. On each day, subjects performed four sets of ten repetitions using 70 percent of maximal strength of the biceps brachii. The following variables in both right and left elbow flexor muscle groups were evaluated: isometric strength (IMS), efficiency of electrical activity (EEA) and estimated upper arm cross-sectional area (CSA). There were significant increases in IMS and EEA in both trained and untrained limbs in both age groups. Short term unilateral dynamic resistance training is a sufficient stimulus to induce significant strength increases in both trained and untrained contralateral limbs, and a neural mechanism is responsible for the muscular adaptation in both young and older women. Implications exist for unilateral stroke victims, individuals with single hip or knee replacements, or single limb casts.

  14. Development and Validation of a Short-Form Adaptation of the Age-Related Vision Loss Scale: The AVL12

    Science.gov (United States)

    Horowitz, Amy; Reinhardt, Joann P.; Raykov, Tenko

    2007-01-01

    This article describes the development and evaluation of a short form of the 24-item Adaptation to Age-Related Vision Loss (AVL) scale. The evaluation provided evidence of the reliability and validity of the short form (the AVL12), for significant interindividual differences at the baseline and for individual-level change in AVL scores over time.…

  15. Passive adaptation to stress in adulthood after short-term social instability stress during adolescence in mice.

    Science.gov (United States)

    de Lima, A P N; Massoco, C O

    2017-05-01

    This study reports that short-term social instability stress (SIS) in adolescence increases passive-coping in adulthood in male mice. Short-term SIS decreased the latency of immobility and increased the frequency and time of immobility in tail suspension test. These findings support the hypothesis that adolescent stress can induce a passive adaptation to stress in adulthood, even if it is a short period of stress.

  16. Cultural adaptation and psychometric testing of the short form of Iranian childbirth self efficacy inventory.

    Science.gov (United States)

    Khorsandi, Mahboubeh; Asghari Jafarabadi, Mohammad; Jahani, Farzaneh; Rafiei, Mohammad

    2013-11-01

    To assess maternal confidence in her ability to cope with labor, a measure of childbirth self efficacy is necessary. This paper aims to assess the cultural adaptation and psychometric testing of the short form of the childbirth self-efficacy Inventory among Iranian pregnant women. In this descriptive-methodological study, we investigated 383 Iranian pregnant women in the third trimester. They were recruited from the outpatient prenatal care clinic of Taleghani Hospital and an urban health center from August to November 2011. Content validity was evaluated by a panel of specialists after adding two religious items. The women completed the inventory and the demographic characteristics questionnaire in an interview room. The internal consistency and construct validity were assessed by Cronbach's alpha and by exploratory and confirmatory factor analyses, respectively. Known-group analysis on gravidity assessed the discriminant validity of the measure. Content validity of the short form of the Iranian childbirth self-efficacy Inventory was confirmed. Factor analyses supported the conceptual two-factor structure of the measure and hence supported its construct validity. The internal consistency was approved for the total scale and both subscales. The instrument differentiated primigravid from multigravid women in the total scale and the efficacy expectancy subscale. Validity and reliability of the measure support the use of the short form of the instrument as a clinical and research instrument in measuring childbirth self-efficacy among Iranian pregnant women.

  17. Short-term effects of implemented high intensity shoulder elevation during computer work

    DEFF Research Database (Denmark)

    Larsen, Mette K; Samani, Afshin; Madeleine, Pascal

    2009-01-01

    contractions during the computer work. However, it is unknown how this may influence productivity, rate of perceived exertion (RPE) as well as activity and rest of neck-shoulder muscles during computer work. The aim of this study was to investigate short-term effects of a high intensity contraction on productivity, RPE and upper trapezius activity and rest during computer work and a subsequent pause from computer work. METHODS: 18 female computer workers performed 2 sessions of 15 min standardized computer mouse work preceded by 1 min pause with and without prior high intensity contraction of the shoulder… RESULTS: The main findings were that a high intensity shoulder elevation did not modify RPE, productivity or EMG activity of the upper trapezius during the subsequent pause and computer work. However, the high intensity contraction reduced the relative rest time of the uppermost (clavicular) trapezius…

  18. Genre-adaptive semantic computing and audio-based modelling for music mood annotation.

    OpenAIRE

    Saari, Pasi; Fazekas, Gyorgy; Eerola, Tuomas; Barthet, Mathieu; Lartillot, Olivier; Sandler, Mark

    2016-01-01

    This study investigates whether taking genre into account is beneficial for automatic music mood annotation in terms of core affects valence, arousal, and tension, as well as several other mood scales. Novel techniques employing genre-adaptive semantic computing and audio-based modelling are proposed. A technique called the ACTwg employs genre-adaptive semantic computing of mood-related social tags, whereas ACTwg-SLPwg combines semantic computing and audio-based modelling, both in a genre-adaptive…

  19. Computational identification of adaptive mutants using the VERT system

    Directory of Open Access Journals (Sweden)

    Winkler James

    2012-04-01

    Background: Evolutionary dynamics of microbial organisms can now be visualized using the Visualizing Evolution in Real Time (VERT) system, in which several isogenic strains expressing different fluorescent proteins compete during adaptive evolution and are tracked using fluorescent cell sorting to construct a population history over time. Mutations conferring enhanced growth rates can be detected by observing changes in the fluorescent population proportions. Results: Using data obtained from several VERT experiments, we construct a hidden Markov-derived model to detect these adaptive events in VERT experiments without external intervention beyond initial training. Analysis of annotated data revealed that the model achieves consensus with human annotation for 85-93% of the data points when detecting adaptive events. A method to determine the optimal time point to isolate adaptive mutants is also introduced. Conclusions: The developed model offers a new way to monitor adaptive evolution experiments without the need for external intervention, thereby simplifying adaptive evolution efforts relying on population tracking. Future efforts to construct a fully automated system to isolate adaptive mutants may find the algorithm a useful tool.
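A hidden-Markov event detector of this general shape can be sketched with two states (neutral drift vs. adaptive sweep) and Gaussian emissions on the day-to-day change in a sub-population's proportion, decoded with the standard Viterbi algorithm. The states, emission parameters, transition matrix, and data below are hypothetical; the paper's model is trained on annotated VERT data rather than hand-set.

```python
import math

def viterbi(obs, states, start, trans, mean, sd):
    """Standard Viterbi decoding with Gaussian emissions - a simplified
    stand-in for the paper's hidden-Markov-derived event detector."""
    def logn(x, m, s):  # log of the normal density
        return -0.5 * ((x - m) / s) ** 2 - math.log(s * math.sqrt(2 * math.pi))
    V = [{s: math.log(start[s]) + logn(obs[0], mean[s], sd[s]) for s in states}]
    back = []
    for x in obs[1:]:
        row, ptr = {}, {}
        for s in states:
            prev = max(states, key=lambda p: V[-1][p] + math.log(trans[p][s]))
            row[s] = V[-1][prev] + math.log(trans[prev][s]) + logn(x, mean[s], sd[s])
            ptr[s] = prev
        V.append(row)
        back.append(ptr)
    path = [max(states, key=lambda s: V[-1][s])]
    for ptr in reversed(back):       # trace the best path backwards
        path.append(ptr[path[-1]])
    return path[::-1]

# Observed day-to-day change in one sub-population's proportion: flat noise,
# then a sustained rise while an adaptive mutant expands, then flat again.
deltas = [0.001, -0.002, 0.000, 0.002, 0.030, 0.034, 0.028, 0.032, 0.001, -0.001]
states = ("neutral", "adaptive")
start = {"neutral": 0.9, "adaptive": 0.1}
trans = {"neutral": {"neutral": 0.9, "adaptive": 0.1},
         "adaptive": {"neutral": 0.1, "adaptive": 0.9}}
mean = {"neutral": 0.0, "adaptive": 0.03}
sd = {"neutral": 0.005, "adaptive": 0.005}
path = viterbi(deltas, states, start, trans, mean, sd)
```

The decoded state sequence marks the sustained-rise segment as "adaptive", which is the kind of labelling the paper compares against human annotation.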

  20. Adapting computational text analysis to social science (and vice versa)

    Directory of Open Access Journals (Sweden)

    Paul DiMaggio

    2015-11-01

    Full Text Available Social scientists and computer scientists are divided by small differences in perspective and not by any significant disciplinary divide. In the field of text analysis, several such differences are noted: social scientists often use unsupervised models to explore corpora, whereas many computer scientists employ supervised models to train data; social scientists hold to more conventional causal notions than do most computer scientists, and often favor intense exploitation of existing algorithms, whereas computer scientists focus more on developing new models; and computer scientists tend to trust human judgment more than social scientists do. These differences have implications that potentially can improve the practice of social science.

  1. Spanish Adaptation and Validation of the Short Internalized Homonegativity Scale (SIHS).

    Science.gov (United States)

    Morell-Mengual, Vicente; Gil-Llario, María Dolores; Ballester-Arnal, Rafael; Salmerón-Sanchéz, Pedro

    2017-05-19

    Internalized homophobia has been related to mental health problems and sexual risk behaviors among nonheterosexual people. This article validates the Spanish adaptation of the Short Internalized Homonegativity Scale (SIHS). For this purpose, 347 men and 183 women completed the instrument. Exploratory factorial analysis showed three factors: public identification as homosexual (PIH), sexual comfort with homosexual people (SEXC), and social comfort with homosexual people (SOCC). These factors explained 57.96% of total variance. In addition, confirmatory factorial analysis supported this structure and internal consistency (Cronbach's alpha) was .80 for the full scale. The three subscales ranged from .70 to .79. Convergent validity showed a positive correlation between the SIHS and depressive symptoms, and negative correlation with condom use, self-esteem, and having sex after alcohol consumption. In conclusion, the SIHS could be an accurate instrument to evaluate internalized homophobia among the Spanish population.

  2. Short (

    NARCIS (Netherlands)

    Telleman, Gerdien; den Hartog, Laurens

    2013-01-01

    Aim: This systematic review assessed the implant survival rate of short (<10 mm) dental implants installed in partially edentulous patients. A case report of a short implant in the posterior region has been added. Materials and methods: A search was conducted in the electronic databases of MEDLINE

  3. Condition Driven Adaptive Music Generation for Computer Games

    OpenAIRE

    Naushad, Alamgir

    2013-01-01

    The video game industry has grown into a multi-billion dollar, worldwide industry. Background music can adapt to the specific game content over the length of play. Adaptive music can be explored further by monitoring particular conditions in the game and generating background music that best fits the active game content throughout the gameplay. This research paper outlines the use of condi...

  4. Neural adaptations after short-term wingate-based high-intensity interval training

    Science.gov (United States)

    Vera-Ibañez, Antonio; Colomer-Poveda, David; Romero-Arenas, Salvador; Viñuela-García, Manuel; Márquez, Gonzalo

    2017-01-01

    Objectives: This study examined the neural adaptations associated with a low-volume Wingate-based High Intensity Interval Training (HIIT). Methods: Fourteen recreationally trained males were divided into an experimental (HIIT) and a control group to determine whether a short-term (4 weeks) Wingate-based HIIT program could alter the Hoffmann (H-) reflex, volitional (V-) wave and maximum voluntary contraction (MVC) of the plantar-flexor muscles, and the peak power achieved during a Wingate test. Results: Absolute and relative peak power increased in the HIIT group (ABS_Ppeak: +14.7%, P=0.001; and REL_Ppeak: +15.0%, P=0.001), but not in the control group (ABS_Ppeak: P=0.466; and REL_Ppeak: P=0.493). However, no significant changes were found in the MVC (P>0.05 for both groups). There was a significant increase in H-reflex size after HIIT (+24.5%, P=0.004), while it remained unchanged in the control group (P=0.134). No significant changes were observed either in the V-wave or in the Vwave/Mwave ratio (P>0.05 for both groups). Conclusion: The Wingate-based training led to an increased peak power together with a higher spinal excitability. However, no changes were found either in the volitional wave or in the MVC, indicating a lack of adaptation in the central motor drive. PMID:29199186

  5. Robust and Adaptive Online Time Series Prediction with Long Short-Term Memory

    Directory of Open Access Journals (Sweden)

    Haimin Yang

    2017-01-01

    Full Text Available Online time series prediction is the mainstream method in a wide range of fields, ranging from speech analysis and noise cancelation to stock market analysis. However, as time series grow longer, real-world data often contain many outliers. These outliers can mislead the learned model if treated as normal points in the process of prediction. To address this issue, in this paper, we propose a robust and adaptive online gradient learning method, RoAdam (Robust Adam), for long short-term memory (LSTM) to predict time series with outliers. This method tunes the learning rate of the stochastic gradient algorithm adaptively in the process of prediction, which reduces the adverse effect of outliers. It tracks the relative prediction error of the loss function with a weighted average by modifying Adam, a popular stochastic gradient algorithm for training deep neural networks. In our algorithm, a large value of the relative prediction error corresponds to a small learning rate, and vice versa. The experiments on both synthetic data and real time series show that our method achieves better performance compared to the existing methods based on LSTM.
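
    The mechanism described, a learning rate damped when the relative prediction error spikes, can be sketched in a few lines. This is a simplified illustration of the idea, not the paper's RoAdam: the smoothing constant `beta3`, the clipping bounds `k_min`/`k_max`, and the damping rule `lr = base_lr / r` are all assumptions made for the sketch.

```python
import math

def adam_robust_step(w, grad, state, base_lr=0.01, beta1=0.9, beta2=0.999,
                     beta3=0.9, eps=1e-8, k_min=0.1, k_max=10.0):
    # One Adam-style update whose learning rate is divided by a smoothed
    # relative prediction error, so a sudden loss spike (an outlier target)
    # yields a smaller parameter step.
    state["t"] += 1
    t = state["t"]
    state["m"] = beta1 * state["m"] + (1 - beta1) * grad
    state["v"] = beta2 * state["v"] + (1 - beta2) * grad ** 2
    # relative prediction error: current loss vs. previous loss, clipped
    rel = min(max(state["loss"] / max(state["prev_loss"], eps), k_min), k_max)
    state["r"] = beta3 * state["r"] + (1 - beta3) * rel
    m_hat = state["m"] / (1 - beta1 ** t)
    v_hat = state["v"] / (1 - beta2 ** t)
    lr = base_lr / state["r"]  # large relative error -> smaller step
    return w - lr * m_hat / (math.sqrt(v_hat) + eps)

def fresh_state(loss, prev_loss):
    return dict(m=0.0, v=0.0, t=0, r=1.0, loss=loss, prev_loss=prev_loss)

w0, grad = 0.0, 1.0
w_calm = adam_robust_step(w0, grad, fresh_state(loss=1.0, prev_loss=1.0))
w_spiky = adam_robust_step(w0, grad, fresh_state(loss=10.0, prev_loss=1.0))
print(abs(w_calm), abs(w_spiky))  # the outlier-like loss spike gives the smaller step
```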

  6. Adaptation and Validation of the Foot Function Index-Revised Short Form into Polish

    Directory of Open Access Journals (Sweden)

    Radosław Rutkowski

    2017-01-01

    Full Text Available Purpose. The aim of the present study was to adapt the Foot Function Index-Revised Short Form (FFI-RS) questionnaire into Polish and verify its reliability and validity in a group of patients with rheumatoid arthritis (RA). Methods. The study included 211 patients suffering from RA. The FFI-RS questionnaire underwent standard linguistic adaptation and its psychometric parameters were investigated. The enrolled participants had been recruited over seven months as a convenience sample from the rheumatological hospital in Śrem (Poland). They represented different sociodemographic characteristics and were residents of both rural and urban environments. Results. The mean age of the patients was 58.9±10.2 years. The majority of patients (85%) were female. The average final FFI-RS score was 62.9±15.3. The internal consistency was achieved at a high level of 0.95 in Cronbach’s alpha test, with an intraclass correlation coefficient ranging between 0.78 and 0.84. A strong correlation was observed between the FFI-RS and Health Assessment Questionnaire-Disability Index (HAQ-DI) questionnaires. Conclusion. The Polish version of the FFI-RS (FFI-RS-PL) is an important tool for evaluating the functional condition of patients’ feet and can be applied in the diagnosis and treatment of Polish-speaking patients suffering from RA.

  7. Computing The No-Escape Envelope Of A Short-Range Missile

    Science.gov (United States)

    Neuman, Frank

    1991-01-01

    Method for computing no-escape envelope of short-range air-to-air missile devised. Useful for analysis of both strategies for avoidance and strategies for attack. With modifications, also useful in analysis of control strategies for one-on-one air-to-air combat, or wherever multiple control strategies considered.

  8. Adaptations in muscle activity to induced, short-term hindlimb lameness in trotting dogs.

    Directory of Open Access Journals (Sweden)

    Stefanie Fischer

    Full Text Available Muscle tissue has a great intrinsic adaptability to changing functional demands. While altered loading also triggers more gradual responses such as tissue growth, the immediate responses to altered loading conditions involve changes in muscle activity. Because the reduction in a limb's function is associated with marked deviations in the gait pattern, understanding the muscular responses in lame animals will provide further insight into their compensatory mechanisms as well as help to improve treatment options to prevent musculoskeletal sequelae in chronic patients. Therefore, this study evaluated the changes in muscle activity in adaptation to a moderate, short-term, weight-bearing hindlimb lameness in two limb muscles and one back muscle using surface electromyography (SEMG). In eight sound adult dogs that trotted on an instrumented treadmill, bilateral, bipolar recordings of the m. triceps brachii, the m. vastus lateralis and the m. longissimus dorsi were obtained before and after lameness was induced. Consistent with the unchanged vertical forces as well as temporal parameters, neither the timing nor the level of activity changed significantly in the m. triceps brachii. In the ipsilateral m. vastus lateralis, peak activity and integrated SEMG area were decreased, while they were significantly increased in the contralateral hindlimb. On both sides, the duration of the muscle activity was significantly longer due to a delayed offset. These observations are in accordance with previously described kinetic and kinematic changes as well as changes in muscle mass. Adaptations in the activity of the m. longissimus dorsi concerned primarily the unilateral activity and are discussed regarding known alterations in trunk and limb motions.

  9. Development and evaluation of a computer adaptive test to assess anxiety in cardiovascular rehabilitation patients.

    Science.gov (United States)

    Abberger, Birgit; Haschke, Anne; Wirtz, Markus; Kroehne, Ulf; Bengel, Juergen; Baumeister, Harald

    2013-12-01

    To develop and evaluate a computer adaptive test for the assessment of anxiety in cardiovascular rehabilitation patients (ACAT-cardio) that tailors an optimal test for each patient and enables precise and time-effective measurement. Simulation study, validation study (against the anxiety subscale of the Hospital Anxiety and Depression Scale and the physical component summary scale of the 12-Item Short-Form Health Survey), and longitudinal study (beginning and end of rehabilitation). Cardiac rehabilitation centers. Cardiovascular rehabilitation patients: simulation study sample (n=106; mean age, 57.8y; 25.5% women) and validation and longitudinal study sample (n=138; mean age, 58.6 and 57.9y, respectively; 16.7% and 12.1% women, respectively). Not applicable. Hospital Anxiety and Depression Scale, 12-Item Short-Form Health Survey, and ACAT-cardio. The mean number of items was 9.2 with an average processing time of 1:13 minutes when an SE ≤ .50 was used as a stopping rule; with an SE ≤ .32, there were 28 items and a processing time of 3:47 minutes. Validity could be confirmed via correlations between .68 and .81 concerning convergent validity (ACAT-cardio vs Hospital Anxiety and Depression Scale anxiety subscale) and correlations between -.47 and -.30 concerning discriminant validity (ACAT-cardio vs 12-Item Short-Form Health Survey physical component summary scale). Sensitivity to change was moderate to high with standardized response means between .45 and .82. The ACAT-cardio shows good psychometric properties and provides the opportunity for an innovative and time-effective assessment of anxiety in cardiovascular rehabilitation. A more flexible stopping rule might further improve the ACAT-cardio. Additionally, testing in other cardiovascular populations would increase generalizability. Copyright © 2013 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
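
    The standard-error stopping rule described above is the core of any CAT loop: administer the most informative item at the current ability estimate, re-estimate, and stop once the SE falls below a threshold. A minimal sketch under a 2PL item response model follows; the item bank, the EAP estimator over a grid, and all parameter values are invented for illustration and are not the ACAT-cardio's.

```python
import math, random

def p_correct(theta, a, b):
    # 2PL item response function
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def item_info(theta, a, b):
    # Fisher information of a 2PL item at ability theta
    p = p_correct(theta, a, b)
    return a * a * p * (1 - p)

def eap(responses, grid):
    # expected-a-posteriori ability estimate under a standard-normal prior
    post = [math.exp(-t * t / 2) for t in grid]
    for (a, b, u) in responses:
        for i, t in enumerate(grid):
            p = p_correct(t, a, b)
            post[i] *= p if u else (1 - p)
    z = sum(post)
    mean = sum(t * w for t, w in zip(grid, post)) / z
    var = sum((t - mean) ** 2 * w for t, w in zip(grid, post)) / z
    return mean, math.sqrt(var)

def run_cat(bank, true_theta, se_stop=0.32, max_items=30, seed=1):
    rng = random.Random(seed)
    grid = [i / 10 for i in range(-40, 41)]
    remaining, responses = list(bank), []
    theta, se = 0.0, 1.0              # prior mean and SD
    while remaining and se > se_stop and len(responses) < max_items:
        a, b = max(remaining, key=lambda item: item_info(theta, *item))
        remaining.remove((a, b))
        u = rng.random() < p_correct(true_theta, a, b)   # simulated response
        responses.append((a, b, u))
        theta, se = eap(responses, grid)
    return theta, se, len(responses)

# invented 2PL bank: discriminations 1.5-2.5, difficulties -2..2
bank = [(a, b / 2) for a in (1.5, 2.0, 2.5) for b in range(-4, 5)]
theta, se, n = run_cat(bank, true_theta=0.5)
print(n, round(theta, 2), round(se, 2))
```

    The loop terminates either because the SE threshold is met or because the bank is exhausted, which is exactly the trade-off the abstract reports: a looser SE threshold means fewer items and less testing time.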

  10. Adaptations of the vestibular system to short and long-term exposures to altered gravity

    Science.gov (United States)

    Bruce, L.

    Long-term space flight creates unique environmental conditions to which the vestibular system must adapt for optimal survival. We are studying two aspects of this vestibular adaptation: (1) How does long-term exposure to microgravity and hypergravity affect the development of vestibular afferents? (2) How does short-term exposure to extremely rapid changes in gravity, such as those that occur during launch and landing, affect the vestibular system? During space flight the gravistatic receptors in the otolith organs are effectively unloaded. In hypergravity conditions they are overloaded. However, the angular acceleration receptors of the semicircular canals receive relatively normal stimulation in both micro- and hypergravity. Rat embryos exposed to microgravity from gestation day 10 (prior to vestibular function) until gestation day 20 (vestibular system is somewhat functional) showed that afferents from the posterior vertical canal projecting to the medial vestibular nucleus developed similarly in microgravity, hypergravity, and in controls. However, afferents from the saccule showed delayed development in microgravity as compared to development in hypergravity and in controls. Cerebellar plasticity is crucial for modification of sensory-motor control and learning. Thus we explored the possibility that strong vestibular stimuli would modify cerebellar motor control (i.e., eye movement, postural control, gut motility) by altering the morphology of cerebellar Purkinje cells. To study the effects of short-term exposures to strong vestibular stimuli we focused on structural changes in the vestibulo-cerebellum that are caused by strong vestibular stimuli. Adult mice were exposed to various combinations of constant and/or rapidly changing angular and linear accelerations for 8.5 min (the time length of shuttle launch). Our data show that these stimuli cause intense excitation of cerebellar Purkinje cells, inducing up-regulation of clathrin-mediated endocytosis

  11. Short-term effects of implemented high intensity shoulder elevation during computer work.

    Science.gov (United States)

    Larsen, Mette K; Samani, Afshin; Madeleine, Pascal; Olsen, Henrik B; Søgaard, Karen; Holtermann, Andreas

    2009-08-10

    Work-site strength training sessions are shown effective to prevent and reduce neck-shoulder pain in computer workers, but difficult to integrate in normal working routines. A solution for avoiding neck-shoulder pain during computer work may be to implement high intensity voluntary contractions during the computer work. However, it is unknown how this may influence productivity, rate of perceived exertion (RPE) as well as activity and rest of neck-shoulder muscles during computer work. The aim of this study was to investigate short-term effects of a high intensity contraction on productivity, RPE and upper trapezius activity and rest during computer work and a subsequent pause from computer work. 18 female computer workers performed 2 sessions of 15 min standardized computer mouse work preceded by 1 min pause with and without prior high intensity contraction of shoulder elevation. RPE was reported, productivity (drawings per min) measured, and bipolar surface electromyography (EMG) recorded from the dominant upper trapezius during pauses and sessions of computer work. Repeated measure ANOVA with Bonferroni corrected post-hoc tests was applied for the statistical analyses. The main findings were that a high intensity shoulder elevation did not modify RPE, productivity or EMG activity of the upper trapezius during the subsequent pause and computer work. However, the high intensity contraction reduced the relative rest time of the uppermost (clavicular) trapezius part during the subsequent pause from computer work. Since a preceding high intensity shoulder elevation did not impose a negative impact on perceived effort, productivity or upper trapezius activity during computer work, implementation of high intensity contraction during computer work to prevent neck-shoulder pain may be possible without affecting the working routines. However, the unexpected reduction in clavicular trapezius rest during a pause with preceding high intensity contraction requires further investigation before high

  12. Short-term effects of implemented high intensity shoulder elevation during computer work

    Directory of Open Access Journals (Sweden)

    Madeleine Pascal

    2009-08-01

    Full Text Available Abstract Background Work-site strength training sessions are shown effective to prevent and reduce neck-shoulder pain in computer workers, but difficult to integrate in normal working routines. A solution for avoiding neck-shoulder pain during computer work may be to implement high intensity voluntary contractions during the computer work. However, it is unknown how this may influence productivity, rate of perceived exertion (RPE) as well as activity and rest of neck-shoulder muscles during computer work. The aim of this study was to investigate short-term effects of a high intensity contraction on productivity, RPE and upper trapezius activity and rest during computer work and a subsequent pause from computer work. Methods 18 female computer workers performed 2 sessions of 15 min standardized computer mouse work preceded by 1 min pause with and without prior high intensity contraction of shoulder elevation. RPE was reported, productivity (drawings per min) measured, and bipolar surface electromyography (EMG) recorded from the dominant upper trapezius during pauses and sessions of computer work. Repeated measure ANOVA with Bonferroni corrected post-hoc tests was applied for the statistical analyses. Results The main findings were that a high intensity shoulder elevation did not modify RPE, productivity or EMG activity of the upper trapezius during the subsequent pause and computer work. However, the high intensity contraction reduced the relative rest time of the uppermost (clavicular) trapezius part during the subsequent pause from computer work. Conclusion Since a preceding high intensity shoulder elevation did not impose a negative impact on perceived effort, productivity or upper trapezius activity during computer work, implementation of high intensity contraction during computer work to prevent neck-shoulder pain may be possible without affecting the working routines. However, the unexpected reduction in clavicular trapezius rest during a

  13. Translation, cross-cultural adaptation and validation of the Diabetes Empowerment Scale – Short Form

    Directory of Open Access Journals (Sweden)

    Fernanda Figueredo Chaves

    Full Text Available ABSTRACT OBJECTIVE To translate, cross-culturally adapt and validate the Diabetes Empowerment Scale – Short Form for assessment of psychosocial self-efficacy in diabetes care within the Brazilian cultural context. METHODS Assessment of the instrument’s conceptual equivalence, as well as its translation and cross-cultural adaptation were performed following international standards. The Expert Committee’s assessment of the translated version was conducted through a web questionnaire developed and applied via the web tool e-Surv. The cross-culturally adapted version was used for the pre-test, which was carried out via phone call in a group of eleven health care service users diagnosed with type 2 diabetes mellitus. The pre-test results were examined by a group of experts, composed of health care consultants, applied linguists and statisticians, aiming at an adequate version of the instrument, which was subsequently used for test and retest in a sample of 100 users diagnosed with type 2 diabetes mellitus via phone call, their answers being recorded by the web tool e-Surv. Internal consistency and reproducibility analyses were carried out within the statistical programming environment R. RESULTS Face and content validity were attained and the Brazilian Portuguese version, entitled Escala de Autoeficácia em Diabetes – Versão Curta, was established. The scale had acceptable internal consistency with Cronbach’s alpha of 0.634 (95%CI 0.494–0.737), while the correlation of the total score in the two periods was considered moderate (0.47). The intraclass correlation coefficient was 0.50. CONCLUSIONS The translated and cross-culturally adapted version of the instrument to spoken Brazilian Portuguese was considered valid and reliable to be used for assessment within the Brazilian population diagnosed with type 2 diabetes mellitus. The use of a web tool (e-Surv) for recording the Expert Committee responses as well as the responses in the

  14. Adaptive Computer-Assisted Mammography Training for Improved Breast Cancer Screening

    Science.gov (United States)

    2015-03-01

    Award Number: W81XWH-11-1-0755. Title: Adaptive Computer-Assisted Mammography Training for Improved Breast Cancer Screening.

  15. Molecular determinants of enzyme cold adaptation: comparative structural and computational studies of cold- and warm-adapted enzymes.

    Science.gov (United States)

    Papaleo, Elena; Tiberti, Matteo; Invernizzi, Gaetano; Pasi, Marco; Ranzani, Valeria

    2011-11-01

    The identification of molecular mechanisms underlying enzyme cold adaptation is a hot topic both for fundamental research and for industrial applications. In the present contribution, we review the last decades of structural computational investigations on cold-adapted enzymes in comparison to their warm-adapted counterparts. Comparative sequence and structural studies allow the definition of a multitude of adaptation strategies. Different enzymes have evolved diverse mechanisms to adapt to low temperatures, so that a general theory for enzyme cold adaptation cannot be formulated. However, some common features can be traced in dynamic and flexibility properties of these enzymes, as well as in their intra- and inter-molecular interaction networks. Interestingly, the current data suggest that a family-centered point of view is necessary in the comparative analyses of cold- and warm-adapted enzymes. In fact, enzymes belonging to the same family or superfamily, thus sharing at least the three-dimensional fold and common features of the functional sites, have evolved similar structural and dynamic patterns to overcome the detrimental effects of low temperatures.

  16. Development of a lack of appetite item bank for computer-adaptive testing (CAT)

    NARCIS (Netherlands)

    Thamsborg, L.H.; Petersen, M.A.; Aaronson, N.K.; Chie, W.C.; Costantini, A.; Holzner, B.; Verdonck-de Leeuw, I.M.; Young, T.; Groenvold, M.

    2015-01-01

    Purpose: A significant proportion of oncological patients experiences lack of appetite. Precise measurement is relevant to improve the management of lack of appetite. The so-called computer-adaptive test (CAT) allows for adaptation of the questionnaire to the individual patient, thereby optimizing

  17. Applications of decision theory to computer-based adaptive instructional systems

    NARCIS (Netherlands)

    Vos, Hendrik J.

    1988-01-01

    This paper considers applications of decision theory to the problem of instructional decision-making in computer-based adaptive instructional systems, using the Minnesota Adaptive Instructional System (MAIS) as an example. The first section indicates how the problem of selecting the appropriate

  18. Capitalization on item calibration error in computer adaptive testing

    NARCIS (Netherlands)

    van der Linden, Willem J.

    2005-01-01

    In test assembly, a fundamental difference exists between algorithms that select a test sequentially or simultaneously. Sequential assembly allows us to optimize an objective function at the examinee’s ability estimate, such as the test information function in computerized adaptive testing. But it

  19. Towards Adaptive Virtual Camera Control In Computer Games

    DEFF Research Database (Denmark)

    Burelli, Paolo; Yannakakis, Georgios N.

    2011-01-01

    machine learning to build predictive models of the virtual camera behaviour. The performance of the models on unseen data reveals accuracies above 70% for all the player behaviour types identified. The characteristics of the generated models, their limits and their use for creating adaptive automatic...

  20. Capitalization on item calibration error in computer adaptive testing

    NARCIS (Netherlands)

    van der Linden, Willem J.; Glas, Cornelis A.W.

    2006-01-01

    In adaptive testing, item selection is sequentially optimized during the test. Since the optimization takes place over a pool of items calibrated with estimation error, capitalization on these errors is likely to occur. How serious the consequences of this phenomenon are depends not only on the

  1. Improving Short-Range Ensemble Kalman Storm Surge Forecasting Using Robust Adaptive Inflation

    KAUST Repository

    Altaf, Muhammad

    2013-08-01

    This paper presents a robust ensemble filtering methodology for storm surge forecasting based on the singular evolutive interpolated Kalman (SEIK) filter, which has been implemented in the framework of the H∞ filter. By design, an H∞ filter is more robust than the common Kalman filter in the sense that the estimation error in the H∞ filter has, in general, a finite growth rate with respect to the uncertainties in assimilation. The computational hydrodynamical model used in this study is the Advanced Circulation (ADCIRC) model. The authors assimilate data obtained from Hurricanes Katrina and Ike as test cases. The results clearly show that the H∞-based SEIK filter provides more accurate short-range forecasts of storm surge compared to recently reported data assimilation results obtained with the standard SEIK filter.
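
    One generic ingredient behind such robust ensemble schemes is covariance inflation, which guards the filter against an underestimated forecast spread. The toy below shows a stochastic ensemble Kalman analysis step for a single scalar state with multiplicative inflation; it sketches the general mechanism only and is in no way the SEIK/H∞ implementation or the ADCIRC model. The inflation factor, ensemble size, and observation error are invented.

```python
import random
import statistics

def enkf_analysis(ensemble, obs, obs_sd, infl=1.1, rng=None):
    # Stochastic EnKF analysis for a directly observed scalar state.
    # Multiplicative inflation (infl >= 1) widens the forecast spread
    # about its mean before the update, countering overconfidence.
    rng = rng or random.Random(0)
    mean = statistics.fmean(ensemble)
    X = [mean + infl * (x - mean) for x in ensemble]   # inflated forecast
    var = statistics.variance(X)
    K = var / (var + obs_sd ** 2)                      # scalar Kalman gain
    # each member assimilates a perturbed copy of the observation
    return [x + K * (obs + rng.gauss(0.0, obs_sd) - x) for x in X]

rng = random.Random(42)
prior = [rng.gauss(0.0, 2.0) for _ in range(200)]      # forecast ensemble; "truth" is 3.0
post = enkf_analysis(prior, obs=3.0, obs_sd=0.5, rng=rng)
print(round(statistics.fmean(post), 2))
```

    After the update the ensemble mean moves close to the observation and the spread collapses toward the observation error, which is the basic behaviour any surge-forecasting ensemble filter relies on.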

  2. Short-term memory trace in rapidly adapting synapses of inferior temporal cortex.

    Directory of Open Access Journals (Sweden)

    Yasuko Sugase-Miyamoto

    2008-05-01

    Full Text Available Visual short-term memory tasks depend upon both the inferior temporal cortex (ITC) and the prefrontal cortex (PFC). Activity in some neurons persists after the first (sample) stimulus is shown. This delay-period activity has been proposed as an important mechanism for working memory. In ITC neurons, intervening (nonmatching) stimuli wipe out the delay-period activity; hence, the role of ITC in memory must depend upon a different mechanism. Here, we look for a possible mechanism by contrasting memory effects in two architectonically different parts of ITC: area TE and the perirhinal cortex. We found that a large proportion (80%) of stimulus-selective neurons in area TE of macaque ITCs exhibit a memory effect during the stimulus interval. During a sequential delayed matching-to-sample task (DMS), the noise in the neuronal response to the test image was correlated with the noise in the neuronal response to the sample image. Neurons in perirhinal cortex did not show this correlation. These results led us to hypothesize that area TE contributes to short-term memory by acting as a matched filter. When the sample image appears, each TE neuron captures a static copy of its inputs by rapidly adjusting its synaptic weights to match the strength of their individual inputs. Input signals from subsequent images are multiplied by those synaptic weights, thereby computing a measure of the correlation between the past and present inputs. The total activity in area TE is sufficient to quantify the similarity between the two images. This matched filter theory provides an explanation of what is remembered, where the trace is stored, and how comparison is done across time, all without requiring delay period activity. Simulations of a matched filter model match the experimental results, suggesting that area TE neurons store a synaptic memory trace during short-term visual memory.
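
    The matched-filter account can be captured in a few lines: the stored synaptic weights are a copy of the sample input vector, and the response to any later image is the inner product with that copy, i.e. a similarity measure that needs no delay-period activity. The stimulus vectors, dimensionality, and noise level below are invented for illustration; this is a caricature of the proposal, not the authors' simulation.

```python
import math, random

rng = random.Random(7)

def randvec(n):
    return [rng.gauss(0.0, 1.0) for _ in range(n)]

def unit(v):
    norm = math.sqrt(sum(x * x for x in v))
    return [x / norm for x in v]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

sample = unit(randvec(50))      # inputs to one TE "neuron" at sample onset
weights = list(sample)          # rapid synaptic copy of those inputs

match = unit([s + 0.1 * g for s, g in zip(sample, randvec(50))])  # noisy repeat
nonmatch = unit(randvec(50))    # unrelated intervening image

resp_match = dot(weights, match)        # correlation of past vs. present input
resp_nonmatch = dot(weights, nonmatch)
print(resp_match, resp_nonmatch)
```

    The response to a repeat of the sample is large while the response to an unrelated image hovers near zero, so total activity alone can signal match versus nonmatch.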

  3. Intricacies of Feedback in Computer-based Prism Adaptation Therapy

    DEFF Research Database (Denmark)

    Wilms, Inge Linda; Rytter, Hana Malá

    Prism Adaptation Therapy (PAT) is an intervention method for treatment of attentional disorders, such as neglect e.g. 1,2. The method involves repeated pointing at specified targets with or without prism glasses using a specifically designed wooden box. The aim of this study was to ascertain...... of visuospatial disorders.     1. Rossetti et al. Nature 1998, 395: 166-169 2. Frassinetti et al. Brain 2002, 125: 608-623...

  4. Genre-adaptive Semantic Computing and Audio-based Modelling for Music Mood Annotation

    DEFF Research Database (Denmark)

    Saari, Pasi; Fazekas, György; Eerola, Tuomas

    2016-01-01

    This study investigates whether taking genre into account is beneficial for automatic music mood annotation in terms of core affects valence, arousal, and tension, as well as several other mood scales. Novel techniques employing genre-adaptive semantic computing and audio-based modelling are proposed... related to a set of 600 popular music tracks spanning multiple genres. The results show that ACTwg outperforms a semantic computing technique that does not exploit genre information, and ACTwg-SLPwg outperforms conventional techniques and other genre-adaptive alternatives. In particular, improvements...-based genre representation for genre-adaptive music mood analysis.

  5. Adapting the traveling salesman problem to an adiabatic quantum computer

    Science.gov (United States)

    Warren, Richard H.

    2013-04-01

    We show how to guide a quantum computer to select an optimal tour for the traveling salesman. This is significant because it opens a rapid solution method for the wide range of applications of the traveling salesman problem, which include vehicle routing, job sequencing and data clustering.
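
    The usual route for putting the traveling salesman problem on an adiabatic or annealing machine is a QUBO encoding; the record does not give its formulation, so the sketch below uses the standard textbook one with binary variables x[i][t] (city i at tour position t) and quadratic penalties enforcing a valid tour, checked here by brute force on three cities. The distance matrix and penalty weight are invented.

```python
import itertools

def tsp_qubo_energy(x, dist, penalty):
    # x[i][t] = 1 iff city i occupies tour position t (cyclic tour).
    n = len(dist)
    e = 0.0
    for t in range(n):                      # tour-length term
        for i in range(n):
            for j in range(n):
                e += dist[i][j] * x[i][t] * x[j][(t + 1) % n]
    for t in range(n):                      # exactly one city per position
        e += penalty * (sum(x[i][t] for i in range(n)) - 1) ** 2
    for i in range(n):                      # each city visited exactly once
        e += penalty * (sum(x[i][t] for t in range(n)) - 1) ** 2
    return e

dist = [[0, 1, 4], [1, 0, 2], [4, 2, 0]]
n = len(dist)
best = min(
    (tsp_qubo_energy([list(bits[i * n:(i + 1) * n]) for i in range(n)], dist, 20.0), bits)
    for bits in itertools.product((0, 1), repeat=n * n)
)
print(best[0])  # → 7.0, the length of the optimal 3-city tour
```

    An annealer would minimize this same quadratic form over the binary variables; the penalty weight just has to exceed any tour-length saving from violating a constraint.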

  6. Adaption of computers in Dutch Museums: interpreting the new tool

    NARCIS (Netherlands)

    Navarrete, Trilce

    2015-01-01

    The adoption of computers in Dutch museums has been marked by the changing technology as much as by the interpretation of what the technology is meant to do. The Social Construction of Technology framework is used to review the adoption of a digital work method and to highlight the

  7. Adaptive zooming in X-ray computed tomography

    NARCIS (Netherlands)

    A. Dabravolski (Andrei); K.J. Batenburg (Joost); J. Sijbers (Jan)

    2014-01-01

    BACKGROUND: In computed tomography (CT), the source-detector system commonly rotates around the object in a circular trajectory. Such a trajectory does not allow the detector to be fully exploited when scanning elongated objects. OBJECTIVE: Increase the spatial resolution of the reconstructed

  8. Extensive Intestinal Resection Triggers Behavioral Adaptation, Intestinal Remodeling and Microbiota Transition in Short Bowel Syndrome

    Directory of Open Access Journals (Sweden)

    Camille Mayeur

    2016-03-01

    Full Text Available Extensive resection of small bowel often leads to short bowel syndrome (SBS). SBS patients develop clinical malabsorption and dehydration relative to the reduction of absorptive area, acceleration of gastrointestinal transit time and modifications of the gastrointestinal intra-luminal environment. As a consequence of severe malabsorption, patients require parenteral nutrition (PN). In adults, the overall adaptation following intestinal resection includes spontaneous and complex compensatory processes such as hyperphagia, mucosal remodeling of the remaining part of the intestine and major modifications of the microbiota. SBS patients, with colon in continuity, harbor a specific fecal microbiota that we called “lactobiota” because it is enriched in the Lactobacillus/Leuconostoc group and depleted in anaerobic micro-organisms (especially Clostridium and Bacteroides). In some patients, the lactobiota-driven fermentative activities lead to an accumulation of fecal d/l-lactates and an increased risk of d-encephalopathy. Better knowledge of clinical parameters and lactobiota characteristics has made it possible to stratify patients and define groups at risk for d-encephalopathy crises.

  9. The Cultural Adaptation Process of Agricultural and Life Sciences Students on Short-Term Study Abroad Experiences

    Science.gov (United States)

    Conner, Nathan William

    2013-01-01

    The purpose of this study was to explore how undergraduate students in a college of agricultural and life sciences experienced cultural adaptation during short-term study abroad programs. The specific objectives of this study were to describe how undergraduate students in the college of agricultural and life sciences experienced culture throughout…

  10. Short-wavelength plasma turbulence and temperature anisotropy instabilities: recent computational progress

    Science.gov (United States)

    Gary, S. Peter

    2015-01-01

    Plasma turbulence consists of an ensemble of enhanced, broadband electromagnetic fluctuations, typically driven by multi-wave interactions which transfer energy in wavevector space via nonlinear cascade processes. Temperature anisotropy instabilities in collisionless plasmas are driven by quasi-linear wave–particle interactions which transfer particle kinetic energy to field fluctuation energy; the resulting enhanced fluctuations are typically narrowband in wavevector magnitude and direction. Whatever their sources, short-wavelength fluctuations are those at which charged particle kinetic, that is, velocity-space, properties are important; these are generally wavelengths of the order of or shorter than the ion inertial length or the thermal ion gyroradius. The purpose of this review is to summarize and interpret recent computational results concerning short-wavelength plasma turbulence, short-wavelength temperature anisotropy instabilities and relationships between the two phenomena. PMID:25848081

  11. The development of a new computer adaptive test to evaluate chorea in Huntington disease: HDQLIFE Chorea.

    Science.gov (United States)

    Carlozzi, N E; Downing, N R; Schilling, S G; Lai, J-S; Goodnight, S M; Miner, J A; A Frank, S

    2016-10-01

    Huntington's disease (HD) is an autosomal dominant neurodegenerative disease associated with motor, behavioral, and cognitive deficits. The hallmark symptom of HD, chorea, is often the focus of HD clinical trials. Unfortunately, there are no self-reported measures of chorea. To address this shortcoming, we developed a new measure of chorea for use in HD, HDQLIFE Chorea. Qualitative data and literature reviews were conducted to develop an initial item pool of 141 chorea items. An iterative process, including cognitive interviews, expert review, translatability review, and literacy review, was used to refine this item pool to 64 items. These 64 items were field tested in 507 individuals with prodromal and/or manifest HD. Exploratory and confirmatory factor analyses (EFA and CFA, respectively) were conducted to identify a unidimensional set of items. Then, an item response theory graded response model (GRM) and differential item functioning (DIF) analyses were conducted to select the final items for inclusion in this measure. EFA and CFA supported the retention of 34 chorea items. GRM and DIF analyses supported the retention of all of these items in the final measure. GRM calibration data were used to inform the selection of a 6-item, static short form and to program the HDQLIFE Chorea computer adaptive test (CAT). CAT simulation analyses indicated a 0.99 correlation between the CAT scores and the full item bank. The new HDQLIFE Chorea CAT and corresponding 6-item short form were developed using established rigorous measurement development standards; this is the first self-reported measure developed to evaluate the impact of chorea on health-related quality of life (HRQOL) in HD. This development work indicates that these measures have strong psychometric properties; future work is needed to establish test-retest reliability and responsiveness to change.
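The adaptive loop such simulations exercise is compact enough to sketch. The following is not the HDQLIFE implementation: it assumes a hypothetical dichotomous 2PL item bank with invented parameters (the study calibrated a polytomous graded response model), and shows the standard cycle of maximum-information item selection, expected a posteriori (EAP) trait re-estimation, and a standard-error-based stopping rule.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 34-item bank under a 2PL model (dichotomous stand-in for
# the graded response model used in the study).
N_ITEMS = 34
a = rng.uniform(1.0, 2.5, N_ITEMS)   # discrimination parameters
b = rng.normal(0.0, 1.0, N_ITEMS)    # difficulty parameters
GRID = np.linspace(-4, 4, 161)       # quadrature grid for theta

def p_yes(theta, j):
    """Probability of endorsing item j at trait level theta (2PL)."""
    return 1.0 / (1.0 + np.exp(-a[j] * (theta - b[j])))

def info(theta, j):
    """Fisher information of item j at theta."""
    p = p_yes(theta, j)
    return a[j] ** 2 * p * (1.0 - p)

def eap(items, responses):
    """Posterior mean and SD of theta under a standard normal prior."""
    log_post = -0.5 * GRID ** 2
    for j, r in zip(items, responses):
        p = p_yes(GRID, j)
        log_post += np.log(p) if r else np.log(1.0 - p)
    post = np.exp(log_post - log_post.max())
    post /= post.sum()
    mean = float((GRID * post).sum())
    sd = float(np.sqrt(((GRID - mean) ** 2 * post).sum()))
    return mean, sd

def run_cat(true_theta, max_items=11, se_stop=0.32):
    """Adaptively administer items to one simulee; return (estimate, n used)."""
    items, responses = [], []
    theta, se = 0.0, np.inf
    while len(items) < max_items and se > se_stop:
        # Pick the remaining item with maximal information at the current estimate.
        j = max((k for k in range(N_ITEMS) if k not in items),
                key=lambda k: info(theta, k))
        items.append(j)
        responses.append(int(rng.random() < p_yes(true_theta, j)))
        theta, se = eap(items, responses)
    return theta, len(items)
```

Simulating many examinees this way and correlating the CAT estimates with full-bank scores is how correlations like the 0.99 quoted in the abstract are obtained.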

  12. Indirect versus direct feedback in computer-based Prism Adaptation Therapy

    DEFF Research Database (Denmark)

    Wilms, Inge Linda; Rytter, Hana Malá

    2010-01-01

    Prism Adaptation Therapy (PAT) is an intervention method in the treatment of the attention disorder neglect (Frassinetti, Angeli, Meneghello, Avanzi, & Ladavas, 2002; Rossetti, et al., 1998). The aim of this study was to investigate whether one session of PAT using a computer-attached touchscreen...... in the aftereffect. The findings have direct implications for future implementations of computer-based methods of treatment of visuospatial disorders and computer-assisted rehabilitation in general....

  14. The effects of short-lasting anti-saccade training in homonymous hemianopia with and without saccadic adaptation

    Directory of Open Access Journals (Sweden)

    Delphine eLévy-Bencheton

    2016-01-01

    Full Text Available Homonymous Visual Field Defects (HVFD) are common following stroke and can be highly debilitating for visual perception and higher-level cognitive functions such as exploring a visual scene or reading a text. Rehabilitation using oculomotor compensatory methods with automatic training over a short duration (~15 days) has been shown to be as efficient as longer voluntary training methods (>1 month). Here, we propose to evaluate and compare the effect of an original HVFD rehabilitation method based on a single 15 min voluntary anti-saccade (AS) task toward the blind hemifield, with automatic sensorimotor adaptation to increase AS amplitude. In order to distinguish between adaptation and training effects, fourteen left- or right-HVFD patients were exposed, one month apart, to three trainings: two isolated AS tasks (Delayed-shift and No-shift paradigms) and one combined with AS adaptation (Adaptation paradigm). A quality of life questionnaire (NEI-VFQ 25) and functional measurements (reading speed, visual exploration time in pop-out and serial tasks), as well as oculomotor measurements, were assessed before and after each training. We could not demonstrate significant adaptation at the group level, but we identified a group of 9 adapted patients. While AS training itself demonstrated significant functional improvements in the overall patient group, we could also demonstrate in the sub-group of adapted patients, and specifically following the adaptation training, an increase of saccade amplitude during the reading task (left-HVFD patients) and the serial exploration task, and improvement of the visual quality of life. We conclude that short-lasting AS training combined with adaptation could be implemented in rehabilitation methods of cognitive dysfunctions following HVFD. Indeed, both voluntary and automatic processes have shown interesting effects on the control of visually guided saccades in different cognitive tasks.

  15. Basic emotions and adaptation. A computational and evolutionary model.

    Science.gov (United States)

    Pacella, Daniela; Ponticorvo, Michela; Gigliotta, Onofrio; Miglino, Orazio

    2017-01-01

    The core principles of the evolutionary theories of emotions declare that affective states represent crucial drives for action selection in the environment and regulate the behavior and adaptation of natural agents in ancestrally recurrent situations. While many different studies used autonomous artificial agents to simulate emotional responses and the way these patterns can affect decision-making, few approaches have tried to analyze the evolutionary emergence of affective behaviors directly from the specific adaptive problems posed by the ancestral environment. A model of the evolution of affective behaviors is presented using simulated artificial agents equipped with neural networks and physically inspired by the architecture of the iCub humanoid robot. We use genetic algorithms to train populations of virtual robots across generations, and investigate the spontaneous emergence of basic emotional behaviors in different experimental conditions. In particular, we focus on studying the emotion of fear; therefore the environment explored by the artificial agents can contain stimuli that are safe or dangerous to pick. The simulated task is based on classical conditioning and the agents are asked to learn a strategy to recognize whether the environment is safe or represents a threat to their lives and select the correct action to perform in the absence of any visual cues. The simulated agents have special input units in their neural structure whose activation keeps track of their actual "sensations" based on the outcome of past behavior. We train five different neural network architectures and then test the best-ranked individuals, comparing their performances and analyzing the unit activations in each individual's life cycle.
    We show that the agents, regardless of the presence of recurrent connections, spontaneously evolved the ability to cope with a potentially dangerous environment by collecting information about the environment and then switching their behavior

  16. A computational framework for modeling targets as complex adaptive systems

    Science.gov (United States)

    Santos, Eugene; Santos, Eunice E.; Korah, John; Murugappan, Vairavan; Subramanian, Suresh

    2017-05-01

    Modeling large military targets is a challenge as they can be complex systems encompassing myriad combinations of human, technological, and social elements that interact, leading to complex behaviors. Moreover, such targets have multiple components and structures, extending across multiple spatial and temporal scales, and are in a state of change, either in response to events in the environment or changes within the system. Complex adaptive system (CAS) theory can help in capturing the dynamism, interactions, and more importantly various emergent behaviors, displayed by the targets. However, a key stumbling block is incorporating information from various intelligence, surveillance and reconnaissance (ISR) sources, while dealing with the inherent uncertainty, incompleteness and time criticality of real world information. To overcome these challenges, we present a probabilistic reasoning network based framework called complex adaptive Bayesian Knowledge Base (caBKB). caBKB is a rigorous, overarching and axiomatic framework that models two key processes, namely information aggregation and information composition. While information aggregation deals with the union, merger and concatenation of information and takes into account issues such as source reliability and information inconsistencies, information composition focuses on combining information components where such components may have well defined operations. Since caBKBs can explicitly model the relationships between information pieces at various scales, it provides unique capabilities such as the ability to de-aggregate and de-compose information for detailed analysis. Using a scenario from the Network Centric Operations (NCO) domain, we will describe how our framework can be used for modeling targets with a focus on methodologies for quantifying NCO performance metrics.
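As a deliberately minimal illustration of reliability-aware aggregation (far simpler than the caBKB formalism described above, and not taken from the paper), independent binary source reports can be fused in odds form; the prior, reliabilities and conditional-independence assumption here are all invented for illustration.

```python
def fuse_reports(prior, reports):
    """Combine binary source reports about a hypothesis H.

    prior   -- P(H) before any reports
    reports -- list of (asserts_h, reliability) pairs, where reliability is
               the assumed probability that the source reports correctly
    Returns the posterior P(H) under naive-Bayes fusion, which assumes
    sources are conditionally independent given H.
    """
    odds = prior / (1.0 - prior)
    for asserts_h, rel in reports:
        # Likelihood ratio contributed by a single report.
        lr = rel / (1.0 - rel) if asserts_h else (1.0 - rel) / rel
        odds *= lr
    return odds / (1.0 + odds)

# Two fairly reliable sources confirm H, one weak source denies it.
posterior = fuse_reports(0.5, [(True, 0.8), (True, 0.8), (False, 0.6)])
```

A framework like the one in the paper additionally handles inconsistency between sources and de-aggregation, which this naive sketch cannot.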

  17. HAMSTRING ARCHITECTURAL AND FUNCTIONAL ADAPTATIONS FOLLOWING LONG VS. SHORT MUSCLE LENGTH ECCENTRIC TRAINING

    Directory of Open Access Journals (Sweden)

    Kenny Guex

    2016-08-01

    Full Text Available Most common preventive eccentric-based exercises, such as the Nordic hamstring exercise, do not include any hip flexion, so the elongation stress reached is lower than during the late swing phase of sprinting. The aim of this study was to assess the evolution of hamstring architectural (fascicle length and pennation angle) and functional (concentric and eccentric optimum angles and concentric and eccentric peak torques) parameters following a 3-week eccentric resistance program performed at long (LML) versus short muscle length (SML). Both groups performed eight sessions of 3-5x8 slow maximal eccentric knee extensions on an isokinetic dynamometer: the SML group at 0° and the LML group at 80° of hip flexion. Architectural parameters were measured using ultrasound imaging and functional parameters using the isokinetic dynamometer. The fascicle length increased by 4.9% (p<0.01, medium effect size) in the SML and by 9.3% (p<0.001, large effect size) in the LML group. The pennation angle did not change (p=0.83) in the SML and tended to decrease by 0.7° (p=0.09, small effect size) in the LML group. The concentric optimum angle tended to decrease by 8.8° (p=0.09, medium effect size) in the SML and by 17.3° (p<0.01, large effect size) in the LML group. The eccentric optimum angle did not change (p=0.19, small effect size) in the SML and tended to decrease by 10.7° (p=0.06, medium effect size) in the LML group. The concentric peak torque did not change in the SML (p=0.37) and the LML (p=0.23) groups, whereas eccentric peak torque increased by 12.9% (p<0.01, small effect size) and 17.9% (p<0.001, small effect size) in the SML and the LML group, respectively. No group-by-time interaction was found for any parameter. A correlation was found between the training-induced change in fascicle length and the change in concentric optimum angle (r=-0.57, p<0.01). These results suggest that performing eccentric exercises leads to several architectural and functional adaptations. However

  18. Hard Real-Time Task Scheduling in Cloud Computing Using an Adaptive Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Amjad Mahmood

    2017-04-01

    Full Text Available In the Infrastructure-as-a-Service cloud computing model, virtualized computing resources in the form of virtual machines are provided over the Internet. A user can rent an arbitrary number of computing resources to meet their requirements, making cloud computing an attractive choice for executing real-time tasks. Economical task allocation and scheduling on a set of leased virtual machines is an important problem in the cloud computing environment. This paper proposes a greedy algorithm and a genetic algorithm with an adaptive selection of suitable crossover and mutation operations (named AGA) to allocate and schedule real-time tasks with precedence constraints on heterogeneous virtual machines. A comprehensive simulation study has been done to evaluate the performance of the proposed algorithms in terms of their solution quality and efficiency. The simulation results show that AGA outperforms the greedy algorithm and a non-adaptive genetic algorithm in terms of solution quality.
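A sketch of the adaptive idea, not the paper's AGA: crossover and mutation probabilities are tuned per individual from its fitness relative to the population (in the style of Srinivas and Patnaik), on a toy makespan-minimization instance. Precedence constraints are omitted for brevity, and all task lengths and VM speeds are invented.

```python
import random

random.seed(7)

# Hypothetical instance: 20 tasks to place on 3 heterogeneous VMs.
TASKS = [random.uniform(1.0, 10.0) for _ in range(20)]  # task lengths
VM_SPEED = [1.0, 1.5, 2.5]                              # relative speeds

def makespan(assign):
    """Completion time of the busiest VM (lower is better)."""
    load = [0.0] * len(VM_SPEED)
    for length, vm in zip(TASKS, assign):
        load[vm] += length / VM_SPEED[vm]
    return max(load)

def adaptive_rates(fit, f_avg, f_best):
    """Srinivas-Patnaik style: good individuals get gentler operators."""
    if fit <= f_avg:  # better than average (minimization)
        span = max(f_avg - f_best, 1e-9)
        pc = 0.5 + 0.4 * (fit - f_best) / span
    else:
        pc = 0.9
    return pc, pc / 5.0  # (crossover rate, per-gene mutation rate)

def tournament(pop, fits, k=3):
    idx = min(random.sample(range(len(pop)), k), key=lambda i: fits[i])
    return pop[idx]

def evolve(pop_size=40, generations=80):
    pop = [[random.randrange(len(VM_SPEED)) for _ in TASKS]
           for _ in range(pop_size)]
    for _ in range(generations):
        fits = [makespan(ind) for ind in pop]
        f_best, f_avg = min(fits), sum(fits) / len(fits)
        nxt = [pop[fits.index(f_best)][:]]  # elitism: keep the best as-is
        while len(nxt) < pop_size:
            p1, p2 = tournament(pop, fits), tournament(pop, fits)
            pc, pm = adaptive_rates(min(makespan(p1), makespan(p2)),
                                    f_avg, f_best)
            if random.random() < pc:  # uniform crossover
                child = [g1 if random.random() < 0.5 else g2
                         for g1, g2 in zip(p1, p2)]
            else:
                child = p1[:]
            child = [random.randrange(len(VM_SPEED))
                     if random.random() < pm else g for g in child]
            nxt.append(child)
        pop = nxt
    fits = [makespan(ind) for ind in pop]
    return pop[fits.index(min(fits))]

best = evolve()
# best maps each task index to a VM index; makespan(best) is its cost.
```

Lowering the disruption applied to above-average individuals while keeping strong exploration pressure on weak ones is the mechanism by which such adaptive GAs tend to beat fixed-rate ones.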

  19. Adaptive contrast-based computer aided detection for pulmonary embolism

    Science.gov (United States)

    Dinesh, M. S.; Devarakota, Pandu; Raghupathi, Laks; Lakare, Sarang; Salganicoff, Marcos; Krishnan, Arun

    2009-02-01

    This work involves the computer-aided diagnosis (CAD) of pulmonary embolism (PE) in contrast-enhanced computed tomography pulmonary angiography (CTPA). Contrast plays an important role in analyzing and identifying PE in CTPA. At times the contrast mixing in blood may be insufficient due to several factors such as scanning speed, body weight and injection duration. This results in a suboptimal study (mixing artifact) due to non-homogeneous enhancement of the blood's opacity. Most current CAD systems are not optimized to detect PE in suboptimal studies. To this end, we propose new techniques for CAD to work robustly in both optimal and suboptimal situations. First, the contrast level at the pulmonary trunk is automatically detected using a landmark detection tool. This information is then used to dynamically configure the candidate generation (CG) and classification stages of the algorithm. In CG, a fast method based on tobogganing is proposed which also detects wall-adhering emboli. In addition, our proposed method correctly encapsulates potential PE candidates, which enables accurate feature calculation over the entire PE candidate. Finally, a classifier gating scheme has been designed that automatically switches to the appropriate classifier for suboptimal and optimal studies. The system performance has been validated on 86 real-world cases collected from different clinical sites. Results show around 5% improvement in the detection of segmental PE and 6% improvement in lobar and subsegmental PE, with a 40% decrease in the average false positive rate when compared to a similar system without contrast detection.
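Tobogganing, the candidate-generation technique named above, is a parameter-free clustering idea: each voxel slides down the steepest descent of a potential (here, raw intensity) until it reaches a local minimum, and voxels sharing a minimum form one candidate region. The 2D toy below illustrates only that idea, not the CAD system's actual candidate generator.

```python
def toboggan(grid):
    """Cluster pixels by sliding each one down its steepest descent path."""
    rows, cols = len(grid), len(grid[0])
    labels = {}

    def sink(p):
        # Memoized: returns the local minimum that pixel p slides into.
        if p in labels:
            return labels[p]
        r, c = p
        best, best_val = p, grid[r][c]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] < best_val:
                best, best_val = (nr, nc), grid[nr][nc]
        labels[p] = p if best == p else sink(best)
        return labels[p]

    return {(r, c): sink((r, c)) for r in range(rows) for c in range(cols)}

# A 3x3 intensity patch with a single minimum at the bottom-right corner:
clusters = toboggan([[5, 4, 3],
                     [4, 2, 1],
                     [3, 2, 0]])
```

In a CAD pipeline the potential would be derived from enhanced CTPA intensities, and the clusters sharing a minimum become PE candidates passed to feature extraction.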

  20. Basic emotions and adaptation. A computational and evolutionary model.

    Directory of Open Access Journals (Sweden)

    Daniela Pacella

    Full Text Available The core principles of the evolutionary theories of emotions declare that affective states represent crucial drives for action selection in the environment and regulate the behavior and adaptation of natural agents in ancestrally recurrent situations. While many different studies used autonomous artificial agents to simulate emotional responses and the way these patterns can affect decision-making, few approaches have tried to analyze the evolutionary emergence of affective behaviors directly from the specific adaptive problems posed by the ancestral environment. A model of the evolution of affective behaviors is presented using simulated artificial agents equipped with neural networks and physically inspired by the architecture of the iCub humanoid robot. We use genetic algorithms to train populations of virtual robots across generations, and investigate the spontaneous emergence of basic emotional behaviors in different experimental conditions. In particular, we focus on studying the emotion of fear; therefore the environment explored by the artificial agents can contain stimuli that are safe or dangerous to pick. The simulated task is based on classical conditioning and the agents are asked to learn a strategy to recognize whether the environment is safe or represents a threat to their lives and select the correct action to perform in the absence of any visual cues. The simulated agents have special input units in their neural structure whose activation keeps track of their actual "sensations" based on the outcome of past behavior. We train five different neural network architectures and then test the best-ranked individuals, comparing their performances and analyzing the unit activations in each individual's life cycle. We show that the agents, regardless of the presence of recurrent connections, spontaneously evolved the ability to cope with a potentially dangerous environment by collecting information about the environment and then

  1. Changes in Jump-Down Performance After Space Flight: Short- and Long-Term Adaptation

    Science.gov (United States)

    Kofman, I. S.; Reschke, M. F.; Cerisano, J. M.; Fisher, E. A.; Lawrence, E. L.; Peters, B. T.; Bloomberg, J. J.

    2010-01-01

    INTRODUCTION Successful jump performance requires functional coordination of visual, vestibular, and somatosensory systems, which are affected by prolonged exposure to microgravity. Astronauts returning from space flight exhibit impaired ability to coordinate effective landing strategies when jumping from a platform to the ground. This study compares the jump strategies used by astronauts before and after flight, the changes to those strategies within a test session, and the recoveries in jump-down performance parameters across several postflight test sessions. These data were obtained as part of an ongoing interdisciplinary study (Functional Task Test, FTT) designed to evaluate both astronaut postflight functional performance and related physiological changes. METHODS Six astronauts from short-duration (Shuttle) and three from long-duration (International Space Station) flights performed 3 two-footed jumps from a platform 30 cm high. A force plate measured the ground reaction forces and center-of-pressure displacement from the landings. Muscle activation data were collected from the medial gastrocnemius and anterior tibialis of both legs using surface electromyography electrodes. Two load cells in the platform measured the load exerted by each foot during the takeoff phase of the jump. Data were collected in 2 preflight sessions, on landing day (Shuttle only), and 1, 6, and 30 days after flight. RESULTS AND CONCLUSION Many of the astronauts tested were unable to maintain balance on their first postflight jump landing but recovered by the third jump, showing a learning progression in which the performance improvement could be attributed to adjustments of strategy on takeoff, landing, or both. Takeoff strategy changes were evident in air time (time between takeoff and landing), which was significantly reduced after flight, and also in increased asymmetry in foot latencies on takeoff. Landing modifications were seen in changes in ground reaction force curves. The

  2. Adaptation to short photoperiods augments circadian food anticipatory activity in Siberian hamsters.

    Science.gov (United States)

    Bradley, Sean P; Prendergast, Brian J

    2014-06-01

    This article is part of a Special Issue "Energy Balance". Both the light-dark cycle and the timing of food intake can entrain circadian rhythms. Entrainment to food is mediated by a food-entrainable circadian oscillator (FEO) that is formally and mechanistically separable from the hypothalamic light-entrainable oscillator. This experiment examined whether seasonal changes in day length affect the function of the FEO in male Siberian hamsters (Phodopus sungorus). Hamsters housed in long (LD; 15 h light/day) or short (SD; 9 h light/day) photoperiods were subjected to a timed-feeding schedule for 10 days, during which food was available only during a 5 h interval of the light phase. Running wheel activity occurring within a 3 h window immediately prior to actual or anticipated food delivery was operationally defined as food anticipatory activity (FAA). After the timed-feeding interval, hamsters were fed ad libitum, and FAA was assessed 2 and 7 days later via probe trials of total food deprivation. During timed-feeding, all hamsters exhibited increases in FAA, but FAA emerged more rapidly in SD; in probe trials, FAA was greater in magnitude and persistence in SD. Gonadectomy in LD did not induce the SD-like FAA phenotype, indicating that withdrawal of gonadal hormones is not sufficient to mediate the effects of photoperiod on FAA. Entrainment of the circadian system to light markedly affects the functional output of the FEO via gonadal hormone-independent mechanisms. Rapid emergence and persistent expression of FAA in SD may reflect a seasonal adaptation that directs behavior toward sources of nutrition with high temporal precision at times of year when food is scarce.

  3. Short-term adaptations following Complex Training in team-sports: A meta-analysis.

    Science.gov (United States)

    Freitas, Tomás T; Martinez-Rodriguez, Alejandro; Calleja-González, Julio; Alcaraz, Pedro E

    2017-01-01

    The purpose of this meta-analysis was to study the short-term adaptations on sprint and vertical jump (VJ) performance following Complex Training (CT) in team-sports. CT is a resistance training method aimed at developing both strength and power, which has a direct effect on sprint and VJ. It consists of alternating heavy resistance training exercises with plyometric/power ones, set for set, on the same workout. A search of electronic databases up to July 2016 (PubMed-MEDLINE, SPORTDiscus, Web of Knowledge) was conducted. Inclusion criteria: 1) at least one CT intervention group; 2) training protocols ≥4-wks; 3) sample of team-sport players; 4) sprint or VJ as an outcome variable. Effect sizes (ES) of each intervention were calculated and subgroup analyses were performed. A total of 9 studies (13 CT groups) met the inclusion criteria. Medium effect sizes (ES) (ES = 0.73) were obtained for pre-post improvements in sprint, and small (ES = 0.41) in VJ, following CT. Experimental-groups presented better post-intervention sprint (ES = 1.01) and VJ (ES = 0.63) performance than control-groups. Large ESs were exhibited in younger athletes (12 total sessions (ES = 0.74). Large ESs in programs with >12 total sessions (ES = 0.81). Medium ESs obtained for under-Division I individuals (ES = 0.56); protocols with intracomplex rest intervals ≥2 min (ES = 0.55); conditioning activities with intensities ≤85% 1RM (ES = 0.64); basketball/volleyball players (ES = 0.55). Small ESs were found for younger athletes (ES = 0.42); interventions ≥6 weeks (ES = 0.45). CT interventions have positive medium effects on sprint performance and small effects on VJ in team-sport athletes. This training method is a suitable option to include in the season planning.

  4. Short-term adaptations following Complex Training in team-sports: A meta-analysis

    Science.gov (United States)

    Martinez-Rodriguez, Alejandro; Calleja-González, Julio; Alcaraz, Pedro E.

    2017-01-01

    Objective The purpose of this meta-analysis was to study the short-term adaptations on sprint and vertical jump (VJ) performance following Complex Training (CT) in team-sports. CT is a resistance training method aimed at developing both strength and power, which has a direct effect on sprint and VJ. It consists of alternating heavy resistance training exercises with plyometric/power ones, set for set, on the same workout. Methods A search of electronic databases up to July 2016 (PubMed-MEDLINE, SPORTDiscus, Web of Knowledge) was conducted. Inclusion criteria: 1) at least one CT intervention group; 2) training protocols ≥4-wks; 3) sample of team-sport players; 4) sprint or VJ as an outcome variable. Effect sizes (ES) of each intervention were calculated and subgroup analyses were performed. Results A total of 9 studies (13 CT groups) met the inclusion criteria. Medium effect sizes (ES) (ES = 0.73) were obtained for pre-post improvements in sprint, and small (ES = 0.41) in VJ, following CT. Experimental-groups presented better post-intervention sprint (ES = 1.01) and VJ (ES = 0.63) performance than control-groups. Sprint large ESs were exhibited in younger athletes (players (ES = 0.76); training programs >12 total sessions (ES = 0.74). VJ Large ESs in programs with >12 total sessions (ES = 0.81). Medium ESs obtained for under-Division I individuals (ES = 0.56); protocols with intracomplex rest intervals ≥2 min (ES = 0.55); conditioning activities with intensities ≤85% 1RM (ES = 0.64); basketball/volleyball players (ES = 0.55). Small ESs were found for younger athletes (ES = 0.42); interventions ≥6 weeks (ES = 0.45). Conclusions CT interventions have positive medium effects on sprint performance and small effects on VJ in team-sport athletes. This training method is a suitable option to include in the season planning. PMID:28662108

  5. Short-term adaptations following Complex Training in team-sports: A meta-analysis.

    Directory of Open Access Journals (Sweden)

    Tomás T Freitas

    Full Text Available The purpose of this meta-analysis was to study the short-term adaptations on sprint and vertical jump (VJ) performance following Complex Training (CT) in team-sports. CT is a resistance training method aimed at developing both strength and power, which has a direct effect on sprint and VJ. It consists of alternating heavy resistance training exercises with plyometric/power ones, set for set, on the same workout. A search of electronic databases up to July 2016 (PubMed-MEDLINE, SPORTDiscus, Web of Knowledge) was conducted. Inclusion criteria: 1) at least one CT intervention group; 2) training protocols ≥4-wks; 3) sample of team-sport players; 4) sprint or VJ as an outcome variable. Effect sizes (ES) of each intervention were calculated and subgroup analyses were performed. A total of 9 studies (13 CT groups) met the inclusion criteria. Medium effect sizes (ES) (ES = 0.73) were obtained for pre-post improvements in sprint, and small (ES = 0.41) in VJ, following CT. Experimental-groups presented better post-intervention sprint (ES = 1.01) and VJ (ES = 0.63) performance than control-groups. Large ESs were exhibited in younger athletes (12 total sessions (ES = 0.74). Large ESs in programs with >12 total sessions (ES = 0.81). Medium ESs obtained for under-Division I individuals (ES = 0.56); protocols with intracomplex rest intervals ≥2 min (ES = 0.55); conditioning activities with intensities ≤85% 1RM (ES = 0.64); basketball/volleyball players (ES = 0.55). Small ESs were found for younger athletes (ES = 0.42); interventions ≥6 weeks (ES = 0.45). CT interventions have positive medium effects on sprint performance and small effects on VJ in team-sport athletes. This training method is a suitable option to include in the season planning.
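The pooled effect sizes above are standardized mean differences; the arithmetic behind labels such as small (≈0.2), medium (≈0.5) and large (≈0.8) is just a pooled-SD Cohen's d. The group means, SDs and sample sizes below are invented for illustration.

```python
import math

def cohens_d(m1, s1, n1, m2, s2, n2):
    """Standardized mean difference between two groups (pooled SD)."""
    pooled = math.sqrt(((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2)
                       / (n1 + n2 - 2))
    return (m1 - m2) / pooled

# Hypothetical 20 m sprint times (s): CT group vs. control, post-test.
d = cohens_d(4.10, 0.20, 15, 4.28, 0.22, 15)
# Negative d favors the CT group (faster sprint); |d| is about 0.86, "large".
```

Meta-analyses then weight such per-study values (often with a small-sample correction to Hedges' g) before pooling.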

  6. Translation, cross-cultural adaptation and validation of the Diabetes Empowerment Scale - Short Form.

    Science.gov (United States)

    Chaves, Fernanda Figueredo; Reis, Ilka Afonso; Pagano, Adriana Silvina; Torres, Heloísa de Carvalho

    2017-03-23

    To translate, cross-culturally adapt and validate the Diabetes Empowerment Scale - Short Form for assessment of psychosocial self-efficacy in diabetes care within the Brazilian cultural context. Assessment of the instrument's conceptual equivalence, as well as its translation and cross-cultural adaptation, were performed following international standards. The Expert Committee's assessment of the translated version was conducted through a web questionnaire developed and applied via the web tool e-Surv. The cross-culturally adapted version was used for the pre-test, which was carried out via phone call in a group of eleven health care service users diagnosed with type 2 diabetes mellitus. The pre-test results were examined by a group of experts, composed of health care consultants, applied linguists and statisticians, aiming at an adequate version of the instrument, which was subsequently used for test and retest in a sample of 100 users diagnosed with type 2 diabetes mellitus via phone call, their answers being recorded by the web tool e-Surv. Internal consistency and reproducibility analyses were carried out within the statistical programming environment R. Face and content validity were attained and the Brazilian Portuguese version, entitled Escala de Autoeficácia em Diabetes - Versão Curta, was established. The scale had acceptable internal consistency with Cronbach's alpha of 0.634 (95%CI 0.494-0.737), while the correlation of the total score in the two periods was considered moderate (0.47). The intraclass correlation coefficient was 0.50. The translated and cross-culturally adapted version of the instrument to spoken Brazilian Portuguese was considered valid and reliable to be used for assessment within the Brazilian population diagnosed with type 2 diabetes mellitus. The use of a web tool (e-Surv) for recording the Expert Committee responses as well as the responses in the validation tests proved to be a reliable, safe and innovative method.
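Cronbach's alpha, the internal-consistency statistic reported above, can be computed directly from a respondents-by-items score matrix. The study used R; the sketch below uses Python/NumPy with invented data.

```python
import numpy as np

def cronbach_alpha(scores):
    """scores: 2-D array-like, rows = respondents, columns = scale items."""
    x = np.asarray(scores, dtype=float)
    k = x.shape[1]
    item_var = x.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = x.sum(axis=1).var(ddof=1)    # variance of total scores
    return k / (k - 1) * (1.0 - item_var / total_var)

# Four hypothetical respondents answering a 3-item subscale.
alpha = cronbach_alpha([[1, 1, 1],
                        [2, 2, 2],
                        [3, 3, 3],
                        [4, 4, 5]])
```

An alpha of 0.634, as reported for the Brazilian version, falls below the 0.70 benchmark often cited for group-level research, so the test-retest correlation and intraclass correlation coefficient provide useful complementary evidence.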

  7. Translation, cultural adaptation and validation of the English "Short form SF 12v2" into Bengali in rheumatoid arthritis patients

    OpenAIRE

    Islam, Nazrul; Khan, Ikramul Hasan; Ferdous, Nira; Rasker, Johannes J.

    2017-01-01

    Background To develop a culturally adapted and validated Bengali Short Form SF 12v2 among Rheumatoid arthritis (RA) patients. Methods The English SF 12v2 was translated, adapted and back translated into and from Bengali, pre-tested by 60 patients. The Bengali SF 12v2 was administered twice with 14 days interval to 130 Bangladeshi RA patients. The psychometric properties of the Bengali SF 12v2 were assessed. Test-retest reliability was assessed by intra-class correlation coefficient (ICC) and ...

  8. Long-term metabolic and skeletal muscle adaptations to short-sprint training: implications for sprint training and tapering.

    Science.gov (United States)

    Ross, A; Leveritt, M

    2001-01-01

    The adaptations of muscle to sprint training can be separated into metabolic and morphological changes. Enzyme adaptations represent a major metabolic adaptation to sprint training, with the enzymes of all three energy systems showing signs of adaptation to training and some evidence of a return to baseline levels with detraining. Myokinase and creatine phosphokinase have shown small increases as a result of short-sprint training in some studies and elite sprinters appear better able to rapidly break down phosphocreatine (PCr) than the sub-elite. No changes in these enzyme levels have been reported as a result of detraining. Similarly, glycolytic enzyme activity (notably lactate dehydrogenase, phosphofructokinase and glycogen phosphorylase) has been shown to increase after training consisting of either long (>10-second) or short (<10-second) sprints, returning to pre-training levels after somewhere between 7 weeks and 6 months of detraining. Mitochondrial enzyme activity also increases after sprint training, particularly when long sprints or short recovery between short sprints are used as the training stimulus. Morphological adaptations to sprint training include changes in muscle fibre type, sarcoplasmic reticulum, and fibre cross-sectional area. An appropriate sprint training programme could be expected to induce a shift toward type IIa muscle, increase muscle cross-sectional area and increase the sarcoplasmic reticulum volume to aid release of Ca(2+). Training volume and/or frequency of sprint training in excess of what is optimal for an individual, however, will induce a shift toward slower muscle contractile characteristics. In contrast, detraining appears to shift the contractile characteristics towards type IIb, although muscle atrophy is also likely to occur. Muscle conduction velocity appears to be a potential non-invasive method of monitoring contractile changes in response to sprint training and detraining. In summary, adaptation to sprint training is clearly dependent on the duration of

  9. Computation of the Short-Time Linear Canonical Transform with Dual Window

    Directory of Open Access Journals (Sweden)

    Lei Huang

    2017-01-01

    Full Text Available The short-time linear canonical transform (STLCT), which maps a time-domain signal into the joint time-frequency domain, has recently attracted some attention in the area of signal processing. However, its applications are still limited because the selection of the coefficients of the short-time linear canonical series (STLCS) is not unique: the time and frequency elementary functions (together known as the basis functions of the STLCS) do not constitute an orthogonal basis. To solve this problem, this paper investigates a dual-window solution. First, the nonorthogonality suffered by the original window is remedied by imposing an orthogonality condition on a dual window. Then, based on the obtained condition, a dual-window computation approach for the GT is extended to the STLCS. Simulations verify the validity of the proposed condition and solutions, and some possible directions for application are discussed.

  10. Method for computing short-range forces between solid-liquid interfaces driving grain boundary premelting

    Science.gov (United States)

    Hoyt, J. J.; Olmsted, David; Jindal, Saryu; Asta, Mark; Karma, Alain

    2009-02-01

    We present a molecular dynamics based method for accurately computing short-range structural forces resulting from the overlap of spatially diffuse solid-liquid interfaces at wetted grain boundaries close to the melting point. The method is based on monitoring the fluctuations of the liquid layer width at different temperatures to extract the excess interfacial free energy as a function of this width. The method is illustrated for a high-energy Σ9 twist boundary in pure Ni. The short-range repulsion driving premelting is found to be dominant in comparison to long-range dispersion and entropic forces and consistent with previous experimental findings that nanometer-scale layer widths may be observed only very close to the melting point.
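    The fluctuation analysis described above can be sketched by Boltzmann inversion: sample the liquid-layer width w, histogram it, and convert the probabilities into an excess free energy V(w) = -kT ln P(w). The sketch below uses a unit kT and synthetic width samples; it illustrates the inversion step only, not the paper's molecular dynamics setup for Ni.

```python
import math
from collections import Counter

KB_T = 1.0  # energy in units of k_B*T; illustrative, not the paper's Ni parameters

def free_energy_profile(widths, bin_width=0.5):
    """Boltzmann inversion of sampled layer widths: V(w) = -kT ln P(w),
    reported up to an additive constant (the minimum is shifted to zero)."""
    counts = Counter(round(w / bin_width) for w in widths)
    total = sum(counts.values())
    profile = {b * bin_width: -KB_T * math.log(c / total) for b, c in counts.items()}
    v0 = min(profile.values())  # shift so the most probable width has V = 0
    return {w: v - v0 for w, v in profile.items()}
```

    Widths that are sampled often map to low free energy; rare widths map to high free energy, which is how the repulsive short-range part of the profile is exposed.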

  11. Adaptive MultiCAP modulation for short range VCSEL based transmissions

    DEFF Research Database (Denmark)

    Puerta Ramírez, Rafael; Vegas Olmos, Juan José; Tafur Monroy, Idelfonso

    2016-01-01

    We propose an adaptive approach for multi-band carrierless amplitude/phase modulation, with advantages of adaptive bit rate and energy savings. Successful performance is demonstrated on 850 nm multi-mode VCSEL based transmissions achieving up to 40.6 Gb/s.

  12. Examining the short term effects of emotion under an Adaptation Level Theory model of tinnitus perception.

    Science.gov (United States)

    Durai, Mithila; O'Keeffe, Mary G; Searchfield, Grant D

    2017-03-01

    Existing evidence suggests a strong relationship between tinnitus and emotion. The objective of this study was to examine the effects of short-term emotional changes along valence and arousal dimensions on tinnitus outcomes. Emotional stimuli were presented in two different modalities: auditory and visual. The authors hypothesized that (1) negative valence (unpleasant) stimuli and/or high arousal stimuli will lead to greater tinnitus loudness and annoyance than positive valence and/or low arousal stimuli, and (2) auditory emotional stimuli, which are in the same modality as the tinnitus, will exhibit a greater effect on tinnitus outcome measures than visual stimuli. Auditory and visual emotive stimuli were administered to 22 participants (12 females and 10 males) with chronic tinnitus, recruited via email invitations sent out to the University of Auckland Tinnitus Research Volunteer Database. Emotional stimuli used were taken from the International Affective Digital Sounds - Version 2 (IADS-2) and the International Affective Picture System (IAPS) (Bradley and Lang, 2007a, 2007b). The Emotion Regulation Questionnaire (Gross and John, 2003) was administered alongside subjective ratings of tinnitus loudness and annoyance, and psychoacoustic sensation level matches to external sounds. Males had significantly different emotional regulation scores than females. Negative valence emotional auditory stimuli led to higher tinnitus loudness ratings in males and females and higher annoyance ratings in males only; loudness matches of tinnitus remained unchanged. The visual stimuli did not have an effect on tinnitus ratings. The results are discussed relative to the Adaptation Level Theory Model of Tinnitus. The results indicate that the negative valence dimension of emotion is associated with increased tinnitus magnitude judgements and gender effects may also be present, but only when the emotional stimulus is in the auditory modality.
Sounds with emotional associations may be

  13. The Psychological Well-Being and Sociocultural Adaptation of Short-Term International Students in Ireland

    Science.gov (United States)

    O'Reilly, Aileen; Ryan, Dermot; Hickey, Tina

    2010-01-01

    This article reports on an empirical study of the psychosocial adaptation of international students in Ireland. Using measures of social support, loneliness, stress, psychological well-being, and sociocultural adaptation, data were obtained from international students and a comparison sample of Irish students. The study found that, although…

  14. Simple adaptive sparse representation based classification schemes for EEG based brain-computer interface applications.

    Science.gov (United States)

    Shin, Younghak; Lee, Seungchan; Ahn, Minkyu; Cho, Hohyun; Jun, Sung Chan; Lee, Heung-No

    2015-11-01

    One of the main problems related to electroencephalogram (EEG) based brain-computer interface (BCI) systems is the non-stationarity of the underlying EEG signals. This results in the deterioration of the classification performance during experimental sessions. Therefore, adaptive classification techniques are required for EEG based BCI applications. In this paper, we propose simple adaptive sparse representation based classification (SRC) schemes. Supervised and unsupervised dictionary update techniques for new test data and a dictionary modification method by using the incoherence measure of the training data are investigated. The proposed methods are very simple and additional computation for the re-training of the classifier is not needed. The proposed adaptive SRC schemes are evaluated using two BCI experimental datasets. The proposed methods are assessed by comparing classification results with the conventional SRC and other adaptive classification methods. On the basis of the results, we find that the proposed adaptive schemes show relatively improved classification accuracy as compared to conventional methods without requiring additional computation. Copyright © 2015 Elsevier Ltd. All rights reserved.
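    As a toy illustration of the sparse-representation idea behind SRC (and not the authors' adaptive dictionary-update scheme), the sketch below codes a test sample with sparsity one: each class contributes its best-matching unit-norm dictionary atom, and the class whose atom leaves the smallest reconstruction residual wins.

```python
def _dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def src_classify(sample, dictionary):
    """Sparsity-1 sparse-representation classification.

    dictionary: list of (class_label, atom) pairs, atoms assumed unit-norm.
    The sample is projected onto each class's best atom; the class with the
    smallest residual energy after projection is returned.
    """
    best = {}
    for label, atom in dictionary:
        coef = _dot(sample, atom)  # least-squares coefficient for a unit atom
        residual = _dot(sample, sample) - coef * coef
        if label not in best or residual < best[label]:
            best[label] = residual
    return min(best, key=best.get)
```

    Real SRC solves an L1-regularized coding problem over the whole dictionary; the sparsity-1 shortcut above keeps the residual-based decision rule while remaining a few lines long.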

  15. Applications of automatic mesh generation and adaptive methods in computational medicine

    Energy Technology Data Exchange (ETDEWEB)

    Schmidt, J.A.; Macleod, R.S. [Univ. of Utah, Salt Lake City, UT (United States); Johnson, C.R.; Eason, J.C. [Duke Univ., Durham, NC (United States)

    1995-12-31

    Important problems in Computational Medicine exist that can benefit from the implementation of adaptive mesh refinement techniques. Biological systems are so inherently complex that only efficient models running on state of the art hardware can begin to simulate reality. To tackle the complex geometries associated with medical applications we present a general purpose mesh generation scheme based upon the Delaunay tessellation algorithm and an iterative point generator. In addition, automatic, two- and three-dimensional adaptive mesh refinement methods are presented that are derived from local and global estimates of the finite element error. Mesh generation and adaptive refinement techniques are utilized to obtain accurate approximations of bioelectric fields within anatomically correct models of the heart and human thorax. Specifically, we explore the simulation of cardiac defibrillation and the general forward and inverse problems in electrocardiography (ECG). Comparisons between uniform and adaptive refinement techniques are made to highlight the computational efficiency and accuracy of adaptive methods in the solution of field problems in computational medicine.
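    The adaptive refinement idea, reduced to one dimension: bisect any interval whose local error estimate exceeds a tolerance, so that elements concentrate where the field varies fastest. This is a generic sketch of error-driven refinement, unrelated to the authors' Delaunay-based implementation:

```python
def refine(mesh, f, tol, max_iter=20):
    """Bisect every interval whose midpoint interpolation error exceeds tol.

    mesh: sorted list of node coordinates; f: the sampled field.
    The midpoint error |f(mid) - linear interpolant| serves as a crude
    local error estimator, standing in for a finite element error estimate.
    """
    for _ in range(max_iter):
        new = [mesh[0]]
        refined = False
        for a, b in zip(mesh, mesh[1:]):
            mid = 0.5 * (a + b)
            err = abs(f(mid) - 0.5 * (f(a) + f(b)))  # local error estimate
            if err > tol:
                new.append(mid)  # refine this interval
                refined = True
            new.append(b)
        mesh = new
        if not refined:
            break
    return mesh
```

    For a smoothly curving field the loop terminates once every interval's estimated error falls below the tolerance; for f(x) = x² the estimate is (b-a)²/4, so refinement stops at a uniform spacing just fine enough to satisfy it.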

  16. Computer Security Awareness Guide for Department of Energy Laboratories, Government Agencies, and others for use with Lawrence Livermore National Laboratory's (LLNL): Computer security short subjects videos

    Energy Technology Data Exchange (ETDEWEB)

    1993-12-31

    Lonnie Moore, the Computer Security Manager, CSSM/CPPM at Lawrence Livermore National Laboratory (LLNL) and Gale Warshawsky, the Coordinator for Computer Security Education & Awareness at LLNL, wanted to share topics such as computer ethics, software piracy, privacy issues, and protecting information in a format that would capture and hold an audience's attention. Four Computer Security Short Subject videos were produced, each running 1-3 minutes. These videos are very effective education and awareness tools that can be used to generate discussions about computer security concerns and good computing practices. Leaders may incorporate the Short Subjects into presentations. After talking about a subject area, one of the Short Subjects may be shown to highlight that subject matter. Another method for sharing them could be to show a Short Subject first and then lead a discussion about its topic. The cast of characters and a bit of information about their personalities in the LLNL Computer Security Short Subjects is included in this report.

  17. Progress Monitoring with Computer Adaptive Assessments: The Impact of Data Collection Schedule on Growth Estimates

    Science.gov (United States)

    Nelson, Peter M.; Van Norman, Ethan R.; Klingbeil, Dave A.; Parker, David C.

    2017-01-01

    Although extensive research exists on the use of curriculum-based measures for progress monitoring, little is known about using computer adaptive tests (CATs) for progress-monitoring purposes. The purpose of this study was to evaluate the impact of the frequency of data collection on individual and group growth estimates using a CAT. Data were…

  18. The Effects of Routing and Scoring within a Computer Adaptive Multi-Stage Framework

    Science.gov (United States)

    Dallas, Andrew

    2014-01-01

    This dissertation examined the overall effects of routing and scoring within a computer adaptive multi-stage framework (ca-MST). Testing in a ca-MST environment has become extremely popular in the testing industry. Testing companies enjoy its efficiency benefits as compared to traditionally linear testing and its quality-control features over…

  19. Identifying Students at Risk: An Examination of Computer-Adaptive Measures and Latent Class Growth Analysis

    Science.gov (United States)

    Keller-Margulis, Milena; McQuillin, Samuel D.; Castañeda, Juan Javier; Ochs, Sarah; Jones, John H.

    2018-01-01

    Multitiered systems of support depend on screening technology to identify students at risk. The purpose of this study was to examine the use of a computer-adaptive test and latent class growth analysis (LCGA) to identify students at risk in reading with focus on the use of this methodology to characterize student performance in screening.…

  20. METHOD, APPARATUS AND COMPUTER PROGRAM FOR ADAPTIVE COMPENSATION OF A MTF

    OpenAIRE

    Bonnier, Nicolas; Lindner, Albrecht

    2010-01-01

    A method and system for adaptively compensating for the printer MTF. The MTF compensation applied locally depends on the local mean: several compensated high-pass images are computed for different mean values; then, locally, depending on the value of the low-pass band, one compensated high-pass value is selected for the final result.
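    The local selection step described in the abstract can be sketched per sample: precompute a compensation gain per reference mean level, then pick the gain whose level is closest to the local low-pass value. The gain table and the band split below are invented placeholders, not values from the patent.

```python
# Hypothetical gain table: compensation strength per local-mean level.
# These numbers are illustrative, not a measured printer MTF.
GAINS = {0.25: 2.0, 0.5: 1.5, 0.75: 1.2}

def compensate(lowpass, highpass):
    """Per sample, select the precomputed compensated high-pass value whose
    reference mean level is closest to the local low-pass value, then
    recombine the two bands."""
    out = []
    for lp, hp in zip(lowpass, highpass):
        level = min(GAINS, key=lambda m: abs(m - lp))  # nearest mean level
        out.append(lp + GAINS[level] * hp)             # boosted recombination
    return out
```

    Darker regions (low local mean) get a stronger high-pass boost here, mimicking the idea that the compensation applied locally depends on the local mean.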

  1. Detection of User Independent Single Trial ERPs in Brain Computer Interfaces: An Adaptive Spatial Filtering Approach

    DEFF Research Database (Denmark)

    Leza, Cristina; Puthusserypady, Sadasivan

    2017-01-01

    Brain Computer Interfaces (BCIs) use brain signals to communicate with the external world. The main challenges to address are speed, accuracy and adaptability. Here, a novel algorithm for P300 based BCI spelling system is presented, specifically suited for single-trial detection of Event...

  2. The Effect of Adaptive Confidence Strategies in Computer-Assisted Instruction on Learning and Learner Confidence

    Science.gov (United States)

    Warren, Richard Daniel

    2012-01-01

    The purpose of this research was to investigate the effects of including adaptive confidence strategies in instructionally sound computer-assisted instruction (CAI) on learning and learner confidence. Seventy-one general educational development (GED) learners recruited from various GED learning centers at community colleges in the southeast United…

  3. A comparison of computerized adaptive testing and fixed-length short forms for the Prosthetic Limb Users Survey of Mobility (PLUS-M™).

    Science.gov (United States)

    Amtmann, Dagmar; Bamer, Alyssa M; Kim, Jiseon; Bocell, Fraser; Chung, Hyewon; Park, Ryoungsun; Salem, Rana; Hafner, Brian J

    2017-09-01

    New health status instruments can be administered by computerized adaptive test or short forms. The Prosthetic Limb Users Survey of Mobility (PLUS-M™) is a self-report measure of mobility for prosthesis users with lower limb loss. This study used the PLUS-M to examine advantages and disadvantages of computerized adaptive test and short forms. To compare scores obtained from computerized adaptive test to scores obtained from fixed-length short forms (7-item and 12-item) in order to provide guidance to researchers and clinicians on how to select the best form of administration for different uses. Cross-sectional, observational study. Individuals with lower limb loss completed the PLUS-M by computerized adaptive test and short forms. Administration time, correlations between the scores, and standard errors were compared. Scores and standard errors from the computerized adaptive test, 7-item short form, and 12-item short form were highly correlated and all forms of administration were efficient. Computerized adaptive test required less time to administer than either paper or electronic short forms; however, time savings were minimal compared to the 7-item short form. Results indicate that the PLUS-M computerized adaptive test is most efficient, and differences in scores between administration methods are minimal. The main advantage of the computerized adaptive test was more reliable scores at higher levels of mobility compared to short forms. Clinical relevance: Health-related item banks, like the Prosthetic Limb Users Survey of Mobility (PLUS-M™), can be administered by computerized adaptive testing (CAT) or as fixed-length short forms (SFs). Results of this study will help clinicians and researchers decide whether they should invest in a CAT administration system or whether SFs are more appropriate.
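    The efficiency advantage of a CAT over a fixed short form comes from always administering the most informative remaining item at the current ability estimate. A minimal item-selection sketch under a two-parameter logistic (2PL) IRT model, with made-up item parameters; the PLUS-M's actual calibration is not reproduced here:

```python
import math

def item_information(theta, a, b):
    """Fisher information of a 2PL item at ability level theta."""
    p = 1.0 / (1.0 + math.exp(-a * (theta - b)))  # probability of endorsing
    return a * a * p * (1.0 - p)

def next_item(theta, items, asked):
    """Pick the not-yet-administered item with maximum information at theta.

    items: list of (discrimination a, difficulty b) tuples;
    asked: set of indices already administered.
    """
    candidates = [i for i in range(len(items)) if i not in asked]
    return max(candidates, key=lambda i: item_information(theta, *items[i]))
```

    Information peaks where difficulty matches ability, so a respondent estimated near theta = 0 is routed to the medium-difficulty item first; after each response theta would be re-estimated and the selection repeated until a stopping rule (fixed length or standard-error threshold) is met.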

  4. Autonomic intrusion detection: Adaptively detecting anomalies over unlabeled audit data streams in computer networks

    KAUST Repository

    Wang, Wei

    2014-06-22

    In this work, we propose a novel framework of autonomic intrusion detection that fulfills online and adaptive intrusion detection over unlabeled HTTP traffic streams in computer networks. The framework holds potential for self-managing: self-labeling, self-updating and self-adapting. Our framework employs the Affinity Propagation (AP) algorithm to learn a subject's behaviors through dynamical clustering of the streaming data. It automatically labels the data and adapts to normal behavior changes while identifying anomalies. Two large real HTTP traffic streams collected in our institute as well as a set of benchmark KDD'99 data are used to validate the framework and the method. The test results show that the autonomic model achieves better results in terms of effectiveness and efficiency compared to the adaptive Sequential Karhunen-Loeve method and static AP, as well as three other static anomaly detection methods, namely k-NN, PCA and SVM.
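    The Affinity Propagation step at the core of the framework exchanges "responsibility" and "availability" messages between points until exemplars emerge. A compact illustrative implementation for 1-D points follows; the damping factor and the median-preference choice are common defaults, not necessarily the paper's settings.

```python
def affinity_propagation(points, damping=0.5, iters=100):
    """Cluster 1-D points with Affinity Propagation.

    Similarity is negative squared distance; every point gets the median
    off-diagonal similarity as its preference. Returns one exemplar index
    per point (points sharing an exemplar form a cluster)."""
    n = len(points)
    s = [[-(points[i] - points[k]) ** 2 for k in range(n)] for i in range(n)]
    flat = sorted(s[i][k] for i in range(n) for k in range(n) if i != k)
    pref = flat[len(flat) // 2]  # shared preference: median similarity
    for i in range(n):
        s[i][i] = pref
    r = [[0.0] * n for _ in range(n)]  # responsibilities
    a = [[0.0] * n for _ in range(n)]  # availabilities
    for _ in range(iters):
        for i in range(n):
            for k in range(n):
                best = max(a[i][kk] + s[i][kk] for kk in range(n) if kk != k)
                r[i][k] = damping * r[i][k] + (1 - damping) * (s[i][k] - best)
        for i in range(n):
            for k in range(n):
                if i == k:
                    new = sum(max(0.0, r[ii][k]) for ii in range(n) if ii != k)
                else:
                    new = min(0.0, r[k][k] + sum(max(0.0, r[ii][k])
                                                 for ii in range(n)
                                                 if ii not in (i, k)))
                a[i][k] = damping * a[i][k] + (1 - damping) * new
    return [max(range(n), key=lambda k: a[i][k] + r[i][k]) for i in range(n)]
```

    Unlike k-means, the number of clusters is not fixed in advance: it falls out of the preference value, which is what lets the framework adapt as traffic behavior drifts.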

  5. A computer simulation of an adaptive noise canceler with a single input

    Science.gov (United States)

    Albert, Stuart D.

    1991-06-01

    A description of an adaptive noise canceler using Widrow's LMS algorithm is presented. A computer simulation of canceler performance (adaptive convergence time and frequency transfer function) was written for use as a design tool. The simulations, assumptions, and input parameters are described in detail. The simulation is used in a design example to predict the performance of an adaptive noise canceler in the simultaneous presence of both strong and weak narrow-band signals (a cosited frequency hopping radio scenario). On the basis of the simulation results, it is concluded that the simulation is suitable for use as an adaptive noise canceler design tool; i.e., it can be used to evaluate the effect of design parameter changes on canceler performance.
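    Widrow's LMS update is short enough to sketch directly. The sketch below uses the classic two-input arrangement (a primary input plus a reference correlated with the noise) rather than the report's single-input variant; the filter learns to predict the noise from the reference and subtracts its prediction, so the error output is the cleaned signal.

```python
def lms_cancel(primary, reference, n_taps=4, mu=0.05):
    """LMS adaptive noise canceler (two-input form).

    An FIR filter on the reference input is adapted so that its output
    tracks the noise component of the primary input; the running error
    e = primary - estimate is both the adaptation signal and the output.
    """
    w = [0.0] * n_taps
    out = []
    for n in range(len(primary)):
        # Current reference tap vector (zero-padded at the start).
        x = [reference[n - j] if n - j >= 0 else 0.0 for j in range(n_taps)]
        y = sum(wi * xi for wi, xi in zip(w, x))   # noise estimate
        e = primary[n] - y                         # cleaned sample
        w = [wi + 2.0 * mu * e * xi for wi, xi in zip(w, x)]  # LMS weight update
        out.append(e)
    return out
```

    With the reference perfectly correlated with the primary noise, the error converges toward zero; the step size mu trades convergence speed against misadjustment and must stay below the stability bound set by the reference signal power.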

  6. Dynamic Context-Aware and Limited Resources-Aware Service Adaptation for Pervasive Computing

    Directory of Open Access Journals (Sweden)

    Moeiz Miraoui

    2011-01-01

    Full Text Available A pervasive computing system (PCS requires that devices be context aware in order to provide proactively adapted services according to the current context. Because of the highly dynamic environment of a PCS, the service adaptation task must be performed during device operation. Most of the proposed approaches do not deal with the problem in depth, because they are either not really context aware or the problem itself is not thought to be dynamic. Devices in a PCS are generally hand-held, that is, they have limited resources, and so, in the effort to make them more reliable, the service adaptation must take into account this constraint. In this paper, we propose a dynamic service adaptation approach for a device operating in a PCS that is both context aware and limited resources aware. The approach is then modeled using colored Petri Nets and simulated using the CPN Tools, an important step toward its validation.

  7. Comparing computer adaptive and curriculum-based measures of math in progress monitoring.

    Science.gov (United States)

    Shapiro, Edward S; Dennis, Minyi Shih; Fu, Qiong

    2015-12-01

    The purpose of the study was to compare the use of a Computer Adaptive Test and Curriculum-Based Measurement in the assessment of mathematics. This study also investigated the degree to which slope or rate of change predicted student outcomes on the annual state assessment of mathematics above and beyond scores of single point screening assessments (i.e., the computer adaptive test or the CBM assessment just before the administration of the state assessment). Repeated measurement of mathematics once per month across a 7-month period using a Computer Adaptive Test (STAR-Math) and Curriculum-Based Measurement (CBM, AIMSweb Math Computation, AIMSweb Math Concepts/Applications) was collected for a maximum total of 250 third, fourth, and fifth grade students. Results showed STAR-Math in all 3 grades and AIMSweb Math Concepts/Applications in the third and fifth grades had primarily linear growth patterns in mathematics. AIMSweb Math Computation in all grades and AIMSweb Math Concepts/Applications in Grade 4 had decelerating positive trends. Predictive validity evidence showed the strongest relationships were between STAR-Math and outcomes for third and fourth grade students. The blockwise multiple regression by grade revealed that slopes accounted for only a very small proportion of additional variance above and beyond what was explained by the scores obtained on a single point of assessment just prior to the administration of the state assessment. (c) 2015 APA, all rights reserved.

  8. Computational and experimental analysis of short peptide motifs for enzyme inhibition.

    Directory of Open Access Journals (Sweden)

    Jinglin Fu

    Full Text Available The metabolism of living systems involves many enzymes that play key roles as catalysts and are essential to biological function. Searching for ligands with the ability to modulate enzyme activities is central to diagnosis and therapeutics. Peptides represent a promising class of potential enzyme modulators due to their large chemical diversity and well-established methods for library synthesis. Peptides and their derivatives play critical roles in modulating enzymes and mediating cellular uptake, which are increasingly valuable in therapeutics. We present a methodology that uses molecular dynamics (MD) and point-variant screening to identify short peptide motifs that are critical for inhibiting β-galactosidase (β-Gal). MD was used to simulate the conformations of peptides and to suggest short motifs that were most populated in simulated conformations. The function of the simulated motifs was further validated by experimental point-variant screening as critical segments for inhibiting the enzyme. Based on the validated motifs, we eventually identified a 7-mer short peptide that inhibits the enzyme with a low μM IC50. The advantage of our methodology is a relatively simple simulation that is informative enough to identify the critical sequence of a peptide inhibitor, with a precision comparable to truncation and alanine scanning experiments. Our combined experimental and computational approach does not rely on a detailed understanding of mechanistic and structural details. The MD simulation suggests populated motifs that are consistent with the results of experimental alanine and truncation scanning. The approach appears to be applicable to both natural and artificial peptides. As more short motifs are discovered in the future, they could be exploited for modulating biocatalysis and developing new medicines.

  9. The psychological well-being and sociocultural adaptation of short-term international students in Ireland

    OpenAIRE

    O'Reilly, Aileen; Ryan, Dermot; Hickey, Tina

    2010-01-01

    This article reports on an empirical study of the psychosocial adaptation of international students in Ireland. Using measures of social support, loneliness, stress, psychological well-being, and sociocultural adaptation, data were obtained from international students and a comparison sample of Irish students. The study found that, although international students had high levels of social support and low levels of loneliness and stress, students were experiencing high levels of sociocultural ...

  10. Short-term adaptation of saccades does not affect smooth pursuit eye movement initiation.

    Science.gov (United States)

    Sun, Zongpeng; Smilgin, Aleksandra; Junker, Marc; Dicke, Peter W; Thier, Peter

    2017-08-01

    Scrutiny of the visual environment requires saccades that shift gaze to objects of interest. If the object is moving, smooth pursuit eye movements (SPEM) try to keep its image within the confines of the fovea in order to ensure sufficient time for its analysis. Both saccades and SPEM can be adaptively changed by the experience of insufficiencies compromising the precision of saccades or, in the case of SPEM, the minimization of object image slip. As both forms of adaptation rely on the cerebellar oculomotor vermis (OMV), most probably deploying a shared neuronal machinery, one might expect that the adaptation of one type of eye movement should affect the kinematics of the other. To test this expectation, we subjected two monkeys to a standard saccadic adaptation paradigm with SPEM test trials at the end and, alternatively, the same two monkeys plus a third one to a random saccadic adaptation paradigm with interleaved trials of SPEM. Contrary to our expectation, we observed at best marginal transfer which, moreover, had little consistency across experiments and subjects. The lack of consistent transfer of saccadic adaptation decisively constrains models of the implementation of oculomotor learning in the OMV, suggesting an extensive separation of saccade- and SPEM-related synapses on P-cell dendritic trees.

  11. Are We Measuring Teachers’ Attitudes towards Computers in Detail?: Adaptation of a Questionnaire into Turkish Culture

    Directory of Open Access Journals (Sweden)

    Nilgün Günbaş

    2017-04-01

    Full Text Available Teachers' perceptions of computers play an important role in integrating computers into education. The related literature includes studies developing or adapting survey instruments in Turkish culture that measure teachers' attitudes toward computers. These instruments have three to four factors (e.g., computer importance, computer enjoyment, computer confidence) and 18 to 26 items under these factors. The purpose of the present study is to adapt a more detailed and stronger survey questionnaire measuring more dimensions related to teachers' attitudes. The source instrument was developed by Christensen and Knezek (2009) and called Teachers' Attitudes toward Computers (TAC). It has nine factors with 51 items. Before testing the instrument, the interaction (e-mail) factor was taken out because of cultural differences. The reliability and validity testing of the translated instrument was completed with 273 teacher candidates in a Faculty of Education in Turkey. The results showed that the translated instrument (Cronbach's alpha: .94) included eight factors and consisted of 42 items under these factors, which were consistent with the original instrument. These factors were: Interest (α: .83), Comfort (α: .90), Accommodation (α: .87), Concern (α: .79), Utility (α: .90), Perception (α: .89), Absorption (α: .84), and Significance (α: .83). Additionally, the confirmatory factor analysis result for the model with eight factors was: RMSEA=0.050, χ2/df=1.69, RMR=0.075, SRMR=0.057, GFI=0.81, AGFI=0.78, NFI=0.94, NNFI=0.97, CFI=0.97, IFI=0.97. Accordingly, as a reliable, valid and stronger instrument, the adapted survey instrument can be suggested for use in Turkish academic studies.

  12. Cross-cultural adaptation and validation of the Danish version of the Short Musculoskeletal Function Assessment questionnaire (SMFA).

    Science.gov (United States)

    Lindahl, Marianne; Andersen, Signe; Joergensen, Annette; Frandsen, Christian; Jensen, Liselotte; Benedikz, Eirikur

    2017-07-04

    The aim of this study was to translate and culturally adapt the Short Musculoskeletal Function Assessment (SMFA) into Danish (SMFA-DK) and assess its psychometric properties. The SMFA was translated and cross-culturally adapted according to a standardized procedure. Minor changes in the wording of three items were made to adapt to Danish conditions. Acute patients (n = 201) and rehabilitation patients (n = 231) with musculoskeletal problems aged 18-87 years were included. The following analyses were made to evaluate the psychometric quality of SMFA-DK: reliability with Cronbach's alpha, content validity by coding according to the International Classification of Functioning, Disability and Health (ICF), floor/ceiling effects, construct validity by factor analysis, correlations between SMFA-DK and Short Form 36, and the known-groups method. Responsiveness and effect size were calculated. Cronbach's alpha values were between 0.79 and 0.94. SMFA-DK captured all components of the ICF, and there were no floor/ceiling effects. Factor analysis demonstrated four subscales. SMFA-DK correlated well with the SF-36 subscales for the rehabilitation patients and less strongly for the newly injured patients. Effect sizes were excellent and better for SMFA-DK than for SF-36. The study indicates that SMFA-DK can be a valid and responsive measure of outcome in rehabilitation settings.

  13. Glutamine alone or combined with short-chain fatty acids fails to enhance gut adaptation after massive enterectomy in rats.

    Science.gov (United States)

    Neves, José de Souza; Aguilar-Nascimento, José Eduardo de; Gomes-da-Silva, Maria Helena Gaiva; Cavalcanti, Rosecélia Nunes; Bicudo, Alberto Salomão; Nascimento, Mariana; Nochi, Rubens Jardim

    2006-01-01

    To investigate the effect of oral glutamine, alone or combined with short-chain fatty acids (SCFA), on the intestinal adaptation of rats submitted to a massive enterectomy. After undergoing 70% small bowel resection, 30 Wistar rats were randomized to receive either standard rat chow (control group, n=10) or the same diet supplemented with 3.05% glutamine, alone (glutamine group, n=10) or combined with a solution containing SCFA (glutamine+SCFA group, n=10). Animals were killed on the 14th postoperative day. Mucosal weight, crypt depth, villus height, wall width, and mucosal DNA content were assessed in basal conditions (resected gut specimen) and compared to the small bowel specimen collected on postoperative day 14, at both jejunum and ileum sites. All groups presented a similar pattern of weight evolution. In all groups, both the morphological findings and the DNA content were significantly higher at the end of the experiment than in basal conditions, at both the jejunum and ileum. Except for jejunum wall width, which was higher in the control group (808 ± 95 µm) than in the other two groups (glutamine = 649 ± 88 µm and glutamine+SCFA = 656 ± 92 µm; p<0.01), there was no difference among them in any variable at either intestinal site after 14 days. All groups presented adaptation of the intestinal mucosa in the remnant gut. Glutamine, whether or not combined with short-chain fatty acids, failed to influence the adaptive response of the small bowel.

  14. Short-term electricity demand and gas price forecasts using wavelet transforms and adaptive models

    Energy Technology Data Exchange (ETDEWEB)

    Nguyen, Hang T.; Nabney, Ian T. [Non-linearity and Complexity Research Group, School of Engineering and Applied Science, Aston University, Aston Triangle, Birmingham B4 7ET (United Kingdom)

    2010-09-15

    This paper presents forecasting techniques for day-ahead energy demand and price prediction. These techniques combine the wavelet transform (WT) with fixed and adaptive machine learning/time series models (multi-layer perceptron (MLP), radial basis functions, linear regression, or GARCH). To create an adaptive model, we use an extended Kalman filter or particle filter to update the parameters continuously on the test set. The adaptive GARCH model is a new contribution, broadening the applicability of GARCH methods. We empirically compared two approaches to combining the WT with prediction models: multicomponent forecasts and direct forecasts. These techniques are applied to large sets of real data (both stationary and non-stationary) from the UK energy markets, so as to provide comparative results that are statistically stronger than those previously reported. The results showed that forecasting accuracy is significantly improved by using the WT and adaptive models. The best models are the adaptive MLP for electricity demand and the adaptive GARCH for gas price, both with the multicomponent forecast; their NMSEs are 0.02314 and 0.15384, respectively. (author)
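    The multicomponent approach described above decomposes the series, forecasts each component separately, and recombines the component forecasts. The sketch below illustrates that idea in miniature, using a single-level Haar transform and naive persistence forecasts in place of the paper's MLP/GARCH component models; all names and data are illustrative.

```python
def haar_dwt(signal):
    """Single-level Haar transform: split a series into a smooth
    (approximation) component and a detail component."""
    approx = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal) - 1, 2)]
    return approx, detail

def multicomponent_forecast(history):
    """Forecast the next value by forecasting each wavelet component
    separately and recombining. Persistence (repeating the last value)
    stands in for the per-component MLP/GARCH models used in the paper."""
    approx, detail = haar_dwt(history)
    next_a = approx[-1]      # persistence forecast of the smooth component
    next_d = detail[-1]      # persistence forecast of the detail component
    return next_a + next_d   # first sample of the reconstructed next pair

# Hypothetical demand history (arbitrary units)
demand = [100, 102, 98, 101, 105, 103, 99, 104]
print(multicomponent_forecast(demand))  # → 99.0
```

    A real multicomponent forecaster would train a separate model per decomposition level and forecast several steps of each component before inverse-transforming.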

  15. Towards Static Analysis of Policy-Based Self-adaptive Computing Systems

    DEFF Research Database (Denmark)

    Margheri, Andrea; Nielson, Hanne Riis; Nielson, Flemming

    2016-01-01

    For supporting the design of self-adaptive computing systems, the PSCEL language offers a principled approach that relies on declarative definitions of adaptation and authorisation policies enforced at runtime. Policies permit managing system components by regulating their interactions … and by dynamically introducing new actions to accomplish task-oriented goals. However, the runtime evaluation of policies and their effects on system components make the prediction of system behaviour challenging. In this paper, we introduce the construction of a flow graph that statically points out the policy … evaluations that can take place at runtime and exploit it to analyse the effects of policy evaluations on the progress of system components…

  16. Exploiting metadata, ontologies and semantics to design/enhance new end-user experiences for adaptive pervasive computing environments.

    OpenAIRE

    Soylu, Ahmet

    2012-01-01

    Adaptive Systems and Pervasive Computing change the face of computing and redefine the way people interact with technology. Pioneers pursue a vision in which technology is seamlessly situated in people's lives and adapts itself to the characteristics, requirements, and needs of the users and the environment without any distraction on the user's side. Adaptive Systems research mostly focuses on individual applications that can alter their interface, behavior, presentation, etc., mainly with respect...

  17. Adaptation and validation of the short version WHOQOL-HIV in Ethiopia

    DEFF Research Database (Denmark)

    Tesfaye Woldeyohannes, Markos; Olsen, Mette Frahm; Medhin, Girmay

    2016-01-01

    … cross-cultural equivalence of the WHOQOL-HIV when used among people with HIV in Ethiopia. Therefore, this study aimed at adapting the WHOQOL-HIV bref for the Ethiopian setting. METHODS: A step-wise adaptation of the WHOQOL-HIV bref for use in Ethiopia was conducted to produce an Ethiopian version. … 0.82, TLI = 0.77 and RMSEA = 0.064). CONCLUSION: The WHOQOL-HIV-BREF-Eth has been shown to be a valid measure of quality of life for use in clinical settings among people with HIV in Ethiopia.

  18. Reinforcement learning for adaptive threshold control of restorative brain-computer interfaces: a Bayesian simulation.

    Science.gov (United States)

    Bauer, Robert; Gharabaghi, Alireza

    2015-01-01

    Restorative brain-computer interfaces (BCI) are increasingly used to provide feedback of neuronal states in a bid to normalize pathological brain activity and achieve behavioral gains. However, patients and healthy subjects alike often show a large variability, or even inability, of brain self-regulation for BCI control, known as BCI illiteracy. Although current co-adaptive algorithms are powerful for assistive BCIs, their inherent class switching clashes with the operant conditioning goal of restorative BCIs. Moreover, due to the treatment rationale, the classifier of restorative BCIs usually has a constrained feature space, thus limiting the possibility of classifier adaptation. In this context, we applied a Bayesian model of neurofeedback and reinforcement learning for different threshold selection strategies to study the impact of threshold adaptation of a linear classifier on optimizing restorative BCIs. For each feedback iteration, we first determined the thresholds that result in minimal action entropy and maximal instructional efficiency. We then used the resulting vector for the simulation of continuous threshold adaptation. We could thus show that threshold adaptation can improve reinforcement learning, particularly in cases of BCI illiteracy. Finally, on the basis of information theory, we provided an explanation for the achieved benefits of adaptive threshold setting.
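    The continuous threshold adaptation simulated in the study can be caricatured with a far simpler rule. The sketch below is only a toy stand-in for the authors' Bayesian entropy-based selection: it nudges a feedback threshold so that the observed self-regulation success rate drifts toward a fixed target rate; all values are illustrative.

```python
def adapt_threshold(threshold, outputs, target_rate=0.3, step=0.02):
    """One iteration of continuous threshold adaptation: compare the
    success rate in a feedback block against a target rate and nudge
    the threshold accordingly (toy rule, not the paper's model)."""
    rate = sum(o > threshold for o in outputs) / len(outputs)
    if rate > target_rate:
        return threshold + step   # task too easy: raise the bar
    if rate < target_rate:
        return threshold - step   # task too hard: lower the bar
    return threshold

# Simulated classifier outputs for one subject's feedback blocks
outputs = [0.42, 0.48, 0.51, 0.55, 0.61]
t = 0.50
for _ in range(20):
    t = adapt_threshold(t, outputs)
print(round(t, 2))  # → 0.54
```

    The threshold settles where the success rate straddles the target, which is the intuition behind keeping feedback neither trivially easy nor discouragingly hard for subjects with poor self-regulation.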

  19. Reinforcement learning for adaptive threshold control of restorative brain-computer interfaces: a Bayesian simulation

    Directory of Open Access Journals (Sweden)

    Robert Bauer

    2015-02-01

    Full Text Available Restorative brain-computer interfaces (BCI) are increasingly used to provide feedback of neuronal states in a bid to normalize pathological brain activity and achieve behavioral gains. However, patients and healthy subjects alike often show a large variability, or even inability, of brain self-regulation for BCI control, known as BCI illiteracy. Although current co-adaptive algorithms are powerful for assistive BCIs, their inherent class switching clashes with the operant conditioning goal of restorative BCIs. Moreover, due to the treatment rationale, the classifier of restorative BCIs usually has a constrained feature space, thus limiting the possibility of classifier adaptation. In this context, we applied a Bayesian model of neurofeedback and reinforcement learning for different threshold selection strategies to study the impact of threshold adaptation of a linear classifier on optimizing restorative BCIs. For each feedback iteration, we first determined the thresholds that result in minimal action entropy and maximal instructional efficiency. We then used the resulting vector for the simulation of continuous threshold adaptation. We could thus show that threshold adaptation can improve reinforcement learning, particularly in cases of BCI illiteracy. Finally, on the basis of information theory, we provided an explanation for the achieved benefits of adaptive threshold setting.

  20. Peripheral Quantitative Computed Tomography Predicts Humeral Diaphysis Torsional Mechanical Properties With Good Short-Term Precision.

    Science.gov (United States)

    Weatherholt, Alyssa M; Avin, Keith G; Hurd, Andrea L; Cox, Jacob L; Marberry, Scott T; Santoni, Brandon G; Warden, Stuart J

    2015-01-01

    Peripheral quantitative computed tomography (pQCT) is a popular tool for noninvasively estimating bone mechanical properties. Previous studies have demonstrated that pQCT provides precise estimates that are good predictors of actual bone mechanical properties at popular distal imaging sites (tibia and radius). The predictive ability and precision of pQCT at more proximal sites remain unknown. The aim of the present study was to explore the predictive ability and short-term precision of pQCT estimates of mechanical properties of the midshaft humerus, a site gaining popularity for exploring the skeletal benefits of exercise. Predictive ability was determined ex vivo by assessing the ability of pQCT-derived estimates of torsional mechanical properties in cadaver humeri (density-weighted polar moment of inertia [I(P)] and polar strength-strain index [SSI(P)]) to predict actual torsional properties. Short-term precision was assessed in vivo by performing 6 repeat pQCT scans at the level of the midshaft humerus in 30 young, healthy individuals (degrees of freedom = 150), with repeat scans performed by the same and different testers and on the same and different days to explore the influences of different testers and time between repeat scans on precision errors. I(P) and SSI(P) both independently predicted at least 90% of the variance in ex vivo midshaft humerus mechanical properties in cadaveric bones. Overall relative precision errors (root-mean-squared coefficients of variation) for in vivo measures of I(P) and SSI(P) at the midshaft humerus were low, indicating that pQCT predicts midshaft humerus torsional mechanical properties with good short-term precision, with measures being robust against the influences of different testers and time between repeat scans. Copyright © 2015 The International Society for Clinical Densitometry. Published by Elsevier Inc. All rights reserved.
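    Short-term precision in studies like this is conventionally summarized as a root-mean-squared coefficient of variation across subjects' repeat scans. A minimal sketch of that computation follows; the subject data are invented, and the function name is illustrative.

```python
import math

def rms_cv_percent(repeat_scans):
    """Root-mean-squared coefficient of variation (%) across subjects,
    each subject contributing a list of repeat measurements. Per-subject
    CV = SD / mean * 100; the study-level precision error is the RMS of
    the per-subject CVs."""
    cvs = []
    for scans in repeat_scans:
        mean = sum(scans) / len(scans)
        var = sum((s - mean) ** 2 for s in scans) / (len(scans) - 1)
        cvs.append(math.sqrt(var) / mean * 100)
    return math.sqrt(sum(cv ** 2 for cv in cvs) / len(cvs))

# Six hypothetical repeat pQCT scans of SSI(P) for two subjects
subjects = [[1500, 1510, 1495, 1505, 1498, 1502],
            [1800, 1815, 1790, 1805, 1798, 1810]]
print(round(rms_cv_percent(subjects), 2))  # → 0.43
```

    Using the RMS of per-subject CVs, rather than their plain average, keeps the precision error consistent with pooling the per-subject variances.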

  1. The Turkish Adaptation of the Burnout Measure-Short Version (BMS) and Couple Burnout Measure-Short Version (CBMS) and the Relationship between Career and Couple Burnout Based on Psychoanalytic-Existential Perspective

    Science.gov (United States)

    Capri, Burhan

    2013-01-01

    The purpose of this research is to carry out the Turkish adaptation, validity, and reliability studies of the Burnout Measure-Short Version (BMS) and Couple Burnout Measure-Short Version (CBMS) and also to analyze the correlation between the career and couple burnout scores of the participants from the psychoanalytic-existential perspective. This research…

  2. A Combined Methodology of Adaptive Neuro-Fuzzy Inference System and Genetic Algorithm for Short-term Energy Forecasting

    Directory of Open Access Journals (Sweden)

    KAMPOUROPOULOS, K.

    2014-02-01

    Full Text Available This document presents an energy forecasting methodology using an Adaptive Neuro-Fuzzy Inference System (ANFIS) and Genetic Algorithms (GA). The GA has been used for the selection of the training inputs of the ANFIS in order to minimize the training error. The presented algorithm has been installed and is operating in an automotive manufacturing plant. It periodically communicates with the plant to obtain new information and update the database in order to improve its training results. Finally, the obtained results of the algorithm are used to provide a short-term load forecast for the different modeled consumption processes.

  3. Influence of length of occlusal support on masticatory function of free-end removable partial dentures: short-term adaptation.

    Science.gov (United States)

    Sánchez-Ayala, Alfonso; Gonçalves, Thaís Marques Simek Vega; Ambrosano, Gláucia Maria Bovi; Garcia, Renata Cunha Matheus Rodrigues

    2013-06-01

    To analyze masticatory function after a short adaptation period relative to occlusal support length reduction in free-end removable partial denture (RPD) wearers. Twenty-three patients (55.2 ± 8.4 years) were rehabilitated with maxillary complete and mandibular free-end RPDs extending to the second molars. Five occlusal support length conditions were determined by removing artificial teeth from the RPDs: full occlusal support (control); occlusal support to the first molars, second premolars, and first premolars; and no occlusal support. To explore a probable short-term adaptation to occlusal support length reduction, participants wore their dentures at each condition for a period of 1 week before starting masticatory function assessment. For this purpose, masticatory performance, masticatory efficiency, chewing rate, selection chance, and breakage function were evaluated at each condition using the sieving method. Data were analyzed using repeated-measures ANOVA and post hoc Dunnett tests (α = 0.05). Masticatory performance and masticatory efficiency for 2 to 4 mm particles under the condition of occlusal support to the first molars and second premolars were similar to control values (p > 0.05). Masticatory efficiency relative to particles smaller than 2 mm was also seen at the condition of support length to the first premolars (p > 0.05). Chewing rates showed adaptation only at the condition of support length to the first molars (p > 0.05). A similar trend was noted for the selection chance of 8-mm particles, and breakage function for 8- and 2.4-mm particles (p > 0.05). After a 1-week adaptation period to free-end RPDs with occlusal support lengths reduced to the premolars, participants were able to achieve adequate masticatory function. © 2013 by the American College of Prosthodontists.

  4. TAREAN: a computational tool for identification and characterization of satellite DNA from unassembled short reads.

    Science.gov (United States)

    Novák, Petr; Ávila Robledillo, Laura; Koblížková, Andrea; Vrbová, Iva; Neumann, Pavel; Macas, Jirí

    2017-07-07

    Satellite DNA is one of the major classes of repetitive DNA, characterized by tandemly arranged repeat copies that form contiguous arrays up to megabases in length. This type of genomic organization makes satellite DNA difficult to assemble, which hampers characterization of satellite sequences by computational analysis of genomic contigs. Here, we present tandem repeat analyzer (TAREAN), a novel computational pipeline that circumvents this problem by detecting satellite repeats directly from unassembled short reads. The pipeline first employs graph-based sequence clustering to identify groups of reads that represent repetitive elements. Putative satellite repeats are subsequently detected by the presence of circular structures in their cluster graphs. Consensus sequences of repeat monomers are then reconstructed from the most frequent k-mers obtained by decomposing read sequences from corresponding clusters. The pipeline performance was successfully validated by analyzing low-pass genome sequencing data from five plant species where satellite DNA was previously experimentally characterized. Moreover, novel satellite repeats were predicted for the genome of Vicia faba and three of these repeats were verified by detecting their sequences on metaphase chromosomes using fluorescence in situ hybridization. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
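    The consensus-reconstruction step above, building a monomer from the most frequent k-mers of the clustered reads, can be sketched as greedy k-mer chaining. This is a simplification of TAREAN's actual graph-based procedure, and the reads and parameters below are invented for illustration.

```python
from collections import Counter

def most_frequent_kmers(reads, k):
    """Count all k-mers occurring in a set of reads."""
    counts = Counter()
    for read in reads:
        for i in range(len(read) - k + 1):
            counts[read[i:i + k]] += 1
    return counts

def greedy_consensus(reads, k, length):
    """Greedily chain overlapping high-frequency k-mers into a consensus
    monomer: start from the most frequent k-mer, then repeatedly extend
    by the best k-mer sharing a (k-1)-base overlap."""
    counts = most_frequent_kmers(reads, k)
    consensus = max(counts, key=counts.get)  # seed with the top k-mer
    while len(consensus) < length:
        suffix = consensus[-(k - 1):]
        candidates = {kmer: c for kmer, c in counts.items()
                      if kmer.startswith(suffix)}
        if not candidates:
            break
        consensus += max(candidates, key=candidates.get)[-1]
    return consensus

# Toy reads sampled from a tandemly repeated "ACGTT" monomer
reads = ["ACGTTACGTT", "CGTTACGTTA", "GTTACGTTAC"]
print(greedy_consensus(reads, 4, 5))  # → CGTTA (a rotation of the monomer)
```

    Because satellite monomers are circular in the cluster graph, any rotation of the true monomer is an equally valid consensus, as the example shows.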

  5. Adaptive Remodeling of Achilles Tendon: A Multi-scale Computational Model

    Science.gov (United States)

    Rubenson, Jonas; Umberger, Brian

    2016-01-01

    While it is known that musculotendon units adapt to their load environments, there is only a limited understanding of tendon adaptation in vivo. Here we develop a computational model of tendon remodeling based on the premise that mechanical damage and tenocyte-mediated tendon damage and repair processes modify the distribution of its collagen fiber lengths. We explain how these processes enable the tendon to geometrically adapt to its load conditions. Based on known biological processes, mechanical and strain-dependent proteolytic fiber damage are incorporated into our tendon model. Using a stochastic model of fiber repair, it is assumed that mechanically damaged fibers are repaired longer, whereas proteolytically damaged fibers are repaired shorter, relative to their pre-damage length. To study adaptation of tendon properties to applied load, our model musculotendon unit is a simplified three-component Hill-type model of the human Achilles-soleus unit. Our model results demonstrate that the geometric equilibrium state of the Achilles tendon can coincide with minimization of the total metabolic cost of muscle activation. The proposed tendon model independently predicts rates of collagen fiber turnover that are in general agreement with in vivo experimental measurements. While the computational model here only represents a first step in a new approach to understanding the complex process of tendon remodeling in vivo, given these findings, it appears likely that the proposed framework may itself provide a useful theoretical foundation for developing valuable qualitative and quantitative insights into tendon physiology and pathology. PMID:27684554

  6. Adaptive finite element simulation of flow and transport applications on parallel computers

    Science.gov (United States)

    Kirk, Benjamin Shelton

    The subject of this work is the adaptive finite element simulation of problems arising in flow and transport applications on parallel computers. Of particular interest are new contributions to adaptive mesh refinement (AMR) in this parallel high-performance context, including novel work on data structures, treatment of constraints in a parallel setting, generality and extensibility via object-oriented programming, and the design/implementation of a flexible software framework. This technology and software capability then enables more robust, reliable treatment of multiscale--multiphysics problems and specific studies of fine scale interaction such as those in biological chemotaxis (Chapter 4) and high-speed shock physics for compressible flows (Chapter 5). The work begins by presenting an overview of key concepts and data structures employed in AMR simulations. Of particular interest is how these concepts are applied in the physics-independent software framework which is developed here and is the basis for all the numerical simulations performed in this work. This open-source software framework has been adopted by a number of researchers in the U.S. and abroad for use in a wide range of applications. The dynamic nature of adaptive simulations pose particular issues for efficient implementation on distributed-memory parallel architectures. Communication cost, computational load balance, and memory requirements must all be considered when developing adaptive software for this class of machines. Specific extensions to the adaptive data structures to enable implementation on parallel computers is therefore considered in detail. The libMesh framework for performing adaptive finite element simulations on parallel computers is developed to provide a concrete implementation of the above ideas. This physics-independent framework is applied to two distinct flow and transport applications classes in the subsequent application studies to illustrate the flexibility of the

  7. A short note on the use of the red-black tree in Cartesian adaptive mesh refinement algorithms

    Science.gov (United States)

    Hasbestan, Jaber J.; Senocak, Inanc

    2017-12-01

    Mesh adaptivity is an indispensable capability to tackle multiphysics problems with large disparity in time and length scales. With the availability of powerful supercomputers, there is a pressing need to extend time-proven computational techniques to extreme-scale problems. Cartesian adaptive mesh refinement (AMR) is one such method that enables simulation of multiscale, multiphysics problems. AMR is based on construction of octrees. Originally, an explicit tree data structure was used to generate and manipulate an adaptive Cartesian mesh. At least eight pointers are required in an explicit approach to construct an octree. Parent-child relationships are then used to traverse the tree. An explicit octree, however, is expensive in terms of memory usage and the time it takes to traverse the tree to access a specific node. For these reasons, implicit pointerless methods have been pioneered within the computer graphics community, motivated by applications requiring interactivity and realistic three dimensional visualization. Lewiner et al. [1] provides a concise review of pointerless approaches to generate an octree. Use of a hash table and Z-order curve are two key concepts in pointerless methods that we briefly discuss next.
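    The two ingredients of the pointerless approach mentioned above, a Z-order (Morton) curve and a hash table, can be sketched briefly. The code below is a minimal illustration, not any particular AMR implementation: octree nodes live in a dictionary keyed by (level, Morton code), so parent/child navigation reduces to bit shifts instead of stored pointers.

```python
def morton3d(x, y, z, level):
    """Interleave the bits of integer cell indices (x, y, z) at the given
    refinement level into a single Morton (Z-order) key."""
    key = 0
    for b in range(level):
        key |= ((x >> b) & 1) << (3 * b)
        key |= ((y >> b) & 1) << (3 * b + 1)
        key |= ((z >> b) & 1) << (3 * b + 2)
    return key

def parent_key(level, code):
    """Drop the last octant digit: the parent one level up."""
    return (level - 1, code >> 3)

def child_key(level, code, octant):
    """Append an octant digit (0..7): one child one level down."""
    return (level + 1, (code << 3) | octant)

# Pointerless octree: a plain hash table instead of eight child pointers
octree = {(2, morton3d(1, 0, 1, 2)): "leaf data"}
print(morton3d(1, 0, 1, 2))  # → 5
```

    Traversal then becomes hash lookups on computed keys, which is why memory use and node access cost drop compared with an explicit eight-pointer tree.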

  8. Adaptive, multi-domain techniques for two-phase flow computations

    Science.gov (United States)

    Uzgoren, Eray

    Computations of immiscible two-phase flows deal with interfaces that may move and/or deform in response to the dynamics within the flow field. As interfaces move, one needs to compute the new shapes and the associated geometric information (such as curvatures, normals, and projected areas/volumes) as part of the solution. The present study employs the immersed boundary method (IBM), which uses marker points to track the interface location and continuous interface methods to model interfacial conditions. The large transport-property jumps across the interface, together with the mechanisms of convection, diffusion, pressure, body force, and surface tension, create multiple time/length scales. The resulting computational stiffness and moving boundaries make numerical simulations computationally expensive in three dimensions, even when the computations are performed on adaptively refined 3D Cartesian grids that efficiently resolve the length scales. A domain decomposition method and a partitioning strategy for adaptively refined grids are developed to enable parallel computing capabilities. Specifically, the approach consists of a multilevel additive Schwarz method for domain decomposition, and Hilbert space-filling-curve ordering for partitioning. Issues related to load balancing, communication and computation, and the convergence rate of the iterative solver with regard to grid size, the number of sub-domains, and interface shape deformation are studied. Moreover, interfacial representation using marker points is extended to model complex solid geometries for single and two-phase flows. The developed model is validated using a benchmark test case, flow over a cylinder. Furthermore, the overall algorithm is employed to further investigate the steady and unsteady behavior of the liquid plug problem. Finally, the capability of handling two-phase flow simulations in complex solid geometries is demonstrated by studying the effect of the bifurcation point on the liquid plug, which

  9. An Investigation of the Validity and Reliability of the Adapted Mathematics Anxiety Rating Scale-Short Version (MARS-SV) among Turkish Students

    Science.gov (United States)

    Baloglu, Mustafa

    2010-01-01

    This study adapted the Mathematics Anxiety Rating Scale-Short Version (MARS-SV) into Turkish and investigated the validity and reliability of the adapted instrument. Twenty-five bilingual experts agreed on the language validity, and 49 Turkish language experts agreed on the conformity and understandability of the scale's items. Thirty-two subject…

  10. Computerized Adaptive Testing Using the PROMIS Physical Function Item Bank Reduces Test Burden With Less Ceiling Effects Compared With the Short Musculoskeletal Function Assessment in Orthopaedic Trauma Patients.

    Science.gov (United States)

    Hung, Man; Stuart, Ami R; Higgins, Thomas F; Saltzman, Charles L; Kubiak, Erik N

    2014-08-01

    Patient-reported outcomes are important to assess the effectiveness of clinical interventions. For orthopaedic trauma patients, the Short Musculoskeletal Function Assessment (sMFA) is a commonly used questionnaire. Recently, the Patient-Reported Outcome Measurement Information System (PROMIS) Physical Function Computer Adaptive Test (PF CAT) was developed using item response theory to efficiently administer questions from a calibrated bank of 124 physical function questions using computerized adaptive testing. In this study, we compared the sMFA versus the PROMIS PF CAT for trauma patients. Orthopaedic trauma patients completed the sMFA and the PROMIS PF CAT on a tablet wirelessly connected to the PROMIS Assessment Center. The time for each test administration was recorded. A 1-parameter item response theory model was used to examine the psychometric properties of the instruments, including precision and floor/ceiling effects. One hundred fifty-three orthopaedic trauma patients participated in the study. Mean test administration time for the PROMIS PF CAT was 44 seconds versus 599 seconds for the sMFA (P < 0.001). The sMFA showed a ceiling effect, whereas the PROMIS PF CAT had no appreciable ceiling effect. Administered by electronic means, the PROMIS PF CAT required less than one-tenth the amount of time for patients to complete compared with the sMFA, while achieving equally high reliability and fewer ceiling effects. The PROMIS PF CAT is a very attractive and innovative method for assessing patient-reported outcomes with minimal burden to patients.
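    The mechanics of a CAT built on a 1-parameter IRT model, as used for the PROMIS PF CAT, can be sketched in a few lines: administer the unanswered item whose difficulty is closest to the current ability estimate, then update the estimate from the response. This is a hypothetical toy, not the PROMIS engine; the item names, difficulties, and the crude update step are all invented.

```python
import math

def prob_endorse(theta, b):
    """1-parameter (Rasch) IRT model: probability of endorsing an item
    of difficulty b given ability theta."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def next_item(theta, bank, asked):
    """Adaptive selection: the most informative Rasch item is the one
    whose difficulty is closest to the current ability estimate."""
    return min((i for i in bank if i not in asked),
               key=lambda i: abs(bank[i] - theta))

def update_theta(theta, b, response, step=0.5):
    """Crude ability update toward the response (a stand-in for the
    maximum-likelihood update a real CAT engine would use)."""
    return theta + step * (response - prob_endorse(theta, b))

bank = {"walk_block": -1.0, "climb_stairs": 0.0, "run_5k": 2.0}  # invented
theta, asked = 0.0, set()
for response in (1, 1, 0):           # simulated patient answers
    item = next_item(theta, bank, asked)
    asked.add(item)
    theta = update_theta(theta, bank[item], response)
print(round(theta, 2))  # → 0.28
```

    Because each item is chosen to be maximally informative at the current estimate, precision accumulates quickly, which is why a CAT reaches reliable scores in a fraction of the items of a fixed form.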

  11. Adaptation and validation of the short version WHOQOL-HIV in Ethiopia.

    Science.gov (United States)

    Tesfaye, Markos; Olsen, Mette Frahm; Medhin, Girmay; Friis, Henrik; Hanlon, Charlotte; Holm, Lotte

    2016-01-01

    Quality of life of patients is an important element in the evaluation of outcomes of health care, social services, and clinical trials. The WHOQOL instruments were originally developed for measurement of quality of life across cultures. However, concerns were raised about the cross-cultural equivalence of the WHOQOL-HIV when used among people with HIV in Ethiopia. Therefore, this study aimed at adapting the WHOQOL-HIV bref for the Ethiopian setting. A step-wise adaptation of the WHOQOL-HIV bref for use in Ethiopia was conducted to produce an Ethiopian version, the WHOQOL-HIV-BREF-Eth. Semantic and item equivalence were tested on 20 people with HIV. One hundred people with HIV were interviewed to test the measurement equivalence (known-group validity and internal consistency) of the WHOQOL-HIV-BREF-Eth. Confirmatory factor analysis was conducted using data from 348 people with HIV who were recruited from HIV clinics. In the process of adaptation, new items of relevance to the context were added, while seven items were deleted because of problems with acceptability and poor psychometric properties. The Cronbach's α for the final twenty-seven-item WHOQOL-HIV-BREF-Eth was 0.93. All six domains discriminated well between symptomatic and asymptomatic people with HIV (p < 0.001). Using confirmatory factor analysis, a second-order factor structure with six first-order indicator factors demonstrated moderate fit to the data (χ² = 627.75, df = 259, p < 0.001; CFI = 0.82, TLI = 0.77, RMSEA = 0.064). The WHOQOL-HIV-BREF-Eth has been shown to be a valid measure of quality of life for use in clinical settings among people with HIV in Ethiopia.

  12. Saddle Pulmonary Embolism: Laboratory and Computed Tomographic Pulmonary Angiographic Findings to Predict Short-term Mortality.

    Science.gov (United States)

    Liu, Min; Miao, Ran; Guo, Xiaojuan; Zhu, Li; Zhang, Hongxia; Hou, Qing; Guo, Youmin; Yang, Yuanhua

    2017-02-01

    Saddle pulmonary embolism (SPE) is a rare type of acute pulmonary embolism, and there is debate about its treatment and prognosis. Our aim was to assess laboratory and computed tomographic pulmonary angiographic (CTPA) findings to predict short-term mortality in patients with SPE. This was a five-centre, retrospective study. The clinical information and the laboratory and CTPA findings of 88 consecutive patients with SPE were collected. One-month mortality after diagnosis of SPE was the primary end-point. The correlation of laboratory and CTPA findings with one-month mortality was analysed with the area under the curve (AUC) of receiver operating characteristic (ROC) curves and logistic regression analysis. Eighteen patients with SPE died within one month. ROC curves revealed that the cutoff values for the right and left atrial diameter ratio, the right ventricular area and left ventricular area ratio (RVa/LVa ratio), Mastora score, septal angle, N-terminal pro-brain natriuretic peptide, and cardiac troponin I (cTnI) for detecting early mortality were 2.15, 2.13, 69%, 57°, 3036 pg/mL, and 0.18 ng/mL, respectively. Using logistic regression analysis of laboratory and CTPA findings with regard to one-month mortality of SPE, the RVa/LVa ratio and cTnI were shown to be independently associated with early death. A combination of cTnI and the RVa/LVa ratio revealed an increase in the AUC value, but the difference did not reach significance compared with the RVa/LVa ratio or cTnI alone (P>0.05). In patients with SPE, both the RVa/LVa ratio on CTPA and cTnI appear valuable for the prediction of short-term mortality. Copyright © 2016 Australian and New Zealand Society of Cardiac and Thoracic Surgeons (ANZSCTS) and the Cardiac Society of Australia and New Zealand (CSANZ). Published by Elsevier B.V. All rights reserved.

  13. Reduced short term adaptation to robot generated dynamic environment in children affected by Cerebral Palsy

    Directory of Open Access Journals (Sweden)

    Di Rosa Giuseppe

    2011-05-01

    Full Text Available Abstract Background It is known that healthy adults can quickly adapt to a novel dynamic environment, generated by a robotic manipulandum as a structured disturbing force field. We suggest that it may be of clinical interest to evaluate to which extent this kind of motor learning capability is impaired in children affected by cerebral palsy. Methods We adapted the protocol already used with adults, which employs a velocity-dependent viscous field, and compared the performance of a group of subjects affected by Cerebral Palsy (CP group, 7 subjects) with a Control group of unimpaired age-matched children. The protocol included a familiarization phase (FA), during which no force was applied, a force field adaptation phase (CF), and a wash-out phase (WO), in which the field was removed. During the CF phase the field was shut down in a number of randomly selected "catch" trials, which were used in order to evaluate the "learning index" for each single subject and for the two groups. Lateral deviation, speed and acceleration peaks, and average speed were evaluated for each trajectory; a directional analysis was performed in order to inspect the role of the limb's inertial anisotropy in the different experimental phases. Results During the FA phase the movements of the CP subjects were more curved, displaying greater and more variable directional error; over the course of the CF phase both groups showed a decreasing trend in the lateral error and an after-effect at the beginning of the wash-out, but the CP group had a non-significant adaptation rate and a lower learning index, suggesting that CP subjects have a reduced ability to learn to compensate for external forces. Moreover, a directional analysis of the trajectories confirms that the control group is able to better predict the force field by tuning the kinematic features of the movements along different directions in order to account for the inertial anisotropy of the arm.
    Conclusions Spatial abnormalities in children affected

  14. The nociceptive withdrawal reflex does not adapt to joint position change and short-term motor practice.

    Science.gov (United States)

    Eckert, Nathan; Riley, Zachary A

    2013-01-01

    The nociceptive withdrawal reflex is a protective mechanism to mediate interactions within a potentially dangerous environment. The reflex is formed by action-based sensory encoding during the early post-natal developmental period, and it is unknown whether the protective motor function of the nociceptive withdrawal reflex in the human upper limb is adaptable based on the configuration of the arm or can be modified by short-term practice of a similar or opposing motor action. In the present study, nociceptive withdrawal reflexes were evoked by a brief train of electrical stimuli applied to digit II (1) in five different static arm positions and (2) before and after motor practice that was opposite (EXT) or similar (FLEX) to the stereotyped withdrawal response, in 10 individuals. Withdrawal responses were quantified by the electromyographic (EMG) reflex response in several upper limb muscles and by the forces and moments recorded at the wrist. EMG onset latencies and response amplitudes were not significantly different across the arm positions or between the EXT and FLEX practice conditions, and the general direction of the withdrawal response was similar across arm positions. In addition, the force vectors were not different after practice in either practice condition or between the EXT and FLEX conditions. We conclude that the withdrawal response is insensitive to changes in elbow or shoulder joint angles and remains resistant to short-term adaptations from the practice of motor actions, resulting in a generalized limb withdrawal in each case. It is further hypothesized that the multisensory feedback is weighted differently in each arm position but integrated to achieve a similar withdrawal response to safeguard against erroneous motor responses that could cause further harm. The results remain consistent with the concept that nociceptive withdrawal reflexes are shaped through long-term, and not short-term, action-based sensory encoding.

  15. Computationally Efficient Adaptive Beamformer for Ultrasound Imaging Based on QR Decomposition.

    Science.gov (United States)

    Park, Jongin; Wi, Seok-Min; Lee, Jin S

    2016-02-01

Adaptive beamforming methods for ultrasound imaging have been studied to improve image resolution and contrast. The most common approach is the minimum variance (MV) beamformer, which minimizes the power of the beamformed output while keeping the response from the direction of interest constant. The method achieves higher resolution and better contrast than the delay-and-sum (DAS) beamformer, but it suffers from high computational cost. This cost is mainly due to the computation of the spatial covariance matrix and its inverse, which requires O(L³) computations, where L denotes the subarray size. In this study, we propose a computationally efficient MV beamformer based on QR decomposition. The idea behind our approach is to transform the spatial covariance matrix into a scalar matrix σI, from which we obtain the apodization weights and the beamformed output without computing the matrix inverse. To do this, a QR decomposition algorithm, itself executable at low cost, is used; the computational complexity is thereby reduced to O(L²). In addition, our approach is mathematically equivalent to the conventional MV beamformer and therefore shows equivalent performance. The simulation and experimental results support the validity of our approach.
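The constrained minimization described above has the closed form w = R⁻¹a / (aᴴR⁻¹a). A minimal NumPy sketch (my own illustration, not the authors' code) that solves the linear system through a QR factorization instead of forming the inverse, with diagonal loading added for numerical stability:

```python
import numpy as np

def mv_weights(R, a, eps=1e-6):
    """Minimum-variance (Capon) apodization weights.

    Solves w = R^{-1} a / (a^H R^{-1} a) without explicitly forming
    R^{-1}: a QR factorization of the regularized covariance is used
    to solve the system, echoing the paper's inverse-free idea.
    """
    L = R.shape[0]
    Rd = R + eps * np.trace(R) / L * np.eye(L)  # diagonal loading
    Q, Rtri = np.linalg.qr(Rd)
    # Solve Rd x = a via the triangular system Rtri x = Q^H a
    x = np.linalg.solve(Rtri, Q.conj().T @ a)
    return x / (a.conj().T @ x)

# Toy subarray: pre-delayed data, so the steering vector is all ones
L = 8
a = np.ones(L)
rng = np.random.default_rng(0)
X = rng.standard_normal((L, 100))
R = X @ X.T / 100
w = mv_weights(R, a)
print(np.allclose(w.conj() @ a, 1.0))  # distortionless constraint holds
```

The final print checks the defining MV property: the weights pass the signal from the look direction with unit gain while suppressing everything else.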

  16. Adaptive allocation of decisionmaking responsibility between human and computer in multitask situations

    Science.gov (United States)

    Chu, Y.-Y.; Rouse, W. B.

    1979-01-01

    As human and computer come to have overlapping decisionmaking abilities, a dynamic or adaptive allocation of responsibilities may be the best mode of human-computer interaction. It is suggested that the computer serve as a backup decisionmaker, accepting responsibility when human workload becomes excessive and relinquishing responsibility when workload becomes acceptable. A queueing theory formulation of multitask decisionmaking is used and a threshold policy for turning the computer on/off is proposed. This policy minimizes event-waiting cost subject to human workload constraints. An experiment was conducted with a balanced design of several subject runs within a computer-aided multitask flight management situation with different task demand levels. It was found that computer aiding enhanced subsystem performance as well as subjective ratings. The queueing model appears to be an adequate representation of the multitask decisionmaking situation, and to be capable of predicting system performance in terms of average waiting time and server occupancy. Server occupancy was further found to correlate highly with the subjective effort ratings.
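The threshold on/off policy can be illustrated with a toy discrete-time simulation (my own sketch; the arrival rate, thresholds, and single-task-per-tick service are assumptions, not the paper's queueing model):

```python
import random

def simulate(ticks=10_000, p_arrival=0.9, t_on=5, t_off=1, aided=True, seed=1):
    """Discrete-time sketch of threshold-based computer aiding.

    The human serves one task per tick; when aided, the computer switches
    on once the queue reaches t_on and off once it falls to t_off,
    serving one extra task per tick while on.
    """
    rng = random.Random(seed)
    queue = 0
    computer_on = False
    total_wait = 0       # accumulated queue length ~ event-waiting cost
    busy_ticks = 0
    for _ in range(ticks):
        if rng.random() < p_arrival:
            queue += 1
        if aided:
            if not computer_on and queue >= t_on:
                computer_on = True
            elif computer_on and queue <= t_off:
                computer_on = False
        capacity = 1 + (1 if (aided and computer_on) else 0)
        served = min(queue, capacity)
        queue -= served
        busy_ticks += 1 if served else 0
        total_wait += queue
    return total_wait / ticks, busy_ticks / ticks

wait_aided, occ_aided = simulate(aided=True)
wait_solo, occ_solo = simulate(aided=False)
print(wait_aided <= wait_solo)  # aiding never increases mean waiting here
```

Because both runs see the identical arrival stream and the aided system always has at least the human's capacity, its queue can never exceed the unaided one, mirroring the paper's finding that aiding enhanced performance.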

  17. Adapting the short form of the Coping Inventory for Stressful Situations into Chinese

    Directory of Open Access Journals (Sweden)

    Li C

    2017-06-01

Full Text Available Chun Li,1 Qing Liu,2 Ti Hu,3 Xiaoyan Jin1 1International School of Chinese Studies, Northeast Normal University, Changchun, 2Department of Nuclear Medicine and Medical PET Center, The Second Hospital of Zhejiang University School of Medicine, Zhejiang University, Hangzhou, 3School of Physical Education and Sports, Beijing Normal University, Beijing, People’s Republic of China Objectives: The Coping Inventory for Stressful Situations (CISS) is a measurement tool for evaluating stress that has good psychometric properties. We investigated the applicability of a short-form version of the CISS in a large sample of Chinese university students. Methods: Nine hundred and seventy-two Chinese university students aged 18–30 years (mean = 20.15, standard deviation = 3.26) were chosen as subjects, of whom 101 were randomly selected to be retested after a 2-week interval. Results: The results of a confirmatory factor analysis revealed that the root mean square error of approximation of a four-factor model was 0.06, while the comparative fit index was 0.91, the incremental fit index was 0.93, the non-normed fit index was 0.91, and the root mean square residual was 0.07. The Cronbach’s α coefficients for the task-oriented, emotion-oriented, distraction, and social diversion coping subscales were 0.81, 0.74, 0.70, and 0.66, respectively. The 2-week test–retest reliability was 0.78, 0.74, 0.70, and 0.65 for the task-oriented, emotion-oriented, distraction, and social diversion coping subscales, respectively. In the Chinese version of the CISS short form, task-oriented coping was positively correlated with positive affect and extraversion and negatively correlated with neuroticism; emotion-oriented coping was negatively correlated with extraversion and positively correlated with negative affect, anxiety, and neuroticism; distraction coping was positively correlated with neuroticism, extroversion, anxiety, positive affect, and negative affect and negatively
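The internal-consistency coefficients reported above follow the standard Cronbach's α formula, α = k/(k−1) · (1 − Σ s²ᵢ / s²ₜ), where s²ᵢ are item variances and s²ₜ the variance of total scores. A small self-contained sketch (illustrative, not the authors' analysis code):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_subjects, k_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return k / (k - 1) * (1 - item_vars / total_var)

# Two perfectly correlated items -> alpha = 1
print(round(cronbach_alpha([[1, 1], [2, 2], [3, 3]]), 3))  # 1.0
```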

  18. Wireless Adaptive Therapeutic TeleGaming in a Pervasive Computing Environment

    Science.gov (United States)

    Peters, James F.; Szturm, Tony; Borkowski, Maciej; Lockery, Dan; Ramanna, Sheela; Shay, Barbara

This chapter introduces a wireless, pervasive computing approach to adaptive therapeutic telegaming considered in the context of near set theory. Near set theory provides a formal basis for observation, comparison and classification of perceptual granules. A perceptual granule is defined by a collection of objects that are graspable by the senses or by the mind. In the proposed pervasive computing approach to telegaming, a handicapped person (e.g., a stroke patient with limited hand, finger, or arm function) plays a video game by interacting with familiar instrumented objects such as cups, cutlery, soccer balls, nozzles, screw-top lids, and spoons, so that the technology that makes therapeutic exercise game-playing possible is largely invisible (Archives of Physical Medicine and Rehabilitation 89:2213-2217, 2008). The basic approach to adaptive learning (AL) in the proposed telegaming environment is ethology-inspired and is quite different from the traditional approach to reinforcement learning. In biologically-inspired learning, organisms learn to achieve some goal by durable modification of behaviours in response to signals from the environment resulting from specific experiences (Animal Behavior, 1995). The term adaptive is used here in an ethological sense, where learning by an organism results from modifying behaviour in response to perceived changes in the environment. To instill adaptivity in a video game, it is assumed that learning by a video game is episodic. During an episode, the behaviour of a player is measured indirectly by tracking the occurrence of gaming events such as a hit or a miss of a target (e.g., hitting a moving ball with a game paddle). An ethogram provides a record of behaviour feature values that provide a basis for a functional registry for handicapped players, supporting gaming adaptivity. An important practical application of adaptive gaming is therapeutic rehabilitation exercise carried out in parallel with playing action video games. Enjoyable and

  19. Adapting to the destitute situations: poverty cues lead to short-term choice.

    Directory of Open Access Journals (Sweden)

    Lei Liu

Full Text Available BACKGROUND: Why do some people live for the present, whereas others save for the future? The evolutionary framework of life history theory predicts that preference for delay of gratification should be influenced by socioeconomic status (SES). However, here we propose that the decision between immediate and delayed gratification in poverty environments may have a psychological dimension. Specifically, the perception of environmental poverty cues may induce people to favor choices with short-term, likely smaller benefit over choices with long-term, greater benefit. METHODOLOGY/PRINCIPAL FINDINGS: The present study was conducted to explore how poverty and affluence cues affected individuals' intertemporal choices. In our first two experiments, individuals exposed explicitly (Experiment 1) and implicitly (Experiment 2) to poverty pictures (the poverty cue) were induced to prefer immediate gratification compared with those exposed to affluence pictures (the affluence cue). Furthermore, when temporary perceptions of poverty and affluence status were manipulated using a lucky draw game, individuals in the poverty state were more impulsive, in a manner which made them pursue immediate gratification in intertemporal choices (Experiment 3). Thus, poverty cues can lead to short-term choices. CONCLUSIONS/SIGNIFICANCE: Decision makers chose the sooner-smaller reward over the later-larger reward more frequently when they were exposed to the poverty cue. This indicates that the mere feeling of poverty influences intertemporal choice - the actual reality of poverty (restricted resources, etc.) is not necessary to produce the effect. Furthermore, our findings emphasize that it is a change of poverty-affluence status, not a trait change, that can influence individual preference in intertemporal choice.

  20. Muscle Adaptations Following Short-Duration Bed Rest with Integrated Resistance, Interval, and Aerobic Exercise

    Science.gov (United States)

    Hackney, Kyle J.; Scott, Jessica M.; Buxton, Roxanne; Redd-Goetchius, Elizabeth; Crowell, J. Brent; Everett, Meghan E.; Wickwire, Jason; Ryder, Jeffrey W.; Bloomberg, Jacob J.; Ploutz-Snyder, Lori L.

    2011-01-01

Unloading of the musculoskeletal system during space flight results in deconditioning that may impair mission-related task performance in astronauts. Exercise countermeasures have been frequently tested during bed rest (BR) and limb suspension; however, high-intensity, short-duration exercise prescriptions have not been fully explored. PURPOSE: To determine if a high-intensity resistance, interval, and aerobic exercise program could protect against muscle atrophy and dysfunction when performed during short-duration BR. METHODS: Nine subjects (1 female, 8 male) performed a combination of supine exercises during 2 weeks of horizontal BR. Resistance exercise (3 d/wk) consisted of squat, leg press, hamstring curl, and heel raise exercises (3 sets, 12 repetitions). Aerobic sessions (6 d/wk) alternated continuous (75% VO2 peak) and interval exercise (30 s, 2 min, and 4 min) and were completed on a supine cycle ergometer and vertical treadmill, respectively. Muscle volumes of the upper leg were calculated pre-, mid-, and post-BR using magnetic resonance imaging. Maximal isometric force (MIF), rate of force development (RFD), and peak power of the lower body extensors were measured twice before BR (averaged to represent pre) and once post-BR. ANOVA with repeated measures and a priori planned contrasts were used to test for differences. RESULTS: There were no changes to quadriceps, hamstring, and adductor muscle volumes at mid and post BR time points compared to pre BR (Table 1). Peak power increased significantly from 1614 +/- 372 W to 1739 +/- 359 W post BR (+7.7%, p = 0.035). Neither MIF (pre: 1676 +/- 320 N vs. post: 1711 +/- 250 N, +2.1%, p = 0.333) nor RFD (pre: 7534 +/- 1265 N/ms vs. post: 6951 +/- 1241 N/ms, -7.7%, p = 0.136) was significantly impaired post BR.

  1. Evaluating the Appropriateness of a New Computer-Administered Measure of Adaptive Function for Children and Youth with Autism Spectrum Disorders

    Science.gov (United States)

    Coster, Wendy J.; Kramer, Jessica M.; Tian, Feng; Dooley, Meghan; Liljenquist, Kendra; Kao, Ying-Chia; Ni, Pengsheng

    2016-01-01

    The Pediatric Evaluation of Disability Inventory-Computer Adaptive Test is an alternative method for describing the adaptive function of children and youth with disabilities using a computer-administered assessment. This study evaluated the performance of the Pediatric Evaluation of Disability Inventory-Computer Adaptive Test with a national…

  2. 3D-SoftChip: A Novel Architecture for Next-Generation Adaptive Computing Systems

    Directory of Open Access Journals (Sweden)

    Lee Mike Myung-Ok

    2006-01-01

Full Text Available This paper introduces a novel architecture for next-generation adaptive computing systems, which we term 3D-SoftChip. The 3D-SoftChip is a 3-dimensional (3D) vertically integrated adaptive computing system combining state-of-the-art processing and 3D interconnection technology. It comprises the vertical integration of two chips (a configurable array processor and an intelligent configurable switch) through an indium bump interconnection array (IBIA). The configurable array processor (CAP) is an array of heterogeneous processing elements (PEs), while the intelligent configurable switch (ICS) comprises a switch block, a 32-bit dedicated RISC processor for control, on-chip program/data memory, a data frame buffer, and a direct memory access (DMA) controller. This paper introduces the novel 3D-SoftChip architecture for real-time communication and multimedia signal processing as a next-generation computing system. The paper further describes the advanced HW/SW codesign and verification methodology, including high-level system modeling of the 3D-SoftChip using SystemC, being used to determine the optimum hardware specification in the early design stage.

  3. Adaptive workflow scheduling in grid computing based on dynamic resource availability

    Directory of Open Access Journals (Sweden)

    Ritu Garg

    2015-06-01

Full Text Available Grid computing enables large-scale resource sharing and collaboration for solving advanced science and engineering applications. Central to grid computing is the scheduling of application tasks to resources. Various strategies have been proposed, including static and dynamic strategies. The former schedules tasks to resources before the actual execution time, the latter at the time of execution. Static scheduling performs better, but it is not suitable for the dynamic grid environment. The lack of dedicated resources and variations in their availability at run time make this scheduling a great challenge. In this study, we propose an adaptive approach, based on rescheduling, for assigning workflow tasks (dependent tasks) to dynamic grid resources. It deals with the heterogeneous dynamic grid environment, where fluctuations in the availability of computing nodes and link bandwidth are inevitable due to the existence of local load or load from other users. The proposed adaptive workflow scheduling (AWS) approach involves initial static scheduling, resource monitoring, and rescheduling, with the aim of achieving the minimum execution time for the workflow application. The approach differs from other techniques in the literature in that it considers changes in resource (host and link) availability and the impact of existing load on the grid resources. Simulation results using randomly generated task graphs and task graphs corresponding to real-world problems (GE and FFT) demonstrate that the proposed algorithm is able to deal with fluctuations in resource availability and provides overall optimal performance.
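The initial-static-schedule-plus-rescheduling idea can be sketched with a toy greedy scheduler (my own illustration: independent work units and instantaneous speed changes only; the AWS algorithm itself handles dependent workflow tasks and continuous monitoring, which are omitted here):

```python
def greedy_schedule(works, speeds):
    """Assign independent work units to the resource giving the
    earliest finish time; returns (assignment, makespan)."""
    finish = {r: 0.0 for r in speeds}
    assignment = {}
    for i, w in enumerate(works):
        r = min(speeds, key=lambda r: finish[r] + w / speeds[r])
        finish[r] += w / speeds[r]
        assignment[i] = r
    return assignment, max(finish.values())

def makespan(assignment, works, speeds):
    """Makespan of a fixed assignment under (possibly changed) speeds."""
    finish = {r: 0.0 for r in speeds}
    for i, r in assignment.items():
        finish[r] += works[i] / speeds[r]
    return max(finish.values())

works = [4, 4, 4, 4]
static_plan, _ = greedy_schedule(works, {"r1": 1.0, "r2": 1.0})
# resource r2 slows down at run time
changed = {"r1": 1.0, "r2": 0.5}
stale = makespan(static_plan, works, changed)          # stick with old plan
_, adaptive = greedy_schedule(works, changed)          # reschedule
print(adaptive <= stale)  # rescheduling absorbs the slowdown
```

With two equal resources the static plan splits the work evenly; once r2 halves its speed, the stale plan finishes in 16 time units while rescheduling finishes in 12, which is the effect the abstract describes at workflow scale.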

  4. Rapid Computation of Thermodynamic Properties over Multidimensional Nonbonded Parameter Spaces Using Adaptive Multistate Reweighting.

    Science.gov (United States)

    Naden, Levi N; Shirts, Michael R

    2016-04-12

We show how thermodynamic properties of molecular models can be computed over a large, multidimensional parameter space by combining multistate reweighting analysis with a linear basis function approach. This approach reduces the computational cost of estimating thermodynamic properties from molecular simulations for over 130,000 tested parameter combinations from over 1000 CPU years to tens of CPU days. This speed increase is achieved primarily by computing the potential energy as a linear combination of basis functions, computed from either modified simulation code or as the difference in energy between two reference states, which can be done without any simulation code modification. The thermodynamic properties are then estimated with the Multistate Bennett Acceptance Ratio (MBAR) as a function of multiple model parameters without the need to define a priori how the states are connected by a pathway. Instead, we adaptively sample a set of points in parameter space to create mutual configuration space overlap. The existence of regions of poor configuration space overlap is detected by analyzing the eigenvalues of the sampled states' overlap matrix. The configuration space overlap to sampled states is monitored alongside the mean and maximum uncertainty to determine convergence, as neither the uncertainty nor the configuration space overlap alone is a sufficient metric of convergence. This adaptive sampling scheme is demonstrated by estimating with high precision the solvation free energies of charged particles of Lennard-Jones plus Coulomb functional form with charges between -2 and +2 and generally physical values of σij and ϵij in TIP3P water. We also compute entropy, enthalpy, and radial distribution functions of arbitrary unsampled parameter combinations using only the data from these sampled states and use the estimates of free energies over the entire space to examine the deviation of atomistic simulations from the Born approximation to the solvation free
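The linear-basis-function trick can be illustrated on a toy Lennard-Jones-like energy: the two pairwise sums act as basis functions evaluated once per stored configuration, after which the energy of any (σ, ε) combination costs only a scalar combination (my own sketch, not the authors' code):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal((40, 3)) * 3.0          # one stored configuration
iu = np.triu_indices(len(x), k=1)
r = np.linalg.norm(x[:, None] - x[None, :], axis=-1)[iu]  # pair distances

# Basis functions: evaluated ONCE for the stored configuration
A = np.sum(4.0 / r ** 12)      # repulsive basis
B = -np.sum(4.0 / r ** 6)      # attractive basis

def u_direct(sigma, eps):
    """Direct evaluation: rescans every pair for each parameter set."""
    return float(np.sum(4 * eps * ((sigma / r) ** 12 - (sigma / r) ** 6)))

def u_basis(sigma, eps):
    """Linear combination of precomputed basis functions:
    U(sigma, eps) = eps * sigma^12 * A + eps * sigma^6 * B."""
    return float(eps * sigma ** 12 * A + eps * sigma ** 6 * B)

# Sweep many parameter combinations at the cost of two stored scalars
params = [(s, e) for s in np.linspace(0.8, 1.2, 25)
                 for e in np.linspace(0.1, 1.0, 25)]
ok = all(np.isclose(u_direct(s, e), u_basis(s, e)) for s, e in params)
print(ok)
```

This is the cost structure behind the quoted speedup: per-configuration pair sums are computed once, and the 625 parameter combinations above (or 130,000 in the paper) reuse them.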

  5. Adaptation of MPDATA Heterogeneous Stencil Computation to Intel Xeon Phi Coprocessor

    Directory of Open Access Journals (Sweden)

    Lukasz Szustak

    2015-01-01

Full Text Available The multidimensional positive definite advection transport algorithm (MPDATA) belongs to the group of nonoscillatory forward-in-time algorithms and performs a sequence of stencil computations. MPDATA is one of the major parts of the dynamic core of the EULAG geophysical model. In this work, we outline an approach to adapting the 3D MPDATA algorithm to the Intel MIC architecture. In order to utilize available computing resources, we propose a (3+1)D decomposition of MPDATA heterogeneous stencil computations. This approach is based on a combination of loop tiling and loop fusion techniques. It allows us to ease memory/communication bounds and better exploit the theoretical floating-point efficiency of target computing platforms. An important method of improving the efficiency of the (3+1)D decomposition is partitioning of available cores/threads into work teams. It permits reducing inter-cache communication overheads. This method also increases opportunities for the efficient distribution of MPDATA computation onto available resources of the Intel MIC architecture, as well as Intel CPUs. We discuss preliminary performance results obtained on two hybrid platforms, each containing two CPUs and an Intel Xeon Phi. The top-of-the-line Intel Xeon Phi 7120P gives the best performance results, executing MPDATA almost 2 times faster than two Intel Xeon E5-2697v2 CPUs.
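The loop tiling and fusion idea can be sketched in plain Python on a 1D 3-point stencil (the paper targets 3D MPDATA on MIC hardware; this toy merely shows how a fused tile with a halo reproduces two naive sweeps while keeping the working set cache-sized):

```python
import numpy as np

def sweep(a):
    """One 3-point averaging stencil pass (boundaries kept fixed)."""
    out = a.copy()
    out[1:-1] = (a[:-2] + a[1:-1] + a[2:]) / 3.0
    return out

def naive_two_passes(a):
    """Unfused version: each pass streams over the whole array."""
    return sweep(sweep(a))

def fused_tiled(a, tile=16):
    """Fused version: both passes are computed tile by tile. Each tile
    is extended by a halo of 2, so the second pass reads only
    first-pass values produced inside the same tile."""
    n = len(a)
    out = a.copy()
    for start in range(1, n - 1, tile):
        stop = min(start + tile, n - 1)         # interior points [start, stop)
        lo, hi = max(start - 2, 0), min(stop + 2, n)
        block = sweep(a[lo:hi])                  # first pass on tile + halo
        idx = np.arange(start, stop) - lo
        # second pass on the tile interior only
        out[start:stop] = (block[idx - 1] + block[idx] + block[idx + 1]) / 3.0
    return out

x = np.random.default_rng(1).standard_normal(101)
print(np.allclose(naive_two_passes(x), fused_tiled(x)))
```

The halo width equals the stencil radius times the number of fused passes; the same bookkeeping, carried out in 3D plus the time dimension, is what the (3+1)D decomposition manages on real hardware.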

  6. Computationally Efficient Adaptive Type-2 Fuzzy Control of Flexible-Joint Manipulators

    Directory of Open Access Journals (Sweden)

    Hicham Chaoui

    2013-05-01

Full Text Available In this paper, we introduce an adaptive type-2 fuzzy logic controller (FLC) for flexible-joint manipulators with structured and unstructured dynamical uncertainties. Simplified interval fuzzy sets are used for real-time efficiency, and internal stability is enhanced by adopting a trade-off strategy between the manipulator’s and the actuators’ velocities. Furthermore, the control scheme is independent of the computationally expensive noisy torque and acceleration signals. The controller is validated through a set of numerical simulations and by comparing it against its type-1 counterpart. The ability of the adaptive type-2 FLC to cope with large magnitudes of uncertainty yields improved performance. The stability of the proposed control system is guaranteed using Lyapunov stability theory.

  7. Short Paper and Poster Proceedings of the 22nd Annual Conference on Computer Animation and Social Agents

    NARCIS (Netherlands)

    Nijholt, Antinus; Egges, A.; van Welbergen, H.; Hondorp, G.H.W.

    2009-01-01

These are the proceedings containing the short and poster papers of CASA 2009, the twenty-second international conference on Computer Animation and Social Agents. CASA 2009 was organized in Amsterdam, the Netherlands, from the 17th to the 19th of June 2009. CASA is organized under the auspices of the

  8. A Conceptual Architecture for Adaptive Human-Computer Interface of a PT Operation Platform Based on Context-Awareness

    Directory of Open Access Journals (Sweden)

    Qing Xue

    2014-01-01

Full Text Available We present a conceptual architecture for an adaptive human-computer interface of a PT operation platform based on context-awareness. This architecture will form the basis of the design for such an interface. This paper describes the components, key technologies, and working principles of the architecture. The critical content covers context information modeling and processing, establishing relationships between contexts and interface design knowledge through adaptive knowledge reasoning, and implementing the visualization of the adaptive interface with the aid of interface tools technology.

  9. High performance computing for deformable image registration: towards a new paradigm in adaptive radiotherapy.

    Science.gov (United States)

    Samant, Sanjiv S; Xia, Junyi; Muyan-Ozcelik, Pinar; Owens, John D

    2008-08-01

The advent of readily available temporal imaging or time series volumetric (4D) imaging has become an indispensable component of treatment planning and adaptive radiotherapy (ART) at many radiotherapy centers. Deformable image registration (DIR) is also used in other areas of medical imaging, including motion-corrected image reconstruction. Due to long computation times, clinical applications of DIR in radiation therapy and elsewhere have been limited and consequently relegated to offline analysis. With recent advances in hardware and software, graphics processing unit (GPU) based computing is an emerging technology for general-purpose computation, including DIR, and is suitable for highly parallelized computing. However, traditional general-purpose computation on the GPU is limited by the constraints of the available programming platforms. As well, compared to CPU programming, the GPU currently has less dedicated processor memory, which can limit the useful working data set for parallelized processing. We present an implementation of the demons algorithm using the NVIDIA 8800 GTX GPU and the new CUDA programming language. The GPU performance is compared with single-threading and multithreading CPU implementations on an Intel dual-core 2.4 GHz CPU using the C programming language. CUDA provides a C-like language programming interface and allows direct access to the highly parallel compute units in the GPU. Comparisons for volumetric clinical lung images acquired using 4DCT were carried out. Computation times in the range of 1.8-13.5 s for 100 iterations were observed for the GPU, with image sizes ranging from 2.0 × 10⁶ to 14.2 × 10⁶ pixels. The GPU registration was 55-61 times faster than the CPU for the single-threading implementation, and 34-39 times faster for the multithreading implementation. For CPU-based computing, the computational time generally has a linear dependence on image size for medical imaging data. Computational efficiency is
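A single Thirion-style demons update step, the core of the registered algorithm above, can be sketched in NumPy (CPU-only illustration of the per-pixel formula u = (m − f)∇f / (|∇f|² + (m − f)²); the paper's contribution is the parallel CUDA implementation, not this formula):

```python
import numpy as np

def demons_step(fixed, moving, eps=1e-12):
    """One demons update: a per-pixel displacement driven by the
    intensity difference and the fixed-image gradient."""
    gy, gx = np.gradient(fixed)                  # image gradients (rows, cols)
    diff = moving - fixed
    denom = gx ** 2 + gy ** 2 + diff ** 2
    # guard against division by zero in flat, matched regions
    scale = np.where(denom > eps, diff / np.maximum(denom, eps), 0.0)
    return scale * gx, scale * gy                # (ux, uy) displacement field

f = np.zeros((32, 32))
f[8:24, 8:24] = 1.0                              # toy square image
ux, uy = demons_step(f, f)                       # identical images
print(float(np.abs(ux).max()), float(np.abs(uy).max()))  # 0.0 0.0
```

When the moving and fixed images already agree, the update is zero everywhere; in practice this step is iterated (the paper uses 100 iterations) with smoothing of the displacement field in between.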

  10. Eliciting explanatory models of common mental disorders using the Short Explanatory Model Interview (SEMI) Urdu adaptation--a pilot study.

    Science.gov (United States)

    Mirza, Ilyas; Hassan, Rana; Chaudhary, Haroon; Jenkins, Rachel

    2006-10-01

The purpose of this pilot study was to describe the presenting symptoms, and patients' own explanations of them, among GHQ (General Health Questionnaire) positive cases attending a primary care facility/general practice in semiurban Lahore. Fifteen consecutive attenders were screened with the GHQ, and 11 GHQ-positive cases went on to complete an adapted questionnaire derived from the SEMI (Short Explanatory Model Interview). Though there was no consistency in the presenting symptoms of GHQ-positive cases on presentation to a general practitioner, all described their problems as intense, of less than 2 years' onset, and on reflection located their origins in their social worlds. These findings have implications in terms of providing preliminary data for a larger study, perhaps looking at the development of psychosocial interventions for the treatment of mental distress in our local context, as it seems to have its origins in patients' social worlds.

  11. Soft calls and broadcast calls in the corncrake as adaptations to short and long range communication.

    Science.gov (United States)

    Ręk, Paweł

    2013-10-01

Because birds' acoustic signals function in antagonistic interactions between males and in female attraction, the majority of vocalisations are loud. In contrast, some birds additionally produce soft vocalisations in escalated agonistic and sexual contexts. Nevertheless, the relationship between the acoustic parameters of such signals and their function is not clear. Here I investigate the sound transmission and degradation properties of soft and broadcast (loud) calls in the corncrake using calls with natural and changed amplitude. I show that, if played at the same amplitude, the maximum limit for communication distance with soft calls was significantly shorter than that of broadcast calls, indicating that frequency structure is important in determining the range of both signals independently of their amplitude. At the same time, the values of excess attenuation were lower for soft calls than for broadcast calls at most distances, which suggests that the short transmission range of soft calls is achieved mostly through their low and narrow frequency ranges, promoting their masking by ambient noise. Finally, contrary to soft calls, changes in the energy of the tails of echoes in broadcast calls were associated with the distance of propagation, which might be useful in assessing the distance to senders. I suggest that the acoustic structure of soft vocalisations can be used to limit the range of the signal, which might be helpful in avoiding eavesdropping, whereas broadcast calls are designed for long-range transmission. Copyright © 2013 Elsevier B.V. All rights reserved.

  12. Influence of Adaptive Statistical Iterative Reconstruction on coronary plaque analysis in coronary computed tomography angiography

    DEFF Research Database (Denmark)

    Precht, Helle; Kitslaar, Pieter H; Broersen, Alexander

    2016-01-01

PURPOSE: The purpose of this study was to study the effect of iterative reconstruction (IR) software on quantitative plaque measurements in coronary computed tomography angiography (CCTA). METHODS: Thirty patients with three clinical risk factors for coronary artery disease (CAD) had one CCTA … performed. Images were reconstructed using FBP and 30% and 60% adaptive statistical IR (ASIR). Coronary plaque analysis was performed as per-patient and per-vessel (LM, LAD, CX and RCA) measurements. Lumen and vessel volumes and plaque burden measurements were based on automatically detected contours in each

  13. Common spatial pattern patches - an optimized filter ensemble for adaptive brain-computer interfaces.

    Science.gov (United States)

    Sannelli, Claudia; Vidaurre, Carmen; Muller, Klaus-Robert; Blankertz, Benjamin

    2010-01-01

Laplacian filters are commonly used in Brain-Computer Interfacing (BCI). When only data from a few channels are available, or when, as at the beginning of an experiment, no previous data from the same user are available, complex features cannot be used. In this case, band power features calculated from Laplacian-filtered channels represent an easy, robust and general feature for controlling a BCI, since their calculation does not involve any class information. For the same reason, the performance obtained with Laplacian features is poor in comparison to subject-specific optimized spatial filters, such as Common Spatial Patterns (CSP) analysis, which, on the other hand, can be used only in a later phase of the experiment, since it requires a considerable amount of training data in order to reach stable, good performance. This drawback is particularly evident in the case of poorly performing BCI users, whose data are highly non-stationary and contain little class-relevant information. Therefore, Laplacian filtering is preferred to CSP, e.g., in the initial period of co-adaptive calibration, a novel BCI paradigm designed to alleviate the problem of BCI illiteracy. In fact, in the co-adaptive calibration design the experiment starts with a subject-independent classifier, and simple features are needed in order to obtain a fast adaptation of the classifier to the newly acquired user data. Here, the use of an ensemble of local CSP patches (CSPP) is proposed, which can be considered a compromise between Laplacians and CSP: CSPP needs less data and fewer channels than CSP, while being superior to Laplacian filtering. This property is shown to be particularly useful for the co-adaptive calibration design and is demonstrated on off-line data from a previous co-adaptive BCI study.
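CSP itself, which the CSPP ensemble builds on, reduces to an eigenvalue problem on the two class covariance matrices; a minimal NumPy sketch via whitening (my own illustration of plain CSP, not the authors' patch ensemble):

```python
import numpy as np

def csp_filters(c1, c2):
    """CSP via whitening of the composite covariance followed by an
    eigendecomposition. Returns filters W (rows) and eigenvalues lam:
    the class-1 variance ratio after projection, sorted ascending."""
    cc = c1 + c2
    d, u = np.linalg.eigh(cc)
    whit = u @ np.diag(1.0 / np.sqrt(d)) @ u.T   # whitening transform
    lam, v = np.linalg.eigh(whit @ c1 @ whit.T)
    w = v.T @ whit                               # spatial filters as rows
    return w, lam

# Toy 2-channel problem: class 1 strong on channel 0, class 2 on channel 1
C1 = np.diag([4.0, 1.0])
C2 = np.diag([1.0, 4.0])
W, lam = csp_filters(C1, C2)
print(np.round(lam, 3))  # eigenvalues near 0 or 1 mean discriminative filters
```

The extreme eigenvalues (here 0.2 and 0.8) mark filters whose band-power strongly separates the classes; CSPP computes such filters on small local channel patches, which is why it needs less data than full-montage CSP.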

  14. Cephalopods as Predators: A Short Journey among Behavioral Flexibilities, Adaptions, and Feeding Habits

    Science.gov (United States)

    Villanueva, Roger; Perricone, Valentina; Fiorito, Graziano

    2017-01-01

The diversity of cephalopod species and the differences in morphology and the habitats in which they live illustrate the ability of this class of molluscs to adapt to all marine environments, demonstrating a wide spectrum of patterns to search, detect, select, capture, handle, and kill prey. Photo-, mechano-, and chemoreceptors provide tools for the acquisition of information about their potential prey. The use of vision to detect prey and high attack speed seem to be a predominant pattern in cephalopod species distributed in the photic zone, whereas in the deep sea, the development of mechanoreceptor structures and the presence of long and filamentous arms are more abundant. Ambushing, luring, stalking and pursuit, speculative hunting and hunting in disguise, among others, are known modes of hunting in cephalopods. Cannibalism and scavenger behavior are also known for some species, and the development of current culture techniques offers evidence of their ability to feed on inert and artificial foods. Feeding requirements and prey choice change throughout development and, in some species, strong ontogenetic changes in body form seem associated with changes in their diet and feeding strategies, although this is poorly understood in planktonic and larval stages. Feeding behavior is altered during senescence and particularly in brooding octopus females. Cephalopods are able to feed from a variety of food sources, from detritus to birds. Their particular requirements of lipids and copper may help to explain why marine crustaceans, rich in these components, are common prey in all cephalopod diets. The expected variation in climate change and ocean acidification and their effects on chemoreception and prey detection capacities in cephalopods are unknown and need future research. PMID:28861006

  15. Cephalopods as Predators: A Short Journey among Behavioral Flexibilities, Adaptions, and Feeding Habits

    Directory of Open Access Journals (Sweden)

    Roger Villanueva

    2017-08-01

Full Text Available The diversity of cephalopod species and the differences in morphology and the habitats in which they live illustrate the ability of this class of molluscs to adapt to all marine environments, demonstrating a wide spectrum of patterns to search, detect, select, capture, handle, and kill prey. Photo-, mechano-, and chemoreceptors provide tools for the acquisition of information about their potential prey. The use of vision to detect prey and high attack speed seem to be a predominant pattern in cephalopod species distributed in the photic zone, whereas in the deep sea, the development of mechanoreceptor structures and the presence of long and filamentous arms are more abundant. Ambushing, luring, stalking and pursuit, speculative hunting and hunting in disguise, among others, are known modes of hunting in cephalopods. Cannibalism and scavenger behavior are also known for some species, and the development of current culture techniques offers evidence of their ability to feed on inert and artificial foods. Feeding requirements and prey choice change throughout development and, in some species, strong ontogenetic changes in body form seem associated with changes in their diet and feeding strategies, although this is poorly understood in planktonic and larval stages. Feeding behavior is altered during senescence and particularly in brooding octopus females. Cephalopods are able to feed from a variety of food sources, from detritus to birds. Their particular requirements of lipids and copper may help to explain why marine crustaceans, rich in these components, are common prey in all cephalopod diets. The expected variation in climate change and ocean acidification and their effects on chemoreception and prey detection capacities in cephalopods are unknown and need future research.

  16. Method and system for rendering and interacting with an adaptable computing environment

    Science.gov (United States)

    Osbourn, Gordon Cecil [Albuquerque, NM; Bouchard, Ann Marie [Albuquerque, NM

    2012-06-12

    An adaptable computing environment is implemented with software entities termed "s-machines", which self-assemble into hierarchical data structures capable of rendering and interacting with the computing environment. A hierarchical data structure includes a first hierarchical s-machine bound to a second hierarchical s-machine. The first hierarchical s-machine is associated with a first layer of a rendering region on a display screen and the second hierarchical s-machine is associated with a second layer of the rendering region overlaying at least a portion of the first layer. A screen element s-machine is linked to the first hierarchical s-machine. The screen element s-machine manages data associated with a screen element rendered to the display screen within the rendering region at the first layer.

  17. Translation, cross-cultural adaptation and psychometric evaluation of yoruba version of the short-form 36 health survey.

    Science.gov (United States)

    Mbada, Chidozie Emmanuel; Adeogun, Gafar Atanda; Ogunlana, Michael Opeoluwa; Adedoyin, Rufus Adesoji; Akinsulore, Adesanmi; Awotidebe, Taofeek Oluwole; Idowu, Opeyemi Ayodiipo; Olaoye, Olumide Ayoola

    2015-09-14

The Short-Form Health Survey (SF-36) is a valid quality of life tool often employed to determine the impact of medical intervention and the outcome of health care services. However, the SF-36 is culturally sensitive, which necessitates its adaptation and translation into different languages. This study was conducted to cross-culturally adapt the SF-36 into the Yoruba language and determine its reliability and validity. Based on the International Quality of Life Assessment project guidelines, a sequence of translation, test of item-scale correlation, and validation was implemented for the translation of the Yoruba version of the SF-36. Following pilot testing, the English and the Yoruba versions of the SF-36 were administered to a random sample of 1087 apparently healthy individuals to test validity, and 249 respondents completed the Yoruba SF-36 again after two weeks to test reliability. Data were analyzed using Pearson's product moment correlation analysis, independent t-test, one-way analysis of variance, multitrait scaling analysis and Intra-Class Correlation (ICC). Reliability of the Yoruba SF-36 ranged between 0.636 and 0.843 for scales, and between 0.783 and 0.851 for domains. The data quality, concurrent and discriminant validity, reliability and internal consistency of the Yoruba version of the SF-36 are adequate, and it is recommended for measuring health-related quality of life among the Yoruba population.

  18. Effect of parenteral nutrition supplemented with short-chain fatty acids on adaptation to massive small bowel resection.

    Science.gov (United States)

    Koruda, M J; Rolandelli, R H; Settle, R G; Zimmaro, D M; Rombeau, J L

    1988-09-01

After massive small bowel resection, total parenteral nutrition (TPN) is prescribed to maintain nutritional status. However, TPN reduces the mass of the remaining intestinal mucosa, whereas adaptation to small bowel resection is associated with increased mucosal mass. Short-chain fatty acids (SCFAs) have been shown to stimulate mucosal cell mitotic activity. This study determined whether the addition of SCFAs to TPN following small bowel resection would prevent the intestinal mucosal atrophy produced by TPN. Adult rats underwent an 80% small bowel resection and then received either standard TPN or TPN supplemented with SCFAs (sodium acetate, propionate, and butyrate). After 1 wk, jejunal and ileal mucosal weights, deoxyribonucleic acid, ribonucleic acid, and protein contents were measured and compared with the parameters obtained at the time of resection. Animals receiving TPN showed significant loss of jejunal mucosal weight, deoxyribonucleic acid, ribonucleic acid, and protein and ileal mucosal weight and deoxyribonucleic acid after small bowel resection, whereas animals receiving SCFA-supplemented TPN showed no significant change in the jejunal mucosal parameters and a significant increase in ileal mucosal protein. These data demonstrate that SCFA-supplemented TPN reduces the mucosal atrophy associated with TPN after massive bowel resection and thus may facilitate adaptation to small bowel resection.

  19. Using perceptive computing in multiple sclerosis - the Short Maximum Speed Walk test

    Science.gov (United States)

    2014-01-01

Background We investigated the applicability and feasibility of perceptive computing assisted gait analysis in multiple sclerosis (MS) patients using Microsoft Kinect™. To detect the maximum walking speed and the degree of spatial sway, we established a computerized and observer-independent measure, which we named Short Maximum Speed Walk (SMSW), and compared it to established clinical measures of gait disability in MS, namely the Expanded Disability Status Scale (EDSS) and the Timed 25-Foot Walk (T25FW). Methods Cross-sectional study of 22 MS patients (age mean ± SD 43 ± 9 years, 13 female) and 22 age and gender matched healthy control subjects (HC) (age 37 ± 11 years, 13 female). The disability level of each MS patient was graded using the EDSS (median 3.0, range 0.0-6.0). All subjects then performed the SMSW and the Timed 25-Foot Walk (T25FW). The SMSW comprised five gait parameters, which together assessed average walking speed and gait stability in different dimensions (left/right, up/down and 3D deviation). Results SMSW average walking speed was slower in MS patients (1.6 ± 0.3 m/sec) than in HC (1.8 ± 0.4 m/sec) (p = 0.005) and correlated well with EDSS (Spearman's Rho 0.676). Conclusions Perceptive computing assisted gait assessments using Microsoft Kinect™ are feasible, well tolerated and can detect clinical gait disturbances in patients with MS. The retest-reliability was on par with the T25FW. PMID:24886525
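As a sketch of how SMSW-style gait parameters can be derived from a tracked body trajectory (e.g. Kinect skeleton joints), the following computes average walking speed and RMS lateral sway from floor-plane positions. The coordinate convention and the reduction to two of the five SMSW parameters are illustrative assumptions, not the paper's implementation.

```python
import math

def gait_metrics(positions, dt):
    """Summarize a walk from (x, z) floor-plane positions in metres,
    sampled every dt seconds, with walking along the z axis:
    returns (average walking speed, RMS lateral sway)."""
    duration = dt * (len(positions) - 1)
    # average speed from net displacement along the walking direction
    speed = abs(positions[-1][1] - positions[0][1]) / duration
    # lateral sway as RMS deviation of x from its mean path
    mean_x = sum(p[0] for p in positions) / len(positions)
    sway = math.sqrt(sum((p[0] - mean_x) ** 2 for p in positions)
                     / len(positions))
    return speed, sway
```

A perfectly straight walk of 1 m in 1 s yields a speed of 1 m/s and zero sway; real skeleton data would show nonzero left/right deviation.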

  20. Short-term locomotor adaptation to a robotic ankle exoskeleton does not alter soleus Hoffmann reflex amplitude

    Directory of Open Access Journals (Sweden)

    Ferris Daniel P

    2010-07-01

Full Text Available Abstract Background To improve the design of robotic lower limb exoskeletons for gait rehabilitation, it is critical to identify neural mechanisms that govern locomotor adaptation to robotic assistance. Previously, we demonstrated that soleus muscle recruitment decreased by ~35% when walking with a pneumatically-powered ankle exoskeleton providing plantar flexor torque under soleus proportional myoelectric control. Since a substantial portion of soleus activation during walking results from the stretch reflex, increased reflex inhibition is one potential mechanism for reducing soleus recruitment when walking with exoskeleton assistance. This is clinically relevant because many neurologically impaired populations have hyperactive stretch reflexes, and training to reduce the reflexes could lead to substantial improvements in their motor ability. The purpose of this study was to quantify soleus Hoffmann (H-) reflex responses during powered versus unpowered walking. Methods We tested soleus H-reflex responses in neurologically intact subjects (n=8) who had trained walking with the soleus-controlled robotic ankle exoskeleton. The soleus H-reflex was tested at mid and late stance while subjects walked with the exoskeleton on the treadmill at 1.25 m/s, first without power (first unpowered), then with power (powered), and finally without power again (second unpowered). We also collected joint kinematics and electromyography. Results When the robotic plantar flexor torque was provided, subjects walked with lower soleus electromyographic (EMG) activation (27-48%) and had concomitant reductions in H-reflex amplitude (12-24%) compared to the first unpowered condition. The H-reflex amplitude in proportion to the background soleus EMG during powered walking was not significantly different from the two unpowered conditions.
Conclusion These findings suggest that the nervous system does not inhibit the soleus H-reflex in response to short-term adaptation to exoskeleton assistance.

  1. Creating a computer adaptive test version of the late-life function and disability instrument.

    Science.gov (United States)

    Jette, Alan M; Haley, Stephen M; Ni, Pengsheng; Olarsch, Sippy; Moed, Richard

    2008-11-01

This study applied item response theory (IRT) and computer adaptive testing (CAT) methodologies to develop a prototype function and disability assessment instrument for use in aging research. Herein, we report on the development of the CAT version of the Late-Life Function and Disability Instrument (Late-Life FDI) and evaluate its psychometric properties. We used confirmatory factor analysis, IRT methods, validation, and computer simulation analyses of data collected from 671 older adults residing in residential care facilities. We compared accuracy, precision, and sensitivity to change of scores from CAT versions of two Late-Life FDI scales with scores from the fixed-form instrument. Score estimates from the prototype CAT versus the original instrument were compared in a sample of 40 older adults. Distinct function and disability domains were identified within the Late-Life FDI item bank and used to construct two prototype CAT scales. Using retrospective data, scores from computer simulations of the prototype CAT scales were highly correlated with scores from the original instrument. The results of computer simulation, accuracy, precision, and sensitivity to change of the CATs closely approximated those of the fixed-form scales, especially for the 10- or 15-item CAT versions. In the prospective study, each CAT was administered in less than 3 minutes and CAT scores were highly correlated with scores generated from the original instrument. CAT scores of the Late-Life FDI were highly comparable to those obtained from the full-length instrument with a small loss in accuracy, precision, and sensitivity to change.
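A post hoc CAT simulation of the kind described can be sketched with a 2PL IRT model, maximum-information item selection, and an expected-a-posteriori (EAP) score update after each response. The item bank, parameter values, and fixed-length stopping rule below are illustrative, not those of the Late-Life FDI.

```python
import math

def prob_2pl(theta, a, b):
    """2PL IRT probability of endorsing an item at ability theta."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def item_information(theta, a, b):
    """Fisher information of an item at theta: a^2 * P * (1 - P)."""
    p = prob_2pl(theta, a, b)
    return a * a * p * (1.0 - p)

def eap_estimate(responses):
    """EAP theta from (item-params, response) pairs, with a standard
    normal prior evaluated on a fixed quadrature grid."""
    grid = [-4.0 + 0.1 * i for i in range(81)]
    post = []
    for t in grid:
        like = math.exp(-0.5 * t * t)  # N(0, 1) prior kernel
        for (a, b), r in responses:
            p = prob_2pl(t, a, b)
            like *= p if r == 1 else 1.0 - p
        post.append(like)
    norm = sum(post)
    return sum(t * w for t, w in zip(grid, post)) / norm

def run_cat(bank, answer, max_items=5):
    """Administer items adaptively: always present the unused item
    with maximum information at the current theta estimate."""
    theta, used, responses = 0.0, set(), []
    for _ in range(max_items):
        j = max((i for i in range(len(bank)) if i not in used),
                key=lambda i: item_information(theta, *bank[i]))
        used.add(j)
        responses.append((bank[j], answer(j)))
        theta = eap_estimate(responses)
    return theta
```

With a small bank of (a, b) item parameters, a simulated respondent who endorses every item is scored well above one who endorses none after only five adaptively chosen items, which is the mechanism behind the short-CAT results reported above.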

  2. Computing Adaptive Feature Weights with PSO to Improve Android Malware Detection

    Directory of Open Access Journals (Sweden)

    Yanping Xu

    2017-01-01

Full Text Available Android malware detection is a complex and crucial issue. In this paper, we propose a malware detection model using a support vector machine (SVM) method based on feature weights that are computed by information gain (IG) and particle swarm optimization (PSO) algorithms. The IG weights are evaluated based on the relevance between features and class labels, and the PSO weights are adaptively calculated to result in the best fitness (the performance of the SVM classification model). Moreover, to overcome the defects of basic PSO, we propose a new adaptive inertia weight method called fitness-based and chaotic adaptive inertia weight-PSO (FCAIW-PSO) that improves on basic PSO and is based on the fitness and a chaotic term. The goal is to assign suitable weights to the features to ensure the best Android malware detection performance. The results of experiments indicate that the IG weights and PSO weights both improve the performance of SVM and that the performance of the PSO weights is better than that of the IG weights.
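The PSO half of the scheme can be sketched as follows. The abstract does not give the exact FCAIW formula, so the inertia weight here combines a fitness-ranked linear term with a logistic-map chaotic term as an assumed stand-in, and a sphere function stands in for the SVM classification fitness.

```python
import random

def sphere(p):
    """Toy objective standing in for the SVM classifier's fitness."""
    return sum(x * x for x in p)

def fcaiw_pso(fitness, dim, n_particles=20, iters=60, seed=1):
    """PSO minimizer with a fitness-based, chaotic inertia weight:
    better-ranked particles get a smaller inertia, and a logistic-map
    term adds chaotic perturbation (illustrative stand-in for FCAIW)."""
    rng = random.Random(seed)
    w_min, w_max, c1, c2 = 0.4, 0.8, 1.5, 1.5
    x = [[rng.uniform(-1.0, 1.0) for _ in range(dim)]
         for _ in range(n_particles)]
    v = [[0.0] * dim for _ in range(n_particles)]
    pbest = [list(p) for p in x]
    pbest_f = [fitness(p) for p in x]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = list(pbest[g]), pbest_f[g]
    z = 0.7  # logistic-map state driving the chaotic term
    cur_f = list(pbest_f)
    for _ in range(iters):
        f_lo, f_hi = min(cur_f), max(cur_f)
        z = 4.0 * z * (1.0 - z)
        for i in range(n_particles):
            # lower (better) current fitness -> smaller inertia, plus chaos
            rel = (cur_f[i] - f_lo) / (f_hi - f_lo + 1e-12)
            w = w_min + (w_max - w_min) * rel + 0.1 * z
            for d in range(dim):
                v[i][d] = (w * v[i][d]
                           + c1 * rng.random() * (pbest[i][d] - x[i][d])
                           + c2 * rng.random() * (gbest[d] - x[i][d]))
                x[i][d] += v[i][d]
            cur_f[i] = fitness(x[i])
            if cur_f[i] < pbest_f[i]:
                pbest[i], pbest_f[i] = list(x[i]), cur_f[i]
                if cur_f[i] < gbest_f:
                    gbest, gbest_f = list(x[i]), cur_f[i]
    return gbest, gbest_f
```

In the paper the fitness evaluated per particle would be the SVM's detection performance under the candidate feature weights; the sphere function merely keeps the sketch self-contained.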

  3. Cross-Cultural adaptation of an instrument to computer accessibility evaluation for students with cerebral palsy

    Directory of Open Access Journals (Sweden)

    Gerusa Ferreira Lourenço

    2015-03-01

Full Text Available The specific literature indicates that the successful education of children with cerebral palsy may require the implementation of appropriate assistive technology resources, allowing students to improve their performance and complete everyday tasks more efficiently and independently. To this end, these resources must be selected properly, emphasizing the importance of an appropriate initial assessment of the child and of the possibilities of the resources available. The present study aimed to translate and theoretically adapt an American instrument that evaluates computer accessibility for people with cerebral palsy, in order to contextualize it for applicability to Brazilian students with cerebral palsy. The methodology involved the steps of translation and cross-cultural adaptation of this instrument, as well as the construction of a supplementary script for additional use of the instrument in the educational context. Translation procedures, theoretical and technical adaptation of the American instrument, and theoretical analysis (content and semantics) were carried out with the participation of professional experts in the special education area as adjudicators. The results pointed to the relevance of the proposal of the translated instrument in conjunction with the script built for the reality of professionals involved with the education of children with cerebral palsy, such as occupational therapists and special educators.

  4. Reconfigurable and adaptive photonic networks for high-performance computing systems.

    Science.gov (United States)

    Kodi, Avinash; Louri, Ahmed

    2009-08-01

As feature sizes decrease to the submicrometer regime and clock rates increase to the multigigahertz range, the limited bandwidth at higher bit rates and longer communication distances in electrical interconnects will create a major bandwidth imbalance in future high-performance computing (HPC) systems. We explore the application of an optoelectronic interconnect for the design of flexible, high-bandwidth, reconfigurable and adaptive interconnection architectures for chip-to-chip and board-to-board HPC systems. Reconfigurability is realized by interconnecting arrays of optical transmitters, and adaptivity is implemented by a dynamic bandwidth reallocation (DBR) technique that balances the load on each communication channel. We evaluate a DBR technique, the lockstep (LS) protocol, that monitors traffic intensities, reallocates bandwidth, and adapts to changes in communication patterns. We incorporate this DBR technique into a detailed discrete-event network simulator to evaluate its performance for uniform, nonuniform, and permutation communication patterns. Simulation results indicate that, without reconfiguration, the optical system architecture shows better performance than electrical interconnects for uniform and nonuniform patterns; with reconfiguration, the dynamically reconfigurable optoelectronic interconnect provides much better performance for all communication patterns. Based on this performance study, the reconfigured architecture shows 30%-50% increased throughput and 50%-75% reduced network latency compared with HPC electrical networks.
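The DBR idea can be reduced to a toy policy: give every channel a guaranteed minimum share of the optical bandwidth and divide the remainder in proportion to observed traffic intensity. The minimum-share rule is an illustrative assumption; the LS protocol's phased monitoring and reallocation cycles are not modeled here.

```python
def reallocate_bandwidth(traffic, total_bw, min_share=0.05):
    """Proportional dynamic bandwidth reallocation sketch.
    traffic  -- observed per-channel traffic intensities
    total_bw -- total bandwidth to distribute
    Each channel keeps a guaranteed min_share fraction of total_bw;
    the remaining pool is split in proportion to traffic."""
    n = len(traffic)
    guaranteed = min_share * total_bw
    pool = total_bw - n * guaranteed
    total_traffic = sum(traffic)
    if total_traffic == 0:
        return [total_bw / n] * n  # idle network: equal shares
    return [guaranteed + pool * t / total_traffic for t in traffic]
```

A hot channel thus receives most of the pool while idle channels fall back to their guaranteed floor, which is the load-balancing behavior the abstract attributes to DBR.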

  5. Selection of Numerical Criterion Value for Determination of Short Circuit Type in Adaptive Micro-Processing Current Protection of Electric Power Lines

    Directory of Open Access Journals (Sweden)

    A. V. Kovalevsky

    2007-01-01

Full Text Available The paper considers a principle for determination of the short circuit type which is used in the mathematical model of adaptive micro-processing protection in order to improve sensitivity. As a result of a computational experiment, dependences ΔI(t) for various short circuit types (three- and two-phase short circuits) have been obtained at a number of points of the investigated power network. These dependences make it possible to determine a numerical value of the ΔI coefficient. A comparative analysis has been made of the operation of adaptive and non-adaptive micro-processing protections in the case of asymmetric faults at the same points of the investigated power network.
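One way to picture the role of the numerical criterion is as a threshold test on the relative current rise in each phase. The `delta_i` value and the phase-counting rule below are purely illustrative assumptions; the paper derives its criterion value from its own ΔI(t) simulations.

```python
def fault_type(i_a, i_b, i_c, i_nominal, delta_i=0.25):
    """Classify a short circuit by counting phases whose relative
    current rise exceeds the numerical criterion delta_i.
    (delta_i = 0.25 is a made-up placeholder value.)"""
    over = [ph for ph, i in zip("ABC", (i_a, i_b, i_c))
            if (i - i_nominal) / i_nominal > delta_i]
    if len(over) == 3:
        return "three-phase"
    if len(over) == 2:
        return "two-phase"
    return "no fault detected"
```

An adaptive protection could re-tune such a threshold per measurement point, which is the sensitivity benefit the abstract claims for the adaptive variant.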

  6. Improving Performance of Computer-aided Detection of Subtle Breast Masses Using an Adaptive Cueing Method

    Science.gov (United States)

    Wang, Xingwei; Li, Lihua; Xu, Weidong; Liu, Wei; Lederman, Dror; Zheng, Bin

    2012-01-01

Current computer-aided detection (CAD) schemes for detecting mammographic masses have several limitations, including high correlation with radiologists’ detection and cueing most subtle masses on only one view. To increase CAD sensitivity in cueing more subtle masses that are likely missed and/or overlooked by radiologists without increasing false-positive rates, we investigated a new case-dependent cueing method that combines the original CAD-generated detection scores with a computed bilateral mammographic density asymmetry index. Using the new method, we adaptively raise CAD-generated scores of regions detected on “high-risk” cases to cue more subtle mass regions and reduce CAD scores of regions detected on “low-risk” cases to discard more false-positive regions. A testing dataset involving 78 positive and 338 negative cases was used to test this adaptive cueing method. Each positive case involves two sequential examinations in which the mass was detected in the “current” examination and missed in the “prior” examination but detected in a retrospective review by radiologists. Applied to this dataset, a pre-optimized CAD scheme yielded 75% case-based and 55% region-based sensitivity on “current” examinations at a false-positive rate of 0.25 per image. CAD sensitivity was reduced to 42% (case-based) and 27% (region-based) on “prior” examinations. Using the new cueing method, case-based and region-based sensitivity increased by up to 9% and 33% on the “prior” examinations, respectively. The percentage of masses cued on both views also increased from 27% to 65%. The study demonstrated that this adaptive cueing method helps CAD cue more subtle cancers without increasing the false-positive cueing rate. PMID:22218075
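The case-dependent adjustment can be sketched as a multiplicative rescaling of CAD scores keyed to the bilateral density asymmetry index. The thresholds and boost/damp factors below are hypothetical, since the abstract does not give the exact combination rule.

```python
def adaptive_cueing(cad_scores, asymmetry_index, risk_threshold=0.5,
                    boost=1.2, damp=0.8, cue_threshold=0.55):
    """Raise region scores on 'high-risk' cases (high bilateral
    density asymmetry) and lower them on 'low-risk' cases, then cue
    the regions whose adjusted score clears the cueing threshold.
    All numeric parameters are illustrative placeholders."""
    factor = boost if asymmetry_index >= risk_threshold else damp
    adjusted = [min(1.0, s * factor) for s in cad_scores]
    cued = [i for i, s in enumerate(adjusted) if s >= cue_threshold]
    return adjusted, cued
```

A borderline region is thus cued on a high-asymmetry case but discarded on a low-asymmetry one, mirroring the trade-off the study reports.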

  7. Adaptive computer-based spatial-filtering method for more accurate estimation of the surface velocity of debris flow.

    Science.gov (United States)

    Uddin, M S; Inaba, H; Itakura, Y; Yoshida, Y; Kasahara, M

    1999-11-10

An adaptive computer-based spatial-filtering velocimeter for measuring the surface velocity of a natural debris flow with high accuracy is described; it adjusts the filter parameters, specifically the slit width of the filter, based on the surface-pattern characteristics of the flow. A computer simulation confirms the effectiveness of this technique. The surface velocity of a natural debris flow at the Mt. Yakedake Volcano, Japan, was estimated by this adaptive method, and the results were compared with those obtained by two other methods: hardware-based spatial filtering and normal computer-based spatial filtering.
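The principle behind spatial-filtering velocimetry is that a surface pattern moving at speed v past a periodic filter of pitch p produces a signal of frequency f = v/p, so v = f·p. A minimal frequency estimate via zero crossings is sketched below; the adaptive slit-width selection that is the paper's contribution is not modeled.

```python
def spatial_filter_velocity(signal, dt, pitch):
    """Estimate surface velocity from a spatial-filter output signal.
    signal -- filtered intensity samples
    dt     -- sampling interval in seconds
    pitch  -- spatial period of the slit filter in metres
    The dominant frequency is estimated from the mean zero-crossing
    rate (two crossings per cycle), then converted via v = f * pitch."""
    mean = sum(signal) / len(signal)
    centered = [s - mean for s in signal]
    crossings = sum(1 for a, b in zip(centered, centered[1:]) if a * b < 0)
    duration = dt * (len(signal) - 1)
    freq = crossings / (2.0 * duration)
    return freq * pitch
```

For a clean sinusoidal signal this recovers v accurately; real debris-flow signals are broadband, which is exactly why the paper adapts the slit width to the surface pattern.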

  8. Effect of Preparation Depth on the Marginal and Internal Adaptation of Computer-aided Design/Computer-assisted Manufacture Endocrowns.

    Science.gov (United States)

    Gaintantzopoulou, M D; El-Damanhoury, H M

The aim of the study was to evaluate the effect of preparation depth and intraradicular extension on the marginal and internal adaptation of computer-aided design/computer-assisted manufacture (CAD/CAM) endocrown restorations. Standardized preparations were made in resin endodontic tooth models (Nissin Dental), with an intracoronal preparation depth of 2 mm (group H2), with extra 1- (group H3) or 2-mm (group H4) intraradicular extensions in the root canals (n=12). Vita Enamic polymer-infiltrated ceramic-network material endocrowns were fabricated using the CEREC AC CAD/CAM system and were seated on the prepared teeth. Specimens were evaluated by microtomography. Horizontal and vertical tomographic sections were recorded and reconstructed by using the CTSkan software (TView v1.1, Skyscan). The surface/void volume (S/V) in the region of interest was calculated. Marginal gap (MG), absolute marginal discrepancy (MD), and internal marginal gap were measured at various measuring locations and calculated in microscale (μm). Marginal and internal discrepancy data (μm) were analyzed with nonparametric Kruskal-Wallis analysis of variance by ranks with Dunn's post hoc, whereas S/V data were analyzed by one-way analysis of variance and Bonferroni multiple comparisons (α=0.05). Significant differences were found in MG, MD, and internal gap width values between the groups, with H2 showing the lowest values from all groups. S/V calculations presented significant differences between H2 and the other two groups (H3 and H4) tested, with H2 again showing the lowest values. Increasing the intraradicular extension of endocrown restorations increased the marginal and internal gap of endocrown restorations.

  9. The Computer Code NOVO for the Calculation of Wake Potentials of the Very Short Ultra-relativistic Bunches

    Energy Technology Data Exchange (ETDEWEB)

    Novokhatski, Alexander; /SLAC

    2005-12-01

The problem of electromagnetic interaction of a beam and accelerator elements is very important for linear colliders, electron-positron factories, and free electron lasers. Precise calculation of wake fields is required for beam dynamics studies in these machines. We describe a method that allows computation of the wake fields of very short bunches. The computer code NOVO was developed based on this method. The method is free of unphysical solutions such as "self-acceleration" of the bunch head, which are common to well-known wake field codes. Code NOVO was used for wake field studies for many accelerator projects all over the world.

  10. Translation, cultural adaptation and validation of the English "Short form SF 12v2" into Bengali in rheumatoid arthritis patients.

    Science.gov (United States)

    Islam, Nazrul; Khan, Ikramul Hasan; Ferdous, Nira; Rasker, Johannes J

    2017-05-22

To develop a culturally adapted and validated Bengali Short Form SF 12v2 among rheumatoid arthritis (RA) patients. The English SF 12v2 was translated, adapted and back translated into and from Bengali, and pre-tested by 60 patients. The Bengali SF 12v2 was administered twice with a 14-day interval to 130 Bangladeshi RA patients. The psychometric properties of the Bengali SF 12v2 were assessed. Test-retest reliability was assessed by intra-class correlation coefficient (ICC) and Spearman's rank correlation coefficient, and internal consistency by Cronbach's alpha. Content validity was assessed by the index for content validity (ICV) and floor and ceiling effects. To determine convergent and discriminant validity, a Bengali Health Assessment Questionnaire (B-HAQ) was used. Factor analysis was done. The Bengali SF 12v2 was well accepted by the patients in the pre-test and showed good reliability. Internal consistency for both physical and mental components was satisfactory; Cronbach's alpha was 0.9. ICC exceeded 0.9 in all domains. Spearman's rho for all domains exceeded 0.8. The physical health component of the Bengali SF 12v2 had convergent validity to the B-HAQ. Its mental health component had discriminant validity to the B-HAQ. The ICV was 1 for all items. Factor analysis revealed two factors: a physical and a mental component. The interviewer-administered Bengali SF 12v2 appears to be an acceptable, reliable, and valid instrument for measuring health-related quality of life in Bengali speaking RA patients. Further evaluation in the general population and in different medical conditions should be done.
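Internal-consistency figures like the Cronbach's alpha of 0.9 reported above can be computed for any item-by-respondent matrix with the standard alpha formula; the sketch below shows it on toy data.

```python
def cronbach_alpha(items):
    """Cronbach's alpha: items is a list of columns, one list of
    respondent scores per item. alpha = k/(k-1) * (1 - sum of item
    variances / variance of total scores), with sample variances."""
    k = len(items)
    n = len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = sum(var(col) for col in items)
    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1 - item_vars / var(totals))
```

Two perfectly correlated items over four respondents give alpha = 8/9 ≈ 0.89; real scale data, as in the study, would use all 12 items and 130 respondents.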

  11. Computation of the Short-Time Linear Canonical Transform with Dual Window

    National Research Council Canada - National Science Library

    Lei Huang; Ke Zhang; Yi Chai; Shuiqing Xu

    2017-01-01

    The short-time linear canonical transform (STLCT), which maps the time domain signal into the joint time and frequency domain, has recently attracted some attention in the area of signal processing...

  12. Short-term effects of implemented high intensity shoulder elevation during computer work

    OpenAIRE

Larsen, Mette K; Samani, Afshin; Madeleine, Pascal; Olsen, Henrik B; Søgaard, Karen; Holtermann, Andreas

    2009-01-01

Abstract Background Work-site strength training sessions have been shown to be effective in preventing and reducing neck-shoulder pain in computer workers, but are difficult to integrate into normal working routines. A solution for avoiding neck-shoulder pain during computer work may be to implement high intensity voluntary contractions during the computer work. However, it is unknown how this may influence productivity, rate of perceived exertion (RPE) as well as activity and rest of neck-shoulder muscles during computer work.

  13. Creating a Computer Adaptive Test Version of the Late-Life Function & Disability Instrument

    Science.gov (United States)

    Jette, Alan M.; Haley, Stephen M.; Ni, Pengsheng; Olarsch, Sippy; Moed, Richard

    2009-01-01

    Background This study applied Item Response Theory (IRT) and Computer Adaptive Test (CAT) methodologies to develop a prototype function and disability assessment instrument for use in aging research. Herein, we report on the development of the CAT version of the Late-Life Function & Disability instrument (Late-Life FDI) and evaluate its psychometric properties. Methods We employed confirmatory factor analysis, IRT methods, validation, and computer simulation analyses of data collected from 671 older adults residing in residential care facilities. We compared accuracy, precision, and sensitivity to change of scores from CAT versions of two Late-Life FDI scales with scores from the fixed-form instrument. Score estimates from the prototype CAT versus the original instrument were compared in a sample of 40 older adults. Results Distinct function and disability domains were identified within the Late-Life FDI item bank and used to construct two prototype CAT scales. Using retrospective data, scores from computer simulations of the prototype CAT scales were highly correlated with scores from the original instrument. The results of computer simulation, accuracy, precision, and sensitivity to change of the CATs closely approximated those of the fixed-form scales, especially for the 10- or 15-item CAT versions. In the prospective study each CAT was administered in less than 3 minutes and CAT scores were highly correlated with scores generated from the original instrument. Conclusions CAT scores of the Late-Life FDI were highly comparable to those obtained from the full-length instrument with a small loss in accuracy, precision, and sensitivity to change. PMID:19038841

  14. A Comprehensive Review on Adaptability of Network Forensics Frameworks for Mobile Cloud Computing

    Science.gov (United States)

    Abdul Wahab, Ainuddin Wahid; Han, Qi; Bin Abdul Rahman, Zulkanain

    2014-01-01

Network forensics enables investigation and identification of network attacks through the retrieved digital content. The proliferation of smartphones and the cost-effective universal data access through the cloud has made Mobile Cloud Computing (MCC) a congenital target for network attacks. However, the constraints on carrying out forensics in MCC are tied to the autonomous cloud hosting companies and their policies for restricted access to the digital content in the back-end cloud platforms. This implies that existing Network Forensic Frameworks (NFFs) have limited impact in the MCC paradigm. To this end, we qualitatively analyze the adaptability of existing NFFs when applied to the MCC. Explicitly, the fundamental mechanisms of NFFs are highlighted and then analyzed using the most relevant parameters. A classification is proposed to help understand the anatomy of existing NFFs. Subsequently, a comparison is given that explores the functional similarities and deviations among NFFs. The paper concludes by discussing research challenges for progressive network forensics in MCC. PMID:25097880

  15. Guide-star-based computational adaptive optics for broadband interferometric tomography

    Science.gov (United States)

    Adie, Steven G.; Shemonski, Nathan D.; Graf, Benedikt W.; Ahmad, Adeel; Scott Carney, P.; Boppart, Stephen A.

    2012-11-01

    We present a method for the numerical correction of optical aberrations based on indirect sensing of the scattered wavefront from point-like scatterers ("guide stars") within a three-dimensional broadband interferometric tomogram. This method enables the correction of high-order monochromatic and chromatic aberrations utilizing guide stars that are revealed after numerical compensation of defocus and low-order aberrations of the optical system. Guide-star-based aberration correction in a silicone phantom with sparse sub-resolution-sized scatterers demonstrates improvement of resolution and signal-to-noise ratio over a large isotome. Results in highly scattering muscle tissue showed improved resolution of fine structure over an extended volume. Guide-star-based computational adaptive optics expands upon the use of image metrics for numerically optimizing the aberration correction in broadband interferometric tomography, and is analogous to phase-conjugation and time-reversal methods for focusing in turbid media.

  16. Electronic Structure Calculations and Adaptation Scheme in Multi-core Computing Environments

    Energy Technology Data Exchange (ETDEWEB)

    Seshagiri, Lakshminarasimhan; Sosonkina, Masha; Zhang, Zhao

    2009-05-20

Multi-core processing environments have become the norm in generic computing and are being considered for adding an extra dimension to the execution of any application. The T2 Niagara processor is a unique environment: it consists of eight cores, each capable of running eight threads simultaneously. Applications like General Atomic and Molecular Electronic Structure (GAMESS), used for ab initio molecular quantum chemistry calculations, can be good indicators of the performance of such machines and can serve as a guideline for both hardware designers and application programmers. In this paper we benchmark GAMESS performance on a T2 Niagara processor for a couple of molecules. We also show the suitability of using a middleware-based adaptation algorithm with GAMESS in such a multi-core environment.

  17. A comprehensive review on adaptability of network forensics frameworks for mobile cloud computing.

    Science.gov (United States)

    Khan, Suleman; Shiraz, Muhammad; Wahab, Ainuddin Wahid Abdul; Gani, Abdullah; Han, Qi; Rahman, Zulkanain Bin Abdul

    2014-01-01

Network forensics enables investigation and identification of network attacks through the retrieved digital content. The proliferation of smartphones and the cost-effective universal data access through the cloud has made Mobile Cloud Computing (MCC) a congenital target for network attacks. However, the constraints on carrying out forensics in MCC are tied to the autonomous cloud hosting companies and their policies for restricted access to the digital content in the back-end cloud platforms. This implies that existing Network Forensic Frameworks (NFFs) have limited impact in the MCC paradigm. To this end, we qualitatively analyze the adaptability of existing NFFs when applied to the MCC. Explicitly, the fundamental mechanisms of NFFs are highlighted and then analyzed using the most relevant parameters. A classification is proposed to help understand the anatomy of existing NFFs. Subsequently, a comparison is given that explores the functional similarities and deviations among NFFs. The paper concludes by discussing research challenges for progressive network forensics in MCC.

  18. An adaptive levelset method for computing solutions to incompressible two-phase flows.

    Science.gov (United States)

    Sussman, Mark; Lbnl Collaboration; Fatemi, Emad; Smereka, Peter; Osher, Stan

    1997-11-01

We present an adaptive level set method for computing 2D axisymmetric and fully 3D incompressible two-phase flow. Our methodology is specifically targeted at problems characterized by large density and viscosity jumps (e.g. air/water) and stiff, singular source terms, such as those due to surface tension. One such application is the modeling of ink-jet printers, in which one wants to accurately model the break-up of the jet into droplets. We compare our method to the Volume-of-Fluid method and the Boundary Integral Method, focusing the comparison on problems in which a change in topology occurs. We also validate our method against experiments and theory.
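The transport core of any level set method can be sketched in one dimension: the level set function phi is advected by phi_t + u*phi_x = 0 and the interface is tracked as its zero crossing. This first-order upwind sketch omits everything that makes the paper's solver interesting (adaptivity, reinitialization, surface tension, incompressibility), but shows the basic mechanism.

```python
def advect_level_set(phi, u, dx, dt, steps):
    """First-order upwind advection of a 1-D level set function:
    phi_t + u * phi_x = 0. Boundary values are held fixed, so the
    result is only valid away from the inflow boundary."""
    phi = list(phi)
    n = len(phi)
    for _ in range(steps):
        new = list(phi)
        for i in range(1, n - 1):
            # upwind difference: look against the flow direction
            dphi = ((phi[i] - phi[i - 1]) / dx if u > 0
                    else (phi[i + 1] - phi[i]) / dx)
            new[i] = phi[i] - u * dt * dphi
        phi = new
    return phi

def interface_position(phi, dx):
    """Locate the zero crossing of phi by linear interpolation."""
    for i in range(len(phi) - 1):
        if phi[i] <= 0 < phi[i + 1] or phi[i] >= 0 > phi[i + 1]:
            frac = phi[i] / (phi[i] - phi[i + 1])
            return (i + frac) * dx
    return None
```

Advecting the signed distance function x - 0.3 at unit speed for t = 0.2 moves the interface from x = 0.3 to x = 0.5, and for a linear phi the upwind scheme reproduces this exactly.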

  19. A Framework for the Development of Context-Adaptable User Interfaces for Ubiquitous Computing Systems

    Directory of Open Access Journals (Sweden)

    Gervasio Varela

    2016-07-01

This paper addresses the problem of developing user interfaces for Ubiquitous Computing (UC) and Ambient Intelligence (AmI) systems. These kinds of systems are expected to provide a natural user experience, considering interaction modalities adapted to the user's abilities and preferences and using whatever interaction devices are present in the environment. These interaction devices are not necessarily known at design time. The task is quite complicated due to the variety of devices and technologies and the diversity of scenarios, and it usually burdens the developer with the need to create many different UIs in order to cover the foreseeable user-environment combinations. Here, we propose a UI abstraction framework for UC and AmI systems that effectively improves the portability of those systems between different environments and for different users. It allows developers to design and implement a single UI capable of being deployed with different devices and modalities regardless of the physical location.

  20. A Framework for the Development of Context-Adaptable User Interfaces for Ubiquitous Computing Systems.

    Science.gov (United States)

    Varela, Gervasio; Paz-Lopez, Alejandro; Becerra, Jose A; Duro, Richard

    2016-07-07

This paper addresses the problem of developing user interfaces for Ubiquitous Computing (UC) and Ambient Intelligence (AmI) systems. These kinds of systems are expected to provide a natural user experience, considering interaction modalities adapted to the user's abilities and preferences and using whatever interaction devices are present in the environment. These interaction devices are not necessarily known at design time. The task is quite complicated due to the variety of devices and technologies and the diversity of scenarios, and it usually burdens the developer with the need to create many different UIs in order to cover the foreseeable user-environment combinations. Here, we propose a UI abstraction framework for UC and AmI systems that effectively improves the portability of those systems between different environments and for different users. It allows developers to design and implement a single UI capable of being deployed with different devices and modalities regardless of the physical location.

  1. Advancing the efficiency and efficacy of patient reported outcomes with multivariate computer adaptive testing.

    Science.gov (United States)

    Morris, Scott; Bass, Mike; Lee, Mirinae; Neapolitan, Richard E

    2017-09-01

    The Patient Reported Outcomes Measurement Information System (PROMIS) initiative developed an array of patient reported outcome (PRO) measures. To reduce the number of questions administered, PROMIS utilizes unidimensional item response theory and unidimensional computer adaptive testing (UCAT), which means a separate set of questions is administered for each measured trait. Multidimensional item response theory (MIRT) and multidimensional computer adaptive testing (MCAT) simultaneously assess correlated traits. The objective was to investigate the extent to which MCAT reduces patient burden relative to UCAT in the case of PROs. One MIRT and 3 unidimensional item response theory models were developed using the related traits anxiety, depression, and anger. Using these models, MCAT and UCAT performance was compared with simulated individuals. Surprisingly, the root mean squared error for both methods increased with the number of items. These results were driven by large errors for individuals with low trait levels. A second analysis focused on individuals aligned with item content. For these individuals, both MCAT and UCAT accuracies improved with additional items. Furthermore, MCAT reduced the test length by 50%. For the PROMIS Emotional Distress banks, neither UCAT nor MCAT provided accurate estimates for individuals at low trait levels. Because the items in these banks were designed to detect clinical levels of distress, there is little information for individuals with low trait values. However, trait estimates for individuals targeted by the banks were accurate and MCAT asked substantially fewer questions. By reducing the number of items administered, MCAT can allow clinicians and researchers to assess a wider range of PROs with less patient burden.
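The adaptive item selection that both UCAT and MCAT rely on can be illustrated with a minimal unidimensional sketch: under an assumed two-parameter logistic (2PL) IRT model, the next item administered is the not-yet-administered item with maximum Fisher information at the current ability estimate. The item bank values below are illustrative, not taken from PROMIS.

```python
import math

def p_correct(theta, a, b):
    """2PL item response function: probability of endorsing an item."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def item_information(theta, a, b):
    """Fisher information of a 2PL item at ability level theta."""
    p = p_correct(theta, a, b)
    return a * a * p * (1.0 - p)

def next_item(theta, items, administered):
    """Select the most informative not-yet-administered item at theta."""
    candidates = [i for i in range(len(items)) if i not in administered]
    return max(candidates, key=lambda i: item_information(theta, *items[i]))

# Illustrative (discrimination a, difficulty b) pairs -- not PROMIS values.
bank = [(1.2, -1.0), (0.8, 0.0), (1.5, 0.5), (1.0, 1.5), (2.0, 0.2)]
first = next_item(0.0, bank, set())       # highly discriminating item near theta = 0
second = next_item(0.0, bank, {first})    # next most informative item
```

After each response the ability estimate would be updated (e.g. by maximum likelihood) before the next selection; the accuracy problems reported above arise precisely because all items carry little information at low trait levels.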

  2. Health adaptation policy for climate vulnerable groups: a 'critical computational linguistics' analysis.

    Science.gov (United States)

    Seidel, Bastian M; Bell, Erica

    2014-11-28

Many countries are developing or reviewing national adaptation policy for climate change, but the extent to which these policies meet the health needs of vulnerable groups has not been assessed. This study examines the adequacy of such policies for nine known climate-vulnerable groups: people with mental health conditions, Aboriginal people, culturally and linguistically diverse groups, aged people, people with disabilities, rural communities, children, women, and socioeconomically disadvantaged people. The study analyses an exhaustive sample of national adaptation policy documents from Annex 1 ('developed') countries of the United Nations Framework Convention on Climate Change: 20 documents from 12 countries. A 'critical computational linguistics' method was used involving novel software-driven quantitative mapping and traditional critical discourse analysis. The study finds that references to vulnerable groups are sparse or absent, and poorly connected to language about practical strategies and socio-economic contexts, which are themselves little present. The conclusions offer strategies for developing policy that is better informed by a 'social determinants of health' definition of climate vulnerability, consistent with best practice in the literature and global policy prescriptions.

  3. Emergence of resonances in neural systems: the interplay between adaptive threshold and short-term synaptic plasticity.

    Directory of Open Access Journals (Sweden)

    Jorge F Mejias

In this work we study the detection of weak stimuli by spiking (integrate-and-fire) neurons in the presence of a certain level of noisy background neural activity. Our study has focused on the realistic assumption that the synapses in the network present activity-dependent processes, such as short-term synaptic depression and facilitation. Employing mean-field techniques as well as numerical simulations, we found that there are two possible noise levels which optimize signal transmission. This new finding is in contrast with the classical theory of stochastic resonance, which predicts only one optimal level of noise. We found that the complex interplay between the adaptive neuron threshold and activity-dependent synaptic mechanisms is responsible for this new phenomenology. Our main results are confirmed by employing a more realistic FitzHugh-Nagumo neuron model, which displays threshold variability, as well as by considering more realistic stochastic synaptic models and realistic signals such as Poissonian spike trains.
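The basic role of background noise in letting a subthreshold stimulus reach the firing threshold can be sketched with a bare-bones leaky integrate-and-fire simulation. This toy model has a fixed threshold and no synaptic dynamics, so it illustrates classical stochastic resonance rather than the two-noise-peak result above; all parameter values are assumptions.

```python
import random

def lif_spike_count(i_input, noise_sd, steps=5000, dt=0.1, tau=10.0,
                    v_th=1.0, v_reset=0.0, seed=1):
    """Spike count of a leaky integrate-and-fire neuron driven by a
    constant input current plus Gaussian background noise."""
    rng = random.Random(seed)
    v, spikes = 0.0, 0
    for _ in range(steps):
        # Euler step of tau * dv/dt = -v + input + noise
        v += (dt / tau) * (-v + i_input + rng.gauss(0.0, noise_sd))
        if v >= v_th:
            spikes += 1
            v = v_reset
    return spikes

quiet = lif_spike_count(0.8, 0.0)   # subthreshold input, no noise: silent
noisy = lif_spike_count(0.8, 5.0)   # same input plus background noise: fires
```

With zero noise the membrane potential settles at 0.8, below threshold, and the neuron never fires; moderate noise occasionally pushes it over threshold, which is the mechanism whose optimum the paper characterizes.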

  4. Cross-cultural adaptation of the Dutch Short Musculoskeletal Function Assessment questionnaire (SMFA-NL) : Internal consistency, validity, repeatability and responsiveness

    NARCIS (Netherlands)

    Reininga, Inge H. F.; El Moumni, Mostafa; Bulstra, Sjoerd K.; Olthof, Maurits G. L.; Wendt, Klaus W.; Stevens, Martin

    The purpose of this study was to translate and culturally adapt the Dutch version of the Short Musculoskeletal Function Assessment questionnaire (SMFA-NL) and to investigate the internal consistency, validity, repeatability and responsiveness of the translated version. The original SMFA was first

  5. Synergistic effect of supplemental enteral nutrients and exogenous glucagon-like peptide 2 on intestinal adaptation in a rat model of short bowel syndrome

    DEFF Research Database (Denmark)

    Liu, Xiaowen; Nelson, David W; Holst, Jens Juul

    2006-01-01

    BACKGROUND: Short bowel syndrome (SBS) can lead to intestinal failure and require total or supplemental parenteral nutrition (TPN or PN, respectively). Glucagon-like peptide 2 (GLP-2) is a nutrient-dependent, proglucagon-derived gut hormone that stimulates intestinal adaptation. OBJECTIVE: Our...... of GLP-2 (SEN x GLP-2 interaction, P cellularity and digestive capacity in parenterally fed rats with SBS...

  6. Potential Bone to Implant Contact Area of Short Versus Standard Implants: An In Vitro Micro-Computed Tomography Analysis.

    Science.gov (United States)

    Quaranta, Alessandro; DʼIsidoro, Orlando; Bambini, Fabrizio; Putignano, Angelo

    2016-02-01

To compare the available potential bone-implant contact (PBIC) area of standard and short dental implants by micro-computed tomography (μCT) assessment. Three short implants with different diameters (4.5 × 6 mm, 4.1 × 7 mm, and 4.1 × 6 mm) and 2 standard implants (3.5 × 10 mm and 3.3 × 9 mm) with diverse design and surface features were scanned with μCT. Cross-sectional images were obtained. Image data were manually processed to find the plane that corresponds to the most coronal contact point between the crestal bone and implant. The available PBIC was calculated for each sample. The cross-sectional slices were then processed with 3-dimensional (3D) software, and 3D images of each sample were used for descriptive analysis and to display the micro- and macrotopography. The wide-diameter short implant (4.5 × 6 mm) showed the highest PBIC value (210.89 mm), followed by the standard implants (178.07 mm and 185.37 mm) and the remaining short implants (130.70 mm and 110.70 mm). Wide-diameter short implants show a surface area comparable with standard implants. Micro-CT analysis is a promising technique to evaluate surface area in dental implants with different macrodesigns, microdesigns, and surface features.

  7. An Efficient Adaptive Load Balancing Algorithm for Cloud Computing Under Bursty Workloads

    Directory of Open Access Journals (Sweden)

    S. F. Issawi

    2015-06-01

Cloud computing is a recent, emerging technology in the IT industry. It is an evolution of previous models such as grid computing. It enables a wide range of users to access a large shared pool of resources over the internet. In such a complex system, there is a tremendous need for an efficient load balancing scheme in order to satisfy peak user demands and provide a high quality of service. One of the challenging problems that degrade the performance of a load balancing process is bursty workloads. Although many studies have proposed different load balancing algorithms, most of them neglect the problem of bursty workloads. Motivated by this problem, this paper proposes a new burstiness-aware load balancing algorithm which can adapt to variations in the request rate by adopting two load balancing algorithms: round-robin (RR) in the burst state and Random in the non-burst state. Fuzzy logic is used in order to assign the received request to a balanced VM. The algorithm has been evaluated and compared with other algorithms using the CloudAnalyst simulator. Results show that the proposed algorithm improves the average response time and average processing time in comparison with other algorithms.
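The two-state dispatching policy described above can be sketched as follows; a simple request-rate threshold stands in for the paper's fuzzy-logic burst detector, and the class and parameter names are hypothetical.

```python
import random
from itertools import cycle

class BurstAwareBalancer:
    """Route requests round-robin during detected bursts and randomly
    otherwise; a rate threshold stands in for the fuzzy-logic detector."""

    def __init__(self, vms, burst_threshold=50, seed=0):
        self.vms = list(vms)
        self.rr = cycle(self.vms)                 # round-robin iterator
        self.burst_threshold = burst_threshold    # requests per window
        self.rng = random.Random(seed)

    def assign(self, requests_in_window):
        """Pick a VM for the next request given the observed arrival rate."""
        if requests_in_window >= self.burst_threshold:  # burst state: RR
            return next(self.rr)
        return self.rng.choice(self.vms)                # non-burst: Random

balancer = BurstAwareBalancer(["vm1", "vm2", "vm3"])
```

Round-robin spreads a burst evenly across VMs, while random assignment is cheap and adequate when the arrival rate is low; switching between them is the algorithm's adaptive element.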

  8. An Adaptive and Integrated Low-Power Framework for Multicore Mobile Computing

    Directory of Open Access Journals (Sweden)

    Jongmoo Choi

    2017-01-01

Employing multicore processors in mobile computing, such as smartphones and IoT (Internet of Things) devices, is a double-edged sword. It provides the ample computing capability required by recent intelligent mobile services, including voice recognition, image processing, big data analysis, and deep learning. However, it consumes a great deal of power, which creates thermal hot spots and puts pressure on the energy resources of a mobile device. In this paper, we propose a novel framework that integrates two well-known low-power techniques, DPM (Dynamic Power Management) and DVFS (Dynamic Voltage and Frequency Scaling), for energy efficiency in multicore mobile systems. The key feature of the proposed framework is adaptability. By monitoring online resource usage, such as CPU utilization and power consumption, the framework can orchestrate diverse DPM and DVFS policies according to workload characteristics. Experiments based on real implementations on three mobile devices have shown that it can reduce power consumption by 22% to 79%, while affecting the performance of workloads only negligibly.
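The kind of policy such a framework orchestrates can be sketched with two toy decisions: a DVFS rule mapping observed utilization to the lowest adequate frequency, and a DPM rule powering a core down after a run of idle monitoring windows. The frequency table and timeout are illustrative assumptions, not values from the paper.

```python
FREQS_GHZ = (0.6, 1.0, 1.4, 1.8)   # assumed P-state table

def pick_frequency(utilization, freqs=FREQS_GHZ):
    """DVFS rule: choose the lowest frequency whose capacity covers the
    demand observed at maximum frequency (utilization in 0..1)."""
    demand = utilization * max(freqs)
    for f in freqs:
        if demand <= f:
            return f
    return freqs[-1]

def core_power_state(idle_windows, dpm_timeout=3):
    """DPM rule: power a core off after `dpm_timeout` idle windows."""
    return "off" if idle_windows >= dpm_timeout else "on"
```

An adaptive framework would tune the threshold-style parameters of both rules online as the measured workload mix changes, rather than fixing them as this sketch does.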

  9. Grain classifier with computer vision using adaptive neuro-fuzzy inference system.

    Science.gov (United States)

    Sabanci, Kadir; Toktas, Abdurrahim; Kayabasi, Ahmet

    2017-09-01

A computer vision-based classifier using an adaptive neuro-fuzzy inference system (ANFIS) is designed for classifying wheat grains into bread or durum. To train and test the classifier, images of 200 wheat grains (100 bread and 100 durum) are taken by a high-resolution camera. Visual feature data of the grains related to dimension (4 features), color (3), and texture (5) are acquired for each grain as inputs of the classifier using image processing techniques (IPTs). In addition to these main data, nine features are reproduced from the main features to ensure a varied population. Thus four sub-sets including categorized features of reproduced data are constituted to examine their effects on the classification. In order to simplify the classifier, the visual features most effective on the results are investigated. The data sets are compared with each other regarding classification accuracy. A simplified classifier having seven selected features achieves the best results. In the testing process, the simplified classifier computes the output with 99.46% accuracy and assorts the wheat grains with 100% accuracy. A system which classifies wheat grains with higher accuracy is designed. The proposed classifier, integrated into industrial applications, can automatically classify a variety of wheat grains. © 2017 Society of Chemical Industry.

  10. Bayesian inference for an adaptive Ordered Probit model: an application to Brain Computer Interfacing.

    Science.gov (United States)

    Yoon, Ji Won; Roberts, Stephen J; Dyson, Mathew; Gan, John Q

    2011-09-01

    This paper proposes an algorithm for adaptive, sequential classification in systems with unknown labeling errors, focusing on the biomedical application of Brain Computer Interfacing (BCI). The method is shown to be robust in the presence of label and sensor noise. We focus on the inference and prediction of target labels under a nonlinear and non-Gaussian model. In order to handle missing or erroneous labeling, we model observed labels as a noisy observation of a latent label set with multiple classes (≥ 2). Whilst this paper focuses on the method's application to BCI systems, the algorithm has the potential to be applied to many application domains in which sequential missing labels are to be imputed in the presence of uncertainty. This dynamic classification algorithm combines an Ordered Probit model and an Extended Kalman Filter (EKF). The EKF estimates the parameters of the Ordered Probit model sequentially with time. We test the performance of the classification approach by processing synthetic datasets and real experimental EEG signals with multiple classes (2, 3 and 4 labels) for a Brain Computer Interfacing (BCI) experiment. Copyright © 2011 Elsevier Ltd. All rights reserved.
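The Ordered Probit likelihood at the heart of the model can be written down directly: a latent Gaussian score is partitioned by increasing cutpoints, and each ordinal class receives the probability mass between adjacent cutpoints. The sketch below (standard library only; cutpoint values are illustrative) computes those class probabilities; the paper's EKF, which updates the model parameters sequentially, is not shown.

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def ordered_probit_probs(eta, cutpoints):
    """Class probabilities for an ordered probit model: latent score eta
    plus standard normal noise, partitioned by increasing cutpoints."""
    cs = [-math.inf] + sorted(cutpoints) + [math.inf]
    return [norm_cdf(cs[k + 1] - eta) - norm_cdf(cs[k] - eta)
            for k in range(len(cs) - 1)]

# Three ordinal classes separated by cutpoints at -1 and +1 (illustrative).
probs = ordered_probit_probs(0.0, (-1.0, 1.0))
```

With K - 1 cutpoints this yields K classes, matching the paper's requirement of multiple (≥ 2) label classes; shifting eta moves probability mass monotonically toward higher classes.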

  11. Functional near-infrared spectroscopy for adaptive human-computer interfaces

    Science.gov (United States)

    Yuksel, Beste F.; Peck, Evan M.; Afergan, Daniel; Hincks, Samuel W.; Shibata, Tomoki; Kainerstorfer, Jana; Tgavalekos, Kristen; Sassaroli, Angelo; Fantini, Sergio; Jacob, Robert J. K.

    2015-03-01

We present a brain-computer interface (BCI) that detects, analyzes and responds to user cognitive state in real-time using machine learning classifications of functional near-infrared spectroscopy (fNIRS) data. Our work is aimed at increasing the narrow communication bandwidth between the human and computer by implicitly measuring users' cognitive state without any additional effort on the part of the user. Traditionally, BCIs have been designed to explicitly send signals as the primary input. However, such systems are usually designed for people with severe motor disabilities and are too slow and inaccurate for the general population. In this paper, we demonstrate with previous work [1] that a BCI that implicitly measures cognitive workload can improve user performance and awareness compared to a control condition by adapting to user cognitive state in real-time. We also discuss some of the other applications we have used in this field to measure and respond to cognitive states such as cognitive workload, multitasking, and user preference.

  12. Computer-Aided Modelling of Short-Path Evaporation for Chemical Product Purification, Analysis and Design

    DEFF Research Database (Denmark)

    Sales-Cruz, Alfonso Mauricio; Gani, Rafiqul

    2006-01-01

An important stage in the design process for many chemical products is their manufacture where, for a class of chemical products that may be thermally unstable (such as drugs, insecticides, flavours/fragrances, and so on), the purification step plays a major role. Short-path evaporation is a safe method, suitable for separation and purification of thermally unstable materials, whose design and analysis can be efficiently performed through reliable model-based techniques. This paper presents a generalized model for short-path evaporation and highlights its development, implementation and solution...

  13. Adaptive off-line protocol for prostate external radiotherapy with cone beam computer tomography.

    Science.gov (United States)

    Piziorska, M; Kukołowicz, P; Zawadzka, A; Pilichowska, M; Pęczkowski, P

    2012-11-01

    The goal of this work was to prepare and to evaluate an off-line adaptive protocol for prostate teleradiotherapy with kilovoltage cone beam computer tomography (CBCT). Ten patients with localized prostate carcinoma treated with external beams underwent image-guided radiotherapy. In total, 162 CBCT images were collected. Position of prostate and pubis symphysis (PS) with respect to the isocenter were measured off-line. Using the CBCT scans obtained in the first three fractions the average position of prostate in relation (AvPosPr) to PB was calculated. On each CBCT scan, the position of prostate with respect to AvPosPr was calculated and cumulative histogram of prostate displacement with respect to AvPosPr was prepared. Using this data, the adaptive protocol was prepared in which (1) based on the CBCT made in the first three fractions the AvPosPr to PS is obtained, (2) in all other fractions two orthogonal images are acquired and if for any direction set-up error exceeds 0.2 cm the patient's position is corrected, and (3) additionally, the patient's position is corrected if the AvPosPr exceeds 0.2 cm in any direction. To evaluate the adaptive protocol for 30 consecutive patients, the CBCT was also made in 10th and 21st fraction. For the first 10 patients, the results revealed that the prostate was displaced in relation to AvPosPr >0.7 cm in the vertical and longitudinal directions only on 4 and 5 images of 162 CBCT images, respectively. For the lateral direction, this displacement was >0.3 cm in one case. For the group of 30 patients, displacement was never >0.7, and 0.3 cm for the vertical and lateral directions. In two cases, displacements were >0.7 cm for the longitudinal direction. Implementation of the proposed adaptive procedure based on the on-line set-up error elimination followed by a reduction of systematic internal error enables reducing the CTV-PTV margin to 0.7, 0.7, and 0.4 cm for the vertical, longitudinal, and lateral directions
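The decision logic of steps (1)-(3) reduces to two small computations, sketched here with hypothetical helper names: averaging the prostate position over the first three fractions and testing displacements against the 0.2 cm action level.

```python
def average_reference_position(first_three_fractions):
    """AvPosPr: mean prostate-to-PS displacement (x, y, z in cm) over
    the CBCT scans of the first three fractions."""
    n = len(first_three_fractions)
    return tuple(sum(p[i] for p in first_three_fractions) / n
                 for i in range(3))

def needs_correction(displacement_cm, action_level=0.2):
    """True if any axis exceeds the protocol's 0.2 cm action level."""
    return any(abs(d) > action_level for d in displacement_cm)

# Illustrative displacements for the first three fractions.
av = average_reference_position([(0.1, 0.0, 0.2), (0.3, 0.0, 0.2),
                                 (0.2, 0.0, 0.2)])
```

In the protocol the same 0.2 cm rule is applied twice: on-line to the set-up error seen on the orthogonal images, and off-line to the AvPosPr itself, which removes the systematic component of the internal error.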

  14. Evolution of Computer Virus Concealment and Anti-Virus Techniques: A Short Survey

    OpenAIRE

    Rad, Babak Bashari; Masrom, Maslin; Ibrahim, Suhaimi

    2011-01-01

This paper presents a general overview of the evolution of concealment methods in computer viruses and the defensive techniques employed by anti-virus products. In order to evade anti-virus scanners, computer viruses gradually improve their code to make themselves invisible. On the other hand, anti-virus technologies continually follow the virus tricks and methodologies to overcome their threats. In this process, anti-virus experts design and develop new methodologies to make them stronger, mo...

  15. Adaptive Statistical Iterative Reconstruction-V Versus Adaptive Statistical Iterative Reconstruction: Impact on Dose Reduction and Image Quality in Body Computed Tomography.

    Science.gov (United States)

    Gatti, Marco; Marchisio, Filippo; Fronda, Marco; Rampado, Osvaldo; Faletti, Riccardo; Bergamasco, Laura; Ropolo, Roberto; Fonio, Paolo

    2017-09-20

    The aim of this study was to evaluate the impact on dose reduction and image quality of the new iterative reconstruction technique: adaptive statistical iterative reconstruction (ASIR-V). Fifty consecutive oncologic patients acted as case controls undergoing during their follow-up a computed tomography scan both with ASIR and ASIR-V. Each study was analyzed in a double-blinded fashion by 2 radiologists. Both quantitative and qualitative analyses of image quality were conducted. Computed tomography scanner radiation output was 38% (29%-45%) lower (P image noise was significantly lower (P image noise (P = 0.01 for 5 mm and P = 0.009 for 1.25 mm), the other parameters (image sharpness, diagnostic acceptability, and overall image quality) being similar (P > 0.05). Adaptive statistical iterative reconstruction-V is a new iterative reconstruction technique that has the potential to provide image quality equal to or greater than ASIR, with a dose reduction around 40%.

  16. Short Review of Computational Models for Single Cell Deformation and Migration

    NARCIS (Netherlands)

    Vermolen, F.J.

    2015-01-01

    This short review communication aims at enumerating several modeling efforts that have been performed to model cell migration and deformation. To optimize and improve medical treatments against diseases like cancer, ischemic wounds or pressure ulcers, it is of vital importance to understand the

  17. a computer program for short circuit analysis of electric power systems

    African Journals Online (AJOL)

    ES Obe

    1981-03-01

... system. Specifically in short circuit studies, the power system network is subjected to postulated fault conditions and the resulting faulted network is solved to determine the phase (and sequence) voltages, currents and power of any bus or transmission line of the system. From this analysis, the power system.

  18. Efficient Computation of Multiscale Entropy over Short Biomedical Time Series Based on Linear State-Space Models

    Directory of Open Access Journals (Sweden)

    Luca Faes

    2017-01-01

The most common approach to assessing the dynamical complexity of a time series across multiple temporal scales makes use of the multiscale entropy (MSE) and refined MSE (RMSE) measures. In spite of their popularity, MSE and RMSE lack an analytical framework allowing their calculation for known dynamic processes, and they cannot be reliably computed over short time series. To overcome these limitations, we propose a method to assess RMSE for autoregressive (AR) stochastic processes. The method makes use of linear state-space (SS) models to provide the multiscale parametric representation of an AR process observed at different time scales and exploits the SS parameters to quantify analytically the complexity of the process. The resulting linear MSE (LMSE) measure is first tested in simulations, both theoretically, to relate the multiscale complexity of AR processes to their dynamical properties, and over short process realizations, to assess its computational reliability in comparison with RMSE. It is then applied to time series of heart period, arterial pressure, and respiration measured in healthy subjects monitored at rest and during physiological stress. This application to short-term cardiovascular variability documents that LMSE can describe the activity of physiological mechanisms producing biological oscillations at different temporal scales better than RMSE can.
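For contrast with the parametric LMSE approach, the conventional non-parametric MSE computation can be sketched as coarse-graining followed by sample entropy; this naive O(n²) estimator is exactly the one that becomes unreliable on short series.

```python
import math

def coarse_grain(x, scale):
    """Average non-overlapping windows of length `scale`."""
    n = len(x) // scale
    return [sum(x[i * scale:(i + 1) * scale]) / scale for i in range(n)]

def sample_entropy(x, m=2, r=0.2):
    """SampEn(m, r) = -ln(A/B): template matches of length m + 1 (A)
    and m (B) under the Chebyshev distance, with aligned counts."""
    n = len(x)
    def matches(length):
        tpl = [x[i:i + length] for i in range(n - m)]
        c = 0
        for i in range(len(tpl)):
            for j in range(i + 1, len(tpl)):
                if max(abs(a - b) for a, b in zip(tpl[i], tpl[j])) <= r:
                    c += 1
        return c
    a, b = matches(m + 1), matches(m)
    return -math.log(a / b) if a and b else float("inf")

def multiscale_entropy(x, scales=(1, 2, 3), m=2, r=0.2):
    """MSE profile: sample entropy of each coarse-grained series."""
    return [sample_entropy(coarse_grain(x, s), m, r) for s in scales]
```

As the scale grows, the coarse-grained series shortens by that factor, so match counts become sparse and the estimate degrades; the paper's SS-model approach avoids this by computing the multiscale complexity analytically from fitted parameters.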

  19. An open trial assessment of "The Number Race", an adaptive computer game for remediation of dyscalculia

    Directory of Open Access Journals (Sweden)

    Cohen Laurent

    2006-05-01

Background In a companion article [1], we described the development and evaluation of software designed to remediate dyscalculia. This software is based on the hypothesis that dyscalculia is due to a "core deficit" in number sense or in its access via symbolic information. Here we review the evidence for this hypothesis, and present results from an initial open-trial test of the software in a sample of nine 7–9 year old children with mathematical difficulties. Methods Children completed adaptive training on numerical comparison for half an hour a day, four days a week, over a period of five weeks. They were tested before and after intervention on their performance in core numerical tasks: counting, transcoding, base-10 comprehension, enumeration, addition, subtraction, and symbolic and non-symbolic numerical comparison. Results Children showed specific increases in performance on core number sense tasks. Speed of subitizing and numerical comparison increased by several hundred msec. Subtraction accuracy increased by an average of 23%. Performance on addition and base-10 comprehension tasks did not improve over the period of the study. Conclusion Initial open-trial testing showed promising results, and suggested that the software was successful in increasing number sense over the short period of the study. However these results need to be followed up with larger, controlled studies. The issues of transfer to higher-level tasks, and of the best developmental time window for intervention, also need to be addressed.

  20. An open trial assessment of "The Number Race", an adaptive computer game for remediation of dyscalculia

    Science.gov (United States)

    Wilson, Anna J; Revkin, Susannah K; Cohen, David; Cohen, Laurent; Dehaene, Stanislas

    2006-01-01

Background In a companion article [1], we described the development and evaluation of software designed to remediate dyscalculia. This software is based on the hypothesis that dyscalculia is due to a "core deficit" in number sense or in its access via symbolic information. Here we review the evidence for this hypothesis, and present results from an initial open-trial test of the software in a sample of nine 7–9 year old children with mathematical difficulties. Methods Children completed adaptive training on numerical comparison for half an hour a day, four days a week, over a period of five weeks. They were tested before and after intervention on their performance in core numerical tasks: counting, transcoding, base-10 comprehension, enumeration, addition, subtraction, and symbolic and non-symbolic numerical comparison. Results Children showed specific increases in performance on core number sense tasks. Speed of subitizing and numerical comparison increased by several hundred msec. Subtraction accuracy increased by an average of 23%. Performance on addition and base-10 comprehension tasks did not improve over the period of the study. Conclusion Initial open-trial testing showed promising results, and suggested that the software was successful in increasing number sense over the short period of the study. However these results need to be followed up with larger, controlled studies. The issues of transfer to higher-level tasks, and of the best developmental time window for intervention, also need to be addressed. PMID:16734906

  1. Short- and Long-Term Biomarkers for Bacterial Robustness: A Framework for Quantifying Correlations between Cellular Indicators and Adaptive Behavior

    NARCIS (Netherlands)

    Besten, den H.M.W.; Arvind, A.; Gaballo, H.M.S.; Moezelaar, R.; Zwietering, M.H.; Abee, T.

    2010-01-01

    The ability of microorganisms to adapt to changing environments challenges the prediction of their history-dependent behavior. Cellular biomarkers that are quantitatively correlated to stress adaptive behavior will facilitate our ability to predict the impact of these adaptive traits. Here, we

  2. Short-term adaptation during propagation improves the performance of xylose-fermenting Saccharomyces cerevisiae in simultaneous saccharification and co-fermentation.

    Science.gov (United States)

    Nielsen, Fredrik; Tomás-Pejó, Elia; Olsson, Lisbeth; Wallberg, Ola

    2015-01-01

    Inhibitors that are generated during thermochemical pretreatment and hydrolysis impair the performance of microorganisms during fermentation of lignocellulosic hydrolysates. In omitting costly detoxification steps, the fermentation process relies extensively on the performance of the fermenting microorganism. One attractive option of improving its performance and tolerance to microbial inhibitors is short-term adaptation during propagation. This study determined the influence of short-term adaptation on the performance of recombinant Saccharomyces cerevisiae in simultaneous saccharification and co-fermentation (SSCF). The aim was to understand how short-term adaptation with lignocellulosic hydrolysate affects the cell mass yield of propagated yeast and performance in subsequent fermentation steps. The physiology of propagated yeast was examined with regard to viability, vitality, stress responses, and upregulation of relevant genes to identify any links between the beneficial traits that are promoted during adaptation and overall ethanol yields in co-fermentation. The presence of inhibitors during propagation significantly improved fermentation but lowered cell mass yield during propagation. Xylose utilization of adapted cultures was enhanced by increasing amounts of hydrolysate in the propagation. Ethanol yields improved by over 30 % with inhibitor concentrations that corresponded to ≥2.5 % water-insoluble solids (WIS) load during the propagation compared with the unadapted culture. Adaptation improved cell viability by >10 % and increased vitality by >20 %. Genes that conferred resistance against inhibitors were upregulated with increasing amounts of inhibitors during the propagation, but the adaptive response was not associated with improved ethanol yields in SSCF. The positive effects in SSCF were observed even with adaptation at inhibitor concentrations that corresponded to 2.5 % WIS. Higher amounts of hydrolysate in the propagation feed further

  3. A Computer Program to Calculate Two-Stage Short-Run Control Chart Factors for (X, MR) Charts

    Directory of Open Access Journals (Sweden)

    Matthew E. Elam

    2006-04-01

This paper is the second in a series of two papers that fully develops two-stage short-run (X, MR) control charts. This paper describes the development and execution of a computer program that accurately calculates first- and second-stage short-run control chart factors for (X, MR) charts using the equations derived in the first paper. The software used is Mathcad. The program accepts values for the number of subgroups, α for the X chart, and α for the MR chart, both above the upper control limit and below the lower control limit. Tables are generated for specific values of these inputs and the implications of the results are discussed. A numerical example illustrates the use of the program.
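For orientation, the conventional long-run (X, MR) chart limits look as follows; the two-stage short-run factors computed by the paper's program replace the textbook constants 2.66 and 3.267 when only a few subgroups are available.

```python
def xmr_limits(data):
    """Conventional individuals/moving-range chart limits. The short-run
    factors from the paper would replace 2.66 and 3.267 for small
    numbers of subgroups."""
    mrs = [abs(b - a) for a, b in zip(data, data[1:])]  # moving ranges
    xbar = sum(data) / len(data)
    mrbar = sum(mrs) / len(mrs)
    return {"x": (xbar - 2.66 * mrbar, xbar + 2.66 * mrbar),
            "mr": (0.0, 3.267 * mrbar)}

limits = xmr_limits([10.0, 12.0, 11.0, 13.0, 12.0])  # illustrative data
```

The two-stage procedure first screens the initial data against stage-one limits before computing the stage-two limits used for ongoing monitoring, which is what the exact factors tabulated by the program support.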

  4. Development of a lack of appetite item bank for computer-adaptive testing (CAT).

    Science.gov (United States)

    Thamsborg, Lise Holst; Petersen, Morten Aa; Aaronson, Neil K; Chie, Wei-Chu; Costantini, Anna; Holzner, Bernhard; Verdonck-de Leeuw, Irma M; Young, Teresa; Groenvold, Mogens

    2015-06-01

    A significant proportion of oncological patients experiences lack of appetite. Precise measurement is relevant to improve the management of lack of appetite. The so-called computer-adaptive test (CAT) allows for adaptation of the questionnaire to the individual patient, thereby optimizing measurement precision. The EORTC Quality of Life Group is developing a CAT version of the widely used EORTC QLQ-C30 questionnaire. Here, we report on the development of the lack of appetite CAT. The EORTC approach to CAT development comprises four phases: literature search, operationalization, pre-testing, and field testing. Phases 1-3 are described in this paper. First, a list of items was retrieved from the literature. This was refined, deleting redundant and irrelevant items. Next, new items fitting the "QLQ-C30 item style" were created. These were evaluated by international samples of experts and cancer patients. The literature search generated a list of 146 items. After a comprehensive item selection procedure, the list was reduced to 24 items. These formed the basis for 21 new items fitting the QLQ-C30 item style. Expert evaluations (n = 10) and patient interviews (n = 49) reduced the list to 12 lack of appetite items. Phases 1-3 resulted in 12 lack of appetite candidate items. Based on a field testing (phase 4), the psychometric characteristics of the items will be assessed and the final item bank will be generated. This CAT item bank is expected to provide precise and efficient measurement of lack of appetite while still being backward compatible to the original QLQ-C30 scale.

  5. Computationally Efficient Blind Code Synchronization for Asynchronous DS-CDMA Systems with Adaptive Antenna Arrays

    Directory of Open Access Journals (Sweden)

    Chia-Chang Hu

    2005-04-01

    Full Text Available A novel space-time adaptive near-far robust code-synchronization array detector for asynchronous DS-CDMA systems is developed in this paper. Its basic requirements are the same as those of the conventional matched filter of an asynchronous DS-CDMA system. For real-time applicability, a computationally efficient architecture of the proposed detector is developed, based on the concept of the multistage Wiener filter (MWF) of Goldstein and Reed. This multistage technique results in a self-synchronizing detection criterion that requires no inversion or eigendecomposition of a covariance matrix. As a consequence, this detector achieves a complexity that is only a linear function of the size of the antenna array (J), the rank of the MWF (M), the system processing gain (N), and the number of samples in a chip interval (S), that is, 𝒪(JMNS). The complexity of the equivalent detector based on the minimum mean-squared error (MMSE) criterion or on subspace-based eigenstructure analysis is of order 𝒪((JNS)³). Moreover, this multistage scheme provides rapid adaptive convergence under limited observation-data support. Simulations are conducted to evaluate the performance and convergence behavior of the proposed detector with the size J of the antenna array, the amount L of sample support, and the rank M of the MWF. The performance advantage of the proposed detector over other DS-CDMA detectors is investigated as well.

  6. Computational Design of Short Pulse Laser Driven Iron Opacity Measurements at Stellar-Relevant Conditions

    Energy Technology Data Exchange (ETDEWEB)

    Martin, Madison E. [Univ. of Florida, Gainesville, FL (United States)

    2017-05-20

    Opacity is a critical parameter in the simulation of radiation transport in systems such as inertial confinement fusion capsules and stars. The resolution of current disagreements between solar models and helioseismological observations would benefit from experimental validation of theoretical opacity models. Overall, short pulse laser heated iron experiments reaching stellar-relevant conditions have been designed with consideration of minimizing tamper emission and optical depth effects while meeting plasma condition and x-ray emission goals.

  7. Adaptive estimation of hand movement trajectory in an EEG based brain-computer interface system.

    Science.gov (United States)

    Robinson, Neethu; Guan, Cuntai; Vinod, A P

    2015-12-01

    The various parameters that define a hand movement, such as its trajectory, speed, etc., are encoded in distinct brain activities. Decoding this information from neurophysiological recordings is a less explored area of brain-computer interface (BCI) research. Applying non-invasive recordings such as electroencephalography (EEG) for decoding makes the problem more challenging, as the encoding is assumed to be deep within the brain and not easily accessible by scalp recordings. EEG based BCI systems can be developed to identify the neural features underlying movement parameters that can be further utilized to provide a detailed and well defined control command set to a BCI output device. A real-time continuous control is better suited for practical BCI systems, and can be achieved by continuous adaptive reconstruction of movement trajectory rather than discrete brain activity classifications. In this work, we adaptively reconstruct/estimate the parameters of two-dimensional hand movement trajectory, namely movement speed and position, from multi-channel EEG recordings. The data for analysis were collected by performing an experiment that involved center-out right-hand movement tasks in four different directions at two different speeds in random order. We estimate movement trajectory using a Kalman filter that models the relation between brain activity and recorded parameters based on a set of defined predictors. We propose a method to define these predictor variables that includes spatial, spectral and temporally localized neural information and to select optimally informative variables. The proposed method yielded a correlation of (0.60 ± 0.07) between recorded and estimated data. Further, incorporating the proposed predictor subset selection, the correlation achieved is (0.57 ± 0.07, p reduction in number of predictors (76%) for the savings of computational time. The proposed system provides a real time movement control system using EEG-BCI with control over movement speed
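
The trajectory estimator described in this record is, at its core, a standard linear Kalman recursion. The sketch below shows one generic predict/update step, assuming a linear state model x(t+1) = A·x(t) + w and a linear observation model z(t) = H·x(t) + q; all matrices here are illustrative placeholders, not values from the study.

```python
import numpy as np

# One Kalman filter step. In the EEG setting, x would hold hand kinematics
# (position, speed) and z the EEG-derived predictor variables; here the
# matrices are generic illustrations.
def kalman_step(x, P, z, A, W, H, Q):
    # Predict state and covariance forward one step
    x_pred = A @ x
    P_pred = A @ P @ A.T + W
    # Update with the new observation z
    S = H @ P_pred @ H.T + Q                 # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```

In a scalar example with unit observation noise, a measurement of 1 pulls the estimate roughly halfway from the prior mean toward the measurement, as expected when prior and observation uncertainties are comparable.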

  8. Goal-recognition-based adaptive brain-computer interface for navigating immersive robotic systems

    Science.gov (United States)

    Abu-Alqumsan, Mohammad; Ebert, Felix; Peer, Angelika

    2017-06-01

    Objective. This work proposes principled strategies for self-adaptations in EEG-based Brain-computer interfaces (BCIs) as a way out of the bandwidth bottleneck resulting from the considerable mismatch between the low-bandwidth interface and the bandwidth-hungry application, and a way to enable fluent and intuitive interaction in embodiment systems. The main focus is laid upon inferring the hidden target goals of users while navigating in a remote environment as a basis for possible adaptations. Approach. To reason about possible user goals, a general user-agnostic Bayesian update rule is devised to be recursively applied upon the arrival of evidences, i.e. user input and user gaze. Experiments were conducted with healthy subjects within robotic embodiment settings to evaluate the proposed method. These experiments varied along three factors: the type of the robot/environment (simulated and physical), the type of the interface (keyboard or BCI), and the way goal recognition (GR) is used to guide a simple shared control (SC) driving scheme. Main results. Our results show that the proposed GR algorithm is able to track and infer the hidden user goals with relatively high precision and recall. Further, the realized SC driving scheme benefits from the output of the GR system and is able to reduce the user effort needed to accomplish the assigned tasks. Despite the fact that the BCI requires higher effort compared to the keyboard conditions, most subjects were able to complete the assigned tasks, and the proposed GR system is additionally shown able to handle the uncertainty in user input during SSVEP-based interaction. The SC application of the belief vector indicates that the benefits of the GR module are more pronounced for BCIs, compared to the keyboard interface. Significance. 
Being based on intuitive heuristics that model the behavior of the general population during the execution of navigation tasks, the proposed GR method can be used without prior tuning for the

  9. Goal-recognition-based adaptive brain-computer interface for navigating immersive robotic systems.

    Science.gov (United States)

    Abu-Alqumsan, Mohammad; Ebert, Felix; Peer, Angelika

    2017-06-01

    This work proposes principled strategies for self-adaptations in EEG-based Brain-computer interfaces (BCIs) as a way out of the bandwidth bottleneck resulting from the considerable mismatch between the low-bandwidth interface and the bandwidth-hungry application, and a way to enable fluent and intuitive interaction in embodiment systems. The main focus is laid upon inferring the hidden target goals of users while navigating in a remote environment as a basis for possible adaptations. To reason about possible user goals, a general user-agnostic Bayesian update rule is devised to be recursively applied upon the arrival of evidences, i.e. user input and user gaze. Experiments were conducted with healthy subjects within robotic embodiment settings to evaluate the proposed method. These experiments varied along three factors: the type of the robot/environment (simulated and physical), the type of the interface (keyboard or BCI), and the way goal recognition (GR) is used to guide a simple shared control (SC) driving scheme. Our results show that the proposed GR algorithm is able to track and infer the hidden user goals with relatively high precision and recall. Further, the realized SC driving scheme benefits from the output of the GR system and is able to reduce the user effort needed to accomplish the assigned tasks. Despite the fact that the BCI requires higher effort compared to the keyboard conditions, most subjects were able to complete the assigned tasks, and the proposed GR system is additionally shown able to handle the uncertainty in user input during SSVEP-based interaction. The SC application of the belief vector indicates that the benefits of the GR module are more pronounced for BCIs, compared to the keyboard interface. Being based on intuitive heuristics that model the behavior of the general population during the execution of navigation tasks, the proposed GR method can be used without prior tuning for the individual users. The proposed methods can be
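
The recursive Bayesian update over candidate goals that this record describes can be sketched in a few lines. The goal labels and likelihood values below are hypothetical; in the actual system the likelihoods would come from models of user input and gaze, as the abstract indicates.

```python
# One recursive Bayesian update over a discrete set of candidate goals.
# prior[g] = current belief P(goal = g); likelihood[g] = P(evidence | goal = g).
def bayes_update(prior, likelihood):
    unnorm = {g: prior[g] * likelihood[g] for g in prior}
    z = sum(unnorm.values())
    return {g: p / z for g, p in unnorm.items()}

# Hypothetical example: three navigation goals, with gaze evidence
# favoring the "door" goal. Applied recursively as evidence arrives.
belief = {"door": 1 / 3, "desk": 1 / 3, "window": 1 / 3}
belief = bayes_update(belief, {"door": 0.7, "desk": 0.2, "window": 0.1})
```

Each arriving piece of evidence (a BCI command, a gaze fixation) reweights the belief vector, which the shared-control scheme can then use to assist the most probable goal.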

  10. Selection of items for a computer-adaptive test to measure fatigue in patients with rheumatoid arthritis - A Delphi approach

    NARCIS (Netherlands)

    Nikolaus, Stephanie; Bode, Christina; Taal, Erik; van de Laar, Mart A F J

    2012-01-01

    Purpose Computer-adaptive tests (CATs) can measure precisely at individual level with few items selected from an item bank. Our aim was to select fatigue items to develop a CAT for rheumatoid arthritis (RA) and include expert opinions that are important for content validity of measurement

  11. The EORTC computer-adaptive tests measuring physical functioning and fatigue exhibited high levels of measurement precision and efficiency

    NARCIS (Netherlands)

    Petersen, M.A.; Aaronson, N.K.; Arraras, J.I.; Chie, W.C.; Conroy, T.; Costantini, A.; Giesinger, J.M.; Holzner, B.; King, M.T.; Singer, S.; Velikova, G.; de Leeuw, I.M.; Young, T.; Groenvold, M.

    2013-01-01

    Objectives: The European Organisation for Research and Treatment of Cancer (EORTC) Quality of Life Group is developing a computer-adaptive test (CAT) version of the EORTC Quality of Life Questionnaire (QLQ-C30). We evaluated the measurement properties of the CAT versions of physical functioning (PF)

  12. The EORTC computer-adaptive tests measuring physical functioning and fatigue exhibited high levels of measurement precision and efficiency

    NARCIS (Netherlands)

    Petersen, M.A.; Aaronson, N.K.; Arraras, J.I.; Chie, W.C.; Conroy, T.; Constantini, A.; Giesinger, J.M.; Holzner, B.; King, M.T.; Singer, S.; Velikova, G.; Verdonck-de Leeuw, I.M.; Young, T.; Groenvold, M.

    2013-01-01

    Objectives The European Organisation for Research and Treatment of Cancer (EORTC) Quality of Life Group is developing a computer-adaptive test (CAT) version of the EORTC Quality of Life Questionnaire (QLQ-C30). We evaluated the measurement properties of the CAT versions of physical functioning (PF)

  13. Content Range and Precision of a Computer Adaptive Test of Upper Extremity Function for Children with Cerebral Palsy

    Science.gov (United States)

    Montpetit, Kathleen; Haley, Stephen; Bilodeau, Nathalie; Ni, Pengsheng; Tian, Feng; Gorton, George, III; Mulcahey, M. J.

    2011-01-01

    This article reports on the content range and measurement precision of an upper extremity (UE) computer adaptive testing (CAT) platform of physical function in children with cerebral palsy. Upper extremity items representing skills of all abilities were administered to 305 parents. These responses were compared with two traditional standardized…

  14. The EORTC computer-adaptive tests measuring physical functioning and fatigue exhibited high levels of measurement precision and efficiency

    DEFF Research Database (Denmark)

    Petersen, Morten Aa; Aaronson, Neil K; Arraras, Juan I

    2013-01-01

    The European Organisation for Research and Treatment of Cancer (EORTC) Quality of Life Group is developing a computer-adaptive test (CAT) version of the EORTC Quality of Life Questionnaire (QLQ-C30). We evaluated the measurement properties of the CAT versions of physical functioning (PF) and fati...

  15. Usability of an adaptive computer assistant that improves self-care and health literacy of older adults

    NARCIS (Netherlands)

    Blanson Henkemans, O.A.; Rogers, W.A.; Fisk, A.D.; Neerincx, M.A.; Lindenberg, J.; Mast, C.A.P.G. van der

    2008-01-01

    Objectives: We developed an adaptive computer assistant for the supervision of diabetics' self-care, to support limiting illness and need for acute treatment, and improve health literacy. This assistant monitors self-care activities logged in the patient's electronic diary. Accordingly, it provides

  16. The Predictive Validity of a Computer-Adaptive Assessment of Kindergarten and First-Grade Reading Skills

    Science.gov (United States)

    Clemens, Nathan H.; Hagan-Burke, Shanna; Luo, Wen; Cerda, Carissa; Blakely, Alane; Frosch, Jennifer; Gamez-Patience, Brenda; Jones, Meredith

    2015-01-01

    This study examined the predictive validity of a computer-adaptive assessment for measuring kindergarten reading skills using the STAR Early Literacy (SEL) test. The findings showed that the results of SEL assessments administered during the fall, winter, and spring of kindergarten were moderate and statistically significant predictors of year-end…

  17. Using Artificial Intelligence to Control and Adapt Level of Difficulty in Computer Based, Cognitive Therapy – an Explorative Study

    DEFF Research Database (Denmark)

    Wilms, Inge Linda

    2011-01-01

    Prism Adaptation Therapy (PAT) is an intervention method in the treatment of the attention disorder neglect (Frassinetti, Angeli, Meneghello, Avanzi, & Ladavas, 2002; Rossetti, et al., 1998). The aim of this study was to investigate whether one session of PAT using a computer-attached touchscreen...

  18. Real-time computational camera system for high-sensitivity imaging by using combined long/short exposure

    Science.gov (United States)

    Sato, Satoshi; Okada, Yusuke; Azuma, Takeo

    2012-03-01

    In this study, we realize a high-resolution (4K-format), small-size (1.43 × 1.43 μm pixel pitch with a single imager), high-sensitivity (four times the sensitivity of conventional imagers) video camera system. The proposed system is a real-time computational camera system that combines long-exposure green pixels with short-exposure red/blue pixels. We demonstrate that the proposed camera system is effective even under low illumination.
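
As a rough illustration of the long/short combination idea (not the authors' actual pipeline), short-exposure channels can be scaled by the exposure ratio so they become radiometrically comparable to the long-exposure channel before further processing. The function name and scaling scheme below are assumptions for illustration only.

```python
import numpy as np

# Illustrative sketch: scale the short-exposure red/blue planes by the
# long/short exposure ratio to match the long-exposure green plane's
# radiometry, then stack into an RGB image. (Hypothetical scheme.)
def combine_exposures(green_long, red_short, blue_short, exposure_ratio):
    return np.stack(
        [red_short * exposure_ratio, green_long, blue_short * exposure_ratio],
        axis=-1,
    )
```

Real pipelines would additionally handle saturation in the long-exposure channel and motion between the two exposures.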

  19. Ultralow dose computed tomography attenuation correction for pediatric PET CT using adaptive statistical iterative reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Brady, Samuel L., E-mail: samuel.brady@stjude.org [Division of Diagnostic Imaging, St. Jude Children’s Research Hospital, Memphis, Tennessee 38105 (United States); Shulkin, Barry L. [Nuclear Medicine and Department of Radiological Sciences, St. Jude Children’s Research Hospital, Memphis, Tennessee 38105 (United States)

    2015-02-15

    Purpose: To develop ultralow dose computed tomography (CT) attenuation correction (CTAC) acquisition protocols for pediatric positron emission tomography CT (PET CT). Methods: A GE Discovery 690 PET CT hybrid scanner was used to investigate the change to quantitative PET and CT measurements when operated at ultralow doses (10–35 mA s). CT quantitation: noise, low-contrast resolution, and CT numbers for 11 tissue substitutes were analyzed in-phantom. CT quantitation was analyzed to a reduction of 90% volume computed tomography dose index (0.39/3.64; mGy) from baseline. To minimize noise infiltration, 100% adaptive statistical iterative reconstruction (ASiR) was used for CT reconstruction. PET images were reconstructed with the lower-dose CTAC iterations and analyzed for: maximum body weight standardized uptake value (SUV{sub bw}) of various diameter targets (range 8–37 mm), background uniformity, and spatial resolution. Radiation dose and CTAC noise magnitude were compared for 140 patient examinations (76 post-ASiR implementation) to determine relative dose reduction and noise control. Results: CT numbers were constant to within 10% from the nondose reduced CTAC image for 90% dose reduction. No change in SUV{sub bw}, background percent uniformity, or spatial resolution for PET images reconstructed with CTAC protocols was found down to 90% dose reduction. Patient population effective dose analysis demonstrated relative CTAC dose reductions between 62% and 86% (3.2/8.3–0.9/6.2). Noise magnitude in dose-reduced patient images increased but was not statistically different from predose-reduced patient images. Conclusions: Using ASiR allowed for aggressive reduction in CT dose with no change in PET reconstructed images while maintaining sufficient image quality for colocalization of hybrid CT anatomy and PET radioisotope uptake.

  20. Influence of Adaptive Statistical Iterative Reconstruction on coronary plaque analysis in coronary computed tomography angiography.

    Science.gov (United States)

    Precht, Helle; Kitslaar, Pieter H; Broersen, Alexander; Dijkstra, Jouke; Gerke, Oke; Thygesen, Jesper; Egstrup, Kenneth; Lambrechtsen, Jess

    The purpose of this study was to evaluate the effect of iterative reconstruction (IR) software on quantitative plaque measurements in coronary computed tomography angiography (CCTA). Thirty patients with three clinical risk factors for coronary artery disease (CAD) had one CCTA performed. Images were reconstructed using FBP, 30% and 60% adaptive statistical IR (ASIR). Coronary plaque analysis was performed as per patient and per vessel (LM, LAD, CX and RCA) measurements. Lumen and vessel volumes and plaque burden measurements were based on automatically detected contours in each reconstruction. Lumen and plaque intensity measurements and HU based plaque characterization were based on corrected contours copied to each reconstruction. No significant changes between FBP and 30% ASIR were found except for lumen- (-2.53 HU) and plaque intensities (-1.28 HU). Between FBP and 60% ASIR the change in total volume showed an increase of 0.94%, 4.36% and 2.01% for lumen, plaque and vessel, respectively. The change in total plaque burden between FBP and 60% ASIR was 0.76%. Lumen and plaque intensities decreased between FBP and 60% ASIR with -9.90 HU and -1.97 HU, respectively. The total plaque component volume changes were all small with a maximum change of -1.13% of necrotic core between FBP and 60% ASIR. Quantitative plaque measurements only showed modest differences between FBP and the 60% ASIR level. Differences were increased lumen-, vessel- and plaque volumes, decreased lumen- and plaque intensities and a small percentage change in the individual plaque component volumes. Copyright © 2016 Society of Cardiovascular Computed Tomography. Published by Elsevier Inc. All rights reserved.

  1. Adaptive Laplacian filtering for sensorimotor rhythm-based brain-computer interfaces

    Science.gov (United States)

    Lu, Jun; McFarland, Dennis J.; Wolpaw, Jonathan R.

    2013-02-01

    Objective. Sensorimotor rhythms (SMRs) are 8-30 Hz oscillations in the electroencephalogram (EEG) recorded from the scalp over sensorimotor cortex that change with movement and/or movement imagery. Many brain-computer interface (BCI) studies have shown that people can learn to control SMR amplitudes and can use that control to move cursors and other objects in one, two or three dimensions. At the same time, if SMR-based BCIs are to be useful for people with neuromuscular disabilities, their accuracy and reliability must be improved substantially. These BCIs often use spatial filtering methods such as common average reference (CAR), Laplacian (LAP) filter or common spatial pattern (CSP) filter to enhance the signal-to-noise ratio of EEG. Here, we test the hypothesis that a new filter design, called an ‘adaptive Laplacian (ALAP) filter’, can provide better performance for SMR-based BCIs. Approach. An ALAP filter employs a Gaussian kernel to construct a smooth spatial gradient of channel weights and then simultaneously seeks the optimal kernel radius of this spatial filter and the regularization parameter of linear ridge regression. This optimization is based on minimizing the leave-one-out cross-validation error through a gradient descent method and is computationally feasible. Main results. Using a variety of kinds of BCI data from a total of 22 individuals, we compare the performances of ALAP filter to CAR, small LAP, large LAP and CSP filters. With a large number of channels and limited data, ALAP performs significantly better than CSP, CAR, small LAP and large LAP both in classification accuracy and in mean-squared error. Using fewer channels restricted to motor areas, ALAP is still superior to CAR, small LAP and large LAP, but equally matched to CSP. Significance. Thus, ALAP may help to improve the accuracy and robustness of SMR-based BCIs.
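
The inner loop of the ALAP optimization, minimizing the leave-one-out cross-validation (LOOCV) error of ridge regression, has an exact closed form for linear smoothers that avoids refitting n times. The sketch below shows only that identity; the Gaussian-kernel spatial filter and the joint search over kernel radius and regularization parameter are not included.

```python
import numpy as np

# Exact LOOCV mean-squared error for ridge regression (fixed lambda, no
# intercept), using the linear-smoother identity: the leave-one-out
# residual equals e_i / (1 - H_ii), where H = X (X'X + lam I)^-1 X'.
def ridge_loocv_mse(X, y, lam):
    n, d = X.shape
    G = np.linalg.inv(X.T @ X + lam * np.eye(d))
    H = X @ G @ X.T                      # hat (smoother) matrix
    resid = y - H @ y                    # in-sample residuals
    loo = resid / (1.0 - np.diag(H))     # leave-one-out residuals
    return float(np.mean(loo ** 2))
```

Because the shortcut is exact, it matches a brute-force loop that refits the ridge model n times with one sample held out each time, which is what makes gradient-based tuning of the hyperparameters computationally feasible.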

  2. Translation, Validation, and Reliability of the Dutch Late-Life Function and Disability Instrument Computer Adaptive Test.

    Science.gov (United States)

    Arensman, Remco M; Pisters, Martijn F; de Man-van Ginkel, Janneke M; Schuurmans, Marieke J; Jette, Alan M; de Bie, Rob A

    2016-09-01

    Adequate and user-friendly instruments for assessing physical function and disability in older adults are vital for estimating and predicting health care needs in clinical practice. The Late-Life Function and Disability Instrument Computer Adaptive Test (LLFDI-CAT) is a promising instrument for assessing physical function and disability in gerontology research and clinical practice. The aims of this study were: (1) to translate the LLFDI-CAT to the Dutch language and (2) to investigate its validity and reliability in a sample of older adults who spoke Dutch and dwelled in the community. For the assessment of validity of the LLFDI-CAT, a cross-sectional design was used. To assess reliability, measurement of the LLFDI-CAT was repeated in the same sample. The item bank of the LLFDI-CAT was translated with a forward-backward procedure. A sample of 54 older adults completed the LLFDI-CAT, World Health Organization Disability Assessment Schedule 2.0, RAND 36-Item Short-Form Health Survey physical functioning scale (10 items), and 10-Meter Walk Test. The LLFDI-CAT was repeated in 2 to 8 days (mean=4.5 days). Pearson's r and the intraclass correlation coefficient (ICC) (2,1) were calculated to assess validity, group-level reliability, and participant-level reliability. A correlation of .74 for the LLFDI-CAT function scale and the RAND 36-Item Short-Form Health Survey physical functioning scale (10 items) was found. The correlations of the LLFDI-CAT disability scale with the World Health Organization Disability Assessment Schedule 2.0 and the 10-Meter Walk Test were -.57 and -.53, respectively. The ICC (2,1) of the LLFDI-CAT function scale was .84, with a group-level reliability score of .85. The ICC (2,1) of the LLFDI-CAT disability scale was .76, with a group-level reliability score of .81. The high percentage of women in the study and the exclusion of older adults with recent joint replacement or hospitalization limit the generalizability of the results. 
The Dutch LLFDI
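
The ICC(2,1) reported in this record (two-way random effects, single measurement, absolute agreement) can be computed from the repeated-measures ANOVA mean squares. A minimal sketch, with Y holding one row per subject and one column per test session:

```python
import numpy as np

# ICC(2,1): two-way random effects, single measurement, absolute agreement.
# Y is an (n subjects) x (k sessions) array of scores.
def icc_2_1(Y):
    n, k = Y.shape
    grand = Y.mean()
    row_m = Y.mean(axis=1)               # per-subject means
    col_m = Y.mean(axis=0)               # per-session means
    ss_total = ((Y - grand) ** 2).sum()
    ss_rows = k * ((row_m - grand) ** 2).sum()
    ss_cols = n * ((col_m - grand) ** 2).sum()
    ss_err = ss_total - ss_rows - ss_cols
    ms_r = ss_rows / (n - 1)             # between-subjects mean square
    ms_c = ss_cols / (k - 1)             # between-sessions mean square
    ms_e = ss_err / ((n - 1) * (k - 1))  # residual mean square
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)
```

A systematic shift between the two sessions lowers ICC(2,1) even when subjects keep their rank order, which is why this absolute-agreement form is the appropriate choice for test-retest reliability.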

  3. Computation of 2D vibroacoustic wave's dispersion for optimizing acoustic power flow in interaction with adaptive metacomposites

    Science.gov (United States)

    Collet, M.; Ouisse, M.; Ichchou, M.; Ohayon, R.

    2013-04-01

    This paper presents an integrated methodology for optimizing the vibroacoustic energy flow in the interaction between an adaptive metacomposite, made of periodically distributed shunted piezoelectric material glued onto a passive plate, and an open acoustic domain. The computation of interacting Floquet-Bloch propagators is also used to optimize the vibroacoustic behavior. The main purpose of this work is first to propose a numerical methodology to compute the fluid-structure multi-modal wave dispersions. In a second step, optimization of the electric circuit is used to control the acoustic power flow. Standard 3D computation is used to confirm the efficiency of the designed metacomposite in terms of acoustic emissivity and absorption.

  4. Computer simulation of creep damage at crack tip in short fibre composites

    Science.gov (United States)

    Shuangyin, Zhang; Tsai, L. W.

    1994-08-01

    Creep damage at crack tip in short fibre composites has been simulated by using the finite element method (FEM). The well-known Schapery non-linear viscoelastic constitutive relationship was used to characterize time-dependent behaviour of the material. A modified recurrence equation was adopted to accelerate the iteration. Kachanov-Rabotnov's damage evolution law was employed. The growth of the damage zone with time around the crack tip was calculated and the results were shown with the so-called “digit photo”, which was produced by the printer.

  5. Influence of adaptive statistical iterative reconstruction algorithm on image quality in coronary computed tomography angiography.

    Science.gov (United States)

    Precht, Helle; Thygesen, Jesper; Gerke, Oke; Egstrup, Kenneth; Waaler, Dag; Lambrechtsen, Jess

    2016-12-01

    Coronary computed tomography angiography (CCTA) requires high spatial and temporal resolution, increased low contrast resolution for the assessment of coronary artery stenosis, plaque detection, and/or non-coronary pathology. Therefore, new reconstruction algorithms, particularly iterative reconstruction (IR) techniques, have been developed in an attempt to improve image quality with no cost in radiation exposure. To evaluate whether adaptive statistical iterative reconstruction (ASIR) enhances perceived image quality in CCTA compared to filtered back projection (FBP). Thirty patients underwent CCTA due to suspected coronary artery disease. Images were reconstructed using FBP, 30% ASIR, and 60% ASIR. Ninety image sets were evaluated by five observers using the subjective visual grading analysis (VGA) and assessed by proportional odds modeling. Objective quality assessment (contrast, noise, and the contrast-to-noise ratio [CNR]) was analyzed with linear mixed effects modeling on log-transformed data. The need for ethical approval was waived by the local ethics committee as the study only involved anonymously collected clinical data. VGA showed significant improvements in sharpness by comparing FBP with ASIR, resulting in odds ratios of 1.54 for 30% ASIR and 1.89 for 60% ASIR (P = 0.004). The objective measures showed significant differences between FBP and 60% ASIR (P ASIR improved the subjective image quality of parameter sharpness and, objectively, reduced noise and increased CNR.

  6. Computational design of genomic transcriptional networks with adaptation to varying environments

    Science.gov (United States)

    Carrera, Javier; Elena, Santiago F.; Jaramillo, Alfonso

    2012-01-01

    Transcriptional profiling has been widely used as a tool for unveiling the coregulations of genes in response to genetic and environmental perturbations. These coregulations have been used, in a few instances, to infer global transcriptional regulatory models. Here, using the large amount of transcriptomic information available for the bacterium Escherichia coli, we seek to understand the design principles determining the regulation of its transcriptome. Combining transcriptomic and signaling data, we develop an evolutionary computational procedure that allows obtaining alternative genomic transcriptional regulatory networks (GTRNs) that still maintain adaptability to dynamic environments. We apply our methodology to an E. coli GTRN and show that it could be rewired to simpler transcriptional regulatory structures. These rewired GTRNs still maintain the global physiological response to fluctuating environments. Rewired GTRNs contain 73% fewer regulated operons. Genes with similar functions and coordinated patterns of expression across environments are clustered into longer regulated operons. These synthetic GTRNs are more sensitive and show a more robust response to challenging environments. This result illustrates that the natural configuration of the E. coli GTRN does not necessarily result from selection for robustness to environmental perturbations, but that evolutionary contingencies may have been important as well. We also discuss the limitations of our methodology in the context of the demand theory. Our procedure will be useful as a novel way to analyze global transcription regulation networks and in synthetic biology for the de novo design of genomes. PMID:22927389

  7. Utilizing Multidimensional Computer Adaptive Testing to Mitigate Burden With Patient Reported Outcomes.

    Science.gov (United States)

    Bass, Michael; Morris, Scott; Neapolitan, Richard

    Utilization of patient-reported outcome measures (PROs) had been limited by the lack of psychometrically sound measures scored in real-time. The Patient Reported Outcomes Measurement Information System (PROMIS) initiative developed a broad array of high-quality PRO measures. Towards reducing the number of items administered in measuring PROs, PROMIS employs Item Response Theory (IRT) and Computer Adaptive Testing (CAT). By only administering questions targeted to the subject's trait level, CAT has cut testing times in half(1). The IRT/CAT implementation in PROMIS is unidimensional in that there is a separate set of questions administered for each measured trait. However, there are often correlations among traits. Multidimensional IRT (MIRT) and multidimensional CAT (MCAT) provide items concerning several correlated traits, and should ameliorate patient burden. We developed an MIRT model using existing PROMIS item banks for depression and anxiety, developed MCAT software, and compared the efficiency of the MCAT approach to the unidimensional approach. Note: Research reported in this publication was supported in part by the National Library of Medicine of the National Institutes of Health under Award Number R01LM011962.
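
The core of one adaptive step in a unidimensional CAT, picking the unadministered item that is most informative at the current trait estimate, can be sketched under a two-parameter logistic (2PL) IRT model. The item parameters below are hypothetical, and a multidimensional CAT as described in this record generalizes the same idea to a vector-valued trait.

```python
import math

# 2PL item response model: a = discrimination, b = difficulty.
def p_2pl(theta, a, b):
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# Fisher information of a 2PL item at trait level theta: a^2 * p * (1 - p).
def fisher_info(theta, a, b):
    p = p_2pl(theta, a, b)
    return a * a * p * (1.0 - p)

# Select the unadministered item with maximum information at theta.
def next_item(theta, bank, asked):
    candidates = [i for i in range(len(bank)) if i not in asked]
    return max(candidates, key=lambda i: fisher_info(theta, *bank[i]))
```

With a hypothetical bank of three items at difficulties -2, 0, and 2, a respondent currently estimated at theta = 0 is given the middle item, which is exactly the targeting behavior that lets CATs halve test length.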

  8. Verification of Compressible and Incompressible Computational Fluid Dynamics Codes and Residual-based Mesh Adaptation

    Science.gov (United States)

    Choudhary, Aniruddha

    Code verification is the process of ensuring, to the degree possible, that there are no algorithm deficiencies and coding mistakes (bugs) in a scientific computing simulation. In this work, techniques are presented for performing code verification of boundary conditions commonly used in compressible and incompressible Computational Fluid Dynamics (CFD) codes. Using a compressible CFD code, this study assesses the subsonic inflow (isentropic and fixed-mass), subsonic outflow, supersonic outflow, no-slip wall (adiabatic and isothermal), and inviscid slip-wall boundary conditions. The use of simplified curved surfaces is proposed for easier generation of manufactured solutions during the verification of certain boundary conditions involving many constraints. To perform rigorous code verification, general grids with mixed cell types at the verified boundary are used. A novel approach is introduced to determine manufactured solutions for boundary condition verification when the velocity field is constrained to be divergence-free during the simulation in an incompressible CFD code. Order-of-accuracy testing using the Method of Manufactured Solutions (MMS) is employed here for code verification of the major components of an open-source, multiphase flow code, MFIX. The presence of two-phase governing equations and a modified SIMPLE-based algorithm requiring divergence-free flows makes the selection of manufactured solutions more involved than for single-phase, compressible flows. Code verification is performed here on 2D and 3D, uniform and stretched meshes for incompressible, steady and unsteady, single-phase and two-phase flows using the two-fluid model of MFIX. In a CFD simulation, truncation error (TE) is the difference between the continuous governing equation and its discrete approximation. Since TE can be shown to be the local source term for the discretization error, TE is proposed as the criterion for determining which regions of the computational mesh should be refined/coarsened. For mesh
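
    The order-of-accuracy test at the heart of MMS verification reduces to comparing discretization errors on two systematically refined grids: with refinement ratio r, the observed order is p = ln(E_coarse/E_fine)/ln r, which should approach the scheme's formal order. A minimal self-contained illustration, with a central-difference derivative standing in for a full CFD operator (the function and step sizes are illustrative, not from the thesis):

```python
import math

def observed_order(e_coarse, e_fine, r):
    """Observed order of accuracy from errors on two grids, refinement ratio r."""
    return math.log(e_coarse / e_fine) / math.log(r)

# Manufactured "exact" solution u(x) = sin(x); a central difference for u'(x)
# stands in for a full discrete operator. Its formal order is 2.
def central_diff_error(h, x=1.0):
    approx = (math.sin(x + h) - math.sin(x - h)) / (2.0 * h)
    return abs(approx - math.cos(x))

p = observed_order(central_diff_error(0.1), central_diff_error(0.05), 2.0)
print(f"observed order: {p:.3f}")   # approaches the formal order 2
```

    A bug in a scheme or a boundary condition typically shows up as an observed order stalling below the formal order as the mesh is refined, which is what makes this a verification test.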

  9. Translation and cross-cultural adaptation of the Brazilian Portuguese version of the driving anger scale (DAS): long form and short form.

    Science.gov (United States)

    Cantini, Jessye Almeida; Santos, George Oliveira; Machado, Eduardo de Carvalho; Nardi, Antonio Egídio; Silva, Adriana Cardoso

    2015-01-01

    Driving anger has attracted the attention of researchers in recent years because it may induce individuals to drive aggressively or adopt risk behaviors. The Driving Anger Scale (DAS) was designed to evaluate the propensity of drivers to become angry or aggressive while driving. This study describes the cross-cultural adaptation of a Brazilian version of the short form and the long form of the DAS. Translation and adaptation were made in four steps: two translations and two back-translations carried out by independent evaluators; the development of a brief version by four bilingual experts in mental health and driving behaviors; a subsequent experimental application; and, finally, an investigation of operational equivalence. Final Brazilian versions of the short form and of the long form of the DAS were made and are presented. This important instrument, which assesses driving anger and aggressive behaviors, is now available to evaluate the driving behaviors of the Brazilian population, which facilitates research in this field.

  10. Introducing Enabling Computational Tools to the Climate Sciences: Multi-Resolution Climate Modeling with Adaptive Cubed-Sphere Grids

    Energy Technology Data Exchange (ETDEWEB)

    Jablonowski, Christiane [Univ. of Michigan, Ann Arbor, MI (United States)

    2015-07-14

    The research investigates and advances strategies for bridging the scale discrepancies between local, regional, and global phenomena in climate models without the prohibitive computational costs of global cloud-resolving simulations. In particular, the research explores new frontiers in computational geoscience by introducing high-order Adaptive Mesh Refinement (AMR) techniques into climate research. AMR and statically adapted variable-resolution approaches represent an emerging trend for atmospheric models and are likely to become the new norm in future-generation weather and climate models. The research advances the understanding of multi-scale interactions in the climate system and showcases a pathway for modeling these interactions effectively with advanced computational tools, like the Chombo AMR library developed at the Lawrence Berkeley National Laboratory. The research is interdisciplinary and combines applied mathematics, scientific computing, and the atmospheric sciences. In this research project, a hierarchy of high-order atmospheric models on cubed-sphere computational grids has been developed that serves as an algorithmic prototype for the finite-volume solution-adaptive Chombo-AMR approach. The investigations have focused on the characteristics of both static mesh adaptations and dynamically adaptive grids that can capture flow fields of interest, like tropical cyclones. Six research themes have been chosen: (1) the introduction of adaptive mesh refinement techniques into the climate sciences; (2) advanced algorithms for nonhydrostatic atmospheric dynamical cores; (3) an assessment of the interplay between resolved-scale dynamical motions and subgrid-scale physical parameterizations; (4) evaluation techniques for atmospheric model hierarchies; (5) the comparison of AMR refinement strategies; and (6) tropical cyclone studies with a focus on multi-scale interactions and variable-resolution modeling. The results of this research project

  11. A computational method for selecting short peptide sequences for inorganic material binding.

    Science.gov (United States)

    Nayebi, Niloofar; Cetinel, Sibel; Omar, Sara Ibrahim; Tuszynski, Jack A; Montemagno, Carlo

    2017-11-01

    Discovering or designing biofunctionalized materials with improved quality depends strongly on the ability to manipulate and control the peptide-inorganic interaction. Various peptides can be used as assemblers, synthesizers, and linkers in material syntheses. In another context, specific and selective material-binding peptides can be used as recognition blocks in mining applications. In this study, we propose a new in silico method to select short 4-mer peptides with high affinity and selectivity for a given target material. The method is illustrated with the calcite (104) surface as an example, which has been experimentally validated. A calcite-binding peptide can play an important role in our understanding of biomineralization; a practical motivation is the need to selectively depress calcite at mining sites. © 2017 Wiley Periodicals, Inc.

  12. A Short Review of FDTD-Based Methods for Uncertainty Quantification in Computational Electromagnetics

    Directory of Open Access Journals (Sweden)

    Theodoros T. Zygiridis

    2017-01-01

    We provide a review of selected computational methodologies that are based on the deterministic finite-difference time-domain algorithm and are suitable for the investigation of electromagnetic problems involving uncertainties. As will become apparent, several alternatives capable of performing uncertainty quantification in a variety of cases exist, each one exhibiting different qualities and ranges of applicability, which we intend to point out here. Given the numerous available approaches, the purpose of this paper is to clarify the main strengths and weaknesses of the described methodologies and to help potential readers safely select the most suitable approach for the problem under consideration.
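
    The baseline against which such methods are usually compared is plain Monte Carlo uncertainty quantification: draw random samples of the uncertain inputs, run the deterministic solver on each, and read statistics off the outputs. The sketch below stands a closed-form LC resonant frequency in for an FDTD run; the nominal values and the 5% Gaussian spreads are hypothetical, chosen only to make the propagation visible.

```python
import math, random, statistics

def resonant_freq(L, C):
    """Deterministic model standing in for an FDTD solve: f = 1/(2*pi*sqrt(LC))."""
    return 1.0 / (2.0 * math.pi * math.sqrt(L * C))

random.seed(42)
L0, C0 = 1e-6, 1e-9          # hypothetical nominal values: 1 uH, 1 nF
samples = [resonant_freq(random.gauss(L0, 0.05 * L0),
                         random.gauss(C0, 0.05 * C0))
           for _ in range(20000)]

mean_f = statistics.fmean(samples)
std_f = statistics.pstdev(samples)
print(f"mean = {mean_f:.3e} Hz, relative std = {std_f / mean_f:.3f}")
```

    The slow 1/sqrt(N) convergence of this estimator, multiplied by the cost of one FDTD run per sample, is precisely what motivates the accelerated alternatives the review surveys.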

  13. Short-term adaptations in spinal cord circuits evoked by repetitive transcranial magnetic stimulation: possible underlying mechanisms

    DEFF Research Database (Denmark)

    Perez, Monica A.; Lungholt, Bjarke K.S.; Nielsen, Jens Bo

    2005-01-01

    Repetitive transcranial magnetic stimulation (rTMS) has been shown to induce adaptations in cortical neuronal circuitries. In the present study we investigated whether rTMS, through its effect on corticospinal pathways, also produces adaptations at the spinal level, and what the neuronal mechanis...

  14. Short-term outcome of 1,465 computer-navigated primary total knee replacements 2005-2008.

    Science.gov (United States)

    Gøthesen, Oystein; Espehaug, Birgitte; Havelin, Leif; Petursson, Gunnar; Furnes, Ove

    2011-06-01

    Background and purpose: Improvement of positioning and alignment by the use of computer-assisted surgery (CAS) might improve longevity and function in total knee replacements, but there is little evidence. In this study, we evaluated the short-term results of computer-navigated knee replacements based on data from the Norwegian Arthroplasty Register. Primary total knee replacements without patella resurfacing, reported to the Norwegian Arthroplasty Register during the years 2005-2008, were evaluated. The 5 most common implants and the 3 most common navigation systems were selected. Cemented, uncemented, and hybrid knees were included. With the risk of revision for any cause as the primary endpoint and intraoperative complications and operating time as secondary outcomes, 1,465 computer-navigated knee replacements (CAS) and 8,214 conventionally operated knee replacements (CON) were compared. Kaplan-Meier survival analysis and Cox regression analysis with adjustment for age, sex, prosthesis brand, fixation method, previous knee surgery, preoperative diagnosis, and ASA category were used. Kaplan-Meier estimated survival at 2 years was 98% (95% CI: 97.5-98.3) in the CON group and 96% (95% CI: 95.0-97.8) in the CAS group. The adjusted Cox regression analysis showed a higher risk of revision in the CAS group (RR = 1.7, 95% CI: 1.1-2.5; p = 0.02). The LCS Complete knee had a higher risk of revision with CAS than with CON (RR = 2.1, 95% CI: 1.3-3.4; p = 0.004). The differences were not statistically significant for the other prosthesis brands. Mean operating time was 15 min longer in the CAS group. With the introduction of computer-navigated knee replacement surgery in Norway, the short-term risk of revision has increased for computer-navigated replacement with the LCS Complete. The mechanisms of failure of these implantations should be explored in greater depth, and in this study we have not been able to draw conclusions regarding causation.
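
    The Kaplan-Meier survival estimate behind figures like the 2-year 98% above multiplies (1 - d_i/n_i) over the observed event times, where d_i implants are revised out of n_i still at risk. A self-contained sketch on a small, made-up follow-up dataset (not the registry data):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimates.

    times  -- follow-up time for each implant (years)
    events -- 1 if revised at that time, 0 if censored (still in place)
    Returns (time, survival) pairs at each revision time.
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    survival, curve, i = 1.0, [], 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(e for tt, e in data if tt == t)
        count = sum(1 for tt, _ in data if tt == t)
        if deaths:
            survival *= 1.0 - deaths / n_at_risk
            curve.append((t, survival))
        n_at_risk -= count
        i += count
    return curve

# Toy follow-up data (years to revision; 0 = censored at last follow-up).
times  = [0.5, 1.0, 1.0, 1.5, 2.0, 2.0, 2.0, 2.0]
events = [1,   1,   0,   1,   0,   0,   0,   0]
print(kaplan_meier(times, events))
```

    The Cox regression used in the study additionally adjusts such survival comparisons for covariates (age, sex, prosthesis brand, and so on), which a raw Kaplan-Meier curve cannot do.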

  15. Validation of a computer-adaptive test to evaluate generic health-related quality of life

    Directory of Open Access Journals (Sweden)

    Zardaín Pilar C

    2010-12-01

    Background: Health-Related Quality of Life (HRQoL) is a relevant variable in the evaluation of health outcomes. Questionnaires based on Classical Test Theory typically require a large number of items to evaluate HRQoL. Computer Adaptive Testing (CAT) can be used to reduce test length while maintaining and, in some cases, improving accuracy. This study aimed at validating a CAT based on Item Response Theory (IRT) for evaluation of generic HRQoL: the CAT-Health instrument. Methods: Cross-sectional study of subjects aged over 18 attending Primary Care Centres for any reason. CAT-Health was administered along with the SF-12 Health Survey. Age, gender, and a checklist of chronic conditions were also collected. CAT-Health was evaluated considering: (1) feasibility: completion time and test length; (2) content range coverage, Item Exposure Rate (IER), and test precision; and (3) construct validity: differences in the CAT-Health scores according to clinical variables and correlations between both questionnaires. Results: 396 subjects answered CAT-Health and SF-12, 67.2% females, mean age (SD) 48.6 (17.7) years. 36.9% did not report any chronic condition. Median completion time for CAT-Health was 81 seconds (IQ range = 59-118) and it increased with age (p … Conclusions: Although domain-specific CATs exist for various areas of HRQoL, CAT-Health is one of the first IRT-based CATs designed to evaluate generic HRQoL, and it has proven feasible, valid and efficient when administered to a broad sample of individuals attending primary care settings.

  16. Influence of adaptive statistical iterative reconstruction algorithm on image quality in coronary computed tomography angiography

    Directory of Open Access Journals (Sweden)

    Helle Precht

    2016-12-01

    Background: Coronary computed tomography angiography (CCTA) requires high spatial and temporal resolution, and increased low-contrast resolution, for the assessment of coronary artery stenosis, plaque detection, and/or non-coronary pathology. Therefore, new reconstruction algorithms, particularly iterative reconstruction (IR) techniques, have been developed in an attempt to improve image quality at no cost in radiation exposure. Purpose: To evaluate whether adaptive statistical iterative reconstruction (ASIR) enhances perceived image quality in CCTA compared to filtered back projection (FBP). Material and Methods: Thirty patients underwent CCTA due to suspected coronary artery disease. Images were reconstructed using FBP, 30% ASIR, and 60% ASIR. Ninety image sets were evaluated by five observers using subjective visual grading analysis (VGA) and assessed by proportional odds modeling. Objective quality assessment (contrast, noise, and the contrast-to-noise ratio [CNR]) was analyzed with linear mixed-effects modeling on log-transformed data. The need for ethical approval was waived by the local ethics committee as the study only involved anonymously collected clinical data. Results: VGA showed significant improvements in sharpness when comparing FBP with ASIR, resulting in odds ratios of 1.54 for 30% ASIR and 1.89 for 60% ASIR (P = 0.004). The objective measures showed significant differences between FBP and 60% ASIR (P < 0.0001) for noise, with an estimated ratio of 0.82, and for CNR, with an estimated ratio of 1.26. Conclusion: ASIR improved the subjective image quality parameter sharpness and, objectively, reduced noise and increased CNR.
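
    The objective CNR endpoint above is a simple ratio: the absolute difference in mean attenuation between a signal region and a background region, divided by the background noise (its standard deviation). A minimal sketch with hypothetical Hounsfield-unit samples (illustrative values, not the study's measurements):

```python
import statistics

def contrast_noise_cnr(roi_signal, roi_background):
    """Contrast, noise, and CNR from two regions of interest (ROIs)."""
    contrast = abs(statistics.fmean(roi_signal) -
                   statistics.fmean(roi_background))
    noise = statistics.pstdev(roi_background)
    return contrast, noise, contrast / noise

# Hypothetical attenuation samples (HU): contrast-filled vessel vs. tissue.
vessel = [310, 295, 305, 300, 290]
tissue = [55, 45, 50, 60, 40]
contrast, noise, cnr = contrast_noise_cnr(vessel, tissue)
print(f"contrast = {contrast}, noise = {noise:.1f}, CNR = {cnr:.1f}")
```

    This makes the reported ratios concrete: cutting noise by a factor of 0.82 at constant contrast raises CNR by about 1/0.82, close to the 1.26 estimated for 60% ASIR.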

  17. Development of a computer-adaptive physical function instrument for Social Security Administration disability determination.

    Science.gov (United States)

    Ni, Pengsheng; McDonough, Christine M; Jette, Alan M; Bogusz, Kara; Marfeo, Elizabeth E; Rasch, Elizabeth K; Brandt, Diane E; Meterko, Mark; Haley, Stephen M; Chan, Leighton

    2013-09-01

    Objective: To develop and test an instrument to assess physical function for Social Security Administration (SSA) disability programs, the SSA-Physical Function (SSA-PF) instrument. Item response theory (IRT) analyses were used to (1) create a calibrated item bank for each of the factors identified in prior factor analyses, (2) assess the fit of the items within each scale, (3) develop separate computer-adaptive testing (CAT) instruments for each scale, and (4) conduct initial psychometric testing. Design: Cross-sectional data collection; IRT analyses; CAT simulation. Setting: Telephone and Internet survey. Participants: Two samples: SSA claimants (n=1017) and adults from the U.S. general population (n=999). Interventions: None. Main Outcome Measures: Model fit statistics, correlation, and reliability coefficients. Results: IRT analyses resulted in 5 unidimensional SSA-PF scales: Changing & Maintaining Body Position, Whole Body Mobility, Upper Body Function, Upper Extremity Fine Motor, and Wheelchair Mobility, for a total of 102 items. High CAT accuracy was demonstrated by strong correlations between simulated CAT scores and those from the full item banks. On comparing the simulated CATs with the full item banks, very little loss of reliability or precision was noted, except at the lower and upper ranges of each scale. No difference in response patterns by age or sex was noted. The distributions of claimant scores were shifted to the lower end of each scale compared with those of a sample of U.S. adults. Conclusions: The SSA-PF instrument contributes important new methodology for measuring the physical function of adults applying to the SSA disability programs. Initial evaluation revealed that the SSA-PF instrument achieved considerable breadth of coverage in each content domain and demonstrated noteworthy psychometric properties. Copyright © 2013 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  18. Short-Term Forecasting Models for Photovoltaic Plants: Analytical versus Soft-Computing Techniques

    Directory of Open Access Journals (Sweden)

    Claudio Monteiro

    2013-01-01

    We present and compare two short-term statistical forecasting models for hourly average electric power production of photovoltaic (PV) plants: the analytical PV power forecasting model (APVF) and the multilayer perceptron PV forecasting model (MPVF). Both models use forecasts from numerical weather prediction (NWP) tools at the location of the PV plant, as well as past recorded values of PV hourly electric power production. The APVF model consists of an original modeling for adjusting clear-sky irradiation data by an irradiation attenuation index, combined with a PV power production attenuation index. The MPVF model consists of an artificial neural network based model, selected among a large set of ANNs optimized with genetic algorithms (GAs). The two models use forecasts from the same NWP tool as inputs. The APVF and MPVF models have been applied to a real-life case study of a grid-connected PV plant using the same data. Despite the fact that both models are quite different, they achieve very similar results, with forecast horizons covering all the daylight hours of the following day, which gives a good perspective of their applicability for PV electric production sale bids to electricity markets.
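
    The core of the APVF idea, as described, can be reduced to scaling the plant's clear-sky production by an attenuation index derived from NWP irradiance. The sketch below is a deliberate simplification (the real model combines separate irradiation and power attenuation indices, and all numbers here are hypothetical):

```python
def attenuation_index(forecast_irr, clear_sky_irr):
    """NWP-forecast irradiance relative to the clear-sky value for that hour."""
    return forecast_irr / clear_sky_irr if clear_sky_irr > 0 else 0.0

def forecast_power(clear_sky_power_kw, forecast_irr, clear_sky_irr):
    """Scale the plant's clear-sky production by the attenuation index."""
    return clear_sky_power_kw * attenuation_index(forecast_irr, clear_sky_irr)

# Hypothetical hour: 800 kW clear-sky output; NWP predicts 600 of a possible
# 1000 W/m^2 of irradiance -> attenuation index 0.6 -> 480 kW forecast.
print(forecast_power(800.0, 600.0, 1000.0))   # -> 480.0
```

    The MPVF alternative replaces this analytical scaling with a trained neural network mapping the same NWP inputs and past production to the hourly forecast.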

  19. Computer simulation of yielding supports under static and short-term dynamic load

    Directory of Open Access Journals (Sweden)

    Kumpyak Oleg

    2018-01-01

    Dynamic impacts, which have become more frequent lately, cause large human and economic losses, and methods to prevent them are not always effective and reasonable. The given research aims at studying a way of enhancing the explosion safety of building structures by means of yielding supports. The paper presents results of numerical studies of the strength and deformation properties of yielding supports in the shape of annular tubes under static and short-term dynamic loading. The degree of influence of yielding supports was assessed taking into account three characteristic stages of deformation: elastic; elasto-plastic; and elasto-plastic with hardening. The methodology for the numerical studies was described using finite element analysis with the software Ansys Mechanical v17.2. It was established that the rigidity of yielding supports significantly influences their stress-strain state. The research determined that, with an increase in the rigidity of the deformable elements, the dependence between load and deformation of the support in the elastic and plastic stages has a linear character. A significant reduction of the dynamic response and an increase in the deformation time of yielding supports were observed as the plastic component increased. This suggests the possibility of their application as supporting units in RC beams.
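
    The three deformation stages named above amount to a piecewise-linear load-deformation curve. A sketch with hypothetical stiffnesses and transition points (illustration only, not values from the paper's FE models):

```python
def support_load(d, k1=100.0, d_yield=2.0, k2=10.0, d_harden=8.0, k3=40.0):
    """Piecewise-linear load (kN) of a yielding support at deformation d (mm).

    Stages: elastic (stiffness k1), elasto-plastic plateau (k2), and
    elasto-plastic with hardening (k3). All numbers are hypothetical.
    """
    if d <= d_yield:
        return k1 * d
    if d <= d_harden:
        return k1 * d_yield + k2 * (d - d_yield)
    return k1 * d_yield + k2 * (d_harden - d_yield) + k3 * (d - d_harden)

for d in (1.0, 5.0, 10.0):
    print(d, "mm ->", support_load(d), "kN")
```

    The long low-stiffness plateau is what absorbs impact energy and stretches the deformation time, which is the effect the study reports for supports with a larger plastic component.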

  20. Intelligent Adaptation and Personalization Techniques in Computer-Supported Collaborative Learning

    CERN Document Server

    Demetriadis, Stavros; Xhafa, Fatos

    2012-01-01

    Adaptation and personalization have been extensively studied in the CSCL research community with the aim of designing intelligent systems that adaptively support eLearning processes and collaboration. Yet, with the fast development of Internet technologies, especially the emergence of new data technologies and mobile technologies, new opportunities and perspectives are opened for advanced adaptive and personalized systems. Adaptation and personalization are posing new research and development challenges to today's CSCL systems. In particular, adaptation should be approached in a multi-dimensional way (cognitive, technological, context-aware, and personal). Moreover, it should address the particularities of both individual learners and group collaboration. As a consequence, the aim of this book is twofold. On the one hand, it discusses the latest advances and findings in the area of intelligent adaptive and personalized learning systems. On the other hand, it analyzes the new implementation perspectives for intelligen...

  1. Computation of Nonlinear Parameters of Heart Rhythm Using Short Time ECG Segments

    Directory of Open Access Journals (Sweden)

    Berik Koichubekov

    2015-01-01

    We propose a method to compute the nonlinear parameters of heart rhythm (correlation dimension D2 and correlation entropy K2) using the 5-minute ECG recordings preferred for population screening. Conversion of the RR-interval time series into a continuous function x(t) allows obtaining new time series at different sampling intervals dt. It has been shown that for all dt (250, 200, 125, and 100 ms) the cross-plots of D2 and K2 against the embedding dimension m for phase-space reconstruction start to level off at m=9. The sample size N at different sampling rates varied from 1200 at dt = 250 ms to 3000 at dt = 100 ms. Moreover, the D2 and K2 means were not statistically different; that is, the sampling rate did not influence the results. We tested the feasibility of the method in two models: nonlinear heart rhythm dynamics in different states of the autonomous nervous system, and age-related characteristics of the nonlinear parameters. According to the acquired data, the heart rhythm is more complex in childhood and adolescence, with a stronger parasympathetic influence against the background of elevated activity of the sympathetic autonomous nervous system.
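
    D2 is conventionally estimated from the Grassberger-Procaccia correlation sum C(r), the fraction of pairs of delay-embedded points closer than r, with D2 the slope of log C(r) versus log r. A self-contained two-radius sketch, using i.i.d. noise as the test signal so D2 should come out near the embedding dimension m = 2 (a real RR-interval analysis would scan many radii and embedding dimensions, as the abstract does up to m = 9):

```python
import math, random

def correlation_sum(series, r, m=2, tau=1):
    """Grassberger-Procaccia correlation sum: fraction of pairs of
    delay-embedded points (dimension m, delay tau) closer than r (max-norm)."""
    pts = [tuple(series[i + j * tau] for j in range(m))
           for i in range(len(series) - (m - 1) * tau)]
    close = total = 0
    for i in range(len(pts)):
        for j in range(i + 1, len(pts)):
            total += 1
            if max(abs(a - b) for a, b in zip(pts[i], pts[j])) < r:
                close += 1
    return close / total

random.seed(1)
x = [random.random() for _ in range(400)]   # i.i.d. noise test signal
# Two-radius slope estimate of D2 = d log C(r) / d log r.
d2 = math.log(correlation_sum(x, 0.10) / correlation_sum(x, 0.05)) / math.log(2)
print(f"D2 estimate: {d2:.2f}")   # near m = 2 for noise
```

    For a genuinely low-dimensional signal, D2 saturates as m grows instead of tracking it, which is the levelling-off behaviour the abstract reports at m = 9.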

  2. Towards computational improvement of DNA database indexing and short DNA query searching.

    Science.gov (United States)

    Stojanov, Done; Koceski, Sašo; Mileva, Aleksandra; Koceska, Nataša; Bande, Cveta Martinovska

    2014-09-03

    In order to facilitate and speed up the search of massive DNA databases, the database is indexed beforehand using a mapping function. By searching the indexed data structure, exact query hits can be identified. If the database is searched against an annotated DNA query, such as a known promoter consensus sequence, then the starting locations and the number of potential genes can be determined. This is particularly relevant if unannotated DNA sequences have to be functionally annotated. However, indexing a massive DNA database and searching an indexed data structure with millions of entries is a time-demanding process. In this paper, we propose a fast DNA database indexing and searching approach that identifies all query hits in the database without having to examine all entries in the indexed data structure, at the cost of limiting the maximum length of a query that can be searched against the database. By applying the proposed indexing equation, the whole human genome could be indexed in 10 hours on a personal computer, under the assumption that there is enough RAM to store the indexed data structure. Analysing the methodology proposed by Reneker, we observed that hits at starting positions [Formula: see text] are not reported if the database is searched against a query shorter than [Formula: see text] nucleotides, such that [Formula: see text] is the length of the DNA database words being mapped and [Formula: see text] is the length of the query. A solution to this drawback is also presented.
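
    The general idea, mapping every fixed-length database word to its start positions and then looking up a query's leading word before verifying the full match, can be sketched as follows. This uses a toy Python dictionary in place of the paper's integer mapping function, and it shares the constraint discussed above: the query must be at least as long as the indexed word.

```python
from collections import defaultdict

def build_index(db, w):
    """Map every length-w word of the database to its start positions."""
    index = defaultdict(list)
    for i in range(len(db) - w + 1):
        index[db[i:i + w]].append(i)
    return index

def search(db, index, w, query):
    """All exact hits of query (requires len(query) >= w)."""
    return [start for start in index.get(query[:w], [])
            if db[start:start + len(query)] == query]

db = "ACGTACGTTAGCACGT"
idx = build_index(db, 4)
print(search(db, idx, 4, "ACGTT"))   # -> [4]
```

    Only the positions stored under the query's first 4-mer are examined, not every entry in the index, which is the speed-up the paper's approach formalizes for genome-scale databases.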

  3. A Gaussian mixture model based adaptive classifier for fNIRS brain-computer interfaces and its testing via simulation.

    Science.gov (United States)

    Li, Zheng; Jiang, Yi-Han; Duan, Lian; Zhu, Chao-Zhe

    2017-08-01

    Functional near infra-red spectroscopy (fNIRS) is a promising brain imaging technology for brain-computer interfaces (BCI). Future clinical uses of fNIRS will likely require operation over long time spans, during which neural activation patterns may change. However, current decoders for fNIRS signals are not designed to handle changing activation patterns. The objective of this study is to test via simulations a new adaptive decoder for fNIRS signals, the Gaussian mixture model adaptive classifier (GMMAC). GMMAC can simultaneously classify and track activation pattern changes without the need for ground-truth labels. This adaptive classifier uses computationally efficient variational Bayesian inference to label new data points and update mixture model parameters, using the previous model parameters as priors. We test GMMAC in simulations in which neural activation patterns change over time and compare to static decoders and unsupervised adaptive linear discriminant analysis classifiers. Our simulation experiments show GMMAC can accurately decode under time-varying activation patterns: shifts of activation region, expansions of activation region, and combined contractions and shifts of activation region. Furthermore, the experiments show the proposed method can track the changing shape of the activation region. Compared to prior work, GMMAC performed significantly better than the other unsupervised adaptive classifiers on a difficult activation pattern change simulation: 99% versus  <54% in two-choice classification accuracy. We believe GMMAC will be useful for clinical fNIRS-based brain-computer interfaces, including neurofeedback training systems, where operation over long time spans is required.
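
    GMMAC itself uses variational Bayesian inference; the underlying idea of tracking drifting activation patterns without ground-truth labels can be illustrated with the simpler maximum-likelihood EM updates for a two-class, one-dimensional Gaussian mixture. All values below are hypothetical, chosen so that the class means have drifted away from the classifier's stale estimates:

```python
import math, random

def gaussian(x, mu, var):
    """Normal probability density."""
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def em_update(data, mus, var=1.0, iters=10):
    """A few EM iterations: update two class means from unlabeled points,
    starting from the previous (stale) means as the initial guess."""
    for _ in range(iters):
        resp = [gaussian(x, mus[0], var) /
                (gaussian(x, mus[0], var) + gaussian(x, mus[1], var))
                for x in data]
        mus = (sum(r * x for r, x in zip(resp, data)) / sum(resp),
               sum((1 - r) * x for r, x in zip(resp, data))
               / sum(1 - r for r in resp))
    return mus

random.seed(0)
# Unlabeled fNIRS-like features whose true class means have drifted to 0 and 5.
data = ([random.gauss(0.0, 1.0) for _ in range(200)] +
        [random.gauss(5.0, 1.0) for _ in range(200)])
mus = em_update(data, (1.0, 4.0))       # stale means from an earlier session
print(round(mus[0], 2), round(mus[1], 2))
```

    The responsibilities computed in the E-step are also the (soft) classifications, so the same pass that labels new data keeps the model aligned with the drifting activation pattern; GMMAC replaces these point estimates with Bayesian updates that carry the previous model forward as a prior.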

  5. Computational study on monkey VOR adaptation and smooth pursuit based on the parallel control-pathway theory.

    Science.gov (United States)

    Tabata, Hiromitsu; Yamamoto, Kenji; Kawato, Mitsuo

    2002-04-01

    Much controversy remains about the site of learning and memory for vestibuloocular reflex (VOR) adaptation, in spite of numerous previous studies. One possible explanation for VOR adaptation is the flocculus hypothesis, which assumes that this adaptation is caused by synaptic plasticity in the cerebellar cortex. Another hypothesis is the model proposed by Lisberger, which assumes that learning in both the cerebellar cortex and the vestibular nucleus is necessary for VOR adaptation. Lisberger's model is characterized by a strong positive feedback loop carrying eye velocity information from the vestibular nucleus to the cerebellar cortex. This structure contributes to the maintenance of a smooth pursuit driving command with zero retinal slip during the steady-state phase of smooth pursuit with gain 1, or during the target blink condition. Here, we propose an alternative hypothesis: that the pursuit driving command is maintained in the medial superior temporal (MST) area, based on MST firing data during target blink and during ocular following blank, and as a consequence we assume a much smaller gain for the positive feedback from the vestibular nucleus to the cerebellar cortex. This hypothesis is equivalent to assuming that there are two parallel neural pathways for controlling VOR and smooth pursuit: a main pathway from the semicircular canals to the vestibular nucleus for VOR, and a main pathway from the MST-dorsolateral pontine nuclei (DLPN)-flocculus/ventral paraflocculus to the vestibular nucleus for smooth pursuit. First, we theoretically demonstrate that this parallel control-pathway theory can reproduce the various firing patterns of horizontal gaze velocity Purkinje cells in the flocculus/ventral paraflocculus during VOR in the dark, smooth pursuit, and VOR cancellation, as reported by Miles et al., at least as well as the gaze velocity theory, which is the basic framework of Lisberger's model. Second, computer simulations

  6. Enhanced goal-oriented error assessment and computational strategies in adaptive reduced basis solver for stochastic problems

    OpenAIRE

    Serafin, Kevin; Magnain, Benoît; Florentin, Eric; Parés Mariné, Núria; Díez, Pedro

    2017-01-01

    This work focuses on providing accurate low-cost approximations of stochastic finite element simulations in the framework of linear elasticity. In a previous work, an adaptive strategy was introduced as an improved Monte-Carlo method for multi-dimensional large stochastic problems. We provide here a complete analysis of the method, including a new enhanced goal-oriented error estimator and estimates of CPU (central processing unit) cost gain. Technical insights of these two topics are pr...

  7. Brain-Computer Evolutionary Multi-Objective Optimization (BC-EMO): a genetic algorithm adapting to the decision maker

    OpenAIRE

    Battiti, Roberto; Passerini, Andrea

    2009-01-01

    The centrality of the decision maker (DM) is widely recognized in the Multiple Criteria Decision Making community. This translates into an emphasis on seamless human-computer interaction, and on adaptation of the solution technique to the knowledge which is progressively acquired from the DM. This paper adopts the methodology of Reactive Optimization (RO) for evolutionary interactive multi-objective optimization. RO follows the paradigm of "learning while optimizing", through the use of online ma...

  8. Sex determination of human mandible using metrical parameters by computed tomography: A prospective radiographic short study

    Directory of Open Access Journals (Sweden)

    Basavaraj N Kallalli

    2016-01-01

    Introduction: Sex determination of unidentified human remains is very important in forensic medicine, medicolegal cases, and forensic anthropology. The mandible is the largest and hardest facial bone; it commonly resists postmortem damage and forms an important source of personal identification. Additional studies have demonstrated the applicability of facial reconstruction using three-dimensional computed tomography (3D-CT) scans for the purpose of individual identification. Aim: To determine the sex of the human mandible using metrical parameters by CT. Materials and Methods: The study included thirty subjects (15 males and 15 females), with ages ranging between 10 and 60 years, obtained from the outpatient department of Oral Medicine and Radiology, Narsinhbhai Patel Dental College and Hospital. CT scans were performed on all subjects, and the data obtained were reconstructed for 3D viewing. After obtaining the 3D-CT scans, a total of seven mandibular measurements, i.e., gonial angle (G-angle), ramus length (Ramus-L), minimum ramus breadth, gonion-gnathion length (G-G-L), bigonial breadth, bicondylar breadth (BIC-Br), and coronoid length (CO-L), were measured; the collected data were analyzed using the SPSS statistical analysis program with Student's t-test. Results: The result of the study showed that, of the seven parameters, G-angle, Ramus-L, G-G-L, BIC-Br, and CO-L showed a statistically significant difference (P < 0.05), with an overall accuracy of 86% for males and 82% for females. Conclusion: Personal identification using the mandible by conventional methods has already been proved, but with variable efficacies. Advanced imaging modalities can aid in personal identification with much higher accuracy than conventional methods.
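
    The Student's t comparison applied to each metrical parameter is straightforward to state concretely. A sketch of the pooled two-sample t statistic with made-up gonial-angle values in degrees (illustrative numbers, not the study's data):

```python
import math, statistics

def pooled_t(a, b):
    """Two-sample Student's t statistic with a pooled (equal-variance) estimate."""
    na, nb = len(a), len(b)
    sp2 = ((na - 1) * statistics.variance(a) +
           (nb - 1) * statistics.variance(b)) / (na + nb - 2)
    return ((statistics.fmean(a) - statistics.fmean(b)) /
            math.sqrt(sp2 * (1 / na + 1 / nb)))

# Hypothetical gonial-angle measurements (degrees) for males and females.
males   = [119, 121, 118, 122, 120]
females = [125, 127, 124, 128, 126]
print(round(pooled_t(males, females), 2))   # -> -6.0
```

    The resulting t value is compared against the t distribution with na + nb - 2 degrees of freedom to obtain the P value; parameters with P < 0.05 are the ones the study retained as sex discriminators.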

  9. Computer-Based Adaptive Instructional Strategies for the Improvement of Performance and Reduction of Time.

    Science.gov (United States)

    Tennyson, Robert D.; Tennyson, Carol L.

    Three design strategies for selecting number of instructional instances needed in concept learning were investigated. Two strategies used adaptive procedures for the selection, while a nonadaptive strategy selected instances by number of associated attributes. The data analysis showed that the full adaptive strategy (using pretask and on-task…

  10. An Adaptive Computational Network Model for Multi-Emotional Social Interaction

    NARCIS (Netherlands)

    Roller, Ramona; Blommestijn, Suzan Q.; Treur, J.

    2017-01-01

    The study reported in this paper investigates an adaptive temporal-causal network-model for emotion contagion. The dynamic network principles of emotion contagion and the adaptive principles of homophily and Hebbian learning were used to simulate the change in multiple emotions and social

  11. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  12. IFCPT S-Duct Grid-Adapted FUN3D Computations for the Third Propulsion Aerodynamics Workshop

    Science.gov (United States)

    Davis, Zach S.; Park, M. A.

    2017-01-01

    Contributions of the unstructured Reynolds-averaged Navier-Stokes code, FUN3D, to the 3rd AIAA Propulsion Aerodynamics Workshop are described for the diffusing IFCPT S-Duct. Using workshop-supplied grids, results for the baseline S-Duct, baseline S-Duct with Aerodynamic Interface Plane (AIP) rake hardware, and baseline S-Duct with flow control devices are compared with experimental data and results computed with output-based, off-body grid adaptation in FUN3D. Due to the absence of influential geometry components, total pressure recovery is overpredicted on the baseline S-Duct and S-Duct with flow control vanes when compared to experimental values. An estimate for the exact value of total pressure recovery is derived for these cases given an infinitely refined mesh. When results from output-based mesh adaptation are compared with those computed on workshop-supplied grids, a considerable improvement in predicting total pressure recovery is observed. By including more representative geometry, output-based mesh adaptation compares very favorably with experimental data in terms of predicting the total pressure recovery cost-function; whereas, results computed using the workshop-supplied grids are underpredicted.

  13. Agile Development of Various Computational Power Adaptive Web-Based Mobile-Learning Software Using Mobile Cloud Computing

    Science.gov (United States)

    Zadahmad, Manouchehr; Yousefzadehfard, Parisa

    2016-01-01

    Mobile Cloud Computing (MCC) aims to improve all mobile applications such as m-learning systems. This study presents an innovative method to use web technology and software engineering's best practices to provide m-learning functionalities hosted in a MCC-learning system as service. Components hosted by MCC are used to empower developers to create…

  14. Adapting the Computed Tomography Criteria of Hemorrhagic Transformation to Stroke Magnetic Resonance Imaging

    Directory of Open Access Journals (Sweden)

    Lars Neeb

    2013-08-01

    Full Text Available Background: The main safety aspect in the use of stroke thrombolysis and in clinical trials of new pharmaceutical or interventional stroke therapies is the incidence of hemorrhagic transformation (HT) after treatment. The computed tomography (CT)-based classification of the European Cooperative Acute Stroke Study (ECASS) distinguishes four categories of HTs. An HT can range from a harmless spot of blood accumulation to a symptomatic space-occupying parenchymal bleeding associated with a massive deterioration of symptoms and clinical prognosis. In magnetic resonance imaging (MRI), HTs are often categorized using the ECASS criteria although this classification has not been validated in MRI. We developed MRI-specific criteria for the categorization of HT and sought to assess their diagnostic reliability in a retrospective study. Methods: Consecutive acute ischemic stroke patients, who had received a 3-tesla MRI before and 12-36 h after thrombolysis, were screened retrospectively for an HT of any kind in post-treatment MRI. Intravenous tissue plasminogen activator was given to all patients within 4.5 h. HT categorization was based on a simultaneous read of 3 different MRI sequences (fluid-attenuated inversion recovery, diffusion-weighted imaging and T2* gradient-recalled echo). Categorization of HT in MRI accounted for the various aspects of the imaging pattern, such as the shape of the bleeding area and the signal intensity on each sequence. All data sets were independently categorized in a blinded fashion by 3 expert and 3 resident observers. Interobserver reliability of this classification was determined for all observers together and for each group separately by calculating Kendall's coefficient of concordance (W). Results: Of the 186 patients screened, 39 patients (21%) had an HT in post-treatment MRI and were included for the categorization of HT by experts and residents. The overall agreement of HT categorization according to the modified classification was

  15. Procedures for Computing Transonic Flows for Control of Adaptive Wind Tunnels. Ph.D. Thesis - Technische Univ., Berlin, Mar. 1986

    Science.gov (United States)

    Rebstock, Rainer

    1987-01-01

    Numerical methods are developed for control of three dimensional adaptive test sections. The physical properties of the design problem occurring in the external field computation are analyzed, and a design procedure suited for solution of the problem is worked out. To do this, the desired wall shape is determined by stepwise modification of an initial contour. The necessary changes in geometry are determined with the aid of a panel procedure, or, with incident flow near the sonic range, with a transonic small perturbation (TSP) procedure. The designed wall shape, together with the wall deflections set during the tunnel run, are the input to a newly derived one-step formula which immediately yields the adapted wall contour. This is particularly important since the classical iterative adaptation scheme is shown to converge poorly for 3D flows. Experimental results obtained in the adaptive test section with eight flexible walls are presented to demonstrate the potential of the procedure. Finally, a method is described to minimize wall interference in 3D flows by adapting only the top and bottom wind tunnel walls.

  16. An adaptive multi-spline refinement algorithm in simulation based sailboat trajectory optimization using onboard multi-core computer systems

    Directory of Open Access Journals (Sweden)

    Dębski Roman

    2016-06-01

    Full Text Available A new dynamic programming based parallel algorithm adapted to on-board heterogeneous computers for simulation based trajectory optimization is studied in the context of “high-performance sailing”. The algorithm uses a new discrete space of continuously differentiable functions called the multi-splines as its search space representation. A basic version of the algorithm is presented in detail (pseudo-code, time and space complexity, search space auto-adaptation properties). Possible extensions of the basic algorithm are also described. The presented experimental results show that contemporary heterogeneous on-board computers can be effectively used for solving simulation based trajectory optimization problems. These computers can be considered micro high performance computing (HPC) platforms: they offer high performance while remaining energy and cost efficient. The simulation based approach can potentially give highly accurate results since the mathematical model that the simulator is built upon may be as complex as required. The approach described is applicable to many trajectory optimization problems due to its black-box represented performance measure and use of OpenCL.
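    The stage-wise dynamic programming idea behind such simulation-based trajectory optimization can be sketched in miniature. Here `step_cost` is a hypothetical stand-in for a call to the sailing simulator, and the candidate positions are invented; the paper's multi-spline search space is far richer.

```python
# Minimal sketch of stage-wise dynamic programming over a discretized
# trajectory: each stage holds candidate lateral positions, and the
# (hypothetical) step_cost stands in for a simulator evaluation.
def step_cost(y_from, y_to):
    # invented cost: base "time" plus penalties for deviation and turning
    return 1.0 + 0.1 * abs(y_to) + 0.5 * abs(y_to - y_from)

def optimize(stages, candidates):
    best = {y: 0.0 for y in candidates}          # cost-to-reach each start node
    for _ in range(stages):
        best = {y2: min(best[y1] + step_cost(y1, y2) for y1 in best)
                for y2 in candidates}
    return min(best.values())

print(optimize(stages=3, candidates=[-1.0, 0.0, 1.0]))
```

    Because each stage only needs the best cost-to-reach of the previous stage, the per-stage minimizations are independent across candidate nodes, which is what makes the method parallelize well on heterogeneous on-board hardware.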

  17. h-p adaptive finite element methods in computational fluid dynamics

    Science.gov (United States)

    Oden, J. T.; Demkowicz, L.

    1991-01-01

    The principal ideas of h-p adaptive finite element methods for fluid dynamics problems are discussed. Applications include acoustics, compressible Euler and both compressible and incompressible Navier-Stokes equations. Several numerical examples illustrate the presented concepts.

  18. Software abstractions and computational issues in parallel structure adaptive mesh methods for electronic structure calculations

    Energy Technology Data Exchange (ETDEWEB)

    Kohn, S.; Weare, J.; Ong, E.; Baden, S.

    1997-05-01

    We have applied structured adaptive mesh refinement techniques to the solution of the LDA equations for electronic structure calculations. Local spatial refinement concentrates memory resources and numerical effort where it is most needed, near the atomic centers and in regions of rapidly varying charge density. The structured grid representation enables us to employ efficient iterative solver techniques such as conjugate gradient with FAC multigrid preconditioning. We have parallelized our solver using an object-oriented adaptive mesh refinement framework.
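    The conjugate-gradient iteration mentioned in the abstract can be sketched as follows (without the FAC multigrid preconditioner, which is omitted for brevity; the small test system is invented):

```python
import numpy as np

# Unpreconditioned conjugate gradient for a symmetric positive-definite
# system A x = b, the core iteration the abstract's solver builds on.
def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    x = np.zeros_like(b)
    r = b - A @ x                # initial residual
    p = r.copy()                 # initial search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)    # exact line search along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p   # new A-conjugate direction
        rs = rs_new
    return x

# small symmetric positive-definite test system
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
print(x)
```

    A preconditioner such as FAC multigrid would replace `r` with a preconditioned residual at each step, greatly reducing the iteration count on fine adaptive grids.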

  19. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  20. Sub-module Short Circuit Fault Diagnosis in Modular Multilevel Converter Based on Wavelet Transform and Adaptive Neuro Fuzzy Inference System

    DEFF Research Database (Denmark)

    Liu, Hui; Loh, Poh Chiang; Blaabjerg, Frede

    2015-01-01

    ...for continuous operation and post-fault maintenance. In this article, a fault diagnosis technique is proposed for the short circuit fault in a modular multi-level converter sub-module using the wavelet transform and adaptive neuro fuzzy inference system. The fault features are extracted from output phase voltage... by employing wavelet transform under different fault conditions. Then the fuzzy logic rules are automatically trained based on the fuzzified fault features to diagnose the different faults. Neither additional sensor nor the capacitor voltages are needed in the proposed method. The high accuracy, good......

  1. Adaptive Laboratory Evolution Of Escherichia Coli Reveals Arduous Resistance Development To A Combination Of Three Novel Antimicrobial Compounds And To The Short Amp P9-4

    DEFF Research Database (Denmark)

    Citterio, Linda; Franzyk, Henrik; Gram, Lone

    2015-01-01

    of resistance development decreases when two or more compounds are combined as compared to single-drug treatments. The purpose of this study was to determine if resistance could develop in Escherichia coli ATCC 25922 to the peptidomimetic HF-1002 2 and the AMPs novicidin and P9-4. The mentioned compounds were......, none of the lineages exposed to P9-4 was adapted to 32 x MIC. This indicates that this short-length antimicrobial peptide may be a promising candidate for further optimization for future application in clinical settings....

  2. Power Spectral Analysis of Short-Term Heart Rate Variability in Healthy and Arrhythmia Subjects by the Adaptive Continuous Morlet Wavelet Transform

    Directory of Open Access Journals (Sweden)

    Ram Sewak SINGH

    2017-12-01

    Full Text Available Power spectral analysis of short-term heart rate variability (HRV) can provide instant valuable information to understand the functioning of autonomic control over the cardiovascular system. In this study, an adaptive continuous Morlet wavelet transform (ACMWT) method has been used to describe the time-frequency characteristics of the HRV using band power spectra and the median value of the interquartile range. Adaptation of the method was based on the measurement of maximum energy concentration. The ACMWT has been validated on synthetic signals (i.e. stationary, and non-stationary with slowly varying and fast-changing frequency with time), modeled as closest to dynamic changes in HRV signals. This method has been also tested in the presence of additive white Gaussian noise (AWGN) to show its robustness towards the noise. From the results of testing on synthetic signals, the ACMWT was found to be an enhanced energy concentration estimator for assessment of the power spectrum of short-term HRV time series compared to the adaptive Stockwell transform (AST), adaptive modified Stockwell transform (AMST), standard continuous Morlet wavelet transform (CMWT) and Stockwell transform (ST) estimators at a statistical significance level of 5%. Further, the ACMWT was applied to real HRV data from the Fantasia and MIT-BIH databases, grouped as healthy young group (HYG), healthy elderly group (HEG), arrhythmia controlled medication group (ARCMG), and supraventricular tachycardia group (SVTG) subjects. The global results demonstrate that spectral indices of low frequency power (LFp) and high frequency power (HFp) of HRV were decreased in HEG compared to HYG subjects (p<0.0001), while LFp and HFp indices were increased in ARCMG compared to HEG (p<0.00001). The LFp and HFp components of HRV obtained from SVTG were reduced compared to other group subjects (p<0.00001).
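    The non-adaptive core of this analysis, a continuous Morlet wavelet transform probing HRV-style frequency bands, can be sketched as below. The parameters (omega0, the probe frequencies, the synthetic 0.1 Hz signal) are illustrative choices, not those of the paper, and the adaptive energy-concentration step is omitted.

```python
import numpy as np

# Minimal continuous Morlet wavelet transform: estimate power of a
# resampled RR-interval-like series at a few probe frequencies.
def morlet_cwt_power(x, fs, freqs, omega0=6.0):
    n = len(x)
    t = (np.arange(n) - n // 2) / fs          # wavelet time axis, centred
    power = np.empty((len(freqs), n))
    for i, f in enumerate(freqs):
        s = omega0 / (2 * np.pi * f)          # scale giving centre frequency f
        psi = np.exp(1j * omega0 * t / s) * np.exp(-t**2 / (2 * s**2))
        psi /= np.sqrt(s) * np.pi ** 0.25     # unit-energy normalization
        coef = np.convolve(x, np.conj(psi)[::-1], mode="same") / fs
        power[i] = np.abs(coef) ** 2
    return power

fs = 4.0                                 # Hz, a common HRV resampling rate
t = np.arange(0, 60, 1 / fs)
x = np.sin(2 * np.pi * 0.1 * t)          # synthetic LF-band oscillation (0.1 Hz)
freqs = np.array([0.04, 0.1, 0.3])       # probes spanning LF/HF-style bands
p = morlet_cwt_power(x, fs, freqs).mean(axis=1)
print(p.argmax())
```

    The time-averaged power peaks at the probe frequency matching the synthetic oscillation, which is the band-power readout (LFp/HFp-style indices) the abstract describes.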

  3. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  4. Effects of a chronic reduction of short-wavelength light input on melatonin and sleep patterns in humans: evidence for adaptation.

    Science.gov (United States)

    Giménez, Marina C; Beersma, Domien G M; Bollen, Pauline; van der Linden, Matthijs L; Gordijn, Marijke C M

    2014-06-01

    Light is an important environmental stimulus for the entrainment of the circadian clock and for increasing alertness. The intrinsically photosensitive ganglion cells in the retina play an important role in transferring this light information to the circadian system and they are elicited in particular by short-wavelength light. Exposure to short wavelengths is reduced, for instance, in elderly people due to yellowing of the ocular lenses. This reduction may be involved in the disrupted circadian rhythms observed in aged subjects. Here, we tested the effects of reduced blue light exposure in young healthy subjects (n = 15) by using soft orange contact lenses (SOCL). We showed (as expected) that a reduction in the melatonin suppressing effect of light is observed when subjects wear the SOCL. However, after chronic exposure to reduced (short wavelength) light for two consecutive weeks we observed an increase in sensitivity of the melatonin suppression response. The response normalized as if it took place under a polychromatic light pulse. No differences were found in the dim light melatonin onset or in the amplitude of the melatonin rhythms after chronic reduced blue light exposure. The effects on sleep parameters were limited. Our results demonstrate that the non-visual light system of healthy young subjects is capable of adapting to changes in the spectral composition of environmental light exposure. The present results emphasize the importance of considering not only the short-term effects of changes in environmental light characteristics.

  5. Glove-Enabled Computer Operations (GECO): Design and Testing of an Extravehicular Activity Glove Adapted for Human-Computer Interface

    Science.gov (United States)

    Adams, Richard J.; Olowin, Aaron; Krepkovich, Eileen; Hannaford, Blake; Lindsay, Jack I. C.; Homer, Peter; Patrie, James T.; Sands, O. Scott

    2013-01-01

    The Glove-Enabled Computer Operations (GECO) system enables an extravehicular activity (EVA) glove to be dual-purposed as a human-computer interface device. This paper describes the design and human participant testing of a right-handed GECO glove in a pressurized glove box. As part of an investigation into the usability of the GECO system for EVA data entry, twenty participants were asked to complete activities including (1) a Simon Says game in which they attempted to duplicate random sequences of targeted finger strikes and (2) a Text Entry activity in which they used the GECO glove to enter target phrases in two different virtual keyboard modes. In a within-subjects design, both activities were performed both with and without vibrotactile feedback. Participants' mean accuracies in correctly generating finger strikes with the pressurized glove were surprisingly high, both with and without the benefit of tactile feedback. Five of the subjects achieved mean accuracies exceeding 99% in both conditions. In Text Entry, tactile feedback provided a statistically significant performance benefit, quantified by characters entered per minute, as well as a reduction in error rate. Secondary analyses of responses to NASA Task Load Index (TLX) subjective workload assessments reveal a benefit for tactile feedback in GECO glove use for data entry. This first-ever investigation of the employment of a pressurized EVA glove for human-computer interface opens up a wide range of future applications, including text chat communications, manipulation of procedures/checklists, cataloguing/annotating images, scientific note taking, human-robot interaction, and control of suit and/or other EVA systems.

  6. CodABC: a computational framework to coestimate recombination, substitution, and molecular adaptation rates by approximate Bayesian computation.

    Science.gov (United States)

    Arenas, Miguel; Lopes, Joao S; Beaumont, Mark A; Posada, David

    2015-04-01

    The estimation of substitution and recombination rates can provide important insights into the molecular evolution of protein-coding sequences. Here, we present a new computational framework, called "CodABC," to jointly estimate recombination, substitution and synonymous and nonsynonymous rates from coding data. CodABC uses approximate Bayesian computation with and without regression adjustment and implements a variety of codon models, intracodon recombination, and longitudinal sampling. CodABC can provide accurate joint parameter estimates from recombining coding sequences, often outperforming maximum-likelihood methods based on more approximate models. In addition, CodABC allows for the inclusion of several nuisance parameters such as those representing codon frequencies, transition matrices, heterogeneity across sites or invariable sites. CodABC is freely available from http://code.google.com/p/codabc/, includes a GUI, extensive documentation and ready-to-use examples, and can run in parallel on multicore machines. © The Author 2015. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.
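    The rejection flavour of approximate Bayesian computation that underlies frameworks like CodABC can be illustrated on a toy problem. Everything below (the Bernoulli model, the summary statistic, the tolerance) is an invented stand-in for the paper's far richer codon models.

```python
import random

# Toy rejection-ABC loop: estimate a Bernoulli success probability from
# a simulated summary statistic. Illustrative only; CodABC's models and
# summaries are much more elaborate.
def simulate(theta, n=200, rng=random):
    # crude summary statistic: observed success fraction in n trials
    return sum(rng.random() < theta for _ in range(n)) / n

def abc_rejection(observed_summary, prior, n_sims=5000, eps=0.02, seed=1):
    rng = random.Random(seed)
    accepted = []
    for _ in range(n_sims):
        theta = prior(rng)                    # draw parameter from the prior
        if abs(simulate(theta, rng=rng) - observed_summary) < eps:
            accepted.append(theta)            # keep draws whose simulations match
    return accepted

post = abc_rejection(0.3, prior=lambda rng: rng.random())
print(sum(post) / len(post))
```

    The accepted draws approximate the posterior; the regression adjustment mentioned in the abstract would additionally correct each accepted `theta` using the discrepancy between simulated and observed summaries.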

  7. Semi-supervised adaptation in ssvep-based brain-computer interface using tri-training

    DEFF Research Database (Denmark)

    Bender, Thomas; Kjaer, Troels W.; Thomsen, Carsten E.

    2013-01-01

    This paper presents a novel and computationally simple tri-training based semi-supervised steady-state visual evoked potential (SSVEP)-based brain-computer interface (BCI). It is implemented with autocorrelation-based features and a Naïve-Bayes classifier (NBC). The system uses nine characters...

  8. A linear matrix inequality-based approach for the computation of actuator bandwidth limits in adaptive control

    Science.gov (United States)

    Wagner, Daniel Robert

    Linear matrix inequalities and convex optimization techniques have become popular tools to solve nontrivial problems in the field of adaptive control. Specifically, the stability of adaptive control laws in the presence of actuator dynamics remains an important open control problem. In this thesis, we present a linear matrix inequality-based hedging approach and evaluate it for model reference adaptive control of an uncertain dynamical system in the presence of actuator dynamics. The ideal reference dynamics are modified such that the hedging approach allows the correct adaptation without being hindered by the presence of actuator dynamics. The hedging approach is first generalized such that two cases are considered where the actuator output and control effectiveness are known and unknown. We then show the stability of the closed-loop dynamical system using Lyapunov-based stability analysis tools and propose a linear matrix inequality-based framework for the computation of the minimum allowable actuator bandwidth limits such that the closed-loop dynamical system remains stable. The results of the linear matrix inequality-based hedging approach are then generalized to multiactuator systems with a new linear matrix inequality condition. The minimum actuator bandwidth solutions for closed-loop system stability are theoretically guaranteed to exist in a convex set with a partially convex constraint and then solved numerically using an algorithm in the case where there are multiple actuators. Finally, the efficacy of the results contained in this thesis is demonstrated using several illustrative numerical examples.
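    The Lyapunov step at the heart of such analyses can be sketched numerically: a matrix A is Hurwitz iff the Lyapunov equation A'P + PA = -Q (Q positive definite) has a positive-definite solution P. The small Kronecker-product solve below is a stand-in for a proper LMI solver, and the example system is invented.

```python
import numpy as np

# Hedged sketch of a Lyapunov-based stability certificate: solve
# A.T @ P + P @ A = -Q for P and check that P is positive definite.
def lyapunov_P(A, Q):
    n = A.shape[0]
    I = np.eye(n)
    # vec(A.T @ P + P @ A) = (kron(I, A.T) + kron(A.T, I)) @ vec(P),
    # using the column-major vec identity vec(ABC) = kron(C.T, A) vec(B)
    M = np.kron(I, A.T) + np.kron(A.T, I)
    return np.linalg.solve(M, -Q.flatten(order="F")).reshape(n, n, order="F")

A = np.array([[0.0, 1.0], [-2.0, -3.0]])   # invented Hurwitz system (eigs -1, -2)
P = lyapunov_P(A, np.eye(2))
eigs = np.linalg.eigvalsh((P + P.T) / 2)
print(eigs.min() > 0)                      # positive-definite P certifies stability
```

    An LMI-based bandwidth computation would add the actuator-dynamics terms to this matrix inequality and search for the smallest bandwidth keeping it feasible.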

  9. A new computing approach for power signal modeling using fractional adaptive algorithms.

    Science.gov (United States)

    Chaudhary, Naveed Ishtiaq; Zubair, Syed; Raja, Muhammad Asif Zahoor

    2017-05-01

    Estimating the harmonic parameters is a fundamental requirement for signal modelling in a power supply system. In this study, exploration and exploitation in fractional adaptive signal processing (FrASP) is carried out for identification of parameters in power signals. We design FrASP algorithms based on recently introduced variants of generalized least mean square (LMS) adaptive strategies for parameter estimation of the model. The performance of the proposed fractional adaptive schemes is evaluated for a number of scenarios based on step size and noise variations. Results of the simulated system for a sufficiently large number of independent runs validated the reliability and effectiveness of the given methods through different performance measures in terms of mean square error, variance account for, and Nash-Sutcliffe efficiency. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
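    The integer-order LMS update that the paper's fractional variants generalize can be sketched for harmonic parameter estimation. The frequencies, amplitudes, and step size below are invented for illustration; the fractional-order gradient term is not shown.

```python
import numpy as np

# Standard (integer-order) LMS estimating harmonic amplitudes of a power
# signal; the paper's fractional variants modify this weight update.
def lms_harmonic(y, t, freqs, mu=0.05, epochs=50):
    # regressors: sin/cos pair at each harmonic frequency
    X = np.column_stack([f(2 * np.pi * fr * t) for fr in freqs
                         for f in (np.sin, np.cos)])
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for x_n, y_n in zip(X, y):
            e = y_n - x_n @ w        # instantaneous prediction error
            w += mu * e * x_n        # LMS weight update
    return w

fs = 1000.0
t = np.arange(0, 0.2, 1 / fs)
# invented test signal: 50 Hz fundamental plus a 150 Hz harmonic
y = 1.5 * np.sin(2 * np.pi * 50 * t) + 0.4 * np.sin(2 * np.pi * 150 * t)
w = lms_harmonic(y, t, freqs=[50.0, 150.0])
print(np.round(w, 2))
```

    The converged weights recover the sine amplitudes of each harmonic; a fractional-order scheme would replace the plain `e * x_n` gradient with a fractional-derivative-weighted version of it.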

  10. Adaptive computations of flow around a delta wing with vortex breakdown

    Science.gov (United States)

    Modiano, David L.; Murman, Earll M.

    1993-01-01

    An adaptive unstructured mesh solution method for the three-dimensional Euler equations was used to simulate the flow around a sharp edged delta wing. Emphasis was on the breakdown of the leading edge vortex at high angle of attack. Large values of entropy, which indicate vortical regions of the flow, specified the region in which adaptation was performed. The aerodynamic normal force coefficients show excellent agreement with wind tunnel data measured by Jarrah, and demonstrate the importance of adaptation in obtaining an accurate solution. The pitching moment coefficient and the location of vortex breakdown are compared with experimental data measured by Hummel and Srinivasan, showing good agreement in cases in which vortex breakdown is located over the wing.

  11. Distinguishing Short Quantum Computations

    OpenAIRE

    Rosgen, Bill

    2008-01-01

    Distinguishing logarithmic depth quantum circuits on mixed states is shown to be complete for $QIP$, the class of problems having quantum interactive proof systems. Circuits in this model can represent arbitrary quantum processes, and thus this result has implications for the verification of implementations of quantum algorithms. The distinguishability problem is also complete for $QIP$ on constant depth circuits containing the unbounded fan-out gat...

  12. Translation and cross-cultural adaptation of the Brazilian Portuguese version of the Driving Anger Scale (DAS): long form and short form

    Directory of Open Access Journals (Sweden)

    Jessye Almeida Cantini

    2015-03-01

    Full Text Available Introduction: Driving anger has attracted the attention of researchers in recent years because it may induce individuals to drive aggressively or adopt risk behaviors. The Driving Anger Scale (DAS) was designed to evaluate the propensity of drivers to become angry or aggressive while driving. This study describes the cross-cultural adaptation of a Brazilian version of the short form and the long form of the DAS. Methods: Translation and adaptation were made in four steps: two translations and two back-translations carried out by independent evaluators; the development of a brief version by four bilingual experts in mental health and driving behaviors; a subsequent experimental application; and, finally, an investigation of operational equivalence. Results: Final Brazilian versions of the short form and of the long form of the DAS were made and are presented. Conclusions: This important instrument, which assesses driving anger and aggressive behaviors, is now available to evaluate the driving behaviors of the Brazilian population, which facilitates research in this field.

  13. Moving finite elements: A continuously adaptive method for computational fluid dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Glasser, A.H. (Los Alamos National Lab., NM (USA)); Miller, K.; Carlson, N. (California Univ., Berkeley, CA (USA))

    1991-01-01

    Moving Finite Elements (MFE), a recently developed method for computational fluid dynamics, promises major advances in the ability of computers to model the complex behavior of liquids, gases, and plasmas. Applications of computational fluid dynamics occur in a wide range of scientifically and technologically important fields. Examples include meteorology, oceanography, global climate modeling, magnetic and inertial fusion energy research, semiconductor fabrication, biophysics, automobile and aircraft design, industrial fluid processing, chemical engineering, and combustion research. The improvements made possible by the new method could thus have substantial economic impact. This paper describes the mathematical formulation and illustrates its use.

  14. ERP Human Enhancement Progress Report : Use case and computational model for adaptive maritime automation

    NARCIS (Netherlands)

    Kleij, R. van der; Broek, J. van den; Brake, G.M. te; Rypkema, J.A.; Schilder, C.M.C.

    2015-01-01

    Automation is often applied in order to increase the cost-effectiveness, reliability and safety of maritime ship and offshore operations. Automation of operator tasks, has not, however, eliminated human error so much as created opportunities for new kinds of error. The ambition of the Adaptive

  15. Light-induced short-term adaptation mechanisms under redox control in the PS II-LHCII supercomplex: LHC II state transitions and PS II repair cycle

    Science.gov (United States)

    Kruse, Olaf

    2001-05-01

    Oxygenic photosynthesis takes place in the thylakoid membranes of cyanobacteria, algae and higher plants. While cyanobacteria have adapted to relatively constant environments, higher plants had to evolve mechanisms to adapt to continuous environmental changes. These include changes in light intensity, temperature and availability of water. One of the great challenges in plant cell biology is therefore to determine the regulatory mechanisms employed by higher plants and some algae to adapt to these constant environmental changes. The particular emphasis of this review is the description and characterisation of light-induced redox-controlled processes regulating the photosynthetic reactions, which involves maintaining maximal electron transport flow through the PS II-Cytb6f-PS I-FoF1ATPase electron transport chain and minimising light-induced oxidative damage to PS II which drives the highly oxidising water-splitting reaction. Two of the mechanisms involved in such short-term regulation processes are known as light harvesting complex II (LHC II) state transitions and photosystem II (PS II) repair cycle. They are followed by, and indeed may be a precondition in order to establish, the onset of the subsequent long-term mechanisms of regulation. In particular, the redox control of LHC II state transitions by reversible phosphorylation has been in the focus of many investigations, leading to many new results demonstrating the complexity of thylakoid-associated redox control mechanisms.

  16. A Discontinuous Galerkin Time-Domain Method with Dynamically Adaptive Cartesian Meshes for Computational Electromagnetics

    CERN Document Server

    Yan, Su; Arslanbekov, Robert R; Kolobov, Vladimir I; Jin, Jian-Ming

    2016-01-01

    A discontinuous Galerkin time-domain (DGTD) method based on dynamically adaptive Cartesian meshes (ACM) is developed for a full-wave analysis of electromagnetic fields in dispersive media. Hierarchical Cartesian grids offer simplicity close to that of structured grids and the flexibility of unstructured grids while being highly suited for adaptive mesh refinement (AMR). The developed DGTD-ACM achieves a desired accuracy by refining non-conformal meshes near material interfaces to reduce stair-casing errors without sacrificing the high efficiency afforded with uniform Cartesian meshes. Moreover, DGTD-ACM can dynamically refine the mesh to resolve the local variation of the fields during propagation of electromagnetic pulses. A local time-stepping scheme is adopted to alleviate the constraint on the time-step size due to the stability condition of the explicit time integration. Simulations of electromagnetic wave diffraction over conducting and dielectric cylinders and spheres demonstrate that the proposed meth...

  17. Spatial co-adaptation of cortical control columns in a micro-ECoG brain-computer interface

    Science.gov (United States)

    Rouse, A. G.; Williams, J. J.; Wheeler, J. J.; Moran, D. W.

    2016-10-01

    Objective. Electrocorticography (ECoG) has been used for a range of applications including electrophysiological mapping, epilepsy monitoring, and more recently as a recording modality for brain-computer interfaces (BCIs). Studies that examine ECoG electrodes designed and implanted chronically solely for BCI applications remain limited. The present study explored how two key factors influence chronic, closed-loop ECoG BCI: (i) the effect of inter-electrode distance on BCI performance and (ii) the differences in neural adaptation and performance when fixed versus adaptive BCI decoding weights are used. Approach. The amplitudes of epidural micro-ECoG signals between 75 and 105 Hz with 300 μm diameter electrodes were used for one-dimensional and two-dimensional BCI tasks. The effect of inter-electrode distance on BCI control was tested between 3 and 15 mm. Additionally, the performance and cortical modulation differences between constant, fixed decoding using a small subset of channels versus adaptive decoding weights using the entire array were explored. Main results. Successful BCI control was possible with two electrodes separated by 9 and 15 mm. Performance decreased and the signals became more correlated when the electrodes were only 3 mm apart. BCI performance in a 2D BCI task improved significantly when using adaptive decoding weights (80%-90%) compared to using constant, fixed weights (50%-60%). Additionally, modulation increased for channels previously unavailable for BCI control under the fixed decoding scheme upon switching to the adaptive, all-channel scheme. Significance. Our results clearly show that neural activity under a BCI recording electrode (which we define as a ‘cortical control column’) readily adapts to generate an appropriate control signal. These results show that the practical minimal spatial resolution of these control columns with micro-ECoG BCI is likely on the order of 3 mm. Additionally, they show that the combination and
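The fixed-versus-adaptive decoding contrast reported above can be sketched with a toy least-mean-squares (LMS) linear decoder on synthetic data. The study's actual decoder is not specified here; every signal, weight, and learning rate below is illustrative. After the simulated "neural" mapping drifts, only the adaptive weights track it.

```python
import random

# LMS-style adaptive decoder sketch: adaptive weights track a drifting
# feature-to-output mapping, while weights fixed at calibration do not.
random.seed(0)

n_ch = 4
true_w = [0.5, -0.2, 0.3, 0.1]   # initial "neural" mapping (synthetic)
fixed_w = list(true_w)           # decoder frozen at calibration
adaptive_w = list(true_w)
lr = 0.05

fixed_err = adaptive_err = 0.0
for t in range(2000):
    if t == 1000:                # neural adaptation: the mapping drifts
        true_w = [w + 0.4 for w in true_w]
    x = [random.gauss(0, 1) for _ in range(n_ch)]
    target = sum(w * xi for w, xi in zip(true_w, x))
    y_fixed = sum(w * xi for w, xi in zip(fixed_w, x))
    e = target - sum(w * xi for w, xi in zip(adaptive_w, x))
    adaptive_w = [w + lr * e * xi for w, xi in zip(adaptive_w, x)]  # LMS update
    if t >= 1500:                # evaluate well after the drift
        fixed_err += (target - y_fixed) ** 2
        adaptive_err += e ** 2

print(adaptive_err < fixed_err)  # True: adaptation recovers performance
```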

  18. Designing of adaptive computer aided learning system of tasks for probabilistic statistical branch of mathematics

    Directory of Open Access Journals (Sweden)

    С Н Дворяткина

    2013-12-01

Full Text Available This article focuses on the development of a model of an adaptive learning system for tasks in the probabilistic and statistical branches of mathematics, based on ICT, which takes into account the shortcomings of modern educational systems, namely: they are highly specialized, bound to a rigid predefined structure, closed, static, focused on a target audience, and do not take into account the dynamic characteristics of the individual student.

  19. Scanning Electron Microscopy Analysis of the Adaptation of Single-Unit Screw-Retained Computer-Aided Design/Computer-Aided Manufacture Abutments After Mechanical Cycling.

    Science.gov (United States)

    Markarian, Roberto Adrian; Galles, Deborah Pedroso; Gomes França, Fabiana Mantovani

    2017-06-20

To measure the microgap between dental implants and custom abutments fabricated using different computer-aided design/computer-aided manufacture (CAD/CAM) methods before and after mechanical cycling. CAD software (Dental System, 3Shape) was used to design a custom abutment for a single-unit, screw-retained crown compatible with a 4.1-mm external hexagon dental implant. The resulting stereolithography file was sent for manufacturing using four CAD/CAM methods (n = 40): milling and sintering of zirconium dioxide (ZO group), cobalt-chromium (Co-Cr) sintered via selective laser melting (SLM group), fully sintered machined Co-Cr alloy (MM group), and machined and sintered agglutinated Co-Cr alloy powder (AM group). Prefabricated titanium abutments (TI group) were used as controls. Each abutment was placed on a dental implant measuring 4.1 × 11 mm (SA411, SIN) inserted into an aluminum block. Measurements were taken using scanning electron microscopy (SEM) (×4,000) on four regions of the implant-abutment interface (IAI), at a relative distance of 90 degrees from each other. The specimens were mechanically aged (1 million cycles, 2 Hz, 100 N, 37°C) and the IAI width was measured again using the same approach. Data were analyzed using two-way analysis of variance, followed by the Tukey test. After mechanical cycling, the best adaptation results were obtained from the TI (2.29 ± 1.13 μm), AM (3.58 ± 1.80 μm), and MM (1.89 ± 0.98 μm) groups. A significantly worse adaptation outcome was observed for the SLM (18.40 ± 20.78 μm) and ZO (10.42 ± 0.80 μm) groups. Mechanical cycling had a marked effect only on the AM specimens, which significantly increased the microgap at the IAI. Custom abutments fabricated using fully sintered machined Co-Cr alloy and machined and sintered agglutinated Co-Cr alloy powder demonstrated the best adaptation results at the IAI, similar to those obtained with commercial prefabricated titanium abutments after mechanical cycling. The

  20. Principles underlying the design of "The Number Race", an adaptive computer game for remediation of dyscalculia

    Directory of Open Access Journals (Sweden)

    Cohen Laurent

    2006-05-01

Full Text Available Abstract Background Adaptive game software has been successful in remediation of dyslexia. Here we describe the cognitive and algorithmic principles underlying the development of similar software for dyscalculia. Our software is based on current understanding of the cerebral representation of number and the hypotheses that dyscalculia is due to a "core deficit" in number sense or in the link between number sense and symbolic number representations. Methods "The Number Race" software trains children on an entertaining numerical comparison task, by presenting problems adapted to the performance level of the individual child. We report full mathematical specifications of the algorithm used, which relies on an internal model of the child's knowledge in a multidimensional "learning space" consisting of three difficulty dimensions: numerical distance, response deadline, and conceptual complexity (from non-symbolic numerosity processing to increasingly complex symbolic operations). Results The performance of the software was evaluated both by mathematical simulations and by five weeks of use by nine children with mathematical learning difficulties. The results indicate that the software adapts well to varying levels of initial knowledge and learning speeds. Feedback from children, parents and teachers was positive. A companion article [1] describes the evolution of number sense and arithmetic scores before and after training. Conclusion The software, open-source and freely available online, is designed for learning disabled children aged 5–8, and may also be useful for general instruction of normal preschool children. The learning algorithm reported is highly general, and may be applied in other domains.

  1. Towards incorporating affective computing to virtual rehabilitation; surrogating attributed attention from posture for boosting therapy adaptation

    Science.gov (United States)

    Rivas, Jesús J.; Heyer, Patrick; Orihuela-Espina, Felipe; Sucar, Luis Enrique

    2015-01-01

Virtual rehabilitation (VR) is a novel motor rehabilitation therapy in which the rehabilitation exercises occur through interaction with bespoke virtual environments. These virtual environments dynamically adapt their activity to match the therapy progress. Adaptation should be guided by the cognitive and emotional state of the patient, neither of which is directly observable. Here, we present our first steps towards inferring non-observable attentional state from unobtrusively observable seated posture, so that this knowledge can later be exploited by a VR platform to modulate its behaviour. The space of seated postures was discretized, and 648 pictures of acted representations were exposed to crowd-evaluation to determine the attributed state of attention. A semi-supervised classifier based on Naïve Bayes with structural improvement was learnt to unfold a predictive relation between posture and attributed attention. Internal validity was established following a 2×5 cross-fold strategy. Following 4959 votes from the crowd, classification accuracy reached a promising 96.29% (µ±σ = 87.59±6.59) and the F-measure reached 82.35% (µ±σ = 69.72±10.50). Given this classification rate, we believe it is safe to claim posture as a reliable proxy for attributed attentional state. It follows that unobtrusive monitoring of posture can be exploited to guide intelligent adaptation in a virtual rehabilitation platform. This study further helps to identify critical aspects of posture permitting inference of attention.

  2. Principles underlying the design of "The Number Race", an adaptive computer game for remediation of dyscalculia

    Science.gov (United States)

    Wilson, Anna J; Dehaene, Stanislas; Pinel, Philippe; Revkin, Susannah K; Cohen, Laurent; Cohen, David

    2006-01-01

    Background Adaptive game software has been successful in remediation of dyslexia. Here we describe the cognitive and algorithmic principles underlying the development of similar software for dyscalculia. Our software is based on current understanding of the cerebral representation of number and the hypotheses that dyscalculia is due to a "core deficit" in number sense or in the link between number sense and symbolic number representations. Methods "The Number Race" software trains children on an entertaining numerical comparison task, by presenting problems adapted to the performance level of the individual child. We report full mathematical specifications of the algorithm used, which relies on an internal model of the child's knowledge in a multidimensional "learning space" consisting of three difficulty dimensions: numerical distance, response deadline, and conceptual complexity (from non-symbolic numerosity processing to increasingly complex symbolic operations). Results The performance of the software was evaluated both by mathematical simulations and by five weeks of use by nine children with mathematical learning difficulties. The results indicate that the software adapts well to varying levels of initial knowledge and learning speeds. Feedback from children, parents and teachers was positive. A companion article [1] describes the evolution of number sense and arithmetic scores before and after training. Conclusion The software, open-source and freely available online, is designed for learning disabled children aged 5–8, and may also be useful for general instruction of normal preschool children. The learning algorithm reported is highly general, and may be applied in other domains. PMID:16734905
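The idea of presenting problems adapted to the child's current performance level can be sketched, far more simply than the paper's multidimensional learning-space model, as a one-dimensional staircase on numerical distance. The simulated child, step sizes, and bounds below are made up for illustration.

```python
import random

# Minimal staircase sketch of performance-adapted difficulty. One
# difficulty dimension is used: the numerical distance between the two
# quantities being compared (smaller distance = harder problem).
random.seed(1)

def simulated_child(distance):
    # Toy response model: success probability grows with distance.
    return random.random() < min(0.95, 0.5 + 0.1 * distance)

distance = 9  # start easy (large distance)
history = []
for trial in range(50):
    correct = simulated_child(distance)
    history.append((distance, correct))
    if correct:
        distance = max(1, distance - 1)   # harder after a success
    else:
        distance = min(9, distance + 2)   # easier after a failure

# The staircase hovers around the simulated child's ability level.
print(sum(d for d, _ in history[-20:]) / 20)
```

The published algorithm instead maintains a probabilistic model of the child's knowledge and samples problems across three difficulty dimensions simultaneously; this sketch shows only the feedback principle.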

  3. An 8-item short form of the Eating Disorder Examination-Questionnaire adapted for children (ChEDE-Q8).

    Science.gov (United States)

    Kliem, Sören; Schmidt, Ricarda; Vogel, Mandy; Hiemisch, Andreas; Kiess, Wieland; Hilbert, Anja

    2017-06-01

Eating disturbances are common in children, placing a vulnerable group of them at risk for full-syndrome eating disorders and adverse health outcomes. To provide a valid self-report assessment of eating disorder psychopathology in children, a short form of the child version of the Eating Disorder Examination-Questionnaire (ChEDE-Q) was psychometrically evaluated. Similar to the EDE-Q, the ChEDE-Q provides assessment of eating disorder psychopathology related to anorexia nervosa, bulimia nervosa, and binge-eating disorder; however, the ChEDE-Q does not assess symptoms of avoidant/restrictive food intake disorder, pica, or rumination disorder. In 1,836 participants ages 7 to 18 years, recruited from two independent population-based samples, the factor structure of the recently established 8-item short form EDE-Q8 for adults was examined, including measurement invariance analyses on age, gender, and weight status derived from objectively measured weight and height. For convergent validity, the ChEDE-Q global score, the Body Esteem Scale, the Strengths and Difficulties Questionnaire, and sociodemographic characteristics were used. Item characteristics and age- and gender-specific norms were calculated. Confirmatory factor analysis revealed good model fit for the 8-item ChEDE-Q. Measurement invariance analyses indicated strict invariance for all analyzed subgroups. Convergent validity was supported through associations with well-established questionnaires and with age, gender, and weight status, in the expected directions. The newly developed ChEDE-Q8 proved to be a psychometrically sound and economical self-report assessment tool of eating disorder psychopathology in children. Further validation studies are needed, particularly concerning discriminant and predictive validity. © 2017 Wiley Periodicals, Inc.

  4. Initial phantom study comparing image quality in computed tomography using adaptive statistical iterative reconstruction and new adaptive statistical iterative reconstruction v.

    Science.gov (United States)

    Lim, Kyungjae; Kwon, Heejin; Cho, Jinhan; Oh, Jongyoung; Yoon, Seongkuk; Kang, Myungjin; Ha, Dongho; Lee, Jinhwa; Kang, Eunju

    2015-01-01

The purpose of this study was to assess the image quality of a novel advanced iterative reconstruction (IR) method called "adaptive statistical IR V" (ASIR-V) by comparing its image noise, contrast-to-noise ratio (CNR), and spatial resolution with those of filtered back projection (FBP) and adaptive statistical IR (ASIR) on computed tomography (CT) phantom images. We performed CT scans at 5 different tube currents (50, 70, 100, 150, and 200 mA) using 3 types of CT phantoms. Scanned images were subsequently reconstructed in 7 different settings: FBP, and 3 levels each of ASIR and ASIR-V (30%, 50%, and 70%). The image noise was measured in the first study using a body phantom. The CNR was measured in the second study using a contrast phantom, and the spatial resolution was measured in the third study using a high-resolution phantom. We compared the image noise, CNR, and spatial resolution among the 7 reconstruction settings to determine whether noise reduction, high CNR, and high spatial resolution could be achieved with ASIR-V. Quantitative analysis of the first and second studies showed that the images reconstructed using ASIR-V had reduced image noise and improved CNR compared with those of FBP and ASIR (P < 0.05). The images reconstructed using ASIR-V also had significantly improved spatial resolution compared with those of FBP and ASIR (P < 0.05). ASIR-V provides a significant reduction in image noise and a significant improvement in CNR as well as spatial resolution. Therefore, this technique has the potential to reduce the radiation dose further without compromising image quality.
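The CNR metric compared above is conventionally computed from an object region of interest and a background region of interest; the following is a generic sketch with toy Hounsfield-unit samples, not the study's data or exact formula.

```python
# Generic contrast-to-noise ratio (CNR) from two phantom ROIs:
# |difference of ROI means| divided by background noise (std dev).
def mean(xs):
    return sum(xs) / len(xs)

def std(xs):
    m = mean(xs)
    return (sum((x - m) ** 2 for x in xs) / len(xs)) ** 0.5

def cnr(roi_object, roi_background):
    return abs(mean(roi_object) - mean(roi_background)) / std(roi_background)

# Toy HU samples: lower background noise (e.g., stronger IR) raises CNR.
obj = [110, 112, 108, 111, 109]
bg_noisy = [50, 60, 40, 55, 45]
bg_clean = [50, 52, 48, 51, 49]
print(cnr(obj, bg_noisy) < cnr(obj, bg_clean))  # True
```

This is why an IR method that suppresses noise without shifting ROI means shows up directly as a CNR improvement.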

  5. Adaptive TrimTree: Green Data Center Networks through Resource Consolidation, Selective Connectedness and Energy Proportional Computing

    Directory of Open Access Journals (Sweden)

    Saima Zafar

    2016-10-01

Full Text Available A data center is a facility with a group of networked servers used by an organization for the storage, management and dissemination of its data. The increase in data center energy consumption over the past several years is staggering, therefore efforts are being initiated to achieve energy efficiency of various components of data centers. One of the main reasons for the high energy inefficiency of data centers is that most organizations run their data centers at full capacity 24/7. This results in a number of servers and switches being underutilized or even unutilized, yet working and consuming electricity around the clock. In this paper, we present Adaptive TrimTree, a mechanism that employs a combination of resource consolidation, selective connectedness and energy proportional computing for optimizing energy consumption in a Data Center Network (DCN). Adaptive TrimTree adopts a simple traffic-and-topology-based heuristic to find a minimum-power network subset called the 'active network subset' that satisfies the existing network traffic conditions, while switching off the residual unused network components. A 'passive network subset' is also identified for redundancy; it consists of links and switches that may be required in the future, and this subset is toggled to a sleep state. An energy proportional computing technique is applied to the active network subset to adapt link data rates to the workload, thus maximizing energy optimization. We have compared our proposed mechanism with the fat-tree topology and with ElasticTree, a scheme based on resource consolidation. Our simulation results show that our mechanism saves 50%–70% more energy compared to fat-tree and 19.6% compared to ElasticTree, with minimal impact on packet loss percentage and delay. Additionally, our mechanism copes better with traffic anomalies and surges due to the passive network provision.
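The active/passive/off partitioning idea can be sketched with a deliberately simplified capacity-based heuristic. The link counts, capacities, and one-spare policy below are illustrative assumptions, not TrimTree's actual algorithm.

```python
import math

# Sketch of traffic-based consolidation in the spirit of
# TrimTree/ElasticTree: keep the minimum number of uplinks whose total
# capacity covers current traffic ("active"), park one spare in sleep
# state ("passive"), and power off the rest ("off").
def partition_uplinks(total_traffic_gbps, n_links, link_capacity_gbps=10):
    needed = max(1, math.ceil(total_traffic_gbps / link_capacity_gbps))
    needed = min(needed, n_links)
    active = list(range(needed))
    passive = [needed] if needed < n_links else []   # sleeping spare
    off = list(range(needed + len(passive), n_links))
    return active, passive, off

active, passive, off = partition_uplinks(total_traffic_gbps=23, n_links=8)
print(active, passive, off)  # [0, 1, 2] [3] [4, 5, 6, 7]
```

The passive spare is what lets such a scheme absorb traffic surges without the wake-up latency of a fully powered-off component.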

  6. A Comment on Early Student Blunders on Computer-Based Adaptive Tests

    Science.gov (United States)

    Green, Bert F.

    2011-01-01

    This article refutes a recent claim that computer-based tests produce biased scores for very proficient test takers who make mistakes on one or two initial items and that the "bias" can be reduced by using a four-parameter IRT model. Because the same effect occurs with pattern scores on nonadaptive tests, the effect results from IRT scoring, not…

  7. Fast and Efficient Asynchronous Neural Computation with Adapting Spiking Neural Networks

    NARCIS (Netherlands)

    D. Zambrano (Davide); S.M. Bohte (Sander)

    2016-01-01

Biological neurons communicate with a sparing exchange of pulses: spikes. It is an open question how real spiking neurons produce the kind of powerful neural computation that is possible with deep artificial neural networks, using so few spikes to communicate. Building on

  8. Switching between manual control and brain-computer interface using long term and short term quality measures

    Directory of Open Access Journals (Sweden)

    Alex eKreilinger

    2012-01-01

Full Text Available Assistive devices for persons with limited motor control translate or amplify remaining functions to allow otherwise impossible actions. These assistive devices usually rely on just one type of input signal, which can be derived from residual muscle functions or any other kind of biosignal. When only one signal is used, the functionality of the assistive device can be reduced as soon as the quality of the provided signal is impaired. The quality can decrease in case of fatigue, lack of concentration, high noise, spasms, or tremors, depending on the type of signal. To overcome this dependency on one input signal, a combination of more inputs should be feasible. This work presents a hybrid Brain-Computer Interface (hBCI) approach where two different input signals (joystick and BCI) were monitored and only one of them was chosen as a control signal at a time. Users could move a car in a game-like feedback application to collect coins and avoid obstacles via either joystick or BCI control. Both control types were constantly monitored with four different long-term quality measures to evaluate the current state of the signals. As soon as the quality dropped below a certain threshold, a monitoring system would switch to the other control mode and vice versa. Additionally, short-term quality measures were applied to check for strong artifacts that could render voluntary control impossible. These measures were used to prohibit actions carried out during times when highly uncertain signals were recorded. The switching possibility allowed more functionality for the users. Moving the car was still possible even after one control mode stopped working. The proposed system serves as a basis that shows how BCI can be used as an assistive device, especially in combination with other assistive technology.
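The switching logic described above can be sketched as follows. Thresholds, quality values, and function names are invented for illustration: long-term quality decides which channel is in control, while short-term quality vetoes actions during artifacts.

```python
# Sketch of long-term/short-term quality switching between two input
# channels (joystick vs. BCI). All thresholds and values are made up.
def choose_control(active, quality, threshold=0.5):
    other = "BCI" if active == "joystick" else "joystick"
    if quality[active] < threshold and quality[other] >= threshold:
        return other          # switch to the healthier channel
    return active

def allow_action(short_term_quality, artifact_threshold=0.2):
    # Suppress actions while a strong artifact is suspected.
    return short_term_quality >= artifact_threshold

mode = "joystick"
stream = [
    ({"joystick": 0.9, "BCI": 0.7}, 0.9),
    ({"joystick": 0.4, "BCI": 0.8}, 0.9),   # joystick fatigue -> switch
    ({"joystick": 0.3, "BCI": 0.8}, 0.1),   # artifact -> veto action
]
for long_q, short_q in stream:
    mode = choose_control(mode, long_q)
    print(mode, allow_action(short_q))
```

Running the toy stream, control starts on the joystick, switches to BCI when the joystick's long-term quality drops, and the last action is suppressed by the short-term artifact check.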

  9. Large-Scale Assessment of a Fully Automatic Co-Adaptive Motor Imagery-Based Brain Computer Interface.

    Science.gov (United States)

    Acqualagna, Laura; Botrel, Loic; Vidaurre, Carmen; Kübler, Andrea; Blankertz, Benjamin

    2016-01-01

In the last years, Brain Computer Interface (BCI) technology has benefited from the development of sophisticated machine learning methods that let the user operate the BCI after a few trials of calibration. One remarkable example is the recent development of co-adaptive techniques, which have proved to extend the use of BCIs to people unable to achieve successful control with the standard BCI procedure. Especially for BCIs based on the modulation of the Sensorimotor Rhythm (SMR) these improvements are essential, since a non-negligible percentage of users is unable to operate SMR-BCIs efficiently. In this study we evaluated for the first time a fully automatic co-adaptive BCI system on a large scale. A pool of 168 participants naive to BCIs operated the co-adaptive SMR-BCI in one single session. Different psychological interventions were performed prior to the BCI session in order to investigate how motor coordination training and relaxation could influence BCI performance. A neurophysiological indicator based on the Power Spectral Density (PSD) was extracted from the recording of a few minutes of resting-state brain activity and tested as a predictor of BCI performance. Results show that high accuracies in operating the BCI could be reached by the majority of the participants before the end of the session. BCI performance could be significantly predicted by the neurophysiological indicator, consolidating the validity of the previously developed model. Nevertheless, we still found about 22% of users with performance significantly lower than the threshold of efficient BCI control at the end of the session. As inter-subject variability is still the major problem of BCI technology, we point out crucial issues for those who did not achieve sufficient control. Finally, we propose valid developments to move a step forward towards the applicability of the promising co-adaptive methods.
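A resting-state predictor in the spirit of the PSD-based indicator can be sketched by estimating band power around 10 Hz (the sensorimotor idle-rhythm range) on synthetic signals. The study's actual indicator is more elaborate; this is only a toy projection onto a sinusoid, and all signals are simulated.

```python
import math, random

# Sketch of an SMR-band power estimate from simulated resting-state
# EEG: a strong ~10 Hz rhythm at rest is the kind of feature such
# predictors build on. Signal parameters are made up.
random.seed(2)
fs = 250
t = [i / fs for i in range(fs * 4)]  # 4 s of samples

def band_power(x, freq):
    # Power of the projection of x onto a sinusoid at `freq` Hz.
    c = sum(xi * math.cos(2 * math.pi * freq * ti) for xi, ti in zip(x, t))
    s = sum(xi * math.sin(2 * math.pi * freq * ti) for xi, ti in zip(x, t))
    return (c * c + s * s) / len(x)

noise = [random.gauss(0, 1) for _ in t]
with_smr = [n + 2 * math.sin(2 * math.pi * 10 * ti) for n, ti in zip(noise, t)]

print(band_power(with_smr, 10) > band_power(noise, 10))  # True
```

In practice one would use a proper PSD estimator (e.g., Welch's method) over several channels, but the principle of a band-power feature extracted from a few minutes of rest is the same.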

  10. Large-Scale Assessment of a Fully Automatic Co-Adaptive Motor Imagery-Based Brain Computer Interface.

    Directory of Open Access Journals (Sweden)

    Laura Acqualagna

Full Text Available In the last years, Brain Computer Interface (BCI) technology has benefited from the development of sophisticated machine learning methods that let the user operate the BCI after a few trials of calibration. One remarkable example is the recent development of co-adaptive techniques, which have proved to extend the use of BCIs to people unable to achieve successful control with the standard BCI procedure. Especially for BCIs based on the modulation of the Sensorimotor Rhythm (SMR) these improvements are essential, since a non-negligible percentage of users is unable to operate SMR-BCIs efficiently. In this study we evaluated for the first time a fully automatic co-adaptive BCI system on a large scale. A pool of 168 participants naive to BCIs operated the co-adaptive SMR-BCI in one single session. Different psychological interventions were performed prior to the BCI session in order to investigate how motor coordination training and relaxation could influence BCI performance. A neurophysiological indicator based on the Power Spectral Density (PSD) was extracted from the recording of a few minutes of resting-state brain activity and tested as a predictor of BCI performance. Results show that high accuracies in operating the BCI could be reached by the majority of the participants before the end of the session. BCI performance could be significantly predicted by the neurophysiological indicator, consolidating the validity of the previously developed model. Nevertheless, we still found about 22% of users with performance significantly lower than the threshold of efficient BCI control at the end of the session. As inter-subject variability is still the major problem of BCI technology, we point out crucial issues for those who did not achieve sufficient control. Finally, we propose valid developments to move a step forward towards the applicability of the promising co-adaptive methods.

  11. Automatic left ventricle segmentation using iterative thresholding and an active contour model with adaptation on short-axis cardiac MRI.

    Science.gov (United States)

    Lee, Hae-Yeoun; Codella, Noel C F; Cham, Matthew D; Weinsaft, Jonathan W; Wang, Yi

    2010-04-01

An automatic left ventricle (LV) segmentation algorithm is presented for quantification of cardiac output and myocardial mass in clinical practice. The LV endocardium is first segmented using region growth with iterative thresholding by detecting the effusion into the surrounding myocardium and tissues. Then the epicardium is extracted using the active contour model guided by the endocardial border and the myocardial signal information estimated by iterative thresholding. This iterative thresholding and active contour model with adaptation (ITHACA) algorithm was compared to manual tracing used in clinical practice and the commercial MASS Analysis software (General Electric) in 38 patients, with Institutional Review Board (IRB) approval. The ITHACA algorithm provided substantial improvement over the MASS software in defining myocardial borders. The ITHACA algorithm agreed well with manual tracing, with mean differences in blood volume and myocardial mass of 2.9 ± 6.2 mL (mean ± standard deviation) and -0.9 ± 16.5 g, respectively. These differences were smaller than the differences between manual tracing and the MASS software (approximately -20.0 ± 6.9 mL and -1.0 ± 20.2 g, respectively). These experimental results support that the proposed ITHACA segmentation is accurate and useful for clinical practice.
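The "iterative thresholding" ingredient can be illustrated with a generic Ridler-Calvard-style threshold update. This is a sketch only; ITHACA additionally uses region growth and an active contour, which are omitted here, and the pixel values below are fabricated.

```python
# Generic iterative threshold selection (Ridler-Calvard style):
# alternate between splitting pixels at the threshold and moving the
# threshold to the midpoint of the two class means, until it settles.
def iterative_threshold(pixels, tol=0.5):
    t = sum(pixels) / len(pixels)          # start at the global mean
    while True:
        low = [p for p in pixels if p <= t]
        high = [p for p in pixels if p > t]
        new_t = 0.5 * (sum(low) / len(low) + sum(high) / len(high))
        if abs(new_t - t) < tol:
            return new_t
        t = new_t

# Toy intensities: dark myocardium (~40) vs. bright LV blood pool (~200).
pixels = [35, 40, 45, 38, 42, 195, 200, 205, 198, 202]
t = iterative_threshold(pixels)
print(40 < t < 200)  # True: the threshold separates the two classes
```

A threshold found this way can then seed region growth for the blood pool, with the class statistics reused to guide the epicardial contour.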

  12. Metabolic analysis of adaptation to short-term changes in culture conditions of the marine diatom Thalassiosira pseudonana.

    Directory of Open Access Journals (Sweden)

    Mariusz A Bromke

Full Text Available This report describes the metabolic and lipidomic profiling of 97 low-molecular-weight compounds from primary metabolism and 124 lipid compounds of the diatom Thalassiosira pseudonana. The metabolic profiles were created for diatoms perturbed for 24 hours with four different treatments: (I) removal of nitrogen, (II) lower iron concentration, (III) addition of sea salt, and (IV) addition of carbonate to their growth media. Our results show that as early as 24 hours after nitrogen depletion, significant qualitative and quantitative changes occur in the lipid composition as well as in the primary metabolism of Thalassiosira pseudonana. We observe the accumulation of several storage lipids, namely triacylglycerides, and of TCA cycle intermediates, of which citric acid increases more than 10-fold. These changes are positively correlated with the expression of TCA enzyme genes. Next to the TCA cycle intermediates and storage lipid changes, we observed a decrease in N-containing lipids and primary metabolites such as amino acids. As a measure of counteracting nitrogen starvation, we observed elevated expression levels of nitrogen uptake and amino acid biosynthetic genes. This indicates that diatoms can adapt quickly and efficiently to a changing environment by altering metabolic fluxes and metabolite abundances. In particular, the accumulation of proline and the decrease of dimethylsulfoniopropionate suggest that proline is the main osmoprotectant for the diatom in nitrogen-rich conditions.

  13. Adaptation of the short intergenic spacers between co-directional genes to the Shine-Dalgarno motif among prokaryote genomes

    DEFF Research Database (Denmark)

    Caro, Albert Pallejà; García-Vallvé, Santiago; Romeu, Antoni

    2009-01-01

ABSTRACT: BACKGROUND: In prokaryote genomes most of the co-directional genes are in close proximity. Even the coding sequence or the stop codon of a gene can overlap with the Shine-Dalgarno (SD) sequence of the downstream co-directional gene. In this paper we analyze how the presence of SD may influence the stop codon usage or the spacing lengths between co-directional genes. RESULTS: The SD sequences for 530 prokaryote genomes have been predicted using computer calculations of the base-pairing free energy between translation initiation regions and the 16S rRNA 3' tail. Genomes with a large... to the discussion of which factors affect the intergenic lengths, which cannot be totally explained by the pressure to compact the prokaryote genomes.

  14. Parallel paving: An algorithm for generating distributed, adaptive, all-quadrilateral meshes on parallel computers

    Energy Technology Data Exchange (ETDEWEB)

    Lober, R.R.; Tautges, T.J.; Vaughan, C.T.

    1997-03-01

Paving is an automated mesh generation algorithm which produces all-quadrilateral elements. It can additionally generate these elements in varying sizes such that the resulting mesh adapts to a function distribution, such as an error function. While powerful, conventional paving is a very serial algorithm in its operation. Parallel paving is the extension of serial paving into parallel environments to perform the same meshing functions as conventional paving, only on distributed, discretized models. This extension allows large, adaptive, parallel finite element simulations to take advantage of paving's meshing capabilities for h-remap remeshing. A significantly modified version of the CUBIT mesh generation code has been developed to host the parallel paving algorithm and demonstrate its capabilities on both two-dimensional and three-dimensional surface geometries, and to compare the resulting parallel-produced meshes to conventionally paved meshes for mesh quality and algorithm performance. Sandia's "tiling" dynamic load balancing code has also been extended to work with the paving algorithm to retain parallel efficiency as subdomains undergo iterative mesh refinement.

  15. Adaptation of PyFlag to Efficient Analysis of Overtaken Computer Data Storage

    OpenAIRE

    Aleksander Byrski; Wojciech Stryjewski; Bartłomiej Czechowicz

    2010-01-01

Based on existing software aimed at supporting the investigative analysis of computer data storage seized during an investigation (PyFlag), an extension is proposed involving the introduction of dedicated components for data identification and filtering. Hash codes for popular software contained in the NIST/NSRL database are considered in order to avoid unwanted files while searching and to classify them into several categories. The extension allows for further analysis, e.g. using artificial int...

  16. Lessons Learned in Adapting a Software System to a Micro Computer

    Science.gov (United States)

    2013-02-05

that had essentially unlimited resources (disk, memory and processor) and modify it to run on a microcontroller which has rather limited resources... system that could be readily used by soldiers. ...check the armor. A data file is collected and compared to the database to determine armor health. If the data file differs substantially from the

  17. Burnout syndrome among dental students: a short version of the "Burnout Clinical Subtype Questionnaire" adapted for students (BCSQ-12-SS).

    Science.gov (United States)

    Montero-Marin, Jesus; Monticelli, Francesca; Casas, Marina; Roman, Amparo; Tomas, Inmaculada; Gili, Margarita; Garcia-Campayo, Javier

    2011-12-12

Burnout has been traditionally defined in relation to the dimensions of "exhaustion", "cynicism", and "inefficiency". More recently, the Burnout Clinical Subtype Questionnaire (BCSQ-12) further established three different subtypes of burnout: the "frenetic" subtype (related to "overload"), the "under-challenged" subtype (related to "lack of development"), and the "worn-out" subtype (related to "neglect"). However, to date, these definitions have not been applied to students. The aims of this research were (1) to adapt a Spanish version of the BCSQ-12 for use with students, (2) to test its factorial validity, internal consistency, convergent and discriminant validity, and (3) to assess potential socio-demographic and occupational risk factors associated with the development of the subtypes. We used a cross-sectional design on a sample of dental students (n = 314) from Santiago and Huesca universities (Spain). Participants completed the Burnout Clinical Subtype Questionnaire Student Survey (BCSQ-12-SS), the Maslach Burnout Inventory Student Survey (MBI-SS), and a series of socio-demographic and occupational questions formulated for the specific purpose of this study. Data were subjected to exploratory factor analysis (EFA) using the principal component method with varimax orthogonal rotation. To assess the relations with the criterion, we calculated the Pearson correlation coefficient (r), the multiple correlation coefficient (Ry.123), and the coefficient of determination (R2y.123). To assess the association between the subtypes and the socio-demographic variables, we examined the adjusted odds ratio (OR) obtained from multivariate logistic regression models. Factorial analyses supported the theoretical proposition of the BCSQ-12-SS, with α-values exceeding 0.80 for all dimensions. The "overload-exhaustion" relation was r = 0.59 (p < 0.001), supporting the subtypes of burnout as established by the BCSQ-12-SS. As such, the BCSQ-12-SS can be used for the recognition of clinical profiles and for the
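The internal-consistency figures reported above (α-values exceeding 0.80) refer to Cronbach's alpha, which can be computed as follows. The item scores below are fabricated for illustration.

```python
# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of
# total scores). Data here are made-up 7-point responses.
def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(items):
    # items: one list of scores per questionnaire item,
    # aligned across respondents.
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]
    item_var = sum(variance(it) for it in items)
    return k / (k - 1) * (1 - item_var / variance(totals))

# Four strongly covarying items from five respondents.
items = [
    [1, 3, 4, 6, 7],
    [2, 3, 5, 6, 7],
    [1, 2, 4, 5, 6],
    [2, 4, 5, 6, 7],
]
print(round(cronbach_alpha(items), 2))  # 0.99
```

Items that rise and fall together across respondents, as here, drive alpha toward 1; uncorrelated items drive it toward 0.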

  18. Burnout syndrome among dental students: a short version of the "Burnout Clinical Subtype Questionnaire" adapted for students (BCSQ-12-SS)

    Science.gov (United States)

    2011-01-01

    Background Burnout has been traditionally defined in relation to the dimensions of "exhaustion", "cynicism", and "inefficiency". More recently, the Burnout Clinical Subtype Questionnaire (BCSQ-12) further established three different subtypes of burnout: the "frenetic" subtype (related to "overload"), the "under-challenged" subtype (related to "lack of development"), and the "worn-out" subtype (related to "neglect"). However, to date, these definitions have not been applied to students. The aims of this research were (1) to adapt a Spanish version of the BCSQ-12 for use with students, (2) to test its factorial validity, internal consistency, convergent and discriminant validity, and (3) to assess potential socio-demographic and occupational risk factors associated with the development of the subtypes. Method We used a cross-sectional design on a sample of dental students (n = 314) from Santiago and Huesca universities (Spain). Participants completed the Burnout Clinical Subtype Questionnaire Student Survey (BCSQ-12-SS), the Maslach Burnout Inventory Student Survey (MBI-SS), and a series of socio-demographic and occupational questions formulated for the specific purpose of this study. Data were subjected to exploratory factor analysis (EFA) using the principal component method with varimax orthogonal rotation. To assess the relations with the criterion, we calculated the Pearson correlation coefficient (r), multiple correlation coefficient (Ry.123), and the coefficient of determination (R2y.123). To assess the association between the subtypes and the socio-demographic variables, we examined the adjusted odds ratio (OR) obtained from multivariate logistic regression models. Results Factorial analyses supported the theoretical proposition of the BCSQ-12-SS, with α-values exceeding 0.80 for all dimensions. The "overload-exhaustion" relation was r = 0.59 (p < 0.001). Socio-demographic and occupational variables were associated with the subtypes of burnout as established by the BCSQ-12-SS. As such, the BCSQ-12-SS can be used for the recognition of clinical

  19. Pipelining Computational Stages of the Tomographic Reconstructor for Multi-Object Adaptive Optics on a Multi-GPU System

    KAUST Repository

    Charara, Ali

    2014-11-01

    The European Extremely Large Telescope project (E-ELT) is one of Europe's highest priorities in ground-based astronomy. ELTs are built on top of a variety of highly sensitive and critical astronomical instruments. In particular, a new instrument called MOSAIC has been proposed to perform multi-object spectroscopy using the Multi-Object Adaptive Optics (MOAO) technique. The core of the instrument simulation lies in the intensive computation of a tomographic reconstructor (TR), which is used to drive the deformable mirror in real time from the measurements. A new numerical algorithm is proposed (1) to capture the actual experimental noise and (2) to substantially speed up previous implementations by exposing more concurrency, while reducing the number of floating-point operations. Based on the Matrices Over Runtime System at Exascale numerical library (MORSE), a dynamic scheduler drives all computational stages of the tomographic reconstructor simulation, pipelining and running tasks out of order across different stages on heterogeneous systems while ensuring data coherency and dependencies. The proposed TR simulation asymptotically outperforms previous state-of-the-art implementations, with up to a 13-fold speedup. At more than 50,000 unknowns, this appears to be the largest-scale AO problem submitted to computation to date, and it opens new research directions for extreme-scale AO simulations. © 2014 IEEE.
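The dependency-driven, out-of-order execution across pipeline stages that the abstract describes can be caricatured with a toy ready-set scheduler (task names and the two-stage/three-block layout are invented for illustration; the real system uses a runtime scheduler over GPU tasks):

```python
def schedule(tasks, deps):
    """Return an execution order: run any task whose dependencies are done,
    preferring deeper tasks so that later pipeline stages start before
    earlier stages have fully completed (out-of-order across stages)."""
    done, order = set(), []
    while len(done) < len(tasks):
        ready = [t for t in tasks if t not in done
                 and all(d in done for d in deps.get(t, ()))]
        if not ready:
            raise ValueError("cyclic dependency")
        # prefer the ready task with the most dependencies (deepest stage)
        t = max(ready, key=lambda r: len(deps.get(r, ())))
        done.add(t)
        order.append(t)
    return order

# Two pipeline stages over three data blocks: stage 2 of a block only
# needs stage 1 of the same block, so it can overtake other blocks.
tasks = ["s1_b0", "s1_b1", "s1_b2", "s2_b0", "s2_b1", "s2_b2"]
deps = {"s2_b0": ("s1_b0",), "s2_b1": ("s1_b1",), "s2_b2": ("s1_b2",)}
order = schedule(tasks, deps)
```

With these dependencies the scheduler interleaves stages (e.g. `s2_b0` runs before `s1_b2`), which is the essence of pipelining computational stages while preserving data dependencies.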

  20. Human versus Computer Controlled Selection of Ventilator Settings: An Evaluation of Adaptive Support Ventilation and Mid-Frequency Ventilation

    Science.gov (United States)

    Mireles-Cabodevila, Eduardo; Diaz-Guzman, Enrique; Arroliga, Alejandro C.; Chatburn, Robert L.

    2012-01-01

    Background. There are modes of mechanical ventilation that can select ventilator settings with computer controlled algorithms (targeting schemes). Two examples are adaptive support ventilation (ASV) and mid-frequency ventilation (MFV). We studied how clinician-chosen ventilator settings differ from those chosen by these computer algorithms under different scenarios. Methods. A survey of critical care clinicians provided reference ventilator settings for a 70 kg paralyzed patient in five clinical/physiological scenarios. The survey-derived values for minute ventilation and minute alveolar ventilation were used as goals for ASV and MFV, respectively. A lung simulator programmed with each scenario's respiratory system characteristics was ventilated using the clinician, ASV, and MFV settings. Results. Tidal volumes ranged from 6.1 to 8.3 mL/kg for the clinician, 6.7 to 11.9 mL/kg for ASV, and 3.5 to 9.9 mL/kg for MFV. Inspiratory pressures were lower for ASV and MFV. Clinician-selected tidal volumes were similar to the ASV settings for all scenarios except for asthma, in which the tidal volumes were larger for ASV and MFV. MFV delivered the same alveolar minute ventilation with higher end expiratory and lower end inspiratory volumes. Conclusions. There are differences and similarities among initial ventilator settings selected by humans and computers for various clinical scenarios. The ventilation outcomes are the result of the lung physiological characteristics and their interaction with the targeting scheme. PMID:23119152

  1. Human versus Computer Controlled Selection of Ventilator Settings: An Evaluation of Adaptive Support Ventilation and Mid-Frequency Ventilation

    Directory of Open Access Journals (Sweden)

    Eduardo Mireles-Cabodevila

    2012-01-01

    Full Text Available Background. There are modes of mechanical ventilation that can select ventilator settings with computer controlled algorithms (targeting schemes). Two examples are adaptive support ventilation (ASV) and mid-frequency ventilation (MFV). We studied how clinician-chosen ventilator settings differ from those chosen by these computer algorithms under different scenarios. Methods. A survey of critical care clinicians provided reference ventilator settings for a 70 kg paralyzed patient in five clinical/physiological scenarios. The survey-derived values for minute ventilation and minute alveolar ventilation were used as goals for ASV and MFV, respectively. A lung simulator programmed with each scenario’s respiratory system characteristics was ventilated using the clinician, ASV, and MFV settings. Results. Tidal volumes ranged from 6.1 to 8.3 mL/kg for the clinician, 6.7 to 11.9 mL/kg for ASV, and 3.5 to 9.9 mL/kg for MFV. Inspiratory pressures were lower for ASV and MFV. Clinician-selected tidal volumes were similar to the ASV settings for all scenarios except for asthma, in which the tidal volumes were larger for ASV and MFV. MFV delivered the same alveolar minute ventilation with higher end expiratory and lower end inspiratory volumes. Conclusions. There are differences and similarities among initial ventilator settings selected by humans and computers for various clinical scenarios. The ventilation outcomes are the result of the lung physiological characteristics and their interaction with the targeting scheme.

  2. Detecting short spatial scale local adaptation and epistatic selection in climate-related candidate genes in European beech (Fagus sylvatica) populations.

    Science.gov (United States)

    Csilléry, Katalin; Lalagüe, Hadrien; Vendramin, Giovanni G; González-Martínez, Santiago C; Fady, Bruno; Oddou-Muratorio, Sylvie

    2014-10-01

    Detecting signatures of selection in tree populations threatened by climate change is currently a major research priority. Here, we investigated the signature of local adaptation over a short spatial scale using 96 European beech (Fagus sylvatica L.) individuals originating from two pairs of populations on the northern and southern slopes of Mont Ventoux (south-eastern France). We performed both single and multilocus analyses of selection based on 53 climate-related candidate genes containing 546 SNPs. FST outlier methods at the SNP level revealed a weak signal of selection, with three marginally significant outliers in the northern populations. At the gene level, considering haplotypes as alleles, two additional marginally significant outliers were detected, one on each slope. To account for the uncertainty of haplotype inference, we averaged the Bayes factors over many possible phase reconstructions. Epistatic selection offers a realistic multilocus model of selection in natural populations. Here, we used a test suggested by Ohta based on the decomposition of the variance of linkage disequilibrium. Over all populations, 0.23% of the SNP pairs (haplotypes) showed evidence of epistatic selection, with nearly 80% of them being within genes. One of the between-gene epistatic selection signals arose between an FST outlier and a nonsynonymous mutation in a drought response gene. Additionally, we identified haplotypes containing selectively advantageous allele combinations which were unique to high or low elevations and northern or southern populations. Several haplotypes contained nonsynonymous mutations situated in genes with known functional importance for adaptation to climatic factors. © 2014 John Wiley & Sons Ltd.
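The FST outlier scans mentioned above build on Wright's fixation index; a minimal per-SNP computation from subpopulation allele frequencies can be sketched as follows (an illustrative textbook estimator assuming equal subpopulation sizes, not the authors' exact method):

```python
def fst(freqs):
    """Wright's FST for one biallelic SNP, given the allele frequency in
    each subpopulation: (HT - HS) / HT, where HT is the expected
    heterozygosity of the pooled population and HS the mean expected
    heterozygosity within subpopulations."""
    p_bar = sum(freqs) / len(freqs)                      # pooled frequency
    h_t = 2 * p_bar * (1 - p_bar)                        # total heterozygosity
    h_s = sum(2 * p * (1 - p) for p in freqs) / len(freqs)
    return 0.0 if h_t == 0 else (h_t - h_s) / h_t

no_diff = fst([0.5, 0.5])   # identical frequencies: FST = 0
fixed   = fst([0.0, 1.0])   # fixed difference: FST = 1
weak    = fst([0.2, 0.4])   # modest differentiation, between 0 and 1
```

Outlier tests then ask whether a locus's FST is larger than expected under the genome-wide (neutral) distribution.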

  3. From Collective Adaptive Systems to Human Centric Computation and Back: Spatial Model Checking for Medical Imaging

    Directory of Open Access Journals (Sweden)

    Gina Belmonte

    2016-07-01

    Full Text Available Recent research on formal verification for Collective Adaptive Systems (CAS pushed advancements in spatial and spatio-temporal model checking, and as a side result provided novel image analysis methodologies, rooted in logical methods for topological spaces. Medical Imaging (MI is a field where such technologies show potential for ground-breaking innovation. In this position paper, we present a preliminary investigation centred on applications of spatial model checking to MI. The focus is shifted from pure logics to a mixture of logical, statistical and algorithmic approaches, driven by the logical nature intrinsic to the specification of the properties of interest in the field. As a result, novel operators are introduced, that could as well be brought back to the setting of CAS.

  4. Exact and Adaptive Signed Distance Fields Computation for Rigid and Deformable Models on GPUs.

    Science.gov (United States)

    Liu, Fuchang; Kim, Young J

    2014-05-01

    Most techniques for real-time construction of a signed distance field, whether on a CPU or GPU, involve approximate distances. We use a GPU to build an exact adaptive distance field, constructed from an octree by using the Morton code. We use rectangle-swept spheres to construct a bounding volume hierarchy (BVH) around a triangulated model. To speed up BVH construction, we can use a multi-BVH structure to improve the workload balance between GPU processors. An upper bound on distance to the model provided by the octree itself allows us to reduce the number of BVHs involved in determining the distances from the centers of octree nodes at successively lower levels, prior to an exact distance query involving the remaining BVHs. Distance fields can be constructed 35-64 times as fast as a serial CPU implementation of a similar algorithm, allowing us to simulate a piece of fabric interacting with the Stanford Bunny at 20 frames per second.
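The Morton-code ordering of octree nodes used above can be illustrated with standard bit interleaving for 10-bit coordinates (helper names are my own; this is the classic magic-mask construction, not code from the paper):

```python
def part1by2(n):
    """Spread the low 10 bits of n so consecutive bits are 3 apart."""
    n &= 0x000003FF
    n = (n ^ (n << 16)) & 0xFF0000FF
    n = (n ^ (n << 8))  & 0x0300F00F
    n = (n ^ (n << 4))  & 0x030C30C3
    n = (n ^ (n << 2))  & 0x09249249
    return n

def morton3(x, y, z):
    """Interleave coordinate bits: x in bits 0,3,6,..., y in 1,4,...,
    z in 2,5,... The result linearizes octree cells in Z-order."""
    return part1by2(x) | (part1by2(y) << 1) | (part1by2(z) << 2)
```

Sorting cells by `morton3` keeps spatially nearby octree nodes nearby in memory, which is what makes the code useful for GPU-side octree construction and traversal.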

  5. Adaptable structural synthesis using advanced analysis and optimization coupled by a computer operating system

    Science.gov (United States)

    Sobieszczanski-Sobieski, J.; Bhat, R. B.

    1979-01-01

    A finite element program is linked with a general purpose optimization program in a 'programming system' which includes user supplied codes that contain problem dependent formulations of the design variables, objective function and constraints. The result is a system adaptable to a wide spectrum of structural optimization problems. In a sample of numerical examples, the design variables are the cross-sectional dimensions and the parameters of overall shape geometry, constraints are applied to stresses, displacements, buckling and vibration characteristics, and structural mass is the objective function. Thin-walled, built-up structures and frameworks are included in the sample. Details of the system organization and characteristics of the component programs are given.

  6. Synaptic plasticity in medial vestibular nucleus neurons: comparison with computational requirements of VOR adaptation.

    Directory of Open Access Journals (Sweden)

    John R W Menzies

    Full Text Available BACKGROUND: Vestibulo-ocular reflex (VOR) gain adaptation, a longstanding experimental model of cerebellar learning, utilizes sites of plasticity in both cerebellar cortex and brainstem. However, the mechanisms by which the activity of cortical Purkinje cells may guide synaptic plasticity in brainstem vestibular neurons are unclear. Theoretical analyses indicate that vestibular plasticity should depend upon the correlation between Purkinje cell and vestibular afferent inputs, so that, in gain-down learning for example, increased cortical activity should induce long-term depression (LTD) at vestibular synapses. METHODOLOGY/PRINCIPAL FINDINGS: Here we expressed this correlational learning rule in its simplest form, as an anti-Hebbian, heterosynaptic spike-timing dependent plasticity interaction between excitatory (vestibular) and inhibitory (floccular) inputs converging on medial vestibular nucleus (MVN) neurons (input-spike-timing dependent plasticity, iSTDP). To test this rule, we stimulated vestibular afferents to evoke EPSCs in rat MVN neurons in vitro. Control EPSC recordings were followed by an induction protocol where membrane hyperpolarizing pulses, mimicking IPSPs evoked by flocculus inputs, were paired with single vestibular nerve stimuli. A robust LTD developed at vestibular synapses when the afferent EPSPs coincided with membrane hyperpolarization, while EPSPs occurring before or after the simulated IPSPs induced no lasting change. Furthermore, the iSTDP rule also successfully predicted the effects of a complex protocol using EPSP trains designed to mimic classical conditioning. CONCLUSIONS: These results, in strong support of theoretical predictions, suggest that the cerebellum alters the strength of vestibular synapses on MVN neurons through hetero-synaptic, anti-Hebbian iSTDP. Since the iSTDP rule does not depend on post-synaptic firing, it suggests a possible mechanism for VOR adaptation without compromising gaze-holding and VOR
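The qualitative iSTDP finding (LTD maximal when the afferent EPSP coincides with inhibition, no lasting change when they are well separated) can be sketched as a coincidence-window kernel. Function names, the Gaussian window shape, and all parameter values are illustrative, not fitted to the recordings:

```python
import math

def istdp_dw(dt_ms, ltd_max=0.4, window_ms=10.0):
    """Toy anti-Hebbian iSTDP kernel: maximal depression (negative
    weight change) when the EPSP and the simulated IPSP coincide
    (dt_ms ~ 0), decaying to ~no change for large |dt_ms|."""
    return -ltd_max * math.exp(-(dt_ms / window_ms) ** 2)

def update_weight(w, dt_ms):
    """Apply one pairing; synaptic weights stay non-negative."""
    return max(0.0, w + istdp_dw(dt_ms))
```

Because the rule depends only on input timing (not on postsynaptic spikes), repeated coincident pairings depress the vestibular synapse while temporally separated pairings leave it essentially unchanged.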

  7. Burnout syndrome among dental students: a short version of the "Burnout Clinical Subtype Questionnaire" adapted for students (BCSQ-12-SS

    Directory of Open Access Journals (Sweden)

    Montero-Marin Jesus

    2011-12-01

    Full Text Available Abstract Background Burnout has been traditionally defined in relation to the dimensions of "exhaustion", "cynicism", and "inefficiency". More recently, the Burnout Clinical Subtype Questionnaire (BCSQ-12) further established three different subtypes of burnout: the "frenetic" subtype (related to "overload"), the "under-challenged" subtype (related to "lack of development"), and the "worn-out" subtype (related to "neglect"). However, to date, these definitions have not been applied to students. The aims of this research were (1) to adapt a Spanish version of the BCSQ-12 for use with students, (2) to test its factorial validity, internal consistency, convergent and discriminant validity, and (3) to assess potential socio-demographic and occupational risk factors associated with the development of the subtypes. Method We used a cross-sectional design on a sample of dental students (n = 314) from Santiago and Huesca universities (Spain). Participants completed the Burnout Clinical Subtype Questionnaire Student Survey (BCSQ-12-SS), the Maslach Burnout Inventory Student Survey (MBI-SS), and a series of socio-demographic and occupational questions formulated for the specific purpose of this study. Data were subjected to exploratory factor analysis (EFA) using the principal component method with varimax orthogonal rotation. To assess the relations with the criterion, we calculated the Pearson correlation coefficient (r), multiple correlation coefficient (Ry.123), and the coefficient of determination (R2y.123). To assess the association between the subtypes and the socio-demographic variables, we examined the adjusted odds ratio (OR) obtained from multivariate logistic regression models. Results Factorial analyses supported the theoretical proposition of the BCSQ-12-SS, with α-values exceeding 0.80 for all dimensions. The "overload-exhaustion" relation was r = 0.59 (p < 0.001). The subtypes explained 38.44% of the variance in "exhaustion" (Ry.123 = 0.62), 30.25% in "cynicism" (Ry.123 = 0.55), and 26.01% in "inefficiency" (Ry.123 = 0.51).

  8. Short-term wind speed forecasting by an adaptive network-based fuzzy inference system (ANFIS: an attempt towards an ensemble forecasting method

    Directory of Open Access Journals (Sweden)

    Moslem Yousefi

    2015-12-01

    Full Text Available Accurate wind speed forecasting has a vital role in the efficient utilization of wind farms. Wind forecasting can be performed for long or short time horizons. Given the volatile nature of wind and its dependence on many geographical parameters, it is difficult for traditional methods to provide a reliable forecast of wind speed time series. In this study, an attempt is made to establish an efficient adaptive network-based fuzzy inference system (ANFIS) for short-term wind speed forecasting. Using the available data sets in the literature, the ANFIS network is constructed and tested, and the results are compared with those of a regular neural network that forecasted the same dataset in previous studies. To avoid a trial-and-error process for the selection of the ANFIS input data, the results of the autocorrelation function (ACF) and partial autocorrelation function (PACF) on the historical wind speed data are employed. The available data set is divided into two parts: 50% for training and 50% for testing and validation. The testing part of the data set is used merely for assessing the performance of the network, which guarantees that only unseen data are used to evaluate forecasting performance. On the other hand, the validation data can be used for parameter-setting of the network if required. The results indicate that ANFIS could not outperform the ANN in short-term wind speed forecasting, though its results are competitive. The two methods are then hybridized, by simple weighting, and the hybrid method shows a slight improvement over both the ANN and ANFIS results. Therefore, the goal of future studies could be implementing ANFIS and ANNs in a more comprehensive ensemble method which could ultimately be more robust and accurate.
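The weight-based hybridization of the two forecasters can be sketched as a one-parameter convex combination whose mixing weight is tuned on validation data (the numbers below are synthetic, purely for illustration):

```python
def rmse(pred, actual):
    """Root-mean-square forecast error."""
    return (sum((p - a) ** 2 for p, a in zip(pred, actual)) / len(actual)) ** 0.5

def hybrid(f1, f2, w):
    """Convex combination of two forecasts: w*f1 + (1 - w)*f2."""
    return [w * a + (1 - w) * b for a, b in zip(f1, f2)]

def best_weight(f1, f2, actual, steps=100):
    """Grid-search the mixing weight minimizing validation RMSE."""
    return min((i / steps for i in range(steps + 1)),
               key=lambda w: rmse(hybrid(f1, f2, w), actual))

# Illustrative case: 'ANN' biased high and 'ANFIS' biased low by 0.4 m/s.
actual = [5.0, 6.2, 7.1, 6.8, 5.9]
ann    = [x + 0.4 for x in actual]
anfis  = [x - 0.4 for x in actual]
w = best_weight(ann, anfis, actual)
```

With opposite biases the search settles on an even mix, and the hybrid beats either individual forecast, which is the effect the abstract reports in weaker form on real data.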

  9. Adaptation of PyFlag to Efficient Analysis of Overtaken Computer Data Storage

    Directory of Open Access Journals (Sweden)

    Aleksander Byrski

    2010-03-01

    Full Text Available Based on existing software aimed at investigation support in the analysis of computer data storage overtaken during investigation (PyFlag), an extension is proposed involving the introduction of dedicated components for data identification and filtering. Hash codes for popular software contained in the NIST/NSRL database are considered in order to avoid unwanted files while searching and to classify them into several categories. The extension allows for further analysis, e.g. using artificial intelligence methods. The considerations are illustrated by an overview of the system's design.
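The NSRL-style known-file filtering can be sketched with stdlib hashing: digest each file on the seized storage and look it up in a known-hash set so that standard software is excluded from further analysis. The file names, contents, and the two-entry "database" below are made up; the real NSRL Reference Data Set distributes precomputed digests:

```python
import hashlib

def sha1_hex(data: bytes) -> str:
    """SHA-1 digest as a hex string (the digest type published by NSRL)."""
    return hashlib.sha1(data).hexdigest()

def classify_files(files, known_hashes):
    """Label each file by looking up its digest in a known-file database;
    matches can be filtered out of the search, the rest are kept
    for further analysis as 'unknown'."""
    return {name: known_hashes.get(sha1_hex(data), "unknown")
            for name, data in files.items()}

# Hypothetical seized-storage contents and a one-entry 'NSRL' excerpt.
files = {"notepad.exe": b"standard os binary",
         "secret.txt":  b"evidence candidate"}
known = {sha1_hex(b"standard os binary"): "known-os-file"}
labels = classify_files(files, known)
```

Only the files labelled "unknown" would then be passed to the more expensive analysis stages the extension adds.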

  10. Computational fluid dynamics assisted characterization of parafoveal hemodynamics in normal and diabetic eyes using adaptive optics scanning laser ophthalmoscopy.

    Science.gov (United States)

    Lu, Yang; Bernabeu, Miguel O; Lammer, Jan; Cai, Charles C; Jones, Martin L; Franco, Claudio A; Aiello, Lloyd Paul; Sun, Jennifer K

    2016-12-01

    Diabetic retinopathy (DR) is the leading cause of visual loss in working-age adults worldwide. Previous studies have found hemodynamic changes in the diabetic eyes, which precede clinically evident pathological alterations of the retinal microvasculature. There is a pressing need for new methods to allow greater understanding of these early hemodynamic changes that occur in DR. In this study, we propose a noninvasive method for the assessment of hemodynamics around the fovea (a region of the eye of paramount importance for vision). The proposed methodology combines adaptive optics scanning laser ophthalmoscopy and computational fluid dynamics modeling. We compare results obtained with this technique with in vivo measurements of blood flow based on blood cell aggregation tracking. Our results suggest that parafoveal hemodynamics, such as capillary velocity, wall shear stress, and capillary perfusion pressure can be noninvasively and reliably characterized with this method in both healthy and diabetic retinopathy patients.

  11. Initial experience with adaptive iterative dose reduction 3D to reduce radiation dose in computed tomographic urography.

    Science.gov (United States)

    Juri, Hiroshi; Matsuki, Mitsuru; Itou, Yasushi; Inada, Yuki; Nakai, Go; Azuma, Haruhito; Narumi, Yoshifumi

    2013-01-01

    This study aimed to investigate the feasibility of low-dose computed tomographic (CT) urography with adaptive iterative dose reduction 3D (AIDR 3D). Thirty patients underwent routine-dose CT scans with filtered back projection and low-dose CT scans with AIDR 3D in the excretory phase of CT urography. Visual evaluations were performed with respect to internal image noise, sharpness, streak artifacts, and diagnostic acceptability. Quantitative measures of the image noise and radiation dose were also obtained. All results were compared on the basis of body mass index (BMI). At visual evaluation, streak artifacts in the urinary bladder were statistically weaker on low-dose CT than on routine-dose CT in the axial and coronal images (P < 0.05). Low-dose CT urography with AIDR 3D allows a 45% reduction of radiation dose without degrading image quality in the excretory phase, independent of BMI.

  12. Computations of Unsteady Viscous Compressible Flows Using Adaptive Mesh Refinement in Curvilinear Body-fitted Grid Systems

    Science.gov (United States)

    Steinthorsson, E.; Modiano, David; Colella, Phillip

    1994-01-01

    A methodology for accurate and efficient simulation of unsteady, compressible flows is presented. The cornerstones of the methodology are a special discretization of the Navier-Stokes equations on structured body-fitted grid systems and an efficient solution-adaptive mesh refinement technique for structured grids. The discretization employs an explicit multidimensional upwind scheme for the inviscid fluxes and an implicit treatment of the viscous terms. The mesh refinement technique is based on the AMR algorithm of Berger and Colella. In this approach, cells on each level of refinement are organized into a small number of topologically rectangular blocks, each containing several thousand cells. The small number of blocks leads to small overhead in managing data, while their size and regular topology means that a high degree of optimization can be achieved on computers with vector processors.
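The Berger–Colella style flag-and-cluster step behind such solution-adaptive refinement can be illustrated in 1D: flag cells where the solution jumps sharply, then cluster contiguous flagged cells into refinement blocks. The threshold and data are invented for illustration; the real algorithm works on 2D/3D logically rectangular patches:

```python
def flag_cells(u, tol):
    """Flag cell i when the jump to its right neighbour exceeds tol."""
    return [i for i in range(len(u) - 1) if abs(u[i + 1] - u[i]) > tol]

def cluster(flags):
    """Merge runs of contiguous flagged cells into (start, end) blocks,
    mimicking the grouping of fine cells into rectangular patches."""
    blocks = []
    for i in flags:
        if blocks and i == blocks[-1][1] + 1:
            blocks[-1] = (blocks[-1][0], i)   # extend the current block
        else:
            blocks.append((i, i))             # open a new block
    return blocks

# A step profile: only the cells around the discontinuity get refined.
u = [1.0] * 8 + [5.0, 9.0] + [10.0] * 6
blocks = cluster(flag_cells(u, 0.5))
```

Keeping the number of blocks small while each block holds many cells is exactly the trade-off the abstract cites: low management overhead, and regular per-block data layouts that vectorize well.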

  13. Hybrid Direct and Iterative Solver with Library of Multi-criteria Optimal Orderings for h Adaptive Finite Element Method Computations

    KAUST Repository

    AbouEisha, Hassan M.

    2016-06-02

    In this paper we present a multi-criteria optimization of element partition trees and resulting orderings for multi-frontal solver algorithms executed for the two-dimensional h adaptive finite element method. In particular, the problem of optimal ordering of elimination of rows in the sparse matrices resulting from adaptive finite element method computations is reduced to the problem of finding optimal element partition trees. Given a two-dimensional h refined mesh, we find all optimal element partition trees by using the dynamic programming approach. An element partition tree defines a prescribed order of elimination of degrees of freedom over the mesh. We utilize three different metrics to estimate the quality of the element partition tree. As the first criterion we consider the number of floating-point operations (FLOPs) performed by the multi-frontal solver. As the second criterion we consider the number of memory transfers (MEMOPS) performed by the multi-frontal solver algorithm. As the third criterion we consider the memory usage (NONZEROS) of the multi-frontal direct solver. We show the optimization results for FLOPs vs MEMOPS as well as for the execution time estimated as FLOPs + 100*MEMOPS vs NONZEROS. We obtain Pareto fronts with multiple optimal trees, for each mesh and for each refinement level. We generate a library of optimal elimination trees for small grids with local singularities. We also propose an algorithm for a large mesh with identified local sub-grids, each with a local singularity: we compute Schur complements over the sub-grids using the optimal trees from the library, and we submit the sequence of Schur complements to the iterative solver ILUPCG.

  14. The transition of the national certification examination from paper and pencil to computer adaptive testing.

    Science.gov (United States)

    Zaglaniczny, K L

    1996-02-01

    The Council on Certification of Nurse Anesthetists (CCNA) has been exploring computerized adaptive testing (CAT) for the national certification examination (NCE) over the past several years. CCNA representatives have consulted with experts in testing and with individuals from professional associations who use CAT for certification or licensure testing. This article will provide an overview of CAT and discuss how the CCNA plans to implement CAT for the NCE beginning April 8, 1996. A future article that explains the theoretical concepts of CAT will be published in the April 1996 AANA Journal. It is important to note that the NCE will not be a new test; the current content outline and item bank will remain the same. It is only the method of test administration that is changed--from paper and pencil to CAT. Each candidate will answer questions and take a test that is individualized to his or her ability or competence level and meets the specifications of the test outline. All candidates must achieve the same passing score. The implementation of CAT for the NCE will be advantageous for the candidates and provide a more efficient competency assessment. The last paper and pencil examination was administered on December 9, 1995. The transition is a significant event in nurse anesthesia history: just as nurse anesthesia was the first advanced practice nursing specialty to implement the certification credential, the CCNA will be the first to introduce CAT.

  15. Adaptiveness in monotone pseudo-Boolean optimization and stochastic neural computation.

    Science.gov (United States)

    Grossi, Giuliano

    2009-08-01

    Hopfield neural network (HNN) is a nonlinear computational model successfully applied in finding near-optimal solutions of several difficult combinatorial problems. In many cases, the network energy function is obtained through a learning procedure so that its minima are states falling into a proper subspace (feasible region) of the search space. However, because of the network nonlinearity, a number of undesirable local energy minima emerge from the learning procedure, significantly affecting the network performance. In the neural model analyzed here, we combine both a penalty and a stochastic process in order to enhance the performance of a binary HNN. The penalty strategy allows us to gradually lead the search towards states representing feasible solutions, so avoiding oscillatory behaviors or asymptotically unstable convergence. The presence of stochastic dynamics potentially prevents the network from falling into shallow local minima of the energy function, i.e., minima far from the global optimum. Hence, for a given fixed network topology, the desired final distribution on the states can be reached by carefully modulating such a process. The model uses pseudo-Boolean functions both to express problem constraints and the cost function; a combination of these two functions is then interpreted as the energy of the neural network. A wide variety of NP-hard problems fall in the class of problems that can be solved by the model at hand, particularly those having a monotonic quadratic pseudo-Boolean function as constraint function. That is, functions easily derived by closed algebraic expressions representing the constraint structure and easy (polynomial time) to maximize.
We show the asymptotic convergence properties of this model characterizing its state space distribution at thermal equilibrium in terms of Markov chain and give evidence of its ability to find high quality solutions on benchmarks and randomly generated instances of two specific problems taken from the computational graph
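The stochastic dynamics on a quadratic pseudo-Boolean energy can be caricatured with single-bit Metropolis updates and geometric cooling. The 4-variable instance, the cooling schedule, and all constants below are toy values for illustration, not the paper's model:

```python
import math
import random

def energy(x, Q):
    """Quadratic pseudo-Boolean energy E(x) = sum_{i<j} Q[i][j]*x_i*x_j."""
    n = len(x)
    return sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(i + 1, n))

def stochastic_search(Q, n, steps=2000, t0=2.0, cooling=0.999, seed=1):
    """Single-bit Metropolis flips: downhill moves are always accepted,
    uphill moves with probability exp(-dE/T), so the search can climb
    out of shallow local minima while the temperature decays."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]
    e, t = energy(x, Q), t0
    best_x, best_e = list(x), e
    for _ in range(steps):
        i = rng.randrange(n)
        x[i] ^= 1                          # propose flipping one bit
        de = energy(x, Q) - e
        if de <= 0 or rng.random() < math.exp(-de / t):
            e += de                        # accept the flip
            if e < best_e:
                best_x, best_e = list(x), e
        else:
            x[i] ^= 1                      # reject: undo the flip
        t *= cooling
    return best_x, best_e

# Toy instance: two rewarded pairs (negative couplings) and a penalty
# (positive couplings) discouraging mixing them.
Q = [[0, -2, 1,  0],
     [0,  0, 0,  1],
     [0,  0, 0, -2],
     [0,  0, 0,  0]]
best_x, best_e = stochastic_search(Q, n=4)
```

On this tiny landscape the global minimum energy is -2, and the annealed search reaches it; the penalty terms play the role the abstract assigns them, steering the walk away from infeasible bit patterns.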

  16. An adaptive maneuvering logic computer program for the simulation of one-on-one air-to-air combat. Volume 1: General description

    Science.gov (United States)

    Burgin, G. H.; Fogel, L. J.; Phelps, J. P.

    1975-01-01

    A technique for computer simulation of air combat is described. Volume 1 describes the computer program and its development in general terms. Two versions of the program exist. Both incorporate a logic for selecting and executing air combat maneuvers with performance models of specific fighter aircraft. In the batch processing version, the flight paths of two aircraft engaged in interactive aerial combat and controlled by the same logic are computed. The real-time version permits human pilots to fly air-to-air combat against the adaptive maneuvering logic (AML) in the Langley Differential Maneuvering Simulator (DMS). Volume 2 consists of a detailed description of the computer programs.

  17. Improving Inpatient Surveys: Web-Based Computer Adaptive Testing Accessed via Mobile Phone QR Codes.

    Science.gov (United States)

    Chien, Tsair-Wei; Lin, Weir-Sen

    2016-03-02

    The National Health Service (NHS) 70-item inpatient questionnaire surveys inpatients on their perceptions of their hospitalization experience. However, it imposes more burden on the patient than other similar surveys. The literature shows that computerized adaptive testing (CAT) based on item response theory can help shorten the item length of a questionnaire without compromising its precision. Our aim was to investigate whether CAT can be (1) efficient with item reduction and (2) used with quick response (QR) codes scanned by mobile phones. After downloading the 2008 inpatient survey data from the Picker Institute Europe website and analyzing the difficulties of this 70-item questionnaire, we used an author-made Excel program using the Rasch partial credit model to simulate 1000 patients' true scores following a standard normal distribution. The CAT was compared to two other scenarios of answering all items (AAI) and the randomized selection method (RSM), as we investigated item length (efficiency) and measurement accuracy. The author-made Web-based CAT program for gathering patient feedback was effectively accessed from mobile phones by scanning the QR code. We found that the CAT can be more efficient for patients answering questions (ie, fewer items to respond to) than either AAI or RSM without compromising its measurement accuracy. A Web-based CAT inpatient survey accessed by scanning a QR code on a mobile phone was viable for gathering inpatient satisfaction responses. With advances in technology, patients can now be offered alternatives for providing feedback about hospitalization satisfaction. This Web-based CAT is a possible option in health care settings for reducing the number of survey items, as well as offering an innovative QR code access.
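Under a Rasch model like the one used in this simulation, the most informative next item is the one whose difficulty is closest to the current ability estimate. A minimal item-selection sketch (the five-item bank and the starting ability are invented; the study used the partial credit model, a polytomous generalization of this dichotomous form):

```python
import math

def p_correct(theta, b):
    """Rasch model: probability of endorsing an item of difficulty b
    at ability theta."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def item_information(theta, b):
    """Fisher information of a Rasch item, p*(1-p); maximal when b == theta."""
    p = p_correct(theta, b)
    return p * (1 - p)

def next_item(theta, bank, used):
    """CAT step: administer the unused item with maximum information
    at the current ability estimate."""
    candidates = [i for i in range(len(bank)) if i not in used]
    return max(candidates, key=lambda i: item_information(theta, bank[i]))

bank = [-2.0, -1.0, 0.0, 1.0, 2.0]   # item difficulties (logits)
first = next_item(0.3, bank, set())  # item closest to theta = 0.3
```

After each response the ability estimate is updated and the selection repeats; a standard-error stopping rule ends the test once 1/sqrt(accumulated information) falls below a chosen threshold, which is how CATs cut item counts without losing precision.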

  18. Comparative Analysis of Classifiers for Developing an Adaptive Computer-Assisted EEG Analysis System for Diagnosing Epilepsy

    Directory of Open Access Journals (Sweden)

    Malik Anas Ahmad

    2015-01-01

    Full Text Available Computer-assisted analysis of electroencephalogram (EEG) has a tremendous potential to assist clinicians during the diagnosis of epilepsy. These systems are trained to classify the EEG based on the ground truth provided by neurologists. There should therefore be a mechanism in these systems through which a system’s incorrect markings can be flagged, so that the system improves its classification by learning from them. We have developed a simple mechanism that lets neurologists improve the classification rate whenever they encounter a false classification. The system takes the discrete wavelet transform (DWT) of the signal epochs, reduces the coefficients using principal component analysis, and feeds them into a classifier. After discussing our approach, we show the classification performance of three types of classifiers: support vector machine (SVM), quadratic discriminant analysis, and artificial neural network. We found the SVM to be the best-performing classifier. Our work demonstrates the importance and viability of a self-improving, user-adapting computer-assisted EEG analysis system for diagnosing epilepsy that processes each channel independently, along with a performance comparison of different machine learning techniques within the suggested system.

  19. Comparative analysis of classifiers for developing an adaptive computer-assisted EEG analysis system for diagnosing epilepsy.

    Science.gov (United States)

    Ahmad, Malik Anas; Ayaz, Yasar; Jamil, Mohsin; Omer Gillani, Syed; Rasheed, Muhammad Babar; Imran, Muhammad; Khan, Nadeem Ahmed; Majeed, Waqas; Javaid, Nadeem

    2015-01-01

    Computer-assisted analysis of electroencephalogram (EEG) has a tremendous potential to assist clinicians during the diagnosis of epilepsy. These systems are trained to classify the EEG based on the ground truth provided by neurologists. There should therefore be a mechanism in these systems through which a system's incorrect markings can be flagged, so that the system improves its classification by learning from them. We have developed a simple mechanism that lets neurologists improve the classification rate whenever they encounter a false classification. The system takes the discrete wavelet transform (DWT) of the signal epochs, reduces the coefficients using principal component analysis, and feeds them into a classifier. After discussing our approach, we show the classification performance of three types of classifiers: support vector machine (SVM), quadratic discriminant analysis, and artificial neural network. We found the SVM to be the best-performing classifier. Our work demonstrates the importance and viability of a self-improving, user-adapting computer-assisted EEG analysis system for diagnosing epilepsy that processes each channel independently, along with a performance comparison of different machine learning techniques within the suggested system.
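
    The processing chain described above (wavelet features per epoch, PCA reduction, classification) can be illustrated with a toy sketch. Everything below is illustrative only: a one-level Haar transform stands in for the paper's DWT, PCA is done via SVD, a nearest-centroid rule replaces the SVM to keep the example dependency-free, and the synthetic "epochs" are sinusoids, not EEG.

```python
import numpy as np

rng = np.random.default_rng(1)

def haar_dwt(x):
    """One-level Haar DWT: concatenated approximation and detail coefficients."""
    x = x[: len(x) // 2 * 2].reshape(-1, 2)
    approx = (x[:, 0] + x[:, 1]) / np.sqrt(2)
    detail = (x[:, 0] - x[:, 1]) / np.sqrt(2)
    return np.concatenate([approx, detail])

def pca_reduce(features, n_components):
    """Project mean-centred feature rows onto the top principal components."""
    centred = features - features.mean(axis=0)
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    return centred @ vt[:n_components].T

# synthetic "epochs": slow background activity vs. fast spiky activity
t = np.linspace(0, 1, 128)
slow = np.array([np.sin(2 * np.pi * 3 * t) + 0.1 * rng.standard_normal(128) for _ in range(20)])
fast = np.array([np.sin(2 * np.pi * 30 * t) + 0.1 * rng.standard_normal(128) for _ in range(20)])
epochs = np.vstack([slow, fast])
labels = np.array([0] * 20 + [1] * 20)

features = np.array([haar_dwt(e) for e in epochs])
reduced = pca_reduce(features, 3)

# nearest-centroid stand-in for the SVM classifier
centroids = np.array([reduced[labels == c].mean(axis=0) for c in (0, 1)])
pred = np.argmin(np.linalg.norm(reduced[:, None, :] - centroids[None], axis=2), axis=1)
accuracy = (pred == labels).mean()
```

    The fast class produces much larger Haar detail coefficients, so even this crude pipeline separates the two waveforms cleanly, which is the shape of the argument the paper makes with real EEG and an SVM.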

  20. Adapting computational optimization concepts from aeronautics to nuclear fusion reactor design

    Directory of Open Access Journals (Sweden)

    Baelmans M.

    2012-10-01

    Full Text Available Even on the most powerful supercomputers available today, computational nuclear fusion reactor divertor design is extremely CPU demanding, not least due to the large number of design variables and the hybrid micro-macro character of the flows. Therefore, automated design methods based on optimization can greatly assist current reactor design studies. Over the past decades, “adjoint methods” for shape optimization have proven their virtue in the field of aerodynamics. Applications include drag reduction for wing and wing-body configurations. Here we demonstrate that these optimization methods also have a large potential for divertor design. Specifically, we apply the continuous adjoint method to the optimization of the divertor geometry in a 2D poloidal cross section of an axisymmetric tokamak device (as, e.g., JET and ITER), using a simplified model for the plasma edge. The design objective is to spread the target material heat load as much as possible by controlling the shape of the divertor, while maintaining the full helium ash removal capabilities of the vacuum pumping system.
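
    The adjoint idea the authors borrow from aerodynamics can be shown on a toy discrete problem (this is not the plasma-edge model): for a state equation A u = b(theta) and objective J = c^T u, one extra linear solve with A^T yields the full gradient, however many design variables theta contains, which is why adjoint methods pay off when design variables are numerous.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5
A = np.eye(n) * 4 + 0.1 * rng.standard_normal((n, n))  # well-conditioned state operator
c = rng.standard_normal(n)                             # objective weights, J = c @ u

def b(theta):
    """Design-dependent right-hand side (purely illustrative)."""
    return np.array([np.sin(theta * (i + 1)) for i in range(n)])

def db_dtheta(theta):
    return np.array([(i + 1) * np.cos(theta * (i + 1)) for i in range(n)])

def objective(theta):
    u = np.linalg.solve(A, b(theta))   # forward (state) solve
    return c @ u

def adjoint_gradient(theta):
    lam = np.linalg.solve(A.T, c)      # ONE adjoint solve, independent of theta
    return lam @ db_dtheta(theta)      # dJ/dtheta = lam . db/dtheta

theta0 = 0.7
fd = (objective(theta0 + 1e-6) - objective(theta0 - 1e-6)) / 2e-6  # check value
```

    The finite-difference check `fd` agrees with the adjoint gradient; with many design variables, finite differences would need one forward solve per variable while the adjoint still needs only one extra solve.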

  1. Adaptive thresholding of chest temporal subtraction images in computer-aided diagnosis of pathologic change

    Science.gov (United States)

    Harrison, Melanie; Looper, Jared; Armato, Samuel G.

    2016-03-01

    Radiologists frequently use chest radiographs acquired at different times to diagnose a patient by identifying regions of change. Temporal subtraction (TS) images are formed when a computer warps one radiographic image to register with another and then subtracts one from the other, accentuating regions of change. The purpose of this study was to create a computer-aided diagnostic (CAD) system to threshold chest TS images and identify candidate regions of pathologic change. Each thresholding technique created two different candidate regions: light and dark. Light regions have a high gray-level mean, while dark regions have a low gray-level mean; areas with no change appear as medium-gray pixels. Ten different thresholding techniques were examined and compared. By thresholding light and dark candidate regions separately, the number of properly thresholded regions improved. Thresholding light and dark regions separately produced fewer overall candidate regions that included more regions of actual pathologic change than global thresholding of the image. Overall, the moment-preserving method produced the best results for light regions, while the normal distribution method produced the best results for dark regions. Separation of light and dark candidate regions by thresholding shows potential as the first step in creating a CAD system to detect pathologic change in chest TS images.
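
    A minimal sketch of the light/dark split, with hypothetical percentile cutoffs standing in for the moment-preserving and normal-distribution methods the study actually compares:

```python
import numpy as np

def threshold_light_dark(ts_image, low_pct=10, high_pct=90):
    """Split a temporal-subtraction image into dark and light candidate masks.

    Unchanged anatomy appears as medium gray; the percentile cutoffs flag
    the two tails of the gray-level distribution as candidate regions of change.
    """
    low, high = np.percentile(ts_image, [low_pct, high_pct])
    dark = ts_image < low     # low gray-level mean: dark candidate regions
    light = ts_image > high   # high gray-level mean: light candidate regions
    return dark, light

# synthetic subtraction image: medium-gray background, one dark and one light patch
img = np.full((64, 64), 128.0)
img[10:20, 10:20] = 30.0    # simulated region that went dark after subtraction
img[40:50, 40:50] = 220.0   # simulated region that went light after subtraction
dark_mask, light_mask = threshold_light_dark(img)
```

    Thresholding the two tails separately, as in the study, lets each cutoff be tuned to its own gray-level population instead of forcing one global threshold on both.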

  2. Adaptation Computing Parameters of Pan-Tilt-Zoom Cameras for Traffic Monitoring

    Directory of Open Access Journals (Sweden)

    Ya Lin WU

    2014-01-01

    Full Text Available Closed-circuit television (CCTV) cameras have been widely used in recent years for traffic monitoring and surveillance applications. CCTV cameras can automatically extract real-time traffic parameters using image processing and tracking technologies. In particular, pan-tilt-zoom (PTZ) cameras provide flexible view selection as well as a wider observation range, which allows traffic parameters to be calculated accurately. Calibration of the PTZ camera parameters therefore plays an important role in vision-based traffic applications. However, in specific situations, such as locating the license plate of an illegally parked vehicle, the PTZ camera parameters must be updated according to the position and distance of the vehicle. The proposed traffic monitoring system uses an ordinary webcam and a PTZ camera. The vanishing point of the traffic lane lines is obtained in the pixel-based coordinate system from the fixed webcam. The PTZ camera parameters are initialized from the monitored distance, the specific objectives, and the vanishing point. The pixel coordinates of the illegally parked car are then used to update the PTZ camera parameters, recover the car's real-world coordinates, and compute its distance. Results show that the error between the measured and actual distance is only 0.2064 m.
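
    A simplified flat-ground pinhole model illustrates the distance-computation step. All parameter names below are hypothetical, and the paper's method additionally uses the lane-line vanishing point for calibration; this is only the basic geometry.

```python
import math

def ground_distance(cam_height_m, tilt_rad, focal_px, v_px, principal_v_px):
    """Estimate horizontal ground distance to the point imaged at image row v_px.

    Flat-ground pinhole model: the ray through row v_px makes an angle of
    tilt + atan((v - v0) / f) below the horizon and intersects the ground at
    a horizontal distance of h / tan(angle). Rows below the principal point
    (larger v) map to nearer ground points.
    """
    angle = tilt_rad + math.atan((v_px - principal_v_px) / focal_px)
    return cam_height_m / math.tan(angle)
```

    For a camera 5 m high tilted 45 degrees down, a point imaged at the principal point lies 5 m away horizontally, and points imaged lower in the frame are nearer.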

  3. Senior Adults and Computers in the 1990s.

    Science.gov (United States)

    Lawhon, Tommie; And Others

    1996-01-01

    Older adults use computers for entertainment, education, and creative and business endeavors. Computer training helps them increase productivity, learn skills, and boost short-term memory. Electronic mail, online services, and the Internet encourage socialization. Adapted technology helps disabled and ill elders use computers. (SK)

  4. Appraisal of adaptive neuro-fuzzy computing technique for estimating anti-obesity properties of a medicinal plant.

    Science.gov (United States)

    Kazemipoor, Mahnaz; Hajifaraji, Majid; Radzi, Che Wan Jasimah Bt Wan Mohamed; Shamshirband, Shahaboddin; Petković, Dalibor; Mat Kiah, Miss Laiha

    2015-01-01

    This research examines the precision of an adaptive neuro-fuzzy computing technique in estimating the anti-obesity property of a potent medicinal plant in a clinical dietary intervention. Although a number of mathematical functions, such as SPSS analysis, have been proposed for modeling the estimation of anti-obesity properties in terms of reduction in body mass index (BMI), body fat percentage, and body weight loss, these models still have disadvantages, such as very demanding calculation times. Since this is a crucial problem, this paper constructs a process that simulates the anti-obesity activities of caraway (Carum carvi), a traditional medicine, on obese women with the adaptive neuro-fuzzy inference system (ANFIS) method. The ANFIS results are compared with support vector regression (SVR) results using root-mean-square error (RMSE) and the coefficient of determination (R(2)). The experimental results show that the ANFIS approach achieves improved predictive accuracy and capability of generalization. The following statistical characteristics are obtained for BMI loss estimation: RMSE=0.032118 and R(2)=0.9964 in ANFIS testing and RMSE=0.47287 and R(2)=0.361 in SVR testing. For fat loss estimation: RMSE=0.23787 and R(2)=0.8599 in ANFIS testing and RMSE=0.32822 and R(2)=0.7814 in SVR testing. For weight loss estimation: RMSE=0.00000035601 and R(2)=1 in ANFIS testing and RMSE=0.17192 and R(2)=0.6607 in SVR testing. The method can therefore be applied for practical purposes. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
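
    The two accuracy metrics used to compare ANFIS and SVR are standard and easy to restate:

```python
import math

def rmse(y_true, y_pred):
    """Root-mean-square error between observed and predicted values."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

def r_squared(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot
```

    A perfect predictor gives RMSE 0 and R(2) of 1, which is why the ANFIS weight-loss figures quoted above (RMSE near zero, R(2)=1) indicate an essentially exact fit on that outcome.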

  5. SHORT AND LONG TERM EFFECTS OF HIGH-INTENSITY INTERVAL TRAINING ON HORMONES, METABOLITES, ANTIOXIDANT SYSTEM, GLYCOGEN CONCENTRATION AND AEROBIC PERFORMANCE ADAPTATIONS IN RATS

    Directory of Open Access Journals (Sweden)

    Gustavo Gomes De Araujo

    2016-10-01

    Full Text Available The purpose of the study was to investigate the effects of short and long term High-Intensity Interval Training (HIIT) on anaerobic and aerobic performance, creatinine, uric acid, urea, creatine kinase, lactate dehydrogenase, catalase, superoxide dismutase, testosterone, corticosterone and glycogen concentration (liver, soleus and gastrocnemius). The Wistar rats were separated in two groups: HIIT and sedentary/control (CT). The lactate minimum (LM) was used to evaluate the aerobic and anaerobic performance (AP) (baseline, 6 and 12 wk). The lactate peak determination consisted of two swim bouts at 13% of body weight (bw): (1) 30 s of effort; (2) 30 s of passive recovery; (3) exercise until exhaustion (AP). Tethered loads equivalent to 3.5, 4.0, 4.5, 5.0, 5.5 and 6.5% bw were performed in the incremental phase. The aerobic capacity in the HIIT group increased after 12 wk (5.2±0.2% bw) in relation to baseline (4.4±0.2% bw), but not after 6 wk (4.5±0.3% bw). The exhaustion time in the HIIT group showed higher values than CT after 6 wk (HIIT=58±5 s; CT=40±7 s) and 12 wk (HIIT=62±7 s; CT=49±3 s). Glycogen (mg/100 mg) increased in gastrocnemius for the HIIT group after 6 wk (0.757±0.076) and 12 wk (1.014±0.157) in comparison to baseline (0.358±0.024). In soleus, the HIIT increased glycogen after 6 wk (0.738±0.057) and 12 wk (0.709±0.085) in comparison to baseline (0.417±0.035). The glycogen in liver increased after HIIT 12 wk (4.079±0.319) in relation to baseline (2.400±0.416). The corticosterone (ng/mL) in HIIT increased after 6 wk (529.0±30.5) and reduced after 12 wk (153.6±14.5) in comparison to baseline (370.0±18.3). In conclusion, long term HIIT enhanced the aerobic capacity, but short term (6 wk) was not enough to cause aerobic adaptations. The anaerobic performance increased in HIIT short and long term compared with CT, without differences between HIIT short and long term. Furthermore, the glycogen supercompensation increased after short and long term HIIT in comparison to

  6. Short and Long Term Effects of High-Intensity Interval Training on Hormones, Metabolites, Antioxidant System, Glycogen Concentration, and Aerobic Performance Adaptations in Rats.

    Science.gov (United States)

    de Araujo, Gustavo G; Papoti, Marcelo; Dos Reis, Ivan Gustavo Masselli; de Mello, Maria A R; Gobatto, Claudio A

    2016-01-01

    The purpose of the study was to investigate the effects of short and long term High-Intensity Interval Training (HIIT) on anaerobic and aerobic performance, creatinine, uric acid, urea, creatine kinase, lactate dehydrogenase, catalase, superoxide dismutase, testosterone, corticosterone, and glycogen concentration (liver, soleus, and gastrocnemius). The Wistar rats were separated in two groups: HIIT and sedentary/control (CT). The lactate minimum (LM) was used to evaluate the aerobic and anaerobic performance (AP) (baseline, 6, and 12 weeks). The lactate peak determination consisted of two swim bouts at 13% of body weight (bw): (1) 30 s of effort; (2) 30 s of passive recovery; (3) exercise until exhaustion (AP). Tethered loads equivalent to 3.5, 4.0, 4.5, 5.0, 5.5, and 6.5% bw were performed in incremental phase. The aerobic capacity in HIIT group increased after 12 weeks (5.2 ± 0.2% bw) in relation to baseline (4.4 ± 0.2% bw), but not after 6 weeks (4.5 ± 0.3% bw). The exhaustion time in HIIT group showed higher values than CT after 6 (HIIT = 58 ± 5 s; CT = 40 ± 7 s) and 12 weeks (HIIT = 62 ± 7 s; CT = 49 ± 3 s). Glycogen (mg/100 mg) increased in gastrocnemius for HIIT group after 6 weeks (0.757 ± 0.076) and 12 weeks (1.014 ± 0.157) in comparison to baseline (0.358 ± 0.024). In soleus, the HIIT increased glycogen after 6 weeks (0.738 ± 0.057) and 12 weeks (0.709 ± 0.085) in comparison to baseline (0.417 ± 0.035). The glycogen in liver increased after HIIT 12 weeks (4.079 ± 0.319) in relation to baseline (2.400 ± 0.416). The corticosterone (ng/mL) in HIIT increased after 6 weeks (529.0 ± 30.5) and reduced after 12 weeks (153.6 ± 14.5) in comparison to baseline (370.0 ± 18.3). In conclusion, long term HIIT enhanced the aerobic capacity, but short term was not enough to cause aerobic adaptations. The anaerobic performance increased in HIIT short and long term compared with CT, without differences between HIIT short and long term. Furthermore, the

  7. Short version of the Smartphone Addiction Scale adapted to Spanish and French: Towards a cross-cultural research in problematic mobile phone use.

    Science.gov (United States)

    Lopez-Fernandez, Olatz

    2017-01-01

    Research into smartphone addiction has followed the scientific literature on problematic mobile phone use developed during the last decade, with valid screening scales being developed to identify maladaptive behaviour associated with this technology, usually in adolescent populations. This study adapts the short version of the Smartphone Addiction Scale [SAS-SV] into Spanish and into French. The aims of the study were to (i) examine the scale's psychometric properties in both languages, (ii) estimate the prevalence of potential excessive smartphone use among Spanish and Belgian adults, and (iii) compare the addictive symptomatology measured by the SAS-SV between potentially excessive users from both countries. Data were collected via online surveys administered to 281 and 144 voluntary participants from the two countries respectively, aged over 18 years and recruited from academic environments. Results indicated that reliability was excellent (Cronbach alphas: Spain: .88 and Belgium: .90) and validity was very good (e.g., unifactoriality with 49% and 54% of variance explained through exploratory factor analysis, respectively). Findings showed that the prevalence of potential excessive smartphone use was 12.5% for Spanish and 21.5% for francophone Belgian participants. The scale showed that at least 60% of excessive users endorsed withdrawal and tolerance symptoms in both countries, although the proposed addictive symptomatology did not cover the entire group of estimated excessive users, and cultural differences appeared. This first cross-cultural study discusses the smartphone excessive use construct from its addictive pathway. Copyright © 2015 Elsevier Ltd. All rights reserved.
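
    The reliability figures quoted above are Cronbach's alpha, which can be computed directly from item-score columns; a minimal implementation:

```python
import statistics

def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score columns (one list per item).

    alpha = k/(k-1) * (1 - sum of item variances / variance of total scores),
    using population variances consistently.
    """
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]
    item_var = sum(statistics.pvariance(col) for col in items)
    return k / (k - 1) * (1.0 - item_var / statistics.pvariance(totals))
```

    Perfectly correlated items give alpha of 1; values near .88 or .90, as reported for the two language versions, indicate high internal consistency.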

  8. Translation, adaptation, validation and performance of the American Weight Efficacy Lifestyle Questionnaire Short Form (WEL-SF) to a Norwegian version: a cross-sectional study.

    Science.gov (United States)

    Flølo, Tone N; Andersen, John R; Nielsen, Hans J; Natvig, Gerd K

    2014-01-01

    Background. Researchers have emphasized a need to identify predictors that can explain the variability in weight management after bariatric surgery. Eating self-efficacy has demonstrated predictive impact on patients' adherence to recommended eating habits following multidisciplinary treatment programs, but has to a limited extent been subject to research after bariatric surgery. Recently, an American short form version (WEL-SF) of the commonly used Weight Efficacy Lifestyle Questionnaire (WEL) became available for research and clinical purposes. Objectives. We intended to translate and culturally adapt the WEL-SF to Norwegian conditions, and to evaluate the new version's psychometrical properties in a Norwegian population of morbidly obese patients eligible for bariatric surgery. Design. Cross-sectional. Methods. A total of 225 outpatients selected for laparoscopic sleeve gastrectomy (LSG) were recruited; 114 non-operated and 111 operated patients, respectively. The questionnaire was translated through forward and backward procedures. Structural properties were assessed by performing principal component analysis (PCA); correlation and regression analyses were conducted to evaluate convergent validity and sensitivity, respectively. Data were assessed by mean, median, item response, missing values, floor- and ceiling effect, Cronbach's alpha and alpha if item deleted. Results. The PCA resulted in one factor with eigenvalue > 1, explaining 63.0% of the variability. The WEL-SF sum scores were positively correlated with the Self-efficacy and quality of life instruments (p eating self-efficacy, with acceptable psychometrical properties in a population of morbidly obese patients.

  9. Transitions between Short-Term and Long-Term Memory in Learning Meaningful Unrelated Paired Associates Using Computer Based Drills.

    Science.gov (United States)

    Goldenberg, Tzvika Y.; Turnure, James E.

    1989-01-01

    Discussion of short-term and long-term memory in learning paired associates focuses on two microcomputer-based instructional design experiments with eleventh and twelfth graders that were modeled after traditional drill and practice routines. Research questions are presented, treatment conditions are explained, and additional research is…

  10. Development and Assessment of an Adaptive Strategy Utilizing Regression Analysis Techniques for the Presentation of Instruction Via Computer. Tech Report Number 27.

    Science.gov (United States)

    Rivers, Lee

    This investigation developed a methodology for adapting self-instructional materials to individual differences. Data on within-course variables of proportion of correct answers, latency and anxiety were monitored and regression analysis used to determine predictors of final performance. Regression equations were coded into the computer logic and…

  11. Design of a Computer-Adaptive Test to Measure English Literacy and Numeracy in the Singapore Workforce: Considerations, Benefits, and Implications

    Science.gov (United States)

    Jacobsen, Jared; Ackermann, Richard; Eguez, Jane; Ganguli, Debalina; Rickard, Patricia; Taylor, Linda

    2011-01-01

    A computer adaptive test (CAT) is a delivery methodology that serves the larger goals of the assessment system in which it is embedded. A thorough analysis of the assessment system for which a CAT is being designed is critical to ensure that the delivery platform is appropriate and addresses all relevant complexities. As such, a CAT engine must be…

  12. Exploring the Cross-Linguistic Transfer of Reading Skills in Spanish to English in the Context of a Computer Adaptive Reading Intervention

    Science.gov (United States)

    Baker, Doris Luft; Basaraba, Deni Lee; Smolkowski, Keith; Conry, Jillian; Hautala, Jarkko; Richardson, Ulla; English, Sherril; Cole, Ron

    2017-01-01

    We explore the potential of a computer-adaptive decoding game in Spanish to increase the decoding skills and oral reading fluency in Spanish and English of bilingual students. Participants were 78 first-grade Spanish-speaking students attending bilingual programs in five classrooms in Texas. Classrooms were randomly assigned to the treatment…

  13. Thermal latency adds to lesion depth after application of high-power short-duration radiofrequency energy: Results of a computer-modeling study.

    Science.gov (United States)

    Irastorza, Ramiro M; d'Avila, Andre; Berjano, Enrique

    2018-02-01

    The use of ultra-short RF pulses could achieve greater lesion depth immediately after the application of the pulse due to thermal latency. A computer model of irrigated-catheter RF ablation was built to study the impact of thermal latency on the lesion depth. The results showed that the shorter the RF pulse duration (keeping energy constant), the greater the lesion depth during the cooling phase. For instance, after a 10-second pulse, lesion depth grew from 2.05 mm at the end of the pulse to 2.39 mm (17%), while after an ultra-short RF pulse of only 1 second the extra growth was 37% (from 2.22 to 3.05 mm). Importantly, short applications resulted in deeper lesions than long applications (3.05 mm vs. 2.39 mm, for 1- and 10-second pulse, respectively). While shortening the pulse duration produced deeper lesions, the associated increase in applied voltage caused overheating in the tissue: temperatures around 100 °C were reached at a depth of 1 mm in the case of 1- and 5-second pulses. However, since the lesion depth increased during the cooling period, lower values of applied voltage could be applied in short durations in order to obtain lesion depths similar to those in longer durations while avoiding overheating. The thermal latency phenomenon seems to be the cause of significantly greater lesion depth after short-duration high-power RF pulses. Balancing the applied total energy when the voltage and duration are changed is not the optimal strategy since short pulses can also cause overheating. © 2017 Wiley Periodicals, Inc.

  14. Computer-supported feedback message tailoring: theory-informed adaptation of clinical audit and feedback for learning and behavior change.

    Science.gov (United States)

    Landis-Lewis, Zach; Brehaut, Jamie C; Hochheiser, Harry; Douglas, Gerald P; Jacobson, Rebecca S

    2015-01-21

    Evidence shows that clinical audit and feedback can significantly improve compliance with desired practice, but it is unclear when and how it is effective. Audit and feedback is likely to be more effective when feedback messages can influence barriers to behavior change, but barriers to change differ across individual health-care providers, stemming from differences in providers' individual characteristics. The purpose of this article is to invite debate and direct research attention towards a novel audit and feedback component that could enable interventions to adapt to barriers to behavior change for individual health-care providers: computer-supported tailoring of feedback messages. We argue that, by leveraging available clinical data, theory-informed knowledge about behavior change, and the knowledge of clinical supervisors or peers who deliver feedback messages, a software application that supports feedback message tailoring could improve feedback message relevance for barriers to behavior change, thereby increasing the effectiveness of audit and feedback interventions. We describe a prototype system that supports the provision of tailored feedback messages by generating a menu of graphical and textual messages with associated descriptions of targeted barriers to behavior change. Supervisors could use the menu to select messages based on their awareness of each feedback recipient's specific barriers to behavior change. We anticipate that such a system, if designed appropriately, could guide supervisors towards giving more effective feedback for health-care providers. A foundation of evidence and knowledge in related health research domains supports the development of feedback message tailoring systems for clinical audit and feedback. Creating and evaluating computer-supported feedback tailoring tools is a promising approach to improving the effectiveness of clinical audit and feedback.
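
    The proposed system's core data flow, a catalogue of candidate messages keyed by the barrier they target, from which a supervisor picks for each recipient, can be sketched as follows. The barrier names and message texts are purely illustrative, not from the article's prototype.

```python
# Hypothetical barrier-to-message catalogue; names and texts are illustrative only.
MESSAGE_CATALOGUE = [
    {"barrier": "knowledge", "text": "Guideline X recommends therapy Y as first line."},
    {"barrier": "skills", "text": "A worked example of the Y dosing calculation is attached."},
    {"barrier": "motivation", "text": "Peer facilities reached higher compliance last quarter."},
]

def tailoring_menu(recipient_barriers):
    """Return the menu of candidate feedback messages matching a recipient's barriers.

    The software generates the menu; the supervisor makes the final selection,
    which is the division of labour the article proposes.
    """
    return [m for m in MESSAGE_CATALOGUE if m["barrier"] in recipient_barriers]
```

    Keeping the selection step human preserves the supervisor's awareness of each recipient's situation while the software narrows the choices to barrier-relevant messages.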

  15. Adaptive Lighting

    DEFF Research Database (Denmark)

    Petersen, Kjell Yngve; Søndergaard, Karin; Kongshaug, Jesper

    2015-01-01

    Adaptive lighting is based on a partial automation of the possibilities to adjust the colour tone and brightness levels of light in order to adapt to people’s needs and desires. IT support is key to the technical developments that afford adaptive control systems. The possibilities offered by adaptive lighting control are created by the ways that the system components, the network and data flow can be coordinated through software so that the dynamic variations are controlled in ways that meaningfully adapt according to people’s situations and design intentions. This book discusses … differently into an architectural body. We also examine what might occur when light is dynamic and able to change colour, intensity and direction, and when it is adaptive and can be brought into interaction with its surroundings. In short, what happens to an architectural space when artificial lighting ceases…

  16. Low-dose computed tomographic urography using adaptive iterative dose reduction 3-dimensional: comparison with routine-dose computed tomography with filtered back projection.

    Science.gov (United States)

    Juri, Hiroshi; Matsuki, Mitsuru; Inada, Yuki; Tsuboyama, Takahiro; Kumano, Seishi; Azuma, Haruhito; Narumi, Yoshifumi

    2013-01-01

    The aim of this study was to evaluate the image quality of low-dose computed tomographic (CT) urography using adaptive iterative dose reduction 3-dimensional (AIDR 3D) compared with routine-dose CT using filtered back projection (FBP). Thirty patients underwent low- and routine-dose CT scans in the nephrographic and excretory phases of CT urography. Low-dose CT was reconstructed with AIDR 3D, and routine-dose CT was reconstructed with FBP. In quantitative analyses, image noise was measured on the renal cortex, aorta, retroperitoneal fat, and psoas muscle in both CT scans and compared. Qualitative analyses of the urinary system were performed in both CT scans and compared. These results were compared on the basis of the body mass index (BMI) of the patients. The CT dose index (CTDIvol) was measured, and the dose reduction was calculated. In quantitative analyses, image noise in all organs on low-dose CT was less than that on routine-dose CT in both phases, independently of the patient's BMI. There were no statistical differences between low- and routine-dose CT for diagnostic acceptability in all urinary systems in both phases, independently of the patient's BMI. The average CTDIvol on routine-dose CT was 14.5 mGy in the nephrographic phase and 9.2 mGy in the excretory phase. The average CTDIvol on low-dose CT was 4.2 mGy in the nephrographic phase and 2.7 mGy in the excretory phase. Low-dose CT urography using AIDR 3D can offer diagnostic acceptability comparable with that of routine-dose CT urography with FBP, with approximately 70% dose reduction.
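
    The reported "approximately 70%" figure follows directly from the CTDIvol values quoted in the abstract:

```python
def percent_dose_reduction(routine_mgy, low_mgy):
    """Percentage CTDIvol reduction of the low-dose protocol versus routine dose."""
    return 100.0 * (1.0 - low_mgy / routine_mgy)

nephrographic = percent_dose_reduction(14.5, 4.2)  # about 71%
excretory = percent_dose_reduction(9.2, 2.7)       # about 71%
```

    Both phases come out near 71%, consistent with the abstract's rounded "approximately 70%".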

  17. MUSIC: A Hybrid Computing Environment for Burrows-Wheeler Alignment for Massive Amount of Short Read Sequence Data

    OpenAIRE

    Gupta, Saurabh; Chaudhury, Sanjoy; Panda, Binay

    2014-01-01

    High-throughput DNA sequencers are becoming indispensable in our understanding of diseases at the molecular level, in marker-assisted selection in agriculture and in microbial genetics research. These sequencing instruments produce enormous amounts of data (often terabytes of raw data in a month) that require efficient analysis, management and interpretation. The commonly used sequencing instruments today produce billions of short reads (up to 150 bases) from each run. The first step in the data a...

  18. Ultrasound computed tomography by frequency-shift low-pass filtering and least mean square adaptive filtering

    Science.gov (United States)

    Wang, Shanshan; Song, Junjie; Peng, Yang; Zhou, Liang; Ding, Mingyue; Yuchi, Ming

    2017-03-01

    In recent years, many research studies have been carried out on ultrasound computed tomography (USCT) for improving the detection and management of breast cancer. This paper investigates a signal pre-processing method, proposed in our previous work, based on frequency-shift low-pass filtering (FSLF) and least mean square adaptive filtering (LMSAF) for USCT image quality enhancement. FSLF is designed based on the zoom fast Fourier transform (ZFFT) algorithm and processes the ultrasound signals in the frequency domain, while LMSAF is based on the least mean square (LMS) algorithm and operates in the time domain. Through the combination of the two filters, the ultrasound image is expected to have less noise and fewer artifacts, and higher resolution and contrast. The proposed method was verified with the radio-frequency (RF) data of nylon threads and a breast phantom captured by the USCT system developed in the Medical Ultrasound Laboratory. Experimental results show that the images of nylon threads reconstructed by the proposed method had a narrower main lobe width and a lower side lobe level compared to delay-and-sum (DAS) beamforming. The background noise and artifacts could also be efficiently suppressed. The reconstructed image of the breast phantom had a higher resolution, and the contrast ratio (CR) could be enhanced by about 12 dB to 18 dB at different regions of interest (ROIs).
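
    The LMS half of the pipeline is the classic stochastic-gradient adaptive filter. A generic sketch follows (not the authors' implementation; here it identifies a known FIR system rather than filtering RF ultrasound data):

```python
import numpy as np

rng = np.random.default_rng(3)

def lms_filter(x, d, n_taps=8, mu=0.05):
    """Least-mean-square adaptive filter: adapt weights w so that w . x[n] tracks d[n]."""
    w = np.zeros(n_taps)
    y = np.zeros(len(x))
    for n in range(n_taps - 1, len(x)):
        window = x[n - n_taps + 1:n + 1][::-1]  # most recent sample first
        y[n] = w @ window                       # filter output
        err = d[n] - y[n]                       # instantaneous error
        w += mu * err * window                  # stochastic-gradient weight update
    return y, w

# identify a known 3-tap FIR system from its input/output signals
x = rng.standard_normal(4000)
true_h = np.array([0.5, -0.3, 0.2])
d = np.convolve(x, true_h)[: len(x)]
y, w = lms_filter(x, d, n_taps=3, mu=0.02)
```

    With a small enough step size mu the weights converge toward the unknown system's taps; in the USCT setting the same update rule adapts the filter to the statistics of the RF signal instead.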

  19. Selection of items for a computer-adaptive test to measure fatigue in patients with rheumatoid arthritis: a Delphi approach.

    Science.gov (United States)

    Nikolaus, Stephanie; Bode, Christina; Taal, Erik; vd Laar, Mart A F J

    2012-06-01

    Computer-adaptive tests (CATs) can measure precisely at the individual level with few items selected from an item bank. Our aim was to select fatigue items to develop a CAT for rheumatoid arthritis (RA) and to include expert opinions, which are important for the content validity of measurement instruments. Items were included from existing fatigue questionnaires and generated from interview material. In a Delphi procedure, rheumatologists, nurses, and patients evaluated the initial pool of 294 items. Items were selected for the CAT development if rated as adequate by at least 80% of the participants (when 50% or less agreed, they were excluded). Remaining items were adjusted based on participants' comments and re-evaluated in the next round. The procedure stopped when all items were selected or rejected. A total of 10 rheumatologists, 20 nurses, and 15 rheumatoid arthritis patients participated. After the first round, 96 of 294 items were directly selected. Nine items were directly excluded, and the remaining items were adjusted. In the second round, 124 items were presented for re-evaluation. Ultimately, 245 items were selected. This study yielded a qualitatively evaluated item pool to be used for the item bank/CAT development. The Delphi procedure is a beneficial approach for selecting adequate items to measure fatigue in RA.
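
    The selection rule in the procedure (select at 80% or more agreement, exclude at 50% or less, otherwise revise and re-evaluate) maps directly to code; a minimal sketch:

```python
def delphi_round(ratings, select_at=0.80, exclude_at=0.50):
    """Classify items by the share of panellists rating them adequate.

    Share >= select_at: selected; share <= exclude_at: excluded; anything in
    between is revised and re-evaluated in the next round, mirroring the
    stopping rule described in the abstract.
    """
    selected, excluded, revise = [], [], []
    for item, votes in ratings.items():
        share = sum(votes) / len(votes)
        if share >= select_at:
            selected.append(item)
        elif share <= exclude_at:
            excluded.append(item)
        else:
            revise.append(item)
    return selected, excluded, revise
```

    Iterating this classification round by round shrinks the revise pool until every item is either selected or excluded, which is the study's termination condition.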

  20. Assessment of Filtered Back Projection, Adaptive Statistical, and Model-Based Iterative Reconstruction for Reduced Dose Abdominal Computed Tomography.

    Science.gov (United States)

    Padole, Atul; Singh, Sarabjeet; Lira, Diego; Blake, Michael A; Pourjabbar, Sarvenaz; Khawaja, Ranish Deedar Ali; Choy, Garry; Saini, Sanjay; Do, Synho; Kalra, Mannudeep K

    2015-01-01

    To compare standard of care and reduced dose (RD) abdominal computed tomography (CT) images reconstructed with filtered back projection (FBP), adaptive statistical iterative reconstruction (ASIR), and model-based iterative reconstruction (MBIR) techniques. In an Institutional Review Board-approved, prospective clinical study, 28 patients (mean age 59 ± 13 years), undergoing clinically indicated routine abdominal CT on a 64-channel multi-detector CT scanner, gave written informed consent for acquisition of an additional RD scan (reconstructed with FBP, ASIR, and MBIR and compared with FBP images of standard dose abdominal CT). Two radiologists performed randomized, independent, and blinded comparison for lesion detection, lesion margin, visibility of normal structures, and diagnostic confidence. Mean CT dose index volume was 10 ± 3.4 mGy and 1.3 ± 0.3 mGy for standard and RD CT, respectively. There were 73 "true positive" lesions detected on standard of care CT. Nine lesions (iterative reconstruction techniques used for reconstruction of RD data sets. The visibility of lesion margin was suboptimal in 23/28 patients with RD FBP, 15/28 patients with RD ASIR, and 14/28 patients with RD MBIR compared to standard of care FBP images (P iterative reconstruction techniques. Clinically significant lesions (reconstruction techniques (FBP, ASIR, and MBIR).

  1. Adaptive Statistical Iterative Reconstruction-V: Impact on Image Quality in Ultralow-Dose Coronary Computed Tomography Angiography.

    Science.gov (United States)

    Benz, Dominik C; Gräni, Christoph; Mikulicic, Fran; Vontobel, Jan; Fuchs, Tobias A; Possner, Mathias; Clerc, Olivier F; Stehli, Julia; Gaemperli, Oliver; Pazhenkottil, Aju P; Buechel, Ronny R; Kaufmann, Philipp A

    The clinical utility of the latest generation iterative reconstruction algorithm (adaptive statistical iterative reconstruction [ASiR-V]) has yet to be elucidated for coronary computed tomography angiography (CCTA). This study evaluates the impact of ASiR-V on signal, noise, and image quality in CCTA. Sixty-five patients underwent clinically indicated CCTA on a 256-slice CT scanner using an ultralow-dose protocol. Data sets from each patient were reconstructed at 6 different levels of ASiR-V. Signal intensity was measured by placing a region of interest in the aortic root, the left main artery (LMA), and the right coronary artery (RCA). Similarly, noise was measured in the aortic root. Image quality was visually assessed by 2 readers. Median radiation dose was 0.49 mSv. Image noise decreased with increasing levels of ASiR-V, resulting in a significant increase in signal-to-noise ratio in the RCA and LMA (P ASiR-V (P ASiR-V yields substantial noise reduction and improved image quality enabling introduction of ultralow-dose CCTA.

  2. Development of an item bank for the EORTC Role Functioning Computer Adaptive Test (EORTC RF-CAT)

    DEFF Research Database (Denmark)

    Gamper, Eva-Maria; Petersen, Morten Aa.; Aaronson, Neil

    2016-01-01

    ...a computer-adaptive test (CAT) for RF. This was part of a larger project whose objective is to develop a CAT version of the EORTC QLQ-C30, which is one of the most widely used HRQOL instruments in oncology. METHODS: In accordance with EORTC guidelines, the development of the RF-CAT comprised four phases..., and evaluation of the psychometric performance of the RF-CAT. RESULTS: Phases I-III yielded a list of 12 items eligible for phase IV field-testing. The field-testing sample included 1,023 patients from Austria, Denmark, Italy, and the UK. Psychometric evaluation and item response theory analyses yielded 10 items... with good psychometric properties. The resulting item bank exhibits excellent reliability (mean reliability = 0.85, median = 0.95). Using the RF-CAT may allow sample size savings from 11 % up to 50 % compared to using the QLQ-C30 RF scale. CONCLUSIONS: The RF-CAT item bank improves the precision...

  3. Enhancing performance of LCoS-SLM as adaptive optics by using computer-generated holograms modulation software

    Science.gov (United States)

    Tsai, Chun-Wei; Lyu, Bo-Han; Wang, Chen; Hung, Cheng-Chieh

    2017-05-01

    We have developed multi-function, easy-to-use modulation software based on the LabVIEW system. The software provides four main functions: computer-generated hologram (CGH) generation, CGH reconstruction, image trimming, and special phase distribution. Building on this CGH modulation software, we can enhance the performance of a liquid crystal on silicon spatial light modulator (LCoS-SLM) so that it behaves similarly to a diffractive optical element (DOE), and use it in various adaptive optics (AO) applications. Through the development of special phase distributions, we apply the LCoS-SLM with CGH modulation software to AO technology such as optical microscope systems. When the LCoS-SLM panel is integrated into an optical microscope system, it can be placed in the illumination path or in the image-forming path. The LCoS-SLM thus provides a program-controllable liquid crystal array for the optical microscope: it dynamically changes the amplitude or phase of light, which gives the system the obvious advantage of flexibility.

  4. A computer adaptive testing version of the Addiction Severity Index-Multimedia Version (ASI-MV): The Addiction Severity CAT.

    Science.gov (United States)

    Butler, Stephen F; Black, Ryan A; McCaffrey, Stacey A; Ainscough, Jessica; Doucette, Ann M

    2017-05-01

    The purpose of this study was to develop and validate a computer adaptive testing (CAT) version of the Addiction Severity Index-Multimedia Version (ASI-MV), the Addiction Severity CAT. This goal was accomplished in 4 steps. First, new candidate items for Addiction Severity CAT domains were evaluated after brainstorming sessions with experts in substance abuse treatment. Next, this new item bank was psychometrically evaluated on a large nonclinical (n = 4,419) and substance abuse treatment (n = 845) sample. Based on these results, final items were selected and calibrated for the creation of the Addiction Severity CAT algorithms. Once the algorithms were developed for the entire assessment, a fully functioning prototype of an Addiction Severity CAT was created. CAT simulations were conducted, and optimal termination criteria were selected for the Addiction Severity CAT algorithms. Finally, construct validity of the CAT algorithms was evaluated by examining convergent and discriminant validity and sensitivity to change. The Addiction Severity CAT was determined to be valid, sensitive to change, and reliable. Further, the Addiction Severity CAT's time of completion was found to be significantly less than the average time of completion for the ASI-MV composite scores. This study represents the initial validation of an Addiction Severity CAT based on item response theory, and further exploration of the Addiction Severity CAT is needed. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
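
    To make the CAT mechanics concrete, here is a toy simulation, not the Addiction Severity CAT itself: a Rasch-model loop that repeatedly administers the most informative remaining item, re-estimates the trait by Newton-Raphson, and applies the kind of termination criteria described above (a standard-error target plus a maximum test length). The item bank, true trait level, and responder below are all hypothetical:

```python
import math

def p_correct(theta, b):
    """Rasch model: probability of endorsing an item with difficulty b."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def run_cat(bank, answer, se_target=0.4, max_items=15):
    """Toy CAT: give the most informative remaining item, refit theta,
    stop when the standard error drops below se_target or the test
    reaches max_items."""
    theta, administered = 0.0, []
    remaining = list(bank)
    se = float("inf")
    while remaining and len(administered) < max_items:
        # Rasch information p*(1-p) peaks where difficulty matches theta
        b = min(remaining, key=lambda d: abs(d - theta))
        remaining.remove(b)
        administered.append((b, answer(b)))
        for _ in range(25):  # Newton-Raphson on the log-likelihood
            info = sum(p_correct(theta, d) * (1 - p_correct(theta, d))
                       for d, _ in administered)
            grad = sum(u - p_correct(theta, d) for d, u in administered)
            theta = max(-4.0, min(4.0, theta + grad / max(info, 1e-6)))
        se = 1.0 / math.sqrt(max(info, 1e-6))
        if se < se_target:  # standard-error-based stopping rule
            break
    return theta, len(administered), se

# hypothetical bank of 25 Rasch items and a deterministic responder
bank = [i / 4.0 for i in range(-12, 13)]
theta_true = 0.6
est, n_used, se = run_cat(bank, lambda b: 1 if b < theta_true else 0)
print(n_used, round(est, 2))
```

    With low-information Rasch items the 0.4 standard-error target is unreachable within 15 items, so this run terminates on the maximum-length criterion; with more discriminating items the SE rule would stop it earlier.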

  5. Application of Reinforcement Learning Algorithms for the Adaptive Computation of the Smoothing Parameter for Probabilistic Neural Network.

    Science.gov (United States)

    Kusy, Maciej; Zajdel, Roman

    2015-09-01

    In this paper, we propose new methods for the choice and adaptation of the smoothing parameter of the probabilistic neural network (PNN). These methods are based on three reinforcement learning algorithms: Q(0)-learning, Q(λ)-learning, and stateless Q-learning. We consider three types of PNN classifiers: a model that uses a single smoothing parameter for the whole network, a model that uses a separate smoothing parameter for each data attribute, and a model with a matrix of smoothing parameters, different for each data variable and data class. Reinforcement learning is applied to find the value of the smoothing parameter that maximizes the prediction ability. PNN models with smoothing parameters computed according to the proposed algorithms are tested on eight databases by calculating the test error with a cross-validation procedure. The results are compared with state-of-the-art methods for PNN training published in the literature to date and, additionally, with a PNN whose sigma is determined by means of the conjugate gradient approach. The results demonstrate that the proposed approaches can be used as alternative PNN training procedures.
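
    For intuition about why the smoothing parameter matters, the sketch below builds a minimal PNN (Gaussian Parzen kernels with one sigma for the whole network) on hypothetical 1-D data and compares candidate sigmas by test accuracy; a plain grid search stands in for the paper's Q-learning adaptation:

```python
import math

def pnn_classify(x, train, sigma):
    """Single-sigma PNN: each class score is the mean Gaussian kernel
    between x and that class's training points; highest score wins."""
    scores = {label: sum(math.exp(-((x - p) ** 2) / (2 * sigma ** 2))
                         for p in points) / len(points)
              for label, points in train.items()}
    return max(scores, key=scores.get)

# hypothetical data; 1.3 is a mislabelled outlier in the "low" class
train = {"low": [0.0, 0.2, 0.4, 0.1, 1.3],
         "high": [1.0, 1.15, 0.9, 1.05]}
test_set = [(0.15, "low"), (0.3, "low"), (1.05, "high"),
            (0.8, "high"), (1.28, "high")]

def accuracy(sigma):
    return sum(pnn_classify(x, train, sigma) == y
               for x, y in test_set) / len(test_set)

# tiny sigma overfits to the outlier; larger sigma smooths it away
best = max([0.05, 0.1, 0.3, 0.5, 1.0], key=accuracy)
print(best, accuracy(best))
```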

  6. Reduction of radiation exposure and improvement of image quality with BMI-adapted prospective cardiac computed tomography and iterative reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Hosch, Waldemar, E-mail: waldemar.hosch@med.uni-heidelberg.de [University of Heidelberg, Department of Diagnostic and Interventional Radiology, Heidelberg (Germany); Stiller, Wolfram [University of Heidelberg, Department of Diagnostic and Interventional Radiology, Heidelberg (Germany); Mueller, Dirk [Philips GmbH Healthcare Division, Hamburg (Germany); Gitsioudis, Gitsios [University of Heidelberg, Department of Cardiology, Heidelberg (Germany); Welzel, Johanna; Dadrich, Monika [University of Heidelberg, Department of Diagnostic and Interventional Radiology, Heidelberg (Germany); Buss, Sebastian J.; Giannitsis, Evangelos [University of Heidelberg, Department of Cardiology, Heidelberg (Germany); Kauczor, Hans U. [University of Heidelberg, Department of Diagnostic and Interventional Radiology, Heidelberg (Germany); Katus, Hugo A. [University of Heidelberg, Department of Cardiology, Heidelberg (Germany); Korosoglou, Grigorios, E-mail: gkorosoglou@hotmail.com [University of Heidelberg, Department of Cardiology, Heidelberg (Germany)

    2012-11-15

    Purpose: To assess the impact of body mass index (BMI)-adapted protocols and iterative reconstruction algorithms (iDose) on patient radiation exposure and image quality in patients undergoing prospective ECG-triggered 256-slice coronary computed tomography angiography (CCTA). Methods: Image quality and radiation exposure were systematically analyzed in 100 patients. 60 patients underwent prospective ECG-triggered CCTA using a non-tailored protocol and served as a 'control' group (Group 1: 120 kV, 200 mA s). 40 consecutive patients with suspected coronary artery disease (CAD) underwent prospective CCTA using BMI-adapted tube voltage and standard (Group 2: 100/120 kV, 100-200 mA s) versus reduced tube current (Group 3: 100/120 kV, 75-150 mA s). Iterative reconstructions were provided with different iDose levels and were compared to filtered back projection (FBP) reconstructions. Image quality was assessed in consensus by 2 experienced observers using a 5-grade scale (1 = best to 5 = worst), and signal- and contrast-to-noise ratios (SNR and CNR) were quantified. Results: CCTA was performed without adverse events in all patients (n = 100, heart rate of 47-87 bpm and BMI of 19-38 kg/m²). Patients examined using the non-tailored protocol in Group 1 had the highest radiation exposure (3.2 ± 0.4 mSv), followed by Group 2 (1.7 ± 0.7 mSv) and Group 3 (1.2 ± 0.6 mSv) (radiation savings of 47% and 63%, respectively, p < 0.001). Iterative reconstructions provided increased SNR and CNR, particularly when the higher iDose level 5 was applied with Multi-Frequency reconstruction (iDose5 MFR) (14.1 ± 4.6 versus 21.2 ± 7.3 for SNR and 12.0 ± 4.2 versus 18.1 ± 6.6 for CNR, for FBP versus iDose5 MFR, respectively, p < 0.001). The combination of BMI adaptation with iterative reconstruction reduced radiation exposure and simultaneously improved image quality (subjective image quality of 1.4 ± 0.4 versus 1.9 ± 0.5 for Group 2 reconstructed using
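
    The SNR and CNR figures quoted in such studies follow from standard definitions. A minimal sketch with hypothetical attenuation samples (the definitions only, not this study's measurements):

```python
def snr_cnr(roi, background):
    """SNR = mean(ROI) / sd(background);
    CNR = (mean(ROI) - mean(background)) / sd(background)."""
    def mean(v):
        return sum(v) / len(v)
    def sd(v):  # population standard deviation of the background pixels
        m = mean(v)
        return (sum((x - m) ** 2 for x in v) / len(v)) ** 0.5
    s = mean(roi) / sd(background)
    c = (mean(roi) - mean(background)) / sd(background)
    return s, c

# hypothetical attenuation samples (HU): contrast-filled vessel vs background
snr, cnr = snr_cnr([400, 410, 390, 405], [50, 60, 40, 50])
print(round(snr, 1), round(cnr, 1))  # → 56.7 49.7
```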

  7. Short-term heat acclimation training improves physical performance: a systematic review, and exploration of physiological adaptations and application for team sports.

    Science.gov (United States)

    Chalmers, Samuel; Esterman, Adrian; Eston, Roger; Bowering, K Jane; Norton, Kevin

    2014-07-01

    Studies have demonstrated that longer-term heat acclimation training (≥8 heat exposures) improves physical performance. The physiological adaptations gained through short-term heat acclimation (STHA) training suggest that physical performance can be enhanced within a brief timeframe. The aim of this systematic review was to determine if STHA training (≤7 heat exposures) can improve physical performance in healthy adults. MEDLINE, PubMed, and SPORTDiscus™ databases were searched for available literature. Studies were included if they met the following criteria: STHA intervention, performance measure outcome, apparently healthy participants, adult participants (≥18 years of age), primary data, and human participants. A modified McMaster critical appraisal tool determined the level of bias in each included study. Eight papers met the inclusion criteria. Studies varied from having a low to a high risk of bias. The review identified that aerobic-based tests of performance benefit from STHA training, whereas peak anaerobic power efforts have not been demonstrated to improve. At the review level, this systematic review did not include tolerance time exercise tests; however, certain professions may be interested in this type of exercise (e.g. fire-fighters). At the outcome level, the review was limited by the moderate level of bias that exists in the field. Only two randomized controlled trials were included. Furthermore, a limited number of studies could be identified (eight), and only one of these articles focused on women participants. The review identified that aerobic-based tests of performance benefit from STHA training. This is possibly through a number of cardiovascular, thermoregulatory, and metabolic adaptations improving the perception of effort and fatigue through a reduction in anaerobic energy release and elevation of the anaerobic threshold. These results should be viewed with caution due to the level of available evidence, and the limited number of papers that

  8. Online selection of short-lived particles on many-core computer architectures in the CBM experiment at FAIR

    Energy Technology Data Exchange (ETDEWEB)

    Zyzak, Maksym

    2016-07-07

    Modern experiments in heavy ion collisions operate with huge data rates that cannot be fully stored on the currently available storage devices. Therefore the data flow must be reduced by selecting those collisions that potentially carry information of physics interest. The future CBM experiment will have no simple criteria for selecting such collisions and requires full online reconstruction of the collision topology, including reconstruction of short-lived particles. In this work the KF Particle Finder package for online reconstruction and selection of short-lived particles is proposed and developed. It reconstructs more than 70 decays, covering signals from all the physics cases of the CBM experiment: strange particles, strange resonances, hypernuclei, low mass vector mesons, charmonium, and open-charm particles. The package is based on the Kalman filter method, providing a full set of particle parameters together with their errors, including position, momentum, mass, energy, and lifetime. It shows a high quality of the reconstructed particles, high efficiencies, and high signal to background ratios. The KF Particle Finder is extremely fast, achieving a reconstruction time of 1.5 ms per minimum-bias AuAu collision at 25 AGeV beam energy on a single CPU core. It is fully vectorized and parallelized and shows strong linear scalability on many-core architectures of up to 80 cores. It also scales within the First Level Event Selection package on many-core clusters of up to 3200 cores. The developed KF Particle Finder package is a universal platform for short-lived particle reconstruction, physics analysis, and online selection.

  9. Health related quality of life in patients with diabetic foot ulceration - translation and Polish adaptation of Diabetic Foot Ulcer Scale short form.

    Science.gov (United States)

    Macioch, Tomasz; Sobol, Elżbieta; Krakowiecki, Arkadiusz; Mrozikiewicz-Rakowska, Beata; Kasprowicz, Monika; Hermanowski, Tomasz

    2017-01-21

    Diabetic foot ulcer (DFU) is a common complication of diabetes; it is not only an important factor in mortality among patients with diabetes but also decreases their quality of life. The short form of the Diabetic Foot Ulcer Scale (DFS-SF) provides comprehensive measurement of the impact of diabetic foot ulcers on patients' health related quality of life (HRQoL). The purpose of this study was to translate the DFS-SF into Polish and evaluate its psychometric performance in patients with diabetic foot ulcers. The DFS-SF translation process was performed in line with the Principles of Good Practice for the Translation and Cultural Adaptation Process for patient reported outcome measures (PROMs) developed by the ISPOR TCA group. Assessment of the reliability and validity of the Polish DFS-SF was performed in native Polish patients with current DFU. The DFS-SF validation study involved 212 patients diagnosed with DFU, with a mean DFU duration of 4.4 years. The average ulcer size was 5.5 sq. cm, and generally only one limb was affected. Men (72%) and type 2 diabetes patients (86%) prevailed, and the mean time since diagnosis was 17.8 years. The mean population age was 62.5 years. The internal consistency of all scales of the Polish DFS-SF was high (Cronbach's alpha ranged from 0.82 to 0.93). Item convergent and discriminant validity was satisfactory (median corrected item-scale correlations ranged from 0.61 to 0.81). The Polish DFS-SF demonstrated good construct validity when correlated with the SF-36v2 and showed better psychometric performance than the SF-36v2. The newly translated Polish DFS-SF may be used to assess the impact of DFU on HRQoL in Polish patients.

  10. Cultural adaptation and validation of the “Kidney Disease and Quality of Life - Short Form (KDQOL-SF™ version 1.3)” questionnaire in Egypt

    Directory of Open Access Journals (Sweden)

    Abd ElHafeez Samar

    2012-12-01

    Abstract Background Health Related Quality of Life (HRQOL) instruments need disease- and country-specific validation. In Arab countries, there is no specific validated questionnaire for the assessment of HRQOL in chronic kidney disease (CKD) patients. The aim of this study was to present an Arabic translation, adaptation, and subsequent validation of the kidney disease quality of life-short form (KDQOL-SF™ version 1.3) questionnaire in a representative series of Egyptian CKD patients. Methods KDQOL-SF™ version 1.3 was translated into Arabic by two independent translators and then translated back into English. After translation disparities were reconciled, the final Arabic questionnaire was tested by interviewing 100 pre-dialysis CKD (stage 1-4) patients randomly selected from outpatients attending the Nephrology clinic at the Main Alexandria University Hospital. Test re-test reliability was assessed, in a subsample of 50 consecutive CKD patients, by two interviews 7 days apart, and internal consistency was estimated by Cronbach's α. Discriminant, concept, and construct validity were assessed. Results All items of the SF-36 met the criterion for internal consistency and were reproducible. Of the 10 kidney disease targeted scales, only three had Cronbach's α ... KDQOL-SF™ 1.3 scales were significantly inter-correlated. Finally, principal component analysis of the kidney disease targeted scale indicated that this part of the questionnaire could be summarized into 10 factors that together explained 70.9% of the variance. Conclusion The results suggest that this Arabic version of the KDQOL-SF™ 1.3 questionnaire is a valid and reliable tool for use in Egyptian patients with CKD.
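
    Cronbach's α, the internal-consistency statistic used here, is easy to compute directly. A minimal sketch with hypothetical ratings (not the study's data):

```python
def cronbach_alpha(items):
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances /
    variance of the summed scale), using sample variances.
    items: list of item-score lists, all over the same respondents."""
    k = len(items)
    n = len(items[0])
    def var(v):
        m = sum(v) / len(v)
        return sum((x - m) ** 2 for x in v) / (len(v) - 1)
    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

# hypothetical 1-5 ratings: three items scored by five respondents
a = cronbach_alpha([[4, 5, 3, 2, 4],
                    [4, 4, 3, 2, 5],
                    [5, 5, 2, 3, 4]])
print(round(a, 2))  # → 0.89
```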

  11. Adaptive-predictive organ localization using cone-beam computed tomography for improved accuracy in external beam radiotherapy for bladder cancer.

    Science.gov (United States)

    Lalondrelle, Susan; Huddart, Robert; Warren-Oseni, Karole; Hansen, Vibeke Nordmark; McNair, Helen; Thomas, Karen; Dearnaley, David; Horwich, Alan; Khoo, Vincent

    2011-03-01

    To examine patterns of bladder wall motion during high-dose hypofractionated bladder radiotherapy and to validate a novel adaptive planning method, A-POLO, to prevent subsequent geographic miss. Patterns of individual bladder filling were obtained with repeat computed tomography planning scans at 0, 15, and 30 minutes after voiding. A series of patient-specific plans corresponding to these time-displacement points was created. Pretreatment cone-beam computed tomography was performed before each fraction and assessed retrospectively for adaptive intervention. In fractions that would have required intervention, the most appropriate plan was chosen from the patient's "library," and the resulting target coverage was reassessed with repeat cone-beam computed tomography. A large variation in patterns of bladder filling and interfraction displacement was seen. During radiotherapy, predominant translations occurred cranially (maximum 2.5 cm) and anteriorly (maximum 1.75 cm). No apparent explanation was found for this variation using pretreatment patient factors. A need for adaptive planning was demonstrated by 51% of fractions, and 73% of fractions would have been delivered correctly using A-POLO. The adaptive strategy improved target coverage and was also able to account for intrafraction motion. Bladder volume variation will result in geographic miss in a high proportion of delivered bladder radiotherapy treatments. The A-POLO strategy can correct for this and can be implemented from the first fraction of radiotherapy; thus, it is particularly suited to hypofractionated bladder radiotherapy regimens. Copyright © 2011 Elsevier Inc. All rights reserved.

  12. Short communication: Milk meal pattern of dairy calves is affected by computer-controlled milk feeder set-up

    DEFF Research Database (Denmark)

    Jensen, Margit Bak

    2009-01-01

    Ninety-six calves housed in groups of 8 were fed either a high milk allowance (heavy breeds 9.6 L/d; Jerseys 7.2 L/d) or a low milk allowance (heavy breeds 4.8 L/d; Jerseys 3.6 L/d) via a computer-controlled milk feeder. Half of the calves on each allowance could ingest the milk in 2 or more daily.... Thus, the development from small and frequent milk meals to fewer and larger meals reported by studies of natural suckling was also found among high-fed calves on a computer-controlled milk feeder. Irrespective of the minimum number of milk portions, the low-fed calves had more unrewarded visits... to the computer-controlled milk feeder, indicating that they were attempting to get more milk. The results of the present study suggest that offering a high milk allowance and avoiding restrictions on meal pattern may result in feeder use that more closely resembles natural suckling....

  13. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  14. Multidetector row computed tomography of acute pancreatitis: Utility of single portal phase CT scan in short-term follow up

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, Yongwonn [Department of Radiology, Konkuk University Medical Center, 4-12, Hwayang-dong, Gwangjin-gu, Seoul 143-729 (Korea, Republic of); Park, Hee Sun, E-mail: heesun.park@gmail.com [Department of Radiology, Konkuk University Medical Center, 4-12, Hwayang-dong, Gwangjin-gu, Seoul 143-729 (Korea, Republic of); Kim, Young Jun; Jung, Sung Il; Jeon, Hae Jeong [Department of Radiology, Konkuk University Medical Center, 4-12, Hwayang-dong, Gwangjin-gu, Seoul 143-729 (Korea, Republic of)

    2012-08-15

    Objective: The purpose of this study is to evaluate whether nonenhanced CT or contrast-enhanced portal phase CT can replace multiphasic pancreas-protocol CT for short-term monitoring of patients with acute pancreatitis. Materials and methods: This retrospective study was approved by the Institutional Review Board. From April 2006 to May 2010, a total of 52 patients having acute pancreatitis who underwent initial dual phase multidetector row CT (unenhanced, arterial, and portal phase) at admission and a short term (within 30 days) follow up dual phase CT (mean interval 10.3 days, range 3-28 days) were included. Two abdominal radiologists performed an independent review of three sets of follow up CT images (nonenhanced scan, single portal phase scan, and dual phase scan). Interpretation of each image set was done with at least a 2-week interval. Radiologists evaluated severity of acute pancreatitis with regard to pancreatic inflammation, pancreatic necrosis, and extrapancreatic complication, based on the modified CT severity index. Scores of each image set were compared using a paired t-test, and interobserver agreement was evaluated using intraclass correlation coefficient statistics. Results: Mean scores of the sum of the CT severity index on nonenhanced scan, portal phase scan, and dual phase scan were 5.7, 6.6, and 6.5 for radiologist 1, and 5.0, 5.6, and 5.8 for radiologist 2, respectively. For both radiologists, contrast enhanced scans (portal phase scan and dual phase scan) showed significantly higher severity scores compared with the unenhanced scan (P < 0.05), while the portal phase and dual phase scans showed no significant difference from each other. The trend was similar regarding pancreatic inflammation and extrapancreatic complications, in which contrast enhanced scans showed significantly higher scores compared with those of the unenhanced scan, while no significant difference was observed between the portal phase scan and the dual phase scan. In pancreatic necrosis

  15. Construct validity of the pediatric evaluation of disability inventory computer adaptive test (PEDI-CAT) in children with medical complexity.

    Science.gov (United States)

    Dumas, Helene M; Fragala-Pinkham, Maria A; Rosen, Elaine L; O'Brien, Jane E

    2017-11-01

    To assess construct (convergent and divergent) validity of the Pediatric Evaluation of Disability Inventory Computer Adaptive Test (PEDI-CAT) in a sample of children with complex medical conditions. Demographics, clinical information, PEDI-CAT normative score, and the Post-Acute Acuity Rating for Children (PAARC) level were collected for all post-acute hospital admissions (n = 110) from 1 April 2015 to 1 March 2016. Correlations between the PEDI-CAT Daily Activities, Mobility, and Social/Cognitive domain scores for the total sample and across three age groups (infant, preschool, and school-age) were calculated. Differences in mean PEDI-CAT scores for each domain across two groups, children with "Less Complexity" or "More Complexity" based on PAARC level, were examined. All correlations for the total sample and age subgroups were statistically significant, and trends across age groups were evident, with stronger associations between domains for the infant group. Significant differences were found between mean PEDI-CAT Daily Activities, Mobility, and Social/Cognitive normative scores across the two complexity groups, with children in the "Less Complex" group having higher PEDI-CAT scores for all domains. This study provides evidence indicating that the PEDI-CAT can be used with confidence in capturing and differentiating children's level of function in a post-acute care setting. Implications for Rehabilitation The PEDI-CAT is a measure of function for children with a variety of conditions and can be used in any clinical setting. Convergent validity of the PEDI-CAT's Daily Activities, Mobility, and Social/Cognitive domains was significant and particularly strong for infants and young children with medical complexity. The PEDI-CAT was able to discriminate groups of children with differing levels of medical complexity admitted to a pediatric post-acute care hospital.

  16. Evaluating the Discriminant Validity of the Pediatric Evaluation of Disability Inventory: Computer Adaptive Test in Children With Cerebral Palsy.

    Science.gov (United States)

    Shore, Benjamin J; Allar, Benjamin G; Miller, Patricia E; Matheney, Travis H; Snyder, Brian D; Fragala-Pinkham, Maria A

    2017-06-01

    The Pediatric Evaluation of Disability Inventory-Computer Adaptive Test (PEDI-CAT) is a new clinical assessment for children and youth from birth through 20 years of age. To determine the discriminant validity of the PEDI-CAT according to the Gross Motor Function Classification System (GMFCS) and Manual Ability Classification System (MACS) in children with cerebral palsy (CP). A prospective convenience cross-sectional sample of 101 school-age children with CP was stratified by GMFCS level. Participants were excluded if they underwent recent surgery (analysis was used to quantify the discriminant validity of the PEDI-CAT domains to distinguish the level of independence in fine and gross motor function. General linear modeling was used to assess discriminant ability across all GMFCS and MACS levels. Mean age was 11 years, 11 months (SD 3.7). Mobility and Daily Activities domains exhibited excellent discriminant validity distinguishing between ambulatory and nonambulatory participants [area under the curve (AUC) = 0.98 and 0.97, respectively] and the Daily Activities domain exhibited excellent discriminant validity distinguishing between independent and dependent hand function (AUC = 0.93). All PEDI-CAT domains were able to discriminate between ambulatory (GMFCS levels I-III) or nonambulatory (GMFCS levels IV-V) as well as manually independent (MACS levels I-II) or manually dependent functional levels (MACS levels III-V) (P < .001). Our convenience cross-sectional sample included school-age children with primarily Caucasian, middle-income parents and may not be representative of other cultural, socioeconomic backgrounds. Not all participants had a MACS level assigned; however, no differences were found in PEDI-CAT scores between those with and without MACS scores. These results demonstrate that the PEDI-CAT is a valid outcome instrument for measuring functional abilities in children with CP, able to differentiate across fine and gross motor functional levels.
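
    AUC values like those reported above can be read as a Mann-Whitney probability: the chance that a randomly chosen child from one group scores higher than a randomly chosen child from the other. A minimal sketch with hypothetical scores (not the study's data):

```python
def auc(pos, neg):
    """ROC area via the Mann-Whitney statistic: the probability that a
    randomly chosen positive scores above a randomly chosen negative
    (ties count half)."""
    wins = sum(1.0 * (p > q) + 0.5 * (p == q) for p in pos for q in neg)
    return wins / (len(pos) * len(neg))

# hypothetical Mobility normative scores for two groups of children
ambulatory = [55, 48, 60, 52, 45]
nonambulatory = [30, 25, 38, 45, 28]
print(auc(ambulatory, nonambulatory))  # → 0.98
```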

  17. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and to address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  18. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year's; this results in longer reconstruction times and harder-to-simulate events. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load are close to the planning predictions. All computing centre tiers performed their expected functions. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference, where a large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  19. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year in which the primary emphasis of the project moves towards steady operations. Following the very successful completion of the Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in the computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in preparation of large samples for physics and detector studies, ramping to 50M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS-specific tests to be included in Service Availa...

  20. Short-range order in ab initio computer generated amorphous and liquid Cu–Zr alloys: A new approach

    Energy Technology Data Exchange (ETDEWEB)

    Galván-Colín, Jonathan, E-mail: jgcolin@ciencias.unam.mx [Instituto de Investigaciones en Materiales, Universidad Nacional Autónoma de México, Apartado Postal 70-360, México, D.F. 04510, México (Mexico); Valladares, Ariel A., E-mail: valladar@unam.mx [Instituto de Investigaciones en Materiales, Universidad Nacional Autónoma de México, Apartado Postal 70-360, México, D.F. 04510, México (Mexico); Valladares, Renela M.; Valladares, Alexander [Facultad de Ciencias, Universidad Nacional Autónoma de México, Apartado Postal 70-542, México, D.F. 04510, México (Mexico)

    2015-10-15

    Using ab initio molecular dynamics and a new approach based on the undermelt-quench method, we generated amorphous and liquid samples of CuxZr100−x (x = 64, 50, 36) alloys. We characterized the topology of the resulting structures by means of the pair distribution function and the bond-angle distribution; a coordination number distribution was also calculated. Our results for both the amorphous and liquid samples agree well with experiment. The dependence of short-range order on concentration is reported. We found that icosahedron-like geometry plays a major role whenever the alloys are Cu-rich or Zr-rich, regardless of whether the samples are amorphous or liquid. The validation of these results would, in turn, let us calculate other properties so far disregarded in the literature.
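    The pair distribution function used to characterize these structures is, in essence, a distance histogram normalized by the ideal-gas expectation. A minimal sketch under simplifying assumptions (single species, cubic periodic cell, toy random configuration rather than ab initio samples):

```python
import math, random

def pair_distribution(positions, box, dr=0.2, r_max=None):
    """g(r) for a single-species cubic periodic cell of side `box`."""
    n = len(positions)
    r_max = box / 2.0 if r_max is None else r_max
    nbins = int(round(r_max / dr))
    hist = [0] * nbins
    for i in range(n):
        for j in range(i + 1, n):
            # minimum-image distance under periodic boundary conditions
            d2 = 0.0
            for a, b in zip(positions[i], positions[j]):
                delta = a - b
                delta -= box * round(delta / box)
                d2 += delta * delta
            r = math.sqrt(d2)
            if r < r_max:
                hist[min(int(r / dr), nbins - 1)] += 2  # pair counted for both atoms
    rho = n / box ** 3  # number density
    g = []
    for k, count in enumerate(hist):
        shell = 4.0 * math.pi / 3.0 * ((k + 1) ** 3 - k ** 3) * dr ** 3
        g.append(count / (n * rho * shell))  # normalize by ideal-gas expectation
    return g

random.seed(0)
box = 10.0  # toy random configuration, 64 atoms, arbitrary units
positions = [[random.uniform(0, box) for _ in range(3)] for _ in range(64)]
g = pair_distribution(positions, box)
print(len(g))  # 25 bins out to half the box length
```

For a real alloy one would accumulate partial g(r) per species pair (Cu-Cu, Cu-Zr, Zr-Zr) and average over configurations; peak positions then reveal the short-range order discussed above.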

  1. Short-range order in ab initio computer generated amorphous and liquid Cu-Zr alloys: A new approach

    Science.gov (United States)

    Galván-Colín, Jonathan; Valladares, Ariel A.; Valladares, Renela M.; Valladares, Alexander

    2015-10-01

    Using ab initio molecular dynamics and a new approach based on the undermelt-quench method, we generated amorphous and liquid samples of CuxZr100-x (x = 64, 50, 36) alloys. We characterized the topology of the resulting structures by means of the pair distribution function and the bond-angle distribution; a coordination number distribution was also calculated. Our results for both the amorphous and liquid samples agree well with experiment. The dependence of short-range order on concentration is reported. We found that icosahedron-like geometry plays a major role whenever the alloys are Cu-rich or Zr-rich, regardless of whether the samples are amorphous or liquid. The validation of these results would, in turn, let us calculate other properties so far disregarded in the literature.

  2. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources like the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 TB. Figure 3: Number of events per month (data) In LS1, our emphasis is on increasing the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  3. Translation, adaptation, validation and performance of the American Weight Efficacy Lifestyle Questionnaire Short Form (WEL-SF) to a Norwegian version: a cross-sectional study

    Directory of Open Access Journals (Sweden)

    Tone N. Flølo

    2014-09-01

    Full Text Available Background. Researchers have emphasized a need to identify predictors that can explain the variability in weight management after bariatric surgery. Eating self-efficacy has demonstrated predictive impact on patients' adherence to recommended eating habits following multidisciplinary treatment programs, but has to a limited extent been subject to research after bariatric surgery. Recently an American short form version (WEL-SF) of the commonly used Weight Efficacy Lifestyle Questionnaire (WEL) became available for research and clinical purposes. Objectives. We intended to translate and culturally adapt the WEL-SF to Norwegian conditions, and to evaluate the new version's psychometric properties in a Norwegian population of morbidly obese patients eligible for bariatric surgery. Design. Cross-sectional. Methods. A total of 225 outpatients selected for laparoscopic sleeve gastrectomy (LSG) were recruited; 114 non-operated and 111 operated patients, respectively. The questionnaire was translated through forward and backward procedures. Structural properties were assessed by performing principal component analysis (PCA); correlation and regression analyses were conducted to evaluate convergent validity and sensitivity, respectively. Data were assessed by mean, median, item response, missing values, floor and ceiling effects, Cronbach's alpha, and alpha if item deleted. Results. The PCA resulted in one factor with eigenvalue > 1, explaining 63.0% of the variability. The WEL-SF sum scores were positively correlated with the self-efficacy and quality-of-life instruments (p < 0.001). The WEL-SF was associated with body mass index (BMI) (p < 0.001) and changes in BMI (p = 0.026). A very high item response was obtained, with only one missing value (0.4%). The ceiling effect was on average 0.9% and 17.1% in the non-operated and operated samples, respectively. Strong internal consistency (r = 0.92) was obtained, and Cronbach's alpha remained high (0.86–0.92) if single...
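    The internal-consistency statistic reported here, Cronbach's alpha, follows directly from item and total-score variances. A minimal sketch with a made-up 4-item, 5-respondent score matrix (not WEL-SF data):

```python
def cronbach_alpha(items):
    """items: list of per-item score lists, one entry per respondent."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_var_sum = sum(var(item) for item in items)
    totals = [sum(item[r] for item in items) for r in range(n)]
    return k / (k - 1) * (1 - item_var_sum / var(totals))

scores = [            # rows = items, columns = respondents (1-5 scale)
    [4, 5, 3, 5, 4],
    [4, 4, 3, 5, 4],
    [5, 5, 2, 5, 3],
    [4, 5, 3, 4, 4],
]
print(round(cronbach_alpha(scores), 2))  # → 0.89
```

"Alpha if item deleted", also reported in the abstract, is the same computation repeated with each item left out in turn.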

  4. An adaptive maneuvering logic computer program for the simulation of one-to-one air-to-air combat. Volume 2: Program description

    Science.gov (United States)

    Burgin, G. H.; Owens, A. J.

    1975-01-01

    A detailed description of the computer programs is presented in order to provide an understanding of the mathematical and geometrical relationships as implemented in the programs. The individual subroutines and their underlying mathematical relationships are described, and the required input data and the output provided by the program are explained. The relationship of the adaptive maneuvering logic program to the program that drives the differential maneuvering simulator is discussed.

  5. Effect of the Novel Polysaccharide PolyGlycopleX® on Short-Chain Fatty Acid Production in a Computer-Controlled in Vitro Model of the Human Large Intestine

    Directory of Open Access Journals (Sweden)

    Raylene A. Reimer

    2014-03-01

    Full Text Available Many of the health benefits associated with dietary fiber are attributed to its fermentation by microbiota and production of short chain fatty acids (SCFA). The aim of this study was to investigate the fermentability of the functional fiber PolyGlycopleX® (PGX®) in vitro. A validated dynamic, computer-controlled in vitro system simulating the conditions in the proximal large intestine (TIM-2) was used. Sodium hydroxide (NaOH) consumption in the system was used as an indicator of fermentability, and SCFA and branched chain fatty acid (BCFA) production was determined. NaOH consumption was significantly higher for fructooligosaccharide (FOS) than for PGX, which was higher than for cellulose (p = 0.002). At 32, 48 and 72 h, acetate and butyrate production were higher for FOS and PGX versus cellulose. Propionate production was higher for PGX than for cellulose at 32, 48, 56 and 72 h, and higher than for FOS at 72 h (p = 0.014). Total BCFA production was lower for FOS compared to cellulose, whereas production with PGX was lower than for cellulose at 72 h. In conclusion, PGX is fermented by the colonic microbiota, which appeared to adapt to the substrate over time. The greater propionate production for PGX may explain part of the cholesterol-lowering properties of PGX seen in rodents and humans.

  6. COMPUTING

    CERN Multimedia

    M. Kasemann P. McBride Edited by M-C. Sawley with contributions from: P. Kreuzer D. Bonacorsi S. Belforte F. Wuerthwein L. Bauerdick K. Lassila-Perini M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  7. First Clinical Investigation of Cone Beam Computed Tomography and Deformable Registration for Adaptive Proton Therapy for Lung Cancer

    Energy Technology Data Exchange (ETDEWEB)

    Veiga, Catarina [Proton and Advanced RadioTherapy Group, Department of Medical Physics and Biomedical Engineering, University College London, London (United Kingdom); Janssens, Guillaume [Ion Beam Applications SA, Louvain-la-Neuve (Belgium); Teng, Ching-Ling [Department of Radiation Oncology, University of Pennsylvania, Philadelphia, Pennsylvania (United States); Baudier, Thomas; Hotoiu, Lucian [iMagX Project, ICTEAM Institute, Université Catholique de Louvain, Louvain-la-Neuve (Belgium); McClelland, Jamie R. [Centre for Medical Image Computing, Department of Medical Physics and Biomedical Engineering, University College London, London (United Kingdom); Royle, Gary [Proton and Advanced RadioTherapy Group, Department of Medical Physics and Biomedical Engineering, University College London, London (United Kingdom); Lin, Liyong; Yin, Lingshu; Metz, James; Solberg, Timothy D.; Tochner, Zelig; Simone, Charles B.; McDonough, James [Department of Radiation Oncology, University of Pennsylvania, Philadelphia, Pennsylvania (United States); Kevin Teo, Boon-Keng, E-mail: teok@uphs.upenn.edu [Department of Radiation Oncology, University of Pennsylvania, Philadelphia, Pennsylvania (United States)

    2016-05-01

    Purpose: An adaptive proton therapy workflow using cone beam computed tomography (CBCT) is proposed. It consists of an online evaluation of a fast range-corrected dose distribution based on a virtual CT (vCT) scan. This can be followed by more accurate offline dose recalculation on the vCT scan, which can trigger a rescan CT (rCT) for replanning. Methods and Materials: The workflow was tested retrospectively for 20 consecutive lung cancer patients. A diffeomorphic Morphon algorithm was used to generate the lung vCT by deforming the average planning CT onto the CBCT scan. An additional correction step was applied to account for anatomic modifications that cannot be modeled by deformation alone. A set of clinical indicators for replanning was generated according to the water equivalent thickness (WET) and dose statistics and compared with those obtained on the rCT scan. The fast dose approximation consisted of warping the initial planned dose onto the vCT scan according to the changes in WET. The potential under- and over-ranges were assessed as a variation in WET at the target's distal surface. Results: The range-corrected dose from the vCT scan reproduced clinical indicators similar to those of the rCT scan. The workflow performed well under different clinical scenarios, including atelectasis, lung reinflation, and different types of tumor response. Between the vCT and rCT scans, we found a difference in the measured 95th percentile of the over-range distribution of 3.4 ± 2.7 mm. The limitations of the technique consisted of inherent uncertainties in deformable registration and the drawbacks of CBCT imaging. The correction step was adequate when gross errors occurred but could not recover subtle anatomic or density changes in tumors with complex topology. Conclusions: A proton therapy workflow based on CBCT provided clinical indicators similar to those using rCT for patients with lung cancer with considerable anatomic changes.
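    The water equivalent thickness (WET) that drives the replanning indicators is an accumulation of relative stopping power along the beam path. A minimal 1-D sketch with an invented stopping-power profile (a clinical implementation would trace rays through the 3-D vCT/rCT grids):

```python
def wet(rel_stopping_powers, voxel_mm):
    """Water equivalent thickness: sum of RSP x path length per voxel."""
    return sum(rsp * voxel_mm for rsp in rel_stopping_powers)

# invented 1-D relative-stopping-power profiles along one beam ray:
# air gap, soft tissue, lung, tumor (values are illustrative only)
profile_plan = [0.0, 1.0, 1.0, 0.3, 0.3, 1.05]
profile_cbct = [0.0, 1.0, 1.0, 0.3, 1.0, 1.05]  # lung region partly consolidated

voxel_mm = 2.0
delta = wet(profile_cbct, voxel_mm) - wet(profile_plan, voxel_mm)
print(round(delta, 1))  # → 1.4 (mm of WET change at the distal surface)
```

A positive WET change of this kind would shorten the proton range (under-range at the target's distal surface), which is exactly the quantity the workflow monitors as a replanning trigger.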

  8. A qualitative and quantitative analysis of radiation dose and image quality of computed tomography images using adaptive statistical iterative reconstruction.

    Science.gov (United States)

    Hussain, Fahad Ahmed; Mail, Noor; Shamy, Abdulrahman M; Suliman, Alghamdi; Saoudi, Abdelhamid

    2016-05-08

    Image quality is a key issue in radiology, particularly in a clinical setting where it is important to achieve accurate diagnoses while minimizing radiation dose. Some computed tomography (CT) manufacturers have introduced algorithms that claim significant dose reduction. In this study, we assessed CT image quality produced by two reconstruction algorithms provided with GE Healthcare's Discovery 690 Elite positron emission tomography (PET) CT scanner. Image quality was measured for images obtained at various doses with both conventional filtered back-projection (FBP) and adaptive statistical iterative reconstruction (ASIR) algorithms. A standard CT dose index (CTDI) phantom and a pencil ionization chamber were used to measure the CT dose at 120 kVp and an exposure of 260 mAs. Image quality was assessed using two phantoms. CT images of both phantoms were acquired at a tube voltage of 120 kV with exposures ranging from 25 mAs to 400 mAs. Images were reconstructed using FBP and ASIR blending ranging from 10% to 100%, then analyzed for noise, low-contrast detectability, contrast-to-noise ratio (CNR), and modulation transfer function (MTF). Noise was 4.6 HU in water phantom images acquired at 260 mAs/FBP 120 kV and 130 mAs/50% ASIR 120 kV. Large objects (frequency < 7 lp/cm) remained well visualized at reduced dose with ASIR, compared to 260 mAs/FBP. The application of ASIR for small objects (frequency > 7 lp/cm) showed poor visibility compared to FBP at 260 mAs, and even worse for images acquired at less than 130 mAs. ASIR blending of more than 50% at low dose tends to reduce the contrast of small objects (frequency > 7 lp/cm). We concluded that dose reduction and ASIR should be applied with close attention if the objects to be detected or diagnosed are small (frequency > 7 lp/cm). Further investigations are required to correlate small objects (frequency > 7 lp/cm) to patient anatomy and clinical diagnosis.
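    The contrast-to-noise ratio (CNR) used to compare FBP and ASIR reconstructions can be estimated from two regions of interest. A minimal sketch with illustrative, not measured, HU values:

```python
import statistics

def cnr(roi_object, roi_background):
    """Contrast-to-noise ratio from two regions of interest (HU values)."""
    contrast = abs(statistics.mean(roi_object) - statistics.mean(roi_background))
    noise = statistics.pstdev(roi_background)  # background SD as the noise term
    return contrast / noise

obj = [52, 50, 49, 51, 53]  # hypothetical low-contrast insert ROI, HU
bkg = [10, 14, 6, 12, 8]    # hypothetical water background ROI, HU
print(round(cnr(obj, bkg), 1))  # → 14.5
```

Iterative reconstruction lowers the noise term, which raises the CNR at a given dose; the abstract's caveat is that this can come at the cost of contrast for high-frequency (small) objects.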

  9. Reliability, validity and administrative burden of the community reintegration of injured service members computer adaptive test (CRIS-CAT)

    Directory of Open Access Journals (Sweden)

    Resnik Linda

    2012-09-01

    Full Text Available Abstract Background The Computer Adaptive Test version of the Community Reintegration of Injured Service Members measure (CRIS-CAT) consists of three scales measuring Extent of, Perceived Limitations in, and Satisfaction with community integration. The CRIS-CAT was developed using item response theory methods. The purposes of this study were to assess the reliability, concurrent, known-group and predictive validity, and respondent burden of the CRIS-CAT. Methods This was a three-part study that included (1) a cross-sectional field study of 517 homeless, employed, and Operation Enduring Freedom/Operation Iraqi Freedom (OEF/OIF) Veterans who completed all items in the CRIS item set, (2) a cohort study with one-year follow-up of 135 OEF/OIF Veterans, and (3) a 50-person study of CRIS-CAT administration. Conditional reliability of simulated CAT scores was calculated from the field-study data, and concurrent validity and known-group validity were examined using Pearson product correlations and ANOVAs. Data from the cohort were used to examine the ability of the CRIS-CAT to predict key one-year outcomes. Data from the CRIS-CAT administration study were used to calculate ICC (2,1), minimum detectable change (MDC), and the average number of items used during CAT administration. Results Reliability scores for all scales were above 0.75, but decreased at both ends of the score continuum. CRIS-CAT scores were correlated with concurrent validity indicators and differed significantly between the three Veteran groups (P < 0.001). ICC (2,1) values exceeded 0.9. MDCs were 5.9, 6.2, and 3.6, respectively, for the Extent, Perceived and Satisfaction subscales. The numbers of items (mean, SD) administered at Visit 1 were 14.6 (3.8), 10.9 (2.7) and 10.4 (1.7), respectively, for the Extent, Perceived and Satisfaction...
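    The minimum detectable change (MDC) values reported above are conventionally derived from test-retest reliability via the standard error of measurement. A minimal sketch with placeholder SD and ICC inputs (not the study's values):

```python
import math

def mdc95(sd, icc):
    """Minimum detectable change at 95% confidence from test-retest data."""
    sem = sd * math.sqrt(1.0 - icc)     # standard error of measurement
    return 1.96 * math.sqrt(2.0) * sem  # x sqrt(2): a change involves two measurements

print(round(mdc95(sd=10.0, icc=0.95), 1))  # → 6.2
```

Observed score changes smaller than the MDC cannot be distinguished from measurement error, which is why the three subscale MDCs are reported alongside the ICCs.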

  10. Human short-term exposure to electromagnetic fields emitted by mobile phones decreases computer-assisted visual reaction time.

    Science.gov (United States)

    Mortazavi, S M J; Rouintan, M S; Taeb, S; Dehghan, N; Ghaffarpanah, A A; Sadeghi, Z; Ghafouri, F

    2012-06-01

    The worldwide dramatic increase in mobile phone use has generated great concern about the detrimental effects of microwave radiation emitted by these communication devices. Reaction time plays a critical role in performing tasks necessary to avoid hazards. As far as we know, this study is the first survey that reports decreased reaction time after exposure to electromagnetic fields generated by a high specific absorption rate mobile phone. It is also the first study in which previous history of mobile phone use is taken into account. The aim of this study was to assess both the acute and chronic effects of electromagnetic fields emitted by mobile phones on reaction time in university students. Visual reaction time (VRT) of young university students was recorded with a simple blind computer-assisted VRT test, before and after a 10 min real/sham exposure to electromagnetic fields of mobile phones. Participants were 160 right-handed university students aged 18-31. To assess the effect of chronic exposure, the reaction time in the sham-exposed phases was compared among low-level, moderate and frequent users of mobile phones. The mean ± SD reaction times after real exposure and sham exposure were 286.78 ± 31.35 ms and 295.86 ± 32.17 ms, respectively (P < 0.05), in either talk or standby mode. The reaction time in either talk or standby mode was shorter in male students. The students' VRT was significantly affected by exposure to electromagnetic fields emitted by a mobile phone. It can be concluded that these exposures cause decreased reaction time, which may lead to a better response to different hazards. In this light, this phenomenon might decrease the chances of human errors and fatal accidents.
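    A real-versus-sham design like this one is typically analyzed with a within-subject (paired) comparison. A minimal paired t-statistic sketch with invented reaction-time pairs, not the study's data:

```python
import math, statistics

def paired_t(xs, ys):
    """t-statistic for paired samples (each subject measured twice)."""
    diffs = [x - y for x, y in zip(xs, ys)]
    mean_d = statistics.mean(diffs)
    sd_d = statistics.stdev(diffs)  # sample SD of the differences
    return mean_d / (sd_d / math.sqrt(len(diffs)))

real = [285, 290, 280, 292, 288, 284]  # invented RT after real exposure, ms
sham = [295, 298, 287, 300, 296, 290]  # invented RT after sham exposure, ms
t = paired_t(real, sham)
print(round(t, 1))  # → -14.4 (negative: faster after real exposure)
```

Pairing each participant with themselves removes between-subject variability in baseline reaction time, which is why crossover real/sham designs are more sensitive than comparing two independent groups.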

  11. Modeling the Internet of Things, Self-Organizing and Other Complex Adaptive Communication Networks: A Cognitive Agent-Based Computing Approach.

    Directory of Open Access Journals (Sweden)

    Samreen Laghari

    Full Text Available Computer Networks have a tendency to grow at an unprecedented scale. Modern networks involve not only computers but also a wide variety of other interconnected devices ranging from mobile phones to other household items fitted with sensors. This vision of the "Internet of Things" (IoT) implies an inherent difficulty in modeling problems. It is practically impossible to implement and test all scenarios for large-scale and complex adaptive communication networks as part of Complex Adaptive Communication Networks and Environments (CACOONS). The goal of this study is to explore the use of Agent-based Modeling as part of the Cognitive Agent-based Computing (CABC) framework to model a complex communication network problem. We use Exploratory Agent-based Modeling (EABM), as part of the CABC framework, to develop an autonomous multi-agent architecture for managing carbon footprint in a corporate network. To evaluate the application of complexity in practical scenarios, we have also introduced a company-defined computer usage policy. The conducted experiments demonstrated two important results: primarily, a CABC-based modeling approach such as Agent-based Modeling can be an effective approach to modeling complex problems in the domain of IoT; secondly, the specific problem of managing the carbon footprint can be solved using a multiagent system approach.

  12. Modeling the Internet of Things, Self-Organizing and Other Complex Adaptive Communication Networks: A Cognitive Agent-Based Computing Approach

    Science.gov (United States)

    2016-01-01

    Background Computer Networks have a tendency to grow at an unprecedented scale. Modern networks involve not only computers but also a wide variety of other interconnected devices ranging from mobile phones to other household items fitted with sensors. This vision of the "Internet of Things" (IoT) implies an inherent difficulty in modeling problems. Purpose It is practically impossible to implement and test all scenarios for large-scale and complex adaptive communication networks as part of Complex Adaptive Communication Networks and Environments (CACOONS). The goal of this study is to explore the use of Agent-based Modeling as part of the Cognitive Agent-based Computing (CABC) framework to model a complex communication network problem. Method We use Exploratory Agent-based Modeling (EABM), as part of the CABC framework, to develop an autonomous multi-agent architecture for managing carbon footprint in a corporate network. To evaluate the application of complexity in practical scenarios, we have also introduced a company-defined computer usage policy. Results The conducted experiments demonstrated two important results: primarily, a CABC-based modeling approach such as Agent-based Modeling can be an effective approach to modeling complex problems in the domain of IoT. Secondly, the specific problem of managing the carbon footprint can be solved using a multiagent system approach. PMID:26812235

  13. Modeling the Internet of Things, Self-Organizing and Other Complex Adaptive Communication Networks: A Cognitive Agent-Based Computing Approach.

    Science.gov (United States)

    Laghari, Samreen; Niazi, Muaz A

    2016-01-01

    Computer Networks have a tendency to grow at an unprecedented scale. Modern networks involve not only computers but also a wide variety of other interconnected devices ranging from mobile phones to other household items fitted with sensors. This vision of the "Internet of Things" (IoT) implies an inherent difficulty in modeling problems. It is practically impossible to implement and test all scenarios for large-scale and complex adaptive communication networks as part of Complex Adaptive Communication Networks and Environments (CACOONS). The goal of this study is to explore the use of Agent-based Modeling as part of the Cognitive Agent-based Computing (CABC) framework to model a complex communication network problem. We use Exploratory Agent-based Modeling (EABM), as part of the CABC framework, to develop an autonomous multi-agent architecture for managing carbon footprint in a corporate network. To evaluate the application of complexity in practical scenarios, we have also introduced a company-defined computer usage policy. The conducted experiments demonstrated two important results: primarily, a CABC-based modeling approach such as Agent-based Modeling can be an effective approach to modeling complex problems in the domain of IoT. Secondly, the specific problem of managing the carbon footprint can be solved using a multiagent system approach.
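    The multiagent approach described in these records can be caricatured in a few lines: node agents accumulate a carbon footprint from their power draw, and a company usage policy caps their active hours. All numbers below (watts, hours, emission factor) are invented for illustration:

```python
import random

EMISSION_KG_PER_KWH = 0.5  # assumed grid emission factor

class NodeAgent:
    """One networked device; the usage policy caps its active hours."""

    def __init__(self, watts, policy_max_hours):
        self.watts = watts
        self.policy_max_hours = policy_max_hours
        self.footprint_kg = 0.0

    def step(self, requested_hours):
        hours = min(requested_hours, self.policy_max_hours)  # apply policy
        self.footprint_kg += self.watts / 1000.0 * hours * EMISSION_KG_PER_KWH

random.seed(1)
agents = [NodeAgent(watts=200, policy_max_hours=8) for _ in range(10)]
for day in range(5):
    for agent in agents:
        agent.step(requested_hours=random.randint(6, 12))

total = sum(agent.footprint_kg for agent in agents)
print(round(total, 1))  # total fleet footprint, kg CO2-equivalent
```

In an exploratory agent-based study one would compare runs with and without the policy cap and vary agent behaviors; the point of the sketch is only the structure (autonomous agents, a shared policy, an emergent aggregate).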

  14. Finite element model approach of a cylindrical lithium ion battery cell with a focus on minimization of the computational effort and short circuit prediction

    Science.gov (United States)

    Raffler, Marco; Sevarin, Alessio; Ellersdorfer, Christian; Heindl, Simon F.; Breitfuss, Christoph; Sinz, Wolfgang

    2017-08-01

    In this research, a parameterized, beam-element-based mechanical modeling approach for cylindrical lithium-ion batteries is developed. With the goal of using the cell model in entire-vehicle crash simulations, the focus of development is on minimizing the computational effort whilst simultaneously obtaining accurate mechanical behavior. The cylindrical cell shape is approximated by radial beams connected to each other in the circumferential and longitudinal directions. The discrete beam formulation is used to define an anisotropic material behavior. An 18650 lithium-ion cell model constructed in LS-Dyna is used to show the high degree of parameterization of the approach. A criterion which considers the positive-pole deformation and the radial deformation of the cell is developed for short circuit prediction during simulation. An abuse testing program, consisting of radial crush, axial crush, and penetration, is performed to evaluate the mechanical properties and internal short circuit behavior of a commercially available 18650 lithium-ion cell. Additional 3-point-bending tests are performed to verify the approach objectively. By reducing the number of strength-related elements to 1600, a fast and accurate cell model can be created. Compared to typical cell models in the technical literature, the simulation time of a single-cell load case can be reduced by approx. 90%.
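    A deformation-based short-circuit criterion of the kind described, flagging failure when either the positive-pole (axial) deformation or the radial deformation exceeds a threshold, can be sketched as follows; the threshold values are invented, not the calibrated ones from the abuse tests:

```python
AXIAL_LIMIT_MM = 3.0   # assumed positive-pole deformation threshold
RADIAL_LIMIT_MM = 6.0  # assumed radial-crush threshold

def short_circuit(axial_def_mm, radial_def_mm):
    """Flag an internal short when either deformation exceeds its limit."""
    return axial_def_mm >= AXIAL_LIMIT_MM or radial_def_mm >= RADIAL_LIMIT_MM

print(short_circuit(1.2, 4.0))  # → False (below both thresholds)
print(short_circuit(1.2, 6.5))  # → True (radial crush exceeds its limit)
```

In the actual workflow such a check would be evaluated at every simulation time step against deformations read from the beam-element model, with the limits fitted to the radial-crush, axial-crush, and penetration test data.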

  15. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large-scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  16. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  17. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  18. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  19. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, was heavily involved in the October exercise, and worked with CMS sites on resolving open issues relevant for data taking. At the same time, operations for MC production, real-data reconstruction and re-reconstructions, and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave a good indication of the readiness of the WLCG infrastructure, with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, in which Physics, Computing and Offline worked on a common plan to exercise all steps needed to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated that they are fully capable of performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  20. SAGE: A 2-D self-adaptive grid evolution code and its application in computational fluid dynamics

    Science.gov (United States)

    Davies, Carol B.; Venkatapathy, Ethiraj; Deiwert, George S.

    1989-01-01

    SAGE is a user-friendly, highly efficient, two-dimensional self-adaptive grid code based on Nakahashi and Deiwert's variational principles method. Grid points are redistributed into regions of high flowfield gradients while maintaining smoothness and orthogonality of the grid. Efficiency is obtained by splitting the adaptation into two directions and applying one-sided torsion control, thus producing a 1-D elliptic system that can be solved as a set of tridiagonal equations.
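A tridiagonal system of the kind mentioned above is conventionally solved with the Thomas algorithm (forward elimination followed by back substitution). The sketch below is a generic illustration, not code from SAGE:

```python
def thomas(a, b, c, d):
    """Solve a tridiagonal system A x = d, where A has sub-diagonal a,
    main diagonal b and super-diagonal c (lists of length n; a[0] and
    c[-1] are unused)."""
    n = len(d)
    cp = [0.0] * n  # modified super-diagonal
    dp = [0.0] * n  # modified right-hand side
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    # Forward elimination
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    # Back substitution
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x
```

Each sweep of the split adaptation then reduces to one such O(n) solve per grid line, which is what makes the directional splitting efficient.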

  1. An interpolation-free ALE scheme for unsteady inviscid flows computations with large boundary displacements over three-dimensional adaptive grids

    Science.gov (United States)

    Re, B.; Dobrzynski, C.; Guardone, A.

    2017-07-01

    A novel strategy to solve the finite volume discretization of the unsteady Euler equations within the Arbitrary Lagrangian-Eulerian framework over tetrahedral adaptive grids is proposed. The volume changes due to local mesh adaptation are treated as continuous deformations of the finite volumes and are taken into account by adding fictitious numerical fluxes to the governing equation. This interpretation makes it possible to avoid any explicit interpolation of the solution between different grids and to compute grid velocities such that the Geometric Conservation Law is automatically fulfilled, even for connectivity changes. The solution on the new grid is obtained through standard ALE techniques, thus preserving the underlying scheme properties, such as conservativeness, stability and monotonicity. The adaptation procedure includes node insertion, node deletion, edge swapping and point relocation, and it is exploited both to enhance grid quality after boundary movement and to modify the grid spacing to increase solution accuracy. The presented approach is assessed by three-dimensional simulations of steady and unsteady flow fields. The capability of dealing with large boundary displacements is demonstrated by computing the flow around translating infinite- and finite-span NACA 0012 wings moving through the domain at flight speed. The proposed adaptive scheme is also applied to the simulation of a pitching infinite-span wing, where the two-dimensional character of the flow is well reproduced despite the three-dimensional unstructured grid. Finally, the scheme is exploited in a piston-induced shock-tube problem to handle simultaneously the large deformation of the domain and the shock wave. In all tests, mesh adaptation plays a crucial role.
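The Geometric Conservation Law requirement mentioned above can be illustrated in one dimension: if cell volumes are updated consistently with the face velocities, a uniform flow stays exactly uniform on a moving mesh. The sketch below is a hypothetical first-order upwind scheme for linear advection, not the paper's tetrahedral Euler solver:

```python
def ale_step(x, u, a, w, dt):
    """One first-order upwind ALE step for u_t + a u_x = 0 on a moving
    1-D mesh.  x: face positions (length n+1), u: cell averages (length n),
    w: face velocities (length n+1).  Faces move as x + dt*w, so the new
    cell volumes satisfy the discrete Geometric Conservation Law exactly."""
    n = len(u)
    vol_old = [x[i + 1] - x[i] for i in range(n)]
    x_new = [x[i] + dt * w[i] for i in range(n + 1)]
    vol_new = [x_new[i + 1] - x_new[i] for i in range(n)]
    # Upwind flux of (a - w) * u; boundary faces reuse the nearest cell.
    flux = [0.0] * (n + 1)
    for i in range(n + 1):
        ul = u[max(i - 1, 0)]
        ur = u[min(i, n - 1)]
        rel = a - w[i]  # advection speed relative to the moving face
        flux[i] = rel * (ul if rel >= 0 else ur)
    u_new = [(vol_old[i] * u[i] - dt * (flux[i + 1] - flux[i])) / vol_new[i]
             for i in range(n)]
    return x_new, u_new
```

Because the grid-velocity terms in the flux exactly cancel the volume change, a constant state is preserved to round-off, which is the free-stream test usually used to verify GCL compliance.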

  2. Effects of a manualized short-term treatment of internet and computer game addiction (STICA): study protocol for a randomized controlled trial.

    Science.gov (United States)

    Jäger, Susanne; Müller, Kai W; Ruckes, Christian; Wittig, Tobias; Batra, Anil; Musalek, Michael; Mann, Karl; Wölfling, Klaus; Beutel, Manfred E

    2012-04-27

    In the last few years, excessive internet use and computer gaming have increased dramatically. Salience, mood modification, tolerance, withdrawal symptoms, conflict, and relapse have been defined as diagnostic criteria for internet addiction (IA) and computer addiction (CA) in the scientific community. Despite a growing number of individuals seeking help, there are no specific treatments of established efficacy. This clinical trial aims to determine the effect of the disorder-specific manualized short-term treatment of IA/CA (STICA). The cognitive behavioural treatment combines individual and group interventions with a total duration of 4 months. Patients will be randomly assigned to STICA treatment or to a wait list control group. Reliable and valid measures of IA/CA and co-morbid mental symptoms (for example social anxiety, depression) will be assessed prior to the beginning, in the middle, at the end, and 6 months after completion of treatment. A treatment of IA/CA will establish efficacy and is desperately needed. As this is the first trial to determine efficacy of a disorder specific treatment, a wait list control group will be implemented. Pros and cons of the design were discussed. ClinicalTrials (NCT01434589).

  3. Effects of a manualized short-term treatment of internet and computer game addiction (STICA): study protocol for a randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Jäger Susanne

    2012-04-01

    Abstract Background In the last few years, excessive internet use and computer gaming have increased dramatically. Salience, mood modification, tolerance, withdrawal symptoms, conflict, and relapse have been defined as diagnostic criteria for internet addiction (IA) and computer addiction (CA) in the scientific community. Despite a growing number of individuals seeking help, there are no specific treatments of established efficacy. Methods/design This clinical trial aims to determine the effect of the disorder-specific manualized short-term treatment of IA/CA (STICA). The cognitive behavioural treatment combines individual and group interventions with a total duration of 4 months. Patients will be randomly assigned to STICA treatment or to a wait list control group. Reliable and valid measures of IA/CA and co-morbid mental symptoms (for example, social anxiety, depression) will be assessed prior to the beginning, in the middle, at the end, and 6 months after completion of treatment. Discussion A treatment of IA/CA will establish efficacy and is desperately needed. As this is the first trial to determine efficacy of a disorder-specific treatment, a wait list control group will be implemented. Pros and cons of the design were discussed. Trial Registration ClinicalTrials (NCT01434589)

  4. Online adaptation of a c-VEP Brain-Computer Interface (BCI) based on error-related potentials and unsupervised learning.

    Science.gov (United States)

    Spüler, Martin; Rosenstiel, Wolfgang; Bogdan, Martin

    2012-01-01

    The goal of a Brain-Computer Interface (BCI) is to control a computer by pure brain activity. Recently, BCIs based on code-modulated visual evoked potentials (c-VEPs) have shown great potential to establish high-performance communication. In this paper we present a c-VEP BCI that uses online adaptation of the classifier to reduce calibration time and increase performance. We compare two different approaches for online adaptation of the system: an unsupervised method and a method that uses the detection of error-related potentials. Both approaches were tested in an online study, in which an average accuracy of 96% was achieved with adaptation based on error-related potentials. This accuracy corresponds to an average information transfer rate of 144 bit/min, which is the highest bitrate reported so far for a non-invasive BCI. In a free-spelling mode, the subjects were able to write with an average of 21.3 error-free letters per minute, which shows the feasibility of the BCI system in a normal-use scenario. In addition we show that a calibration of the BCI system solely based on the detection of error-related potentials is possible, without knowing the true class labels.
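BCI bitrates such as the 144 bit/min quoted above are commonly reported using the Wolpaw information transfer rate, which assumes N equiprobable targets with errors spread evenly over the remaining targets. The exact formula used in the study is not restated here; a generic sketch:

```python
import math

def wolpaw_itr(n_targets, accuracy, selections_per_min):
    """Information transfer rate in bits/min (Wolpaw definition).
    accuracy must lie in (0, 1]; errors are assumed uniformly
    distributed over the n_targets - 1 wrong choices."""
    p, n = accuracy, n_targets
    bits = math.log2(n)
    if p < 1.0:  # p == 1 contributes no penalty terms
        bits += p * math.log2(p) + (1 - p) * math.log2((1 - p) / (n - 1))
    return bits * selections_per_min
```

For example, with a hypothetical 32-target layout at 96% accuracy each selection conveys roughly 4.6 bits, so rates above 140 bit/min require on the order of thirty selections per minute.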

  5. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity has been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013, with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand, with a peak of over 750 million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week, with peaks close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months   The tape utilisation was a focus for the operation teams, with frequent deletion campaigns from deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  6. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, hopefully only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month in 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  7. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events accumulated in CMS during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, which then serve data on request to T2s, along a topology known as the “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  8. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns led by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking, the roles of a Computing Run Coordinator and regular computing shifts, monitoring the services and infrastructure as well as interfacing to the data operations tasks, are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  9. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort during the last period was focused on the preparation and monitoring of the February tests of the Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing a full-scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  10. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  11. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format; the samples were then run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  12. Adaptive Allocation of Decision Making Responsibility Between Human and Computer in Multi-Task Situations. Ph.D. Thesis

    Science.gov (United States)

    Chu, Y. Y.

    1978-01-01

    A unified formulation of computer-aided, multi-task decision making is presented. A strategy for the allocation of decision-making responsibility between human and computer is developed. The plans of a flight management system are studied. A model based on queueing theory was implemented.
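The abstract does not specify which queueing model was used; as a generic illustration, the closed-form steady-state metrics of an M/M/1 queue, a common starting point for multi-task workload models, can be computed as:

```python
def mm1_metrics(arrival_rate, service_rate):
    """Steady-state M/M/1 metrics: utilisation rho, mean number of
    tasks in the system L = rho / (1 - rho), and mean time in the
    system W = L / lambda (Little's law)."""
    rho = arrival_rate / service_rate
    if rho >= 1.0:
        raise ValueError("queue is unstable (rho >= 1)")
    l = rho / (1.0 - rho)       # mean tasks in system
    w = l / arrival_rate        # mean time a task spends in system
    return rho, l, w
```

For example, tasks arriving once per time unit and served twice per time unit give a utilisation of 0.5, one task in the system on average, and a mean sojourn time of one time unit.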

  13. Micro-Computed Tomography Study of Filling Material Removal from Oval-shaped Canals by Using Rotary, Reciprocating, and Adaptive Motion Systems.

    Science.gov (United States)

    Crozeta, Bruno Monguilhott; Silva-Sousa, Yara Teresinha Correa; Leoni, Graziela Bianchi; Mazzi-Chaves, Jardel Francisco; Fantinato, Thais; Baratto-Filho, Flares; Sousa-Neto, Manoel Damião

    2016-05-01

    This study evaluated filling material removal from distal oval-shaped canals of mandibular molars with rotary, reciprocating, and adaptive motion systems by using micro-computed tomography. After cone-beam computed tomography scanning, 21 teeth were selected, prepared up to a size 40 file, root filled, and divided into 3 groups (n = 7) according to the filling material removal technique: group PTUR, ProTaper Universal Retreatment combined with ProTaper Universal F2, F3, F4, and F5 files; group RP, Reciproc R50 file; and group TFA, TF Adaptive 50.04 files. The specimens were scanned preoperatively and postoperatively to assess filling material removal by using micro-computed tomography imaging, and the percent volume of residual filling material was calculated. The statistical analysis showed the lowest percent volume of residual filling material at the coronal third in all groups (P < .05). In the middle third, group TFA (31.2 ± 10.1) showed a lower volume of residual filling material than group RP (52.4 ± 14.1) (P < .05), as it did in the apical third compared with group RP (70.6 ± 7.2) (P < .05). None of the techniques completely removed the filling material from the canals. Copyright © 2016 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  14. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, are provided. The GlideInWMS and components installation are now deployed at CERN, which is added to the GlideInWMS factory placed in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each others time zones by monitoring/debugging pilot jobs sent from the facto...

  15. Cross-cultural adaptation of the Safety Attitudes Questionnaire - Short Form 2006 for Brazil

    Directory of Open Access Journals (Sweden)

    Rhanna Emanuela Fontenele Lima de Carvalho

    2012-06-01

    The objective of this study was to perform a cross-cultural adaptation of the Safety Attitudes Questionnaire - Short Form 2006 for Brazil. The instrument was applied in six hospitals in three regions of Brazil. Content, face, and construct validity testing was performed. Analysis of the instrument's reliability was performed by verifying the items' internal consistency through Cronbach's alpha. The sample was composed of 1301 professionals working in clinical and surgical wards of six hospitals. Confirmatory analysis showed that the model including 41 items was satisfactory. The Portuguese version presented an alpha of 0.89. The item-total correlations among the domains were moderate to strong, except for the domain Stress Recognition. We concluded that the instrument's version adapted to Portuguese and applied in our sample is valid and reliable.
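The reliability figure quoted above (an alpha of 0.89) is Cronbach's alpha, computed from a respondents-by-items score matrix; a minimal sketch with hypothetical data:

```python
def cronbach_alpha(scores):
    """scores: list of respondents, each a list of k item scores.
    Returns alpha = k/(k-1) * (1 - sum(item variances) / variance(totals)).
    Population variance is used throughout; the ratio is unchanged if
    sample variance is used consistently instead."""
    k = len(scores[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_vars = [var([row[j] for row in scores]) for j in range(k)]
    total_var = var([sum(row) for row in scores])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)
```

Perfectly correlated items yield alpha = 1; uncorrelated items drive the ratio of summed item variances to total variance toward 1 and alpha toward 0.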

  16. Cross-cultural development of an item list for computer-adaptive testing of fatigue in oncological patients

    NARCIS (Netherlands)

    Giesinger, J.M.; Petersen, M.A.; Groenvold, M.; Aaronson, N.K.; Arraras, J.I.; Conroy, T.; Gamper, E.M.; Kemmler, G.; King, M.T.; Oberguggenberger, A.S.; Velikova, G.; Young, T.; Holzner, B.

    2011-01-01

    Introduction Within an ongoing project of the EORTC Quality of Life Group, we are developing computerized adaptive test (CAT) measures for the QLQ-C30 scales. These new CAT measures are conceptualised to reflect the same constructs as the QLQ-C30 scales. Accordingly, the Fatigue-CAT is intended to

  17. Cross-cultural development of an item list for computer-adaptive testing of fatigue in oncological patients

    DEFF Research Database (Denmark)

    Giesinger, Johannes M; Aa Petersen, Morten; Groenvold, Mogens

    2011-01-01

    Within an ongoing project of the EORTC Quality of Life Group, we are developing computerized adaptive test (CAT) measures for the QLQ-C30 scales. These new CAT measures are conceptualised to reflect the same constructs as the QLQ-C30 scales. Accordingly, the Fatigue-CAT is intended to capture phy...

  18. Pipelining Computational Stages of the Tomographic Reconstructor for Multi-Object Adaptive Optics on a Multi?GPU System

    KAUST Repository

    Charara, Ali

    2014-05-04

    The European Extremely Large Telescope (E-ELT) is a high-priority project in ground-based astronomy that aims at constructing the largest telescope ever built. MOSAIC is an instrument proposed for the E-ELT that uses the Multi-Object Adaptive Optics (MOAO) technique for astronomical telescopes, which compensates for the effects of atmospheric turbulence on image quality and operates on patches across a large FoV.

  19. A mathematical model of neuromuscular adaptation to resistance training and its application in a computer simulation of accommodating loads.

    Science.gov (United States)

    Arandjelović, Ognjen

    2010-10-01

    A large corpus of data obtained by means of empirical study of neuromuscular adaptation is currently of limited use to athletes and their coaches. One of the reasons lies in the unclear direct practical utility of many individual trials. This paper introduces a mathematical model of adaptation to resistance training, which derives its elements from physiological fundamentals on the one hand and empirical findings on the other. The key element of the proposed model is what is here termed the athlete's capability profile. This is a generalization of the length- and velocity-dependent force production characteristics of individual muscles to an exercise with arbitrary biomechanics. The capability profile, a two-dimensional function over the capability plane, plays the central role in the proposed model of the training-adaptation feedback loop. Together with a dynamic model of resistance, the capability profile is used in the model's predictive stage, when exercise performance is simulated using a numerical approximation of differential equations of motion. Simulation results are used to infer the adaptational stimulus, which manifests itself through a fed-back modification of the capability profile. It is shown how empirical evidence of exercise specificity can be formulated mathematically and integrated in this framework. A detailed description of the proposed model is followed by examples of its application: new insights into the effects of accommodating loading for powerlifting are demonstrated. This is followed by a discussion of the limitations of the proposed model and an overview of avenues for future work.
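The predictive stage described above, simulating exercise performance by numerically integrating equations of motion, can be sketched in a much simplified one-dimensional form; the constant force and gravity-only resistance used below are placeholders, not the paper's capability profile or resistance model:

```python
def simulate_lift(f_capability, resistance, mass=50.0, dt=0.001, rom=0.5):
    """Integrate m * a = F(x, v) - R(x) with semi-implicit Euler until the
    load travels the range of motion `rom` (metres).  f_capability(x, v)
    and resistance(x) are caller-supplied force models in newtons.
    Returns (time to complete, peak velocity); time is None on failure."""
    x, v, t, v_peak = 0.0, 0.0, 0.0, 0.0
    while x < rom:
        a = (f_capability(x, v) - resistance(x)) / mass
        v = max(v + a * dt, 0.0)   # the bar is not pulled downward here
        x += v * dt
        t += dt
        v_peak = max(v_peak, v)
        if t > 10.0:               # safety cut-off: lift failed
            return None, v_peak
    return t, v_peak
```

Accommodating loads would enter through a resistance(x) that grows with displacement (for example, added band tension), shifting where in the range of motion the net force, and hence the inferred stimulus, is largest.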

  20. Computer adaptive practice of Maths ability using a new item response model for on the fly ability and difficulty estimation

    NARCIS (Netherlands)

    Klinkenberg, S.; Straatemeier, M.; van der Maas, H.L.J.

    2011-01-01

    In this paper we present a model for computerized adaptive practice and monitoring. This model is used in the Maths Garden, a web-based monitoring system, which includes a challenging web environment for children to practice arithmetic. Using a new item response model based on the Elo (1978) rating
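An Elo-style update of the kind referenced above adjusts a student's ability estimate and an item's difficulty estimate after every response, so both are estimated on the fly. This is a simplified sketch that ignores response time (which the actual Maths Garden model incorporates) and uses an assumed fixed learning rate K:

```python
import math

K = 0.4  # learning rate; an assumed value, not the published one

def expected_score(ability, difficulty):
    """Probability of a correct answer in Rasch/Elo logistic form."""
    return 1.0 / (1.0 + math.exp(difficulty - ability))

def update(ability, difficulty, correct):
    """After one response, move ability and difficulty in opposite
    directions by K * (observed - expected)."""
    e = expected_score(ability, difficulty)
    s = 1.0 if correct else 0.0
    return ability + K * (s - e), difficulty - K * (s - e)
```

Because items answered correctly more often than expected drift down in difficulty while the students answering them drift up in ability, the two rating scales calibrate each other without a separate calibration study.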

  1. An Interactive Computer-Aided Instructional Strategy and Assessment Methods for System Identification and Adaptive Control Laboratory

    Science.gov (United States)

    Özbek, Necdet Sinan; Eker, Ilyas

    2015-01-01

    This study describes a set of real-time interactive experiments that address system identification and model reference adaptive control (MRAC) techniques. In constructing laboratory experiments that contribute to efficient teaching, experimental design and instructional strategy are crucial, but a process for doing this has yet to be defined. This…

  2. Computed Tomography Image Quality Evaluation of a New Iterative Reconstruction Algorithm in the Abdomen (Adaptive Statistical Iterative Reconstruction-V) a Comparison With Model-Based Iterative Reconstruction, Adaptive Statistical Iterative Reconstruction, and Filtered Back Projection Reconstructions.

    Science.gov (United States)

    Goodenberger, Martin H; Wagner-Bartak, Nicolaus A; Gupta, Shiva; Liu, Xinming; Yap, Ramon Q; Sun, Jia; Tamm, Eric P; Jensen, Corey T

    2017-08-12

    The purpose of this study was to compare abdominopelvic computed tomography images reconstructed with adaptive statistical iterative reconstruction-V (ASIR-V) with model-based iterative reconstruction (Veo 3.0), ASIR, and filtered back projection (FBP). Abdominopelvic computed tomography scans for 36 patients (26 males and 10 females) were reconstructed using FBP, ASIR (80%), Veo 3.0, and ASIR-V (30%, 60%, 90%). Mean ± SD patient age was 32 ± 10 years with mean ± SD body mass index of 26.9 ± 4.4 kg/m². Images were reviewed by 2 independent readers in a blinded, randomized fashion. Hounsfield unit, noise, and contrast-to-noise ratio (CNR) values were calculated for each reconstruction algorithm for further comparison. Phantom evaluation of low-contrast detectability (LCD) and high-contrast resolution was performed. Adaptive statistical iterative reconstruction-V 30%, ASIR-V 60%, and ASIR 80% were generally superior qualitatively compared with ASIR-V 90%, Veo 3.0, and FBP (P < 0.05). ASIR-V 90% showed superior LCD and had the highest CNR in the liver, aorta, and pancreas, measuring 7.32 ± 3.22, 11.60 ± 4.25, and 4.60 ± 2.31, respectively, compared with the next best series, ASIR-V 60%, with respective CNR values of 5.54 ± 2.39, 8.78 ± 3.15, and 3.49 ± 1.77 (P < 0.05). ASIR-V 30% and ASIR-V 60% provided the best combination of qualitative and quantitative performance. Adaptive statistical iterative reconstruction 80% was equivalent qualitatively, but demonstrated inferior spatial resolution and LCD.
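CNR values of the kind reported above are ratios of the attenuation difference between a region of interest and reference tissue to the image noise; a minimal sketch of the standard definition (the study's exact ROI protocol is not restated here):

```python
def cnr(roi_mean, background_mean, background_sd):
    """Contrast-to-noise ratio: attenuation difference (in HU) between a
    region of interest and reference tissue, normalised by the standard
    deviation of the reference region (image noise)."""
    return abs(roi_mean - background_mean) / background_sd
```

For example, hypothetical ROI and background means of 120 HU and 60 HU with a noise SD of 10 HU give a CNR of 6, which is why noise-suppressing iterative reconstructions raise CNR even when mean attenuation values are unchanged.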

  3. Projected Applications of a "Climate in a Box" Computing System at the NASA Short-Term Prediction Research and Transition (SPoRT) Center

    Science.gov (United States)

    Jedlovec, Gary J.; Molthan, Andrew L.; Zavodsky, Bradley; Case, Jonathan L.; LaFontaine, Frank J.

    2010-01-01

    The NASA Short-term Prediction Research and Transition (SPoRT) Center focuses on the transition of unique observations and research capabilities to the operational weather community, with a goal of improving short-term forecasts on a regional scale. Advances in research computing have led to "Climate in a Box" systems, with hardware configurations capable of producing high resolution, near real-time weather forecasts, but with footprints, power, and cooling requirements that are comparable to desktop systems. The SPoRT Center has developed several capabilities for incorporating unique NASA research capabilities and observations with real-time weather forecasts. Planned utilization includes the development of a fully-cycled data assimilation system used to drive 36-48 hour forecasts produced by the NASA Unified version of the Weather Research and Forecasting (WRF) model (NU-WRF). The horsepower provided by the "Climate in a Box" system is expected to facilitate the assimilation of vertical profiles of temperature and moisture provided by the Atmospheric Infrared Sounder (AIRS) aboard the NASA Aqua satellite. In addition, the Moderate Resolution Imaging Spectroradiometer (MODIS) instruments aboard NASA's Aqua and Terra satellites provide high-resolution sea surface temperatures and vegetation characteristics. The development of MODIS normalized difference vegetation index (NDVI) composites for use within the NASA Land Information System (LIS) will assist in the characterization of vegetation, and subsequently the surface albedo and processes related to soil moisture. Through application of satellite simulators, NASA satellite instruments can be used to examine forecast model errors in cloud cover and other characteristics.
Through the aforementioned application of the "Climate in a Box" system and NU-WRF capabilities, an end goal is the establishment of a real-time forecast system that fully integrates modeling and analysis capabilities developed within the NASA SPo

  4. A computational method for determination of a frequency response characteristic of flexibly supported rigid rotors attenuated by short magnetorheological squeeze film dampers

    Directory of Open Access Journals (Sweden)

    Zapoměl J.

    2011-06-01

    Lateral vibration of rotors can be significantly reduced by inserting damping elements between the shaft and the casing. The theoretical analysis, confirmed by computational simulations, shows that to achieve the optimum compromise between attenuation of the oscillation amplitude and the magnitude of the forces transmitted through the coupling elements between the rotor and the stationary part, the damping effect must be controllable. For this purpose, squeeze film dampers lubricated by magnetorheological fluid can be applied. The damping effect is controlled by changing the intensity of the magnetic field in the lubricating film. This article presents a procedure developed for investigation of the steady-state response of rigid rotors coupled with the casing by flexible elements and short magnetorheological dampers. Their lateral vibration is governed by equations of motion that are nonlinear due to the damping forces. The steady-state solution is obtained by application of a collocation method, which leads to solving a set of nonlinear algebraic equations. The pressure distribution in the oil film is described by a Reynolds equation modified for the case of short dampers and Bingham fluid. Components of the damping force are calculated by integration of the pressure distribution around the circumference and along the length of the damper. The developed procedure makes it possible to determine the steady-state response of rotors excited by their unbalance and the magnitude of the forces transmitted through the coupling elements in the supports into the stationary part, and it is intended for proposing control of the damping effect to achieve optimum performance of the dampers.

  5. Projected Applications of a ``Climate in a Box'' Computing System at the NASA Short-term Prediction Research and Transition (SPoRT) Center

    Science.gov (United States)

    Jedlovec, G.; Molthan, A.; Zavodsky, B.; Case, J.; Lafontaine, F.

    2010-12-01

The NASA Short-term Prediction Research and Transition (SPoRT) Center focuses on the transition of unique observations and research capabilities to the operational weather community, with a goal of improving short-term forecasts on a regional scale. Advances in research computing have led to “Climate in a Box” systems, with hardware configurations capable of producing high-resolution, near real-time weather forecasts, but with footprints, power, and cooling requirements comparable to desktop systems. The SPoRT Center has developed several capabilities for incorporating unique NASA research capabilities and observations into real-time weather forecasts. Planned utilization includes the development of a fully cycled data assimilation system used to drive 36-48 hour forecasts produced by the NASA Unified version of the Weather Research and Forecasting (WRF) model (NU-WRF). The computing power provided by the “Climate in a Box” system is expected to facilitate the assimilation of vertical profiles of temperature and moisture provided by the Atmospheric Infrared Sounder (AIRS) aboard the NASA Aqua satellite. In addition, the Moderate Resolution Imaging Spectroradiometer (MODIS) instruments aboard NASA’s Aqua and Terra satellites provide high-resolution sea surface temperatures and vegetation characteristics. The development of MODIS normalized difference vegetation index (NDVI) composites for use within the NASA Land Information System (LIS) will assist in the characterization of vegetation, and subsequently the surface albedo and processes related to soil moisture. Through the application of satellite simulators, NASA satellite instruments can be used to examine forecast model errors in cloud cover and other characteristics. Through the aforementioned application of the “Climate in a Box” system and NU-WRF capabilities, an end goal is the establishment of a real-time forecast system that fully integrates modeling and analysis capabilities developed
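The abstract mentions MODIS NDVI composites without detail. The index itself, and a simple maximum-value composite of the kind commonly used with MODIS data, can be sketched as follows (a generic sketch, not SPoRT's actual processing chain; function names are illustrative):

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    # normalized difference vegetation index from near-infrared and red reflectance
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)  # eps guards against division by zero

def max_value_composite(ndvi_stack):
    # per-pixel maximum over the compositing period: clouds depress NDVI,
    # so keeping the maximum suppresses cloud-contaminated observations
    return np.max(ndvi_stack, axis=0)
```

Healthy vegetation has high NIR and low red reflectance, so NDVI approaches 1; bare soil and water sit near or below zero.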

  6. Large-Scale Assessment of a Fully Automatic Co-Adaptive Motor Imagery-Based Brain Computer Interface

    National Research Council Canada - National Science Library

    Acqualagna, Laura; Botrel, Loic; Vidaurre, Carmen; Kübler, Andrea; Blankertz, Benjamin

    2016-01-01

In recent years, Brain Computer Interface (BCI) technology has benefited from the development of sophisticated machine learning methods that let the user operate the BCI after a few trials of calibration...

  7. ADAPTATION OF JOHNSON SEQUENCING ALGORITHM FOR JOB SCHEDULING TO MINIMISE THE AVERAGE WAITING TIME IN CLOUD COMPUTING ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    SOUVIK PAL

    2016-09-01

Full Text Available Cloud computing is an emerging paradigm of Internet-centric business computing in which Cloud Service Providers (CSPs) deliver services to customers according to their needs. The key idea behind cloud computing is on-demand sharing of the resources available in the resource pool provided by the CSP, which underpins this emerging business model. Resources are provisioned as jobs arrive. Job scheduling and the minimization of waiting time are challenging issues in cloud computing. When a large number of jobs are requested, they have to wait to be allocated to servers, which in turn may increase both the queue length and the waiting time. This paper presents a system design built around the Johnson scheduling algorithm, which provides the optimal sequence; with that sequence, service times can be obtained. The waiting time and queue length can then be reduced using a multi-server, finite-capacity queuing model, which improves the job scheduling model.
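The abstract does not include the algorithm itself. The following is a minimal Python sketch of Johnson's rule for the classic two-machine flow shop that the paper adapts (function names and the toy job set are illustrative): jobs faster on machine 1 are run first in ascending order of their machine-1 time, the rest last in descending order of their machine-2 time.

```python
def johnson_sequence(jobs):
    """Johnson's rule for a two-machine flow shop.

    jobs: list of (t1, t2) processing-time pairs.
    Returns job indices in a makespan-optimal order."""
    front = sorted((i for i, (t1, t2) in enumerate(jobs) if t1 < t2),
                   key=lambda i: jobs[i][0])
    back = sorted((i for i, (t1, t2) in enumerate(jobs) if t1 >= t2),
                  key=lambda i: jobs[i][1], reverse=True)
    return front + back

def makespan(jobs, order):
    """Completion time of the last job on machine 2 for a given order."""
    end1 = end2 = 0
    for i in order:
        t1, t2 = jobs[i]
        end1 += t1                    # machine 1 runs jobs back to back
        end2 = max(end2, end1) + t2   # machine 2 may wait for machine 1
    return end2

jobs = [(3, 2), (1, 4), (5, 1)]
order = johnson_sequence(jobs)   # [1, 0, 2]
```

For the toy job set, the resulting sequence yields a makespan of 10 time units; any other permutation of these three jobs does no better.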

  8. COMPUTING

    CERN Multimedia

    M. Kasemann

CMS relies on a well-functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and Job Robot submission have been instrumental for site commissioning, increasing the number of sites that are available to participate in CSA07 and ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned; it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a fourfold increase in throughput with respect to the LCG Resource Broker was observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission, and keep exercised, all data transfer routes in the CMS PhEDEx topology. Since mid-February, a transfer volume of about 12 P...

  9. Intestinal adaptation is stimulated by partial enteral nutrition supplemented with the prebiotic short-chain fructooligosaccharide in a neonatal intestinal failure piglet model

    DEFF Research Database (Denmark)

    Barnes, Jennifer L; Hartmann, Bolette; Holst, Jens Juul

    2012-01-01

    Butyrate has been shown to stimulate intestinal adaptation when added to parenteral nutrition (PN) following small bowel resection but is not available in current PN formulations. The authors hypothesized that pre- and probiotic administration may be a clinically feasible method to administer...

  10. Short-term follow-up of masticatory adaptation after rehabilitation with an immediately loaded implant-supported prosthesis: a pilot assessment.

    Science.gov (United States)

    Tanaka, Mihoko; Bruno, Collaert; Jacobs, Reinhilde; Torisu, Tetsurou; Murata, Hiroshi

    2017-12-01

When teeth are extracted, sensory function is decreased by the loss of periodontal ligament receptors. When replacing teeth with oral implants, one hopes to restore the sensory feedback pathway so as to allow physiological implant integration and optimized oral function with implant-supported prostheses. What remains to be investigated is how patients adapt to different oral rehabilitations. The purpose of this pilot study was to assess four aspects of masticatory adaptation after rehabilitation with an immediately loaded implant-supported prosthesis and to observe how each aspect recovers. Eight participants with complete dentures were enrolled. They received an implant-supported acrylic resin provisional bridge 1 day after implant surgery. Masticatory adaptation was examined by assessing occlusal contact, approximate maximum bite force, masticatory efficiency on gum-like specimens, and food hardness perception. Occlusal contact and approximate maximum bite force were significantly increased 3 months after implant rehabilitation, with the bite force gradually building up to a 72% increase compared to baseline. Masticatory efficiency increased by 46% immediately after surgery, stabilizing at around 40% 3 months after implant rehabilitation. Hardness perception also improved, with a reduction of the error rate by 16% over time. This assessment demonstrated masticatory adaptation immediately after implant rehabilitation, with improvements noted up to 3 months after surgery and rehabilitation. It was also observed that, despite gradually improved bite force in all patients, masticatory efficiency and food hardness perception did not necessarily follow this tendency. The findings of this pilot may also be used to assess adaptation of oral function after implant rehabilitation by studying the combined outcome of four tests (occlusal contact, maximum bite force, masticatory efficiency, and food hardness perception).

  11. Translation, cultural adaptation assessment, and both validity and reliability testing of the kidney disease quality of life - short form version 1.3 for use with Iranian patients

    DEFF Research Database (Denmark)

    Pakpour, Amir; Yekaninejad, Mirsaeed; Mølsted, Stig

    2011-01-01

    AIM: The aims of the study were to translate the Kidney Disease Quality of Life--Short Form version 1.3 (KDQOL-SF ver. 1.3) questionnaire into Iranian (Farsi), and to then assess it in terms of validity and reliability on Iranian patients. METHODS: The questionnaire was first translated into Fars...

  12. Computer-aided detection of colonic polyps with level set-based adaptive convolution in volumetric mucosa to advance CT colonography toward a screening modality

    Directory of Open Access Journals (Sweden)

    Hongbin Zhu

    2009-03-01

Full Text Available Hongbin Zhu1, Chaijie Duan1, Perry Pickhardt2, Su Wang1, Zhengrong Liang1,3; 1Department of Radiology, 3Department of Computer Science, State University of New York, Stony Brook, NY, USA; 2Department of Radiology, University of Wisconsin Medical School, Madison, WI, USA. Abstract: As a promising second reader for computed tomographic colonography (CTC) screening, computer-aided detection (CAD) of colonic polyps has earned fast-growing research interest. In this paper, we present a CAD scheme to automatically detect colonic polyps in CTC images. First, a thick colon wall representation, i.e., a volumetric mucosa (VM, generally several voxels wide), was segmented from the CTC images by a partial-volume image segmentation algorithm. Based on the VM, we employed a level set-based adaptive convolution method to calculate the first- and second-order spatial derivatives more accurately as the starting point of the geometric analysis. Furthermore, to emphasize the correspondence among different layers in the VM, we introduced a middle-layer enhanced integration along the image gradient direction inside the VM to improve the extraction of geometric information such as the principal curvatures. Initial polyp candidates (IPCs) were then determined by thresholding the geometric measurements. Several features were extracted for each IPC and fed into a support vector machine to reduce false positives (FPs). The final detections were displayed in a commercial system to provide second opinions for radiologists. The CAD scheme was applied to 26 patient CTC studies with 32 polyps confirmed by both optical and virtual colonoscopy. Compared to our previous work, all the polyps were detected successfully with fewer FPs. At 100% by-polyp sensitivity, the new method yielded 3.5 FPs/dataset. 
Keywords: colonic polyps, level set, adaptive convolution, middle-layer enhanced integration, volumetric mucosa, support vector machine, computer-aided detection, CT

  13. Application of an Adaptive Polynomial Chaos Expansion on Computationally Expensive Three-Dimensional Cardiovascular Models for Uncertainty Quantification and Sensitivity Analysis.

    Science.gov (United States)

    Quicken, Sjeng; Donders, Wouter P; van Disseldorp, Emiel M J; Gashi, Kujtim; Mees, Barend M E; van de Vosse, Frans N; Lopata, Richard G P; Delhaas, Tammo; Huberts, Wouter

    2016-12-01

    When applying models to patient-specific situations, the impact of model input uncertainty on the model output uncertainty has to be assessed. Proper uncertainty quantification (UQ) and sensitivity analysis (SA) techniques are indispensable for this purpose. An efficient approach for UQ and SA is the generalized polynomial chaos expansion (gPCE) method, where model response is expanded into a finite series of polynomials that depend on the model input (i.e., a meta-model). However, because of the intrinsic high computational cost of three-dimensional (3D) cardiovascular models, performing the number of model evaluations required for the gPCE is often computationally prohibitively expensive. Recently, Blatman and Sudret (2010, "An Adaptive Algorithm to Build Up Sparse Polynomial Chaos Expansions for Stochastic Finite Element Analysis," Probab. Eng. Mech., 25(2), pp. 183-197) introduced the adaptive sparse gPCE (agPCE) in the field of structural engineering. This approach reduces the computational cost with respect to the gPCE, by only including polynomials that significantly increase the meta-model's quality. In this study, we demonstrate the agPCE by applying it to a 3D abdominal aortic aneurysm (AAA) wall mechanics model and a 3D model of flow through an arteriovenous fistula (AVF). The agPCE method was indeed able to perform UQ and SA at a significantly lower computational cost than the gPCE, while still retaining accurate results. Cost reductions ranged between 70-80% and 50-90% for the AAA and AVF model, respectively.
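The agPCE implementation itself is not reproduced in the abstract. As a plain, non-adaptive illustration of the gPCE idea it builds on, the sketch below fits a Legendre-polynomial expansion by least-squares regression for uniformly distributed inputs and reads first-order Sobol sensitivity indices directly off the squared coefficients (the toy model and all names are illustrative; an adaptive sparse variant would additionally prune the basis):

```python
import numpy as np
from itertools import product
from numpy.polynomial.legendre import Legendre

def norm_leg(n, x):
    # Legendre polynomial of degree n, scaled to unit variance under U(-1, 1)
    c = np.zeros(n + 1)
    c[n] = 1.0
    return np.sqrt(2 * n + 1) * Legendre(c)(x)

def fit_pce(X, y, degree):
    # total-degree multi-index set, then least-squares fit of the coefficients
    dim = X.shape[1]
    idx = [a for a in product(range(degree + 1), repeat=dim) if sum(a) <= degree]
    A = np.column_stack([
        np.prod([norm_leg(a[j], X[:, j]) for j in range(dim)], axis=0)
        for a in idx])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return idx, coef

def first_order_sobol(idx, coef):
    # variance = sum of squared non-constant coefficients (orthonormal basis);
    # first-order index of input j uses terms involving input j alone
    var = sum(c ** 2 for a, c in zip(idx, coef) if any(a))
    dim = len(idx[0])
    return [sum(c ** 2 for a, c in zip(idx, coef)
                if a[j] > 0 and all(a[m] == 0 for m in range(dim) if m != j)) / var
            for j in range(dim)]
```

For an additive test model such as y = x1 + 0.3 x2, the fitted indices reproduce the analytic values S1 = (1/3)/(1/3 + 0.03) and S2 = 1 - S1, since each input's variance contribution appears in exactly one coefficient.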

  14. Adaptive discrete cosine transform-based image compression method on a heterogeneous system platform using Open Computing Language

    Science.gov (United States)

    Alqudami, Nasser; Kim, Shin-Dug

    2014-11-01

Discrete cosine transform (DCT) is one of the major operations in image compression standards, and it requires intensive and complex computations. Recent computer systems and handheld devices are equipped with high-capability computing devices, such as a general-purpose graphics processing unit (GPGPU), in addition to the traditional multicore CPU. We develop an optimized parallel implementation of the forward DCT algorithm for JPEG image compression using the Open Computing Language (OpenCL). This OpenCL parallel implementation combines a multicore CPU and a GPGPU in a single solution to perform DCT computations efficiently, applying optimization techniques to enhance kernel execution time and data movements. Separate optimal OpenCL kernel codes were developed (CPU-based and GPU-based kernels) based on appropriate device-specific optimization factors, such as thread mapping, thread granularity, vector-based memory access, and the given workload. The performance of the DCT is evaluated in a heterogeneous environment, and our OpenCL parallel implementation speeds up the execution of the DCT by factors of 3.68 and 5.58 for different image sizes and formats in terms of workload allocations and data transfer mechanisms. The obtained speedup indicates the scalability of the DCT performance.
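The OpenCL kernels are not shown in the abstract, but the per-block computation they parallelize is compact: the forward 8×8 DCT-II used by JPEG reduces, in its separable formulation, to two matrix multiplications. A NumPy sketch of that formulation (names are illustrative):

```python
import numpy as np

def dct2_block(block):
    """Separable 2-D DCT-II of a square block (the JPEG forward transform).

    Builds the orthonormal DCT matrix C and applies it on both sides:
    rows first, then columns, i.e. C @ block @ C.T."""
    N = block.shape[0]
    k = np.arange(N)[:, None]          # frequency index
    n = np.arange(N)[None, :]          # sample index
    C = np.cos(np.pi * (2 * n + 1) * k / (2 * N))
    scale = np.full(N, np.sqrt(2.0 / N))
    scale[0] = np.sqrt(1.0 / N)        # DC row gets a smaller weight
    C = scale[:, None] * C
    return C @ block @ C.T
```

A GPU implementation would typically assign one work-group per 8×8 block; the two matrix products above are what each CPU or GPU kernel computes per block. For a constant block, all energy lands in the DC coefficient, a quick correctness check.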

  15. Ensemble based adaptive over-sampling method for imbalanced data learning in computer aided detection of microaneurysm.

    Science.gov (United States)

    Ren, Fulong; Cao, Peng; Li, Wei; Zhao, Dazhe; Zaiane, Osmar

    2017-01-01

Diabetic retinopathy (DR) is a progressive disease, and its detection at an early stage is crucial for saving a patient's vision. An automated screening system for DR can help reduce the chance of complete blindness due to DR while lowering the workload on ophthalmologists. Among the earliest signs of DR are microaneurysms (MAs). However, current schemes for MA detection tend to report many false positives because the detection algorithms have high sensitivity; inevitably, some non-MA structures are labeled as MAs in the initial MA identification step. This is a typical "class imbalance problem", and class-imbalanced data has detrimental effects on the performance of conventional classifiers. In this work, we propose an ensemble-based adaptive over-sampling algorithm for overcoming the class imbalance problem in the false-positive reduction step, and we use Boosting, Bagging, and Random Subspace as the ensemble frameworks to improve microaneurysm detection. The proposed ensemble-based over-sampling methods combine the strengths of adaptive over-sampling and ensembles: the objective of this amalgamation is to reduce the induction biases introduced by imbalanced data and to enhance the generalization classification performance of extreme learning machines (ELM). Experimental results show that our ASOBoost method has higher area under the ROC curve (AUC) and G-mean values than many existing class imbalance learning methods. Copyright © 2016 Elsevier Ltd. All rights reserved.
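ASOBoost itself is not specified in the abstract; the interpolation step that adaptive over-sampling methods build on can, however, be sketched generically. The following is a SMOTE-style sketch, not the authors' algorithm, and all names are illustrative: new minority samples are synthesized on line segments between existing minority samples and their nearest minority neighbours.

```python
import numpy as np

def interpolate_minority(X_min, n_new, k=5, seed=None):
    """Synthesize n_new minority-class samples by interpolating each chosen
    sample toward one of its k nearest minority neighbours (SMOTE-style)."""
    rng = np.random.default_rng(seed)
    n = len(X_min)
    # pairwise distances within the minority class
    d = np.linalg.norm(X_min[:, None, :] - X_min[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)        # a point is not its own neighbour
    nn = np.argsort(d, axis=1)[:, :k]  # k nearest neighbours per sample
    base = rng.integers(0, n, n_new)
    nbr = nn[base, rng.integers(0, k, n_new)]
    gap = rng.random((n_new, 1))       # interpolation fraction in [0, 1)
    return X_min[base] + gap * (X_min[nbr] - X_min[base])
```

An *adaptive* variant, in the spirit of methods like ADASYN, would additionally bias `base` toward minority samples that are hardest to classify; combining the over-sampling with Boosting, Bagging, or Random Subspace then trains each ensemble member on a differently balanced set.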

  16. Short-term outcomes and safety of computed tomography-guided percutaneous microwave ablation of solitary adrenal metastasis from lung cancer: A multi-center retrospective study

    Energy Technology Data Exchange (ETDEWEB)

Men, Min; Ye, Xin; Yang, Xia; Zheng, Aimin; Huang, Guang Hui; Wei, Zhigang [Dept. of Oncology, Shandong Provincial Hospital Affiliated with Shandong University, Jinan (China); Fan, Wei Jun [Imaging and Interventional Center, Sun Yat-sen University Cancer Center, Guangzhou (China); Zhang, Kaixian [Dept. of Oncology, Teng Zhou Central People's Hospital Affiliated with Jining Medical College, Tengzhou (China); Bi, Jing Wang [Dept. of Oncology, Jinan Military General Hospital of Chinese People's Liberation Army, Jinan (China)

    2016-11-15

To retrospectively evaluate the short-term outcomes and safety of computed tomography (CT)-guided percutaneous microwave ablation (MWA) of solitary adrenal metastasis from lung cancer. From May 2010 to April 2014, 31 patients with unilateral adrenal metastasis from lung cancer treated with CT-guided percutaneous MWA were enrolled. The study was conducted with approval from the local Institutional Review Board. Clinical outcomes and complications of MWA were assessed. Tumors ranged from 1.5 to 5.4 cm in diameter. After a median follow-up period of 11.1 months, the primary efficacy rate was 90.3% (28/31). Local tumor progression was detected in 7 (22.6%) of 31 cases. Median overall survival time was 12 months, and the 1-year overall survival rate was 44.3%. Median local tumor progression-free survival time was 9 months, and the local tumor progression-free survival rate was 77.4%. Of 36 MWA sessions, two (5.6%) had major complications (hypertensive crisis). CT-guided percutaneous MWA may be fairly safe and effective for treating solitary adrenal metastasis from lung cancer.

  17. The Effect of Lactulose on the Composition of the Intestinal Microbiota and Short-chain Fatty Acid Production in Human Volunteers and a Computer-controlled Model of the Proximal Large Intestine

    NARCIS (Netherlands)

    Venema, K.; Nuenen, M.H.M.C. van; Heuvel, E.G. van den; Pool, W.; Vossen, J.M.B.M. van der

    2003-01-01

    The objective of this study was to compare the in vivo effect of lactulose on faecal parameters with the effect in a dynamic, computer-controlled in vitro model of the proximal large intestine (TIM-2). Faecal samples from 10 human volunteers collected before (non-adapted) and after 1 week of

  18. Computationally efficient video restoration for Nyquist sampled imaging sensors combining an affine-motion-based temporal Kalman filter and adaptive Wiener filter.

    Science.gov (United States)

    Rucci, Michael; Hardie, Russell C; Barnard, Kenneth J

    2014-05-01

    In this paper, we present a computationally efficient video restoration algorithm to address both blur and noise for a Nyquist sampled imaging system. The proposed method utilizes a temporal Kalman filter followed by a correlation-model based spatial adaptive Wiener filter (AWF). The Kalman filter employs an affine background motion model and novel process-noise variance estimate. We also propose and demonstrate a new multidelay temporal Kalman filter designed to more robustly treat local motion. The AWF is a spatial operation that performs deconvolution and adapts to the spatially varying residual noise left in the Kalman filter stage. In image areas where the temporal Kalman filter is able to provide significant noise reduction, the AWF can be aggressive in its deconvolution. In other areas, where less noise reduction is achieved with the Kalman filter, the AWF balances the deconvolution with spatial noise reduction. In this way, the Kalman filter and AWF work together effectively, but without the computational burden of full joint spatiotemporal processing. We also propose a novel hybrid system that combines a temporal Kalman filter and BM3D processing. To illustrate the efficacy of the proposed methods, we test the algorithms on both simulated imagery and video collected with a visible camera.
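The full affine-motion pipeline is beyond an abstract, but the temporal stage can be illustrated with a per-pixel scalar Kalman filter. This is a simplified sketch without motion compensation or the paper's process-noise estimate; parameter names and defaults are illustrative:

```python
import numpy as np

def temporal_kalman(frames, q=1e-3, r=1e-2):
    """Scalar Kalman filter run independently at every pixel.

    state = scene intensity; q models scene change (process noise variance),
    r is the sensor noise variance. frames: array of shape (T, ...)."""
    est = frames[0].astype(float)
    p = np.full_like(est, r)                # initial estimate covariance
    out = [est.copy()]
    for z in frames[1:]:
        p_pred = p + q                      # predict: uncertainty grows
        gain = p_pred / (p_pred + r)        # Kalman gain
        est = est + gain * (z - est)        # update toward the new frame
        p = (1.0 - gain) * p_pred
        out.append(est.copy())
    return np.stack(out)
```

In static regions the gain shrinks over time and noise is strongly averaged out, which is exactly what lets the subsequent adaptive Wiener filter deconvolve aggressively there; where residual noise remains higher, the AWF trades deconvolution for spatial smoothing.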

  19. Using an adaptive expertise lens to understand the quality of teachers' classroom implementation of computer-supported complex systems curricula in high school science

    Science.gov (United States)

    Yoon, Susan A.; Koehler-Yom, Jessica; Anderson, Emma; Lin, Joyce; Klopfer, Eric

    2015-05-01

    Background: This exploratory study is part of a larger-scale research project aimed at building theoretical and practical knowledge of complex systems in students and teachers with the goal of improving high school biology learning through professional development and a classroom intervention. Purpose: We propose a model of adaptive expertise to better understand teachers' classroom practices as they attempt to navigate myriad variables in the implementation of biology units that include working with computer simulations, and learning about and teaching through complex systems ideas. Sample: Research participants were three high school biology teachers, two females and one male, ranging in teaching experience from six to 16 years. Their teaching contexts also ranged in student achievement from 14-47% advanced science proficiency. Design and methods: We used a holistic multiple case study methodology and collected data during the 2011-2012 school year. Data sources include classroom observations, teacher and student surveys, and interviews. Data analyses and trustworthiness measures were conducted through qualitative mining of data sources and triangulation of findings. Results: We illustrate the characteristics of adaptive expertise of more or less successful teaching and learning when implementing complex systems curricula. We also demonstrate differences between case study teachers in terms of particular variables associated with adaptive expertise. Conclusions: This research contributes to scholarship on practices and professional development needed to better support teachers to teach through a complex systems pedagogical and curricular approach.

  20. Radiation dose considerations by intra-individual Monte Carlo simulations in dual source spiral coronary computed tomography angiography with electrocardiogram-triggered tube current modulation and adaptive pitch

    Energy Technology Data Exchange (ETDEWEB)

    May, Matthias S.; Kuettner, Axel; Lell, Michael M.; Wuest, Wolfgang; Scharf, Michael; Uder, Michael [University of Erlangen, Department of Radiology, Erlangen (Germany); Deak, Paul; Kalender, Willi A. [University of Erlangen, Department of Medical Physics, Erlangen (Germany); Keller, Andrea K.; Haeberle, Lothar [University of Erlangen, Department of Medical Informatics, Biometry and Epidemiology, Erlangen (Germany); Achenbach, Stephan; Seltmann, Martin [University of Erlangen, Department of Cardiology, Erlangen (Germany)

    2012-03-15

To evaluate radiation dose levels in patients undergoing spiral coronary computed tomography angiography (CTA) on a dual-source system in clinical routine. Coronary CTA was performed for 56 patients with electrocardiogram-triggered tube current modulation (TCM) and heart-rate (HR) dependent pitch adaptation. Individual Monte Carlo (MC) simulations were performed for dose assessment. Retrospective simulations with constant tube current (CTC) served as reference. Lung tissue was segmented and used for organ and effective dose (ED) calculation. Estimated mean relative ED was 7.1 {+-} 2.1 mSv/100 mAs for TCM and 12.5 {+-} 5.3 mSv/100 mAs for CTC (P < 0.001). Relative dose reduction was highest at low HR ({<=}60 bpm; 49 {+-} 5%) compared with intermediate (60-70 bpm; 33 {+-} 12%) and high HR (>70 bpm; 29 {+-} 12%). However, the lowest ED is achieved at high HR (5.2 {+-} 1.5 mSv/100 mAs), compared with intermediate (6.7 {+-} 1.6 mSv/100 mAs) and low (8.3 {+-} 2.1 mSv/100 mAs) HR, when automated pitch adaptation is applied. Radiation dose savings of up to 52% are achievable by TCM at low and regular HR. However, the lowest ED is attained at high HR by pitch adaptation, despite the inferior radiation dose reduction by TCM there. Monte Carlo simulations allow for individual radiation dose calculations. (orig.)

  1. Developing a new case based computer-aided detection scheme and an adaptive cueing method to improve performance in detecting mammographic lesions

    Science.gov (United States)

    Tan, Maxine; Aghaei, Faranak; Wang, Yunzhi; Zheng, Bin

    2017-01-01

The purpose of this study is to evaluate a new method to improve the performance of computer-aided detection (CAD) schemes for screening mammograms, with two approaches. In the first approach, we developed a new case-based CAD scheme using a set of optimally selected global mammographic density, texture, spiculation, and structural similarity features computed from all four full-field digital mammography images of the craniocaudal (CC) and mediolateral oblique (MLO) views, selected with a modified fast and accurate sequential floating forward selection (SFFS) feature selection algorithm. The selected features were then applied to a ‘scoring fusion’ artificial neural network classification scheme to produce a final case-based risk score. In the second approach, we combined the case-based risk score with the lesion-based scores of a conventional lesion-based CAD scheme using a new adaptive cueing method integrated with the case-based risk scores. We evaluated our methods using a ten-fold cross-validation scheme on 924 cases (476 cancer and 448 recalled or negative), whereby each case had all four images from the CC and MLO views. The area under the receiver operating characteristic curve was AUC = 0.793 ± 0.015, and the odds ratio monotonically increased from 1 to 37.21 as CAD-generated case-based detection scores increased. Using the new adaptive cueing method, the region-based and case-based sensitivities of the conventional CAD scheme at a false-positive rate of 0.71 per image increased by 2.4% and 0.8%, respectively. The study demonstrated that supplementary information can be derived by computing global mammographic density image features to improve CAD-cueing performance on suspicious mammographic lesions.
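The modified SFFS variant used in the paper is not given in the abstract. As a generic illustration, standard sequential floating forward selection alternates a greedy forward step with a conditional backward step; the simplified sketch below takes any scoring callback (a full SFFS implementation also tracks the best subset seen at each size to avoid cycling, which this sketch omits; all names are illustrative):

```python
def sffs(features, score, k):
    """Simplified Sequential Floating Forward Selection.

    features: candidate feature identifiers; score: callback mapping a
    feature subset (list) to a quality value; k: target subset size."""
    selected = []
    while len(selected) < k:
        # forward step: add the single best remaining feature
        best = max((f for f in features if f not in selected),
                   key=lambda f: score(selected + [f]))
        selected.append(best)
        # floating backward step: drop features while removal improves the score
        improved = True
        while improved and len(selected) > 2:
            improved = False
            for f in list(selected):
                trial = [g for g in selected if g != f]
                if score(trial) > score(selected):
                    selected = trial
                    improved = True
    return selected
```

In a CAD setting, `score` would typically be a cross-validated AUC of the downstream classifier on the candidate feature subset.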

  2. Verification of computer system PROLOG - software tool for rapid assessments of consequences of short-term radioactive releases to the atmosphere

    Energy Technology Data Exchange (ETDEWEB)

    Kiselev, Alexey A.; Krylov, Alexey L.; Bogatov, Sergey A. [Nuclear Safety Institute (IBRAE), Bolshaya Tulskaya st. 52, 115191, Moscow (Russian Federation)

    2014-07-01

In case of nuclear and radiation accidents, emergency response authorities require a tool for rapid assessment of possible consequences. One of the most significant problems is the lack of data on the initial state of an accident; this lack can be especially critical when an accident occurs in a location that was not thoroughly studied beforehand (during transportation of radioactive materials, for example). One possible solution is a hybrid method in which a model that enables rapid assessments from a reasonable minimum of input data is used conjointly with observed data that can be collected shortly after the accident. The model is used to estimate parameters of the source and uncertain meteorological parameters on the basis of some observed data; for example, the field of fallout density can be observed and measured within hours after an accident. After that, the same model, with the estimated parameters, is used to assess doses and the necessity of recommended and mandatory countermeasures. The computer system PROLOG was designed to solve this problem. It is based on the widely used Gaussian model. The standard Gaussian model is supplemented with several sub-models that take into account: polydisperse aerosols, the aerodynamic shadow of buildings in the vicinity of the accident site, terrain orography, the initial size of the radioactive cloud, the effective height of the release, and the influence of countermeasures on the radiation doses received by humans. It uses modern GIS technologies and can use web map services. To verify the ability of PROLOG to solve the problem, it is necessary to test its ability to assess the relevant parameters of real past accidents. Verification of the computer system against data from the Chazhma Bay accident (Russian Far East, 1985) was published previously. In this work, verification was carried out against the observed contamination from the Kyshtym disaster (PA Mayak, 1957) and the Tomsk accident (1993). Observations of Sr-90
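PROLOG's extensions (polydispersity, building wakes, orography, countermeasures) go beyond the standard model, but the Gaussian core it builds on is compact enough to sketch. The ground-reflection form below uses Briggs open-country class-D dispersion coefficients purely as an illustrative assumption; an operational code would select coefficients by atmospheric stability class:

```python
import numpy as np

def gaussian_plume(Q, u, x, y, z, H):
    """Ground-reflection form of the standard Gaussian plume model.

    Q: source strength (e.g. Bq/s), u: wind speed (m/s), x: downwind,
    y: crosswind, z: receptor height, H: effective release height (all m).
    Briggs rural class-D dispersion coefficients are assumed here."""
    sigma_y = 0.08 * x / np.sqrt(1.0 + 0.0001 * x)
    sigma_z = 0.06 * x / np.sqrt(1.0 + 0.0015 * x)
    coeff = Q / (2.0 * np.pi * u * sigma_y * sigma_z)
    return coeff * np.exp(-y**2 / (2.0 * sigma_y**2)) * (
        np.exp(-(z - H)**2 / (2.0 * sigma_z**2))      # direct plume
        + np.exp(-(z + H)**2 / (2.0 * sigma_z**2)))   # ground reflection
```

In the hybrid scheme described above, the same forward model would first be fitted to measured fallout density to recover source and meteorological parameters, then re-run to map doses.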

  3. Short- and Medium-Term Efficacy of a Web-Based Computer-Tailored Nutrition Education Intervention for Adults Including Cognitive and Environmental Feedback: Randomized Controlled Trial

    Science.gov (United States)

    Lechner, Lilian; de Vries, Hein; Candel, Math JJM; Oenema, Anke

    2015-01-01

    Background Web-based, computer-tailored nutrition education interventions can be effective in modifying self-reported dietary behaviors. Traditional computer-tailored programs primarily targeted individual cognitions (knowledge, awareness, attitude, self-efficacy). Tailoring on additional variables such as self-regulation processes and environmental-level factors (the home food environment arrangement and perception of availability and prices of healthy food products in supermarkets) may improve efficacy and effect sizes (ES) of Web-based computer-tailored nutrition education interventions. Objective This study evaluated the short- and medium-term efficacy and educational differences in efficacy of a cognitive and environmental feedback version of a Web-based computer-tailored nutrition education intervention on self-reported fruit, vegetable, high-energy snack, and saturated fat intake compared to generic nutrition information in the total sample and among participants who did not comply with dietary guidelines (the risk groups). Methods A randomized controlled trial was conducted with a basic (tailored intervention targeting individual cognition and self-regulation processes; n=456), plus (basic intervention additionally targeting environmental-level factors; n=459), and control (generic nutrition information; n=434) group. Participants were recruited from the general population and randomly assigned to a study group. Self-reported fruit, vegetable, high-energy snack, and saturated fat intake were assessed at baseline and at 1- (T1) and 4-months (T2) postintervention using online questionnaires. Linear mixed model analyses examined group differences in change over time. Educational differences were examined with group×time×education interaction terms. Results In the total sample, the basic (T1: ES=–0.30; T2: ES=–0.18) and plus intervention groups (T1: ES=–0.29; T2: ES=–0.27) had larger decreases in high-energy snack intake than the control group. The

  4. Adaptive Lighting

    DEFF Research Database (Denmark)

    Petersen, Kjell Yngve; Kongshaug, Jesper; Søndergaard, Karin

    2015-01-01

    offered by adaptive lighting control are created by the ways that the system components, the network and data flow can be coordinated through software so that the dynamic variations are controlled in ways that meaningfully adapt according to people’s situations and design intentions. This book discusses...... differently into an architectural body. We also examine what might occur when light is dynamic and able to change colour, intensity and direction, and when it is adaptive and can be brought into interaction with its surroundings. In short, what happens to an architectural space when artificial lighting ceases...... to be static, and no longer acts as a kind of spatial constancy maintaining stability and order? Moreover, what new potentials open in lighting design? This book is one of four books that is published in connection with the research project entitled LED Lighting; Interdisciplinary LED Lighting Research...

  5. The nociceptive withdrawal reflex does not adapt to joint position change and short-term motor practice [v2; ref status: indexed, http://f1000r.es/2lr

    Directory of Open Access Journals (Sweden)

    Nathan Eckert

    2013-12-01

    Full Text Available The nociceptive withdrawal reflex is a protective mechanism to mediate interactions within a potentially dangerous environment. The reflex is formed by action-based sensory encoding during the early post-natal developmental period, and it is unknown if the protective motor function of the nociceptive withdrawal reflex in the human upper limb is adaptable based on the configuration of the arm or if it can be modified by short-term practice of a similar or opposing motor action. In the present study, nociceptive withdrawal reflexes were evoked by a brief train of electrical stimuli applied to digit II (1) in five different static arm positions and (2) before and after motor practice that was opposite (EXT) or similar (FLEX) to the stereotyped withdrawal response, in 10 individuals. Withdrawal responses were quantified by the electromyographic (EMG) reflex response in several upper limb muscles, and by the forces and moments recorded at the wrist. EMG onset latencies and response amplitudes were not significantly different across the arm positions or between the EXT and FLEX practice conditions, and the general direction of the withdrawal response was similar across arm positions. In addition, the force vectors were not different after practice in either practice condition or between the EXT and FLEX conditions. We conclude that the withdrawal response is insensitive to changes in elbow or shoulder joint angles and resistant to short-term adaptation from the practice of motor actions, resulting in a generalized limb withdrawal in each case. It is further hypothesized that multisensory feedback is weighted differently in each arm position, but integrated to achieve a similar withdrawal response to safeguard against erroneous motor responses that could cause further harm. The results remain consistent with the concept that nociceptive withdrawal reflexes are shaped through long-term, not short-term, action-based sensory encoding.

  6. Short-Term Local Adaptation of Historical Common Bean (Phaseolus vulgaris L.) Varieties and Implications for In Situ Management of Bean Diversity.

    Science.gov (United States)

    Klaedtke, Stephanie M; Caproni, Leonardo; Klauck, Julia; de la Grandville, Paul; Dutartre, Martin; Stassart, Pierre M; Chable, Véronique; Negri, Valeria; Raggi, Lorenzo

    2017-02-28

    Recognizing both the stakes of traditional European common bean diversity and the role farmers' and gardeners' networks play in maintaining this diversity, the present study examines the role that local adaptation plays in the management of common bean diversity in situ. For this purpose, four historical bean varieties and one modern control were multiplied on two organic farms for three growing seasons. The fifteen resulting populations, the initial ones and two populations of each variety obtained after the three years of multiplication, were then grown in a common garden. Twenty-two Simple Sequence Repeat (SSR) markers and 13 phenotypic traits were assessed. In total, 68.2% of tested markers were polymorphic and a total of 66 different alleles were identified. FST analysis showed that the genetic composition of two varieties multiplied in different environments changed. At the phenotypic level, differences were observed in flowering date and leaf length. Results indicate that three years of multiplication suffice for local adaptation to occur. The spatial dynamics of genetic and phenotypic bean diversity imply that the maintenance of diversity should be considered at the scale of the network, rather than of individual farms and gardens. The microevolution of bean populations within networks of gardens and farms emerges as a research perspective.
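
    The FST comparison mentioned above rests on comparing heterozygosity within and across populations. A minimal sketch of Wright's FST for a single biallelic marker follows, assuming two equally sized subpopulations; the allele frequencies are invented for illustration, not data from the study.

```python
# Sketch: Wright's F_ST for one biallelic marker scored in two populations.
# Assumes two equally sized subpopulations; frequencies are illustrative.

def fst_biallelic(p1, p2):
    """F_ST = (H_T - H_S) / H_T for two equally sized subpopulations."""
    p_bar = (p1 + p2) / 2.0
    h_t = 2.0 * p_bar * (1.0 - p_bar)             # expected total heterozygosity
    h_s = (2 * p1 * (1 - p1) + 2 * p2 * (1 - p2)) / 2.0  # mean within-pop het.
    return 0.0 if h_t == 0 else (h_t - h_s) / h_t

# Identical populations show no differentiation...
print(fst_biallelic(0.5, 0.5))           # 0.0
# ...while diverged frequencies give F_ST > 0.
print(round(fst_biallelic(0.2, 0.8), 4))  # 0.36
```

    A per-marker FST like this, averaged over the 22 SSR loci, is the kind of statistic used to detect whether multiplication environments shifted the genetic composition.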

  7. Robust removal of short-duration artifacts in long neonatal EEG recordings using wavelet-enhanced ICA and adaptive combining of tentative reconstructions.

    Science.gov (United States)

    Zima, M; Tichavský, P; Paul, K; Krajča, V

    2012-08-01

    The goal of this paper is to describe a robust artifact removal (RAR) method, an automatic sequential procedure which is capable of removing short-duration, high-amplitude artifacts from long-term neonatal EEG recordings. Such artifacts are mainly caused by movement activity, and have an adverse effect on the automatic processing of long-term sleep recordings. The artifacts are removed sequentially in short-term signals using independent component analysis (ICA) transformation and wavelet denoising. In order to gain robustness of the RAR method, the whole EEG recording is processed multiple times. The resulting tentative reconstructions are then combined. We show results in a data set of signals from ten healthy newborns. Those results prove, both qualitatively and quantitatively, that the RAR method is capable of automatically rejecting the mentioned artifacts without changes in overall signal properties such as the spectrum. The method is shown to perform better than either the wavelet-enhanced ICA or the simple artifact rejection method without the combination procedure.
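
    The wavelet-denoising step inside wICA-style artifact removal can be sketched as a one-level Haar transform, soft-thresholding of the detail coefficients, and reconstruction. The universal-threshold rule below is a common default, not necessarily the one used by the RAR method, and the test signal is synthetic.

```python
import numpy as np

def haar_soft_denoise(x):
    """One-level Haar wavelet soft-threshold denoising (even-length input)."""
    x = np.asarray(x, dtype=float)
    a = (x[0::2] + x[1::2]) / np.sqrt(2)   # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2)   # detail coefficients
    sigma = np.median(np.abs(d)) / 0.6745  # robust noise estimate
    thr = sigma * np.sqrt(2 * np.log(len(x)))      # universal threshold
    d = np.sign(d) * np.maximum(np.abs(d) - thr, 0.0)  # soft threshold
    y = np.empty_like(x)                   # inverse Haar transform
    y[0::2] = (a + d) / np.sqrt(2)
    y[1::2] = (a - d) / np.sqrt(2)
    return y

rng = np.random.default_rng(0)
clean = np.sin(np.linspace(0, 4 * np.pi, 256))
noisy = clean + 0.3 * rng.standard_normal(256)
denoised = haar_soft_denoise(noisy)
print(np.mean((denoised - clean) ** 2) < np.mean((noisy - clean) ** 2))  # True
```

    In the full method this denoising is applied to independent components rather than raw channels, so that short high-amplitude artifacts are suppressed while the components' overall spectral content is preserved.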

  8. Cross-cultural development of an item list for computer-adaptive testing of fatigue in oncological patients

    Directory of Open Access Journals (Sweden)

    Oberguggenberger Anne S

    2011-03-01

    Abstract Introduction Within an ongoing project of the EORTC Quality of Life Group, we are developing computerized adaptive test (CAT) measures for the QLQ-C30 scales. These new CAT measures are conceptualised to reflect the same constructs as the QLQ-C30 scales. Accordingly, the Fatigue-CAT is intended to capture physical and general fatigue. Methods The EORTC approach to CAT development comprises four phases (literature search, operationalisation, pre-testing, and field testing). Phases I-III are described in detail in this paper. A literature search for fatigue items was performed in major medical databases. After refinement through several expert panels, the remaining items were used as the basis for adapting items and/or formulating new items fitting the EORTC item style. To obtain feedback from patients with cancer, these English items were translated into Danish, French, German, and Spanish and tested in the respective countries. Results Based on the literature search a list containing 588 items was generated. After a comprehensive item selection procedure focusing on content, redundancy, item clarity and item difficulty a list of 44 fatigue items was generated. Patient interviews (n = 52) resulted in 12 revisions of wording and translations. Discussion The item list developed in phases I-III will be further investigated within a field-testing phase (IV) to examine psychometric characteristics and to fit an item response theory model. The Fatigue CAT based on this item bank will provide scores that are backward-compatible to the original QLQ-C30 fatigue scale.
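
    Once an item bank like this is calibrated, a CAT administers it by repeatedly selecting the most informative remaining item at the current ability estimate. A minimal sketch follows, assuming a two-parameter logistic (2PL) IRT model with invented item parameters; the EORTC bank itself is still in development in the phases described.

```python
import math

def p_correct(theta, a, b):
    """2PL response probability: discrimination a, difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def item_information(theta, a, b):
    """Fisher information of a 2PL item at ability theta."""
    p = p_correct(theta, a, b)
    return a * a * p * (1.0 - p)

def next_item(theta, bank, used):
    """Pick the unused item with maximum information at theta."""
    candidates = [i for i in range(len(bank)) if i not in used]
    return max(candidates, key=lambda i: item_information(theta, *bank[i]))

bank = [(1.2, -1.0), (0.8, 0.0), (1.5, 0.5), (1.0, 1.5)]  # invented (a, b)
theta = 0.0
print(next_item(theta, bank, set()))  # 2: its information peaks nearest theta
```

    After each response the ability estimate is updated (e.g. by maximum likelihood) and the selection repeats, which is why a well-built CAT needs only a few items to match a full fixed-length scale.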

  9. Adaptation of the radiation dose for computed tomography of the body - background for the dose adaptation programme OmnimAs; Straaldosreglering vid kroppsdatortomografi - bakgrund till dosregleringsprogrammet OmnimAs

    Energy Technology Data Exchange (ETDEWEB)

    Nyman, Ulf; Kristiansson, Mattias [Trelleborg Hospital (Sweden); Leitz, Wolfram [Swedish Radiation Protection Authority, Stockholm (Sweden); Paahlstorp, Per-Aake [Siemens Medical Solutions, Solna (Sweden)

    2004-11-01

    When performing computed tomography examinations, the exposure factors are hardly ever adapted to the patient's size. One reason may be the lack of simple methods. This report describes the computer programme OmnimAs, which calculates how the exposure factors should be varied with the patient's perimeter (easily measured with a measuring tape). The first approximation is to calculate the exposure values giving the same noise level in the image irrespective of the patient's size. A clinical evaluation has shown that this relationship has to be modified. One chapter describes the physical background behind the programme. Results calculated with OmnimAs are in good agreement with a number of published studies, and clinical experience demonstrates the usability of OmnimAs. Finally, the correlation between several parameters and image quality/dose is discussed, along with how this correlation can be exploited to optimise CT examinations.
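
    The constant-noise first approximation implies that the tube charge must grow roughly exponentially with added tissue thickness, since attenuation is exponential. The sketch below illustrates that scaling; the effective attenuation coefficient, reference perimeter, and reference mAs are assumptions for illustration, not the calibrated OmnimAs values.

```python
import math

MU_EFF_PER_CM = 0.19  # assumed effective linear attenuation for soft tissue

def mas_for_perimeter(perimeter_cm, ref_perimeter_cm=90.0, ref_mas=100.0):
    """mAs keeping detector noise roughly constant across patient sizes.

    Treats the patient as a cylinder (diameter = perimeter / pi) and scales
    the tube charge by exp(mu * change in diameter).
    """
    d = perimeter_cm / math.pi
    d_ref = ref_perimeter_cm / math.pi
    return ref_mas * math.exp(MU_EFF_PER_CM * (d - d_ref))

print(round(mas_for_perimeter(90.0)))   # 100: reference patient unchanged
print(mas_for_perimeter(110.0) > mas_for_perimeter(90.0))  # True
```

    The report's clinical evaluation found that the pure constant-noise rule over-corrects, so a practical programme would temper this exponential with an empirically chosen exponent.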

  10. APPLICATIONS OF LASERS AND OTHER TOPICS IN LASER PHYSICS AND TECHNOLOGY: Adaptive focusing of high-intensity light beams over short paths

    Science.gov (United States)

    Kanev, Fedor Yu; Chesnokov, S. S.

    1987-10-01

    Numerical experiments were used to analyze the efficiency of adaptive control of the wavefronts of light beams traveling under conditions of steady-state wind refraction over paths amounting to 0.1 of the diffraction length. The equations describing the propagation of the light waves emitted and scattered by an object were solved in a lens system of coordinates, which made it possible to increase considerably the reliability of numerical prediction. The results were used to propose wavefront control by an algorithm for modified phase conjugation based on storage of a phase profile ensuring the best compensation of nonlinear distortions in all the preceding iterations. This algorithm was found to increase the concentration of the field on an object by 40-45% compared with nonadaptive focusing.
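
    The "modified phase conjugation" idea above, storing the phase profile that has given the best compensation over all preceding iterations, can be illustrated with a toy loop. The focus metric and perturbation model below are stand-ins for the beam-propagation solver, and the fixed "aberration" vector is invented.

```python
import random

random.seed(1)

def focus_metric(phase):
    # Hypothetical merit: maximal when the correction cancels a fixed aberration.
    aberration = [0.3, -0.7, 0.5, 0.1]
    return -sum((p + a) ** 2 for p, a in zip(phase, aberration))

best_phase = [0.0] * 4
best_score = focus_metric(best_phase)
for _ in range(200):
    # Perturb the best-so-far profile, as an iteration of the control loop.
    trial = [p + random.uniform(-0.2, 0.2) for p in best_phase]
    score = focus_metric(trial)
    if score > best_score:          # keep only profiles that improve focusing
        best_phase, best_score = trial, score

print(best_score > focus_metric([0.0] * 4))  # True: stored best improved
```

    Keeping the best profile rather than the latest one is what distinguishes this scheme from plain phase conjugation, which can oscillate under strong nonlinear distortion.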

  11. A computational atlas of the hippocampal formation using ex vivo, ultra-high resolution MRI: Application to adaptive segmentation of in vivo MRI

    DEFF Research Database (Denmark)

    Iglesias, Juan Eugenio; Augustinack, Jean C.; Nguyen, Khoa

    2015-01-01

    an algorithm that can analyze multimodal data and adapt to variations in MRI contrast due to differences in acquisition hardware or pulse sequences. The applicability of the atlas, which we are releasing as part of FreeSurfer (version 6.0), is demonstrated with experiments on three different publicly available......Automated analysis of MRI data of the subregions of the hippocampus requires computational atlases built at a higher resolution than those that are typically used in current neuroimaging studies. Here we describe the construction of a statistical atlas of the hippocampal formation at the subregion...... level using ultra-high resolution, ex vivo MRI. Fifteen autopsy samples were scanned at 0.13 mm isotropic resolution (on average) using customized hardware. The images were manually segmented into 13 different hippocampal substructures using a protocol specifically designed for this study; precise...

  12. Redefining diagnostic symptoms of depression using Rasch analysis: testing an item bank suitable for DSM-V and computer adaptive testing.

    Science.gov (United States)

    Mitchell, Alex J; Smith, Adam B; Al-salihy, Zerak; Rahim, Twana A; Mahmud, Mahmud Q; Muhyaldin, Asma S

    2011-10-01

    We aimed to redefine the optimal self-report symptoms of depression suitable for creation of an item bank that could be used in computer adaptive testing or to develop a simplified screening tool for DSM-V. Four hundred subjects (200 patients with primary depression and 200 non-depressed subjects), living in Iraqi Kurdistan were interviewed. The Mini International Neuropsychiatric Interview (MINI) was used to define the presence of major depression (DSM-IV criteria). We examined symptoms of depression using four well-known scales delivered in Kurdish. The Partial Credit Model was applied to each instrument. Common-item equating was subsequently used to create an item bank and differential item functioning (DIF) explored for known subgroups. A symptom level Rasch analysis reduced the original 45 items to 24 items of the original after the exclusion of 21 misfitting items. A further six items (CESD13 and CESD17, HADS-D4, HADS-D5 and HADS-D7, and CDSS3 and CDSS4) were removed due to misfit as the items were added together to form the item bank, and two items were subsequently removed following the DIF analysis by diagnosis (CESD20 and CDSS9, both of which were harder to endorse for women). Therefore the remaining optimal item bank consisted of 17 items and produced an area under the curve (AUC) of 0.987. Using a bank restricted to the optimal nine items revealed only minor loss of accuracy (AUC = 0.989, sensitivity 96%, specificity 95%). Finally, when restricted to only four items accuracy was still high (AUC was still 0.976; sensitivity 93%, specificity 96%). An item bank of 17 items may be useful in computer adaptive testing and nine or even four items may be used to develop a simplified screening tool for DSM-V major depressive disorder (MDD). Further examination of this item bank should be conducted in different cultural settings.
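
    The AUC figures quoted above summarise how well item-bank scores separate depressed from non-depressed groups. A rank-based (Mann-Whitney) AUC can be computed directly, as sketched below with invented scores rather than the study's data.

```python
def auc(pos_scores, neg_scores):
    """Probability a random positive outranks a random negative (ties = 0.5)."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

depressed = [14, 11, 9, 12, 8]   # hypothetical item-bank sum scores
controls = [3, 5, 2, 9, 4]
print(auc(depressed, controls))  # 0.94
```

    An AUC near 0.99, as reported for the 17-item bank, means almost every depressed respondent scores above almost every control, which is why even the 4-item subset retains high accuracy.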

  13. The adaptive computer-aided diagnosis system based on tumor sizes for the classification of breast tumors detected at screening ultrasound.

    Science.gov (United States)

    Moon, Woo Kyung; Chen, I-Ling; Chang, Jung Min; Shin, Sung Ui; Lo, Chung-Ming; Chang, Ruey-Feng

    2017-04-01

    Screening ultrasound (US) is increasingly used as a supplement to mammography in women with dense breasts, and more than 80% of cancers detected by US alone are 1 cm or smaller. An adaptive computer-aided diagnosis (CAD) system based on tumor size was proposed to classify breast tumors detected at screening US using quantitative morphological and textural features. In the present study, a database containing 156 tumors (78 benign and 78 malignant) was separated into two subsets of different tumor sizes (<1 cm and ≥1 cm) to explore the improvement in the performance of the CAD system. After adaptation, the accuracies, sensitivities, specificities and Az values of the CAD for the entire database increased from 73.1% (114/156), 73.1% (57/78), 73.1% (57/78), and 0.790 to 81.4% (127/156), 83.3% (65/78), 79.5% (62/78), and 0.852, respectively. In the data subset of tumors 1 cm or larger, the performance improved from 66.2% (51/77), 68.3% (28/41), 63.9% (23/36), and 0.703 to 81.8% (63/77), 85.4% (35/41), 77.8% (28/36), and 0.855, respectively. The proposed CAD system can be helpful for classifying breast tumors detected at screening US.
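
    The size-adaptive idea is to route each tumour to a model tuned for its size subgroup rather than using one global classifier. The sketch below shows only that routing structure; the features, weights, and thresholds are invented placeholders, not the study's trained models.

```python
SMALL_CM = 1.0  # subgroup boundary used in the study

def classify(size_cm, irregularity, texture_score):
    """Route to a hypothetical subgroup-specific linear scorer."""
    if size_cm < SMALL_CM:
        # Assumed: shape irregularity weighted more for small lesions.
        score = 0.8 * irregularity + 0.2 * texture_score
        threshold = 0.5
    else:
        # Assumed: texture weighted more for larger lesions.
        score = 0.3 * irregularity + 0.7 * texture_score
        threshold = 0.6
    return "malignant" if score >= threshold else "benign"

print(classify(0.7, irregularity=0.9, texture_score=0.1))  # malignant
print(classify(1.8, irregularity=0.9, texture_score=0.1))  # benign
```

    The same feature vector can thus yield different calls depending on lesion size, which is the mechanism behind the reported accuracy gain after adaptation.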

  14. A Computational Approach to Model Vascular Adaptation During Chronic Hemodialysis: Shape Optimization as a Substitute for Growth Modeling

    Science.gov (United States)

    Mahmoudzadeh Akherat, S. M. Javid; Boghosian, Michael; Cassel, Kevin; Hammes, Mary

    2015-11-01

    End-stage-renal disease patients depend on successful long-term hemodialysis via vascular access, commonly facilitated via a Brachiocephalic Fistula (BCF). The primary cause of BCF failure is Cephalic Arch Stenosis (CAS). It is believed that low Wall Shear Stress (WSS) regions, which occur because of the high flow rates through the natural bend in the cephalic vein, create hemodynamic circumstances that trigger the onset and development of Intimal Hyperplasia (IH) and subsequent CAS. IH is hypothesized to be a natural effort to reshape the vessel, aiming to bring the WSS values back to a physiologically acceptable range. We seek to explore the correlation between regions of low WSS and subsequent IH and CAS in patient-specific geometries. By utilizing a shape optimization framework, a method is proposed to predict cardiovascular adaptation that could potentially be an alternative to vascular growth and remodeling. Based on an objective functional that seeks to alter the vessel shape in such a way as to readjust the WSS to be within the normal physiological range, CFD and shape optimization are then coupled to investigate whether the optimal shape evolution is correlated with actual patient-specific geometries thereafter. Supported by the National Institute of Diabetes and Digestive and Kidney Diseases of the National Institutes of Health (R01 DK90769).

  15. Efficient and Adaptive Methods for Computing Accurate Potential Surfaces for Quantum Nuclear Effects: Applications to Hydrogen-Transfer Reactions.

    Science.gov (United States)

    DeGregorio, Nicole; Iyengar, Srinivasan S

    2018-01-09

    We present two sampling measures to gauge critical regions of potential energy surfaces. These sampling measures employ (a) the instantaneous quantum wavepacket density, an approximation to the (b) potential surface, its (c) gradients, and (d) a Shannon information theory based expression that estimates the local entropy associated with the quantum wavepacket. These four criteria together enable a directed sampling of potential surfaces that appears to correctly describe the local oscillation frequencies, or the local Nyquist frequency, of a potential surface. The sampling functions are then utilized to derive a tessellation scheme that discretizes the multidimensional space to enable efficient sampling of potential surfaces. The sampled potential surface is then combined with four different interpolation procedures, namely, (a) local Hermite curve interpolation, (b) low-pass filtered Lagrange interpolation, (c) the monomial symmetrization approximation (MSA) developed by Bowman and co-workers, and (d) a modified Shepard algorithm. The sampling procedure and the fitting schemes are used to compute (a) potential surfaces in highly anharmonic hydrogen-bonded systems and (b) study hydrogen-transfer reactions in biogenic volatile organic compounds (isoprene) where the transferring hydrogen atom is found to demonstrate critical quantum nuclear effects. In the case of isoprene, the algorithm discussed here is used to derive multidimensional potential surfaces along a hydrogen-transfer reaction path to gauge the effect of quantum-nuclear degrees of freedom on the hydrogen-transfer process. Based on the decreased computational effort, facilitated by the optimal sampling of the potential surfaces through the use of sampling functions discussed here, and the accuracy of the associated potential surfaces, we believe the method will find great utility in the study of quantum nuclear dynamics problems, of which application to hydrogen-transfer reactions and hydrogen
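
    One of the fitting schemes named above is a modified Shepard algorithm. A minimal inverse-distance-weighted variant is sketched below on invented one-dimensional sample points, not the multidimensional potential surfaces of the paper (true modified Shepard also blends local Taylor expansions, omitted here).

```python
def shepard(x, samples, power=2):
    """Inverse-distance-weighted interpolation through (xi, yi) samples."""
    num, den = 0.0, 0.0
    for xi, yi in samples:
        d = abs(x - xi)
        if d == 0.0:
            return yi              # exact hit reproduces the sample point
        w = 1.0 / d ** power       # nearer samples dominate the weighted mean
        num += w * yi
        den += w
    return num / den

pts = [(0.0, 1.0), (1.0, 3.0), (2.0, 2.0)]   # hypothetical energy samples
print(shepard(1.0, pts))             # 3.0: interpolant passes through data
print(round(shepard(0.5, pts), 6))   # 2.0: weighted average of neighbours
```

    Because the interpolant is exact at the samples and smooth between them, adding points where the sampling functions flag high wavepacket density or entropy directly refines the surface where the dynamics needs it.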

  16. Adaptive changes of pancreatic protease secretion to a short-term vegan diet: influence of reduced intake and modification of protein.

    Science.gov (United States)

    Walkowiak, Jaroslaw; Mądry, Edyta; Lisowska, Aleksandra; Szaflarska-Popławska, Anna; Grzymisławski, Marian; Stankowiak-Kulpa, Hanna; Przysławski, Juliusz

    2012-01-01

    In our previous study, we demonstrated that abstaining from meat, for 1 month, by healthy omnivores (lacto-ovovegetarian model) resulted in a statistical decrease in pancreatic secretion as measured by faecal elastase-1 output. However, no correlation between relative and non-relative changes of energy and nutrient consumption and pancreatic secretion was documented. Therefore, in the present study, we aimed to assess the changes of exocrine pancreatic secretion with a more restrictive dietetic modification, by applying a vegan diet. A total of twenty-one healthy omnivores (sixteen females and five males) participated in the prospective study lasting for 6 weeks. The nutrient intake and faecal output of pancreatic enzymes (elastase-1, chymotrypsin and lipase) were assessed twice during the study. Each assessment period lasted for 7 d: the first before the transition to the vegan diet (omnivore diet) and the second during the last week of the study (vegan diet). The dietary modification resulted in a significant decrease in faecal elastase-1 (P vegan diet resulted in an adaptation of pancreatic protease secretion in healthy volunteers.

  17. An Ad Hoc Adaptive Hashing Technique for Non-Uniformly Distributed IP Address Lookup in Computer Networks

    Directory of Open Access Journals (Sweden)

    Christopher Martinez

    2007-02-01

    Hashing algorithms have long been widely adopted to design fast address look-up processes, which involve a search through a large database to find the record associated with a given key. Hashing algorithms transform a key inside each target data item into a hash value in the hope that the hashing renders the database uniformly distributed with respect to this new hash value. The closer the final distribution is to uniform, the less search time is required when a query is made. When the database is already key-wise uniformly distributed, any regular hashing algorithm, such as bit extraction, bit-group XOR, etc., easily leads to a statistically near-perfect uniform distribution after hashing. On the other hand, if the records in the database are not uniformly distributed, as in almost all known practical applications, then different regular hash functions can lead to very different performance. When the target database has a key with a highly skewed value distribution, the performance delivered by regular hashing algorithms usually becomes far from desirable. This paper aims at designing a hashing algorithm that achieves the highest probability of producing a uniformly distributed hash result from a non-uniformly distributed database. An analytical pre-process on the original database is first performed to extract critical information that significantly benefits the design of a better hashing algorithm. This process includes sorting the bits of the key to prioritize their use in the XOR hashing sequence, in simple bit extraction, or in a combination of both. Such an ad hoc hash design is critical for adapting to real-time situations in which there exists a changing (and/or expanding) database with an irregular non-uniform distribution. Significant improvement is demonstrated by simulation results on randomly generated data as well as real data.
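
    The analyze-then-hash idea can be sketched as follows: inspect the key set first, rank bit positions by how evenly they split the data, then build the table index by XOR-folding the best-balanced bits. The 8-bit keys and 4-bit index below are invented for illustration and much smaller than real IP lookup tables.

```python
def bit(value, i):
    return (value >> i) & 1

def rank_bits(keys, key_width=8):
    """Bit positions sorted by balance: ones-count closest to len(keys)/2."""
    half = len(keys) / 2.0
    ones = [sum(bit(k, i) for k in keys) for i in range(key_width)]
    return sorted(range(key_width), key=lambda i: abs(ones[i] - half))

def adaptive_hash(key, ranked, index_bits=4):
    """XOR-fold the two best-ranked key bits into each index bit."""
    h = 0
    for j in range(index_bits):
        b = bit(key, ranked[2 * j]) ^ bit(key, ranked[2 * j + 1])
        h |= b << j
    return h

keys = [0x11, 0x23, 0x35, 0x47, 0x59, 0x6B, 0x7D, 0x8F]  # skewed sample
ranked = rank_bits(keys)
table = {adaptive_hash(k, ranked) for k in keys}
print(len(table))  # number of distinct buckets used by the 8 keys
```

    Re-running the pre-analysis when the key distribution drifts is what makes the scheme "ad hoc adaptive": the hash function follows the data instead of assuming uniformity.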

  18. Validation of the factor structure of the Greek adaptation of the Situational Inventory of Body-Image Dysphoria-Short Form (SIBID-S).

    Science.gov (United States)

    Argyrides, Marios; Kkeli, Natalie

    2015-12-01

    Body image is a psychological construct that refers to one's perceptions, feelings, and thoughts towards one's body and appearance. The intensity and frequency of dysphoric body-image emotions depend upon situational events (i.e., situations involving body exposure, social comparisons, wearing certain clothing, looking in the mirror, and so forth). The Situational Inventory of Body-Image Dysphoria-Short Form (SIBID-S; Cash, Manual for the situational inventory of body-image dysphoria, 2000) was originally developed to assess one's negative body-image emotions in certain situational events. The current study aimed to confirm the factor structure and reliability of the newly translated Greek version of the SIBID-S. Participants consisted of a convenience sample of 2664 high school students (1119 males, 1545 females) who answered the measures of interest. Results indicated that the original one-factor structure of the SIBID-S was retained and fitted very well with the original model for both males and females. In addition, the Greek version had satisfactory reliability and convergent validity coefficients. Gender differences were also noted. The Greek SIBID-S has very good validity and reliability data and will serve as a useful measure of body-image dysphoria enabling research with the Greek-speaking population as well as cross-cultural research.

  19. Comparative study of open and arthroscopic coracoid transfer for shoulder anterior instability (Latarjet)-computed tomography evaluation at a short term follow-up. Part II.

    Science.gov (United States)

    Kordasiewicz, Bartłomiej; Kicinski, Maciej; Małachowski, Konrad; Wieczorek, Janusz; Chaberek, Sławomir; Pomianowski, Stanisław

    2018-01-04

    The aim of this study was to evaluate and compare the radiological parameters after arthroscopic and open Latarjet technique via evaluation of computed tomography (CT) scans. Our hypothesis was that the radiological results after arthroscopic stabilisation would remain close to those achieved after open stabilisation. CT scan evaluation results of patients after primary Latarjet procedure were analysed. Patients operated on between 2006 and 2011 using an open technique composed the OPEN group and patients operated on arthroscopically between 2011 and 2013 composed the ARTHRO group. Forty-three out of 55 shoulders (78.2%) in OPEN and 62 out of 64 shoulders (95.3%) in ARTHRO were available for CT scan evaluation. The average age at surgery was 28 years in OPEN and 26 years in ARTHRO. The mean follow-up was 54.2 months in OPEN and 23.4 months in ARTHRO. CT scan evaluation was used to assess graft fusion and osteolysis. Bone block position and screw orientation were assessed in the axial and sagittal views. Subscapularis muscle fatty infiltration was evaluated according to the Goutallier classification. The non-union rate was significantly higher in OPEN than in ARTHRO: 5 (11.9%) versus 1 (1.7%) (p < 0.05). These results should be evaluated very carefully due to the significant difference in the follow-up of the two groups. A significantly higher rate of partial graft osteolysis at the level of the superior screw was reported in ARTHRO, with 32 patients (53.3%) versus 10 (23.8%) in OPEN (p < 0.05). However, in the position between 3 and 5 o'clock there were 56.7% of the grafts in ARTHRO versus 87.8% in OPEN (p < 0.05). Arthroscopic Latarjet stabilisation showed satisfactory radiographic results, comparable to the open procedure; however, the short-term follow-up may bias this evaluation. Graft healing rate was very high in the arthroscopic technique, but yet osteolysis of the superior part of the graft and more superior graft position in the sagittal

  20. Access to College for All: ITAC Project--Computer and Adaptive Computer Technologies in the Cegeps for Students with Disabilities = L'accessibilite au cegep pour tous: Projet ITAC--informatique et technologies adaptees dans les cegeps pour les etudiants handicapes.

    Science.gov (United States)

    Fichten, Catherine S.; Barile, Maria

    This report discusses outcomes of three empirical studies which investigated the computer and adaptive computer technology needs and concerns of Quebec college students with various disabilities, professors, and individuals responsible for providing services to students with disabilities. Key findings are highlighted and recommendations are made…

  1. Comparison of Model-Based Iterative Reconstruction, Adaptive Statistical Iterative Reconstruction, and Filtered Back Projection for Detecting Hepatic Metastases on Submillisievert Low-Dose Computed Tomography.

    Science.gov (United States)

    Son, Jung Hee; Kim, Seung Ho; Yoon, Jung-Hee; Lee, Yedaun; Lim, Yun-Jung; Kim, Seon-Jeong

    The aim of the study was to compare the diagnostic performance of model-based iterative reconstruction (MBIR), adaptive statistical iterative reconstruction (ASIR), and filtered back projection (FBP) on submillisievert low-dose computed tomography (LDCT) for detecting hepatic metastases. Thirty-eight patients with hepatic metastases underwent abdominal CT. The CT protocol consisted of a routine standard-dose portal venous phase scan (120 kVp) and a 90-second delayed low-dose scan (80 kVp). The LDCT images were reconstructed with FBP, ASIR, and MBIR, respectively. Two readers recorded the number of hepatic metastases on each image set. A total of 105 metastatic lesions were analyzed. For reader 1, sensitivity for detecting metastases was comparable between FBP (49%) and ASIR (52%, P = 0.0697); however, sensitivity increased with MBIR (66%, P = 0.0035). For reader 2, sensitivity was comparable across all sets: FBP (65%), ASIR (68%), and MBIR (67%, P > 0.05). Both MBIR and ASIR showed limited sensitivity for detecting hepatic metastases on submillisievert LDCT.

  2. Football training in men with prostate cancer undergoing androgen deprivation therapy: activity profile and short-term skeletal and postural balance adaptations.

    Science.gov (United States)

    Uth, Jacob; Hornstrup, Therese; Christensen, Jesper F; Christensen, Karl B; Jørgensen, Niklas R; Helge, Eva W; Schmidt, Jakob F; Brasso, Klaus; Helge, Jørn W; Jakobsen, Markus D; Andersen, Lars L; Rørth, Mikael; Midtgaard, Julie; Krustrup, Peter

    2016-03-01

    To investigate the activity profile of football training and its short-term effects on bone mass, bone turnover markers (BTMs) and postural balance in men with prostate cancer (PCa) undergoing androgen deprivation therapy (ADT). This was a randomised 12-week study in which men with PCa undergoing ADT were assigned to a football intervention group [FTG, n = 29, 67 ± 7 (±SD) years] training 2-3 times per week for 45-60 min or to a control group (n = 28, 66 ± 5 years). The activity profile was measured using a 5-Hz GPS. The outcomes were total body and leg bone mineral content (BMC) and density, BTMs and postural balance. In the last part of the 12 weeks, FTG performed 194 ± 41 accelerations and 296 ± 65 decelerations at >0.6 m/s/s and covered a distance of 905 ± 297 m at speeds >6 km/h and 2646 ± 705 m in total per training session. Analysis of baseline-to-12-week change scores showed between-group differences in favour of FTG in total body BMC [26.4 g, 95% confidence interval (CI): 5.8-46.9 g, p = 0.013] and leg BMC (13.8 g, 95% CI: 7.0-20.5 g, p < 0.05). Football training involves numerous runs, accelerations and decelerations, which may be linked to marked increases in bone formation markers and preserved bone mass in middle-aged and elderly men with PCa undergoing ADT. ClinicalTrials.gov: NCT01711892.

  3. Effect of coma and spherical aberration on depth-of-focus measured using adaptive optics and computationally blurred images.

    Science.gov (United States)

    Legras, Richard; Benard, Yohann; Lopez-Gil, Norberto

    2012-03-01

    To compare the effect of primary spherical aberration and vertical coma on depth of focus measured with 2 methods. Laboratoire Aimé Cotton, Centre National de la Recherche Scientifique, and Université Paris-Sud, Orsay, France. Evaluation of technology. The subjective depth of focus, defined as the interval of vision for which the target was still perceived as acceptable, was evaluated using 2 methods. In the first method, the subject changed the defocus term by reshaping the mirror, which also corrected the subject's aberrations and induced a given value of coma or primary spherical aberration. In the second procedure, the subject changed the displayed images, which were calculated for various defocuses and with the desired aberration using a numerical eye model. Depth of focus was measured using a 0.18 diopter (D) step in 4 nonpresbyopic subjects corrected for whole-eye aberrations with 6.0 mm and 3.0 mm pupils and with the addition of 0.3 μm or 0.6 μm of positive primary spherical aberration or vertical coma. There was good concordance between the depth of focus measured with both methods (differences within 1/3 D, r² = 0.88). Image-quality metrics failed to predict the subjective depth of focus (r² < 0.41). These data confirm that defocus in the retinal image can be generated by optical or computational methods and that both can be used to assess the effect of higher-order aberrations on depth of focus.

  4. Long-term adaptation of Daphnia to toxic environment in Lake Orta: the effects of short-term exposure to copper and acidification

    Directory of Open Access Journals (Sweden)

    Marina MANCA

    2010-08-01

    Full Text Available Because of its 80-year history of heavy pollution and re-colonization, Lake Orta provides a good opportunity for investigating the response of zooplankton organisms to heavy metals and acidification, as well as the mechanisms involved. After the recent establishment of Daphnia galeata Sars, and the detection of an extremely low clonal diversity in the Lake Orta population, we carried out a study to investigate lethal tolerance to ionic copper and to acidity, and the impact of exposing newborn Daphnia to sublethal concentrations of copper on their later development and reproduction. We conducted acute toxicity tests to estimate the EC50 for ionic copper and tolerance to low pH, as well as life table experiments. Tolerance to ionic copper was high, three times that reported in the literature. Increased mortality soon after exposure to low pH confirmed a high sensitivity to acidity and explained the success of the species in Lake Orta only after pH recovery. An analysis of reproductive and demographic parameters revealed that D. galeata Sars was stressed at concentrations of ionic copper only twice as high as those presently recorded in the lake (i.e., ca 3 μg L-1). An increased cumulative number of eggs produced by each female was in fact counterbalanced by an increasing abortion rate, which resulted in an unaltered or lower intrinsic rate of population increase. Our results are likely due to strong selective pressure, more than physiological processes (acclimation), in a polluted area in which only specific adapted clones are able to grow, confirming results previously obtained on Lake Orta's D. obtusa Kurz population. The reproductive response and the relatively low within-treatment variability suggest that clone specificity, rather than physiological acclimation, was the driving force. The low variability confirmed results previously obtained from life table experiments on Lake Orta's D. obtusa clone. Overall, our results

  5. Effectiveness of Adaptive Statistical Iterative Reconstruction for 64-Slice Dual-Energy Computed Tomography Pulmonary Angiography in Patients With a Reduced Iodine Load: Comparison With Standard Computed Tomography Pulmonary Angiography.

    Science.gov (United States)

    Lee, Ji Won; Lee, Geewon; Lee, Nam Kyung; Moon, Jin Il; Ju, Yun Hye; Suh, Young Ju; Jeong, Yeon Joo

    2016-01-01

    The aim of the study was to assess the effectiveness of the adaptive statistical iterative reconstruction (ASIR) for dual-energy computed tomography pulmonary angiography (DE-CTPA) with a reduced iodine load. One hundred forty patients referred for chest CT were randomly divided into a DE-CTPA group with a reduced iodine load or a standard CTPA group. Quantitative and qualitative image qualities of virtual monochromatic spectral (VMS) images with filtered back projection (VMS-FBP) and those with 50% ASIR (VMS-ASIR) in the DE-CTPA group were compared. Image qualities of VMS-ASIR images in the DE-CTPA group and ASIR images in the standard CTPA group were also compared. All quantitative and qualitative indices, except attenuation value of pulmonary artery in the VMS-ASIR subgroup, were superior to those in the VMS-FBP subgroup (all P ASIR images were superior to those of ASIR images in the standard CTPA group (P ASIR images of the DE-CTPA group than in ASIR images of the standard CTPA group (P = 0.001). The ASIR technique tends to improve the image quality of VMS imaging. Dual-energy computed tomography pulmonary angiography with ASIR can reduce contrast medium volume and produce images of comparable quality with those of standard CTPA.

  6. Cross-cultural adaptation, reliability and validity of the Spanish version of the Quality of Life in Adult Cancer Survivors (QLACS) questionnaire: application in a sample of short-term survivors.

    Science.gov (United States)

    Escobar, Antonio; Trujillo-Martín, Maria del Mar; Rueda, Antonio; Pérez-Ruiz, Elisabeth; Avis, Nancy E; Bilbao, Amaia

    2015-11-16

    The aim of this study was to validate the Quality of Life in Adult Cancer Survivors (QLACS) questionnaire in short-term Spanish cancer survivors. Patients with breast, colorectal or prostate cancer who had finished their initial cancer treatment 3 years before the beginning of this study completed the QLACS, WHOQOL, Short Form-36, Hospital Anxiety and Depression Scale, EORTC-QLQ-BR23 and EQ-5D. Cultural adaptation was made based on established guidelines. Reliability was evaluated using internal consistency and test-retest. Convergent validity was studied by means of Pearson's correlation coefficient. Structural validity was determined by a second-order confirmatory factor analysis (CFA), and Rasch analysis was used to assess the unidimensionality of the Generic and Cancer-specific scales. Cronbach's alphas were above 0.7 in all domains and summary scales. Test-retest coefficients were 0.88 for the Generic and 0.82 for the Cancer-specific summary scales. The QLACS Generic summary scale was correlated with other generic criterion measures, SF-36 MCS (r = -0.74) and EQ-VAS (r = -0.63). The QLACS Cancer-specific scale had lower correlations with the same constructs. CFA provided satisfactory fit indices in all cases. The RMSEA value was 0.061 and the CFI and TLI values were 0.929 and 0.925, respectively. All factor loadings were higher than 0.40 and statistically significant (P validity and reliability of QLACS questionnaire to be used in short-term cancer survivors.
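Internal-consistency statistics like the Cronbach's alpha values reported above are straightforward to reproduce; the sketch below is a generic illustration with hypothetical questionnaire scores, not the study's analysis code.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances / total_variance)

# Hypothetical 5-item domain scored by 6 respondents (Likert 1-5).
scores = [
    [3, 4, 3, 4, 3],
    [2, 2, 3, 2, 2],
    [4, 5, 4, 4, 5],
    [1, 1, 2, 1, 1],
    [3, 3, 3, 4, 3],
    [5, 4, 5, 5, 4],
]
alpha = cronbach_alpha(scores)  # values above 0.7 are conventionally acceptable
```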

  7. Cross-cultural adaptation of the short-form condom attitude scale: validity assessment in a sub-sample of rural-to-urban migrant workers in Bangladesh

    Science.gov (United States)

    2013-01-01

    Background The reliable and valid measurement of attitudes towards condom use is essential to assist efforts to design population-specific interventions aimed at promoting positive attitudes towards, and increased use of, condoms. Although several studies, mostly in the English-speaking western world, have demonstrated the utility of condom attitude scales, very few culturally relevant condom attitude measures have been developed to date. We have developed a scale and evaluated its psychometric properties in a sub-sample of rural-to-urban migrant workers in Bangladesh. Methods This paper reports mostly on the cross-sectional survey component of a mixed-methods sexual health research project in Bangladesh. The survey sample (n = 878) comprised rural-to-urban migrant taxi drivers (n = 437) and restaurant workers (n = 441) in Dhaka (aged 18–35 years). The study also involved focus group sessions with the same populations to establish the content validity and cultural equivalency of the scale. The scale was administered within a larger sexual health survey questionnaire and consisted of 10 items. Quantitative and qualitative data were assessed with statistical and thematic analysis, respectively. Results The participants found the scale simple and easy to understand and use. The internal consistency (α) of the scale was 0.89 with high construct validity (the first component accounted for about 52% and the second component about 20% of the total variance, with an Eigen-value greater than one for both factors). The test-retest reliability (repeatability) was also found satisfactory, with high inter-item correlations (the majority of the intra-class correlation coefficient values was above 2 and was significant for all items on the scale, p Bengali version of the scale have good metric properties for assessing attitudes toward condom use. Validated scale is a short, simple and reliable instrument for measuring attitudes towards condom

  8. Effects of Diclofenac, L-NAME, L-Arginine, and Pentadecapeptide BPC 157 on Gastrointestinal, Liver, and Brain Lesions, Failed Anastomosis, and Intestinal Adaptation Deterioration in 24 Hour-Short-Bowel Rats.

    Science.gov (United States)

    Lojo, Nermin; Rasic, Zarko; Zenko Sever, Anita; Kolenc, Danijela; Vukusic, Darko; Drmic, Domagoj; Zoricic, Ivan; Sever, Marko; Seiwerth, Sven; Sikiric, Predrag

    2016-01-01

    Stable gastric pentadecapeptide BPC 157 was previously used to ameliorate wound healing following major surgery and to counteract diclofenac toxicity. To resolve the increasing early risks following major massive small bowel resection surgery, diclofenac combined with nitric oxide (NO) system blockade was used, suggesting that therapy with BPC 157 and the nitric oxide synthase (NOS) substrate L-arginine is efficacious. Immediately after anastomosis creation, short-bowel rats were untreated or administered intraperitoneal diclofenac (12 mg/kg), BPC 157 (10 μg/kg or 10 ng/kg), L-NG-nitroarginine methyl ester (L-NAME, 5 mg/kg), or L-arginine (100 mg/kg), alone or combined, and assessed 24 h later. Short-bowel rats exhibited poor anastomosis healing, failed intestinal adaptation, and gastrointestinal, liver, and brain lesions, which worsened with diclofenac. This was gradually ameliorated by immediate therapy with BPC 157 and L-arginine. Contrastingly, the NOS blocker L-NAME induced further aggravation, and lesions gradually worsened. Specifically, rats with surgery alone exhibited mild stomach/duodenum lesions, considerable liver lesions, and severe cerebral/hippocampal lesions, while those also administered diclofenac showed widespread severe lesions in the gastrointestinal tract, liver, cerebellar nuclear/Purkinje cells, and cerebrum/hippocampus. Rats subjected to surgery, diclofenac, and L-NAME exhibited the mentioned lesions, worsening anastomosis, and macro/microscopical necrosis. Thus, rats subjected to surgery alone showed evidence of deterioration. Furthermore, rats subjected to surgery and administered diclofenac showed worse symptoms than rats subjected to surgery alone. Rats subjected to surgery combined with diclofenac and L-NAME showed the worst deterioration. 
Rats subjected to surgery exhibited habitual adaptation of the remaining small intestine, which was markedly reversed in rats subjected to surgery and diclofenac, and those with surgery, diclofenac, and

  9. A comparison between radiation therapists and medical specialists in the use of kilovoltage cone-beam computed tomography scans for potential lung cancer radiotherapy target verification and adaptation

    Energy Technology Data Exchange (ETDEWEB)

    Watt, Sandie Carolyn, E-mail: sandie.watt@sswahs.gov.au [Liverpool and Macarthur Cancer Therapy Centres, NSW (Australia); University of Sydney, Sydney, NSW (Australia); Ingham Institute for Applied Medical Research, Liverpool, NSW (Australia); Vinod, Shalini K. [Liverpool and Macarthur Cancer Therapy Centres, NSW (Australia); Ingham Institute for Applied Medical Research, Liverpool, NSW (Australia); South Western Sydney Clinical School, The University of New South Wales, Liverpool, NSW (Australia); Department of Radiation Oncology, Prince of Wales Hospital, NSW (Australia); Dimigen, Marion [Department of Radiology, Liverpool Hospital, NSW (Australia); Department of Radiation Oncology, Prince of Wales Hospital, NSW (Australia); Descallar, Joseph [Ingham Institute for Applied Medical Research, Liverpool, NSW (Australia); South Western Sydney Clinical School, The University of New South Wales, Liverpool, NSW (Australia); Zogovic, Branimere [Department of Radiation Oncology, Prince of Wales Hospital, NSW (Australia); Atyeo, John [University of Sydney, Sydney, NSW (Australia); Wallis, Sian [University of Western Sydney, NSW (Australia); Holloway, Lois C. [Liverpool and Macarthur Cancer Therapy Centres, NSW (Australia); University of Sydney, Sydney, NSW (Australia); Institute of Medical Physics, University of Sydney, Sydney, NSW (Australia); Centre for Medical Radiation Physics, University of Wollongong, Wollongong, NSW, Australia. (Australia); Ingham Institute for Applied Medical Research, Liverpool, NSW (Australia)

    2016-04-01

    Target volume matching using cone-beam computed tomography (CBCT) is the preferred treatment verification method for lung cancer in many centers. However, radiation therapists (RTs) are trained in bony matching and not soft tissue matching. The purpose of this study was to determine whether RTs were equivalent to radiation oncologists (ROs) and radiologists (RDs) in alignment of the treatment CBCT with the gross tumor volume (GTV) defined at planning and in delineating the GTV on the treatment CBCT, as may be necessary for adaptive radiotherapy. In this study, 10 RTs, 1 RO, and 1 RD performed a manual tumor alignment and correction of the planning GTV to a treatment CBCT to generate an isocenter correction distance for 15 patient data sets. Participants also contoured the GTV on the same data sets. The isocenter correction distances and the contoured GTVs from the RTs were compared with those of the RD and RO. The mean difference in isocenter correction distances was 0.40 cm between the RO and RD, 0.51 cm between the RTs and RO, and 0.42 cm between the RTs and RD. The 95% CIs were smaller than the equivalence limit of 0.5 cm, indicating that the RTs were equivalent to the RO and RD. For GTV delineation comparisons, the RTs were not found to be equivalent to the RD or RO. The alignment of the planning-defined GTV and treatment CBCT using soft tissue matching by the RTs has been shown to be equivalent to that by the RO and RD. However, tumor delineation by the RTs on the treatment CBCT was not equivalent to that of the RO and RD. Thus, it may be appropriate for RTs to undertake soft tissue alignment based on CBCT; however, further investigation may be necessary before RTs undertake delineation for adaptive radiotherapy purposes.
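The equivalence criterion used above (the 95% CI of the difference falling inside a 0.5 cm limit) can be sketched in a few lines; the data and the fixed t value below are hypothetical, for illustration only.

```python
import math
import statistics

def within_equivalence_limit(diffs, limit=0.5, t_crit=2.145):
    """True if the 95% CI of the mean difference lies inside +/- limit.

    t_crit = 2.145 is the two-sided 95% t value for 14 degrees of
    freedom (15 paired data sets), hard-coded here for simplicity.
    """
    n = len(diffs)
    mean = statistics.fmean(diffs)
    sem = statistics.stdev(diffs) / math.sqrt(n)
    lower, upper = mean - t_crit * sem, mean + t_crit * sem
    return -limit < lower and upper < limit

# Hypothetical per-data-set isocentre correction differences (cm), 15 sets.
diffs = [0.42, 0.38, 0.45, 0.40, 0.36, 0.44, 0.41, 0.39,
         0.43, 0.37, 0.40, 0.42, 0.38, 0.41, 0.44]
equivalent = within_equivalence_limit(diffs)  # CI well inside +/- 0.5 cm
```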

  10. Enhancing Classification Performance of Functional Near-Infrared Spectroscopy- Brain-Computer Interface Using Adaptive Estimation of General Linear Model Coefficients.

    Science.gov (United States)

    Qureshi, Nauman Khalid; Naseer, Noman; Noori, Farzan Majeed; Nazeer, Hammad; Khan, Rayyan Azam; Saleem, Sajid

    2017-01-01

    In this paper, a novel methodology for enhanced classification of functional near-infrared spectroscopy (fNIRS) signals utilizable in a two-class [motor imagery (MI) and rest; mental rotation (MR) and rest] brain-computer interface (BCI) is presented. First, fNIRS signals corresponding to MI and MR are acquired from the motor and prefrontal cortex, respectively, and afterward filtered to remove physiological noises. Then, the signals are modeled using the general linear model, the coefficients of which are adaptively estimated using the least squares technique. Subsequently, multiple feature combinations of estimated coefficients were used for classification. The best classification accuracies achieved for five subjects, for MI versus rest, are 79.5, 83.7, 82.6, 81.4, and 84.1%, whereas those for MR versus rest are 85.5, 85.2, 87.8, 83.7, and 84.8%, respectively, using support vector machine. These results are compared with the best classification accuracies obtained using the conventional hemodynamic response. By means of the proposed methodology, the average classification accuracy obtained was significantly higher (p < 0.05). These results serve to demonstrate the feasibility of developing a high-classification-performance fNIRS-BCI.

  11. The optimal monochromatic spectral computed tomographic imaging plus adaptive statistical iterative reconstruction algorithm can improve the superior mesenteric vessel image quality.

    Science.gov (United States)

    Yin, Xiao-Ping; Zuo, Zi-Wei; Xu, Ying-Jin; Wang, Jia-Ning; Liu, Huai-Jun; Liang, Guang-Lu; Gao, Bu-Lang

    2017-04-01

    To investigate the effect of the optimal monochromatic spectral computed tomography (CT) imaging plus adaptive statistical iterative reconstruction on the improvement of the image quality of the superior mesenteric artery and vein. The gemstone spectral CT angiographic data of 25 patients were reconstructed in the following three groups: 70 keV, the optimal monochromatic imaging, and the optimal monochromatic plus 40% iterative reconstruction mode. The CT value, image noise (IN), background CT value and noise, contrast-to-noise ratio (CNR), signal-to-noise ratio (SNR) and image scores of the vessels and surrounding tissues were analyzed. In the 70 keV, the optimal monochromatic and the optimal monochromatic plus 40% iterative reconstruction groups, the mean scores of image quality were 3.86, 4.24 and 4.25 for the superior mesenteric artery and 3.46, 3.78 and 3.81 for the superior mesenteric vein, respectively. The image quality scores for the optimal monochromatic and the optimal monochromatic plus 40% iterative reconstruction groups were significantly greater than for the 70 keV group (P iterative reconstruction group than in the 70 keV group. The optimal monochromatic plus 40% iterative reconstruction group had significantly (P iterative reconstruction using low contrast agent dosage and low injection rate can significantly improve the image quality of the superior mesenteric artery and vein. Copyright © 2017 Elsevier B.V. All rights reserved.
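The SNR and CNR indices analyzed above are simple ratios; one common definition (vessel attenuation over its noise, and vessel-background contrast over background noise — exact definitions vary across studies) is sketched below with hypothetical Hounsfield-unit values.

```python
def snr(roi_mean, roi_sd):
    # Signal-to-noise ratio: mean ROI attenuation over its standard deviation.
    return roi_mean / roi_sd

def cnr(roi_mean, bg_mean, bg_sd):
    # Contrast-to-noise ratio: vessel-background contrast over background noise.
    return (roi_mean - bg_mean) / bg_sd

# Hypothetical measurements for a superior mesenteric artery ROI (HU).
vessel_hu, vessel_sd = 320.0, 18.0
background_hu, background_sd = 45.0, 12.0

print(round(snr(vessel_hu, vessel_sd), 1))                     # 17.8
print(round(cnr(vessel_hu, background_hu, background_sd), 1))  # 22.9
```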

  12. Reducing radiation dose in the diagnosis of pulmonary embolism using adaptive statistical iterative reconstruction and lower tube potential in computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Kaul, David [Campus Virchow-Klinikum, Department of Radiation Oncology, Charite School of Medicine and University Hospital, Berlin (Germany); Charite School of Medicine and University Hospital, Department of Radiology, Berlin (Germany); Grupp, Ulrich; Kahn, Johannes; Wiener, Edzard; Hamm, Bernd; Streitparth, Florian [Charite School of Medicine and University Hospital, Department of Radiology, Berlin (Germany); Ghadjar, Pirus [Campus Virchow-Klinikum, Department of Radiation Oncology, Charite School of Medicine and University Hospital, Berlin (Germany)

    2014-11-15

    To assess the impact of ASIR (adaptive statistical iterative reconstruction) and lower tube potential on dose reduction and image quality in chest computed tomography angiographies (CTAs) of patients with pulmonary embolism. CT data from 44 patients with pulmonary embolism were acquired using different protocols - Group A: 120 kV, filtered back projection, n = 12; Group B: 120 kV, 40 % ASIR, n = 12; Group C: 100 kV, 40 % ASIR, n = 12 and Group D: 80 kV, 40 % ASIR, n = 8. Normalised effective dose was calculated; image quality was assessed quantitatively and qualitatively. Normalised effective dose in Group B was 33.8 % lower than in Group A (p = 0.014) and 54.4 % lower in Group C than in Group A (p < 0.001). Group A, B and C did not show significant differences in qualitative or quantitative analysis of image quality. Group D showed significantly higher noise levels in qualitative and quantitative analysis, significantly more artefacts and decreased overall diagnosability. Best results, considering dose reduction and image quality, were achieved in Group C. The combination of ASIR and lower tube potential is an option to reduce radiation without significant worsening of image quality in the diagnosis of pulmonary embolism. (orig.)

  13. Enhancing Classification Performance of Functional Near-Infrared Spectroscopy- Brain–Computer Interface Using Adaptive Estimation of General Linear Model Coefficients

    Directory of Open Access Journals (Sweden)

    Nauman Khalid Qureshi

    2017-07-01

    Full Text Available In this paper, a novel methodology for enhanced classification of functional near-infrared spectroscopy (fNIRS) signals utilizable in a two-class [motor imagery (MI) and rest; mental rotation (MR) and rest] brain–computer interface (BCI) is presented. First, fNIRS signals corresponding to MI and MR are acquired from the motor and prefrontal cortex, respectively, afterward, filtered to remove physiological noises. Then, the signals are modeled using the general linear model, the coefficients of which are adaptively estimated using the least squares technique. Subsequently, multiple feature combinations of estimated coefficients were used for classification. The best classification accuracies achieved for five subjects, for MI versus rest are 79.5, 83.7, 82.6, 81.4, and 84.1% whereas those for MR versus rest are 85.5, 85.2, 87.8, 83.7, and 84.8%, respectively, using support vector machine. These results are compared with the best classification accuracies obtained using the conventional hemodynamic response. By means of the proposed methodology, the average classification accuracy obtained was significantly higher (p < 0.05). These results serve to demonstrate the feasibility of developing a high-classification-performance fNIRS-BCI.
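The least-squares estimation of GLM coefficients described above can be illustrated with ordinary least squares on a small synthetic fNIRS-like signal; the regressors, sampling rate, and signal model below are assumptions for illustration, not the authors' implementation (which estimates the coefficients adaptively).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
t = np.arange(n) / 10.0                  # 10 Hz sampling (assumed)

# Design matrix: a boxcar task regressor, a linear drift, and a constant.
task = ((t % 20) < 10).astype(float)     # assumed 10 s task / 10 s rest blocks
X = np.column_stack([task, t, np.ones(n)])

# Synthetic HbO-like signal: task response + slow drift + noise.
y = 0.8 * task + 0.01 * t + 0.05 * rng.standard_normal(n)

# Least-squares estimate of the GLM coefficients (betas).
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
# beta[0] estimates the task-related response and would be used,
# together with the other coefficients, as a classification feature.
```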

  14. Image quality of low-dose CCTA in obese patients: impact of high-definition computed tomography and adaptive statistical iterative reconstruction.

    Science.gov (United States)

    Gebhard, Cathérine; Fuchs, Tobias A; Fiechter, Michael; Stehli, Julia; Stähli, Barbara E; Gaemperli, Oliver; Kaufmann, Philipp A

    2013-10-01

    The accuracy of coronary computed tomography angiography (CCTA) in obese persons is compromised by increased image noise. We investigated CCTA image quality acquired on a high-definition 64-slice CT scanner using modern adaptive statistical iterative reconstruction (ASIR). Seventy overweight and obese patients (24 males; mean age 57 years, mean body mass index 33 kg/m(2)) were studied with clinically-indicated contrast enhanced CCTA. Thirty-five patients underwent a standard definition protocol with filtered backprojection reconstruction (SD-FBP) while 35 patients matched for gender, age, body mass index and coronary artery calcifications underwent a novel high definition protocol with ASIR (HD-ASIR). Segment by segment image quality was assessed using a four-point scale (1 = excellent, 2 = good, 3 = moderate, 4 = non-diagnostic) and revealed better scores for HD-ASIR compared to SD-FBP (1.5 ± 0.43 vs. 1.8 ± 0.48; p definition backprojection protocol (SD-FBP), a newer high definition scan protocol in combination with ASIR (HD-ASIR) incrementally improved image quality and visualization of distal coronary artery segments in overweight and obese individuals, without increasing image noise and radiation dose.

  15. Short tunnels.

    NARCIS (Netherlands)

    Schreuder, D.A.

    1965-01-01

    Before dealing with the question of lighting short tunnels, it is necessary to define what is meant by a tunnel and when it should be called 'short'. Confined to motorized road traffic, the following is the most apt definition of a tunnel: every form of roofing-over a road section, irrespective of it

  16. Human Adaptation to the Computer.

    Science.gov (United States)

    1986-09-01

    These feelings can drive the manager to fight or flight. Managers either leave the organization or resist it by holding back information. As a... when man is threatened he resorts to a "fight or flight" reaction. This reaction, anxiety producing in itself, is man's attempt to... stimulates "fight or flight," which pushes man into a stressful state, which prompts either aggressive behavior or departure from the system [Ref

  17. The influence of short-time period of an adaptation to decreased ambient temperature on interleukin-6 and corticosterone levels in female Wistar strain rats in the proestrous phase of the reproductive cycle.

    Directory of Open Access Journals (Sweden)

    Grazyna Wójcik

    2008-04-01

    insignificant. Our observations confirm the proposition, that even short-time changes of ambient conditions can activate adaptation mechanisms in the organism, which in part, is the activation of the immune system.

  18. [Does the medial-lateral stability of total knee replacements have an effect on short-term clinical outcomes? One-year results of a multicentre study with computer assisted surgery].

    Science.gov (United States)

    Martín-Hernández, C; Revenga-Giertych, C; Hernández-Vaquero, D; Albareda-Albareda, J; Queiruga-Dios, J A; García-Aguilera, D; Ranera-García, M

    2014-01-01

    To evaluate the influence of medial-lateral joint stability on the short-term clinical outcomes of navigated total knee replacement. A multicentre prospective study was conducted on 111 consecutive total knee replacements performed with computer assisted surgery. The study included the evaluation of the KSS, WOMAC, and SF-12 preoperatively and at 3 and 12 months of follow-up, and correlation with stability data obtained during surgery in extension and at 20° and 90° of flexion. No differences were found in WOMAC, KSS and SF-12 relative to coronal stability during surgery. Variations in coronal stability were shown to have no influence on the short-term clinical results of navigated total knee replacement. Copyright © 2013 SECOT. Published by Elsevier Espana. All rights reserved.

  19. Projected Applications of a "Weather in a Box" Computing System at the NASA Short-Term Prediction Research and Transition (SPoRT) Center

    Science.gov (United States)

    Jedlovec, Gary J.; Molthan, Andrew; Zavodsky, Bradley T.; Case, Jonathan L.; LaFontaine, Frank J.; Srikishen, Jayanthi

    2010-01-01

    The NASA Short-term Prediction Research and Transition Center (SPoRT)'s new "Weather in a Box" resources will provide weather research and forecast modeling capabilities for real-time application. Model output will provide additional forecast guidance and research into the impacts of new NASA satellite data sets and software capabilities. By combining several research tools and satellite products, SPoRT can generate model guidance that is strongly influenced by unique NASA contributions.

  20. Gait adaptation to visual kinematic perturbations using a real-time closed-loop brain computer interface to a virtual reality avatar

    Science.gov (United States)

    Luu, Trieu Phat; He, Yongtian; Brown, Samuel; Nakagame, Sho; Contreras-Vidal, Jose L.

    2017-01-01

    Objective The control of human bipedal locomotion is of great interest to the field of lower-body brain computer interfaces (BCIs) for gait rehabilitation. While the feasibility of closed-loop BCI systems for the control of a lower body exoskeleton has been recently shown, multi-day closed-loop neural decoding of human gait in a BCI virtual reality (BCI-VR) environment has yet to be demonstrated. BCI-VR systems provide valuable alternatives for movement rehabilitation when wearable robots are not desirable due to medical conditions, cost, accessibility, usability, or patient preferences. Approach In this study, we propose a real-time closed-loop BCI that decodes lower limb joint angles from scalp electroencephalography (EEG) during treadmill walking to control a walking avatar in a virtual environment. Fluctuations in the amplitude of slow cortical potentials of EEG in the delta band (0.1 – 3 Hz) were used for prediction; thus, the EEG features correspond to time-domain amplitude modulated (AM) potentials in the delta band. Virtual kinematic perturbations resulting in asymmetric walking gait patterns of the avatar were also introduced to investigate gait adaptation using the closed-loop BCI-VR system over a period of eight days. Main results Our results demonstrate the feasibility of using a closed-loop BCI to learn to control a walking avatar under normal and altered visuomotor perturbations, which involved cortical adaptations. The average decoding accuracies (Pearson’s r values) in real-time BCI across all subjects increased from (Hip: 0.18 ± 0.31; Knee: 0.23 ± 0.33; Ankle: 0.14 ± 0.22) on Day 1 to (Hip: 0.40 ± 0.24; Knee: 0.55 ± 0.20; Ankle: 0.29 ± 0.22) on Day 8. Significance These findings have implications for the development of a real-time closed-loop EEG-based BCI-VR system for gait rehabilitation after stroke and for understanding cortical plasticity induced by a closed-loop BCI-VR system. PMID:27064824
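The decoding accuracies reported above are Pearson correlations between decoded and measured joint angles; a minimal sketch of that metric, with made-up angle samples, is:

```python
import math

def pearson_r(x, y):
    """Pearson correlation between measured and decoded joint-angle series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical knee-angle samples (degrees): measured vs. BCI-decoded.
measured = [10, 20, 35, 50, 40, 25, 15]
decoded = [12, 18, 30, 48, 43, 22, 17]
r = pearson_r(measured, decoded)  # close to 1.0: a high decoding accuracy
```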