WorldWideScience

Sample records for computer adaptive short

  1. A fiber orientation-adapted integration scheme for computing the hyperelastic Tucker average for short fiber reinforced composites

    Science.gov (United States)

    Goldberg, Niels; Ospald, Felix; Schneider, Matti

    2017-10-01

    In this article we introduce a fiber orientation-adapted integration scheme for Tucker's orientation averaging procedure applied to non-linear material laws, based on angular central Gaussian fiber orientation distributions. This method is stable w.r.t. fiber orientations degenerating into planar states and enables the construction of orthotropic hyperelastic energies for truly orthotropic fiber orientation states. We establish a reference scenario for fitting the Tucker average of a transversely isotropic hyperelastic energy, corresponding to a uni-directional fiber orientation, to microstructural simulations, obtained by FFT-based computational homogenization of neo-Hookean constituents. We carefully discuss ideas for accelerating the identification process, leading to a tremendous speed-up compared to a naive approach. The resulting hyperelastic material map turns out to be surprisingly accurate, simple to integrate in commercial finite element codes and fast in its execution. We demonstrate the capabilities of the extracted model by a finite element analysis of a fiber reinforced chain link.

  2. Human Adaptation to the Computer.

    Science.gov (United States)

    1986-09-01

    8217"’ TECHNOSTRESS " 5 5’..,:. VI I. CONCLUSIONS-------------------------59 -- LIST OF REFERENCES-------------------------61 BI BLI OGRAPHY...computer has not developed. Instead, what has developed is a "modern disease of adaptation" called " technostress ," a phrase coined by Brod. Craig...34 technostress ." Managers (according to Brod) have been implementing computers in ways that contribute directly to this stress: [Ref. 3:p. 38) 1. They

  3. Towards psychologically adaptive brain-computer interfaces

    Science.gov (United States)

    Myrden, A.; Chau, T.

    2016-12-01

    Objective. Brain-computer interface (BCI) performance is sensitive to short-term changes in psychological states such as fatigue, frustration, and attention. This paper explores the design of a BCI that can adapt to these short-term changes. Approach. Eleven able-bodied individuals participated in a study during which they used a mental task-based EEG-BCI to play a simple maze navigation game while self-reporting their perceived levels of fatigue, frustration, and attention. In an offline analysis, a regression algorithm was trained to predict changes in these states, yielding Pearson correlation coefficients in excess of 0.45 between the self-reported and predicted states. Two means of fusing the resultant mental state predictions with mental task classification were investigated. First, single-trial mental state predictions were used to predict correct classification by the BCI during each trial. Second, an adaptive BCI was designed that retrained a new classifier for each testing sample using only those training samples for which predicted mental state was similar to that predicted for the current testing sample. Main results. Mental state-based prediction of BCI reliability exceeded chance levels. The adaptive BCI exhibited significant, but practically modest, increases in classification accuracy for five of 11 participants and no significant difference for the remaining six despite a smaller average training set size. Significance. Collectively, these findings indicate that adaptation to psychological state may allow the design of more accurate BCIs.
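
    To make the record's retraining scheme concrete, here is a minimal Python sketch of per-trial retraining on state-similar samples. The feature arrays, the nearest-centroid classifier, and the similarity threshold are illustrative assumptions, not details from the paper.

```python
# Sketch of state-similarity retraining (illustrative, not the paper's code).
import numpy as np

def adaptive_classify(train_X, train_y, train_state, test_x, test_state,
                      state_tol=0.5):
    """Retrain per test sample, keeping only training samples whose
    predicted mental state is close to the current test sample's state."""
    mask = np.abs(train_state - test_state) <= state_tol
    if mask.sum() < 2:                 # fall back to the full training set
        mask = np.ones(len(train_y), dtype=bool)
    X, y = train_X[mask], train_y[mask]
    # Nearest-centroid classifier as a stand-in for the paper's classifier.
    centroids = {c: X[y == c].mean(axis=0) for c in np.unique(y)}
    return min(centroids, key=lambda c: np.linalg.norm(test_x - centroids[c]))

# Illustrative data: 40 trials, 5 features, binary mental-task labels.
rng = np.random.default_rng(0)
Xtr, ytr = rng.normal(size=(40, 5)), rng.integers(0, 2, 40)
state = rng.random(40)                 # predicted fatigue level per trial
print(adaptive_classify(Xtr, ytr, state, Xtr[0], state[0]))
```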

  4. Synthetic Computation: Chaos Computing, Logical Stochastic Resonance, and Adaptive Computing

    Science.gov (United States)

    Kia, Behnam; Murali, K.; Jahed Motlagh, Mohammad-Reza; Sinha, Sudeshna; Ditto, William L.

    Nonlinearity and chaos can exhibit numerous behaviors and patterns, and one can select different patterns from this rich library of patterns. In this paper we focus on synthetic computing, a field that engineers and synthesizes nonlinear systems to obtain computation. We explain the importance of nonlinearity, and describe how nonlinear systems can be engineered to perform computation. More specifically, we provide an overview of chaos computing, a field that manually programs chaotic systems to build different types of digital functions. We also briefly describe logical stochastic resonance (LSR), and then extend the approach of LSR to realize combinational digital logic systems via suitable concatenation of existing logical stochastic resonance blocks. Finally we demonstrate how a chaotic system can be engineered and mated with different machine learning techniques, such as artificial neural networks, random searching, and genetic algorithms, to design different autonomous systems that can adapt and respond to environmental conditions.

  5. Adaptively detecting changes in Autonomic Grid Computing

    KAUST Repository

    Zhang, Xiangliang; Germain, Cécile; Sebag, Michèle

    2010-01-01

    Detecting changes is a common issue in many application fields due to the non-stationary distribution of the applicative data, e.g., sensor network signals, web logs and grid running logs. Toward Autonomic Grid Computing, adaptively detecting the changes in a grid system can help to alarm the anomalies, clean the noises, and report the new patterns.

  6. Adaptability of supercomputers to nuclear computations

    International Nuclear Information System (INIS)

    Asai, Kiyoshi; Ishiguro, Misako; Matsuura, Toshihiko.

    1983-01-01

    Recently, in the field of scientific and technical calculation, the usefulness of supercomputers represented by the CRAY-1 has been recognized, and they are utilized in various countries. The rapid computation of supercomputers rests on their vector-computation capability. The authors investigated the adaptability to vector computation of about 40 typical atomic energy codes over the past six years. Based on the results of this investigation, the adaptability of atomic energy codes to the vector-computation capability of supercomputers, problems regarding their utilization, and future prospects are explained. The adaptability of individual calculation codes to vector computation depends largely on the algorithms and program structures used in the codes. The speed-up achieved by pipeline vector systems, the investigation carried out at the Japan Atomic Energy Research Institute and its results, and examples of vectorizing codes for atomic energy, environmental safety and nuclear fusion are reported. The speed-up factors for the 40 examples ranged from 1.5 to 9.0. It can be said that the adaptability of supercomputers to atomic energy codes is fairly good. (Kako, I.)

  7. Wavefront measurement using computational adaptive optics.

    Science.gov (United States)

    South, Fredrick A; Liu, Yuan-Zhi; Bower, Andrew J; Xu, Yang; Carney, P Scott; Boppart, Stephen A

    2018-03-01

    In many optical imaging applications, it is necessary to correct for aberrations to obtain high quality images. Optical coherence tomography (OCT) provides access to the amplitude and phase of the backscattered optical field for three-dimensional (3D) imaging samples. Computational adaptive optics (CAO) modifies the phase of the OCT data in the spatial frequency domain to correct optical aberrations without using a deformable mirror, as is commonly done in hardware-based adaptive optics (AO). This provides improvement of image quality throughout the 3D volume, enabling imaging across greater depth ranges and in highly aberrated samples. However, the CAO aberration correction has a complicated relation to the imaging pupil and is not a direct measurement of the pupil aberrations. Here we present new methods for recovering the wavefront aberrations directly from the OCT data without the use of hardware adaptive optics. This enables both computational measurement and correction of optical aberrations.
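
    The core CAO step described above (adjusting the phase of the OCT data in the spatial frequency domain) can be sketched in a few lines. The defocus-style phase screen and grid size below are invented for illustration; the paper's wavefront-recovery method itself is not reproduced.

```python
# Minimal sketch of a CAO phase correction on one complex en face plane.
import numpy as np

def cao_correct(field, phase):
    """field: complex 2D array (single depth plane); phase: estimated
    aberration phase (radians) on the same spatial-frequency grid."""
    spectrum = np.fft.fftshift(np.fft.fft2(field))
    corrected = spectrum * np.exp(-1j * phase)   # cancel the aberration
    return np.fft.ifft2(np.fft.ifftshift(corrected))

# Example: defocus-like quadratic phase on a 256x256 grid (illustrative).
n = 256
fx = np.linspace(-1, 1, n)
FX, FY = np.meshgrid(fx, fx)
defocus = 8.0 * (FX**2 + FY**2)          # coefficient chosen arbitrarily
aberrated = np.random.randn(n, n) + 1j * np.random.randn(n, n)
restored = cao_correct(aberrated, defocus)
```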

  8. An Adaptive Middleware for Improved Computational Performance

    DEFF Research Database (Denmark)

    Bonnichsen, Lars Frydendal

    The performance improvements in computer systems over the past 60 years have been fueled by an exponential increase in energy efficiency. In recent years, the phenomenon known as the end of Dennard’s scaling has slowed energy efficiency improvements — but improving computer energy efficiency is more important now than ever. Traditionally, most improvements in computer energy efficiency have come from improvements in lithography — the ability to produce smaller transistors — and computer architecture - the ability to apply those transistors efficiently. Since the end of scaling, we have seen... In this work, we are improving computational performance by exploiting modern hardware features, such as dynamic voltage-frequency scaling and transactional memory. Adapting software is an iterative process, requiring that we continually revisit it to meet new requirements or realities; a time-consuming process...

  9. Adaptation and hybridization in computational intelligence

    CERN Document Server

    Fister, Iztok, Jr.

    2015-01-01

    This carefully edited book takes a walk through recent advances in adaptation and hybridization in the Computational Intelligence (CI) domain. It consists of ten chapters that are divided into three parts. The first part illustrates background information and provides some theoretical foundation tackling the CI domain, the second part deals with adaptation in CI algorithms, while the third part focuses on hybridization in CI. This book can serve as an ideal reference for researchers and students of computer science, electrical and civil engineering, economy, and natural sciences who are confronted with solving optimization, modeling and simulation problems. It covers recent advances in CI that encompass nature-inspired algorithms, like Artificial Neural Networks, Evolutionary Algorithms and Swarm Intelligence-based algorithms.

  10. A stereotactic adapter compatible with computed tomography

    International Nuclear Information System (INIS)

    Cacak, R.K.; Law, J.D.

    1982-01-01

    One application of computed-tomographic (CT) scanners is the localization of intracranial targets for stereotactic surgery. Unfortunately, conventional stereotactic devices affixed to the patient cause artifacts which obscure anatomic features in CT images. The authors describe the initial phase of a project to eliminate this problem by using an adapter that is free of metallic objects. Localization of the target point relative to the coordinate system of a Leksell stereotactic frame is achieved from CT image measurements.

  11. DEFACTO: A Design Environment for Adaptive Computing Technology

    National Research Council Canada - National Science Library

    Hall, Mary

    2003-01-01

    This report describes the activities of the DEFACTO project, a Design Environment for Adaptive Computing Technology funded under the DARPA Adaptive Computing Systems and Just-In-Time-Hardware programs...

  12. Adaptively detecting changes in Autonomic Grid Computing

    KAUST Repository

    Zhang, Xiangliang

    2010-10-01

    Detecting changes is a common issue in many application fields due to the non-stationary distribution of the applicative data, e.g., sensor network signals, web logs and grid running logs. Toward Autonomic Grid Computing, adaptively detecting the changes in a grid system can help to alarm the anomalies, clean the noises, and report the new patterns. In this paper, we propose an approach of self-adaptive change detection based on the Page-Hinkley statistic test. It handles the non-stationary distribution without the assumption of data distribution and the empirical setting of parameters. We validate the approach on the EGEE streaming jobs, and report its better performance in achieving higher accuracy compared to other change detection methods. Meanwhile, this change detection process can help to discover device faults which were not reported in the system logs. © 2010 IEEE.
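
    For reference, a minimal Page-Hinkley detector of the kind the paper builds on is sketched below. The paper's contribution is making the parameters self-adaptive; this fixed-parameter sketch uses illustrative constants.

```python
# Textbook Page-Hinkley test for detecting an upward mean shift.
class PageHinkley:
    def __init__(self, delta=0.005, lam=50.0):
        self.delta, self.lam = delta, lam           # tolerance, alarm threshold
        self.mean, self.n, self.cum, self.min_cum = 0.0, 0, 0.0, 0.0

    def update(self, x):
        """Feed one observation; return True if a change is detected."""
        self.n += 1
        self.mean += (x - self.mean) / self.n        # running mean
        self.cum += x - self.mean - self.delta       # PH cumulative sum
        self.min_cum = min(self.min_cum, self.cum)
        return self.cum - self.min_cum > self.lam

ph = PageHinkley()
stream = [0.0] * 200 + [1.0] * 100                   # mean shift at sample 200
alarms = [i for i, v in enumerate(stream) if ph.update(v)]
print(alarms[:1])                                    # first detection index
```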

  13. COMPUTER VISION SYNDROME: A SHORT REVIEW.

    OpenAIRE

    Sameena; Mohd Inayatullah

    2012-01-01

    Computers are probably one of the biggest scientific inventions of the modern era, and since then they have become an integral part of our life. The increased usage of computers has led to a variety of ocular symptoms which includes eye strain, tired eyes, irritation, redness, blurred vision, and diplopia, collectively referred to as Computer Vision Syndrome (CVS). CVS may have a significant impact not only on visual comfort but also occupational productivit...

  14. Very-long-term and short-term chromatic adaptation: are their influences cumulative?

    Science.gov (United States)

    Belmore, Suzanne C; Shevell, Steven K

    2011-02-09

    Very-long-term (VLT) chromatic adaptation results from exposure to an altered chromatic environment for days or weeks. Color shifts from VLT adaptation are observed hours or days after leaving the altered environment. Short-term chromatic adaptation, on the other hand, results from exposure for a few minutes or less, with color shifts measured within seconds or a few minutes after the adapting light is extinguished; recovery to the pre-adapted state is complete in less than an hour. Here, both types of adaptation were combined. All adaptation was to reddish-appearing long-wavelength light. Shifts in unique yellow were measured following adaptation. Previous studies demonstrate shifts in unique yellow due to VLT chromatic adaptation, but shifts from short-term chromatic adaptation to comparable adapting light can be far greater than from VLT adaptation. The question considered here is whether the color shifts from VLT adaptation are cumulative with large shifts from short-term adaptation or, alternatively, does simultaneous short-term adaptation eliminate color shifts caused by VLT adaptation. The results show the color shifts from VLT and short-term adaptation together are cumulative, which indicates that both short-term and very-long-term chromatic adaptation affect color perception during natural viewing. Copyright © 2010 Elsevier Ltd. All rights reserved.

  15. ICAN Computer Code Adapted for Building Materials

    Science.gov (United States)

    Murthy, Pappu L. N.

    1997-01-01

    The NASA Lewis Research Center has been involved in developing composite micromechanics and macromechanics theories over the last three decades. These activities have resulted in several composite mechanics theories and structural analysis codes whose applications range from material behavior design and analysis to structural component response. One of these computer codes, the Integrated Composite Analyzer (ICAN), is designed primarily to address issues related to designing polymer matrix composites and predicting their properties - including hygral, thermal, and mechanical load effects. Recently, under a cost-sharing cooperative agreement with a Fortune 500 corporation, Master Builders Inc., ICAN was adapted to analyze building materials. The high costs and technical difficulties involved with the fabrication of continuous-fiber-reinforced composites sometimes limit their use. Particulate-reinforced composites can be thought of as a viable alternative. They are as easily processed to near-net shape as monolithic materials, yet have the improved stiffness, strength, and fracture toughness that is characteristic of continuous-fiber-reinforced composites. For example, particle-reinforced metal-matrix composites show great potential for a variety of automotive applications, such as disk brake rotors, connecting rods, cylinder liners, and other high-temperature applications. Building materials, such as concrete, can be thought of as one of the oldest materials in this category of multiphase, particle-reinforced materials. The adaptation of ICAN to analyze particle-reinforced composite materials involved the development of new micromechanics-based theories. A derivative of the ICAN code, ICAN/PART, was developed and delivered to Master Builders Inc. as a part of the cooperative activity.

  16. Short-Term Effects of Playing Computer Games on Attention

    Science.gov (United States)

    Tahiroglu, Aysegul Yolga; Celik, Gonca Gul; Avci, Ayse; Seydaoglu, Gulsah; Uzel, Mehtap; Altunbas, Handan

    2010-01-01

    Objective: The main aim of the present study is to investigate the short-term cognitive effects of computer games in children with different psychiatric disorders and normal controls. Method: One hundred one children are recruited for the study (aged between 9 and 12 years). All participants played a motor-racing game on the computer for 1 hour.…

  17. Test Anxiety, Computer-Adaptive Testing and the Common Core

    Science.gov (United States)

    Colwell, Nicole Makas

    2013-01-01

    This paper highlights the current findings and issues regarding the role of computer-adaptive testing in test anxiety. The computer-adaptive test (CAT) proposed by one of the Common Core consortia brings these issues to the forefront. Research has long indicated that test anxiety impairs student performance. More recent research indicates that…

  18. Understanding Coral's Short-term Adaptive Ability to Changing Environment

    Science.gov (United States)

    Tisthammer, K.; Richmond, R. H.

    2016-02-01

    Corals in Maunalua Bay, Hawaii are under chronic pressure from sedimentation and terrestrial runoff containing multiple pollutants as a result of large-scale urbanization that has taken place in the last 100 years. However, some individual corals thrive despite the prolonged exposure to these environmental stressors, which suggests that these individuals may have adapted to withstand such stressors. A recent survey showed that the lobe coral Porites lobata from the 'high-stress' nearshore site had an elevated level of stress-induced proteins, compared to those from the 'low-stress,' less polluted offshore site. To understand the genetic basis for the observed differential stress responses between the nearshore and offshore P. lobata populations, an analysis of the lineage-scale population genetic structure, as well as a reciprocal transplant experiment, were conducted. The result of the genetic analysis revealed a clear genetic differentiation between P. lobata from the nearshore site and the offshore site. Following the 30-day reciprocal transplant experiment, protein expression profiles and other stress-related physiological characteristics were compared between the two populations. The experimental results suggest that the nearshore genotype can cope better with sedimentation/pollutants than the offshore genotype. This indicates that the observed genetic differentiation is due to selection for tolerance to these environmental stressors. Understanding the little-known, lineage-scale genetic variation in corals offers critical insight into their short-term adaptive ability, which is indispensable for protecting corals from impending environmental and climate change. The results of this study also offer a valuable tool for resource managers to make effective decisions on coral reef conservation, such as designing marine protected areas that incorporate and maintain such genetic diversity, and establishing acceptable pollution run-off levels.

  19. Computerized adaptive testing in computer assisted learning?

    NARCIS (Netherlands)

    Veldkamp, Bernard P.; Matteucci, Mariagiulia; Eggen, Theodorus Johannes Hendrikus Maria; De Wannemacker, Stefan; Clarebout, Geraldine; De Causmaecker, Patrick

    2011-01-01

    A major goal in computerized learning systems is to optimize learning, while in computerized adaptive tests (CAT) efficient measurement of the proficiency of students is the main focus. There seems to be a common interest to integrate computerized adaptive item selection in learning systems and

  20. Applying computer adaptive testing to optimize online assessment of suicidal behavior: a simulation study.

    NARCIS (Netherlands)

    de Beurs, D.P.; de Vries, A.L.M.; de Groot, M.H.; de Keijser, J.; Kerkhof, A.J.F.M.

    2014-01-01

    Background: The Internet is used increasingly for both suicide research and prevention. To optimize online assessment of suicidal patients, there is a need for short, good-quality tools to assess elevated risk of future suicidal behavior. Computer adaptive testing (CAT) can be used to reduce

  1. Adaptation of HAMMER computer code to CYBER 170/750 computer

    International Nuclear Information System (INIS)

    Pinheiro, A.M.B.S.; Nair, R.P.K.

    1982-01-01

    The adaptation of the HAMMER computer code to the CYBER 170/750 computer is presented. The HAMMER code calculates cell parameters by multigroup transport theory and reactor parameters by few-group diffusion theory. The auxiliary programs, the modifications carried out, and the use of the HAMMER system adapted to the CYBER 170/750 computer are described. (M.C.K.) [pt]

  2. Short generators without quantum computers : the case of multiquadratics

    NARCIS (Netherlands)

    Bauch, J.; Bernstein, D.J.; de Valence, H.; Lange, T.; van Vredendaal, C.; Coron, J.-S.; Nielsen, J.B.

    2017-01-01

    Finding a short element g of a number field, given the ideal generated by g, is a classic problem in computational algebraic number theory. Solving this problem recovers the private key in cryptosystems introduced by Gentry, Smart–Vercauteren, Gentry–Halevi, Garg–Gentry–Halevi, et al. Work over the

  3. Humor in Human-Computer Interaction : A Short Survey

    NARCIS (Netherlands)

    Nijholt, Anton; Niculescu, Andreea; Valitutti, Alessandro; Banchs, Rafael E.; Joshi, Anirudha; Balkrishan, Devanuj K.; Dalvi, Girish; Winckler, Marco

    2017-01-01

    This paper is a short survey on humor in human-computer interaction. It describes how humor is designed and interacted with in social media, virtual agents, social robots and smart environments. Benefits and future use of humor in interactions with artificial entities are discussed based on

  4. A Computer Program for Short Circuit Analysis of Electric Power ...

    African Journals Online (AJOL)

    The Short Circuit Analysis Program (SCAP) is to be used to assess the composite effects of unbalanced and balanced faults on the overall reliability of electric power system. The program uses the symmetrical components method to compute all phase and sequence quantities for any bus or branch of a given power network ...
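
    The symmetrical-components method that SCAP is described as using rests on the standard Fortescue transform; a small sketch, with an invented balanced example, is given below.

```python
# Fortescue transform: phase phasors -> zero/positive/negative sequence.
import numpy as np

a = np.exp(2j * np.pi / 3)                     # 120-degree rotation operator
A = np.array([[1, 1, 1],
              [1, a**2, a],
              [1, a, a**2]])                   # sequence -> phase matrix

def to_sequence(phase_currents):
    """Ia, Ib, Ic (complex phasors) -> I0, I1, I2."""
    return np.linalg.solve(A, np.asarray(phase_currents, dtype=complex))

# Example: a balanced set has only a positive-sequence component.
print(to_sequence([1, a**2, a]))               # approximately [0, 1, 0]
```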

  5. 21 CFR 874.1070 - Short increment sensitivity index (SISI) adapter.

    Science.gov (United States)

    2010-04-01

    21 CFR 874.1070 - Short increment sensitivity index (SISI) adapter. Food and Drugs; FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN... (a) Identification. A short increment sensitivity index (SISI...

  6. Discrete linear canonical transform computation by adaptive method.

    Science.gov (United States)

    Zhang, Feng; Tao, Ran; Wang, Yue

    2013-07-29

    The linear canonical transform (LCT) describes the effect of quadratic phase systems on a wavefield and generalizes many optical transforms. In this paper, the computation method for the discrete LCT using the adaptive least-mean-square (LMS) algorithm is presented. The computation approaches of the block-based discrete LCT and the stream-based discrete LCT using the LMS algorithm are derived, and the implementation structures of these approaches by the adaptive filter system are considered. The proposed computation approaches have the inherent parallel structures which make them suitable for efficient VLSI implementations, and are robust to the propagation of possible errors in the computation process.
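
    The record builds its discrete-LCT computation on the LMS algorithm; below is the textbook LMS adaptive filter as a point of reference, with an invented system-identification example. The paper's block- and stream-based LCT structures are not reproduced here.

```python
# Textbook least-mean-square (LMS) adaptive filter.
import numpy as np

def lms_filter(x, d, num_taps=8, mu=0.01):
    """Adapt weights w so that w . [x[n], x[n-1], ...] tracks d[n]."""
    w = np.zeros(num_taps)
    y = np.zeros(len(x))
    for n in range(num_taps - 1, len(x)):
        window = x[n - num_taps + 1:n + 1][::-1]   # newest sample first
        y[n] = w @ window
        e = d[n] - y[n]                            # instantaneous error
        w += 2 * mu * e * window                   # stochastic-gradient step
    return y, w

# System identification of an unknown 3-tap FIR filter (illustrative).
rng = np.random.default_rng(0)
x = rng.normal(size=5000)
d = 0.5 * x - 0.3 * np.roll(x, 1) + 0.2 * np.roll(x, 2)
_, w = lms_filter(x, d)
print(np.round(w[:3], 2))                          # approaches [0.5, -0.3, 0.2]
```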

  7. Unauthorised adaptation of computer programmes - is ...

    African Journals Online (AJOL)

    Haupt acquired copyright in the Data Explorer programme regardless of the fact that the programme was as a result of an unauthorised adaptation of the Project AMPS programme which belonged to Brewers Marketing Intelligence (Pty) Ltd. This case note inter alia analyses the possibility of an author being sued for ...

  8. A New Adaptive Checkpointing Strategy for Mobile Computing

    Institute of Scientific and Technical Information of China (English)

    MEN Chaoguang; ZUO Decheng; YANG Xiaozong

    2005-01-01

    An adaptive checkpointing strategy is an efficient recovery scheme that is suitable for mobile computing systems. However, existing adaptive checkpointing schemes cannot correctly recover the system when a failure occurs during certain special periods. In this paper, the issues that lead to system inconsistency are first discussed, and then a new adaptive strategy that can recover the system to a correct consistent state is proposed. Our algorithm improves system recovery performance because only the failed process needs to roll back, through logging.

  9. Scalable space-time adaptive simulation tools for computational electrocardiology

    OpenAIRE

    Krause, Dorian; Krause, Rolf

    2013-01-01

    This work is concerned with the development of computational tools for the solution of reaction-diffusion equations from the field of computational electrocardiology. We designed lightweight spatially and space-time adaptive schemes for large-scale parallel simulations. We propose two different adaptive schemes based on locally structured meshes, managed either via a conforming coarse tessellation or a forest of shallow trees. A crucial ingredient of our approach is a non-conforming morta...

  10. Computer Adaptive Testing, Big Data and Algorithmic Approaches to Education

    Science.gov (United States)

    Thompson, Greg

    2017-01-01

    This article critically considers the promise of computer adaptive testing (CAT) and digital data to provide better and quicker data that will improve the quality, efficiency and effectiveness of schooling. In particular, it uses the case of the Australian NAPLAN test that will become an online, adaptive test from 2016. The article argues that…

  11. Short-term effects of playing computer games on attention.

    Science.gov (United States)

    Tahiroglu, Aysegul Yolga; Celik, Gonca Gul; Avci, Ayse; Seydaoglu, Gulsah; Uzel, Mehtap; Altunbas, Handan

    2010-05-01

    The main aim of the present study is to investigate the short-term cognitive effects of computer games in children with different psychiatric disorders and normal controls. One hundred one children were recruited for the study (aged between 9 and 12 years). All participants played a motor-racing game on the computer for 1 hour. The TBAG form of the Stroop task was administered to all participants twice, before playing and immediately after playing the game. Participants with improved posttest scores, compared to their pretest scores, used the computer on average 0.67 +/- 1.1 hr/day, while average daily computer use was 1.6 +/- 1.4 hr/day and 1.3 +/- 0.9 hr/day for participants with worse or unaltered scores, respectively. According to the regression model, male gender, younger age, duration of daily computer use, and ADHD inattention type were found to be independent risk factors for worsened posttest scores. Time spent playing computer games can exert a short-term effect on attention as measured by the Stroop test.

  12. Adaptive security protocol selection for mobile computing

    NARCIS (Netherlands)

    Pontes Soares Rocha, B.; Costa, D.N.O.; Moreira, R.A.; Rezende, C.G.; Loureiro, A.A.F.; Boukerche, A.

    2010-01-01

    The mobile computing paradigm has introduced new problems for application developers. Challenges include heterogeneity of hardware, software, and communication protocols, variability of resource limitations and varying wireless channel quality. In this scenario, security becomes a major concern for

  13. Computing three-point functions for short operators

    International Nuclear Information System (INIS)

    Bargheer, Till (Institute for Advanced Study, Princeton, NJ); Minahan, Joseph A.; Pereira, Raul

    2013-11-01

    We compute the three-point structure constants for short primary operators of N=4 super Yang-Mills theory to leading order in 1/√(λ) by mapping the problem to a flat-space string theory calculation. We check the validity of our procedure by comparing to known results for three chiral primaries. We then compute the three-point functions for any combination of chiral and non-chiral primaries, with the non-chiral primaries all dual to string states at the first massive level. Along the way we find many cancellations that leave us with simple expressions, suggesting that integrability is playing an important role.

  14. Computing three-point functions for short operators

    Energy Technology Data Exchange (ETDEWEB)

    Bargheer, Till [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Institute for Advanced Study, Princeton, NJ (United States). School of Natural Sciences]; Minahan, Joseph A.; Pereira, Raul [Uppsala Univ. (Sweden). Dept. of Physics and Astronomy]

    2013-11-15

    We compute the three-point structure constants for short primary operators of N=4 super Yang-Mills theory to leading order in 1/√(λ) by mapping the problem to a flat-space string theory calculation. We check the validity of our procedure by comparing to known results for three chiral primaries. We then compute the three-point functions for any combination of chiral and non-chiral primaries, with the non-chiral primaries all dual to string states at the first massive level. Along the way we find many cancellations that leave us with simple expressions, suggesting that integrability is playing an important role.

  15. Simple and Effective Algorithms: Computer-Adaptive Testing.

    Science.gov (United States)

    Linacre, John Michael

    Computer-adaptive testing (CAT) allows improved security, greater scoring accuracy, shorter testing periods, quicker availability of results, and reduced guessing and other undesirable test behavior. Simple approaches can be applied by the classroom teacher, or other content specialist, who possesses simple computer equipment and elementary…
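
    As a concrete illustration of the simple approaches the record alludes to, here is a toy CAT loop under the Rasch model: maximum-information item selection with a Newton-Raphson ability update. The item bank, test length, and true ability are simulated assumptions.

```python
# Toy computer-adaptive test: Rasch model, maximum-information selection.
import numpy as np

rng = np.random.default_rng(0)
b = rng.normal(0.0, 1.0, 200)          # item difficulty bank (illustrative)
true_theta = 0.7
theta, items, resp = 0.0, [], []

def p(theta, b):
    return 1.0 / (1.0 + np.exp(-(theta - b)))

for _ in range(25):                     # 25-item adaptive test
    info = p(theta, b) * (1.0 - p(theta, b))    # Rasch item information
    info[items] = -np.inf                       # never reuse an item
    j = int(np.argmax(info))                    # maximum-information pick
    items.append(j)
    resp.append(rng.random() < p(true_theta, b[j]))
    for _ in range(10):                         # Newton-Raphson MLE update
        pj = p(theta, b[items])
        step = np.sum(np.array(resp) - pj) / max(np.sum(pj * (1 - pj)), 1e-6)
        theta = float(np.clip(theta + step, -4.0, 4.0))
print(f"estimated ability: {theta:.2f}")
```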

  16. Short-term electric load forecasting using computational intelligence methods

    OpenAIRE

    Jurado, Sergio; Peralta, J.; Nebot, Àngela; Mugica, Francisco; Cortez, Paulo

    2013-01-01

    Accurate time series forecasting is a key issue to support individual and organizational decision making. In this paper, we introduce several methods for short-term electric load forecasting. All the presented methods stem from computational intelligence techniques: Random Forest, Nonlinear Autoregressive Neural Networks, Evolutionary Support Vector Machines and Fuzzy Inductive Reasoning. The performance of the suggested methods is experimentally justified with several experiments carried out...

  17. Building fast, reliable, and adaptive software for computational science

    International Nuclear Information System (INIS)

    Rendell, A P; Antony, J; Armstrong, W; Janes, P; Yang, R

    2008-01-01

    Building fast, reliable, and adaptive software is a constant challenge for computational science, especially given recent developments in computer architecture. This paper outlines some of our efforts to address these three issues in the context of computational chemistry. First, a simple linear performance model that can be used to model and predict the performance of Hartree-Fock calculations is discussed. Second, the use of interval arithmetic to assess the numerical reliability of the sort of integrals used in electronic structure methods is presented. Third, use of dynamic code modification as part of a framework to support adaptive software is outlined.

  18. Unauthorised adaptation of computer programmes - is criminalisation a solution?

    Directory of Open Access Journals (Sweden)

    L Muswaka

    2011-12-01

    In Haupt t/a Softcopy v Brewers Marketing Intelligence (Pty) Ltd 2006 4 SA 458 (SCA), Haupt sought to enforce a copyright claim in the Data Explorer computer programme against Brewers Marketing Intelligence (Pty) Ltd. His claim was dismissed in the High Court and he appealed to the Supreme Court of Appeal. The Court held that copyright in the Data Explorer programme vested in Haupt. Haupt acquired copyright in the Data Explorer programme regardless of the fact that the programme was the result of an unauthorised adaptation of the Project AMPS programme, which belonged to Brewers Marketing Intelligence (Pty) Ltd. This case note inter alia analyses the possibility of an author being sued for infringement even though he has acquired copyright in a work that he created by making unauthorised adaptations to another's copyright material. Furthermore, it examines whether or not the law adequately protects copyright owners in situations where infringement takes the form of unauthorised adaptations of computer programmes. It is argued that the protection afforded by the Copyright Act 98 of 1978 (Copyright Act) in terms of section 27(1) to copyright owners of computer programmes is narrowly defined. It excludes from its ambit of criminal liability the act of making an unauthorised adaptation of a computer programme. The issue that is considered is therefore whether or not the unauthorised adaptation of computer programmes should attract a criminal sanction. In addressing this issue, and with the aim of making recommendations, the legal position in the United Kingdom (UK) is analysed. From the analysis it is recommended that the Copyright Act be amended by the insertion of a new section, section 27(1)(A), which will make the act of making an unauthorised adaptation of a computer programme an offence. This recommended section will close the gap that currently exists in our law with regard to unauthorised adaptations of computer programmes.

  19. An Adaptive Sensor Mining Framework for Pervasive Computing Applications

    Science.gov (United States)

    Rashidi, Parisa; Cook, Diane J.

    Analyzing sensor data in pervasive computing applications brings unique challenges to the KDD community. The challenge is heightened when the underlying data source is dynamic and the patterns change. We introduce a new adaptive mining framework that detects patterns in sensor data, and more importantly, adapts to the changes in the underlying model. In our framework, the frequent and periodic patterns of data are first discovered by the Frequent and Periodic Pattern Miner (FPPM) algorithm; and then any changes in the discovered patterns over the lifetime of the system are discovered by the Pattern Adaptation Miner (PAM) algorithm, in order to adapt to the changing environment. This framework also captures vital context information present in pervasive computing applications, such as the startup triggers and temporal information. In this paper, we present a description of our mining framework and validate the approach using data collected in the CASAS smart home testbed.

  20. simulate_CAT: A Computer Program for Post-Hoc Simulation for Computerized Adaptive Testing

    Directory of Open Access Journals (Sweden)

    İlker Kalender

    2015-06-01

    This paper presents computer software developed by the author. The software conducts post-hoc simulations for computerized adaptive testing based on real responses of examinees to paper-and-pencil tests, under different parameters that can be defined by the user. In this paper, brief information is given about post-hoc simulations. After that, the working principle of the software is explained and a sample simulation with the required input files is shown. Finally, the output files are described.

  1. Effect of Short-Term Study Abroad Programs on Students' Cultural Adaptability

    Science.gov (United States)

    Mapp, Susan C.

    2012-01-01

    The number of U.S. students studying abroad has been growing, particularly those participating in short-term trips. However, literature on the effect of these short-term trips is lacking. The purpose of this study was to assess quantitatively the effect on bachelor students' cross-cultural adaptability using a pre-post design. Significant changes…

  2. An adaptive random search for short term generation scheduling with network constraints.

    Directory of Open Access Journals (Sweden)

    J A Marmolejo

    This paper presents an adaptive random search approach to address a short-term generation scheduling problem with network constraints, which determines the startup and shutdown schedules of thermal units over a given planning horizon. In this model, we consider the transmission network through capacity limits and line losses. The mathematical model is stated in the form of a Mixed Integer Non-Linear Problem with binary variables. The proposed heuristic is a population-based method that generates a set of new potential solutions via a random search strategy. The random search is based on the Markov Chain Monte Carlo method. The main key of the proposed method is that the noise level of the random search is adaptively controlled in order to explore and exploit the entire search space. In order to improve the solutions, we couple a local search into the random search process. Several test systems are presented to evaluate the performance of the proposed heuristic. We use a commercial optimizer to compare the quality of the solutions provided by the proposed method. The solution of the proposed algorithm showed a significant reduction in computational effort with respect to the full-scale outer approximation commercial solver. Numerical results show the potential and robustness of our approach.
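
    A toy version of the adaptive noise-control idea described above is sketched below on a continuous test function rather than the paper's mixed-integer scheduling model; the acceptance rule and noise schedule are illustrative assumptions.

```python
# Adaptive random search: proposal noise adapts to recent success/failure.
import numpy as np

def adaptive_random_search(f, x0, iters=3000, seed=0):
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    fx, sigma = f(x), 1.0
    for _ in range(iters):
        cand = x + rng.normal(0.0, sigma, size=x.shape)  # random proposal
        fc = f(cand)
        if fc < fx:                          # accept improving moves
            x, fx = cand, fc
            sigma = min(sigma * 1.1, 10.0)   # success: search more boldly
        else:
            sigma = max(sigma * 0.98, 1e-3)  # failure: focus the search
    return x, fx

# Example: minimize a shifted sphere function.
best_x, best_f = adaptive_random_search(lambda v: np.sum((v - 3.0) ** 2),
                                        np.zeros(5))
print(round(best_f, 6))
```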

  3. Adaptive synchrosqueezing based on a quilted short-time Fourier transform

    Science.gov (United States)

    Berrian, Alexander; Saito, Naoki

    2017-08-01

    In recent years, the synchrosqueezing transform (SST) has gained popularity as a method for the analysis of signals that can be broken down into multiple components determined by instantaneous amplitudes and phases. One such version of SST, based on the short-time Fourier transform (STFT), enables the sharpening of instantaneous frequency (IF) information derived from the STFT, as well as the separation of amplitude-phase components corresponding to distinct IF curves. However, this SST is limited by the time-frequency resolution of the underlying window function, and may not resolve signals exhibiting diverse time-frequency behaviors with sufficient accuracy. In this work, we develop a framework for an SST based on a "quilted" short-time Fourier transform (SST-QSTFT), which allows adaptation to signal behavior in separate time-frequency regions through the use of multiple windows. This motivates us to introduce a discrete reassignment frequency formula based on a finite difference of the phase spectrum, ensuring computational accuracy for a wider variety of windows. We develop a theoretical framework for the SST-QSTFT in both the continuous and the discrete settings, and describe an algorithm for the automatic selection of optimal windows depending on the region of interest. Using synthetic data, we demonstrate the superior numerical performance of SST-QSTFT relative to other SST methods in a noisy context. Finally, we apply SST-QSTFT to audio recordings of animal calls to demonstrate the potential of our method for the analysis of real bioacoustic signals.
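
    A bare-bones, fixed-window synchrosqueezing pass (the starting point that SST-QSTFT generalizes with multiple, region-adapted windows) can be written as follows. The chirp test signal and window parameters are invented; the finite-difference IF formula follows the standard phase-vocoder construction mentioned above.

```python
# Minimal fixed-window synchrosqueezing via phase finite differences.
import numpy as np
from scipy.signal import stft

fs, hop = 1000.0, 64
t = np.arange(0, 1.0, 1.0 / fs)
x = np.cos(2 * np.pi * (100 * t + 40 * t ** 2))   # chirp sweeping 100->180 Hz

f, tt, V = stft(x, fs=fs, nperseg=256, noverlap=256 - hop)
dt = hop / fs
df = f[1] - f[0]
# Phase-vocoder IF estimate: phase advance between frames, minus each bin's
# expected advance, re-centered on the bin frequency.
expected = np.exp(-1j * 2 * np.pi * f * dt)[:, None]
dphi = np.angle(V[:, 1:] * np.conj(V[:, :-1]) * expected)
inst_f = f[:, None] + dphi / (2 * np.pi * dt)

# Squeeze: reassign each bin's energy to the bin nearest its estimated IF.
T = np.zeros((len(f), V.shape[1] - 1))
for j in range(T.shape[1]):
    for i in range(len(f)):
        k = int(round(inst_f[i, j] / df))
        if 0 <= k < len(f):
            T[k, j] += abs(V[i, j + 1]) ** 2
print(T.shape)
```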

  4. Improving personality facet scores with multidimensional computer adaptive testing

    DEFF Research Database (Denmark)

    Makransky, Guido; Mortensen, Erik Lykke; Glas, Cees A W

    2013-01-01

    personality tests contain many highly correlated facets. This article investigates the possibility of increasing the precision of the NEO PI-R facet scores by scoring items with multidimensional item response theory and by efficiently administering and scoring items with multidimensional computer adaptive...

  5. Computer-Adaptive Testing in Second Language Contexts.

    Science.gov (United States)

    Chalhoub-Deville, Micheline; Deville, Craig

    1999-01-01

    Provides a broad overview of computerized testing issues with an emphasis on computer-adaptive testing (CAT). A survey of the potential benefits and drawbacks of CAT is given, the process of CAT development is described, and some L2 instruments developed to assess various language skills are summarized. (Author/VWL)

  6. Sequential decision making in computational sustainability via adaptive submodularity

    Science.gov (United States)

    Krause, Andreas; Golovin, Daniel; Converse, Sarah J.

    2015-01-01

    Many problems in computational sustainability require making a sequence of decisions in complex, uncertain environments. Such problems are generally notoriously difficult. In this article, we review the recently discovered notion of adaptive submodularity, an intuitive diminishing returns condition that generalizes the classical notion of submodular set functions to sequential decision problems. Problems exhibiting the adaptive submodularity property can be efficiently and provably near-optimally solved using simple myopic policies. We illustrate this concept in several case studies of interest in computational sustainability: First, we demonstrate how it can be used to efficiently plan for resolving uncertainty in adaptive management scenarios. Secondly, we show how it applies to dynamic conservation planning for protecting endangered species, a case study carried out in collaboration with the US Geological Survey and the US Fish and Wildlife Service.
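
    The myopic policy that adaptive submodularity certifies can be illustrated with a toy stochastic-coverage problem: place sensors that may fail, observe outcomes, and recompute expected marginal gains. All problem data below are invented.

```python
# Adaptive greedy policy on a toy stochastic sensor-coverage problem.
import numpy as np

rng = np.random.default_rng(1)
n_items, n_regions = 8, 30
covers = rng.random((n_items, n_regions)) < 0.3   # regions each sensor covers
p_work = rng.uniform(0.5, 0.95, n_items)          # per-sensor success rate

covered = np.zeros(n_regions, dtype=bool)
chosen = []
for _ in range(4):                                # budget of 4 placements
    # Expected marginal gain = P(works) * number of newly covered regions.
    gains = [p_work[i] * np.sum(covers[i] & ~covered) if i not in chosen
             else -1.0 for i in range(n_items)]
    best = int(np.argmax(gains))
    chosen.append(best)
    if rng.random() < p_work[best]:               # observe realized state
        covered |= covers[best]
print(chosen, int(covered.sum()))
```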

  7. Quantum Dynamics with Short-Time Trajectories and Minimal Adaptive Basis Sets.

    Science.gov (United States)

    Saller, Maximilian A C; Habershon, Scott

    2017-07-11

    Methods for solving the time-dependent Schrödinger equation via basis set expansion of the wave function can generally be categorized as having either static (time-independent) or dynamic (time-dependent) basis functions. We have recently introduced an alternative simulation approach which represents a middle road between these two extremes, employing dynamic (classical-like) trajectories to create a static basis set of Gaussian wavepackets in regions of phase-space relevant to future propagation of the wave function [J. Chem. Theory Comput., 11, 8 (2015)]. Here, we propose and test a modification of our methodology which aims to reduce the size of basis sets generated in our original scheme. In particular, we employ short-time classical trajectories to continuously generate new basis functions for short-time quantum propagation of the wave function; to avoid the continued growth of the basis set describing the time-dependent wave function, we employ Matching Pursuit to periodically minimize the number of basis functions required to accurately describe the wave function. Overall, this approach generates a basis set which is adapted to evolution of the wave function while also being as small as possible. In applications to challenging benchmark problems, namely a 4-dimensional model of photoexcited pyrazine and three different double-well tunnelling problems, we find that our new scheme enables accurate wave function propagation with basis sets which are around an order-of-magnitude smaller than our original trajectory-guided basis set methodology, highlighting the benefits of adaptive strategies for wave function propagation.

  8. Efficient computation of smoothing splines via adaptive basis sampling

    KAUST Repository

    Ma, Ping

    2015-06-24

    © 2015 Biometrika Trust. Smoothing splines provide flexible nonparametric regression estimators. However, the high computational cost of smoothing splines for large datasets has hindered their wide application. In this article, we develop a new method, named adaptive basis sampling, for efficient computation of smoothing splines in super-large samples. Except for the univariate case where the Reinsch algorithm is applicable, a smoothing spline for a regression problem with sample size n can be expressed as a linear combination of n basis functions and its computational complexity is generally O(n³). We achieve a more scalable computation in the multivariate case by evaluating the smoothing spline using a smaller set of basis functions, obtained by an adaptive sampling scheme that uses values of the response variable. Our asymptotic analysis shows that smoothing splines computed via adaptive basis sampling converge to the true function at the same rate as full basis smoothing splines. Using simulation studies and a large-scale deep earth core-mantle boundary imaging study, we show that the proposed method outperforms a sampling method that does not use the values of response variables.
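
    A simplified stand-in for the idea above: fit a penalized regression using only a sampled subset of basis functions, with basis locations chosen by stratifying on the response values. A Gaussian kernel replaces the exact spline reproducing kernel, so this is an illustration of the sampling idea, not the paper's estimator.

```python
# Penalized regression on a response-stratified sample of basis centers.
import numpy as np

def fit_sampled_basis(x, y, n_basis=30, lam=1e-3, width=0.1, seed=0):
    rng = np.random.default_rng(seed)
    # Adaptive sampling: stratify observations by response value so that
    # all ranges of y are represented among the basis centers.
    order = np.argsort(y)
    strata = np.array_split(order, n_basis)
    centers = x[[rng.choice(s) for s in strata]]
    K = np.exp(-((x[:, None] - centers[None, :]) / width) ** 2)
    coef = np.linalg.solve(K.T @ K + lam * np.eye(n_basis), K.T @ y)
    predict = lambda xq: np.exp(
        -((np.asarray(xq)[:, None] - centers[None, :]) / width) ** 2) @ coef
    return predict

x = np.linspace(0, 1, 2000)
y = np.sin(8 * x) + 0.1 * np.random.default_rng(2).normal(size=x.size)
f_hat = fit_sampled_basis(x, y)
print(float(f_hat([0.5])[0]))
```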

  9. Efficient computation of smoothing splines via adaptive basis sampling

    KAUST Repository

    Ma, Ping; Huang, Jianhua Z.; Zhang, Nan

    2015-01-01

    © 2015 Biometrika Trust. Smoothing splines provide flexible nonparametric regression estimators. However, the high computational cost of smoothing splines for large datasets has hindered their wide application. In this article, we develop a new method, named adaptive basis sampling, for efficient computation of smoothing splines in super-large samples. Except for the univariate case where the Reinsch algorithm is applicable, a smoothing spline for a regression problem with sample size n can be expressed as a linear combination of n basis functions and its computational complexity is generally O(n³). We achieve a more scalable computation in the multivariate case by evaluating the smoothing spline using a smaller set of basis functions, obtained by an adaptive sampling scheme that uses values of the response variable. Our asymptotic analysis shows that smoothing splines computed via adaptive basis sampling converge to the true function at the same rate as full basis smoothing splines. Using simulation studies and a large-scale deep earth core-mantle boundary imaging study, we show that the proposed method outperforms a sampling method that does not use the values of response variables.

  10. Polynomial Phase Estimation Based on Adaptive Short-Time Fourier Transform.

    Science.gov (United States)

    Jing, Fulong; Zhang, Chunjie; Si, Weijian; Wang, Yu; Jiao, Shuhong

    2018-02-13

    Polynomial phase signals (PPSs) have numerous applications in many fields including radar, sonar, geophysics, and radio communication systems. Therefore, estimation of PPS coefficients is very important. In this paper, a novel approach for PPS parameters estimation based on adaptive short-time Fourier transform (ASTFT), called the PPS-ASTFT estimator, is proposed. Using the PPS-ASTFT estimator, both one-dimensional and multi-dimensional searches and error propagation problems, which widely exist in PPSs field, are avoided. In the proposed algorithm, the instantaneous frequency (IF) is estimated by S-transform (ST), which can preserve information on signal phase and provide a variable resolution similar to the wavelet transform (WT). The width of the ASTFT analysis window is equal to the local stationary length, which is measured by the instantaneous frequency gradient (IFG). The IFG is calculated by the principal component analysis (PCA), which is robust to the noise. Moreover, to improve estimation accuracy, a refinement strategy is presented to estimate signal parameters. Since the PPS-ASTFT avoids parameter search, the proposed algorithm can be computed in a reasonable amount of time. The estimation performance, computational cost, and implementation of the PPS-ASTFT are also analyzed. The conducted numerical simulations support our theoretical results and demonstrate an excellent statistical performance of the proposed algorithm.

  11. Short-term adaptations as a response to travel time increases: results of a stated adaptation experiment

    NARCIS (Netherlands)

    Psarra, I.; Arentze, T.A.; Timmermans, H.J.P.

    2016-01-01

    This study focused on short-term dynamics of activity-travel behavior as a response to travel time increases. It is assumed that short-term changes are triggered by stress, which is defined as the deviation between an individual’s aspirations and his or her daily experiences. When stress exceeds a

  12. Distributed Problem Solving: Adaptive Networks with a Computer Intermediary Resource. Intelligent Executive Computer Communication

    Science.gov (United States)

    1991-06-01

    Interim Report: Distributed Problem Solving: Adaptive Networks With a Computer Intermediary Resource: Intelligent Executive Computer Communication. John Lyman and Carla J. Conaway, University of California at Los Angeles. Proceedings of The National Conference on Artificial Intelligence, pages 181-184, The American Association for Artificial Intelligence, Pittsburgh.

  13. Computational modeling of ultra-short-pulse ablation of enamel

    Energy Technology Data Exchange (ETDEWEB)

    London, R.A.; Bailey, D.S.; Young, D.A. [and others]

    1996-02-29

    A computational model for the ablation of tooth enamel by ultra-short laser pulses is presented. The role of simulations using this model in designing and understanding laser drilling systems is discussed. Pulses of duration 300 fs and intensity greater than 10^12 W/cm^2 are considered. Laser absorption proceeds via a multi-photon-initiated plasma mechanism. The hydrodynamic response is calculated with a finite difference method, using an equation of state constructed from thermodynamic functions including electronic, ion motion, and chemical binding terms. Results for the ablation efficiency are presented. An analytic model describing the ablation threshold and ablation depth is presented. Thermal coupling to the remaining tissue and long-time thermal conduction are calculated. Simulation results are compared to experimental measurements of the ablation efficiency. Desired improvements in the model are presented.
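
    The record does not spell out its analytic ablation-depth relation; the logarithmic law below is a form commonly used for plasma-mediated ablation and is shown only as a hedged illustration, with invented numbers.

```python
# Common logarithmic ablation-depth law (illustrative, not the paper's model).
import numpy as np

def ablation_depth(fluence, threshold, delta):
    """Depth per pulse: d = delta * ln(F / F_th) above threshold, else 0.
    delta is an effective absorption length; all numbers are assumptions."""
    fluence = np.asarray(fluence, dtype=float)
    return np.where(fluence > threshold,
                    delta * np.log(fluence / threshold), 0.0)

print(ablation_depth([1.0, 2.0, 8.0], threshold=2.0, delta=0.5))
```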

  14. New challenges in grid generation and adaptivity for scientific computing

    CERN Document Server

    Formaggia, Luca

    2015-01-01

    This volume collects selected contributions from the “Fourth Tetrahedron Workshop on Grid Generation for Numerical Computations”, which was held in Verbania, Italy in July 2013. The previous editions of this Workshop were hosted by the Weierstrass Institute in Berlin (2005), by INRIA Rocquencourt in Paris (2007), and by Swansea University (2010). This book covers different, though related, aspects of the field: the generation of quality grids for complex three-dimensional geometries; parallel mesh generation algorithms; mesh adaptation, including both theoretical and implementation aspects; grid generation and adaptation on surfaces – all with an interesting mix of numerical analysis, computer science and strongly application-oriented problems.

  15. Overview of adaptive finite element analysis in computational geodynamics

    Science.gov (United States)

    May, D. A.; Schellart, W. P.; Moresi, L.

    2013-10-01

    The use of numerical models to develop insight and intuition into the dynamics of the Earth over geological time scales is a firmly established practice in the geodynamics community. As our depth of understanding grows, and hand-in-hand with improvements in analytical techniques and higher resolution remote sensing of the physical structure and state of the Earth, there is a continual need to develop more efficient, accurate and reliable numerical techniques. This is necessary to ensure that we can meet the challenge of generating robust conclusions, interpretations and predictions from improved observations. In adaptive numerical methods, the desire is generally to maximise the quality of the numerical solution for a given amount of computational effort. Neither of these terms has a unique, universal definition, but typically there is a trade off between the number of unknowns we can calculate to obtain a more accurate representation of the Earth, and the resources (time and computational memory) required to compute them. In the engineering community, this topic has been extensively examined using the adaptive finite element (AFE) method. Recently, the applicability of this technique to geodynamic processes has started to be explored. In this review we report on the current status and usage of spatially adaptive finite element analysis in the field of geodynamics. The objective of this review is to provide a brief introduction to the area of spatially adaptive finite element analysis, including a summary of different techniques to define spatial adaptation and of different approaches to guide the adaptive process in order to control the discretisation error inherent within the numerical solution. An overview of the current state of the art in adaptive modelling in geodynamics is provided, together with a discussion pertaining to the issues related to using adaptive analysis techniques and perspectives for future research in this area. Additionally, we also provide a

  16. Several problems of algorithmization in integrated computation programs on third generation computers for short circuit currents in complex power networks

    Energy Technology Data Exchange (ETDEWEB)

    Krylov, V.A.; Pisarenko, V.P.

    1982-01-01

    Methods of modeling complex power networks with short circuits in the networks are described. The methods are implemented in integrated computation programs for short circuit currents and equivalents in electrical networks with a large number of branch points (up to 1000) on a computer with a limited online memory capacity (M = 4030 for the computer).

  17. Computer Adaptive Multistage Testing: Practical Issues, Challenges and Principles

    Directory of Open Access Journals (Sweden)

    Halil Ibrahim SARI

    2016-12-01

    The purpose of many tests in educational and psychological measurement is to measure test takers' latent trait scores from responses given to a set of items. Over the years, this has been done by traditional methods (paper-and-pencil tests). However, compared to other test administration models (e.g., adaptive testing), traditional methods are extensively criticized in terms of producing low measurement accuracy and long test lengths. Adaptive testing has been proposed to overcome these problems. There are two popular adaptive testing approaches: computerized adaptive testing (CAT) and computer adaptive multistage testing (ca-MST). The former is a well-known approach that has been predominantly used in this field. We believe that researchers and practitioners are fairly familiar with many aspects of CAT because it has more than a hundred years of history. However, the same is not true for the latter. Since ca-MST is relatively new, many researchers are not familiar with its features. The purpose of this study is to closely examine the characteristics of ca-MST, including its working principle, the adaptation procedure called the routing method, test assembly, and scoring, and to provide an overview to researchers, with the aim of drawing researchers' attention to ca-MST and encouraging them to contribute to research in this area. Books, software and future work for ca-MST are also discussed.
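
    The routing method mentioned above can be made concrete with a minimal number-correct rule for a two-stage panel; the cut scores and module labels below are invented for illustration.

```python
# Minimal number-correct routing for a two-stage ca-MST panel (1-3 design).
def route(num_correct_stage1, cuts=(4, 7)):
    """Route on a 10-item stage-1 module: <=4 -> easy, 5-7 -> medium,
    >=8 -> hard. Cut scores are illustrative placeholders."""
    if num_correct_stage1 <= cuts[0]:
        return "easy"
    elif num_correct_stage1 <= cuts[1]:
        return "medium"
    return "hard"

print([route(s) for s in (3, 6, 9)])   # ['easy', 'medium', 'hard']
```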

  18. An Adaptive Reordered Method for Computing PageRank

    Directory of Open Access Journals (Sweden)

    Yi-Ming Bu

    2013-01-01

    We propose an adaptive reordered method to deal with the PageRank problem. It has been shown that one can reorder the hyperlink matrix of the PageRank problem to calculate a reduced system and get the full PageRank vector through forward substitutions. This method can provide a speedup for calculating the PageRank vector. We observe that in the existing reordered method, the cost of the recursive reordering procedure could offset the computational reduction brought by minimizing the dimension of the linear system. With this observation, we introduce an adaptive reordered method to accelerate the total calculation, in which we terminate the reordering procedure appropriately instead of reordering to the end. Numerical experiments show the effectiveness of this adaptive reordered method.
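
    A one-level version of the reordering idea is sketched below: permute dangling pages (all-zero rows of the link matrix) to the end, solve the reduced non-dangling block, then recover the dangling entries by forward substitution. The paper applies the reordering recursively and terminates it adaptively; only a single level is shown here, under a uniform personalization vector.

```python
# One level of reordered PageRank: reduced solve plus forward substitution.
import numpy as np

def reordered_pagerank(H, alpha=0.85):
    """H: row-substochastic link matrix (dangling rows are all zero)."""
    n = H.shape[0]
    v = np.ones(n) / n                       # uniform personalization
    dangling = ~H.any(axis=1)
    nd = ~dangling
    H11 = H[np.ix_(nd, nd)]                  # links among non-dangling pages
    H12 = H[np.ix_(nd, dangling)]            # links into dangling pages
    # Reduced system: (I - alpha*H11^T) x1 = (1 - alpha) v1.
    x1 = np.linalg.solve(np.eye(nd.sum()) - alpha * H11.T,
                         (1 - alpha) * v[nd])
    # Forward substitution: x2 = (1 - alpha) v2 + alpha * H12^T x1.
    x2 = (1 - alpha) * v[dangling] + alpha * H12.T @ x1
    x = np.empty(n)
    x[nd], x[dangling] = x1, x2
    return x / x.sum()                       # normalize to a distribution

H = np.array([[0., .5, .5, 0.],
              [0., 0., 1., 0.],
              [0., 0., 0., 0.],              # page 2 is dangling
              [1., 0., 0., 0.]])
print(reordered_pagerank(H))
```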

  19. Techniques for grid manipulation and adaptation. [computational fluid dynamics

    Science.gov (United States)

    Choo, Yung K.; Eisemann, Peter R.; Lee, Ki D.

    1992-01-01

    Two approaches have been taken to provide systematic grid manipulation for improved grid quality. One is the control point form (CPF) of algebraic grid generation. It provides explicit control of the physical grid shape and grid spacing through the movement of the control points. It works well in the interactive computer graphics environment and hence can be a good candidate for integration with other emerging technologies. The other approach is grid adaptation using a numerical mapping between the physical space and a parametric space. Grid adaptation is achieved by modifying the mapping functions through the effects of grid control sources. The adaptation process can be repeated in a cyclic manner if satisfactory results are not achieved after a single application.
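
    As a one-dimensional illustration of the second approach (not the authors' formulation), grid points can be redistributed by equidistributing a user-supplied weight function that plays the role of a grid control source; repeating the cycle mimics the cyclic adaptation described above.

```python
import numpy as np

def adapt_grid_1d(x, weight, n_cycles=5):
    """Equidistribution sketch: relocate grid points so that the integral of
    a positive weight (e.g. solution-gradient magnitude) is equal between
    neighboring points."""
    for _ in range(n_cycles):
        w = 0.5 * (weight(x[:-1]) + weight(x[1:]))           # cell weights
        cum = np.concatenate(([0.0], np.cumsum(w * np.diff(x))))
        targets = np.linspace(0.0, cum[-1], x.size)
        x = np.interp(targets, cum, x)                       # new locations
    return x
```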

  20. Method and system for environmentally adaptive fault tolerant computing

    Science.gov (United States)

    Copenhaver, Jason L. (Inventor); Jeremy, Ramos (Inventor); Wolfe, Jeffrey M. (Inventor); Brenner, Dean (Inventor)

    2010-01-01

    A method and system for adapting fault tolerant computing. The method includes the steps of measuring an environmental condition representative of an environment. The sensitivity of an on-board processing system to the measured environmental condition is then assessed. It is determined whether to reconfigure a fault tolerance of the on-board processing system based in part on the measured environmental condition. The fault tolerance of the on-board processing system may be reconfigured based in part on the measured environmental condition.
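
    The decision step might look like the following sketch; the risk formula, thresholds and redundancy mode names are invented for illustration and are not part of the patent.

```python
def select_redundancy_mode(env_level, sensitivity):
    """Hypothetical reconfiguration rule: combine a measured environmental
    condition (e.g. radiation level) with the on-board processor's
    sensitivity to it, and pick a fault-tolerance mode accordingly."""
    risk = env_level * sensitivity
    if risk > 0.75:
        return "TMR"      # triple modular redundancy for harsh conditions
    if risk > 0.25:
        return "DMR"      # duplex with comparison for moderate risk
    return "simplex"      # no redundancy needed in benign conditions
```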

  1. Adaptation and Validation of the Foot Function Index-Revised Short Form into Polish

    OpenAIRE

    Rutkowski, Radosław; Gałczyńska-Rusin, Małgorzata; Gizińska, Małgorzata; Straburzyński-Lupa, Marcin; Zdanowska, Agata; Romanowski, Mateusz Wojciech; Romanowski, Wojciech; Budiman-Mak, Elly; Straburzyńska-Lupa, Anna

    2017-01-01

    Purpose The aim of the present study was to adapt the Foot Function Index-Revised Short Form (FFI-RS) questionnaire into Polish and verify its reliability and validity in a group of patients with rheumatoid arthritis (RA). Methods The study included 211 patients suffering from RA. The FFI-RS questionnaire underwent standard linguistic adaptation and its psychometric parameters were investigated. The enrolled participants had been recruited over seven months as a convenience sample from the rheu...

  2. Adaptive Management of Computing and Network Resources for Spacecraft Systems

    Science.gov (United States)

    Pfarr, Barbara; Welch, Lonnie R.; Detter, Ryan; Tjaden, Brett; Huh, Eui-Nam; Szczur, Martha R. (Technical Monitor)

    2000-01-01

    It is likely that NASA's future spacecraft systems will consist of distributed processes which will handle dynamically varying workloads in response to perceived scientific events, the spacecraft environment, spacecraft anomalies and user commands. Since not all situations and possible uses of sensors can be anticipated during pre-deployment phases, an approach for dynamically adapting the allocation of distributed computational and communication resources is needed. To address this, we are evolving the DeSiDeRaTa adaptive resource management approach to enable reconfigurable ground and space information systems. The DeSiDeRaTa approach embodies a set of middleware mechanisms for adapting resource allocations, and a framework for reasoning about the real-time performance of distributed application systems. The framework and middleware will be extended to accommodate (1) the dynamic aspects of intra-constellation network topologies, and (2) the complete real-time path from the instrument to the user. We are developing a ground-based testbed that will enable NASA to perform early evaluation of adaptive resource management techniques without the expense of first deploying them in space. The benefits of the proposed effort are numerous, including the ability to use sensors in new ways not anticipated at design time; the production of information technology that ties the sensor web together; the accommodation of greater numbers of missions with fewer resources; and the opportunity to leverage the DeSiDeRaTa project's expertise, infrastructure and models for adaptive resource management for distributed real-time systems.

  3. Processing Optimization of Typed Resources with Synchronized Storage and Computation Adaptation in Fog Computing

    Directory of Open Access Journals (Sweden)

    Zhengyang Song

    2018-01-01

    Full Text Available Wide application of Internet of Things (IoT) systems has been increasingly demanding more hardware facilities for processing various resources, including data, information, and knowledge. With the rapid growth in the quantity of generated resources, it is difficult to adapt to this situation using traditional cloud computing models. Fog computing enables storage and computing services to be performed at the edge of the network to extend cloud computing. However, there are problems such as restricted computation, limited storage, and expensive network bandwidth in Fog computing applications. It is a challenge to balance the distribution of network resources. We propose a processing optimization mechanism for typed resources with synchronized storage and computation adaptation in Fog computing. In this mechanism, we process typed resources in a wireless-network-based three-tier architecture consisting of Data Graph, Information Graph, and Knowledge Graph. The proposed mechanism aims to minimize processing cost over network, computation, and storage while maximizing the performance of processing in a business-value-driven manner. Simulation results show that the proposed approach improves the ratio of performance over user investment. Meanwhile, conversions between resource types deliver support for dynamically allocating network resources.

  4. Adaptive scapula bone remodeling computational simulation: Relevance to regenerative medicine

    International Nuclear Information System (INIS)

    Sharma, Gulshan B.; Robertson, Douglas D.

    2013-01-01

    Shoulder arthroplasty success has been attributed to many factors including bone quality, soft tissue balancing, surgeon experience, and implant design. Improved long-term success is primarily limited by glenoid implant loosening. Prosthesis design examines materials and shape and determines whether the design can withstand a lifetime of use. Finite element (FE) analyses have been extensively used to study stresses and strains produced in implants and bone. However, these static analyses only measure a moment in time and not the adaptive response to the altered environment produced by the therapeutic intervention. Computational analyses that integrate remodeling rules predict how bone will respond over time. Recent work has shown that subject-specific two- and three-dimensional adaptive bone remodeling models are feasible and valid. Feasibility and validation were achieved computationally, simulating bone remodeling using an intact human scapula, initially resetting the scapular bone material properties to be uniform, numerically simulating sequential loading, and comparing the bone remodeling simulation results to the actual scapula’s material properties. A three-dimensional scapula FE bone model was created using volumetric computed tomography images. Muscle and joint load and boundary conditions were applied based on values reported in the literature. Internal bone remodeling was based on element strain-energy density. Initially, all bone elements were assigned a homogeneous density. All loads were applied for 10 iterations. After every iteration, each bone element’s remodeling stimulus was compared to its corresponding reference stimulus and its material properties were modified. The simulation achieved convergence. At the end of the simulation the predicted and actual specimen bone apparent density were plotted and compared. Location of high and low predicted bone density was comparable to the actual specimen. High predicted bone density was greater than
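
    A strain-energy-density update rule of the kind described can be sketched per element as follows; the reference stimulus, remodeling rate and density bounds are illustrative constants, not the study's values.

```python
import numpy as np

def remodel_step(rho, sed, stimulus_ref=0.004, rate=1.0,
                 rho_min=0.01, rho_max=1.8):
    """One remodeling iteration: each element's apparent density rises or
    falls according to how its strain-energy-density stimulus compares with
    the reference stimulus, then is clipped to physiologic bounds."""
    stimulus = sed / rho                     # SED per unit density
    rho_new = rho + rate * (stimulus - stimulus_ref)
    return np.clip(rho_new, rho_min, rho_max)
```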

  5. Intricacies of Feedback in Computer-based Prism Adaptation Therapy

    DEFF Research Database (Denmark)

    Wilms, Inge Linda; Rytter, Hana Malá

    Prism Adaptation Therapy (PAT) is an intervention method for treatment of attentional disorders, such as neglect e.g. 1,2. The method involves repeated pointing at specified targets with or without prism glasses using a specifically designed wooden box. The aim of this study was to ascertain whether the PAT method can be executed with similar effect using a computer with a touch screen. 62 healthy subjects were subjected to two experimental conditions: 1) pointing at targets using the original box, 2) pointing at targets on a computer-attached touch screen. In both conditions, the subjects performed a pre-test consisting of 30 targets without feedback, then an exposure-test of 90 targets with prism glasses and feedback, and finally a post-test of 60 targets, with no glasses and no feedback. Two experiments were carried out, 1) the feedback was provided by showing a cross

  6. Adaptive Dynamic Process Scheduling on Distributed Memory Parallel Computers

    Directory of Open Access Journals (Sweden)

    Wei Shu

    1994-01-01

    Full Text Available One of the challenges in programming distributed memory parallel machines is deciding how to allocate work to processors. This problem is particularly important for computations with unpredictable dynamic behaviors or irregular structures. We present a scheme for dynamic scheduling of medium-grained processes that is useful in this context. The adaptive contracting within neighborhood (ACWN is a dynamic, distributed, load-dependent, and scalable scheme. It deals with dynamic and unpredictable creation of processes and adapts to different systems. The scheme is described and contrasted with two other schemes that have been proposed in this context, namely the randomized allocation and the gradient model. The performance of the three schemes on an Intel iPSC/2 hypercube is presented and analyzed. The experimental results show that even though the ACWN algorithm incurs somewhat larger overhead than the randomized allocation, it achieves better performance in most cases due to its adaptiveness. Its feature of quickly spreading the work helps it outperform the gradient model in performance and scalability.
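
    The core 'contracting within neighborhood' step can be sketched as a purely local decision: a newly created task stays on its node unless some neighbor is sufficiently less loaded. The data structures and threshold below are simplified placeholders, not the paper's implementation.

```python
def acwn_place(task_load, node, load, neighbors, threshold=2.0):
    """ACWN sketch: compare the local load with the least-loaded neighbor
    and migrate the new task only when the difference exceeds a threshold,
    so work spreads quickly yet stays local when loads are balanced."""
    best = min(neighbors[node], key=lambda m: load[m])
    if load[node] - load[best] > threshold:
        load[best] += task_load      # contract the task toward the neighbor
        return best
    load[node] += task_load          # otherwise keep it local
    return node
```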

  7. Configurable multiplier modules for an adaptive computing system

    Directory of Open Access Journals (Sweden)

    O. A. Pfänder

    2006-01-01

    Full Text Available The importance of reconfigurable hardware is increasing steadily. For example, the primary approach of using adaptive systems based on programmable gate arrays and configurable routing resources has gone mainstream, and high-performance programmable logic devices are rivaling traditional application-specific hardwired integrated circuits. Also, the idea of moving from the 2-D domain into a 3-D design which stacks several active layers above each other is gaining momentum in research and industry, to cope with the demand for smaller devices with a higher scale of integration. However, optimized arithmetic blocks in coarse-grain reconfigurable arrays as well as field-programmable architectures still play an important role. In countless digital systems and signal processing applications, multiplication is one of the critical challenges, where in many cases a trade-off between area usage and data throughput has to be made. But the a priori choice of word-length and number representation can also be replaced by a dynamic choice at run-time, in order to improve flexibility, area efficiency and the level of parallelism in computation. In this contribution, we look at an adaptive computing system called 3-D-SoftChip to point out what parameters are crucial to implement flexible multiplier blocks into optimized elements for accelerated processing. The 3-D-SoftChip architecture uses a novel approach to 3-dimensional integration based on flip-chip bonding with indium bumps. The modular construction, the introduction of interfaces to realize the exchange of intermediate data, and the reconfigurable sign handling approach will be explained, as well as a beneficial way to handle and distribute the numerous required control signals.

  8. Computer prediction of subsurface radionuclide transport: an adaptive numerical method

    International Nuclear Information System (INIS)

    Neuman, S.P.

    1983-01-01

    Radionuclide transport in the subsurface is often modeled with the aid of the advection-dispersion equation. A review of existing computer methods for the solution of this equation shows that there is need for improvement. To answer this need, a new adaptive numerical method is proposed based on an Eulerian-Lagrangian formulation. The method is based on a decomposition of the concentration field into two parts, one advective and one dispersive, in a rigorous manner that does not leave room for ambiguity. The advective component of steep concentration fronts is tracked forward with the aid of moving particles clustered around each front. Away from such fronts the advection problem is handled by an efficient modified method of characteristics called single-step reverse particle tracking. When a front dissipates with time, its forward tracking stops automatically and the corresponding cloud of particles is eliminated. The dispersion problem is solved by an unconventional Lagrangian finite element formulation on a fixed grid which involves only symmetric and diagonal matrices. Preliminary tests against analytical solutions of one- and two-dimensional dispersion in a uniform steady state velocity field suggest that the proposed adaptive method can handle the entire range of Peclet numbers from 0 to infinity, with Courant numbers well in excess of 1.
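
    Away from fronts, single-step reverse particle tracking amounts to following a characteristic backward in time and interpolating the old field there. A one-dimensional sketch, assuming a smooth concentration and ignoring boundary treatment:

```python
import numpy as np

def advect_reverse_tracking(c, x, u, dt):
    """Modified method of characteristics in 1D: the new value at each grid
    point is the old concentration interpolated at the foot of the
    characteristic, one time step upstream."""
    x_foot = x - u * dt                 # track each grid point backward
    return np.interp(x_foot, x, c)      # interpolate old concentrations
```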

  9. Water System Adaptation To Hydrological Changes: Module 11, Methods and Tools: Computational Models

    Science.gov (United States)

    This course will introduce students to the fundamental principles of water system adaptation to hydrological changes, with emphasis on data analysis and interpretation, technical planning, and computational modeling. Starting with real-world scenarios and adaptation needs, the co...

  10. Passenger thermal perceptions, thermal comfort requirements, and adaptations in short- and long-haul vehicles.

    Science.gov (United States)

    Lin, Tzu-Ping; Hwang, Ruey-Lung; Huang, Kuo-Tsang; Sun, Chen-Yi; Huang, Ying-Che

    2010-05-01

    While thermal comfort in mass transportation vehicles is relevant to service quality and energy consumption, benchmarks for such comfort that reflect the thermal adaptations of passengers are currently lacking. This study reports a field experiment involving simultaneous physical measurements and a questionnaire survey, collecting data from 2,129 respondents, that evaluated thermal comfort in short- and long-haul buses and trains. Experimental results indicate that high air temperature, strong solar radiation, and low air movement explain why passengers feel thermally uncomfortable. The overall insulation of clothing worn by passengers and thermal adaptive behaviour in vehicles differ from those in their living and working spaces. Passengers in short-haul vehicles habitually adjust the air outlets to increase thermal comfort, while passengers in long-haul vehicles prefer to draw the drapes to reduce discomfort from extended exposure to solar radiation. The neutral temperatures for short- and long-haul vehicles are 26.2 °C and 27.4 °C, while the comfort zones are 22.4-28.9 °C and 22.4-30.1 °C, respectively. The results of this study provide a valuable reference for practitioners involved in determining the adequate control and management of in-vehicle thermal environments, as well as facilitating design of buses and trains, ultimately contributing to efforts to achieve a balance between the thermal comfort satisfaction of passengers and energy conserving measures for air-conditioning in mass transportation vehicles.

  11. Quinoa - Adaptive Computational Fluid Dynamics, 0.2

    Energy Technology Data Exchange (ETDEWEB)

    2017-09-22

    Quinoa is a set of computational tools that enables research and numerical analysis in fluid dynamics. At this time it remains a test-bed to experiment with various algorithms using fully asynchronous runtime systems. Currently, Quinoa consists of the following tools: (1) Walker, a numerical integrator for systems of stochastic differential equations in time. It is a mathematical tool to analyze and design the behavior of stochastic differential equations. It allows the estimation of arbitrary coupled statistics and probability density functions and is currently used for the design of statistical moment approximations for multiple mixing materials in variable-density turbulence. (2) Inciter, an overdecomposition-aware finite element field solver for partial differential equations using 3D unstructured grids. Inciter is used to research asynchronous mesh-based algorithms and to experiment with coupling asynchronous to bulk-synchronous parallel code. Two planned new features of Inciter, compared to the previous release (LA-CC-16-015), to be implemented in 2017, are (a) a simple Navier-Stokes solver for ideal single-material compressible gases, and (b) solution-adaptive mesh refinement (AMR), which enables dynamically concentrating compute resources to regions with interesting physics. Using the NS-AMR problem we plan to explore how to scale such high-load-imbalance simulations, representative of large production multiphysics codes, to very large problems on very large computers using an asynchronous runtime system. (3) RNGTest, a test harness to subject random number generators to stringent statistical tests enabling quantitative ranking with respect to their quality and computational cost. (4) UnitTest, a unit test harness, running hundreds of tests per second, capable of testing serial, synchronous, and asynchronous functions. (5) MeshConv, a mesh file converter that can be used to convert 3D tetrahedron meshes from and to either of the following formats: Gmsh

  12. Short-term saccadic adaptation in the macaque monkey: a binocular mechanism

    Science.gov (United States)

    Schultz, K. P.

    2013-01-01

    Saccadic eye movements are rapid transfers of gaze between objects of interest. Their duration is too short for the visual system to be able to follow their progress in time. Adaptive mechanisms constantly recalibrate the saccadic responses by detecting how close the landings are to the selected targets. The double-step saccadic paradigm is a common method to simulate alterations in saccadic gain. While the subject is responding to a first target shift, a second shift is introduced in the middle of this movement, which masks it from visual detection. The error in landing introduced by the second shift is interpreted by the brain as an error in the programming of the initial response, with gradual gain changes aimed at compensating the apparent sensorimotor mismatch. A second shift applied dichoptically to only one eye introduces disconjugate landing errors between the two eyes. A monocular adaptive system would independently modify only the gain of the eye exposed to the second shift in order to reestablish binocular alignment. Our results support a binocular mechanism. A version-based saccadic adaptive process detects postsaccadic version errors and generates compensatory conjugate gain alterations. A vergence-based saccadic adaptive process detects postsaccadic disparity errors and generates corrective nonvisual disparity signals that are sent to the vergence system to regain binocularity. This results in striking dynamical similarities between visually driven combined saccade-vergence gaze transfers, where the disparity is given by the visual targets, and the double-step adaptive disconjugate responses, where an adaptive disparity signal is generated internally by the saccadic system. PMID:23076111

  13. A short course in computational geometry and topology

    CERN Document Server

    Edelsbrunner, Herbert

    2014-01-01

    With the aim to bring the subject of Computational Geometry and Topology closer to the scientific audience, this book is written in thirteen ready-to-teach sections organized in four parts: tessellations, complexes, homology, persistence. To speak to the non-specialist, detailed formalisms are often avoided in favor of lively 2- and 3-dimensional illustrations. The book is warmly recommended to everybody who loves geometry and the fascinating world of shapes.

  14. Adaptive phase measurements in linear optical quantum computation

    International Nuclear Information System (INIS)

    Ralph, T C; Lund, A P; Wiseman, H M

    2005-01-01

    Photon counting induces an effective non-linear optical phase shift in certain states derived by linear optics from single photons. Although this non-linearity is non-deterministic, it is sufficient in principle to allow scalable linear optics quantum computation (LOQC). The most obvious way to encode a qubit optically is as a superposition of the vacuum and a single photon in one mode, so-called 'single-rail' logic. Until now this approach was thought to be prohibitively expensive (in resources) compared to 'dual-rail' logic, where a qubit is stored by a photon across two modes. Here we attack this problem with real-time feedback control, which can realize a quantum-limited phase measurement on a single mode, as has been recently demonstrated experimentally. We show that with this added measurement resource, the resource requirements for single-rail LOQC are not substantially different from those of dual-rail LOQC. In particular, with adaptive phase measurements an arbitrary qubit state α|0⟩ + β|1⟩ can be prepared deterministically.

  15. Significant decimal digits for energy representation on short-word computers

    International Nuclear Information System (INIS)

    Sartori, E.

    1989-01-01

    The general belief that single precision floating point numbers always have at least seven significant decimal digits on short-word computers such as IBM machines is erroneous. Seven significant digits are, however, required for representing the energy variable in nuclear cross-section data sets containing sharp p-wave resonances at 0 Kelvin. It is suggested that either the energy variable be stored in double precision or that cross-section resonances be reconstructed to room temperature or higher on short-word computers.
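
    The point is easy to demonstrate numerically: near a resonance at a few eV, an energy step of about 1e-7 eV falls below the resolution of IEEE single precision but is easily resolved in double precision (the resonance energy below is an arbitrary example value).

```python
import numpy as np

e32 = np.float32(6.6738901)     # ~7 significant decimal digits available
step = np.float32(1.0e-7)       # energy step needed near a sharp resonance
print(e32 + step == e32)        # True: the step is lost in single precision

e64 = np.float64(6.6738901)
print(e64 + 1.0e-7 == e64)      # False: double precision resolves it
```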

  16. Computer-Adaptive Testing: Implications for Students' Achievement, Motivation, Engagement, and Subjective Test Experience

    Science.gov (United States)

    Martin, Andrew J.; Lazendic, Goran

    2018-01-01

    The present study investigated the implications of computer-adaptive testing (operationalized by way of multistage adaptive testing; MAT) and "conventional" fixed order computer testing for various test-relevant outcomes in numeracy, including achievement, test-relevant motivation and engagement, and subjective test experience. It did so…

  17. Let Documents Talk to Each Other: A Computer Model for Connection of Short Documents.

    Science.gov (United States)

    Chen, Z.

    1993-01-01

    Discusses the integration of scientific texts through the connection of documents and describes a computer model that can connect short documents. Information retrieval and artificial intelligence are discussed; a prototype system of the model is explained; and the model is compared to other computer models. (17 references) (LRW)

  18. Passive adaptation to stress in adulthood after short-term social instability stress during adolescence in mice.

    Science.gov (United States)

    de Lima, A P N; Massoco, C O

    2017-05-01

    This study reports that short-term social instability stress (SIS) in adolescence increases passive coping in adulthood in male mice. Short-term SIS decreased the latency to immobility and increased the frequency and duration of immobility in the tail suspension test. These findings support the hypothesis that adolescent stress can induce a passive adaptation to stress in adulthood, even after only a short period of stress.

  19. Computational identification of adaptive mutants using the VERT system

    Directory of Open Access Journals (Sweden)

    Winkler James

    2012-04-01

    Full Text Available Background Evolutionary dynamics of microbial organisms can now be visualized using the Visualizing Evolution in Real Time (VERT) system, in which several isogenic strains expressing different fluorescent proteins compete during adaptive evolution and are tracked using fluorescent cell sorting to construct a population history over time. Mutations conferring enhanced growth rates can be detected by observing changes in the fluorescent population proportions. Results Using data obtained from several VERT experiments, we construct a hidden Markov-derived model to detect these adaptive events in VERT experiments without external intervention beyond initial training. Analysis of annotated data revealed that the model achieves consensus with human annotation for 85-93% of the data points when detecting adaptive events. A method to determine the optimal time point to isolate adaptive mutants is also introduced. Conclusions The developed model offers a new way to monitor adaptive evolution experiments without the need for external intervention, thereby simplifying adaptive evolution efforts relying on population tracking. Future efforts to construct a fully automated system to isolate adaptive mutants may find the algorithm a useful tool.
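
    A much-simplified stand-in for the hidden-Markov detector is sketched below: it flags time points where a subpopulation's fluorescent proportion has risen steadily over a sliding window, the signature of an expanding adaptive mutant. Window length and slope tolerance are invented parameters.

```python
import numpy as np

def detect_adaptive_events(proportions, window=5, slope_tol=0.01):
    """Flag time points preceded by a sustained rise in a subpopulation's
    proportion, estimated by a least-squares slope over a sliding window."""
    events = []
    for t in range(window, len(proportions)):
        segment = proportions[t - window:t]
        slope = np.polyfit(np.arange(window), segment, 1)[0]
        if slope > slope_tol:
            events.append(t)
    return events
```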

  20. Integrable discretizations and self-adaptive moving mesh method for a coupled short pulse equation

    International Nuclear Information System (INIS)

    Feng, Bao-Feng; Chen, Junchao; Chen, Yong; Maruno, Ken-ichi; Ohta, Yasuhiro

    2015-01-01

    In the present paper, integrable semi-discrete and fully discrete analogues of a coupled short pulse (CSP) equation are constructed. The key to the construction lies in the bilinear forms and determinant structure of the solutions of the CSP equation. We also construct N-soliton solutions for the semi-discrete and fully discrete analogues of the CSP equations in the form of Casorati determinants. In the continuous limit, we show that the fully discrete CSP equation converges to the semi-discrete CSP equation, and then further to the continuous CSP equation. Moreover, the integrable semi-discretization of the CSP equation is used as a self-adaptive moving mesh method for numerical simulations. The numerical results agree with the analytical results very well. (paper)

  1. Adaptive short-term electricity price forecasting using artificial neural networks in the restructured power markets

    International Nuclear Information System (INIS)

    Yamin, H.Y.; Shahidehpour, S.M.; Li, Z.

    2004-01-01

    This paper proposes a comprehensive model for adaptive short-term electricity price forecasting using Artificial Neural Networks (ANN) in the restructured power markets. The model consists of three modules: price simulation, price forecasting, and performance analysis. The factors impacting electricity price forecasting, including time factors, load factors, reserve factors, and historical price factors, are discussed. We adopt an ANN and propose a new definition for the MAPE using the median to study the relationship between these factors and the market price, as well as the performance of the electricity price forecasting. The reserve factors are included to enhance the performance of the forecasting process. The proposed model handles price spikes more efficiently by considering the median instead of the average. The IEEE 118-bus system and the practical California system are used to demonstrate the superiority of the proposed model. (author)
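
    One plausible reading of the median-based MAPE, with absolute errors normalized by the median price rather than the average so that price spikes distort the metric less, is sketched below; the paper's exact definition may differ.

```python
import numpy as np

def mape_median(actual, forecast):
    """Median-normalized MAPE (percent): absolute forecast errors divided
    by the median observed price instead of the mean."""
    actual = np.asarray(actual, dtype=float)
    forecast = np.asarray(forecast, dtype=float)
    return 100.0 * np.mean(np.abs(actual - forecast) / np.median(actual))
```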

  2. Speed regulating Effects of Incentive-based Intelligent Speed Adaptation in the short and medium term

    DEFF Research Database (Denmark)

    Agerholm, Niels

    Despite massive improvements in vehicles’ safety equipment, more information and a safer road network, inadequate road safety still causes more than 250 deaths and several thousand injuries each year in Denmark. Until a few years ago the number of fatalities in most countries had decreased while the amount of traffic increased. However, this trend has been replaced by a more uncertain development towards a constant or even somewhat increasing risk. Inappropriate speeding is a central cause of the high number of fatalities on the roads. Despite speed limits, speed-limit-violating driving behaviour is still widespread in Denmark. Traditional solutions to prevent speed violation have been enforcement, information, and enhanced road design. It seems, however, hard to achieve

  3. Adapting computational text analysis to social science (and vice versa)

    Directory of Open Access Journals (Sweden)

    Paul DiMaggio

    2015-11-01

    Full Text Available Social scientists and computer scientists are divided by small differences in perspective and not by any significant disciplinary divide. In the field of text analysis, several such differences are noted: social scientists often use unsupervised models to explore corpora, whereas many computer scientists employ supervised models to train on data; social scientists hold to more conventional causal notions than do most computer scientists, and often favor intense exploitation of existing algorithms, whereas computer scientists focus more on developing new models; and computer scientists tend to trust human judgment more than social scientists do. These differences have implications that potentially can improve the practice of social science.

  4. Robust and Adaptive Online Time Series Prediction with Long Short-Term Memory

    Directory of Open Access Journals (Sweden)

    Haimin Yang

    2017-01-01

    Full Text Available Online time series prediction is the mainstream method in a wide range of fields, ranging from speech analysis and noise cancelation to stock market analysis. However, the data often contains many outliers with the increasing length of time series in the real world. These outliers can mislead the learned model if treated as normal points in the process of prediction. To address this issue, in this paper, we propose a robust and adaptive online gradient learning method, RoAdam (Robust Adam), for long short-term memory (LSTM) to predict time series with outliers. This method tunes the learning rate of the stochastic gradient algorithm adaptively in the process of prediction, which reduces the adverse effect of outliers. It tracks the relative prediction error of the loss function with a weighted average through modifying Adam, a popular stochastic gradient method algorithm for training deep neural networks. In our algorithm, a large value of the relative prediction error corresponds to a small learning rate, and vice versa. The experiments on both synthetic data and real time series show that our method achieves better performance compared to the existing methods based on LSTM.

  5. Robust and Adaptive Online Time Series Prediction with Long Short-Term Memory.

    Science.gov (United States)

    Yang, Haimin; Pan, Zhisong; Tao, Qing

    2017-01-01

    Online time series prediction is the mainstream method in a wide range of fields, ranging from speech analysis and noise cancelation to stock market analysis. However, the data often contains many outliers with the increasing length of time series in the real world. These outliers can mislead the learned model if treated as normal points in the process of prediction. To address this issue, in this paper, we propose a robust and adaptive online gradient learning method, RoAdam (Robust Adam), for long short-term memory (LSTM) to predict time series with outliers. This method tunes the learning rate of the stochastic gradient algorithm adaptively in the process of prediction, which reduces the adverse effect of outliers. It tracks the relative prediction error of the loss function with a weighted average through modifying Adam, a popular stochastic gradient method algorithm for training deep neural networks. In our algorithm, a large value of the relative prediction error corresponds to a small learning rate, and vice versa. The experiments on both synthetic data and real time series show that our method achieves better performance compared to the existing methods based on LSTM.
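
    Following the two records above, the RoAdam update can be sketched as a standard Adam step whose effective learning rate is divided by a weighted-average tracker of the relative prediction error. The clipping bounds and state layout are assumptions; consult the paper for the exact formulas.

```python
import numpy as np

def roadam_step(theta, grad, loss, state,
                lr=1e-3, b1=0.9, b2=0.999, b3=0.999, eps=1e-8):
    """RoAdam sketch: Adam moments plus a tracker d of the relative
    prediction error; a large d (outlier-like loss jump) shrinks the step."""
    state["t"] += 1
    t = state["t"]
    state["m"] = b1 * state["m"] + (1 - b1) * grad
    state["v"] = b2 * state["v"] + (1 - b2) * grad ** 2
    ratio = np.clip(loss / max(state["prev_loss"], 1e-12), 0.1, 10.0)
    state["d"] = b3 * state["d"] + (1 - b3) * ratio    # weighted average
    state["prev_loss"] = loss
    m_hat = state["m"] / (1 - b1 ** t)
    v_hat = state["v"] / (1 - b2 ** t)
    return theta - (lr / state["d"]) * m_hat / (np.sqrt(v_hat) + eps)

# initial state: {"t": 0, "m": 0.0, "v": 0.0, "d": 1.0, "prev_loss": 1.0}
```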

  6. Short-term cardiorespiratory adaptation to high altitude in children compared with adults.

    Science.gov (United States)

    Kriemler, S; Radtke, T; Bürgi, F; Lambrecht, J; Zehnder, M; Brunner-La Rocca, H P

    2016-02-01

    As short-term cardiorespiratory adaptation to high altitude (HA) exposure has not yet been studied in children, we assessed acute mountain sickness (AMS), hypoxic ventilatory response (HVR) at rest and maximal exercise capacity (CPET) at low altitude (LA) and HA in pre-pubertal children and their fathers. Twenty father-child pairs (11 ± 1 years and 44 ± 4 years) were tested at LA (450 m) and HA (3450 m) at days 1, 2, and 3 after fast ascent (HA1/2/3). HVR was measured at rest and CPET was performed on a cycle ergometer. AMS severity was mild to moderate with no differences between generations. HVR was higher in children than adults at LA and increased at HA similarly in both groups. Peak oxygen uptake (VO2 peak) relative to body weight was similar in children and adults at LA and decreased significantly by 20% in both groups at HA; maximal heart rate did not change at HA in children while it decreased by 16% in adults (P < 0.001). Changes in HVR and VO2 peak from LA to HA were correlated among the biological child-father pairs. In conclusion, cardiorespiratory adaptation to altitude seems to be at least partly hereditary. Even though children and their fathers lose similar fractions of aerobic capacity going to high altitude, the mechanisms might be different. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  7. Adaptation and Validation of the Foot Function Index-Revised Short Form into Polish

    Directory of Open Access Journals (Sweden)

    Radosław Rutkowski

    2017-01-01

    Full Text Available Purpose. The aim of the present study was to adapt the Foot Function Index-Revised Short Form (FFI-RS) questionnaire into Polish and verify its reliability and validity in a group of patients with rheumatoid arthritis (RA). Methods. The study included 211 patients suffering from RA. The FFI-RS questionnaire underwent standard linguistic adaptation and its psychometric parameters were investigated. The enrolled participants had been recruited over seven months as a convenience sample from the rheumatological hospital in Śrem (Poland). They represented different sociodemographic characteristics and came from both rural and urban environments. Results. The mean age of the patients was 58.9±10.2 years. The majority of patients (85%) were female. The average final FFI-RS score was 62.9±15.3. Internal consistency was high, with a Cronbach’s alpha of 0.95 and an intraclass correlation coefficient ranging between 0.78 and 0.84. A strong correlation was observed between the FFI-RS and Health Assessment Questionnaire-Disability Index (HAQ-DI) questionnaires. Conclusion. The Polish version of the FFI-RS (FFI-RS-PL) is an important tool for evaluating the functional condition of patients’ feet and can be applied in the diagnosis and treatment of Polish-speaking patients suffering from RA.

  8. Adaptation and Validation of the Foot Function Index-Revised Short Form into Polish.

    Science.gov (United States)

    Rutkowski, Radosław; Gałczyńska-Rusin, Małgorzata; Gizińska, Małgorzata; Straburzyński-Lupa, Marcin; Zdanowska, Agata; Romanowski, Mateusz Wojciech; Romanowski, Wojciech; Budiman-Mak, Elly; Straburzyńska-Lupa, Anna

    2017-01-01

    The aim of the present study was to adapt the Foot Function Index-Revised Short Form (FFI-RS) questionnaire into Polish and verify its reliability and validity in a group of patients with rheumatoid arthritis (RA). The study included 211 patients suffering from RA. The FFI-RS questionnaire underwent standard linguistic adaptation and its psychometric parameters were investigated. The enrolled participants had been recruited over seven months as a convenience sample from the rheumatological hospital in Śrem (Poland). They represented different sociodemographic characteristics and came from both rural and urban environments. The mean age of the patients was 58.9 ± 10.2 years. The majority of patients (85%) were female. The average final FFI-RS score was 62.9 ± 15.3. Internal consistency was high, with a Cronbach's alpha of 0.95 and an intraclass correlation coefficient ranging between 0.78 and 0.84. A strong correlation was observed between the FFI-RS and Health Assessment Questionnaire-Disability Index (HAQ-DI) questionnaires. The Polish version of the FFI-RS (FFI-RS-PL) is an important tool for evaluating the functional condition of patients' feet and can be applied in the diagnosis and treatment of Polish-speaking patients suffering from RA.

  9. Cross-Cultural Adaptation and Psychometric Properties of the Malay Version of the Short Sensory Profile.

    Science.gov (United States)

    Ee, Su Im; Loh, Siew Yim; Chinna, Karuthan; Marret, Mary J

    2016-01-01

    To translate, culturally adapt, and examine psychometric properties of the Malay version Short Sensory Profile (SSP-M). Pretesting (n = 30) of the original English SSP established its applicability for use with Malaysian children aged 3-10 years. This was followed by the translation and cross-cultural adaptation of the SSP-M. Two forward and two back translations were compared and reviewed by a committee of 10 experts who validated the content of the SSP-M, before pilot testing (n = 30). The final SSP-M questionnaire was completed by 419 parents of typically developing children aged 3-10 years. Cronbach's alpha of each section of the SSP-M ranged from 0.73 to 0.93 and the intraclass correlation coefficient (ICC) indicated good reliability (0.62-0.93). The seven factor model of the SSP-M had an adequate fit with evidence of convergent and discriminant validity. We conclude that the SSP-M is a valid and reliable screening tool for use in Malaysia with Malay-speaking parents of children aged 3-10 years. The SSP-M enables Malay-speaking parents to answer the questionnaire with better reliability, and provides occupational therapists with a valid tool to screen for sensory processing difficulties.
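
    The internal-consistency statistic reported in these validation studies, Cronbach's alpha, is computed from a respondents-by-items score matrix with the standard formula (generic code, not tied to any one study):

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for a (respondents x items) score matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                          # number of items
    item_var = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_var / total_var)
```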

  10. Short (

    NARCIS (Netherlands)

    Telleman, Gerdien; den Hartog, Laurens

    2013-01-01

    Aim: This systematic review assessed the implant survival rate of short (<10 mm) dental implants installed in partially edentulous patients. A case report of a short implant in the posterior region has been added. Materials and methods: A search was conducted in the electronic databases of MEDLINE

  11. A case study of evolutionary computation of biochemical adaptation

    International Nuclear Information System (INIS)

    François, Paul; Siggia, Eric D

    2008-01-01

    Simulations of evolution have a long history, but their relation to biology is questioned because of the perceived contingency of evolution. Here we provide an example of a biological process, adaptation, where simulations are argued to approach closer to biology. Adaptation is a common feature of sensory systems, and a plausible component of other biochemical networks because it rescales upstream signals to facilitate downstream processing. We create random gene networks numerically, by linking genes with interactions that model transcription, phosphorylation and protein–protein association. We define a fitness function for adaptation in terms of two functional metrics, and show that any reasonable combination of them will yield the same adaptive networks after repeated rounds of mutation and selection. Convergence to these networks is driven by positive selection and thus fast. There is always a path in parameter space of continuously improving fitness that leads to perfect adaptation, implying that the actual mutation rates we use in the simulation do not bias the results. Our results imply a kinetic view of evolution, i.e., it favors gene networks that can be learned quickly from the random examples supplied by mutation. This formulation allows for deductive predictions of the networks realized in nature
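
    The mutate-and-select loop that such simulations rely on can be sketched generically; fitness and mutate are user-supplied callables, and the greedy acceptance rule below is a simplification of the study's selection scheme.

```python
def evolve(network, fitness, mutate, rounds=1000):
    """Generic evolutionary loop: propose a mutated network and keep it
    whenever the adaptation fitness does not decrease, following a path of
    continuously improving fitness."""
    best, best_fit = network, fitness(network)
    for _ in range(rounds):
        candidate = mutate(best)
        f = fitness(candidate)
        if f >= best_fit:             # positive selection drives the ascent
            best, best_fit = candidate, f
    return best, best_fit
```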

  12. Translation, cross-cultural adaptation and validation of the Diabetes Empowerment Scale – Short Form

    Directory of Open Access Journals (Sweden)

    Fernanda Figueredo Chaves

    Full Text Available ABSTRACT OBJECTIVE To translate, cross-culturally adapt and validate the Diabetes Empowerment Scale – Short Form for assessment of psychosocial self-efficacy in diabetes care within the Brazilian cultural context. METHODS Assessment of the instrument’s conceptual equivalence, as well as its translation and cross-cultural adaptation were performed following international standards. The Expert Committee’s assessment of the translated version was conducted through a web questionnaire developed and applied via the web tool e-Surv. The cross-culturally adapted version was used for the pre-test, which was carried out via phone call in a group of eleven health care service users diagnosed with type 2 diabetes mellitus. The pre-test results were examined by a group of experts, composed of health care consultants, applied linguists and statisticians, aiming at an adequate version of the instrument, which was subsequently used for test and retest in a sample of 100 users diagnosed with type 2 diabetes mellitus via phone call, their answers being recorded by the web tool e-Surv. Internal consistency and reproducibility analyses were carried out within the statistical programming environment R. RESULTS Face and content validity were attained and the Brazilian Portuguese version, entitled Escala de Autoeficácia em Diabetes – Versão Curta, was established. The scale had acceptable internal consistency with Cronbach’s alpha of 0.634 (95%CI 0.494–0.737), while the correlation of the total score in the two periods was considered moderate (0.47). The intraclass correlation coefficient was 0.50. CONCLUSIONS The translated and cross-culturally adapted version of the instrument to spoken Brazilian Portuguese was considered valid and reliable to be used for assessment within the Brazilian population diagnosed with type 2 diabetes mellitus. The use of a web tool (e-Surv) for recording the Expert Committee responses as well as the responses in the

  13. Computational Bench Testing to Evaluate the Short-Term Mechanical Performance of a Polymeric Stent.

    Science.gov (United States)

    Bobel, A C; Petisco, S; Sarasua, J R; Wang, W; McHugh, P E

    2015-12-01

    Over the last decade, there has been a significant volume of research focussed on the utilization of biodegradable polymers such as poly-L-lactide-acid (PLLA) for applications associated with cardiovascular disease. More specifically, there has been an emphasis on remedying current clinical shortfalls experienced with conventional bare metal stents and drug eluting stents. One such approach, the adoption of fully formed polymeric stents, has led to a small number of products being commercialized. Unfortunately, these products are still in their market infancy, meaning there is a clear absence of long-term data which can support their mechanical performance in vivo. Moreover, the load-carrying capacity and other mechanical properties essential to a fully optimized polymeric stent are difficult, time-consuming and costly to establish. With the aim of compiling rapid and representative performance data for specific stent geometries, materials and designs, in addition to reducing experimental timeframes, computational bench testing via finite element analysis (FEA) offers itself as a very powerful tool. On this basis, the research presented in this paper is concentrated on the finite element simulation of the mechanical performance of PLLA, which is a fully biodegradable polymer, in the stent application, using a non-linear viscous material model. Three physical stent geometries, typically used for fully polymeric stents, are selected, and a comparative study is performed in relation to their short-term mechanical performance, with the aid of experimental data. From the simulated output results, an informed understanding can be established in relation to radial strength, flexibility and longitudinal resistance, which can be compared with conventional permanent metal stent functionality, and the results show that it is indeed possible to generate a PLLA stent with comparable and sufficient mechanical performance. The paper also demonstrates the attractiveness of FEA as a tool

  14. Adaptive weighted anisotropic diffusion for computed tomography denoising

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Zhi; Silver, Michael D. [Toshiba Medical Research Institute USA, Inc., Vernon Hills, IL (United States); Noshi, Yasuhiro [Toshiba Medical System Corporation, Tokyo (Japan)

    2011-07-01

    With increasing awareness of radiation safety, dose reduction has become an important task of modern CT system development. This paper proposes an adaptive weighted anisotropic diffusion method and an adaptive weighted sharp source anisotropic diffusion method as image domain filters to potentially help dose reduction. Different from existing anisotropic diffusion methods, the proposed methods incorporate an edge-sensitive adaptive source term as part of the diffusion iteration, which provides better edge and detail preservation. Visual evaluation showed that the new methods can reduce noise substantially without apparent edge and detail loss. The quantitative evaluations also showed over 50% noise reduction in terms of noise standard deviation, which is equivalent to over 75% dose reduction at normal-dose image quality. (orig.)
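
    The idea can be sketched by grafting an edge-sensitive source term onto classic Perona-Malik diffusion, as below; the conduction function, weighting and iteration count are illustrative choices, not the authors' formulation.

```python
import numpy as np

def adaptive_anisotropic_diffusion(img, n_iter=20, kappa=30.0, lam=0.2, w=0.1):
    """Perona-Malik-style diffusion plus an edge-sensitive source term that
    pulls the estimate back toward the input near edges, so detail is kept
    while smooth regions are denoised."""
    u = img.astype(float).copy()
    for _ in range(n_iter):
        # neighbor differences (N, S, E, W); borders wrap for brevity
        dn = np.roll(u, -1, axis=0) - u
        ds = np.roll(u, 1, axis=0) - u
        de = np.roll(u, -1, axis=1) - u
        dw = np.roll(u, 1, axis=1) - u
        g = lambda d: np.exp(-(d / kappa) ** 2)    # conduction coefficient
        flux = g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw
        edge = 1.0 - g(np.abs(dn) + np.abs(de))    # ~1 near edges, ~0 flat
        u += lam * flux + w * edge * (img - u)     # diffusion + source term
    return u
```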

  15. An adaptable Boolean net trainable to control a computing robot

    International Nuclear Information System (INIS)

    Lauria, F. E.; Prevete, R.; Milo, M.; Visco, S.

    1999-01-01

    We discuss a method to implement a Hebbian rule in a Boolean neural network so as to obtain an adaptable universal control system. We start by presenting both the Boolean neural net and the Hebbian rule we have considered. Then we discuss, first, the problems arising when the latter is naively implemented in a Boolean neural net and, second, the method enabling us to overcome them and the ensuing adaptable Boolean neural net paradigm. Next, we present the adaptable Boolean neural net as an intelligent control system, actually controlling a writing robot, and discuss how to train it in the execution of the elementary arithmetic operations on operands represented by numerals with an arbitrary number of digits.

  16. Efficient computation in adaptive artificial spiking neural networks

    NARCIS (Netherlands)

    D. Zambrano (Davide); R.B.P. Nusselder (Roeland); H.S. Scholte; S.M. Bohte (Sander)

    2017-01-01

    textabstractArtificial Neural Networks (ANNs) are bio-inspired models of neural computation that have proven highly effective. Still, ANNs lack a natural notion of time, and neural units in ANNs exchange analog values in a frame-based manner, a computationally and energetically inefficient form of

  17. Cross-cultural adaptation and validation of the Danish version of the Short Musculoskeletal Function Assessment Questionnaire (SMFA)

    DEFF Research Database (Denmark)

    Lindahl, Marianne Pia; Andersen, Signe; Jørgensen, Annette

    2017-01-01

    PURPOSE: The aim of this study was to translate and culturally adapt the Short Musculoskeletal Function Assessment (SMFA) into Danish (SMFA-DK) and assess the psychometric properties. METHODS: SMFA was translated and cross-culturally adapted according to a standardized procedure. Minor changes......, content validity as coding according to the International Classification of Functioning, Disability and Health (ICF), floor/ceiling effects, construct validity as factor analysis, correlations between SMFA-DK and Short Form 36 and also known group method. Responsiveness and effect size were calculated...

  18. Molecular determinants of enzyme cold adaptation: comparative structural and computational studies of cold- and warm-adapted enzymes.

    Science.gov (United States)

    Papaleo, Elena; Tiberti, Matteo; Invernizzi, Gaetano; Pasi, Marco; Ranzani, Valeria

    2011-11-01

    The identification of molecular mechanisms underlying enzyme cold adaptation is a hot topic both for fundamental research and industrial applications. In the present contribution, we review the last decades of structural computational investigations on cold-adapted enzymes in comparison to their warm-adapted counterparts. Comparative sequence and structural studies allow the definition of a multitude of adaptation strategies. Different enzymes have evolved diverse mechanisms to adapt to low temperatures, so that a general theory for enzyme cold adaptation cannot be formulated. However, some common features can be traced in the dynamic and flexibility properties of these enzymes, as well as in their intra- and inter-molecular interaction networks. Interestingly, the current data suggest that a family-centered point of view is necessary in the comparative analyses of cold- and warm-adapted enzymes. In fact, enzymes belonging to the same family or superfamily, thus sharing at least the three-dimensional fold and common features of the functional sites, have evolved similar structural and dynamic patterns to overcome the detrimental effects of low temperatures.

  19. Administration of a dipeptidyl peptidase IV inhibitor enhances the intestinal adaptation in a mouse model of short bowel syndrome

    DEFF Research Database (Denmark)

    Okawada, Manabu; Holst, Jens Juul; Teitelbaum, Daniel H

    2011-01-01

    Glucagon-like peptide-2 induces small intestine mucosal epithelial cell proliferation and may have benefit for patients who suffer from short bowel syndrome. However, glucagon-like peptide-2 is inactivated rapidly in vivo by dipeptidyl peptidase IV. Therefore, we hypothesized that selectively inhibiting dipeptidyl peptidase IV would prolong the circulating life of glucagon-like peptide-2 and lead to increased intestinal adaptation after development of short bowel syndrome.

  20. Improving Short-Range Ensemble Kalman Storm Surge Forecasting Using Robust Adaptive Inflation

    KAUST Repository

    Altaf, Muhammad

    2013-08-01

    This paper presents a robust ensemble filtering methodology for storm surge forecasting based on the singular evolutive interpolated Kalman (SEIK) filter, which has been implemented in the framework of the H∞ filter. By design, an H∞ filter is more robust than the common Kalman filter in the sense that the estimation error in the H∞ filter has, in general, a finite growth rate with respect to the uncertainties in assimilation. The computational hydrodynamical model used in this study is the Advanced Circulation (ADCIRC) model. The authors assimilate data obtained from Hurricanes Katrina and Ike as test cases. The results clearly show that the H∞-based SEIK filter provides more accurate short-range forecasts of storm surge compared to recently reported data assimilation results resulting from the standard SEIK filter.

  1. Improving Short-Range Ensemble Kalman Storm Surge Forecasting Using Robust Adaptive Inflation

    KAUST Repository

    Altaf, Muhammad; Butler, T.; Luo, X.; Dawson, C.; Mayo, T.; Hoteit, Ibrahim

    2013-01-01

    This paper presents a robust ensemble filtering methodology for storm surge forecasting based on the singular evolutive interpolated Kalman (SEIK) filter, which has been implemented in the framework of the H∞ filter. By design, an H∞ filter is more robust than the common Kalman filter in the sense that the estimation error in the H∞ filter has, in general, a finite growth rate with respect to the uncertainties in assimilation. The computational hydrodynamical model used in this study is the Advanced Circulation (ADCIRC) model. The authors assimilate data obtained from Hurricanes Katrina and Ike as test cases. The results clearly show that the H∞-based SEIK filter provides more accurate short-range forecasts of storm surge compared to recently reported data assimilation results resulting from the standard SEIK filter.

  2. Applications of decision theory to computer-based adaptive instructional systems

    NARCIS (Netherlands)

    Vos, Hendrik J.

    1988-01-01

    This paper considers applications of decision theory to the problem of instructional decision-making in computer-based adaptive instructional systems, using the Minnesota Adaptive Instructional System (MAIS) as an example. The first section indicates how the problem of selecting the appropriate

  3. An adaptive short-term prediction scheme for wind energy storage management

    International Nuclear Information System (INIS)

    Blonbou, Ruddy; Monjoly, Stephanie; Dorville, Jean-Francois

    2011-01-01

    Research highlights: → We develop a real-time algorithm for grid-connected wind energy storage management. → The method aims to guarantee, with a ±5% error margin, the power sent to the grid. → Dynamic scheduling of energy storage is based on short-term energy prediction. → Accurate predictions reduce the need for storage capacity. -- Abstract: An efficient forecasting scheme that includes some information on the likelihood of the forecast, based on a better knowledge of the characteristics of wind variations and of their influence on power output, is of key importance for the optimal integration of wind energy in an island's power system. In the Guadeloupean archipelago (French West-Indies), with a total wind power capacity of 25 MW, wind energy can represent up to 5% of the instantaneous electricity production. At this level, the wind energy contribution can be equivalent to the current network primary control reserve, which makes balancing difficult. The share of wind energy is due to grow even further, since the objective is set to reach 118 MW by 2020. It is clearly evident to the network operator that, due to security concerns of the electrical grid, the share of wind generation should not increase unless solutions are found to solve the prediction problem. The University of French West-Indies and Guyana has developed a short-term wind energy prediction scheme that uses artificial neural networks and adaptive learning procedures based on a Bayesian approach and Gaussian approximation. This paper reports the results of the evaluation of the proposed approach; the improvement with respect to the simple persistence prediction model was globally good. A discussion on how such a tool combined with energy storage capacity could help to smooth the wind power variation and improve the wind energy penetration rate into the island utility network is also proposed.
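
    As a toy illustration of online adaptive forecasting with an uncertainty estimate (a stand-in, not the authors' Bayesian neural-network scheme), a linear predictor can be updated sample by sample while Welford running error statistics supply a confidence band:

```python
import numpy as np

def online_forecast_update(w, x, y_true, stats, lr=0.05):
    """One online update: a stochastic gradient step on the weights plus
    running error statistics, yielding a rough 95% band for forecasts."""
    y_hat = float(w @ x)
    err = y_true - y_hat
    w = w + lr * err * x
    stats["n"] += 1
    delta = err - stats["mean"]
    stats["mean"] += delta / stats["n"]
    stats["m2"] += delta * (err - stats["mean"])
    sigma = (stats["m2"] / max(stats["n"] - 1, 1)) ** 0.5
    return w, y_hat, 1.96 * sigma

# start with w = np.zeros(n_features) and stats = {"n": 0, "mean": 0.0, "m2": 0.0}
```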

  4. Three-phase short circuit calculation method based on pre-computed surface for doubly fed induction generator

    Science.gov (United States)

    Ma, J.; Liu, Q.

    2018-02-01

    This paper presents an improved short circuit calculation method, based on pre-computed surfaces, to determine the short circuit current of a distribution system with multiple doubly fed induction generators (DFIGs). The short circuit current injected into the power grid by a DFIG is determined by its low voltage ride through (LVRT) control and protection under grid fault. However, with the existing methods it is difficult to calculate the short circuit current of a DFIG in engineering practice due to its complexity. A short circuit calculation method based on pre-computed surfaces was therefore proposed, by developing the surface of short circuit current as a function of the calculating impedance and the open circuit voltage. The short circuit currents were derived by taking into account the rotor excitation and the crowbar activation time. Finally, the pre-computed surfaces of short circuit current at different times were established, and the procedure of DFIG short circuit calculation considering LVRT was designed. The correctness of the proposed method was verified by simulation.
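
    The pre-computed-surface idea reduces the online work to a table lookup: currents are tabulated offline over a grid of calculating impedance and open circuit voltage, then interpolated at fault time. The grid and the stand-in surface values below are placeholders, not machine data.

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Offline stage: tabulate short circuit current over the two parameters.
z_grid = np.linspace(0.1, 2.0, 20)       # calculating impedance (p.u.)
v_grid = np.linspace(0.2, 1.2, 15)       # open circuit voltage (p.u.)
Z, V = np.meshgrid(z_grid, v_grid, indexing="ij")
i_sc_table = V / Z                       # stand-in for offline simulations

# Online stage: the fault-time calculation is a surface interpolation.
surface = RegularGridInterpolator((z_grid, v_grid), i_sc_table)
print(surface([[0.8, 1.0]]))             # estimated current at z=0.8, v=1.0
```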

  5. 3D Printing device adaptable to Computer Numerical Control (CNC)

    OpenAIRE

    GARDAN, Julien; Danesi, F.; Roucoules, Lionel; Schneider, A.

    2014-01-01

    This article presents the development of a 3D printing device for additive manufacturing adapted to a CNC machine. The application involves the integration of a specific printing head. Additive manufacturing technology is most commonly used for modeling, prototyping, and tooling through an exclusive machine or 3D printer. A global review and analysis of the technologies show that additive manufacturing presents few independent solutions [6][9]. The problem studied especially the additive manu…

  6. Extensive Intestinal Resection Triggers Behavioral Adaptation, Intestinal Remodeling and Microbiota Transition in Short Bowel Syndrome

    Directory of Open Access Journals (Sweden)

    Camille Mayeur

    2016-03-01

    Full Text Available Extensive resection of the small bowel often leads to short bowel syndrome (SBS). SBS patients develop clinical malabsorption and dehydration relative to the reduction of absorptive area, the acceleration of gastrointestinal transit time, and modifications of the gastrointestinal intra-luminal environment. As a consequence of severe malabsorption, patients require parenteral nutrition (PN). In adults, the overall adaptation following intestinal resection includes spontaneous and complex compensatory processes such as hyperphagia, mucosal remodeling of the remaining part of the intestine, and major modifications of the microbiota. SBS patients with colon in continuity harbor a specific fecal microbiota that we called “lactobiota” because it is enriched in the Lactobacillus/Leuconostoc group and depleted in anaerobic micro-organisms (especially Clostridium and Bacteroides). In some patients, the lactobiota-driven fermentative activities lead to an accumulation of fecal d/l-lactates and an increased risk of d-encephalopathy. Better knowledge of clinical parameters and lactobiota characteristics has made it possible to stratify patients and define groups at risk for d-encephalopathy crises.

  7. Short-term effects of implemented high intensity shoulder elevation during computer work

    DEFF Research Database (Denmark)

    Larsen, Mette K.; Samani, Afshin; Madeleine, Pascal

    2009-01-01

    BACKGROUND: Work-site strength training sessions are shown effective to prevent and reduce neck-shoulder pain in computer workers, but difficult to integrate in normal working routines. A solution for avoiding neck-shoulder pain during computer work may be to implement high intensity voluntary contractions during the computer work. However, it is unknown how this may influence productivity, rate of perceived exertion (RPE) as well as activity and rest of neck-shoulder muscles during computer work. The aim of this study was to investigate short-term effects of a high intensity contraction… Implementing high intensity shoulder elevations during computer work to prevent neck-shoulder pain may be possible without affecting the working routines. However, the unexpected reduction in clavicular trapezius rest during a pause with preceding high intensity contraction requires further investigation before high intensity shoulder elevations can…

  8. Short- and long-term adaptation to ethanol stress and its cross-protective consequences in Lactobacillus plantarum

    NARCIS (Netherlands)

    Bokhorst-van de Veen, van H.; Abee, T.; Tempelaars, M.H.; Bron, P.A.; Kleerebezem, M.; Marco, M.L.

    2011-01-01

    This paper describes the molecular responses of Lactobacillus plantarum WCFS1 toward ethanol exposure. Global transcriptome profiling using DNA microarrays demonstrated adaptation of the microorganism to the presence of 8% ethanol over short (10-min and 30-min) and long (24-h) time intervals. A…

  9. Discriminating Children with Autism from Children with Learning Difficulties with an Adaptation of the Short Sensory Profile

    Science.gov (United States)

    O'Brien, Justin; Tsermentseli, Stella; Cummins, Omar; Happe, Francesca; Heaton, Pamela; Spencer, Janine

    2009-01-01

    In this article, we examine the extent to which children with autism and children with learning difficulties can be discriminated from their responses to different patterns of sensory stimuli. Using an adapted version of the Short Sensory Profile (SSP), sensory processing was compared in 34 children with autism to 33 children with typical…

  10. Genre-adaptive Semantic Computing and Audio-based Modelling for Music Mood Annotation

    DEFF Research Database (Denmark)

    Saari, Pasi; Fazekas, György; Eerola, Tuomas

    2016-01-01

    This study investigates whether taking genre into account is beneficial for automatic music mood annotation in terms of the core affects valence, arousal, and tension, as well as several other mood scales. Novel techniques employing genre-adaptive semantic computing and audio-based modelling are proposed… related to a set of 600 popular music tracks spanning multiple genres. The results show that ACTwg outperforms a semantic computing technique that does not exploit genre information, and ACTwg-SLPwg outperforms conventional techniques and other genre-adaptive alternatives. In particular, improvements… -based genre representation for genre-adaptive music mood analysis.

  11. Complex adaptive systems and computational simulation in Archaeology

    Directory of Open Access Journals (Sweden)

    Salvador Pardo-Gordó

    2017-07-01

    Full Text Available Traditionally the concept of ‘complexity’ is used as a synonym for ‘complex society’, i.e., human groups with characteristics such as urbanism, inequalities, and hierarchy. The introduction of Nonlinear Systems and Complex Adaptive Systems to the discipline of archaeology has nuanced this concept. This theoretical turn has led to the rise of modelling as a method of analysis of historical processes. This work has a twofold objective: to present the theoretical current characterized by generative thinking in archaeology and to present a concrete application of agent-based modelling to an archaeological problem: the dispersal of the first ceramic production in the western Mediterranean.

  12. Privacy context model for dynamic privacy adaptation in ubiquitous computing

    NARCIS (Netherlands)

    Schaub, Florian; Koenings, Bastian; Dietzel, Stefan; Weber, M.; Kargl, Frank

    Ubiquitous computing is characterized by the merger of physical and virtual worlds as physical artifacts gain digital sensing, processing, and communication capabilities. Maintaining an appropriate level of privacy in the face of such complex and often highly dynamic systems is challenging. We argue…

  13. The effects of short-lasting anti-saccade training in homonymous hemianopia with and without saccadic adaptation

    Directory of Open Access Journals (Sweden)

    Delphine Lévy-Bencheton

    2016-01-01

    Full Text Available Homonymous Visual Field Defects (HVFD) are common following stroke and can be highly debilitating for visual perception and for higher-level cognitive functions such as exploring a visual scene or reading a text. Rehabilitation using oculomotor compensatory methods with automatic training over a short duration (~15 days) has been shown to be as efficient as longer voluntary training methods (>1 month). Here, we propose to evaluate and compare the effect of an original HVFD rehabilitation method based on a single 15-min voluntary anti-saccade (AS) task toward the blind hemifield, with automatic sensorimotor adaptation to increase AS amplitude. In order to distinguish between adaptation and training effects, fourteen left- or right-HVFD patients were exposed, one month apart, to three trainings: two with an isolated AS task (Delayed-shift and No-shift paradigms) and one combined with AS adaptation (Adaptation paradigm). A quality of life questionnaire (NEI-VFQ 25) and functional measurements (reading speed, visual exploration time in pop-out and serial tasks), as well as oculomotor measurements, were assessed before and after each training. We could not demonstrate significant adaptation at the group level, but we identified a group of 9 adapted patients. While AS training itself proved to demonstrate significant functional improvements in the overall patient group, we could also demonstrate, in the sub-group of adapted patients and specifically following the adaptation training, an increase of saccade amplitude during the reading task (left-HVFD patients) and the serial exploration task, and an improvement of the visual quality of life. We conclude that short-lasting AS training combined with adaptation could be implemented in rehabilitation methods for cognitive dysfunctions following HVFD. Indeed, both voluntary and automatic processes have shown interesting effects on the control of visually guided saccades in different cognitive tasks.

  14. Indirect versus direct feedback in computer-based Prism Adaptation Therapy

    DEFF Research Database (Denmark)

    Wilms, Inge Linda; Rytter, Hana Malá

    2010-01-01

    Prism Adaptation Therapy (PAT) is an intervention method in the treatment of the attention disorder neglect (Frassinetti, Angeli, Meneghello, Avanzi, & Ladavas, 2002; Rossetti, et al., 1998). The aim of this study was to investigate whether one session of PAT using a computer-attached touchscreen...... in the aftereffect. The findings have direct implications for future implementations of computer-based methods of treatment of visuospatial disorders and computer-assisted rehabilitation in general....

  16. HAMSTRING ARCHITECTURAL AND FUNCTIONAL ADAPTATIONS FOLLOWING LONG VS. SHORT MUSCLE LENGTH ECCENTRIC TRAINING

    Directory of Open Access Journals (Sweden)

    Kenny Guex

    2016-08-01

    Full Text Available Most common preventive eccentric-based exercises, such as the Nordic hamstring exercise, do not include any hip flexion, so the elongation stress reached is lower than during the late swing phase of sprinting. The aim of this study was to assess the evolution of hamstring architectural (fascicle length and pennation angle) and functional (concentric and eccentric optimum angles and concentric and eccentric peak torques) parameters following a 3-week eccentric resistance program performed at long (LML) versus short muscle length (SML). Both groups performed eight sessions of 3-5x8 slow maximal eccentric knee extensions on an isokinetic dynamometer: the SML group at 0° and the LML group at 80° of hip flexion. Architectural parameters were measured using ultrasound imaging and functional parameters using the isokinetic dynamometer. The fascicle length increased by 4.9% (p<0.01, medium effect size) in the SML group and by 9.3% (p<0.001, large effect size) in the LML group. The pennation angle did not change (p=0.83) in the SML group and tended to decrease by 0.7° (p=0.09, small effect size) in the LML group. The concentric optimum angle tended to decrease by 8.8° (p=0.09, medium effect size) in the SML group and decreased by 17.3° (p<0.01, large effect size) in the LML group. The eccentric optimum angle did not change (p=0.19, small effect size) in the SML group and tended to decrease by 10.7° (p=0.06, medium effect size) in the LML group. The concentric peak torque did not change in the SML (p=0.37) or the LML (p=0.23) group, whereas the eccentric peak torque increased by 12.9% (p<0.01, small effect size) and 17.9% (p<0.001, small effect size) in the SML and LML groups, respectively. No group-by-time interaction was found for any parameter. A correlation was found between the training-induced change in fascicle length and the change in concentric optimum angle (r=-0.57, p<0.01). These results suggest that performing eccentric exercises leads to several architectural and functional adaptations. However…

  17. Sauna exposure immediately prior to short-term heat acclimation accelerates phenotypic adaptation in females.

    Science.gov (United States)

    Mee, Jessica A; Peters, Sophie; Doust, Jonathan H; Maxwell, Neil S

    2018-02-01

    Investigate whether a sauna exposure prior to short-term heat acclimation (HA) accelerates phenotypic adaptation in females. Randomised, repeated-measures, cross-over trial. Nine females performed two 5-day HA interventions (controlled hyperthermia, Tre ≥38.5°C), separated by 7 weeks, during the follicular phase of the menstrual cycle, confirmed by plasma concentrations of 17-β estradiol and progesterone. Prior to each 90-min HA session, participants sat for 20 min either in a temperate environment (20°C, 40% RH; HAtemp) wearing shorts and a sports bra, or in a hot environment (50°C, 30% RH) wearing a sauna suit to replicate sauna conditions (HAsauna). Participants performed a running heat tolerance test (RHTT) 24 h pre and 24 h post HA. Mean heart rate (HR) (85±4 vs. 68±5 bpm, p≤0.001), sweat rate (0.4±0.2 vs. 0.0±0.0 L·h⁻¹, p≤0.001), and thermal sensation (6±0 vs. 5±1, p=0.050) were higher during the sauna compared to the temperate exposure. Resting rectal temperature (Tre) (-0.28±0.16°C), peak Tre (-0.42±0.22°C), resting HR (-10±4 bpm), peak HR (-12±7 bpm), Tre at sweating onset (-0.29±0.17°C) (p≤0.001), thermal sensation (-0.5±0.5; p=0.002), and perceived exertion (-3±2; p≤0.001) were reduced during the RHTT following HAsauna, but not HAtemp. Plasma volume expansion was greater following HAsauna (HAsauna, 9±7%; HAtemp, 1±5%; p=0.013). Sweat rate (p≤0.001) increased and sweat NaCl (p=0.006) reduced during the RHTT following both HAsauna and HAtemp. This novel strategy initiated HA with an attenuation of thermoregulatory, cardiovascular, and perceptual strain in females, attributable to the measurably greater strain in the sauna compared to the temperate exposure when adopted prior to short-term heat acclimation (STHA). Copyright © 2017 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.

  18. Basic emotions and adaptation. A computational and evolutionary model.

    Science.gov (United States)

    Pacella, Daniela; Ponticorvo, Michela; Gigliotta, Onofrio; Miglino, Orazio

    2017-01-01

    The core principles of the evolutionary theories of emotions declare that affective states represent crucial drives for action selection in the environment and regulate the behavior and adaptation of natural agents in ancestrally recurrent situations. While many studies have used autonomous artificial agents to simulate emotional responses and the way these patterns can affect decision-making, few approaches have tried to analyze the evolutionary emergence of affective behaviors directly from the specific adaptive problems posed by the ancestral environment. A model of the evolution of affective behaviors is presented using simulated artificial agents equipped with neural networks and physically inspired by the architecture of the iCub humanoid robot. We use genetic algorithms to train populations of virtual robots across generations, and investigate the spontaneous emergence of basic emotional behaviors in different experimental conditions. In particular, we focus on studying the emotion of fear; accordingly, the environment explored by the artificial agents can contain stimuli that are safe or dangerous to pick. The simulated task is based on classical conditioning, and the agents are asked to learn a strategy to recognize whether the environment is safe or represents a threat to their lives and to select the correct action to perform in the absence of any visual cues. The simulated agents have special input units in their neural structure whose activation keeps track of their actual "sensations" based on the outcome of past behavior. We train five different neural network architectures and then test the best-ranked individuals, comparing their performances and analyzing the unit activations in each individual's life cycle. We show that the agents, regardless of the presence of recurrent connections, spontaneously evolved the ability to cope with a potentially dangerous environment by collecting information about the environment and then switching their behavior…

  19. A computational framework for modeling targets as complex adaptive systems

    Science.gov (United States)

    Santos, Eugene; Santos, Eunice E.; Korah, John; Murugappan, Vairavan; Subramanian, Suresh

    2017-05-01

    Modeling large military targets is a challenge as they can be complex systems encompassing myriad combinations of human, technological, and social elements that interact, leading to complex behaviors. Moreover, such targets have multiple components and structures, extending across multiple spatial and temporal scales, and are in a state of change, either in response to events in the environment or changes within the system. Complex adaptive system (CAS) theory can help in capturing the dynamism, interactions, and more importantly various emergent behaviors, displayed by the targets. However, a key stumbling block is incorporating information from various intelligence, surveillance and reconnaissance (ISR) sources, while dealing with the inherent uncertainty, incompleteness and time criticality of real world information. To overcome these challenges, we present a probabilistic reasoning network based framework called complex adaptive Bayesian Knowledge Base (caBKB). caBKB is a rigorous, overarching and axiomatic framework that models two key processes, namely information aggregation and information composition. While information aggregation deals with the union, merger and concatenation of information and takes into account issues such as source reliability and information inconsistencies, information composition focuses on combining information components where such components may have well defined operations. Since caBKBs can explicitly model the relationships between information pieces at various scales, it provides unique capabilities such as the ability to de-aggregate and de-compose information for detailed analysis. Using a scenario from the Network Centric Operations (NCO) domain, we will describe how our framework can be used for modeling targets with a focus on methodologies for quantifying NCO performance metrics.

  20. Changes in Jump-Down Performance After Space Flight: Short- and Long-Term Adaptation

    Science.gov (United States)

    Kofman, I. S.; Reschke, M. F.; Cerisano, J. M.; Fisher, E. A.; Lawrence, E. L.; Peters, B. T.; Bloomberg, J. J.

    2010-01-01

    INTRODUCTION Successful jump performance requires functional coordination of visual, vestibular, and somatosensory systems, which are affected by prolonged exposure to microgravity. Astronauts returning from space flight exhibit impaired ability to coordinate effective landing strategies when jumping from a platform to the ground. This study compares the jump strategies used by astronauts before and after flight, the changes to those strategies within a test session, and the recoveries in jump-down performance parameters across several postflight test sessions. These data were obtained as part of an ongoing interdisciplinary study (Functional Task Test, FTT) designed to evaluate both astronaut postflight functional performance and related physiological changes. METHODS Six astronauts from short-duration (Shuttle) and three from long-duration (International Space Station) flights performed 3 two-footed jumps from a platform 30 cm high. A force plate measured the ground reaction forces and center-of-pressure displacement from the landings. Muscle activation data were collected from the medial gastrocnemius and anterior tibialis of both legs using surface electromyography electrodes. Two load cells in the platform measured the load exerted by each foot during the takeoff phase of the jump. Data were collected in 2 preflight sessions, on landing day (Shuttle only), and 1, 6, and 30 days after flight. RESULTS AND CONCLUSION Many of the astronauts tested were unable to maintain balance on their first postflight jump landing but recovered by the third jump, showing a learning progression in which the performance improvement could be attributed to adjustments of strategy on takeoff, landing, or both. Takeoff strategy changes were evident in air time (time between takeoff and landing), which was significantly reduced after flight, and also in increased asymmetry in foot latencies on takeoff. Landing modifications were seen in changes in ground reaction force curves…

  1. Short-term adaptations following Complex Training in team-sports: A meta-analysis.

    Science.gov (United States)

    Freitas, Tomás T; Martinez-Rodriguez, Alejandro; Calleja-González, Julio; Alcaraz, Pedro E

    2017-01-01

    The purpose of this meta-analysis was to study the short-term adaptations in sprint and vertical jump (VJ) performance following Complex Training (CT) in team sports. CT is a resistance training method aimed at developing both strength and power, which has a direct effect on sprint and VJ. It consists of alternating heavy resistance training exercises with plyometric/power exercises, set for set, in the same workout. A search of electronic databases up to July 2016 (PubMed-MEDLINE, SPORTDiscus, Web of Knowledge) was conducted. Inclusion criteria: 1) at least one CT intervention group; 2) training protocols ≥4 weeks; 3) a sample of team-sport players; 4) sprint or VJ as an outcome variable. Effect sizes (ES) of each intervention were calculated and subgroup analyses were performed. A total of 9 studies (13 CT groups) met the inclusion criteria. Medium effect sizes (ES = 0.73) were obtained for pre-post improvements in sprint, and small (ES = 0.41) in VJ, following CT. Experimental groups presented better post-intervention sprint (ES = 1.01) and VJ (ES = 0.63) performance than control groups. Large ESs were exhibited in younger athletes (ES = 0.74) and in training programs with >12 total sessions (ES = 0.81). Medium ESs were obtained for under-Division-I individuals (ES = 0.56), protocols with intra-complex rest intervals ≥2 min (ES = 0.55), conditioning activities with intensities ≤85% 1RM (ES = 0.64), and basketball/volleyball players (ES = 0.55). Small ESs were found for younger athletes (ES = 0.42) and interventions ≥6 weeks (ES = 0.45). CT interventions have positive medium effects on sprint performance and small effects on VJ in team-sport athletes. This training method is a suitable option to include in season planning.

  2. The adaptation method in the Monte Carlo simulation for computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Hyoung Gun; Yoon, Chang Yeon; Lee, Won Ho [Dept. of Bio-convergence Engineering, Korea University, Seoul (Korea, Republic of); Cho, Seung Ryong [Dept. of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of); Park, Sung Ho [Dept. of Neurosurgery, Ulsan University Hospital, Ulsan (Korea, Republic of)

    2015-06-15

    The patient dose incurred from diagnostic procedures during advanced radiotherapy has become an important issue. Many researchers in medical physics are using computational simulations to calculate complex parameters in experiments. However, extended computation times make it difficult for personal computers to run the conventional Monte Carlo method to simulate radiological images with high-flux photons such as images produced by computed tomography (CT). To minimize the computation time without degrading imaging quality, we applied a deterministic adaptation to the Monte Carlo calculation and verified its effectiveness by simulating CT image reconstruction for an image evaluation phantom (Catphan; Phantom Laboratory, New York, NY, USA) and a human-like voxel phantom (KTMAN-2) (Los Alamos National Laboratory, Los Alamos, NM, USA). For the deterministic adaptation, the relationship between iteration numbers and the simulations was estimated and the option to simulate scattered radiation was evaluated. The processing times of simulations using the adaptive method were at least 500 times faster than those using a conventional statistical process. In addition, compared with the conventional statistical method, the adaptive method provided images that were more similar to the experimental images, which proved that the adaptive method is highly effective for simulations that require a large number of iterations. Assuming no radiation scattering in the vicinity of the detectors minimized artifacts in the reconstructed image.

  3. Translation, cross-cultural adaptation and validation of the Diabetes Empowerment Scale - Short Form.

    Science.gov (United States)

    Chaves, Fernanda Figueredo; Reis, Ilka Afonso; Pagano, Adriana Silvina; Torres, Heloísa de Carvalho

    2017-03-23

    To translate, cross-culturally adapt and validate the Diabetes Empowerment Scale - Short Form for the assessment of psychosocial self-efficacy in diabetes care within the Brazilian cultural context. Assessment of the instrument's conceptual equivalence, as well as its translation and cross-cultural adaptation, was performed following international standards. The Expert Committee's assessment of the translated version was conducted through a web questionnaire developed and applied via the web tool e-Surv. The cross-culturally adapted version was used for the pre-test, which was carried out via phone call in a group of eleven health care service users diagnosed with type 2 diabetes mellitus. The pre-test results were examined by a group of experts, composed of health care consultants, applied linguists and statisticians, aiming at an adequate version of the instrument, which was subsequently used for test and retest in a sample of 100 users diagnosed with type 2 diabetes mellitus via phone call, their answers being recorded by the web tool e-Surv. Internal consistency and reproducibility analyses were carried out within the statistical programming environment R. Face and content validity were attained and the Brazilian Portuguese version, entitled Escala de Autoeficácia em Diabetes - Versão Curta, was established. The scale had acceptable internal consistency, with a Cronbach's alpha of 0.634 (95%CI 0.494-0.737), while the correlation of the total score between the two periods was considered moderate (0.47). The intraclass correlation coefficient was 0.50. The translated and cross-culturally adapted version of the instrument in Brazilian Portuguese was considered valid and reliable for assessment within the Brazilian population diagnosed with type 2 diabetes mellitus. The use of a web tool (e-Surv) for recording the Expert Committee responses as well as the responses in the validation tests proved to be a reliable, safe and innovative method.
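
    For readers unfamiliar with the reliability statistic reported above, the sketch below computes Cronbach's alpha from an n-respondents-by-k-items score matrix; the data are synthetic and purely illustrative.

      # Minimal sketch: Cronbach's alpha for a k-item scale,
      # alpha = k/(k-1) * (1 - sum(item variances) / var(total score)).
      import numpy as np

      def cronbach_alpha(scores):
          scores = np.asarray(scores, dtype=float)
          k = scores.shape[1]
          item_vars = scores.var(axis=0, ddof=1)        # per-item variances
          total_var = scores.sum(axis=1).var(ddof=1)    # variance of total score
          return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

      # Synthetic 100-respondent, 8-item scale sharing a common trait:
      rng = np.random.default_rng(1)
      latent = rng.normal(size=(100, 1))
      items = latent + rng.normal(scale=1.0, size=(100, 8))
      print(round(cronbach_alpha(items), 3))            # moderate alpha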

  4. Basic emotions and adaptation. A computational and evolutionary model.

    Directory of Open Access Journals (Sweden)

    Daniela Pacella

    Full Text Available The core principles of the evolutionary theories of emotions declare that affective states represent crucial drives for action selection in the environment and regulate the behavior and adaptation of natural agents in ancestrally recurrent situations. While many studies have used autonomous artificial agents to simulate emotional responses and the way these patterns can affect decision-making, few approaches have tried to analyze the evolutionary emergence of affective behaviors directly from the specific adaptive problems posed by the ancestral environment. A model of the evolution of affective behaviors is presented using simulated artificial agents equipped with neural networks and physically inspired by the architecture of the iCub humanoid robot. We use genetic algorithms to train populations of virtual robots across generations, and investigate the spontaneous emergence of basic emotional behaviors in different experimental conditions. In particular, we focus on studying the emotion of fear; accordingly, the environment explored by the artificial agents can contain stimuli that are safe or dangerous to pick. The simulated task is based on classical conditioning, and the agents are asked to learn a strategy to recognize whether the environment is safe or represents a threat to their lives and to select the correct action to perform in the absence of any visual cues. The simulated agents have special input units in their neural structure whose activation keeps track of their actual "sensations" based on the outcome of past behavior. We train five different neural network architectures and then test the best-ranked individuals, comparing their performances and analyzing the unit activations in each individual's life cycle. We show that the agents, regardless of the presence of recurrent connections, spontaneously evolved the ability to cope with a potentially dangerous environment by collecting information about the environment and then…

  5. Hard Real-Time Task Scheduling in Cloud Computing Using an Adaptive Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Amjad Mahmood

    2017-04-01

    Full Text Available In the Infrastructure-as-a-Service cloud computing model, virtualized computing resources in the form of virtual machines are provided over the Internet. A user can rent an arbitrary number of computing resources to meet their requirements, making cloud computing an attractive choice for executing real-time tasks. Economical task allocation and scheduling on a set of leased virtual machines is an important problem in the cloud computing environment. This paper proposes a greedy algorithm and a genetic algorithm with adaptive selection of suitable crossover and mutation operations (named AGA) to allocate and schedule real-time tasks with precedence constraints on heterogeneous virtual machines. A comprehensive simulation study has been done to evaluate the performance of the proposed algorithms in terms of their solution quality and efficiency. The simulation results show that AGA outperforms the greedy algorithm and a non-adaptive genetic algorithm in terms of solution quality.
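
    A rough sketch of the adaptive-operator idea follows: a plain GA assigns tasks to VMs, and the probability of choosing each mutation operator is weighted by its recent success. The task costs, operators, and credit rule are illustrative assumptions, not the paper's AGA (which also handles precedence constraints).

      # Minimal sketch: GA for task-to-VM allocation with adaptive selection
      # between two mutation operators. Fitness is an illustrative makespan.
      import random

      TASKS = [3, 7, 2, 8, 4, 6, 5, 1]          # task execution costs (toy data)
      N_VMS = 3

      def makespan(assign):
          loads = [0.0] * N_VMS
          for task_cost, vm in zip(TASKS, assign):
              loads[vm] += task_cost
          return max(loads)

      def mut_random(ind):                      # operator 1: random reassignment
          i = random.randrange(len(ind))
          ind[i] = random.randrange(N_VMS)

      def mut_swap(ind):                        # operator 2: swap two assignments
          i, j = random.sample(range(len(ind)), 2)
          ind[i], ind[j] = ind[j], ind[i]

      operators = [mut_random, mut_swap]
      credit = [1.0, 1.0]                       # adaptive selection weights

      pop = [[random.randrange(N_VMS) for _ in TASKS] for _ in range(30)]
      for gen in range(100):
          pop.sort(key=makespan)
          parents = pop[:10]
          children = []
          while len(children) < 20:
              a, b = random.sample(parents, 2)
              cut = random.randrange(1, len(TASKS))
              child = a[:cut] + b[cut:]         # one-point crossover
              k = random.choices([0, 1], weights=credit)[0]
              before = makespan(child)
              operators[k](child)
              # Reward operators that improve offspring; decay otherwise.
              credit[k] = 0.9 * credit[k] + (1.0 if makespan(child) < before else 0.1)
              children.append(child)
          pop = parents + children

      print(makespan(min(pop, key=makespan)))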

  6. Computer modeling describes gravity-related adaptation in cell cultures.

    Science.gov (United States)

    Alexandrov, Ludmil B; Alexandrova, Stoyana; Usheva, Anny

    2009-12-16

    Questions about the changes of biological systems in response to hostile environmental factors are important but not easy to answer. Often, the traditional description with differential equations is difficult due to the overwhelming complexity of the living systems. Another way to describe complex systems is by simulating them with phenomenological models such as the well-known evolutionary agent-based model (EABM). Here we developed an EABM to simulate cell colonies as a multi-agent system that adapts to hyper-gravity in starvation conditions. In the model, the cell's heritable characteristics are generated and transferred randomly to offspring cells. After a qualitative validation of the model at normal gravity, we simulate cellular growth in hyper-gravity conditions. The obtained data are consistent with previously confirmed theoretical and experimental findings for bacterial behavior in environmental changes, including the experimental data from the microgravity Atlantis and the Hypergravity 3000 experiments. Our results demonstrate that it is possible to utilize an EABM with realistic qualitative description to examine the effects of hypergravity and starvation on complex cellular entities.

  7. Adaptive zooming in X-ray computed tomography.

    Science.gov (United States)

    Dabravolski, Andrei; Batenburg, Kees Joost; Sijbers, Jan

    2014-01-01

    In computed tomography (CT), the source-detector system commonly rotates around the object in a circular trajectory. Such a trajectory does not allow the detector to be fully exploited when scanning elongated objects. The aim is to increase the spatial resolution of the reconstructed image by optimal zooming during scanning. A new approach is proposed in which the full width of the detector is exploited for every projection angle. This approach is based on the use of prior information about the object's convex hull to move the source as close as possible to the object while avoiding truncation of the projections. Experiments show that the proposed approach can significantly improve reconstruction quality, producing reconstructions with smaller errors and revealing more details in the object. The proposed approach can lead to more accurate reconstructions and increased spatial resolution in the object compared to the conventional circular trajectory.

  8. The Simulation Computer Based Learning (SCBL) for Short Circuit Multi Machine Power System Analysis

    Science.gov (United States)

    Rahmaniar; Putri, Maharani

    2018-03-01

    Strengthening the competitiveness of human resources has become the responsibility of colleges as providers of formal higher education. The Electrical Engineering Program of UNPAB (Prodi TE UNPAB), as a department that manages the field of electrical engineering expertise, has a very important part in preparing the human resources (HR) required of it; the graduates produced by TE UNPAB are expected to be able to compete globally, especially in relation to the implementation of the ASEAN Economic Community (AEC), which requires the active participation of graduates with competence and competitive HR quality. The formation of competitive HR is carried out with the various strategies contained in the Seven Higher Education Standards, one part of which is the implementation of the teaching and learning process in electrical system analysis with short circuit analysis (SCA). This is a core course that forms the basis for the competencies of other subjects in the following semesters. A Computer Based Learning (CBL) model was developed for learning multi-machine short circuit fault analysis, which includes: (a) single-phase short circuit faults, (b) two-phase short circuit faults, (c) ground short circuit faults, and (d) single-phase-to-ground short circuit faults. The CBL model for the Electrical System Analysis course provides space for students to be more active in learning to solve complex problems; it gives students flexibility in actively solving short circuit analysis problems and builds the active participation of students in learning (Student Centered Learning) in the electrical power system analysis course.

  9. PEAC: A Power-Efficient Adaptive Computing Technology for Enabling Swarm of Small Spacecraft and Deployable Mini-Payloads

    Data.gov (United States)

    National Aeronautics and Space Administration — This task is to develop and demonstrate a path-to-flight and power-adaptive avionics technology PEAC (Power Efficient Adaptive Computing). PEAC will enable emerging...

  10. A self-adaptive chaotic particle swarm algorithm for short term hydroelectric system scheduling in deregulated environment

    International Nuclear Information System (INIS)

    Jiang Chuanwen; Bompard, Etorre

    2005-01-01

    This paper proposes a short-term hydroelectric plant dispatch model based on the rule of maximizing the benefit. For the optimal dispatch model, which is a large-scale nonlinear planning problem with multiple constraints and variables, this paper proposes a novel self-adaptive chaotic particle swarm optimization algorithm to better solve the short-term generation scheduling of a hydro system in a deregulated environment. Since chaotic mapping enjoys certainty, ergodicity and the stochastic property, the proposed approach introduces chaos mapping and an adaptive scaling term into the particle swarm optimization algorithm, which increases its convergence rate and resulting precision. The new method has been examined and tested on a practical hydro system. The results are promising and show the effectiveness and robustness of the proposed approach in comparison with the traditional particle swarm optimization algorithm.
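
    The sketch below illustrates the general flavor of a chaotic, self-adaptive PSO step on a toy objective: a logistic map supplies the chaotic sequence and the inertia weight shrinks over iterations. The coefficients and the perturbation rule are illustrative assumptions; the paper's dispatch objective and constraints are not reproduced.

      # Minimal sketch: PSO with a logistic-map chaotic perturbation and an
      # adaptive (decreasing) inertia weight, minimizing a toy objective.
      import numpy as np

      def objective(x):                  # stand-in for the (negated) dispatch benefit
          return np.sum(x**2, axis=1)

      rng = np.random.default_rng(2)
      n, dim = 20, 5
      pos = rng.uniform(-5, 5, (n, dim))
      vel = np.zeros((n, dim))
      pbest = pos.copy()
      pbest_val = objective(pos)
      gbest = pbest[pbest_val.argmin()].copy()
      z = 0.7                            # logistic-map state in (0, 1)

      for it in range(200):
          z = 4.0 * z * (1.0 - z)        # chaotic sequence (logistic map)
          w = 0.9 - 0.5 * it / 200       # adaptive inertia weight
          r1, r2 = rng.random((n, dim)), rng.random((n, dim))
          vel = w * vel + 2.0 * r1 * (pbest - pos) + 2.0 * r2 * (gbest - pos)
          pos = pos + vel + 0.01 * (z - 0.5)   # chaotic perturbation of positions
          vals = objective(pos)
          improved = vals < pbest_val
          pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
          gbest = pbest[pbest_val.argmin()].copy()

      print(np.round(gbest, 3))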

  11. An Overview of Recent Developments in Cognitive Diagnostic Computer Adaptive Assessments

    Directory of Open Access Journals (Sweden)

    Alan Huebner

    2010-01-01

    Full Text Available Cognitive diagnostic modeling has become an exciting new field of psychometric research. These models aim to diagnose examinees' mastery status of a group of discretely defined skills, or attributes, thereby providing them with detailed information regarding their specific strengths and weaknesses. Combining cognitive diagnosis with computer adaptive assessments has emerged as an important part of this new field. This article aims to provide practitioners and researchers with an introduction to and overview of recent developments in cognitive diagnostic computer adaptive assessments.

  12. Examining the short term effects of emotion under an Adaptation Level Theory model of tinnitus perception.

    Science.gov (United States)

    Durai, Mithila; O'Keeffe, Mary G; Searchfield, Grant D

    2017-03-01

    Existing evidence suggests a strong relationship between tinnitus and emotion. The objective of this study was to examine the effects of short-term emotional changes, along valence and arousal dimensions, on tinnitus outcomes. Emotional stimuli were presented in two different modalities: auditory and visual. The authors hypothesized that (1) negative valence (unpleasant) stimuli and/or high arousal stimuli would lead to greater tinnitus loudness and annoyance than positive valence and/or low arousal stimuli, and (2) auditory emotional stimuli, which are in the same modality as the tinnitus, would exhibit a greater effect on tinnitus outcome measures than visual stimuli. Auditory and visual emotive stimuli were administered to 22 participants (12 females and 10 males) with chronic tinnitus, recruited via email invitations sent out to the University of Auckland Tinnitus Research Volunteer Database. The emotional stimuli used were taken from the International Affective Digitized Sounds - Version 2 (IADS-2) and the International Affective Picture System (IAPS) (Bradley and Lang, 2007a, 2007b). The Emotion Regulation Questionnaire (Gross and John, 2003) was administered alongside subjective ratings of tinnitus loudness and annoyance, and psychoacoustic sensation-level matches to external sounds. Males had significantly different emotion regulation scores than females. Negative valence emotional auditory stimuli led to higher tinnitus loudness ratings in males and females and higher annoyance ratings in males only; loudness matches of tinnitus remained unchanged. The visual stimuli did not have an effect on tinnitus ratings. The results are discussed relative to the Adaptation Level Theory Model of Tinnitus. The results indicate that the negative valence dimension of emotion is associated with increased tinnitus magnitude judgements, and gender effects may also be present, but only when the emotional stimulus is in the auditory modality. Sounds with emotional associations may be…

  13. The Psychological Well-Being and Sociocultural Adaptation of Short-Term International Students in Ireland

    Science.gov (United States)

    O'Reilly, Aileen; Ryan, Dermot; Hickey, Tina

    2010-01-01

    This article reports on an empirical study of the psychosocial adaptation of international students in Ireland. Using measures of social support, loneliness, stress, psychological well-being, and sociocultural adaptation, data were obtained from international students and a comparison sample of Irish students. The study found that, although…

  14. A comparison of computerized adaptive testing and fixed-length short forms for the Prosthetic Limb Users Survey of Mobility (PLUS-M™).

    Science.gov (United States)

    Amtmann, Dagmar; Bamer, Alyssa M; Kim, Jiseon; Bocell, Fraser; Chung, Hyewon; Park, Ryoungsun; Salem, Rana; Hafner, Brian J

    2017-09-01

    New health status instruments can be administered by computerized adaptive test or short forms. The Prosthetic Limb Users Survey of Mobility (PLUS-M™) is a self-report measure of mobility for prosthesis users with lower limb loss. This study used the PLUS-M to examine advantages and disadvantages of computerized adaptive test and short forms. To compare scores obtained from computerized adaptive test to scores obtained from fixed-length short forms (7-item and 12-item) in order to provide guidance to researchers and clinicians on how to select the best form of administration for different uses. Cross-sectional, observational study. Individuals with lower limb loss completed the PLUS-M by computerized adaptive test and short forms. Administration time, correlations between the scores, and standard errors were compared. Scores and standard errors from the computerized adaptive test, 7-item short form, and 12-item short form were highly correlated and all forms of administration were efficient. Computerized adaptive test required less time to administer than either paper or electronic short forms; however, time savings were minimal compared to the 7-item short form. Results indicate that the PLUS-M computerized adaptive test is most efficient, and differences in scores between administration methods are minimal. The main advantage of the computerized adaptive test was more reliable scores at higher levels of mobility compared to short forms. Clinical relevance: Health-related item banks, like the Prosthetic Limb Users Survey of Mobility (PLUS-M™), can be administered by computerized adaptive testing (CAT) or as fixed-length short forms (SFs). Results of this study will help clinicians and researchers decide whether they should invest in a CAT administration system or whether SFs are more appropriate.
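
    To make the CAT-versus-short-form contrast concrete, here is a minimal sketch of adaptive administration under a Rasch (1PL) model: the next item is the one whose difficulty is closest to the current ability estimate, and ability is re-estimated after each response. The item bank, item count, and estimator are illustrative assumptions, not the PLUS-M's calibration.

      # Minimal sketch: 7-item computerized adaptive test under a Rasch model,
      # with expected-a-posteriori (EAP) ability estimation. Synthetic bank.
      import numpy as np

      def p_correct(theta, b):
          return 1.0 / (1.0 + np.exp(-(theta - b)))

      def eap_theta(responses, bs, grid=np.linspace(-4, 4, 81)):
          # EAP ability estimate under a standard-normal prior.
          prior = np.exp(-grid**2 / 2)
          like = np.ones_like(grid)
          for r, b in zip(responses, bs):
              p = p_correct(grid, b)
              like *= p if r else (1 - p)
          post = prior * like
          return float((grid * post).sum() / post.sum())

      rng = np.random.default_rng(3)
      bank = rng.uniform(-3, 3, 50)          # item difficulties
      true_theta = 1.2
      responses, used = [], []

      for step in range(7):                  # 7-item adaptive test
          theta_hat = eap_theta(responses, [bank[i] for i in used]) if used else 0.0
          remaining = [i for i in range(len(bank)) if i not in used]
          nxt = min(remaining, key=lambda i: abs(bank[i] - theta_hat))
          used.append(nxt)
          responses.append(rng.random() < p_correct(true_theta, bank[nxt]))

      print(round(eap_theta(responses, [bank[i] for i in used]), 2))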

  15. Simple adaptive sparse representation based classification schemes for EEG based brain-computer interface applications.

    Science.gov (United States)

    Shin, Younghak; Lee, Seungchan; Ahn, Minkyu; Cho, Hohyun; Jun, Sung Chan; Lee, Heung-No

    2015-11-01

    One of the main problems related to electroencephalogram (EEG) based brain-computer interface (BCI) systems is the non-stationarity of the underlying EEG signals. This results in the deterioration of the classification performance during experimental sessions. Therefore, adaptive classification techniques are required for EEG based BCI applications. In this paper, we propose simple adaptive sparse representation based classification (SRC) schemes. Supervised and unsupervised dictionary update techniques for new test data and a dictionary modification method by using the incoherence measure of the training data are investigated. The proposed methods are very simple and additional computation for the re-training of the classifier is not needed. The proposed adaptive SRC schemes are evaluated using two BCI experimental datasets. The proposed methods are assessed by comparing classification results with the conventional SRC and other adaptive classification methods. On the basis of the results, we find that the proposed adaptive schemes show relatively improved classification accuracy as compared to conventional methods without requiring additional computation. Copyright © 2015 Elsevier Ltd. All rights reserved.
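
    As a rough illustration of the SRC idea with an unsupervised dictionary update, the sketch below represents a test trial as a sparse combination of training trials, classifies it by the class with the smallest reconstruction residual, and appends confidently classified trials to the dictionary. The solver, features, and confidence rule are illustrative stand-ins for the paper's EEG pipeline.

      # Minimal sketch: sparse representation-based classification (SRC) with
      # an unsupervised dictionary update; data and thresholds are synthetic.
      import numpy as np
      from sklearn.linear_model import OrthogonalMatchingPursuit

      def src_classify(D, labels, x, n_nonzero=10):
          omp = OrthogonalMatchingPursuit(n_nonzero_coefs=n_nonzero,
                                          fit_intercept=False)
          omp.fit(D, x)                     # solve x ~ D @ coef, coef sparse
          coef = omp.coef_
          residuals = {}
          for c in np.unique(labels):
              mask = (labels == c)
              residuals[c] = np.linalg.norm(x - D[:, mask] @ coef[mask])
          pred = min(residuals, key=residuals.get)
          margin = sorted(residuals.values())[1] - residuals[pred]  # confidence
          return pred, margin

      rng = np.random.default_rng(4)
      D = rng.normal(size=(64, 40))          # columns = training trials
      D /= np.linalg.norm(D, axis=0)
      labels = np.array([0] * 20 + [1] * 20)

      x = D[:, 3] + 0.1 * rng.normal(size=64)   # noisy class-0 test trial
      pred, margin = src_classify(D, labels, x)
      if margin > 0.5:                          # unsupervised dictionary update
          D = np.column_stack([D, x / np.linalg.norm(x)])
          labels = np.append(labels, pred)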

  16. Quantum computer based on activated dielectric nanoparticles selectively interacting with short optical pulses

    International Nuclear Information System (INIS)

    Gadomskii, Oleg N; Kharitonov, Yu Ya

    2004-01-01

    The operation principle of a quantum computer is proposed based on a system of dielectric nanoparticles activated with two-level atoms (qubits), in which electric dipole transitions are excited by short intense optical pulses. It is proved that the logical operation (logical operator) CNOT (controlled NOT) is performed by means of time-dependent transfer of quantum information over 'long' (of the order of 10^4 nm) distances between spherical nanoparticles owing to the delayed interaction between them in the optical radiation field. It is shown that the one-qubit and two-qubit logical operators required for quantum calculations can be realised by selectively exciting dielectric particles with short optical pulses. (quantum calculations)

  17. Computation of the Short-Time Linear Canonical Transform with Dual Window

    Directory of Open Access Journals (Sweden)

    Lei Huang

    2017-01-01

    Full Text Available The short-time linear canonical transform (STLCT), which maps a time-domain signal into the joint time-frequency domain, has recently attracted some attention in the area of signal processing. However, its applications are still limited by the fact that the selection of the coefficients of the short-time linear canonical series (STLCS) is not unique, because the time and frequency elementary functions (together known as the basis functions of the STLCS) do not constitute an orthogonal basis. To solve this problem, this paper investigates a dual-window solution. First, the nonorthogonality problem suffered by the original window is resolved by imposing an orthogonality condition with a dual window. Then, based on the obtained condition, a dual-window computation approach for the Gabor transform (GT) is extended to the STLCS. In addition, simulations verify the validity of the proposed condition and solutions. Furthermore, some possible directions for application are discussed.
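
    For reference, a commonly used definition of the STLCT is sketched below in LaTeX; the notation (signal f, analysis window g, parameter matrix A = (a, b; c, d) with ad - bc = 1 and b ≠ 0) follows the usual linear canonical transform convention and may differ in constants from the paper's.

      % Sketch of the standard STLCT definition under the usual LCT convention.
      \[
        \mathrm{STLCT}_f^{A}(t, u)
          = \int_{-\infty}^{\infty} f(\tau)\, g^{*}(\tau - t)\, K_A(\tau, u)\,
            \mathrm{d}\tau,
        \qquad
        K_A(\tau, u)
          = \frac{1}{\sqrt{\, j 2\pi b \,}}
            \exp\!\Big( \frac{j}{2b}\big( a\tau^{2} - 2\tau u + d u^{2} \big) \Big).
      \]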

  18. Detection of User Independent Single Trial ERPs in Brain Computer Interfaces: An Adaptive Spatial Filtering Approach

    DEFF Research Database (Denmark)

    Leza, Cristina; Puthusserypady, Sadasivan

    2017-01-01

    Brain Computer Interfaces (BCIs) use brain signals to communicate with the external world. The main challenges to address are speed, accuracy and adaptability. Here, a novel algorithm for a P300-based BCI spelling system is presented, specifically suited for single-trial detection of Event-Related Potentials (ERPs)…

  19. Features of adaptive control and measurement of the effectiveness of distance teaching of computer science

    Directory of Open Access Journals (Sweden)

    Евгений Игоревич Горюшкин

    2009-06-01

    Full Text Available This article describes approaches to the construction of effective systems for monitoring the productivity of computer science instruction in higher education. It proposes basing such systems on adaptive testing, with artificial neural networks applied in the development of the tests.

  20. Identifying Students at Risk: An Examination of Computer-Adaptive Measures and Latent Class Growth Analysis

    Science.gov (United States)

    Keller-Margulis, Milena; McQuillin, Samuel D.; Castañeda, Juan Javier; Ochs, Sarah; Jones, John H.

    2018-01-01

    Multitiered systems of support depend on screening technology to identify students at risk. The purpose of this study was to examine the use of a computer-adaptive test and latent class growth analysis (LCGA) to identify students at risk in reading with focus on the use of this methodology to characterize student performance in screening.…

  1. Short-term adaptation and chronic cardiac remodelling to high altitude in lowlander natives and Himalayan Sherpa.

    Science.gov (United States)

    Stembridge, Mike; Ainslie, Philip N; Shave, Rob

    2015-11-01

    What is the topic of this review? At high altitude, the cardiovascular system must adapt in order to meet the metabolic demand for oxygen. This review summarizes recent findings relating to short-term and life-long cardiac adaptation to high altitude in the context of exercise capacity. What advances does it highlight? Both Sherpa and lowlanders exhibit smaller left ventricular volumes at high altitude; however, myocardial relaxation, as evidenced by diastolic untwist, is reduced only in Sherpa, indicating that short-term hypoxia does not impair diastolic relaxation. Potential remodelling of systolic function, as evidenced by lower left ventricular systolic twist in Sherpa, may facilitate the requisite sea-level mechanical reserve required during exercise, although this remains to be confirmed. Both short-term and life-long high-altitude exposure challenge the cardiovascular system to meet the metabolic demand for O2 in a hypoxic environment. As the demand for O2 delivery increases during exercise, the circulatory component of oxygen transport is placed under additional stress. Acute adaptation and chronic remodelling of cardiac structure and function may occur to facilitate O2 delivery in lowlanders during sojourns to high altitude and in permanent highland residents. However, our understanding of cardiac structural and functional adaptation in Sherpa remains confined to a higher maximal heart rate, lower pulmonary vascular resistance and no differences in resting cardiac output. Ventricular form and function are intrinsically linked through the left ventricular (LV) mechanics that facilitate efficient ejection, minimize myofibre stress during contraction and aid diastolic recoil. Recent examination of LV mechanics has allowed detailed insight into fundamental cardiac adaptation in high-altitude Sherpa. In this symposium report, we review recent advances in our understanding of LV function in both lowlanders and Sherpa at rest and discuss the potential consequences…

  2. Odor-context effects in free recall after a short retention interval: a new methodology for controlling adaptation.

    Science.gov (United States)

    Isarida, Takeo; Sakai, Tetsuya; Kubota, Takayuki; Koga, Miho; Katayama, Yu; Isarida, Toshiko K

    2014-04-01

    The present study investigated context effects of incidental odors in free recall after a short retention interval (5 min). With a short retention interval, the results are not confounded by extraneous odors or encounters with the experimental odor and possible rehearsal during a long retention interval. A short study time condition (4 s per item), predicted not to be affected by adaptation to the odor, and a long study time condition (8 s per item) were used. Additionally, we introduced a new method for recovery from adaptation, where a dissimilar odor was briefly presented at the beginning of the retention interval, and we demonstrated the effectiveness of this technique. An incidental learning paradigm was used to prevent overshadowing from confounding the results. In three experiments, undergraduates (N = 200) incidentally studied words presented one-by-one and received a free recall test. Two pairs of odors and a third odor having different semantic-differential characteristics were selected from 14 familiar odors. One of the odors was presented during encoding, and during the test, the same odor (same-context condition) or the other odor within the pair (different-context condition) was presented. Without using a recovery-from-adaptation method, a significant odor-context effect appeared in the 4-s/item condition, but not in the 8-s/item condition. Using the recovery-from-adaptation method, context effects were found for both the 8- and the 4-s/item conditions. The size of the recovered odor-context effect did not change with study time. There were no serial position effects. Implications of the present findings are discussed.

  3. A computer simulation of an adaptive noise canceler with a single input

    Science.gov (United States)

    Albert, Stuart D.

    1991-06-01

    A description of an adaptive noise canceler using Widrow's LMS algorithm is presented. A computer simulation of canceler performance (adaptive convergence time and frequency transfer function) was written for use as a design tool. The simulation, its assumptions, and its input parameters are described in detail. The simulation is used in a design example to predict the performance of an adaptive noise canceler in the simultaneous presence of both strong and weak narrow-band signals (a cosited frequency-hopping radio scenario). On the basis of the simulation results, it is concluded that the simulation is suitable for use as an adaptive noise canceler design tool; i.e., it can be used to evaluate the effect of design parameter changes on canceler performance.
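
    A minimal sketch of the classic Widrow-style canceler follows: the primary input carries signal plus noise, a reference input carries correlated noise, and the LMS-adapted filter output is subtracted so that the error converges toward the clean signal. Filter length, step size, and the noise path are illustrative choices, not the report's parameters.

      # Minimal sketch: LMS adaptive noise canceler with a single reference.
      import numpy as np

      rng = np.random.default_rng(5)
      N, L, mu = 5000, 16, 0.01
      t = np.arange(N)

      signal = np.sin(2 * np.pi * 0.01 * t)          # desired narrow-band signal
      ref_noise = rng.normal(size=N)                 # reference noise input
      # Noise reaching the primary sensor: filtered version of the reference.
      primary_noise = np.convolve(ref_noise, [0.6, -0.3, 0.1], mode="same")
      primary = signal + primary_noise

      w = np.zeros(L)
      error = np.zeros(N)
      for n in range(L, N):
          x = ref_noise[n - L:n][::-1]               # reference tap vector
          y = w @ x                                  # adaptive noise estimate
          error[n] = primary[n] - y                  # canceler output
          w += 2 * mu * error[n] * x                 # LMS weight update

      # After convergence, error[n] approximates the clean signal.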

  4. Autonomic intrusion detection: Adaptively detecting anomalies over unlabeled audit data streams in computer networks

    KAUST Repository

    Wang, Wei; Guyet, Thomas; Quiniou, René; Cordier, Marie-Odile; Masseglia, Florent; Zhang, Xiangliang

    2014-01-01

    In this work, we propose a novel framework of autonomic intrusion detection that fulfills online and adaptive intrusion detection over unlabeled HTTP traffic streams in computer networks. The framework holds potential for self-managing: self-labeling, self-updating and self-adapting. Our framework employs the Affinity Propagation (AP) algorithm to learn a subject’s behaviors through dynamical clustering of the streaming data. It automatically labels the data and adapts to normal behavior changes while identifying anomalies. Two large real HTTP traffic streams collected in our institute as well as a set of benchmark KDD’99 data are used to validate the framework and the method. The test results show that the autonomic model achieves better results in terms of effectiveness and efficiency compared to the adaptive Sequential Karhunen–Loeve method and static AP, as well as three other static anomaly detection methods, namely k-NN, PCA and SVM.
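
    A rough sketch of the underlying clustering idea, using scikit-learn's Affinity Propagation on synthetic two-feature traffic vectors: large clusters are taken as normal behavior and very small clusters are flagged as anomalous. The features, cluster-size threshold, and data are illustrative assumptions, not the framework's full self-labeling logic.

      # Minimal sketch: Affinity Propagation clustering with size-based
      # self-labeling of anomalies; all data and thresholds are synthetic.
      import numpy as np
      from sklearn.cluster import AffinityPropagation

      rng = np.random.default_rng(6)
      normal = rng.normal(loc=[1.0, 0.5], scale=0.1, size=(200, 2))  # HTTP features
      attacks = rng.normal(loc=[3.0, 3.0], scale=0.05, size=(5, 2))
      X = np.vstack([normal, attacks])

      ap = AffinityPropagation(random_state=0).fit(X)
      labels = ap.labels_

      # Self-labeling: clusters holding <5% of the window are treated as anomalous.
      sizes = np.bincount(labels)
      anomalous = np.isin(labels, np.where(sizes < 0.05 * len(X))[0])
      print(int(anomalous.sum()), "flagged points")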

  6. Adaptive control of Parkinson's state based on a nonlinear computational model with unknown parameters.

    Science.gov (United States)

    Su, Fei; Wang, Jiang; Deng, Bin; Wei, Xi-Le; Chen, Ying-Yuan; Liu, Chen; Li, Hui-Yan

    2015-02-01

    The objective here is to explore the use of an adaptive input-output feedback linearization method to achieve an improved deep brain stimulation (DBS) algorithm for closed-loop control of the Parkinson's state. The control law is based on a highly nonlinear computational model of Parkinson's disease (PD) with unknown parameters. The restoration of thalamic relay reliability is formulated as the desired outcome of the adaptive control methodology, and the DBS waveform is the control input. The control input is adjusted in real time according to estimates of the unknown parameters as well as the feedback signal. Simulation results show that the proposed adaptive control algorithm succeeds in restoring the relay reliability of the thalamus and at the same time achieves accurate estimation of the unknown parameters. Our findings point to the potential value of an adaptive control approach that could be used to regulate the DBS waveform for more effective treatment of PD.

  7. Exploiting short-term memory in soft body dynamics as a computational resource.

    Science.gov (United States)

    Nakajima, K; Li, T; Hauser, H; Pfeifer, R

    2014-11-06

    Soft materials are not only highly deformable, but they also possess rich and diverse body dynamics. Soft body dynamics exhibit a variety of properties, including nonlinearity, elasticity and potentially infinitely many degrees of freedom. Here, we demonstrate that such soft body dynamics can be employed to conduct certain types of computation. Using body dynamics generated from a soft silicone arm, we show that they can be exploited to emulate functions that require memory and to embed robust closed-loop control into the arm. Our results suggest that soft body dynamics have a short-term memory and can serve as a computational resource. This finding paves the way towards exploiting passive body dynamics for control of a large class of underactuated systems. © 2014 The Author(s) Published by the Royal Society. All rights reserved.
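
    The "short-term memory as a computational resource" idea can be illustrated numerically: below, a fixed random dynamical system (an echo-state-style reservoir standing in for the silicone arm) is driven by an input stream, and only a linear readout is trained, by ridge regression, to recall the input from several steps earlier. The reservoir size, delay, and regularization are illustrative assumptions.

      # Minimal sketch: a fixed random reservoir plus trained linear readout
      # solving a delayed-recall (short-term memory) task.
      import numpy as np

      rng = np.random.default_rng(7)
      T, n, delay = 2000, 100, 5
      u = rng.uniform(-1, 1, T)                       # input stream

      W = rng.normal(size=(n, n))
      W *= 0.9 / max(abs(np.linalg.eigvals(W)))       # echo-state scaling
      w_in = rng.uniform(-1, 1, n)

      states = np.zeros((T, n))
      x = np.zeros(n)
      for t in range(1, T):
          x = np.tanh(W @ x + w_in * u[t])            # fixed "body" dynamics
          states[t] = x

      # Train linear readout to output u[t - delay] from the state at time t.
      X, y = states[delay:], u[:-delay]
      w_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(n), X.T @ y)
      pred = X @ w_out
      print(round(np.corrcoef(pred, y)[0, 1], 3))     # memory-capacity proxy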

  8. Comparing computer adaptive and curriculum-based measures of math in progress monitoring.

    Science.gov (United States)

    Shapiro, Edward S; Dennis, Minyi Shih; Fu, Qiong

    2015-12-01

    The purpose of the study was to compare the use of a Computer Adaptive Test and Curriculum-Based Measurement in the assessment of mathematics. This study also investigated the degree to which slope or rate of change predicted student outcomes on the annual state assessment of mathematics above and beyond scores of single point screening assessments (i.e., the computer adaptive test or the CBM assessment just before the administration of the state assessment). Repeated measurement of mathematics once per month across a 7-month period using a Computer Adaptive Test (STAR-Math) and Curriculum-Based Measurement (CBM, AIMSweb Math Computation, AIMSweb Math Concepts/Applications) was collected for a maximum total of 250 third, fourth, and fifth grade students. Results showed STAR-Math in all 3 grades and AIMSweb Math Concepts/Applications in the third and fifth grades had primarily linear growth patterns in mathematics. AIMSweb Math Computation in all grades and AIMSweb Math Concepts/Applications in Grade 4 had decelerating positive trends. Predictive validity evidence showed the strongest relationships were between STAR-Math and outcomes for third and fourth grade students. The blockwise multiple regression by grade revealed that slopes accounted for only a very small proportion of additional variance above and beyond what was explained by the scores obtained on a single point of assessment just prior to the administration of the state assessment. (c) 2015 APA, all rights reserved.
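
    The blockwise regression logic reported above can be illustrated with a small hierarchical-regression sketch: fit the outcome on the final screening score alone, then add the growth slope and inspect the change in R-squared. Data and variable names are synthetic, hypothetical stand-ins.

```python
# Hierarchical (blockwise) regression sketch on synthetic data: how much
# variance does the slope add beyond a single-point screening score?
import numpy as np

rng = np.random.default_rng(2)
n = 250
screen = rng.normal(500, 50, n)               # single-point screening score
slope = rng.normal(5, 2, n)                   # monthly growth rate
outcome = 0.8 * screen + 0.5 * slope + rng.normal(0, 20, n)

def r_squared(X, y):
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

r2_single = r_squared(screen[:, None], outcome)
r2_both = r_squared(np.column_stack([screen, slope]), outcome)
print(f"R^2, screening score only: {r2_single:.3f}")
print(f"R^2, adding slope:         {r2_both:.3f} (delta {r2_both - r2_single:.3f})")
```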

  9. Computational adaptive optics for broadband interferometric tomography of tissues and cells

    Science.gov (United States)

    Adie, Steven G.; Mulligan, Jeffrey A.

    2016-03-01

    Adaptive optics (AO) can shape aberrated optical wavefronts to physically restore the constructive interference needed for high-resolution imaging. With access to the complex optical field, however, many functions of optical hardware can be achieved computationally, including focusing and the compensation of optical aberrations to restore the constructive interference required for diffraction-limited imaging performance. Holography, which employs interferometric detection of the complex optical field, was developed based on this connection between hardware and computational image formation, although this link has only recently been exploited for 3D tomographic imaging in scattering biological tissues. This talk will present the underlying imaging science behind computational image formation with optical coherence tomography (OCT) -- a beam-scanned version of broadband digital holography. Analogous to hardware AO (HAO), we demonstrate computational adaptive optics (CAO) and optimization of the computed pupil correction in 'sensorless mode' (Zernike polynomial corrections with feedback from image metrics) or with the use of 'guide-stars' in the sample. We discuss the concept of an 'isotomic volume' as the volumetric extension of the 'isoplanatic patch' introduced in astronomical AO. Recent CAO results and ongoing work are highlighted to point to the potential biomedical impact of computed broadband interferometric tomography. We also discuss the advantages and disadvantages of HAO vs. CAO for the effective shaping of optical wavefronts, and highlight opportunities for hybrid approaches that synergistically combine the unique advantages of hardware and computational methods for rapid volumetric tomography with cellular resolution.
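
    A minimal sketch of the core CAO operation, under simplifying assumptions (a synthetic complex field and a single defocus-like Zernike term on a normalized pupil): the correction is a phase-only multiplication in the Fourier domain, and in sensorless operation the coefficient would be tuned against an image metric.

```python
# Hypothetical CAO sketch: apply a conjugate Zernike-defocus phase to the
# Fourier spectrum of a complex en-face field. Field, pupil normalization and
# coefficient are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(3)
n = 256
field = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))

fx = np.fft.fftfreq(n)
FX, FY = np.meshgrid(fx, fx)
rho2 = (FX**2 + FY**2) / fx.max()**2        # normalized pupil radius squared

c_defocus = 1.5                             # guessed aberration coefficient
phase = c_defocus * (2 * rho2 - 1)          # Zernike defocus (Z_2^0 shape)

spectrum = np.fft.fft2(field)
corrected = np.fft.ifft2(spectrum * np.exp(-1j * phase))

# In sensorless CAO, c_defocus would be scanned or optimized to maximize an
# image sharpness metric such as the one below.
print("sharpness metric:", round(float(np.sum(np.abs(corrected)**4)), 1))
```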

  10. Parallel Adaptive Mesh Refinement for High-Order Finite-Volume Schemes in Computational Fluid Dynamics

    Science.gov (United States)

    Schwing, Alan Michael

    For computational fluid dynamics, the governing equations are solved on a discretized domain of nodes, faces, and cells. The quality of the grid or mesh can be a driving source for error in the results. While refinement studies can help guide the creation of a mesh, grid quality is largely determined by user expertise and understanding of the flow physics. Adaptive mesh refinement is a technique for enriching the mesh during a simulation based on metrics for error, impact on important parameters, or location of important flow features. This can offload from the user some of the difficult and ambiguous decisions necessary when discretizing the domain. This work explores the implementation of adaptive mesh refinement in an implicit, unstructured, finite-volume solver. Consideration is made for applying modern computational techniques in the presence of hanging nodes and refined cells. The approach is developed to be independent of the flow solver in order to provide a path for augmenting existing codes. It is designed to be applicable for unsteady simulations, and refinement and coarsening of the grid do not impact the conservatism of the underlying numerics. The effect on high-order numerical fluxes of fourth and sixth order is explored. Provided the criteria for refinement are appropriately selected, solutions obtained using adapted meshes have no additional error when compared to results obtained on traditional, unadapted meshes. In order to leverage large-scale computational resources common today, the methods are parallelized using MPI. Parallel performance is considered for several test problems in order to assess scalability of both adapted and unadapted grids. Dynamic repartitioning of the mesh during refinement is crucial for load balancing an evolving grid. Development of the methods outlined here depends on a dual-memory approach that is described in detail. Validation of the solver developed here against a number of motivating problems shows favorable

  11. Short-term electricity demand and gas price forecasts using wavelet transforms and adaptive models

    International Nuclear Information System (INIS)

    Nguyen, Hang T.; Nabney, Ian T.

    2010-01-01

    This paper presents some forecasting techniques for energy demand and price prediction, one day ahead. These techniques combine wavelet transform (WT) with fixed and adaptive machine learning/time series models (multi-layer perceptron (MLP), radial basis functions, linear regression, or GARCH). To create an adaptive model, we use an extended Kalman filter or particle filter to update the parameters continuously on the test set. The adaptive GARCH model is a new contribution, broadening the applicability of GARCH methods. We empirically compared two approaches of combining the WT with prediction models: multicomponent forecasts and direct forecasts. These techniques are applied to large sets of real data (both stationary and non-stationary) from the UK energy markets, so as to provide comparative results that are statistically stronger than those previously reported. The results showed that the forecasting accuracy is significantly improved by using the WT and adaptive models. The best models on the electricity demand/gas price forecast are the adaptive MLP/GARCH with the multicomponent forecast; their NMSEs are 0.02314 and 0.15384 respectively. (author)
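
    The adaptive ingredient described above, continuous parameter updating on the test set, can be illustrated in its simplest linear form: a one-step-ahead forecaster whose weights are tracked by a Kalman filter on synthetic drifting data. The paper applies extended Kalman and particle filters to nonlinear models such as the MLP and GARCH; the sketch below only conveys the idea.

```python
# Kalman-filter-updated linear forecaster: the weight vector is the filter
# state, so the model follows a slowly drifting series. All data and noise
# levels are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(4)
T, lags = 500, 3
true_w = np.array([0.5, 0.2, -0.1])
y = np.zeros(T)
for t in range(lags, T):
    drift = 0.1 * np.sin(t / 80)                 # slowly drifting dynamics
    y[t] = (true_w + drift) @ y[t - lags:t][::-1] + rng.normal(0, 0.1)

w = np.zeros(lags)                               # weight estimate
P = np.eye(lags)                                 # weight covariance
q, r = 1e-4, 0.1 ** 2                            # process / observation noise
sq_errs = []
for t in range(lags, T):
    x = y[t - lags:t][::-1]
    P += q * np.eye(lags)                        # predict step
    e = y[t] - w @ x                             # innovation (forecast error)
    k = P @ x / (x @ P @ x + r)                  # Kalman gain
    w += k * e                                   # online weight update
    P -= np.outer(k, x) @ P
    sq_errs.append(e ** 2)
print("mean squared one-step error:", round(float(np.mean(sq_errs)), 4))
```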

  12. Cross-cultural adaptation and validation of the Danish version of the Short Musculoskeletal Function Assessment questionnaire (SMFA).

    Science.gov (United States)

    Lindahl, Marianne; Andersen, Signe; Joergensen, Annette; Frandsen, Christian; Jensen, Liselotte; Benedikz, Eirikur

    2018-01-01

    The aim of this study was to translate and culturally adapt the Short Musculoskeletal Function Assessment (SMFA) into Danish (SMFA-DK) and assess the psychometric properties. SMFA was translated and cross-culturally adapted according to a standardized procedure. Minor changes in the wording of three items were made to adapt to Danish conditions. Acute patients (n = 201) and rehabilitation patients (n = 231) with musculoskeletal problems aged 18-87 years were included. The following analyses were made to evaluate the psychometric quality of SMFA-DK: reliability with Cronbach's alpha; content validity as coding according to the International Classification of Functioning, Disability and Health (ICF); floor/ceiling effects; construct validity as factor analysis; correlations between SMFA-DK and the Short Form 36 (SF-36); and the known-groups method. Responsiveness and effect size were calculated. Cronbach's alpha values were between 0.79 and 0.94. SMFA-DK captured all components of the ICF, and there were no floor/ceiling effects. Factor analysis demonstrated four subscales. SMFA-DK correlated well with the SF-36 subscales for the rehabilitation patients and less strongly for the newly injured patients. Effect sizes were excellent and better for SMFA-DK than for SF-36. The study indicates that SMFA-DK can be a valid and responsive measure of outcome in rehabilitation settings.

  13. Adaptation and validation of the short version WHOQOL-HIV in Ethiopia

    DEFF Research Database (Denmark)

    Tesfaye Woldeyohannes, Markos; Olsen, Mette Frahm; Medhin, Girmay

    2016-01-01

    BACKGROUND: Quality of life of patients is an important element in the evaluation of outcome of health care, social services and clinical trials. The WHOQOL instruments were originally developed for measurement of quality of life across cultures. However, there were concerns raised about the cross-cultural...... equivalence of the WHOQOL-HIV when used among people with HIV in Ethiopia. Therefore, this study aimed at adapting the WHOQOL-HIV bref for the Ethiopian setting. METHODS: A step-wise adaptation of the WHOQOL-HIV bref for use in Ethiopia was conducted to produce an Ethiopian version...... were recruited from HIV clinics. RESULTS: In the process of adaptation, new items of relevance to the context were added while seven items were deleted because of problems with acceptability and poor psychometric properties. The Cronbach's α for the final tool with twenty-seven items WHOQOL...

  14. Implementing Molecular Dynamics for Hybrid High Performance Computers - 1. Short Range Forces

    International Nuclear Information System (INIS)

    Brown, W. Michael; Wang, Peng; Plimpton, Steven J.; Tharrington, Arnold N.

    2011-01-01

    The use of accelerators such as general-purpose graphics processing units (GPGPUs) has become popular in scientific computing applications due to their low cost, impressive floating-point capabilities, high memory bandwidth, and low electrical power requirements. Hybrid high performance computers, machines with more than one type of floating-point processor, are now becoming more prevalent due to these advantages. In this work, we discuss several important issues in porting a large molecular dynamics code for use on parallel hybrid machines - (1) choosing a hybrid parallel decomposition that works on central processing units (CPUs) with distributed memory and accelerator cores with shared memory, (2) minimizing the amount of code that must be ported for efficient acceleration, (3) utilizing the available processing power from both many-core CPUs and accelerators, and (4) choosing a programming model for acceleration. We present our solution to each of these issues for short-range force calculation in the molecular dynamics package LAMMPS. We describe algorithms for efficient short range force calculation on hybrid high performance machines. We describe a new approach for dynamic load balancing of work between CPU and accelerator cores. We describe the Geryon library that allows a single code to compile with both CUDA and OpenCL for use on a variety of accelerators. Finally, we present results on a parallel test cluster containing 32 Fermi GPGPUs and 180 CPU cores.

  15. Computational Strategies for Dissecting the High-Dimensional Complexity of Adaptive Immune Repertoires

    Directory of Open Access Journals (Sweden)

    Enkelejda Miho

    2018-02-01

    The adaptive immune system recognizes antigens via an immense array of antigen-binding antibodies and T-cell receptors, the immune repertoire. The interrogation of immune repertoires is of high relevance for understanding the adaptive immune response in disease and infection (e.g., autoimmunity, cancer, HIV). Adaptive immune receptor repertoire sequencing (AIRR-seq) has driven the quantitative and molecular-level profiling of immune repertoires, thereby revealing the high-dimensional complexity of the immune receptor sequence landscape. Several methods for the computational and statistical analysis of large-scale AIRR-seq data have been developed to resolve immune repertoire complexity and to understand the dynamics of adaptive immunity. Here, we review the current research on (i) diversity, (ii) clustering and network, (iii) phylogenetic, and (iv) machine learning methods applied to dissect, quantify, and compare the architecture, evolution, and specificity of immune repertoires. We summarize outstanding questions in computational immunology and propose future directions for systems immunology toward coupling AIRR-seq with the computational discovery of immunotherapeutics, vaccines, and immunodiagnostics.
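
    As a concrete entry point to the diversity analyses mentioned under (i), the sketch below computes Shannon entropy and clonality from a synthetic clonotype abundance table; real AIRR-seq workflows operate on much richer annotated data.

```python
# Repertoire diversity sketch: Shannon entropy and clonality from clonotype
# counts. The heavy-tailed counts are synthetic stand-ins for an AIRR-seq
# clonotype table.
import numpy as np

rng = np.random.default_rng(5)
counts = rng.zipf(a=2.0, size=10_000)        # clone sizes, heavy-tailed
freqs = counts / counts.sum()

shannon = -np.sum(freqs * np.log(freqs))
richness = len(counts)
clonality = 1 - shannon / np.log(richness)   # 0 = perfectly even, 1 = monoclonal

print(f"clonotypes: {richness}, Shannon H: {shannon:.2f}, clonality: {clonality:.3f}")
```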

  16. Computer intervention impact on psychosocial adaptation of rural women with chronic conditions.

    Science.gov (United States)

    Weinert, Clarann; Cudney, Shirley; Comstock, Bryan; Bansal, Aasthaa

    2011-01-01

    Adapting to living with chronic conditions is a life-long psychosocial challenge. The purpose of this study was to report the effect of a computer intervention on the psychosocial adaptation of rural women with chronic conditions. A two-group study design was used with 309 middle-aged, rural women who had chronic conditions, randomized into either a computer-based intervention or a control group. Data were collected at baseline, at the end of the intervention, and 6 months later on the psychosocial indicators of social support, self-esteem, acceptance of illness, stress, depression, and loneliness. The impact of the computer-based intervention was statistically significant for five of six of the psychosocial outcomes measured, with a modest impact on social support. The largest benefits were seen in depression, stress, and acceptance. The women-to-women intervention resulted in positive psychosocial responses that have the potential to contribute to successful management of illness and adaptation. Other components of adaptation to be examined are the impact of the intervention on illness management and quality of life and the interrelationships among environmental stimuli, psychosocial response, and illness management.

  17. Adult zebrafish intestine resection: a novel model of short bowel syndrome, adaptation, and intestinal stem cell regeneration.

    Science.gov (United States)

    Schall, K A; Holoyda, K A; Grant, C N; Levin, D E; Torres, E R; Maxwell, A; Pollack, H A; Moats, R A; Frey, M R; Darehzereshki, A; Al Alam, D; Lien, C; Grikscheit, T C

    2015-08-01

    Loss of significant intestinal length from congenital anomaly or disease may lead to short bowel syndrome (SBS); intestinal failure may be partially offset by a gain in epithelial surface area, termed adaptation. Current in vivo models of SBS are costly and technically challenging. Operative times and survival rates have slowed extension to transgenic models. We created a new reproducible in vivo model of SBS in zebrafish, a tractable vertebrate model, to facilitate investigation of the mechanisms of intestinal adaptation. Proximal intestinal diversion at segment 1 (S1, equivalent to jejunum) was performed in adult male zebrafish. SBS fish emptied distal intestinal contents via stoma as in the human disease. After 2 wk, S1 was dilated compared with controls and villus ridges had increased complexity, contributing to greater villus epithelial perimeter. The number of intervillus pockets, the intestinal stem cell zone of the zebrafish increased and contained a higher number of bromodeoxyuridine (BrdU)-labeled cells after 2 wk of SBS. Egf receptor and a subset of its ligands, also drivers of adaptation, were upregulated in SBS fish. Igf has been reported as a driver of intestinal adaptation in other animal models, and SBS fish exposed to a pharmacological inhibitor of the Igf receptor failed to demonstrate signs of intestinal adaptation, such as increased inner epithelial perimeter and BrdU incorporation. We describe a technically feasible model of human SBS in the zebrafish, a faster and less expensive tool to investigate intestinal stem cell plasticity as well as the mechanisms that drive intestinal adaptation. Copyright © 2015 the American Physiological Society.

  18. Are We Measuring Teachers’ Attitudes towards Computers in Detail?: Adaptation of a Questionnaire into Turkish Culture

    Directory of Open Access Journals (Sweden)

    Nilgün Günbaş

    2017-04-01

    Teachers’ perceptions of computers play an important role in integrating computers into education. The related literature includes studies developing or adapting a survey instrument in Turkish culture measuring teachers’ attitudes toward computers. These instruments have three to four factors (e.g., computer importance, computer enjoyment, computer confidence) and 18 to 26 items under these factors. The purpose of the present study is to adapt a more detailed and stronger survey questionnaire measuring more dimensions related to teachers’ attitudes. The source instrument was developed by Christensen and Knezek (2009) and is called Teachers’ Attitudes toward Computers (TAC). It has nine factors with 51 items. Before testing the instrument, the interaction (e-mail) factor was taken out because of the cultural differences. The reliability and validity testing of the translated instrument was completed with 273 teacher candidates in a Faculty of Education in Turkey. The results showed that the translated instrument (Cronbach’s Alpha: .94) included eight factors and consisted of 42 items under these factors, which were consistent with the original instrument. These factors were: Interest (α: .83), Comfort (α: .90), Accommodation (α: .87), Concern (α: .79), Utility (α: .90), Perception (α: .89), Absorption (α: .84), and Significance (α: .83). Additionally, the confirmatory factor analysis result for the model with eight factors was: RMSEA=0.050, χ2/df=1.69, RMR=0.075, SRMR=0.057, GFI=0.81, AGFI=0.78, NFI=0.94, NNFI=0.97, CFI=0.97, IFI=0.97. Accordingly, as a reliable, valid and stronger instrument, the adapted survey instrument can be suggested for use in Turkish academic studies.

  19. Non-adaptive measurement-based quantum computation and multi-party Bell inequalities

    International Nuclear Information System (INIS)

    Hoban, Matty J; Campbell, Earl T; Browne, Dan E; Loukopoulos, Klearchos

    2011-01-01

    Quantum correlations exhibit behaviour that cannot be resolved with a local hidden variable picture of the world. In quantum information, they are also used as resources for information processing tasks, such as measurement-based quantum computation (MQC). In MQC, universal quantum computation can be achieved via adaptive measurements on a suitable entangled resource state. In this paper, we look at a version of MQC in which we remove the adaptivity of measurements and aim to understand what computational abilities remain in the resource. We show that there are explicit connections between this model of computation and the question of non-classicality in quantum correlations. We demonstrate this by focusing on deterministic computation of Boolean functions, in which natural generalizations of the Greenberger-Horne-Zeilinger paradox emerge; we then explore probabilistic computation, via which multipartite Bell inequalities can be defined. We use this correspondence to define families of multi-party Bell inequalities, which we show to have a number of interesting contrasting properties.

  20. Non-adaptive measurement-based quantum computation and multi-party Bell inequalities

    Energy Technology Data Exchange (ETDEWEB)

    Hoban, Matty J; Campbell, Earl T; Browne, Dan E [Department of Physics and Astronomy, University College London, Gower Street, London WC1E 6BT (United Kingdom); Loukopoulos, Klearchos, E-mail: m.hoban@ucl.ac.uk [Department of Materials, Oxford University, Parks Road, Oxford OX1 4PH (United Kingdom)

    2011-02-15

    Quantum correlations exhibit behaviour that cannot be resolved with a local hidden variable picture of the world. In quantum information, they are also used as resources for information processing tasks, such as measurement-based quantum computation (MQC). In MQC, universal quantum computation can be achieved via adaptive measurements on a suitable entangled resource state. In this paper, we look at a version of MQC in which we remove the adaptivity of measurements and aim to understand what computational abilities remain in the resource. We show that there are explicit connections between this model of computation and the question of non-classicality in quantum correlations. We demonstrate this by focusing on deterministic computation of Boolean functions, in which natural generalizations of the Greenberger-Horne-Zeilinger paradox emerge; we then explore probabilistic computation, via which multipartite Bell inequalities can be defined. We use this correspondence to define families of multi-party Bell inequalities, which we show to have a number of interesting contrasting properties.

  1. Towards Static Analysis of Policy-Based Self-adaptive Computing Systems

    DEFF Research Database (Denmark)

    Margheri, Andrea; Nielson, Hanne Riis; Nielson, Flemming

    2016-01-01

    For supporting the design of self-adaptive computing systems, the PSCEL language offers a principled approach that relies on declarative definitions of adaptation and authorisation policies enforced at runtime. Policies permit managing system components by regulating their interactions...... and by dynamically introducing new actions to accomplish task-oriented goals. However, the runtime evaluation of policies and their effects on system components make the prediction of system behaviour challenging. In this paper, we introduce the construction of a flow graph that statically points out the policy...... evaluations that can take place at runtime and exploit it to analyse the effects of policy evaluations on the progress of system components....

  2. Effect of IGF-rich colostrum on bowel adaptation in neonatal piglets with short bowel syndrome

    NARCIS (Netherlands)

    Heemskerk, V. H.; van Heurn, L. W. E.; Farla, P.; Buurman, W. A.; Piersma, F.; ter Riet, G.; Heineman, E.

    2002-01-01

    BACKGROUND: Insulin-like growth factor 1 (IGF-1), a polypeptide growth factor with mitogenic effects on intestinal epithelial crypt cells occurs naturally in high concentrations in colostrum. The hypothesis for this study was that colostrum rich in IGF-1 could promote small bowel adaptation in

  3. A Combined Methodology of Adaptive Neuro-Fuzzy Inference System and Genetic Algorithm for Short-term Energy Forecasting

    Directory of Open Access Journals (Sweden)

    KAMPOUROPOULOS, K.

    2014-02-01

    This document presents an energy forecast methodology using an Adaptive Neuro-Fuzzy Inference System (ANFIS) and Genetic Algorithms (GA). The GA has been used for the selection of the training inputs of the ANFIS in order to minimize the training error. The presented algorithm has been installed and has been operating in an automotive manufacturing plant. It periodically communicates with the plant to obtain new information and update the database in order to improve its training results. Finally, the results of the algorithm are used to provide a short-term load forecast for the different modeled consumption processes.

  4. A systems biology analysis of long and short-term memories of osmotic stress adaptation in fungi

    Directory of Open Access Journals (Sweden)

    You Tao

    2012-05-01

    Background Saccharomyces cerevisiae senses hyperosmotic conditions via the HOG signaling network that activates the stress-activated protein kinase, Hog1, and modulates metabolic fluxes and gene expression to generate appropriate adaptive responses. The integral control mechanism by which Hog1 modulates glycerol production remains uncharacterized. An additional Hog1-independent mechanism retains intracellular glycerol for adaptation. Candida albicans also adapts to hyperosmolarity via a HOG signaling network. However, it remains unknown whether Hog1 exerts integral or proportional control over glycerol production in C. albicans. Results We combined modeling and experimental approaches to study osmotic stress responses in S. cerevisiae and C. albicans. We propose a simple ordinary differential equation (ODE) model that highlights the integral control that Hog1 exerts over glycerol biosynthesis in these species. If integral control arises from a separation of time scales (i.e., rapid HOG activation of glycerol production capacity that decays slowly under hyperosmotic conditions), then the model predicts that glycerol production rates elevate upon adaptation to a first stress, and this makes the cell adapt faster to a second hyperosmotic stress. It appears as if the cell is able to remember a stress history longer than the timescale of signal transduction. This is termed the long-term stress memory. Our experimental data verify this. Like S. cerevisiae, C. albicans minimizes glycerol efflux during adaptation to hyperosmolarity. Also, transient activation of intermediate kinases in the HOG pathway results in a short-term memory in the signaling pathway. This determines the amplitude of Hog1 phosphorylation under a periodic sequence of stress and non-stressed intervals. Our model suggests that the long-term memory also affects the way a cell responds to periodic stress conditions. Hence, during osmohomeostasis, short-term memory is
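
    The integral-control argument can be made concrete with a deliberately minimal ODE toy (not the paper's model): Hog1 activity tracks the osmotic error, and glycerol integrates the Hog1 output, so the error is driven to zero after a hyperosmotic step.

```python
# Toy integral-feedback sketch of osmoadaptation. The equations and rate
# constants are illustrative assumptions, not the published model.
from scipy.integrate import solve_ivp

def hog(t, state, osmolarity):
    hog1, glycerol = state
    error = osmolarity(t) - glycerol        # unbalanced osmotic pressure
    d_hog1 = 5.0 * error - 1.0 * hog1       # fast activation, first-order decay
    d_glyc = 0.5 * max(hog1, 0.0)           # Hog1 drives glycerol accumulation
    return [d_hog1, d_glyc]

shock = lambda t: 1.0 if t >= 5 else 0.0    # hyperosmotic step at t = 5
sol = solve_ivp(hog, (0, 60), [0.0, 0.0], args=(shock,), max_step=0.05)

# After adaptation glycerol matches the shock and Hog1 returns to baseline,
# the signature of integral feedback.
print(f"final glycerol: {sol.y[1, -1]:.3f}, final Hog1: {sol.y[0, -1]:.3f}")
```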

  5. Cephalopods as Predators: A Short Journey among Behavioral Flexibilities, Adaptions, and Feeding Habits

    OpenAIRE

    Villanueva, Roger; Perricone, Valentina; Fiorito, Graziano

    2017-01-01

    The diversity of cephalopod species and the differences in morphology and the habitats in which they live, illustrates the ability of this class of molluscs to adapt to all marine environments, demonstrating a wide spectrum of patterns to search, detect, select, capture, handle, and kill prey. Photo-, mechano-, and chemoreceptors provide tools for the acquisition of information about their potential preys. The use of vision to detect prey and high attack speed seem to be a predominant pattern...

  6. Whatever after next? adaptive predictions based on short- and long-term memory in visual search

    OpenAIRE

    Conci, M.; Zellin, M.; Muller, Hermann J.

    2012-01-01

    Generating predictions for task-relevant goals is a fundamental requirement of human information processing, as it ensures adaptive success in our complex natural environment. Clark (in press) proposed a model of hierarchical predictive processing, in which perception, attention, and learning are unified within a coherent framework. In this view, incoming sensory signals are constantly matched with top-down expectations or predictions, with the aim of minimizing the prediction error to genera...

  7. Slice image pretreatment for cone-beam computed tomography based on adaptive filter

    International Nuclear Information System (INIS)

    Huang Kuidong; Zhang Dinghua; Jin Yanfang

    2009-01-01

    According to the noise properties and the serial slice image characteristics of a Cone-Beam Computed Tomography (CBCT) system, a slice image pretreatment for CBCT based on adaptive filtering is proposed. First, a criterion for judging the noise type is established. All pixels are classified into two classes: an adaptive center weighted modified trimmed mean (ACWMTM) filter is used for pixels corrupted by Gaussian noise, and an adaptive median (AM) filter is used for pixels corrupted by impulse noise. In the ACWMTM filtering algorithm, the Gaussian noise standard deviation estimated in an offset window of the current slice image is replaced by the standard deviation estimated in the corresponding window of the adjacent slice image, which improves the filtering accuracy for serial images. A pretreatment experiment on CBCT slice images of a wax model of a hollow turbine blade shows that the method performs well both at eliminating noise and at preserving details. (authors)
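
    A simplified two-branch filter in the spirit of this pretreatment might look as follows, with an ordinary trimmed mean standing in for the ACWMTM filter and a local min/max test standing in for the noise-judging criterion; window size and thresholds are illustrative.

```python
# Two-branch adaptive filtering sketch: suspected impulse pixels get a median
# filter, the rest get a trimmed mean. Simplified stand-in, not the ACWMTM/AM
# pipeline of the paper.
import numpy as np
from scipy.ndimage import generic_filter, median_filter

def trimmed_mean(window):
    w = np.sort(window)
    k = len(w) // 5                          # trim 20% from each tail
    return w[k:len(w) - k].mean()

rng = np.random.default_rng(6)
img = rng.normal(100, 5, (64, 64))           # Gaussian-noise background
salt = rng.random(img.shape) < 0.02
img[salt] = 255                              # impulse corruption

med = median_filter(img, size=3)
tmean = generic_filter(img, trimmed_mean, size=3)

# Crude impulse test: pixel equals the min or max of its 3x3 neighborhood.
lo = generic_filter(img, np.min, size=3)
hi = generic_filter(img, np.max, size=3)
impulse = (img == lo) | (img == hi)

cleaned = np.where(impulse, med, tmean)
print("pixels routed to the median branch:", int(impulse.sum()))
```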

  8. A short note on the use of the red-black tree in Cartesian adaptive mesh refinement algorithms

    Science.gov (United States)

    Hasbestan, Jaber J.; Senocak, Inanc

    2017-12-01

    Mesh adaptivity is an indispensable capability to tackle multiphysics problems with large disparity in time and length scales. With the availability of powerful supercomputers, there is a pressing need to extend time-proven computational techniques to extreme-scale problems. Cartesian adaptive mesh refinement (AMR) is one such method that enables simulation of multiscale, multiphysics problems. AMR is based on construction of octrees. Originally, an explicit tree data structure was used to generate and manipulate an adaptive Cartesian mesh. At least eight pointers are required in an explicit approach to construct an octree. Parent-child relationships are then used to traverse the tree. An explicit octree, however, is expensive in terms of memory usage and the time it takes to traverse the tree to access a specific node. For these reasons, implicit pointerless methods have been pioneered within the computer graphics community, motivated by applications requiring interactivity and realistic three dimensional visualization. Lewiner et al. [1] provide a concise review of pointerless approaches to generate an octree. The hash table and the Z-order curve are two key concepts in pointerless methods, which we briefly discuss next.
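
    A pointerless octree of the kind referenced here fits in a few lines: nodes are keyed by refinement level plus the Morton (Z-order) code obtained by interleaving coordinate bits, and a hash table replaces the eight child pointers. The payload and helper functions below are illustrative.

```python
# Pointerless octree sketch: Morton (Z-order) codes as hash-table keys.
def morton3(x: int, y: int, z: int, bits: int = 10) -> int:
    code = 0
    for i in range(bits):                    # interleave one bit per axis
        code |= ((x >> i) & 1) << (3 * i)
        code |= ((y >> i) & 1) << (3 * i + 1)
        code |= ((z >> i) & 1) << (3 * i + 2)
    return code

octree = {}                                  # (level, morton code) -> payload
octree[(4, morton3(5, 9, 2))] = "leaf data"

# Parent/child arithmetic replaces explicit pointers: one refinement level
# corresponds to three bits of the Morton code.
def parent(level, code):
    return (level - 1, code >> 3)

def children(level, code):
    return [(level + 1, (code << 3) | k) for k in range(8)]

key = (4, morton3(5, 9, 2))
print(key in octree)                         # True: O(1) hash lookup
print(parent(*key), len(children(*key)))     # parent key and 8 child keys
```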

  9. The self-adaptation to dynamic failures for efficient virtual organization formations in grid computing context

    International Nuclear Information System (INIS)

    Han Liangxiu

    2009-01-01

    Grid computing aims to enable 'resource sharing and coordinated problem solving in dynamic, multi-institutional virtual organizations (VOs)'. However, owing to the heterogeneous and dynamic nature of the resources, dynamic failures occur more frequently in a distributed grid environment than in traditional computing platforms, causing VO formations to fail. In this paper, we develop a novel mechanism for self-adaptation to dynamic failures during VO formation. The self-adaptive scheme allows an individual member of a VO to automatically find another available or replaceable member once a failure happens, and therefore lets the system recover automatically from dynamic failures. We define the dynamic failure situations of a system using two standard indicators: mean time between failures (MTBF) and mean time to recover (MTTR). We model both MTBF and MTTR as Poisson distributions. We investigate and analyze the efficiency of the proposed self-adaptation mechanism by comparing the success probability of VO formations before and after adopting it in three different cases: (1) different failure situations; (2) different organizational structures and scales; (3) different task complexities. The experimental results show that the proposed scheme can automatically adapt to dynamic failures and effectively improve VO formation performance in the event of node failures, which provides a valuable addition to the field.
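
    Under an exponential up-time/repair-time reading of the MTBF/MTTR indicators, a back-of-envelope estimate of VO formation success is node availability raised to the number of members; the Monte Carlo sketch below checks that estimate with illustrative numbers (it is not the paper's experimental setup).

```python
# Availability-based sketch: steady-state availability of a node alternating
# exponential up times (mean MTBF) and repairs (mean MTTR) is
# MTBF / (MTBF + MTTR); a VO of m members forms if all are up simultaneously.
import numpy as np

rng = np.random.default_rng(7)
mtbf, mttr, m = 100.0, 5.0, 12               # hours; VO of 12 members
availability = mtbf / (mtbf + mttr)

trials = 100_000
up = rng.random((trials, m)) < availability  # node states at formation time
simulated = up.all(axis=1).mean()

print(f"analytic: {availability**m:.4f}, simulated: {simulated:.4f}")
```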

  10. Adaptive Remodeling of Achilles Tendon: A Multi-scale Computational Model.

    Directory of Open Access Journals (Sweden)

    Stuart R Young

    2016-09-01

    While it is known that musculotendon units adapt to their load environments, there is only a limited understanding of tendon adaptation in vivo. Here we develop a computational model of tendon remodeling based on the premise that mechanical damage and tenocyte-mediated tendon damage and repair processes modify the distribution of its collagen fiber lengths. We explain how these processes enable the tendon to geometrically adapt to its load conditions. Based on known biological processes, mechanical and strain-dependent proteolytic fiber damage are incorporated into our tendon model. Using a stochastic model of fiber repair, it is assumed that mechanically damaged fibers are repaired longer, whereas proteolytically damaged fibers are repaired shorter, relative to their pre-damage length. To study adaptation of tendon properties to applied load, our model musculotendon unit is a simplified three-component Hill-type model of the human Achilles-soleus unit. Our model results demonstrate that the geometric equilibrium state of the Achilles tendon can coincide with minimization of the total metabolic cost of muscle activation. The proposed tendon model independently predicts rates of collagen fiber turnover that are in general agreement with in vivo experimental measurements. While the computational model here only represents a first step in a new approach to understanding the complex process of tendon remodeling in vivo, given these findings, it appears likely that the proposed framework may itself provide a useful theoretical foundation for developing valuable qualitative and quantitative insights into tendon physiology and pathology.

  11. Increased performance in the short-term water demand forecasting through the use of a parallel adaptive weighting strategy

    Science.gov (United States)

    Sardinha-Lourenço, A.; Andrade-Campos, A.; Antunes, A.; Oliveira, M. S.

    2018-03-01

    Recent research on short-term water demand forecasting has shown that models using univariate time series based on historical data are useful and can be combined with other prediction methods to reduce errors. Water demand in drinking water distribution networks is largely repetitive in nature and, under similar meteorological conditions and consumer profiles, allows the development of a heuristic forecast model that, in turn, combined with other autoregressive models, can provide reliable forecasts. In this study, a parallel adaptive weighting strategy for forecasting water consumption over the next 24-48 h, using univariate time series of potable water consumption, is proposed. Two Portuguese potable water distribution networks are used as case studies where the only input data are the consumption of water and the national calendar. For the development of the strategy, the Autoregressive Integrated Moving Average (ARIMA) method and a short-term forecast heuristic algorithm are used. Simulations with the model showed that, when using a parallel adaptive weighting strategy, the prediction error can be reduced by 15.96% and the average error by 9.20%. This reduction is important in the control and management of water supply systems. The proposed methodology can be extended to other forecast methods, especially when it comes to the availability of multiple forecast models.
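
    A generic instance of parallel adaptive weighting is sketched below, with two placeholder forecasters (persistence and a daily-seasonal naive model) blended by weights proportional to the inverse of each model's recent squared error; the paper's actual ARIMA-plus-heuristic combination is more elaborate.

```python
# Adaptive weighting sketch: two forecasters run in parallel and are blended
# with inverse-recent-error weights. Forecasters, window and data are
# illustrative placeholders.
import numpy as np

rng = np.random.default_rng(8)
T = 400
t = np.arange(T)
demand = 50 + 10 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 2, T)

def persistence(history):
    return history[-1]                       # repeat the last value

def seasonal_naive(history):
    return history[-24]                      # repeat yesterday's same hour

window, eps = 48, 1e-9
err_a, err_b, blended = [], [], []
for i in range(24, T - 1):
    fa = persistence(demand[:i + 1])
    fb = seasonal_naive(demand[:i + 1])
    inv_a = 1 / (np.mean(err_a[-window:]) + eps) if err_a else 1.0
    inv_b = 1 / (np.mean(err_b[-window:]) + eps) if err_b else 1.0
    w_a = inv_a / (inv_a + inv_b)            # adaptive weight for model A
    blended.append(w_a * fa + (1 - w_a) * fb)
    err_a.append((demand[i + 1] - fa) ** 2)
    err_b.append((demand[i + 1] - fb) ** 2)

rmse = np.sqrt(np.mean((np.array(blended) - demand[25:]) ** 2))
print("blended one-step RMSE:", round(float(rmse), 3))
```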

  12. Web-based computer adaptive assessment of individual perceptions of job satisfaction for hospital workplace employees.

    Science.gov (United States)

    Chien, Tsair-Wei; Lai, Wen-Pin; Lu, Chih-Wei; Wang, Weng-Chung; Chen, Shih-Chung; Wang, Hsien-Yi; Su, Shih-Bin

    2011-04-17

    To develop a web-based computer adaptive testing (CAT) application for efficiently collecting data regarding workers' perceptions of job satisfaction, we examined whether a 37-item Job Content Questionnaire (JCQ-37) could evaluate the job satisfaction of individual employees as a single construct. The JCQ-37 makes data collection via CAT on the internet easy, viable and fast. A Rasch rating scale model was applied to analyze data from 300 randomly selected hospital employees who participated in job-satisfaction surveys in 2008 and 2009 via non-adaptive and computer-adaptive testing, respectively. Of the 37 items on the questionnaire, 24 items fit the model fairly well. Person-separation reliability for the 2008 surveys was 0.88. Measures from both years, as well as item-8 job satisfaction across groups, were successfully evaluated through item-by-item t-test analyses. Workers aged 26 - 35 felt that job satisfaction was significantly worse in 2009 than in 2008. A Web-CAT developed in the present paper was shown to be more efficient than traditional computer-based or pen-and-paper assessments at collecting data regarding workers' perceptions of job content.
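
    The CAT mechanics behind such a survey can be sketched under a Rasch model: administer the item with maximal Fisher information at the current ability estimate, then update the estimate, here by Newton-Raphson with a weak normal prior so that early estimates stay finite. Item difficulties and the simulated respondent below are synthetic.

```python
# Rasch-model CAT loop sketch: most-informative item selection plus MAP
# ability updates. Item bank and respondent are synthetic assumptions.
import numpy as np

rng = np.random.default_rng(9)
difficulty = rng.normal(0, 1, 24)                # calibrated item bank
true_theta = 0.7                                 # simulated respondent

def p_correct(theta, b):
    return 1.0 / (1.0 + np.exp(-(theta - b)))    # Rasch response function

theta, asked, responses = 0.0, [], []
for _ in range(10):                              # administer 10 items
    p_all = p_correct(theta, difficulty)
    info = p_all * (1 - p_all)                   # Fisher information per item
    info[asked] = -np.inf                        # never reuse an item
    j = int(np.argmax(info))
    asked.append(j)
    responses.append(rng.random() < p_correct(true_theta, difficulty[j]))
    for _ in range(20):                          # Newton-Raphson MAP update
        p = p_correct(theta, difficulty[asked])
        grad = np.sum(np.array(responses) - p) - theta   # N(0,1) prior term
        hess = -np.sum(p * (1 - p)) - 1.0
        theta -= grad / hess

print(f"estimated ability: {theta:.2f} (true value {true_theta})")
```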

  13. Web-based computer adaptive assessment of individual perceptions of job satisfaction for hospital workplace employees

    Directory of Open Access Journals (Sweden)

    Chen Shih-Chung

    2011-04-01

    Background To develop a web-based computer adaptive testing (CAT) application for efficiently collecting data regarding workers' perceptions of job satisfaction, we examined whether a 37-item Job Content Questionnaire (JCQ-37) could evaluate the job satisfaction of individual employees as a single construct. Methods The JCQ-37 makes data collection via CAT on the internet easy, viable and fast. A Rasch rating scale model was applied to analyze data from 300 randomly selected hospital employees who participated in job-satisfaction surveys in 2008 and 2009 via non-adaptive and computer-adaptive testing, respectively. Results Of the 37 items on the questionnaire, 24 items fit the model fairly well. Person-separation reliability for the 2008 surveys was 0.88. Measures from both years, as well as item-8 job satisfaction across groups, were successfully evaluated through item-by-item t-test analyses. Workers aged 26 - 35 felt that job satisfaction was significantly worse in 2009 than in 2008. Conclusions A Web-CAT developed in the present paper was shown to be more efficient than traditional computer-based or pen-and-paper assessments at collecting data regarding workers' perceptions of job content.

  14. Arabic validation of the Urogenital Distress Inventory and Adapted Incontinence Impact Questionnaires--short forms.

    Science.gov (United States)

    El-Azab, Ahmed S; Mascha, Edward J

    2009-01-01

    The purpose of this study was to adapt the IIQ-7 to suit the Egyptian culture and then to assess validity and reliability of the adapted and translated IIQ-7 and UDI-6. The IIQ-7 was modified to suit Egyptian culture. Linguistic validation of the two questionnaires was performed. Initial test-retest reliability and internal consistency of the adapted and translated questionnaires were assessed in a pilot study. The final validity, test-retest reliability and internal consistency study included 204 women with urinary incontinence (UI). Participants completed the two questionnaires at enrollment and after 2 weeks. All participants underwent urodynamics. Baseline urodynamic diagnosis was compared with diagnoses made by questionnaires to assess validity. Test-retest reliability was excellent for both the IIQ-7 and UDI-6. For the UDI-6, the mean difference (SD) between first and second visits was -1.63 (7.0), and the 95% CI for the mean difference was -2.6 and -0.68. The 95% limits of agreement were -15.3 and 12.0. Lin's concordance correlation coefficient (LCCC) (95% CI) for the UDI was 0.89 (0.85 and 0.91). For the IIQ-7, the mean difference (SD) was 0.37 (7.1), and the 95% CI for the mean difference was -0.60 and 1.3. The 95% limits of agreement were -13.5 and 14.2. LCCC (95% CI) for the IIQ was 0.90 (0.87 and 0.92). Internal consistency as assessed using Cronbach's alpha was 0.32 and 0.31 for the UDI-6 and IIQ-7, respectively. Validity assessments indicated that both IIQ and UDI scales can distinguish objective disease states. UDI-6 and the modified IIQ-7 are easy to administer, test-retest reliable, and valid questionnaires, with relatively low internal consistency. (c) 2008 Wiley-Liss, Inc.

  15. Reduced short term adaptation to robot generated dynamic environment in children affected by Cerebral Palsy.

    Science.gov (United States)

    Masia, Lorenzo; Frascarelli, Flaminia; Morasso, Pietro; Di Rosa, Giuseppe; Petrarca, Maurizio; Castelli, Enrico; Cappa, Paolo

    2011-05-21

    It is known that healthy adults can quickly adapt to a novel dynamic environment, generated by a robotic manipulandum as a structured disturbing force field. We suggest that it may be of clinical interest to evaluate to which extent this kind of motor learning capability is impaired in children affected by cerebral palsy. We adapted the protocol already used with adults, which employs a velocity-dependent viscous field, and compared the performance of a group of subjects affected by Cerebral Palsy (CP group, 7 subjects) with a Control group of unimpaired age-matched children. The protocol included a familiarization phase (FA), during which no force was applied, a force field adaptation phase (CF), and a wash-out phase (WO) in which the field was removed. During the CF phase the field was shut down in a number of randomly selected "catch" trials, which were used in order to evaluate the "learning index" for each single subject and the two groups. Lateral deviation, speed and acceleration peaks and average speed were evaluated for each trajectory; a directional analysis was performed in order to inspect the role of the limb's inertial anisotropy in the different experimental phases. During the FA phase the movements of the CP subjects were more curved, displaying greater and variable directional error; over the course of the CF phase both groups showed a decreasing trend in the lateral error and an after-effect at the beginning of the wash-out, but the CP group had a non-significant adaptation rate and a lower learning index, suggesting that CP subjects have a reduced ability to learn to compensate for external forces. Moreover, a directional analysis of trajectories confirms that the control group is able to better predict the force field by tuning the kinematic features of the movements along different directions in order to account for the inertial anisotropy of the arm. Spatial abnormalities in children affected by cerebral palsy may be related not only to disturbance in

  16. Reduced short term adaptation to robot generated dynamic environment in children affected by Cerebral Palsy

    Directory of Open Access Journals (Sweden)

    Di Rosa Giuseppe

    2011-05-01

    Background It is known that healthy adults can quickly adapt to a novel dynamic environment, generated by a robotic manipulandum as a structured disturbing force field. We suggest that it may be of clinical interest to evaluate to which extent this kind of motor learning capability is impaired in children affected by cerebral palsy. Methods We adapted the protocol already used with adults, which employs a velocity-dependent viscous field, and compared the performance of a group of subjects affected by Cerebral Palsy (CP group, 7 subjects) with a Control group of unimpaired age-matched children. The protocol included a familiarization phase (FA), during which no force was applied, a force field adaptation phase (CF), and a wash-out phase (WO) in which the field was removed. During the CF phase the field was shut down in a number of randomly selected "catch" trials, which were used in order to evaluate the "learning index" for each single subject and the two groups. Lateral deviation, speed and acceleration peaks and average speed were evaluated for each trajectory; a directional analysis was performed in order to inspect the role of the limb's inertial anisotropy in the different experimental phases. Results During the FA phase the movements of the CP subjects were more curved, displaying greater and variable directional error; over the course of the CF phase both groups showed a decreasing trend in the lateral error and an after-effect at the beginning of the wash-out, but the CP group had a non-significant adaptation rate and a lower learning index, suggesting that CP subjects have a reduced ability to learn to compensate for external forces. Moreover, a directional analysis of trajectories confirms that the control group is able to better predict the force field by tuning the kinematic features of the movements along different directions in order to account for the inertial anisotropy of the arm. Conclusions Spatial abnormalities in children affected

  17. Adaptive tight frame based medical image reconstruction: a proof-of-concept study for computed tomography

    International Nuclear Information System (INIS)

    Zhou, Weifeng; Cai, Jian-Feng; Gao, Hao

    2013-01-01

    A popular approach for medical image reconstruction has been through sparsity regularization, assuming the targeted image can be well approximated by sparse coefficients under some properly designed system. The wavelet tight frame is such a widely used system, owing to its capability for sparsely approximating piecewise-smooth functions such as medical images. However, using a fixed system may not always be optimal for reconstructing a variety of diversified images. Recently, methods based on adaptive over-complete dictionaries that are specific to the structures of the targeted images have demonstrated their superiority for image processing. This work develops an adaptive wavelet tight frame method for image reconstruction. The proposed scheme first constructs an adaptive wavelet tight frame that is task specific, and then reconstructs the image of interest by solving an ℓ1-regularized minimization problem using the constructed adaptive tight frame system. A proof-of-concept study is performed for computed tomography (CT), and the simulation results suggest that the adaptive tight frame method improves the reconstructed CT image quality over the traditional tight frame method. (paper)
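
    For the ℓ1-regularized minimization step, a minimal iterative soft-thresholding (ISTA) sketch is given below, with an orthonormal DCT standing in for the learned tight frame and a random matrix standing in for the CT system; all constants and data are illustrative.

```python
# ISTA sketch for  min_x 0.5*||A x - y||^2 + lam*||W x||_1  with an
# orthonormal DCT as W. A, y and lam are synthetic placeholders for the CT
# system and the learned tight frame.
import numpy as np
from scipy.fft import dct, idct

rng = np.random.default_rng(10)
n, m = 128, 64
c_true = np.zeros(n)
c_true[[3, 10, 25]] = [4.0, -2.0, 3.0]          # sparse frame coefficients
x_true = idct(c_true, norm='ortho')
A = rng.normal(size=(m, n)) / np.sqrt(m)        # underdetermined "projection"
y = A @ x_true + rng.normal(0, 0.01, m)

W = lambda v: dct(v, norm='ortho')              # analysis transform
Wt = lambda c: idct(c, norm='ortho')            # adjoint / inverse

lam = 0.01
L = np.linalg.norm(A, 2) ** 2                   # Lipschitz constant of grad
x = np.zeros(n)
for _ in range(300):
    grad = A.T @ (A @ x - y)
    z = W(x - grad / L)                         # gradient step, coef. domain
    z = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    x = Wt(z)

rel_err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
print("relative error:", round(float(rel_err), 3))
```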

  18. Supporting Student Learning in Computer Science Education via the Adaptive Learning Environment ALMA

    Directory of Open Access Journals (Sweden)

    Alexandra Gasparinatou

    2015-10-01

    This study presents the ALMA environment (Adaptive Learning Models from texts and Activities). ALMA supports the processes of learning and assessment via: (1) texts differing in local and global cohesion for students with low, medium, and high background knowledge; (2) activities corresponding to different levels of comprehension which prompt the student to practically implement different text-reading strategies, with the recommended activity sequence adapted to the student’s learning style; (3) an overall framework for informing, guiding, and supporting students in performing the activities; and (4) individualized support and guidance according to student-specific characteristics. ALMA also supports students in distance learning or in blended learning, in which face-to-face learning is supported by computer technology. The adaptive techniques provided via ALMA are: (a) adaptive presentation and (b) adaptive navigation. Digital learning material, in accordance with the text comprehension model described by Kintsch, was introduced into the ALMA environment. This material can be exploited in either distance or blended learning.

  19. Adaptive Radiotherapy Planning on Decreasing Gross Tumor Volumes as Seen on Megavoltage Computed Tomography Images

    International Nuclear Information System (INIS)

    Woodford, Curtis; Yartsev, Slav; Dar, A. Rashid; Bauman, Glenn; Van Dyk, Jake

    2007-01-01

    Purpose: To evaluate gross tumor volume (GTV) changes for patients with non-small-cell lung cancer by using daily megavoltage (MV) computed tomography (CT) studies acquired before each treatment fraction on helical tomotherapy and to relate the potential benefit of adaptive image-guided radiotherapy to changes in GTV. Methods and Materials: Seventeen patients were prescribed 30 fractions of radiotherapy on helical tomotherapy for non-small-cell lung cancer at London Regional Cancer Program from Dec 2005 to March 2007. The GTV was contoured on the daily MVCT studies of each patient. Adapted plans were created using merged MVCT-kilovoltage CT image sets to investigate the advantages of replanning for patients with differing GTV regression characteristics. Results: Average GTV change observed over 30 fractions was -38%, ranging from -12 to -87%. No significant correlation was observed between GTV change and patient's physical or tumor features. Patterns of GTV changes in the 17 patients could be divided broadly into three groups with distinctive potential for benefit from adaptive planning. Conclusions: Changes in GTV are difficult to predict quantitatively based on patient or tumor characteristics. If changes occur, there are points in time during the treatment course when it may be appropriate to adapt the plan to improve sparing of normal tissues. If GTV decreases by greater than 30% at any point in the first 20 fractions of treatment, adaptive planning is appropriate to further improve the therapeutic ratio.

  20. Moving finite elements: A continuously adaptive method for computational fluid dynamics

    International Nuclear Information System (INIS)

    Glasser, A.H.; Miller, K.; Carlson, N.

    1991-01-01

    Moving Finite Elements (MFE), a recently developed method for computational fluid dynamics, promises major advances in the ability of computers to model the complex behavior of liquids, gases, and plasmas. Applications of computational fluid dynamics occur in a wide range of scientifically and technologically important fields. Examples include meteorology, oceanography, global climate modeling, magnetic and inertial fusion energy research, semiconductor fabrication, biophysics, automobile and aircraft design, industrial fluid processing, chemical engineering, and combustion research. The improvements made possible by the new method could thus have substantial economic impact. Moving Finite Elements is a moving node adaptive grid method which has a tendency to pack the grid finely in regions where it is most needed at each time and to leave it coarse elsewhere. It does so in a manner which is simple and automatic, and does not require a large amount of human ingenuity to apply it to each particular problem. At the same time, it often allows the time step to be large enough to advance a moving shock by many shock thicknesses in a single time step, moving the grid smoothly with the solution and minimizing the number of time steps required for the whole problem. For 2D problems (two spatial variables) the grid is composed of irregularly shaped and irregularly connected triangles which are very flexible in their ability to adapt to the evolving solution. While other adaptive grid methods have been developed which share some of these desirable properties, this is the only method which combines them all. In many cases, the method can save orders of magnitude of computing time, equivalent to several generations of advancing computer hardware.

  1. TAREAN: a computational tool for identification and characterization of satellite DNA from unassembled short reads.

    Science.gov (United States)

    Novák, Petr; Ávila Robledillo, Laura; Koblížková, Andrea; Vrbová, Iva; Neumann, Pavel; Macas, Jirí

    2017-07-07

    Satellite DNA is one of the major classes of repetitive DNA, characterized by tandemly arranged repeat copies that form contiguous arrays up to megabases in length. This type of genomic organization makes satellite DNA difficult to assemble, which hampers characterization of satellite sequences by computational analysis of genomic contigs. Here, we present tandem repeat analyzer (TAREAN), a novel computational pipeline that circumvents this problem by detecting satellite repeats directly from unassembled short reads. The pipeline first employs graph-based sequence clustering to identify groups of reads that represent repetitive elements. Putative satellite repeats are subsequently detected by the presence of circular structures in their cluster graphs. Consensus sequences of repeat monomers are then reconstructed from the most frequent k-mers obtained by decomposing read sequences from corresponding clusters. The pipeline performance was successfully validated by analyzing low-pass genome sequencing data from five plant species where satellite DNA was previously experimentally characterized. Moreover, novel satellite repeats were predicted for the genome of Vicia faba and three of these repeats were verified by detecting their sequences on metaphase chromosomes using fluorescence in situ hybridization. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
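
    The monomer-reconstruction idea, building a consensus from the most frequent k-mers, can be sketched on a toy tandem repeat: count k-mers from short reads, then greedily follow each k-mer's most frequent successor until the walk closes on itself. TAREAN's actual graph-based pipeline is far more involved.

```python
# Toy k-mer consensus walk: recover the repeat unit of a tandem array from
# short reads. A deliberate simplification of the TAREAN approach.
from collections import Counter

monomer = "ACGTTGACCTGA"                     # hidden 12-bp repeat unit
genome = monomer * 50
reads = [genome[i:i + 30] for i in range(0, len(genome) - 30, 7)]

k = 8
kmers = Counter(r[i:i + k] for r in reads for i in range(len(r) - k + 1))

start = kmers.most_common(1)[0][0]           # seed with the top k-mer
unit, kmer = "", start
while True:
    unit += kmer[0]                          # emit one base per step
    nxt = max("ACGT", key=lambda b: kmers.get(kmer[1:] + b, 0))
    kmer = kmer[1:] + nxt
    if kmer == start:                        # walk closed: unit recovered
        break

print("reconstructed unit:", unit)           # a rotation of the true monomer
```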

  2. Adapting the short form of the Coping Inventory for Stressful Situations into Chinese

    Directory of Open Access Journals (Sweden)

    Li C

    2017-06-01

    Objectives: The Coping Inventory for Stressful Situations (CISS) is a measurement tool for evaluating stress that has good psychometric properties. We investigated the applicability of a short-form version of the CISS in a large sample of Chinese university students. Methods: Nine hundred and seventy-two Chinese university students aged 18–30 years (mean = 20.15, standard deviation = 3.26) were chosen as subjects, of whom 101 were randomly selected to be retested after a 2-week interval. Results: The results of a confirmatory factor analysis revealed that the root mean square error of approximation of a four-factor model was 0.06, while the comparative fit index was 0.91, the incremental fit index was 0.93, the non-normed fit index was 0.91, and the root mean residual was 0.07. The Cronbach’s α coefficients for the task-oriented, emotion-oriented, distraction, and social diversion coping subscales were 0.81, 0.74, 0.7, and 0.66, respectively. The 2-week test–retest reliability was 0.78, 0.74, 0.7, and 0.65 for the task-oriented, emotion-oriented, distraction, and social diversion coping subscales, respectively. In the Chinese version of the CISS short form, task-oriented coping was positively correlated with positive affect and extraversion and negatively correlated with neuroticism; emotion-oriented coping was negatively correlated with extraversion and positively correlated with negative affect, anxiety, and neuroticism; distraction coping was positively correlated with neuroticism, extroversion, anxiety, positive affect, and negative affect and negatively

  3. Adaptation.

    Science.gov (United States)

    Broom, Donald M

    2006-01-01

    The term adaptation is used in biology in three different ways. It may refer to changes which occur at the cell and organ level, or at the individual level, or at the level of gene action and evolutionary processes. Adaptation by cells, especially nerve cells, helps in: communication within the body, the distinguishing of stimuli, the avoidance of overload and the conservation of energy. The time course and complexity of these mechanisms varies. Adaptive characters of organisms, including adaptive behaviours, increase fitness, so this adaptation is evolutionary. The major part of this paper concerns adaptation by individuals and its relationships to welfare. In complex animals, feed forward control is widely used. Individuals predict problems and adapt by acting before the environmental effect is substantial. Much of adaptation involves brain control and animals have a set of needs, located in the brain and acting largely via motivational mechanisms, to regulate life. Needs may be for resources but are also for actions and stimuli which are part of the mechanism which has evolved to obtain the resources. Hence pigs do not just need food but need to be able to carry out actions like rooting in earth or manipulating materials which are part of foraging behaviour. The welfare of an individual is its state as regards its attempts to cope with its environment. This state includes various adaptive mechanisms including feelings and those which cope with disease. The part of welfare which is concerned with coping with pathology is health. Disease, which implies some significant effect of pathology, always results in poor welfare. Welfare varies over a range from very good, when adaptation is effective and there are feelings of pleasure or contentment, to very poor. A key point concerning the concept of individual adaptation in relation to welfare is that welfare may be good or poor while adaptation is occurring. Some adaptation is very easy and energetically cheap and

  4. Muscle Adaptations Following Short-Duration Bed Rest with Integrated Resistance, Interval, and Aerobic Exercise

    Science.gov (United States)

    Hackney, Kyle J.; Scott, Jessica M.; Buxton, Roxanne; Redd-Goetchius, Elizabeth; Crowell, J. Brent; Everett, Meghan E.; Wickwire, Jason; Ryder, Jeffrey W.; Bloomberg, Jacob J.; Ploutz-Snyder, Lori L.

    2011-01-01

    Unloading of the musculoskeletal system during space flight results in deconditioning that may impair mission-related task performance in astronauts. Exercise countermeasures have been frequently tested during bed rest (BR) and limb suspension; however, high-intensity, short-duration exercise prescriptions have not been fully explored. PURPOSE: To determine if a high-intensity resistance, interval, and aerobic exercise program could protect against muscle atrophy and dysfunction when performed during short-duration BR. METHODS: Nine subjects (1 female, 8 male) performed a combination of supine exercises during 2 weeks of horizontal BR. Resistance exercise (3 d/wk) consisted of squat, leg press, hamstring curl, and heel raise exercises (3 sets, 12 repetitions). Aerobic (6 d/wk) sessions alternated continuous (75% VO2 peak) and interval exercise (30 s, 2 min, and 4 min) and were completed on a supine cycle ergometer and vertical treadmill, respectively. Muscle volumes of the upper leg were calculated pre-, mid-, and post-BR using magnetic resonance imaging. Maximal isometric force (MIF), rate of force development (RFD), and peak power of the lower body extensors were measured twice before BR (averaged to represent pre) and once post-BR. ANOVA with repeated measures and a priori planned contrasts were used to test for differences. RESULTS: There were no changes to quadriceps, hamstring, and adductor muscle volumes at mid and post-BR time points compared to pre-BR (Table 1). Peak power increased significantly from 1614 +/- 372 W to 1739 +/- 359 W post-BR (+7.7%, p = 0.035). Neither MIF (pre: 1676 +/- 320 N vs. post: 1711 +/- 250 N, +2.1%, p = 0.333) nor RFD (pre: 7534 +/- 1265 N/ms vs. post: 6951 +/- 1241 N/ms, -7.7%, p = 0.136) was significantly impaired post-BR.

  5. Adapting to the destitute situations: poverty cues lead to short-term choice.

    Science.gov (United States)

    Liu, Lei; Feng, Tingyong; Suo, Tao; Lee, Kang; Li, Hong

    2012-01-01

    Why do some people live for the present, whereas others save for the future? The evolutionary framework of life history theory predicts that preference for delay of gratification should be influenced by socioeconomic status (SES). However, here we propose that the decision to choose between immediate and delayed gratification in poverty environments may have a psychological dimension. Specifically, the perception of environmental poverty cues may induce people to favor choices with short-term, likely smaller benefit over choices with long-term, greater benefit. The present study was conducted to explore how poverty and affluence cues affect individuals' intertemporal choices. In our first two experiments, individuals exposed explicitly (Experiment 1) and implicitly (Experiment 2) to poverty pictures (the poverty cue) were induced to prefer immediate gratification compared with those exposed to affluence pictures (the affluence cue). Furthermore, when temporary perceptions of poverty and affluence status were manipulated using a lucky draw game, individuals in the poverty state were more impulsive, which made them pursue immediate gratification in intertemporal choices (Experiment 3). Thus, poverty cues can lead to short-term choices. Decision makers chose the sooner-smaller reward over the later-larger reward more frequently when they were exposed to the poverty cue. This indicates that the mere feeling of poverty influences intertemporal choice - the actual reality of poverty (restricted resources, etc.) is not necessary to produce the effect. Furthermore, our findings emphasize that it is a change in poverty-affluence status, not a trait change, that can influence individual preference in intertemporal choice.

  6. Wireless Adaptive Therapeutic TeleGaming in a Pervasive Computing Environment

    Science.gov (United States)

    Peters, James F.; Szturm, Tony; Borkowski, Maciej; Lockery, Dan; Ramanna, Sheela; Shay, Barbara

    This chapter introduces a wireless, pervasive computing approach to adaptive therapeutic telegaming considered in the context of near set theory. Near set theory provides a formal basis for observation, comparison and classification of perceptual granules. A perceptual granule is defined by a collection of objects that are graspable by the senses or by the mind. In the proposed pervasive computing approach to telegaming, a handicapped person (e.g., stroke patient with limited hand, finger, arm function) plays a video game by interacting with familiar instrumented objects such as cups, cutlery, soccer balls, nozzles, screw-top lids, and spoons, so that the technology that makes therapeutic exercise game-playing possible is largely invisible (Archives of Physical Medicine and Rehabilitation 89:2213-2217, 2008). The basic approach to adaptive learning (AL) in the proposed telegaming environment is ethology-inspired and is quite different from the traditional approach to reinforcement learning. In biologically-inspired learning, organisms learn to achieve some goal by durable modification of behaviours in response to signals from the environment resulting from specific experiences (Animal Behavior, 1995). The term adaptive is used here in an ethological sense, where learning by an organism results from modifying behaviour in response to perceived changes in the environment. To instill adaptivity in a video game, it is assumed that learning by a video game is episodic. During an episode, the behaviour of a player is measured indirectly by tracking the occurrence of gaming events such as a hit or a miss of a target (e.g., hitting a moving ball with a game paddle). An ethogram provides a record of behaviour feature values that serves as the basis for a functional registry of handicapped players, used for gaming adaptivity. An important practical application of adaptive gaming is therapeutic rehabilitation exercise carried out in parallel with playing action video games. Enjoyable and

  7. Adaptation

    International Development Research Centre (IDRC) Digital Library (Canada)

    building skills, knowledge or networks on adaptation, ... the African partners leading the AfricaAdapt network, together with the UK-based Institute of Development Studies; and ... UNCCD Secretariat, Regional Coordination Unit for Africa, Tunis, Tunisia .... 26 Rural–urban Cooperation on Water Management in the Context of.

  8. Processing-Efficient Distributed Adaptive RLS Filtering for Computationally Constrained Platforms

    Directory of Open Access Journals (Sweden)

    Noor M. Khan

    2017-01-01

    Full Text Available In this paper, a novel processing-efficient architecture of a group of inexpensive and computationally incapable small platforms is proposed for a parallelly distributed adaptive signal processing (PDASP) operation. The proposed architecture runs computationally expensive procedures, such as the complex adaptive recursive least squares (RLS) algorithm, cooperatively. The proposed PDASP architecture operates properly even if perfect time alignment among the participating platforms is not available. An RLS algorithm applied to MIMO channel estimation is deployed on the proposed architecture. The complexity and processing time of the PDASP scheme with the MIMO RLS algorithm are compared with those of the sequentially operated MIMO RLS algorithm and the linear Kalman filter. It is observed that the PDASP scheme exhibits much lower computational complexity than the sequential MIMO RLS algorithm as well as the Kalman filter. Moreover, the proposed architecture reduces processing time by 95.83% and 82.29% compared to the sequentially operated Kalman filter and MIMO RLS algorithm, respectively, at a low Doppler rate. Likewise, at a high Doppler rate, the proposed architecture reduces processing time by 94.12% and 77.28% compared to the Kalman and RLS algorithms, respectively.
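
    For context, the core procedure that the PDASP architecture distributes is the standard exponentially weighted RLS recursion. The sketch below shows a plain single-platform version with a toy channel-identification loop; the paper's distributed scheduling across platforms is not reproduced here.

```python
import numpy as np

def rls_step(w, P, x, d, lam=0.99):
    """One recursive-least-squares update.
    w: (n,) weights, P: (n, n) inverse correlation matrix,
    x: (n,) input vector, d: desired output, lam: forgetting factor."""
    Px = P @ x
    k = Px / (lam + x @ Px)            # gain vector
    e = d - w @ x                      # a-priori error
    w = w + k * e
    P = (P - np.outer(k, Px)) / lam    # rank-1 downdate of the inverse correlation
    return w, P

# Identify a 4-tap channel from noisy observations
rng = np.random.default_rng(1)
h = np.array([0.8, -0.4, 0.2, 0.1])    # unknown channel
w, P = np.zeros(4), np.eye(4) * 100.0
for _ in range(500):
    x = rng.normal(size=4)
    d = h @ x + 0.01 * rng.normal()
    w, P = rls_step(w, P, x, d)
print(np.round(w, 2))                  # converges to h
```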

  9. 3D-SoftChip: A Novel Architecture for Next-Generation Adaptive Computing Systems

    Directory of Open Access Journals (Sweden)

    Lee Mike Myung-Ok

    2006-01-01

    Full Text Available This paper introduces a novel architecture for next-generation adaptive computing systems, which we term 3D-SoftChip. The 3D-SoftChip is a 3-dimensional (3D) vertically integrated adaptive computing system combining state-of-the-art processing and 3D interconnection technology. It comprises the vertical integration of two chips (a configurable array processor and an intelligent configurable switch) through an indium bump interconnection array (IBIA). The configurable array processor (CAP) is an array of heterogeneous processing elements (PEs), while the intelligent configurable switch (ICS) comprises a switch block, a 32-bit dedicated RISC processor for control, on-chip program/data memory, and a data frame buffer, along with a direct memory access (DMA) controller. This paper introduces the novel 3D-SoftChip architecture for real-time communication and multimedia signal processing as a next-generation computing system. The paper further describes the advanced HW/SW codesign and verification methodology, including high-level system modeling of the 3D-SoftChip using SystemC, being used to determine the optimum hardware specification in the early design stage.

  10. Spectral indices of cardiovascular adaptations to short-term simulated microgravity exposure

    Science.gov (United States)

    Patwardhan, A. R.; Evans, J. M.; Berk, M.; Grande, K. J.; Charles, J. B.; Knapp, C. F.

    1995-01-01

    We investigated the effects of exposure to microgravity on the baseline autonomic balance in cardiovascular regulation using spectral analysis of cardiovascular variables measured during supine rest. Heart rate, arterial pressure, radial flow, thoracic fluid impedance and central venous pressure were recorded from nine volunteers before and after simulated microgravity, produced by 20 hours of 6-degree head-down bed rest plus furosemide. Spectral powers increased after simulated microgravity in the low frequency region (centered at about 0.03 Hz) in arterial pressure, heart rate and radial flow, and decreased in the respiratory frequency region (centered at about 0.25 Hz) in heart rate. Reduced heart rate power in the respiratory frequency region indicates reduced parasympathetic influence on the heart. A concurrent increase in the low frequency power in arterial pressure, heart rate, and radial flow indicates increased sympathetic influence. These results suggest that the baseline autonomic balance in cardiovascular regulation is shifted towards increased sympathetic and decreased parasympathetic influence after exposure to short-term simulated microgravity.
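
    The band powers discussed above are typically obtained by integrating a power spectral density estimate over the frequency region of interest. A hedged sketch using Welch's method on simulated heart-rate data follows; the sampling rate, band edges, and signal are illustrative, not the study's.

```python
import numpy as np
from scipy.signal import welch

def band_power(signal, fs, f_lo, f_hi):
    """Integrated spectral power of `signal` between f_lo and f_hi (Hz)."""
    freqs, psd = welch(signal, fs=fs, nperseg=256)
    band = (freqs >= f_lo) & (freqs <= f_hi)
    return np.trapz(psd[band], freqs[band])

# Beat-to-beat heart-rate series resampled at 4 Hz (simulated)
fs = 4.0
t = np.arange(0, 300, 1 / fs)
hr = 60 + 2 * np.sin(2 * np.pi * 0.03 * t) + np.sin(2 * np.pi * 0.25 * t)
lf = band_power(hr, fs, 0.01, 0.05)   # low-frequency region around 0.03 Hz
rf = band_power(hr, fs, 0.15, 0.35)   # respiratory region around 0.25 Hz
print(f"LF power: {lf:.2f}, respiratory power: {rf:.2f}")
```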

  11. A Novel adaptative Discrete Cuckoo Search Algorithm for parameter optimization in computer vision

    Directory of Open Access Journals (Sweden)

    Loubna Benchikhi

    2017-10-01

    Full Text Available Computer vision applications require choosing operators and their parameters in order to provide the best outcomes. Often, users rely on expert knowledge and must try many combinations to find the best one manually. As performance, time, and accuracy are important, it is necessary to automate parameter optimization, at least for crucial operators. In this paper, a novel approach based on an adaptive discrete cuckoo search algorithm (ADCS) is proposed. It automates the process of parameter setting and provides optimal parameters for vision applications. This work reconsiders a discretization problem to adapt the cuckoo search algorithm and presents the procedure of parameter optimization. Experiments on real examples and comparisons to other metaheuristic-based approaches, namely particle swarm optimization (PSO), reinforcement learning (RL), and ant colony optimization (ACO), show the efficiency of this novel method.
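
    As a rough illustration of the underlying metaheuristic, the sketch below implements a generic continuous cuckoo search with Lévy flights and nest abandonment on a toy objective. It is not the authors' adaptive discrete variant (ADCS), whose discretization is the paper's contribution; the population size, step scale, and objective are assumptions.

```python
import numpy as np
from math import gamma, sin, pi

rng = np.random.default_rng(2)

def levy(beta=1.5, size=None):
    """Mantegna's algorithm for Levy-stable step lengths."""
    sigma = (gamma(1 + beta) * sin(pi * beta / 2) /
             (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0, sigma, size)
    v = rng.normal(0, 1, size)
    return u / np.abs(v) ** (1 / beta)

def cuckoo_search(f, lo, hi, n_nests=15, pa=0.25, iters=200):
    dim = len(lo)
    nests = rng.uniform(lo, hi, (n_nests, dim))
    fit = np.apply_along_axis(f, 1, nests)
    for _ in range(iters):
        best = nests[fit.argmin()]
        # Levy-flight moves biased toward the current best nest
        new = np.clip(nests + 0.01 * levy(size=(n_nests, dim)) * (nests - best), lo, hi)
        new_fit = np.apply_along_axis(f, 1, new)
        improve = new_fit < fit
        nests[improve], fit[improve] = new[improve], new_fit[improve]
        # abandon a fraction pa of the worst nests and re-seed them randomly
        worst = fit.argsort()[-int(pa * n_nests):]
        nests[worst] = rng.uniform(lo, hi, (len(worst), dim))
        fit[worst] = np.apply_along_axis(f, 1, nests[worst])
    return nests[fit.argmin()], fit.min()

# Toy objective standing in for a vision-pipeline quality score
sphere = lambda p: float(np.sum((p - 0.3) ** 2))
best, score = cuckoo_search(sphere, np.zeros(3), np.ones(3))
print(np.round(best, 2), round(score, 4))
```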

  12. Evaluating the Appropriateness of a New Computer-Administered Measure of Adaptive Function for Children and Youth with Autism Spectrum Disorders

    Science.gov (United States)

    Coster, Wendy J.; Kramer, Jessica M.; Tian, Feng; Dooley, Meghan; Liljenquist, Kendra; Kao, Ying-Chia; Ni, Pengsheng

    2016-01-01

    The Pediatric Evaluation of Disability Inventory-Computer Adaptive Test is an alternative method for describing the adaptive function of children and youth with disabilities using a computer-administered assessment. This study evaluated the performance of the Pediatric Evaluation of Disability Inventory-Computer Adaptive Test with a national…

  13. Rapid Computation of Thermodynamic Properties over Multidimensional Nonbonded Parameter Spaces Using Adaptive Multistate Reweighting.

    Science.gov (United States)

    Naden, Levi N; Shirts, Michael R

    2016-04-12

    We show how thermodynamic properties of molecular models can be computed over a large, multidimensional parameter space by combining multistate reweighting analysis with a linear basis function approach. This approach reduces the computational cost to estimate thermodynamic properties from molecular simulations for over 130,000 tested parameter combinations from over 1000 CPU years to tens of CPU days. This speed increase is achieved primarily by computing the potential energy as a linear combination of basis functions, computed either from modified simulation code or as the difference of energy between two reference states, which can be done without any simulation code modification. The thermodynamic properties are then estimated with the Multistate Bennett Acceptance Ratio (MBAR) as a function of multiple model parameters without the need to define a priori how the states are connected by a pathway. Instead, we adaptively sample a set of points in parameter space to create mutual configuration space overlap. Regions of poor configuration space overlap are detected by analyzing the eigenvalues of the sampled states' overlap matrix. The configuration space overlap to sampled states is monitored alongside the mean and maximum uncertainty to determine convergence, as neither the uncertainty nor the configuration space overlap alone is a sufficient metric of convergence. This adaptive sampling scheme is demonstrated by estimating with high precision the solvation free energies of charged particles of Lennard-Jones plus Coulomb functional form with charges between -2 and +2 and generally physical values of σij and ϵij in TIP3P water. We also compute entropy, enthalpy, and radial distribution functions of arbitrary unsampled parameter combinations using only the data from these sampled states and use the estimates of free energies over the entire space to examine the deviation of atomistic simulations from the Born approximation to the solvation free
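
    The linear basis-function trick is what removes the need for new simulations at unsampled parameter combinations: once the basis energies u_i(x) are stored for each configuration, the potential at any parameter set is a single matrix-vector product. A toy sketch follows; the basis terms and the combining rule in `coefficients` are illustrative placeholders, not the paper's force-field decomposition.

```python
import numpy as np

# Suppose each stored configuration x has pre-computed basis energies u_i(x),
# e.g. for i in {LJ repulsive, LJ attractive, Coulomb} terms.
rng = np.random.default_rng(3)
n_samples = 1000
basis_energies = rng.normal(size=(n_samples, 3))   # u_i(x), one row per configuration

def coefficients(sigma, epsilon, q):
    """Parameter-dependent prefactors h_i(lambda); a toy stand-in for the
    real combining rules, which depend on the functional form of the model."""
    return np.array([4 * epsilon * sigma**12, -4 * epsilon * sigma**6, q])

# Energies of all stored configurations at an *unsampled* parameter combination
# follow from one matrix-vector product -- no new simulation needed.
u_new = basis_energies @ coefficients(sigma=0.32, epsilon=0.65, q=1.0)
print(u_new.shape)   # (1000,) potentials ready for MBAR reweighting
```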

  14. Pipelining Computational Stages of the Tomographic Reconstructor for Multi-Object Adaptive Optics on a Multi-GPU System

    KAUST Repository

    Charara, Ali; Ltaief, Hatem; Gratadour, Damien; Keyes, David E.; Sevin, Arnaud; Abdelfattah, Ahmad; Gendron, Eric; Morel, Carine; Vidal, Fabrice

    2014-01-01

    called MOSAIC has been proposed to perform multi-object spectroscopy using the Multi-Object Adaptive Optics (MOAO) technique. The core implementation of the simulation lies in the intensive computation of a tomographic reconstructor (TR), which is used

  15. Cephalopods as Predators: A Short Journey among Behavioral Flexibilities, Adaptions, and Feeding Habits

    Directory of Open Access Journals (Sweden)

    Roger Villanueva

    2017-08-01

    Full Text Available The diversity of cephalopod species and the differences in morphology and the habitats in which they live illustrate the ability of this class of molluscs to adapt to all marine environments, demonstrating a wide spectrum of patterns to search, detect, select, capture, handle, and kill prey. Photo-, mechano-, and chemoreceptors provide tools for the acquisition of information about their potential prey. The use of vision to detect prey and high attack speed seem to be a predominant pattern in cephalopod species distributed in the photic zone, whereas in the deep sea, the development of mechanoreceptor structures and the presence of long and filamentous arms are more abundant. Ambushing, luring, stalking and pursuit, speculative hunting, and hunting in disguise, among others, are known modes of hunting in cephalopods. Cannibalism and scavenger behavior are also known for some species, and the development of current culture techniques offers evidence of their ability to feed on inert and artificial foods. Feeding requirements and prey choice change throughout development, and in some species, strong ontogenetic changes in body form seem associated with changes in their diet and feeding strategies, although this is poorly understood in planktonic and larval stages. Feeding behavior is altered during senescence and particularly in brooding octopus females. Cephalopods are able to feed from a variety of food sources, from detritus to birds. Their particular requirements of lipids and copper may help to explain why marine crustaceans, rich in these components, are common prey in all cephalopod diets. The expected variation in climate change and ocean acidification and their effects on chemoreception and prey detection capacities in cephalopods are unknown and need future research.

  16. Cephalopods as Predators: A Short Journey among Behavioral Flexibilities, Adaptions, and Feeding Habits.

    Science.gov (United States)

    Villanueva, Roger; Perricone, Valentina; Fiorito, Graziano

    2017-01-01

    The diversity of cephalopod species and the differences in morphology and the habitats in which they live illustrate the ability of this class of molluscs to adapt to all marine environments, demonstrating a wide spectrum of patterns to search, detect, select, capture, handle, and kill prey. Photo-, mechano-, and chemoreceptors provide tools for the acquisition of information about their potential prey. The use of vision to detect prey and high attack speed seem to be a predominant pattern in cephalopod species distributed in the photic zone, whereas in the deep sea, the development of mechanoreceptor structures and the presence of long and filamentous arms are more abundant. Ambushing, luring, stalking and pursuit, speculative hunting, and hunting in disguise, among others, are known modes of hunting in cephalopods. Cannibalism and scavenger behavior are also known for some species, and the development of current culture techniques offers evidence of their ability to feed on inert and artificial foods. Feeding requirements and prey choice change throughout development, and in some species, strong ontogenetic changes in body form seem associated with changes in their diet and feeding strategies, although this is poorly understood in planktonic and larval stages. Feeding behavior is altered during senescence and particularly in brooding octopus females. Cephalopods are able to feed from a variety of food sources, from detritus to birds. Their particular requirements of lipids and copper may help to explain why marine crustaceans, rich in these components, are common prey in all cephalopod diets. The expected variation in climate change and ocean acidification and their effects on chemoreception and prey detection capacities in cephalopods are unknown and need future research.

  17. Adaptation of MPDATA Heterogeneous Stencil Computation to Intel Xeon Phi Coprocessor

    Directory of Open Access Journals (Sweden)

    Lukasz Szustak

    2015-01-01

    Full Text Available The multidimensional positive definite advection transport algorithm (MPDATA) belongs to the group of nonoscillatory forward-in-time algorithms and performs a sequence of stencil computations. MPDATA is one of the major parts of the dynamic core of the EULAG geophysical model. In this work, we outline an approach to adaptation of the 3D MPDATA algorithm to the Intel MIC architecture. In order to utilize available computing resources, we propose a (3 + 1)D decomposition of MPDATA heterogeneous stencil computations. This approach is based on a combination of the loop tiling and fusion techniques. It allows us to ease memory/communication bounds and better exploit the theoretical floating point efficiency of target computing platforms. An important method of improving the efficiency of the (3 + 1)D decomposition is partitioning of available cores/threads into work teams. It permits reducing inter-cache communication overheads. This method also increases opportunities for the efficient distribution of MPDATA computation onto available resources of the Intel MIC architecture, as well as Intel CPUs. We discuss preliminary performance results obtained on two hybrid platforms, each containing two CPUs and an Intel Xeon Phi. The top-of-the-line Intel Xeon Phi 7120P gives the best performance results, executing MPDATA almost 2 times faster than two Intel Xeon E5-2697v2 CPUs.
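
    The loop-tiling idea behind the decomposition can be illustrated with a simple blocked stencil sweep. The sketch below is a toy 7-point stencil in Python, not the actual MPDATA kernels; the tile size and stencil coefficients are assumptions.

```python
import numpy as np

def stencil_tiled(a, tile=32):
    """One sweep of a 7-point 3D stencil with loop tiling over the two outer
    dimensions -- a toy stand-in for an MPDATA stage; tiling keeps each block's
    working set small enough to stay in cache."""
    nz, ny, nx = a.shape
    out = a.copy()
    for z0 in range(1, nz - 1, tile):        # blocks of z-planes
        for y0 in range(1, ny - 1, tile):    # blocks of y-rows
            z1, y1 = min(z0 + tile, nz - 1), min(y0 + tile, ny - 1)
            out[z0:z1, y0:y1, 1:-1] = (1 / 6) * (
                a[z0 - 1:z1 - 1, y0:y1, 1:-1] + a[z0 + 1:z1 + 1, y0:y1, 1:-1] +
                a[z0:z1, y0 - 1:y1 - 1, 1:-1] + a[z0:z1, y0 + 1:y1 + 1, 1:-1] +
                a[z0:z1, y0:y1, :-2] + a[z0:z1, y0:y1, 2:])
    return out

field = np.random.default_rng(4).random((64, 64, 64))
print(stencil_tiled(field).shape)   # (64, 64, 64)
```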

  18. A Conceptual Architecture for Adaptive Human-Computer Interface of a PT Operation Platform Based on Context-Awareness

    Directory of Open Access Journals (Sweden)

    Qing Xue

    2014-01-01

    Full Text Available We present a conceptual architecture for an adaptive human-computer interface of a PT operation platform based on context-awareness. This architecture will form the basis of the design for such an interface. This paper describes the components, key technologies, and working principles of the architecture. The critical content covers context-information modeling and processing, the establishment of relationships between contexts and interface-design knowledge through adaptive knowledge reasoning, and the implementation of adaptive-interface visualization with the aid of interface-tool technology.

  19. Translation, cross-cultural adaptation and psychometric evaluation of the Yoruba version of the short-form 36 health survey.

    Science.gov (United States)

    Mbada, Chidozie Emmanuel; Adeogun, Gafar Atanda; Ogunlana, Michael Opeoluwa; Adedoyin, Rufus Adesoji; Akinsulore, Adesanmi; Awotidebe, Taofeek Oluwole; Idowu, Opeyemi Ayodiipo; Olaoye, Olumide Ayoola

    2015-09-14

    The Short-Form Health Survey (SF-36) is a valid quality of life tool often employed to determine the impact of medical intervention and the outcome of health care services. However, the SF-36 is culturally sensitive, which necessitates its adaptation and translation into different languages. This study was conducted to cross-culturally adapt the SF-36 into the Yoruba language and determine its reliability and validity. Based on the International Quality of Life Assessment project guidelines, a sequence of translation, test of item-scale correlation, and validation was implemented for the translation of the Yoruba version of the SF-36. Following pilot testing, the English and the Yoruba versions of the SF-36 were administered to a random sample of 1087 apparently healthy individuals to test validity, and 249 respondents completed the Yoruba SF-36 again after two weeks to test reliability. Data were analyzed using Pearson's product moment correlation analysis, independent t-test, one-way analysis of variance, multi-trait scaling analysis and Intra-Class Correlation (ICC) at p < 0.05. The ICC of the Yoruba SF-36 ranges between 0.636 and 0.843 for scales, and 0.783 and 0.851 for domains. The data quality, concurrent and discriminant validity, reliability and internal consistency of the Yoruba version of the SF-36 are adequate and it is recommended for measuring health-related quality of life among the Yoruba population.

  20. Adaptation of OCA-P, a probabilistic fracture-mechanics code, to a personal computer

    International Nuclear Information System (INIS)

    Ball, D.G.; Cheverton, R.D.

    1985-01-01

    The OCA-P probabilistic fracture-mechanics code can now be executed on a personal computer with 512 kilobytes of memory, a math coprocessor, and a hard disk. A user's guide for the particular adaptation has been prepared, and additional importance sampling techniques for OCA-P have been developed that allow the sampling of only the tails of selected distributions. Features have also been added to OCA-P that permit RTNDT to be used as an "independent" variable in the calculation of P

  1. Short-term locomotor adaptation to a robotic ankle exoskeleton does not alter soleus Hoffmann reflex amplitude.

    Science.gov (United States)

    Kao, Pei-Chun; Lewis, Cara L; Ferris, Daniel P

    2010-07-26

    To improve the design of robotic lower limb exoskeletons for gait rehabilitation, it is critical to identify neural mechanisms that govern locomotor adaptation to robotic assistance. Previously, we demonstrated that soleus muscle recruitment decreased by approximately 35% when walking with a pneumatically-powered ankle exoskeleton providing plantar flexor torque under soleus proportional myoelectric control. Since a substantial portion of soleus activation during walking results from the stretch reflex, increased reflex inhibition is one potential mechanism for reducing soleus recruitment when walking with exoskeleton assistance. This is clinically relevant because many neurologically impaired populations have hyperactive stretch reflexes, and training to reduce the reflexes could lead to substantial improvements in their motor ability. The purpose of this study was to quantify soleus Hoffmann (H-) reflex responses during powered versus unpowered walking. We tested soleus H-reflex responses in neurologically intact subjects (n=8) who had trained to walk with the soleus-controlled robotic ankle exoskeleton. The soleus H-reflex was tested at mid- and late-stance while subjects walked with the exoskeleton on the treadmill at 1.25 m/s, first without power (first unpowered), then with power (powered), and finally without power again (second unpowered). We also collected joint kinematics and electromyography. When the robotic plantar flexor torque was provided, subjects walked with lower soleus electromyographic (EMG) activation (27-48%) and had concomitant reductions in H-reflex amplitude (12-24%) compared to the first unpowered condition. The H-reflex amplitude in proportion to the background soleus EMG during powered walking was not significantly different from the two unpowered conditions. These findings suggest that the nervous system does not inhibit the soleus H-reflex in response to short-term adaptation to exoskeleton assistance. Future studies should determine if the

  2. An adaptive network-based fuzzy inference system for short-term natural gas demand estimation: Uncertain and complex environments

    International Nuclear Information System (INIS)

    Azadeh, A.; Asadzadeh, S.M.; Ghanbari, A.

    2010-01-01

    Accurate short-term natural gas (NG) demand estimation and forecasting is vital for the policy and decision-making process in the energy sector. Moreover, conventional methods may not provide accurate results. This paper presents an adaptive network-based fuzzy inference system (ANFIS) for estimation of NG demand. Standard input variables are used: day of the week, demand of the same day in the previous year, demand of the day before, and demand of 2 days before. The proposed ANFIS approach is equipped with pre-processing and post-processing concepts. Input data are pre-processed (scaled) and output data are finally post-processed (returned to their original scale). The superiority and applicability of the ANFIS approach are shown for Iranian NG consumption from 22/12/2007 to 30/6/2008. Results show that ANFIS provides more accurate results than an artificial neural network (ANN) and a conventional time series approach. The results of this study provide policy makers with an appropriate tool to make more accurate predictions on future short-term NG demand. This is because the proposed approach is capable of handling non-linearity and complexity, as well as uncertainty that may exist in actual data sets due to erratic responses and measurement errors.
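
    The pre/post-processing step highlighted above amounts to scaling the inputs before training and inverting the scaling on the model output. A minimal sketch follows; the demand values are invented placeholders and the ANFIS model itself is elided.

```python
import numpy as np

class MinMaxScaler:
    """Pre-process (scale to [0, 1]) and post-process (return to original scale)."""
    def fit(self, x):
        self.lo, self.hi = x.min(axis=0), x.max(axis=0)
        return self
    def transform(self, x):
        return (x - self.lo) / (self.hi - self.lo)
    def inverse(self, x_scaled):
        return x_scaled * (self.hi - self.lo) + self.lo

# Inputs: day of week, same day last year, 1 day before, 2 days before (toy values)
X = np.array([[1, 480.0, 510.0, 495.0],
              [2, 455.0, 495.0, 510.0],
              [3, 470.0, 505.0, 495.0]])
y = np.array([[505.0], [500.0], [498.0]])       # demand to be estimated

x_scaler, y_scaler = MinMaxScaler().fit(X), MinMaxScaler().fit(y)
X_s, y_s = x_scaler.transform(X), y_scaler.transform(y)
# ... train the fuzzy inference model on (X_s, y_s) here ...
y_pred_scaled = y_s                             # placeholder for model output
print(y_scaler.inverse(y_pred_scaled).ravel())  # back to original units
```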

  3. Computational complexity of algorithms for sequence comparison, short-read assembly and genome alignment.

    Science.gov (United States)

    Baichoo, Shakuntala; Ouzounis, Christos A

    A multitude of algorithms for sequence comparison, short-read assembly and whole-genome alignment have been developed in the general context of molecular biology, to support technology development for high-throughput sequencing, numerous applications in genome biology and fundamental research on comparative genomics. The computational complexity of these algorithms has been previously reported in original research papers, yet this often-neglected property has not previously been reviewed in a systematic manner for a wider audience. We provide a review of the space and time complexity of key sequence analysis algorithms and highlight their properties in a comprehensive manner, in order to identify potential opportunities for further research in algorithm or data structure optimization. The complexity aspect is poised to become pivotal as we face challenges related to the continuous increase of genomic data on unprecedented scales and complexity in the foreseeable future, when robust biological simulation at the cell level and above becomes a reality.
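
    As a concrete instance of the complexity properties such a review surveys, the classic dynamic-programming alignment below runs in O(mn) time and, with a rolling row, O(min(m, n)) space:

```python
def edit_distance(a: str, b: str) -> int:
    """Levenshtein distance via dynamic programming: O(len(a) * len(b)) time,
    O(min(len(a), len(b))) space using a rolling row."""
    if len(a) < len(b):
        a, b = b, a                      # keep the inner loop over the shorter string
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution or match
        prev = curr
    return prev[-1]

print(edit_distance("GATTACA", "GCATGCU"))   # 4
```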

  4. Method and system for rendering and interacting with an adaptable computing environment

    Science.gov (United States)

    Osbourn, Gordon Cecil (Albuquerque, NM); Bouchard, Ann Marie (Albuquerque, NM)

    2012-06-12

    An adaptable computing environment is implemented with software entities termed "s-machines", which self-assemble into hierarchical data structures capable of rendering and interacting with the computing environment. A hierarchical data structure includes a first hierarchical s-machine bound to a second hierarchical s-machine. The first hierarchical s-machine is associated with a first layer of a rendering region on a display screen and the second hierarchical s-machine is associated with a second layer of the rendering region overlaying at least a portion of the first layer. A screen element s-machine is linked to the first hierarchical s-machine. The screen element s-machine manages data associated with a screen element rendered to the display screen within the rendering region at the first layer.

  5. Computational adaptive optics for broadband optical interferometric tomography of biological tissue.

    Science.gov (United States)

    Adie, Steven G; Graf, Benedikt W; Ahmad, Adeel; Carney, P Scott; Boppart, Stephen A

    2012-05-08

    Aberrations in optical microscopy reduce image resolution and contrast, and can limit imaging depth when focusing into biological samples. Static correction of aberrations may be achieved through appropriate lens design, but this approach does not offer the flexibility of simultaneously correcting aberrations for all imaging depths, nor the adaptability to correct for sample-specific aberrations for high-quality tomographic optical imaging. Incorporation of adaptive optics (AO) methods has demonstrated considerable improvement in optical image contrast and resolution in noninterferometric microscopy techniques, as well as in optical coherence tomography. Here we present a method to correct aberrations in a tomogram rather than in the beam of a broadband optical interferometry system. Based on Fourier optics principles, we correct aberrations of a virtual pupil using Zernike polynomials. When used in conjunction with the computed imaging method interferometric synthetic aperture microscopy, this computational AO enables object reconstruction (within the single scattering limit) with ideal focal-plane resolution at all depths. Tomographic reconstructions of tissue phantoms containing subresolution titanium-dioxide particles and of ex vivo rat lung tissue demonstrate aberration correction in datasets acquired with a highly astigmatic illumination beam. These results also demonstrate that imaging with an aberrated astigmatic beam provides the advantage of a more uniform depth-dependent signal compared to imaging with a standard gaussian beam. With further work, computational AO could enable the replacement of complicated and expensive optical hardware components with algorithms implemented on a standard desktop computer, making high-resolution 3D interferometric tomography accessible to a wider group of users and nonspecialists.
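
    The essence of the method is that aberrations are corrected by multiplying the tomogram's spatial-frequency spectrum by a conjugate Zernike phase over a virtual pupil. The heavily simplified single-plane sketch below conveys the idea only; the Zernike normalization, pupil geometry, and coefficient are illustrative assumptions, not the paper's full ISAM pipeline.

```python
import numpy as np

def correct_astigmatism(field, coeff):
    """Apply a conjugate Zernike phase to the virtual pupil of one complex
    en face plane. `coeff` (radians) is the amount of oblique astigmatism
    (~ rho^2 sin(2 theta), i.e. proportional to kx * ky) to remove."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n)
    kx, ky = np.meshgrid(fx, fx)
    rho2 = (kx**2 + ky**2) / fx.max()**2            # normalized squared pupil radius
    astig = 2 * np.sqrt(6) * kx * ky / fx.max()**2  # oblique astigmatism term
    pupil_phase = coeff * astig * (rho2 <= 1)       # restrict to the virtual pupil
    spectrum = np.fft.fft2(field)
    return np.fft.ifft2(spectrum * np.exp(-1j * pupil_phase))

plane = np.random.default_rng(5).normal(size=(256, 256)) * (1 + 0j)
corrected = correct_astigmatism(plane, coeff=3.0)
print(corrected.shape)
```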

  6. Short Paper and Poster Proceedings of the 22nd Annual Conference on Computer Animation and Social Agents

    NARCIS (Netherlands)

    Nijholt, Antinus; Egges, A.; van Welbergen, H.; Hondorp, G.H.W.

    2009-01-01

    These are the proceedings containing the short and poster papers of CASA 2009, the twenty-second international conference on Computer Animation and Social Agents. CASA 2009 was organized in Amsterdam, the Netherlands, from the 17th to the 19th of June 2009. CASA is organized under the auspices of the

  7. A shape and mesh adaptive computational methodology for gamma ray dose from volumetric sources

    International Nuclear Information System (INIS)

    Mirza, N.M.; Ali, B.; Mirza, S.M.; Tufail, M.; Ahmad, N.

    1991-01-01

    Indoor external exposure to the population is dominated by gamma rays emitted from the walls and the floor of a room. A shape- and mesh-size-adaptive flux calculation approach has been developed for a typical wall source. Parametric studies of the effect of mesh size on flux calculations have been performed. The optimum value of the mesh size is found to depend strongly on distance from the source, permissible limits on uncertainty in flux predictions, and computer central processing unit (CPU) time. To test the computations, a typical wall source was reduced to a point, a line, and an infinite volume source having finite thickness, and the computed flux values were compared with values from corresponding analytical expressions for these sources. Results indicate that the errors under optimum conditions remain less than 6% for the fluxes calculated from this approach when compared with the analytical values for the point and the line source approximations. Also, when the wall is simulated as an infinite volume source having finite thickness, the errors in computed-to-analytical flux ratios remain large for smaller wall dimensions. However, the errors become less than 10% when the wall dimensions are greater than ten mean free paths for 3 MeV gamma rays. Also, specific dose rates from this methodology remain within 15% of the values obtained by the Monte Carlo method. (author)
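
    The meshed flux calculation is, in essence, a sum of attenuated point-kernel contributions over the discretized wall volume. The toy sketch below shows how such an estimate behaves under mesh refinement; the geometry, source strength, and the crude whole-path attenuation are illustrative simplifications, and buildup is ignored.

```python
import numpy as np

def wall_flux(S_v, mu, wall_dims, detector, mesh):
    """Uncollided photon flux at `detector` from a uniform volumetric wall source,
    summed over mesh cells. S_v: source strength per unit volume; mu: attenuation
    coefficient (1/m), applied here along the full ray path as a crude model."""
    (lx, ly, lz), n = wall_dims, mesh
    xs = (np.arange(n) + 0.5) * lx / n          # cell-centre coordinates
    ys = (np.arange(n) + 0.5) * ly / n
    zs = (np.arange(n) + 0.5) * lz / n
    dv = (lx / n) * (ly / n) * (lz / n)         # cell volume
    gx, gy, gz = np.meshgrid(xs, ys, zs, indexing="ij")
    r = np.sqrt((gx - detector[0])**2 + (gy - detector[1])**2 + (gz - detector[2])**2)
    return float(np.sum(S_v * np.exp(-mu * r) * dv / (4 * np.pi * r**2)))

# 4 m x 3 m wall, 0.2 m thick; detector 2 m in front of the wall centre
for n in (4, 8, 16, 32):    # mesh refinement: watch the estimate converge
    print(n, wall_flux(1.0, 10.0, (4.0, 3.0, 0.2), (2.0, 1.5, 2.1), n))
```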

  8. Computing Adaptive Feature Weights with PSO to Improve Android Malware Detection

    Directory of Open Access Journals (Sweden)

    Yanping Xu

    2017-01-01

    Full Text Available Android malware detection is a complex and crucial issue. In this paper, we propose a malware detection model using a support vector machine (SVM) method based on feature weights that are computed by information gain (IG) and particle swarm optimization (PSO) algorithms. The IG weights are evaluated based on the relevance between features and class labels, and the PSO weights are adaptively calculated to result in the best fitness (the performance of the SVM classification model). Moreover, to overcome the defects of basic PSO, we propose a new adaptive inertia weight method called fitness-based and chaotic adaptive inertia weight-PSO (FCAIW-PSO) that improves on basic PSO and is based on the fitness and a chaotic term. The goal is to assign suitable weights to the features to ensure the best Android malware detection performance. The results of experiments indicate that the IG weights and PSO weights both improve the performance of SVM and that the performance of the PSO weights is better than that of the IG weights.
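
    Below is a generic sketch of PSO searching for per-feature weights, with a linearly decaying inertia weight perturbed by a logistic chaotic term. It is a stand-in for, not a reproduction of, the paper's fitness-based FCAIW-PSO rule; the acceleration constants, swarm size, and toy objective are assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)

def pso_weights(fitness, dim, n=20, iters=100, w_max=0.9, w_min=0.4):
    """Maximize `fitness` over feature weights in [0, 1]^dim.
    Inertia weight: linear decay multiplied by a logistic-map chaotic term."""
    x, v = rng.random((n, dim)), np.zeros((n, dim))
    pbest, pbest_f = x.copy(), np.apply_along_axis(fitness, 1, x)
    g = pbest[pbest_f.argmax()]
    z = 0.7                                   # chaotic variable
    for t in range(iters):
        z = 4 * z * (1 - z)                   # logistic map
        w = (w_max - (w_max - w_min) * t / iters) * z
        r1, r2 = rng.random((2, n, dim))
        v = w * v + 2.0 * r1 * (pbest - x) + 2.0 * r2 * (g - x)
        x = np.clip(x + v, 0, 1)
        f = np.apply_along_axis(fitness, 1, x)
        better = f > pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        g = pbest[pbest_f.argmax()]
    return g

# Toy "classifier accuracy": prefers weights near a hidden ideal vector
ideal = np.array([0.9, 0.1, 0.5, 0.7])
acc = lambda wts: float(-np.sum((wts - ideal) ** 2))
print(np.round(pso_weights(acc, dim=4), 2))
```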

  9. Cross-Cultural adaptation of an instrument to computer accessibility evaluation for students with cerebral palsy

    Directory of Open Access Journals (Sweden)

    Gerusa Ferreira Lourenço

    2015-03-01

    Full Text Available The specific literature indicates that the successful education of children with cerebral palsy may require the implementation of appropriate assistive technology resources, allowing students to improve their performance and complete everyday tasks more efficiently and independently. To this end, these resources must be selected properly, emphasizing the importance of an appropriate initial assessment of the child and of the possibilities of the resources available. The present study aimed to translate and theoretically adapt an American instrument that evaluates computer accessibility for people with cerebral palsy, in order to contextualize it for applicability to Brazilian students with cerebral palsy. The methodology involved the steps of translation and cross-cultural adaptation of this instrument, as well as the construction of a supplementary script for additional use of the instrument in the educational context. Translation procedures, theoretical and technical adaptation of the American instrument, and theoretical analysis (content and semantics) were carried out with the participation of professional experts of the special education area as adjudicators. The results point to the relevance of the translated instrument, in conjunction with the supplementary script, to the reality of professionals involved with the education of children with cerebral palsy, such as occupational therapists and special educators.

  10. Human spaceflight and space adaptations: Computational simulation of gravitational unloading on the spine

    Science.gov (United States)

    Townsend, Molly T.; Sarigul-Klijn, Nesrin

    2018-04-01

    Living in reduced gravitational environments for a prolonged duration, such as a fly-by mission to Mars or an extended stay at the International Space Station, affects the human body - in particular, the spine. As the spine adapts to spaceflight, morphological and physiological changes cause the mechanical integrity of the spinal column to be compromised, potentially endangering internal organs, nervous health, and human body mechanical function. Therefore, a high-fidelity computational model and simulation of the whole human spine was created and validated for the purpose of investigating the mechanical integrity of the spine in crew members during exploratory space missions. A spaceflight-exposed spine has been developed through the adaptation of a three-dimensional nonlinear finite element model, with the updated Lagrangian formulation, of a healthy ground-based human spine in vivo. Simulation of the porohyperelastic response of the intervertebral disc to mechanical unloading resulted in a model capable of accurately predicting spinal swelling/lengthening, spinal motion, and internal stress distribution. The curvature of this space-adaptation-exposed spine model was compared to a control terrestrial-based finite element model, indicating how the shape changed. Finally, potential injury sites for crew members are predicted for a typical 9-day mission.

  11. THE PLUTO CODE FOR ADAPTIVE MESH COMPUTATIONS IN ASTROPHYSICAL FLUID DYNAMICS

    International Nuclear Information System (INIS)

    Mignone, A.; Tzeferacos, P.; Zanni, C.; Bodo, G.; Van Straalen, B.; Colella, P.

    2012-01-01

    We present a description of the adaptive mesh refinement (AMR) implementation of the PLUTO code for solving the equations of classical and special relativistic magnetohydrodynamics (MHD and RMHD). The current release exploits, in addition to the static grid version of the code, the distributed infrastructure of the CHOMBO library for multidimensional parallel computations over block-structured, adaptively refined grids. We employ a conservative finite-volume approach where primary flow quantities are discretized at the cell center in a dimensionally unsplit fashion using the Corner Transport Upwind method. Time stepping relies on a characteristic tracing step where piecewise parabolic method, weighted essentially non-oscillatory, or slope-limited linear interpolation schemes can be handily adopted. A characteristic decomposition-free version of the scheme is also illustrated. The solenoidal condition of the magnetic field is enforced by augmenting the equations with a generalized Lagrange multiplier providing propagation and damping of divergence errors through a mixed hyperbolic/parabolic explicit cleaning step. Among the novel features, we describe an extension of the scheme to include non-ideal dissipative processes, such as viscosity, resistivity, and anisotropic thermal conduction without operator splitting. Finally, we illustrate an efficient treatment of point-local, potentially stiff source terms over hierarchical nested grids by taking advantage of the adaptivity in time. Several multidimensional benchmarks and applications to problems of astrophysical relevance assess the potentiality of the AMR version of PLUTO in resolving flow features separated by large spatial and temporal disparities.

  12. Adjusting for cross-cultural differences in computer-adaptive tests of quality of life.

    Science.gov (United States)

    Gibbons, C J; Skevington, S M

    2018-04-01

    Previous studies using the WHOQOL measures have demonstrated that the relationship between individual items and the underlying quality of life (QoL) construct may differ between cultures. If unaccounted for, these differing relationships can lead to measurement bias which, in turn, can undermine the reliability of results. We used item response theory (IRT) to assess differential item functioning (DIF) in WHOQOL data from diverse language versions collected in UK, Zimbabwe, Russia, and India (total N = 1332). Data were fitted to the partial credit 'Rasch' model. We used four item banks previously derived from the WHOQOL-100 measure, which provided excellent measurement for physical, psychological, social, and environmental quality of life domains (40 items overall). Cross-cultural differential item functioning was assessed using analysis of variance for item residuals and post hoc Tukey tests. Simulated computer-adaptive tests (CATs) were conducted to assess the efficiency and precision of the four items banks. Splitting item parameters by DIF results in four linked item banks without DIF or other breaches of IRT model assumptions. Simulated CATs were more precise and efficient than longer paper-based alternatives. Assessing differential item functioning using item response theory can identify measurement invariance between cultures which, if uncontrolled, may undermine accurate comparisons in computer-adaptive testing assessments of QoL. We demonstrate how compensating for DIF using item anchoring allowed data from all four countries to be compared on a common metric, thus facilitating assessments which were both sensitive to cultural nuance and comparable between countries.
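
    The simulated CATs mentioned above repeatedly pick the most informative remaining item and update the trait estimate after each response. A simplified dichotomous Rasch sketch follows; the study used the partial credit model, and the item bank, update rule, and stopping rule here are illustrative assumptions.

```python
import numpy as np

def rasch_p(theta, b):
    """Probability of endorsing an item of difficulty b at trait level theta."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

def next_item(theta, difficulties, administered):
    """Maximum-information selection (Rasch item information = p * (1 - p))."""
    p = rasch_p(theta, difficulties)
    info = p * (1 - p)
    info[list(administered)] = -np.inf      # never repeat an item
    return int(info.argmax())

# One simulated CAT: update theta with a small Newton step after each response
rng = np.random.default_rng(7)
bank = rng.normal(size=40)                  # item difficulties (the item bank)
theta, seen, true_theta = 0.0, set(), 1.2
for _ in range(10):
    i = next_item(theta, bank, seen)
    seen.add(i)
    resp = rng.random() < rasch_p(true_theta, bank[i])   # simulated respondent
    p = rasch_p(theta, bank[i])
    theta += (resp - p) / max(p * (1 - p), 1e-3)         # one Newton-Raphson step
print(round(theta, 2))   # estimate drifts toward true_theta
```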

  13. Reconstruction of sparse-view X-ray computed tomography using adaptive iterative algorithms.

    Science.gov (United States)

    Liu, Li; Lin, Weikai; Jin, Mingwu

    2015-01-01

    In this paper, we propose two reconstruction algorithms for sparse-view X-ray computed tomography (CT). Treating the reconstruction problems as data fidelity constrained total variation (TV) minimization, both algorithms adopt the alternating two-stage strategy: projection onto convex sets (POCS) for data fidelity and non-negativity constraints, and steepest descent for TV minimization. The novelty of this work is to determine iterative parameters automatically from data, thus avoiding tedious manual parameter tuning. In TV minimization, the step sizes of steepest descent are adaptively adjusted according to the difference from the POCS update in either the projection domain or the image domain, while the step size of the algebraic reconstruction technique (ART) in POCS is determined based on the data noise level. In addition, projection errors are compared with the error bound to decide whether to perform ART, so as to reduce computational costs. The performance of the proposed methods is studied and evaluated using both simulated and physical phantom data. Our methods with automatic parameter tuning achieve similar, if not better, reconstruction performance compared to a representative two-stage algorithm.
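
    A compact sketch of the alternating two-stage idea follows: row-action ART plus non-negativity as the POCS stage, then steepest-descent TV minimization with a step size tied to the magnitude of the POCS update. This is a simplified stand-in for the paper's adaptive rules; the toy system matrix, step constant, and iteration counts are assumptions.

```python
import numpy as np

def tv_gradient(img, eps=1e-8):
    """Gradient of a smoothed isotropic total-variation functional (2D)."""
    dx = np.diff(img, axis=1, append=img[:, -1:])
    dy = np.diff(img, axis=0, append=img[-1:, :])
    mag = np.sqrt(dx**2 + dy**2 + eps)
    gx, gy = dx / mag, dy / mag
    div = (gx - np.roll(gx, 1, axis=1)) + (gy - np.roll(gy, 1, axis=0))
    return -div

def pocs_tv(A, b, shape, n_iter=50, tv_steps=10):
    """Alternate POCS (ART + non-negativity) with TV steepest descent."""
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x_prev = x.copy()
        for i in range(A.shape[0]):                 # ART: row-action updates
            ai = A[i]
            x += (b[i] - ai @ x) / (ai @ ai + 1e-12) * ai
        x = np.clip(x, 0, None)                     # non-negativity projection
        step = 0.2 * np.linalg.norm(x - x_prev)     # adapt TV step to POCS change
        img = x.reshape(shape)
        for _ in range(tv_steps):
            g = tv_gradient(img)
            img = img - step * g / (np.linalg.norm(g) + 1e-12)
        x = img.ravel()
    return x.reshape(shape)

# Tiny toy system standing in for a sparse-view projector (under-determined)
rng = np.random.default_rng(8)
shape = (16, 16)
A = rng.random((60, shape[0] * shape[1]))           # only 60 "rays"
truth = np.zeros(shape); truth[4:12, 4:12] = 1.0
b = A @ truth.ravel()
print(np.abs(pocs_tv(A, b, shape) - truth).mean())
```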

  14. Adapt

    Science.gov (United States)

    Bargatze, L. F.

    2015-12-01

    Active Data Archive Product Tracking (ADAPT) is a collection of software routines that permits one to generate XML metadata files to describe and register data products in support of the NASA Heliophysics Virtual Observatory VxO effort. ADAPT is also a philosophy. The ADAPT concept is to use any and all available metadata associated with scientific data to produce XML metadata descriptions in a consistent, uniform, and organized fashion to provide blanket access to the full complement of data stored on a targeted data server. In this poster, we present an application of ADAPT to describe all of the data products that are stored by using the Common Data Format (CDF) served out by the CDAWEB and SPDF data servers hosted at the NASA Goddard Space Flight Center. These data servers are the primary repositories for NASA Heliophysics data. For this purpose, the ADAPT routines have been used to generate data resource descriptions by using an XML schema named Space Physics Archive, Search, and Extract (SPASE). SPASE is the designated standard for documenting Heliophysics data products, as adopted by the Heliophysics Data and Model Consortium. The set of SPASE XML resource descriptions produced by ADAPT includes high-level descriptions of numerical data products, display data products, or catalogs and also includes low-level "Granule" descriptions. A SPASE Granule is effectively a universal access metadata resource; a Granule associates an individual data file (e.g. a CDF file) with a "parent" high-level data resource description, assigns a resource identifier to the file, and lists the corresponding access URL(s). The CDAWEB and SPDF file systems were queried to provide the input required by the ADAPT software to create an initial set of SPASE metadata resource descriptions. Then, the CDAWEB and SPDF data repositories were subsequently queried on a nightly basis and the CDF file lists were checked for any changes such as the occurrence of new, modified, or deleted
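
    Generating a Granule description is essentially templated XML emission per data file. A minimal illustrative sketch follows; the element names follow the general SPASE pattern, but the exact tag set, identifiers, and URL are placeholders, not authoritative schema, and should be checked against the actual SPASE specification.

```python
import xml.etree.ElementTree as ET

def make_granule(resource_id, parent_id, url, start, stop):
    """Emit a minimal SPASE-style Granule description for one CDF file."""
    spase = ET.Element("Spase")
    granule = ET.SubElement(spase, "Granule")
    ET.SubElement(granule, "ResourceID").text = resource_id
    ET.SubElement(granule, "ParentID").text = parent_id
    ET.SubElement(granule, "StartDate").text = start
    ET.SubElement(granule, "StopDate").text = stop
    source = ET.SubElement(granule, "Source")
    ET.SubElement(source, "URL").text = url
    return ET.tostring(spase, encoding="unicode")

# All identifiers and the URL below are hypothetical placeholders
print(make_granule(
    resource_id="spase://Example/Granule/MMS/mms1_fgm_20150901",
    parent_id="spase://Example/NumericalData/MMS/FGM",
    url="https://cdaweb.gsfc.nasa.gov/pub/data/example/example_20150901_v1.cdf",
    start="2015-09-01T00:00:00Z", stop="2015-09-02T00:00:00Z"))
```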

  15. Cross-cultural adaptation of the US consumer form of the short Primary Care Assessment Tool (PCAT): the Korean consumer form of the short PCAT (KC PCAT) and the Korean standard form of the short PCAT (KS PCAT).

    Science.gov (United States)

    Jeon, Ki-Yeob

    2011-01-01

    It is well known that countries with well-structured primary care have better health outcomes, better health equity and reduced healthcare costs. This study aimed to culturally modify and validate the US consumer form of the short Primary Care Assessment Tool (PCAT) in primary care in the Republic of Korea (hereafter referred to as Korea). The Korean consumer form of the short PCAT (KC PCAT) was cross-culturally modified from the original version using a standardised transcultural adaptation method. A pre-test version of the KC PCAT was formulated by replacement of four items and modification of a further four items from the 37 items of the original consumer form of the short PCAT at face value evaluation meetings. Pilot testing was done with a convenience sample of 15 responders at two different sites. Test-retest showed high reliability. To validate the KC PCAT, 606 clients participated in a survey carried out in Korea between February and May 2006. Internal consistency reliability, test-retest reliability and factor analysis were conducted in order to test validity. Psychometric testing was carried out on 37 items of the KC PCAT to make the KS PCAT which has 30 items and has seven principal domains: first contact utilisation, first contact accessibility, ongoing accountable care (ongoing care and coordinated rapport care), integrated care (patient-centred care with integration between primary and specialty care or between different specialties), comprehensive care, community-oriented care and culturally-oriented care. Component factors of the verified KS PCAT explained 58.28% of the total variance in the total item scores of primary care. The verified KS PCAT has been characterised by the seven classic domains of primary care with minor modifications. This may provide clues concerning differences in expectations for primary care in the Korean population as compared with that of the US. The KS PCAT is a reliable and valid tool for the evaluation of the quality of

  16. WRF4G project: Adaptation of WRF Model to Distributed Computing Infrastructures

    Science.gov (United States)

    Cofino, Antonio S.; Fernández Quiruelas, Valvanuz; García Díez, Markel; Blanco Real, Jose C.; Fernández, Jesús

    2013-04-01

    Nowadays, Grid Computing is a powerful computational tool that is ready to be used by the scientific community in different areas (such as biomedicine, astrophysics, climate, etc.). However, the use of these distributed computing infrastructures (DCIs) is not yet common practice in climate research, and only a few teams and applications in this area take advantage of them. Thus, the first objective of this project is to popularize the use of this technology in the atmospheric sciences area. In order to achieve this objective, one of the most widely used applications has been taken (WRF, a limited-area model and successor of the MM5 model), which has a user community of more than 8000 researchers worldwide. This community develops its research activity in different areas and could benefit from the advantages of Grid resources (case-study simulations, regional hindcast/forecast, sensitivity studies, etc.). The WRF model is also used as input by the energy and natural-hazards communities, which will therefore benefit as well. However, Grid infrastructures have some drawbacks for the execution of applications that make intensive use of CPU and memory for a long period of time. This makes it necessary to develop a specific framework (middleware). This middleware encapsulates the application and provides appropriate services for the monitoring and management of the jobs and the data. Thus, the second objective of the project consists of the development of a generic adaptation of WRF for the Grid (WRF4G), to be distributed as open source and to be integrated in the official WRF development cycle. The use of this WRF adaptation should be transparent and useful for any of the previously described studies, and should avoid the problems of the Grid infrastructure. Moreover, it should simplify access to Grid infrastructures for research teams, and also free them from the technical and computational aspects of the use of the Grid. Finally, in order to

  17. Effect of Preparation Depth on the Marginal and Internal Adaptation of Computer-aided Design/Computer-assisted Manufacture Endocrowns.

    Science.gov (United States)

    Gaintantzopoulou, M D; El-Damanhoury, H M

    The aim of the study was to evaluate the effect of preparation depth and intraradicular extension on the marginal and internal adaptation of computer-aided design/computer-assisted manufacture (CAD/CAM) endocrown restorations. Standardized preparations were made in resin endodontic tooth models (Nissin Dental), with an intracoronal preparation depth of 2 mm (group H2), or with extra 1- (group H3) or 2-mm (group H4) intraradicular extensions in the root canals (n=12). Vita Enamic polymer-infiltrated ceramic-network material endocrowns were fabricated using the CEREC AC CAD/CAM system and were seated on the prepared teeth. Specimens were evaluated by microtomography. Horizontal and vertical tomographic sections were recorded and reconstructed using the CTSkan software (TView v1.1, Skyscan). The surface/void volume (S/V) in the region of interest was calculated. Marginal gap (MG), absolute marginal discrepancy (MD), and internal marginal gap were measured at various measuring locations and calculated in micrometers (μm). Marginal and internal discrepancy data (μm) were analyzed with nonparametric Kruskal-Wallis analysis of variance by ranks with Dunn's post hoc test, whereas S/V data were analyzed by one-way analysis of variance and Bonferroni multiple comparisons (α=0.05). Significant differences were found in MG, MD, and internal gap width values between the groups, with H2 showing the lowest values of all groups. S/V calculations presented significant differences between H2 and the other two groups tested (H3 and H4), with H2 again showing the lowest values. Increasing the intraradicular extension of endocrown restorations increased their marginal and internal gaps.

  18. Comparisons of clustered regularly interspaced short palindromic repeats and viromes in human saliva reveal bacterial adaptations to salivary viruses.

    Science.gov (United States)

    Pride, David T; Salzman, Julia; Relman, David A

    2012-09-01

    Explorations of human microbiota have provided substantial insight into microbial community composition; however, little is known about interactions between various microbial components in human ecosystems. In response to the powerful impact of viral predation, bacteria have acquired potent defences, including an adaptive immune response based on the clustered regularly interspaced short palindromic repeats (CRISPRs)/Cas system. To improve our understanding of the interactions between bacteria and their viruses in humans, we analysed 13 977 streptococcal CRISPR sequences and compared them with 2 588 172 virome reads in the saliva of four human subjects over 17 months. We found a diverse array of viruses and CRISPR spacers, many of which were specific to each subject and time point. There were numerous viral sequences matching CRISPR spacers; these matches were highly specific for salivary viruses. We determined that spacers and viruses coexist at the same time, which suggests that streptococcal CRISPR/Cas systems are under constant pressure from salivary viruses. CRISPRs in some subjects were just as likely to match viral sequences from other subjects as they were to match viruses from the same subject. Because interactions between bacteria and viruses help to determine the structure of bacterial communities, CRISPR-virus analyses are likely to provide insight into the forces shaping the human microbiome.
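
    At its simplest, matching spacers against virome reads is substring search on both strands. A minimal sketch follows; real analyses usually tolerate mismatches and use indexed search, and the sequences below are invented examples.

```python
def match_spacers(spacers, reads, min_len=20):
    """Report which CRISPR spacers occur exactly (either strand) in virome reads."""
    comp = str.maketrans("ACGT", "TGCA")
    hits = {}
    for name, sp in spacers.items():
        if len(sp) < min_len:        # skip spacers too short to match specifically
            continue
        rc = sp.translate(comp)[::-1]                  # reverse complement
        hits[name] = [i for i, read in enumerate(reads)
                      if sp in read or rc in read]
    return hits

spacers = {"spacer1": "ACGTACGTTGCAACGTACGTA", "spacer2": "TTTTGGGGCCCCAAAATTTTG"}
reads = ["GGGACGTACGTTGCAACGTACGTACCC", "ACGTACGTACGT"]
print(match_spacers(spacers, reads))   # {'spacer1': [0], 'spacer2': []}
```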

  19. Sub-module Short Circuit Fault Diagnosis in Modular Multilevel Converter Based on Wavelet Transform and Adaptive Neuro Fuzzy Inference System

    DEFF Research Database (Denmark)

    Liu, Hui; Loh, Poh Chiang; Blaabjerg, Frede

    2015-01-01

    for continuous operation and post-fault maintenance. In this article, a fault diagnosis technique is proposed for the short circuit fault in a modular multi-level converter sub-module using the wavelet transform and adaptive neuro fuzzy inference system. The fault features are extracted from output phase voltage...
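
    A sketch of the wavelet half of such a pipeline: relative band energies from a discrete wavelet decomposition form the feature vector that a neuro-fuzzy classifier would consume. The wavelet family, decomposition level, and the simulated fault are assumptions, not the paper's settings.

```python
import numpy as np
import pywt

def wavelet_energy_features(signal, wavelet="db4", level=4):
    """Relative energy of each wavelet-decomposition band of a voltage trace."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    energies = np.array([np.sum(c ** 2) for c in coeffs])
    return energies / energies.sum()

# Healthy vs. (simulated) faulted phase voltage: the fault adds a transient step
t = np.linspace(0, 0.1, 1000)
healthy = np.sin(2 * np.pi * 50 * t)
faulted = healthy.copy()
faulted[500:] += 0.3                  # crude stand-in for a sub-module short circuit
print(np.round(wavelet_energy_features(healthy), 3))
print(np.round(wavelet_energy_features(faulted), 3))
```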

  20. A Framework for the Development of Context-Adaptable User Interfaces for Ubiquitous Computing Systems

    Science.gov (United States)

    Varela, Gervasio; Paz-Lopez, Alejandro; Becerra, Jose A.; Duro, Richard

    2016-01-01

    This paper addresses the problem of developing user interfaces for Ubiquitous Computing (UC) and Ambient Intelligence (AmI) systems. These kinds of systems are expected to provide a natural user experience, considering interaction modalities adapted to the user's abilities and preferences and using whatever interaction devices are present in the environment. These interaction devices are not necessarily known at design time. The task is quite complicated due to the variety of devices and technologies and the diversity of scenarios, and it usually burdens the developer with the need to create many different UIs in order to consider the foreseeable user-environment combinations. Here, we propose a UI abstraction framework for UC and AmI systems that effectively improves the portability of those systems between different environments and for different users. It allows developers to design and implement a single UI capable of being deployed with different devices and modalities regardless of the physical location.

  1. A Comprehensive Review on Adaptability of Network Forensics Frameworks for Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Suleman Khan

    2014-01-01

    Full Text Available Network forensics enables investigation and identification of network attacks through the retrieved digital content. The proliferation of smartphones and the cost-effective universal data access through cloud has made Mobile Cloud Computing (MCC) a congenital target for network attacks. However, constraints on carrying out forensics in MCC are tied to the autonomous cloud hosting companies and their policies of restricted access to the digital content in the back-end cloud platforms. This implies that existing Network Forensic Frameworks (NFFs) have limited impact in the MCC paradigm. To this end, we qualitatively analyze the adaptability of existing NFFs when applied to the MCC. Explicitly, the fundamental mechanisms of NFFs are highlighted and then analyzed using the most relevant parameters. A classification is proposed to help understand the anatomy of existing NFFs. Subsequently, a comparison is given that explores the functional similarities and deviations among NFFs. The paper concludes by discussing research challenges for progressive network forensics in MCC.

  2. A comprehensive review on adaptability of network forensics frameworks for mobile cloud computing.

    Science.gov (United States)

    Khan, Suleman; Shiraz, Muhammad; Wahab, Ainuddin Wahid Abdul; Gani, Abdullah; Han, Qi; Rahman, Zulkanain Bin Abdul

    2014-01-01

    Network forensics enables investigation and identification of network attacks through the retrieved digital content. The proliferation of smartphones and the cost-effective universal data access through cloud has made Mobile Cloud Computing (MCC) a congenital target for network attacks. However, constraints on carrying out forensics in MCC are tied to the autonomous cloud hosting companies and their policies of restricted access to the digital content in the back-end cloud platforms. This implies that existing Network Forensic Frameworks (NFFs) have limited impact in the MCC paradigm. To this end, we qualitatively analyze the adaptability of existing NFFs when applied to the MCC. Explicitly, the fundamental mechanisms of NFFs are highlighted and then analyzed using the most relevant parameters. A classification is proposed to help understand the anatomy of existing NFFs. Subsequently, a comparison is given that explores the functional similarities and deviations among NFFs. The paper concludes by discussing research challenges for progressive network forensics in MCC.

  3. Implementation and adaption of the Computer Code ECOSYS/EXCEL for Austria as OECOSYS/EXCEL

    International Nuclear Information System (INIS)

    Hick, H.; Suda, M.; Mueck, K.

    1998-03-01

    During 1989, under contract to the Austrian Chamber of the Federal Chancellor, department VII, the radioecological forecast model OECOSYS was implemented by the Austrian Research Centre Seibersdorf on a VAX computer using VAX Fortran. OECOSYS allows the prediction of the consequences after a large-scale contamination event. During 1992, under contract to the Austrian Federal Ministry of Health, Sports and Consumer Protection, department III, OECOSYS - in the version of 1989 - was implemented on PCs in Seibersdorf and the Ministry using OS/2 and Microsoft Fortran. In March 1993, the Ministry commissioned a necessary update and the evaluation of two exercise scenarios. Since then, the prognosis model, with its auxiliary program and communication facilities, has been kept on stand-by, and yearly exercises are performed to maintain its readiness. The current report describes the implementation and adaptation to Austrian conditions of the newly available EXCEL version of the German ECOSYS prognosis model as OECOSYS. (author)

  4. A Comprehensive Review on Adaptability of Network Forensics Frameworks for Mobile Cloud Computing

    Science.gov (United States)

    Abdul Wahab, Ainuddin Wahid; Han, Qi; Bin Abdul Rahman, Zulkanain

    2014-01-01

    Network forensics enables investigation and identification of network attacks through the retrieved digital content. The proliferation of smartphones and the cost-effective universal data access through cloud has made Mobile Cloud Computing (MCC) a congenital target for network attacks. However, constraints on carrying out forensics in MCC are tied to the autonomous cloud hosting companies and their policies of restricted access to the digital content in the back-end cloud platforms. This implies that existing Network Forensic Frameworks (NFFs) have limited impact in the MCC paradigm. To this end, we qualitatively analyze the adaptability of existing NFFs when applied to the MCC. Explicitly, the fundamental mechanisms of NFFs are highlighted and then analyzed using the most relevant parameters. A classification is proposed to help understand the anatomy of existing NFFs. Subsequently, a comparison is given that explores the functional similarities and deviations among NFFs. The paper concludes by discussing research challenges for progressive network forensics in MCC. PMID:25097880

  5. A Framework for the Development of Context-Adaptable User Interfaces for Ubiquitous Computing Systems

    Directory of Open Access Journals (Sweden)

    Gervasio Varela

    2016-07-01

    Full Text Available This paper addresses the problem of developing user interfaces for Ubiquitous Computing (UC) and Ambient Intelligence (AmI) systems. These kinds of systems are expected to provide a natural user experience, considering interaction modalities adapted to the user's abilities and preferences and using whatever interaction devices are present in the environment. These interaction devices are not necessarily known at design time. The task is complicated by the variety of devices and technologies and the diversity of scenarios, and it usually burdens the developer with creating many different UIs to cover the foreseeable user-environment combinations. Here, we propose a UI abstraction framework for UC and AmI systems that effectively improves the portability of those systems between different environments and for different users. It allows developers to design and implement a single UI capable of being deployed with different devices and modalities regardless of the physical location.

  6. A Framework for the Development of Context-Adaptable User Interfaces for Ubiquitous Computing Systems.

    Science.gov (United States)

    Varela, Gervasio; Paz-Lopez, Alejandro; Becerra, Jose A; Duro, Richard

    2016-07-07

    This paper addresses the problem of developing user interfaces for Ubiquitous Computing (UC) and Ambient Intelligence (AmI) systems. These kinds of systems are expected to provide a natural user experience, considering interaction modalities adapted to the user's abilities and preferences and using whatever interaction devices are present in the environment. These interaction devices are not necessarily known at design time. The task is complicated by the variety of devices and technologies and the diversity of scenarios, and it usually burdens the developer with creating many different UIs to cover the foreseeable user-environment combinations. Here, we propose a UI abstraction framework for UC and AmI systems that effectively improves the portability of those systems between different environments and for different users. It allows developers to design and implement a single UI capable of being deployed with different devices and modalities regardless of the physical location.
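
    The framework's own API is not shown in the record; as a loose illustration of the general idea — one abstract UI bound at run time to whatever interaction devices the environment provides — here is a minimal Python sketch in which every class and method name is hypothetical:

    ```python
    from abc import ABC, abstractmethod
    from dataclasses import dataclass

    @dataclass
    class Notification:
        """A UI element described by intent, not by a concrete widget or modality."""
        content: str

    class Renderer(ABC):
        @abstractmethod
        def render(self, element: Notification) -> None: ...

    class ScreenRenderer(Renderer):
        def render(self, element: Notification) -> None:
            print(f"[screen] {element.content}")

    class SpeechRenderer(Renderer):
        def render(self, element: Notification) -> None:
            print(f"[speech] saying aloud: {element.content}")

    def deploy(element: Notification, renderers: list[Renderer]) -> None:
        # Bind the same abstract UI to whatever interaction devices the
        # current environment happens to provide, discovered at run time.
        for renderer in renderers:
            renderer.render(element)

    deploy(Notification("Lights turned off"), [ScreenRenderer(), SpeechRenderer()])
    ```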

  7. Adaptive Fault Tolerance for Many-Core Based Space-Borne Computing

    Science.gov (United States)

    James, Mark; Springer, Paul; Zima, Hans

    2010-01-01

    This paper describes an approach to providing software fault tolerance for future deep-space robotic NASA missions, which will require a high degree of autonomy supported by an enhanced on-board computational capability. Such systems have become possible as a result of the emerging many-core technology, which is expected to offer 1024-core chips by 2015. We discuss the challenges and opportunities of this new technology, focusing on introspection-based adaptive fault tolerance that takes into account the specific requirements of applications, guided by a fault model. Introspection supports runtime monitoring of the program execution with the goal of identifying, locating, and analyzing errors. Fault tolerance assertions for the introspection system can be provided by the user, domain-specific knowledge, or via the results of static or dynamic program analysis. This work is part of an on-going project at the Jet Propulsion Laboratory in Pasadena, California.

  8. Cone Beam Computed Tomography-Derived Adaptive Radiotherapy for Radical Treatment of Esophageal Cancer

    International Nuclear Information System (INIS)

    Hawkins, Maria A.; Brooks, Corrinne; Hansen, Vibeke N.; Aitken, Alexandra; Tait, Diana M.

    2010-01-01

    Purpose: To investigate the potential for reduction in normal tissue irradiation by creating a patient-specific planning target volume (PTV) using cone beam computed tomography (CBCT) imaging acquired in the first week of radiotherapy for patients receiving radical radiotherapy. Methods and materials: Patients receiving radical RT for carcinoma of the esophagus were investigated. The PTV is defined as the CTV (tumor, nodes) plus the esophagus outlined 3 to 5 cm cranio-caudally, with a 1.5-cm circumferential margin added (clinical plan). Prefraction CBCT scans are acquired on Days 1 to 4, then weekly. No correction for setup error is made. The images are imported into the planning system. The tumor and esophagus for the length of the PTV are contoured on each CBCT, and a 5-mm margin is added. A composite volume (PTV1) is created using the Week 1 composite CBCT volumes. The same process is repeated using CBCT from Weeks 2 to 6 (PTV2). A new plan is created using PTV1 (adaptive plan). The coverage of the 95% isodose of PTV1 is evaluated on PTV2. Dose-volume histograms (DVH) for lungs, heart, and cord for the two plans are compared. Results: A total of 139 CBCT for 14 cases were analyzed. For the adaptive plan, coverage of the 95% prescription isodose was 95.6% ± 4% for PTV1 and 96.8% ± 4.1% for PTV2 (t test, 0.19). Lungs V20 (15.6 Gy vs. 10.2 Gy) and heart mean dose (26.9 Gy vs. 20.7 Gy) were significantly smaller for the adaptive plan. Conclusions: A reduced planning volume can be constructed within the first week of treatment using CBCT. A single plan modification can be performed within the second week of treatment with considerable reduction in organ-at-risk dose.

  9. The impacts of computer adaptive testing from a variety of perspectives

    Directory of Open Access Journals (Sweden)

    Tetsuo Kimura

    2017-05-01

    Full Text Available Computer adaptive testing (CAT) is a form of tailored, computer-based testing that adapts to each test-taker's ability level. In this review, the impacts of CAT are discussed from different perspectives in order to illustrate crucial points to keep in mind during the development and implementation of CAT. Test developers and psychometricians often emphasize the efficiency and accuracy of CAT in comparison to traditional linear tests. However, many test-takers report feeling discouraged after taking CATs, and this feeling can reduce learning self-efficacy and motivation. A trade-off must be made between the psychological experiences of test-takers and measurement efficiency. From the perspective of educators and subject matter experts, nonstatistical specifications, such as content coverage, content balance, and form length, are major concerns. Thus, accreditation bodies may be faced with a discrepancy between the perspectives of psychometricians and those of subject matter experts. In order to improve test-takers' impressions of CAT, the author proposes increasing the target probability of answering correctly in the item selection algorithm, even if doing so decreases measurement efficiency. Two different methods, CAT with a shadow test approach and computerized multistage testing, have been developed in order to ensure the satisfaction of subject matter experts. In the shadow test approach, a full-length test is assembled that meets the constraints and provides maximum information at the current ability estimate, while computerized multistage testing gives subject matter experts an opportunity to review all test forms prior to administration.
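
    The proposal of raising the target probability of a correct answer can be made concrete with a short sketch. Assuming a standard two-parameter logistic IRT model and a target of 0.7 (both assumptions for illustration; the article does not specify a model), the selector picks the unused item whose predicted success probability at the current ability estimate is closest to the target, rather than the most informative item:

    ```python
    import numpy as np

    def p_correct(theta, a, b):
        """Two-parameter logistic IRT model: P(correct | ability theta)."""
        return 1.0 / (1.0 + np.exp(-a * (theta - b)))

    def select_item(theta_hat, a, b, administered, target=0.7):
        """Pick the unused item whose success probability at the current
        ability estimate is closest to the target probability."""
        p = p_correct(theta_hat, a, b)
        p[list(administered)] = np.inf  # exclude items already given
        return int(np.argmin(np.abs(p - target)))

    rng = np.random.default_rng(0)
    a = rng.uniform(0.8, 2.0, size=100)   # item discrimination parameters
    b = rng.normal(0.0, 1.0, size=100)    # item difficulty parameters
    item = select_item(theta_hat=0.3, a=a, b=b, administered={5, 17})
    print(item)
    ```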

  10. Health adaptation policy for climate vulnerable groups: a 'critical computational linguistics' analysis.

    Science.gov (United States)

    Seidel, Bastian M; Bell, Erica

    2014-11-28

    Many countries are developing or reviewing national adaptation policy for climate change, but the extent to which these policies meet the health needs of vulnerable groups has not been assessed. This study examines the adequacy of such policies for nine known climate-vulnerable groups: people with mental health conditions, Aboriginal people, culturally and linguistically diverse groups, aged people, people with disabilities, rural communities, children, women, and socioeconomically disadvantaged people. The study analyses an exhaustive sample of national adaptation policy documents from Annex 1 ('developed') countries of the United Nations Framework Convention on Climate Change: 20 documents from 12 countries. A 'critical computational linguistics' method was used, involving novel software-driven quantitative mapping and traditional critical discourse analysis. The study finds that references to vulnerable groups are sparse or absent, and poorly connected to language about practical strategies and socio-economic contexts, which are themselves little present. The conclusions offer strategies for developing policy that is better informed by a 'social determinants of health' definition of climate vulnerability, consistent with best practice in the literature and with global policy prescriptions.

  11. Adaptive hybrid brain-computer interaction: ask a trainer for assistance!

    Science.gov (United States)

    Müller-Putz, Gernot R; Steyrl, David; Faller, Josef

    2014-01-01

    In applying mental imagery brain-computer interfaces (BCIs) with end users, training is key for novice users to gain control. In general learning situations, it is an established concept that a trainer assists a trainee in improving his/her aptitude in certain skills. In this work, we evaluate whether this concept can be applied in the context of event-related desynchronization (ERD) based, adaptive, hybrid BCIs. Hence, in a first session we merged the features of a high-aptitude BCI user, the trainer, and a novice user, the trainee, in a closed-loop BCI feedback task and automatically adapted the classifier over time. In a second session the trainees operated the system unassisted. Twelve healthy participants ran through this protocol. Along with the trainer, the trainees achieved a very high overall peak accuracy of 95.3%. In the second session, where users operated the BCI unassisted, they still achieved a high overall peak accuracy of 83.6%. Ten of twelve first-time BCI users achieved accuracy significantly better than chance. In conclusion, this trainer-trainee approach is very promising. Future research should investigate whether it is superior to conventional training approaches. This trainer-trainee concept could have potential for future application of BCIs to end users.

  12. Efficient computation of the elastography inverse problem by combining variational mesh adaption and a clustering technique

    International Nuclear Information System (INIS)

    Arnold, Alexander; Bruhns, Otto T; Reichling, Stefan; Mosler, Joern

    2010-01-01

    This paper is concerned with an efficient implementation suitable for the elastography inverse problem. More precisely, the novel algorithm allows us to compute the unknown stiffness distribution in soft tissue from the measured displacement field while considerably reducing the numerical cost compared to previous approaches. This is realized by combining and further elaborating variational mesh adaption with a clustering technique similar to those known from digital image compression. Within the variational mesh adaption, the underlying finite element discretization is only locally refined if this leads to a considerable improvement of the numerical solution. Additionally, the numerical complexity is reduced by the aforementioned clustering technique, in which the parameters describing the stiffness of the respective soft tissue are sorted into a predefined number of intervals. By doing so, the number of unknowns associated with the elastography inverse problem can be chosen explicitly. A positive side effect of this method is the reduction of artificial noise in the data (smoothing of the solution). The performance and the rate of convergence of the resulting numerical formulation are critically analyzed by means of numerical examples.
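
    The clustering idea — many finite elements sharing one stiffness unknown — can be illustrated with a minimal numpy sketch. Quantile intervals and per-interval means are assumed choices here; the record does not specify how the intervals are defined:

    ```python
    import numpy as np

    def cluster_stiffness(stiffness, n_intervals=8):
        """Sort per-element stiffness values into a predefined number of
        intervals (quantile bins, an assumed choice); all elements in one
        interval share a single unknown, so the inverse problem has
        n_intervals parameters instead of one per finite element."""
        edges = np.quantile(stiffness, np.linspace(0.0, 1.0, n_intervals + 1))
        labels = np.clip(np.searchsorted(edges, stiffness, side="right") - 1,
                         0, n_intervals - 1)
        # Representative value per interval, e.g. the mean of its members.
        reps = np.array([stiffness[labels == k].mean()
                         for k in range(n_intervals)])
        return labels, reps

    field = np.random.default_rng(1).lognormal(mean=1.0, sigma=0.5, size=10_000)
    labels, reps = cluster_stiffness(field)
    quantized = reps[labels]  # low-dimensional, de-noised stiffness map
    ```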

  13. An Adaptive and Integrated Low-Power Framework for Multicore Mobile Computing

    Directory of Open Access Journals (Sweden)

    Jongmoo Choi

    2017-01-01

    Full Text Available Employing multicore in mobile computing, such as in smartphones and IoT (Internet of Things) devices, is a double-edged sword. It provides the ample computing capability required by recent intelligent mobile services, including voice recognition, image processing, big data analysis, and deep learning. However, it consumes a great deal of power, which creates thermal hot spots and puts pressure on the energy resources of a mobile device. In this paper, we propose a novel framework that integrates two well-known low-power techniques, DPM (Dynamic Power Management) and DVFS (Dynamic Voltage and Frequency Scaling), for energy efficiency in multicore mobile systems. The key feature of the proposed framework is adaptability. By monitoring online resource usage, such as CPU utilization and power consumption, the framework can orchestrate diverse DPM and DVFS policies according to workload characteristics. Experiments with real implementations on three mobile devices have shown that it can reduce power consumption by 22% to 79% while negligibly affecting the performance of workloads.
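
    To make the DPM/DVFS orchestration concrete, here is a schematic Python sketch of one monitoring tick. The thresholds, frequency steps, and the utilization reader are invented for illustration; the paper's actual policies and interfaces are not given in the record:

    ```python
    import random

    FREQS_GHZ = [0.5, 1.0, 1.5, 2.0]  # assumed per-core DVFS steps

    def read_utilization(core):
        """Stand-in for the framework's online monitoring (e.g., /proc/stat
        or a power sensor on a real device)."""
        return random.random()

    def governor_tick(n_cores, idle_threshold=0.2):
        """One monitoring interval: DPM power-gates near-idle cores, while
        DVFS picks the lowest frequency covering the observed demand."""
        decisions = {}
        for core in range(n_cores):
            u = read_utilization(core)
            if u < idle_threshold:
                decisions[core] = "off (DPM)"
            else:
                step = min(int(u * len(FREQS_GHZ)), len(FREQS_GHZ) - 1)
                decisions[core] = f"{FREQS_GHZ[step]} GHz (DVFS)"
        return decisions

    print(governor_tick(n_cores=4))
    ```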

  14. Adapting a computer-delivered brief alcohol intervention for veterans with Hepatitis C.

    Science.gov (United States)

    Cucciare, Michael A; Jamison, Andrea L; Combs, Ann S; Joshi, Gauri; Cheung, Ramsey C; Rongey, Catherine; Huggins, Joe; Humphreys, Keith

    2017-12-01

    This study adapted an existing computer-delivered brief alcohol intervention (cBAI) for use in Veterans with the hepatitis C virus (HCV) and examined its acceptability and feasibility in this patient population. A four-stage model consisting of initial pilot testing, qualitative interviews with key stakeholders, development of a beta version of the cBAI, and usability testing was used to achieve the study objectives. In-depth interviews gathered feedback for modifying the cBAI, including adding HCV-related content such as the health effects of alcohol on liver functioning, immune system functioning, and management of HCV, a preference for concepts to be displayed through "newer looking" graphics, and limiting the use of text to convey key concepts. Results from usability testing indicated that the modified cBAI was acceptable and feasible for use in this patient population. The development model used in this study is effective for gathering actionable feedback that can inform the development of a cBAI and can result in the development of an acceptable and feasible intervention for use in this population. Findings also have implications for developing computer-delivered interventions targeting behavior change more broadly.

  15. Effect of radiation dose and adaptive statistical iterative reconstruction on image quality of pulmonary computed tomography

    International Nuclear Information System (INIS)

    Sato, Jiro; Akahane, Masaaki; Inano, Sachiko; Terasaki, Mariko; Akai, Hiroyuki; Katsura, Masaki; Matsuda, Izuru; Kunimatsu, Akira; Ohtomo, Kuni

    2012-01-01

    The purpose of this study was to assess the effects of dose and adaptive statistical iterative reconstruction (ASIR) on image quality of pulmonary computed tomography (CT). Inflated and fixed porcine lungs were scanned with a 64-slice CT system at 10, 20, 40 and 400 mAs. Using automatic exposure control, 40 mAs was chosen as standard dose. Scan data were reconstructed with filtered back projection (FBP) and ASIR. Image pairs were obtained by factorial combination of images at a selected level. Using a 21-point scale, three experienced radiologists independently rated differences in quality between adjacently displayed paired images for image noise, image sharpness and conspicuity of tiny nodules. A subjective quality score (SQS) for each image was computed based on Anderson's functional measurement theory. The standard deviation was recorded as a quantitative noise measurement. At all doses examined, SQSs improved with ASIR for all evaluation items. No significant differences were noted between the SQSs for 40%-ASIR images obtained at 20 mAs and those for FBP images at 40 mAs. Compared to the FBP algorithm, ASIR for lung CT can enable an approximately 50% dose reduction from the standard dose while preserving visualization of small structures. (author)

  16. On-Line Testing and Reconfiguration of Field Programmable Gate Arrays (FPGAs) for Fault-Tolerant (FT) Applications in Adaptive Computing Systems (ACS)

    National Research Council Canada - National Science Library

    Abramovici, Miron

    2002-01-01

    Adaptive computing systems (ACS) rely on reconfigurable hardware to adapt the system operation to changes in the external environment, and to extend mission capability by implementing new functions on the same hardware platform...

  17. Lessons Learned in Designing and Implementing a Computer-Adaptive Test for English

    Directory of Open Access Journals (Sweden)

    Jack Burston

    2014-09-01

    Full Text Available This paper describes the lessons learned in designing and implementing a computer-adaptive test (CAT) for English. The early identification of students with weak L2 English proficiency is of critical importance in university settings that have compulsory English language course graduation requirements. The most efficient means of diagnosing the L2 English ability of incoming students is a computer-based test, since such evaluation can be administered quickly, corrected automatically, and the outcome known as soon as the test is completed. While the option of using a commercial CAT is available to institutions with the ability to pay substantial annual fees, or the means of passing these expenses on to their students, language instructors without these resources can only avail themselves of the advantages of CAT evaluation by creating their own tests. As is demonstrated by the E-CAT project described in this paper, this is a viable alternative even for those lacking any computer programming expertise. However, language teaching experience and testing expertise are critical to such an undertaking, which requires considerable effort and, above all, collaborative teamwork to succeed. A number of practical skills are also required. Firstly, the operation of a CAT authoring programme must be learned. Once this is done, test makers must master the art of creating a question database and assigning difficulty levels to test items. Lastly, if multimedia resources are to be exploited in a CAT, test creators need to be able to locate suitable copyright-free resources and re-edit them as needed.

  18. Modeling the Effects of Trait Diversity on Short-term Adaptive Capacity and Long-term Productivity of Phytoplankton Communities

    Science.gov (United States)

    Smith, S. L.; Vallina, S. M.; Merico, A.

    2016-02-01

    We examine Biodiversity and Ecosystem Function (BEF) in a model phytoplankton community, using two recently developed mechanisms for sustaining diversity. The Trait Diffusion (TD) formulation represents the maintenance of diversity via endogenous mechanisms, such as inter-generational trait plasticity and rapid evolution. The 'Kill-the-Winner' (KTW) formulation for grazing sustains prey biodiversity via the exogenous mechanism of active prey switching. We implement both TD and KTW in a continuous trait-distribution model using simplified size-scalings to define a gleaner-opportunist trade-off for a phytoplankton community. By simulating semi-continuous culture experiments with periodic external dilutions, we test the dynamic response of the phytoplankton community to different scenarios of pulsed nutrient supply. We quantify the short-term Adaptive Capacity (AC) of the community by the specific growth rate averaged over the first 3 days of perturbations, and the Long-term Productivity (LP) by its average over the entire 120-day period of perturbations. When either the frequency or intensity of pulses is low, both AC and LP tend to decrease with diversity (and vice versa). Trait diversity has more impact on AC, particularly for pulses of high frequency or intensity, for which it tends to increase gradually at first, then steeply, and then to saturate with increasing diversity. For pulses of moderate intensity and frequency, increasing trait diversity from low to moderate levels leads to a trade-off between enhancing AC while slightly decreasing LP. Ultimately, we find that sustaining diversity increases the speed at which the phytoplankton community changes its composition in terms of size and hence nutrient acquisition traits, which may have implications for the transfer of productivity through the food web.
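
    The two summary metrics are simple time averages and can be written down directly from the definitions above; the toy growth-rate series below is invented purely to show the calculation:

    ```python
    import numpy as np

    def ac_and_lp(t_days, mu):
        """Adaptive Capacity: mean specific growth rate over the first 3 days
        of perturbations; Long-term Productivity: mean over the full 120-day
        period (definitions from the abstract; the data here are invented)."""
        ac = mu[t_days <= 3].mean()
        lp = mu[t_days <= 120].mean()
        return ac, lp

    t = np.linspace(0, 120, 1201)         # output every 0.1 days
    mu = 0.5 + 0.2 * np.exp(-t / 10)      # invented growth-rate time series
    print(ac_and_lp(t, mu))
    ```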

  19. Potential Bone to Implant Contact Area of Short Versus Standard Implants: An In Vitro Micro-Computed Tomography Analysis.

    Science.gov (United States)

    Quaranta, Alessandro; D'Isidoro, Orlando; Bambini, Fabrizio; Putignano, Angelo

    2016-02-01

    To compare the available potential bone-implant contact (PBIC) area of standard and short dental implants by micro-computed tomography (μCT) assessment. Three short implants with different diameters (4.5 × 6 mm, 4.1 × 7 mm, and 4.1 × 6 mm) and 2 standard implants (3.5 × 10 mm and 3.3 × 9 mm) with diverse design and surface features were scanned with μCT. Cross-sectional images were obtained. Image data were manually processed to find the plane that corresponds to the most coronal contact point between the crestal bone and implant. The available PBIC was calculated for each sample. Later on, the cross-sectional slices were processed with 3-dimensional (3D) software, and 3D images of each sample were used for descriptive analysis and to display the micro- and macrotopography. The wide-diameter short implant (4.5 × 6 mm) showed the highest PBIC value (210.89 mm²), followed by the standard implants (178.07 mm² and 185.37 mm²) and the other short implants (130.70 mm² and 110.70 mm²). Wide-diameter short implants show a surface area comparable with that of standard implants. Micro-CT analysis is a promising technique to evaluate surface area in dental implants with different macrodesign, microdesign, and surface features.

  20. Cross-Cultural Adaptation of a Farsi Version of the Impulsive Behavior Scale‎-Short Form in Iran

    Directory of Open Access Journals (Sweden)

    Omid Shokri

    2016-12-01

    Full Text Available Background: The aim of the present study was to investigate the psychometric properties of the Impulsive Behavior Scale-Short Form (IBS-SF) among undergraduate Farsi-speaking Iranian students. In this study, 201 individuals (95 men, 106 women) completed the IBS-SF and the Problematic and Risky Internet Use Screening Scale (PRIUSS).Methods: Confirmatory factor analysis and internal consistency methods were used to assess the factorial validity and reliability of the IBS-SF, respectively. In order to examine the construct validity of the IBS-SF, the correlation of different dimensions of the IBS-SF with the PRIUSS was determined.Results: The results of confirmatory factor analysis showed that the 5-factor structure of negative urgency, lack of perseverance, lack of premeditation, sensation seeking, and positive urgency was replicated in the Iranian sample. The convergent validity of the IBS-SF was supported by correlations between the impulsivity facets and problematic and risky internet use behavior. The internal consistency of the impulsivity subscales ranged from 0.67 to 0.80.Conclusion: The present study revealed that the IBS-SF is a valid and reliable scale for measuring the impulsivity trait among undergraduate Farsi-speaking Iranian students.

  1. A Non-Linear Digital Computer Model Requiring Short Computation Time for Studies Concerning the Hydrodynamics of the BWR

    Energy Technology Data Exchange (ETDEWEB)

    Reisch, F; Vayssier, G

    1969-05-15

    This non-linear model serves as one of the blocks in a series of codes to study the transient behaviour of BWR or PWR type reactors. This program is intended to be the hydrodynamic part of the BWR core representation or the hydrodynamic part of the PWR heat exchanger secondary side representation. The equations have been prepared for the CSMP digital simulation language. By using the most suitable integration routine available, the ratio of simulation time to real time is about one on an IBM 360/75 digital computer. Use of the slightly different language DSL/40 on an IBM 7044 computer takes about four times longer. The code has been tested against the Eindhoven loop with satisfactory agreement.

  2. An open trial assessment of "The Number Race", an adaptive computer game for remediation of dyscalculia

    Directory of Open Access Journals (Sweden)

    Cohen Laurent

    2006-05-01

    Full Text Available Abstract Background In a companion article [1], we described the development and evaluation of software designed to remediate dyscalculia. This software is based on the hypothesis that dyscalculia is due to a "core deficit" in number sense or in its access via symbolic information. Here we review the evidence for this hypothesis, and present results from an initial open-trial test of the software in a sample of nine 7–9 year old children with mathematical difficulties. Methods Children completed adaptive training on numerical comparison for half an hour a day, four days a week over a period of five weeks. They were tested before and after intervention on their performance in core numerical tasks: counting, transcoding, base-10 comprehension, enumeration, addition, subtraction, and symbolic and non-symbolic numerical comparison. Results Children showed specific increases in performance on core number sense tasks. Speed of subitizing and numerical comparison increased by several hundred msec. Subtraction accuracy increased by an average of 23%. Performance on addition and base-10 comprehension tasks did not improve over the period of the study. Conclusion Initial open-trial testing showed promising results, and suggested that the software was successful in increasing number sense over the short period of the study. However these results need to be followed up with larger, controlled studies. The issues of transfer to higher-level tasks, and of the best developmental time window for intervention also need to be addressed.

  3. An open trial assessment of "The Number Race", an adaptive computer game for remediation of dyscalculia

    Science.gov (United States)

    Wilson, Anna J; Revkin, Susannah K; Cohen, David; Cohen, Laurent; Dehaene, Stanislas

    2006-01-01

    Background In a companion article [1], we described the development and evaluation of software designed to remediate dyscalculia. This software is based on the hypothesis that dyscalculia is due to a "core deficit" in number sense or in its access via symbolic information. Here we review the evidence for this hypothesis, and present results from an initial open-trial test of the software in a sample of nine 7–9 year old children with mathematical difficulties. Methods Children completed adaptive training on numerical comparison for half an hour a day, four days a week over a period of five weeks. They were tested before and after intervention on their performance in core numerical tasks: counting, transcoding, base-10 comprehension, enumeration, addition, subtraction, and symbolic and non-symbolic numerical comparison. Results Children showed specific increases in performance on core number sense tasks. Speed of subitizing and numerical comparison increased by several hundred msec. Subtraction accuracy increased by an average of 23%. Performance on addition and base-10 comprehension tasks did not improve over the period of the study. Conclusion Initial open-trial testing showed promising results, and suggested that the software was successful in increasing number sense over the short period of the study. However these results need to be followed up with larger, controlled studies. The issues of transfer to higher-level tasks, and of the best developmental time window for intervention also need to be addressed. PMID:16734906

  4. Computationally Efficient Blind Code Synchronization for Asynchronous DS-CDMA Systems with Adaptive Antenna Arrays

    Directory of Open Access Journals (Sweden)

    Chia-Chang Hu

    2005-04-01

    Full Text Available A novel space-time adaptive near-far robust code-synchronization array detector for asynchronous DS-CDMA systems is developed in this paper. It has the same basic requirements as the conventional matched filter of an asynchronous DS-CDMA system. For real-time applicability, a computationally efficient architecture of the proposed detector is developed, based on the concept of the multistage Wiener filter (MWF) of Goldstein and Reed. This multistage technique results in a self-synchronizing detection criterion that requires no inversion or eigendecomposition of a covariance matrix. As a consequence, this detector achieves a complexity that is only a linear function of the size of the antenna array (J), the rank of the MWF (M), the system processing gain (N), and the number of samples in a chip interval (S), that is, 𝒪(JMNS). The complexity of the equivalent detector based on the minimum mean-squared error (MMSE) criterion or on subspace-based eigenstructure analysis is 𝒪((JNS)³). Moreover, this multistage scheme provides rapid adaptive convergence under limited observation-data support. Simulations are conducted to evaluate the performance and convergence behavior of the proposed detector with the size of the J-element antenna array, the amount of the L-sample support, and the rank of the M-stage MWF. The performance advantage of the proposed detector over other DS-CDMA detectors is investigated as well.

  5. Colonic GLP-2 is not sufficient to promote jejunal adaptation in a PN-dependent rat model of human short bowel syndrome

    DEFF Research Database (Denmark)

    Koopmann, Matthew C; Liu, Xiaowen; Boehler, Christopher J

    2009-01-01

    BACKGROUND: Bowel resection may lead to short bowel syndrome (SBS), which often requires parenteral nutrition (PN) due to inadequate intestinal adaptation. The objective of this study was to determine the time course of adaptation and proglucagon system responses after bowel resection in a PN...... and digestive capacity were assessed by mucosal mass, protein, DNA, histology, and sucrase activity. Plasma insulin-like growth factor I (IGF-I) and bioactive glucagon-like peptide 2 (GLP-2) were measured by radioimmunoassay. RESULTS: Jejunum cellularity changed significantly over time with resection...

  6. Goal-recognition-based adaptive brain-computer interface for navigating immersive robotic systems

    Science.gov (United States)

    Abu-Alqumsan, Mohammad; Ebert, Felix; Peer, Angelika

    2017-06-01

    Objective. This work proposes principled strategies for self-adaptation in EEG-based brain-computer interfaces (BCIs) as a way out of the bandwidth bottleneck resulting from the considerable mismatch between the low-bandwidth interface and the bandwidth-hungry application, and as a way to enable fluent and intuitive interaction in embodiment systems. The main focus is laid upon inferring the hidden target goals of users while navigating in a remote environment as a basis for possible adaptations. Approach. To reason about possible user goals, a general user-agnostic Bayesian update rule is devised to be applied recursively upon the arrival of evidence, i.e., user input and user gaze. Experiments were conducted with healthy subjects within robotic embodiment settings to evaluate the proposed method. These experiments varied along three factors: the type of robot/environment (simulated or physical), the type of interface (keyboard or BCI), and the way goal recognition (GR) is used to guide a simple shared control (SC) driving scheme. Main results. Our results show that the proposed GR algorithm is able to track and infer the hidden user goals with relatively high precision and recall. Further, the realized SC driving scheme benefits from the output of the GR system and is able to reduce the user effort needed to accomplish the assigned tasks. Despite the fact that the BCI requires higher effort compared to the keyboard conditions, most subjects were able to complete the assigned tasks, and the proposed GR system is additionally shown to be able to handle the uncertainty in user input during SSVEP-based interaction. The SC application of the belief vector indicates that the benefits of the GR module are more pronounced for BCIs, compared to the keyboard interface. Significance. Being based on intuitive heuristics that model the behavior of the general population during the execution of navigation tasks, the proposed GR method can be used without prior tuning for the
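
    The recursive Bayesian update over candidate goals described above reduces to multiplying a prior belief by per-goal evidence likelihoods and renormalizing. A minimal sketch follows; the four goals and the likelihood values are invented for illustration, and in the actual system the likelihoods would come from models of user input and gaze:

    ```python
    import numpy as np

    def update_goal_belief(prior, likelihoods):
        """One recursive Bayes step over candidate navigation goals:
        posterior ∝ P(evidence | goal) * prior, renormalized."""
        posterior = prior * likelihoods
        return posterior / posterior.sum()

    belief = np.full(4, 0.25)                  # uniform prior over 4 goals
    for lik in ([0.7, 0.1, 0.1, 0.1],          # gaze evidence favors goal 0
                [0.6, 0.2, 0.1, 0.1]):         # steering input also favors it
        belief = update_goal_belief(belief, np.array(lik))
    print(belief)  # belief concentrates on goal 0 as evidence accumulates
    ```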

  7. A Gaussian mixture model based adaptive classifier for fNIRS brain-computer interfaces and its testing via simulation

    Science.gov (United States)

    Li, Zheng; Jiang, Yi-han; Duan, Lian; Zhu, Chao-zhe

    2017-08-01

    Objective. Functional near infra-red spectroscopy (fNIRS) is a promising brain imaging technology for brain-computer interfaces (BCI). Future clinical uses of fNIRS will likely require operation over long time spans, during which neural activation patterns may change. However, current decoders for fNIRS signals are not designed to handle changing activation patterns. The objective of this study is to test via simulations a new adaptive decoder for fNIRS signals, the Gaussian mixture model adaptive classifier (GMMAC). Approach. GMMAC can simultaneously classify and track activation pattern changes without the need for ground-truth labels. This adaptive classifier uses computationally efficient variational Bayesian inference to label new data points and update mixture model parameters, using the previous model parameters as priors. We test GMMAC in simulations in which neural activation patterns change over time and compare to static decoders and unsupervised adaptive linear discriminant analysis classifiers. Main results. Our simulation experiments show GMMAC can accurately decode under time-varying activation patterns: shifts of activation region, expansions of activation region, and combined contractions and shifts of activation region. Furthermore, the experiments show the proposed method can track the changing shape of the activation region. Compared to prior work, GMMAC performed significantly better than the other unsupervised adaptive classifiers on a difficult activation pattern change simulation: 99% versus  brain-computer interfaces, including neurofeedback training systems, where operation over long time spans is required.
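
    GMMAC itself uses variational Bayesian inference with the previous model's parameters as priors. The following scikit-learn sketch is only a much simpler analogue of the classify-then-adapt loop — an ordinary Gaussian mixture re-fitted with warm starting on each unlabeled batch, over invented toy features — and is not the paper's algorithm:

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(0)

    def batch(shift):
        """Two-class toy fNIRS features whose class means drift over time."""
        a = rng.normal([0 + shift, 0], 0.3, size=(50, 2))
        b = rng.normal([2 + shift, 2], 0.3, size=(50, 2))
        return np.vstack([a, b])

    gmm = GaussianMixture(n_components=2, warm_start=True, random_state=0)
    gmm.fit(batch(0.0))                  # initial calibration data
    for shift in (0.3, 0.6, 0.9):        # activation pattern slowly shifts
        x = batch(shift)
        labels = gmm.predict(x)          # classify without ground truth
        gmm.fit(x)                       # then adapt; warm_start reuses the
                                         # previous fit as the initialization
    print(gmm.means_)                    # component means track the drift
    ```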

  8. Computational hydrodynamics and optical performance of inductively-coupled plasma adaptive lenses

    Energy Technology Data Exchange (ETDEWEB)

    Mortazavi, M.; Urzay, J., E-mail: jurzay@stanford.edu; Mani, A. [Center for Turbulence Research, Stanford University, Stanford, California 94305-3024 (United States)

    2015-06-15

    This study addresses the optical performance of a plasma adaptive lens for aero-optical applications by using both axisymmetric and three-dimensional numerical simulations. Plasma adaptive lenses are based on the effects of free electrons on the phase velocity of incident light, which, in theory, can be used as a phase-conjugation mechanism. A closed cylindrical chamber filled with Argon plasma is used as a model lens into which a beam of light is launched. The plasma is sustained by applying a radio-frequency electric current through a coil that envelops the chamber. Four different operating conditions, ranging from low to high powers and induction frequencies, are employed in the simulations. The numerical simulations reveal complex hydrodynamic phenomena related to buoyant and electromagnetic laminar transport, which generate, respectively, large recirculating cells and wall-normal compression stresses in the form of local stagnation-point flows. In the axisymmetric simulations, the plasma motion is coupled with near-wall axial striations in the electron-density field, some of which propagate in the form of low-frequency traveling disturbances adjacent to vortical quadrupoles that are reminiscent of Taylor-Görtler flow structures in centrifugally unstable flows. Although the refractive-index fields obtained from axisymmetric simulations lead to smooth beam wavefronts, they are found to be unstable to azimuthal disturbances in three of the four three-dimensional cases considered. The azimuthal striations are optically detrimental, since they produce high-order angular aberrations that account for most of the beam wavefront error. A fourth case is computed at high input power and high induction frequency, which displays the best optical properties among all the three-dimensional simulations considered. In particular, the increase in induction frequency prevents local thermalization and leads to an axisymmetric distribution of electrons even after introduction of

  9. Computer-Aided Modelling of Short-Path Evaporation for Chemical Product Purification, Analysis and Design

    DEFF Research Database (Denmark)

    Sales-Cruz, Alfonso Mauricio; Gani, Rafiqul

    2006-01-01

    method, suitable for separation and purification of thermally unstable materials whose design and analysis can be efficiently performed through reliable model-based techniques. This paper presents a generalized model for short-path evaporation and highlights its development, implementation and solution...

  10. Computer-Automated Approach for Scoring Short Essays in an Introductory Statistics Course

    Science.gov (United States)

    Zimmerman, Whitney Alicia; Kang, Hyun Bin; Kim, Kyung; Gao, Mengzhao; Johnson, Glenn; Clariana, Roy; Zhang, Fan

    2018-01-01

    Over two semesters, short essay prompts were developed for use with the Graphical Interface for Knowledge Structure (GIKS), an automated essay scoring system. Participants were students in an undergraduate-level online introductory statistics course. The GIKS compares students' writing samples with an expert's to produce keyword occurrence and…

  11. Computer adaptive test performance in children with and without disabilities: prospective field study of the PEDI-CAT

    NARCIS (Netherlands)

    Dumas, H.M.; Fragala-Pinkham, M.A.; Haley, S.M.; Ni, P.; Coster, W.; Kramer, J.M.; Kao, Y.C.; Moed, R.; Ludlow, L.H.

    2012-01-01

    PURPOSE: To examine the discriminant validity, test-retest reliability, administration time and acceptability of the pediatric evaluation of disability inventory computer adaptive test (PEDI-CAT). METHODS: A sample of 102 parents of children 3 through 20 years of age with (n = 50) and without (n =

  12. Usability of an adaptive computer assistant that improves self-care and health literacy of older adults

    NARCIS (Netherlands)

    Blanson Henkemans, O.A.; Rogers, W.A.; Fisk, A.D.; Neerincx, M.A.; Lindenberg, J.; Mast, C.A.P.G. van der

    2008-01-01

    Objectives: We developed an adaptive computer assistant for the supervision of diabetics' self-care, to help limit illness and the need for acute treatment, and to improve health literacy. This assistant monitors self-care activities logged in the patient's electronic diary. Accordingly, it provides

  13. A universal electronical adaptation of automats for biochemical analysis to a central processing computer by applying CAMAC-signals

    International Nuclear Information System (INIS)

    Schaefer, R.

    1975-01-01

    A universal expansion of a CAMAC subsystem - BORER 3000 - for adapting analysis instruments in biochemistry to a processing computer is described. The possibility of standardizing input interfaces for lab instruments with such circuits is discussed, and the advantages achieved by applying the CAMAC specifications are described.

  14. The EORTC computer-adaptive tests measuring physical functioning and fatigue exhibited high levels of measurement precision and efficiency

    DEFF Research Database (Denmark)

    Petersen, Morten Aa; Aaronson, Neil K; Arraras, Juan I

    2013-01-01

    The European Organisation for Research and Treatment of Cancer (EORTC) Quality of Life Group is developing a computer-adaptive test (CAT) version of the EORTC Quality of Life Questionnaire (QLQ-C30). We evaluated the measurement properties of the CAT versions of physical functioning (PF...

  15. Development of a Postacute Hospital Item Bank for the New Pediatric Evaluation of Disability Inventory-Computer Adaptive Test

    Science.gov (United States)

    Dumas, Helene M.

    2010-01-01

    The PEDI-CAT is a new computer adaptive test (CAT) version of the Pediatric Evaluation of Disability Inventory (PEDI). Additional PEDI-CAT items specific to postacute pediatric hospital care were recently developed using expert reviews and cognitive interviewing techniques. Expert reviews established face and construct validity, providing positive…

  16. Using Artificial Intelligence to Control and Adapt Level of Difficulty in Computer Based, Cognitive Therapy – an Explorative Study

    DEFF Research Database (Denmark)

    Wilms, Inge Linda

    2011-01-01

    Prism Adaptation Therapy (PAT) is an intervention method in the treatment of the attention disorder neglect (Frassinetti, Angeli, Meneghello, Avanzi, & Ladavas, 2002; Rossetti, et al., 1998). The aim of this study was to investigate whether one session of PAT using a computer-attached touchscreen...

  17. The EORTC computer-adaptive tests measuring physical functioning and fatigue exhibited high levels of measurement precision and efficiency

    NARCIS (Netherlands)

    Petersen, M.A.; Aaronson, N.K.; Arraras, J.I.; Chie, W.C.; Conroy, T.; Constantini, A.; Giesinger, J.M.; Holzner, B.; King, M.T.; Singer, S.; Velikova, G.; Verdonck-de Leeuw, I.M.; Young, T.; Groenvold, M.

    2013-01-01

    Objectives The European Organisation for Research and Treatment of Cancer (EORTC) Quality of Life Group is developing a computer-adaptive test (CAT) version of the EORTC Quality of Life Questionnaire (QLQ-C30). We evaluated the measurement properties of the CAT versions of physical functioning (PF)

  18. The EORTC computer-adaptive tests measuring physical functioning and fatigue exhibited high levels of measurement precision and efficiency

    NARCIS (Netherlands)

    Petersen, M.A.; Aaronson, N.K.; Arraras, J.I.; Chie, W.C.; Conroy, T.; Costantini, A.; Giesinger, J.M.; Holzner, B.; King, M.T.; Singer, S.; Velikova, G.; de Leeuw, I.M.; Young, T.; Groenvold, M.

    2013-01-01

    Objectives: The European Organisation for Research and Treatment of Cancer (EORTC) Quality of Life Group is developing a computer-adaptive test (CAT) version of the EORTC Quality of Life Questionnaire (QLQ-C30). We evaluated the measurement properties of the CAT versions of physical functioning (PF)

  19. Efficient Computation of Multiscale Entropy over Short Biomedical Time Series Based on Linear State-Space Models

    Directory of Open Access Journals (Sweden)

    Luca Faes

    2017-01-01

    Full Text Available The most common approach to assess the dynamical complexity of a time series across multiple temporal scales makes use of the multiscale entropy (MSE) and refined MSE (RMSE) measures. In spite of their popularity, MSE and RMSE lack an analytical framework allowing their calculation for known dynamic processes and cannot be reliably computed over short time series. To overcome these limitations, we propose a method to assess RMSE for autoregressive (AR) stochastic processes. The method makes use of linear state-space (SS) models to provide the multiscale parametric representation of an AR process observed at different time scales and exploits the SS parameters to quantify analytically the complexity of the process. The resulting linear MSE (LMSE) measure is first tested in simulations, both theoretically, to relate the multiscale complexity of AR processes to their dynamical properties, and over short process realizations, to assess its computational reliability in comparison with RMSE. Then, it is applied to the time series of heart period, arterial pressure, and respiration measured in healthy subjects monitored in resting conditions and during physiological stress. This application to short-term cardiovascular variability documents that LMSE can describe better than RMSE the activity of physiological mechanisms producing biological oscillations at different temporal scales.
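
    For concreteness, the classic non-parametric MSE that the paper argues is unreliable for short series is computed by coarse-graining the signal and taking the sample entropy at each scale. A compact numpy sketch follows, with the conventional defaults m = 2 and r = 0.2·SD (this approximates SampEn; template counts at m and m + 1 are not strictly matched):

    ```python
    import numpy as np

    def sample_entropy(x, m=2, r=0.2):
        """SampEn(m, r): -ln of the conditional probability that sequences
        matching for m points (within r * std) also match for m + 1 points."""
        x = np.asarray(x, float)
        tol = r * x.std()
        def match_pairs(mm):
            templates = np.lib.stride_tricks.sliding_window_view(x, mm)
            d = np.abs(templates[:, None, :] - templates[None, :, :]).max(-1)
            return ((d <= tol).sum() - len(templates)) / 2  # exclude self-matches
        b, a = match_pairs(m), match_pairs(m + 1)
        return -np.log(a / b) if a > 0 and b > 0 else np.inf

    def multiscale_entropy(x, max_scale=5):
        """Classic MSE: coarse-grain by non-overlapping averages, then SampEn."""
        x = np.asarray(x, float)
        return [sample_entropy(x[: len(x) // s * s].reshape(-1, s).mean(1))
                for s in range(1, max_scale + 1)]

    print(multiscale_entropy(np.random.default_rng(0).normal(size=500)))
    ```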

  20. Ultralow dose computed tomography attenuation correction for pediatric PET CT using adaptive statistical iterative reconstruction

    International Nuclear Information System (INIS)

    Brady, Samuel L.; Shulkin, Barry L.

    2015-01-01

    Purpose: To develop ultralow dose computed tomography (CT) attenuation correction (CTAC) acquisition protocols for pediatric positron emission tomography CT (PET CT). Methods: A GE Discovery 690 PET CT hybrid scanner was used to investigate the change to quantitative PET and CT measurements when operated at ultralow doses (10–35 mA s). CT quantitation: noise, low-contrast resolution, and CT numbers for 11 tissue substitutes were analyzed in-phantom. CT quantitation was analyzed to a reduction of 90% volume computed tomography dose index (0.39/3.64; mGy) from baseline. To minimize noise infiltration, 100% adaptive statistical iterative reconstruction (ASiR) was used for CT reconstruction. PET images were reconstructed with the lower-dose CTAC iterations and analyzed for: maximum body weight standardized uptake value (SUV_bw) of various diameter targets (range 8–37 mm), background uniformity, and spatial resolution. Radiation dose and CTAC noise magnitude were compared for 140 patient examinations (76 post-ASiR implementation) to determine relative dose reduction and noise control. Results: CT numbers were constant to within 10% from the nondose reduced CTAC image for 90% dose reduction. No change in SUV_bw, background percent uniformity, or spatial resolution for PET images reconstructed with CTAC protocols was found down to 90% dose reduction. Patient population effective dose analysis demonstrated relative CTAC dose reductions between 62% and 86% (3.2/8.3–0.9/6.2). Noise magnitude in dose-reduced patient images increased but was not statistically different from predose-reduced patient images. Conclusions: Using ASiR allowed for aggressive reduction in CT dose with no change in PET reconstructed images while maintaining sufficient image quality for colocalization of hybrid CT anatomy and PET radioisotope uptake.

  1. Influence of Adaptive Statistical Iterative Reconstruction on coronary plaque analysis in coronary computed tomography angiography.

    Science.gov (United States)

    Precht, Helle; Kitslaar, Pieter H; Broersen, Alexander; Dijkstra, Jouke; Gerke, Oke; Thygesen, Jesper; Egstrup, Kenneth; Lambrechtsen, Jess

    The purpose of this study was to study the effect of iterative reconstruction (IR) software on quantitative plaque measurements in coronary computed tomography angiography (CCTA). Thirty patients with three clinical risk factors for coronary artery disease (CAD) had one CCTA performed. Images were reconstructed using FBP and 30% and 60% adaptive statistical IR (ASIR). Coronary plaque analysis was performed per patient and per vessel (LM, LAD, CX, and RCA). Lumen and vessel volumes and plaque burden measurements were based on automatically detected contours in each reconstruction. Lumen and plaque intensity measurements and HU-based plaque characterization were based on corrected contours copied to each reconstruction. No significant changes between FBP and 30% ASIR were found except for lumen (-2.53 HU) and plaque intensities (-1.28 HU). Between FBP and 60% ASIR, the change in total volume showed an increase of 0.94%, 4.36%, and 2.01% for lumen, plaque, and vessel, respectively. The change in total plaque burden between FBP and 60% ASIR was 0.76%. Lumen and plaque intensities decreased between FBP and 60% ASIR by -9.90 HU and -1.97 HU, respectively. The total plaque component volume changes were all small, with a maximum change of -1.13% of necrotic core between FBP and 60% ASIR. Quantitative plaque measurements only showed modest differences between FBP and the 60% ASIR level. Differences were increased lumen, vessel, and plaque volumes, decreased lumen and plaque intensities, and a small percentage change in the individual plaque component volumes. Copyright © 2016 Society of Cardiovascular Computed Tomography. Published by Elsevier Inc. All rights reserved.

  2. Ultralow dose computed tomography attenuation correction for pediatric PET CT using adaptive statistical iterative reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Brady, Samuel L., E-mail: samuel.brady@stjude.org [Division of Diagnostic Imaging, St. Jude Children’s Research Hospital, Memphis, Tennessee 38105 (United States); Shulkin, Barry L. [Nuclear Medicine and Department of Radiological Sciences, St. Jude Children’s Research Hospital, Memphis, Tennessee 38105 (United States)

    2015-02-15

    Purpose: To develop ultralow dose computed tomography (CT) attenuation correction (CTAC) acquisition protocols for pediatric positron emission tomography CT (PET CT). Methods: A GE Discovery 690 PET CT hybrid scanner was used to investigate the change to quantitative PET and CT measurements when operated at ultralow doses (10–35 mA s). CT quantitation: noise, low-contrast resolution, and CT numbers for 11 tissue substitutes were analyzed in-phantom. CT quantitation was analyzed to a reduction of 90% volume computed tomography dose index (0.39/3.64; mGy) from baseline. To minimize noise infiltration, 100% adaptive statistical iterative reconstruction (ASiR) was used for CT reconstruction. PET images were reconstructed with the lower-dose CTAC iterations and analyzed for: maximum body weight standardized uptake value (SUV_bw) of various diameter targets (range 8–37 mm), background uniformity, and spatial resolution. Radiation dose and CTAC noise magnitude were compared for 140 patient examinations (76 post-ASiR implementation) to determine relative dose reduction and noise control. Results: CT numbers were constant to within 10% from the nondose reduced CTAC image for 90% dose reduction. No change in SUV_bw, background percent uniformity, or spatial resolution for PET images reconstructed with CTAC protocols was found down to 90% dose reduction. Patient population effective dose analysis demonstrated relative CTAC dose reductions between 62% and 86% (3.2/8.3–0.9/6.2). Noise magnitude in dose-reduced patient images increased but was not statistically different from predose-reduced patient images. Conclusions: Using ASiR allowed for aggressive reduction in CT dose with no change in PET reconstructed images while maintaining sufficient image quality for colocalization of hybrid CT anatomy and PET radioisotope uptake.

  3. Translation, Validation, and Reliability of the Dutch Late-Life Function and Disability Instrument Computer Adaptive Test.

    Science.gov (United States)

    Arensman, Remco M; Pisters, Martijn F; de Man-van Ginkel, Janneke M; Schuurmans, Marieke J; Jette, Alan M; de Bie, Rob A

    2016-09-01

    Adequate and user-friendly instruments for assessing physical function and disability in older adults are vital for estimating and predicting health care needs in clinical practice. The Late-Life Function and Disability Instrument Computer Adaptive Test (LLFDI-CAT) is a promising instrument for assessing physical function and disability in gerontology research and clinical practice. The aims of this study were: (1) to translate the LLFDI-CAT to the Dutch language and (2) to investigate its validity and reliability in a sample of older adults who spoke Dutch and dwelled in the community. For the assessment of validity of the LLFDI-CAT, a cross-sectional design was used. To assess reliability, measurement of the LLFDI-CAT was repeated in the same sample. The item bank of the LLFDI-CAT was translated with a forward-backward procedure. A sample of 54 older adults completed the LLFDI-CAT, World Health Organization Disability Assessment Schedule 2.0, RAND 36-Item Short-Form Health Survey physical functioning scale (10 items), and 10-Meter Walk Test. The LLFDI-CAT was repeated in 2 to 8 days (mean=4.5 days). Pearson's r and the intraclass correlation coefficient (ICC) (2,1) were calculated to assess validity, group-level reliability, and participant-level reliability. A correlation of .74 for the LLFDI-CAT function scale and the RAND 36-Item Short-Form Health Survey physical functioning scale (10 items) was found. The correlations of the LLFDI-CAT disability scale with the World Health Organization Disability Assessment Schedule 2.0 and the 10-Meter Walk Test were -.57 and -.53, respectively. The ICC (2,1) of the LLFDI-CAT function scale was .84, with a group-level reliability score of .85. The ICC (2,1) of the LLFDI-CAT disability scale was .76, with a group-level reliability score of .81. The high percentage of women in the study and the exclusion of older adults with recent joint replacement or hospitalization limit the generalizability of the results. The Dutch LLFDI-CAT appears to be a valid and reliable instrument for assessing physical function and disability in older adults dwelling in the community.

  4. 3D fast adaptive correlation imaging for large-scale gravity data based on GPU computation

    Science.gov (United States)

    Chen, Z.; Meng, X.; Guo, L.; Liu, G.

    2011-12-01

    In recent years, large scale gravity data sets have been collected and employed to enhance gravity problem-solving abilities of tectonics studies in China. Aiming at large scale data and the requirement of rapid interpretation, previous authors have carried out a lot of work, including fast gradient module inversion and Euler deconvolution depth inversion, 3-D physical property inversion using stochastic subspaces and equivalent storage, and fast inversion using wavelet transforms and a logarithmic barrier method. So it can be said that 3-D gravity inversion has been greatly improved in the last decade. Many authors have added different kinds of a priori information and constraints to deal with nonuniqueness, using models composed of a large number of contiguous cells of unknown property, and obtained good results. However, due to long computation time, instability and other shortcomings, 3-D physical property inversion has not been widely applied to large-scale data yet. In order to achieve 3-D interpretation with high efficiency and precision for geological and ore bodies and obtain their subsurface distribution, there is an urgent need for a fast and efficient inversion method for large scale gravity data. As a relatively new geophysical inversion method, 3D correlation imaging has developed rapidly thanks to the advantages of requiring no a priori information and demanding only a small amount of computer memory. This method was proposed to image the distribution of equivalent excess masses of anomalous geological bodies with high resolution both longitudinally and transversely. In order to transform the equivalent excess masses into real density contrasts, we adopt adaptive correlation imaging for gravity data. After each 3D correlation imaging, we convert the equivalent masses into density contrasts according to the linear relationship, and then carry out a forward gravity calculation for each rectangular cell. Next, we compare the forward gravity data with real data, and
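    The iterative loop outlined in this abstract (image the equivalent excess masses, convert them linearly to density contrasts, forward-model, compare with the observed data) can be sketched schematically. The kernel matrix, the per-cell correlation update, and the stopping rule below are simplifying assumptions of ours, not the authors' implementation:

```python
import numpy as np

def adaptive_correlation_imaging(g_obs, kernel, n_iter=20, tol=1e-3):
    """Schematic loop: correlate the residual anomaly with each cell's
    response to estimate equivalent excess mass, convert it linearly to a
    density-contrast update, forward-model, and compare with the data.
    kernel[i, j] = gravity response at station i of unit density in cell j.
    """
    density = np.zeros(kernel.shape[1])
    col_energy = (kernel ** 2).sum(axis=0)          # per-cell normalization
    residual = g_obs.copy()
    for _ in range(n_iter):
        update = kernel.T @ residual / col_energy   # per-cell correlation
        density += update                           # linear mass -> density
        residual = g_obs - kernel @ density         # forward vs. real data
        if np.linalg.norm(residual) < tol * np.linalg.norm(g_obs):
            break
    return density
```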

  5. An embedded implementation based on adaptive filter bank for brain-computer interface systems.

    Science.gov (United States)

    Belwafi, Kais; Romain, Olivier; Gannouni, Sofien; Ghaffari, Fakhreddine; Djemal, Ridha; Ouni, Bouraoui

    2018-07-15

    Brain-computer interface (BCI) is a new communication pathway for users with neurological deficiencies. The implementation of a BCI system requires complex electroencephalography (EEG) signal processing including filtering, feature extraction and classification algorithms. Most current BCI systems are implemented on personal computers. Therefore, there is a great interest in implementing BCI on embedded platforms to meet system specifications in terms of time response, cost effectiveness, power consumption, and accuracy. This article presents an embedded-BCI (EBCI) system based on a Stratix-IV field programmable gate array. The proposed system relies on the weighted overlap-add (WOLA) algorithm to perform dynamic filtering of EEG signals by analyzing the event-related desynchronization/synchronization (ERD/ERS). The EEG signals are classified, using the linear discriminant analysis algorithm, based on their spatial features. The proposed system performs fast classification within a time delay of 0.430 s/trial, achieving an average accuracy of 76.80% according to an offline approach and 80.25% using our own recordings. The estimated power consumption of the prototype is approximately 0.7 W. Results show that the proposed EBCI system reduces the overall classification error rate for the three datasets of the BCI-competition by 5% compared to other similar implementations. Moreover, experiments show that the proposed system maintains a high accuracy rate with a short processing time, a low power consumption, and a low cost. Performing dynamic filtering of EEG signals using WOLA increases the recognition rate of ERD/ERS patterns of motor imagery brain activity. This approach allows a complete prototype of an EBCI system to be developed that achieves excellent accuracy rates. Copyright © 2018 Elsevier B.V. All rights reserved.
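    The classification stage named here is standard linear discriminant analysis on spatial features. As a desktop-side illustration only (the paper's version runs on an FPGA after WOLA filtering; the synthetic features below are placeholders for real band-power features):

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
# Placeholder features: 120 trials x 16 spatial features, two
# motor-imagery classes (real features would come from the WOLA bank).
X_train = rng.normal(size=(120, 16))
y_train = rng.integers(0, 2, size=120)
X_test = rng.normal(size=(30, 16))

lda = LinearDiscriminantAnalysis().fit(X_train, y_train)
print(lda.predict(X_test)[:10])   # predicted class per trial
```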

  6. Influence of adaptive statistical iterative reconstruction algorithm on image quality in coronary computed tomography angiography.

    Science.gov (United States)

    Precht, Helle; Thygesen, Jesper; Gerke, Oke; Egstrup, Kenneth; Waaler, Dag; Lambrechtsen, Jess

    2016-12-01

    Coronary computed tomography angiography (CCTA) requires high spatial and temporal resolution, increased low contrast resolution for the assessment of coronary artery stenosis, plaque detection, and/or non-coronary pathology. Therefore, new reconstruction algorithms, particularly iterative reconstruction (IR) techniques, have been developed in an attempt to improve image quality with no cost in radiation exposure. To evaluate whether adaptive statistical iterative reconstruction (ASIR) enhances perceived image quality in CCTA compared to filtered back projection (FBP). Thirty patients underwent CCTA due to suspected coronary artery disease. Images were reconstructed using FBP, 30% ASIR, and 60% ASIR. Ninety image sets were evaluated by five observers using the subjective visual grading analysis (VGA) and assessed by proportional odds modeling. Objective quality assessment (contrast, noise, and the contrast-to-noise ratio [CNR]) was analyzed with linear mixed effects modeling on log-transformed data. The need for ethical approval was waived by the local ethics committee as the study only involved anonymously collected clinical data. VGA showed significant improvements in sharpness by comparing FBP with ASIR, resulting in odds ratios of 1.54 for 30% ASIR and 1.89 for 60% ASIR (P = 0.004). The objective measures showed significant differences between FBP and 60% ASIR (P < 0.0001) for noise, with an estimated ratio of 0.82, and for CNR, with an estimated ratio of 1.26. ASIR improved the subjective image quality of parameter sharpness and, objectively, reduced noise and increased CNR.
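    The CNR reported above follows the usual CT definition of contrast over background noise; a minimal sketch (the study's exact ROI placement is not reproduced here):

```python
import numpy as np

def cnr(roi_hu: np.ndarray, background_hu: np.ndarray) -> float:
    """Contrast-to-noise ratio as commonly used in CT image quality work:
    (mean ROI HU - mean background HU) / standard deviation of background."""
    return (roi_hu.mean() - background_hu.mean()) / background_hu.std(ddof=1)
```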

  7. Numerical Computation of Underground Inundation in Multiple Layers Using the Adaptive Transfer Method

    Directory of Open Access Journals (Sweden)

    Hyung-Jun Kim

    2018-01-01

    Full Text Available Extreme rainfall causes surface runoff to flow towards lowlands and subterranean facilities, such as subway stations and buildings with underground spaces in densely packed urban areas. These facilities and areas are therefore vulnerable to catastrophic submergence. However, flood modeling of underground space has not yet been adequately studied because there are difficulties in reproducing the associated multiple horizontal layers connected with staircases or elevators. This study proposes a convenient approach to simulate underground inundation when two layers are connected. The main facet of this approach is to compute the flow flux passing through staircases in an upper layer and to transfer the equivalent quantity to a lower layer. This is defined as the ‘adaptive transfer method’. This method overcomes the limitations of 2D modeling by introducing a layer-connecting concept that prevents large variations in mesh size caused by complicated underlying obstacles or local details. Consequently, this study aims to contribute to the numerical analysis of flow in inundated underground spaces with multiple floors.
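    A minimal sketch of the layer-coupling step described above: water standing over a staircase opening in the upper layer drains to the floor below. The orifice-type rating curve and the discharge coefficient are our illustrative assumptions, not the authors' exact flux formula:

```python
import math

def transfer_staircase_flux(h_upper, h_lower, stair_area, cell_area, dt,
                            discharge_coeff=0.5):
    """One explicit coupling step in the spirit of the adaptive transfer
    method: depth h_upper (m) over a staircase opening of area stair_area
    (m^2) drains into the cell below; both cells have plan area cell_area."""
    g = 9.81
    # Orifice-style outflow through the opening (m^3/s).
    q = discharge_coeff * stair_area * math.sqrt(2.0 * g * max(h_upper, 0.0))
    # The upper cell cannot drain more water than it holds in one step.
    q = min(q, h_upper * cell_area / dt)
    dh = q * dt / cell_area
    return h_upper - dh, h_lower + dh
```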

  8. Adaptive statistical iterative reconstruction for volume-rendered computed tomography portovenography. Improvement of image quality

    International Nuclear Information System (INIS)

    Matsuda, Izuru; Hanaoka, Shohei; Akahane, Masaaki

    2010-01-01

    Adaptive statistical iterative reconstruction (ASIR) is a reconstruction technique for computed tomography (CT) that reduces image noise. The purpose of our study was to investigate whether ASIR improves the quality of volume-rendered (VR) CT portovenography. Institutional review board approval, with waived consent, was obtained. A total of 19 patients (12 men, 7 women; mean age 69.0 years; range 25-82 years) suspected of having liver lesions underwent three-phase enhanced CT. VR image sets were prepared with both the conventional method and ASIR. The required time to make VR images was recorded. Two radiologists performed independent qualitative evaluations of the image sets. The Wilcoxon signed-rank test was used for statistical analysis. Contrast-noise ratios (CNRs) of the portal and hepatic vein were also evaluated. Overall image quality was significantly improved by ASIR (P<0.0001 and P=0.0155 for each radiologist). ASIR enhanced CNRs of the portal and hepatic vein significantly (P<0.0001). The time required to create VR images was significantly shorter with ASIR (84.7 vs. 117.1 s; P=0.014). ASIR enhances CNRs and improves image quality in VR CT portovenography. It also shortens the time required to create liver VR CT portovenographs. (author)

  9. The Short International Physical Activity Questionnaire: cross-cultural adaptation, validation and reliability of the Hausa language version in Nigeria.

    Science.gov (United States)

    Oyeyemi, Adewale L; Oyeyemi, Adetoyeje Y; Adegoke, Babatunde O; Oyetoke, Fatima O; Aliyu, Habeeb N; Aliyu, Salamatu U; Rufai, Adamu A

    2011-11-22

    Accurate assessment of physical activity is important in determining the risk for chronic diseases such as cardiovascular disease, stroke, type 2 diabetes, cancer and obesity. The absence of culturally relevant measures in indigenous languages could pose challenges to epidemiological studies on physical activity in developing countries. The purpose of this study was to translate and cross-culturally adapt the Short International Physical Activity Questionnaire (IPAQ-SF) to the Hausa language, and to evaluate the validity and reliability of the Hausa version of the IPAQ-SF in Nigeria. The English IPAQ-SF was translated into the Hausa language, synthesized, back translated, and subsequently subjected to expert committee review and pre-testing. The final product (Hausa IPAQ-SF) was tested in a cross-sectional study for concurrent (correlation with the English version) and construct validity, and test-retest reliability, in a sample of 102 apparently healthy adults. The Hausa IPAQ-SF had good concurrent validity, with Spearman correlation coefficients (ρ) ranging from 0.78 for vigorous activity (min·week⁻¹) to 0.92 for total physical activity (Metabolic Equivalent of Task [MET]·min·week⁻¹), but poor construct validity, with cardiorespiratory fitness (ρ = 0.21, p = 0.01) and body mass index (ρ = 0.22, p = 0.04) significantly correlated with only moderate activity and sitting time (min·week⁻¹), respectively. Reliability was good for vigorous activity (ICC = 0.73, 95% CI = 0.55-0.84) and total physical activity (ICC = 0.61, 95% CI = 0.47-0.72), but fair for moderate activity (ICC = 0.33, 95% CI = 0.12-0.51), and few meaningful differences were found in the gender- and socioeconomic-status-specific analyses. The Hausa IPAQ-SF has acceptable concurrent validity and test-retest reliability for vigorous-intensity activity, walking, sitting and total physical activity, but demonstrated only fair construct validity for moderate and sitting activities. The Hausa IPAQ-SF can be used for assessing physical activity among Hausa-speaking adults in Nigeria.
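    The test-retest statistic used in this and several neighbouring records, ICC(2,1), follows from a two-way random-effects ANOVA decomposition (the Shrout-Fleiss formula). A self-contained sketch, with a hypothetical subjects-by-sessions score matrix as input:

```python
import numpy as np

def icc_2_1(scores: np.ndarray) -> float:
    """ICC(2,1): two-way random effects, absolute agreement, single measure.
    `scores` is an (n subjects x k sessions) matrix, e.g. test vs. retest."""
    n, k = scores.shape
    grand = scores.mean()
    ss_rows = k * ((scores.mean(axis=1) - grand) ** 2).sum()
    ss_cols = n * ((scores.mean(axis=0) - grand) ** 2).sum()
    ss_err = ((scores - grand) ** 2).sum() - ss_rows - ss_cols
    ms_r = ss_rows / (n - 1)                # between-subjects mean square
    ms_c = ss_cols / (k - 1)                # between-sessions mean square
    ms_e = ss_err / ((n - 1) * (k - 1))     # residual mean square
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)
```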

  10. Computational Design of Short Pulse Laser Driven Iron Opacity Measurements at Stellar-Relevant Conditions

    Energy Technology Data Exchange (ETDEWEB)

    Martin, Madison E. [Univ. of Florida, Gainesville, FL (United States)

    2017-05-20

    Opacity is a critical parameter in the simulation of radiation transport in systems such as inertial confinement fusion capsules and stars. The resolution of current disagreements between solar models and helioseismological observations would benefit from experimental validation of theoretical opacity models. Overall, short pulse laser heated iron experiments reaching stellar-relevant conditions have been designed with consideration of minimizing tamper emission and optical depth effects while meeting plasma condition and x-ray emission goals.

  11. Adaptive Statistical Iterative Reconstruction-V Versus Adaptive Statistical Iterative Reconstruction: Impact on Dose Reduction and Image Quality in Body Computed Tomography.

    Science.gov (United States)

    Gatti, Marco; Marchisio, Filippo; Fronda, Marco; Rampado, Osvaldo; Faletti, Riccardo; Bergamasco, Laura; Ropolo, Roberto; Fonio, Paolo

    The aim of this study was to evaluate the impact on dose reduction and image quality of the new iterative reconstruction technique: adaptive statistical iterative reconstruction V (ASIR-V). Fifty consecutive oncologic patients acted as case controls, undergoing during their follow-up a computed tomography scan with both ASIR and ASIR-V. Each study was analyzed in a double-blinded fashion by 2 radiologists. Both quantitative and qualitative analyses of image quality were conducted. Computed tomography scanner radiation output was 38% (29%-45%) lower for the ASIR-V examinations than for the ASIR ones. The quantitative image noise was significantly lower with ASIR-V. Adaptive statistical iterative reconstruction-V had a higher performance for subjective image noise (P = 0.01 for 5 mm and P = 0.009 for 1.25 mm), with the other parameters (image sharpness, diagnostic acceptability, and overall image quality) being similar (P > 0.05). Adaptive statistical iterative reconstruction-V is a new iterative reconstruction technique that has the potential to provide image quality equal to or greater than ASIR, with a dose reduction of around 40%.

  12. Turning the Page on Pen-and-Paper Questionnaires: Combining Ecological Momentary Assessment and Computer Adaptive Testing to Transform Psychological Assessment in the 21st Century.

    Science.gov (United States)

    Gibbons, Chris J

    2016-01-01

    The current paper describes new opportunities for patient-centred assessment methods which have come about by the increased adoption of affordable smart technologies in biopsychosocial research and medical care. In this commentary, we review modern assessment methods including item response theory (IRT), computer adaptive testing (CAT), and ecological momentary assessment (EMA) and explain how these methods may be combined to improve psychological assessment. We demonstrate both how a 'naïve' selection of a small group of items in an EMA can lead to unacceptably unreliable assessments and how IRT can provide detailed information on the individual information that each item gives thus allowing short form assessments to be selected with acceptable reliability. The combination of CAT and IRT can ensure assessments are precise, efficient, and well targeted to the individual; allowing EMAs to be both brief and accurate.
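    The CAT machinery this commentary builds on selects each next item by maximum Fisher information at the current ability estimate. Under a two-parameter logistic (2PL) model that step looks as follows (the item bank below is hypothetical):

```python
import numpy as np

def item_information(theta: float, a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """2PL Fisher information: I(theta) = a^2 * p * (1 - p),
    with p the 2PL probability of a correct/positive response."""
    p = 1.0 / (1.0 + np.exp(-a * (theta - b)))
    return a ** 2 * p * (1.0 - p)

def next_item(theta: float, a: np.ndarray, b: np.ndarray,
              administered: set) -> int:
    """Index of the most informative not-yet-administered item."""
    info = item_information(theta, a, b)
    for idx in administered:
        info[idx] = -np.inf
    return int(np.argmax(info))

# Example: 5-item bank, ability estimate 0.3, item 2 already administered.
a = np.array([1.2, 0.8, 1.5, 1.0, 2.0])   # discriminations (hypothetical)
b = np.array([-1.0, 0.0, 0.4, 1.0, 0.2])  # difficulties (hypothetical)
print(next_item(0.3, a, b, administered={2}))
```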

  13. Introducing Enabling Computational Tools to the Climate Sciences: Multi-Resolution Climate Modeling with Adaptive Cubed-Sphere Grids

    Energy Technology Data Exchange (ETDEWEB)

    Jablonowski, Christiane [Univ. of Michigan, Ann Arbor, MI (United States)

    2015-07-14

    The research investigates and advances strategies for bridging the scale discrepancies between local, regional and global phenomena in climate models without the prohibitive computational costs of global cloud-resolving simulations. In particular, the research explores new frontiers in computational geoscience by introducing high-order Adaptive Mesh Refinement (AMR) techniques into climate research. AMR and statically-adapted variable-resolution approaches represent an emerging trend for atmospheric models and are likely to become the new norm in future-generation weather and climate models. The research advances the understanding of multi-scale interactions in the climate system and showcases a pathway to model these interactions effectively with advanced computational tools, like the Chombo AMR library developed at the Lawrence Berkeley National Laboratory. The research is interdisciplinary and combines applied mathematics, scientific computing and the atmospheric sciences. In this research project, a hierarchy of high-order atmospheric models on cubed-sphere computational grids has been developed that serves as an algorithmic prototype for the finite-volume solution-adaptive Chombo-AMR approach. The foci of the investigations have lain on the characteristics of both static mesh adaptations and dynamically-adaptive grids that can capture flow fields of interest, like tropical cyclones. Six research themes have been chosen. These are (1) the introduction of adaptive mesh refinement techniques into the climate sciences, (2) advanced algorithms for nonhydrostatic atmospheric dynamical cores, (3) an assessment of the interplay between resolved-scale dynamical motions and subgrid-scale physical parameterizations, (4) evaluation techniques for atmospheric model hierarchies, (5) the comparison of AMR refinement strategies and (6) tropical cyclone studies with a focus on multi-scale interactions and variable-resolution modeling. The results of this research project

  14. Computation of short-time diffusion using the particle simulation method

    International Nuclear Information System (INIS)

    Janicke, L.

    1983-01-01

    The method of particle simulation allows a correct description of turbulent diffusion even in areas near the source, and the computation of overall average values (expected values). The model is suitable for dealing with complex situations. It is derived from the K-model, which describes the dispersion of noxious matter using the diffusion formula. (DG) [de]
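    The stated equivalence with the K-model rests on the fact that a random walk whose step variance is 2K·dt reproduces gradient-transport diffusion in the ensemble average; a minimal sketch with illustrative parameters:

```python
import numpy as np

rng = np.random.default_rng(1)

def step_particles(x, u, K, dt):
    """Advect particles with the mean wind u and add a Gaussian step of
    variance 2*K*dt, which reproduces K-model diffusion on average."""
    return x + u * dt + np.sqrt(2.0 * K * dt) * rng.standard_normal(x.shape)

# 10,000 particles released at the source x=0; the ensemble spread grows
# as sigma^2 = 2*K*t, matching the diffusion formula.
x = np.zeros(10_000)
for _ in range(100):
    x = step_particles(x, u=1.0, K=5.0, dt=0.1)
```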

  15. Computer literacy enhancement in the Teaching Hospital Olomouc. Part I: project management techniques. Short communication.

    Science.gov (United States)

    Sedlár, Drahomír; Potomková, Jarmila; Rehorová, Jarmila; Seckár, Pavel; Sukopová, Vera

    2003-11-01

    Information explosion and globalization make great demands on keeping pace with the new trends in the healthcare sector. The contemporary level of computer and information literacy among most health care professionals in the Teaching Hospital Olomouc (Czech Republic) is not satisfactory for efficient exploitation of modern information technology in diagnostics, therapy and nursing. The present contribution describes the application of two basic problem solving techniques (brainstorming, SWOT analysis) to develop a project aimed at information literacy enhancement.

  16. Intelligent Adaptation and Personalization Techniques in Computer-Supported Collaborative Learning

    CERN Document Server

    Demetriadis, Stavros; Xhafa, Fatos

    2012-01-01

    Adaptation and personalization have been extensively studied in the CSCL research community, aiming to design intelligent systems that adaptively support eLearning processes and collaboration. Yet, with the fast development of Internet technologies, especially the emergence of new data technologies and mobile technologies, new opportunities and perspectives are opened for advanced adaptive and personalized systems. Adaptation and personalization pose new research and development challenges to today's CSCL systems. In particular, adaptation should be approached in a multi-dimensional way (cognitive, technological, context-aware and personal). Moreover, it should address the particularities of both individual learners and group collaboration. As a consequence, the aim of this book is twofold. On the one hand, it discusses the latest advances and findings in the area of intelligent adaptive and personalized learning systems. On the other hand, it analyzes the new implementation perspectives for intelligent adaptive and personalized learning systems.

  17. Validation of a computer-adaptive test to evaluate generic health-related quality of life

    Directory of Open Access Journals (Sweden)

    Zardaín Pilar C

    2010-12-01

    Full Text Available Abstract Background: Health Related Quality of Life (HRQoL) is a relevant variable in the evaluation of health outcomes. Questionnaires based on Classical Test Theory typically require a large number of items to evaluate HRQoL. Computer Adaptive Testing (CAT) can be used to reduce test length while maintaining and, in some cases, improving accuracy. This study aimed at validating a CAT based on Item Response Theory (IRT) for evaluation of generic HRQoL: the CAT-Health instrument. Methods: Cross-sectional study of subjects aged over 18 attending Primary Care Centres for any reason. CAT-Health was administered along with the SF-12 Health Survey. Age, gender and a checklist of chronic conditions were also collected. CAT-Health was evaluated considering: (1) feasibility: completion time and test length; (2) content range coverage, Item Exposure Rate (IER) and test precision; and (3) construct validity: differences in the CAT-Health scores according to clinical variables and correlations between both questionnaires. Results: 396 subjects answered CAT-Health and SF-12, 67.2% females, mean age (SD) 48.6 (17.7) years. 36.9% did not report any chronic condition. Median completion time for CAT-Health was 81 seconds (IQ range = 59-118) and it increased with age. Conclusions: Although domain-specific CATs exist for various areas of HRQoL, CAT-Health is one of the first IRT-based CATs designed to evaluate generic HRQoL, and it has proven feasible, valid and efficient when administered to a broad sample of individuals attending primary care settings.

  18. Development of a computer-adaptive physical function instrument for Social Security Administration disability determination.

    Science.gov (United States)

    Ni, Pengsheng; McDonough, Christine M; Jette, Alan M; Bogusz, Kara; Marfeo, Elizabeth E; Rasch, Elizabeth K; Brandt, Diane E; Meterko, Mark; Haley, Stephen M; Chan, Leighton

    2013-09-01

    To develop and test an instrument to assess physical function for Social Security Administration (SSA) disability programs, the SSA-Physical Function (SSA-PF) instrument. Item response theory (IRT) analyses were used to (1) create a calibrated item bank for each of the factors identified in prior factor analyses, (2) assess the fit of the items within each scale, (3) develop separate computer-adaptive testing (CAT) instruments for each scale, and (4) conduct initial psychometric testing. Cross-sectional data collection; IRT analyses; CAT simulation. Telephone and Internet survey. Two samples: SSA claimants (n=1017) and adults from the U.S. general population (n=999). None. Model fit statistics, correlation, and reliability coefficients. IRT analyses resulted in 5 unidimensional SSA-PF scales: Changing & Maintaining Body Position, Whole Body Mobility, Upper Body Function, Upper Extremity Fine Motor, and Wheelchair Mobility for a total of 102 items. High CAT accuracy was demonstrated by strong correlations between simulated CAT scores and those from the full item banks. On comparing the simulated CATs with the full item banks, very little loss of reliability or precision was noted, except at the lower and upper ranges of each scale. No difference in response patterns by age or sex was noted. The distributions of claimant scores were shifted to the lower end of each scale compared with those of a sample of U.S. adults. The SSA-PF instrument contributes important new methodology for measuring the physical function of adults applying to the SSA disability programs. Initial evaluation revealed that the SSA-PF instrument achieved considerable breadth of coverage in each content domain and demonstrated noteworthy psychometric properties. Copyright © 2013 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  19. Influence of adaptive statistical iterative reconstruction algorithm on image quality in coronary computed tomography angiography

    Directory of Open Access Journals (Sweden)

    Helle Precht

    2016-12-01

    Full Text Available Background: Coronary computed tomography angiography (CCTA) requires high spatial and temporal resolution, increased low contrast resolution for the assessment of coronary artery stenosis, plaque detection, and/or non-coronary pathology. Therefore, new reconstruction algorithms, particularly iterative reconstruction (IR) techniques, have been developed in an attempt to improve image quality with no cost in radiation exposure. Purpose: To evaluate whether adaptive statistical iterative reconstruction (ASIR) enhances perceived image quality in CCTA compared to filtered back projection (FBP). Material and Methods: Thirty patients underwent CCTA due to suspected coronary artery disease. Images were reconstructed using FBP, 30% ASIR, and 60% ASIR. Ninety image sets were evaluated by five observers using the subjective visual grading analysis (VGA) and assessed by proportional odds modeling. Objective quality assessment (contrast, noise, and the contrast-to-noise ratio [CNR]) was analyzed with linear mixed effects modeling on log-transformed data. The need for ethical approval was waived by the local ethics committee as the study only involved anonymously collected clinical data. Results: VGA showed significant improvements in sharpness by comparing FBP with ASIR, resulting in odds ratios of 1.54 for 30% ASIR and 1.89 for 60% ASIR (P = 0.004). The objective measures showed significant differences between FBP and 60% ASIR (P < 0.0001) for noise, with an estimated ratio of 0.82, and for CNR, with an estimated ratio of 1.26. Conclusion: ASIR improved the subjective image quality of parameter sharpness and, objectively, reduced noise and increased CNR.

  20. Adaptation of the radiation dose for computed tomography of the body - background for the dose adaptation programme OmnimAs

    International Nuclear Information System (INIS)

    Nyman, Ulf; Kristiansson, Mattias; Leitz, Wolfram; Paahlstorp, Per-Aake

    2004-11-01

    When performing computed tomography examinations, the exposure factors are hardly ever adapted to the patient's size. One reason for that might be the lack of simple methods. This report describes the computer programme OmnimAs, which calculates how the exposure factors should be varied with the patient's perimeter (which can easily be measured with a measuring tape). The first approximation is to calculate the exposure values giving the same noise level in the image irrespective of the patient's size. A clinical evaluation has shown that this relationship has to be modified. One chapter describes the physical background behind the programme. Results calculated with OmnimAs are in good agreement with a number of published studies. Clinical experience shows the usability of OmnimAs. Finally, the correlation between several parameters and image quality/dose is discussed, along with how this correlation can be exploited to optimise CT examinations.
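    The first approximation mentioned above (equal image noise regardless of patient size) can be sketched from basic physics: noise scales with the number of transmitted photons, which falls off roughly exponentially with patient diameter. The formula below is our own first-order illustration of that idea, with an assumed effective attenuation coefficient and a circular-cross-section assumption; it is not the published OmnimAs algorithm:

```python
import math

def mas_for_constant_noise(perimeter_cm, mas_ref, perimeter_ref_cm,
                           mu_eff=0.19):
    """First-order constant-noise mAs scaling (illustrative sketch only):
    transmission ~ exp(-mu_eff * diameter), with diameter = perimeter / pi
    for a roughly circular cross-section; mu_eff ~ 0.19 1/cm is an assumed
    effective water attenuation coefficient at CT energies."""
    d = perimeter_cm / math.pi
    d_ref = perimeter_ref_cm / math.pi
    return mas_ref * math.exp(mu_eff * (d - d_ref))
```

    As the abstract notes, the clinically evaluated relationship tempers this purely physical scaling.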

  1. A practical approach to compute short-wave irradiance interacting with subgrid-scale buildings

    Energy Technology Data Exchange (ETDEWEB)

    Sievers, Uwe; Frueh, Barbara [Deutscher Wetterdienst, Offenbach am Main (Germany)

    2012-08-15

    A numerical approach for the calculation of short-wave irradiances at the ground, as well as at the walls and roofs of buildings, in an environment with unresolved built-up structures is presented. In this radiative parameterization scheme, the properties of the unresolved built-up area are assigned to settlement types, which are characterized by mean values of the volume density of the buildings and their wall area density; it is therefore named the wall area approach. In the vertical direction the range of building heights may be subdivided into several layers. In the case of non-uniform building heights, the shadowing of lower roofs by taller buildings is taken into account. The method includes the approximate calculation of sky view and sun view factors. For an idealized building arrangement it is shown that the obtained approximate factors are in good agreement with exact calculations, as are the calculated effective albedo values with measurements. For arrangements with isolated single buildings, the presented wall area approach yields better agreement with observations than similar methods in which the unresolved built-up area is characterized by the aspect ratio of a representative street canyon (aspect ratio approach). In the limiting case where the built-up area is well represented by an ensemble of idealized street canyons, both approaches become equivalent. The presented short-wave radiation scheme is part of the microscale atmospheric model MUKLIMO 3, where it contributes to the calculation of surface temperatures on the basis of energy-flux equilibrium conditions. (orig.)

  2. TAREAN: a computational tool for identification and characterization of satellite DNA from unassembled short reads

    Czech Academy of Sciences Publication Activity Database

    Novák, Petr; Ávila Robledillo, Laura; Koblížková, Andrea; Vrbová, Iva; Neumann, Pavel; Macas, Jiří

    2017-01-01

    Vol. 45, No. 12 (2017), Article No. e111. ISSN 0305-1048 R&D Projects: GA ČR GBP501/12/G090; GA MŠk(CZ) LM2015047 Institutional support: RVO:60077344 Keywords: in-situ hybridization * repetitive sequences * tandem repeats * vicia-faba Subject RIV: EB - Genetics; Molecular Biology OECD field: Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8) Impact factor: 10.162, year: 2016

  3. A Short Review of FDTD-Based Methods for Uncertainty Quantification in Computational Electromagnetics

    Directory of Open Access Journals (Sweden)

    Theodoros T. Zygiridis

    2017-01-01

    Full Text Available We provide a review of selected computational methodologies that are based on the deterministic finite-difference time-domain algorithm and are suitable for the investigation of electromagnetic problems involving uncertainties. As it will become apparent, several alternatives capable of performing uncertainty quantification in a variety of cases exist, each one exhibiting different qualities and ranges of applicability, which we intend to point out here. Given the numerous available approaches, the purpose of this paper is to clarify the main strengths and weaknesses of the described methodologies and help the potential readers to safely select the most suitable approach for their problem under consideration.

  4. Self-adaptive method to distinguish inner and outer contours of industrial computed tomography image for rapid prototype

    International Nuclear Information System (INIS)

    Duan Liming; Ye Yong; Zhang Xia; Zuo Jian

    2013-01-01

    A self-adaptive identification method is proposed for more accurate and efficient judgment of whether contours in industrial computed tomography (CT) slice images are inner or outer contours. The convexity-concavity of the single-pixel-wide closed contour is first identified with the angle method. Then, contours with concave vertices are distinguished as inner or outer contours with the ray method, and contours without concave vertices are distinguished with the extreme coordinate value method. The method automatically chooses between these techniques by identifying the convexity and concavity of the contours. Thus, the disadvantages of the individual methods, such as the ray method's long computation time and the extreme coordinate method's fallibility, are avoided. Experiments prove the adaptability, efficiency, and accuracy of the self-adaptive method. (authors)
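    The ray method referenced above is essentially even-odd ray casting. A minimal sketch of how one contour can be flagged as an inner contour of another (function names are ours):

```python
def point_in_polygon(pt, poly):
    """Even-odd ray casting: count crossings of a horizontal ray from pt."""
    x, y = pt
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):                       # edge spans the ray
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x_cross > x:                            # crossing to the right
                inside = not inside
    return inside

def is_inner_contour(contour, other_contours):
    """A contour is 'inner' if one of its points lies inside another one."""
    return any(point_in_polygon(contour[0], other) for other in other_contours)
```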

  5. Computationally efficient implementation of sparse-tap FIR adaptive filters with tap-position control on Intel IA-32 processors

    OpenAIRE

    Hirano, Akihiro; Nakayama, Kenji

    2008-01-01

    This paper presents a computationally efficient implementation of sparse-tap FIR adaptive filters with tap-position control on Intel IA-32 processors with single-instruction multiple-data (SIMD) capability. In order to overcome random-order memory access, which prevents vectorization, block-based processing and a re-ordering buffer are introduced. Dynamic register allocation and the use of memory-to-register operations help maximize the loop-unrolling level. Up to 66 percent speedup ...

  6. Refficientlib: an efficient load-rebalanced adaptive mesh refinement algorithm for high-performance computational physics meshes

    OpenAIRE

    Baiges Aznar, Joan; Bayona Roa, Camilo Andrés

    2017-01-01

    In this paper we present a novel algorithm for adaptive mesh refinement in computational physics meshes in a distributed memory parallel setting. The proposed method is developed for nodally based parallel domain partitions where the nodes of the mesh belong to a single processor, whereas the elements can belong to multiple processors. Some of the main features of the algorithm presented in this paper a...

  7. Computer simulation of yielding supports under static and short-term dynamic load

    Directory of Open Access Journals (Sweden)

    Kumpyak Oleg

    2018-01-01

    Full Text Available Dynamic impacts, which have become frequent lately, cause large human and economic losses, and methods for preventing them are not always effective and reasonable. This research studies a way of enhancing the explosion safety of building structures by means of yielding supports. The paper presents results of numerical studies of the strength and deformation properties of yielding supports in the shape of annular tubes under static and short-term dynamic loading. The influence of the yielding supports was assessed taking into account three characteristic stages of deformation: elastic; elasto-plastic; and elasto-plastic with hardening. The numerical studies were performed using finite element analysis in Ansys Mechanical v17.2. It was established that the rigidity of yielding supports significantly influences their stress-strain state. The research determined that, with increasing rigidity of the deformable elements, the dependence between load and support deformation in the elastic and plastic stages is linear. Significant reduction of the dynamic response and an increase in the deformation time of the yielding supports were observed as the plastic component increased. This suggests they can be applied as supporting units in RC beams.

  8. Change of short-term memory effect in acute ischemic ventricular myocardium: a computational study.

    Science.gov (United States)

    Mei, Xi; Wang, Jing; Zhang, Hong; Liu, Zhi-cheng; Zhang, Zhen-xi

    2014-02-01

    The ionic mechanism of the change in short-term memory (STM) during acute myocardial ischemia is not well understood. In this paper, an advanced guinea pig ventricular model developed by Luo and Rudy was used to investigate the STM property of ischemic ventricular myocardium. The STM response was calculated as the time to reach steady-state action potential duration (APD) after an abrupt shortening of the basic cycle length (BCL) in the pacing protocol. Electrical restitution curves (RCs), which simultaneously visualize multiple aspects of APD restitution and STM, were obtained from the dynamic and local S1S2 restitution portrait (RP); the S1S2 protocol consists of a longer-interval stimulus (S1) and a shorter-interval stimulus (S2). The angle between the dynamic RC and the local S1S2 RC reflects the amount of STM. Our results indicated that, compared with the control (normal) condition, the time constant of the STM response in the ischemic condition decreased significantly, and the angle reflecting the STM amount was smaller in the ischemic model than in the control model. By tracking the effect of ischemia on intracellular ion concentrations and membrane currents, we found that the changes in membrane currents caused by ischemia exert only subtle influences on STM; it is the decline of intracellular calcium concentration that gives rise to most of the decrement in STM. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  9. Short term load forecasting of anomalous load using hybrid soft computing methods

    Science.gov (United States)

    Rasyid, S. A.; Abdullah, A. G.; Mulyadi, Y.

    2016-04-01

    Load forecast accuracy has a direct impact on how economical power generation is. Electricity consumption on holidays tends not to follow the load pattern of a normal day; such a load is defined as an anomalous load. In this paper, a hybrid ANN-Particle Swarm method is proposed to improve the accuracy of forecasting the anomalous loads that often occur on holidays. The proposed methodology has been used to forecast the half-hourly electricity demand for power systems in the Indonesian National Electricity Market in the West Java region. Experiments were conducted by testing various learning rates and learning data inputs. The performance of this methodology was validated with real data from the national electricity company. The results show that the proposed approach is very effective for short-term load forecasting in the case of anomalous loads. Hybrid ANN-Particle Swarm is relatively simple and easy to use as an analysis tool by engineers.
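    The hybrid idea, training a small network's weights with particle swarm optimization rather than backpropagation, can be sketched as follows. The architecture, swarm constants, and data are generic placeholders, not the paper's configuration:

```python
import numpy as np

rng = np.random.default_rng(42)

def ann_forward(w, X, n_hidden=8):
    """Tiny one-hidden-layer regression network; w is a flat weight vector."""
    n_in = X.shape[1]
    i = 0
    W1 = w[i:i + n_in * n_hidden].reshape(n_in, n_hidden); i += n_in * n_hidden
    b1 = w[i:i + n_hidden]; i += n_hidden
    W2 = w[i:i + n_hidden]; i += n_hidden
    b2 = w[i]
    h = np.tanh(X @ W1 + b1)
    return h @ W2 + b2

def pso_train(X, y, n_hidden=8, n_particles=30, n_iter=200):
    """Particle swarm search over the network weights (generic constants)."""
    dim = X.shape[1] * n_hidden + n_hidden + n_hidden + 1
    pos = rng.normal(scale=0.5, size=(n_particles, dim))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_err = np.array([np.mean((ann_forward(p, X, n_hidden) - y) ** 2)
                          for p in pos])
    gbest = pbest[pbest_err.argmin()].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, 1))
        vel = 0.72 * vel + 1.49 * r1 * (pbest - pos) + 1.49 * r2 * (gbest - pos)
        pos += vel
        err = np.array([np.mean((ann_forward(p, X, n_hidden) - y) ** 2)
                        for p in pos])
        improved = err < pbest_err
        pbest[improved], pbest_err[improved] = pos[improved], err[improved]
        gbest = pbest[pbest_err.argmin()].copy()
    return gbest

# Toy usage on synthetic data (stands in for half-hourly demand features).
X = rng.normal(size=(200, 4)); y = np.sin(X).sum(axis=1)
weights = pso_train(X, y)
```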

  10. Automatic Delineation of On-Line Head-And-Neck Computed Tomography Images: Toward On-Line Adaptive Radiotherapy

    International Nuclear Information System (INIS)

    Zhang Tiezhi; Chi Yuwei; Meldolesi, Elisa; Yan Di

    2007-01-01

    Purpose: To develop and validate a fully automatic region-of-interest (ROI) delineation method for on-line adaptive radiotherapy. Methods and Materials: On-line adaptive radiotherapy requires a robust and automatic image segmentation method to delineate ROIs in on-line volumetric images. We have implemented an atlas-based image segmentation method to automatically delineate ROIs of head-and-neck helical computed tomography images. A total of 32 daily computed tomography images from 7 head-and-neck patients were delineated using this automatic image segmentation method. Manually drawn contours on the daily images were used as references in the evaluation of automatically delineated ROIs. Two methods were used in quantitative validation: (1) the dice similarity coefficient index, which indicates the overlapping ratio between the manually and automatically delineated ROIs; and (2) the distance transformation, which yields the distances between the manually and automatically delineated ROI surfaces. Results: Automatic segmentation showed agreement with manual contouring. For most ROIs, the dice similarity coefficient indexes were approximately 0.8. Similarly, the distance transformation evaluation results showed that the distances between the manually and automatically delineated ROI surfaces were mostly within 3 mm. The distances between two surfaces had a mean of 1 mm and standard deviation of <2 mm in most ROIs. Conclusion: With atlas-based image segmentation, it is feasible to automatically delineate ROIs on the head-and-neck helical computed tomography images in on-line adaptive treatments
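    The dice similarity coefficient used for validation has a one-line definition over binary ROI masks; a minimal sketch:

```python
import numpy as np

def dice(auto_mask: np.ndarray, manual_mask: np.ndarray) -> float:
    """Dice similarity coefficient: 2|A ∩ M| / (|A| + |M|).
    Values near 0.8, as reported above, indicate good overlap."""
    a = auto_mask.astype(bool)
    m = manual_mask.astype(bool)
    return 2.0 * np.logical_and(a, m).sum() / (a.sum() + m.sum())
```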

  11. Short-term adaptations in spinal cord circuits evoked by repetitive transcranial magnetic stimulation: possible underlying mechanisms

    DEFF Research Database (Denmark)

    Perez, Monica A.; Lungholt, Bjarke K.S.; Nielsen, Jens Bo

    2005-01-01

    Repetitive transcranial magnetic stimulation (rTMS) has been shown to induce adaptations in cortical neuronal circuitries. In the present study we investigated whether rTMS, through its effect on corticospinal pathways, also produces adaptations at the spinal level, and what the neuronal mechanisms ... that the depression of the H-reflex by rTMS can be explained, at least partly, by an increased presynaptic inhibition of soleus Ia afferents. In contrast, rTMS had no effect on disynaptic reciprocal Ia inhibition from ankle dorsiflexors to plantarflexors. We conclude that a train of rTMS may modulate transmission ...

  12. The use of computer adaptive tests in outcome assessments following upper limb trauma.

    Science.gov (United States)

    Jayakumar, P; Overbeek, C; Vranceanu, A-M; Williams, M; Lamb, S; Ring, D; Gwilym, S

    2018-06-01

    Aims Outcome measures quantifying aspects of health in a precise, efficient, and user-friendly manner are in demand. Computer adaptive tests (CATs) may overcome the limitations of established fixed scales and be more adept at measuring outcomes in trauma. The primary objective of this review was to gain a comprehensive understanding of the psychometric properties of CATs compared with fixed-length scales in the assessment of outcome in patients who have suffered trauma of the upper limb. Study designs, outcome measures and methodological quality are defined, along with trends in investigation. Materials and Methods A search of multiple electronic databases was undertaken on 1 January 2017 with terms related to "CATs", "orthopaedics", "trauma", and "anatomical regions". Studies involving adults suffering trauma to the upper limb, and undergoing any intervention, were eligible. Those involving the measurement of outcome with any CATs were included. Identification, screening, and eligibility were undertaken, followed by the extraction of data and quality assessment using the Consensus-Based Standards for the Selection of Health Measurement Instruments (COSMIN) criteria. The review is reported according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) criteria and registered (PROSPERO: CRD42016053886). Results A total of 31 studies reported trauma conditions alone, or in combination with non-traumatic conditions, using CATs. Most were cross-sectional with varying level of evidence, number of patients, type of study, range of conditions and methodological quality. CATs correlated well with fixed scales and had minimal or no floor-ceiling effects. They required significantly fewer questions and/or less time for completion. Patient-Reported Outcomes Measurement Information System (PROMIS) CATs were the most frequently used, and the use of CATs is increasing. Conclusion Early studies show valid and reliable outcome measurement with CATs

  13. Adapting the Computed Tomography Criteria of Hemorrhagic Transformation to Stroke Magnetic Resonance Imaging

    Directory of Open Access Journals (Sweden)

    Lars Neeb

    2013-08-01

    Full Text Available Background: The main safety aspect in the use of stroke thrombolysis and in clinical trials of new pharmaceutical or interventional stroke therapies is the incidence of hemorrhagic transformation (HT) after treatment. The computed tomography (CT)-based classification of the European Cooperative Acute Stroke Study (ECASS) distinguishes four categories of HT. An HT can range from a harmless spot of blood accumulation to a symptomatic space-occupying parenchymal bleeding associated with a massive deterioration of symptoms and clinical prognosis. In magnetic resonance imaging (MRI), HTs are often categorized using the ECASS criteria although this classification has not been validated in MRI. We developed MRI-specific criteria for the categorization of HT and sought to assess their diagnostic reliability in a retrospective study. Methods: Consecutive acute ischemic stroke patients, who had received a 3-tesla MRI before and 12-36 h after thrombolysis, were screened retrospectively for an HT of any kind in post-treatment MRI. Intravenous tissue plasminogen activator was given to all patients within 4.5 h. HT categorization was based on a simultaneous read of 3 different MRI sequences (fluid-attenuated inversion recovery, diffusion-weighted imaging and T2* gradient-recalled echo). Categorization of HT in MRI accounted for the various aspects of the imaging pattern, such as the shape of the bleeding area and the signal intensity on each sequence. All data sets were independently categorized in a blinded fashion by 3 expert and 3 resident observers. Interobserver reliability of this classification was determined for all observers together and for each group separately by calculating Kendall's coefficient of concordance (W). Results: Of the 186 patients screened, 39 patients (21%) had an HT in post-treatment MRI and were included for the categorization of HT by experts and residents. The overall agreement of HT categorization according to the modified classification was

  14. Procedures for Computing Transonic Flows for Control of Adaptive Wind Tunnels. Ph.D. Thesis - Technische Univ., Berlin, Mar. 1986

    Science.gov (United States)

    Rebstock, Rainer

    1987-01-01

    Numerical methods are developed for control of three dimensional adaptive test sections. The physical properties of the design problem occurring in the external field computation are analyzed, and a design procedure suited for solution of the problem is worked out. To do this, the desired wall shape is determined by stepwise modification of an initial contour. The necessary changes in geometry are determined with the aid of a panel procedure, or, with incident flow near the sonic range, with a transonic small perturbation (TSP) procedure. The designed wall shape, together with the wall deflections set during the tunnel run, are the input to a newly derived one-step formula which immediately yields the adapted wall contour. This is particularly important since the classical iterative adaptation scheme is shown to converge poorly for 3D flows. Experimental results obtained in the adaptive test section with eight flexible walls are presented to demonstrate the potential of the procedure. Finally, a method is described to minimize wall interference in 3D flows by adapting only the top and bottom wind tunnel walls.

  15. A Secure, Scalable and Elastic Autonomic Computing Systems Paradigm: Supporting Dynamic Adaptation of Self-* Services from an Autonomic Cloud

    Directory of Open Access Journals (Sweden)

    Abdul Jaleel

    2018-05-01

    Full Text Available Autonomic computing embeds self-management features in software systems using external feedback control loops, i.e., autonomic managers. In existing models of autonomic computing, adaptive behaviors are defined at design time, autonomic managers are statically configured, and the running system has a fixed set of self-* capabilities. An autonomic computing design should accommodate autonomic capability growth by allowing the dynamic configuration of self-* services, but this causes security and integrity issues. A secure, scalable and elastic autonomic computing system (SSE-ACS) paradigm is proposed to address the runtime inclusion of autonomic managers, ensuring secure communication between autonomic managers and managed resources. Applying the SSE-ACS concept, a layered approach for the dynamic adaptation of self-* services is presented, with an online ‘Autonomic_Cloud’ working as the middleware between Autonomic Managers (offering the self-* services) and the Autonomic Computing System (requiring the self-* services). A stock trading and forecasting system is used for simulation purposes. The security impact of the SSE-ACS paradigm is verified by testing possible attack cases over the autonomic computing system with single and multiple autonomic managers running on the same and different machines. The common vulnerability scoring system (CVSS) metric shows a decrease in the vulnerability severity score from high (8.8) for the existing ACS to low (3.9) for SSE-ACS. Autonomic managers are introduced into the system at runtime from the Autonomic_Cloud to test the scalability and elasticity. With elastic AMs, the system optimizes the Central Processing Unit (CPU) share, resulting in an improved execution time for business logic. For computing systems requiring the continuous support of self-management services, the proposed system achieves a significant improvement in security, scalability, elasticity, autonomic efficiency, and issue resolving time

  16. Control of Cl- transport in the operculum epithelium of Fundulus heteroclitus: long- and short-term salinity adaptation

    DEFF Research Database (Denmark)

    Hoffmann, E.K.; Hoffmann, Erik; Lang, F.

    2002-01-01

    The euryhaline fish Fundulus heteroclitus adapts rapidly to enhanced salinity by increasing the ion secretion of the gill chloride cells. An increase of approximately 70 mOsm in plasma osmolarity was previously found during the transition. To mimic this in vitro, isolated opercular epithelia of seawat...

  17. Agile Development of Various Computational Power Adaptive Web-Based Mobile-Learning Software Using Mobile Cloud Computing

    Science.gov (United States)

    Zadahmad, Manouchehr; Yousefzadehfard, Parisa

    2016-01-01

    Mobile Cloud Computing (MCC) aims to improve all mobile applications such as m-learning systems. This study presents an innovative method to use web technology and software engineering's best practices to provide m-learning functionalities hosted in a MCC-learning system as service. Components hosted by MCC are used to empower developers to create…

  18. Power Spectral Analysis of Short-Term Heart Rate Variability in Healthy and Arrhythmia Subjects by the Adaptive Continuous Morlet Wavelet Transform

    Directory of Open Access Journals (Sweden)

    Ram Sewak SINGH

    2017-12-01

    Full Text Available Power spectral analysis of short-term heart rate variability (HRV) can provide instant valuable information for understanding the functioning of autonomic control over the cardiovascular system. In this study, an adaptive continuous Morlet wavelet transform (ACMWT) method has been used to describe the time-frequency characteristics of HRV using band power spectra and the median value of the interquartile range. Adaptation of the method was based on the measurement of maximum energy concentration. The ACMWT has been validated on synthetic signals (i.e., stationary and non-stationary, with slowly varying and fast-changing frequency over time), modeled as closest to the dynamic changes in HRV signals. The method has also been tested in the presence of additive white Gaussian noise (AWGN) to show its robustness to noise. From the results of testing on synthetic signals, the ACMWT was found to be an enhanced energy concentration estimator for the assessment of the power spectrum of short-term HRV time series compared to the adaptive Stockwell transform (AST), adaptive modified Stockwell transform (AMST), standard continuous Morlet wavelet transform (CMWT) and Stockwell transform (ST) estimators at a statistical significance level of 5%. Further, the ACMWT was applied to real HRV data from the Fantasia and MIT-BIH databases, grouped as healthy young group (HYG), healthy elderly group (HEG), arrhythmia controlled medication group (ARCMG), and supraventricular tachycardia group (SVTG) subjects. The global results demonstrate that the spectral indices of low frequency power (LFp) and high frequency power (HFp) of HRV were decreased in HEG compared to HYG subjects (p<0.0001), while LFp and HFp indices were increased in ARCMG compared to HEG (p<0.00001). The LFp and HFp components of HRV obtained from SVTG were reduced compared to the other group subjects (p<0.00001).
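    A fixed-parameter (non-adaptive) Morlet wavelet power spectrum of an evenly resampled RR series can be sketched as below; the paper's adaptive step, tuning the wavelet parameter for maximum energy concentration, is deliberately omitted, and w0 = 6 is a conventional default rather than the authors' choice:

```python
import numpy as np

def morlet_power(rr_ms, fs, freqs, w0=6.0):
    """Continuous Morlet wavelet power of an evenly resampled, demeaned RR
    series sampled at fs Hz, evaluated at the requested frequencies (Hz)."""
    x = rr_ms - rr_ms.mean()
    n = len(x)
    t = (np.arange(n) - n // 2) / fs          # centered time axis for kernels
    power = np.empty((len(freqs), n))
    for i, f in enumerate(freqs):
        s = w0 / (2 * np.pi * f)              # scale with centre frequency f
        psi = (np.pi ** -0.25 / np.sqrt(s)
               * np.exp(1j * w0 * t / s) * np.exp(-(t / s) ** 2 / 2))
        # Correlation with the conjugate wavelet via convolution.
        power[i] = np.abs(np.convolve(x, np.conj(psi)[::-1], mode='same')) ** 2
    return power
    # Integrate rows over 0.04-0.15 Hz for LFp and 0.15-0.4 Hz for HFp.
```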

  19. Synergistic effect of supplemental enteral nutrients and exogenous glucagon-like peptide 2 on intestinal adaptation in a rat model of short bowel syndrome

    DEFF Research Database (Denmark)

    Liu, Xiaowen; Nelson, David W; Holst, Jens Juul

    2006-01-01

    BACKGROUND: Short bowel syndrome (SBS) can lead to intestinal failure and require total or supplemental parenteral nutrition (TPN or PN, respectively). Glucagon-like peptide 2 (GLP-2) is a nutrient-dependent, proglucagon-derived gut hormone that stimulates intestinal adaptation. OBJECTIVE: Our objective was to determine whether supplemental enteral nutrients (SEN) modulate the intestinotrophic response to a low dose of GLP-2 coinfused with PN in a rat model of SBS (60% jejunoileal resection plus cecectomy). DESIGN: Rats were randomly assigned to 8 treatments by using a 2 x 2 x 2 factorial design...

  20. Software abstractions and computational issues in parallel structure adaptive mesh methods for electronic structure calculations

    Energy Technology Data Exchange (ETDEWEB)

    Kohn, S.; Weare, J.; Ong, E.; Baden, S.

    1997-05-01

    We have applied structured adaptive mesh refinement techniques to the solution of the LDA equations for electronic structure calculations. Local spatial refinement concentrates memory resources and numerical effort where it is most needed, near the atomic centers and in regions of rapidly varying charge density. The structured grid representation enables us to employ efficient iterative solver techniques such as conjugate gradient with FAC multigrid preconditioning. We have parallelized our solver using an object-oriented adaptive mesh refinement framework.

  1. Sex determination of human mandible using metrical parameters by computed tomography: A prospective radiographic short study

    Directory of Open Access Journals (Sweden)

    Basavaraj N Kallalli

    2016-01-01

    Full Text Available Introduction: Sex determination of unidentified human remains is very important in forensic medicine, medicolegal cases, and forensic anthropology. The mandible is the largest and hardest facial bone; it commonly resists postmortem damage and forms an important source of personal identification. Additional studies have demonstrated the applicability of facial reconstruction using three-dimensional computed tomography (3D-CT) for the purpose of individual identification. Aim: To determine the sex of the human mandible using metrical parameters by CT. Materials and Methods: The study included thirty subjects (15 males and 15 females), with ages ranging between 10 and 60 years, obtained from the outpatient department of Oral Medicine and Radiology, Narsinhbhai Patel Dental College and Hospital. CT scans were performed on all subjects, and the data obtained were reconstructed for 3D viewing. After obtaining the 3D-CT scan, a total of seven mandibular measurements, i.e., gonial angle (G-angle), ramus length (Ramus-L), minimum ramus breadth, gonion-gnathion length (G-G-L), bigonial breadth, bicondylar breadth (BIC-Br), and coronoid length (CO-L), were measured; the collected data were analyzed using the SPSS statistical analysis program and Student's t-test. Results: The result of the study showed that out of the seven parameters, G-angle, Ramus-L, G-G-L, BIC-Br, and CO-L showed a statistically significant difference (P < 0.05), with an overall accuracy of 86% for males and 82% for females. Conclusion: Personal identification using the mandible by conventional methods has already been proved, but with variable efficacy. Advanced imaging modalities can aid in personal identification with much higher accuracy than conventional methods.

  2. An adaptive multi-spline refinement algorithm in simulation based sailboat trajectory optimization using onboard multi-core computer systems

    Directory of Open Access Journals (Sweden)

    Dębski Roman

    2016-06-01

    A new dynamic programming based parallel algorithm adapted to on-board heterogeneous computers for simulation based trajectory optimization is studied in the context of “high-performance sailing”. The algorithm uses a new discrete space of continuously differentiable functions called the multi-splines as its search space representation. A basic version of the algorithm is presented in detail (pseudo-code, time and space complexity, search space auto-adaptation properties). Possible extensions of the basic algorithm are also described. The presented experimental results show that contemporary heterogeneous on-board computers can be effectively used for solving simulation based trajectory optimization problems. These computers can be considered micro high performance computing (HPC) platforms: they offer high performance while remaining energy and cost efficient. The simulation based approach can potentially give highly accurate results since the mathematical model that the simulator is built upon may be as complex as required. The approach described is applicable to many trajectory optimization problems due to its black-box represented performance measure and use of OpenCL.
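
    A minimal sketch of the dynamic-programming core of such a method, with a stand-in cost function in place of the sailing simulator (all names and values are illustrative, not the paper's multi-spline search space):

      import numpy as np

      def leg_time(y_from, y_to):           # stand-in for a sailing simulator
          return 1.0 + 0.1 * (y_to - y_from) ** 2 + 0.05 * abs(y_to)

      def optimal_trajectory(n_stages=10, n_lanes=7):
          cost = np.full((n_stages, n_lanes), np.inf)
          prev = np.zeros((n_stages, n_lanes), dtype=int)
          cost[0, :] = 0.0
          for s in range(1, n_stages):
              for j in range(n_lanes):
                  times = [cost[s - 1, i] + leg_time(i, j) for i in range(n_lanes)]
                  prev[s, j] = int(np.argmin(times))
                  cost[s, j] = times[prev[s, j]]
          # Backtrack the cheapest path from the last stage.
          path = [int(np.argmin(cost[-1]))]
          for s in range(n_stages - 1, 0, -1):
              path.append(prev[s, path[-1]])
          return path[::-1], float(cost[-1].min())

      print(optimal_trajectory())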

  3. Adequacy of the Ultra-Short-Term HRV to Assess Adaptive Processes in Youth Female Basketball Players.

    Science.gov (United States)

    Nakamura, Fabio Y; Pereira, Lucas A; Cal Abad, Cesar C; Cruz, Igor F; Flatt, Andrew A; Esco, Michael R; Loturco, Irineu

    2017-02-01

    Heart rate variability has been widely used to monitor athletes' cardiac autonomic control changes induced by training and competition, and recently shorter recording times have been sought to improve its practicality. The aim of this study was to test the agreement between the (ultra-short-term) natural log of the root-mean-square difference of successive normal RR intervals (lnRMSSD - measured in only 1 min post-1 min stabilization) and the criterion lnRMSSD (measured in the last 5 min out of 10 min of recording) in young female basketball players. Furthermore, the correlation between the training-induced delta change in the ultra-short-term lnRMSSD and the criterion lnRMSSD was calculated. Seventeen players were assessed at rest pre- and post-eight weeks of training. Trivial effect sizes (-0.03 in the pre- and 0.10 in the post-treatment) were found in the comparison between the ultra-short-term lnRMSSD (3.29 ± 0.45 and 3.49 ± 0.35 ms, in the pre- and post-, respectively) and the criterion lnRMSSD (3.30 ± 0.40 and 3.45 ± 0.41 ms, in the pre- and post-, respectively) (intraclass correlation coefficient = 0.95 and 0.93). In both cases, the response to training was significant, with a Pearson's correlation of 0.82 between the delta changes of the ultra-short-term lnRMSSD and the criterion lnRMSSD. In conclusion, the lnRMSSD can be calculated within only 2 min of data acquisition (the first minute discarded) in young female basketball players, with the ultra-short-term measure presenting similar sensitivity to training effects as the standard criterion measure.
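
    The measure itself is simple to compute; a minimal sketch of the ultra-short-term protocol described above (discard the first minute as stabilization, analyze the next minute) might look like this:

      import numpy as np

      def ln_rmssd(rr_ms):
          """rr_ms: sequence of RR intervals in milliseconds."""
          diffs = np.diff(np.asarray(rr_ms, dtype=float))
          return float(np.log(np.sqrt(np.mean(diffs ** 2))))

      def ultra_short_term(rr_ms):
          t = np.cumsum(rr_ms) / 1000.0               # seconds elapsed per beat
          window = (t > 60.0) & (t <= 120.0)          # minute 2 only
          return ln_rmssd(np.asarray(rr_ms)[window])

      rr = np.random.default_rng(0).normal(800, 40, 300)   # synthetic RR series
      print(round(ultra_short_term(rr), 2))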

  4. Control of Cl- transport in the operculum epithelium of Fundulus heteroclitus: long- and short-term salinity adaptation

    DEFF Research Database (Denmark)

    Hoffmann, E K; Hoffmann, E; Lang, F

    2002-01-01

    kinase A (PKA) inhibitors H-89 and KT 5720 had no effect after mannitol addition whereas isoproterenol stimulation was completely blocked by H-89. This indicates that PKA is involved in the activation of the apical Cl(-) channel via c-AMP whereas the shrinkage activation of the Na(+), K(+), 2Cl......(-) cotransporter is independent of PKA activation. The steady-state Cl(-) secretion was stimulated by an inhibitor of serine/threonine phosphatases of the PP-1 and PP-2A type and inhibited by a PKC inhibitor but not by a PKA inhibitor. Thus, it seems to be determined by continuous phosphorylation...... and dephosphorylation involving PKC but not PKA. The steady-state Cl(-) secretion and the maximal obtainable Cl(-) secretion were measured in freshwater-adapted fish and in fish retransferred to saltwater. No I(SC) could be measured in freshwater-adapted fish or in the fish within the first 18 h after transfer...

  5. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  6. Short-chain fatty acids induced autophagy serves as an adaptive strategy for retarding mitochondria-mediated apoptotic cell death

    OpenAIRE

    Tang, Y; Chen, Y; Jiang, H; Nie, D

    2010-01-01

    Short-chain fatty acids (SCFAs) are the major by-products of bacterial fermentation of undigested dietary fibers in the large intestine. SCFAs, mostly propionate and butyrate, inhibit proliferation and induce apoptosis in colon cancer cells, but clinical trials had mixed results regarding the anti-tumor activities of SCFAs. Herein we demonstrate that propionate and butyrate induced autophagy in human colon cancer cells to dampen apoptosis whereas inhibition of autophagy potentiated SCFA induc...

  7. Adaptive computations of flow around a delta wing with vortex breakdown

    Science.gov (United States)

    Modiano, David L.; Murman, Earll M.

    1993-01-01

    An adaptive unstructured mesh solution method for the three-dimensional Euler equations was used to simulate the flow around a sharp edged delta wing. Emphasis was on the breakdown of the leading edge vortex at high angle of attack. Large values of entropy, which indicate vortical regions of the flow, specified the region in which adaptation was performed. The aerodynamic normal force coefficients show excellent agreement with wind tunnel data measured by Jarrah, and demonstrate the importance of adaptation in obtaining an accurate solution. The pitching moment coefficient and the location of vortex breakdown are compared with experimental data measured by Hummel and Srinivasan, showing good agreement in cases in which vortex breakdown is located over the wing.

  8. Glove-Enabled Computer Operations (GECO): Design and Testing of an Extravehicular Activity Glove Adapted for Human-Computer Interface

    Science.gov (United States)

    Adams, Richard J.; Olowin, Aaron; Krepkovich, Eileen; Hannaford, Blake; Lindsay, Jack I. C.; Homer, Peter; Patrie, James T.; Sands, O. Scott

    2013-01-01

    The Glove-Enabled Computer Operations (GECO) system enables an extravehicular activity (EVA) glove to be dual-purposed as a human-computer interface device. This paper describes the design and human participant testing of a right-handed GECO glove in a pressurized glove box. As part of an investigation into the usability of the GECO system for EVA data entry, twenty participants were asked to complete activities including (1) a Simon Says game in which they attempted to duplicate random sequences of targeted finger strikes and (2) a Text Entry activity in which they used the GECO glove to enter target phrases in two different virtual keyboard modes. In a within-subjects design, both activities were performed both with and without vibrotactile feedback. Participants' mean accuracies in correctly generating finger strikes with the pressurized glove were surprisingly high, both with and without the benefit of tactile feedback. Five of the subjects achieved mean accuracies exceeding 99% in both conditions. In Text Entry, tactile feedback provided a statistically significant performance benefit, quantified by characters entered per minute, as well as a reduction in error rate. Secondary analyses of responses to NASA Task Load Index (TLX) subjective workload assessments reveal a benefit for tactile feedback in GECO glove use for data entry. This first-ever investigation of employment of a pressurized EVA glove for human-computer interface opens up a wide range of future applications, including text chat communications, manipulation of procedures/checklists, cataloguing/annotating images, scientific note taking, human-robot interaction, and control of suit and/or other EVA systems.

  9. Applying a new computer-aided detection scheme generated imaging marker to predict short-term breast cancer risk

    Science.gov (United States)

    Mirniaharikandehei, Seyedehnafiseh; Hollingsworth, Alan B.; Patel, Bhavika; Heidari, Morteza; Liu, Hong; Zheng, Bin

    2018-05-01

    This study aims to investigate the feasibility of identifying a new quantitative imaging marker based on false-positives generated by a computer-aided detection (CAD) scheme to help predict short-term breast cancer risk. An image dataset including four-view mammograms acquired from 1044 women was retrospectively assembled. All mammograms were originally interpreted as negative by radiologists. In the next subsequent mammography screening, 402 women were diagnosed with breast cancer and 642 remained negative. An existing CAD scheme was applied ‘as is’ to process each image. From the CAD-generated results, four detection features, including the total number of (1) initial detection seeds and (2) final detected false-positive regions, and the (3) average and (4) sum of detection scores, were computed from each image. Then, by combining the features computed from two bilateral images of the left and right breasts from either the craniocaudal or mediolateral oblique view, two logistic regression models were trained and tested using a leave-one-case-out cross-validation method to predict the likelihood of each testing case being positive in the next subsequent screening. The new prediction model yielded a maximum prediction accuracy with an area under the ROC curve of AUC = 0.65 ± 0.017 and a maximum adjusted odds ratio of 4.49 with a 95% confidence interval of (2.95, 6.83). The results also showed an increasing trend in the adjusted odds ratio and risk prediction scores, supporting the feasibility of using CAD-generated false-positive features to help predict short-term breast cancer risk.
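
    A minimal sketch of the evaluation pattern described above, i.e. a logistic regression over four CAD-derived features tested with leave-one-case-out cross-validation (the data and feature names here are synthetic placeholders):

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import LeaveOneOut, cross_val_predict
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(1)
      n = 200
      X = rng.normal(size=(n, 4))       # seeds, regions, mean score, sum score
      y = (X[:, 1] + 0.5 * X[:, 3] + rng.normal(size=n) > 0).astype(int)

      probs = cross_val_predict(LogisticRegression(), X, y,
                                cv=LeaveOneOut(), method="predict_proba")[:, 1]
      print("AUC:", round(roc_auc_score(y, probs), 3))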

  10. Translation and cross-cultural adaptation of the Brazilian Portuguese version of the Driving Anger Scale (DAS: long form and short form

    Directory of Open Access Journals (Sweden)

    Jessye Almeida Cantini

    2015-03-01

    Introduction: Driving anger has attracted the attention of researchers in recent years because it may induce individuals to drive aggressively or adopt risk behaviors. The Driving Anger Scale (DAS) was designed to evaluate the propensity of drivers to become angry or aggressive while driving. This study describes the cross-cultural adaptation of a Brazilian version of the short form and the long form of the DAS. Methods: Translation and adaptation were made in four steps: two translations and two back-translations carried out by independent evaluators; the development of a brief version by four bilingual experts in mental health and driving behaviors; a subsequent experimental application; and, finally, an investigation of operational equivalence. Results: Final Brazilian versions of the short form and of the long form of the DAS were produced and are presented. Conclusions: This important instrument, which assesses driving anger and aggressive behaviors, is now available to evaluate the driving behaviors of the Brazilian population, which facilitates research in this field.

  11. Multi-objective differential evolution with adaptive Cauchy mutation for short-term multi-objective optimal hydro-thermal scheduling

    Energy Technology Data Exchange (ETDEWEB)

    Qin Hui [College of Hydropower and Information Engineering, Huazhong University of Science and Technology, Wuhan 430074 (China); Zhou Jianzhong, E-mail: jz.zhou@hust.edu.c [College of Hydropower and Information Engineering, Huazhong University of Science and Technology, Wuhan 430074 (China); Lu Youlin; Wang Ying; Zhang Yongchuan [College of Hydropower and Information Engineering, Huazhong University of Science and Technology, Wuhan 430074 (China)

    2010-04-15

    A new multi-objective optimization method based on differential evolution with adaptive Cauchy mutation (MODE-ACM) is presented to solve the short-term multi-objective optimal hydro-thermal scheduling (MOOHS) problem. Besides fuel cost, the pollutant gas emission is also optimized as an objective. The water transport delay between connected reservoirs and the effect of valve-point loading of thermal units are also taken into account in the presented problem formulation. The proposed algorithm adopts an elitist archive to retain non-dominated solutions obtained during the evolutionary process. It modifies DE's operators to make them suitable for multi-objective optimization (MOO) problems and to improve performance. Furthermore, to avoid premature convergence, an adaptive Cauchy mutation is proposed to preserve the diversity of the population. An effective constraint-handling method is utilized to handle the complex equality and inequality constraints. The effectiveness of the proposed algorithm is tested on a hydro-thermal system consisting of four cascaded hydro plants and three thermal units. The results obtained by MODE-ACM are compared with several previous studies. It is found that the results obtained by MODE-ACM are superior in terms of fuel cost as well as emission output, while requiring less computation time. Thus it can be a viable alternative to generate optimal trade-offs for the short-term MOOHS problem.
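
    For illustration, a minimal single-objective sketch of a DE step with an added Cauchy mutation for diversity (the actual MODE-ACM is multi-objective, with an elitist archive and constraint handling; this only shows the DE/rand/1 update plus an adaptive Cauchy perturbation):

      import numpy as np

      rng = np.random.default_rng(2)
      def sphere(x): return float(np.sum(x ** 2))    # stand-in cost function

      dim, npop, F, CR = 5, 20, 0.5, 0.9
      pop = rng.uniform(-5, 5, (npop, dim))
      fit = np.array([sphere(p) for p in pop])

      for gen in range(200):
          for i in range(npop):
              a, b, c = pop[rng.choice(npop, 3, replace=False)]
              trial = np.where(rng.random(dim) < CR, a + F * (b - c), pop[i])
              # Adaptive Cauchy mutation: scale shrinks as generations proceed.
              scale = 1.0 / (1 + gen)
              trial = trial + scale * rng.standard_cauchy(dim) * (rng.random(dim) < 0.1)
              f = sphere(trial)
              if f < fit[i]:                 # greedy selection
                  pop[i], fit[i] = trial, f
      print("best cost:", round(float(fit.min()), 6))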

  12. Semi-supervised adaptation in ssvep-based brain-computer interface using tri-training

    DEFF Research Database (Denmark)

    Bender, Thomas; Kjaer, Troels W.; Thomsen, Carsten E.

    2013-01-01

    This paper presents a novel and computationally simple tri-training based semi-supervised steady-state visual evoked potential (SSVEP)-based brain-computer interface (BCI). It is implemented with autocorrelation-based features and a Naïve-Bayes classifier (NBC). The system uses nine characters...

  13. Monitoring self-adaptive applications within edge computing frameworks: A state-of-the-art review

    NARCIS (Netherlands)

    Taherizadeh, S.; Jones, A.C.; Taylor, I.; Zhao, Z.; Stankovski, V.

    Recently, a promising trend has evolved from previous centralized computation to decentralized edge computing in the proximity of end-users to provide cloud applications. To ensure the Quality of Service (QoS) of such applications and Quality of Experience (QoE) for the end-users, it is necessary to

  14. Effects of Short-Interval and Long-Interval Swimming Protocols on Performance, Aerobic Adaptations, and Technical Parameters: A Training Study.

    Science.gov (United States)

    Dalamitros, Athanasios A; Zafeiridis, Andreas S; Toubekis, Argyris G; Tsalis, George A; Pelarigo, Jailton G; Manou, Vasiliki; Kellis, Spiridon

    2016-10-01

    Dalamitros, AA, Zafeiridis, AS, Toubekis, AG, Tsalis, GA, Pelarigo, JG, Manou, V, and Kellis, S. Effects of short-interval and long-interval swimming protocols on performance, aerobic adaptations, and technical parameters: A training study. J Strength Cond Res 30(10): 2871-2879, 2016. This study compared two interval swimming training programs of different work interval durations, matched for total distance and exercise intensity, on swimming performance, aerobic adaptations, and technical parameters. Twenty-four former swimmers were equally divided into a short-interval training group (INT50, 12-16 × 50 m with 15 seconds rest), a long-interval training group (INT100, 6-8 × 100 m with 30 seconds rest), and a control group (CON). The 2 experimental groups followed the specified swimming training program for 8 weeks. Before and after training, swimming performance, technical parameters, and indices of aerobic adaptations were assessed. INT50 and INT100 improved swimming performance in the 100- and 400-m tests and the maximal aerobic speed (p ≤ 0.05); performance in the 50-m swim did not change. Posttraining VO2max values were higher compared with pretraining values in both training groups (p ≤ 0.05), whereas peak aerobic power output increased only in INT100 (p ≤ 0.05). The 1-minute heart rate and blood lactate recovery values decreased after training in both groups (p ≤ 0.05), and stroke length increased after training in both groups (p ≤ 0.05); no changes were observed in stroke rate after training. Comparisons between groups on posttraining mean values, after adjusting for pretraining values, revealed no significant differences between INT50 and INT100 for all variables; however, all measures were improved vs. the respective values in the CON (p ≤ 0.05).

  15. Statistical Indexes for Monitoring Item Behavior under Computer Adaptive Testing Environment.

    Science.gov (United States)

    Zhu, Renbang; Yu, Feng; Liu, Su

    A computerized adaptive test (CAT) administration usually requires a large supply of items with accurately estimated psychometric properties, such as item response theory (IRT) parameter estimates, to ensure the precision of examinee ability estimation. However, an estimated IRT model of a given item in any given pool does not always correctly…
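
    One simple way to monitor item behavior of this kind is to compare examinees' observed correct rates on an item against the rates expected from its calibrated item response theory parameters, binned by ability; a minimal sketch under a 2PL model (synthetic data, illustrative statistic only):

      import numpy as np

      def p_2pl(theta, a, b):
          return 1.0 / (1.0 + np.exp(-a * (theta - b)))

      rng = np.random.default_rng(3)
      theta = rng.normal(size=5000)                   # estimated abilities
      a_cal, b_cal = 1.2, 0.0                         # calibrated parameters
      responses = rng.random(5000) < p_2pl(theta, 1.2, -0.4)   # item has drifted

      bins = np.digitize(theta, [-1.0, 0.0, 1.0])     # four ability strata
      for k in range(4):
          sel = bins == k
          obs = responses[sel].mean()
          exp = p_2pl(theta[sel], a_cal, b_cal).mean()
          print(f"bin {k}: observed {obs:.3f} vs expected {exp:.3f}")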

  16. ERP Human Enhancement Progress Report : Use case and computational model for adaptive maritime automation

    NARCIS (Netherlands)

    Kleij, R. van der; Broek, J. van den; Brake, G.M. te; Rypkema, J.A.; Schilder, C.M.C.

    2015-01-01

    Automation is often applied in order to increase the cost-effectiveness, reliability and safety of maritime ship and offshore operations. Automation of operator tasks has not, however, eliminated human error so much as created opportunities for new kinds of error. The ambition of the Adaptive

  17. Circuit motifs for contrast-adaptive differentiation in early sensory systems: the role of presynaptic inhibition and short-term plasticity.

    Science.gov (United States)

    Zhang, Danke; Wu, Si; Rasch, Malte J

    2015-01-01

    In natural signals, such as the luminance values across a visual scene, abrupt changes in intensity are often more relevant to an organism than the intensity values at other positions and times. Thus, to reduce redundancy, sensory systems are specialized to detect the times and amplitudes of informative abrupt changes in the input stream rather than coding the intensity values at all times. In theory, a system that responds transiently to fast changes is called a differentiator. In principle, several different neural circuit mechanisms exist that are capable of responding transiently to abrupt input changes. However, it is unclear which circuit would be best suited for early sensory systems, where the dynamic range of the natural input signals can be very wide. We here compare the properties of different simple neural circuit motifs for implementing signal differentiation. We found that a circuit motif based on presynaptic inhibition (PI) is unique in the sense that the vesicle resources in the presynaptic site can be stably maintained over a wide range of stimulus intensities, making PI a biophysically plausible mechanism for implementing a differentiator with a very wide dynamic range. Moreover, by additionally considering short-term plasticity (STP), differentiation becomes contrast adaptive in the PI-circuit but not in other potential neural circuit motifs. Numerical simulations show that the behavior of the adaptive PI-circuit is consistent with experimental observations, suggesting that adaptive presynaptic inhibition might be a good candidate neural mechanism to achieve differentiation in early sensory systems.
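
    A minimal sketch of the short-term-plasticity ingredient discussed above: a rate-based synapse with vesicle depletion responds transiently to an abrupt input step, i.e. it differentiates (this models only the depression component, not the full presynaptic-inhibition circuit):

      import numpy as np

      dt, tau_rec, U = 0.001, 0.2, 0.5
      t = np.arange(0.0, 2.0, dt)
      rate = np.where((t > 0.5) & (t < 1.5), 50.0, 5.0)   # step input (Hz)

      x = 1.0                        # available vesicle resources
      out = np.empty_like(t)
      for i, r in enumerate(rate):
          out[i] = U * x * r                        # transmitted signal
          dx = (1.0 - x) / tau_rec - U * x * r      # recovery vs depletion
          x += dx * dt
      # 'out' peaks just after the step at t = 0.5 s, then adapts back down.
      print(round(out[int(0.51 / dt)], 2), round(out[int(1.4 / dt)], 2))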

  18. Image-guided adaptive gating of lung cancer radiotherapy: a computer simulation study

    Energy Technology Data Exchange (ETDEWEB)

    Aristophanous, Michalis; Rottmann, Joerg; Park, Sang-June; Berbeco, Ross I [Department of Radiation Oncology, Brigham and Women's Hospital, Dana Farber Cancer Institute and Harvard Medical School, Boston, MA (United States); Nishioka, Seiko [Department of Radiology, NTT Hospital, Sapporo (Japan); Shirato, Hiroki, E-mail: maristophanous@lroc.harvard.ed [Department of Radiation Medicine, Hokkaido University School of Medicine, Sapporo (Japan)

    2010-08-07

    The purpose of this study is to investigate the effect that image-guided adaptation of the gating window during treatment could have on the residual tumor motion, by simulating different gated radiotherapy techniques. There are three separate components of this simulation: (1) the 'Hokkaido Data', which are previously measured 3D data of lung tumor motion tracks and the corresponding 1D respiratory signals obtained during the entire ungated radiotherapy treatments of eight patients, (2) the respiratory gating protocol at our institution and the imaging performed under that protocol and (3) the actual simulation in which the Hokkaido Data are used to select tumor position information that could have been collected based on the imaging performed under our gating protocol. We simulated treatments with a fixed gating window and a gating window that is updated during treatment. The patient data were divided into different fractions, each with continuous acquisitions longer than 2 min. In accordance to the imaging performed under our gating protocol, we assume that we have tumor position information for the first 15 s of treatment, obtained from kV fluoroscopy, and for the rest of the fractions the tumor position is only available during the beam-on time from MV imaging. The gating window was set according to the information obtained from the first 15 s such that the residual motion was less than 3 mm. For the fixed gating window technique the gate remained the same for the entire treatment, while for the adaptive technique the range of the tumor motion during beam-on time was measured and used to adapt the gating window to keep the residual motion below 3 mm. The algorithm used to adapt the gating window is described. The residual tumor motion inside the gating window was reduced on average by 24% for the patients with regular breathing patterns and the difference was statistically significant (p-value = 0.01). The magnitude of the residual tumor motion
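
    A minimal sketch of the adaptive-gating idea: measure the residual tumor motion observed during beam-on and tighten the gating window whenever the residual exceeds the 3 mm tolerance (the update rule and numbers are illustrative, not the authors' exact algorithm):

      import numpy as np

      def update_gate(beam_on_positions, gate, tolerance_mm=3.0, step=0.9):
          residual = float(np.ptp(beam_on_positions))   # peak-to-peak motion, mm
          if residual > tolerance_mm:
              lo, hi = gate
              center = 0.5 * (lo + hi)
              half = 0.5 * (hi - lo) * step             # tighten the window
              gate = (center - half, center + half)
          return gate, residual

      rng = np.random.default_rng(4)
      gate = (-2.0, 2.0)                  # gating window on the surrogate signal
      for minute in range(5):
          positions = rng.normal(0.0, 1.6, 120)   # tumor positions at beam-on
          gate, res = update_gate(positions, gate)
          print(f"residual {res:.1f} mm -> gate {gate[0]:.2f}..{gate[1]:.2f}")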

  19. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  20. Complications with computer-aided designed/computer-assisted manufactured titanium and soldered gold bars for mandibular implant-overdentures: short-term observations.

    Science.gov (United States)

    Katsoulis, Joannis; Wälchli, Julia; Kobel, Simone; Gholami, Hadi; Mericske-Stern, Regina

    2015-01-01

    Implant-overdentures supported by rigid bars provide stability in the edentulous atrophic mandible. However, fractures of solder joints and matrices, and loosening of screws and matrices, were observed with soldered gold bars (G-bars). Computer-aided designed/computer-assisted manufactured (CAD/CAM) titanium bars (Ti-bars) may reduce technical complications due to enhanced material quality. The aim was to compare prosthetic-technical maintenance service of mandibular implant-overdentures supported by CAD/CAM Ti-bars and soldered G-bars. Edentulous patients were consecutively admitted for implant-prosthodontic treatment with a maxillary complete denture and a mandibular implant-overdenture connected to a rigid G-bar or Ti-bar. Maintenance service and problems with the implant-retention device complex and the prosthesis were recorded over a minimum of 3-4 years. Annual peri-implant crestal bone level changes (ΔBIC) were radiographically assessed. Data of 213 edentulous patients (mean age 68 ± 10 years), who had received a total of 477 tapered implants, were available. Ti-bar and G-bar comprised 101 and 112 patients with 231 and 246 implants, respectively. Ti-bars mostly exhibited distal bar extensions (96%) compared to 34% of G-bars. Implant-overdentures supported by soldered gold bars or milled CAD/CAM Ti-bars are a successful treatment modality but require regular maintenance service. These short-term observations support the hypothesis that CAD/CAM Ti-bars reduce technical complications. Fracture location indicated that the titanium thickness around the screw-access hole should be increased. © 2013 Wiley Periodicals, Inc.

  1. Detection of advance item knowledge using response times in computer adaptive testing

    NARCIS (Netherlands)

    Meijer, R.R.; Sotaridona, Leonardo

    2006-01-01

    We propose a new method for detecting item preknowledge in a CAT based on an estimate of “effective response time” for each item. Effective response time is defined as the time required for an individual examinee to answer an item correctly. An unusually short response time relative to the expected
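
    A minimal sketch of this kind of flagging under a log-normal response-time model: a correct response whose log time falls far below the item's expected mean is flagged as possible preknowledge (the paper's effective-response-time estimate is more elaborate):

      import numpy as np

      rng = np.random.default_rng(5)
      log_rt = rng.normal(np.log(30.0), 0.4, 1000)     # honest examinees, ~30 s
      cheat_rt = np.log(np.array([3.2, 2.7, 4.1]))     # suspiciously fast answers

      mu, sigma = log_rt.mean(), log_rt.std()
      def flag(log_t, z_crit=-3.0):
          """Flag response times whose z-score is far below the item mean."""
          return (log_t - mu) / sigma < z_crit

      for t in cheat_rt:
          print(f"{np.exp(t):.1f} s -> flagged: {bool(flag(t))}")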

  2. Adaptive Radar Signal Processing-The Problem of Exponential Computational Cost

    National Research Council Canada - National Science Library

    Rangaswamy, Muralidhar

    2003-01-01

    .... Extensions to handle the case of non-Gaussian clutter statistics are presented. Current challenges of limited training data support, computational cost, and severely heterogeneous clutter backgrounds are outlined...

  3. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  4. Responding to Sea Level Rise: Does Short-Term Risk Reduction Inhibit Successful Long-Term Adaptation?

    Science.gov (United States)

    Keeler, A. G.; McNamara, D. E.; Irish, J. L.

    2018-04-01

    Most existing coastal climate-adaptation planning processes, and the research supporting them, tightly focus on how to use land use planning, policy tools, and infrastructure spending to reduce risks from rising seas and changing storm conditions. While central to community response to sea level rise, we argue that the exclusive nature of this focus biases against and delays decisions to take more discontinuous, yet proactive, actions to adapt—for example, relocation and aggressive individual protection investments. Public policies should anticipate real estate market responses to risk reduction to avoid large costs—social and financial—when and if sea level rise and other climate-related factors elevate the risks to such high levels that discontinuous responses become the least bad alternative.

  5. Computing the dynamics of biomembranes by combining conservative level set and adaptive finite element methods

    OpenAIRE

    Laadhari , Aymen; Saramito , Pierre; Misbah , Chaouqi

    2014-01-01

    The numerical simulation of the deformation of vesicle membranes under simple shear external fluid flow is considered in this paper. A new saddle-point approach is proposed for the imposition of the fluid incompressibility and the membrane inextensibility constraints, through Lagrange multipliers defined in the fluid and on the membrane respectively. Using a level set formulation, the problem is approximated by mixed finite elements combined with an automatic adaptive ...

  6. Short term high dose atorvastatin for the prevention of contrast-induced nephropathy in patients undergoing computed tomography angiography

    Directory of Open Access Journals (Sweden)

    Hamid Sanei

    2014-09-01

    BACKGROUND: Statins have been shown by some studies to be effective in preventing contrast-induced nephropathy (CIN). We evaluated the effectiveness of atorvastatin in the prevention of CIN in computed tomography angiography (CTA) candidates. METHODS: This study was conducted on patients referred for elective CTA with normal renal function. Patients received atorvastatin (80 mg/day) or placebo from 24 h before to 48 h after administration of the contrast material. Serum creatinine was measured before and 48 h after contrast material injection. CIN was defined as an increase in serum creatinine level of ≥ 0.5 mg/dl or ≥ 25% of the baseline creatinine. RESULTS: A total of 236 patients completed the study (115 atorvastatin, 121 placebo; mean age = 58.40 ± 9.80 years; 68.6% male). Serum creatinine increased after contrast material injection in both the atorvastatin (1.00 ± 0.16 to 1.02 ± 0.15 mg/dl, P = 0.017) and placebo groups (1.03 ± 0.17 to 1.08 ± 0.18 mg/dl, P < 0.001). Controlling for age, gender, comorbidities, drug history, and baseline serum creatinine level, patients who received atorvastatin experienced a smaller increase in serum creatinine after contrast material injection (beta = 0.127, P = 0.034). However, there was no difference between the atorvastatin and placebo groups in the incidence of CIN (4.3 vs. 5.0%, P = 0.535). CONCLUSION: In patients undergoing CTA, short-term treatment with high-dose atorvastatin is effective in preventing contrast-induced renal dysfunction, in terms of a smaller increase in serum creatinine level after contrast material injection. Further trials including larger samples of patients and longer follow-ups are warranted. Keywords: Kidney Diseases, Multidetector Computed Tomography, Contrast Media, Hydroxymethylglutaryl-CoA Reductase Inhibitors, Atorvastatin

  7. Spatial co-adaptation of cortical control columns in a micro-ECoG brain-computer interface

    Science.gov (United States)

    Rouse, A. G.; Williams, J. J.; Wheeler, J. J.; Moran, D. W.

    2016-10-01

    Objective. Electrocorticography (ECoG) has been used for a range of applications including electrophysiological mapping, epilepsy monitoring, and more recently as a recording modality for brain-computer interfaces (BCIs). Studies that examine ECoG electrodes designed and implanted chronically solely for BCI applications remain limited. The present study explored how two key factors influence chronic, closed-loop ECoG BCI: (i) the effect of inter-electrode distance on BCI performance and (ii) the differences in neural adaptation and performance when fixed versus adaptive BCI decoding weights are used. Approach. The amplitudes of epidural micro-ECoG signals between 75 and 105 Hz with 300 μm diameter electrodes were used for one-dimensional and two-dimensional BCI tasks. The effect of inter-electrode distance on BCI control was tested between 3 and 15 mm. Additionally, the performance and cortical modulation differences between constant, fixed decoding using a small subset of channels versus adaptive decoding weights using the entire array were explored. Main results. Successful BCI control was possible with two electrodes separated by 9 and 15 mm. Performance decreased and the signals became more correlated when the electrodes were only 3 mm apart. BCI performance in a 2D BCI task improved significantly when using adaptive decoding weights (80%-90%) compared to using constant, fixed weights (50%-60%). Additionally, modulation increased for channels previously unavailable for BCI control under the fixed decoding scheme upon switching to the adaptive, all-channel scheme. Significance. Our results clearly show that neural activity under a BCI recording electrode (which we define as a ‘cortical control column’) readily adapts to generate an appropriate control signal. These results show that the practical minimal spatial resolution of these control columns with micro-ECoG BCI is likely on the order of 3 mm. Additionally, they show that the combination and
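
    A minimal sketch of the fixed-versus-adaptive contrast: a linear decoder mapping band-power features to cursor velocity whose weights are updated online by a least-mean-squares rule; freezing the update reproduces the fixed-weight condition (illustrative, not the study's decoder):

      import numpy as np

      rng = np.random.default_rng(6)
      n_ch, lr = 8, 0.01
      w = np.zeros(n_ch)                  # adaptive weights (fixed = never updated)
      true_w = rng.normal(size=n_ch)      # how cortex actually modulates

      for trial in range(2000):
          feats = rng.normal(size=n_ch)   # 75-105 Hz amplitude features
          target = true_w @ feats         # intended cursor velocity
          pred = w @ feats
          w += lr * (target - pred) * feats   # LMS update toward the target
      print("weight error:", round(float(np.linalg.norm(w - true_w)), 3))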

  8. Towards incorporating affective computing to virtual rehabilitation; surrogating attributed attention from posture for boosting therapy adaptation

    Science.gov (United States)

    Rivas, Jesús J.; Heyer, Patrick; Orihuela-Espina, Felipe; Sucar, Luis Enrique

    2015-01-01

    Virtual rehabilitation (VR) is a novel motor rehabilitation therapy in which the rehabilitation exercises occur through interaction with bespoke virtual environments. These virtual environments dynamically adapt their activity to match the therapy progress. Adaptation should be guided by the cognitive and emotional state of the patient, none of which are directly observable. Here, we present our first steps towards inferring non-observable attentional state from unobtrusively observable seated posture, so that this knowledge can later be exploited by a VR platform to modulate its behaviour. The space of seated postures was discretized, and 648 pictures of acted representations were exposed to crowd-evaluation to determine the attributed state of attention. A semi-supervised classifier based on Naïve Bayes with structural improvement was learnt to unfold a predictive relation between posture and attributed attention. Internal validity was established following a 2×5 cross-fold strategy. Following 4959 votes from the crowd, classification accuracy reached a promising 96.29% (µ ± σ = 87.59 ± 6.59) and F-measure reached 82.35% (µ ± σ = 69.72 ± 10.50). With the afforded rate of classification, we believe it is safe to claim posture as a reliable proxy for attributed attentional state. It follows that unobtrusively monitoring posture can be exploited for guiding an intelligent adaptation in a virtual rehabilitation platform. This study further helps to identify critical aspects of posture permitting inference of attention.

  9. Principles underlying the design of "The Number Race", an adaptive computer game for remediation of dyscalculia

    Directory of Open Access Journals (Sweden)

    Cohen Laurent

    2006-05-01

    Background: Adaptive game software has been successful in remediation of dyslexia. Here we describe the cognitive and algorithmic principles underlying the development of similar software for dyscalculia. Our software is based on current understanding of the cerebral representation of number and the hypotheses that dyscalculia is due to a "core deficit" in number sense or in the link between number sense and symbolic number representations. Methods: "The Number Race" software trains children on an entertaining numerical comparison task, by presenting problems adapted to the performance level of the individual child. We report full mathematical specifications of the algorithm used, which relies on an internal model of the child's knowledge in a multidimensional "learning space" consisting of three difficulty dimensions: numerical distance, response deadline, and conceptual complexity (from non-symbolic numerosity processing to increasingly complex symbolic operations). Results: The performance of the software was evaluated both by mathematical simulations and by five weeks of use by nine children with mathematical learning difficulties. The results indicate that the software adapts well to varying levels of initial knowledge and learning speeds. Feedback from children, parents and teachers was positive. A companion article [1] describes the evolution of number sense and arithmetic scores before and after training. Conclusion: The software, open-source and freely available online, is designed for learning-disabled children aged 5–8, and may also be useful for general instruction of normal preschool children. The learning algorithm reported is highly general, and may be applied in other domains.
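
    A minimal sketch of the adaptation principle behind such software, reduced to a one-dimensional staircase on numerical distance (the full algorithm models the child's knowledge across all three difficulty dimensions):

      import random

      def play_trial(distance, skill=4.0):
          """Stand-in for the child: closer numbers are harder to compare."""
          return random.random() < min(0.95, 0.5 + distance / (2 * skill))

      distance = 8.0                     # start with easy, far-apart numbers
      for trial in range(30):
          if play_trial(distance):
              distance = max(1.0, distance * 0.8)   # harder after success
          else:
              distance = min(9.0, distance * 1.3)   # easier after failure
          print(f"trial {trial + 1}: distance {distance:.2f}")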

  10. An 8-item short form of the Eating Disorder Examination-Questionnaire adapted for children (ChEDE-Q8).

    Science.gov (United States)

    Kliem, Sören; Schmidt, Ricarda; Vogel, Mandy; Hiemisch, Andreas; Kiess, Wieland; Hilbert, Anja

    2017-06-01

    Eating disturbances are common in children, placing a vulnerable group of them at risk for full-syndrome eating disorders and adverse health outcomes. To provide a valid self-report assessment of eating disorder psychopathology in children, a short form of the child version of the Eating Disorder Examination-Questionnaire (ChEDE-Q) was psychometrically evaluated. Similar to the EDE-Q, the ChEDE-Q provides assessment of eating disorder psychopathology related to anorexia nervosa, bulimia nervosa, and binge-eating disorder; however, the ChEDE-Q does not assess symptoms of avoidant/restrictive food intake disorder, pica, or rumination disorder. In 1,836 participants ages 7 to 18 years, recruited from two independent population-based samples, the factor structure of the recently established 8-item short form EDE-Q8 for adults was examined, including measurement invariance analyses on age, gender, and weight status derived from objectively measured weight and height. For convergent validity, the ChEDE-Q global score, body esteem scale, strengths and difficulties questionnaire, and sociodemographic characteristics were used. Item characteristics and age- and gender-specific norms were calculated. Confirmatory factor analysis revealed good model fit for the 8-item ChEDE-Q. Measurement invariance analyses indicated strict invariance for all analyzed subgroups. Convergent validity was demonstrated through associations with well-established questionnaires and age, gender, and weight status, in expected directions. The newly developed ChEDE-Q8 proved to be a psychometrically sound and economical self-report assessment tool of eating disorder psychopathology in children. Further validation studies are needed, particularly concerning discriminant and predictive validity. © 2017 Wiley Periodicals, Inc.

  11. Initial phantom study comparing image quality in computed tomography using adaptive statistical iterative reconstruction and new adaptive statistical iterative reconstruction v.

    Science.gov (United States)

    Lim, Kyungjae; Kwon, Heejin; Cho, Jinhan; Oh, Jongyoung; Yoon, Seongkuk; Kang, Myungjin; Ha, Dongho; Lee, Jinhwa; Kang, Eunju

    2015-01-01

    The purpose of this study was to assess the image quality of a novel advanced iterative reconstruction (IR) method called "adaptive statistical IR V" (ASIR-V) by comparing image noise, contrast-to-noise ratio (CNR), and spatial resolution with those of filtered back projection (FBP) and adaptive statistical IR (ASIR) on computed tomography (CT) phantom images. We performed CT scans at 5 different tube currents (50, 70, 100, 150, and 200 mA) using 3 types of CT phantoms. Scanned images were subsequently reconstructed in 7 different scan settings: FBP, and 3 levels each of ASIR and ASIR-V (30%, 50%, and 70%). The image noise was measured in the first study using a body phantom, the CNR was measured in the second study using a contrast phantom, and the spatial resolution was measured in the third study using a high-resolution phantom. We compared the image noise, CNR, and spatial resolution among the 7 reconstructed image scan settings to determine whether noise reduction, high CNR, and high spatial resolution could be achieved with ASIR-V. Quantitative analysis of the first and second studies showed that the images reconstructed using ASIR-V had reduced image noise and improved CNR compared with those of FBP and ASIR. In the third study, images reconstructed using ASIR-V had significantly improved spatial resolution compared with those of FBP and ASIR. ASIR-V provides a significant reduction in image noise and a significant improvement in CNR as well as spatial resolution. Therefore, this technique has the potential to reduce the radiation dose further without compromising image quality.

  12. Integration of adaptive process control with computational simulation for spin-forming

    International Nuclear Information System (INIS)

    Raboin, P. J. LLNL

    1998-01-01

    Improvements in spin-forming capabilities through upgrades to a metrology and machine control system and advances in numerical simulation techniques were studied in a two year project funded by Laboratory Directed Research and Development (LDRD) at Lawrence Livermore National Laboratory. Numerical analyses were benchmarked with spin-forming experiments and computational speeds increased sufficiently to now permit actual part forming simulations. Extensive modeling activities examined the simulation speeds and capabilities of several metal forming computer codes for modeling flat plate and cylindrical spin-forming geometries. Shape memory research created the first numerical model to describe this highly unusual deformation behavior in Uranium alloys. A spin-forming metrology assessment led to sensor and data acquisition improvements that will facilitate future process accuracy enhancements, such as a metrology frame. Finally, software improvements (SmartCAM) to the manufacturing process numerically integrate the part models to the spin-forming process and to computational simulations

  13. Quality Assurance Challenges for Motion-Adaptive Radiation Therapy: Gating, Breath Holding, and Four-Dimensional Computed Tomography

    International Nuclear Information System (INIS)

    Jiang, Steve B.; Wolfgang, John; Mageras, Gig S.

    2008-01-01

    Compared with conventional three-dimensional (3D) conformal radiation therapy and intensity-modulated radiation therapy treatments, quality assurance (QA) for motion-adaptive radiation therapy involves various challenges because of the added temporal dimension. Here we discuss those challenges for three specific techniques related to motion-adaptive therapy: namely respiratory gating, breath holding, and four-dimensional computed tomography. Similar to the introduction of any other new technologies in clinical practice, typical QA measures should be taken for these techniques also, including initial testing of equipment and clinical procedures, as well as frequent QA examinations during the early stage of implementation. Here, rather than covering every QA aspect in depth, we focus on some major QA challenges. The biggest QA challenge for gating and breath holding is how to ensure treatment accuracy when internal target position is predicted using external surrogates. Recommended QA measures for each component of treatment, including simulation, planning, patient positioning, and treatment delivery and verification, are discussed. For four-dimensional computed tomography, some major QA challenges have also been discussed

  14. Adaptive-weighted total variation minimization for sparse data toward low-dose x-ray computed tomography image reconstruction.

    Science.gov (United States)

    Liu, Yan; Ma, Jianhua; Fan, Yi; Liang, Zhengrong

    2012-12-07

    Previous studies have shown that by minimizing the total variation (TV) of the to-be-estimated image with some data and other constraints, piecewise-smooth x-ray computed tomography (CT) images can be reconstructed from sparse-view projection data without introducing notable artifacts. However, due to the piecewise-constant assumption for the image, a conventional TV minimization algorithm often suffers from over-smoothness on the edges of the resulting image. To mitigate this drawback, we present an adaptive-weighted TV (AwTV) minimization algorithm in this paper. The presented AwTV model is derived by considering the anisotropic edge property among neighboring image voxels, where the associated weights are expressed as an exponential function and can be adaptively adjusted by the local image-intensity gradient for the purpose of preserving the edge details. Inspired by the previously reported TV-POCS (projection onto convex sets) implementation, a similar AwTV-POCS implementation was developed to minimize the AwTV subject to data and other constraints for the purpose of sparse-view low-dose CT image reconstruction. To evaluate the presented AwTV-POCS algorithm, both qualitative and quantitative studies were performed by computer simulations and phantom experiments. The results show that the presented AwTV-POCS algorithm can yield images with several notable gains, in terms of noise-resolution tradeoff plots and full-width-at-half-maximum values, as compared to the corresponding conventional TV-POCS algorithm.
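
    A minimal sketch of the adaptive weighting at the heart of AwTV: each neighboring-voxel difference is scaled by an exponential of the local intensity gradient, so strong edges are penalized less and thus preserved (one common form of such a penalty; delta is the edge-scale parameter):

      import numpy as np

      def awtv(image, delta=0.02):
          dx = np.diff(image, axis=0)
          dy = np.diff(image, axis=1)
          wx = np.exp(-(dx / delta) ** 2)   # weights shrink across strong edges
          wy = np.exp(-(dy / delta) ** 2)
          return float(np.sum(wx * np.abs(dx)) + np.sum(wy * np.abs(dy)))

      img = np.zeros((64, 64)); img[:, 32:] = 1.0    # piecewise-constant phantom
      plain_tv = float(np.sum(np.abs(np.diff(img, axis=0))) +
                       np.sum(np.abs(np.diff(img, axis=1))))
      print("AwTV:", awtv(img), "vs plain TV:", plain_tv)   # edge barely penalized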

  15. Metabolic adaptations to short-term every-other-day feeding in long-living Ames dwarf mice.

    Science.gov (United States)

    Brown-Borg, Holly M; Rakoczy, Sharlene

    2013-09-01

    Restrictive dietary interventions exert significant beneficial physiological effects in terms of aging and age-related disease in many species. Every-other-day feeding (EOD) has been utilized in aging research and shown to mimic many of the positive outcomes consequent with dietary restriction. This study employed long-living Ames dwarf mice subjected to EOD feeding to examine the adaptations of the oxidative phosphorylation and antioxidative defense systems to this feeding regimen. Every-other-day feeding lowered liver glutathione (GSH) concentrations in dwarf and wild type (WT) mice but altered GSH biosynthesis and degradation in WT mice only. The activities of liver OXPHOS enzymes and corresponding proteins declined in WT mice fed EOD, while in dwarf animals the levels were maintained or increased with this feeding regimen. Antioxidative enzymes were differentially affected depending on the tissue, whether proliferative or post-mitotic. Gene expression of components of liver methionine metabolism remained elevated in dwarf mice when compared to WT mice as previously reported; however, enzymes responsible for recycling homocysteine to methionine were elevated in both genotypes in response to EOD feeding. The data suggest that the differences in anabolic hormone levels likely affect the sensitivity of long-living and control mice to this dietary regimen, with dwarf mice exhibiting fewer responses in comparison to WT mice. These results provide further evidence that dwarf mice may be better protected against metabolic and environmental perturbations, which may in turn contribute to their extended longevity. © 2013.

  16. Encouraging effects of a short-term, adapted Nordic diet intervention on skin microvascular function and skin oxygen tension in younger and older adults.

    Science.gov (United States)

    Rogerson, David; McNeill, Scott; Könönen, Heidi; Klonizakis, Markos

    2018-05-01

    The microvascular benefits of regional diets appear in the literature; however, little is known about Nordic-type diets. We investigated the effects of a short-term, adapted, Nordic diet on microvascular function in younger and older individuals at rest and during activity. Thirteen young (mean age: 28 y; standard deviation: 5 y) and 15 older (mean age: 68 y; standard deviation: 6 y) participants consumed a modified Nordic diet for 4 wk. Laser Doppler flowmetry and transcutaneous oxygen monitoring were used to assess cutaneous microvascular function and oxygen tension pre- and postintervention; blood pressure, body mass, body fat percentage, ratings of perceived exertion, and peak heart rate during activity were examined concurrently. Axon-mediated vasodilation improved in older participants (1.17 [0.30] to 1.30 [0.30]). These findings suggest that a short-term, adapted Nordic diet might improve microvascular health. Copyright © 2017 Elsevier Inc. All rights reserved.

  17. [Cross-cultural adaptation and apparent and content validity of the short version of The Eating Motivation Survey (TEMS) in Brazilian Portuguese].

    Science.gov (United States)

    Moraes, Jéssica Maria Muniz; Alvarenga, Marle Dos Santos

    2017-10-26

    Understanding why people eat what they eat is essential for developing nutritional guidelines capable of modifying inadequate and dysfunctional eating patterns. Such understanding can be assessed by specific instruments, amongst which The Eating Motivation Survey (TEMS) allows the identification of factors that determine motivations for eating and food choices. The aim of this study is to present the cross-cultural adaptation of the short version of TEMS for use in studies in the Brazilian population. The process involved conceptual and item equivalences; semantic equivalence by 2 translators, 1 linguist, 22 experts (frequency of response understanding), and 23 bilingual individuals (with response comparisons by the paired t-test, Pearson correlation coefficient, and intra-class correlation coefficient); and operational equivalence, performed with 32 individuals. The measurement equivalence corresponding to psychometric properties is under way. All equivalences showed satisfactory results for the scale's use in Brazil, thus allowing application of TEMS to assess motivations for eating choices in the Brazilian context.

  18. Computer vision elastography: speckle adaptive motion estimation for elastography using ultrasound sequences.

    Science.gov (United States)

    Revell, James; Mirmehdi, Majid; McNally, Donal

    2005-06-01

    We present the development and validation of an image-based speckle tracking methodology for determining temporal two-dimensional (2-D) axial and lateral displacement and strain fields from ultrasound video streams. We refine a multiple-scale region matching approach incorporating novel solutions to known speckle tracking problems. Key contributions include automatic similarity measure selection to adapt to varying speckle density, quantifying trajectory fields, and spatiotemporal elastograms. Results are validated using tissue-mimicking phantoms and in vitro data, before applying them to in vivo musculoskeletal ultrasound sequences. The method presented has the potential to improve clinical knowledge of tendon pathology from carpal tunnel syndrome, inflammation from implants, sport injuries, and many others.
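
    A minimal sketch of the region-matching core of such speckle tracking: for a block in one frame, search a neighborhood in the next frame and keep the displacement with maximal normalized cross-correlation (the published method adds multiple scales and adaptive similarity-measure selection):

      import numpy as np

      def ncc(a, b):
          a = a - a.mean(); b = b - b.mean()
          denom = np.sqrt((a * a).sum() * (b * b).sum()) + 1e-12
          return float((a * b).sum() / denom)

      def track_block(f1, f2, y, x, size=16, search=4):
          ref = f1[y:y + size, x:x + size]
          best, best_score = (0, 0), -2.0
          for dy in range(-search, search + 1):
              for dx in range(-search, search + 1):
                  cand = f2[y + dy:y + dy + size, x + dx:x + dx + size]
                  s = ncc(ref, cand)
                  if s > best_score:
                      best, best_score = (dy, dx), s
          return best

      rng = np.random.default_rng(7)
      frame1 = rng.random((64, 64))
      frame2 = np.roll(frame1, (2, 1), axis=(0, 1))   # known 2-down, 1-right shift
      print(track_block(frame1, frame2, 24, 24))      # expect (2, 1)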

  19. Adaptation of the short intergenic spacers between co-directional genes to the Shine-Dalgarno motif among prokaryote genomes

    DEFF Research Database (Denmark)

    Caro, Albert Pallejà; García-Vallvé, Santiago; Romeu, Antoni

    2009-01-01

    BACKGROUND: In prokaryote genomes most of the co-directional genes are in close proximity. Even the coding sequence or the stop codon of a gene can overlap with the Shine-Dalgarno (SD) sequence of the downstream co-directional gene. In this paper we analyze how the presence of SD may influence the stop codon usage or the spacing lengths between co-directional genes. RESULTS: The SD sequences for 530 prokaryote genomes have been predicted using computer calculations of the base-pairing free energy between translation initiation regions and the 16S rRNA 3' tail. Genomes with a large... to the discussion of which factors affect the intergenic lengths, which cannot be totally explained by the pressure to compact the prokaryote genomes....

  20. The boat hull model : adapting the roofline model to enable performance prediction for parallel computing

    NARCIS (Netherlands)

    Nugteren, C.; Corporaal, H.

    2012-01-01

    Multi-core and many-core have been major trends for the past six years and are expected to continue for the next decades. With these trends toward parallel computing, it becomes increasingly difficult to decide on which architecture to run a given application. In this work, we use an algorithm

  1. An Adaptive Procedure for Task Scheduling Optimization in Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Pham Phuoc Hung

    2015-01-01

    Nowadays, mobile cloud computing (MCC) has emerged as a new paradigm which enables offloading computation-intensive, resource-consuming tasks up to a powerful computing platform in cloud, leaving only simple jobs to the capacity-limited thin client devices such as smartphones, tablets, Apple’s iWatch, and Google Glass. However, it still faces many challenges due to inherent problems of thin clients, especially the slow processing and low network connectivity. So far, a number of research studies have been carried out, trying to eliminate these problems, yet few have been found efficient. In this paper, we present an enhanced architecture, taking advantage of collaboration of thin clients and conventional desktop or laptop computers, known as thick clients, particularly aiming at improving cloud access. Additionally, we introduce an innovative genetic approach for task scheduling such that the processing time is minimized, while considering network contention and cloud cost. Our simulation shows that the proposed approach is more cost-effective and achieves better performance compared with others.
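
    A minimal sketch of the genetic scheduling idea: a chromosome assigns each task to a machine, fitness is the makespan, and tournament selection with crossover and mutation evolves the assignment (the paper's fitness additionally accounts for network contention and cloud cost):

      import random

      tasks = [random.uniform(1, 10) for _ in range(20)]   # task processing costs
      n_machines, pop_size, gens = 4, 40, 100

      def makespan(assign):
          loads = [0.0] * n_machines
          for t, m in zip(tasks, assign):
              loads[m] += t
          return max(loads)

      pop = [[random.randrange(n_machines) for _ in tasks] for _ in range(pop_size)]
      for _ in range(gens):
          nxt = []
          for _ in range(pop_size):
              p1, p2 = (min(random.sample(pop, 3), key=makespan) for _ in range(2))
              cut = random.randrange(len(tasks))
              child = p1[:cut] + p2[cut:]                  # one-point crossover
              if random.random() < 0.2:                    # mutation
                  child[random.randrange(len(tasks))] = random.randrange(n_machines)
              nxt.append(child)
          pop = nxt
      print("best makespan:", round(min(makespan(c) for c in pop), 2))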

  2. Fast and Efficient Asynchronous Neural Computation with Adapting Spiking Neural Networks

    NARCIS (Netherlands)

    D. Zambrano (Davide); S.M. Bohte (Sander)

    2016-01-01

    textabstractBiological neurons communicate with a sparing exchange of pulses - spikes. It is an open question how real spiking neurons produce the kind of powerful neural computation that is possible with deep artificial neural networks, using only so very few spikes to communicate. Building on

  3. Large-Scale Assessment of a Fully Automatic Co-Adaptive Motor Imagery-Based Brain Computer Interface.

    Directory of Open Access Journals (Sweden)

    Laura Acqualagna

    In recent years, Brain Computer Interface (BCI) technology has benefited from the development of sophisticated machine learning methods that let the user operate the BCI after a few trials of calibration. One remarkable example is the recent development of co-adaptive techniques that have proved to extend the use of BCIs also to people unable to achieve successful control with the standard BCI procedure. Especially for BCIs based on the modulation of the Sensorimotor Rhythm (SMR), these improvements are essential, since a non-negligible percentage of users is unable to operate SMR-BCIs efficiently. In this study we evaluated for the first time a fully automatic co-adaptive BCI system on a large scale. A pool of 168 participants naive to BCIs operated the co-adaptive SMR-BCI in one single session. Different psychological interventions were performed prior to the BCI session in order to investigate how motor coordination training and relaxation could influence BCI performance. A neurophysiological indicator based on the Power Spectral Density (PSD) was extracted from the recording of a few minutes of resting-state brain activity and tested as a predictor of BCI performance. Results show that high accuracy in operating the BCI could be reached by the majority of the participants before the end of the session. BCI performance could be significantly predicted by the neurophysiological indicator, consolidating the validity of the model previously developed. Nevertheless, we still found about 22% of users with performance significantly lower than the threshold of efficient BCI control at the end of the session. Since inter-subject variability is still the major problem of BCI technology, we point out crucial issues for those who did not achieve sufficient control. Finally, we propose valid developments to move a step forward toward the applicability of the promising co-adaptive methods.

  4. Burnout syndrome among dental students: a short version of the "Burnout Clinical Subtype Questionnaire" adapted for students (BCSQ-12-SS).

    Science.gov (United States)

    Montero-Marin, Jesus; Monticelli, Francesca; Casas, Marina; Roman, Amparo; Tomas, Inmaculada; Gili, Margarita; Garcia-Campayo, Javier

    2011-12-12

    Burnout has been traditionally defined in relation to the dimensions of "exhaustion", "cynicism", and "inefficiency". More recently, the Burnout Clinical Subtype Questionnaire (BCSQ-12) further established three different subtypes of burnout: the "frenetic" subtype (related to "overload"), the "under-challenged" subtype (related to "lack of development"), and the "worn-out" subtype (related to "neglect"). However, to date, these definitions have not been applied to students. The aims of this research were (1) to adapt a Spanish version of the BCSQ-12 for use with students, (2) to test its factorial validity, internal consistency, convergent and discriminant validity, and (3) to assess potential socio-demographic and occupational risk factors associated with the development of the subtypes. We used a cross-sectional design on a sample of dental students (n = 314) from Santiago and Huesca universities (Spain). Participants completed the Burnout Clinical Subtype Questionnaire Student Survey (BCSQ-12-SS), the Maslach Burnout Inventory Student Survey (MBI-SS), and a series of socio-demographic and occupational questions formulated for the specific purpose of this study. Data were subjected to exploratory factor analysis (EFA) using the principal component method with varimax orthogonal rotation. To assess the relations with the criterion, we calculated the Pearson correlation coefficient (r), the multiple correlation coefficient (Ry.123), and the coefficient of determination (R2y.123). To assess the association between the subtypes and the socio-demographic variables, we examined the adjusted odds ratios (OR) obtained from multivariate logistic regression models. Factorial analyses supported the theoretical proposition of the BCSQ-12-SS, with α-values exceeding 0.80 for all dimensions. The "overload-exhaustion" relation was r = 0.59 (p < 0.001), supporting the subtype model of burnout as established by the BCSQ-12-SS. As such, the BCSQ-12-SS can be used for the recognition of clinical profiles.

  5. Parallel paving: An algorithm for generating distributed, adaptive, all-quadrilateral meshes on parallel computers

    Energy Technology Data Exchange (ETDEWEB)

    Lober, R.R.; Tautges, T.J.; Vaughan, C.T.

    1997-03-01

    Paving is an automated mesh generation algorithm which produces all-quadrilateral elements. It can additionally generate these elements in varying sizes such that the resulting mesh adapts to a function distribution, such as an error function. While powerful, conventional paving is a very serial algorithm in its operation. Parallel paving is the extension of serial paving into parallel environments to perform the same meshing functions as conventional paving, only on distributed, discretized models. This extension allows large, adaptive, parallel finite element simulations to take advantage of paving's meshing capabilities for h-remap remeshing. A significantly modified version of the CUBIT mesh generation code has been developed to host the parallel paving algorithm, to demonstrate its capabilities on both two-dimensional and three-dimensional surface geometries, and to compare the resulting parallel-produced meshes to conventionally paved meshes for mesh quality and algorithm performance. Sandia's "tiling" dynamic load balancing code has also been extended to work with the paving algorithm to retain parallel efficiency as subdomains undergo iterative mesh refinement.

  6. Adaptive Neuro-Fuzzy Computing Technique for Determining Turbulent Flow Friction Coefficient

    Directory of Open Access Journals (Sweden)

    Mohammad Givehchi

    2013-08-01

    Full Text Available Estimation of the friction coefficient in pipes is very important in many water and wastewater engineering issues, such as the distribution of velocity and shear stress, erosion, sediment transport, and head loss. In analyzing these problems, knowing the friction coefficient yields more accurate estimates. In this study, adaptive neuro-fuzzy inference systems (ANFIS) with the grid partition method were used to estimate the friction coefficient in pipes. Data derived from the Colebrook equation were used for training and testing the neuro-fuzzy model. In the neuro-fuzzy approach, pipe relative roughness and Reynolds number are the input variables and the friction coefficient is the output variable. Performance of the proposed approach was evaluated using data obtained from the Colebrook equation and statistical indicators such as the coefficient of determination (R2), root mean squared error (RMSE), and mean absolute error (MAE). The results showed that the adaptive neuro-fuzzy inference system with the grid partition method, a Gaussian input membership function, and a linear output function estimates the friction coefficient more accurately than the other configurations. The proposed approach is applicable to practical design issues and can be combined with mathematical and numerical models of sediment transport, or with real-time updating of these models.
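
    As a worked example of the training data described above, the Colebrook equation, 1/sqrt(f) = -2 log10((eps/D)/3.7 + 2.51/(Re sqrt(f))), can be solved by fixed-point iteration to produce (Re, eps/D) -> f samples; the ANFIS model itself is not reproduced here.

    import math

    def colebrook(re, rel_rough, tol=1e-10):
        """Friction factor f for Reynolds number re and relative roughness eps/D."""
        f = 0.02                                  # initial guess
        for _ in range(100):
            rhs = -2.0 * math.log10(rel_rough / 3.7 + 2.51 / (re * math.sqrt(f)))
            f_new = 1.0 / rhs ** 2
            if abs(f_new - f) < tol:
                break
            f = f_new
        return f

    # (Re, eps/D) -> f pairs such as these serve as model inputs and target
    for re in (1e4, 1e5, 1e6):
        for rr in (1e-5, 1e-3):
            print(f"Re={re:.0e}  eps/D={rr:.0e}  f={colebrook(re, rr):.5f}")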

  7. Adaptations to Short, Frequent Sessions of Endurance and Strength Training Are Similar to Longer, Less Frequent Exercise Sessions When the Total Volume Is the Same.

    Science.gov (United States)

    Kilen, Anders; Hjelvang, Line B; Dall, Niels; Kruse, Nanna L; Nordsborg, Nikolai B

    2015-11-01

    The hypothesis that the distribution of weekly training across several short sessions, as opposed to fewer longer sessions, enhances maximal strength gain without compromising maximal oxygen uptake was evaluated. Twenty-nine subjects completed an 8-week controlled parallel-group training intervention. One group ("micro training" [MI]: n = 21) performed nine 15-minute training sessions weekly, whereas a second group ("classical training" [CL]: n = 8) completed exactly the same weekly training as three 45-minute sessions. For each group, each session comprised exclusively strength, high-intensity cardiovascular training, or muscle endurance training. Both groups increased shuttle run performance (MI: 1,373 ± 133 m vs. 1,498 ± 126 m, p ≤ 0.05; CL: 1,074 ± 213 m vs. 1,451 ± 202 m, p ≤ 0.05) after the training intervention. In conclusion, similar training adaptations can be obtained with short, frequent exercise sessions or longer, less frequent sessions where the total volume of weekly training performed is the same.

  8. Adaptive Blending of Model and Observations for Automated Short-Range Forecasting: Examples from the Vancouver 2010 Olympic and Paralympic Winter Games

    Science.gov (United States)

    Bailey, Monika E.; Isaac, George A.; Gultepe, Ismail; Heckman, Ivan; Reid, Janti

    2014-01-01

    An automated short-range forecasting system, adaptive blending of observations and model (ABOM), was tested in real time during the 2010 Vancouver Olympic and Paralympic Winter Games in British Columbia. Data at 1-min time resolution were available from a newly established, dense network of surface observation stations. Climatological data were not available at these new stations. This, combined with output from new high-resolution numerical models, provided a unique and exciting setting to test nowcasting systems in mountainous terrain during winter weather conditions. The ABOM method blends extrapolations in time of recent local observations with numerical weather prediction (NWP) model output to generate short-range point forecasts of surface variables out to 6 h. The relative weights of the model forecast and the observation extrapolation are based on performance over recent history. The average performance of ABOM nowcasts during February and March 2010 was evaluated using standard scores and thresholds important for Olympic events. Significant improvements over the model forecasts alone were obtained for continuous variables such as temperature, relative humidity, and wind speed. The small improvements for variables subject to discontinuous changes, such as visibility and ceiling, are attributed to the persistence component of ABOM.
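
    The blending step can be sketched as inverse-error weighting of the model forecast against a persistence-plus-trend extrapolation of recent observations. The record does not specify ABOM's exact weighting rule, so the rule below is an assumption.

    import numpy as np

    def abom_blend(obs, model_hist, model_fcst, eps=1e-6):
        """obs, model_hist: aligned recent series; model_fcst: next model value."""
        obs = np.asarray(obs, float)
        model_hist = np.asarray(model_hist, float)
        # replay the extrapolation over recent history: value at t predicted
        # from t-1 and t-2 (persistence plus linear trend)
        extrap_hist = 2 * obs[1:-1] - obs[:-2]          # predicts obs[2:]
        err_extrap = np.mean((extrap_hist - obs[2:]) ** 2)
        err_model = np.mean((model_hist[2:] - obs[2:]) ** 2)

        # weights proportional to inverse recent squared error
        w_model = 1.0 / (err_model + eps)
        w_extrap = 1.0 / (err_extrap + eps)
        extrap_next = 2 * obs[-1] - obs[-2]
        return (w_model * model_fcst + w_extrap * extrap_next) / (w_model + w_extrap)

    # toy example: temperature series where the model runs about 1.5 degrees warm,
    # so the blend leans toward the observation extrapolation
    obs = [1.0, 1.2, 1.5, 1.9, 2.4]
    model_hist = [2.5, 2.7, 3.0, 3.4, 3.9]
    print(abom_blend(obs, model_hist, model_fcst=4.5))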

  9. The Short French Internet Addiction Test Adapted to Online Sexual Activities: Validation and Links With Online Sexual Preferences and Addiction Symptoms.

    Science.gov (United States)

    Wéry, Aline; Burnay, Jonathan; Karila, Laurent; Billieux, Joël

    2016-01-01

    The goal of this study was to investigate the psychometric properties of a French version of the short Internet Addiction Test adapted to online sexual activities (s-IAT-sex). The French version of the s-IAT-sex was administered to a sample of 401 men. The participants also completed a questionnaire that screened for sexual addiction (PATHOS). The relationships of s-IAT-sex scores with time spent online for online sexual activities (OSAs) and the types of OSAs favored were also considered. Confirmatory analyses supported a two-factor model of the s-IAT-sex, corresponding to the factorial structure found in earlier studies that used the short IAT. The first factor comprises loss of control and time management, whereas the second factor comprises craving and social problems. Internal consistency for each factor was evaluated with Cronbach's α coefficient, resulting in .87 for Factor 1, .76 for Factor 2, and .88 for the global scale. Concurrent validity was supported by relationships with symptoms of sexual addiction, types of OSAs practiced, and time spent online for OSAs. The prevalence of sexual addiction (measured by PATHOS) was 28.1% in the current sample of self-selected male OSA users. The French version of the s-IAT-sex presents good psychometric properties and constitutes a useful tool for researchers and practitioners.
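
    The internal-consistency figures quoted above are Cronbach's α values, which follow the standard formula α = k/(k-1) * (1 - Σ var(item_i) / var(total)). A minimal computation on a synthetic respondents-by-items matrix:

    import numpy as np

    def cronbach_alpha(items):
        """items: 2-D array, rows = respondents, columns = questionnaire items."""
        items = np.asarray(items, float)
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1.0 - item_vars / total_var)

    rng = np.random.default_rng(1)
    latent = rng.normal(size=(200, 1))                    # one common factor
    responses = latent + 0.8 * rng.normal(size=(200, 6))  # six noisy items
    print(f"alpha = {cronbach_alpha(responses):.2f}")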

  10. Burnout syndrome among dental students: a short version of the "Burnout Clinical Subtype Questionnaire" adapted for students (BCSQ-12-SS

    Directory of Open Access Journals (Sweden)

    Montero-Marin Jesus

    2011-12-01

    Full Text Available Abstract Background Burnout has been traditionally defined in relation to the dimensions of "exhaustion", "cynicism", and "inefficiency". More recently, the Burnout Clinical Subtype Questionnaire (BCSQ-12) further established three different subtypes of burnout: the "frenetic" subtype (related to "overload"), the "under-challenged" subtype (related to "lack of development"), and the "worn-out" subtype (related to "neglect"). However, to date, these definitions have not been applied to students. The aims of this research were (1) to adapt a Spanish version of the BCSQ-12 for use with students, (2) to test its factorial validity, internal consistency, convergent and discriminant validity, and (3) to assess potential socio-demographic and occupational risk factors associated with the development of the subtypes. Method We used a cross-sectional design on a sample of dental students (n = 314) from Santiago and Huesca universities (Spain). Participants completed the Burnout Clinical Subtype Questionnaire Student Survey (BCSQ-12-SS), the Maslach Burnout Inventory Student Survey (MBI-SS), and a series of socio-demographic and occupational questions formulated for the specific purpose of this study. Data were subjected to exploratory factor analysis (EFA) using the principal component method with varimax orthogonal rotation. To assess the relations with the criterion, we calculated the Pearson correlation coefficient (r), the multiple correlation coefficient (Ry.123), and the coefficient of determination (R2y.123). To assess the association between the subtypes and the socio-demographic variables, we examined the adjusted odds ratios (OR) obtained from multivariate logistic regression models. Results Factorial analyses supported the theoretical proposition of the BCSQ-12-SS, with α-values exceeding 0.80 for all dimensions. The "overload-exhaustion" relation was r = 0.59 (p < 0.001); the subtypes explained 38.44% of the variance in "exhaustion" (Ry.123 = 0.62), 30.25% in "cynicism" (Ry.123 = 0.55), and 26.01% in "inefficiency" (Ry.123 = 0.51).

  11. An overview of the activities of the OECD/NEA Task Force on adapting computer codes in nuclear applications to parallel architectures

    Energy Technology Data Exchange (ETDEWEB)

    Kirk, B.L. [Oak Ridge National Lab., TN (United States); Sartori, E. [OCDE/OECD NEA Data Bank, Issy-les-Moulineaux (France); Viedma, L.G. de [Consejo de Seguridad Nuclear, Madrid (Spain)

    1997-06-01

    Subsequent to the introduction of High Performance Computing in the developed countries, the Organization for Economic Cooperation and Development/Nuclear Energy Agency (OECD/NEA) created the Task Force on Adapting Computer Codes in Nuclear Applications to Parallel Architectures (under the guidance of the Nuclear Science Committee's Working Party on Advanced Computing) to study the growth area in supercomputing and its applicability to the nuclear community's computer codes. The result has been four years of investigation for the Task Force in different subject fields - deterministic and Monte Carlo radiation transport, computational mechanics and fluid dynamics, nuclear safety, atmospheric models and waste management.

  12. An overview of the activities of the OECD/NEA Task Force on adapting computer codes in nuclear applications to parallel architectures

    International Nuclear Information System (INIS)

    Kirk, B.L.; Sartori, E.; Viedma, L.G. de

    1997-01-01

    Subsequent to the introduction of High Performance Computing in the developed countries, the Organization for Economic Cooperation and Development/Nuclear Energy Agency (OECD/NEA) created the Task Force on Adapting Computer Codes in Nuclear Applications to Parallel Architectures (under the guidance of the Nuclear Science Committee's Working Party on Advanced Computing) to study the growth area in supercomputing and its applicability to the nuclear community's computer codes. The result has been four years of investigation for the Task Force in different subject fields - deterministic and Monte Carlo radiation transport, computational mechanics and fluid dynamics, nuclear safety, atmospheric models and waste management

  13. Adaptation of Toodee-2 computer code for reflood analysis in Angra-1 reactor

    International Nuclear Information System (INIS)

    Praes, J.G.L.; Onusic Junior, J.

    1981-01-01

    A method of calculating the heat transfer coefficient used in the Toodee-2 computer code for core reflood analysis in a loss-of-coolant accident is presented. Preliminary results are presented using heat transfer correlations based on FLECHT experiments, suited to a 16 x 16 geometric arrangement (Angra I). Optional calculations are suggested for the heat transfer coefficients when cooling of the fuel cladding by steam is considered. (Author) [pt

  14. Adaptive learning with covariate shift-detection for motor imagery-based brain–computer interface

    OpenAIRE

    Raza, H; Cecotti, H; Li, Y; Prasad, G

    2015-01-01

    A common assumption in traditional supervised learning is that the probability distribution of the data is similar between the training phase and the testing/operating phase. A shift in the probability distribution of the input data when transitioning from training to testing is known as a covariate shift. Covariate shifts commonly arise in a wide range of real-world systems such as electroencephalogram-based brain–computer interfaces (BCIs). In such systems, there is a necessity for continuous mo...
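
    One plausible way to detect such a shift is a two-sample test between a training-phase feature window and an operating-phase window; the Kolmogorov-Smirnov test below is an illustrative stand-in, not necessarily the detector used by the authors.

    import numpy as np
    from scipy.stats import ks_2samp

    rng = np.random.default_rng(2)
    train_feature = rng.normal(0.0, 1.0, size=500)    # calibration-phase feature
    test_feature = rng.normal(0.6, 1.0, size=200)     # operating phase, shifted mean

    stat, p_value = ks_2samp(train_feature, test_feature)
    if p_value < 0.05:
        print(f"covariate shift detected (KS={stat:.3f}, p={p_value:.4f}) -> adapt")
    else:
        print("no shift detected; keep the current classifier")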

  15. Identification of an Adaptable Computer Program Design for Analyzing a Modular Organizational Assessment Instrument.

    Science.gov (United States)

    1981-09-01

    Keywords: survey-guided development; organizational effectiveness; computer program; organizational diagnosis; management. Only fragments of the report documentation page survive in this record; the recoverable citations are: Bowers, D. G. Organizational diagnosis: A review and a proposed method. Doctoral dissertation, Purdue University, December 1977 (DTIC AD-A059-542); and Comparative issues and methods in organizational diagnosis. Ann Arbor, MI: Institute for Social Research, University of Michigan, November 1977.

  16. From Collective Adaptive Systems to Human Centric Computation and Back: Spatial Model Checking for Medical Imaging

    Directory of Open Access Journals (Sweden)

    Gina Belmonte

    2016-07-01

    Full Text Available Recent research on formal verification for Collective Adaptive Systems (CAS pushed advancements in spatial and spatio-temporal model checking, and as a side result provided novel image analysis methodologies, rooted in logical methods for topological spaces. Medical Imaging (MI is a field where such technologies show potential for ground-breaking innovation. In this position paper, we present a preliminary investigation centred on applications of spatial model checking to MI. The focus is shifted from pure logics to a mixture of logical, statistical and algorithmic approaches, driven by the logical nature intrinsic to the specification of the properties of interest in the field. As a result, novel operators are introduced, that could as well be brought back to the setting of CAS.

  17. Adaptable structural synthesis using advanced analysis and optimization coupled by a computer operating system

    Science.gov (United States)

    Sobieszczanski-Sobieski, J.; Bhat, R. B.

    1979-01-01

    A finite element program is linked with a general-purpose optimization program in a 'programming system' which includes user-supplied codes that contain problem-dependent formulations of the design variables, objective function, and constraints. The result is a system adaptable to a wide spectrum of structural optimization problems. In a sample of numerical examples, the design variables are the cross-sectional dimensions and the parameters of overall shape geometry, constraints are applied to stresses, displacements, buckling and vibration characteristics, and structural mass is the objective function. Thin-walled, built-up structures and frameworks are included in the sample. Details of the system organization and characteristics of the component programs are given.

  18. Influence of Adaptive Statistical Iterative Reconstruction on coronary plaque analysis in coronary computed tomography angiography

    DEFF Research Database (Denmark)

    Precht, Helle; Kitslaar, Pieter H; Broersen, Alexander

    2016-01-01

    performed. Images were reconstructed using FBP, 30% and 60% adaptive statistical IR (ASIR). Coronary plaque analysis was performed as per-patient and per-vessel (LM, LAD, CX and RCA) measurements. Lumen and vessel volumes and plaque burden measurements were based on automatically detected contours in each reconstruction. Lumen and plaque intensity measurements and HU-based plaque characterization were based on corrected contours copied to each reconstruction. RESULTS: No significant changes between FBP and 30% ASIR were found except for lumen (-2.53 HU) and plaque intensities (-1.28 HU). Between FBP and 60% ASIR, the change in total volume showed an increase of 0.94%, 4.36% and 2.01% for lumen, plaque and vessel, respectively. The change in total plaque burden between FBP and 60% ASIR was 0.76%. Lumen and plaque intensities decreased between FBP and 60% ASIR by -9.90 HU and -1.97 HU, respectively. The total plaque...

  19. Pipelining Computational Stages of the Tomographic Reconstructor for Multi-Object Adaptive Optics on a Multi-GPU System

    KAUST Repository

    Charara, Ali

    2014-11-01

    The European Extremely Large Telescope project (E-ELT) is one of Europe's highest priorities in ground-based astronomy. ELTs are built on top of a variety of highly sensitive and critical astronomical instruments. In particular, a new instrument called MOSAIC has been proposed to perform multi-object spectroscopy using the Multi-Object Adaptive Optics (MOAO) technique. The core implementation of the simulation lies in the intensive computation of a tomographic reconstructor (TR), which is used to drive the deformable mirror in real time from the measurements. A new numerical algorithm is proposed (1) to capture the actual experimental noise and (2) to substantially speed up previous implementations by exposing more concurrency, while reducing the number of floating-point operations. Based on the Matrices Over Runtime System at Exascale numerical library (MORSE), a dynamic scheduler drives all computational stages of the tomographic reconstructor simulation and allows tasks to be pipelined and run out-of-order across different stages on heterogeneous systems, while ensuring data coherency and dependencies. The proposed TR simulation asymptotically outperforms previous state-of-the-art implementations, with up to a 13-fold speedup. At more than 50,000 unknowns, this appears to be the largest-scale AO problem submitted to computation to date, and it opens new research directions for extreme-scale AO simulations. © 2014 IEEE.

  20. Human versus Computer Controlled Selection of Ventilator Settings: An Evaluation of Adaptive Support Ventilation and Mid-Frequency Ventilation

    Directory of Open Access Journals (Sweden)

    Eduardo Mireles-Cabodevila

    2012-01-01

    Full Text Available Background. There are modes of mechanical ventilation that can select ventilator settings with computer-controlled algorithms (targeting schemes). Two examples are adaptive support ventilation (ASV) and mid-frequency ventilation (MFV). We studied how much clinician-chosen ventilator settings differ from those selected by these computer algorithms under different scenarios. Methods. A survey of critical care clinicians provided reference ventilator settings for a 70 kg paralyzed patient in five clinical/physiological scenarios. The survey-derived values for minute ventilation and minute alveolar ventilation were used as goals for ASV and MFV, respectively. A lung simulator programmed with each scenario's respiratory system characteristics was ventilated using the clinician, ASV, and MFV settings. Results. Tidal volumes ranged from 6.1 to 8.3 mL/kg for the clinicians, 6.7 to 11.9 mL/kg for ASV, and 3.5 to 9.9 mL/kg for MFV. Inspiratory pressures were lower for ASV and MFV. Clinician-selected tidal volumes were similar to the ASV settings for all scenarios except asthma, in which the tidal volumes were larger for ASV and MFV. MFV delivered the same alveolar minute ventilation with higher end-expiratory and lower end-inspiratory volumes. Conclusions. There are differences and similarities among initial ventilator settings selected by humans and computers for various clinical scenarios. The ventilation outcomes are the result of the lung physiological characteristics and their interaction with the targeting scheme.

  1. Hybrid GPU-CPU adaptive precision ray-triangle intersection tests for robust high-performance GPU dosimetry computations

    International Nuclear Information System (INIS)

    Perrotte, Lancelot; Bodin, Bruno; Chodorge, Laurent

    2011-01-01

    Before an intervention on a nuclear site, it is essential to study different scenarios to identify the least dangerous one for the operator. It is therefore mandatory to have an efficient dosimetry simulation code that produces accurate results. One classical method in radiation protection is the straight-line attenuation method with build-up factors. In the case of 3D industrial scenes composed of meshes, the computational cost resides in the fast computation of all of the intersections between the rays and the triangles of the scene. Efficient GPU algorithms have already been proposed that enable dosimetry calculation for a huge scene (800,000 rays, 800,000 triangles) in a fraction of a second. But these algorithms are not robust: because of the rounding caused by floating-point arithmetic, the numerical results of the ray-triangle intersection tests can differ from the expected mathematical results. In the worst case, this can lead to a computed dose rate dramatically lower than the real dose rate to which the operator is exposed. In this paper, we present a hybrid GPU-CPU algorithm to manage adaptive-precision floating-point arithmetic. This algorithm allows robust ray-triangle intersection tests, with a very small loss of performance (less than 5% overhead), and without any need for scene-dependent tuning. (author)
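
    The adaptive-precision idea can be sketched as follows: run the standard Moeller-Trumbore ray-triangle test in single precision and fall back to double precision only when the single-precision determinant is too close to zero to be trusted. The error thresholds below are illustrative assumptions; the paper derives rigorous floating-point bounds.

    import numpy as np

    def intersects(orig, d, v0, v1, v2, dtype, eps):
        # Moeller-Trumbore ray-triangle intersection in the given precision
        o, d, v0, v1, v2 = (np.asarray(a, dtype) for a in (orig, d, v0, v1, v2))
        e1, e2 = v1 - v0, v2 - v0
        p = np.cross(d, e2)
        det = np.dot(e1, p)
        if abs(det) < eps:
            return None                      # ambiguous at this precision
        inv = 1.0 / det
        t_vec = o - v0
        u = np.dot(t_vec, p) * inv           # first barycentric coordinate
        if u < 0.0 or u > 1.0:
            return False
        q = np.cross(t_vec, e1)
        v = np.dot(d, q) * inv               # second barycentric coordinate
        if v < 0.0 or u + v > 1.0:
            return False
        return np.dot(e2, q) * inv >= 0.0    # hit if intersection is ahead of ray

    def robust_intersects(orig, d, v0, v1, v2):
        hit = intersects(orig, d, v0, v1, v2, np.float32, eps=1e-5)
        if hit is None:                      # degenerate in float32: retry in double
            hit = intersects(orig, d, v0, v1, v2, np.float64, eps=1e-12)
        return bool(hit)

    print(robust_intersects([0, 0, -1], [0, 0, 1],
                            [-1, -1, 0], [1, -1, 0], [0, 1, 0]))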

  2. Limitations of 64-Detector-Row Computed Tomography Coronary Angiography: Calcium and Motion but not Short Experience

    International Nuclear Information System (INIS)

    Mir-Akbari, H.; Ripsweden, J.; Jensen, J.; Pichler, P.; Sylven, C.; Cederlund, K.; Rueck, A.

    2009-01-01

    Background: Recently, 64-detector-row computed tomography coronary angiography (CTA) has been introduced for the noninvasive diagnosis of coronary artery disease. Purpose: To evaluate the diagnostic capacity and limitations of a newly established CTA service. Material and Methods: In 101 outpatients with suspected coronary artery disease, 64-detector-row CTA (VCT Lightspeed 64; GE Healthcare, Milwaukee, WI, USA) was performed before invasive coronary angiography (ICA). The presence of >50% diameter coronary stenosis on CTA was rated by two radiologists recently trained in CTA, and separately by an experienced colleague. Diagnostic performance of CTA was calculated at the segment, vessel, and patient levels, using ICA as the reference. Segments with a proximal reference diameter <2 mm or with stents were not analyzed. Results: In 51 of 101 patients and 121 of 1280 segments, ICA detected coronary stenosis. In 274 of 1280 (21%) segments, CTA had non-diagnostic image quality, the main reasons being severe calcifications (49%), motion artifacts associated with high or irregular heart rate (45%), and low contrast opacification (14%). Significantly more women (43%) had non-diagnostic scans compared with men (20%). A heart rate above 60 beats per minute was associated with significantly more non-diagnostic patients (38% vs. 18%). In the 1006 diagnostic segments, CTA had a sensitivity of 78%, specificity of 95%, positive predictive value (PPV) of 54%, and negative predictive value (NPV) of 98% for detecting significant coronary stenosis. In 29 patients, CTA was non-diagnostic. In the remaining 72 patients, sensitivity was 100%, specificity 65%, PPV 79%, and NPV 100%. The use of a more experienced CTA reader did not improve diagnostic performance. Conclusion: CTA had a very high negative predictive value, but the number of non-diagnostic scans was high, especially in women. The main limitations were motion artifacts and vessel calcifications, while short experience in CTA did not limit diagnostic performance.

  3. Inside marginal adaptation of crowns by X-ray micro-computed tomography

    International Nuclear Information System (INIS)

    Dos Santos, T. M.; Lima, I.; Lopes, R. T.; Author, S. B. Jr.

    2015-01-01

    The objective of this work was to assess the dental arcade using X-ray micro-computed tomography. For this purpose, a high-resolution system was used and three groups were studied: the Zirkonzahn CAD-CAM system, IPS e.max Press, and metal ceramic. The three systems assessed in this study showed marginal and discrepancy gaps within clinically accepted limits. The 2D and 3D evaluations showed that the technique is a powerful method to investigate quantitative characteristics of the dental arcade. (authors)

  4. Inside marginal adaptation of crowns by X-ray micro-computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Dos Santos, T. M.; Lima, I.; Lopes, R. T. [Nuclear Instrumentation Laboratory, Nuclear Engineering Program, Federal University of Rio de Janeiro, RJ, (Brazil); Author, S. B. Jr. [Department of Physics, Colorado State University, Ft. Collins, CO 80523, (United States)

    2015-07-01

    The objective of this work was to assess the dental arcade using X-ray micro-computed tomography. For this purpose, a high-resolution system was used and three groups were studied: the Zirkonzahn CAD-CAM system, IPS e.max Press, and metal ceramic. The three systems assessed in this study showed marginal and discrepancy gaps within clinically accepted limits. The 2D and 3D evaluations showed that the technique is a powerful method to investigate quantitative characteristics of the dental arcade. (authors)

  5. Adaptation of PyFlag to Efficient Analysis of Overtaken Computer Data Storage

    Directory of Open Access Journals (Sweden)

    Aleksander Byrski

    2010-03-01

    Full Text Available Based on existing software aimed at supporting the analysis of computer data storage seized during an investigation (PyFlag), an extension is proposed that introduces dedicated components for data identification and filtering. Hash codes for popular software contained in the NIST/NSRL database are used to exclude irrelevant files while searching and to classify the remainder into several categories. The extension allows for further analysis, e.g. using artificial intelligence methods. The considerations are illustrated by an overview of the system's design.
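
    The hash-based filtering step can be sketched as follows: hash every file on the examined storage and drop those whose digest appears in a reference set of known software. The reference-set format assumed here (one hex digest per line) is a simplification; the actual NSRL distribution is a larger CSV schema.

    import hashlib
    from pathlib import Path

    def load_known_hashes(path):
        return {line.strip().lower() for line in open(path) if line.strip()}

    def sha1_of(path, chunk=1 << 20):
        h = hashlib.sha1()
        with open(path, "rb") as f:
            while blk := f.read(chunk):
                h.update(blk)
        return h.hexdigest()

    def unknown_files(root, known):
        """Yield only files not recognized as stock OS/application content."""
        for p in Path(root).rglob("*"):
            if p.is_file() and sha1_of(p) not in known:
                yield p

    # usage: known = load_known_hashes("nsrl_sha1.txt")
    #        for f in unknown_files("/mnt/evidence", known): print(f)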

  6. Adaptation of the C.H.A.D. computer library to nuclear simulations

    Science.gov (United States)

    Rock, Daniel Thomas

    The Computational Hydrodynamics for Automotive Design computer program, CHAD, is a modern, three-dimensional computational fluid dynamics code that holds promise for fulfilling a need in the nuclear industry and academia. Because CHAD may be freely distributed to non-export-controlled countries, it offers a cheap and customizable CFD capability. Several modifications were made to CHAD to make it more usable to those in industry and academia. A first-order upwinding scheme for momentum and enthalpy and a reformulated continuity equation were migrated from a historical version of CHAD developed at Argonne National Laboratory. The Portable, Extensible Toolkit for Scientific Computing, PETSc, was also added as an optional solver package for the original and reformulated continuity equations. PETSc's highly optimized parallel solvers can be activated from either CHAD's input file or the command line. Solution times for PETSc-based calculations depend in large part on the convergence criteria provided; however, improvements in CPU time of approximately one-third have been observed. CHAD was further extended by adding a capability to monitor solution progress at a specified coordinate in space, as well as by monitoring the residuals in the problem. The ability to model incompressible fluids was also added to the code. Incompressible fluid comparisons were made using several test cases against the commercial CFD code Fluent and found to agree well. A major limitation of CHAD in the academic environment is its limited mesh generation capability. A tool was developed for CHAD that translates Gambit-based neutral mesh files into a CHAD-usable format. This tool was used to translate a large mesh representing a simplified cooling jacket of a BWR control rod drive. This model serves as a practical demonstration of a nuclear application for CHAD and PETSc. Both CHAD with PETSc and Fluent were used to obtain solutions to this problem. The overall agreement between the two

  7. Development of an item bank for the EORTC Role Functioning Computer Adaptive Test (EORTC RF-CAT)

    DEFF Research Database (Denmark)

    Gamper, Eva-Maria; Petersen, Morten Aa.; Aaronson, Neil

    2016-01-01

    a computer-adaptive test (CAT) for RF. This was part of a larger project whose objective is to develop a CAT version of the EORTC QLQ-C30, which is one of the most widely used HRQOL instruments in oncology. METHODS: In accordance with EORTC guidelines, the development of the RF-CAT comprised four phases... with good psychometric properties. The resulting item bank exhibits excellent reliability (mean reliability = 0.85, median = 0.95). Using the RF-CAT may allow sample-size savings of 11% up to 50% compared to using the QLQ-C30 RF scale. CONCLUSIONS: The RF-CAT item bank improves the precision...

  8. Extreme Computing for Extreme Adaptive Optics: the Key to Finding Life Outside our Solar System

    KAUST Repository

    Ltaief, Hatem; Sukkari, Dalal; Guyon, Olivier; Keyes, David E.

    2018-01-01

    The real-time correction of telescopic images in the search for exoplanets is highly sensitive to atmospheric aberrations. The pseudo-inverse algorithm is an efficient mathematical method to filter out these turbulences. We introduce a new partial singular value decomposition (SVD) algorithm based on the QR-based Diagonally Weighted Halley (QDWH) iteration for the pseudo-inverse method of adaptive optics. The QDWH partial SVD algorithm selectively calculates the most significant singular values and their corresponding singular vectors. We develop a high-performance implementation and demonstrate the numerical robustness of the QDWH-based partial SVD method. We also perform a benchmarking campaign on various generations of GPU hardware accelerators and compare against the state-of-the-art SVD implementation SGESDD from the MAGMA library. Numerical accuracy and performance results are reported using synthetic and real observational datasets from the Subaru telescope. Our implementation outperforms SGESDD with up to fivefold and fourfold performance speedups on ill-conditioned synthetic matrices and real observational datasets, respectively. The pseudo-inverse simulation code will be deployed on-sky for the Subaru telescope during observation nights scheduled for early 2018.
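
    The numerical core described above is a regularized pseudo-inverse built from a partial SVD. A dense NumPy sketch of the same filtering step follows (the QDWH iteration and the GPU pipeline are not reproduced, and the truncation ratio is an assumption):

    import numpy as np

    def truncated_pinv(A, rtol=1e-3):
        """Pseudo-inverse keeping only singular modes above rtol * sigma_max."""
        U, s, Vt = np.linalg.svd(A, full_matrices=False)
        keep = s > rtol * s[0]               # filter out the noisy small modes
        return (Vt[keep].T / s[keep]) @ U[:, keep].T

    rng = np.random.default_rng(3)
    # synthetic ill-conditioned interaction matrix (singular values decay fast)
    A = rng.standard_normal((400, 200)) @ np.diag(np.logspace(0, -8, 200) ** 0.5)
    x = truncated_pinv(A) @ rng.standard_normal(400)   # stable command vector
    print(x.shape)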

  9. Hybrid Direct and Iterative Solver with Library of Multi-criteria Optimal Orderings for h Adaptive Finite Element Method Computations

    KAUST Repository

    AbouEisha, Hassan M.

    2016-06-02

    In this paper we present a multi-criteria optimization of element partition trees and the resulting orderings for multi-frontal solver algorithms executed for the two-dimensional h-adaptive finite element method. In particular, the problem of optimal ordering of the elimination of rows in the sparse matrices resulting from adaptive finite element method computations is reduced to the problem of finding optimal element partition trees. Given a two-dimensional h-refined mesh, we find all optimal element partition trees by using the dynamic programming approach. An element partition tree defines a prescribed order of elimination of degrees of freedom over the mesh. We utilize three different metrics to estimate the quality of an element partition tree. As the first criterion we consider the number of floating-point operations (FLOPs) performed by the multi-frontal solver. As the second criterion we consider the number of memory transfers (MEMOPS) performed by the multi-frontal solver algorithm. As the third criterion we consider the memory usage (NONZEROS) of the multi-frontal direct solver. We show the optimization results for FLOPs vs. MEMOPS, as well as for the execution time, estimated as FLOPs + 100 x MEMOPS, vs. NONZEROS. We obtain Pareto fronts with multiple optimal trees for each mesh and for each refinement level. We generate a library of optimal elimination trees for small grids with local singularities. We also propose an algorithm for a given large mesh with identified local sub-grids, each containing a local singularity: we compute Schur complements over the sub-grids using the optimal trees from the library, and we submit the sequence of Schur complements to the iterative solver ILUPCG.

  10. Improvement of Detection of Hypoattenuation in Acute Ischemic Stroke in Unenhanced Computed Tomography Using an Adaptive Smoothing Filter

    International Nuclear Information System (INIS)

    Takahashi, N.; Lee, Y.; Tsai, D. Y.; Ishii, K.; Kinoshita, T.; Tamura, H.; Kimura, M.

    2008-01-01

    Background: Much attention has been directed toward identifying early signs of cerebral ischemia on computed tomography (CT) images. Hypoattenuation of ischemic brain parenchyma has been found to be the most frequent early sign. Purpose: To evaluate the effect of a previously proposed adaptive smoothing filter for improving the detection of parenchymal hypoattenuation of acute ischemic stroke on unenhanced CT images. Material and Methods: Twenty-six patients with parenchymal hypoattenuation and 49 control subjects without hypoattenuation were retrospectively selected for this study. The adaptive partial median filter (APMF), designed to improve the detectability of hypoattenuation areas on unenhanced CT images, was applied. Seven radiologists, including four certified radiologists and three radiology residents, indicated their confidence level regarding the presence (or absence) of hypoattenuation on CT images, first without and then with the APMF-processed images. Their performance without and with the APMF-processed images was evaluated by receiver operating characteristic (ROC) analysis. Results: The mean area under the ROC curve (AUC) for all observers increased from 0.875 to 0.929 (P=0.002) when the radiologists observed the APMF-processed images. The mean sensitivity in the detection of hypoattenuation significantly improved, from 69% (126 of 182 observations) to 83% (151 of 182 observations), when employing the APMF (P=0.012). The specificity, however, was unaffected by the APMF (P=0.41). Conclusion: The APMF has the potential to improve the detection of parenchymal hypoattenuation of acute ischemic stroke on unenhanced CT images
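
    A generic variance-gated smoothing filter conveys the flavor of the approach: smooth aggressively where the image is flat and leave structured regions untouched. This stand-in differs from the published partial-median construction, and the window size and variance threshold are illustrative assumptions.

    import numpy as np
    from scipy.ndimage import median_filter, uniform_filter

    def adaptive_smooth(img, size=5, var_thresh=25.0):
        img = img.astype(float)
        local_mean = uniform_filter(img, size)
        local_var = uniform_filter(img ** 2, size) - local_mean ** 2
        smoothed = median_filter(img, size)
        # keep original pixels where structure (high local variance) is present
        return np.where(local_var < var_thresh, smoothed, img)

    rng = np.random.default_rng(4)
    ct = rng.normal(35.0, 3.0, (64, 64))       # noisy pseudo-CT parenchyma (HU)
    ct[20:40, 20:40] -= 4.0                    # subtle hypoattenuation region
    print(adaptive_smooth(ct).std(), ct.std()) # noise is reduced after filtering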

  11. Improvement of resolution in full-view linear-array photoacoustic computed tomography using a novel adaptive weighting method

    Science.gov (United States)

    Omidi, Parsa; Diop, Mamadou; Carson, Jeffrey; Nasiriavanaki, Mohammadreza

    2017-03-01

    Linear-array-based photoacoustic computed tomography is a popular methodology for deep, high-resolution imaging. However, issues such as phase aberration, side-lobe effects, and propagation limitations deteriorate the resolution. The effect of phase aberration, due to acoustic attenuation and the assumption of a constant speed of sound (SoS), can be reduced by applying an adaptive weighting method such as the coherence factor (CF). Utilizing an adaptive beamforming algorithm such as minimum variance (MV) can improve the resolution at the focal point by suppressing the side-lobes. Moreover, the invisibility of directional objects emitting parallel to the detection plane, such as vessels and other absorbing structures stretched in the direction perpendicular to the detection plane, can degrade resolution. In this study, we propose a full-view array-level weighting algorithm in which different weights are assigned to different positions of the linear array based on an orientation algorithm that uses the histogram of oriented gradients (HOG). Simulation results obtained from a synthetic phantom show the superior performance of the proposed method over existing reconstruction methods.
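
    The coherence factor mentioned above has a standard closed form, CF = |sum_i s_i|^2 / (N * sum_i s_i^2), computed per pixel over the N delayed channel signals; a minimal sketch:

    import numpy as np

    def coherence_factor(channel_signals):
        """channel_signals: array (N_elements,) of delayed samples for one pixel."""
        s = np.asarray(channel_signals, float)
        coherent = np.abs(s.sum()) ** 2
        incoherent = len(s) * np.sum(s ** 2)
        return coherent / incoherent if incoherent > 0 else 0.0

    aligned = np.full(64, 1.0)                        # perfectly focused target
    noisy = np.random.default_rng(5).normal(size=64)  # incoherent clutter
    print(coherence_factor(aligned), coherence_factor(noisy))  # ~1.0 vs ~0.0

    The beamformed pixel is then weighted by its CF, so incoherent clutter (CF near 0) is suppressed while focused targets (CF near 1) are preserved.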

  12. Adaptive vibrational configuration interaction (A-VCI): A posteriori error estimation to efficiently compute anharmonic IR spectra

    Science.gov (United States)

    Garnier, Romain; Odunlami, Marc; Le Bris, Vincent; Bégué, Didier; Baraille, Isabelle; Coulaud, Olivier

    2016-05-01

    A new variational algorithm called adaptive vibrational configuration interaction (A-VCI), intended for the resolution of the vibrational Schrödinger equation, was developed. The main advantage of this approach is to efficiently reduce the dimension of the active space generated in the configuration interaction (CI) process. Here, we assume that the Hamiltonian is written as a sum of products of operators. The adaptive algorithm was developed with the use of three correlated conditions, i.e., a suitable starting space, a criterion for convergence, and a procedure to expand the approximate space. The speed of the algorithm was increased by using an a posteriori error estimator (residue) to select the most relevant directions in which to expand the space. Two examples were selected for benchmarking. In the case of H2CO, we mainly study the performance of the A-VCI algorithm: comparison with the variation-perturbation method, choice of the initial space, and residual contributions. For CH3CN, we compare the A-VCI results with a computed reference spectrum using the same potential energy surface and an active space reduced by about 90%.

  13. Adapting to a New Core Curriculum at Hood College: From Computation to Quantitative Literacy

    Directory of Open Access Journals (Sweden)

    Betty Mayfield

    2015-07-01

    Full Text Available Our institution, a small, private liberal arts college, recently revised its core curriculum. In the Department of Mathematics, we took this opportunity to formally introduce Quantitative Literacy into the language and the reality of the academic requirements for all students. We developed a list of characteristics that we thought all QL courses should exhibit, no matter in which department they are taught. We agreed on a short list of learning outcomes for students who complete those courses. Then we conducted a preliminary assessment of those two attributes: the fidelity of QL-labeled courses to our list of desired characteristics, and our students’ success in meeting the learning objectives. We also performed an attitudes survey in two courses, measuring students’ attitudes towards mathematics before and after completing a QL course. In the process we have had valuable conversations with full- and part-time faculty, and we have been led to re-examine the role of adjunct faculty in our department. In this paper we list our course characteristics and include one instructor’s description of how she ensured that her QL course exhibited many of those traits. We include examples of student work illustrating how they met the learning objectives, and we report on the results of our attitudes survey. Much remains to be done; we describe our preliminary conclusions and plans for the future.

  14. Adaptive resource allocation scheme using sliding window subchannel gain computation: context of OFDMA wireless mobiles systems

    International Nuclear Information System (INIS)

    Khelifa, F.; Samet, A.; Ben Hassen, W.; Afif, M.

    2011-01-01

    Multiuser diversity combined with Orthogonal Frequency Division Multiple Access (OFDMA) is a promising technique for achieving high downlink capacities in the new generation of cellular and wireless network systems. The total capacity of an OFDMA-based system is maximized when each subchannel is assigned to the mobile station with the best channel-to-noise ratio for that subchannel, with power uniformly distributed among all subchannels. A contiguous method for subchannel construction is adopted in the IEEE 802.16m standard in order to reduce OFDMA system complexity. In this context, a new subchannel gain computation method can contribute, jointly with optimal subchannel assignment, to maximizing total system capacity. In this paper, two new methods are proposed in order to achieve a better trade-off between fairness and efficient use of resources. Numerical results show that the proposed algorithms provide low complexity, higher total system capacity, and more fairness among users compared with other recent methods.
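
    The capacity-maximizing baseline stated above (best channel-to-noise ratio per subchannel, uniform power) reduces to a one-line greedy rule; the fading model below is an illustrative assumption, and the paper's fairness-aware variants build on this baseline.

    import numpy as np

    def greedy_assignment(cnr):
        """cnr: (n_users, n_subchannels) channel-to-noise ratios."""
        return np.argmax(cnr, axis=0)          # best user per subchannel

    def total_capacity(cnr, assignment, power_per_subchannel=1.0):
        picked = cnr[assignment, np.arange(cnr.shape[1])]
        return np.sum(np.log2(1.0 + power_per_subchannel * picked))

    rng = np.random.default_rng(6)
    cnr = rng.exponential(1.0, size=(4, 16))   # Rayleigh-fading power gains
    a = greedy_assignment(cnr)
    print(a, f"{total_capacity(cnr, a):.2f} bit/s/Hz")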

  15. A vector-product information retrieval system adapted to heterogeneous, distributed computing environments

    Science.gov (United States)

    Rorvig, Mark E.

    1991-01-01

    Vector-product information retrieval (IR) systems produce retrieval results superior to all other searching methods but presently have no commercial implementations beyond the personal computer environment. The NASA Electronic Library System (NELS) provides a ranked list of the most likely relevant objects in collections in response to a natural language query. Additionally, the system is constructed using standards and tools (Unix, X-Windows, Motif, and TCP/IP) that permit its operation in organizations that possess many different hosts, workstations, and platforms. There are no known commercial equivalents to this product at this time. The product has applications in all corporate management environments, particularly those that are information intensive, such as finance, manufacturing, biotechnology, and research and development.
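
    A minimal vector-space ("vector-product") ranking sketch of the kind NELS implements, using TF-IDF weights and cosine similarity between the query and each document (the toy corpus is an illustrative assumption):

    import math
    from collections import Counter

    docs = ["ranked retrieval of library objects",
            "natural language query processing",
            "workstation platform standards"]

    def tfidf(text, df, n_docs):
        tf = Counter(text.split())
        # weight = term frequency * inverse document frequency
        return {t: c * math.log(n_docs / df[t]) for t, c in tf.items() if df[t]}

    df = Counter(t for d in docs for t in set(d.split()))
    vecs = [tfidf(d, df, len(docs)) for d in docs]

    def cosine(a, b):
        dot = sum(a[t] * b.get(t, 0.0) for t in a)
        na = math.sqrt(sum(v * v for v in a.values()))
        nb = math.sqrt(sum(v * v for v in b.values()))
        return dot / (na * nb) if na and nb else 0.0

    query = tfidf("natural language query", df, len(docs))
    ranked = sorted(range(len(docs)), key=lambda i: cosine(query, vecs[i]),
                    reverse=True)
    print([docs[i] for i in ranked])           # most relevant document first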

  16. Computational tool for immunotoxic assessment of pyrethroids toward adaptive immune cell receptors.

    Science.gov (United States)

    Kumar, Anoop; Behera, Padma Charan; Rangra, Naresh Kumar; Dey, Suddhasattya; Kant, Kamal

    2018-01-01

    Pyrethroids are prominently known for their insecticidal actions worldwide, but recent reports of anticancer and antiviral applications have generated considerable interest in further understanding their safety and immunotoxicity. This encouraged us to carry out the present study to evaluate the interactions of pyrethroids with adaptive immune cell receptors. Type 1 and Type 2 pyrethroids were tested on T (CD4 and CD8) and B (CD28 and CD45) immune cell receptors using Maestro 9.3 (Schrödinger, LLC, Cambridge, USA). In addition, the top-ranked tested ligands were also explored for toxicity prediction in rodents using the ProTOX tool. Pyrethroids (specifically Type 2) such as fenvalerate (-5.534 kcal/mol: CD8), fluvalinate (-4.644 and -4.431 kcal/mol: CD4 and CD45), and cypermethrin (-3.535 kcal/mol: CD28) resulted in lower energies, i.e., higher affinities, for B-cell and T-cell immune receptors, which may later result in immunosuppressive and hypersensitivity reactions. The current findings indicate a further need to assess the Type 2 pyrethroids in wet-laboratory experiments to understand the chemical nature of pyrethroid-induced immunotoxicity. Fenvalerate showed the top glide score toward the CD8 immune receptor, while fluvalinate showed top-ranked binding with the CD4 and CD45 immune proteins. In addition, cypermethrin gave the top glide score against the CD28 immune receptor. The top dock hits (Type 2 pyrethroids) showed probable toxicity targets at AOFA and PGH1, respectively. Abbreviations used: PDB: Protein Data Bank; AOFA: Amine oxidase (flavin-containing) A; PGH1: Prostaglandin G/H synthase 1.

  17. Improving Inpatient Surveys: Web-Based Computer Adaptive Testing Accessed via Mobile Phone QR Codes.

    Science.gov (United States)

    Chien, Tsair-Wei; Lin, Weir-Sen

    2016-03-02

    The National Health Service (NHS) 70-item inpatient questionnaire surveys inpatients on their perceptions of their hospitalization experience. However, it imposes more burden on the patient than other similar surveys. The literature shows that computerized adaptive testing (CAT) based on item response theory can help shorten a questionnaire without compromising its precision. Our aim was to investigate whether CAT can be (1) efficient for item reduction and (2) used with quick response (QR) codes scanned by mobile phones. After downloading the 2008 inpatient survey data from the Picker Institute Europe website and analyzing the difficulties of this 70-item questionnaire, we used an author-made Excel program applying the Rasch partial credit model to simulate 1000 patients' true scores following a standard normal distribution. The CAT was compared with two other scenarios, answering all items (AAI) and a randomized selection method (RSM), with respect to item length (efficiency) and measurement accuracy. The author-made Web-based CAT program for gathering patient feedback was effectively accessed from mobile phones by scanning a QR code. We found that the CAT can be more efficient for patients (i.e., fewer items to respond to) than either AAI or RSM without compromising measurement accuracy. A Web-based CAT inpatient survey accessed by scanning a QR code on a mobile phone was viable for gathering inpatient satisfaction responses. With advances in technology, patients can now be offered alternatives for providing feedback about hospitalization satisfaction. This Web-based CAT is a possible option in health care settings for reducing the number of survey items, as well as offering innovative QR code access.
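
    The CAT loop simulated in the study can be sketched generically: after each response, re-estimate the latent trait and administer the unseen item with maximum Fisher information. For brevity the sketch uses a dichotomous Rasch model rather than the partial credit model, and all item parameters are synthetic.

    import numpy as np

    def prob(theta, b):
        return 1.0 / (1.0 + np.exp(-(theta - b)))      # Rasch response model

    def next_item(theta, b, asked):
        info = prob(theta, b) * (1 - prob(theta, b))   # Fisher information
        info[list(asked)] = -np.inf                    # never repeat an item
        return int(np.argmax(info))

    def estimate(theta_grid, b, items, resp):
        # grid-search maximum likelihood over the administered items
        p = prob(theta_grid[:, None], b[items])
        ll = np.sum(np.where(resp, np.log(p), np.log(1 - p)), axis=1)
        return theta_grid[np.argmax(ll)]

    rng = np.random.default_rng(7)
    b = rng.uniform(-2, 2, 70)                 # 70-item bank, as in the NHS survey
    true_theta, theta = 0.8, 0.0
    asked, resp = [], []
    for _ in range(10):                        # stop after 10 items instead of 70
        i = next_item(theta, b, asked)
        asked.append(i)
        resp.append(rng.random() < prob(true_theta, b[i]))
        theta = estimate(np.linspace(-4, 4, 161), b, asked, resp)
    print(f"estimated theta after {len(asked)} items: {theta:.2f}")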

  18. An adaptive maneuvering logic computer program for the simulation of one-on-one air-to-air combat. Volume 1: General description

    Science.gov (United States)

    Burgin, G. H.; Fogel, L. J.; Phelps, J. P.

    1975-01-01

    A technique for computer simulation of air combat is described. Volume 1 describes the computer program and its development in general terms. Two versions of the program exist. Both incorporate a logic for selecting and executing air combat maneuvers with performance models of specific fighter aircraft. In the batch-processing version, the flight paths of two aircraft engaged in interactive aerial combat and controlled by the same logic are computed. The real-time version permits human pilots to fly air-to-air combat against the adaptive maneuvering logic (AML) in the Langley Differential Maneuvering Simulator (DMS). Volume 2 consists of a detailed description of the computer programs.

  19. Design and development of a computer-based continuous monitor for the determination of the short-lived decay products of radon and thoron

    Energy Technology Data Exchange (ETDEWEB)

    Bigu, J [Department of Energy, Mines and Resources, Elliot Lake, Ontario (Canada). Elliot Lake Lab.; Raz, R; Golden, K; Dominguez, P [Alpha-NUCLEAR, Toronto, Ontario (Canada)

    1984-08-15

    A portable, rugged monitor has been designed and built for measuring the short-lived decay products of radon and thoron. The monitor is computer-based and employs a continuous filter strip which can be advanced at programmable time intervals to allow unattended continuous operation with automatic sampling, analysis, and recording of radiation levels. Radionuclide analysis is carried out by two silicon diffused-junction alpha detectors and electronic circuitry with multichannel spectral analysis capabilities. Standard gross α-count methods and α-spectroscopy methods can easily be implemented. The built-in computer performs a variety of operations via a specially designed interface module, including control and data-recording functions, computations, program storage, and display functions. Programs and data are stored on the built-in cassette tape drive, and the computer's integrated CRT display and keyboard allow simple, prompted menu-type operation of standard software. Graphical presentations of α-spectra can be shown on the computer CRT and printed when required on the computer's built-in thermal printer. In addition to implementing the specially developed radionuclide analysis software, the operator can interact with and modify existing software, and program new software, through BASIC language programming, or employ the computer in a totally unrelated, general-purpose mode. Although the monitor is ideally suited for environmental radon (thoron) daughter monitoring, it could also be used in the determination of other airborne radionuclides provided adequate analytical procedures are developed or included in the already existing computer software.

  20. Design and development of a computer-based continuous monitor for the determination of the short-lived decay products of radon and thoron

    International Nuclear Information System (INIS)

    Bigu, J.

    1984-01-01

    A portable, rugged monitor has been designed and built for measuring the short-lived decay products of radon and thoron. The monitor is computer-based and employs a continuous filter strip which can be advanced at programmable time intervals to allow unattended continuous operation with automatic sampling, analysis, and recording of radiation levels. Radionuclide analysis is carried out by two silicon diffused-junction alpha detectors and electronic circuitry with multichannel spectral analysis capabilities. Standard gross α-count methods and α-spectroscopy methods can easily be implemented. The built-in computer performs a variety of operations via a specially designed interface module, including control and data-recording functions, computations, program storage, and display functions. Programs and data are stored on the built-in cassette tape drive, and the computer's integrated CRT display and keyboard allow simple, prompted menu-type operation of standard software. Graphical presentations of α-spectra can be shown on the computer CRT and printed when required on the computer's built-in thermal printer. In addition to implementing the specially developed radionuclide analysis software, the operator can interact with and modify existing software, and program new software, through BASIC language programming, or employ the computer in a totally unrelated, general-purpose mode. Although the monitor is ideally suited for environmental radon (thoron) daughter monitoring, it could also be used in the determination of other airborne radionuclides provided adequate analytical procedures are developed or included in the already existing computer software. (orig.)

  1. A possible role of midbrain dopamine neurons in short- and long-term adaptation of saccades to position-reward mapping.

    Science.gov (United States)

    Takikawa, Yoriko; Kawagoe, Reiko; Hikosaka, Okihide

    2004-10-01

    Dopamine (DA) neurons respond to sensory stimuli that predict reward. To understand how DA neurons acquire this ability, we trained monkeys on a one-direction-rewarded version of the memory-guided saccade task (1DR), only while we recorded from single DA neurons. In 1DR, position-reward mapping was changed across blocks of trials. In the early stage of training on 1DR, DA neurons responded to reward delivery; in the later stages, they responded predominantly and differentially to the visual cue that predicted reward or no reward (the reward predictor). We found that such a shift of activity from reward to reward predictor also occurred within a block of trials after the position-reward mapping was altered. A main effect of long-term training was to accelerate this within-block reward-to-predictor shift of DA neuronal responses. The within-block shift appeared first in the intermediate stage but was slow, and DA neurons often responded to the cue that had indicated reward in the preceding block. In the advanced stage, the reward-to-predictor shift occurred quickly, such that the DA neurons' responses to visual cues faithfully matched the current position-reward mapping. Changes in the DA neuronal responses co-varied with the reward-predictive differentiation of saccade latency in both short-term (within-block) and long-term adaptation. The DA neurons' response to the fixation point also underwent long-term changes until it occurred predominantly in the first trial within a block. This might trigger a switch between the learned sets. These results suggest that midbrain DA neurons play an essential role in adapting oculomotor behavior to frequent switches in position-reward mapping.

  2. Effects of megavoltage computed tomographic scan methodology on setup verification and adaptive dose calculation in helical TomoTherapy.

    Science.gov (United States)

    Zhu, Jian; Bai, Tong; Gu, Jiabing; Sun, Ziwen; Wei, Yumei; Li, Baosheng; Yin, Yong

    2018-04-27

    To evaluate the effect of pretreatment megavoltage computed tomographic (MVCT) scan methodology on setup verification and adaptive dose calculation in helical TomoTherapy. Both anthropomorphic heterogeneous chest and pelvic phantoms were planned with virtual targets using the TomoTherapy Physicist Station and were scanned with the TomoTherapy megavoltage image-guided radiotherapy (IGRT) system using six combinations of options: three acquisition pitches (APs), 'fine', 'normal' and 'coarse', each crossed with two corresponding reconstruction intervals (RIs). In order to mimic patient setup variations, each phantom was manually shifted 5 mm in each of three orthogonal directions. The effect of the MVCT scan options was analyzed in terms of image quality (CT number and noise), adaptive dose calculation deviations, and positional correction variations. MVCT scanning time with the 'fine' pitch was approximately twice that of 'normal' and more than three times that of 'coarse', and was not affected by different RIs. MVCT with different APs delivered almost identical CT numbers and image noise in 7 selected regions of various densities. DVH curves from adaptive dose calculation with serial MVCT images acquired at varied pitches overlapped, and there were no significant differences in the p values of intercept and slope for the simulated spinal cord (p = 0.761 and 0.277), heart (p = 0.984 and 0.978), lungs (p = 0.992 and 0.980), soft tissue (p = 0.319 and 0.951), and bony structures (p = 0.960 and 0.929) between the most elaborate and the roughest series of MVCT. Furthermore, gamma index analysis showed that, compared with the dose distribution calculated on the 'fine' MVCT, only 0.2% or 1.1% of the points analyzed on the 'normal' or 'coarse' MVCT, respectively, failed the defined gamma criterion. On the chest phantom, all registration errors larger than 1 mm appeared on the superior-inferior axis, which cannot be avoided with the smallest AP and RI.

  3. Development and adaptation of conduction and radiation heat-transfer computer codes for the CFTL

    International Nuclear Information System (INIS)

    Conklin, J.C.

    1981-08-01

    RODCON and HOTTEL are two computational methods used to calculate thermal and radiation heat transfer for the Core Flow Test Loop (CFTL) analysis efforts. RODCON was developed at ORNL to calculate the internal temperature distribution of the fuel rod simulator (FRS) for the CFTL. RODCON solves the time-dependent heat transfer equation in two-dimensional (R, θ) cylindrical coordinates at an axial plane with user-specified radial material zones and time- and position-variant surface conditions at the FRS periphery. Symmetry of the FRS periphery boundary conditions is not necessary. The governing elliptic partial differential heat equation is cast into a fully implicit, finite-difference form by approximating the derivatives with a forward-differencing scheme with variable mesh spacing. The heat conduction path is circumferentially complete, and the potential mathematical problem at the rod center can be effectively ignored. HOTTEL is a revision of an algorithm developed by C.B. Baxi at the General Atomic Company (GAC) to be used in calculating radiation heat transfer in a rod bundle enclosed in a hexagonal duct. HOTTEL uses geometric view factors, surface emissivities, and surface areas to calculate the gray-body or composite view factors in an enclosure having multiple reflections in a nonparticipating medium
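
    A fully implicit finite-difference treatment of the heat equation, as used by RODCON, reduces each time step to a linear solve, which keeps the marching stable for any step size. A much-reduced sketch (a 1-D slab instead of RODCON's 2-D (R, θ) mesh, with made-up material values):

        import numpy as np

        # One fully implicit (backward Euler) step solves (I - r*L) u_new = u_old,
        # with L the tridiagonal 1-D Laplacian; made-up material values.
        n = 50
        a, dx, dt = 1e-5, 1e-3, 0.1                # diffusivity (m^2/s), grid, step
        r = a * dt / dx ** 2
        main = (1.0 + 2.0 * r) * np.ones(n)
        off = -r * np.ones(n - 1)
        A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
        A[0, :] = 0.0; A[-1, :] = 0.0
        A[0, 0] = A[-1, -1] = 1.0                  # fixed-temperature boundary rows
        u = np.full(n, 300.0)
        u[0] = u[-1] = 400.0                       # hot boundaries, cool interior (K)
        for _ in range(100):                       # implicit marching is stable
            u = np.linalg.solve(A, u)
        print(u[:5].round(1))                      # interior warming toward 400 K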

  4. Adaptation Computing Parameters of Pan-Tilt-Zoom Cameras for Traffic Monitoring

    Directory of Open Access Journals (Sweden)

    Ya Lin WU

    2014-01-01

    Closed-circuit television (CCTV) cameras have been widely used in recent years for traffic monitoring and surveillance applications. CCTV cameras can automatically extract real-time traffic parameters using image processing and tracking technologies. In particular, pan-tilt-zoom (PTZ) cameras provide flexible view selection as well as a wider observation range, which allows traffic parameters to be calculated accurately. Calibration of the PTZ camera parameters therefore plays an important role in vision-based traffic applications. However, in specific situations, such as locating the license plate of an illegally parked car, the PTZ parameters have to be updated according to the position and distance of the illegally parked vehicle. The proposed traffic monitoring system uses an ordinary webcam and a PTZ camera. The vanishing point of the traffic lane lines is obtained in the pixel coordinate system from the fixed webcam. The PTZ camera parameters are initialized from the monitoring distance, the specific objective, and the vanishing point. The pixel coordinates of the illegally parked car are then used to update the PTZ camera parameters, to recover the real-world coordinates of the car, and to compute its distance. The results show that the error between the measured distance and the real distance is only 0.2064 m.
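
    The vanishing point of the lane lines is simply the image-plane intersection of the two lines, conveniently computed with homogeneous coordinates. A sketch with made-up pixel endpoints (not the authors' code):

        import numpy as np

        # Two image lines as homogeneous 3-vectors via cross products; their
        # cross product is the intersection, i.e. the vanishing point.
        def line_through(p, q):
            return np.cross([p[0], p[1], 1.0], [q[0], q[1], 1.0])

        l1 = line_through((100, 700), (520, 300))    # left lane marking (pixels)
        l2 = line_through((1180, 700), (760, 300))   # right lane marking (pixels)
        vp = np.cross(l1, l2)
        vp = vp[:2] / vp[2]                          # back to pixel coordinates
        print(vp.round(1))                           # approx. [640. 185.7]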

  5. Computational fluid dynamics simulations of membrane filtration process adapted for water treatment of aerated sewage lagoons.

    Science.gov (United States)

    Cano, Grégory; Mouahid, Adil; Carretier, Emilie; Guasp, Pascal; Dhaler, Didier; Castelas, Bernard; Moulin, Philippe

    2015-01-01

    The aim of this study is to apply membrane bioreactor technology in an oxidation ditch under submerged conditions. This new wastewater filtration process will benefit rural areas. Membranes developed without support are immersed in an aeration well and work in suction mode. The development of the unsupported membrane, and more precisely the performance of the spacers, is studied by computational fluid dynamics in order to find the best compromise between pressure drop/flow velocity and permeate flux. The numerical results on the layout and geometry of the membrane modules in the aeration well indicate that the optimal configuration is to install the membranes horizontally on three levels. The membranes should be connected to each other by a manifold, providing a total membrane area of 18 m². The loss rate compared with the theoretical throughput is relatively low (less than 3%). Preliminary data obtained by modeling the lagoon give access to its hydrodynamics, revealing that recirculation zones can be optimized by changing the operating conditions. Experimental validation of these results, and the inclusion of aeration in the numerical models, are underway.

  6. Adaptive Lighting

    DEFF Research Database (Denmark)

    Petersen, Kjell Yngve; Søndergaard, Karin; Kongshaug, Jesper

    2015-01-01

    Adaptive Lighting. Adaptive lighting is based on a partial automation of the possibilities to adjust the colour tone and brightness levels of light in order to adapt to people’s needs and desires. IT support is key to the technical developments that afford adaptive control systems. The possibilities offered by adaptive lighting control are created by the ways that the system components, the network and data flow can be coordinated through software so that the dynamic variations are controlled in ways that meaningfully adapt according to people’s situations and design intentions. This book discusses the investigations of lighting scenarios carried out in two test installations: White Cube and White Box. The test installations are discussed as large-scale experiential instruments. In these test installations we examine what could potentially occur when light using LED technology is integrated and distributed differently into an architectural body. We also examine what might occur when light is dynamic and able to change colour, intensity and direction, and when it is adaptive and can be brought into interaction with its surroundings. In short, what happens to an architectural space when artificial lighting ceases…

  7. Individually adapted imagery improves brain-computer interface performance in end-users with disability.

    Science.gov (United States)

    Scherer, Reinhold; Faller, Josef; Friedrich, Elisabeth V C; Opisso, Eloy; Costa, Ursula; Kübler, Andrea; Müller-Putz, Gernot R

    2015-01-01

    Brain-computer interfaces (BCIs) translate oscillatory electroencephalogram (EEG) patterns into action. Different mental activities modulate spontaneous EEG rhythms in various ways. Non-stationarity and inherent variability of EEG signals, however, make reliable recognition of modulated EEG patterns challenging. Able-bodied individuals who use a BCI for the first time achieve - on average - binary classification performance of about 75%. Performance in users with central nervous system (CNS) tissue damage is typically lower. User training generally enhances reliability of EEG pattern generation and thus also robustness of pattern recognition. In this study, we investigated the impact of mental tasks on binary classification performance in BCI users with central nervous system (CNS) tissue damage such as persons with stroke or spinal cord injury (SCI). Motor imagery (MI), that is the kinesthetic imagination of movement (e.g. squeezing a rubber ball with the right hand), is the "gold standard" and mainly used to modulate EEG patterns. Based on our recent results in able-bodied users, we hypothesized that pair-wise combination of "brain-teaser" (e.g. mental subtraction and mental word association) and "dynamic imagery" (e.g. hand and feet MI) tasks significantly increases classification performance of induced EEG patterns in the selected end-user group. Within-day (How stable is the classification within a day?) and between-day (How well does a model trained on day one perform on unseen data of day two?) analysis of variability of mental task pair classification in nine individuals confirmed the hypothesis. We found that the use of the classical MI task pair hand vs. feet leads to significantly lower classification accuracy - on average up to 15% less - in most users with stroke or SCI. User-specific selection of task pairs was again essential to enhance performance. We expect that the gained evidence will significantly contribute to make imagery-based BCI technology

  8. Individually adapted imagery improves brain-computer interface performance in end-users with disability.

    Directory of Open Access Journals (Sweden)

    Reinhold Scherer

    Brain-computer interfaces (BCIs) translate oscillatory electroencephalogram (EEG) patterns into action. Different mental activities modulate spontaneous EEG rhythms in various ways. Non-stationarity and inherent variability of EEG signals, however, make reliable recognition of modulated EEG patterns challenging. Able-bodied individuals who use a BCI for the first time achieve - on average - binary classification performance of about 75%. Performance in users with central nervous system (CNS) tissue damage is typically lower. User training generally enhances reliability of EEG pattern generation and thus also robustness of pattern recognition. In this study, we investigated the impact of mental tasks on binary classification performance in BCI users with central nervous system (CNS) tissue damage such as persons with stroke or spinal cord injury (SCI). Motor imagery (MI), that is the kinesthetic imagination of movement (e.g. squeezing a rubber ball with the right hand), is the "gold standard" and mainly used to modulate EEG patterns. Based on our recent results in able-bodied users, we hypothesized that pair-wise combination of "brain-teaser" (e.g. mental subtraction and mental word association) and "dynamic imagery" (e.g. hand and feet MI) tasks significantly increases classification performance of induced EEG patterns in the selected end-user group. Within-day (How stable is the classification within a day?) and between-day (How well does a model trained on day one perform on unseen data of day two?) analysis of variability of mental task pair classification in nine individuals confirmed the hypothesis. We found that the use of the classical MI task pair hand vs. feet leads to significantly lower classification accuracy - on average up to 15% less - in most users with stroke or SCI. User-specific selection of task pairs was again essential to enhance performance. We expect that the gained evidence will significantly contribute to make imagery-based BCI

  9. Retroposition of the AFC family of SINEs (short interspersed repetitive elements) before and during the adaptive radiation of cichlid fishes in Lake Malawi and related inferences about phylogeny.

    Science.gov (United States)

    Takahashi, K; Nishida, M; Yuma, M; Okada, N

    2001-01-01

    Lake Malawi is home to more than 450 species of endemic cichlids, which provide a spectacular example of adaptive radiation. To clarify the phylogenetic relationships among these fish, we examined the presence and absence of SINEs (short interspersed repetitive elements) at orthologous loci. We identified six loci at which a SINE sequence had apparently been specifically inserted by retroposition in the common ancestor of all the investigated species of endemic cichlids in Lake Malawi. At another locus, unique sharing of a SINE sequence was evident among all the investigated species of endemic non-Mbuna cichlids with the exception of Rhamphochromis sp. The relationships were in good agreement with those deduced in previous studies with various different markers, demonstrating that the SINE method is useful for the elucidation of phylogenetic relationships among cichlids in Lake Malawi. We also characterized a locus that exhibited transspecies polymorphism with respect to the presence or absence of the SINE sequence among non-Mbuna species. This result suggests that incomplete lineage sorting and/or interspecific hybridization might have occurred or be occurring among the species in this group, which might potentially cause misinterpretation of phylogenetic data, in particular when a single-locus marker, such as a sequence in the mitochondrial DNA, is used for analysis.

  10. Short version of the Smartphone Addiction Scale adapted to Spanish and French: Towards a cross-cultural research in problematic mobile phone use.

    Science.gov (United States)

    Lopez-Fernandez, Olatz

    2017-01-01

    Research into smartphone addiction has followed the scientific literature on problematic mobile phone use developed during the last decade, with valid screening scales being developed to identify maladaptive behaviour associated with this technology, usually in adolescent populations. This study adapts the short version of the Smartphone Addiction Scale [SAS-SV] into Spanish and into French. The aims of the study were to (i) examine the scale's psychometric properties in both languages, (ii) estimate the prevalence of potential excessive smartphone use among Spanish and Belgian adults, and (iii) compare the addictive symptomatology measured by the SAS-SV between potentially excessive users from both countries. Data were collected via online surveys administered to 281 and 144 voluntary participants from the two countries respectively, aged over 18 years and recruited from academic environments. Results indicated that reliability was excellent (Cronbach's alphas: Spain: .88 and Belgium: .90) and validity was very good (e.g., unifactoriality, with 49% and 54% of variance explained through exploratory factor analysis, respectively). Findings showed that the prevalence of potential excessive smartphone use was 12.5% for Spanish and 21.5% for francophone Belgian participants. The scale showed that at least 60% of excessive users endorsed withdrawal and tolerance symptoms in both countries, although the proposed addictive symptomatology did not cover the entire group of estimated excessive users, and cultural differences appeared. This first cross-cultural study discusses the smartphone excessive use construct from its addictive pathway. Copyright © 2015 Elsevier Ltd. All rights reserved.

  11. Short and Long Term Effects of High-Intensity Interval Training on Hormones, Metabolites, Antioxidant System, Glycogen Concentration, and Aerobic Performance Adaptations in Rats.

    Science.gov (United States)

    de Araujo, Gustavo G; Papoti, Marcelo; Dos Reis, Ivan Gustavo Masselli; de Mello, Maria A R; Gobatto, Claudio A

    2016-01-01

    The purpose of the study was to investigate the effects of short and long term High-Intensity Interval Training (HIIT) on anaerobic and aerobic performance, creatinine, uric acid, urea, creatine kinase, lactate dehydrogenase, catalase, superoxide dismutase, testosterone, corticosterone, and glycogen concentration (liver, soleus, and gastrocnemius). The Wistar rats were separated into two groups: HIIT and sedentary/control (CT). The lactate minimum (LM) test was used to evaluate aerobic and anaerobic performance (AP) (baseline, 6, and 12 weeks). The lactate peak determination consisted of two swim bouts at 13% of body weight (bw): (1) 30 s of effort; (2) 30 s of passive recovery; (3) exercise until exhaustion (AP). Tethered loads equivalent to 3.5, 4.0, 4.5, 5.0, 5.5, and 6.5% bw were applied in the incremental phase. The aerobic capacity in the HIIT group increased after 12 weeks (5.2 ± 0.2% bw) in relation to baseline (4.4 ± 0.2% bw), but not after 6 weeks (4.5 ± 0.3% bw). The exhaustion time in the HIIT group showed higher values than CT after 6 (HIIT = 58 ± 5 s; CT = 40 ± 7 s) and 12 weeks (HIIT = 62 ± 7 s; CT = 49 ± 3 s). Glycogen (mg/100 mg) increased in the gastrocnemius for the HIIT group after 6 weeks (0.757 ± 0.076) and 12 weeks (1.014 ± 0.157) in comparison to baseline (0.358 ± 0.024). In the soleus, HIIT increased glycogen after 6 weeks (0.738 ± 0.057) and 12 weeks (0.709 ± 0.085) in comparison to baseline (0.417 ± 0.035). Glycogen in the liver increased after 12 weeks of HIIT (4.079 ± 0.319) in relation to baseline (2.400 ± 0.416). Corticosterone (ng/mL) in the HIIT group increased after 6 weeks (529.0 ± 30.5) and decreased after 12 weeks (153.6 ± 14.5) in comparison to baseline (370.0 ± 18.3). In conclusion, long term HIIT enhanced aerobic capacity, but short term training was not enough to cause aerobic adaptations. Anaerobic performance increased with both short and long term HIIT compared with CT, without differences between short and long term HIIT. Furthermore, the glycogen super-compensation increased after both short and long term HIIT in comparison to baseline.

  12. SHORT AND LONG TERM EFFECTS OF HIGH-INTENSITY INTERVAL TRAINING ON HORMONES, METABOLITES, ANTIOXIDANT SYSTEM, GLYCOGEN CONCENTRATION AND AEROBIC PERFORMANCE ADAPTATIONS IN RATS

    Directory of Open Access Journals (Sweden)

    Gustavo Gomes De Araujo

    2016-10-01

    The purpose of the study was to investigate the effects of short and long term High-Intensity Interval Training (HIIT) on anaerobic and aerobic performance, creatinine, uric acid, urea, creatine kinase, lactate dehydrogenase, catalase, superoxide dismutase, testosterone, corticosterone and glycogen concentration (liver, soleus and gastrocnemius). The Wistar rats were separated into two groups: HIIT and sedentary/control (CT). The lactate minimum (LM) was used to evaluate the aerobic and anaerobic performance (AP) (baseline, 6 and 12 wk). The lactate peak determination consisted of two swim bouts at 13% of body weight (bw): (1) 30 s of effort; (2) 30 s of passive recovery; (3) exercise until exhaustion (AP). Tethered loads equivalent to 3.5, 4.0, 4.5, 5.0, 5.5 and 6.5% bw were performed in the incremental phase. The aerobic capacity in the HIIT group increased after 12 wk (5.2±0.2% bw) in relation to baseline (4.4±0.2% bw), but not after 6 wk (4.5±0.3% bw). The exhaustion time in the HIIT group showed higher values than CT after 6 (HIIT = 58±5 s; CT = 40±7 s) and 12 wk (HIIT = 62±7 s; CT = 49±3 s). Glycogen (mg/100 mg) increased in the gastrocnemius for the HIIT group after 6 wk (0.757±0.076) and 12 wk (1.014±0.157) in comparison to baseline (0.358±0.024). In the soleus, HIIT increased glycogen after 6 wk (0.738±0.057) and 12 wk (0.709±0.085) in comparison to baseline (0.417±0.035). The glycogen in the liver increased after 12 wk of HIIT (4.079±0.319) in relation to baseline (2.400±0.416). The corticosterone (ng/mL) in HIIT increased after 6 wk (529.0±30.5) and was reduced after 12 wk (153.6±14.5) in comparison to baseline (370.0±18.3). In conclusion, long term HIIT enhanced the aerobic capacity, but short term (6 wk) was not enough to cause aerobic adaptations. The anaerobic performance increased in HIIT short and long term compared with CT, without differences between HIIT short and long term. Furthermore, the glycogen super-compensation increased after short and long term HIIT in comparison to baseline.

  13. Pepsi-SAXS: an adaptive method for rapid and accurate computation of small-angle X-ray scattering profiles.

    Science.gov (United States)

    Grudinin, Sergei; Garkavenko, Maria; Kazennov, Andrei

    2017-05-01

    A new method called Pepsi-SAXS is presented that calculates small-angle X-ray scattering profiles from atomistic models. The method is based on the multipole expansion scheme and is significantly faster compared with other tested methods. In particular, using the Nyquist-Shannon-Kotelnikov sampling theorem, the multipole expansion order is adapted to the size of the model and the resolution of the experimental data. It is argued that by using the adaptive expansion order, this method has the same quadratic dependence on the number of atoms in the model as the Debye-based approach, but with a much smaller prefactor in the computational complexity. The method has been systematically validated on a large set of over 50 models collected from the BioIsis and SASBDB databases. Using a laptop, it was demonstrated that Pepsi-SAXS is about seven, 29 and 36 times faster compared with CRYSOL, FoXS and the three-dimensional Zernike method in SAStbx, respectively, when tested on data from the BioIsis database, and is about five, 21 and 25 times faster compared with CRYSOL, FoXS and SAStbx, respectively, when tested on data from SASBDB. On average, Pepsi-SAXS demonstrates comparable accuracy in terms of χ² to CRYSOL and FoXS when tested on BioIsis and SASBDB profiles. Together with a small allowed variation of adjustable parameters, this demonstrates the effectiveness of the method. Pepsi-SAXS is available at http://team.inria.fr/nano-d/software/pepsi-saxs.
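
    For reference, the Debye-based approach that Pepsi-SAXS is compared against evaluates I(q) = Σᵢⱼ fᵢ fⱼ sin(q rᵢⱼ)/(q rᵢⱼ) directly over all atom pairs, which is quadratic in the number of atoms per q value. A toy sketch with unit form factors (not the Pepsi-SAXS algorithm):

        import numpy as np

        # Direct Debye sum I(q) = sum_ij f_i f_j sin(q r_ij)/(q r_ij), the O(N^2)
        # reference method; toy coordinates and unit form factors.
        rng = np.random.default_rng(0)
        xyz = rng.normal(scale=10.0, size=(100, 3))          # positions in angstroms
        rij = np.linalg.norm(xyz[:, None] - xyz[None], axis=-1)
        q = np.linspace(0.01, 0.5, 50)                       # momentum transfer
        # np.sinc(t) = sin(pi t)/(pi t), so pass q*r/pi to obtain sin(qr)/(qr)
        I = np.array([np.sinc(qk * rij / np.pi).sum() for qk in q])
        print(I[:3].round(1))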

  14. Home Computers and Child Outcomes: Short-Term Impacts from a Randomized Experiment in Peru. NBER Working Paper No. 18818

    Science.gov (United States)

    Beuermann, Diether W.; Cristia, Julian P.; Cruz-Aguayo, Yyannu; Cueto, Santiago; Malamud, Ofer

    2013-01-01

    This paper presents results from a randomized control trial in which approximately 1,000 OLPC XO laptops were provided for home use to children attending primary schools in Lima, Peru. The intervention increased access and use of home computers, with some substitution away from computer use outside the home. Beneficiaries were more likely to…

  15. Validity of Cognitive ability tests – comparison of computerized adaptive testing with paper and pencil and computer-based forms of administrations

    Czech Academy of Sciences Publication Activity Database

    Žitný, P.; Halama, P.; Jelínek, Martin; Květon, Petr

    2012-01-01

    Vol. 54, No. 3 (2012), pp. 181-194 ISSN 0039-3320 R&D Projects: GA ČR GP406/09/P284 Institutional support: RVO:68081740 Keywords: item response theory * computerized adaptive testing * paper and pencil * computer-based * criterion and construct validity * efficiency Subject RIV: AN - Psychology Impact factor: 0.215, year: 2012

  16. E-assessment for learning? Exploring the potential of computer-marked assessment and computer-generated feedback, from short-answer questions to assessment analytics.

    OpenAIRE

    Jordan, Sally

    2014-01-01

    This submission draws on research from twelve publications, all addressing some aspect of the broad research question: “Can interactive computer-marked assessment improve the effectiveness of assessment for learning?” The work starts from a consideration of the conditions under which assessment of any sort is predicted to best support learning, and reviews the broader literature of assessment and feedback before considering the potential of computer-based assessment, focusing on relat...

  17. Impact of a New Adaptive Statistical Iterative Reconstruction (ASIR)-V Algorithm on Image Quality in Coronary Computed Tomography Angiography.

    Science.gov (United States)

    Pontone, Gianluca; Muscogiuri, Giuseppe; Andreini, Daniele; Guaricci, Andrea I; Guglielmo, Marco; Baggiano, Andrea; Fazzari, Fabio; Mushtaq, Saima; Conte, Edoardo; Annoni, Andrea; Formenti, Alberto; Mancini, Elisabetta; Verdecchia, Massimo; Campari, Alessandro; Martini, Chiara; Gatti, Marco; Fusini, Laura; Bonfanti, Lorenzo; Consiglio, Elisa; Rabbat, Mark G; Bartorelli, Antonio L; Pepi, Mauro

    2018-03-27

    A new postprocessing algorithm named adaptive statistical iterative reconstruction (ASIR)-V has recently been introduced. The aim of this article was to analyze the impact of the ASIR-V algorithm on signal, noise, and image quality in coronary computed tomography angiography. Fifty consecutive patients underwent clinically indicated coronary computed tomography angiography (Revolution CT; GE Healthcare, Milwaukee, WI). Images were reconstructed using filtered back projection with ASIR-V 0%, a combination of filtered back projection and ASIR-V 20%-80%, and ASIR-V 100%. Image noise, signal-to-noise ratio (SNR), and contrast-to-noise ratio (CNR) were calculated for the left main coronary artery (LM), left anterior descending artery (LAD), left circumflex artery (LCX), and right coronary artery (RCA) and were compared between the different postprocessing algorithms. Similarly, a four-point Likert image quality score of coronary segments was graded for each dataset and compared. A cutoff value of P < 0.05 was considered statistically significant. Compared with ASIR-V 0%, ASIR-V 100% demonstrated a significant reduction of image noise in all coronaries. Compared with ASIR-V 0%, SNR was significantly higher with ASIR-V 60% in the LM, and CNR was significantly improved for ASIR-V ≥60% in the LM, as was image quality for ASIR-V ≥80%. ASIR-V 60% had significantly better Likert image quality scores than ASIR-V 0% in segment-, vessel-, and patient-based analyses. In conclusion, ASIR-V 60% provides the optimal balance between image noise, SNR, CNR, and image quality. Copyright © 2018 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.

  18. Cultural adaptation and validation of the Filipino version of Kidney Disease Quality of Life--Short Form (KDQOL-SF version 1.3).

    Science.gov (United States)

    Bataclan, Rommel P; Dial, Ma Antonietta D

    2009-10-01

    Chronic kidney disease is the 10th leading cause of death among Filipinos. Those with chronic kidney disease are exposed to stressors which affect their daily lives. Therefore, assessment of health-related quality of life is important in these patients. The objective of the present study was to translate the Kidney Disease Quality of Life--Short Form version 1.3 (KDQOL-SF ver. 1.3) into Filipino and measure its validity and reliability. Translation and cultural adaptation began with two translations into Filipino, with reconciliation by the forward translators. Pretesting with 10 renal patients, review by experts (nephrologist, translator and dialysis nurse) and back-translation were also done. The final questionnaire was administered to 80 patients with chronic renal disease undergoing haemodialysis for at least 3 months, who could understand Filipino, and were without life-threatening or terminal conditions at the time of the test. A convenience sample of 30 patients from the group had a repeat test 10-14 days later to determine test-retest reliability. Test-retest reliability was assessed by intraclass correlation coefficient and internal consistency reliability was measured by determining the Cronbach's alpha value. Validity was measured using Pearson's correlation between the overall health rating scale and the items from the questionnaire. All of the items showed good test-retest reliability (intraclass correlation coefficient >0.40), ranging from 0.58 (social interaction) to 0.98 (role--emotional). Internal consistency reliability values were acceptable, with Cronbach's alpha ranging from 0.60 (cognitive function) to 0.80 (physical functioning and role--physical). Regarding construct validity, overall health rating in kidney disease-targeted scales was significantly correlated with symptoms/problems, effects of kidney disease and burden of kidney disease. All items in the SF 36 scales had significant correlation with overall health rating (P < 0.05) except
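
    Cronbach's alpha, the internal-consistency statistic reported here, is α = k/(k−1) · (1 − Σ var(itemᵢ)/var(total)) for a k-item scale. A small sketch on simulated questionnaire data (illustrative values, not the study's data):

        import numpy as np

        # Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals)
        def cronbach_alpha(X):
            X = np.asarray(X, dtype=float)
            k = X.shape[1]
            return k / (k - 1) * (1.0 - X.var(axis=0, ddof=1).sum()
                                  / X.sum(axis=1).var(ddof=1))

        rng = np.random.default_rng(1)
        latent = rng.normal(size=(80, 1))                 # shared trait of 80 patients
        items = latent + 0.8 * rng.normal(size=(80, 5))   # 5 correlated items
        print(round(cronbach_alpha(items), 2))            # around 0.85-0.9 here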

  19. Measurement precision and efficiency of multidimensional computer adaptive testing of physical functioning using the pediatric evaluation of disability inventory.

    Science.gov (United States)

    Haley, Stephen M; Ni, Pengsheng; Ludlow, Larry H; Fragala-Pinkham, Maria A

    2006-09-01

    To compare the measurement efficiency and precision of a multidimensional computer adaptive testing (M-CAT) application with those of a unidimensional CAT (U-CAT), using item bank data from 2 of the functional skills scales of the Pediatric Evaluation of Disability Inventory (PEDI). Using existing PEDI mobility and self-care item banks, we compared the stability of item calibrations and model fit between unidimensional and multidimensional Rasch models and compared the efficiency and precision of the U-CAT- and M-CAT-simulated assessments to a random draw of items. Pediatric rehabilitation hospital and clinics. Clinical and normative samples. Not applicable. Not applicable. The M-CAT had greater levels of precision and efficiency than the separate mobility and self-care U-CAT versions when using a similar number of items for each PEDI subdomain. Equivalent estimation of mobility and self-care scores can be achieved with a 25% to 40% item reduction with the M-CAT compared with the U-CAT. M-CAT applications appear to have both precision and efficiency advantages compared with separate U-CAT assessments when content subdomains have a high correlation. Practitioners may also realize interpretive advantages of reporting test score information for each subdomain when separate clinical inferences are desired.

  20. A Computer Adaptive Testing Version of the Addiction Severity Index-Multimedia Version (ASI-MV): The Addiction Severity CAT

    Science.gov (United States)

    Butler, Stephen F.; Black, Ryan A.; McCaffrey, Stacey A.; Ainscough, Jessica; Doucette, Ann M.

    2017-01-01

    The purpose of this study was to develop and validate a computer adaptive testing (CAT) version of the Addiction Severity Index-Multimedia Version (ASI-MV®), the Addiction Severity CAT. This goal was accomplished in four steps. First, new candidate items for Addiction Severity CAT domains were evaluated after brainstorming sessions with experts in substance abuse treatment. Next, this new item bank was psychometrically evaluated on a large non-clinical (n = 4,419) and substance abuse treatment sample (n = 845). Based on these results, final items were selected and calibrated for the creation of the Addiction Severity CAT algorithms. Once the algorithms were developed for the entire assessment, a fully functioning prototype of an Addiction Severity CAT was created. CAT simulations were conducted and optimal termination criteria were selected for the Addiction Severity CAT algorithms. Finally, construct validity of the CAT algorithms was evaluated by examining convergent/discriminant validity and sensitivity to change. The Addiction Severity CAT was determined to be valid, sensitive to change, and reliable. Further, the Addiction Severity CAT’s time of administration was found to be significantly less than the average time of administration for the ASI-MV composite scores. This study represents the initial validation of an IRT-based Addiction Severity CAT, and further exploration of the Addiction Severity CAT is needed. PMID:28230387
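
    The core of an IRT-based CAT of this kind is a loop that repeatedly administers the most informative remaining item, re-estimates the latent trait, and stops when a termination criterion (for example a standard-error target) is met. A generic 2PL sketch, not the ASI-MV item bank or its actual algorithms:

        import numpy as np

        rng = np.random.default_rng(2)
        a = rng.uniform(0.8, 2.0, 200)                # item discrimination parameters
        b = rng.normal(0.0, 1.0, 200)                 # item difficulty parameters
        true_theta, theta = 0.7, 0.0                  # simulated examinee, initial estimate
        used, resp = [], []
        for _ in range(40):                           # hard cap on test length
            p = 1.0 / (1.0 + np.exp(-a * (theta - b)))
            info = a ** 2 * p * (1.0 - p)             # Fisher information per item
            info[used] = -np.inf                      # never readminister an item
            j = int(np.argmax(info))                  # most informative remaining item
            used.append(j)
            p_true = 1.0 / (1.0 + np.exp(-a[j] * (true_theta - b[j])))
            resp.append(rng.random() < p_true)        # simulate the response
            pa = 1.0 / (1.0 + np.exp(-a[used] * (theta - b[used])))
            grad = np.sum(a[used] * (np.array(resp) - pa))
            hess = -np.sum(a[used] ** 2 * pa * (1.0 - pa))
            theta = float(np.clip(theta - grad / hess, -4.0, 4.0))  # Newton step
            if 1.0 / np.sqrt(-hess) < 0.35:           # SE-based termination criterion
                break
        print(len(used), round(theta, 2))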

  1. A computer adaptive testing version of the Addiction Severity Index-Multimedia Version (ASI-MV): The Addiction Severity CAT.

    Science.gov (United States)

    Butler, Stephen F; Black, Ryan A; McCaffrey, Stacey A; Ainscough, Jessica; Doucette, Ann M

    2017-05-01

    The purpose of this study was to develop and validate a computer adaptive testing (CAT) version of the Addiction Severity Index-Multimedia Version (ASI-MV), the Addiction Severity CAT. This goal was accomplished in 4 steps. First, new candidate items for Addiction Severity CAT domains were evaluated after brainstorming sessions with experts in substance abuse treatment. Next, this new item bank was psychometrically evaluated on a large nonclinical (n = 4,419) and substance abuse treatment (n = 845) sample. Based on these results, final items were selected and calibrated for the creation of the Addiction Severity CAT algorithms. Once the algorithms were developed for the entire assessment, a fully functioning prototype of an Addiction Severity CAT was created. CAT simulations were conducted, and optimal termination criteria were selected for the Addiction Severity CAT algorithms. Finally, construct validity of the CAT algorithms was evaluated by examining convergent and discriminant validity and sensitivity to change. The Addiction Severity CAT was determined to be valid, sensitive to change, and reliable. Further, the Addiction Severity CAT's time of completion was found to be significantly less than the average time of completion for the ASI-MV composite scores. This study represents the initial validation of an Addiction Severity CAT based on item response theory, and further exploration of the Addiction Severity CAT is needed. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  2. [Impact on Z-score Mapping of Hyperacute Stroke Images by Computed Tomography in Adaptive Statistical Iterative Reconstruction].

    Science.gov (United States)

    Watanabe, Shota; Sakaguchi, Kenta; Hosono, Makoto; Ishii, Kazunari; Murakami, Takamichi; Ichikawa, Katsuhiro

    The purpose of this study was to evaluate the effect of a hybrid-type iterative reconstruction method on Z-score mapping of hyperacute stroke in unenhanced computed tomography (CT) images. We used a hybrid-type iterative reconstruction method [adaptive statistical iterative reconstruction (ASiR)] implemented in a CT system (Optima CT660 Pro advance, GE Healthcare). For 15 normal brain cases, we reconstructed CT images with filtered back projection (FBP) and with ASiR at a blending factor of 100% (ASiR100%). Two standardized normal brain datasets were created from the normal databases of FBP images (FBP-NDB) and ASiR100% images (ASiR-NDB), and standard deviation (SD) values in the basal ganglia were measured. Z-score mapping was performed for 12 hyperacute stroke cases using FBP-NDB and ASiR-NDB, and the Z-score values in the hyperacute stroke area and normal area were compared between FBP-NDB and ASiR-NDB. With ASiR-NDB, the SD value of the standardized brain was decreased by 16%. The Z-score value of ASiR-NDB in the hyperacute stroke area was significantly higher than that of FBP-NDB (p < 0.05), suggesting that using ASiR100% for Z-score mapping has the potential to improve its accuracy.
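
    Z-score mapping compares each voxel of the patient scan with a voxel-wise normal database (NDB): Z = (patient − NDB mean)/NDB SD, so the 16% SD reduction reported for ASiR-NDB directly enlarges |Z| for the same CT-number drop. A toy sketch with simulated arrays:

        import numpy as np

        # Voxel-wise Z = (patient - NDB mean) / NDB SD; a lower-noise NDB (as with
        # ASiR100%) shrinks the SD and enlarges |Z| for the same CT-number drop.
        rng = np.random.default_rng(3)
        ndb = rng.normal(35.0, 3.0, size=(15, 64, 64))   # 15 normal scans (HU)
        ndb_mean = ndb.mean(axis=0)
        ndb_sd = ndb.std(axis=0, ddof=1)
        patient = ndb_mean + rng.normal(0.0, 3.0, size=(64, 64))
        patient[20:30, 20:30] -= 5.0                     # simulated hypodense lesion
        z = (patient - ndb_mean) / ndb_sd                # negative Z = hypodense
        print(round(float((z < -2.0).mean()), 3))        # fraction of flagged voxels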

  3. A study of the image quality of computed tomography adaptive statistical iterative reconstructed brain images using subjective and objective methods

    International Nuclear Information System (INIS)

    Mangat, J.; Morgan, J.; Benson, E.; Baath, M.; Lewis, M.; Reilly, A.

    2016-01-01

    The recent reintroduction of iterative reconstruction in computed tomography has facilitated the realisation of major dose savings. The aim of this article was to investigate the possibility of achieving further savings at a site with well-established Adaptive Statistical Iterative Reconstruction (ASiR™, GE Healthcare) brain protocols. An adult patient study was conducted with observers making visual grading assessments using image quality criteria, which were compared with the frequency domain metrics, noise power spectrum and modulation transfer function. Subjective image quality equivalency was found in the 40-70% ASiR™ range, leading to the proposal of ranges for the objective metrics defining acceptable image quality. Based on the findings of both the patient-based and objective studies of the ASiR™/tube-current combinations tested, 60%/305 mA was found to fall within all but one of these ranges. Therefore, it is recommended that an ASiR™ level of 60%, with a noise index of 12.20, is a viable alternative to the currently used protocol featuring a 40% ASiR™ level and a noise index of 11.20, potentially representing a 16% dose saving. (authors)

  4. Reduction of radiation exposure and improvement of image quality with BMI-adapted prospective cardiac computed tomography and iterative reconstruction

    International Nuclear Information System (INIS)

    Hosch, Waldemar; Stiller, Wolfram; Mueller, Dirk; Gitsioudis, Gitsios; Welzel, Johanna; Dadrich, Monika; Buss, Sebastian J.; Giannitsis, Evangelos; Kauczor, Hans U.; Katus, Hugo A.; Korosoglou, Grigorios

    2012-01-01

    Purpose: To assess the impact of body mass index (BMI)-adapted protocols and iterative reconstruction algorithms (iDose) on patient radiation exposure and image quality in patients undergoing prospective ECG-triggered 256-slice coronary computed tomography angiography (CCTA). Methods: Image quality and radiation exposure were systematically analyzed in 100 patients. 60 patients underwent prospective ECG-triggered CCTA using a non-tailored protocol and served as a ‘control’ group (Group 1: 120 kV, 200 mAs). 40 consecutive patients with suspected coronary artery disease (CAD) underwent prospective CCTA using BMI-adapted tube voltage and standard (Group 2: 100/120 kV, 100–200 mAs) versus reduced tube current (Group 3: 100/120 kV, 75–150 mAs). Iterative reconstructions were provided with different iDose levels and were compared to filtered back projection (FBP) reconstructions. Image quality was assessed in consensus by 2 experienced observers using a 5-grade scale (1 = best to 5 = worst), and signal- and contrast-to-noise ratios (SNR and CNR) were quantified. Results: CCTA was performed without adverse events in all patients (n = 100, heart rate of 47–87 bpm and BMI of 19–38 kg/m²). Patients examined using the non-tailored protocol in Group 1 had the highest radiation exposure (3.2 ± 0.4 mSv), followed by Group 2 (1.7 ± 0.7 mSv) and Group 3 (1.2 ± 0.6 mSv) (radiation savings of 47% and 63%, respectively, p < 0.001). Iterative reconstructions provided increased SNR and CNR, particularly when the higher iDose level 5 was applied with Multi-Frequency reconstruction (iDose5 MFR) (14.1 ± 4.6 versus 21.2 ± 7.3 for SNR and 12.0 ± 4.2 versus 18.1 ± 6.6 for CNR, for FBP versus iDose5 MFR, respectively, p < 0.001). The combination of BMI adaptation with iterative reconstruction reduced radiation exposure and simultaneously improved image quality (subjective image quality of 1.4 ± 0.4 versus 1.9 ± 0.5 for Group 2 reconstructed using iDose5 MFR versus
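
    One common definition of the two metrics quantified above: SNR = mean(ROI)/SD(ROI) and CNR = |mean(ROI) − mean(background)|/SD(background). A sketch on simulated Hounsfield-unit samples (made-up numbers):

        import numpy as np

        # SNR = mean(ROI)/SD(ROI); CNR = |mean(ROI) - mean(background)|/SD(background)
        rng = np.random.default_rng(5)
        lumen = rng.normal(400.0, 28.0, 500)   # contrast-filled lumen HU samples
        fat = rng.normal(-80.0, 25.0, 500)     # perivascular fat background HU
        snr = lumen.mean() / lumen.std(ddof=1)
        cnr = abs(lumen.mean() - fat.mean()) / fat.std(ddof=1)
        print(round(snr, 1), round(cnr, 1))    # roughly 14 and 19 here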

  5. Cultural adaptation and validation of the “Kidney Disease and Quality of Life - Short Form (KDQOL-SF™ version 1.3)” questionnaire in Egypt

    Directory of Open Access Journals (Sweden)

    Abd ElHafeez Samar

    2012-12-01

    Background: Health-related quality of life (HRQOL) instruments need disease- and country-specific validation. In Arab countries, there is no specific validated questionnaire for the assessment of HRQOL in chronic kidney disease (CKD) patients. The aim of this study was to present an Arabic translation, adaptation, and subsequent validation of the Kidney Disease Quality of Life - Short Form (KDQOL-SF™) version 1.3 questionnaire in a representative series of Egyptian CKD patients. Methods: The KDQOL-SF™ version 1.3 was translated into Arabic by two independent translators, and then subsequently translated back into English. After translation disparities were reconciled, the final Arabic questionnaire was tested by interviewing 100 pre-dialysis CKD (stage 1-4) patients randomly selected from outpatients attending the nephrology clinic at the Main Alexandria University Hospital. Test-retest reliability was assessed, in a subsample of 50 consecutive CKD patients, by two interviews 7 days apart, and internal consistency was estimated by Cronbach’s α. Discriminant, concept, and construct validity were assessed. Results: All items of the SF-36 met the criterion for internal consistency and were reproducible. Of the 10 kidney disease targeted scales, only three had Cronbach’s α below 0.7. All scales of the KDQOL-SF™ 1.3 were significantly inter-correlated. Finally, principal component analysis of the kidney disease targeted scale indicated that this part of the questionnaire could be summarized into 10 factors that together explained 70.9% of the variance. Conclusion: The results suggest that this Arabic version of the KDQOL-SF™ 1.3 questionnaire is a valid and reliable tool for use in Egyptian patients with CKD.

  6. Adaptive-Predictive Organ Localization Using Cone-Beam Computed Tomography for Improved Accuracy in External Beam Radiotherapy for Bladder Cancer

    International Nuclear Information System (INIS)

    Lalondrelle, Susan; Huddart, Robert; Warren-Oseni, Karole; Hansen, Vibeke Nordmark; McNair, Helen; Thomas, Karen; Dearnaley, David; Horwich, Alan; Khoo, Vincent

    2011-01-01

    Purpose: To examine patterns of bladder wall motion during high-dose hypofractionated bladder radiotherapy and to validate a novel adaptive planning method, A-POLO, to prevent subsequent geographic miss. Methods and Materials: Patterns of individual bladder filling were obtained with repeat computed tomography planning scans at 0, 15, and 30 minutes after voiding. A series of patient-specific plans corresponding to these time-displacement points was created. Pretreatment cone-beam computed tomography was performed before each fraction and assessed retrospectively for adaptive intervention. For fractions that would have required intervention, the most appropriate plan was chosen from the patient's 'library', and the resulting target coverage was reassessed with repeat cone-beam computed tomography. Results: A large variation in patterns of bladder filling and interfraction displacement was seen. During radiotherapy, predominant translations occurred cranially (maximum 2.5 cm) and anteriorly (maximum 1.75 cm). No apparent explanation was found for this variation using pretreatment patient factors. A need for adaptive planning was demonstrated in 51% of fractions, and 73% of fractions would have been delivered correctly using A-POLO. The adaptive strategy improved target coverage and was also able to account for intrafraction motion. Conclusions: Bladder volume variation will result in geographic miss in a high proportion of delivered bladder radiotherapy treatments. The A-POLO strategy can be used to correct for this and can be implemented from the first fraction of radiotherapy; thus, it is particularly suited to hypofractionated bladder radiotherapy regimens.

  7. Computed Tomography Image Quality Evaluation of a New Iterative Reconstruction Algorithm in the Abdomen (Adaptive Statistical Iterative Reconstruction-V): A Comparison With Model-Based Iterative Reconstruction, Adaptive Statistical Iterative Reconstruction, and Filtered Back Projection Reconstructions.

    Science.gov (United States)

    Goodenberger, Martin H; Wagner-Bartak, Nicolaus A; Gupta, Shiva; Liu, Xinming; Yap, Ramon Q; Sun, Jia; Tamm, Eric P; Jensen, Corey T

    The purpose of this study was to compare abdominopelvic computed tomography images reconstructed with adaptive statistical iterative reconstruction-V (ASIR-V), model-based iterative reconstruction (Veo 3.0), ASIR, and filtered back projection (FBP). Abdominopelvic computed tomography scans for 36 patients (26 males and 10 females) were reconstructed using FBP, ASIR (80%), Veo 3.0, and ASIR-V (30%, 60%, 90%). Mean ± SD patient age was 32 ± 10 years, with a mean ± SD body mass index of 26.9 ± 4.4 kg/m². Images were reviewed by 2 independent readers in a blinded, randomized fashion. Hounsfield unit, noise, and contrast-to-noise ratio (CNR) values were calculated for each reconstruction algorithm for further comparison. Phantom evaluation of low-contrast detectability (LCD) and high-contrast resolution was performed. ASIR-V 30%, ASIR-V 60%, and ASIR 80% were generally superior qualitatively compared with ASIR-V 90%, Veo 3.0, and FBP. CNR differed significantly among reconstructions, with reported values of 5.54 ± 2.39, 8.78 ± 3.15, and 3.49 ± 1.77. Veo 3.0 and ASIR 80% had the best and worst spatial resolution, respectively. ASIR-V 30% and ASIR-V 60% provided the best combination of qualitative and quantitative performance. ASIR 80% was equivalent qualitatively, but demonstrated inferior spatial resolution and LCD.

  8. Construct validity of the pediatric evaluation of disability inventory computer adaptive test (PEDI-CAT) in children with medical complexity.

    Science.gov (United States)

    Dumas, Helene M; Fragala-Pinkham, Maria A; Rosen, Elaine L; O'Brien, Jane E

    2017-11-01

    To assess construct (convergent and divergent) validity of the Pediatric Evaluation of Disability Inventory Computer Adaptive Test (PEDI-CAT) in a sample of children with complex medical conditions. Demographics, clinical information, PEDI-CAT normative score, and the Post-Acute Acuity Rating for Children (PAARC) level were collected for all post-acute hospital admissions (n = 110) from 1 April 2015 to 1 March 2016. Correlations between the PEDI-CAT Daily Activities, Mobility, and Social/Cognitive domain scores for the total sample and across three age groups (infant, preschool, and school-age) were calculated. Differences in mean PEDI-CAT scores for each domain across two groups, children with "Less Complexity" or "More Complexity" based on PAARC level, were examined. All correlations for the total sample and age subgroups were statistically significant, and trends across age groups were evident, with stronger associations between domains for the infant group. Significant differences were found between mean PEDI-CAT Daily Activities, Mobility, and Social/Cognitive normative scores across the two complexity groups, with children in the "Less Complex" group having higher PEDI-CAT scores for all domains. This study provides evidence indicating the PEDI-CAT can be used with confidence in capturing and differentiating children's level of function in a post-acute care setting. Implications for Rehabilitation: The PEDI-CAT is a measure of function for children with a variety of conditions and can be used in any clinical setting. Convergent validity of the PEDI-CAT's Daily Activities, Mobility, and Social/Cognitive domains was significant and particularly strong for infants and young children with medical complexity. The PEDI-CAT was able to discriminate groups of children with differing levels of medical complexity admitted to a pediatric post-acute care hospital.

  9. Online selection of short-lived particles on many-core computer architectures in the CBM experiment at FAIR

    Energy Technology Data Exchange (ETDEWEB)

    Zyzak, Maksym

    2016-07-07

    Modern experiments in heavy ion collisions operate with huge data rates that cannot be fully stored on currently available storage devices. Therefore the data flow should be reduced by selecting those collisions that potentially carry the information of physics interest. The future CBM experiment will have no simple criteria for selecting such collisions and requires full online reconstruction of the collision topology, including reconstruction of short-lived particles. In this work the KF Particle Finder package for online reconstruction and selection of short-lived particles is proposed and developed. It reconstructs more than 70 decays, covering signals from all the physics cases of the CBM experiment: strange particles, strange resonances, hypernuclei, low mass vector mesons, charmonium, and open-charm particles. The package is based on the Kalman filter method, providing a full set of the particle parameters together with their errors, including position, momentum, mass, energy, lifetime, etc. It shows a high quality of the reconstructed particles, high efficiencies, and high signal to background ratios. The KF Particle Finder is extremely fast, achieving a reconstruction speed of 1.5 ms per minimum-bias AuAu collision at 25 AGeV beam energy on a single CPU core. It is fully vectorized and parallelized and shows strong linear scalability on many-core architectures of up to 80 cores. It also scales within the First Level Event Selection package on many-core clusters of up to 3200 cores. The developed KF Particle Finder package is a universal platform for short-lived particle reconstruction, physics analysis, and online selection.
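
    At the heart of the package is the Kalman filter's predict/update cycle. A minimal one-dimensional constant-velocity example (a toy analogue of a track fit, not KF Particle Finder itself):

        import numpy as np

        dt = 1.0
        F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition for (x, v)
        H = np.array([[1.0, 0.0]])              # we measure position only
        Q = 1e-4 * np.eye(2)                    # process noise covariance
        R = np.array([[0.25]])                  # measurement noise covariance
        x, P = np.zeros(2), np.eye(2)
        rng = np.random.default_rng(6)
        for k in range(20):
            z = 0.5 * k * dt + rng.normal(0.0, 0.5)   # noisy position "hit"
            x, P = F @ x, F @ P @ F.T + Q             # predict
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)            # Kalman gain
            x = x + (K @ (z - H @ x)).ravel()         # update state
            P = (np.eye(2) - K @ H) @ P               # update covariance
        print(x.round(2))                             # roughly [9.5, 0.5]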

  10. Online selection of short-lived particles on many-core computer architectures in the CBM experiment at FAIR

    International Nuclear Information System (INIS)

    Zyzak, Maksym

    2016-01-01

    Modern experiments in heavy ion collisions operate with huge data rates that cannot be fully stored on currently available storage devices. Therefore the data flow should be reduced by selecting those collisions that potentially carry the information of physics interest. The future CBM experiment will have no simple criteria for selecting such collisions and requires full online reconstruction of the collision topology, including reconstruction of short-lived particles. In this work the KF Particle Finder package for online reconstruction and selection of short-lived particles is proposed and developed. It reconstructs more than 70 decays, covering signals from all the physics cases of the CBM experiment: strange particles, strange resonances, hypernuclei, low mass vector mesons, charmonium, and open-charm particles. The package is based on the Kalman filter method, providing a full set of the particle parameters together with their errors, including position, momentum, mass, energy, lifetime, etc. It shows a high quality of the reconstructed particles, high efficiencies, and high signal to background ratios. The KF Particle Finder is extremely fast, achieving a reconstruction speed of 1.5 ms per minimum-bias AuAu collision at 25 AGeV beam energy on a single CPU core. It is fully vectorized and parallelized and shows strong linear scalability on many-core architectures of up to 80 cores. It also scales within the First Level Event Selection package on many-core clusters of up to 3200 cores. The developed KF Particle Finder package is a universal platform for short-lived particle reconstruction, physics analysis, and online selection.

  11. Evolution of short range order in Ar: Liquid to glass and solid transitions-A computational study

    Science.gov (United States)

    Shor, Stanislav; Yahel, Eyal; Makov, Guy

    2018-04-01

    The evolution of the short range order (SRO) as a function of temperature in a Lennard-Jones model liquid with Ar parameters was determined and juxtaposed with thermodynamic and kinetic properties obtained as the liquid was cooled (heated) and transformed between crystalline solid or glassy states and an undercooled liquid. The Lennard-Jones system was studied by non-equilibrium molecular dynamics simulations of large supercells (approximately 20000 atoms) rapidly cooled or heated at selected quenching rates and at constant pressure. The liquid to solid transition was identified by discontinuities in the atomic volume and molar enthalpy; the glass transition temperature range was identified from the temperature dependence of the self-diffusion. The SRO was studied within the quasi-crystalline model (QCM) framework and compared with the Steinhardt bond order parameters. Within the QCM it was found that the SRO evolves from a bcc-like order in the liquid through a bct-like short range order (c/a=1.2) in the supercooled liquid which persists into the glass and finally to a fcc-like ordering in the crystalline solid. The variation of the SRO that results from the QCM compares well with that obtained with Steinhardt's bond order parameters. The hypothesis of icosahedral order in liquids and glasses is not supported by our results.
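
    For reference, the Steinhardt bond-order parameters used above average spherical harmonics over the N_b(i) neighbours of each atom i (the standard definition, not anything specific to this paper):

        q_{lm}(i) = \frac{1}{N_b(i)} \sum_{j=1}^{N_b(i)} Y_{lm}\!\left(\hat{\mathbf{r}}_{ij}\right),
        \qquad
        q_l(i) = \sqrt{\frac{4\pi}{2l+1} \sum_{m=-l}^{l} \left| q_{lm}(i) \right|^{2}}

    For l = 6 the ideal fcc value (≈ 0.575) and the icosahedral value (≈ 0.663) are well separated, which is what makes q6 suitable for the fcc-versus-icosahedral question examined here.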

  12. Translation, adaptation, validation and performance of the American Weight Efficacy Lifestyle Questionnaire Short Form (WEL-SF) to a Norwegian version: a cross-sectional study

    Directory of Open Access Journals (Sweden)

    Tone N. Flølo

    2014-09-01

    Background. Researchers have emphasized a need to identify predictors that can explain the variability in weight management after bariatric surgery. Eating self-efficacy has demonstrated predictive impact on patients’ adherence to recommended eating habits following multidisciplinary treatment programs, but has to a limited extent been a subject of research after bariatric surgery. Recently, an American short-form version (WEL-SF) of the commonly used Weight Efficacy Lifestyle Questionnaire (WEL) was made available for research and clinical purposes. Objectives. We intended to translate and culturally adapt the WEL-SF to Norwegian conditions, and to evaluate the new version’s psychometric properties in a Norwegian population of morbidly obese patients eligible for bariatric surgery. Design. Cross-sectional. Methods. A total of 225 outpatients selected for laparoscopic sleeve gastrectomy (LSG) were recruited: 114 non-operated and 111 operated patients, respectively. The questionnaire was translated through forward and backward procedures. Structural properties were assessed by principal component analysis (PCA); correlation and regression analyses were conducted to evaluate convergent validity and sensitivity, respectively. Data were assessed by mean, median, item response, missing values, floor and ceiling effects, Cronbach’s alpha, and alpha if item deleted. Results. The PCA resulted in one factor with eigenvalue > 1, explaining 63.0% of the variability. The WEL-SF sum scores were positively correlated with the self-efficacy and quality of life instruments (p < 0.001). The WEL-SF was associated with body mass index (BMI) (p < 0.001) and changes in BMI (p = 0.026). A very high item response was obtained, with only one missing value (0.4%). The ceiling effect was on average 0.9% and 17.1% in the non-operated and operated samples, respectively. Strong internal consistency (r = 0.92) was obtained, and Cronbach’s alpha remained high (0.86–0.92) if single items were deleted.

  13. Adaptive Lighting

    DEFF Research Database (Denmark)

    Petersen, Kjell Yngve; Søndergaard, Karin; Kongshaug, Jesper

    2015-01-01

    Adaptive Lighting. Adaptive lighting is based on a partial automation of the possibilities to adjust the colour tone and brightness levels of light in order to adapt to people’s needs and desires. IT support is key to the technical developments that afford adaptive control systems. The possibilities offered by adaptive lighting control are created by the ways that the system components, the network and data flow can be coordinated through software so that the dynamic variations are controlled in ways that meaningfully adapt according to people’s situations and design intentions. This book discusses the investigations of lighting scenarios carried out in two test installations: White Cube and White Box. The test installations are discussed as large-scale experiential instruments. In these test installations we examine what could potentially occur when light using LED technology is integrated and distributed differently into an architectural body. We also examine what might occur when light is dynamic and able to change colour, intensity and direction, and when it is adaptive and can be brought into interaction with its surroundings. In short, what happens to an architectural space when artificial lighting ceases…

  14. Predicting Short-Term Electricity Demand by Combining the Advantages of ARMA and XGBoost in Fog Computing Environment

    Directory of Open Access Journals (Sweden)

    Chuanbin Li

    2018-01-01

    With the rapid development of the IoT, the disadvantages of the Cloud framework have been exposed, such as high latency, network congestion, and low reliability. Therefore, the Fog Computing framework has emerged, with an extended Fog Layer between the Cloud and terminals. To address real-time prediction of electricity demand, we propose an approach based on XGBoost and ARMA in a Fog Computing environment. Taking advantage of the Fog Computing framework, we first propose a prototype-based clustering algorithm to divide enterprise users into several categories based on their total electricity consumption; we then propose a model selection approach by analyzing users’ historical records of electricity consumption and identifying the most important features. Generally speaking, if the historical records pass the tests of stationarity and white noise, ARMA is used to model the user’s electricity consumption as a time series; otherwise, if the historical records do not pass the tests and some discrete features, such as weather and whether it is a weekend, are the most important, XGBoost is used. The experimental results show that our proposed approach, combining the advantages of ARMA and XGBoost, is more accurate than the classical models.
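
    A hedged sketch of that model-selection rule as we read it (fit ARMA when the series is stationary and shows autocorrelation, otherwise fall back to a feature-based regressor such as XGBoost); the function name and thresholds are our own, not the paper's:

        import numpy as np
        from statsmodels.tsa.stattools import adfuller
        from statsmodels.stats.diagnostic import acorr_ljungbox

        def choose_model(y, alpha=0.05):
            stationary = adfuller(y)[1] < alpha               # ADF unit-root test
            lb_p = float(acorr_ljungbox(y, lags=[10])["lb_pvalue"].iloc[0])
            return "ARMA" if stationary and lb_p < alpha else "XGBoost"

        rng = np.random.default_rng(7)
        ar1 = np.zeros(300)
        for t in range(1, 300):                               # stationary AR(1) demand
            ar1[t] = 0.7 * ar1[t - 1] + rng.normal()
        print(choose_model(ar1))                              # -> ARMA
        print(choose_model(rng.normal(size=300)))             # white noise -> XGBoost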

  15. Multidetector row computed tomography of acute pancreatitis: Utility of single portal phase CT scan in short-term follow up

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, Yongwonn [Department of Radiology, Konkuk University Medical Center, 4-12, Hwayang-dong, Gwangjin-gu, Seoul 143-729 (Korea, Republic of); Park, Hee Sun, E-mail: heesun.park@gmail.com [Department of Radiology, Konkuk University Medical Center, 4-12, Hwayang-dong, Gwangjin-gu, Seoul 143-729 (Korea, Republic of); Kim, Young Jun; Jung, Sung Il; Jeon, Hae Jeong [Department of Radiology, Konkuk University Medical Center, 4-12, Hwayang-dong, Gwangjin-gu, Seoul 143-729 (Korea, Republic of)

    2012-08-15

    Objective: The purpose of this study is to evaluate whether nonenhanced CT or contrast-enhanced portal phase CT can replace multiphasic pancreas protocol CT for short-term monitoring of patients with acute pancreatitis. Materials and methods: This retrospective study was approved by the Institutional Review Board. From April 2006 to May 2010, a total of 52 patients with acute pancreatitis who underwent initial dual phase multidetector row CT (unenhanced, arterial, and portal phase) at admission and a short term (within 30 days) follow up dual phase CT (mean interval 10.3 days, range 3-28 days) were included. Two abdominal radiologists performed an independent review of three sets of follow up CT images (nonenhanced scan, single portal phase scan, and dual phase scan). Interpretations of the image sets were separated by intervals of at least 2 weeks. The radiologists evaluated the severity of acute pancreatitis with regard to pancreatic inflammation, pancreatic necrosis, and extrapancreatic complications, based on the modified CT severity index. Scores for each image set were compared using a paired t-test, and interobserver agreement was evaluated using intraclass correlation coefficient statistics. Results: Mean total CT severity index scores on the nonenhanced, portal phase, and dual phase scans were 5.7, 6.6, and 6.5 for radiologist 1, and 5.0, 5.6, and 5.8 for radiologist 2, respectively. For both radiologists, the contrast-enhanced scans (portal phase and dual phase) showed significantly higher severity scores than the unenhanced scan (P < 0.05), while the portal phase and dual phase scans showed no significant difference from each other. The trend was similar for pancreatic inflammation and extrapancreatic complications, in which contrast-enhanced scans showed significantly higher scores than the unenhanced scan, while no significant difference was observed between the portal phase and dual phase scans. In pancreatic necrosis

  16. Development of an imaging-planning program for screen/film and computed radiography mammography for breasts with short chest wall to nipple distance.

    Science.gov (United States)

    Dong, S L; Su, J L; Yeh, Y H; Chu, T C; Lin, Y C; Chuang, K S

    2011-04-01

    Imaging breasts with a short chest wall to nipple distance (CWND) using a traditional mammographic X-ray unit is a technical challenge for mammographers. The purpose of this study was to develop an imaging-planning program to assist in determining imaging parameters of screen/film (SF) and computed radiography (CR) mammography for short CWND breasts. A traditional mammographic X-ray unit (Mammomat 3000, Siemens, Munich, Germany) was employed. The imaging-planning program was developed by combining a compressed breast thickness correction, an equivalent polymethylmethacrylate thickness assessment for breasts, and tube loading (mAs) measurements. Both phantom exposures and a total of 597 patient exposures were used to examine the imaging-planning program. Results of the phantom study show that the tube loading decreased rapidly with CWND when the automatic exposure control (AEC) detector was not fully covered by the phantom. For patient exposures with the AEC fully covered by breast tissue, the average fractional tube loadings, defined as the ratio of the mAs predicted by the imaging-planning program to the mAs of the actual mammogram, were 1.10 and 1.07 for SF and CR mammograms, respectively. The predicted mAs values were thus comparable to the mAs values determined by the AEC. By applying the imaging-planning program in clinical practice, reliance on the mammographer's experience in determining imaging parameters for short CWND breasts is minimised.
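    For illustration, the fractional tube loading defined above reduces to a simple ratio; the mAs values below are made up, not data from the study:

```python
# Fractional tube loading as defined in the abstract: mAs predicted by
# the imaging-planning program divided by the AEC-determined mAs of the
# actual mammogram. The numbers are illustrative placeholders.
predicted_mas = [110.0, 95.0, 120.0]   # program output per exposure
actual_mas    = [100.0, 90.0, 112.0]   # mAs recorded by the AEC

fractions = [p / a for p, a in zip(predicted_mas, actual_mas)]
mean_fraction = sum(fractions) / len(fractions)
print(f"average fractional tube loading: {mean_fraction:.2f}")  # ~1.08
```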

  17. Short-range order in ab initio computer generated amorphous and liquid Cu–Zr alloys: A new approach

    International Nuclear Information System (INIS)

    Galván-Colín, Jonathan; Valladares, Ariel A.; Valladares, Renela M.; Valladares, Alexander

    2015-01-01

    Using ab initio molecular dynamics and a new approach based on the undermelt-quench method, we generated amorphous and liquid samples of CuₓZr₁₀₀₋ₓ (x = 64, 50, 36) alloys. We characterized the topology of the resulting structures by means of the pair distribution function and the bond-angle distribution; a coordination number distribution was also calculated. Our results for both the amorphous and liquid phases agree well with experiment. The dependence of short-range order on concentration is reported. We found that icosahedron-like geometry plays a major role whether the alloys are Cu-rich or Zr-rich, regardless of whether the samples are amorphous or liquid. The validation of these results would, in turn, let us calculate other properties so far disregarded in the literature.

  18. Short-range order in ab initio computer generated amorphous and liquid Cu–Zr alloys: A new approach

    Energy Technology Data Exchange (ETDEWEB)

    Galván-Colín, Jonathan, E-mail: jgcolin@ciencias.unam.mx [Instituto de Investigaciones en Materiales, Universidad Nacional Autónoma de México, Apartado Postal 70-360, México, D.F. 04510, México (Mexico); Valladares, Ariel A., E-mail: valladar@unam.mx [Instituto de Investigaciones en Materiales, Universidad Nacional Autónoma de México, Apartado Postal 70-360, México, D.F. 04510, México (Mexico); Valladares, Renela M.; Valladares, Alexander [Facultad de Ciencias, Universidad Nacional Autónoma de México, Apartado Postal 70-542, México, D.F. 04510, México (Mexico)

    2015-10-15

    Using ab initio molecular dynamics and a new approach based on the undermelt-quench method, we generated amorphous and liquid samples of CuₓZr₁₀₀₋ₓ (x = 64, 50, 36) alloys. We characterized the topology of the resulting structures by means of the pair distribution function and the bond-angle distribution; a coordination number distribution was also calculated. Our results for both the amorphous and liquid phases agree well with experiment. The dependence of short-range order on concentration is reported. We found that icosahedron-like geometry plays a major role whether the alloys are Cu-rich or Zr-rich, regardless of whether the samples are amorphous or liquid. The validation of these results would, in turn, let us calculate other properties so far disregarded in the literature.
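    Both records above characterize topology through the pair distribution function g(r). A minimal estimator for a cubic periodic cell is sketched below; the binning and normalization follow the standard textbook definition, not the authors' code, and the inputs are assumed illustrative:

```python
# Minimal pair-distribution-function g(r) estimator for a cubic
# periodic simulation cell (positions in an (n, 3) array, box length in
# the same units). A standard-definition sketch, not the authors' tool.
import numpy as np

def pair_distribution(positions: np.ndarray, box: float, nbins: int = 100):
    n = len(positions)
    rmax = box / 2.0
    bins = np.linspace(0.0, rmax, nbins + 1)
    hist = np.zeros(nbins)
    for i in range(n - 1):
        d = positions[i + 1:] - positions[i]
        d -= box * np.round(d / box)              # minimum-image convention
        r = np.linalg.norm(d, axis=1)
        hist += np.histogram(r[r < rmax], bins=bins)[0]
    rho = n / box**3                               # number density
    shell = (4.0 / 3.0) * np.pi * (bins[1:]**3 - bins[:-1]**3)
    ideal = shell * rho * n / 2.0                  # expected pair counts
    return 0.5 * (bins[1:] + bins[:-1]), hist / ideal
```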

  19. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October, a model to account for service work at Tier-2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers at the end of November; it will take about two weeks. The Computing Shifts procedure was tested at full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  20. Acute effects of the Glucagon-Like Peptide 2 analogue, teduglutide, on intestinal adaptation in newborn pigs with short bowel syndrome

    DEFF Research Database (Denmark)

    Thymann, Thomas; Stoll, Barbara; Mecklenburg, Lars

    2014-01-01

    Neonatal short bowel syndrome following massive gut resection is associated with malabsorption of nutrients. The intestinotrophic factor glucagon-like peptide 2 (GLP-2) improves gut function in adult short bowel patients, but its effect in pediatric patients remains unknown. Our objective was to test...

  1. First Clinical Investigation of Cone Beam Computed Tomography and Deformable Registration for Adaptive Proton Therapy for Lung Cancer

    Energy Technology Data Exchange (ETDEWEB)

    Veiga, Catarina [Proton and Advanced RadioTherapy Group, Department of Medical Physics and Biomedical Engineering, University College London, London (United Kingdom); Janssens, Guillaume [Ion Beam Applications SA, Louvain-la-Neuve (Belgium); Teng, Ching-Ling [Department of Radiation Oncology, University of Pennsylvania, Philadelphia, Pennsylvania (United States); Baudier, Thomas; Hotoiu, Lucian [iMagX Project, ICTEAM Institute, Université Catholique de Louvain, Louvain-la-Neuve (Belgium); McClelland, Jamie R. [Centre for Medical Image Computing, Department of Medical Physics and Biomedical Engineering, University College London, London (United Kingdom); Royle, Gary [Proton and Advanced RadioTherapy Group, Department of Medical Physics and Biomedical Engineering, University College London, London (United Kingdom); Lin, Liyong; Yin, Lingshu; Metz, James; Solberg, Timothy D.; Tochner, Zelig; Simone, Charles B.; McDonough, James [Department of Radiation Oncology, University of Pennsylvania, Philadelphia, Pennsylvania (United States); Kevin Teo, Boon-Keng, E-mail: teok@uphs.upenn.edu [Department of Radiation Oncology, University of Pennsylvania, Philadelphia, Pennsylvania (United States)

    2016-05-01

    Purpose: An adaptive proton therapy workflow using cone beam computed tomography (CBCT) is proposed. It consists of an online evaluation of a fast range-corrected dose distribution based on a virtual CT (vCT) scan. This can be followed by more accurate offline dose recalculation on the vCT scan, which can trigger a rescan CT (rCT) for replanning. Methods and Materials: The workflow was tested retrospectively for 20 consecutive lung cancer patients. A diffeomorphic Morphon algorithm was used to generate the lung vCT by deforming the average planning CT onto the CBCT scan. An additional correction step was applied to account for anatomic modifications that cannot be modeled by deformation alone. A set of clinical indicators for replanning was generated according to the water equivalent thickness (WET) and dose statistics and compared with those obtained on the rCT scan. The fast dose approximation consisted of warping the initial planned dose onto the vCT scan according to the changes in WET. The potential under- and over-ranges were assessed as a variation in WET at the target's distal surface. Results: The range-corrected dose from the vCT scan reproduced clinical indicators similar to those of the rCT scan. The workflow performed well under different clinical scenarios, including atelectasis, lung reinflation, and different types of tumor response. Between the vCT and rCT scans, we found a difference of 3.4 ± 2.7 mm in the measured 95th percentile of the over-range distribution. The limitations of the technique consisted of inherent uncertainties in deformable registration and the drawbacks of CBCT imaging. The correction step was adequate when gross errors occurred but could not recover subtle anatomic or density changes in tumors with complex topology. Conclusions: A proton therapy workflow based on CBCT provided clinical indicators similar to those using rCT for patients with lung cancer with considerable anatomic changes.
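    The replanning indicators above hinge on water equivalent thickness. As a hedged illustration, WET along a beam ray can be accumulated as relative stopping power (RSP) times step length; the HU-to-RSP conversion below is a toy stand-in for a clinical calibration curve, and the CT numbers are invented:

```python
# Hedged sketch of WET bookkeeping: WET along one ray is the sum of
# relative stopping power times step length. The linear HU-to-RSP rule
# here is a simplification, not the clinical calibration.
import numpy as np

def wet_along_ray(hu_values: np.ndarray, step_mm: float) -> float:
    """Accumulate WET (mm) along one ray of CT numbers."""
    rsp = 1.0 + hu_values / 1000.0          # toy HU-to-RSP conversion
    return float(np.sum(np.clip(rsp, 0.0, None)) * step_mm)

# Replanning indicator: per-ray WET change between planning CT and vCT
wet_plan = wet_along_ray(np.array([0., 50., -700., 40.]), step_mm=2.0)
wet_vct  = wet_along_ray(np.array([0., 60., -100., 45.]), step_mm=2.0)
print(f"distal WET shift: {wet_vct - wet_plan:+.1f} mm")
```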

  2. A qualitative and quantitative analysis of radiation dose and image quality of computed tomography images using adaptive statistical iterative reconstruction.

    Science.gov (United States)

    Hussain, Fahad Ahmed; Mail, Noor; Shamy, Abdulrahman M; Suliman, Alghamdi; Saoudi, Abdelhamid

    2016-05-08

    Image quality is a key issue in radiology, particularly in a clinical setting where it is important to achieve accurate diagnoses while minimizing radiation dose. Some computed tomography (CT) manufacturers have introduced algorithms that claim significant dose reduction. In this study, we assessed CT image quality produced by two reconstruction algorithms provided with GE Healthcare's Discovery 690 Elite positron emission tomography (PET) CT scanner. Image quality was measured for images obtained at various doses with both the conventional filtered back-projection (FBP) and adaptive statistical iterative reconstruction (ASIR) algorithms. A standard CT dose index (CTDI) phantom and a pencil ionization chamber were used to measure the CT dose at 120 kVp and an exposure of 260 mAs. Image quality was assessed using two phantoms. CT images of both phantoms were acquired at a tube voltage of 120 kV with exposures ranging from 25 mAs to 400 mAs. Images were reconstructed using FBP and ASIR blending ranging from 10% to 100%, then analyzed for noise, low-contrast detectability, contrast-to-noise ratio (CNR), and modulation transfer function (MTF). Noise was 4.6 HU in water phantom images acquired both at 260 mAs/FBP 120 kV and at 130 mAs/50% ASIR 120 kV. Large objects (frequency < 7 lp/cm) remained well visualized at 130 mAs/50% ASIR, compared to 260 mAs/FBP. The application of ASIR for small objects (frequency > 7 lp/cm) showed poor visibility compared to FBP at 260 mAs, and even worse for images acquired at less than 130 mAs. ASIR blending of more than 50% at low dose tends to reduce the contrast of small objects (frequency > 7 lp/cm). We concluded that dose reduction and ASIR should be applied with caution if the objects to be detected or diagnosed are small (frequency > 7 lp/cm). Further investigations are required to correlate small-object (frequency > 7 lp/cm) visibility with patient anatomy and clinical diagnosis.

  3. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year's; this results in longer reconstruction times and events that are harder to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load are close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  4. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of the Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in the computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies, ramping to 50M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS-specific tests to be included in Service Availa...

  5. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and to address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  6. Effect of the Novel Polysaccharide PolyGlycopleX® on Short-Chain Fatty Acid Production in a Computer-Controlled in Vitro Model of the Human Large Intestine

    Directory of Open Access Journals (Sweden)

    Raylene A. Reimer

    2014-03-01

    Full Text Available Many of the health benefits associated with dietary fiber are attributed to their fermentation by microbiota and the production of short-chain fatty acids (SCFA). The aim of this study was to investigate the fermentability of the functional fiber PolyGlycopleX® (PGX®) in vitro. A validated dynamic, computer-controlled in vitro system simulating the conditions in the proximal large intestine (TIM-2) was used. Sodium hydroxide (NaOH) consumption in the system was used as an indicator of fermentability, and SCFA and branched-chain fatty acid (BCFA) production was determined. NaOH consumption was significantly higher for fructooligosaccharide (FOS) than for PGX, which was higher than for cellulose (p = 0.002). At 32, 48 and 72 h, acetate and butyrate production were higher for FOS and PGX versus cellulose. Propionate production was higher for PGX than cellulose at 32, 48, 56 and 72 h, and higher than FOS at 72 h (p = 0.014). Total BCFA production was lower for FOS compared to cellulose, whereas production with PGX was lower than for cellulose at 72 h. In conclusion, PGX is fermented by the colonic microbiota, which appeared to adapt to the substrate over time. The greater propionate production for PGX may explain part of the cholesterol-lowering properties of PGX seen in rodents and humans.

  7. Value of Information for Optimal Adaptive Routing in Stochastic Time-Dependent Traffic Networks: Algorithms and Computational Tools

    Science.gov (United States)

    2010-10-25

    Real-time information is important for travelers' routing decisions in uncertain networks by enabling online adaptation to revealed traffic conditions. Usually there are spatial and/or temporal limitations in traveler information. In this research, a...

  8. Pepsi-SAXS: an adaptive method for rapid and accurate computation of small-angle X-ray scattering profiles

    OpenAIRE

    Grudinin, Sergei; Garkavenko, Maria; Kazennov, Andrei

    2017-01-01

    A new method called Pepsi-SAXS is presented that calculates small-angle X-ray scattering profiles from atomistic models. The method is based on the multipole expansion scheme and is significantly faster compared with other tested methods. In particular, using the Nyquist–Shannon–Kotelnikov sampling theorem, the multipole expansion order is adapted to the size of the model and the resolution of the experimental data. It is argued that by using the adaptive expansion ord...

  9. Modeling the Internet of Things, Self-Organizing and Other Complex Adaptive Communication Networks: A Cognitive Agent-Based Computing Approach.

    Directory of Open Access Journals (Sweden)

    Samreen Laghari

    Full Text Available Computer Networks have a tendency to grow at an unprecedented scale. Modern networks involve not only computers but also a wide variety of other interconnected devices ranging from mobile phones to other household items fitted with sensors. This vision of the "Internet of Things" (IoT) implies an inherent difficulty in modeling problems. It is practically impossible to implement and test all scenarios for large-scale and complex adaptive communication networks as part of Complex Adaptive Communication Networks and Environments (CACOONS). The goal of this study is to explore the use of Agent-based Modeling as part of the Cognitive Agent-based Computing (CABC) framework to model a complex communication network problem. We use Exploratory Agent-based Modeling (EABM), as part of the CABC framework, to develop an autonomous multi-agent architecture for managing carbon footprint in a corporate network. To evaluate the application of complexity in practical scenarios, we have also introduced a company-defined computer usage policy. The conducted experiments demonstrated two important results. First, a CABC-based modeling approach such as agent-based modeling can be effective for modeling complex problems in the domain of IoT. Second, the specific problem of managing the carbon footprint can be solved using a multi-agent system approach.

  10. Modeling the Internet of Things, Self-Organizing and Other Complex Adaptive Communication Networks: A Cognitive Agent-Based Computing Approach.

    Science.gov (United States)

    Laghari, Samreen; Niazi, Muaz A

    2016-01-01

    Computer Networks have a tendency to grow at an unprecedented scale. Modern networks involve not only computers but also a wide variety of other interconnected devices ranging from mobile phones to other household items fitted with sensors. This vision of the "Internet of Things" (IoT) implies an inherent difficulty in modeling problems. It is practically impossible to implement and test all scenarios for large-scale and complex adaptive communication networks as part of Complex Adaptive Communication Networks and Environments (CACOONS). The goal of this study is to explore the use of Agent-based Modeling as part of the Cognitive Agent-based Computing (CABC) framework to model a complex communication network problem. We use Exploratory Agent-based Modeling (EABM), as part of the CABC framework, to develop an autonomous multi-agent architecture for managing carbon footprint in a corporate network. To evaluate the application of complexity in practical scenarios, we have also introduced a company-defined computer usage policy. The conducted experiments demonstrated two important results. First, a CABC-based modeling approach such as agent-based modeling can be effective for modeling complex problems in the domain of IoT. Second, the specific problem of managing the carbon footprint can be solved using a multi-agent system approach.
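    A minimal exploratory agent-based sketch in this spirit, not the authors' EABM implementation: workstation agents either comply with an idle-shutdown usage policy or not, and the model tallies daily energy use. All parameters, states, and the policy rule are illustrative assumptions:

```python
# Toy agent-based model: each workstation agent draws power according
# to its state; compliant agents shut down when idle. Parameters are
# illustrative, not from the study.
import random

ACTIVE_W, IDLE_W, OFF_W = 120.0, 60.0, 2.0   # draw per state (watts)

class Workstation:
    def __init__(self, compliant: bool):
        self.compliant = compliant
    def draw(self) -> float:
        busy = random.random() < 0.4          # 40% chance of active use
        if busy:
            return ACTIVE_W
        return OFF_W if self.compliant else IDLE_W

def simulate(n_agents=500, hours=24, compliance=0.7, seed=1):
    random.seed(seed)
    agents = [Workstation(random.random() < compliance) for _ in range(n_agents)]
    watt_hours = sum(a.draw() for _ in range(hours) for a in agents)
    return watt_hours / 1000.0                # kWh

print(f"daily energy: {simulate():.0f} kWh with policy "
      f"vs {simulate(compliance=0.0):.0f} kWh without")
```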

  11. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources such as the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 TB. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  12. A detailed description of the short-term musculoskeletal and cognitive effects of prolonged standing for office computer work.

    Science.gov (United States)

    Baker, Richelle; Coenen, Pieter; Howie, Erin; Lee, Jeremy; Williamson, Ann; Straker, Leon

    2018-07-01

    Due to concerns about excessive sedentary exposure for office workers, alternate work positions such as standing are being trialled. However, prolonged standing may have health and productivity impacts, which this study assessed. Twenty adult participants undertook two hours of laboratory-based standing computer work to investigate changes in discomfort and cognitive function, along with muscle fatigue, movement, lower limb swelling and mental state. Over time, discomfort increased in all body areas (total body IRR [95% confidence interval]: 1.47 [1.36-1.59]). Sustained attention reaction time deteriorated (β = 18.25 [8.00-28.51]), while creative problem solving improved (β = 0.89 [0.29-1.49]). There was no change in erector spinae, rectus femoris, biceps femoris or tibialis anterior muscle fatigue; low back angle changed towards less lordosis, pelvis movement increased, lower limb swelling increased and ratings of mental state declined. Body discomfort was positively correlated with mental state. The observed changes suggest replacing office work sitting with standing should be done with caution. Practitioner Summary: Standing is being used to replace sitting by office workers; however, there are health risks associated with prolonged standing. In a laboratory study involving 2 h of prolonged standing, discomfort increased in all body areas, reaction time and mental state deteriorated, while creative problem-solving improved. Prolonged standing should be undertaken with caution.

  13. Short communication: Milk meal pattern of dairy calves is affected by computer-controlled milk feeder set-up

    DEFF Research Database (Denmark)

    Jensen, Margit Bak

    2009-01-01

    Half of the calves could ingest their milk allowance in a minimum of 2 daily milk portions, whereas the other half could ingest the milk in 4 or more daily portions. Data were collected during 3 successive 14-d periods, the first period starting the day after introduction to the feeder at minimum 12 d of age. High-fed calves ingested their milk in 4.0 and 4.9 meals for a minimum of 2 and 4 portions, respectively, whereas low-fed calves ingested their milk in 2.4 and 4.4 meals for a minimum of 2 and 4 portions, respectively. Calves on a high milk allowance had fewer milk meals over time, whereas calves on a low milk allowance had the same number of milk meals throughout... Thus, the development from small and frequent milk meals to fewer and larger meals reported by studies of natural suckling was also found among high-fed calves on a computer-controlled milk feeder. Irrespective of the minimum number of milk portions, the low-fed calves had more unrewarded visits...

  14. Short-distance expansion for the electromagnetic half-space Green's tensor: general results and an application to radiative lifetime computations

    International Nuclear Information System (INIS)

    Panasyuk, George Y; Schotland, John C; Markel, Vadim A

    2009-01-01

    We obtain a short-distance expansion for the half-space, frequency domain electromagnetic Green's tensor. The small parameter of the theory is ωε₁L/c, where ω is the frequency, ε₁ is the permittivity of the upper half-space, in which both the source and the point of observation are located, and which is assumed to be transparent, c is the speed of light in vacuum and L is a characteristic length, defined as the distance from the point of observation to the reflected (with respect to the planar interface) position of the source. In the case when the lower half-space (the substrate) is characterized by a complex permittivity ε₂, we compute the expansion to third order. For the case when the substrate is a transparent dielectric, we compute the imaginary part of the Green's tensor to seventh order. The analytical calculations are verified numerically. The practical utility of the obtained expansion is demonstrated by computing the radiative lifetime of two electromagnetically interacting molecules in the vicinity of a transparent dielectric substrate. The computation is performed in the strong interaction regime when the quasi-particle pole approximation is inapplicable. In this regime, the integral representation for the half-space Green's tensor is difficult to use while its electrostatic limiting expression is grossly inadequate. However, the analytical expansion derived in this paper can be used directly and efficiently. The results of this study are also relevant to nano-optics and near-field imaging, especially when tomographic image reconstruction is involved.

  15. Adaptation of the radiation dose for computed tomography of the body - background for the dose adaptation programme OmnimAs; Straaldosreglering vid kroppsdatortomografi - bakgrund till dosregleringsprogrammet OmnimAs

    Energy Technology Data Exchange (ETDEWEB)

    Nyman, Ulf; Kristiansson, Mattias [Trelleborg Hospital (Sweden); Leitz, Wolfram [Swedish Radiation Protection Authority, Stockholm (Sweden); Paahlstorp, Per-Aake [Siemens Medical Solutions, Solna (Sweden)

    2004-11-01

    When performing computed tomography examinations, the exposure factors are hardly ever adapted to the patient's size. One reason for that might be the lack of simple methods. This report describes the computer programme OmnimAs, which calculates how the exposure factors should be varied with the patient's perimeter (which can easily be measured with a measuring tape). The first approximation is to calculate the exposure values giving the same noise levels in the image irrespective of the patient's size. A clinical evaluation has shown that this relationship has to be modified. One chapter describes the physical background behind the programme. Results calculated with OmnimAs are in good agreement with a number of published studies. Clinical experience shows the usability of OmnimAs. Finally, the correlation between several parameters and image quality/dose is discussed, along with how this correlation can be exploited to optimise CT examinations.
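    A sketch of the constant-noise first approximation: because X-ray attenuation grows roughly exponentially with patient size, tube load must grow roughly exponentially with the measured perimeter to hold image noise constant. The reference values and the doubling constant below are assumptions for illustration, not OmnimAs's fitted parameters:

```python
# Constant-noise first approximation: mAs grows roughly exponentially
# with patient perimeter. REF_* and DOUBLING_CM are assumed values,
# not the programme's calibration.
REF_PERIMETER_CM = 90.0   # perimeter at which REF_MAS applies (assumed)
REF_MAS = 150.0           # reference tube load (assumed)
DOUBLING_CM = 8.0         # perimeter increase that doubles mAs (assumed)

def suggested_mas(perimeter_cm: float) -> float:
    return REF_MAS * 2.0 ** ((perimeter_cm - REF_PERIMETER_CM) / DOUBLING_CM)

for p in (80, 90, 100, 110):
    print(f"perimeter {p} cm -> {suggested_mas(p):6.0f} mAs")
```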

  16. Online adaptation of a c-VEP Brain-Computer Interface (BCI) based on error-related potentials and unsupervised learning.

    Science.gov (United States)

    Spüler, Martin; Rosenstiel, Wolfgang; Bogdan, Martin

    2012-01-01

    The goal of a Brain-Computer Interface (BCI) is to control a computer by pure brain activity. Recently, BCIs based on code-modulated visual evoked potentials (c-VEPs) have shown great potential to establish high-performance communication. In this paper we present a c-VEP BCI that uses online adaptation of the classifier to reduce calibration time and increase performance. We compare two different approaches for online adaptation of the system: an unsupervised method and a method that uses the detection of error-related potentials. Both approaches were tested in an online study, in which an average accuracy of 96% was achieved with adaptation based on error-related potentials. This accuracy corresponds to an average information transfer rate of 144 bit/min, which is the highest bitrate reported so far for a non-invasive BCI. In a free-spelling mode, the subjects were able to write with an average of 21.3 error-free letters per minute, which shows the feasibility of the BCI system in a normal-use scenario. In addition we show that a calibration of the BCI system solely based on the detection of error-related potentials is possible, without knowing the true class labels.
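    A hedged sketch of the unsupervised-adaptation idea: after each trial, nudge the template of the predicted class toward the observed feature vector, and skip the update when an error-related potential is detected. This generic running-mean update stands in for, and is not, the authors' c-VEP classifier:

```python
# Generic online template adaptation for a template-matching BCI.
# The correlation-style scoring and the learning rate are illustrative.
import numpy as np

class AdaptiveTemplateClassifier:
    def __init__(self, templates: np.ndarray, eta: float = 0.05):
        self.templates = templates.astype(float)   # (n_classes, n_features)
        self.eta = eta
    def predict(self, x: np.ndarray) -> int:
        scores = self.templates @ x                # match trial to templates
        return int(np.argmax(scores))
    def update(self, x: np.ndarray, label: int, error_detected: bool = False):
        if error_detected:
            return                                 # skip doubtful trials
        t = self.templates[label]
        self.templates[label] = (1 - self.eta) * t + self.eta * x
```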

  17. A dynamically adaptive wavelet approach to stochastic computations based on polynomial chaos - capturing all scales of random modes on independent grids

    International Nuclear Information System (INIS)

    Ren Xiaoan; Wu Wenquan; Xanthis, Leonidas S.

    2011-01-01

    Highlights: → New approach for stochastic computations based on polynomial chaos. → Development of dynamically adaptive wavelet multiscale solver using space refinement. → Accurate capture of steep gradients and multiscale features in stochastic problems. → All scales of each random mode are captured on independent grids. → Numerical examples demonstrate the need for different space resolutions per mode. - Abstract: In stochastic computations, or uncertainty quantification methods, the spectral approach based on the polynomial chaos expansion in random space leads to a coupled system of deterministic equations for the coefficients of the expansion. The size of this system increases drastically when the number of independent random variables and/or order of polynomial chaos expansions increases. This is invariably the case for large scale simulations and/or problems involving steep gradients and other multiscale features; such features are variously reflected on each solution component or random/uncertainty mode requiring the development of adaptive methods for their accurate resolution. In this paper we propose a new approach for treating such problems based on a dynamically adaptive wavelet methodology involving space-refinement on physical space that allows all scales of each solution component to be refined independently of the rest. We exemplify this using the convection-diffusion model with random input data and present three numerical examples demonstrating the salient features of the proposed method. Thus we establish a new, elegant and flexible approach for stochastic problems with steep gradients and multiscale features based on polynomial chaos expansions.
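    For orientation, a minimal polynomial chaos projection for a single standard normal variable, using probabilists' Hermite polynomials and Gauss-Hermite quadrature; the target function is illustrative, and the paper's adaptive wavelet refinement is not reproduced:

```python
# Project u(xi), xi ~ N(0,1), onto probabilists' Hermite polynomials
# He_k via Gauss-HermiteE quadrature. A textbook sketch, not the
# paper's adaptive solver.
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval

def pce_coefficients(u, order: int, nquad: int = 40):
    xi, w = hermegauss(nquad)                 # nodes/weights, weight e^(-x^2/2)
    w = w / w.sum()                           # normalize to a probability
    coeffs = []
    for k in range(order + 1):
        basis = hermeval(xi, [0] * k + [1])   # He_k(xi)
        norm = np.sum(w * basis * basis)      # E[He_k^2] = k!
        coeffs.append(np.sum(w * u(xi) * basis) / norm)
    return np.array(coeffs)

print(pce_coefficients(lambda x: np.exp(0.3 * x), order=4))
```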

  18. COMPUTING

    CERN Multimedia

    M. Kasemann, P. McBride; edited by M-C. Sawley with contributions from P. Kreuzer, D. Bonacorsi, S. Belforte, F. Wuerthwein, L. Bauerdick, K. Lassila-Perini and M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  19. Effects of a manualized short-term treatment of internet and computer game addiction (STICA): study protocol for a randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Jäger Susanne

    2012-04-01

    Full Text Available Background In the last few years, excessive internet use and computer gaming have increased dramatically. Salience, mood modification, tolerance, withdrawal symptoms, conflict, and relapse have been defined as diagnostic criteria for internet addiction (IA) and computer addiction (CA) in the scientific community. Despite a growing number of individuals seeking help, there are no specific treatments of established efficacy. Methods/design This clinical trial aims to determine the effect of the disorder-specific manualized short-term treatment of IA/CA (STICA). The cognitive behavioural treatment combines individual and group interventions with a total duration of 4 months. Patients will be randomly assigned to STICA treatment or to a wait list control group. Reliable and valid measures of IA/CA and co-morbid mental symptoms (for example, social anxiety, depression) will be assessed prior to the beginning, in the middle, at the end, and 6 months after completion of treatment. Discussion A treatment of IA/CA will establish efficacy and is desperately needed. As this is the first trial to determine efficacy of a disorder-specific treatment, a wait list control group will be implemented. Pros and cons of the design were discussed. Trial Registration ClinicalTrials (NCT01434589)

  20. Effects of a manualized short-term treatment of internet and computer game addiction (STICA): study protocol for a randomized controlled trial

    Science.gov (United States)

    2012-01-01

    Background In the last few years, excessive internet use and computer gaming have increased dramatically. Salience, mood modification, tolerance, withdrawal symptoms, conflict, and relapse have been defined as diagnostic criteria for internet addiction (IA) and computer addiction (CA) in the scientific community. Despite a growing number of individuals seeking help, there are no specific treatments of established efficacy. Methods/design This clinical trial aims to determine the effect of the disorder-specific manualized short-term treatment of IA/CA (STICA). The cognitive behavioural treatment combines individual and group interventions with a total duration of 4 months. Patients will be randomly assigned to STICA treatment or to a wait list control group. Reliable and valid measures of IA/CA and co-morbid mental symptoms (for example social anxiety, depression) will be assessed prior to the beginning, in the middle, at the end, and 6 months after completion of treatment. Discussion A treatment of IA/CA will establish efficacy and is desperately needed. As this is the first trial to determine efficacy of a disorder specific treatment, a wait list control group will be implemented. Pros and cons of the design were discussed. Trial Registration ClinicalTrials (NCT01434589) PMID:22540330

  1. Effects of a manualized short-term treatment of internet and computer game addiction (STICA): study protocol for a randomized controlled trial.

    Science.gov (United States)

    Jäger, Susanne; Müller, Kai W; Ruckes, Christian; Wittig, Tobias; Batra, Anil; Musalek, Michael; Mann, Karl; Wölfling, Klaus; Beutel, Manfred E

    2012-04-27

    In the last few years, excessive internet use and computer gaming have increased dramatically. Salience, mood modification, tolerance, withdrawal symptoms, conflict, and relapse have been defined as diagnostic criteria for internet addiction (IA) and computer addiction (CA) in the scientific community. Despite a growing number of individuals seeking help, there are no specific treatments of established efficacy. This clinical trial aims to determine the effect of the disorder-specific manualized short-term treatment of IA/CA (STICA). The cognitive behavioural treatment combines individual and group interventions with a total duration of 4 months. Patients will be randomly assigned to STICA treatment or to a wait list control group. Reliable and valid measures of IA/CA and co-morbid mental symptoms (for example social anxiety, depression) will be assessed prior to the beginning, in the middle, at the end, and 6 months after completion of treatment. A treatment of IA/CA will establish efficacy and is desperately needed. As this is the first trial to determine efficacy of a disorder specific treatment, a wait list control group will be implemented. Pros and cons of the design were discussed. ClinicalTrials (NCT01434589).

  2. Intestinal adaptation is stimulated by partial enteral nutrition supplemented with the prebiotic short-chain fructooligosaccharide in a neonatal intestinal failure piglet model

    DEFF Research Database (Denmark)

    Barnes, Jennifer L; Hartmann, Bolette; Holst, Jens Juul

    2012-01-01

    Butyrate has been shown to stimulate intestinal adaptation when added to parenteral nutrition (PN) following small bowel resection but is not available in current PN formulations. The authors hypothesized that pre- and probiotic administration may be a clinically feasible method to administer but...

  3. Pipelining Computational Stages of the Tomographic Reconstructor for Multi-Object Adaptive Optics on a Multi-GPU System

    KAUST Repository

    Charara, Ali; Ltaief, Hatem; Gratadour, Damien; Keyes, David E.; Sevin, Arnaud; Abdelfattah, Ahmad; Gendron, Eric; Morel, Carine; Vidal, Fabrice

    2014-01-01

    The European Extremely Large Telescope (E-ELT) is a high-priority project in ground-based astronomy that aims at constructing the largest telescope ever built. MOSAIC is an instrument proposed for the E-ELT that uses the Multi-Object Adaptive Optics (MOAO) technique, which compensates for the effects of atmospheric turbulence on image quality and operates on patches across a large FoV.

  4. Pipelining Computational Stages of the Tomographic Reconstructor for Multi-Object Adaptive Optics on a Multi-GPU System

    KAUST Repository

    Charara, Ali

    2014-05-04

    The European Extremely Large Telescope (E-ELT) is a high-priority project in ground-based astronomy that aims at constructing the largest telescope ever built. MOSAIC is an instrument proposed for the E-ELT that uses the Multi-Object Adaptive Optics (MOAO) technique, which compensates for the effects of atmospheric turbulence on image quality and operates on patches across a large FoV.

  5. Cross-cultural development of an item list for computer-adaptive testing of fatigue in oncological patients

    DEFF Research Database (Denmark)

    Giesinger, Johannes M.; Petersen, Morten Aa.; Grønvold, Mogens

    2011-01-01

    Within an ongoing project of the EORTC Quality of Life Group, we are developing computerized adaptive test (CAT) measures for the QLQ-C30 scales. These new CAT measures are conceptualised to reflect the same constructs as the QLQ-C30 scales. Accordingly, the Fatigue-CAT is intended to capture physical and general fatigue.

  6. Computer adaptive practice of Maths ability using a new item response model for on the fly ability and difficulty estimation

    NARCIS (Netherlands)

    Klinkenberg, S.; Straatemeier, M.; van der Maas, H.L.J.

    2011-01-01

    In this paper we present a model for computerized adaptive practice and monitoring. This model is used in the Maths Garden, a web-based monitoring system, which includes a challenging web environment for children to practice arithmetic. Using a new item response model based on the Elo (1978) rating
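    A minimal sketch of an Elo-style on-the-fly update of child ability and item difficulty, in the spirit of the model described (the record above is truncated); the K-factor, starting ratings, and logistic link are illustrative choices, not the system's calibrated values:

```python
# Elo-style joint estimation: each answer pulls the child's ability and
# the item's difficulty in opposite directions. Parameters are illustrative.
import math

def expected_correct(ability: float, difficulty: float) -> float:
    return 1.0 / (1.0 + math.exp(difficulty - ability))   # logistic link

def elo_update(ability, difficulty, correct, k=0.3):
    p = expected_correct(ability, difficulty)
    delta = k * ((1.0 if correct else 0.0) - p)
    return ability + delta, difficulty - delta

theta, beta = 0.0, 0.0          # start both ratings at zero
for answer in [True, True, False, True]:
    theta, beta = elo_update(theta, beta, answer)
print(f"ability={theta:+.2f}, difficulty={beta:+.2f}")
```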

  7. An efficient Adaptive Mesh Refinement (AMR) algorithm for the Discontinuous Galerkin method: Applications for the computation of compressible two-phase flows

    Science.gov (United States)

    Papoutsakis, Andreas; Sazhin, Sergei S.; Begg, Steven; Danaila, Ionut; Luddens, Francky

    2018-06-01

    We present an Adaptive Mesh Refinement (AMR) method suitable for hybrid unstructured meshes that allows for local refinement and de-refinement of the computational grid during the evolution of the flow. The adaptive implementation of the Discontinuous Galerkin (DG) method introduced in this work (ForestDG) is based on a topological representation of the computational mesh by a hierarchical structure consisting of oct-, quad- and binary trees. Adaptive mesh refinement (h-refinement) enables us to increase the spatial resolution of the computational mesh in the vicinity of the points of interest such as interfaces, geometrical features, or flow discontinuities. The local increase in the expansion order (p-refinement) at areas of high strain rates or vorticity magnitude results in an increase of the order of accuracy in the region of shear layers and vortices. A graph of unitarian-trees, representing hexahedral, prismatic and tetrahedral elements is used for the representation of the initial domain. The ancestral elements of the mesh can be split into self-similar elements allowing each tree to grow branches to an arbitrary level of refinement. The connectivity of the elements, their genealogy and their partitioning are described by linked lists of pointers. An explicit calculation of these relations, presented in this paper, facilitates the on-the-fly splitting, merging and repartitioning of the computational mesh by rearranging the links of each node of the tree with a minimal computational overhead. The modal basis used in the DG implementation facilitates the mapping of the fluxes across the non-conformal faces. The AMR methodology is presented and assessed using a series of inviscid and viscous test cases. Also, the AMR methodology is used for the modelling of the interaction between droplets and the carrier phase in a two-phase flow. This approach is applied to the analysis of a spray injected into a chamber of quiescent air, using the Eulerian ...
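    A toy version of the tree-based h-refinement bookkeeping: a quadtree whose cells split into self-similar children when a refinement indicator flags them. This shows the data structure only, not the DG solver or the paper's hybrid oct-/quad-/binary-tree forest; the indicator and depth limit are illustrative:

```python
# Minimal quadtree h-refinement: cells split into four self-similar
# children when a user-supplied indicator flags them. Illustrative only.
class QuadCell:
    def __init__(self, x, y, size, level=0):
        self.x, self.y, self.size, self.level = x, y, size, level
        self.children = []
    def refine(self, needs_refinement, max_level=4):
        if self.level >= max_level or not needs_refinement(self):
            return
        h = self.size / 2.0
        self.children = [QuadCell(self.x + dx, self.y + dy, h, self.level + 1)
                         for dx in (0.0, h) for dy in (0.0, h)]
        for c in self.children:
            c.refine(needs_refinement, max_level)
    def leaves(self):
        if not self.children:
            yield self
        for c in self.children:
            yield from c.leaves()

# Refine toward a mock "shear layer" along the line y = 0.5
root = QuadCell(0.0, 0.0, 1.0)
root.refine(lambda c: abs(c.y + c.size / 2 - 0.5) < c.size)
print(sum(1 for _ in root.leaves()), "leaf cells")
```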

  8. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  9. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  10. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  11. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing Readiness Challenges (CCRC’08), the CMS computing team achieved very good results. The link between the detector site and the Tier-0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier-0, processing a massive number of very large files and writing to tape at high speed.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier-1s: response time, data transfer rate and success rate for Tape-to-Buffer staging of files kept exclusively on tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations laid the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  12. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  13. Comparison of Rigid and Adaptive Methods of Propagating Gross Tumor Volume Through Respiratory Phases of Four-Dimensional Computed Tomography Image Data Set

    International Nuclear Information System (INIS)

    Ezhil, Muthuveni; Choi, Bum; Starkschall, George; Bucci, M. Kara; Vedam, Sastry; Balter, Peter

    2008-01-01

    Purpose: To compare three different methods of propagating the gross tumor volume (GTV) through the respiratory phases that constitute a four-dimensional computed tomography image data set. Methods and Materials: Four-dimensional computed tomography data sets of 20 patients who had undergone definitive hypofractionated radiotherapy to the lung were acquired. The GTV regions of interest (ROIs) were manually delineated on each phase of the four-dimensional computed tomography data set. The ROI from the end-expiration phase was propagated to the remaining nine phases of respiration using the following three techniques: (1) rigid-image registration using in-house software, (2) rigid-image registration using research software from a commercial radiotherapy planning system vendor, and (3) rigid-image registration followed by deformable adaptation originally intended for organ-at-risk delineation using the same software. The internal GTVs generated from the various propagation methods were compared with the manual internal GTV using the normalized Dice similarity coefficient (DSC) index. Results: The normalized DSC index of 1.01 ± 0.06 (SD) for rigid propagation using the in-house software program was identical to the normalized DSC index of 1.01 ± 0.06 for rigid propagation achieved with the vendor's research software. Adaptive propagation yielded poorer results, with a normalized DSC index of 0.89 ± 0.10 (paired t-test, p < 0.001). Conclusion: Propagation of the GTV ROIs through the respiratory phases using rigid-body registration is an acceptable method within a 1-mm margin of uncertainty. The adaptive organ-at-risk propagation method was not applicable to propagating GTV ROIs, resulting in an unacceptable reduction of the volume and distortion of the ROIs.
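    The underlying comparison metric is the Dice similarity coefficient, DSC = 2|A∩B|/(|A|+|B|); the study's "normalized DSC index" is a study-specific rescaling not reproduced here. A plain DSC on boolean voxel masks, with invented masks for illustration:

```python
# Dice similarity coefficient between two contours on boolean masks.
# The masks below are toy 2D examples, not patient data.
import numpy as np

def dice(a: np.ndarray, b: np.ndarray) -> float:
    a, b = a.astype(bool), b.astype(bool)
    inter = np.logical_and(a, b).sum()
    return 2.0 * inter / (a.sum() + b.sum())

auto   = np.zeros((10, 10), bool); auto[2:7, 2:7] = True
manual = np.zeros((10, 10), bool); manual[3:8, 3:8] = True
print(f"DSC = {dice(auto, manual):.2f}")   # overlap of two 5x5 squares
```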

  14. ADAPTATION OF JOHNSON SEQUENCING ALGORITHM FOR JOB SCHEDULING TO MINIMISE THE AVERAGE WAITING TIME IN CLOUD COMPUTING ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    SOUVIK PAL

    2016-09-01

    Full Text Available Cloud computing is an emerging paradigm of Internet-centric business computing where Cloud Service Providers (CSPs) provide services to customers according to their needs. The key concept behind cloud computing is on-demand sharing of the resources available in the resource pool provided by the CSP, which implies a new emerging business model. The resources are provisioned when jobs arrive. Job scheduling and the minimization of waiting time are challenging issues in cloud computing. When a large number of jobs are requested, they have to wait to be allocated to servers, which in turn may increase the queue length and the waiting time. This paper presents a system design based on the Johnson sequencing algorithm, which provides the optimal sequence. With that sequence, service times can be obtained. The waiting time and queue length can then be reduced using a multi-server, finite-capacity queuing model, which improves the job scheduling model.
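    The classical rule the paper adapts works on two stages: sort jobs by their smaller stage time, placing a job as early as possible when its first-stage time is the smaller and as late as possible otherwise; this minimises makespan on a two-machine flow shop. The service times below are illustrative:

```python
# Johnson's rule for a two-stage flow shop. Jobs with a shorter first
# stage go to the front, jobs with a shorter second stage to the back.
def johnson_sequence(jobs):
    """jobs: list of (name, time_stage1, time_stage2)."""
    front, back = [], []
    for name, t1, t2 in sorted(jobs, key=lambda j: min(j[1], j[2])):
        if t1 <= t2:
            front.append(name)    # schedule as early as possible
        else:
            back.insert(0, name)  # schedule as late as possible
    return front + back

jobs = [("J1", 3, 6), ("J2", 5, 2), ("J3", 1, 2), ("J4", 6, 6), ("J5", 7, 5)]
print(johnson_sequence(jobs))     # -> ['J3', 'J1', 'J4', 'J5', 'J2']
```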

  15. Adaptive Multilevel Monte Carlo Simulation

    KAUST Repository

    Hoel, H; von Schwerin, E; Szepessy, A; Tempone, Raul

    2011-01-01

    ... An adaptive algorithm for ordinary, stochastic and partial differential equations (in Recent Advances in Adaptive Computation, volume 383 of Contemp. Math., pages 325–343, Amer. Math. Soc., Providence, RI, 2005). This form of the adaptive algorithm generates...
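    Since the record above is truncated, a minimal (non-adaptive) multilevel Monte Carlo estimator is sketched for context: levels telescope coarse and fine Euler-Maruyama discretisations that share the same Brownian increments. The model, parameters, and per-level sample counts are illustrative, and the adaptive sample allocation of the cited algorithm is not shown:

```python
# Minimal MLMC estimator for E[X_T] of geometric Brownian motion.
# Illustrative parameters; not the adaptive algorithm of the paper.
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, x0, T = 0.05, 0.2, 1.0, 1.0

def euler_pair(level: int, n_paths: int):
    """X_T on level l (fine) and l-1 (coarse) with shared noise."""
    n_fine = 2 ** level
    dt = T / n_fine
    dw = rng.normal(0.0, np.sqrt(dt), (n_paths, n_fine))
    xf = np.full(n_paths, x0)
    for k in range(n_fine):
        xf = xf * (1 + mu * dt + sigma * dw[:, k])
    if level == 0:
        return xf, np.zeros(n_paths)
    xc = np.full(n_paths, x0)
    dwc = dw[:, 0::2] + dw[:, 1::2]           # coarse increments
    for k in range(n_fine // 2):
        xc = xc * (1 + mu * 2 * dt + sigma * dwc[:, k])
    return xf, xc

# Telescoping sum: E[X_L] = E[X_0] + sum_l E[X_l - X_{l-1}]
estimate = 0.0
for level, n in enumerate([100_000, 20_000, 5_000, 1_250]):
    fine, coarse = euler_pair(level, n)
    estimate += np.mean(fine - coarse)
print(f"MLMC estimate: {estimate:.4f}  (exact: {x0 * np.exp(mu * T):.4f})")
```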

  16. Adaptive Laboratory Evolution Of Escherichia Coli Reveals Arduous Resistance Development To A Combination Of Three Novel Antimicrobial Compounds And To The Short Amp P9-4

    DEFF Research Database (Denmark)

    Citterio, Linda; Franzyk, Henrik; Gram, Lone

    2015-01-01

    Antimicrobial peptides (AMPs) were long considered promising new antimicrobials since resistance was not expected. However, adaptive evolution experiments have demonstrated that bacteria may indeed develop resistance to AMPs as well. Nevertheless, we and others hypothesize that the risk of resistance development decreases when two or more compounds are combined, as compared to single-drug treatments. The purpose of this study was to determine whether resistance could develop in Escherichia coli ATCC 25922 to the peptidomimetic HF-1002 2 and the AMPs novicidin and P9-4. The mentioned compounds were... adaptation to 32 x MIC. This shows that resistance to novicidin and HF-1002 2, administered alone, developed more easily than in lineages exposed to the combination of three drugs. This result further supports combinatorial treatment as a way to circumvent resistance development. Surprisingly...

  17. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  18. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  19. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort was focused during the last period into the preparation and monitoring of the February tests of Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads, and identifying possible bottlenecks. Many tests are scheduled together with other VOs, allowing the full scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  20. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns led by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  1. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  2. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing operations have been running at a lower level as the Run 1 samples are completed and smaller samples for upgrades and preparations ramp up. Much of the computing activity is focusing on preparations for Run 2 and on improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013, with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand, with a peak of over 750 million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week, with peaks close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months   Tape utilisation was a focus for the operations teams, with frequent deletion campaigns moving deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  3. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, and we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  4. Projected Applications of a "Climate in a Box" Computing System at the NASA Short-term Prediction Research and Transition (SPoRT) Center

    Science.gov (United States)

    Jedlovec, G.; Molthan, A.; Zavodsky, B.; Case, J.; Lafontaine, F.

    2010-12-01

    The NASA Short-term Prediction Research and Transition (SPoRT) Center focuses on the transition of unique observations and research capabilities to the operational weather community, with a goal of improving short-term forecasts on a regional scale. Advances in research computing have led to “Climate in a Box” systems, with hardware configurations capable of producing high resolution, near real-time weather forecasts, but with footprints, power, and cooling requirements that are comparable to desktop systems. The SPoRT Center has developed several capabilities for incorporating unique NASA research capabilities and observations with real-time weather forecasts. Planned utilization includes the development of a fully-cycled data assimilation system used to drive 36-48 hour forecasts produced by the NASA Unified version of the Weather Research and Forecasting (WRF) model (NU-WRF). The horsepower provided by the “Climate in a Box” system is expected to facilitate the assimilation of vertical profiles of temperature and moisture provided by the Atmospheric Infrared Sounder (AIRS) aboard the NASA Aqua satellite. In addition, the Moderate Resolution Imaging Spectroradiometer (MODIS) instruments aboard NASA’s Aqua and Terra satellites provide high-resolution sea surface temperatures and vegetation characteristics. The development of MODIS normalized difference vegetation index (NDVI) composites for use within the NASA Land Information System (LIS) will assist in the characterization of vegetation, and subsequently the surface albedo and processes related to soil moisture. Through application of satellite simulators, NASA satellite instruments can be used to examine forecast model errors in cloud cover and other characteristics. Through the aforementioned application of the “Climate in a Box” system and NU-WRF capabilities, an end goal is the establishment of a real-time forecast system that fully integrates modeling and analysis capabilities developed...
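
    As a reminder of what the NDVI composites mentioned above are built from (the band-ratio formula is standard remote-sensing practice, not spelled out in this record), here is a minimal Python sketch; the reflectance values in the example are hypothetical:

        import numpy as np

        def ndvi(nir, red):
            # Standard normalized difference vegetation index from NIR and red reflectance.
            nir, red = np.asarray(nir, dtype=float), np.asarray(red, dtype=float)
            return (nir - red) / (nir + red + 1e-12)   # epsilon guards against 0/0

        # A healthy-vegetation pixel reflects strongly in NIR and weakly in red:
        print(ndvi(0.50, 0.08))   # -> ~0.72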

  5. Projected Applications of a "Climate in a Box" Computing System at the NASA Short-Term Prediction Research and Transition (SPoRT) Center

    Science.gov (United States)

    Jedlovec, Gary J.; Molthan, Andrew L.; Zavodsky, Bradley; Case, Jonathan L.; LaFontaine, Frank J.

    2010-01-01

    The NASA Short-term Prediction Research and Transition (SPoRT) Center focuses on the transition of unique observations and research capabilities to the operational weather community, with a goal of improving short-term forecasts on a regional scale. Advances in research computing have led to "Climate in a Box" systems, with hardware configurations capable of producing high resolution, near real-time weather forecasts, but with footprints, power, and cooling requirements that are comparable to desktop systems. The SPoRT Center has developed several capabilities for incorporating unique NASA research capabilities and observations with real-time weather forecasts. Planned utilization includes the development of a fully-cycled data assimilation system used to drive 36-48 hour forecasts produced by the NASA Unified version of the Weather Research and Forecasting (WRF) model (NU-WRF). The horsepower provided by the "Climate in a Box" system is expected to facilitate the assimilation of vertical profiles of temperature and moisture provided by the Atmospheric Infrared Sounder (AIRS) aboard the NASA Aqua satellite. In addition, the Moderate Resolution Imaging Spectroradiometer (MODIS) instruments aboard NASA's Aqua and Terra satellites provide high-resolution sea surface temperatures and vegetation characteristics. The development of MODIS normalized difference vegetation index (NDVI) composites for use within the NASA Land Information System (LIS) will assist in the characterization of vegetation, and subsequently the surface albedo and processes related to soil moisture. Through application of satellite simulators, NASA satellite instruments can be used to examine forecast model errors in cloud cover and other characteristics. Through the aforementioned application of the "Climate in a Box" system and NU-WRF capabilities, an end goal is the establishment of a real-time forecast system that fully integrates modeling and analysis capabilities developed within the NASA SPoRT...

  6. Short and Long Term Effects of High-Intensity Interval Training on Hormones, Metabolites, Antioxidant System, Glycogen Concentration, and Aerobic Performance Adaptations in Rats

    OpenAIRE

    de Araujo, Gustavo G.; Papoti, Marcelo; dos Reis, Ivan Gustavo Masselli; de Mello, Maria A. R.; Gobatto, Claudio A.

    2016-01-01

    The purpose of the study was to investigate the effects of short and long term High-Intensity Interval Training (HIIT) on anaerobic and aerobic performance, creatinine, uric acid, urea, creatine kinase, lactate dehydrogenase, catalase, superoxide dismutase, testosterone, corticosterone, and glycogen concentration (liver, soleus, and gastrocnemius). The Wistar rats were separated in two groups: HIIT and sedentary/control (CT). The lactate minimum (LM) was used to evaluate the aerobic and anaer...

  7. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for the WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. A GlideInWMS installation is now also deployed at CERN, in addition to the GlideInWMS factory placed in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  8. Using an adaptive expertise lens to understand the quality of teachers' classroom implementation of computer-supported complex systems curricula in high school science

    Science.gov (United States)

    Yoon, Susan A.; Koehler-Yom, Jessica; Anderson, Emma; Lin, Joyce; Klopfer, Eric

    2015-05-01

    Background: This exploratory study is part of a larger-scale research project aimed at building theoretical and practical knowledge of complex systems in students and teachers with the goal of improving high school biology learning through professional development and a classroom intervention. Purpose: We propose a model of adaptive expertise to better understand teachers' classroom practices as they attempt to navigate myriad variables in the implementation of biology units that include working with computer simulations, and learning about and teaching through complex systems ideas. Sample: Research participants were three high school biology teachers, two females and one male, ranging in teaching experience from six to 16 years. Their teaching contexts also ranged in student achievement from 14-47% advanced science proficiency. Design and methods: We used a holistic multiple case study methodology and collected data during the 2011-2012 school year. Data sources include classroom observations, teacher and student surveys, and interviews. Data analyses and trustworthiness measures were conducted through qualitative mining of data sources and triangulation of findings. Results: We illustrate the characteristics of adaptive expertise of more or less successful teaching and learning when implementing complex systems curricula. We also demonstrate differences between case study teachers in terms of particular variables associated with adaptive expertise. Conclusions: This research contributes to scholarship on practices and professional development needed to better support teachers to teach through a complex systems pedagogical and curricular approach.

  9. Radiation dose considerations by intra-individual Monte Carlo simulations in dual source spiral coronary computed tomography angiography with electrocardiogram-triggered tube current modulation and adaptive pitch

    Energy Technology Data Exchange (ETDEWEB)

    May, Matthias S.; Kuettner, Axel; Lell, Michael M.; Wuest, Wolfgang; Scharf, Michael; Uder, Michael [University of Erlangen, Department of Radiology, Erlangen (Germany); Deak, Paul; Kalender, Willi A. [University of Erlangen, Department of Medical Physics, Erlangen (Germany); Keller, Andrea K.; Haeberle, Lothar [University of Erlangen, Department of Medical Informatics, Biometry and Epidemiology, Erlangen (Germany); Achenbach, Stephan; Seltmann, Martin [University of Erlangen, Department of Cardiology, Erlangen (Germany)

    2012-03-15

    To evaluate radiation dose levels in patients undergoing spiral coronary computed tomography angiography (CTA) on a dual-source system in clinical routine. Coronary CTA was performed for 56 patients with electrocardiogram-triggered tube current modulation (TCM) and heart-rate (HR) dependent pitch adaptation. Individual Monte Carlo (MC) simulations were performed for dose assessment. Retrospective simulations with constant tube current (CTC) served as reference. Lung tissue was segmented and used for organ and effective dose (ED) calculation. The estimated mean relative ED was 7.1 ± 2.1 mSv/100 mAs for TCM and 12.5 ± 5.3 mSv/100 mAs for CTC (P < 0.001). Relative dose reduction at low HR (≤60 bpm) was highest (49 ± 5%) compared to intermediate (60-70 bpm, 33 ± 12%) and high HR (>70 bpm, 29 ± 12%). However, the lowest ED is achieved at high HR (5.2 ± 1.5 mSv/100 mAs), compared with intermediate (6.7 ± 1.6 mSv/100 mAs) and low (8.3 ± 2.1 mSv/100 mAs) HR when automated pitch adaptation is applied. Radiation dose savings of up to 52% are achievable by TCM at low and regular HR. However, the lowest ED is attained at high HR by pitch adaptation, despite inferior radiation dose reduction by TCM. • Monte Carlo simulations allow for individual radiation dose calculations. (orig.)

  10. Radiation dose considerations by intra-individual Monte Carlo simulations in dual source spiral coronary computed tomography angiography with electrocardiogram-triggered tube current modulation and adaptive pitch

    International Nuclear Information System (INIS)

    May, Matthias S.; Kuettner, Axel; Lell, Michael M.; Wuest, Wolfgang; Scharf, Michael; Uder, Michael; Deak, Paul; Kalender, Willi A.; Keller, Andrea K.; Haeberle, Lothar; Achenbach, Stephan; Seltmann, Martin

    2012-01-01

    To evaluate radiation dose levels in patients undergoing spiral coronary computed tomography angiography (CTA) on a dual-source system in clinical routine. Coronary CTA was performed for 56 patients with electrocardiogram-triggered tube current modulation (TCM) and heart-rate (HR) dependent pitch adaptation. Individual Monte Carlo (MC) simulations were performed for dose assessment. Retrospective simulations with constant tube current (CTC) served as reference. Lung tissue was segmented and used for organ and effective dose (ED) calculation. The estimated mean relative ED was 7.1 ± 2.1 mSv/100 mAs for TCM and 12.5 ± 5.3 mSv/100 mAs for CTC (P < 0.001). Relative dose reduction at low HR (≤60 bpm) was highest (49 ± 5%) compared to intermediate (60-70 bpm, 33 ± 12%) and high HR (>70 bpm, 29 ± 12%). However, the lowest ED is achieved at high HR (5.2 ± 1.5 mSv/100 mAs), compared with intermediate (6.7 ± 1.6 mSv/100 mAs) and low (8.3 ± 2.1 mSv/100 mAs) HR when automated pitch adaptation is applied. Radiation dose savings of up to 52% are achievable by TCM at low and regular HR. However, the lowest ED is attained at high HR by pitch adaptation, despite inferior radiation dose reduction by TCM. • Monte Carlo simulations allow for individual radiation dose calculations. (orig.)

  11. Computationally efficient video restoration for Nyquist sampled imaging sensors combining an affine-motion-based temporal Kalman filter and adaptive Wiener filter.

    Science.gov (United States)

    Rucci, Michael; Hardie, Russell C; Barnard, Kenneth J

    2014-05-01

    In this paper, we present a computationally efficient video restoration algorithm to address both blur and noise for a Nyquist sampled imaging system. The proposed method utilizes a temporal Kalman filter followed by a correlation-model based spatial adaptive Wiener filter (AWF). The Kalman filter employs an affine background motion model and novel process-noise variance estimate. We also propose and demonstrate a new multidelay temporal Kalman filter designed to more robustly treat local motion. The AWF is a spatial operation that performs deconvolution and adapts to the spatially varying residual noise left in the Kalman filter stage. In image areas where the temporal Kalman filter is able to provide significant noise reduction, the AWF can be aggressive in its deconvolution. In other areas, where less noise reduction is achieved with the Kalman filter, the AWF balances the deconvolution with spatial noise reduction. In this way, the Kalman filter and AWF work together effectively, but without the computational burden of full joint spatiotemporal processing. We also propose a novel hybrid system that combines a temporal Kalman filter and BM3D processing. To illustrate the efficacy of the proposed methods, we test the algorithms on both simulated imagery and video collected with a visible camera.
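
    The two-stage pipeline described above (temporal Kalman filtering followed by a spatial adaptive Wiener filter) can be sketched as follows. This is a simplified stand-in, not the authors' implementation: the affine background-motion model and the correlation-model AWF are replaced by a static-scene per-pixel Kalman filter and scipy's local-statistics Wiener filter, and the noise parameters are hypothetical.

        import numpy as np
        from scipy.signal import wiener   # local-statistics adaptive Wiener filter

        def temporal_kalman(frames, q=1e-3, r=1e-2):
            # Scalar per-pixel temporal Kalman filter with an identity state model
            # (a static-scene simplification; the paper additionally registers
            # frames with an affine background-motion model before this update).
            est = frames[0].astype(float)
            p = np.ones_like(est)              # per-pixel error covariance
            out = [est.copy()]
            for z in frames[1:]:
                p = p + q                      # predict (process noise q)
                k = p / (p + r)                # Kalman gain (measurement noise r)
                est = est + k * (z - est)      # correct with the new frame
                p = (1.0 - k) * p
                out.append(est.copy())
            return out

        def restore(frames):
            # Spatial stage: Wiener filtering of each temporally denoised frame.
            # scipy's filter stands in for the paper's correlation-model-based
            # AWF, which additionally performs deconvolution.
            return [wiener(f, mysize=5) for f in temporal_kalman(frames)]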

  12. CALL in a Climate of Change: Adapting to Turbulent Global Conditions. Short Papers from EUROCALL 2017 (25th, Southampton, United Kingdom, August 23-26, 2017)

    Science.gov (United States)

    Borthwick, Kate, Ed.; Bradley, Linda, Ed.; Thouësny, Sylvie, Ed.

    2017-01-01

    The 25th European Association of Computer-Assisted Language Learning (EUROCALL) conference was hosted by Modern Languages and Linguistics at the University of Southampton, in the United Kingdom, from the 23rd to the 26th of August 2017. The theme of the conference was "CALL in a climate of change." The theme encompassed the notion of how…

  13. Adaptive Response in Female Fathead Minnows Exposed to an Aromatase Inhibitor: Computational Modeling of the Hypothalamic-Pituitary-Gonadal Axis

    Science.gov (United States)

    Exposure to endocrine disrupting chemicals can affect reproduction and development in both humans and wildlife. We are developing a mechanistic computational model of the hypothalamic-pituitary-gonadal (HPG) axis in female fathead minnows to predict dose-response and time-course ...

  14. Dosimetric Advantages of Four-Dimensional Adaptive Image-Guided Radiotherapy for Lung Tumors Using Online Cone-Beam Computed Tomography

    International Nuclear Information System (INIS)

    Harsolia, Asif; Hugo, Geoffrey D.; Kestin, Larry L.; Grills, Inga S.; Yan Di

    2008-01-01

    Purpose: This study compares multiple planning techniques designed to improve accuracy while allowing reduced planning target volume (PTV) margins through image-guided radiotherapy (IGRT) with four-dimensional (4D) cone-beam computed tomography (CBCT). Methods and Materials: Free-breathing planning and 4D-CBCT scans were obtained in 8 patients with lung tumors. Four plans were generated for each patient: 3D-conformal, 4D-union, 4D-offline adaptive with a single correction (offline ART), and 4D-online adaptive with daily correction (online ART). For the 4D-union plan, the union of gross tumor volumes from all phases of the 4D-CBCT was created with a 5-mm expansion applied for setup uncertainty. For offline and online ART, the gross tumor volume was delineated at the mean position of tumor motion from the 4D-CBCT. The PTV margins were calculated from the random components of tumor motion and setup uncertainty. Results: Adaptive IGRT techniques provided better PTV coverage with less irradiation of normal tissue. Compared with 3D plans, mean relative decreases in PTV volumes were 15%, 39%, and 44% using 4D-union, offline ART, and online ART planning techniques, respectively. This resulted in mean relative decreases in the lung volume receiving ≥20 Gy (V20) of 21%, 23%, and 31%, and in mean lung dose of 16%, 26%, and 31%, for the 4D-union, 4D-offline ART, and 4D-online ART, respectively. Conclusions: Adaptive IGRT using CBCT is feasible for the treatment of patients with lung tumors and significantly decreases PTV volume and dose to normal tissues, allowing for the possibility of dose escalation. All analyzed 4D planning strategies resulted in improvements over 3D plans, with 4D-online ART appearing optimal

  15. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well-functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase the availability of more sites, such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned; it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a four-fold increase in throughput with respect to the LCG Resource Broker was observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhEDEx topology. Since mid-February, a transfer volume of about 12 P...

  16. Adaptive screening for depression--recalibration of an item bank for the assessment of depression in persons with mental and somatic diseases and evaluation in a simulated computer-adaptive test environment.

    Science.gov (United States)

    Forkmann, Thomas; Kroehne, Ulf; Wirtz, Markus; Norra, Christine; Baumeister, Harald; Gauggel, Siegfried; Elhan, Atilla Halil; Tennant, Alan; Boecker, Maren

    2013-11-01

    This study conducted a simulation study for computer-adaptive testing based on the Aachen Depression Item Bank (ADIB), which was developed for the assessment of depression in persons with somatic diseases. Prior to computer-adaptive test simulation, the ADIB was newly calibrated. Recalibration was performed in a sample of 161 patients treated for a depressive syndrome, 103 patients from cardiology, and 103 patients from otorhinolaryngology (mean age 44.1, SD = 14.0; 44.7% female) and was cross-validated in a sample of 117 patients undergoing rehabilitation for cardiac diseases (mean age 58.4, SD = 10.5; 24.8% women). Unidimensionality of the item bank was checked and a Rasch analysis was performed that evaluated local dependency (LD), differential item functioning (DIF), item fit and reliability. CAT simulation was conducted with the total sample and additional simulated data. Recalibration resulted in a strictly unidimensional item bank with 36 items, showing good Rasch model fit (item fit residuals within accepted limits) and no LD. CAT simulation revealed that 13 items on average were necessary to estimate depression in the range of -2 and +2 logits when terminating at SE ≤ 0.32, and 4 items if using SE ≤ 0.50. Receiver Operating Characteristics analysis showed that θ estimates based on the CAT algorithm have good criterion validity with regard to depression diagnoses (Area Under the Curve ≥ 0.78 for all cut-off criteria). The recalibration of the ADIB succeeded and the simulation studies conducted suggest that it has good screening performance in the samples investigated and that it may reasonably add to the improvement of depression assessment. © 2013.
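
    A minimal sketch of the CAT loop whose simulated behaviour is reported above (about 13 items for SE ≤ 0.32, 4 for SE ≤ 0.50). It uses a dichotomous Rasch model for simplicity, whereas the ADIB is calibrated with the polytomous partial credit model; the difficulties and the respond callback are hypothetical inputs.

        import numpy as np

        def rasch_p(theta, b):
            # Dichotomous Rasch probability of endorsing an item of difficulty b.
            return 1.0 / (1.0 + np.exp(-(theta - b)))

        def cat_simulation(difficulties, respond, se_stop=0.32, max_items=36):
            # Administer the most informative remaining item, re-estimate theta
            # by maximum likelihood, stop once SE(theta) <= se_stop.
            administered, responses, theta = [], [], 0.0
            while len(administered) < max_items:
                remaining = [i for i in range(len(difficulties)) if i not in administered]
                p_rem = rasch_p(theta, np.array(difficulties)[remaining])
                item = remaining[int(np.argmax(p_rem * (1 - p_rem)))]  # max Fisher information
                administered.append(item)
                responses.append(respond(item))  # 0/1 answer from the simulated examinee
                b = np.array(difficulties)[administered]
                # Newton-Raphson for the ML estimate of theta (perfect all-0/all-1
                # response patterns, which have no finite MLE, are ignored here).
                for _ in range(10):
                    p = rasch_p(theta, b)
                    theta += (np.sum(responses) - p.sum()) / max((p * (1 - p)).sum(), 1e-9)
                p = rasch_p(theta, b)
                se = 1.0 / np.sqrt(max((p * (1 - p)).sum(), 1e-9))
                if se <= se_stop:
                    break
            return theta, se, administered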

  17. The nociceptive withdrawal reflex does not adapt to joint position change and short-term motor practice [v2; ref status: indexed, http://f1000r.es/2lr]

    Directory of Open Access Journals (Sweden)

    Nathan Eckert

    2013-12-01

    Full Text Available The nociceptive withdrawal reflex is a protective mechanism to mediate interactions within a potentially dangerous environment. The reflex is formed by action-based sensory encoding during the early post-natal developmental period, and it is unknown if the protective motor function of the nociceptive withdrawal reflex in the human upper-limb is adaptable based on the configuration of the arm or if it can be modified by short-term practice of a similar or opposing motor action. In the present study, nociceptive withdrawal reflexes were evoked by a brief train of electrical stimuli applied to digit II (1) in five different static arm positions and (2) before and after motor practice that was opposite (EXT) or similar (FLEX) to the stereotyped withdrawal response, in 10 individuals. Withdrawal responses were quantified by the electromyography (EMG) reflex response in several upper limb muscles, and by the forces and moments recorded at the wrist. EMG onset latencies and response amplitudes were not significantly different across the arm positions or between the EXT and FLEX practice conditions, and the general direction of the withdrawal response was similar across arm positions. In addition, the force vectors were not different after practice in either the practice condition or between EXT and FLEX conditions. We conclude the withdrawal response is insensitive to changes in elbow or shoulder joint angles as well as remaining resistant to short-term adaptations from the practice of motor actions, resulting in a generalized limb withdrawal in each case. It is further hypothesized that the multisensory feedback is weighted differently in each arm position, but integrated to achieve a similar withdrawal response to safeguard against erroneous motor responses that could cause further harm. The results remain consistent with the concept that nociceptive withdrawal reflexes are shaped through long-term and not short-term action-based sensory encoding.

  18. Short-Term Local Adaptation of Historical Common Bean (Phaseolus vulgaris L.) Varieties and Implications for In Situ Management of Bean Diversity

    Directory of Open Access Journals (Sweden)

    Stephanie M. Klaedtke

    2017-02-01

    Full Text Available Recognizing both the stakes of traditional European common bean diversity and the role farmers' and gardeners' networks play in maintaining this diversity, the present study examines the role that local adaptation plays for the management of common bean diversity in situ. For this purpose, four historical bean varieties and one modern control were multiplied on two organic farms for three growing seasons. The fifteen resulting populations, the initial ones and two populations of each variety obtained after the three years of multiplication, were then grown in a common garden. Twenty-two Simple Sequence Repeat (SSR) markers and 13 phenotypic traits were assessed. In total, 68.2% of tested markers were polymorphic and a total of 66 different alleles were identified. FST analysis showed that the genetic composition of two varieties multiplied in different environments changed. At the phenotypic level, differences were observed in flowering date and leaf length. Results indicate that three years of multiplication suffice for local adaptation to occur. The spatial dynamics of genetic and phenotypic bean diversity imply that the maintenance of diversity should be considered at the scale of the network, rather than individual farms and gardens. The microevolution of bean populations within networks of gardens and farms emerges as a research perspective.

  19. Computational and experimental studies of microvascular void features for passive-adaptation of structural panel dynamic properties

    Science.gov (United States)

    Sears, Nicholas C.; Harne, Ryan L.

    2018-01-01

    The performance, integrity, and safety of built-up structural systems are critical to their effective employment in diverse engineering applications. In conflict with these goals, harmonic or random excitations of structural panels may promote large amplitude oscillations that are particularly harmful when excitation energies are concentrated around natural frequencies. This contributes to fatigue concerns, performance degradation, and failure. While studies have considered active or passive damping treatments that adapt material characteristics and configurations for structural control, it remains to be understood how vibration properties of structural panels may be tailored via internal material transitions. Motivated to fill this knowledge gap, this research explores an idea of adapting the static and dynamic material distribution of panels through embedded microvascular channels and strategically placed voids that permit the internal movement of fluids within the panels for structural dynamic control. Finite element model and experimental investigations probe how redistributing material in the form of microscale voids influences the global vibration modes and natural frequencies of structural panels. Through parameter studies, the relationships among void shape, number, size, and location are quantified towards their contribution to the changing structural dynamics. For the panel composition and boundary conditions considered in this report, the findings reveal that transferring material between strategically placed voids may result in eigenfrequency changes as great as 10.0, 5.0, and 7.4% for the first, second, and third modes, respectively.

  20. Short-term outcomes and safety of computed tomography-guided percutaneous microwave ablation of solitary adrenal metastasis from lung cancer: A multi-center retrospective study

    Energy Technology Data Exchange (ETDEWEB)

    Men, Min; Ye, Xin; Yang, Xia; Zheng, Aimin; Huang, Guang Hui; Wei, Zhigang [Dept. of Oncology, Shandong Provincial Hospital Affiliated with Shandong University, Jinan (China); Fan, Wei Jun [Imaging and Interventional Center, Sun Yat-sen University Cancer Center, Guangzhou (China); Zhang, Kaixian [Dept. of Oncology, Teng Zhou Central People's Hospital Affiliated with Jining Medical College, Tengzhou (China); Bi, Jing Wang [Dept. of Oncology, Jinan Military General Hospital of Chinese People's Liberation Army, Jinan (China)

    2016-11-15

    To retrospectively evaluate the short-term outcomes and safety of computed tomography (CT)-guided percutaneous microwave ablation (MWA) of solitary adrenal metastasis from lung cancer. From May 2010 to April 2014, 31 patients with unilateral adrenal metastasis from lung cancer who were treated with CT-guided percutaneous MWA were enrolled. This study was conducted with approval from the local Institutional Review Board. Clinical outcomes and complications of MWA were assessed. The tumors ranged from 1.5 to 5.4 cm in diameter. After a median follow-up period of 11.1 months, the primary efficacy rate was 90.3% (28/31). Local tumor progression was detected in 7 (22.6%) of 31 cases. The median overall survival time was 12 months. The 1-year overall survival rate was 44.3%. The median local tumor progression-free survival time was 9 months. The local tumor progression-free survival rate was 77.4%. Of 36 MWA sessions, two (5.6%) had major complications (hypertensive crisis). CT-guided percutaneous MWA may be fairly safe and effective for treating solitary adrenal metastasis from lung cancer.

  1. Short-term Reproducibility of Computed Tomography-based Lung Density Measurements in Alpha-1 Antitrypsin Deficiency and Smokers with Emphysema

    International Nuclear Information System (INIS)

    Shaker, S.B.; Dirksen, A.; Laursen, L.C.; Maltbaek, N.; Christensen, L.; Sander, U.; Seersholm, N.; Skovgaard, L.T.; Nielsen, L.; Kok-Jensen, A.

    2004-01-01

    Purpose: To study the short-term reproducibility of lung density measurements by multi-slice computed tomography (CT) using three different radiation doses and three reconstruction algorithms. Material and Methods: Twenty-five patients with smoker's emphysema and 25 patients with alpha-1 antitrypsin deficiency underwent 3 scans at 2-week intervals. A low-dose protocol was applied, and images were reconstructed with bone, detail, and soft algorithms. Total lung volume (TLV), 15th percentile density (PD-15), and relative area at -910 Hounsfield units (RA-910) were obtained from the images using Pulmo-CMS software. Reproducibility of PD-15 and RA-910 and the influence of radiation dose, reconstruction algorithm, and type of emphysema were then analysed. Results: The overall coefficient of variation of volume-adjusted PD-15 for all combinations of radiation dose and reconstruction algorithm was 3.7%. The overall standard deviation of volume-adjusted RA-910 was 1.7% (corresponding to a coefficient of variation of 6.8%). Radiation dose, reconstruction algorithm, and type of emphysema had no significant influence on the reproducibility of PD-15 and RA-910. However, the bone algorithm and very low radiation dose result in overestimation of the extent of emphysema. Conclusion: Lung density measurement by CT is a sensitive marker for quantitating both subtypes of emphysema. A CT protocol with radiation dose down to 16 mAs and a soft or detail reconstruction algorithm is recommended
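
    The two densitometry indices have simple operational definitions, which a short sketch makes concrete (the volume adjustment applied in the study is omitted; lung_hu is a hypothetical array of segmented-lung voxel values):

        import numpy as np

        def densitometry(lung_hu):
            # Lung densitometry indices from segmented-lung voxel values in
            # Hounsfield units (HU).
            lung_hu = np.asarray(lung_hu, dtype=float)
            pd15 = np.percentile(lung_hu, 15)          # PD-15: 15th percentile density (HU)
            ra910 = 100.0 * np.mean(lung_hu < -910)    # RA-910: % of voxels below -910 HU
            return pd15, ra910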

  2. Clinical validity of the Japanese version of WAIS-III short forms: Adaptation for patients with mild neurocognitive disorder and dementia.

    Science.gov (United States)

    Takeda, Mihoko; Nakaya, Makoto; Kikuchi, Yoko; Inoue, Sayaka; Kamata, Tomoyuki

    2018-01-01

    We investigated the utility of Japanese WAIS-III short forms in mild neurocognitive disorder and dementia. Our sample consisted of 108 older patients (ages: 65-89; mean age = 78.3). Fifteen short forms (SFs) and full-scale (FS) IQs were compared. The SFs included Dyads (SF1, SF2), Triads (SF3), Tetrads (SF4, SF5, SF6, SF7), Pentad (SF8), Six-subtest (SF9), Seven-subtests (SF10(a)(b), SF11(a)(b), SF12), and Nine-subtest (SF13) forms. Correlations between SFIQs and FSIQ were all significant. Significant differences also were found in paired t-tests between FSIQ and 5 SFIQs (SF2: t = -4.16, SF5: t = -7.06, SF7: t = 2.59, SF10(a): t = 2.56, SF12: t = -4.82; p < .05). The short form comprising Arithmetic, Digit Span, Information, Picture Completion, Digit Symbol-Coding, and Matrix Reasoning (Ryan & Ward, 1999), together with the formula of Axelrod et al. (2001), should be adopted to convert scaled scores into estimated IQ scores. Copyright © 2017 Elsevier B.V. All rights reserved.

  3. German Language Adaptation of the Headache Management Self-Efficacy Scale (HMSE-G) and Development of a New Short Form (HMSE-G-SF).

    Science.gov (United States)

    Graef, Julia E; Rief, Winfried; French, Douglas J; Nilges, Paul; Nestoriuc, Yvonne

    2015-01-01

    This study aims to develop and validate a German version of French and colleagues' Headache Management Self-efficacy Scale and to construct an abbreviated form for use in behavioral headache research. Furthermore, the contribution of headache-specific self-efficacy to pain-related disability in German chronic headache sufferers was examined. Headache-specific self-efficacy refers to an individuals' confidence that they can engage in behaviors to either prevent headache episodes or to manage headache-related pain and disability. Self-efficacy beliefs have been shown repeatedly to be positively associated with psychological well-being, effective coping, and enhanced treatment outcomes. A cross-sectional sample of 304 individuals diagnosed with either migraine, chronic tension-type headache, or a combination of 2 or more headache disorders completed the German Headache Management Self-efficacy Scale and questionnaires assessing headache activity, pain-related coping, general self-efficacy, depression, and anxiety. Responsiveness of the scale was analyzed in a longitudinal subsample of 32 inpatients undergoing headache treatment. Finally, a short form was constructed and evaluated regarding psychometric properties. The German Headache Management Self-efficacy Scale showed good reliability (Cronbach's α = 0.87) as did the 6-item short form (Cronbach's α = 0.72). In the longitudinal sample, both versions showed a good ability to change over time (SRM = 0.52-1.16). Chronic headache patients with higher levels of self-efficacy reported lower levels of disability (r = -0.26 to -0.31). Multiple regression analyses revealed headache intensity and headache-specific self-efficacy as the strongest predictors of headache-related disability (β_self-efficacy = -0.21, β_intensity = 0.26). Both the 25-item version and the 6-item version appear to be valid, reliable measures of self-efficacy beliefs. These scales will allow clinicians to identify headache sufferers...

  4. Adaptation of the radiation dose for computed tomography of the body - background for the dose adaptation programme OmnimAs; Straaldosreglering vid kroppsdatortomografi - bakgrund till dosregleringsprogrammet OmnimAs

    Energy Technology Data Exchange (ETDEWEB)

    Nyman, Ulf; Kristiansson, Mattias [Trelleborg Hospital (Sweden); Leitz, Wolfram [Swedish Radiation Protection Authority, Stockholm (Sweden); Paahlstorp, Per-Aake [Siemens Medical Solutions, Solna (Sweden)

    2004-11-01

    When performing computed tomography examinations, the exposure factors are hardly ever adapted to the patient's size. One reason for that might be the lack of simple methods. This report describes the computer programme OmnimAs, which calculates how the exposure factors should be varied with the patient's perimeter (which can easily be measured with a measuring tape). The first approximation is to calculate the exposure values giving the same noise level in the image irrespective of the patient's size. A clinical evaluation has shown that this relationship has to be modified. One chapter describes the physical background behind the programme. Results calculated with OmnimAs are in good agreement with a number of published studies. Clinical experience shows the usability of OmnimAs. Finally, the report discusses the correlation between several parameters and image quality/dose, and how this correlation can be exploited to optimise CT examinations.
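
    A minimal sketch of the first approximation described above (constant image noise irrespective of patient size), assuming a water-equivalent cylindrical patient of effective diameter d = perimeter / pi. The attenuation coefficient and reference values below are illustrative assumptions, not the OmnimAs calibration, and the report notes the pure constant-noise relation had to be modified clinically:

        import math

        def adapted_mAs(perimeter_cm, ref_perimeter_cm=80.0, ref_mAs=100.0, mu_cm=0.19):
            # Noise scales as 1/sqrt(detected photons), and transmission falls
            # as exp(-mu * d), so holding noise constant requires the mAs to
            # grow as exp(mu * d). All parameter values here are illustrative.
            d = perimeter_cm / math.pi
            d_ref = ref_perimeter_cm / math.pi
            return ref_mAs * math.exp(mu_cm * (d - d_ref))

        # A 10 cm larger perimeter (~3.2 cm larger diameter) needs ~1.8x the mAs:
        print(round(adapted_mAs(90.0), 1))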

  5. Adaptive user interfaces

    CERN Document Server

    1990-01-01

    This book describes techniques for designing and building adaptive user interfaces developed in the large AID project undertaken by the contributors. Key Features:
    * Describes one of the few large-scale adaptive interface projects in the world
    * Outlines the principles of adaptivity in human-computer interaction

  6. Translation, cultural adaptation assessment, and both validity and reliability testing of the kidney disease quality of life - short form version 1.3 for use with Iranian patients

    DEFF Research Database (Denmark)

    Pakpour, Amir; Yekaninejad, Mirsaeed; Mølsted, Stig

    2011-01-01

    AIM: The aims of the study were to translate the Kidney Disease Quality of Life--Short Form version 1.3 (KDQOL-SF ver. 1.3) questionnaire into Iranian (Farsi), and to then assess it in terms of validity and reliability on Iranian patients. METHODS: The questionnaire was first translated into Farsi... a larger group (212 patients with end-stage renal disease on haemodialysis). Afterwards, reliability was estimated by internal consistency, and validity was assessed using known group comparisons and constructs for the patient group as a whole. Finally, the factor structure of the questionnaire... be summarized into an 11-factor structure that jointly accounted for 79.81% of the variance. CONCLUSION: The Iranian version of the KDQOL-SF questionnaire is both highly reliable and valid for use with Iranian patients on haemodialysis.

  7. Individualized Nonadaptive and Online-Adaptive Intensity-Modulated Radiotherapy Treatment Strategies for Cervical Cancer Patients Based on Pretreatment Acquired Variable Bladder Filling Computed Tomography Scans

    International Nuclear Information System (INIS)

    Bondar, M.L.; Hoogeman, M.S.; Mens, J.W.; Quint, S.; Ahmad, R.; Dhawtal, G.; Heijmen, B.J.

    2012-01-01

    Purpose: To design and evaluate individualized nonadaptive and online-adaptive strategies based on a pretreatment established motion model for the highly deformable target volume in cervical cancer patients. Methods and Materials: For 14 patients, nine to ten variable bladder filling computed tomography (CT) scans were acquired at pretreatment and after 40 Gy. Individualized model-based internal target volumes (mbITVs) accounting for the cervix and uterus motion due to bladder volume changes were generated by using a motion-model constructed from two pretreatment CT scans (full and empty bladder). Two individualized strategies were designed: a nonadaptive strategy, using an mbITV accounting for the full-range of bladder volume changes throughout the treatment; and an online-adaptive strategy, using mbITVs of bladder volume subranges to construct a library of plans. The latter adapts the treatment online by selecting the plan-of-the-day from the library based on the measured bladder volume. The individualized strategies were evaluated by the seven to eight CT scans not used for mbITVs construction, and compared with a population-based approach. Geometric uniform margins around planning cervix–uterus and mbITVs were determined to ensure adequate coverage. For each strategy, the percentage of the cervix–uterus, bladder, and rectum volumes inside the planning target volume (PTV), and the clinical target volume (CTV)-to-PTV volume (volume difference between PTV and CTV) were calculated. Results: The margin for the population-based approach was 38 mm and for the individualized strategies was 7 to 10 mm. Compared with the population-based approach, the individualized nonadaptive strategy decreased the CTV-to-PTV volume by 48% ± 6% and the percentage of bladder and rectum inside the PTV by 5% to 45% and 26% to 74% (p < 0.001), respectively. Replacing the individualized nonadaptive strategy by an online-adaptive, two-plan library further decreased the percentage of

  8. Verification of computer system PROLOG - software tool for rapid assessments of consequences of short-term radioactive releases to the atmosphere

    Energy Technology Data Exchange (ETDEWEB)

    Kiselev, Alexey A.; Krylov, Alexey L.; Bogatov, Sergey A. [Nuclear Safety Institute (IBRAE), Bolshaya Tulskaya st. 52, 115191, Moscow (Russian Federation)

    2014-07-01

    In the case of nuclear and radiation accidents, emergency response authorities require a tool for rapid assessment of possible consequences. One of the most significant problems is the lack of data on the initial state of an accident. This lack can be especially critical if the accident occurs in a location that was not thoroughly studied beforehand (during transportation of radioactive materials, for example). One possible solution is a hybrid method, in which a model that enables rapid assessments from a reasonable minimum of input data is used conjointly with observed data that can be collected shortly after the accident. The model is used to estimate parameters of the source and uncertain meteorological parameters on the basis of the observed data. For example, the field of fallout density can be observed and measured within hours after an accident. After that, the same model with the estimated parameters is used to assess doses and the necessity of recommended and mandatory countermeasures. The computer system PROLOG was designed to solve this problem. It is based on the widely used Gaussian model. The standard Gaussian model is supplemented with several sub-models that allow it to take into account: polydisperse aerosols, aerodynamic shade from buildings in the vicinity of the place of accident, terrain orography, the initial size of the radioactive cloud, the effective height of the release, and the influence of countermeasures on the doses of radioactive exposure of humans. It uses modern GIS technologies and can use web map services. To verify the ability of PROLOG to solve this problem, it is necessary to test whether it can estimate the relevant parameters of real past accidents. Verification of the computer system on the data of the Chazhma Bay accident (Russian Far East, 1985) was published previously. In this work, verification was implemented on the basis of observed contamination from the Kyshtym disaster (PA Mayak, 1957) and the Tomsk accident (1993). Observations of Sr-90...
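
    The Gaussian model at the core of PROLOG has a standard closed form; a minimal Python sketch follows (the sub-models listed above are omitted, and all numbers in the example are hypothetical):

        import numpy as np

        def gaussian_plume(q, u, y, z, h, sigma_y, sigma_z):
            # Ground-reflecting Gaussian plume concentration at a receptor
            # located laterally y and vertically z from the plume axis.
            # q: source strength (e.g. Bq/s), u: wind speed (m/s),
            # h: effective release height, sigma_y/sigma_z: dispersion
            # parameters evaluated at the downwind distance of interest.
            lateral = np.exp(-y**2 / (2.0 * sigma_y**2))
            vertical = (np.exp(-(z - h)**2 / (2.0 * sigma_z**2))
                        + np.exp(-(z + h)**2 / (2.0 * sigma_z**2)))  # mirror source = ground reflection
            return q / (2.0 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

        # Hypothetical example: 1 GBq/s release at 30 m height, receptor 1.5 m
        # above ground on the plume axis, sigmas for roughly 1 km downwind:
        print(gaussian_plume(q=1e9, u=5.0, y=0.0, z=1.5, h=30.0, sigma_y=80.0, sigma_z=40.0))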

  9. Short- and medium-term efficacy of a Web-based computer-tailored nutrition education intervention for adults including cognitive and environmental feedback: randomized controlled trial.

    Science.gov (United States)

    Springvloet, Linda; Lechner, Lilian; de Vries, Hein; Candel, Math J J M; Oenema, Anke

    2015-01-19

    Web-based, computer-tailored nutrition education interventions can be effective in modifying self-reported dietary behaviors. Traditional computer-tailored programs primarily targeted individual cognitions (knowledge, awareness, attitude, self-efficacy). Tailoring on additional variables such as self-regulation processes and environmental-level factors (the home food environment arrangement and perception of availability and prices of healthy food products in supermarkets) may improve efficacy and effect sizes (ES) of Web-based computer-tailored nutrition education interventions. This study evaluated the short- and medium-term efficacy and educational differences in efficacy of a cognitive and environmental feedback version of a Web-based computer-tailored nutrition education intervention on self-reported fruit, vegetable, high-energy snack, and saturated fat intake compared to generic nutrition information in the total sample and among participants who did not comply with dietary guidelines (the risk groups). A randomized controlled trial was conducted with a basic (tailored intervention targeting individual cognition and self-regulation processes; n=456), plus (basic intervention additionally targeting environmental-level factors; n=459), and control (generic nutrition information; n=434) group. Participants were recruited from the general population and randomly assigned to a study group. Self-reported fruit, vegetable, high-energy snack, and saturated fat intake were assessed at baseline and at 1 (T1) and 4 months (T2) postintervention using online questionnaires. Linear mixed model analyses examined group differences in change over time. Educational differences were examined with group×time×education interaction terms. In the total sample, the basic (T1: ES=-0.30; T2: ES=-0.18) and plus intervention groups (T1: ES=-0.29; T2: ES=-0.27) had larger decreases in high-energy snack intake than the control group. The basic version resulted in a larger decrease in...

  10. Examining sensory ability, feature matching and assessment-based adaptation for a brain-computer interface using the steady-state visually evoked potential.

    Science.gov (United States)

    Brumberg, Jonathan S; Nguyen, Anh; Pitt, Kevin M; Lorenz, Sean D

    2018-01-31

    We investigated how overt visual attention and oculomotor control influence successful use of a visual feedback brain-computer interface (BCI) for accessing augmentative and alternative communication (AAC) devices in a heterogeneous population of individuals with profound neuromotor impairments. BCIs are often tested within a single patient population, limiting generalization of results. This study focuses on examining individual sensory abilities with an eye toward possible interface adaptations to improve device performance. Five individuals with a range of neuromotor disorders participated in a four-choice BCI control task involving the steady state visually evoked potential. The BCI graphical interface was designed to simulate a commercial AAC device to examine whether an integrated device could be used successfully by individuals with neuromotor impairment. All participants were able to interact with the BCI, and highest performance was found for participants able to employ an overt visual attention strategy. For participants with visual deficits due to impaired oculomotor control, effective performance increased after accounting for mismatches between the graphical layout and participant visual capabilities. As BCIs are translated from research environments to clinical applications, the assessment of BCI-related skills will help facilitate proper device selection and provide individuals who use BCI the greatest likelihood of immediate and long term communicative success. Overall, our results indicate that adaptations can be an effective strategy to reduce barriers and increase access to BCI technology. These efforts should be directed by comprehensive assessments for matching individuals to the most appropriate device to support their complex communication needs. Implications for Rehabilitation Brain computer interfaces using the steady state visually evoked potential can be integrated with an augmentative and alternative communication device to provide access...
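
    For readers unfamiliar with the SSVEP paradigm used here: each of the four targets flickers at its own frequency, and the BCI selects the target whose frequency dominates the occipital EEG. A minimal sketch of such a detector using FFT band power (many systems use canonical correlation analysis instead; the frequencies and sampling rate below are hypothetical):

        import numpy as np

        def ssvep_choice(eeg, fs, stim_freqs, harmonics=2):
            # Pick the attended target by comparing spectral power at each
            # flicker frequency and its harmonics.
            # eeg: 1-D occipital channel (or channel average), fs: Hz.
            spectrum = np.abs(np.fft.rfft(eeg * np.hanning(len(eeg)))) ** 2
            freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
            scores = []
            for f0 in stim_freqs:
                score = 0.0
                for k in range(1, harmonics + 1):
                    score += spectrum[np.argmin(np.abs(freqs - k * f0))]
                scores.append(score)
            return int(np.argmax(scores))   # index of the selected target

        # Four-choice layout as in the study; flicker frequencies hypothetical:
        # choice = ssvep_choice(eeg_window, fs=256, stim_freqs=[6.0, 7.5, 8.57, 10.0])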

  11. Computational Analysis of the Mode of Action of Disopyramide and Quinidine on hERG-Linked Short QT Syndrome in Human Ventricles

    Directory of Open Access Journals (Sweden)

    Dominic G. Whittaker

    2017-10-01

    Full Text Available The short QT syndrome (SQTS) is a rare cardiac disorder associated with arrhythmias and sudden death. Gain-of-function mutations to potassium channels mediating the rapid delayed rectifier current, IKr, underlie SQTS variant 1 (SQT1), in which treatment with Na+ and K+ channel blocking class Ia anti-arrhythmic agents has demonstrated some efficacy. This study used computational modeling to gain mechanistic insights into the actions of two such drugs, disopyramide and quinidine, in the setting of SQT1. The O'Hara-Rudy (ORd) human ventricle model was modified to incorporate a Markov chain formulation of IKr describing wild type (WT) and SQT1 mutant conditions. Effects of multi-channel block by disopyramide and quinidine, including binding kinetics and altered potency of IKr/hERG channel block in SQT1 and state-dependent block of sodium channels, were simulated on action potential and multicellular tissue models. A one-dimensional (1D) transmural ventricular strand model was used to assess prolongation of the QT interval, effective refractory period (ERP), and re-entry wavelength (WL) by both drugs. Dynamics of re-entrant excitation waves were investigated using a 3D human left ventricular wedge model. In the setting of SQT1, disopyramide and quinidine both produced a dose-dependent prolongation in (i) the QT interval, which was primarily due to IKr block, and (ii) the ERP, which was mediated by a synergistic combination of IKr and INa block. Over the same range of concentrations quinidine was more effective in restoring the QT interval, due to more potent block of IKr. Both drugs demonstrated an anti-arrhythmic increase in the WL of re-entrant circuits. In the 3D wedge, disopyramide and quinidine at clinically-relevant concentrations decreased the dominant frequency of re-entrant excitations and exhibited anti-fibrillatory effects; preventing formation of multiple, chaotic wavelets which developed in SQT1, and could terminate arrhythmias. This...
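
    The study's state-dependent Markov-chain drug binding is beyond a short sketch, but the simpler conductance-scaling picture of concentration-dependent IKr block, common in action potential modelling, is easy to state (the IC50 and Hill coefficient below are hypothetical):

        def fraction_unblocked(drug_conc, ic50, hill=1.0):
            # Fraction of IKr conductance remaining under a channel blocker,
            # from the standard Hill equation. A common simple representation
            # of dose-dependent block; the study itself uses state-dependent
            # Markov-chain drug binding rather than this static scaling.
            return 1.0 / (1.0 + (drug_conc / ic50) ** hill)

        # Conductance scaling, e.g. g_Kr_eff = g_Kr * fraction_unblocked(c, ic50).
        # Illustrative only: at c = IC50, half of the channels are blocked.
        print(fraction_unblocked(1.0, 1.0))   # -> 0.5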

  12. History of adaptation determines short-term shifts in performance and community structure of hydrogen-producing microbial communities degrading wheat straw.

    Science.gov (United States)

    Valdez-Vazquez, Idania; Morales, Ana L; Escalante, Ana E

    2017-11-01

    This study addresses the question of ecological interest for the determination of structure and diversity of microbial communities that degrade lignocellulosic biomasses to produce biofuels. Two microbial consortia with different history, native of wheat straw (NWS) and from a methanogenic digester (MD) fed with cow manure, were contrasted in terms of hydrogen performance, substrate disintegration and microbial diversity. NWS outperformed the hydrogen production rate of MD. Microscopic images revealed that NWS acted on the cuticle and epidermis, generating cellulose strands with high crystallinity, while MD degraded deeper layers, equally affecting all polysaccharides. The bacterial composition markedly differed according to the inocula origin. NWS almost solely comprised hydrogen producers of the phyla Firmicutes and Proteobacteria, with 38% members of Enterococcus. After hydrogen fermentation, NWS comprised 8% Syntrophococcus, an acetogen that cleaves aryl ethers of constituent groups on the aromatic components of lignin. Conversely, MD comprised thirteen phyla, primarily including Firmicutes with H2-producing members, and Bacteroidetes with non-H2-producing members, which reduced the hydrogen performance. Overall, the results of this study provide clear evidence that the history of adaptation of NWS enhanced the hydrogen performance from untreated wheat straw. Further, native wheat straw communities have the potential to refine cellulose fibers and produce biofuels simultaneously. © 2017 The Authors. Microbial Biotechnology published by John Wiley & Sons Ltd and Society for Applied Microbiology.

  13. Adaptive changes of pancreatic protease secretion to a short-term vegan diet: influence of reduced intake and modification of protein.

    Science.gov (United States)

    Walkowiak, Jaroslaw; Mądry, Edyta; Lisowska, Aleksandra; Szaflarska-Popławska, Anna; Grzymisławski, Marian; Stankowiak-Kulpa, Hanna; Przysławski, Juliusz

    2012-01-01

    In our previous study, we demonstrated that abstaining from meat for 1 month by healthy omnivores (lacto-ovovegetarian model) resulted in a statistical decrease in pancreatic secretion as measured by faecal elastase-1 output. However, no correlation between relative and non-relative changes of energy and nutrient consumption and pancreatic secretion was documented. Therefore, in the present study, we aimed to assess the changes of exocrine pancreatic secretion with a more restrictive dietetic modification, by applying a vegan diet. A total of twenty-one healthy omnivores (sixteen females and five males) participated in the prospective study lasting for 6 weeks. The nutrient intake and faecal output of pancreatic enzymes (elastase-1, chymotrypsin and lipase) were assessed twice during the study. Each assessment period lasted for 7 d: the first before the transition to the vegan diet (omnivore diet) and the second during the last week of the study (vegan diet). The dietary modification resulted in a significant decrease in faecal elastase-1 output. In conclusion, the short-term vegan diet resulted in an adaptation of pancreatic protease secretion in healthy volunteers.

  14. A Computational Approach to Model Vascular Adaptation During Chronic Hemodialysis: Shape Optimization as a Substitute for Growth Modeling

    Science.gov (United States)

    Mahmoudzadeh Akherat, S. M. Javid; Boghosian, Michael; Cassel, Kevin; Hammes, Mary

    2015-11-01

    End-stage renal disease patients depend on successful long-term hemodialysis via vascular access, commonly facilitated via a Brachiocephalic Fistula (BCF). The primary cause of BCF failure is Cephalic Arch Stenosis (CAS). It is believed that low Wall Shear Stress (WSS) regions, which occur because of the high flow rates through the natural bend in the cephalic vein, create hemodynamic circumstances that trigger the onset and development of Intimal Hyperplasia (IH) and subsequent CAS. IH is hypothesized to be a natural effort to reshape the vessel, aiming to bring the WSS values back to a physiologically acceptable range. We seek to explore the correlation between regions of low WSS and subsequent IH and CAS in patient-specific geometries. By utilizing a shape optimization framework, a method is proposed to predict cardiovascular adaptation that could potentially be an alternative to vascular growth and remodeling. Based on an objective functional that seeks to alter the vessel shape in such a way as to readjust the WSS to be within the normal physiological range, CFD and shape optimization are then coupled to investigate whether the optimal shape evolution is correlated with actual patient-specific geometries thereafter. Supported by the National Institute of Diabetes and Digestive and Kidney Diseases of the National Institutes of Health (R01 DK90769).
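
    A plausible form of the objective functional described above (a sketch, not the authors' exact formulation) is

        J(\Gamma) = \int_{\Gamma} \left( \tau_w(\mathbf{x}) - \tau_{\mathrm{ref}} \right)^2 \, \mathrm{d}\Gamma,

    where \tau_w is the computed wall shear stress on the vessel wall \Gamma and \tau_{\mathrm{ref}} is a target value inside the normal physiological range; gradient-based shape updates then iteratively reduce J, mimicking the remodeling that IH is hypothesized to perform.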

  15. Redefining diagnostic symptoms of depression using Rasch analysis: testing an item bank suitable for DSM-V and computer adaptive testing.

    Science.gov (United States)

    Mitchell, Alex J; Smith, Adam B; Al-salihy, Zerak; Rahim, Twana A; Mahmud, Mahmud Q; Muhyaldin, Asma S

    2011-10-01

    We aimed to redefine the optimal self-report symptoms of depression suitable for creation of an item bank that could be used in computer adaptive testing or to develop a simplified screening tool for DSM-V. Four hundred subjects (200 patients with primary depression and 200 non-depressed subjects) living in Iraqi Kurdistan were interviewed. The Mini International Neuropsychiatric Interview (MINI) was used to define the presence of major depression (DSM-IV criteria). We examined symptoms of depression using four well-known scales delivered in Kurdish. The Partial Credit Model was applied to each instrument. Common-item equating was subsequently used to create an item bank, and differential item functioning (DIF) was explored for known subgroups. A symptom-level Rasch analysis reduced the original 45 items to 24 after the exclusion of 21 misfitting items. A further six items (CESD13 and CESD17, HADS-D4, HADS-D5 and HADS-D7, and CDSS3 and CDSS4) were removed due to misfit as the items were added together to form the item bank, and two items were subsequently removed following the DIF analysis by diagnosis (CESD20 and CDSS9, both of which were harder to endorse for women). Therefore the remaining optimal item bank consisted of 17 items and produced an area under the curve (AUC) of 0.987. Using a bank restricted to the optimal nine items revealed only minor loss of accuracy (AUC = 0.989, sensitivity 96%, specificity 95%). Finally, when restricted to only four items, accuracy was still high (AUC = 0.976; sensitivity 93%, specificity 96%). An item bank of 17 items may be useful in computer adaptive testing, and nine or even four items may be used to develop a simplified screening tool for DSM-V major depressive disorder (MDD). Further examination of this item bank should be conducted in different cultural settings.
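
    The reported AUC values can be reproduced for any candidate short form with the rank-sum (Mann-Whitney) identity; a minimal sketch (the score and diagnosis arrays are hypothetical placeholders for the item-bank totals and MINI diagnoses):

        import numpy as np

        def auc(scores, labels):
            # Area under the ROC curve via the rank-sum identity: the
            # probability that a random case scores above a random control,
            # counting ties as half.
            scores, labels = np.asarray(scores, float), np.asarray(labels, int)
            pos, neg = scores[labels == 1], scores[labels == 0]
            return (np.mean(pos[:, None] > neg[None, :])
                    + 0.5 * np.mean(pos[:, None] == neg[None, :]))

        # e.g. summed scores on a 4-item short form vs. MINI diagnosis:
        # print(auc(short_form_totals, mini_mdd))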

  16. Alleviation of Psychological Distress and the Improvement of Quality of Life in Patients With Amyotrophic Lateral Sclerosis: Adaptation of a Short-Term Psychotherapeutic Intervention

    Directory of Open Access Journals (Sweden)

    Moritz Caspar Franz Oberstadt

    2018-04-01

    Full Text Available Amyotrophic lateral sclerosis (ALS) is a progressive neurodegenerative disease that is inevitably fatal. Being diagnosed with a terminal illness such as ALS deeply affects one's personal existence and brings significant changes to the physical, emotional, and social domains of the patient's life. ALS patients face a rapidly debilitating physical decline that restrains mobility and impairs all activities of daily living. This progressive loss of autonomy may lead to a sense of hopelessness and a loss of quality of life, which in turn may even result in thoughts about physician-assisted suicide. Here, we propose a manualized, individual, semi-structured psychotherapeutic intervention to relieve distress and promote psychological well-being in ALS patients. This short-term intervention was originally developed for advanced cancer patients. "Managing Cancer and Living Meaningfully" (CALM) focuses on four dimensions: (i) symptom management and communication with healthcare providers, (ii) changes in self and relations with close others, (iii) spirituality, sense of meaning and purpose, and (iv) thinking of the future, hope, and mortality. We suggest supplementing the concept with two additional dimensions that take into account specific issues of ALS patients: (v) communication skills, and (vi) emotional expression and control. This therapeutic concept, named "ManagIng Burden in ALS and Living Meaningfully" (mi-BALM), may be a further treatment option to help improve the quality of life of ALS patients.

  17. Efficient and Adaptive Methods for Computing Accurate Potential Surfaces for Quantum Nuclear Effects: Applications to Hydrogen-Transfer Reactions.

    Science.gov (United States)

    DeGregorio, Nicole; Iyengar, Srinivasan S

    2018-01-09

    We present two sampling measures to gauge critical regions of potential energy surfaces. These sampling measures employ (a) the instantaneous quantum wavepacket density, (b) an approximation to the potential surface, (c) its gradients, and (d) a Shannon-information-theory-based expression that estimates the local entropy associated with the quantum wavepacket. These four criteria together enable a directed sampling of potential surfaces that appears to correctly describe the local oscillation frequencies, or the local Nyquist frequency, of a potential surface. The sampling functions are then utilized to derive a tessellation scheme that discretizes the multidimensional space to enable efficient sampling of potential surfaces. The sampled potential surface is then combined with four different interpolation procedures, namely, (a) local Hermite curve interpolation, (b) low-pass filtered Lagrange interpolation, (c) the monomial symmetrization approximation (MSA) developed by Bowman and co-workers, and (d) a modified Shepard algorithm. The sampling procedure and the fitting schemes are used to (a) compute potential surfaces in highly anharmonic hydrogen-bonded systems and (b) study hydrogen-transfer reactions in biogenic volatile organic compounds (isoprene), where the transferring hydrogen atom is found to demonstrate critical quantum nuclear effects. In the case of isoprene, the algorithm discussed here is used to derive multidimensional potential surfaces along a hydrogen-transfer reaction path to gauge the effect of quantum-nuclear degrees of freedom on the hydrogen-transfer process. Based on the decreased computational effort, facilitated by the optimal sampling of the potential surfaces through the use of the sampling functions discussed here, and the accuracy of the associated potential surfaces, we believe the method will find great utility in the study of quantum nuclear dynamics problems, of which the application to hydrogen-transfer reactions and hydrogen-bonded systems is one example.
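
    As a point of reference for one of the fitting schemes named above, here is a minimal sketch of zeroth-order Shepard (inverse-distance-weighted) interpolation of a sampled potential. The paper's modified Shepard algorithm additionally carries local Taylor expansions at each sample point, which this sketch omits; the toy potential and exponent are illustrative.

```python
# Zeroth-order Shepard interpolation of a sampled potential surface.
import numpy as np

def shepard_potential(x, samples, energies, p=4, eps=1e-12):
    """Interpolate V(x) from sampled geometries and energies.

    x        : query geometry, shape (d,)
    samples  : sampled geometries, shape (n, d)
    energies : potential energies at the samples, shape (n,)
    p        : inverse-distance exponent controlling locality
    """
    d2 = np.sum((samples - x) ** 2, axis=1)
    w = 1.0 / (d2 ** (p / 2) + eps)   # weights decay with distance
    return np.sum(w * energies) / np.sum(w)

# Toy 1D usage: sample a Morse-like well, then query between sample points.
xs = np.linspace(0.5, 3.0, 12)[:, None]
vs = (1.0 - np.exp(-(xs[:, 0] - 1.0))) ** 2
print(shepard_potential(np.array([1.2]), xs, vs))
```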

  18. An Ad Hoc Adaptive Hashing Technique for Non-Uniformly Distributed IP Address Lookup in Computer Networks

    Directory of Open Access Journals (Sweden)

    Christopher Martinez

    2007-02-01

    Full Text Available Hashing algorithms have long been widely adopted to design fast address look-up processes, which involve searching through a large database to find the record associated with a given key. Hashing algorithms transform the key inside each target record into a hash value, in the hope that the hashing renders the database uniformly distributed with respect to this new value. The closer the final distribution is to uniform, the less search time is required when a query is made. When the database is already key-wise uniformly distributed, any regular hashing algorithm, such as bit extraction or bit-group XOR, easily leads to a statistically perfect uniform distribution after hashing. On the other hand, if the records in the database are not uniformly distributed, as in almost all known practical applications, then different regular hash functions lead to very different performance. When the target database has a key with a highly skewed value distribution, the performance delivered by regular hashing algorithms usually becomes far from desirable. This paper aims at designing a hashing algorithm that achieves the highest probability of producing a uniformly distributed hash result from a non-uniformly distributed database. An analytical pre-process on the original database is first performed to extract critical information that significantly benefits the design of a better hashing algorithm. This process includes sorting the bits of the key to prioritize their use in the XOR hashing sequence, in simple bit extraction, or in a combination of both. Such an ad hoc hash design is critical for adapting to real-time situations in which there exists a changing (and/or expanding) database with an irregular, non-uniform distribution. Simulation results show significant improvement on randomly generated data as well as real data.
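
    A minimal sketch of the general idea, with function names and the pairing scheme invented for illustration rather than taken from the paper: rank the key bits by how balanced they are across the database, then combine the best-balanced bits by bit-group XOR.

```python
# Sketch: analyze the key distribution first, then build a hash from the
# most "balanced" bit positions via extraction and bit-group XOR.

def bit_balance(keys, nbits=32):
    """Score each bit position by how close it is to a 50/50 split."""
    n = len(keys)
    scores = []
    for b in range(nbits):
        ones = sum((k >> b) & 1 for k in keys)
        scores.append((b, abs(ones / n - 0.5)))  # 0.0 = perfectly balanced
    return sorted(scores, key=lambda s: s[1])    # best bits first

def make_hash(keys, hash_bits=8, nbits=32):
    """Hash that XORs the two best-balanced source bits per output bit."""
    ranked = [b for b, _ in bit_balance(keys, nbits)]
    pairs = list(zip(ranked[:hash_bits], ranked[hash_bits:2 * hash_bits]))

    def h(key):
        out = 0
        for i, (b1, b2) in enumerate(pairs):
            out |= (((key >> b1) ^ (key >> b2)) & 1) << i
        return out
    return h

# Usage: skewed IP-like keys (all in 10.0.0.0/8) hashed into 256 buckets.
keys = [0x0A000000 | i * 7 for i in range(1000)]
h = make_hash(keys)
print(h(keys[0]), h(keys[1]))
```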

  19. Sweden's Future Climate in the short- and medium-term perspective. Basis for development of climate adaptation tools; Sveriges framtida klimat paa kort och medellaang sikt. Underlag foer utveckling av verktyg foer klimatanpassning

    Energy Technology Data Exchange (ETDEWEB)

    Carlsen, Henrik; Parmhed, Oskar

    2008-12-15

    This report studies two questions: how actual greenhouse gas emissions have changed over time in comparison with previous assumptions, and what will happen to Sweden's climate in the short- and medium-term perspective given the future development of the climate in general. The content of this report shall be used as the basis for continued work on the development of climate adaptation tools to be used primarily in Sweden. The first section of the report presents the development of actual greenhouse gas emissions in recent years. Eight years have passed since the IPCC published its emissions scenarios (SRES), so a comparison between the emission levels from that report and observations made in recent years is overdue. Measurements show that current emissions of carbon dioxide from fossil fuels exceed the average of the SRES families as a whole. Among other implications, this means that emission levels are significantly higher than the levels assumed in the climate scenarios of the Swedish Commission on Climate and Vulnerability (Klimat- och saarbarhetsutredningen). There is little support in the observational data to indicate that this rate of increase will subside. In addition, studies indicate that the increased emissions are not solely attributable to population growth and increased wealth (GNP/capita). Another aspect is that carbon intensity, measured as the amount of carbon dioxide per unit of economic output, has been rising since 2000. This is entirely contrary to the assumption, made in all of the SRES emissions scenarios, that carbon dioxide efficiency would continue to increase. The report therefore concludes that work on climate adaptation in Sweden must, to a greater extent than today, take into consideration the consequences of emission levels that are in line with or exceed the highest SRES levels. The second section of the report takes up the second question: the expected development of Sweden's climate in the short- and medium-term perspective.

  20. Long-term adaptation of Daphnia to toxic environment in Lake Orta: the effects of short-term exposure to copper and acidification

    Directory of Open Access Journals (Sweden)

    Marina MANCA

    2010-08-01

    Full Text Available Because of its 80-year history of heavy pollution and re-colonization, Lake Orta provides a good opportunity for investigating the response of zooplankton organisms to heavy metals and acidification, as well as the mechanisms involved. After the recent establishment of Daphnia galeata Sars, and the detection of an extremely low clonal diversity in the Lake Orta population, we carried out a study to investigate the lethal tolerance to ionic copper and to acidity, and the impact of exposing newborn Daphnia to sublethal concentrations of copper on their later development and reproduction. We conducted acute toxicity tests to estimate the EC50 for ionic copper and the tolerance to low pH, as well as life-table experiments. Tolerance to ionic copper was high, three times that reported in the literature. An increased mortality soon after exposure to low pH confirmed a high sensitivity to acidity and explained the success of the species in Lake Orta only after pH recovery. An analysis of reproductive and demographic parameters revealed that D. galeata Sars was stressed at concentrations of ionic copper only twice as high as those presently recorded in the lake (i.e., ca 3 μg L-1). An increased cumulative number of eggs produced by each female was in fact counterbalanced by an increasing abortion rate, which resulted in an unaltered or lower intrinsic rate of population increase. Our results are likely due to strong selective pressure, rather than physiological processes (acclimation), in a polluted area in which only specifically adapted clones are able to grow, confirming the results previously obtained on Lake Orta's D. obtusa Kurz population. The reproductive response and the relatively low within-treatment variability suggest that clone specificity, rather than physiological acclimation, was the driving force. The low variability confirmed results previously obtained from life-table experiments on Lake Orta's D. obtusa clone. Overall, our results point to the long-term selection of tolerant clones, rather than acclimation, as the basis of Daphnia's persistence in Lake Orta.
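
    As a side note on the demographic parameters mentioned above, the intrinsic rate of population increase r is conventionally obtained from life-table data via the Euler-Lotka equation, sum over x of l(x) m(x) exp(-r x) = 1. Below is a minimal sketch with invented survival and fecundity schedules; the numbers are not from the study.

```python
# Hedged sketch: solve the Euler-Lotka equation for the intrinsic rate of
# increase r from (hypothetical) life-table data.
import numpy as np
from scipy.optimize import brentq

def intrinsic_rate(survival, fecundity, ages):
    """Root of sum(l(x) * m(x) * exp(-r * x)) - 1 = 0."""
    f = lambda r: np.sum(survival * fecundity * np.exp(-r * ages)) - 1.0
    return brentq(f, -2.0, 2.0)

ages = np.arange(1, 8)                                       # days, invented
survival = np.array([1.0, 0.95, 0.9, 0.85, 0.8, 0.7, 0.6])   # l(x), invented
fecundity = np.array([0.0, 0.0, 2.0, 3.0, 3.0, 2.0, 1.0])    # m(x), eggs/day
print(f"r = {intrinsic_rate(survival, fecundity, ages):.3f}")
```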