WorldWideScience

Sample records for computer adaptive testing

  1. Test Anxiety, Computer-Adaptive Testing and the Common Core

    Science.gov (United States)

    Colwell, Nicole Makas

    2013-01-01

    This paper highlights the current findings and issues regarding the role of computer-adaptive testing in test anxiety. The computer-adaptive test (CAT) proposed by one of the Common Core consortia brings these issues to the forefront. Research has long indicated that test anxiety impairs student performance. More recent research indicates that…

  2. Computerized adaptive testing in computer assisted learning?

    NARCIS (Netherlands)

    Veldkamp, Bernard P.; Matteucci, Mariagiulia; Eggen, Theodorus Johannes Hendrikus Maria; De Wannemacker, Stefan; Clarebout, Geraldine; De Causmaecker, Patrick

    2011-01-01

    A major goal in computerized learning systems is to optimize learning, while in computerized adaptive tests (CAT) the main focus is efficient measurement of students' proficiency. There seems to be a common interest in integrating computerized adaptive item selection in learning systems and

  3. Computer Adaptive Testing, Big Data and Algorithmic Approaches to Education

    Science.gov (United States)

    Thompson, Greg

    2017-01-01

    This article critically considers the promise of computer adaptive testing (CAT) and digital data to provide better and quicker data that will improve the quality, efficiency and effectiveness of schooling. In particular, it uses the case of the Australian NAPLAN test that will become an online, adaptive test from 2016. The article argues that…

  4. Simple and Effective Algorithms: Computer-Adaptive Testing.

    Science.gov (United States)

    Linacre, John Michael

    Computer-adaptive testing (CAT) allows improved security, greater scoring accuracy, shorter testing periods, quicker availability of results, and reduced guessing and other undesirable test behavior. Simple approaches can be applied by the classroom teacher, or other content specialist, who possesses simple computer equipment and elementary…
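
    A minimal sketch of the kind of "simple approach" this record alludes to (the item difficulties, update rule, and stopping rule below are illustrative assumptions, not Linacre's exact algorithm): administer the unused item closest to the current ability estimate, nudge the estimate after each response, and shrink the step as the test settles.

```python
import math
import random

def simple_cat(item_bank, answer_fn, max_items=10):
    """Minimal Rasch-style adaptive test: administer the unused item whose
    difficulty is closest to the current ability estimate, then nudge the
    estimate up or down after each response, halving the step each time."""
    theta, step = 0.0, 1.0                  # start at average ability
    used = set()
    for _ in range(max_items):
        item = min((i for i in range(len(item_bank)) if i not in used),
                   key=lambda i: abs(item_bank[i] - theta))
        used.add(item)
        theta += step if answer_fn(item) else -step
        step = max(step / 2, 0.1)           # smaller moves as the test settles
    return theta

# Demo: a simulated examinee with true ability 0.8 answering Rasch items.
bank = [-2.0, -1.0, -0.5, 0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0]
p_correct = lambda i: 1 / (1 + math.exp(-(0.8 - bank[i])))
print(simple_cat(bank, lambda i: random.random() < p_correct(i)))
```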

  5. Computer-Adaptive Testing in Second Language Contexts.

    Science.gov (United States)

    Chalhoub-Deville, Micheline; Deville, Craig

    1999-01-01

    Provides a broad overview of computerized testing issues with an emphasis on computer-adaptive testing (CAT). A survey of the potential benefits and drawbacks of CAT is given, the process of CAT development is described, and some L2 instruments developed to assess various language skills are summarized. (Author/VWL)

  6. Computer Adaptive Multistage Testing: Practical Issues, Challenges and Principles

    Directory of Open Access Journals (Sweden)

    Halil Ibrahim SARI

    2016-12-01

    Full Text Available The purpose of many tests in educational and psychological measurement is to estimate test takers' latent trait scores from responses given to a set of items. Over the years, this has been done by traditional methods (paper and pencil tests). However, compared to other test administration models (e.g., adaptive testing), traditional methods are extensively criticized for producing low measurement accuracy and long test lengths. Adaptive testing has been proposed to overcome these problems. There are two popular adaptive testing approaches: computerized adaptive testing (CAT) and computer adaptive multistage testing (ca-MST). The former is a well-known approach that has been predominantly used in this field. We believe that researchers and practitioners are fairly familiar with many aspects of CAT because it has more than a hundred years of history. However, the same is not true for the latter. Since ca-MST is relatively new, many researchers are not familiar with its features. The purpose of this study is to closely examine the characteristics of ca-MST, including its working principle, the adaptation procedure called the routing method, test assembly, and scoring, and to provide an overview for researchers, with the aim of drawing their attention to ca-MST and encouraging them to contribute to research in this area. Books, software, and future work for ca-MST are also discussed.
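
    To make the routing idea concrete, here is a hypothetical two-stage ca-MST sketch (the module contents and number-correct cut scores are assumptions for the example, not from the article): every examinee takes a routing module, and the score on it routes them to an easy, medium, or hard second-stage module.

```python
import random

def administer_mst(modules, answer_fn, cuts=(2, 4)):
    """Two-stage ca-MST sketch. `modules` maps module names to item lists;
    `answer_fn(item)` returns 1 for a correct response, 0 otherwise.
    Number-correct routing: low scores go to the easy module, high scores
    to the hard module, and everyone else to the medium module."""
    score1 = sum(answer_fn(item) for item in modules["routing"])
    low_cut, high_cut = cuts                 # illustrative cut scores
    if score1 <= low_cut:
        stage2 = modules["easy"]
    elif score1 >= high_cut:
        stage2 = modules["hard"]
    else:
        stage2 = modules["medium"]
    score2 = sum(answer_fn(item) for item in stage2)
    return {"routing_score": score1, "total_score": score1 + score2}

# Example with dummy items answered at random:
mods = {name: [f"{name}-{k}" for k in range(5)]
        for name in ("routing", "easy", "medium", "hard")}
print(administer_mst(mods, lambda item: random.random() < 0.6))
```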

  7. Computer-Adaptive Testing: Implications for Students' Achievement, Motivation, Engagement, and Subjective Test Experience

    Science.gov (United States)

    Martin, Andrew J.; Lazendic, Goran

    2018-01-01

    The present study investigated the implications of computer-adaptive testing (operationalized by way of multistage adaptive testing; MAT) and "conventional" fixed order computer testing for various test-relevant outcomes in numeracy, including achievement, test-relevant motivation and engagement, and subjective test experience. It did so…

  8. Improving personality facet scores with multidimensional computer adaptive testing

    DEFF Research Database (Denmark)

    Makransky, Guido; Mortensen, Erik Lykke; Glas, Cees A W

    2013-01-01

    …personality tests contain many highly correlated facets. This article investigates the possibility of increasing the precision of the NEO PI-R facet scores by scoring items with multidimensional item response theory and by efficiently administering and scoring items with multidimensional computer adaptive...

  9. Applying computer adaptive testing to optimize online assessment of suicidal behavior: a simulation study.

    NARCIS (Netherlands)

    de Beurs, D.P.; de Vries, A.L.M.; de Groot, M.H.; de Keijser, J.; Kerkhof, A.J.F.M.

    2014-01-01

    Background: The Internet is used increasingly for both suicide research and prevention. To optimize online assessment of suicidal patients, there is a need for short, good-quality tools to assess elevated risk of future suicidal behavior. Computer adaptive testing (CAT) can be used to reduce

  10. Adaptive test

    DEFF Research Database (Denmark)

    Kjeldsen, Lars Peter; Eriksen, Mette Rose

    2010-01-01

    The article is an evaluation of the adaptive tests that were introduced in the Danish public school (folkeskolen). In particular, it focuses on assessment in the public school, contributing guidance on evaluation, evaluation tools, and subject-specific assessment materials.

  11. simulate_CAT: A Computer Program for Post-Hoc Simulation for Computerized Adaptive Testing

    Directory of Open Access Journals (Sweden)

    İlker Kalender

    2015-06-01

    Full Text Available This paper presents computer software developed by the author. The software conducts post-hoc simulations for computerized adaptive testing based on examinees' real responses to paper and pencil tests, under different parameters that can be defined by the user. The paper first gives brief information about post-hoc simulations. It then describes the working principle of the software and shows a sample simulation with the required input files. Finally, the output files are described.
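
    The working principle of a post-hoc simulation can be sketched in a few lines: items are selected adaptively, but each "response" is simply looked up in the examinee's real paper-and-pencil answer record rather than generated from a model. The dichotomous Rasch model, EAP estimator, and fixed test length below are simplifying assumptions for illustration, not the software's actual options.

```python
import math

def eap(bs, xs, grid=None):
    """Expected-a-posteriori ability estimate over a grid, N(0,1) prior.
    bs: difficulties of administered items; xs: 0/1 responses."""
    grid = grid or [g / 10 for g in range(-40, 41)]
    post = []
    for t in grid:
        like = math.exp(-0.5 * t * t)        # normal prior weight
        for b, x in zip(bs, xs):
            p = 1 / (1 + math.exp(-(t - b)))
            like *= p if x else (1 - p)
        post.append(like)
    z = sum(post)
    return sum(t * w for t, w in zip(grid, post)) / z

def posthoc_cat(recorded, difficulties, test_length=15):
    """Post-hoc CAT simulation: adaptive item selection, but responses are
    read from the examinee's recorded paper-and-pencil answers."""
    theta, used, bs, xs = 0.0, set(), [], []
    for _ in range(test_length):
        item = min((i for i in range(len(difficulties)) if i not in used),
                   key=lambda i: abs(difficulties[i] - theta))
        used.add(item)
        bs.append(difficulties[item])
        xs.append(recorded[item])            # look up the real answer
        theta = eap(bs, xs)                  # re-estimate after each item
    return theta
```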

  12. The impacts of computer adaptive testing from a variety of perspectives

    Directory of Open Access Journals (Sweden)

    Tetsuo Kimura

    2017-05-01

    Full Text Available Computer adaptive testing (CAT is a kind of tailored testing, in that it is a form of computer-based testing that is adaptive to each test-taker’s ability level. In this review, the impacts of CAT are discussed from different perspectives in order to illustrate crucial points to keep in mind during the development and implementation of CAT. Test developers and psychometricians often emphasize the efficiency and accuracy of CAT in comparison to traditional linear tests. However, many test-takers report feeling discouraged after taking CATs, and this feeling can reduce learning self-efficacy and motivation. A trade-off must be made between the psychological experiences of test-takers and measurement efficiency. From the perspective of educators and subject matter experts, nonstatistical specifications, such as content coverage, content balance, and form length are major concerns. Thus, accreditation bodies may be faced with a discrepancy between the perspectives of psychometricians and those of subject matter experts. In order to improve test-takers’ impressions of CAT, the author proposes increasing the target probability of answering correctly in the item selection algorithm even if doing so consequently decreases measurement efficiency. Two different methods, CAT with a shadow test approach and computerized multistage testing, have been developed in order to ensure the satisfaction of subject matter experts. In the shadow test approach, a full-length test is assembled that meets the constraints and provides maximum information at the current ability estimate, while computerized multistage testing gives subject matter experts an opportunity to review all test forms prior to administration.

  13. Adjusting for cross-cultural differences in computer-adaptive tests of quality of life.

    Science.gov (United States)

    Gibbons, C J; Skevington, S M

    2018-04-01

    Previous studies using the WHOQOL measures have demonstrated that the relationship between individual items and the underlying quality of life (QoL) construct may differ between cultures. If unaccounted for, these differing relationships can lead to measurement bias which, in turn, can undermine the reliability of results. We used item response theory (IRT) to assess differential item functioning (DIF) in WHOQOL data from diverse language versions collected in UK, Zimbabwe, Russia, and India (total N = 1332). Data were fitted to the partial credit 'Rasch' model. We used four item banks previously derived from the WHOQOL-100 measure, which provided excellent measurement for physical, psychological, social, and environmental quality of life domains (40 items overall). Cross-cultural differential item functioning was assessed using analysis of variance for item residuals and post hoc Tukey tests. Simulated computer-adaptive tests (CATs) were conducted to assess the efficiency and precision of the four items banks. Splitting item parameters by DIF results in four linked item banks without DIF or other breaches of IRT model assumptions. Simulated CATs were more precise and efficient than longer paper-based alternatives. Assessing differential item functioning using item response theory can identify measurement invariance between cultures which, if uncontrolled, may undermine accurate comparisons in computer-adaptive testing assessments of QoL. We demonstrate how compensating for DIF using item anchoring allowed data from all four countries to be compared on a common metric, thus facilitating assessments which were both sensitive to cultural nuance and comparable between countries.
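
    For reference, the partial credit 'Rasch' model to which the WHOQOL data were fitted has the standard form below, where θ_n is the person location, δ_ik the thresholds of item i, and m_i the item's maximum score:

```latex
% Partial credit model: probability that person n scores x (of 0..m_i)
% on item i, with the convention that the k = 0 term is zero.
\[
P(X_{ni} = x) =
  \frac{\exp\!\Big(\sum_{k=0}^{x}\big(\theta_n - \delta_{ik}\big)\Big)}
       {\sum_{j=0}^{m_i}\exp\!\Big(\sum_{k=0}^{j}\big(\theta_n - \delta_{ik}\big)\Big)},
\qquad \delta_{i0} \equiv 0 .
\]
```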

  14. Lessons Learned in Designing and Implementing a Computer-Adaptive Test for English

    Directory of Open Access Journals (Sweden)

    Jack Burston

    2014-09-01

    Full Text Available This paper describes the lessons learned in designing and implementing a computer-adaptive test (CAT) for English. The early identification of students with weak L2 English proficiency is of critical importance in university settings that have compulsory English language course graduation requirements. The most efficient means of diagnosing the L2 English ability of incoming students is a computer-based test, since such evaluation can be administered quickly, corrected automatically, and the outcome known as soon as the test is completed. While the option of using a commercial CAT is available to institutions with the ability to pay substantial annual fees, or the means of passing these expenses on to their students, language instructors without these resources can only avail themselves of the advantages of CAT evaluation by creating their own tests. As is demonstrated by the E-CAT project described in this paper, this is a viable alternative even for those lacking any computer programming expertise. However, language teaching experience and testing expertise are critical to such an undertaking, which requires considerable effort and, above all, collaborative teamwork to succeed. A number of practical skills are also required. Firstly, the operation of a CAT authoring programme must be learned. Once this is done, test makers must master the art of creating a question database and assigning difficulty levels to test items. Lastly, if multimedia resources are to be exploited in a CAT, test creators need to be able to locate suitable copyright-free resources and re-edit them as needed.

  15. Computer adaptive test performance in children with and without disabilities: prospective field study of the PEDI-CAT

    NARCIS (Netherlands)

    Dumas, H.M.; Fragala-Pinkham, M.A.; Haley, S.M.; Ni, P.; Coster, W.; Kramer, J.M.; Kao, Y.C.; Moed, R.; Ludlow, L.H.

    2012-01-01

    PURPOSE: To examine the discriminant validity, test-retest reliability, administration time and acceptability of the pediatric evaluation of disability inventory computer adaptive test (PEDI-CAT). METHODS: A sample of 102 parents of children 3 through 20 years of age with (n = 50) and without (n =

  16. Validation of a computer-adaptive test to evaluate generic health-related quality of life

    Directory of Open Access Journals (Sweden)

    Zardaín Pilar C

    2010-12-01

    Full Text Available Abstract Background Health Related Quality of Life (HRQoL) is a relevant variable in the evaluation of health outcomes. Questionnaires based on Classical Test Theory typically require a large number of items to evaluate HRQoL. Computer Adaptive Testing (CAT) can be used to reduce test length while maintaining and, in some cases, improving accuracy. This study aimed at validating a CAT based on Item Response Theory (IRT) for evaluation of generic HRQoL: the CAT-Health instrument. Methods Cross-sectional study of subjects aged over 18 attending Primary Care Centres for any reason. CAT-Health was administered along with the SF-12 Health Survey. Age, gender and a checklist of chronic conditions were also collected. CAT-Health was evaluated considering: 1) feasibility: completion time and test length; 2) content range coverage, Item Exposure Rate (IER) and test precision; and 3) construct validity: differences in the CAT-Health scores according to clinical variables and correlations between both questionnaires. Results 396 subjects answered CAT-Health and SF-12, 67.2% females, mean age (SD) 48.6 (17.7) years. 36.9% did not report any chronic condition. Median completion time for CAT-Health was 81 seconds (IQ range = 59-118) and it increased with age (p …). Conclusions Although domain-specific CATs exist for various areas of HRQoL, CAT-Health is one of the first IRT-based CATs designed to evaluate generic HRQoL, and it has proven feasible, valid and efficient when administered to a broad sample of individuals attending primary care settings.

  17. The EORTC computer-adaptive tests measuring physical functioning and fatigue exhibited high levels of measurement precision and efficiency

    DEFF Research Database (Denmark)

    Petersen, Morten Aa; Aaronson, Neil K; Arraras, Juan I

    2013-01-01

    The European Organisation for Research and Treatment of Cancer (EORTC) Quality of Life Group is developing a computer-adaptive test (CAT) version of the EORTC Quality of Life Questionnaire (QLQ-C30). We evaluated the measurement properties of the CAT versions of physical functioning (PF...

  18. Development of a Postacute Hospital Item Bank for the New Pediatric Evaluation of Disability Inventory-Computer Adaptive Test

    Science.gov (United States)

    Dumas, Helene M.

    2010-01-01

    The PEDI-CAT is a new computer adaptive test (CAT) version of the Pediatric Evaluation of Disability Inventory (PEDI). Additional PEDI-CAT items specific to postacute pediatric hospital care were recently developed using expert reviews and cognitive interviewing techniques. Expert reviews established face and construct validity, providing positive…

  19. The EORTC computer-adaptive tests measuring physical functioning and fatigue exhibited high levels of measurement precision and efficiency

    NARCIS (Netherlands)

    Petersen, M.A.; Aaronson, N.K.; Arraras, J.I.; Chie, W.C.; Conroy, T.; Constantini, A.; Giesinger, J.M.; Holzner, B.; King, M.T.; Singer, S.; Velikova, G.; Verdonck-de Leeuw, I.M.; Young, T.; Groenvold, M.

    2013-01-01

    Objectives The European Organisation for Research and Treatment of Cancer (EORTC) Quality of Life Group is developing a computer-adaptive test (CAT) version of the EORTC Quality of Life Questionnaire (QLQ-C30). We evaluated the measurement properties of the CAT versions of physical functioning (PF)

  20. The EORTC computer-adaptive tests measuring physical functioning and fatigue exhibited high levels of measurement precision and efficiency

    NARCIS (Netherlands)

    Petersen, M.A.; Aaronson, N.K.; Arraras, J.I.; Chie, W.C.; Conroy, T.; Costantini, A.; Giesinger, J.M.; Holzner, B.; King, M.T.; Singer, S.; Velikova, G.; de Leeuw, I.M.; Young, T.; Groenvold, M.

    2013-01-01

    Objectives: The European Organisation for Research and Treatment of Cancer (EORTC) Quality of Life Group is developing a computer-adaptive test (CAT) version of the EORTC Quality of Life Questionnaire (QLQ-C30). We evaluated the measurement properties of the CAT versions of physical functioning (PF)

  1. Statistical Indexes for Monitoring Item Behavior under Computer Adaptive Testing Environment.

    Science.gov (United States)

    Zhu, Renbang; Yu, Feng; Liu, Su

    A computerized adaptive test (CAT) administration usually requires a large supply of items with accurately estimated psychometric properties, such as item response theory (IRT) parameter estimates, to ensure the precision of examinee ability estimation. However, an estimated IRT model of a given item in any given pool does not always correctly…

  2. A Gaussian mixture model based adaptive classifier for fNIRS brain-computer interfaces and its testing via simulation

    Science.gov (United States)

    Li, Zheng; Jiang, Yi-han; Duan, Lian; Zhu, Chao-zhe

    2017-08-01

    Objective. Functional near infra-red spectroscopy (fNIRS) is a promising brain imaging technology for brain-computer interfaces (BCI). Future clinical uses of fNIRS will likely require operation over long time spans, during which neural activation patterns may change. However, current decoders for fNIRS signals are not designed to handle changing activation patterns. The objective of this study is to test via simulations a new adaptive decoder for fNIRS signals, the Gaussian mixture model adaptive classifier (GMMAC). Approach. GMMAC can simultaneously classify and track activation pattern changes without the need for ground-truth labels. This adaptive classifier uses computationally efficient variational Bayesian inference to label new data points and update mixture model parameters, using the previous model parameters as priors. We test GMMAC in simulations in which neural activation patterns change over time and compare to static decoders and unsupervised adaptive linear discriminant analysis classifiers. Main results. Our simulation experiments show GMMAC can accurately decode under time-varying activation patterns: shifts of activation region, expansions of activation region, and combined contractions and shifts of activation region. Furthermore, the experiments show the proposed method can track the changing shape of the activation region. Compared to prior work, GMMAC performed significantly better than the other unsupervised adaptive classifiers on a difficult activation pattern change simulation: 99% versus … . Significance. GMMAC may be useful for future brain-computer interfaces, including neurofeedback training systems, where operation over long time spans is required.
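
    The paper's GMMAC updates a Gaussian mixture with variational Bayesian inference; as a much-simplified stand-in for the underlying idea, the toy below labels each unlabeled point with the nearest class mean and then nudges that mean toward the point, so the decision rule follows a slowly drifting activation pattern. Everything here (the classifier, learning rate, and data) is illustrative, not the authors' algorithm.

```python
import numpy as np

class DriftTracker:
    """Toy unsupervised adaptive classifier: nearest-centroid labeling
    plus incremental mean updates, tracking drift without ground truth."""
    def __init__(self, init_means, lr=0.05):
        self.means = np.asarray(init_means, dtype=float)
        self.lr = lr

    def predict_and_adapt(self, x):
        x = np.asarray(x, dtype=float)
        label = int(np.argmin(np.linalg.norm(self.means - x, axis=1)))
        self.means[label] += self.lr * (x - self.means[label])  # follow drift
        return label

# Demo: class 1's true mean drifts; the tracked mean follows it.
clf = DriftTracker(init_means=[[0.0, 0.0], [1.0, 1.0]])
for t in range(200):
    sample = np.array([1.0 + t * 0.01, 1.0]) + np.random.randn(2) * 0.1
    clf.predict_and_adapt(sample)
print(clf.means)
```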

  3. The use of computer adaptive tests in outcome assessments following upper limb trauma.

    Science.gov (United States)

    Jayakumar, P; Overbeek, C; Vranceanu, A-M; Williams, M; Lamb, S; Ring, D; Gwilym, S

    2018-06-01

    Aims Outcome measures quantifying aspects of health in a precise, efficient, and user-friendly manner are in demand. Computer adaptive tests (CATs) may overcome the limitations of established fixed scales and be more adept at measuring outcomes in trauma. The primary objective of this review was to gain a comprehensive understanding of the psychometric properties of CATs compared with fixed-length scales in the assessment of outcome in patients who have suffered trauma of the upper limb. Study designs, outcome measures, and methodological quality are defined, along with trends in investigation. Materials and Methods A search of multiple electronic databases was undertaken on 1 January 2017 with terms related to "CATs", "orthopaedics", "trauma", and "anatomical regions". Studies involving adults suffering trauma to the upper limb, and undergoing any intervention, were eligible. Those involving the measurement of outcome with any CATs were included. Identification, screening, and eligibility were undertaken, followed by the extraction of data and quality assessment using the Consensus-Based Standards for the Selection of Health Measurement Instruments (COSMIN) criteria. The review is reported according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) criteria and registered (PROSPERO: CRD42016053886). Results A total of 31 studies reported trauma conditions alone, or in combination with non-traumatic conditions, using CATs. Most were cross-sectional, with varying levels of evidence, numbers of patients, types of study, ranges of conditions, and methodological quality. CATs correlated well with fixed scales and had minimal or no floor-ceiling effects. They required significantly fewer questions and/or less time for completion. Patient-Reported Outcomes Measurement Information System (PROMIS) CATs were the most frequently used, and the use of CATs is increasing. Conclusion Early studies show valid and reliable outcome measurement with CATs

  4. Validity of Cognitive ability tests – comparison of computerized adaptive testing with paper and pencil and computer-based forms of administrations

    Czech Academy of Sciences Publication Activity Database

    Žitný, P.; Halama, P.; Jelínek, Martin; Květon, Petr

    2012-01-01

    Roč. 54, č. 3 (2012), s. 181-194 ISSN 0039-3320 R&D Projects: GA ČR GP406/09/P284 Institutional support: RVO:68081740 Keywords : item response theory * computerized adaptive testing * paper and pencil * computer-based * criterion and construct validity * efficiency Subject RIV: AN - Psychology Impact factor: 0.215, year: 2012

  5. Glove-Enabled Computer Operations (GECO): Design and Testing of an Extravehicular Activity Glove Adapted for Human-Computer Interface

    Science.gov (United States)

    Adams, Richard J.; Olowin, Aaron; Krepkovich, Eileen; Hannaford, Blake; Lindsay, Jack I. C.; Homer, Peter; Patrie, James T.; Sands, O. Scott

    2013-01-01

    The Glove-Enabled Computer Operations (GECO) system enables an extravehicular activity (EVA) glove to be dual-purposed as a human-computer interface device. This paper describes the design and human participant testing of a right-handed GECO glove in a pressurized glove box. As part of an investigation into the usability of the GECO system for EVA data entry, twenty participants were asked to complete activities including (1) a Simon Says game in which they attempted to duplicate random sequences of targeted finger strikes and (2) a Text Entry activity in which they used the GECO glove to enter target phrases in two different virtual keyboard modes. In a within-subjects design, both activities were performed both with and without vibrotactile feedback. Participants' mean accuracies in correctly generating finger strikes with the pressurized glove were surprisingly high, both with and without the benefit of tactile feedback. Five of the subjects achieved mean accuracies exceeding 99% in both conditions. In Text Entry, tactile feedback provided a statistically significant performance benefit, quantified by characters entered per minute, as well as a reduction in error rate. Secondary analyses of responses to NASA Task Load Index (TLX) subjective workload assessments reveal a benefit for tactile feedback in GECO glove use for data entry. This first-ever investigation of employment of a pressurized EVA glove for human-computer interface opens up a wide range of future applications, including text chat communications, manipulation of procedures/checklists, cataloguing/annotating images, scientific note taking, human-robot interaction, and control of suit and/or other EVA systems.

  6. Improving Inpatient Surveys: Web-Based Computer Adaptive Testing Accessed via Mobile Phone QR Codes.

    Science.gov (United States)

    Chien, Tsair-Wei; Lin, Weir-Sen

    2016-03-02

    The National Health Service (NHS) 70-item inpatient questionnaire surveys inpatients on their perceptions of their hospitalization experience. However, it imposes more burden on the patient than other similar surveys. The literature shows that computerized adaptive testing (CAT) based on item response theory can help shorten the item length of a questionnaire without compromising its precision. Our aim was to investigate whether CAT can be (1) efficient with item reduction and (2) used with quick response (QR) codes scanned by mobile phones. After downloading the 2008 inpatient survey data from the Picker Institute Europe website and analyzing the difficulties of this 70-item questionnaire, we used an author-made Excel program using the Rasch partial credit model to simulate 1000 patients' true scores following a standard normal distribution. The CAT was compared to two other scenarios, answering all items (AAI) and the randomized selection method (RSM), as we investigated item length (efficiency) and measurement accuracy. The author-made Web-based CAT program for gathering patient feedback was effectively accessed from mobile phones by scanning the QR code. We found that the CAT can be more efficient for patients answering questions (i.e., fewer items to respond to) than either AAI or RSM without compromising its measurement accuracy. A Web-based CAT inpatient survey accessed by scanning a QR code on a mobile phone was viable for gathering inpatient satisfaction responses. With advances in technology, patients can now be offered alternatives for providing feedback about hospitalization satisfaction. This Web-based CAT is a possible option in health care settings for reducing the number of survey items, as well as offering an innovative QR code access.

  7. Adaptation, testing and application of the two-dimensional FE computer program system for steam generator tube testing

    International Nuclear Information System (INIS)

    Betzold, K.

    1987-01-01

    The 2d-FE computer program system, taken over by EPRI, is used for the improvement of the eddy current test of steam generator heating tubes. The investigations focus on test tasks in the area of the tube plate and the scrap mark, among them: accumulation of mud in the cracking area and above the tube plate; circulating slots with and without accumulation of mud. The interaction of the factors of influence given by the test object and the parameters selectable by the tester, for example coil length and base spacing for absolute and differential coils as well as test frequencies, is calculated, and the forms of the signal locus curves and the dynamic curves are listed in a sample catalogue. It is demonstrated with selected examples that the sample catalogue contributes to: the test-specific design of the coil and the choice of the test frequencies; the interpretation of measured signals; and the deepening of knowledge of the physical processes in eddy current tests. (orig./HP) [de

  8. Hybrid GPU-CPU adaptive precision ray-triangle intersection tests for robust high-performance GPU dosimetry computations

    International Nuclear Information System (INIS)

    Perrotte, Lancelot; Bodin, Bruno; Chodorge, Laurent

    2011-01-01

    Before an intervention on a nuclear site, it is essential to study different scenarios to identify the least dangerous one for the operator. It is therefore mandatory to have an efficient dosimetry simulation code that produces accurate results. One classical method in radiation protection is the straight-line attenuation method with build-up factors. In the case of 3D industrial scenes composed of meshes, the computation cost resides in the fast computation of all of the intersections between the rays and the triangles of the scene. Efficient GPU algorithms have already been proposed that enable dosimetry calculation for a huge scene (800,000 rays, 800,000 triangles) in a fraction of a second. But these algorithms are not robust: because of the rounding caused by floating-point arithmetic, the numerical results of the ray-triangle intersection tests can differ from the expected mathematical results. In the worst case, this can lead to a computed dose rate dramatically inferior to the real dose rate to which the operator is exposed. In this paper, we present a hybrid GPU-CPU algorithm to manage adaptive-precision floating-point arithmetic. This algorithm allows robust ray-triangle intersection tests, with a very small loss of performance (less than 5% overhead) and without any need for scene-dependent tuning. (author)
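
    The adaptive-precision idea can be sketched with a 2D orientation predicate, a building block of ray-triangle tests: evaluate in fast floating point, and re-evaluate exactly only when the floating-point result is too close to zero to be trusted. The fixed epsilon below is a placeholder; robust implementations derive a true forward error bound (and, as in the paper, run the fast path on the GPU and route only ambiguous cases to the exact CPU path).

```python
from fractions import Fraction

def orient2d(a, b, c, eps=1e-12):
    """Sign of the 2D orientation determinant for points a, b, c.
    Fast path: plain floating point. Slow path: exact rational
    arithmetic, used only when |det| is within the (placeholder)
    uncertainty eps and the float verdict cannot be trusted."""
    det = (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])
    if abs(det) > eps:                       # float result is decisive
        return (det > 0) - (det < 0)
    # Exact fallback: Fraction(float) converts without rounding error.
    ax, ay = Fraction(a[0]), Fraction(a[1])
    bx, by = Fraction(b[0]), Fraction(b[1])
    cx, cy = Fraction(c[0]), Fraction(c[1])
    det = (bx - ax) * (cy - ay) - (by - ay) * (cx - ax)
    return (det > 0) - (det < 0)

# A nearly collinear case where naive floats may give the wrong sign:
print(orient2d((0.0, 0.0), (1e16, 1e16), (1e16 + 1, 1e16 + 1)))
```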

  9. Development of an item bank for the EORTC Role Functioning Computer Adaptive Test (EORTC RF-CAT)

    DEFF Research Database (Denmark)

    Gamper, Eva-Maria; Petersen, Morten Aa.; Aaronson, Neil

    2016-01-01

    …a computer-adaptive test (CAT) for RF. This was part of a larger project whose objective is to develop a CAT version of the EORTC QLQ-C30, which is one of the most widely used HRQOL instruments in oncology. METHODS: In accordance with EORTC guidelines, the development of the RF-CAT comprised four phases … with good psychometric properties. The resulting item bank exhibits excellent reliability (mean reliability = 0.85, median = 0.95). Using the RF-CAT may allow sample size savings from 11% up to 50% compared to using the QLQ-C30 RF scale. CONCLUSIONS: The RF-CAT item bank improves the precision…

  10. Development of a lack of appetite item bank for computer-adaptive testing (CAT)

    DEFF Research Database (Denmark)

    Thamsborg, Lise Laurberg Holst; Petersen, Morten Aa; Aaronson, Neil K

    2015-01-01

    …to 12 lack of appetite items. CONCLUSIONS: Phases 1-3 resulted in 12 lack of appetite candidate items. Based on field testing (phase 4), the psychometric characteristics of the items will be assessed and the final item bank will be generated. This CAT item bank is expected to provide precise…

  11. Measurement precision and efficiency of multidimensional computer adaptive testing of physical functioning using the pediatric evaluation of disability inventory.

    Science.gov (United States)

    Haley, Stephen M; Ni, Pengsheng; Ludlow, Larry H; Fragala-Pinkham, Maria A

    2006-09-01

    To compare the measurement efficiency and precision of a multidimensional computer adaptive testing (M-CAT) application with a unidimensional CAT (U-CAT) application, using item bank data from 2 of the functional skills scales of the Pediatric Evaluation of Disability Inventory (PEDI). Using existing PEDI mobility and self-care item banks, we compared the stability of item calibrations and model fit between unidimensional and multidimensional Rasch models and compared the efficiency and precision of the U-CAT- and M-CAT-simulated assessments to a random draw of items. Pediatric rehabilitation hospital and clinics. Clinical and normative samples. Not applicable. Not applicable. The M-CAT had greater levels of precision and efficiency than the separate mobility and self-care U-CAT versions when using a similar number of items for each PEDI subdomain. Equivalent estimation of mobility and self-care scores can be achieved with a 25% to 40% item reduction with the M-CAT compared with the U-CAT. M-CAT applications appear to have both precision and efficiency advantages compared with separate U-CAT assessments when content subdomains have a high correlation. Practitioners may also realize interpretive advantages of reporting test score information for each subdomain when separate clinical inferences are desired.

  12. Implementation of an Improved Adaptive Testing Theory

    Science.gov (United States)

    Al-A'ali, Mansoor

    2007-01-01

    Computer adaptive testing is the study of scoring tests and questions based on assumptions concerning the mathematical relationship between examinees' ability and the examinees' responses. Adaptive student tests, which are based on item response theory (IRT), have many advantages over conventional tests. We use the least square method, a…

  13. A Computer Adaptive Testing Version of the Addiction Severity Index-Multimedia Version (ASI-MV): The Addiction Severity CAT

    Science.gov (United States)

    Butler, Stephen F.; Black, Ryan A.; McCaffrey, Stacey A.; Ainscough, Jessica; Doucette, Ann M.

    2017-01-01

    The purpose of this study was to develop and validate a computer adaptive testing (CAT) version of the Addiction Severity Index-Multimedia Version (ASI-MV®), the Addiction Severity CAT. This goal was accomplished in four steps. First, new candidate items for Addiction Severity CAT domains were evaluated after brainstorming sessions with experts in substance abuse treatment. Next, this new item bank was psychometrically evaluated on a large non-clinical (n = 4,419) and substance abuse treatment sample (n = 845). Based on these results, final items were selected and calibrated for the creation of the Addiction Severity CAT algorithms. Once the algorithms were developed for the entire assessment, a fully functioning prototype of an Addiction Severity CAT was created. CAT simulations were conducted and optimal termination criteria were selected for the Addiction Severity CAT algorithms. Finally, construct validity of the CAT algorithms was evaluated by examining convergent/discriminant validity and sensitivity to change. The Addiction Severity CAT was determined to be valid, sensitive to change, and reliable. Further, the Addiction Severity CAT’s time of administration was found to be significantly less than the average time of administration for the ASI-MV composite scores. This study represents the initial validation of an IRT-based Addiction Severity CAT, and further exploration of the Addiction Severity CAT is needed. PMID:28230387

  14. A computer adaptive testing version of the Addiction Severity Index-Multimedia Version (ASI-MV): The Addiction Severity CAT.

    Science.gov (United States)

    Butler, Stephen F; Black, Ryan A; McCaffrey, Stacey A; Ainscough, Jessica; Doucette, Ann M

    2017-05-01

    The purpose of this study was to develop and validate a computer adaptive testing (CAT) version of the Addiction Severity Index-Multimedia Version (ASI-MV), the Addiction Severity CAT. This goal was accomplished in 4 steps. First, new candidate items for Addiction Severity CAT domains were evaluated after brainstorming sessions with experts in substance abuse treatment. Next, this new item bank was psychometrically evaluated on a large nonclinical (n = 4,419) and substance abuse treatment (n = 845) sample. Based on these results, final items were selected and calibrated for the creation of the Addiction Severity CAT algorithms. Once the algorithms were developed for the entire assessment, a fully functioning prototype of an Addiction Severity CAT was created. CAT simulations were conducted, and optimal termination criteria were selected for the Addiction Severity CAT algorithms. Finally, construct validity of the CAT algorithms was evaluated by examining convergent and discriminant validity and sensitivity to change. The Addiction Severity CAT was determined to be valid, sensitive to change, and reliable. Further, the Addiction Severity CAT's time of completion was found to be significantly less than the average time of completion for the ASI-MV composite scores. This study represents the initial validation of an Addiction Severity CAT based on item response theory, and further exploration of the Addiction Severity CAT is needed. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  15. Test Information Targeting Strategies for Adaptive Multistage Testing Designs.

    Science.gov (United States)

    Luecht, Richard M.; Burgin, William

    Adaptive multistage testlet (MST) designs appear to be gaining popularity for many large-scale computer-based testing programs. These adaptive MST designs use a modularized configuration of preconstructed testlets and embedded score-routing schemes to prepackage different forms of an adaptive test. The conditional information targeting (CIT)…

  16. Computer controlled testing of batteries

    NARCIS (Netherlands)

    Kuiper, A.C.J.; Einerhand, R.E.F.; Visscher, W.

    1989-01-01

    A computerized testing device for batteries consists of a power supply, a multiplexer circuit connected to the batteries, a protection circuit, and an IBM Data Acquisition and Control Adapter card connected to a personal computer. The software is written in Turbo Pascal and can be easily adapted to

  17. Redefining diagnostic symptoms of depression using Rasch analysis: testing an item bank suitable for DSM-V and computer adaptive testing.

    Science.gov (United States)

    Mitchell, Alex J; Smith, Adam B; Al-salihy, Zerak; Rahim, Twana A; Mahmud, Mahmud Q; Muhyaldin, Asma S

    2011-10-01

    We aimed to redefine the optimal self-report symptoms of depression suitable for creation of an item bank that could be used in computer adaptive testing or to develop a simplified screening tool for DSM-V. Four hundred subjects (200 patients with primary depression and 200 non-depressed subjects), living in Iraqi Kurdistan, were interviewed. The Mini International Neuropsychiatric Interview (MINI) was used to define the presence of major depression (DSM-IV criteria). We examined symptoms of depression using four well-known scales delivered in Kurdish. The Partial Credit Model was applied to each instrument. Common-item equating was subsequently used to create an item bank, and differential item functioning (DIF) was explored for known subgroups. A symptom-level Rasch analysis reduced the original 45 items to 24 after the exclusion of 21 misfitting items. A further six items (CESD13 and CESD17, HADS-D4, HADS-D5 and HADS-D7, and CDSS3 and CDSS4) were removed due to misfit as the items were added together to form the item bank, and two items were subsequently removed following the DIF analysis by diagnosis (CESD20 and CDSS9, both of which were harder to endorse for women). The remaining optimal item bank therefore consisted of 17 items and produced an area under the curve (AUC) of 0.987. Using a bank restricted to the optimal nine items revealed only minor loss of accuracy (AUC = 0.989, sensitivity 96%, specificity 95%). Finally, when restricted to only four items, accuracy remained high (AUC = 0.976; sensitivity 93%, specificity 96%). An item bank of 17 items may be useful in computer adaptive testing, and nine or even four items may be used to develop a simplified screening tool for DSM-V major depressive disorder (MDD). Further examination of this item bank should be conducted in different cultural settings.

  18. Cross-cultural development of an item list for computer-adaptive testing of fatigue in oncological patients

    DEFF Research Database (Denmark)

    Giesinger, Johannes M.; Petersen, Morten Aa.; Grønvold, Mogens

    2011-01-01

    Within an ongoing project of the EORTC Quality of Life Group, we are developing computerized adaptive test (CAT) measures for the QLQ-C30 scales. These new CAT measures are conceptualised to reflect the same constructs as the QLQ-C30 scales. Accordingly, the Fatigue-CAT is intended to capture physical and general fatigue.

  19. Construct validity of the pediatric evaluation of disability inventory computer adaptive test (PEDI-CAT) in children with medical complexity.

    Science.gov (United States)

    Dumas, Helene M; Fragala-Pinkham, Maria A; Rosen, Elaine L; O'Brien, Jane E

    2017-11-01

    To assess construct (convergent and divergent) validity of the Pediatric Evaluation of Disability Inventory Computer Adaptive Test (PEDI-CAT) in a sample of children with complex medical conditions. Demographics, clinical information, PEDI-CAT normative score, and the Post-Acute Acuity Rating for Children (PAARC) level were collected for all post-acute hospital admissions (n = 110) from 1 April 2015 to 1 March 2016. Correlations between the PEDI-CAT Daily Activities, Mobility, and Social/Cognitive domain scores for the total sample and across three age groups (infant, preschool, and school-age) were calculated. Differences in mean PEDI-CAT scores for each domain across two groups, children with "Less Complexity" or "More Complexity" based on PAARC level, were examined. All correlations for the total sample and age subgroups were statistically significant, and trends across age groups were evident, with stronger associations between domains for the infant group. Significant differences were found between mean PEDI-CAT Daily Activities, Mobility, and Social/Cognitive normative scores across the two complexity groups, with children in the "Less Complex" group having higher PEDI-CAT scores for all domains. This study provides evidence indicating the PEDI-CAT can be used with confidence in capturing and differentiating children's level of function in a post-acute care setting. Implications for Rehabilitation The PEDI-CAT is a measure of function for children with a variety of conditions and can be used in any clinical setting. Convergent validity of the PEDI-CAT's Daily Activities, Mobility, and Social/Cognitive domains was significant and particularly strong for infants and young children with medical complexity. The PEDI-CAT was able to discriminate groups of children with differing levels of medical complexity admitted to a pediatric post-acute care hospital.

  20. Translation, Validation, and Reliability of the Dutch Late-Life Function and Disability Instrument Computer Adaptive Test.

    Science.gov (United States)

    Arensman, Remco M; Pisters, Martijn F; de Man-van Ginkel, Janneke M; Schuurmans, Marieke J; Jette, Alan M; de Bie, Rob A

    2016-09-01

    Adequate and user-friendly instruments for assessing physical function and disability in older adults are vital for estimating and predicting health care needs in clinical practice. The Late-Life Function and Disability Instrument Computer Adaptive Test (LLFDI-CAT) is a promising instrument for assessing physical function and disability in gerontology research and clinical practice. The aims of this study were: (1) to translate the LLFDI-CAT to the Dutch language and (2) to investigate its validity and reliability in a sample of older adults who spoke Dutch and dwelled in the community. For the assessment of validity of the LLFDI-CAT, a cross-sectional design was used. To assess reliability, measurement of the LLFDI-CAT was repeated in the same sample. The item bank of the LLFDI-CAT was translated with a forward-backward procedure. A sample of 54 older adults completed the LLFDI-CAT, World Health Organization Disability Assessment Schedule 2.0, RAND 36-Item Short-Form Health Survey physical functioning scale (10 items), and 10-Meter Walk Test. The LLFDI-CAT was repeated in 2 to 8 days (mean=4.5 days). Pearson's r and the intraclass correlation coefficient (ICC) (2,1) were calculated to assess validity, group-level reliability, and participant-level reliability. A correlation of .74 for the LLFDI-CAT function scale and the RAND 36-Item Short-Form Health Survey physical functioning scale (10 items) was found. The correlations of the LLFDI-CAT disability scale with the World Health Organization Disability Assessment Schedule 2.0 and the 10-Meter Walk Test were -.57 and -.53, respectively. The ICC (2,1) of the LLFDI-CAT function scale was .84, with a group-level reliability score of .85. The ICC (2,1) of the LLFDI-CAT disability scale was .76, with a group-level reliability score of .81. The high percentage of women in the study and the exclusion of older adults with recent joint replacement or hospitalization limit the generalizability of the results. The Dutch LLFDI

  1. Human Adaptation to the Computer.

    Science.gov (United States)

    1986-09-01

    8217"’ TECHNOSTRESS " 5 5’..,:. VI I. CONCLUSIONS-------------------------59 -- LIST OF REFERENCES-------------------------61 BI BLI OGRAPHY...computer has not developed. Instead, what has developed is a "modern disease of adaptation" called " technostress ," a phrase coined by Brod. Craig...34 technostress ." Managers (according to Brod) have been implementing computers in ways that contribute directly to this stress: [Ref. 3:p. 38) 1. They

  2. On-Line Testing and Reconfiguration of Field Programmable Gate Arrays (FPGAs) for Fault-Tolerant (FT) Applications in Adaptive Computing Systems (ACS)

    National Research Council Canada - National Science Library

    Abramovici, Miron

    2002-01-01

    Adaptive computing systems (ACS) rely on reconfigurable hardware to adapt the system operation to changes in the external environment, and to extend mission capability by implementing new functions on the same hardware platform...

  3. Synthetic Computation: Chaos Computing, Logical Stochastic Resonance, and Adaptive Computing

    Science.gov (United States)

    Kia, Behnam; Murali, K.; Jahed Motlagh, Mohammad-Reza; Sinha, Sudeshna; Ditto, William L.

    Nonlinearity and chaos can illustrate numerous behaviors and patterns, and one can select different patterns from this rich library of patterns. In this paper we focus on synthetic computing, a field that engineers and synthesizes nonlinear systems to obtain computation. We explain the importance of nonlinearity, and describe how nonlinear systems can be engineered to perform computation. More specifically, we provide an overview of chaos computing, a field that manually programs chaotic systems to build different types of digital functions. Also we briefly describe logical stochastic resonance (LSR), and then extend the approach of LSR to realize combinational digital logic systems via suitable concatenation of existing logical stochastic resonance blocks. Finally we demonstrate how a chaotic system can be engineered and mated with different machine learning techniques, such as artificial neural networks, random searching, and genetic algorithm, to design different autonomous systems that can adapt and respond to environmental conditions.

  4. Bayesian item selection criteria for adaptive testing

    NARCIS (Netherlands)

    van der Linden, Willem J.

    1996-01-01

    R.J. Owen (1975) proposed an approximate empirical Bayes procedure for item selection in adaptive testing. The procedure replaces the true posterior by a normal approximation with closed-form expressions for its first two moments. This approximation was necessary to minimize the computational

  5. Towards psychologically adaptive brain-computer interfaces

    Science.gov (United States)

    Myrden, A.; Chau, T.

    2016-12-01

    Objective. Brain-computer interface (BCI) performance is sensitive to short-term changes in psychological states such as fatigue, frustration, and attention. This paper explores the design of a BCI that can adapt to these short-term changes. Approach. Eleven able-bodied individuals participated in a study during which they used a mental task-based EEG-BCI to play a simple maze navigation game while self-reporting their perceived levels of fatigue, frustration, and attention. In an offline analysis, a regression algorithm was trained to predict changes in these states, yielding Pearson correlation coefficients in excess of 0.45 between the self-reported and predicted states. Two means of fusing the resultant mental state predictions with mental task classification were investigated. First, single-trial mental state predictions were used to predict correct classification by the BCI during each trial. Second, an adaptive BCI was designed that retrained a new classifier for each testing sample using only those training samples for which predicted mental state was similar to that predicted for the current testing sample. Main results. Mental state-based prediction of BCI reliability exceeded chance levels. The adaptive BCI exhibited significant, but practically modest, increases in classification accuracy for five of 11 participants and no significant difference for the remaining six despite a smaller average training set size. Significance. Collectively, these findings indicate that adaptation to psychological state may allow the design of more accurate BCIs.
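
    A hedged sketch of the adaptation scheme described above: for each test sample, keep only the training samples whose predicted mental state is most similar to the state predicted for the current sample, then classify with a rule fit on that subset. The nearest-centroid classifier and the parameter k below are stand-ins for illustration; the study retrained its own mental-task classifier.

```python
import numpy as np

def adaptive_classify(x, state, X_train, states_train, y_train, k=50):
    """Classify feature vector x after 'retraining' on the k training
    samples whose predicted mental state (e.g. a fatigue score in
    states_train) is closest to `state`, the prediction for x."""
    idx = np.argsort(np.abs(states_train - state))[:k]   # state-matched subset
    Xs, ys = X_train[idx], y_train[idx]
    centroids = {c: Xs[ys == c].mean(axis=0) for c in np.unique(ys)}
    # Nearest-centroid decision on the state-matched training subset.
    return min(centroids, key=lambda c: np.linalg.norm(x - centroids[c]))

# Toy usage with random data: 200 trials, 8 features, binary mental tasks.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))
y = rng.integers(0, 2, size=200)
states = rng.uniform(0, 1, size=200)        # e.g. predicted fatigue levels
print(adaptive_classify(X[0], states[0], X, states, y))
```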

  6. Adaptively detecting changes in Autonomic Grid Computing

    KAUST Repository

    Zhang, Xiangliang

    2010-10-01

    Detecting changes is a common issue in many application fields due to the non-stationary distribution of the applicative data, e.g., sensor network signals, web logs and grid-running logs. Toward Autonomic Grid Computing, adaptively detecting changes in a grid system can help to raise alarms on anomalies, clean noise, and report new patterns. In this paper, we propose an approach for self-adaptive change detection based on the Page-Hinkley statistic test. It handles non-stationary distributions without assumptions about the data distribution and without empirical setting of parameters. We validate the approach on the EGEE streaming jobs and report its better performance in achieving higher accuracy compared to other change detection methods. Meanwhile, this change detection process can help to discover device faults that were not reported in the system logs. © 2010 IEEE.
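
    For readers unfamiliar with the statistic, a standard one-sided Page-Hinkley test looks like the sketch below; the paper's contribution is to run such detection self-adaptively, whereas the `delta` (tolerated drift) and `threshold` values here are hand-set placeholders.

```python
import random

def page_hinkley(stream, delta=0.005, threshold=50.0):
    """One-sided Page-Hinkley test: track the gap between the cumulative
    deviation from the running mean and its historical minimum; flag an
    upward change in the mean when the gap exceeds the threshold."""
    mean, cum, cum_min = 0.0, 0.0, 0.0
    for t, x in enumerate(stream, start=1):
        mean += (x - mean) / t              # running mean of the stream
        cum += x - mean - delta             # cumulative deviation
        cum_min = min(cum_min, cum)
        if cum - cum_min > threshold:
            return t                        # index where change is flagged
    return None                             # no change detected

# Demo: the mean jumps from 0 to 5 at index 500; detection follows shortly.
data = [random.gauss(0, 1) for _ in range(500)] + \
       [random.gauss(5, 1) for _ in range(500)]
print(page_hinkley(data))
```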

  7. Turning the Page on Pen-and-Paper Questionnaires: Combining Ecological Momentary Assessment and Computer Adaptive Testing to Transform Psychological Assessment in the 21st Century.

    Science.gov (United States)

    Gibbons, Chris J

    2016-01-01

    The current paper describes new opportunities for patient-centred assessment methods which have come about through the increased adoption of affordable smart technologies in biopsychosocial research and medical care. In this commentary, we review modern assessment methods including item response theory (IRT), computer adaptive testing (CAT), and ecological momentary assessment (EMA), and explain how these methods may be combined to improve psychological assessment. We demonstrate both how a 'naïve' selection of a small group of items in an EMA can lead to unacceptably unreliable assessments, and how IRT can quantify the information that each individual item provides, thus allowing short-form assessments to be selected with acceptable reliability. The combination of CAT and IRT can ensure assessments are precise, efficient, and well targeted to the individual, allowing EMAs to be both brief and accurate.
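
    The point about 'naïve' item selection can be made concrete with the 2PL information function: instead of an arbitrary subset, a short form keeps the items that are most informative at the trait level of interest. The 2PL model and the single-θ selection rule below are simplifications for illustration, not the commentary's specific procedure.

```python
import math

def item_information(theta, a, b):
    """Fisher information of a 2PL item at ability theta: a^2 * p * (1-p),
    where p is the probability of endorsing the item."""
    p = 1 / (1 + math.exp(-a * (theta - b)))
    return a * a * p * (1 - p)

def best_short_form(items, theta, k=4):
    """Select the k items (list of (a, b) parameter pairs) that give the
    most information at the trait level of interest."""
    ranked = sorted(range(len(items)), reverse=True,
                    key=lambda i: item_information(theta, *items[i]))
    return ranked[:k]

# Demo: pick the 2 most informative of 4 hypothetical items at theta = 0.
bank = [(1.8, 0.1), (0.6, 0.0), (1.2, -1.5), (2.0, 0.3)]
print(best_short_form(bank, theta=0.0, k=2))
```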

  8. Psychometrics behind Computerized Adaptive Testing.

    Science.gov (United States)

    Chang, Hua-Hua

    2015-03-01

    The paper provides a survey of 18 years' progress that my colleagues, students (both former and current), and I have made in a prominent research area in psychometrics: computerized adaptive testing (CAT). We start with a historical review of the establishment of a large-sample foundation for CAT. It is worth noting that the asymptotic results were derived under the framework of martingale theory, a very theoretical perspective of probability theory which may seem unrelated to educational and psychological testing. In addition, we address a number of issues that emerged from large-scale implementation and show how theoretical work can help solve these problems. Finally, we propose that CAT technology can be very useful to support individualized instruction on a mass scale. We show that even paper-and-pencil-based tests can be made adaptive to support classroom teaching.

  9. Adaptively detecting changes in Autonomic Grid Computing

    KAUST Repository

    Zhang, Xiangliang; Germain, Cécile; Sebag, Michèle

    2010-01-01

    Detecting the changes is the common issue in many application fields due to the non-stationary distribution of the applicative data, e.g., sensor network signals, web logs and gridrunning logs. Toward Autonomic Grid Computing, adaptively detecting

  10. Adaptive screening for depression--recalibration of an item bank for the assessment of depression in persons with mental and somatic diseases and evaluation in a simulated computer-adaptive test environment.

    Science.gov (United States)

    Forkmann, Thomas; Kroehne, Ulf; Wirtz, Markus; Norra, Christine; Baumeister, Harald; Gauggel, Siegfried; Elhan, Atilla Halil; Tennant, Alan; Boecker, Maren

    2013-11-01

    This study presents a simulation study of computer-adaptive testing based on the Aachen Depression Item Bank (ADIB), which was developed for the assessment of depression in persons with somatic diseases. Prior to the computer-adaptive test simulation, the ADIB was newly calibrated. Recalibration was performed in a sample of 161 patients treated for a depressive syndrome, 103 patients from cardiology, and 103 patients from otorhinolaryngology (mean age 44.1, SD = 14.0; 44.7% female) and was cross-validated in a sample of 117 patients undergoing rehabilitation for cardiac diseases (mean age 58.4, SD = 10.5; 24.8% women). Unidimensionality of the item bank was checked and a Rasch analysis was performed that evaluated local dependency (LD), differential item functioning (DIF), item fit and reliability. CAT simulation was conducted with the total sample and additional simulated data. Recalibration resulted in a strictly unidimensional item bank with 36 items, showing good Rasch model fit (item fit residuals …) and no LD. CAT simulation revealed that 13 items on average were necessary to estimate depression in the range of -2 to +2 logits when terminating at SE ≤ 0.32, and 4 items if using SE ≤ 0.50. Receiver operating characteristic analysis showed that θ estimates based on the CAT algorithm have good criterion validity with regard to depression diagnoses (area under the curve ≥ .78 for all cut-off criteria). The recalibration of the ADIB succeeded, and the simulation studies conducted suggest that it has good screening performance in the samples investigated and that it may reasonably add to the improvement of depression assessment. © 2013.
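
    The stopping rules reported above (SE ≤ 0.32, corresponding roughly to a reliability of 0.90, and SE ≤ 0.50) translate directly into a CAT loop that halts once 1/√(test information) reaches the target. The sketch below uses a dichotomous Rasch model as a stand-in for the polytomous ADIB items, with an illustrative maximum-likelihood estimator.

```python
import math

def _rasch_mle(bs, xs, iters=8):
    """Crude Newton-type ML ability estimate for Rasch items, clamped to
    keep the estimate finite for all-correct/all-wrong patterns."""
    theta = 0.0
    for _ in range(iters):
        ps = [1 / (1 + math.exp(-(theta - b))) for b in bs]
        info = sum(p * (1 - p) for p in ps)
        theta += sum(x - p for x, p in zip(xs, ps)) / max(info, 0.5)
        theta = max(-4.0, min(4.0, theta))
    return theta

def cat_with_se_stop(difficulties, answer_fn, se_target=0.32, max_items=36):
    """CAT loop with the standard-error stopping rule: administer the
    closest-difficulty item, re-estimate theta, and stop once
    SE(theta) = 1 / sqrt(test information) reaches the target."""
    theta, used, bs, xs = 0.0, set(), [], []
    for _ in range(max_items):
        item = min((i for i in range(len(difficulties)) if i not in used),
                   key=lambda i: abs(difficulties[i] - theta))
        used.add(item)
        bs.append(difficulties[item])
        xs.append(answer_fn(item))
        theta = _rasch_mle(bs, xs)
        info = sum(p * (1 - p) for p in
                   (1 / (1 + math.exp(-(theta - b))) for b in bs))
        if 1 / math.sqrt(info) <= se_target:
            break
    return theta, len(bs)                   # estimate and items used
```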

  11. Validation of Patient Reported Outcomes Measurement Information System (PROMIS) Computer Adaptive Tests (CATs) in the Surgical Treatment of Lumbar Spinal Stenosis.

    Science.gov (United States)

    Patel, Alpesh A; Dodwad, Shah-Nawaz M; Boody, Barrett S; Bhatt, Surabhi; Savage, Jason W; Hsu, Wellington K; Rothrock, Nan E

    2018-03-19

    Prospective, cohort study. Demonstrate validity of PROMIS physical function, pain interference, and pain behavior computer adaptive tests (CATs) in surgically treated lumbar stenosis patients. There has been increasing attention given to patient-reported outcomes associated with spinal interventions. Historical patient outcome measures have inadequate validation, demonstrate floor/ceiling effects, and are infrequently used due to time constraints. PROMIS is an adaptive, responsive NIH assessment tool that measures patient-reported health status. 98 consecutive patients were surgically treated for lumbar spinal stenosis and were assessed using PROMIS CATs, ODI, ZCQ and SF-12. Patients with prior lumbar surgery or a history of scoliosis, cancer, trauma, or infection were excluded. Completion time, preoperative assessment, and 6-week and 3-month postoperative scores were collected. At baseline, 49%, 79%, and 81% of patients had PROMIS PB, PI, and PF scores greater than 1 SD worse than the general population. 50.6% were categorized as severely disabled, crippled, or bed-bound by ODI. PROMIS CATs demonstrated convergent validity through moderate to high correlations with legacy measures (r = 0.35-0.73). PROMIS CATs demonstrated known-groups validity when stratified by ODI levels of disability. ODI improvements of at least 10 points on average had changes in PROMIS scores in the expected direction (PI = -12.98, PB = -9.74, PF = 7.53). PROMIS CATs demonstrated comparable responsiveness to change when evaluated against legacy measures. PROMIS PB and PI decreased 6.66 and 9.62 points and PROMIS PF increased 6.8 points between baseline and 3 months post-op (p …). PROMIS CATs demonstrated convergent validity, known-groups validity, and responsiveness for surgically treated patients with lumbar stenosis to detect change over time, and are more efficient than legacy instruments. Level of evidence: 2.

  12. Adaptability of supercomputers to nuclear computations

    International Nuclear Information System (INIS)

    Asai, Kiyoshi; Ishiguro, Misako; Matsuura, Toshihiko.

    1983-01-01

    Recently in the field of scientific and technical calculation, the usefulness of supercomputers represented by the CRAY-1 has been recognized, and they are utilized in various countries. The high speed of supercomputers rests on vector computation. Over the past six years, the authors investigated the adaptability of about 40 typical atomic energy codes to vector computation. Based on the results of this investigation, the suitability of the vector-computation capability of supercomputers for atomic energy codes, the problems in utilizing it, and the future prospects are explained. The adaptability of an individual code to vector computation depends largely on the algorithm and program structure used. The speedup achieved by pipelined vector systems, the investigation at the Japan Atomic Energy Research Institute and its results, and examples of vectorizing codes for atomic energy, environmental safety and nuclear fusion are reported. The speedup for the 40 examples ranged from 1.5 to 9.0. It can be said that the adaptability of supercomputers to atomic energy codes is fairly good. (Kako, I.)
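
    The speedups quoted come from recasting inner loops as vector operations. As a loose modern analogy (not the CRAY-1 pipeline itself), the same element-by-element versus whole-array contrast can be shown with NumPy; the three-point stencil below is purely illustrative.

```python
# Illustrative contrast between a scalar loop and its vectorized form; the
# 1D stencil sweep is a stand-in for the kinds of loops found in
# reactor-physics codes.
import numpy as np
import time

n = 1_000_000
phi = np.random.rand(n)
out = np.empty(n - 2)

t0 = time.perf_counter()
for i in range(1, n - 1):                       # scalar: one element per step
    out[i - 1] = 0.5 * (phi[i - 1] + phi[i + 1]) - phi[i]
t1 = time.perf_counter()

vec = 0.5 * (phi[:-2] + phi[2:]) - phi[1:-1]    # vector: whole array at once
t2 = time.perf_counter()

assert np.allclose(out, vec)
print(f"scalar {t1 - t0:.3f}s, vector {t2 - t1:.3f}s")
```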

  13. Extended shadow test approach for constrained adaptive testing

    NARCIS (Netherlands)

    Veldkamp, Bernard P.; Ariel, A.

    2002-01-01

    Several methods have been developed for use in constrained adaptive testing. Item pool partitioning, multistage testing, and testlet-based adaptive testing are methods that perform well for specific cases of adaptive testing. The weighted deviation model and the Shadow Test approach can be more

  14. Wavefront measurement using computational adaptive optics.

    Science.gov (United States)

    South, Fredrick A; Liu, Yuan-Zhi; Bower, Andrew J; Xu, Yang; Carney, P Scott; Boppart, Stephen A

    2018-03-01

    In many optical imaging applications, it is necessary to correct for aberrations to obtain high quality images. Optical coherence tomography (OCT) provides access to the amplitude and phase of the backscattered optical field for three-dimensional (3D) imaging of samples. Computational adaptive optics (CAO) modifies the phase of the OCT data in the spatial frequency domain to correct optical aberrations without using a deformable mirror, as is commonly done in hardware-based adaptive optics (AO). This provides improvement of image quality throughout the 3D volume, enabling imaging across greater depth ranges and in highly aberrated samples. However, the CAO aberration correction has a complicated relation to the imaging pupil and is not a direct measurement of the pupil aberrations. Here we present new methods for recovering the wavefront aberrations directly from the OCT data without the use of hardware adaptive optics. This enables both computational measurement and correction of optical aberrations.
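
    The core CAO operation described above (modifying the phase of the OCT data in the spatial frequency domain) can be sketched as follows. The quadratic defocus model and its coefficient are assumptions for illustration, not the authors' wavefront-recovery method, which estimates the correction from the data.

```python
# Minimal sketch of computational aberration correction on an en face complex
# OCT plane: transform to the spatial frequency domain, apply a conjugate
# phase filter, and transform back. The defocus model is hypothetical.
import numpy as np

def cao_correct(field, defocus_coeff):
    """field: 2D complex en face OCT plane; returns a phase-corrected plane."""
    F = np.fft.fftshift(np.fft.fft2(field))
    ny, nx = field.shape
    y, x = np.mgrid[-ny // 2:ny // 2, -nx // 2:nx // 2]
    r2 = (x / (nx / 2))**2 + (y / (ny / 2))**2   # normalized pupil radius^2
    phase = defocus_coeff * r2                   # quadratic (defocus) model
    F_corr = F * np.exp(-1j * phase)             # conjugate phase cancels it
    return np.fft.ifft2(np.fft.ifftshift(F_corr))

corrected = cao_correct(np.ones((256, 256), dtype=complex), defocus_coeff=5.0)
```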

  15. Computer-Based Testing: Test Site Security.

    Science.gov (United States)

    Rosen, Gerald A.

    Computer-based testing places great burdens on all involved parties to ensure test security. A task analysis of test site security might identify the areas of protecting the test, protecting the data, and protecting the environment as essential issues in test security. Protecting the test involves transmission of the examinations, identifying the…

  16. An Adaptive Middleware for Improved Computational Performance

    DEFF Research Database (Denmark)

    Bonnichsen, Lars Frydendal

    The performance improvements in computer systems over the past 60 years have been fueled by an exponential increase in energy efficiency. In recent years the phenomenon known as the end of Dennard's scaling has slowed energy efficiency improvements, but improving computer energy efficiency is more important now than ever. Traditionally, most improvements in computer energy efficiency have come from improvements in lithography, the ability to produce smaller transistors, and computer architecture, the ability to apply those transistors efficiently. Since the end of scaling, we have seen ... In this work, we improve computational performance by exploiting modern hardware features, such as dynamic voltage-frequency scaling and transactional memory. Adapting software is an iterative process, requiring that we continually revisit it to meet new requirements or realities; a time consuming process ...

  17. Computers in Language Testing: Present Research and Some Future Directions.

    Science.gov (United States)

    Brown, James Dean

    1997-01-01

    Explores recent developments in the use of computers in language testing in four areas: (1) item banking; (2) computer-assisted language testing; (3) computerized-adaptive language testing; and (4) research on the effectiveness of computers in language testing. Examines educational measurement literature in an attempt to forecast the directions…

  18. Rotationally Adaptive Flight Test Surface

    Science.gov (United States)

    Barrett, Ron

    1999-01-01

    Research on a new design of flutter exciter vane using adaptive materials was conducted. This novel design is based on all-moving aerodynamic surface technology and consists of a structurally stiff main spar, a series of piezoelectric actuator elements, and an aerodynamic shell which is pivoted around the main spar. The work built upon current missile-type all-moving surface designs and changed them so they are better suited for flutter excitation through the transonic flight regime. The first portion of the research centered on aerodynamic and structural modeling of the system. USAF DatCom and vortex lattice codes were used to capture the fundamental aerodynamics of the vane. Finite element codes, laminated plate theory, and virtual work analyses were used to structurally model the aerodynamic vane and wing tip. Following the basic modeling, a flutter test vane was designed. Each component within the structure was designed to meet the design loads. Once the design loads were met, the deflections were maximized and the internal structure was laid out. In addition to the structure, a basic electrical control network capable of driving a scaled exciter vane was designed. The third and final stage of the investigation involved the fabrication of a 1/4-scale vane. This scaled vane was used to verify kinematics and structural mechanics theories on all-moving actuation. Following assembly, a series of bench tests was conducted to determine frequency response, electrical characteristics, and mechanical and kinematic properties. Test results indicate peak-to-peak deflections of 1.1 deg with a corner frequency of just over 130 Hz.

  19. Adaptation and hybridization in computational intelligence

    CERN Document Server

    Fister Jr., Iztok

    2015-01-01

    This carefully edited book takes a walk through recent advances in adaptation and hybridization in the Computational Intelligence (CI) domain. It consists of ten chapters that are divided into three parts. The first part illustrates background information and provides some theoretical foundation tackling the CI domain, the second part deals with adaptation in CI algorithms, while the third part focuses on hybridization in CI. This book can serve as an ideal reference for researchers and students of computer science, electrical and civil engineering, economics, and the natural sciences who are confronted with solving optimization, modeling and simulation problems. It covers recent advances in CI that encompass nature-inspired algorithms, such as Artificial Neural Networks, Evolutionary Algorithms and Swarm Intelligence-based algorithms.

  20. A stereotactic adapter compatible with computed tomography

    International Nuclear Information System (INIS)

    Cacak, R.K.; Law, J.D.

    1982-01-01

    One application of computed-tomographic (CT) scanners is the localization of intracranial targets for stereotactic surgery. Unfortunately, conventional stereotactic devices affixed to the patient cause artifacts which obscure anatomic features in CT images. The authors describe the initial phase of a project to eliminate this problem by using an adapter that is free of metallic objects. Localization of the target point relative to the coordinate system of a Leksell stereotactic frame is achieved from CT image measurements

  1. Computerized adaptive testing item selection in computerized adaptive learning systems

    NARCIS (Netherlands)

    Eggen, Theodorus Johannes Hendrikus Maria; Veldkamp, B.P.

    2012-01-01

    Item selection methods traditionally developed for computerized adaptive testing (CAT) are explored for their usefulness in item-based computerized adaptive learning (CAL) systems. While in CAT Fisher information-based selection is optimal, for recovering learning populations in CAL systems item

  2. DEFACTO: A Design Environment for Adaptive Computing Technology

    National Research Council Canada - National Science Library

    Hall, Mary

    2003-01-01

    This report describes the activities of the DEFACTO project, a Design Environment for Adaptive Computing Technology funded under the DARPA Adaptive Computing Systems and Just-In-Time-Hardware programs...

  3. Flight Test Approach to Adaptive Control Research

    Science.gov (United States)

    Pavlock, Kate Maureen; Less, James L.; Larson, David Nils

    2011-01-01

    The National Aeronautics and Space Administration's Dryden Flight Research Center completed flight testing of adaptive controls research on a full-scale F-18 testbed. The validation of adaptive controls has the potential to enhance safety in the presence of adverse conditions such as structural damage or control surface failures. This paper describes the research interface architecture, risk mitigations, flight test approach and lessons learned of adaptive controls research.

  4. Development of the adaptive music perception test.

    Science.gov (United States)

    Kirchberger, Martin J; Russo, Frank A

    2015-01-01

    Despite vast amounts of research examining the influence of hearing loss on speech perception, comparatively little is known about its influence on music perception. No standardized test exists to quantify music perception of hearing-impaired (HI) persons in a clinically practical manner. This study presents the Adaptive Music Perception (AMP) test as a tool to assess important aspects of music perception with hearing loss. A computer-driven test was developed to determine the discrimination thresholds of 10 low-level physical dimensions (e.g., duration, level) in the context of perceptual judgments about musical dimensions: meter, harmony, melody, and timbre. In the meter test, the listener is asked to judge whether a tone sequence is duple or triple in meter. The harmony test requires that the listener make judgments about the stability of the chord sequences. In the melody test, the listener must judge whether a comparison melody is the same as a standard melody when presented in transposition and in the context of a chordal accompaniment that serves as a mask. The timbre test requires that the listener determine which of two comparison tones is different in timbre from a standard tone (ABX design). Twenty-one HI participants and 19 normal-hearing (NH) participants were recruited to carry out the music tests. Participants were tested twice on separate occasions to evaluate test-retest reliability. The HI group had significantly higher discrimination thresholds than the NH group in 7 of the 10 low-level physical dimensions: frequency discrimination in the meter test, dissonance and intonation perception in the harmony test, melody-to-chord ratio for both melody types in the melody test, and the perception of brightness and spectral irregularity in the timbre test. Small but significant improvement between test and retest was observed in three dimensions: frequency discrimination (meter test), dissonance (harmony test), and attack length (timbre test). All other
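
    The abstract does not specify the adaptive procedure, but computer-driven threshold tests of this kind commonly use a staircase; the 2-down/1-up rule below (which converges on the roughly 70.7%-correct point), the step sizes, and the simulated listener are all illustrative assumptions.

```python
# Generic 2-down/1-up adaptive staircase for threshold estimation; all
# parameters and the listener model are hypothetical.
import math
import random

def staircase(true_threshold, level=20.0, step=4.0, n_reversals=8):
    streak, direction, reversals = 0, None, []
    while len(reversals) < n_reversals:
        # simulated listener: more likely correct when level > threshold
        p = 0.5 + 0.5 / (1 + math.exp(-(level - true_threshold)))
        correct = random.random() < p
        move = None
        if correct:
            streak += 1
            if streak == 2:                  # two in a row -> make it harder
                move, streak = "down", 0
        else:                                # one miss -> make it easier
            move, streak = "up", 0
        if move:
            if direction and move != direction:
                reversals.append(level)      # direction change = reversal
                step = max(step / 2, 0.5)    # finer steps after each reversal
            direction = move
            level += step if move == "up" else -step
    return sum(reversals) / len(reversals)   # threshold estimate

print(staircase(true_threshold=10.0))
```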

  5. A model for optimal constrained adaptive testing

    NARCIS (Netherlands)

    van der Linden, Willem J.; Reese, Lynda M.

    2001-01-01

    A model for constrained computerized adaptive testing is proposed in which the information on the test at the ability estimate is maximized subject to a large variety of possible constraints on the contents of the test. At each item-selection step, a full test is first assembled to have maximum

  6. A model for optimal constrained adaptive testing

    NARCIS (Netherlands)

    van der Linden, Willem J.; Reese, Lynda M.

    1997-01-01

    A model for constrained computerized adaptive testing is proposed in which the information in the test at the ability estimate is maximized subject to a large variety of possible constraints on the contents of the test. At each item-selection step, a full test is first assembled to have maximum
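
    These two records (and the extended shadow test record earlier) describe the shadow-test approach: before each item is administered, a full test satisfying every constraint is assembled to maximize information at the current ability estimate, and the best not-yet-administered item from that assembly is given. A toy greedy version follows; a real implementation solves a mixed-integer program, and all item fields and quotas here are hypothetical.

```python
# Toy shadow-test item selection: greedy constraint-respecting assembly
# stands in for the mixed-integer solve used in practice.
import numpy as np

def fisher_info(theta, a, b):                     # 2PL item information
    p = 1 / (1 + np.exp(-a * (theta - b)))
    return a**2 * p * (1 - p)

def shadow_test(theta, items, administered, quota):
    """Greedily fill each content-area quota with the most informative items."""
    chosen = list(administered)                   # administered items must stay
    for area, k in quota.items():
        have = sum(items[i]["area"] == area for i in chosen)
        pool = [i for i in range(len(items))
                if items[i]["area"] == area and i not in chosen]
        pool.sort(key=lambda i: -fisher_info(theta, items[i]["a"], items[i]["b"]))
        chosen += pool[:max(k - have, 0)]
    return chosen

def next_item(theta, items, administered, quota):
    st = shadow_test(theta, items, administered, quota)
    free = [i for i in st if i not in administered]
    return max(free, key=lambda i: fisher_info(theta, items[i]["a"], items[i]["b"]))

items = [{"a": 1.2, "b": -0.4, "area": "mood"},
         {"a": 0.8, "b": 0.6, "area": "mood"},
         {"a": 1.5, "b": 0.1, "area": "somatic"},
         {"a": 1.0, "b": -1.0, "area": "somatic"}]
print(next_item(0.0, items, administered=[0], quota={"mood": 2, "somatic": 1}))
```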

  7. Study maps as a tool for the adaptive tests construction

    Directory of Open Access Journals (Sweden)

    Dita Dlabolová

    2013-01-01

    Measurement of students' knowledge is an essential part of the educational process. University teachers often use computer-based tests to examine large numbers of students in a short time. The question is what kind of information these tests provide and whether it is possible to classify students on that basis. Practice shows that scalar test results in the form of simple numbers cannot be plainly interpreted as a level of knowledge; moreover, it is not easy to build tests that capture the necessary information. In the first part of the article we present the results of a pedagogical experiment focused on the difference between the information obtained through a computer-based test and through a teacher's interview with the same students. A possible starting point for improving the information obtained from computer-based tests in non-scalar form is the construction of an adaptive test that adapts its items so as to identify knowledge in a way similar to a conversation with a teacher. As a tool for designing such adaptive tests we use so-called study maps, which are described in the second part of the article.

  8. Microcomputer Network for Computerized Adaptive Testing (CAT)

    Science.gov (United States)

    1984-03-01

    NPRDC TR 84-33. Microcomputer Network for Computerized Adaptive Testing (CAT). Baldwin Quan, Thomas A. Park, Gary Sandahl, John H. ... Keywords: computerized adaptive testing (CAT); Bayesian sequential testing.

  9. Intricacies of Feedback in Computer-based Prism Adaptation Therapy

    DEFF Research Database (Denmark)

    Wilms, Inge Linda; Rytter, Hana Malá

    Prism Adaptation Therapy (PAT) is an intervention method for the treatment of attentional disorders, such as neglect (e.g. [1,2]). The method involves repeated pointing at specified targets with or without prism glasses using a specifically designed wooden box. The aim of this study was to ascertain whether the PAT method can be executed with similar effect using a computer with a touch screen. 62 healthy subjects were subjected to two experimental conditions: 1) pointing at targets using the original box, 2) pointing at targets on a touch screen attached to a computer. In both conditions, the subjects performed a pre-test consisting of 30 targets without feedback, then an exposure test of 90 targets with prism glasses and feedback, and finally a post-test of 60 targets with no glasses and no feedback. Two experiments were carried out: 1) the feedback was provided by showing a cross...

  10. ICAN Computer Code Adapted for Building Materials

    Science.gov (United States)

    Murthy, Pappu L. N.

    1997-01-01

    The NASA Lewis Research Center has been involved in developing composite micromechanics and macromechanics theories over the last three decades. These activities have resulted in several composite mechanics theories and structural analysis codes whose applications range from material behavior design and analysis to structural component response. One of these computer codes, the Integrated Composite Analyzer (ICAN), is designed primarily to address issues related to designing polymer matrix composites and predicting their properties - including hygral, thermal, and mechanical load effects. Recently, under a cost-sharing cooperative agreement with a Fortune 500 corporation, Master Builders Inc., ICAN was adapted to analyze building materials. The high costs and technical difficulties involved with the fabrication of continuous-fiber-reinforced composites sometimes limit their use. Particulate-reinforced composites can be thought of as a viable alternative. They are as easily processed to near-net shape as monolithic materials, yet have the improved stiffness, strength, and fracture toughness that is characteristic of continuous-fiber-reinforced composites. For example, particle-reinforced metal-matrix composites show great potential for a variety of automotive applications, such as disk brake rotors, connecting rods, cylinder liners, and other high-temperature applications. Building materials, such as concrete, can be thought of as one of the oldest materials in this category of multiphase, particle-reinforced materials. The adaptation of ICAN to analyze particle-reinforced composite materials involved the development of new micromechanics-based theories. A derivative of the ICAN code, ICAN/PART, was developed and delivered to Master Builders Inc. as a part of the cooperative activity.

  11. Individual Differences in Computerized Adaptive Testing.

    Science.gov (United States)

    Kim, JinGyu

    Research on the major computerized adaptive testing (CAT) strategies is reviewed, and some findings are reported that examine effects of examinee demographic and psychological characteristics on CAT strategies. In fixed branching strategies, all examinees respond to a common routing test, the score of which is used to assign examinees to a…

  12. Self-Testing Computer Memory

    Science.gov (United States)

    Chau, Savio, N.; Rennels, David A.

    1988-01-01

    Memory system for a computer repeatedly tests itself during brief, regular interruptions of normal data processing. It detects and corrects transient faults such as single-event upsets (changes in bits due to ionizing radiation) within milliseconds of their occurrence. The self-testing concept surpasses conventional approaches by actively flushing latent defects out of memory and attempting to correct them before they accumulate beyond the capacity for self-correction or detection. The cost of the improvement is a modest increase in the complexity of circuitry and operating time.
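
    The flushing idea can be illustrated with a toy scrubbing loop. Real designs use ECC codes in hardware rather than the triplicated storage and majority vote below; everything in this sketch is an illustrative assumption.

```python
# Toy illustration of memory scrubbing: store each word redundantly,
# periodically majority-vote every location, and rewrite the voted value so a
# single upset is flushed before a second upset in the same word can make the
# damage uncorrectable.
import random

memory = [[v, v, v] for v in range(16)]          # triplicated storage

def upset(addr, copy, bit):                      # simulate a single-event upset
    memory[addr][copy] ^= 1 << bit

def scrub():
    for addr, copies in enumerate(memory):
        voted = max(set(copies), key=copies.count)   # majority vote
        memory[addr] = [voted, voted, voted]         # rewrite flushes the fault

upset(3, copy=1, bit=2)
scrub()
assert memory[3] == [3, 3, 3]                    # latent fault corrected
```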

  13. The Role of Item Feedback in Self-Adapted Testing.

    Science.gov (United States)

    Roos, Linda L.; And Others

    1997-01-01

    The importance of item feedback in self-adapted testing was studied by comparing feedback and no feedback conditions for computerized adaptive tests and self-adapted tests taken by 363 college students. Results indicate that item feedback is not necessary to realize score differences between self-adapted and computerized adaptive testing. (SLD)

  14. Testlet-Based Multidimensional Adaptive Testing.

    Science.gov (United States)

    Frey, Andreas; Seitz, Nicki-Nils; Brandt, Steffen

    2016-01-01

    Multidimensional adaptive testing (MAT) is a highly efficient method for the simultaneous measurement of several latent traits. Currently, no psychometrically sound approach is available for the use of MAT in testlet-based tests. Testlets are sets of items sharing a common stimulus such as a graph or a text. They are frequently used in large operational testing programs like TOEFL, PISA, PIRLS, or NAEP. To make MAT accessible for such testing programs, we present a novel combination of MAT with a multidimensional generalization of the random effects testlet model (MAT-MTIRT). MAT-MTIRT compared to non-adaptive testing is examined for several combinations of testlet effect variances (0.0, 0.5, 1.0, and 1.5) and testlet sizes (3, 6, and 9 items) with a simulation study considering three ability dimensions with simple loading structure. MAT-MTIRT outperformed non-adaptive testing regarding the measurement precision of the ability estimates. Further, the measurement precision decreased when testlet effect variances and testlet sizes increased. The suggested combination of the MTIRT model therefore provides a solution to the substantial problems of testlet-based tests while keeping the length of the test within an acceptable range.

  15. Testlet-based Multidimensional Adaptive Testing

    Directory of Open Access Journals (Sweden)

    Andreas Frey

    2016-11-01

    Multidimensional adaptive testing (MAT) is a highly efficient method for the simultaneous measurement of several latent traits. Currently, no psychometrically sound approach is available for the use of MAT in testlet-based tests. Testlets are sets of items sharing a common stimulus such as a graph or a text. They are frequently used in large operational testing programs like TOEFL, PISA, PIRLS, or NAEP. To make MAT accessible for such testing programs, we present a novel combination of MAT with a multidimensional generalization of the random effects testlet model (MAT-MTIRT). MAT-MTIRT compared to non-adaptive testing is examined for several combinations of testlet effect variances (0.0, 0.5, 1.0, and 1.5) and testlet sizes (3 items, 6 items, 9 items) with a simulation study considering three ability dimensions with simple loading structure. MAT-MTIRT outperformed non-adaptive testing regarding the measurement precision of the ability estimates. Further, the measurement precision decreased when testlet effect variances and testlet sizes increased. The suggested combination with the MTIRT model therefore provides a solution to the substantial problems of testlet-based tests while keeping the length of the test within an acceptable range.

  16. Adaptation of HAMMER computer code to CYBER 170/750 computer

    International Nuclear Information System (INIS)

    Pinheiro, A.M.B.S.; Nair, R.P.K.

    1982-01-01

    The adaptation of HAMMER computer code to CYBER 170/750 computer is presented. The HAMMER code calculates cell parameters by multigroup transport theory and reactor parameters by few group diffusion theory. The auxiliary programs, the carried out modifications and the use of HAMMER system adapted to CYBER 170/750 computer are described. (M.C.K.) [pt

  17. The EORTC emotional functioning computerized adaptive test

    DEFF Research Database (Denmark)

    Gamper, Eva-Maria; Grønvold, Mogens; Petersen, Morten Aa

    2014-01-01

    The European Organisation for Research and Treatment of Cancer (EORTC) Quality of Life Group is currently developing computerized adaptive testing measures for the Quality of Life Questionnaire Core-30 (QLQ-C30) scales. The work presented here describes the development of an EORTC item bank for emotional functioning (EF), which is one of the core domains of the QLQ-C30.

  18. Discrete linear canonical transform computation by adaptive method.

    Science.gov (United States)

    Zhang, Feng; Tao, Ran; Wang, Yue

    2013-07-29

    The linear canonical transform (LCT) describes the effect of quadratic phase systems on a wavefield and generalizes many optical transforms. In this paper, the computation method for the discrete LCT using the adaptive least-mean-square (LMS) algorithm is presented. The computation approaches of the block-based discrete LCT and the stream-based discrete LCT using the LMS algorithm are derived, and the implementation structures of these approaches by the adaptive filter system are considered. The proposed computation approaches have the inherent parallel structures which make them suitable for efficient VLSI implementations, and are robust to the propagation of possible errors in the computation process.
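
    The building block of the computation approaches described above is the LMS update rule. A generic system-identification LMS filter makes the rule concrete; the unknown system, signals, and step size below are hypothetical, and the paper's block- and stream-based discrete-LCT structures are not reproduced here.

```python
# Generic LMS adaptive filter showing the update w <- w + mu * e * x.
import numpy as np

rng = np.random.default_rng(1)
h_true = np.array([0.5, -0.3, 0.2])               # unknown FIR system (assumed)
x = rng.standard_normal(5000)                     # input signal
d = np.convolve(x, h_true)[:len(x)]               # desired (reference) signal

w, mu = np.zeros(3), 0.01                         # weights and step size
for n in range(2, len(x)):
    xn = x[n - 2:n + 1][::-1]                     # taps [x[n], x[n-1], x[n-2]]
    e = d[n] - w @ xn                             # a priori error
    w += mu * e * xn                              # LMS weight update

print(w)                                          # converges toward h_true
```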

  19. Unauthorised adaptation of computer programmes - is ...

    African Journals Online (AJOL)

    Haupt acquired copyright in the Data Explorer programme regardless of the fact that the programme was as a result of an unauthorised adaptation of the Project AMPS programme which belonged to Brewers Marketing Intelligence (Pty) Ltd. This case note inter alia analyses the possibility of an author being sued for ...

  20. A New Adaptive Checkpointing Strategy for Mobile Computing

    Institute of Scientific and Technical Information of China (English)

    MEN Chaoguang; ZUO Decheng; YANG Xiaozong

    2005-01-01

    Adaptive checkpointing is an efficient recovery scheme suitable for mobile computing systems. However, existing adaptive checkpointing schemes cannot correctly recover the system when a failure occurs during certain special periods. In this paper, the issues that can lead to system inconsistency are first discussed, and then a new adaptive strategy that recovers the system to a correct consistent state is proposed. Our algorithm improves system recovery performance because only the failed process needs to roll back, using logging.

  1. Scalable space-time adaptive simulation tools for computational electrocardiology

    OpenAIRE

    Krause, Dorian; Krause, Rolf

    2013-01-01

    This work is concerned with the development of computational tools for the solution of reaction-diffusion equations from the field of computational electrocardiology. We designed lightweight spatially and space-time adaptive schemes for large-scale parallel simulations. We propose two different adaptive schemes based on locally structured meshes, managed either via a conforming coarse tessellation or a forest of shallow trees. A crucial ingredient of our approach is a non-conforming morta...

  2. Statistical tests for person misfit in computerized adaptive testing

    NARCIS (Netherlands)

    Glas, Cornelis A.W.; Meijer, R.R.; van Krimpen-Stoop, Edith

    1998-01-01

    Recently, several person-fit statistics have been proposed to detect nonfitting response patterns. This study is designed to generalize an approach followed by Klauer (1995) to an adaptive testing system using the two-parameter logistic model (2PL) as a null model. The approach developed by Klauer

  3. Adaptive security protocol selection for mobile computing

    NARCIS (Netherlands)

    Pontes Soares Rocha, B.; Costa, D.N.O.; Moreira, R.A.; Rezende, C.G.; Loureiro, A.A.F.; Boukerche, A.

    2010-01-01

    The mobile computing paradigm has introduced new problems for application developers. Challenges include heterogeneity of hardware, software, and communication protocols, variability of resource limitations and varying wireless channel quality. In this scenario, security becomes a major concern for

  4. Adapting the Freiburg monosyllabic word test for Slovenian

    Directory of Open Access Journals (Sweden)

    Tatjana Marvin

    2017-12-01

    Speech audiometry is one of the standard methods used to diagnose the type of hearing loss and to assess the communication function of the patient by determining the level of the patient's ability to understand and repeat words presented in a hearing test. For this purpose, the Slovenian adaptations of the German tests developed by Hahlbrock (1953, 1960), the Freiburg Monosyllabic Word Test and the Freiburg Number Test, are used in Slovenia (adapted in 1968 by Pompe). In this paper we focus on the Freiburg Monosyllabic Word Test for Slovenian, which has been criticized by patients as well as in the literature for the unequal difficulty and frequency of its words, many of which are extremely rare or even obsolete. Because part of the patient's communication function is retrieving the meaning of individual words by guessing, the less frequent and consequently less familiar words do not contribute to reliable testing results. We therefore adapt the test by identifying and removing such words and replacing them with phonetically similar words to preserve the phonetic balance of the list. The replacement words are extracted from the written corpus of Slovenian Gigafida and the spoken corpus of Slovenian GOS, while the optimal combinations of words are established using computational algorithms.

  5. An Adaptive Test Sheet Generation Mechanism Using Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Huan-Yu Lin

    2012-01-01

    For test-sheet composition systems, it is important to adaptively compose test sheets with diverse conceptual scopes, discrimination, and difficulty degrees to meet various assessment requirements in real learning situations. Computation time and item exposure rate also influence performance and item bank security. Therefore, this study proposes an Adaptive Test Sheet Generation (ATSG) mechanism, in which a Candidate Item Selection Strategy adaptively determines candidate test items and conceptual granularities according to the desired conceptual scopes, and an Aggregate Objective Function applies a Genetic Algorithm (GA) to find an approximate solution to the mixed-integer programming problem of test-sheet composition. Experimental results show that the ATSG mechanism can generate test sheets that meet various assessment requirements more efficiently and precisely than existing approaches. Furthermore, according to the experimental findings, a fractal time series approach can be applied to analyze the self-similarity characteristics of the GA's fitness scores to further improve the quality of test-sheet composition in the near future.
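
    A compact GA makes the aggregate-objective idea concrete. The fitness below, which penalizes deviation from a target mean difficulty and a target sheet length, is a simplified stand-in for the paper's mixed-integer objective; the item bank, population size, and operator rates are all assumptions.

```python
# Toy GA for test-sheet assembly: chromosomes are binary item-inclusion
# vectors; selection, one-point crossover, and bit-flip mutation evolve them.
import numpy as np

rng = np.random.default_rng(7)
difficulty = rng.uniform(0, 1, 60)                 # hypothetical difficulties
TARGET_D, LENGTH = 0.55, 20

def fitness(ch):
    picked = difficulty[ch.astype(bool)]
    if picked.size == 0:
        return -1e9
    return -(abs(picked.mean() - TARGET_D) + 0.1 * abs(picked.size - LENGTH))

pop = (rng.random((40, 60)) < 0.33).astype(int)    # initial population
for _ in range(200):
    scores = np.array([fitness(c) for c in pop])
    parents = pop[np.argsort(scores)][-20:]        # truncation selection
    cut = rng.integers(1, 59)
    kids = np.vstack([np.r_[parents[i, :cut], parents[(i + 1) % 20, cut:]]
                      for i in range(20)])         # one-point crossover
    flip = rng.random(kids.shape) < 0.01           # bit-flip mutation
    kids = np.where(flip, 1 - kids, kids)
    pop = np.vstack([parents, kids])

best = pop[np.argmax([fitness(c) for c in pop])]
print(best.sum(), difficulty[best.astype(bool)].mean())
```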

  6. Evaluating the Appropriateness of a New Computer-Administered Measure of Adaptive Function for Children and Youth with Autism Spectrum Disorders

    Science.gov (United States)

    Coster, Wendy J.; Kramer, Jessica M.; Tian, Feng; Dooley, Meghan; Liljenquist, Kendra; Kao, Ying-Chia; Ni, Pengsheng

    2016-01-01

    The Pediatric Evaluation of Disability Inventory-Computer Adaptive Test is an alternative method for describing the adaptive function of children and youth with disabilities using a computer-administered assessment. This study evaluated the performance of the Pediatric Evaluation of Disability Inventory-Computer Adaptive Test with a national…

  7. Considerations about expected a posteriori estimation in adaptive testing: adaptive a priori, adaptive correction for bias, and adaptive integration interval.

    Science.gov (United States)

    Raiche, Gilles; Blais, Jean-Guy

    2009-01-01

    In a computerized adaptive test, we would like to obtain an acceptable precision of the proficiency level estimate using an optimal number of items. Unfortunately, decreasing the number of items is accompanied by a certain degree of bias when the true proficiency level differs significantly from the a priori estimate. The authors suggest that it is possible to reduce the bias, and even the standard error of the estimate, by applying to each provisional estimate one or a combination of the following strategies: the adaptive correction for bias proposed by Bock and Mislevy (1982), an adaptive a priori estimate, and an adaptive integration interval.
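
    The quantities being adapted are easiest to see in a plain EAP estimator: the prior mean and the quadrature interval are ordinarily fixed, and the strategies above amount to updating them between items. A sketch under a 2PL model, with all parameter values illustrative:

```python
# EAP estimation on a quadrature grid; `mu`/`sd` (prior) and `lo`/`hi`
# (integration interval) are exposed because they are what the adaptive
# strategies modify. Not the authors' implementation.
import numpy as np

def eap_2pl(xs, a, b, mu=0.0, sd=1.0, lo=-4.0, hi=4.0, n=81):
    theta = np.linspace(lo, hi, n)                 # integration interval
    post = np.exp(-0.5 * ((theta - mu) / sd)**2)   # (adaptive) normal prior
    for x, ai, bi in zip(xs, a, b):
        p = 1 / (1 + np.exp(-ai * (theta - bi)))
        post *= p**x * (1 - p)**(1 - x)            # 2PL likelihood
    post /= np.trapz(post, theta)
    est = np.trapz(theta * post, theta)            # posterior mean (EAP)
    se = np.sqrt(np.trapz((theta - est)**2 * post, theta))
    return est, se

print(eap_2pl([1, 1, 0], a=[1.2, 0.9, 1.5], b=[-0.5, 0.3, 1.0]))
```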

  8. On the issue of item selection in computerized adaptive testing with response times

    NARCIS (Netherlands)

    Veldkamp, Bernard P.

    2016-01-01

    Many standardized tests are now administered via computer rather than paper-and-pencil format. The computer-based delivery mode brings with it certain advantages. One advantage is the ability to adapt the difficulty level of the test to the ability level of the test taker in what has been termed

  9. Quinoa - Adaptive Computational Fluid Dynamics, 0.2

    Energy Technology Data Exchange (ETDEWEB)

    2017-09-22

    Quinoa is a set of computational tools that enables research and numerical analysis in fluid dynamics. At this time it remains a test-bed to experiment with various algorithms using fully asynchronous runtime systems. Currently, Quinoa consists of the following tools: (1) Walker, a numerical integrator for systems of stochastic differential equations in time. It is a mathematical tool to analyze and design the behavior of stochastic differential equations. It allows the estimation of arbitrary coupled statistics and probability density functions and is currently used for the design of statistical moment approximations for multiple mixing materials in variable-density turbulence. (2) Inciter, an overdecomposition-aware finite element field solver for partial differential equations using 3D unstructured grids. Inciter is used to research asynchronous mesh-based algorithms and to experiment with coupling asynchronous to bulk-synchronous parallel code. Two planned new features of Inciter, compared to the previous release (LA-CC-16-015), to be implemented in 2017, are (a) a simple Navier-Stokes solver for ideal single-material compressible gases, and (b) solution-adaptive mesh refinement (AMR), which enables dynamically concentrating compute resources to regions with interesting physics. Using the NS-AMR problem we plan to explore how to scale such high-load-imbalance simulations, representative of large production multiphysics codes, to very large problems on very large computers using an asynchronous runtime system. (3) RNGTest, a test harness to subject random number generators to stringent statistical tests enabling quantitative ranking with respect to their quality and computational cost. (4) UnitTest, a unit test harness, running hundreds of tests per second, capable of testing serial, synchronous, and asynchronous functions. (5) MeshConv, a mesh file converter that can be used to convert 3D tetrahedron meshes from and to either of the following formats: Gmsh

  10. Building fast, reliable, and adaptive software for computational science

    International Nuclear Information System (INIS)

    Rendell, A P; Antony, J; Armstrong, W; Janes, P; Yang, R

    2008-01-01

    Building fast, reliable, and adaptive software is a constant challenge for computational science, especially given recent developments in computer architecture. This paper outlines some of our efforts to address these three issues in the context of computational chemistry. First, a simple linear performance model that can be used to predict the performance of Hartree-Fock calculations is discussed. Second, the use of interval arithmetic to assess the numerical reliability of the sort of integrals used in electronic structure methods is presented. Third, the use of dynamic code modification as part of a framework to support adaptive software is outlined

  11. Unauthorised adaptation of computer programmes - is criminalisation a solution?

    Directory of Open Access Journals (Sweden)

    L Muswaka

    2011-12-01

    In Haupt t/a Softcopy v Brewers Marketing Intelligence (Pty) Ltd 2006 4 SA 458 (SCA), Haupt sought to enforce a copyright claim in the Data Explorer computer programme against Brewers Marketing Intelligence (Pty) Ltd. His claim was dismissed in the High Court and he appealed to the Supreme Court of Appeal. The Court held that copyright in the Data Explorer programme vested in Haupt. Haupt acquired copyright in the Data Explorer programme regardless of the fact that the programme was the result of an unauthorised adaptation of the Project AMPS programme, which belonged to Brewers Marketing Intelligence (Pty) Ltd. This case note inter alia analyses the possibility of an author being sued for infringement even though he has acquired copyright in a work that he created by making unauthorised adaptations of another's copyright material. Furthermore, it examines whether or not the law adequately protects copyright owners in situations where infringement takes the form of unauthorised adaptations of computer programmes. It is argued that the protection afforded by the Copyright Act 98 of 1978 (Copyright Act) in terms of section 27(1) to copyright owners of computer programmes is narrowly defined. It excludes from its ambit of criminal liability the act of making an unauthorised adaptation of a computer programme. The issue considered is therefore whether or not the unauthorised adaptation of computer programmes should attract a criminal sanction. In addressing this issue and with the aim of making recommendations, the legal position in the United Kingdom (UK) is analysed. From the analysis it is recommended that the Copyright Act be amended by the insertion of a new section, section 27(1)(A), which will make the act of making an unauthorised adaptation of a computer programme an offence. This recommended section will close the gap that currently exists in our law with regard to unauthorised adaptations of computer programmes.

  12. Computer prediction of subsurface radionuclide transport: an adaptive numerical method

    International Nuclear Information System (INIS)

    Neuman, S.P.

    1983-01-01

    Radionuclide transport in the subsurface is often modeled with the aid of the advection-dispersion equation. A review of existing computer methods for the solution of this equation shows that there is need for improvement. To answer this need, a new adaptive numerical method is proposed based on an Eulerian-Lagrangian formulation. The method is based on a decomposition of the concentration field into two parts, one advective and one dispersive, in a rigorous manner that does not leave room for ambiguity. The advective component of steep concentration fronts is tracked forward with the aid of moving particles clustered around each front. Away from such fronts the advection problem is handled by an efficient modified method of characteristics called single-step reverse particle tracking. When a front dissipates with time, its forward tracking stops automatically and the corresponding cloud of particles is eliminated. The dispersion problem is solved by an unconventional Lagrangian finite element formulation on a fixed grid which involves only symmetric and diagonal matrices. Preliminary tests against analytical solutions of one- and two-dimensional dispersion in a uniform steady-state velocity field suggest that the proposed adaptive method can handle the entire range of Peclet numbers from 0 to infinity, with Courant numbers well in excess of 1
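
    The advective half of the scheme, single-step reverse particle tracking, is a semi-Lagrangian idea that is simple to sketch in one dimension: trace each grid node backward along the velocity over one step and interpolate the concentration at the foot of the characteristic. Grid, velocity, and initial plume below are illustrative; the dispersion solve and the adaptive front-tracking particles are omitted.

```python
# 1D sketch of reverse particle tracking (modified method of characteristics).
# Note it remains stable even though the Courant number v*dt/dx = 1.6 exceeds
# 1, consistent with the claim in the abstract.
import numpy as np

nx, dx, dt, v = 200, 1.0, 0.5, 3.2          # Courant number = 1.6
x = np.arange(nx) * dx
c = np.exp(-0.5 * ((x - 30.0) / 5.0)**2)    # initial concentration plume

for _ in range(50):
    feet = x - v * dt                       # feet of the characteristics
    c = np.interp(feet, x, c, left=0.0)     # concentration carried forward
```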

  13. The validation of a computer-adaptive test (CAT) for assessing health-related quality of life in children and adolescents in a clinical sample: study design, methods and first results of the Kids-CAT study.

    Science.gov (United States)

    Barthel, D; Otto, C; Nolte, S; Meyrose, A-K; Fischer, F; Devine, J; Walter, O; Mierke, A; Fischer, K I; Thyen, U; Klein, M; Ankermann, T; Rose, M; Ravens-Sieberer, U

    2017-05-01

    Recently, we developed a computer-adaptive test (CAT) for assessing health-related quality of life (HRQoL) in children and adolescents: the Kids-CAT. It measures five generic HRQoL dimensions. The aims of this article were (1) to present the study design and (2) to investigate its psychometric properties in a clinical setting. The Kids-CAT study is a longitudinal prospective study with eight measurements over one year at two University Medical Centers in Germany. For validating the Kids-CAT, 270 consecutive 7- to 17-year-old patients with asthma (n = 52), diabetes (n = 182) or juvenile arthritis (n = 36) answered well-established HRQoL instruments (Pediatric Quality of Life Inventory™ (PedsQL), KIDSCREEN-27) and scales measuring related constructs (e.g., social support, self-efficacy). Measurement precision, test-retest reliability, convergent and discriminant validity were investigated. The mean standard error of measurement ranged between .38 and .49 for the five dimensions, which equals a reliability between .86 and .76, respectively. The Kids-CAT measured most reliably in the lower HRQoL range. Convergent validity was supported by moderate to high correlations of the Kids-CAT dimensions with corresponding PedsQL dimensions ranging between .52 and .72. A lower correlation was found between the social dimensions of both instruments. Discriminant validity was confirmed by lower correlations with non-corresponding subscales of the PedsQL. The Kids-CAT measures pediatric HRQoL reliably, particularly in lower areas of HRQoL. Its test-retest reliability should be re-investigated in future studies. The validity of the instrument was demonstrated. Overall, results suggest that the Kids-CAT is a promising candidate for detecting psychosocial needs in chronically ill children.
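
    The reported SE-to-reliability conversion follows the standard IRT relation, assuming the θ scale has unit population variance; checking the endpoints reproduces the quoted values:

```latex
\mathrm{rel} = 1 - \mathrm{SE}(\hat{\theta})^{2}, \qquad
1 - 0.38^{2} \approx .86, \qquad 1 - 0.49^{2} \approx .76
```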

  14. Implementation of the Kids-CAT in clinical settings: a newly developed computer-adaptive test to facilitate the assessment of patient-reported outcomes of children and adolescents in clinical practice in Germany.

    Science.gov (United States)

    Barthel, D; Fischer, K I; Nolte, S; Otto, C; Meyrose, A-K; Reisinger, S; Dabs, M; Thyen, U; Klein, M; Muehlan, H; Ankermann, T; Walter, O; Rose, M; Ravens-Sieberer, U

    2016-03-01

    To describe the implementation process of a computer-adaptive test (CAT) for measuring health-related quality of life (HRQoL) of children and adolescents in two pediatric clinics in Germany. The study focuses on the feasibility and user experience with the Kids-CAT, particularly the patients' experience with the tool and the pediatricians' experience with the Kids-CAT Report. The Kids-CAT was completed by 312 children and adolescents with asthma, diabetes or rheumatoid arthritis. The test was applied during four clinical visits over a 1-year period. A feedback report with the test results was made available to the pediatricians. To assess both feasibility and acceptability, a multimethod research design was used. To assess the patients' experience with the tool, the children and adolescents completed a questionnaire. To assess the clinicians' experience, two focus groups were conducted with eight pediatricians. The children and adolescents indicated that the Kids-CAT was easy to complete. All pediatricians reported that the Kids-CAT was straightforward and easy to understand and integrate into clinical practice; they also expressed that routine implementation of the tool would be desirable and that the report was a valuable source of information, facilitating the assessment of self-reported HRQoL of their patients. The Kids-CAT was considered an efficient and valuable tool for assessing HRQoL in children and adolescents. The Kids-CAT Report promises to be a useful adjunct to standard clinical care with the potential to improve patient-physician communication, enabling pediatricians to evaluate and monitor their young patients' self-reported HRQoL.

  15. An Adaptive Sensor Mining Framework for Pervasive Computing Applications

    Science.gov (United States)

    Rashidi, Parisa; Cook, Diane J.

    Analyzing sensor data in pervasive computing applications brings unique challenges to the KDD community. The challenge is heightened when the underlying data source is dynamic and the patterns change. We introduce a new adaptive mining framework that detects patterns in sensor data, and more importantly, adapts to the changes in the underlying model. In our framework, the frequent and periodic patterns of data are first discovered by the Frequent and Periodic Pattern Miner (FPPM) algorithm; and then any changes in the discovered patterns over the lifetime of the system are discovered by the Pattern Adaptation Miner (PAM) algorithm, in order to adapt to the changing environment. This framework also captures vital context information present in pervasive computing applications, such as the startup triggers and temporal information. In this paper, we present a description of our mining framework and validate the approach using data collected in the CASAS smart home testbed.

  16. Learning Words through Computer-Adaptive Tool

    DEFF Research Database (Denmark)

    Zhang, Chun

    2005-01-01

    construction, I stress the design of a test theory, namely, a learning algorithm. The learning algorithm is designed under such principles that users experience both 'elaborative rehearsal' (aspects of receptive and productive learning) and 'expanding rehearsal' (memory-based learning and repetitive act...

  17. Features of the adaptive control and measuring the effectiveness of distant teaching to computer science

    Directory of Open Access Journals (Sweden)

    Евгений Игоревич Горюшкин

    2009-06-01

    Full Text Available In title approaches to construction of effective monitoring systems of productivity of training to computer science in high schools are described. It is offered to put adaptive testing at which in development of tests artificial neural networks are applied in a basis of such systems.

  18. CAT -- computer aided testing for resonant inspection

    International Nuclear Information System (INIS)

    Foley, David K.

    1998-01-01

    Computer technology can be applied to inspection and quality control. Computer aided testing (CAT) can be used to analyze various NDT technologies, such as eddy current, ultrasonics, and resonant inspection.

  19. Sequential decision making in computational sustainability via adaptive submodularity

    Science.gov (United States)

    Krause, Andreas; Golovin, Daniel; Converse, Sarah J.

    2015-01-01

    Many problems in computational sustainability require making a sequence of decisions in complex, uncertain environments. Such problems are generally notoriously difficult. In this article, we review the recently discovered notion of adaptive submodularity, an intuitive diminishing returns condition that generalizes the classical notion of submodular set functions to sequential decision problems. Problems exhibiting the adaptive submodularity property can be efficiently and provably near-optimally solved using simple myopic policies. We illustrate this concept in several case studies of interest in computational sustainability: First, we demonstrate how it can be used to efficiently plan for resolving uncertainty in adaptive management scenarios. Secondly, we show how it applies to dynamic conservation planning for protecting endangered species, a case study carried out in collaboration with the US Geological Survey and the US Fish and Wildlife Service.

  20. Efficient computation of smoothing splines via adaptive basis sampling

    KAUST Repository

    Ma, Ping

    2015-06-24

    © 2015 Biometrika Trust. Smoothing splines provide flexible nonparametric regression estimators. However, the high computational cost of smoothing splines for large datasets has hindered their wide application. In this article, we develop a new method, named adaptive basis sampling, for efficient computation of smoothing splines in super-large samples. Except for the univariate case where the Reinsch algorithm is applicable, a smoothing spline for a regression problem with sample size n can be expressed as a linear combination of n basis functions and its computational complexity is generally O(n³). We achieve a more scalable computation in the multivariate case by evaluating the smoothing spline using a smaller set of basis functions, obtained by an adaptive sampling scheme that uses values of the response variable. Our asymptotic analysis shows that smoothing splines computed via adaptive basis sampling converge to the true function at the same rate as full basis smoothing splines. Using simulation studies and a large-scale deep earth core-mantle boundary imaging study, we show that the proposed method outperforms a sampling method that does not use the values of response variables.

  1. Efficient computation of smoothing splines via adaptive basis sampling

    KAUST Repository

    Ma, Ping; Huang, Jianhua Z.; Zhang, Nan

    2015-01-01

    © 2015 Biometrika Trust. Smoothing splines provide flexible nonparametric regression estimators. However, the high computational cost of smoothing splines for large datasets has hindered their wide application. In this article, we develop a new method, named adaptive basis sampling, for efficient computation of smoothing splines in super-large samples. Except for the univariate case where the Reinsch algorithm is applicable, a smoothing spline for a regression problem with sample size n can be expressed as a linear combination of n basis functions and its computational complexity is generally O(n³). We achieve a more scalable computation in the multivariate case by evaluating the smoothing spline using a smaller set of basis functions, obtained by an adaptive sampling scheme that uses values of the response variable. Our asymptotic analysis shows that smoothing splines computed via adaptive basis sampling converge to the true function at the same rate as full basis smoothing splines. Using simulation studies and a large-scale deep earth core-mantle boundary imaging study, we show that the proposed method outperforms a sampling method that does not use the values of response variables.

  2. Distributed Problem Solving: Adaptive Networks with a Computer Intermediary Resource. Intelligent Executive Computer Communication

    Science.gov (United States)

    1991-06-01

    Interim Report: Distributed Problem Solving: Adaptive Networks With a Computer Intermediary Resource: Intelligent Executive Computer Communication. John Lyman and Carla J. Conaway, University of California at Los Angeles. Related proceedings: The National Conference on Artificial Intelligence, pages 181-184, American Association for Artificial Intelligence, Pittsburgh.

  3. New challenges in grid generation and adaptivity for scientific computing

    CERN Document Server

    Formaggia, Luca

    2015-01-01

    This volume collects selected contributions from the “Fourth Tetrahedron Workshop on Grid Generation for Numerical Computations”, which was held in Verbania, Italy in July 2013. The previous editions of this Workshop were hosted by the Weierstrass Institute in Berlin (2005), by INRIA Rocquencourt in Paris (2007), and by Swansea University (2010). This book covers different, though related, aspects of the field: the generation of quality grids for complex three-dimensional geometries; parallel mesh generation algorithms; mesh adaptation, including both theoretical and implementation aspects; grid generation and adaptation on surfaces – all with an interesting mix of numerical analysis, computer science and strongly application-oriented problems.

  4. Overview of adaptive finite element analysis in computational geodynamics

    Science.gov (United States)

    May, D. A.; Schellart, W. P.; Moresi, L.

    2013-10-01

    The use of numerical models to develop insight and intuition into the dynamics of the Earth over geological time scales is a firmly established practice in the geodynamics community. As our depth of understanding grows, and hand-in-hand with improvements in analytical techniques and higher resolution remote sensing of the physical structure and state of the Earth, there is a continual need to develop more efficient, accurate and reliable numerical techniques. This is necessary to ensure that we can meet the challenge of generating robust conclusions, interpretations and predictions from improved observations. In adaptive numerical methods, the desire is generally to maximise the quality of the numerical solution for a given amount of computational effort. Neither of these terms has a unique, universal definition, but typically there is a trade off between the number of unknowns we can calculate to obtain a more accurate representation of the Earth, and the resources (time and computational memory) required to compute them. In the engineering community, this topic has been extensively examined using the adaptive finite element (AFE) method. Recently, the applicability of this technique to geodynamic processes has started to be explored. In this review we report on the current status and usage of spatially adaptive finite element analysis in the field of geodynamics. The objective of this review is to provide a brief introduction to the area of spatially adaptive finite analysis, including a summary of different techniques to define spatial adaptation and of different approaches to guide the adaptive process in order to control the discretisation error inherent within the numerical solution. An overview of the current state of the art in adaptive modelling in geodynamics is provided, together with a discussion pertaining to the issues related to using adaptive analysis techniques and perspectives for future research in this area. Additionally, we also provide a

  5. An authoring tool for building both mobile adaptable tests and web-based adaptive or classic tests

    NARCIS (Netherlands)

    Romero, C.; Ventura, S.; Hervás, C.; De Bra, P.M.E.; Wade, V.; Ashman, H.; Smyth, B.

    2006-01-01

    This paper describes Test Editor, an authoring tool for building both mobile adaptable tests and web-based adaptive or classic tests. This tool facilitates the development and maintenance of different types of XML-based multiple- choice tests for using in web-based education systems and wireless

  6. Hybrid Computerized Adaptive Testing: From Group Sequential Design to Fully Sequential Design

    Science.gov (United States)

    Wang, Shiyu; Lin, Haiyan; Chang, Hua-Hua; Douglas, Jeff

    2016-01-01

    Computerized adaptive testing (CAT) and multistage testing (MST) have become two of the most popular modes in large-scale computer-based sequential testing. Though most designs of CAT and MST exhibit strength and weakness in recent large-scale implementations, there is no simple answer to the question of which design is better because different…

  7. Identifying Students at Risk: An Examination of Computer-Adaptive Measures and Latent Class Growth Analysis

    Science.gov (United States)

    Keller-Margulis, Milena; McQuillin, Samuel D.; Castañeda, Juan Javier; Ochs, Sarah; Jones, John H.

    2018-01-01

    Multitiered systems of support depend on screening technology to identify students at risk. The purpose of this study was to examine the use of a computer-adaptive test and latent class growth analysis (LCGA) to identify students at risk in reading with focus on the use of this methodology to characterize student performance in screening.…

  8. Adaptive testing with equated number-correct scoring

    NARCIS (Netherlands)

    van der Linden, Willem J.

    1999-01-01

    A constrained CAT algorithm is presented that automatically equates the number-correct scores on adaptive tests. The algorithm can be used to equate number-correct scores across different administrations of the same adaptive test as well as to an external reference test. The constraints are derived

  9. An Adaptive Reordered Method for Computing PageRank

    Directory of Open Access Journals (Sweden)

    Yi-Ming Bu

    2013-01-01

    We propose an adaptive reordered method to deal with the PageRank problem. It has been shown that one can reorder the hyperlink matrix of the PageRank problem to calculate a reduced system and obtain the full PageRank vector through forward substitutions. This method can provide a speedup for calculating the PageRank vector. We observe that in the existing reordered method, the cost of the recursive reordering procedure can offset the computational reduction brought by minimizing the dimension of the linear system. With this observation, we introduce an adaptive reordered method to accelerate the total calculation, in which we terminate the reordering procedure appropriately instead of reordering to the end. Numerical experiments show the effectiveness of this adaptive reordered method.
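
    The first level of the reordering can be shown on a toy graph: rows with no out-links (dangling nodes) are moved to the end, a smaller linear system is solved for the non-dangling part, and the dangling part follows by substitution. The graph and damping factor are illustrative, and the recursion and adaptive termination that the paper contributes are not shown.

```python
# One level of reordered PageRank on the linear-system formulation
# pi^T (I - alpha*H) = v^T, with a final normalization.
import numpy as np

alpha = 0.85
H = np.array([[0.0, 0.5, 0.5, 0.0],
              [0.0, 0.0, 1.0, 0.0],
              [0.0, 0.0, 0.0, 0.0],      # node 2 is dangling (no out-links)
              [0.0, 0.0, 0.0, 0.0]])     # node 3 is dangling
n = H.shape[0]
dangling = np.where(H.sum(axis=1) == 0)[0]
keep = np.setdiff1d(np.arange(n), dangling)

v = (1 - alpha) / n * np.ones(n)
H11 = H[np.ix_(keep, keep)]              # links among non-dangling nodes
H12 = H[np.ix_(keep, dangling)]          # links into dangling nodes

pi1 = np.linalg.solve(np.eye(len(keep)) - alpha * H11.T, v[keep])
pi2 = v[dangling] + alpha * H12.T @ pi1  # substitution for dangling part

pi = np.empty(n)
pi[keep], pi[dangling] = pi1, pi2
pi /= pi.sum()                           # normalize to a distribution
print(pi)
```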

  10. Techniques for grid manipulation and adaptation. [computational fluid dynamics

    Science.gov (United States)

    Choo, Yung K.; Eisemann, Peter R.; Lee, Ki D.

    1992-01-01

    Two approaches have been taken to provide systematic grid manipulation for improved grid quality. One is the control point form (CPF) of algebraic grid generation. It provides explicit control of the physical grid shape and grid spacing through the movement of the control points. It works well in the interactive computer graphics environment and hence can be a good candidate for integration with other emerging technologies. The other approach is grid adaptation using a numerical mapping between the physical space and a parametric space. Grid adaptation is achieved by modifying the mapping functions through the effects of grid control sources. The adaptation process can be repeated in a cyclic manner if satisfactory results are not achieved after a single application.

  11. Method and system for environmentally adaptive fault tolerant computing

    Science.gov (United States)

    Copenhaver, Jason L. (Inventor); Ramos, Jeremy (Inventor); Wolfe, Jeffrey M. (Inventor); Brenner, Dean (Inventor)

    2010-01-01

    A method and system for adapting fault tolerant computing. The method includes the steps of measuring an environmental condition representative of an environment. An on-board processing system's sensitivity to the measured environmental condition is measured. It is determined whether to reconfigure a fault tolerance of the on-board processing system based in part on the measured environmental condition. The fault tolerance of the on-board processing system may be reconfigured based in part on the measured environmental condition.

  12. An Adaptive Approach to Locating Mobile HIV Testing Services.

    Science.gov (United States)

    Gonsalves, Gregg S; Crawford, Forrest W; Cleary, Paul D; Kaplan, Edward H; Paltiel, A David

    2018-02-01

    Public health agencies suggest targeting "hotspots" to identify individuals with undetected HIV infection. However, definitions of hotspots vary. Little is known about how best to target mobile HIV testing resources. We conducted a computer-based tournament to compare the yield of 4 algorithms for mobile HIV testing. Over 180 rounds of play, the algorithms selected 1 of 3 hypothetical zones, each with unknown prevalence of undiagnosed HIV, in which to conduct a fixed number of HIV tests. The algorithms were: 1) Thompson Sampling, an adaptive Bayesian search strategy; 2) Explore-then-Exploit, a strategy that initially draws comparable samples from all zones and then devotes all remaining rounds of play to HIV testing in whichever zone produced the highest observed yield; 3) Retrospection, a strategy using only base prevalence information; and 4) Clairvoyance, a benchmarking strategy that employs perfect information about HIV prevalence in each zone. Over 250 tournament runs, Thompson Sampling outperformed Explore-then-Exploit 66% of the time, identifying 15% more cases. Thompson Sampling's superiority persisted in a variety of circumstances examined in the sensitivity analysis. Case detection rates using Thompson Sampling were, on average, within 90% of the benchmark established by Clairvoyance. Retrospection was consistently the poorest performer. We did not consider either selection bias (i.e., the correlation between infection status and the decision to obtain an HIV test) or the costs of relocation to another zone from one round of play to the next. Adaptive methods like Thompson Sampling for mobile HIV testing are practical and effective, and may have advantages over other commonly used strategies.
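
    The tournament's winning strategy is easy to prototype. Below is a minimal Beta-Bernoulli Thompson Sampling sketch for the zone-selection problem described above; the zone prevalences and the tests-per-round figure are hypothetical placeholders, not the study's parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def thompson_tournament(true_prev, rounds=180, tests_per_round=50):
    """Beta-Bernoulli Thompson Sampling over K zones with unknown prevalence.
    Each round, test in the zone whose prevalence draw from the current
    posterior is highest, then update that zone with the observed positives."""
    K = len(true_prev)
    alpha, beta = np.ones(K), np.ones(K)     # Beta(1, 1) priors per zone
    found = 0
    for _ in range(rounds):
        zone = int(np.argmax(rng.beta(alpha, beta)))    # sample, then act
        positives = rng.binomial(tests_per_round, true_prev[zone])
        alpha[zone] += positives                        # posterior update
        beta[zone] += tests_per_round - positives
        found += positives
    return found

# Hypothetical prevalences of undiagnosed HIV in three zones
print(thompson_tournament([0.01, 0.03, 0.06]))
```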

  13. Design and Build an Adapter for Hearing Protector Test

    Directory of Open Access Journals (Sweden)

    Rostam Golmohammadi

    2016-06-01

    Full Text Available Introduction: Determining the effectiveness of hearing protective devices that lack technical information is one of the major challenges occupational health experts face when judging how such devices reduce occupational noise exposure. The aim of this study was to design and build a hearing-protector test adapter and to use it to determine the noise reduction rate of earmuffs and earplugs. Methods: The adapter was tested in real work environments in the glass industries of Hamadan, and the measured results were compared with computational methods to verify its performance. Results: The measured performance of the personal hearing protectors agreed well with the results obtained in the real industrial environment and with the octave-band method, showing good regression of the average operating transmission losses. The average noise reduction obtained from the measured and computational methods was 9.3 and 8.8 dB for earmuffs, and 9.3 and 11.2 dB for earplugs, respectively. The comparison showed no significant differences between the results of the two methods (P>0.05). Conclusion: Testing of the designed adapter with several hearing protectors showed it to be a valid tool for determining the noise reduction rate of earmuffs and earplugs.

  14. Implementing content constraints in alpha-stratified adaptive testing using a shadow test approach

    NARCIS (Netherlands)

    van der Linden, Willem J.; Chang, Hua-Hua

    2001-01-01

    The methods of alpha-stratified adaptive testing and constrained adaptive testing with shadow tests are combined in this study. The advantages are twofold. First, application of the shadow test allows the researcher to implement any type of constraint on item selection in alpha-stratified adaptive

  15. Adaptive Management of Computing and Network Resources for Spacecraft Systems

    Science.gov (United States)

    Pfarr, Barbara; Welch, Lonnie R.; Detter, Ryan; Tjaden, Brett; Huh, Eui-Nam; Szczur, Martha R. (Technical Monitor)

    2000-01-01

    It is likely that NASA's future spacecraft systems will consist of distributed processes which will handle dynamically varying workloads in response to perceived scientific events, the spacecraft environment, spacecraft anomalies and user commands. Since all situations and possible uses of sensors cannot be anticipated during pre-deployment phases, an approach for dynamically adapting the allocation of distributed computational and communication resources is needed. To address this, we are evolving the DeSiDeRaTa adaptive resource management approach to enable reconfigurable ground and space information systems. The DeSiDeRaTa approach embodies a set of middleware mechanisms for adapting resource allocations, and a framework for reasoning about the real-time performance of distributed application systems. The framework and middleware will be extended to accommodate (1) the dynamic aspects of intra-constellation network topologies, and (2) the complete real-time path from the instrument to the user. We are developing a ground-based testbed that will enable NASA to perform early evaluation of adaptive resource management techniques without the expense of first deploying them in space. The benefits of the proposed effort are numerous, including the ability to use sensors in new ways not anticipated at design time; the production of information technology that ties the sensor web together; the accommodation of greater numbers of missions with fewer resources; and the opportunity to leverage the DeSiDeRaTa project's expertise, infrastructure and models for adaptive resource management for distributed real-time systems.

  16. Comparing computer adaptive and curriculum-based measures of math in progress monitoring.

    Science.gov (United States)

    Shapiro, Edward S; Dennis, Minyi Shih; Fu, Qiong

    2015-12-01

    The purpose of the study was to compare the use of a Computer Adaptive Test and Curriculum-Based Measurement in the assessment of mathematics. This study also investigated the degree to which slope or rate of change predicted student outcomes on the annual state assessment of mathematics above and beyond scores of single point screening assessments (i.e., the computer adaptive test or the CBM assessment just before the administration of the state assessment). Repeated measurement of mathematics once per month across a 7-month period using a Computer Adaptive Test (STAR-Math) and Curriculum-Based Measurement (CBM, AIMSweb Math Computation, AIMSweb Math Concepts/Applications) was collected for a maximum total of 250 third, fourth, and fifth grade students. Results showed STAR-Math in all 3 grades and AIMSweb Math Concepts/Applications in the third and fifth grades had primarily linear growth patterns in mathematics. AIMSweb Math Computation in all grades and AIMSweb Math Concepts/Applications in Grade 4 had decelerating positive trends. Predictive validity evidence showed the strongest relationships were between STAR-Math and outcomes for third and fourth grade students. The blockwise multiple regression by grade revealed that slopes accounted for only a very small proportion of additional variance above and beyond what was explained by the scores obtained on a single point of assessment just prior to the administration of the state assessment.

  17. Computerized Adaptive Personality Testing: A Review and Illustration With the MMPI-2 Computerized Adaptive Version.

    Science.gov (United States)

    Forbey, Johnathan D.; Ben-Porath, Yossef S.

    2007-01-01

    Computerized adaptive testing in personality assessment can improve efficiency by significantly reducing the number of items administered to answer an assessment question. Two approaches have been explored for adaptive testing in computerized personality assessment: item response theory and the countdown method. In this article, the authors…

  18. Processing Optimization of Typed Resources with Synchronized Storage and Computation Adaptation in Fog Computing

    Directory of Open Access Journals (Sweden)

    Zhengyang Song

    2018-01-01

    Full Text Available Wide application of Internet of Things (IoT) systems has been increasingly demanding more hardware facilities for processing various resources including data, information, and knowledge. With the rapid growth of generated resource quantity, it is difficult to adapt to this situation by using traditional cloud computing models. Fog computing enables storage and computing services to perform at the edge of the network to extend cloud computing. However, there are some problems such as restricted computation, limited storage, and expensive network bandwidth in Fog computing applications. It is a challenge to balance the distribution of network resources. We propose a processing optimization mechanism of typed resources with synchronized storage and computation adaptation in Fog computing. In this mechanism, we process typed resources in a wireless-network-based three-tier architecture consisting of Data Graph, Information Graph, and Knowledge Graph. The proposed mechanism aims to minimize processing cost over network, computation, and storage while maximizing the performance of processing in a business value driven manner. Simulation results show that the proposed approach improves the ratio of performance over user investment. Meanwhile, conversions between resource types deliver support for dynamically allocating network resources.

  19. Modifications of the branch-and-bound algorithm for application in constrained adaptive testing

    NARCIS (Netherlands)

    Veldkamp, Bernard P.

    2000-01-01

    A mathematical programming approach is presented for computer adaptive testing (CAT) with many constraints on the item and test attributes. Because mathematical programming problems have to be solved while the examinee waits for the next item, a fast implementation of the Branch-and-Bound algorithm

  20. Adaptive scapula bone remodeling computational simulation: Relevance to regenerative medicine

    International Nuclear Information System (INIS)

    Sharma, Gulshan B.; Robertson, Douglas D.

    2013-01-01

    Shoulder arthroplasty success has been attributed to many factors including bone quality, soft tissue balancing, surgeon experience, and implant design. Improved long-term success is primarily limited by glenoid implant loosening. Prosthesis design examines materials and shape and determines whether the design can withstand a lifetime of use. Finite element (FE) analyses have been extensively used to study stresses and strains produced in implants and bone. However, these static analyses only measure a moment in time and not the adaptive response to the altered environment produced by the therapeutic intervention. Computational analyses that integrate remodeling rules predict how bone will respond over time. Recent work has shown that subject-specific two- and three-dimensional adaptive bone remodeling models are feasible and valid. Feasibility and validation were achieved computationally, simulating bone remodeling using an intact human scapula, initially resetting the scapular bone material properties to be uniform, numerically simulating sequential loading, and comparing the bone remodeling simulation results to the actual scapula’s material properties. A three-dimensional scapula FE bone model was created using volumetric computed tomography images. Muscle and joint load and boundary conditions were applied based on values reported in the literature. Internal bone remodeling was based on element strain-energy density. Initially, all bone elements were assigned a homogeneous density. All loads were applied for 10 iterations. After every iteration, each bone element’s remodeling stimulus was compared to its corresponding reference stimulus and its material properties modified. The simulation achieved convergence. At the end of the simulation the predicted and actual specimen bone apparent density were plotted and compared. Location of high and low predicted bone density was comparable to the actual specimen. High predicted bone density was greater than

  1. Item selection and ability estimation adaptive testing

    NARCIS (Netherlands)

    Pashley, Peter J.; van der Linden, Wim J.; van der Linden, Willem J.; Glas, Cornelis A.W.; Glas, Cees A.W.

    2010-01-01

    The last century saw a tremendous progression in the refinement and use of standardized linear tests. The first administered College Board exam occurred in 1901 and the first Scholastic Assessment Test (SAT) was given in 1926. Since then, progressively more sophisticated standardized linear tests

  2. Autonomic intrusion detection: Adaptively detecting anomalies over unlabeled audit data streams in computer networks

    KAUST Repository

    Wang, Wei; Guyet, Thomas; Quiniou, René ; Cordier, Marie-Odile; Masseglia, Florent; Zhang, Xiangliang

    2014-01-01

    In this work, we propose a novel framework of autonomic intrusion detection that fulfills online and adaptive intrusion detection over unlabeled HTTP traffic streams in computer networks. The framework holds potential for self-managing: self-labeling, self-updating and self-adapting. Our framework employs the Affinity Propagation (AP) algorithm to learn a subject’s behaviors through dynamical clustering of the streaming data. It automatically labels the data and adapts to normal behavior changes while identifying anomalies. Two large real HTTP traffic streams collected in our institute as well as a set of benchmark KDD’99 data are used to validate the framework and the method. The test results show that the autonomic model achieves better results in terms of effectiveness and efficiency compared to the adaptive Sequential Karhunen–Loeve method and static AP, as well as three other static anomaly detection methods, namely, k-NN, PCA and SVM.

  3. Autonomic intrusion detection: Adaptively detecting anomalies over unlabeled audit data streams in computer networks

    KAUST Repository

    Wang, Wei

    2014-06-22

    In this work, we propose a novel framework of autonomic intrusion detection that fulfills online and adaptive intrusion detection over unlabeled HTTP traffic streams in computer networks. The framework holds potential for self-managing: self-labeling, self-updating and self-adapting. Our framework employs the Affinity Propagation (AP) algorithm to learn a subject’s behaviors through dynamical clustering of the streaming data. It automatically labels the data and adapts to normal behavior changes while identifying anomalies. Two large real HTTP traffic streams collected in our institute as well as a set of benchmark KDD’99 data are used to validate the framework and the method. The test results show that the autonomic model achieves better results in terms of effectiveness and efficiency compared to the adaptive Sequential Karhunen–Loeve method and static AP, as well as three other static anomaly detection methods, namely, k-NN, PCA and SVM.
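
    The clustering core of this framework can be approximated with off-the-shelf tools. The sketch below runs scikit-learn's static Affinity Propagation on synthetic HTTP-request features and flags members of unusually small clusters as candidate anomalies; it is a toy stand-in for the authors' self-labeling, self-updating pipeline, and all feature values and thresholds are assumptions.

```python
import numpy as np
from sklearn.cluster import AffinityPropagation

# Hypothetical numeric features per HTTP request (e.g. length, URI entropy, ...)
rng = np.random.default_rng(1)
normal = rng.normal(loc=0.0, scale=1.0, size=(200, 4))     # bulk of the traffic
anomalous = rng.normal(loc=5.0, scale=1.0, size=(5, 4))    # a few outliers
X = np.vstack([normal, anomalous])

# Affinity Propagation discovers the number of clusters on its own
ap = AffinityPropagation(damping=0.9, random_state=1).fit(X)
labels = ap.labels_

# Requests falling into very small clusters are candidate anomalies
sizes = np.bincount(labels)
suspects = np.where(sizes[labels] < 10)[0]
print("candidate anomalies:", suspects)
```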

  4. Simple adaptive sparse representation based classification schemes for EEG based brain-computer interface applications.

    Science.gov (United States)

    Shin, Younghak; Lee, Seungchan; Ahn, Minkyu; Cho, Hohyun; Jun, Sung Chan; Lee, Heung-No

    2015-11-01

    One of the main problems related to electroencephalogram (EEG) based brain-computer interface (BCI) systems is the non-stationarity of the underlying EEG signals. This results in the deterioration of the classification performance during experimental sessions. Therefore, adaptive classification techniques are required for EEG based BCI applications. In this paper, we propose simple adaptive sparse representation based classification (SRC) schemes. Supervised and unsupervised dictionary update techniques for new test data and a dictionary modification method by using the incoherence measure of the training data are investigated. The proposed methods are very simple and additional computation for the re-training of the classifier is not needed. The proposed adaptive SRC schemes are evaluated using two BCI experimental datasets. The proposed methods are assessed by comparing classification results with the conventional SRC and other adaptive classification methods. On the basis of the results, we find that the proposed adaptive schemes show relatively improved classification accuracy as compared to conventional methods without requiring additional computation.

  5. QUASI-RANDOM TESTING OF COMPUTER SYSTEMS

    Directory of Open Access Journals (Sweden)

    S. V. Yarmolik

    2013-01-01

    Full Text Available Various modified random testing approaches have been proposed for computer system testing in the black box environment. Their effectiveness has been evaluated on the typical failure patterns by employing three measures, namely, P-measure, E-measure and F-measure. Quasi-random testing, a modified version of random testing, has been proposed and analyzed. The quasi-random Sobol sequences and modified Sobol sequences are used as the test patterns. Some new methods for Sobol sequence generation have been proposed and analyzed.
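
    Generating Sobol test patterns of the kind the record mentions is straightforward with SciPy's quasi-Monte Carlo module; the two-parameter, 16-bit input domain below is a hypothetical example, and the record's own modified-Sobol generators are not reproduced here.

```python
from scipy.stats import qmc

# Draw 2**8 = 256 low-discrepancy test patterns over a 2-D input domain
sobol = qmc.Sobol(d=2, scramble=False)
points = sobol.random_base2(m=8)              # points in the unit square [0, 1)^2

# Map the unit-cube points onto a concrete input space, e.g. two 16-bit inputs
test_inputs = qmc.scale(points, [0, 0], [65535, 65535]).astype(int)
print(test_inputs[:5])                        # first few generated test cases
```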

  6. Multidimensional adaptive testing with a minimum error-variance criterion

    NARCIS (Netherlands)

    van der Linden, Willem J.

    1997-01-01

    The case of adaptive testing under a multidimensional logistic response model is addressed. An adaptive algorithm is proposed that minimizes the (asymptotic) variance of the maximum-likelihood (ML) estimator of a linear combination of abilities of interest. The item selection criterion is a simple

  7. Adaptive Dynamic Process Scheduling on Distributed Memory Parallel Computers

    Directory of Open Access Journals (Sweden)

    Wei Shu

    1994-01-01

    Full Text Available One of the challenges in programming distributed memory parallel machines is deciding how to allocate work to processors. This problem is particularly important for computations with unpredictable dynamic behaviors or irregular structures. We present a scheme for dynamic scheduling of medium-grained processes that is useful in this context. The adaptive contracting within neighborhood (ACWN) is a dynamic, distributed, load-dependent, and scalable scheme. It deals with dynamic and unpredictable creation of processes and adapts to different systems. The scheme is described and contrasted with two other schemes that have been proposed in this context, namely the randomized allocation and the gradient model. The performance of the three schemes on an Intel iPSC/2 hypercube is presented and analyzed. The experimental results show that even though the ACWN algorithm incurs somewhat larger overhead than the randomized allocation, it achieves better performance in most cases due to its adaptiveness. Its feature of quickly spreading the work helps it outperform the gradient model in performance and scalability.

  8. Parallel Adaptive Mesh Refinement for High-Order Finite-Volume Schemes in Computational Fluid Dynamics

    Science.gov (United States)

    Schwing, Alan Michael

    For computational fluid dynamics, the governing equations are solved on a discretized domain of nodes, faces, and cells. The quality of the grid or mesh can be a driving source for error in the results. While refinement studies can help guide the creation of a mesh, grid quality is largely determined by user expertise and understanding of the flow physics. Adaptive mesh refinement is a technique for enriching the mesh during a simulation based on metrics for error, impact on important parameters, or location of important flow features. This can offload from the user some of the difficult and ambiguous decisions necessary when discretizing the domain. This work explores the implementation of adaptive mesh refinement in an implicit, unstructured, finite-volume solver. Consideration is made for applying modern computational techniques in the presence of hanging nodes and refined cells. The approach is developed to be independent of the flow solver in order to provide a path for augmenting existing codes. It is designed to be applicable for unsteady simulations, and refinement and coarsening of the grid do not impact the conservatism of the underlying numerics. The effect on high-order numerical fluxes of fourth and sixth order is explored. Provided the criteria for refinement are appropriately selected, solutions obtained using adapted meshes have no additional error when compared to results obtained on traditional, unadapted meshes. In order to leverage large-scale computational resources common today, the methods are parallelized using MPI. Parallel performance is considered for several test problems in order to assess scalability of both adapted and unadapted grids. Dynamic repartitioning of the mesh during refinement is crucial for load balancing an evolving grid. Development of the methods outlined here depends on a dual-memory approach that is described in detail. Validation of the solver developed here against a number of motivating problems shows favorable

  9. Computerized adaptive testing--ready for ambulatory monitoring?

    DEFF Research Database (Denmark)

    Rose, Matthias; Bjørner, Jakob; Fischer, Felix

    2012-01-01

    Computerized adaptive tests (CATs) have abundant theoretical advantages over established static instruments, which could improve ambulatory monitoring of patient-reported outcomes (PROs). However, an empirical demonstration of their practical benefits is warranted.

  10. Web-based computer adaptive assessment of individual perceptions of job satisfaction for hospital workplace employees.

    Science.gov (United States)

    Chien, Tsair-Wei; Lai, Wen-Pin; Lu, Chih-Wei; Wang, Weng-Chung; Chen, Shih-Chung; Wang, Hsien-Yi; Su, Shih-Bin

    2011-04-17

    To develop a web-based computer adaptive testing (CAT) application for efficiently collecting data regarding workers' perceptions of job satisfaction, we examined whether a 37-item Job Content Questionnaire (JCQ-37) could evaluate the job satisfaction of individual employees as a single construct. The JCQ-37 makes data collection via CAT on the internet easy, viable and fast. A Rasch rating scale model was applied to analyze data from 300 randomly selected hospital employees who participated in job-satisfaction surveys in 2008 and 2009 via non-adaptive and computer-adaptive testing, respectively. Of the 37 items on the questionnaire, 24 items fit the model fairly well. Person-separation reliability for the 2008 surveys was 0.88. Measures from both years and item-8 job satisfaction for groups were successfully evaluated through item-by-item analyses by using t-tests. Workers aged 26 - 35 felt that job satisfaction was significantly worse in 2009 than in 2008. A Web-CAT developed in the present paper was shown to be more efficient than traditional computer-based or pen-and-paper assessments at collecting data regarding workers' perceptions of job content.

  11. Web-based computer adaptive assessment of individual perceptions of job satisfaction for hospital workplace employees

    Directory of Open Access Journals (Sweden)

    Chen Shih-Chung

    2011-04-01

    Full Text Available Abstract Background: To develop a web-based computer adaptive testing (CAT) application for efficiently collecting data regarding workers' perceptions of job satisfaction, we examined whether a 37-item Job Content Questionnaire (JCQ-37) could evaluate the job satisfaction of individual employees as a single construct. Methods: The JCQ-37 makes data collection via CAT on the internet easy, viable and fast. A Rasch rating scale model was applied to analyze data from 300 randomly selected hospital employees who participated in job-satisfaction surveys in 2008 and 2009 via non-adaptive and computer-adaptive testing, respectively. Results: Of the 37 items on the questionnaire, 24 items fit the model fairly well. Person-separation reliability for the 2008 surveys was 0.88. Measures from both years and item-8 job satisfaction for groups were successfully evaluated through item-by-item analyses by using t-tests. Workers aged 26 - 35 felt that job satisfaction was significantly worse in 2009 than in 2008. Conclusions: A Web-CAT developed in the present paper was shown to be more efficient than traditional computer-based or pen-and-paper assessments at collecting data regarding workers' perceptions of job content.
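
    To make the Rasch machinery behind these two records concrete, here is a minimal sketch of the dichotomous Rasch model with a Newton-Raphson maximum-likelihood trait estimate, the basic building block of this kind of CAT scoring. The study itself applied a rating scale (polytomous) Rasch variant, and the item difficulties and responses below are hypothetical.

```python
import numpy as np

def rasch_prob(theta, b):
    """Dichotomous Rasch model: P(positive response | trait theta, difficulty b)."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

def ml_theta(responses, b, iters=20):
    """Newton-Raphson maximum-likelihood trait estimate from 0/1 responses.
    Assumes a mixed response pattern (all-0 or all-1 patterns have no finite MLE)."""
    theta = 0.0
    for _ in range(iters):
        p = rasch_prob(theta, b)
        grad = np.sum(responses - p)     # score function (first derivative)
        info = np.sum(p * (1.0 - p))     # Fisher information
        theta += grad / info             # Newton step
    return theta

b = np.array([-1.0, -0.3, 0.2, 0.8, 1.5])   # hypothetical item difficulties
resp = np.array([1, 1, 1, 0, 0])            # one respondent's answers
print(ml_theta(resp, b))                    # trait estimate on the logit scale
```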

  12. Are We Measuring Teachers’ Attitudes towards Computers in Detail?: Adaptation of a Questionnaire into Turkish Culture

    Directory of Open Access Journals (Sweden)

    Nilgün Günbaş

    2017-04-01

    Full Text Available Teachers’ perceptions of computers play an important role in integrating computers into education. The related literature includes studies developing or adapting a survey instrument in Turkish culture measuring teachers’ attitudes toward computers. These instruments have three to four factors (e.g., computer importance, computer enjoyment, computer confidence) and 18 to 26 items under these factors. The purpose of the present study is to adapt a more detailed and stronger survey questionnaire measuring more dimensions related to teachers’ attitudes. The source instrument was developed by Christensen and Knezek (2009) and is called Teachers’ Attitudes toward Computers (TAC). It has nine factors with 51 items. Before testing the instrument, the interaction (e-mail) factor was taken out because of cultural differences. The reliability and validity testing of the translated instrument was completed with 273 teacher candidates in a Faculty of Education in Turkey. The results showed that the translated instrument (Cronbach’s Alpha: .94) included eight factors and consisted of 42 items under these factors, which were consistent with the original instrument. These factors were: Interest (α: .83), Comfort (α: .90), Accommodation (α: .87), Concern (α: .79), Utility (α: .90), Perception (α: .89), Absorption (α: .84), and Significance (α: .83). Additionally, the confirmatory factor analysis result for the model with eight factors was: RMSEA=0.050, χ2/df=1.69, RMR=0.075, SRMR=0.057, GFI=0.81, AGFI=0.78, NFI=0.94, NNFI=0.97, CFI=0.97, IFI=0.97. Accordingly, as a reliable, valid and stronger instrument, the adapted survey instrument can be suggested for use in Turkish academic studies.

  13. Preoperative prism adaptation test in normosensoric strabismus

    NARCIS (Netherlands)

    A. Schildwächter-von Langenthal (Annette); G. Kommerell (Guntram); U. Klein (Ulrike); H.J. Simonsz (Huib)

    1989-01-01

    textabstractIn 19 patients with normosensoric esotropia, the squint angles measured with the alternate cover test were compared with those after prolonged prismatic correction of the squint angle and with those after prolonged occlusion of one eye. All patients showed an increase of the squint angle

  14. Item response times in computerized adaptive testing

    Directory of Open Access Journals (Sweden)

    Lutz F. Hornke

    2000-01-01

    Full Text Available Item response times in computerized adaptive testing. Computerized adaptive tests (CATs) yield scores and, at the same time, item response times. Research into the additional meaning that can be drawn from the information contained in response times is of particular interest. Data were available from 5912 young people who took a computerized adaptive test. Earlier studies report longer response times when answers are incorrect. This result was replicated in the present, larger study. However, mean item response times for wrong and correct answers do not support an interpretation different from that obtained from trait levels, nor do they correlate differently with a set of ability tests. Whether response times should be interpreted on the same dimension measured by the CAT or on other dimensions is discussed. Since the early 1930s, response times have been regarded as indicators of personality traits that should be distinguished from the traits measured by test scores. This idea is discussed, and arguments for and against it are offered. More recent model-based approaches are also presented. Whether additional diagnostic information can be obtained from a CAT with detailed, programmed data collection remains an open question.

  15. Highly adaptive tests for group differences in brain functional connectivity

    Directory of Open Access Journals (Sweden)

    Junghi Kim

    2015-01-01

    The proposed tests combine statistical evidence against a null hypothesis from multiple sources across a range of plausible tuning parameter values, reflecting uncertainty about the unknown truth. These highly adaptive tests are not only easy to use, but also robustly high-powered across various scenarios. The usage and advantages of these novel tests are demonstrated on an Alzheimer's disease dataset and simulated data.

  16. Computerized Adaptive Testing. A Case Study.

    Science.gov (United States)

    1980-12-01

    Vocational Interest Blank in 1927 [Dubois 1970]. 3. German Contributions. Cattell also studied for a period of three years in Leipzig under Wilhelm Wundt in the world's first psychological laboratory, founded by Wundt in 1879 [Heidbreder 1933]. (William James' laboratory, established at Harvard in 1875, did...) ...have become important parts of psychological test theory. Under Wundt, Spearman's principal endeavor was experimental psychology, but he also found time

  17. Configurable multiplier modules for an adaptive computing system

    Directory of Open Access Journals (Sweden)

    O. A. Pfänder

    2006-01-01

    Full Text Available The importance of reconfigurable hardware is increasing steadily. For example, the primary approach of using adaptive systems based on programmable gate arrays and configurable routing resources has gone mainstream and high-performance programmable logic devices are rivaling traditional application-specific hardwired integrated circuits. Also, the idea of moving from the 2-D domain into a 3-D design which stacks several active layers above each other is gaining momentum in research and industry, to cope with the demand for smaller devices with a higher scale of integration. However, optimized arithmetic blocks in coarse-grain reconfigurable arrays as well as field-programmable architectures still play an important role. In countless digital systems and signal processing applications, multiplication is one of the critical challenges, where in many cases a trade-off between area usage and data throughput has to be made. But the a priori choice of word-length and number representation can also be replaced by a dynamic choice at run-time, in order to improve flexibility, area efficiency and the level of parallelism in computation. In this contribution, we look at an adaptive computing system called 3-D-SoftChip to point out what parameters are crucial to implement flexible multiplier blocks into optimized elements for accelerated processing. The 3-D-SoftChip architecture uses a novel approach to 3-dimensional integration based on flip-chip bonding with indium bumps. The modular construction, the introduction of interfaces to realize the exchange of intermediate data, and the reconfigurable sign handling approach will be explained, as well as a beneficial way to handle and distribute the numerous required control signals.

  18. Computer board for radioactive ray test

    International Nuclear Information System (INIS)

    Zuo Mingfu

    1996-05-01

    The present status of the radioactive-ray test system for industrial applications and the newly designed computer board for overcoming the shortcomings of the current system are described. The functions, measurement principles and features of the board, as well as the test results for this board, are discussed. The board integrates many functions of the radioactive-ray test system, such as energy calibration, MCS, etc. It also provides many other subordinate practical functions such as motor control, ADC and so on. The board combines two sets of test parts into one and therefore composes a powerful unit for the system. Not only can it replace all units in a normal test system for signal analysis, signal processing, data management, and motor control, but it can also be used in more complex test systems, such as those for double source/double energy/double channel testing, multichannel testing, position testing and core positioning, etc. The board makes it easier for the test system to achieve miniaturization and computerization goals, and therefore improves the quality of the test and reduces the cost of the system. (10 refs., 8 figs.)

  19. Tutoring system for nondestructive testing using computer

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jin Koo; Koh, Sung Nam [Joong Ang Inspection Co.,Ltd., Seoul (Korea, Republic of); Shim, Yun Ju; Kim, Min Koo [Dept. of Computer Engineering, Aju University, Suwon (Korea, Republic of)

    1997-10-15

    This paper introduces a multimedia tutoring system for nondestructive testing using a personal computer. Nondestructive testing, one of the chief methods for inspecting welds and many other components, is very difficult for NDT inspectors to understand at its technical basis without wide experience, and considerable repeated education and training are necessary for them to maintain their knowledge. A tutoring system that can simulate NDT work is suggested to solve this problem under reasonable conditions. The tutoring system presents the basic theories of nondestructive testing in a book style with video images and hyperlinks, and it offers practice sessions in which users can simulate the testing equipment. The book-style material and simulation practice provide an effective and individualized environment for learning nondestructive testing.

  20. Tutoring system for nondestructive testing using computer

    International Nuclear Information System (INIS)

    Kim, Jin Koo; Koh, Sung Nam; Shim, Yun Ju; Kim, Min Koo

    1997-01-01

    This paper introduces a multimedia tutoring system for nondestructive testing using a personal computer. Nondestructive testing, one of the chief methods for inspecting welds and many other components, is very difficult for NDT inspectors to understand at its technical basis without wide experience, and considerable repeated education and training are necessary for them to maintain their knowledge. A tutoring system that can simulate NDT work is suggested to solve this problem under reasonable conditions. The tutoring system presents the basic theories of nondestructive testing in a book style with video images and hyperlinks, and it offers practice sessions in which users can simulate the testing equipment. The book-style material and simulation practice provide an effective and individualized environment for learning nondestructive testing.

  1. Water System Adaptation To Hydrological Changes: Module 11, Methods and Tools: Computational Models

    Science.gov (United States)

    This course will introduce students to the fundamental principles of water system adaptation to hydrological changes, with emphasis on data analysis and interpretation, technical planning, and computational modeling. Starting with real-world scenarios and adaptation needs, the co...

  2. Multidimensional Computerized Adaptive Testing for Indonesia Junior High School Biology

    Science.gov (United States)

    Kuo, Bor-Chen; Daud, Muslem; Yang, Chih-Wei

    2015-01-01

    This paper describes a curriculum-based multidimensional computerized adaptive test that was developed for Indonesian junior high school Biology. In adherence to the different Biology dimensions of the Indonesian curriculum, 300 items were constructed and then administered to 2238 students. A multidimensional random coefficients multinomial logit model was…

  3. Procedures for Selecting Items for Computerized Adaptive Tests.

    Science.gov (United States)

    Kingsbury, G. Gage; Zara, Anthony R.

    1989-01-01

    Several classical approaches and alternative approaches to item selection for computerized adaptive testing (CAT) are reviewed and compared. The study also describes procedures for constrained CAT that may be added to classical item selection approaches to allow them to be used for applied testing. (TJH)

  4. CUSUM-based person-fit statistics for adaptive testing

    NARCIS (Netherlands)

    van Krimpen-Stoop, Edith; Meijer, R.R.

    1999-01-01

    Item scores that do not fit an assumed item response theory model may cause the latent trait value to be estimated inaccurately. Several person-fit statistics for detecting nonfitting score patterns for paper-and-pencil tests have been proposed. In the context of computerized adaptive tests (CAT),

  5. CUSUM-based person-fit statistics for adaptive testing

    NARCIS (Netherlands)

    van Krimpen-Stoop, Edith; Meijer, R.R.

    2001-01-01

    Item scores that do not fit an assumed item response theory model may cause the latent trait value to be inaccurately estimated. Several person-fit statistics for detecting nonfitting score patterns for paper-and-pencil tests have been proposed. In the context of computerized adaptive tests (CAT),
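
    Although both records above are truncated, the CUSUM idea they build on is simple to sketch: accumulate standardized residuals between observed item scores and model-expected probabilities as the CAT proceeds, and flag response patterns whose upward or downward drift becomes large. The statistic below is a generic illustration, not the authors' exact formulation, and all numbers are hypothetical.

```python
import numpy as np

def cusum_person_fit(responses, p_expected):
    """Generic CUSUM person-fit sketch for a CAT administration.

    responses:  0/1 item scores in administration order
    p_expected: model-implied success probabilities at the interim
                ability estimates (assumed to be given here)

    Returns the upper and lower CUSUM statistics; a large positive C+ or
    a large negative C- flags a drifting, possibly misfitting pattern.
    """
    resid = (responses - p_expected) / np.sqrt(p_expected * (1.0 - p_expected))
    c_plus, c_minus = 0.0, 0.0
    for r in resid:
        c_plus = max(0.0, c_plus + r)     # accumulates unexpected successes
        c_minus = min(0.0, c_minus + r)   # accumulates unexpected failures
    return c_plus, c_minus

# Hypothetical 10-item pattern: early items correct, later items missed
resp = np.array([1, 1, 1, 1, 0, 0, 1, 0, 0, 0])
p = np.array([0.7, 0.6, 0.55, 0.5, 0.5, 0.5, 0.45, 0.5, 0.55, 0.6])
print(cusum_person_fit(resp, p))
```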

  6. Adaptive testing for making unidimensional and multidimensional classification decisions

    NARCIS (Netherlands)

    van Groen, M.M.

    2014-01-01

    Computerized adaptive tests (CATs) were originally developed to obtain an efficient estimate of the examinee’s ability, but they can also be used to classify the examinee into one of two or more levels (e.g. master/non-master). These computerized classification tests have the advantage that they can

  7. A procedure for empirical initialization of adaptive testing algorithms

    NARCIS (Netherlands)

    van der Linden, Willem J.

    1997-01-01

    In constrained adaptive testing, the numbers of constraints needed to control the content of the tests can easily run into the hundreds. Proper initialization of the algorithm becomes a requirement because the presence of large numbers of constraints slows down the convergence of the ability

  8. Adaptation of electrical conductivity test for Moringa oleifera seeds

    Directory of Open Access Journals (Sweden)

    Maria Luiza de Souza Medeiros

    2017-09-01

    Full Text Available This study aimed to adapt the electrical conductivity test and to evaluate its efficiency for the quality evaluation of Moringa oleifera Lam. seeds. For physiological characterization, four seed sets were evaluated by tests of germination, seedling emergence, emergence speed index, first count of emergence, seedling length, dry mass, and the cold test. The electrical conductivity test was carried out at 25 °C for 4, 8, 12, 16 and 24 h of immersion in 75 or 125 mL of distilled water using 25 or 50 seeds. A completely randomized design was used. The best results were obtained when using 50 seeds immersed in 75 mL or 125 mL of distilled water for 4 h. The electrical conductivity test adapted to moringa seeds was efficient in ranking sets of different vigor levels. The test may be used efficiently for the physiological quality evaluation of moringa seeds.

  9. A Design Methodology for Computer Security Testing

    OpenAIRE

    Ramilli, Marco

    2013-01-01

    The field of "computer security" is often considered something in between Art and Science. This is partly due to the lack of widely agreed and standardized methodologies to evaluate the degree of the security of a system. This dissertation intends to contribute to this area by investigating the most common security testing strategies applied nowadays and by proposing an enhanced methodology that may be effectively applied to different threat scenarios with the same degree of effectiveness. ...

  10. Adaptive linear rank tests for eQTL studies.

    Science.gov (United States)

    Szymczak, Silke; Scheinhardt, Markus O; Zeller, Tanja; Wild, Philipp S; Blankenberg, Stefan; Ziegler, Andreas

    2013-02-10

    Expression quantitative trait loci (eQTL) studies are performed to identify single-nucleotide polymorphisms that modify average expression values of genes, proteins, or metabolites, depending on the genotype. As expression values are often not normally distributed, statistical methods for eQTL studies should be valid and powerful in these situations. Adaptive tests are promising alternatives to standard approaches, such as the analysis of variance or the Kruskal-Wallis test. In a two-stage procedure, skewness and tail length of the distributions are estimated and used to select one of several linear rank tests. In this study, we compare two adaptive tests that were proposed in the literature using extensive Monte Carlo simulations of a wide range of different symmetric and skewed distributions. We derive a new adaptive test that combines the advantages of both literature-based approaches. The new test does not require the user to specify a distribution. It is slightly less powerful than the locally most powerful rank test for the correct distribution and at least as powerful as the maximin efficiency robust rank test. We illustrate the application of all tests using two examples from different eQTL studies.
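
    The two-stage selection scheme described here lends itself to a compact sketch: estimate skewness and tail weight from the pooled, median-centered data, then dispatch to a test suited to that shape. The thresholds and the particular menu of tests below are hypothetical placeholders rather than the authors' selector.

```python
import numpy as np
from scipy import stats

def adaptive_rank_test(groups):
    """Two-stage adaptive test (illustrative): estimate skewness and tail
    weight from the pooled, median-centered data, then choose a test suited
    to that shape. Thresholds and test menu are hypothetical placeholders."""
    pooled = np.concatenate([g - np.median(g) for g in groups])
    skew = stats.skew(pooled)
    tails = stats.kurtosis(pooled, fisher=False)

    if abs(skew) > 1.0:                         # markedly skewed values
        return "median", stats.median_test(*groups)[1]
    if tails > 5.0:                             # heavy-tailed distribution
        return "kruskal", stats.kruskal(*groups)[1]
    return "anova", stats.f_oneway(*groups)[1]  # near-normal case

# Hypothetical expression values for three genotype groups at one SNP
rng = np.random.default_rng(4)
groups = [rng.lognormal(m, 0.5, 40) for m in (0.0, 0.1, 0.3)]
print(adaptive_rank_test(groups))
```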

  11. Adaptive phase measurements in linear optical quantum computation

    International Nuclear Information System (INIS)

    Ralph, T C; Lund, A P; Wiseman, H M

    2005-01-01

    Photon counting induces an effective non-linear optical phase shift in certain states derived by linear optics from single photons. Although this non-linearity is non-deterministic, it is sufficient in principle to allow scalable linear optics quantum computation (LOQC). The most obvious way to encode a qubit optically is as a superposition of the vacuum and a single photon in one mode, so-called 'single-rail' logic. Until now this approach was thought to be prohibitively expensive (in resources) compared to 'dual-rail' logic, where a qubit is stored by a photon across two modes. Here we attack this problem with real-time feedback control, which can realize a quantum-limited phase measurement on a single mode, as has been recently demonstrated experimentally. We show that with this added measurement resource, the resource requirements for single-rail LOQC are not substantially different from those of dual-rail LOQC. In particular, with adaptive phase measurements an arbitrary qubit state α|0⟩ + β|1⟩ can be prepared deterministically

  12. Multiple Maximum Exposure Rates in Computerized Adaptive Testing

    Science.gov (United States)

    Ramon Barrada, Juan; Veldkamp, Bernard P.; Olea, Julio

    2009-01-01

    Computerized adaptive testing is subject to security problems, as the item bank content remains operative over long periods and administration time is flexible for examinees. Spreading the content of a part of the item bank could lead to an overestimation of the examinees' trait level. The most common way of reducing this risk is to impose a…
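
    A common way to impose maximum exposure rates of the kind this record refers to is a Sympson-Hetter-style probabilistic filter applied after information-based ranking. The sketch below is a generic illustration of that filter with per-item limits; it is not the specific multiple-maximum-rate procedure proposed in the article, and all names are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

def select_item(info, admissible, r_max):
    """Sympson-Hetter-style exposure filter (illustrative sketch).

    info:       item information at the current ability estimate
    admissible: boolean mask of items still eligible for this examinee
    r_max:      per-item maximum exposure parameters in (0, 1]; varying these
                limits across items stands in for multiple maximum rates
    """
    order = np.argsort(info)[::-1]                 # most informative first
    for i in order:
        if admissible[i] and rng.random() <= r_max[i]:
            return int(i)                          # probabilistically admitted
    # Fall back to the best admissible item if every candidate was filtered out
    return int(next(i for i in order if admissible[i]))
```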

  13. Using a micro computer based test bank

    International Nuclear Information System (INIS)

    Hamel, R.T.

    1987-01-01

    Utilizing a micro computer based test bank offers a training department many advantages and can have a positive impact upon training procedures and examination standards. Prior to data entry, Training Department management must pre-review the examination questions and answers to ensure compliance with examination standards and to verify the validity of all questions. Management must adhere to the TSD format since all questions require an enabling objective numbering scheme. Each question is entered under the enabling objective upon which it is based. Then the question is selected via the enabling objective. This eliminates any instructor bias because a random number generator chooses the test question. However, the instructor may load specific questions to create an emphasis theme for any test. The examination, answer and cover sheets are produced and printed within minutes. The test bank eliminates the large amount of time that is normally required for an instructor to formulate an examination. The need for clerical support is reduced by the elimination of typing examinations and also by the software's ability to maintain and generate student/course lists, attendance sheets, and grades. Software security measures limit access to the test bank, and the impromptu method used to generate and print an examination enhances its security

  14. Adaptation of spatial navigation tests to virtual reality.

    OpenAIRE

    Šupalová, Ivana

    2009-01-01

    Tests of spatial navigation in people are currently performed at the Department of Neurophysiology of Memory of the Academy of Sciences of the Czech Republic in an experimental real environment called the Blue Velvet Arena. The introduction of this thesis describes the importance of these tests for medical purposes and the current solution. The main aim is to adapt this real environment to virtual reality, allow its configuration, and enable the collection of data retrieved during an experiment's execution. Resulting sys...

  15. Basic emotions and adaptation. A computational and evolutionary model.

    Science.gov (United States)

    Pacella, Daniela; Ponticorvo, Michela; Gigliotta, Onofrio; Miglino, Orazio

    2017-01-01

    The core principles of the evolutionary theories of emotions declare that affective states represent crucial drives for action selection in the environment and regulate the behavior and adaptation of natural agents in ancestrally recurrent situations. While many different studies used autonomous artificial agents to simulate emotional responses and the way these patterns can affect decision-making, few are the approaches that tried to analyze the evolutionary emergence of affective behaviors directly from the specific adaptive problems posed by the ancestral environment. A model of the evolution of affective behaviors is presented using simulated artificial agents equipped with neural networks and physically inspired on the architecture of the iCub humanoid robot. We use genetic algorithms to train populations of virtual robots across generations, and investigate the spontaneous emergence of basic emotional behaviors in different experimental conditions. In particular, we focus on studying the emotion of fear, therefore the environment explored by the artificial agents can contain stimuli that are safe or dangerous to pick. The simulated task is based on classical conditioning and the agents are asked to learn a strategy to recognize whether the environment is safe or represents a threat to their lives and select the correct action to perform in absence of any visual cues. The simulated agents have special input units in their neural structure whose activation keeps track of their actual "sensations" based on the outcome of past behavior. We train five different neural network architectures and then test the best ranked individuals comparing their performances and analyzing the unit activations in each individual's life cycle. We show that the agents, regardless of the presence of recurrent connections, spontaneously evolved the ability to cope with a potentially dangerous environment by collecting information about the environment and then switching their behavior

  16. WEB-BASED ADAPTIVE TESTING SYSTEM (WATS FOR CLASSIFYING STUDENTS ACADEMIC ABILITY

    Directory of Open Access Journals (Sweden)

    Jaemu LEE,

    2012-08-01

    Full Text Available Computer Adaptive Testing (CAT) has been highlighted as a promising assessment method to fulfill two testing purposes: estimating student academic ability and classifying student academic level. In this paper, we introduce the Web-based Adaptive Testing System (WATS), developed to support a cost-effective assessment for classifying students’ ability into different academic levels. Instead of using a traditional paper and pencil test, the WATS is expected to serve as an alternative method to promptly diagnose and identify underachieving students through Web-based testing. The WATS can also help provide students with appropriate learning contents and necessary academic support in time. In this paper, the theoretical background and structure of WATS, the item construction process based upon item response theory, and the user interfaces of WATS are discussed.

  17. A Danish adaptation of the Boston Naming Test

    DEFF Research Database (Denmark)

    Jørgensen, Kasper; Johannsen, Peter; Vogel, Asmus

    2017-01-01

    Objective: The purpose of the present study was to develop a Danish adaptation of the Boston Naming Test (BNT) including a shortened 30-item version of the BNT for routine clinical use and two parallel 15-item versions for screening purposes. Method: The Danish adaptation of the BNT was based on ranking of items according to difficulty in a sample of older non-patients (n = 99). By selecting those items with the largest discrepancy in difficulty for non-patients compared to a mild Alzheimer’s disease (AD) sample (n = 53), the shortened versions of the BNT were developed. Using an overlapping

  18. Adaptive Test Schemes for Control of Paratuberculosis in Dairy Cows

    DEFF Research Database (Denmark)

    Kirkeby, Carsten Thure; Græsbøll, Kaare; Nielsen, Søren Saxmose

    2016-01-01

    Paratuberculosis can be controlled through a variety of test-strategies, but these are challenged by the lack of perfect tests. Frequent testing increases the sensitivity but the costs of testing are a cause of concern for farmers. Here, we used a herd simulation model using milk ELISA tests to evaluate the epidemiological and economic consequences of continuously adapting the sampling interval in response to the estimated true prevalence in the herd. The key results were that the true prevalence was greatly affected by the hygiene level and to some extent by the test-frequency. Furthermore, the choice of prevalence that will be tolerated in a control scenario had a major impact on the true prevalence in the normal hygiene setting, but less so when the hygiene was poor. The net revenue is not greatly affected by the test-strategy, because of the general variation in net revenues between farms. An exception to this is the low hygiene herd, where

  19. Numerical thermal mathematical model correlation to thermal balance test using adaptive particle swarm optimization (APSO)

    International Nuclear Information System (INIS)

    Beck, T.; Bieler, A.; Thomas, N.

    2012-01-01

    We present structural and thermal model (STM) tests of the BepiColombo laser altimeter (BELA) receiver baffle with emphasis on the correlation of the data with a thermal mathematical model. The test unit is a part of the thermal and optical protection of the BELA instrument being tested under infrared and solar irradiation at University of Bern. An iterative optimization method known as particle swarm optimization has been adapted to adjust the model parameters, mainly the linear conductivity, in such a way that model and test results match. The thermal model reproduces the thermal tests to an accuracy of 4.2 °C ± 3.2 °C in a temperature range of 200 °C after using only 600 iteration steps of the correlation algorithm. The use of this method brings major benefits to the accuracy of the results as well as to the computational time required for the correlation.
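
    For readers unfamiliar with the correlation technique, the following is a minimal particle swarm optimizer of the kind that could adjust a model's linear conductivities until simulated temperatures match test data. The inertia and acceleration constants, bounds, and the residual interface are assumptions for illustration; the adaptive variant (APSO) used in the paper additionally tunes such constants during the run.

```python
import numpy as np

rng = np.random.default_rng(3)

def pso_correlate(residual, dim, n_particles=30, iters=200,
                  w=0.7, c1=1.5, c2=1.5, bounds=(0.0, 1.0)):
    """Minimal (non-adaptive) particle swarm optimizer sketch for model
    correlation: `residual(p)` should return the model-vs-test temperature
    error norm for a parameter vector p of linear conductivities."""
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))      # particle positions
    v = np.zeros_like(x)                             # particle velocities
    pbest = x.copy()                                 # per-particle best
    pbest_f = np.array([residual(p) for p in x])
    gbest = pbest[np.argmin(pbest_f)]                # swarm-wide best

    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([residual(p) for p in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        gbest = pbest[np.argmin(pbest_f)]
    return gbest

# Hypothetical usage, given a thermal model and measured test temperatures:
#   best = pso_correlate(lambda p: np.linalg.norm(model(p) - measured), dim=5)
```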

  20. Testing for local adaptation in brown trout using reciprocal transplants

    Directory of Open Access Journals (Sweden)

    Stelkens Rike B

    2012-12-01

    Full Text Available Abstract Background: Local adaptation can drive the divergence of populations but identification of the traits under selection remains a major challenge in evolutionary biology. Reciprocal transplant experiments are ideal tests of local adaptation, yet rarely used for higher vertebrates because of the mobility and potential invasiveness of non-native organisms. Here, we reciprocally transplanted 2500 brown trout (Salmo trutta) embryos from five populations to investigate local adaptation in early life history traits. Embryos were bred in a full-factorial design and raised in natural riverbeds until emergence. Customized egg capsules were used to simulate the natural redd environment and allowed tracking the fate of every individual until retrieval. We predicted that (1) within sites, native populations would outperform non-natives, and (2) across sites, populations would show higher performance at ‘home’ compared to ‘away’ sites. Results: There was no evidence for local adaptation but we found large differences in survival and hatching rates between sites, indicative of considerable variation in habitat quality. Survival was generally high across all populations (55% ± 3%), but ranged from 4% to 89% between sites. Average hatching rate was 25% ± 3% across populations, ranging from 0% to 62% between sites. Conclusion: This study provides rare empirical data on variation in early life history traits in a population network of a salmonid, and large-scale breeding and transplantation experiments like ours provide powerful tests for local adaptation. Despite the recently reported genetic and morphological differences between the populations in our study area, local adaptation at the embryo level is small, non-existent, or confined to ecological conditions that our experiment could not capture.

  1. Dynamic leaching test of personal computer components.

    Science.gov (United States)

    Li, Yadong; Richardson, Jay B; Niu, Xiaojun; Jackson, Ollie J; Laster, Jeremy D; Walker, Aaron K

    2009-11-15

    A dynamic leaching test (DLT) was developed and used to evaluate the leaching of toxic substances from electronic waste in the environment. The major components in personal computers (PCs) including motherboards, hard disc drives, floppy disc drives, and compact disc drives were tested. The tests lasted for 2 years for motherboards and 1.5 years for the disc drives. The extraction fluids for the standard toxicity characteristic leaching procedure (TCLP) and synthetic precipitation leaching procedure (SPLP) were used as the DLT leaching solutions. A total of 18 elements including Ag, Al, As, Au, Ba, Be, Cd, Cr, Cu, Fe, Ga, Ni, Pd, Pb, Sb, Se, Sn, and Zn were analyzed in the DLT leachates. Only Al, Cu, Fe, Ni, Pb, and Zn were commonly found in the DLT leachates of the PC components. Their leaching levels were much higher in TCLP extraction fluid than in SPLP extraction fluid. The toxic heavy metal Pb was found to continuously leach out of the components over the entire test periods. The cumulative amounts of Pb leached out of the motherboards in TCLP extraction fluid reached 2.0 g per motherboard over the 2-year test period, and that in SPLP extraction fluid were 75-90% less. The leaching rates or levels of Pb were largely affected by the content of galvanized steel in the PC components. The higher the steel content, the lower the Pb leaching rate. The findings suggest that the obsolete PCs disposed of in landfills or discarded in the environment continuously release Pb for years when subjected to landfill leachate or rains.

  2. Basic emotions and adaptation. A computational and evolutionary model.

    Directory of Open Access Journals (Sweden)

    Daniela Pacella

    Full Text Available The core principles of the evolutionary theories of emotions hold that affective states represent crucial drives for action selection in the environment and regulate the behavior and adaptation of natural agents in ancestrally recurrent situations. While many different studies have used autonomous artificial agents to simulate emotional responses and the way these patterns can affect decision-making, few approaches have tried to analyze the evolutionary emergence of affective behaviors directly from the specific adaptive problems posed by the ancestral environment. A model of the evolution of affective behaviors is presented using simulated artificial agents equipped with neural networks and physically based on the architecture of the iCub humanoid robot. We use genetic algorithms to train populations of virtual robots across generations, and investigate the spontaneous emergence of basic emotional behaviors in different experimental conditions. In particular, we focus on the emotion of fear, so the environment explored by the artificial agents can contain stimuli that are safe or dangerous to pick. The simulated task is based on classical conditioning, and the agents are asked to learn a strategy to recognize whether the environment is safe or represents a threat to their lives and to select the correct action to perform in the absence of any visual cues. The simulated agents have special input units in their neural structure whose activation keeps track of their actual "sensations" based on the outcome of past behavior. We train five different neural network architectures and then test the best-ranked individuals, comparing their performances and analyzing the unit activations in each individual's life cycle. We show that the agents, regardless of the presence of recurrent connections, spontaneously evolved the ability to cope with potentially dangerous environments by collecting information about the environment and then

  3. A new adaptive testing algorithm for shortening health literacy assessments

    Directory of Open Access Journals (Sweden)

    Currie Leanne M

    2011-08-01

    Full Text Available Abstract Background Low health literacy has a detrimental effect on health outcomes, as well as ability to use online health resources. Good health literacy assessment tools must be brief to be adopted in practice; test development from the perspective of item-response theory requires pretesting on large participant populations. Our objective was to develop a novel classification method for developing brief assessment instruments that does not require pretesting on large numbers of research participants, and that would be suitable for computerized adaptive testing. Methods We present a new algorithm that uses principles of measurement decision theory (MDT) and Shannon's information theory. As a demonstration, we applied it to a secondary analysis of data sets from two assessment tests: a study that measured patients' familiarity with health terms (52 participants, 60 items) and a study that assessed health numeracy (165 participants, 8 items). Results In the familiarity data set, the method correctly classified 88.5% of the subjects, and the average length of test was reduced by about 50%. In the numeracy data set, for a two-class classification scheme, 96.9% of the subjects were correctly classified with a more modest reduction in test length of 35.7%; a three-class scheme correctly classified 93.8% with a 17.7% reduction in test length. Conclusions MDT-based approaches are a promising alternative to approaches based on item-response theory, and are well-suited for computerized adaptive testing in the health domain.
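
    To make the classification logic concrete, here is a minimal Python sketch of an MDT-style adaptive test (not the authors' published code): each item carries assumed P(correct | class) values, the next item is the one that most reduces the expected Shannon entropy of the class posterior, and testing stops once one class dominates. The item parameters, prior, and stopping threshold are all illustrative assumptions.

    ```python
    import numpy as np

    def entropy(p):
        """Shannon entropy (bits) of a discrete distribution."""
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    def expected_entropy(posterior, p_correct):
        """Expected posterior entropy after administering one item.

        posterior : (n_classes,) current class probabilities
        p_correct : (n_classes,) P(correct | class) for the item
        """
        h = 0.0
        for resp_prob in (p_correct, 1.0 - p_correct):    # correct / incorrect
            p_resp = float(np.dot(posterior, resp_prob))  # marginal response prob
            if p_resp > 0:
                h += p_resp * entropy(posterior * resp_prob / p_resp)
        return h

    def mdt_adaptive_test(p_correct_by_item, respond, prior, threshold=0.95):
        """Administer items until one class posterior exceeds the threshold.

        p_correct_by_item : (n_items, n_classes) array of P(correct | class)
        respond           : callable, item index -> observed 0/1 response
        """
        posterior = np.asarray(prior, dtype=float)
        remaining = set(range(len(p_correct_by_item)))
        while remaining and posterior.max() < threshold:
            # choose the item with the smallest expected posterior entropy
            item = min(remaining,
                       key=lambda i: expected_entropy(posterior, p_correct_by_item[i]))
            x = respond(item)
            lik = p_correct_by_item[item] if x == 1 else 1.0 - p_correct_by_item[item]
            posterior = posterior * lik / np.dot(posterior, lik)  # Bayes update
            remaining.discard(item)
        return int(posterior.argmax()), posterior
    ```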

  4. Computational identification of adaptive mutants using the VERT system

    Directory of Open Access Journals (Sweden)

    Winkler James

    2012-04-01

    Full Text Available Background Evolutionary dynamics of microbial organisms can now be visualized using the Visualizing Evolution in Real Time (VERT system, in which several isogenic strains expressing different fluorescent proteins compete during adaptive evolution and are tracked using fluorescent cell sorting to construct a population history over time. Mutations conferring enhanced growth rates can be detected by observing changes in the fluorescent population proportions. Results Using data obtained from several VERT experiments, we construct a hidden Markov-derived model to detect these adaptive events in VERT experiments without external intervention beyond initial training. Analysis of annotated data revealed that the model achieves consensus with human annotation for 85-93% of the data points when detecting adaptive events. A method to determine the optimal time point to isolate adaptive mutants is also introduced. Conclusions The developed model offers a new way to monitor adaptive evolution experiments without the need for external intervention, thereby simplifying adaptive evolution efforts relying on population tracking. Future efforts to construct a fully automated system to isolate adaptive mutants may find the algorithm a useful tool.
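
    The abstract does not spell out the model's structure, so the sketch below substitutes a generic two-state hidden Markov labelling (neutral drift vs. adaptive sweep) over per-interval changes in a strain's fluorescent population proportion, decoded with the Viterbi algorithm. The Gaussian emissions and all parameters are hypothetical stand-ins, not the published model.

    ```python
    import numpy as np
    from scipy.stats import norm

    def viterbi_adaptive_events(dprop, trans, means, sds, init=(0.5, 0.5)):
        """Label each interval 0 ('neutral drift') or 1 ('adaptive sweep').

        dprop      : per-interval changes in a strain's population proportion
        trans      : 2x2 state transition matrix
        means, sds : Gaussian emission parameters for the two states
        """
        n = len(dprop)
        logA = np.log(np.asarray(trans, dtype=float))
        logB = np.stack([norm.logpdf(dprop, means[s], sds[s]) for s in (0, 1)], axis=1)
        V = np.empty((n, 2))
        ptr = np.zeros((n, 2), dtype=int)
        V[0] = np.log(np.asarray(init, dtype=float)) + logB[0]
        for t in range(1, n):
            for s in (0, 1):
                cand = V[t - 1] + logA[:, s]
                ptr[t, s] = int(cand.argmax())
                V[t, s] = cand.max() + logB[t, s]
        states = np.zeros(n, dtype=int)
        states[-1] = int(V[-1].argmax())
        for t in range(n - 2, -1, -1):   # backtrack the best path
            states[t] = ptr[t + 1, states[t + 1]]
        return states
    ```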

  5. Nondestructive evaluation of low carbon steel by magnetic adaptive testing

    Czech Academy of Sciences Publication Activity Database

    Vértesy, G.; Tomáš, Ivan; Kobayashi, S.

    2010-01-01

    Vol. 25, No. 2 (2010), pp. 125-132 ISSN 1058-9759 R&D Projects: GA ČR GA102/06/0866; GA AV ČR 1QS100100508 Institutional research plan: CEZ:AV0Z10100520 Keywords: magnetic NDE * magnetic adaptive testing * steel * magnetic hysteresis Subject RIV: BM - Solid Matter Physics; Magnetism Impact factor: 0.771, year: 2010

  6. Nondestructive characterization of ductile cast iron by magnetic adaptive testing

    Czech Academy of Sciences Publication Activity Database

    Vértesy, G.; Uchimoto, T.; Tomáš, Ivan; Takagi, T.

    2010-01-01

    Vol. 322, No. 20 (2010), pp. 3117-3121 ISSN 0304-8853 R&D Projects: GA ČR GA101/09/1323; GA AV ČR 1QS100100508 Institutional research plan: CEZ:AV0Z10100520 Keywords: magnetic NDE * magnetic adaptive testing * magnetic hysteresis * cast iron Subject RIV: BM - Solid Matter Physics; Magnetism Impact factor: 1.689, year: 2010

  7. Temperature dependence of magnetic descriptors of Magnetic Adaptive Testing

    Czech Academy of Sciences Publication Activity Database

    Vértesy, G.; Uchimoto, T.; Tomáš, Ivan; Takagi, T.

    2010-01-01

    Vol. 46, No. 2 (2010), pp. 509-512 ISSN 0018-9464 R&D Projects: GA ČR GA101/09/1323; GA AV ČR 1QS100100508 Institutional research plan: CEZ:AV0Z10100520 Keywords: magnetic NDE * magnetic adaptive testing * magnetic hysteresis Subject RIV: BM - Solid Matter Physics; Magnetism Impact factor: 1.052, year: 2010

  8. A Framework for the Development of Computerized Adaptive Tests

    Directory of Open Access Journals (Sweden)

    Nathan A. Thompson

    2011-01-01

    Full Text Available A substantial amount of research has been conducted over the past 40 years on technical aspects of computerized adaptive testing (CAT), such as item selection algorithms, item exposure controls, and termination criteria. However, there is little literature providing practical guidance on the development of a CAT. This paper seeks to collate some of the available research methodologies into a general framework for the development of any CAT assessment.
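
    As a companion to such a framework, the core engine of any CAT fits in a few lines. The following is a minimal teaching sketch, assuming a two-parameter logistic (2PL) item bank, maximum-information item selection, EAP scoring on a quadrature grid, and a standard-error stopping rule; the grid, prior, and thresholds are illustrative choices, not prescriptions from the paper.

    ```python
    import numpy as np

    def p_2pl(theta, a, b):
        """Two-parameter logistic probability of a correct response."""
        return 1.0 / (1.0 + np.exp(-a * (theta - b)))

    def run_cat(a, b, respond, max_items=20, se_target=0.3):
        """Minimal CAT loop: maximum-information item selection with EAP
        scoring on a fixed quadrature grid and a standard-error stopping rule."""
        grid = np.linspace(-4.0, 4.0, 161)
        post = np.exp(-0.5 * grid**2)          # N(0, 1) prior, unnormalized
        post /= post.sum()
        remaining = set(range(len(a)))
        theta_hat, se = 0.0, np.inf
        for _ in range(max_items):
            theta_hat = float(np.dot(grid, post))    # current EAP estimate
            def info(i):                             # 2PL Fisher information
                p = p_2pl(theta_hat, a[i], b[i])
                return a[i] ** 2 * p * (1.0 - p)
            item = max(remaining, key=info)          # most informative item
            x = respond(item)                        # observed 0/1 response
            p = p_2pl(grid, a[item], b[item])
            post *= p if x == 1 else (1.0 - p)       # Bayes update on the grid
            post /= post.sum()
            remaining.discard(item)
            theta_hat = float(np.dot(grid, post))
            se = float(np.sqrt(np.dot((grid - theta_hat) ** 2, post)))
            if se <= se_target or not remaining:     # termination criteria
                break
        return theta_hat, se
    ```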

  9. Laboratory tests for assessing adaptability and stickiness of dental composites.

    Science.gov (United States)

    Rosentritt, Martin; Buczovsky, Sebastian; Behr, Michael; Preis, Verena

    2014-09-01

    The handling (stickiness, adaptability) of a dental composite strongly influences the quality and success of a dental restoration. The purpose was to develop an in vitro test that allows adaptability and stickiness to be evaluated. Fifteen dentists were asked to provide individual assessments (school scores 1-6) of five dental composites with respect to adaptability and stickiness. Composites were applied with a dental plugger (d=1.8 mm) in a class I cavity (human tooth 17). The tooth was fixed on a force gauge for simultaneous determination of application forces at varying storage (6/25°C) and application temperatures (6/25°C). On the basis of these data, tensile tests were performed with a dental plugger (application force 1 N/2 N; v=35 mm/min) on PMMA or human tooth plates. Composite was dosed onto the tip of the plugger and applied. Application and unplugging were performed once, and unplugging forces (UF) and the length of the adhesive flags (LAF) were determined at different storage (6/25°C) and application temperatures (25/37°C). Unplugging work (UW) was calculated from the area of the UF and LAF data. The individual assessment revealed significantly different temperature-dependent application forces between 0.58 N and 2.23 N. Adaptability was assessed between 2.1 and 2.8 school scores. Stickiness varied significantly between the materials (scores: 2-3.2). UW differed significantly between the materials, with values between 3.20 N mm and 37.83 N mm. Only small UW differences were found between PMMA and tooth substrates, and between 1 N and 2 N application forces. The presented unplugging-work measurement allows for an in vitro estimation of the handling parameters adaptability and stickiness. Copyright © 2014 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.

  10. Rapid Computation of Thermodynamic Properties over Multidimensional Nonbonded Parameter Spaces Using Adaptive Multistate Reweighting.

    Science.gov (United States)

    Naden, Levi N; Shirts, Michael R

    2016-04-12

    We show how thermodynamic properties of molecular models can be computed over a large, multidimensional parameter space by combining multistate reweighting analysis with a linear basis function approach. This approach reduces the computational cost to estimate thermodynamic properties from molecular simulations for over 130,000 tested parameter combinations from over 1000 CPU years to tens of CPU days. This speed increase is achieved primarily by computing the potential energy as a linear combination of basis functions, computed from either modified simulation code or as the difference of energy between two reference states, which can be done without any simulation code modification. The thermodynamic properties are then estimated with the Multistate Bennett Acceptance Ratio (MBAR) as a function of multiple model parameters without the need to define a priori how the states are connected by a pathway. Instead, we adaptively sample a set of points in parameter space to create mutual configuration space overlap. The existence of regions of poor configuration space overlap is detected by analyzing the eigenvalues of the sampled states' overlap matrix. The configuration space overlap to sampled states is monitored alongside the mean and maximum uncertainty to determine convergence, as neither the uncertainty nor the configuration space overlap alone is a sufficient metric of convergence. This adaptive sampling scheme is demonstrated by estimating with high precision the solvation free energies of charged particles of Lennard-Jones plus Coulomb functional form with charges between -2 and +2 and generally physical values of σij and ϵij in TIP3P water. We also compute entropy, enthalpy, and radial distribution functions of arbitrary unsampled parameter combinations using only the data from these sampled states and use the estimates of free energies over the entire space to examine the deviation of atomistic simulations from the Born approximation to the solvation free energy.
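
    The central trick, computing the potential energy at unsampled parameter sets as a linear combination of basis-function energies, reduces to a matrix product. The sketch below illustrates it together with a simple one-state exponential-reweighting (Zwanzig) estimate standing in for the multistate MBAR estimator the paper actually uses; the array shapes and inverse temperature are assumptions for illustration.

    ```python
    import numpy as np

    def energies_at_parameters(basis_energies, h):
        """U(x; lam) = sum_i h_i(lam) * E_i(x) for all parameter sets at once.

        basis_energies : (n_basis, n_samples) per-configuration basis terms E_i(x)
        h              : (n_params, n_basis) basis coefficients per parameter set
        """
        return h @ basis_energies  # (n_params, n_samples)

    def reweighted_free_energy(u_sampled, u_target, beta=1.0):
        """One-state exponential reweighting (Zwanzig) from a sampled state,
        standing in for the multistate MBAR estimator used in the paper."""
        du = np.asarray(u_target) - np.asarray(u_sampled)
        m = float((-beta * du).max())  # log-sum-exp stabilization
        return -(m + np.log(np.mean(np.exp(-beta * du - m)))) / beta
    ```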

  11. Adapting computational text analysis to social science (and vice versa

    Directory of Open Access Journals (Sweden)

    Paul DiMaggio

    2015-11-01

    Full Text Available Social scientists and computer scientists are divided by small differences in perspective and not by any significant disciplinary divide. In the field of text analysis, several such differences are noted: social scientists often use unsupervised models to explore corpora, whereas many computer scientists employ supervised models trained on data; social scientists hold to more conventional causal notions than do most computer scientists, and often favor intense exploitation of existing algorithms, whereas computer scientists focus more on developing new models; and computer scientists tend to trust human judgment more than social scientists do. These differences have implications that potentially can improve the practice of social science.

  12. Using response-time constraints in item selection to control for differential speededness in computerized adaptive testing

    NARCIS (Netherlands)

    van der Linden, Willem J.; Scrams, David J.; Schnipke, Deborah L.

    2003-01-01

    This paper proposes an item selection algorithm that can be used to neutralize the effect of time limits in computer adaptive testing. The method is based on a statistical model for the response-time distributions of the test takers on the items in the pool that is updated each time a new item has been administered.

  13. Adaptive Test Schemes for Control of Paratuberculosis in Dairy Cows.

    Directory of Open Access Journals (Sweden)

    Carsten Kirkeby

    Full Text Available Paratuberculosis is a chronic infection that in dairy cattle causes reduced milk yield, weight loss, and ultimately fatal diarrhea. Subclinical animals can excrete bacteria (Mycobacterium avium ssp. paratuberculosis, MAP) in feces and infect other animals. Farmers identify the infectious animals through a variety of test strategies, but are challenged by the lack of perfect tests. Frequent testing increases the sensitivity, but the costs of testing are a cause of concern for farmers. Here, we used a herd simulation model based on milk ELISA tests to evaluate the epidemiological and economic consequences of continuously adapting the sampling interval in response to the estimated true prevalence in the herd. The key results were that the true prevalence was greatly affected by the hygiene level and to some extent by the test frequency. Furthermore, the choice of prevalence that will be tolerated in a control scenario had a major impact on the true prevalence in the normal hygiene setting, but less so when the hygiene was poor. The net revenue is not greatly affected by the test strategy, because of the general variation in net revenues between farms. An exception to this is the low hygiene herd, where frequent testing results in lower revenue. The probability of eradication is correlated with the testing frequency and the target prevalence during the control phase. It is low in the low hygiene herd, and a test-and-cull strategy should probably not be the primary strategy in this herd. Based on this study we suggest that, in order to control MAP, the standard Danish dairy farm should use an adaptive strategy where a short sampling interval of three months is used when the estimated true prevalence is above 1%, and otherwise use a long sampling interval of one year.
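
    The suggested adaptive rule is simple enough to state directly in code. The threshold and intervals below are the ones quoted in the abstract; everything else about how the prevalence estimate is produced is outside this sketch.

    ```python
    def next_sampling_interval_months(estimated_true_prevalence):
        """Adaptive scheme from the abstract: test every 3 months while the
        estimated true within-herd prevalence exceeds 1%, otherwise yearly."""
        return 3 if estimated_true_prevalence > 0.01 else 12
    ```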

  14. A case study of evolutionary computation of biochemical adaptation

    International Nuclear Information System (INIS)

    François, Paul; Siggia, Eric D

    2008-01-01

    Simulations of evolution have a long history, but their relation to biology is questioned because of the perceived contingency of evolution. Here we provide an example of a biological process, adaptation, where simulations are argued to approach closer to biology. Adaptation is a common feature of sensory systems, and a plausible component of other biochemical networks because it rescales upstream signals to facilitate downstream processing. We create random gene networks numerically, by linking genes with interactions that model transcription, phosphorylation and protein–protein association. We define a fitness function for adaptation in terms of two functional metrics, and show that any reasonable combination of them will yield the same adaptive networks after repeated rounds of mutation and selection. Convergence to these networks is driven by positive selection and thus fast. There is always a path in parameter space of continuously improving fitness that leads to perfect adaptation, implying that the actual mutation rates we use in the simulation do not bias the results. Our results imply a kinetic view of evolution, i.e., it favors gene networks that can be learned quickly from the random examples supplied by mutation. This formulation allows for deductive predictions of the networks realized in nature

  15. Adaptive weighted anisotropic diffusion for computed tomography denoising

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Zhi; Silver, Michael D. [Toshiba Medical Research Institute USA, Inc., Vernon Hills, IL (United States); Noshi, Yasuhiro [Toshiba Medical System Corporation, Tokyo (Japan)

    2011-07-01

    With increasing awareness of radiation safety, dose reduction has become an important task of modern CT system development. This paper proposes an adaptive weighted anisotropic diffusion method and an adaptive weighted sharp source anisotropic diffusion method as image domain filters to potentially help dose reduction. Different from existing anisotropic diffusion methods, the proposed methods incorporate an edge-sensitive adaptive source term as part of the diffusion iteration. It provides better edge and detail preservation. Visual evaluation showed that the new methods can reduce noise substantially without apparent edge and detail loss. The quantitative evaluations also showed over 50% of noise reduction in terms of noise standard deviations, which is equivalent to over 75% of dose reduction for a normal dose image quality. (orig.)
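
    The paper's filter itself is proprietary, but the underlying idea, anisotropic diffusion augmented with an edge-sensitive source (fidelity) term, can be illustrated with a classical Perona-Malik core. The Python sketch below uses periodic borders and hypothetical parameter values for brevity and is not the authors' algorithm.

    ```python
    import numpy as np

    def anisotropic_diffusion(img, n_iter=50, kappa=30.0, dt=0.15, fidelity=0.05):
        """Perona-Malik diffusion plus a crude edge-weighted source term that
        pulls the solution back toward the observed image near edges.
        Periodic borders (np.roll) are used for brevity."""
        u = img.astype(float).copy()
        f = img.astype(float)

        def g(d):  # Perona-Malik conductance: small across strong edges
            return np.exp(-(d / kappa) ** 2)

        for _ in range(n_iter):
            dN = np.roll(u, 1, axis=0) - u
            dS = np.roll(u, -1, axis=0) - u
            dE = np.roll(u, -1, axis=1) - u
            dW = np.roll(u, 1, axis=1) - u
            grad = np.sqrt(dN**2 + dS**2 + dE**2 + dW**2)
            w = 1.0 - np.exp(-(grad / kappa) ** 2)  # ~1 near edges, ~0 in flat areas
            u = (u + dt * (g(dN) * dN + g(dS) * dS + g(dE) * dE + g(dW) * dW)
                 + fidelity * w * (f - u))
        return u
    ```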

  16. An adaptable Boolean net trainable to control a computing robot

    International Nuclear Information System (INIS)

    Lauria, F. E.; Prevete, R.; Milo, M.; Visco, S.

    1999-01-01

    We discuss a method to implement a Hebbian rule in a Boolean neural network so as to obtain an adaptable universal control system. We start by presenting the Boolean neural net and the Hebbian rule we have considered. We then discuss, first, the problems arising when the latter is naively implemented in a Boolean neural net and, second, the method that allows us to overcome them and the ensuing adaptable Boolean neural net paradigm. Next, we present the adaptable Boolean neural net as an intelligent control system, actually controlling a writing robot, and discuss how to train it in the execution of the elementary arithmetic operations on operands represented by numerals with an arbitrary number of digits.

  17. A shape and mesh adaptive computational methodology for gamma ray dose from volumetric sources

    International Nuclear Information System (INIS)

    Mirza, N.M.; Ali, B.; Mirza, S.M.; Tufail, M.; Ahmad, N.

    1991-01-01

    Indoor external exposure of the population is dominated by gamma rays emitted from the walls and the floor of a room. A shape and mesh size adaptive flux calculational approach has been developed for a typical wall source. Parametric studies of the effect of mesh size on flux calculations have been carried out. The optimum mesh size is found to depend strongly on the distance from the source, the permissible limits on uncertainty in flux predictions, and the computer Central Processing Unit time. To test the computations, a typical wall source was reduced to a point, a line, and an infinite volume source of finite thickness, and the computed flux values were compared with values from the corresponding analytical expressions for these sources. Results indicate that, under optimum conditions, the errors remain less than 6% for fluxes calculated with this approach compared with the analytical values for the point and line source approximations. When the wall is simulated as an infinite volume source of finite thickness, the errors in the computed-to-analytical flux ratios remain large for small wall dimensions, but become less than 10% when the wall dimensions exceed ten mean free paths for 3 MeV gamma rays. Specific dose rates from this methodology also remain within 15% of the values obtained by the Monte Carlo method. (author)
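
    For reference, the point-source benchmark mentioned above compares the computed flux against the standard analytical expression φ(r) = B(μr)·S·e^(−μr)/(4πr²). A minimal version, with the buildup factor B left as a caller-supplied assumption (B = 1 gives the uncollided flux):

    ```python
    import numpy as np

    def point_source_flux(S, r, mu, buildup=lambda mu_r: 1.0):
        """Gamma flux from an isotropic point source of strength S at distance r
        in a medium with attenuation coefficient mu:
        phi(r) = B(mu*r) * S * exp(-mu*r) / (4*pi*r**2)."""
        return buildup(mu * r) * S * np.exp(-mu * r) / (4.0 * np.pi * r**2)
    ```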

  18. Efficient computation in adaptive artificial spiking neural networks

    NARCIS (Netherlands)

    D. Zambrano (Davide); R.B.P. Nusselder (Roeland); H.S. Scholte; S.M. Bohte (Sander)

    2017-01-01

    Artificial Neural Networks (ANNs) are bio-inspired models of neural computation that have proven highly effective. Still, ANNs lack a natural notion of time, and neural units in ANNs exchange analog values in a frame-based manner, a computationally and energetically inefficient form of communication.

  19. Molecular determinants of enzyme cold adaptation: comparative structural and computational studies of cold- and warm-adapted enzymes.

    Science.gov (United States)

    Papaleo, Elena; Tiberti, Matteo; Invernizzi, Gaetano; Pasi, Marco; Ranzani, Valeria

    2011-11-01

    The identification of molecular mechanisms underlying enzyme cold adaptation is a hot topic for both fundamental research and industrial applications. In the present contribution, we review the last decades of structural and computational investigations on cold-adapted enzymes in comparison to their warm-adapted counterparts. Comparative sequence and structural studies allow the definition of a multitude of adaptation strategies. Different enzymes have evolved diverse mechanisms to adapt to low temperatures, so that a general theory of enzyme cold adaptation cannot be formulated. However, some common features can be traced in the dynamic and flexibility properties of these enzymes, as well as in their intra- and inter-molecular interaction networks. Interestingly, the current data suggest that a family-centered point of view is necessary in the comparative analyses of cold- and warm-adapted enzymes. In fact, enzymes belonging to the same family or superfamily, thus sharing at least the three-dimensional fold and common features of the functional sites, have evolved similar structural and dynamic patterns to overcome the detrimental effects of low temperatures.

  20. Improvement of defect characterization in ultrasonic testing by adaptative learning network

    International Nuclear Information System (INIS)

    Bieth, M.; Adamonis, D.C.; Jusino, A.

    1982-01-01

    Numerous methods now exist for signal analysis in ultrasonic testing. These methods give more or less accurate information for defect characterization. This paper presents the development of a particular system based on computer signal processing, the Adaptive Learning Network (ALN), which allows defects to be discriminated according to their nature. The ultrasonic signal is sampled and characterized by amplitude-time and amplitude-frequency parameters. The method was tested on stainless steel tube welds showing fatigue cracks. The ALN model developed allows, under certain conditions, the discrimination of cracks from other defects.

  1. TESTING THE APODIZED PUPIL LYOT CORONAGRAPH ON THE LABORATORY FOR ADAPTIVE OPTICS EXTREME ADAPTIVE OPTICS TESTBED

    International Nuclear Information System (INIS)

    Thomas, Sandrine J.; Dillon, Daren; Gavel, Donald; Soummer, Remi; Macintosh, Bruce; Sivaramakrishnan, Anand

    2011-01-01

    We present testbed results of the Apodized Pupil Lyot Coronagraph (APLC) at the Laboratory for Adaptive Optics (LAO). These results are part of the validation and tests of the coronagraph and of the Extreme Adaptive Optics (ExAO) for the Gemini Planet Imager (GPI). The apodizer component is manufactured with a halftone technique using black chrome microdots on glass. Testing this APLC (like any other coronagraph) requires extremely good wavefront correction, which is obtained to the 1 nm rms level using microelectromechanical systems (MEMS) technology, on the ExAO visible testbed of the LAO at the University of California, Santa Cruz. We used an APLC coronagraph without central obstruction, both with a reference super-polished flat mirror and with the MEMS, to obtain one of the first images of a dark zone in a coronagraphic image with classical adaptive optics using a MEMS deformable mirror (without involving dark hole algorithms). This was done as a complementary test to the GPI coronagraph testbed at the American Museum of Natural History, which studied the coronagraph itself without wavefront correction. Because we needed a full aperture, the coronagraph design is very different from the GPI design. We also tested a coronagraph with a central obstruction similar to that of GPI. We investigated the performance of the APLC coronagraph and more particularly the effect of the apodizer profile accuracy on the contrast. Finally, we compared the resulting contrast to predictions made with a wavefront propagation model of the testbed to understand the effects of phase and amplitude errors on the final contrast.

  2. Practical Considerations about Expected A Posteriori Estimation in Adaptive Testing: Adaptive A Priori, Adaptive Correction for Bias, and Adaptive Integration Interval.

    Science.gov (United States)

    Raiche, Gilles; Blais, Jean-Guy

    In a computerized adaptive test (CAT), it would be desirable to obtain an acceptable precision of the proficiency level estimate using an optimal number of items. Decreasing the number of items is accompanied, however, by a certain degree of bias when the true proficiency level differs significantly from the a priori estimate. G. Raiche (2000) has…

  3. Applications of decision theory to computer-based adaptive instructional systems

    NARCIS (Netherlands)

    Vos, Hendrik J.

    1988-01-01

    This paper considers applications of decision theory to the problem of instructional decision-making in computer-based adaptive instructional systems, using the Minnesota Adaptive Instructional System (MAIS) as an example. The first section indicates how the problem of selecting the appropriate

  4. Development of a computerized adaptive test for Schizotypy assessment.

    Directory of Open Access Journals (Sweden)

    Eduardo Fonseca-Pedrero

    Full Text Available BACKGROUND: Schizotypal traits in adolescents from the general population represent the behavioral expression of liability for psychotic disorders. Schizotypy assessment in this sector of the population has advanced considerably in the last few years; however, it is necessary to incorporate recent advances in psychological and educational measurement. OBJECTIVE: The main goal of this study was to develop a Computerized Adaptive Test (CAT) to evaluate schizotypy through "The Oviedo Questionnaire for Schizotypy Assessment" (ESQUIZO-Q) in non-clinical adolescents. METHODS: The final sample consisted of 3,056 participants, 1,469 males, with a mean age of 15.9 years (SD=1.2). RESULTS: The results indicated that the ESQUIZO-Q scores presented adequate psychometric properties under both Classical Test Theory and Item Response Theory. The Information Function estimated using the Graded Response Model indicated that the item pool effectively assesses schizotypy at the high end of the latent trait. The correlation between the CAT total scores and the paper-and-pencil test was 0.92. The mean number of items presented in the CAT with the standard error fixed at ≤ 0.30 was 34. CONCLUSION: The CAT showed adequate psychometric properties for schizotypy assessment in the general adolescent population. The ESQUIZO-Q adaptive version could be used as a screening method for the detection of adolescents at risk for psychosis in both educational and mental health settings.

  5. Grid computing faces IT industry test

    CERN Multimedia

    Magno, L

    2003-01-01

    Software company Oracle Corp. unveiled its Oracle 10g grid computing platform at the annual OracleWorld user convention in San Francisco. It gave concrete examples of how grid computing can be a viable option outside the scientific community where the concept was born (1 page).

  6. Adapting a computer-delivered brief alcohol intervention for veterans with Hepatitis C.

    Science.gov (United States)

    Cucciare, Michael A; Jamison, Andrea L; Combs, Ann S; Joshi, Gauri; Cheung, Ramsey C; Rongey, Catherine; Huggins, Joe; Humphreys, Keith

    2017-12-01

    This study adapted an existing computer-delivered brief alcohol intervention (cBAI) for use in Veterans with the hepatitis C virus (HCV) and examined its acceptability and feasibility in this patient population. A four-stage model consisting of initial pilot testing, qualitative interviews with key stakeholders, development of a beta version of the cBAI, and usability testing was used to achieve the study objectives. In-depth interviews gathered feedback for modifying the cBAI, including adding HCV-related content such as the health effects of alcohol on liver functioning, immune system functioning, and management of HCV, a preference for concepts to be displayed through "newer looking" graphics, and limiting the use of text to convey key concepts. Results from usability testing indicated that the modified cBAI was acceptable and feasible for use in this patient population. The development model used in this study is effective for gathering actionable feedback that can inform the development of a cBAI and can result in the development of an acceptable and feasible intervention for use in this population. Findings also have implications for developing computer-delivered interventions targeting behavior change more broadly.

  7. 3D Printing device adaptable to Computer Numerical Control (CNC)

    OpenAIRE

    GARDAN, Julien; Danesi, F.; Roucoules, Lionel; Schneider, A.

    2014-01-01

    This article presents the development of a 3D printing device for additive manufacturing adapted to CNC machining. The application involves the integration of a specific printing head. Additive manufacturing technology is most commonly used for modeling, prototyping, and tooling through an exclusive machine or 3D printer. A global review and analysis of technologies shows that additive manufacturing offers few independent solutions [6][9]. The problem studied especially the additive manu...

  8. An Adaptive Genetic Association Test Using Double Kernel Machines.

    Science.gov (United States)

    Zhan, Xiang; Epstein, Michael P; Ghosh, Debashis

    2015-10-01

    Recently, gene set-based approaches have become very popular in gene expression profiling studies for assessing how genetic variants are related to disease outcomes. Since most genes are not differentially expressed, existing pathway tests considering all genes within a pathway suffer from considerable noise and power loss. Moreover, for a differentially expressed pathway, it is of interest to select important genes that drive the effect of the pathway. In this article, we propose an adaptive association test using double kernel machines (DKM), which can both select important genes within the pathway as well as test for the overall genetic pathway effect. This DKM procedure first uses the garrote kernel machines (GKM) test for the purposes of subset selection and then the least squares kernel machine (LSKM) test for testing the effect of the subset of genes. An appealing feature of the kernel machine framework is that it can provide a flexible and unified method for multi-dimensional modeling of the genetic pathway effect allowing for both parametric and nonparametric components. This DKM approach is illustrated with application to simulated data as well as to data from a neuroimaging genetics study.
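
    A stripped-down flavor of the kernel-machine association idea can be conveyed with a permutation test on the score statistic Q = r'Kr, where r is the centred phenotype and K a kernel over the gene set. The sketch below is a simplification, not the paper's GKM/LSKM procedure with its analytic null distributions; the RBF kernel, bandwidth, and permutation count are assumptions.

    ```python
    import numpy as np

    def rbf_kernel(X, gamma=1.0):
        """Gaussian (RBF) kernel matrix between rows of X."""
        sq = np.sum(X ** 2, axis=1)
        d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
        return np.exp(-gamma * d2)

    def kernel_association_test(X, y, gamma=1.0, n_perm=999, seed=0):
        """Permutation p-value for the kernel-machine score statistic
        Q = r' K r, with r the centred phenotype vector."""
        rng = np.random.default_rng(seed)
        K = rbf_kernel(np.asarray(X, dtype=float), gamma)
        r = np.asarray(y, dtype=float) - np.mean(y)
        q_obs = r @ K @ r
        hits = 0
        for _ in range(n_perm):
            rp = rng.permutation(r)      # break the genotype-phenotype link
            if rp @ K @ rp >= q_obs:
                hits += 1
        return (hits + 1) / (n_perm + 1)  # add-one permutation p-value
    ```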

  9. Methodological testing: Are fast quantum computers illusions?

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, Steven [Tachyon Design Automation, San Francisco, CA (United States)

    2013-07-01

    Popularity of the idea of computers constructed from the principles of QM started with Feynman's 'Lectures On Computation', but he called the idea crazy and dependent on statistical mechanics. In 1987, Feynman published a paper in 'Quantum Implications - Essays in Honor of David Bohm' on negative probabilities, which he said gave him cultural shock. The problem with imagined fast quantum computers (QC) is that speed requires both statistical behavior and truth of the mathematical formalism. The Swedish Royal Academy 2012 Nobel Prize in physics press release touted the discovery of methods to control "individual quantum systems", to "measure and control very fragile quantum states", which enables "first steps towards building a new type of super fast computer based on quantum physics." A number of examples where widely accepted mathematical descriptions have turned out to be problematic are examined: problems with the use of oracles in P=NP computational complexity, Paul Finsler's proof of the continuum hypothesis, and Turing's Enigma code breaking versus William Tutte's Colossus. I view QC research as faith in computational oracles with wished-for properties. Arthur Fine's interpretation in 'The Shaky Game' of Einstein's skepticism toward QM is discussed. If Einstein's reality as space-time curvature is correct, then space-time computers will be the next type of super fast computer.

  10. Genre-adaptive Semantic Computing and Audio-based Modelling for Music Mood Annotation

    DEFF Research Database (Denmark)

    Saari, Pasi; Fazekas, György; Eerola, Tuomas

    2016-01-01

    This study investigates whether taking genre into account is beneficial for automatic music mood annotation in terms of the core affects valence, arousal, and tension, as well as several other mood scales. Novel techniques employing genre-adaptive semantic computing and audio-based modelling are proposed and evaluated on data related to a set of 600 popular music tracks spanning multiple genres. The results show that ACTwg outperforms a semantic computing technique that does not exploit genre information, and ACTwg-SLPwg outperforms conventional techniques and other genre-adaptive alternatives.

  11. Complex adaptative systems and computational simulation in Archaeology

    Directory of Open Access Journals (Sweden)

    Salvador Pardo-Gordó

    2017-07-01

    Full Text Available Traditionally the concept of ‘complexity’ is used as a synonym for ‘complex society’, i.e., human groups with characteristics such as urbanism, inequalities, and hierarchy. The introduction of Nonlinear Systems and Complex Adaptive Systems to the discipline of archaeology has nuanced this concept. This theoretical turn has led to the rise of modelling as a method of analysis of historical processes. This work has a twofold objective: to present the theoretical current characterized by generative thinking in archaeology and to present a concrete application of agent-based modelling to an archaeological problem: the dispersal of the first ceramic production in the western Mediterranean.

  12. Privacy context model for dynamic privacy adaptation in ubiquitous computing

    NARCIS (Netherlands)

    Schaub, Florian; Koenings, Bastian; Dietzel, Stefan; Weber, M.; Kargl, Frank

    Ubiquitous computing is characterized by the merger of physical and virtual worlds as physical artifacts gain digital sensing, processing, and communication capabilities. Maintaining an appropriate level of privacy in the face of such complex and often highly dynamic systems is challenging. We argue

  13. Indirect versus direct feedback in computer-based Prism Adaptation Therapy

    DEFF Research Database (Denmark)

    Wilms, Inge Linda; Rytter, Hana Malá

    2010-01-01

    Prism Adaptation Therapy (PAT) is an intervention method in the treatment of the attention disorder neglect (Frassinetti, Angeli, Meneghello, Avanzi, & Ladavas, 2002; Rossetti, et al., 1998). The aim of this study was to investigate whether one session of PAT using a computer-attached touchscreen...... in the aftereffect. The findings have direct implications for future implementations of computer-based methods of treatment of visuospatial disorders and computer-assisted rehabilitation in general....

  15. Failing Tests: Commentary on "Adapting Educational Measurement to the Demands of Test-Based Accountability"

    Science.gov (United States)

    Thissen, David

    2015-01-01

    In "Adapting Educational Measurement to the Demands of Test-Based Accountability" Koretz takes the time-honored engineering approach to educational measurement, identifying specific problems with current practice and proposing minimal modifications of the system to alleviate those problems. In response to that article, David Thissen…

  16. Design and tests of an adaptive focusing neutron guide

    International Nuclear Information System (INIS)

    Valicu, Roxana Georgiana

    2012-01-01

    This work contains the Monte Carlo simulations, as well as the first tests, of an adaptive focusing neutron guide for creating a focus that does not depend on the wavelength of the incoming neutrons. All known neutron guides have a rectangular cross-section built out of four glass plates. The inner side of the guide is coated with a complex structure of metal layers, which reflects and guides the neutrons (in analogy with the reflection of light). For beam focusing, neutron guides with fixed curvature can be built. For most experiments it is important that the beam is focused onto a small surface of the sample. In the case of focusing guides with fixed curvature, it has been observed that the focusing (dimension and position of the beam focus) is wavelength dependent. For measurements performed at different wavelengths it is therefore very important to change the curvature of the neutron guide in order to obtain optimal results. In this work we designed, constructed, and tested a guide whose curvature can be changed during the experiment, in the horizontal as well as the vertical direction. For a curvature in the horizontal or vertical direction it is not necessary to move all four walls, only two of the opposed plates. The element that changes the curvature of the guide consists of an actuating element (piezomotor) and a rod, operated by the piezomotor, that acts through a lever on the plate. The action of a force and the resulting torsion moment at the free end of the plate changes the curvature of the whole plate in an almost parabolic way. Using Monte Carlo simulations we determined the optimal curvature for each wavelength of a neutron guide for the spectrometer TOFTOF installed at the Forschungsneutronenquelle Heinz Maier-Leibnitz (FRM II). First tests have shown that with an adaptive focusing guide one can gain up to a factor of three in intensity at

  18. A computational framework for modeling targets as complex adaptive systems

    Science.gov (United States)

    Santos, Eugene; Santos, Eunice E.; Korah, John; Murugappan, Vairavan; Subramanian, Suresh

    2017-05-01

    Modeling large military targets is a challenge as they can be complex systems encompassing myriad combinations of human, technological, and social elements that interact, leading to complex behaviors. Moreover, such targets have multiple components and structures, extending across multiple spatial and temporal scales, and are in a state of change, either in response to events in the environment or changes within the system. Complex adaptive system (CAS) theory can help in capturing the dynamism, interactions, and, more importantly, the various emergent behaviors displayed by the targets. However, a key stumbling block is incorporating information from various intelligence, surveillance and reconnaissance (ISR) sources, while dealing with the inherent uncertainty, incompleteness and time criticality of real world information. To overcome these challenges, we present a probabilistic reasoning network based framework called complex adaptive Bayesian Knowledge Base (caBKB). caBKB is a rigorous, overarching and axiomatic framework that models two key processes, namely information aggregation and information composition. While information aggregation deals with the union, merger and concatenation of information and takes into account issues such as source reliability and information inconsistencies, information composition focuses on combining information components where such components may have well defined operations. Since caBKBs can explicitly model the relationships between information pieces at various scales, they provide unique capabilities such as the ability to de-aggregate and de-compose information for detailed analysis. Using a scenario from the Network Centric Operations (NCO) domain, we describe how our framework can be used for modeling targets with a focus on methodologies for quantifying NCO performance metrics.

  19. Design, realization and structural testing of a compliant adaptable wing

    International Nuclear Information System (INIS)

    Molinari, G; Arrieta, A F; Ermanni, P; Quack, M; Morari, M

    2015-01-01

    This paper presents the design, optimization, realization and testing of a novel wing morphing concept, based on distributed compliance structures, and actuated by piezoelectric elements. The adaptive wing features ribs with a selectively compliant inner structure, numerically optimized to achieve aerodynamically efficient shape changes while simultaneously withstanding aeroelastic loads. The static and dynamic aeroelastic behavior of the wing, and the effect of activating the actuators, is assessed by means of coupled 3D aerodynamic and structural simulations. To demonstrate the capabilities of the proposed morphing concept and optimization procedure, the wings of a model airplane are designed and manufactured according to the presented approach. The goal is to replace conventional ailerons, thus to achieve controllability in roll purely by morphing. The mechanical properties of the manufactured components are characterized experimentally, and used to create a refined and correlated finite element model. The overall stiffness, strength, and actuation capabilities are experimentally tested and successfully compared with the numerical prediction. To counteract the nonlinear hysteretic behavior of the piezoelectric actuators, a closed-loop controller is implemented, and its capability of accurately achieving the desired shape adaptation is evaluated experimentally. Using the correlated finite element model, the aeroelastic behavior of the manufactured wing is simulated, showing that the morphing concept can provide sufficient roll authority to allow controllability of the flight. The additional degrees of freedom offered by morphing can be also used to vary the plane lift coefficient, similarly to conventional flaps. The efficiency improvements offered by this technique are evaluated numerically, and compared to the performance of a rigid wing. (paper)

  20. The adaptation method in the Monte Carlo simulation for computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Hyoung Gun; Yoon, Chang Yeon; Lee, Won Ho [Dept. of Bio-convergence Engineering, Korea University, Seoul (Korea, Republic of); Cho, Seung Ryong [Dept. of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of); Park, Sung Ho [Dept. of Neurosurgery, Ulsan University Hospital, Ulsan (Korea, Republic of)

    2015-06-15

    The patient dose incurred from diagnostic procedures during advanced radiotherapy has become an important issue. Many researchers in medical physics use computational simulations to calculate complex parameters in experiments. However, extended computation times make it difficult for personal computers to run the conventional Monte Carlo method to simulate radiological images with high-flux photons such as images produced by computed tomography (CT). To minimize the computation time without degrading imaging quality, we applied a deterministic adaptation to the Monte Carlo calculation and verified its effectiveness by simulating CT image reconstruction for an image evaluation phantom (Catphan; Phantom Laboratory, New York NY, USA) and a human-like voxel phantom (KTMAN-2) (Los Alamos National Laboratory, Los Alamos, NM, USA). For the deterministic adaptation, the relationship between iteration numbers and the simulations was estimated and the option to simulate scattered radiation was evaluated. Simulations using the adaptive method ran at least 500 times faster than those using a conventional statistical process. In addition, compared with the conventional statistical method, the adaptive method provided images that were more similar to the experimental images, which proved that the adaptive method is highly effective for simulations that require a large number of iterations. Assuming no radiation scattering in the vicinity of the detectors minimized artifacts in the reconstructed image.

  1. Hard Real-Time Task Scheduling in Cloud Computing Using an Adaptive Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Amjad Mahmood

    2017-04-01

    Full Text Available In the Infrastructure-as-a-Service cloud computing model, virtualized computing resources in the form of virtual machines are provided over the Internet. A user can rent an arbitrary number of computing resources to meet their requirements, making cloud computing an attractive choice for executing real-time tasks. Economical task allocation and scheduling on a set of leased virtual machines is an important problem in the cloud computing environment. This paper proposes a greedy algorithm and a genetic algorithm with adaptive selection of suitable crossover and mutation operations (named AGA) to allocate and schedule real-time tasks with precedence constraints on heterogeneous virtual machines. A comprehensive simulation study has been done to evaluate the performance of the proposed algorithms in terms of their solution quality and efficiency. The simulation results show that AGA outperforms the greedy algorithm and a non-adaptive genetic algorithm in terms of solution quality.
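
    The adaptive ingredient, choosing among several crossover and mutation operators according to the fitness gains they have recently produced, can be sketched generically. This is an illustration of the idea, not the paper's AGA; the truncation selection, elitism, and credit rule are all assumptions.

    ```python
    import random

    def adaptive_ga(fitness, init_pop, crossovers, mutations,
                    generations=200, elite=2):
        """GA that keeps a success score per operator and picks crossover and
        mutation operators with probability proportional to the fitness
        improvements they have recently produced."""
        pop = list(init_pop)
        cx_score = [1.0] * len(crossovers)
        mu_score = [1.0] * len(mutations)
        for _ in range(generations):
            pop.sort(key=fitness, reverse=True)
            nxt = pop[:elite]                                  # elitism
            while len(nxt) < len(pop):
                p1, p2 = random.sample(pop[:max(2, len(pop) // 2)], 2)
                ci = random.choices(range(len(crossovers)), weights=cx_score)[0]
                mi = random.choices(range(len(mutations)), weights=mu_score)[0]
                child = mutations[mi](crossovers[ci](p1, p2))
                gain = fitness(child) - max(fitness(p1), fitness(p2))
                if gain > 0:                                   # credit the operators
                    cx_score[ci] += gain
                    mu_score[mi] += gain
                nxt.append(child)
            pop = nxt
        return max(pop, key=fitness)
    ```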

  2. Adaptive and robust active vibration control methodology and tests

    CERN Document Server

    Landau, Ioan Doré; Castellanos-Silva, Abraham; Constantinescu, Aurelian

    2017-01-01

    This book approaches the design of active vibration control systems from the perspective of today's ideas of computer control. It formulates the various design problems encountered in the active management of vibration as control problems and searches for the most appropriate tools to solve them. The experimental validation of the proposed solutions on relevant test benches is also addressed. To promote the widespread acceptance of these techniques, the presentation eliminates unnecessary theoretical developments (which can be found elsewhere) and focuses on algorithms and their use. The solutions proposed cannot be fully understood and creatively exploited without a clear understanding of the basic concepts and methods, so these are considered in depth. The focus is on enhancing motivation, algorithm presentation and experimental evaluation. MATLAB® routines, Simulink® diagrams and bench-test data are available for download and encourage easy assimilation of the experimental and exemplary material. Thre...

  3. Computer modeling describes gravity-related adaptation in cell cultures.

    Science.gov (United States)

    Alexandrov, Ludmil B; Alexandrova, Stoyana; Usheva, Anny

    2009-12-16

    Questions about the changes of biological systems in response to hostile environmental factors are important but not easy to answer. Often, the traditional description with differential equations is difficult due to the overwhelming complexity of the living systems. Another way to describe complex systems is by simulating them with phenomenological models such as the well-known evolutionary agent-based model (EABM). Here we developed an EABM to simulate cell colonies as a multi-agent system that adapts to hyper-gravity in starvation conditions. In the model, the cell's heritable characteristics are generated and transferred randomly to offspring cells. After a qualitative validation of the model at normal gravity, we simulate cellular growth in hyper-gravity conditions. The obtained data are consistent with previously confirmed theoretical and experimental findings for bacterial behavior in environmental changes, including the experimental data from the microgravity Atlantis and the Hypergravity 3000 experiments. Our results demonstrate that it is possible to utilize an EABM with realistic qualitative description to examine the effects of hypergravity and starvation on complex cellular entities.

  4. Cone Beam Computed Tomography-Derived Adaptive Radiotherapy for Radical Treatment of Esophageal Cancer

    International Nuclear Information System (INIS)

    Hawkins, Maria A.; Brooks, Corrinne; Hansen, Vibeke N.; Aitken, Alexandra; Tait, Diana M.

    2010-01-01

    Purpose: To investigate the potential for reduction in normal tissue irradiation by creating a patient-specific planning target volume (PTV) using cone beam computed tomography (CBCT) imaging acquired in the first week of radiotherapy for patients receiving radical radiotherapy. Methods and materials: Patients receiving radical RT for carcinoma of the esophagus were investigated. The PTV was defined as the CTV (tumor, nodes) plus the esophagus outlined 3 to 5 cm cranio-caudally, with a 1.5-cm circumferential margin added (clinical plan). Prefraction CBCT scans were acquired on Days 1 to 4, then weekly; no correction for setup error was made. The images were imported into the planning system. The tumor and esophagus for the length of the PTV were contoured on each CBCT scan and a 5-mm margin was added. A composite volume (PTV1) was created using the Week 1 composite CBCT volumes. The same process was repeated using CBCT Weeks 2 to 6 (PTV2). A new plan was created using PTV1 (adaptive plan). The coverage of the 95% isodose of PTV1 was evaluated on PTV2. Dose-volume histograms (DVH) for lungs, heart, and cord for the two plans were compared. Results: A total of 139 CBCT scans for 14 cases were analyzed. For the adaptive plan, the coverage of the 95% prescription isodose was 95.6% ± 4% for PTV1 and 96.8% ± 4.1% for PTV2 (t test, p = 0.19). Lungs V20 (15.6 Gy vs. 10.2 Gy) and heart mean dose (26.9 Gy vs. 20.7 Gy) were significantly lower for the adaptive plan. Conclusions: A reduced planning volume can be constructed within the first week of treatment using CBCT. A single plan modification can be performed within the second week of treatment with considerable reduction in organ at risk dose.

  5. Adaptive zooming in X-ray computed tomography.

    Science.gov (United States)

    Dabravolski, Andrei; Batenburg, Kees Joost; Sijbers, Jan

    2014-01-01

    In computed tomography (CT), the source-detector system commonly rotates around the object in a circular trajectory. Such a trajectory does not allow the detector to be fully exploited when scanning elongated objects. The aim is to increase the spatial resolution of the reconstructed image by optimal zooming during scanning. A new approach is proposed in which the full width of the detector is exploited for every projection angle. This approach is based on the use of prior information about the object's convex hull to move the source as close as possible to the object, while avoiding truncation of the projections. Experiments show that the proposed approach can significantly improve reconstruction quality, producing reconstructions with smaller errors and revealing more details in the object. The proposed approach can lead to more accurate reconstructions and increased spatial resolution in the object compared to the conventional circular trajectory.
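
    The geometric constraint behind the approach, moving the source as close as possible while keeping the convex hull inside the fan beam, has a closed form per view in a simplified 2D fan-beam geometry. The derivation below is our own illustration under those assumptions, not the authors' algorithm.

    ```python
    import numpy as np

    def min_source_distance(hull_xy, half_fan_angle_rad):
        """Smallest source-to-rotation-axis distance at which a point source on
        the +x axis keeps every convex-hull vertex inside the fan beam.

        For a vertex (px, py) seen from a source at (R, 0) aimed at the origin,
        tan(ray angle) = |py| / (R - px) <= tan(half_fan), hence
        R >= px + |py| / tan(half_fan); the binding vertex gives the minimum R.
        Rotate the hull to evaluate other projection angles.
        """
        hull = np.asarray(hull_xy, dtype=float)
        t = np.tan(half_fan_angle_rad)
        return float(np.max(hull[:, 0] + np.abs(hull[:, 1]) / t))
    ```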

  6. Effect of Preparation Depth on the Marginal and Internal Adaptation of Computer-aided Design/Computer-assisted Manufacture Endocrowns.

    Science.gov (United States)

    Gaintantzopoulou, M D; El-Damanhoury, H M

    The aim of the study was to evaluate the effect of preparation depth and intraradicular extension on the marginal and internal adaptation of computer-aided design/computer-assisted manufacture (CAD/CAM) endocrown restorations. Standardized preparations were made in resin endodontic tooth models (Nissin Dental), with an intracoronal preparation depth of 2 mm (group H2), with extra 1- (group H3) or 2-mm (group H4) intraradicular extensions in the root canals (n=12). Vita Enamic polymer-infiltrated ceramic-network material endocrowns were fabricated using the CEREC AC CAD/CAM system and were seated on the prepared teeth. Specimens were evaluated by microtomography. Horizontal and vertical tomographic sections were recorded and reconstructed using the CTSkan software (TView v1.1, Skyscan). The surface/void volume (S/V) in the region of interest was calculated. Marginal gap (MG), absolute marginal discrepancy (MD), and internal marginal gap were measured at various measuring locations and calculated in microscale (μm). Marginal and internal discrepancy data (μm) were analyzed with nonparametric Kruskal-Wallis analysis of variance by ranks with Dunn's post hoc test, whereas S/V data were analyzed by one-way analysis of variance and Bonferroni multiple comparisons (α=0.05). Significant differences were found in MG, MD, and internal gap width values between the groups, with H2 showing the lowest values of all groups. S/V calculations presented significant differences between H2 and the other two groups (H3 and H4) tested, with H2 again showing the lowest values. Increasing the intraradicular extension of endocrown restorations increased their marginal and internal gaps.

  7. Computer-Aided Test Flow in Core-Based Design

    NARCIS (Netherlands)

    Zivkovic, V.; Tangelder, R.J.W.T.; Kerkhoff, Hans G.

    2000-01-01

    This paper copes with efficient test-pattern generation in a core-based design. A consistent Computer-Aided Test (CAT) flow is proposed based on the required core-test strategy. It generates a test-pattern set for the embedded cores with high fault coverage and low DfT area overhead. The CAT flow is applied to a few cores within the Philips Core Test Pilot IC project.

  8. Procedures for Computing Transonic Flows for Control of Adaptive Wind Tunnels. Ph.D. Thesis - Technische Univ., Berlin, Mar. 1986

    Science.gov (United States)

    Rebstock, Rainer

    1987-01-01

    Numerical methods are developed for control of three-dimensional adaptive test sections. The physical properties of the design problem occurring in the external field computation are analyzed, and a design procedure suited for solution of the problem is worked out. To do this, the desired wall shape is determined by stepwise modification of an initial contour. The necessary changes in geometry are determined with the aid of a panel procedure, or, with incident flow near the sonic range, with a transonic small perturbation (TSP) procedure. The designed wall shape, together with the wall deflections set during the tunnel run, is the input to a newly derived one-step formula which immediately yields the adapted wall contour. This is particularly important since the classical iterative adaptation scheme is shown to converge poorly for 3D flows. Experimental results obtained in the adaptive test section with eight flexible walls are presented to demonstrate the potential of the procedure. Finally, a method is described to minimize wall interference in 3D flows by adapting only the top and bottom wind tunnel walls.

  9. PEAC: A Power-Efficient Adaptive Computing Technology for Enabling Swarm of Small Spacecraft and Deployable Mini-Payloads

    Data.gov (United States)

    National Aeronautics and Space Administration — This task is to develop and demonstrate a path-to-flight and power-adaptive avionics technology PEAC (Power Efficient Adaptive Computing). PEAC will enable emerging...

  10. An Overview of Recent Developments in Cognitive Diagnostic Computer Adaptive Assessments

    Directory of Open Access Journals (Sweden)

    Alan Huebner

    2010-01-01

    Full Text Available Cognitive diagnostic modeling has become an exciting new field of psychometric research. These models aim to diagnose examinees' mastery status of a group of discretely defined skills, or attributes, thereby providing them with detailed information regarding their specific strengths and weaknesses. Combining cognitive diagnosis with computer adaptive assessments has emerged as an important part of this new field. This article aims to provide practitioners and researchers with an introduction to and overview of recent developments in cognitive diagnostic computer adaptive assessments.
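
    To make the combination of cognitive diagnosis and adaptive item selection concrete, here is a minimal sketch of a few rounds of cognitive diagnostic CAT under the DINA model. The Q-matrix, slip/guess values, and the entropy-based selection rule are illustrative assumptions, not taken from the article.

```python
import numpy as np
from itertools import product

K = 3                                                  # number of attributes
profiles = np.array(list(product([0, 1], repeat=K)))   # all 2^K mastery profiles

# illustrative item bank: Q-matrix plus slip/guess parameters
Q = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1], [1, 1, 0], [0, 1, 1]])
slip, guess = np.full(len(Q), 0.1), np.full(len(Q), 0.2)

def p_correct(item):
    """DINA model: P(correct) = 1-s if all required attributes are mastered, else g."""
    eta = np.all(profiles[:, Q[item] == 1] == 1, axis=1)
    return np.where(eta, 1.0 - slip[item], guess[item])

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def select_item(posterior, administered):
    """Choose the unused item minimizing the expected posterior entropy."""
    best, best_h = None, np.inf
    for i in range(len(Q)):
        if i in administered:
            continue
        p1, exp_h = p_correct(i), 0.0
        for px in (p1, 1.0 - p1):                      # response = 1, then 0
            m = posterior @ px                         # marginal response probability
            if m > 0:
                exp_h += m * entropy(posterior * px / m)
        if exp_h < best_h:
            best, best_h = i, exp_h
    return best

def update(posterior, item, resp):
    like = p_correct(item) if resp else 1.0 - p_correct(item)
    post = posterior * like
    return post / post.sum()

rng = np.random.default_rng(0)
true_profile = np.array([1, 0, 1])                     # simulated examinee
posterior, administered = np.full(len(profiles), 1.0 / len(profiles)), []
for _ in range(4):
    item = select_item(posterior, administered)
    administered.append(item)
    eta = np.all(true_profile[Q[item] == 1] == 1)
    resp = int(rng.random() < ((1.0 - slip[item]) if eta else guess[item]))
    posterior = update(posterior, item, resp)
print(profiles[np.argmax(posterior)])                  # estimated mastery profile
```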

  11. A Secure, Scalable and Elastic Autonomic Computing Systems Paradigm: Supporting Dynamic Adaptation of Self-* Services from an Autonomic Cloud

    Directory of Open Access Journals (Sweden)

    Abdul Jaleel

    2018-05-01

    Full Text Available Autonomic computing embeds self-management features in software systems using external feedback control loops, i.e., autonomic managers. In existing models of autonomic computing, adaptive behaviors are defined at design time, autonomic managers are statically configured, and the running system has a fixed set of self-* capabilities. An autonomic computing design should accommodate autonomic capability growth by allowing the dynamic configuration of self-* services, but this causes security and integrity issues. A secure, scalable and elastic autonomic computing system (SSE-ACS) paradigm is proposed to address the runtime inclusion of autonomic managers, ensuring secure communication between autonomic managers and managed resources. Applying the SSE-ACS concept, a layered approach for the dynamic adaptation of self-* services is presented, with an online ‘Autonomic_Cloud’ working as the middleware between Autonomic Managers (offering the self-* services) and the Autonomic Computing System (requiring the self-* services). A stock trading and forecasting system is used for simulation purposes. The security impact of the SSE-ACS paradigm is verified by testing possible attack cases over the autonomic computing system with single and multiple autonomic managers running on the same and different machines. The common vulnerability scoring system (CVSS) metric shows a decrease in the vulnerability severity score from high (8.8, for the existing ACS) to low (3.9, for SSE-ACS). Autonomic managers are introduced into the system at runtime from the Autonomic_Cloud to test scalability and elasticity. With elastic AMs, the system optimizes the Central Processing Unit (CPU) share, resulting in improved execution time for business logic. For computing systems requiring the continuous support of self-management services, the proposed system achieves a significant improvement in security, scalability, elasticity, autonomic efficiency, and issue resolving time.

  12. Verifiable Measurement-Only Blind Quantum Computing with Stabilizer Testing.

    Science.gov (United States)

    Hayashi, Masahito; Morimae, Tomoyuki

    2015-11-27

    We introduce a simple protocol for verifiable measurement-only blind quantum computing. Alice, a client, can perform only single-qubit measurements, whereas Bob, a server, can generate and store entangled many-qubit states. Bob generates copies of a graph state, which is a universal resource state for measurement-based quantum computing, and sends each of their qubits to Alice one by one. Alice adaptively measures each qubit according to her program. If Bob is honest, he generates the correct graph state, and therefore Alice can obtain the correct computation result. Regarding security, whatever Bob does, he cannot get any information about Alice's computation because of the no-signaling principle. Furthermore, a malicious Bob does not necessarily send copies of the correct graph state, but Alice can check the correctness of Bob's state by directly verifying the stabilizers of some copies.

  13. Adaptive Tests of Significance Using Permutations of Residuals with R and SAS

    CERN Document Server

    O'Gorman, Thomas W

    2012-01-01

    Provides the tools needed to successfully perform adaptive tests across a broad range of datasets. Adaptive Tests of Significance Using Permutations of Residuals with R and SAS illustrates the power of adaptive tests and showcases their ability to adjust the testing method to suit a particular set of data. The book utilizes state-of-the-art software to demonstrate the practicality and benefits for data analysis in various fields of study. Beginning with an introduction, the book moves on to explore the underlying concepts of adaptive tests, including smoothing methods and normalizing transformations.

  14. Computer-Aided Test Flow in Core-Based Design

    NARCIS (Netherlands)

    Zivkovic, V.; Tangelder, R.J.W.T.; Kerkhoff, Hans G.

    2000-01-01

    This paper copes with test-pattern generation and fault coverage determination in core-based design. The basic core-test strategy that one has to apply in the core-based design is stated in this work. A Computer-Aided Test (CAT) flow is proposed resulting in accurate fault coverage of embedded cores. The CAT flow is applied to a few cores within the Philips Core Test Pilot IC project.

  15. Computer Aided Education System SuperTest. Present and Prospective

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available This paper analyzes the testing and self-testing process for the Computer Aided Education System (CAES SuperTest, used at the Academy of Economic Studies of Chisinau, Moldova and recently implemented at the University of Bacau, Romania. We discuss here the future of this software, from the Information Society and Knowledge Society point of view.

  16. Detection of User Independent Single Trial ERPs in Brain Computer Interfaces: An Adaptive Spatial Filtering Approach

    DEFF Research Database (Denmark)

    Leza, Cristina; Puthusserypady, Sadasivan

    2017-01-01

    Brain Computer Interfaces (BCIs) use brain signals to communicate with the external world. The main challenges to address are speed, accuracy and adaptability. Here, a novel algorithm for P300-based BCI spelling systems is presented, specifically suited for single-trial detection of Event-Related Potentials (ERPs).

  17. A computer simulation of an adaptive noise canceler with a single input

    Science.gov (United States)

    Albert, Stuart D.

    1991-06-01

    A description of an adaptive noise canceler using Widrow's LMS algorithm is presented. A computer simulation of canceler performance (adaptive convergence time and frequency transfer function) was written for use as a design tool. The simulation, its assumptions, and its input parameters are described in detail. The simulation is used in a design example to predict the performance of an adaptive noise canceler in the simultaneous presence of both strong and weak narrow-band signals (a cosited frequency-hopping radio scenario). On the basis of the simulation results, it is concluded that the simulation is suitable for use as an adaptive noise canceler design tool; i.e., it can be used to evaluate the effect of design parameter changes on canceler performance.
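
    A minimal version of such a canceler is easy to simulate. In the sketch below (the signal, channel taps, and step size are invented for illustration), the Widrow-Hoff LMS update adapts filter weights on a reference noise input so that the error output approximates the desired signal:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
t = np.arange(n)
signal = np.sin(2 * np.pi * 0.01 * t)              # desired narrow-band signal
noise = rng.standard_normal(n)                     # reference noise source
channel = np.array([0.8, -0.4, 0.2])               # unknown path into the primary input
primary = signal + np.convolve(noise, channel)[:n]

L, mu = 8, 0.01                                    # filter length, step size
w = np.zeros(L)
out = np.zeros(n)
for k in range(L - 1, n):
    x = noise[k - L + 1:k + 1][::-1]               # most recent L reference samples
    out[k] = primary[k] - w @ x                    # canceler output ~ signal
    w += 2 * mu * out[k] * x                       # Widrow-Hoff LMS update

print(np.mean((out[-1000:] - signal[-1000:]) ** 2))  # residual noise power
```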

  18. Adaptive control of Parkinson's state based on a nonlinear computational model with unknown parameters.

    Science.gov (United States)

    Su, Fei; Wang, Jiang; Deng, Bin; Wei, Xi-Le; Chen, Ying-Yuan; Liu, Chen; Li, Hui-Yan

    2015-02-01

    The objective here is to explore the use of an adaptive input-output feedback linearization method to achieve an improved deep brain stimulation (DBS) algorithm for closed-loop control of Parkinson's state. The control law is based on a highly nonlinear computational model of Parkinson's disease (PD) with unknown parameters. The restoration of thalamic relay reliability is formulated as the desired outcome of the adaptive control methodology, and the DBS waveform is the control input. The control input is adjusted in real time according to estimates of the unknown parameters as well as the feedback signal. Simulation results show that the proposed adaptive control algorithm succeeds in restoring the relay reliability of the thalamus, and at the same time achieves accurate estimation of the unknown parameters. Our findings point to the potential value of an adaptive control approach that could be used to regulate the DBS waveform for more effective treatment of PD.
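
    The paper's controller is built on a detailed PD model, but the underlying idea, certainty-equivalence control with a Lyapunov-based parameter update, can be shown on a toy scalar plant. Everything below is an illustrative analogue, not the paper's DBS algorithm:

```python
import numpy as np

# toy plant: xdot = theta*x**3 + u, with theta unknown to the controller
theta, gamma, k, dt = 2.0, 5.0, 4.0, 1e-3
x, theta_hat = 1.0, 0.0
for _ in range(int(10.0 / dt)):
    u = -theta_hat * x**3 - k * x      # certainty-equivalence control law
    xdot = theta * x**3 + u
    theta_hat += gamma * x**4 * dt     # Lyapunov-based adaptation law
    x += xdot * dt                     # forward-Euler integration
print(x, theta_hat)  # x -> 0; theta_hat converges only as far as needed
```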

  19. High Performance Computing Modernization Program Kerberos Throughput Test Report

    Science.gov (United States)

    2017-10-26

    Naval Research Laboratory, Washington, DC 20375-5320. NRL/MR/5524--17-9751: High Performance Computing Modernization Program Kerberos Throughput Test Report, by Daniel G. Gdula* and …

  20. Computational adaptive optics for broadband interferometric tomography of tissues and cells

    Science.gov (United States)

    Adie, Steven G.; Mulligan, Jeffrey A.

    2016-03-01

    Adaptive optics (AO) can shape aberrated optical wavefronts to physically restore the constructive interference needed for high-resolution imaging. With access to the complex optical field, however, many functions of optical hardware can be achieved computationally, including focusing and the compensation of optical aberrations to restore the constructive interference required for diffraction-limited imaging performance. Holography, which employs interferometric detection of the complex optical field, was developed based on this connection between hardware and computational image formation, although this link has only recently been exploited for 3D tomographic imaging in scattering biological tissues. This talk will present the underlying imaging science behind computational image formation with optical coherence tomography (OCT), a beam-scanned version of broadband digital holography. Analogous to hardware AO (HAO), we demonstrate computational adaptive optics (CAO) and optimization of the computed pupil correction in 'sensorless mode' (Zernike polynomial corrections with feedback from image metrics) or with the use of 'guide stars' in the sample. We discuss the concept of an 'isotomic volume' as the volumetric extension of the 'isoplanatic patch' introduced in astronomical AO. Recent CAO results and ongoing work are highlighted to point to the potential biomedical impact of computed broadband interferometric tomography. We also discuss the advantages and disadvantages of HAO versus CAO for the effective shaping of optical wavefronts, and highlight opportunities for hybrid approaches that synergistically combine the unique advantages of hardware and computational methods for rapid volumetric tomography with cellular resolution.
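
    The 'sensorless' variant of CAO can be sketched in a few lines: apply a candidate Zernike phase correction to a measured complex pupil field, form the image, and keep the coefficient that maximizes an image-sharpness metric. The defocus-only aberration, grid size, and metric below are assumptions for illustration:

```python
import numpy as np

N = 256
x = np.linspace(-1.0, 1.0, N)
X, Y = np.meshgrid(x, x)
R2 = X**2 + Y**2
pupil = (R2 <= 1.0).astype(float)

def defocus(c):
    """Zernike defocus phase (Z4, unnormalized) with coefficient c."""
    return c * (2.0 * R2 - 1.0) * pupil

field = pupil * np.exp(1j * defocus(1.5))          # 'measured' aberrated pupil field

def sharpness(img):
    i = np.abs(img) ** 2
    return np.sum(i**2) / np.sum(i) ** 2           # normalized image sharpness

def image(field, c):
    """Image formed after applying the conjugate phase correction."""
    return np.fft.fft2(field * np.exp(-1j * defocus(c)))

cs = np.linspace(-3.0, 3.0, 61)                    # sensorless coefficient scan
best = max(cs, key=lambda c: sharpness(image(field, c)))
print(best)                                        # ~1.5, the aberration applied above
```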

  1. PA171 Containers on a Wood Pallet with Metal Top Adapter, Air Pressure Tests During MIL-STD-1660 Tests

    National Research Council Canada - National Science Library

    2004-01-01

    ... (PM-MAS) to conduct Air Pressure Tests during MIL-STD-1660, "Design Criteria for Ammunition Unit Loads" testing on the PA171 containers on a wood pallet with metal top adapter as manufactured by Alliant Tech...

  2. Using An Adapter To Perform The Chalfant-Style Containment Vessel Periodic Maintenance Leak Rate Test

    International Nuclear Information System (INIS)

    Loftin, B.; Abramczyk, G.; Trapp, D.

    2011-01-01

    Recently the Packaging Technology and Pressurized Systems (PT and PS) organization at the Savannah River National Laboratory was asked to develop an adapter for performing the leak-rate test of a Chalfant-style containment vessel. The PT and PS organization collaborated with designers at the Department of Energy's Pantex Plant to develop the adapter currently in use for performing leak-rate testing on the containment vessels. This paper gives the history of leak-rate testing of Chalfant-style containment vessels, discusses the design concept for the adapter, gives an overview of the design, and presents results of the testing done using the adapter.

  3. Recommendations for elaboration, transcultural adaptation and validation process of tests in Speech, Hearing and Language Pathology.

    Science.gov (United States)

    Pernambuco, Leandro; Espelt, Albert; Magalhães, Hipólito Virgílio; Lima, Kenio Costa de

    2017-06-08

    To present a guide with recommendations for the translation, adaptation, elaboration and validation of tests in Speech and Language Pathology. The recommendations were based on international guidelines with a focus on the elaboration, translation, cross-cultural adaptation and validation of tests. The recommendations were grouped into two charts, one with procedures for translation and transcultural adaptation and the other for obtaining evidence of validity, reliability and measures of accuracy of the tests. A guide with norms for the organization and systematization of the elaboration, translation, cross-cultural adaptation and validation of tests in Speech and Language Pathology was created.

  4. Computer-Aided Test Flow in Core-Based Design

    OpenAIRE

    Zivkovic, V.; Tangelder, R.J.W.T.; Kerkhoff, Hans G.

    2000-01-01

    This paper copes with test-pattern generation and fault coverage determination in core-based design. The basic core-test strategy that one has to apply in the core-based design is stated in this work. A Computer-Aided Test (CAT) flow is proposed resulting in accurate fault coverage of embedded cores. The CAT flow is applied to a few cores within the Philips Core Test Pilot IC project.

  5. Development of a computer-adaptive physical function instrument for Social Security Administration disability determination.

    Science.gov (United States)

    Ni, Pengsheng; McDonough, Christine M; Jette, Alan M; Bogusz, Kara; Marfeo, Elizabeth E; Rasch, Elizabeth K; Brandt, Diane E; Meterko, Mark; Haley, Stephen M; Chan, Leighton

    2013-09-01

    To develop and test an instrument to assess physical function for Social Security Administration (SSA) disability programs, the SSA-Physical Function (SSA-PF) instrument. Item response theory (IRT) analyses were used to (1) create a calibrated item bank for each of the factors identified in prior factor analyses, (2) assess the fit of the items within each scale, (3) develop separate computer-adaptive testing (CAT) instruments for each scale, and (4) conduct initial psychometric testing. Cross-sectional data collection; IRT analyses; CAT simulation. Telephone and Internet survey. Two samples: SSA claimants (n=1017) and adults from the U.S. general population (n=999). None. Model fit statistics, correlation, and reliability coefficients. IRT analyses resulted in 5 unidimensional SSA-PF scales: Changing & Maintaining Body Position, Whole Body Mobility, Upper Body Function, Upper Extremity Fine Motor, and Wheelchair Mobility for a total of 102 items. High CAT accuracy was demonstrated by strong correlations between simulated CAT scores and those from the full item banks. On comparing the simulated CATs with the full item banks, very little loss of reliability or precision was noted, except at the lower and upper ranges of each scale. No difference in response patterns by age or sex was noted. The distributions of claimant scores were shifted to the lower end of each scale compared with those of a sample of U.S. adults. The SSA-PF instrument contributes important new methodology for measuring the physical function of adults applying to the SSA disability programs. Initial evaluation revealed that the SSA-PF instrument achieved considerable breadth of coverage in each content domain and demonstrated noteworthy psychometric properties. Copyright © 2013 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
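
    The CAT machinery referred to above can be illustrated with a generic item-response-theory sketch: a 2PL item bank, maximum-information item selection, and an expected a posteriori (EAP) ability update on a grid. The bank and all parameters are simulated for illustration, not the SSA-PF item banks:

```python
import numpy as np

rng = np.random.default_rng(1)
# hypothetical calibrated 2PL bank: discriminations a, difficulties b
a = rng.uniform(0.8, 2.0, 200)
b = rng.normal(0.0, 1.0, 200)
grid = np.linspace(-4.0, 4.0, 161)                 # ability grid
post = np.exp(-0.5 * grid**2)                      # N(0,1) prior
post /= post.sum()

def p2pl(a_i, b_i, th):
    """2PL probability of a correct response."""
    return 1.0 / (1.0 + np.exp(-a_i * (th - b_i)))

true_theta, used = 0.7, []
for _ in range(20):
    theta_hat = grid @ post                        # EAP ability estimate
    p = p2pl(a, b, theta_hat)
    info = a**2 * p * (1.0 - p)                    # Fisher information at theta_hat
    info[used] = -np.inf                           # never reuse an item
    item = int(np.argmax(info))                    # maximum-information selection
    used.append(item)
    resp = rng.random() < p2pl(a[item], b[item], true_theta)
    like = p2pl(a[item], b[item], grid)
    post *= like if resp else (1.0 - like)         # Bayesian update on the grid
    post /= post.sum()
print(grid @ post)                                 # final EAP, close to true_theta
```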

  6. Computational Strategies for Dissecting the High-Dimensional Complexity of Adaptive Immune Repertoires

    Directory of Open Access Journals (Sweden)

    Enkelejda Miho

    2018-02-01

    Full Text Available The adaptive immune system recognizes antigens via an immense array of antigen-binding antibodies and T-cell receptors, the immune repertoire. The interrogation of immune repertoires is of high relevance for understanding the adaptive immune response in disease and infection (e.g., autoimmunity, cancer, HIV). Adaptive immune receptor repertoire sequencing (AIRR-seq) has driven the quantitative and molecular-level profiling of immune repertoires, thereby revealing the high-dimensional complexity of the immune receptor sequence landscape. Several methods for the computational and statistical analysis of large-scale AIRR-seq data have been developed to resolve immune repertoire complexity and to understand the dynamics of adaptive immunity. Here, we review the current research on (i) diversity, (ii) clustering and network, (iii) phylogenetic, and (iv) machine learning methods applied to dissect, quantify, and compare the architecture, evolution, and specificity of immune repertoires. We summarize outstanding questions in computational immunology and propose future directions for systems immunology toward coupling AIRR-seq with the computational discovery of immunotherapeutics, vaccines, and immunodiagnostics.
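
    As a small example of the diversity analyses mentioned under (i), the following sketch computes Shannon diversity, evenness, and clonality from clonotype counts (toy data; real AIRR-seq pipelines operate on millions of reads):

```python
import numpy as np
from collections import Counter

# toy clonotype list standing in for AIRR-seq CDR3 sequences
reads = ["CARDY", "CARDY", "CASSL", "CARDY", "CGGGF", "CASSL", "CARDY"]
counts = np.array(list(Counter(reads).values()), dtype=float)
p = counts / counts.sum()                  # clonotype frequencies

shannon = -np.sum(p * np.log(p))           # Shannon diversity (entropy)
richness = len(p)                          # number of distinct clonotypes
evenness = shannon / np.log(richness)      # Pielou's evenness
clonality = 1.0 - evenness                 # 0 = fully even, 1 = monoclonal
print(richness, shannon, clonality)
```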

  7. Computer intervention impact on psychosocial adaptation of rural women with chronic conditions.

    Science.gov (United States)

    Weinert, Clarann; Cudney, Shirley; Comstock, Bryan; Bansal, Aasthaa

    2011-01-01

    Adapting to living with chronic conditions is a life-long psychosocial challenge. The purpose of this study was to report the effect of a computer intervention on the psychosocial adaptation of rural women with chronic conditions. A two-group study design was used with 309 middle-aged, rural women who had chronic conditions, randomized into either a computer-based intervention or a control group. Data were collected at baseline, at the end of the intervention, and 6 months later on the psychosocial indicators of social support, self-esteem, acceptance of illness, stress, depression, and loneliness. The impact of the computer-based intervention was statistically significant for five of six of the psychosocial outcomes measured, with a modest impact on social support. The largest benefits were seen in depression, stress, and acceptance. The women-to-women intervention resulted in positive psychosocial responses that have the potential to contribute to successful management of illness and adaptation. Other components of adaptation to be examined are the impact of the intervention on illness management and quality of life and the interrelationships among environmental stimuli, psychosocial response, and illness management.

  8. Adaptive statistical iterative reconstruction for volume-rendered computed tomography portovenography. Improvement of image quality

    International Nuclear Information System (INIS)

    Matsuda, Izuru; Hanaoka, Shohei; Akahane, Masaaki

    2010-01-01

    Adaptive statistical iterative reconstruction (ASIR) is a reconstruction technique for computed tomography (CT) that reduces image noise. The purpose of our study was to investigate whether ASIR improves the quality of volume-rendered (VR) CT portovenography. Institutional review board approval, with waived consent, was obtained. A total of 19 patients (12 men, 7 women; mean age 69.0 years; range 25-82 years) suspected of having liver lesions underwent three-phase enhanced CT. VR image sets were prepared with both the conventional method and ASIR. The required time to make VR images was recorded. Two radiologists performed independent qualitative evaluations of the image sets. The Wilcoxon signed-rank test was used for statistical analysis. Contrast-noise ratios (CNRs) of the portal and hepatic vein were also evaluated. Overall image quality was significantly improved by ASIR (P<0.0001 and P=0.0155 for each radiologist). ASIR enhanced CNRs of the portal and hepatic vein significantly (P<0.0001). The time required to create VR images was significantly shorter with ASIR (84.7 vs. 117.1 s; P=0.014). ASIR enhances CNRs and improves image quality in VR CT portovenography. It also shortens the time required to create liver VR CT portovenographs. (author)

  9. Non-adaptive measurement-based quantum computation and multi-party Bell inequalities

    International Nuclear Information System (INIS)

    Hoban, Matty J; Campbell, Earl T; Browne, Dan E; Loukopoulos, Klearchos

    2011-01-01

    Quantum correlations exhibit behaviour that cannot be resolved with a local hidden variable picture of the world. In quantum information, they are also used as resources for information processing tasks, such as measurement-based quantum computation (MQC). In MQC, universal quantum computation can be achieved via adaptive measurements on a suitable entangled resource state. In this paper, we look at a version of MQC in which we remove the adaptivity of measurements and aim to understand what computational abilities remain in the resource. We show that there are explicit connections between this model of computation and the question of non-classicality in quantum correlations. We demonstrate this by focusing on deterministic computation of Boolean functions, in which natural generalizations of the Greenberger-Horne-Zeilinger paradox emerge; we then explore probabilistic computation, via which multipartite Bell inequalities can be defined. We use this correspondence to define families of multi-party Bell inequalities, which we show to have a number of interesting contrasting properties.

  10. Non-adaptive measurement-based quantum computation and multi-party Bell inequalities

    Energy Technology Data Exchange (ETDEWEB)

    Hoban, Matty J; Campbell, Earl T; Browne, Dan E [Department of Physics and Astronomy, University College London, Gower Street, London WC1E 6BT (United Kingdom); Loukopoulos, Klearchos, E-mail: m.hoban@ucl.ac.uk [Department of Materials, Oxford University, Parks Road, Oxford OX1 4PH (United Kingdom)

    2011-02-15

    Quantum correlations exhibit behaviour that cannot be resolved with a local hidden variable picture of the world. In quantum information, they are also used as resources for information processing tasks, such as measurement-based quantum computation (MQC). In MQC, universal quantum computation can be achieved via adaptive measurements on a suitable entangled resource state. In this paper, we look at a version of MQC in which we remove the adaptivity of measurements and aim to understand what computational abilities remain in the resource. We show that there are explicit connections between this model of computation and the question of non-classicality in quantum correlations. We demonstrate this by focusing on deterministic computation of Boolean functions, in which natural generalizations of the Greenberger-Horne-Zeilinger paradox emerge; we then explore probabilistic computation, via which multipartite Bell inequalities can be defined. We use this correspondence to define families of multi-party Bell inequalities, which we show to have a number of interesting contrasting properties.

  11. Towards Static Analysis of Policy-Based Self-adaptive Computing Systems

    DEFF Research Database (Denmark)

    Margheri, Andrea; Nielson, Hanne Riis; Nielson, Flemming

    2016-01-01

    For supporting the design of self-adaptive computing systems, the PSCEL language offers a principled approach that relies on declarative definitions of adaptation and authorisation policies enforced at runtime. Policies permit managing system components by regulating their interactions and by dynamically introducing new actions to accomplish task-oriented goals. However, the runtime evaluation of policies and their effects on system components make the prediction of system behaviour challenging. In this paper, we introduce the construction of a flow graph that statically points out the policy evaluations that can take place at runtime, and exploit it to analyse the effects of policy evaluations on the progress of system components.

  12. Repetitive Domain-Referenced Testing Using Computers: the TITA System.

    Science.gov (United States)

    Olympia, P. L., Jr.

    The TITA (Totally Interactive Testing and Analysis) System algorithm for the repetitive construction of domain-referenced tests utilizes a compact data bank, is highly portable, is useful in any discipline, requires modest computer hardware, and does not present a security problem. Clusters of related keyphrases, statement phrases, and distractors…

  13. Design of an Adaptive Power Regulation Mechanism and a Nozzle for a Hydroelectric Power Plant Turbine Test Rig

    Science.gov (United States)

    Mert, Burak; Aytac, Zeynep; Tascioglu, Yigit; Celebioglu, Kutay; Aradag, Selin; ETU Hydro Research Center Team

    2014-11-01

    This study deals with the design of a power regulation mechanism for a Hydroelectric Power Plant (HEPP) model turbine test system, which is designed to test Francis-type hydroturbines up to 2 MW with varying head and flow (discharge) values. Unlike the tailor-made regulation mechanisms of full-sized, functional HEPPs, the design for the test system must be easily adapted to the various turbines to be tested. To achieve this adaptability, a dynamic simulation model is constructed in MATLAB/Simulink SimMechanics. This model acquires geometric data of the regulation system from Autodesk Inventor CAD models and hydraulic loading data from Computational Fluid Dynamics (CFD) analysis. The dynamic model is explained, and case studies of two different HEPPs are performed for validation. The CFD-aided design of the turbine guide vanes, which is used as input for the dynamic model, is also presented. This research is financially supported by the Turkish Ministry of Development.

  14. A computer-controlled automated test system for fatigue and fracture testing

    International Nuclear Information System (INIS)

    Nanstad, R.K.; Alexander, D.J.; Swain, R.L.; Hutton, J.T.; Thomas, D.L.

    1989-01-01

    A computer-controlled system consisting of a servohydraulic test machine, an in-house designed test controller, and a desktop computer has been developed for performing automated fracture toughness and fatigue crack growth testing both in the laboratory and in hot cells for remote testing of irradiated specimens. Both unloading compliance and dc-potential drop can be used to monitor crack growth. The test controller includes a dc-current supply programmer, a function generator for driving the servohydraulic test machine to required test outputs, five measurement channels (each consisting of low-pass filter, track/hold amplifier, and 16-bit analog-to-digital converter), and digital logic for various control and data multiplexing functions. The test controller connects to the computer via a 16-bit wide photo-isolated bidirectional bus. The computer, a Hewlett-Packard series 200/300, inputs specimen and test parameters from the operator, configures the test controller, stores test data from the test controller in memory, does preliminary analysis during the test, and records sensor calibrations, specimen and test parameters, and test data on flexible diskette for later recall and analysis with measured initial and final crack length information. During the test, the operator can change test parameters as necessary. 24 refs., 6 figs

  15. Development and Evaluation of a Confidence-Weighting Computerized Adaptive Testing

    Science.gov (United States)

    Yen, Yung-Chin; Ho, Rong-Guey; Chen, Li-Ju; Chou, Kun-Yi; Chen, Yan-Lin

    2010-01-01

    The purpose of this study was to examine whether the efficiency, precision, and validity of computerized adaptive testing (CAT) could be improved by assessing confidence differences in knowledge that examinees possessed. We proposed a novel polytomous CAT model called the confidence-weighting computerized adaptive testing (CWCAT), which combined a…

  16. Computerized Adaptive Test (CAT) Applications and Item Response Theory Models for Polytomous Items

    Science.gov (United States)

    Aybek, Eren Can; Demirtasli, R. Nukhet

    2017-01-01

    This article aims to provide a theoretical framework for computerized adaptive tests (CAT) and item response theory models for polytomous items. Besides that, it aims to introduce the simulation and live CAT software to the related researchers. Computerized adaptive test algorithm, assumptions of item response theory models, nominal response…

  17. Slice image pretreatment for cone-beam computed tomography based on adaptive filter

    International Nuclear Information System (INIS)

    Huang Kuidong; Zhang Dinghua; Jin Yanfang

    2009-01-01

    According to the noise properties and the serial slice image characteristics of the Cone-Beam Computed Tomography (CBCT) system, a slice image pretreatment for CBCT based on adaptive filtering was proposed. A judging criterion for the noise is established first. All pixels are classified into two classes: an adaptive center-weighted modified trimmed mean (ACWMTM) filter is used for pixels corrupted by Gaussian noise, and an adaptive median (AM) filter is used for pixels corrupted by impulse noise. In the ACWMTM filtering algorithm, the Gaussian noise standard deviation estimated in the current slice image with an offset window is replaced by the standard deviation estimated in the adjacent slice image with the corresponding window, which improves the filtering accuracy for serial images. A pretreatment experiment on CBCT slice images of a wax model of a hollow turbine blade shows that the method performs well both in eliminating noise and in protecting details. (authors)
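
    The AM branch of such a scheme can be sketched as a standard adaptive median filter, which grows its window until the median is not itself an impulse value. This is a generic textbook version, not the authors' exact ACWMTM/AM combination:

```python
import numpy as np

def adaptive_median(img, smax=7):
    """Grow the window until its median is not an extreme value; replace a
    pixel only if the pixel itself is an impulse (min/max of the window)."""
    pad = smax // 2
    padded = np.pad(img, pad, mode="reflect")
    out = img.astype(float).copy()
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            for s in range(3, smax + 1, 2):
                r = s // 2
                win = padded[i + pad - r:i + pad + r + 1,
                             j + pad - r:j + pad + r + 1]
                zmin, zmed, zmax = win.min(), np.median(win), win.max()
                if zmin < zmed < zmax:                   # median is reliable
                    if not (zmin < img[i, j] < zmax):    # pixel is an impulse
                        out[i, j] = zmed
                    break
            else:
                out[i, j] = zmed                         # fall back at max window
    return out

rng = np.random.default_rng(0)
img = np.full((64, 64), 100.0)
mask = rng.random(img.shape) < 0.1
img[mask] = rng.choice([0.0, 255.0], mask.sum())         # impulse (salt-and-pepper) noise
print(np.abs(adaptive_median(img) - 100.0).max())        # impulses removed
```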

  18. The students' ability in the mathematical literacy for uncertainty problems on the PISA adaptation test

    Science.gov (United States)

    Julie, Hongki; Sanjaya, Febi; Anggoro, Ant. Yudhi

    2017-08-01

    One of the purposes of this study was to describe the solution profiles of junior high school students on a PISA adaptation test. The procedures conducted by the researchers to achieve this objective were (1) adapting the PISA test, (2) validating the adapted PISA test, (3) asking junior high school students to take the adapted PISA test, and (4) constructing the students' solution profiles. PISA mathematics problems can be classified into four areas, namely quantity, space and shape, change and relationship, and uncertainty. The results presented in this paper are those for the uncertainty problems. The adapted PISA test contained fifteen questions. Subjects in this study were 18 students from 11 junior high schools in Yogyakarta, Central Java, and Banten. The study was qualitative. For the first uncertainty problem, 66.67% of students reached level 3. For the second uncertainty problem, 44.44% of students achieved level 4 and 33.33% reached level 3. For the third uncertainty problem, 38.89% of students achieved level 5, 11.11% reached level 4, and 5.56% achieved level 3. For part a of the fourth uncertainty problem, 72.22% of students reached level 4, and for part b of the fourth uncertainty problem, 83.33% achieved level 4.

  19. 3D fast adaptive correlation imaging for large-scale gravity data based on GPU computation

    Science.gov (United States)

    Chen, Z.; Meng, X.; Guo, L.; Liu, G.

    2011-12-01

    In recent years, large-scale gravity data sets have been collected and employed to enhance the gravity problem-solving abilities of tectonics studies in China. Aiming at the large-scale data and the requirement of rapid interpretation, previous authors have carried out a lot of work, including fast gradient module inversion and Euler deconvolution depth inversion, 3-D physical property inversion using stochastic subspaces and equivalent storage, and fast inversion using wavelet transforms and a logarithmic barrier method. So it can be said that 3-D gravity inversion has been greatly improved in the last decade. Many authors have added different kinds of a priori information and constraints to deal with nonuniqueness, using models composed of a large number of contiguous cells of unknown property, and have obtained good results. However, due to long computation times, instability and other shortcomings, 3-D physical property inversion has not yet been widely applied to large-scale data. In order to achieve 3-D interpretation with high efficiency and precision for geological and ore bodies and obtain their subsurface distribution, there is an urgent need for a fast and efficient inversion method for large-scale gravity data. As an entirely new geophysical inversion method, 3D correlation imaging has developed rapidly thanks to the advantages of requiring no a priori information and demanding only a small amount of computer memory. This method was proposed to image the distribution of equivalent excess masses of anomalous geological bodies with high resolution both longitudinally and transversely. In order to transform the equivalent excess masses into real density contrasts, we adopt adaptive correlation imaging for gravity data. After each 3D correlation imaging step, we convert the equivalent masses into density contrasts according to the linear relationship, and then carry out forward gravity calculation for each rectangular cell. Next, we compare the forward gravity data with the real data, and

  20. Radiographic test phantom for computed tomographic lung nodule analysis

    International Nuclear Information System (INIS)

    Zerhouni, E.A.

    1987-01-01

    This patent describes a method for evaluating a computed tomographic scan of a nodule in a lung of a human or non-human animal. The method comprises generating a computed tomograph of a transverse section of the animal containing lung and nodule tissue, and generating a second computed tomograph of a test phantom, a device which simulates the transverse section of the animal. The tissue-simulating portions of the device are constructed of materials having radiographic densities substantially identical to those of the corresponding tissues in the simulated transverse section, and have voids therein which simulate, in size and shape, the lung cavities in the transverse section. The voids contain a test reference nodule, constructed of a material of predetermined radiographic density, which simulates in size, shape and position within a lung cavity void of the test phantom the nodule in the transverse section of the animal. The respective tomographs are then compared.

  1. The self-adaptation to dynamic failures for efficient virtual organization formations in grid computing context

    International Nuclear Information System (INIS)

    Han Liangxiu

    2009-01-01

    Grid computing aims to enable 'resource sharing and coordinated problem solving in dynamic, multi-institutional virtual organizations (VOs)'. However, due to the nature of heterogeneous and dynamic resources, dynamic failures occur more often in the distributed grid environment than in traditional computing platforms, causing VO formations to fail. In this paper, we develop a novel self-adaptive mechanism for dynamic failures during VO formations. Such a self-adaptive scheme allows an individual member of a VO to automatically find another available or replaceable member once a failure happens, and therefore lets the system automatically recover from dynamic failures. We characterize the dynamic failure behaviour of a system using two standard indicators: mean time between failures (MTBF) and mean time to recover (MTTR), modeling both as Poisson distributions. We investigate and analyze the efficiency of the proposed self-adaptation mechanism by comparing the success probability of VO formations before and after adopting it in three different cases: (1) different failure situations; (2) different organizational structures and scales; and (3) different task complexities. The experimental results show that the proposed scheme can automatically adapt to dynamic failures and effectively improve dynamic VO formation performance in the event of node failures, which provides a valuable addition to the field.
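
    Under the steady-state reading of MTBF and MTTR, a node's availability is MTBF/(MTBF+MTTR), and the benefit of self-adaptive replacement can be estimated directly. The replacement-pool model below is a simplification assumed for illustration, not the paper's full model:

```python
def availability(mtbf, mttr):
    """Steady-state availability of a single node."""
    return mtbf / (mtbf + mttr)

def vo_success(mtbf, mttr, members, spares):
    """P(VO forms): each role succeeds if any of 1 + spares interchangeable
    candidate nodes is available (independent, identical nodes assumed)."""
    a = availability(mtbf, mttr)
    p_role = 1.0 - (1.0 - a) ** (1 + spares)
    return p_role ** members

print(vo_success(100.0, 10.0, members=5, spares=0))   # no self-adaptation: ~0.62
print(vo_success(100.0, 10.0, members=5, spares=2))   # with replacements: ~0.996
```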

  2. Adaptive Remodeling of Achilles Tendon: A Multi-scale Computational Model.

    Directory of Open Access Journals (Sweden)

    Stuart R Young

    2016-09-01

    Full Text Available While it is known that musculotendon units adapt to their load environments, there is only a limited understanding of tendon adaptation in vivo. Here we develop a computational model of tendon remodeling based on the premise that mechanical damage and tenocyte-mediated tendon damage and repair processes modify the distribution of its collagen fiber lengths. We explain how these processes enable the tendon to geometrically adapt to its load conditions. Based on known biological processes, mechanical and strain-dependent proteolytic fiber damage are incorporated into our tendon model. Using a stochastic model of fiber repair, it is assumed that mechanically damaged fibers are repaired longer, whereas proteolytically damaged fibers are repaired shorter, relative to their pre-damage length. To study adaptation of tendon properties to applied load, our model musculotendon unit is a simplified three-component Hill-type model of the human Achilles-soleus unit. Our model results demonstrate that the geometric equilibrium state of the Achilles tendon can coincide with minimization of the total metabolic cost of muscle activation. The proposed tendon model independently predicts rates of collagen fiber turnover that are in general agreement with in vivo experimental measurements. While the computational model here only represents a first step in a new approach to understanding the complex process of tendon remodeling in vivo, given these findings, it appears likely that the proposed framework may itself provide a useful theoretical foundation for developing valuable qualitative and quantitative insights into tendon physiology and pathology.

  3. A semiautomated computer-interactive dynamic impact testing system

    International Nuclear Information System (INIS)

    Alexander, D.J.; Nanstad, R.K.; Corwin, W.R.; Hutton, J.T.

    1989-01-01

    A computer-assisted semiautomated system has been developed for testing a variety of specimen types under dynamic impact conditions. The primary use of this system is for the testing of Charpy specimens. Full-, half-, and third-size specimens have been tested, both in the lab and remotely in a hot cell for irradiated specimens. Specimens are loaded into a transfer device which moves the specimen into a chamber, where a hot air gun is used to heat the specimen, or cold nitrogen gas is used for cooling, as required. The specimen is then quickly transferred from the furnace to the anvils and then broken. This system incorporates an instrumented tup to determine the change in voltage during the fracture process. These data are analyzed by the computer system after the test is complete. The voltage-time trace is recorded with a digital oscilloscope, transferred to the computer, and analyzed. The analysis program incorporates several unique features. It interacts with the operator and identifies the maximum voltage during the test, the amount of rapid fracture during the test (if any), and the end of the fracture process. The program then calculates the area to maximum voltage and the total area under the voltage-time curve. The data acquisition and analysis part of the system can also be used to conduct other dynamic testing. Dynamic tear and precracked specimens can be tested with an instrumented tup and analyzed in a similar manner. 3 refs., 7 figs
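
    The trace-reduction step described above is straightforward to reproduce: locate the maximum of the voltage-time record, then integrate up to the maximum and over the whole event with the trapezoid rule. The synthetic pulse below stands in for a real instrumented-tup trace:

```python
import numpy as np

def trapz(y, x):
    """Trapezoid-rule integral, written out for clarity."""
    return float(np.sum(np.diff(x) * (y[1:] + y[:-1]) / 2.0))

def analyze_trace(t, v):
    """Maximum voltage, area to maximum voltage, and total area under the curve."""
    imax = int(np.argmax(v))
    return v[imax], trapz(v[:imax + 1], t[:imax + 1]), trapz(v, t)

t = np.linspace(0.0, 2e-3, 500)                   # 2 ms fracture event
v = np.exp(-((t - 0.8e-3) / 3e-4) ** 2)           # synthetic load pulse
vmax, area_to_max, total_area = analyze_trace(t, v)
print(vmax, area_to_max, total_area)
```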

  4. Design Of Computer Based Test Using The Unified Modeling Language

    Science.gov (United States)

    Tedyyana, Agus; Danuri; Lidyawati

    2017-12-01

    Admission selection at Politeknik Negeri Bengkalis, through the interest and talent search (PMDK), the joint admission test for state polytechnics (SB-UMPN) and the independent route (UM-Polbeng), was conducted using paper-based tests (PBT). The paper-based test model has some weaknesses: it wastes paper, questions may leak to the public, and test results may be manipulated. This research aimed to create a Computer-Based Test (CBT) model using the Unified Modeling Language (UML), consisting of use case diagrams, activity diagrams and sequence diagrams. During the design of the application, particular attention was paid to protecting the test questions with passwords before they are shown, through an encryption and decryption process; the RSA cryptography algorithm was used for this. The questions drawn from the question bank were then randomized using the Fisher-Yates shuffle. The network architecture used in the CBT application was a client-server model over a Local Area Network (LAN). The result of the design was a Computer-Based Test application for admission selection at Politeknik Negeri Bengkalis.
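
    The randomization step named above, the Fisher-Yates shuffle, is short enough to show in full. The use of Python's `secrets` CSPRNG is an assumption; the paper does not specify the random source:

```python
import secrets

def fisher_yates(items):
    """Unbiased in-place shuffle; a CSPRNG is a sensible choice for exam items."""
    a = list(items)
    for i in range(len(a) - 1, 0, -1):
        j = secrets.randbelow(i + 1)    # uniform draw from 0..i inclusive
        a[i], a[j] = a[j], a[i]
    return a

print(fisher_yates(range(10)))          # e.g. a random ordering of question IDs
```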

  5. A multiple objective test assembly approach for exposure control problems in Computerized Adaptive Testing

    Directory of Open Access Journals (Sweden)

    Theo J.H.M. Eggen

    2010-01-01

    Full Text Available Overexposure and underexposure of items in the bank are serious problems in operational computerized adaptive testing (CAT) systems. These exposure problems may result in item compromise, or point to a waste of investments. The exposure control problem can be viewed as a test assembly problem with multiple objectives: information in the test has to be maximized, item compromise has to be minimized, and pool usage has to be optimized. In this paper, a multiple-objectives method is developed to deal with both types of exposure problems. In this method, exposure control parameters based on observed exposure rates are implemented as weights for the information in the item selection procedure. The method does not need time-consuming simulation studies, and it can be implemented conditional on ability level. The method is compared with the Sympson-Hetter method for exposure control, with the Progressive method, and with alpha-stratified testing. The results show that the method is successful in dealing with both kinds of exposure problems.
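
    One plausible reading of such a scheme (the exact weighting formula is not reproduced here) is to multiply each item's information by a factor that shrinks as its observed exposure rate exceeds a target:

```python
import numpy as np

def select_item(info, exposure, target=0.25):
    """Down-weight items whose observed exposure rate exceeds the target rate."""
    w = np.clip(target / np.maximum(exposure, 1e-9), 0.0, 1.0)
    return int(np.argmax(w * info))

info = np.array([1.2, 0.9, 1.1, 0.5])            # Fisher information at theta-hat
exposure = np.array([0.60, 0.10, 0.20, 0.05])    # observed exposure rates
print(select_item(info, exposure))               # index 2: informative, not overexposed
```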

  6. Test bank to accompany Computers data and processing

    CERN Document Server

    Deitel, Harvey M

    1980-01-01

    Test Bank to Accompany Computers and Data Processing provides a variety of questions from which instructors can easily custom-tailor exams appropriate for their particular courses. This book contains over 4000 short-answer questions that span the full range of topics for an introductory computing course. The book is organized into five parts encompassing 19 chapters, and provides a very large number of questions so that instructors can produce different exams testing essentially the same topics in succeeding semesters. Three types of questions are included, among them multiple choice…

  7. Concentrator optical characterization using computer mathematical modelling and point source testing

    Science.gov (United States)

    Dennison, E. W.; John, S. L.; Trentelman, G. F.

    1984-01-01

    The optical characteristics of a paraboloidal solar concentrator are analyzed using the intercept factor curve (a format for image data) to describe the results of a mathematical model and to represent reduced data from experimental testing. This procedure makes it possible not only to test an assembled concentrator, but also to evaluate single optical panels or to conduct non-solar tests of an assembled concentrator. The use of three-dimensional ray-tracing computer programs to calculate the mathematical model is described. These ray-tracing programs can include any type of optical configuration, from simple paraboloids to arrays of spherical facets, and can be adapted to microcomputers or larger computers, which can graphically display real-time comparisons of calculated and measured data.

  8. Analysing Test-Takers’ Views on a Computer-Based Speaking Test

    Directory of Open Access Journals (Sweden)

    Marian Amengual-Pizarro

    2017-11-01

    Full Text Available This study examines test-takers’ views on a computer-delivered speaking test in order to investigate the aspects they consider most relevant in technology-based oral assessment, and to explore the main advantages and disadvantages computer-based tests may offer as compared to face-to-face speaking tests. A small-scale open questionnaire was administered to 80 test-takers who took the APTIS speaking test at the Universidad de Alcalá in April 2016. Results reveal that examinees believe computer-based tests provide a valid measure of oral competence in English and are considered to be an adequate method for the assessment of speaking. Interestingly, the data suggest that personal characteristics of test-takers seem to play a key role in deciding upon the most suitable and reliable delivery mode.

  9. Adaptive tight frame based medical image reconstruction: a proof-of-concept study for computed tomography

    International Nuclear Information System (INIS)

    Zhou, Weifeng; Cai, Jian-Feng; Gao, Hao

    2013-01-01

    A popular approach for medical image reconstruction has been sparsity regularization, assuming the targeted image can be well approximated by sparse coefficients under some properly designed system. The wavelet tight frame is such a widely used system, due to its capability for sparsely approximating piecewise-smooth functions such as medical images. However, using a fixed system may not always be optimal for reconstructing a variety of diversified images. Recently, methods based on adaptive over-complete dictionaries that are specific to the structures of the targeted images have demonstrated superiority for image processing. This work develops an adaptive wavelet tight frame method for image reconstruction. The proposed scheme first constructs a task-specific adaptive wavelet tight frame, and then reconstructs the image of interest by solving an l1-regularized minimization problem using the constructed adaptive tight frame system. A proof-of-concept study is performed for computed tomography (CT), and the simulation results suggest that the adaptive tight frame method improves the reconstructed CT image quality over the traditional tight frame method. (paper)
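
    For a fixed orthonormal system, the l1-regularized problem can be solved with ISTA. The sketch below uses a DCT in place of the paper's (adaptive) tight frame, so it illustrates only the optimization loop, not the frame adaptation; the sizes and regularization weight are assumptions:

```python
import numpy as np
from scipy.fft import dct, idct

def ista(A, b, lam=0.05, iters=200):
    """ISTA for min 0.5*||Ax-b||^2 + lam*||Wx||_1, with W an orthonormal DCT."""
    W = lambda v: dct(v, norm="ortho")
    Wt = lambda c: idct(c, norm="ortho")
    L = np.linalg.norm(A, 2) ** 2                 # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        v = x - A.T @ (A @ x - b) / L             # gradient step on the data term
        c = W(v)
        c = np.sign(c) * np.maximum(np.abs(c) - lam / L, 0.0)  # soft threshold
        x = Wt(c)                                 # back to image domain
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((80, 120))                # underdetermined 'scanner'
coef = np.zeros(120)
coef[[3, 17, 40]] = [5.0, -3.0, 2.0]              # sparse frame coefficients
x_true = idct(coef, norm="ortho")                 # signal sparse under the DCT
x_rec = ista(A, A @ x_true)
print(np.linalg.norm(x_rec - x_true) / np.linalg.norm(x_true))  # small relative error
```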

  10. Supporting Student Learning in Computer Science Education via the Adaptive Learning Environment ALMA

    Directory of Open Access Journals (Sweden)

    Alexandra Gasparinatou

    2015-10-01

    Full Text Available This study presents the ALMA environment (Adaptive Learning Models from texts and Activities). ALMA supports the processes of learning and assessment via: (1) texts differing in local and global cohesion for students with low, medium, and high background knowledge; (2) activities corresponding to different levels of comprehension which prompt the student to practically implement different text-reading strategies, with the recommended activity sequence adapted to the student’s learning style; (3) an overall framework for informing, guiding, and supporting students in performing the activities; and (4) individualized support and guidance according to student-specific characteristics. ALMA also supports students in distance learning or in blended learning, in which face-to-face learning is supported by computer technology. The adaptive techniques provided via ALMA are: (a) adaptive presentation and (b) adaptive navigation. Digital learning material, in accordance with the text comprehension model described by Kintsch, was introduced into the ALMA environment. This material can be exploited in either distance or blended learning.

  11. Adaptive Radiotherapy Planning on Decreasing Gross Tumor Volumes as Seen on Megavoltage Computed Tomography Images

    International Nuclear Information System (INIS)

    Woodford, Curtis; Yartsev, Slav; Dar, A. Rashid; Bauman, Glenn; Van Dyk, Jake

    2007-01-01

    Purpose: To evaluate gross tumor volume (GTV) changes for patients with non-small-cell lung cancer by using daily megavoltage (MV) computed tomography (CT) studies acquired before each treatment fraction on helical tomotherapy, and to relate the potential benefit of adaptive image-guided radiotherapy to changes in GTV. Methods and Materials: Seventeen patients were prescribed 30 fractions of radiotherapy on helical tomotherapy for non-small-cell lung cancer at the London Regional Cancer Program from December 2005 to March 2007. The GTV was contoured on the daily MVCT studies of each patient. Adapted plans were created using merged MVCT-kilovoltage CT image sets to investigate the advantages of replanning for patients with differing GTV regression characteristics. Results: The average GTV change observed over 30 fractions was -38%, ranging from -12% to -87%. No significant correlation was observed between GTV change and the patient's physical or tumor features. The patterns of GTV change in the 17 patients could be divided broadly into three groups with distinctive potential for benefit from adaptive planning. Conclusions: Changes in GTV are difficult to predict quantitatively based on patient or tumor characteristics. If changes occur, there are points in time during the treatment course when it may be appropriate to adapt the plan to improve sparing of normal tissues. If the GTV decreases by more than 30% at any point in the first 20 fractions of treatment, adaptive planning is appropriate to further improve the therapeutic ratio.

  12. Moving finite elements: A continuously adaptive method for computational fluid dynamics

    International Nuclear Information System (INIS)

    Glasser, A.H.; Miller, K.; Carlson, N.

    1991-01-01

    Moving Finite Elements (MFE), a recently developed method for computational fluid dynamics, promises major advances in the ability of computers to model the complex behavior of liquids, gases, and plasmas. Applications of computational fluid dynamics occur in a wide range of scientifically and technologically important fields. Examples include meteorology, oceanography, global climate modeling, magnetic and inertial fusion energy research, semiconductor fabrication, biophysics, automobile and aircraft design, industrial fluid processing, chemical engineering, and combustion research. The improvements made possible by the new method could thus have substantial economic impact. Moving Finite Elements is a moving-node adaptive grid method which has a tendency to pack the grid finely in regions where it is most needed at each time and to leave it coarse elsewhere. It does so in a manner which is simple and automatic, and does not require a large amount of human ingenuity to apply it to each particular problem. At the same time, it often allows the time step to be large enough to advance a moving shock by many shock thicknesses in a single time step, moving the grid smoothly with the solution and minimizing the number of time steps required for the whole problem. For 2D problems (two spatial variables) the grid is composed of irregularly shaped and irregularly connected triangles which are very flexible in their ability to adapt to the evolving solution. While other adaptive grid methods have been developed which share some of these desirable properties, this is the only method which combines them all. In many cases, the method can save orders of magnitude of computing time, equivalent to several generations of advancing computer hardware.

  13. Comparison of tests of accommodation for computer users.

    Science.gov (United States)

    Kolker, David; Hutchinson, Robert; Nilsen, Erik

    2002-04-01

    With the increased use of computers in the workplace and at home, optometrists are finding more patients presenting with symptoms of Computer Vision Syndrome. Among these symptomatic individuals, research supports that accommodative disorders are the most common vision finding. A prepresbyopic group (N = 30) and a presbyopic group (N = 30) were selected from a private practice. Assignment to a group was determined by age, accommodative amplitude, and near visual acuity with their distance prescription. Each subject was given a thorough vision and ocular health examination, then administered several nearpoint tests of accommodation at a computer working distance. All the tests produced similar results in the presbyopic group. For the prepresbyopic group, the tests yielded very different results. To effectively treat symptomatic VDT users, optometrists must assess the accommodative system along with the binocular and refractive status. For presbyopic patients, all nearpoint tests studied will yield virtually the same result. However, the method of testing accommodation, as well as the test stimulus presented, will yield significantly different responses for prepresbyopic patients. Previous research indicates that a majority of patients prefer the higher plus prescription yielded by the Gaussian image test.

  14. Evaluation of Computer Based Testing in lieu of Regular Examinations in Computer Literacy

    Science.gov (United States)

    Murayama, Koichi

    Because computer based testing (CBT) has many advantages compared with the conventional paper and pencil testing (PPT) examination method, CBT has begun to be used in various situations in Japan, such as in qualifying examinations and in the TOEFL. This paper describes the usefulness and the problems of CBT applied to a regular college examination. The regular computer literacy examinations for first year students were held using CBT, and the results were analyzed. Responses to a questionnaire indicated many students accepted CBT with no unpleasantness and considered CBT a positive factor, improving their motivation to study. CBT also decreased the work of faculty in terms of marking tests and reducing data.

  15. Modifications to Langley 0.3-m TCT adaptive wall software for heavy gas test medium, phase 1 studies

    Science.gov (United States)

    Murthy, A. V.

    1992-01-01

    The scheme for two-dimensional wall adaptation with sulfur hexafluoride (SF6) as test gas in the NASA Langley Research Center 0.3-m Transonic Cryogenic Tunnel (0.3-m TCT) is presented. A unified version of the wall adaptation software has been developed to function in a dual gas operation mode (nitrogen or SF6). The feature of ideal gas calculations for nitrogen operation is retained. For SF6 operation, real gas properties have been computed using the departure function technique. Installation of the software on the 0.3-m TCT ModComp-A computer and preliminary validation with nitrogen operation were found to be satisfactory. Further validation and improvements to the software will be undertaken when the 0.3-m TCT is ready for operation with SF6 gas.

  16. Specifying colours for colour vision testing using computer graphics.

    Science.gov (United States)

    Toufeeq, A

    2004-10-01

This paper describes a novel test of colour vision using a standard personal computer, which is simple and reliable to perform. Twenty healthy individuals with normal colour vision and 10 healthy individuals with a red/green colour defect were tested binocularly at 13 selected points in the CIE (Commission Internationale de l'Eclairage, 1931) chromaticity triangle, representing the gamut of a computer monitor, where the x, y coordinates of the primary colour phosphors were known. The mean results from individuals with normal colour vision were compared to those with defective colour vision. Of the 13 points tested, five demonstrated consistently high sensitivity in detecting colour defects. The test may provide a convenient method for classifying colour vision abnormalities.
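    A brief Python sketch of the underlying colorimetry: mapping a linear RGB triple to CIE 1931 (x, y) chromaticity given the monitor's primary coordinates. The phosphor chromaticities and white point below are illustrative sRGB-like values, not the monitor measured in the paper.

```python
import numpy as np

# Illustrative phosphor chromaticities (sRGB-like); the paper's monitor
# would have its own measured (x, y) for each primary and the white point.
primaries = {'R': (0.64, 0.33), 'G': (0.30, 0.60), 'B': (0.15, 0.06)}
white = (0.3127, 0.3290)  # D65

def rgb_to_xy(rgb):
    """Map a linear RGB triple to CIE 1931 (x, y) chromaticity."""
    # Build an XYZ column for each primary, then scale so white maps to Y=1.
    M = []
    for x, y in primaries.values():
        M.append([x / y, 1.0, (1 - x - y) / y])
    M = np.array(M).T
    xw, yw = white
    Xw = np.array([xw / yw, 1.0, (1 - xw - yw) / yw])
    S = np.linalg.solve(M, Xw)          # per-primary scale factors
    XYZ = (M * S) @ np.asarray(rgb, float)
    return XYZ[0] / XYZ.sum(), XYZ[1] / XYZ.sum()

print(rgb_to_xy([1.0, 0.0, 0.0]))  # recovers the red primary (0.64, 0.33)
```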

  17. Adapting the Critical Thinking Assessment Test for Palestinian Universities

    Science.gov (United States)

    Basha, Sami; Drane, Denise; Light, Gregory

    2016-01-01

    Critical thinking is a key learning outcome for Palestinian students. However, there are no validated critical thinking tests in Arabic. Suitability of the US developed Critical Thinking Assessment Test (CAT) for use in Palestine was assessed. The test was piloted with university students in English (n = 30) and 4 questions were piloted in Arabic…

  18. Digital computed radiography in industrial X-ray testing

    International Nuclear Information System (INIS)

    Osterloh, K.; Onel, Y.; Zscherpel, U.; Ewert, U.

    2001-01-01

    Computed radiography is used for X-ray testing in many industrial applications. There are different systems depending on the application, e.g. fast systems for detection of material inhomogeneities and slower systems with higher local resolution for detection of cracks and fine details, e.g. in highly stressed areas or in welded seams. The method is more dynamic than film methods, and digital image processing is possible during testing [de

  19. Sample test cases using the environmental computer code NECTAR

    International Nuclear Information System (INIS)

    Ponting, A.C.

    1984-06-01

    This note demonstrates a few of the many different ways in which the environmental computer code NECTAR may be used. Four sample test cases are presented and described to show how NECTAR input data are structured. Edited output is also presented to illustrate the format of the results. Two test cases demonstrate how NECTAR may be used to study radio-isotopes not explicitly included in the code. (U.K.)

  20. Security Considerations and Recommendations in Computer-Based Testing

    Directory of Open Access Journals (Sweden)

    Saleh M. Al-Saleem

    2014-01-01

Full Text Available Many organizations and institutions around the globe are moving or planning to move their paper-and-pencil based testing to computer-based testing (CBT). However, this conversion will not be the best option for all kinds of exams and it will require significant resources. These resources may include the preparation of item banks, methods for test delivery, procedures for test administration, and last but not least test security. Security aspects may include but are not limited to the identification and authentication of the examinee, the risks that are associated with cheating on the exam, and the procedures related to test delivery to the examinee. This paper will mainly investigate the security considerations associated with CBT and will provide some recommendations for the security of these kinds of tests. We will also propose a palm-based biometric authentication system incorporated with a basic authentication system (username/password) in order to check the identity and authenticity of the examinee.

  1. Security considerations and recommendations in computer-based testing.

    Science.gov (United States)

    Al-Saleem, Saleh M; Ullah, Hanif

    2014-01-01

    Many organizations and institutions around the globe are moving or planning to move their paper-and-pencil based testing to computer-based testing (CBT). However, this conversion will not be the best option for all kinds of exams and it will require significant resources. These resources may include the preparation of item banks, methods for test delivery, procedures for test administration, and last but not least test security. Security aspects may include but are not limited to the identification and authentication of examinee, the risks that are associated with cheating on the exam, and the procedures related to test delivery to the examinee. This paper will mainly investigate the security considerations associated with CBT and will provide some recommendations for the security of these kinds of tests. We will also propose a palm-based biometric authentication system incorporated with basic authentication system (username/password) in order to check the identity and authenticity of the examinee.
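    A schematic Python sketch of the two-factor idea proposed above: a salted password hash combined with a palm-biometric match score. The matcher, threshold, and all names are hypothetical placeholders, not the authors' system.

```python
import hashlib, hmac, os

def hash_password(password: str, salt: bytes) -> bytes:
    # Standard salted key-derivation for password storage.
    return hashlib.pbkdf2_hmac('sha256', password.encode(), salt, 100_000)

def authenticate(stored, password, palm_score, threshold=0.85):
    """Both factors must pass: the password hash and the biometric score.
    `palm_score` would come from a palm-print matcher (hypothetical here)."""
    candidate = hash_password(password, stored['salt'])
    password_ok = hmac.compare_digest(candidate, stored['hash'])
    biometric_ok = palm_score >= threshold   # placeholder decision rule
    return password_ok and biometric_ok

salt = os.urandom(16)
stored = {'salt': salt, 'hash': hash_password('s3cret', salt)}
print(authenticate(stored, 's3cret', palm_score=0.91))  # True
print(authenticate(stored, 's3cret', palm_score=0.40))  # False: biometric fails
```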

  2. A Look at Computer-Assisted Testing Operations. The Illinois Series on Educational Application of Computers, No. 12e.

    Science.gov (United States)

    Muiznieks, Viktors; Dennis, J. Richard

    In computer assisted test construction (CATC) systems, the computer is used to perform the mechanical aspects of testing while the teacher retains control over question content. Advantages of CATC systems include question banks, decreased importance of test item security, computer analysis and response to student test answers, item analysis…

  3. Evolution of a Computer-Based Testing Laboratory

    Science.gov (United States)

    Moskal, Patrick; Caldwell, Richard; Ellis, Taylor

    2009-01-01

    In 2003, faced with increasing growth in technology-based and large-enrollment courses, the College of Business Administration at the University of Central Florida opened a computer-based testing lab to facilitate administration of course examinations. Patrick Moskal, Richard Caldwell, and Taylor Ellis describe the development and evolution of the…

  4. Teste de Conconi adaptado para bicicleta aquática Conconi test adapted to aquatic bicycle

    Directory of Open Access Journals (Sweden)

    Jonas Neves Martins

    2007-10-01

Full Text Available Regular physical exercise has been considered one of the mechanisms that help improve health and quality of life. As a consequence of the growing demand for fitness centers, physical activities in the aquatic environment, especially the aquatic bicycle, have increased in recent years. However, methods for the assessment and prescription of aerobic training on this type of equipment are still scarce. The objective of this study was to propose an adaptation of the test of Conconi et al. (1982) for the aquatic bicycle. Twenty-seven participants (24 ± 6 years, 171 ± 8 cm, 66 ± 12 kg; 15 male and 12 female) were tested. The participants were submitted to a graded test on an aquatic bicycle, with an initial load of 50 RPM and increments of 3 RPM each minute, until exhaustion. HR was recorded during the entire test. For data analysis, descriptive statistics were used, as well as Student's t test for comparison between genders. The heart rate deflection point (HRDP) was identified in 85% of the subjects. There were no significant differences in HRmax (181 ± 12

  5. Adaptation.

    Science.gov (United States)

    Broom, Donald M

    2006-01-01

The term adaptation is used in biology in three different ways. It may refer to changes which occur at the cell and organ level, or at the individual level, or at the level of gene action and evolutionary processes. Adaptation by cells, especially nerve cells, helps in: communication within the body, the distinguishing of stimuli, the avoidance of overload and the conservation of energy. The time course and complexity of these mechanisms vary. Adaptive characters of organisms, including adaptive behaviours, increase fitness, so this adaptation is evolutionary. The major part of this paper concerns adaptation by individuals and its relationships to welfare. In complex animals, feed-forward control is widely used. Individuals predict problems and adapt by acting before the environmental effect is substantial. Much of adaptation involves brain control and animals have a set of needs, located in the brain and acting largely via motivational mechanisms, to regulate life. Needs may be for resources but are also for actions and stimuli which are part of the mechanism which has evolved to obtain the resources. Hence pigs do not just need food but need to be able to carry out actions like rooting in earth or manipulating materials which are part of foraging behaviour. The welfare of an individual is its state as regards its attempts to cope with its environment. This state includes various adaptive mechanisms including feelings and those which cope with disease. The part of welfare which is concerned with coping with pathology is health. Disease, which implies some significant effect of pathology, always results in poor welfare. Welfare varies over a range from very good, when adaptation is effective and there are feelings of pleasure or contentment, to very poor. A key point concerning the concept of individual adaptation in relation to welfare is that welfare may be good or poor while adaptation is occurring. Some adaptation is very easy and energetically cheap and

  6. The advanced computational testing and simulation toolkit (ACTS)

    International Nuclear Information System (INIS)

    Drummond, L.A.; Marques, O.

    2002-01-01

    During the past decades there has been a continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Distinctively, a number of these are important scientific problems ranging in scale from the atomic to the cosmic. For example, ionization is a phenomenon as ubiquitous in modern society as the glow of fluorescent lights and the etching on silicon computer chips; but it was not until 1999 that researchers finally achieved a complete numerical solution to the simplest example of ionization, the collision of a hydrogen atom with an electron. On the opposite scale, cosmologists have long wondered whether the expansion of the Universe, which began with the Big Bang, would ever reverse itself, ending the Universe in a Big Crunch. In 2000, analysis of new measurements of the cosmic microwave background radiation showed that the geometry of the Universe is flat, and thus the Universe will continue expanding forever. Both of these discoveries depended on high performance computer simulations that utilized computational tools included in the Advanced Computational Testing and Simulation (ACTS) Toolkit. The ACTS Toolkit is an umbrella project that brought together a number of general purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools, which have been developed independently, mainly at DOE laboratories, make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. The ACTS Toolkit Project enables the use of these tools by a much wider community of computational scientists, and promotes code portability, reusability, reduction of duplicate efforts

  7. The advanced computational testing and simulation toolkit (ACTS)

    Energy Technology Data Exchange (ETDEWEB)

    Drummond, L.A.; Marques, O.

    2002-05-21

    During the past decades there has been a continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Distinctively, a number of these are important scientific problems ranging in scale from the atomic to the cosmic. For example, ionization is a phenomenon as ubiquitous in modern society as the glow of fluorescent lights and the etching on silicon computer chips; but it was not until 1999 that researchers finally achieved a complete numerical solution to the simplest example of ionization, the collision of a hydrogen atom with an electron. On the opposite scale, cosmologists have long wondered whether the expansion of the Universe, which began with the Big Bang, would ever reverse itself, ending the Universe in a Big Crunch. In 2000, analysis of new measurements of the cosmic microwave background radiation showed that the geometry of the Universe is flat, and thus the Universe will continue expanding forever. Both of these discoveries depended on high performance computer simulations that utilized computational tools included in the Advanced Computational Testing and Simulation (ACTS) Toolkit. The ACTS Toolkit is an umbrella project that brought together a number of general purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools, which have been developed independently, mainly at DOE laboratories, make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. The ACTS Toolkit Project enables the use of these tools by a much wider community of computational scientists, and promotes code portability, reusability, reduction of duplicate efforts

  8. Construction of a 2- by 2-foot transonic adaptive-wall test section at the NASA Ames Research Center

    Science.gov (United States)

    Morgan, Daniel G.; Lee, George

    1986-01-01

The development of a new production-size, two-dimensional, adaptive-wall test section with ventilated walls at the NASA Ames Research Center is described. The new facility incorporates rapid closed-loop operation, computer/sensor integration, and on-line interference assessment and wall corrections. Air flow through the test section is controlled by a series of plenum compartments and three-way slide valves. A fast-scan laser velocimeter was built to measure velocity boundary conditions for the interference assessment scheme. A 15.2-cm- (6.0-in.-) chord NACA 0012 airfoil model will be used in the first experiments during calibration of the facility.

  9. COMPUTATION FORMAT computer codes X4TOC4 and PLOTC4. Implementing and Testing on a Personal Computer

    International Nuclear Information System (INIS)

    McLaughlin, P.K.

    1987-05-01

    This document describes the contents of the diskette containing the COMPUTATION FORMAT codes X4TOC4 and PLOTC4 by D.E. Cullen, and example data for use in implementing and testing these codes on a Personal Computer of the type IBM-PC/AT. Upon request the codes are available from the IAEA Nuclear Data Section, free of charge, on a single diskette. (author)

  10. Communicative Language Testing: Implications for Computer Based Language Testing in French for Specific Purposes

    Science.gov (United States)

    García Laborda, Jesús; López Santiago, Mercedes; Otero de Juan, Nuria; Álvarez Álvarez, Alfredo

    2014-01-01

    Current evolutions of language testing have led to integrating computers in FSP assessments both in oral and written communicative tasks. This paper deals with two main issues: learners' expectations about the types of questions in FSP computer based assessments and the relation with their own experience. This paper describes the experience of 23…

  11. Wireless Adaptive Therapeutic TeleGaming in a Pervasive Computing Environment

    Science.gov (United States)

    Peters, James F.; Szturm, Tony; Borkowski, Maciej; Lockery, Dan; Ramanna, Sheela; Shay, Barbara

This chapter introduces a wireless, pervasive computing approach to adaptive therapeutic telegaming considered in the context of near set theory. Near set theory provides a formal basis for observation, comparison and classification of perceptual granules. A perceptual granule is defined by a collection of objects that are graspable by the senses or by the mind. In the proposed pervasive computing approach to telegaming, a handicapped person (e.g., stroke patient with limited hand, finger, arm function) plays a video game by interacting with familiar instrumented objects such as cups, cutlery, soccer balls, nozzles, screw top-lids, spoons, so that the technology that makes therapeutic exercise game-playing possible is largely invisible (Archives of Physical Medicine and Rehabilitation 89:2213-2217, 2008). The basic approach to adaptive learning (AL) in the proposed telegaming environment is ethology-inspired and is quite different from the traditional approach to reinforcement learning. In biologically-inspired learning, organisms learn to achieve some goal by durable modification of behaviours in response to signals from the environment resulting from specific experiences (Animal Behavior, 1995). The term adaptive is used here in an ethological sense, where learning by an organism results from modifying behaviour in response to perceived changes in the environment. To instill adaptivity in a video game, it is assumed that learning by a video game is episodic. During an episode, the behaviour of a player is measured indirectly by tracking the occurrence of gaming events such as a hit or a miss of a target (e.g., hitting a moving ball with a game paddle). An ethogram provides a record of behaviour feature values that provides the basis for a functional registry used to adapt gaming to handicapped players. An important practical application of adaptive gaming is therapeutic rehabilitation exercise carried out in parallel with playing action video games. Enjoyable and

  12. Adaptive cognitive testing in cerebrovascular disease and vascular dementia

    NARCIS (Netherlands)

    Wouters, Hans; de Koning, Inge; Zwinderman, Aeilko H; van Gool, Willem A; Schmand, Ben; Buiter, Maarten; Lindeboom, Robert

    2009-01-01

    BACKGROUND/AIMS: To examine whether brevity can be combined with precision in measuring global cognitive ability in patients with cerebrovascular disease (CVD) or vascular dementia (VaD). Longer tests (e.g. the CAMCOG) are precise but inefficient, whereas brief tests (e.g. the MMSE) are efficient

  13. Adaptation

    International Development Research Centre (IDRC) Digital Library (Canada)

[Fragmentary abstract] Building skills, knowledge, or networks on adaptation; the African partners leading the AfricaAdapt network, together with the UK-based Institute of Development Studies; the UNCCD Secretariat, Regional Coordination Unit for Africa, Tunis, Tunisia; rural-urban cooperation on water management.

  14. Computer Based Test Untuk Seleksi Masuk Politeknik Negeri Bengkalis

    Directory of Open Access Journals (Sweden)

    Agus Tedyyana

    2017-11-01

Full Text Available Selection of new student candidates can be carried out with a Computer Based Test (CBT) application. The methods used include data collection techniques, system analysis, design modeling, implementation, and testing. This research produced a CBT application in which questions drawn from the item bank are randomized by the Fisher-Yates Shuffle method, so that no question is presented twice. To secure the question data while connected to the network, a message-encoding technique is required so that questions pass through encryption and decryption before being displayed; the RSA cryptographic algorithm is used for this purpose. The software was designed using the waterfall model, the database using entity-relationship diagrams, and the interface using hypertext markup language (HTML), Cascading Style Sheets (CSS), and jQuery; the system was implemented as a web application using the PHP programming language and a MySQL database. The network architecture used by the Computer Based Test application is a client-server model on a Local Area Network (LAN). Keywords: Computer Based Test, Fisher-Yates Shuffle, Cryptography, Local Area Network
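    A self-contained Python sketch of the Fisher-Yates shuffle used above to randomize item selection; the item IDs are illustrative, and the record's RSA encryption step is omitted here.

```python
import random

def fisher_yates(items):
    """In-place Fisher-Yates shuffle: every one of the n! orderings is
    equally likely, so no examinee receives a biased question order."""
    a = list(items)
    for i in range(len(a) - 1, 0, -1):
        j = random.randint(0, i)       # uniform index in [0, i]
        a[i], a[j] = a[j], a[i]
    return a

question_ids = list(range(1, 11))      # ten illustrative item-bank IDs
print(fisher_yates(question_ids))
```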

  15. Computational analysis in support of the SSTO flowpath test

    Science.gov (United States)

    Duncan, Beverly S.; Trefny, Charles J.

    1994-10-01

    A synergistic approach of combining computational methods and experimental measurements is used in the analysis of a hypersonic inlet. There are four major focal points within this study which examine the boundary layer growth on a compression ramp upstream of the cowl lip of a scramjet inlet. Initially, the boundary layer growth on the NASP Concept Demonstrator Engine (CDE) is examined. The follow-up study determines the optimum diverter height required by the SSTO Flowpath test to best duplicate the CDE results. These flow field computations are then compared to the experimental measurements and the mass average Mach number is determined for this inlet.

  16. Processing-Efficient Distributed Adaptive RLS Filtering for Computationally Constrained Platforms

    Directory of Open Access Journals (Sweden)

    Noor M. Khan

    2017-01-01

Full Text Available In this paper, a novel processing-efficient architecture of a group of inexpensive and computationally limited small platforms is proposed for a parallely distributed adaptive signal processing (PDASP) operation. The proposed architecture runs computationally expensive procedures, such as the complex adaptive recursive least squares (RLS) algorithm, cooperatively. The proposed PDASP architecture operates properly even if perfect time alignment among the participating platforms is not available. An RLS algorithm applied to MIMO channel estimation is deployed on the proposed architecture. The complexity and processing time of the PDASP scheme with the MIMO RLS algorithm are compared with those of the sequentially operated MIMO RLS algorithm and the linear Kalman filter. It is observed that the PDASP scheme exhibits much lower computational complexity than the sequential MIMO RLS algorithm as well as the Kalman filter. Moreover, the proposed architecture reduces processing time by 95.83% and 82.29% compared to the sequentially operated Kalman filter and MIMO RLS algorithm, respectively, for low Doppler rates. Likewise, for high Doppler rates, the proposed architecture achieves 94.12% and 77.28% reductions in processing time compared to the Kalman and RLS algorithms, respectively.
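    For reference, a compact NumPy sketch of the exponentially weighted RLS recursion that such schemes distribute; the forgetting factor, initialization, and test signal below are illustrative, not the paper's PDASP partitioning.

```python
import numpy as np

def rls(X, d, lam=0.99, delta=100.0):
    """Exponentially weighted recursive least squares.
    X: (T, n) regressor rows, d: (T,) desired outputs."""
    n = X.shape[1]
    w = np.zeros(n)            # filter weights
    P = delta * np.eye(n)      # inverse correlation matrix estimate
    for x, dk in zip(X, d):
        k = P @ x / (lam + x @ P @ x)     # gain vector
        e = dk - w @ x                    # a priori error
        w = w + k * e
        P = (P - np.outer(k, x @ P)) / lam
    return w

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 4))
w_true = np.array([0.5, -1.0, 2.0, 0.3])
d = X @ w_true + 0.01 * rng.standard_normal(500)
print(rls(X, d).round(2))   # converges toward w_true
```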

  17. 3D-SoftChip: A Novel Architecture for Next-Generation Adaptive Computing Systems

    Directory of Open Access Journals (Sweden)

    Lee Mike Myung-Ok

    2006-01-01

    Full Text Available This paper introduces a novel architecture for next-generation adaptive computing systems, which we term 3D-SoftChip. The 3D-SoftChip is a 3-dimensional (3D vertically integrated adaptive computing system combining state-of-the-art processing and 3D interconnection technology. It comprises the vertical integration of two chips (a configurable array processor and an intelligent configurable switch through an indium bump interconnection array (IBIA. The configurable array processor (CAP is an array of heterogeneous processing elements (PEs, while the intelligent configurable switch (ICS comprises a switch block, 32-bit dedicated RISC processor for control, on-chip program/data memory, data frame buffer, along with a direct memory access (DMA controller. This paper introduces the novel 3D-SoftChip architecture for real-time communication and multimedia signal processing as a next-generation computing system. The paper further describes the advanced HW/SW codesign and verification methodology, including high-level system modeling of the 3D-SoftChip using SystemC, being used to determine the optimum hardware specification in the early design stage.

  18. A Novel adaptative Discrete Cuckoo Search Algorithm for parameter optimization in computer vision

    Directory of Open Access Journals (Sweden)

    loubna benchikhi

    2017-10-01

Full Text Available Computer vision applications require choosing operators and their parameters in order to produce the best outcomes. Often, users must draw on expert knowledge and manually experiment with many combinations to find the best one. As performance, time, and accuracy are important, it is necessary to automate parameter optimization at least for crucial operators. In this paper, a novel approach based on an adaptive discrete cuckoo search algorithm (ADCS) is proposed. It automates the process of setting algorithm parameters and provides optimal parameters for vision applications. This work reconsiders a discretization problem to adapt the cuckoo search algorithm and presents the procedure of parameter optimization. Experiments on real examples and comparisons with other metaheuristic-based approaches, namely particle swarm optimization (PSO), reinforcement learning (RL), and ant colony optimization (ACO), show the efficiency of this novel method.
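    A simplified discrete cuckoo search sketch in Python to make the idea concrete; it omits the paper's adaptive step-size control, and the fitness function, grid, and rates are illustrative stand-ins.

```python
import random

def cuckoo_search(grid, fitness, n_nests=15, p_abandon=0.25, iters=100):
    """Simplified discrete cuckoo search: each nest is an index tuple into
    `grid` (one candidate value list per parameter). Not the paper's
    adaptive variant; rates and step sizes are illustrative."""
    dims = [len(g) for g in grid]
    rand_nest = lambda: tuple(random.randrange(d) for d in dims)
    decode = lambda nest: [g[i] for g, i in zip(grid, nest)]
    nests = [rand_nest() for _ in range(n_nests)]
    best = min(nests, key=lambda n: fitness(decode(n)))
    for _ in range(iters):
        for k, nest in enumerate(nests):
            # generate a new solution by a small random walk around the nest
            new = tuple(min(d - 1, max(0, i + random.choice((-1, 0, 1))))
                        for i, d in zip(nest, dims))
            if fitness(decode(new)) < fitness(decode(nests[k])):
                nests[k] = new
        # abandon a fraction of the worst nests, replacing them randomly
        nests.sort(key=lambda n: fitness(decode(n)))
        for k in range(int(p_abandon * n_nests)):
            nests[-(k + 1)] = rand_nest()
        best = min(nests + [best], key=lambda n: fitness(decode(n)))
    return decode(best)

# toy use: tune two thresholds of a hypothetical edge detector
grid = [list(range(0, 256, 8)), list(range(0, 256, 8))]
score = lambda p: abs(p[0] - 100) + abs(p[1] - 200)  # stand-in quality metric
print(cuckoo_search(grid, score))
```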

  19. Quality Assurance Challenges for Motion-Adaptive Radiation Therapy: Gating, Breath Holding, and Four-Dimensional Computed Tomography

    International Nuclear Information System (INIS)

    Jiang, Steve B.; Wolfgang, John; Mageras, Gig S.

    2008-01-01

Compared with conventional three-dimensional (3D) conformal radiation therapy and intensity-modulated radiation therapy treatments, quality assurance (QA) for motion-adaptive radiation therapy involves various challenges because of the added temporal dimension. Here we discuss those challenges for three specific techniques related to motion-adaptive therapy, namely respiratory gating, breath holding, and four-dimensional computed tomography. As with the introduction of any other new technology in clinical practice, typical QA measures should be taken for these techniques as well, including initial testing of equipment and clinical procedures, as well as frequent QA examinations during the early stage of implementation. Here, rather than covering every QA aspect in depth, we focus on some major QA challenges. The biggest QA challenge for gating and breath holding is how to ensure treatment accuracy when internal target position is predicted using external surrogates. Recommended QA measures for each component of treatment, including simulation, planning, patient positioning, and treatment delivery and verification, are discussed. For four-dimensional computed tomography, some major QA challenges are also discussed.

  20. Pipelining Computational Stages of the Tomographic Reconstructor for Multi-Object Adaptive Optics on a Multi-GPU System

    KAUST Repository

    Charara, Ali; Ltaief, Hatem; Gratadour, Damien; Keyes, David E.; Sevin, Arnaud; Abdelfattah, Ahmad; Gendron, Eric; Morel, Carine; Vidal, Fabrice

    2014-01-01

    called MOSAIC has been proposed to perform multi-object spectroscopy using the Multi-Object Adaptive Optics (MOAO) technique. The core implementation of the simulation lies in the intensive computation of a tomographic reconstruct or (TR), which is used

1. L₁ Adaptive Control Design for NASA AirSTAR Flight Test Vehicle

    Science.gov (United States)

    Gregory, Irene M.; Cao, Chengyu; Hovakimyan, Naira; Zou, Xiaotian

    2009-01-01

In this paper we present a new L₁ adaptive control architecture that directly compensates for matched as well as unmatched system uncertainty. To evaluate the L₁ adaptive controller, we take advantage of the flexible research environment with rapid prototyping and testing of control laws in the Airborne Subscale Transport Aircraft Research system at the NASA Langley Research Center. We apply the L₁ adaptive control laws to the subscale turbine-powered Generic Transport Model. The presented results are from a full nonlinear simulation of the Generic Transport Model and some preliminary pilot evaluations of the L₁ adaptive control law.

  2. Adaptation of MPDATA Heterogeneous Stencil Computation to Intel Xeon Phi Coprocessor

    Directory of Open Access Journals (Sweden)

    Lukasz Szustak

    2015-01-01

Full Text Available The multidimensional positive definite advection transport algorithm (MPDATA) belongs to the group of nonoscillatory forward-in-time algorithms and performs a sequence of stencil computations. MPDATA is one of the major parts of the dynamic core of the EULAG geophysical model. In this work, we outline an approach to adapting the 3D MPDATA algorithm to the Intel MIC architecture. In order to utilize available computing resources, we propose a (3 + 1)D decomposition of MPDATA heterogeneous stencil computations. This approach is based on a combination of loop tiling and loop fusion techniques. It allows us to ease memory and communication bounds and better exploit the theoretical floating-point efficiency of target computing platforms. An important method of improving the efficiency of the (3 + 1)D decomposition is partitioning of available cores/threads into work teams. It permits reducing inter-cache communication overheads. This method also increases opportunities for the efficient distribution of MPDATA computation onto available resources of the Intel MIC architecture, as well as Intel CPUs. We discuss preliminary performance results obtained on two hybrid platforms, each containing two CPUs and an Intel Xeon Phi. The top-of-the-line Intel Xeon Phi 7120P gives the best performance results, and executes MPDATA almost 2 times faster than two Intel Xeon E5-2697v2 CPUs.
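    A small Python/NumPy sketch of loop tiling for a single 3D stencil sweep, the building block behind the (3 + 1)D decomposition described above. The stencil and tile size are illustrative; real MPDATA kernels fuse several such stencils and run them in compiled code.

```python
import numpy as np

def tiled_laplacian_sweep(u, tile=32):
    """One 7-point-stencil sweep over a 3D grid, processed tile by tile.
    Tiling keeps each working set cache-resident; MPDATA's (3+1)D
    decomposition applies the same idea across a sequence of stencils."""
    out = np.zeros_like(u)
    nz, ny, nx = u.shape
    for k0 in range(1, nz - 1, tile):
        for j0 in range(1, ny - 1, tile):
            for i0 in range(1, nx - 1, tile):
                k1 = min(k0 + tile, nz - 1)
                j1 = min(j0 + tile, ny - 1)
                i1 = min(i0 + tile, nx - 1)
                c = u[k0:k1, j0:j1, i0:i1]
                out[k0:k1, j0:j1, i0:i1] = (
                    u[k0-1:k1-1, j0:j1, i0:i1] + u[k0+1:k1+1, j0:j1, i0:i1] +
                    u[k0:k1, j0-1:j1-1, i0:i1] + u[k0:k1, j0+1:j1+1, i0:i1] +
                    u[k0:k1, j0:j1, i0-1:i1-1] + u[k0:k1, j0:j1, i0+1:i1+1] -
                    6.0 * c)
    return out

u = np.random.rand(64, 64, 64)
print(tiled_laplacian_sweep(u).shape)
```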

  3. A Conceptual Architecture for Adaptive Human-Computer Interface of a PT Operation Platform Based on Context-Awareness

    Directory of Open Access Journals (Sweden)

    Qing Xue

    2014-01-01

Full Text Available We present a conceptual architecture for an adaptive human-computer interface of a PT operation platform based on context-awareness. This architecture will form the basis of the design for such an interface. This paper describes the components, key technologies, and working principles of the architecture. The critical contents cover context information modeling and processing, establishing relationships between contexts and interface design knowledge through adaptive knowledge reasoning, and implementing visualization of the adaptive interface with the aid of interface tools technology.

  4. Testing and Validation of Computational Methods for Mass Spectrometry.

    Science.gov (United States)

    Gatto, Laurent; Hansen, Kasper D; Hoopmann, Michael R; Hermjakob, Henning; Kohlbacher, Oliver; Beyer, Andreas

    2016-03-04

High-throughput methods based on mass spectrometry (proteomics, metabolomics, lipidomics, etc.) produce a wealth of data that cannot be analyzed without computational methods. The impact of the choice of method on the overall result of a biological study is often underappreciated, but different methods can result in very different biological findings. It is thus essential to evaluate and compare the correctness and relative performance of computational methods. The volume of the data as well as the complexity of the algorithms render unbiased comparisons challenging. This paper discusses some problems and challenges in testing and validation of computational methods. We discuss the different types of data (simulated and experimental validation data) as well as different metrics to compare methods. We also introduce a new public repository for mass spectrometric reference data sets (http://compms.org/RefData) that contains a collection of publicly available data sets for performance evaluation for a wide range of different methods.

  5. Spatial co-adaptation of cortical control columns in a micro-ECoG brain-computer interface

    Science.gov (United States)

    Rouse, A. G.; Williams, J. J.; Wheeler, J. J.; Moran, D. W.

    2016-10-01

    Objective. Electrocorticography (ECoG) has been used for a range of applications including electrophysiological mapping, epilepsy monitoring, and more recently as a recording modality for brain-computer interfaces (BCIs). Studies that examine ECoG electrodes designed and implanted chronically solely for BCI applications remain limited. The present study explored how two key factors influence chronic, closed-loop ECoG BCI: (i) the effect of inter-electrode distance on BCI performance and (ii) the differences in neural adaptation and performance when fixed versus adaptive BCI decoding weights are used. Approach. The amplitudes of epidural micro-ECoG signals between 75 and 105 Hz with 300 μm diameter electrodes were used for one-dimensional and two-dimensional BCI tasks. The effect of inter-electrode distance on BCI control was tested between 3 and 15 mm. Additionally, the performance and cortical modulation differences between constant, fixed decoding using a small subset of channels versus adaptive decoding weights using the entire array were explored. Main results. Successful BCI control was possible with two electrodes separated by 9 and 15 mm. Performance decreased and the signals became more correlated when the electrodes were only 3 mm apart. BCI performance in a 2D BCI task improved significantly when using adaptive decoding weights (80%-90%) compared to using constant, fixed weights (50%-60%). Additionally, modulation increased for channels previously unavailable for BCI control under the fixed decoding scheme upon switching to the adaptive, all-channel scheme. Significance. Our results clearly show that neural activity under a BCI recording electrode (which we define as a ‘cortical control column’) readily adapts to generate an appropriate control signal. These results show that the practical minimal spatial resolution of these control columns with micro-ECoG BCI is likely on the order of 3 mm. Additionally, they show that the combination and
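    As a concrete, hypothetical illustration of adaptive decoding weights, the sketch below uses a normalized-LMS update to adapt a linear map from band-power features to cursor velocity. The study's actual decoder and update rule are not specified here, so every name and constant is illustrative.

```python
import numpy as np

def nlms_decoder_update(w, features, target_vel, mu=0.1, eps=1e-8):
    """One normalized-LMS step: adapt linear decoding weights so that
    band-power features map closer to the intended cursor velocity.
    A stand-in for the study's adaptive updates, not its algorithm."""
    pred = w @ features                      # decoded 2D velocity
    err = target_vel - pred
    w += mu * np.outer(err, features) / (eps + features @ features)
    return w, pred

rng = np.random.default_rng(1)
n_chan = 8
w = np.zeros((2, n_chan))                    # 2D velocity from 8 channels
true_map = rng.standard_normal((2, n_chan))
for _ in range(2000):
    f = rng.standard_normal(n_chan)          # 75-105 Hz band-power features
    w, _ = nlms_decoder_update(w, f, true_map @ f)
print(np.abs(w - true_map).max())            # weights approach the true map
```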

  6. Applying computerized adaptive testing to the Negative Acts Questionnaire-Revised: Rasch analysis of workplace bullying.

    Science.gov (United States)

    Ma, Shu-Ching; Chien, Tsair-Wei; Wang, Hsiu-Hung; Li, Yu-Chi; Yui, Mei-Shu

    2014-02-17

Workplace bullying is a prevalent problem in contemporary workplaces that has adverse effects on both the victims of bullying and organizations. With the rapid development of computer technology in recent years, there is an urgent need to establish whether item response theory-based computerized adaptive testing (CAT) can be applied to measure exposure to workplace bullying. The purpose of this study was to evaluate the relative efficiency and measurement precision of a CAT-based test for hospital nurses compared to traditional nonadaptive testing (NAT). Under the preliminary conditions of a single domain derived from the scale, a CAT module bullying scale model with polytomously scored items is provided as an example for evaluation purposes. A total of 300 nurses were recruited and responded to the 22-item Negative Acts Questionnaire-Revised (NAQ-R). All NAT (or CAT-selected) items were calibrated with the Rasch rating scale model and all respondents were randomly selected for a comparison of the advantages of CAT and NAT in efficiency and precision by paired t tests and the area under the receiver operating characteristic curve (AUROC). The NAQ-R is a unidimensional construct that can be applied to measure exposure to workplace bullying through CAT-based administration. Nursing measures derived from both tests (CAT and NAT) were highly correlated (r = .97) and their measurement precisions were not statistically different (P = .49), as expected. CAT required fewer items than NAT (an efficiency gain of 32%), suggesting a reduced burden for respondents. There were significant differences in work tenure between the 2 groups (bullied and nonbullied) at a cutoff point of 6 years at 1 worksite. An AUROC of 0.75 (95% CI 0.68-0.79) with logits greater than -4.2 (or >30 in summation) was defined as indicating a high likelihood of workplace bullying. With CAT-based administration of the NAQ-R for nurses, respondent burden was substantially reduced without compromising measurement precision.
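    A minimal Python sketch of the Rasch rating scale model used to calibrate the polytomous NAQ-R items; the category thresholds and parameters below are made up for illustration.

```python
import numpy as np

def rating_scale_probs(theta, delta, taus):
    """Rasch rating scale model: probability of each response category
    for person ability `theta`, item difficulty `delta`, and shared
    category thresholds `taus` (illustrative values below)."""
    # cumulative numerators: sum_{j<=k} (theta - delta - tau_j), tau_0 = 0
    steps = np.concatenate([[0.0], theta - delta - np.asarray(taus)])
    logits = np.cumsum(steps)
    expl = np.exp(logits - logits.max())    # stabilized softmax
    return expl / expl.sum()

# a 5-category NAQ-R-style item ("never" .. "daily"), made-up parameters
probs = rating_scale_probs(theta=0.5, delta=1.0, taus=[-1.5, -0.5, 0.5, 1.5])
print(probs.round(3), probs.sum())
```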

  7. Adaptation of OCA-P, a probabilistic fracture-mechanics code, to a personal computer

    International Nuclear Information System (INIS)

    Ball, D.G.; Cheverton, R.D.

    1985-01-01

The OCA-P probabilistic fracture-mechanics code can now be executed on a personal computer with 512 kilobytes of memory, a math coprocessor, and a hard disk. A user's guide for the particular adaptation has been prepared, and additional importance sampling techniques for OCA-P have been developed that allow the sampling of only the tails of selected distributions. Features have also been added to OCA-P that permit RTNDT to be used as an "independent" variable in the calculation of P

  8. Overview of the computerized adaptive testing special section

    Directory of Open Access Journals (Sweden)

    Vicente Ponsoda

    2000-01-01

Full Text Available Presentation of the special section on computerized adaptive tests. This article provides an overall view of the Psicológica special section on computerized adaptive testing. A brief introduction to the topic is also presented. For each article, the main results, its connections with the other papers in the special section, and the research topic to which it is most closely related are described.

  9. Computer Forensic Function Testing: Media Preparation, Write Protection And Verification

    Directory of Open Access Journals (Sweden)

Yinghua (David) Guo

    2010-06-01

Full Text Available The growth in the computer forensic field has created a demand for new software (or increased functionality in existing software) and a means to verify that this software is truly forensic, i.e., capable of meeting the requirements of the trier of fact. In this work, we review our previous work: a function-oriented testing framework for validation and verification of computer forensic tools. This framework consists of three parts: function mapping, requirements specification and reference set development. Through function mapping, we give a scientific and systematic description of the fundamentals of the computer forensic discipline, i.e., what functions are needed in the computer forensic investigation process. We focus this paper on the functions of media preparation, write protection and verification. Specifically, we complete the function mapping of these functions and specify their requirements. Based on this work, future work can be conducted to develop corresponding reference sets to test any tools that possess these functions.

  10. Overview and current management of computerized adaptive testing in licensing/certification examinations

    Directory of Open Access Journals (Sweden)

    Dong Gi Seo

    2017-07-01

    Full Text Available Computerized adaptive testing (CAT has been implemented in high-stakes examinations such as the National Council Licensure Examination-Registered Nurses in the United States since 1994. Subsequently, the National Registry of Emergency Medical Technicians in the United States adopted CAT for certifying emergency medical technicians in 2007. This was done with the goal of introducing the implementation of CAT for medical health licensing examinations. Most implementations of CAT are based on item response theory, which hypothesizes that both the examinee and items have their own characteristics that do not change. There are 5 steps for implementing CAT: first, determining whether the CAT approach is feasible for a given testing program; second, establishing an item bank; third, pretesting, calibrating, and linking item parameters via statistical analysis; fourth, determining the specification for the final CAT related to the 5 components of the CAT algorithm; and finally, deploying the final CAT after specifying all the necessary components. The 5 components of the CAT algorithm are as follows: item bank, starting item, item selection rule, scoring procedure, and termination criterion. CAT management includes content balancing, item analysis, item scoring, standard setting, practice analysis, and item bank updates. Remaining issues include the cost of constructing CAT platforms and deploying the computer technology required to build an item bank. In conclusion, in order to ensure more accurate estimations of examinees’ ability, CAT may be a good option for national licensing examinations. Measurement theory can support its implementation for high-stakes examinations.
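    To make the five algorithm components above concrete, here is a minimal Python sketch of a CAT loop for dichotomous Rasch items: maximum-information item selection, a Newton-Raphson ability update, and a standard-error termination rule. The item bank and responses are simulated; this is an illustration, not a production implementation.

```python
import numpy as np

def simulate_cat(bank_b, answers, se_stop=0.4, max_items=20):
    """Minimal CAT loop for dichotomous Rasch items: select the most
    informative remaining item, update ability by a Newton-Raphson step,
    and stop once the standard error drops below `se_stop`."""
    theta, available, asked = 0.0, set(range(len(bank_b))), []
    for _ in range(max_items):
        p = 1.0 / (1.0 + np.exp(-(theta - bank_b)))   # Rasch success probs
        info = p * (1.0 - p)                          # item information
        item = max(available, key=lambda i: info[i])  # item selection rule
        available.remove(item)
        asked.append(item)
        ps = 1.0 / (1.0 + np.exp(-(theta - bank_b[asked])))
        score = np.sum(answers[asked] - ps)           # log-likelihood slope
        fisher = np.sum(ps * (1.0 - ps))              # test information
        theta = float(np.clip(theta + score / fisher, -4.0, 4.0))
        if 1.0 / np.sqrt(fisher) < se_stop:           # termination criterion
            break
    return theta, asked

bank_b = np.linspace(-3, 3, 50)                       # illustrative difficulties
rng = np.random.default_rng(2)
true_theta = 1.2
answers = (rng.random(50) < 1 / (1 + np.exp(-(true_theta - bank_b)))).astype(float)
print(simulate_cat(bank_b, answers))
```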

  11. Method and system for rendering and interacting with an adaptable computing environment

    Science.gov (United States)

    Osbourn, Gordon Cecil [Albuquerque, NM; Bouchard, Ann Marie [Albuquerque, NM

    2012-06-12

    An adaptable computing environment is implemented with software entities termed "s-machines", which self-assemble into hierarchical data structures capable of rendering and interacting with the computing environment. A hierarchical data structure includes a first hierarchical s-machine bound to a second hierarchical s-machine. The first hierarchical s-machine is associated with a first layer of a rendering region on a display screen and the second hierarchical s-machine is associated with a second layer of the rendering region overlaying at least a portion of the first layer. A screen element s-machine is linked to the first hierarchical s-machine. The screen element s-machine manages data associated with a screen element rendered to the display screen within the rendering region at the first layer.

  12. The full spectrum of climate change adaptation: testing an analytical framework in Tyrolean mountain agriculture (Austria).

    Science.gov (United States)

    Grüneis, Heidelinde; Penker, Marianne; Höferl, Karl-Michael

    2016-01-01

Our scientific view on climate change adaptation (CCA) is unsatisfying in many ways: it is often dominated by a modernistic perspective of planned pro-active adaptation, with a selective focus on measures directly responding to climate change impacts, and thus it is far from the real-life conditions of those who are actually affected by climate change. Farmers have to adapt to multiple changes simultaneously. Therefore, empirical climate change adaptation research also needs a more integrative perspective on real-life climate change adaptations. This perspective has to consider "hidden" adaptations as well, which are not explicitly and directly motivated by CCA but actually contribute to the sector's adaptability to climate change. The aim of the present study is to develop and test an analytic framework that contributes to a broader understanding of CCA and bridges the gap between scientific expertise and practical action. The framework distinguishes three types of CCA according to their climate-related motivations: explicit adaptations, multi-purpose adaptations, and hidden adaptations. Although agriculture is among the sectors that are most affected by climate change, results from the case study of Tyrolean mountain agriculture show that climate change is ranked behind other more pressing "real-life challenges" such as changing agricultural policies or market conditions. We identified numerous hidden adaptations which make a valuable contribution when dealing with climate change impacts. We conclude that these hidden adaptations not only have to be considered to obtain an integrative and more realistic view of CCA; they also provide a great opportunity for linking adaptation strategies to farmers' realities.

  13. Adaptation of a Vocabulary Test from British Sign Language to American Sign Language

    Science.gov (United States)

    Mann, Wolfgang; Roy, Penny; Morgan, Gary

    2016-01-01

    This study describes the adaptation process of a vocabulary knowledge test for British Sign Language (BSL) into American Sign Language (ASL) and presents results from the first round of pilot testing with 20 deaf native ASL signers. The web-based test assesses the strength of deaf children's vocabulary knowledge by means of different mappings of…

  14. Adaptive and Qualitative Changes in Encoding Strategy with Experience: Evidence from the Test-Expectancy Paradigm

    Science.gov (United States)

    Finley, Jason R.; Benjamin, Aaron S.

    2012-01-01

    Three experiments demonstrated learners' abilities to adaptively and qualitatively accommodate their encoding strategies to the demands of an upcoming test. Stimuli were word pairs. In Experiment 1, test expectancy was induced for either cued recall (of targets given cues) or free recall (of targets only) across 4 study-test cycles of the same…

  15. A Comparison of Procedures for Content-Sensitive Item Selection in Computerized Adaptive Tests.

    Science.gov (United States)

    Kingsbury, G. Gage; Zara, Anthony R.

    1991-01-01

    This simulation investigated two procedures that reduce differences between paper-and-pencil testing and computerized adaptive testing (CAT) by making CAT content sensitive. Results indicate that the price in terms of additional test items of using constrained CAT for content balancing is much smaller than that of using testlets. (SLD)

  16. Accelerated Desensitization and Adaptive Attitudes Interventions and Test Gains with Academic Probation Students

    Science.gov (United States)

    Driscoll, Richard; Holt, Bruce; Hunter, Lori

    2005-01-01

    The study evaluates the test-gain benefits of an accelerated desensitization and adaptive attitudes intervention for test-anxious students. College students were screened for high test anxiety. Twenty anxious students, half of them on academic probation, were assigned to an Intervention or to a minimal treatment Control group. The Intervention was…

  17. Scheduling and recording of reactor maintenance and testing by computer

    International Nuclear Information System (INIS)

    Gray, P.L.

    1975-01-01

    The use of a computer program, Maintenance Information and Control (MIAC), at the Savannah River Laboratory (SRL) assists a small operating staff in maintaining three research reactors and a subcritical facility. The program schedules and defines preventive maintenance, schedules required periodic tests, logs repair and cost information, specifies custodial and service responsibilities, and provides equipment maintenance history, all with a minimum of record-keeping

  18. Computer tomography of flows external to test models

    Science.gov (United States)

    Prikryl, I.; Vest, C. M.

    1982-01-01

Computer tomographic techniques for the reconstruction of three-dimensional aerodynamic density fields from interferograms recorded from several different viewing directions were studied. Emphasis is on the case in which an opaque object, such as a test model in a wind tunnel, obscures significant regions of the interferograms (projection data). A method called the Iterative Convolution Method (ICM), existing methods in which the field is represented by series expansions, and analysis of real experimental data in the form of aerodynamic interferograms are discussed.

  19. Benchmark tests and spin adaptation for the particle-particle random phase approximation

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Yang; Steinmann, Stephan N.; Peng, Degao [Department of Chemistry, Duke University, Durham, North Carolina 27708 (United States); Aggelen, Helen van, E-mail: Helen.VanAggelen@UGent.be [Department of Chemistry, Duke University, Durham, North Carolina 27708 (United States); Department of Inorganic and Physical Chemistry, Ghent University, 9000 Ghent (Belgium); Yang, Weitao, E-mail: Weitao.Yang@duke.edu [Department of Chemistry and Department of Physics, Duke University, Durham, North Carolina 27708 (United States)

    2013-11-07

The particle-particle random phase approximation (pp-RPA) provides an approximation to the correlation energy in density functional theory via the adiabatic connection [H. van Aggelen, Y. Yang, and W. Yang, Phys. Rev. A 88, 030501 (2013)]. It has virtually no delocalization error nor static correlation error for single-bond systems. However, with its formal O(N^6) scaling, the pp-RPA is computationally expensive. In this paper, we implement a spin-separated and spin-adapted pp-RPA algorithm, which reduces the computational cost by a substantial factor. We then perform benchmark tests on the G2/97 enthalpies of formation database, DBH24 reaction barrier database, and four test sets for non-bonded interactions (HB6/04, CT7/04, DI6/04, and WI9/04). For the G2/97 database, the pp-RPA gives a significantly smaller mean absolute error (8.3 kcal/mol) than the direct particle-hole RPA (ph-RPA) (22.7 kcal/mol). Furthermore, the error in the pp-RPA is nearly constant with the number of atoms in a molecule, while the error in the ph-RPA increases. For chemical reactions involving typical organic closed-shell molecules, pp- and ph-RPA both give accurate reaction energies. Similarly, both RPAs perform well for reaction barriers and nonbonded interactions. These results suggest that the pp-RPA gives reliable energies in chemical applications. The adiabatic connection formalism based on pairing matrix fluctuation is therefore expected to lead to widely applicable and accurate density functionals.

  20. Computational adaptive optics for broadband optical interferometric tomography of biological tissue.

    Science.gov (United States)

    Adie, Steven G; Graf, Benedikt W; Ahmad, Adeel; Carney, P Scott; Boppart, Stephen A

    2012-05-08

Aberrations in optical microscopy reduce image resolution and contrast, and can limit imaging depth when focusing into biological samples. Static correction of aberrations may be achieved through appropriate lens design, but this approach does not offer the flexibility of simultaneously correcting aberrations for all imaging depths, nor the adaptability to correct for sample-specific aberrations for high-quality tomographic optical imaging. Incorporation of adaptive optics (AO) methods has demonstrated considerable improvement in optical image contrast and resolution in noninterferometric microscopy techniques, as well as in optical coherence tomography. Here we present a method to correct aberrations in a tomogram rather than the beam of a broadband optical interferometry system. Based on Fourier optics principles, we correct aberrations of a virtual pupil using Zernike polynomials. When used in conjunction with the computed imaging method interferometric synthetic aperture microscopy, this computational AO enables object reconstruction (within the single scattering limit) with ideal focal-plane resolution at all depths. Tomographic reconstructions of tissue phantoms containing subresolution titanium-dioxide particles and of ex vivo rat lung tissue demonstrate aberration correction in datasets acquired with a highly astigmatic illumination beam. These results also demonstrate that imaging with an aberrated astigmatic beam provides the advantage of a more uniform depth-dependent signal compared to imaging with a standard Gaussian beam. With further work, computational AO could enable the replacement of complicated and expensive optical hardware components with algorithms implemented on a standard desktop computer, making high-resolution 3D interferometric tomography accessible to a wider group of users and nonspecialists.
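    A toy Python sketch of the virtual-pupil idea: transform the complex field to the Fourier (pupil) plane, multiply by the conjugate of a Zernike aberration phase (here a single astigmatism term), and transform back. This is schematic only, not the ISAM reconstruction used in the paper; all parameters are illustrative.

```python
import numpy as np

def correct_astigmatism(field, coeff):
    """Toy computational AO step: go to the virtual pupil (Fourier) plane,
    remove an astigmatism term (Zernike Z2^2 ~ rho^2 cos 2phi), return.
    A schematic of the idea, not the paper's ISAM reconstruction."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n)
    FX, FY = np.meshgrid(fx, fx, indexing='ij')
    rho2 = FX**2 + FY**2
    phi = np.arctan2(FY, FX)
    zern = rho2 * np.cos(2 * phi)                  # astigmatism shape
    pupil_phase = np.exp(-1j * coeff * zern / rho2.max())
    spectrum = np.fft.fft2(field)
    return np.fft.ifft2(spectrum * pupil_phase)    # compensated field

field = np.random.randn(128, 128) + 1j * np.random.randn(128, 128)
corrected = correct_astigmatism(field, coeff=20.0)
print(corrected.shape)
```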

  1. Computer-aided system for interactive psychomotor testing

    Science.gov (United States)

    Selivanova, Karina G.; Ignashchuk, Olena V.; Koval, Leonid G.; Kilivnik, Volodymyr S.; Zlepko, Alexandra S.; Sawicki, Daniel; Kalizhanova, Aliya; Zhanpeisova, Aizhan; Smailova, Saule

    2017-08-01

Nowadays, research on psychomotor actions has taken a special place in education, sports, medicine, psychology, etc. Development of a computer system for psychomotor testing could help solve many operational problems in psychoneurology and psychophysiology and also determine individual characteristics of fine motor skills. This is a particularly relevant issue when it comes to children, students, and athletes, for the definition of personal and professional features. The article presents the dynamics of developing psychomotor skills and their application in the training process. The results of testing indicated their significant impact on the development of psychomotor skills.

  2. Testing adaptive toolbox models: a Bayesian hierarchical approach.

    Science.gov (United States)

    Scheibehenne, Benjamin; Rieskamp, Jörg; Wagenmakers, Eric-Jan

    2013-01-01

    Many theories of human cognition postulate that people are equipped with a repertoire of strategies to solve the tasks they face. This theoretical framework of a cognitive toolbox provides a plausible account of intra- and interindividual differences in human behavior. Unfortunately, it is often unclear how to rigorously test the toolbox framework. How can a toolbox model be quantitatively specified? How can the number of toolbox strategies be limited to prevent uncontrolled strategy sprawl? How can a toolbox model be formally tested against alternative theories? The authors show how these challenges can be met by using Bayesian inference techniques. By means of parameter recovery simulations and the analysis of empirical data across a variety of domains (i.e., judgment and decision making, children's cognitive development, function learning, and perceptual categorization), the authors illustrate how Bayesian inference techniques allow toolbox models to be quantitatively specified, strategy sprawl to be contained, and toolbox models to be rigorously tested against competing theories. The authors demonstrate that their approach applies at the individual level but can also be generalized to the group level with hierarchical Bayesian procedures. The suggested Bayesian inference techniques represent a theoretical and methodological advancement for toolbox theories of cognition and behavior.

  3. Large-Scale Assessment of a Fully Automatic Co-Adaptive Motor Imagery-Based Brain Computer Interface.

    Directory of Open Access Journals (Sweden)

    Laura Acqualagna

    Full Text Available In recent years Brain Computer Interface (BCI) technology has benefited from the development of sophisticated machine learning methods that let the user operate the BCI after a few trials of calibration. One remarkable example is the recent development of co-adaptive techniques that have proved to extend the use of BCIs to people unable to achieve successful control with the standard BCI procedure. Especially for BCIs based on the modulation of the Sensorimotor Rhythm (SMR), these improvements are essential, since a non-negligible percentage of users is unable to operate SMR-BCIs efficiently. In this study we evaluated for the first time a fully automatic co-adaptive BCI system on a large scale. A pool of 168 participants naive to BCIs operated the co-adaptive SMR-BCI in one single session. Different psychological interventions were performed prior to the BCI session in order to investigate how motor coordination training and relaxation could influence BCI performance. A neurophysiological indicator based on the Power Spectral Density (PSD) was extracted from a recording of a few minutes of resting-state brain activity and tested as a predictor of BCI performance. Results show that high accuracies in operating the BCI could be reached by the majority of the participants before the end of the session. BCI performance could be significantly predicted by the neurophysiological indicator, consolidating the validity of the previously developed model. However, we still found about 22% of users with performance significantly lower than the threshold of efficient BCI control at the end of the session. Since inter-subject variability is still the major problem of BCI technology, we point out crucial issues for those who did not achieve sufficient control. Finally, we propose developments to move a step forward towards the applicability of the promising co-adaptive methods.
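
    The published indicator itself is not reproduced in this record, but the general idea - scoring the height of the resting-state mu-band PSD peak above an interpolated noise floor - can be sketched with SciPy. The band limits and scoring rule below are assumptions for illustration, not the published predictor.

      import numpy as np
      from scipy.signal import welch

      def smr_predictor(resting_eeg, fs=250.0, band=(8.0, 13.0)):
          """Score a resting-state EEG channel by the height of its mu-band
          PSD peak above a linear noise floor drawn between the band edges
          (a simplified stand-in for the published SMR predictor)."""
          f, psd = welch(resting_eeg, fs=fs, nperseg=int(2 * fs))
          in_band = (f >= band[0]) & (f <= band[1])
          floor = np.interp(f[in_band], [band[0], band[1]],
                            [psd[in_band][0], psd[in_band][-1]])
          return float(np.max(psd[in_band] - floor))

      # usage with two minutes of (here synthetic) resting EEG at 250 Hz:
      # score = smr_predictor(np.random.randn(2 * 60 * 250))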

  4. Adapting and Pilot Testing a Parenting Intervention for Homeless Families in Transitional Housing.

    Science.gov (United States)

    Holtrop, Kendal; Holcomb, Jamila E

    2018-01-24

    Intervention adaptation is a promising approach for extending the reach of evidence-based interventions to underserved families. One highly relevant population in need of services is homeless families. In particular, homeless families with children constitute more than one third of the total homeless population in the United States and face several unique challenges to parenting. The purpose of this study was to adapt and pilot test a parenting intervention for homeless families in transitional housing. An established adaptation model was used to guide this process. The systematic adaptation efforts included: (a) examining the theory of change in the original intervention, (b) identifying population differences relevant to homeless families in transitional housing, (c) adapting the content of the intervention, and (d) adapting the evaluation strategy. Next, a pilot test of the adapted intervention was conducted to examine implementation feasibility and acceptability. Feasibility data indicate an intervention spanning several weeks may be difficult to implement in the context of transitional housing. Yet, acceptability of the adapted intervention among participants was consistently high. The findings of this pilot work suggest several implications for informing continued parenting intervention research and practice with homeless families in transitional housing. © 2018 Family Process Institute.

  5. Computing Adaptive Feature Weights with PSO to Improve Android Malware Detection

    Directory of Open Access Journals (Sweden)

    Yanping Xu

    2017-01-01

    Full Text Available Android malware detection is a complex and crucial issue. In this paper, we propose a malware detection model using a support vector machine (SVM) method based on feature weights that are computed by information gain (IG) and particle swarm optimization (PSO) algorithms. The IG weights are evaluated based on the relevance between features and class labels, and the PSO weights are adaptively calculated to result in the best fitness (the performance of the SVM classification model). Moreover, to overcome the defects of basic PSO, we propose a new adaptive inertia weight method called fitness-based and chaotic adaptive inertia weight-PSO (FCAIW-PSO) that improves on basic PSO and is based on the fitness and a chaotic term. The goal is to assign suitable weights to the features to ensure the best Android malware detection performance. The results of experiments indicate that the IG weights and PSO weights both improve the performance of SVM and that the performance of the PSO weights is better than that of the IG weights.
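
    A reduced sketch of the weighting idea using scikit-learn: mutual information serves as the IG-style relevance weight and the features are rescaled before the SVM is trained. The PSO/FCAIW-PSO part is omitted here; it would search for the weight vector that maximizes the same cross-validation fitness. The dataset is synthetic.

      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.feature_selection import mutual_info_classif
      from sklearn.model_selection import cross_val_score
      from sklearn.svm import SVC

      # synthetic stand-in for Android app feature vectors (e.g. permissions)
      X, y = make_classification(n_samples=300, n_features=20,
                                 n_informative=6, random_state=0)

      # IG-style weights: relevance of each feature to the class label
      ig = mutual_info_classif(X, y, random_state=0)
      weights = ig / ig.max()

      # weight the features before training the SVM classifier
      acc_plain = cross_val_score(SVC(), X, y, cv=5).mean()
      acc_weighted = cross_val_score(SVC(), X * weights, y, cv=5).mean()
      print(f"plain SVM: {acc_plain:.3f}, IG-weighted SVM: {acc_weighted:.3f}")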

  6. Cross-Cultural adaptation of an instrument to computer accessibility evaluation for students with cerebral palsy

    Directory of Open Access Journals (Sweden)

    Gerusa Ferreira Lourenço

    2015-03-01

    Full Text Available The specific literature indicates that the successful education of children with cerebral palsy may require the implementation of appropriate assistive technology resources, allowing students to improve their performance and complete everyday tasks more efficiently and independently. To this end, these resources must be selected properly, emphasizing the importance of an appropriate initial assessment of the child and of the possibilities of the resources available. The present study aimed to translate and theoretically adapt an American instrument that evaluates computer accessibility for people with cerebral palsy, in order to contextualize it for applicability to Brazilian students with cerebral palsy. The methodology involved the steps of translation and cross-cultural adaptation of this instrument, as well as the construction of a supplementary script for additional use of that instrument in the educational context. Translation procedures, theoretical and technical adaptation of the American instrument and theoretical analysis (content and semantics) were carried out with the participation of professional experts of the special education area as adjudicators. The results pointed to the relevance of the proposed translated instrument, in conjunction with the script built, to the reality of professionals involved with the education of children with cerebral palsy, such as occupational therapists and special educators.

  7. Human spaceflight and space adaptations: Computational simulation of gravitational unloading on the spine

    Science.gov (United States)

    Townsend, Molly T.; Sarigul-Klijn, Nesrin

    2018-04-01

    Living in reduced gravitational environments for a prolonged duration, such as a fly-by mission to Mars or an extended stay at the International Space Station, affects the human body - in particular, the spine. As the spine adapts to spaceflight, morphological and physiological changes cause the mechanical integrity of the spinal column to be compromised, potentially endangering internal organs, nervous health, and human body mechanical function. Therefore, a high-fidelity computational model and simulation of the whole human spine was created and validated for the purpose of investigating the mechanical integrity of the spine in crew members during exploratory space missions. A spaceflight-exposed spine has been developed through the adaptation of a three-dimensional nonlinear finite element model with the updated Lagrangian formulation of a healthy ground-based human spine in vivo. Simulation of the porohyperelastic response of the intervertebral disc to mechanical unloading resulted in a model capable of accurately predicting spinal swelling/lengthening, spinal motion, and internal stress distribution. The curvature of this space-adaptation-exposed spine model was compared to a control terrestrial-based finite element model, indicating how the shape changed. Finally, potential injury sites for crew members are predicted for a typical 9-day mission.

  8. THE PLUTO CODE FOR ADAPTIVE MESH COMPUTATIONS IN ASTROPHYSICAL FLUID DYNAMICS

    International Nuclear Information System (INIS)

    Mignone, A.; Tzeferacos, P.; Zanni, C.; Bodo, G.; Van Straalen, B.; Colella, P.

    2012-01-01

    We present a description of the adaptive mesh refinement (AMR) implementation of the PLUTO code for solving the equations of classical and special relativistic magnetohydrodynamics (MHD and RMHD). The current release exploits, in addition to the static grid version of the code, the distributed infrastructure of the CHOMBO library for multidimensional parallel computations over block-structured, adaptively refined grids. We employ a conservative finite-volume approach where primary flow quantities are discretized at the cell center in a dimensionally unsplit fashion using the Corner Transport Upwind method. Time stepping relies on a characteristic tracing step where piecewise parabolic method, weighted essentially non-oscillatory, or slope-limited linear interpolation schemes can be handily adopted. A characteristic decomposition-free version of the scheme is also illustrated. The solenoidal condition of the magnetic field is enforced by augmenting the equations with a generalized Lagrange multiplier providing propagation and damping of divergence errors through a mixed hyperbolic/parabolic explicit cleaning step. Among the novel features, we describe an extension of the scheme to include non-ideal dissipative processes, such as viscosity, resistivity, and anisotropic thermal conduction without operator splitting. Finally, we illustrate an efficient treatment of point-local, potentially stiff source terms over hierarchical nested grids by taking advantage of the adaptivity in time. Several multidimensional benchmarks and applications to problems of astrophysical relevance assess the potentiality of the AMR version of PLUTO in resolving flow features separated by large spatial and temporal disparities.

  9. Reconstruction of sparse-view X-ray computed tomography using adaptive iterative algorithms.

    Science.gov (United States)

    Liu, Li; Lin, Weikai; Jin, Mingwu

    2015-01-01

    In this paper, we propose two reconstruction algorithms for sparse-view X-ray computed tomography (CT). Treating the reconstruction problems as data fidelity constrained total variation (TV) minimization, both algorithms adopt the alternating two-stage strategy: projection onto convex sets (POCS) for data fidelity and non-negativity constraints and steepest descent for TV minimization. The novelty of this work is to determine iterative parameters automatically from data, thus avoiding tedious manual parameter tuning. In TV minimization, the step sizes of steepest descent are adaptively adjusted according to the difference from POCS update in either the projection domain or the image domain, while the step size of algebraic reconstruction technique (ART) in POCS is determined based on the data noise level. In addition, projection errors are used to compare with the error bound to decide whether to perform ART so as to reduce computational costs. The performance of the proposed methods is studied and evaluated using both simulated and physical phantom data. Our methods with automatic parameter tuning achieve similar, if not better, reconstruction performance compared to a representative two-stage algorithm. Copyright © 2014 Elsevier Ltd. All rights reserved.
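
    The alternating two-stage loop can be sketched in a few lines of NumPy for a toy 1D problem. The data-driven step-size and error-bound rules that are the paper's actual contribution are replaced here by fixed constants, so this is a structural sketch only.

      import numpy as np

      def pocs_tv(A, b, n_iter=50, tv_steps=5, tv_step_size=0.01):
          """Alternate ART/POCS (data fidelity + non-negativity) with a few
          steepest-descent steps on total variation, for a 1D toy image.
          The paper's adaptive step-size rules are replaced by constants."""
          x = np.zeros(A.shape[1])
          row_norms = (A ** 2).sum(axis=1)
          for _ in range(n_iter):
              for i in range(A.shape[0]):          # ART sweep over projections
                  if row_norms[i] > 0:
                      x += (b[i] - A[i] @ x) / row_norms[i] * A[i]
              x = np.clip(x, 0.0, None)            # non-negativity constraint
              for _ in range(tv_steps):            # TV steepest descent
                  s = np.sign(np.diff(x))
                  grad = np.zeros_like(x)
                  grad[:-1] -= s
                  grad[1:] += s
                  x -= tv_step_size * grad
          return x

      # toy usage: A = np.random.rand(40, 64); b = A @ x_true; x = pocs_tv(A, b)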

  10. Adapt

    Science.gov (United States)

    Bargatze, L. F.

    2015-12-01

    Active Data Archive Product Tracking (ADAPT) is a collection of software routines that permits one to generate XML metadata files to describe and register data products in support of the NASA Heliophysics Virtual Observatory VxO effort. ADAPT is also a philosophy. The ADAPT concept is to use any and all available metadata associated with scientific data to produce XML metadata descriptions in a consistent, uniform, and organized fashion to provide blanket access to the full complement of data stored on a targeted data server. In this poster, we present an application of ADAPT to describe all of the data products that are stored by using the Common Data File (CDF) format served out by the CDAWEB and SPDF data servers hosted at the NASA Goddard Space Flight Center. These data servers are the primary repositories for NASA Heliophysics data. For this purpose, the ADAPT routines have been used to generate data resource descriptions by using an XML schema named Space Physics Archive, Search, and Extract (SPASE). SPASE is the designated standard for documenting Heliophysics data products, as adopted by the Heliophysics Data and Model Consortium. The set of SPASE XML resource descriptions produced by ADAPT includes high-level descriptions of numerical data products, display data products, or catalogs and also includes low-level "Granule" descriptions. A SPASE Granule is effectively a universal access metadata resource; a Granule associates an individual data file (e.g. a CDF file) with a "parent" high-level data resource description, assigns a resource identifier to the file, and lists the corresponding access URL(s). The CDAWEB and SPDF file systems were queried to provide the input required by the ADAPT software to create an initial set of SPASE metadata resource descriptions. Then, the CDAWEB and SPDF data repositories were queried subsequently on a nightly basis and the CDF file lists were checked for any changes such as the occurrence of new, modified, or deleted
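
    The nightly bookkeeping essentially turns a file listing into per-file XML descriptions. A minimal sketch with the Python standard library is shown below; the element names follow the general SPASE Granule pattern, but the exact fields ADAPT emits are an assumption here.

      import xml.etree.ElementTree as ET

      def make_granule(resource_id, parent_id, url):
          """Build a minimal SPASE-like Granule record for one data file
          (element names are illustrative, not the full SPASE schema)."""
          spase = ET.Element("Spase")
          granule = ET.SubElement(spase, "Granule")
          ET.SubElement(granule, "ResourceID").text = resource_id
          ET.SubElement(granule, "ParentID").text = parent_id
          source = ET.SubElement(granule, "Source")
          ET.SubElement(source, "URL").text = url
          return ET.tostring(spase, encoding="unicode")

      print(make_granule("spase://Example/Granule/AC_H0_MFI/20151201",
                         "spase://Example/NumericalData/AC_H0_MFI",
                         "https://cdaweb.gsfc.nasa.gov/pub/data/example.cdf"))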

  11. Estimation of an Examinee's Ability in the Web-Based Computerized Adaptive Testing Program IRT-CAT

    Directory of Open Access Journals (Sweden)

    Yoon-Hwan Lee

    2006-11-01

    Full Text Available We developed a program to estimate an examinee's ability in order to provide freely available access to a web-based computerized adaptive testing (CAT) program. We used PHP and JavaScript as the programming languages, PostgreSQL as the database management system on an Apache web server, and Linux as the operating system. A system which allows for user input, searching within inputted items, and test creation was constructed. We performed ability estimation on each test based on a Rasch model and 2- or 3-parameter logistic models. Our system provides an algorithm for web-based CAT, replacing previous personal computer-based ones, and makes it possible to estimate an examinee's ability immediately at the end of the test.
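
    The core of such an ability estimator is a short Newton-Raphson iteration on the log-likelihood. A minimal sketch for the 2-parameter logistic model follows; the item parameters are illustrative, and the system described above also supports Rasch and 3PL variants.

      import numpy as np

      def estimate_theta_2pl(a, b, responses, n_iter=20):
          """Newton-Raphson maximum-likelihood ability estimate under the
          2-parameter logistic IRT model. Requires a mixed response pattern
          (not all correct or all incorrect)."""
          theta = 0.0
          for _ in range(n_iter):
              p = 1.0 / (1.0 + np.exp(-a * (theta - b)))  # P(correct | theta)
              grad = np.sum(a * (responses - p))          # d logL / d theta
              info = np.sum(a**2 * p * (1.0 - p))         # Fisher information
              theta += grad / info
          return theta

      # three items with discriminations a and difficulties b, answers 1,1,0:
      a = np.array([1.2, 0.8, 1.5]); b = np.array([-0.5, 0.0, 1.0])
      print(estimate_theta_2pl(a, b, np.array([1, 1, 0])))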

  12. Adaptive learning with covariate shift-detection for motor imagery-based brain–computer interface

    OpenAIRE

    Raza, H; Cecotti, H; Li, Y; Prasad, G

    2015-01-01

    A common assumption in traditional supervised learning is the similar probability distribution of data between the training phase and the testing/operating phase. When transitioning from the training to testing phase, a shift in the probability distribution of input data is known as a covariate shift. Covariate shifts commonly arise in a wide range of real-world systems such as electroencephalogram-based brain–computer interfaces (BCIs). In such systems, there is a necessity for continuous mo...

  13. Flight Test of an L(sub 1) Adaptive Controller on the NASA AirSTAR Flight Test Vehicle

    Science.gov (United States)

    Gregory, Irene M.; Xargay, Enric; Cao, Chengyu; Hovakimyan, Naira

    2010-01-01

    This paper presents results of a flight test of the L-1 adaptive control architecture designed to directly compensate for significant uncertain cross-coupling in nonlinear systems. The flight test was conducted on the subscale turbine powered Generic Transport Model that is an integral part of the Airborne Subscale Transport Aircraft Research system at the NASA Langley Research Center. The results presented are for piloted tasks performed during the flight test.

  14. Students Perception on the Use of Computer Based Test

    Science.gov (United States)

    Nugroho, R. A.; Kusumawati, N. S.; Ambarwati, O. C.

    2018-02-01

    Teaching nowadays may use technology to disseminate science and knowledge. As part of teaching, the way of evaluating study progress and results has also benefited from this rapid progress in IT. The computer-based test (CBT) has been introduced to replace the more conventional paper-and-pencil test (PPT). CBT is considered more advantageous than PPT: it is more efficient, more transparent, and better able to minimise fraud in cognitive evaluation. Current studies reflect an ongoing debate over CBT versus PPT usage. Most of the current research compares the two methods without exploring the students' perception of the tests. This study fills that gap in the literature by providing students' perceptions of the two test methods. A survey approach was used to obtain the data. The sample was collected from two identical classes taking the same subject at a public university in Indonesia. The Mann-Whitney U test was used to analyse the data. The result indicates that there is a significant difference between the two groups of students regarding CBT usage: students in each group preferred a test method other than the one they were given. Further discussion and research implications are presented in the paper.
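
    For readers unfamiliar with the test used here, the analysis step amounts to a single SciPy call comparing the two groups' ordinal perception scores; the numbers below are made up for illustration.

      import numpy as np
      from scipy.stats import mannwhitneyu

      # hypothetical Likert-scale perception scores from the two classes
      cbt_group = np.array([4, 5, 3, 4, 4, 5, 2, 4, 3, 5])
      ppt_group = np.array([3, 3, 2, 4, 2, 3, 3, 2, 4, 3])

      u, p = mannwhitneyu(cbt_group, ppt_group, alternative="two-sided")
      print(f"U = {u}, p = {p:.4f}")  # a small p suggests the groups differ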

  15. Qualitative interviews with healthcare staff in four European countries to inform adaptation of an intervention to increase chlamydia testing.

    Science.gov (United States)

    McNulty, Cliodna; Ricketts, Ellie J; Fredlund, Hans; Uusküla, Anneli; Town, Katy; Rugman, Claire; Tisler-Sala, Anna; Mani, Alix; Dunais, Brigitte; Folkard, Kate; Allison, Rosalie; Touboul, Pia

    2017-09-25

    To determine the needs of primary healthcare general practice (GP) staff, stakeholders and trainers to inform the adaptation of a locally successful complex intervention (Chlamydia Intervention Randomised Trial (CIRT)) aimed at increasing chlamydia testing within primary healthcare within South West England to three EU countries (Estonia, France and Sweden) and throughout England. Qualitative interviews. European primary healthcare in England, France, Sweden and Estonia with a range of chlamydia screening provision in 2013. 45 GP staff, 13 trainers and 18 stakeholders. The iterative interview schedule explored participants' personal attitudes, subjective norms and perceived behavioural controls around provision of chlamydia testing, sexual health services and training in general practice. Researchers used a common thematic analysis. Findings were similar across all countries. Most participants agreed that chlamydia testing and sexual health services should be offered in general practice. There was no culture of GP staff routinely offering opportunistic chlamydia testing or sexual health advice, and due to other priorities, participants reported this would be challenging. All participants indicated that the CIRT workshop covering chlamydia testing and sexual health would be useful if practice based, included all practice staff and action planning, and was adequately resourced. Participants suggested minor adaptations to CIRT to suit their country's health services. A common complex intervention can be adapted for use across Europe, despite varied sexual health provision. The intervention (ChlamydiA Testing Training in Europe (CATTE)) should comprise: a staff workshop covering sexual health and chlamydia testing rates and procedures, action planning and patient materials and staff reminders via computer prompts, emails or newsletters, with testing feedback through practice champions. CATTE materials are available at: www.STItraining.eu.

  16. Interlaboratory computational comparisons of critical fast test reactor pin lattices

    International Nuclear Information System (INIS)

    Mincey, J.F.; Kerr, H.T.; Durst, B.M.

    1979-01-01

    An objective of the Consolidated Fuel Reprocessing Program's (CFRP) nuclear engineering group at Oak Ridge National Laboratory (ORNL) is to ensure that chemical equipment components designed for the reprocessing of spent LMFBR fuel (among other fuel types) are safe from a criticality standpoint. As existing data are inadequate for the general validation of computational models describing mixed plutonium--uranium oxide systems with isotopic compositions typical of LMFBR fuel, a program of critical experiments has been initiated at the Battelle Pacific Northwest Laboratories (PNL). The first series of benchmark experiments consisted of five square-pitched lattices of unirradiated Fast Test Reactor (FTR) fuel moderated and reflected by light water. Calculations of these five experiments have been conducted by both ORNL/CFRP and PNL personnel with the purpose of exploring how accurately various computational models will predict k/sub eff/ values for such neutronic systems and if differences between k/sub eff/ values obtained with these different models are significant

  17. Pepsi-SAXS: an adaptive method for rapid and accurate computation of small-angle X-ray scattering profiles.

    Science.gov (United States)

    Grudinin, Sergei; Garkavenko, Maria; Kazennov, Andrei

    2017-05-01

    A new method called Pepsi-SAXS is presented that calculates small-angle X-ray scattering profiles from atomistic models. The method is based on the multipole expansion scheme and is significantly faster compared with other tested methods. In particular, using the Nyquist-Shannon-Kotelnikov sampling theorem, the multipole expansion order is adapted to the size of the model and the resolution of the experimental data. It is argued that by using the adaptive expansion order, this method has the same quadratic dependence on the number of atoms in the model as the Debye-based approach, but with a much smaller prefactor in the computational complexity. The method has been systematically validated on a large set of over 50 models collected from the BioIsis and SASBDB databases. Using a laptop, it was demonstrated that Pepsi-SAXS is about seven, 29 and 36 times faster compared with CRYSOL, FoXS and the three-dimensional Zernike method in SAStbx, respectively, when tested on data from the BioIsis database, and is about five, 21 and 25 times faster compared with CRYSOL, FoXS and SAStbx, respectively, when tested on data from SASBDB. On average, Pepsi-SAXS demonstrates comparable accuracy in terms of χ² to CRYSOL and FoXS when tested on BioIsis and SASBDB profiles. Together with a small allowed variation of adjustable parameters, this demonstrates the effectiveness of the method. Pepsi-SAXS is available at http://team.inria.fr/nano-d/software/pepsi-saxs.
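
    For scale, the O(N²) Debye reference that multipole methods such as Pepsi-SAXS accelerate can be written directly in NumPy; the coordinates and form factors below are synthetic.

      import numpy as np

      def debye_profile(coords, f, q):
          """Reference Debye-formula SAXS intensity I(q) for N atoms,
          the O(N^2) baseline that multipole expansions accelerate.
          coords: (N, 3) positions; f: (N,) form factors; q: (M,) values."""
          d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
          intensity = []
          for qk in q:
              qd = qk * d
              sinc = np.where(qd > 0.0,
                              np.sin(qd) / np.where(qd > 0.0, qd, 1.0), 1.0)
              intensity.append(np.sum(f[:, None] * f[None, :] * sinc))
          return np.array(intensity)

      rng = np.random.default_rng(0)
      xyz = rng.normal(scale=10.0, size=(50, 3))       # 50 toy "atoms"
      print(debye_profile(xyz, np.ones(50), np.linspace(0.01, 0.5, 25))[:3])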

  18. WRF4G project: Adaptation of WRF Model to Distributed Computing Infrastructures

    Science.gov (United States)

    Cofino, Antonio S.; Fernández Quiruelas, Valvanuz; García Díez, Markel; Blanco Real, Jose C.; Fernández, Jesús

    2013-04-01

    Nowadays Grid Computing is a powerful computational tool ready to be used by the scientific community in different areas (such as biomedicine, astrophysics, climate, etc.). However, the use of these distributed computing infrastructures (DCIs) is not yet common practice in climate research, and only a few teams and applications in this area take advantage of them. Thus, the first objective of this project is to popularize the use of this technology in the atmospheric sciences area. In order to achieve this objective, one of the most widely used applications has been taken (WRF; a limited-area model, successor of the MM5 model), which has a user community of more than 8000 researchers worldwide. This community develops its research activity in different areas and could benefit from the advantages of Grid resources (case study simulations, regional hind-cast/forecast, sensitivity studies, etc.). The WRF model output is also used as input by many in the energy and natural hazards communities, which will therefore benefit as well. However, Grid infrastructures have some drawbacks for the execution of applications that make intensive use of CPU and memory for a long period of time. This makes it necessary to develop a specific framework (middleware). This middleware encapsulates the application and provides appropriate services for the monitoring and management of the jobs and the data. Thus, the second objective of the project consists of the development of a generic adaptation of WRF for the Grid (WRF4G), to be distributed as open source and to be integrated in the official WRF development cycle. The use of this WRF adaptation should be transparent and useful for facing any of the previously described studies, and should avoid the problems of the Grid infrastructure. Moreover, it should simplify access to Grid infrastructures for research teams, and also free them from the technical and computational aspects of the use of the Grid. Finally, in order to

  19. Translation and cultural adaptation of the Aguado Syntax Test (AST) into Brazilian Portuguese.

    Science.gov (United States)

    Baggio, Gustavo Inheta; Hage, Simone Rocha de Vasconcellos

    2017-12-07

    To perform the translation and cultural adaptation of the Aguado Syntax Test (AST) into Brazilian Portuguese considering the linguistic and cultural reality of the language. The AST assesses the early morphosyntactic development in children aged 3 to 7 in terms of understanding and expression of various types of structures such as sentences, pronouns, verbal voices, comparisons, prepositions and verbal desinence as to number, mode and tense. The process of translation and cultural adaptation followed four steps: 1) preparation of two translations; 2) synthesis of consensual translations; 3) backtranslation; and 4) verification of equivalence between the initial translations and backtranslations that resulted in the final translated version. The whole process of translation and cultural adaptation revealed the presence of equivalence and reconciliation of the translated items and an almost complete semantic equivalence between the two translations and the absence of consistent translation difficulties. The AST was translated and culturally adapted into Brazilian Portuguese, constituting the first step towards validation and standardization of the test.

  20. Describing of elements IO field in a testing computer program

    Directory of Open Access Journals (Sweden)

    Igor V. Loshkov

    2017-01-01

    Full Text Available A standard for describing the process of displaying interactive windows on a computer monitor, through which questions are output and answers are input during computer testing, was presented in the article [11]. According to the proposed standard, the description of the process mentioned above is performed with a format line containing element names, their parameters, as well as grouping and auxiliary symbols. Program objects are described using elements of the standard. The majority of objects create input and output windows on a computer monitor. The aim of our research was to develop a minimal set of elements of the standard sufficient to perform testing in mathematics and computer science. The elements of the standard were chosen in parallel with the development and testing of the program that uses them. This approach made it possible to choose a sufficiently complete set of elements for testing in the fields of study mentioned above. Names for the proposed elements were selected in such a way that, firstly, they indicate the element's function and, secondly, they coincide with the names of functionally similar elements in other programming languages. Parameters, their names, their assignments and accepted values are proposed for the elements. The principle of name selection for the parameters was the same as for the elements of the standard: the names should correspond to their assignments or coincide with the names of similar parameters in other programming languages. The parameters define the properties of objects. In particular, while the elements of the standard create windows, the parameters define object properties (location, size, appearance) and the sequence in which windows are created. All elements of the standard proposed in this article are collected in a table whose columns give the names and functions of these elements. Inside the table, the elements of the standard are grouped row by row into four sets: input elements, output elements, input

  1. Programs for Testing Processor-in-Memory Computing Systems

    Science.gov (United States)

    Katz, Daniel S.

    2006-01-01

    The Multithreaded Microbenchmarks for Processor-In-Memory (PIM) Compilers, Simulators, and Hardware are computer programs arranged in a series for use in testing the performances of PIM computing systems, including compilers, simulators, and hardware. The programs at the beginning of the series test basic functionality; the programs at subsequent positions in the series test increasingly complex functionality. The programs are intended to be used while designing a PIM system, and can be used to verify that compilers, simulators, and hardware work correctly. The programs can also be used to enable designers of these system components to examine tradeoffs in implementation. Finally, these programs can be run on non-PIM hardware (either single-threaded or multithreaded) using the POSIX pthreads standard to verify that the benchmarks themselves operate correctly. [POSIX (Portable Operating System Interface for UNIX) is a set of standards that define how programs and operating systems interact with each other. pthreads is a library of pre-emptive thread routines that comply with one of the POSIX standards.]

  2. Simulation Research on Adaptive Control of a Six-degree-of-freedom Material-testing Machine

    Directory of Open Access Journals (Sweden)

    Dan Wang

    2014-02-01

    Full Text Available This paper presents an adaptive controller equipped with a stiffness estimation method for a novel material-testing machine, in order to alleviate the performance degradation caused by stiffness variation of the tested specimen. The dynamic model of the proposed machine is built using the Kane method, and the kinematic model is established with a closed-form solution. The stiffness estimation method is developed based on the recursive least-squares method and the proposed stiffness equivalent matrix. The control performance of the adaptive controller is simulated in detail. The simulation results illustrate that the proposed controller can greatly improve the control performance of the target material-testing machine by online stiffness estimation and adaptive parameter tuning, especially in low-cycle fatigue (LCF) and high-cycle fatigue (HCF) tests.
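
    Such an online stiffness estimator reduces to a standard recursive least-squares update. A scalar sketch follows; the regressor/measurement pairing (displacement and force) and the forgetting factor are illustrative assumptions, not the paper's exact formulation.

      import numpy as np

      def rls_update(theta, P, phi, y, lam=0.98):
          """One recursive least-squares step with forgetting factor lam:
          refine the stiffness estimate theta from regressor phi and
          measurement y."""
          phi = phi.reshape(-1, 1)
          k = P @ phi / (lam + phi.T @ P @ phi)      # gain vector
          err = y - (phi.T @ theta).item()           # prediction error
          theta = theta + k.flatten() * err
          P = (P - k @ phi.T @ P) / lam
          return theta, P

      # identify a scalar stiffness of 2.5 N/mm from noisy force readings
      rng = np.random.default_rng(1)
      theta, P = np.zeros(1), np.eye(1) * 100.0
      for _ in range(200):
          x = rng.uniform(0.0, 1.0, size=1)          # displacement
          force = 2.5 * x[0] + rng.normal(scale=0.05)
          theta, P = rls_update(theta, P, x, force)
      print(theta)                                   # converges near 2.5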

  3. Adapting tests of sign language assessment for other sign languages--a review of linguistic, cultural, and psychometric problems.

    Science.gov (United States)

    Haug, Tobias; Mann, Wolfgang

    2008-01-01

    Given the current lack of appropriate assessment tools for measuring deaf children's sign language skills, many test developers have used existing tests of other sign languages as templates to measure the sign language used by deaf people in their country. This article discusses factors that may influence the adaptation of assessment tests from one natural sign language to another. Two tests which have been adapted for several other sign languages are focused upon: the Test for American Sign Language and the British Sign Language Receptive Skills Test. A brief description is given of each test as well as insights from ongoing adaptations of these tests for other sign languages. The problems reported in these adaptations were found to be grounded in linguistic and cultural differences, which need to be considered for future test adaptations. Other reported shortcomings of test adaptation are related to the question of how well psychometric measures transfer from one instrument to another.

  4. Optimizing the Use of Response Times for Item Selection in Computerized Adaptive Testing

    Science.gov (United States)

    Choe, Edison M.; Kern, Justin L.; Chang, Hua-Hua

    2018-01-01

    Despite common operationalization, measurement efficiency of computerized adaptive testing should not only be assessed in terms of the number of items administered but also the time it takes to complete the test. To this end, a recent study introduced a novel item selection criterion that maximizes Fisher information per unit of expected response…
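
    A minimal sketch of such a criterion: 2PL Fisher information divided by the expected response time under a lognormal response-time model. The parameter names and the simplified mean-RT term (the lognormal variance is ignored) are assumptions for illustration, not the study's exact criterion.

      import numpy as np

      def select_item(theta, a, b, beta, administered):
          """Pick the next CAT item by maximizing Fisher information per
          unit of expected response time (2PL information; lognormal RT
          model with time intensity beta, variance term omitted)."""
          p = 1.0 / (1.0 + np.exp(-a * (theta - b)))
          info = a**2 * p * (1.0 - p)        # 2PL Fisher information
          expected_rt = np.exp(beta)         # simplified lognormal mean RT
          score = info / expected_rt
          score[administered] = -np.inf      # never re-administer an item
          return int(np.argmax(score))

      a = np.array([1.0, 1.5, 0.7]); b = np.array([0.0, 0.3, -1.0])
      beta = np.array([3.2, 4.0, 3.5])       # log-seconds time intensities
      print(select_item(0.2, a, b, beta, np.zeros(3, dtype=bool)))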

  5. Accelerated Desensitization with Adaptive Attitudes and Test Gains with 5th Graders

    Science.gov (United States)

    Miller, Melanie; Morton, Jerome; Driscoll, Richard; Davis, Kai A.

    2006-01-01

    The study evaluates an easily-administered test-anxiety reduction program. An entire fifth grade was screened, and 36 students identified as test-anxious were randomly assigned to an Intervention or a non-participant Control group. The intervention was an accelerated desensitization and adaptive attitudes (ADAA) treatment which involved…

  6. A minimax sequential procedure in the context of computerized adaptive mastery testing

    NARCIS (Netherlands)

    Vos, Hendrik J.

    1997-01-01

    The purpose of this paper is to derive optimal rules for variable-length mastery tests in case three mastery classification decisions (nonmastery, partial mastery, and mastery) are distinguished. In a variable-length or adaptive mastery test, the decision is to classify a subject as a master, a

  7. Strategies for Controlling Item Exposure in Computerized Adaptive Testing with the Generalized Partial Credit Model

    Science.gov (United States)

    Davis, Laurie Laughlin

    2004-01-01

    Choosing a strategy for controlling item exposure has become an integral part of test development for computerized adaptive testing (CAT). This study investigated the performance of six procedures for controlling item exposure in a series of simulated CATs under the generalized partial credit model. In addition to a no-exposure control baseline…

  8. Development of computerized adaptive testing (CAT) for the EORTC QLQ-C30 physical functioning dimension

    DEFF Research Database (Denmark)

    Petersen, Morten Aa; Groenvold, Mogens; Aaronson, Neil K

    2011-01-01

    Computerized adaptive test (CAT) methods, based on item response theory (IRT), enable a patient-reported outcome instrument to be adapted to the individual patient while maintaining direct comparability of scores. The EORTC Quality of Life Group is developing a CAT version of the widely used EORTC QLQ-C30. We present the development and psychometric validation of the item pool for the first of the scales, physical functioning (PF).

  9. Testing an extrapolation chamber in computed tomography standard beams

    Science.gov (United States)

    Castro, M. C.; Silva, N. F.; Caldas, L. V. E.

    2018-03-01

    Computed tomography (CT) is responsible for the highest dose values delivered to patients. Therefore, the radiation doses in this procedure must be accurately determined. However, there is no primary standard system for this kind of radiation beam yet. In order to search for a CT primary standard, an extrapolation ionization chamber built at the Calibration Laboratory (LCI) of the Instituto de Pesquisas Energéticas e Nucleares (IPEN) was tested in this work. The results were within the internationally recommended limits.

  10. Token test and computed tomogram in cerebral apoplexy

    International Nuclear Information System (INIS)

    Hanazono, Toshihide; Watanabe, Shunzo; Tasaki, Hiroichi; Hojo, Kei; Sato, Tokijiro; Hirano, Takashi; Metoki, Hirofumi.

    1985-01-01

    One hundred and eighteen patients (103 with cerebrovascular disorder and 15 with head injury or cerebral tumor) who developed aphasia were examined using computed tomography (CT). Token test (TT) scores and the presence or absence of lesions on CT were entered into a microcomputer. The affected area was drawn by hand using a standardized matrix and a digitizer. There was a linear correlation between measured TT scores and the TT scores expected from CT. There was no evidence of a relationship between TT scores and the lateral lobe, which has been considered responsible for speech function. CT seemed to predict TT scores to some extent. (Namekawa, K.)

  11. A Randomized Rounding Approach for Optimization of Test Sheet Composing and Exposure Rate Control in Computer-Assisted Testing

    Science.gov (United States)

    Wang, Chu-Fu; Lin, Chih-Lung; Deng, Jien-Han

    2012-01-01

    Testing is an important stage of teaching as it can assist teachers in auditing students' learning results. A good test is able to accurately reflect the capability of a learner. Nowadays, Computer-Assisted Testing (CAT) is greatly improving traditional testing, since computers can automatically and quickly compose a proper test sheet to meet user…

  12. Development and Application of Detection Indices for Measuring Guessing Behaviors and Test-Taking Effort in Computerized Adaptive Testing

    Science.gov (United States)

    Chang, Shu-Ren; Plake, Barbara S.; Kramer, Gene A.; Lien, Shu-Mei

    2011-01-01

    This study examined the amount of time that different ability-level examinees spend on questions they answer correctly or incorrectly across different pretest item blocks presented on a fixed-length, time-restricted computerized adaptive testing (CAT). Results indicate that different ability-level examinees require different amounts of time to…

  13. Computer simulation of ultrasonic testing for aerospace vehicle

    Energy Technology Data Exchange (ETDEWEB)

    Yamawaki, H [National Institute for Materials Science, 1-2-1, Sengen, 305-0047 Tsukuba (Japan); Moriya, S; Masuoka, T [Japan Aerospace Exploration Agency, 1 Koganesawa, Kimigawa, 981-1525 Kakuda (Japan); Takatsubo, J, E-mail: yamawaki.hisashi@nims.go.jp [Advanced Industrial Science and Technology, AIST Tsukuba Central 2, 1-1-1 Umezono, 305-8568 Tsukuba (Japan)

    2011-01-01

    Non-destructive testing techniques are being developed to ensure the reliability of aerospace vehicles that are used repeatedly. Cracks caused by thermal stress on the walls of liquid-fuel rocket combustion chambers are examined with an ultrasonic wave visualization technique developed at AIST. The technique combines non-contact ultrasonic generation by pulsed-laser scanning, a piezoelectric transducer for ultrasonic detection, and image reconstruction processing. It enables detection of defects by visualization of the ultrasonic waves scattered by the defects. At NIMS, the detection conditions for this visualization are investigated using a computer simulation of ultrasonic propagation capable of fast 3-D calculation. The simulation technique is based on the finite-difference method and two-step elastic wave equations. We report on this computational investigation and show the applicability of the simulation to the ultrasonic testing of wall cracks.
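
    A two-dimensional scalar analogue of such a simulation fits in a few NumPy lines. The grid size, material speed, and crack geometry below are illustrative, and periodic boundaries (via np.roll) stand in for proper absorbing ones; the actual code solves the full 3-D elastic equations.

      import numpy as np

      # 2D scalar-wave FDTD sketch: a pulse scattered by a crack-like void
      nx = nz = 200
      c = np.full((nx, nz), 5900.0)          # P-wave speed in steel, m/s
      c[90:110, 100:102] = 0.0               # thin crack: zero-velocity region
      dx, dt = 1e-4, 8e-9                    # CFL number c*dt/dx ~ 0.47 < 1/sqrt(2)
      u_prev = np.zeros((nx, nz)); u = np.zeros((nx, nz))
      for step in range(400):
          lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
                 np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4.0 * u) / dx**2
          u_next = 2.0 * u - u_prev + (c * dt)**2 * lap
          t = step * dt                      # short laser-like pulse at one point
          u_next[100, 20] += np.exp(-((t - 4e-7) / 1e-7)**2)
          u_prev, u = u, u_next
      print(np.abs(u).max())                 # wavefield snapshot amplitude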

  14. Computer simulation of the Charpy V-notch toughness test

    International Nuclear Information System (INIS)

    Norris, D.M. Jr.

    1977-01-01

    The dynamic Charpy V-notch test was simulated on a computer. The calculational models (for A-533 Grade B class 1 steel) used both a rounded and a flat-tipped striker. The notch stress/strain state was found to be independent of the three-point loading type and was most strongly correlated with notch-opening displacement. The dynamic stress/strain state at the time of fracture initiation was obtained by comparing the calculated deformed shape with that obtained in interrupted Charpy V-notch tests where cracking had started. The calculation was also compared with stress/strain states calculated in other geometries at failure. The distribution and partition of specimen energy were calculated, and adiabatic heating and strain rate are discussed.

  15. A Framework for the Development of Context-Adaptable User Interfaces for Ubiquitous Computing Systems

    Science.gov (United States)

    Varela, Gervasio; Paz-Lopez, Alejandro; Becerra, Jose A.; Duro, Richard

    2016-01-01

    This paper addresses the problem of developing user interfaces for Ubiquitous Computing (UC) and Ambient Intelligence (AmI) systems. These kinds of systems are expected to provide a natural user experience, considering interaction modalities adapted to the user's abilities and preferences and using whatever interaction devices are present in the environment. These interaction devices are not necessarily known at design time. The task is quite complicated due to the variety of devices and technologies, and the diversity of scenarios, and it usually burdens the developer with the need to create many different UIs in order to consider the foreseeable user-environment combinations. Here, we propose a UI abstraction framework for UC and AmI systems that effectively improves the portability of those systems between different environments and for different users. It allows developers to design and implement a single UI capable of being deployed with different devices and modalities regardless of the physical location. PMID:27399711

  16. A Comprehensive Review on Adaptability of Network Forensics Frameworks for Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Suleman Khan

    2014-01-01

    Full Text Available Network forensics enables investigation and identification of network attacks through the retrieved digital content. The proliferation of smartphones and the cost-effective universal data access through the cloud has made Mobile Cloud Computing (MCC) a congenital target for network attacks. However, the constraints on carrying out forensics in MCC are interrelated with the autonomous cloud hosting companies and their policies for restricted access to the digital content in the back-end cloud platforms. This implies that existing Network Forensic Frameworks (NFFs) have limited impact in the MCC paradigm. To this end, we qualitatively analyze the adaptability of existing NFFs when applied to the MCC. Explicitly, the fundamental mechanisms of NFFs are highlighted and then analyzed using the most relevant parameters. A classification is proposed to help understand the anatomy of existing NFFs. Subsequently, a comparison is given that explores the functional similarities and deviations among NFFs. The paper concludes by discussing research challenges for progressive network forensics in MCC.

  17. A comprehensive review on adaptability of network forensics frameworks for mobile cloud computing.

    Science.gov (United States)

    Khan, Suleman; Shiraz, Muhammad; Wahab, Ainuddin Wahid Abdul; Gani, Abdullah; Han, Qi; Rahman, Zulkanain Bin Abdul

    2014-01-01

    Network forensics enables investigation and identification of network attacks through the retrieved digital content. The proliferation of smartphones and the cost-effective universal data access through the cloud has made Mobile Cloud Computing (MCC) a congenital target for network attacks. However, the constraints on carrying out forensics in MCC are interrelated with the autonomous cloud hosting companies and their policies for restricted access to the digital content in the back-end cloud platforms. This implies that existing Network Forensic Frameworks (NFFs) have limited impact in the MCC paradigm. To this end, we qualitatively analyze the adaptability of existing NFFs when applied to the MCC. Explicitly, the fundamental mechanisms of NFFs are highlighted and then analyzed using the most relevant parameters. A classification is proposed to help understand the anatomy of existing NFFs. Subsequently, a comparison is given that explores the functional similarities and deviations among NFFs. The paper concludes by discussing research challenges for progressive network forensics in MCC.

  18. Implementation and adaption of the Computer Code ECOSYS/EXCEL for Austria as OECOSYS/EXCEL

    International Nuclear Information System (INIS)

    Hick, H.; Suda, M.; Mueck, K.

    1998-03-01

    During 1989, under contract to the Austrian Chamber of the Federal Chancellor, department VII, the radioecological forecast model OECOSYS was implemented by the Austrian Research Centre Seibersdorf on a VAX computer using VAX Fortran. OECOSYS allows the prediction of the consequences after a large-scale contamination event. During 1992, under contract to the Austrian Federal Ministry of Health, Sports and Consumer Protection, department III, OECOSYS - in the 1989 version - was implemented on PCs in Seibersdorf and at the Ministry using OS/2 and Microsoft Fortran. In March 1993, the Ministry ordered a necessary update and the evaluation of two exercise scenarios. Since that time the prognosis model with its auxiliary program and communication facilities has been kept on stand-by, and yearly exercises are performed to maintain its readiness. The current report describes the implementation and adaptation to Austrian conditions of the newly available EXCEL version of the German ECOSYS prognosis model as OECOSYS. (author)

  19. A Comprehensive Review on Adaptability of Network Forensics Frameworks for Mobile Cloud Computing

    Science.gov (United States)

    Abdul Wahab, Ainuddin Wahid; Han, Qi; Bin Abdul Rahman, Zulkanain

    2014-01-01

    Network forensics enables investigation and identification of network attacks through the retrieved digital content. The proliferation of smartphones and the cost-effective universal data access through the cloud has made Mobile Cloud Computing (MCC) a congenital target for network attacks. However, the constraints on carrying out forensics in MCC are interrelated with the autonomous cloud hosting companies and their policies for restricted access to the digital content in the back-end cloud platforms. This implies that existing Network Forensic Frameworks (NFFs) have limited impact in the MCC paradigm. To this end, we qualitatively analyze the adaptability of existing NFFs when applied to the MCC. Explicitly, the fundamental mechanisms of NFFs are highlighted and then analyzed using the most relevant parameters. A classification is proposed to help understand the anatomy of existing NFFs. Subsequently, a comparison is given that explores the functional similarities and deviations among NFFs. The paper concludes by discussing research challenges for progressive network forensics in MCC. PMID:25097880

  20. A Framework for the Development of Context-Adaptable User Interfaces for Ubiquitous Computing Systems

    Directory of Open Access Journals (Sweden)

    Gervasio Varela

    2016-07-01

    Full Text Available This paper addresses the problem of developing user interfaces for Ubiquitous Computing (UC) and Ambient Intelligence (AmI) systems. These kinds of systems are expected to provide a natural user experience, considering interaction modalities adapted to the user's abilities and preferences and using whatever interaction devices are present in the environment. These interaction devices are not necessarily known at design time. The task is quite complicated due to the variety of devices and technologies, and the diversity of scenarios, and it usually burdens the developer with the need to create many different UIs in order to consider the foreseeable user-environment combinations. Here, we propose a UI abstraction framework for UC and AmI systems that effectively improves the portability of those systems between different environments and for different users. It allows developers to design and implement a single UI capable of being deployed with different devices and modalities regardless of the physical location.

  1. A Framework for the Development of Context-Adaptable User Interfaces for Ubiquitous Computing Systems.

    Science.gov (United States)

    Varela, Gervasio; Paz-Lopez, Alejandro; Becerra, Jose A; Duro, Richard

    2016-07-07

    This paper addresses the problem of developing user interfaces for Ubiquitous Computing (UC) and Ambient Intelligence (AmI) systems. These kinds of systems are expected to provide a natural user experience, considering interaction modalities adapted to the user's abilities and preferences and using whatever interaction devices are present in the environment. These interaction devices are not necessarily known at design time. The task is quite complicated due to the variety of devices and technologies, and the diversity of scenarios, and it usually burdens the developer with the need to create many different UIs in order to consider the foreseeable user-environment combinations. Here, we propose a UI abstraction framework for UC and AmI systems that effectively improves the portability of those systems between different environments and for different users. It allows developers to design and implement a single UI capable of being deployed with different devices and modalities regardless of the physical location.

  2. Adaptive Fault Tolerance for Many-Core Based Space-Borne Computing

    Science.gov (United States)

    James, Mark; Springer, Paul; Zima, Hans

    2010-01-01

    This paper describes an approach to providing software fault tolerance for future deep-space robotic NASA missions, which will require a high degree of autonomy supported by an enhanced on-board computational capability. Such systems have become possible as a result of the emerging many-core technology, which is expected to offer 1024-core chips by 2015. We discuss the challenges and opportunities of this new technology, focusing on introspection-based adaptive fault tolerance that takes into account the specific requirements of applications, guided by a fault model. Introspection supports runtime monitoring of the program execution with the goal of identifying, locating, and analyzing errors. Fault tolerance assertions for the introspection system can be provided by the user, domain-specific knowledge, or via the results of static or dynamic program analysis. This work is part of an on-going project at the Jet Propulsion Laboratory in Pasadena, California.

  3. Computer-based tests: The impact of test design and problem of equivalency

    Czech Academy of Sciences Publication Activity Database

    Květon, Petr; Jelínek, Martin; Vobořil, Dalibor; Klimusová, H.

    -, č. 23 (2007), s. 32-51 ISSN 0747-5632 R&D Projects: GA ČR(CZ) GA406/99/1052; GA AV ČR(CZ) KSK9058117 Institutional research plan: CEZ:AV0Z7025918 Keywords : Computer-based assessment * speeded test * equivalency Subject RIV: AN - Psychology Impact factor: 1.344, year: 2007

  4. Computational model for simulation small testing launcher, technical solution

    Energy Technology Data Exchange (ETDEWEB)

    Chelaru, Teodor-Viorel, E-mail: teodor.chelaru@upb.ro [University POLITEHNICA of Bucharest - Research Center for Aeronautics and Space, Str. Ghe Polizu, nr. 1, Bucharest, Sector 1 (Romania); Cristian, Barbu, E-mail: barbucr@mta.ro [Military Technical Academy, Romania, B-dul. George Coşbuc, nr. 81-83, Bucharest, Sector 5 (Romania); Chelaru, Adrian, E-mail: achelaru@incas.ro [INCAS -National Institute for Aerospace Research Elie Carafoli, B-dul Iuliu Maniu 220, 061126, Bucharest, Sector 6 (Romania)

    2014-12-10

    The purpose of this paper is to present some aspects of the computational model and technical solutions for a multistage suborbital launcher for testing (SLT) used to test space equipment and scientific measurements. The computational model consists of a numerical simulation of the SLT evolution for different launch conditions. The launcher model presented has six degrees of freedom (6DOF) and variable mass. The results analysed are the flight parameters and ballistic performance. The discussion focuses on the technical feasibility of realizing a small multistage launcher by recycling military rocket motors. From a technical point of view, the paper is focused on the national project 'Suborbital Launcher for Testing' (SLT), which is based on hybrid propulsion and control systems obtained through an original design. While classical suborbital sounding rockets are unguided solid-fuel vehicles flying an uncontrolled ballistic trajectory, the SLT project introduces a different approach by proposing a guided suborbital launcher, which is basically a satellite launcher at a smaller scale, containing its main subsystems. This is why the project itself can be considered an intermediary step in the development of a wider range of launching systems based on hybrid propulsion technology, which may have a major impact on future European launcher programs. The SLT project, as the title shows, has two major objectives: a short-term objective, which consists of obtaining a suborbital launching system able to go into service in a predictable period of time, and a long-term objective, which consists of the development and testing of some unconventional subsystems to be integrated later into a satellite launcher as part of the European space program. This is why the technical content of the project must be carried out beyond the range of the existing suborbital

  5. An adaptive Mantel-Haenszel test for sensitivity analysis in observational studies.

    Science.gov (United States)

    Rosenbaum, Paul R; Small, Dylan S

    2017-06-01

    In a sensitivity analysis in an observational study with a binary outcome, is it better to use all of the data or to focus on subgroups that are expected to experience the largest treatment effects? The answer depends on features of the data that may be difficult to anticipate, a trade-off between unknown effect-sizes and known sample sizes. We propose a sensitivity analysis for an adaptive test similar to the Mantel-Haenszel test. The adaptive test performs two highly correlated analyses, one focused analysis using a subgroup, one combined analysis using all of the data, correcting for multiple testing using the joint distribution of the two test statistics. Because the two component tests are highly correlated, this correction for multiple testing is small compared with, for instance, the Bonferroni inequality. The test has the maximum design sensitivity of two component tests. A simulation evaluates the power of a sensitivity analysis using the adaptive test. Two examples are presented. An R package, sensitivity2x2xk, implements the procedure. © 2016, The International Biometric Society.
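
    The correction for testing twice can be sketched directly: with the correlation rho of the two asymptotically normal statistics known, the p-value of their maximum comes from a bivariate normal CDF and is much less conservative than Bonferroni. The statistics and rho below are illustrative numbers, not output of the sensitivity2x2xk package.

      import numpy as np
      from scipy.stats import multivariate_normal, norm

      def adaptive_test_pvalue(z_subgroup, z_all, rho):
          """One-sided p-value for max(Z_subgroup, Z_all) under the null,
          given the correlation rho of the two test statistics."""
          z = max(z_subgroup, z_all)
          joint = multivariate_normal(mean=[0.0, 0.0],
                                      cov=[[1.0, rho], [rho, 1.0]])
          return 1.0 - joint.cdf(np.array([z, z]))   # P(max > z)

      print(adaptive_test_pvalue(2.2, 2.4, rho=0.9))  # ~0.012
      print(2 * (1 - norm.cdf(2.4)))                  # Bonferroni bound, ~0.016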

  6. Health adaptation policy for climate vulnerable groups: a 'critical computational linguistics' analysis.

    Science.gov (United States)

    Seidel, Bastian M; Bell, Erica

    2014-11-28

    Many countries are developing or reviewing national adaptation policy for climate change but the extent to which these meet the health needs of vulnerable groups has not been assessed. This study examines the adequacy of such policies for nine known climate-vulnerable groups: people with mental health conditions, Aboriginal people, culturally and linguistically diverse groups, aged people, people with disabilities, rural communities, children, women, and socioeconomically disadvantaged people. The study analyses an exhaustive sample of national adaptation policy documents from Annex 1 ('developed') countries of the United Nations Framework Convention on Climate Change: 20 documents from 12 countries. A 'critical computational linguistics' method was used involving novel software-driven quantitative mapping and traditional critical discourse analysis. The study finds that references to vulnerable groups are relatively little present or non-existent, as well as poorly connected to language about practical strategies and socio-economic contexts, both also little present. The conclusions offer strategies for developing policy that is better informed by a 'social determinants of health' definition of climate vulnerability, consistent with best practice in the literature and global policy prescriptions.

  7. Adaptive hybrid brain-computer interaction: ask a trainer for assistance!

    Science.gov (United States)

    Müller-Putz, Gernot R; Steyrl, David; Faller, Josef

    2014-01-01

    In applying mental imagery brain-computer interfaces (BCIs) to end users, training is a key part of enabling novice users to gain control. In general learning situations, it is an established concept that a trainer assists a trainee to improve his/her aptitude in certain skills. In this work, we evaluate whether we can apply this concept in the context of event-related desynchronization (ERD) based, adaptive, hybrid BCIs. Hence, in a first session we merged the features of a high-aptitude BCI user, the trainer, and a novice user, the trainee, in a closed-loop BCI feedback task and automatically adapted the classifier over time. In a second session the trainees operated the system unassisted. Twelve healthy participants ran through this protocol. Along with the trainer, the trainees achieved a very high overall peak accuracy of 95.3%. In the second session, where users operated the BCI unassisted, they still achieved a high overall peak accuracy of 83.6%. Ten of twelve first-time BCI users achieved significantly better than chance accuracy. In conclusion, this trainer-trainee approach is very promising. Future research should investigate whether this approach is superior to conventional training approaches. This trainer-trainee concept could have potential for future application of BCIs to end users.

  8. Efficient computation of the elastography inverse problem by combining variational mesh adaption and a clustering technique

    International Nuclear Information System (INIS)

    Arnold, Alexander; Bruhns, Otto T; Reichling, Stefan; Mosler, Joern

    2010-01-01

    This paper is concerned with an efficient implementation suitable for the elastography inverse problem. More precisely, the novel algorithm allows us to compute the unknown stiffness distribution in soft tissue by means of the measured displacement field by considerably reducing the numerical cost compared to previous approaches. This is realized by combining and further elaborating variational mesh adaption with a clustering technique similar to those known from digital image compression. Within the variational mesh adaption, the underlying finite element discretization is only locally refined if this leads to a considerable improvement of the numerical solution. Additionally, the numerical complexity is reduced by the aforementioned clustering technique, in which the parameters describing the stiffness of the respective soft tissue are sorted according to a predefined number of intervals. By doing so, the number of unknowns associated with the elastography inverse problem can be chosen explicitly. A positive side effect of this method is the reduction of artificial noise in the data (smoothing of the solution). The performance and the rate of convergence of the resulting numerical formulation are critically analyzed by numerical examples.
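
    The clustering step can be pictured as a simple quantization of the element-wise stiffness estimates into a predefined number of intervals, which fixes the number of unknowns and smooths noise. A minimal sketch with a synthetic stiffness field:

    ```python
    # Quantize a stiffness field into a user-chosen number of intervals,
    # so the inverse problem has one unknown per cluster.
    import numpy as np

    def cluster_stiffness(stiffness, n_clusters):
        edges = np.linspace(stiffness.min(), stiffness.max(), n_clusters + 1)
        labels = np.clip(np.digitize(stiffness, edges) - 1, 0, n_clusters - 1)
        # one representative value per interval (its mean); also smooths noise
        reps = np.array([stiffness[labels == k].mean() if np.any(labels == k)
                         else 0.5 * (edges[k] + edges[k + 1])
                         for k in range(n_clusters)])
        return reps[labels]

    field = np.random.default_rng(1).lognormal(1.0, 0.3, 1000)  # synthetic values
    print(np.unique(cluster_stiffness(field, n_clusters=8)))
    ```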

  9. Cultural adaptation of the Test of Narrative Language (TNL) into Brazilian Portuguese.

    Science.gov (United States)

    Rossi, Natalia Freitas; Lindau, Tâmara de Andrade; Gillam, Ronald Bradley; Giacheti, Célia Maria

To accomplish the translation and cultural adaptation of the Test of Narrative Language (TNL) into Brazilian Portuguese. The TNL is a formal instrument which assesses narrative comprehension and oral narration of children between the ages of 5-0 and 11-11 (years-months). The TNL translation and adaptation process had the following steps: (1) translation into the target language; (2) summary of the translated versions; (3) back-translation; (4) checking of conceptual, semantic and cultural equivalence; and (5) a pilot study (56 children within the test age range, of both genders). The adapted version maintained the same structure as the original version: number of tasks (three comprehension and three oral narration), narrative formats (no picture, sequenced pictures and single picture) and scoring system. There were no adjustments to the pictures. The "McDonald's Story" was replaced by the "Snack Bar History" to meet the semantic and experiential equivalence of the target population. The other stories had semantic and grammatical adjustments. A statistically significant difference was found when comparing the raw scores (comprehension, narration and total) of age groups in the adapted version. Adjustments were required to meet the equivalence between the original and the translated versions. The adapted version showed it has the potential to identify differences in the oral narratives of children in the age range covered by the test. Measurement equivalence studies for validation and test standardization are in progress and will supplement the study outcomes.

  10. Test and control computer user's guide for a digital beam former test system

    Science.gov (United States)

    Alexovich, Robert E.; Mallasch, Paul G.

    1992-01-01

A Digital Beam Former Test System was developed to determine the effects of noise, interferers and distortions, and digital implementations of beam forming as applied to the Tracking and Data Relay Satellite 2 (TDRS 2) architectures. The investigation of digital beam forming with application to TDRS 2 architectures, as described in TDRS 2 advanced concept design studies, was conducted by the NASA/Lewis Research Center for NASA/Goddard Space Flight Center. A Test and Control Computer (TCC) was used as the main controlling element of the Digital Beam Former Test System. The Test and Control Computer User's Guide for a Digital Beam Former Test System provides an organized description of the Digital Beam Former Test System commands. It is written for users who wish to conduct tests of the Digital Beam Forming Test processor using the TCC. The document describes the function, use, and syntax of the TCC commands available to the user, and summarizes and demonstrates the use of the commands within DOS batch files.

  11. Comparison of computer code calculations with FEBA test data

    International Nuclear Information System (INIS)

    Zhu, Y.M.

    1988-06-01

The FEBA forced feed reflood experiments included base line tests with unblocked geometry. The experiments consisted of separate effect tests on a full-length 5x5 rod bundle. Experimental cladding temperatures and heat transfer coefficients of FEBA test No. 216 are compared with the analytical data postcalculated utilizing the SSYST-3 computer code. The comparison indicates a satisfactory matching of the peak cladding temperatures, quench times and heat transfer coefficients for nearly all axial positions. This agreement was made possible by the use of an artificially adjusted value of the empirical code input parameter in the heat transfer for the dispersed flow regime. A limited comparison of test data with calculations using the RELAP4/MOD6 transient analysis code is also included. In this case the input data for the water entrainment fraction and the liquid weighting factor in the heat transfer for the dispersed flow regime were adjusted to match the experimental data. On the other hand, no fitting of the input parameters was made for the COBRA-TF calculations which are included in the data comparison. (orig.)

  12. Students' Attitude toward and Acceptability of Computerized Adaptive Testing in Medical School and their Effect on the Examinees' Ability

    Directory of Open Access Journals (Sweden)

    Mee Young Kim

    2005-06-01

An examinee's ability can be evaluated precisely using computerized adaptive testing (CAT), which is shorter than written tests and more efficient in terms of the duration of the examination. We used CAT for the second General Examination of 98 senior students in medical college on November 27, 2004. We prepared 1,050 pre-calibrated test items according to item response theory, which had been used for the General Examination administered to senior students in 2003. The computer was programmed to pose questions until the standard error of the ability estimate was smaller than 0.01. To determine the students' attitude toward and evaluation of CAT, we conducted surveys before and after the examination, via the Web. The mean of the students' ability estimates was 0.3513 and its standard deviation was 0.9097 (range -2.4680 to +2.5310). There was no significant difference in the ability estimates according to the responses of students to items concerning their experience with CAT, their ability to use a computer, or their anxiety before and after the examination (p>0.05). Many students were unhappy that they could not recheck their responses (49%), and some stated that there were too few examination items (24%). Of the students, 79% had no complaints about using a computer and 63% wanted to expand the use of CAT. These results indicate that CAT can be implemented in medical schools without causing difficulties for users.
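
    The administration rule described above corresponds to the standard CAT loop: select the most informative remaining item, update the ability estimate, and stop once its standard error falls below a threshold. The sketch below assumes a simple Rasch model, a synthetic item bank, simulated responses, and a looser illustrative threshold than the 0.01 used in the examination, so it is a schematic rather than the actual test software:

    ```python
    # CAT loop with a standard-error stopping rule under a Rasch model.
    import numpy as np

    rng = np.random.default_rng(42)
    bank = rng.normal(0.0, 1.0, 200)             # synthetic item difficulties
    true_theta = 0.35                            # simulated examinee ability

    def prob(theta, b):                          # Rasch probability of success
        return 1.0 / (1.0 + np.exp(-(theta - b)))

    theta, used, resp, se = 0.0, [], [], np.inf
    while len(used) < len(bank) and se > 0.3:    # illustrative SE threshold
        remaining = np.setdiff1d(np.arange(len(bank)), used)
        item = remaining[np.argmin(np.abs(bank[remaining] - theta))]  # max info
        used.append(item)
        resp.append(rng.random() < prob(true_theta, bank[item]))
        for _ in range(10):                      # Newton steps for ML estimate
            p = prob(theta, bank[used])
            step = np.sum(np.array(resp) - p) / np.sum(p * (1 - p))
            theta = np.clip(theta + step, -4.0, 4.0)
        # SE from the test information at (approximately) the current estimate
        se = 1.0 / np.sqrt(np.sum(p * (1 - p)))
    print(f"theta={theta:.2f}  items={len(used)}  se={se:.2f}")
    ```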

  13. An Adaptive and Integrated Low-Power Framework for Multicore Mobile Computing

    Directory of Open Access Journals (Sweden)

    Jongmoo Choi

    2017-01-01

Employing multicore processors in mobile computing devices such as smartphones and IoT (Internet of Things) devices is a double-edged sword. It provides the ample computing capability required by recent intelligent mobile services, including voice recognition, image processing, big data analysis, and deep learning. However, it consumes a great deal of power, which creates thermal hot spots and puts pressure on the energy resources of a mobile device. In this paper, we propose a novel framework that integrates two well-known low-power techniques, DPM (Dynamic Power Management) and DVFS (Dynamic Voltage and Frequency Scaling), for energy efficiency in multicore mobile systems. The key feature of the proposed framework is adaptability. By monitoring online resource usage such as CPU utilization and power consumption, the framework can orchestrate diverse DPM and DVFS policies according to workload characteristics. Experiments based on a real implementation with three mobile devices have shown that it can reduce power consumption by 22% to 79%, while negligibly affecting the performance of workloads.
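
    As a rough illustration of the adaptive orchestration idea (not the paper's framework), the toy governor below reads a utilization figure per core and chooses either a DPM sleep state or a DVFS operating point; the frequency table, threshold, and monitoring hook are hypothetical stand-ins for real OS interfaces:

    ```python
    # Toy per-core governor combining DPM (sleep when idle) and DVFS
    # (frequency scaled with demand). All values are illustrative.
    import random

    FREQS_MHZ = [400, 800, 1200, 1600]           # assumed DVFS operating points

    def read_utilization(core):                  # stand-in for an OS counter
        return random.random()

    def governor_step(core):
        util = read_utilization(core)
        if util < 0.05:
            return ("DPM", "sleep")              # idle: power-gate the core
        idx = min(int(util * len(FREQS_MHZ)), len(FREQS_MHZ) - 1)
        return ("DVFS", FREQS_MHZ[idx])          # busy: scale the frequency

    for core in range(4):
        print(core, governor_step(core))
    ```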

  14. Effect of radiation dose and adaptive statistical iterative reconstruction on image quality of pulmonary computed tomography

    International Nuclear Information System (INIS)

    Sato, Jiro; Akahane, Masaaki; Inano, Sachiko; Terasaki, Mariko; Akai, Hiroyuki; Katsura, Masaki; Matsuda, Izuru; Kunimatsu, Akira; Ohtomo, Kuni

    2012-01-01

    The purpose of this study was to assess the effects of dose and adaptive statistical iterative reconstruction (ASIR) on image quality of pulmonary computed tomography (CT). Inflated and fixed porcine lungs were scanned with a 64-slice CT system at 10, 20, 40 and 400 mAs. Using automatic exposure control, 40 mAs was chosen as standard dose. Scan data were reconstructed with filtered back projection (FBP) and ASIR. Image pairs were obtained by factorial combination of images at a selected level. Using a 21-point scale, three experienced radiologists independently rated differences in quality between adjacently displayed paired images for image noise, image sharpness and conspicuity of tiny nodules. A subjective quality score (SQS) for each image was computed based on Anderson's functional measurement theory. The standard deviation was recorded as a quantitative noise measurement. At all doses examined, SQSs improved with ASIR for all evaluation items. No significant differences were noted between the SQSs for 40%-ASIR images obtained at 20 mAs and those for FBP images at 40 mAs. Compared to the FBP algorithm, ASIR for lung CT can enable an approximately 50% dose reduction from the standard dose while preserving visualization of small structures. (author)

  15. A computer vision based candidate for functional balance test.

    Science.gov (United States)

    Nalci, Alican; Khodamoradi, Alireza; Balkan, Ozgur; Nahab, Fatta; Garudadri, Harinath

    2015-08-01

Balance in humans is a motor skill based on complex multimodal sensing, processing and control. The ability to maintain balance in activities of daily living (ADL) is compromised by aging, diseases, injuries and environmental factors. The Centers for Disease Control and Prevention (CDC) estimated the cost of falls among older adults at $34 billion in 2013, a figure expected to reach $54.9 billion in 2020. In this paper, we present a brief review of balance impairments, followed by the subjective and objective tools currently used in clinical settings for human balance assessment. We propose a novel computer vision (CV) based approach as a candidate for a functional balance test. The test takes less than a minute to administer and is expected to be objective, repeatable and highly discriminative in quantifying the ability to maintain posture and balance. We present an informal study with preliminary data from 10 healthy volunteers, and compare performance with a balance assessment system called the BTrackS Balance Assessment Board. Our results show a high degree of correlation with BTrackS. The proposed system promises to be a good candidate for objective functional balance tests and warrants further investigation to assess validity in clinical settings, including acute care, long term care and assisted living care facilities. Our long term goals include non-intrusive approaches to assess balance competence during ADL in independent living environments.

  16. TRAC, a collaborative computer tool for tracer-test interpretation

    Directory of Open Access Journals (Sweden)

    Fécamp C.

    2013-05-01

Artificial tracer tests are widely used by consulting engineers for demonstrating water circulation, proving the existence of leakage, or estimating groundwater velocity. However, the interpretation of such tests is often very basic, with the result that decision makers and professionals commonly face unreliable results from hasty and empirical interpretation. There is thus an increasing need for a reliable interpretation tool, compatible with the latest operating systems and available in several languages. BRGM, the French Geological Survey, has developed a project together with hydrogeologists from various other organizations to build software assembling several analytical solutions in order to comply with various field contexts. This computer program, called TRAC, is very light and simple, allowing the user to add his own analytical solution if the formula is not yet included. It aims at collaborative improvement by sharing the tool and the solutions. TRAC can be used for interpreting data recovered from a tracer test as well as for simulating the transport of a tracer in the saturated zone (for the time being). Calibration of a site operation is based on the hydrodynamic and hydrodispersive features of groundwater flow as well as the amount, nature and injection mode of the artificial tracer. The software is available in French, English and Spanish, and the latest version can be downloaded from the web site http://trac.brgm.fr.

  17. Validation and testing of the VAM2D computer code

    International Nuclear Information System (INIS)

    Kool, J.B.; Wu, Y.S.

    1991-10-01

This document describes two modeling studies conducted by HydroGeoLogic, Inc. for the US NRC under contract no. NRC-04089-090, entitled ''Validation and Testing of the VAM2D Computer Code.'' VAM2D is a two-dimensional, variably saturated flow and transport code, with applications for performance assessment of nuclear waste disposal. The computer code itself is documented in a separate NUREG document (NUREG/CR-5352, 1989). The studies presented in this report involve application of the VAM2D code to two diverse subsurface modeling problems. The first involves modeling of infiltration and redistribution of water and solutes in an initially dry, heterogeneous field soil. This application involves detailed modeling over a relatively short, 9-month time period. The second problem pertains to the application of VAM2D to the modeling of a waste disposal facility in a fractured clay, over much larger space and time scales and with particular emphasis on the applicability and reliability of using an equivalent porous medium approach for simulating flow and transport in fractured geologic media. Reflecting the separate and distinct nature of the two problems studied, this report is organized in two separate parts. 61 refs., 31 figs., 9 tabs

  18. Functional reach and lateral reach tests adapted for aquatic physical therapy

    Directory of Open Access Journals (Sweden)

    Ana Angélica Ribeiro de Lima

Introduction: Functional reach (FR) and lateral reach (LR) tests are widely used in scientific research and clinical practice. Assessment tools are useful in assessing subjects with greater accuracy and are usually adapted according to the limitations of each condition. Objective: To adapt the FR and LR tests for use in an aquatic environment and assess the performance of healthy young adults. Methods: We collected anthropometric data and information on whether the participant exercised regularly or not. The FR and LR tests were adapted for use in an aquatic environment and administered to 47 healthy subjects aged 20-30 years. Each test was repeated three times. Results: Forty-one females and six males were assessed. The mean FR test score for men was 24.06 cm, whereas the mean value for right lateral reach (RLR) was 10.94 cm and for left lateral reach (LLR) was 9.78 cm. For females, the mean FR score was 17.57 cm, while the mean values for RLR and LLR were 8.84 cm and 7.76 cm, respectively. Men performed better than women in the FR (p < 0.001) and RLR (p = 0.037) tests. Individuals who exercised regularly showed no differences in performance level when compared with their counterparts. Conclusion: The FR and LR tests were adapted for use in an aquatic environment. Males performed better on the FR and RLR tests when compared to females. There was no correlation between the FR and LR tests and weight, height, Body Mass Index (BMI), foot length or length of the dominant upper limb.

  19. Psychometric Evaluation of the Italian Adaptation of the Test of Inferential and Creative Thinking

    Science.gov (United States)

    Faraci, Palmira; Hell, Benedikt; Schuler, Heinz

    2016-01-01

    This article describes the psychometric properties of the Italian adaptation of the "Analyse des Schlussfolgernden und Kreativen Denkens" (ASK; Test of Inferential and Creative Thinking) for measuring inferential and creative thinking. The study aimed to (a) supply evidence for the factorial structure of the instrument, (b) describe its…

  20. Individualized evaluation of cholinesterase inhibitors effects in dementia with adaptive cognitive testing

    NARCIS (Netherlands)

    Wouters, Hans; van Campen, Jos P. C. M.; Appels, Bregje A.; Beijnen, Jos H.; Zwinderman, Aeilko H.; van Gool, Willem A.; Schmand, Ben

    2016-01-01

Computerized Adaptive Testing (CAT) of cognitive function selects, for every individual patient, only items of appropriate difficulty to estimate his or her level of cognitive impairment. CAT therefore has the potential to combine brevity with precision. We retrospectively examined the evaluation

  3. A Feedback Control Strategy for Enhancing Item Selection Efficiency in Computerized Adaptive Testing

    Science.gov (United States)

    Weissman, Alexander

    2006-01-01

A computerized adaptive test (CAT) may be modeled as a closed-loop system, where item selection is influenced by trait level (θ) estimation and vice versa. When discrepancies exist between an examinee's estimated and true θ levels, nonoptimal item selection is a likely result. Nevertheless, examinee response behavior consistent with…

  4. Using the U.S. "Test of Financial Literacy" in Germany--Adaptation and Validation

    Science.gov (United States)

    Förster, Manuel; Happ, Roland; Molerov, Dimitar

    2017-01-01

    In this article, the authors present the adaptation and validation processes conducted to render the American "Test of Financial Literacy" (TFL) suitable for use in Germany (TFL-G). First, they outline the translation procedure followed and the various cultural adjustments made in line with international standards. Next, they present…

  5. Psychometric evaluation of the EORTC computerized adaptive test (CAT) fatigue item pool

    DEFF Research Database (Denmark)

    Petersen, Morten Aa; Giesinger, Johannes M; Holzner, Bernhard

    2013-01-01

    Fatigue is one of the most common symptoms associated with cancer and its treatment. To obtain a more precise and flexible measure of fatigue, the EORTC Quality of Life Group has developed a computerized adaptive test (CAT) measure of fatigue. This is part of an ongoing project developing a CAT...

  6. Detection of person misfit in computerized adaptive tests with polytomous items

    NARCIS (Netherlands)

    van Krimpen-Stoop, Edith; Meijer, R.R.

    2000-01-01

    Item scores that do not fit an assumed item response theory model may cause the latent trait value to be estimated inaccurately. For computerized adaptive tests (CAT) with dichotomous items, several person-fit statistics for detecting nonfitting item score patterns have been proposed. Both for
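
    For background, the dichotomous statistics that such polytomous proposals generalize include the standardized log-likelihood person-fit statistic lz, which compares the observed response-pattern likelihood with its expectation under the model. A small sketch with illustrative probabilities:

    ```python
    # Standardized log-likelihood person-fit statistic lz for dichotomous items.
    # Probabilities and the response pattern below are illustrative.
    import numpy as np

    def lz_statistic(u, p):
        """u: 0/1 responses, p: model probabilities of a correct response."""
        u, p = np.asarray(u, float), np.asarray(p, float)
        l0 = np.sum(u * np.log(p) + (1 - u) * np.log(1 - p))
        e = np.sum(p * np.log(p) + (1 - p) * np.log(1 - p))
        v = np.sum(p * (1 - p) * np.log(p / (1 - p)) ** 2)
        return (l0 - e) / np.sqrt(v)

    p = np.array([0.9, 0.8, 0.7, 0.6, 0.5, 0.4])
    u = np.array([0, 0, 1, 0, 1, 1])             # misfitting-looking pattern
    print(lz_statistic(u, p))                    # large negative => misfit
    ```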

  7. A Method for the Comparison of Item Selection Rules in Computerized Adaptive Testing

    Science.gov (United States)

    Barrada, Juan Ramon; Olea, Julio; Ponsoda, Vicente; Abad, Francisco Jose

    2010-01-01

    In a typical study comparing the relative efficiency of two item selection rules in computerized adaptive testing, the common result is that they simultaneously differ in accuracy and security, making it difficult to reach a conclusion on which is the more appropriate rule. This study proposes a strategy to conduct a global comparison of two or…

  8. Approaching Sign Language Test Construction: Adaptation of the German Sign Language Receptive Skills Test

    Science.gov (United States)

    Haug, Tobias

    2011-01-01

    There is a current need for reliable and valid test instruments in different countries in order to monitor deaf children's sign language acquisition. However, very few tests are commercially available that offer strong evidence for their psychometric properties. A German Sign Language (DGS) test focusing on linguistic structures that are acquired…

  9. The case for bilingual language tests: a study of test adaptation and ...

    African Journals Online (AJOL)

    The justification for the use of language tests in education in multilingual and multicultural societies needs to include both the aims of bilingual education, and evidence that the international standards for tests that are available in two or more languages are being met. In multilingual and multicultural societies, language tests ...

  10. Computationally Efficient Blind Code Synchronization for Asynchronous DS-CDMA Systems with Adaptive Antenna Arrays

    Directory of Open Access Journals (Sweden)

    Chia-Chang Hu

    2005-04-01

A novel space-time adaptive near-far robust code-synchronization array detector for asynchronous DS-CDMA systems is developed in this paper. It has the same basic requirements as the conventional matched filter of an asynchronous DS-CDMA system. For real-time applicability, a computationally efficient architecture of the proposed detector is developed, based on the concept of the multistage Wiener filter (MWF) of Goldstein and Reed. This multistage technique results in a self-synchronizing detection criterion that requires no inversion or eigendecomposition of a covariance matrix. As a consequence, this detector achieves a complexity that is only a linear function of the size of the antenna array (J), the rank of the MWF (M), the system processing gain (N), and the number of samples in a chip interval (S), that is, O(JMNS). The complexity of the equivalent detector based on the minimum mean-squared error (MMSE) criterion or on subspace-based eigenstructure analysis is of order O((JNS)^3). Moreover, this multistage scheme provides rapid adaptive convergence under limited observation-data support. Simulations are conducted to evaluate the performance and convergence behavior of the proposed detector with the size of the J-element antenna array, the amount of the L-sample support, and the rank of the M-stage MWF. The performance advantage of the proposed detector over other DS-CDMA detectors is investigated as well.

  11. Translation, Cultural Adaptation and Validation of the Simple Shoulder Test to Spanish

    OpenAIRE

    Arcuri, Francisco; Barclay, Fernando; Nacul, Ivan

    2015-01-01

    Background: The validation of widely used scales facilitates the comparison across international patient samples. Objective: The objective was to translate, culturally adapt and validate the Simple Shoulder Test into Argentinian Spanish. Methods: The Simple Shoulder Test was translated from English into Argentinian Spanish by two independent translators, translated back into English and evaluated for accuracy by an expert committee to correct the possible discrepancies. It was then administer...

  12. Goal-recognition-based adaptive brain-computer interface for navigating immersive robotic systems

    Science.gov (United States)

    Abu-Alqumsan, Mohammad; Ebert, Felix; Peer, Angelika

    2017-06-01

Objective. This work proposes principled strategies for self-adaptations in EEG-based brain-computer interfaces (BCIs) as a way out of the bandwidth bottleneck resulting from the considerable mismatch between the low-bandwidth interface and the bandwidth-hungry application, and as a way to enable fluent and intuitive interaction in embodiment systems. The main focus is laid upon inferring the hidden target goals of users while navigating in a remote environment as a basis for possible adaptations. Approach. To reason about possible user goals, a general user-agnostic Bayesian update rule is devised to be recursively applied upon the arrival of evidence, i.e. user input and user gaze. Experiments were conducted with healthy subjects within robotic embodiment settings to evaluate the proposed method. These experiments varied along three factors: the type of the robot/environment (simulated and physical), the type of the interface (keyboard or BCI), and the way goal recognition (GR) is used to guide a simple shared control (SC) driving scheme. Main results. Our results show that the proposed GR algorithm is able to track and infer the hidden user goals with relatively high precision and recall. Further, the realized SC driving scheme benefits from the output of the GR system and is able to reduce the user effort needed to accomplish the assigned tasks. Despite the fact that the BCI requires higher effort compared to the keyboard conditions, most subjects were able to complete the assigned tasks, and the proposed GR system is additionally shown to be able to handle the uncertainty in user input during SSVEP-based interaction. The SC application of the belief vector indicates that the benefits of the GR module are more pronounced for BCIs, compared to the keyboard interface. Significance. Being based on intuitive heuristics that model the behavior of the general population during the execution of navigation tasks, the proposed GR method can be used without prior tuning for the
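
    The recursive rule itself reduces to discrete Bayesian filtering over candidate goals: each arriving piece of evidence multiplies in a likelihood, and the belief is renormalized. The goal set and likelihood values below are hypothetical placeholders, not the study's models:

    ```python
    # Recursive Bayesian belief update over candidate navigation goals.
    import numpy as np

    def update_belief(belief, likelihood):
        posterior = belief * likelihood          # P(goal|e) ∝ P(e|goal) P(goal)
        return posterior / posterior.sum()

    goals = ["door", "desk", "window"]
    belief = np.full(len(goals), 1.0 / len(goals))   # uniform prior

    evidence_stream = [
        np.array([0.7, 0.2, 0.1]),               # command consistent with "door"
        np.array([0.6, 0.3, 0.1]),               # gaze sample near "door"
    ]
    for lik in evidence_stream:
        belief = update_belief(belief, lik)
    print(dict(zip(goals, np.round(belief, 3))))
    ```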

  13. Adaptive Neuro-Fuzzy Computing Technique for Determining Turbulent Flow Friction Coefficient

    Directory of Open Access Journals (Sweden)

    Mohammad Givehchi

    2013-08-01

Estimation of the friction coefficient in pipes is very important in many water and wastewater engineering issues, such as the distribution of velocity and shear stress, erosion, sediment transport and head loss. In analyzing these problems, knowing the friction coefficient allows more accurate estimates. In this study, adaptive neuro-fuzzy inference systems (ANFIS) with the grid partition method were used to estimate the friction coefficient in pipes. For training and testing of the neuro-fuzzy model, data derived from the Colebrook equation were used. In the neuro-fuzzy approach, pipe relative roughness and Reynolds number are considered as input variables and the friction coefficient as the output variable. Performance of the proposed approach was evaluated using data obtained from the Colebrook equation and statistical indicators such as the coefficient of determination (R²), root mean squared error (RMSE) and mean absolute error (MAE). The results showed that the adaptive neuro-fuzzy inference system with the grid partition method, a Gaussian input membership function and a linear output function could estimate the friction coefficient more accurately than the other configurations. The proposed approach is applicable to practical design issues and can be combined with mathematical and numerical models of sediment transfer, or with real-time updating of such models.
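
    Because the Colebrook equation is implicit in the friction factor f, generating such training data requires an iterative solve of 1/√f = -2 log10(k/3.7 + 2.51/(Re √f)), where k is the relative roughness. A short fixed-point sketch with illustrative sample inputs:

    ```python
    # Solve the Colebrook equation for the Darcy friction factor by
    # fixed-point iteration on x = 1/sqrt(f). Sample inputs are illustrative.
    import math

    def colebrook(rel_rough, reynolds, tol=1e-10):
        x = 0.04 ** -0.5                          # initial guess, f = 0.04
        while True:
            x_new = -2.0 * math.log10(rel_rough / 3.7 + 2.51 * x / reynolds)
            if abs(x_new - x) < tol:
                return 1.0 / x_new ** 2           # f = 1 / x^2
            x = x_new

    for k, re in [(1e-4, 1e5), (1e-3, 1e6)]:      # (relative roughness, Re)
        print(k, re, round(colebrook(k, re), 5))
    ```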

  14. Computational hydrodynamics and optical performance of inductively-coupled plasma adaptive lenses

    Energy Technology Data Exchange (ETDEWEB)

    Mortazavi, M.; Urzay, J., E-mail: jurzay@stanford.edu; Mani, A. [Center for Turbulence Research, Stanford University, Stanford, California 94305-3024 (United States)

    2015-06-15

    This study addresses the optical performance of a plasma adaptive lens for aero-optical applications by using both axisymmetric and three-dimensional numerical simulations. Plasma adaptive lenses are based on the effects of free electrons on the phase velocity of incident light, which, in theory, can be used as a phase-conjugation mechanism. A closed cylindrical chamber filled with Argon plasma is used as a model lens into which a beam of light is launched. The plasma is sustained by applying a radio-frequency electric current through a coil that envelops the chamber. Four different operating conditions, ranging from low to high powers and induction frequencies, are employed in the simulations. The numerical simulations reveal complex hydrodynamic phenomena related to buoyant and electromagnetic laminar transport, which generate, respectively, large recirculating cells and wall-normal compression stresses in the form of local stagnation-point flows. In the axisymmetric simulations, the plasma motion is coupled with near-wall axial striations in the electron-density field, some of which propagate in the form of low-frequency traveling disturbances adjacent to vortical quadrupoles that are reminiscent of Taylor-Görtler flow structures in centrifugally unstable flows. Although the refractive-index fields obtained from axisymmetric simulations lead to smooth beam wavefronts, they are found to be unstable to azimuthal disturbances in three of the four three-dimensional cases considered. The azimuthal striations are optically detrimental, since they produce high-order angular aberrations that account for most of the beam wavefront error. A fourth case is computed at high input power and high induction frequency, which displays the best optical properties among all the three-dimensional simulations considered. In particular, the increase in induction frequency prevents local thermalization and leads to an axisymmetric distribution of electrons even after introduction of

  15. Airflow Patterns In Nuclear Workplace - Computer Simulation And Qualitative Tests

    International Nuclear Information System (INIS)

    Haim, M.; Szanto, M.; Weiss, Y.; Kravchick, T.; Levinson, S.; German, U.

    1999-01-01

Concentrations of airborne radioactive materials inside a room can vary widely from one location to another, sometimes by orders of magnitude, even for locations that are relatively close. Inappropriately placed samplers can give misleading results and, therefore, the location of air samplers is important. Proper placement of samplers cannot be determined simply by observing the position of room air supply and exhaust vents. Airflow studies, such as the release of smoke aerosols, should be used. The significance of airflow pattern studies depends on the purpose of sampling - estimating worker intakes, warning of high concentrations, defining airborne radioactive areas, testing for confinement of sealed radioactive materials, etc. When sampling air in rooms with complex airflow patterns, it may be useful to use qualitative airflow studies with smoke tubes, smoke candles or isostatic bubbles. The U.S. Nuclear Regulatory Commission Regulatory Guide 8.25 [1] suggests that an airflow study should be conducted after any changes in the work area, including changes in the setup of work areas, ventilation system changes, etc. The present work presents a study of airflow patterns conducted in a typical room using two methods: a computer simulation and a qualitative test using a smoke tube.

  16. Implementation of Computer Assisted Test Selection System in Local Governments

    Directory of Open Access Journals (Sweden)

    Abdul Azis Basri

    2016-05-01

As an evaluative selection system for civil servants across all areas of government, the Computer Assisted Test selection system was first applied in 2013. In its first nationwide implementation in 2014, the selection system ran into trouble in several areas, for instance with the registration procedure and the passing grade. The main objective of this essay is to describe the implementation of the new selection system for civil servants in local governments and to assess the effectiveness of this selection system. The essay combines a literature study with a field survey; data were collected through interviews, observations and documentation from various sources, and were analyzed by reduction, data display and verification to draw conclusions. The results show that, although a few parts of the system were problematic, such as the registration phase, almost all phases of the implementation of the CAT selection system in local government areas worked well, including the preparation, implementation and result-processing phases. The system also fulfilled two of three effectiveness criteria for a selection system, namely accuracy and trustworthiness. This selection system can therefore be considered an effective way to select new civil servants. As a suggestion, local governments should prepare thoroughly for all phases of the test, establish good feedback as an evaluation mechanism, and work together with the central government to maintain and improve the supporting infrastructure and the competency of local staff.

  17. Adapting the Get Yourself Tested Campaign to Reach Black and Latino Sexual-Minority Youth.

    Science.gov (United States)

    Garbers, Samantha; Friedman, Allison; Martinez, Omar; Scheinmann, Roberta; Bermudez, Dayana; Silva, Manel; Silverman, Jen; Chiasson, Mary Ann

    2016-09-01

    Culturally appropriate efforts are needed to increase sexually transmitted disease (STD) testing and care among Black and Latino sexual-minority youth, who are at high risk for STDs. Get Yourself Tested, a national testing campaign, has demonstrated success among youth, but it has yet to be assessed for relevance or impact among this population. This effort included (1) formative and materials-testing research through focus groups; (2) adaptation of existing Get Yourself Tested campaign materials to be more inclusive of Black and Latino sexual-minority youth; (3) a 3-month campaign in four venues of New York City, promoting STD testing at events and through mobile testing and online and social media platforms; (4) process evaluation of outreach activities; and (5) an outcome evaluation of testing at select campaign venues, using a preexperimental design. During the 3-month campaign period, the number of STD tests conducted at select campaign venues increased from a comparable 3-month baseline period. Although testing uptake through mobile vans remained low in absolute numbers, the van drew a high-prevalence sample, with positivity rates of 26.9% for chlamydia and 11.5% for gonorrhea. This article documents the process and lessons learned from adapting and implementing a local campaign for Black and Latino sexual-minority youth. © 2016 Society for Public Health Education.

  18. Cross-Mode Comparability of Computer-Based Testing (CBT) versus Paper-Pencil Based Testing (PPT): An Investigation of Testing Administration Mode among Iranian Intermediate EFL Learners

    Science.gov (United States)

    Khoshsima, Hooshang; Hosseini, Monirosadat; Toroujeni, Seyyed Morteza Hashemi

    2017-01-01

The advent of technology has caused growing interest in using computers to convert conventional paper-and-pencil-based testing (henceforth PPT) into computer-based testing (henceforth CBT) in the field of education during the last decades. This constant promulgation of computers to reshape conventional tests into a computerized format has permeated the…

  19. Usability of an adaptive computer assistant that improves self-care and health literacy of older adults

    NARCIS (Netherlands)

    Blanson Henkemans, O.A.; Rogers, W.A.; Fisk, A.D.; Neerincx, M.A.; Lindenberg, J.; Mast, C.A.P.G. van der

    2008-01-01

Objectives: We developed an adaptive computer assistant for the supervision of diabetics' self-care, to help limit illness and the need for acute treatment, and to improve health literacy. This assistant monitors self-care activities logged in the patient's electronic diary. Accordingly, it provides

  20. A universal electronic adaptation of automats for biochemical analysis to a central processing computer by applying CAMAC signals

    International Nuclear Information System (INIS)

    Schaefer, R.

    1975-01-01

A universal expansion of a CAMAC subsystem - BORER 3000 - for adapting biochemical analysis instruments to a processing computer is described. The possibility of standardizing input interfaces for lab instruments with such circuits is discussed, and the advantages achieved by applying the CAMAC specifications are described.

  1. Using Artificial Intelligence to Control and Adapt Level of Difficulty in Computer Based, Cognitive Therapy – an Explorative Study

    DEFF Research Database (Denmark)

    Wilms, Inge Linda

    2011-01-01

    Prism Adaptation Therapy (PAT) is an intervention method in the treatment of the attention disorder neglect (Frassinetti, Angeli, Meneghello, Avanzi, & Ladavas, 2002; Rossetti, et al., 1998). The aim of this study was to investigate whether one session of PAT using a computer-attached touchscreen...

  2. Utilisation of Wearable Computing for Space Programmes Test Activities Optimasation

    Science.gov (United States)

    Basso, V.; Lazzari, D.; Alemanni, M.

    2004-08-01

New technologies are assuming a relevant importance in the space business domain, including Assembly, Integration and Test (AIT) activities, allowing process optimizations and capabilities that were unthinkable only a few years ago. This paper describes the experience gained by Alenia Spazio (ALS) with remote interaction techniques as a result of collaborations established within European Communities (EC) initiatives and with Alenia Aeronautica (ALA) and Politecnico di Torino (POLITO). The increased performance and reduced cost of H/W and S/W components, driven by mass-market home computing (especially demanded by the games business), together with networking technologies (the web as well as high-speed links and wireless communications), today allow the traditional AIT process activities to be re-thought in the light of multimedia data exchange: graphics, voice, video and surely more in the future. The aerospace business confirms its vocation for innovation: in the 1980s it was the cradle of CAD systems, and today it is oriented towards 3D data visualization/interaction technologies and remote, collaborative visualization/interaction on a much more user-friendly basis (i.e. not only for specialists). Fig. 1 collects the extended AIT scenarios studied and adopted by ALS over these years. ALS experimented with two kinds of remote visualization/interaction devices for both 2D/3D visualization and interaction: portable screens [e.g. Personal Digital Assistants (PDA) and wearables, Fig. 2] and wall screens (e.g. VR-Lab). These can support many types of traditional company-internal AIT applications (mainly based on EGSE and PDM/CAD utilization/reports): 1. design review support; 2. facility management; 3. storage management; 4. personnel training; 5. definition of integration sequences; 6. follow-up of assembly and test operations; 7. documentation review; as well as external access to AIT activities for remote operations (e.g. tele-testing).

  3. Ultralow dose computed tomography attenuation correction for pediatric PET CT using adaptive statistical iterative reconstruction

    International Nuclear Information System (INIS)

    Brady, Samuel L.; Shulkin, Barry L.

    2015-01-01

Purpose: To develop ultralow dose computed tomography (CT) attenuation correction (CTAC) acquisition protocols for pediatric positron emission tomography CT (PET CT). Methods: A GE Discovery 690 PET CT hybrid scanner was used to investigate the change to quantitative PET and CT measurements when operated at ultralow doses (10–35 mA s). CT quantitation: noise, low-contrast resolution, and CT numbers for 11 tissue substitutes were analyzed in-phantom. CT quantitation was analyzed to a reduction of 90% volume computed tomography dose index (0.39/3.64; mGy) from baseline. To minimize noise infiltration, 100% adaptive statistical iterative reconstruction (ASiR) was used for CT reconstruction. PET images were reconstructed with the lower-dose CTAC iterations and analyzed for: maximum body weight standardized uptake value (SUVbw) of various diameter targets (range 8–37 mm), background uniformity, and spatial resolution. Radiation dose and CTAC noise magnitude were compared for 140 patient examinations (76 post-ASiR implementation) to determine relative dose reduction and noise control. Results: CT numbers were constant to within 10% of the non-dose-reduced CTAC image for 90% dose reduction. No change in SUVbw, background percent uniformity, or spatial resolution for PET images reconstructed with CTAC protocols was found down to 90% dose reduction. Patient population effective dose analysis demonstrated relative CTAC dose reductions between 62% and 86% (3.2/8.3–0.9/6.2). Noise magnitude in dose-reduced patient images increased but was not statistically different from predose-reduced patient images. Conclusions: Using ASiR allowed for aggressive reduction in CT dose with no change in PET reconstructed images while maintaining sufficient image quality for colocalization of hybrid CT anatomy and PET radioisotope uptake

  4. Influence of Adaptive Statistical Iterative Reconstruction on coronary plaque analysis in coronary computed tomography angiography.

    Science.gov (United States)

    Precht, Helle; Kitslaar, Pieter H; Broersen, Alexander; Dijkstra, Jouke; Gerke, Oke; Thygesen, Jesper; Egstrup, Kenneth; Lambrechtsen, Jess

The purpose of this study was to evaluate the effect of iterative reconstruction (IR) software on quantitative plaque measurements in coronary computed tomography angiography (CCTA). Thirty patients with three clinical risk factors for coronary artery disease (CAD) each had one CCTA performed. Images were reconstructed using FBP, 30% and 60% adaptive statistical IR (ASIR). Coronary plaque analysis was performed with per-patient and per-vessel (LM, LAD, CX and RCA) measurements. Lumen and vessel volumes and plaque burden measurements were based on automatically detected contours in each reconstruction. Lumen and plaque intensity measurements and HU-based plaque characterization were based on corrected contours copied to each reconstruction. No significant changes between FBP and 30% ASIR were found except for lumen (-2.53 HU) and plaque intensities (-1.28 HU). Between FBP and 60% ASIR, the change in total volume showed an increase of 0.94%, 4.36% and 2.01% for lumen, plaque and vessel, respectively. The change in total plaque burden between FBP and 60% ASIR was 0.76%. Lumen and plaque intensities decreased between FBP and 60% ASIR by -9.90 HU and -1.97 HU, respectively. The total plaque component volume changes were all small, with a maximum change of -1.13% for necrotic core between FBP and 60% ASIR. Quantitative plaque measurements only showed modest differences between FBP and the 60% ASIR level. Differences were increased lumen, vessel and plaque volumes, decreased lumen and plaque intensities, and a small percentage change in the individual plaque component volumes. Copyright © 2016 Society of Cardiovascular Computed Tomography. Published by Elsevier Inc. All rights reserved.

  7. An open trial assessment of "The Number Race", an adaptive computer game for remediation of dyscalculia

    Science.gov (United States)

    Wilson, Anna J; Revkin, Susannah K; Cohen, David; Cohen, Laurent; Dehaene, Stanislas

    2006-01-01

Background: In a companion article [1], we described the development and evaluation of software designed to remediate dyscalculia. This software is based on the hypothesis that dyscalculia is due to a "core deficit" in number sense or in its access via symbolic information. Here we review the evidence for this hypothesis, and present results from an initial open-trial test of the software in a sample of nine 7–9 year old children with mathematical difficulties. Methods: Children completed adaptive training on numerical comparison for half an hour a day, four days a week over a period of five-weeks. They were tested before and after intervention on their performance in core numerical tasks: counting, transcoding, base-10 comprehension, enumeration, addition, subtraction, and symbolic and non-symbolic numerical comparison. Results: Children showed specific increases in performance on core number sense tasks. Speed of subitizing and numerical comparison increased by several hundred msec. Subtraction accuracy increased by an average of 23%. Performance on addition and base-10 comprehension tasks did not improve over the period of the study. Conclusion: Initial open-trial testing showed promising results, and suggested that the software was successful in increasing number sense over the short period of the study. However these results need to be followed up with larger, controlled studies. The issues of transfer to higher-level tasks, and of the best developmental time window for intervention also need to be addressed. PMID:16734906
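
    The adaptive element, difficulty that tracks the child's performance, can be pictured with a generic staircase on the numerical distance between the two quantities to be compared; this is an illustrative stand-in, not the actual adaptation algorithm of "The Number Race":

    ```python
    # Generic difficulty staircase: the numerical distance between the two
    # quantities shrinks after correct answers and grows after errors.
    import random

    distance = 5                                  # numerical distance: 5 = easy
    for trial in range(20):
        a = random.randint(1, 9)
        delta = random.choice([-1, 1]) * distance
        b = a + delta if a + delta >= 1 else a + distance   # keep b positive
        correct = random.random() < 0.75          # simulated child answer
        # harder (smaller distance) after success, easier after error
        distance = max(1, distance - 1) if correct else min(5, distance + 1)
        print(trial, (a, b), "correct" if correct else "wrong", "->", distance)
    ```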

  8. Influence of adaptive statistical iterative reconstruction algorithm on image quality in coronary computed tomography angiography.

    Science.gov (United States)

    Precht, Helle; Thygesen, Jesper; Gerke, Oke; Egstrup, Kenneth; Waaler, Dag; Lambrechtsen, Jess

    2016-12-01

Coronary computed tomography angiography (CCTA) requires high spatial and temporal resolution, increased low contrast resolution for the assessment of coronary artery stenosis, plaque detection, and/or non-coronary pathology. Therefore, new reconstruction algorithms, particularly iterative reconstruction (IR) techniques, have been developed in an attempt to improve image quality with no cost in radiation exposure. To evaluate whether adaptive statistical iterative reconstruction (ASIR) enhances perceived image quality in CCTA compared to filtered back projection (FBP). Thirty patients underwent CCTA due to suspected coronary artery disease. Images were reconstructed using FBP, 30% ASIR, and 60% ASIR. Ninety image sets were evaluated by five observers using the subjective visual grading analysis (VGA) and assessed by proportional odds modeling. Objective quality assessment (contrast, noise, and the contrast-to-noise ratio [CNR]) was analyzed with linear mixed effects modeling on log-transformed data. The need for ethical approval was waived by the local ethics committee as the study only involved anonymously collected clinical data. VGA showed significant improvements in sharpness by comparing FBP with ASIR, resulting in odds ratios of 1.54 for 30% ASIR and 1.89 for 60% ASIR (P = 0.004). The objective measures showed significant differences between FBP and 60% ASIR (P < 0.0001) for noise, with an estimated ratio of 0.82, and for CNR, with an estimated ratio of 1.26. ASIR improved the subjective image quality of parameter sharpness and, objectively, reduced noise and increased CNR.
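
    The objective measures follow the usual region-of-interest definitions: contrast as the difference of mean CT numbers between two ROIs, noise as the standard deviation within a homogeneous ROI, and CNR as their ratio. A minimal sketch with synthetic pixel values:

    ```python
    # Standard ROI-based contrast, noise, and CNR. Pixel values are synthetic.
    import numpy as np

    rng = np.random.default_rng(7)
    vessel_roi = rng.normal(350, 20, 500)        # contrast-filled lumen (HU)
    background_roi = rng.normal(50, 20, 500)     # adjacent tissue (HU)

    contrast = vessel_roi.mean() - background_roi.mean()
    noise = background_roi.std(ddof=1)           # SD in a homogeneous ROI
    cnr = contrast / noise
    print(f"contrast={contrast:.1f} HU, noise={noise:.1f} HU, CNR={cnr:.1f}")
    ```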

  9. Numerical Computation of Underground Inundation in Multiple Layers Using the Adaptive Transfer Method

    Directory of Open Access Journals (Sweden)

    Hyung-Jun Kim

    2018-01-01

Extreme rainfall causes surface runoff to flow towards lowlands and subterranean facilities, such as subway stations and buildings with underground spaces in densely packed urban areas. These facilities and areas are therefore vulnerable to catastrophic submergence. However, flood modeling of underground space has not yet been adequately studied because there are difficulties in reproducing the associated multiple horizontal layers connected with staircases or elevators. This study proposes a convenient approach to simulate underground inundation when two layers are connected. The main facet of this approach is to compute the flow flux passing through staircases in an upper layer and to transfer the equivalent quantity to a lower layer. This is defined as the ‘adaptive transfer method’. This method overcomes the limitations of 2D modeling by introducing layers connecting concepts to prevent large variations in mesh sizes caused by complicated underlying obstacles or local details. Consequently, this study aims to contribute to the numerical analysis of flow in inundated underground spaces with multiple floors.
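
    In spirit, the coupling step computes a discharge through each staircase opening from the upper-layer water depth and moves that volume to the lower layer every time step. The sketch below uses a simple weir-type formula with an illustrative coefficient and geometry, which are assumptions rather than values from the paper:

    ```python
    # Transfer flux through a staircase opening from an upper to a lower layer.
    # Q = C * b * h**1.5 is a generic weir-type relation; C, b, dt, and the
    # cell area are illustrative assumptions.
    C = 1.7            # weir-type discharge coefficient (SI, illustrative)
    b = 2.0            # staircase opening width [m]
    dt, cell_area = 0.5, 25.0                     # time step [s], cell area [m^2]

    upper_depth, lower_depth = 0.30, 0.0          # water depths at the opening [m]
    for _ in range(10):
        q = C * b * max(upper_depth, 0.0) ** 1.5  # discharge down the stairs [m^3/s]
        dv = min(q * dt, upper_depth * cell_area) # cannot transfer more than stored
        upper_depth -= dv / cell_area             # remove volume from upper layer
        lower_depth += dv / cell_area             # add the same volume below
    print(round(upper_depth, 4), round(lower_depth, 4))
    ```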

  10. High-Resolution Adaptive Optics Test-Bed for Vision Science

    International Nuclear Information System (INIS)

    Wilks, S.C.; Thomspon, C.A.; Olivier, S.S.; Bauman, B.J.; Barnes, T.; Werner, J.S.

    2001-01-01

    We discuss the design and implementation of a low-cost, high-resolution adaptive optics test-bed for vision research. It is well known that high-order aberrations in the human eye reduce optical resolution and limit visual acuity. However, the effects of aberration-free eyesight on vision are only now beginning to be studied using adaptive optics to sense and correct the aberrations in the eye. We are developing a high-resolution adaptive optics system for this purpose using a Hamamatsu Parallel Aligned Nematic Liquid Crystal Spatial Light Modulator. Phase-wrapping is used to extend the effective stroke of the device, and the wavefront sensing and wavefront correction are done at different wavelengths. Issues associated with these techniques will be discussed

  11. Adaptive Statistical Iterative Reconstruction-V Versus Adaptive Statistical Iterative Reconstruction: Impact on Dose Reduction and Image Quality in Body Computed Tomography.

    Science.gov (United States)

    Gatti, Marco; Marchisio, Filippo; Fronda, Marco; Rampado, Osvaldo; Faletti, Riccardo; Bergamasco, Laura; Ropolo, Roberto; Fonio, Paolo

The aim of this study was to evaluate the impact on dose reduction and image quality of the new iterative reconstruction technique: adaptive statistical iterative reconstruction V (ASIR-V). Fifty consecutive oncologic patients acted as case controls, undergoing during their follow-up a computed tomography scan both with ASIR and with ASIR-V. Each study was analyzed in a double-blinded fashion by 2 radiologists. Both quantitative and qualitative analyses of image quality were conducted. Computed tomography scanner radiation output was 38% (29%-45%) lower for the ASIR-V examinations than for the ASIR ones. The quantitative image noise was significantly lower with ASIR-V. Adaptive statistical iterative reconstruction V had a higher performance for the subjective image noise (P = 0.01 for 5 mm and P = 0.009 for 1.25 mm), the other parameters (image sharpness, diagnostic acceptability, and overall image quality) being similar (P > 0.05). Adaptive statistical iterative reconstruction V is a new iterative reconstruction technique that has the potential to provide image quality equal to or greater than ASIR, with a dose reduction of around 40%.

  12. Design and Preliminary Testing of the International Docking Adapter's Peripheral Docking Target

    Science.gov (United States)

    Foster, Christopher W.; Blaschak, Johnathan; Eldridge, Erin A.; Brazzel, Jack P.; Spehar, Peter T.

    2015-01-01

    The International Docking Adapter's Peripheral Docking Target (PDT) was designed to allow a docking spacecraft to judge its alignment relative to the docking system. The PDT was designed to be compatible with relative sensors using visible cameras, thermal imagers, or Light Detection and Ranging (LIDAR) technologies. The conceptual design team tested prototype designs and materials to determine the contrast requirements for the features. This paper will discuss the design of the PDT, the methodology and results of the tests, and the conclusions pertaining to PDT design that were drawn from testing.

  13. Introducing Enabling Computational Tools to the Climate Sciences: Multi-Resolution Climate Modeling with Adaptive Cubed-Sphere Grids

    Energy Technology Data Exchange (ETDEWEB)

    Jablonowski, Christiane [Univ. of Michigan, Ann Arbor, MI (United States)

    2015-07-14

    The research investigates and advances strategies for bridging the scale discrepancies between local, regional and global phenomena in climate models without the prohibitive computational costs of global cloud-resolving simulations. In particular, the research explores new frontiers in computational geoscience by introducing high-order Adaptive Mesh Refinement (AMR) techniques into climate research. AMR and statically-adapted variable-resolution approaches represent an emerging trend for atmospheric models and are likely to become the new norm in future-generation weather and climate models. The research advances the understanding of multi-scale interactions in the climate system and showcases a pathway for modeling these interactions effectively with advanced computational tools, like the Chombo AMR library developed at the Lawrence Berkeley National Laboratory. The research is interdisciplinary and combines applied mathematics, scientific computing and the atmospheric sciences. In this research project, a hierarchy of high-order atmospheric models on cubed-sphere computational grids has been developed that serves as an algorithmic prototype for the finite-volume solution-adaptive Chombo-AMR approach. The foci of the investigations have been the characteristics of both static mesh adaptations and dynamically-adaptive grids that can capture flow fields of interest, like tropical cyclones. Six research themes have been chosen. These are (1) the introduction of adaptive mesh refinement techniques into the climate sciences, (2) advanced algorithms for nonhydrostatic atmospheric dynamical cores, (3) an assessment of the interplay between resolved-scale dynamical motions and subgrid-scale physical parameterizations, (4) evaluation techniques for atmospheric model hierarchies, (5) the comparison of AMR refinement strategies and (6) tropical cyclone studies with a focus on multi-scale interactions and variable-resolution modeling. The results of this research project …

  14. Modeling the Internet of Things, Self-Organizing and Other Complex Adaptive Communication Networks: A Cognitive Agent-Based Computing Approach.

    Directory of Open Access Journals (Sweden)

    Samreen Laghari

    Full Text Available Computer networks have a tendency to grow at an unprecedented scale. Modern networks involve not only computers but also a wide variety of other interconnected devices ranging from mobile phones to other household items fitted with sensors. This vision of the "Internet of Things" (IoT) implies an inherent difficulty in modeling problems. It is practically impossible to implement and test all scenarios for large-scale and complex adaptive communication networks as part of Complex Adaptive Communication Networks and Environments (CACOONS). The goal of this study is to explore the use of agent-based modeling as part of the Cognitive Agent-based Computing (CABC) framework to model a complex communication network problem. We use Exploratory Agent-based Modeling (EABM), as part of the CABC framework, to develop an autonomous multi-agent architecture for managing carbon footprint in a corporate network. To evaluate the application of complexity in practical scenarios, we have also introduced a company-defined computer usage policy. The conducted experiments demonstrated two important results: first, a CABC-based modeling approach such as agent-based modeling can be an effective approach to modeling complex problems in the domain of IoT; second, the specific problem of managing the carbon footprint can be solved using a multi-agent system approach.

  15. Modeling the Internet of Things, Self-Organizing and Other Complex Adaptive Communication Networks: A Cognitive Agent-Based Computing Approach.

    Science.gov (United States)

    Laghari, Samreen; Niazi, Muaz A

    2016-01-01

    Computer networks have a tendency to grow at an unprecedented scale. Modern networks involve not only computers but also a wide variety of other interconnected devices ranging from mobile phones to other household items fitted with sensors. This vision of the "Internet of Things" (IoT) implies an inherent difficulty in modeling problems. It is practically impossible to implement and test all scenarios for large-scale and complex adaptive communication networks as part of Complex Adaptive Communication Networks and Environments (CACOONS). The goal of this study is to explore the use of agent-based modeling as part of the Cognitive Agent-based Computing (CABC) framework to model a complex communication network problem. We use Exploratory Agent-based Modeling (EABM), as part of the CABC framework, to develop an autonomous multi-agent architecture for managing carbon footprint in a corporate network. To evaluate the application of complexity in practical scenarios, we have also introduced a company-defined computer usage policy. The conducted experiments demonstrated two important results: first, a CABC-based modeling approach such as agent-based modeling can be an effective approach to modeling complex problems in the domain of IoT; second, the specific problem of managing the carbon footprint can be solved using a multi-agent system approach.

  16. Diagnostic reliability of MMPI-2 computer-based test interpretations.

    Science.gov (United States)

    Pant, Hina; McCabe, Brian J; Deskovitz, Mark A; Weed, Nathan C; Williams, John E

    2014-09-01

    Reflecting the common use of the MMPI-2 to provide diagnostic considerations, computer-based test interpretations (CBTIs) also typically offer diagnostic suggestions. However, these diagnostic suggestions can sometimes be shown to vary widely across different CBTI programs, even for identical MMPI-2 profiles. The present study evaluated the diagnostic reliability of 6 commercially available CBTIs using a 20-item Q-sort task developed for this study. Four raters each sorted diagnostic classifications based on these 6 CBTI reports for 20 MMPI-2 profiles. Two questions were addressed. First, do users of CBTIs understand the diagnostic information contained within the reports similarly? Overall, diagnostic sorts of the CBTIs showed moderate inter-interpreter diagnostic reliability (mean r = .56), with sorts for the 1/2/3 profile showing the highest inter-interpreter diagnostic reliability (mean r = .67). Second, do different CBTI programs vary with respect to diagnostic suggestions? Diagnostic sorts of the CBTIs had a mean inter-CBTI diagnostic reliability of r = .56, indicating moderate but not strong agreement across CBTIs in terms of diagnostic suggestions. The strongest inter-CBTI diagnostic agreement was found for sorts of the 1/2/3 profile CBTIs (mean r = .71). Limitations and future directions are discussed. PsycINFO Database Record (c) 2014 APA, all rights reserved.
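    The inter-rater reliabilities reported here are mean correlations over pairs of raters. As a minimal sketch of that computation (assuming plain Pearson correlations are averaged, which the abstract does not state; names are illustrative):

```python
import numpy as np
from itertools import combinations

def mean_pairwise_r(sorts):
    """Mean Pearson correlation over all pairs of raters.
    sorts: (n_raters, n_items) array of Q-sort ratings."""
    pairs = combinations(range(sorts.shape[0]), 2)
    rs = [np.corrcoef(sorts[i], sorts[j])[0, 1] for i, j in pairs]
    return float(np.mean(rs))
```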

  17. Sleep deprivation selectively disrupts top-down adaptation to cognitive conflict in the Stroop test.

    Science.gov (United States)

    Gevers, Wim; Deliens, Gaetane; Hoffmann, Sophie; Notebaert, Wim; Peigneux, Philippe

    2015-12-01

    Sleep deprivation is known to exert detrimental effects on various cognitive domains, including attention, vigilance and working memory. Seemingly at odds with these findings, prior studies have repeatedly failed to evidence an impact of prior sleep deprivation on cognitive interference in the Stroop test, a hallmark paradigm in the study of cognitive control abilities. The present study further investigated the effect of sleep deprivation on cognitive control using an adapted version of the Stroop test that makes it possible to segregate the top-down (attentional reconfiguration on incongruent items) and bottom-up (facilitated processing after repetitions in responses and/or stimulus features) components of performance. Participants underwent a regular night of sleep or a night of total sleep deprivation before cognitive testing. Results disclosed that sleep deprivation selectively impairs top-down adaptation mechanisms: cognitive control no longer increased upon detection of response conflict at the preceding trial. In parallel, bottom-up abilities were found to be unaffected by sleep deprivation: the beneficial effects of stimulus and response repetitions persisted. Changes in vigilance states due to sleep deprivation thus selectively impact cognitive control in the Stroop test by affecting top-down, but not bottom-up, mechanisms that guide adaptive behaviours. © 2015 European Sleep Research Society.

  18. Capabilities of wind tunnels with two-adaptive walls to minimize boundary interference in 3-D model testing

    Science.gov (United States)

    Rebstock, Rainer; Lee, Edwin E., Jr.

    1989-01-01

    An initial wind tunnel test was made to validate a new wall adaptation method for 3-D models in test sections with two adaptive walls. The first part of the adaptation strategy is an on-line assessment of wall interference at the model position. The wall-induced blockage was very small at all test conditions. Lift interference occurred at higher angles of attack with the walls set aerodynamically straight. The adaptation of the top and bottom tunnel walls is aimed at achieving a correctable flow condition. The blockage was virtually zero throughout the wing planform after the wall adjustment. The lift curve measured with the walls adapted agreed very well with interference-free data for Mach 0.7, regardless of the vertical position of the wing in the test section. The 2-D wall adaptation can significantly improve the correctability of 3-D model data. Nevertheless, residual spanwise variations of wall interference are inevitable.

  19. Walk a Mile in My Shoes: Stakeholder Accounts of Testing Experience with a Computer-Administered Test

    Science.gov (United States)

    Fox, Janna; Cheng, Liying

    2015-01-01

    In keeping with the trend to elicit multiple stakeholder responses to operational tests as part of test validation, this exploratory mixed methods study examines test-taker accounts of an Internet-based (i.e., computer-administered) test in the high-stakes context of proficiency testing for university admission. In 2013, as language testing…

  20. Exposure Control Using Adaptive Multi-Stage Item Bundles.

    Science.gov (United States)

    Luecht, Richard M.

    This paper presents a multistage adaptive test development paradigm that promises to handle content balancing and other test development needs, psychometric reliability concerns, and item exposure. The bundled multistage adaptive testing (BMAT) framework is a modification of the computer-adaptive sequential testing framework introduced by…

  1. A study of the image quality of computed tomography adaptive statistical iterative reconstructed brain images using subjective and objective methods

    International Nuclear Information System (INIS)

    Mangat, J.; Morgan, J.; Benson, E.; Baath, M.; Lewis, M.; Reilly, A.

    2016-01-01

    The recent reintroduction of iterative reconstruction in computed tomography has facilitated the realisation of major dose savings. The aim of this article was to investigate the possibility of achieving further savings at a site with well-established Adaptive Statistical Iterative Reconstruction (ASiR™, GE Healthcare) brain protocols. An adult patient study was conducted with observers making visual grading assessments using image quality criteria, which were compared with the frequency domain metrics, noise power spectrum and modulation transfer function. Subjective image quality equivalency was found in the 40-70% ASiR™ range, leading to the proposal of ranges for the objective metrics defining acceptable image quality. Based on the findings of both the patient-based and objective studies of the ASiR™/tube-current combinations tested, 60%/305 mA was found to fall within all but one of these ranges. Therefore, it is recommended that an ASiR™ level of 60%, with a noise index of 12.20, is a viable alternative to the currently used protocol featuring a 40% ASiR™ level and a noise index of 11.20, potentially representing a 16% dose saving. (authors)

  2. Computer automation of ultrasonic testing. [inspection of ultrasonic welding

    Science.gov (United States)

    Yee, B. G. W.; Kerlin, E. E.; Gardner, A. H.; Dunmyer, D.; Wells, T. G.; Robinson, A. R.; Kunselman, J. S.; Walker, T. C.

    1974-01-01

    This report describes a prototype computer-automated ultrasonic system developed for the inspection of weldments. The system can be operated in three modes: manual, automatic, and computer-controlled. In the computer-controlled mode, the system will automatically acquire, process, analyze, store, and display ultrasonic inspection data in real time. Flaw size (in cross-section), location (depth), and type (porosity-like or crack-like) can be automatically discerned and displayed. The results and pertinent parameters are recorded.

  3. The use of computers for the performance and analysis of non-destructive testing

    International Nuclear Information System (INIS)

    Edelmann, X.; Pfister, O.

    1988-01-01

    Examples of the use of computers in non-destructive testing are presented, with particular attention to ultrasonic testing. For the user, the employment of computers brings improvements: the possibility of registering the reflector position, storage of test data, and help with documentation. The test can be automated. The introduction of expert systems is expected in the future. 8 figs., 12 refs

  4. Pepsi-SAXS : an adaptive method for rapid and accurate computation of small-angle X-ray scattering profiles

    OpenAIRE

    Grudinin, Sergei; Garkavenko, Maria; Kazennov, Andrei

    2017-01-01

    International audience; A new method called Pepsi-SAXS is presented that calculates small-angle X-ray scattering profiles from atomistic models. The method is based on the multipole expansion scheme and is significantly faster compared with other tested methods. In particular, using the Nyquist–Shannon–Kotelnikov sampling theorem, the multipole expansion order is adapted to the size of the model and the resolution of the experimental data. It is argued that by using the adaptive expansion ord...

  5. The test ability of an adaptive pulse wave for ADC testing

    NARCIS (Netherlands)

    Sheng, Xiaoqin; Kerkhoff, Hans G.

    2010-01-01

    In the conventional ADC production test method, a high-quality analogue sine wave is applied to the Analogue-to-Digital Converter (ADC), which is expensive to generate. Nowadays, an increasing number of ADCs are integrated into a system-on-chip (SoC) platform design, which usually contains a digital

  6. Stochastic order in dichotomous item response models for fixed tests, research adaptive tests, or multiple abilities

    NARCIS (Netherlands)

    van der Linden, Willem J.

    1995-01-01

    Dichotomous item response theory (IRT) models can be viewed as families of stochastically ordered distributions of responses to test items. This paper explores several properties of such distributions. The focus is on the conditions under which stochastic order in families of conditional

  7. Measurement of flat samples with rough surfaces by Magnetic Adaptive Testing

    Czech Academy of Sciences Publication Activity Database

    Tomáš, Ivan; Kadlecová, Jana; Vértesy, G.

    2012-01-01

    Roč. 48, č. 4 (2012), s. 1441-1444 ISSN 0018-9464. [Conference on Soft Magnetic Materials (SMM20) /20./. Kos Island, 18.09.2011-22.09.2011] R&D Projects: GA ČR GA101/09/1323 Institutional research plan: CEZ:AV0Z10100520 Keywords : magnetic contact * magnetic adaptive testing * magnetically open samples * magnetic NDE Subject RIV: BM - Solid Matter Physics ; Magnetism Impact factor: 1.422, year: 2012

  8. DEVELOPMENT AND ADAPTATION OF VORTEX REALIZABLE MEASUREMENT SYSTEM FOR BENCHMARK TEST WITH LARGE SCALE MODEL OF NUCLEAR REACTOR

    Directory of Open Access Journals (Sweden)

    S. M. Dmitriev

    2017-01-01

    Full Text Available The last decades of development of applied calculation methods for nuclear reactor thermal and hydraulic processes have been marked by the rapid growth of High Performance Computing (HPC), which has contributed to the active introduction of Computational Fluid Dynamics (CFD). The use of such programs to justify the technical and economic parameters, and especially the safety, of nuclear reactors requires comprehensive verification of the mathematical models and CFD programs. The aim of this work was the development and adaptation of a measuring system having the characteristics necessary for its application in a verification test (experimental) facility. Its main objective is to study the mixing of coolant flows with different physical properties (for example, different concentrations of dissolved impurities) inside a large-scale reactor model. The basic method used for registration of the spatial concentration field in the mixing area is spatial conductometry. In the course of the work, a measurement complex, including spatial conductometric sensors, a system of secondary converters and software, was created. Methods of calibration and normalization of measurement results were developed. Averaged concentration fields and nonstationary realizations of the measured local conductivity were obtained during the first experimental series, and spectral and statistical analyses of the realizations were carried out. The acquired data are compared with pretest CFD calculations performed in the ANSYS CFX program. A joint analysis of the obtained results made it possible to identify the main regularities of the process under study and to demonstrate the capability of the designed measuring system to provide the experimental data of the «CFD-quality» required for verification. The completed adaptation of the spatial sensors allows a more extensive program of experimental tests, on the basis of which a databank and the necessary generalizations will be created

  9. Intelligent Adaptation and Personalization Techniques in Computer-Supported Collaborative Learning

    CERN Document Server

    Demetriadis, Stavros; Xhafa, Fatos

    2012-01-01

    Adaptation and personalization have been extensively studied in the CSCL research community, with the aim of designing intelligent systems that adaptively support eLearning processes and collaboration. Yet, with the fast development of Internet technologies, especially the emergence of new data technologies and mobile technologies, new opportunities and perspectives are opened for advanced adaptive and personalized systems. Adaptation and personalization are posing new research and development challenges to today's CSCL systems. In particular, adaptation should be approached in a multi-dimensional way (cognitive, technological, context-aware and personal). Moreover, it should address the particularities of both individual learners and group collaboration. As a consequence, the aim of this book is twofold. On the one hand, it discusses the latest advances and findings in the area of intelligent adaptive and personalized learning systems. On the other hand, it analyzes the new implementation perspectives for intelligen...

  10. Influence of adaptive statistical iterative reconstruction algorithm on image quality in coronary computed tomography angiography

    Directory of Open Access Journals (Sweden)

    Helle Precht

    2016-12-01

    Full Text Available Background Coronary computed tomography angiography (CCTA) requires high spatial and temporal resolution, and increased low contrast resolution, for the assessment of coronary artery stenosis, plaque detection, and/or non-coronary pathology. Therefore, new reconstruction algorithms, particularly iterative reconstruction (IR) techniques, have been developed in an attempt to improve image quality with no cost in radiation exposure. Purpose To evaluate whether adaptive statistical iterative reconstruction (ASIR) enhances perceived image quality in CCTA compared to filtered back projection (FBP). Material and Methods Thirty patients underwent CCTA due to suspected coronary artery disease. Images were reconstructed using FBP, 30% ASIR, and 60% ASIR. Ninety image sets were evaluated by five observers using subjective visual grading analysis (VGA) and assessed by proportional odds modeling. Objective quality assessment (contrast, noise, and the contrast-to-noise ratio [CNR]) was analyzed with linear mixed effects modeling on log-transformed data. The need for ethical approval was waived by the local ethics committee as the study only involved anonymously collected clinical data. Results VGA showed significant improvements in sharpness by comparing FBP with ASIR, resulting in odds ratios of 1.54 for 30% ASIR and 1.89 for 60% ASIR (P = 0.004). The objective measures showed significant differences between FBP and 60% ASIR (P < 0.0001) for noise, with an estimated ratio of 0.82, and for CNR, with an estimated ratio of 1.26. Conclusion ASIR improved the subjective image quality of parameter sharpness and, objectively, reduced noise and increased CNR.
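    The objective metrics used here are standard ROI-based quantities. As a rough sketch of how contrast, noise, and CNR are typically computed from two regions of interest (the study's exact ROI definitions are not given in the abstract):

```python
import numpy as np

def objective_metrics(roi_signal, roi_background):
    """Contrast, noise and CNR from two pixel-value arrays."""
    contrast = abs(float(roi_signal.mean()) - float(roi_background.mean()))
    noise = float(roi_background.std(ddof=1))  # image noise as background SD
    return contrast, noise, contrast / noise
```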

  11. USING COMPUTER-BASED TESTING AS ALTERNATIVE ASSESSMENT METHOD OF STUDENT LEARNING IN DISTANCE EDUCATION

    Directory of Open Access Journals (Sweden)

    Amalia SAPRIATI

    2010-04-01

    Full Text Available This paper addresses the use of computer-based testing in distance education, based on the experience of Universitas Terbuka (UT), Indonesia. Computer-based testing has been developed at UT to meet the specific needs of distance students, such as: (1) students' inability to sit for the scheduled test, (2) conflicting test schedules, and (3) students' wish for the flexibility to retake an examination to improve their grades. In 2004, UT initiated a pilot project to develop a system and program for the computer-based testing method. Then, in 2005 and 2006, tryouts of the computer-based testing method were conducted in 7 Regional Offices considered to have sufficient supporting resources. The results of the tryouts revealed that students were enthusiastic about taking computer-based tests and expected that the test method would be provided by UT as an alternative to the traditional paper-and-pencil test method. UT then implemented the computer-based testing method in 6 and 12 Regional Offices in 2007 and 2008, respectively. The computer-based testing was administered in the city of the designated Regional Office and was supervised by the Regional Office staff. The development of the computer-based testing began with tests using computers in a networked configuration. The system has been continually improved, and it currently uses devices linked to the internet or the World Wide Web. The construction of a test involves the generation and selection of test items from the item bank collection of the UT Examination Center; the combination of the selected items thus comprises the test specification. Currently, UT offers 250 courses involving the use of computer-based testing. Students expect more courses to be offered with computer-based testing in Regional Offices within easy access by students.

  12. Adaptation of the radiation dose for computed tomography of the body - background for the dose adaptation programme OmnimAs

    International Nuclear Information System (INIS)

    Nyman, Ulf; Kristiansson, Mattias; Leitz, Wolfram; Paahlstorp, Per-Aake

    2004-11-01

    When performing computed tomography examinations, the exposure factors are hardly ever adapted to the patient's size. One reason for that might be the lack of simple methods. This report describes the computer programme OmnimAs, which calculates how the exposure factors should vary with the patient's perimeter (easily measured with a measuring tape). The first approximation is to calculate the exposure values giving the same noise level in the image irrespective of the patient's size. A clinical evaluation has shown that this relationship has to be modified. One chapter describes the physical background behind the programme. Results calculated with OmnimAs are in good agreement with a number of published studies. Clinical experience shows the usability of OmnimAs. Finally, the correlation between several parameters and image quality/dose is discussed, along with how this correlation can be used to optimise CT examinations
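    The "constant noise" first approximation described above can be sketched as follows. The water-cylinder body model, the attenuation coefficient, and all names are illustrative assumptions, not the OmnimAs algorithm itself (which, as the report notes, was further modified after clinical evaluation):

```python
import math

MU_WATER = 0.19  # approx. linear attenuation of water at CT energies [1/cm]; an assumption

def adapted_mas(mas_ref, perimeter_ref, perimeter, mu=MU_WATER):
    """mAs that keeps the detected quanta (and hence image noise)
    roughly constant when patient girth changes.  Perimeters in cm;
    the body is idealized as a water cylinder of diameter P / pi."""
    d_ref = perimeter_ref / math.pi
    d = perimeter / math.pi
    return mas_ref * math.exp(mu * (d - d_ref))

# Example: reference protocol 200 mAs at 90 cm perimeter; patient measures 105 cm
print(round(adapted_mas(200.0, 90.0, 105.0)))  # roughly 2.5x the reference mAs
```

    The exponential form is why equal-noise scaling quickly becomes clinically unreasonable for large patients, and hence why a pure constant-noise rule needs modification in practice.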

  13. Modeling UV Radiation Feedback from Massive Stars. I. Implementation of Adaptive Ray-tracing Method and Tests

    Science.gov (United States)

    Kim, Jeong-Gyu; Kim, Woong-Tae; Ostriker, Eve C.; Skinner, M. Aaron

    2017-12-01

    We present an implementation of an adaptive ray-tracing (ART) module in the Athena hydrodynamics code that accurately and efficiently handles the radiative transfer involving multiple point sources on a three-dimensional Cartesian grid. We adopt a recently proposed parallel algorithm that uses nonblocking, asynchronous MPI communications to accelerate transport of rays across the computational domain. We validate our implementation through several standard test problems, including the propagation of radiation in vacuum and the expansions of various types of H II regions. Additionally, scaling tests show that the cost of a full ray trace per source remains comparable to that of the hydrodynamics update on up to ~10^3 processors. To demonstrate application of our ART implementation, we perform a simulation of star cluster formation in a marginally bound, turbulent cloud, finding that its star formation efficiency is 12% when both radiation pressure forces and photoionization by UV radiation are treated. We directly compare the radiation forces computed from the ART scheme with those from the M1 closure relation. Although the ART and M1 schemes yield similar results on large scales, the latter is unable to resolve the radiation field accurately near individual point sources.

  14. Online adaptation of a c-VEP Brain-Computer Interface (BCI) based on error-related potentials and unsupervised learning.

    Science.gov (United States)

    Spüler, Martin; Rosenstiel, Wolfgang; Bogdan, Martin

    2012-01-01

    The goal of a Brain-Computer Interface (BCI) is to control a computer by pure brain activity. Recently, BCIs based on code-modulated visual evoked potentials (c-VEPs) have shown great potential to establish high-performance communication. In this paper we present a c-VEP BCI that uses online adaptation of the classifier to reduce calibration time and increase performance. We compare two different approaches for online adaptation of the system: an unsupervised method and a method that uses the detection of error-related potentials. Both approaches were tested in an online study, in which an average accuracy of 96% was achieved with adaptation based on error-related potentials. This accuracy corresponds to an average information transfer rate of 144 bit/min, which is the highest bitrate reported so far for a non-invasive BCI. In a free-spelling mode, the subjects were able to write with an average of 21.3 error-free letters per minute, which shows the feasibility of the BCI system in a normal-use scenario. In addition we show that a calibration of the BCI system solely based on the detection of error-related potentials is possible, without knowing the true class labels.
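    Bitrates like the 144 bit/min quoted here are conventionally computed with the Wolpaw information-transfer-rate formula. A generic sketch follows; the class count and selection rate are placeholders, not the study's actual values:

```python
import math

def wolpaw_itr(n_classes, accuracy, selections_per_min):
    """Information transfer rate in bits/min (Wolpaw et al. formula)."""
    n, p = n_classes, accuracy
    bits = math.log2(n)  # bits per selection for a perfect classifier
    if 0.0 < p < 1.0:
        bits += p * math.log2(p) + (1 - p) * math.log2((1 - p) / (n - 1))
    return bits * selections_per_min

# Hypothetical 32-class speller at 96% accuracy, 30 selections per minute
print(wolpaw_itr(32, 0.96, 30.0))  # about 137 bits/min
```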

  15. Applying modern psychometric techniques to melodic discrimination testing: Item response theory, computerised adaptive testing, and automatic item generation.

    Science.gov (United States)

    Harrison, Peter M C; Collins, Tom; Müllensiefen, Daniel

    2017-06-15

    Modern psychometric theory provides many useful tools for ability testing, such as item response theory, computerised adaptive testing, and automatic item generation. However, these techniques have yet to be integrated into mainstream psychological practice. This is unfortunate, because modern psychometric techniques can bring many benefits, including sophisticated reliability measures, improved construct validity, avoidance of exposure effects, and improved efficiency. In the present research we therefore use these techniques to develop a new test of a well-studied psychological capacity: melodic discrimination, the ability to detect differences between melodies. We calibrate and validate this test in a series of studies. Studies 1 and 2 respectively calibrate and validate an initial test version, while Studies 3 and 4 calibrate and validate an updated test version incorporating additional easy items. The results support the new test's viability, with evidence for strong reliability and construct validity. We discuss how these modern psychometric techniques may also be profitably applied to other areas of music psychology and psychological science in general.
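    The adaptive-testing machinery referred to here typically selects each next item by maximizing Fisher information at the current ability estimate. A generic sketch under a 2PL model (the article's actual IRT model and selection rule may differ; the item bank below is a toy):

```python
import numpy as np

def p_2pl(theta, a, b):
    """Probability of a correct response under the 2PL model."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def next_item(theta_hat, a, b, administered):
    """Index of the unadministered item with maximum Fisher
    information I(theta) = a^2 * P * (1 - P) at theta_hat."""
    p = p_2pl(theta_hat, a, b)
    info = a ** 2 * p * (1.0 - p)
    info[list(administered)] = -np.inf  # never repeat an item
    return int(np.argmax(info))

# Toy bank: discriminations a, difficulties b
a = np.array([1.2, 0.8, 1.5, 1.0])
b = np.array([-1.0, 0.0, 0.5, 1.5])
print(next_item(0.3, a, b, administered={2}))
```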

  16. Identifying genetic marker sets associated with phenotypes via an efficient adaptive score test

    KAUST Repository

    Cai, T.

    2012-06-25

    In recent years, genome-wide association studies (GWAS) and gene-expression profiling have generated a large number of valuable datasets for assessing how genetic variations are related to disease outcomes. With such datasets, it is often of interest to assess the overall effect of a set of genetic markers, assembled based on biological knowledge. Genetic marker-set analyses have been advocated as more reliable and powerful approaches compared with the traditional marginal approaches (Curtis and others, 2005. Pathways to the analysis of microarray data. TRENDS in Biotechnology 23, 429-435; Efroni and others, 2007. Identification of key processes underlying cancer phenotypes using biologic pathway analysis. PLoS One 2, 425). Procedures for testing the overall effect of a marker-set have been actively studied in recent years. For example, score tests derived under an Empirical Bayes (EB) framework (Liu and others, 2007. Semiparametric regression of multidimensional genetic pathway data: least-squares kernel machines and linear mixed models. Biometrics 63, 1079-1088; Liu and others, 2008. Estimation and testing for the effect of a genetic pathway on a disease outcome using logistic kernel machine regression via logistic mixed models. BMC bioinformatics 9, 292-2; Wu and others, 2010. Powerful SNP-set analysis for case-control genome-wide association studies. American Journal of Human Genetics 86, 929) have been proposed as powerful alternatives to the standard Rao score test (Rao, 1948. Large sample tests of statistical hypotheses concerning several parameters with applications to problems of estimation. Mathematical Proceedings of the Cambridge Philosophical Society, 44, 50-57). The advantages of these EB-based tests are most apparent when the markers are correlated, due to the reduction in the degrees of freedom. In this paper, we propose an adaptive score test which up- or down-weights the contributions from each member of the marker-set based on the Z-scores of

  17. [Studies on the changes of adaptation with children in the dental setting. The relationship between the changes of adaptation and various psychological tests].

    Science.gov (United States)

    Uchida, T; Mukai, Y; Sasa, R

    1991-01-01

    The purpose of this study was to discover the changes in the adaptation of children to the dental setting, and the relationships between that adaptation and the personality of the child, the personality of the mother, and the mother-child relationship. The subjects were 60 two- to six-year-old children and their mothers who visited the Department of Pedodontics, School of Dentistry, Showa University. The results were as follows: 1) The changes of adaptation were classified into four groups: Continuous Adaptability (45.0%), Acquired Adaptability (18.3%), Continuous Inadaptability (16.7%), and Extreme Inadaptability (20.0%). 2) For the two- to three-year-old children, the inadaptability groups (Continuous Inadaptability and Extreme Inadaptability) showed no correlation between the change of adaptation and the personality of the child or the mother-child relationship. 3) For the four-year-old children, the Extreme Inadaptability group showed a connection between the change of adaptation and the various psychological tests. Concerning personality, these children showed elements of "dependence", "retrogression" and "maladaptation to school (kindergarten)". Concerning the mother-child relationship, there were elements of "anxiety", "dotage", "blind obedience" and "disagreement". 4) No five- to six-year-old child showed Extreme Inadaptability. The Continuous Inadaptability group among the five- to six-year-old children showed scarcely any problems. 5) The personality of the mother did not correlate with the change of adaptation of children in the dental setting.

  18. Exaptation in human evolution: how to test adaptive vs exaptive evolutionary hypotheses.

    Science.gov (United States)

    Pievani, Telmo; Serrelli, Emanuele

    2011-01-01

    Palaeontologists Stephen J. Gould and Elisabeth Vrba introduced the term "exaptation" with the aim of improving and enlarging the scientific language available to researchers studying the evolution of any useful character, instead of calling it an "adaptation" by default, coming up with what Gould named an "extended taxonomy of fitness". With the extension to functional co-optations from non-adaptive structures ("spandrels"), the notion of exaptation expanded and revised the neo-Darwinian concept of "pre-adaptation" (which was misleading, for Gould and Vrba, in suggesting foreordination). Exaptation is neither a "saltationist" nor an "anti-Darwinian" concept and, since 1982, has been adopted by many researchers in evolutionary and molecular biology, and particularly in human evolution. Exaptation has also been contested. Objections include the "non-operationality objection". We analyze the possible operationalization of this concept in two recent studies, and identify six directions of empirical research, which are necessary to test "adaptive vs. exaptive" evolutionary hypotheses. We then comment on a comprehensive survey of the literature (available online), and on this basis we make a quantitative and qualitative evaluation of the adoption of the term among scientists who study human evolution. We discuss the epistemic conditions that may have influenced the adoption and appropriate use of exaptation, and comment on the benefits of an "extended taxonomy of fitness" in present and future studies concerning human evolution.

  19. Flight Test of L1 Adaptive Control Law: Offset Landings and Large Flight Envelope Modeling Work

    Science.gov (United States)

    Gregory, Irene M.; Xargay, Enric; Cao, Chengyu; Hovakimyan, Naira

    2011-01-01

    This paper presents new results of a flight test of the L1 adaptive control architecture designed to directly compensate for significant uncertain cross-coupling in nonlinear systems. The flight test was conducted on the subscale turbine powered Generic Transport Model that is an integral part of the Airborne Subscale Transport Aircraft Research system at the NASA Langley Research Center. The results presented include control law evaluation for piloted offset landing tasks as well as results in support of nonlinear aerodynamic modeling and real-time dynamic modeling of the departure-prone edges of the flight envelope.

  20. Adaptation of the C.H.A.D. computer library to nuclear simulations

    Science.gov (United States)

    Rock, Daniel Thomas

    The Computational Hydrodynamics for Automotive Design computer program, CHAD, is a modern, three-dimensional computational fluid dynamics code that holds promise for fulfilling a need in the nuclear industry and academia. Because CHAD may be freely distributed to non-export-controlled countries, it offers a cheap and customizable CFD capability. Several modifications were made to CHAD to make it more usable to those in industry and academia. A first-order upwinding scheme for momentum and enthalpy and a reformulated continuity equation were migrated from a historical version of CHAD developed at Argonne National Laboratory. The Portable, Extensible Toolkit for Scientific Computing, PETSc, was also added as an optional solver package for the original and reformulated continuity equations. PETSc's highly optimized parallel solvers can be activated from either CHAD's input file or the command line. Solution times for PETSc-based calculations depend in large part on the convergence criteria provided; however, improvements in CPU time of approximately one-third have been observed. CHAD was further extended by adding a capability to monitor solution progress at a specified coordinate in space, as well as by monitoring the residuals in the problem. The ability to model incompressible fluids was also added to the code. Incompressible-fluid comparisons were made for several test cases against the commercial CFD code Fluent and found to agree well. A major limitation of CHAD in the academic environment is its limited mesh generation capability. A tool was developed for CHAD that translates Gambit-based neutral mesh files into a CHAD-usable format. This tool was used to translate a large mesh representing a simplified cooling jacket of a BWR control rod drive. This model serves as a practical demonstration of a nuclear application for CHAD and PETSc. Both CHAD with PETSc and Fluent were used to obtain solutions to this problem. The overall agreement between the two

  1. BRBN-T validation: adaptation of the Selective Reminding Test and Word List Generation

    Directory of Open Access Journals (Sweden)

    Mariana Rigueiro Neves

    2015-10-01

    Full Text Available Objective This study aims to present the adaptation of the Selective Reminding Test (SRT) and Word List Generation (WLG) to the Portuguese population, within the validation of the Brief Repeatable Battery of Neuropsychological Tests (BRBN-T) for multiple sclerosis (MS) patients. Method 66 healthy participants (54.5% female) recruited from the community volunteered to participate in this study. Results A combination of procedures from Classical Test Theory (CTT) and Item Response Theory (IRT) was applied to item analysis and selection. For each SRT list, 12 words were selected, and 3 letters were chosen for WLG, to constitute the final versions of these tests for the Portuguese population. Conclusion The combination of CTT and IRT maximized the decision-making process in the adaptation of the SRT and WLG to a different culture and language (Portuguese). The relevance of this study lies in the production of reliable standardized neuropsychological tests, so that they can be used to facilitate a more rigorous monitoring of the evolution of MS, as well as of any therapeutic effects and cognitive rehabilitation.

  2. Translation, Cultural Adaptation and Validation of the Simple Shoulder Test to Spanish

    Science.gov (United States)

    Arcuri, Francisco; Barclay, Fernando; Nacul, Ivan

    2015-01-01

    Background: The validation of widely used scales facilitates the comparison across international patient samples. Objective: The objective was to translate, culturally adapt and validate the Simple Shoulder Test into Argentinian Spanish. Methods: The Simple Shoulder Test was translated from English into Argentinian Spanish by two independent translators, translated back into English, and evaluated for accuracy by an expert committee to correct possible discrepancies. It was then administered to 50 patients with different shoulder conditions. Psychometric properties were analyzed, including internal consistency, measured with Cronbach's alpha, and test-retest reliability at 15 days, measured with the intraclass correlation coefficient (ICC). Results: The internal consistency was an alpha of 0.808, evaluated as good. The test-retest reliability index as measured by the ICC was 0.835, evaluated as excellent. Conclusion: The Simple Shoulder Test translation and its cultural adaptation to Argentinian Spanish demonstrated adequate internal reliability and validity, ultimately allowing for its use in comparisons with international patient samples.
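    For reference, the internal-consistency statistic reported above is computed as follows; this is a generic sketch, not the study's analysis:

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_subjects, n_items) score matrix."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)          # per-item variances
    total_var = scores.sum(axis=1).var(ddof=1)      # variance of total scores
    return k / (k - 1) * (1.0 - item_vars.sum() / total_var)
```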

  3. Adaptive and Qualitative Changes in Encoding Strategy With Experience: Evidence From the Test-Expectancy Paradigm

    Science.gov (United States)

    Finley, Jason R.; Benjamin, Aaron S.

    2012-01-01

    Three experiments demonstrated learners’ abilities to adaptively and qualitatively accommodate their encoding strategies to the demands of an upcoming test. Stimuli were word pairs. In Experiment 1, test expectancy was induced for either cued recall (of targets given cues) or free recall (of targets only) across 4 study–test cycles of the same test format, followed by a final critical cycle featuring either the expected or the unexpected test format. For final tests of both cued and free recall, participants who had expected that test format outperformed those who had not. This disordinal interaction, supported by recognition and self-report data, demonstrated not mere differences in effort based on anticipated test difficulty, but rather qualitative and appropriate differences in encoding strategies based on expected task demands. Participants also came to appropriately modulate metacognitive monitoring (Experiment 2) and study-time allocation (Experiment 3) across study–test cycles. Item and associative recognition performance, as well as self-report data, revealed shifts in encoding strategies across trials; these results were used to characterize and evaluate the different strategies that participants employed for cued versus free recall and to assess the optimality of participants’ metacognitive control of encoding strategies. Taken together, these data illustrate a sophisticated form of metacognitive control, in which learners qualitatively shift encoding strategies to match the demands of anticipated tests. PMID:22103783

  4. Computational tool for immunotoxic assessment of pyrethroids toward adaptive immune cell receptors.

    Science.gov (United States)

    Kumar, Anoop; Behera, Padma Charan; Rangra, Naresh Kumar; Dey, Suddhasattya; Kant, Kamal

    2018-01-01

    Pyrethroids are prominently known for their insecticidal actions worldwide, but recent reports of anticancer and antiviral applications have generated interest in further understanding their safety and immunotoxicity. This encouraged us to carry out the present study evaluating the interactions of pyrethroids with adaptive immune cell receptors. Type 1 and Type 2 pyrethroids were tested against T-cell (CD4 and CD8) and B-cell (CD28 and CD45) immune receptors using Maestro 9.3 (Schrödinger, LLC, Cambridge, USA). In addition, the top-ranked tested ligands were explored for toxicity prediction in rodents using the ProTOX tool. Pyrethroids (specifically Type 2) such as fenvalerate (-5.534 kcal/mol: CD8), fluvalinate (-4.644 and -4.431 kcal/mol: CD4 and CD45), and cypermethrin (-3.535 kcal/mol: CD28) yielded lower energies, i.e., higher affinities, for B-cell and T-cell immune receptors, which may result in immunosuppressive and hypersensitivity reactions. The current findings indicate a need to further assess the Type 2 pyrethroids in wet-laboratory experiments to understand the chemical nature of pyrethroid-induced immunotoxicity. Fenvalerate showed the top glide score toward the CD8 immune receptor, while fluvalinate showed the top-ranked binding with the CD4 and CD45 immune proteins. In addition, cypermethrin gave the top glide score against the CD28 immune receptor. The top dock hits (Type 2 pyrethroids) showed probable toxicity targets of AOFA and PGH1, respectively. Abbreviations used: PDB: Protein Data Bank; AOFA: Amine oxidase (flavin-containing) A; PGH1: Prostaglandin G/H synthase 1.

  5. Self-adaptive method to distinguish inner and outer contours of industrial computed tomography image for rapid prototype

    International Nuclear Information System (INIS)

    Duan Liming; Ye Yong; Zhang Xia; Zuo Jian

    2013-01-01

    A self-adaptive identification method is proposed for more accurate and efficient judgment of whether the contours in an industrial computed tomography (CT) slice image are inner or outer contours. First, the convexity or concavity of each single-pixel-wide closed contour is identified with the angle method. Then, contours with concave vertices are distinguished as inner or outer contours with the ray method, and contours without concave vertices are distinguished with the extreme-coordinate-value method. The method automatically chooses between these distinguishing methods by identifying the convexity and concavity of the contours. Thus, the disadvantages of the single distinguishing methods, such as the ray method's long running time and the extreme-coordinate-value method's fallibility, can be avoided. The experiments prove the adaptability, efficiency, and accuracy of the self-adaptive method. (authors)
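    The ray method mentioned above is essentially even-odd ray casting. A minimal sketch of the point-in-polygon test it rests on (the paper's exact implementation details are not given in the abstract):

```python
def point_in_contour(pt, contour):
    """Even-odd ray casting: True if pt lies inside the closed
    contour (a list of (x, y) vertices).  A contour whose points lie
    inside another contour can then be classified as an inner one."""
    x, y = pt
    inside = False
    n = len(contour)
    for i in range(n):
        (x1, y1), (x2, y2) = contour[i], contour[(i + 1) % n]
        if (y1 > y) != (y2 > y):                      # edge spans the ray's y level
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:                           # crossing lies to the right
                inside = not inside
    return inside
```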

  6. Using Tests Designed to Measure Individual Sensorimotor Subsystem Perfomance to Predict Locomotor Adaptability

    Science.gov (United States)

    Peters, B. T.; Caldwell, E. E.; Batson, C. D.; Guined, J. R.; DeDios, Y. E.; Stepanyan, V.; Gadd, N. E.; Szecsy, D. L.; Mulavara, A. P.; Seidler, R. D.; et al.

    2014-01-01

    Astronauts experience sensorimotor disturbances during the initial exposure to microgravity and during the readaptation phase following a return to a gravitational environment. These alterations may lead to disruption in the ability to perform mission-critical functions during and after these gravitational transitions. Astronauts show significant inter-subject variation in adaptive capability following gravitational transitions. The way each individual's brain synthesizes the available visual, vestibular and somatosensory information is likely the basis for much of the variation. Identifying the presence of biases in each person's use of information available from these sensorimotor subsystems and relating it to their ability to adapt to a novel locomotor task will allow us to customize a training program designed to enhance sensorimotor adaptability. Eight tests are being used to measure sensorimotor subsystem performance. Three of these use measures of body sway to characterize balance during varying sensorimotor challenges. The effect of vision is assessed by repeating conditions with eyes open and eyes closed. Standing on foam, or on a support surface that pitches to maintain a constant ankle angle, provides somatosensory challenges. Information from the vestibular system is isolated when vision is removed and the support surface is compromised, and it is challenged when the tasks are done while the head is in motion. The integration and dominance of visual information is assessed in three additional tests. The Rod & Frame Test measures the degree to which a subject's perception of the visual vertical is affected by the orientation of a tilted frame in the periphery. Locomotor visual dependence is determined by assessing how much an oscillating virtual visual world affects a treadmill-walking subject. In the third of the visual manipulation tests, subjects walk an obstacle course while wearing up-down reversing prisms. The two remaining tests include direct

  7. Computer applications for the Fast Flux Test Facility

    International Nuclear Information System (INIS)

    Worth, G.A.; Patterson, J.R.

    1976-01-01

    Computer applications for the FFTF reactor include plant surveillance functions and fuel handling and examination control functions. Plant surveillance systems provide the reactor operator with a selection of over forty continuously updated, formatted displays of correlated data. All data are checked for limits and validity and the operator is advised of any anomaly. Data are also recorded on magnetic tape for historical purposes. The system also provides calculated variables, such as reactor thermal power and anomalous reactivity. Supplementing the basic plant surveillance computer system is a minicomputer system that monitors the reactor cover gas to detect and characterize absorber or fuel pin failures. In addition to plant surveillance functions, computers are used in the FFTF for controlling selected refueling equipment and for post-irradiation fuel pin examination. Four fuel handling or examination systems operate under computer control with manual monitoring and over-ride capability

  8. Use of computed tomography in nondestructive testing of polymeric materials

    International Nuclear Information System (INIS)

    Persson, S.; Oestman, E.

    1985-01-01

    Computed tomography has been used to detect imperfections and to measure cross-link density gradients in polymeric products, such as airplane tires, rubber shock absorbers, and filament-wound high-pressure tanks

  9. Computationally efficient implementation of sparse-tap FIR adaptive filters with tap-position control on Intel IA-32 processors

    OpenAIRE

    Hirano, Akihiro; Nakayama, Kenji

    2008-01-01

    This paper presents a computationally efficient implementation of sparse-tap FIR adaptive filters with tap-position control on Intel IA-32 processors with single-instruction multiple-data (SIMD) capability. In order to overcome random-order memory access, which prevents vectorization, a block-based processing and a re-ordering buffer are introduced. A dynamic register allocation and the use of memory-to-register operations help the maximization of the loop-unrolling level. Up to 66 percent speedup ...

  10. Refficientlib: an efficient load-rebalanced adaptive mesh refinement algorithm for high-performance computational physics meshes

    OpenAIRE

    Baiges Aznar, Joan; Bayona Roa, Camilo Andrés

    2017-01-01

    In this paper we present a novel algorithm for adaptive mesh refinement in computational physics meshes in a distributed memory parallel setting. The proposed method is developed for nodally based parallel domain partitions where the nodes of the mesh belong to a single processor, whereas the elements can belong to multiple processors. Some of the main features of the algorithm presented in this paper a...

  11. Research on the Random Shock Vibration Test Based on the Filter-X LMS Adaptive Inverse Control Algorithm

    Directory of Open Access Journals (Sweden)

    Wang Wei

    2016-01-01

    Full Text Available The theory and algorithms of adaptive inverse control are presented, and it is shown that an adaptive inverse control strategy can effectively eliminate the influence of noise on system control. A frequency-domain filter-X LMS adaptive inverse control algorithm is proposed and applied to the random shock vibration control process of a two-exciter hydraulic vibration test system, and the realization of the adaptive inverse control strategy in the random shock vibration test is summarized. The self-closed-loop and field tests show that the frequency-domain filter-X LMS adaptive inverse control algorithm can realize high-precision control of the random shock vibration test.
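    For orientation, the time-domain form of the filtered-X LMS update is sketched below; the paper works in the frequency domain, and the signal names, tap count, and step size here are illustrative assumptions:

```python
import numpy as np

def fxlms(x, d, s_hat, n_taps=32, mu=1e-3):
    """Time-domain filtered-X LMS.  x: reference signal, d: disturbance
    at the error sensor, s_hat: estimated secondary-path impulse
    response.  Returns the adapted taps and the error history."""
    w = np.zeros(n_taps)                     # adaptive controller taps
    y = np.zeros(len(x))                     # controller output history
    e = np.zeros(len(x))
    xf = np.convolve(x, s_hat)[:len(x)]      # reference filtered through s_hat
    start = max(n_taps, len(s_hat))
    for n in range(start, len(x)):
        y[n] = w @ x[n - n_taps + 1:n + 1][::-1]
        # the controller output reaches the error sensor through the secondary path
        y_sensor = s_hat @ y[n - len(s_hat) + 1:n + 1][::-1]
        e[n] = d[n] - y_sensor
        w += mu * e[n] * xf[n - n_taps + 1:n + 1][::-1]  # filtered-x update
    return w, e
```

    Filtering the reference through the secondary-path estimate before the update is what keeps the gradient aligned when the actuator-to-sensor path delays the control signal; the frequency-domain variant used in the paper trades this sample-by-sample loop for block FFT processing.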

  12. Adaption and Standardization of the Test of Visual-Motor Skills Revised

    Directory of Open Access Journals (Sweden)

    Mozhgan Farahbod

    2004-06-01

    Full Text Available Objective: This research was carried out with the aim of adapting and standardizing the Test of Visual-Motor Skills-Revised for children and establishing its validity and reliability. Materials & Methods: A multi-stage sampling of the children of the city of Tehran resulted in a sample of 1281 subjects, aged 2;11 through 13;11 (years;months). The test consisted of 23 geometric designs; each design was assessed against defined criteria and scored for errors (weaknesses) and accuracies (strengths). For the adaptation and standardization of this test, the examiner's manual and the test items were first translated into Farsi. The final form of the test was obtained after performing the pre-tryout and tryout stages and analyzing the data with the classical model of reliability. Internal consistency coefficients of the subtests were obtained with Cronbach's alpha; time consistency of the subtests and compound scores was obtained by test-retest. Alpha coefficients for the compound scores were obtained with the Guilford formula, which is designed for estimating compound scores. Appropriate methods were used to obtain the content validity, criterion-related validity and construct validity of the subtests and compound scores. Results: The results ensure the applicability of this test for the evaluation of the visual-motor skills of children in Tehran. Conclusion: According to the findings, this test can be used for disorders in eye-hand coordination and the identification of children with disorders in visual-motor skills. It can also be used to document the development of fine motor skills, especially visual-motor skills, in 3-14-year-old children.

  13. Computer-Based English Language Testing in China: Present and Future

    Science.gov (United States)

    Yu, Guoxing; Zhang, Jing

    2017-01-01

    In this special issue on high-stakes English language testing in China, the two articles on computer-based testing (Jin & Yan; He & Min) highlight a number of consistent, ongoing challenges and concerns in the development and implementation of the nationwide IB-CET (Internet Based College English Test) and institutional computer-adaptive…

  14. Automatic Delineation of On-Line Head-And-Neck Computed Tomography Images: Toward On-Line Adaptive Radiotherapy

    International Nuclear Information System (INIS)

    Zhang Tiezhi; Chi Yuwei; Meldolesi, Elisa; Yan Di

    2007-01-01

    Purpose: To develop and validate a fully automatic region-of-interest (ROI) delineation method for on-line adaptive radiotherapy. Methods and Materials: On-line adaptive radiotherapy requires a robust and automatic image segmentation method to delineate ROIs in on-line volumetric images. We have implemented an atlas-based image segmentation method to automatically delineate ROIs of head-and-neck helical computed tomography images. A total of 32 daily computed tomography images from 7 head-and-neck patients were delineated using this automatic image segmentation method. Manually drawn contours on the daily images were used as references in the evaluation of automatically delineated ROIs. Two methods were used in quantitative validation: (1) the Dice similarity coefficient index, which indicates the overlapping ratio between the manually and automatically delineated ROIs; and (2) the distance transformation, which yields the distances between the manually and automatically delineated ROI surfaces. Results: Automatic segmentation showed agreement with manual contouring. For most ROIs, the Dice similarity coefficient indexes were approximately 0.8. Similarly, the distance transformation evaluation results showed that the distances between the manually and automatically delineated ROI surfaces were mostly within 3 mm. The distances between the two surfaces had a mean of 1 mm and standard deviation of <2 mm in most ROIs. Conclusion: With atlas-based image segmentation, it is feasible to automatically delineate ROIs on head-and-neck helical computed tomography images in on-line adaptive treatments.
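    The Dice similarity coefficient used for validation is straightforward to compute from binary ROI masks; a minimal sketch:

```python
import numpy as np

def dice(mask_a, mask_b):
    """Dice similarity coefficient, 2|A n B| / (|A| + |B|),
    for two binary ROI masks of equal shape."""
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0
```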

  15. Development of an item bank for computerized adaptive test (CAT) measurement of pain

    DEFF Research Database (Denmark)

    Petersen, Morten Aa.; Aaronson, Neil K; Chie, Wei-Chu

    2016-01-01

    PURPOSE: Patient-reported outcomes should ideally be adapted to the individual patient while maintaining comparability of scores across patients. This is achievable using computerized adaptive testing (CAT). The aim here was to develop an item bank for CAT measurement of the pain domain as measured...... were obtained from 1103 cancer patients from five countries. Psychometric evaluations showed that 16 items could be retained in a unidimensional item bank. Evaluations indicated that use of the CAT measure may reduce sample size requirements by 15-25 % compared to using the QLQ-C30 pain scale....... CONCLUSIONS: We have established an item bank of 16 items suitable for CAT measurement of pain. While being backward compatible with the QLQ-C30, the new item bank will significantly improve measurement precision of pain. We recommend initiating CAT measurement by screening for pain using the two original QLQ...
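
    The CAT mechanics referred to here follow a generic pattern: administer the most informative remaining item at the current ability estimate, then re-estimate. Below is a minimal sketch under a two-parameter logistic (2PL) model with maximum-information selection and EAP scoring; the EORTC bank's actual IRT calibration, stopping rules and screening logic are not reproduced, and answer() is a placeholder for the respondent.

```python
import numpy as np

def p_2pl(theta, a, b):
    """2PL response probability P(endorse | theta)."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def item_information(theta, a, b):
    """Fisher information of a 2PL item at trait level theta."""
    p = p_2pl(theta, a, b)
    return a**2 * p * (1.0 - p)

def eap_estimate(responses, a, b):
    """Expected-a-posteriori trait estimate with a N(0, 1) prior."""
    grid = np.linspace(-4.0, 4.0, 81)
    post = np.exp(-0.5 * grid**2)               # prior, up to a constant
    for resp, ai, bi in zip(responses, a, b):
        p = p_2pl(grid, ai, bi)
        post *= p if resp else 1.0 - p          # multiply in the likelihood
    post /= post.sum()
    return float((grid * post).sum())

def run_cat(bank_a, bank_b, answer, n_items=5):
    """Adaptive loop: most informative unused item -> response -> re-estimate."""
    theta, used, responses = 0.0, [], []
    for _ in range(n_items):
        info = item_information(theta, bank_a, bank_b)
        info[used] = -np.inf                    # never readminister an item
        j = int(np.argmax(info))
        used.append(j)
        responses.append(answer(j))             # 0/1 from the respondent
        theta = eap_estimate(responses, bank_a[used], bank_b[used])
    return theta, used
```

    In this generic loop the item bank is just the two parameter vectors; a production bank adds exposure control, content balancing and stopping rules.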

  16. Supervisory Adaptive Network-Based Fuzzy Inference System (SANFIS) Design for Empirical Test of Mobile Robot

    Directory of Open Access Journals (Sweden)

    Yi-Jen Mon

    2012-10-01

    Full Text Available A supervisory Adaptive Network-based Fuzzy Inference System (SANFIS) is proposed for the empirical control of a mobile robot. This controller includes an ANFIS controller and a supervisory controller. The ANFIS controller is tuned off-line by an adaptive fuzzy inference system, while the supervisory controller is designed to compensate for the approximation error between the ANFIS controller and the ideal controller, and to drive the trajectory of the system onto a specified surface (called the sliding surface or switching surface) while keeping the trajectory on this switching surface, thereby guaranteeing system stability. The SANFIS controller achieves favourable empirical control performance in tests in which the mobile robot is driven along a square path. Practical experimental results demonstrate that the proposed SANFIS achieves better control performance than an ANFIS controller alone for empirical control of the mobile robot.

  17. Computer versus paper--does it make any difference in test performance?

    Science.gov (United States)

    Karay, Yassin; Schauber, Stefan K; Stosch, Christoph; Schüttpelz-Brauns, Katrin

    2015-01-01

    CONSTRUCT: In this study, we examine the differences in test performance between the paper-based and the computer-based version of the Berlin formative Progress Test. In this context it is the first study that allows controlling for students' prior performance. Computer-based tests make possible a more efficient examination procedure for test administration and review. Although university staff will benefit largely from computer-based tests, the question arises whether computer-based tests influence students' test performance. A total of 266 German students from the 9th and 10th semester of medicine (comparable with the 4th-year North American medical school schedule) participated in the study (paper = 132, computer = 134). The allocation of the test format was conducted as a randomized matched-pair design in which students were first sorted according to their prior test results. The organizational procedure, the examination conditions, the room and seating arrangements, as well as the order of questions and answers, were identical in both groups. The sociodemographic variables and pretest scores of both groups were comparable. The test results from the paper and computer versions did not differ. Both groups remained within the allotted time, but students using the computer version (particularly the high performers) needed significantly less time to complete the test. In addition, we found significant differences in guessing behavior: low performers using the computer version guessed significantly more than low-performing students using the paper-pencil version. Participants in computer-based tests are not at a disadvantage in terms of their test results. The computer-based test required less processing time. The longer processing time for the paper-pencil version might be due to the time needed to write the answer down and to check that it was transferred correctly. It is still not known why students using the computer version (particularly low

  18. The wire optical test: a thorough analytical study in and out of caustic surface, and advantages of a dynamical adaptation

    Science.gov (United States)

    Alejandro Juárez-Reyes, Salvador; Sosa-Sánchez, Citlalli Teresa; Silva-Ortigoza, Gilberto; de Jesús Cabrera-Rosas, Omar; Espíndola-Ramos, Ernesto; Ortega-Vidals, Paula

    2018-03-01

    Among the best known non-interferometric optical tests are the wire test, the Foucault test and the Ronchi test with a low-frequency grating. Since the wire test is the seed for understanding the other ones, the aim of the present work is to carry out a thorough study of this test for a lens with symmetry of revolution, for any configuration of the object and detection planes in which the two planes intersect two, one or no branches of the caustic region (including the marginal and paraxial foci). To this end, we calculated the vectorial representation of the caustic region and found the analytical expression for the pattern; we report that the analytical pattern explicitly depends on the magnitude of a branch of the caustic. With the analytical pattern we computed a set of simulations of a dynamical adaptation of the optical wire test. From those simulations, we carried out a thorough analysis of the topological structure of the pattern, explaining how the multiple image formation and image collapse processes take place for each configuration, in particular when both the wire and the detection planes are placed inside the caustic region, which has not been studied before. For the first time, we remark that not only the intersections of the object and detection planes with the caustic are important in the change of pattern topology, but also the projection of the intersection between the caustic and the object plane mapped onto the detection plane, and the virtual projection of the intersection between the caustic and the detection plane mapped onto the object plane. We show that, for the new configurations of the optical system, the wire image consists of curves of the Tschirnhausen cubic, piriform and deformed eight-curve types.

  19. Adapting the Computed Tomography Criteria of Hemorrhagic Transformation to Stroke Magnetic Resonance Imaging

    Directory of Open Access Journals (Sweden)

    Lars Neeb

    2013-08-01

    Full Text Available Background: The main safety aspect in the use of stroke thrombolysis and in clinical trials of new pharmaceutical or interventional stroke therapies is the incidence of hemorrhagic transformation (HT) after treatment. The computed tomography (CT)-based classification of the European Cooperative Acute Stroke Study (ECASS) distinguishes four categories of HTs. An HT can range from a harmless spot of blood accumulation to a symptomatic space-occupying parenchymal bleeding associated with a massive deterioration of symptoms and clinical prognosis. In magnetic resonance imaging (MRI), HTs are often categorized using the ECASS criteria although this classification has not been validated in MRI. We developed MRI-specific criteria for the categorization of HT and sought to assess their diagnostic reliability in a retrospective study. Methods: Consecutive acute ischemic stroke patients, who had received a 3-tesla MRI before and 12-36 h after thrombolysis, were screened retrospectively for an HT of any kind in post-treatment MRI. Intravenous tissue plasminogen activator was given to all patients within 4.5 h. HT categorization was based on a simultaneous read of 3 different MRI sequences (fluid-attenuated inversion recovery, diffusion-weighted imaging and T2* gradient-recalled echo). Categorization of HT in MRI accounted for the various aspects of the imaging pattern, such as the shape of the bleeding area and the signal intensity on each sequence. All data sets were independently categorized in a blinded fashion by 3 expert and 3 resident observers. Interobserver reliability of this classification was determined for all observers together and for each group separately by calculating Kendall's coefficient of concordance (W). Results: Of the 186 patients screened, 39 patients (21%) had an HT in post-treatment MRI and were included for the categorization of HT by experts and residents. The overall agreement of HT categorization according to the modified classification was
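
    Kendall's coefficient of concordance (W), the interobserver statistic used above, is simple to compute from a rank matrix. A minimal sketch, assuming complete rankings and omitting the tie correction that coarse four-category HT ratings would normally require:

```python
import numpy as np

def kendalls_w(ranks):
    """Kendall's coefficient of concordance for an (m raters x n cases)
    array of ranks, without the tie correction:
    W = 12 * S / (m^2 * (n^3 - n)), where S is the sum of squared
    deviations of the per-case rank sums from their mean."""
    ranks = np.asarray(ranks, dtype=float)
    m, n = ranks.shape
    col_sums = ranks.sum(axis=0)
    s = ((col_sums - col_sums.mean()) ** 2).sum()
    return 12.0 * s / (m**2 * (n**3 - n))
```

    For this study the matrix would be 6 observers by 39 patients, with each observer's category assignments converted to ranks.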

  20. Functional Task Test: 3. Skeletal Muscle Performance Adaptations to Space Flight

    Science.gov (United States)

    Ryder, Jeffrey W.; Wickwire, P. J.; Buxton, R. E.; Bloomberg, J. J.; Ploutz-Snyder, L.

    2011-01-01

    The functional task test is a multi-disciplinary study investigating how space-flight-induced changes to physiological systems impact functional task performance. Impairment of neuromuscular function would be expected to negatively affect functional performance of crewmembers following exposure to microgravity. This presentation reports the results of muscle performance testing in crewmembers. Functional task performance will be presented in the abstract "Functional Task Test 1: sensory motor adaptations associated with postflight alterations in astronaut functional task performance." METHODS: Muscle performance measures were obtained in crewmembers before and after short-duration space flight aboard the Space Shuttle and long-duration International Space Station (ISS) missions. The battery of muscle performance tests included leg press and bench press measures of isometric force, isotonic power and total work. Knee extension was used for the measurement of central activation and maximal isometric force. Upper and lower body force steadiness control were measured on the bench press and knee extension machine, respectively. Tests were implemented 60 and 30 days before launch, on landing day (Shuttle crew only), and 6, 10 and 30 days after landing. Seven Space Shuttle crew and four ISS crew have completed the muscle performance testing to date. RESULTS: Preliminary results for Space Shuttle crew reveal significant reductions in the leg press performance metrics of maximal isometric force, power and total work on R+0. Reductions in performance metrics were observed in returning Shuttle crew, and these adaptations are likely contributors to impaired functional tasks that are ambulatory in nature (see abstract Functional Task Test: 1). Interestingly, no significant changes in central activation capacity were detected. Therefore, impairments in muscle function in response to short-duration space flight are likely myocellular rather than neuromotor in nature.

  1. Agile Development of Various Computational Power Adaptive Web-Based Mobile-Learning Software Using Mobile Cloud Computing

    Science.gov (United States)

    Zadahmad, Manouchehr; Yousefzadehfard, Parisa

    2016-01-01

    Mobile Cloud Computing (MCC) aims to improve all mobile applications, such as m-learning systems. This study presents an innovative method that uses web technology and software engineering best practices to provide m-learning functionalities hosted in an MCC-learning system as a service. Components hosted by MCC are used to empower developers to create…

  2. Software abstractions and computational issues in parallel structure adaptive mesh methods for electronic structure calculations

    Energy Technology Data Exchange (ETDEWEB)

    Kohn, S.; Weare, J.; Ong, E.; Baden, S.

    1997-05-01

    We have applied structured adaptive mesh refinement techniques to the solution of the LDA equations for electronic structure calculations. Local spatial refinement concentrates memory resources and numerical effort where it is most needed, near the atomic centers and in regions of rapidly varying charge density. The structured grid representation enables us to employ efficient iterative solver techniques such as conjugate gradient with FAC multigrid preconditioning. We have parallelized our solver using an object-oriented adaptive mesh refinement framework.

  3. An adaptive multi-spline refinement algorithm in simulation based sailboat trajectory optimization using onboard multi-core computer systems

    Directory of Open Access Journals (Sweden)

    Dębski Roman

    2016-06-01

    Full Text Available A new dynamic-programming-based parallel algorithm adapted to on-board heterogeneous computers for simulation-based trajectory optimization is studied in the context of "high-performance sailing". The algorithm uses a new discrete space of continuously differentiable functions, called multi-splines, as its search space representation. A basic version of the algorithm is presented in detail (pseudo-code, time and space complexity, search space auto-adaptation properties). Possible extensions of the basic algorithm are also described. The presented experimental results show that contemporary heterogeneous on-board computers can be effectively used for solving simulation-based trajectory optimization problems. These computers can be considered micro high-performance computing (HPC) platforms: they offer high performance while remaining energy and cost efficient. The simulation-based approach can potentially give highly accurate results, since the mathematical model that the simulator is built upon may be as complex as required. The approach described is applicable to many trajectory optimization problems thanks to its black-box performance measure and use of OpenCL.

  4. A review of culturally adapted versions of the Oswestry Disability Index: the adaptation process, construct validity, test-retest reliability and internal consistency.

    Science.gov (United States)

    Sheahan, Peter J; Nelson-Wong, Erika J; Fischer, Steven L

    2015-01-01

    The Oswestry Disability Index (ODI) is a self-report-based outcome measure used to quantify the extent of disability related to low back pain (LBP), a substantial contributor to workplace absenteeism. The ODI tool has been adapted for use by patients in several non-English speaking nations. It is unclear, however, if these adapted versions of the ODI are as credible as the original ODI developed for English-speaking nations. The objective of this study was to conduct a review of the literature to identify culturally adapted versions of the ODI and to report on the adaptation process, construct validity, test-retest reliability and internal consistency of these ODIs. Following a pragmatic review process, data were extracted from each study with regard to these four outcomes. While most studies applied adaptation processes in accordance with best-practice guidelines, there were some deviations. However, all studies reported high-quality psychometric properties: group mean construct validity was 0.734 ± 0.094 (indicated via a correlation coefficient), test-retest reliability was 0.937 ± 0.032 (indicated via an intraclass correlation coefficient) and internal consistency was 0.876 ± 0.047 (indicated via Cronbach's alpha). Researchers can be confident when using any of these culturally adapted ODIs, or when comparing and contrasting results between cultures where these versions were employed. Implications for Rehabilitation: Low back pain is the second leading cause of disability in the world, behind only cancer. The Oswestry Disability Index (ODI) has been developed as a self-report outcome measure of low back pain for administration to patients. An understanding of the various cross-cultural adaptations of the ODI is important for more concerted multi-national research efforts. This review examines 16 cross-cultural adaptations of the ODI and should inform the work of health care and rehabilitation professionals.
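
    Of the psychometric quantities pooled in this review, Cronbach's alpha is the most direct to reproduce from raw data. A short illustrative sketch, not tied to any of the reviewed ODI data sets:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n respondents x k items) score matrix:
    alpha = k / (k - 1) * (1 - sum(item variances) / var(total score))."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()   # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)    # variance of the sum score
    return k / (k - 1) * (1.0 - item_var / total_var)
```

    For a 10-item ODI administration, items would be the n x 10 matrix of section scores; values around 0.88 would match the pooled internal consistency reported here.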

  5. Distributed storage and cloud computing: a test case

    International Nuclear Information System (INIS)

    Piano, S; Ricca, G Delia

    2014-01-01

    Since 2003 the computing farm hosted by the INFN Tier3 facility in Trieste has supported the activities of many scientific communities. Hundreds of jobs from 45 different VOs, including those of the LHC experiments, are processed simultaneously. Given that the requirements of the different computational communities are normally not synchronized, the probability that at any given time the resources owned by one of the participants are not fully utilized is quite high. A balanced compensation should in principle allocate the free resources to other users, but there are limits to this mechanism. In fact, the Trieste site may not hold the amount of data needed to attract enough analysis jobs, and even in that case there could be a lack of bandwidth for their access. The Trieste ALICE and CMS computing groups, in collaboration with other Italian groups, aim to overcome the limitations of existing solutions using two approaches: sharing the data among all the participants, taking full advantage of GARR-X wide area networks (10 GB/s), and integrating the resources dedicated to batch analysis with the ones reserved for dynamic interactive analysis, through modern solutions such as cloud computing.

  6. The quark gluon plasma: Lattice computations put to experimental test

    Indian Academy of Sciences (India)

    I describe how lattice computations are being used to extract experimentally relevant features of the quark gluon plasma. I deal specifically with relaxation times, photon emissivity, strangeness yields, event-by-event fluctuations of conserved quantities and hydrodynamic flow. Finally I give evidence that the plasma is rather ...

  7. Computerized Adaptive Test vs. decision trees: Development of a support decision system to identify suicidal behavior.

    Science.gov (United States)

    Delgado-Gomez, D; Baca-Garcia, E; Aguado, D; Courtet, P; Lopez-Castroman, J

    2016-12-01

    Several Computerized Adaptive Tests (CATs) have been proposed to facilitate assessments in mental health. These tests are built in a standard way, disregarding useful and usually available information not included in the assessment scales, such as the history of suicide attempts, that could increase the precision and utility of CATs. Using the items of a previously developed scale for suicidal risk, we compared the performance of a standard CAT and a decision tree in a support decision system to identify suicidal behavior. We included the history of past suicide attempts as a class for the separation of patients in the decision tree. The decision tree needed an average of four items to achieve an accuracy similar to that of a standard CAT with nine items. The accuracy of the decision tree, obtained after 25 cross-validations, was 81.4%. A shortened test adapted for the separation of suicidal and non-suicidal patients was developed. CATs can be very useful tools for the assessment of suicidal risk. However, standard CATs do not use all the information that is available. A decision tree can improve the precision of the assessment since it is constructed using a priori information. Copyright © 2016 Elsevier B.V. All rights reserved.
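
    The comparison described here can be prototyped with any off-the-shelf tree learner. A hedged sketch using scikit-learn, with entirely synthetic placeholder data, since the record does not publish the scale items or the tree settings; the 25-fold evaluation mirrors the 25 cross-validations mentioned:

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Placeholder data: X holds per-item scores of a suicidality scale,
# y the behaviour labels. Both are random stand-ins, not study data.
rng = np.random.default_rng(0)
X = rng.integers(0, 4, size=(500, 9))          # 9 items, ordinal scores 0-3
y = rng.integers(0, 2, size=500)               # 1 = suicidal behaviour class

tree = DecisionTreeClassifier(max_depth=4, random_state=0)
accuracy = cross_val_score(tree, X, y, cv=25).mean()   # 25 cross-validations
print(f"mean cross-validated accuracy: {accuracy:.3f}")
```

    A fitted tree asks only the items along one root-to-leaf path, which is how an average of about four items can suffice where a standard CAT needed nine.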

  8. An adaptive toolkit for image quality evaluation in system performance test of digital breast tomosynthesis

    Science.gov (United States)

    Zhang, Guozhi; Petrov, Dimitar; Marshall, Nicholas; Bosmans, Hilde

    2017-03-01

    Digital breast tomosynthesis (DBT) is a relatively new diagnostic imaging modality for women. Currently, various models of DBT systems are available on the market and the number of installations is rapidly increasing. EUREF, the European Reference Organization for Quality Assured Breast Screening and Diagnostic Services, has proposed a preliminary guideline protocol for the quality control of the physical and technical aspects of digital breast tomosynthesis systems, with the ultimate aim of providing limiting values that guarantee proper performance for different applications of DBT. In this work, we introduce an adaptive toolkit developed in accordance with this guideline to facilitate the process of image quality evaluation in DBT performance testing. This toolkit implements robust algorithms to quantify various technical parameters of DBT images and provides a convenient user interface in practice. Each test is built into a separate module with configurations set corresponding to the European guideline, which can be easily adapted to different settings and extended with additional tests. This toolkit greatly improves the efficiency of image quality evaluation for DBT. It will also evolve with the development of protocols for the quality control of DBT systems.

  9. Health Belief Model Scale for Human Papilloma Virus and its Vaccination: Adaptation and Psychometric Testing.

    Science.gov (United States)

    Guvenc, Gulten; Seven, Memnun; Akyuz, Aygul

    2016-06-01

    To adapt and psychometrically test the Health Belief Model Scale for Human Papilloma Virus (HPV) and Its Vaccination (HBMS-HPVV) for use in a Turkish population and to assess the Human Papilloma Virus Knowledge Score (HPV-KS) among female college students. Instrument adaptation and psychometric testing study. The sample consisted of 302 nursing students at a nursing school in Turkey between April and May 2013. Questionnaire-based data were collected from the participants. Information regarding the HBMS-HPVV, HPV knowledge and descriptive characteristics of participants was collected using the translated HBMS-HPVV and HPV-KS. Test-retest reliability was evaluated, Cronbach's α was used to assess internal consistency reliability, and exploratory factor analysis was used to assess construct validity of the HBMS-HPVV. The scale consists of 4 subscales that measure 4 constructs of the Health Belief Model, covering the perceived susceptibility and severity of HPV and the benefits and barriers. The final 14-item scale had satisfactory validity and internal consistency. Cronbach's α values for the 4 subscales ranged from 0.71 to 0.78. Total HPV-KS ranged from 0 to 8 (scale range, 0-10; 3.80 ± 2.12). The HBMS-HPVV is a valid and reliable instrument for measuring young Turkish women's beliefs and attitudes about HPV and its vaccination. Copyright © 2015 North American Society for Pediatric and Adolescent Gynecology. Published by Elsevier Inc. All rights reserved.

  10. Adaptive designs for the one-sample log-rank test.

    Science.gov (United States)

    Schmidt, Rene; Faldum, Andreas; Kwiecien, Robert

    2017-09-22

    Traditional designs in phase IIa cancer trials are single-arm designs with a binary outcome, for example, tumor response. In some settings, however, a time-to-event endpoint might appear more appropriate, particularly in the presence of loss to follow-up. Then the one-sample log-rank test might be the method of choice. It allows comparison of the survival curve of the patients under treatment with a prespecified reference survival curve. The reference curve usually represents the expected survival under standard of care. In this work, convergence of the one-sample log-rank statistic to Brownian motion is proven using Rebolledo's martingale central limit theorem while accounting for staggered entry times of the patients. On this basis, a confirmatory adaptive one-sample log-rank test is proposed where provision is made for data-dependent sample size reassessment. The focus is to apply the inverse normal method. This is done in two different directions. The first strategy exploits the independent increments property of the one-sample log-rank statistic. The second strategy is based on the patient-wise separation principle. It is shown by simulation that the proposed adaptive test might help to rescue an underpowered trial and at the same time lower the average sample number (ASN) under the null hypothesis as compared to a single-stage fixed sample design. © 2017, The International Biometric Society.
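
    The fixed-sample version of the test statistic is compact: with O observed events and E the accumulated reference hazard, Z = (O - E) / sqrt(E) is approximately standard normal under the null. A sketch of that statistic only; the paper's adaptive two-stage machinery and inverse normal combination are not reproduced, and the exponential reference curve is an assumption for the example:

```python
import numpy as np

def one_sample_logrank(time, event, cum_hazard_ref):
    """One-sample log-rank statistic Z = (O - E) / sqrt(E), where O is the
    number of observed events and E the sum of the reference cumulative
    hazard evaluated at each patient's follow-up time."""
    o = np.sum(event)
    e = np.sum(cum_hazard_ref(np.asarray(time)))
    return (o - e) / np.sqrt(e)

# Example: exponential reference curve with an assumed 12-month median.
lam = np.log(2.0) / 12.0
z = one_sample_logrank(
    time=[3.0, 8.0, 14.0, 20.0, 24.0],          # months of follow-up
    event=[1, 1, 0, 1, 0],                      # 1 = event, 0 = censored
    cum_hazard_ref=lambda t: lam * t,           # Lambda_0(t) for Exp(lam)
)
print(f"Z = {z:.3f}")                           # compare to N(0, 1) quantiles
```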

  11. A Flexible Computational Framework Using R and Map-Reduce for Permutation Tests of Massive Genetic Analysis of Complex Traits.

    Science.gov (United States)

    Mahjani, Behrang; Toor, Salman; Nettelblad, Carl; Holmgren, Sverker

    2017-01-01

    In quantitative trait locus (QTL) mapping, the significance of putative QTL is often determined using permutation testing. The computational needs to calculate the significance level are immense: 10^4 up to 10^8 or even more permutations can be needed. We have previously introduced the PruneDIRECT algorithm for multiple QTL scans with epistatic interactions. This algorithm has specific strengths for permutation testing. Here, we present a flexible, parallel computing framework for identifying multiple interacting QTL using the PruneDIRECT algorithm, which uses the map-reduce model as implemented in Hadoop. The framework is implemented in R, a widely used software tool among geneticists. This enables users to rearrange algorithmic steps to adapt genetic models, search algorithms, and parallelization steps to their needs in a flexible way. Our work underlines the maturity of accessing distributed parallel computing for computationally demanding bioinformatics applications through building workflows within existing scientific environments. We investigate the PruneDIRECT algorithm, comparing its performance to exhaustive search and the DIRECT algorithm, using our framework on a public cloud resource. We find that PruneDIRECT is vastly superior for permutation testing, performing 2×10^5 permutations for a 2D QTL problem in 15 hours using 100 cloud processes. We show that our framework scales out almost linearly for a 3D QTL search.
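
    In its simplest form, the map-reduce structure described here scatters permutations across workers ("map") and collects the genome-wide maxima into an empirical significance threshold ("reduce"). A toy sketch with a stand-in correlation scan, not PruneDIRECT, and Python multiprocessing rather than Hadoop:

```python
import numpy as np
from multiprocessing import Pool

def max_statistic(args):
    """Map step: genome-wide maximum of a toy scan statistic (squared
    marker-phenotype correlation) for one permuted phenotype."""
    phenotype, genotypes, seed = args
    perm = np.random.default_rng(seed).permutation(phenotype)
    r2 = [np.corrcoef(perm, g)[0, 1] ** 2 for g in genotypes]
    return max(r2)

def permutation_threshold(phenotype, genotypes, n_perm=1000, alpha=0.05):
    """Reduce step: the empirical (1 - alpha) quantile of the null maxima
    is the genome-wide significance threshold."""
    jobs = [(phenotype, genotypes, seed) for seed in range(n_perm)]
    with Pool() as pool:                        # needs the usual __main__
        maxima = pool.map(max_statistic, jobs)  # guard on spawn platforms
    return float(np.quantile(maxima, 1.0 - alpha))
```

    Because the permutations are independent, the map step parallelizes trivially, which is why the cloud deployment described above scales almost linearly.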

  12. Comparison of Rigid and Adaptive Methods of Propagating Gross Tumor Volume Through Respiratory Phases of Four-Dimensional Computed Tomography Image Data Set

    International Nuclear Information System (INIS)

    Ezhil, Muthuveni; Choi, Bum; Starkschall, George; Bucci, M. Kara; Vedam, Sastry; Balter, Peter

    2008-01-01

    Purpose: To compare three different methods of propagating the gross tumor volume (GTV) through the respiratory phases that constitute a four-dimensional computed tomography image data set. Methods and Materials: Four-dimensional computed tomography data sets of 20 patients who had undergone definitive hypofractionated radiotherapy to the lung were acquired. The GTV regions of interest (ROIs) were manually delineated on each phase of the four-dimensional computed tomography data set. The ROI from the end-expiration phase was propagated to the remaining nine phases of respiration using the following three techniques: (1) rigid image registration using in-house software; (2) rigid image registration using research software from a commercial radiotherapy planning system vendor; and (3) rigid image registration followed by deformable adaptation originally intended for organ-at-risk delineation, using the same software. The internal GTVs generated from the various propagation methods were compared with the manual internal GTV using the normalized Dice similarity coefficient (DSC) index. Results: The normalized DSC index of 1.01 ± 0.06 (SD) for rigid propagation using the in-house software program was identical to the normalized DSC index of 1.01 ± 0.06 for rigid propagation achieved with the vendor's research software. Adaptive propagation yielded poorer results, with a normalized DSC index of 0.89 ± 0.10 (paired t test, p < 0.001). Conclusion: Propagation of the GTV ROIs through the respiratory phases using rigid-body registration is an acceptable method within a 1-mm margin of uncertainty. The adaptive organ-at-risk propagation method was not applicable to propagating GTV ROIs, resulting in an unacceptable reduction of the volume and distortion of the ROIs.

  13. Design and testing of botanical thermotropic actuator mechanisms in thermally adaptive building coverings

    Science.gov (United States)

    Barrett, Ronald M.; Barrett, Ronald P.; Barrett, Cassandra M.

    2017-09-01

    This paper lays out the inspiration, operational principles, analytical modeling and coupon testing of a new class of thermally adaptive building coverings. The fundamental driving concepts for these coverings are derived from various families of thermotropic plant structures. Certain plant cellular structures, like those in Mimosa pudica (Sensitive Plant), Rhododendron leaves or Albizia julibrissin (Mimosa Tree), exhibit actuation physiology which depends on changes in cellular turgor pressures to generate motion. This form of cellular action via turgor pressure manipulation is an inspiration for a new field of thermally adaptive building coverings which use various forms of cellular foam to aid or enable actuation, much as plant cells are used to move leaves. When exposed to high solar loading, the structures use the inherent actuation capability of pockets of air trapped in closed-cell foam as actuators to curve plates upwards and outwards. When cold, these same structures curve back towards the building, forming large convex pockets of dead air to insulate the building. This paper describes basic classical laminated plate theory models, comparing theory and experiment for coupons containing closed-cell foam actuators. The study concludes with a global description of the effectiveness of this class of thermally adaptive building coverings.

  14. A Bayesian sequential design with adaptive randomization for 2-sided hypothesis test.

    Science.gov (United States)

    Yu, Qingzhao; Zhu, Lin; Zhu, Han

    2017-11-01

    Bayesian sequential and adaptive randomization designs are gaining popularity in clinical trials thanks to their potential to reduce the number of required participants and save resources. We propose a Bayesian sequential design with adaptive randomization rates so as to more efficiently assign newly recruited patients to different treatment arms. In this paper, we consider 2-arm clinical trials. Patients are allocated to the 2 arms with a randomization rate chosen to achieve minimum variance for the test statistic. Algorithms are presented to calculate the optimal randomization rate, critical values, and power for the proposed design. Sensitivity analysis is implemented to check the influence on the design of changing the prior distributions. Simulation studies are applied to compare the proposed method and traditional methods in terms of power and actual sample sizes. Simulations show that, when total sample size is fixed, the proposed design can obtain greater power and/or require a smaller actual sample size than the traditional Bayesian sequential design. Finally, we apply the proposed method to a real data set and compare the results with the Bayesian sequential design without adaptive randomization in terms of sample sizes. The proposed method can further reduce the required sample size. Copyright © 2017 John Wiley & Sons, Ltd.
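
    For a difference-in-means statistic, the minimum-variance randomization rate mentioned here reduces to the classical Neyman allocation, n1/n2 = sigma1/sigma2. A frequentist sketch of that allocation rule only; the paper's Bayesian posterior updating, critical values and power calculations are not reproduced:

```python
import numpy as np

def neyman_rate(sd1, sd2):
    """Allocation probability for arm 1 that minimises the variance
    sd1^2/n1 + sd2^2/n2 of the difference-in-means statistic:
    rho = sd1 / (sd1 + sd2)."""
    return sd1 / (sd1 + sd2)

def allocate_next(responses1, responses2, rng):
    """Adaptive randomization: re-estimate the arm SDs from the data
    accrued so far and draw the next patient's arm (1 or 2)."""
    sd1 = np.std(responses1, ddof=1) if len(responses1) > 1 else 1.0
    sd2 = np.std(responses2, ddof=1) if len(responses2) > 1 else 1.0
    return 1 if rng.random() < neyman_rate(sd1, sd2) else 2

rng = np.random.default_rng(1)
arm1, arm2 = [2.1, 1.7], [1.0, 3.4]             # made-up responses so far
print(allocate_next(arm1, arm2, rng))           # next patient's arm
```

    The more variable arm receives proportionally more patients, which is exactly what drives the variance of the test statistic down for a fixed total sample size.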

  15. Auditory adaptation testing as a tool for investigating tinnitus origin: two patients with vestibular schwannoma.

    Science.gov (United States)

    Silverman, Carol A; Silman, Shlomo; Emmer, Michele B

    2017-06-01

    To enhance the understanding of tinnitus origin by disseminating two case studies of vestibular schwannoma (VS) involving behavioural auditory adaptation testing (AAT). Retrospective case study. Two adults who presented with unilateral, non-pulsatile subjective tinnitus and bilateral normal-hearing sensitivity. At the initial evaluation, the otolaryngologic and audiologic findings were unremarkable, bilaterally. Upon retest, years later, VS was identified. At retest, the tinnitus disappeared in one patient and was slightly attenuated in the other patient. In the former, the results of AAT were positive for left retrocochlear pathology; in the latter, the results were negative for the left ear although a moderate degree of auditory adaptation was present despite bilateral normal-hearing sensitivity. Imaging revealed a small VS in both patients, confirmed surgically. Behavioural AAT in patients with tinnitus furnishes a useful tool for exploring tinnitus origin. Decrease or disappearance of tinnitus in patients with auditory adaptation suggests that the tinnitus generator is the cochlea or the cochlear nerve adjacent to the cochlea. Patients with unilateral tinnitus and bilateral, symmetric, normal-hearing thresholds, absent other audiovestibular symptoms, should be routinely monitored through otolaryngologic and audiologic re-evaluations. Tinnitus decrease or disappearance may constitute a red flag for retrocochlear pathology.

  16. Testing Local Adaptation in a Natural Great Tit-Malaria System: An Experimental Approach.

    Directory of Open Access Journals (Sweden)

    Tania Jenkins

    Full Text Available Finding out whether Plasmodium spp. are coevolving with their vertebrate hosts is of both theoretical and applied interest and can influence our understanding of the effects and dynamics of malaria infection. In this study, we tested for local adaptation as a signature of coevolution between malaria blood parasites, Plasmodium spp., and their host, the great tit, Parus major. We conducted a reciprocal transplant experiment of birds in the field, where we exposed birds from two populations to Plasmodium parasites. This experimental set-up also provided a unique opportunity to study the natural history of malaria infection in the wild and to assess the effects of primary malaria infection on juvenile birds. We present three main findings: (i) there was no support for local adaptation; (ii) there was a male-biased infection rate; and (iii) infection occurred towards the end of the summer and differed between sites. There were also site-specific effects of malaria infection on the hosts. Taken together, we present one of the few experimental studies of parasite-host local adaptation in a natural malaria system, and our results shed light on the effects of avian malaria infection in the wild.

  17. Development and adaptation of conduction and radiation heat-transfer computer codes for the CFTL

    International Nuclear Information System (INIS)

    Conklin, J.C.

    1981-08-01

    RODCON and HOTTEL are two computational methods used to calculate thermal and radiation heat transfer for the Core Flow Test Loop (CFTL) analysis efforts. RODCON was developed at ORNL to calculate the internal temperature distribution of the fuel rod simulator (FRS) for the CFTL. RODCON solves the time-dependent heat transfer equation in two-dimensional (R, θ) cylindrical coordinates at an axial plane with user-specified radial material zones and time- and position-variant surface conditions at the FRS periphery. Symmetry of the FRS periphery boundary conditions is not necessary. The governing elliptic, partial differential heat equation is cast into a fully implicit, finite-difference form by approximating the derivatives with a forward-differencing scheme with variable mesh spacing. The heat conduction path is circumferentially complete, and the potential mathematical problem at the rod center can be effectively ignored. HOTTEL is a revision of an algorithm developed by C.B. Baxi at the General Atomic Company (GAC) to be used in calculating radiation heat transfer in a rod bundle enclosed in a hexagonal duct. HOTTEL uses geometric view factors, surface emissivities, and surface areas to calculate the gray-body or composite view factors in an enclosure having multiple reflections in a nonparticipating medium.
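
    The flavor of RODCON's fully implicit scheme can be shown in one spatial dimension: backward-differencing the radial heat equation in time yields a linear system to solve at every step. A sketch of a 1D radial analogue of the 2D (R, θ) solver, with an assumed symmetry condition at the axis, a fixed surface temperature, and a dense solve for brevity:

```python
import numpy as np

def implicit_radial_step(T, r, dt, alpha):
    """One fully implicit step of dT/dt = alpha * (1/r) d/dr(r dT/dr) on a
    uniform grid with r[0] = 0. Builds (I - dt*A) T_new = T_old and solves.
    The axis row uses the symmetry limit of the Laplacian, 4*(T1 - T0)/dr^2;
    the outer surface is held at its current (Dirichlet) temperature."""
    n, dr = len(r), r[1] - r[0]
    A = np.zeros((n, n))
    for i in range(1, n - 1):
        rm, rp = 0.5 * (r[i - 1] + r[i]), 0.5 * (r[i] + r[i + 1])
        A[i, i - 1] = alpha * rm / (r[i] * dr**2)
        A[i, i + 1] = alpha * rp / (r[i] * dr**2)
        A[i, i] = -(A[i, i - 1] + A[i, i + 1])
    A[0, 0], A[0, 1] = -4.0 * alpha / dr**2, 4.0 * alpha / dr**2
    M = np.eye(n) - dt * A
    M[-1, :], M[-1, -1] = 0.0, 1.0              # Dirichlet boundary row
    return np.linalg.solve(M, T)

r = np.linspace(0.0, 0.01, 21)                  # assumed 1 cm rod radius
T = np.full(21, 300.0); T[-1] = 400.0           # assumed heated surface (K)
T = implicit_radial_step(T, r, dt=0.01, alpha=1e-5)
```

    Being fully implicit, the step is unconditionally stable, which is what lets such codes take large time steps across stiff radial material zones.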

  18. Adaptation Computing Parameters of Pan-Tilt-Zoom Cameras for Traffic Monitoring

    Directory of Open Access Journals (Sweden)

    Ya Lin WU

    2014-01-01

    Full Text Available Closed-circuit television (CCTV) cameras have been widely used in recent years for traffic monitoring and surveillance applications. CCTV cameras can be used to automatically extract real-time traffic parameters by means of image processing and tracking technologies. In particular, pan-tilt-zoom (PTZ) cameras can provide flexible view selection as well as a wider observation range, which allows traffic parameters to be calculated accurately. Calibrating the parameters of PTZ cameras therefore plays an important role in vision-based traffic applications. However, in the specific traffic situation in which the license plate number of an illegally parked car has to be located, the parameters of the PTZ camera have to be updated according to the position and distance of the illegally parked car. The proposed traffic monitoring system uses an ordinary webcam and a PTZ camera. The vanishing point of the traffic lane lines in the pixel-based coordinate system is obtained with the fixed webcam. The parameters of the PTZ camera can be initialized from the monitoring distance, the specific objectives and the vanishing point. The coordinate position of the illegally parked car is then used to update the parameters of the PTZ camera, obtain the real-world coordinate position of the car and compute its distance. The results show that the error between the measured distance and the real distance is only 0.2064 m.

  19. Computerized Adaptive Testing with R: Recent Updates of the Package catR

    Directory of Open Access Journals (Sweden)

    David Magis

    2017-01-01

    Full Text Available The purpose of this paper is to list the recent updates of the R package catR. This package allows for generating response patterns under a computerized adaptive testing (CAT) framework with underlying item response theory (IRT) models. Among the most important updates, well-known polytomous IRT models are now supported by catR; several item selection rules have been added; and it is now possible to perform post-hoc simulations. Some functions were also rewritten or withdrawn to improve the usefulness and performance of the package.

  20. Transitioning the GED[R] Mathematics Test to Computer with and without Accommodations: A Pilot Project

    Science.gov (United States)

    Patterson, Margaret Becker; Higgins, Jennifer; Bozman, Martha; Katz, Michael

    2011-01-01

    We conducted a pilot study to see how the GED Mathematics Test could be administered on computer with embedded accessibility tools. We examined test scores and test-taker experience. Nineteen GED test centers across five states and 216 randomly assigned GED test candidates participated in the project. GED candidates completed two GED mathematics…

  1. Development and testing of methods for adaptive image processing in odontology and medicine

    Energy Technology Data Exchange (ETDEWEB)

    Sund, Torbjoern

    2005-07-01

    Medical diagnostic imaging has undergone radical changes during the last ten years. In the early 1990s, the medical imaging department was almost exclusively film-based. Today, all major hospitals have converted to digital acquisition and handling of their diagnostic imaging, or are in the process of conversion. It is therefore important to investigate whether diagnostic reading of digitally acquired images on computer display screens can match or even surpass film recording and viewing. At the same time, the digitalisation opens new possibilities for image processing, which may challenge the traditional way of studying medical images. The current work explores some of the possibilities of digital processing techniques, and evaluates the results both by quantitative methods (ROC analysis) and by subjective qualification by real users. Summary of papers: Paper I: Locally adaptive image binarization with a sliding window threshold was used for the detection of bone ridges in radiotherapy portal images. A new thresholding criterion suitable for incremental update within the sliding window was developed, and it was shown that the algorithm gave better results on difficult portal images than various publicly available adaptive thresholding routines. For small windows the routine was also faster than an adaptive implementation of the Otsu algorithm that uses interpolation between fixed tiles, and the resulting images had equal quality. Paper II: It was investigated whether contrast enhancement by non-interactive, sliding window adaptive histogram equalization could enhance the diagnostic quality of intra-oral radiographs in the dental clinic. Three dentists read 22 periapical and 12 bitewing storage phosphor (SP) radiographs. For the periapical readings they graded the quality of the examination with regard to visually locating the root apex. For the bitewing readings they registered all occurrences of approximal caries on a confidence scale. Each reading was
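
    The simplest member of the sliding-window binarization family used in Paper I is a local-mean threshold computed with a box filter; the paper's own incremental-update criterion is more elaborate. A sketch:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def adaptive_binarize(img, window=31, offset=0.0):
    """Locally adaptive binarization: each pixel is thresholded against
    the mean of its (window x window) neighbourhood, evaluated by a
    separable box filter rather than an explicit sliding loop."""
    local_mean = uniform_filter(img.astype(float), size=window)
    return img > local_mean + offset
```

    Replacing the local mean with a per-window Otsu threshold, or with the paper's criterion, changes only the threshold rule; the sliding-window structure stays the same.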

  2. Development and testing of methods for adaptive image processing in odontology and medicine

    International Nuclear Information System (INIS)

    Sund, Torbjoern

    2005-01-01

    Medical diagnostic imaging has undergone radical changes during the last ten years. In the early 1990s, the medical imaging department was almost exclusively film-based. Today, all major hospitals have converted to digital acquisition and handling of their diagnostic imaging, or are in the process of conversion. It is therefore important to investigate whether diagnostic reading of digitally acquired images on computer display screens can match or even surpass film recording and viewing. At the same time, the digitalisation opens new possibilities for image processing, which may challenge the traditional way of studying medical images. The current work explores some of the possibilities of digital processing techniques, and evaluates the results both by quantitative methods (ROC analysis) and by subjective qualification by real users. Summary of papers: Paper I: Locally adaptive image binarization with a sliding window threshold was used for the detection of bone ridges in radiotherapy portal images. A new thresholding criterion suitable for incremental update within the sliding window was developed, and it was shown that the algorithm gave better results on difficult portal images than various publicly available adaptive thresholding routines. For small windows the routine was also faster than an adaptive implementation of the Otsu algorithm that uses interpolation between fixed tiles, and the resulting images had equal quality. Paper II: It was investigated whether contrast enhancement by non-interactive, sliding window adaptive histogram equalization could enhance the diagnostic quality of intra-oral radiographs in the dental clinic. Three dentists read 22 periapical and 12 bitewing storage phosphor (SP) radiographs. For the periapical readings they graded the quality of the examination with regard to visually locating the root apex. For the bitewing readings they registered all occurrences of approximal caries on a confidence scale. Each reading was first

  3. An efficient Adaptive Mesh Refinement (AMR) algorithm for the Discontinuous Galerkin method: Applications for the computation of compressible two-phase flows

    Science.gov (United States)

    Papoutsakis, Andreas; Sazhin, Sergei S.; Begg, Steven; Danaila, Ionut; Luddens, Francky

    2018-06-01

    We present an Adaptive Mesh Refinement (AMR) method suitable for hybrid unstructured meshes that allows for local refinement and de-refinement of the computational grid during the evolution of the flow. The adaptive implementation of the Discontinuous Galerkin (DG) method introduced in this work (ForestDG) is based on a topological representation of the computational mesh by a hierarchical structure consisting of oct-, quad- and binary trees. Adaptive mesh refinement (h-refinement) enables us to increase the spatial resolution of the computational mesh in the vicinity of the points of interest, such as interfaces, geometrical features, or flow discontinuities. The local increase in the expansion order (p-refinement) at areas of high strain rates or vorticity magnitude results in an increase of the order of accuracy in the region of shear layers and vortices. A graph of unitarian trees, representing hexahedral, prismatic and tetrahedral elements, is used for the representation of the initial domain. The ancestral elements of the mesh can be split into self-similar elements, allowing each tree to grow branches to an arbitrary level of refinement. The connectivity of the elements, their genealogy and their partitioning are described by linked lists of pointers. An explicit calculation of these relations, presented in this paper, facilitates the on-the-fly splitting, merging and repartitioning of the computational mesh by rearranging the links of each node of the tree with minimal computational overhead. The modal basis used in the DG implementation facilitates the mapping of the fluxes across the non-conformal faces. The AMR methodology is presented and assessed using a series of inviscid and viscous test cases. Also, the AMR methodology is used for the modelling of the interaction between droplets and the carrier phase in a two-phase flow. This approach is applied to the analysis of a spray injected into a chamber of quiescent air, using the Eulerian
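
    The tree bookkeeping described above (self-similar splitting, with the leaves acting as the active cells) is compact to sketch. A minimal 2D quadtree analogue, illustrative only: the DG discretization, flux mapping and repartitioning are not modelled.

```python
class QuadNode:
    """Minimal 2D analogue of the oct-/quad-/binary-tree cells: each cell
    can split into four self-similar children and merge them back."""
    def __init__(self, x, y, size, level=0):
        self.x, self.y, self.size, self.level = x, y, size, level
        self.children = []

    def refine(self):
        """h-refinement: split this cell into four half-size children."""
        if self.children:
            return
        h = self.size / 2
        self.children = [QuadNode(self.x + i * h, self.y + j * h, h,
                                  self.level + 1)
                         for i in (0, 1) for j in (0, 1)]

    def coarsen(self):
        """De-refinement: merge the children back into this cell."""
        self.children = []

    def leaves(self):
        """The active computational cells are the leaves of the tree."""
        if not self.children:
            yield self
        else:
            for c in self.children:
                yield from c.leaves()

def adapt(root, indicator, max_level=4):
    """Refine every leaf that the error/feature indicator flags."""
    for cell in list(root.leaves()):
        if cell.level < max_level and indicator(cell):
            cell.refine()
```

    Here the indicator callback plays the role of the interface or discontinuity sensor; repeated calls to adapt grow the tree only where the flow demands resolution.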

  4. Assessment of Postflight Locomotor Performance Utilizing a Test of Functional Mobility: Strategic and Adaptive Responses

    Science.gov (United States)

    Warren, L. E.; Mulavara, A. P.; Peters, B. T.; Cohen, H. S.; Richards, J. T.; Miller, C. A.; Brady, R.; Ruttley, T. M.; Bloomberg, J. J.

    2006-01-01

    Space flight induces adaptive modification in sensorimotor function, allowing crewmembers to operate in the unique microgravity environment. This adaptive state, however, is inappropriate for a terrestrial environment. During a re-adaptation period upon their return to Earth, crewmembers experience alterations in sensorimotor function, causing various disturbances in perception, spatial orientation, posture, gait, and eye-head coordination. Following long-duration space flight, sensorimotor dysfunction would prevent or extend the time required to make an emergency egress from the vehicle, compromising crew safety and mission objectives. We are investigating two types of motor learning that may interact with each other and influence a crewmember's ability to re-adapt to Earth's gravity environment. In strategic learning, crewmembers make rapid modifications in their motor control strategy emphasizing error reduction. This type of learning may be critical during the first minutes and hours after landing. In adaptive learning, long-term plastic transformations occur, involving morphological changes and synaptic modification. In recent literature these two behavioral components have been associated with separate brain structures that control the execution of motor strategies: the strategic component was linked to the posterior parietal cortex and the adaptive component was linked to the cerebellum (Pisella et al., 2004). The goal of this paper was to demonstrate the relative contributions of the strategic and adaptive components to the re-adaptation process in locomotor control after long-duration space flight missions on the International Space Station (ISS). The Functional Mobility Test (FMT) was developed to assess crewmembers' ability to ambulate postflight from an operational and functional perspective. Sixteen crewmembers were tested preflight (3 sessions) and postflight (days 1, 2, 4, 7, 25) following a long-duration space flight (approx. 6 months) on the ISS. We

  5. Adapting a receptive vocabulary test for preschool-aged Greek-speaking children.

    Science.gov (United States)

    Okalidou, Areti; Syrika, Asimina; Beckman, Mary E; Edwards, Jan R

    2011-01-01

    Receptive vocabulary is an important measure for language evaluations. Therefore, norm-referenced receptive vocabulary tests are widely used in several languages. However, a receptive vocabulary test has not yet been normed for Modern Greek. To adapt an American English vocabulary test, the Receptive One-Word Picture Vocabulary Test-II (ROWPVT-II), for Modern Greek for use with Greek-speaking preschool children. The list of 170 English words on ROWPVT-II was adapted by (1) developing two lists (A and B) of Greek words that would match either the target English word or another concept corresponding to one of the pictured objects in the four-picture array; and (2) determining a developmental order for the chosen Greek words for preschool-aged children. For the first task, adult word frequency measures were used to select the words for the Greek wordlist. For the second task, 427 children, 225 boys and 202 girls, ranging in age from 2;0 years to 5;11 years, were recruited from urban and suburban areas of Greece. A pilot study of the two word lists was performed with the aim of comparing an equal number of list A and list B responses for each age group and deriving a new developmental list order. The relative difficulty of each Greek word item, that is, its accuracy score, was calculated by taking the average proportion of correct responses across ages for that word. Subsequently, the word accuracy scores in the two lists were compared via regression analysis, which yielded a highly significant relationship (R(2) = 0.97; p word item from the two lists was a better fit. Finally, new starting levels (basals) were established for preschool ages. The revised word list can serve as the basis for adapting a receptive vocabulary test for Greek preschool-aged children. Further steps need to be taken when testing larger numbers of 2;0 to 5;11-year-old children on the revised word list for determination of norms. This effort will facilitate early identification and remediation

  6. Network architecture test-beds as platforms for ubiquitous computing.

    Science.gov (United States)

    Roscoe, Timothy

    2008-10-28

    Distributed systems research, and in particular ubiquitous computing, has traditionally assumed the Internet as a basic underlying communications substrate. Recently, however, the networking research community has come to question the fundamental design or 'architecture' of the Internet. This has been led by two observations: first, that the Internet as it stands is now almost impossible to evolve to support new functionality; and second, that modern applications of all kinds now use the Internet rather differently, and frequently implement their own 'overlay' networks above it to work around its perceived deficiencies. In this paper, I discuss recent academic projects to allow disruptive change to the Internet architecture, and also outline a radically different view of networking for ubiquitous computing that such proposals might facilitate.

  7. Computer-aided dispatch--traffic management center field operational test final detailed test plan : WSDOT deployment

    Science.gov (United States)

    2003-10-01

    The purpose of this document is to expand upon the evaluation components presented in "Computer-aided dispatch--traffic management center field operational test final evaluation plan : WSDOT deployment". This document defines the objective, approach,...

  8. Computer-aided dispatch--traffic management center field operational test final test plans : state of Utah

    Science.gov (United States)

    2004-01-01

    The purpose of this document is to expand upon the evaluation components presented in "Computer-aided dispatch--traffic management center field operational test final evaluation plan : state of Utah". This document defines the objective, approach, an...

  9. Computer simulation and cold model testing of CCL cavities

    International Nuclear Information System (INIS)

    Chang, C.R.; Yao, C.G.; Swenson, D.A.; Funk, L.W.

    1993-01-01

    The SSC coupled-cavity linac (CCL) consists of nine modules with eight tanks in each module. Multicavity magnetically coupled bridge couplers are used to couple the eight tanks within a module into one RF resonant chain. The operating frequency is 1282.851 MHz. In this paper the authors discuss both computer calculations and cold model measurements used to determine the geometric dimensions of the RF structure.

  10. CR-Calculus and adaptive array theory applied to MIMO random vibration control tests

    Science.gov (United States)

    Musella, U.; Manzato, S.; Peeters, B.; Guillaume, P.

    2016-09-01

    Performing Multiple-Input Multiple-Output (MIMO) tests to reproduce the vibration environment in a user-defined number of control points of a unit under test is necessary in applications where a realistic environment replication has to be achieved. MIMO tests require vibration control strategies to calculate the required drive signal vector that gives an acceptable replication of the target. This target is a (complex) vector with magnitude and phase information at the control points for MIMO Sine Control tests, while in MIMO Random Control tests, in the most general case, the target is a complete spectral density matrix. The idea behind this work is to tailor a MIMO random vibration control approach that can be generalized to other MIMO tests, e.g. MIMO Sine and MIMO Time Waveform Replication. In this work the approach is to use gradient-based procedures over the complex space, applying the so-called CR-calculus and adaptive array theory. With this approach it is possible to better control the process performance, allowing a step-by-step update of the Jacobian matrix. The theoretical basis of the work is followed by an application of the developed method to a two-exciter two-axis system and by performance comparisons with standard methods.
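
    Per frequency line, the underlying control problem is to drive H d toward a target response t; by CR (Wirtinger) calculus, the gradient of ||t - H d||^2 with respect to conj(d) is -H^H (t - H d), which gives the steepest-descent update sketched below. This shows that single step only; in the random-control case the target is a full spectral density matrix, which this simplification sets aside.

```python
import numpy as np

def update_drives(H, target, d, mu=0.5):
    """One steepest-descent step, at a single frequency line, on the cost
    J(d) = ||t - H d||^2. By CR (Wirtinger) calculus,
    dJ/d(conj(d)) = -H^H (t - H d), so d <- d + mu * H^H (t - H d).
    H: (n_control x n_exciters) complex FRF matrix; target: desired
    complex responses at the control points; d: current drive vector."""
    error = target - H @ d
    return d + mu * (H.conj().T @ error)
```

    Iterating this update per line, with mu acting as the control weight, is the scalar analogue of the matrix update needed for full spectral density targets.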

  11. Training Senior Teachers in Compulsory Computer Based Language Tests

    Science.gov (United States)

    Laborda, Jesus Garcia; Royo, Teresa Magal

    2009-01-01

    The IBT TOEFL has become the principal example of online high-stakes language testing since 2005. Most instructors who prepare students for the IBT TOEFL face two main realities: first, students are eager and highly motivated to take the test because of its prospective implications; and, second, specific studies would be necessary to see if…

  12. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  13. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of the Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in the computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies, ramping to 50M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS-specific tests to be included in Service Availa...

  14. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity for discussing the impact of, and addressing issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  15. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October, a model to account for service work at Tier-2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran smoothly. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers at the end of November; it will take about two weeks. The Computing Shifts procedure was tested at full scale during this period and proved to be very efficient: 30 Computing Shift Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  16. Computer-assisted organization of the weld seam test

    International Nuclear Information System (INIS)

    Lorenz, H.; Richter, I.; Hansen, W.

    1986-01-01

    The article describes the task set and the solution found for an EDP program to be used to assist in non-destructive testing. It covers the activities of test planning, disposition, calculation, test instructions, documentation and quality statistics. The development and implementation phases have essentially been concluded. The program is not expected to reduce the actual testing work; its advantages result rather from complete planning and execution, early detection of deviations, easier documentation, and fast, easily surveyed information. Once the program has been fully integrated into the flow schedule of order handling, 'additional work' is expected to be reduced to about 15 percent of the total work invested. By shifting the invested work from the test and documentation phase to the planning phase, the program system meets the principle of modern quality assurance, namely to intensify measures for preventing errors. (orig.) [de

  17. Adaptive computations of flow around a delta wing with vortex breakdown

    Science.gov (United States)

    Modiano, David L.; Murman, Earll M.

    1993-01-01

    An adaptive unstructured mesh solution method for the three-dimensional Euler equations was used to simulate the flow around a sharp edged delta wing. Emphasis was on the breakdown of the leading edge vortex at high angle of attack. Large values of entropy, which indicate vortical regions of the flow, specified the region in which adaptation was performed. The aerodynamic normal force coefficients show excellent agreement with wind tunnel data measured by Jarrah, and demonstrate the importance of adaptation in obtaining an accurate solution. The pitching moment coefficient and the location of vortex breakdown are compared with experimental data measured by Hummel and Srinivasan, showing good agreement in cases in which vortex breakdown is located over the wing.
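
    The entropy-based refinement criterion lends itself to a compact illustration. The sketch below is a hypothetical simplification, not the authors' solver: it flags cells whose normalized entropy rise exceeds a threshold and marks them for mesh adaptation; the threshold value and the toy cell data are assumptions.

        import numpy as np

        def entropy_rise(p, rho, p_inf=1.0, rho_inf=1.0, gamma=1.4):
            # s = p / rho**gamma normalized by its freestream value; ratios
            # above 1 indicate entropy generated in vortical or shocked regions
            return (p / rho**gamma) / (p_inf / rho_inf**gamma)

        def flag_cells_for_refinement(p, rho, threshold=1.02):
            # indices of cells whose entropy rise exceeds the threshold
            return np.where(entropy_rise(p, rho) > threshold)[0]

        p = np.ones(10)
        p[[3, 4]] = [1.10, 1.15]      # two cells inside a vortex core (toy data)
        rho = np.ones(10)
        print(flag_cells_for_refinement(p, rho))   # -> [3 4]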

  18. Rational adaptation under task and processing constraints: implications for testing theories of cognition and action.

    Science.gov (United States)

    Howes, Andrew; Lewis, Richard L; Vera, Alonso

    2009-10-01

    The authors assume that individuals adapt rationally to a utility function given constraints imposed by their cognitive architecture and the local task environment. This assumption underlies a new approach to modeling and understanding cognition-cognitively bounded rational analysis-that sharpens the predictive acuity of general, integrated theories of cognition and action. Such theories provide the necessary computational means to explain the flexible nature of human behavior but in doing so introduce extreme degrees of freedom in accounting for data. The new approach narrows the space of predicted behaviors through analysis of the payoff achieved by alternative strategies, rather than through fitting strategies and theoretical parameters to data. It extends and complements established approaches, including computational cognitive architectures, rational analysis, optimal motor control, bounded rationality, and signal detection theory. The authors illustrate the approach with a reanalysis of an existing account of psychological refractory period (PRP) dual-task performance and the development and analysis of a new theory of ordered dual-task responses. These analyses yield several novel results, including a new understanding of the role of strategic variation in existing accounts of PRP and the first predictive, quantitative account showing how the details of ordered dual-task phenomena emerge from the rational control of a cognitive system subject to the combined constraints of internal variance, motor interference, and a response selection bottleneck.
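
    The payoff-based strategy analysis can be made concrete with a toy example. The sketch below illustrates the general idea only (it is not the authors' PRP model): three hypothetical dual-task strategies are scored by expected payoff, trading speed against error cost, and the rational choice is the one maximizing that payoff. All names and numbers are invented.

        # invented strategies: (name, mean response time in ms, error probability)
        strategies = [
            ("cautious-serial",    900, 0.02),
            ("moderate-overlap",   700, 0.05),
            ("aggressive-overlap", 550, 0.15),
        ]

        def expected_payoff(rt_ms, p_err, reward=10.0, error_cost=25.0, time_cost=0.01):
            # utility = reward for a correct response, minus error and time costs
            return (1 - p_err) * reward - p_err * error_cost - time_cost * rt_ms

        best = max(strategies, key=lambda s: expected_payoff(s[1], s[2]))
        print(best[0])   # -> moderate-overlap: neither extreme maximizes payoff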

  19. Applications of NLP Techniques to Computer-Assisted Authoring of Test Items for Elementary Chinese

    Science.gov (United States)

    Liu, Chao-Lin; Lin, Jen-Hsiang; Wang, Yu-Chun

    2010-01-01

    The authors report an implemented environment for computer-assisted authoring of test items and provide a brief discussion of the applications of NLP techniques for computer-assisted language learning. Test items can serve as a tool for language learners to examine their competence in the target language. The authors apply techniques for…

  20. Systematic Testing should not be a Topic in the Computer Science Curriculum!

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak

    2003-01-01

    In this paper we argue that treating "testing" as an isolated topic is a wrong approach in computer science and software engineering teaching. Instead testing should pervade practical topics and exercises in the computer science curriculum to teach students the importance of producing software...

  1. Testing for adaptive evolution of the female reproductive protein ZPC in mammals, birds and fishes reveals problems with the M7-M8 likelihood ratio test.

    Science.gov (United States)

    Berlin, Sofia; Smith, Nick G C

    2005-11-10

    Adaptive evolution appears to be a common feature of reproductive proteins across a very wide range of organisms. A promising way of addressing the evolutionary forces responsible for this general phenomenon is to test for adaptive evolution in the same gene but among groups of species, which differ in their reproductive biology. One can then test evolutionary hypotheses by asking whether the variation in adaptive evolution is consistent with the variation in reproductive biology. We have attempted to apply this approach to the study of a female reproductive protein, zona pellucida C (ZPC), which has been previously shown by the use of likelihood ratio tests (LRTs) to be under positive selection in mammals. We tested for evidence of adaptive evolution of ZPC in 15 mammalian species, in 11 avian species and in six fish species using three different LRTs (M1a-M2a, M7-M8, and M8a-M8). The only significant findings of adaptive evolution came from the M7-M8 test in mammals and fishes. Since LRTs of adaptive evolution may yield false positives in some situations, we examined the properties of the LRTs by several different simulation methods. When we simulated data to test the robustness of the LRTs, we found that the pattern of evolution in ZPC generates an excess of false positives for the M7-M8 LRT but not for the M1a-M2a or M8a-M8 LRTs. This bias is strong enough to have generated the significant M7-M8 results for mammals and fishes. We conclude that there is no strong evidence for adaptive evolution of ZPC in any of the vertebrate groups we studied, and that the M7-M8 LRT can be biased towards false inference of adaptive evolution by certain patterns of non-adaptive evolution.
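
    For orientation, each LRT compares a null codon model without positive selection (e.g. M7) against an alternative that allows it (e.g. M8); twice the log-likelihood difference is referred to a chi-square distribution. A minimal sketch with invented log-likelihoods, not values from this study:

        from scipy.stats import chi2

        def lrt_pvalue(lnL_null, lnL_alt, df=2):
            # 2*(lnL_alt - lnL_null) is compared to chi2(df) under the null;
            # df=2 is the conventional choice for the M7 vs M8 comparison
            stat = 2.0 * (lnL_alt - lnL_null)
            return stat, chi2.sf(stat, df)

        # invented log-likelihoods from two nested codon-model fits
        stat, p = lrt_pvalue(lnL_null=-2451.3, lnL_alt=-2446.8)
        print(f"2*dlnL = {stat:.2f}, p = {p:.4f}")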

  2. Adaptive Management Plan for Sensitive Plant Species on the Nevada Test Site

    International Nuclear Information System (INIS)

    Wills, C. A.

    2001-01-01

    The Nevada Test Site supports numerous plant species considered sensitive because of their past or present status under the Endangered Species Act and with federal and state agencies. In 1998, the U.S. Department of Energy, Nevada Operations Office (DOE/NV) prepared a Resource Management Plan which commits to protecting and conserving these sensitive plant species and to minimizing cumulative impacts to them. This document presents the procedures of a long-term adaptive management plan which is meant to ensure that these goals are met. It identifies the parameters that are measured for all sensitive plant populations during long-term monitoring and the adaptive management actions which may be taken if significant threats to these populations are detected. This plan does not, however, identify the current list of sensitive plant species known to occur on the Nevada Test Site. The current species list and progress on their monitoring is reported annually by DOE/NV in the Resource Management Plan

  3. The analog computation and contrast test of leaked electromagnetic noise in the klystron corridor

    International Nuclear Information System (INIS)

    Tao Xiaoping; Wang Guicheng

    2001-01-01

    In order to obtain a better understanding of the characteristics and location of the noise source, the electromagnetic noise leaked into the klystron corridor of NSRL has been calculated by analog simulation. The computational method and formula for the high-frequency leaked noise of the modulator are given. On-the-spot contrast tests have been made on the basis of the analog computation. The contrast test results show the reasonableness of the analog computation and thereby offer a theoretical basis for reducing noise leakage in the corridor

  4. Semi-supervised adaptation in ssvep-based brain-computer interface using tri-training

    DEFF Research Database (Denmark)

    Bender, Thomas; Kjaer, Troels W.; Thomsen, Carsten E.

    2013-01-01

    This paper presents a novel and computationally simple tri-training-based semi-supervised steady-state visual evoked potential (SSVEP) brain-computer interface (BCI). It is implemented with autocorrelation-based features and a Naïve-Bayes classifier (NBC). The system uses nine characters...
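
    The feature/classifier pairing named above can be sketched generically. The snippet below pairs autocorrelation features with a Gaussian Naïve-Bayes classifier on synthetic sinusoid-plus-noise epochs; the actual system's lags, stimulation frequencies, and tri-training loop are omitted, and all parameters here are assumptions.

        import numpy as np
        from sklearn.naive_bayes import GaussianNB

        def autocorr_features(x, lags=(8, 10, 12)):
            # normalized autocorrelation at a few fixed lags; a periodic
            # SSVEP-like response is high at lags matching its period
            x = x - x.mean()
            denom = np.dot(x, x)
            return np.array([np.dot(x[:-l], x[l:]) / denom for l in lags])

        rng = np.random.default_rng(0)
        fs, n = 100, 400                     # 100 Hz sampling, 4 s epochs
        t = np.arange(n) / fs

        def epoch(freq):
            # synthetic SSVEP-like epoch: sinusoid plus noise
            return np.sin(2 * np.pi * freq * t) + 0.5 * rng.standard_normal(n)

        X = np.array([autocorr_features(epoch(f)) for f in [10] * 20 + [12.5] * 20])
        y = np.array([0] * 20 + [1] * 20)
        clf = GaussianNB().fit(X[::2], y[::2])   # train on half the epochs
        print(clf.score(X[1::2], y[1::2]))       # test on the other half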

  5. Monitoring self-adaptive applications within edge computing frameworks: A state-of-the-art review

    NARCIS (Netherlands)

    Taherizadeh, S.; Jones, A.C.; Taylor, I.; Zhao, Z.; Stankovski, V.

    Recently, a promising trend has emerged away from centralized computation towards decentralized edge computing in the proximity of end-users as a way to provide cloud applications. To ensure the Quality of Service (QoS) of such applications and the Quality of Experience (QoE) for the end-users, it is necessary to

  6. Exploratory behaviour in the open field test adapted for larval zebrafish: impact of environmental complexity.

    Science.gov (United States)

    Ahmad, Farooq; Richardson, Michael K

    2013-01-01

    This study aimed to develop and characterize a novel (standard) open field test adapted for larval zebrafish. We also developed and characterized a variant of the same assay consisting of a colour-enriched open field; this was used to assess the impact of environmental complexity on patterns of exploratory behaviours as well as to determine natural colour preference/avoidance. We report the following main findings: (1) zebrafish larvae display characteristic patterns of exploratory behaviours in the standard open field, such as thigmotaxis/centre avoidance; (2) environmental complexity (i.e. presence of colours) differentially affects patterns of exploratory behaviours and greatly attenuates natural zone preference; (3) larvae displayed the ability to discriminate colours. As reported previously in adult zebrafish, larvae showed avoidance towards blue and black; however, in contrast to the reported adult behaviour, larvae displayed avoidance towards red. Avoidance towards yellow and preference for green and orange are shown for the first time; (4) compared to standard open field tests, exposure to the colour-enriched open field resulted in an enhanced expression of anxiety-like behaviours. To conclude, we not only developed and adapted a traditional rodent behavioural assay that serves as a gold standard in preclinical drug screening, but we also provide a version of the same test that affords the possibility to investigate the impact of environmental stress on behaviour in larval zebrafish, while representing the first test for assessment of natural colour preference/avoidance in larval zebrafish. In the future, these assays will improve preclinical drug screening methodologies towards the goal of uncovering novel drugs. Copyright © 2012 Elsevier B.V. All rights reserved.

  7. Computer-facilitated rapid HIV testing in emergency care settings: provider and patient usability and acceptability.

    Science.gov (United States)

    Spielberg, Freya; Kurth, Ann E; Severynen, Anneleen; Hsieh, Yu-Hsiang; Moring-Parris, Daniel; Mackenzie, Sara; Rothman, Richard

    2011-06-01

    Providers in emergency care settings (ECSs) often face barriers to expanded HIV testing. We undertook formative research to understand the potential utility of a computer tool, "CARE," to facilitate rapid HIV testing in ECSs. Computer tool usability and acceptability were assessed among 35 adult patients, and provider focus groups were held, in two ECSs in Washington State and Maryland. The computer tool was usable by patients of varying computer literacy. Patients appreciated the tool's privacy and lack of judgment and their ability to reflect on HIV risks and create risk reduction plans. Staff voiced concerns regarding ECS-based HIV testing generally, including resources for follow-up of newly diagnosed people. Computer-delivered HIV testing support was acceptable and usable among low-literacy populations in two ECSs. Such tools may help circumvent some practical barriers associated with routine HIV testing in busy settings though linkages to care will still be needed.

  8. Computationally efficient video restoration for Nyquist sampled imaging sensors combining an affine-motion-based temporal Kalman filter and adaptive Wiener filter.

    Science.gov (United States)

    Rucci, Michael; Hardie, Russell C; Barnard, Kenneth J

    2014-05-01

    In this paper, we present a computationally efficient video restoration algorithm to address both blur and noise for a Nyquist sampled imaging system. The proposed method utilizes a temporal Kalman filter followed by a correlation-model based spatial adaptive Wiener filter (AWF). The Kalman filter employs an affine background motion model and novel process-noise variance estimate. We also propose and demonstrate a new multidelay temporal Kalman filter designed to more robustly treat local motion. The AWF is a spatial operation that performs deconvolution and adapts to the spatially varying residual noise left in the Kalman filter stage. In image areas where the temporal Kalman filter is able to provide significant noise reduction, the AWF can be aggressive in its deconvolution. In other areas, where less noise reduction is achieved with the Kalman filter, the AWF balances the deconvolution with spatial noise reduction. In this way, the Kalman filter and AWF work together effectively, but without the computational burden of full joint spatiotemporal processing. We also propose a novel hybrid system that combines a temporal Kalman filter and BM3D processing. To illustrate the efficacy of the proposed methods, we test the algorithms on both simulated imagery and video collected with a visible camera.
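
    The temporal stage of such a pipeline reduces, per pixel, to a scalar Kalman recursion whose posterior variance tells the spatial Wiener stage how much residual noise remains to be removed. The sketch below shows only that recursion, without the affine motion compensation or the correlation-model AWF of the paper; the noise variances and test data are illustrative assumptions.

        import numpy as np

        def temporal_kalman(frames, q=1e-3, r=1e-2):
            # scalar Kalman filter applied independently at each pixel;
            # q: process-noise variance, r: measurement-noise variance
            x = frames[0].astype(float)      # state estimate
            p = np.full_like(x, r)           # estimate variance
            out, resid_var = [x.copy()], [p.mean()]
            for z in frames[1:]:
                p = p + q                    # predict (static background model)
                k = p / (p + r)              # Kalman gain
                x = x + k * (z - x)          # update with the new frame
                p = (1 - k) * p
                out.append(x.copy())
                resid_var.append(p.mean())   # residual noise for the Wiener stage
            return np.array(out), resid_var

        noisy = np.random.default_rng(1).normal(0.5, 0.1, size=(20, 32, 32))
        filtered, rv = temporal_kalman(noisy)
        print(f"input var {noisy.var():.4f} -> output var {filtered[-1].var():.4f}")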

  9. Cross-Cultural Adaptation of the Urticaria Control Test From German to Castilian Spanish.

    Science.gov (United States)

    García-Díez, I; Curto-Barredo, L; Weller, K; Pujol, R M; Maurer, M; Giménez-Arnau, A M

    2015-11-01

    The clinical concept of urticaria embraces a heterogeneous group of conditions classified according to their clinical course as acute (lasting less than 6 weeks) or chronic (lasting 6 weeks or more). Chronic urticaria may be either spontaneous or induced. Few tools are available for monitoring the various clinical forms of this disease or for evaluating its impact on quality of life. The recently developed Urticaria Control Test to evaluate disease control is available in German, the original language, and American English. To culturally adapt the long and short versions of the Urticaria Control Test to Castilian Spanish to ensure equivalence between the translated items and those of the original version. To translate the Urticaria Control Test we followed the International Society for Pharmacoeconomics and Outcomes Research good practice guidelines, starting with forward translation and moving through back translation and cognitive debriefing steps. Three items were modified when the first Spanish version, translated from German, was discussed (cognitive debriefing). The revised translation was then translated back to German and sent to the Urticaria Control Test authors, who modified one item they considered had acquired a different focus through translation. A third Spanish version was then prepared and after minor proofreading changes was considered definitive. This study was the first step in making it possible to use the Urticaria Control Test questionnaire in Castilian Spanish. The next step will be to validate the translated questionnaire. Copyright © 2015 Elsevier España, S.L.U. and AEDV. All rights reserved.

  10. ERP Human Enhancement Progress Report : Use case and computational model for adaptive maritime automation

    NARCIS (Netherlands)

    Kleij, R. van der; Broek, J. van den; Brake, G.M. te; Rypkema, J.A.; Schilder, C.M.C.

    2015-01-01

    Automation is often applied in order to increase the cost-effectiveness, reliability and safety of maritime ship and offshore operations. Automation of operator tasks has not, however, eliminated human error so much as created opportunities for new kinds of error. The ambition of the Adaptive

  11. Image-guided adaptive gating of lung cancer radiotherapy: a computer simulation study

    Energy Technology Data Exchange (ETDEWEB)

    Aristophanous, Michalis; Rottmann, Joerg; Park, Sang-June; Berbeco, Ross I [Department of Radiation Oncology, Brigham and Women' s Hospital, Dana Farber Cancer Institute and Harvard Medical School, Boston, MA (United States); Nishioka, Seiko [Department of Radiology, NTT Hospital, Sapporo (Japan); Shirato, Hiroki, E-mail: maristophanous@lroc.harvard.ed [Department of Radiation Medicine, Hokkaido University School of Medicine, Sapporo (Japan)

    2010-08-07

    The purpose of this study is to investigate the effect that image-guided adaptation of the gating window during treatment could have on the residual tumor motion, by simulating different gated radiotherapy techniques. There are three separate components of this simulation: (1) the 'Hokkaido Data', which are previously measured 3D data of lung tumor motion tracks and the corresponding 1D respiratory signals obtained during the entire ungated radiotherapy treatments of eight patients, (2) the respiratory gating protocol at our institution and the imaging performed under that protocol, and (3) the actual simulation in which the Hokkaido Data are used to select tumor position information that could have been collected based on the imaging performed under our gating protocol. We simulated treatments with a fixed gating window and a gating window that is updated during treatment. The patient data were divided into different fractions, each with continuous acquisitions longer than 2 min. In accordance with the imaging performed under our gating protocol, we assume that we have tumor position information for the first 15 s of treatment, obtained from kV fluoroscopy, and for the rest of the fractions the tumor position is only available during the beam-on time from MV imaging. The gating window was set according to the information obtained from the first 15 s such that the residual motion was less than 3 mm. For the fixed gating window technique the gate remained the same for the entire treatment, while for the adaptive technique the range of the tumor motion during beam-on time was measured and used to adapt the gating window to keep the residual motion below 3 mm. The algorithm used to adapt the gating window is described. The residual tumor motion inside the gating window was reduced on average by 24% for the patients with regular breathing patterns and the difference was statistically significant (p-value = 0.01). The magnitude of the residual tumor motion
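
    In essence, the adaptive technique re-centers and re-sizes the gating window from recently observed tumor positions so that residual motion stays within the 3 mm limit. The sketch below is a hypothetical version of such an update rule, not the authors' algorithm; the breathing trace, sampling rate, and window sizing are invented for illustration.

        import numpy as np

        def adapt_gating_window(positions_mm, max_residual_mm=3.0):
            # re-center the window on the recent median tumor position and
            # size it so in-window excursion stays within the residual limit
            center = np.median(positions_mm)
            return center - max_residual_mm / 2, center + max_residual_mm / 2

        def duty_cycle(positions_mm, lo, hi):
            # fraction of time the tumor is inside the window (beam-on fraction)
            inside = (positions_mm >= lo) & (positions_mm <= hi)
            return inside.mean()

        drift = np.linspace(0, 4, 600)                   # 4 mm baseline drift
        t = np.arange(600) / 10                          # 10 samples/s
        trace = 5 * np.sin(2 * np.pi * 0.25 * t) + drift # 0.25 Hz breathing
        lo, hi = adapt_gating_window(trace[-100:])       # adapt on recent samples
        print(f"window [{lo:.1f}, {hi:.1f}] mm, "
              f"duty cycle {duty_cycle(trace, lo, hi):.2f}")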

  12. An evaluation of computerized adaptive testing for general psychological distress: combining GHQ-12 and Affectometer-2 in an item bank for public mental health research.

    Science.gov (United States)

    Stochl, Jan; Böhnke, Jan R; Pickett, Kate E; Croudace, Tim J

    2016-05-20

    Recent developments in psychometric modeling and technology allow pooling well-validated items from existing instruments into larger item banks and their deployment through methods of computerized adaptive testing (CAT). Use of item response theory-based bifactor methods and integrative data analysis overcomes barriers in cross-instrument comparison. This paper presents the joint calibration of an item bank for researchers keen to investigate population variations in general psychological distress (GPD). Multidimensional item response theory was used on existing health survey data from the Scottish Health Education Population Survey (n = 766) to calibrate an item bank consisting of pooled items from the short common mental disorder screen (GHQ-12) and the Affectometer-2 (a measure of "general happiness"). Computer simulation was used to evaluate usefulness and efficacy of its adaptive administration. A bifactor model capturing variation across a continuum of population distress (while controlling for artefacts due to item wording) was supported. The numbers of items for different required reliabilities in adaptive administration demonstrated promising efficacy of the proposed item bank. Psychometric modeling of the common dimension captured by more than one instrument offers the potential of adaptive testing for GPD using individually sequenced combinations of existing survey items. The potential for linking other item sets with alternative candidate measures of positive mental health is discussed since an optimal item bank may require even more items than these.
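
    The adaptive administration simulated here follows the generic CAT loop: estimate the latent trait, administer the unanswered item with maximum Fisher information at that estimate, and stop once precision suffices. A 2PL sketch with randomly generated item parameters, not the calibrated GHQ-12/Affectometer-2 bank, and a deliberately crude trait update:

        import numpy as np

        def p_correct(theta, a, b):
            # 2PL item response function
            return 1.0 / (1.0 + np.exp(-a * (theta - b)))

        def item_information(theta, a, b):
            p = p_correct(theta, a, b)
            return a**2 * p * (1 - p)

        def select_next_item(theta, a, b, administered):
            # pick the unadministered item with maximum Fisher information
            info = item_information(theta, a, b)
            info[list(administered)] = -np.inf
            return int(np.argmax(info))

        rng = np.random.default_rng(3)
        a = rng.uniform(0.8, 2.0, 30)    # item discriminations (invented)
        b = rng.normal(0.0, 1.0, 30)     # item difficulties (invented)
        theta_hat, administered = 0.0, set()
        for _ in range(10):
            j = select_next_item(theta_hat, a, b, administered)
            administered.add(j)
            resp = rng.random() < p_correct(0.7, a[j], b[j])  # simulee, theta = 0.7
            # crude stochastic update; a real CAT would re-estimate theta by ML/EAP
            theta_hat += 0.3 * (resp - p_correct(theta_hat, a[j], b[j]))
        print(f"theta estimate after 10 items: {theta_hat:.2f}")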

  13. Computer-controlled environmental test systems - Criteria for selection, installation, and maintenance.

    Science.gov (United States)

    Chapman, C. P.

    1972-01-01

    Applications for presently marketed, new computer-controlled environmental test systems are suggested. It is shown that the capital costs of these systems follow an exponential cost-function curve that levels out as additional applications are implemented. Some test laboratory organization changes are recommended in terms of new personnel requirements, and facility modifications are considered in support of a computer-controlled test system. Software for computer-controlled test systems is discussed, and control-loop speed constraints are defined for real-time control functions. Suitable input and output devices and memory storage device tradeoffs are also considered.
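
    One way to read the "exponential cost function curve that levels out" is as a saturating curve of cumulative capital cost against the number of implemented applications, e.g. C(n) = C_max(1 - e^(-kn)). The parameters below are invented purely for illustration:

        import math

        def cumulative_cost(n_apps, c_max=500_000.0, k=0.6):
            # saturating capital-cost curve: early applications dominate the
            # cost, later ones mostly reuse the installed system
            return c_max * (1 - math.exp(-k * n_apps))

        for n in (1, 2, 4, 8):
            print(n, round(cumulative_cost(n)))   # marginal cost shrinks with n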

  14. Comparing Postsecondary Marketing Student Performance on Computer-Based and Handwritten Essay Tests

    Science.gov (United States)

    Truell, Allen D.; Alexander, Melody W.; Davis, Rodney E.

    2004-01-01

    The purpose of this study was to determine if there were differences in postsecondary marketing student performance on essay tests based on test format (i.e., computer-based or handwritten). Specifically, the variables of performance, test completion time, and gender were explored for differences based on essay test format. Results of the study…

  15. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  16. Test experience on an ultrareliable computer communication network

    Science.gov (United States)

    Abbott, L. W.

    1984-01-01

    The dispersed sensor processing mesh (DSPM) is an experimental, ultra-reliable, fault-tolerant computer communications network that exhibits an organic-like ability to regenerate itself after suffering damage. The regeneration is accomplished by two routines - grow and repair. This paper discusses the DSPM concept for achieving fault tolerance and provides a brief description of the mechanization of both the experiment and the six-node experimental network. The main topic of this paper is the system performance of the growth algorithm contained in the grow routine. The characteristics imbued to DSPM by the growth algorithm are also discussed. Data from an experimental DSPM network and software simulation of larger DSPM-type networks are used to examine the inherent limitation on growth time by the growth algorithm and the relationship of growth time to network size and topology.
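
    The reported dependence of growth time on network size and topology is easy to mimic at toy scale if one assumes, as a simplification of DSPM's actual grow routine, that each admitted node recruits its neighbours in the next round; growth time then tracks the network's radius from the seed node rather than the node count. A sketch under that assumption:

        from collections import deque

        def growth_rounds(adjacency, root=0):
            # rounds of breadth-first 'grow' waves needed to absorb every node
            seen, frontier, rounds = {root}, deque([root]), 0
            while frontier:
                rounds += 1
                for _ in range(len(frontier)):
                    node = frontier.popleft()
                    for nbr in adjacency[node]:
                        if nbr not in seen:
                            seen.add(nbr)
                            frontier.append(nbr)
            return rounds

        ring6 = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}  # 6-node ring
        star6 = {0: [1, 2, 3, 4, 5], **{i: [0] for i in range(1, 6)}}
        print(growth_rounds(ring6), growth_rounds(star6))  # ring grows slower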

  17. Correlations between the scores of computerized adaptive testing, paper and pencil tests, and the Korean Medical Licensing Examination

    Directory of Open Access Journals (Sweden)

    Mee Young Kim

    2005-06-01

    To evaluate the usefulness of computerized adaptive testing (CAT) in medical school, the General Examination for senior medical students was administered both as a paper and pencil test (P&P) and using CAT. The General Examination is a graduate examination, which is also a preliminary examination for the Korean Medical Licensing Examination (KMLE). The correlations between the results of the CAT, the P&P, and the KMLE were analyzed. The correlation between the CAT and the P&P was 0.8013 (p=0.000); that between the P&P and the KMLE was 0.7861 (p=0.000); and that between the CAT and the KMLE was 0.6436 (p=0.000). Six out of 12 students with an ability estimate below -0.52 failed the KMLE. The results showed that CAT could replace the P&P in medical school. The ability of the CAT to predict whether students would pass the KMLE was 0.5 when the criterion for the theta value was set at -0.52, a cut-off chosen arbitrarily.

  18. Adaptive Radar Signal Processing-The Problem of Exponential Computational Cost

    National Research Council Canada - National Science Library

    Rangaswamy, Muralidhar

    2003-01-01

    .... Extensions to handle the case of non-Gaussian clutter statistics are presented. Current challenges of limited training data support, computational cost, and severely heterogeneous clutter backgrounds are outlined...

  19. Climate Analogues for agricultural impact projection and adaptation – a reliability test

    Directory of Open Access Journals (Sweden)

    Swen P.M. Bos

    2015-10-01

    The climate analogue approach is often considered a valuable tool for climate change impact projection and adaptation planning, especially for complex systems that cannot be modelled reliably. Important examples are smallholder farming systems using agroforestry or other mixed-cropping approaches. For the projected climate at a particular site of interest, the analogue approach identifies locations where the current climate is similar to these projected conditions. By comparing baseline-analogue site pairs, information on climate impacts and opportunities for adaptation can be obtained. However, the climate analogue approach is only meaningful if climate is a dominant driver of differences between baseline and analogue site pairs. For a smallholder farming setting on Mt. Elgon in Kenya, we tested this requirement by comparing yield potentials of maize and coffee (obtained from the IIASA Global Agro-ecological Zones dataset) among 50 close analogue sites for different future climate scenarios and models, and by comparing local ecological knowledge and farm characteristics for one baseline-analogue pair. Yield potentials among the 50 closest analogue locations varied strongly within all climate scenarios, hinting at factors other than climate as major drivers of what the analogue approach might interpret as climate effects. However, on average, future climatic conditions seemed more favourable to maize and coffee cultivation than current conditions. The detailed site comparison revealed substantial differences between farms in important characteristics, such as farm size and presence of cash crops, casting doubt on the usefulness of the comparison for climate change analysis. Climatic constraints were similar between sites, so that no apparent lessons for adaptation could be derived. Pests and diseases were also similar, indicating that climate change may not lead to strong changes in biotic constraints at the baseline site in the near future. From
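
    At its core the analogue search is a dissimilarity ranking: candidate sites are ordered by the distance between their current climate and the projected climate of the site of interest. The sketch below uses a plain standardized Euclidean distance over two invented climate variables; operational analogue tools use weighted, seasonally resolved dissimilarity measures that this simplification omits.

        import numpy as np

        def analogue_ranking(projected, candidates):
            # rank candidate sites by Euclidean distance between their current
            # climate and the projected climate, after z-scoring each variable
            X = np.vstack([projected, candidates])
            X = (X - X.mean(axis=0)) / X.std(axis=0)
            d = np.linalg.norm(X[1:] - X[0], axis=1)
            return np.argsort(d), np.sort(d)

        # columns: annual mean temperature (degC), annual precipitation (dm)
        projected = np.array([21.5, 12.0])       # site of interest, future scenario
        candidates = np.array([
            [21.0, 11.5],                        # site A
            [18.0, 16.0],                        # site B
            [22.0, 12.5],                        # site C
        ])
        order, dist = analogue_ranking(projected, candidates)
        print(order)   # closest analogues first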

  20. Application of computer techniques to charpy impact testing of irradiated pressure vessel steels

    International Nuclear Information System (INIS)

    Landow, M.P.; Fromm, E.O.; Perrin, J.S.

    1982-01-01

    A Rockwell AIM 65 microcomputer has been modified to control a remote Charpy V-notch impact test machine. It controls not only the handling and testing of the specimens but also the transfer and storage of instrumented Charpy test data. A system of solenoid-activated pneumatic cylinders and switches provides the interface between the computer and the test apparatus. A command language has been designed that allows the operator to command checkout, test procedures, and data storage via the computer. Automatic compliance with ASTM test procedures is built into the program